NED-IIS: An Intelligent Information System for Forest Ecosystem Management
W.D. Potter; S. Somasekar; R. Kommineni; H.M. Rauscher
1999-01-01
We view an Intelligent Information System (IIS) as composed of a unified knowledge base, database, and model base. The model base includes, for example, decision support models, forecasting models, and visualization models. In addition, we feel that the model base should include domain-specific problem-solving modules as well as decision support models. This, then,...
Assessing College Students' Understanding of Acid Base Chemistry Concepts
ERIC Educational Resources Information Center
Wan, Yanjun Jean
2014-01-01
Typically most college curricula include three acid base models: Arrhenius', Bronsted-Lowry's, and Lewis'. Although Lewis' acid base model is generally thought to be the most sophisticated among these three models, and can be further applied in reaction mechanisms, most general chemistry curricula either do not include Lewis' acid base model, or…
Ground-Based Telescope Parametric Cost Model
NASA Technical Reports Server (NTRS)
Stahl, H. Philip; Rowell, Ginger Holmes
2004-01-01
A parametric cost model for ground-based telescopes is developed using multi-variable statistical analysis. The model includes both engineering and performance parameters. While diameter continues to be the dominant cost driver, other significant factors include primary mirror radius of curvature and diffraction-limited wavelength. The model includes an explicit factor for primary mirror segmentation and/or duplication (i.e., multi-telescope phased-array systems). Additionally, single-variable models based on aperture diameter are derived. This analysis indicates that recent mirror technology advances have indeed reduced the historical telescope cost curve.
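As a hedged illustration of the single-variable models mentioned above, the sketch below fits a power-law cost curve, cost ≈ a·D^b, by least squares in log-log space. The diameters and costs are invented placeholders, not the paper's data, so the fitted exponent implies nothing about the published cost drivers.

```python
import numpy as np

# Hypothetical single-variable cost model: cost ~ a * D**b, fit in log-log space.
# The diameters and costs below are made-up illustration data, not the paper's dataset.
diameters_m = np.array([2.4, 3.5, 4.2, 6.5, 8.1, 10.0])
costs_musd = np.array([20.0, 45.0, 70.0, 180.0, 290.0, 460.0])

# Degree-1 fit of log(cost) vs log(D): slope is the exponent b, intercept is log(a).
b, log_a = np.polyfit(np.log(diameters_m), np.log(costs_musd), 1)
a = np.exp(log_a)
print(f"cost ~ {a:.2f} * D^{b:.2f}  (M$, D in meters)")

# Predict the cost of a hypothetical 12 m telescope from the fitted curve.
print(f"predicted cost at D=12 m: {a * 12.0**b:.0f} M$")
```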
EMPIRE and pyenda: Two ensemble-based data assimilation systems written in Fortran and Python
NASA Astrophysics Data System (ADS)
Geppert, Gernot; Browne, Phil; van Leeuwen, Peter Jan; Merker, Claire
2017-04-01
We present and compare the features of two ensemble-based data assimilation frameworks, EMPIRE and pyenda. Both frameworks allow models to be coupled to the assimilation codes using the Message Passing Interface (MPI), leading to extremely efficient and fast coupling between the models and the data-assimilation codes. The Fortran-based system EMPIRE (Employing Message Passing Interface for Researching Ensembles) is optimized for parallel, high-performance computing. It currently includes a suite of data assimilation algorithms including variants of the ensemble Kalman filter and several particle filters. EMPIRE is targeted at models of all levels of complexity and has been coupled to several geoscience models, e.g., the Lorenz-63 model, a barotropic vorticity model, the general circulation model HadCM3, the ocean model NEMO, and the land-surface model JULES. The Python-based system pyenda (Python Ensemble Data Assimilation) allows Fortran- and Python-based models to be used for data assimilation. Models can be coupled either using MPI or through a Python interface. Using Python allows quick prototyping, and pyenda is aimed at small to medium scale models. pyenda currently includes variants of the ensemble Kalman filter and has been coupled to the Lorenz-63 model, an advection-based precipitation nowcasting scheme, and the dynamic global vegetation model JSBACH.
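For readers unfamiliar with the algorithm family both frameworks implement, here is a minimal stochastic ensemble Kalman filter analysis step in plain NumPy. It illustrates the general method only; it is not code from EMPIRE or pyenda, and the toy state size and observation operator are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def enkf_analysis(ensemble, y_obs, H, R):
    """Stochastic EnKF analysis step (perturbed observations).

    ensemble: (n_members, n_state) forecast ensemble
    y_obs:    (n_obs,) observation vector
    H:        (n_obs, n_state) linear observation operator
    R:        (n_obs, n_obs) observation-error covariance
    """
    n, _ = ensemble.shape
    X = ensemble - ensemble.mean(axis=0)            # ensemble anomalies
    Pf = X.T @ X / (n - 1)                          # sample forecast covariance
    K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)  # Kalman gain
    # Perturb observations so the analysis ensemble keeps the right spread.
    y_pert = y_obs + rng.multivariate_normal(np.zeros(len(y_obs)), R, size=n)
    return ensemble + (y_pert - ensemble @ H.T) @ K.T

# Toy example: 20-member ensemble of a 3-variable (Lorenz-63-sized) state,
# observing only the first component.
ens = rng.normal(size=(20, 3))
H = np.array([[1.0, 0.0, 0.0]])
R = np.array([[0.1]])
ens_a = enkf_analysis(ens, np.array([0.5]), H, R)
print(ens_a.mean(axis=0))
```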
Representing Micro-Macro Linkages by Actor-Based Dynamic Network Models
ERIC Educational Resources Information Center
Snijders, Tom A. B.; Steglich, Christian E. G.
2015-01-01
Stochastic actor-based models for network dynamics have the primary aim of statistical inference about processes of network change, but may be regarded as a kind of agent-based model. Similar to many other agent-based models, they are based on local rules for actor behavior. Different from many other agent-based models, by including elements of…
Design of a component-based integrated environmental modeling framework
Integrated environmental modeling (IEM) includes interdependent science-based components (e.g., models, databases, viewers, assessment protocols) that comprise an appropriate software modeling system. The science-based components are responsible for consuming and producing inform...
The Use of Modeling-Based Text to Improve Students' Modeling Competencies
ERIC Educational Resources Information Center
Jong, Jing-Ping; Chiu, Mei-Hung; Chung, Shiao-Lan
2015-01-01
This study investigated the effects of a modeling-based text on 10th graders' modeling competencies. Fifteen 10th graders read a researcher-developed modeling-based science text on the ideal gas law that included explicit descriptions and representations of modeling processes (i.e., model selection, model construction, model validation, model…
Using a data base management system for modelling SSME test history data
NASA Technical Reports Server (NTRS)
Abernethy, K.
1985-01-01
The usefulness of a data base management system (DBMS) for modelling historical test data for the complete series of static test firings for the Space Shuttle Main Engine (SSME) was assessed. From an analysis of user data base query requirements, it became clear that a relational DBMS which included a relationally complete query language would permit a model satisfying the query requirements. Representative models and sample queries are discussed. A list of environment-particular evaluation criteria for the desired DBMS was constructed; these criteria include requirements in the areas of user-interface complexity, program independence, flexibility, modifiability, and output capability. The evaluation process included the construction of several prototype data bases for user assessment. The systems studied, representing the three major DBMS conceptual models, were: MIRADS, a hierarchical system; DMS-1100, a CODASYL-based network system; ORACLE, a relational system; and DATATRIEVE, a relational-type system.
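To make "relationally complete query language" concrete, the sketch below builds a hypothetical miniature of a test-history relation with Python's standard-library sqlite3 and runs one representative declarative query. Table and column names are invented for illustration and are not the report's actual schema.

```python
import sqlite3

# Hypothetical miniature of an SSME-style test-history database: one relation
# of static firings. Schema and values are invented; the point is that user
# queries like those discussed above can be expressed declaratively.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE test_firing (
    test_id INTEGER PRIMARY KEY,
    engine_serial TEXT,
    test_date TEXT,
    duration_s REAL,
    cutoff_reason TEXT)""")
con.executemany(
    "INSERT INTO test_firing VALUES (?, ?, ?, ?, ?)",
    [(1, "2004", "1981-03-12", 520.0, "planned"),
     (2, "2004", "1981-05-02", 110.5, "redline"),
     (3, "2012", "1982-01-20", 520.0, "planned")])

# Representative query: longest premature-cutoff firing per engine.
for row in con.execute("""
        SELECT engine_serial, MAX(duration_s)
        FROM test_firing
        WHERE cutoff_reason != 'planned'
        GROUP BY engine_serial"""):
    print(row)
```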
Dhana, Klodian; Ikram, M Arfan; Hofman, Albert; Franco, Oscar H; Kavousi, Maryam
2015-03-01
Body mass index (BMI) has been used to simplify cardiovascular risk prediction models by substituting for total cholesterol and high-density lipoprotein cholesterol. In the elderly, the ability of BMI to predict cardiovascular disease (CVD) declines. We aimed to find the most predictive anthropometric measure for CVD risk, to construct a non-laboratory-based model, and to compare it with the model including laboratory measurements. The study included 2675 women and 1902 men aged 55-79 years from the prospective population-based Rotterdam Study. We used Cox proportional hazards regression analysis to evaluate the association of BMI, waist circumference, waist-to-hip ratio and a body shape index (ABSI) with CVD, including coronary heart disease and stroke. The performance of the laboratory-based and non-laboratory-based models was evaluated by studying discrimination, calibration, correlation and risk agreement. Among men, ABSI was the most informative measure associated with CVD, so ABSI was used to construct the non-laboratory-based model. Discrimination of the non-laboratory-based model was not different from that of the laboratory-based model (c-statistic: 0.680 vs 0.683, p=0.71); both models were well calibrated (15.3% observed CVD risk vs 16.9% and 17.0% predicted CVD risks by the non-laboratory-based and laboratory-based models, respectively), and the Spearman rank correlation and agreement between the non-laboratory-based and laboratory-based models were 0.89 and 91.7%, respectively. Among women, none of the anthropometric measures was independently associated with CVD. Among the middle-aged and elderly, where the ability of BMI to predict CVD declines, the non-laboratory-based model, based on ABSI, could predict CVD risk as accurately as the laboratory-based model among men.
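A minimal sketch of the kind of Cox proportional hazards analysis described, using the third-party lifelines package on synthetic data. The covariates, effect sizes, and censoring rule are all invented and do not reproduce the Rotterdam Study analysis.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter  # common survival-analysis package, an assumption here

rng = np.random.default_rng(1)
n = 500

# Synthetic stand-in for Rotterdam-style data: age, smoking, and an
# ABSI-like anthropometric measure (all values made up for illustration).
df = pd.DataFrame({
    "age": rng.uniform(55, 79, n),
    "smoker": rng.integers(0, 2, n),
    "absi": rng.normal(0.08, 0.005, n),
})
risk = 0.03 * df["age"] + 0.5 * df["smoker"] + 40 * df["absi"]
df["time"] = rng.exponential(scale=np.exp(-risk + 6))
df["event"] = (df["time"] < 15).astype(int)       # administrative censoring at 15 y
df.loc[df["event"] == 0, "time"] = 15.0

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
print(cph.summary[["coef", "p"]])
print("c-statistic:", cph.concordance_index_)     # the discrimination measure cited above
```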
Testing the Predictive Power of Coulomb Stress on Aftershock Sequences
NASA Astrophysics Data System (ADS)
Woessner, J.; Lombardi, A.; Werner, M. J.; Marzocchi, W.
2009-12-01
Empirical and statistical models of clustered seismicity are usually strongly stochastic and perceived to be uninformative in their forecasts, since only marginal distributions are used, such as the Omori-Utsu and Gutenberg-Richter laws. In contrast, so-called physics-based aftershock models, based on seismic rate changes calculated from Coulomb stress changes and rate-and-state friction, make more specific predictions: anisotropic stress shadows and multiplicative rate changes. We test the predictive power of models based on Coulomb stress changes against statistical models, including the popular Short Term Earthquake Probabilities and Epidemic-Type Aftershock Sequences models: we score and compare retrospective forecasts on the aftershock sequences of the 1992 Landers, USA, the 1997 Colfiorito, Italy, and the 2008 Selfoss, Iceland, earthquakes. To quantify predictability, we use likelihood-based metrics that test the consistency of the forecasts with the data, including modified and existing tests used in prospective forecast experiments within the Collaboratory for the Study of Earthquake Predictability (CSEP). Our results indicate that a statistical model performs best. Moreover, two Coulomb model classes seem unable to compete: models based on deterministic Coulomb stress changes calculated from a given fault-slip model, and those based on fixed receiver faults. One model of Coulomb stress changes does perform well and sometimes outperforms the statistical models, but its predictive information is diluted because of uncertainties in the fault-slip model. Our results suggest that models based on Coulomb stress changes need to incorporate stochastic features that represent model and data uncertainty.
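The rate-and-state ingredient that such physics-based aftershock models commonly share is the Dieterich (1994) relation, in which a Coulomb stress step ΔCFF modulates the background rate r. A sketch under assumed parameter values (not taken from this study):

```python
import numpy as np

def dieterich_rate(t, dcff, a_sigma, r_background, t_a):
    """Seismicity rate after a Coulomb stress step, after Dieterich (1994).

    R(t) = r / [ (exp(-dCFF / A*sigma) - 1) * exp(-t / t_a) + 1 ]

    t:            time since the mainshock (same units as t_a)
    dcff:         Coulomb stress change (MPa); positive promotes failure
    a_sigma:      constitutive parameter A*sigma (MPa), illustrative here
    r_background: background seismicity rate
    t_a:          aftershock relaxation time
    """
    gamma = (np.exp(-dcff / a_sigma) - 1.0) * np.exp(-t / t_a) + 1.0
    return r_background / gamma

# Illustrative values only: a 0.5 MPa stress increase strongly amplifies the
# rate at early times, decaying back to background over ~t_a.
t = np.linspace(0.01, 1000.0, 5)   # days
print(dieterich_rate(t, dcff=0.5, a_sigma=0.04, r_background=1.0, t_a=3650.0))
```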
Cellular-based modeling of oscillatory dynamics in brain networks.
Skinner, Frances K
2012-08-01
Oscillatory population activities have long been known to occur in our brains during different behavioral states. We know that many different cell types exist and that they contribute in distinct ways to the generation of these activities. I review recent papers that involve cellular-based models of brain networks, most of which include theta, gamma and sharp wave-ripple activities. To help organize the modeling work, I present it from the perspective of three different types of cellular-based modeling: 'Generic', 'Biophysical' and 'Linking'. Cellular-based modeling is taken to encompass the four features of experiment, model development, theory/analyses, and model usage/computation. The three modeling types are shown to include these features and interactions in different ways.
Model-based surgical planning and simulation of cranial base surgery.
Abe, M; Tabuchi, K; Goto, M; Uchino, A
1998-11-01
Plastic skull models of seven individual patients were fabricated by stereolithography from three-dimensional data based on computed tomography bone images. Skull models were utilized for neurosurgical planning and simulation in the seven patients with cranial base lesions that were difficult to remove. Surgical approaches and areas of craniotomy were evaluated using the fabricated skull models. In preoperative simulations, hand-made models of the tumors, major vessels and nerves were placed in the skull models. Step-by-step simulation of surgical procedures was performed using actual surgical tools. The advantages of using skull models to plan and simulate cranial base surgery include a better understanding of anatomic relationships, preoperative evaluation of the proposed procedure, increased understanding by the patient and family, and improved educational experiences for residents and other medical staff. The disadvantages of using skull models include the time and cost of making the models. The skull models provide a more realistic tool that is easier to handle than computer-graphic images. Surgical simulation using models facilitates difficult cranial base surgery and may help reduce surgical complications.
Proposed applications of increasingly sophisticated biologically-based computational models, such as physiologically-based pharmacokinetic (PBPK) models, raise the issue of how to evaluate whether the models are adequate for proposed uses including safety or risk ...
Tempest in a Therapeutic Community: Implementation and Evaluation Issues for Faith-Based Programming
ERIC Educational Resources Information Center
Scott, Diane L.; Crow, Matthew S.; Thompson, Carla J.
2010-01-01
The therapeutic community (TC) is an increasingly utilized intervention model in corrections settings. Rarely do these TCs include faith-based curriculum other than that included in Alcoholics Anonymous or Narcotics Anonymous programs as does the faith-based TC that serves as the basis for this article. Borrowing from the successful TC model, the…
The Systems Biology Markup Language (SBML) Level 3 Package: Flux Balance Constraints.
Olivier, Brett G; Bergmann, Frank T
2015-09-04
Constraint-based modeling is a well established modelling methodology used to analyze and study biological networks on both a medium and genome scale. Due to their large size, genome scale models are typically analysed using constraint-based optimization techniques. One widely used method is Flux Balance Analysis (FBA) which, for example, requires a modelling description to include the definition of a stoichiometric matrix, an objective function and bounds on the values that fluxes can obtain at steady state. The Flux Balance Constraints (FBC) Package extends SBML Level 3 and provides a standardized format for the encoding, exchange and annotation of constraint-based models. It includes support for modelling concepts such as objective functions, flux bounds and model component annotation that facilitates reaction balancing. The FBC package establishes a base level for the unambiguous exchange of genome-scale, constraint-based models that can be built upon by the community to meet future needs (e.g., by extending it to cover dynamic FBC models).
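To make the FBA description concrete, here is a minimal flux balance analysis of a made-up three-reaction network with SciPy: maximize one flux subject to the steady-state constraint S·v = 0 and flux bounds. This illustrates the mathematical problem FBC encodes, not the SBML/FBC file format itself.

```python
import numpy as np
from scipy.optimize import linprog

# Toy network: A_ext -> A -> B -> biomass. FBA maximizes the biomass flux
# subject to steady state S @ v = 0 and flux bounds. Stoichiometry is invented.
#            v_uptake  v_conv  v_biomass
S = np.array([[ 1.0,   -1.0,    0.0],   # metabolite A balance
              [ 0.0,    1.0,   -1.0]])  # metabolite B balance

c = np.array([0.0, 0.0, -1.0])          # linprog minimizes, so negate the objective
bounds = [(0.0, 10.0),                  # uptake limited to 10 flux units
          (0.0, None),
          (0.0, None)]

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
print("optimal fluxes:", res.x, "objective:", -res.fun)  # all fluxes hit 10 here
```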
What are the Starting Points? Evaluating Base-Year Assumptions in the Asian Modeling Exercise
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chaturvedi, Vaibhav; Waldhoff, Stephanie; Clarke, Leon E.
2012-12-01
A common feature of model inter-comparison efforts is that the base-year numbers for important parameters such as population and GDP can differ substantially across models. This paper explores the sources and implications of this variation in Asian countries across the models participating in the Asian Modeling Exercise (AME). Because the models do not all have a common base year, each team was required to provide data for 2005 for comparison purposes. This paper compares the year 2005 information for different models, noting the degree of variation in important parameters, including population, GDP, primary energy, electricity, and CO2 emissions. It then explores the difference in these key parameters across different sources of base-year information. The analysis confirms that the sources provide different values for many key parameters. This variation across data sources and additional reasons why models might provide different base-year numbers, including differences in regional definitions, differences in model base year, and differences in GDP transformation methodologies, are then discussed in the context of the AME scenarios. Finally, the paper explores the implications of base-year variation on long-term model results.
Biologically-based pharmacokinetic models are being increasingly used in the risk assessment of environmental chemicals. These models are based on biological, mathematical, statistical and engineering principles. Their potential uses in risk assessment include extrapolation betwe...
Avian seasonal productivity is often modeled as a time-limited stochastic process. Many mathematical formulations have been proposed, including individual based models, continuous-time differential equations, and discrete Markov models. All such models typically include paramete...
Modeling formalisms in Systems Biology
2011-01-01
Systems Biology has taken advantage of computational tools and high-throughput experimental data to model several biological processes. These include signaling, gene regulatory, and metabolic networks. However, most of these models are specific to each kind of network. Their interconnection demands a whole-cell modeling framework for a complete understanding of cellular systems. We describe the features required by an integrated framework for modeling, analyzing and simulating biological processes, and review several modeling formalisms that have been used in Systems Biology including Boolean networks, Bayesian networks, Petri nets, process algebras, constraint-based models, differential equations, rule-based models, interacting state machines, cellular automata, and agent-based models. We compare the features provided by different formalisms, and discuss recent approaches in the integration of these formalisms, as well as possible directions for the future. PMID:22141422
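As a concrete taste of one of the listed formalisms, the snippet below runs a synchronous Boolean network with three invented gene-update rules; the other formalisms in the list (Petri nets, differential equations, agent-based models, and so on) would each need their own machinery.

```python
# Minimal sketch of one formalism named above: a synchronous Boolean network.
# The three-gene update rules are invented purely for illustration.
def step(state):
    a, b, c = state
    return (
        not c,        # gene A is repressed by C
        a,            # gene B follows A
        a and not b,  # gene C needs A and is repressed by B
    )

state = (True, False, False)
for t in range(6):
    print(t, state)      # the trajectory settles into a cycle
    state = step(state)
```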
Modelling of capillary-driven flow for closed paper-based microfluidic channels
NASA Astrophysics Data System (ADS)
Songok, Joel; Toivakka, Martti
2017-06-01
Paper-based microfluidics is an emerging field focused on creating inexpensive devices, with simple fabrication methods, for applications in various fields including healthcare, environmental monitoring and veterinary medicine. Understanding the flow of liquid is important for achieving consistent operation of the devices. This paper proposes capillary models to predict flow in paper-based microfluidic channels that include a flow-accelerating hydrophobic top cover. The models, which consider both non-absorbing and absorbing substrates, are in good agreement with the experimental results.
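For orientation, the classical open-capillary baseline that such models extend is the Lucas-Washburn relation L(t) = sqrt(γ r cosθ t / 2η). A sketch with assumed water-like parameters follows; the paper's closed-channel models with a hydrophobic cover are more elaborate than this.

```python
import numpy as np

def washburn_length(t, gamma, r, theta_deg, eta):
    """Lucas-Washburn wicking length: L(t) = sqrt(gamma * r * cos(theta) * t / (2 * eta)).

    A classical open-capillary baseline, not the paper's closed-channel model.
    gamma: surface tension (N/m), r: effective pore radius (m),
    theta_deg: contact angle (degrees), eta: dynamic viscosity (Pa s).
    """
    return np.sqrt(gamma * r * np.cos(np.radians(theta_deg)) * t / (2.0 * eta))

# Illustrative values: water (gamma ~ 0.072 N/m, eta ~ 1e-3 Pa s) in a
# ~10 micron effective pore radius with a 30 degree contact angle.
t = np.array([1.0, 10.0, 60.0])  # seconds
print(washburn_length(t, gamma=0.072, r=10e-6, theta_deg=30.0, eta=1e-3) * 1e3, "mm")
```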
Model-Based Economic Evaluation of Treatments for Depression: A Systematic Literature Review.
Kolovos, Spyros; Bosmans, Judith E; Riper, Heleen; Chevreul, Karine; Coupé, Veerle M H; van Tulder, Maurits W
2017-09-01
An increasing number of model-based studies that evaluate the cost effectiveness of treatments for depression are being published. These studies have different characteristics and use different simulation methods. We aimed to systematically review model-based studies evaluating the cost effectiveness of treatments for depression and examine which modelling technique is most appropriate for simulating the natural course of depression. The literature search was conducted in the databases PubMed, EMBASE and PsycInfo between 1 January 2002 and 1 October 2016. Studies were eligible if they used a health economic model with quality-adjusted life-years or disability-adjusted life-years as an outcome measure. Data related to various methodological characteristics were extracted from the included studies. The available modelling techniques were evaluated based on 11 predefined criteria. This methodological review included 41 model-based studies, of which 21 used decision trees (DTs), 15 used cohort-based state-transition Markov models (CMMs), two used individual-based state-transition models (ISMs), and three used discrete-event simulation (DES) models. Just over half of the studies (54%) evaluated antidepressants compared with a control condition. The data sources, time horizons, cycle lengths, perspectives adopted and number of health states/events all varied widely between the included studies. DTs scored positively in four of the 11 criteria, CMMs in five, ISMs in six, and DES models in seven. There were substantial methodological differences between the studies. Since the individual history of each patient is important for the prognosis of depression, DES and ISM simulation methods may be more appropriate than the others for a pragmatic representation of the course of depression. However, direct comparisons between the available modelling techniques are necessary to yield firm conclusions.
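To illustrate the cohort-based state-transition (CMM) structure that most of the reviewed studies used, here is a minimal three-state Markov sketch with invented transition probabilities and utility weights; real depression models would add treatment effects, costs, and discounting.

```python
import numpy as np

# Minimal cohort state-transition (Markov) sketch for depression: three states
# and a 3-month cycle. All probabilities and utilities are illustrative, not
# estimates from the reviewed studies.
states = ["depressed", "remission", "dead"]
P = np.array([[0.60, 0.38, 0.02],   # from depressed
              [0.15, 0.84, 0.01],   # from remission
              [0.00, 0.00, 1.00]])  # dead is absorbing
utilities = np.array([0.6, 0.85, 0.0])   # QALY weights per state

cohort = np.array([1.0, 0.0, 0.0])       # the whole cohort starts depressed
cycle_years = 0.25
qalys = 0.0
for cycle in range(20):                  # 5-year horizon
    qalys += cohort @ utilities * cycle_years
    cohort = cohort @ P                  # advance the cohort one cycle
print(f"expected QALYs over 5 years: {qalys:.2f}")
```

A cohort model like this tracks state occupancy fractions rather than individuals, which is exactly why the review argues that DES and ISM approaches, which carry individual histories, may suit depression better.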
QuakeSim: a Web Service Environment for Productive Investigations with Earth Surface Sensor Data
NASA Astrophysics Data System (ADS)
Parker, J. W.; Donnellan, A.; Granat, R. A.; Lyzenga, G. A.; Glasscoe, M. T.; McLeod, D.; Al-Ghanmi, R.; Pierce, M.; Fox, G.; Grant Ludwig, L.; Rundle, J. B.
2011-12-01
The QuakeSim science gateway environment includes a visually rich portal interface, web service access to data and data processing operations, and the QuakeTables ontology-based database of fault models and sensor data. The integrated tools and services are designed to assist investigators by covering the entire earthquake cycle of strain accumulation and release. The Web interface now includes Drupal-based access to diverse and changing content, with new ability to access data and data processing directly from the public page, as well as the traditional project management areas that require password access. The system is designed to make initial browsing of fault models and deformation data particularly engaging for new users. Popular data and data processing include GPS time series with data mining techniques to find anomalies in time and space, experimental forecasting methods based on catalogue seismicity, faulted deformation models (both half-space and finite element), and model-based inversion of sensor data. The fault models include the CGS and UCERF 2.0 faults of California and are easily augmented with self-consistent fault models from other regions. The QuakeTables deformation data include the comprehensive set of UAVSAR interferograms as well as a growing collection of satellite InSAR data. Fault interaction simulations are also being incorporated in the web environment based on Virtual California. A sample usage scenario is presented which follows an investigation of UAVSAR data from viewing as an overlay in Google Maps, to selection of an area of interest via a polygon tool, to fast extraction of the relevant correlation and phase information from large data files, to a model inversion of fault slip followed by calculation and display of a synthetic model interferogram.
A Physiologically-based Model for Methylmercury Uptake and Accumulation in Female American Kestrels
A physiologically-based model was developed to describe the uptake, distribution, and elimination of methylmercury in female American Kestrels (Falco sparverius). The model was adapted from established models for methylmercury in rodents. Features unique to the model include meth...
Investigating Island Evolution: A Galapagos-Based Lesson Using the 5E Instructional Model.
ERIC Educational Resources Information Center
DeFina, Anthony V.
2002-01-01
Introduces an inquiry-based lesson plan on evolution and the Galapagos Islands. Uses the 5E instructional model which includes phases of engagement, exploration, explanation, elaboration, and evaluation. Includes information on species for exploration and elaboration purposes, and a general rubric for student evaluation. (YDS)
NASA Technical Reports Server (NTRS)
Flowers, George T.
1994-01-01
Progress over the past year includes the following: A simplified rotor model with a flexible shaft and backup bearings has been developed. A simple rotor model which includes a flexible disk and bearings with clearance has been developed and the dynamics of the model investigated. A rotor model based upon the T-501 engine has been developed which includes backup bearing effects. Parallel simulation runs are being conducted using an ANSYS based finite element model of the T-501. The magnetic bearing test rig is currently floating and dynamics/control tests are being conducted. A paper has been written that documents the work using the T-501 engine model. Work has continued with the simplified model. The finite element model is currently being modified to include the effects of foundation dynamics. A literature search for material on foil bearings has been conducted. A finite element model is being developed for a magnetic bearing in series with a foil backup bearing.
Cloud-Based Tools to Support High-Resolution Modeling (Invited)
NASA Astrophysics Data System (ADS)
Jones, N.; Nelson, J.; Swain, N.; Christensen, S.
2013-12-01
The majority of watershed models developed to support decision-making by water management agencies are simple, lumped-parameter models. Maturity in research codes and advances in the computational power from multi-core processors on desktop machines, commercial cloud-computing resources, and supercomputers with thousands of cores have created new opportunities for employing more accurate, high-resolution distributed models for routine use in decision support. The barriers to using such models on a more routine basis include the massive amounts of spatial data that must be processed for each new scenario and the lack of efficient visualization tools. In this presentation we will review a current NSF-funded project called CI-WATER that is intended to overcome many of these roadblocks associated with high-resolution modeling. We are developing a suite of tools that will make it possible to deploy customized web-based apps for running custom scenarios for high-resolution models with minimal effort. These tools are based on a software stack that includes 52 North, MapServer, PostGIS, HTCondor, CKAN, and Python. This open-source stack provides a simple scripting environment for quickly configuring new custom applications for running high-resolution models as geoprocessing workflows. The HTCondor component facilitates simple access to local distributed computers or commercial cloud resources when necessary for stochastic simulations. The CKAN framework provides a powerful suite of tools for hosting such workflows in a web-based environment that includes visualization tools and storage of model simulations in a database for archiving, querying, and sharing of model results. Prototype applications including land use change, snow melt, and burned area analysis will be presented. This material is based upon work supported by the National Science Foundation under Grant No. 1135482.
Multilevel Modeling of Social Segregation
ERIC Educational Resources Information Center
Leckie, George; Pillinger, Rebecca; Jones, Kelvyn; Goldstein, Harvey
2012-01-01
The traditional approach to measuring segregation is based upon descriptive, non-model-based indices. A recently proposed alternative is multilevel modeling. The authors further develop the argument for a multilevel modeling approach by first describing and expanding upon its notable advantages, which include an ability to model segregation at a…
Hinckley, Daniel M.; Freeman, Gordon S.; Whitmer, Jonathan K.; de Pablo, Juan J.
2013-01-01
A new 3-Site-Per-Nucleotide coarse-grained model for DNA is presented. The model includes anisotropic potentials between bases involved in base stacking and base pair interactions that enable the description of relevant structural properties, including the major and minor grooves. In an improvement over available coarse-grained models, the correct persistence length is recovered for both ssDNA and dsDNA, allowing for simulation of non-canonical structures such as hairpins. DNA melting temperatures, measured for duplexes and hairpins by integrating over free energy surfaces generated using metadynamics simulations, are shown to be in quantitative agreement with experiment for a variety of sequences and conditions. Hybridization rate constants, calculated using forward-flux sampling, are also shown to be in good agreement with experiment. The coarse-grained model presented here is suitable for use in biological and engineering applications, including nucleosome positioning and DNA-templated engineering. PMID:24116642
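A hedged sketch of how a persistence length of the kind reported above can be measured from simulation output: fit the bond-vector correlation ⟨cos θ(s)⟩ = exp(−s/l_p). The synthetic "backbone" below is a correlated random walk, not output of the coarse-grained DNA model, and the bend-angle parameter is an assumption.

```python
import numpy as np

def persistence_length(positions, max_s=150):
    """Estimate persistence length (in bond units) from a backbone trace.

    Fits <cos theta(s)> = exp(-s / lp) to the bond-vector correlation; the
    slope of log-correlation vs separation s gives -1/lp.
    """
    bonds = np.diff(positions, axis=0)
    bonds /= np.linalg.norm(bonds, axis=1, keepdims=True)
    s = np.arange(1, min(max_s, len(bonds) // 2))
    corr = np.array([np.mean(np.sum(bonds[:-k] * bonds[k:], axis=1)) for k in s])
    slope, _ = np.polyfit(s, np.log(np.clip(corr, 1e-12, None)), 1)
    return -1.0 / slope

# Synthetic correlated random walk standing in for a CG-DNA backbone: with a
# bend-angle std of 0.1 rad per bond, 2-D worm-like-chain theory gives
# lp = 2 / 0.1**2 = 200 bonds, which the estimator should roughly recover.
rng = np.random.default_rng(2)
angles = np.cumsum(rng.normal(0.0, 0.1, 5000))
steps = np.column_stack([np.cos(angles), np.sin(angles), np.zeros_like(angles)])
positions = np.vstack([np.zeros(3), np.cumsum(steps, axis=0)])
print(f"estimated persistence length: {persistence_length(positions):.0f} bonds")
```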
NASA Technical Reports Server (NTRS)
Maluf, David A. (Inventor); Bell, David G. (Inventor); Gurram, Mohana M. (Inventor); Gawdiak, Yuri O. (Inventor)
2009-01-01
A system for managing a project that includes multiple tasks and a plurality of workers. Input information includes characterizations based upon a human model, a team model and a product model. Periodic reports, such as a monthly report, a task plan report, a budget report and a risk management report, are generated and made available for display or further analysis. An extensible database allows searching for information based upon context and upon content.
SECURITY MODELING FOR MARITIME PORT DEFENSE RESOURCE ALLOCATION
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harris, S.; Dunn, D.
2010-09-07
Redeployment of existing law enforcement resources and optimal use of geographic terrain are examined for countering the threat of a maritime-based small-vessel radiological or nuclear attack. The evaluation was based on modeling conducted by the Savannah River National Laboratory that involved the development of options for defensive resource allocation that can reduce the risk of a maritime-based radiological or nuclear threat. A diverse range of potential attack scenarios has been assessed. As a result of identifying vulnerable pathways, effective countermeasures can be deployed using current resources. The modeling involved the use of the Automated Vulnerability Evaluation for Risks of Terrorism (AVERT®) software to conduct computer-based simulation modeling. The models provided estimates for the probability of encountering an adversary based on allocated resources, including response boats, patrol boats and helicopters, over various environmental conditions including day, night, rough seas and various traffic flow rates.
ERIC Educational Resources Information Center
Moallem, Mahnaz
2001-01-01
Provides an overview of the process of designing and developing a Web-based course using instructional design principles and models, including constructivist and objectivist theories. Explains the process of implementing an instructional design model in designing a Web-based undergraduate course and evaluates the model based on course evaluations.…
Van Belle, Vanya; Pelckmans, Kristiaan; Van Huffel, Sabine; Suykens, Johan A K
2011-10-01
To compare and evaluate ranking, regression and combined machine learning approaches for the analysis of survival data. The literature describes two approaches based on support vector machines to deal with censored observations. In the first approach the key idea is to rephrase the task as a ranking problem via the concordance index, a problem which can be solved efficiently in a context of structural risk minimization and convex optimization techniques. In the second approach, one uses a regression approach, dealing with censoring by means of inequality constraints. The goal of this paper is then twofold: (i) introducing a new model combining the ranking and regression strategies, which retains the link with existing survival models such as the proportional hazards model via transformation models; and (ii) comparing the three techniques on 6 clinical and 3 high-dimensional datasets and discussing the relevance of these techniques over classical approaches for survival data. We compare SVM-based survival models based on ranking constraints, based on regression constraints, and based on both ranking and regression constraints. The performance of the models is compared by means of three different measures: (i) the concordance index, measuring the model's discriminating ability; (ii) the logrank test statistic, indicating whether patients with a prognostic index lower than the median prognostic index have a significantly different survival than patients with a prognostic index higher than the median; and (iii) the hazard ratio after normalization to restrict the prognostic index between 0 and 1. Our results indicate a significantly better performance for models including regression constraints over models based only on ranking constraints. This work gives empirical evidence that SVM-based models using regression constraints perform significantly better than SVM-based models based on ranking constraints. Our experiments show a comparable performance for methods including only regression or both regression and ranking constraints on clinical data. On high-dimensional data, the former model performs better. However, this approach does not have a theoretical link with standard statistical models for survival data. This link can be made by means of transformation models when ranking constraints are included.
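For reference, the concordance index used as the ranking criterion above can be computed directly. The sketch below is the plain Harrell-style O(n²) definition on toy data, not the authors' SVM formulation:

```python
import numpy as np

def concordance_index(time, event, risk_score):
    """Harrell-type concordance index for right-censored survival data.

    A pair (i, j) is comparable when the shorter observed time is an event;
    it is concordant when the earlier failure has the higher risk score.
    """
    concordant, comparable = 0.0, 0
    n = len(time)
    for i in range(n):
        for j in range(n):
            if event[i] == 1 and time[i] < time[j]:
                comparable += 1
                if risk_score[i] > risk_score[j]:
                    concordant += 1
                elif risk_score[i] == risk_score[j]:
                    concordant += 0.5   # ties count half
    return concordant / comparable

# Toy check: a risk score that tracks the true ordering should score near 1.
time = np.array([2.0, 5.0, 7.0, 9.0, 12.0])
event = np.array([1, 1, 0, 1, 0])
risk = np.array([0.9, 0.7, 0.5, 0.4, 0.1])
print(concordance_index(time, event, risk))  # 1.0 here: perfectly ranked
```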
Toward Agent-Based Models of the Development And Evolution of Business Relations and Networks
NASA Astrophysics Data System (ADS)
Wilkinson, Ian F.; Marks, Robert E.; Young, Louise
Firms achieve competitive advantage in part through the development of cooperative relations with other firms and organisations. We describe a program of research designed to map and model the development of cooperative inter-firm relations, including the processes and paths by which firms may evolve from adversarial to more cooperative relations. Narrative-event-history methods will be used to develop stylised histories of the emergence of business relations in various contexts and to identify relevant causal mechanisms to be included in the agent-based models of relationship and network evolution. The relationship histories will provide the means of assuring the agent-based models developed.
NASA Technical Reports Server (NTRS)
Flowers, George T.
1994-01-01
Substantial progress has been made toward the goals of this research effort in the past six months. A simplified rotor model with a flexible shaft and backup bearings has been developed. The model is based upon the work of Ishii and Kirk. Parameter studies of the behavior of this model are currently being conducted. A simple rotor model which includes a flexible disk and bearings with clearance has been developed and the dynamics of the model investigated. The study consists of simulation work coupled with experimental verification. The work is documented in the attached paper. A rotor model based upon the T-501 engine has been developed which includes backup bearing effects. The dynamics of this model are currently being studied with the objective of verifying the conclusions obtained from the simpler models. Parallel simulation runs are being conducted using an ANSYS based finite element model of the T-501.
Toward a Computational Model of Tutoring.
ERIC Educational Resources Information Center
Woolf, Beverly Park
1992-01-01
Discusses the integration of instructional science and computer science. Topics addressed include motivation for building knowledge-based systems; instructional design issues, including cognitive models, representing student intentions, and student models and error diagnosis; representing tutoring knowledge; building a tutoring system, including…
Comparison of a Conceptual Groundwater Model and a Physically Based Groundwater Model
NASA Astrophysics Data System (ADS)
Yang, J.; Zammit, C.; Griffiths, J.; Moore, C.; Woods, R. A.
2017-12-01
Groundwater is a vital resource for human activities including agricultural practice and urban water demand. Hydrologic modelling is an important way to study groundwater recharge, movement and discharge, and its response to both human activity and climate change. To understand groundwater hydrologic processes nationally in New Zealand, we have developed a conceptually based groundwater flow model, which is fully integrated into a national surface-water model (TopNet) and able to simulate groundwater recharge, movement, and interaction with surface water. To demonstrate the capability of this groundwater model (TopNet-GW), we applied it to an irrigated area with water shortage and pollution problems in the upper Ruamahanga catchment in the Greater Wellington Region, New Zealand, and compared its performance with a physically based groundwater model (MODFLOW). The comparison includes river flow at flow gauging sites and interaction between groundwater and river. Results showed that TopNet-GW produced similar flow and groundwater interaction patterns as the MODFLOW model but took less computation time. This shows the conceptually based groundwater model has the potential to simulate national groundwater processes and could be used as a surrogate for the more physically based model.
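As a sketch of what "conceptually based" can mean here, a lumped groundwater store is often a linear reservoir: dS/dt = R − Q with Q = S/k. The toy below is an assumption-level illustration, not TopNet-GW's actual formulation:

```python
import numpy as np

def linear_reservoir(recharge, k, s0=0.0):
    """Conceptual groundwater store: dS/dt = R - Q with Q = S / k.

    A minimal stand-in for the kind of lumped groundwater component a
    conceptual model uses; k is the storage residence time (days).
    """
    storage, discharge = s0, []
    for r in recharge:          # daily time steps
        storage += r            # add recharge
        q = storage / k         # release a fixed fraction of storage
        storage -= q
        discharge.append(q)
    return np.array(discharge)

recharge = np.zeros(30)
recharge[:5] = 10.0             # a 5-day recharge pulse (mm/day)
print(linear_reservoir(recharge, k=10.0).round(2))  # smooth recession follows
```

The appeal over a gridded model like MODFLOW is exactly what the abstract reports: far less computation, at the price of spatially lumped physics.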
2012-02-28
Interaction Model Based on Accelerated Reactive Molecular Dynamics for Hypersonic Conditions Including Thermal Conduction. Grant/Contract Number: FA9550-09-1-0157. Program Manager: Dr. John Schmisseur. PI: Schwartzentruber. ...through the boundary layer and may chemically react with the vehicle's thermal protection system (TPS). Many TPS materials act as a catalyst for the...
Simulating Runoff from a Grid Based Mercury Model: Flow Comparisons
Several mercury cycling models, including general mass balance approaches, mixed-batch reactors in streams or lakes, or regional process-based models, exist to assess the ecological exposure risks associated with anthropogenically increased atmospheric mercury (Hg) deposition, so...
Murray, Thomas A; Yuan, Ying; Thall, Peter F; Elizondo, Joan H; Hofstetter, Wayne L
2018-01-22
A design is proposed for randomized comparative trials with ordinal outcomes and prognostic subgroups. The design accounts for patient heterogeneity by allowing possibly different comparative conclusions within subgroups. The comparative testing criterion is based on utilities for the levels of the ordinal outcome and a Bayesian probability model. Designs based on two alternative models that include treatment-subgroup interactions are considered: the proportional odds model and a non-proportional odds model with a hierarchical prior that shrinks toward the proportional odds model. A third design that assumes homogeneity and ignores possible treatment-subgroup interactions also is considered. The three approaches are applied to construct group sequential designs for a trial of nutritional prehabilitation versus standard of care for esophageal cancer patients undergoing chemoradiation and surgery, including both untreated patients and salvage patients whose disease has recurred following previous therapy. A simulation study is presented that compares the three designs, including evaluation of within-subgroup type I and II error probabilities under a variety of scenarios including different combinations of treatment-subgroup interactions.
Performer-centric Interface Design.
ERIC Educational Resources Information Center
McGraw, Karen L.
1995-01-01
Describes performer-centric interface design and explains a model-based approach for conducting performer-centric analysis and design. Highlights include design methodology, including cognitive task analysis; creating task scenarios; creating the presentation model; creating storyboards; proof of concept screens; object models and icons;…
Johnson, Timothy R; Kuhn, Kristine M
2015-12-01
This paper introduces the ltbayes package for R. This package includes a suite of functions for investigating the posterior distribution of latent traits of item response models. These include functions for simulating realizations from the posterior distribution, profiling the posterior density or likelihood function, calculation of posterior modes or means, Fisher information functions and observed information, and profile likelihood confidence intervals. Inferences can be based on individual response patterns or sets of response patterns such as sum scores. Functions are included for several common binary and polytomous item response models, but the package can also be used with user-specified models. This paper introduces some background and motivation for the package, and includes several detailed examples of its use.
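ltbayes is an R package; as a language-neutral illustration of the inference it supports, the NumPy sketch below profiles the posterior of a latent trait under a two-parameter logistic (2PL) item response model with a standard-normal prior. The item parameters and response pattern are invented, and this is not the package's API.

```python
import numpy as np

# Grid-based sketch of latent-trait inference: posterior of theta under a 2PL
# IRT model with a standard-normal prior, given one binary response pattern.
# Item parameters below are invented for illustration.
a = np.array([1.2, 0.8, 1.5, 1.0])      # discriminations
b = np.array([-0.5, 0.0, 0.5, 1.0])     # difficulties
x = np.array([1, 1, 0, 1])              # observed responses

theta = np.linspace(-4, 4, 801)
p = 1.0 / (1.0 + np.exp(-a * (theta[:, None] - b)))   # P(correct | theta)
likelihood = np.prod(p**x * (1 - p)**(1 - x), axis=1)
posterior = likelihood * np.exp(-0.5 * theta**2)      # times N(0,1) prior
posterior /= np.trapz(posterior, theta)               # normalize on the grid

print("posterior mean:", np.trapz(theta * posterior, theta))
print("posterior mode:", theta[np.argmax(posterior)])
```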
Hamann, Hendrik F.; Hwang, Youngdeok; van Kessel, Theodore G.; Khabibrakhmanov, Ildar K.; Muralidhar, Ramachandran
2016-10-18
A method and a system to perform multi-model blending are described. The method includes obtaining one or more sets of predictions of historical conditions, the historical conditions corresponding with a time T that is historical in reference to the current time, and the one or more sets of predictions of the historical conditions being output by one or more models. The method also includes obtaining actual historical conditions, the actual historical conditions being measured conditions at the time T, assembling a training data set by designating the two or more sets of predictions of historical conditions as predictor variables and the actual historical conditions as response variables, and training a machine learning algorithm based on the training data set. The method further includes obtaining a blended model based on the machine learning algorithm.
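A minimal sketch of the described blending step: treat historical model outputs as predictors and measured truth as the response, then train a learner. scikit-learn's LinearRegression stands in for whatever algorithm the patent contemplates, and all data are synthetic.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)

# Hindcast period (time "T"): two hypothetical models' predictions of a
# condition (say, temperature) plus the measured truth. All values synthetic.
truth = rng.normal(20.0, 5.0, 200)
model_a = truth + rng.normal(1.0, 2.0, 200)    # warm-biased, noisy model
model_b = truth + rng.normal(-2.0, 1.0, 200)   # cool-biased but sharper model

# Training set: model outputs are the predictor variables, truth the response.
X_train = np.column_stack([model_a, model_b])
blender = LinearRegression().fit(X_train, truth)

# The blended model is then applied to new forecasts from the same models.
new_forecasts = np.array([[23.5, 18.9]])
print("blended forecast:", blender.predict(new_forecasts))
print("learned weights:", blender.coef_, "intercept:", blender.intercept_)
```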
Development and verification of an agent-based model of opinion leadership.
Anderson, Christine A; Titler, Marita G
2014-09-27
The use of opinion leaders is a strategy used to speed the process of translating research into practice. Much is still unknown about opinion leader attributes and activities and the context in which they are most effective. Agent-based modeling is a methodological tool that enables demonstration of the interactive and dynamic effects of individuals and their behaviors on other individuals in the environment. The purpose of this study was to develop and test an agent-based model of opinion leadership. The details of the design and verification of the model are presented. The agent-based model was developed by using a software development platform to translate an underlying conceptual model of opinion leadership into a computer model. Individual agent attributes (for example, motives and credibility) and behaviors (seeking or providing an opinion) were specified as variables in the model in the context of a fictitious patient care unit. The verification process was designed to test whether or not the agent-based model was capable of reproducing the conditions of the preliminary conceptual model. The verification methods included iterative programmatic testing ('debugging') and exploratory analysis of simulated data obtained from execution of the model. The simulation tests included a parameter sweep, in which the model input variables were adjusted systematically, followed by an individual time series experiment. Statistical analysis of model output for the 288 possible simulation scenarios in the parameter sweep revealed that the agent-based model performed consistently with the posited relationships in the underlying model. Nurse opinion leaders act on the strength of their beliefs and, as a result, become an opinion resource for their uncertain colleagues, depending on their perceived credibility. Over time, some nurses consistently act as this type of resource and have the potential to emerge as opinion leaders in a context where uncertainty exists. The development and testing of agent-based models is an iterative process. The opinion leader model presented here provides a basic structure for continued model development, ongoing verification, and the establishment of validation procedures, including empirical data collection.
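A stripped-down sketch of the opinion-seeking dynamic described above: agents with low certainty consult the most credible colleague, and agents who are consulted often become de facto opinion leaders. All attributes and parameters are invented; the published model has richer agent attributes, behaviors, and verification procedures.

```python
import random

# Toy agent-based model of opinion seeking on a fictitious patient care unit.
# Attributes and thresholds are invented for illustration only.
random.seed(4)

class Nurse:
    def __init__(self, name):
        self.name = name
        self.certainty = random.random()     # belief strength, 0..1
        self.credibility = random.random()   # perceived credibility, 0..1
        self.times_consulted = 0

unit = [Nurse(f"nurse{i}") for i in range(20)]

for step in range(100):
    seeker = random.choice(unit)
    if seeker.certainty < 0.5:               # uncertain agents seek opinions
        source = max((n for n in unit if n is not seeker),
                     key=lambda n: n.credibility)
        source.times_consulted += 1          # the source acts as a resource
        seeker.certainty = min(1.0, seeker.certainty + 0.1)

# Emergent "opinion leaders": the agents consulted most often.
leaders = sorted(unit, key=lambda n: n.times_consulted, reverse=True)[:3]
print([(n.name, n.times_consulted) for n in leaders])
```

A parameter sweep of the kind the abstract describes would wrap this loop in nested iterations over the certainty threshold, credibility distribution, and unit size.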
Second Generation Models for Strain-Based Design
DOT National Transportation Integrated Search
2011-08-30
This project covers the development of tensile strain design models which form a key part of the strain-based design of pipelines. The strain-based design includes at least two limit states, tensile rupture, and compressive buckling. The tensile stra...
Klein, Tracy Ann
2012-01-01
The purpose of this study was to identify and implement a competency-based regulatory model that transitions clinical nurse specialists (CNSs) to autonomous prescriptive authority pursuant to change in state law. Prescriptive authority for CNSs may be optional or restricted under current state law. Implementation of the APRN Consensus Model includes full prescriptive authority for all advanced practice registered nurses. Clinical nurse specialists face barriers to establishing their prescribing authority when laws or practice change. Identification of transition models will assist CNSs who need to add prescriptive authority to their scope of practice. Identification and implementation of a competency-based transition model for expansion of CNS prescriptive authority. By January 1, 2012, 9 CNSs in the state exemplar had completed a practicum and been granted full prescriptive authority, including scheduled drug prescribing. No complaints or board actions resulted from the transition to autonomous prescribing. Transition to prescribing may be facilitated through competency-based outcomes, including practicum hours as appropriate to the individual CNS nursing specialty. Outcomes from this model can be used to develop and further validate educational and credentialing policies to reduce barriers for CNSs requiring prescriptive authority in other states.
NASA Astrophysics Data System (ADS)
Benjanirat, Sarun
Next generation horizontal-axis wind turbines (HAWTs) will operate at very high wind speeds. Existing engineering approaches for modeling the flow phenomena are based on blade element theory and cannot adequately account for 3-D separated, unsteady flow effects. Therefore, researchers around the world are beginning to model these flows using first-principles-based computational fluid dynamics (CFD) approaches. In this study, an existing first-principles-based Navier-Stokes approach is being enhanced to model HAWTs at high wind speeds. The enhancements include improved grid topology, implicit time-marching algorithms, and advanced turbulence models. The advanced turbulence models include the Spalart-Allmaras one-equation model and the k-ε, k-ω and Shear Stress Transport (k-ω SST) models. These models are also integrated with detached eddy simulation (DES) models. Results are presented for a range of wind speeds for a configuration termed the National Renewable Energy Laboratory Phase VI rotor, tested at NASA Ames Research Center. Grid sensitivity studies are also presented. Additionally, the effects of existing transition models on the predictions are assessed. Data presented include power/torque production, radial distribution of normal and tangential pressure forces, root bending moments, and surface pressure fields. Good agreement was obtained between the predictions and experiments for most of the conditions, particularly with the Spalart-Allmaras DES model.
Vaidya, Anil; Joore, Manuela A; ten Cate-Hoek, Arina J; Kleinegris, Marie-Claire; ten Cate, Hugo; Severens, Johan L
2014-01-01
Lower extremity artery disease (LEAD) is a sign of widespread atherosclerosis also affecting the coronary, cerebral and renal arteries and is associated with increased risk of cardiovascular events. Many economic evaluations have been published for LEAD due to its clinical, social and economic importance. The aim of this systematic review was to assess modelling methods used in published economic evaluations in the field of LEAD. Our review appraised and compared the general characteristics, model structure and methodological quality of published models. The electronic databases MEDLINE and EMBASE were searched until February 2013 via the OVID interface. The Cochrane Database of Systematic Reviews, the Health Technology Assessment database hosted by the National Institute for Health Research, and the NHS Economic Evaluation Database (NHS EED) were also searched. The methodological quality of the included studies was assessed using the Philips checklist. Sixteen model-based economic evaluations were identified and included. Eleven models compared therapeutic health technologies; three models compared diagnostic tests; and two models compared a combination of diagnostic and therapeutic options for LEAD. The results of this systematic review revealed an acceptable to low methodological quality of the included studies. Methodological diversity and insufficient information posed a challenge for valid comparison of the included studies. In conclusion, there is a need for transparent, methodologically comparable and scientifically credible model-based economic evaluations in the field of LEAD. Future modelling studies should include clinically and economically important cardiovascular outcomes to reflect the wider impact of LEAD on individual patients and on society.
Quantifying the Global Nitrous Oxide Emissions Using a Trait-based Biogeochemistry Model
NASA Astrophysics Data System (ADS)
Zhuang, Q.; Yu, T.
2017-12-01
Nitrogen is an essential element for the global biogeochemical cycle. It is a key nutrient for organisms, and N compounds including nitrous oxide significantly influence the global climate. The activities of bacteria and archaea are responsible for nitrification and denitrification in a wide variety of environments, so microbes play an important role in the nitrogen cycle in soils. To date, most existing process-based models have treated nitrification and denitrification as chemical reactions driven by soil physical variables such as soil temperature and moisture. In general, the effect of microbes on N cycling has not been modeled in sufficient detail. Soil organic carbon also affects the N cycle because it supplies energy to microbes. In this study, a trait-based biogeochemistry model quantifying N2O emissions from terrestrial ecosystems is developed based on an extant process-based model, the Terrestrial Ecosystem Model (TEM). Specifically, the improvements to TEM include: 1) incorporating the N fixation process to account for the inflow of N from the atmosphere to the biosphere; 2) implementing the effects of microbial dynamics on the nitrification process; and 3) fully considering the effects of carbon cycling on nitrogen cycling following the principles of carbon and nitrogen stoichiometry in soils, plants, and microbes. The difference between simulations with and without the consideration of bacterial activity lies between 5% and 25%, depending on climate conditions and vegetation type. The trait-based module allows a more detailed estimation of global N2O emissions.
A Physiologically Based Kinetic Model of Rat and Mouse Gestation: Disposition of a Weak Acid
A physiologically based toxicokinetic model of gestation in the rat mouse has been developed. The model is superimposed on the normal growth curve for nonpregnant females. It describes the entire gestation period including organogenesis. The model consists of uterus, mammary tiss...
Engineers' Non-Scientific Models in Technology Education
ERIC Educational Resources Information Center
Norstrom, Per
2013-01-01
Engineers commonly use rules, theories and models that lack scientific justification. Examples include rules of thumb based on experience, but also models based on obsolete science or folk theories. Centrifugal forces, heat and cold as substances, and sucking vacuum all belong to the latter group. These models contradict scientific knowledge, but…
Automatic pole-like object modeling via 3D part-based analysis of point cloud
NASA Astrophysics Data System (ADS)
He, Liu; Yang, Haoxiang; Huang, Yuchun
2016-10-01
Pole-like objects, including trees, lampposts and traffic signs, are an indispensable part of urban infrastructure. With the advance of vehicle-based laser scanning (VLS), massive point clouds of roadside urban areas are becoming available for 3D digital city modeling. Based on the property that different pole-like objects have various canopy parts but similar trunk parts, this paper proposes a 3D part-based shape analysis to robustly extract, identify and model pole-like objects. The proposed method includes: 3D clustering and recognition of trunks, voxel growing and part-based 3D modeling. After preprocessing, each trunk center is identified as a point that has a local density peak and the largest minimum inter-cluster distance. Starting from the trunk centers, the remaining points are iteratively clustered to the same centers as their nearest point of higher density. To eliminate noisy points, cluster borders are refined by trimming boundary outliers. Then, candidate trunks are extracted based on the clustering results in three orthogonal planes by shape analysis. Voxel growing recovers the complete pole-like objects regardless of overlay. Finally, the entire trunk, branch and crown parts are analyzed to obtain seven feature parameters. These parameters are utilized to model the three parts respectively and obtain a single part-assembled 3D model. The proposed method is tested using the VLS-based point cloud of Wuhan University, China. The point cloud includes many kinds of trees, lampposts and other pole-like posts under different occlusions and overlays. Experimental results show that the proposed method can extract the exact attributes and model the roadside pole-like objects efficiently.
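The trunk-center rule described above (local density peak plus largest minimum distance to any denser point) is essentially a density-peak selection; below is a compact NumPy sketch of that rule on synthetic 2-D clumps, with illustrative parameters rather than the paper's settings.

```python
import numpy as np

def trunk_centers(points, radius=0.5, n_centers=3):
    """Pick cluster centers as density peaks with large separation.

    Implements the selection rule sketched above: a center has a high local
    point density and a large minimum distance to any denser point.
    """
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=2)
    density = (d < radius).sum(axis=1)           # local density: neighbors in radius
    delta = np.empty(len(points))
    for i in range(len(points)):
        denser = density > density[i]
        delta[i] = d[i, denser].min() if denser.any() else d[i].max()
    return np.argsort(density * delta)[::-1][:n_centers]

rng = np.random.default_rng(5)
# Three synthetic "trunks": tight 2-D point clumps plus background noise.
pts = np.vstack([rng.normal(c, 0.2, (40, 2)) for c in [(0, 0), (5, 0), (2, 4)]]
                + [rng.uniform(-2, 7, (30, 2))])
print(pts[trunk_centers(pts)])   # approximately recovers the three clump centers
```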
Royle, J. Andrew; Dorazio, Robert M.
2008-01-01
A guide to data collection, modeling and inference strategies for biological survey data using Bayesian and classical statistical methods. This book describes a general and flexible framework for modeling and inference in ecological systems based on hierarchical models, with a strict focus on the use of probability models and parametric inference. Hierarchical models represent a paradigm shift in the application of statistics to ecological inference problems because they combine explicit models of ecological system structure or dynamics with models of how ecological systems are observed. The principles of hierarchical modeling are developed and applied to problems in population, metapopulation, community, and metacommunity systems. The book provides the first synthetic treatment of many recent methodological advances in ecological modeling and unifies disparate methods and procedures. The authors apply principles of hierarchical modeling to ecological problems, including:
* occurrence or occupancy models for estimating species distribution
* abundance models based on many sampling protocols, including distance sampling
* capture-recapture models with individual effects
* spatial capture-recapture models based on camera trapping and related methods
* population and metapopulation dynamic models
* models of biodiversity, community structure and dynamics
Alternative Payment Models in Radiology: The Legislative and Regulatory Roadmap for Reform.
Silva, Ezequiel; McGinty, Geraldine B; Hughes, Danny R; Duszak, Richard
2016-10-01
The Medicare Access and CHIP Reauthorization Act (MACRA) replaces the sustainable growth rate with a payment system based on the Merit-Based Incentive Payment System and incentives for alternative payment model participation. It is important that radiologists understand the statutory requirements of MACRA. This includes the nature of the Merit-Based Incentive Payment System composite performance score and its impact on payments. The timeline for MACRA implementation is fairly aggressive and includes a robust effort to define episode groups, which include radiologic services. A number of organizations, including the ACR, are commenting on the structure of MACRA-directed initiatives.
ERIC Educational Resources Information Center
Moutinho, Sara; Moura, Rui; Vasconcelos, Clara
2017-01-01
Model-based learning is a methodology that facilitates students' construction of scientific knowledge, which sometimes includes restructuring their mental models. Taking into consideration students' learning processes, its aim is to promote a deeper understanding of the dynamics of phenomena through the manipulation of models. Our aim was to ascertain…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hinckley, Daniel M.; Freeman, Gordon S.; Whitmer, Jonathan K.
2013-10-14
A new 3-Site-Per-Nucleotide coarse-grained model for DNA is presented. The model includes anisotropic potentials between bases involved in base stacking and base pair interactions that enable the description of relevant structural properties, including the major and minor grooves. In an improvement over available coarse-grained models, the correct persistence length is recovered for both ssDNA and dsDNA, allowing for simulation of non-canonical structures such as hairpins. DNA melting temperatures, measured for duplexes and hairpins by integrating over free energy surfaces generated using metadynamics simulations, are shown to be in quantitative agreement with experiment for a variety of sequences and conditions. Hybridization rate constants, calculated using forward-flux sampling, are also shown to be in good agreement with experiment. The coarse-grained model presented here is suitable for use in biological and engineering applications, including nucleosome positioning and DNA-templated engineering.
NASA Astrophysics Data System (ADS)
Aldrin, John C.; Lindgren, Eric A.
2018-04-01
This paper expands on the objective and motivation for NDE-based characterization and includes a discussion of the current approach using model-assisted inversion being pursued within the Air Force Research Laboratory (AFRL). This includes a discussion of the multiple model-based methods that can be used, including physics-based models, deep machine learning, and heuristic approaches. The benefits and drawbacks of each method are reviewed and the potential to integrate multiple methods is discussed. Initial successes are included to highlight the ability to obtain quantitative values of damage. Additional steps remaining to realize this capability with statistical metrics of accuracy are discussed, and the use of these results to enable probabilistic life management is addressed. The outcome of this initiative will realize the long-term desired capability of NDE methods to provide quantitative characterization to accelerate certification of new materials and enhance life management of engineered systems.
Activity-Based Costing in the Naval Postgraduate School
2015-03-01
Contents excerpt: the Naval Postgraduate School activity-based costing model; modeling issues and limitations; cost pool allocations to production departments. Body fragment: [...] Activity, Monterey Bay (NSAMB). As such, MWR is not included in the costing model. Similarly, the cost of student salaries and benefits is not included.
Kline, Ronald M; Muldoon, L Daniel; Schumacher, Heidi K; Strawbridge, Larisa M; York, Andrew W; Mortimer, Laura K; Falb, Alison F; Cox, Katherine J; Bazell, Carol; Lukens, Ellen W; Kapp, Mary C; Rajkumar, Rahul; Bassano, Amy; Conway, Patrick H
2017-07-01
The Centers for Medicare & Medicaid Services developed the Oncology Care Model as an episode-based payment model to encourage participating practitioners to provide higher-quality, better-coordinated care at a lower cost to the nearly three-quarter million fee-for-service Medicare beneficiaries with cancer who receive chemotherapy each year. Episode payment models can be complex. They combine into a single benchmark price all payments for services during an episode of illness, many of which may be delivered at different times by different providers in different locations. Policy and technical decisions include the definition of the episode, including its initiation, duration, and included services; the identification of beneficiaries included in the model; and beneficiary attribution to practitioners with overall responsibility for managing their care. In addition, the calculation and risk adjustment of benchmark episode prices for the bundle of services must reflect geographic cost variations and diverse patient populations, including varying disease subtypes, medical comorbidities, changes in standards of care over time, the adoption of expensive new drugs (especially in oncology), as well as diverse practice patterns. Other steps include timely monitoring and intervention as needed to avoid shifting the attribution of beneficiaries on the basis of their expected episode expenditures as well as to ensure the provision of necessary medical services and the development of a meaningful link to quality measurement and improvement through the episode-based payment methodology. The complex and diverse nature of oncology business relationships and the specific rules and requirements of Medicare payment systems for different types of providers intensify these issues. The Centers for Medicare & Medicaid Services believes that by sharing its approach to addressing these decisions and challenges, it may facilitate greater understanding of the model within the oncology community and provide insight to others considering the development of episode-based payment models in the commercial or government sectors.
Comparison of measurement- and proxy-based Vs30 values in California
Yong, Alan K.
2016-01-01
This study was prompted by the recent availability of a significant amount of openly accessible measured VS30 values and the desire to investigate the trend of using proxy-based models to predict VS30 in the absence of measurements. Comparisons between measured and model-based values were performed. The measured data included 503 VS30 values collected from various projects for 482 seismographic station sites in California. Six proxy-based models—employing geologic mapping, topographic slope, and terrain classification—were also considered. Included was a new terrain class model based on the Yong et al. (2012) approach but recalibrated with updated measured VS30 values. Using the measured VS30 data as the metric for performance, the predictive capabilities of the six models were determined to be statistically indistinguishable. This study also found three models that tend to underpredict VS30 at lower velocities (NEHRP Site Classes D–E) and overpredict at higher velocities (Site Classes B–C).
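As a hedged illustration of the measured-versus-proxy comparison performed here (not the study's code or data), each proxy model can be scored by log-space residuals against the measured VS30 values, since VS30 is roughly log-normally distributed:

```python
import numpy as np

def score_proxy(vs30_measured, vs30_predicted):
    """Score a proxy-based VS30 model against measurements using
    log-space residuals (bias, scatter, and RMSE)."""
    r = np.log(vs30_predicted) - np.log(vs30_measured)
    return {"bias": r.mean(), "std": r.std(ddof=1),
            "rmse": np.sqrt((r ** 2).mean())}

# hypothetical values for two stations, purely for illustration
print(score_proxy(np.array([260.0, 410.0]), np.array([300.0, 380.0])))
```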
NASA Astrophysics Data System (ADS)
Malard, J. J.; Rojas, M.; Adamowski, J. F.; Gálvez, J.; Tuy, H. A.; Melgar-Quiñonez, H.
2015-12-01
While cropping models represent the biophysical aspects of agricultural systems, system dynamics modelling offers the possibility of representing the socioeconomic (including social and cultural) aspects of these systems. The two types of models can then be coupled in order to include the socioeconomic dimensions of climate change adaptation in the predictions of cropping models. We develop a dynamically coupled socioeconomic-biophysical model of agricultural production and its repercussions on food security in two case studies from Guatemala (a market-based, intensive agricultural system and a low-input, subsistence crop-based system). Through the specification of the climate inputs to the cropping model, the impacts of climate change on the entire system can be analysed, and the participatory nature of the system dynamics model-building process, in which stakeholders from NGOs to local governmental extension workers were included, helps ensure local trust in and use of the model. However, the analysis of climate variability's impacts on agroecosystems includes uncertainty, especially in the case of joint physical-socioeconomic modelling, and the explicit representation of this uncertainty in the participatory development of the models is important to ensure appropriate use of the models by the end users. In addition, standard model calibration, validation, and uncertainty interval estimation techniques used for physically-based models are impractical in the case of socioeconomic modelling. We present a methodology for the calibration and uncertainty analysis of coupled biophysical (cropping) and system dynamics (socioeconomic) agricultural models, using survey data and expert input to calibrate and evaluate the uncertainty of the system dynamics as well as of the overall coupled model. This approach offers an important tool for local decision makers to evaluate the potential impacts of climate change and their feedbacks through the associated socioeconomic system.
Constraint Based Modeling Going Multicellular.
Martins Conde, Patricia do Rosario; Sauter, Thomas; Pfau, Thomas
2016-01-01
Constraint based modeling has seen applications in many microorganisms. For example, there are now established methods to determine potential genetic modifications and external interventions to increase the efficiency of microbial strains in chemical production pipelines. In addition, multiple models of multicellular organisms have been created including plants and humans. While initially the focus here was on modeling individual cell types of the multicellular organism, this focus recently started to switch. Models of microbial communities, as well as multi-tissue models of higher organisms have been constructed. These models thereby can include different parts of a plant, like root, stem, or different tissue types in the same organ. Such models can elucidate details of the interplay between symbiotic organisms, as well as the concerted efforts of multiple tissues and can be applied to analyse the effects of drugs or mutations on a more systemic level. In this review we give an overview of the recent development of multi-tissue models using constraint based techniques and the methods employed when investigating these models. We further highlight advances in combining constraint based models with dynamic and regulatory information and give an overview of these types of hybrid or multi-level approaches.
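The workhorse analysis for the constraint-based models reviewed here is flux balance analysis: maximize an objective flux subject to steady-state mass balance S v = 0 and flux bounds. A minimal sketch with an invented two-metabolite toy network, not a real metabolic reconstruction:

```python
import numpy as np
from scipy.optimize import linprog

# Toy network: uptake -> A -> B -> biomass; maximize the biomass flux v[2].
S = np.array([[ 1, -1,  0],    # metabolite A: produced by uptake, then consumed
              [ 0,  1, -1]])   # metabolite B: converted into biomass
bounds = [(0, 10), (0, 10), (0, None)]  # flux bounds; uptake capped at 10
c = np.zeros(3); c[2] = -1.0            # linprog minimizes, so negate the objective

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
print("optimal biomass flux:", -res.fun, "fluxes:", res.x)
```

In a multi-tissue model of the kind discussed, the same linear program simply carries a block-structured stoichiometric matrix with exchange reactions linking the tissue compartments.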
Robust Programming Problems Based on the Mean-Variance Model Including Uncertainty Factors
NASA Astrophysics Data System (ADS)
Hasuike, Takashi; Ishii, Hiroaki
2009-01-01
This paper considers robust programming problems based on the mean-variance model, including uncertainty sets and fuzzy factors. Since these problems are not well-defined due to the fuzzy factors, it is hard to solve them directly. Therefore, by introducing chance constraints, fuzzy goals, and possibility measures, the proposed models are transformed into deterministic equivalent problems. Furthermore, in order to solve these equivalent problems efficiently, a solution method is constructed by introducing the mean-absolute deviation and performing equivalent transformations.
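For intuition on the mean-absolute-deviation device mentioned in the abstract, a Konno-Yamazaki-style sketch is shown below: replacing variance with absolute deviation turns portfolio selection into a linear program. The scenario returns are invented, and the fuzzy goals and chance constraints of the actual paper are omitted:

```python
import numpy as np
from scipy.optimize import linprog

R = np.array([[0.02, 0.01], [0.00, 0.03], [0.04, -0.01]])  # T scenarios x n assets
T, n = R.shape
mu = R.mean(axis=0)
r_min = 0.01                      # required expected return

# Variables: x (n weights) and y (T deviation slacks); minimize mean of y.
c = np.concatenate([np.zeros(n), np.ones(T) / T])
D = R - mu                        # centered returns per scenario
A_ub = np.vstack([np.hstack([ D, -np.eye(T)]),   #  (R_t - mu) @ x <= y_t
                  np.hstack([-D, -np.eye(T)]),   # -(R_t - mu) @ x <= y_t
                  np.concatenate([-mu, np.zeros(T)])[None, :]])  # mu @ x >= r_min
b_ub = np.concatenate([np.zeros(2 * T), [-r_min]])
A_eq = np.concatenate([np.ones(n), np.zeros(T)])[None, :]        # weights sum to 1
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=np.array([1.0]),
              bounds=[(0, None)] * (n + T), method="highs")
print("weights:", res.x[:n], "mean absolute deviation:", res.fun)
```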
Effects of Instructional Design with Mental Model Analysis on Learning.
ERIC Educational Resources Information Center
Hong, Eunsook
This paper presents a model for systematic instructional design that includes mental model analysis together with the procedures used in developing computer-based instructional materials in the area of statistical hypothesis testing. The instructional design model is based on the premise that the objective for learning is to achieve expert-like…
A Life-stage Physiologically-Based Pharmacokinetic (PBPK) model was developed to include descriptions of several life-stage events such as pregnancy, fetal development, the neonate and child growth. The overall modeling strategy was used for in vitro to in vivo (IVIVE) extrapolat...
Physiologically based kinetic (PBK) models are used widely throughout a number of working sectors, including academia and industry, to provide insight into the dosimetry related to observed adverse health effects in humans and other species. Use of these models has increased over...
Trainer, Asa; Hedberg, Thomas; Feeney, Allison Barnard; Fischer, Kevin; Rosche, Phil
2016-01-01
Advances in information technology triggered a digital revolution that holds promise of reduced costs, improved productivity, and higher quality. To ride this wave of innovation, manufacturing enterprises are changing how product definitions are communicated - from paper to models. To achieve industry's vision of the Model-Based Enterprise (MBE), the MBE strategy must include model-based data interoperability from design to manufacturing and quality in the supply chain. The Model-Based Definition (MBD) is created by the original equipment manufacturer (OEM) using Computer-Aided Design (CAD) tools. This information is then shared with the supplier so that they can manufacture and inspect the physical parts. Today, suppliers predominantly use Computer-Aided Manufacturing (CAM) and Coordinate Measuring Machine (CMM) models for these tasks. Traditionally, the OEM has provided design data to the supplier in the form of two-dimensional (2D) drawings, but may also include a three-dimensional (3D)-shape-geometry model, often in a standards-based format such as ISO 10303-203:2011 (STEP AP203). The supplier then creates the respective CAM and CMM models and machine programs to produce and inspect the parts. In the MBE vision for model-based data exchange, the CAD model must include product-and-manufacturing information (PMI) in addition to the shape geometry. Today's CAD tools can generate models with embedded PMI. And, with the emergence of STEP AP242, a standards-based model with embedded PMI can now be shared downstream. The ongoing research detailed in this paper investigates three concepts. First, whether a STEP AP242 model with embedded PMI can be used for CAD-to-CAM and CAD-to-CMM data exchange, to the benefit of the overall goal of a more efficient process. Second, the research identifies gaps in tools, standards, and processes that inhibit industry's ability to cost-effectively achieve model-based data interoperability in pursuit of the MBE vision. Third, it explores the interaction between CAD and CMM processes and determines whether the concept of feedback from CAM and CMM back to CAD is feasible. The main goal of our study is to test the hypothesis that model-based data interoperability from CAD-to-CAM and CAD-to-CMM is feasible through standards-based integration. This paper presents several barriers to model-based data interoperability. Overall, the project team demonstrated the exchange of product definition data between CAD, CAM, and CMM systems using standards-based methods. While gaps in standards coverage were identified, these gaps should not stop industry's progress toward MBE. The results of our study provide evidence in support of an open-standards method for model-based data interoperability, which would provide maximum value and impact to industry.
Practical Application of Model-based Programming and State-based Architecture to Space Missions
NASA Technical Reports Server (NTRS)
Horvath, Gregory; Ingham, Michel; Chung, Seung; Martin, Oliver; Williams, Brian
2006-01-01
A viewgraph presentation on developing models from systems engineers that accomplish mission objectives and manage the health of the system is shown. The topics include: 1) Overview; 2) Motivation; 3) Objective/Vision; 4) Approach; 5) Background: The Mission Data System; 6) Background: State-based Control Architecture System; 7) Background: State Analysis; 8) Overview of State Analysis; 9) Background: MDS Software Frameworks; 10) Background: Model-based Programming; 11) Background: Titan Model-based Executive; 12) Model-based Execution Architecture; 13) Compatibility Analysis of MDS and Titan Architectures; 14) Integrating Model-based Programming and Execution into the Architecture; 15) State Analysis and Modeling; 16) IMU Subsystem State Effects Diagram; 17) Titan Subsystem Model: IMU Health; 18) Integrating Model-based Programming and Execution into the Software IMU; 19) Testing Program; 20) Computationally Tractable State Estimation & Fault Diagnosis; 21) Diagnostic Algorithm Performance; 22) Integration and Test Issues; 23) Demonstrated Benefits; and 24) Next Steps
The Role of Inertia in Modeling Decisions from Experience with Instance-Based Learning
Dutt, Varun; Gonzalez, Cleotilde
2012-01-01
One form of inertia is the tendency to repeat the last decision irrespective of the obtained outcomes while making decisions from experience (DFE). A number of computational models based upon the Instance-Based Learning Theory, a theory of DFE, have included different inertia implementations and have been shown to simultaneously account for both risk-taking and alternations between alternatives. The role that inertia plays in these models, however, is unclear, as the same model without inertia is also able to account for observed risk-taking quite well. This paper demonstrates the predictive benefits of incorporating one particular implementation of inertia in an existing IBL model. We use two large datasets, estimation and competition, from the Technion Prediction Tournament involving a repeated binary-choice task to show that incorporating an inertia mechanism in an IBL model enables it to account for the observed average risk-taking and alternations. Including inertia, however, does not help the model to account for the trends in risk-taking and alternations over trials compared to the IBL model without the inertia mechanism. We generalize the two IBL models, with and without inertia, to the competition set by using the parameters determined in the estimation set. The generalization process demonstrates both the advantages and disadvantages of including inertia in an IBL model.
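A toy sketch of the inertia mechanism under study: with some probability the agent simply repeats its previous choice, otherwise it chooses by experienced value. The value rule and parameters are simplifications for illustration, not the published IBL model:

```python
import random

def ibl_with_inertia(payoffs, trials=100, p_inertia=0.3, noise=0.1):
    """Repeated binary choice: repeat the last pick with prob. p_inertia,
    otherwise pick the option with the higher (noisy) running mean payoff."""
    exp = {0: [0.0], 1: [0.0]}          # experienced outcomes per option
    last, alternations = None, 0
    for _ in range(trials):
        if last is not None and random.random() < p_inertia:
            choice = last                # inertia: repeat the previous decision
        else:
            value = {a: sum(v) / len(v) + random.gauss(0, noise)
                     for a, v in exp.items()}
            choice = max(value, key=value.get)
        exp[choice].append(payoffs[choice]())
        alternations += int(last is not None and choice != last)
        last = choice
    return alternations / (trials - 1)

# hypothetical gamble: option 0 is safe, option 1 is risky
alt_rate = ibl_with_inertia({0: lambda: 1.0,
                             1: lambda: 10.0 if random.random() < 0.1 else 0.0})
print("alternation rate:", alt_rate)
```

Raising p_inertia lowers the alternation rate, which is exactly the statistic the paper asks the mechanism to capture.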
Determinants of perceived sleep quality in normal sleepers.
Goelema, M S; Regis, M; Haakma, R; van den Heuvel, E R; Markopoulos, P; Overeem, S
2017-09-20
This study aimed to establish the determinants of perceived sleep quality over a longer period of time, taking into account the separate contributions of actigraphy-based sleep measures and self-reported sleep indices. Fifty participants (52 ± 6.6 years; 27 females) completed two consecutive weeks of home monitoring, during which they kept a sleep-wake diary while their sleep was monitored using a wrist-worn actigraph. The diary included questions on perceived sleep quality, sleep-wake information, and additional factors such as well-being and stress. The data were analyzed using multilevel models to compare a model that included only actigraphy-based sleep measures (model Acti) to a model that included only self-reported sleep measures to explain perceived sleep quality (model Self). In addition, a model based on the self-reported sleep measures and extended with nonsleep-related factors was analyzed to find the most significant determinants of perceived sleep quality (model Extended). Self-reported sleep measures (model Self) explained 61% of the total variance, while actigraphy-based sleep measures (model Acti) only accounted for 41% of the perceived sleep quality. The main predictors in the self-reported model were number of awakenings during the night, sleep onset latency, and wake time after sleep onset. In the extended model, the number of awakenings during the night and total sleep time of the previous night were the strongest determinants of perceived sleep quality, with 64% of the variance explained. In our cohort, perceived sleep quality was mainly determined by self-reported sleep measures and less by actigraphy-based sleep indices. These data further stress the importance of taking multiple nights into account when trying to understand perceived sleep quality.
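The Acti-versus-Self comparison described here maps naturally onto random-intercept multilevel regression on participant-night diary data. A hedged sketch using statsmodels; the column names describe a hypothetical long-format dataset, not the study's actual variables:

```python
import pandas as pd
import statsmodels.formula.api as smf

def fit_models(df: pd.DataFrame):
    """df: one row per participant-night with hypothetical columns."""
    # Model "Acti": actigraphy-based predictors only
    m_acti = smf.mixedlm("sleep_quality ~ acti_tst + acti_sol + acti_waso",
                         df, groups=df["participant"]).fit()
    # Model "Self": self-reported predictors only
    m_self = smf.mixedlm("sleep_quality ~ n_awakenings + diary_sol + diary_waso",
                         df, groups=df["participant"]).fit()
    return m_acti, m_self
```

Comparing the variance explained by each fitted model reproduces the kind of 41% versus 61% contrast reported in the abstract.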
Development, Implementation and Evaluation of a Physics-Based Windblown Dust Emission Model
A physics-based windblown dust emission parametrization scheme is developed and implemented in the CMAQ modeling system. A distinct feature of the present model is the incorporation of a newly developed, dynamic relation for the surface roughness length, which is important ...
van Kleef, R C; van Vliet, R C J A; van Rooijen, E M
2014-03-01
The Dutch basic health-insurance scheme for curative care includes a risk equalization model (RE-model) to compensate competing health insurers for the predictable high costs of people in poor health. Since 2004, this RE-model includes the so-called Diagnoses-based Cost Groups (DCGs) as a risk adjuster. Until 2013, these DCGs have been mainly based on diagnoses from inpatient hospital treatment. This paper examines (1) to what extent the Dutch RE-model can be improved by extending the inpatient DCGs with diagnoses from outpatient hospital treatment and (2) how to treat outpatient diagnoses relative to their corresponding inpatient diagnoses. Based on individual-level administrative costs we estimate the Dutch RE-model with three different DCG modalities. Using individual-level survey information from a prior year we examine the outcomes of these modalities for different groups of people in poor health. We find that extending DCGs with outpatient diagnoses has hardly any effect on the R-squared of the RE-model, but reduces the undercompensation for people with a chronic condition by about 8%. With respect to incentives, it may be preferable to make no distinction between corresponding inpatient and outpatient diagnoses in the DCG-classification, although this will be at the expense of the predictive accuracy of the RE-model.
Logic-Based Models for the Analysis of Cell Signaling Networks
2010-01-01
Computational models are increasingly used to analyze the operation of complex biochemical networks, including those involved in cell signaling networks. Here we review recent advances in applying logic-based modeling to mammalian cell biology. Logic-based models represent biomolecular networks in a simple and intuitive manner without describing the detailed biochemistry of each interaction. A brief description of several logic-based modeling methods is followed by six case studies that demonstrate biological questions recently addressed using logic-based models and point to potential advances in model formalisms and training procedures that promise to enhance the utility of logic-based methods for studying the relationship between environmental inputs and phenotypic or signaling state outputs of complex signaling networks.
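For a concrete feel of the logic-based formalism, here is a synchronous Boolean update of an invented three-node signaling motif with negative feedback (not one of the reviewed case studies); the motif oscillates, a qualitative behavior logic models capture without any kinetic parameters:

```python
def step(state):
    """Synchronous Boolean update for a toy ligand -> kinase -> output motif
    with negative feedback from the output onto the kinase."""
    return {
        "ligand": state["ligand"],                          # external input, held fixed
        "kinase": state["ligand"] and not state["output"],  # feedback inhibition
        "output": state["kinase"],
    }

state = {"ligand": True, "kinase": False, "output": False}
for t in range(6):
    print(t, state)
    state = step(state)   # the output toggles with period 4: sustained oscillation
```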
New V and V Tools for Diagnostic Modeling Environment (DME)
NASA Technical Reports Server (NTRS)
Pecheur, Charles; Nelson, Stacy; Merriam, Marshall (Technical Monitor)
2002-01-01
The purpose of this report is to provide correctness and reliability criteria for verification and validation (V&V) of Second Generation Reusable Launch Vehicle (RLV) Diagnostic Modeling Environment, describe current NASA Ames Research Center tools for V&V of Model Based Reasoning systems, and discuss the applicability of Advanced V&V to DME. This report is divided into the following three sections: (1) correctness and reliability criteria; (2) tools for V&V of Model Based Reasoning; and (3) advanced V&V applicable to DME. The Executive Summary includes an overview of the main points from each section. Supporting details, diagrams, figures, and other information are included in subsequent sections. A glossary, acronym list, appendices, and references are included at the end of this report.
Precursor Analysis for Flight- and Ground-Based Anomaly Risk Significance Determination
NASA Technical Reports Server (NTRS)
Groen, Frank
2010-01-01
This slide presentation reviews the precursor analysis for flight and ground based anomaly risk significance. It includes information on accident precursor analysis, real models vs. models, and probabilistic analysis.
A Cognitive Model Based on Neuromodulated Plasticity
Ruan, Xiaogang
2016-01-01
Associative learning, including classical conditioning and operant conditioning, is regarded as the most fundamental type of learning for animals and human beings. Many models have been proposed surrounding classical conditioning or operant conditioning. However, a unified and integrated model to explain the two types of conditioning is much less studied. Here, a model based on neuromodulated synaptic plasticity is presented. The model is bio-inspired, including a multistore memory module and simulated VTA dopaminergic neurons to produce the reward signal. The synaptic weights are modified according to the reward signal, which simulates the change of associative strengths in associative learning. Experimental results on real robots prove the suitability and validity of the proposed model.
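The weight update described (synaptic change gated by a dopamine-like reward signal) is commonly written as a three-factor Hebbian rule. A schematic sketch; the learning rate and activity vectors are invented for illustration:

```python
import numpy as np

def update_weights(w, pre, post, reward, lr=0.05):
    """Three-factor rule: weight change = learning rate x reward signal x
    coincidence of pre- and post-synaptic activity."""
    return w + lr * reward * np.outer(post, pre)

w = np.zeros((2, 3))
pre, post = np.array([1.0, 0.0, 1.0]), np.array([0.0, 1.0])
w = update_weights(w, pre, post, reward=+1.0)   # rewarded pairing strengthens links
w = update_weights(w, pre, post, reward=-0.5)   # punishment weakens the same links
print(w)
```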
Model-based diagnostics for Space Station Freedom
NASA Technical Reports Server (NTRS)
Fesq, Lorraine M.; Stephan, Amy; Martin, Eric R.; Lerutte, Marcel G.
1991-01-01
An innovative approach to fault management was recently demonstrated for the NASA LeRC Space Station Freedom (SSF) power system testbed. This project capitalized on research in model-based reasoning, which uses knowledge of a system's behavior to monitor its health. The fault management system (FMS) can isolate failures online, or in a post analysis mode, and requires no knowledge of failure symptoms to perform its diagnostics. An in-house tool called MARPLE was used to develop and run the FMS. MARPLE's capabilities are similar to those available from commercial expert system shells, although MARPLE is designed to build model-based as opposed to rule-based systems. These capabilities include functions for capturing behavioral knowledge, a reasoning engine that implements a model-based technique known as constraint suspension, and a tool for quickly generating new user interfaces. The prototype produced by applying MARPLE to SSF not only demonstrated that model-based reasoning is a valuable diagnostic approach, but it also suggested several new applications of MARPLE, including an integration and testing aid, and a complement to state estimation.
Preliminary Cost Model for Space Telescopes
NASA Technical Reports Server (NTRS)
Stahl, H. Philip; Prince, F. Andrew; Smart, Christian; Stephens, Kyle; Henrichs, Todd
2009-01-01
Parametric cost models are routinely used to plan missions, compare concepts and justify technology investments. However, great care is required. Some space telescope cost models, such as those based only on mass, lack sufficient detail to support such analysis and may lead to inaccurate conclusions. Similarly, using ground based telescope models which include the dome cost will also lead to inaccurate conclusions. This paper reviews current and historical models. Then, based on data from 22 different NASA space telescopes, this paper tests those models and presents preliminary analysis of single and multi-variable space telescope cost models.
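Single-variable parametric cost models of the kind tested here are typically power laws in aperture diameter, fit by ordinary least squares in log-log space. A sketch with made-up (diameter, cost) pairs standing in for the 22-telescope dataset, which is not reproduced here:

```python
import numpy as np

# hypothetical (aperture [m], cost [$M]) pairs -- illustrative only
D = np.array([0.85, 1.1, 2.4, 3.5, 6.5])
cost = np.array([150., 260., 1500., 3200., 8800.])

# fit cost = a * D^b  <=>  log(cost) = log(a) + b * log(D)
b, log_a = np.polyfit(np.log(D), np.log(cost), 1)
print(f"cost ~ {np.exp(log_a):.0f} * D^{b:.2f}  [$M]")
```

Multi-variable versions simply add further regressors (e.g., wavelength, year of development) to the same log-linear fit.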
Predicting the Magnetic Properties of ICMEs: A Pragmatic View
NASA Astrophysics Data System (ADS)
Riley, P.; Linker, J.; Ben-Nun, M.; Torok, T.; Ulrich, R. K.; Russell, C. T.; Lai, H.; de Koning, C. A.; Pizzo, V. J.; Liu, Y.; Hoeksema, J. T.
2017-12-01
The southward component of the interplanetary magnetic field plays a crucial role in being able to successfully predict space weather phenomena. Yet, thus far, it has proven extremely difficult to forecast with any degree of accuracy. In this presentation, we describe an empirically-based modeling framework for estimating Bz values during the passage of interplanetary coronal mass ejections (ICMEs). The model includes: (1) an empirically-based estimate of the magnetic properties of the flux rope in the low corona (including helicity and field strength); (2) an empirically-based estimate of the dynamic properties of the flux rope in the high corona (including direction, speed, and mass); and (3) a physics-based estimate of the evolution of the flux rope during its passage to 1 AU driven by the output from (1) and (2). We compare model output with observations for a selection of events to estimate the accuracy of this approach. Importantly, we pay specific attention to the uncertainties introduced by the components within the framework, separating intrinsic limitations from those that can be improved upon, either by better observations or more sophisticated modeling. Our analysis suggests that current observations/modeling are insufficient for this empirically-based framework to provide reliable and actionable prediction of the magnetic properties of ICMEs. We suggest several paths that may lead to better forecasts.
NASA Astrophysics Data System (ADS)
McConnell, William J.
Due to the call of current science education reform for the integration of engineering practices within science classrooms, design-based instruction is receiving much attention in science education literature. Although some aspect of modeling is often included in well-known design-based instructional methods, it is not always a primary focus. The purpose of this study was to better understand how design-based instruction with an emphasis on scientific modeling might impact students' spatial abilities and their model-based argumentation abilities. In the following mixed-method multiple case study, seven seventh grade students attending a secular private school in the Mid-Atlantic region of the United States underwent an instructional intervention involving design-based instruction, modeling and argumentation. Through the course of a lesson involving students in exploring the interrelatedness of the environment and an animal's form and function, students created and used multiple forms of expressed models to assist them in model-based scientific argument. Pre/post data were collected through the use of The Purdue Spatial Visualization Test: Rotation, the Mental Rotation Test and interviews. Other data included a spatial activities survey, student artifacts in the form of models, notes, exit tickets, and video recordings of students throughout the intervention. Spatial abilities tests were analyzed using descriptive statistics while students' arguments were analyzed using the Instrument for the Analysis of Scientific Curricular Arguments and a behavior protocol. Models were analyzed using content analysis, and interviews and all other data were coded and analyzed for emergent themes. Findings in the area of spatial abilities included increases in spatial reasoning for six out of seven participants, and an immense difference in the spatial challenges encountered by students when using CAD software instead of paper drawings to create models. Students perceived 3D printed models to better assist them in scientific argumentation than paper drawing models. In fact, when given a choice, students rarely used paper drawings to assist in argument. There was also a difference in model utility between the two model types. Participants explicitly used 3D printed models to complete gestural modeling, while participants rarely looked at 2D models when involved in gestural modeling. This study's findings added to current theory dealing with the varied spatial challenges involved in different modes of expressed models. This study found that depth, symmetry and the manipulation of perspectives are typically spatial challenges students will attend to when using CAD, while they will typically ignore them when drawing using paper and pencil. This study also revealed a major difference in model-based argument in a design-based instruction context as opposed to model-based argument in a typical science classroom context. In the context of design-based instruction, data revealed that the design process is an important part of model-based argument. Due to the importance of the design process in model-based argumentation in this context, trusted methods of argument analysis, like the coding system of the IASCA, were found lacking in many respects. Limitations and recommendations for further research are also presented.
Representing Micro-Macro Linkages by Actor-Based Dynamic Network Models
Snijders, Tom A.B.; Steglich, Christian E.G.
2014-01-01
Stochastic actor-based models for network dynamics have the primary aim of statistical inference about processes of network change, but may be regarded as a kind of agent-based models. Similar to many other agent-based models, they are based on local rules for actor behavior. Different from many other agent-based models, by including elements of generalized linear statistical models they aim to be realistic detailed representations of network dynamics in empirical data sets. Statistical parallels to micro-macro considerations can be found in the estimation of parameters determining local actor behavior from empirical data, and the assessment of goodness of fit from the correspondence with network-level descriptives. This article studies several network-level consequences of dynamic actor-based models applied to represent cross-sectional network data. Two examples illustrate how network-level characteristics can be obtained as emergent features implied by micro-specifications of actor-based models.
A Model for Recruiting the New Community College Student.
ERIC Educational Resources Information Center
Pitt Community Coll., Greenville, NC.
This proposal represents a systematic and research-based effort to revitalize traditional recruiting practices and to establish a new recruitment model for reaching the nontraditional student in North Carolina two-year colleges. It includes activities of and findings from a model adult recruiting project. A project overview includes the objectives…
A Survey of Model Evaluation Approaches with a Tutorial on Hierarchical Bayesian Methods
ERIC Educational Resources Information Center
Shiffrin, Richard M.; Lee, Michael D.; Kim, Woojae; Wagenmakers, Eric-Jan
2008-01-01
This article reviews current methods for evaluating models in the cognitive sciences, including theoretically based approaches, such as Bayes factors and minimum description length measures; simulation approaches, including model mimicry evaluations; and practical approaches, such as validation and generalization measures. This article argues…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simpson, L.; Britt, J.; Birkmire, R.
ITN Energy Systems, Inc., and Global Solar Energy, Inc., assisted by NREL's PV Manufacturing R&D program, have continued to advance CIGS production technology by developing trajectory-oriented predictive/control models, fault-tolerance control, control platform development, in-situ sensors, and process improvements. Modeling activities included developing physics-based and empirical models for CIGS and sputter-deposition processing, implementing model-based control, and applying predictive models to the construction of new evaporation sources and for control. Model-based control is enabled by implementing reduced or empirical models into a control platform. Reliability improvement activities include implementing preventive maintenance schedules; detecting failed sensors/equipment and reconfiguring to continue processing; and systematic development of fault prevention and reconfiguration strategies for the full range of CIGS PV production deposition processes. In-situ sensor development activities have resulted in improved control and indicated the potential for enhanced process status monitoring and control of the deposition processes. Substantial process improvements have been made, including significant improvement in CIGS uniformity, thickness control, efficiency, yield, and throughput. In large measure, these gains have been driven by process optimization, which in turn has been enabled by control and reliability improvements due to this PV Manufacturing R&D program.
A fugacity-based indoor residential pesticide fate model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bennett, Deborah H.; Furtaw, Edward J.; McKone, Thomas E.
Dermal and non-dietary pathways are potentially significant exposure pathways to pesticides used in residences. Exposure pathways include dermal contact with residues on surfaces, ingestion from hand- and object-to-mouth activities, and absorption of pesticides into food. A limited amount of data has been collected on pesticide concentrations in various residential compartments following an application. But models are needed to interpret this data and make predictions about other pesticides based on chemical properties. In this paper, we propose a mass-balance compartment model based on fugacity principles. We include air (both gas phase and aerosols), carpet, smooth flooring, and walls as model compartments. Pesticide concentrations on furniture and toys, and in food, are being added to the model as data becomes available. We determine the compartmental fugacity capacity and mass transfer-rate coefficient for wallboard as an example. We also present the framework and equations needed for a dynamic mass-balance model.
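The bookkeeping behind such a model can be sketched as follows: each compartment holds mass N_i with fugacity f_i = N_i / (Z_i V_i), and transfer between compartments is driven by fugacity differences through D-value conductances. All numbers below are invented placeholders, and degradation and ventilation losses are omitted:

```python
import numpy as np
from scipy.integrate import solve_ivp

Z = np.array([4e-4, 3e2, 1e2])     # fugacity capacities (air, carpet, floor)
V = np.array([50.0, 1.0, 0.5])     # compartment volumes [m^3]
D = np.array([[0.0, 2e-3, 1e-3],   # D[i, j]: transfer conductance i -> j
              [1e-4, 0.0, 0.0],
              [5e-5, 0.0, 0.0]])

def dNdt(t, N):
    f = N / (Z * V)                          # fugacity of each compartment
    flux = D * (f[:, None] - f[None, :])     # pairwise net transfer rates i -> j
    return -flux.sum(axis=1) + flux.sum(axis=0)

N0 = np.array([1.0, 0.0, 0.0])               # pesticide initially applied to air
sol = solve_ivp(dNdt, (0, 1e4), N0, method="LSODA")
print(sol.y[:, -1])                          # mass distribution at the end time
```

Total mass is conserved by construction, so any difference from the initial inventory flags a bookkeeping bug rather than a physical loss term.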
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kruzic, Jamie J.; Evans, T. Matthew; Greaney, P. Alex
The report describes the development of a discrete element method (DEM) based modeling approach to quantitatively predict deformation and failure of typical nickel based superalloys. A series of experimental data, including microstructure and mechanical property characterization at 600°C, was collected for a relatively simple, model solid solution Ni-20Cr alloy (Nimonic 75) to determine inputs for the model and provide data for model validation. Nimonic 75 was considered ideal for this study because it is a certified tensile and creep reference material. A series of new DEM modeling approaches were developed to capture the complexity of metal deformation, including cubic elastic anisotropy and plastic deformation both with and without strain hardening. Our model approaches were implemented into a commercially available DEM code, PFC3D, that is commonly used by engineers. It is envisioned that once further developed, this new DEM modeling approach can be adapted to a wide range of engineering applications.
Hyper-Book: A Formal Model for Electronic Books.
ERIC Educational Resources Information Center
Catenazzi, Nadia; Sommaruga, Lorenzo
1994-01-01
Presents a model for electronic books based on the paper book metaphor. Discussion includes how the book evolves under the effects of its functional components; the use and impact of the model for organizing and presenting electronic documents in the context of electronic publishing; and the possible applications of a system based on the model.…
Highly Relevant Mentoring (HRM) as a Faculty Development Model for Web-Based Instruction
ERIC Educational Resources Information Center
Carter, Lorraine; Salyers, Vincent; Page, Aroha; Williams, Lynda; Albl, Liz; Hofsink, Clarence
2012-01-01
This paper describes a faculty development model called the highly relevant mentoring (HRM) model; the model includes a framework as well as some practical strategies for meeting the professional development needs of faculty who teach web-based courses. The paper further emphasizes the need for faculty and administrative buy-in for HRM and…
Models for Theory-Based M.A. and Ph.D. Programs.
ERIC Educational Resources Information Center
Botan, Carl; Vasquez, Gabriel
1999-01-01
Presents work accomplished at the 1998 National Communication Association Summer Conference. Outlines reasons for theory-based education in public relations. Presents an integrated model of student outcomes, curriculum, pedagogy, and assessment for theory-based master's and doctoral programs, including assumptions made and rationale for such…
Efficacy of a surfactant-based wound dressing on biofilm control.
Percival, Steven L; Mayer, Dieter; Salisbury, Anne-Marie
2017-09-01
The aim of this study was to evaluate the efficacy of both a nonantimicrobial and an antimicrobial (1% silver sulfadiazine, SSD) surfactant-based wound dressing in the control of Pseudomonas aeruginosa, Enterococcus sp., Staphylococcus epidermidis, Staphylococcus aureus, and methicillin-resistant S. aureus (MRSA) biofilms. Anti-biofilm efficacy was evaluated in several adapted American Society for Testing and Materials (ASTM) standard biofilm models and other bespoke biofilm models. The ASTM standard models employed included the minimum biofilm eradication concentration (MBEC) biofilm model (ASTM E2799) and the Centers for Disease Control (CDC) biofilm reactor model (ASTM 2871). The bespoke biofilm models included the filter biofilm model and the chamber-slide biofilm model. Results showed complete kill of microorganisms within a biofilm using the antimicrobial surfactant-based wound dressing. Interestingly, the nonantimicrobial surfactant-based dressing could disrupt existing biofilms by causing biofilm detachment. Prior to biofilm detachment, we demonstrated, using confocal laser scanning microscopy (CLSM), the dispersive effect of the nonantimicrobial surfactant-based wound dressing on the biofilm within 10 minutes of treatment. Furthermore, the nonantimicrobial surfactant-based wound dressing caused an increase in microbial flocculation/aggregation, important for microbial concentration. In conclusion, this nonantimicrobial surfactant-based wound dressing leads to the effective detachment and dispersion of in vitro biofilms. The use of surfactant-based wound dressings in a clinical setting may help to disrupt existing biofilm from wound tissue and may increase the action of antimicrobial treatment.
Construction of dynamic stochastic simulation models using knowledge-based techniques
NASA Technical Reports Server (NTRS)
Williams, M. Douglas; Shiva, Sajjan G.
1990-01-01
Over the past three decades, computer-based simulation models have proven themselves to be cost-effective alternatives to the more structured deterministic methods of systems analysis. During this time, many techniques, tools and languages for constructing computer-based simulation models have been developed. More recently, advances in knowledge-based system technology have led many researchers to note the similarities between knowledge-based programming and simulation technologies and to investigate the potential application of knowledge-based programming techniques to simulation modeling. The integration of conventional simulation techniques with knowledge-based programming techniques is discussed to provide a development environment for constructing knowledge-based simulation models. A comparison of the techniques used in the construction of dynamic stochastic simulation models and those used in the construction of knowledge-based systems provides the requirements for the environment. This leads to the design and implementation of a knowledge-based simulation development environment. These techniques were used in the construction of several knowledge-based simulation models including the Advanced Launch System Model (ALSYM).
Kim, Sook-Nam; Choi, Soon-Ock; Shin, Seong Hoon; Ryu, Ji-Sun; Baik, Jeong-Won
2017-07-01
A feasible palliative care model for advance cancer patients is needed in Korea with its rapidly aging population and corresponding increase in cancer prevalence. This study describes the process involved in the development of a community-based palliative care (CBPC) model implemented originally in a Busan pilot project. The model development included steps I and II of the pilot project, identification of the service types, a survey exploring the community demand for palliative care, construction of an operational infrastructure, and the establishment of a service delivery system. Public health centers (including Busan regional cancer centers, palliative care centers, and social welfare centers) served as the regional hubs in the development of a palliative care model. The palliative care project included the provision of palliative care, establishment of a support system for the operations, improvement of personnel capacity, development of an educational and promotional program, and the establishment of an assessment system to improve quality. The operational infrastructure included a service management team, provision teams, and a support team. The Busan Metropolitan City CBPC model was based on the principles of palliative care as well as the characteristics of public health centers that implemented the community health projects. The potential use of the Busan CBPC model in Korea should be explored further through service evaluations.
A comprehensive probabilistic analysis model of oil pipelines network based on Bayesian network
NASA Astrophysics Data System (ADS)
Zhang, C.; Qin, T. X.; Jiang, B.; Huang, C.
2018-02-01
An oil pipeline network is one of the most important facilities for energy transportation, but a pipeline network accident may result in serious disasters. Analysis models for these accidents have been established mainly on three methods: event trees, accident simulation, and Bayesian networks. Among these methods, the Bayesian network is suitable for probabilistic analysis, but not all the important influencing factors have been considered, and a deployment rule for the factors has not been established. This paper proposes a probabilistic analysis model of oil pipeline networks based on a Bayesian network. Most of the important influencing factors, including key environmental conditions and emergency response, are considered in this model. Moreover, the paper also introduces a deployment rule for these factors. The model can be used in probabilistic analysis and sensitivity analysis of oil pipeline network accidents.
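As a toy illustration of the probabilistic machinery (a hand-rolled two-node network with invented probabilities; a real pipeline model would also carry environment and emergency-response nodes):

```python
# P(leak) depends on corrosion; P(corrosion) is the prior.
p_corrosion = 0.1
p_leak_given = {True: 0.30, False: 0.02}   # P(leak | corrosion state)

# Marginal probability of a leak, by enumeration over the parent node.
p_leak = sum(p_leak_given[c] * (p_corrosion if c else 1 - p_corrosion)
             for c in (True, False))

# Diagnostic query via Bayes' rule: P(corrosion | leak observed).
p_corr_given_leak = p_leak_given[True] * p_corrosion / p_leak
print(f"P(leak) = {p_leak:.3f}, P(corrosion | leak) = {p_corr_given_leak:.3f}")
```

Here P(leak) = 0.048 and P(corrosion | leak) = 0.625; full networks repeat this enumeration over many conditional probability tables.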
Creep fatigue life prediction for engine hot section materials (ISOTROPIC)
NASA Technical Reports Server (NTRS)
Nelson, R. S.; Schoendorf, J. F.; Lin, L. S.
1986-01-01
The specific activities summarized include: verification experiments (base program); thermomechanical cycling model; multiaxial stress state model; cumulative loading model; screening of potential environmental and protective coating models; and environmental attack model.
Sakhteman, Amirhossein; Zare, Bijan
2016-01-01
An interactive application, Modelface, is presented for the Modeller software, based on the Windows platform. The application is able to run all steps of homology modeling, including PDB-to-FASTA generation, running Clustal, model building, and loop refinement. Other modules of Modeller, including energy calculation, energy minimization, and the ability to make single-point mutations in PDB structures, are also implemented inside Modelface. The application is a simple batch-based tool with no memory occupation and is free of charge for academic use. It is also able to repair missing atom types in PDB structures, making it suitable for many molecular modeling studies such as docking and molecular dynamics simulation. Some successful instances of modeling studies using Modelface are also reported.
Rothman, Jessica; Kayigamba, Felix; Hills, Victoria; Gupta, Neil; Machara, Faustin; Niyigena, Peter; Franke, Molly F
2018-01-01
The objective of this study was to examine how food insecurity changed among HIV-positive adults during the first 12 months of combination antiretroviral therapy (cART) and whether any change differed according to the receipt of food support, which was provided in the context of a comprehensive community-based intervention. We conducted secondary data analyses of data from a prospective cohort study of the effectiveness of a community-based cART delivery model when added to clinic-based cART delivery in Rwanda. We included patients from four health centers that implemented a clinic-based cART delivery model alone and five health centers that additionally implemented the intervention, which included 10 months of food support. We compared food insecurity at 3, 6, and 12 months, relative to baseline, and stratified by receipt of the intervention. Relative to baseline, median food insecurity score decreased after 3, 6, and 12 months (p value <0.0001 for all) for patients receiving a food ration through the community-based model for cART delivery. Among patients receiving care under the clinic-based cART model, food insecurity scores remained unchanged at 3 and 12 months and were significantly higher after 6 months. In adjusted analyses, participants enrolled in the community-based intervention with a food ration had a lower risk of severe food insecurity and a lower risk of moderate or severe food insecurity after 12 months. A comprehensive community-based HIV program including a food ration likely contributes to an alleviation of food insecurity among adults newly initiating cART.
Designing Interactive Learning Systems.
ERIC Educational Resources Information Center
Barker, Philip
1990-01-01
Describes multimedia, computer-based interactive learning systems that support various forms of individualized study. Highlights include design models; user interfaces; design guidelines; media utilization paradigms, including hypermedia and learner-controlled models; metaphors and myths; authoring tools; optical media; workstations; four case…
NASA Astrophysics Data System (ADS)
Barker, J. Burdette
Spatially informed irrigation management may improve the optimal use of water resources. Sub-field scale water balance modeling and measurement were studied in the context of irrigation management. A spatial remote-sensing-based evapotranspiration and soil water balance model was modified and validated for use in real-time irrigation management. The modeled ET compared well with eddy covariance data from eastern Nebraska. Placement and quantity of sub-field scale soil water content measurement locations were also studied. Variance reduction factor and temporal stability were used to analyze soil water content data from an eastern Nebraska field. No consistent predictor of soil water temporal stability patterns was identified. At least three monitoring locations were needed per irrigation management zone to adequately quantify the mean soil water content. The remote-sensing-based water balance model was used to manage irrigation in a field experiment. The research included an eastern Nebraska field in 2015 and 2016 and a western Nebraska field in 2016, for a total of 210 plot-years. The responses of maize and soybean to irrigation using variations of the model were compared with responses from treatments using soil water content measurement and a rainfed treatment. The remote-sensing-based treatment prescribed more irrigation than the other treatments in all cases. Excessive modeled soil evaporation and insufficient drainage times were suspected causes of the model drift. Modifying evaporation and drainage reduced modeled soil water depletion error. None of the included response variables were significantly different between treatments in western Nebraska. In eastern Nebraska, treatment differences for maize and soybean included evapotranspiration and a combined variable including evapotranspiration and deep percolation. Both variables were greatest for the remote-sensing model when differences were found to be statistically significant. Differences in maize yield in 2015 were attributed to random error. Soybean yield was lowest for the remote-sensing-based treatment and greatest for rainfed, possibly because of overwatering and lodging. The model performed well considering that it did not include soil water content measurements during the season. Future work should improve the soil evaporation and drainage formulations, particularly for periods of excessive precipitation, and should include aerial remote sensing imagery and soil water content measurement as model inputs.
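The core of such a soil water balance is a daily root-zone depletion update in the spirit of FAO-56, sketched below; the coefficients, thresholds, and forcing data are simplified assumptions, not the dissertation's model:

```python
def update_depletion(D_prev, precip, irrigation, et_actual, taw):
    """One-day root-zone water balance.
    D: depletion below field capacity [mm]; taw: total available water [mm]."""
    D = D_prev - precip - irrigation + et_actual
    deep_percolation = max(0.0, -D)   # water beyond field capacity drains away
    D = min(max(D, 0.0), taw)         # depletion bounded between 0 and TAW
    return D, deep_percolation

# hypothetical week of (precipitation, ET) forcing in mm/day
D, mad = 45.0, 40.0                   # initial depletion; management allowed depletion
for p, et in [(0, 6), (12, 5), (0, 7), (0, 6), (3, 6), (0, 7), (0, 6)]:
    irr = 25.0 if D > mad else 0.0    # irrigate when depletion passes the MAD threshold
    D, dp = update_depletion(D, p, irr, et, taw=120.0)
    print(f"depletion={D:5.1f} mm, irrigation={irr:4.1f} mm, drainage={dp:4.1f} mm")
```

In a remote-sensing-driven version, the daily ET term comes from imagery-based crop coefficients rather than in-field soil water measurements.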
Specifying and Refining a Measurement Model for a Computer-Based Interactive Assessment
ERIC Educational Resources Information Center
Levy, Roy; Mislevy, Robert J.
2004-01-01
The challenges of modeling students' performance in computer-based interactive assessments include accounting for multiple aspects of knowledge and skill that arise in different situations and the conditional dependencies among multiple aspects of performance. This article describes a Bayesian approach to modeling and estimating cognitive models…
The Meta-Ontology Model of Fish Disease Diagnostic Knowledge Based on OWL
NASA Astrophysics Data System (ADS)
Shi, Yongchang; Gao, Wen; Hu, Liang; Fu, Zetian
To improve the availability and reusability of knowledge in the fish disease diagnosis (FDD) domain and to facilitate knowledge acquisition, an ontology model of FDD knowledge was developed based on OWL, according to the FDD knowledge model. It includes the terminology of FDD knowledge and the class hierarchies of its terms.
Report of the LSPI/NASA Workshop on Lunar Base Methodology Development
NASA Technical Reports Server (NTRS)
Nozette, Stewart; Roberts, Barney
1985-01-01
Groundwork was laid for computer models which will assist in the design of a manned lunar base. The models, herein described, will provide the following functions for the successful conclusion of that task: strategic planning; sensitivity analyses; impact analyses; and documentation. Topics addressed include: upper level model description; interrelationship matrix; user community; model features; model descriptions; system implementation; model management; and plans for future action.
Pombo, Nuno; Garcia, Nuno; Bousson, Kouamana
2017-03-01
Sleep apnea syndrome (SAS), which can significantly decrease quality of life, is associated with major health risks such as increased cardiovascular disease, sudden death, depression, irritability, hypertension, and learning difficulties. Thus, it is relevant and timely to present a systematic review describing significant applications of computational intelligence to SAS, including performance, beneficial and challenging effects, and modeling for decision-making across multiple scenarios. This study aims to systematically review the literature on systems for the detection and/or prediction of apnea events using a classification model. The forty-five included studies revealed a combination of classification techniques for the diagnosis of apnea: threshold-based (14.75%) and machine learning (ML) models (85.25%). The ML models, clustered in a mind map, include neural networks (44.26%), regression (4.91%), instance-based methods (11.47%), Bayesian algorithms (1.63%), reinforcement learning (4.91%), dimensionality reduction (8.19%), ensemble learning (6.55%), and decision trees (3.27%). A classification model should be auto-adaptive and free of dependency on external human action. In addition, the accuracy of classification models is related to effective feature selection. New high-quality studies based on randomized controlled trials and validation of models using large and multiple samples of data are recommended.
Regression-based model of skin diffuse reflectance for skin color analysis
NASA Astrophysics Data System (ADS)
Tsumura, Norimichi; Kawazoe, Daisuke; Nakaguchi, Toshiya; Ojima, Nobutoshi; Miyake, Yoichi
2008-11-01
A simple regression-based model of skin diffuse reflectance is developed based on reflectance samples calculated by Monte Carlo simulation of light transport in a two-layered skin model. This reflectance model covers the values of spectral reflectance in the visible spectrum for Japanese women. The modified Lambert-Beer law holds in the proposed model with a modified mean free path length in non-linear density space. The averaged RMS and maximum errors of the proposed model were 1.1% and 3.1%, respectively, over this range.
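A schematic of the modified Lambert-Beer form invoked here: absorbance is a weighted sum of chromophore absorption coefficients multiplied by a wavelength-dependent mean path length. Every number below is a placeholder for illustration, not the fitted regression model:

```python
import numpy as np

def diffuse_reflectance(mu_mel, mu_blood, c_mel, c_blood, path_len, r0=1.0):
    """Modified Lambert-Beer: R = r0 * exp(-(c_m*mu_m + c_b*mu_b) * l(lambda))."""
    absorbance = (c_mel * mu_mel + c_blood * mu_blood) * path_len
    return r0 * np.exp(-absorbance)

wavelengths = np.array([450., 550., 650.])   # nm, for reference only
mu_mel = np.array([0.9, 0.5, 0.3])           # placeholder absorption coeffs [1/mm]
mu_blood = np.array([1.2, 0.8, 0.1])
l_mean = np.array([0.4, 0.7, 1.1])           # placeholder mean path lengths [mm]
print(diffuse_reflectance(mu_mel, mu_blood, c_mel=0.2, c_blood=0.05,
                          path_len=l_mean))
```

In the regression approach described, the path-length term is what gets fit against the Monte Carlo samples rather than assumed.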
A Risk Prediction Model for Sporadic CRC Based on Routine Lab Results.
Boursi, Ben; Mamtani, Ronac; Hwang, Wei-Ting; Haynes, Kevin; Yang, Yu-Xiao
2016-07-01
Current risk scores for colorectal cancer (CRC) are based on demographic and behavioral factors and have limited predictive value. Our aim was to develop a novel risk prediction model for sporadic CRC using clinical and laboratory data in electronic medical records. We conducted a nested case-control study in a UK primary care database. Cases included those with a diagnostic code of CRC, aged 50-85. Each case was matched with four controls using incidence density sampling. CRC predictors were examined using univariate conditional logistic regression. Variables with a p value <0.25 in the univariate analysis were further evaluated in multivariate models using backward elimination. Discrimination was assessed using the receiver operating characteristic (ROC) curve. Calibration was evaluated using McFadden's R2. The net reclassification index (NRI) associated with incorporation of laboratory results was calculated. Results were internally validated. A model similar to existing CRC prediction models, which included age, sex, height, obesity, ever smoking, alcohol dependence, and previous screening colonoscopy, had an AUC of 0.58 (0.57-0.59) with poor goodness of fit. A laboratory-based model including hematocrit, MCV, lymphocytes, and neutrophil-lymphocyte ratio (NLR) had an AUC of 0.76 (0.76-0.77) and a McFadden's R2 of 0.21 with an NRI of 47.6%. A combined model including sex, hemoglobin, MCV, white blood cells, platelets, NLR, and oral hypoglycemic use had an AUC of 0.80 (0.79-0.81) with a McFadden's R2 of 0.27 and an NRI of 60.7%. Similar results were shown in an internal validation set. A laboratory-based risk model had good predictive power for sporadic CRC risk.
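The study fits conditional logistic regression on matched case-control sets; scikit-learn has no conditional variant, so the hedged sketch below only illustrates the general shape of a laboratory-based risk model with ordinary logistic regression on synthetic data. Feature names mirror the abstract; all values and coefficients are fabricated for illustration.

```python
# Sketch: laboratory-feature risk model with logistic regression and AUC.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 5000
hematocrit = rng.normal(42, 4, n)
mcv = rng.normal(90, 6, n)
nlr = rng.lognormal(0.7, 0.4, n)
# Synthetic outcome: lower hematocrit/MCV and higher NLR raise risk.
logit = -4 - 0.08 * (hematocrit - 42) - 0.05 * (mcv - 90) + 0.6 * np.log(nlr)
y = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([hematocrit, mcv, nlr])
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=1)
model = LogisticRegression(max_iter=1000).fit(Xtr, ytr)
print("AUC:", round(roc_auc_score(yte, model.predict_proba(Xte)[:, 1]), 3))
```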
Developing a Conceptual Architecture for a Generalized Agent-based Modeling Environment (GAME)
2008-03-01
Toolkits surveyed include REPAST (Java, Python, C#, open source) and MASON, a multi-agent modeling language and Swarm extension. Repast (Recursive Porous Agent Simulation Toolkit) was designed for building agent-based models and simulations. Repast makes it easy for inexperienced users to build models by including a built-in simple model and providing interfaces through which menus and Python…
‘Computational toxicology’ is a broad term that encompasses all manner of computer-facilitated informatics, data-mining, and modeling endeavors in relation to toxicology, including exposure modeling, physiologically based pharmacokinetic (PBPK) modeling, dose-response modeling, ...
2014-09-23
conduct simulations with a high-latitude data assimilation model. The specific objectives are to study magnetosphere-ionosphere (M-I) coupling processes...based on three physics-based models, including a magnetosphere-ionosphere (M-I) electrodynamics model, an ionosphere model, and a magnetic...inversion code. The ionosphere model is a high-resolution version of the Ionosphere Forecast Model (IFM), which is a 3-D, multi-ion model of the ionosphere
Modeling and experimental study of resistive switching in vertically aligned carbon nanotubes
NASA Astrophysics Data System (ADS)
Ageev, O. A.; Blinov, Yu F.; Ilina, M. V.; Ilin, O. I.; Smirnov, V. A.
2016-08-01
A model of resistive switching in vertically aligned carbon nanotubes (VA CNTs), taking into account the processes of deformation, polarization, and piezoelectric charge accumulation, has been developed. The origin of hysteresis in the VA CNT-based structure is described. Based on the modeling results, a VA CNT-based structure has been fabricated. The ratio of the high-resistance to the low-resistance state of the VA CNT-based structure amounts to 48. The correlation of the modeling results with experimental studies is shown. The results can be used in the development of nanoelectronics devices based on VA CNTs, including nonvolatile resistive random-access memory.
Modeling to predict pilot performance during CDTI-based in-trail following experiments
NASA Technical Reports Server (NTRS)
Sorensen, J. A.; Goka, T.
1984-01-01
A mathematical model was developed of the flight system with the pilot using a cockpit display of traffic information (CDTI) to establish and maintain in-trail spacing behind a lead aircraft during approach. Both in-trail and vertical dynamics were included. The nominal spacing was based on one of three criteria (Constant Time Predictor; Constant Time Delay; or Acceleration Cue). This model was used to simulate digitally the dynamics of a string of multiple following aircraft, including response to initial position errors. The simulation was used to predict the outcome of a series of in-trail following experiments, including pilot performance in maintaining correct longitudinal spacing and vertical position. The experiments were run in the NASA Ames Research Center multi-cab cockpit simulator facility. The experimental results were then used to evaluate the model and its prediction accuracy. Model parameters were adjusted, so that modeled performance matched experimental results. Lessons learned in this modeling and prediction study are summarized.
Networks for image acquisition, processing and display
NASA Technical Reports Server (NTRS)
Ahumada, Albert J., Jr.
1990-01-01
The human visual system comprises layers of networks which sample, process, and code images. Understanding these networks is a valuable means of understanding human vision and of designing autonomous vision systems based on network processing. Ames Research Center has an ongoing program to develop computational models of such networks. The models predict human performance in detection of targets and in discrimination of displayed information. In addition, the models are artificial vision systems sharing properties with biological vision that has been tuned by evolution for high performance. Properties include variable density sampling, noise immunity, multi-resolution coding, and fault-tolerance. The research stresses analysis of noise in visual networks, including sampling, photon, and processing unit noises. Specific accomplishments include: models of sampling array growth with variable density and irregularity comparable to that of the retinal cone mosaic; noise models of networks with signal-dependent and independent noise; models of network connection development for preserving spatial registration and interpolation; multi-resolution encoding models based on hexagonal arrays (HOP transform); and mathematical procedures for simplifying analysis of large networks.
NASA Technical Reports Server (NTRS)
Lee, S. S.; Sengupta, S.; Nwadike, E. V.
1980-01-01
A one-dimensional model for studying the thermal dynamics of cooling lakes was developed and verified. The model is essentially a set of partial differential equations which are solved by finite difference methods. The model includes the effects of variation of area with depth, surface heating due to solar radiation absorbed at the upper layer, and internal heating due to the transmission of solar radiation to the sub-surface layers. The exchange of mechanical energy between the lake and the atmosphere is included through the coupling of thermal diffusivity and wind speed. The effects of discharge and intake by power plants are also included. The numerical model was calibrated by applying it to Cayuga Lake. The model was then verified through a long-term simulation using the Lake Keowee data base. The comparison between measured and predicted vertical temperature profiles over the nine years of record is good. The physical limnology of Lake Keowee is presented through a set of graphical representations of the measured data base.
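A hedged sketch of the numerical approach described (explicit finite differences for 1-D vertical heat diffusion with solar heating decaying with depth) follows; all coefficients are illustrative, not the values calibrated for Cayuga Lake or Lake Keowee, and depth-varying area is omitted for brevity.

```python
# Sketch: explicit finite-difference solution of 1-D lake thermal dynamics.
import numpy as np

nz, dz, dt = 50, 0.5, 600.0      # 25 m depth, 10-minute time steps
kappa = 1e-4                      # effective (wind-enhanced) thermal diffusivity, m^2/s
T = np.full(nz, 10.0)             # initial temperature profile, deg C
q0, eta = 2e-6, 0.5               # surface heating rate (K/s) and light extinction (1/m)
z = np.arange(nz) * dz

for step in range(1000):          # stability: kappa*dt/dz^2 = 0.24 < 0.5
    lap = np.zeros(nz)
    lap[1:-1] = (T[2:] - 2 * T[1:-1] + T[:-2]) / dz**2
    # Diffusion plus internal heating from solar radiation transmitted below the surface.
    T += dt * (kappa * lap + q0 * np.exp(-eta * z))

print("surface/bottom temperature:", round(T[0], 2), round(T[-1], 2))
```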
A Unified Framework for Complex Networks with Degree Trichotomy Based on Markov Chains.
Hui, David Shui Wing; Chen, Yi-Chao; Zhang, Gong; Wu, Weijie; Chen, Guanrong; Lui, John C S; Li, Yingtao
2017-06-16
This paper establishes a Markov chain model as a unified framework for describing the evolution processes in complex networks. The unique feature of the proposed model is its capability to address the formation mechanism that can reflect the "trichotomy" observed in degree distributions, based on which closed-form solutions can be derived. Important special cases of the proposed unified framework are the classical models, including Poisson, exponential, and power-law distributed networks. Both simulation and experimental results demonstrate a good match of the proposed model with real datasets, showing its superiority over the classical models. Implications of the model for various applications, including citation analysis, online social networks, and vehicular network design, are also discussed in the paper.
NASA Astrophysics Data System (ADS)
Miner, Nadine Elizabeth
1998-09-01
This dissertation presents a new wavelet-based method for synthesizing perceptually convincing, dynamic sounds using parameterized sound models. The sound synthesis method is applicable to a variety of applications including Virtual Reality (VR), multi-media, entertainment, and the World Wide Web (WWW). A unique contribution of this research is the modeling of the stochastic, or non-pitched, sound components. This stochastic-based modeling approach leads to perceptually compelling sound synthesis. Two preliminary studies conducted provide data on multi-sensory interaction and audio-visual synchronization timing. These results contributed to the design of the new sound synthesis method. The method uses a four-phase development process, including analysis, parameterization, synthesis and validation, to create the wavelet-based sound models. A patent is pending for this dynamic sound synthesis method, which provides perceptually-realistic, real-time sound generation. This dissertation also presents a battery of perceptual experiments developed to verify the sound synthesis results. These experiments are applicable for validation of any sound synthesis technique.
Hart, Carl R; Reznicek, Nathan J; Wilson, D Keith; Pettit, Chris L; Nykaza, Edward T
2016-05-01
Many outdoor sound propagation models exist, ranging from highly complex physics-based simulations to simplified engineering calculations, and more recently, highly flexible statistical learning methods. Several engineering and statistical learning models are evaluated by using a particular physics-based model, namely, a Crank-Nicolson parabolic equation (CNPE), as a benchmark. Narrowband transmission loss values predicted with the CNPE, based upon a simulated data set of meteorological, boundary, and source conditions, act as simulated observations. In the simulated data set, sound propagation conditions span from downward refracting to upward refracting, for acoustically hard and soft boundaries, and low frequencies. Engineering models used in the comparisons include the ISO 9613-2 method, Harmonoise, and Nord2000 propagation models. Statistical learning methods used in the comparisons include bagged decision tree regression, random forest regression, boosting regression, and artificial neural network models. Computed skill scores are relative to sound propagation in a homogeneous atmosphere over a rigid ground. Overall skill scores for the engineering noise models are 0.6%, -7.1%, and 83.8% for the ISO 9613-2, Harmonoise, and Nord2000 models, respectively. Overall skill scores for the statistical learning models are 99.5%, 99.5%, 99.6%, and 99.6% for bagged decision tree, random forest, boosting, and artificial neural network regression models, respectively.
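As we read the abstract, the skill score measures error reduction relative to the homogeneous-atmosphere reference, with CNPE values as truth. The hedged sketch below shows that computation on fabricated transmission-loss values; the exact formula used in the paper is an assumption.

```python
# Sketch: skill score of a model relative to a reference prediction.
import numpy as np

def skill_score(pred, truth, reference):
    """1 - MSE(pred)/MSE(reference); 100% = perfect, 0% = no better than reference."""
    mse_model = np.mean((pred - truth) ** 2)
    mse_ref = np.mean((reference - truth) ** 2)
    return 100.0 * (1.0 - mse_model / mse_ref)

truth = np.array([-40.0, -55.0, -62.0, -70.0])   # CNPE "observations" (fake values)
ref = np.array([-45.0, -45.0, -45.0, -45.0])     # homogeneous-atmosphere baseline
pred = np.array([-41.0, -53.0, -63.0, -69.0])    # an engineering model's output
print(f"skill score: {skill_score(pred, truth, ref):.1f}%")
```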
NASA Technical Reports Server (NTRS)
Miller, Robert H. (Inventor); Ribbens, William B. (Inventor)
2003-01-01
A method and system for detecting a failure or performance degradation in a dynamic system having sensors for measuring state variables and providing corresponding output signals in response to one or more system input signals are provided. The method includes calculating estimated gains of a filter and selecting an appropriate linear model for processing the output signals based on the input signals. The step of calculating utilizes one or more models of the dynamic system to obtain estimated signals. The method further includes calculating output error residuals based on the output signals and the estimated signals. The method also includes detecting one or more hypothesized failures or performance degradations of a component or subsystem of the dynamic system based on the error residuals. The step of calculating the estimated values is performed optimally with respect to one or more of: noise, uncertainty of parameters of the models and un-modeled dynamics of the dynamic system which may be a flight vehicle or financial market or modeled financial system.
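A hedged sketch of the general shape of this idea (not the patented method itself): run a Kalman filter on a linear model of the system and flag a hypothesized failure when the output error residuals exceed a threshold. All matrices, the injected fault, and the 3-sigma test are toy assumptions.

```python
# Sketch: residual-based fault detection with a linear Kalman filter.
import numpy as np

A, H = np.array([[1.0, 0.1], [0.0, 1.0]]), np.array([[1.0, 0.0]])
Q, R = 1e-4 * np.eye(2), np.array([[0.01]])
x, P = np.zeros(2), np.eye(2)

rng = np.random.default_rng(2)
truth = np.zeros(2)
for k in range(100):
    truth = A @ truth
    bias = 0.5 if k >= 60 else 0.0                 # injected sensor fault at step 60
    z = H @ truth + bias + rng.normal(scale=0.1, size=1)

    # Predict, then compute the innovation (output error residual).
    x, P = A @ x, A @ P @ A.T + Q
    resid = z - H @ x
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x, P = x + K @ resid, (np.eye(2) - K @ H) @ P

    if abs(resid[0]) / np.sqrt(S[0, 0]) > 3.0:     # 3-sigma test on the residual
        print(f"fault hypothesis triggered at step {k}")
        break
```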
Genders, Tessa S S; Steyerberg, Ewout W; Nieman, Koen; Galema, Tjebbe W; Mollet, Nico R; de Feyter, Pim J; Krestin, Gabriel P; Alkadhi, Hatem; Leschka, Sebastian; Desbiolles, Lotus; Meijs, Matthijs F L; Cramer, Maarten J; Knuuti, Juhani; Kajander, Sami; Bogaert, Jan; Goetschalckx, Kaatje; Cademartiri, Filippo; Maffei, Erica; Martini, Chiara; Seitun, Sara; Aldrovandi, Annachiara; Wildermuth, Simon; Stinn, Björn; Fornaro, Jürgen; Feuchtner, Gudrun; De Zordo, Tobias; Auer, Thomas; Plank, Fabian; Friedrich, Guy; Pugliese, Francesca; Petersen, Steffen E; Davies, L Ceri; Schoepf, U Joseph; Rowe, Garrett W; van Mieghem, Carlos A G; van Driessche, Luc; Sinitsyn, Valentin; Gopalan, Deepa; Nikolaou, Konstantin; Bamberg, Fabian; Cury, Ricardo C; Battle, Juan; Maurovich-Horvat, Pál; Bartykowszki, Andrea; Merkely, Bela; Becker, Dávid; Hadamitzky, Martin; Hausleiter, Jörg; Dewey, Marc; Zimmermann, Elke; Laule, Michael
2012-01-01
Objectives To develop prediction models that better estimate the pretest probability of coronary artery disease in low prevalence populations. Design Retrospective pooled analysis of individual patient data. Setting 18 hospitals in Europe and the United States. Participants Patients with stable chest pain without evidence for previous coronary artery disease, if they were referred for computed tomography (CT) based coronary angiography or catheter based coronary angiography (indicated as low and high prevalence settings, respectively). Main outcome measures Obstructive coronary artery disease (≥50% diameter stenosis in at least one vessel found on catheter based coronary angiography). Multiple imputation accounted for missing predictors and outcomes, exploiting strong correlation between the two angiography procedures. Predictive models included a basic model (age, sex, symptoms, and setting), clinical model (basic model factors and diabetes, hypertension, dyslipidaemia, and smoking), and extended model (clinical model factors and use of the CT based coronary calcium score). We assessed discrimination (c statistic), calibration, and continuous net reclassification improvement by cross validation for the four largest low prevalence datasets separately and the smaller remaining low prevalence datasets combined. Results We included 5677 patients (3283 men, 2394 women), of whom 1634 had obstructive coronary artery disease found on catheter based coronary angiography. All potential predictors were significantly associated with the presence of disease in univariable and multivariable analyses. The clinical model improved the prediction, compared with the basic model (cross validated c statistic improvement from 0.77 to 0.79, net reclassification improvement 35%); the coronary calcium score in the extended model was a major predictor (0.79 to 0.88, 102%). Calibration for low prevalence datasets was satisfactory. Conclusions Updated prediction models including age, sex, symptoms, and cardiovascular risk factors allow for accurate estimation of the pretest probability of coronary artery disease in low prevalence populations. Addition of coronary calcium scores to the prediction models improves the estimates. PMID:22692650
Probabilistic analysis for fatigue strength degradation of materials
NASA Technical Reports Server (NTRS)
Royce, Lola
1989-01-01
This report presents the results of the first year of a research program conducted for NASA-LeRC by the University of Texas at San Antonio. The research included development of methodology that provides a probabilistic treatment of lifetime prediction of structural components of aerospace propulsion systems subjected to fatigue. Material strength degradation models, based on primitive variables, include both a fatigue strength reduction model and a fatigue crack growth model. Linear elastic fracture mechanics is utilized in the latter model. Probabilistic analysis is based on simulation, and both maximum entropy and maximum penalized likelihood methods are used for the generation of probability density functions. The resulting constitutive relationships are included in several computer programs, RANDOM2, RANDOM3, and RANDOM4. These programs determine the random lifetime of an engine component, in mechanical load cycles, to reach a critical fatigue strength or crack size. The material considered was a cast nickel base superalloy, one typical of those used in the Space Shuttle Main Engine.
Trainer, Asa; Hedberg, Thomas; Feeney, Allison Barnard; Fischer, Kevin; Rosche, Phil
2017-01-01
Advances in information technology triggered a digital revolution that holds promise of reduced costs, improved productivity, and higher quality. To ride this wave of innovation, manufacturing enterprises are changing how product definitions are communicated – from paper to models. To achieve industry's vision of the Model-Based Enterprise (MBE), the MBE strategy must include model-based data interoperability from design to manufacturing and quality in the supply chain. The Model-Based Definition (MBD) is created by the original equipment manufacturer (OEM) using Computer-Aided Design (CAD) tools. This information is then shared with the supplier so that they can manufacture and inspect the physical parts. Today, suppliers predominantly use Computer-Aided Manufacturing (CAM) and Coordinate Measuring Machine (CMM) models for these tasks. Traditionally, the OEM has provided design data to the supplier in the form of two-dimensional (2D) drawings, but may also include a three-dimensional (3D)-shape-geometry model, often in a standards-based format such as ISO 10303-203:2011 (STEP AP203). The supplier then creates the respective CAM and CMM models and machine programs to produce and inspect the parts. In the MBE vision for model-based data exchange, the CAD model must include product-and-manufacturing information (PMI) in addition to the shape geometry. Today's CAD tools can generate models with embedded PMI. And, with the emergence of STEP AP242, a standards-based model with embedded PMI can now be shared downstream. The on-going research detailed in this paper seeks to investigate three concepts. First, that the ability to utilize a STEP AP242 model with embedded PMI for CAD-to-CAM and CAD-to-CMM data exchange is possible and valuable to the overall goal of a more efficient process. Second, the research identifies gaps in tools, standards, and processes that inhibit industry's ability to cost-effectively achieve model-based-data interoperability in the pursuit of the MBE vision. Finally, it also seeks to explore the interaction between CAD and CMM processes and determine if the concept of feedback from CAM and CMM back to CAD is feasible. The main goal of our study is to test the hypothesis that model-based-data interoperability from CAD-to-CAM and CAD-to-CMM is feasible through standards-based integration. This paper presents several barriers to model-based-data interoperability. Overall, the project team demonstrated the exchange of product definition data between CAD, CAM, and CMM systems using standards-based methods. While gaps in standards coverage were identified, the gaps should not stop industry's progress toward MBE. The results of our study provide evidence in support of an open-standards method to model-based-data interoperability, which would provide maximum value and impact to industry. PMID:28691120
Ewing, E Stephanie Krauthamer; Diamond, Guy; Levy, Suzanne
2015-01-01
Attachment-Based Family Therapy (ABFT) is a manualized family-based intervention designed for working with depressed adolescents, including those at risk for suicide, and their families. It is an empirically informed and supported treatment. ABFT has its theoretical underpinnings in attachment theory and clinical roots in structural family therapy and emotion focused therapies. ABFT relies on a transactional model that aims to transform the quality of adolescent-parent attachment, as a means of providing the adolescent with a more secure relationship that can support them during challenging times generally, and the crises related to suicidal thinking and behavior, specifically. This article reviews: (1) the theoretical foundations of ABFT (attachment theory, models of emotional development); (2) the ABFT clinical model, including training and supervision factors; and (3) empirical support.
A physical data model for fields and agents
NASA Astrophysics Data System (ADS)
de Jong, Kor; de Bakker, Merijn; Karssenberg, Derek
2016-04-01
Two approaches exist in simulation modeling: agent-based and field-based modeling. In agent-based (or individual-based) simulation modeling, the entities representing the system's state are represented by objects, which are bounded in space and time. Individual objects, like an animal, a house, or a more abstract entity like a country's economy, have properties representing their state. In an agent-based model this state is manipulated. In field-based modeling, the entities representing the system's state are represented by fields. Fields capture the state of a continuous property within a spatial extent, examples of which are elevation, atmospheric pressure, and water flow velocity. With respect to the technology used to create these models, the domains of agent-based and field-based modeling have often been separate worlds. In environmental modeling, widely used logical data models include feature data models for point, line and polygon objects, and the raster data model for fields. Simulation models are often either agent-based or field-based, even though the modeled system might contain both entities that are better represented by individuals and entities that are better represented by fields. We think that the reason for this dichotomy in kinds of models might be that the traditional object and field data models underlying those models are relatively low level. We have developed a higher level conceptual data model for representing both non-spatial and spatial objects, and spatial fields (De Bakker et al. 2016). Based on this conceptual data model we designed a logical and physical data model for representing many kinds of data, including the kinds used in earth system modeling (e.g. hydrological and ecological models). The goal of this work is to be able to create high level code and tools for the creation of models in which entities are representable by both objects and fields. Our conceptual data model is capable of representing the traditional feature data models and the raster data model, among many other data models. Our physical data model is capable of storing a first set of kinds of data, like omnipresent scalars, mobile spatio-temporal points and property values, and spatio-temporal rasters. With our poster we will provide an overview of the physical data model expressed in HDF5 and show examples of how it can be used to capture both object- and field-based information. References De Bakker, M, K. de Jong, D. Karssenberg. 2016. A conceptual data model and language for fields and agents. European Geosciences Union, EGU General Assembly, 2016, Vienna.
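To make the "physical data model in HDF5" idea concrete, here is a hedged sketch of one file holding both a field (an elevation raster) and agents (mobile points with a property). The layout, group names, and attributes are our illustrative assumptions, not the authors' actual schema.

```python
# Sketch: one HDF5 file storing a field raster and agent point data with h5py.
import h5py
import numpy as np

with h5py.File("world.h5", "w") as f:
    # Field: a continuous property discretized on a raster, with georeferencing attributes.
    field = f.create_group("fields/elevation")
    field.create_dataset("values", data=np.random.rand(100, 100))
    field.attrs["cell_size"] = 10.0
    field.attrs["origin"] = (0.0, 0.0)

    # Agents: bounded objects with per-agent location and state.
    agents = f.create_group("agents/animals")
    agents.create_dataset("xy", data=np.random.rand(25, 2) * 1000.0)
    agents.create_dataset("energy", data=np.full(25, 100.0))
```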
A small, single stage orifice pulse tube cryocooler demonstration
NASA Technical Reports Server (NTRS)
Hendricks, John B.
1990-01-01
This final report summarizes and presents the analytical and experimental progress in the present effort. The principal objective of this effort was the demonstration of a 0.25 Watt, 80 Kelvin orifice pulse tube refrigerator. The experimental apparatus is described. The design of a partially optimized pulse tube refrigerator is included. The refrigerator demonstrates an ultimate temperature of 77 K, has a projected cooling power of 0.18 Watts at 80 K, and has a measured cooling power of 1 Watt at 97 K, with an electrical efficiency of 250 Watts/Watt, much better than previous pulse tube refrigerators. A model of the pulse tube refrigerator that provides estimates of pressure ratio and mass flow within the pulse tube refrigerator, based on component physical characteristics is included. A model of a pulse tube operation based on generalized analysis which is adequate to support local optimization of existing designs is included. A model of regenerator performance based on an analogy to counterflow heat exchangers is included.
Leeson, S C; Beaver, K; Ezendam, N P M; Mačuks, R; Martin-Hirsch, P L; Miles, T; Jeppesen, M M; Jensen, P T; Zola, P
2017-03-01
After completing treatment, most patients follow a pre-determined schedule of regular hospital outpatient appointments, which includes clinical examinations, consultations and routine tests. After several years of surveillance, patients are transferred back to primary care. However, there is limited evidence to support the effectiveness and efficiency of this approach. This paper examines the current rationale and evidence base for hospital-based follow-up after treatment for gynaecological cancer. We investigate what alternative models of care have been formally evaluated and what research is currently in progress in Europe, in order to make tentative recommendations for a model of follow-up. The evidence base for traditional hospital based follow-up is limited. Alternative models have been reported for other cancer types but there are few evaluations of alternative approaches for gynaecological cancers. We identified five ongoing European studies; four were focused on endometrial cancer patients and one feasibility study included all gynaecological cancers. Only one study had reached the reporting stage. Alternative models included nurse-led telephone follow-up and comparisons of more intensive versus less intensive regimes. Outcomes included survival, quality of life, psychological morbidity, patient satisfaction and cost effectiveness of service. More work is needed on alternative strategies for all gynaecological cancer types. New models will be likely to include risk stratification with early discharge from secondary care for early stage disease with fast track access to specialist services for suspected cancer recurrence or other problems. Copyright © 2017 Elsevier B.V. All rights reserved.
Model Based Mission Assurance: Emerging Opportunities for Robotic Systems
NASA Technical Reports Server (NTRS)
Evans, John W.; DiVenti, Tony
2016-01-01
The emergence of Model Based Systems Engineering (MBSE) in a Model Based Engineering framework has created new opportunities to improve effectiveness and efficiencies across the assurance functions. The MBSE environment supports not only system architecture development, but also Systems Safety, Reliability, and Risk Analysis concurrently in the same framework. Linking to detailed design will further improve assurance capabilities to support failure avoidance and mitigation in flight systems. This is also leading to new assurance functions, including model assurance and management of uncertainty in the modeling environment. Further, assurance cases, structured hierarchical arguments or models, are emerging as a basis for supporting a comprehensive viewpoint in which to support Model Based Mission Assurance (MBMA).
ME science as mobile learning based on virtual reality
NASA Astrophysics Data System (ADS)
Fradika, H. D.; Surjono, H. D.
2018-04-01
This article describes ME Science (Mobile Education Science), a mobile learning application for learning Fisika Inti (nuclear physics). ME Science is a product of research and development (R&D) using the Alessi and Trollip model. The Alessi and Trollip model consists of three stages: (a) planning, which includes analysis of problems, goals, needs, and the idea of the product to develop; (b) designing, which includes collecting materials, designing the material content, creating the storyboard, and evaluating and reviewing the product; and (c) developing, which includes development of the product, alpha testing, revision of the product, validation of the product, beta testing, and evaluation of the product. This article describes ME Science through the product development stage. The result is a mobile learning application based on virtual reality that runs on Android-based smartphones. The application includes a brief description of the learning material, quizzes, a video summary of the material, and learning material based on virtual reality.
Comparison of the CEAS and Williams-type barley yield models for North Dakota and Minnesota
NASA Technical Reports Server (NTRS)
Leduc, S. (Principal Investigator)
1982-01-01
The CEAS and Williams-type models were compared based on specified selection criteria, which include a ten-year bootstrap test (1970-1979). On this basis, the models were quite comparable; however, the CEAS model was slightly better overall. The Williams-type model seemed better for the 1974 estimates. Because spring wheat yield was particularly low that year, the Williams-type model should not be excluded from further consideration.
McGill, M J; Hart, W D; McKay, J A; Spinhirne, J D
1999-10-20
Previous modeling of the performance of spaceborne direct-detection Doppler lidar systems assumed extremely idealized atmospheric models. Here we develop a technique for modeling the performance of these systems in a more realistic atmosphere, based on actual airborne lidar observations. The resulting atmospheric model contains cloud and aerosol variability that is absent in other simulations of spaceborne Doppler lidar instruments. To produce a realistic simulation of daytime performance, we include solar radiance values that are based on actual measurements and are allowed to vary as the viewing scene changes. Simulations are performed for two types of direct-detection Doppler lidar system: the double-edge and the multichannel techniques. Both systems were optimized to measure winds from Rayleigh backscatter at 355 nm. Simulations show that the measurement uncertainty during daytime is degraded by only approximately 10-20% compared with nighttime performance, provided that a proper solar filter is included in the instrument design.
Oscillating in synchrony with a metronome: serial dependence, limit cycle dynamics, and modeling.
Torre, Kjerstin; Balasubramaniam, Ramesh; Delignières, Didier
2010-07-01
We analyzed serial dependencies in periods and asynchronies collected during oscillations performed in synchrony with a metronome. Results showed that asynchronies contain 1/f fluctuations, and the series of periods contain antipersistent dependence. The analysis of the phase portrait revealed a specific asymmetry induced by synchronization. We propose a hybrid limit cycle model including a cycle-dependent stiffness parameter provided with fractal properties, and a parametric driving function based on velocity. This model accounts for most experimentally evidenced statistical features, including serial dependence and limit cycle dynamics. We discuss the results and modeling choices within the framework of event-based and emergent timing.
NASA Technical Reports Server (NTRS)
Liu, Donhang
2014-01-01
This presentation includes a summary of NEPP-funded deliverables for the Base-Metal Electrodes (BMEs) capacitor task, development of a general reliability model for BME capacitors, and a summary and future work.
Popadyuk, A; Kalita, H; Chisholm, B J; Voronov, A
2014-12-01
A new non-toxic soybean oil-based polymeric surfactant (SBPS) for personal-care products was developed and extensively characterized, including an evaluation of the polymeric surfactant performance in model shampoo formulations. To experimentally assure applicability of the soy-based macromolecules in shampoos, either in combination with common anionic surfactants (in this study, sodium lauryl sulfate, SLS) or as a single surface-active ingredient, the testing of SBPS physicochemical properties, performance and visual assessment of SBPS-based model shampoos was carried out. The results obtained, including foaming and cleaning ability of model formulations, were compared to those with only SLS as a surfactant as well as to SLS-free shampoos. Overall, the results show that the presence of SBPS improves cleaning, foaming, and conditioning of model formulations. SBPS-based formulations meet major requirements of multifunctional shampoos - mild detergency, foaming, good conditioning, and aesthetic appeal, which are comparable to commercially available shampoos. In addition, examination of SBPS/SLS mixtures in model shampoos showed that the presence of the SBPS enables the concentration of SLS to be significantly reduced without sacrificing shampoo performance. © 2014 Society of Cosmetic Scientists and the Société Française de Cosmétologie.
V and V Efforts of Auroral Precipitation Models: Preliminary Results
NASA Technical Reports Server (NTRS)
Zheng, Yihua; Kuznetsova, Masha; Rastaetter, Lutz; Hesse, Michael
2011-01-01
Auroral precipitation models have been valuable both in terms of space weather applications and space science research. Yet very limited testing has been performed regarding model performance. A variety of auroral models are available, including empirical models that are parameterized by geomagnetic indices or upstream solar wind conditions, nowcasting models that are based on satellite observations, and those derived from physics-based, coupled global models. In this presentation, we will show our preliminary results regarding V&V efforts for some of these models.
Use of generalized linear models and digital data in a forest inventory of Northern Utah
Moisen, Gretchen G.; Edwards, Thomas C.
1999-01-01
Forest inventories, like those conducted by the Forest Service's Forest Inventory and Analysis Program (FIA) in the Rocky Mountain Region, are under increased pressure to produce better information at reduced costs. Here we describe our efforts in Utah to merge satellite-based information with forest inventory data for the purposes of reducing the costs of estimates of forest population totals and providing spatial depiction of forest resources. We illustrate how generalized linear models can be used to construct approximately unbiased and efficient estimates of population totals while providing a mechanism for prediction in space for mapping of forest structure. We model forest type and timber volume of five tree species groups as functions of a variety of predictor variables in the northern Utah mountains. Predictor variables include elevation, aspect, slope, geographic coordinates, as well as vegetation cover types based on satellite data from both the Advanced Very High Resolution Radiometer (AVHRR) and Thematic Mapper (TM) platforms. We examine the relative precision of estimates of area by forest type and mean cubic-foot volumes under six different models, including the traditional double sampling for stratification strategy. Only very small gains in precision were realized through the use of expensive photointerpreted or TM-based data for stratification, while models based on topography and spatial coordinates alone were competitive. We also compare the predictive capability of the models through various map accuracy measures. The models including the TM-based vegetation performed best overall, while topography and spatial coordinates alone provided substantial information at very low cost.
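A hedged sketch of the estimation idea follows: a generalized linear model relating a forest attribute to terrain predictors, then applied across map cells for prediction in space. The data are synthetic, and the paper's actual predictors also include AVHRR/TM vegetation cover types.

```python
# Sketch: binomial GLM for forest presence as a function of terrain, via statsmodels.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 400
elev = rng.uniform(1500, 3000, n)
slope = rng.uniform(0, 40, n)
# Synthetic probability that a plot is forested, rising with elevation.
p = 1 / (1 + np.exp(-(-6 + 0.003 * elev - 0.02 * slope)))
forested = (rng.random(n) < p).astype(float)

X = sm.add_constant(np.column_stack([elev, slope]))
glm = sm.GLM(forested, X, family=sm.families.Binomial()).fit()
print(glm.summary().tables[1])

# Prediction in space: apply the fitted GLM to each cell's predictor values.
grid = sm.add_constant(np.column_stack([np.linspace(1500, 3000, 5), np.full(5, 10.0)]))
print("predicted forest probability:", np.round(glm.predict(grid), 2))
```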
Simulation-based modeling of building complexes construction management
NASA Astrophysics Data System (ADS)
Shepelev, Aleksandr; Severova, Galina; Potashova, Irina
2018-03-01
The study reported here examines the experience in the development and implementation of business simulation games based on network planning and management of high-rise construction. Appropriate network models of different types and levels of detail have been developed; a simulation model including 51 blocks (11 stages combined in 4 units) is proposed.
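The network-planning computation underlying such simulation games is the classic critical-path forward pass; the hedged sketch below shows it on an invented mini-project (not the authors' 51-block model).

```python
# Sketch: earliest-finish forward pass over an activity network (CPM).
# Durations in days; each activity lists its predecessors. Activities are illustrative.
activities = {
    "excavation": (10, []),
    "foundation": (20, ["excavation"]),
    "frame":      (30, ["foundation"]),
    "utilities":  (15, ["foundation"]),
    "finishing":  (25, ["frame", "utilities"]),
}

earliest = {}
def earliest_finish(name):
    # Earliest finish = own duration + latest earliest-finish among predecessors.
    if name not in earliest:
        dur, preds = activities[name]
        earliest[name] = dur + max((earliest_finish(p) for p in preds), default=0)
    return earliest[name]

for a in activities:
    earliest_finish(a)
last = max(earliest, key=earliest.get)
print("project duration:", earliest[last], "days, ending with", last)
```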
Medical Models for Teachers' Learning: Asking for a Second Opinion
ERIC Educational Resources Information Center
Philpott, Carey
2017-01-01
Recently there has been renewed interest in basing teachers' professional learning on medically derived models. This interest has included clinical practice models and evidence-based teaching as well as the use of various forms of "Rounds" which claim to derive from medical rounds. However, many arguing for these approaches may well not…
A Multinomial Model of Event-Based Prospective Memory
ERIC Educational Resources Information Center
Smith, Rebekah E.; Bayen, Ute J.
2004-01-01
Prospective memory is remembering to perform an action in the future. The authors introduce the 1st formal model of event-based prospective memory, namely, a multinomial model that includes 2 separate parameters related to prospective memory processes. The 1st measures preparatory attentional processes, and the 2nd measures retrospective memory…
Specifying and Refining a Measurement Model for a Simulation-Based Assessment. CSE Report 619.
ERIC Educational Resources Information Center
Levy, Roy; Mislevy, Robert J.
2004-01-01
The challenges of modeling students' performance in simulation-based assessments include accounting for multiple aspects of knowledge and skill that arise in different situations and the conditional dependencies among multiple aspects of performance in a complex assessment. This paper describes a Bayesian approach to modeling and estimating…
Efficient Band-to-Trap Tunneling Model Including Heterojunction Band Offset
Gao, Xujiao; Huang, Andy; Kerr, Bert
2017-10-25
In this paper, we present an efficient band-to-trap tunneling model based on the Schenk approach, in which an analytic density-of-states (DOS) model is developed based on the open boundary scattering method. The new model explicitly includes the effect of heterojunction band offset, in addition to the well-known field effect. Its analytic form enables straightforward implementation into TCAD device simulators. It is applicable to all one-dimensional potentials, which can be approximated to a good degree such that the approximated potentials lead to piecewise analytic wave functions with open boundary conditions. The model allows for simulating both the electric-field-enhanced and band-offset-enhanced carrier recombination due to the band-to-trap tunneling near the heterojunction in a heterojunction bipolar transistor (HBT). Simulation results of an InGaP/GaAs/GaAs NPN HBT show that the proposed model predicts significantly increased base currents, due to the hole-to-trap tunneling enhanced by the emitter-base junction band offset. Finally, the results compare favorably with experimental observation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Electromagnetic interference modeling and suppression techniques in variable-frequency drive systems
NASA Astrophysics Data System (ADS)
Yang, Le; Wang, Shuo; Feng, Jianghua
2017-11-01
Electromagnetic interference (EMI) causes electromechanical damage to motors and degrades the reliability of variable-frequency drive (VFD) systems. Unlike the fundamental-frequency components in motor drive systems, high-frequency EMI noise, which couples through the parasitic parameters of the system, is difficult to analyze and reduce. In this article, EMI modeling techniques for the different functional units in a VFD system, including induction motors, motor bearings, and rectifier-inverters, are reviewed and evaluated in terms of applied frequency range, model parameterization, and model accuracy. The EMI models for the motors are categorized based on modeling techniques and model topologies. Motor bearing and shaft models are also reviewed, and techniques used to eliminate bearing current are evaluated. Modeling techniques for conventional rectifier-inverter systems are also summarized. EMI noise suppression techniques, including passive filters, Wheatstone bridge balance, active filters, and optimized modulation, are reviewed and compared based on the VFD system models.
Climate-based models for West Nile Culex mosquito vectors in the Northeastern US
NASA Astrophysics Data System (ADS)
Gong, Hongfei; Degaetano, Arthur T.; Harrington, Laura C.
2011-05-01
Climate-based models simulating Culex mosquito population abundance in the Northeastern US were developed. Two West Nile vector species, Culex pipiens and Culex restuans, were included in model simulations. The model was optimized by a parameter-space search within biological bounds. Mosquito population dynamics were driven by major environmental factors including temperature, rainfall, evaporation rate and photoperiod. The results show a strong correlation between the timing of early population increases (as early warning of West Nile virus risk) and decreases in late summer. Simulated abundance was highly correlated with actual mosquito capture in New Jersey light traps and validated with field data. This climate-based model simulates the population dynamics of both the adult and immature mosquito life stage of Culex arbovirus vectors in the Northeastern US. It is expected to have direct and practical application for mosquito control and West Nile prevention programs.
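A hedged sketch in the spirit of this abstract follows: a stage-structured larva-adult model with temperature-dependent development driven by a seasonal climate signal. All rate functions and constants are illustrative guesses, not the optimized parameters of the published model.

```python
# Sketch: climate-driven stage-structured Culex population dynamics.
import numpy as np
from scipy.integrate import solve_ivp

def temperature(t):  # smooth seasonal cycle in deg C (t in days)
    return 12 + 12 * np.sin(2 * np.pi * (t - 120) / 365)

def rates(t, y):
    L, A = y                               # larvae, adults
    T = temperature(t)
    dev = max(0.0, 0.012 * (T - 10))       # larval development speeds up with warmth
    egg = max(0.0, 0.30 * (T - 10) / 15)   # per-adult recruitment of new larvae
    dL = egg * A - dev * L - 0.05 * L      # recruitment - maturation - larval death
    dA = dev * L - 0.08 * A                # maturation - adult death
    return [dL, dA]

sol = solve_ivp(rates, (0, 365), [1000.0, 100.0], dense_output=True)
t = np.linspace(0, 365, 13)
print(np.round(sol.sol(t)[1]))             # adult abundance through the year
```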
NASA Astrophysics Data System (ADS)
Clark, Martyn; Essery, Richard
2017-04-01
When faced with the complex and interdisciplinary challenge of building process-based land models, different modelers make different decisions at different points in the model development process. These modeling decisions are generally based on several considerations, including fidelity (e.g., what approaches faithfully simulate observed processes), complexity (e.g., which processes should be represented explicitly), practicality (e.g., what is the computational cost of the model simulations; are there sufficient resources to implement the desired modeling concepts), and data availability (e.g., is there sufficient data to force and evaluate models). Consequently the research community, comprising modelers of diverse background, experience, and modeling philosophy, has amassed a wide range of models, which differ in almost every aspect of their conceptualization and implementation. Model comparison studies have been undertaken to explore model differences, but have not been able to meaningfully attribute inter-model differences in predictive ability to individual model components because there are often too many structural and implementation differences among the different models considered. As a consequence, model comparison studies to date have provided limited insight into the causes of differences in model behavior, and model development has often relied on the inspiration and experience of individual modelers rather than on a systematic analysis of model shortcomings. This presentation will summarize the use of "multiple-hypothesis" modeling frameworks to understand differences in process-based snow models. Multiple-hypothesis frameworks define a master modeling template and include a wide variety of the process parameterizations and spatial configurations that are used in existing models. Such frameworks provide the capability to decompose complex models into the individual decisions that are made as part of model development, and to evaluate each decision in isolation. It is hence possible to attribute differences in system-scale model predictions to individual modeling decisions, providing scope to mimic the behavior of existing models, understand why models differ, characterize model uncertainty, and identify productive pathways to model improvement. Results will be presented from applying multiple-hypothesis frameworks to snow model comparison projects, including PILPS, SnowMIP, and the upcoming ESM-SnowMIP project.
Overview of physical models of liquid entrainment in annular gas-liquid flow
NASA Astrophysics Data System (ADS)
Cherdantsev, Andrey V.
2018-03-01
A number of recent papers devoted to the development of physically-based models for prediction of liquid entrainment in the annular regime of two-phase flow are analyzed. In these models, shearing off the crests of disturbance waves by the gas drag force is supposed to be the physical mechanism of the entrainment phenomenon. The models rest on a number of assumptions about the wavy structure, including inception of disturbance waves due to Kelvin-Helmholtz instability, a linear velocity profile inside the liquid film, and a high degree of three-dimensionality of the disturbance waves. The validity of these assumptions is analyzed by comparison to modern experimental observations. It is shown that nearly every assumption is in strong qualitative and quantitative disagreement with experiments, which leads to massive discrepancies between the modeled and real properties of the disturbance waves. As a result, such models over-predict the entrained fraction by several orders of magnitude. The discrepancy is usually reduced using various kinds of empirical corrections. This, combined with the empiricism already included in the models, turns them into another kind of empirical correlation rather than physically-based models.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hills, Richard G.; Maniaci, David Charles; Naughton, Jonathan W.
2015-09-01
A Verification and Validation (V&V) framework is presented for the development and execution of coordinated modeling and experimental programs to assess the predictive capability of computational models of complex systems through focused, well structured, and formal processes. The elements of the framework are based on established V&V methodology developed by various organizations, including the Department of Energy, the National Aeronautics and Space Administration, the American Institute of Aeronautics and Astronautics, and the American Society of Mechanical Engineers. Four main topics are addressed: 1) program planning based on expert elicitation of the modeling physics requirements, 2) experimental design for model assessment, 3) uncertainty quantification for experimental observations and computational model simulations, and 4) assessment of the model predictive capability. The audience for this document includes program planners, modelers, experimentalists, V&V specialists, and customers of the modeling results.
Software Tools to Support Research on Airport Departure Planning
NASA Technical Reports Server (NTRS)
Carr, Francis; Evans, Antony; Feron, Eric; Clarke, John-Paul
2003-01-01
A simple, portable and useful collection of software tools has been developed for the analysis of airport surface traffic. The tools are based on a flexible and robust traffic-flow model, and include calibration, validation and simulation functionality for this model. Several different interfaces have been developed to help promote usage of these tools, including a portable Matlab(TM) implementation of the basic algorithms; a web-based interface which provides online access to automated analyses of airport traffic based on a database of real-world operations data which covers over 250 U.S. airports over a 5-year period; and an interactive simulation-based tool currently in use as part of a college-level educational module. More advanced applications for airport departure traffic include taxi-time prediction and evaluation of "windowing" congestion control.
A Revised Thermosphere for the Mars Global Reference Atmospheric Model (Mars-GRAM Version 3.4)
NASA Technical Reports Server (NTRS)
Justus, C. G.; Johnson, D. L.; James, B. F.
1996-01-01
This report describes the newly-revised model thermosphere for the Mars Global Reference Atmospheric Model (Mars-GRAM, Version 3.4). It also provides descriptions of other changes made to the program since publication of the programmer's guide for Mars-GRAM Version 3.34. The original Mars-GRAM model thermosphere was based on the global-mean model of Stewart. The revised thermosphere is based largely on parameterizations derived from output data from the three-dimensional Mars Thermospheric Global Circulation Model (MTGCM). The new thermospheric model includes revised dependence on the 10.7 cm solar flux for the global means of exospheric temperature, temperature of the base of the thermosphere, and scale height for the thermospheric temperature variations, as well as revised dependence on orbital position for global mean height of the base of the thermosphere. Other features of the new thermospheric model are: (1) realistic variations of temperature and density with latitude and time of day, (2) more realistic wind magnitudes, based on improved estimates of horizontal pressure gradients, and (3) allowance for user-input adjustments to the model values for mean exospheric temperature and for height and temperature at the base of the thermosphere. Other new features of Mars-GRAM 3.4 include: (1) allowance for user-input values of climatic adjustment factors for temperature profiles from the surface to 75 km, and (2) a revised method for computing the sub-solar longitude position in the 'ORBIT' subroutine.
An Operational Model for the Prediction of Jet Blast
DOT National Transportation Integrated Search
2012-01-09
This paper presents an operational model for the prediction of jet blast. The model was developed based upon three modules: a jet exhaust model, a jet centerline decay model, and an aircraft motion model. The final analysis was compared with d...
NASA Astrophysics Data System (ADS)
Gray, S. G.; Voinov, A. A.; Jordan, R.; Paolisso, M.
2016-12-01
Model-based reasoning is a basic part of human understanding, decision-making, and communication. Including stakeholders in environmental model building and analysis is an increasingly popular approach to understanding environmental change since stakeholders often hold valuable knowledge about socio-environmental dynamics and since collaborative forms of modeling produce important boundary objects used to collectively reason about environmental problems. Although the number of participatory modeling (PM) case studies and the number of researchers adopting these approaches has grown in recent years, the lack of standardized reporting and limited reproducibility have prevented PM's establishment and advancement as a cohesive field of study. We suggest a four dimensional framework that includes reporting on dimensions of: (1) the Purpose for selecting a PM approach (the why); (2) the Process by which the public was involved in model building or evaluation (the how); (3) the Partnerships formed (the who); and (4) the Products that resulted from these efforts (the what). We highlight four case studies that use common PM software-based approaches (fuzzy cognitive mapping, agent-based modeling, system dynamics, and participatory geospatial modeling) to understand human-environment interactions and the consequences of environmental changes, including bushmeat hunting in Tanzania and Cameroon, agricultural production and deforestation in Zambia, and groundwater management in India. We demonstrate how standardizing communication about PM case studies can lead to innovation and new insights about model-based reasoning in support of environmental policy development. We suggest that our 4P framework and reporting approach provides a way for new hypotheses to be identified and tested in the growing field of PM.
Two Strain Dengue Model with Temporary Cross Immunity and Seasonality
NASA Astrophysics Data System (ADS)
Aguiar, Maíra; Ballesteros, Sebastien; Stollenwerk, Nico
2010-09-01
Models of dengue fever epidemiology have previously shown critical fluctuations with power-law distributions and also deterministic chaos in some parameter regions due to the multi-strain structure of the disease pathogen. In our first model including well-known biological features, we found a rich dynamical structure including limit cycles, symmetry-breaking bifurcations, torus bifurcations, coexisting attractors including isola solutions, and deterministic chaos (as indicated by positive Lyapunov exponents) in a much larger parameter region, which is also biologically more plausible than the previous results of other researchers. Based on these findings we will investigate the model structures further, including seasonality.
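A hedged, simplified sketch of this model family (without seasonality or infection imports) follows: after recovery from one strain, hosts pass through a temporarily cross-immune class before becoming susceptible to the other strain. The compartment structure is a reduced cousin of the published model, and all parameter values are illustrative.

```python
# Sketch: two-strain SIR-type model with temporary cross immunity (fractions of population).
import numpy as np
from scipy.integrate import solve_ivp

beta, gamma, alpha, mu = 2 * 52.0, 52.0, 2.0, 1 / 65.0   # rates per year; R0 = beta/gamma = 2

def rhs(t, y):
    S, I1, I2, R1, R2, S1, S2, J1, J2, R = y
    f1, f2 = beta * (I1 + J1), beta * (I2 + J2)   # forces of infection per strain
    return [
        mu - S * (f1 + f2) - mu * S,              # fully susceptible
        S * f1 - (gamma + mu) * I1,               # primary infection, strain 1
        S * f2 - (gamma + mu) * I2,               # primary infection, strain 2
        gamma * I1 - (alpha + mu) * R1,           # temporary cross immunity after strain 1
        gamma * I2 - (alpha + mu) * R2,           # temporary cross immunity after strain 2
        alpha * R1 - S1 * f2 - mu * S1,           # immune to 1, susceptible to 2
        alpha * R2 - S2 * f1 - mu * S2,           # immune to 2, susceptible to 1
        S2 * f1 - (gamma + mu) * J1,              # secondary infection, strain 1
        S1 * f2 - (gamma + mu) * J2,              # secondary infection, strain 2
        gamma * (J1 + J2) - mu * R,               # recovered from both strains
    ]

y0 = [0.9, 0.001, 0.0011, 0, 0, 0, 0, 0, 0, 0.0979]
sol = solve_ivp(rhs, (0, 100), y0, rtol=1e-8, atol=1e-10)
print("final infected fractions:", sol.y[1, -1] + sol.y[7, -1], sol.y[2, -1] + sol.y[8, -1])
```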
ERIC Educational Resources Information Center
Mbaziira, Alex Vincent
2017-01-01
Cybercriminals are increasingly using Internet-based text messaging applications to exploit their victims. Incidents of deceptive cybercrime in text-based communication are increasing and include fraud, scams, as well as favorable and unfavorable fake reviews. In this work, we use a text-based deception detection approach to train models for…
Ratcliffe, Michelle M
2012-08-01
Farm to School programs hold promise to address childhood obesity. These programs may increase students’ access to healthier foods, increase students’ knowledge of and desire to eat these foods, and increase their consumption of them. Implementing Farm to School programs requires the involvement of multiple people, including nutrition services, educators, and food producers. Because these groups have not traditionally worked together and each has different goals, it is important to demonstrate how Farm to School programs that are designed to decrease childhood obesity may also address others’ objectives, such as academic achievement and economic development. A logic model is an effective tool to help articulate a shared vision for how Farm to School programs may work to accomplish multiple goals. Furthermore, there is evidence that programs based on theory are more likely to be effective at changing individuals’ behaviors. Logic models based on theory may help to explain how a program works, aid in efficient and sustained implementation, and support the development of a coherent evaluation plan. This article presents a sample theory-based logic model for Farm to School programs. The presented logic model is informed by the polytheoretical model for food and garden-based education in school settings (PMFGBE). The logic model has been applied to multiple settings, including Farm to School program development and evaluation in urban and rural school districts. This article also includes a brief discussion on the development of the PMFGBE, a detailed explanation of how Farm to School programs may enhance the curricular, physical, and social learning environments of schools, and suggestions for the applicability of the logic model for practitioners, researchers, and policy makers.
Data Intensive Systems (DIS) Benchmark Performance Summary
2003-08-01
models assumed by today's conventional architectures. Such applications include model-based Automatic Target Recognition (ATR), synthetic aperture...radar (SAR) codes, large scale dynamic databases/battlefield integration, dynamic sensor-based processing, high-speed cryptanalysis, high-speed...distributed interactive and data intensive simulations, data-oriented problems characterized by pointer-based and other highly irregular data structures
A Multilevel Latent Growth Curve Approach to Predicting Student Proficiency
ERIC Educational Resources Information Center
Choi, Kilchan; Goldschmidt, Pete
2012-01-01
Value-added models and growth-based accountability aim to evaluate school's performance based on student growth in learning. The current focus is on linking the results from value-added models to the ones from growth-based accountability systems including Adequate Yearly Progress decisions mandated by No Child Left Behind. We present a new…
Surface-Charge-Based Micro-Models--A Solid Foundation for Learning about Direct Current Circuits
ERIC Educational Resources Information Center
Hirvonen, P. E.
2007-01-01
This study explores how the use of a surface-charge-based instructional approach affects introductory university level students' understanding of direct current (dc) circuits. The introduced teaching intervention includes electrostatics, surface-charge-based micro-models that explain the existence of an electric field inside the current-carrying…
nmsBuilder: Freeware to create subject-specific musculoskeletal models for OpenSim.
Valente, Giordano; Crimi, Gianluigi; Vanella, Nicola; Schileo, Enrico; Taddei, Fulvia
2017-12-01
Musculoskeletal modeling and simulations of movement have been increasingly used in orthopedic and neurological scenarios, with increased attention to subject-specific applications. In general, musculoskeletal modeling applications have been facilitated by the development of dedicated software tools; however, subject-specific studies have also been limited by time-consuming modeling workflows and the highly skilled expertise required. In addition, no reference tools exist to standardize the process of musculoskeletal model creation and make it more efficient. Here we present a freely available software application, nmsBuilder 2.0, to create musculoskeletal models in the file format of OpenSim, a widely used open-source platform for musculoskeletal modeling and simulation. nmsBuilder 2.0 is the result of a major refactoring of a previous implementation that moved a first step toward an efficient workflow for subject-specific model creation. nmsBuilder includes a graphical user interface that provides access to all functionalities, based on a framework for computer-aided medicine written in C++. The operations implemented can be used in a workflow to create OpenSim musculoskeletal models from 3D surfaces. A first step includes data processing to create supporting objects necessary to create models, e.g., surfaces, anatomical landmarks, reference systems; a second step includes the creation of OpenSim objects, e.g., bodies, joints, muscles, and the corresponding model. We present a case study using nmsBuilder 2.0: the creation of an MRI-based musculoskeletal model of the lower limb. The model included four rigid bodies, five degrees of freedom, and 43 musculotendon actuators, and was created from 3D surfaces of the segmented images of a healthy subject through the modeling workflow implemented in the software application. We have presented nmsBuilder 2.0 for the creation of musculoskeletal OpenSim models from image-based data, and made it freely available via nmsbuilder.org. This application provides an efficient workflow for model creation and helps standardize the process. We hope this will help promote personalized applications in musculoskeletal biomechanics, including larger sample size studies, and might also represent a basis for future developments for specific applications.
Modeling and Simulation of Lab-on-a-Chip Systems
2005-08-12
complex chip geometries (including multiple turns). Variations of sample concentration profiles in laminar diffusion-based micromixers are also derived... [Table-of-contents fragments: Chapter 6, "Modeling of Laminar Diffusion-Based Complex Electrokinetic Passive Micromixers"; Section 6.4.4, "Multi-Stream (Inter-Digital) Micromixers"]
Educational Television: Brazil.
ERIC Educational Resources Information Center
Bretz, R.; Shinar, D.
Based on evaluation of nine Brazilian educational television centers, an Instructional Television Training Model (ITV) was developed to aid in determining and designing training requirements for instructional television systems. Analysis based on this model would include these tasks: (1) determine instructional purpose of the television…
NASA Technical Reports Server (NTRS)
Tischer, A. E.
1987-01-01
The failure information propagation model (FIPM) data base was developed to store and manipulate the large amount of information anticipated for the various Space Shuttle Main Engine (SSME) FIPMs. The organization and structure of the FIPM data base is described, including a summary of the data fields and key attributes associated with each FIPM data file. The menu-driven software developed to facilitate and control the entry, modification, and listing of data base records is also discussed. The transfer of the FIPM data base and software to the NASA Marshall Space Flight Center is described. Complete listings of all of the data base definition commands and software procedures are included in the appendixes.
Bao, Wei; Hu, Frank B.; Rong, Shuang; Rong, Ying; Bowers, Katherine; Schisterman, Enrique F.; Liu, Liegang; Zhang, Cuilin
2013-01-01
This study aimed to evaluate the predictive performance of genetic risk models based on risk loci identified and/or confirmed in genome-wide association studies for type 2 diabetes mellitus. A systematic literature search was conducted in the PubMed/MEDLINE and EMBASE databases through April 13, 2012, and published data relevant to the prediction of type 2 diabetes based on genome-wide association marker–based risk models (GRMs) were included. Of the 1,234 potentially relevant articles, 21 articles representing 23 studies were eligible for inclusion. The median area under the receiver operating characteristic curve (AUC) among eligible studies was 0.60 (range, 0.55–0.68), which did not differ appreciably by study design, sample size, participants’ race/ethnicity, or the number of genetic markers included in the GRMs. In addition, the AUCs for type 2 diabetes did not improve appreciably with the addition of genetic markers into conventional risk factor–based models (median AUC, 0.79 (range, 0.63–0.91) vs. median AUC, 0.78 (range, 0.63–0.90), respectively). A limited number of included studies used reclassification measures and yielded inconsistent results. In conclusion, GRMs showed a low predictive performance for risk of type 2 diabetes, irrespective of study design, participants’ race/ethnicity, and the number of genetic markers included. Moreover, the addition of genome-wide association markers into conventional risk models produced little improvement in predictive performance. PMID:24008910
Modeling the long-term evolution of space debris
Nikolaev, Sergei; De Vries, Willem H.; Henderson, John R.; Horsley, Matthew A.; Jiang, Ming; Levatin, Joanne L.; Olivier, Scot S.; Pertica, Alexander J.; Phillion, Donald W.; Springer, Harry K.
2017-03-07
A space object modeling system that models the evolution of space debris is provided. The modeling system simulates interaction of space objects at simulation times throughout a simulation period. The modeling system includes a propagator that calculates the position of each object at each simulation time based on orbital parameters. The modeling system also includes a collision detector that, for each pair of objects at each simulation time, performs a collision analysis. When the distance between objects satisfies a conjunction criterion, the modeling system calculates a local minimum distance between the pair of objects based on a curve fitting to identify a time of closest approach at the simulation times and calculating the position of the objects at the identified time. When the local minimum distance satisfies a collision criterion, the modeling system models the debris created by the collision of the pair of objects.
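The conjunction-screening step described above lends itself to a compact numerical sketch. Assuming sampled positions for a pair of objects, the snippet below locates the sampled minimum of the squared separation, fits a parabola around it, and reads the time of closest approach off the vertex; the function name and the five-point fitting window are illustrative choices, not details taken from the patent.

```python
import numpy as np

def time_of_closest_approach(times, pos_a, pos_b):
    """Locate the sampled minimum of the squared separation between two
    objects, fit a parabola through the neighboring samples, and return the
    vertex time and distance as the refined conjunction estimate."""
    d2 = np.sum((pos_a - pos_b) ** 2, axis=1)       # squared distance per step
    k = int(np.argmin(d2))
    lo, hi = max(0, k - 2), min(len(times), k + 3)  # five-point fit window
    c2, c1, c0 = np.polyfit(times[lo:hi], d2[lo:hi], 2)
    t_min = -c1 / (2.0 * c2)                        # parabola vertex
    d_min = np.sqrt(max(c0 - c1 ** 2 / (4.0 * c2), 0.0))
    return t_min, d_min
```

Fitting the squared distance rather than the distance keeps the fitted quantity smooth (nearly quadratic in time for near-linear relative motion) and avoids the square-root cusp at the minimum.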
Let's Go Off the Grid: Subsurface Flow Modeling With Analytic Elements
NASA Astrophysics Data System (ADS)
Bakker, M.
2017-12-01
Subsurface flow modeling with analytic elements has the major advantage that no grid or time stepping is needed. Analytic element formulations exist for steady-state and transient flow in layered aquifers and unsaturated flow in the vadose zone. Analytic element models are vector-based and consist of points, lines, and curves that represent specific features in the subsurface. Recent advances allow for the simulation of partially penetrating wells and multi-aquifer wells, including skin effect and wellbore storage, horizontal wells of poly-line shape including skin effect, sharp changes in subsurface properties, and surface water features with leaky beds. Input files for analytic element models are simple, short, and readable, and can easily be generated from, for example, GIS databases. Future plans include the incorporation of analytic elements in parts of grid-based models where additional detail is needed. This presentation will give an overview of advanced flow features that can be modeled, many of which are implemented in free and open-source software.
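As a concrete illustration of the gridless idea, the sketch below superimposes Thiem well solutions to evaluate steady-state head anywhere in a confined aquifer. This is about the simplest possible analytic element model (extraction wells only, referenced to a known head at one point); the function names and parameter values are illustrative assumptions, not drawn from any particular package.

```python
import numpy as np

def head(x, y, wells, T, h0, x0=0.0, y0=0.0):
    """Steady-state head at (x, y) in a confined aquifer of transmissivity T,
    built by superposing Thiem solutions for wells and referencing the
    result to a known head h0 at (x0, y0). wells = [(xw, yw, Q), ...]."""
    h = h0
    for xw, yw, Q in wells:
        r = np.hypot(x - xw, y - yw)
        r0 = np.hypot(x0 - xw, y0 - yw)
        h += Q / (2.0 * np.pi * T) * np.log(r / r0)  # Q > 0 means extraction
    return h

# Head can be evaluated at any point, at any resolution, with no grid:
print(head(250.0, 100.0, wells=[(0.0, 0.0, 500.0), (400.0, 0.0, 300.0)],
           T=1000.0, h0=20.0, x0=-2000.0, y0=0.0))
```

Because the solution is analytic, heads and velocities can be evaluated exactly at arbitrary points, which is what makes embedding such elements inside locally refined parts of grid-based models attractive.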
InSTREAM: the individual-based stream trout research and environmental assessment model
Steven F. Railsback; Bret C. Harvey; Stephen K. Jackson; Roland H. Lamberson
2009-01-01
This report documents Version 4.2 of InSTREAM, including its formulation, software, and application to research and management problems. InSTREAM is a simulation model designed to understand how stream and river salmonid populations respond to habitat alteration, including altered flow, temperature, and turbidity regimes and changes in channel morphology. The model...
Development of 3D Oxide Fuel Mechanics Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spencer, B. W.; Casagranda, A.; Pitts, S. A.
This report documents recent work to improve the accuracy and robustness of the mechanical constitutive models used in the BISON fuel performance code. These developments include migration of the fuel mechanics models to be based on the MOOSE Tensor Mechanics module, improving the robustness of the smeared cracking model, implementing a capability to limit the time step size based on material model response, and improving the robustness of the return mapping iterations used in creep and plasticity models.
CELSS scenario analysis: Breakeven calculations
NASA Technical Reports Server (NTRS)
Mason, R. M.
1980-01-01
A model of the relative mass requirements of food production components in a controlled ecological life support system (CELSS) based on regenerative concepts is described. Included are a discussion of model scope, structure, and example calculations. Computer programs for cultivar and breakeven calculations are also included.
Pattern-oriented modeling of agent-based complex systems: Lessons from ecology
Grimm, Volker; Revilla, Eloy; Berger, Uta; Jeltsch, Florian; Mooij, Wolf M.; Railsback, Steven F.; Thulke, Hans-Hermann; Weiner, Jacob; Wiegand, Thorsten; DeAngelis, Donald L.
2005-01-01
Agent-based complex systems are dynamic networks of many interacting agents; examples include ecosystems, financial markets, and cities. The search for general principles underlying the internal organization of such systems often uses bottom-up simulation models such as cellular automata and agent-based models. No general framework for designing, testing, and analyzing bottom-up models has yet been established, but recent advances in ecological modeling have come together in a general strategy we call pattern-oriented modeling. This strategy provides a unifying framework for decoding the internal organization of agent-based complex systems and may lead toward unifying algorithmic theories of the relation between adaptive behavior and system complexity.
Inan, Halil Ibrahim; Sagris, Valentina; Devos, Wim; Milenov, Pavel; van Oosterom, Peter; Zevenbergen, Jaap
2010-12-01
The Common Agricultural Policy (CAP) of the European Union (EU) changed dramatically after 1992, and from then on the CAP has focused on the management of direct income subsidies instead of production-based subsidies. To support this focus, Member States (MS) are expected to establish an Integrated Administration and Control System (IACS), including a Land Parcel Identification System (LPIS) as the spatial part of IACS. Different MS have chosen different solutions for their LPIS. Currently, some MS base their IACS/LPIS on data from their Land Administration Systems (LAS), and many others use purpose-built special systems for their IACS/LPIS. The issue with these different IACS/LPIS is that they do not have standardized structures; rather, each represents a unique design in each MS, whether LAS-based or purpose-built. In this study, we aim at designing a core data model for those IACS/LPIS based on LAS. For this purpose, we make use of the ongoing standardization initiatives for LAS (Land Administration Domain Model: LADM) and IACS/LPIS (LPIS Core Model: LCM). The data model we propose in this study implies the collaboration between LADM and LCM and includes some extensions. Some basic issues with the collaboration model are discussed within this study: registration of farmers, land use rights and farming limitations, geometry/topology, temporal data management, etc. For further explanation of the model structure, sample instance-level diagrams illustrating some typical situations are also included.
NASA Astrophysics Data System (ADS)
Jain, Prateek; Yadav, Chandan; Agarwal, Amit; Chauhan, Yogesh Singh
2017-08-01
We present a surface-potential-based analytical model for the double gate tunnel field effect transistor (DGTFET) covering the current, terminal charges, and terminal capacitances. The model accounts for the effect of the mobile charge in the channel and captures the device physics in the depletion as well as the strong inversion regime. The narrowing of the tunnel barrier in the presence of mobile charges in the channel is incorporated via modeling of the inverse decay length, which is constant under the channel depletion condition and bias dependent under the inversion condition. To capture the ambipolar current behavior in the model, tunneling at the drain junction is also included. The proposed model is validated against TCAD simulation data and shows a close match with the simulation data.
Stagnaro-Green, Alex; Roe, David; Soto-Greene, Maria; Joffe, Russell
2008-01-01
Increasing financial pressures, along with a desire to realign resources with institutional priorities, have resulted in the adoption of mission-based funding (MBF) at many medical schools. The lack of inclusion of quality, and the time and expense of developing and implementing mission-based funding, are major deficiencies in the models reported to date. In academic year 2002-2003, New Jersey Medical School developed a model that included both quantity and quality in the education metric and that was departmentally based. Eighty percent of the undergraduate medical education allocation was based on the quantity of undergraduate medical education taught by the department ($7.35 million), and 20% ($1.89 million) was allocated based on the quality of the education delivered. Quality determinations were made by the educational leadership based on student evaluations and departmental compliance with educational administrative requirements. Evolution of the model has included the development of a faculty oversight committee and the integration of peer evaluation into the determination of educational quality. Six departments had a documented increase in quality over time, and one department had a transient decrease in quality. The MBF model has been well accepted by chairs, educational leaders, and faculty and has been instrumental in enhancing the stature of education at our institution.
Re-evaluation of model-based light-scattering spectroscopy for tissue spectroscopy
Lau, Condon; Šćepanović, Obrad; Mirkovic, Jelena; McGee, Sasha; Yu, Chung-Chieh; Fulghum, Stephen; Wallace, Michael; Tunnell, James; Bechtel, Kate; Feld, Michael
2009-01-01
Model-based light scattering spectroscopy (LSS) seemed a promising technique for in-vivo diagnosis of dysplasia in multiple organs. In those studies, the residual spectrum, the difference between the observed and modeled diffuse reflectance spectra, was attributed to single elastic light scattering from epithelial nuclei, and diagnostic information due to nuclear changes was extracted from it. We show that this picture is incorrect. The actual single scattering signal arising from epithelial nuclei is much smaller than the previously computed residual spectrum, and does not have the wavelength dependence characteristic of Mie scattering. Rather, the residual spectrum largely arises from assuming a uniform hemoglobin distribution. In fact, hemoglobin is packaged in blood vessels, which alters the reflectance. When we include vessel packaging, which accounts for an inhomogeneous hemoglobin distribution, in the diffuse reflectance model, the reflectance is modeled more accurately, greatly reducing the amplitude of the residual spectrum. These findings are verified via numerical estimates based on light propagation and Mie theory, tissue phantom experiments, and analysis of published data measured from Barrett's esophagus. In future studies, vessel packaging should be included in the model of diffuse reflectance and use of model-based LSS should be discontinued. PMID:19405760
An EKV-based high voltage MOSFET model with improved mobility and drift model
NASA Astrophysics Data System (ADS)
Chauhan, Yogesh Singh; Gillon, Renaud; Bakeroot, Benoit; Krummenacher, Francois; Declercq, Michel; Ionescu, Adrian Mihai
2007-11-01
An EKV-based high voltage MOSFET model is presented. The intrinsic channel model is derived based on the charge-based EKV formalism. An improved mobility model is used for the modeling of the intrinsic channel to improve the DC characteristics. The model uses a second-order dependence on the gate bias and an extra parameter for the smoothing of the saturation voltage of the intrinsic drain. An improved drift model [Chauhan YS, Anghel C, Krummenacher F, Ionescu AM, Declercq M, Gillon R, et al. A highly scalable high voltage MOSFET model. In: IEEE European solid-state device research conference (ESSDERC), September 2006. p. 270-3; Chauhan YS, Anghel C, Krummenacher F, Maier C, Gillon R, Bakeroot B, et al. Scalable general high voltage MOSFET model including quasi-saturation and self-heating effect. Solid State Electron 2006;50(11-12):1801-13] is used for the modeling of the drift region, which gives a smoother transition in the output characteristics and also models the quasi-saturation region of high voltage MOSFETs well. First, the model is validated on the numerical device simulation of the VDMOS transistor and then on the measured characteristics of the SOI-LDMOS transistor. The accuracy of the model is better than that of our previous model [Chauhan YS, et al. Solid State Electron 2006;50(11-12):1801-13, cited above], especially in the quasi-saturation region of the output characteristics.
Bernstein, Leslie R; Trahiotis, Constantine
2017-02-01
Interaural cross-correlation-based models of binaural processing have accounted successfully for a wide variety of binaural phenomena, including binaural detection, binaural discrimination, and measures of extents of laterality based on interaural temporal disparities, interaural intensitive disparities, and their combination. This report focuses on quantitative accounts of data obtained from binaural detection experiments published over five decades. Particular emphasis is placed on stimulus contexts for which commonly used correlation-based approaches fail to provide adequate explanations of the data. One such context concerns binaural detection of signals masked by certain noises that are narrow-band and/or interaurally partially correlated. It is shown that a cross-correlation-based model that includes stages of peripheral auditory processing can, when coupled with an appropriate decision variable, account well for a wide variety of classic and recently published binaural detection data including those that have, heretofore, proven to be problematic.
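For readers unfamiliar with this modeling framework, the core computation is a running normalized cross-correlation of the left- and right-ear signals over physiologically plausible interaural delays. The sketch below shows that kernel in isolation; the peripheral stages the abstract emphasizes (bandpass filtering, envelope compression, neural transduction) and the decision variable are omitted, and the function name and the 1-ms lag range are assumptions of this illustration.

```python
import numpy as np

def normalized_iacc(left, right, fs, max_lag_ms=1.0):
    """Normalized interaural cross-correlation over physiologically
    plausible lags; returns the peak correlation and the lag (seconds)
    at which it occurs, i.e. an ITD estimate."""
    n_lag = int(fs * max_lag_ms / 1000.0)
    l = (left - left.mean()) / (left.std() + 1e-12)
    r = (right - right.mean()) / (right.std() + 1e-12)
    rho = []
    for lag in range(-n_lag, n_lag + 1):
        if lag >= 0:
            a, b = l[lag:], r[:len(r) - lag]
        else:
            a, b = l[:lag], r[-lag:]
        rho.append(np.mean(a * b))
    rho = np.array(rho)
    k = int(np.argmax(rho))
    return rho[k], (k - n_lag) / fs
```

The lag at which the correlation peaks serves as the interaural-time-difference estimate, and the peak height is the quantity that correlation-based detection models track; narrow-band and partially correlated maskers are exactly the stimuli for which the choice of decision variable on top of this kernel matters.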
AgBase: supporting functional modeling in agricultural organisms
McCarthy, Fiona M.; Gresham, Cathy R.; Buza, Teresia J.; Chouvarine, Philippe; Pillai, Lakshmi R.; Kumar, Ranjit; Ozkan, Seval; Wang, Hui; Manda, Prashanti; Arick, Tony; Bridges, Susan M.; Burgess, Shane C.
2011-01-01
AgBase (http://www.agbase.msstate.edu/) provides resources to facilitate modeling of functional genomics data and structural and functional annotation of agriculturally important animal, plant, microbe and parasite genomes. The website is redesigned to improve accessibility and ease of use, including improved search capabilities. Expanded capabilities include new dedicated pages for horse, cat, dog, cotton, rice and soybean. We currently provide 590 240 Gene Ontology (GO) annotations to 105 454 gene products in 64 different species, including GO annotations linked to transcripts represented on agricultural microarrays. For many of these arrays, this provides the only functional annotation available. GO annotations are available for download and we provide comprehensive, species-specific GO annotation files for 18 different organisms. The tools available at AgBase have been expanded and several existing tools improved based upon user feedback. One of seven new tools available at AgBase, GOModeler, supports hypothesis testing from functional genomics data. We host several associated databases and provide genome browsers for three agricultural pathogens. Moreover, we provide comprehensive training resources (including worked examples and tutorials) via links to Educational Resources at the AgBase website. PMID:21075795
Integrated Main Propulsion System Performance Reconstruction Process/Models
NASA Technical Reports Server (NTRS)
Lopez, Eduardo; Elliott, Katie; Snell, Steven; Evans, Michael
2013-01-01
The Integrated Main Propulsion System (MPS) Performance Reconstruction process provides the MPS post-flight data files needed for postflight reporting to the project integration management and key customers to verify flight performance. This process/model was used as the baseline for the currently ongoing Space Launch System (SLS) work. The process utilizes several methodologies, including multiple software programs, to model integrated propulsion system performance through space shuttle ascent. It is used to evaluate integrated propulsion systems, including propellant tanks, feed systems, rocket engine, and pressurization systems performance throughout ascent based on flight pressure and temperature data. The latest revision incorporates new methods based on main engine power balance model updates to model higher mixture ratio operation at lower engine power levels.
Interactive classification and content-based retrieval of tissue images
NASA Astrophysics Data System (ADS)
Aksoy, Selim; Marchisio, Giovanni B.; Tusk, Carsten; Koperski, Krzysztof
2002-11-01
We describe a system for interactive classification and retrieval of microscopic tissue images. Our system models tissues in pixel, region and image levels. Pixel level features are generated using unsupervised clustering of color and texture values. Region level features include shape information and statistics of pixel level feature values. Image level features include statistics and spatial relationships of regions. To reduce the gap between low-level features and high-level expert knowledge, we define the concept of prototype regions. The system learns the prototype regions in an image collection using model-based clustering and density estimation. Different tissue types are modeled using spatial relationships of these regions. Spatial relationships are represented by fuzzy membership functions. The system automatically selects significant relationships from training data and builds models which can also be updated using user relevance feedback. A Bayesian framework is used to classify tissues based on these models. Preliminary experiments show that the spatial relationship models we developed provide a flexible and powerful framework for classification and retrieval of tissue images.
Model-Based Prognostics of Hybrid Systems
NASA Technical Reports Server (NTRS)
Daigle, Matthew; Roychoudhury, Indranil; Bregon, Anibal
2015-01-01
Model-based prognostics has become a popular approach to solving the prognostics problem. However, almost all work has focused on prognostics of systems with continuous dynamics. In this paper, we extend the model-based prognostics framework to hybrid systems models that combine both continuous and discrete dynamics. In general, most systems are hybrid in nature, including those that combine physical processes with software. We generalize the model-based prognostics formulation to hybrid systems, and describe the challenges involved. We present a general approach for modeling hybrid systems, and overview methods for solving estimation and prediction in hybrid systems. As a case study, we consider the problem of conflict (i.e., loss of separation) prediction in the National Airspace System, in which the aircraft models are hybrid dynamical systems.
Empirical validation of an agent-based model of wood markets in Switzerland
Hilty, Lorenz M.; Lemm, Renato; Thees, Oliver
2018-01-01
We present an agent-based model of wood markets and show our efforts to validate this model using empirical data from different sources, including interviews, workshops, experiments, and official statistics. Our own surveys closed gaps where data were not available. Our approach to model validation used a variety of techniques, including the replication of historical production amounts, prices, and survey results, as well as a historical case study of a large sawmill entering the market and becoming insolvent only a few years later. Validating the model using this case provided additional insights, showing how the model can be used to simulate scenarios of resource availability and resource allocation. We conclude that the outcome of the rigorous validation qualifies the model to simulate scenarios concerning resource availability and allocation in our study region. PMID:29351300
NASA Astrophysics Data System (ADS)
Jitsuhiro, Takatoshi; Toriyama, Tomoji; Kogure, Kiyoshi
We propose a noise suppression method based on multi-model compositions and multi-pass search. In real environments, input speech for speech recognition includes many kinds of noise signals. To obtain good recognition candidates, it is important to suppress many kinds of noise signals at once and to find the target speech. Before noise suppression, to find speech and noise label sequences, we introduce a multi-pass search with acoustic models including many kinds of noise models and their compositions, their n-gram models, and their lexicon. Noise suppression is then performed frame-synchronously using the multiple models selected by the recognized label sequences with time alignments. We evaluated this method using the E-Nightingale task, which contains voice memoranda spoken by nurses during actual work at hospitals. The proposed method obtained higher performance than the conventional method.
Enhanced project management tool
NASA Technical Reports Server (NTRS)
Hsu, Chen-Jung (Inventor); Patel, Hemil N. (Inventor); Maluf, David A. (Inventor); Moh Hashim, Jairon C. (Inventor); Tran, Khai Peter B. (Inventor)
2012-01-01
A system for managing a project that includes multiple tasks and a plurality of workers. Input information includes characterizations based upon a human model, a team model and a product model. Periodic reports, such as one or more of a monthly report, a task plan report, a schedule report, a budget report and a risk management report, are generated and made available for display or further analysis or collection into a customized report template. An extensible database allows searching for information based upon context and upon content. Seven different types of project risks are addressed, including non-availability of required skill mix of workers. The system can be configured to exchange data and results with corresponding portions of similar project analyses, and to provide user-specific access to specified information.
Dynamic Analyses Including Joints Of Truss Structures
NASA Technical Reports Server (NTRS)
Belvin, W. Keith
1991-01-01
Method for mathematically modeling joints to assess influences of joints on dynamic response of truss structures developed in study. Only structures with low-frequency oscillations considered; only Coulomb friction and viscous damping included in analysis. Focus of effort to obtain finite-element mathematical models of joints exhibiting load-vs.-deflection behavior similar to measured load-vs.-deflection behavior of real joints. Experiments performed to determine stiffness and damping nonlinearities typical of joint hardware. Algorithm for computing coefficients of analytical joint models based on test data developed to enable study of linear and nonlinear effects of joints on global structural response. Besides intended application to large space structures, applications in nonaerospace community include ground-based antennas and earthquake-resistant steel-framed buildings.
Mathematical modeling and computational prediction of cancer drug resistance.
Sun, Xiaoqiang; Hu, Bin
2017-06-23
Diverse forms of resistance to anticancer drugs can lead to the failure of chemotherapy. Drug resistance is one of the most intractable issues for successfully treating cancer in current clinical practice. Effective clinical approaches that could counter drug resistance by restoring the sensitivity of tumors to the targeted agents are urgently needed. As numerous experimental results on resistance mechanisms have been obtained and a mass of high-throughput data has been accumulated, mathematical modeling and computational predictions using systematic and quantitative approaches have become increasingly important, as they can potentially provide deeper insights into resistance mechanisms, generate novel hypotheses or suggest promising treatment strategies for future testing. In this review, we first briefly summarize the current progress of experimentally revealed resistance mechanisms of targeted therapy, including genetic mechanisms, epigenetic mechanisms, posttranslational mechanisms, cellular mechanisms, microenvironmental mechanisms and pharmacokinetic mechanisms. Subsequently, we list several currently available databases and Web-based tools related to drug sensitivity and resistance. Then, we focus primarily on introducing some state-of-the-art computational methods used in drug resistance studies, including mechanism-based mathematical modeling approaches (e.g. molecular dynamics simulation, kinetic model of molecular networks, ordinary differential equation model of cellular dynamics, stochastic model, partial differential equation model, agent-based model, pharmacokinetic-pharmacodynamic model, etc.) and data-driven prediction methods (e.g. omics data-based conventional screening approach for node biomarkers, static network approach for edge biomarkers and module biomarkers, dynamic network approach for dynamic network biomarkers and dynamic module network biomarkers, etc.). Finally, we discuss several further questions and future directions for the use of computational methods for studying drug resistance, including inferring drug-induced signaling networks, multiscale modeling, drug combinations and precision medicine.
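To make the "mechanism-based mathematical modeling" category concrete, here is about the smallest ODE model of acquired resistance one can write: drug-sensitive and resistant subclones compete for a shared carrying capacity while therapy kills only the sensitive clone. Everything here, including parameter names and values, is an illustrative assumption rather than a model from any study cited in the review.

```python
import numpy as np
from scipy.integrate import solve_ivp

def tumor(t, y, r_s, r_r, K, d, dose):
    """Sensitive (S) and resistant (R) tumor subclones under logistic
    competition; the drug adds a kill term d*dose acting on S only."""
    S, R = y
    crowding = 1.0 - (S + R) / K
    return [r_s * S * crowding - d * dose * S,
            r_r * R * crowding]

# Continuous therapy for 180 days starting from a mostly sensitive tumor.
sol = solve_ivp(tumor, (0.0, 180.0), [1e6, 1e3],
                args=(0.06, 0.04, 1e9, 0.08, 1.0), dense_output=True)
S, R = sol.y
print(f"resistant fraction at day 180: {R[-1] / (S[-1] + R[-1]):.2f}")
```

Even this toy model reproduces the qualitative clinical course the review describes: total burden first shrinks under therapy and then relapses as the resistant clone, released from competition, takes over.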
Sutton, J P; DeJong, G; Song, H; Wilkerson, D
1997-12-01
To operationalize research findings about a medical rehabilitation classification and payment model by building a prototype of a prospective payment system, and to determine whether this prototype model promotes payment equity. This latter objective is accomplished by identifying whether any facility or payment model characteristics are systematically associated with financial performance. This study was conducted in two phases. In Phase 1, the components of a diagnosis-related group (DRG)-like payment system, including a base rate, function-related group (FRG) weights, and adjusters, were identified and estimated using hospital cost functions. Phase 2 consisted of a simulation analysis in which each facility's financial performance was modeled, based on its 1990-1991 case mix. A multivariate regression analysis was conducted to assess the extent to which characteristics of 42 rehabilitation facilities contribute toward determining financial performance under the present Medicare payment system as well as under the hypothetical model developed. Phase 1 (model development) included 61 rehabilitation hospitals. Approximately 59% were rehabilitation units within a general hospital and 48% were teaching facilities. The number of rehabilitation beds averaged 52. Phase 2, the simulation analysis, included 42 rehabilitation facilities, subscribers to UDS in 1990-1991. Of these, 69% were rehabilitation units and 52% were teaching facilities. The number of rehabilitation beds averaged 48. Financial performance was measured by the ratio of reimbursement to average costs. Case-mix index is the primary determinant of financial performance under the present Medicare payment system. None of the facility characteristics included in this analysis were associated with financial performance under the hypothetical FRG payment model. The most notable impact of an FRG-based payment model would be to create a stronger link between resource intensity and level of reimbursement, resulting in greater equity in the reimbursement of inpatient medical rehabilitation hospitals.
Selection of Variables in Cluster Analysis: An Empirical Comparison of Eight Procedures
ERIC Educational Resources Information Center
Steinley, Douglas; Brusco, Michael J.
2008-01-01
Eight different variable selection techniques for model-based and non-model-based clustering are evaluated across a wide range of cluster structures. It is shown that several methods have difficulties when non-informative variables (i.e., random noise) are included in the model. Furthermore, the distribution of the random noise greatly impacts the…
The Use of Theory in School Effectiveness Research Revisited
ERIC Educational Resources Information Center
Scheerens, Jaap
2013-01-01
From an international review of 109 school effectiveness research studies, only 6 could be seen as theory driven. As the border between substantive conceptual models of educational effectiveness and theory-based models is not always very sharp, this number might be increased to 11 by including those studies that are based on models that make…
Petersen, Mark D.; Zeng, Yuehua; Haller, Kathleen M.; McCaffrey, Robert; Hammond, William C.; Bird, Peter; Moschetti, Morgan; Shen, Zhengkang; Bormann, Jayne; Thatcher, Wayne
2014-01-01
The 2014 National Seismic Hazard Maps for the conterminous United States incorporate additional uncertainty in the fault slip-rate parameter that controls earthquake-activity rates, beyond what was applied in previous versions of the hazard maps. This additional uncertainty is accounted for by new geodesy- and geology-based slip-rate models for the Western United States. Models that were considered include an updated geologic model based on expert opinion and four combined inversion models informed by both geologic and geodetic input. The two block models considered indicate significantly higher slip rates than the expert-opinion model and the two fault-based combined inversion models. For the hazard maps, we apply 20 percent weight, with equal weighting, to the two fault-based models. Off-fault geodetic-based models were not considered in this version of the maps. Resulting changes to the hazard maps are generally less than 0.05 g (acceleration of gravity). Future research will improve the maps and interpret differences between the new models.
An Approach to the Evaluation of Hypermedia.
ERIC Educational Resources Information Center
Knussen, Christina; And Others
1991-01-01
Discusses methods that may be applied to the evaluation of hypermedia, based on six models described by Lawton. Techniques described include observation, self-report measures, interviews, automated measures, psychometric tests, checklists and criterion-based techniques, process models, Experimentally Measuring Usability (EMU), and a naturalistic…
A School-Based Mental Health Consultation Curriculum.
ERIC Educational Resources Information Center
Sandoval, Jonathan; Davis, John M.
1984-01-01
Presents one position on consultation that integrates a theoretical model, a process model, and a curriculum for training school-based mental health consultants. Elements of the proposed curriculum include: ethics, relationship building, maintaining rapport, defining problems, gathering data, sharing information, generating and supporting…
Building occupancy simulation and data assimilation using a graph-based agent-oriented model
NASA Astrophysics Data System (ADS)
Rai, Sanish; Hu, Xiaolin
2018-07-01
Building occupancy simulation and estimation simulates the dynamics of occupants and estimates their real-time spatial distribution in a building. It requires a simulation model and an algorithm for data assimilation that assimilates real-time sensor data into the simulation model. Existing building occupancy simulation models include agent-based models and graph-based models. The agent-based models suffer high computation cost for simulating large numbers of occupants, and graph-based models overlook the heterogeneity and detailed behaviors of individuals. Recognizing the limitations of existing models, this paper presents a new graph-based agent-oriented model which can efficiently simulate large numbers of occupants in various kinds of building structures. To support real-time occupancy dynamics estimation, a data assimilation framework based on Sequential Monte Carlo Methods is also developed and applied to the graph-based agent-oriented model to assimilate real-time sensor data. Experimental results show the effectiveness of the developed model and the data assimilation framework. The major contributions of this work are to provide an efficient model for building occupancy simulation that can accommodate large numbers of occupants and an effective data assimilation framework that can provide real-time estimations of building occupancy from sensor data.
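The Sequential Monte Carlo step can be summarized in a few lines. In the sketch below, a building is a graph given as an adjacency list, each particle is a vector assigning every occupant to a node, prediction moves occupants to random neighboring nodes (a stand-in for the paper's behavior rules), and weighting compares per-room particle counts against sensor counts under a Gaussian error model; all names and the likelihood choice are assumptions of this illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def predict(state, A):
    """Move every occupant to a uniformly chosen neighboring room; `state`
    holds one room index per occupant, `A` is the graph's adjacency list."""
    return np.array([rng.choice(A[room]) for room in state])

def assimilate(particles, obs, A, sigma=1.0):
    """One bootstrap-filter cycle: predict each particle forward, weight it
    by how well its per-room counts match the sensor counts `obs` (Gaussian
    error model), then resample particles in proportion to the weights."""
    particles = [predict(p, A) for p in particles]
    counts = np.array([np.bincount(p, minlength=len(A)) for p in particles])
    logw = -np.sum((counts - obs) ** 2, axis=1) / (2.0 * sigma ** 2)
    w = np.exp(logw - logw.max())
    w /= w.sum()
    idx = rng.choice(len(particles), size=len(particles), p=w)
    return [particles[i].copy() for i in idx]

# Toy building: 4 rooms in a ring; 50 occupants; 200 particles.
A = [[1, 3], [0, 2], [1, 3], [0, 2]]
particles = [rng.integers(0, 4, size=50) for _ in range(200)]
obs = np.array([10, 15, 20, 5])          # sensor counts for one time step
particles = assimilate(particles, obs, A)
```

Resampling by the normalized weights keeps the particles that agree with the sensors, which is what turns the simulation model into a real-time occupancy estimator.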
Model-free and model-based reward prediction errors in EEG.
Sambrook, Thomas D; Hardwick, Ben; Wills, Andy J; Goslin, Jeremy
2018-05-24
Learning theorists posit two reinforcement learning systems: model-free and model-based. Model-based learning incorporates knowledge about structure and contingencies in the world to assign candidate actions with an expected value. Model-free learning is ignorant of the world's structure; instead, actions hold a value based on prior reinforcement, with this value updated by expectancy violation in the form of a reward prediction error. Because they use such different learning mechanisms, it has been previously assumed that model-based and model-free learning are computationally dissociated in the brain. However, recent fMRI evidence suggests that the brain may compute reward prediction errors to both model-free and model-based estimates of value, signalling the possibility that these systems interact. Because of its poor temporal resolution, fMRI risks confounding reward prediction errors with other feedback-related neural activity. In the present study, EEG was used to show the presence of both model-based and model-free reward prediction errors and their place in a temporal sequence of events including state prediction errors and action value updates. This demonstration of model-based prediction errors questions a long-held assumption that model-free and model-based learning are dissociated in the brain.
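The two prediction-error signals contrasted in the abstract are easy to state computationally. The sketch below shows a standard model-free temporal-difference error alongside values computed from a learned world model by value iteration, against which a model-based prediction error can be formed; the tabular setting and all variable names are illustrative assumptions, not the authors' analysis pipeline.

```python
import numpy as np

def td_update(Q, s, a, r, s_next, alpha=0.1, gamma=0.95):
    """Model-free temporal-difference learning: the reward prediction error
    delta updates only the cached value Q[s, a]."""
    delta = r + gamma * np.max(Q[s_next]) - Q[s, a]   # model-free RPE
    Q[s, a] += alpha * delta
    return delta

def model_based_values(T, R, gamma=0.95, iters=200):
    """Values implied by a learned world model via value iteration;
    T[s, a, s'] are transition probabilities, R[s, a] expected rewards.
    A model-based RPE is the discrepancy relative to these values."""
    V = np.zeros(T.shape[0])
    for _ in range(iters):
        V = np.max(R + gamma * (T @ V), axis=1)
    return V
```

In an EEG study the question is whether feedback-locked components track only the model-free delta, or also the discrepancy relative to the model-based values V; finding both argues against a strict dissociation.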
Predicting Acute Exacerbations in Chronic Obstructive Pulmonary Disease.
Samp, Jennifer C; Joo, Min J; Schumock, Glen T; Calip, Gregory S; Pickard, A Simon; Lee, Todd A
2018-03-01
With increasing health care costs that have outpaced those of other industries, payers of health care are moving from a fee-for-service payment model to one in which reimbursement is tied to outcomes. Chronic obstructive pulmonary disease (COPD) is a disease where this payment model has been implemented by some payers, and COPD exacerbations are a quality metric that is used. Under an outcomes-based payment model, it is important for health systems to be able to identify patients at risk for poor outcomes so that they can target interventions to improve outcomes. To develop and evaluate predictive models that could be used to identify patients at high risk for COPD exacerbations. This study was retrospective and observational and included COPD patients treated with a bronchodilator-based combination therapy. We used health insurance claims data to obtain demographics, enrollment information, comorbidities, medication use, and health care resource utilization for each patient over a 6-month baseline period. Exacerbations were examined over a 6-month outcome period and included inpatient (primary discharge diagnosis for COPD), outpatient, and emergency department (outpatient/emergency department visits with a COPD diagnosis plus an acute prescription for an antibiotic or corticosteroid within 5 days) exacerbations. The cohort was split into training (75%) and validation (25%) sets. Within the training cohort, stepwise logistic regression models were created to evaluate risk of exacerbations based on factors measured during the baseline period. Models were evaluated using sensitivity, specificity, and positive and negative predictive values. The base model included all confounding or effect modifier covariates. Several other models were explored using different sets of observations and variables to determine the best predictive model. There were 478,772 patients included in the analytic sample, of which 40.5% had exacerbations during the outcome period. Patients with exacerbations had slightly more comorbidities, medication use, and health care resource utilization compared with patients without exacerbations. In the base model, sensitivity was 41.6% and specificity was 85.5%. Positive and negative predictive values were 66.2% and 68.2%, respectively. Other models that were evaluated resulted in similar test characteristics as the base model. In this study, we were not able to predict COPD exacerbations with a high level of accuracy using health insurance claims data from COPD patients treated with bronchodilator-based combination therapy. Future studies should be done to explore predictive models for exacerbations. No outside funding supported this study. Samp is now employed by, and owns stock in, AbbVie. The other authors have nothing to disclose. Study concept and design were contributed by Joo and Pickard, along with the other authors. Samp and Lee performed the data analysis, with assistance from the other authors. Samp wrote the manuscript, which was revised by Schumock and Calip, along with the other authors.
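The abstract's modeling and evaluation pipeline corresponds to a few lines of standard code. The sketch below fits a logistic regression on a 75/25 split and reports the four test characteristics quoted in the study; it uses synthetic stand-in data and plain (rather than stepwise) logistic regression, so everything here is illustrative and not the study's claims-based feature set.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Synthetic stand-ins for baseline claims features (comorbidities,
# medication use, utilization) and the 6-month exacerbation outcome.
X = rng.normal(size=(10_000, 20))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=10_000) > 1.0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, train_size=0.75,
                                          random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

tn, fp, fn, tp = confusion_matrix(y_te, model.predict(X_te)).ravel()
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
ppv, npv = tp / (tp + fp), tn / (tn + fn)
print(f"sens={sensitivity:.3f} spec={specificity:.3f} "
      f"ppv={ppv:.3f} npv={npv:.3f}")
```

With informative features, a sensitivity/specificity trade-off like the reported 41.6%/85.5% can be shifted by moving the classification threshold rather than refitting the model.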
Safaei, Soroush; Blanco, Pablo J; Müller, Lucas O; Hellevik, Leif R; Hunter, Peter J
2018-01-01
We propose a detailed CellML model of the human cerebral circulation that runs faster than real time on a desktop computer and is designed for use in clinical settings when the speed of response is important. A lumped parameter mathematical model, which is based on a one-dimensional formulation of the flow of an incompressible fluid in distensible vessels, is constructed using a bond graph formulation to ensure mass conservation and energy conservation. The model includes arterial vessels with geometric and anatomical data based on the ADAN circulation model. The peripheral beds are represented by lumped parameter compartments. We compare the hemodynamics predicted by the bond graph formulation of the cerebral circulation with that given by a classical one-dimensional Navier-Stokes model working on top of the whole-body ADAN model. Outputs from the bond graph model, including the pressure and flow signatures and blood volumes, are compared with physiological data.
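As a minimal illustration of the lumped-parameter idea used for the peripheral beds, the sketch below integrates a two-element Windkessel compartment (a compliance charged by inflow and drained through a peripheral resistance). The bond-graph machinery, the 1-D arterial segments, and the ADAN geometry are all omitted, and the parameter values are rough textbook-scale assumptions.

```python
import numpy as np
from scipy.integrate import solve_ivp

def windkessel(t, y, R, C, Q_in):
    """Two-element Windkessel: C*dP/dt = Q_in(t) - P/R, with pressure P in
    mmHg, resistance R in mmHg*s/mL, and compliance C in mL/mmHg."""
    P = y[0]
    return [(Q_in(t) - P / R) / C]

# Half-rectified sinusoid as a crude pulsatile inflow (peak 400 mL/s at 1 Hz).
Q_in = lambda t: 400.0 * max(np.sin(2.0 * np.pi * t), 0.0)
sol = solve_ivp(windkessel, (0.0, 10.0), [80.0],
                args=(1.0, 1.5, Q_in), max_step=0.005)
print(f"pressure range: {sol.y[0].min():.0f}-{sol.y[0].max():.0f} mmHg")
```

Mass and energy conservation, which the bond-graph formulation guarantees by construction, must here be checked by hand; that bookkeeping is precisely what the paper's formulation automates.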
Metal Ion Modeling Using Classical Mechanics
2017-01-01
Metal ions play significant roles in numerous fields including chemistry, geochemistry, biochemistry, and materials science. With computational tools increasingly becoming important in chemical research, methods have emerged to effectively face the challenge of modeling metal ions in the gas, aqueous, and solid phases. Herein, we review both quantum and classical modeling strategies for metal ion-containing systems that have been developed over the past few decades. This Review focuses on classical metal ion modeling based on unpolarized models (including the nonbonded, bonded, cationic dummy atom, and combined models), polarizable models (e.g., the fluctuating charge, Drude oscillator, and the induced dipole models), the angular overlap model, and valence bond-based models. Quantum mechanical studies of metal ion-containing systems at the semiempirical, ab initio, and density functional levels of theory are reviewed as well with a particular focus on how these methods inform classical modeling efforts. Finally, conclusions and future prospects and directions are offered that will further enhance the classical modeling of metal ion-containing systems. PMID:28045509
Model-based Clustering of High-Dimensional Data in Astrophysics
NASA Astrophysics Data System (ADS)
Bouveyron, C.
2016-05-01
The nature of data in Astrophysics has changed, as in other scientific fields, in the past decades due to the increase of measurement capabilities. As a consequence, data are nowadays frequently of high dimensionality and available in mass or as streams. Model-based techniques for clustering are popular tools which are renowned for their probabilistic foundations and their flexibility. However, classical model-based techniques show a disappointing behavior in high-dimensional spaces, mainly due to their dramatic over-parametrization. The recent developments in model-based classification overcome these drawbacks and make it possible to efficiently classify high-dimensional data, even in the "small n / large p" situation. This work presents a comprehensive review of these recent approaches, including regularization-based techniques, parsimonious modeling, subspace classification methods and classification methods based on variable selection. The use of these model-based methods is also illustrated on real-world classification problems in Astrophysics using R packages.
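The parsimonious-modeling idea is easy to demonstrate: constraining the covariance structure of a Gaussian mixture cuts the parameter count from O(p^2) per component to O(p), which is what keeps likelihood-based clustering workable when p is large. The abstract's illustrations use R packages; the sketch below is an analogous Python version on synthetic stand-in data, with the number of clusters chosen by BIC.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Synthetic stand-in for high-dimensional measurements: two shifted clouds.
X = np.vstack([rng.normal(0.0, 1.0, (300, 50)),
               rng.normal(1.5, 1.0, (200, 50))])

# Diagonal covariances: a simple parsimonious constraint, O(p) per component.
fits = {k: GaussianMixture(n_components=k, covariance_type="diag",
                           random_state=0).fit(X) for k in range(1, 6)}
best_k = min(fits, key=lambda k: fits[k].bic(X))   # BIC model selection
labels = fits[best_k].predict(X)
print(best_k, np.bincount(labels))
```

Subspace methods such as those reviewed go further by fitting each component in its own low-dimensional latent subspace, but the diagonal constraint already illustrates why parsimony rescues the likelihood in the "small n / large p" regime.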
Risk-Based Fire Safety Experiment Definition for Manned Spacecraft
NASA Technical Reports Server (NTRS)
Apostolakis, G. E.; Ho, V. S.; Marcus, E.; Perry, A. T.; Thompson, S. L.
1989-01-01
Risk methodology is used to define experiments to be conducted in space which will help to construct and test the models required for accident sequence identification. The development of accident scenarios is based on the realization that whether damage occurs depends on the time competition of two processes: the ignition and creation of an adverse environment, and the detection and suppression activities. If the fire grows and causes damage faster than it is detected and suppressed, then an accident occurs. The proposed integrated experiments will provide information on individual models that apply to each of the above processes, as well as previously unidentified interactions and processes, if any. Initially, models that are used in terrestrial fire risk assessments are considered. These include heat and smoke release models, detection and suppression models, as well as damage models. In cases where the absence of gravity substantially invalidates a model, alternate models will be developed. Models that depend on buoyancy effects, such as the multizone compartment fire models, are included in these cases. The experiments will be performed in a variety of geometries simulating habitable areas, racks, and other spaces. These simulations will necessitate theoretical studies of scaling effects. Sensitivity studies will also be carried out including the effects of varying oxygen concentrations, pressures, fuel orientation and geometry, and air flow rates. The experimental apparatus described herein includes three major modules: the combustion, the fluids, and the command and power modules.
NASA Astrophysics Data System (ADS)
Caldararu, S.; Smith, M. J.; Purves, D.; Emmott, S.
2013-12-01
Global agriculture will, in the future, be faced with two main challenges: climate change and an increase in global food demand driven by population growth and changes in consumption habits. To be able to predict both the impacts of changes in climate on crop yields and the changes in agricultural practices necessary to respond to such impacts, we currently need to improve our understanding of crop responses to climate and the predictive capability of our models. Ideally, what we would have at our disposal is a modelling tool which, given certain climatic conditions and agricultural practices, can predict the growth pattern and final yield of any of the major crops across the globe. We present a simple, process-based crop growth model based on the assumption that plants allocate above- and below-ground biomass to maintain overall carbon optimality and that, to maintain this optimality, the reproductive stage begins at peak nitrogen uptake. The model includes responses to available light, water, temperature and carbon dioxide concentration, as well as nitrogen fertilisation and irrigation. The model is data-constrained at two sites, the Yaqui Valley, Mexico for wheat and the Southern Great Plains flux site for maize and soybean, using a robust combination of space-based vegetation data (including data from the MODIS and Landsat TM and ETM+ instruments) and ground-based biomass and yield measurements. We show a number of climate response scenarios, including increases in temperature and carbon dioxide concentrations, as well as responses to irrigation and fertiliser application.
Miyawaki, Shinjiro; Tawhai, Merryn H.; Hoffman, Eric A.; Wenzel, Sally E.; Lin, Ching-Long
2016-01-01
We propose a method to construct three-dimensional airway geometric models based on airway skeletons, or centerlines (CLs). Given a CT-segmented airway skeleton and surface, the proposed CL-based method automatically constructs subject-specific models that contain anatomical information regarding branches, include bifurcations and trifurcations, and extend from the trachea to terminal bronchioles. The resulting model can be anatomically realistic with the assistance of an image-based surface; alternatively a model with an idealized skeleton and/or branch diameters is also possible. This method systematically identifies and classifies trifurcations to successfully construct the models, which also provides the number and type of trifurcations for the analysis of the airways from an anatomical point of view. We applied this method to 16 normal and 16 severe asthmatic subjects using their computed tomography images. The average distance between the surface of the model and the image-based surface was 11% of the average voxel size of the image. The four most frequent locations of trifurcations were the left upper division bronchus, left lower lobar bronchus, right upper lobar bronchus, and right intermediate bronchus. The proposed method automatically constructed accurate subject-specific three-dimensional airway geometric models that contain anatomical information regarding branches using airway skeleton, diameters, and image-based surface geometry. The proposed method can construct (i) geometry automatically for population-based studies, (ii) trifurcations to retain the original airway topology, (iii) geometry that can be used for automatic generation of computational fluid dynamics meshes, and (iv) geometry based only on a skeleton and diameters for idealized branches. PMID:27704229
AgMIP: Next Generation Models and Assessments
NASA Astrophysics Data System (ADS)
Rosenzweig, C.
2014-12-01
Next steps in developing next-generation crop models fall into several categories: significant improvements in simulation of important crop processes and responses to stress; extension from simplified crop models to complex cropping systems models; and scaling up from site-based models to landscape, national, continental, and global scales. Crop processes that require major leaps in understanding and simulation in order to narrow uncertainties around how crops will respond to changing atmospheric conditions include genetics; carbon, temperature, water, and nitrogen; ozone; and nutrition. The field of crop modeling has been built on a single crop-by-crop approach. It is now time to create a new paradigm, moving from 'crop' to 'cropping system.' A first step is to set up the simulation technology so that modelers can rapidly incorporate multiple crops within fields, and multiple crops over time. Then the response of these more complex cropping systems can be tested under different sustainable intensification management strategies utilizing the updated simulation environments. Model improvements for diseases, pests, and weeds include developing process-based models for important diseases, frameworks for coupling air-borne diseases to crop models, gathering significantly more data on crop impacts, and enabling the evaluation of pest management strategies. Most smallholder farming in the world involves integrated crop-livestock systems that cannot be represented by crop modeling alone. Thus, next-generation cropping system models need to include key linkages to livestock. Livestock linkages to be incorporated include growth and productivity models for grasslands and rangelands as well as the usual annual crops. There are several approaches for scaling up, including use of gridded models and development of simpler quasi-empirical models for landscape-scale analysis. On the assessment side, AgMIP is leading a community process for coordinated contributions to IPCC AR6 that involves the key modeling groups from around the world including North America, Europe, South America, Sub-Saharan Africa, South Asia, East Asia, and Australia and Oceania. This community process will lead to mutually agreed protocols for coordinated global and regional assessments.
Dynamics of Zika virus outbreaks: an overview of mathematical modeling approaches.
Wiratsudakul, Anuwat; Suparit, Parinya; Modchang, Charin
2018-01-01
The Zika virus was first discovered in 1947. It was neglected until a major outbreak occurred on Yap Island, Micronesia, in 2007. Teratogenic effects resulting in microcephaly in newborn infants is the greatest public health threat. In 2016, the Zika virus epidemic was declared as a Public Health Emergency of International Concern (PHEIC). Consequently, mathematical models were constructed to explicitly elucidate related transmission dynamics. In this review article, two steps of journal article searching were performed. First, we attempted to identify mathematical models previously applied to the study of vector-borne diseases using the search terms "dynamics," "mathematical model," "modeling," and "vector-borne" together with the names of vector-borne diseases including chikungunya, dengue, malaria, West Nile, and Zika. Then the identified types of model were further investigated. Second, we narrowed down our survey to focus on only Zika virus research. The terms we searched for were "compartmental," "spatial," "metapopulation," "network," "individual-based," "agent-based" AND "Zika." All relevant studies were included regardless of the year of publication. We have collected research articles that were published before August 2017 based on our search criteria. In this publication survey, we explored the Google Scholar and PubMed databases. We found five basic model architectures previously applied to vector-borne virus studies, particularly in Zika virus simulations. These include compartmental, spatial, metapopulation, network, and individual-based models. We found that Zika models carried out for early epidemics were mostly fit into compartmental structures and were less complicated compared to the more recent ones. Simple models are still commonly used for the timely assessment of epidemics. Nevertheless, due to the availability of large-scale real-world data and computational power, recently there has been growing interest in more complex modeling frameworks. Mathematical models are employed to explore and predict how an infectious disease spreads in the real world, evaluate the disease importation risk, and assess the effectiveness of intervention strategies. As the trends in modeling of infectious diseases have been shifting towards data-driven approaches, simple and complex models should be exploited differently. Simple models can be produced in a timely fashion to provide an estimation of the possible impacts. In contrast, complex models integrating real-world data require more time to develop but are far more realistic. The preparation of complicated modeling frameworks prior to the outbreaks is recommended, including the case of future Zika epidemic preparation.
Multiscale Modeling of Angiogenesis and Predictive Capacity
NASA Astrophysics Data System (ADS)
Pillay, Samara; Byrne, Helen; Maini, Philip
Tumors induce the growth of new blood vessels from existing vasculature through angiogenesis. Using an agent-based approach, we model the behavior of individual endothelial cells during angiogenesis. We incorporate crowding effects through volume exclusion, cell motility through biased random walks, and birth and death-like processes. We use the transition probabilities associated with the discrete model and a discrete conservation equation for cell occupancy to determine collective cell behavior in terms of partial differential equations (PDEs). We derive three PDE models, incorporating single-species volume exclusion, multi-species volume exclusion, and no volume exclusion. By fitting the parameters in our PDE models and other well-established continuum models to agent-based simulations during a specific time period, and then comparing the outputs from the PDE models and the agent-based model at later times, we aim to determine how well the PDE models predict the future behavior of the agent-based model. We also determine whether predictions differ across PDE models and the significance of those differences. This may impact drug development strategies based on PDE models.
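A sketch of the discrete mechanism the abstract describes: agents performing a biased random walk on a lattice with volume exclusion (a move is aborted if the target site is occupied). This is a one-dimensional toy with illustrative parameters; in the continuum limit, such rules yield the advection-diffusion-type PDEs the authors derive.

```python
# Biased random walk with volume exclusion on a 1-D lattice.
# Lattice size, step count, and bias are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
L, steps, bias = 200, 500, 0.6           # bias = P(attempt move right)
occ = np.zeros(L, dtype=bool)
occ[:20] = True                          # initial column of cells at left edge

for _ in range(steps):
    # random sequential update over currently occupied sites
    for i in rng.permutation(np.flatnonzero(occ)):
        j = i + 1 if rng.random() < bias else i - 1
        if 0 <= j < L and not occ[j]:    # volume exclusion: abort if occupied
            occ[i], occ[j] = False, True

print("mean cell position after invasion:", np.flatnonzero(occ).mean())
```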
Watershed modeling at the Savannah River Site.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vache, Kellie
2015-04-29
The overall goal of the work was the development of a watershed-scale model of hydrological function for application to the US Department of Energy's (DOE) Savannah River Site (SRS). The primary outcome is a grid-based hydrological modeling system that captures near-surface runoff as well as groundwater recharge and contributions of groundwater to streams. The model includes a physically based algorithm to capture both evaporation and transpiration from forestland.
ERIC Educational Resources Information Center
Hsu, Shun-Yi
An instructional model based on a learning cycle including correlation, analysis, and generalization (CAG) was developed and applied to design an instructional module for grade 8 students in Taiwan, Republic of China. The CAG model was based on Piagetian theory and a concept model (Pella, 1975). The module developed for heat and temperature was…
Katsogiannis, Konstantinos Alexandros G; Vladisavljević, Goran T; Georgiadou, Stella; Rahmani, Ramin
2016-10-26
The effect of pore induction on increasing the specific surface area of electrospun fibrous networks was investigated in this study. Theoretical models based on the available surface area of the fibrous network, excluding the surface area lost to fiber-to-fiber contacts, were developed. The models for calculating the excluded area are based on the Hertzian, Derjaguin-Muller-Toporov (DMT), and Johnson-Kendall-Roberts (JKR) contact models. Overall, the theoretical models correlated the network specific surface area to material properties, including density, surface tension, Young's modulus, and Poisson's ratio, as well as network physical properties, such as density, and geometrical characteristics, including fiber radius, fiber aspect ratio, and network thickness. Pore induction proved to increase the network specific surface area by up to 52% compared to the maximum surface area that could be achieved by a nonporous fiber network with the same physical properties and geometrical characteristics. The model based on the Johnson-Kendall-Roberts contact model accurately describes the fiber-to-fiber contact area under the experimental conditions used for pore generation. The experimental results and the theoretical model based on the Johnson-Kendall-Roberts contact model show that the increase in network surface area due to pore induction can reach up to 58%.
NASA Technical Reports Server (NTRS)
Kolb, Mark A.
1990-01-01
Viewgraphs on Rubber Airplane: Constraint-based Component-Modeling for Knowledge Representation in Computer Aided Conceptual Design are presented. Topics covered include: computer aided design; object oriented programming; airfoil design; surveillance aircraft; commercial aircraft; aircraft design; and launch vehicles.
Simulating Cancer Growth with Multiscale Agent-Based Modeling
Wang, Zhihui; Butner, Joseph D.; Kerketta, Romica; Cristini, Vittorio; Deisboeck, Thomas S.
2014-01-01
There have been many techniques developed in recent years to in silico model a variety of cancer behaviors. Agent-based modeling is a specific discrete-based hybrid modeling approach that allows simulating the role of diversity in cell populations as well as within each individual cell; it has therefore become a powerful modeling method widely used by computational cancer researchers. Many aspects of tumor morphology including phenotype-changing mutations, the adaptation to microenvironment, the process of angiogenesis, the influence of extracellular matrix, reactions to chemotherapy or surgical intervention, the effects of oxygen and nutrient availability, and metastasis and invasion of healthy tissues have been incorporated and investigated in agent-based models. In this review, we introduce some of the most recent agent-based models that have provided insight into the understanding of cancer growth and invasion, spanning multiple biological scales in time and space, and we further describe several experimentally testable hypotheses generated by those models. We also discuss some of the current challenges of multiscale agent-based cancer models. PMID:24793698
Hamiltonian closures in fluid models for plasmas
NASA Astrophysics Data System (ADS)
Tassi, Emanuele
2017-11-01
This article reviews recent activity on the Hamiltonian formulation of fluid models for plasmas in the non-dissipative limit, with emphasis on the relations between the fluid closures adopted for the different models and the Hamiltonian structures. The review focuses on results obtained during the last decade, but a few classical results are also described in order to illustrate connections with the most recent developments. With the hope of making the review accessible not only to specialists in the field, an introduction to the mathematical tools applied in the Hamiltonian formalism for continuum models is provided. Subsequently, we review the Hamiltonian formulation of models based on the magnetohydrodynamics description, including those based on the adiabatic and double adiabatic closures. It is shown how Dirac's theory of constrained Hamiltonian systems can be applied to impose the incompressibility closure on a magnetohydrodynamic model, and how an extended version of barotropic magnetohydrodynamics, accounting for two-fluid effects, is amenable to a Hamiltonian formulation. Hamiltonian reduced fluid models, valid in the presence of a strong magnetic field, are also reviewed. In particular, reduced magnetohydrodynamics and models assuming cold ions and different closures for the electron fluid are discussed. Hamiltonian models relaxing the cold-ion assumption are then introduced. These include models where finite Larmor radius effects are added by means of the gyromap technique, and gyrofluid models. Numerical simulations of Hamiltonian reduced fluid models investigating the phenomenon of magnetic reconnection are illustrated. The last part of the review concerns recent results based on the derivation of closures preserving a Hamiltonian structure, based on the Hamiltonian structure of parent kinetic models. The identification of such closures for fluid models derived from kinetic systems based on the Vlasov and drift-kinetic equations is presented, and connections with previously discussed fluid models are pointed out.
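For readers outside the field, the generic noncanonical Hamiltonian form the review refers to can be stated compactly; this is the standard textbook form for continuum models, not a structure specific to any single model in the article.

```latex
% Generic noncanonical Hamiltonian structure for a continuum model with
% field variables \chi^i: the evolution of any functional F is generated
% by a Poisson bracket and a Hamiltonian functional H,
\frac{\partial F}{\partial t} = \{F, H\}, \qquad
\{F, G\} = \int \frac{\delta F}{\delta \chi^i}\, J^{ij}\,
           \frac{\delta G}{\delta \chi^j}\; \mathrm{d}^3 x ,
% where J^{ij} is an antisymmetric operator satisfying the Jacobi
% identity. A "Hamiltonian closure" is one that preserves this structure.
```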
Daily life activity routine discovery in hemiparetic rehabilitation patients using topic models.
Seiter, J; Derungs, A; Schuster-Amft, C; Amft, O; Tröster, G
2015-01-01
Monitoring natural behavior and activity routines of hemiparetic rehabilitation patients across the day can provide valuable progress information for therapists and patients and contribute to an optimized rehabilitation process. In particular, continuous patient monitoring could add the type, frequency, and duration of daily life activity routines and hence complement standard clinical scores that are assessed for particular tasks only. Machine learning methods have been applied to infer activity routines from sensor data. However, supervised methods require activity annotations to build recognition models and thus require extensive patient supervision. Discovery methods, including topic models, could provide patient routine information and deal with variability in activity and movement performance across patients. Topic models have been used to discover characteristic activity routine patterns of healthy individuals using activity primitives recognized from supervised sensor data. Yet the applicability of topic models to hemiparetic rehabilitation patients, and techniques to derive activity primitives without supervision, need to be addressed. We investigate 1) whether a topic model-based activity routine discovery framework can infer activity routines of rehabilitation patients from wearable motion sensor data, and 2) how the performance of our topic model-based activity routine discovery compares between rule-based and clustering-based activity vocabularies. We analyze activity routine discovery in a dataset recorded with 11 hemiparetic rehabilitation patients during up to ten full recording days per individual in an ambulatory daycare rehabilitation center, using wearable motion sensors attached to both wrists and the non-affected thigh. We introduce and compare rule-based and clustering-based activity vocabularies that map statistical and frequency acceleration features to activity words. Activity words were used for activity routine pattern discovery using topic models based on Latent Dirichlet Allocation. Discovered activity routine patterns were then mapped to six categorized activity routines. Using the rule-based approach, activity routines could be discovered with an average accuracy of 76% across all patients. The rule-based approach outperformed clustering by 10% and showed fewer confusions for predicted activity routines. Topic models are suitable for discovering daily life activity routines in hemiparetic rehabilitation patients without trained classifiers and activity annotations. Activity routines show characteristic patterns regarding activity primitives, including body and extremity postures and movement. A patient-independent rule set can be derived, and including expert knowledge supports more successful activity routine discovery than completely data-driven clustering.
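A minimal sketch of the discovery step: fit Latent Dirichlet Allocation topics to counts of activity words per time window, then map each window to its dominant topic. The count matrix here is random stand-in data; in the study it would come from the rule-based or clustering-based activity vocabulary.

```python
# LDA over activity-word counts per time window (stand-in data).
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation

rng = np.random.default_rng(1)
X = rng.poisson(2.0, size=(120, 30))      # 120 windows x 30 activity words

lda = LatentDirichletAllocation(n_components=6, random_state=0)
theta = lda.fit_transform(X)              # per-window topic mixtures
routine = theta.argmax(axis=1)            # assign each window to a routine
print("windows per discovered routine:", np.bincount(routine, minlength=6))
```

Six components mirror the six categorized activity routines reported in the abstract; in practice the number of topics is itself a modeling choice.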
Telephone-based disease management: why it does not save money.
Motheral, Brenda R
2011-01-01
To understand why the current telephone-based model of disease management (DM) does not provide cost savings and how DM can be retooled based on the best available evidence to deliver better value. Literature review. The published peer-reviewed evaluations of DM and transitional care models from 1990 to 2010 were reviewed. Also examined was the cost-effectiveness literature on the treatment of chronic conditions that are commonly included in DM programs, including heart failure, diabetes mellitus, coronary artery disease, and asthma. First, transitional care models, which have historically been confused with commercial DM programs, can provide credible savings over a short period, rendering them low-hanging fruit for plan sponsors who desire real savings. Second, cost-effectiveness research has shown that the individual activities that constitute contemporary DM programs are not cost saving except for heart failure. Targeting of specific patients and activity combinations based on risk, actionability, treatment and program effectiveness, and costs will be necessary to deliver a cost-saving DM program, combined with an outreach model that brings vendors closer to the patient and physician. Barriers to this evidence-driven approach include resources required, marketability, and business model disruption. After a decade of market experimentation with limited success, new thinking is called for in the design of DM programs. A program design that is based on a cost-effectiveness approach, combined with greater program efficacy, will allow for the development of DM programs that are cost saving.
Human Pose Estimation from Monocular Images: A Comprehensive Survey
Gong, Wenjuan; Zhang, Xuena; Gonzàlez, Jordi; Sobral, Andrews; Bouwmans, Thierry; Tu, Changhe; Zahzah, El-hadi
2016-01-01
Human pose estimation refers to the estimation of the location of body parts and how they are connected in an image. Human pose estimation from monocular images has wide applications (e.g., image indexing). Several surveys on human pose estimation can be found in the literature, but they focus on a certain category; for example, model-based approaches or human motion analysis, etc. As far as we know, an overall review of this problem domain has yet to be provided. Furthermore, recent advancements based on deep learning have brought novel algorithms for this problem. In this paper, a comprehensive survey of human pose estimation from monocular images is carried out including milestone works and recent advancements. Based on one standard pipeline for the solution of computer vision problems, this survey splits the problem into several modules: feature extraction and description, human body models, and modeling methods. Problem modeling methods are approached based on two means of categorization in this survey. One way to categorize includes top-down and bottom-up methods, and another way includes generative and discriminative methods. Considering the fact that one direct application of human pose estimation is to provide initialization for automatic video surveillance, there are additional sections for motion-related methods in all modules: motion features, motion models, and motion-based methods. Finally, the paper also collects 26 publicly available data sets for validation and provides error measurement methods that are frequently used. PMID:27898003
Modeling of circulating fluidized beds for post-combustion carbon capture
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, A.; Shadle, L.; Miller, D.
2011-01-01
A compartment-based model for a circulating fluidized bed reactor has been developed based on experimental observations of riser hydrodynamics. The model uses a cluster-based approach to describe the two-phase behavior of circulating fluidized beds. Fundamental mass balance equations have been derived to describe the movement of both gas and solids through the system. Additional work is being performed to develop the correlations required to describe the hydrodynamics of the system. Initial testing of the model with experimental data shows promising results and highlights the importance of including end effects within the model.
Liquid phase methanol LaPorte process development unit: Modification, operation, and support studies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
The objectives of the present work can be divided into three parts. The first objective was to develop a best-fit model for the older methanol synthesis catalyst (BASF S3-85) data base. At the time that this work commenced (June 1989), the BASF S3-85 data base contained many rate measurements accumulated over a few years. The newer catalyst (BASF S3-86) data base, at that time, contained only a few observations and did not include a broad range of conditions. Thus, a second objective of this work was to expand the BASF S3-86 data base to include more rate observations over a broader range of conditions. Finally, after expansion of the BASF S3-86 data base, the third objective was to develop a rate expression to describe this data base. This would include the application of rate expressions developed for the BASF S3-85 catalyst, as well as new models.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ahn, Y.K.; Chen, H.T.; Helm, R.W.
1980-01-01
A biomass allocation model has been developed to show the most profitable combination of biomass feedstocks, thermochemical conversion processes, and fuel products to serve the seasonal conditions in a regional market. This optimization model provides a tool for quickly calculating the most profitable biomass missions from a large number of potential biomass missions. Other components of the system serve as a convenient storage and retrieval mechanism for biomass marketing and thermochemical conversion processing data. The system can be accessed through the use of a computer terminal, or it could be adapted to a portable micro-processor. A User's Manual for the system has been included in Appendix A of the report. The validity of any biomass allocation solution provided by the allocation model is dependent on the accuracy of the data base. The initial data base was constructed from values obtained from the literature; consequently, as more current thermochemical conversion processing and manufacturing costs and efficiencies become available, the data base should be revised. Biomass-derived fuels included in the data base are the following: medium Btu gas, low Btu gas, substitute natural gas, ammonia, methanol, electricity, gasoline, and fuel oil. The market sectors served by the fuels include: residential, electric utility, chemical (industrial), and transportation. Regional/seasonal costs, availabilities, and heating values for 61 woody and non-woody biomass species are included. The study covered four regions in the United States, selected because there was both an availability of biomass and a commercial demand for the derived fuels: Region I: NY, WV, PA; Region II: GA, AL, MS; Region III: IN, IL, IA; and Region IV: OR, WA.
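The allocation problem described here is naturally posed as a profit-maximizing linear program. Below is a toy version of that idea; the pathway definitions, coefficients, and constraint values are all illustrative assumptions, not figures from the report.

```python
# Toy biomass allocation LP: choose feedstock-process-fuel pathway
# tonnages to maximize profit subject to supply and demand limits.
from scipy.optimize import linprog

profit = [12.0, 9.0, 15.0]        # $/ton for hypothetical pathways A, B, C
c = [-p for p in profit]          # linprog minimizes, so negate profit
A_ub = [[1, 1, 0],                # pathways A and B share one feedstock supply
        [0, 0, 1],                # pathway C uses a second feedstock
        [1, 0, 1]]                # A and C serve the same fuel market
b_ub = [500, 300, 600]            # tons available / demand cap

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=(0, None))
print("pathway tonnages:", res.x, "max profit:", -res.fun)
```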
The NASA Space Radiobiology Risk Assessment Project
NASA Astrophysics Data System (ADS)
Cucinotta, Francis A.; Huff, Janice; Ponomarev, Artem; Patel, Zarana; Kim, Myung-Hee
The current first phase (2006-2011) has three major goals: 1) optimizing the conventional cancer risk models currently used, based on the double-detriment life table and radiation quality functions; 2) integrating biophysical models of acute radiation syndromes; and 3) developing new systems radiation biology models of cancer processes. The first phase also includes continued uncertainty assessment of space radiation environmental models and transport codes, and of relative biological effectiveness (RBE) factors, based on flight data and NSRL results, respectively. The second phase (2012-2016) will: 1) develop biophysical models of central nervous system (CNS) risks; 2) achieve comprehensive systems biology models of cancer processes using data from proton and heavy ion studies performed at NSRL; and 3) begin to identify computational models of biological countermeasures. Goals for the third phase (2017-2021) include: 1) the development of a systems biology model of cancer risks for operational use at NASA; 2) development of models of degenerative risks; 3) quantitative models of countermeasure impacts on cancer risks; and 4) individual-based risk assessments. Finally, we will support a decision point to continue NSRL research in support of NASA's exploration goals beyond 2021, and create an archive of NSRL research results for continued analysis. Details on near-term goals, plans for a Web-based data resource of NSRL results, and a space radiation wiki are described.
NASA Astrophysics Data System (ADS)
Merkord, C. L.; Liu, Y.; DeVos, M.; Wimberly, M. C.
2015-12-01
Malaria early detection and early warning systems are important tools for public health decision makers in regions where malaria transmission is seasonal and varies from year to year with fluctuations in rainfall and temperature. Here we present a new data-driven dynamic linear model, based on the Kalman filter with time-varying coefficients, that is used to identify malaria outbreaks as they occur (early detection) and to predict the location and timing of future outbreaks (early warning). We fit linear models of malaria incidence with trend and Fourier-form seasonal components using three years of weekly malaria case data from 30 districts in the Amhara Region of Ethiopia. We identified past outbreaks by comparing the modeled prediction envelopes with observed case data. Preliminary results demonstrated the potential for improved accuracy and timeliness over commonly used methods in which thresholds are based on simpler summary statistics of historical data. Other benefits of the dynamic linear modeling approach include robustness to missing data and the ability to fit models with relatively few years of training data. To predict future outbreaks, we started with the early detection model for each district and added a regression component based on satellite-derived environmental predictor variables, including precipitation data from the Tropical Rainfall Measuring Mission (TRMM) and land surface temperature (LST) and spectral indices from the Moderate Resolution Imaging Spectroradiometer (MODIS). We included lagged environmental predictors in the regression component of the model, with lags chosen based on cross-correlation of the one-step-ahead forecast errors from the first model. Our results suggest that predictions of future malaria outbreaks can be improved by incorporating lagged environmental predictors.
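A stripped-down sketch of the early-detection idea: a local-level dynamic linear model filtered with a Kalman recursion, flagging a week when the observed count exceeds the upper one-step-ahead prediction envelope. The data are synthetic, and the trend, seasonal, and regression components of the full model are omitted here.

```python
# Local-level DLM with Kalman filtering; flag exceedances of the
# one-step-ahead 95% prediction envelope. Synthetic weekly counts.
import numpy as np

rng = np.random.default_rng(2)
y = rng.normal(50, 5, 156)         # three years of weekly case counts
y[100:104] += 40                   # injected outbreak

W, V = 4.0, 25.0                   # state and observation variances (assumed)
m, C = y[0], 10.0                  # filtered mean and variance
for t in range(1, len(y)):
    R = C + W                      # prior variance after state evolution
    Q = R + V                      # one-step-ahead forecast variance
    upper = m + 1.96 * np.sqrt(Q)  # upper 95% prediction envelope
    if y[t] > upper:
        print(f"week {t}: possible outbreak ({y[t]:.0f} > {upper:.1f})")
    K = R / Q                      # Kalman gain
    m, C = m + K * (y[t] - m), (1 - K) * R
```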
Physically based modeling of bedrock incision by abrasion, plucking, and macroabrasion
NASA Astrophysics Data System (ADS)
Chatanantavet, Phairot; Parker, Gary
2009-11-01
Many important insights into the dynamic coupling among climate, erosion, and tectonics in mountain areas have derived from several numerical models of the past few decades which include descriptions of bedrock incision. However, many questions regarding incision processes and morphology of bedrock streams still remain unanswered. A more mechanistically based incision model is needed as a component to study landscape evolution. Major bedrock incision processes include (among other mechanisms) abrasion by bed load, plucking, and macroabrasion (a process of fracturing of the bedrock into pluckable sizes mediated by particle impacts). The purpose of this paper is to develop a physically based model of bedrock incision that includes all three processes mentioned above. To build the model, we start by developing a theory of abrasion, plucking, and macroabrasion mechanisms. We then incorporate hydrology, the evaluation of boundary shear stress, capacity transport, an entrainment relation for pluckable particles, a routing model linking in-stream sediment and hillslopes, a formulation for alluvial channel coverage, a channel width relation, Hack's law, and Exner equation into the model so that we can simulate the evolution of bedrock channels. The model successfully simulates various features of bed elevation profiles of natural bedrock rivers under a variety of input or boundary conditions. The results also illustrate that knickpoints found in bedrock rivers may be autogenic in addition to being driven by base level fall and lithologic changes. This supports the concept that bedrock incision by knickpoint migration may be an integral part of normal incision processes. The model is expected to improve the current understanding of the linkage among physically meaningful input parameters, the physics of incision process, and morphological changes in bedrock streams.
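One widely used "tools and cover" closure for abrasion by bed load captures the trade-off described above: sediment supplies the tools that wear the bed, but excess sediment covers and protects it. The sketch below is in the spirit of saltation-abrasion formulations and is not claimed to be this paper's exact equation; the coefficient and fluxes are illustrative.

```python
# Tools-and-cover abrasion closure: incision scales with sediment
# supply qs (tools) and shuts off as supply approaches capacity qt
# (cover). beta and the flux values are illustrative assumptions.
def abrasion_rate(qs, qt, beta=1e-4):
    """Vertical incision rate from bed-load abrasion (m/yr)."""
    cover = max(0.0, 1.0 - qs / qt)   # fraction of bedrock left exposed
    return beta * qs * cover

for qs in (50.0, 200.0, 350.0):       # supply per unit width, m^2/yr
    print(qs, abrasion_rate(qs, qt=400.0))
```

Note the non-monotonic behavior: incision peaks at intermediate supply, which is one reason knickpoint dynamics can arise without external forcing.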
NASA Astrophysics Data System (ADS)
Newman, A. J.; Sampson, K. M.; Wood, A. W.; Hopson, T. M.; Brekke, L. D.; Arnold, J.; Raff, D. A.; Clark, M. P.
2013-12-01
Skill in model-based hydrologic forecasting depends on the ability to estimate a watershed's initial moisture and energy conditions, to forecast future weather and climate inputs, and on the quality of the hydrologic model's representation of watershed processes. The impact of these factors on prediction skill varies regionally, seasonally, and by model. We are investigating these influences using a watershed simulation platform that spans the continental US (CONUS), encompassing a broad range of hydroclimatic variation, and that uses the current simulation models of National Weather Service streamflow forecasting operations. The first phase of this effort centered on the implementation and calibration of the SNOW-17 and Sacramento soil moisture accounting (SAC-SMA) based hydrologic modeling system for a range of watersheds. The base configuration includes 630 basins in the United States Geological Survey's Hydro-Climatic Data Network 2009 (HCDN-2009, Lins 2012) conterminous U.S. basin subset. Retrospective model forcings were derived from Daymet (http://daymet.ornl.gov/), and where available, a priori parameter estimates were based on or compared with the operational NWS model parameters. Model calibration was accomplished by several objective, automated strategies, including the shuffled complex evolution (SCE) optimization approach developed within the NWS in the early 1990s (Duan et al. 1993). This presentation describes outcomes from this effort, including insights about measuring simulation skill, and on relationships between simulation skill and model parameters, basin characteristics (climate, topography, vegetation, soils), and the quality of forcing inputs. References: Thornton, P.; Thornton, M.; Mayer, B.; Wilhelmi, N.; Wei, Y.; Devarakonda, R.; Cook, R. Daymet: Daily Surface Weather on a 1 km Grid for North America, 1980-2008. Oak Ridge National Laboratory Distributed Active Archive Center: Oak Ridge, TN, USA, 2012.
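Automated calibration of the kind described requires an objective function to optimize. The Nash-Sutcliffe efficiency is the usual skill score for streamflow simulation, though the abstract does not state which objective was used here; the sketch below is a generic illustration.

```python
# Nash-Sutcliffe efficiency (NSE): 1 is a perfect simulation, 0 matches
# the observed mean, and negative values are worse than the mean.
import numpy as np

def nse(sim, obs):
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Toy daily flows: an optimizer such as SCE would adjust model
# parameters to maximize this score over a calibration period.
print(nse(sim=[10, 12, 15, 11], obs=[11, 13, 14, 10]))
```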
Characterizing Uncertainty and Variability in PBPK Models ...
Mode-of-action based risk and safety assessments can rely upon tissue dosimetry estimates in animals and humans obtained from physiologically-based pharmacokinetic (PBPK) modeling. However, risk assessment also increasingly requires characterization of uncertainty and variability; such characterization for PBPK model predictions represents a continuing challenge to both modelers and users. Current practices show significant progress in specifying deterministic biological models and the non-deterministic (often statistical) models, estimating their parameters using diverse data sets from multiple sources, and using them to make predictions and characterize uncertainty and variability. The International Workshop on Uncertainty and Variability in PBPK Models, held Oct 31-Nov 2, 2006, sought to identify the state-of-the-science in this area and recommend priorities for research and changes in practice and implementation. For the short term, these include: (1) multidisciplinary teams to integrate deterministic and non-deterministic/statistical models; (2) broader use of sensitivity analyses, including for structural and global (rather than local) parameter changes; and (3) enhanced transparency and reproducibility through more complete documentation of the model structure(s) and parameter values, the results of sensitivity and other analyses, and supporting, discrepant, or excluded data. Longer-term needs include: (1) theoretic and practical methodological improvements …
Case Study: Organotypic human in vitro models of embryonic ...
Morphogenetic fusion of tissues is a common event in embryonic development, and disruption of fusion is associated with birth defects of the eye, heart, neural tube, phallus, palate, and other organ systems. Embryonic tissue fusion requires precise regulation of cell-cell and cell-matrix interactions that drive proliferation, differentiation, and morphogenesis. Low-dose chemical exposures can disrupt morphogenesis across space and time by interfering with key embryonic fusion events. The Morphogenetic Fusion Task uses computer and in vitro models to elucidate the consequences of developmental exposures. The Morphogenetic Fusion Task integrates multiple approaches to model responses to chemicals that lead to birth defects, including integrative mining of ToxCast DB, ToxRefDB, and chemical structures; advanced computer agent-based models; and human cell-based cultures that model disruption of cellular and molecular behaviors, including mechanisms predicted from the integrative data mining and agent-based models. The purpose of the poster is to indicate progress on the CSS 17.02 Virtual Tissue Models Morphogenesis Task 1 products for the Board of Scientific Counselors meeting on Nov 16-17.
Methodology for Uncertainty Analysis of Dynamic Computational Toxicology Models
The task of quantifying the uncertainty in both parameter estimates and model predictions has become more important with the increased use of dynamic computational toxicology models by the EPA. Dynamic toxicological models include physiologically-based pharmacokinetic (PBPK) models …
Freight model improvement project for ECWRPC.
DOT National Transportation Integrated Search
2011-08-01
In early 2009, WisDOT, HNTB, and ECWRPC completed the first phase of the Northeast Region Travel Demand Model. While the model includes truck trip generation based on the Quick Response Freight Manual, the model lacks enough truck classification ...
Neuman systems model-based research: an integrative review project.
Fawcett, J; Giangrande, S K
2001-07-01
The project integrated Neuman systems model-based research literature. Two hundred published studies were located. This article is limited to the 59 full journal articles and 3 book chapters identified. A total of 37% focused on prevention interventions; 21% on perception of stressors; and 10% on stressor reactions. Only 50% of the reports explicitly linked the model with the study variables, and 61% did not include conclusions regarding model utility or credibility. No programs of research were identified. Academic courses and continuing education workshops are needed to help researchers design programs of Neuman systems model-based research and better explicate linkages between the model and the research.
Chiu, Peter K F; Roobol, Monique J; Teoh, Jeremy Y; Lee, Wai-Man; Yip, Siu-Ying; Hou, See-Ming; Bangma, Chris H; Ng, Chi-Fai
2016-10-01
To investigate PSA- and PHI (prostate health index)-based models for prediction of prostate cancer (PCa) and the feasibility of using DRE-estimated prostate volume (DRE-PV) in the models. This study included 569 Chinese men with PSA 4-10 ng/mL and non-suspicious DRE with transrectal ultrasound (TRUS) 10-core prostate biopsies performed between April 2008 and July 2015. DRE-PV was estimated using 3 pre-defined classes: 25, 40, or 60 ml. The performance of PSA-based and PHI-based predictive models including age, DRE-PV, and TRUS prostate volume (TRUS-PV) was analyzed using logistic regression and area under the receiver operating curves (AUC), in both the whole cohort and the screening age group of 55-75. PCa and high-grade PCa (HGPCa) was diagnosed in 10.9 % (62/569) and 2.8 % (16/569) men, respectively. The performance of DRE-PV-based models was similar to TRUS-PV-based models. In the age group 55-75, the AUCs for PCa of PSA alone, PSA with DRE-PV and age, PHI alone, PHI with DRE-PV and age, and PHI with TRUS-PV and age were 0.54, 0.71, 0.76, 0.78, and 0.78, respectively. The corresponding AUCs for HGPCa were higher (0.60, 0.70, 0.85, 0.83, and 0.83). At 10 and 20 % risk threshold for PCa, 38.4 and 55.4 % biopsies could be avoided in the PHI-based model, respectively. PHI had better performance over PSA-based models and could reduce unnecessary biopsies. A DRE-assessed PV can replace TRUS-assessed PV in multivariate prediction models to facilitate clinical use.
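The modeling and evaluation pattern used in studies like this one is a logistic regression on age, PHI, and prostate volume, scored by the area under the ROC curve. The sketch below uses synthetic stand-in data; the coefficients and simulated labels have no clinical meaning.

```python
# Logistic regression risk model scored by ROC AUC (synthetic data).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(3)
n = 569
X = np.column_stack([rng.normal(65, 6, n),        # age
                     rng.normal(35, 12, n),       # PHI
                     rng.choice([25, 40, 60], n)])  # DRE-estimated volume, mL
# Simulated cancer labels from an assumed (not clinical) risk relation:
logit = 0.05 * (X[:, 1] - 35) - 0.02 * (X[:, 2] - 40) - 2.0
y = rng.random(n) < 1 / (1 + np.exp(-logit))

model = LogisticRegression().fit(X, y)
print("AUC:", roc_auc_score(y, model.predict_proba(X)[:, 1]))
```

In practice the AUC would be estimated on held-out data, and a risk threshold (e.g., 10% or 20%) would then determine how many biopsies could be avoided.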
NASA Technical Reports Server (NTRS)
Sadunas, J. A.; French, E. P.; Sexton, H.
1973-01-01
A 1/25-scale model S-2 stage base region thermal environment test is presented. Analytical results are included which reflect the effects of engine operating conditions, model scale, and turbo-pump exhaust gas injection on the base region thermal environment. Comparisons are made between full-scale flight data, model test data, and analytical results. The report is prepared in two volumes; this volume describes the analytical predictions and comparisons with flight data, and a tabulation of the test data is provided.
An error bound for a discrete reduced order model of a linear multivariable system
NASA Technical Reports Server (NTRS)
Al-Saggaf, Ubaid M.; Franklin, Gene F.
1987-01-01
The design of feasible controllers for high dimension multivariable systems can be greatly aided by a method of model reduction. In order for the design based on the order reduction to include a guarantee of stability, it is sufficient to have a bound on the model error. Previous work has provided such a bound for continuous-time systems for algorithms based on balancing. In this note an L-infinity bound is derived for model error for a method of order reduction of discrete linear multivariable systems based on balancing.
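The flavor of bound discussed above is standard for balancing-based reduction: the model error in an operator norm is bounded by twice the sum of the neglected Hankel singular values. The sketch computes those values for a small discrete-time system; the note's exact bound may differ in detail, and the example system is arbitrary.

```python
# Hankel singular values and the 2*(tail sum) error bound for
# balanced-truncation reduction of a discrete-time system.
import numpy as np
from scipy.linalg import solve_discrete_lyapunov

A = np.diag([0.9, 0.5, 0.1]); B = np.ones((3, 1)); C = np.ones((1, 3))

Wc = solve_discrete_lyapunov(A, B @ B.T)        # controllability Gramian
Wo = solve_discrete_lyapunov(A.T, C.T @ C)      # observability Gramian
hsv = np.sqrt(np.linalg.eigvals(Wc @ Wo).real)  # Hankel singular values
hsv = np.sort(hsv)[::-1]

k = 2                                            # states kept after reduction
print("Hankel singular values:", hsv)
print("error bound for order", k, ":", 2 * hsv[k:].sum())
```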
The Planetary Data System Information Model for Geometry Metadata
NASA Astrophysics Data System (ADS)
Guinness, E. A.; Gordon, M. K.
2014-12-01
The NASA Planetary Data System (PDS) has recently developed a new set of archiving standards based on a rigorously defined information model. An important part of the new PDS information model is the model for geometry metadata, which includes, for example, attributes of the lighting and viewing angles of observations, position and velocity vectors of a spacecraft relative to Sun and observing body at the time of observation and the location and orientation of an observation on the target. The PDS geometry model is based on requirements gathered from the planetary research community, data producers, and software engineers who build search tools. A key requirement for the model is that it fully supports the breadth of PDS archives that include a wide range of data types from missions and instruments observing many types of solar system bodies such as planets, ring systems, and smaller bodies (moons, comets, and asteroids). Thus, important design aspects of the geometry model are that it standardizes the definition of the geometry attributes and provides consistency of geometry metadata across planetary science disciplines. The model specification also includes parameters so that the context of values can be unambiguously interpreted. For example, the reference frame used for specifying geographic locations on a planetary body is explicitly included with the other geometry metadata parameters. The structure and content of the new PDS geometry model is designed to enable both science analysis and efficient development of search tools. The geometry model is implemented in XML, as is the main PDS information model, and uses XML schema for validation. The initial version of the geometry model is focused on geometry for remote sensing observations conducted by flyby and orbiting spacecraft. Future releases of the PDS geometry model will be expanded to include metadata for landed and rover spacecraft.
Agent-based modeling: Methods and techniques for simulating human systems
Bonabeau, Eric
2002-01-01
Agent-based modeling is a powerful simulation modeling technique that has seen a number of applications in the last few years, including applications to real-world business problems. After the basic principles of agent-based simulation are briefly introduced, its four areas of application are discussed by using real-world applications: flow simulation, organizational simulation, market simulation, and diffusion simulation. For each category, one or several business applications are described and analyzed. PMID:12011407
A Modelica-based Model Library for Building Energy and Control Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wetter, Michael
2009-04-07
This paper describes an open-source library with component models for building energy and control systems that is based on Modelica, an equation-based object-oriented language that is well positioned to become the standard for modeling of dynamic systems in various industrial sectors. The library is currently developed to support computational science and engineering for innovative building energy and control systems. Early applications will include controls design and analysis, rapid prototyping to support innovation of new building systems, and the use of models during operation for controls, fault detection, and diagnostics. This paper discusses the motivation for selecting an equation-based object-oriented language. It presents the architecture of the library and explains how base models can be used to rapidly implement new models. To demonstrate the capability of analyzing novel energy and control systems, the paper closes with an example where we compare the dynamic performance of a conventional hydronic heating system with thermostatic radiator valves to an innovative heating system. In the new system, instead of a centralized circulation pump, each of the 18 radiators has a pump whose speed is controlled using a room temperature feedback loop, and the temperature of the boiler is controlled based on the speed of the radiator pump. All flows are computed by solving for the pressure distribution in the piping network, and the controls include continuous and discrete time controls.
Posada, David
2006-01-01
ModelTest server is a web-based application for the selection of models of nucleotide substitution using the program ModelTest. The server takes as input a text file with likelihood scores for the set of candidate models. Models can be selected with hierarchical likelihood ratio tests, or with the Akaike or Bayesian information criteria. The output includes several statistics for the assessment of model selection uncertainty, for model averaging or to estimate the relative importance of model parameters. The server can be accessed at . PMID:16845102
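The information criteria the server computes have standard closed forms: AIC = -2 ln L + 2k and BIC = -2 ln L + k ln n, with k free parameters and sample size n (for substitution models, typically the alignment length). A quick sketch, with toy likelihood values:

```python
# Standard AIC/BIC for comparing substitution models; lower is better.
import math

def aic(lnL, k):
    return -2.0 * lnL + 2.0 * k

def bic(lnL, k, n):
    return -2.0 * lnL + k * math.log(n)

# Toy comparison of a 5-parameter vs. a 9-parameter model on the same
# 1200-site alignment (numbers are illustrative, not real output):
print(aic(-2310.4, 5), aic(-2305.9, 9))
print(bic(-2310.4, 5, 1200), bic(-2305.9, 9, 1200))
```

BIC penalizes extra parameters more heavily than AIC at realistic alignment lengths, which is why the two criteria can disagree on the preferred model.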
Multi-Year Revenue and Expenditure Forecasting for Small Municipal Governments.
1981-03-01
Keywords: management audit; econometric revenue forecast; gap and impact analysis; deterministic expenditure forecast; municipal forecasting; municipal budget formulation. The report develops a multi-year revenue and expenditure forecasting model for the City of Monterey, California. The Monterey model includes an econometric revenue forecast as well as forecasts based on expert judgment and trend analysis.
A Model for Communications Satellite System Architecture Assessment
2011-09-01
The total system cost includes all development, acquisition, fielding, operations, maintenance and upgrades, and system protection. A mathematical model was implemented to enable the analysis of communications satellite system architectures based on multiple system attributes. Utilization of the model in ...
NASA Technical Reports Server (NTRS)
Hogan, John; Kang, Sukwon; Cavazzoni, Jim; Levri, Julie; Finn, Cory; Luna, Bernadette (Technical Monitor)
2000-01-01
The objective of this study is to compare incineration and composting in a Mars-based advanced life support (ALS) system. The variables explored include waste pre-processing requirements, reactor sizing, and buffer capacities. The study incorporates detailed mathematical models of biomass production and waste processing into an existing dynamic ALS system model. The ALS system and incineration models (written in MATLAB/SIMULINK) were developed at the NASA Ames Research Center. The composting process is modeled using first-order kinetics, with different degradation rates for individual waste components (carbohydrates, proteins, fats, cellulose, and lignin). The biomass waste streams are generated using modified "Energy Cascade" crop models, which use light- and dark-cycle temperatures, irradiance, photoperiod, [CO2], planting density, and relative humidity as model inputs. The study also includes an evaluation of equivalent system mass (ESM).
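The composting kinetics described are simple to state: each waste fraction degrades independently at first order, dC_i/dt = -k_i C_i, so C_i(t) = C_i(0) exp(-k_i t). The rate constants and initial masses below are illustrative assumptions, not values from the study.

```python
# Component-wise first-order degradation of a waste batch.
import numpy as np

k = {"carbohydrates": 0.10, "proteins": 0.06, "fats": 0.04,
     "cellulose": 0.02, "lignin": 0.005}          # 1/day, illustrative
C0 = {"carbohydrates": 5.0, "proteins": 2.0, "fats": 1.0,
      "cellulose": 8.0, "lignin": 4.0}            # kg per batch, illustrative

t = 30.0                                          # days of composting
for comp in k:
    Ct = C0[comp] * np.exp(-k[comp] * t)
    print(f"{comp:13s} {C0[comp]:4.1f} -> {Ct:5.2f} kg")
```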
Reasoning with Vectors: A Continuous Model for Fast Robust Inference.
Widdows, Dominic; Cohen, Trevor
2015-10-01
This paper describes the use of continuous vector space models for reasoning with a formal knowledge base. The practical significance of these models is that they support fast, approximate but robust inference and hypothesis generation, which is complementary to the slow, exact, but sometimes brittle behavior of more traditional deduction engines such as theorem provers. The paper explains the way logical connectives can be used in semantic vector models, and summarizes the development of Predication-based Semantic Indexing, which involves the use of Vector Symbolic Architectures to represent the concepts and relationships from a knowledge base of subject-predicate-object triples. Experiments show that the use of continuous models for formal reasoning is not only possible, but already demonstrably effective for some recognized informatics tasks, and showing promise in other traditional problem areas. Examples described in this paper include: predicting new uses for existing drugs in biomedical informatics; removing unwanted meanings from search results in information retrieval and concept navigation; type-inference from attributes; comparing words based on their orthography; and representing tabular data, including modelling numerical values. The algorithms and techniques described in this paper are all publicly released and freely available in the Semantic Vectors open-source software package.
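One concrete Vector Symbolic Architecture operation of the kind this line of work builds on is binding by circular convolution (holographic reduced representations), with approximate unbinding via correlation. This is a common VSA choice, not necessarily the exact operator used in the Semantic Vectors package, and the predicate/concept names are illustrative.

```python
# HRR-style binding/unbinding of a subject-predicate pair via FFT.
import numpy as np

rng = np.random.default_rng(4)
d = 1024
def rand_vec():                       # random near-unit vector
    v = rng.normal(0, 1 / np.sqrt(d), d)
    return v / np.linalg.norm(v)

def bind(a, b):                       # circular convolution via FFT
    return np.fft.irfft(np.fft.rfft(a) * np.fft.rfft(b), n=d)

def unbind(c, a):                     # correlation: approximate inverse
    return np.fft.irfft(np.fft.rfft(c) * np.conj(np.fft.rfft(a)), n=d)

pred, subj = rand_vec(), rand_vec()   # e.g., TREATS, some drug (illustrative)
fact = bind(pred, subj)               # encode one predication
recovered = unbind(fact, pred)
print("cosine(subj, recovered):", subj @ recovered /
      (np.linalg.norm(subj) * np.linalg.norm(recovered)))
```

The recovered vector is only approximately the original, which is why retrieval in such systems compares against a cleanup memory of known concept vectors.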
Geo3DML: A standard-based exchange format for 3D geological models
NASA Astrophysics Data System (ADS)
Wang, Zhangang; Qu, Honggang; Wu, Zixing; Wang, Xianghong
2018-01-01
A geological model (geomodel) in three-dimensional (3D) space is a digital representation of the Earth's subsurface, recognized by geologists and stored in resultant geological data (geodata). The increasing demand for data management and interoperable applications of geomodels can be addressed by developing standard-based exchange formats for the representation of not only a single geological object, but also holistic geomodels. However, current standards such as GeoSciML cannot incorporate all the geomodel-related information. This paper presents Geo3DML for the exchange of 3D geomodels based on the existing Open Geospatial Consortium (OGC) standards. Geo3DML is based on a unified and formal representation of structural models, attribute models and hierarchical structures of interpreted resultant geodata in different dimensional views, including drills, cross-sections/geomaps and 3D models, which is compatible with the conceptual model of GeoSciML. Geo3DML aims to encode all geomodel-related information integrally in one framework, including the semantic and geometric information of geoobjects and their relationships, as well as visual information. At present, Geo3DML and some supporting tools have been released as a data-exchange standard by the China Geological Survey (CGS).
NASA Astrophysics Data System (ADS)
Ronayne, Michael J.; Gorelick, Steven M.; Zheng, Chunmiao
2010-10-01
We developed a new model of aquifer heterogeneity to analyze data from a single-well injection-withdrawal tracer test conducted at the Macrodispersion Experiment (MADE) site on the Columbus Air Force Base in Mississippi (USA). The physical heterogeneity model is a hybrid that combines 3-D lithofacies to represent submeter scale, highly connected channels within a background matrix based on a correlated multivariate Gaussian hydraulic conductivity field. The modeled aquifer architecture is informed by a variety of field data, including geologic core sampling. Geostatistical properties of this hybrid heterogeneity model are consistent with the statistics of the hydraulic conductivity data set based on extensive borehole flowmeter testing at the MADE site. The representation of detailed, small-scale geologic heterogeneity allows for explicit simulation of local preferential flow and slow advection, processes that explain the complex tracer response from the injection-withdrawal test. Based on the new heterogeneity model, advective-dispersive transport reproduces key characteristics of the observed tracer recovery curve, including a delayed concentration peak and a low-concentration tail. Importantly, our results suggest that intrafacies heterogeneity is responsible for local-scale mass transfer.
Creep-fatigue life prediction for engine hot section materials (isotropic)
NASA Technical Reports Server (NTRS)
Moreno, V.
1982-01-01
The objectives of this program are the investigation of fundamental approaches to high temperature crack initiation life prediction, the identification of specific modeling strategies, and the development of specific models for component-relevant loading conditions. A survey of the hot section material/coating systems used throughout the gas turbine industry is included. Two material/coating systems will be identified for the program. The material/coating system designated as the base system shall be used throughout Tasks 1-12. The alternate material/coating system will be used only in Task 12 for further evaluation of the models developed on the base material. In Task 2, candidate life prediction approaches will be screened based on a set of criteria that includes experience with the approaches in the literature, correlation with isothermal data generated on the base material, and judgments of the applicability of each approach to the complex cycles to be considered in the option program. The two most promising approaches will be identified. Task 3 further evaluates the best approach using additional base-material fatigue testing, including verification tests. Task 4 consists of technical, schedule, financial, and all other reporting requirements in accordance with the Reports of Work clause.
Study of solid state photomultiplier
NASA Technical Reports Server (NTRS)
Hays, K. M.; Laviolette, R. A.
1987-01-01
Available solid state photomultiplier (SSPM) detectors were tested under low-background, low-temperature conditions to determine the conditions producing optimal sensitivity in a space-based astronomy system, such as a liquid-helium-cooled telescope in orbit. Detector temperatures varied between 6 and 9 K, with background flux ranging from 10^13 to less than 10^6 photons/cm^2-s. Measured parameters included quantum efficiency, noise, dark current, and spectral response. Experimental data were reduced, analyzed, and combined with existing data to build the SSPM data base included herein. The results were compared to analytical models of SSPM performance where appropriate models existed. The analytical models presented here were developed to be as consistent with the data base as practicable. Significant differences between the theory and the data are described. Some models were developed or updated as a result of this study.
Resource Aware Intelligent Network Services (RAINS) Final Technical Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lehman, Tom; Yang, Xi
The Resource Aware Intelligent Network Services (RAINS) project conducted research and developed technologies in the area of cyber infrastructure resource modeling and computation. The goal of this work was to provide a foundation to enable intelligent, software defined services which span the network AND the resources which connect to the network. A Multi-Resource Service Plane (MRSP) was defined, which allows resource owners/managers to locate and place themselves, from a topology and service availability perspective, within the dynamic networked cyberinfrastructure ecosystem. The MRSP enables the presentation of integrated topology views and computation results which can include resources across the spectrum of compute, storage, and networks. The MRSP developed by the RAINS project includes the following key components: i) Multi-Resource Service (MRS) Ontology/Multi-Resource Markup Language (MRML), ii) Resource Computation Engine (RCE), and iii) Modular Driver Framework (to allow integration of a variety of external resources). The MRS/MRML is a general and extensible modeling framework that allows resource owners to model, or describe, a wide variety of resource types. All resources are described using three categories of elements: Resources, Services, and Relationships between the elements. This modeling framework defines a common method for the transformation of cyber infrastructure resources into data in the form of MRML models. In order to realize this infrastructure datification, the RAINS project developed a model-based computation system, the RAINS Computation Engine (RCE). The RCE has the ability to ingest, process, integrate, and compute based on automatically generated MRML models. The RCE interacts with the resources through system drivers which are specific to the type of external network or resource controller. The RAINS project developed a modular and pluggable driver system which enables a variety of resource controllers to automatically generate, maintain, and distribute MRML-based resource descriptions. Once all of the resource topologies are absorbed by the RCE, a connected graph of the full distributed system topology is constructed, which forms the basis for computation and workflow processing. The RCE includes a Modular Computation Element (MCE) framework which allows for tailoring of the computation process to the specific set of resources under control and the services desired. The input and output of an MCE are both model data based on the MRS/MRML ontology and schema. RAINS project accomplishments include: development of a general and extensible multi-resource modeling framework; design of a Resource Computation Engine (RCE) that can absorb a variety of multi-resource model types and build integrated models, with a novel architecture that uses model-based communications across the full stack; flexible provision of abstract or intent-based user-facing interfaces; workflow processing based on model descriptions; release of the RCE as open source software; deployment of the RCE in the University of Maryland/Mid-Atlantic Crossroads ScienceDMZ in prototype mode, with a plan under way to transition to production; deployment at the Argonne National Laboratory DTN Facility in prototype mode; and selection of the RCE by the DOE SENSE (SDN for End-to-end Networked Science at the Exascale) project as the basis for their orchestration service.
Integrating Research Competencies in Massage Therapy Education.
ERIC Educational Resources Information Center
Hymel, Glenn M.
The massage therapy profession is currently engaged in a competency-based education movement that includes an emphasis on promoting massage therapy research competencies (MTRCs). A systems-based model for integrating MTRCs into massage therapy education was therefore proposed. The model and an accompanying checklist describe an approach to…
Automotive Maintenance Data Base for Model Years 1976-1979. Part I
DOT National Transportation Integrated Search
1980-12-01
An update of the existing data base was developed to include life cycle maintenance costs of representative vehicles for the model years 1976-1979. Repair costs as a function of time are also developed for a passenger car in each of the compact, subc...
Adding ecosystem function to agent-based land use models
USDA-ARS?s Scientific Manuscript database
The objective of this paper is to examine issues in the inclusion of simulations of ecosystem functions in agent-based models of land use decision-making. The reasons for incorporating these simulations include local interests in land fertility and global interests in carbon sequestration. Biogeoche...
Approaches to Classroom-Based Computational Science.
ERIC Educational Resources Information Center
Guzdial, Mark
Computational science includes the use of computer-based modeling and simulation to define and test theories about scientific phenomena. The challenge for educators is to develop techniques for implementing computational science in the classroom. This paper reviews some previous work on the use of simulation alone (without modeling), modeling…
Economic Modeling and Analysis of Educational Vouchers
ERIC Educational Resources Information Center
Epple, Dennis; Romano, Richard
2012-01-01
The analysis of educational vouchers has evolved from market-based analogies to models that incorporate distinctive features of the educational environment. These distinctive features include peer effects, scope for private school pricing and admissions based on student characteristics, the linkage of household residential and school choices in…
Querying and Ranking XML Documents.
ERIC Educational Resources Information Center
Schlieder, Torsten; Meuss, Holger
2002-01-01
Discussion of XML, information retrieval, precision, and recall focuses on a retrieval technique that adopts the similarity measure of the vector space model, incorporates the document structure, and supports structured queries. Topics include a query model based on tree matching; structured queries and term-based ranking; and term frequency and…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Kandler A; Santhanagopalan, Shriram; Yang, Chuanbo
Computer models are helping to accelerate the design and validation of next generation batteries and provide valuable insights not possible through experimental testing alone. Validated 3-D physics-based models exist for predicting the electrochemical performance and the thermal and mechanical response of cells and packs under normal and abuse scenarios. The talk describes present efforts to make the models better suited for engineering design, including improving their computation speed, developing faster processes for model parameter identification (including under aging), and predicting the performance of a proposed electrode material recipe a priori using microstructure models.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simpson, L.
ITN Energy Systems, Inc., and Global Solar Energy, Inc., with the assistance of NREL's PV Manufacturing R&D program, have continued the advancement of CIGS production technology through the development of trajectory-oriented predictive/control models, fault-tolerance control, control-platform development, in-situ sensors, and process improvements. Modeling activities to date include the development of physics-based and empirical models for CIGS and sputter-deposition processing, implementation of model-based control, and application of predictive models to the construction of new evaporation sources and for control. Model-based control is enabled through implementation of reduced or empirical models into a control platform. Reliability improvement activities include implementation of preventive maintenance schedules; detection of failed sensors/equipment and reconfiguration to continue processing; and systematic development of fault prevention and reconfiguration strategies for the full range of CIGS PV production deposition processes. In-situ sensor development activities have resulted in improved control and indicated the potential for enhanced process status monitoring and control of the deposition processes. Substantial process improvements have been made, including significant improvement in CIGS uniformity, thickness control, efficiency, yield, and throughput. In large measure, these gains have been driven by process optimization, which, in turn, has been enabled by control and reliability improvements due to this PV Manufacturing R&D program. This has resulted in substantial improvements of flexible CIGS PV module performance and efficiency.
Modeling Hemispheric Detonation Experiments in 2-Dimensions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Howard, W M; Fried, L E; Vitello, P A
2006-06-22
Experiments have been performed with LX-17 (92.5% TATB and 7.5% Kel-F 800 binder) to study scaling of detonation waves using a dimensional scaling in a hemispherical divergent geometry. We model these experiments using an arbitrary Lagrange-Eulerian (ALE3D) hydrodynamics code, with reactive flow models based on the thermo-chemical code, Cheetah. The thermo-chemical code Cheetah provides a pressure-dependent kinetic rate law, along with an equation of state based on exponential-6 fluid potentials for individual detonation product species, calibrated to high pressures (~ a few Mbar) and high temperatures (20,000 K). The parameters for these potentials are fit to a wide variety of experimental data, including shock, compression, and sound speed data. For the un-reacted high explosive equation of state we use a modified Murnaghan form. We model the detonator (including the flyer plate) and initiation system in detail. The detonator is composed of LX-16, for which we use a program burn model. Steinberg-Guinan models are used for the metal components of the detonator. The booster and high explosive are LX-10 and LX-17, respectively. For both the LX-10 and LX-17, we use a pressure-dependent rate law, coupled with a chemical equilibrium equation of state based on Cheetah. For LX-17, the kinetic model includes carbon clustering on the nanometer size scale.
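For reference, the exponential-6 (Buckingham) fluid potential named above is conventionally written as follows; this is the standard textbook form, with well depth ε, softness parameter α, and minimum location r_m, not necessarily Cheetah's exact parameterization:

\[ \varphi(r) = \frac{\varepsilon}{\alpha - 6}\left[\, 6\, e^{\alpha\,(1 - r/r_m)} \;-\; \alpha \left(\frac{r_m}{r}\right)^{6} \right] \]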
Moving Beyond ERP Components: A Selective Review of Approaches to Integrate EEG and Behavior
Bridwell, David A.; Cavanagh, James F.; Collins, Anne G. E.; Nunez, Michael D.; Srinivasan, Ramesh; Stober, Sebastian; Calhoun, Vince D.
2018-01-01
Relationships between neuroimaging measures and behavior provide important clues about brain function and cognition in healthy and clinical populations. While electroencephalography (EEG) provides a portable, low cost measure of brain dynamics, it has been somewhat underrepresented in the emerging field of model-based inference. We seek to address this gap in this article by highlighting the utility of linking EEG and behavior, with an emphasis on approaches for EEG analysis that move beyond focusing on peaks or “components” derived from averaging EEG responses across trials and subjects (generating the event-related potential, ERP). First, we review methods for deriving features from EEG in order to enhance the signal within single-trials. These methods include filtering based on user-defined features (i.e., frequency decomposition, time-frequency decomposition), filtering based on data-driven properties (i.e., blind source separation, BSS), and generating more abstract representations of data (e.g., using deep learning). We then review cognitive models which extract latent variables from experimental tasks, including the drift diffusion model (DDM) and reinforcement learning (RL) approaches. Next, we discuss ways to assess associations among these measures, including statistical models, data-driven joint models and cognitive joint modeling using hierarchical Bayesian models (HBMs). We think that these methodological tools are likely to contribute to theoretical advancements, and will help inform our understanding of brain dynamics that contribute to moment-to-moment cognitive function. PMID:29632480
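As a concrete illustration of the DDM mentioned above, a single trial can be simulated by accumulating noisy evidence until a decision bound is reached. This is a minimal sketch; all parameter names and default values are illustrative assumptions, not taken from the article:

```python
import numpy as np

def simulate_ddm_trial(drift=0.3, bound=1.0, noise=1.0, dt=0.001,
                       non_decision=0.2, rng=None):
    """Simulate one drift diffusion trial; returns (choice, reaction_time)."""
    rng = rng or np.random.default_rng()
    x, t = 0.0, 0.0
    while abs(x) < bound:
        # evidence accumulates with a constant drift plus Gaussian noise
        x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
        t += dt
    return (1 if x > 0 else 0), t + non_decision

choice, rt = simulate_ddm_trial()
print(choice, rt)
```

Latent variables such as the drift rate can then be regressed against single-trial EEG features, which is the kind of linkage the article surveys.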
Term Dependence: A Basis for Luhn and Zipf Models.
ERIC Educational Resources Information Center
Losee, Robert M.
2001-01-01
Discusses relationships between the frequency-based characteristics of neighboring terms in natural language and the rank or frequency of the terms. Topics include information theory measures, including expected mutual information measure (EMIM); entropy and rank; Luhn's model of term aboutness; Zipf's law; and implications for indexing and…
Skeletal maturity determination from hand radiograph by model-based analysis
NASA Astrophysics Data System (ADS)
Vogelsang, Frank; Kohnen, Michael; Schneider, Hansgerd; Weiler, Frank; Kilbinger, Markus W.; Wein, Berthold B.; Guenther, Rolf W.
2000-06-01
Derived from a model-based segmentation algorithm for hand radiographs proposed in our former work, we now present a method to determine skeletal maturity by automated analysis of regions of interest (ROIs). These ROIs, which include the epiphyseal and carpal bones most important for skeletal maturity determination, can be extracted from the radiograph by knowledge-based algorithms.
Development of a Computationally Efficient, High Fidelity, Finite Element Based Hall Thruster Model
NASA Technical Reports Server (NTRS)
Jacobson, David (Technical Monitor); Roy, Subrata
2004-01-01
This report documents the development of a two-dimensional finite element based numerical model for efficient characterization of Hall thruster plasma dynamics in the framework of a multi-fluid model. The effects of ionization and recombination are included in the present model. Based on experimental data, a third-order polynomial in electron temperature is used to calculate the ionization rate. The neutral dynamics is included only through the neutral continuity equation in the presence of a uniform neutral flow. The electrons are modeled as magnetized and hot, whereas ions are assumed magnetized and cold. The dynamics of the Hall thruster is also investigated in the presence of plasma-wall interaction. The plasma-wall interaction is a function of wall potential, which in turn is determined by the secondary electron emission and sputtering yield. The effects of secondary electron emission and sputter yield are considered simultaneously. Simulation results are interpreted in the light of experimental observations and available numerical solutions in the literature.
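The ionization-rate fit described above is straightforward to express in code. A minimal sketch, with placeholder coefficients standing in for the report's experimentally fitted values:

```python
def ionization_rate(te_ev, coeffs=(1.0e-20, 5.0e-20, -2.0e-21, 3.0e-23)):
    """Third-order polynomial in electron temperature (eV).

    The coefficients here are arbitrary placeholders; in the report they
    are fit to experimental data.
    """
    c0, c1, c2, c3 = coeffs
    return c0 + c1 * te_ev + c2 * te_ev**2 + c3 * te_ev**3

print(ionization_rate(15.0))
```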
WASP TRANSPORT MODELING AND WASP ECOLOGICAL MODELING
A combination of lectures, demonstrations, and hands-on exercises will be used to introduce pollutant transport modeling with the U.S. EPA's general water quality model, WASP (Water Quality Analysis Simulation Program). WASP features include a user-friendly Windows-based interface...
Model-based learning and the contribution of the orbitofrontal cortex to the model-free world
McDannald, Michael A.; Takahashi, Yuji K.; Lopatina, Nina; Pietras, Brad W.; Jones, Josh L.; Schoenbaum, Geoffrey
2012-01-01
Learning is proposed to occur when there is a discrepancy between reward prediction and reward receipt. At least two separate systems are thought to exist: one in which predictions are proposed to be based on model-free or cached values; and another in which predictions are model-based. A basic neural circuit for model-free reinforcement learning has already been described. In the model-free circuit the ventral striatum (VS) is thought to supply a common-currency reward prediction to midbrain dopamine neurons that compute prediction errors and drive learning. In a model-based system, predictions can include more information about an expected reward, such as its sensory attributes or current, unique value. This detailed prediction allows for both behavioral flexibility and learning driven by changes in sensory features of rewards alone. Recent evidence from animal learning and human imaging suggests that, in addition to model-free information, the VS also signals model-based information. Further, there is evidence that the orbitofrontal cortex (OFC) signals model-based information. Here we review these data and suggest that the OFC provides model-based information to this traditional model-free circuitry and offer possibilities as to how this interaction might occur. PMID:22487030
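The model-free circuit described here is often formalized as temporal-difference (TD) learning, in which a cached value is nudged by a dopamine-like prediction error. A minimal sketch (names and defaults are illustrative):

```python
def td_update(v_pred, reward, v_next, alpha=0.1, gamma=0.95):
    """Model-free (cached-value) update: learning occurs only when there is a
    discrepancy between reward prediction and reward receipt."""
    delta = reward + gamma * v_next - v_pred   # prediction error signal
    return v_pred + alpha * delta, delta

# a surprising reward (v_pred = 0) produces a large positive error
print(td_update(v_pred=0.0, reward=1.0, v_next=0.0))
```

A model-based system, by contrast, would also store the sensory identity of the expected outcome, so value can be recomputed when a reward's features or current worth change, which is the kind of information the review attributes to the OFC.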
Bayesian Model Selection under Time Constraints
NASA Astrophysics Data System (ADS)
Hoege, M.; Nowak, W.; Illman, W. A.
2017-12-01
Bayesian model selection (BMS) provides a consistent framework for rating and comparing models in multi-model inference. In cases where models of vastly different complexity compete with each other, we also face vastly different computational runtimes of such models. For instance, time series of a quantity of interest can be simulated by an autoregressive process model that takes even less than a second for one run, or by a partial differential equations-based model with runtimes up to several hours or even days. The classical BMS is based on a quantity called Bayesian model evidence (BME). It determines the model weights in the selection process and represents a trade-off between the bias of a model and its complexity. However, in practice, the runtime of models is another relevant weighting factor for model selection. Hence, we believe that it should be included, leading to an overall trade-off problem between bias, variance, and computing effort. We approach this triple trade-off from the viewpoint of our ability to generate realizations of the models under a given computational budget. One way to obtain BME values is through sampling-based integration techniques. We start from the fact that more expensive models can be sampled much less under time constraints than faster models (in direct proportion to their runtime). The computed evidence in favor of a more expensive model is statistically less significant than the evidence computed in favor of a faster model, since sampling-based strategies are always subject to statistical sampling error. We present a straightforward way to include this imbalance into the model weights that are the basis for model selection. Our approach follows directly from the idea of insufficient significance. It is based on a computationally cheap bootstrapping error estimate of model evidence and is easy to implement. The approach is illustrated in a small synthetic modeling study.
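A hedged sketch of the mechanics described above: BME is estimated as the average likelihood of prior samples, the number of model runs is capped by the time budget divided by the per-run runtime, and a cheap bootstrap gives the sampling error on the evidence. The function and parameter names are assumptions for illustration, not the authors' implementation:

```python
import numpy as np

def bme_with_bootstrap(log_like_fn, prior_sampler, runtime_s, budget_s,
                       n_boot=200, rng=None):
    """Monte Carlo estimate of Bayesian model evidence under a time budget.

    Slower models get fewer samples (n = budget / runtime), and therefore a
    larger bootstrap error on their evidence estimate.
    """
    rng = rng or np.random.default_rng(0)
    n = max(1, int(budget_s / runtime_s))
    likes = np.exp([log_like_fn(prior_sampler(rng)) for _ in range(n)])
    bme = likes.mean()
    boot = np.array([rng.choice(likes, size=n, replace=True).mean()
                     for _ in range(n_boot)])
    return bme, boot.std()   # evidence estimate and its sampling error

# toy example: Gaussian likelihood of a scalar parameter with a standard
# normal prior; a 0.5 s model sampled within a 60 s budget
bme, err = bme_with_bootstrap(
    log_like_fn=lambda th: -0.5 * (th - 1.0) ** 2,
    prior_sampler=lambda rng: rng.standard_normal(),
    runtime_s=0.5, budget_s=60.0)
print(bme, err)
```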
Modeling of ground based laser propagation to low Earth orbit object for maneuver
NASA Astrophysics Data System (ADS)
Smith, Liam C.; Allen, Jeffrey H.; Bold, Matthew M.
2017-08-01
The Space Environment Research Centre (SERC) endeavors to demonstrate the ability to maneuver high area-to-mass-ratio objects using ground-based lasers. Lockheed Martin has been leading system performance modeling for this project, which includes high power laser propagation through the atmosphere, target interactions, and the subsequent orbital maneuver of the object. This paper will describe the models used, model assumptions, and performance estimates for the laser maneuver demonstration.
Latino Definitions of Success: A Cultural Model of Intercultural Competence
Torres, Lucas
2010-01-01
The present study sought to examine Latino intercultural competence via two separate methodologies. Phase 1 entailed discovering and generating themes regarding the features of intercultural competence based on semistructured interviews of 15 Latino adults. Phase 2 included conducting a cultural consensus analysis from the quantitative responses of 46 Latino adults to determine the cultural model of intercultural competence. The major results indicated that the participants, despite variations in socioeconomic and generational statuses, shared a common knowledge base regarding the competencies needed for Latinos to successfully navigate different cultures. Overall, the cultural model of Latino intercultural competence includes a set of skills that integrates traditional cultural values along with attributes of self-efficacy. The findings are discussed within a competence-based conceptualization of cultural adaptation and potential advancements in acculturation research. PMID:20333325
Steele, Leah S; Durbin, Anna; Sibley, Lyn M; Glazier, Richard
2013-01-01
In Ontario, Canada, the patient-centred medical home is a model of primary care delivery that includes 3 model types of interest for this study: enhanced fee-for-service, blended capitation, and team-based blended capitation. All 3 models involve rostering of patients and have similar practice requirements but differ in method of physician reimbursement, with the blended capitation models incorporating adjustments for age and sex, but not case mix, of rostered patients. We evaluated the extent to which persons with mental illness were included in physicians' total practices (as rostered and non-rostered patients) and were included on physicians' rosters across types of medical homes in Ontario. Using population-based administrative data, we considered 3 groups of patients: those with psychotic or bipolar diagnoses, those with other mental health diagnoses, and those with no mental health diagnoses. We modelled the prevalence of mental health diagnoses and the proportion of patients with such diagnoses who were rostered across the 3 medical home model types, controlling for demographic characteristics and case mix. Compared with enhanced fee-for-service practices, and relative to patients without mental illness, the proportions of patients with psychosis or bipolar disorders were not different in blended capitation and team-based blended capitation practices (rate ratio [RR] 0.91, 95% confidence interval [CI] 0.82-1.01; RR 1.06, 95% CI 0.96-1.17, respectively). However, there were fewer patients with other mental illnesses (RR 0.94, 95% CI 0.90-0.99; RR 0.89, 95% CI 0.85-0.94, respectively). Compared with expected proportions, practices based on both capitation models were significantly less likely than enhanced fee-for-service practices to roster patients with psychosis or bipolar disorders (for blended capitation, RR 0.92, 95% CI 0.90-0.93; for team-based capitation, RR 0.92, 95% CI 0.88-0.93) and also patients with other mental illnesses (for blended capitation, RR 0.94, 95% CI 0.92-0.95; for team-based capitation, RR 0.93, 95% CI 0.92-0.94). Persons with mental illness were under-represented in the rosters of Ontario's capitation-based medical homes. These findings suggest a need to direct attention to the incentive structure for including patients with mental illness.
Abe, Makiko; Ito, Hidemi; Oze, Isao; Nomura, Masatoshi; Ogawa, Yoshihiro; Matsuo, Keitaro
2017-12-01
Little is known about differences in genetic predisposition for CRC between ethnicities; however, many genetic traits common to colorectal cancer have been identified. This study investigated whether additional SNPs identified in GWASs of East Asian populations could improve risk prediction for Japanese, and explored the possible application of genetic risk groups as an instrument for risk communication. 558 patients with histologically verified colorectal cancer and 1116 first-visit outpatients were included in the derivation study, and 547 cases and 547 controls in the replication study. In each population, we evaluated prediction models for the risk of CRC that combined a genetic risk group based on SNPs from European-population GWASs with a similarly developed model adding SNPs from East Asian-population GWASs. We examined whether adding East Asian-specific SNPs would improve the discrimination. Six SNPs (rs6983267, rs4779584, rs4444235, rs9929218, rs10936599, rs16969681) of 23 SNPs from European-based GWASs and five SNPs (rs704017, rs11196172, rs10774214, rs647161, rs2423279) of ten SNPs from Asian-based GWASs were selected for the CRC risk prediction model. Compared with a 6-SNP-based model, an 11-SNP model including the Asian GWAS SNPs showed improved discrimination capacity in receiver operating characteristic analysis. The 11-SNP model resulted in statistically significant improvement in both the derivation (P = 0.0039) and replication studies (P = 0.0018) compared with the 6-SNP model. We estimated the cumulative risk of CRC using genetic risk groups based on the 11 SNPs and found that the cumulative risk at age 80 is approximately 13% in the high-risk group and 6% in the low-risk group. We constructed a more efficient CRC risk prediction model with 11 SNPs, including the newly identified East Asian-based GWAS SNPs (rs704017, rs11196172, rs10774214, rs647161, rs2423279). Risk grouping based on the 11 SNPs depicted the lifetime difference in CRC risk. This might be useful for effective individualized prevention in East Asians.
NASA Astrophysics Data System (ADS)
Wong, Ying-Qi; Segall, Paul; Bradley, Andrew; Anderson, Kyle
2017-10-01
Physics-based models of volcanic eruptions track conduit processes as functions of depth and time. When used in inversions, these models permit integration of diverse geological and geophysical data sets to constrain important parameters of magmatic systems. We develop a 1-D steady-state conduit model for effusive eruptions including equilibrium crystallization and gas transport through the conduit and compare with the quasi-steady dome growth phase of Mount St. Helens in 2005. Viscosity increase resulting from pressure-dependent crystallization leads to a natural transition from viscous flow to frictional sliding on the conduit margin. Erupted mass flux depends strongly on wall rock and magma permeabilities due to their impact on magma density. Including both lateral and vertical gas transport reveals competing effects that produce nonmonotonic behavior in the mass flux when increasing magma permeability. Using this physics-based model in a Bayesian inversion, we link data sets from Mount St. Helens such as extrusion flux and earthquake depths with petrological data to estimate unknown model parameters, including magma chamber pressure and water content, magma permeability constants, conduit radius, and friction along the conduit walls. Even with this relatively simple model and limited data, we obtain improved constraints on important model parameters. We find that the magma chamber had low (<5 wt %) total volatiles and that the magma permeability scale is well constrained at ~10^(-11.4) m² to reproduce observed dome rock porosities. Compared with previous results, higher magma overpressure and lower wall friction are required to compensate for increased viscous resistance while keeping extrusion rate at the observed value.
Some Computer-Based Developments in Sociology.
ERIC Educational Resources Information Center
Heise, David R.; Simmons, Roberta G.
1985-01-01
Discusses several ways in which computers are being used in sociology and how they continue to change this discipline. Areas considered include data collection, data analysis, simulations of social processes based on mathematical models, and problem areas (including standardization concerns, training, and the financing of computing facilities).…
Modeling asset price processes based on mean-field framework
NASA Astrophysics Data System (ADS)
Ieda, Masashi; Shiino, Masatoshi
2011-12-01
We propose a model of the dynamics of financial assets based on the mean-field framework. This framework allows us to construct a model which includes the interaction among the financial assets, reflecting the market structure. Our study takes a microscopic approach to modeling the financial market. To demonstrate the effectiveness of our model concretely, we provide a case study: the pricing problem of the European call option with short-time memory noise.
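The paper's specific dynamics are not reproduced here, but a generic mean-field interaction is easy to sketch: each asset's log-price feels a pull toward the instantaneous market average, plus idiosyncratic noise, integrated by Euler-Maruyama. All names and coefficients below are illustrative assumptions:

```python
import numpy as np

def simulate_mean_field_assets(n_assets=100, n_steps=1000, dt=1e-3,
                               coupling=1.0, sigma=0.2, rng=None):
    """Euler-Maruyama simulation of log-prices interacting through their
    empirical mean (a generic mean-field coupling, not the paper's model)."""
    rng = rng or np.random.default_rng(0)
    x = rng.standard_normal(n_assets) * 0.1
    path = np.empty((n_steps, n_assets))
    for t in range(n_steps):
        drift = coupling * (x.mean() - x)    # pull toward the market mean
        x = x + drift * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_assets)
        path[t] = x
    return path

prices = np.exp(simulate_mean_field_assets())
print(prices[-1, :5])
```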
Shahabi, Himan; Hashim, Mazlan
2015-04-22
This research presents GIS-based statistical models for generating landslide susceptibility maps from geographic information system (GIS) and remote-sensing data for the Cameron Highlands area in Malaysia. Ten factors (slope, aspect, soil, lithology, NDVI, land cover, distance to drainage, precipitation, distance to fault, and distance to road) were extracted from SAR data, SPOT 5, and WorldView-1 images. The relationships between the detected landslide locations and these ten related factors were identified using GIS-based statistical models, including analytical hierarchy process (AHP), weighted linear combination (WLC), and spatial multi-criteria evaluation (SMCE) models. The landslide inventory map, which has a total of 92 landslide locations, was created from numerous resources such as digital aerial photographs, AIRSAR data, WorldView-1 images, and field surveys. Then, 80% of the landslide inventory was used to train the statistical models and the remaining 20% for validation. The validation results, using the relative landslide density index (R-index) and receiver operating characteristic (ROC), demonstrated that the SMCE model (96% accuracy) predicts better than the AHP (91%) and WLC (89%) models. These landslide susceptibility maps would be useful for hazard mitigation and regional planning.
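Of the three methods compared above, the weighted linear combination is the simplest to illustrate: standardized factor rasters are combined with normalized weights into a per-cell susceptibility index. A minimal sketch with made-up layers and weights:

```python
import numpy as np

def wlc_susceptibility(layers, weights):
    """Weighted linear combination of standardized factor layers
    (slope, lithology, distance to road, ..., scaled to [0, 1])."""
    layers = np.asarray(layers, dtype=float)   # shape: (n_factors, rows, cols)
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                            # normalize weights (AHP-style)
    return np.tensordot(w, layers, axes=1)     # susceptibility index per cell

# toy example: three 2x2 factor layers with assumed weights
index = wlc_susceptibility(np.random.rand(3, 2, 2), [0.5, 0.3, 0.2])
print(index)
```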
Goal-Based Domain Modeling as a Basis for Cross-Disciplinary Systems Engineering
NASA Astrophysics Data System (ADS)
Jarke, Matthias; Nissen, Hans W.; Rose, Thomas; Schmitz, Dominik
Small and medium-sized enterprises (SMEs) are important drivers of innovation. In particular, project-driven SMEs that closely cooperate with their customers have specific needs with regard to the information engineering of their development process. They need fast requirements capture, since this is most often included in the (unpaid) offer development phase. At the same time, they need to maintain and extensively reuse the knowledge and experience they have gathered in previous projects, as it is their core asset. The situation is complicated further if the application field crosses disciplinary boundaries. To bridge the gaps and perspectives, we focus on shared goals and dependencies captured in models at a conceptual level. Such a model-based approach also offers a smarter connection to subsequent development stages, including a high share of automated code generation. In the approach presented here, the agent- and goal-oriented formalism i* is therefore extended by domain models to facilitate information organization. This extension permits a domain model-based similarity search and a model-based transformation towards subsequent development stages. Our approach also addresses the evolution of domain models reflecting the experience from completed projects. The approach is illustrated with a case study on software-intensive control systems in an SME of the automotive domain.
Integrated research in constitutive modelling at elevated temperatures, part 2
NASA Technical Reports Server (NTRS)
Haisler, W. E.; Allen, D. H.
1986-01-01
Four current viscoplastic models are compared experimentally for Inconel 718 at 1100 °F. A series of tests were performed to create a sufficient data base from which to evaluate material constants. The models used include Bodner's anisotropic model; Krieg, Swearengen, and Rhode's model; Schmidt and Miller's model; and Walker's exponential model.
Estimating Longwave Atmospheric Emissivity in the Canadian Rocky Mountains
NASA Astrophysics Data System (ADS)
Ebrahimi, S.; Marshall, S. J.
2014-12-01
Incoming longwave radiation is an important source of energy contributing to snow and glacier melt. However, estimating the incoming longwave radiation from the atmosphere is challenging due to the highly varying conditions of the atmosphere, especially cloudiness. We analyze the performance of several existing models, including a physically based clear-sky model by Brutsaert (1987) and two different empirical models for all-sky conditions (Lhomme and others, 2007; Herrero and Polo, 2012), at Haig Glacier in the Canadian Rocky Mountains. The models are based on relations between readily observed near-surface meteorological data, including temperature, vapor pressure, relative humidity, and estimates of shortwave radiation transmissivity (i.e., clear-sky or cloud-cover indices). This class of models generally requires solar radiation data in order to obtain a proxy for cloud conditions. Such data are not always available for distributed models of glacier melt, and can have high spatial variation in regions of complex topography, which likely does not reflect the more homogeneous atmospheric longwave emissions. We therefore test longwave radiation parameterizations as a function of near-surface humidity and temperature variables, based on automatic weather station data (half-hourly and mean daily values) from 2004 to 2012. Comparative analysis of the different incoming longwave radiation parameterizations showed that the locally calibrated model based on relative humidity and vapour pressure performs better than other published models. Performance is degraded, but still better than standard cloud-index-based models, when we transfer the model to Kwadacha Glacier in the northern Canadian Rockies, roughly 900 km away.
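A hedged sketch of the clear-sky ingredient: Brutsaert's widely used emissivity parameterization, eps = 1.24 (e/T)^(1/7), combined with the Stefan-Boltzmann law to give incoming longwave from screen-level vapor pressure and temperature. This is the textbook form, not necessarily the exact coefficients calibrated in the study:

```python
def brutsaert_clear_sky_lw(vapor_pressure_hpa, temp_k,
                           sigma=5.670374419e-8):
    """Clear-sky incoming longwave (W m^-2) via Brutsaert's emissivity,
    eps = 1.24 * (e/T)^(1/7), with e in hPa and T in K at screen level."""
    eps = 1.24 * (vapor_pressure_hpa / temp_k) ** (1.0 / 7.0)
    return eps * sigma * temp_k ** 4

# e.g., 8 hPa vapor pressure at 5 C
print(brutsaert_clear_sky_lw(8.0, 278.15))
```

All-sky variants then scale this emissivity upward with a cloud-cover or shortwave-transmissivity index, which is exactly the dependence the study tries to remove.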
Optimal control of CPR procedure using hemodynamic circulation model
Lenhart, Suzanne M.; Protopopescu, Vladimir A.; Jung, Eunok
2007-12-25
A method for determining a chest pressure profile for cardiopulmonary resuscitation (CPR) includes the steps of representing a hemodynamic circulation model based on a plurality of difference equations for a patient, applying an optimal control (OC) algorithm to the circulation model, and determining a chest pressure profile. The chest pressure profile defines a timing pattern of externally applied pressure to a chest of the patient to maximize blood flow through the patient. A CPR device includes a chest compressor, a controller communicably connected to the chest compressor, and a computer communicably connected to the controller. The computer determines the chest pressure profile by applying an OC algorithm to a hemodynamic circulation model based on the plurality of difference equations.
Cognitive Tools for Assessment and Learning in a High Information Flow Environment.
ERIC Educational Resources Information Center
Lajoie, Susanne P.; Azevedo, Roger; Fleiszer, David M.
1998-01-01
Describes the development of a simulation-based intelligent tutoring system for nurses working in a surgical intensive care unit. Highlights include situative learning theories and models of instruction, modeling expertise, complex decision making, linking theories of learning to the design of computer-based learning environments, cognitive task…
Automotive Maintenance Data Base for Model Years 1976-1979. Part II : Appendix E and F
DOT National Transportation Integrated Search
1980-12-01
An update of the existing data base was developed to include life cycle maintenance costs of representative vehicles for the model years 1976-1979. Repair costs as a function of time are also developed for a passenger car in each of the compact, subc...
Modelling Cognitive Style in a Peer Help Network.
ERIC Educational Resources Information Center
Bull, Susan; McCalla, Gord
2002-01-01
Explains I-Help, a computer-based peer help network where students can ask and answer questions about assignments and courses based on the metaphor of a help desk. Highlights include cognitive style; user modeling in I-Help; matching helpers to helpees; and types of questions. (Contains 64 references.) (LRW)
Stratigraphy of the crater Copernicus
NASA Technical Reports Server (NTRS)
Paquette, R.
1984-01-01
The stratigraphy of Copernicus based on its olivine absorption bands is presented. Earth-based spectral data are used to develop models that also employ cratering mechanics to devise theories for Copernican geomorphology. General geologic information, spectral information, upper and lower stratigraphic units, and a chart for model comparison are included in the stratigraphic analysis.
Improving Conceptual Understanding and Representation Skills through Excel-Based Modeling
ERIC Educational Resources Information Center
Malone, Kathy L.; Schunn, Christian D.; Schuchardt, Anita M.
2018-01-01
The National Research Council framework for science education and the Next Generation Science Standards have developed a need for additional research and development of curricula that is both technologically model-based and includes engineering practices. This is especially the case for biology education. This paper describes a quasi-experimental…
Mathematical Modelling in Engineering: An Alternative Way to Teach Linear Algebra
ERIC Educational Resources Information Center
Domínguez-García, S.; García-Planas, M. I.; Taberna, J.
2016-01-01
Technological advances require that basic science courses for engineering, including Linear Algebra, emphasize the development of mathematical strengths associated with modelling and interpretation of results, which are not limited only to calculus abilities. Based on this consideration, we have proposed a project-based learning, giving a dynamic…
Designing Corporate Databases to Support Technology Innovation
ERIC Educational Resources Information Center
Gultz, Michael Jarett
2012-01-01
Based on a review of the existing literature on database design, this study proposed a unified database model to support corporate technology innovation. This study assessed potential support for the model based on the opinions of 200 technology industry executives, including Chief Information Officers, Chief Knowledge Officers and Chief Learning…
A Suggested Model for a Working Cyberschool.
ERIC Educational Resources Information Center
Javid, Mahnaz A.
2000-01-01
Suggests a model for a working cyberschool based on a case study of Kamiak Cyberschool (Washington), a technology-driven public high school. Topics include flexible hours; one-to-one interaction with teachers; a supportive school environment; use of computers, interactive media, and online resources; and self-paced, project-based learning.…
Using Structural Equation Modeling To Fit Models Incorporating Principal Components.
ERIC Educational Resources Information Center
Dolan, Conor; Bechger, Timo; Molenaar, Peter
1999-01-01
Considers models incorporating principal components from the perspectives of structural-equation modeling. These models include the following: (1) the principal-component analysis of patterned matrices; (2) multiple analysis of variance based on principal components; and (3) multigroup principal-components analysis. Discusses fitting these models…
Model-based diagnosis of large diesel engines based on angular speed variations of the crankshaft
NASA Astrophysics Data System (ADS)
Desbazeille, M.; Randall, R. B.; Guillet, F.; El Badaoui, M.; Hoisnard, C.
2010-07-01
This work aims at monitoring large diesel engines by analyzing the crankshaft angular speed variations. It focuses on a powerful 20-cylinder diesel engine with crankshaft natural frequencies within the operating speed range. First, the angular speed variations are modeled at the crankshaft free end. This includes modeling both the crankshaft dynamical behavior and the excitation torques. As the engine is very large, the first crankshaft torsional modes are in the low frequency range. A model with the assumption of a flexible crankshaft is required. The excitation torques depend on the in-cylinder pressure curve. The latter is modeled with a phenomenological model. Mechanical and combustion parameters of the model are optimized with the help of actual data. Then, an automated diagnosis based on an artificially intelligent system is proposed. Neural networks are used for pattern recognition of the angular speed waveforms in normal and faulty conditions. Reference patterns required in the training phase are computed with the model, calibrated using a small number of actual measurements. Promising results are obtained. An experimental fuel leakage fault is successfully diagnosed, including detection and localization of the faulty cylinder, as well as the approximation of the fault severity.
The Folding Energy Landscape and Free Energy Excitations of Cytochrome c
Weinkam, Patrick; Zimmermann, Jörg; Romesberg, Floyd E.
2014-01-01
The covalently bound heme cofactor plays a dominant role in the folding of cytochrome c. Due to the complicated inorganic chemistry of the heme, some might consider the folding of cytochrome c to be a special case that follows different principles than those used to describe folding of proteins without cofactors. Recent investigations, however, demonstrate that models which are commonly used to describe folding for many proteins work well for cytochrome c when heme is explicitly introduced and generally provide results that agree with experimental observations. We will first discuss results from simple native structure-based models. These models include attractive interactions between nonadjacent residues only if they are present in the crystal structure at pH 7. Since attractive nonnative contacts are not included in native structure-based models, their energy landscapes can be described as “perfectly funneled.” In other words, native structure-based models are energetically guided towards the native state and contain no energetic traps that would hinder folding. Energetic traps are sources of frustration which cause specific transient intermediates to be populated. Native structure-based models do include repulsion between residues due to excluded volume. Nonenergetic traps can therefore exist if the chain, which cannot cross over itself, must partially unfold in order for folding to proceed. The ability of native structure-based models to capture these types of motions is in part responsible for their successful predictions of folding pathways for many types of proteins. Models without frustration describe well the sequence of folding events for cytochrome c inferred from hydrogen exchange experiments, thereby justifying their use as a starting point. At low pH, the folding sequence of cytochrome c deviates from that at pH 7 and from those predicted from models with perfectly funneled energy landscapes. Alternate folding pathways are a result of “chemical frustration.” This frustration arises because some regions of the protein are destabilized more than others due to the heterogeneous distribution of titratable residues that are protonated at low pH. We construct more complex models that include chemical frustration, in addition to the native structure-based terms. These more complex models only modestly perturb the energy landscape, which remains overall well funneled. These perturbed models can accurately describe how alternative folding pathways are used at low pH. At alkaline pH, cytochrome c populates distinctly different structural ensembles. For instance, lysine residues are deprotonated and compete for the heme ligation site. The same models that can describe folding at low pH also predict well the structures and relative stabilities of intermediates populated at alkaline pH. PMID:20143816
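The native structure-based ("perfectly funneled") energy function described above is simple to sketch: attraction is assigned only to residue pairs in contact in the crystal structure, here with the 10-12 form commonly used in Go-type models. Variable names are illustrative, and the heme and chemical-frustration terms of the full model are omitted:

```python
import numpy as np

def native_contact_energy(coords, native_pairs, native_dists, eps=1.0):
    """Attraction only between native contact pairs (Go-type model);
    nonnative pairs get no attractive term, so the landscape is funneled.
    Each pair (i, j) has its minimum at the crystal-structure distance r0."""
    e = 0.0
    for (i, j), r0 in zip(native_pairs, native_dists):
        r = np.linalg.norm(coords[i] - coords[j])
        e += eps * (5.0 * (r0 / r) ** 12 - 6.0 * (r0 / r) ** 10)
    return e

# toy check: a single contact at exactly its native distance gives -eps
coords = np.array([[0.0, 0.0, 0.0], [5.0, 0.0, 0.0]])
print(native_contact_energy(coords, [(0, 1)], [5.0]))
```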
Newcom, D W; Baas, T J; Stalder, K J; Schwab, C R
2005-04-01
Three selection models were evaluated to compare selection candidate rankings based on EBV and to evaluate subsequent effects of model-derived EBV on the selection differential and expected genetic response in the population. Data were collected from carcass- and ultrasound-derived estimates of loin i.m. fat percent (IMF) in a population of Duroc swine under selection to increase IMF. The models compared were Model 1, a two-trait animal model used in the selection experiment that included ultrasound IMF from all pigs scanned and carcass IMF from pigs slaughtered to estimate breeding values for both carcass (C1) and ultrasound IMF (U1); Model 2, a single-trait animal model that included ultrasound IMF values on all pigs scanned to estimate breeding values for ultrasound IMF (U2); and Model 3, a multiple-trait animal model including carcass IMF from slaughtered pigs and the first three principal components from a total of 10 image parameters averaged across four longitudinal ultrasound images to estimate breeding values for carcass IMF (C3). Rank correlations between breeding value estimates for U1 and C1, U1 and U2, and C1 and C3 were 0.95, 0.97, and 0.92, respectively. Other rank correlations were 0.86 or less. In the selection experiment, approximately the top 10% of boars and 50% of gilts were selected. Selection differentials for pigs in Generation 3 were greatest when ranking pigs based on C1, followed by U1, U2, and C3. In addition, selection differential and estimated response were evaluated when simulating selection of the top 1, 5, and 10% of sires and 50% of dams. Results of this analysis indicated the greatest selection differential was for selection based on C1. The greatest loss in selection differential was found for selection based on C3 when selecting the top 10 and 1% of boars and 50% of gilts. The loss in estimated response when selecting varying percentages of boars and the top 50% of gilts was greatest when selection was based on C3 (16.0 to 25.8%) and least for selection based on U1 (1.3 to 10.9%). Estimated genetic change from selection based on carcass IMF was greater than selection based on ultrasound IMF. Results show that selection based on a combination of ultrasonically predicted IMF and sib carcass IMF produced the greatest selection differentials and should lead to the greatest genetic change.
An Example of Competence-Based Learning: Use of Maxima in Linear Algebra for Engineers
ERIC Educational Resources Information Center
Diaz, Ana; Garcia, Alfonsa; de la Villa, Agustin
2011-01-01
This paper analyses the role of Computer Algebra Systems (CAS) in a model of learning based on competences. The proposal is an e-learning model Linear Algebra course for Engineering, which includes the use of a CAS (Maxima) and focuses on problem solving. A reference model has been taken from the Spanish Open University. The proper use of CAS is…
Meeting in Turkey: WASP Transport Modeling and WASP Ecological Modeling
A combination of lectures, demonstrations, and hands-on exercises will be used to introduce pollutant transport modeling with the U.S. EPA's general water quality model, WASP (Water Quality Analysis Simulation Program). WASP features include a user-friendly Windows-based interface...
Meeting in Korea: WASP Transport Modeling and WASP Ecological Modeling
A combination of lectures, demonstrations, and hands-on exercises will be used to introduce pollutant transport modeling with the U.S. EPA's general water quality model, WASP (Water Quality Analysis Simulation Program). WASP features include a user-friendly Windows-based interface...
Consumer psychology: categorization, inferences, affect, and persuasion.
Loken, Barbara
2006-01-01
This chapter reviews research on consumer psychology with emphasis on the topics of categorization, inferences, affect, and persuasion. The chapter reviews theory-based empirical research during the period 1994-2004. Research on categorization includes empirical research on brand categories, goals as organizing frameworks and motivational bases for judgments, and self-based processing. Research on inferences includes numerous types of inferences that are cognitively and/or experientially based. Research on affect includes the effects of mood on processing and cognitive and noncognitive bases for attitudes and intentions. Research on persuasion focuses heavily on the moderating role of elaboration and dual-process models, and includes research on attitude strength responses, advertising responses, and negative versus positive evaluative dimensions.
Nonlinear Fluid Model Of 3-D Field Effects In Tokamak Plasmas
NASA Astrophysics Data System (ADS)
Callen, J. D.; Hegna, C. C.; Beidler, M. T.
2017-10-01
Extended MHD codes (e.g., NIMROD, M3D-C1) are beginning to explore nonlinear effects of small 3-D magnetic fields on tokamak plasmas. To facilitate development of analogous physically understandable reduced models, a fluid-based dynamic nonlinear model of these added 3-D field effects in the base axisymmetric tokamak magnetic field geometry is being developed. The model incorporates kinetic-based closures within an extended MHD framework. Key 3-D field effects models that have been developed include: 1) a comprehensive modified Rutherford equation for the growth of a magnetic island that includes the classical tearing and NTM perturbed bootstrap current drives, externally applied magnetic field and current drives, and classical and neoclassical polarization current effects, and 2) dynamic nonlinear evolution of the plasma toroidal flow (radial electric field) in response to the 3-D fields. An application of this model to RMP ELM suppression precipitated by an ELM crash will be discussed. Supported by Office of Fusion Energy Sciences, Office of Science, Dept. of Energy Grants DE-FG02-86ER53218 and DE-FG02-92ER54139.
Kernel Regression Estimation of Fiber Orientation Mixtures in Diffusion MRI
Cabeen, Ryan P.; Bastin, Mark E.; Laidlaw, David H.
2016-01-01
We present and evaluate a method for kernel regression estimation of fiber orientations and associated volume fractions for diffusion MR tractography and population-based atlas construction in clinical imaging studies of brain white matter. This is a model-based image processing technique in which representative fiber models are estimated from collections of component fiber models in model-valued image data. This extends prior work in nonparametric image processing and multi-compartment processing to provide computational tools for image interpolation, smoothing, and fusion with fiber orientation mixtures. In contrast to related work on multi-compartment processing, this approach is based on directional measures of divergence and includes data-adaptive extensions for model selection and bilateral filtering. This is useful for reconstructing complex anatomical features in clinical datasets analyzed with the ball-and-sticks model, and our framework’s data-adaptive extensions are potentially useful for general multi-compartment image processing. We experimentally evaluate our approach with both synthetic data from computational phantoms and in vivo clinical data from human subjects. With synthetic data experiments, we evaluate performance based on errors in fiber orientation, volume fraction, compartment count, and tractography-based connectivity. With in vivo data experiments, we first show improved scan-rescan reproducibility and reliability of quantitative fiber bundle metrics, including mean length, volume, streamline count, and mean volume fraction. We then demonstrate the creation of a multi-fiber tractography atlas from a population of 80 human subjects. In comparison to single tensor atlasing, our multi-fiber atlas shows more complete features of known fiber bundles and includes reconstructions of the lateral projections of the corpus callosum and complex fronto-parietal connections of the superior longitudinal fasciculus I, II, and III. PMID:26691524
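A hedged sketch of the core estimation step: a Gaussian spatial kernel weights neighboring fiber orientations, and the antipodal (sign) symmetry of orientations is handled by averaging dyadic tensors and taking the principal eigenvector. This is a simplified stand-in for the paper's estimator, without volume fractions or model selection:

```python
import numpy as np

def kernel_regress_orientation(positions, orientations, query, bandwidth=1.0):
    """Representative fiber orientation at `query` by Gaussian-kernel
    weighting of neighboring unit vectors. Averaging outer products
    (dyadic tensors) makes the estimate invariant to vector sign."""
    d2 = np.sum((positions - query) ** 2, axis=1)
    w = np.exp(-0.5 * d2 / bandwidth ** 2)
    T = np.einsum('n,ni,nj->ij', w / w.sum(), orientations, orientations)
    vals, vecs = np.linalg.eigh(T)
    return vecs[:, -1]   # eigenvector of the largest eigenvalue

# toy usage: 20 fibers all along +/- x; the estimate recovers the x axis
pos = np.random.default_rng(0).random((20, 3))
ori = np.tile([1.0, 0.0, 0.0], (20, 1))
print(kernel_regress_orientation(pos, ori, np.array([0.5, 0.5, 0.5])))
```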
Endometrial cancer risk prediction including serum-based biomarkers: results from the EPIC cohort.
Fortner, Renée T; Hüsing, Anika; Kühn, Tilman; Konar, Meric; Overvad, Kim; Tjønneland, Anne; Hansen, Louise; Boutron-Ruault, Marie-Christine; Severi, Gianluca; Fournier, Agnès; Boeing, Heiner; Trichopoulou, Antonia; Benetou, Vasiliki; Orfanos, Philippos; Masala, Giovanna; Agnoli, Claudia; Mattiello, Amalia; Tumino, Rosario; Sacerdote, Carlotta; Bueno-de-Mesquita, H B As; Peeters, Petra H M; Weiderpass, Elisabete; Gram, Inger T; Gavrilyuk, Oxana; Quirós, J Ramón; Maria Huerta, José; Ardanaz, Eva; Larrañaga, Nerea; Lujan-Barroso, Leila; Sánchez-Cantalejo, Emilio; Butt, Salma Tunå; Borgquist, Signe; Idahl, Annika; Lundin, Eva; Khaw, Kay-Tee; Allen, Naomi E; Rinaldi, Sabina; Dossus, Laure; Gunter, Marc; Merritt, Melissa A; Tzoulaki, Ioanna; Riboli, Elio; Kaaks, Rudolf
2017-03-15
Endometrial cancer risk prediction models including lifestyle, anthropometric and reproductive factors have limited discrimination. Adding biomarker data to these models may improve predictive capacity; to our knowledge, this has not been investigated for endometrial cancer. Using a nested case-control study within the European Prospective Investigation into Cancer and Nutrition (EPIC) cohort, we investigated the improvement in discrimination gained by adding serum biomarker concentrations to risk estimates derived from an existing risk prediction model based on epidemiologic factors. Serum concentrations of sex steroid hormones, metabolic markers, growth factors, adipokines and cytokines were evaluated in a step-wise backward selection process; biomarkers were retained at p < 0.157 indicating improvement in the Akaike information criterion (AIC). Improvement in discrimination was assessed using the C-statistic for all biomarkers alone, and change in C-statistic from addition of biomarkers to preexisting absolute risk estimates. We used internal validation with bootstrapping (1000-fold) to adjust for over-fitting. Adiponectin, estrone, interleukin-1 receptor antagonist, tumor necrosis factor-alpha and triglycerides were selected into the model. After accounting for over-fitting, discrimination was improved by 2.0 percentage points when all evaluated biomarkers were included and 1.7 percentage points in the model including the selected biomarkers. Models including etiologic markers on independent pathways and genetic markers may further improve discrimination. © 2016 UICC.
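The internal validation used above follows the standard bootstrap optimism correction for the C-statistic; a minimal sketch, with logistic regression standing in for the actual risk prediction model and all data illustrative:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

def optimism_corrected_auc(X, y, n_boot=1000, seed=0):
    """Apparent C-statistic minus the mean bootstrap optimism
    (bootstrap-model AUC on its own sample minus its AUC on the original)."""
    rng = np.random.default_rng(seed)
    model = LogisticRegression(max_iter=1000).fit(X, y)
    apparent = roc_auc_score(y, model.predict_proba(X)[:, 1])
    optimism = []
    for _ in range(n_boot):
        idx = rng.integers(0, len(y), len(y))
        if len(np.unique(y[idx])) < 2:
            continue   # a bootstrap sample must contain both classes
        m = LogisticRegression(max_iter=1000).fit(X[idx], y[idx])
        auc_boot = roc_auc_score(y[idx], m.predict_proba(X[idx])[:, 1])
        auc_orig = roc_auc_score(y, m.predict_proba(X)[:, 1])
        optimism.append(auc_boot - auc_orig)
    return apparent - np.mean(optimism)

# illustrative data only
X = np.random.default_rng(1).normal(size=(200, 3))
y = (X[:, 0] + np.random.default_rng(2).normal(size=200) > 0).astype(int)
print(optimism_corrected_auc(X, y, n_boot=100))
```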
NASA Astrophysics Data System (ADS)
Zhang, Jie; Nixon, Andrew; Barber, Tom; Budyn, Nicolas; Bevan, Rhodri; Croxford, Anthony; Wilcox, Paul
2018-04-01
In this paper, a methodology for using a finite element (FE) model to validate a ray-based model in the simulation of full matrix capture (FMC) ultrasonic array data sets is proposed. The overall aim is to separate signal contributions from different interactions in the FE results, making it easier to compare each individual component with the ray-based model results. This is achieved by combining the results from multiple FE models of the system of interest that include progressively more geometrical features while preserving the same mesh structure. It is shown that the proposed technique allows the interactions from a large number of different ray paths to be isolated in the FE results and compared directly to the results from a ray-based forward model.
Modeling wildlife populations with HexSim
HexSim is a framework for constructing spatially-explicit, individual-based computer models designed for simulating terrestrial wildlife population dynamics and interactions. HexSim is useful for a broad set of modeling applications including population viability analysis for on...
Long-term hydrological simulation based on the Soil Conservation Service curve number
NASA Astrophysics Data System (ADS)
Mishra, Surendra Kumar; Singh, Vijay P.
2004-05-01
Presenting a critical review of daily flow simulation models based on the Soil Conservation Service curve number (SCS-CN), this paper introduces a more versatile model based on the modified SCS-CN method, which specializes into seven cases. The proposed model was applied to the Hemavati watershed (area = 600 km²) in India and was found to yield satisfactory results in both calibration and validation. The model conserved monthly and annual runoff volumes satisfactorily. A sensitivity analysis of the model parameters was performed, including the effect of variation in storm duration. Finally, to investigate the model components, all seven variants of the modified version were tested for their suitability.
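For orientation, the base SCS-CN relation that the modified method builds on can be stated in a few lines; the seven special cases of the proposed model are not reproduced here:

```python
def scs_cn_runoff(p_mm, cn, lam=0.2):
    """Classical SCS-CN event runoff (mm): Q = (P - Ia)^2 / (P - Ia + S),
    with potential retention S = 25400/CN - 254 (mm) and initial
    abstraction Ia = lam * S; no runoff until rainfall exceeds Ia."""
    s = 25400.0 / cn - 254.0
    ia = lam * s
    return 0.0 if p_mm <= ia else (p_mm - ia) ** 2 / (p_mm - ia + s)

# e.g., 50 mm storm on a watershed with CN = 75
print(scs_cn_runoff(50.0, cn=75))
```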
System Model for MEMS based Laser Ultrasonic Receiver
NASA Technical Reports Server (NTRS)
Wilson, William C.
2002-01-01
A need has been identified for more advanced nondestructive evaluation technologies for assuring the integrity of airframe structures, wiring, etc. Laser ultrasonic inspection instruments have been shown to detect flaws in structures. However, these instruments are generally too bulky to be used in the confined spaces that are typical of aerospace vehicles. Microsystems technology is one key to reducing the size of current instruments and enabling increased inspection coverage in areas that were previously inaccessible due to instrument size and weight. This paper investigates the system modeling of a Micro OptoElectroMechanical System (MOEMS) based laser ultrasonic receiver. The system model is constructed in software using MATLAB's dynamical simulator, Simulink. The optical components are modeled using geometrical matrix methods and include some image processing. The system model includes a test bench which simulates input stimuli and models the behavior of the material under test.
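The geometrical matrix methods mentioned above are ray-transfer (ABCD) matrices, in which each optical element multiplies a [height; angle] ray vector. A minimal sketch of a thin lens followed by free-space propagation; the element values are illustrative, not the instrument's:

```python
import numpy as np

def free_space(d):    # propagation over distance d (m)
    return np.array([[1.0, d], [0.0, 1.0]])

def thin_lens(f):     # thin lens of focal length f (m)
    return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

# ray [height; angle] traced through a 50 mm lens then 50 mm of free space;
# matrices compose right-to-left along the beam path
ray_in = np.array([1.0e-3, 0.0])            # 1 mm off axis, parallel to axis
system = free_space(0.05) @ thin_lens(0.05)
print(system @ ray_in)                      # height 0: focus at the focal plane
```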
NASA Technical Reports Server (NTRS)
Cohen, Gerald C. (Inventor); McMann, Catherine M. (Inventor)
1991-01-01
An improved method and system for automatically generating reliability models for use with a reliability evaluation tool is described. The reliability model generator of the present invention includes means for storing a plurality of low level reliability models which represent the reliability characteristics for low level system components. In addition, the present invention includes means for defining the interconnection of the low level reliability models via a system architecture description. In accordance with the principles of the present invention, a reliability model for the entire system is automatically generated by aggregating the low level reliability models based on the system architecture description.
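A hedged sketch of the aggregation idea: low-level component reliabilities (here simple constant-failure-rate models) are combined according to the series/parallel structure given by an architecture description. The architecture and numbers are invented for illustration, not taken from the patent:

```python
import math

def series(*r):    # system works only if every component works
    p = 1.0
    for x in r:
        p *= x
    return p

def parallel(*r):  # redundant components: fails only if all fail
    q = 1.0
    for x in r:
        q *= (1.0 - x)
    return 1.0 - q

def exp_component(lam, t_hours):  # low-level model: constant failure rate
    return math.exp(-lam * t_hours)

# architecture: two redundant sensors in series with one processor, t = 1000 h
r_sys = series(parallel(exp_component(1e-4, 1000), exp_component(1e-4, 1000)),
               exp_component(5e-5, 1000))
print(r_sys)
```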
Agent-Based Modeling in Public Health: Current Applications and Future Directions.
Tracy, Melissa; Cerdá, Magdalena; Keyes, Katherine M
2018-04-01
Agent-based modeling is a computational approach in which agents with a specified set of characteristics interact with each other and with their environment according to predefined rules. We review key areas in public health where agent-based modeling has been adopted, including both communicable and noncommunicable disease, health behaviors, and social epidemiology. We also describe the main strengths and limitations of this approach for questions with public health relevance. Finally, we describe both methodologic and substantive future directions that we believe will enhance the value of agent-based modeling for public health. In particular, advances in model validation, comparisons with other causal modeling procedures, and the expansion of the models to consider comorbidity and joint influences more systematically will improve the utility of this approach to inform public health research, practice, and policy.
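A minimal sketch of the approach for a communicable-disease question: agents with S/I/R states interact with random contacts according to predefined transmission and recovery rules. All parameters are illustrative assumptions:

```python
import random

def abm_sir(n=500, contacts=8, p_transmit=0.05, p_recover=0.1,
            steps=100, seed=1):
    """Minimal agent-based SIR: each step, infectious agents meet random
    contacts, transmit with fixed probability, then may recover."""
    random.seed(seed)
    state = ['S'] * n
    state[0] = 'I'                       # one seed infection
    for _ in range(steps):
        infectious = [i for i, s in enumerate(state) if s == 'I']
        for i in infectious:
            for j in random.sample(range(n), contacts):
                if state[j] == 'S' and random.random() < p_transmit:
                    state[j] = 'I'
            if random.random() < p_recover:
                state[i] = 'R'
    return {s: state.count(s) for s in 'SIR'}

print(abm_sir())
```

Real applications replace the random mixing with structured contact networks or environments, which is where this approach gains its value over compartmental models.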
Event-driven simulation in SELMON: An overview of EDSE
NASA Technical Reports Server (NTRS)
Rouquette, Nicolas F.; Chien, Steve A.; Charest, Leonard, Jr.
1992-01-01
EDSE (event-driven simulation engine), a model-based event-driven simulator implemented for SELMON, a tool for sensor selection and anomaly detection in real-time monitoring, is described. The simulator is used in conjunction with a causal model to predict future behavior of the model from observed data. The behavior of the causal model is interpreted as equivalent to the behavior of the physical system being modeled. An overview of the functionality of the simulator and the model-based event-driven simulation paradigm on which it is based is provided. Included are high-level descriptions of the following key properties: event consumption and event creation, iterative simulation, and synchronization and filtering of monitoring data from the physical system. Finally, how EDSE stands with respect to the relevant open issues of discrete-event and model-based simulation is discussed.
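The core of any such engine is a time-ordered event queue in which consuming one event may create others. The toy loop below shows only that structure; the event names and follow-up rules are hypothetical and nothing here reproduces EDSE itself.

```python
# Toy event-driven simulation loop: events are consumed in time order,
# and consuming one event may schedule (create) new ones.
import heapq

queue = []  # min-heap of (time, event_name)

def schedule(t, name):
    heapq.heappush(queue, (t, name))

schedule(0.0, "sensor_reading")
schedule(1.5, "valve_open")

T_END = 10.0
while queue:
    t, name = heapq.heappop(queue)      # event consumption
    if t > T_END:
        break
    print(f"t={t:4.1f}  consuming event: {name}")
    if name == "valve_open":            # event-creation rules
        schedule(t + 2.0, "pressure_update")
    elif name == "pressure_update":
        schedule(t + 2.0, "sensor_reading")
```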
New model performance index for engineering design of control systems
NASA Technical Reports Server (NTRS)
1970-01-01
Performance index includes a model representing linear control-system design specifications. Based on a geometric criterion for approximation of the model by the actual system, the index can be interpreted directly in terms of the desired system response model without actually having the model's time response.
A Mathematical Model of a Simple Amplifier Using a Ferroelectric Transistor
NASA Technical Reports Server (NTRS)
Sayyah, Rana; Hunt, Mitchell; MacLeod, Todd C.; Ho, Fat D.
2009-01-01
This paper presents a mathematical model characterizing the behavior of a simple amplifier using a FeFET. The model is based on empirical data and incorporates several variables that affect the output, including frequency, load resistance, and gate-to-source voltage. Since the amplifier is the basis of many circuit configurations, a mathematical model that describes the behavior of a FeFET-based amplifier will help in the integration of FeFETs into many other circuits.
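Empirical models of this kind are typically built by regressing the measured output on the cited operating variables. The sketch below fits a linear surrogate to synthetic data; the coefficients, units, and functional form are placeholders and are not those of the paper's FeFET model.

```python
# Hedged sketch: fit amplifier gain as a linear function of the variables
# the abstract names (frequency, load resistance, gate-to-source voltage).
# All data below are synthetic.
import numpy as np

rng = np.random.default_rng(0)
n = 40
freq = rng.uniform(1e3, 1e5, n)        # Hz
r_load = rng.uniform(1e3, 1e4, n)      # ohms
v_gs = rng.uniform(0.5, 2.0, n)        # volts
gain = 2.0 + 1e-5 * freq + 3e-4 * r_load - 0.8 * v_gs + rng.normal(0, 0.1, n)

X = np.column_stack([np.ones(n), freq, r_load, v_gs])  # design matrix
coef, *_ = np.linalg.lstsq(X, gain, rcond=None)        # least squares
print("fitted coefficients:", coef)
```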
Multivariable Parametric Cost Model for Ground Optical Telescope Assembly
NASA Technical Reports Server (NTRS)
Stahl, H. Philip; Rowell, Ginger Holmes; Reese, Gayle; Byberg, Alicia
2005-01-01
A parametric cost model for ground-based telescopes is developed using multivariable statistical analysis of both engineering and performance parameters. While diameter continues to be the dominant cost driver, diffraction-limited wavelength is found to be a secondary driver. Other parameters such as radius of curvature are examined. The model includes an explicit factor for primary mirror segmentation and/or duplication (i.e., multi-telescope phased-array systems). Additionally, single variable models based on aperture diameter are derived.
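Single-variable models of this type usually take the power-law form cost = a * D^b, fit in log-log space. The snippet below shows the mechanics on made-up diameter/cost pairs; neither the data nor the fitted exponent comes from the paper.

```python
# Power-law cost model fit in log-log space: log(cost) = log(a) + b*log(D).
import numpy as np

diameter = np.array([1.0, 2.4, 4.0, 8.0, 10.0])   # m, hypothetical
cost = np.array([0.8, 6.0, 25.0, 160.0, 290.0])   # $M, hypothetical

b, log_a = np.polyfit(np.log(diameter), np.log(cost), 1)
print(f"cost ~ {np.exp(log_a):.2f} * D^{b:.2f}")
```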
An Evaluation of Three Approximate Item Response Theory Models for Equating Test Scores.
ERIC Educational Resources Information Center
Marco, Gary L.; And Others
Three item response models were evaluated for estimating item parameters and equating test scores. The models, which approximated the traditional three-parameter model, included: (1) the Rasch one-parameter model, operationalized in the BICAL computer program; (2) an approximate three-parameter logistic model based on coarse group data divided…
ERIC Educational Resources Information Center
Daniel, Esther Gnanamalar Sarojini; Saat, Rohaida Mohd.
2001-01-01
Introduces a learning module integrating three disciplines--physics, chemistry, and biology--and based on four elements: carbon, oxygen, hydrogen, and silicon. Includes atomic model and silicon-based life activities. (YDS)
Selecting among competing models of electro-optic, infrared camera system range performance
Nichols, Jonathan M.; Hines, James E.; Nichols, James D.
2013-01-01
Range performance is often the key requirement around which electro-optical and infrared camera systems are designed. This work presents an objective framework for evaluating competing range performance models. Model selection based on Akaike's Information Criterion (AIC) is presented for the type of data collected during a typical human observer and target identification experiment. These methods are then demonstrated on observer responses to both visible and infrared imagery in which one of three maritime targets was placed at various ranges. We compare the performance of a number of different models, including those appearing previously in the literature. We conclude that our model-based approach offers substantial improvements over the traditional approach to inference, including increased precision and the ability to make predictions at ranges other than the specific set for which experimental trials were conducted.
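For dichotomous identification data of this kind, AIC compares penalized binomial log-likelihoods across candidate models. A minimal sketch with synthetic correct/total counts and two hypothetical competing models (the constant combinatorial term is omitted since it cancels when comparing models on the same data):

```python
# AIC = 2k - 2 ln L, applied to binomial observer-response data.
import numpy as np

def aic(log_lik, k):
    return 2 * k - 2 * log_lik

correct = np.array([58, 45, 30, 14])   # hits per range bin (synthetic)
trials = np.array([60, 60, 60, 60])

def binom_loglik(p):
    p = np.clip(p, 1e-9, 1 - 1e-9)
    return np.sum(correct * np.log(p) + (trials - correct) * np.log(1 - p))

p_shared = correct.sum() / trials.sum()   # model A: one probability (k=1)
p_each = correct / trials                 # model B: one per range (k=4)

for name, ll, k in [("shared p", binom_loglik(p_shared), 1),
                    ("p per range", binom_loglik(p_each), 4)]:
    print(name, "AIC =", round(aic(ll, k), 1))   # lower is preferred
```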
Geddes, C.A.; Brown, D.G.; Fagre, D.B.
2005-01-01
We derived and implemented two spatial models of May snow water equivalent (SWE) at Lee Ridge in Glacier National Park, Montana. We used the models to test the hypothesis that vegetation structure is a control on snow redistribution at the alpine treeline ecotone (ATE). The statistical models were derived using stepwise and "best" subsets regression techniques. The first model was derived from field measurements of SWE, topography, and vegetation taken at 27 sample points. The second model was derived using GIS-based measures of topography and vegetation. Both the field- (R² = 0.93) and GIS-based models (R² = 0.69) of May SWE included the following variables: site type (based on vegetation), elevation, maximum slope, and general slope aspect. Site type was identified as the most important predictor of SWE in both models, accounting for 74.0% and 29.5% of the variation, respectively. The GIS-based model was applied to create a predictive map of SWE across Lee Ridge, predicting little snow accumulation on the top of the ridge where vegetation is scarce. The GIS model failed in large depressions, including ephemeral stream channels. The models supported the hypothesis that upright vegetation has a positive effect on accumulation of SWE above and beyond the effects of topography. Vegetation, therefore, creates a positive feedback in which it modifies its environment and could affect the ability of additional vegetation to become established.
Perspectives for Electronic Books in the World Wide Web Age.
ERIC Educational Resources Information Center
Bry, Francois; Kraus, Michael
2002-01-01
Discusses the rapid growth of the World Wide Web and the lack of use of electronic books and suggests that specialized contents and device independence can make Web-based books compete with print. Topics include enhancing the hypertext model of XML; client-side adaptation, including browsers and navigation; and semantic modeling. (Author/LRW)
Models of Educational Computing @ Home: New Frontiers for Research on Technology in Learning.
ERIC Educational Resources Information Center
Kafai, Yasmin B.; Fishman, Barry J.; Bruckman, Amy S.; Rockman, Saul
2002-01-01
Discusses models of home educational computing that are linked to learning in school and recommends the need for research that addresses the home as a computer-based learning environment. Topics include a history of research on educational computing at home; technological infrastructure, including software and compatibility; Internet access;…
IMPACT: Integrated Modeling of Perturbations in Atmospheres for Conjunction Tracking
NASA Astrophysics Data System (ADS)
Koller, J.; Brennan, S.; Godinez, H. C.; Higdon, D. M.; Klimenko, A.; Larsen, B.; Lawrence, E.; Linares, R.; McLaughlin, C. A.; Mehta, P. M.; Palmer, D.; Ridley, A. J.; Shoemaker, M.; Sutton, E.; Thompson, D.; Walker, A.; Wohlberg, B.
2013-12-01
Low-Earth orbiting satellites suffer from atmospheric drag due to thermospheric density, which changes by several orders of magnitude, especially during space weather events. Solar flares, precipitating particles, and ionospheric currents cause the upper atmosphere to heat up, redistribute, and cool again. These processes are intrinsically included in empirical models, e.g. MSIS and Jacchia-Bowman type models. However, sensitivity analysis has shown that atmospheric drag has the highest influence on satellite conjunction analysis, and empirical models still do not deliver the desired accuracy. Space debris and collision avoidance have become an increasingly operational reality. It is paramount to accurately predict satellite orbits and to include drag effects driven by space weather. The IMPACT project (Integrated Modeling of Perturbations in Atmospheres for Conjunction Tracking), funded with over $5 million by the Los Alamos Laboratory Directed Research and Development office, has the goal of developing an integrated system of atmospheric drag modeling, orbit propagation, and conjunction analysis with detailed uncertainty quantification to address the space debris and collision avoidance problem. Now over two years into the project, we have developed an integrated solution combining physics-based density modeling of the upper atmosphere between 120-700 km altitude, satellite drag forecasting for quiet and disturbed geomagnetic conditions, and conjunction analysis with non-Gaussian uncertainty quantification. We are employing several novel approaches, including a unique observational sensor developed at Los Alamos; machine learning with a support-vector machine approach to capture the coupling between solar drivers of the upper atmosphere and satellite drag; rigorous data-assimilative modeling using a physics-based approach instead of empirical modeling of the thermosphere; and a computed-tomography method for extracting temporal maps of thermospheric densities using ground-based observations. The developed IMPACT framework is an open research framework enabling the exchange and testing of a variety of atmospheric density models, orbital propagators, drag coefficient models, ground-based observations, etc., and the study of their effects on conjunctions and uncertainty predictions. The framework is based on a modern service-oriented architecture controlled by a web interface and providing 3D visualizations. The goal of this project is to revolutionize the ability to monitor and track space objects during highly disturbed space weather conditions, provide suitable forecasts for satellite drag conditions and conjunction analysis, and enable the exchange of models, codes, and data in an open research environment. We will present capabilities and results of the IMPACT framework, including a demo of the control interface and visualizations.
Simulating cancer growth with multiscale agent-based modeling.
Wang, Zhihui; Butner, Joseph D; Kerketta, Romica; Cristini, Vittorio; Deisboeck, Thomas S
2015-02-01
Many techniques have been developed in recent years to model a variety of cancer behaviors in silico. Agent-based modeling is a specific discrete-based hybrid modeling approach that allows simulating the role of diversity in cell populations as well as within each individual cell; it has therefore become a powerful modeling method widely used by computational cancer researchers. Many aspects of tumor morphology, including phenotype-changing mutations, the adaptation to microenvironment, the process of angiogenesis, the influence of extracellular matrix, reactions to chemotherapy or surgical intervention, the effects of oxygen and nutrient availability, and metastasis and invasion of healthy tissues, have been incorporated and investigated in agent-based models. In this review, we introduce some of the most recent agent-based models that have provided insight into the understanding of cancer growth and invasion, spanning multiple biological scales in time and space, and we further describe several experimentally testable hypotheses generated by those models. We also discuss some of the current challenges of multiscale agent-based cancer models. Copyright © 2014 Elsevier Ltd. All rights reserved.
Modeling of the nearshore marine ecosystem with the AQUATOX model
Process-based models can be used to forecast the responses of coastal ecosystems to changes under future scenarios. However, most models applied to coastal systems do not include higher trophic levels, which are important providers of ecosystem services. AQUATOX is a mechanistic...
Evidence-based Controls for Epidemics Using Spatio-temporal Stochastic Model as a Bayesian Framework
USDA-ARS?s Scientific Manuscript database
The control of highly infectious diseases of agricultural and plantation crops and livestock represents a key challenge in epidemiological and ecological modelling, with implemented control strategies often being controversial. Mathematical models, including the spatio-temporal stochastic models con...
Bond Graph Model of Cerebral Circulation: Toward Clinically Feasible Systemic Blood Flow Simulations
Safaei, Soroush; Blanco, Pablo J.; Müller, Lucas O.; Hellevik, Leif R.; Hunter, Peter J.
2018-01-01
We propose a detailed CellML model of the human cerebral circulation that runs faster than real time on a desktop computer and is designed for use in clinical settings when the speed of response is important. A lumped parameter mathematical model, which is based on a one-dimensional formulation of the flow of an incompressible fluid in distensible vessels, is constructed using a bond graph formulation to ensure mass conservation and energy conservation. The model includes arterial vessels with geometric and anatomical data based on the ADAN circulation model. The peripheral beds are represented by lumped parameter compartments. We compare the hemodynamics predicted by the bond graph formulation of the cerebral circulation with that given by a classical one-dimensional Navier-Stokes model working on top of the whole-body ADAN model. Outputs from the bond graph model, including the pressure and flow signatures and blood volumes, are compared with physiological data. PMID:29551979
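For readers unfamiliar with lumped-parameter hemodynamics, the simplest such compartment is a two-element Windkessel: a compliance C fed by inflow and drained through a resistance R. The sketch below integrates it explicitly; this is a generic textbook construction, not the paper's bond graph model, and the parameter values are only order-of-magnitude physiological.

```python
# Two-element Windkessel compartment: C dp/dt = q_in - p/R.
import numpy as np

C = 7.5e-9    # compliance, m^3/Pa (~1 mL/mmHg)
R = 1.33e8    # peripheral resistance, Pa*s/m^3 (~1 mmHg*s/mL)
dt, T = 1e-3, 5.0
t = np.arange(0.0, T, dt)
q_in = 3e-4 * np.maximum(0.0, np.sin(2 * np.pi * t))  # pulsatile inflow, m^3/s

p = np.zeros_like(t)
p[0] = 1.2e4  # ~90 mmHg in Pa
for i in range(len(t) - 1):
    # explicit Euler step of the compartment's mass-conservation ODE
    p[i + 1] = p[i] + dt * (q_in[i] - p[i] / R) / C

print(f"late-cycle pressure range: {p[-1000:].min():.0f}-{p[-1000:].max():.0f} Pa")
```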
Brayton Power Conversion System Parametric Design Modelling for Nuclear Electric Propulsion
NASA Technical Reports Server (NTRS)
Ashe, Thomas L.; Otting, William D.
1993-01-01
The parametrically based closed Brayton cycle (CBC) computer design model was developed for inclusion in the NASA LeRC overall Nuclear Electric Propulsion (NEP) end-to-end systems model. The code is intended to provide greater depth to the NEP system modeling, which is required to more accurately predict the impact of specific technology on system performance. The CBC model is parametrically based to allow for conducting detailed optimization studies and to provide for easy integration into an overall optimizer driver routine. The power conversion model includes the modeling of the turbines, alternators, compressors, ducting, and heat exchangers (hot-side heat exchanger and recuperator). The code predicts performance in significant detail. The system characteristics determined include estimates of mass, efficiency, and the characteristic dimensions of the major power conversion system components. These characteristics are parametrically modeled as a function of input parameters such as the aerodynamic configuration (axial or radial), turbine inlet temperature, cycle temperature ratio, power level, lifetime, materials, and redundancy.
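Back-of-the-envelope cycle relations of the kind such a parametric model builds on are easy to state. The sketch below uses ideal-gas, ideal-component assumptions; the gas, pressure ratio, and temperatures are illustrative and are not the NEP design code's values.

```python
# Ideal closed Brayton cycle: compression 1->2, heating 2->3, expansion 3->4.
gamma = 1.67             # monatomic working gas (e.g., a He-Xe mix)
r_p = 2.0                # compressor pressure ratio
T1, T3 = 400.0, 1300.0   # compressor inlet / turbine inlet, K

e = (gamma - 1.0) / gamma
T2 = T1 * r_p ** e               # after ideal compression
T4 = T3 / r_p ** e               # after ideal expansion
w_net = (T3 - T4) - (T2 - T1)    # net work per unit (cp * mass flow)
q_in = T3 - T2                   # heat added per unit (cp * mass flow)

print(f"cycle efficiency = {w_net / q_in:.3f}")
print(f"ideal Brayton check: {1 - r_p ** -e:.3f}")   # identical by algebra
```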
Creating High Reliability in Health Care Organizations
Pronovost, Peter J; Berenholtz, Sean M; Goeschel, Christine A; Needham, Dale M; Sexton, J Bryan; Thompson, David A; Lubomski, Lisa H; Marsteller, Jill A; Makary, Martin A; Hunt, Elizabeth
2006-01-01
Objective The objective of this paper was to present a comprehensive approach to help health care organizations reliably deliver effective interventions. Context Reliability in healthcare translates into using valid rate-based measures. Yet high reliability organizations have proven that the context in which care is delivered, called organizational culture, also has important influences on patient safety. Model for Improvement Our model to improve reliability, which also includes interventions to improve culture, focuses on valid rate-based measures. This model includes (1) identifying evidence-based interventions that improve the outcome, (2) selecting interventions with the most impact on outcomes and converting them to behaviors, (3) developing measures to evaluate reliability, (4) measuring baseline performance, and (5) ensuring patients receive the evidence-based interventions. The comprehensive unit-based safety program (CUSP) is used to improve culture and guide organizations in learning from mistakes that are important, but cannot be measured as rates. Conclusions We present how this model was used in over 100 intensive care units in Michigan to improve culture and eliminate catheter-related blood stream infections—both were accomplished. Our model differs from existing models in that it incorporates efforts to improve a vital component for system redesign (culture); it targets 3 important groups (senior leaders, team leaders, and front line staff); and it facilitates change management (engage, educate, execute, and evaluate) for planned interventions. PMID:16898981
Models, Measurements, and Local Decisions: Assessing and ...
This presentation includes a combination of modeling and measurement results to characterize near-source air quality in Newark, New Jersey, with consideration of how this information could be used to inform decision making to reduce the risk of health impacts. Decisions could include either exposure or emissions reductions and could involve a host of stakeholders, including residents, academics, NGOs, and local and federal agencies. This presentation includes results from the C-PORT modeling system and from a citizen science project in the local area. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.
Chang, Hsien-Yen; Weiner, Jonathan P
2010-01-18
Diagnosis-based risk adjustment is becoming an important issue globally as a result of its implications for payment, high-risk predictive modelling and provider performance assessment. The Taiwanese National Health Insurance (NHI) programme provides universal coverage and maintains a single national computerized claims database, which enables the application of diagnosis-based risk adjustment. However, research regarding risk adjustment is limited. This study aims to examine the performance of the Adjusted Clinical Group (ACG) case-mix system using claims-based diagnosis information from the Taiwanese NHI programme. A random sample of NHI enrollees was selected. Those continuously enrolled in 2002 were included for concurrent analyses (n = 173,234), while those in both 2002 and 2003 were included for prospective analyses (n = 164,562). Health status measures derived from 2002 diagnoses were used to explain the 2002 and 2003 health expenditure. A multivariate linear regression model was adopted after comparing the performance of seven different statistical models. Split-validation was performed in order to avoid overfitting. The performance measures were adjusted R² and mean absolute prediction error of five types of expenditure at individual level, and predictive ratio of total expenditure at group level. The more comprehensive models performed better when used for explaining resource utilization. Adjusted R² of total expenditure in concurrent/prospective analyses were 4.2%/4.4% in the demographic model, 15%/10% in the ACGs or ADGs (Aggregated Diagnosis Group) model, and 40%/22% in the models containing EDCs (Expanded Diagnosis Cluster). When predicting expenditure for groups based on expenditure quintiles, all models underpredicted the highest expenditure group and overpredicted the four other groups. For groups based on morbidity burden, the ACGs model had the best performance overall. Given the widespread availability of claims data and the superior explanatory power of claims-based risk adjustment models over demographics-only models, Taiwan's government should consider using claims-based models for policy-relevant applications. The performance of the ACG case-mix system in Taiwan was comparable to that found in other countries. This suggested that the ACG system could be applied to Taiwan's NHI even though it was originally developed in the USA. Many of the findings in this paper are likely to be relevant to other diagnosis-based risk adjustment methodologies.
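The evaluation recipe used here (fit on one split, report adjusted R² on the other) is easy to show in miniature. The code below uses synthetic predictors as stand-ins; the actual ACG/ADG/EDC risk markers come from a proprietary grouper and are not reproduced.

```python
# Split-validation of a linear expenditure model with adjusted R^2.
import numpy as np

rng = np.random.default_rng(42)
n, k = 10000, 5
X = rng.normal(size=(n, k))                     # stand-in risk markers
y = X @ np.array([0.5, 0.3, 0.0, 0.8, 0.1]) + rng.normal(scale=2.0, size=n)

half = n // 2
Xd, yd = np.column_stack([np.ones(half), X[:half]]), y[:half]       # derivation
Xv, yv = np.column_stack([np.ones(n - half), X[half:]]), y[half:]   # validation

beta, *_ = np.linalg.lstsq(Xd, yd, rcond=None)  # fit on derivation half
resid = yv - Xv @ beta                          # evaluate out of sample
r2 = 1 - resid.var() / yv.var()
adj_r2 = 1 - (1 - r2) * (len(yv) - 1) / (len(yv) - k - 1)
print(f"validated adjusted R^2 = {adj_r2:.3f}")
```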
Julien, Dominic; O'Connor, Kieron; Aardema, Frederick
2016-09-15
The inference-based approach (IBA) postulates that individuals with obsessive-compulsive disorder (OCD) confuse a possibility with reality (inferential confusion) according to specific inductive reasoning devices and act as if this possibility were true. A new treatment modality, the inference-based therapy (IBT), was developed. The aim of this study was to critically review empirical evidence regarding the etiological model, treatment efficacy, and model of change of IBA. A search of the literature was conducted using PsycINFO and Medline. Thirty-four articles were included in the review. The review reveals that intrusive thoughts of non-clinical and OCD individuals may occur in different contexts. There is support for a specific inductive reasoning style in OCD. Inferential confusion is associated with OCD symptoms. There is good evidence that IBT is an efficacious treatment for OCD, including two randomized controlled trials showing that IBT was as efficacious as cognitive-behavior therapy. There is some but limited evidence that the process of change during treatment is coherent with IBA's assumptions. Key premises were investigated in only a few studies. Some of these studies were conducted in non-clinical samples or did not include an anxious control group. IBA's etiological model, treatment modality, and model of change make a significant contribution to the understanding and treatment of OCD. Copyright © 2016 Elsevier B.V. All rights reserved.
Flores-Alsina, Xavier; Comas, Joaquim; Rodriguez-Roda, Ignasi; Gernaey, Krist V; Rosen, Christian
2009-10-01
The main objective of this paper is to demonstrate how including the occurrence of filamentous bulking sludge in a secondary clarifier model will affect the predicted process performance during the simulation of WWTPs. The IWA Benchmark Simulation Model No. 2 (BSM2) is hereby used as a simulation case study. Practically, the proposed approach includes a risk assessment model based on a knowledge-based decision tree to detect favourable conditions for the development of filamentous bulking sludge. Once such conditions are detected, the settling characteristics of the secondary clarifier model are automatically changed during the simulation by modifying the settling model parameters to mimic the effect of growth of filamentous bacteria. The simulation results demonstrate that including effects of filamentous bulking in the secondary clarifier model results in a more realistic plant performance. Particularly, during the periods when the conditions for the development of filamentous bulking sludge are favourable--leading to poor activated sludge compaction, low return and waste TSS concentrations and difficulties in maintaining the biomass in the aeration basins--a subsequent reduction in overall pollution removal efficiency is observed. Also, a scenario analysis is conducted to examine i) the influence of sludge retention time (SRT), the external recirculation flow rate (Qr) and the air flow rate in the bioreactor (modelled as kLa) as factors promoting bulking sludge, and ii) the effect on the model predictions when the settling properties are changed due to a possible proliferation of filamentous microorganisms. Finally, the potentially adverse effects of certain operational procedures are highlighted, since such effects are normally not considered by state-of-the-art models that do not include microbiology-related solids separation problems.
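Schematically, the coupling described is a knowledge-based rule that flags conditions favouring filamentous growth and degrades a settling-model parameter while the risk persists. In the sketch below the thresholds and the 50% scaling are invented; the nominal maximum settling velocity is a commonly cited Takács-model value, not necessarily the one used in this study.

```python
# Toy decision-tree risk rule coupled to a settling parameter.
def bulking_risk(srt_days, do_mg_l, f_to_m):
    """Long SRT, low dissolved oxygen, low F/M ratio favour filaments."""
    return srt_days > 15 and do_mg_l < 1.0 and f_to_m < 0.15

V0_NOMINAL = 474.0   # max settling velocity, m/d (typical Takacs-style value)

def settling_velocity_param(srt_days, do_mg_l, f_to_m):
    if bulking_risk(srt_days, do_mg_l, f_to_m):
        return 0.5 * V0_NOMINAL   # mimic poor compaction under bulking
    return V0_NOMINAL

print(settling_velocity_param(20, 0.8, 0.1))   # risk detected -> degraded
print(settling_velocity_param(10, 2.0, 0.3))   # normal operation
```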
Shih, Shirley L; Zafonte, Ross; Bates, David W; Gerrard, Paul; Goldstein, Richard; Mix, Jacqueline; Niewczyk, Paulette; Greysen, S Ryan; Kazis, Lewis; Ryan, Colleen M; Schneider, Jeffrey C
2016-10-01
Functional status is associated with patient outcomes, but is rarely included in hospital readmission risk models. The objective of this study was to determine whether functional status is a better predictor of 30-day acute care readmission than traditionally investigated variables including demographics and comorbidities. Retrospective database analysis between 2002 and 2011. 1158 US inpatient rehabilitation facilities. 4,199,002 inpatient rehabilitation facility admissions comprising patients from 16 impairment groups within the Uniform Data System for Medical Rehabilitation database. Logistic regression models predicting 30-day readmission were developed based on age, gender, comorbidities (Elixhauser comorbidity index, Deyo-Charlson comorbidity index, and Medicare comorbidity tier system), and functional status [Functional Independence Measure (FIM)]. We hypothesized that (1) function-based models would outperform demographic- and comorbidity-based models and (2) the addition of demographic and comorbidity data would not significantly enhance function-based models. For each impairment group, Function Only Models were compared against Demographic-Comorbidity Models and Function Plus Models (Function-Demographic-Comorbidity Models). The primary outcome was 30-day readmission, and the primary measure of model performance was the c-statistic. All-cause 30-day readmission rate from inpatient rehabilitation facilities to acute care hospitals was 9.87%. C-statistics for the Function Only Models were 0.64 to 0.70. For all 16 impairment groups, the Function Only Model demonstrated better c-statistics than the Demographic-Comorbidity Models (c-statistic difference: 0.03-0.12). The best-performing Function Plus Models exhibited negligible improvements in model performance compared to Function Only Models, with c-statistic improvements of only 0.01 to 0.05. Readmissions are currently used as a marker of hospital performance, with recent financial penalties to hospitals for excessive readmissions. Function-based readmission models outperform models based only on demographics and comorbidities. Readmission risk models would benefit from the inclusion of functional status as a primary predictor. Copyright © 2016 AMDA – The Society for Post-Acute and Long-Term Care Medicine. Published by Elsevier Inc. All rights reserved.
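The c-statistic reported above is the area under the ROC curve: for a binary outcome it equals the probability that a readmitted patient receives a higher predicted risk than a non-readmitted one. A numpy-only sketch of that computation on synthetic scores follows (no FIM or registry data are involved).

```python
# c-statistic (ROC AUC) via the Mann-Whitney rank relation.
import numpy as np

rng = np.random.default_rng(7)
y = rng.integers(0, 2, 2000)                # 1 = readmitted (synthetic)
score = 0.8 * y + rng.normal(size=y.size)   # stand-in model output

def c_statistic(y_true, y_score):
    ranks = np.argsort(np.argsort(y_score)) + 1   # 1-based ranks of scores
    n_pos = int(y_true.sum())
    n_neg = y_true.size - n_pos
    # AUC = (sum of positive ranks - n_pos(n_pos+1)/2) / (n_pos * n_neg)
    return (ranks[y_true == 1].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

print(f"c-statistic = {c_statistic(y, score):.3f}")
```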
Control algorithms and applications of the wavefront sensorless adaptive optics
NASA Astrophysics Data System (ADS)
Ma, Liang; Wang, Bin; Zhou, Yuanshen; Yang, Huizhen
2017-10-01
Compared with the conventional adaptive optics (AO) system, the wavefront sensorless (WFSless) AO system does not need to measure and reconstruct the wavefront. It is simpler than conventional AO in system architecture and can be applied under complex conditions. Based on an analysis of the principle and system model of the WFSless AO system, wavefront correction methods were divided into two categories: model-free and model-based control algorithms. A WFSless AO system based on model-free control algorithms commonly treats the performance metric as a function of the control parameters and then uses a suitable optimization algorithm to improve the metric. The model-based control algorithms include modal control algorithms, nonlinear control algorithms, and control algorithms based on geometrical optics. Following a brief description of these typical control algorithms, hybrid methods combining model-free with model-based control algorithms were summarized. Additionally, the characteristics of the various control algorithms were compared and analyzed. We also discussed the extensive applications of WFSless AO systems in free space optical communication (FSO), retinal imaging in the human eye, confocal microscopy, coherent beam combination (CBC) techniques, and imaging of extended objects.
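A standard model-free method in this family is stochastic parallel gradient descent (SPGD), which perturbs all control channels at once and updates them in proportion to the measured change in the metric. Below is a toy simulation with a quadratic stand-in metric; the gains, channel count, and simulated "aberration" are all invented.

```python
# SPGD: climb a measured metric J(u) with no wavefront sensor.
import numpy as np

rng = np.random.default_rng(0)
u = np.zeros(32)                      # deformable-mirror control signals
u_opt = rng.normal(size=32)           # unknown optimum (simulated aberration)

def metric(u):
    # stand-in for a measured metric, e.g. focal-spot intensity
    return -np.sum((u - u_opt) ** 2)

gain, sigma = 0.3, 0.05
for _ in range(2000):
    du = sigma * rng.choice([-1.0, 1.0], size=u.size)   # parallel perturbation
    dJ = metric(u + du) - metric(u - du)                # two-sided probe
    u += gain * dJ * du                                 # SPGD update

print("residual error:", np.linalg.norm(u - u_opt))     # should be small
```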
Integrated Planning Model (IPM) Base Case v.4.10
Learn about EPA's IPM Base Case v.4.10, including Proposed Transport Rule results, documentation, the National Electric Energy Data System (NEEDS) database and user's guide, and run results using previous base cases.
Jaime-González, Carlos; Acebes, Pablo; Mateos, Ana; Mezquida, Eduardo T
2017-01-01
LiDAR technology has firmly contributed to strengthen the knowledge of habitat structure-wildlife relationships, though there is an evident bias towards flying vertebrates. To bridge this gap, we investigated and compared the performance of LiDAR and field data to model habitat preferences of wood mouse (Apodemus sylvaticus) in a Mediterranean high mountain pine forest (Pinus sylvestris). We recorded nine field and 13 LiDAR variables that were summarized by means of Principal Component Analyses (PCA). We then analyzed wood mouse's habitat preferences using three different models based on: (i) field PCs predictors, (ii) LiDAR PCs predictors; and (iii) both set of predictors in a combined model, including a variance partitioning analysis. Elevation was also included as a predictor in the three models. Our results indicate that LiDAR derived variables were better predictors than field-based variables. The model combining both data sets slightly improved the predictive power of the model. Field derived variables indicated that wood mouse was positively influenced by the gradient of increasing shrub cover and negatively affected by elevation. Regarding LiDAR data, two LiDAR PCs, i.e. gradients in canopy openness and complexity in forest vertical structure positively influenced wood mouse, although elevation interacted negatively with the complexity in vertical structure, indicating wood mouse's preferences for plots with lower elevations but with complex forest vertical structure. The combined model was similar to the LiDAR-based model and included the gradient of shrub cover measured in the field. Variance partitioning showed that LiDAR-based variables, together with elevation, were the most important predictors and that part of the variation explained by shrub cover was shared. LiDAR derived variables were good surrogates of environmental characteristics explaining habitat preferences by the wood mouse. Our LiDAR metrics represented structural features of the forest patch, such as the presence and cover of shrubs, as well as other characteristics likely including time since perturbation, food availability and predation risk. Our results suggest that LiDAR is a promising technology for further exploring habitat preferences by small mammal communities.
NASA Astrophysics Data System (ADS)
Vesselinov, V. V.; Harp, D.
2010-12-01
The process of decision making to protect groundwater resources requires a detailed estimation of uncertainties in model predictions. Various uncertainties associated with modeling a natural system, such as: (1) measurement and computational errors; (2) uncertainties in the conceptual model and model-parameter estimates; (3) simplifications in model setup and numerical representation of governing processes, contribute to the uncertainties in the model predictions. Due to this combination of factors, the sources of predictive uncertainties are generally difficult to quantify individually. Decision support related to optimal design of monitoring networks requires (1) detailed analyses of existing uncertainties related to model predictions of groundwater flow and contaminant transport, (2) optimization of the proposed monitoring network locations in terms of their efficiency to detect contaminants and provide early warning. We apply existing and newly-proposed methods to quantify predictive uncertainties and to optimize well locations. An important aspect of the analysis is the application of newly-developed optimization technique based on coupling of Particle Swarm and Levenberg-Marquardt optimization methods which proved to be robust and computationally efficient. These techniques and algorithms are bundled in a software package called MADS. MADS (Model Analyses for Decision Support) is an object-oriented code that is capable of performing various types of model analyses and supporting model-based decision making. The code can be executed under different computational modes, which include (1) sensitivity analyses (global and local), (2) Monte Carlo analysis, (3) model calibration, (4) parameter estimation, (5) uncertainty quantification, and (6) model selection. The code can be externally coupled with any existing model simulator through integrated modules that read/write input and output files using a set of template and instruction files (consistent with the PEST I/O protocol). MADS can also be internally coupled with a series of built-in analytical simulators. MADS provides functionality to work directly with existing control files developed for the code PEST (Doherty 2009). To perform the computational modes mentioned above, the code utilizes (1) advanced Latin-Hypercube sampling techniques (including Improved Distributed Sampling), (2) various gradient-based Levenberg-Marquardt optimization methods, (3) advanced global optimization methods (including Particle Swarm Optimization), and (4) a selection of alternative objective functions. The code has been successfully applied to perform various model analyses related to environmental management of real contamination sites. Examples include source identification problems, quantification of uncertainty, model calibration, and optimization of monitoring networks. The methodology and software codes are demonstrated using synthetic and real case studies where monitoring networks are optimized taking into account the uncertainty in model predictions of contaminant transport.
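Of the techniques listed, Latin hypercube sampling is the most compact to illustrate: each parameter's range is divided into n strata and each stratum is sampled exactly once. This plain-numpy sketch is generic and is not the MADS implementation.

```python
# Latin hypercube sampling in the unit hypercube [0,1]^d.
import numpy as np

def latin_hypercube(n, d, rng):
    """n samples in [0,1]^d, one sample per stratum per dimension."""
    cut = (np.arange(n) + rng.random((d, n))) / n   # jitter inside strata
    for row in cut:
        rng.shuffle(row)                            # decorrelate dimensions
    return cut.T                                    # shape (n, d)

rng = np.random.default_rng(3)
samples = latin_hypercube(10, 2, rng)
print(np.sort(samples[:, 0]))   # exactly one value per tenth of [0,1]
```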
Liu, Yun-Feng; Fan, Ying-Ying; Dong, Hui-Yue; Zhang, Jian-Xing
2017-12-01
The method used in biomechanical modeling for finite element method (FEM) analysis needs to deliver accurate results. There are currently two solutions used in FEM modeling for biomedical models of human bone from computerized tomography (CT) images: one is based on a triangular mesh and the other is based on a parametric surface model; the latter is more popular in practice. The outline and modeling procedures for the two solutions are compared and analyzed. Using a mandibular bone as an example, several key modeling steps are then discussed in detail, and the FEM calculation was conducted. Numerical calculation results based on the models derived from the two methods, including stress, strain, and displacement, are compared and evaluated in relation to accuracy and validity. Moreover, a comprehensive comparison of the two solutions is presented. The parametric surface-based method is more helpful when using powerful design tools in computer-aided design (CAD) software, but the triangular mesh-based method is more robust and efficient.
Assessing the Performance of a Computer-Based Policy Model of HIV and AIDS
Rydzak, Chara E.; Cotich, Kara L.; Sax, Paul E.; Hsu, Heather E.; Wang, Bingxia; Losina, Elena; Freedberg, Kenneth A.; Weinstein, Milton C.; Goldie, Sue J.
2010-01-01
Background Model-based analyses, conducted within a decision analytic framework, provide a systematic way to combine information about the natural history of disease and effectiveness of clinical management strategies with demographic and epidemiological characteristics of the population. Among the challenges with disease-specific modeling include the need to identify influential assumptions and to assess the face validity and internal consistency of the model. Methods and Findings We describe a series of exercises involved in adapting a computer-based simulation model of HIV disease to the Women's Interagency HIV Study (WIHS) cohort and assess model performance as we re-parameterized the model to address policy questions in the U.S. relevant to HIV-infected women using data from the WIHS. Empiric calibration targets included 24-month survival curves stratified by treatment status and CD4 cell count. The most influential assumptions in untreated women included chronic HIV-associated mortality following an opportunistic infection, and in treated women, the ‘clinical effectiveness’ of HAART and the ability of HAART to prevent HIV complications independent of virologic suppression. Good-fitting parameter sets required reductions in the clinical effectiveness of 1st and 2nd line HAART and improvements in 3rd and 4th line regimens. Projected rates of treatment regimen switching using the calibrated cohort-specific model closely approximated independent analyses published using data from the WIHS. Conclusions The model demonstrated good internal consistency and face validity, and supported cohort heterogeneities that have been reported in the literature. Iterative assessment of model performance can provide information about the relative influence of uncertain assumptions and provide insight into heterogeneities within and between cohorts. Description of calibration exercises can enhance the transparency of disease-specific models. PMID:20844741
NASA Astrophysics Data System (ADS)
Bódi, Erika; Buday, Tamás; McIntosh, Richard William
2013-04-01
Defining extraction-modified flow patterns with hydrodynamic models is a pivotal question in preserving groundwater resources regarding both quality and quantity. Modeling is the first step in groundwater protection, and its main result is the determination of the protective area, which depends on the amount of extracted water. Solid models have significant effects on hydrodynamic models, as the latter are built upon them. Legislative regulations require certain restrictions within protection areas, which has firm consequences for economic activities. Hungarian regulations give no clear instructions for the establishment of either geological or hydrodynamic models; however, modeling itself is an obligation. Choosing the modeling method is a key consideration for further numerical calculations, and it is decisive for the shape and size of the groundwater protection area. The geometry of hydrodynamic model layers is derived from the solid model. There are different geological approaches, including lithological and sequence stratigraphic classifications; furthermore, in the case of regional models, formation-based hydrostratigraphic units are also applicable. Lithological classification is based on assigning and mapping lithotypes. When the geometry (e.g. tectonic characteristics) of the research area is not known, horizontal bedding is assumed, the probability of which cannot be assessed from lithology alone. If the geological correlation is based on sequence stratigraphic studies, the cyclicity of sediment deposition is also considered. This method is more integrated, so numerous parameters (e.g. electrofacies) are taken into consideration when studying the geological conditions, ensuring more reliable modeling. Layers of sequence stratigraphic models can be either lithologically homogeneous or they may include greater cycles of sediments containing several lithological units. The advantage of this is that the modeling can handle pinching-out lithological units and lenticular bodies more easily, while most hydrodynamic software packages cannot handle flow units related to such model layers. The interpretation of tectonic disturbance is similar. In Hungary, groundwater is extracted mainly from Pleistocene and Pannonian aquifers, the sediments of which were deposited in the ancient Pannonian Lake. When the basin lost its open-marine connection, eustasy had no direct effect on facies changes; therefore, subsidence and sediment supply became the main factors. Various basin-filling related facies developed, including alluvial plain facies, different delta facies types, and pelitic deep-basin facies. Creating solid models based on sequence stratigraphic methods requires more raw data, a genetic approach, and more working hours; hence, this method is seldom used in practice. Lithology-based models can be transformed into sequence stratigraphic models by extending the database (e.g. acquiring additional survey data). In environments where the obtained models differ significantly, notable changes can occur in the supply directions and in the groundwater travel times of the two models, even under equal extraction terms. Our study aims to call attention to the consequences of using different solid models for typical depositional systems of the Great Hungarian Plain and to their effects on groundwater protection.
Berg, Marrit L; Cheung, Kei Long; Hiligsmann, Mickaël; Evers, Silvia; de Kinderen, Reina J A; Kulchaitanaroaj, Puttarin; Pokhrel, Subhash
2017-06-01
To identify different types of models used in economic evaluations of smoking cessation, analyse the quality of the included models by examining their attributes, and ascertain their transferability to a new context. A systematic review of the literature on the economic evaluation of smoking cessation interventions published between 1996 and April 2015, identified via Medline, EMBASE, the National Health Service (NHS) Economic Evaluation Database (NHS EED), and Health Technology Assessment (HTA). Study quality and transferability were scored with checklists based on the European Network of Health Economic Evaluation Databases (EURONHEED) criteria. Studies were excluded if they were not in smoking cessation, not original research, not a model-based economic evaluation, did not consider an adult population, or were not from a high-income country. Among the 64 economic evaluations included in the review, the state-transition Markov model was the most frequently used method (n = 30/64), with quality-adjusted life years (QALY) being the most frequently used outcome measure over a lifetime horizon. A small number of the included studies (13 of 64) were eligible for the EURONHEED transferability checklist. The overall transferability scores ranged from 0.50 to 0.97, with an average score of 0.75. The average score per section was 0.69 (range = 0.35-0.92). The relative transferability of the studies could not be established due to a limitation present in the EURONHEED method. All existing economic evaluations in smoking cessation lack one or more key study attributes necessary to be fully transferable to a new context. © 2017 The Authors. Addiction published by John Wiley & Sons Ltd on behalf of Society for the Study of Addiction.
Geographic information system/watershed model interface
Fisher, Gary T.
1989-01-01
Geographic information systems allow for the interactive analysis of spatial data related to water-resources investigations. A conceptual design for an interface between a geographic information system and a watershed model includes functions for the estimation of model parameter values. Design criteria include ease of use, minimal equipment requirements, a generic data-base management system, and use of a macro language. An application is demonstrated for a 90.1-square-kilometer subbasin of the Patuxent River near Unity, Maryland, in which watershed parameters for hydrologic modeling are derived automatically.
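As a concrete (and entirely hypothetical) example of such a parameter-estimation function, the sketch below computes an area-weighted SCS curve number from land-use polygons of the kind a GIS overlay would produce; the class names, areas, and CN values are invented.

```python
# Area-weighted curve number from GIS-derived land-use areas.
land_use = [  # (land-use class, area in km^2, curve number)
    ("forest",      40.2, 55),
    ("pasture",     30.5, 69),
    ("residential", 19.4, 85),
]

total_area = sum(area for _, area, _ in land_use)
cn_weighted = sum(area * cn for _, area, cn in land_use) / total_area
print(f"area-weighted CN = {cn_weighted:.1f}")
```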
Multi-fluid CFD analysis in Process Engineering
NASA Astrophysics Data System (ADS)
Hjertager, B. H.
2017-12-01
An overview of modelling and simulation of flow processes in gas/particle and gas/liquid systems are presented. Particular emphasis is given to computational fluid dynamics (CFD) models that use the multi-dimensional multi-fluid techniques. Turbulence modelling strategies for gas/particle flows based on the kinetic theory for granular flows are given. Sub models for the interfacial transfer processes and chemical kinetics modelling are presented. Examples are shown for some gas/particle systems including flow and chemical reaction in risers as well as gas/liquid systems including bubble columns and stirred tanks.
Clements, Margaret; Aber, J Lawrence; Seidman, Edward
2008-01-01
Structural equation modeling was used to compare 6 competing theoretically based psychosocial models of the longitudinal association between life stressors and depressive symptoms in a sample of early adolescents (N= 907; 40% Hispanic, 32% Black, and 19% White; mean age at Time 1 = 11.4 years). Only two models fit the data, both of which included paths modeling the effect of depressive symptoms on stressors recall: The mood-congruent cognitive bias model included only depressive symptoms to life stressors paths (DS-->S), whereas the fully transactional model included paths representing both the DS-->S and stressors to depressive symptoms (S-->DS) effects. Social causation models and the stress generation model did not fit the data. Findings demonstrate the importance of accounting for mood-congruent cognitive bias in stressors-depressive symptoms investigations.
Roeder, Ingo; Herberg, Maria; Horn, Matthias
2009-04-01
Previously, we have modeled hematopoietic stem cell organization by a stochastic, single cell-based approach. Applications to different experimental systems demonstrated that this model consistently explains a broad variety of in vivo and in vitro data. A major advantage of the agent-based model (ABM) is the representation of heterogeneity within the hematopoietic stem cell population. However, this advantage comes at the price of time-consuming simulations if the systems become large. One example in this respect is the modeling of disease and treatment dynamics in patients with chronic myeloid leukemia (CML), where the realistic number of individual cells to be considered exceeds 10⁶. To overcome this deficiency, without losing the representation of the inherent heterogeneity of the stem cell population, we here propose to approximate the ABM by a system of partial differential equations (PDEs). The major benefit of such an approach is its independence from the size of the system. Although this mean field approach includes a number of simplifying assumptions compared to the ABM, it retains the key structure of the model including the "age"-structure of stem cells. We show that the PDE model qualitatively and quantitatively reproduces the results of the agent-based approach.
Wang, Jian-Li; Yuan, Zi-Gang; Qian, Guo-Liang; Bao, Wu-Qiao; Jin, Guo-Liang
2018-06-01
The study aimed to develop simulation models including intracranial aneurysmal and parent vessel geometries, as well as vascular branches, through 3D printing technology. The simulation models focused on the benefits of aneurysmal treatments and clinical education. This prospective study included 13 consecutive patients who suffered from intracranial aneurysms confirmed by digital subtraction angiography (DSA) in the Neurosurgery Department of Shaoxing People's Hospital. The original 3D-DSA image data were extracted through the picture archiving and communication system and imported into Mimics. After reconstructing and transforming to Binary STL format, the simulation models of the hollow vascular tree were printed using 3D devices. The intracranial aneurysm 3D printing simulation model was developed based on DSA to assist neurosurgeons in aneurysmal treatments and residency training. Seven neurosurgical residents and 15 standardization training residents received their simulation model training and gave high assessments for the educational course with the follow-up qualitative questionnaire. 3D printed simulation models based on DSA can perfectly reveal target aneurysms and help neurosurgeons select therapeutic strategies precisely. As an educational tool, the 3D aneurysm vascular simulation model is useful for training residents.
Adaptation of Acoustic Model Experiments of STM via Smartphones and Tablets
ERIC Educational Resources Information Center
Thees, Michael; Hochberg, Katrin; Kuhn, Jochen; Aeschlimann, Martin
2017-01-01
The importance of Scanning Tunneling Microscopy (STM) in today's research and industry leads to the question of how to include such a key technology in physics education. Manfred Euler has developed an acoustic model experiment to illustrate the fundamental measuring principles based on an analogy between quantum mechanics and acoustics. Based on…
ERIC Educational Resources Information Center
Bergner, Yoav; Droschler, Stefan; Kortemeyer, Gerd; Rayyan, Saif; Seaton, Daniel; Pritchard, David E.
2012-01-01
We apply collaborative filtering (CF) to dichotomously scored student response data (right, wrong, or no interaction), finding optimal parameters for each student and item based on cross-validated prediction accuracy. The approach is naturally suited to comparing different models, both unidimensional and multidimensional in ability, including a…
A Four-Stage Model for Planning Computer-Based Instruction.
ERIC Educational Resources Information Center
Morrison, Gary R.; Ross, Steven M.
1988-01-01
Describes a flexible planning process for developing computer based instruction (CBI) in which the CBI design is implemented on paper between the lesson design and the program production. A four-stage model is explained, including (1) an initial flowchart, (2) storyboards, (3) a detailed flowchart, and (4) an evaluation. (16 references)…
Not Funding the Evidence-Based Model in Ohio
ERIC Educational Resources Information Center
Edlefson, Carla
2010-01-01
The purpose of this descriptive case study was to describe the implementation of Ohio's version of the Evidence-Based Model (OEBM) state school finance system in 2009. Data sources included state budget documents and analyses as well as interviews with local school officials. The new system was responsive to three policy objectives ordered by the…
Toward a Predictive Model of Arctic Coastal Retreat in a Warming Climate, Beaufort Sea, Alaska
2011-09-30
…level by waves and surge and tide. Melt rate is governed by an empirically based iceberg melting algorithm that includes explicitly the roles of wave…
Related publication: Thermal erosion of a permafrost coastline: Improving process-based models using time-lapse photography, Arctic, Antarctic, and Alpine Research 43(3): 474.
ERIC Educational Resources Information Center
Walker, Patricia Hinton
1994-01-01
The University of Rochester's community nursing center is an entrepreneurial model for faculty practice based on sound business principles to enhance financial success. These principles include development and pricing of the product of nursing services, consumer dialogue instead of advertising monologue, and a diversified income base. (SK)
Pollux: Enhancing the Quality of Service of the Global Information Grid (GIG)
2009-06-01
and throughput of standards-based and/or COTS-based QoS-enabled pub/sub technologies, including DDS, JMS, Web Services, and CORBA. 2. The DDS QoS…of service pICKER (QUICKER) model-driven engineering (MDE) toolchain shown in Figure 8. QUICKER extends the Platform-Independent Component Modeling
Incorporating Solid Modeling and Team-Based Design into Freshman Engineering Graphics.
ERIC Educational Resources Information Center
Buchal, Ralph O.
2001-01-01
Describes the integration of these topics through a major team-based design and computer aided design (CAD) modeling project in freshman engineering graphics at the University of Western Ontario. Involves n=250 students working in teams of four to design and document an original Lego toy. Includes 12 references. (Author/YDS)
Enhancing Retrieval with Hyperlinks: A General Model Based on Propositional Argumentation Systems.
ERIC Educational Resources Information Center
Picard, Justin; Savoy, Jacques
2003-01-01
Discusses the use of hyperlinks for improving information retrieval on the World Wide Web and proposes a general model for using hyperlinks based on Probabilistic Argumentation Systems. Topics include propositional logic, knowledge, and uncertainty; assumptions; using hyperlinks to modify document score and rank; and estimating the popularity of a…
A Model for Art Therapists in Community-Based Practice
ERIC Educational Resources Information Center
Ottemiller, Dylan D.; Awais, Yasmine J.
2016-01-01
With growing trends toward preventative, community-based health care, art therapists must expand their scope of practice beyond the medical model and individual psychodynamics in order to serve, include, and empower those in need. In this article the authors review literature that illustrates the unique qualities art therapists can contribute to…
NASA Technical Reports Server (NTRS)
1985-01-01
Topics addressed include: assessment models; model predictions of ozone changes; ozone and temperature trends; trace gas effects on climate; kinetics and photochemical data base; spectroscopic data base (infrared to microwave); instrument intercomparisons and assessments; and monthly mean distribution of ozone and temperature.
Vella, Michael; Cannon, Robert C; Crook, Sharon; Davison, Andrew P; Ganapathy, Gautham; Robinson, Hugh P C; Silver, R Angus; Gleeson, Padraig
2014-01-01
NeuroML is an XML-based model description language, which provides a powerful common data format for defining and exchanging models of neurons and neuronal networks. In the latest version of NeuroML, the structure and behavior of ion channel, synapse, cell, and network model descriptions are based on underlying definitions provided in LEMS, a domain-independent language for expressing hierarchical mathematical models of physical entities. While declarative approaches for describing models have led to greater exchange of model elements among software tools in computational neuroscience, a frequent criticism of XML-based languages is that they are difficult to work with directly. Here we describe two Application Programming Interfaces (APIs) written in Python (http://www.python.org), which simplify the process of developing and modifying models expressed in NeuroML and LEMS. The libNeuroML API provides a Python object model with a direct mapping to all NeuroML concepts defined by the NeuroML Schema, which facilitates reading and writing the XML equivalents. In addition, it offers a memory-efficient, array-based internal representation, which is useful for handling large-scale connectomics data. The libNeuroML API also includes support for performing common operations that are required when working with NeuroML documents. Access to the LEMS data model is provided by the PyLEMS API, which provides a Python implementation of the LEMS language, including the ability to simulate most models expressed in LEMS. Together, libNeuroML and PyLEMS provide a comprehensive solution for interacting with NeuroML models in a Python environment.
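A short usage sketch in the style of the published libNeuroML examples is given below (pip install libNeuroML). The attribute names follow the NeuroML v2 schema as exposed by the library's generated object model; treat them as indicative rather than authoritative for any particular library version.

```python
# Build a tiny NeuroML model with libNeuroML and serialize it to XML.
from neuroml import NeuroMLDocument, IzhikevichCell, Network, Population
import neuroml.writers as writers

doc = NeuroMLDocument(id="IzhDoc")

# one Izhikevich cell definition...
izh = IzhikevichCell(id="izh2003", v0="-70mV", thresh="30mV",
                     a="0.02", b="0.2", c="-65.0", d="6")
doc.izhikevich_cells.append(izh)

# ...instantiated five times in a small network
net = Network(id="net0")
doc.networks.append(net)
net.populations.append(Population(id="pop0", component=izh.id, size=5))

# write the XML representation discussed in the abstract above
writers.NeuroMLWriter.write(doc, "izh_net.nml")
```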
Multiple sparse volumetric priors for distributed EEG source reconstruction.
Strobbe, Gregor; van Mierlo, Pieter; De Vos, Maarten; Mijović, Bogdan; Hallez, Hans; Van Huffel, Sabine; López, José David; Vandenberghe, Stefaan
2014-10-15
We revisit the multiple sparse priors (MSP) algorithm implemented in the statistical parametric mapping software (SPM) for distributed EEG source reconstruction (Friston et al., 2008). In the present implementation, multiple cortical patches are introduced as source priors based on a dipole source space restricted to a cortical surface mesh. In this note, we present a technique to construct volumetric cortical regions to introduce as source priors by restricting the dipole source space to a segmented gray matter layer and using a region-growing approach. This extension allows brain structures beyond the cortical surface to be reconstructed and facilitates the use of more realistic volumetric head models with more layers, such as cerebrospinal fluid (CSF), compared to the standard 3-layered scalp-skull-brain head models. We illustrate the technique with ERP data and anatomical MR images in 12 subjects. Based on the segmented gray matter for each of the subjects, cortical regions were created and introduced as source priors for MSP-inversion assuming two types of head models: the standard 3-layered scalp-skull-brain head models and extended 4-layered head models including CSF. We compared these models with the current implementation by assessing the free energy corresponding to each of the reconstructions, using Bayesian model selection for group studies. Strong evidence was found in favor of the volumetric MSP approach compared to the MSP approach based on cortical patches for both types of head models. Overall, the strongest evidence was found in favor of the volumetric MSP reconstructions based on the extended head models including CSF. These results were verified by comparing the reconstructed activity. The use of volumetric cortical regions as source priors is a useful complement to the present implementation, as it allows more complex head models and volumetric source priors to be introduced in future studies.
Finite-Rate Ablation Boundary Conditions for Carbon-Phenolic Heat-Shield
NASA Technical Reports Server (NTRS)
Chen, Y.-K.; Milos, Frank S.
2003-01-01
A formulation of finite-rate ablation surface boundary conditions, including oxidation, nitridation, and sublimation of carbonaceous material with pyrolysis gas injection, has been developed based on surface species mass conservation. These surface boundary conditions are discretized and integrated with a Navier-Stokes solver. This numerical procedure can predict aerothermal heating, chemical species concentration, and carbonaceous material ablation rate over the heatshield surface of re-entry space vehicles. In this study, the gas-gas and gas-surface interactions are established for air flow over a carbon-phenolic heatshield. Two finite-rate gas-surface interaction models are considered in the present study. The first model is based on the work of Park, and the second model includes the kinetics suggested by Zhluktov and Abe. Nineteen gas phase chemical reactions and four gas-surface interactions are considered in the present model. There is a total of fourteen gas phase chemical species, including five species for air and nine species for ablation products. Three test cases are studied in this paper. The first case is a graphite test model in the arc-jet stream; the second is a lightweight Phenolic Impregnated Carbon Ablator at the Stardust re-entry peak heating conditions, and the third is a fully dense carbon-phenolic heatshield at the peak heating point of a proposed Mars Sample Return Earth Entry Vehicle. Predictions based on both finite-rate gas-surface interaction models are compared with those obtained using B' tables, which were created based on the chemical equilibrium assumption. Stagnation point convective heat fluxes predicted using Park's finite-rate model are far below those obtained from chemical equilibrium B' tables and Zhluktov's model. Recession predictions from Zhluktov's model are generally lower than those obtained from Park's model and chemical equilibrium B' tables. The effect of species mass diffusion on predicted ablation rate is also examined.
Dataset for petroleum based stock markets and GAUSS codes for SAMEM.
Khalifa, Ahmed A A; Bertuccelli, Pietro; Otranto, Edoardo
2017-02-01
This article includes a unique data set of balanced daily (Monday, Tuesday and Wednesday) observations of oil and natural gas volatility and the stock markets of the oil-rich economies Saudi Arabia, Qatar, Kuwait, Abu Dhabi, Dubai, Bahrain and Oman, over the period spanning Oct. 18, 2006-July 30, 2015. Additionally, we have included unique GAUSS codes for estimating the spillover asymmetric multiplicative error model (SAMEM) with application to petroleum-based stock markets. The data, the model and the codes have many applications in business and social science.
[Modeling in value-based medicine].
Neubauer, A S; Hirneiss, C; Kampik, A
2010-03-01
Modeling plays an important role in value-based medicine (VBM). It allows decision support by predicting potential clinical and economic consequences, frequently combining different sources of evidence. Based on relevant publications and examples focusing on ophthalmology, the key economic modeling methods are explained and definitions are given. The most frequently applied model types are decision trees, Markov models, and discrete event simulation (DES) models. Model validation includes, besides verifying internal validity, comparison with other models (external validity) and, ideally, validation of a model's predictive properties. The uncertainty inherent in any modeling should be clearly stated; this holds for economic modeling in VBM as well as for disease risk models used to support clinical decisions. In economic modeling, uni- and multivariate sensitivity analyses are usually applied; the key concepts here are tornado plots and cost-effectiveness acceptability curves. Given the existing uncertainty, modeling helps to make better informed decisions than would be possible without this additional information.
Using Poisson mixed-effects model to quantify transcript-level gene expression in RNA-Seq.
Hu, Ming; Zhu, Yu; Taylor, Jeremy M G; Liu, Jun S; Qin, Zhaohui S
2012-01-01
RNA sequencing (RNA-Seq) is a powerful new technology for mapping and quantifying transcriptomes using ultra high-throughput next-generation sequencing technologies. Using deep sequencing, gene expression levels of all transcripts, including novel ones, can be quantified digitally. Although extremely promising, the massive amounts of data generated by RNA-Seq, substantial biases, and uncertainty in short read alignment pose challenges for data analysis. In particular, large base-specific variation and between-base dependence make simple approaches, such as those that use averaging to normalize RNA-Seq data and quantify gene expression, ineffective. In this study, we propose a Poisson mixed-effects (POME) model to characterize base-level read coverage within each transcript. The underlying expression level is included as a key parameter in this model. Since the proposed model is capable of incorporating base-specific variation as well as between-base dependence that affect the read coverage profile throughout the transcript, it can lead to improved quantification of the true underlying expression level. POME can be freely downloaded at http://www.stat.purdue.edu/~yuzhu/pome.html. Contact: yuzhu@purdue.edu; zhaohui.qin@emory.edu. Supplementary data are available at Bioinformatics online.
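The abstract does not reproduce the model equations; a generic Poisson mixed-effects formulation of base-level read counts along these lines (the exact POME linear predictor may differ) is

\[ Y_{bj} \sim \mathrm{Poisson}(\mu_{bj}), \qquad \log \mu_{bj} = \theta_j + \varepsilon_{bj}, \]

where \(Y_{bj}\) is the read count at base \(b\) of transcript \(j\), \(\theta_j\) is the transcript-level expression parameter of interest, and \(\varepsilon_{bj}\) is a random effect correlated along the transcript, capturing base-specific variation and between-base dependence.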
ERIC Educational Resources Information Center
Prayekti
2016-01-01
"Problem-based learning" (PBL) is one of an innovative learning model which can provide an active learning to student, include the motivation to achieve showed by student when the learning is in progress. This research is aimed to know: (1) differences of physic learning result for student group which taught by PBL versus expository…
Price vs. Performance: The Value of Next Generation Fighter Aircraft
2007-03-01
forms. Both the semi-log and log-log forms were plagued with heteroskedasticity (according to the Breusch-Pagan/Cook-Weisberg test). The RDT&E models...from 1949-present were used to construct two models – one based on procurement costs and one based on research, design, test, and evaluation (RDT&E...fighter aircraft hedonic models include several different categories of variables. Aircraft procurement costs and research, design, test, and
The Betting Odds Rating System: Using soccer forecasts to forecast soccer.
Wunderlich, Fabian; Memmert, Daniel
2018-01-01
Betting odds are frequently found to outperform mathematical models in sports-related forecasting tasks; however, the factors contributing to betting odds are not fully traceable, and in contrast to rating-based forecasts no straightforward measure of team-specific quality is deducible from the betting odds. The present study investigates the approach of combining the methods of mathematical models and the information included in betting odds. A soccer forecasting model based on the well-known ELO rating system and taking advantage of betting odds as a source of information is presented. Data from almost 15,000 soccer matches (seasons 2007/2008 until 2016/2017) are used, including both domestic matches (English Premier League, German Bundesliga, Spanish Primera Division and Italian Serie A) and international matches (UEFA Champions League, UEFA Europa League). The novel betting-odds-based ELO model is shown to outperform classic ELO models, thus demonstrating that betting odds prior to a match contain more relevant information than the result of the match itself. It is shown how the novel model can help to gain valuable insights into the quality of soccer teams and its development over time, thus having a practical benefit in performance analysis. Moreover, it is argued that network-based approaches might help in further improving rating and forecasting methods.
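To make the rating mechanics concrete, here is a minimal Python sketch of an ELO update that blends the match result with the odds-implied win expectancy; the 50/50 blending weight, the home-advantage constant, and all function names are illustrative assumptions, not the authors' exact formulation.

def expected_score(r_home, r_away, home_adv=80.0, scale=400.0):
    """Standard ELO expected score for the home team."""
    return 1.0 / (1.0 + 10 ** ((r_away - (r_home + home_adv)) / scale))

def odds_implied_prob(odds_home, odds_draw, odds_away):
    """Convert decimal betting odds to normalized outcome probabilities."""
    raw = [1.0 / odds_home, 1.0 / odds_draw, 1.0 / odds_away]
    total = sum(raw)  # removes the bookmaker margin
    return [p / total for p in raw]

def update(r_home, r_away, result, odds, k=20.0):
    """One rating update; result is 1 (home win), 0.5 (draw), 0 (away win).
    The target blends the observed result with the odds-implied win
    expectancy (the equal weighting is an illustrative assumption)."""
    p_home, p_draw, p_away = odds_implied_prob(*odds)
    odds_target = p_home + 0.5 * p_draw  # odds-implied "score" for the home side
    target = 0.5 * result + 0.5 * odds_target
    delta = k * (target - expected_score(r_home, r_away))
    return r_home + delta, r_away - delta

home, away = update(1500.0, 1500.0, result=1.0, odds=(1.8, 3.6, 4.5))
print(round(home, 1), round(away, 1))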
NASA Astrophysics Data System (ADS)
Sung, Ji Young; Oh, Phil Seok
2017-06-01
Recent science education reform initiatives suggest that learning in science should be organized on the basis of scientists' actual practices including the development and use of models. In line with this, the current study adapted three types of modeling practices to teach two Korean 6th grade science classes the causes of the Earth's seasons. Specifically, the study aimed to identify the students' content-specific competencies and challenges based on fine-grained descriptions and analyses of two target groups' cases. Data included digital recordings of modeling-based science lessons in the two classes, the teacher's and students' artifacts, and interviews with the students. These multiple types of data were analyzed complementarily and qualitatively. It was revealed that the students had a competency in constructing models to generate the desired phenomenon (i.e., seasons). They had difficulty, however, in considering the tilt of the Earth's rotation axis as a cause of the seasons and in finding a proper way of representing the Sun's meridian altitude on a globe. But, when the students were helped and guided by the teacher and peers' interventions, they were able to revise their models in alignment with the scientific understanding of the seasons. Based on these findings, the teacher's pedagogical roles, which include using student competencies as resources, asking physical questions, and explicit guidance on experimentation skills, were recommended to support successful incorporations of modeling practices in the science classroom.
A deep auto-encoder model for gene expression prediction.
Xie, Rui; Wen, Jia; Quitadamo, Andrew; Cheng, Jianlin; Shi, Xinghua
2017-11-17
Gene expression is a key intermediate level through which genotypes lead to a particular trait. Gene expression is affected by various factors including genotypes of genetic variants. With the aim of delineating the genetic impact on gene expression, we build a deep auto-encoder model to assess how genetic variants contribute to gene expression changes. This new deep learning model is a regression-based predictive model based on the MultiLayer Perceptron and Stacked Denoising Auto-encoder (MLP-SAE). The model is trained using a stacked denoising auto-encoder for feature selection and a multilayer perceptron framework for backpropagation. We further improve the model by introducing dropout to prevent overfitting and improve performance. To demonstrate the usage of this model, we apply MLP-SAE to a real genomic dataset with genotypes and gene expression profiles measured in yeast. Our results show that the MLP-SAE model with dropout outperforms other models including Lasso, Random Forests and the MLP-SAE model without dropout. Using the MLP-SAE model with dropout, we show that gene expression quantifications predicted by the model solely from genotypes align well with true gene expression patterns. We provide a deep auto-encoder model for predicting gene expression from SNP genotypes. This study demonstrates that deep learning is appropriate for tackling another genomic problem, i.e., building predictive models to understand genotypes' contribution to gene expression. With the emerging availability of richer genomic data, we anticipate that deep learning models will play a bigger role in modeling and interpreting genomics.
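The full MLP-SAE is not specified in the abstract; the following self-contained NumPy sketch illustrates only the denoising auto-encoder building block (layer sizes, the masking-noise fraction, and the learning rate are invented for illustration).

import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

# Toy stand-in for genotype features: n samples, d predictors in [0, 1].
n, d, h = 200, 30, 10
X = rng.random((n, d))

# One denoising auto-encoder layer: corrupt the input, reconstruct the clean input.
W1 = rng.normal(0, 0.1, (d, h)); b1 = np.zeros(h)  # encoder
W2 = rng.normal(0, 0.1, (h, d)); b2 = np.zeros(d)  # decoder
lr, noise_frac = 0.1, 0.3

for epoch in range(500):
    mask = rng.random(X.shape) > noise_frac  # masking noise: zero out ~30%
    Xc = X * mask
    H = sigmoid(Xc @ W1 + b1)                # hidden code
    R = H @ W2 + b2                          # linear reconstruction
    dR = 2.0 * (R - X) / n                   # gradient of squared error
    dH = (dR @ W2.T) * H * (1.0 - H)
    W2 -= lr * H.T @ dR;  b2 -= lr * dR.sum(axis=0)
    W1 -= lr * Xc.T @ dH; b1 -= lr * dH.sum(axis=0)

print("reconstruction MSE:", float(((sigmoid(X @ W1 + b1) @ W2 + b2 - X) ** 2).mean()))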
NASA Astrophysics Data System (ADS)
Ke, Haohao; Ondov, John M.; Rogge, Wolfgang F.
2013-12-01
Composite chemical profiles of motor vehicle emissions were extracted from ambient measurements at a near-road site in Baltimore during a windless traffic episode in November 2002, using four independent approaches, i.e., simple peak analysis, windless model-based linear regression, PMF, and UNMIX. Although the profiles are in general agreement, the windless-model-based profile treatment more effectively removes interference from non-traffic sources and is deemed to be more accurate for many species. In addition to abundances of routine pollutants (e.g., NOx, CO, PM2.5, EC, OC, sulfate, and nitrate), 11 particle-bound metals and 51 individual traffic-related organic compounds (including n-alkanes, PAHs, oxy-PAHs, hopanes, alkylcyclohexanes, and others) were included in the modeling.
Strategic approaches to planetary base development
NASA Technical Reports Server (NTRS)
Roberts, Barney B.
1992-01-01
The evolutionary development of a planetary expansionary outpost is considered in the light of both technical and economic issues. The outline of a partnering taxonomy is set forth which encompasses both institutional and temporal issues related to establishing shared interests and investments. The purely technical issues are discussed in terms of the program components which include nonaerospace technologies such as construction engineering. Five models are proposed in which partnership and autonomy for participants are approached in different ways including: (1) the standard customer/provider relationship; (2) a service-provider scenario; (3) the joint venture; (4) a technology joint-development model; and (5) a redundancy model for reduced costs. Based on the assumed characteristics of planetary surface systems the cooperative private/public models are championed with coordinated design by NASA to facilitate outside cooperation.
WATER QUALITY MODELING AND SAMPLING STUDY IN A DISTRIBUTION SYSTEM
A variety of computer-based models have been developed and used by the water industry to assess the movement and fate of contaminants within the distribution system. Such models include: dynamic and steady state hydraulic models which simulate the flow quantity, flow direction, and...
Beck, J D; Weintraub, J A; Disney, J A; Graves, R C; Stamm, J W; Kaste, L M; Bohannan, H M
1992-12-01
The purpose of this analysis is to compare three different statistical models for predicting children likely to be at risk of developing dental caries over a 3-yr period. Data are based on 4117 children who participated in the University of North Carolina Caries Risk Assessment Study, a longitudinal study conducted in the Aiken, South Carolina, and Portland, Maine areas. The three models differed with respect to either the types of variables included or the definition of disease outcome. The two "Prediction" models included both risk factor variables thought to cause dental caries and indicator variables that are associated with dental caries but are not thought to be causal for the disease. The "Etiologic" model included only etiologic factors as variables. A dichotomous outcome measure--none or any 3-yr increment--was used in the "Any Risk Etiologic Model" and the "Any Risk Prediction Model". Another outcome, based on a gradient measure of disease, was used in the "High Risk Prediction Model". The variables that are significant in these models vary across grades and sites, but are more consistent in the Etiologic model than in the Prediction models. However, among the three sets of models, the Any Risk Prediction Models have the highest sensitivity and positive predictive values, whereas the High Risk Prediction Models have the highest specificity and negative predictive values. Considerations in determining model preference are discussed.
Whitford, Paul C; Noel, Jeffrey K; Gosavi, Shachi; Schug, Alexander; Sanbonmatsu, Kevin Y; Onuchic, José N
2009-05-01
Protein dynamics take place on many time and length scales. Coarse-grained structure-based (Gō) models utilize the funneled energy landscape theory of protein folding to provide an understanding of both long time and long length scale dynamics. All-atom empirical forcefields with explicit solvent can elucidate our understanding of short time dynamics with high energetic and structural resolution. Thus, structure-based models with atomic details included can be used to bridge our understanding between these two approaches. We report on the robustness of folding mechanisms in one such all-atom model. Results for the B domain of Protein A, the SH3 domain of C-Src Kinase, and Chymotrypsin Inhibitor 2 are reported. The interplay between side chain packing and backbone folding is explored. We also compare this model to a Cα structure-based model and an all-atom empirical forcefield. Key findings include: (1) backbone collapse is accompanied by partial side chain packing in a cooperative transition and residual side chain packing occurs gradually with decreasing temperature, (2) folding mechanisms are robust to variations of the energetic parameters, (3) protein folding free-energy barriers can be manipulated through parametric modifications, (4) the global folding mechanisms in a Cα model and the all-atom model agree, although differences can be attributed to energetic heterogeneity in the all-atom model, and (5) proline residues have significant effects on folding mechanisms, independent of isomerization effects. Because this structure-based model has atomic resolution, this work lays the foundation for future studies to probe the contributions of specific energetic factors on protein folding and function.
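A common functional form for the native-contact term in such Gō-type models, shown here for the Cα variant (the all-atom potential adds bonded terms and atomic contacts), is

\[ V_{\mathrm{contact}}(r_{ij}) = \varepsilon \left[ 5\left(\frac{\sigma_{ij}}{r_{ij}}\right)^{12} - 6\left(\frac{\sigma_{ij}}{r_{ij}}\right)^{10} \right], \]

where \(\sigma_{ij}\) is the native-state separation of residues \(i\) and \(j\) and \(\varepsilon\) sets the contact energy; this is the generic form, not a quotation of the paper's exact parametrization.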
An Agent Based Collaborative Simplification of 3D Mesh Model
NASA Astrophysics Data System (ADS)
Wang, Li-Rong; Yu, Bo; Hagiwara, Ichiro
Large-volume mesh models face challenges in fast rendering and transmission over the Internet. Mesh models obtained using three-dimensional (3D) scanning technology are usually very large in data volume. This paper develops a mobile-agent-based collaborative environment on the Mobile-C development platform. Communication among distributed agents includes capturing images of the visualized mesh model, annotating captured images, and instant messaging. Remote and collaborative simplification can be efficiently conducted over the Internet.
Modeling a Common-Source Amplifier Using a Ferroelectric Transistor
NASA Technical Reports Server (NTRS)
Sayyah, Rana; Hunt, Mitchell; MacLeod, Todd C.; Ho, Fat D.
2010-01-01
This paper presents a mathematical model characterizing the behavior of a common-source amplifier using a FeFET. The model is based on empirical data and incorporates several variables that affect the output, including frequency, load resistance, and gate-to-source voltage. Since the common-source amplifier is the most widely used amplifier in MOS technology, understanding and modeling the behavior of the FeFET-based common-source amplifier will help in the integration of FeFETs into many circuits.
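For orientation, the textbook small-signal voltage gain of a conventional MOS common-source stage is

\[ A_v = -g_m \,(R_D \parallel r_o), \]

with transconductance \(g_m\) at the bias point, load resistance \(R_D\), and transistor output resistance \(r_o\). For the FeFET stage described above, the dependence on frequency, load resistance, and gate-to-source voltage is captured empirically rather than by this ideal relation.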
Expert systems and simulation models; Proceedings of the Seminar, Tucson, AZ, November 18, 19, 1985
NASA Technical Reports Server (NTRS)
1986-01-01
The seminar presents papers on modeling and simulation methodology, artificial intelligence and expert systems, environments for simulation/expert system development, and methodology for simulation/expert system development. Particular attention is given to simulation modeling concepts and their representation, modular hierarchical model specification, knowledge representation, and rule-based diagnostic expert system development. Other topics include the combination of symbolic and discrete event simulation, real time inferencing, and the management of large knowledge-based simulation projects.
A Perspective on Computational Human Performance Models as Design Tools
NASA Technical Reports Server (NTRS)
Jones, Patricia M.
2010-01-01
The design of interactive systems, including levels of automation, displays, and controls, is usually based on design guidelines and iterative empirical prototyping. A complementary approach is to use computational human performance models to evaluate designs. An integrated strategy of model-based and empirical test and evaluation activities is particularly attractive as a methodology for verification and validation of human-rated systems for commercial space. This talk will review several computational human performance modeling approaches and their applicability to design of display and control requirements.
Multivariable Parametric Cost Model for Ground Optical Telescope Assembly
NASA Technical Reports Server (NTRS)
Stahl, H. Philip; Rowell, Ginger Holmes; Reese, Gayle; Byberg, Alicia
2004-01-01
A parametric cost model for ground-based telescopes is developed using multi-variable statistical analysis of both engineering and performance parameters. While diameter continues to be the dominant cost driver, diffraction limited wavelength is found to be a secondary driver. Other parameters such as radius of curvature were examined. The model includes an explicit factor for primary mirror segmentation and/or duplication (i.e. multi-telescope phased-array systems). Additionally, single variable models based on aperture diameter were derived.
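The fitted coefficients are not given in the abstract; a generic form of such a multi-variable parametric cost model (with placeholder coefficients \(k\), \(a\), \(b\)) is

\[ C = k \, D^{a} \, \lambda_{\mathrm{DL}}^{-b} \, f_{\mathrm{seg}}, \]

where \(D\) is the aperture diameter, \(\lambda_{\mathrm{DL}}\) the diffraction-limited wavelength, and \(f_{\mathrm{seg}}\) the explicit segmentation/duplication factor.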
Implementing Curriculum-Based Learning Portfolio: A Case Study in Taiwan
ERIC Educational Resources Information Center
Chen, Shu-Chin Susan; Cheng, Yu-Pay
2011-01-01
The main purpose of this descriptive research is to examine and document the development of a curriculum-based learning portfolio model for children in a preschool for three- to six-year-olds in Taiwan. Data collection methods adopted include classroom observation, in-depth interviews, questionnaires and documentation. Participants include a preschool…
NASA Technical Reports Server (NTRS)
White, R. J.
1973-01-01
A detailed description of Guyton's model and modifications is provided. Also included are descriptions of several typical experiments which the model can simulate, illustrating the model's general utility. Also discussed are the problems associated with interfacing the model to other models, such as respiratory and thermal regulation models; this is of prime importance since these stimuli are not present in the current model. A user's guide for the operation of the model on the Xerox Sigma 3 computer is provided and two programs are described. A verification plan and procedure for performing experiments is also presented.
Space Shuttle propulsion performance reconstruction from flight data
NASA Technical Reports Server (NTRS)
Rogers, Robert M.
1989-01-01
The application of extended Kalman filtering to estimating Space Shuttle Solid Rocket Booster (SRB) performance (specific impulse) from flight data in a post-flight processing computer program is described. The flight data used include inertial platform acceleration, SRB head pressure, and ground-based radar tracking data. The key feature in this application is the model used for the SRBs, which represents a reference quasi-static internal ballistics model normalized to the propellant burn depth. Dynamic states of mass overboard and propellant burn depth are included in the filter model to account for real-time deviations from the reference model used. Aerodynamic, plume, wind and main engine uncertainties are included.
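The standard extended Kalman filter recursion underlying such post-flight reconstruction is, in generic form (here the state includes mass overboard and propellant burn depth, and \(z_k\) stacks the acceleration, head pressure, and radar measurements):

\begin{align*}
\hat{x}_k^- &= f(\hat{x}_{k-1}), & P_k^- &= F_k P_{k-1} F_k^\top + Q_k,\\
K_k &= P_k^- H_k^\top \left(H_k P_k^- H_k^\top + R_k\right)^{-1},\\
\hat{x}_k &= \hat{x}_k^- + K_k\left(z_k - h(\hat{x}_k^-)\right), & P_k &= (I - K_k H_k)\, P_k^-,
\end{align*}

with \(F_k\) and \(H_k\) the Jacobians of the dynamics \(f\) and measurement model \(h\).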
The Two Modes of Distance Education.
ERIC Educational Resources Information Center
Keegan, Desmond
1998-01-01
Discusses two models of distance-education, group-based versus individual-based. Highlights include group-based distance education for full-time and part-time students; individual-based distance education with pre-prepared materials and without pre-prepared materials; and distance education management and research. (LRW)
Cunningham, Albert R; Carrasquer, C Alex; Qamar, Shahid; Maguire, Jon M; Cunningham, Suzanne L; Trent, John O
2012-10-01
Structure-activity relationship (SAR) models are powerful tools to investigate the mechanisms of action of chemical carcinogens and to predict the potential carcinogenicity of untested compounds. We describe the use of a traditional fragment-based SAR approach along with a new virtual ligand-protein interaction-based approach for modeling of nonmutagenic carcinogens. The ligand-based SAR models used descriptors derived from computationally calculated ligand-binding affinities for learning set agents to 5495 proteins. Two learning sets were developed. One set was from the Carcinogenic Potency Database, where chemicals tested for rat carcinogenesis along with Salmonella mutagenicity data were provided. The second was from Malacarne et al. who developed a learning set of nonalerting compounds based on rodent cancer bioassay data and Ashby's structural alerts. When the rat cancer models were categorized based on mutagenicity, the traditional fragment model outperformed the ligand-based model. However, when the learning sets were composed solely of nonmutagenic or nonalerting carcinogens and noncarcinogens, the fragment model demonstrated a concordance of near 50%, whereas the ligand-based models demonstrated a concordance of 71% for nonmutagenic carcinogens and 74% for nonalerting carcinogens. Overall, these findings suggest that expert system analysis of virtual chemical protein interactions may be useful for developing predictive SAR models for nonmutagenic carcinogens. Moreover, a more practical approach for developing SAR models for carcinogenesis may include fragment-based models for chemicals testing positive for mutagenicity and ligand-based models for chemicals devoid of DNA reactivity.
Model-Based Reasoning in Upper-division Lab Courses
NASA Astrophysics Data System (ADS)
Lewandowski, Heather
2015-05-01
Modeling, which includes developing, testing, and refining models, is a central activity in physics. Well-known examples from AMO physics include everything from the Bohr model of the hydrogen atom to the Bose-Hubbard model of interacting bosons in a lattice. Modeling, while typically considered a theoretical activity, is most fully represented in the laboratory, where measurements of real phenomena intersect with theoretical models, leading to refinement of models and experimental apparatus. However, experimental physicists use models in complex ways and the process is often not made explicit in physics laboratory courses. We have developed a framework to describe the modeling process in physics laboratory activities. The framework attempts to abstract and simplify the complex modeling process undertaken by expert experimentalists. The framework can be applied to understand typical processes such as the modeling of measurement tools, modeling "black boxes," and signal processing. We demonstrate that the framework captures several important features of model-based reasoning in a way that can reveal common student difficulties in the lab and guide the development of curricula that emphasize modeling in the laboratory. We also use the framework to examine troubleshooting in the lab and guide students to effective methods and strategies.
MTBE is a volatile organic compound used as an oxygenate additive to gasoline, added to comply with the 1990 Clean Air Act. Previous PBPK models for MTBE were reviewed and incorporated into the Exposure Related Dose Estimating Model (ERDEM) software. This model also included an e...
Means, John A.; Simson, Crystal M.; Zhou, Shu; Rachford, Aaron A.; Rack, Jeffrey J.; Hines, Jennifer V.
2009-01-01
The T box transcription antitermination riboswitch is one of the main regulatory mechanisms utilized by Gram-positive bacteria to regulate genes that are involved in amino acid metabolism. The details of the antitermination event in this riboswitch, including the role that Mg2+ plays, have not been completely elucidated. In these studies, details of the antitermination event were investigated utilizing 2-aminopurine to monitor structural changes of a model antiterminator RNA when it was bound to model tRNA. Based on the results of these fluorescence studies, the model tRNA binds the model antiterminator RNA via an induced fit. This binding is enhanced by the presence of Mg2+, facilitating the complete base pairing of the model tRNA acceptor end with the complementary bases in the model antiterminator bulge. PMID:19755116
Development of a 3D Stream Network and Topography for Improved Large-Scale Hydraulic Modeling
NASA Astrophysics Data System (ADS)
Saksena, S.; Dey, S.; Merwade, V.
2016-12-01
Most digital elevation models (DEMs) used for hydraulic modeling do not include channel bed elevations. As a result, the DEMs are complemented with additional bathymetric data for accurate hydraulic simulations. Existing methods to acquire bathymetric information through field surveys or through conceptual models are limited to reach-scale applications. With an increasing focus on large scale hydraulic modeling of rivers, a framework to estimate and incorporate bathymetry for an entire stream network is needed. This study proposes an interpolation-based algorithm to estimate bathymetry for a stream network by modifying the reach-based empirical River Channel Morphology Model (RCMM). The effect of a 3D stream network that includes river bathymetry is then investigated by creating a 1D hydraulic model (HEC-RAS) and a 2D hydrodynamic model (Integrated Channel and Pond Routing) for the Upper Wabash River Basin in Indiana, USA. Results show improved simulation of flood depths and storage in the floodplain. Similarly, the impact of river bathymetry incorporation is more significant in the 2D model as compared to the 1D model.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tom Elicson; Bentley Harwood; Jim Bouchard
Over a 12 month period, a fire PRA was developed for a DOE facility using the NUREG/CR-6850 EPRI/NRC fire PRA methodology. The fire PRA modeling included calculation of fire severity factors (SFs) and fire non-suppression probabilities (PNS) for each safe shutdown (SSD) component considered in the fire PRA model. The SFs were developed by performing detailed fire modeling through a combination of CFAST fire zone model calculations and Latin Hypercube Sampling (LHS). Component damage times and automatic fire suppression system actuation times calculated in the CFAST LHS analyses were then input to a time-dependent model of fire non-suppression probability. The fire non-suppression probability model is based on the modeling approach outlined in NUREG/CR-6850 and is supplemented with plant-specific data. This paper presents the methodology used in the DOE facility fire PRA for modeling fire-induced SSD component failures and includes discussions of modeling techniques for: development of time-dependent fire heat release rate profiles (required as input to CFAST); calculation of fire severity factors based on CFAST detailed fire modeling; and calculation of fire non-suppression probabilities.
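A minimal Python sketch of the two numerical pieces described above, assuming the exponential non-suppression form used in NUREG/CR-6850; the suppression rate, detection time, and the fake HRR-to-damage-time relation are illustrative values, not facility data.

import numpy as np

rng = np.random.default_rng(42)

def latin_hypercube(n, low, high):
    """One-dimensional LHS: one uniform draw per equal-probability stratum."""
    edges = np.linspace(0.0, 1.0, n + 1)
    u = rng.uniform(edges[:-1], edges[1:])  # one sample inside each stratum
    rng.shuffle(u)
    return low + u * (high - low)

# Illustrative LHS over peak heat release rate (kW); would feed the CFAST runs.
peak_hrr = latin_hypercube(100, 100.0, 1000.0)

# Each CFAST run would return a component damage time (s); here we fake a
# monotone dependence on HRR purely for illustration.
t_damage = 1200.0 - 0.8 * peak_hrr

# Exponential non-suppression probability, P_ns = exp(-lambda * t), with t the
# time available for manual suppression and lambda a per-minute rate.
lam = 0.1        # illustrative suppression rate (1/min)
t_detect = 2.0   # illustrative detection time (min)
t_avail = np.maximum(t_damage / 60.0 - t_detect, 0.0)
p_ns = np.exp(-lam * t_avail)
print("mean non-suppression probability:", p_ns.mean().round(3))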
A Stratified Acoustic Model Accounting for Phase Shifts for Underwater Acoustic Networks
Wang, Ping; Zhang, Lin; Li, Victor O. K.
2013-01-01
Accurate acoustic channel models are critical for the study of underwater acoustic networks. Existing models include physics-based models and empirical approximation models. The former enjoy good accuracy, but incur heavy computational load, rendering them impractical in large networks. On the other hand, the latter are computationally inexpensive but inaccurate since they do not account for the complex effects of boundary reflection losses, the multi-path phenomenon and ray bending in the stratified ocean medium. In this paper, we propose a Stratified Acoustic Model (SAM) based on frequency-independent geometrical ray tracing, accounting for each ray's phase shift during the propagation. It is a feasible channel model for large scale underwater acoustic network simulation, allowing us to predict the transmission loss with much lower computational complexity than the traditional physics-based models. The accuracy of the model is validated via comparisons with the experimental measurements in two different oceans. Satisfactory agreements with the measurements and with other computationally intensive classical physics-based models are demonstrated. PMID:23669708
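The phase-coherent ray summation such a model performs can be written generically as

\[ p(r) = \sum_i A_i \, e^{j\varphi_i}, \qquad \mathrm{TL}(r) = -20 \log_{10} \left| \frac{p(r)}{p_0} \right|, \]

where ray \(i\) arrives with amplitude \(A_i\) and accumulated phase \(\varphi_i\) (including the per-reflection and propagation phase shifts SAM tracks) and \(p_0\) is the source pressure referenced to 1 m; the notation is generic rather than quoted from the paper.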
Pimental, Patricia A; O'Hara, John B; Jandak, Jessica L
2018-01-01
By virtue of their extensive knowledge base and specialized training in brain-behavior relationships, neuropsychologists are especially poised to execute a unique broad-based approach to overall cognitive wellness and should be viewed as primary care providers of cognitive health. This article will describe a novel comprehensive cognitive wellness service delivery model including cognitive health, anti-aging, lifelong wellness, and longevity-oriented practices. These practice areas include brain-based cognitive wellness, emotional and spiritually centric exploration, and related multimodality health interventions. As experts in mind-body connections, neuropsychologists can provide a variety of evidence-based treatment options, empowering patients with a sense of value and purpose. Multiple areas of clinical therapy skill-based learning, tailor-made to fit individual needs, will be discussed, including: brain-stimulating activities, restorative techniques, reduction of automatic negative thoughts and maladaptive thinking, inflammation and pain management techniques, nutrition and culinary-focused cognitive wellness, spirituality-based practices and mindfulness, movement and exercise, alternative/complementary therapies, relationship restoration/social engagement, and trauma healing/meaning. Cognitive health rests upon the foundation of counteracting mind-body connection disruptions from multiple etiologies including inflammation, chronic stress, metabolic issues, cardiac conditions, autoimmune disease, neurological disorders, infectious diseases, and allergy spectrum disorders. Superimposed on these issues are lifestyle patterns and negative health behaviors that develop as ill-fated compensatory mechanisms used to cope with life stressors and aging. The brain and body are electrical systems that can "short circuit." The therapy practices inherent in the proposed cognitive wellness service delivery model can provide preventative insulation and circuit-breaking against the shock of illness.
Estimating Setup of Driven Piles into Louisiana Clayey Soils
DOT National Transportation Integrated Search
2009-11-15
Two types of mathematical models for pile setup prediction, the Skov-Denver model and the newly developed rate-based model, have been established from all the dynamic and static testing data, including restrikes of the production piles, restrikes, st...
Integrated communications and optical navigation system
NASA Astrophysics Data System (ADS)
Mueller, J.; Pajer, G.; Paluszek, M.
2013-12-01
The Integrated Communications and Optical Navigation System (ICONS) is a flexible navigation system for spacecraft that does not require global positioning system (GPS) measurements. The navigation solution is computed using an Unscented Kalman Filter (UKF) that can accept any combination of range, range-rate, planet chord width, landmark, and angle measurements using any celestial object. Both absolute and relative orbit determination are supported. The UKF employs a full nonlinear dynamical model of the orbit, including gravity and disturbance models. The ICONS package also includes attitude determination algorithms using the UKF algorithm with the Inertial Measurement Unit (IMU). The IMU is used as the dynamical base for the attitude determination algorithms. This makes the sensor a more capable plug-in replacement for a star tracker, thus reducing the integration and test cost of adding this sensor to a spacecraft. Recent additions include an integrated optical communications system, which adds communications along with integrated range and range-rate measurement and timing. The paper includes test results from trajectories based on the NASA New Horizons spacecraft.
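The sigma-point construction that lets the UKF avoid Jacobians of the gravity and disturbance models is, in its standard form,

\[ \chi_0 = \bar{x}, \qquad \chi_i = \bar{x} \pm \left( \sqrt{(n+\lambda)\, P} \right)_i, \quad i = 1, \ldots, n, \]

for state dimension \(n\), scaling parameter \(\lambda\), and covariance \(P\); each sigma point is propagated through the full nonlinear orbit dynamics and the predicted mean and covariance are reassembled as weighted sums.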
Mesoscale acid deposition modeling studies
NASA Technical Reports Server (NTRS)
Kaplan, Michael L.; Proctor, F. H.; Zack, John W.; Karyampudi, V. Mohan; Price, P. E.; Bousquet, M. D.; Coats, G. D.
1989-01-01
The work performed in support of the EPA/DOE MADS (Mesoscale Acid Deposition) Project included the development of meteorological data bases for the initialization of chemistry models, the testing and implementation of new planetary boundary layer parameterization schemes in the MASS model, the simulation of transport and precipitation for MADS case studies employing the MASS model, and the use of the TASS model in the simulation of cloud statistics and the complex transport of conservative tracers within simulated cumuliform clouds. The work performed in support of the NASA/FAA Wind Shear Program included the use of the TASS model in the simulation of the dynamical processes within convective cloud systems, the analyses of the sensitivity of microburst intensity and general characteristics as a function of the atmospheric environment within which they are formed, comparisons of TASS model microburst simulation results to observed data sets, and the generation of simulated wind shear data bases for use by the aviation meteorological community in the evaluation of flight hazards caused by microbursts.
Flexible language constructs for large parallel programs
NASA Technical Reports Server (NTRS)
Rosing, Matthew; Schnabel, Robert
1993-01-01
The goal of the research described is to develop flexible language constructs for writing large data parallel numerical programs for distributed memory (MIMD) multiprocessors. Previously, several models have been developed to support synchronization and communication. Models for global synchronization include SIMD (Single Instruction Multiple Data), SPMD (Single Program Multiple Data), and sequential programs annotated with data distribution statements. The two primary models for communication include implicit communication based on shared memory and explicit communication based on messages. None of these models by themselves seems sufficient to permit the natural and efficient expression of the variety of algorithms that occur in large scientific computations. An overview of a new language that combines many of these programming models in a clean manner is given. This is done in a modular fashion such that different models can be combined to support large programs. Within a module, the selection of a model depends on the algorithm and its efficiency requirements. An overview of the language and a discussion of some of the critical implementation details are given.
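For concreteness, a minimal SPMD fragment in the explicit message-passing style contrasted here with shared-memory models; mpi4py is used purely as a modern illustration and is not the language the paper proposes.

from mpi4py import MPI  # SPMD: every rank runs this same program

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

# Each rank owns one block of a distributed array (data parallelism).
local = [rank * 10 + i for i in range(10)]
local_sum = sum(local)

# Explicit communication: reduce the partial sums onto rank 0.
total = comm.reduce(local_sum, op=MPI.SUM, root=0)
if rank == 0:
    print("global sum:", total)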
Marine mammals' influence on ecosystem processes affecting fisheries in the Barents Sea is trivial.
Corkeron, Peter J
2009-04-23
Some interpretations of ecosystem-based fishery management include culling marine mammals as an integral component. The current Norwegian policy on marine mammal management is one example. Scientific support for this policy includes the Scenario Barents Sea (SBS) models. These modelled interactions between cod, Gadus morhua, herring, Clupea harengus, capelin, Mallotus villosus and northern minke whales, Balaenoptera acutorostrata. Adding harp seals Phoca groenlandica into this top-down modelling approach resulted in unrealistic model outputs. Another set of models of the Barents Sea fish-fisheries system focused on interactions within and between the three fish populations, fisheries and climate. These model key processes of the system successfully. Continuing calls to support the SBS models despite their failure suggest a belief that marine mammal predation must be a problem for fisheries. The best available scientific evidence provides no justification for marine mammal culls as a primary component of an ecosystem-based approach to managing the fisheries of the Barents Sea.
A Model Based Mars Climate Database for the Mission Design
NASA Technical Reports Server (NTRS)
2005-01-01
A viewgraph presentation on a model-based climate database is shown. The topics include: 1) Why a model-based climate database?; 2) Mars Climate Database v3.1: Who uses it? (approx. 60 users!); 3) The new Mars Climate Database MCD v4.0; 4) MCD v4.0: What's new?; 5) Simulation of water ice clouds; 6) Simulation of the water ice cycle; 7) A new tool for surface pressure prediction; 8) Access to the database MCD 4.0; 9) How to access the database; and 10) New web access.
A Public-Health-Based Vision for the Management and Regulation of Psychedelics.
Haden, Mark; Emerson, Brian; Tupper, Kenneth W
2016-01-01
The Health Officers Council of British Columbia has proposed post-prohibition regulatory models for currently illegal drugs based on public health principles, and this article continues this work by proposing a model for the regulation and management of psychedelics. This article outlines recent research on psychedelic substances and the key determinants of benefit and harm from their use. It then describes a public-health-based model for the regulation of psychedelics, which includes governance, supervision, set and setting controls, youth access, supply control, demand limitation, and evaluation.
Evaluating Water Demand Using Agent-Based Modeling
NASA Astrophysics Data System (ADS)
Lowry, T. S.
2004-12-01
The supply and demand of water resources are functions of complex, inter-related systems including hydrology, climate, demographics, economics, and policy. To assess the safety and sustainability of water resources, planners often rely on complex numerical models that relate some or all of these systems using mathematical abstractions. The accuracy of these models relies on how well the abstractions capture the true nature of the systems interactions. Typically, these abstractions are based on analyses of observations and/or experiments that account only for the statistical mean behavior of each system. This limits the approach in two important ways: 1) It cannot capture cross-system disruptive events, such as major drought, significant policy change, or terrorist attack, and 2) it cannot resolve sub-system level responses. To overcome these limitations, we are developing an agent-based water resources model that includes the systems of hydrology, climate, demographics, economics, and policy, to examine water demand during normal and extraordinary conditions. Agent-based modeling (ABM) develops functional relationships between systems by modeling the interaction between individuals (agents), who behave according to a probabilistic set of rules. ABM is a "bottom-up" modeling approach in that it defines macro-system behavior by modeling the micro-behavior of individual agents. While each agent's behavior is often simple and predictable, the aggregate behavior of all agents in each system can be complex, unpredictable, and different than behaviors observed in mean-behavior models. Furthermore, the ABM approach creates a virtual laboratory where the effects of policy changes and/or extraordinary events can be simulated. Our model, which is based on the demographics and hydrology of the Middle Rio Grande Basin in the state of New Mexico, includes agent groups of residential, agricultural, and industrial users. Each agent within each group determines its water usage based on its own condition and the condition of the world around it. For example, residential agents can make decisions to convert to or from xeriscaping and/or low-flow appliances based on policy implementation, economic status, weather, and climatic conditions. Agricultural agents may vary their usage by making decisions on crop distribution and irrigation design. Preliminary results show that water usage can be highly irrational under certain conditions. Results also identify sub-sectors within each group that have the highest influence on ensemble group behavior, providing a means for policy makers to target their efforts. Finally, the model is able to predict the impact of low-probability, high-impact events such as catastrophic denial of service due to natural and/or man-made events.
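A toy Python sketch of the residential sub-model described above; the adoption rule, price trajectory, and all numeric thresholds are invented for illustration and are not the Middle Rio Grande model.

import random

random.seed(1)

class Resident:
    def __init__(self):
        self.xeriscaped = False
        self.income = random.uniform(0.5, 1.5)   # relative economic status
        self.base_use = random.uniform(80, 120)  # gallons/day, illustrative

    def step(self, price, drought):
        # Rule of thumb: adopt xeriscaping when water is costly, drought is
        # visible, and the household can afford the conversion.
        if not self.xeriscaped and drought and price * random.random() > self.income * 0.4:
            self.xeriscaped = True
        return self.base_use * (0.6 if self.xeriscaped else 1.0)

agents = [Resident() for _ in range(1000)]
price = 1.0
for year in range(10):
    drought = year in (3, 4, 5)           # a multi-year drought episode
    demand = sum(a.step(price, drought) for a in agents)
    price *= 1.10 if drought else 1.02    # policy lever: price rises in drought
    print(f"year {year}: demand {demand:,.0f} gal/day, "
          f"adopters {sum(a.xeriscaped for a in agents)}")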
NASA Astrophysics Data System (ADS)
Donndorf, St.; Malz, A.; Kley, J.
2012-04-01
Cross section balancing is a generally accepted method for studying fault zone geometries. We show a method for the construction of structural 3D models of complex fault zones using a combination of gOcad modelling and balanced cross sections. In this work a 3D model of the Schlotheim graben in the Thuringian basin was created from serial, parallel cross sections and existing borehole data. The Thuringian Basin was originally a part of the North German Basin, from which it was separated by the Harz uplift in the Late Cretaceous. It comprises several parallel NW-trending inversion structures. The Schlotheim graben is one example of these inverted graben zones, whose structure poses special challenges to 3D modelling. The fault zone extends 30 km in the NW-SE direction and 1 km in the NE-SW direction. This project was split into two parts: data management and model building. To manage the fundamental data, a central database was created in ESRI's ArcGIS. A scripting interface handles the data exchange between the different steps of modelling. The first step is the pre-processing of the base data in ArcGIS, followed by cross section balancing with Midland Valley's Move software and finally the construction of the 3D model in Paradigm's gOcad. With the specific aim of constructing a 3D model based on cross sections, the functionality of the gOcad software had to be extended. These extensions include pre-processing functions to create a simplified and usable data base for gOcad, construction functions to create surfaces based on linearly distributed data, and processing functions to create the 3D model from different surfaces. In order to use the model for further geological and hydrological simulations, special requirements apply to the surface properties. The first requirement is a quality mesh, containing triangles with maximized internal angles; to achieve that, an external meshing tool was included in gOcad. The second is that intersection lines between two surfaces must be included in both surfaces and share nodes with them. To finish the modelling process, 3D balancing was performed to further improve the model quality.
Holistic uncertainty analysis in river basin modeling for climate vulnerability assessment
NASA Astrophysics Data System (ADS)
Taner, M. U.; Wi, S.; Brown, C.
2017-12-01
The challenges posed by uncertain future climate are a prominent concern for water resources managers. A number of frameworks exist for assessing the impacts of climate-related uncertainty, including internal climate variability and anthropogenic climate change, such as scenario-based approaches and vulnerability-based approaches. While in many cases climate uncertainty may be dominant, other factors such as future evolution of the river basin, hydrologic response and reservoir operations are potentially significant sources of uncertainty. While uncertainty associated with modeling hydrologic response has received attention, far less has focused on the range of uncertainty and possible effects of the water resources infrastructure and management. This work presents a holistic framework that allows analysis of climate, hydrologic and water management uncertainty in water resources systems analysis with the aid of a water system model designed to integrate component models for hydrologic processes and water management activities. The uncertainties explored include those associated with climate variability and change, hydrologic model parameters, and water system operation rules. A Bayesian framework is used to quantify and model the uncertainties at each modeling step in an integrated fashion, including prior and likelihood information about model parameters. The framework is demonstrated in a case study for the St. Croix Basin, located on the border between the United States and Canada.
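As an illustration of the Bayesian step described above, the sketch below computes a grid posterior for a single parameter of a toy rainfall-runoff model; the model form, synthetic data, and error variance are assumptions for demonstration only.

```python
# Minimal sketch of Bayesian parameter uncertainty for one hydrologic-model
# parameter, assuming a toy linear rainfall-runoff model; illustrative only.
import numpy as np

rng = np.random.default_rng(0)
rain = rng.uniform(0, 50, size=30)             # synthetic forcing (mm)
true_c = 0.4                                   # "true" runoff coefficient
flow_obs = true_c * rain + rng.normal(0, 2.0, size=30)  # noisy observations

c_grid = np.linspace(0.0, 1.0, 501)            # prior support
prior = np.ones_like(c_grid)                   # flat prior
# Gaussian likelihood of the observations for each candidate parameter
log_like = np.array([-0.5 * np.sum((flow_obs - c * rain) ** 2) / 2.0**2
                     for c in c_grid])
post = prior * np.exp(log_like - log_like.max())
post /= np.trapz(post, c_grid)                 # normalize

mean = np.trapz(c_grid * post, c_grid)
print(f"posterior mean runoff coefficient: {mean:.3f}")
```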
O'Connor, William T; O'Shea, Sean D
2015-06-01
Schizophrenia disease models are necessary to elucidate underlying changes and to establish new therapeutic strategies towards a stage where drug efficacy in schizophrenia (against all classes of symptoms) can be predicted. Here we summarise the evidence for a GABA dysfunction in schizophrenia and review the functional neuroanatomy of five pathways implicated in schizophrenia, namely the mesocortical, mesolimbic, ventral striopallidal, dorsal striopallidal and perforant pathways including the role of local GABA transmission and we describe the effect of clozapine on local neurotransmitter release. This review also evaluates psychotropic drug-induced, neurodevelopmental and environmental disease models including their compatibility with brain microdialysis. The validity of disease models including face, construct, etiological and predictive validity and how these models constitute theories about this illness is also addressed. A disease model based on the effect of the abrupt withdrawal of clozapine on GABA release is also described. The review concludes that while no single animal model is entirely successful in reproducing schizophreniform symptomatology, a disease model based on an ability to prevent and/or reverse the abrupt clozapine discontinuation-induced changes in GABA release in brain regions implicated in schizophrenia may be useful for hypothesis testing and for in vivo screening of novel ligands not limited to a single pharmacological class. Copyright © 2015 Elsevier Inc. All rights reserved.
Mizukami, Naoki; Clark, Martyn P.; Sampson, Kevin; Nijssen, Bart; Mao, Yixin; McMillan, Hilary; Viger, Roland; Markstrom, Steven; Hay, Lauren E.; Woods, Ross; Arnold, Jeffrey R.; Brekke, Levi D.
2016-01-01
This paper describes the first version of a stand-alone runoff routing tool, mizuRoute. The mizuRoute tool post-processes runoff outputs from any distributed hydrologic model or land surface model to produce spatially distributed streamflow at various spatial scales from headwater basins to continent-wide river systems. The tool can utilize both traditional grid-based river network and vector-based river network data. Both types of river network include river segment lines and the associated drainage basin polygons, but the vector-based river network can represent finer-scale river lines than the grid-based network. Streamflow estimates at any desired location in the river network can be easily extracted from the output of mizuRoute. The routing process is simulated as two separate steps. First, hillslope routing is performed with a gamma-distribution-based unit-hydrograph to transport runoff from a hillslope to a catchment outlet. The second step is river channel routing, which is performed with one of two routing scheme options: (1) a kinematic wave tracking (KWT) routing procedure; and (2) an impulse response function – unit-hydrograph (IRF-UH) routing procedure. The mizuRoute tool also includes scripts (python, NetCDF operators) to pre-process spatial river network data. This paper demonstrates mizuRoute's capabilities to produce spatially distributed streamflow simulations based on river networks from the United States Geological Survey (USGS) Geospatial Fabric (GF) data set in which over 54 000 river segments and their contributing areas are mapped across the contiguous United States (CONUS). A brief analysis of model parameter sensitivity is also provided. The mizuRoute tool can assist model-based water resources assessments including studies of the impacts of climate change on streamflow.
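A minimal sketch of the gamma-distribution unit-hydrograph idea used in the hillslope routing step, assuming hypothetical shape and scale parameters (this is not the mizuRoute source code):

```python
# Sketch of gamma-distribution unit-hydrograph hillslope routing, in the
# spirit of the step described above; shape/scale values are hypothetical.
import numpy as np
from scipy.stats import gamma

dt = 1.0                                  # time step (hours)
t = np.arange(0, 96, dt)
# Unit hydrograph: fraction of instantaneous runoff arriving at the outlet
# in each interval, from gamma CDF differences (sums to ~1).
uh = np.diff(gamma.cdf(np.append(t, t[-1] + dt), a=2.5, scale=6.0))

runoff = np.zeros_like(t)
runoff[10:14] = 5.0                       # a 4-hour runoff pulse (mm/h)
outflow = np.convolve(runoff, uh)[: t.size]   # routed hydrograph

print(f"peak inflow {runoff.max():.2f}, routed peak {outflow.max():.2f} mm/h")
```

The convolution spreads and delays the input pulse, which is exactly the attenuating effect hillslope routing is meant to capture before channel routing takes over.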
NASA Technical Reports Server (NTRS)
Stouffer, D. C.; Sheh, M. Y.
1988-01-01
A micromechanical model based on crystallographic slip theory was formulated for nickel-base single crystal superalloys. The current equations include both drag stress and back stress state variables to model the local inelastic flow. Specially designed experiments have been conducted to evaluate the effect of back stress in single crystals. The results showed that (1) the back stress is orientation dependent; and (2) the back stress state variable in the inelastic flow equation is necessary for predicting anelastic behavior of the material. The model also demonstrated improved fatigue predictive capability. Model predictions and experimental data are presented for single crystal superalloy Rene N4 at 982 C.
Human Language Technology: Opportunities and Challenges
2005-01-01
because of the connections to and reliance on signal processing. Audio diarization critically includes indexing of speakers [12], since speaker ... to reduce inter-speaker variability in training. Standard techniques include vocal-tract length normalization, adaptation of acoustic models using ... maximum likelihood linear regression (MLLR), and speaker-adaptive training based on MLLR. The acoustic models are mixtures of Gaussians, typically with
Functional Additive Mixed Models
Scheipl, Fabian; Staicu, Ana-Maria; Greven, Sonja
2014-01-01
We propose an extensive framework for additive regression models for correlated functional responses, allowing for multiple partially nested or crossed functional random effects with flexible correlation structures for, e.g., spatial, temporal, or longitudinal functional data. Additionally, our framework includes linear and nonlinear effects of functional and scalar covariates that may vary smoothly over the index of the functional response. It accommodates densely or sparsely observed functional responses and predictors which may be observed with additional error and includes both spline-based and functional principal component-based terms. Estimation and inference in this framework is based on standard additive mixed models, allowing us to take advantage of established methods and robust, flexible algorithms. We provide easy-to-use open source software in the pffr() function for the R-package refund. Simulations show that the proposed method recovers relevant effects reliably, handles small sample sizes well and also scales to larger data sets. Applications with spatially and longitudinally observed functional data demonstrate the flexibility in modeling and interpretability of results of our approach. PMID:26347592
Cost-effectiveness in Clostridium difficile treatment decision-making
Nuijten, Mark JC; Keller, Josbert J; Visser, Caroline E; Redekop, Ken; Claassen, Eric; Speelman, Peter; Pronk, Marja H
2015-01-01
AIM: To develop a framework for the clinical and health economic assessment for management of Clostridium difficile infection (CDI). METHODS: CDI has vast economic consequences, emphasizing the need for the innovative and cost-effective solutions that were the aim of this study. A guidance model was developed for coverage decisions and guideline development in CDI. The model included pharmacotherapy with oral metronidazole or oral vancomycin, which is the mainstay for pharmacological treatment of CDI and is recommended by most treatment guidelines. RESULTS: A design for a patient-based cost-effectiveness model was developed, which can be used to estimate the cost-effectiveness of current and future treatment strategies in CDI. Patient-based outcomes were extrapolated to the population by including factors such as person-to-person transmission, isolation precautions, and the closing and cleaning of hospital wards. CONCLUSION: The proposed framework for a population-based CDI model may be used for clinical and health economic assessments of CDI guidelines and coverage decisions for emerging treatments for CDI. PMID:26601096
Functional Additive Mixed Models.
Scheipl, Fabian; Staicu, Ana-Maria; Greven, Sonja
2015-04-01
We propose an extensive framework for additive regression models for correlated functional responses, allowing for multiple partially nested or crossed functional random effects with flexible correlation structures for, e.g., spatial, temporal, or longitudinal functional data. Additionally, our framework includes linear and nonlinear effects of functional and scalar covariates that may vary smoothly over the index of the functional response. It accommodates densely or sparsely observed functional responses and predictors which may be observed with additional error and includes both spline-based and functional principal component-based terms. Estimation and inference in this framework is based on standard additive mixed models, allowing us to take advantage of established methods and robust, flexible algorithms. We provide easy-to-use open source software in the pffr() function for the R-package refund. Simulations show that the proposed method recovers relevant effects reliably, handles small sample sizes well and also scales to larger data sets. Applications with spatially and longitudinally observed functional data demonstrate the flexibility in modeling and interpretability of results of our approach.
A Ricin Forensic Profiling Approach Based on a Complex Set of Biomarkers
Fredriksson, Sten-Ake; Wunschel, David S.; Lindstrom, Susanne Wiklund; ...
2018-03-28
A forensic method for the retrospective determination of preparation methods used for illicit ricin toxin production was developed. The method was based on a complex set of biomarkers, including carbohydrates, fatty acids, seed storage proteins, in combination with data on ricin and Ricinus communis agglutinin. The analyses were performed on samples prepared from four castor bean plant (R. communis) cultivars by four different sample preparation methods (PM1 – PM4) ranging from simple disintegration of the castor beans to multi-step preparation methods including different protein precipitation methods. Comprehensive analytical data was collected by use of a range of analytical methods and robust orthogonal partial least squares discriminant analysis (OPLS-DA) models were constructed based on the calibration set. By the use of a decision tree and two OPLS-DA models, the sample preparation methods of test set samples were determined. The model statistics of the two models were good and a 100% rate of correct predictions of the test set was achieved.
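For readers unfamiliar with the chemometric step, the sketch below shows a PLS-based discriminant analysis on synthetic biomarker data. Two hedges apply: scikit-learn ships plain PLS rather than OPLS, so PLSRegression on class indicator labels stands in for the OPLS-DA used in the study, and the data are invented.

```python
# Illustrative PLS-DA stand-in for the OPLS-DA step described above.
# scikit-learn provides plain PLS (not OPLS), so PLSRegression on indicator
# labels is used here; biomarker data are synthetic, not the study's.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
n, p = 40, 25                       # samples x biomarker features
X = rng.normal(size=(n, p))
y = np.repeat([0, 1], n // 2)       # two preparation-method classes
X[y == 1, :3] += 1.5                # class-separating biomarkers

pls = PLSRegression(n_components=2).fit(X, y)
pred = (pls.predict(X).ravel() > 0.5).astype(int)
print(f"training accuracy: {(pred == y).mean():.2f}")
```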
NASA Astrophysics Data System (ADS)
Bai, Hailong; Montési, Laurent G. J.; Behn, Mark D.
2017-01-01
MeltMigrator is a MATLAB®-based melt migration software developed to process three-dimensional mantle temperature and velocity data from user-supplied numerical models of mid-ocean ridges, calculate melt production and melt migration trajectories in the mantle, estimate melt flux along plate boundaries, and predict crustal thickness distribution on the seafloor. MeltMigrator is also capable of calculating compositional evolution depending on the choice of petrologic melting model. Programmed in modules, MeltMigrator is highly customizable and can be expanded to a wide range of applications. We have applied it to complex mid-ocean ridge model settings, including transform faults, oblique segments, ridge migration, asymmetrical spreading, background mantle flow, and ridge-plume interaction. In this technical report, we include an example application to a segmented mid-ocean ridge. MeltMigrator is available as a supplement to this paper, and it is also available from GitHub and the University of Maryland Geodynamics Group website.
Cost-effectiveness in Clostridium difficile treatment decision-making.
Nuijten, Mark Jc; Keller, Josbert J; Visser, Caroline E; Redekop, Ken; Claassen, Eric; Speelman, Peter; Pronk, Marja H
2015-11-16
To develop a framework for the clinical and health economic assessment for management of Clostridium difficile infection (CDI). CDI has vast economic consequences, emphasizing the need for the innovative and cost-effective solutions that were the aim of this study. A guidance model was developed for coverage decisions and guideline development in CDI. The model included pharmacotherapy with oral metronidazole or oral vancomycin, which is the mainstay for pharmacological treatment of CDI and is recommended by most treatment guidelines. A design for a patient-based cost-effectiveness model was developed, which can be used to estimate the cost-effectiveness of current and future treatment strategies in CDI. Patient-based outcomes were extrapolated to the population by including factors such as person-to-person transmission, isolation precautions, and the closing and cleaning of hospital wards. The proposed framework for a population-based CDI model may be used for clinical and health economic assessments of CDI guidelines and coverage decisions for emerging treatments for CDI.
A Ricin Forensic Profiling Approach Based on a Complex Set of Biomarkers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fredriksson, Sten-Ake; Wunschel, David S.; Lindstrom, Susanne Wiklund
A forensic method for the retrospective determination of preparation methods used for illicit ricin toxin production was developed. The method was based on a complex set of biomarkers, including carbohydrates, fatty acids, seed storage proteins, in combination with data on ricin and Ricinus communis agglutinin. The analyses were performed on samples prepared from four castor bean plant (R. communis) cultivars by four different sample preparation methods (PM1 – PM4) ranging from simple disintegration of the castor beans to multi-step preparation methods including different protein precipitation methods. Comprehensive analytical data was collected by use of a range of analytical methods and robust orthogonal partial least squares discriminant analysis (OPLS-DA) models were constructed based on the calibration set. By the use of a decision tree and two OPLS-DA models, the sample preparation methods of test set samples were determined. The model statistics of the two models were good and a 100% rate of correct predictions of the test set was achieved.
Integrate Data into Scientific Workflows for Terrestrial Biosphere Model Evaluation through Brokers
NASA Astrophysics Data System (ADS)
Wei, Y.; Cook, R. B.; Du, F.; Dasgupta, A.; Poco, J.; Huntzinger, D. N.; Schwalm, C. R.; Boldrini, E.; Santoro, M.; Pearlman, J.; Pearlman, F.; Nativi, S.; Khalsa, S.
2013-12-01
Terrestrial biosphere models (TBMs) have become integral tools for extrapolating local observations and process-level understanding of land-atmosphere carbon exchange to larger regions. Model-model and model-observation intercomparisons are critical to understand the uncertainties within model outputs, to improve model skill, and to improve our understanding of land-atmosphere carbon exchange. The DataONE Exploration, Visualization, and Analysis (EVA) working group is evaluating TBMs using scientific workflows in UV-CDAT/VisTrails. This workflow-based approach promotes collaboration and improved tracking of evaluation provenance. But challenges still remain. The multi-scale and multi-discipline nature of TBMs makes it necessary to include diverse and distributed data resources in model evaluation. These include, among others, remote sensing data from NASA, flux tower observations from various organizations including DOE, and inventory data from the US Forest Service. A key challenge is to make heterogeneous data from different organizations and disciplines discoverable and readily integrated for use in scientific workflows. This presentation introduces the brokering approach taken by the DataONE EVA to fill the gap between TBM evaluation scientific workflows and cross-organization and cross-discipline data resources. The DataONE EVA started the development of an Integrated Model Intercomparison Framework (IMIF) that leverages standards-based discovery and access brokers to dynamically discover, access, and transform (e.g., subsetting and resampling) diverse data products from DataONE, Earth System Grid (ESG), and other data repositories into a format that can be readily used by scientific workflows in UV-CDAT/VisTrails. The discovery and access brokers serve as independent middleware that bridges existing data repositories and TBM evaluation scientific workflows but introduces little overhead to either component. In the initial work, an OpenSearch-based discovery broker is leveraged to provide a consistent mechanism for data discovery. Standards-based data services, including Open Geospatial Consortium (OGC) Web Coverage Service (WCS) and THREDDS, are leveraged to provide on-demand data access and transformations through the data access broker. To ease the adoption of broker services, a package of broker client VisTrails modules has been developed to be easily plugged into scientific workflows. The initial IMIF has been successfully tested in selected model evaluation scenarios involved in the NASA-funded Multi-scale Synthesis and Terrestrial Model Intercomparison Project (MsTMIP).
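A sketch of how a workflow module might query an OpenSearch-style discovery broker is shown below; the endpoint URL and parameter names are hypothetical placeholders, not the actual DataONE/EVA broker interface.

```python
# Sketch of calling an OpenSearch-style discovery broker as described above;
# the endpoint URL and parameter names here are hypothetical placeholders.
import requests

BROKER = "https://example.org/opensearch"     # placeholder endpoint

def discover(keyword, bbox, start, end):
    params = {
        "q": keyword,                        # free-text search term
        "bbox": ",".join(map(str, bbox)),    # lon/lat bounding box
        "timeStart": start,
        "timeEnd": end,
        "format": "atom",
    }
    resp = requests.get(BROKER, params=params, timeout=30)
    resp.raise_for_status()
    return resp.text                         # Atom feed of matching granules

# Example (not executed here):
# feed = discover("GPP", (-125, 24, -66, 50), "2000-01-01", "2010-12-31")
```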
Witkiewitz, Katie; Bowen, Sarah; Harrop, Erin N; Douglas, Haley; Enkema, Matthew; Sedgwick, Carly
2014-04-01
Mindfulness-based treatments are growing in popularity among addiction treatment providers, and several studies suggest the efficacy of incorporating mindfulness practices into the treatment of addiction, including the treatment of substance use disorders and behavioral addictions (i.e., gambling). The current paper provides a review of theoretical models of mindfulness in the treatment of addiction and several hypothesized mechanisms of change. We provide an overview of mindfulness-based relapse prevention (MBRP), including session content, treatment targets, and client feedback from participants who have received MBRP in the context of empirical studies. Future research directions regarding operationalization and measurement, identifying factors that moderate treatment effects, and protocol adaptations for specific populations are discussed.
Gray, Steven; Voinov, Alexey; Paolisso, Michael; Jordan, Rebecca; BenDor, Todd; Bommel, Pierre; Glynn, Pierre D.; Hedelin, Beatrice; Hubacek, Klaus; Introne, Josh; Kolagani, Nagesh; Laursen, Bethany; Prell, Christina; Schmitt-Olabisi, Laura; Singer, Alison; Sterling, Eleanor J.; Zellner, Moira
2018-01-01
Including stakeholders in environmental model building and analysis is an increasingly popular approach to understanding ecological change. This is because stakeholders often hold valuable knowledge about socio-environmental dynamics and collaborative forms of modeling produce important boundary objects used to collectively reason about environmental problems. Although the number of participatory modeling (PM) case studies and the number of researchers adopting these approaches has grown in recent years, the lack of standardized reporting and limited reproducibility have prevented PM's establishment and advancement as a cohesive field of study. We suggest a four-dimensional framework (4P) that includes reporting on dimensions of (1) the Purpose for selecting a PM approach (the why); (2) the Process by which the public was involved in model building or evaluation (the how); (3) the Partnerships formed (the who); and (4) the Products that resulted from these efforts (the what). We highlight four case studies that use common PM software-based approaches (fuzzy cognitive mapping, agent-based modeling, system dynamics, and participatory geospatial modeling) to understand human–environment interactions and the consequences of ecological changes, including bushmeat hunting in Tanzania and Cameroon, agricultural production and deforestation in Zambia, and groundwater management in India. We demonstrate how standardizing communication about PM case studies can lead to innovation and new insights about model-based reasoning in support of ecological policy development. We suggest that our 4P framework and reporting approach provides a way for new hypotheses to be identified and tested in the growing field of PM.
NASA Astrophysics Data System (ADS)
Hamzah, Afiq; Hamid, Fatimah A.; Ismail, Razali
2016-12-01
An explicit solution for long-channel surrounding-gate (SRG) MOSFETs is presented from intrinsic to heavily doped body, including the effects of interface traps and fixed oxide charges. The solution is based on the core SRG MOSFET model of the Unified Charge Control Model (UCCM) for heavily doped conditions. The UCCM model of highly doped SRG MOSFETs is derived to obtain the exact equivalent expression as in the undoped case. Taking advantage of the undoped explicit charge-based expression, the asymptotic limits for below threshold and above threshold have been redefined to include the effect of trap states for heavily doped cases. After solving the asymptotic limits, an explicit mobile charge expression is obtained which includes the trap state effects. The explicit mobile charge model shows very good agreement with numerical simulation over practical terminal voltages, doping concentrations, geometry effects, and trap state effects due to the fixed oxide charges and interface traps. Then, the drain current is obtained using Pao-Sah's dual integral, which is expressed as a function of inversion charge densities at the source/drain ends. The drain current agreed well with the implicit solution and numerical simulation for all regions of operation without employing any empirical parameters. A comparison with previous explicit models has been conducted to verify the competency of the proposed model at a doping concentration of 1×10^19 cm^-3; the proposed model has clear advantages in terms of simplicity and accuracy at higher doping concentrations.
Gray, Steven; Voinov, Alexey; Paolisso, Michael; Jordan, Rebecca; BenDor, Todd; Bommel, Pierre; Glynn, Pierre; Hedelin, Beatrice; Hubacek, Klaus; Introne, Josh; Kolagani, Nagesh; Laursen, Bethany; Prell, Christina; Schmitt Olabisi, Laura; Singer, Alison; Sterling, Eleanor; Zellner, Moira
2018-01-01
Including stakeholders in environmental model building and analysis is an increasingly popular approach to understanding ecological change. This is because stakeholders often hold valuable knowledge about socio-environmental dynamics and collaborative forms of modeling produce important boundary objects used to collectively reason about environmental problems. Although the number of participatory modeling (PM) case studies and the number of researchers adopting these approaches has grown in recent years, the lack of standardized reporting and limited reproducibility have prevented PM's establishment and advancement as a cohesive field of study. We suggest a four-dimensional framework (4P) that includes reporting on dimensions of (1) the Purpose for selecting a PM approach (the why); (2) the Process by which the public was involved in model building or evaluation (the how); (3) the Partnerships formed (the who); and (4) the Products that resulted from these efforts (the what). We highlight four case studies that use common PM software-based approaches (fuzzy cognitive mapping, agent-based modeling, system dynamics, and participatory geospatial modeling) to understand human-environment interactions and the consequences of ecological changes, including bushmeat hunting in Tanzania and Cameroon, agricultural production and deforestation in Zambia, and groundwater management in India. We demonstrate how standardizing communication about PM case studies can lead to innovation and new insights about model-based reasoning in support of ecological policy development. We suggest that our 4P framework and reporting approach provides a way for new hypotheses to be identified and tested in the growing field of PM. © 2017 by the Ecological Society of America.
2014-01-01
Background: mRNA translation involves simultaneous movement of multiple ribosomes on the mRNA and is also subject to regulatory mechanisms at different stages. Translation can be described by various codon-based models, including ODE, TASEP, and Petri net models. Although such models have been extensively used, the overlap and differences between these models and the implications of the assumptions of each model have not been systematically elucidated. The selection of the most appropriate modelling framework, and the most appropriate way to develop coarse-grained/fine-grained models in different contexts, are not clear. Results: We systematically analyze and compare how different modelling methodologies can be used to describe translation. We define various statistically equivalent codon-based simulation algorithms and analyze the importance of the update rule in determining the steady state, an aspect often neglected. Then a novel probabilistic Boolean network (PBN) model is proposed for modelling translation, which enjoys an exact numerical solution. This solution matches those of numerical simulation from other methods and acts as a complementary tool to analytical approximations and simulations. The advantages and limitations of various codon-based models are compared, and illustrated by examples with real biological complexities such as slow codons, premature termination and feedback regulation. Our studies reveal that while different models give broadly similar trends in many cases, important differences also arise and can be clearly seen in the dependence of the translation rate on different parameters. Furthermore, the update rule affects the steady state solution. Conclusions: The codon-based models are based on different levels of abstraction. Our analysis suggests that a multiple model approach to understanding translation allows one to ascertain which aspects of the conclusions are robust with respect to the choice of modelling methodology, and when (and why) important differences may arise. This approach also allows for an optimal use of analysis tools, which is especially important when additional complexities or regulatory mechanisms are included. This approach can provide a robust platform for dissecting translation, and results in an improved predictive framework for applications in systems and synthetic biology. PMID:24576337
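To make one of the model classes concrete, below is a minimal TASEP simulation with a random-sequential update rule (the update-rule sensitivity noted above is exactly what such a toy model exposes); the lattice length and rates are arbitrary illustrations.

```python
# Minimal TASEP simulation of ribosome traffic on an mRNA, one of the
# codon-based model classes discussed above; rates are hypothetical.
import random

L = 100                          # codons (lattice sites)
alpha, beta, k = 0.3, 0.5, 1.0   # initiation, termination, elongation rates
occ = [0] * L

def sweep():
    # Random-sequential update: the choice of update rule matters, as noted.
    for _ in range(L + 1):
        i = random.randint(0, L)
        if i == 0:                           # initiation at the 5' end
            if not occ[0] and random.random() < alpha:
                occ[0] = 1
        elif i == L:                         # termination at the 3' end
            if occ[L - 1] and random.random() < beta:
                occ[L - 1] = 0
        elif occ[i - 1] and not occ[i] and random.random() < k:
            occ[i - 1], occ[i] = 0, 1        # hop forward if next site free

for _ in range(20000):
    sweep()
print(f"steady-state ribosome density: {sum(occ) / L:.3f}")
```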
NASA Technical Reports Server (NTRS)
Briggs, Maxwell H.
2011-01-01
The Fission Power System (FPS) project is developing a Technology Demonstration Unit (TDU) to verify the performance and functionality of a subscale version of the FPS reference concept in a relevant environment, and to verify component and system models. As hardware is developed for the TDU, component and system models must be refined to include the details of specific component designs. This paper describes the development of a Sage-based pseudo-steady-state Stirling convertor model and its implementation into a system-level model of the TDU.
Web-based application for inverting one-dimensional magnetotelluric data using Python
NASA Astrophysics Data System (ADS)
Suryanto, Wiwit; Irnaka, Theodosius Marwan
2016-11-01
One-dimensional modeling of magnetotelluric (MT) data has been performed using an online application on a web-based virtual private server. The application was developed in Python using the Django framework, with HTML and CSS components. The input data, including the apparent resistivity and phase as a function of period or frequency with standard deviation, can be entered through an interactive web page that can be freely accessed at https://komputasi.geofisika.ugm.ac.id. The subsurface models, represented by resistivity as a function of depth, are iteratively improved by changing the model parameters, such as the resistivity and the layer depth, based on the observed apparent resistivity and phase data. The output of the application displayed on the screen presents resistivity as a function of depth and includes the RMS error for each iteration. Synthetic and real data were used in comparative tests of the application's performance, and it is shown that the application recovered accurate subsurface resistivity models. Hence, this application can be used for practical one-dimensional modeling of MT data.
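For context, the core computation such an inversion iterates on is the standard 1-D MT forward problem. The sketch below implements the usual layered-earth impedance recursion; it is illustrative and not the application's actual code.

```python
# Sketch of the 1-D magnetotelluric forward problem: apparent resistivity
# and phase for a layered half-space via the standard impedance recursion.
import numpy as np

MU0 = 4e-7 * np.pi

def mt1d(resistivities, thicknesses, periods):
    """Apparent resistivity (ohm-m) and phase (deg) for a layered earth."""
    rho_a, phase = [], []
    for T in periods:
        omega = 2 * np.pi / T
        # Impedance of the terminating half-space (bottom layer)
        Z = np.sqrt(1j * omega * MU0 * resistivities[-1])
        # Recurse upward through the finite layers
        for rho, h in zip(resistivities[-2::-1], thicknesses[::-1]):
            Z0 = np.sqrt(1j * omega * MU0 * rho)     # intrinsic impedance
            k = np.sqrt(1j * omega * MU0 / rho)      # propagation constant
            t = np.tanh(k * h)
            Z = Z0 * (Z + Z0 * t) / (Z0 + Z * t)
        rho_a.append(abs(Z) ** 2 / (omega * MU0))
        phase.append(np.degrees(np.angle(Z)))
    return np.array(rho_a), np.array(phase)

# Three-layer example: 100 / 10 / 1000 ohm-m, thicknesses 500 m and 1000 m
rho_a, ph = mt1d([100, 10, 1000], [500, 1000], np.logspace(-2, 3, 20))
print(rho_a[0], ph[0])
```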
NASA Astrophysics Data System (ADS)
Yung, L. Y. Aaron; Somerville, Rachel S.
2017-06-01
The well-established Santa Cruz semi-analytic galaxy formation framework has been shown to be quite successful at explaining observations in the local Universe, as well as making predictions for low-redshift observations. Recently, metallicity-based gas partitioning and H2-based star formation recipes have been implemented in our model, replacing the legacy cold-gas based recipe. We then use our revised model to explore the high-redshift Universe and make predictions up to z = 15. Although our model is only calibrated to observations from the local Universe, our predictions match remarkably well the mid- to high-redshift observational constraints available to date, including rest-frame UV luminosity functions and the reionization history as constrained by CMB and IGM observations. We provide predictions for individual and statistical galaxy properties at a wide range of redshifts (z = 4 - 15), including objects that are too distant or too faint to be detected with current facilities. Using our model predictions, we also provide forecasted luminosity functions and other observables for upcoming studies with JWST.
Systematic Review of Model-Based Economic Evaluations of Treatments for Alzheimer's Disease.
Hernandez, Luis; Ozen, Asli; DosSantos, Rodrigo; Getsios, Denis
2016-07-01
Numerous economic evaluations using decision-analytic models have assessed the cost effectiveness of treatments for Alzheimer's disease (AD) in the last two decades. It is important to understand the methods used in the existing models of AD and how they could impact results, as they could inform new model-based economic evaluations of treatments for AD. The aim of this systematic review was to provide a detailed description of the relevant aspects and components of existing decision-analytic models of AD, identifying areas for improvement and future development, and to conduct a quality assessment of the included studies. We performed a systematic and comprehensive review of cost-effectiveness studies of pharmacological treatments for AD published in the last decade (January 2005 to February 2015) that used decision-analytic models, also including studies considering patients with mild cognitive impairment (MCI). The background information of the included studies and specific information on the decision-analytic models, including their approach and components, assumptions, data sources, analyses, and results, were obtained from each study. A description of how the modeling approaches and assumptions differ across studies, identifying areas for improvement and future development, is provided. At the end, we present our own view of the potential future directions of decision-analytic models of AD and the challenges they might face. The included studies present a variety of different approaches, assumptions, and scope of decision-analytic models used in the economic evaluation of pharmacological treatments of AD. The major areas for improvement in future models of AD are to include domains of cognition, function, and behavior, rather than cognition alone; include a detailed description of how data used to model the natural course of disease progression were derived; state and justify the economic model selected and structural assumptions and limitations; provide a detailed (rather than high-level) description of the cost components included in the model; and report on the face-, internal-, and cross-validity of the model to strengthen the credibility and confidence in model results. The quality scores of most studies were rated as fair to good (average 87.5, range 69.5-100, on a scale of 0-100). Despite the advancements in decision-analytic models of AD, there remain several areas of improvement that are necessary to more appropriately and realistically capture the broad nature of AD and the potential benefits of treatments in future models of AD.
A Cognitive Diagnosis Model for Cognitively Based Multiple-Choice Options
ERIC Educational Resources Information Center
de la Torre, Jimmy
2009-01-01
Cognitive or skills diagnosis models are discrete latent variable models developed specifically for the purpose of identifying the presence or absence of multiple fine-grained skills. However, applications of these models typically involve dichotomous or dichotomized data, including data from multiple-choice (MC) assessments that are scored as…
Pragmatic User Model Implementation in an Intelligent Help System.
ERIC Educational Resources Information Center
Fernandez-Manjon, Baltasar; Fernandez-Valmayor, Alfredo; Fernandez-Chamizo, Carmen
1998-01-01
Describes Aran, a knowledge-based system designed to help users deal with problems related to Unix operation. Highlights include adaptation to the individual user; user modeling knowledge; stereotypes; content of the individual user model; instantiation, acquisition, and maintenance of the individual model; dynamic acquisition of objective and…
Two Undergraduate Process Modeling Courses Taught Using Inductive Learning Methods
ERIC Educational Resources Information Center
Soroush, Masoud; Weinberger, Charles B.
2010-01-01
This manuscript presents a successful application of inductive learning in process modeling. It describes two process modeling courses that use inductive learning methods such as inquiry learning and problem-based learning, among others. The courses include a novel collection of multi-disciplinary complementary process modeling examples. They were…
A Model for Teaching Experiential Counseling Interventions to Novice Counselors.
ERIC Educational Resources Information Center
Cummings, Anne L.
1992-01-01
Describes model for teaching experiential interventions to novice counselors. Includes two experiential interventions that are focus for new model: two-chair approach based on Gestalt therapy principles and resolution of problematic reaction points. Cognitive, affective, and behavioral concepts of model are related to transfer of learning with the…
Community Organizing Practices in Academia: A Model, and Stories of Partnerships
ERIC Educational Resources Information Center
Avila, Maria
2010-01-01
This article describes a model of civic engagement based on four key community organizing practices, created at Occidental College and implemented since 2001. The foundations of this model do not include confrontation, mass mobilization, or demonstrations--tactics commonly associated with the term community organizing. This model, instead,…
Using Weighted Least Squares Regression for Obtaining Langmuir Sorption Constants
USDA-ARS?s Scientific Manuscript database
One of the most commonly used models for describing phosphorus (P) sorption to soils is the Langmuir model. To obtain model parameters, the Langmuir model is fit to measured sorption data using least squares regression. Least squares regression is based on several assumptions including normally distributed...
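A weighted least squares fit of the Langmuir model can be written in a few lines; in the sketch below the data values and the error model supplying the weights are synthetic stand-ins, not measurements from the manuscript.

```python
# Sketch of fitting the Langmuir isotherm S = Smax*K*C/(1 + K*C) by weighted
# least squares, the issue discussed above; data values are synthetic.
import numpy as np
from scipy.optimize import curve_fit

def langmuir(C, Smax, K):
    return Smax * K * C / (1.0 + K * C)

C = np.array([0.5, 1, 2, 5, 10, 20, 40])           # solution P (mg/L)
S = np.array([45, 80, 130, 210, 260, 295, 310.0])  # sorbed P (mg/kg)
sigma = 0.05 * S + 1.0   # assumed error model: uncertainty grows with S

popt, pcov = curve_fit(langmuir, C, S, p0=[300, 0.1],
                       sigma=sigma, absolute_sigma=True)
Smax, K = popt
print(f"Smax = {Smax:.1f} mg/kg, K = {K:.3f} L/mg")
```

Passing `sigma` to `curve_fit` down-weights the high-concentration points with larger assumed errors, which is the practical difference between weighted and ordinary least squares here.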
DOE Office of Scientific and Technical Information (OSTI.GOV)
Geotechnical Sciences Group Bechtel Nevada
2006-01-01
A new three-dimensional hydrostratigraphic framework model for the Yucca Flat-Climax Mine Corrective Action Unit was completed in 2005. The model area includes Yucca Flat and Climax Mine, former nuclear testing areas at the Nevada Test Site, and proximal areas. The model area is approximately 1,250 square kilometers in size and is geologically complex. Yucca Flat is a topographically closed basin typical of many valleys in the Basin and Range province. Faulted and tilted blocks of Tertiary-age volcanic rocks and underlying Proterozoic and Paleozoic sedimentary rocks form low ranges around the structural basin. During the Cretaceous Period a granitic intrusive was emplaced at the north end of Yucca Flat. A diverse set of geological and geophysical data collected over the past 50 years was used to develop a structural model and hydrostratigraphic system for the basin. These were integrated using EarthVision software to develop the 3-dimensional hydrostratigraphic framework model. Fifty-six stratigraphic units in the model area were grouped into 25 hydrostratigraphic units based on each unit's propensity toward aquifer or aquitard characteristics. The authors organized the alluvial section into 3 hydrostratigraphic units including 2 aquifers and 1 confining unit. The volcanic units in the model area are organized into 13 hydrostratigraphic units that include 8 aquifers and 5 confining units. The underlying pre-Tertiary rocks are divided into 7 hydrostratigraphic units, including 3 aquifers and 4 confining units. Other units include 1 Tertiary-age sedimentary confining unit and 1 Mesozoic-age granitic confining unit. The model depicts the thickness, extent, and geometric relationships of these hydrostratigraphic units ("layers" in the model) along with the major structural features (i.e., faults). The model incorporates 178 high-angle normal faults of Tertiary age and 2 low-angle thrust faults of Mesozoic age. The complexity of the model area and the non-uniqueness of some of the interpretations incorporated into the base model made it necessary to formulate alternative interpretations for some of the major features in the model. Five of these alternatives were developed so they could be modeled in the same fashion as the base model. This work was done for the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office in support of the Underground Test Area subproject of the Environmental Restoration Project.
Salvi, Daniele; Macali, Armando; Mariottini, Paolo
2014-01-01
The bivalve family Ostreidae has a worldwide distribution and includes species of high economic importance. Phylogenetics and systematics of oysters based on morphology have proved difficult because of their high phenotypic plasticity. In this study we explore the phylogenetic information of the DNA sequence and secondary structure of the nuclear, fast-evolving ITS2 rRNA and the mitochondrial 16S rRNA genes from the Ostreidae, and we implemented a multi-locus framework based on four loci for oyster phylogenetics and systematics. Sequence-structure rRNA models aid sequence alignment and improved accuracy and nodal support of phylogenetic trees. In agreement with previous molecular studies, our phylogenetic results indicate that none of the currently recognized subfamilies, Crassostreinae, Ostreinae, and Lophinae, is monophyletic. Single gene trees based on Maximum likelihood (ML) and Bayesian (BA) methods and on sequence-structure ML were congruent with multilocus trees based on concatenated (ML and BA) and coalescent-based (BA) approaches and consistently supported three main clades: (i) Crassostrea, (ii) Saccostrea, and (iii) an Ostreinae-Lophinae lineage. Therefore, the subfamily Crassostreinae (including Crassostrea), Saccostreinae subfam. nov. (including Saccostrea and tentatively Striostrea) and Ostreinae (including Ostreinae and Lophinae taxa) are recognized. Based on phylogenetic and biogeographical evidence the Asian species of Crassostrea from the Pacific Ocean are assigned to Magallana gen. nov., whereas an integrative taxonomic revision is required for the genera Ostrea and Dendostrea. This study pointed out the suitability of the ITS2 marker for DNA barcoding of oysters and the relevance of using sequence-structure rRNA models and features of the ITS2 folding in molecular phylogenetics and taxonomy. The multilocus approach allowed the inference of a robust phylogeny of Ostreidae, providing a broad molecular perspective on their systematics. PMID:25250663
Salvi, Daniele; Macali, Armando; Mariottini, Paolo
2014-01-01
The bivalve family Ostreidae has a worldwide distribution and includes species of high economic importance. Phylogenetics and systematics of oysters based on morphology have proved difficult because of their high phenotypic plasticity. In this study we explore the phylogenetic information of the DNA sequence and secondary structure of the nuclear, fast-evolving ITS2 rRNA and the mitochondrial 16S rRNA genes from the Ostreidae, and we implemented a multi-locus framework based on four loci for oyster phylogenetics and systematics. Sequence-structure rRNA models aid sequence alignment and improved accuracy and nodal support of phylogenetic trees. In agreement with previous molecular studies, our phylogenetic results indicate that none of the currently recognized subfamilies, Crassostreinae, Ostreinae, and Lophinae, is monophyletic. Single gene trees based on Maximum likelihood (ML) and Bayesian (BA) methods and on sequence-structure ML were congruent with multilocus trees based on concatenated (ML and BA) and coalescent-based (BA) approaches and consistently supported three main clades: (i) Crassostrea, (ii) Saccostrea, and (iii) an Ostreinae-Lophinae lineage. Therefore, the subfamily Crassostreinae (including Crassostrea), Saccostreinae subfam. nov. (including Saccostrea and tentatively Striostrea) and Ostreinae (including Ostreinae and Lophinae taxa) are recognized [corrected]. Based on phylogenetic and biogeographical evidence the Asian species of Crassostrea from the Pacific Ocean are assigned to Magallana gen. nov., whereas an integrative taxonomic revision is required for the genera Ostrea and Dendostrea. This study pointed out the suitability of the ITS2 marker for DNA barcoding of oysters and the relevance of using sequence-structure rRNA models and features of the ITS2 folding in molecular phylogenetics and taxonomy. The multilocus approach allowed the inference of a robust phylogeny of Ostreidae, providing a broad molecular perspective on their systematics.
Barton, Hugh A; Chiu, Weihsueh A; Setzer, R Woodrow; Andersen, Melvin E; Bailer, A John; Bois, Frédéric Y; Dewoskin, Robert S; Hays, Sean; Johanson, Gunnar; Jones, Nancy; Loizou, George; Macphail, Robert C; Portier, Christopher J; Spendiff, Martin; Tan, Yu-Mei
2007-10-01
Physiologically based pharmacokinetic (PBPK) models are used in mode-of-action based risk and safety assessments to estimate internal dosimetry in animals and humans. When used in risk assessment, these models can provide a basis for extrapolating between species, doses, and exposure routes or for justifying nondefault values for uncertainty factors. Characterization of uncertainty and variability is increasingly recognized as important for risk assessment; this represents a continuing challenge for both PBPK modelers and users. Current practices show significant progress in specifying deterministic biological models and nondeterministic (often statistical) models, estimating parameters using diverse data sets from multiple sources, using them to make predictions, and characterizing uncertainty and variability of model parameters and predictions. The International Workshop on Uncertainty and Variability in PBPK Models, held 31 Oct-2 Nov 2006, identified the state-of-the-science, needed changes in practice and implementation, and research priorities. For the short term, these include (1) multidisciplinary teams to integrate deterministic and nondeterministic/statistical models; (2) broader use of sensitivity analyses, including for structural and global (rather than local) parameter changes; and (3) enhanced transparency and reproducibility through improved documentation of model structure(s), parameter values, sensitivity and other analyses, and supporting, discrepant, or excluded data. Longer-term needs include (1) theoretical and practical methodological improvements for nondeterministic/statistical modeling; (2) better methods for evaluating alternative model structures; (3) peer-reviewed databases of parameters and covariates, and their distributions; (4) expanded coverage of PBPK models across chemicals with different properties; and (5) training and reference materials, such as cases studies, bibliographies/glossaries, model repositories, and enhanced software. The multidisciplinary dialogue initiated by this Workshop will foster the collaboration, research, data collection, and training necessary to make characterizing uncertainty and variability a standard practice in PBPK modeling and risk assessment.
Modeling the Performance of Direct-Detection Doppler Lidar Systems in Real Atmospheres
NASA Technical Reports Server (NTRS)
McGill, Matthew J.; Hart, William D.; McKay, Jack A.; Spinhirne, James D.
1999-01-01
Previous modeling of the performance of spaceborne direct-detection Doppler lidar systems has assumed extremely idealized atmospheric models. Here we develop a technique for modeling the performance of these systems in a more realistic atmosphere, based on actual airborne lidar observations. The resulting atmospheric model contains cloud and aerosol variability that is absent in other simulations of spaceborne Doppler lidar instruments. To produce a realistic simulation of daytime performance, we include solar radiance values that are based on actual measurements and are allowed to vary as the viewing scene changes. Simulations are performed for two types of direct-detection Doppler lidar systems: the double-edge and the multi-channel techniques. Both systems were optimized to measure winds from Rayleigh backscatter at 355 nm. Simulations show that the measurement uncertainty during daytime is degraded by only about 10-20% compared to nighttime performance, provided a proper solar filter is included in the instrument design.
Results of an Oncology Clinical Trial Nurse Role Delineation Study.
Purdom, Michelle A; Petersen, Sandra; Haas, Barbara K
2017-09-01
To evaluate the relevance of a five-dimensional model of clinical trial nursing practice in an oncology clinical trial nurse population. Web-based cross-sectional survey. Online via Qualtrics. 167 oncology nurses throughout the United States, including 41 study coordinators, 35 direct care providers, and 91 dual-role nurses who provide direct patient care and trial coordination. Principal components analysis was used to determine the dimensions of oncology clinical trial nursing practice. Self-reported frequency of 59 activities. The results did not support the original five-dimensional model of nursing care but revealed a more multidimensional model. An analysis of frequency data revealed an eight-dimensional model of oncology research nursing, including care, manage study, expert, lead, prepare, data, advance science, and ethics. This evidence-based model expands understanding of the multidimensional roles of oncology nurses caring for patients with cancer enrolled in clinical trials.
Modeling of Density-Dependent Flow based on the Thermodynamically Constrained Averaging Theory
NASA Astrophysics Data System (ADS)
Weigand, T. M.; Schultz, P. B.; Kelley, C. T.; Miller, C. T.; Gray, W. G.
2016-12-01
The thermodynamically constrained averaging theory (TCAT) has been used to formulate general classes of porous medium models, including new models for density-dependent flow. The TCAT approach provides advantages that include a firm connection between the microscale, or pore scale, and the macroscale; a thermodynamically consistent basis; explicit inclusion of factors such as diffusion arising from pressure and activity gradients; and the ability to describe both high- and low-concentration displacement. The TCAT model is presented, closure relations for the model are postulated based on microscale averages, and parameter estimation is performed on a subset of the experimental data. Due to the sharpness of the fronts, an adaptive moving mesh technique was used to ensure grid-independent solutions within the run-time constraints. The optimized parameters are then used for forward simulations and compared to the set of experimental data not used for the parameter estimation.
NASA Astrophysics Data System (ADS)
Wiggert, J. D.; Pan, C.; Dinniman, M. S.; Lau, Y.; Fitzpatrick, P. J.; O'Brien, S. J.; Bouchard, C.; Quas, L. M.; Miles, T. N.; Cambazoglu, M. K.; Dykstra, S. L.; Dzwonkowski, B.; Jacobs, G. A.; Church, I.; Hofmann, E. E.
2017-12-01
A circulation model based on the Coupled-Ocean-Atmosphere-Wave-Sediment Transport (COAWST) Modeling System, with coupled biogeochemical and sediment transport modules, has been implemented for Mississippi Sound and the adjacent continental shelf region. The model has 400-m horizontal resolution, 24 vertical layers, and includes wetting/drying capability to resolve shallow inshore regions. The circulation model was spun up using oceanographic initial and lateral boundary conditions provided by a 1-km resolution regional implementation of the Navy Coastal Ocean Model (NCOM) in the Gulf of Mexico. The biogeochemical module includes multiple size classes of phytoplankton, zooplankton and detritus, a fish larvae compartment, and explicitly tracks dissolved oxygen with benthic cycling interaction. The sediment transport model is implemented based on benthic mapping data that provides bottom sediment type distributions and spatio-temporal validation. A regionally specific atmospheric forcing product that provides improved spatial and temporal resolution, including diurnal sea breeze impacts, has been developed and applied. Model experiments focus on periods when comprehensive ship-based sampling was deployed by the CONCORDE (Consortium for Coastal River-Dominated Ecosystems) research program, which was established to investigate the complex fine-scale biological, chemical and physical interactions in a marine system controlled by pulsed-river plume dynamics. Biophysical interactions and biogeochemical variability associated with estuarine-shelf exchanges between nearshore lagoonal estuarine waters and the continental shelf revealed by the model provide new insight into how seasonal variation of hydrological forcing conditions influences ecological and biogeochemical processes in the highly productive Northern Gulf region. Application of the COAWST-based model system with and without inclusion of the sediment transport module demonstrates how suspended sediment in the nearshore waters influences inner shelf ecosystem function through impacts exerted on the in situ light environment and particle aggregation-mediated organic matter fluxes.
NASA Astrophysics Data System (ADS)
Cobourn, W. Geoffrey
2010-08-01
An enhanced PM2.5 air quality forecast model based on nonlinear regression (NLR) and back-trajectory concentrations has been developed for use in the Louisville, Kentucky metropolitan area. The PM2.5 air quality forecast model is designed for use in the warm season, from May through September, when PM2.5 air quality is more likely to be critical for human health. The enhanced PM2.5 model consists of a basic NLR model, developed for use with an automated air quality forecast system, and an additional parameter based on upwind PM2.5 concentration, called PM24. The PM24 parameter is designed to be determined manually, by synthesizing backward air trajectory and regional air quality information to compute 24-h back-trajectory concentrations. The PM24 parameter may be used by air quality forecasters to adjust the forecast provided by the automated forecast system. In this study of the 2007 and 2008 forecast seasons, the enhanced model performed well using forecasted meteorological data and PM24 as input. The enhanced PM2.5 model was compared with three alternative models, including the basic NLR model, the basic NLR model with a persistence parameter added, and the NLR model with persistence and PM24. The two models that included PM24 were of comparable accuracy. The two models incorporating back-trajectory concentrations had lower mean absolute errors and higher rates of detecting unhealthy PM2.5 concentrations compared to the other models.
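The structure of such a model can be sketched as a nonlinear meteorological core plus a linear PM24 term; the functional form, predictors, and coefficients below are hypothetical stand-ins, not the paper's actual regression.

```python
# Sketch of augmenting a nonlinear regression forecast with an upwind
# back-trajectory concentration term, as described above; the functional
# form and coefficients here are hypothetical, not the paper's model.
import numpy as np
from scipy.optimize import curve_fit

def pm_model(X, a, b, c, d):
    temp, wind, pm24 = X
    # Hypothetical NLR core (temperature/wind terms) plus linear PM24 term
    return a * np.exp(b * temp) / (1.0 + c * wind) + d * pm24

rng = np.random.default_rng(2)
temp = rng.uniform(15, 35, 120)     # daily max temperature (C)
wind = rng.uniform(0.5, 6, 120)     # wind speed (m/s)
pm24 = rng.uniform(5, 35, 120)      # 24-h back-trajectory PM2.5 (ug/m3)
obs = pm_model((temp, wind, pm24), 2.0, 0.05, 0.4, 0.5) \
      + rng.normal(0, 2, 120)       # synthetic "observed" PM2.5

popt, _ = curve_fit(pm_model, (temp, wind, pm24), obs,
                    p0=[1.0, 0.03, 0.3, 0.3])
print("fitted coefficients:", np.round(popt, 3))
```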
NASA Technical Reports Server (NTRS)
Pi, Xiaoqing; Mannucci, Anthony J.; Verkhoglyadova, Olga P.; Stephens, Philip; Wilson, Brian D.; Akopian, Vardan; Komjathy, Attila; Iijima, Byron A.
2013-01-01
ISOGAME is designed and developed to assess quantitatively the impact of new observation systems on the capability of imaging and modeling the ionosphere. With ISOGAME, one can perform observation system simulation experiments (OSSEs). A typical OSSE using ISOGAME would involve: (1) simulating various ionospheric conditions on global scales; (2) simulating ionospheric measurements made from a constellation of low-Earth-orbiters (LEOs), particularly Global Navigation Satellite System (GNSS) radio occultation data, and from ground-based global GNSS networks; (3) conducting ionospheric data assimilation experiments with the Global Assimilative Ionospheric Model (GAIM); and (4) analyzing modeling results with visualization tools. ISOGAME can provide quantitative assessment of the accuracy of assimilative modeling with the interested observation system. Other observation systems besides those based on GNSS are also possible to analyze. The system is composed of a suite of software that combines the GAIM, including a 4D first-principles ionospheric model and data assimilation modules, an Internal Reference Ionosphere (IRI) model that has been developed by international ionospheric research communities, observation simulator, visualization software, and orbit design, simulation, and optimization software. The core GAIM model used in ISOGAME is based on the GAIM++ code (written in C++) that includes a new high-fidelity geomagnetic field representation (multi-dipole). New visualization tools and analysis algorithms for the OSSEs are now part of ISOGAME.
Evaluation of the Community Multi-scale Air Quality (CMAQ) ...
The Community Multiscale Air Quality (CMAQ) model is a state-of-the-science air quality model that simulates the emission, transport and fate of numerous air pollutants, including ozone and particulate matter. The Computational Exposure Division (CED) of the U.S. Environmental Protection Agency develops the CMAQ model and periodically releases new versions of the model that include bug fixes and various other improvements to the modeling system. In the fall of 2015, CMAQ version 5.1 was released. This new version of CMAQ will contain important bug fixes to several issues that were identified in CMAQv5.0.2 and additionally include updates to other portions of the code. Several annual, and numerous episodic, CMAQv5.1 simulations were performed to assess the impact of these improvements on the model results. These results will be presented, along with a base evaluation of the performance of the CMAQv5.1 modeling system against available surface and upper-air measurements available during the time period simulated. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, proces
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kornreich, Drew E; Vaidya, Rajendra U; Ammerman, Curtt N
Integrated Computational Materials Engineering (ICME) is a novel overarching approach to bridge length and time scales in computational materials science and engineering. This approach integrates all elements of multi-scale modeling (including various empirical and science-based models) with materials informatics to provide users the opportunity to tailor material selections based on stringent application needs. Typically, materials engineering has focused on structural requirements (stress, strain, modulus, fracture toughness etc.) while multi-scale modeling has been science focused (mechanical threshold strength model, grain-size models, solid-solution strengthening models etc.). Materials informatics (mechanical property inventories), on the other hand, is extensively data focused. All of these elements are combined within the framework of ICME to create an architecture for the development, selection, and design of new composite materials for challenging environments. We propose development of the foundations for applying ICME to composite materials development for nuclear and high-radiation environments (including nuclear-fusion energy reactors, nuclear-fission reactors, and accelerators). We expect to combine all elements of current material models (including thermo-mechanical and finite-element models) into the ICME framework. This will be accomplished through the use of various mathematical modeling constructs. These constructs will allow the integration of constituent models, which in turn would allow us to exploit the adaptive strengths of a combinatorial scheme (fabrication and computational) for creating new composite materials. A sample problem where these concepts are used is provided in this summary.
Gao, X-L; Zhang, G Y
2016-07-01
A non-classical model for a Mindlin plate resting on an elastic foundation is developed in a general form using a modified couple stress theory, a surface elasticity theory and a two-parameter Winkler-Pasternak foundation model. It includes all five kinematic variables possible for a Mindlin plate. The equations of motion and the complete boundary conditions are obtained simultaneously through a variational formulation based on Hamilton's principle, and the microstructure, surface energy and foundation effects are treated in a unified manner. The newly developed model contains one material length-scale parameter to describe the microstructure effect, three surface elastic constants to account for the surface energy effect, and two foundation parameters to capture the foundation effect. The current non-classical plate model reduces to its classical elasticity-based counterpart when the microstructure, surface energy and foundation effects are all suppressed. In addition, the new model includes the Mindlin plate models considering the microstructure dependence or the surface energy effect or the foundation influence alone as special cases, recovers the Kirchhoff plate model incorporating the microstructure, surface energy and foundation effects, and degenerates to the Timoshenko beam model including the microstructure effect. To illustrate the new Mindlin plate model, the static bending and free vibration problems of a simply supported rectangular plate are analytically solved by directly applying the general formulae derived.
Robust high-performance control for robotic manipulators
NASA Technical Reports Server (NTRS)
Seraji, Homayoun (Inventor)
1991-01-01
Model-based and performance-based control techniques are combined for an electrical robotic control system. Thus, two distinct and separate design philosophies have been merged into a single control system having a control law formulation including two distinct and separate components, each of which yields a respective signal component that is combined into a total command signal for the system. Those two separate system components include a feedforward controller and a feedback controller. The feedforward controller is model-based and contains any known part of the manipulator dynamics that can be used for on-line control to produce a nominal feedforward component of the system's control signal. The feedback controller is performance-based and consists of a simple adaptive PID controller which generates an adaptive control signal to complement the nominal feedforward signal.
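A minimal sketch of the merged control law described above, a model-based feedforward term plus an adaptive PID feedback term summed into one total command, is shown below. It assumes a single-joint manipulator with invented dynamics parameters; the gain-adaptation rule is a crude placeholder, not the patented scheme.

```python
import numpy as np

def feedforward(q_des, qd_des, qdd_des, inertia=1.2, damping=0.3, gravity=4.9):
    """Model-based term: the known part of the manipulator dynamics (assumed values)."""
    return inertia * qdd_des + damping * qd_des + gravity * np.cos(q_des)

class AdaptivePID:
    """Performance-based term: a PID controller whose gain adapts with the error."""
    def __init__(self, kp=20.0, ki=5.0, kd=2.0, adapt_rate=0.01, dt=1e-3):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.adapt_rate, self.dt = adapt_rate, dt
        self.integral, self.prev_err = 0.0, 0.0

    def __call__(self, err):
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        self.kp += self.adapt_rate * err * err     # crude, illustrative gain adaptation
        return self.kp * err + self.ki * self.integral + self.kd * deriv

pid = AdaptivePID()
q = 0.0                                            # current joint position
q_des, qd_des, qdd_des = 0.5, 0.0, 0.0             # desired trajectory point
u = feedforward(q_des, qd_des, qdd_des) + pid(q_des - q)  # total command signal
print("command torque:", u)
```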
Robust high-performance control for robotic manipulators
NASA Technical Reports Server (NTRS)
Seraji, Homayoun (Inventor)
1989-01-01
Model-based and performance-based control techniques are combined for an electrical robotic control system. Thus, two distinct and separate design philosophies were merged into a single control system having a control law formulation including two distinct and separate components, each of which yields a respective signal component that is combined into a total command signal for the system. Those two separate system components include a feedforward controller and a feedback controller. The feedforward controller is model-based and contains any known part of the manipulator dynamics that can be used for on-line control to produce a nominal feedforward component of the system's control signal. The feedback controller is performance-based and consists of a simple adaptive PID controller which generates an adaptive control signal to complement the nominal feedforward signal.
NASA Technical Reports Server (NTRS)
Guenther, D. B.
1994-01-01
The nonadiabatic frequencies of a standard solar model and a solar model that includes helium diffusion are discussed. The nonadiabatic pulsation calculation includes physics that describes the losses and gains due to radiation. Radiative gains and losses are modeled in both the diffusion approximation, which is only valid in optically thick regions, and the Eddington approximation, which is valid in both optically thin and thick regions. The calculated pulsation frequencies for modes with l less than or equal to 1320 are compared to the observed spectrum of the Sun. Compared to a strictly adiabatic calculation, the nonadiabatic calculation of p-mode frequencies improves the agreement between model and observation. When helium diffusion is included in the model, the frequencies of the modes that are sensitive to regions near the base of the convection zone are improved (i.e., brought into closer agreement with observation), but the agreement is made worse for other modes. Cyclic variations in the frequency spacings of the Sun as a function of frequency or n are presented as evidence for a discontinuity in the structure of the Sun, possibly located near the base of the convection zone.
NASA Astrophysics Data System (ADS)
Luna, Julio; Jemei, Samir; Yousfi-Steiner, Nadia; Husar, Attila; Serra, Maria; Hissel, Daniel
2016-10-01
In this work, a nonlinear model predictive control (NMPC) strategy is proposed to improve the efficiency and enhance the durability of a proton exchange membrane fuel cell (PEMFC) power system. The PEMFC controller is based on a distributed parameters model that describes the nonlinear dynamics of the system, considering spatial variations along the gas channels. Parasitic power from different system auxiliaries is considered, including the main parasitic losses, which are those of the compressor. A nonlinear observer is implemented, based on the discretised model of the PEMFC, to estimate the internal states. This information is included in the cost function of the controller to enhance the durability of the system by means of avoiding local starvation and inappropriate water vapour concentrations. Simulation results are presented to show the performance of the proposed controller over a given case study in an automotive application (New European Driving Cycle). With the aim of representing the most relevant phenomena that affect the PEMFC voltage, the simulation model includes a two-phase water model and the effects of liquid water on the catalyst active area. The control model is a simplified version that does not consider two-phase water dynamics.
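The structure of such a controller can be summarized by a generic NMPC problem. The form below is a schematic assumption (the paper's exact cost terms, weights, and constraints are not reproduced); it shows how observer-estimated states can enter the cost through starvation and water-vapour penalty terms:

```latex
\min_{u(\cdot)} \; \int_{t}^{t+T_p} \left[ w_1 \left( P_{\mathrm{net}}^{\mathrm{ref}} - P_{\mathrm{net}}(\tau) \right)^2 + w_2\, \phi_{\mathrm{starv}}\!\left(\hat{x}(\tau)\right) + w_3\, \phi_{\mathrm{H_2O}}\!\left(\hat{x}(\tau)\right) \right] d\tau
\quad \text{s.t.} \quad \dot{x} = f(x,u), \;\; u \in \mathcal{U},
```

where T_p is the prediction horizon, x̂(τ) are the observer-estimated internal states, and φ_starv and φ_H2O penalize local starvation and inappropriate water-vapour concentrations.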
Advanced RF Sources Based on Novel Nonlinear Transmission Lines
2015-01-26
…high-power microwave (HPM) sources. It is also critical to thin-film devices and integrated circuits, carbon-nanotube-based cathodes and interconnects, and field emitters … Our model is compared with the transmission line model (TLM); when the interface resistance rc is small, the TLM becomes inaccurate due to current crowding. [Figure: electrical contact including specific interfacial resistivity ρc, and its transmission line model.]
Using a logical information model-driven design process in healthcare.
Cheong, Yu Chye; Bird, Linda; Tun, Nwe Ni; Brooks, Colleen
2011-01-01
A hybrid standards-based approach has been adopted in Singapore to develop a Logical Information Model (LIM) for healthcare information exchange. The Singapore LIM uses a combination of international standards, including ISO13606-1 (a reference model for electronic health record communication), ISO21090 (healthcare datatypes), SNOMED CT (healthcare terminology) and HL7 v2 (healthcare messaging). This logic-based design approach also incorporates mechanisms for achieving bi-directional semantic interoperability.
Data-base development for water-quality modeling of the Patuxent River basin, Maryland
Fisher, G.T.; Summers, R.M.
1987-01-01
Procedures and rationale used to develop a data base and data management system for the Patuxent Watershed Nonpoint Source Water Quality Monitoring and Modeling Program of the Maryland Department of the Environment and the U.S. Geological Survey are described. A detailed data base and data management system has been developed to facilitate modeling of the watershed for water quality planning purposes; statistical analysis; plotting of meteorologic, hydrologic, and water quality data; and geographic data analysis. The system is Maryland's prototype for development of a basinwide water quality management program. A key step in the program is to build a calibrated and verified water quality model of the basin using the Hydrological Simulation Program--FORTRAN (HSPF) hydrologic model, which has been used extensively in large-scale basin modeling. The compilation of the substantial existing data base for preliminary calibration of the basin model, including meteorologic, hydrologic, and water quality data from federal and state data bases and a geographic information system containing digital land use and soils data, is described. The data base development is significant in its application of an integrated, uniform approach to data base management and modeling.
Hayes, Brett K; Heit, Evan; Swendsen, Haruka
2010-03-01
Inductive reasoning entails using existing knowledge or observations to make predictions about novel cases. We review recent findings in research on category-based induction as well as theoretical models of these results, including similarity-based models, connectionist networks, an account based on relevance theory, Bayesian models, and other mathematical models. A number of touchstone empirical phenomena that involve taxonomic similarity are described. We also examine phenomena involving more complex background knowledge about premises and conclusions of inductive arguments and the properties referenced. Earlier models are shown to give a good account of similarity-based phenomena but not knowledge-based phenomena. Recent models that aim to account for both similarity-based and knowledge-based phenomena are reviewed and evaluated. Among the most important new directions in induction research are a focus on induction with uncertain premise categories, the modeling of the relationship between inductive and deductive reasoning, and examination of the neural substrates of induction. A common theme in both the well-established and emerging lines of induction research is the need to develop well-articulated and empirically testable formal models of induction. Copyright © 2010 John Wiley & Sons, Ltd.
Bittig, Arne T; Uhrmacher, Adelinde M
2017-01-01
Spatio-temporal dynamics of cellular processes can be simulated at different levels of detail, from (deterministic) partial differential equations via the spatial Stochastic Simulation algorithm to tracking Brownian trajectories of individual particles. We present a spatial simulation approach for multi-level rule-based models, which includes dynamically hierarchically nested cellular compartments and entities. Our approach ML-Space combines discrete compartmental dynamics, stochastic spatial approaches in discrete space, and particles moving in continuous space. The rule-based specification language of ML-Space supports concise and compact descriptions of models and makes it easy to adapt their spatial resolution.
NASA Astrophysics Data System (ADS)
Park, K.-R.; Kim, K.-h.; Kwak, S.; Svensson, J.; Lee, J.; Ghim, Y.-c.
2017-11-01
A feasibility study of direct spectral measurements of Thomson-scattered photons for fusion-grade plasmas is performed based on a forward model of the KSTAR Thomson scattering system. Expected spectra in the forward model are calculated based on the Selden function, including the relativistic polarization correction. Noise in the signal is modeled with photon noise and Gaussian electrical noise. Electron temperature and density are inferred using Bayesian probability theory. Based on the bias error, full width at half maximum, and entropy of the posterior distributions, spectral measurements are found to be feasible. Comparisons between spectrometer-based and polychromator-based Thomson scattering systems are performed with varying quantum efficiency and electrical noise levels.
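In generic form, the Bayesian step amounts to evaluating the posterior of electron temperature and density given the measured spectrum. The statement below is an assumption-level summary (a Gaussian noise model standing in for the photon plus electrical noise described above), not the paper's exact likelihood:

```latex
p(T_e, n_e \mid \mathbf{D}) \;\propto\; p(\mathbf{D} \mid T_e, n_e)\, p(T_e)\, p(n_e),
\qquad
p(\mathbf{D} \mid T_e, n_e) \;=\; \prod_{k} \frac{1}{\sqrt{2\pi}\,\sigma_k}
\exp\!\left[ -\frac{\left( D_k - S_k(T_e, n_e) \right)^2}{2\sigma_k^2} \right],
```

where D_k is the signal in spectral channel k, S_k the forward-modelled (Selden-function) spectrum, and σ_k the channel noise level.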
GUI to Facilitate Research on Biological Damage from Radiation
NASA Technical Reports Server (NTRS)
Cucinotta, Frances A.; Ponomarev, Artem Lvovich
2010-01-01
A graphical-user-interface (GUI) computer program has been developed to facilitate research on the damage caused by highly energetic particles and photons impinging on living organisms. The program brings together, into one computational workspace, computer codes that have been developed over the years, plus codes that will be developed during the foreseeable future, to address diverse aspects of radiation damage. These include codes that implement radiation-track models, codes for biophysical models of breakage of deoxyribonucleic acid (DNA) by radiation, pattern-recognition programs for extracting quantitative information from biological assays, and image-processing programs that aid visualization of DNA breaks. The radiation-track models are based on transport models of interactions of radiation with matter and solution of the Boltzmann transport equation by use of both theoretical and numerical models. The biophysical models of breakage of DNA by radiation include biopolymer coarse-grained and atomistic models of DNA, stochastic-process models of deposition of energy, and Markov-based probabilistic models of placement of double-strand breaks in DNA. The program is designed for use in the NT, 95, 98, 2000, ME, and XP variants of the Windows operating system.
76 FR 70890 - Fenamidone; Pesticide Tolerances
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-16
Fragmentary excerpt: exposure assessments are based on the Pesticide Root Zone Model/Exposure Analysis Modeling System (PRZM/EXAMS); entities listed under North American Industrial Classification System (NAICS) codes could also be affected; aggregate exposure for which there is reliable information includes exposure through drinking water and in residential settings.
Bao, Jie; Hou, Zhangshuan; Huang, Maoyi; ...
2015-12-04
Here, effective sensitivity analysis approaches are needed to identify important parameters or factors and their uncertainties in complex Earth system models composed of multi-phase multi-component phenomena and multiple biogeophysical-biogeochemical processes. In this study, the impacts of 10 hydrologic parameters in the Community Land Model on simulations of runoff and latent heat flux are evaluated using data from a watershed. Different metrics, including residual statistics, the Nash-Sutcliffe coefficient, and log mean square error, are used as alternative measures of the deviations between the simulated and field observed values. Four sensitivity analysis (SA) approaches, including analysis of variance based on the generalized linear model, generalized cross validation based on the multivariate adaptive regression splines model, standardized regression coefficients based on a linear regression model, and analysis of variance based on support vector machine, are investigated. Results suggest that these approaches show consistent measurement of the impacts of major hydrologic parameters on response variables, but with differences in the relative contributions, particularly for the secondary parameters. The convergence behaviors of the SA with respect to the number of sampling points are also examined with different combinations of input parameter sets and output response variables and their alternative metrics. This study helps identify the optimal SA approach, provides guidance for the calibration of the Community Land Model parameters to improve the model simulations of land surface fluxes, and approximates the magnitudes to be adjusted in the parameter values during parametric model optimization.
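Of the four SA approaches, the standardized-regression-coefficient method is the simplest to illustrate. The sketch below uses hypothetical parameter samples and a synthetic response (not actual Community Land Model runs): inputs and output are standardized, and the fitted linear coefficients rank parameter influence.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ensemble: 200 sampled values of 3 "hydrologic parameters"
# and a synthetic model response standing in for simulated runoff.
X = rng.uniform(size=(200, 3))
y = 3.0 * X[:, 0] - 1.0 * X[:, 1] + 0.1 * X[:, 2] + rng.normal(0, 0.1, 200)

# Standardize inputs and output to zero mean and unit variance.
Xs = (X - X.mean(0)) / X.std(0)
ys = (y - y.mean()) / y.std()

# Least-squares fit; the coefficients are the standardized regression
# coefficients (SRCs) and measure each parameter's relative impact.
src, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
for i, s in enumerate(src):
    print(f"parameter {i}: SRC = {s:+.3f}")
```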
Sebok, Angelia; Wickens, Christopher D
2017-03-01
The objectives were to (a) implement theoretical perspectives regarding human-automation interaction (HAI) into model-based tools to assist designers in developing systems that support effective performance and (b) conduct validations to assess the ability of the models to predict operator performance. Two key concepts in HAI, the lumberjack analogy and black swan events, have been studied extensively. The lumberjack analogy describes the effects of imperfect automation on operator performance. In routine operations, an increased degree of automation supports performance, but in failure conditions, increased automation results in more significantly impaired performance. Black swans are the rare and unexpected failures of imperfect automation. The lumberjack analogy and black swan concepts have been implemented into three model-based tools that predict operator performance in different systems. These tools include a flight management system, a remotely controlled robotic arm, and an environmental process control system. Each modeling effort included a corresponding validation. In one validation, the software tool was used to compare three flight management system designs, which were ranked in the same order as predicted by subject matter experts. The second validation compared model-predicted operator complacency with empirical performance in the same conditions. The third validation compared model-predicted and empirically determined time to detect and repair faults in four automation conditions. The three model-based tools offer useful ways to predict operator performance in complex systems. The three tools offer ways to predict the effects of different automation designs on operator performance.
Yang, P C; Zhang, S X; Sun, P P; Cai, Y L; Lin, Y; Zou, Y H
2017-07-10
Objective: To construct Markov models that reflect the reality of prevention and treatment interventions against hepatitis B virus (HBV) infection, simulate the natural history of HBV infection in different age groups, and provide evidence for economic evaluations of hepatitis B vaccination and population-based antiviral treatment in China. Methods: According to the theory and techniques of Markov chains, Markov models of the Chinese HBV epidemic were developed based on national data and related literature both at home and abroad, including the settings of Markov model states, allowable transitions, and initial and transition probabilities. The model construction, operation, and verification were conducted using the software TreeAge Pro 2015. Results: Several types of Markov models were constructed to describe the disease progression of HBV infection in the neonatal period, perinatal period or adulthood, the progression of chronic hepatitis B after antiviral therapy, hepatitis B prevention and control in adults, chronic hepatitis B antiviral treatment, and the natural progression of chronic hepatitis B in the general population. The model for the newborn was fundamental and included ten states, i.e., susceptibility to HBV, HBsAg clearance, immune tolerance, immune clearance, low replication, HBeAg-negative CHB, compensated cirrhosis, decompensated cirrhosis, hepatocellular carcinoma (HCC), and death. The susceptible state to HBV was excluded in the perinatal-period model, and the immune tolerance state was excluded in the adulthood model. The model for the general population only included two states, survival and death. Among the 5 types of models, there were 9 initial states assigned with initial probabilities, and 27 states for transition probabilities. The results of model verification showed that the probability curves were basically consistent with the situation of the HBV epidemic in China. Conclusion: The Markov models developed can be used in economic evaluations of hepatitis B vaccination and treatment for the elimination of HBV infection in China, although the model structures and parameters carry uncertainty and change dynamically.
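A minimal Markov cohort sketch of the kind described propagates a state distribution through annual cycles. The toy model below uses an invented three-state structure and made-up transition probabilities, purely to illustrate the mechanics; the paper's ten-state newborn model and its calibrated probabilities are not reproduced.

```python
import numpy as np

# Illustrative states only (not the paper's state set).
states = ["susceptible", "chronic_HBV", "death"]

# Hypothetical one-year transition probabilities; each row sums to 1.
P = np.array([
    [0.97, 0.02, 0.01],   # susceptible -> susceptible/chronic/death
    [0.00, 0.95, 0.05],   # chronic     -> chronic/death
    [0.00, 0.00, 1.00],   # death is absorbing
])

# Initial state probabilities: an entirely susceptible birth cohort.
dist = np.array([1.0, 0.0, 0.0])

# Propagate the cohort through 50 annual Markov cycles.
for year in range(50):
    dist = dist @ P

print(dict(zip(states, np.round(dist, 3))))
```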
Integration of Web-based and PC-based clinical research databases.
Brandt, C A; Sun, K; Charpentier, P; Nadkarni, P M
2004-01-01
We have created a Web-based repository or data library of information about measurement instruments used in studies of multi-factorial geriatric health conditions (the Geriatrics Research Instrument Library - GRIL) based upon existing features of two separate clinical study data management systems. GRIL allows browsing, searching, and selecting measurement instruments based upon criteria such as keywords and areas of applicability. Measurement instruments selected can be printed and/or included in an automatically generated standalone microcomputer database application, which can be downloaded by investigators for use in data collection and data management. Integration of database applications requires the creation of a common semantic model, and mapping from each system to this model. Various database schema conflicts at the table and attribute level must be identified and resolved prior to integration. Using a conflict taxonomy and a mapping schema facilitates this process. Critical conflicts at the table level that required resolution included name and relationship differences. A major benefit of integration efforts is the sharing of features and cross-fertilization of applications created for similar purposes in different operating environments. Integration of applications mandates some degree of metadata model unification.
Ren, Huazhong; Liu, Rongyuan; Yan, Guangjian; Li, Zhao-Liang; Qin, Qiming; Liu, Qiang; Nerry, Françoise
2015-04-06
Land surface emissivity is a crucial parameter in surface status monitoring. This study aims at the evaluation of four directional emissivity models, including two bi-directional reflectance distribution function (BRDF) models and two gap-frequency-based models. Results showed that the kernel-driven BRDF model could represent directional emissivity well, with an error of less than 0.002, and it was consequently used to retrieve emissivity with an accuracy of about 0.012 from an airborne multi-angular thermal infrared data set. Furthermore, we updated the cavity effect factor relating to multiple scattering inside the canopy, which improved the performance of the gap-frequency-based models.
NASA Technical Reports Server (NTRS)
Campbell, William J.; Short, Nicholas M., Jr.; Roelofs, Larry H.; Dorfman, Erik
1991-01-01
A methodology for optimizing organization of data obtained by NASA earth and space missions is discussed. The methodology uses a concept based on semantic data modeling techniques implemented in a hierarchical storage model. The modeling is used to organize objects in mass storage devices, relational database systems, and object-oriented databases. The semantic data modeling at the metadata record level is examined, including the simulation of a knowledge base and semantic metadata storage issues. The semantic data model hierarchy and its application for efficient data storage is addressed, as is the mapping of the application structure to the mass storage.
Illustrative visualization of 3D city models
NASA Astrophysics Data System (ADS)
Doellner, Juergen; Buchholz, Henrik; Nienhaus, Marc; Kirsch, Florian
2005-03-01
This paper presents an illustrative visualization technique that provides expressive representations of large-scale 3D city models, inspired by the tradition of artistic and cartographic visualizations typically found in bird's-eye view and panoramic maps. We define a collection of city model components and a real-time multi-pass rendering algorithm that achieves comprehensible, abstract 3D city model depictions based on edge enhancement, color-based and shadow-based depth cues, and procedural facade texturing. Illustrative visualization provides an effective visual interface to urban spatial information and associated thematic information, complementing visual interfaces based on the Virtual Reality paradigm and offering a huge potential for graphics design. Primary application areas include city and landscape planning, cartoon worlds in computer games, and tourist information systems.
Quantitative Agent Based Model of User Behavior in an Internet Discussion Forum
Sobkowicz, Pawel
2013-01-01
The paper presents an agent based simulation of opinion evolution, based on a nonlinear emotion/information/opinion (E/I/O) individual dynamics, applied to an actual Internet discussion forum. The goal is to reproduce the results of two-year long observations and analyses of the user communication behavior and of the expressed opinions and emotions, via simulations using an agent based model. The model allowed various characteristics of the forum to be derived, including the distribution of user activity and popularity (outdegree and indegree), the distribution of length of dialogs between the participants, their political sympathies, and the emotional content and purpose of the comments. The parameters used in the model have intuitive meanings and can be translated into psychological observables. PMID:24324606
A Model Based Approach to Increase the Part Accuracy in Robot Based Incremental Sheet Metal Forming
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meier, Horst; Laurischkat, Roman; Zhu Junhong
One main influence on the dimensional accuracy in robot based incremental sheet metal forming results from the compliance of the involved robot structures. Compared to conventional machine tools, the low stiffness of the robot's kinematics results in a significant deviation from the planned tool path and therefore in a shape of insufficient quality. To predict and compensate these deviations offline, a model based approach has been developed, consisting of a finite element approach to simulate the sheet forming and a multi-body system modeling the compliant robot structure. This paper describes the implementation and experimental verification of the multi-body system model and its included compensation method.
Chappell, Michael A; Woolrich, Mark W; Petersen, Esben T; Golay, Xavier; Payne, Stephen J
2013-05-01
Amongst the various implementations of arterial spin labeling MRI methods for quantifying cerebral perfusion, the QUASAR method is unique. By using a combination of labeling with and without flow suppression gradients, the QUASAR method offers the separation of macrovascular and tissue signals. This permits local arterial input functions to be defined and "model-free" analysis, using numerical deconvolution, to be used. However, it remains unclear whether arterial spin labeling data are best treated using model-free or model-based analysis. This work provides a critical comparison of these two approaches for QUASAR arterial spin labeling in the healthy brain. An existing two-component (arterial and tissue) model was extended to the mixed flow suppression scheme of QUASAR to provide an optimal model-based analysis. The model-based analysis was extended to incorporate dispersion of the labeled bolus, generally regarded as the major source of discrepancy between the two analysis approaches. Model-free and model-based analyses were compared for perfusion quantification including absolute measurements, uncertainty estimation, and spatial variation in cerebral blood flow estimates. Major sources of discrepancies between model-free and model-based analysis were attributed to the effects of dispersion and the degree to which the two methods can separate macrovascular and tissue signal. Copyright © 2012 Wiley Periodicals, Inc.
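The "model-free" arm of such a comparison rests on numerical deconvolution of the tissue curve with a local arterial input function. The sketch below is a minimal truncated-SVD deconvolution on synthetic curves (not QUASAR data); the AIF shape, residue function, and noise level are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
dt = 0.3                                   # sampling interval, s
t = np.arange(0, 12, dt)                   # 40 time points

# Synthetic arterial input function (AIF) and "true" tissue residue function.
aif = (t / 1.5) ** 2 * np.exp(-t / 1.5)    # gamma-variate shape
f_true = 0.012                             # perfusion (arbitrary units)
residue = np.exp(-t / 4.0)
tissue = f_true * dt * np.convolve(aif, residue)[: t.size]
tissue += rng.normal(0, 1e-4, t.size)      # measurement noise

# Convolution matrix: a lower-triangular Toeplitz matrix built from the AIF.
A = dt * np.array([[aif[i - j] if i >= j else 0.0
                    for j in range(t.size)] for i in range(t.size)])

# Deconvolve by truncated SVD: discard small singular values to regularize.
U, s, Vt = np.linalg.svd(A)
keep = s > 0.2 * s[0]
f_residue = Vt.T[:, keep] @ ((U.T[keep] @ tissue) / s[keep])
print("estimated perfusion ~", f_residue.max())   # peak approximates f_true
```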
Dwivedi, Dipankar; Mohanty, Binayak P.; Lesikar, Bruce J.
2013-01-01
Microbes have been identified as a major contaminant of water resources. Escherichia coli (E. coli) is a commonly used indicator organism. It is well recognized that the fate of E. coli in surface water systems is governed by multiple physical, chemical, and biological factors. The aim of this work is to provide insight into the physical, chemical, and biological factors, along with their interactions, that are critical in the estimation of E. coli loads in surface streams. There are various models to predict E. coli loads in streams, but they tend to be system or site specific or overly complex without enhancing our understanding of these factors. Hence, based on available data, a Bayesian Neural Network (BNN) is presented for estimating E. coli loads based on physical, chemical, and biological factors in streams. The BNN has the dual advantage of overcoming the absence of quality data (with regard to consistency in data) and determining mechanistic model parameters by employing a probabilistic framework. This study evaluates whether the BNN model can be an effective alternative tool to mechanistic models for E. coli load estimation in streams. For this purpose, a comparison with a traditional model (LOADEST, USGS) is conducted. The models are compared for estimated E. coli loads based on available water quality data in Plum Creek, Texas. All the model efficiency measures suggest that overall E. coli load estimations by the BNN model are better than those by the LOADEST model on all three occasions (three-fold cross validation). Thirteen factors were considered for estimating E. coli loads with the exhaustive feature selection technique, which indicated that six of the thirteen factors are important. Physical factors included temperature and dissolved oxygen; chemical factors included phosphate and ammonia; biological factors included suspended solids and chlorophyll. The results highlight that the LOADEST model estimates E. coli loads better in the smaller ranges, whereas the BNN model estimates E. coli loads better in the higher ranges. Hence, the BNN model can be used to design targeted monitoring programs and implement regulatory standards through TMDL programs. PMID:24511166
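As a hedged illustration of the exhaustive-feature-selection step, the sketch below scores every non-empty subset of candidate factors by three-fold cross-validation. Synthetic data stand in for the Plum Creek water-quality factors, and a plain linear model stands in for the BNN; only the selection mechanics are the point.

```python
from itertools import combinations

import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)

# Hypothetical stand-ins for the factor names discussed above.
factor_names = ["temperature", "dissolved_O2", "phosphate",
                "ammonia", "suspended_solids", "chlorophyll"]
X = rng.normal(size=(120, len(factor_names)))
y = 2.0 * X[:, 0] - 1.5 * X[:, 2] + 0.8 * X[:, 4] + rng.normal(0, 0.5, 120)

# Exhaustive feature selection: evaluate every subset with 3-fold CV.
best_score, best_subset = -np.inf, None
for k in range(1, len(factor_names) + 1):
    for subset in combinations(range(len(factor_names)), k):
        score = cross_val_score(LinearRegression(), X[:, list(subset)], y,
                                cv=3, scoring="r2").mean()
        if score > best_score:
            best_score, best_subset = score, subset

print("best factors:", [factor_names[i] for i in best_subset],
      f"(CV R^2 = {best_score:.3f})")
```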
Applying the cell-based coagulation model in the management of critical bleeding.
Ho, K M; Pavey, W
2017-03-01
The cell-based coagulation model was proposed 15 years ago, yet has not been applied commonly in the management of critical bleeding. Nevertheless, this alternative model may better explain the physiological basis of current coagulation management during critical bleeding. In this article we describe the limitations of the traditional coagulation protein cascade and standard coagulation tests, and explain the potential advantages of applying the cell-based model in current coagulation management strategies. The cell-based coagulation model builds on the traditional coagulation model and explains many recent clinical observations and research findings related to critical bleeding unexplained by the traditional model, including the encouraging results of using empirical 1:1:1 fresh frozen plasma:platelets:red blood cells transfusion strategy, and the use of viscoelastic and platelet function tests in patients with critical bleeding. From a practical perspective, applying the cell-based coagulation model also explains why new direct oral anticoagulants are effective systemic anticoagulants even without affecting activated partial thromboplastin time or the International Normalized Ratio in a dose-related fashion. The cell-based coagulation model represents the most cohesive scientific framework on which we can understand and manage coagulation during critical bleeding.
When and where does preferential flow matter - from observation to large scale modelling
NASA Astrophysics Data System (ADS)
Weiler, Markus; Leistert, Hannes; Steinbrich, Andreas
2017-04-01
Preferential flow can be relevant in a wide range of soils, and the interaction of the different processes and factors involved is still difficult to assess. As most studies (including our own) focusing on the effect of preferential flow are based on relatively high precipitation rates, the question remains how relevant preferential flow is under natural conditions, considering the site-specific precipitation characteristics, the effect of drying and wetting cycles on the initial soil water conditions and shrinkage cracks, the site-specific soil properties, soil structure and rock fragments, and the effect of plant roots and soil fauna (e.g., earthworm channels). To address this question, we developed the distributed, process-based model RoGeR (Runoff Generation Research) to include a large number of relevant features and processes of preferential flow in soils. The model was developed from a large body of process-based research and experiments and includes preferential flow in roots, earthworm channels, along rock fragments, and in shrinkage cracks. We parameterized the uncalibrated model at a high spatial resolution of 5x5 m for the whole state of Baden-Württemberg in Germany using LiDAR data, degree of sealing, land use, soil properties, and geology. As the model is event based, we derived typical event-based precipitation characteristics from rainfall duration, mean intensity, and amount. Using the site-specific variability of initial soil moisture derived from a water balance model based on the same dataset, we simulated the infiltration and recharge amounts for all event classes derived from the event precipitation characteristics and initial soil moisture conditions. The analysis of the simulation results allowed us to extract the relevance of preferential flow for infiltration and recharge considering all of the factors above. We could clearly see a strong effect of soil properties and land use, but also, particularly for clay-rich soils, a strong effect of the initial conditions due to the development of soil cracks. Not too surprisingly, the relevance of preferential flow was much lower when considering the whole range of precipitation events than when considering only events with high rainfall intensity. The influences on infiltration and on recharge also differed. Although the model can still be improved, particularly with more realistic information about the spatial and temporal variability of preferential flow caused by soil fauna and plants, it already shows in which situations we need to be very careful when predicting infiltration and recharge with models that consider only longer time steps (daily) or only matrix flow.
NASA AVOSS Fast-Time Wake Prediction Models: User's Guide
NASA Technical Reports Server (NTRS)
Ahmad, Nash'at N.; VanValkenburg, Randal L.; Pruis, Matthew
2014-01-01
The National Aeronautics and Space Administration (NASA) is developing and testing fast-time wake transport and decay models to safely enhance the capacity of the National Airspace System (NAS). The fast-time wake models are empirical algorithms used for real-time predictions of wake transport and decay based on aircraft parameters and ambient weather conditions. The aircraft-dependent parameters include the initial vortex descent velocity and the vortex pair separation distance. The atmospheric initial conditions include vertical profiles of temperature or potential temperature, eddy dissipation rate, and crosswind. The current distribution includes the latest versions of the APA (3.4) and the TDP (2.1) models. This User's Guide provides detailed information on the model inputs, file formats, and the model output. An example of a model run and a brief description of the Memphis 1995 Wake Vortex Dataset are also provided.
EPR-based material modelling of soils
NASA Astrophysics Data System (ADS)
Faramarzi, Asaad; Alani, Amir M.
2013-04-01
In the past few decades, as a result of the rapid developments in computational software and hardware, alternative computer-aided pattern recognition approaches have been introduced to modelling many engineering problems, including constitutive modelling of materials. The main idea behind pattern recognition systems is that they learn adaptively from experience and extract various discriminants, each appropriate for its purpose. In this work an approach is presented for developing material models for soils based on evolutionary polynomial regression (EPR). EPR is a recently developed hybrid data mining technique that searches for structured mathematical equations (representing the behaviour of a system) using a genetic algorithm and the least squares method. Stress-strain data from triaxial tests are used to train and develop EPR-based material models for soil. The developed models are compared with some of the well-known conventional material models, and it is shown that EPR-based models can provide a better prediction of the behaviour of soils. The main benefits of using EPR-based material models are that they provide a unified approach to constitutive modelling of all materials (i.e., all aspects of material behaviour can be implemented within the unified environment of an EPR model) and that they do not require any arbitrary choice of constitutive (mathematical) model. In EPR-based material models there are no material parameters to be identified. Because the model is trained directly on experimental data, EPR-based material models are the shortest route from experimental research (data) to numerical modelling. Another advantage of an EPR-based constitutive model is that as more experimental data become available, the quality of the EPR prediction can be improved by learning from the additional data, so the EPR model can become more effective and robust. The developed EPR-based material models can be incorporated in finite element (FE) analysis.
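A heavily simplified sketch of the EPR idea, searching over candidate polynomial structures and fitting each by least squares, is shown below. Exhaustive enumeration of small exponent tuples replaces the genetic-algorithm search for brevity, and the stress-strain data are synthetic; nothing here reproduces the authors' trained models.

```python
from itertools import product

import numpy as np

rng = np.random.default_rng(3)

# Synthetic "triaxial test" data: axial strain (input) and deviator
# stress (response); the true relation and noise level are invented.
strain = np.linspace(0.001, 0.1, 60)
stress = 850.0 * strain - 3200.0 * strain**2 + rng.normal(0.0, 0.5, strain.size)

# Candidate structures: stress = a1*strain^e1 + a2*strain^e2, with
# exponents drawn from a small set. EPR proper evolves such structures
# with a genetic algorithm; enumeration replaces it here for brevity.
exponents = [0.5, 1.0, 1.5, 2.0, 3.0]
best = None
for e1, e2 in product(exponents, repeat=2):
    if e1 >= e2:
        continue                        # skip duplicate/degenerate structures
    A = np.column_stack([strain**e1, strain**e2])
    coef, *_ = np.linalg.lstsq(A, stress, rcond=None)   # least-squares fit
    sse = float(np.sum((A @ coef - stress) ** 2))
    if best is None or sse < best[0]:
        best = (sse, e1, e2, coef)

sse, e1, e2, (a1, a2) = best
print(f"stress ~ {a1:.1f}*strain^{e1} + {a2:.1f}*strain^{e2} (SSE = {sse:.3g})")
```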
Communications network design and costing model users manual
NASA Technical Reports Server (NTRS)
Logan, K. P.; Somes, S. S.; Clark, C. A.
1983-01-01
The information and procedures needed to exercise the communications network design and costing model for performing network analysis are presented. Specific procedures are included for executing the model on the NASA Lewis Research Center IBM 3033 computer. The concepts, functions, and data bases relating to the model are described. Model parameters and their format specifications for running the model are detailed.
Chudasama, Vaishali L.; Ovacik, Meric A.; Abernethy, Darrell R.
2015-01-01
Systems models of biological networks show promise for informing drug target selection/qualification, identifying lead compounds and factors regulating disease progression, rationalizing combinatorial regimens, and explaining sources of intersubject variability and adverse drug reactions. However, most models of biological systems are qualitative and are not easily coupled with dynamical models of drug exposure-response relationships. In this proof-of-concept study, logic-based modeling of signal transduction pathways in U266 multiple myeloma (MM) cells is used to guide the development of a simple dynamical model linking bortezomib exposure to cellular outcomes. Bortezomib is a commonly used first-line agent in MM treatment; however, knowledge of the signal transduction pathways regulating bortezomib-mediated cell cytotoxicity is incomplete. A Boolean network model of 66 nodes was constructed that includes major survival and apoptotic pathways and was updated using responses to several chemical probes. Simulated responses to bortezomib were in good agreement with experimental data, and a reduction algorithm was used to identify key signaling proteins. Bortezomib-mediated apoptosis was not associated with suppression of nuclear factor κB (NFκB) protein inhibition in this cell line, which contradicts a major hypothesis of bortezomib pharmacodynamics. A pharmacodynamic model was developed that included three critical proteins (phospho-NFκB, BclxL, and cleaved poly (ADP ribose) polymerase). Model-fitted protein dynamics and cell proliferation profiles agreed with experimental data, and the model-predicted IC50 (3.5 nM) is comparable to the experimental value (1.5 nM). The cell-based pharmacodynamic model successfully links bortezomib exposure to MM cellular proliferation via protein dynamics, and this model may show utility in exploring bortezomib-based combination regimens. PMID:26163548
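A minimal Boolean-network sketch in the spirit of the 66-node model is shown below. The four-rule wiring is invented purely for illustration; it is not the paper's network, and it deliberately does not route apoptosis through NFκB suppression, consistent with the finding above. All nodes are updated synchronously until a fixed point is reached.

```python
# Toy wiring invented for illustration only (not the paper's 66-node model).
rules = {
    "bortezomib":   lambda s: s["bortezomib"],           # held fixed as input
    "proteasome":   lambda s: not s["bortezomib"],       # drug target
    "proapoptotic": lambda s: not s["proteasome"],       # stabilized when proteasome is off
    "BclxL":        lambda s: not s["proapoptotic"],     # survival node
    "apoptosis":    lambda s: s["proapoptotic"] and not s["BclxL"],
}

def run_to_attractor(state, max_steps=50):
    """Synchronously update all nodes until the state stops changing."""
    for _ in range(max_steps):
        new = {node: bool(rule(state)) for node, rule in rules.items()}
        if new == state:
            return state
        state = new
    return state

start = {"bortezomib": True, "proteasome": True,
         "proapoptotic": False, "BclxL": True, "apoptosis": False}
print(run_to_attractor(start))   # reaches a fixed point with apoptosis = True
```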
DOE Office of Scientific and Technical Information (OSTI.GOV)
Houweling, Antonetta C., E-mail: A.Houweling@umcutrecht.n; Philippens, Marielle E.P.; Dijkema, Tim
2010-03-15
Purpose: The dose-response relationship of the parotid gland has been described most frequently using the Lyman-Kutcher-Burman model. However, various other normal tissue complication probability (NTCP) models exist. We evaluated in a large group of patients the value of six NTCP models that describe the parotid gland dose response 1 year after radiotherapy. Methods and Materials: A total of 347 patients with head-and-neck tumors were included in this prospective parotid gland dose-response study. The patients were treated with either conventional radiotherapy or intensity-modulated radiotherapy. Dose-volume histograms for the parotid glands were derived from three-dimensional dose calculations using computed tomography scans. Stimulated salivary flow rates were measured before and 1 year after radiotherapy. A threshold of 25% of the pretreatment flow rate was used to define a complication. The evaluated models included the Lyman-Kutcher-Burman model, the mean dose model, the relative seriality model, the critical volume model, the parallel functional subunit model, and the dose-threshold model. The goodness of fit (GOF) was determined by the deviance and a Monte Carlo hypothesis test. Ranking of the models was based on Akaike's information criterion (AIC). Results: None of the models was rejected based on the evaluation of the GOF. The mean dose model was ranked as the best model based on the AIC. The TD50 in these models was approximately 39 Gy. Conclusions: The mean dose model was preferred for describing the dose-response relationship of the parotid gland.
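For reference, the Lyman-Kutcher-Burman model evaluated here has a standard closed form; the statement below uses generic textbook notation and none of the study's fitted values apart from the reported TD50 ≈ 39 Gy:

```latex
\mathrm{NTCP} = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{t} e^{-x^{2}/2}\, dx,
\qquad
t = \frac{\mathrm{EUD} - TD_{50}}{m \cdot TD_{50}},
\qquad
\mathrm{EUD} = \left( \sum_{i} v_{i}\, D_{i}^{1/n} \right)^{\! n},
```

where v_i is the fractional volume receiving dose D_i, n sets the volume dependence, and m the slope of the dose-response curve. The mean dose model that ranked best is the special case n = 1, for which EUD reduces to the mean parotid dose.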
Computational Aeroelastic Analyses of a Low-Boom Supersonic Configuration
NASA Technical Reports Server (NTRS)
Silva, Walter A.; Sanetrik, Mark D.; Chwalowski, Pawel; Connolly, Joseph
2015-01-01
An overview of NASA's Commercial Supersonic Technology (CST) Aeroservoelasticity (ASE) element is provided with a focus on recent computational aeroelastic analyses of a low-boom supersonic configuration developed by Lockheed-Martin and referred to as the N+2 configuration. The overview includes details of the computational models developed to date including a linear finite element model (FEM), linear unsteady aerodynamic models, unstructured CFD grids, and CFD-based aeroelastic analyses. In addition, a summary of the work involving the development of aeroelastic reduced-order models (ROMs) and the development of an aero-propulso-servo-elastic (APSE) model is provided.
NASA Astrophysics Data System (ADS)
Gwiazda, A.; Banas, W.; Sekala, A.; Foit, K.; Hryniewicz, P.; Kost, G.
2015-11-01
The process of workcell design is limited by different constructional requirements. They are related to the technological parameters of the manufactured element, to the specifications of purchased elements of a workcell, and to the technical characteristics of the workcell scene. This shows the complexity of the design-constructional process itself. The result of such an approach is an individually designed workcell suited to a specific location and a specific production cycle. If these parameters change, the whole configuration of the workcell must be rebuilt. Taking this into consideration, it is important to elaborate a base of typical elements of a robot kinematic chain that could be used as a tool for building such workcells. Virtual modelling of kinematic chains of industrial robots requires several preparatory phases. Firstly, it is important to create a database of elements that will serve as models of industrial robot arms. These models could be described as functional primitives that represent elements between the components of the kinematic pairs and the structural members of industrial robots. A database with the following elements is created: the base of kinematic pairs, the base of robot structural elements, and the base of robot work scenes. The first of these databases includes kinematic pairs, the key components of the manipulator actuator modules. Accordingly, as mentioned previously, it includes rotary kinematic pairs of the fifth class. This type of kinematic pair was chosen because it occurs most frequently in the structures of industrial robots. The second base consists of structural robot elements and therefore allows the conversion of schematic structures of kinematic chains into the structural elements of industrial robot arms. It contains, inter alia, structural elements such as the base and stiff members (simple or angular units). They allow converting recorded schematics into three-dimensional elements. The last database is a database of scenes. It includes elements both simple and complex: models of technological equipment, models of conveyors, models of obstacles, and the like. Using these elements, various production spaces (robotized workcells) can be formed, in which it is possible to virtually track the operation of an industrial robot arm modelled in the system.
Trapp, Henry; Geiger, L.H.
1986-01-01
The sand-and-gravel aquifer is the only freshwater aquifer in southern Escambia County, Florida, and is the source of public water supply for the area, including the City of Pensacola. The aquifer was simulated by a two-layer digital model to provide hydrologic information for water resource planning. The lower layer represents the main producing zone; the upper layer represents all of the aquifer above the main producing zone, including an unconfined zone and discontinuous perched, confined, and confining zones. The model was designed for steady-state simulation and predicts the response of the aquifer (changes in water levels) to groundwater pumping where steady-state conditions have been reached. Input to the model includes matrices representing constant-head nodes, starting head, transmissivity of layer 1, leakance between layers 1 and 2, lateral hydraulic conductivity of layer 2, and altitude of the base of layer 2. The sources of water to the model are recharge by infiltrated precipitation (estimated from base runoff), inflow across boundaries, and induced recharge from river leakage in periods of prolonged groundwater pumping. Model output includes final head and drawdown for each layer and total values for discharge and recharge in the model area. The model was calibrated for 1972 pumping and tested by simulating pumpages during 1939-40, 1958, and 1977. Sensitivity analyses showed water levels in both layers were most sensitive to changes in the recharge matrix and least sensitive to river leakage. Suggestions for further development of the model include subdivision and expansion of the grid, assignment of storage coefficients for transient simulations, more intensive study of the stream-aquifer relations, and consideration of the effects of infiltration basins on recharge. (Author's abstract)
VHBuild.com: A Web-Based System for Managing Knowledge in Projects.
ERIC Educational Resources Information Center
Li, Heng; Tang, Sandy; Man, K. F.; Love, Peter E. D.
2002-01-01
Describes an intelligent Web-based construction project management system called VHBuild.com which integrates project management, knowledge management, and artificial intelligence technologies. Highlights include an information flow model; time-cost optimization based on genetic algorithms; rule-based drawing interpretation; and a case-based…
Consensus-based training and assessment model for general surgery.
Szasz, P; Louridas, M; de Montbrun, S; Harris, K A; Grantcharov, T P
2016-05-01
Surgical education is becoming competency-based with the implementation of in-training milestones. Training guidelines should reflect these changes and determine the specific procedures for such milestone assessments. This study aimed to develop a consensus view regarding operative procedures and tasks considered appropriate for junior and senior trainees, and the procedures that can be used as technical milestone assessments for trainee progression in general surgery. A Delphi process was followed where questionnaires were distributed to all 17 Canadian general surgery programme directors. Items were ranked on a 5-point Likert scale, with consensus defined as Cronbach's α of at least 0·70. Items rated 4 or above on the 5-point Likert scale by 80 per cent of the programme directors were included in the models. Two Delphi rounds were completed, with 14 programme directors taking part in round one and 11 in round two. The overall consensus was high (Cronbach's α = 0·98). The training model included 101 unique procedures and tasks, 24 specific to junior trainees, 68 specific to senior trainees, and nine appropriate to all. The assessment model included four procedures. A system of operative procedures and tasks for junior- and senior-level trainees has been developed along with an assessment model for trainee progression. These can be used as milestones in competency-based assessments. © 2016 BJS Society Ltd Published by John Wiley & Sons Ltd.
NASA Technical Reports Server (NTRS)
Kvaternik, Raymond G.
1992-01-01
An overview is presented of government contributions to the program called Design Analysis Methods for Vibrations (DAMV), which attempted to develop finite-element-based analyses of rotorcraft vibrations. NASA initiated the program with a finite-element modeling program for the CH-47D tandem-rotor helicopter. The DAMV program emphasized four areas: airframe finite-element modeling, difficult-components studies, coupled rotor-airframe vibrations, and airframe structural optimization. Key accomplishments of the program include industrywide standards for modeling metal and composite airframes, improved industrial designs for vibrations, and the identification of critical structural contributors to airframe vibratory responses. The program also demonstrated the value of incorporating secondary modeling details in improving correlation, and the findings provide the basis for an improved finite-element-based dynamics design-analysis capability.
Object-based class modelling for multi-scale riparian forest habitat mapping
NASA Astrophysics Data System (ADS)
Strasser, Thomas; Lang, Stefan
2015-05-01
Object-based class modelling allows for mapping complex, hierarchical habitat systems. The riparian zone, including its forests, represents such a complex ecosystem. Forests within riparian zones are biologically highly productive and characterized by a rich biodiversity; they are thus considered of high community interest, with an imperative to be protected and regularly monitored. Satellite earth observation (EO) provides tools for capturing the current state of forest habitats, such as forest composition including the intermixture of non-native tree species. Here we present a semi-automated object-based image analysis (OBIA) approach for the mapping of riparian forests by applying class modelling of habitats based on the European Nature Information System (EUNIS) habitat classifications and the European Habitats Directive (HabDir) Annex 1. A very high resolution (VHR) WorldView-2 satellite image provided the required spatial and spectral details for a multi-scale image segmentation and rule-base composition to generate a six-level hierarchical representation of riparian forest habitats. Habitats were thereby hierarchically represented within an image object hierarchy as forest stands, stands of homogeneous tree species, and single trees represented by sunlit tree crowns. 522 EUNIS level 3 (EUNIS-3) habitat patches with a mean patch size (MPS) of 12,349.64 m2 were modelled from 938 forest stand patches (MPS = 6868.20 m2) and 43,742 tree stand patches (MPS = 140.79 m2). The delineation quality of the modelled EUNIS-3 habitats (focal level) was quantitatively assessed against an expert-based visual interpretation, showing a mean deviation of 11.71%.
NASA Technical Reports Server (NTRS)
Thottappillil, Rajeev; Uman, Martin A.; Diendorfer, Gerhard
1991-01-01
Compared here are the calculated fields of the Traveling Current Source (TCS), Modified Transmission Line (MTL), and Diendorfer-Uman (DU) models with the channel base current assumed in Nucci et al. on the one hand and with the channel base current assumed in Diendorfer and Uman on the other. The characteristics of the field wave shapes are shown to be very sensitive to the channel base current, especially the field zero crossing at 100 km for the TCS and DU models, and the magnetic hump after the initial peak at close range for the TCS model. Also, the DU model is theoretically extended to include any arbitrarily varying return stroke speed with height. A brief discussion is presented on the effects of an exponentially decreasing speed with height on the calculated fields for the TCS, MTL, and DU models.
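For orientation, the current distributions that define two of the compared engineering models can be stated compactly. The forms below follow the common formulations in the return-stroke literature (the MTL model with exponential current decay and the TCS model) and are a summary under that assumption, not the paper's own notation:

```latex
\begin{aligned}
\text{MTL:}\quad & i(z', t) = i\!\left(0,\; t - z'/v\right) e^{-z'/\lambda}, && t \ge z'/v,\\
\text{TCS:}\quad & i(z', t) = i\!\left(0,\; t + z'/c\right), && z' \le v\,t,
\end{aligned}
```

where z' is height along the channel, v the (possibly height-varying) return-stroke speed, c the speed of light, and λ the current decay height constant. The channel-base current i(0, t) is exactly the quantity whose choice the abstract shows the field wave shapes to be sensitive to.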
Consideration of VT5 etch-based OPC modeling
NASA Astrophysics Data System (ADS)
Lim, ChinTeong; Temchenko, Vlad; Kaiser, Dieter; Meusel, Ingo; Schmidt, Sebastian; Schneider, Jens; Niehoff, Martin
2008-03-01
Including etch-based empirical data during OPC model calibration is a desired yet controversial decision for OPC modeling, especially for a process with a large litho-to-etch bias. While many OPC software tools are capable of providing this functionality nowadays, few have been implemented in manufacturing due to various risk considerations, such as compromises in resist and optical effects prediction, etch model accuracy, or even runtime concerns. The conventional method of applying rule-based corrections alongside a resist model is popular but requires lengthy code generation to provide a leaner OPC input. This work discusses the risk factors and their considerations, together with an introduction of techniques used within Mentor Calibre VT5 etch-based modeling at the sub-90nm technology node. Various strategies are discussed with the aim of better handling large etch-bias offsets without adding complexity to the final OPC package. Finally, results are presented to assess the advantages and limitations of the final method chosen.
Anderson, Weston; Guikema, Seth; Zaitchik, Ben; Pan, William
2014-01-01
Obtaining accurate small area estimates of population is essential for policy and health planning but is often difficult in countries with limited data. In lieu of available population data, small area estimate models draw information from previous time periods or from similar areas. This study focuses on model-based methods for estimating population when no direct samples are available in the area of interest. To explore the efficacy of tree-based models for estimating population density, we compare six different model structures including Random Forest and Bayesian Additive Regression Trees. Results demonstrate that without information from prior time periods, non-parametric tree-based models produced more accurate predictions than did conventional regression methods. Improving estimates of population density in non-sampled areas is important for regions with incomplete census data and has implications for economic, health and development policies.
Model-based learning and the contribution of the orbitofrontal cortex to the model-free world.
McDannald, Michael A; Takahashi, Yuji K; Lopatina, Nina; Pietras, Brad W; Jones, Josh L; Schoenbaum, Geoffrey
2012-04-01
Learning is proposed to occur when there is a discrepancy between reward prediction and reward receipt. At least two separate systems are thought to exist: one in which predictions are proposed to be based on model-free or cached values; and another in which predictions are model-based. A basic neural circuit for model-free reinforcement learning has already been described. In the model-free circuit the ventral striatum (VS) is thought to supply a common-currency reward prediction to midbrain dopamine neurons that compute prediction errors and drive learning. In a model-based system, predictions can include more information about an expected reward, such as its sensory attributes or current, unique value. This detailed prediction allows for both behavioral flexibility and learning driven by changes in sensory features of rewards alone. Recent evidence from animal learning and human imaging suggests that, in addition to model-free information, the VS also signals model-based information. Further, there is evidence that the orbitofrontal cortex (OFC) signals model-based information. Here we review these data and suggest that the OFC provides model-based information to this traditional model-free circuitry and offer possibilities as to how this interaction might occur.
Optics Program Simplifies Analysis and Design
NASA Technical Reports Server (NTRS)
2007-01-01
Engineers at Goddard Space Flight Center partnered with software experts at Mide Technology Corporation, of Medford, Massachusetts, through a Small Business Innovation Research (SBIR) contract to design the Disturbance-Optics-Controls-Structures (DOCS) Toolbox, a software suite for performing integrated modeling for multidisciplinary analysis and design. The DOCS Toolbox integrates various discipline models into a coupled process math model that can then predict system performance as a function of subsystem design parameters. The system can be optimized for performance; design parameters can be traded; parameter uncertainties can be propagated through the math model to develop error bounds on system predictions; and the model can be updated based on component, subsystem, or system level data. The Toolbox also allows the definition of process parameters as explicit functions of the coupled model and includes a number of functions that analyze the coupled system model and provide for redesign. The product is being sold commercially by Nightsky Systems Inc., of Raleigh, North Carolina, a spinoff company that was formed by Mide specifically to market the DOCS Toolbox. Commercial applications include use by any contractors developing large space-based optical systems, including Lockheed Martin Corporation, The Boeing Company, and Northrop Grumman Corporation, as well as companies providing technical audit services, like General Dynamics Corporation.
NASA Astrophysics Data System (ADS)
Shugart, Herman H.; Wang, Bin; Fischer, Rico; Ma, Jianyong; Fang, Jing; Yan, Xiaodong; Huth, Andreas; Armstrong, Amanda H.
2018-03-01
Individual-based models (IBMs) of complex systems emerged in the 1960s and early 1970s, across diverse disciplines from astronomy to zoology. Ecological IBMs arose with seemingly independent origins out of the tradition of understanding the dynamics of ecosystems from a ‘bottom-up’ accounting of the interactions of the parts. Individual trees are principal among the parts of forests. Because these models are computationally demanding, they have prospered as the power of digital computers has increased exponentially over the decades following the 1970s. This review focuses on a class of forest IBMs called gap models. Gap models simulate the changes in forests by simulating the birth, growth and death of each individual tree on a small plot of land. The summation of these plots comprises a forest (or a set of sample plots on a forested landscape or region). Other, more aggregated forest IBMs have been used in global applications, including cohort-based models, ecosystem demography models, etc. Gap models have been used to provide the parameters for these bulk models. Gap models have now grown from local-scale to continental-scale and even global-scale applications to assess the potential consequences of climate change on natural forests. Modifications to the models have enabled simulation of disturbances including fire, insect outbreak and harvest. Our objective in this review is to provide the reader with an overview of the history, motivation and applications, including theoretical applications, of these models. In a time of concern over global changes, gap models are essential tools to understand forest responses to climate change, modified disturbance regimes and other change agents. Development of forest surveys to provide the starting points for simulations, and better estimates of the behavior of the diversity of tree species in response to the environment, are continuing needs for improvement for these and other IBMs.
A probabilistic model framework for evaluating year-to-year variation in crop productivity
NASA Astrophysics Data System (ADS)
Yokozawa, M.; Iizumi, T.; Tao, F.
2008-12-01
Most models describing the relation between crop productivity and weather conditions have so far focused on mean changes in crop yield. For keeping food supply stable against abnormal weather as well as climate change, evaluating the year-to-year variations in crop productivity, rather than the mean changes, is more essential. We here propose a new probabilistic model framework based on Bayesian inference and Monte Carlo simulation. As an example, we first introduce a model of paddy rice production in Japan, called PRYSBI (Process-based Regional rice Yield Simulator with Bayesian Inference; Iizumi et al., 2008). The model structure is the same as that of SIMRIW, which was developed and used widely in Japan. The model includes three sub-models describing phenological development, biomass accumulation and maturing of the rice crop. These processes are formulated to capture the response of the rice plant to weather conditions. The model was originally developed to predict rice growth and yield at the paddy plot scale; we applied it to evaluate large-scale rice production while keeping the same model structure. Instead, we treated the parameters as stochastic variables. In order to let the model match actual yields at the larger scale, model parameters were estimated from agricultural statistical data for each prefecture of Japan together with weather data averaged over the region. The posterior probability distribution functions (PDFs) of the parameters included in the model were obtained using Bayesian inference, with an MCMC (Markov Chain Monte Carlo) algorithm used to numerically solve the Bayesian theorem. To evaluate the year-to-year changes in rice growth/yield under this framework, we iterate simulations with sets of parameter values sampled from the estimated posterior PDF of each parameter and then take the ensemble mean weighted by the posterior PDFs. We will also present another example for maize productivity in China. The framework proposed here provides information on uncertainties, possibilities and limitations of future improvements in crop models.
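A minimal sketch of the Bayesian machinery described above, assuming a deliberately toy yield model (linear in a temperature anomaly) in place of PRYSBI's process sub-models; the data, priors and step sizes are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(1)
T = rng.normal(25.0, 2.0, 30)                              # summer mean temperature (toy data)
y_obs = 6.0 - 0.15 * (T - 25.0) + rng.normal(0, 0.2, 30)   # observed yield, t/ha

def log_post(theta):
    """Log-posterior: Gaussian likelihood plus weakly informative priors."""
    a, b = theta
    resid = y_obs - (a + b * (T - 25.0))
    return -0.5 * np.sum(resid**2) / 0.2**2 - 0.5 * (a - 5.0)**2 - 0.5 * b**2

samples, theta = [], np.array([5.0, 0.0])
lp = log_post(theta)
for _ in range(20000):                                     # Metropolis random walk
    prop = theta + rng.normal(0, 0.02, 2)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta)
post = np.array(samples[5000:])                            # discard burn-in

T_new = 27.0                                               # simulate a warm year
yield_ens = post[:, 0] + post[:, 1] * (T_new - 25.0)       # posterior-ensemble of yields
print(f"ensemble mean {yield_ens.mean():.2f} t/ha, 90% interval "
      f"({np.quantile(yield_ens, 0.05):.2f}, {np.quantile(yield_ens, 0.95):.2f})")
```

The final lines reproduce the proposed workflow in miniature: simulate with parameter sets drawn from the posterior, then summarize the ensemble to quantify year-to-year yield uncertainty.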
Modeling and Simulations for the High Flux Isotope Reactor Cycle 400
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ilas, Germina; Chandler, David; Ade, Brian J
2015-03-01
A concerted effort over the past few years has been focused on enhancing the core model for the High Flux Isotope Reactor (HFIR), as part of a comprehensive study for HFIR conversion from high-enriched uranium (HEU) to low-enriched uranium (LEU) fuel. At this time, the core model used to perform analyses in support of HFIR operation is an MCNP model for the beginning of Cycle 400, which was documented in detail in a 2005 technical report. A HFIR core depletion model based on current state-of-the-art methods and nuclear data was needed to serve as a reference for the design of an LEU fuel for HFIR. The recent enhancements in modeling and simulations for HFIR that are discussed in the present report include: (1) revision of the 2005 MCNP model for the beginning of Cycle 400 to improve the modeling data and assumptions as necessary, based on appropriate primary reference sources (HFIR drawings and reports); (2) improvement of the fuel region model, including an explicit representation of the involute fuel plate geometry that is characteristic of HFIR fuel; and (3) revision of the Monte Carlo-based depletion model for HFIR, in use since 2009 but never documented in detail, with the development of a new depletion model for the HFIR explicit fuel plate representation. The new HFIR models for Cycle 400 are used to determine various metrics of relevance to reactor performance and safety assessments. The calculated metrics are compared, where possible, with measurement data from preconstruction critical experiments at HFIR, data included in the current HFIR safety analysis report, and/or data from previous calculations performed with different methods or codes. The results of the analyses show that the models presented in this report provide a robust and reliable basis for HFIR analyses.
Doble, Brett; Tan, Marcus; Harris, Anthony; Lorgelly, Paula
2015-02-01
The successful use of a targeted therapy is intrinsically linked to the ability of a companion diagnostic to correctly identify patients most likely to benefit from treatment. The aim of this study was to review the characteristics of companion diagnostics that are of importance for inclusion in an economic evaluation. Approaches for including these characteristics in model-based economic evaluations are compared with the intent to describe best practice methods. Five databases and government agency websites were searched to identify model-based economic evaluations comparing a companion diagnostic and subsequent treatment strategy with an alternative treatment strategy, with model parameters for the sensitivity and specificity of the companion diagnostic (primary synthesis). Economic evaluations that limited model parameters for the companion diagnostic to only its cost were also identified (secondary synthesis). Quality was assessed using the Quality of Health Economic Studies instrument. 30 studies were included in the review (primary synthesis n = 12; secondary synthesis n = 18). Incremental cost-effectiveness ratios may be lower when the only parameter for the companion diagnostic included in a model is the cost of testing. Incorporating the test's accuracy in addition to its cost may be a more appropriate methodological approach. Altering the prevalence of the genetic biomarker, the specific population tested, the type of test, the test accuracy, and the timing/sequence of multiple tests can all impact overall model results. The impact of altering a test's threshold for positivity is unknown, as it was not addressed in any of the included studies. Additional quality criteria as outlined in our methodological checklist should be considered due to the shortcomings of standard quality assessment tools in differentiating studies that incorporate important test-related characteristics from those that do not. There is a need to refine methods for incorporating the characteristics of companion diagnostics into model-based economic evaluations to ensure consistent and transparent reimbursement decisions are made.
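The review's central methodological point can be illustrated with a small worked example. This is a hypothetical two-strategy calculation, not drawn from any included study: it contrasts a model that carries only the test's cost with one that also carries its sensitivity and specificity.

```python
# Hypothetical inputs: biomarker prevalence, test accuracy, costs, QALYs.
prev, sens, spec = 0.3, 0.9, 0.95
c_test, c_tx, c_std = 300, 50_000, 10_000
q_tx_pos, q_std = 1.2, 1.0            # treatment benefit accrues only to true positives

# Cost-only model: implicitly assumes the test finds every biomarker-positive patient.
cost_naive = c_test + prev * c_tx + (1 - prev) * c_std
qaly_naive = prev * q_tx_pos + (1 - prev) * q_std

# Accuracy-aware model: false negatives miss treatment, false positives receive it.
tp, fn = prev * sens, prev * (1 - sens)
fp, tn = (1 - prev) * (1 - spec), (1 - prev) * spec
cost_acc = c_test + (tp + fp) * c_tx + (fn + tn) * c_std
qaly_acc = tp * q_tx_pos + (fn + fp + tn) * q_std

print(f"cost-only:      {cost_naive:8.0f} per patient, {qaly_naive:.3f} QALYs")
print(f"accuracy-aware: {cost_acc:8.0f} per patient, {qaly_acc:.3f} QALYs")
```

Because false negatives forgo treatment benefit and false positives incur treatment cost, the accuracy-aware model here yields higher costs and lower QALYs than the cost-only model, which is exactly why ignoring test accuracy can understate incremental cost-effectiveness ratios.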
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brost, Randolph C.; McLendon, William Clarence
2013-01-01
Modeling geospatial information with semantic graphs enables search for sites of interest based on relationships between features, without requiring strong a priori models of feature shape or other intrinsic properties. Geospatial semantic graphs can be constructed from raw sensor data with suitable preprocessing to obtain a discretized representation. This report describes initial work toward extending geospatial semantic graphs to include temporal information, and initial results applying semantic graph techniques to SAR image data. We describe an efficient graph structure that includes geospatial and temporal information, which is designed to support simultaneous spatial and temporal search queries. We also report a preliminary implementation of feature recognition, semantic graph modeling, and graph search based on input SAR data. The report concludes with lessons learned and suggestions for future improvements.
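A minimal sketch of the kind of combined spatial-temporal query such a graph supports, assuming the networkx library; the node kinds, attribute names, and dates are invented for illustration.

```python
import networkx as nx

# Build a small graph whose nodes carry feature type and first-observed time,
# and whose edges carry spatial relations between features.
g = nx.Graph()
g.add_node("building_17", kind="building", first_seen="2012-03-01")
g.add_node("road_4", kind="road", first_seen="2010-06-10")
g.add_node("lot_9", kind="parking_lot", first_seen="2012-05-20")
g.add_edge("building_17", "road_4", relation="adjacent")
g.add_edge("building_17", "lot_9", relation="adjacent")

# Query: buildings adjacent to a parking lot that appeared after 2012-01-01
# (ISO date strings compare correctly as plain strings).
hits = [n for n, d in g.nodes(data=True)
        if d["kind"] == "building"
        and any(g.nodes[m]["kind"] == "parking_lot"
                and g.nodes[m]["first_seen"] > "2012-01-01"
                for m in g.neighbors(n))]
print(hits)  # ['building_17']
```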
Gibson, Amelia N.
2016-01-01
This grounded theory study used in-depth, semi-structured interviews to examine the information-seeking behaviors of 35 parents of children with Down syndrome. Emergent themes include a progressive pattern of behavior including information overload and avoidance, passive attention, and active information seeking; varying preferences between tacit and explicit information at different stages; and selection of information channels and sources that varied based on personal and situational constraints. Based on the findings, the author proposes a progressive model of health information seeking and a framework for using this model to collect data in practice. The author also discusses the practical and theoretical implications of a responsive, progressive approach to understanding parents' health information-seeking behavior. PMID:28462351
NASA Technical Reports Server (NTRS)
Shiau, Jyh-Jen; Wahba, Grace; Johnson, Donald R.
1986-01-01
A new method, based on partial spline models, is developed for including specified discontinuities in otherwise smooth two- and three-dimensional objective analyses. The method is appropriate for including tropopause height information in two- and three-dimensional temperature analyses, using the O'Sullivan-Wahba physical variational method for analysis of satellite radiance data, and may in principle be used in a combined variational analysis of observed, forecast, and climate information. A numerical method for its implementation is described and a prototype two-dimensional analysis based on simulated radiosonde and tropopause height data is shown. The method may also be appropriate for other geophysical problems, such as modeling the ocean thermocline, fronts, discontinuities, etc.
Mounts, W M; Liebman, M N
1997-07-01
We have developed a method for representing biological pathways and simulating their behavior based on the use of stochastic activity networks (SANs). SANs, an extension of the original Petri net, have traditionally been used to model flow systems, including data-communications networks and manufacturing processes. We apply the methodology to the blood coagulation cascade, a biological flow system, and present the representation method as well as results of simulation studies based on published experimental data. In addition to describing the dynamic model, we also present the results of its utilization to perform simulations of clinical states, including hemophilias A and B, as well as sensitivity analysis of individual factors and their impact on thrombin production.
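As a flavor of the approach, here is a minimal sketch of stochastic token firing on a two-transition net; the species, rates, and structure are toy stand-ins, not the published coagulation model.

```python
import random

# Marking: token counts per place. Transitions consume "in" tokens and
# produce "out" tokens; a catalyst appears in both lists.
marking = {"factor_X": 100, "factor_Xa": 0, "prothrombin": 80, "thrombin": 0}
transitions = [
    {"in": ["factor_X"], "out": ["factor_Xa"], "rate": 0.05},
    {"in": ["factor_Xa", "prothrombin"], "out": ["factor_Xa", "thrombin"], "rate": 0.01},
]

def propensity(t):
    p = t["rate"]
    for place in t["in"]:
        p *= marking[place]
    return p

random.seed(2)
for _ in range(2000):                       # Gillespie-style stochastic firing
    props = [propensity(t) for t in transitions]
    total = sum(props)
    if total == 0:
        break                               # no transition can fire
    r, acc = random.uniform(0, total), 0.0
    for t, p in zip(transitions, props):
        acc += p
        if r <= acc:
            for place in t["in"]:
                marking[place] -= 1
            for place in t["out"]:
                marking[place] += 1
            break
print(marking)   # thrombin accumulates via activated factor X
```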
Dynamic Emulation Modelling (DEMo) of large physically-based environmental models
NASA Astrophysics Data System (ADS)
Galelli, S.; Castelletti, A.
2012-12-01
In environmental modelling, large, spatially-distributed, physically-based models are widely adopted to describe the dynamics of physical, social and economic processes. Such an accurate process characterization comes, however, at a price: the computational requirements of these models are considerably high and prevent their use in any problem requiring hundreds or thousands of model runs to be satisfactorily solved. Typical examples include optimal planning and management, data assimilation, inverse modelling and sensitivity analysis. An effective approach to overcome this limitation is to perform a top-down reduction of the physically-based model by identifying a simplified, computationally efficient emulator, constructed from and then used in place of the original model in highly resource-demanding tasks. The underlying idea is that not all the process details in the original model are equally important and relevant to the dynamics of the outputs of interest for the type of problem considered. Emulation modelling has been successfully applied in many environmental applications; however, most of the literature considers non-dynamic emulators (e.g. metamodels, response surfaces and surrogate models), where the original dynamical model is reduced to a static map between the input and the output of interest. In this study we focus on Dynamic Emulation Modelling (DEMo), a methodological approach that preserves the dynamic nature of the original physically-based model, with consequent advantages in a wide variety of problem areas. In particular, we propose a new data-driven DEMo approach that combines the many advantages of data-driven modelling in representing complex, non-linear relationships, but preserves the state-space representation typical of process-based models, which is both particularly effective in some applications (e.g. optimal management and data assimilation) and facilitates the ex-post physical interpretation of the emulator structure, thus enhancing the credibility of the model to stakeholders and decision-makers. Numerical results from the application of the approach to the reduction of 3D coupled hydrodynamic-ecological models in several real-world case studies, including Marina Reservoir (Singapore) and Googong Reservoir (Australia), are illustrated.
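A minimal sketch of the state-space-preserving emulation idea, with a cheap toy recursion standing in for the expensive physically-based model; the emulator learns the one-step map and is then iterated in its place.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def physical_model_step(x, u):
    """Stand-in for one step of an expensive physically-based model."""
    return 0.9 * x + 0.5 * np.tanh(u) - 0.05 * x**2

# Generate training trajectories from the original model: (state, input) -> next state.
rng = np.random.default_rng(3)
x, rows = 1.0, []
for _ in range(2000):
    u = rng.normal()
    x_next = physical_model_step(x, u)
    rows.append((x, u, x_next))
    x = x_next
data = np.array(rows)

# Fit the emulator on the one-step state-transition map.
emu = RandomForestRegressor(n_estimators=100, random_state=0)
emu.fit(data[:, :2], data[:, 2])

# Iterate the emulator recursively, as one would in place of the full model.
x_true = x_emu = 0.5
for _ in range(20):
    u = rng.normal()
    x_true = physical_model_step(x_true, u)
    x_emu = emu.predict([[x_emu, u]])[0]
print(f"model: {x_true:.3f}, emulator: {x_emu:.3f}")
```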
Starspot detection and properties
NASA Astrophysics Data System (ADS)
Savanov, I. S.
2013-07-01
I review the currently available techniques for starspot detection, including one-dimensional spot modelling of photometric light curves. Special attention is paid to the modelling of photospheric activity based on the high-precision light curves obtained with the space missions MOST, CoRoT, and Kepler. Physical spot parameters (temperature, sizes and variability time scales, including short-term activity cycles) are discussed.
Steve Sutherland; Cara R. Nelson
2010-01-01
Invasion by nonnative plants can result in substantial adverse effects on the functions of native forest ecosystems, including nutrient cycling and fire regimes. Thus, forest managers need to be aware of the potential impacts of management activities, including silvicultural treatments, on nonnative vegetation. To aid in that effort, we created a conceptual model of...
Atmospheric, climatic and environmental research
NASA Technical Reports Server (NTRS)
Broecker, Wallace S.; Gornitz, Vivien M.
1992-01-01
Work performed on the three tasks during the report period is summarized. The climate and atmospheric modeling studies included work on climate model development and applications, paleoclimate studies, climate change applications, and SAGE II. Climate applications of Earth and planetary observations included studies on cloud climatology and planetary studies. Studies on the chemistry of the Earth and the environment are briefly described. Publications based on the above research are listed; two of these papers are included in the appendices.
NASA Technical Reports Server (NTRS)
Comfort, Richard H.; Horwitz, James L.
1993-01-01
During the course of this grant, work was performed on a variety of topics and there were a number of significant accomplishments. A summary of these accomplishments is included. The topics studied include empirical model data base, data reduction for archiving, semikinetic modeling of low energy plasma in the inner terrestrial magnetosphere and ionosphere, O(+) outflows, equatorial plasma trough, and plasma wave ray-tracing studies. A list of publications and presentations which have resulted from this research is also included.
Update on simulation-based surgical training and assessment in ophthalmology: a systematic review.
Thomsen, Ann Sofia S; Subhi, Yousif; Kiilgaard, Jens Folke; la Cour, Morten; Konge, Lars
2015-06-01
This study reviews the evidence behind simulation-based surgical training of ophthalmologists to determine (1) the validity of the reported models and (2) the ability to transfer skills to the operating room. Simulation-based training is established widely within ophthalmology, although it often lacks a scientific basis for implementation. We conducted a systematic review of trials involving simulation-based training or assessment of ophthalmic surgical skills among health professionals. The search included 5 databases (PubMed, EMBASE, PsycINFO, Cochrane Library, and Web of Science) and was completed on March 1, 2014. Overall, the included trials were divided into animal, cadaver, inanimate, and virtual-reality models. Risk of bias was assessed using the Cochrane Collaboration's tool. Validity evidence was evaluated using a modern validity framework (Messick's). We screened 1368 reports for eligibility and included 118 trials. The most common surgery simulated was cataract surgery. Most validity trials investigated only 1 or 2 of 5 sources of validity (87%). Only 2 trials (48 participants) investigated transfer of skills to the operating room; 4 trials (65 participants) evaluated the effect of simulation-based training on patient-related outcomes. Because of the heterogeneity of the studies, it was not possible to conduct a quantitative analysis. The methodologic rigor of trials investigating simulation-based surgical training in ophthalmology is inadequate. To ensure effective implementation of training models, evidence-based knowledge of validity and efficacy is needed. We provide a useful tool for implementation and evaluation of research in simulation-based training.
A data storage and retrieval model for Louisiana traffic operations data: technical summary.
DOT National Transportation Integrated Search
1996-08-01
The overall goal of this research study was to develop a prototype computer-based indexing model for traffic operation data in DOTD. The methodology included: 1) extraction of state road network, 2) development of geographic reference model, 3) engin...
Functional Behavioral Assessment: A School Based Model.
ERIC Educational Resources Information Center
Asmus, Jennifer M.; Vollmer, Timothy R.; Borrero, John C.
2002-01-01
This article begins by discussing requirements for functional behavioral assessment under the Individuals with Disabilities Education Act and then describes a comprehensive model for the application of behavior analysis in the schools. The model includes descriptive assessment, functional analysis, and intervention and involves the participation…
A fluidized bed technique for estimating soil critical shear stress
USDA-ARS?s Scientific Manuscript database
Soil erosion models, depending on how they are formulated, always have erodibilitiy parameters in the erosion equations. For a process-based model like the Water Erosion Prediction Project (WEPP) model, the erodibility parameters include rill and interrill erodibility and critical shear stress. Thes...
High altitude atmospheric modeling
NASA Technical Reports Server (NTRS)
Hedin, Alan E.
1988-01-01
Five empirical models were compared with 13 data sets, including both atmospheric drag-based data and mass spectrometer data. The most recently published model, MSIS-86, was found to be the best model overall with an accuracy around 15 percent. The excellent overall agreement of the mass spectrometer-based MSIS models with the drag data, including both the older data from orbital decay and the newer accelerometer data, suggests that the absolute calibration of the (ensemble of) mass spectrometers and the assumed drag coefficient in the atomic oxygen regime are consistent to 5 percent. This study illustrates a number of reasons for the current accuracy limit such as calibration accuracy and unmodeled trends. Nevertheless, the largest variations in total density in the thermosphere are accounted for, to a very high degree, by existing models. The greatest potential for improvements is in areas where we still have insufficient data (like the lower thermosphere or exosphere), where there are disagreements in technique (such as the exosphere) which can be resolved, or wherever generally more accurate measurements become available.
Hyperviscosity for unstructured ALE meshes
NASA Astrophysics Data System (ADS)
Cook, Andrew W.; Ulitsky, Mark S.; Miller, Douglas S.
2013-01-01
An artificial viscosity, originally designed for Eulerian schemes, is adapted for use in arbitrary Lagrangian-Eulerian simulations. Changes to the Eulerian model (dubbed 'hyperviscosity') are discussed, which enable it to work within a Lagrangian framework. New features include a velocity-weighted grid scale and a generalised filtering procedure, applicable to either structured or unstructured grids. The model employs an artificial shear viscosity for treating small-scale vorticity and an artificial bulk viscosity for shock capturing. The model is based on the Navier-Stokes form of the viscous stress tensor, including the diagonal rate-of-expansion tensor. A second-order version of the model is presented, in which Laplacian operators act on the velocity divergence and the grid-weighted strain-rate magnitude to ensure that the velocity field remains smooth at the grid scale. Unlike sound-speed-based artificial viscosities, the hyperviscosity model is compatible with the low Mach number limit. The new model outperforms a commonly used Lagrangian artificial viscosity on a variety of test problems.
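A minimal 1D sketch of the shock-capturing ingredient described above, with a Laplacian acting on the velocity divergence and a filtered magnitude setting the artificial bulk viscosity; the coefficient and smoothing stencil are illustrative, not the paper's calibrated forms.

```python
import numpy as np

n, C_bulk = 200, 1.0
dx = 1.0 / n
x = np.linspace(0.0, 1.0, n)
u = np.tanh((x - 0.5) / 0.02)     # velocity field with a sharp, shock-like jump
rho = np.ones(n)

# Divergence in 1D is du/dx; the second-order model applies a Laplacian to it.
div_u = np.gradient(u, dx)
lap_div = np.gradient(np.gradient(div_u, dx), dx)

# Smooth the magnitude with a simple 3-point filter (a stand-in for the
# generalised filtering procedure), then scale by grid spacing to the 4th power.
mag = np.abs(lap_div)
smooth = np.convolve(mag, [0.25, 0.5, 0.25], mode="same")
mu_bulk = C_bulk * rho * dx**4 * smooth

print(f"peak artificial bulk viscosity at x = {x[np.argmax(mu_bulk)]:.3f} (the jump)")
```

The viscosity localizes at the steep gradient and vanishes in smooth regions, which is the intended behavior of a shock-capturing artificial viscosity.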
Brady, Amie M.G.; Plona, Meg B.
2012-01-01
The Cuyahoga River within Cuyahoga Valley National Park (CVNP) is at times impaired for recreational use due to elevated concentrations of Escherichia coli (E. coli), a fecal-indicator bacterium. During the recreational seasons of mid-May through September during 2009–11, samples were collected 4 days per week and analyzed for E. coli concentrations at two sites within CVNP. Other water-quality and environmental data, including turbidity, rainfall, and streamflow, were measured and (or) tabulated for analysis. Regression models developed to predict recreational water quality in the river were implemented during the recreational seasons of 2009–11 for one site within CVNP, Jaite. For the 2009 and 2010 seasons, the regression models were better at predicting exceedances of Ohio's single-sample standard for primary-contact recreation compared to the traditional method of using the previous day's E. coli concentration. During 2009, the regression model was based on data collected during 2005 through 2008, excluding available 2004 data. The resulting model for 2009 did not perform as well as expected (based on the calibration data set) and tended to overestimate concentrations (correct responses at 69 percent). During 2010, the regression model was based on data collected during 2004 through 2009, including all of the available data. The 2010 model performed well, correctly predicting 89 percent of the samples above or below the single-sample standard, even though the predictions tended to be lower than actual sample concentrations. During 2011, the regression model was based on data collected during 2004 through 2010 and tended to overestimate concentrations. The 2011 model did not perform as well as the traditional method or as expected based on the calibration dataset (correct responses at 56 percent). At a second site, Lock 29, approximately 5 river miles upstream from Jaite, a regression model based on data collected at the site during the recreational seasons of 2008–10 also did not perform as well as the traditional method or as well as expected (correct responses at 60 percent). Above-normal precipitation in the region and a delayed start to the 2011 sampling season (sampling began mid-June) may have affected how well the 2011 models performed. With these new data, however, updated regression models may be better able to predict recreational water-quality conditions due to the increased diversity of water-quality conditions included in the calibration data. Daily recreational water-quality predictions for Jaite were made available on the Ohio Nowcast Web site at www.ohionowcast.info. Other public outreach included signage at trailheads in the park, articles in the park's quarterly-published schedule of events and volunteer newsletters. A U.S. Geological Survey Fact Sheet was also published to bring attention to water-quality issues in the park.
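A minimal sketch (synthetic data, illustrative threshold) of the nowcast idea: regress log10 E. coli concentration on turbidity and rainfall, then score exceedance predictions against the traditional persistence method of reusing the previous day's value.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(4)
n = 300
turb = rng.gamma(2.0, 20.0, n)                 # turbidity, NTU (synthetic)
rain = rng.exponential(5.0, n)                 # 24-h rainfall, mm (synthetic)
log_ec = 1.2 + 0.012 * turb + 0.05 * rain + rng.normal(0, 0.3, n)

model = LinearRegression().fit(np.column_stack([turb, rain]), log_ec)
pred = model.predict(np.column_stack([turb, rain]))

threshold = np.log10(298)                      # illustrative single-sample standard
exceed = log_ec > threshold
pred_exceed = pred > threshold
persistence = np.roll(log_ec, 1) > threshold   # yesterday's value (wraps at index 0)

print(f"regression correct: {np.mean(exceed == pred_exceed):.0%}, "
      f"persistence correct: {np.mean(exceed == persistence):.0%}")
```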
An assembly process model based on object-oriented hierarchical time Petri Nets
NASA Astrophysics Data System (ADS)
Wang, Jiapeng; Liu, Shaoli; Liu, Jianhua; Du, Zenghui
2017-04-01
In order to improve the versatility, accuracy and integrity of assembly process models of complex products, an assembly process model based on object-oriented hierarchical time Petri Nets is presented. A complete assembly process information model including assembly resources, assembly inspection, time, structure and flexible parts is established; this model describes the static and dynamic data involved in the assembly process. Through the analysis of three-dimensional assembly process information, the assembly information is hierarchically divided from the whole, to the local, to the details, and subnet models of different levels of object-oriented Petri Nets are established. The communication problem between Petri subnets is solved by using a message database, effectively reducing the complexity of system modeling. Finally, the modeling process is presented, and a five-layer Petri Net model is established based on the hoisting process of the engine compartment of a wheeled armored vehicle.
Comparison of Tungsten and Molybdenum Based Emitters for Advanced Thermionic Space Nuclear Reactors
NASA Astrophysics Data System (ADS)
Lee, Hsing H.; Dickinson, Jeffrey W.; Klein, Andrew C.; Lamp, Thomas R.
1994-07-01
Variations to the Advanced Thermionic Initiative thermionic fuel element are analyzed. Analysis included neutronic modeling with MCNP for criticality determination and thermal power distribution, and thermionic performance modeling with TFEHX. Changes to the original ATI configuration include the addition of W-HfC wire to the emitter for improved high-temperature creep resistance and substitution of molybdenum for the tungsten base material. Results from MCNP showed that all the tungsten used in the coating and base material must be 100% W-184 to obtain criticality. The presence of molybdenum in the emitter base affects the neutronic performance of the TFE by increasing the emitter neutron absorption cross section. Due to the reduced thermal conductivity of the molybdenum-based emitter, a higher temperature is obtained, resulting in greater electrical power production. The thermal conductivity and resistivity of the composite emitter region were derived for the W-Mo composite and used in TFEHX.
Jiao, Y; Chen, R; Ke, X; Cheng, L; Chu, K; Lu, Z; Herskovits, E H
2011-01-01
Autism spectrum disorder (ASD) is a neurodevelopmental disorder, of which Asperger syndrome and high-functioning autism are subtypes. Our goals are: 1) to determine whether a diagnostic model based on single-nucleotide polymorphisms (SNPs), brain regional thickness measurements, or brain regional volume measurements can distinguish Asperger syndrome from high-functioning autism; and 2) to compare the SNP-, thickness-, and volume-based diagnostic models. Our study included 18 children with ASD: 13 subjects with high-functioning autism and 5 subjects with Asperger syndrome. For each child, we obtained 25 SNPs for 8 ASD-related genes; we also computed regional cortical thicknesses and volumes for 66 brain structures, based on structural magnetic resonance (MR) examination. To generate diagnostic models, we employed five machine-learning techniques: decision stump, alternating decision trees, multi-class alternating decision trees, logistic model trees, and support vector machines. For SNP-based classification, three decision-tree-based models performed better than the other two machine-learning models. The performance metrics for the three decision-tree-based models were similar: the decision stump was modestly better than the other two methods, with accuracy = 90%, sensitivity = 0.95 and specificity = 0.75. All thickness- and volume-based diagnostic models performed poorly. The SNP-based diagnostic models were superior to those based on thickness and volume. For SNP-based classification, rs878960 in GABRB3 (gamma-aminobutyric acid A receptor, beta 3) was selected by all tree-based models. Our analysis demonstrated that SNP-based classification was more accurate than morphometry-based classification in ASD subtype classification. Also, we found that one SNP, rs878960 in GABRB3, distinguishes Asperger syndrome from high-functioning autism.
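A minimal sketch of the best-performing classifier type, assuming synthetic genotype data; a decision stump is simply a depth-1 decision tree, as below.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)
n_subjects, n_snps = 18, 25
X = rng.integers(0, 3, size=(n_subjects, n_snps))   # genotypes coded 0/1/2 (synthetic)
y = (X[:, 7] > 0).astype(int)                       # toy label driven by a single SNP

stump = DecisionTreeClassifier(max_depth=1)         # a single split = decision stump
acc = cross_val_score(stump, X, y, cv=3).mean()
print(f"cross-validated accuracy: {acc:.2f}; root-split SNP index: "
      f"{stump.fit(X, y).tree_.feature[0]}")
```

With only one split, the stump is forced to pick the single most discriminative SNP, which is analogous to how rs878960 was identified by all tree-based models above.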
Sharma, Nripen S.; Jindal, Rohit; Mitra, Bhaskar; Lee, Serom; Li, Lulu; Maguire, Tim J.; Schloss, Rene; Yarmush, Martin L.
2014-01-01
Skin sensitization remains a major environmental and occupational health hazard. Animal models have been used as the gold-standard method of choice for estimating chemical sensitization potential. However, a growing international drive and consensus for minimizing animal usage have prompted the development of in vitro methods to assess chemical sensitivity. In this paper, we examine existing approaches, including in silico models and cell- and tissue-based assays, for distinguishing between sensitizers and irritants. The in silico approaches that are discussed include Quantitative Structure Activity Relationships (QSAR) and QSAR-based expert models that correlate chemical molecular structure with biological activity, and mechanism-based read-across models that incorporate compound electrophilicity. The cell- and tissue-based assays rely on an assortment of mono- and co-culture cell systems in conjunction with 3D skin models. Given the complexity of allergen-induced immune responses, and the limited ability of existing systems to capture the entire gamut of cellular and molecular events associated with these responses, we also introduce a microfabricated platform that can capture all the key steps involved in allergic contact sensitivity. Finally, we describe the development of an integrated testing strategy comprised of two- or three-tier systems for evaluating the sensitization potential of chemicals. PMID:24741377
Prediction of Chemical Function: Model Development and ...
The United States Environmental Protection Agency’s Exposure Forecaster (ExpoCast) project is developing both statistical and mechanism-based computational models for predicting exposures to thousands of chemicals, including those in consumer products. The high-throughput (HT) screening-level exposures developed under ExpoCast can be combined with HT screening (HTS) bioactivity data for the risk-based prioritization of chemicals for further evaluation. The functional role (e.g. solvent, plasticizer, fragrance) that a chemical performs can drive both the types of products in which it is found and the concentration in which it is present, and therefore impacts exposure potential. However, critical chemical-use information (including functional role) is lacking for the majority of commercial chemicals for which exposure estimates are needed. A suite of machine-learning-based models for classifying chemicals in terms of their likely functional roles in products based on structure were developed. This effort required collection, curation, and harmonization of publically available data sources of chemical functional-use information from government and industry bodies. Physicochemical and structure descriptor data were generated for chemicals with function data. Machine-learning classifier models for function were then built in a cross-validated manner from the descriptor/function data using the method of random forests. The models were applied to: 1) predict chemi
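A minimal sketch of the random-forest functional-role classifier described above, with placeholder descriptors and labels standing in for the curated ExpoCast data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import classification_report

rng = np.random.default_rng(6)
roles = np.array(["solvent", "plasticizer", "fragrance"])
n = 300
X = rng.normal(size=(n, 5))                  # e.g. logP, MW, vapor pressure (placeholders)
# Synthetic labels loosely tied to the descriptors so there is signal to learn.
y = roles[(X[:, 0] + 0.5 * X[:, 1] > 0).astype(int) + (X[:, 2] > 1).astype(int)]

clf = RandomForestClassifier(n_estimators=300, random_state=0)
y_hat = cross_val_predict(clf, X, y, cv=5)   # out-of-sample predictions, cross-validated
print(classification_report(y, y_hat))
```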
Creep fatigue life prediction for engine hot section materials (isotropic)
NASA Technical Reports Server (NTRS)
Moreno, Vito; Nissley, David; Lin, Li-Sen Jim
1985-01-01
The first two years of a two-phase program aimed at improving the high-temperature crack initiation life prediction technology for gas turbine hot section components are discussed. In the Phase 1 (baseline) effort, low cycle fatigue (LCF) models, using a data base generated for a cast nickel-base gas turbine hot section alloy (B1900+Hf), were evaluated for their ability to predict the crack initiation life for relevant creep-fatigue loading conditions and to define the data required for determination of model constants. The variables included strain range and rate, mean strain, strain hold times, and temperature. None of the models predicted all of the life trends within reasonable data requirements. A Cycle Damage Accumulation (CDA) model was therefore developed, which follows an exhaustion-of-material-ductility approach. Material ductility is estimated based on observed similarities of deformation structure between fatigue, tensile and creep tests. The cycle damage function is based on total strain range, maximum stress and stress amplitude, and includes both time-independent and time-dependent components. The CDA model accurately predicts all of the trends in creep-fatigue life with loading conditions. In addition, all of the CDA model constants are determinable from rapid-cycle, fully reversed fatigue tests and monotonic tensile and/or creep data.