Sample records for the search query "existing models based"

  1. Florida Bay salinity and Everglades wetlands hydrology circa 1900 CE: A compilation of paleoecology-based statistical modeling analyses

    USGS Publications Warehouse

    Marshall, F.E.; Wingard, G.L.

    2012-01-01

    The upgraded method of coupled paleosalinity and hydrologic models was applied to the analysis of the circa-1900 CE segments of five estuarine sediment cores collected in Florida Bay. Comparisons of the observed mean stage (water level) data to the paleoecology-based model's averaged output show that the estimated stage in the Everglades wetlands was 0.3 to 1.6 feet higher at different locations. Observed mean flow data compared to the paleoecology-based model output show an estimated flow into Shark River Slough at Tamiami Trail of 401 to 2,539 cubic feet per second (cfs) higher than existing flows, and at Taylor Slough Bridge an estimated flow of 48 to 218 cfs above existing flows. For salinity in Florida Bay, the difference between paleoecology-based and observed mean salinity varies across the bay, from an aggregated average salinity 14.7 lower than existing in the northeastern basin to 1.0 lower than existing in the western basin near the transition into the Gulf of Mexico. When the salinity differences are compared by region, the differences between paleoecology-based conditions and existing conditions are spatially consistent.

  2. Innovating Method of Existing Mechanical Product Based on TRIZ Theory

    NASA Astrophysics Data System (ADS)

    Zhao, Cunyou; Shi, Dongyan; Wu, Han

    The main ways of developing products are adaptive design and variant design based on existing products. In this paper, a conceptual design framework and its flow model for innovating products are put forward by combining conceptual design methods with TRIZ theory. A process system model of innovative design is constructed that includes requirement analysis, total function analysis and decomposition, engineering problem analysis, finding solutions to the engineering problems, and preliminary design; this establishes the basis for the innovative redesign of existing products.

  3. Traffic Flow Density Distribution Based on FEM

    NASA Astrophysics Data System (ADS)

    Ma, Jing; Cui, Jianming

    The analysis of normal traffic flow usually relies on static or dynamic models that are analyzed numerically on the basis of fluid mechanics. However, this handling process involves massive modeling and data-handling problems, and its accuracy is not high. The Finite Element Method (FEM) is a product of the combination of modern mathematics, mechanics, and computer technology, and it has been widely applied in various domains such as engineering. Based on the existing theory of traffic flow, ITS, and the development of FEM, a simulation theory of the FEM that solves the problems existing in traffic flow analysis is put forward. Based on this theory, and using existing Finite Element Analysis (FEA) software, the traffic flow is simulated and analyzed with fluid mechanics and dynamics. The problem of massive data processing in manual modeling and numerical analysis is solved, and the authenticity of the simulation is enhanced.

  4. Models of Integrating Physical Therapists into Family Health Teams in Ontario, Canada: Challenges and Opportunities

    PubMed Central

    Mandoda, Shilpa; Landry, Michel D.

    2011-01-01

    ABSTRACT Purpose: To explore the potential for different models of incorporating physical therapy (PT) services within the emerging network of family health teams (FHTs) in Ontario and to identify challenges and opportunities of each model. Methods: A two-phase mixed-methods qualitative descriptive approach was used. First, FHTs were mapped in relation to existing community-based PT practices. Second, semi-structured key-informant interviews were conducted with representatives from urban and rural FHTs and from a variety of community-based PT practices. Interviews were digitally recorded, transcribed verbatim, and analyzed using a categorizing/editing approach. Results: Most participants agreed that the ideal model involves embedding physical therapists directly into FHTs; in some situations, however, partnering with an existing external PT provider may be more feasible and sustainable. Access and funding remain the key issues, regardless of the model adopted. Conclusion: Although there are differences across the urban/rural divide, there exist opportunities to enhance and optimize existing delivery models so as to improve client access and address emerging demand for community-based PT services. PMID:22654231

  5. Catchment area-based evaluation of the AMC-dependent SCS-CN-based rainfall-runoff models

    NASA Astrophysics Data System (ADS)

    Mishra, S. K.; Jain, M. K.; Pandey, R. P.; Singh, V. P.

    2005-09-01

    Using a large set of rainfall-runoff data from 234 watersheds in the USA, a catchment area-based evaluation of the modified version of the Mishra and Singh (2002a) model was performed. The model is based on the Soil Conservation Service Curve Number (SCS-CN) methodology and incorporates the antecedent moisture in computation of direct surface runoff. Comparison with the existing SCS-CN method showed that the modified version performed better than did the existing one on the data of all seven area-based groups of watersheds ranging from 0.01 to 310.3 km2.
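
    The model family above builds on the standard SCS-CN runoff relation. As a point of reference, here is a minimal sketch of the baseline equation; the Mishra-Singh modification adds an antecedent-moisture term that is not reproduced here, and the parameter values are illustrative only.

    ```python
    # Baseline SCS-CN direct runoff: Q = (P - Ia)^2 / (P - Ia + S), with
    # S = 25400/CN - 254 (mm) and initial abstraction Ia = lambda * S.
    # The antecedent-moisture-modified version of the paper is not shown.

    def scs_cn_runoff(p_mm: float, cn: float, lam: float = 0.2) -> float:
        """Direct surface runoff Q (mm) for storm rainfall P (mm) and curve number CN."""
        s = 25400.0 / cn - 254.0      # potential maximum retention (mm)
        ia = lam * s                  # initial abstraction (mm)
        if p_mm <= ia:
            return 0.0
        return (p_mm - ia) ** 2 / (p_mm - ia + s)

    print(scs_cn_runoff(p_mm=80.0, cn=75.0))  # ~27 mm of runoff
    ```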

  6. Existence of periodic solutions in a model of respiratory syncytial virus RSV

    NASA Astrophysics Data System (ADS)

    Arenas, Abraham J.; González, Gilberto; Jódar, Lucas

    2008-08-01

    In this paper we study the existence of positive periodic solutions for nested models of respiratory syncytial virus (RSV), by using a continuation theorem based on coincidence degree theory. Conditions for the existence of periodic solutions in the model are given. Numerical simulations related to the transmission of respiratory syncytial virus in Madrid and Rio de Janeiro are included.

  7. A systematic literature review of open source software quality assessment models.

    PubMed

    Adewumi, Adewole; Misra, Sanjay; Omoregbe, Nicholas; Crawford, Broderick; Soto, Ricardo

    2016-01-01

    Many open source software (OSS) quality assessment models are proposed and available in the literature. However, there is little or no adoption of these models in practice. To guide the formulation of newer models that practitioners will accept, the existing models need to be clearly discriminated based on their specific properties. The aim of this study is therefore to perform a systematic literature review investigating the properties of the existing OSS quality assessment models, classifying them with respect to their quality characteristics, the methodology they use for assessment, and their domain of application, so as to guide the formulation and development of newer models. Searches in IEEE Xplore, ACM, Science Direct, Springer, and Google Search were performed to retrieve all relevant primary studies. Journal and conference papers between 2003 and 2015 were considered, since the first known OSS quality model emerged in 2003. A total of 19 OSS quality assessment model papers were selected; to select these models, we developed assessment criteria to evaluate the quality of the existing studies. The quality assessment models are classified into five categories based on the quality characteristics they possess, namely: single-attribute, rounded category, community-only attribute, non-community attribute, and non-quality-in-use models. Our study reflects that software selection based on hierarchical structures is the most popular selection method in the existing OSS quality assessment models. Furthermore, we found that the largest share (47%) of the existing models do not specify any domain of application. In conclusion, our study is a valuable contribution to the community: it helps the developers of quality assessment models formulate newer models, and it helps practitioners (software evaluators) select suitable OSS from among alternatives.

  8. AWARE: Adaptive Software Monitoring and Dynamic Reconfiguration for Critical Infrastructure Protection

    DTIC Science & Technology

    2015-04-29

    in which we applied these adaptation patterns to an adaptive news web server intended to tolerate extremely heavy, unexpected loads. To address...collection of existing models used as benchmarks for OO-based refactoring and an existing web-based repository called REMODD to provide users with model...invariant properties. Specifically, we developed Avida-MDE (based on the Avida digital evolution platform) to support the automatic generation of software

  9. A Multi-layer Dynamic Model for Coordination Based Group Decision Making in Water Resource Allocation and Scheduling

    NASA Astrophysics Data System (ADS)

    Huang, Wei; Zhang, Xingnan; Li, Chenming; Wang, Jianying

    Management of group decision making is an important issue in water resource management development. To overcome the lack of effective communication and cooperation in existing decision-making models, this paper proposes a multi-layer dynamic model for coordination in group decision making for water resource allocation and scheduling. By introducing a scheme-recognized cooperative satisfaction index and a scheme-adjusted rationality index, the proposed model solves the problem of poor convergence of the multi-round decision-making process in water resource allocation and scheduling. Furthermore, the coordination problem in group decision making under limited resources can be addressed through the effectiveness of distance-based group conflict resolution. The simulation results show that the proposed model has better convergence than the existing models.

  10. Rule-based simulation models

    NASA Technical Reports Server (NTRS)

    Nieten, Joseph L.; Seraphine, Kathleen M.

    1991-01-01

    Procedural modeling systems, rule-based modeling systems, and a method for converting a procedural model to a rule-based model are described. Simulation models are used to represent real-time engineering systems. A real-time system can be represented by a set of equations or functions connected so that they perform in the same manner as the actual system. Most modeling-system languages are based on FORTRAN or some other procedural language; therefore, they must be enhanced with a reaction capability. Rule-based systems are reactive by definition. Once the engineering system has been decomposed into a set of calculations using only basic algebraic unary operations, a knowledge network of calculations and functions can be constructed. The knowledge network required by a rule-based system can be generated by a knowledge acquisition tool or a source-level compiler. The compiler would take an existing model source file, a syntax template, and a symbol table and generate the knowledge network. Thus, existing procedural models can be translated and executed by a rule-based system. Neural models can provide the high-capacity data manipulation required by the most complex real-time models.
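
    As an illustration of the reactive execution described above, the following is a minimal forward-chaining sketch in which each calculation fires once all of its inputs are known. All names are hypothetical and not taken from the NASA tool.

    ```python
    # Minimal forward-chaining sketch: each "rule" is a calculation that fires
    # as soon as all of its input symbols are known. Names are illustrative,
    # not from the NASA modeling system described above.

    def run_network(rules, facts):
        """rules: list of (inputs, output, fn); facts: dict symbol -> value."""
        pending = list(rules)
        fired = True
        while fired:
            fired = False
            for rule in list(pending):
                inputs, output, fn = rule
                if all(name in facts for name in inputs):
                    facts[output] = fn(*(facts[n] for n in inputs))
                    pending.remove(rule)
                    fired = True
        return facts

    # y = a*x + b decomposed into two elementary calculations:
    rules = [
        (("a", "x"), "ax", lambda a, x: a * x),
        (("ax", "b"), "y", lambda ax, b: ax + b),
    ]
    print(run_network(rules, {"a": 2.0, "x": 3.0, "b": 1.0})["y"])  # 7.0
    ```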

  11. Building occupancy simulation and data assimilation using a graph-based agent-oriented model

    NASA Astrophysics Data System (ADS)

    Rai, Sanish; Hu, Xiaolin

    2018-07-01

    Building occupancy simulation and estimation simulates the dynamics of occupants and estimates their real-time spatial distribution in a building. It requires a simulation model and a data assimilation algorithm that assimilates real-time sensor data into the simulation model. Existing building occupancy simulation models include agent-based models and graph-based models. Agent-based models suffer from high computation cost when simulating large numbers of occupants, while graph-based models overlook the heterogeneity and detailed behaviors of individuals. Recognizing the limitations of existing models, this paper presents a new graph-based agent-oriented model that can efficiently simulate large numbers of occupants in various kinds of building structures. To support real-time occupancy dynamics estimation, a data assimilation framework based on Sequential Monte Carlo methods is also developed and applied to the graph-based agent-oriented model to assimilate real-time sensor data. Experimental results show the effectiveness of the developed model and the data assimilation framework. The major contributions of this work are an efficient model for building occupancy simulation that can accommodate large numbers of occupants and an effective data assimilation framework that can provide real-time estimates of building occupancy from sensor data.
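
    The data assimilation framework is built on Sequential Monte Carlo methods. The following is a generic bootstrap particle filter sketch with placeholder occupancy dynamics and a Gaussian total-count sensor model, not the authors' actual framework.

    ```python
    # Generic bootstrap particle filter: predict each particle with the model,
    # weight by the sensor likelihood, then resample. The random-walk dynamics
    # and the total-occupancy sensor below are placeholders.
    import numpy as np

    rng = np.random.default_rng(0)

    def simulate_step(state):
        # placeholder occupancy dynamics: random walk on per-node counts
        return np.clip(state + rng.integers(-1, 2, size=state.shape), 0, None)

    def assimilate(particles, observation, sigma=2.0):
        particles = np.array([simulate_step(p) for p in particles])   # predict
        # weight by likelihood of the sensor reading given each particle
        w = np.exp(-0.5 * ((particles.sum(axis=1) - observation) / sigma) ** 2)
        w /= w.sum()
        idx = rng.choice(len(particles), size=len(particles), p=w)    # resample
        return particles[idx]

    particles = rng.integers(0, 5, size=(500, 10))  # 500 particles, 10 graph nodes
    for obs in [22, 25, 30]:                        # total-occupancy readings
        particles = assimilate(particles, obs)
    print(particles.sum(axis=1).mean())             # posterior mean total occupancy
    ```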

  12. Improved dual-porosity models for petrophysical analysis of vuggy reservoirs

    NASA Astrophysics Data System (ADS)

    Wang, Haitao

    2017-08-01

    A new vug interconnection, isolated vug (IVG), was investigated through resistivity modeling and the dual-porosity model for connected vug (CVG) vuggy reservoirs was tested. The vuggy models were built by pore-scale modeling, and their electrical resistivity was calculated by the finite difference method. For CVG vuggy reservoirs, the CVG reduced formation factors and increased the porosity exponents, and the existing dual-porosity model failed to match these results. Based on the existing dual-porosity model, a conceptual dual-porosity model for CVG was developed by introducing a decoupled term to reduce the resistivity of the model. For IVG vuggy reservoirs, IVG increased the formation factors and porosity exponents. The existing dual-porosity model succeeded due to accurate calculation of the formation factors of the deformed interparticle porous media caused by the insertion of the IVG. Based on the existing dual-porosity model, a new porosity model for IVG vuggy reservoirs was developed by simultaneously recalculating the formation factors of the altered interparticle pore-scale models. The formation factors and porosity exponents from the improved and extended dual-porosity models for CVG and IVG vuggy reservoirs well matched the simulated formation factors and porosity exponents. This work is helpful for understanding the influence of connected and disconnected vugs on resistivity factors—an issue of particular importance in carbonates.

  13. [Modeling in value-based medicine].

    PubMed

    Neubauer, A S; Hirneiss, C; Kampik, A

    2010-03-01

    Modeling plays an important role in value-based medicine (VBM). It supports decisions by predicting potential clinical and economic consequences, frequently combining different sources of evidence. Based on relevant publications and examples focusing on ophthalmology, the key economic modeling methods are explained and definitions are given. The most frequently applied model types are decision trees, Markov models, and discrete event simulation (DES) models. Model validation includes, besides verification of internal validity, comparison with other models (external validity) and, ideally, validation of the model's predictive properties. The uncertainty inherent in any modeling should be clearly stated; this is true for economic modeling in VBM as well as for disease risk models used to support clinical decisions. In economic modeling, uni- and multivariate sensitivity analyses are usually applied; the key concepts here are tornado plots and cost-effectiveness acceptability curves. Given the existing uncertainty, modeling helps to make better-informed decisions than would be possible without this additional information.

  14. Review: Modelling chemical kinetics and convective heating in giant planet entries

    NASA Astrophysics Data System (ADS)

    Reynier, Philippe; D'Ammando, Giuliano; Bruno, Domenico

    2018-01-01

    A review of the existing chemical kinetics models for H2 / He mixtures and related transport and thermodynamic properties is presented as a pre-requisite towards the development of innovative models based on the state-to-state approach. A survey of the available results obtained during the mission preparation and post-flight analyses of the Galileo mission has been undertaken and a computational matrix has been derived. Different chemical kinetics schemes for hydrogen/helium mixtures have been applied to numerical simulations of the selected points along the entry trajectory. First, a reacting scheme, based on literature data, has been set up for computing the flow-field around the probe at high altitude and comparisons with existing numerical predictions are performed. Then, a macroscopic model derived from a state-to-state model has been constructed and incorporated into a CFD code. Comparisons with existing numerical results from the literature have been performed as well as cross-check comparisons between the predictions provided by the different models in order to evaluate the potential of innovative chemical kinetics models based on the state-to-state approach.

  15. ProbOnto: ontology and knowledge base of probability distributions.

    PubMed

    Swat, Maciej J; Grenon, Pierre; Wimalaratne, Sarala

    2016-09-01

    Probability distributions play a central role in mathematical and statistical modelling. The encoding, annotation and exchange of such models could be greatly simplified by a resource providing a common reference for the definition of probability distributions. Although some resources exist, no suitably detailed and complex ontology exists, nor does any database allowing programmatic access. ProbOnto is an ontology-based knowledge base of probability distributions, featuring more than 80 uni- and multivariate distributions with their defining functions, characteristics, relationships and re-parameterization formulas. It can be used for model annotation and facilitates the encoding of distribution-based models, related functions and quantities. http://probonto.org mjswat@ebi.ac.uk Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
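
    A trivial illustration of the kind of re-parameterization formula ProbOnto catalogs, here between the standard-deviation and precision forms of the normal distribution (plain Python, not the ProbOnto API):

    ```python
    # Re-parameterization of the normal distribution:
    # Normal(mean, sd) <-> Normal(mean, precision), with precision tau = 1/sd^2.
    # A plain sketch of the formula, not code from the ProbOnto resource.

    def sd_to_precision(mean: float, sd: float) -> tuple[float, float]:
        return mean, 1.0 / sd**2

    def precision_to_sd(mean: float, tau: float) -> tuple[float, float]:
        return mean, tau ** -0.5

    print(sd_to_precision(0.0, 2.0))   # (0.0, 0.25)
    print(precision_to_sd(0.0, 0.25))  # (0.0, 2.0)
    ```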

  16. A Transactional Model of Bullying and Victimization

    ERIC Educational Resources Information Center

    Georgiou, Stelios N.; Fanti, Kostas A.

    2010-01-01

    The purpose of the current study was to develop and test a transactional model, based on longitudinal data, capable of describing the existing interrelation between maternal behavior and child bullying and victimization experiences over time. The results confirmed the existence of such a model for bullying, but not for victimization in terms of…

  17. Computational studies of horizontal axis wind turbines in high wind speed condition using advanced turbulence models

    NASA Astrophysics Data System (ADS)

    Benjanirat, Sarun

    Next generation horizontal-axis wind turbines (HAWTs) will operate at very high wind speeds. Existing engineering approaches for modeling the flow phenomena are based on blade element theory, and cannot adequately account for 3-D separated, unsteady flow effects. Therefore, researchers around the world are beginning to model these flows using first principles-based computational fluid dynamics (CFD) approaches. In this study, an existing first principles-based Navier-Stokes approach is being enhanced to model HAWTs at high wind speeds. The enhancements include improved grid topology, implicit time-marching algorithms, and advanced turbulence models. The advanced turbulence models include the Spalart-Allmaras one-equation model and the k-epsilon, k-omega, and Shear Stress Transport (k-omega SST) models. These models are also integrated with detached eddy simulation (DES) models. Results are presented for a range of wind speeds for a configuration termed the National Renewable Energy Laboratory Phase VI rotor, tested at NASA Ames Research Center. Grid sensitivity studies are also presented. Additionally, effects of existing transition models on the predictions are assessed. Data presented include power/torque production, radial distribution of normal and tangential pressure forces, root bending moments, and surface pressure fields. Good agreement was obtained between the predictions and experiments for most of the conditions, particularly with the Spalart-Allmaras-DES model.

  18. A 3D model retrieval approach based on Bayesian networks lightfield descriptor

    NASA Astrophysics Data System (ADS)

    Xiao, Qinhan; Li, Yanjun

    2009-12-01

    A new 3D model retrieval methodology is proposed by exploiting a novel Bayesian networks lightfield descriptor (BNLD). There are two key novelties in our approach: (1) a BN-based method for building the lightfield descriptor; and (2) a 3D model retrieval scheme based on the proposed BNLD. To overcome the disadvantages of existing 3D model retrieval methods, we explore BNs for building a new lightfield descriptor. First, the 3D model is placed in a lightfield and about 300 binary views are obtained along a sphere; Fourier descriptors and Zernike moments descriptors are then calculated from the binary views, and the shape feature sequences are learned into a BN model using a BN learning algorithm. Second, we propose a new 3D model retrieval method that calculates the Kullback-Leibler divergence (KLD) between BNLDs. Benefiting from statistical learning, our BNLD is robust to noise compared to the existing methods. A comparison between our method and the lightfield descriptor-based approach is conducted to demonstrate the effectiveness of the proposed methodology.
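
    The retrieval step ranks candidate models by Kullback-Leibler divergence. A sketch for discrete distributions follows; the paper computes the KLD between learned Bayesian-network descriptors, which this does not rebuild.

    ```python
    # Ranking candidates by Kullback-Leibler divergence, KL(p||q) = sum p*log(p/q).
    # The feature vectors below are invented; smaller divergence = better match.
    import numpy as np

    def kld(p, q, eps=1e-12):
        p = np.asarray(p, float) + eps
        q = np.asarray(q, float) + eps
        p, q = p / p.sum(), q / q.sum()
        return float(np.sum(p * np.log(p / q)))

    query = [0.5, 0.3, 0.2]
    models = {"chair": [0.45, 0.35, 0.2], "plane": [0.1, 0.2, 0.7]}
    ranked = sorted(models, key=lambda m: kld(query, models[m]))
    print(ranked)  # ['chair', 'plane']
    ```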

  19. THE MODELING OF THE FATE AND TRANSPORT OF ENVIRONMENTAL POLLUTANTS

    EPA Science Inventory

    Current models that predict the fate of organic compounds released to the environment are based on the assumption that these compounds exist exclusively as neutral species. This assumption is untrue under many environmental conditions, as some molecules can exist as cations, anions…

  20. Integrated Electronic Warfare Systems Aboard the United States Navy 21st Century Warship

    DTIC Science & Technology

    2009-12-01

    ...complete range of automated operation using a Human-In-the-Loop that could be integrated into existing and future combat systems. A model was developed that demonstrates...

  1. Generalized Ordinary Differential Equation Models 1

    PubMed Central

    Miao, Hongyu; Wu, Hulin; Xue, Hongqi

    2014-01-01

    Existing estimation methods for ordinary differential equation (ODE) models are not applicable to discrete data. The generalized ODE (GODE) model is therefore proposed and investigated for the first time. We develop the likelihood-based parameter estimation and inference methods for GODE models. We propose robust computing algorithms and rigorously investigate the asymptotic properties of the proposed estimator by considering both measurement errors and numerical errors in solving ODEs. The simulation study and application of our methods to an influenza viral dynamics study suggest that the proposed methods have a superior performance in terms of accuracy over the existing ODE model estimation approach and the extended smoothing-based (ESB) method. PMID:25544787
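
    The general setting of GODE estimation, an ODE observed with noise at discrete times and fitted by maximizing a likelihood, can be sketched as follows; the logistic ODE and Gaussian errors are illustrative stand-ins, not the influenza model from the paper.

    ```python
    # Likelihood-based parameter estimation for a noisily observed ODE:
    # solve the ODE numerically, score the residuals, optimize the parameters.
    import numpy as np
    from scipy.integrate import solve_ivp
    from scipy.optimize import minimize

    t_obs = np.linspace(0.0, 10.0, 25)
    rng = np.random.default_rng(1)

    def solve(theta):
        r, k = theta                              # logistic growth rate, capacity
        sol = solve_ivp(lambda t, x: r * x * (1 - x / k), (0.0, 10.0), [0.1],
                        t_eval=t_obs, rtol=1e-8)
        return sol.y[0]

    y_obs = solve((0.9, 5.0)) + rng.normal(0.0, 0.1, t_obs.size)  # synthetic data

    def neg_log_lik(theta):
        resid = y_obs - solve(theta)
        return 0.5 * np.sum(resid**2) / 0.1**2    # Gaussian errors, known sigma

    fit = minimize(neg_log_lik, x0=[0.5, 4.0], method="Nelder-Mead")
    print(fit.x)  # recovers roughly (0.9, 5.0)
    ```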

  2. Generalized Ordinary Differential Equation Models.

    PubMed

    Miao, Hongyu; Wu, Hulin; Xue, Hongqi

    2014-10-01

    Existing estimation methods for ordinary differential equation (ODE) models are not applicable to discrete data. The generalized ODE (GODE) model is therefore proposed and investigated for the first time. We develop the likelihood-based parameter estimation and inference methods for GODE models. We propose robust computing algorithms and rigorously investigate the asymptotic properties of the proposed estimator by considering both measurement errors and numerical errors in solving ODEs. The simulation study and application of our methods to an influenza viral dynamics study suggest that the proposed methods have a superior performance in terms of accuracy over the existing ODE model estimation approach and the extended smoothing-based (ESB) method.

  3. Extending Maxwell's equations for dielectric materials using analytical principles from viscoelasticity based on the fractional calculus

    NASA Astrophysics Data System (ADS)

    Wharmby, Andrew William

    Existing fractional calculus models having a non-empirical basis used to describe constitutive relationships between stress and strain in viscoelastic materials are modified to employ all orders of fractional derivatives between zero and one. Parallels between viscoelastic and dielectric theory are drawn so that these modified fractional calculus based models for viscoelastic materials may be used to describe relationships between electric flux density and electric field intensity in dielectric materials. The resulting fractional calculus based dielectric relaxation model is tested using existing complex permittivity data in the radio-frequency bandwidth of a wide variety of homogeneous materials. The consequences that the application of this newly developed fractional calculus based dielectric relaxation model has on Maxwell's equations are also examined through the effects of dielectric dissipation and dispersion.
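
    For reference, the Riemann-Liouville fractional derivative of order alpha between zero and one is the standard operator on which such viscoelastic and dielectric models build; the paper's specific constitutive relations are not reproduced here.

    ```latex
    % Riemann-Liouville fractional derivative of order \alpha \in (0,1):
    D^{\alpha} f(t) = \frac{1}{\Gamma(1-\alpha)}\,\frac{d}{dt}
        \int_{0}^{t} \frac{f(\tau)}{(t-\tau)^{\alpha}}\, d\tau
    ```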

  4. Simulating Runoff from a Grid Based Mercury Model: Flow Comparisons

    EPA Science Inventory

    Several mercury cycling models, including general mass balance approaches, mixed-batch reactors in streams or lakes, or regional process-based models, exist to assess the ecological exposure risks associated with anthropogenically increased atmospheric mercury (Hg) deposition, so...

  5. Comparing an annual and daily time-step model for predicting field-scale phosphorus loss

    USDA-ARS?s Scientific Manuscript database

    Numerous models exist for describing phosphorus (P) losses from agricultural fields. The complexity of these models varies considerably ranging from simple empirically-based annual time-step models to more complex process-based daily time step models. While better accuracy is often assumed with more...

  6. Discovering the Power of Individual-Based Modelling in Teaching and Learning: The Study of a Predator-Prey System

    ERIC Educational Resources Information Center

    Ginovart, Marta

    2014-01-01

    The general aim is to promote the use of individual-based models (biological agent-based models) in teaching and learning contexts in life sciences and to make their progressive incorporation into academic curricula easier, complementing other existing modelling strategies more frequently used in the classroom. Modelling activities for the study…

  7. A performance model for GPUs with caches

    DOE PAGES

    Dao, Thanh Tuan; Kim, Jungwon; Seo, Sangmin; ...

    2014-06-24

    To exploit the abundant computational power of the world's fastest supercomputers, an even workload distribution to the typically heterogeneous compute devices is necessary. While relatively accurate performance models exist for conventional CPUs, accurate performance estimation models for modern GPUs do not exist. This paper presents two accurate models for modern GPUs: a sampling-based linear model, and a model based on machine-learning (ML) techniques which improves the accuracy of the linear model and is applicable to modern GPUs with and without caches. We first construct the sampling-based linear model to predict the runtime of an arbitrary OpenCL kernel. Based on an analysis of NVIDIA GPUs' scheduling policies, we determine the earliest sampling points that allow an accurate estimation. The linear model cannot capture well the significant effects that memory coalescing or caching as implemented in modern GPUs have on performance. We therefore propose a model based on ML techniques that takes several compiler-generated statistics about the kernel as well as the GPU's hardware performance counters as additional inputs to obtain a more accurate runtime performance estimation for modern GPUs. We demonstrate the effectiveness and broad applicability of the model by applying it to three different NVIDIA GPU architectures and one AMD GPU architecture. On an extensive set of OpenCL benchmarks, on average, the proposed model estimates the runtime performance with less than 7 percent error for a second-generation GTX 280 with no on-chip caches and less than 5 percent for the Fermi-based GTX 580 with hardware caches. On the Kepler-based GTX 680, the linear model has an error of less than 10 percent. On the AMD Radeon HD 6970 architecture, the model estimates with an error rate of 8 percent. As a result, the proposed technique outperforms existing models by a factor of 5 to 6 in terms of accuracy.
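
    The idea of the sampling-based linear model can be sketched in a few lines: time a kernel at a few workload sizes, fit a line, extrapolate. The paper's model chooses scheduling-aware sampling points; the numbers below are invented.

    ```python
    # Sampling-based linear runtime model: runtime ~ b0 + b1 * workload_size,
    # fitted by least squares to a handful of timed sample runs (invented data).
    import numpy as np

    samples_n = np.array([1e4, 2e4, 4e4, 8e4])   # sampled workload sizes
    samples_t = np.array([0.8, 1.5, 2.9, 5.8])   # measured runtimes (ms)

    b1, b0 = np.polyfit(samples_n, samples_t, deg=1)  # slope, intercept
    predict = lambda n: b0 + b1 * n
    print(predict(1e6))                               # extrapolated runtime (ms)
    ```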

  8. Assimilation of Spatially Sparse In Situ Soil Moisture Networks into a Continuous Model Domain

    NASA Astrophysics Data System (ADS)

    Gruber, A.; Crow, W. T.; Dorigo, W. A.

    2018-02-01

    Growth in the availability of near-real-time soil moisture observations from ground-based networks has spurred interest in the assimilation of these observations into land surface models via a two-dimensional data assimilation system. However, the design of such systems is currently hampered by our ignorance concerning the spatial structure of error afflicting ground and model-based soil moisture estimates. Here we apply newly developed triple collocation techniques to provide the spatial error information required to fully parameterize a two-dimensional (2-D) data assimilation system designed to assimilate spatially sparse observations acquired from existing ground-based soil moisture networks into a spatially continuous Antecedent Precipitation Index (API) model for operational agricultural drought monitoring. Over the contiguous United States (CONUS), the posterior uncertainty of surface soil moisture estimates associated with this 2-D system is compared to that obtained from the 1-D assimilation of remote sensing retrievals to assess the value of ground-based observations to constrain a surface soil moisture analysis. Results demonstrate that a fourfold increase in existing CONUS ground station density is needed for ground network observations to provide a level of skill comparable to that provided by existing satellite-based surface soil moisture retrievals.
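
    Classical covariance-based triple collocation recovers the error variance of each of three collocated estimates from pairwise covariances, assuming mutually independent errors. A textbook sketch, not the paper's extended TC machinery:

    ```python
    # Covariance-based triple collocation: for three estimates x, y, z of the
    # same signal with independent errors, var(e_x) = Cxx - Cxy*Cxz/Cyz, etc.
    import numpy as np

    def tc_error_variances(x, y, z):
        c = np.cov(np.vstack([x, y, z]))
        var_ex = c[0, 0] - c[0, 1] * c[0, 2] / c[1, 2]
        var_ey = c[1, 1] - c[0, 1] * c[1, 2] / c[0, 2]
        var_ez = c[2, 2] - c[0, 2] * c[1, 2] / c[0, 1]
        return var_ex, var_ey, var_ez

    rng = np.random.default_rng(2)
    truth = rng.normal(size=20000)
    x = truth + rng.normal(0, 0.3, truth.size)   # e.g. ground network
    y = truth + rng.normal(0, 0.5, truth.size)   # e.g. satellite retrieval
    z = truth + rng.normal(0, 0.4, truth.size)   # e.g. API model
    print(tc_error_variances(x, y, z))           # ~ (0.09, 0.25, 0.16)
    ```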

  9. Model-Based Reasoning in the Physics Laboratory: Framework and Initial Results

    ERIC Educational Resources Information Center

    Zwickl, Benjamin M.; Hu, Dehui; Finkelstein, Noah; Lewandowski, H. J.

    2015-01-01

    We review and extend existing frameworks on modeling to develop a new framework that describes model-based reasoning in introductory and upper-division physics laboratories. Constructing and using models are core scientific practices that have gained significant attention within K-12 and higher education. Although modeling is a broadly applicable…

  10. Comparing microscopic activity-based and traditional models of travel demand : an Austin area case study

    DOT National Transportation Integrated Search

    2007-09-01

    Two competing approaches to travel demand modeling exist today. The more traditional 4-step travel demand models rely on aggregate demographic data at a traffic analysis zone (TAZ) level. Activity-based microsimulation methods employ more robust...

  11. A dynamical system of deposit and loan volumes based on the Lotka-Volterra model

    NASA Astrophysics Data System (ADS)

    Sumarti, N.; Nurfitriyana, R.; Nurwenda, W.

    2014-02-01

    In this research, we propose a dynamical system for the deposit and loan volumes of a bank using a predator-prey paradigm, where the predator is the loan volume and the prey is the deposit volume. The existence of loans depends on the existence of deposits because the bank allocates the loan volume from a portion of the deposit volume. The dynamical systems constructed are a simple model, a model with a Michaelis-Menten response, and a model with a reserve requirement. The equilibria of the systems are analysed for stability on the basis of the linearised systems.
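
    The classic Lotka-Volterra form behind the simple model, with deposits as prey and loans as predator, can be simulated directly. The parameter values below are illustrative only; the Michaelis-Menten and reserve-requirement variants modify these right-hand sides.

    ```python
    # Classic Lotka-Volterra dynamics with deposits D as prey, loans L as predator:
    # dD/dt = a*D - b*D*L,  dL/dt = -c*L + d*D*L. Parameters are illustrative.
    from scipy.integrate import solve_ivp

    a, b, c, d = 0.8, 0.4, 0.6, 0.25

    def bank(t, state):
        D, L = state
        return [a * D - b * D * L,   # deposits grow, are drawn on to fund loans
                -c * L + d * D * L]  # loans run off unless refinanced by deposits

    sol = solve_ivp(bank, (0.0, 40.0), [2.0, 1.0], dense_output=True)
    print(sol.y[:, -1])              # deposit and loan volumes at t = 40
    ```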

  12. Analysis and model on space-time characteristics of wind power output based on the measured wind speed data

    NASA Astrophysics Data System (ADS)

    Shi, Wenhui; Feng, Changyou; Qu, Jixian; Zha, Hao; Ke, Dan

    2018-02-01

    Most existing studies of wind power output focus on the fluctuation of wind farms, while the spatial self-complementarity of wind power output time series is ignored. As a result, existing probability models cannot reflect the features of power systems incorporating wind farms. This paper analyzes the spatial self-complementarity of wind power and proposes a probability model that reflects the temporal characteristics of wind power on seasonal and diurnal timescales, based on sufficient measured data and an improved clustering method. This model can provide an important reference for the simulation of power systems incorporating wind farms.

  13. Collaborative Care in Schools: Enhancing Integration and Impact in Youth Mental Health

    PubMed Central

    Lyon, Aaron R.; Whitaker, Kelly; French, William P.; Richardson, Laura P.; Wasse, Jessica Knaster; McCauley, Elizabeth

    2016-01-01

    Collaborative Care is an innovative approach to integrated mental health service delivery that focuses on reducing access barriers, improving service quality, and lowering healthcare expenditures. A large body of evidence supports the effectiveness of Collaborative Care models with adults and, increasingly, for youth. Although existing studies examining these models for youth have focused exclusively on primary care, the education sector is also an appropriate analog for the accessibility that primary care offers to adults. Collaborative Care aligns closely with the practical realities of the education sector and may represent a strategy to achieve some of the objectives of increasingly popular multi-tiered systems of supports frameworks. Unfortunately, no resources exist to guide the application of Collaborative Care models in schools. Based on the existing evidence for Collaborative Care models, the current paper (1) provides a rationale for the adaptation of Collaborative Care models to improve mental health service accessibility and effectiveness in the education sector; (2) presents a preliminary Collaborative Care model for use in schools; and (3) describes avenues for research surrounding school-based Collaborative Care, including the currently funded Accessible, Collaborative Care for Effective School-based Services (ACCESS) project. PMID:28392832

  14. The Implementation and Evaluation of a Project-Oriented Problem-Based Learning Module in a First Year Engineering Programme

    ERIC Educational Resources Information Center

    McLoone, Seamus C.; Lawlor, Bob J.; Meehan, Andrew R.

    2016-01-01

    This paper describes how a circuits-based project-oriented problem-based learning educational model was integrated into the first year of a Bachelor of Engineering in Electronic Engineering programme at Maynooth University, Ireland. While many variations of problem-based learning exist, the presented model is closely aligned with the model used in…

  15. Load Modeling – A Review

    DOE PAGES

    Arif, Anmar; Wang, Zhaoyu; Wang, Jianhui; ...

    2017-05-02

    Load modeling has significant impact on power system studies. This paper presents a review on load modeling and identification techniques. Load models can be classified into two broad categories: static and dynamic models, while there are two types of approaches to identify model parameters: measurement-based and component-based. Load modeling has received more attention in recent years because of the renewable integration, demand-side management, and smart metering devices. However, the commonly used load models are outdated, and cannot represent emerging loads. There is a need to systematically review existing load modeling techniques and suggest future research directions to meet the increasing interests from industry and academia. In this study, we provide a thorough survey on the academic research progress and industry practices, and highlight existing issues and new trends in load modeling.
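
    As one concrete example of the static load models this review surveys, the ZIP polynomial model mixes constant-impedance (Z), constant-current (I), and constant-power (P) fractions; the coefficients below are illustrative.

    ```python
    # ZIP polynomial static load model: P(V) = P0 * (z*(V/V0)^2 + i*(V/V0) + p),
    # with voltage in per-unit (V0 = 1) and fractions z + i + p = 1 (illustrative).
    def zip_load(v_pu: float, p0: float, z: float = 0.3, i: float = 0.3,
                 p: float = 0.4) -> float:
        """Active power drawn at per-unit voltage v_pu."""
        return p0 * (z * v_pu**2 + i * v_pu + p)

    print(zip_load(0.95, p0=100.0))  # demand at a 5% voltage sag, ~95.6 MW
    ```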

  16. Load Modeling – A Review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arif, Anmar; Wang, Zhaoyu; Wang, Jianhui

    Load modeling has significant impact on power system studies. This paper presents a review on load modeling and identification techniques. Load models can be classified into two broad categories: static and dynamic models, while there are two types of approaches to identify model parameters: measurement-based and component-based. Load modeling has received more attention in recent years because of the renewable integration, demand-side management, and smart metering devices. However, the commonly used load models are outdated, and cannot represent emerging loads. There is a need to systematically review existing load modeling techniques and suggest future research directions to meet the increasing interests from industry and academia. In this study, we provide a thorough survey on the academic research progress and industry practices, and highlight existing issues and new trends in load modeling.

  17. Guiding Conformation Space Search with an All-Atom Energy Potential

    PubMed Central

    Brunette, TJ; Brock, Oliver

    2009-01-01

    The most significant impediment for protein structure prediction is the inadequacy of conformation space search. Conformation space is too large and the energy landscape too rugged for existing search methods to consistently find near-optimal minima. To alleviate this problem, we present model-based search, a novel conformation space search method. Model-based search uses highly accurate information obtained during search to build an approximate, partial model of the energy landscape. Model-based search aggregates information in the model as it progresses, and in turn uses this information to guide exploration towards regions most likely to contain a near-optimal minimum. We validate our method by predicting the structure of 32 proteins, ranging in length from 49 to 213 amino acids. Our results demonstrate that model-based search is more effective at finding low-energy conformations in high-dimensional conformation spaces than existing search methods. The reduction in energy translates into structure predictions of increased accuracy. PMID:18536015

  18. Hydrous ferric oxide: evaluation of Cd-HFO surface complexation models combining Cd(K) EXAFS data, potentiometric titration results, and surface site structures identified from mineralogical knowledge.

    PubMed

    Spadini, Lorenzo; Schindler, Paul W; Charlet, Laurent; Manceau, Alain; Vala Ragnarsdottir, K

    2003-10-01

    The surface properties of ferrihydrite were studied by combining wet chemical data, Cd(K) EXAFS data, and a surface structure and protonation model of the ferrihydrite surface. Acid-base titration experiments and Cd(II)-ferrihydrite sorption experiments were performed within 3 < -log[H(+)] < 10.5 and 0.5 < [Cd(t)] < 12 mM in 0.3 M NaClO(4) at 25 degrees C, where [Cd(t)] refers to the total Cd concentration. The titration measurements could be modeled with the acid-base reactivity of the ≡Fe-OH(-1/2) surface group, log k(int) = -8.29, assuming the existence of a unique intrinsic microscopic constant, log k(int), and consequently the existence of a single significant type of acid-base reactive functional group. The surface structure model indicates that these groups are terminal water groups. The Cd(II) data were modeled assuming the existence of a single reactive site. The model fits the data set at low Cd(II) concentration and up to 50% surface coverage. At high coverage more Cd(II) ions than predicted are adsorbed, which is indicative of the existence of a second type of site of lower affinity. This agrees with the surface structure and protonation model developed, which indicates comparable concentrations of high- and low-affinity sites. The model further shows that for each class of low- and high-affinity sites there exists a variety of corresponding Cd surface complex structures, depending on the model crystal faces on which the complexes develop. Generally, high-affinity surface structures have surface coordinations of 3 and 4, as compared to 1 and 2 for low-affinity surface structures.

  19. Training of Existing Workers: Issues, Incentives and Models. Support Document

    ERIC Educational Resources Information Center

    Mawer, Giselle; Jackson, Elaine

    2005-01-01

    This document was produced by the authors based on their research for the report, "Training of Existing Workers: Issues, Incentives and Models," (ED495138) and is an added resource for further information. This support document is divided into the following sections: (1) The Retail Industry--A Snapshot; (2) Case Studies--Hardware, Retail…

  20. Toward better public health reporting using existing off the shelf approaches: The value of medical dictionaries in automated cancer detection using plaintext medical data.

    PubMed

    Kasthurirathne, Suranga N; Dixon, Brian E; Gichoya, Judy; Xu, Huiping; Xia, Yuni; Mamlin, Burke; Grannis, Shaun J

    2017-05-01

    Existing approaches to derive decision models from plaintext clinical data frequently depend on medical dictionaries as the sources of potential features. Prior research suggests that decision models developed using non-dictionary based feature sourcing approaches and "off the shelf" tools could predict cancer with performance metrics between 80% and 90%. We sought to compare non-dictionary based models to models built using features derived from medical dictionaries. We evaluated the detection of cancer cases from free text pathology reports using decision models built with combinations of dictionary or non-dictionary based feature sourcing approaches, 4 feature subset sizes, and 5 classification algorithms. Each decision model was evaluated using the following performance metrics: sensitivity, specificity, accuracy, positive predictive value, and area under the receiver operating characteristics (ROC) curve. Decision models parameterized using dictionary and non-dictionary feature sourcing approaches produced performance metrics between 70 and 90%. The source of features and feature subset size had no impact on the performance of a decision model. Our study suggests there is little value in leveraging medical dictionaries for extracting features for decision model building. Decision models built using features extracted from the plaintext reports themselves achieve comparable results to those built using medical dictionaries. Overall, this suggests that existing "off the shelf" approaches can be leveraged to perform accurate cancer detection using less complex Named Entity Recognition (NER) based feature extraction, automated feature selection and modeling approaches. Copyright © 2017 Elsevier Inc. All rights reserved.
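
    A non-dictionary, "off the shelf" pipeline of the kind the study evaluates can be assembled from token-count features and a standard classifier; the toy reports and labels below are invented.

    ```python
    # Non-dictionary feature sourcing: features come from the report text itself
    # (token and bigram counts), not a medical dictionary. Toy data, not the
    # study's pathology corpus.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    reports = [
        "infiltrating ductal carcinoma identified in the specimen",
        "benign fibrous tissue, no evidence of malignancy",
        "adenocarcinoma present at the resection margin",
        "unremarkable colonic mucosa, no tumor seen",
    ]
    labels = [1, 0, 1, 0]  # 1 = cancer case, 0 = non-case

    clf = make_pipeline(CountVectorizer(ngram_range=(1, 2)), LogisticRegression())
    clf.fit(reports, labels)
    print(clf.predict(["specimen shows ductal carcinoma"]))  # expected: [1]
    ```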

  1. Development, Testing, and Validation of a Model-Based Tool to Predict Operator Responses in Unexpected Workload Transitions

    NASA Technical Reports Server (NTRS)

    Sebok, Angelia; Wickens, Christopher; Sargent, Robert

    2015-01-01

    One human factors challenge is predicting operator performance in novel situations. Approaches such as drawing on relevant previous experience, and developing computational models to predict operator performance in complex situations, offer potential methods to address this challenge. A few concerns with modeling operator performance are that models need to be realistic and that they need to be tested empirically and validated. In addition, many existing human performance modeling tools are complex and require that an analyst gain significant experience before being able to develop models for meaningful data collection. This paper describes an effort to address these challenges by developing an easy-to-use model-based tool, using models that were developed from a review of the existing human performance literature and targeted experimental studies, and performing an empirical validation of key model predictions.

  2. Load Model Verification, Validation and Calibration Framework by Statistical Analysis on Field Data

    NASA Astrophysics Data System (ADS)

    Jiao, Xiangqing; Liao, Yuan; Nguyen, Thai

    2017-11-01

    Accurate load models are critical for power system analysis and operation. A large amount of research work has been done on load modeling. Most of the existing research focuses on developing load models, while little has been done on developing formal load model verification and validation (V&V) methodologies or procedures. Most existing load model validation is based on qualitative rather than quantitative analysis, and not all aspects of the model V&V problem have been addressed by the existing approaches. To complement the existing methods, this paper proposes a novel load model verification and validation framework that can systematically and more comprehensively examine a load model's effectiveness and accuracy. Statistical analysis, instead of visual checks, quantifies the load model's accuracy and provides model users with a confidence level for the developed load model. The analysis results can also be used to calibrate load models. The proposed framework can be used as guidance for utility engineers and researchers to systematically examine load models. The proposed method is demonstrated through analysis of field measurements collected from a utility system.

  3. Model-based testing with UML applied to a roaming algorithm for bluetooth devices.

    PubMed

    Dai, Zhen Ru; Grabowski, Jens; Neukirchen, Helmut; Pals, Holger

    2004-11-01

    In late 2001, the Object Management Group issued a Request for Proposal to develop a testing profile for UML 2.0. In June 2003, the work on the UML 2.0 Testing Profile was finally adopted by the OMG. Since March 2004, it has become an official standard of the OMG. The UML 2.0 Testing Profile provides support for UML based model-driven testing. This paper introduces a methodology on how to use the testing profile in order to modify and extend an existing UML design model for test issues. The application of the methodology will be explained by applying it to an existing UML Model for a Bluetooth device.

  4. A Nonlinear Regression Model Estimating Single Source Concentrations of Primary and Secondarily Formed PM2.5

    EPA Science Inventory

    Various approaches and tools exist to estimate local and regional PM2.5 impacts from a single emissions source, ranging from simple screening techniques to Gaussian-based dispersion models and complex grid-based Eulerian photochemical transport models. These approaches...

  5. Rotorcraft Performance Model (RPM) for use in AEDT.

    DOT National Transportation Integrated Search

    2015-11-01

    This report documents a rotorcraft performance model for use in the FAA's Aviation Environmental Design Tool. The new rotorcraft performance model is physics-based. This new model replaces the existing helicopter trajectory modeling methods in the ...

  6. Electricity Load Forecasting Using Support Vector Regression with Memetic Algorithms

    PubMed Central

    Hu, Zhongyi; Xiong, Tao

    2013-01-01

    Electricity load forecasting is an important issue that is widely explored and examined in the power systems operation literature as well as in the literature on commercial transactions in electricity markets. Among the existing forecasting models, support vector regression (SVR) has gained much attention. Considering that the performance of SVR depends highly on its parameters, this study proposes a firefly algorithm (FA) based memetic algorithm (FA-MA) to appropriately determine the parameters of the SVR forecasting model. In the proposed FA-MA algorithm, the FA is applied to explore the solution space, and pattern search is used to conduct individual learning and thus enhance the exploitation of the FA. Experimental results confirm that the proposed FA-MA based SVR model can not only yield more accurate forecasting results than four other evolutionary-algorithm-based SVR models and three well-known forecasting models but also outperform the hybrid algorithms in the related existing literature. PMID:24459425
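
    A bare-bones sketch of the FA-MA idea: a firefly swarm explores (C, gamma) for an SVR in log space, then a simple pattern search refines the best firefly. This illustrates the division of labor between exploration and individual learning, not the authors' full algorithm.

    ```python
    # Firefly exploration + pattern-search refinement of SVR hyperparameters.
    # Swarm size, step sizes and the synthetic data are all illustrative.
    import numpy as np
    from sklearn.datasets import make_regression
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import SVR

    X, y = make_regression(n_samples=200, n_features=5, noise=10.0, random_state=0)
    rng = np.random.default_rng(0)

    def fitness(pos):                    # pos = (log10 C, log10 gamma)
        model = SVR(C=10.0 ** pos[0], gamma=10.0 ** pos[1])
        return cross_val_score(model, X, y, cv=3, scoring="r2").mean()

    pos = rng.uniform([-1, -4], [3, 0], size=(8, 2))   # 8 fireflies
    for _ in range(10):                                # exploration phase
        fit = np.array([fitness(p) for p in pos])
        for i in range(len(pos)):
            for j in range(len(pos)):
                if fit[j] > fit[i]:                    # move toward brighter firefly
                    beta = np.exp(-np.sum((pos[j] - pos[i]) ** 2))
                    pos[i] += beta * (pos[j] - pos[i]) + 0.1 * rng.uniform(-1, 1, 2)

    best = pos[np.argmax([fitness(p) for p in pos])]
    step = 0.5                                         # individual learning phase
    while step > 0.05:
        nbrs = [best + d for d in ([step, 0], [-step, 0], [0, step], [0, -step])]
        cand = max(nbrs + [best], key=fitness)
        best, step = (cand, step) if fitness(cand) > fitness(best) else (best, step / 2)
    print("C=%.3g gamma=%.3g" % (10 ** best[0], 10 ** best[1]))
    ```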

  7. Electricity load forecasting using support vector regression with memetic algorithms.

    PubMed

    Hu, Zhongyi; Bao, Yukun; Xiong, Tao

    2013-01-01

    Electricity load forecasting is an important issue that is widely explored and examined in the power systems operation literature as well as in the literature on commercial transactions in electricity markets. Among the existing forecasting models, support vector regression (SVR) has gained much attention. Considering that the performance of SVR depends highly on its parameters, this study proposes a firefly algorithm (FA) based memetic algorithm (FA-MA) to appropriately determine the parameters of the SVR forecasting model. In the proposed FA-MA algorithm, the FA is applied to explore the solution space, and pattern search is used to conduct individual learning and thus enhance the exploitation of the FA. Experimental results confirm that the proposed FA-MA based SVR model can not only yield more accurate forecasting results than four other evolutionary-algorithm-based SVR models and three well-known forecasting models but also outperform the hybrid algorithms in the related existing literature.

  8. Game-Theoretic Models of Information Overload in Social Networks

    NASA Astrophysics Data System (ADS)

    Borgs, Christian; Chayes, Jennifer; Karrer, Brian; Meeder, Brendan; Ravi, R.; Reagans, Ray; Sayedi, Amin

    We study the effect of information overload on user engagement in an asymmetric social network like Twitter. We introduce simple game-theoretic models that capture rate competition between celebrities producing updates in such networks where users non-strategically choose a subset of celebrities to follow based on the utility derived from high quality updates as well as disutility derived from having to wade through too many updates. Our two variants model the two behaviors of users dropping some potential connections (followership model) or leaving the network altogether (engagement model). We show that under a simple formulation of celebrity rate competition, there is no pure strategy Nash equilibrium under the first model. We then identify special cases in both models when pure rate equilibria exist for the celebrities: For the followership model, we show existence of a pure rate equilibrium when there is a global ranking of the celebrities in terms of the quality of their updates to users. This result also generalizes to the case when there is a partial order consistent with all the linear orders of the celebrities based on their qualities to the users. Furthermore, these equilibria can be computed in polynomial time. For the engagement model, pure rate equilibria exist when all users are interested in the same number of celebrities, or when they are interested in at most two. Finally, we also give a finite though inefficient procedure to determine if pure equilibria exist in the general case of the followership model.

  9. A Critical Review for Developing Accurate and Dynamic Predictive Models Using Machine Learning Methods in Medicine and Health Care.

    PubMed

    Alanazi, Hamdan O; Abdullah, Abdul Hanan; Qureshi, Kashif Naseer

    2017-04-01

    Recently, Artificial Intelligence (AI) has been used widely in the medicine and health care sector. Within machine learning, classification and prediction constitute a major field of AI. Today, the study of existing predictive models based on machine learning methods is extremely active. Doctors need accurate predictions of the outcomes of their patients' diseases. In addition, for accurate predictions, timing is another significant factor that influences treatment decisions. In this paper, existing predictive models in medicine and health care are critically reviewed. Furthermore, the most prominent machine learning methods are explained, and the confusion between statistical approaches and machine learning is clarified. A review of the related literature reveals that the predictions of existing predictive models differ even when the same dataset is used. Therefore, existing predictive models are essential, and current methods must be improved.

  10. An individual-based model of zebrafish population dynamics accounting for energy dynamics.

    PubMed

    Beaudouin, Rémy; Goussen, Benoit; Piccini, Benjamin; Augustine, Starrlight; Devillers, James; Brion, François; Péry, Alexandre R R

    2015-01-01

    Developing population dynamics models for zebrafish is crucial in order to extrapolate from toxicity data measured at the organism level to biological levels relevant to support and enhance ecological risk assessment. To achieve this, a dynamic energy budget for individual zebrafish (DEB model) was coupled to an individual based model of zebrafish population dynamics (IBM model). Next, we fitted the DEB model to new experimental data on zebrafish growth and reproduction thus improving existing models. We further analysed the DEB-model and DEB-IBM using a sensitivity analysis. Finally, the predictions of the DEB-IBM were compared to existing observations on natural zebrafish populations and the predicted population dynamics are realistic. While our zebrafish DEB-IBM model can still be improved by acquiring new experimental data on the most uncertain processes (e.g. survival or feeding), it can already serve to predict the impact of compounds at the population level.

  11. An Individual-Based Model of Zebrafish Population Dynamics Accounting for Energy Dynamics

    PubMed Central

    Beaudouin, Rémy; Goussen, Benoit; Piccini, Benjamin; Augustine, Starrlight; Devillers, James; Brion, François; Péry, Alexandre R. R.

    2015-01-01

    Developing population dynamics models for zebrafish is crucial in order to extrapolate from toxicity data measured at the organism level to biological levels relevant to support and enhance ecological risk assessment. To achieve this, a dynamic energy budget for individual zebrafish (DEB model) was coupled to an individual based model of zebrafish population dynamics (IBM model). Next, we fitted the DEB model to new experimental data on zebrafish growth and reproduction thus improving existing models. We further analysed the DEB-model and DEB-IBM using a sensitivity analysis. Finally, the predictions of the DEB-IBM were compared to existing observations on natural zebrafish populations and the predicted population dynamics are realistic. While our zebrafish DEB-IBM model can still be improved by acquiring new experimental data on the most uncertain processes (e.g. survival or feeding), it can already serve to predict the impact of compounds at the population level. PMID:25938409

  12. Can Counter-Gang Models be Applied to Counter ISIS’s Internet Recruitment Campaign

    DTIC Science & Technology

    2016-06-10

    limitation that exists is the lack of reliable statistics from social media companies in regards to the quantity of ISIS-affiliated sites, which exist on... statistics, they have approximately 320 million monthly active users with thirty-five-plus languages supported and 77 percent of accounts located...Justice and Delinquency Prevention program. For deterrence-based models, the primary point of research is focused deterrence models with emphasis placed

  13. Data-based Non-Markovian Model Inference

    NASA Astrophysics Data System (ADS)

    Ghil, Michael

    2015-04-01

    This talk concentrates on obtaining stable and efficient data-based models for simulation and prediction in the geosciences and life sciences. The proposed model derivation relies on using a multivariate time series of partial observations from a large-dimensional system, and the resulting low-order models are compared with the optimal closures predicted by the non-Markovian Mori-Zwanzig formalism of statistical physics. Multilayer stochastic models (MSMs) are introduced as both a very broad generalization and a time-continuous limit of existing multilevel, regression-based approaches to data-based closure, in particular of empirical model reduction (EMR). We show that the multilayer structure of MSMs can provide a natural Markov approximation to the generalized Langevin equation (GLE) of the Mori-Zwanzig formalism. A simple correlation-based stopping criterion for an EMR-MSM model is derived to assess how well it approximates the GLE solution. Sufficient conditions are given for the nonlinear cross-interactions between the constitutive layers of a given MSM to guarantee the existence of a global random attractor. This existence ensures that no blow-up can occur for a very broad class of MSM applications. The EMR-MSM methodology is first applied to a conceptual, nonlinear, stochastic climate model of coupled slow and fast variables, in which only slow variables are observed. The resulting reduced model with energy-conserving nonlinearities captures the main statistical features of the slow variables, even when there is no formal scale separation and the fast variables are quite energetic. Second, an MSM is shown to successfully reproduce the statistics of a partially observed, generalized Lotka-Volterra model of population dynamics in its chaotic regime. The positivity constraint on the solutions' components replaces here the quadratic-energy-preserving constraint of fluid-flow problems and it successfully prevents blow-up. This work is based on a close collaboration with M.D. Chekroun, D. Kondrashov, S. Kravtsov and A.W. Robertson.

  14. Verification of reflectance models in turbid waters

    NASA Technical Reports Server (NTRS)

    Tanis, F. J.; Lyzenga, D. R.

    1981-01-01

    Inherent optical parameters of very turbid waters were used to evaluate existing water reflectance models. Measured upwelling radiance spectra and Monte Carlo simulations of the radiative transfer equations were compared with results from models based upon two-flow, quasi-single scattering, augmented isotropic scattering, and power series approximations. Each model was evaluated for three separate components of upwelling radiance: (1) direct sunlight; (2) diffuse skylight; and (3) internally reflected light. Limitations of existing water reflectance models as applied to turbid waters and possible applications to the extraction of water constituent information are discussed.
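
    For context, the quasi-single-scattering family evaluated here typically relates subsurface irradiance reflectance to the backscattering coefficient b_b and absorption coefficient a; the proportionality factor f below is an assumed, commonly quoted value rather than one taken from this study:

        R(0^-) \;\approx\; f\,\frac{b_b}{a + b_b}, \qquad f \approx 0.33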

  15. A relevance theory of induction.

    PubMed

    Medin, Douglas L; Coley, John D; Storms, Gert; Hayes, Brett K

    2003-09-01

    A framework theory, organized around the principle of relevance, is proposed for category-based reasoning. According to the relevance principle, people assume that premises are informative with respect to conclusions. This idea leads to the prediction that people will use causal scenarios and property reinforcement strategies in inductive reasoning. These predictions are contrasted with both existing models and normative logic. Judgments of argument strength were gathered in three different countries, and the results showed the importance of both causal scenarios and property reinforcement in category-based inferences. The relation between the relevance framework and existing models of category-based inductive reasoning is discussed in the light of these findings.

  16. DIMM-SC: a Dirichlet mixture model for clustering droplet-based single cell transcriptomic data.

    PubMed

    Sun, Zhe; Wang, Ting; Deng, Ke; Wang, Xiao-Feng; Lafyatis, Robert; Ding, Ying; Hu, Ming; Chen, Wei

    2018-01-01

    Single cell transcriptome sequencing (scRNA-Seq) has become a revolutionary tool to study cellular and molecular processes at single cell resolution. Among existing technologies, the recently developed droplet-based platform enables efficient parallel processing of thousands of single cells with direct counting of transcript copies using Unique Molecular Identifiers (UMIs). Despite these technological advances, statistical methods and computational tools are still lacking for analyzing droplet-based scRNA-Seq data. In particular, model-based approaches for clustering large-scale single cell transcriptomic data remain under-explored. We developed DIMM-SC, a Dirichlet Mixture Model for clustering droplet-based Single Cell transcriptomic data. This approach explicitly models UMI count data from scRNA-Seq experiments and characterizes variations across different cell clusters via a Dirichlet mixture prior. We performed comprehensive simulations to evaluate DIMM-SC and compared it with existing clustering methods such as K-means, CellTree and Seurat. In addition, we analyzed public scRNA-Seq datasets with known cluster labels and in-house scRNA-Seq datasets from a study of systemic sclerosis with prior biological knowledge to benchmark and validate DIMM-SC. Both simulation studies and real data applications demonstrated that, overall, DIMM-SC achieves substantially improved clustering accuracy and much lower clustering variability than other existing clustering methods. More importantly, as a model-based approach, DIMM-SC is able to quantify the clustering uncertainty for each single cell, facilitating rigorous statistical inference and biological interpretations, which are typically unavailable from existing clustering methods. DIMM-SC has been implemented in a user-friendly R package with a detailed tutorial available at www.pitt.edu/~wec47/singlecell.html.
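
    A hedged sketch of the core idea, clustering UMI count vectors with a multinomial mixture under Dirichlet smoothing via EM; this is a simplified stand-in for illustration, not the DIMM-SC implementation.

        import numpy as np
        from scipy.special import logsumexp

        def dirichlet_multinomial_em(X, K, n_iter=50, alpha=1.0, seed=0):
            """Soft-cluster cells (rows of UMI count matrix X) into K clusters."""
            rng = np.random.default_rng(seed)
            n, g = X.shape
            resp = rng.dirichlet(np.ones(K), size=n)           # responsibilities
            for _ in range(n_iter):
                # M-step: Dirichlet(alpha)-smoothed gene proportions per cluster
                counts = resp.T @ X + alpha
                log_p = np.log(counts / counts.sum(axis=1, keepdims=True))
                log_pi = np.log(resp.mean(axis=0))
                # E-step: multinomial log-likelihood of each cell under each cluster
                ll = X @ log_p.T + log_pi
                resp = np.exp(ll - logsumexp(ll, axis=1, keepdims=True))
            return resp.argmax(axis=1)

        X = np.random.default_rng(1).poisson(2.0, size=(200, 30))   # toy UMI counts
        print(dirichlet_multinomial_em(X, K=3)[:10])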

  17. The substantive knowledge base for travel and tourism: a systems model

    Treesearch

    David S. Solan

    1992-01-01

    Strategies for education and professional preparation in travel and tourism have generally been based in traditional tourism-related disciplines providing somewhat narrow perspectives of the tourism phenomenon. The need exists for models that provide comprehensive, holistic perspectives of travel and tourism. This paper presents one such systems model showing that...

  18. Modelling the monetary value of a QALY: a new approach based on UK data.

    PubMed

    Mason, Helen; Jones-Lee, Michael; Donaldson, Cam

    2009-08-01

    Debate about the monetary value of a quality-adjusted life year (QALY) has existed in the health economics literature for some time. More recently, concern about such a value has arisen in UK health policy. This paper reports on an attempt to 'model' a willingness-to-pay-based value of a QALY from the existing value of preventing a statistical fatality (VPF) currently used in UK public sector decision making. Two methods of deriving the value of a QALY from the existing UK VPF are outlined: one conventional and one new. The advantages and disadvantages of each of the approaches are discussed as well as the implications of the results for policy and health economic evaluation methodology.
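
    The conventional route can be illustrated with hedged arithmetic: divide the VPF by the expected discounted QALYs lost when a statistical fatality occurs. All numbers below are illustrative placeholders, not estimates from the paper.

        # Hypothetical inputs (illustrative only)
        vpf = 1_600_000          # value of preventing a statistical fatality, GBP
        years_lost = 40          # expected remaining life years of a fatality victim
        quality = 0.8            # average quality weight of those years
        discount = 0.035         # annual discount rate

        discounted_qalys = sum(quality / (1 + discount) ** t for t in range(years_lost))
        print(f"Implied value per QALY: GBP {vpf / discounted_qalys:,.0f}")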

  19. An investigation of modelling and design for software service applications.

    PubMed

    Anjum, Maria; Budgen, David

    2017-01-01

    Software services offer the opportunity to use a component-based approach for the design of applications. However, this needs a deeper understanding of how to develop service-based applications in a systematic manner, and of the set of properties that need to be included in the 'design model'. We have used a realistic application to explore systematically how service-based designs can be created and described. We first identified the key properties of an SOA (service oriented architecture) and then undertook a single-case case study to explore its use in the development of a design for a large-scale application in energy engineering, modelling this with existing notations wherever possible. We evaluated the resulting design model using two walkthroughs with both domain and application experts. We were able to successfully develop a design model around the ten properties identified, and to describe it by adapting existing design notations. A component-based approach to designing such systems does appear to be feasible. However, it needs the assistance of a more integrated set of notations for describing the resulting design model.

  20. Prediction of brittleness based on anisotropic rock physics model for kerogen-rich shale

    NASA Astrophysics Data System (ADS)

    Qian, Ke-Ran; He, Zhi-Liang; Chen, Ye-Quan; Liu, Xi-Wu; Li, Xiang-Yang

    2017-12-01

    The construction of a shale rock physics model and the selection of an appropriate brittleness index (BI) are two significant steps that can influence the accuracy of brittleness prediction. On one hand, the existing models of kerogen-rich shale are controversial, so a reasonable rock physics model needs to be built. On the other hand, several types of equations already exist for predicting the BI, and their feasibility needs to be carefully considered. This study constructed a kerogen-rich rock physics model by applying the self-consistent approximation and the differential effective medium theory to model intercoupled clay and kerogen mixtures. The feasibility of our model was confirmed by comparison with classical models, showing better accuracy. Templates were constructed based on our model to link physical properties and the BI. Different equations for the BI had different sensitivities, making them suitable for different types of formations. Equations based on Young's modulus were sensitive to variations in lithology, while those using Lamé coefficients were sensitive to porosity and pore fluids. Physical information must be considered to improve brittleness prediction.
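
    One widely used elastic-moduli form of the BI, shown here as a hedged example of the equation family the study compares (a Rickman-style min-max normalization; not necessarily the exact variant used), averages normalized Young's modulus E and Poisson's ratio \nu:

        BI = \frac{1}{2}\left(\frac{E - E_{\min}}{E_{\max} - E_{\min}} + \frac{\nu_{\max} - \nu}{\nu_{\max} - \nu_{\min}}\right)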

  1. A survey of Existing V&V, UQ and M&S Data and Knowledge Bases in Support of the Nuclear Energy - Knowledge base for Advanced Modeling and Simulation (NE-KAMS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hyung Lee; Rich Johnson, Ph.D.; Kimberlyn C. Moussesau

    2011-12-01

    The Nuclear Energy - Knowledge base for Advanced Modeling and Simulation (NE-KAMS) is being developed at the Idaho National Laboratory in conjunction with Bettis Laboratory, Sandia National Laboratories, Argonne National Laboratory, Oak Ridge National Laboratory, Utah State University and others. The objective of this consortium is to establish a comprehensive knowledge base to provide Verification and Validation (V&V) and Uncertainty Quantification (UQ) and other resources for advanced modeling and simulation (M&S) in nuclear reactor design and analysis. NE-KAMS will become a valuable resource for the nuclear industry, the national laboratories, the U.S. NRC and the public to help ensure themore » safe operation of existing and future nuclear reactors. A survey and evaluation of the state-of-the-art of existing V&V and M&S databases, including the Department of Energy and commercial databases, has been performed to ensure that the NE-KAMS effort will not be duplicating existing resources and capabilities and to assess the scope of the effort required to develop and implement NE-KAMS. The survey and evaluation have indeed highlighted the unique set of value-added functionality and services that NE-KAMS will provide to its users. Additionally, the survey has helped develop a better understanding of the architecture and functionality of these data and knowledge bases that can be used to leverage the development of NE-KAMS.« less

  2. Base Case v.5.15 Documentation Supplement to Support the Clean Power Plan

    EPA Pesticide Factsheets

    Learn about several modeling assumptions used as part of EPA's analysis of the Clean Power Plan (Carbon Pollution Guidelines for Existing Electric Generating Units) using the EPA v.5.15 Base Case using Integrated Planning Model (IPM).

  3. The flow of power law fluids in elastic networks and porous media.

    PubMed

    Sochi, Taha

    2016-02-01

    The flow of power-law fluids, which include shear-thinning and shear-thickening fluids as well as Newtonian fluids as a special case, in networks of interconnected elastic tubes is investigated using a residual-based pore-scale network modeling method with the employment of newly derived formulae. Two relations describing the mechanical interaction between the local pressure and local cross-sectional area in distensible tubes of elastic nature are considered in the derivation of these formulae. The model can be used to describe shear-dependent flows of mainly viscous nature. The behavior of the proposed model is vindicated by several tests in a number of special and limiting cases where the results can be verified quantitatively or qualitatively. The model, which is the first of its kind, incorporates more than one major nonlinearity corresponding to the fluid rheology and conduit mechanical properties, that is, non-Newtonian effects and tube distensibility. The formulation, implementation, and performance indicate that the model enjoys certain advantages over the existing models, such as being exact within the restricting assumptions on which the model is based, easy implementation, low computational costs, reliability, and smooth convergence. The proposed model can, therefore, be used as an alternative to the existing Newtonian distensible models; moreover, it stretches the capabilities of the existing modeling approaches to reach non-Newtonian rheologies.
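
    For context, the standard volumetric flow of a power-law fluid (consistency k, behavior index n) through a rigid cylindrical tube, which residual-based network models typically evaluate link by link, is sketched below; the paper's elastic pressure-area coupling is omitted here.

        import math

        def power_law_tube_flow(dP, R, L, k, n):
            """Volumetric flow rate of a power-law fluid in a rigid tube
            (reduces to Hagen-Poiseuille when n == 1 and k equals the viscosity)."""
            return (math.pi * n / (3 * n + 1)) * R**3 * (dP * R / (2 * k * L)) ** (1 / n)

        # Shear-thinning example with placeholder values
        print(power_law_tube_flow(dP=100.0, R=1e-3, L=0.1, k=0.05, n=0.7))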

  4. Orthogonal model and experimental data for analyzing wood-fiber-based tri-axial ribbed structural panels in bending

    Treesearch

    Jinghao Li; John F. Hunt; Shaoqin Gong; Zhiyong Cai

    2017-01-01

    This paper presents an analysis of 3-dimensional engineered structural panels (3DESP) made from wood-fiber-based laminated paper composites. Since the existing models for calculating the mechanical behavior of core configurations within sandwich panels are very complex, a new simplified orthogonal model (SOM) using an equivalent element has been developed. This model...

  5. Surface-Charge-Based Micro-Models--A Solid Foundation for Learning about Direct Current Circuits

    ERIC Educational Resources Information Center

    Hirvonen, P. E.

    2007-01-01

    This study explores how the use of a surface-charge-based instructional approach affects introductory university level students' understanding of direct current (dc) circuits. The introduced teaching intervention includes electrostatics, surface-charge-based micro-models that explain the existence of an electric field inside the current-carrying…

  6. Reproducing (Dis)advantage: The Role of Family-Based, School-Based, and Cumulative-Based Processes

    ERIC Educational Resources Information Center

    Conner, Sonya

    2012-01-01

    Pierre Bourdieu's theory of cultural and social reproduction (Bourdieu 1973; Bourdieu and Passeron 1977) offers a model that can be used to explain the existence of persistent educational stratification in the United States, which contributes to perpetuation of social inequality, more generally. This theoretical model purports three…

  7. Model-based software design

    NASA Technical Reports Server (NTRS)

    Iscoe, Neil; Liu, Zheng-Yang; Feng, Guohui; Yenne, Britt; Vansickle, Larry; Ballantyne, Michael

    1992-01-01

    Domain-specific knowledge is required to create specifications, generate code, and understand existing systems. Our approach to automating software design is based on instantiating an application domain model with industry-specific knowledge and then using that model to achieve the operational goals of specification elicitation and verification, reverse engineering, and code generation. Although many different specification models can be created from any particular domain model, each specification model is consistent and correct with respect to the domain model.

  8. 3D Modeling of Lacus Mortis Pit Crater with Presumed Interior Tube Structure

    NASA Astrophysics Data System (ADS)

    Hong, Ik-Seon; Yi, Yu; Yu, Jaehyung; Haruyama, Junichi

    2015-06-01

    When humans explore the Moon, lunar caves will be an ideal base, providing shelter from the hazards of radiation, meteorite impacts, and extreme diurnal temperature differences. In order to ascertain the existence of caves on the Moon, it is best to visit the Moon in person. The Google Lunar X Prize (GLXP) competition started recently to attempt lunar exploration missions. One of the competing groups plans to land on a pit of Lacus Mortis and determine the existence of a cave inside this pit. A ramp runs from the entrance down to the inside of this pit, which enables a rover to approach its inner region. In this study, under the assumption that a cave exists in this pit, a 3D model was developed based on the optical image data. Since this model simulates the actual terrain, the rendering of the model agrees well with the image data. Furthermore, 3D printing of this model will enable more rigorous investigations and could also be used to publicize lunar exploration missions with ease.

  9. Determination of Network Attributes from a High Resolution Terrain Data Base

    DTIC Science & Technology

    1987-09-01

    and existing models is in the method used to make decisions. All of the models reviewed when developing the ALARM strategy depended either on threshold...problems with the methods currently accepted and used to model the decision process. These methods are recognized because they have their uses...observation, detection, and lines of sight along a narrow strip of terrain relative to the overall size of the sectors of the two forces. Existing methods of

  10. Study of solid state photomultiplier

    NASA Technical Reports Server (NTRS)

    Hays, K. M.; Laviolette, R. A.

    1987-01-01

    Available solid state photomultiplier (SSPM) detectors were tested under low-background, low-temperature conditions to determine the conditions producing optimal sensitivity in a space-based astronomy system such as a liquid-helium-cooled telescope in orbit. Detector temperatures varied between 6 and 9 K, with background flux ranging from 10^13 to less than 10^6 photons/cm^2-s. Measured parameters included quantum efficiency, noise, dark current, and spectral response. Experimental data were reduced, analyzed, and combined with existing data to build the SSPM data base included herein. The results were compared to analytical models of SSPM performance where appropriate models existed. Analytical models presented here were developed to be as consistent with the data base as practicable. Significant differences between the theory and data are described. Some models were developed or updated as a result of this study.

  11. Characterization of Orbital Debris Via Hyper-Velocity Ground-Based Tests

    NASA Technical Reports Server (NTRS)

    Cowardin, Heather

    2015-01-01

    The goal is to replicate a hyper-velocity fragmentation event using modern-day spacecraft materials and construction techniques to improve the existing DoD and NASA breakup models. DebriSat is intended to be representative of modern LEO satellites. Major design decisions were reviewed and approved by Aerospace subject matter experts from different disciplines. DebriSat includes seven major subsystems: attitude determination and control system (ADCS), command and data handling (C&DH), electrical power system (EPS), payload, propulsion, telemetry tracking and command (TT&C), and thermal management. To reduce cost, most components are emulated based on existing designs of flight hardware and fabricated with the same materials. A key laboratory-based test, the Satellite Orbital debris Characterization Impact Test (SOCIT), supporting the development of the DoD and NASA satellite breakup models was conducted at AEDC in 1992. Breakup models based on SOCIT have supported many applications and matched on-orbit events reasonably well over the years.

  12. Efficacy of a surfactant-based wound dressing on biofilm control.

    PubMed

    Percival, Steven L; Mayer, Dieter; Salisbury, Anne-Marie

    2017-09-01

    The aim of this study was to evaluate the efficacy of both a nonantimicrobial and an antimicrobial (1% silver sulfadiazine, SSD) surfactant-based wound dressing in the control of Pseudomonas aeruginosa, Enterococcus sp, Staphylococcus epidermidis, Staphylococcus aureus, and methicillin-resistant S. aureus (MRSA) biofilms. Anti-biofilm efficacy was evaluated in several adapted American Society for Testing and Materials (ASTM) standard biofilm models and other bespoke biofilm models. The ASTM standard models employed included the minimum biofilm eradication concentration (MBEC) biofilm model (ASTM E2799) and the Centers for Disease Control (CDC) biofilm reactor model (ASTM 2871). The bespoke biofilm models included the filter biofilm model and the chamberslide biofilm model. Results showed complete kill of microorganisms within a biofilm using the antimicrobial surfactant-based wound dressing. Interestingly, the nonantimicrobial surfactant-based dressing could disrupt existing biofilms by causing biofilm detachment. Prior to biofilm detachment, we demonstrated, using confocal laser scanning microscopy (CLSM), the dispersive effect of the nonantimicrobial surfactant-based wound dressing on the biofilm within 10 minutes of treatment. Furthermore, the nonantimicrobial surfactant-based wound dressing caused an increase in microbial flocculation/aggregation, important for microbial concentration. In conclusion, this nonantimicrobial surfactant-based wound dressing leads to the effective detachment and dispersion of in vitro biofilms. The use of surfactant-based wound dressings in a clinical setting may help to disrupt existing biofilm from wound tissue and may increase the action of antimicrobial treatment.

  13. Enhancing emotional-based target prediction

    NASA Astrophysics Data System (ADS)

    Gosnell, Michael; Woodley, Robert

    2008-04-01

    This work extends existing agent-based target movement prediction to include the key ideas of behavioral inertia, steady states, and catastrophic change from existing psychological, sociological, and mathematical work. Existing target prediction work inherently assumes a single steady state for target behavior and attempts to classify behavior based on a single emotional state set. The enhanced, emotional-based target prediction maintains up to three distinct steady states, or typical behaviors, based on a target's operating conditions and observed behaviors. Each steady state has an associated behavioral inertia, similar to the standard deviation of behaviors within that state. The enhanced prediction framework also allows steady-state transitions through catastrophic change, and individual steady states could be used in an offline analysis with additional modeling efforts to better predict anticipated target reactions.

  14. A cost-performance model for ground-based optical communications receiving telescopes

    NASA Technical Reports Server (NTRS)

    Lesh, J. R.; Robinson, D. L.

    1986-01-01

    An analytical cost-performance model for a ground-based optical communications receiving telescope is presented. The model considers costs of existing telescopes as a function of diameter and field of view. This, coupled with communication performance as a function of receiver diameter and field of view, yields the appropriate telescope cost versus communication performance curve.
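
    A hedged sketch of the kind of parametric curve such a cost model trades against link performance; the coefficient and exponents below are illustrative assumptions, not the paper's fitted values.

        def telescope_cost(diameter_m, fov_deg, a=1.0e6, b=2.5, c=0.5):
            """Illustrative power-law cost model: cost grows steeply with aperture
            and more mildly with field of view (all parameters hypothetical)."""
            return a * diameter_m**b * fov_deg**c

        for d in (1.0, 2.0, 4.0):
            print(d, f"${telescope_cost(d, fov_deg=0.1):,.0f}")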

  15. Evaluation of Proteus as a Tool for the Rapid Development of Models of Hydrologic Systems

    NASA Astrophysics Data System (ADS)

    Weigand, T. M.; Farthing, M. W.; Kees, C. E.; Miller, C. T.

    2013-12-01

    Models of modern hydrologic systems can be complex and involve a variety of operators with varying character. The goal is to implement approximations of such models that are both efficient for the developer and computationally efficient, which is a set of naturally competing objectives. Proteus is a Python-based toolbox that supports prototyping of model formulations as well as a wide variety of modern numerical methods and parallel computing. We used Proteus to develop numerical approximations for three models: Richards' equation, a brine flow model derived using the Thermodynamically Constrained Averaging Theory (TCAT), and a multiphase TCAT-based tumor growth model. For Richards' equation, we investigated discontinuous Galerkin solutions with higher-order time integration based on the backward differentiation formulas. The TCAT brine flow model was implemented using Proteus and a variety of numerical methods were compared to hand-coded solutions. Finally, an existing tumor growth model was implemented in Proteus to introduce more advanced numerics and allow the code to be run in parallel. From these three example models, Proteus was found to be an attractive open-source option for rapidly developing high-quality code for solving existing and evolving computational science models.
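
    For reference, the mixed form of Richards' equation that the first test case targets, written with an assumed standard notation (pressure head \psi, water content \theta, hydraulic conductivity K, elevation z):

        \frac{\partial \theta(\psi)}{\partial t} = \nabla \cdot \left[ K(\psi)\, \nabla(\psi + z) \right]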

  16. Enhanced Vapor-Phase Diffusion in Porous Media - LDRD Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ho, C.K.; Webb, S.W.

    1999-01-01

    As part of the Laboratory-Directed Research and Development (LDRD) Program at Sandia National Laboratories, an investigation into the existence of enhanced vapor-phase diffusion (EVD) in porous media has been conducted. A thorough literature review was initially performed across multiple disciplines (soil science and engineering), and based on this review, the existence of EVD was found to be questionable. As a result, modeling and experiments were initiated to investigate the existence of EVD. In this LDRD, the first mechanistic model of EVD was developed which demonstrated the mechanisms responsible for EVD. The first direct measurements of EVD have also been conducted at multiple scales. Measurements have been made at the pore scale, in a two-dimensional network as represented by a fracture aperture, and in a porous medium. Significant enhancement of vapor-phase transport relative to Fickian diffusion was measured in all cases. The modeling and experimental results provide additional mechanisms for EVD beyond those presented by the generally accepted model of Philip and deVries (1957), which required a thermal gradient for EVD to exist. Modeling and experimental results show significant enhancement under isothermal conditions. Application of EVD to vapor transport in the near-surface vadose zone shows a significant variation between no enhancement, the model of Philip and deVries, and the present results. Based on this information, the model of Philip and deVries may need to be modified, and additional studies are recommended.

  17. A Practical, Robust Methodology for Acquiring New Observation Data Using Computationally Expensive Groundwater Models

    NASA Astrophysics Data System (ADS)

    Siade, Adam J.; Hall, Joel; Karelse, Robert N.

    2017-11-01

    Regional groundwater flow models play an important role in decision making regarding water resources; however, the uncertainty embedded in model parameters and model assumptions can significantly hinder the reliability of model predictions. One way to reduce this uncertainty is to collect new observation data from the field. However, determining where and when to obtain such data is not straightforward. There exist a number of data-worth and experimental design strategies developed for this purpose. However, these studies often ignore issues related to real-world groundwater models such as computational expense, existing observation data, high parameter dimensionality, etc. In this study, we propose a methodology, based on existing methods and software, to efficiently conduct such analyses for large-scale, complex regional groundwater flow systems for which there is a wealth of available observation data. The method utilizes the well-established d-optimality criterion, and the minimax criterion for robust sampling strategies. The so-called Null-Space Monte Carlo method is used to reduce the computational burden associated with uncertainty quantification. Finally, a heuristic methodology based on the concept of the greedy algorithm is proposed for developing robust designs with subsets of the posterior parameter samples. The proposed methodology is tested on a synthetic regional groundwater model, and subsequently applied to an existing, complex, regional groundwater system in the Perth region of Western Australia. The results indicate that robust designs can be obtained efficiently, within reasonable computational resources, for making regional decisions regarding groundwater level sampling.
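
    A hedged sketch of the greedy d-optimality idea: repeatedly add the candidate observation whose row of a sensitivity matrix most increases the log-determinant of the information matrix. Everything here is illustrative; the actual methodology couples null-space Monte Carlo samples with a minimax robustness criterion.

        import numpy as np

        def greedy_d_optimal(J, n_pick, noise_var=1.0):
            """Greedily select rows of sensitivity matrix J (obs x params)
            maximizing the log-determinant of the Fisher information matrix."""
            n_obs, n_par = J.shape
            chosen, info = [], np.eye(n_par) * 1e-9   # tiny prior keeps det finite
            for _ in range(n_pick):
                cand = [i for i in range(n_obs) if i not in chosen]
                gains = [np.linalg.slogdet(info + np.outer(J[i], J[i]) / noise_var)[1]
                         for i in cand]
                best = cand[int(np.argmax(gains))]
                chosen.append(best)
                info += np.outer(J[best], J[best]) / noise_var
            return chosen

        J = np.random.default_rng(0).normal(size=(100, 5))   # toy sensitivities
        print(greedy_d_optimal(J, n_pick=3))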

  18. Development of the information model for consumer assessment of key quality indicators by goods labelling

    NASA Astrophysics Data System (ADS)

    Koshkina, S.; Ostrinskaya, L.

    2018-04-01

    An information model for “key” quality indicators of goods has been developed. The model is based on an assessment of the existing state of standardization and of product labelling quality. In the authors’ opinion, the proposed “key” indicators are the most significant for purchasing decision making. Customers will be able to use this model through their mobile devices. The developed model makes it possible to decompose existing processes into data flows and to reveal the levels of possible architectural solutions. In-depth analysis of the presented information model’s decomposition levels will make it possible to determine the stages of its improvement and to reveal additional indicators of goods quality that are of interest to customers in further research. Examining the architectural solutions for the functioning of the customer’s information environment when integrating existing databases will allow the boundaries of the model’s flexibility and customizability to be determined.

  19. An investigation of modelling and design for software service applications

    PubMed Central

    2017-01-01

    Software services offer the opportunity to use a component-based approach for the design of applications. However, this needs a deeper understanding of how to develop service-based applications in a systematic manner, and of the set of properties that need to be included in the ‘design model’. We have used a realistic application to explore systematically how service-based designs can be created and described. We first identified the key properties of an SOA (service oriented architecture) and then undertook a single-case case study to explore its use in the development of a design for a large-scale application in energy engineering, modelling this with existing notations wherever possible. We evaluated the resulting design model using two walkthroughs with both domain and application experts. We were able to successfully develop a design model around the ten properties identified, and to describe it by adapting existing design notations. A component-based approach to designing such systems does appear to be feasible. However, it needs the assistance of a more integrated set of notations for describing the resulting design model. PMID:28489905

  20. Automotive Maintenance Data Base for Model Years 1976-1979. Part I

    DOT National Transportation Integrated Search

    1980-12-01

    An update of the existing data base was developed to include life cycle maintenance costs of representative vehicles for the model years 1976-1979. Repair costs as a function of time are also developed for a passenger car in each of the compact, subc...

  1. Models for Delivering School-Based Dental Care.

    ERIC Educational Resources Information Center

    Albert, David A.; McManus, Joseph M.; Mitchell, Dennis A.

    2005-01-01

    School-based health centers (SBHCs) often are located in high-need schools and communities. Dental service is frequently an addition to existing comprehensive services, functioning in a variety of models, configurations, and locations. SBHCs are indicated when parents have limited financial resources or inadequate health insurance, limiting…

  2. Testing Strategies for Model-Based Development

    NASA Technical Reports Server (NTRS)

    Heimdahl, Mats P. E.; Whalen, Mike; Rajan, Ajitha; Miller, Steven P.

    2006-01-01

    This report presents an approach for testing artifacts generated in a model-based development process. This approach divides the traditional testing process into two parts: requirements-based testing (validation testing) which determines whether the model implements the high-level requirements and model-based testing (conformance testing) which determines whether the code generated from a model is behaviorally equivalent to the model. The goals of the two processes differ significantly and this report explores suitable testing metrics and automation strategies for each. To support requirements-based testing, we define novel objective requirements coverage metrics similar to existing specification and code coverage metrics. For model-based testing, we briefly describe automation strategies and examine the fault-finding capability of different structural coverage metrics using tests automatically generated from the model.

  3. Stylized facts in social networks: Community-based static modeling

    NASA Astrophysics Data System (ADS)

    Jo, Hang-Hyun; Murase, Yohsuke; Török, János; Kertész, János; Kaski, Kimmo

    2018-06-01

    Past analyses of social network datasets have enabled a number of empirical findings about human society, commonly featured as stylized facts of social networks, such as broad distributions of network quantities, the existence of communities, assortative mixing, and intensity-topology correlations. Since our understanding of the structure of these complex social networks is far from complete, deeper insight into human society requires more comprehensive datasets and modeling of the stylized facts. Although the existing dynamical and static models can generate some stylized facts, here we take an alternative approach by devising a community-based static model with heterogeneous community sizes in which larger communities have smaller link density and weight. With these few assumptions we are able to generate realistic social networks that show most stylized facts for a wide range of parameters, as demonstrated numerically and analytically. Since our community-based static model is simple to implement and easily scalable, it can be used as a reference system, benchmark, or testbed for further applications.
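
    A hedged sketch of the generative idea: draw heterogeneous community sizes, then wire each community with a link density and tie weight that decrease with its size. The functional forms and constants are placeholders, not the paper's parameterization.

        import itertools, random

        random.seed(0)
        # Power-law-ish community sizes (placeholder distribution)
        sizes = [min(100, int(3 / random.random())) for _ in range(200)]

        edges, offset = set(), 0
        for s in sizes:
            members = range(offset, offset + s)
            density = min(1.0, 5.0 / s)           # larger communities are sparser
            weight = 1.0 / s                      # ...and more weakly tied
            for u, v in itertools.combinations(members, 2):
                if random.random() < density:
                    edges.add((u, v, weight))
            offset += s
        print(len(edges), "edges across", offset, "nodes")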

  4. Modern meta-heuristics based on nonlinear physics processes: A review of models and design procedures

    NASA Astrophysics Data System (ADS)

    Salcedo-Sanz, S.

    2016-10-01

    Meta-heuristic algorithms are problem-solving methods which try to find good-enough solutions to very hard optimization problems, at a reasonable computation time, where classical approaches fail or cannot even be applied. Many existing meta-heuristic approaches are nature-inspired techniques, which work by simulating or modeling different natural processes in a computer. Historically, many of the most successful meta-heuristic approaches have had a biological inspiration, such as evolutionary computation or swarm intelligence paradigms, but in the last few years new approaches based on nonlinear physics processes modeling have been proposed and applied with success. Nonlinear physics processes, modeled as optimization algorithms, are able to produce completely new search procedures, with extremely effective exploration capabilities in many cases, which are able to outperform existing optimization approaches. In this paper we review the most important optimization algorithms based on nonlinear physics, how they have been constructed from specific modeling of real phenomena, and also their novelty in terms of comparison with alternative existing algorithms for optimization. We first review important concepts on optimization problems, search spaces and problems' difficulty. Then, the usefulness of heuristic and meta-heuristic approaches for facing hard optimization problems is introduced, and some of the main existing classical versions of these algorithms are reviewed. The mathematical framework of different nonlinear physics processes is then introduced as a preparatory step to reviewing in detail the most important meta-heuristics based on them. A discussion on the novelty of these approaches, their main computational implementation and design issues, and the evaluation of a novel meta-heuristic based on Strange Attractors mutation will be carried out to complete the review of these techniques. We also describe some of the most important application areas, in a broad sense, of meta-heuristics, and describe freely accessible software frameworks which can be used to ease the implementation of these algorithms.
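
    As a minimal example of the family, a simulated-annealing-style search with a Metropolis acceptance rule modeled on thermal physics (the classic entry point to physics-based meta-heuristics); the schedule and objective below are arbitrary choices for illustration.

        import math, random

        def anneal(f, x0, steps=20000, t0=1.0, cooling=0.9995, step=0.5):
            """Minimize f with a Metropolis acceptance rule and geometric cooling."""
            x, fx, t = x0, f(x0), t0
            best, fbest = x, fx
            for _ in range(steps):
                y = x + random.uniform(-step, step)     # local perturbation
                fy = f(y)
                # accept downhill moves always, uphill moves with Boltzmann probability
                if fy < fx or random.random() < math.exp((fx - fy) / t):
                    x, fx = y, fy
                    if fx < fbest:
                        best, fbest = x, fx
                t *= cooling                            # cool the "temperature"
            return best, fbest

        print(anneal(lambda x: x**4 - 3 * x**2 + x, x0=2.0))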

  5. Developing a Physiologically-Based Pharmacokinetic Model Knowledgebase in Support of Provisional Model Construction.

    PubMed

    Lu, Jingtao; Goldsmith, Michael-Rock; Grulke, Christopher M; Chang, Daniel T; Brooks, Raina D; Leonard, Jeremy A; Phillips, Martin B; Hypes, Ethan D; Fair, Matthew J; Tornero-Velez, Rogelio; Johnson, Jeffre; Dary, Curtis C; Tan, Yu-Mei

    2016-02-01

    Developing physiologically-based pharmacokinetic (PBPK) models for chemicals can be resource-intensive, as neither chemical-specific parameters nor in vivo pharmacokinetic data are easily available for model construction. Previously developed, well-parameterized, and thoroughly-vetted models can be a great resource for the construction of models pertaining to new chemicals. A PBPK knowledgebase was compiled and developed from existing PBPK-related articles and used to develop new models. From 2,039 PBPK-related articles published between 1977 and 2013, 307 unique chemicals were identified for use as the basis of our knowledgebase. Keywords related to species, gender, developmental stages, and organs were analyzed from the articles within the PBPK knowledgebase. A correlation matrix of the 307 chemicals in the PBPK knowledgebase was calculated based on pharmacokinetic-relevant molecular descriptors. Chemicals in the PBPK knowledgebase were ranked based on their correlation toward ethylbenzene and gefitinib. Next, multiple chemicals were selected to represent exact matches, close analogues, or non-analogues of the target case study chemicals. Parameters, equations, or experimental data relevant to existing models for these chemicals and their analogues were used to construct new models, and model predictions were compared to observed values. This compiled knowledgebase provides a chemical structure-based approach for identifying PBPK models relevant to other chemical entities. Using suitable correlation metrics, we demonstrated that models of chemical analogues in the PBPK knowledgebase can guide the construction of PBPK models for other chemicals.
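
    A hedged sketch of the analogue-ranking step described above: represent each chemical by pharmacokinetic-relevant descriptors and rank knowledgebase chemicals by similarity to the target. Descriptor names and values are hypothetical, and cosine similarity of standardized vectors stands in for the paper's correlation metric.

        import numpy as np

        # Hypothetical descriptors: rows = chemicals, cols = [logP, MW, fraction unbound]
        names = ["chem_A", "chem_B", "chem_C"]
        D = np.array([[2.1, 150.0, 0.30],
                      [3.0, 106.2, 0.10],
                      [0.5, 300.0, 0.80],
                      [3.1, 110.0, 0.12]])   # last row is the target chemical

        z = (D - D.mean(axis=0)) / D.std(axis=0)       # standardize each descriptor
        target, library = z[-1], z[:-1]
        sim = library @ target / (np.linalg.norm(library, axis=1) * np.linalg.norm(target))
        for n, s in sorted(zip(names, sim), key=lambda p: -p[1]):
            print(n, round(float(s), 3))               # most analogous chemical first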

  6. Developing a Physiologically-Based Pharmacokinetic Model Knowledgebase in Support of Provisional Model Construction

    PubMed Central

    Grulke, Christopher M.; Chang, Daniel T.; Brooks, Raina D.; Leonard, Jeremy A.; Phillips, Martin B.; Hypes, Ethan D.; Fair, Matthew J.; Tornero-Velez, Rogelio; Johnson, Jeffre; Dary, Curtis C.; Tan, Yu-Mei

    2016-01-01

    Developing physiologically-based pharmacokinetic (PBPK) models for chemicals can be resource-intensive, as neither chemical-specific parameters nor in vivo pharmacokinetic data are easily available for model construction. Previously developed, well-parameterized, and thoroughly-vetted models can be a great resource for the construction of models pertaining to new chemicals. A PBPK knowledgebase was compiled and developed from existing PBPK-related articles and used to develop new models. From 2,039 PBPK-related articles published between 1977 and 2013, 307 unique chemicals were identified for use as the basis of our knowledgebase. Keywords related to species, gender, developmental stages, and organs were analyzed from the articles within the PBPK knowledgebase. A correlation matrix of the 307 chemicals in the PBPK knowledgebase was calculated based on pharmacokinetic-relevant molecular descriptors. Chemicals in the PBPK knowledgebase were ranked based on their correlation toward ethylbenzene and gefitinib. Next, multiple chemicals were selected to represent exact matches, close analogues, or non-analogues of the target case study chemicals. Parameters, equations, or experimental data relevant to existing models for these chemicals and their analogues were used to construct new models, and model predictions were compared to observed values. This compiled knowledgebase provides a chemical structure-based approach for identifying PBPK models relevant to other chemical entities. Using suitable correlation metrics, we demonstrated that models of chemical analogues in the PBPK knowledgebase can guide the construction of PBPK models for other chemicals. PMID:26871706

  7. Multicriteria decision model for retrofitting existing buildings

    NASA Astrophysics Data System (ADS)

    Bostenaru Dan, B.

    2003-04-01

    In this paper a model for deciding which buildings in an urban area should be retrofitted is presented. The model has been positioned among existing ones by choosing the decision rule, criterion weighting, and decision support system types most suitable for the spatial problem of reducing earthquake risk in urban areas, considering existing spatial multiattributive and multiobjective decision methods and especially collaborative issues. Due to the participative character of the group decision problem "retrofitting existing buildings", the decision-making model is based on interactivity. Buildings have been modeled following the criteria of spatial decision support systems. This includes identifying the corresponding spatial elements of buildings according to the information needs of actors from different spheres, such as architects, construction engineers and economists. The decision model aims to facilitate collaboration between these actors. The way of setting priorities interactively will be shown by detailing the two phases, judgemental and computational: in this case site analysis, collection and evaluation of the unmodified data, and converting survey data to information with computational methods using additional expert support. Buildings have been divided into spatial elements which are characteristic for the survey, present typical damage in case of an earthquake, and are decisive for better seismic behaviour in case of retrofitting. The paper describes the architectural and engineering characteristics as well as the structural damage for constructions of different building ages, using the example of building types in Bucharest, Romania, in comprehensible and interdependent charts, based on field observation, reports from the 1977 earthquake and detailed studies made by the author together with a local engineer for the EERI Web Housing Encyclopedia. On this basis, criteria for setting priorities flow into the expert information contained in the system.
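
    A hedged sketch of the computational phase: a simple weighted-sum decision rule over building-level criteria. The criteria, weights, and scores are invented for illustration; in the paper's approach the interactive group process would set them.

        # Hypothetical criteria weights agreed by the actor groups
        weights = {"structural_vulnerability": 0.5, "occupancy": 0.3, "retrofit_cost": -0.2}

        buildings = {
            "block_12": {"structural_vulnerability": 0.9, "occupancy": 0.7, "retrofit_cost": 0.4},
            "block_07": {"structural_vulnerability": 0.5, "occupancy": 0.9, "retrofit_cost": 0.8},
        }

        def priority(scores):
            """Weighted sum: higher means retrofit sooner (cost weighted negatively)."""
            return sum(weights[c] * scores[c] for c in weights)

        for name in sorted(buildings, key=lambda b: -priority(buildings[b])):
            print(name, round(priority(buildings[name]), 3))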

  8. Research on Turbofan Engine Model above Idle State Based on NARX Modeling Approach

    NASA Astrophysics Data System (ADS)

    Yu, Bing; Shu, Wenjun

    2017-03-01

    A nonlinear model of a turbofan engine above idle state, based on NARX, is studied. First, data sets for the JT9D engine are obtained via simulation from an existing model. Then, a nonlinear modeling scheme based on NARX is proposed and several models with different parameters are built from these data sets. Finally, simulations are performed to verify the precision and dynamic performance of the models; the results show that the NARX model can reflect the dynamic characteristics of the turbofan engine with high accuracy.
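
    A hedged sketch of the NARX structure: predict the next output from lagged outputs and inputs through a nonlinear regressor. The lag orders, the tanh feature map, and the surrogate plant below are assumptions for illustration, not the paper's configuration.

        import numpy as np

        def narx_features(y, u, ny=2, nu=2):
            """Build regressor rows [y(k-1..k-ny), u(k-1..k-nu)] with target y(k)."""
            start = max(ny, nu)
            X = [np.r_[y[k - ny:k][::-1], u[k - nu:k][::-1]] for k in range(start, len(y))]
            return np.array(X), y[start:]

        rng = np.random.default_rng(0)
        u = rng.uniform(-1, 1, 500)                 # fuel-flow-like excitation signal
        y = np.zeros(500)
        for k in range(2, 500):                     # surrogate plant to identify
            y[k] = 0.6 * y[k-1] - 0.1 * y[k-2] + 0.5 * np.tanh(u[k-1])

        X, t = narx_features(y, u)
        Phi = np.c_[X, np.tanh(X), np.ones(len(X))]  # linear + tanh basis, plus bias
        theta, *_ = np.linalg.lstsq(Phi, t, rcond=None)
        pred = Phi @ theta
        print("one-step-ahead RMSE:", np.sqrt(np.mean((pred - t) ** 2)))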

  9. Human systems dynamics: Toward a computational model

    NASA Astrophysics Data System (ADS)

    Eoyang, Glenda H.

    2012-09-01

    A robust and reliable computational model of complex human systems dynamics could support advancements in theory and practice for social systems at all levels, from intrapersonal experience to global politics and economics. Models of human interactions have evolved from traditional, Newtonian systems assumptions, which served a variety of practical and theoretical needs of the past. Another class of models has been inspired and informed by models and methods from nonlinear dynamics, chaos, and complexity science. None of the existing models, however, is able to represent the open, high dimension, and nonlinear self-organizing dynamics of social systems. An effective model will represent interactions at multiple levels to generate emergent patterns of social and political life of individuals and groups. Existing models and modeling methods are considered and assessed against characteristic pattern-forming processes in observed and experienced phenomena of human systems. A conceptual model, CDE Model, based on the conditions for self-organizing in human systems, is explored as an alternative to existing models and methods. While the new model overcomes the limitations of previous models, it also provides an explanatory base and foundation for prospective analysis to inform real-time meaning making and action taking in response to complex conditions in the real world. An invitation is extended to readers to engage in developing a computational model that incorporates the assumptions, meta-variables, and relationships of this open, high dimension, and nonlinear conceptual model of the complex dynamics of human systems.

  10. Caring for people with dementia in residential aged care: successes with a composite person-centered care model featuring Montessori-based activities.

    PubMed

    Roberts, Gail; Morley, Catherine; Walters, Wendy; Malta, Sue; Doyle, Colleen

    2015-01-01

    Person-centered models of dementia care commonly merge aspects of existing models with additional influences from published and unpublished evidence and existing government policy. This study reports on the development and evaluation of one such composite model of person-centered dementia care, the ABLE model. The model was based on building the capacity and ability of residents living with dementia, using environmental changes, staff education and organizational and community engagement. Montessori principles were also used. The evaluation of the model employed mixed methods. Significant behavior changes were evident among residents of the dementia care Unit after the model was introduced, as were reductions in anti-psychotic and sedative medication. Staff reported increased knowledge about meeting the needs of people with dementia, and experienced organizational culture change that supported the ABLE model of care. Families were very satisfied with the changes.

  11. A Research Roadmap for Computation-Based Human Reliability Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boring, Ronald; Mandelli, Diego; Joe, Jeffrey

    2015-08-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  12. 78 FR 71707 - MAP-21 Comprehensive Truck Size and Weight Limits Study Public Meeting and Outreach Sessions

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-29

    .... The second peer review will be on the extent to which the technical analysis and findings address the... Reports based on their thoroughness in reviewing the existing literature, analysis of existing models and...

  13. Automotive Maintenance Data Base for Model Years 1976-1979. Part II : Appendix E and F

    DOT National Transportation Integrated Search

    1980-12-01

    An update of the existing data base was developed to include life cycle maintenance costs of representative vehicles for the model years 1976-1979. Repair costs as a function of time are also developed for a passenger car in each of the compact, subc...

  14. Designing Corporate Databases to Support Technology Innovation

    ERIC Educational Resources Information Center

    Gultz, Michael Jarett

    2012-01-01

    Based on a review of the existing literature on database design, this study proposed a unified database model to support corporate technology innovation. This study assessed potential support for the model based on the opinions of 200 technology industry executives, including Chief Information Officers, Chief Knowledge Officers and Chief Learning…

  15. Marketing for a Web-Based Master's Degree Program in Light of Marketing Mix Model

    ERIC Educational Resources Information Center

    Pan, Cheng-Chang

    2012-01-01

    The marketing mix model was applied with a focus on Web media to re-strategize a Web-based Master's program at a southern state university in the U.S. The program's existing marketing strategy was examined using the four components of the model: product, price, place, and promotion, in the hope of repackaging the program (product) to prospective students…

  16. Towards a Global Unified Model of Europa's Tenuous Atmosphere

    NASA Astrophysics Data System (ADS)

    Plainaki, Christina; Cassidy, Tim A.; Shematovich, Valery I.; Milillo, Anna; Wurz, Peter; Vorburger, Audrey; Roth, Lorenz; Galli, André; Rubin, Martin; Blöcker, Aljona; Brandt, Pontus C.; Crary, Frank; Dandouras, Iannis; Jia, Xianzhe; Grassi, Davide; Hartogh, Paul; Lucchetti, Alice; McGrath, Melissa; Mangano, Valeria; Mura, Alessandro; Orsini, Stefano; Paranicas, Chris; Radioti, Aikaterini; Retherford, Kurt D.; Saur, Joachim; Teolis, Ben

    2018-02-01

    Despite the numerous modeling efforts of the past, our knowledge of the radiation-induced physical and chemical processes in Europa's tenuous atmosphere and of the exchange of material between the moon's surface and Jupiter's magnetosphere remains limited. In the absence of an adequate number of in situ observations, the existence of a wide variety of models based on different scenarios and considerations has resulted in a fragmentary understanding of the interactions of the magnetospheric ion population with both the moon's icy surface and neutral gas envelope. Models show large discrepancies in the source and loss rates of the different constituents as well as in the determination of the spatial distribution of the atmosphere and its variation with time. The existence of several models based on very different approaches highlights the need for a detailed comparison among them, with the final goal of developing a unified model of Europa's tenuous atmosphere. The availability of such a model to the science community could be of particular interest in view of the planning of future mission observations (e.g., ESA's JUpiter ICy moons Explorer (JUICE) mission and NASA's Europa Clipper mission). We review the existing models of Europa's tenuous atmosphere and discuss each of their derived characteristics of the neutral environment. We also discuss discrepancies among different models and the assumptions of the plasma environment in the vicinity of Europa. A summary of the existing observations of both the neutral and the plasma environments at Europa is also presented. The characteristics of a global unified model of the tenuous atmosphere are then discussed. Finally, we identify needed future experimental work in laboratories and propose some suitable observation strategies for upcoming missions.

  17. Authentication in Virtual Organizations: A Reputation Based PKI Interconnection Model

    NASA Astrophysics Data System (ADS)

    Wazan, Ahmad Samer; Laborde, Romain; Barrere, Francois; Benzekri, Abdelmalek

    The authentication mechanism constitutes a central part of virtual organization work. PKI technology is used to provide authentication in each organization involved in the virtual organization. Different trust models have been proposed to interconnect the different PKIs in order to propagate trust between them. Because the existing trust models have many drawbacks, we propose a new trust model based on the reputation of PKIs.

  18. Study on Capturing Functional Requirements of the New Product Based on Evolution

    NASA Astrophysics Data System (ADS)

    Liu, Fang; Song, Liya; Bai, Zhonghang; Zhang, Peng

    In order to survive in an increasingly competitive global marketplace, it is important for corporations to forecast the evolutionary direction of new products rapidly and effectively. Most products are developed based on the design of existing products, and in product design, capturing functional requirements is a key step. Function continuously evolves, driven by the evolution of needs and technologies, so the functional requirements of a new product can be forecasted from the functions of an existing product. Eight laws of function evolution are put forward in this paper. A process model for capturing the functional requirements of a new product based on function evolution is proposed, and an example illustrates the design process.

  19. Mosquito population dynamics from cellular automata-based simulation

    NASA Astrophysics Data System (ADS)

    Syafarina, Inna; Sadikin, Rifki; Nuraini, Nuning

    2016-02-01

    In this paper we present an innovative model for simulating mosquito-vector population dynamics. The simulation consists of two stages: demography and dispersal dynamics. For the demography simulation, we follow an existing model of the mosquito life cycle. For dispersal, we use a cellular automata-based model in which each individual vector is able to move to other grid cells via a random walk. Our model is also capable of representing an immunity factor for each grid cell. We ran simulations to evaluate the model's correctness and, based on them, conclude that the model behaves correctly. However, the model still needs realistic parameters in order to match real data.
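
    A hedged sketch of the dispersal stage: adults on a grid take random-walk steps each time step, with a per-cell suitability factor standing in for the paper's immunity factor. All grid dimensions, rates, and the torus boundary are invented for illustration.

        import random

        random.seed(1)
        W = H = 20
        # Per-cell suitability (stand-in for the immunity factor; hypothetical range)
        suitability = [[random.uniform(0.95, 1.0) for _ in range(W)] for _ in range(H)]
        mosquitoes = [(random.randrange(H), random.randrange(W)) for _ in range(300)]

        for step in range(50):
            moved = []
            for r, c in mosquitoes:
                dr, dc = random.choice([(-1, 0), (1, 0), (0, -1), (0, 1), (0, 0)])
                r, c = (r + dr) % H, (c + dc) % W      # random walk on a torus
                if random.random() < suitability[r][c]:  # cell-dependent survival
                    moved.append((r, c))
            mosquitoes = moved
        print(len(mosquitoes), "adults remaining after dispersal")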

  20. A framework for modeling scenario-based barrier island storm impacts

    USGS Publications Warehouse

    Mickey, Rangley; Long, Joseph W.; Dalyander, P. Soupy; Plant, Nathaniel G.; Thompson, David M.

    2018-01-01

    Methods for investigating the vulnerability of existing or proposed coastal features to storm impacts often rely on simplified parametric models or one-dimensional process-based modeling studies that focus on changes to a profile across a dune or barrier island. These simple studies tend to neglect the impacts to curvilinear or alongshore varying island planforms, influence of non-uniform nearshore hydrodynamics and sediment transport, irregular morphology of the offshore bathymetry, and impacts from low magnitude wave events (e.g. cold fronts). Presented here is a framework for simulating regionally specific, low and high magnitude scenario-based storm impacts to assess the alongshore variable vulnerabilities of a coastal feature. Storm scenarios based on historic hydrodynamic conditions were derived and simulated using the process-based morphologic evolution model XBeach. Model results show that the scenarios predicted similar patterns of erosion and overwash when compared to observed qualitative morphologic changes from recent storm events that were not included in the dataset used to build the scenarios. The framework model simulations were capable of predicting specific areas of vulnerability in the existing feature and the results illustrate how this storm vulnerability simulation framework could be used as a tool to help inform the decision-making process for scientists, engineers, and stakeholders involved in coastal zone management or restoration projects.

  1. An ontology-based semantic configuration approach to constructing Data as a Service for enterprises

    NASA Astrophysics Data System (ADS)

    Cai, Hongming; Xie, Cheng; Jiang, Lihong; Fang, Lu; Huang, Chenxi

    2016-03-01

    To align business strategies with IT systems, enterprises should rapidly implement new applications based on existing information with complex associations to adapt to the continually changing external business environment. Thus, Data as a Service (DaaS) has become an enabling technology for enterprises through information integration and the configuration of existing distributed enterprise systems and heterogeneous data sources. However, business modelling, system configuration and model alignment face challenges at the design and execution stages. To provide a comprehensive solution that facilitates data-centric application design in highly complex and large-scale situations, a configurable ontology-based service integrated platform (COSIP) is proposed to support business modelling, system configuration and execution management. First, a meta-resource model is constructed and used to describe and encapsulate information resources by way of multi-view business modelling. Then, based on ontologies, three semantic configuration patterns, namely composite resource configuration, business scene configuration and runtime environment configuration, are designed to systematically connect business goals with executable applications. Finally, a software architecture based on model-view-controller (MVC) is provided and used to assemble components for software implementation. The result of the case study demonstrates that the proposed approach provides a flexible method for implementing data-centric applications.

  2. Password-only authenticated three-party key exchange with provable security in the standard model.

    PubMed

    Nam, Junghyun; Choo, Kim-Kwang Raymond; Kim, Junghwan; Kang, Hyun-Kyu; Kim, Jinsoo; Paik, Juryon; Won, Dongho

    2014-01-01

    Protocols for password-only authenticated key exchange (PAKE) in the three-party setting allow two clients registered with the same authentication server to derive a common secret key from their individual passwords shared with the server. Existing three-party PAKE protocols were proven secure under the assumption of the existence of random oracles or in a model that does not consider insider attacks. Therefore, these protocols may turn out to be insecure when the random oracle is instantiated with a particular hash function or an insider attack is mounted against the partner client. The contribution of this paper is to present the first three-party PAKE protocol whose security is proven without any idealized assumptions in a model that captures insider attacks. The proof model we use is a variant of the indistinguishability-based model of Bellare, Pointcheval, and Rogaway (2000), which is one of the most widely accepted models for security analysis of password-based key exchange protocols. We demonstrate that our protocol achieves not only the typical indistinguishability-based security of session keys but also password security against undetectable online dictionary attacks.

  3. Endoscopic skull base training using 3D printed models with pre-existing pathology.

    PubMed

    Narayanan, Vairavan; Narayanan, Prepageran; Rajagopalan, Raman; Karuppiah, Ravindran; Rahman, Zainal Ariff Abdul; Wormald, Peter-John; Van Hasselt, Charles Andrew; Waran, Vicknes

    2015-03-01

    Endoscopic base of skull surgery has been growing in acceptance in the recent past due to improvements in visualisation and micro-instrumentation, as well as the surgical maturing of early endoscopic skull base practitioners. Unfortunately, these demanding procedures have a steep learning curve. A physical simulation that is able to reproduce the complex anatomy of the anterior skull base provides a very useful means of learning the necessary skills in a safe and effective environment. This paper aims to assess the ease of learning endoscopic skull base exposure and drilling techniques using an anatomically accurate physical model with a pre-existing pathology (i.e., basilar invagination) created from actual patient data. Five models of a patient with platybasia and basilar invagination were created from the original MRI and CT imaging data. The models were used as part of a training workshop for ENT surgeons with varying degrees of experience in endoscopic base of skull surgery, from trainees to experienced consultants. The surgeons were given a list of key steps to achieve in exposing and drilling the skull base using the simulation model. They were then asked to rate the level of difficulty of learning these steps using the model. The participants found the models suitable for learning registration, navigation and skull base drilling techniques. All participants also found the deep structures to be accurately represented spatially, as confirmed by the navigation system. These models allow structured simulation to be conducted in a workshop environment where surgeons and trainees can practice performing complex procedures in a controlled fashion under the supervision of experts.

  4. Municipal water-based heat pump heating and/or cooling systems: Findings and recommendations. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bloomquist, R.G.; Wegman, S.

    1998-04-01

    The purpose of the present work was to determine if existing heat pump systems based on municipal water systems meet existing water quality standards; to analyze water that has passed through a heat pump or heat exchanger to determine if corrosion products can be detected; to determine residual chlorine levels in municipal waters on the inlet as well as the outlet side of such installations; to analyze for bacterial contaminants and/or regrowth due to the presence of a heat pump or heat exchanger; to develop and suggest criteria for system design and construction; to provide recommendations and specifications for material and fluid selection; and to develop model rules and regulations for the installation, operation, and monitoring of new and existing systems. In addition, Washington State University (WSU) evaluated the availability of computer models that would allow for water system mapping, water quality modeling, and system operation.

  5. Cost-sensitive AdaBoost algorithm for ordinal regression based on extreme learning machine.

    PubMed

    Riccardi, Annalisa; Fernández-Navarro, Francisco; Carloni, Sante

    2014-10-01

    In this paper, the well-known stagewise additive modeling using a multiclass exponential loss (SAMME) boosting algorithm is extended to address problems where there exists a natural order in the targets, using a cost-sensitive approach. The proposed ensemble model uses an extreme learning machine (ELM) model as a base classifier (with the Gaussian kernel and an additional regularization parameter). The closed form of the derived weighted least squares problem is provided, and it is employed to estimate analytically the parameters connecting the hidden layer to the output layer at each iteration of the boosting algorithm. Compared to state-of-the-art boosting algorithms, in particular those using ELM as the base classifier, the suggested technique does not require the generation of a new training dataset at each iteration. The adoption of the weighted least squares formulation of the problem is presented as an unbiased alternative to the existing ELM boosting techniques. Moreover, the addition of a cost model for weighting the patterns according to the order of the targets further enables the classifier to tackle ordinal regression problems. The proposed method has been validated in an experimental study comparing it with existing ensemble methods and ELM techniques for ordinal regression, showing competitive results.
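
    The following sketch illustrates the overall scheme under stated assumptions: an ELM-style base learner fitted by weighted least squares inside a SAMME boosting loop, with a rank-distance cost matrix scaling the weight updates. The cost form, data, and hyperparameters are invented for illustration and need not match the paper's exact formulation.

    ```python
    # Minimal sketch of a cost-sensitive SAMME loop with an ELM-style base learner.
    import numpy as np

    rng = np.random.default_rng(0)

    def elm_fit(X, Y, w, n_hidden=50, reg=1e-2):
        """Weighted ridge regression from a random-feature space to one-hot targets."""
        W_in = rng.normal(size=(X.shape[1], n_hidden))
        H = np.tanh(X @ W_in)                         # random hidden layer (fixed)
        lhs = H.T @ (H * w[:, None]) + reg * np.eye(n_hidden)
        beta = np.linalg.solve(lhs, H.T @ (Y * w[:, None]))
        return lambda Xn: np.tanh(Xn @ W_in) @ beta

    def samme_fit(X, y, K, T=20):
        Y = np.eye(K)[y]                              # one-hot targets
        cost = np.abs(np.arange(K)[:, None] - np.arange(K)[None, :])  # rank distance
        w = np.ones(len(y)) / len(y)
        learners, alphas = [], []
        for _ in range(T):
            f = elm_fit(X, Y, w)
            pred = f(X).argmax(axis=1)
            miss = pred != y
            err = max(w[miss].sum(), 1e-12)
            if err >= 1 - 1.0 / K:                    # SAMME admissibility condition
                break
            alpha = np.log((1 - err) / err) + np.log(K - 1)
            # cost-sensitive reweighting: larger rank errors grow faster
            w *= np.exp(alpha * miss * (1 + cost[y, pred]))
            w /= w.sum()
            learners.append(f)
            alphas.append(alpha)
        return learners, alphas

    X = rng.normal(size=(150, 4))
    y = np.digitize(X[:, 0], [-0.5, 0.5])             # three ordered classes
    learners, alphas = samme_fit(X, y, K=3)
    print(f"trained {len(learners)} boosted ELM learners")
    ```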

  6. Optimization of monitoring networks based on uncertainty quantification of model predictions of contaminant transport

    NASA Astrophysics Data System (ADS)

    Vesselinov, V. V.; Harp, D.

    2010-12-01

    The process of decision making to protect groundwater resources requires a detailed estimation of uncertainties in model predictions. Various uncertainties associated with modeling a natural system contribute to the uncertainties in the model predictions: (1) measurement and computational errors; (2) uncertainties in the conceptual model and model-parameter estimates; and (3) simplifications in model setup and numerical representation of governing processes. Due to this combination of factors, the sources of predictive uncertainty are generally difficult to quantify individually. Decision support related to optimal design of monitoring networks requires (1) detailed analyses of existing uncertainties related to model predictions of groundwater flow and contaminant transport, and (2) optimization of the proposed monitoring network locations in terms of their efficiency to detect contaminants and provide early warning. We apply existing and newly proposed methods to quantify predictive uncertainties and to optimize well locations. An important aspect of the analysis is the application of a newly developed optimization technique based on coupling the Particle Swarm and Levenberg-Marquardt optimization methods, which proved to be robust and computationally efficient. These techniques and algorithms are bundled in a software package called MADS. MADS (Model Analyses for Decision Support) is an object-oriented code that is capable of performing various types of model analyses and supporting model-based decision making. The code can be executed under different computational modes, which include (1) sensitivity analyses (global and local), (2) Monte Carlo analysis, (3) model calibration, (4) parameter estimation, (5) uncertainty quantification, and (6) model selection. The code can be externally coupled with any existing model simulator through integrated modules that read/write input and output files using a set of template and instruction files (consistent with the PEST I/O protocol). MADS can also be internally coupled with a series of built-in analytical simulators, and it provides functionality to work directly with existing control files developed for the code PEST (Doherty 2009). To perform the computational modes mentioned above, the code utilizes (1) advanced Latin-Hypercube sampling techniques (including Improved Distributed Sampling), (2) various gradient-based Levenberg-Marquardt optimization methods, (3) advanced global optimization methods (including Particle Swarm Optimization), and (4) a selection of alternative objective functions. The code has been successfully applied to perform various model analyses related to environmental management of real contamination sites; examples include source identification problems, quantification of uncertainty, model calibration, and optimization of monitoring networks. The methodology and software are demonstrated using synthetic and real case studies where monitoring networks are optimized taking into account the uncertainty in model predictions of contaminant transport.
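
    A hedged sketch of the coupled global-local idea, a Particle Swarm search whose best candidate is handed to a Levenberg-Marquardt refinement, is shown below; the toy residual function and swarm settings are assumptions, and MADS itself implements this coupling far more completely.

    ```python
    # Global PSO search followed by local Levenberg-Marquardt refinement.
    import numpy as np
    from scipy.optimize import least_squares

    def residuals(p):                      # stand-in for model-vs-data residuals
        return np.array([p[0] - 1.0, p[1] - 2.0, p[0] * p[1] - 2.0])

    rng = np.random.default_rng(1)
    n_particles, dim = 20, 2
    x = rng.uniform(-5, 5, (n_particles, dim))     # particle positions
    v = np.zeros_like(x)                           # particle velocities
    pbest = x.copy()
    pbest_f = np.array([np.sum(residuals(xi) ** 2) for xi in x])
    gbest = pbest[pbest_f.argmin()].copy()

    for _ in range(100):                           # standard PSO update rule
        r1, r2 = rng.random((2, n_particles, dim))
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
        x += v
        f = np.array([np.sum(residuals(xi) ** 2) for xi in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        gbest = pbest[pbest_f.argmin()].copy()

    # local Levenberg-Marquardt refinement of the swarm's best estimate
    fit = least_squares(residuals, gbest, method="lm")
    print("refined parameters:", fit.x)
    ```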

  7. Dietary Exposure Potential Model

    EPA Science Inventory

    Existing food consumption and contaminant residue databases, typically products of nutrition and regulatory monitoring, contain useful information to characterize dietary intake of environmental chemicals. A PC-based model with resident database system, termed the Die...

  8. Proates a computer modelling system for power plant: Its description and application to heatrate improvement within PowerGen

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Green, C.H.; Ready, A.B.; Rea, J.

    1995-06-01

    Versions of the computer program PROATES (PROcess Analysis for Thermal Energy Systems) have been used since 1979 to analyse plant performance improvement proposals relating to existing plant and also to evaluate new plant designs. Several plant modifications have been made to improve performance based on the model predictions, and the predicted performance has been realised in practice. The program was born out of a need to model the overall steady-state performance of complex plant so that proposals to change plant component items or operating strategy could be evaluated. To do this with confidence it is necessary to model the multiple thermodynamic interactions between the plant components. The modelling system is modular in concept, allowing the configuration of individual plant components to represent any particular power plant design. A library exists of physics-based modules which have been extensively validated and which provide representations of a wide range of boiler, turbine and CW system components. Changes to model data and construction are achieved via a user-friendly graphical model editing/analysis front-end, with results being presented via the computer screen or hard copy. The paper describes briefly the modelling system but concentrates mainly on its application to assess design re-optimisation, firing with different fuels and the re-powering of an existing plant.

  9. Predictive local receptive fields based respiratory motion tracking for motion-adaptive radiotherapy.

    PubMed

    Yubo Wang; Tatinati, Sivanagaraja; Liyu Huang; Kim Jeong Hong; Shafiq, Ghufran; Veluvolu, Kalyana C; Khong, Andy W H

    2017-07-01

    Extracranial robotic radiotherapy employs external markers and a correlation model to trace tumor motion caused by respiration. Real-time tracking of tumor motion, however, requires a prediction model to compensate for the latencies induced by the software (image data acquisition and processing) and hardware (mechanical and kinematic) limitations of the treatment system. A new prediction algorithm based on local receptive fields extreme learning machines (pLRF-ELM) is proposed for respiratory motion prediction. Existing respiratory motion prediction methods model the non-stationary respiratory motion traces directly to predict future values. Unlike these methods, pLRF-ELM performs prediction by modeling the higher-level features obtained by mapping the raw respiratory motion into the random feature space of the ELM, instead of modeling the raw respiratory motion directly. The developed method was evaluated using a dataset acquired from 31 patients, for two horizons in line with the latencies of treatment systems such as CyberKnife. Results showed that pLRF-ELM is superior to existing prediction methods, and further highlight that the abstracted higher-level features are suitable for approximating the nonlinear and non-stationary characteristics of respiratory motion for accurate prediction.
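
    The core idea, mapping a window of the raw motion trace into a fixed random (ELM) feature space and fitting only a linear readout, can be sketched as follows; the synthetic trace, window length, horizon, and regularization are assumed values for illustration only.

    ```python
    # Random-feature (ELM-style) regression for time-series prediction.
    import numpy as np

    rng = np.random.default_rng(0)
    t = np.arange(2000) * 0.026                     # sampling period is an assumption
    trace = np.sin(2 * np.pi * 0.25 * t) + 0.05 * rng.normal(size=t.size)

    win, horizon = 30, 4                            # predict 4 samples ahead
    X = np.array([trace[i:i + win] for i in range(len(trace) - win - horizon)])
    y = trace[win + horizon:]

    W = rng.normal(size=(win, 200))                 # random hidden weights, never trained
    H = np.tanh(X @ W)                              # higher-level random features
    beta = np.linalg.solve(H.T @ H + 1e-2 * np.eye(200), H.T @ y)  # linear readout

    pred = np.tanh(X[-1] @ W) @ beta                # prediction for the last window
    print(f"predicted {pred:.3f}, actual {y[-1]:.3f}")
    ```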

  10. Haptics-based dynamic implicit solid modeling.

    PubMed

    Hua, Jing; Qin, Hong

    2004-01-01

    This paper systematically presents a novel, interactive solid modeling framework, Haptics-based Dynamic Implicit Solid Modeling, which is founded upon volumetric implicit functions and powerful physics-based modeling. In particular, we augment our modeling framework with a haptic mechanism in order to take advantage of additional realism associated with a 3D haptic interface. Our dynamic implicit solids are semi-algebraic sets of volumetric implicit functions and are governed by the principles of dynamics, hence responding to sculpting forces in a natural and predictable manner. In order to directly manipulate existing volumetric data sets as well as point clouds, we develop a hierarchical fitting algorithm to reconstruct and represent discrete data sets using our continuous implicit functions, which permit users to further design and edit those existing 3D models in real-time using a large variety of haptic and geometric toolkits, and visualize their interactive deformation at arbitrary resolution. The additional geometric and physical constraints afford more sophisticated control of the dynamic implicit solids. The versatility of our dynamic implicit modeling enables the user to easily modify both the geometry and the topology of modeled objects, while the inherent physical properties can offer an intuitive haptic interface for direct manipulation with force feedback.

  11. Predicted carbonation of existing concrete building based on the Indonesian tropical micro-climate

    NASA Astrophysics Data System (ADS)

    Hilmy, M.; Prabowo, H.

    2018-03-01

    This paper aims to predict carbonation progress based on a previously published mathematical model. It briefly explains the nature of carbonation, including its processes and effects. Environmental humidity and temperature of an existing concrete building are measured and compared to data from the local Meteorological, Climatological, and Geophysical Agency. The data gained are expressed as annual hygrothermal values, which are used as input parameters in the carbonation model. The physical properties of the observed building, such as its location, dimensions, and structural materials, are quantified; these data are then utilized as key input parameters for the carbonation coefficients. The relationship between relative humidity and the rate of carbonation is established. The results can provide a basis for the repair, maintenance, and service-life analysis of existing concrete buildings.
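
    For illustration, the commonly used square-root-of-time carbonation law, x = k*sqrt(t), can be applied as sketched below; the coefficient k and the cover depth are assumed values, whereas the paper derives its coefficient from measured hygrothermal data and material properties.

    ```python
    # Square-root-of-time carbonation depth and a simple service-life estimate.
    import math

    def carbonation_depth(k_mm_per_sqrt_year, years):
        """Carbonation depth in mm after a given exposure time."""
        return k_mm_per_sqrt_year * math.sqrt(years)

    k = 4.0          # mm/yr^0.5, assumed coefficient for a tropical micro-climate
    cover = 30.0     # mm, assumed concrete cover to the reinforcement

    for t in (10, 25, 50):
        print(f"t = {t:3d} yr: depth = {carbonation_depth(k, t):5.1f} mm")

    # service life: time until the carbonation front reaches the rebar
    t_critical = (cover / k) ** 2
    print(f"carbonation reaches the rebar after ~{t_critical:.0f} years")
    ```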

  12. A new proposal for greenhouse gas emissions responsibility allocation: best available technologies approach.

    PubMed

    Berzosa, Álvaro; Barandica, Jesús M; Fernández-Sánchez, Gonzalo

    2014-01-01

    In recent years, several methodologies have been developed for the quantification of greenhouse gas (GHG) emissions. However, determining who is responsible for these emissions is also quite challenging. The most common approach is to assign emissions to the producer (based on the Kyoto Protocol), but proposals also exist for its allocation to the consumer (based on an ecological footprint perspective) and for a hybrid approach called shared responsibility. In this study, the existing proposals and standards regarding the allocation of GHG emissions responsibilities are analyzed, focusing on their main advantages and problems. A new model of shared responsibility that overcomes some of the existing problems is also proposed. This model is based on applying the best available technologies (BATs). This new approach allocates the responsibility between the producers and the final consumers based on the real capacity of each agent to reduce emissions. The proposed approach is demonstrated using a simple case study of a 4-step life cycle of ammonia nitrate (AN) fertilizer production. The proposed model has the characteristics that the standards and publications for assignment of GHG emissions responsibilities demand. This study presents a new way to assign responsibilities that pushes all the actors in the production chain, including consumers, to reduce pollution. © 2013 SETAC.
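
    One hedged reading of the BAT-based allocation is sketched below: at each life-cycle step the producer answers for the emissions it could avoid by adopting the best available technology, and the remainder passes downstream to the consumer. The step names and figures are invented for illustration.

    ```python
    # Toy BAT-based shared-responsibility allocation over a life cycle.
    steps = [  # (life-cycle step, actual emissions, BAT-level emissions), t CO2e
        ("ammonia synthesis", 100.0, 70.0),
        ("nitric acid",        60.0, 45.0),
        ("AN production",      40.0, 35.0),
        ("transport",          20.0, 18.0),
    ]

    producer_total = consumer_total = 0.0
    for name, actual, bat in steps:
        producer = max(actual - bat, 0.0)   # avoidable with best available technology
        consumer = actual - producer        # unavoidable share passed downstream
        producer_total += producer
        consumer_total += consumer
        print(f"{name:18s} producer {producer:5.1f}  consumer {consumer:5.1f}")

    print(f"totals: producers {producer_total:.1f}, consumers {consumer_total:.1f}")
    ```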

  13. Online Knowledge-Based Model for Big Data Topic Extraction.

    PubMed

    Khan, Muhammad Taimoor; Durrani, Mehr; Khalid, Shehzad; Aziz, Furqan

    2016-01-01

    Lifelong machine learning (LML) models learn with experience, maintaining a knowledge-base without user intervention. Unlike traditional single-domain models, they can easily scale up to explore big data. The existing LML models have high data dependency, consume more resources, and do not support streaming data. This paper proposes an online LML model (OAMC) to support streaming data with reduced data dependency. By engineering the knowledge-base and introducing new knowledge features, the learning pattern of the model is improved for data arriving in pieces. OAMC improves accuracy, measured as topic coherence, by 7% for streaming data while halving the processing cost.

  14. Process-based soil erodibility estimation for empirical water erosion models

    USDA-ARS?s Scientific Manuscript database

    A variety of modeling technologies exist for water erosion prediction each with specific parameters. It is of interest to scrutinize parameters of a particular model from the point of their compatibility with dataset of other models. In this research, functional relationships between soil erodibilit...

  15. The cloud services innovation platform- enabling service-based environmental modelling using infrastructure-as-a-service cloud computing

    USDA-ARS?s Scientific Manuscript database

    Service oriented architectures allow modelling engines to be hosted over the Internet abstracting physical hardware configuration and software deployments from model users. Many existing environmental models are deployed as desktop applications running on user's personal computers (PCs). Migration ...

  16. Standardizing Acute Toxicity Data for use in Ecotoxicology Models: Influence of Test Type, Life Stage, and Concentration Reporting

    EPA Science Inventory

    Ecotoxicological models generally have large data requirements and are frequently based on existing information from diverse sources. Standardizing data for toxicological models may be necessary to reduce extraneous variation and to ensure models reflect intrinsic relationships. ...

  17. Constraint programming based biomarker optimization.

    PubMed

    Zhou, Manli; Luo, Youxi; Sun, Guoquan; Mai, Guoqin; Zhou, Fengfeng

    2015-01-01

    Efficient and intuitive characterization of biological big data is becoming a major challenge for modern bio-OMIC based scientists. Interactive visualization and exploration of big data has proven to be one of the successful solutions. Most of the existing feature selection algorithms do not allow interactive inputs from users during the feature selection optimization process. This study addresses the question by fixing a few user-input features in the finally selected feature subset and formulating these user-input features as constraints for a programming model. The proposed algorithm, fsCoP (feature selection based on constrained programming), performs similarly to or much better than the existing feature selection algorithms, even with the constraints from both the literature and the existing algorithms. An fsCoP biomarker may be intriguing for further wet-lab validation, since it satisfies both the classification optimization function and the biomedical knowledge. fsCoP may also be used for the interactive exploration of bio-OMIC big data by interactively adding user-defined constraints for modeling.
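
    A toy sketch of the constraint idea follows: a greedy forward search that always retains the user-fixed features. The placeholder data, scoring model, and fixed feature indices are assumptions, not the fsCoP formulation.

    ```python
    # Greedy forward feature selection with user-fixed features as hard constraints.
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    X, y = make_classification(n_samples=200, n_features=20, random_state=0)
    fixed = {3, 7}                  # indices the user insists on keeping (assumed)

    def score(feats):
        clf = LogisticRegression(max_iter=1000)
        return cross_val_score(clf, X[:, sorted(feats)], y, cv=5).mean()

    selected = set(fixed)           # the constrained features are never dropped
    best = score(selected)
    improved = True
    while improved:
        improved = False
        for j in set(range(X.shape[1])) - selected:
            s = score(selected | {j})
            if s > best:            # remember the single best-improving feature
                best, best_j, improved = s, j, True
        if improved:
            selected.add(best_j)

    print("selected features:", sorted(selected), "CV accuracy:", round(best, 3))
    ```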

  18. Promoter Sequences Prediction Using Relational Association Rule Mining

    PubMed Central

    Czibula, Gabriela; Bocicor, Maria-Iuliana; Czibula, Istvan Gergely

    2012-01-01

    In this paper we approach, from a computational perspective, the problem of promoter sequence prediction, an important problem within the field of bioinformatics. As the conditions for a DNA sequence to function as a promoter are not known, machine learning-based classification models are still being developed to approach the problem of promoter identification in DNA. We propose a classification model based on relational association rule mining. Relational association rules are a particular type of association rule and describe numerical orderings between attributes that commonly occur over a data set. Our classifier is based on the discovery of relational association rules for predicting whether or not a DNA sequence contains a promoter region. An experimental evaluation of the proposed model and a comparison with similar existing approaches are provided. The obtained results show that our classifier outperforms the existing techniques for identifying promoter sequences, confirming the potential of our proposal. PMID:22563233
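
    As a toy illustration of one form of relational association rule, an ordering A < B between numeric attributes that holds with high support, consider the following sketch; the data set and support threshold are invented.

    ```python
    # Mining attribute-ordering rules with support above a threshold.
    from itertools import permutations

    # rows of numeric attribute values; columns named a0..a3
    data = [
        [0.1, 0.5, 0.9, 0.2],
        [0.2, 0.6, 0.8, 0.1],
        [0.3, 0.4, 0.7, 0.3],
        [0.1, 0.7, 0.9, 0.0],
    ]
    names = ["a0", "a1", "a2", "a3"]
    min_support = 0.75              # rule must hold in >= 75% of the rows

    rules = []
    for i, j in permutations(range(len(names)), 2):
        support = sum(row[i] < row[j] for row in data) / len(data)
        if support >= min_support:
            rules.append((names[i], "<", names[j], support))

    for lhs, rel, rhs, sup in rules:
        print(f"{lhs} {rel} {rhs}  (support {sup:.2f})")
    ```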

  19. Graph-based real-time fault diagnostics

    NASA Technical Reports Server (NTRS)

    Padalkar, S.; Karsai, G.; Sztipanovits, J.

    1988-01-01

    A real-time fault detection and diagnosis capability is absolutely crucial in the design of large-scale space systems. Some of the existing AI-based fault diagnostic techniques, like expert systems and qualitative modelling, are frequently ill-suited for this purpose. Expert systems are often inadequately structured, difficult to validate and suffer from knowledge acquisition bottlenecks. Qualitative modelling techniques sometimes generate a large number of failure source alternatives, thus hampering speedy diagnosis. In this paper we present a graph-based technique which is well suited for real-time fault diagnosis, structured knowledge representation and acquisition, and testing and validation. A Hierarchical Fault Model of the system to be diagnosed is developed. At each level of the hierarchy there exist fault propagation digraphs denoting causal relations between failure modes of subsystems. The edges of such a digraph are weighted with fault propagation time intervals. Efficient and restartable graph algorithms are used for speedy on-line identification of failure source components.
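
    The back-tracing idea can be sketched as follows: each edge of the fault propagation digraph carries a [min, max] propagation-time window, and a candidate source is retained only if the observed alarm times fall inside the arrival windows it implies. The graph, the timings, and the assumption that the source fails at t = 0 are all illustrative.

    ```python
    # Failure-source identification over a time-weighted fault propagation digraph.
    edges = {  # failure-mode -> [(successor, t_min, t_max), ...] in seconds
        "pump_cavitation": [("low_pressure", 1, 3)],
        "low_pressure":    [("flow_alarm", 2, 5), ("temp_rise", 4, 10)],
        "sensor_fault":    [("flow_alarm", 0, 1)],
    }
    observed = {"low_pressure": 2.0, "flow_alarm": 5.5}   # alarm onset times

    def arrival_windows(source):
        """[earliest, latest] arrival time of each failure mode, assuming the
        source fails at t = 0."""
        win = {source: (0.0, 0.0)}
        stack = [source]
        while stack:
            node = stack.pop()
            lo0, hi0 = win[node]
            for succ, lo, hi in edges.get(node, []):
                cand = (lo0 + lo, hi0 + hi)
                old = win.get(succ)
                if old is None or cand[0] < old[0] or cand[1] > old[1]:
                    # widen the successor's window and re-propagate
                    win[succ] = cand if old is None else (min(old[0], cand[0]),
                                                          max(old[1], cand[1]))
                    stack.append(succ)
        return win

    def explains(source):
        win = arrival_windows(source)
        return all(n in win and win[n][0] <= t <= win[n][1]
                   for n, t in observed.items())

    print("possible failure sources:", [s for s in edges if explains(s)])
    ```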

  20. BIM authoring for an image-based bridge maintenance system of existing cable-supported bridges

    NASA Astrophysics Data System (ADS)

    Dang, N. S.; Shim, C. S.

    2018-04-01

    Infrastructure is increasingly becoming the main backbone of metropolitan development. Along with the rise of new facilities, the demand for maintenance of existing bridges is indispensable. The term "preventive maintenance" has recently become familiar to engineers; in practice it relies on a bridge maintenance system (BMS) based on a BIM-oriented model. In this paper, the process of generating a BMS based on a BIM model is introduced in detail. Data management for this BMS is separated into two modules: a site inspection system and an information management system. The noteworthy aspect of this model lies in the closed and automatic real-time process of capturing images, generating technical damage reports, and uploading feedback to the BMS. A pilot BMS for a cable-supported bridge is presented, which showed good performance and potential for the further development of preventive maintenance.

  1. Characterization of Orbital Debris via Hyper-Velocity Laboratory-Based Tests

    NASA Technical Reports Server (NTRS)

    Cowardin, Heather; Liou, J.-C.; Anz-Meador, Phillip; Sorge, Marlon; Opiela, John; Fitz-Coy, Norman; Huynh, Tom; Krisko, Paula

    2017-01-01

    Existing DOD and NASA satellite breakup models are based on a key laboratory test, Satellite Orbital debris Characterization Impact Test (SOCIT), which has supported many applications and matched on-orbit events involving older satellite designs reasonably well over the years. In order to update and improve these models, the NASA Orbital Debris Program Office, in collaboration with the Air Force Space and Missile Systems Center, The Aerospace Corporation, and the University of Florida, replicated a hypervelocity impact using a mock-up satellite, DebriSat, in controlled laboratory conditions. DebriSat is representative of present-day LEO satellites, built with modern spacecraft materials and construction techniques. Fragments down to 2 mm in size will be characterized by their physical and derived properties. A subset of fragments will be further analyzed in laboratory radar and optical facilities to update the existing radar-based NASA Size Estimation Model (SEM) and develop a comparable optical-based SEM. A historical overview of the project, status of the characterization process, and plans for integrating the data into various models will be discussed herein.

  3. Modeling spatiotemporal covariance for magnetoencephalography or electroencephalography source analysis.

    PubMed

    Plis, Sergey M; George, J S; Jun, S C; Paré-Blagoev, J; Ranken, D M; Wood, C C; Schmidt, D M

    2007-01-01

    We propose a new model to approximate spatiotemporal noise covariance for use in neural electromagnetic source analysis, which better captures temporal variability in background activity. As with other existing formalisms, our model employs a Kronecker product of matrices representing temporal and spatial covariance. In our model, spatial components are allowed to have differing temporal covariances. Variability is represented as a series of Kronecker products of spatial component covariances and corresponding temporal covariances. Unlike previous attempts to model covariance through a sum of Kronecker products, our model is designed to have a computationally manageable inverse. Despite increased descriptive power, inversion of the model is fast, making it useful in source analysis. We have explored two versions of the model. One is estimated based on the assumption that the spatial components of background noise have uncorrelated time courses. Another version, which gives a closer approximation, is based on the assumption that the time courses are statistically independent. The accuracy of the structural approximation is compared to that of an existing model based on a single Kronecker product, using both the Frobenius norm of the difference between the spatiotemporal sample covariance and the model, and scatter plots. The performance of our models and previous models is compared in source analysis of a large number of single-dipole problems with simulated time courses and with background from authentic magnetoencephalography data.
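
    Numerically, the proposed structure is a sum of Kronecker products, each pairing one spatial-component covariance with its own temporal covariance, as the following sketch shows; the sizes and random components are illustrative.

    ```python
    # Building a sum-of-Kronecker-products spatiotemporal covariance.
    import numpy as np

    rng = np.random.default_rng(0)
    n_space, n_time, n_comp = 5, 8, 3

    def random_spd(n):
        A = rng.normal(size=(n, n))
        return A @ A.T + n * np.eye(n)        # symmetric positive definite

    total = np.zeros((n_space * n_time, n_space * n_time))
    for _ in range(n_comp):
        C_s = random_spd(n_space)             # spatial covariance of one component
        C_t = random_spd(n_time)              # its own temporal covariance
        total += np.kron(C_s, C_t)

    # a single-Kronecker model would force every spatial component to share one
    # temporal covariance; the sum above relaxes that assumption
    print("covariance shape:", total.shape,
          "symmetric:", np.allclose(total, total.T))
    ```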

  4. Computer aided modeling of soil mix designs to predict characteristics and properties of stabilized road bases.

    DOT National Transportation Integrated Search

    2009-07-01

    "Considerable data exists for soils that were tested and documented, both for native properties and : properties with pozzolan stabilization. While the data exists there was no database for the Nebraska : Department of Roads to retrieve this data for...

  5. FacetModeller: Software for manual creation, manipulation and analysis of 3D surface-based models

    NASA Astrophysics Data System (ADS)

    Lelièvre, Peter G.; Carter-McAuslan, Angela E.; Dunham, Michael W.; Jones, Drew J.; Nalepa, Mariella; Squires, Chelsea L.; Tycholiz, Cassandra J.; Vallée, Marc A.; Farquharson, Colin G.

    2018-01-01

    The creation of 3D models is commonplace in many disciplines. Models are often built from a collection of tessellated surfaces. To apply numerical methods to such models it is often necessary to generate a mesh of space-filling elements that conforms to the model surfaces. While there are meshing algorithms that can do so, they place restrictive requirements on the surface-based models that are rarely met by existing 3D model building software. Hence, we have developed a Java application named FacetModeller, designed for efficient manual creation, modification and analysis of 3D surface-based models destined for use in numerical modelling.

  6. Distributed Hydrologic Modeling Apps for Decision Support in the Cloud

    NASA Astrophysics Data System (ADS)

    Swain, N. R.; Latu, K.; Christiensen, S.; Jones, N.; Nelson, J.

    2013-12-01

    Advances in computation resources and greater availability of water resources data represent an untapped resource for addressing hydrologic uncertainties in water resources decision-making. The current practice of water authorities relies on empirical, lumped hydrologic models to estimate watershed response. These models are not capable of taking advantage of many of the spatial datasets that are now available. Physically-based, distributed hydrologic models are capable of using these data resources and providing better predictions through stochastic analysis. However, there exists a digital divide that discourages many science-minded decision makers from using distributed models. This divide can be spanned using a combination of existing web technologies. The purpose of this presentation is to present a cloud-based environment that will offer hydrologic modeling tools or 'apps' for decision support and the web technologies that have been selected to aid in its implementation. Compared to the more commonly used lumped-parameter models, distributed models, while being more intuitive, are still data intensive, computationally expensive, and difficult to modify for scenario exploration. However, web technologies such as web GIS, web services, and cloud computing have made the data more accessible, provided an inexpensive means of high-performance computing, and created an environment for developing user-friendly apps for distributed modeling. Since many water authorities are primarily interested in the scenario exploration exercises with hydrologic models, we are creating a toolkit that facilitates the development of a series of apps for manipulating existing distributed models. There are a number of hurdles that cloud-based hydrologic modeling developers face. One of these is how to work with the geospatial data inherent with this class of models in a web environment. Supporting geospatial data in a website is beyond the capabilities of standard web frameworks and it requires the use of additional software. In particular, there are at least three elements that are needed: a geospatially enabled database, a map server, and geoprocessing toolbox. We recommend a software stack for geospatial web application development comprising: MapServer, PostGIS, and 52 North with Python as the scripting language to tie them together. Another hurdle that must be cleared is managing the cloud-computing load. We are using HTCondor as a solution to this end. Finally, we are creating a scripting environment wherein developers will be able to create apps that use existing hydrologic models in our system with minimal effort. This capability will be accomplished by creating a plugin for a Python content management system called CKAN. We are currently developing cyberinfrastructure that utilizes this stack and greatly lowers the investment required to deploy cloud-based modeling apps. This material is based upon work supported by the National Science Foundation under Grant No. 1135482

  7. Improved Accuracy Using Recursive Bayesian Estimation Based Language Model Fusion in ERP-Based BCI Typing Systems

    PubMed Central

    Orhan, U.; Erdogmus, D.; Roark, B.; Oken, B.; Purwar, S.; Hild, K. E.; Fowler, A.; Fried-Oken, M.

    2013-01-01

    RSVP Keyboard™ is an electroencephalography (EEG) based brain-computer interface (BCI) typing system, designed as an assistive technology for the communication needs of people with locked-in syndrome (LIS). It relies on rapid serial visual presentation (RSVP) and does not require precise eye-gaze control. Existing BCI typing systems that use event-related potentials (ERPs) in EEG suffer from low accuracy due to a low signal-to-noise ratio. Hence, RSVP Keyboard™ incorporates a language model for context-based decision making, to improve the accuracy of letter decisions. To further improve the contribution of the language model, we propose recursive Bayesian estimation, which relies on non-committing string decisions, and conduct an offline analysis comparing it with the existing naïve Bayesian fusion approach. The results indicate the superiority of the recursive Bayesian fusion, and in the next generation of RSVP Keyboard™ we plan to incorporate this new approach. PMID:23366432
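
    A toy sketch of the fusion idea, recursively multiplying a language-model prior over symbols by the EEG evidence likelihood after each RSVP sequence, is given below; the alphabet and all probabilities are invented for illustration.

    ```python
    # Recursive Bayesian fusion of a language-model prior with EEG evidence.
    import numpy as np

    symbols = np.array(list("AB_"))           # tiny alphabet for the example
    lm_prior = np.array([0.5, 0.3, 0.2])      # P(symbol | typed context) from an LM

    posterior = lm_prior.copy()
    eeg_likelihoods = [                       # P(EEG epoch | symbol), per sequence
        np.array([0.40, 0.45, 0.15]),
        np.array([0.55, 0.35, 0.10]),
    ]

    for like in eeg_likelihoods:              # recursive Bayesian update
        posterior = posterior * like
        posterior /= posterior.sum()          # renormalize after each sequence
        print(dict(zip(symbols, posterior.round(3))))

    print("decision:", symbols[posterior.argmax()])
    ```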

  8. Improving inflow forecasting into hydropower reservoirs through a complementary modelling framework

    NASA Astrophysics Data System (ADS)

    Gragne, A. S.; Sharma, A.; Mehrotra, R.; Alfredsen, K.

    2014-10-01

    Accuracy of reservoir inflow forecasts is instrumental for maximizing the value of water resources and the benefits gained through hydropower generation. Improving hourly reservoir inflow forecasts over a 24 h lead-time is considered within the day-ahead (Elspot) market of the Nordic exchange market. We present here a new approach for issuing hourly reservoir inflow forecasts that aims to improve on the forecasting models in operational use, without modifying the pre-existing approach: an additive or complementary model is formulated that is independent of the existing model and captures the structure it may be missing. Besides improving the forecast skill of operational models, the approach estimates the uncertainty in the complementary model structure and produces probabilistic inflow forecasts that contain suitable information for reducing uncertainty in decision-making processes in hydropower systems operation. The procedure presented comprises an error model added on top of an unalterable constant-parameter conceptual model, and it is demonstrated with reference to the 207 km2 Krinsvatn catchment in central Norway. The structure of the error model is established based on attributes of the residual time series from the conceptual model. Deterministic and probabilistic evaluations revealed an overall significant improvement in forecast accuracy for lead-times up to 17 h. Season-based evaluations indicated that the improvement in inflow forecasts varies across seasons, and inflow forecasts in autumn and spring are less successful, with the 95% prediction interval bracketing less than 95% of the observations for lead-times beyond 17 h.
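
    The complementary principle can be sketched with a deliberately simple error model: leave the operational model untouched, fit a model to its residual series (here an AR(1), an assumption; the paper derives the structure from the residuals themselves), and add the predicted residual to the base forecast.

    ```python
    # AR(1) complementary error model on top of an unaltered base forecast.
    import numpy as np

    rng = np.random.default_rng(0)
    observed = 100 + 10 * np.sin(np.arange(200) / 10.0)
    base_forecast = observed + rng.normal(2.0, 1.0, size=200)  # biased base model

    resid = observed - base_forecast
    # fit AR(1): resid[t] ~ phi * resid[t-1] + c, by ordinary least squares
    A = np.vstack([resid[:-1], np.ones(len(resid) - 1)]).T
    phi, c = np.linalg.lstsq(A, resid[1:], rcond=None)[0]

    next_resid = phi * resid[-1] + c          # predicted error of the next forecast
    print(f"phi={phi:.2f}, c={c:.2f}; "
          f"add {next_resid:.2f} to the base model's next forecast")
    ```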

  9. Activity-Centered Domain Characterization for Problem-Driven Scientific Visualization

    PubMed Central

    Marai, G. Elisabeta

    2018-01-01

    Although visualization design models exist in the literature in the form of higher-level methodological frameworks, these models do not present a clear methodological prescription for the domain characterization step. This work presents a framework and end-to-end model for requirements engineering in problem-driven visualization application design. The framework and model are based on the activity-centered design paradigm, which is an enhancement of human-centered design. The proposed activity-centered approach focuses on user tasks and activities, and allows an explicit link between the requirements engineering process with the abstraction stage—and its evaluation—of existing, higher-level visualization design models. In a departure from existing visualization design models, the resulting model: assigns value to a visualization based on user activities; ranks user tasks before the user data; partitions requirements in activity-related capabilities and nonfunctional characteristics and constraints; and explicitly incorporates the user workflows into the requirements process. A further merit of this model is its explicit integration of functional specifications, a concept this work adapts from the software engineering literature, into the visualization design nested model. A quantitative evaluation using two sets of interdisciplinary projects supports the merits of the activity-centered model. The result is a practical roadmap to the domain characterization step of visualization design for problem-driven data visualization. Following this domain characterization model can help remove a number of pitfalls that have been identified multiple times in the visualization design literature. PMID:28866550

  10. A RETROSPECTIVE ANALYSIS OF MODEL UNCERTAINTY FOR FORECASTING HYDROLOGIC CHANGE

    EPA Science Inventory

    GIS-based hydrologic modeling offers a convenient means of assessing the impacts associated with land-cover/use change for environmental planning efforts. Alternative future scenarios can be used as input to hydrologic models and compared with existing conditions to evaluate pot...

  11. Students' models in some topics of electricity and magnetism

    NASA Astrophysics Data System (ADS)

    Warnakulasooriya, Rasil

    Model-based learning has been emphasized by many researchers, and many theories have been put forward on how students reason. However, how these theories of reasoning are manifested within the context of electricity and magnetism, and how to implement a model-based learning environment within such a context, has not been the object of research. In this dissertation, we address these two concerns. We probe students' reasoning through a model-based diagnostic instrument. The instrument consists of a set of related multiple-choice questions that can be categorized as belonging to the same conceptual domain; the contextual features of a set are also kept to a minimum. We find that students' responses are tied to the models they have constructed or construct on the spot when faced with novel situations. We find that concepts such as electric fields and electric potentials exist as mere "definitions" and do not contribute to forming a set of working models, and as such the need for the use of such concepts cannot be easily recognized. We also find that students function within a set of procedural rules. Whether these rules are extended directly from familiar situations through analogies or lead to constructing a set of new rules is constrained by the underlying models and the context of the questions. Models also either exist or are constructed in ways that lead students to overlook the common-sense reality of physical phenomena. We also find that the way questions are perceived and interpreted depends on the underlying models, and that different models exist without conflicting with each other. Based on the above findings, we argue that students' reasoning is context specific and is sensitive to the way the learning has taken place. Thus, we suggest a recontextualization process as a specific model-based learning environment to help students learn electricity and magnetism. Step-by-step guidance through a series of such related questions would then elucidate the context within which concepts are introduced, the limitations of particular representations, and the ontological demands required by the subject.

  12. Developing a Behavioral Model for Mobile Phone-Based Diabetes Interventions

    PubMed Central

    Nundy, Shantanu; Dick, Jonathan J.; Solomon, Marla C.; Peek, Monica E.

    2013-01-01

    Objectives Behavioral models for mobile phone-based diabetes interventions are lacking. This study explores the potential mechanisms by which a text message-based diabetes program affected self-management among African-Americans. Methods We conducted in-depth, individual interviews among 18 African-American patients with type 2 diabetes who completed a 4-week text message-based diabetes program. Each interview was audiotaped, transcribed verbatim, and imported into Atlas.ti software. Coding was done iteratively. Emergent themes were mapped onto existing behavioral constructs and then used to develop a novel behavioral model for mobile phone-based diabetes self-management programs. Results The effects of the text message-based program went beyond automated reminders. The constant, daily communications reduced denial of diabetes and reinforced the importance of self-management (Rosenstock Health Belief Model). Responding positively to questions about self-management increased mastery experience (Bandura Self-Efficacy). Most surprisingly, participants perceived the automated program as a "friend" and "support group" that monitored and supported their self-management behaviors (Barrera Social Support). Conclusions A mobile phone-based diabetes program affected self-management through multiple behavioral constructs including health beliefs, self-efficacy, and social support. Practice implications: Disease management programs that utilize mobile technologies should be designed to leverage existing models of behavior change and can address barriers to self-management associated with health disparities. PMID:23063349

  13. Women's Self-definition in Adulthood: From a Different Model?

    ERIC Educational Resources Information Center

    Peck, Teresa A.

    1986-01-01

    Examines criticisms of existing models of adult development from both feminist and developmental psychologists. A model of women's adult self-definition is presented, based upon current research on women's adult experience. The model combines a dialectical approach, which considers the effects of social/historical factors, with a feminist…

  14. A development framework for artificial intelligence based distributed operations support systems

    NASA Technical Reports Server (NTRS)

    Adler, Richard M.; Cottman, Bruce H.

    1990-01-01

    Advanced automation is required to reduce costly human operations support requirements for complex space-based and ground control systems. Existing knowledge-based technologies have been used successfully to automate individual operations tasks. Considerably less progress has been made in integrating and coordinating multiple operations applications for unified intelligent support systems. To fill this gap, SOCIAL, a tool set for developing Distributed Artificial Intelligence (DAI) systems, is being constructed. SOCIAL consists of three primary language-based components defining: models of interprocess communication across heterogeneous platforms; models for interprocess coordination, concurrency control, and fault management; and models for accessing heterogeneous information resources. DAI application subsystems, either new or existing, will access these distributed services non-intrusively, via high-level message-based protocols. SOCIAL will reduce the complexity of distributed communications, control, and integration, enabling developers to concentrate on the design and functionality of the target DAI system itself.

  15. PHYSIOLOGICALLY-BASED PHARMACOKINETIC ( PBPK ) MODEL FOR METHYL TERTIARY BUTYL ETHER ( MTBE ): A REVIEW OF EXISTING MODELS

    EPA Science Inventory

    MTBE is a volatile organic compound used as an oxygenate additive to gasoline, added to comply with the 1990 Clean Air Act. Previous PBPK models for MTBE were reviewed and incorporated into the Exposure Related Dose Estimating Model (ERDEM) software. This model also included an e...

  16. A PHYSIOLOGICALLY-BASED PHARMACOKINETIC MODEL FOR TRICHLOROETHYLENE WITH SPECIFICITY FOR THE LONG EVANS RAT

    EPA Science Inventory

    A PBPK model for TCE with specificity for the male LE rat that accurately predicts TCE tissue time-course data has not been developed, although other PBPK models for TCE exist. Development of such a model was the present aim. The PBPK model consisted of 5 compartments: fat; slowl...

  17. Population genetic testing for cancer susceptibility: founder mutations to genomes.

    PubMed

    Foulkes, William D; Knoppers, Bartha Maria; Turnbull, Clare

    2016-01-01

    The current standard model for identifying carriers of high-risk mutations in cancer-susceptibility genes (CSGs) generally involves a process that is not amenable to population-based testing: access to genetic tests is typically regulated by health-care providers on the basis of a labour-intensive assessment of an individual's personal and family history of cancer, with face-to-face genetic counselling performed before mutation testing. Several studies have shown that application of these selection criteria results in a substantial proportion of mutation carriers being missed. Population-based genetic testing has been proposed as an alternative approach to determining cancer susceptibility, and aims for a more-comprehensive detection of mutation carriers. Herein, we review the existing data on population-based genetic testing, and consider some of the barriers, pitfalls, and challenges related to the possible expansion of this approach. We consider mechanisms by which population-based genetic testing for cancer susceptibility could be delivered, and suggest how such genetic testing might be integrated into existing and emerging health-care structures. The existing models of genetic testing (including issues relating to informed consent) will very likely require considerable alteration if the potential benefits of population-based genetic testing are to be fully realized.

  18. A Decision Model for Merging Base Operations: Outsourcing Pest Management on Joint Base Anacostia-Bolling

    DTIC Science & Technology

    2011-11-30

    When the choice to in-source or outsource an installation function or service requirement exists, in these challenging economic times, it is now more…

  19. EXPOSURE RELATED DOSE ESTIMATING MODEL ( ERDEM ) A PHYSIOLOGICALLY-BASED PHARMACOKINETIC AND PHARMACODYNAMIC ( PBPK/PD ) MODEL FOR ASSESSING HUMAN EXPOSURE AND RISK

    EPA Science Inventory

    The Exposure Related Dose Estimating Model (ERDEM) is a PBPK/PD modeling system that was developed by EPA's National Exposure Research Laboratory (NERL). The ERDEM framework provides the flexibility either to use existing models and to build new PBPK and PBPK/PD models to address...

  20. A domains-based taxonomy of supported accommodation for people with severe and persistent mental illness.

    PubMed

    Siskind, Dan; Harris, Meredith; Pirkis, Jane; Whiteford, Harvey

    2013-06-01

    A lack of definitional clarity in supported accommodation and the absence of a widely accepted system for classifying supported accommodation models creates barriers to service planning and evaluation. We undertook a systematic review of existing supported accommodation classification systems. Using a structured system for qualitative data analysis, we reviewed the stratification features in these classification systems, identified the key elements of supported accommodation and arranged them into domains and dimensions to create a new taxonomy. The existing classification systems were mapped onto the new taxonomy to verify the domains and dimensions. Existing classification systems used either a service-level characteristic or programmatic approach. We proposed a taxonomy based around four domains: duration of tenure; patient characteristics; housing characteristics; and service characteristics. All of the domains in the taxonomy were drawn from the existing classification structures; however, none of the existing classification structures covered all of the domains in the taxonomy. Existing classification systems are regionally based, limited in scope and lack flexibility. A domains-based taxonomy can allow more accurate description of supported accommodation services, aid in identifying the service elements likely to improve outcomes for specific patient populations, and assist in service planning.

  1. Plausible combinations: An improved method to evaluate the covariate structure of Cormack-Jolly-Seber mark-recapture models

    USGS Publications Warehouse

    Bromaghin, Jeffrey F.; McDonald, Trent L.; Amstrup, Steven C.

    2013-01-01

    Mark-recapture models are extensively used in quantitative population ecology, providing estimates of population vital rates, such as survival, that are difficult to obtain using other methods. Vital rates are commonly modeled as functions of explanatory covariates, adding considerable flexibility to mark-recapture models, but also increasing the subjectivity and complexity of the modeling process. Consequently, model selection and the evaluation of covariate structure remain critical aspects of mark-recapture modeling. The difficulties involved in model selection are compounded in Cormack-Jolly-Seber models because they are composed of separate sub-models for survival and recapture probabilities, which are conceptualized independently even though their parameters are not statistically independent. The construction of models as combinations of sub-models, together with multiple potential covariates, can lead to a large model set. Although desirable, estimation of the parameters of all models may not be feasible. Strategies to search a model space and base inference on a subset of all models exist and enjoy widespread use. However, even though the methods used to search a model space can be expected to influence parameter estimation, the assessment of covariate importance, and therefore the ecological interpretation of the modeling results, the performance of these strategies has received limited investigation. We present a new strategy for searching the space of a candidate set of Cormack-Jolly-Seber models and explore its performance relative to existing strategies using computer simulation. The new strategy provides an improved assessment of the importance of covariates and covariate combinations used to model survival and recapture probabilities, while requiring only a modest increase in the number of models on which inference is based in comparison to existing techniques.

  2. DEVELOPMENT AND EVALUATION OF AN INTEGRATED MODEL TO FACILITATE RISK-BASED CORRECTIVE ACTION AT SUPERFUND SITES

    EPA Science Inventory

    We developed a numerical model to predict chemical concentrations in indoor environments resulting from soil vapor intrusion and volatilization from groundwater. The model, which integrates new and existing algorithms for chemical fate and transport, was originally...

  3. Determination of wind tunnel constraint effects by a unified pressure signature method. Part 2: Application to jet-in-crossflow

    NASA Technical Reports Server (NTRS)

    Hackett, J. E.; Sampath, S.; Phillips, C. G.

    1981-01-01

    The development of an improved jet-in-crossflow model for estimating wind tunnel blockage and angle-of-attack interference is described. Experiments showed that the simpler existing models fall seriously short of representing far-field flows properly. A new, vortex-source-doublet (VSD) model was therefore developed which employs curved trajectories and experimentally-based singularity strengths. The new model is consistent with existing and new experimental data and it predicts tunnel wall (i.e. far-field) pressures properly. It is implemented as a preprocessor to the wall-pressure-signature-based tunnel interference predictor. The supporting experiments and theoretical studies revealed some new results. Comparative flow field measurements with 1-inch "free-air" and 3-inch impinging jets showed that vortex penetration into the flow, in diameters, was almost unaltered until 'hard' impingement occurred. In modeling impinging cases, a 'plume redirection' term was introduced which is apparently absent in previous models. The effects of this term were found to be very significant.

  4. Cloud Computing Value Chains: Understanding Businesses and Value Creation in the Cloud

    NASA Astrophysics Data System (ADS)

    Mohammed, Ashraf Bany; Altmann, Jörn; Hwang, Junseok

    Based on the promising developments in Cloud Computing technologies in recent years, commercial computing resource services (e.g. Amazon EC2) and software-as-a-service offerings (e.g. Salesforce.com) came into existence. However, the relatively weak business exploitation, participation, and adoption of other Cloud Computing services remain the main challenges. The vague value structures seem to be hindering business adoption and the creation of sustainable business models around the technology. Using an extensive analysis of existing Cloud business models, Cloud services, stakeholder relations, market configurations and value structures, this chapter develops a reference model for value chains in the Cloud. Although theoretically based on Porter's value chain theory, the proposed Cloud value chain model is upgraded to fit the diversity of business service scenarios in Cloud computing markets. Using this model, different service scenarios are explained. Our findings suggest new services, business opportunities, and policy practices for realizing more adoption and value creation paths in the Cloud.

  5. Exploring the common molecular basis for the universal DNA mutation bias: Revival of Loewdin mutation model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fu, Liang-Yu; Wang, Guang-Zhong

    2011-06-10

    Highlights: • There exists a universal G:C → A:T mutation bias in the three domains of life. • This universal mutation bias has not been sufficiently explained. • A DNA mutation model proposed by Loewdin 40 years ago offers a common explanation. -- Abstract: Recently, numerous genome analyses revealed the existence of a universal G:C → A:T mutation bias in bacteria, fungi, plants and animals. To explore the molecular basis for this mutation bias, we examined the three well-known DNA mutation models, i.e., the oxidative damage model, the UV-radiation damage model and the CpG hypermutation model. It was revealed that these models cannot provide a sufficient explanation for the universal mutation bias. Therefore, we resorted to a DNA mutation model proposed by Loewdin 40 years ago, which is based on inter-base double proton transfers (DPT). Since DPT is a fundamental and spontaneous chemical process and occurs much more frequently within GC pairs than AT pairs, the Loewdin model offers a common explanation for the observed universal mutation bias and thus has broad biological implications.

  6. Mechanistic understanding of cellular level of water in plant-based food material

    NASA Astrophysics Data System (ADS)

    Khan, Md. Imran H.; Kumar, C.; Karim, M. A.

    2017-06-01

    Understanding of water distribution in plant-based food material is crucial for developing an accurate heat and mass transfer drying model. Generally, in plant-based food tissue, water is distributed in three different spaces: intercellular water, intracellular water, and cell wall water. For hygroscopic materials, these three types of water transport should be considered for a proper understanding of heat and mass transfer during drying. However, few studies have investigated the moisture distribution across the different cellular environments in plant-based food material. Therefore, the aim of the present study was to investigate the proportions of intercellular water, intracellular water, and cell wall water inside plant-based food material. Experiments were performed on two different plant-based food tissues, eggplant and potato, using 1H-NMR-T2 relaxometry. The various water components were calculated using multicomponent fits of the T2 relaxation curves. The experimental results showed that in potato tissue 80-82% of the water exists in intracellular space, 10-13% in intercellular space, and only 4-6% in the cell wall space. In eggplant tissue, 90-93% of the water exists in intracellular space, 4-6% in intercellular space, and the remaining fraction is recognized as cell wall water. These results quantify the different types of water in plant-based food tissue, with the highest proportion of water in intracellular spaces. It is therefore necessary to include different transport mechanisms for intracellular, intercellular, and cell wall water when modelling heat and mass transfer during drying.
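
    As an illustration of the multicomponent fitting step described above, the sketch below fits a tri-exponential decay to a synthetic T2 relaxation curve and converts the fitted amplitudes into relative water fractions. The component count, T2 values, amplitudes, and noise level are illustrative assumptions, not the paper's measured data.

    ```python
    # Minimal sketch of a multicomponent T2 fit: one exponential term per water
    # compartment (cell wall, intercellular, intracellular). All numbers are
    # illustrative assumptions.
    import numpy as np
    from scipy.optimize import curve_fit

    def tri_exponential(t, a1, t21, a2, t22, a3, t23):
        """Sum of three exponential decays, one per water compartment."""
        return a1 * np.exp(-t / t21) + a2 * np.exp(-t / t22) + a3 * np.exp(-t / t23)

    t = np.linspace(0.001, 2.0, 400)              # acquisition times (s)
    rng = np.random.default_rng(0)
    signal = tri_exponential(t, 0.05, 0.005, 0.12, 0.08, 0.83, 0.6)
    signal += rng.normal(0.0, 0.002, t.size)      # measurement noise

    # Initial guesses separate the three decades of T2
    # (cell wall < intercellular < intracellular)
    p0 = [0.1, 0.003, 0.1, 0.05, 0.8, 0.5]
    popt, _ = curve_fit(tri_exponential, t, signal, p0=p0, bounds=(0, np.inf))

    amplitudes = popt[0::2]
    fractions = amplitudes / amplitudes.sum()     # relative water proportions
    print(dict(zip(["cell_wall", "intercellular", "intracellular"], fractions.round(3))))
    ```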

  7. A Simplified Micromechanical Modeling Approach to Predict the Tensile Flow Curve Behavior of Dual-Phase Steels

    NASA Astrophysics Data System (ADS)

    Nanda, Tarun; Kumar, B. Ravi; Singh, Vishal

    2017-11-01

    Micromechanical modeling is used to predict a material's tensile flow curve behavior based on microstructural characteristics. This research develops a simplified micromechanical modeling approach for predicting the flow curve behavior of dual-phase steels. The existing literature reports two broad approaches for determining the tensile flow curve of these steels; the approach developed in this work attempts to overcome specific limitations of those two approaches by combining a dislocation-based strain-hardening method with the rule of mixtures. In the first step of modeling, the dislocation-based strain-hardening method was employed to predict the tensile behavior of the individual ferrite and martensite phases. In the second step, the individual flow curves were combined using the rule of mixtures to obtain the composite dual-phase flow behavior. To check the accuracy of the proposed model, four distinct dual-phase microstructures comprising different ferrite grain sizes, martensite fractions, and carbon contents in martensite were processed by annealing experiments. The true stress-strain curves for the various microstructures were predicted with the newly developed micromechanical model, and the results matched closely with those of actual tensile tests. Thus, this micromechanical modeling approach can be used to predict and optimize the tensile flow behavior of dual-phase steels.
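
    The two-step structure lends itself to a compact sketch. Below, Hollomon-type power-law curves stand in for the dislocation-based strain-hardening laws of the individual phases (the paper derives these from dislocation theory), and an iso-strain rule of mixtures combines them; all constants are illustrative assumptions, not fitted values.

    ```python
    # Hedged sketch of the two-step approach: single-phase flow curves, then
    # a rule-of-mixtures combination. Constants are illustrative only.
    import numpy as np

    def hollomon(strain, K, n):
        """Single-phase true stress-strain curve, sigma = K * eps^n."""
        return K * strain**n

    eps = np.linspace(1e-3, 0.15, 100)
    sigma_ferrite = hollomon(eps, K=900.0, n=0.25)      # softer, more hardening
    sigma_martensite = hollomon(eps, K=2500.0, n=0.08)  # harder, less hardening

    f_m = 0.3  # martensite volume fraction
    # Iso-strain rule of mixtures: both phases carry the same strain.
    sigma_dp = f_m * sigma_martensite + (1.0 - f_m) * sigma_ferrite
    ```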

  8. Teachers and Students' Conceptions of Computer-Based Models in the Context of High School Chemistry: Elicitations at the Pre-intervention Stage

    NASA Astrophysics Data System (ADS)

    Waight, Noemi; Gillmeister, Kristina

    2014-04-01

    This study examined teachers' and students' initial conceptions of computer-based models—Flash and NetLogo models—and documented how teachers and students reconciled notions of multiple representations featuring macroscopic, submicroscopic and symbolic representations prior to actual intervention in eight high school chemistry classrooms. Individual in-depth interviews were conducted with 32 students and 6 teachers. Findings revealed an interplay of complex factors that functioned as opportunities and obstacles in the implementation of technologies in science classrooms. Students preferred the Flash models over the open-ended NetLogo models. Altogether, due to a lack of content and modeling background knowledge, students experienced difficulties articulating coherent and blended understandings of multiple representations. Concurrently, while the aesthetic and interactive features of the models were of great value, they did not sustain students' initial curiosity or provide opportunities to improve understanding of chemistry phenomena. Most teachers recognized direct alignment of the Flash model with their existing curriculum; however, the benefits were relegated to existing procedural and passive classroom practices. The findings have implications for pedagogical approaches that address the implementation of computer-based models, the function of models, models as multiple representations, the role of background knowledge and cognitive load, and the role of teacher vision and classroom practices.

  9. Smart Climatology Applications for Undersea Warfare

    DTIC Science & Technology

    2008-09-01

    Comparisons of existing Navy climatologies based on the Generalized Digital Environmental Model (GDEM) to our smart ocean climatologies reveal a number of differences in sonic … conditions relevant to undersea warfare. Subject terms: antisubmarine warfare, climate variations, climatology, GDEM, ocean.

  10. A Constructivist-Based Model for the Teaching of Dissolution of Gas in a Liquid

    ERIC Educational Resources Information Center

    Calik, Muammer; Ayas, Alipasa; Coll, Richard K.

    2006-01-01

    In this article we present details of a four-step constructivist-based teaching strategy, which helps students understand the dissolution of a gas in a liquid. The model derived from Ayas (1995) involves elicitation of pre-existing ideas, focusing on the target concept, challenging students' ideas, and applying newly constructed ideas to similar…

  11. Assessing the SunGuide and STEWARD databases.

    DOT National Transportation Integrated Search

    2017-02-01

    This project evaluated the feasibility of using the existing software and databases as platforms for analyzing the attributes of electric vehicles within present and future transportation infrastructure projects and models. The Florida-based Sun...

  12. Technology based transportation solutions : model deployment initiative

    DOT National Transportation Integrated Search

    1997-08-01

    The Model Deployment Initiative provides real-life examples of technology's potential in metropolitan areas across the country. Investments from public and private sector partners will integrate existing ITS elements in the four sites as part of a ...

  13. Microfluidic on-chip biomimicry for 3D cell culture: a fit-for-purpose investigation from the end user standpoint

    PubMed Central

    Liu, Ye; Gill, Elisabeth; Shery Huang, Yan Yan

    2017-01-01

    A plethora of 3D and microfluidics-based culture models have been demonstrated in recent years with the ultimate aim of facilitating predictive in vitro models for pharmaceutical development. This article summarizes the progress to date in microfluidics-based tissue culture models, including organ-on-a-chip and vasculature-on-a-chip. Specific focus is placed on the question of what kinds of 3D culture and system complexities are deemed desirable by the biological and biomedical community. This question is addressed through analysis of a research survey evaluating the potential use of microfluidic cell culture models among end users. Our results showed a willingness to adopt 3D culture technology among biomedical researchers, although a significant gap still exists between the desired systems and existing 3D culture options. With these results, key challenges and future directions are highlighted. PMID:28670465

  14. Microfluidic on-chip biomimicry for 3D cell culture: a fit-for-purpose investigation from the end user standpoint.

    PubMed

    Liu, Ye; Gill, Elisabeth; Shery Huang, Yan Yan

    2017-06-01

    A plethora of 3D and microfluidics-based culture models have been demonstrated in recent years with the ultimate aim of facilitating predictive in vitro models for pharmaceutical development. This article summarizes the progress to date in microfluidics-based tissue culture models, including organ-on-a-chip and vasculature-on-a-chip. Specific focus is placed on the question of what kinds of 3D culture and system complexities are deemed desirable by the biological and biomedical community. This question is addressed through analysis of a research survey evaluating the potential use of microfluidic cell culture models among end users. Our results showed a willingness to adopt 3D culture technology among biomedical researchers, although a significant gap still exists between the desired systems and existing 3D culture options. With these results, key challenges and future directions are highlighted.

  15. Fault management for data systems

    NASA Technical Reports Server (NTRS)

    Boyd, Mark A.; Iverson, David L.; Patterson-Hine, F. Ann

    1993-01-01

    Issues related to automating the process of fault management (fault diagnosis and response) for data management systems are considered. Substantial benefits are to be gained by successful automation of this process, particularly for large, complex systems. The use of graph-based models to develop a computer-assisted fault management system is advocated. The general problem is described, and the motivation behind choosing graph-based models over other approaches for developing fault diagnosis computer programs is outlined. Some existing work in the area of graph-based fault diagnosis is reviewed, and a new fault management method, developed from existing methods, is offered. Our method is applied to an automatic telescope system intended as a prototype for future lunar telescope programs. Finally, an application of our method to general data management systems is described.
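
    To make the graph-based idea concrete, the sketch below encodes a toy fault-propagation graph and walks it backwards from an observed symptom to enumerate candidate root causes. The component names and edges are hypothetical; the reviewed work uses richer graph models, so this shows only the core traversal idea.

    ```python
    # Minimal sketch of graph-based fault diagnosis: failures propagate along
    # directed dependency edges, and diagnosis walks the graph backwards.
    from collections import defaultdict

    # Edge A -> B means "a fault in A can cause a symptom in B" (hypothetical names)
    edges = [("power_bus", "sensor"), ("power_bus", "telemetry"),
             ("sensor", "data_logger"), ("telemetry", "data_logger")]

    causes = defaultdict(set)
    for src, dst in edges:
        causes[dst].add(src)

    def candidate_faults(symptom, causes):
        """Walk the causal graph backwards from an observed symptom."""
        frontier, seen = [symptom], set()
        while frontier:
            node = frontier.pop()
            for parent in causes.get(node, ()):
                if parent not in seen:
                    seen.add(parent)
                    frontier.append(parent)
        return seen

    print(candidate_faults("data_logger", causes))  # sensor, telemetry, power_bus
    ```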

  16. An agenda-based routing protocol in delay tolerant mobile sensor networks.

    PubMed

    Wang, Xiao-Min; Zhu, Jin-Qi; Liu, Ming; Gong, Hai-Gang

    2010-01-01

    Routing in delay tolerant mobile sensor networks (DTMSNs) is challenging due to the networks' intermittent connectivity. Most existing routing protocols for DTMSNs use simplistic random mobility models for algorithm design and performance evaluation. In the real world, however, due to the unique characteristics of human mobility, existing random mobility models may not work well in environments where mobile sensor units are carried by people (as in DTMSNs). Taking a person's social activities into consideration, in this paper we seek to improve DTMSN routing in terms of social structure and propose an agenda-based routing protocol (ARP). In ARP, humans are classified based on their agendas, and data transmission decisions are made according to sensor nodes' transmission rankings. The effectiveness of ARP is demonstrated through comprehensive simulation studies.
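
    A minimal sketch of the ranking-driven forwarding decision follows. The ranking function and its weights are invented for illustration; in ARP the rankings derive from nodes' agendas, which this sketch does not model.

    ```python
    # Hedged sketch of ranking-based forwarding: a carrier hands a message to a
    # neighbour only if the neighbour's transmission ranking is higher.
    def delivery_ranking(node):
        """Higher score = more likely to deliver soon (illustrative weights)."""
        return 0.7 * node["sink_visit_freq"] + 0.3 * node["social_activity"]

    def should_forward(carrier, neighbour):
        return delivery_ranking(neighbour) > delivery_ranking(carrier)

    carrier = {"sink_visit_freq": 0.2, "social_activity": 0.5}
    neighbour = {"sink_visit_freq": 0.6, "social_activity": 0.4}
    print(should_forward(carrier, neighbour))  # True: neighbour ranks higher
    ```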

  17. Online Knowledge-Based Model for Big Data Topic Extraction

    PubMed Central

    Khan, Muhammad Taimoor; Durrani, Mehr; Khalid, Shehzad; Aziz, Furqan

    2016-01-01

    Lifelong machine learning (LML) models learn with experience, maintaining a knowledge base without user intervention. Unlike traditional single-domain models, they can easily scale up to explore big data. Existing LML models have high data dependency, consume more resources, and do not support streaming data. This paper proposes an online LML model (OAMC) to support streaming data with reduced data dependency. By engineering the knowledge base and introducing new knowledge features, the learning pattern of the model is improved for data arriving in pieces. OAMC improves accuracy, measured as topic coherence, by 7% for streaming data while reducing the processing cost by half. PMID:27195004

  18. Models and theories of prescribing decisions: A review and suggested a new model.

    PubMed

    Murshid, Mohsen Ali; Mohaidin, Zurina

    2017-01-01

    To date, research on the prescribing decisions of physicians lacks sound theoretical foundations. In fact, drug prescribing by doctors is a complex phenomenon influenced by various factors. Most of the existing studies in the area of drug prescription explain the physician's decision-making process via an exploratory approach rather than a theoretical one. Therefore, this review attempts to suggest a conceptual model that explains the theoretical linkages between marketing efforts, the patient, the pharmacist, and the physician's decision to prescribe drugs. The paper follows an inclusive review approach and applies previous theoretical models of prescribing behaviour to identify the relational factors. More specifically, it identifies and uses several valuable perspectives, such as persuasion theory (the elaboration likelihood model), the stimuli-response marketing model, agency theory, the theory of planned behaviour, and social power theory, in developing an innovative conceptual paradigm. Based on the combination of existing methods and previous models, this paper suggests a new conceptual model of the physician decision-making process. This model has the potential for use in further research.

  19. Communication: Introducing prescribed biases in out-of-equilibrium Markov models

    NASA Astrophysics Data System (ADS)

    Dixit, Purushottam D.

    2018-03-01

    Markov models are often used in modeling complex out-of-equilibrium chemical and biochemical systems. However, their predictions often do not agree with experiments. We need a systematic framework to update existing Markov models to make them consistent with constraints derived from experiments. Here, we present a framework based on the principle of maximum relative path entropy (minimum Kullback-Leibler divergence) to update Markov models using stationary state and dynamical trajectory-based constraints. We illustrate the framework using a biochemical model network of growth factor-based signaling. We also show how to find the closest detailed balanced Markov model to a given Markov model. Further applications and generalizations are discussed.
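
    In maximum-caliber treatments, a single trajectory-based constraint typically leads to an exponential tilting of the prior transition matrix followed by a spectral renormalization. The sketch below implements that construction for a hypothetical 3-state chain; the prior, the observable, and the multiplier value are illustrative, and a full treatment would tune the multiplier until the constraint is met.

    ```python
    # Sketch of a maximum-relative-path-entropy update for one dynamical
    # constraint, using the standard exponential-tilting construction.
    import numpy as np

    p = np.array([[0.8, 0.2, 0.0],
                  [0.1, 0.7, 0.2],
                  [0.0, 0.3, 0.7]])         # prior transition matrix (illustrative)
    s = np.array([[0.0, 1.0, 0.0],
                  [1.0, 0.0, 1.0],
                  [0.0, 1.0, 0.0]])         # per-transition observable (illustrative)
    gamma = 0.5                             # Lagrange multiplier (fixed here)

    W = p * np.exp(gamma * s)               # tilted (non-stochastic) matrix
    lam_all, vecs = np.linalg.eig(W)
    k = np.argmax(lam_all.real)
    lam, u = lam_all[k].real, np.abs(vecs[:, k].real)  # Perron eigenpair

    # Posterior transitions: minimally perturbed from p, consistent with the bias
    p_post = W * u[None, :] / (lam * u[:, None])
    print(p_post.sum(axis=1))               # rows sum to 1 by construction
    ```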

  20. Sequence modelling and an extensible data model for genomic database

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Peter Wei-Der

    1992-01-01

    The Human Genome Project (HGP) plans to sequence the human genome by the beginning of the next century. It will generate DNA sequences of more than 10 billion bases and complex marker sequences (maps) of more than 100 million markers. All of this information will be stored in database management systems (DBMSs). However, existing data models do not have the abstraction mechanism for modelling sequences, and existing DBMSs do not have operations for complex sequences. This work addresses the problem of sequence modelling in the context of the HGP, and the more general problem of an extensible object data model that can incorporate the sequence model as well as existing and future data constructs and operators. First, we proposed a general sequence model that is application and implementation independent. This model is used to capture the sequence information found in the HGP at the conceptual level. In addition, abstract and biological sequence operators are defined for manipulating the modelled sequences. Second, we combined many features of semantic and object-oriented data models into an extensible framework, which we called the "Extensible Object Model", to address the need for a modelling framework that incorporates the sequence data model with other types of data constructs and operators. This framework is based on the conceptual separation between constructors and constraints. We then used this modelling framework to integrate the constructs for the conceptual sequence model. The Extensible Object Model is also defined with a graphical representation, which is useful as a tool for database designers. Finally, we defined a query language to support this model and implemented the query processor to demonstrate the feasibility of the extensible framework and the usefulness of the conceptual sequence model.

  2. Modeling fuels and fire effects in 3D: Model description and applications

    Treesearch

    Francois Pimont; Russell Parsons; Eric Rigolot; Francois de Coligny; Jean-Luc Dupuy; Philippe Dreyfus; Rodman R. Linn

    2016-01-01

    Scientists and managers critically need ways to assess how fuel treatments alter fire behavior, yet few tools currently exist for this purpose. We present a spatially explicit fuel-modeling system, FuelManager, which models fuels, vegetation growth, fire behavior (using a physics-based model, FIRETEC), and fire effects. FuelManager's flexible approach facilitates...

  3. A modeling framework for evaluating streambank stabilization practices for reach-scale sediment reduction

    USDA-ARS?s Scientific Manuscript database

    Streambank stabilization techniques are often implemented to reduce sediment loads from unstable streambanks. Process-based models can predict sediment yields with stabilization scenarios prior to implementation. However, a framework does not exist on how to effectively utilize these models to evalu...

  4. HYDROLOGIC MODEL UNCERTAINTY ASSOCIATED WITH SIMULATING FUTURE LAND-COVER/USE SCENARIOS: A RETROSPECTIVE ANALYSIS

    EPA Science Inventory

    GIS-based hydrologic modeling offers a convenient means of assessing the impacts associated with land-cover/use change for environmental planning efforts. Alternative future scenarios can be used as input to hydrologic models and compared with existing conditions to evaluate pot...

  5. Analysis of vegetation effect on waves using a vertical 2-D RANS model

    USDA-ARS?s Scientific Manuscript database

    A vertical two-dimensional (2-D) model has been applied in the simulation of wave propagation through vegetated water bodies. The model is based on an existing model, SOLA-VOF, which solves the Reynolds-Averaged Navier-Stokes (RANS) equations with the finite difference method on a staggered rectangula...

  6. Developing the next generation of forest ecosystem models

    Treesearch

    Christopher R. Schwalm; Alan R. Ek

    2002-01-01

    Forest ecology and management are model-rich areas for research. Models are often cast as either empirical or mechanistic. With evolving climate change, hybrid models gain new relevance because of their ability to integrate existing mechanistic knowledge with empiricism based on causal thinking. The utility of hybrid platforms results in the combination of...

  7. Password-Only Authenticated Three-Party Key Exchange with Provable Security in the Standard Model

    PubMed Central

    Nam, Junghyun; Kim, Junghwan; Kang, Hyun-Kyu; Kim, Jinsoo; Paik, Juryon

    2014-01-01

    Protocols for password-only authenticated key exchange (PAKE) in the three-party setting allow two clients registered with the same authentication server to derive a common secret key from their individual password shared with the server. Existing three-party PAKE protocols were proven secure under the assumption of the existence of random oracles or in a model that does not consider insider attacks. Therefore, these protocols may turn out to be insecure when the random oracle is instantiated with a particular hash function or an insider attack is mounted against the partner client. The contribution of this paper is to present the first three-party PAKE protocol whose security is proven without any idealized assumptions in a model that captures insider attacks. The proof model we use is a variant of the indistinguishability-based model of Bellare, Pointcheval, and Rogaway (2000), which is one of the most widely accepted models for security analysis of password-based key exchange protocols. We demonstrated that our protocol achieves not only the typical indistinguishability-based security of session keys but also the password security against undetectable online dictionary attacks. PMID:24977229

  8. GDTN: Genome-Based Delay Tolerant Network Formation in Heterogeneous 5G Using Inter-UA Collaboration.

    PubMed

    You, Ilsun; Sharma, Vishal; Atiquzzaman, Mohammed; Choo, Kim-Kwang Raymond

    2016-01-01

    With a more Internet-savvy and sophisticated user base, there are growing demands for interactive applications and services. However, it is a challenge for existing radio access networks (e.g., 3G and 4G) to cope with increasingly demanding requirements such as higher data rates and wider coverage areas. One potential solution is the inter-collaborative deployment of multiple radio devices in a 5G setting, designed to meet exacting user demands and facilitate the high data rate requirements of the underlying networks. These heterogeneous 5G networks can readily resolve the data rate and coverage challenges. Networks established using the hybridization of existing networks have diverse military and civilian applications. However, such networks have inherent limitations, including irregular breakdowns, node failures, and halts during high-speed transmissions. In recent years, there have been attempts to integrate heterogeneous 5G networks with existing ad hoc networks to provide a robust solution for delay-tolerant transmissions in the form of packet-switched networks. However, continuous connectivity is still required in these networks in order to efficiently regulate the flow and allow the formation of a robust network. Therefore, in this paper, we present a novel network formation consisting of nodes from different networks maneuvered by Unmanned Aircraft (UA). The proposed model utilizes features of a biological aspect of genomes and forms a delay tolerant network with existing network models, which allows us to provide continuous and robust connectivity. We then demonstrate that the proposed network model achieves efficient data delivery, lower overheads, and shorter delays with a high convergence rate in comparison to existing approaches, based on evaluations in both a real-time testbed and a simulation environment.

  9. GDTN: Genome-Based Delay Tolerant Network Formation in Heterogeneous 5G Using Inter-UA Collaboration

    PubMed Central

    2016-01-01

    With a more Internet-savvy and sophisticated user base, there are growing demands for interactive applications and services. However, it is a challenge for existing radio access networks (e.g., 3G and 4G) to cope with increasingly demanding requirements such as higher data rates and wider coverage areas. One potential solution is the inter-collaborative deployment of multiple radio devices in a 5G setting, designed to meet exacting user demands and facilitate the high data rate requirements of the underlying networks. These heterogeneous 5G networks can readily resolve the data rate and coverage challenges. Networks established using the hybridization of existing networks have diverse military and civilian applications. However, such networks have inherent limitations, including irregular breakdowns, node failures, and halts during high-speed transmissions. In recent years, there have been attempts to integrate heterogeneous 5G networks with existing ad hoc networks to provide a robust solution for delay-tolerant transmissions in the form of packet-switched networks. However, continuous connectivity is still required in these networks in order to efficiently regulate the flow and allow the formation of a robust network. Therefore, in this paper, we present a novel network formation consisting of nodes from different networks maneuvered by Unmanned Aircraft (UA). The proposed model utilizes features of a biological aspect of genomes and forms a delay tolerant network with existing network models, which allows us to provide continuous and robust connectivity. We then demonstrate that the proposed network model achieves efficient data delivery, lower overheads, and shorter delays with a high convergence rate in comparison to existing approaches, based on evaluations in both a real-time testbed and a simulation environment. PMID:27973618

  10. Implementation of the dynamical system of the deposit and loan growth based on the Lotka-Volterra model and the improved model

    NASA Astrophysics Data System (ADS)

    Fadhlurrahman, Akmal; Sumarti, Novriana

    2016-04-01

    The Lotka-Volterra model is a very popular mathematical model based on the ecological relationship between a predator, an organism that eats another organism, and its prey, the organism the predator eats. Predator and prey evolve together: the prey is part of the predator's environment, and the existence of the predator depends on the existence of the prey. As a dynamical system, this model can generate limit cycles, an interesting type of equilibrium that can arise in systems of two or more dimensions. In [1,2], a dynamical system for deposit and loan volumes based on the Lotka-Volterra model was developed. In this paper, we improve the definition of the parameters in the model and then apply it to banking data from January 2003 to December 2014, covering four types of banks. The data are represented in the form of returns in order to obtain data in a periodical-like form. The results show periodicity in the deposit and loan growth data, in line with [3], which suggests a positive correlation between loan growth and deposit growth, and vice versa.
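
    For readers unfamiliar with the underlying dynamics, the sketch below integrates the classical Lotka-Volterra equations, whose closed orbits are the kind of cyclic behavior the deposit-loan system inherits. The mapping of deposits and loans onto the two populations and all parameter values are illustrative assumptions, not the paper's calibrated model.

    ```python
    # Minimal sketch of the classical Lotka-Volterra dynamics used as the
    # template for the deposit-loan system; parameters are illustrative only.
    import numpy as np
    from scipy.integrate import solve_ivp

    def lotka_volterra(t, z, a, b, c, d):
        x, y = z                        # x: prey (deposits), y: predator (loans)
        return [a * x - b * x * y, -c * y + d * x * y]

    sol = solve_ivp(lotka_volterra, (0.0, 50.0), [2.0, 1.0],
                    args=(1.0, 0.4, 1.0, 0.2), dense_output=True)
    t = np.linspace(0, 50, 500)
    x, y = sol.sol(t)                   # closed orbits around the equilibrium
    ```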

  11. Exploring international clinical education in US-based programs: identifying common practices and modifying an existing conceptual model of international service-learning.

    PubMed

    Pechak, Celia M; Black, Jill D

    2014-02-01

    Increasingly, physical therapist students complete part of their clinical training outside of their home country. This trend is understudied. The purposes of this study were to: (1) explore, in depth, various international clinical education (ICE) programs; and (2) determine whether the Conceptual Model of Optimal International Service-Learning (ISL) could be applied or adapted to represent ICE. Qualitative content analysis was used to analyze ICE programs and consider modification of an existing ISL conceptual model for ICE. Fifteen faculty in the United States currently involved in ICE were interviewed. The interview transcriptions were systematically analyzed by two researchers. Three models of ICE practice emerged: (1) a traditional clinical education model, where local clinical instructors (CIs) focus on the development of clinical skills; (2) a global health model, where US-based CIs provide the supervision in the international setting and learning outcomes emphasize global health and cultural competency; and (3) an ICE/ISL hybrid, where US-based CIs supervise the students and the focus includes community service. Additionally, the data supported revising the ISL model's essential core conditions, components, and consequences for ICE. The ICE conceptual model may provide a useful framework for future ICE program development and research.

  12. Gas Hydrate Petroleum System Modeling in western Nankai Trough Area

    NASA Astrophysics Data System (ADS)

    Tanaka, M.; Aung, T. T.; Fujii, T.; Wada, N.; Komatsu, Y.

    2017-12-01

    Since 2003, we have been constructing gas hydrate (GH) petroleum system models covering the eastern Nankai Trough, Japan, and the resource potential estimated from the regional model shows a good match with values derived from seismic and log data. In this study, we applied this method to explore the GH potential of the study area. Here, GH prospects have been identified with the aid of the bottom simulating reflector (BSR) and the presence of high-velocity anomalies above the BSR, interpreted from 3D migration seismic and high-density velocity cubes. To understand the pathways of biogenic methane from source to GH prospects, 1D, 2D, and 3D GH petroleum system models were built and investigated. The study area comprises lower Miocene to Pleistocene, deep to shallow marine sedimentary successions, with the Pliocene and Pleistocene layers overlying the basement; the BSR was interpreted in the Pliocene and Pleistocene layers. Based on six sequence boundaries interpreted from the 3D migration seismic and velocity data, a 3D depth framework model was constructed and populated with a conceptual submarine-fan depositional facies model derived from seismic facies analysis and existing geological reports. 1D models were created to analyze lithology sensitivity against temperature and vitrinite data from an exploratory well drilled in the vicinity of the study area, and the resulting parameters were applied in the 2D and 3D modeling and simulation. An existing report on the exploratory well indicates that gas of thermogenic origin is also considered to be present; for this reason, simulation scenarios including source formations for both biogenic and thermogenic reaction models were investigated. Simulation results show that the lower boundary of the GH saturation zone at pseudo-wells is reproduced to within a few tens of meters of the interpreted BSR. Sensitivity analysis indicates that the simulated temperature is controlled by the peak-generation-temperature models and geochemical parameters. Progressive folding and updipping layers, including paleostructure, can effectively assist the upward migration of biogenic gas. The mixed biogenic-thermogenic model shows that only the kitchen center has the potential to generate thermogenic hydrocarbons. The prospects identified from seismic interpretation are consistent with the high GH saturation areas from the 3D modeling results.

  13. Simple algorithms for remote determination of mineral abundances and particle sizes from reflectance spectra

    NASA Technical Reports Server (NTRS)

    Johnson, Paul E.; Smith, Milton O.; Adams, John B.

    1992-01-01

    Algorithms were developed, based on Hapke's (1981) equations, for remote determination of mineral abundances and particle sizes from reflectance spectra. In this method, spectra are modeled as a function of end-member abundances and illumination/viewing geometry. The method was tested on a laboratory data set. It is emphasized that, although more sophisticated models exist, the present algorithms are particularly suited for remotely sensed data, where little opportunity exists to independently measure reflectance versus particle size and phase function.
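
    The abundance-inversion step can be illustrated with a linear mixing sketch: in Hapke's treatment, mixtures are approximately linear in single-scattering-albedo space, and a nonnegative least squares solve then recovers the mixing coefficients. The end-member "spectra" below are synthetic stand-ins, and the sketch skips the reflectance-to-albedo conversion for brevity.

    ```python
    # Hedged sketch of abundance estimation by nonnegative least squares on a
    # linear end-member mixing model; spectra here are synthetic placeholders.
    import numpy as np
    from scipy.optimize import nnls

    wavelengths = np.linspace(0.4, 2.5, 50)
    endmembers = np.column_stack([
        0.2 + 0.1 * wavelengths,                 # mineral A (illustrative)
        0.6 - 0.1 * wavelengths,                 # mineral B (illustrative)
        0.3 + 0.05 * np.sin(4 * wavelengths),    # mineral C (illustrative)
    ])

    true_abundances = np.array([0.5, 0.3, 0.2])
    mixed = endmembers @ true_abundances         # synthetic mixed spectrum

    abundances, residual = nnls(endmembers, mixed)
    abundances /= abundances.sum()               # renormalize to sum to one
    print(abundances.round(3))                   # ~[0.5, 0.3, 0.2]
    ```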

  14. Residual Risk Assessments

    EPA Science Inventory

    Each source category previously subjected to a technology-based standard will be examined to determine if health or ecological risks are significant enough to warrant further regulation. These assessments utilize existing models and databases to examine the multi-media and multi-...

  15. Implementation of partnership management model of SMK (Vocational High School) with existing industries in mechanical engineering expertise in Central Java

    NASA Astrophysics Data System (ADS)

    Sumbodo, Wirawan; Pardjono, Samsudi, Rahadjo, Winarno Dwi

    2018-03-01

    This study aims to determine the existing conditions of the implementation of a partnership management model between SMK and industry in the mechanical engineering expertise program in Central Java. The method used is descriptive analysis. The results show that the implementation of the existing SMK-industry partnership management model produces work-ready graduates at a rate of 62.5%, which belongs to the low category, although the partnership program between SMK and industry is carried out well, with an average score of 3.17. As many as 37.5% of SMK graduates of the Mechanical Engineering Expertise Program choose to continue their studies or become entrepreneurs. It is expected that the SMK-industry partnership model can be developed into a reference for government policy in developing SMK so that it produces graduates who are ready to work according to the needs of partner industries.

  16. Cellular-based modeling of oscillatory dynamics in brain networks.

    PubMed

    Skinner, Frances K

    2012-08-01

    Oscillatory population activities have long been known to occur in our brains during different behavioral states. We know that many different cell types exist and that they contribute in distinct ways to the generation of these activities. I review recent papers that involve cellular-based models of brain networks, most of which include theta, gamma and sharp wave-ripple activities. To help organize the modeling work, I present it from the perspective of three different types of cellular-based modeling: 'Generic', 'Biophysical' and 'Linking'. Cellular-based modeling is taken to encompass the four features of experiment, model development, theory/analyses, and model usage/computation. The three modeling types are shown to include these features and their interactions in different ways.

  17. An Exact Model-Based Method for Near-Field Sources Localization with Bistatic MIMO System.

    PubMed

    Singh, Parth Raj; Wang, Yide; Chargé, Pascal

    2017-03-30

    In this paper, we propose an exact model-based method for near-field source localization with a bistatic multiple-input multiple-output (MIMO) radar system and compare it with an approximated model-based method. The aim of this paper is to propose an efficient way to use the exact model of the received signals of near-field sources in order to eliminate the systematic error introduced by the use of an approximated model in most existing near-field source localization techniques. The proposed method uses parallel factor (PARAFAC) decomposition to deal with the exact model. Thanks to the exact model, the proposed method has better precision and resolution than the compared approximated model-based method. The simulation results show the performance of the proposed method.

  18. An under-designed RC frame: Seismic assessment through displacement based approach and possible refurbishment with FRP strips and RC jacketing

    NASA Astrophysics Data System (ADS)

    Valente, Marco; Milani, Gabriele

    2017-07-01

    Many existing reinforced concrete buildings in Southern Europe were built (and hence designed) before the introduction of displacement-based design in national seismic codes, and they are highly vulnerable to seismic actions. In such a situation, simplified methodologies for the seismic assessment and retrofitting of existing structures are required. In this study, a displacement-based procedure using non-linear static analyses is applied to a four-story existing RC frame. The aim is to estimate its overall structural inadequacy as well as the effectiveness of a specific retrofitting intervention by means of GFRP laminates and RC jacketing. Accurate numerical models are developed within a displacement-based approach to reproduce the seismic response of the RC frame in the original configuration and after strengthening.

  19. Communication: Electron ionization of DNA bases.

    PubMed

    Rahman, M A; Krishnakumar, E

    2016-04-28

    No reliable experimental data exist for the partial and total electron ionization cross sections of DNA bases, which are crucial for modeling radiation damage in the genetic material of living cells. We have measured a complete set of absolute partial electron ionization cross sections up to 500 eV for DNA bases for the first time by using the relative flow technique. These partial cross sections are summed to obtain total ion cross sections for all four bases and are compared with the existing theoretical calculations and the only existing set of measured absolute cross sections. Our measurements clearly resolve the existing discrepancy between the theoretical and experimental results, thereby providing for the first time reliable numbers for the partial and total ion cross sections of these molecules. The results of the fragmentation analysis of adenine support the theory of its formation in space.

  20. Courses of action for effects based operations using evolutionary algorithms

    NASA Astrophysics Data System (ADS)

    Haider, Sajjad; Levis, Alexander H.

    2006-05-01

    This paper presents an Evolutionary Algorithms (EAs) based approach to identifying effective courses of action (COAs) in Effects Based Operations. The approach uses Timed Influence Nets (TINs) as the underlying mathematical model to capture a dynamic, uncertain situation. TINs provide a concise graph-theoretic probabilistic approach to specifying the cause-and-effect relationships that exist among the variables of interest (actions, desired effects, and other uncertain events) in a problem domain. The purpose of building these TIN models is to identify and analyze several alternative courses of action. The current practice is to use trial-and-error techniques, which are not only labor intensive but also produce sub-optimal results and cannot model constraints among actionable events. The EA-based approach presented in this paper aims to overcome these limitations. The approach generates multiple COAs that are close in terms of achieving the desired effect; the purpose of generating multiple COAs is to give a decision maker several alternatives. Moreover, the alternative COAs can be generalized based on the relationships that exist among the actions and their execution timings. The approach also allows a system analyst to capture certain types of constraints among actionable events.
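
    The sketch below shows the evolutionary core of such an approach on a toy problem: a COA is a binary vector over candidate actions, selection and one-point crossover evolve a population, and the top few individuals are returned as alternative COAs. The fitness function is a linear stand-in; a real implementation would score each COA by propagating the Timed Influence Net, and action timings and constraints are omitted here.

    ```python
    # Toy sketch of an EA over courses of action; the fitness is a stand-in
    # for propagating a Timed Influence Net.
    import random

    ACTIONS = 8
    random.seed(1)
    WEIGHTS = [random.uniform(-1, 1) for _ in range(ACTIONS)]  # hypothetical scores

    def fitness(coa):
        return sum(w for bit, w in zip(coa, WEIGHTS) if bit)

    def evolve(pop_size=30, generations=40, mutation=0.1):
        pop = [[random.randint(0, 1) for _ in range(ACTIONS)] for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=fitness, reverse=True)
            parents = pop[: pop_size // 2]                     # truncation selection
            children = []
            for _ in range(pop_size - len(parents)):
                a, b = random.sample(parents, 2)
                cut = random.randrange(1, ACTIONS)
                child = a[:cut] + b[cut:]                      # one-point crossover
                children.append([bit ^ (random.random() < mutation) for bit in child])
            pop = parents + children
        return sorted(pop, key=fitness, reverse=True)[:3]      # several near-best COAs

    for coa in evolve():
        print(coa, round(fitness(coa), 3))
    ```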

  1. Interval-valued intuitionistic fuzzy matrix games based on Archimedean t-conorm and t-norm

    NASA Astrophysics Data System (ADS)

    Xia, Meimei

    2018-04-01

    Fuzzy game theory has been applied to many decision-making problems. The matrix game with interval-valued intuitionistic fuzzy numbers (IVIFNs) is investigated based on Archimedean t-conorms and t-norms. Existing matrix games with IVIFNs are all based on the Algebraic t-conorm and t-norm, which are special cases of Archimedean t-conorms and t-norms. In this paper, intuitionistic fuzzy aggregation operators based on Archimedean t-conorms and t-norms are employed to aggregate the payoffs of the players. To derive the solution of the matrix game with IVIFNs, several mathematical programming models are developed based on Archimedean t-conorms and t-norms. The proposed models can be transformed into a pair of primal-dual linear programming models, from which the solution of the matrix game with IVIFNs is obtained. It is proved that the theorems valid in the existing matrix games with IVIFNs remain true when the general aggregation operator is used in the proposed matrix game. The proposed method is an extension of existing ones and can provide more choices for players. An example is given to illustrate the validity and applicability of the proposed method.
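
    The linear programming reduction is easiest to see in the crisp special case. The sketch below solves an ordinary zero-sum matrix game for the row player's mixed strategy and the game value; the paper's fuzzy models reduce to primal-dual pairs of this same form once the IVIFN payoffs are aggregated. The payoff matrix is arbitrary example data.

    ```python
    # Crisp zero-sum matrix game solved as a linear program (primal side).
    import numpy as np
    from scipy.optimize import linprog

    A = np.array([[3.0, 1.0, 4.0],
                  [2.0, 3.0, 1.0]])        # payoff matrix for the row player

    m, n = A.shape
    # Variables: x_1..x_m (mixed strategy) and v (game value); maximize v.
    c = np.zeros(m + 1); c[-1] = -1.0      # linprog minimizes, so minimize -v
    # Constraints: v - sum_i x_i A_ij <= 0 for every column j
    A_ub = np.hstack([-A.T, np.ones((n, 1))])
    b_ub = np.zeros(n)
    A_eq = np.hstack([np.ones((1, m)), np.zeros((1, 1))])  # strategy sums to 1
    b_eq = np.array([1.0])
    bounds = [(0, None)] * m + [(None, None)]

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
    x, v = res.x[:m], res.x[-1]
    print("row strategy:", x.round(3), "game value:", round(v, 3))  # (0.4, 0.6), 2.2
    ```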

  2. Magneto-hydrodynamical model for plasma

    NASA Astrophysics Data System (ADS)

    Liu, Ruikuan; Yang, Jiayan

    2017-10-01

    Based on Newton's second law and the Maxwell equations for the electromagnetic field, we establish a new 3-D incompressible magneto-hydrodynamics model for the motion of plasma under the standard Coulomb gauge. Using the Galerkin method, we prove the existence of a global weak solution for this new 3-D model.

  3. Tourism Village Model Based on Local Indigenous: Case Study of Nongkosawit Tourism Village, Gunungpati, Semarang

    NASA Astrophysics Data System (ADS)

    Kurniasih; Nihayah, Dyah Maya; Sudibyo, Syafitri Amalia; Winda, Fajri Nur

    2018-02-01

    Officially, Nongkosawit Village has been a tourism village since 2012. However, the local community has not yet felt the economic impact because the tourism village model has been inappropriate. Therefore, this study aims to find the best model for the development of Nongkosawit Tourism Village. The research used the Analytical Hierarchy Process method. The results show that the tourism village model best suited to the local indigenous character of Nongkosawit Tourism Village was the culture-based tourism village, with a weight of 58%. It is therefore necessary to reorient the village from the nature-based model to the culture-based model by raising and exploring the existing culture through unique and distinctive tourism products.

  4. A comparison of viscoelastic damping models

    NASA Technical Reports Server (NTRS)

    Slater, Joseph C.; Belvin, W. Keith; Inman, Daniel J.

    1993-01-01

    Modern finite element methods (FEMs) enable the precise modeling of mass and stiffness properties in what were once overwhelmingly large and complex structures. These models allow the accurate determination of natural frequencies and mode shapes. However, adequate methods for modeling highly damped and strongly frequency-dependent structures did not exist until recently. The most commonly used method, Modal Strain Energy, does not correctly predict complex mode shapes, since it is based on the assumption that the mode shapes of a structure are real. Recently, many techniques have been developed which allow the modeling of frequency-dependent damping properties of materials in a finite element compatible form. Two of these methods, the Golla-Hughes-McTavish method and the Lesieutre-Mingori method, model the frequency-dependent effects by adding coordinates to the existing system, thus maintaining the linearity of the model. The third model, proposed by Bagley and Torvik, is based on the fractional calculus method and requires fewer empirical parameters to model the frequency dependence, at the expense of the linearity of the governing equations. This work examines the Modal Strain Energy, Golla-Hughes-McTavish, and Bagley and Torvik models and compares them to determine the plausibility of using them for modeling viscoelastic damping in large structures.

  5. A Kernel Embedding-Based Approach for Nonstationary Causal Model Inference.

    PubMed

    Hu, Shoubo; Chen, Zhitang; Chan, Laiwan

    2018-05-01

    Although nonstationary data are more common in the real world, most existing causal discovery methods do not take nonstationarity into consideration. In this letter, we propose a kernel embedding-based approach, ENCI, for nonstationary causal model inference, where data are collected from multiple domains with varying distributions. In ENCI, we transform the complicated relation of a cause-effect pair into a linear model of variables whose observations correspond to the kernel embeddings of the cause and effect distributions in different domains. In this way, we are able to estimate the causal direction by exploiting the causal asymmetry of the transformed linear model. Furthermore, we extend ENCI to causal graph discovery for multiple variables by transforming the relations among them into a linear non-Gaussian acyclic model. We show that by exploiting the nonstationarity of distributions, both cause-effect pairs and two kinds of causal graphs are identifiable under mild conditions. Experiments on synthetic and real-world data are conducted to justify the efficacy of ENCI over major existing methods.

  6. Failure Models and Criteria for FRP Under In-Plane or Three-Dimensional Stress States Including Shear Non-Linearity

    NASA Technical Reports Server (NTRS)

    Pinho, Silvestre T.; Davila, C. G.; Camanho, P. P.; Iannucci, L.; Robinson, P.

    2005-01-01

    A set of three-dimensional failure criteria for laminated fiber-reinforced composites, denoted LaRC04, is proposed. The criteria are based on physical models for each failure mode and take into consideration non-linear matrix shear behaviour. The model for matrix compressive failure is based on the Mohr-Coulomb criterion and it predicts the fracture angle. Fiber kinking is triggered by an initial fiber misalignment angle and by the rotation of the fibers during compressive loading. The plane of fiber kinking is predicted by the model. LaRC04 consists of 6 expressions that can be used directly for design purposes. Several applications involving a broad range of load combinations are presented and compared to experimental data and other existing criteria. Predictions using LaRC04 correlate well with the experimental data, arguably better than most existing criteria. The good correlation seems to be attributable to the physical soundness of the underlying failure models.
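
    As a flavor of the Mohr-Coulomb-based ingredient, the sketch below evaluates a matrix-compression failure index over candidate fracture planes and reports the critical plane angle. The traction expressions follow the usual rotation of (sigma22, tau12) onto a plane inclined at angle alpha, but the strengths and friction coefficients are invented for illustration; this is not the LaRC04 formulation itself, only the underlying idea.

    ```python
    # Hedged sketch of a Mohr-Coulomb-type matrix-compression check: maximize a
    # failure index over candidate fracture-plane angles. Constants are
    # illustrative, not calibrated material data.
    import numpy as np

    def matrix_compression_FI(sigma22, tau12, S_T=50.0, S_L=80.0, eta_T=0.3, eta_L=0.2):
        alpha = np.radians(np.linspace(0.0, 90.0, 901))   # candidate fracture angles
        sig_n = sigma22 * np.cos(alpha) ** 2              # normal traction on plane
        tau_T = -sigma22 * np.sin(alpha) * np.cos(alpha)  # transverse shear traction
        tau_L = tau12 * np.cos(alpha)                     # longitudinal shear traction
        # Friction raises the effective shear strengths under compression (sig_n < 0)
        fi = (tau_T / (S_T - eta_T * sig_n)) ** 2 + (tau_L / (S_L - eta_L * sig_n)) ** 2
        k = np.argmax(fi)
        return fi[k], np.degrees(alpha[k])                # index and predicted angle

    fi, angle = matrix_compression_FI(sigma22=-120.0, tau12=20.0)
    print(f"failure index = {fi:.2f} at fracture angle ~{angle:.0f} deg")
    ```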

  7. An investigation into the vertical axis control power requirements for landing VTOL type aircraft onboard nonaviation ships in various sea states

    NASA Technical Reports Server (NTRS)

    Stevens, M. E.; Roskam, J.

    1985-01-01

    The problem of determining the vertical axis control requirements for landing a VTOL aircraft on a moving ship deck in various sea states is examined. Both a fixed-base piloted simulation and a nonpiloted simulation were used to determine the landing performance as influenced by thrust-to-weight ratio, vertical damping, and engine lags. The piloted simulation was run using a fixed-base simulator at Ames Research Center. Simplified versions of an existing AV-8A Harrier model and an existing head-up display format were used. The ship model used was that of a DD963 class destroyer. Simplified linear models of the pilot, aircraft, ship motion, and ship air-wake turbulence were developed for the nonpiloted simulation. A unique aspect of the nonpiloted simulation was the development of a model of the piloting strategy used for shipboard landing. This model was refined during the piloted simulation until it provided a reasonably good representation of observed pilot behavior.

  8. A hierarchical fire frequency model to simulate temporal patterns of fire regimes in LANDIS

    Treesearch

    Jian Yang; Hong S. He; Eric J. Gustafson

    2004-01-01

    Fire disturbance has important ecological effects in many forest landscapes. Existing statistically based approaches can be used to examine the effects of a fire regime on forest landscape dynamics. Most examples of statistically based fire models divide a fire occurrence into two stages--fire ignition and fire initiation. However, the exponential and Weibull fire-...

  9. District Allocation of Human Resources Utilizing the Evidence Based Model: A Study of One High Achieving School District in Southern California

    ERIC Educational Resources Information Center

    Lane, Amber Marie

    2013-01-01

    This study applies the Gap Analysis Framework to understand the gaps that exist in the human resource allocation of one Southern California school district. Once identified, gaps are closed with the reallocation of human resources according to the Evidence-Based Model, requiring the re-purposing of core classroom teachers, specialists, special…

  10. Potential for Woody Bioenergy Crops Grown on Marginal Lands in the US Midwest to Reduce Carbon Emissions

    NASA Astrophysics Data System (ADS)

    Sahajpal, R.; Hurtt, G. C.; Fisk, J. P.; Izaurralde, R. C.; Zhang, X.

    2012-12-01

    While cellulosic biofuels are widely considered to be a low carbon energy source for the future, a comprehensive assessment of the environmental sustainability of existing and future biofuel systems is needed to assess their utility in meeting US energy and food needs without exacerbating environmental harm. To assess the carbon emission reduction potential of cellulosic biofuels, we need to identify lands that are initially not storing large quantities of carbon in soil and vegetation but are capable of producing abundant biomass with limited management inputs, and accurately model forest production rates and associated input requirements. Here we present modeled results for carbon emission reduction potential and cellulosic ethanol production of woody bioenergy crops replacing existing native prairie vegetation grown on marginal lands in the US Midwest. Marginal lands are selected based on soil properties describing use limitation, and are extracted from the SSURGO (Soil Survey Geographic) database. Yield estimates for existing native prairie vegetation on marginal lands modeled using the process-based field-scale model EPIC (Environmental Policy Integrated Climate) amount to ~ 6.7±2.0 Mg ha-1. To model woody bioenergy crops, the individual-based terrestrial ecosystem model ED (Ecosystem Demography) is initialized with the soil organic carbon stocks estimated at the end of the EPIC simulation. Four woody bioenergy crops: willow, southern pine, eucalyptus and poplar are parameterized in ED. Sensitivity analysis of model parameters and drivers is conducted to explore the range of carbon emission reduction possible with variation in woody bioenergy crop types, spatial and temporal resolution. We hypothesize that growing cellulosic crops on these marginal lands can provide significant water quality, biodiversity and GHG emissions mitigation benefits, without accruing additional carbon costs from the displacement of food and feed production.

  11. Computational fluid dynamics-habitat suitability index (CFD-HSI) modelling as an exploratory tool for assessing passability of riverine migratory challenge zones for fish

    USGS Publications Warehouse

    Haro, Alexander J.; Chelminski, Michael; Dudley, Robert W.

    2015-01-01

    We developed two-dimensional computational fluid dynamics-habitat suitability index (CFD-HSI) models to identify and qualitatively assess potential zones of shallow water depth and high water velocity that may present passage challenges for five major anadromous fish species in a 2.63-km reach of the main stem Penobscot River, Maine, as a result of a dam removal downstream of the reach. Suitability parameters were based on the distributions of fish lengths and body depths, transformed to cruising, maximum sustained, and sprint swimming speeds. Zones of potential depth and velocity challenges were calculated from the hydraulic models; the ability of fish to pass a challenge zone was based on the percent of the river channel that the contiguous zone spanned and its maximum along-current length. Three river flows (low: 99.1 m3 sec-1; normal: 344.9 m3 sec-1; and high: 792.9 m3 sec-1) were modelled to simulate existing hydraulic conditions and hydraulic conditions following removal of a dam at the downstream boundary of the reach. Potential depth challenge zones were nonexistent in all low-flow simulations of existing conditions for deeper-bodied fishes. Increasing flows under existing conditions, and removal of the dam under all flow conditions, increased the number and size of potential velocity challenge zones, with the effects being more pronounced for smaller species. The two-dimensional CFD-HSI model has utility in demonstrating gross effects of flow and hydraulic alteration, but it may not be as precise a predictive tool as a three-dimensional model. Passability of the potential challenge zones cannot be precisely quantified for two-dimensional or three-dimensional models due to untested assumptions and incomplete data on fish swimming performance and behaviours.
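
    The zone-flagging logic can be sketched compactly: threshold the modeled depth and velocity grids against species-specific limits, then measure how much of the channel width the flagged cells span at each along-current station. The grids and thresholds below are synthetic placeholders, not the Penobscot model output.

    ```python
    # Hedged sketch of challenge-zone flagging on a model grid; all values are
    # synthetic stand-ins for the CFD output and species swimming limits.
    import numpy as np

    rng = np.random.default_rng(7)
    depth = rng.uniform(0.1, 3.0, size=(40, 200))     # depth (m), rows = across-channel
    velocity = rng.uniform(0.0, 3.5, size=(40, 200))  # velocity (m/s), cols = along-current

    MIN_DEPTH = 0.5       # shallowest passable depth for the species (illustrative)
    SPRINT_SPEED = 2.5    # sprint swimming speed the fish cannot exceed (illustrative)

    challenge = (depth < MIN_DEPTH) | (velocity > SPRINT_SPEED)

    # Fraction of the channel width flagged at each along-current station
    width_fraction = challenge.mean(axis=0)
    blocked = width_fraction > 0.5                    # zone spans over half the channel
    print(f"{blocked.sum()} of {challenge.shape[1]} stations flagged as challenge zones")
    ```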

  12. Study of subgrid-scale velocity models for reacting and nonreacting flows

    NASA Astrophysics Data System (ADS)

    Langella, I.; Doan, N. A. K.; Swaminathan, N.; Pope, S. B.

    2018-05-01

    A study is conducted to identify the advantages and limitations of existing large-eddy simulation (LES) closures for the subgrid-scale (SGS) kinetic energy using a database of direct numerical simulations (DNS). The analysis is conducted for both reacting and nonreacting flows, different turbulence conditions, and various filter sizes. A model based on the dissipation and diffusion of momentum (the LD-D model) is proposed in this paper based on the observed behavior of four existing models; it shows the best overall agreement with the DNS statistics. Two main investigations are conducted for both reacting and nonreacting flows: (i) an investigation of the robustness of the model constants, showing that commonly used constants lead to a severe underestimation of the SGS kinetic energy and highlighting their dependence on Reynolds number and filter size; and (ii) an investigation of the statistical behavior of the SGS closures, which suggests that the dissipation of momentum is the key parameter to be considered in such closures and that the dilatation effect is important and must be captured correctly in reacting flows. Additional properties of SGS kinetic energy modeling are identified and discussed.

  13. The Structure of Psychopathology: Toward an Expanded Quantitative Empirical Model

    PubMed Central

    Wright, Aidan G.C.; Krueger, Robert F.; Hobbs, Megan J.; Markon, Kristian E.; Eaton, Nicholas R.; Slade, Tim

    2013-01-01

    There has been substantial recent interest in the development of a quantitative, empirically based model of psychopathology. However, the majority of pertinent research has focused on analyses of diagnoses as described in current official nosologies. This is a significant limitation because existing diagnostic categories are often heterogeneous. In the current research, we aimed to redress this limitation of the existing literature and to directly compare the fit of categorical, continuous, and hybrid (i.e., combined categorical and continuous) models of syndromes derived from indicators more fine-grained than diagnoses. We analyzed data from a large representative epidemiologic sample (the 2007 Australian National Survey of Mental Health and Wellbeing; N = 8,841). Continuous models provided the best fit for each syndrome we observed (Distress, Obsessive Compulsivity, Fear, Alcohol Problems, Drug Problems, and Psychotic Experiences). In addition, the best-fitting higher-order model grouped these syndromes into three broad spectra: Internalizing, Externalizing, and Psychotic Experiences. We discuss these results in terms of future efforts to refine the emerging empirically based, dimensional-spectrum model of psychopathology, and to use the model to frame psychopathology research more broadly. PMID:23067258

  14. 3D Numerical Modeling of the Propagation of Hydraulic Fracture at Its Intersection with Natural (Pre-existing) Fracture

    NASA Astrophysics Data System (ADS)

    Dehghan, Ali Naghi; Goshtasbi, Kamran; Ahangari, Kaveh; Jin, Yan; Bahmani, Aram

    2017-02-01

    A variety of 3D numerical models were developed based on hydraulic fracture experiments to simulate the propagation of hydraulic fracture at its intersection with a natural (pre-existing) fracture. Since the interaction between hydraulic and pre-existing fractures is a key condition causing complex fracture patterns, the extended finite element method was employed in ABAQUS software to simulate the problem. The propagation of hydraulic fracture in a fractured medium was modeled under two horizontal differential stresses (Δσ) of 5 and 10 MPa, considering different strike and dip angles of the pre-existing fracture. The energy release rate was calculated in the directions of the hydraulic and pre-existing fractures (G_frac/G_rock) at their intersection point to determine the fracture behavior. Opening and crossing were the two dominant fracture behaviors during the hydraulic and pre-existing fracture interaction at low and high differential stress conditions, respectively. The results of the numerical studies were compared with those of experimental models, showing a good agreement that validates the accuracy of the models. Besides the horizontal differential stress and the strike and dip angles of the natural (pre-existing) fracture, the key finding of this research was the significant effect of the energy release rate on the propagation behavior of the hydraulic fracture. This effect was more prominent under the influence of strike and dip angles, as well as differential stress. The obtained results can be used to predict and interpret the generation of complex hydraulic fracture patterns in field conditions.

  15. Complication Reducing Effect of the Information Technology-Based Diabetes Management System on Subjects with Type 2 Diabetes

    PubMed Central

    Cho, Jae-Hyoung; Lee, Jin-Hee; Oh, Jeong-Ah; Kang, Mi-Ja; Choi, Yoon-Hee; Kwon, Hyuk-Sang; Chang, Sang-Ah; Cha, Bong-Yun; Son, Ho-Young; Yoon, Kun-Ho

    2008-01-01

    Objective We introduced a new information technology-based diabetes management system, called the Internet-based glucose monitoring system (IBGMS), and demonstrated its favorable short-term and long-term effects. However, there has been no report so far on the clinical effects of such a system on the development of diabetic complications. This study simulated the complication-reducing effect of the IBGMS when given in addition to existing treatments in patients with type 2 diabetes. Research Design and Methods The CORE Diabetes Model, a peer-reviewed, published, validated computer simulation model, was used to project long-term clinical outcomes in type 2 diabetes patients receiving the IBGMS in addition to their existing treatment. The model combined standard Markov submodels to simulate the incidence and progression of diabetes-related complications. Results The addition of the IBGMS was associated with reductions in diabetic complications, mainly microangiopathic complications, including diabetic retinopathy, diabetic neuropathy, diabetic nephropathy, and diabetic foot ulcer. The IBGMS also delayed the development of all diabetic complications by more than 1 year. Conclusions This study demonstrated that the simulated IBGMS, compared to existing treatment alone, was associated with a reduction in diabetic complications, providing valuable evidence for its practical application. PMID:19885180
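
    The CORE Diabetes Model itself is proprietary, but the Markov-submodel mechanics it combines can be illustrated generically. The sketch below advances a cohort through a three-state, annual-cycle complication submodel; the states and transition probabilities are invented for illustration, not CORE parameters.

```python
# Sketch: a generic Markov cohort submodel of one diabetes complication
# (e.g., retinopathy), of the kind combined in models like CORE.
# All transition probabilities are illustrative assumptions.
import numpy as np

states = ["no complication", "complication", "dead"]
# Row -> column annual transition probabilities for one treatment arm.
P = np.array([
    [0.93, 0.05, 0.02],   # from "no complication"
    [0.00, 0.94, 0.06],   # from "complication" (assumed irreversible)
    [0.00, 0.00, 1.00],   # "dead" is absorbing
])
assert np.allclose(P.sum(axis=1), 1.0)

cohort = np.array([1.0, 0.0, 0.0])   # everyone starts complication-free
for year in range(1, 31):
    cohort = cohort @ P              # advance the cohort one annual cycle
    if year % 10 == 0:
        print(f"year {year}: " + ", ".join(
            f"{s}={frac:.3f}" for s, frac in zip(states, cohort)))
```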

  16. A methodology for modeling barrier island storm-impact scenarios

    USGS Publications Warehouse

    Mickey, Rangley C.; Long, Joseph W.; Plant, Nathaniel G.; Thompson, David M.; Dalyander, P. Soupy

    2017-02-16

    A methodology for developing a representative set of storm scenarios based on historical wave buoy and tide gauge data for a region at the Chandeleur Islands, Louisiana, was developed by the U.S. Geological Survey. The total water level was calculated for a 10-year period and analyzed against existing topographic data to identify when storm-induced wave action would affect island morphology. These events were categorized on the basis of the threshold of total water level and duration to create a set of storm scenarios that were simulated, using a high-fidelity, process-based, morphologic evolution model, on an idealized digital elevation model of the Chandeleur Islands. The simulated morphological changes resulting from these scenarios provide a range of impacts that can help coastal managers determine resiliency of proposed or existing coastal structures and identify vulnerable areas within those structures.
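
    The threshold-and-duration event extraction described above can be sketched as follows; the synthetic water-level series, dune-toe elevation, and impact bins are illustrative assumptions, not the study's values.

```python
# Sketch: extract storm events from an hourly total-water-level (TWL) series
# by threshold exceedance and duration, then bin them into scenarios.
import numpy as np

rng = np.random.default_rng(2)
hours = 24 * 365
# Synthetic hourly TWL: semi-diurnal tide plus a skewed storm residual (m).
twl = 0.8 + 0.5 * np.sin(2 * np.pi * np.arange(hours) / 12.42) \
          + rng.gamma(shape=1.5, scale=0.15, size=hours)

dune_toe = 1.9                      # illustrative dune-toe elevation (m)
idx = np.flatnonzero(twl > dune_toe)
events = []
if idx.size:
    breaks = np.flatnonzero(np.diff(idx) > 1)
    for run in np.split(idx, breaks + 1):           # contiguous exceedances
        events.append((run.size, twl[run].max()))   # (duration h, peak m)

for dur, peak in sorted(events, reverse=True)[:5]:
    category = "high" if dur >= 12 and peak > dune_toe + 0.3 else "low"
    print(f"duration={dur:3d} h  peak={peak:.2f} m  impact={category}")
```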

  17. Stratification established by peeling detrainment from gravity currents: laboratory experiments and models

    NASA Astrophysics Data System (ADS)

    Hogg, Charlie; Dalziel, Stuart; Huppert, Herbert; Imberger, Jorg

    2014-11-01

    Dense gravity currents feed fluid into confined basins in lakes, the oceans and many industrial applications. Existing models of the circulation and mixing in such basins are often based on the currents entraining ambient fluid. However, recent observations have suggested that uni-directional entrainment into a gravity current does not fully describe the mixing in such currents. Laboratory experiments were carried out which visualised peeling detrainment from the gravity current occurring when the ambient fluid was stratified. A theoretical model of the observed peeling detrainment was developed to predict the stratification in the basin. This new model gives a better approximation of the stratification observed in the experiments than the pre-existing entraining model. The model can now be developed such that it integrates into operational models of lakes.

  18. Towards a feminist empowerment model of forgiveness psychotherapy.

    PubMed

    McKay, Kevin M; Hill, Melanie S; Freedman, Suzanne R; Enright, Robert D

    2007-03-01

    In recent years Enright and Fitzgibbon's (2000) process model of forgiveness therapy has received substantial theoretical and empirical attention. However, both the process model of forgiveness therapy and the social-cognitive developmental model on which it is based have received criticism from feminist theorists. The current paper considers feminist criticisms of forgiveness therapy and uses a feminist lens to identify potential areas for growth. Specifically, Worell and Remer's (2003) model of synthesizing feminist ideals into existing theory was consulted, areas of bias within the forgiveness model of psychotherapy were identified, and strategies for restructuring areas of potential bias were introduced. Further, the authors consider unique aspects of forgiveness therapy that can potentially strengthen existing models of feminist therapy. (PsycINFO Database Record (c) 2010 APA, all rights reserved).

  19. How Many Wolves (Canis lupus) Fit into Germany? The Role of Assumptions in Predictive Rule-Based Habitat Models for Habitat Generalists

    PubMed Central

    Fechter, Dominik; Storch, Ilse

    2014-01-01

    Due to legislative protection, many species, including large carnivores, are currently recolonizing Europe. To address the impending human-wildlife conflicts in advance, predictive habitat models can be used to determine potentially suitable habitat and areas likely to be recolonized. As field data are often limited, quantitative rule-based models or the extrapolation of results from other studies are often the techniques of choice. Using the wolf (Canis lupus) in Germany as a model for habitat generalists, we developed a habitat model based on the location and extent of twelve existing wolf home ranges in Eastern Germany, current knowledge on wolf biology, different habitat modeling techniques, and various input data, analyzing ten different input parameter sets to address the following questions: (1) How do a priori assumptions and different input data or habitat modeling techniques affect the abundance and distribution of potentially suitable wolf habitat and the number of wolf packs in Germany? (2) In a synthesis across input parameter sets, what areas are predicted to be most suitable? (3) Are existing wolf pack home ranges in Eastern Germany consistent with current knowledge on wolf biology and habitat relationships? Our results indicate that the amount of potentially suitable habitat estimated varies greatly depending on which assumptions about habitat relationships are applied in the model and which modeling techniques are chosen. Depending on a priori assumptions, Germany could accommodate between 154 and 1769 wolf packs. The locations of the existing wolf pack home ranges in Eastern Germany indicate that wolves are able to adapt to areas densely populated by humans, but are limited to areas with low road densities. Our analysis suggests that predictive habitat maps in general should be interpreted with caution, and it illustrates the risk for habitat modelers of concentrating on a single selection of habitat factors or modeling technique. PMID:25029506
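
    The sensitivity of rule-based habitat models to a priori assumptions can be made concrete with a toy raster model: the same landscape, classified under two contrasting rule sets, yields very different pack estimates. All rasters, thresholds, and the home-range size below are invented for illustration.

```python
# Sketch: a rule-based habitat-generalist model. Cells are classified as
# suitable under one a-priori parameter set, and the suitable area is
# converted to a pack estimate. Everything here is an illustrative stand-in.
import numpy as np

rng = np.random.default_rng(3)
shape = (500, 500)                          # 1 km^2 cells
forest_frac = rng.random(shape)             # fraction of cell forested
road_density = rng.gamma(2.0, 0.6, shape)   # km of road per km^2
human_density = rng.gamma(2.0, 120.0, shape)  # inhabitants per km^2

def suitable(forest_min, road_max, human_max):
    return (forest_frac >= forest_min) & (road_density <= road_max) \
         & (human_density <= human_max)

home_range_km2 = 200.0                      # assumed territory size per pack

# Two contrasting a-priori rule sets give very different answers, echoing
# the wide range (154 to 1769 packs) reported in the paper.
for name, rules in {"strict": (0.6, 0.5, 100), "lenient": (0.2, 1.5, 500)}.items():
    area = int(suitable(*rules).sum())      # km^2 of suitable habitat
    print(f"{name:8s}: {area:7d} km^2 suitable -> ~{area / home_range_km2:.0f} packs")
```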

  20. Systems oncology: towards patient-specific treatment regimes informed by multiscale mathematical modelling.

    PubMed

    Powathil, Gibin G; Swat, Maciej; Chaplain, Mark A J

    2015-02-01

    The multiscale complexity of cancer as a disease necessitates a corresponding multiscale modelling approach to produce truly predictive mathematical models capable of improving existing treatment protocols. To capture all the dynamics of solid tumour growth and its progression, mathematical modellers need to couple biological processes occurring at various spatial and temporal scales (from genes to tissues). Because the effectiveness of cancer therapy is considerably affected by intracellular and extracellular heterogeneities as well as by dynamical changes in the tissue microenvironment, any model that attempts to optimise existing protocols must consider these factors, ultimately leading to improved multimodal treatment regimes. By improving existing mathematical models of cancer and building new ones, modellers can play an important role in preventing the use of potentially sub-optimal treatment combinations. In this paper, we analyse a multiscale computational mathematical model for cancer growth and spread, incorporating the effects of radiation therapy and chemotherapy on patient survival probability, and implement the model using two different cell-based modelling techniques. We show that the insights provided by such multiscale modelling approaches can ultimately help in designing optimal patient-specific multi-modality treatment protocols that may increase patients' quality of life. Copyright © 2014 Elsevier Ltd. All rights reserved.

  1. Perspectives on Non-Animal Alternatives for Assessing Sensitization Potential in Allergic Contact Dermatitis

    PubMed Central

    Sharma, Nripen S.; Jindal, Rohit; Mitra, Bhaskar; Lee, Serom; Li, Lulu; Maguire, Tim J.; Schloss, Rene; Yarmush, Martin L.

    2014-01-01

    Skin sensitization remains a major environmental and occupational health hazard. Animal models have been used as the gold-standard method of choice for estimating chemical sensitization potential. However, a growing international drive and consensus for minimizing animal usage have prompted the development of in vitro methods to assess chemical sensitivity. In this paper, we examine existing approaches, including in silico models and cell- and tissue-based assays, for distinguishing between sensitizers and irritants. The in silico approaches discussed include Quantitative Structure-Activity Relationships (QSAR) and QSAR-based expert models that correlate chemical molecular structure with biological activity, and mechanism-based read-across models that incorporate compound electrophilicity. The cell- and tissue-based assays rely on an assortment of mono- and co-culture cell systems in conjunction with 3D skin models. Given the complexity of allergen-induced immune responses, and the limited ability of existing systems to capture the entire gamut of cellular and molecular events associated with these responses, we also introduce a microfabricated platform that can capture all the key steps involved in allergic contact sensitivity. Finally, we describe the development of an integrated testing strategy comprised of two- or three-tier systems for evaluating the sensitization potential of chemicals. PMID:24741377

  2. Identity-Based Motivation: Constraints and Opportunities in Consumer Research.

    PubMed

    Shavitt, Sharon; Torelli, Carlos J; Wong, Jimmy

    2009-07-01

    This commentary underscores the integrative nature of the identity-based motivation model (Oyserman, 2009). We situate the model within existing literatures in psychology and consumer behavior, and illustrate its novel elements with research examples. Special attention is devoted to (1) how product- and brand-based affordances constrain identity-based motivation processes, and (2) the mindsets and action tendencies that can be triggered by specific cultural identities in pursuit of consumer goals. Future opportunities are suggested for researching the antecedents of product meanings and relevant identities.

  3. Derivation of Continuum Models from An Agent-based Cancer Model: Optimization and Sensitivity Analysis.

    PubMed

    Voulgarelis, Dimitrios; Velayudhan, Ajoy; Smith, Frank

    2017-01-01

    Agent-based models provide a formidable tool for exploring the complex and emergent behaviour of biological systems and can produce accurate results, but at the cost of substantial computational power and time for subsequent analysis. Equation-based models, on the other hand, can more easily be used for complex analysis on a much shorter timescale. This paper formulates ordinary differential equation and stochastic differential equation models to capture the behaviour of an existing agent-based model of tumour cell reprogramming, and applies them to the optimization of possible treatment as well as dosage sensitivity analysis. For certain values of the parameter space, a close match between the equation-based and agent-based models is achieved. The need for a division of labour between the two approaches is explored. Copyright © Bentham Science Publishers.
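
    A minimal sketch of the general idea: fit a cheap equation-based model (here a logistic ODE) to output from an expensive agent-based simulation, then reuse the surrogate for fast prediction. The "agent-based" data below are synthetic, and the logistic form is an assumption, not the paper's model.

```python
# Sketch: fit a logistic ODE to (synthetic) agent-based tumour-growth output,
# then integrate the cheap equation-based surrogate for later times.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import curve_fit

def logistic(t, r, K, n0):
    return K / (1 + (K / n0 - 1) * np.exp(-r * t))

# Pretend these noisy cell counts came from repeated agent-based runs.
t_obs = np.linspace(0, 20, 21)
rng = np.random.default_rng(4)
n_obs = logistic(t_obs, r=0.45, K=5000, n0=50) * rng.normal(1, 0.05, t_obs.size)

(r, K, n0), _ = curve_fit(logistic, t_obs, n_obs, p0=(0.1, 1e4, 10))
print(f"fitted r={r:.3f}, K={K:.0f}, n0={n0:.1f}")

# The fitted ODE is far faster to evaluate than re-running the ABM.
sol = solve_ivp(lambda t, n: r * n * (1 - n / K), (0, 40), [n0],
                dense_output=True)
print("predicted cells at t=40:", float(sol.sol(40)[0]))
```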

  4. Simulation of Healing Threshold in Strain-Induced Inflammation Through a Discrete Informatics Model.

    PubMed

    Ibrahim, Israr Bin M; Sarma O V, Sanjay; Pidaparti, Ramana M

    2018-05-01

    Respiratory diseases such as asthma and acute respiratory distress syndrome, as well as acute lung injury, involve inflammation at the cellular level. The inflammation process is very complex and is characterized by the emergence of cytokines along with other changes in cellular processes. Due to the complexity of the various constituents that make up the inflammation dynamics, it is necessary to develop models that can complement experiments to fully understand inflammatory diseases. In this study, we developed a discrete informatics model based on a cellular automata (CA) approach to investigate the influence of an elastic field (stretch/strain) on the dynamics of inflammation, accounting for probabilistic adaptation based on statistical interpretation of existing experimental data. Our simulation model investigated the effects of low, medium, and high strain conditions on inflammation dynamics. Results suggest that the model is able to indicate the threshold of innate healing of tissue as a response to the strain experienced by the tissue. When strain is under the threshold, the tissue is still capable of adapting its structure to heal the damaged part; however, there exists a strain threshold beyond which healing capability breaks down. The results demonstrate that the developed discrete informatics-based CA model is capable of modeling and giving insights into inflammation dynamics under various mechanical strain/stretch environments.
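
    A toy CA of the flavour described: tissue cells are damaged at a strain-dependent rate and heal probabilistically, so the steady damaged-tissue fraction climbs as strain rises past the point where damage outpaces healing. The probabilities and the strain-to-damage mapping are illustrative, not the paper's statistically derived rules.

```python
# Sketch: a toy cellular-automata tissue grid with strain-dependent damage
# and probabilistic innate healing. All rates are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(5)

def run_ca(strain, steps=200, size=64):
    damaged = np.zeros((size, size), dtype=bool)
    p_damage = 0.02 + 0.25 * strain      # damage probability grows with strain
    p_heal = 0.15                        # innate healing probability per step
    for _ in range(steps):
        hits = rng.random(damaged.shape) < p_damage
        heals = rng.random(damaged.shape) < p_heal
        damaged = (damaged | hits) & ~heals
    return damaged.mean()                # final damaged-tissue fraction

for strain in (0.1, 0.4, 0.8):           # low, medium, high strain conditions
    print(f"strain={strain:.1f} -> damaged fraction {run_ca(strain):.2f}")
```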

  5. Toward On-line Parameter Estimation of Concentric Tube Robots Using a Mechanics-based Kinematic Model

    PubMed Central

    Jang, Cheongjae; Ha, Junhyoung; Dupont, Pierre E.; Park, Frank Chongwoo

    2017-01-01

    Although existing mechanics-based models of concentric tube robots have been experimentally demonstrated to approximate the actual kinematics, determining accurate estimates of model parameters remains difficult due to the complex relationship between the parameters and available measurements. Further, because the mechanics-based models neglect some phenomena like friction, nonlinear elasticity, and cross section deformation, it is also not clear if model error is due to model simplification or to parameter estimation errors. The parameters of the superelastic materials used in these robots can be slowly time-varying, necessitating periodic re-estimation. This paper proposes a method for estimating the mechanics-based model parameters using an extended Kalman filter as a step toward on-line parameter estimation. Our methodology is validated through both simulation and experiments. PMID:28717554
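
    The EKF machinery for tracking a slowly varying parameter can be sketched in scalar form; the measurement model h and all noise levels below are illustrative assumptions, not the concentric-tube kinematic model.

```python
# Sketch: extended Kalman filter (EKF) tracking a slowly time-varying model
# parameter theta from noisy scalar measurements y = h(theta) + v, in the
# spirit of on-line re-estimation of slowly drifting material parameters.
import numpy as np

def h(th):                 # nonlinear measurement model (assumed)
    return np.sin(th) + 0.5 * th

def dh(th):                # its Jacobian d h / d theta
    return np.cos(th) + 0.5

Q, R = 1e-5, 0.05**2       # random-walk drift and measurement noise variances
theta_hat, P = 0.5, 1.0    # initial estimate and covariance

rng = np.random.default_rng(6)
theta_true = 1.0
for _ in range(300):
    theta_true += rng.normal(0, np.sqrt(Q))        # parameter drifts slowly
    y = h(theta_true) + rng.normal(0, np.sqrt(R))  # noisy measurement
    P += Q                       # predict: random walk, covariance grows
    H = dh(theta_hat)            # update: linearize about current estimate
    K = P * H / (H * P * H + R)  # Kalman gain (scalar case)
    theta_hat += K * (y - h(theta_hat))
    P *= (1 - K * H)

print(f"true={theta_true:.3f}  estimate={theta_hat:.3f}")
```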

  6. DebriSat: The New Hypervelocity Impact Test for Satellite Breakup Fragment Characterization

    NASA Technical Reports Server (NTRS)

    Cowardin, Heather

    2015-01-01

    DebriSat is a hypervelocity fragmentation test designed to replicate a breakup event using modern-day spacecraft materials and construction techniques, in order to improve the existing DoD and NASA breakup models. DebriSat is intended to be representative of modern LEO satellites, and major design decisions were reviewed and approved by Aerospace subject matter experts from different disciplines. DebriSat includes 7 major subsystems: attitude determination and control system (ADCS), command and data handling (C&DH), electrical power system (EPS), payload, propulsion, telemetry tracking and command (TT&C), and thermal management. To reduce cost, most components are emulated based on existing designs of flight hardware and fabricated with the same materials. A key laboratory-based test supporting the development of the DoD and NASA satellite breakup models, the Satellite Orbital debris Characterization Impact Test (SOCIT), was conducted at AEDC in 1992; breakup models based on SOCIT have supported many applications and matched on-orbit events reasonably well over the years.

  7. Simulation of the shallow groundwater-flow system near Mole Lake, Forest County, Wisconsin

    USGS Publications Warehouse

    Fienen, Michael N.; Juckem, Paul F.; Hunt, Randall J.

    2011-01-01

    The shallow groundwater system near Mole Lake, Forest County, Wis., was simulated using a previously calibrated regional model. The previous model was updated using newly collected water-level measurements and refinements to surface-water features. The updated model was then used to calculate the area contributing recharge for one existing and two proposed pumping locations on lands of the Sokaogon Chippewa Community. Delineated 1-, 5-, and 10-year areas contributing recharge for existing and proposed wells extend from the areas of pumping to the northeast of the pumping locations. Steady-state pumping was simulated for two scenarios: a base pumping scenario using pumping rates that reflect what the Tribe expects to pump, and a high pumping scenario, in which the rate was set to the maximum expected from wells installed in this area. In the base pumping scenario, pumping rates of 32 gallons per minute (gal/min; 46,000 gallons per day (gal/d)) from the existing well and 30 gal/min (43,000 gal/d) at each of the two proposed wells were simulated. The high pumping scenario simulated a rate of 70 gal/min (101,000 gal/d) from each of the three pumping wells to estimate the largest areas contributing recharge that might be expected given what is currently known about the shallow groundwater system. The areas contributing recharge for both the base and high pumping scenarios did not intersect any modeled surface-water bodies; however, the high pumping scenario had a larger areal extent than the base pumping scenario and intersected a septic separator.

  8. Simulating Serious Games: A Discrete-Time Computational Model Based on Cognitive Flow Theory

    ERIC Educational Resources Information Center

    Westera, Wim

    2018-01-01

    This paper presents a computational model for simulating how people learn from serious games. While avoiding the combinatorial explosion of a game's micro-states, the model offers a meso-level pathfinding approach, which is guided by cognitive flow theory and various concepts from learning sciences. It extends a basic, existing model by exposing…

  9. Tactile Teaching: Exploring Protein Structure/Function Using Physical Models

    ERIC Educational Resources Information Center

    Herman, Tim; Morris, Jennifer; Colton, Shannon; Batiza, Ann; Patrick, Michael; Franzen, Margaret; Goodsell, David S.

    2006-01-01

    The technology now exists to construct physical models of proteins based on atomic coordinates of solved structures. We review here our recent experiences in using physical models to teach concepts of protein structure and function at both the high school and the undergraduate levels. At the high school level, physical models are used in a…

  10. A Multidimensional Model for Child Maltreatment Prevention Readiness in Low- and Middle-Income Countries

    ERIC Educational Resources Information Center

    Mikton, Christopher; Mehra, Radhika; Butchart, Alexander; Addiss, David; Almuneef, Maha; Cardia, Nancy; Cheah, Irene; Chen, JingQi; Makoae, Mokhantso; Raleva, Marija

    2011-01-01

    The study's aim was to develop a multidimensional model for the assessment of child maltreatment prevention readiness in low- and middle-income countries. The model was developed based on a conceptual review of relevant existing models and approaches, an international expert consultation, and focus groups in six countries. The final model…

  11. lumpR 2.0.0: an R package facilitating landscape discretisation for hillslope-based hydrological models

    NASA Astrophysics Data System (ADS)

    Pilz, Tobias; Francke, Till; Bronstert, Axel

    2017-08-01

    The characteristics of a landscape pose essential factors for hydrological processes. Therefore, an adequate representation of a catchment's landscape in hydrological models is vital. However, many such models exist, differing, amongst other things, in spatial concept and discretisation. The latter constitutes an essential pre-processing step, for which many different algorithms along with numerous software implementations exist. In that context, existing solutions are often model-specific, commercial, or dependent on commercial back-end software, and allow only limited workflow automation or none at all. Consequently, a new package for the scientific software and scripting environment R, called lumpR, was developed. lumpR employs an algorithm for hillslope-based landscape discretisation aimed at large-scale application via a hierarchical multi-scale approach. The package addresses the existing limitations: it is free and open source, easily extendible to other hydrological models, and its workflow can be fully automated. Moreover, it is user-friendly, as direct coupling to a GIS allows for immediate visual inspection and manual adjustment. Sufficient control is furthermore retained via parameter specification and the option to include expert knowledge. Conversely, completely automatic operation also allows for extensive analysis of aspects related to landscape discretisation. In a case study, the application of the package is presented. A sensitivity analysis of the most important discretisation parameters demonstrates its efficient workflow automation. Considering multiple streamflow metrics, the employed model proved reasonably robust to the discretisation parameters; however, the parameters determining the sizes of subbasins and hillslopes proved to be more important than the others, including the number of representative hillslopes, the number of attributes employed for the lumping algorithm, and the number of sub-discretisations of the representative hillslopes.

  12. 3D modeling based on CityEngine

    NASA Astrophysics Data System (ADS)

    Jia, Guangyin; Liao, Kaiju

    2017-03-01

    Currently, there are many 3D modeling software packages, such as 3DMAX and AUTOCAD, and the more popular BIM software represented by REVIT. The CityEngine modeling software introduced in this paper can fully utilize existing GIS data and combine other built models to perform 3D modeling of the interior and exterior of buildings in a rapid, batch manner, so as to improve 3D modeling efficiency.

  13. Model predictive control design for polytopic uncertain systems by synthesising multi-step prediction scenarios

    NASA Astrophysics Data System (ADS)

    Lu, Jianbo; Xi, Yugeng; Li, Dewei; Xu, Yuli; Gan, Zhongxue

    2018-01-01

    A common objective of model predictive control (MPC) design is a large initial feasible region, a low online computational burden, and satisfactory control performance of the resulting algorithm. It is well known that interpolation-based MPC can achieve a favourable trade-off among these different aspects. However, the existing results are usually based on fixed prediction scenarios, which inevitably limits the performance of the obtained algorithms. By replacing the fixed prediction scenarios with time-varying multi-step prediction scenarios, this paper provides a new insight into improving the existing MPC designs. The adopted control law is a combination of predetermined multi-step feedback control laws, based on which two MPC algorithms with guaranteed recursive feasibility and asymptotic stability are presented. The efficacy of the proposed algorithms is illustrated by a numerical example.

  14. A statistical learning approach to the modeling of chromatographic retention of oligonucleotides incorporating sequence and secondary structure data

    PubMed Central

    Sturm, Marc; Quinten, Sascha; Huber, Christian G.; Kohlbacher, Oliver

    2007-01-01

    We propose a new model for predicting the retention time of oligonucleotides. The model is based on ν support vector regression using features derived from base sequence and predicted secondary structure of oligonucleotides. Because of the secondary structure information, the model is applicable even at relatively low temperatures where the secondary structure is not suppressed by thermal denaturing. This makes the prediction of oligonucleotide retention time for arbitrary temperatures possible, provided that the target temperature lies within the temperature range of the training data. We describe different possibilities of feature calculation from base sequence and secondary structure, present the results and compare our model to existing models. PMID:17567619
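
    A minimal sketch of the modelling approach using scikit-learn's NuSVR: retention time regressed on simple base-composition features plus a stand-in secondary-structure feature (fraction of bases predicted paired). The features and toy data are assumptions; the paper's model uses richer sequence and predicted-structure descriptors.

```python
# Sketch: nu-support-vector regression of oligonucleotide retention time
# from base counts, length, and a stand-in secondary-structure feature.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import NuSVR

def featurize(seq, paired_frac):
    return [len(seq), seq.count("A"), seq.count("C"),
            seq.count("G"), seq.count("T"), paired_frac]

# Toy training set: (sequence, fraction predicted paired, retention time).
train = [("ACGTACGTACGT", 0.2, 7.1), ("GGGGCCCCGGCC", 0.6, 9.4),
         ("ATATATATAT",   0.0, 5.2), ("GCGCGCATGCGC", 0.5, 8.8),
         ("TTTTAAAACGCG", 0.1, 6.0), ("CCGGCCGGAATT", 0.4, 8.1)]
X = np.array([featurize(s, p) for s, p, _ in train])
y = np.array([rt for _, _, rt in train])

model = make_pipeline(StandardScaler(), NuSVR(nu=0.5, C=10.0, kernel="rbf"))
model.fit(X, y)
print("predicted RT:", model.predict([featurize("ACGTGGCCAAGT", 0.3)])[0])
```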

  15. [GSH fermentation process modeling using entropy-criterion based RBF neural network model].

    PubMed

    Tan, Zuoping; Wang, Shitong; Deng, Zhaohong; Du, Guocheng

    2008-05-01

    The prediction accuracy and generalization of GSH fermentation process models are often deteriorated by noise in the corresponding experimental data. To avoid this problem, we present a novel RBF neural network modeling approach based on an entropy criterion. In contrast to traditional MSE-criterion-based parameter learning, it considers the whole distribution structure of the training data set during parameter learning, and thus effectively avoids weak generalization and over-learning. The proposed approach is then applied to GSH fermentation process modeling. Our results demonstrate that the proposed method has better prediction accuracy, generalization, and robustness, offering a potential application merit for GSH fermentation process modeling.

  16. Invitation to a forum: architecting operational `next generation' earth monitoring satellites based on best modeling, existing sensor capabilities, with constellation efficiencies to secure trusted datasets for the next 20 years

    NASA Astrophysics Data System (ADS)

    Helmuth, Douglas B.; Bell, Raymond M.; Grant, David A.; Lentz, Christopher A.

    2012-09-01

    Architecting the operational Next Generation of earth monitoring satellites based on matured climate modeling, reuse of existing sensor and satellite capabilities, and attention to affordability and evolutionary improvements integrated with constellation efficiencies becomes our collective goal for an open architectural design forum. Understanding the earth's climate and collecting requisite signatures over the next 30 years is a mandate shared by many of the world's governments. But there remains a daunting challenge to bridge scientific missions to 'operational' systems that truly support the demands of decision makers, scientific investigators and global users' requirements for trusted data. In this paper we suggest an architectural structure that takes advantage of current earth modeling examples, including cross-model verification, and a first-order set of critical climate parameters and metrics, which are in turn matched with existing space-borne collection capabilities and sensors. The tools used and the frameworks offered are designed to allow collaborative overlays by other stakeholders nominating different critical parameters and their own threaded connections to existing international collection experience. These aggregate design suggestions will be held up to group review and prioritized as potential constellation solutions, including incremental and spiral developments, with cost benefits and organizational opportunities. This Part IV effort is focused on being an inclusive 'Next Gen Constellation' design discussion and is the natural extension of earlier papers.

  17. Thermal Modeling and Cryogenic Design of a Helical Superconducting Undulator Cryostat

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shiroyanagi, Y.; Fuerst, J.; Hasse, Q.

    A conceptual design for a helical superconducting undulator (HSCU) for the Advanced Photon Source (APS) at Argonne National Laboratory (ANL) has been completed. The device differs sufficiently from the existing APS planar superconducting undulator (SCU) design to warrant development of a new cryostat based on value engineering and lessons learned from the existing planar SCU. Changes include optimization of the existing cryocooler-based refrigeration system and thermal shield, as well as cost reduction through the use of standard vacuum hardware. The end result is a design that provides a significantly larger 4.2 K refrigeration margin in a smaller package for greater installation flexibility in the APS storage ring. This paper presents ANSYS-based thermal analysis of the cryostat, including estimated static and dynamic heat loads.

  18. Physical consistency of subgrid-scale models for large-eddy simulation of incompressible turbulent flows

    NASA Astrophysics Data System (ADS)

    Silvis, Maurits H.; Remmerswaal, Ronald A.; Verstappen, Roel

    2017-01-01

    We study the construction of subgrid-scale models for large-eddy simulation of incompressible turbulent flows. In particular, we aim to consolidate a systematic approach of constructing subgrid-scale models, based on the idea that it is desirable that subgrid-scale models are consistent with the mathematical and physical properties of the Navier-Stokes equations and the turbulent stresses. To that end, we first discuss in detail the symmetries of the Navier-Stokes equations, and the near-wall scaling behavior, realizability and dissipation properties of the turbulent stresses. We furthermore summarize the requirements that subgrid-scale models have to satisfy in order to preserve these important mathematical and physical properties. In this fashion, a framework of model constraints arises that we apply to analyze the behavior of a number of existing subgrid-scale models that are based on the local velocity gradient. We show that these subgrid-scale models do not satisfy all the desired properties, after which we explain that this is partly due to incompatibilities between model constraints and limitations of velocity-gradient-based subgrid-scale models. However, we also reason that the current framework shows that there is room for improvement in the properties and, hence, the behavior of existing subgrid-scale models. We furthermore show how compatible model constraints can be combined to construct new subgrid-scale models that have desirable properties built into them. We provide a few examples of such new models, of which a new model of eddy viscosity type, that is based on the vortex stretching magnitude, is successfully tested in large-eddy simulations of decaying homogeneous isotropic turbulence and turbulent plane-channel flow.

  19. Low Reynolds number two-equation modeling of turbulent flows

    NASA Technical Reports Server (NTRS)

    Michelassi, V.; Shih, T.-H.

    1991-01-01

    A k-epsilon model that accounts for viscous and wall effects is presented. The proposed formulation does not contain the local wall distance, thereby greatly simplifying application to complex geometries. The formulation is based on an existing k-epsilon model that proved to fit very well with the results of direct numerical simulation. The new form is compared with nine different two-equation models and with direct numerical simulation for a fully developed channel flow at Re = 3300. The simple flow configuration allows a comparison free from numerical inaccuracies. The computed results show that only a few of the considered forms exhibit satisfactory agreement with the channel flow data. The model shows an improvement with respect to the existing formulations.

  20. Snow model design for operational purposes

    NASA Astrophysics Data System (ADS)

    Kolberg, Sjur

    2017-04-01

    A parsimonious distributed energy-balance snow model intended for operational use is evaluated using discharge, snow-covered area and grain size; the latter two as observed from the MODIS sensor. The snow model is an improvement of the existing GamSnow model, which is a part of the Enki modelling framework. Core requirements for the new version have been:
    1. reduction of calibration freedom, motivated by previous experience of non-identifiable parameters in the existing version;
    2. improvement of process representation based on recent advances in physically based snow modelling;
    3. limited sensitivity to forcing data which are poorly known over the spatial domain of interest (often mountainous areas);
    4. preference for observable states, and the ability to improve from updates.
    The albedo calculation is completely revised, now based on grain size through an emulation of the SNICAR model (Flanner and Zender, 2006; Gardner and Sharp, 2010). The number of calibration parameters in the albedo model is reduced from 6 to 2, and the wind function governing turbulent energy fluxes from 2 parameters to 1. Following Raleigh et al. (2011), snow surface radiant temperature is split from the top-layer thermodynamic temperature, using bias-corrected wet-bulb temperature to model the former. Analyses are ongoing, and the poster will bring evaluation results from 16 years of MODIS observations and more than 25 catchments in southern Norway.

  1. Implementation of the zooplankton functional response in plankton models: State of the art, recent challenges and future directions

    NASA Astrophysics Data System (ADS)

    Morozov, Andrew; Poggiale, Jean-Christophe; Cordoleani, Flora

    2012-09-01

    The conventional way of describing grazing in plankton models is based on a zooplankton functional response framework, according to which the consumption rate is computed as the product of a certain function of food (the functional response) and the density/biomass of herbivorous zooplankton. A large amount of literature on experimental feeding reports the existence of a zooplankton functional response in microcosms and small mesocosms, which goes a long way towards explaining the popularity of this framework both in mean-field (e.g. NPZD models) and spatially resolved models. On the other hand, the complex foraging behaviour of zooplankton (feeding cycles) as well as spatial heterogeneity of food and grazer distributions (plankton patchiness) across time and space scales raise questions as to the existence of a functional response of herbivores in vivo. In the current review, we discuss limitations of the ‘classical’ zooplankton functional response and consider possible ways to amend this framework to cope with the complexity of real planktonic ecosystems. Our general conclusion is that although the functional response of herbivores often does not exist in real ecosystems (especially in the form observed in the laboratory), this framework can be rather useful in modelling - but it does need some amendment which can be made based on various techniques of model reduction. We also show that the shape of the functional response depends on the spatial resolution (‘frame’) of the model. We argue that incorporating foraging behaviour and spatial heterogeneity in plankton models would not necessarily require the use of individual based modelling - an approach which is now becoming dominant in the literature. Finally, we list concrete future directions and challenges and emphasize the importance of a closer collaboration between plankton biologists and modellers in order to make further progress towards better descriptions of zooplankton grazing.
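
    For reference, the 'classical' framework under discussion computes grazing as the product of a functional response and grazer biomass; a Holling type II example follows (parameter values illustrative).

```python
# Sketch: grazing under the classical functional-response framework,
# g(P) * Z, with a Holling type II (Michaelis-Menten) response.
import numpy as np

def grazing_rate(P, Z, g_max=1.0, K=0.5):
    """Food consumed per unit time: functional response g(P) times
    herbivore biomass Z. P, Z in biomass units (e.g., mmol N m^-3)."""
    return g_max * P / (K + P) * Z

for P in (0.1, 0.5, 2.0, 10.0):
    print(f"P={P:5.1f} -> grazing={grazing_rate(P, Z=1.0):.3f}"
          "  (saturates at g_max)")
```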

  2. What do we mean by sensitivity analysis? The need for comprehensive characterization of "global" sensitivity in Earth and Environmental systems models

    NASA Astrophysics Data System (ADS)

    Razavi, Saman; Gupta, Hoshin V.

    2015-05-01

    Sensitivity analysis is an essential paradigm in Earth and Environmental Systems modeling. However, the term "sensitivity" has a clear definition, based in partial derivatives, only when specified locally around a particular point (e.g., optimal solution) in the problem space. Accordingly, no unique definition exists for "global sensitivity" across the problem space, when considering one or more model responses to different factors such as model parameters or forcings. A variety of approaches have been proposed for global sensitivity analysis, based on different philosophies and theories, and each of these formally characterizes a different "intuitive" understanding of sensitivity. These approaches focus on different properties of the model response at a fundamental level and may therefore lead to different (even conflicting) conclusions about the underlying sensitivities. Here we revisit the theoretical basis for sensitivity analysis, summarize and critically evaluate existing approaches in the literature, and demonstrate their flaws and shortcomings through conceptual examples. We also demonstrate the difficulty involved in interpreting "global" interaction effects, which may undermine the value of existing interpretive approaches. With this background, we identify several important properties of response surfaces that are associated with the understanding and interpretation of sensitivities in the context of Earth and Environmental System models. Finally, we highlight the need for a new, comprehensive framework for sensitivity analysis that effectively characterizes all of the important sensitivity-related properties of model response surfaces.

  3. Inquiry-Based Learning of Molecular Phylogenetics

    ERIC Educational Resources Information Center

    Campo, Daniel; Garcia-Vazquez, Eva

    2008-01-01

    Reconstructing phylogenies from nucleotide sequences is a challenge for students because it strongly depends on evolutionary models and computer tools that are frequently updated. We present here an inquiry-based course aimed at learning how to trace a phylogeny based on sequences existing in public databases. Computer tools are freely available…

  4. Ras Dimer Formation as a New Signaling Mechanism and Potential Cancer Therapeutic Target

    PubMed Central

    Chen, Mo; Peters, Alec; Huang, Tao; Nan, Xiaolin

    2016-01-01

    The K-, N-, and HRas small GTPases are key regulators of cell physiology and are frequently mutated in human cancers. Despite intensive research, previous efforts to target hyperactive Ras based on known mechanisms of Ras signaling have met with little success. Several studies have provided compelling evidence for the existence and biological relevance of Ras dimers, establishing a new mechanism for regulating Ras activity in cells in addition to GTP-loading and membrane localization. Existing data also begin to reveal how Ras proteins dimerize on the membrane. We propose a dimer model to describe Ras-mediated effector activation, which contrasts with existing models of Ras signaling as a monomer or as a 5-8 membered multimer. We also discuss potential implications of this model in both basic and translational Ras biology. PMID:26423697

  5. Scales and erosion

    USDA-ARS?s Scientific Manuscript database

    There is a need to develop scale explicit understanding of erosion to overcome existing conceptual and methodological flaws in our modelling methods currently applied to understand the process of erosion, transport and deposition at the catchment scale. These models need to be based on a sound under...

  6. PREDICTION OF MULTICOMPONENT INORGANIC ATMOSPHERIC AEROSOL BEHAVIOR. (R824793)

    EPA Science Inventory

    Many existing models calculate the composition of the atmospheric aerosol system by solving a set of algebraic equations based on reversible reactions derived from thermodynamic equilibrium. Some models rely on an a priori knowledge of the presence of components in certain relati...

  7. Computer-Based Resource Accounting Model for Generating Aggregate Resource Impacts of Alternative Automobile Technologies : Volume 1. Fleet Attributes Model

    DOT National Transportation Integrated Search

    1977-01-01

    Auto production and operation consume energy, material, capital and labor resources. Numerous substitution possibilities exist within and between resource sectors, corresponding to the broad spectrum of potential design technologies. Alternative auto...

  8. Operationalizing the Concept of Value--An Action Research-Based Model

    ERIC Educational Resources Information Center

    Naslund, Dag; Olsson, Annika; Karlsson, Sture

    2006-01-01

    Purpose: While the importance of measuring customer satisfaction levels is well established, less research exists on how organizations operationalize such knowledge. The purpose of this paper is to describe an action research (AR) case study resulting in a workshop model to operationalize the concept of value. The model facilitates organizational…

  9. A Systemic Cause Analysis Model for Human Performance Technicians

    ERIC Educational Resources Information Center

    Sostrin, Jesse

    2011-01-01

    This article presents a systemic, research-based cause analysis model for use in the field of human performance technology (HPT). The model organizes the most prominent barriers to workplace learning and performance into a conceptual framework that explains and illuminates the architecture of these barriers that exist within the fabric of everyday…

  10. A predictive model for floating leaf vegetation in the St. Louis River Estuary

    EPA Science Inventory

    In July 2014, USEPA staff was asked by MPCA to develop a predictive model for floating leaf vegetation (FLV) in the St. Louis River Estuary (SLRE). The existing model (Host et al. 2012) greatly overpredicts FLV in St. Louis Bay probably because it was based on a limited number of...

  11. A Theory Based Model of Interpersonal Leadership: An Integration of the Literature

    ERIC Educational Resources Information Center

    Lamm, Kevan W.; Carter, Hannah S.; Lamm, Alexa J.

    2016-01-01

    Although the term interpersonal leadership has been well established within the literature, there remains a dearth of theoretically derived models that specifically address the comprehensive nature of the underlying leader behaviors and activities. The intent of the present article is to attempt to synthesize the existent leadership models,…

  12. Designing an Educational Game with Ten Steps to Complex Learning

    ERIC Educational Resources Information Center

    Enfield, Jacob

    2012-01-01

    Few instructional design (ID) models exist which are specific for developing educational games. Moreover, those extant ID models have not been rigorously evaluated. No ID models were found which focus on educational games with complex learning objectives. "Ten Steps to Complex Learning" (TSCL) is based on the four component instructional…

  13. A fuzzy case based reasoning tool for model based approach to rocket engine health monitoring

    NASA Technical Reports Server (NTRS)

    Krovvidy, Srinivas; Nolan, Adam; Hu, Yong-Lin; Wee, William G.

    1992-01-01

    In this system, we develop a fuzzy case-based reasoner that builds case representations for several past detected anomalies, together with case retrieval methods that use fuzzy sets to index a relevant case when a new problem (case) is presented. The choice of fuzzy sets is justified by the uncertainty in the data. The new problem can be solved using knowledge of the model along with the old cases. The system can then generalize knowledge from previous cases and use this generalization to refine the existing model definition, which in turn can help detect failures using the model-based algorithms.
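
    A minimal sketch of fuzzy case retrieval: each stored anomaly case is compared to a new case through triangular fuzzy memberships per feature, aggregated into a similarity score. The cases, features, and membership widths below are invented for illustration, not the engine-monitoring data.

```python
# Sketch: fuzzy similarity retrieval over stored anomaly cases.
# Feature values are normalized; widths set the fuzziness per feature.
import numpy as np

def tri_similarity(new, stored, width):
    """Triangular fuzzy membership of `new` around `stored`: 1 at equality,
    falling linearly to 0 at +/- width."""
    return max(0.0, 1.0 - abs(new - stored) / width)

cases = {   # anomaly cases: (chamber pressure, turbopump speed, fuel temp)
    "injector fault":  (0.82, 0.95, 0.40),
    "turbopump wear":  (0.78, 0.70, 0.45),
    "coolant leak":    (0.90, 0.92, 0.15),
}
widths = (0.15, 0.20, 0.20)

new_case = (0.80, 0.90, 0.38)
scores = {
    name: float(np.mean([tri_similarity(n, s, w)
                         for n, s, w in zip(new_case, stored, widths)]))
    for name, stored in cases.items()
}
best = max(scores, key=scores.get)
print(scores, "-> retrieved:", best)
```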

  14. Ferroelectric Field Effect Transistor Model Using Partitioned Ferroelectric Layer and Partial Polarization

    NASA Technical Reports Server (NTRS)

    MacLeod, Todd C.; Ho, Fat D.

    2004-01-01

    A model of an n-channel ferroelectric field effect transistor has been developed based on both theoretical and empirical data. The model builds on an existing model that incorporates partitioning of the ferroelectric layer to calculate the polarization within the ferroelectric material, and it adds several aspects that are useful to the user. It takes into account the effect of a non-saturating gate voltage only partially polarizing the ferroelectric material, given the existing remnant polarization. The model also incorporates the decay of the remnant polarization based on the time history of the FFET: a gate pulse of a specific voltage will not drive the ferroelectric material to a single polarization state for that voltage; the resulting polarization varies with the previous state of the material and the time since the last change to the gate voltage. The model also utilizes data from FFETs made from different types of ferroelectric materials, allowing the user simply to specify the material being used rather than recreate the entire model. In addition, the user can specify the quality of the ferroelectric material, from a theoretically perfect material with little loss and no decay to a less-than-perfect material with remnant losses and decay. This model is designed to be used by those who need to predict the external characteristics of an FFET before committing the time and expense of design and fabrication. It also allows parametric evaluation of the effect of ferroelectric film quality on the overall performance of the transistor.

  15. A Novel Extreme Learning Machine Classification Model for e-Nose Application Based on the Multiple Kernel Approach.

    PubMed

    Jian, Yulin; Huang, Daoyu; Yan, Jia; Lu, Kun; Huang, Ying; Wen, Tailai; Zeng, Tanyue; Zhong, Shijie; Xie, Qilong

    2017-06-19

    A novel classification model, named the quantum-behaved particle swarm optimization (QPSO)-based weighted multiple kernel extreme learning machine (QWMK-ELM), is proposed in this paper. Experimental validation is carried out with two different electronic nose (e-nose) datasets. Unlike existing multiple kernel extreme learning machine (MK-ELM) algorithms, the combination coefficients of base kernels are regarded as external parameters of single-hidden-layer feedforward neural networks (SLFNs). The combination coefficients of base kernels, the model parameters of each base kernel, and the regularization parameter are optimized by QPSO simultaneously before implementing the kernel extreme learning machine (KELM) with the composite kernel function. Four types of common single kernel functions (Gaussian kernel, polynomial kernel, sigmoid kernel, and wavelet kernel) are utilized to constitute different composite kernel functions. Moreover, the method is compared with other existing classification methods: extreme learning machine (ELM), kernel extreme learning machine (KELM), k-nearest neighbors (KNN), support vector machine (SVM), multi-layer perceptron (MLP), radial basis function neural network (RBFNN), and probabilistic neural network (PNN). The results demonstrate that the proposed QWMK-ELM outperforms the aforementioned methods, not only in precision, but also in efficiency for gas classification.
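
    The core of a weighted multiple-kernel KELM can be sketched directly; here the QPSO search is replaced by fixed illustrative kernel weights and parameters, and the data are synthetic stand-ins for e-nose features.

```python
# Sketch: weighted multiple-kernel KELM core. Combine base kernels with
# weights w, then solve the regularized system alpha = (K + I/C)^-1 T.
import numpy as np

def rbf(X, Y, gamma):
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def poly(X, Y, degree=2, c0=1.0):
    return (X @ Y.T + c0) ** degree

rng = np.random.default_rng(7)
X = rng.normal(size=(60, 8))                       # 60 samples, 8 sensor features
y = (X[:, 0] + X[:, 1] ** 2 > 0.5).astype(int)     # two gas classes
T = np.eye(2)[y]                                   # one-hot targets

w = (0.7, 0.3)          # kernel weights (QPSO-optimized in the paper)
C = 100.0               # regularization parameter
K = w[0] * rbf(X, X, gamma=0.5) + w[1] * poly(X, X)
alpha = np.linalg.solve(K + np.eye(len(X)) / C, T)  # KELM output weights

Xt = rng.normal(size=(5, 8))
Kt = w[0] * rbf(Xt, X, gamma=0.5) + w[1] * poly(Xt, X)
print("predicted classes:", (Kt @ alpha).argmax(axis=1))
```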

  16. Supporting Scientific Modeling Practices in Atmospheric Sciences: Intended and Actual Affordances of a Computer-Based Modeling Tool

    ERIC Educational Resources Information Center

    Wu, Pai-Hsing; Wu, Hsin-Kai; Kuo, Che-Yu; Hsu, Ying-Shao

    2015-01-01

    Computer-based learning tools include design features to enhance learning, but learners may not always perceive the existence of these features or use them in desirable ways. There might be a gap between what the tool features are designed to offer (intended affordance) and how they are actually used (actual affordance). This study thus aims at…

  17. Examining Attitudes, Enjoyment, Flow, and Physical Activity Levels in Pre-Service Teachers Utilizing the BLISS Model Compared to Traditional Dance Instruction

    ERIC Educational Resources Information Center

    Davis, Christa Ann

    2013-01-01

    This dissertation describes two studies, based on data collection within a pre-existing collegiate course for pre-service teachers in a children's dance setting at a northwest public university. The overall purpose of these experimental studies was to compare traditional movement/dance with the influence of a relevance-based instructional model,…

  18. Communication: Electron ionization of DNA bases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rahman, M. A.; Krishnakumar, E., E-mail: ekkumar@tifr.res.in

    2016-04-28

    No reliable experimental data exist for the partial and total electron ionization cross sections of DNA bases, which are crucial for modeling radiation damage in the genetic material of living cells. We have measured a complete set of absolute partial electron ionization cross sections up to 500 eV for DNA bases for the first time by using the relative flow technique. These partial cross sections are summed to obtain total ion cross sections for all four bases and are compared with the existing theoretical calculations and the only set of measured absolute cross sections. Our measurements clearly resolve the existing discrepancy between the theoretical and experimental results, thereby providing for the first time reliable numbers for partial and total ion cross sections for these molecules. The results of the fragmentation analysis of adenine support the theory of its formation in space.

  19. NHPP-Based Software Reliability Models Using Equilibrium Distribution

    NASA Astrophysics Data System (ADS)

    Xiao, Xiao; Okamura, Hiroyuki; Dohi, Tadashi

    Non-homogeneous Poisson processes (NHPPs) have gained much popularity in actual software testing phases to estimate the software reliability, the number of remaining faults in software and the software release timing. In this paper, we propose a new modeling approach for the NHPP-based software reliability models (SRMs) to describe the stochastic behavior of software fault-detection processes. The fundamental idea is to apply the equilibrium distribution to the fault-detection time distribution in NHPP-based modeling. We also develop efficient parameter estimation procedures for the proposed NHPP-based SRMs. Through numerical experiments, it can be concluded that the proposed NHPP-based SRMs outperform the existing ones in many data sets from the perspective of goodness-of-fit and prediction performance.
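
    A minimal numerical sketch of the fundamental idea: form the equilibrium distribution Fe(t) = (1/mu) * integral_0^t (1 - F(s)) ds of a fault-detection time distribution F, and use it as the NHPP mean-value function m(t) = a * Fe(t). The Weibull choice and all parameter values are illustrative assumptions.

```python
# Sketch: equilibrium distribution of a fault-detection time distribution
# and the resulting NHPP mean-value function, versus the classic m = a*F.
import numpy as np

t = np.linspace(0, 50, 2001)
k_shape, lam = 1.8, 10.0
F = 1 - np.exp(-(t / lam) ** k_shape)   # Weibull fault-detection time CDF
survival = 1 - F

# Cumulative trapezoid of the survival function: int_0^t (1 - F(s)) ds.
dt = t[1] - t[0]
cum = np.concatenate(([0.0], np.cumsum((survival[1:] + survival[:-1]) / 2 * dt)))
mu = cum[-1]                 # ~ mean detection time (integral over [0, 50])
Fe = cum / mu                # equilibrium distribution Fe(t)

a = 120.0                    # expected total number of faults
m_classic = a * F            # conventional NHPP mean-value function
m_equilib = a * Fe           # equilibrium-distribution-based variant
i = int(np.searchsorted(t, 10.0))
print(f"mu ~ {mu:.2f}; at t=10: classic m={m_classic[i]:.1f}, "
      f"equilibrium m={m_equilib[i]:.1f}")
```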

  20. Replicating Health Economic Models: Firm Foundations or a House of Cards?

    PubMed

    Bermejo, Inigo; Tappenden, Paul; Youn, Ji-Hee

    2017-11-01

    Health economic evaluation is a framework for the comparative analysis of the incremental health gains and costs associated with competing decision alternatives. The process of developing health economic models is usually complex, financially expensive and time-consuming. For these reasons, model development is sometimes based on previous model-based analyses; this endeavour is usually referred to as model replication. Such model replication activity may involve the comprehensive reproduction of an existing model or 'borrowing' all or part of a previously developed model structure. Generally speaking, the replication of an existing model may require substantially less effort than developing a new de novo model by bypassing, or undertaking in only a perfunctory manner, certain aspects of model development such as the development of a complete conceptual model and/or comprehensive literature searching for model parameters. A further motivation for model replication may be to draw on the credibility or prestige of previous analyses that have been published and/or used to inform decision making. The acceptability and appropriateness of replicating models depends on the decision-making context: there exists a trade-off between the 'savings' afforded by model replication and the potential 'costs' associated with reduced model credibility due to the omission of certain stages of model development. This paper provides an overview of the different levels of, and motivations for, replicating health economic models, and discusses the advantages, disadvantages and caveats associated with this type of modelling activity. Irrespective of whether replicated models should be considered appropriate or not, complete replicability is generally accepted as a desirable property of health economic models, as reflected in critical appraisal checklists and good practice guidelines. To this end, the feasibility of comprehensive model replication is explored empirically across a small number of recent case studies. Recommendations are put forward for improving reporting standards to enhance comprehensive model replicability.

  1. Gaussian mixture model based identification of arterial wall movement for computation of distension waveform.

    PubMed

    Patil, Ravindra B; Krishnamoorthy, P; Sethuraman, Shriram

    2015-01-01

    This work proposes a novel Gaussian Mixture Model (GMM) based approach for accurate tracking of the arterial wall and subsequent computation of the distension waveform using radio frequency (RF) ultrasound signals. The approach was evaluated on ultrasound RF data acquired from an artery-mimicking flow phantom using a prototype ultrasound system. The effectiveness of the proposed algorithm is demonstrated by comparison with existing wall-tracking algorithms. The experimental results show that the proposed method provides a 20% reduction in the error margin compared to existing approaches in tracking arterial wall movement. This approach, coupled with an ultrasound system, can be used to estimate the arterial compliance parameters required for screening of cardiovascular disorders.
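
    One way to make the GMM idea concrete: fit a two-component Gaussian mixture to the depths of high-amplitude envelope samples in an A-line, so the component means locate the two wall echoes; tracking those means across frames would yield the distension waveform. The signal model and threshold below are illustrative assumptions, not the paper's processing chain.

```python
# Sketch: locate anterior/posterior wall echoes in one synthetic A-line with
# a two-component Gaussian mixture over strong-echo depths.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(8)
depth = np.linspace(0, 20, 2000)                     # mm along the beam
envelope = rng.rayleigh(0.05, depth.size)            # speckle background
for wall_mm in (6.0, 12.5):                          # two wall echoes
    envelope += 1.0 * np.exp(-0.5 * ((depth - wall_mm) / 0.15) ** 2)

strong = depth[envelope > 0.5].reshape(-1, 1)        # depths of strong echoes
gmm = GaussianMixture(n_components=2, random_state=0).fit(strong)
near, far = np.sort(gmm.means_.ravel())
print(f"wall positions: {near:.2f} mm, {far:.2f} mm; "
      f"lumen diameter ~ {far - near:.2f} mm")
```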

  2. Multiscale Modeling of Angiogenesis and Predictive Capacity

    NASA Astrophysics Data System (ADS)

    Pillay, Samara; Byrne, Helen; Maini, Philip

    Tumors induce the growth of new blood vessels from existing vasculature through angiogenesis. Using an agent-based approach, we model the behavior of individual endothelial cells during angiogenesis. We incorporate crowding effects through volume exclusion, motility of cells through biased random walks, and include birth and death-like processes. We use the transition probabilities associated with the discrete model and a discrete conservation equation for cell occupancy to determine collective cell behavior, in terms of partial differential equations (PDEs). We derive three PDE models incorporating single, multi-species and no volume exclusion. By fitting the parameters in our PDE models and other well-established continuum models to agent-based simulations during a specific time period, and then comparing the outputs from the PDE models and agent-based model at later times, we aim to determine how well the PDE models predict the future behavior of the agent-based model. We also determine whether predictions differ across PDE models and the significance of those differences. This may impact drug development strategies based on PDE models.

  3. An efficient temporal database design method based on EER

    NASA Astrophysics Data System (ADS)

    Liu, Zhi; Huang, Jiping; Miao, Hua

    2007-12-01

    Many existing methods of modeling temporal information are based on the logical model, which makes relational schema optimization more difficult and more complicated. In this paper, based on the conventional EER model, the authors attempt to analyse and abstract temporal information in the conceptual modelling phase, according to the concrete requirements for historical information. A temporal data model named BTEER is then presented. BTEER not only retains all the designing ideas and methods of EER, which gives BTEER good upward compatibility, but also supports the modelling of valid time and transaction time effectively. In addition, BTEER can be transformed to EER easily and automatically. Practice has shown that this method models temporal information well.

  4. Surrogate-based optimization of hydraulic fracturing in pre-existing fracture networks

    NASA Astrophysics Data System (ADS)

    Chen, Mingjie; Sun, Yunwei; Fu, Pengcheng; Carrigan, Charles R.; Lu, Zhiming; Tong, Charles H.; Buscheck, Thomas A.

    2013-08-01

    Hydraulic fracturing has been used widely to stimulate production of oil, natural gas, and geothermal energy in formations with low natural permeability. Numerical optimization of fracture stimulation often requires a large number of evaluations of objective functions and constraints from forward hydraulic fracturing models, which are computationally expensive and even prohibitive in some situations. Moreover, there are a variety of uncertainties associated with the pre-existing fracture distributions and rock mechanical properties, which affect the optimized decisions for hydraulic fracturing. In this study, a surrogate-based approach is developed for efficient optimization of hydraulic fracturing well design in the presence of natural-system uncertainties. The fractal dimension is derived from the simulated fracturing network as the objective for maximizing energy recovery sweep efficiency. The surrogate model, which is constructed using training data from high-fidelity fracturing models for mapping the relationship between uncertain input parameters and the fractal dimension, provides fast approximation of the objective functions and constraints. A suite of surrogate models constructed using different fitting methods is evaluated and validated for fast predictions. Global sensitivity analysis is conducted to gain insights into the impact of the input variables on the output of interest, and further used for parameter screening. The high efficiency of the surrogate-based approach is demonstrated for three optimization scenarios with different and uncertain ambient conditions. Our results suggest the critical importance of considering uncertain pre-existing fracture networks in optimization studies of hydraulic fracturing.
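
    A minimal surrogate-modeling sketch under stated assumptions: a cheap analytic function stands in for the expensive fracturing simulator, a Gaussian-process surrogate is trained on a handful of runs, and the optimum is then searched on the surrogate instead of the forward model.

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import Matern

    def expensive_simulator(x):            # placeholder for a fracturing run
        return -(x - 0.3) ** 2 + 0.05 * np.sin(25 * x)

    X_train = np.linspace(0, 1, 12).reshape(-1, 1)   # training designs
    y_train = expensive_simulator(X_train).ravel()

    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    gp.fit(X_train, y_train)

    X_grid = np.linspace(0, 1, 1001).reshape(-1, 1)  # cheap surrogate search
    best = X_grid[np.argmax(gp.predict(X_grid)), 0]
    print("surrogate optimum near x = %.3f" % best)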

  5. An efficient current-based logic cell model for crosstalk delay analysis

    NASA Astrophysics Data System (ADS)

    Nazarian, Shahin; Das, Debasish

    2013-04-01

    Logic cell modelling is an important component in the analysis and design of CMOS integrated circuits, mostly due to the nonlinear behaviour of CMOS cells with respect to the voltage signal at their input and output pins. A current-based model for CMOS logic cells is presented, which can be used for effective crosstalk noise and delta delay analysis in CMOS VLSI circuits. Existing current source models are expensive and need a new set of Spice-based characterisation, which is not compatible with typical EDA tools. In this article we present Imodel, a simple nonlinear logic cell model that can be derived from typical cell libraries such as NLDM, with accuracy much higher than NLDM-based cell delay models. In fact, our experiments show an average error of 3% compared to Spice. This level of accuracy comes with a maximum runtime penalty of 19% compared to NLDM-based cell delay models on medium-sized industrial designs.

  6. Evaluation of new collision-pair selection models in DSMC

    NASA Astrophysics Data System (ADS)

    Akhlaghi, Hassan; Roohi, Ehsan

    2017-10-01

    The current paper investigates new collision-pair selection procedures in the direct simulation Monte Carlo (DSMC) method. Collision-partner selection based on the random procedure from nearest-neighbor particles and deterministic selection of nearest-neighbor particles have already been introduced as schemes that provide accurate results in a wide range of problems. In the current research, new collision-pair selections based on the time spacing and direction of the relative movement of particles are introduced and evaluated. Comparisons between the new and existing algorithms are made considering appropriate test cases including fluctuations in a homogeneous gas, 2D equilibrium flow, and the Fourier flow problem. Distribution functions for the number of particles and collisions in a cell, velocity components, and collisional parameters (collision separation, time spacing, relative velocity, and the angle between relative movements of particles) are investigated and compared with existing analytical relations for each model. The capability of each model in the prediction of the heat flux in the Fourier problem at different cell numbers, numbers of particles, and time steps is examined. For the new and existing collision-pair selection schemes, the effect of an alternative formula for the number of collision-pair selections and of avoiding repetitive collisions is investigated via the prediction of the Fourier heat flux. The simulation results demonstrate the advantages and weaknesses of each model in different test cases.
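
    One ingredient in isolation, as a hedged sketch (particle positions and pair count are invented; no collision mechanics or weighting is included): deterministic nearest-neighbor collision-pair selection inside a single DSMC cell.

    import numpy as np

    rng = np.random.default_rng(2)
    pos = rng.random((30, 2))              # particle positions in a unit cell
    available = set(range(len(pos)))
    pairs = []

    for _ in range(5):                     # select five collision pairs
        i = available.pop()                # arbitrary first partner
        rest = np.array(sorted(available))
        d = np.linalg.norm(pos[rest] - pos[i], axis=1)
        j = int(rest[np.argmin(d)])        # nearest available neighbor
        available.remove(j)
        pairs.append((i, j))

    print("collision pairs:", pairs)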

  7. A model for the sustainable selection of building envelope assemblies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huedo, Patricia, E-mail: huedo@uji.es; Mulet, Elena, E-mail: emulet@uji.es; López-Mesa, Belinda, E-mail: belinda@unizar.es

    2016-02-15

    The aim of this article is to define an evaluation model for the environmental impacts of building envelopes to support planners in the early phases of materials selection. The model is intended to estimate environmental impacts for different combinations of building envelope assemblies based on scientifically recognised sustainability indicators. These indicators will increase the amount of information that existing catalogues show to support planners in the selection of building assemblies. To define the model, first the environmental indicators were selected based on the specific aims of the intended sustainability assessment. Then, a simplified LCA methodology was developed to estimate the impacts applicable to three types of dwellings considering different envelope assemblies, building orientations and climate zones. This methodology takes into account the manufacturing, installation, maintenance and use phases of the building. Finally, the model was validated and a matrix in Excel was created as an implementation of the model. Highlights: • Method to assess the envelope impacts based on a simplified LCA. • To be used at an earlier phase than the existing methods in a simple way. • It assigns a score by means of known sustainability indicators. • It estimates data about the embodied and operating environmental impacts. • It compares the investment costs with the costs of the consumed energy.

  8. A spatial error model with continuous random effects and an application to growth convergence

    NASA Astrophysics Data System (ADS)

    Laurini, Márcio Poletti

    2017-10-01

    We propose a spatial error model with continuous random effects based on Matérn covariance functions and apply this model to the analysis of income convergence processes (β-convergence). The use of a model with continuous random effects permits a clearer visualization and interpretation of the spatial dependency patterns, avoids the problems of defining neighborhoods in spatial econometrics models, and allows projecting the spatial effects for every possible location in the continuous space, circumventing the existing aggregations in discrete lattice representations. We apply this model approach to analyze the economic growth of Brazilian municipalities between 1991 and 2010 using unconditional and conditional formulations and a spatiotemporal model of convergence. The results indicate that the estimated spatial random effects are consistent with the existence of income convergence clubs for Brazilian municipalities in this period.
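
    For reference, the Matérn covariance family underlying such continuous random effects can be written down directly; the sketch below (generic textbook form, not the paper's estimation machinery) evaluates it with SciPy.

    import numpy as np
    from scipy.special import gamma, kv    # kv: modified Bessel, 2nd kind

    def matern_cov(h, sigma2=1.0, rho=1.0, nu=1.5):
        """Covariance at distances h >= 0 (sigma2 at h = 0)."""
        h = np.asarray(h, dtype=float)
        out = np.full(h.shape, sigma2)
        m = h > 0
        u = np.sqrt(2 * nu) * h[m] / rho
        out[m] = sigma2 * (2 ** (1 - nu) / gamma(nu)) * u ** nu * kv(nu, u)
        return out

    print(matern_cov(np.array([0.0, 0.5, 1.0, 2.0])))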

  9. A Five-Dimensional Mathematical Model for Regional and Global Changes in Cardiac Uptake and Motion

    NASA Astrophysics Data System (ADS)

    Pretorius, P. H.; King, M. A.; Gifford, H. C.

    2004-10-01

    The objective of this work was to simultaneously introduce known regional changes in contraction pattern and perfusion to the existing gated Mathematical Cardiac Torso (MCAT) phantom heart model. We derived a simple integral to calculate the fraction of the ellipsoidal volume that makes up the left ventricle (LV), taking into account the stationary apex and the moving base. After calculating the LV myocardium volume of the existing beating heart model, we employed the property of conservation of mass to manipulate the LV ejection fraction to values ranging between 13.5% and 68.9%. Multiple dynamic heart models that differ in degree of LV wall thickening, base-to-apex motion, and ejection fraction are thus available for use with the existing MCAT methodology. To introduce more complex regional LV contraction and perfusion patterns, we used composites of dynamic heart models to create a central region with little or no motion or perfusion, surrounded by a region in which the motion and perfusion gradually revert to normal. To illustrate this methodology, the following gated cardiac acquisitions for different clinical situations were simulated analytically: 1) reduced regional motion and perfusion; 2) the same perfusion as in (1) without motion intervention; and 3) washout from the normal and diseased myocardial regions. Both motion and perfusion can change dynamically during a single rotation or multiple rotations of a simulated single-photon emission computed tomography acquisition system.
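
    A back-of-envelope rendering of the volume bookkeeping (invented semi-axis values, not the MCAT integrals): treat the LV cavity as a half-ellipsoid with a stationary apex and a moving base, and recover the ejection fraction from end-diastolic and end-systolic volumes.

    import numpy as np

    def half_ellipsoid_volume(a, b, c):
        """Volume of half an ellipsoid with semi-axes a, b, c (cm -> mL)."""
        return 0.5 * (4.0 / 3.0) * np.pi * a * b * c

    edv = half_ellipsoid_volume(2.5, 2.5, 7.0)   # end-diastole
    esv = half_ellipsoid_volume(2.0, 2.0, 6.3)   # end-systole (base moved)
    ef = (edv - esv) / edv
    print("EDV %.0f mL, ESV %.0f mL, EF %.1f%%" % (edv, esv, 100 * ef))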

  10. Towards an Enhancement of Organizational Information Security through Threat Factor Profiling (TFP) Model

    NASA Astrophysics Data System (ADS)

    Sidi, Fatimah; Daud, Maslina; Ahmad, Sabariah; Zainuddin, Naqliyah; Anneisa Abdullah, Syafiqa; Jabar, Marzanah A.; Suriani Affendey, Lilly; Ishak, Iskandar; Sharef, Nurfadhlina Mohd; Zolkepli, Maslina; Nur Majdina Nordin, Fatin; Amat Sejani, Hashimah; Ramadzan Hairani, Saiful

    2017-09-01

    Information security has been identified by organizations as part of internal operations that need to be well implemented and protected. This is because organizations face a growing number of threats to their networks and services each day, which leads to information security issues. Thus, effective information security management is required in order to protect their information assets. Threat profiling is a method that can be used by an organization to address the security challenges. Threat profiling allows analysts to understand and organize intelligence information related to threat groups. This paper presents a comparative analysis that was conducted to study the existing threat profiling models. It was found that existing threat models were constructed for specific objectives; thus each model is limited to certain components or factors such as assets, threat sources, countermeasures, threat agents, threat outcomes and threat actors. It is suggested that threat profiling can be improved by combining the components found in each existing threat profiling model/framework. The proposed model can be used by an organization in executing a proactive approach to incident management.

  11. DecAID: a decaying wood advisory model for Oregon and Washington.

    Treesearch

    Kim Mellen; Bruce G. Marcot; Janet L. Ohmann; Karen L. Waddell; Elizabeth A. Willhite; Bruce B. Hostetler; Susan A. Livingston; Cay Ogden

    2002-01-01

    DecAID is a knowledge-based advisory model that provides guidance to managers in determining the size, amount, and distribution of dead and decaying wood (dead and partially dead trees and down wood) necessary to maintain wildlife habitat and ecosystem functions. The intent of the model is to update and replace existing snag-wildlife models in Washington and Oregon....

  12. A Methodology for Cybercraft Requirement Definition and Initial System Design

    DTIC Science & Technology

    2008-06-01

    the software development concepts of the SDLC, requirements, use cases and domain modeling. It ... collectively as Software Development Life Cycle (SDLC) models. While there are numerous models that fit under the SDLC definition, all are based on ... developed that provided expanded understanding of the domain, it is necessary to either update an existing domain model or create another domain

  13. COSMOS: accurate detection of somatic structural variations through asymmetric comparison between tumor and normal samples

    PubMed Central

    Yamagata, Koichi; Yamanishi, Ayako; Kokubu, Chikara; Takeda, Junji; Sese, Jun

    2016-01-01

    An important challenge in cancer genomics is precise detection of structural variations (SVs) by high-throughput short-read sequencing, which is hampered by the high false discovery rates of existing analysis tools. Here, we propose an accurate SV detection method named COSMOS, which compares the statistics of the mapped read pairs in tumor samples with isogenic normal control samples in a distinct asymmetric manner. COSMOS also prioritizes the candidate SVs using strand-specific read-depth information. Performance tests on modeled tumor genomes revealed that COSMOS outperformed existing methods in terms of F-measure. We also applied COSMOS to an experimental mouse cell-based model, in which SVs were induced by genome engineering and gamma-ray irradiation, followed by polymerase chain reaction-based confirmation. The precision of COSMOS was 84.5%, while the next best existing method was 70.4%. Moreover, the sensitivity of COSMOS was the highest, indicating that COSMOS has great potential for cancer genome analysis. PMID:26833260

  14. Visualization of RNA structure models within the Integrative Genomics Viewer.

    PubMed

    Busan, Steven; Weeks, Kevin M

    2017-07-01

    Analyses of the interrelationships between RNA structure and function are increasingly important components of genomic studies. The SHAPE-MaP strategy enables accurate RNA structure probing and realistic structure modeling of kilobase-length noncoding RNAs and mRNAs. Existing tools for visualizing RNA structure models are not suitable for efficient analysis of long, structurally heterogeneous RNAs. In addition, structure models are often advantageously interpreted in the context of other experimental data and gene annotation information, for which few tools currently exist. We have developed a module within the widely used and well supported open-source Integrative Genomics Viewer (IGV) that allows visualization of SHAPE and other chemical probing data, including raw reactivities, data-driven structural entropies, and data-constrained base-pair secondary structure models, in context with linear genomic data tracks. We illustrate the usefulness of visualizing RNA structure in the IGV by exploring structure models for a large viral RNA genome, comparing bacterial mRNA structure in cells with its structure under cell- and protein-free conditions, and comparing a noncoding RNA structure modeled using SHAPE data with a base-pairing model inferred through sequence covariation analysis. © 2017 Busan and Weeks; Published by Cold Spring Harbor Laboratory Press for the RNA Society.

  15. A Coarse-Grained Elastic Network Atom Contact Model and Its Use in the Simulation of Protein Dynamics and the Prediction of the Effect of Mutations

    PubMed Central

    Frappier, Vincent; Najmanovich, Rafael J.

    2014-01-01

    Normal mode analysis (NMA) methods are widely used to study dynamic aspects of protein structures. Two critical components of NMA methods are the level of coarse-graining used to represent protein structures and the choice of potential energy functional form. There is a trade-off between speed and accuracy in different choices. At one extreme one finds accurate but slow molecular-dynamics-based methods with all-atom representations and detailed atomic potentials. At the other extreme are fast elastic network model (ENM) methods with Cα-only representations and simplified potentials based on geometry alone, and thus oblivious to protein sequence. Here we present ENCoM, an Elastic Network Contact Model that employs a potential energy function including a pairwise atom-type non-bonded interaction term, which makes it possible to consider the effect of the specific nature of amino acids on dynamics within the context of NMA. ENCoM is as fast as existing ENM methods and outperforms such methods in the generation of conformational ensembles. Here we introduce a new application for NMA methods with the use of ENCoM in the prediction of the effect of mutations on protein stability. Whereas existing methods rely on machine learning or enthalpic considerations, ENCoM, built on vibrational normal modes, draws on entropic considerations. This represents a novel area of application for NMA methods and a novel approach for the prediction of the effect of mutations. We compare ENCoM to a large number of methods in terms of accuracy and self-consistency. We show that the accuracy of ENCoM is comparable to that of the best existing methods. We show that existing methods are biased towards the prediction of destabilizing mutations and that ENCoM is less biased at predicting stabilizing mutations. PMID:24762569

  16. A Comparison of Different Teaching Designs of "Acids and Bases" Subject

    ERIC Educational Resources Information Center

    Ültay, Neslihan; Çalik, Muammer

    2016-01-01

    Inability to link the acid-base concepts with daily life phenomena (as contexts) highlights the need for further research on the context-based acid-base chemistry. In this vein, the aim of this study is to investigate the effects of different teaching designs (REACT strategy, 5Es learning model and traditional (existing) instruction) relevant with…

  17. Project summaries

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Lunar base projects, including a reconfigurable lunar cargo launcher, a thermal and micrometeorite protection system, a versatile lifting machine with robotic capabilities, a cargo transport system, the design of a road construction system for a lunar base, and the design of a device for removing lunar dust from material surfaces, are discussed. The emphasis of the Gulf of Mexico project was on the development of a computer simulation model for predicting vessel station-keeping requirements. An existing code, used in predicting station-keeping requirements for oil drilling platforms operating in North Shore (Alaska) waters, was used as a basis for the computer simulation. Modifications were made to the existing code. The input into the model consists of satellite altimeter readings and water velocity readings from buoys stationed in the Gulf of Mexico. The satellite data consist of altimeter readings (wave height) taken during the spring of 1989. The simulation model predicts water velocity and direction, and wind velocity.

  18. ANN modeling of DNA sequences: new strategies using DNA shape code.

    PubMed

    Parbhane, R V; Tambe, S S; Kulkarni, B D

    2000-09-01

    Two new encoding strategies, namely, wedge and twist codes, which are based on the DNA helical parameters, are introduced to represent DNA sequences in artificial neural network (ANN)-based modeling of biological systems. The performance of the new coding strategies has been evaluated by conducting three case studies involving mapping (modeling) and classification applications of ANNs. The proposed coding schemes have been compared rigorously and shown to outperform the existing coding strategies especially in situations wherein limited data are available for building the ANN models.

  19. Flight simulator fidelity assessment in a rotorcraft lateral translation maneuver

    NASA Technical Reports Server (NTRS)

    Hess, R. A.; Malsbury, T.; Atencio, A., Jr.

    1992-01-01

    A model-based methodology for assessing flight simulator fidelity in closed-loop fashion is exercised in analyzing a rotorcraft low-altitude maneuver for which flight test and simulation results were available. The addition of a handling qualities sensitivity function to a previously developed model-based assessment criteria allows an analytical comparison of both performance and handling qualities between simulation and flight test. Model predictions regarding the existence of simulator fidelity problems are corroborated by experiment. The modeling approach is used to assess analytically the effects of modifying simulator characteristics on simulator fidelity.

  20. Wind-Tunnel Investigations of Blunt-Body Drag Reduction Using Forebody Surface Roughness

    NASA Technical Reports Server (NTRS)

    Whitmore, Stephen A.; Sprague, Stephanie; Naughton, Jonathan W.; Curry, Robert E. (Technical Monitor)

    2001-01-01

    This paper presents results of wind-tunnel tests that demonstrate a novel drag reduction technique for blunt-based vehicles. For these tests, the forebody roughness of a blunt-based model was modified using micromachined surface overlays. As forebody roughness increases, the boundary layer at the model aft thickens and reduces the shearing effect of external flow on the separated flow behind the base region, resulting in reduced base drag. For vehicle configurations with large base drag, existing data predict that a small increment in forebody friction drag will result in a relatively large decrease in base drag. If the added increment in forebody skin drag is optimized with respect to base drag, reducing the total drag of the configuration is possible. The wind-tunnel test results conclusively demonstrate the existence of a forebody drag-base drag optimal point. The data demonstrate that the base drag coefficient corresponding to the drag minimum lies between 0.225 and 0.275, referenced to the base area. Most importantly, the data show a drag reduction of approximately 15% when the drag optimum is reached. When this drag reduction is scaled to the X-33 base area, drag savings approaching 45,000 N (10,000 lbf) can be realized.

  1. Frequency-dependent selection predicts patterns of radiations and biodiversity.

    PubMed

    Melián, Carlos J; Alonso, David; Vázquez, Diego P; Regetz, James; Allesina, Stefano

    2010-08-26

    Most empirical studies support a decline in speciation rates through time, although evidence for constant speciation rates also exists. Declining rates have been explained by invoking pre-existing niches, whereas constant rates have been attributed to non-adaptive processes such as sexual selection and mutation. Trends in speciation rate and the processes underlying it remain unclear, representing a critical information gap in understanding patterns of global diversity. Here we show that the temporal trend in the speciation rate can also be explained by frequency-dependent selection. We construct a frequency-dependent and DNA sequence-based model of speciation. We compare our model to empirical diversity patterns observed for cichlid fish and Darwin's finches, two classic systems for which speciation rates and richness data exist. Negative frequency-dependent selection predicts well the declining speciation rate found in cichlid fish and explains their species richness. For groups like Darwin's finches, in which speciation rates are constant and diversity is lower, the speciation rate is better explained by a model without frequency-dependent selection. Our analysis shows that differences in diversity may be driven by incipient species abundance with frequency-dependent selection. Our results demonstrate that genetic-distance-based speciation and frequency-dependent selection are sufficient to explain the high diversity observed in natural systems and, importantly, to predict the decay through time in speciation rate in the absence of pre-existing niches.

  2. Conceptual Framework for Trait-Based Ecological Risk Assessment for Wildlife Populations Exposed to Pesticides

    EPA Science Inventory

    Between screening level risk assessments and complex ecological models, a need exists for practical identification of risk based on general information about species, chemicals, and exposure scenarios. Several studies have identified demographic, biological, and toxicological fa...

  3. A Sulfur-based Glacial Ecosystem as a Model for the Habitability of Europa and Mars

    NASA Astrophysics Data System (ADS)

    Wright, K. E.; Gleeson, D. F.; Williamson, C.; Grasby, S. E.; Spear, J.; Pappalardo, R. T.; Templeton, A. S.

    2010-04-01

    Identifying the sulfur redox reactions and dominant microbial organisms in a sulfur-based glacial microbial ecosystem provides insights into the type of metabolisms that might exist on other planetary bodies, and the biosignatures they may present.

  4. Combining the Generic Entity-Attribute-Value Model and Terminological Models into a Common Ontology to Enable Data Integration and Decision Support.

    PubMed

    Bouaud, Jacques; Guézennec, Gilles; Séroussi, Brigitte

    2018-01-01

    The integration of clinical information models and termino-ontological models into a unique ontological framework is highly desirable for it facilitates data integration and management using the same formal mechanisms for both data concepts and information model components. This is particularly true for knowledge-based decision support tools that aim to take advantage of all facets of semantic web technologies in merging ontological reasoning, concept classification, and rule-based inferences. We present an ontology template that combines generic data model components with (parts of) existing termino-ontological resources. The approach is developed for the guideline-based decision support module on breast cancer management within the DESIREE European project. The approach is based on the entity attribute value model and could be extended to other domains.
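
    A minimal sketch of the generic EAV pattern itself (hypothetical concept codes; this is not the DESIREE ontology): every fact is an entity-attribute-value triple whose attribute is a concept drawn from a terminology, so data and model components can be queried uniformly.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Triple:
        entity: str       # e.g. a patient or tumour instance
        attribute: str    # a concept drawn from a terminology
        value: object     # a literal or another coded concept

    records = [
        Triple("patient/123", "hist:tumour_grade", "grade_II"),
        Triple("patient/123", "obs:tumour_size_mm", 14),
        Triple("patient/123", "ther:prior_chemo", False),
    ]

    # a rule engine can then look up any attribute the same way
    size = next(t.value for t in records if t.attribute == "obs:tumour_size_mm")
    print("tumour size:", size, "mm")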

  5. AQMEII: A New International Initiative on Air Quality Model Evaluation

    EPA Science Inventory

    We provide a conceptual view of the process of evaluating regional-scale three-dimensional numerical photochemical air quality modeling systems, based on an examination of existing approaches to the evaluation of such systems as they are currently used in a variety of application....

  6. Development of a Human Physiologically Based Pharmacokinetics (PBPK) Model For Dermal Permeability for Lindane

    EPA Science Inventory

    Lindane is a neurotoxicant used for the treatment of lice and scabies present on human skin. Due to its pharmaceutical application, an extensive pharmacokinetic database exists in humans. Mathematical diffusion models allow for calculation of lindane skin permeability coefficient...

  7. Chemical Transformation System: Cloud Based Cheminformatic Services to Support Integrated Environmental Modeling

    EPA Science Inventory

    Integrated Environmental Modeling (IEM) systems that account for the fate/transport of organics frequently require physicochemical properties as well as transformation products. A myriad of chemical property databases exist but these can be difficult to access and often do not co...

  8. Biomonitoring Equivalents (BE) Dossier for Toluene (Cas No. 108-88-3)

    EPA Science Inventory

    This document reviews available pharmacokinetic data and models for toluene and applies these data and models to existing health-based exposure guidance values from the US Environmental Protection Agency, the Agency for Toxic Substances and Disease Registry, Health Canada, and th...

  9. A study on seismic behavior of pile foundations of bridge abutment on liquefiable ground through shaking table tests

    NASA Astrophysics Data System (ADS)

    Nakata, Mitsuhiko; Tanimoto, Shunsuke; Ishida, Shuichi; Ohsumi, Michio; Hoshikuma, Jun-ichi

    2017-10-01

    There is a risk that bridge foundations will be damaged by liquefaction-induced lateral spreading of the ground. Once bridge foundations have been damaged, restoration takes a long time. Therefore, it is important to appropriately assess the seismic behavior of foundations on liquefiable ground. In this study, shaking table tests of models on a scale of 1/10 were conducted on the large-scale shaking table at the Public Works Research Institute, Japan, to investigate the seismic behavior of a pile-supported bridge abutment on liquefiable ground. The shaking table tests were conducted for three types of model. Two are models of an existing bridge which was built without design for liquefaction, and the other is a model of a bridge which was designed based on the current Japanese design specifications for highway bridges. As a result, the bending strains of the piles of the abutment which was designed based on the current design specifications were less than those of the existing bridge.

  10. Flow stress model in metal cutting

    NASA Technical Reports Server (NTRS)

    Black, J. T.

    1978-01-01

    A model for the plastic deformation that occurs in metal cutting, based on dislocation mechanics, is presented. The model explains the fundamental deformation structure that develops during machining and is based on the well-known Cottrell-Stokes Law, wherein the flow stress is partitioned into two parts: an athermal part which occurs in the shear fronts (or shear bands), and a thermal part which occurs in the lamella regions. The deformation invokes the presence of a cellular dislocation distribution which always exists in the material ahead of the shear process. This 'alien' dislocation distribution either exists in the metal prior to cutting or is produced by the compressive stress field which operates in front of the shear process. The magnitude of the flow stress and the direction of the shear are shown to be correlated with the stacking fault energy of the metal being cut. The model is tested with respect to energy consumption rates and found to be consistent with observed values.
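
    Written out, the Cottrell-Stokes partition described above takes roughly the following form (a hedged reconstruction in assumed notation, not quoted from the paper):

    \tau = \tau_a + \tau^{*}(T, \dot{\gamma}), \qquad
    \frac{\tau^{*}}{\tau} \approx \mathrm{const.}

    where \tau_a is the athermal contribution carried by the shear fronts and \tau^{*} the thermally activated contribution arising in the lamella regions, with T the temperature and \dot{\gamma} the shear strain rate.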

  11. Using Data-Driven Model-Brain Mappings to Constrain Formal Models of Cognition

    PubMed Central

    Borst, Jelmer P.; Nijboer, Menno; Taatgen, Niels A.; van Rijn, Hedderik; Anderson, John R.

    2015-01-01

    In this paper we propose a method to create data-driven mappings from components of cognitive models to brain regions. Cognitive models are notoriously hard to evaluate, especially based on behavioral measures alone. Neuroimaging data can provide additional constraints, but this requires a mapping from model components to brain regions. Although such mappings can be based on the experience of the modeler or on a reading of the literature, a formal method is preferred to prevent researcher-based biases. In this paper we used model-based fMRI analysis to create a data-driven model-brain mapping for five modules of the ACT-R cognitive architecture. We then validated this mapping by applying it to two new datasets with associated models. The new mapping was at least as powerful as an existing mapping that was based on the literature, and indicated where the models were supported by the data and where they have to be improved. We conclude that data-driven model-brain mappings can provide strong constraints on cognitive models, and that model-based fMRI is a suitable way to create such mappings. PMID:25747601

  12. Subgrid-scale Condensation Modeling for Entropy-based Large Eddy Simulations of Clouds

    NASA Astrophysics Data System (ADS)

    Kaul, C. M.; Schneider, T.; Pressel, K. G.; Tan, Z.

    2015-12-01

    An entropy- and total water-based formulation of LES thermodynamics, such as that used by the recently developed code PyCLES, is advantageous from physical and numerical perspectives. However, existing closures for subgrid-scale thermodynamic fluctuations assume more traditional choices for prognostic thermodynamic variables, such as liquid potential temperature, and are not directly applicable to entropy-based modeling. Since entropy and total water are generally nonlinearly related to diagnosed quantities like temperature and condensate amounts, neglecting their small-scale variability can lead to bias in simulation results. Here we present the development of a subgrid-scale condensation model suitable for use with entropy-based thermodynamic formulations.

  13. ModFossa: A library for modeling ion channels using Python.

    PubMed

    Ferneyhough, Gareth B; Thibealut, Corey M; Dascalu, Sergiu M; Harris, Frederick C

    2016-06-01

    The creation and simulation of ion channel models using continuous-time Markov processes is a powerful and well-used tool in the field of electrophysiology and ion channel research. While several software packages exist for the purpose of ion channel modeling, most are GUI-based, and none are available as a Python library. In an attempt to provide an easy-to-use yet powerful Markov model-based ion channel simulator, we have developed ModFossa, a Python library supporting easy model creation and stimulus definition, complete with a fast numerical solver and attractive vector-graphics plotting.
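
    As a flavor of what such a library automates (a generic continuous-time Markov sketch with made-up rate constants, not ModFossa's API): a two-state channel C <-> O simulated with exponential dwell times, whose empirical open probability should approach k_open / (k_open + k_close).

    import numpy as np

    rng = np.random.default_rng(3)
    k_open, k_close = 50.0, 200.0      # transition rates, s^-1 (hypothetical)
    state, t, t_end = 0, 0.0, 0.1      # 0 = closed, 1 = open
    open_time = 0.0

    while t < t_end:
        rate = k_open if state == 0 else k_close
        dwell = rng.exponential(1.0 / rate)
        if state == 1:
            open_time += min(dwell, t_end - t)
        t += dwell
        state = 1 - state

    print("open probability ~ %.2f (theory %.2f)"
          % (open_time / t_end, k_open / (k_open + k_close)))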

  14. Model-Based Fatigue Prognosis of Fiber-Reinforced Laminates Exhibiting Concurrent Damage Mechanisms

    NASA Technical Reports Server (NTRS)

    Corbetta, M.; Sbarufatti, C.; Saxena, A.; Giglio, M.; Goebel, K.

    2016-01-01

    Prognostics of large composite structures is a topic of increasing interest in the field of structural health monitoring for aerospace, civil, and mechanical systems. Along with recent advancements in real-time structural health data acquisition and processing for damage detection and characterization, model-based stochastic methods for life prediction are showing promising results in the literature. Among various model-based approaches, particle-filtering algorithms are particularly capable of coping with uncertainties associated with the process. These include uncertainties about information on the damage extent and the inherent uncertainties of the damage propagation process. Some efforts have shown successful applications of particle filtering-based frameworks for predicting the matrix crack evolution and structural stiffness degradation caused by repetitive fatigue loads. Effects of other damage modes such as delamination, however, are not incorporated in these works. It is well established that delamination and matrix cracks not only co-exist in most laminate structures during the fatigue degradation process but also affect each other's progression. Furthermore, delamination significantly alters the stress state in the laminates and accelerates the material degradation leading to catastrophic failure. Therefore, the work presented herein proposes a particle filtering-based framework for predicting a structure's remaining useful life with consideration of multiple co-existing damage mechanisms. The framework uses an energy-based model from the composite modeling literature. The multiple damage-mode model has been shown to suitably estimate the energy release rate of cross-ply laminates as affected by matrix cracks and delamination modes. The model is also able to estimate the reduction in stiffness of the damaged laminate. This information is then used in the algorithms for life prediction capabilities. First, a brief summary of the energy-based damage model is provided. Then, the paper describes how the model is embedded within the prognostic framework and how the prognostics performance is assessed using observations from run-to-failure experiments.
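
    A schematic of the particle-filtering core under stated assumptions: simple Paris-law crack growth stands in for the paper's multi-damage-mode energy model, and the observation values and noise levels are invented.

    import numpy as np

    rng = np.random.default_rng(4)
    n_p = 1000                                  # number of particles
    C, m, ds = 1e-10, 3.0, 80.0                 # Paris constants, stress range
    a = 1e-3 * rng.lognormal(0.0, 0.1, n_p)     # initial crack sizes, m

    def grow(a, cycles=30_000):                 # one load block of growth
        dK = ds * np.sqrt(np.pi * a)            # stress-intensity range
        return a + cycles * C * dK ** m * rng.lognormal(0.0, 0.05, n_p)

    for obs in (1.2e-3, 1.5e-3, 1.9e-3):        # noisy crack-length readings, m
        a = grow(a)
        w = np.exp(-0.5 * ((obs - a) / 1e-4) ** 2)   # Gaussian likelihood
        a = a[rng.choice(n_p, n_p, p=w / w.sum())]   # resample by weight

    print("posterior crack length: %.2e m (std %.1e)" % (a.mean(), a.std()))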

  15. Improved indexes for targeting placement of buffers of Hortonian runoff

    Treesearch

    M.G. Dosskey; Z. Qiu; M.J. Helmers; D.E. Eisenhauer

    2011-01-01

    Targeting specific locations within agricultural watersheds for installing vegetative buffers has been advocated as a way to enhance the impact of buffers and buffer programs on stream water quality. Existing models for targeting buffers of Hortonian, or infiltration-excess, runoff are not well developed. The objective was to improve on an existing soil survey–based...

  16. Distance measurement based on light field geometry and ray tracing.

    PubMed

    Chen, Yanqin; Jin, Xin; Dai, Qionghai

    2017-01-09

    In this paper, we propose a geometric optical model to measure the distances of object planes in a light field image. The proposed geometric optical model is composed of two sub-models based on ray tracing: an object space model and an image space model. The two theoretical sub-models are derived for on-axis point light sources. In the object space model, light rays propagate into the main lens and refract inside it following the law of refraction. In the image space model, light rays exit from emission positions on the main lens and subsequently impinge on the image sensor with different imaging diameters. The relationships between the imaging diameters of objects and their corresponding emission positions on the main lens are investigated using refocusing and the similar-triangle principle. By combining the two sub-models and tracing light rays back to the object space, the relationships between objects' imaging diameters and the corresponding distances of object planes are derived. The performance of the proposed geometric optical model is compared with existing approaches using different configurations of hand-held plenoptic 1.0 cameras, and real experiments are conducted using a preliminary imaging system. Results demonstrate that the proposed model can outperform existing approaches in terms of accuracy and exhibits good performance over a general imaging range.

  17. A remote sensing based vegetation classification logic for global land cover analysis

    USGS Publications Warehouse

    Running, Steven W.; Loveland, Thomas R.; Pierce, Lars L.; Nemani, R.R.; Hunt, E. Raymond

    1995-01-01

    This article proposes a simple new logic for classifying global vegetation. The critical features of this classification are that 1) it is based on simple, observable, unambiguous characteristics of vegetation structure that are important to ecosystem biogeochemistry and can be measured in the field for validation, 2) the structural characteristics are remotely sensible so that repeatable and efficient global reclassifications of existing vegetation will be possible, and 3) the defined vegetation classes directly translate into the biophysical parameters of interest by global climate and biogeochemical models. A first test of this logic for the continental United States is presented based on an existing 1 km AVHRR normalized difference vegetation index database. Procedures for solving critical remote sensing problems needed to implement the classification are discussed. Also, some inferences from this classification to advanced vegetation biophysical variables such as specific leaf area and photosynthetic capacity useful to global biogeochemical modeling are suggested.

  18. Potential of hydraulically induced fractures to communicate with existing wellbores

    NASA Astrophysics Data System (ADS)

    Montague, James A.; Pinder, George F.

    2015-10-01

    The probability that new hydraulically fractured wells drilled within the area of New York underlain by the Marcellus Shale will intersect an existing wellbore is calculated using a statistical model, which incorporates the depth of a new fracturing well, the vertical growth of induced fractures, and the depths and locations of existing nearby wells. The model first calculates the probability of encountering an existing well in plan view and combines this with the probability of an existing well being at sufficient depth to intersect the fractured region. Average probability estimates for the entire region of New York underlain by the Marcellus Shale range from 0.00% to 3.45% based upon the input parameters used. The largest contributing parameter to the calculated probability value is the density of nearby wells, meaning that due diligence by oil and gas companies during construction in identifying all nearby wells will have the greatest effect in reducing the probability of interwellbore communication.
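
    The decomposition can be mimicked with a small Monte Carlo sketch (all parameter values below are invented, and the two factors are treated as independent): the intersection probability is the plan-view encounter probability combined with the probability that an existing well reaches the fractured interval.

    import numpy as np

    rng = np.random.default_rng(5)
    n, wells = 100_000, 3              # trials, existing wells per block
    reach = 150.0                      # lateral fracture reach, m

    # plan view: wells scattered uniformly in a 2 km x 2 km block
    xy = rng.uniform(-1000.0, 1000.0, (n, wells, 2))
    hit_plan = (np.linalg.norm(xy, axis=2) < reach).any(axis=1)

    # depth: an old well must reach down to the upward-growing fractures
    frac_top = rng.normal(1500.0, 100.0, n)            # m below surface
    old_depth = rng.lognormal(np.log(800.0), 0.5, n)   # m below surface
    hit_depth = old_depth > frac_top

    print("P(plan) %.4f  P(depth) %.4f  P(both) %.5f"
          % (hit_plan.mean(), hit_depth.mean(), (hit_plan & hit_depth).mean()))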

  19. Critical Issues and Key Points from the Survey to the Creation of the Historical Building Information Model: the Case of Santo Stefano Basilica

    NASA Astrophysics Data System (ADS)

    Castagnetti, C.; Dubbini, M.; Ricci, P. C.; Rivola, R.; Giannini, M.; Capra, A.

    2017-05-01

    The new era of designing in architecture and civil engineering applications lies in the Building Information Modeling (BIM) approach, based on a 3D geometric model that includes a 3D database. This is easier for new constructions whereas, when dealing with existing buildings, the creation of the BIM is based on accurate knowledge of the as-built construction. Such knowledge is provided by a 3D survey, often carried out with laser scanning technology or modern photogrammetry, which are able to guarantee an adequate point cloud in terms of resolution and completeness by balancing time consumption and costs against the required final accuracy. The BIM approach for existing buildings, and even more for historical buildings, is not yet a well-known and deeply discussed process. There are still several choices to be addressed in the process from the survey to the model and critical issues to be discussed in the modeling step, particularly when dealing with unconventional elements such as deformed geometries or historical elements. The paper describes a comprehensive workflow that goes through the survey and the modeling, focusing on critical issues and key points to obtain a reliable BIM of an existing monument. The case study employed to illustrate the workflow is the Basilica of St. Stefano in Bologna (Italy), a large monumental complex with great religious, historical and architectural assets.

  20. From translational research to open technology innovation systems.

    PubMed

    Savory, Clive; Fortune, Joyce

    2015-01-01

    The purpose of this paper is to question whether the emphasis placed within translational research on a linear model of innovation provides the most effective model for managing health technology innovation. Several alternative perspectives are presented that have the potential to enhance the existing model of translational research. A case study is presented of the innovation of a clinical decision support system. The paper concludes from the case study that extending the triple helix model of technology transfer to one based on a quadruple helix presents a basis for improving the performance of translational research. A case study approach is used to help understand the development of an innovative technology within a teaching hospital. The case is then used to develop and refine a model of the health technology innovation system. The paper concludes from the case study that existing models of translational research could be refined further through the development of a quadruple helix model of health technology innovation that encompasses greater emphasis on user-led and open innovation perspectives. The paper presents several implications for future research based on the need to enhance the model of health technology innovation used to guide policy and practice. The quadruple helix model of innovation that is proposed can potentially guide alterations to the existing model of translational research in the healthcare sector. Several suggestions are made for how innovation activity can be better supported at both a policy and an operational level. This paper presents a synthesis of the innovation literature applied to a theoretically important case of open innovation in the UK National Health Service. It draws in perspectives from other industrial sectors and applies them specifically to the management and organisation of innovation activities around health technology and the services in which they are embedded.

  1. Detailed Primitive-Based 3d Modeling of Architectural Elements

    NASA Astrophysics Data System (ADS)

    Remondino, F.; Lo Buglio, D.; Nony, N.; De Luca, L.

    2012-07-01

    The article describes a pipeline, based on image data, for the 3D reconstruction of building façades or architectural elements and the successive modeling using geometric primitives. The approach overcomes some existing problems in modeling architectural elements and delivers efficient-in-size reality-based textured 3D models useful for metric applications. For the 3D reconstruction, an open-source pipeline developed within the TAPENADE project is employed. In the successive modeling steps, the user manually selects an area containing an architectural element (capital, column, bas-relief, window tympanum, etc.) and then the procedure fits geometric primitives and computes disparity and displacement maps in order to tie visual and geometric information together in a light but detailed 3D model. Examples are reported and commented.

  2. Comparing biomarkers as principal surrogate endpoints.

    PubMed

    Huang, Ying; Gilbert, Peter B

    2011-12-01

    Recently a new definition of surrogate endpoint, the "principal surrogate," was proposed based on causal associations between treatment effects on the biomarker and on the clinical endpoint. Despite its appealing interpretation, limited research has been conducted to evaluate principal surrogates, and existing methods focus on risk models that consider a single biomarker. How to compare principal surrogate value of biomarkers or general risk models that consider multiple biomarkers remains an open research question. We propose to characterize a marker or risk model's principal surrogate value based on the distribution of risk difference between interventions. In addition, we propose a novel summary measure (the standardized total gain) that can be used to compare markers and to assess the incremental value of a new marker. We develop a semiparametric estimated-likelihood method to estimate the joint surrogate value of multiple biomarkers. This method accommodates two-phase sampling of biomarkers and is more widely applicable than existing nonparametric methods by incorporating continuous baseline covariates to predict the biomarker(s), and is more robust than existing parametric methods by leaving the error distribution of markers unspecified. The methodology is illustrated using a simulated example set and a real data set in the context of HIV vaccine trials. © 2011, The International Biometric Society.

  3. Analytic proof of the existence of the Lorenz attractor in the extended Lorenz model

    NASA Astrophysics Data System (ADS)

    Ovsyannikov, I. I.; Turaev, D. V.

    2017-01-01

    We give an analytic (free of computer assistance) proof of the existence of a classical Lorenz attractor for an open set of parameter values of the Lorenz model in the form of Yudovich-Morioka-Shimizu. The proof is based on detection of a homoclinic butterfly with a zero saddle value and rigorous verification of one of the Shilnikov criteria for the birth of the Lorenz attractor; we also supply a proof for this criterion. The results are applied in order to give an analytic proof for the existence of a robust, pseudohyperbolic strange attractor (the so-called discrete Lorenz attractor) for an open set of parameter values in a 4-parameter family of 3D Henon-like diffeomorphisms.
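
    For orientation, the classical Lorenz system reads as follows (the paper itself works with the extended Yudovich-Morioka-Shimizu form, which is not reproduced here):

    \dot{x} = \sigma (y - x), \qquad
    \dot{y} = x (r - z) - y, \qquad
    \dot{z} = x y - b z .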

  4. OPUS One: An Intelligent Adaptive Learning Environment Using Artificial Intelligence Support

    NASA Astrophysics Data System (ADS)

    Pedrazzoli, Attilio

    2010-06-01

    AI-based Tutoring and Learning Path Adaptation are well-known concepts in e-Learning scenarios today and are increasingly applied in modern learning environments. In order to gain more flexibility and to enhance existing e-learning platforms, the OPUS One LMS Extension package will enable a generic Intelligent Tutored Adaptive Learning Environment, based on a holistic Multidimensional Instructional Design Model (PENTHA ID Model), adding AI-based tutoring and adaptation functionality to existing Web-based e-learning systems. Relying on 'real-time' adapted profiles, it allows content/course authors to apply a dynamic course design, supporting tutored, collaborative sessions and activities, as suggested by modern pedagogy. The concept presented combines a personalized level of surveillance with learning-activity and learning-path adaptation suggestions to ensure the student's learning motivation and learning success. The OPUS One concept allows the implementation of an advanced tutoring approach combining 'expert-based' e-tutoring with the more 'personal' human tutoring function. It supplies the 'Human Tutor' with precise, extended course-activity data and 'adaptation' suggestions based on predefined subject-matter rules. The concept architecture is modular, allowing a personalized platform configuration.

  5. Enhancing and Adapting Treatment Foster Care: Lessons Learned in Trying to Change Practice.

    PubMed

    Murray, Maureen M; Southerland, Dannia; Farmer, Elizabeth M; Ballentine, Kess

    2010-01-01

    Evidence-based practices to improve outcomes for children with severe behavioral and emotional problems have received a great deal of attention in children's mental health. Therapeutic Foster Care (TFC), a residential intervention for youth with emotional or behavioral problems, is one of the few community-based programs that is considered to be evidence-based. However, as for most treatment approaches, the vast majority of existing programs do not deliver the evidence-based version. In an attempt to fill this gap and improve practice across a wide range of TFC agencies, we developed an enhanced model of TFC based on input from both practice and research. It includes elements associated with improved outcomes for youth in "usual care" TFC agencies as well as key elements from Chamberlain's evidence-based model. The current manuscript describes this "hybrid" intervention - Together Facing the Challenge - and discusses key issues in implementation. We describe the sample and settings, highlight key implementation strategies, and provide "lessons learned" to help guide others who may wish to change practice in existing agencies.

  6. Weight and the Future of Space Flight Hardware Cost Modeling

    NASA Technical Reports Server (NTRS)

    Prince, Frank A.

    2003-01-01

    Weight has been used as the primary input variable for cost estimating almost as long as there have been parametric cost models. While there are good reasons for using weight, serious limitations exist. These limitations have been addressed by multi-variable equations and trend analysis in models such as NAFCOM, PRICE, and SEER; however, these models have not been able to address the significant time lags that can occur between the development of similar space flight hardware systems. These time lags make the cost analyst's job difficult because insufficient data exist to perform trend analysis, and the current set of parametric models is not well suited to accommodating process improvements in space flight hardware design, development, build and test. As a result, people of good faith can have serious disagreements over the cost of new systems. To address these shortcomings, new cost modeling approaches are needed. The most promising approach is process-based (sometimes called activity-based) costing. Developing process-based models will require a detailed understanding of the functions required to produce space flight hardware, combined with innovative approaches to estimating the necessary resources. Particularly challenging will be the lack of data at the process level. One method for developing a model is to combine notional algorithms with a discrete event simulation and model changes to the total cost as perturbations to the program are introduced. Despite these challenges, the potential benefits are such that efforts should be focused on developing process-based cost models.

  7. Identity-Based Motivation: Constraints and Opportunities in Consumer Research

    PubMed Central

    Shavitt, Sharon; Torelli, Carlos J.; Wong, Jimmy

    2009-01-01

    This commentary underscores the integrative nature of the identity-based motivation model (Oyserman, 2009). We situate the model within existing literatures in psychology and consumer behavior, and illustrate its novel elements with research examples. Special attention is devoted to: 1) how product- and brand-based affordances constrain identity-based motivation processes and 2) the mindsets and action tendencies that can be triggered by specific cultural identities in pursuit of consumer goals. Future opportunities are suggested for researching the antecedents of product meanings and relevant identities. PMID:20161045

  8. An optimum organizational structure for a large earth-orbiting multidisciplinary space base. Ph.D. Thesis - Fla. State Univ., 1973

    NASA Technical Reports Server (NTRS)

    Ragusa, J. M.

    1975-01-01

    An optimum hypothetical organizational structure was studied for a large earth-orbiting, multidisciplinary research and applications space base manned by a crew of technologists. Because such a facility does not presently exist, in situ empirical testing was not possible. Study activity was, therefore, concerned with the identification of a desired organizational structural model rather than with the empirical testing of the model. The essential finding of this research was that a four-level project type total matrix model will optimize the efficiency and effectiveness of space base technologists.

  9. The virtual enhancements - solar proton event radiation (VESPER) model

    NASA Astrophysics Data System (ADS)

    Aminalragia-Giamini, Sigiava; Sandberg, Ingmar; Papadimitriou, Constantinos; Daglis, Ioannis A.; Jiggens, Piers

    2018-02-01

    A new probabilistic model introducing a novel paradigm for the modelling of the solar proton environment at 1 AU is presented. The Virtual Enhancements - Solar Proton Event Radiation model (VESPER) uses the European Space Agency's Solar Energetic Particle Environment Modelling (SEPEM) Reference Dataset and produces virtual time-series of proton differential fluxes. In this regard it fundamentally diverges from the approach of existing SPE models, which are based on probabilistic descriptions of SPE macroscopic characteristics such as peak flux and cumulative fluence. It is shown that VESPER reproduces well the characteristics of the dataset it uses, and further comparisons with existing models are made with respect to their results. The production of time-series as the main output of the model opens a straightforward way to the calculation of solar proton radiation effects in terms of time-series and to pairing them with effects caused by trapped radiation and galactic cosmic rays.

  10. Extending enterprise architecture modelling with business goals and requirements

    NASA Astrophysics Data System (ADS)

    Engelsman, Wilco; Quartel, Dick; Jonkers, Henk; van Sinderen, Marten

    2011-02-01

    The methods for enterprise architecture (EA), such as The Open Group Architecture Framework, acknowledge the importance of requirements modelling in the development of EAs. Modelling support is needed to specify, document, communicate and reason about goals and requirements. The current modelling techniques for EA focus on the products, services, processes and applications of an enterprise. In addition, techniques may be provided to describe structured requirements lists and use cases. Little support is available however for modelling the underlying motivation of EAs in terms of stakeholder concerns and the high-level goals that address these concerns. This article describes a language that supports the modelling of this motivation. The definition of the language is based on existing work on high-level goal and requirements modelling and is aligned with an existing standard for enterprise modelling: the ArchiMate language. Furthermore, the article illustrates how EA can benefit from analysis techniques from the requirements engineering domain.

  11. Physically based estimation of soil water retention from textural data: General framework, new models, and streamlined existing models

    USGS Publications Warehouse

    Nimmo, J.R.; Herkelrath, W.N.; Laguna, Luna A.M.

    2007-01-01

    Numerous models are in widespread use for the estimation of soil water retention from more easily measured textural data. Improved models are needed for better prediction and wider applicability. We developed a basic framework from which new and existing models can be derived to facilitate improvements. Starting from the assumption that every particle has a characteristic dimension R associated uniquely with a matric pressure ψ and that the form of the ψ-R relation is the defining characteristic of each model, this framework leads to particular models by specification of geometric relationships between pores and particles. Typical assumptions are that particles are spheres, pores are cylinders with volume equal to the associated particle volume times the void ratio, and that the capillary inverse proportionality between radius and matric pressure is valid. Examples include fixed-pore-shape and fixed-pore-length models. We also developed alternative versions of the model of Arya and Paris that eliminate its interval-size dependence and other problems. The alternative models are calculable by direct application of algebraic formulas rather than manipulation of data tables and intermediate results, and they easily combine with other models (e.g., incorporating structural effects) that are formulated on a continuous basis. Additionally, we developed a family of models based on the same pore geometry as the widely used unsaturated hydraulic conductivity model of Mualem. Predictions of measurements for different suitable media show that some of the models provide consistently good results and can be chosen based on ease of calculations and other factors. © Soil Science Society of America. All rights reserved.
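
    As a concrete illustration, the sketch below implements one fixed-pore-shape model admissible under this framework: spherical particles, cylindrical pores of length 2R whose volume equals the particle volume times the void ratio e, and the capillary inverse proportionality between pore radius and matric pressure. The three texture classes and all constants are invented for the example, not taken from the paper.

```python
"""Minimal fixed-pore-shape water-retention sketch (illustrative)."""
import numpy as np

SIGMA = 0.072      # surface tension of water, N/m
RHO_G = 9810.0     # rho_w * g, N/m^3

def pore_radius(R, e):
    # pi r^2 (2R) = e * (4/3) pi R^3  =>  r = R * sqrt(2 e / 3)
    return R * np.sqrt(2.0 * e / 3.0)

def matric_suction(r):
    # capillary relation: psi = 2 sigma / (rho g r), metres of head
    return 2.0 * SIGMA / (RHO_G * r)

def retention_curve(radii, mass_frac, e):
    """Water content remaining once pores tied to particles of
    radius R_i have drained (largest pores drain first)."""
    porosity = e / (1.0 + e)
    order = np.argsort(radii)[::-1]
    r = pore_radius(np.asarray(radii)[order], e)
    psi = matric_suction(r)
    drained = np.cumsum(np.asarray(mass_frac)[order])
    theta = porosity * (1.0 - drained)   # volumetric water content
    return psi, theta

# toy three-class texture: sand, silt, clay representative radii (m)
psi, theta = retention_curve([5e-4, 2e-5, 1e-6], [0.5, 0.3, 0.2], e=0.8)
for p, t in zip(psi, theta):
    print(f"psi = {p:10.3f} m head   theta = {t:.3f}")
```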

  12. Validation and implementation of model based control strategies at an industrial wastewater treatment plant.

    PubMed

    Demey, D; Vanderhaegen, B; Vanhooren, H; Liessens, J; Van Eyck, L; Hopkins, L; Vanrolleghem, P A

    2001-01-01

    In this paper, the practical implementation and validation of advanced control strategies, designed using model based techniques, at an industrial wastewater treatment plant are demonstrated. The plant under study is treating the wastewater of a large pharmaceutical production facility. The process characteristics of the wastewater treatment were quantified by means of tracer tests, intensive measurement campaigns and the use of on-line sensors. In parallel, a dynamical model of the complete wastewater plant was developed according to the specific kinetic characteristics of the sludge and the highly varying composition of the industrial wastewater. Based on real-time data and dynamic models, control strategies for the equalisation system, the polymer dosing and phosphorus addition were established. The control strategies are being integrated into the existing SCADA system, combining traditional PLC technology with robust PC based control calculations. The use of intelligent control in wastewater treatment offers a wide spectrum of possibilities to upgrade existing plants, to increase the capacity of the plant and to eliminate peaks. This can result in a more stable and secure overall performance and, finally, in cost savings. The use of on-line sensors has a potential not only for monitoring concentrations, but also for manipulating flows and concentrations. This way the performance of the plant can be secured.

  13. Fostering Third-Grade Students' Use of Scientific Models with the Water Cycle: Elementary Teachers' Conceptions and Practices

    ERIC Educational Resources Information Center

    Vo, Tina; Forbes, Cory T.; Zangori, Laura; Schwarz, Christina V.

    2015-01-01

    Elementary teachers play a crucial role in supporting and scaffolding students' model-based reasoning about natural phenomena, particularly complex systems such as the water cycle. However, little research exists to inform efforts in supporting elementary teachers' learning to foster model-centered, science learning environments. To address this…

  14. A model for estimating understory vegetation response to fertilization and precipitation in loblolly pine plantations

    Treesearch

    Curtis L. VanderSchaaf; Ryan W. McKnight; Thomas R. Fox; H. Lee Allen

    2010-01-01

    A model form is presented in which regressors are selected for inclusion based on biological rationale, to predict how fertilization, precipitation amounts, and overstory stand density affect understory vegetation biomass. Due to time, economic, and logistic constraints, datasets of large sample sizes generally do not exist for understory vegetation. Thus...

  15. An Exploratory Study of the Elements to Develop a Coaching Model

    ERIC Educational Resources Information Center

    Brown, Gwendolyn

    2010-01-01

    This exploratory study examined the elements of a coaching model based on the best practices that first focus on providing managers with the ability to develop workers and increase productivity, before using existing models that only support the process of managing workers, when it becomes apparent that the worker is not meeting expected…

  16. Data-driven non-Markovian closure models

    NASA Astrophysics Data System (ADS)

    Kondrashov, Dmitri; Chekroun, Mickaël D.; Ghil, Michael

    2015-03-01

    This paper has two interrelated foci: (i) obtaining stable and efficient data-driven closure models by using a multivariate time series of partial observations from a large-dimensional system; and (ii) comparing these closure models with the optimal closures predicted by the Mori-Zwanzig (MZ) formalism of statistical physics. Multilayer stochastic models (MSMs) are introduced as both a generalization and a time-continuous limit of existing multilevel, regression-based approaches to closure in a data-driven setting; these approaches include empirical model reduction (EMR), as well as more recent multi-layer modeling. It is shown that the multilayer structure of MSMs can provide a natural Markov approximation to the generalized Langevin equation (GLE) of the MZ formalism. A simple correlation-based stopping criterion for an EMR-MSM model is derived to assess how well it approximates the GLE solution. Sufficient conditions are derived on the structure of the nonlinear cross-interactions between the constitutive layers of a given MSM to guarantee the existence of a global random attractor. This existence ensures that no blow-up can occur for a broad class of MSM applications, a class that includes non-polynomial predictors and nonlinearities that do not necessarily preserve quadratic energy invariants. The EMR-MSM methodology is first applied to a conceptual, nonlinear, stochastic climate model of coupled slow and fast variables, in which only slow variables are observed. It is shown that the resulting closure model with energy-conserving nonlinearities efficiently captures the main statistical features of the slow variables, even when there is no formal scale separation and the fast variables are quite energetic. Second, an MSM is shown to successfully reproduce the statistics of a partially observed, generalized Lotka-Volterra model of population dynamics in its chaotic regime. The challenges here include the rarity of strange attractors in the model's parameter space and the existence of multiple attractor basins with fractal boundaries. The positivity constraint on the solutions' components replaces here the quadratic-energy-preserving constraint of fluid-flow problems and it successfully prevents blow-up.
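
    The multilevel regression idea behind EMR-MSM can be sketched with a toy two-level fit: the main level regresses increments of the observed variable on a polynomial in that variable, a hidden level models the main-level residual as a linear function of the state and the residual, and a correlation check on the final residual serves as the stopping criterion. The synthetic series and the model orders below are invented for illustration.

```python
"""Toy two-level empirical-model-reduction (EMR-style) fit."""
import numpy as np

rng = np.random.default_rng(0)

# synthetic "partial observations": a nonlinear series with memory
n = 20000
x = np.zeros(n); r = np.zeros(n)
for k in range(n - 1):
    r[k + 1] = 0.9 * r[k] + 0.1 * rng.standard_normal()
    x[k + 1] = x[k] + 0.05 * (x[k] - x[k] ** 3) + r[k]

dx = np.diff(x)

# main level: dx_k ~ a0 + a1 x + a2 x^2 + a3 x^3
A = np.vander(x[:-1], 4, increasing=True)
coef_main, *_ = np.linalg.lstsq(A, dx, rcond=None)
res0 = dx - A @ coef_main

# hidden level: dr_k ~ b0 + b1 x + b2 r  (+ white noise)
dr = np.diff(res0)
B = np.column_stack([np.ones(len(dr)), x[:-2], res0[:-1]])
coef_hidden, *_ = np.linalg.lstsq(B, dr, rcond=None)
res1 = dr - B @ coef_hidden

# correlation-based stopping check: is the last residual ~white?
lag1 = np.corrcoef(res1[:-1], res1[1:])[0, 1]
print("main-level coefficients:  ", np.round(coef_main, 3))
print("hidden-level coefficients:", np.round(coef_hidden, 3))
print("lag-1 autocorrelation of final residual:", round(lag1, 3))
```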

  17. On Inertial Body Tracking in the Presence of Model Calibration Errors

    PubMed Central

    Miezal, Markus; Taetz, Bertram; Bleser, Gabriele

    2016-01-01

    In inertial body tracking, the human body is commonly represented as a biomechanical model consisting of rigid segments with known lengths and connecting joints. The model state is then estimated via sensor fusion methods based on data from attached inertial measurement units (IMUs). This requires the relative poses of the IMUs w.r.t. the segments—the IMU-to-segment calibrations, subsequently called I2S calibrations—to be known. Since calibration methods based on static poses, movements and manual measurements are still the most widely used, potentially large human-induced calibration errors have to be expected. This work compares three newly developed/adapted extended Kalman filter (EKF) and optimization-based sensor fusion methods with an existing EKF-based method w.r.t. their segment orientation estimation accuracy in the presence of model calibration errors with and without using magnetometer information. While the existing EKF-based method uses a segment-centered kinematic chain biomechanical model and a constant angular acceleration motion model, the newly developed/adapted methods are all based on a free segments model, where each segment is represented with six degrees of freedom in the global frame. Moreover, these methods differ in the assumed motion model (constant angular acceleration, constant angular velocity, inertial data as control input), the state representation (segment-centered, IMU-centered) and the estimation method (EKF, sliding window optimization). In addition to the free segments representation, the optimization-based method also represents each IMU with six degrees of freedom in the global frame. In the evaluation on simulated and real data from a three segment model (an arm), the optimization-based method showed the smallest mean errors, standard deviations and maximum errors throughout all tests. It also showed the lowest dependency on magnetometer information and motion agility. Moreover, it was insensitive w.r.t. I2S position and segment length errors in the tested ranges. Errors in the I2S orientations were, however, linearly propagated into the estimated segment orientations. In the absence of magnetic disturbances, severe model calibration errors and fast motion changes, the newly developed IMU centered EKF-based method yielded comparable results with lower computational complexity. PMID:27455266
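
    A heavily reduced planar analogue of this sensor-fusion setting is sketched below: a Kalman filter with the constant-angular-velocity motion model mentioned above estimates a single segment angle from a noisy absolute orientation measurement, carrying the angular rate as a latent state. One angle stands in for the 3D kinematic chain, and all noise levels are invented.

```python
"""Planar segment-angle tracking with a constant-angular-velocity
motion model. The model is linear, so the EKF equations reduce to
the plain Kalman-filter form."""
import numpy as np

rng = np.random.default_rng(2)
DT = 0.01                                # sample time, s

F = np.array([[1.0, DT], [0.0, 1.0]])    # state x = [angle, rate]
Qn = np.diag([1e-6, 1e-4])               # process noise
H = np.array([[1.0, 0.0]])               # measure the angle only
Rn = np.array([[0.05 ** 2]])             # measurement noise

x, P = np.zeros(2), np.eye(2)
theta_true, err = 0.3, []
for k in range(2000):
    omega_true = 2.0 * np.sin(0.005 * k)          # slowly varying rate
    theta_true += omega_true * DT
    z = theta_true + 0.05 * rng.standard_normal()

    x = F @ x                                     # predict
    P = F @ P @ F.T + Qn
    S = H @ P @ H.T + Rn                          # update
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (np.array([z]) - H @ x)
    P = (np.eye(2) - K @ H) @ P
    err.append(abs(x[0] - theta_true))

print(f"mean |angle error| over last 500 steps: {np.mean(err[-500:]):.4f} rad")
```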

  18. Global Existence Analysis of Cross-Diffusion Population Systems for Multiple Species

    NASA Astrophysics Data System (ADS)

    Chen, Xiuqing; Daus, Esther S.; Jüngel, Ansgar

    2018-02-01

    The existence of global-in-time weak solutions to reaction-cross-diffusion systems for an arbitrary number of competing population species is proved. The equations can be derived from an on-lattice random-walk model with general transition rates. In the case of linear transition rates, it extends the two-species population model of Shigesada, Kawasaki, and Teramoto. The equations are considered in a bounded domain with homogeneous Neumann boundary conditions. The existence proof is based on a refined entropy method and a new approximation scheme. Global existence follows under a detailed balance or weak cross-diffusion condition. The detailed balance condition is related to the symmetry of the mobility matrix, which mirrors Onsager's principle in thermodynamics. Under detailed balance (and without reaction) the entropy is nonincreasing in time, but counter-examples show that the entropy may increase initially if detailed balance does not hold.
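
    For reference, a standard form of the n-species SKT-type reaction-cross-diffusion system studied in this line of work can be written as below; the coefficients and the entropy functional are generic textbook notation, not expressions copied from the paper.

```latex
\begin{align*}
  \partial_t u_i \;-\; \Delta\Big( u_i \big( a_{i0} + \sum_{j=1}^n a_{ij} u_j \big) \Big)
    &= u_i \big( b_{i0} - \sum_{j=1}^n b_{ij} u_j \big)
    && \text{in } \Omega,\ t>0, \\
  \nabla \Big( u_i \big( a_{i0} + \sum_{j=1}^n a_{ij} u_j \big) \Big) \cdot \nu &= 0
    && \text{on } \partial\Omega .
\end{align*}
% Detailed balance asks for weights \pi_i > 0 with \pi_i a_{ij} = \pi_j a_{ji}
% (symmetry of the mobility matrix); under it, the entropy
%   H(u) = \sum_{i=1}^n \pi_i \int_\Omega u_i (\log u_i - 1)\, dx
% is nonincreasing in time in the absence of reaction terms.
```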

  19. GIS based model interfacing : incorporating existing software and new techniques into a streamlined interface package

    DOT National Transportation Integrated Search

    2000-01-01

    The ability to visualize data has grown immensely as the speed and functionality of Geographic Information Systems (GIS) have increased. Now, with modeling software and GIS, planners are able to view a prediction of the future traffic demands in thei...

  20. Chemical Transformation System: Cloud Based Cheminformatic Services to Support Integrated Environmental Modeling (proceedings)

    EPA Science Inventory

    Integrated Environmental Modeling (IEM) systems that account for the fate/transport of organics frequently require physicochemical properties as well as transformation products. A myriad of chemical property databases exist but these can be difficult to access and often do not co...

  1. RESIDUAL RISK ASSESSMENTS - FINAL RESIDUAL RISK ASSESSMENT FOR SECONDARY LEAD SMELTERS

    EPA Science Inventory

    This source category previously subjected to a technology-based standard will be examined to determine if health or ecological risks are significant enough to warrant further regulation for Secondary Lead Smelters. These assessments utilize existing models and databases to examin...

  2. Offshore marine constructions as propagators of moon jellyfish dispersal

    NASA Astrophysics Data System (ADS)

    Vodopivec, Martin; Peliz, Álvaro J.; Malej, Alenka

    2017-08-01

    We have studied the influence of offshore marine constructions on the moon jellyfish population in the Adriatic Sea, where the newly set up substrates enable the formation of a new population based in the formerly unpopulated open waters. Our five-year-long computer simulation uses a high-resolution coupled bio-physical individual-based model to track the dispersal of the offspring from subpopulations originating from offshore and shore-based sources. According to our study, the platforms enhance connectivity between subpopulations of jellyfish polyps, help sustain existing shore-based subpopulations, contribute to jellyfish blooms in some areas, and play an important role in establishing connection with the rest of the Mediterranean, in addition to representing substantial amounts of available substrate. This is an aspect that is usually overlooked when evaluating the ecological impact of existing and future wind farms, oil and gas platforms, etc. Our approach could serve as a role model in future studies of ecological impacts of planned offshore constructions.

  3. Adaptive estimation of state of charge and capacity with online identified battery model for vanadium redox flow battery

    NASA Astrophysics Data System (ADS)

    Wei, Zhongbao; Tseng, King Jet; Wai, Nyunt; Lim, Tuti Mariana; Skyllas-Kazacos, Maria

    2016-11-01

    Reliable state estimation depends largely on an accurate battery model. However, the parameters of a battery model vary over time with operating conditions and battery aging. Existing co-estimation methods address this model uncertainty by integrating online model identification with state estimation and have shown improved accuracy. However, cross interference may arise from the integrated framework and compromise numerical stability and accuracy. This paper therefore proposes decoupling model identification from state estimation to eliminate the possibility of cross interference. The model parameters are adapted online with the recursive least squares (RLS) method, based on which a novel joint estimator based on the extended Kalman filter (EKF) is formulated to estimate the state of charge (SOC) and capacity concurrently. The proposed joint estimator effectively compresses the filter order, which leads to substantial improvement in computational efficiency and numerical stability. Lab-scale experiments on a vanadium redox flow battery show that the proposed method is highly accurate, with good robustness to varying operating conditions and battery aging. The proposed method is further compared with existing methods and shown to be superior in terms of accuracy, convergence speed, and computational cost.
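
    The decoupled structure described above can be sketched as follows: an RLS loop identifies the parameters of a simple battery model online, and a separate joint EKF uses the identified resistance to estimate SOC and capacity. The first-order model V = OCV(SOC) - R*I, the linear OCV curve, and all noise levels are illustrative stand-ins, not the paper's battery model.

```python
"""Decoupled RLS identification + joint SOC/capacity EKF (sketch)."""
import numpy as np

def ocv(soc):                       # illustrative linear OCV curve
    return 1.30 + 0.25 * soc

DOCV, DT = 0.25, 1.0                # dOCV/dsoc, sample time (s)

# --- RLS for theta = [R, b] in  y = R*I + b ------------------------
theta = np.array([0.05, 0.0])
P_rls, LAM = np.eye(2) * 100.0, 0.999

def rls_step(I, y):
    global theta, P_rls
    phi = np.array([I, 1.0])
    k = P_rls @ phi / (LAM + phi @ P_rls @ phi)
    theta = theta + k * (y - phi @ theta)
    P_rls = (P_rls - np.outer(k, phi) @ P_rls) / LAM

# --- joint EKF for x = [soc, capacity Q] ---------------------------
x = np.array([0.5, 30000.0])
P = np.diag([0.01, 4.0e6])
QN, RN = np.diag([1e-9, 1e-2]), 1e-4

def ekf_step(I, V):
    global x, P
    soc, Q = x
    x_pred = np.array([soc - I * DT / Q, Q])        # coulomb counting
    F = np.array([[1.0, I * DT / Q ** 2], [0.0, 1.0]])
    P_pred = F @ P @ F.T + QN
    V_pred = ocv(x_pred[0]) - theta[0] * I          # uses RLS-found R
    H = np.array([DOCV, 0.0])
    S = H @ P_pred @ H + RN
    K = P_pred @ H / S
    x = x_pred + K * (V - V_pred)
    P = (np.eye(2) - np.outer(K, H)) @ P_pred

# varying-current discharge of a synthetic "true" cell
rng = np.random.default_rng(1)
soc_t, Q_t, R_t = 0.9, 36000.0, 0.08
for k in range(3000):
    I = 2.0 + 1.5 * np.sin(0.02 * k)                # current, A
    soc_t -= I * DT / Q_t
    V = ocv(soc_t) - R_t * I + 1e-3 * rng.standard_normal()
    rls_step(I, ocv(x[0]) - V)      # identify parameters first...
    ekf_step(I, V)                  # ...then estimate state (decoupled)

print(f"SOC estimate {x[0]:.3f} vs true {soc_t:.3f}; "
      f"R estimate {theta[0]:.3f} vs true {R_t}")
```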

  4. Economic Models of Preventive Dentistry for Australian Children and Adolescents: A Systematic Review.

    PubMed

    Tonmukayakul, Utsana; Sia, Kah-Ling; Gold, Lisa; Hegde, Shalika; de Silva, Andrea M; Moodie, Marj

    2015-01-01

    To identify economic evaluation models and parameters that could be replicated or adapted to construct a generic model to assess cost-effectiveness of and prioritise a wide range of community-based oral disease prevention programmes in an Australian context. The literature search was conducted using MEDLINE, ERIC, PsycINFO, CINAHL (EBSCOhost), EMBASE (Ovid), CRD, DARE, NHSEED, HTA, all databases in the Cochrane library, Scopus and ScienceDirect databases from their inception to November 2012. Thirty-three articles met the criteria for inclusion in this review (7 were Australian studies, 26 were international). Existing models focused primarily on dental caries. Models addressing periodontal disease, another common oral health problem, were lacking. Among caries prevention studies, there was an absence of clear evidence showing continuous benefits from primary through to permanent dentition and the long-term effects of oral health promotion. No generic model was identified from previous studies that could be immediately adopted or adapted for our purposes of simulating and prioritising a diverse range of oral health interventions for Australian children and adolescents. Nevertheless, data sources specified in the existing Australian-based models will be useful for developing a generic model for such purposes.

  5. Quantitative physiologically based modeling of subjective fatigue during sleep deprivation.

    PubMed

    Fulcher, B D; Phillips, A J K; Robinson, P A

    2010-05-21

    A quantitative physiologically based model of the sleep-wake switch is used to predict variations in subjective fatigue-related measures during total sleep deprivation. The model includes the mutual inhibition of the sleep-active neurons in the hypothalamic ventrolateral preoptic area (VLPO) and the wake-active monoaminergic brainstem populations (MA), as well as circadian and homeostatic drives. We simulate sleep deprivation by introducing a drive to the MA, which we call wake effort, to maintain the system in a wakeful state. Physiologically this drive is proposed to be afferent from the cortex or the orexin group of the lateral hypothalamus. It is hypothesized that the need to exert this effort to maintain wakefulness at high homeostatic sleep pressure correlates with subjective fatigue levels. The model's output indeed exhibits good agreement with existing clinical time series of subjective fatigue-related measures, supporting this hypothesis. Subjective fatigue, adrenaline, and body temperature variations during two 72h sleep deprivation protocols are reproduced by the model. By distinguishing a motivation-dependent orexinergic contribution to the wake-effort drive, the model can be extended to interpret variation in performance levels during sleep deprivation in a way that is qualitatively consistent with existing, clinically derived results. The example of sleep deprivation thus demonstrates the ability of physiologically based sleep modeling to predict psychological measures from the underlying physiological interactions that produce them. Copyright (c) 2010 Elsevier Ltd. All rights reserved.
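
    A toy version of the mechanism can be simulated directly: mutual inhibition between a sleep-active (VLPO-like) and a wake-active (MA-like) population with homeostatic and circadian drives, where sleep deprivation is imposed by clamping the wake-active potential at a wakeful level and the clamp magnitude is read out as the fatigue proxy. All parameter values below are invented; only the qualitative structure follows the model described.

```python
"""Toy mutual-inhibition sleep-wake switch with a wake-effort proxy."""
import numpy as np

def Q(V):                                  # normalized firing rate
    return 1.0 / (1.0 + np.exp(-V))

A, DM, MU = 6.0, 3.0, 8.0                  # coupling / drives (toy)
TAU, CHI = 0.05, 45.0                      # time constants, hours
DT, VM_WAKE = 2e-3, 2.0

def simulate(hours, deprive):
    Vv, Vm, H = -4.0, 3.0, 3.0             # start awake
    t, effort = 0.0, []
    for _ in range(int(hours / DT)):
        C = np.cos(2.0 * np.pi * (t - 16.0) / 24.0)   # circadian
        dVv = (-Vv - A * Q(Vm) + H - 1.5 * C) / TAU   # sleep-active
        dVm = (-Vm - A * Q(Vv) + DM) / TAU            # wake-active
        Vv += DT * dVv
        Vm += DT * dVm
        w = 0.0
        if deprive and Vm < VM_WAKE:       # would drift toward sleep:
            w = VM_WAKE - Vm               # push MA back to wakefulness
            Vm = VM_WAKE
        H += DT * (MU * Q(Vm) - H) / CHI   # homeostatic pressure
        effort.append(w)
        t += DT
    return np.array(effort)

effort = simulate(72.0, deprive=True)
print("mean wake effort per 12 h block (fatigue proxy):")
print(np.round(effort.reshape(6, -1).mean(axis=1), 4))
```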

  6. Modelling of induced electric fields based on incompletely known magnetic fields

    NASA Astrophysics Data System (ADS)

    Laakso, Ilkka; De Santis, Valerio; Cruciani, Silvano; Campi, Tommaso; Feliziani, Mauro

    2017-08-01

    Determining the induced electric fields in the human body is a fundamental problem in bioelectromagnetics that is important for both evaluation of safety of electromagnetic fields and medical applications. However, existing techniques for numerical modelling of induced electric fields require detailed information about the sources of the magnetic field, which may be unknown or difficult to model in realistic scenarios. Here, we show how induced electric fields can accurately be determined in the case where the magnetic fields are known only approximately, e.g. based on field measurements. The robustness of our approach is shown in numerical simulations for both idealized and realistic scenarios featuring a personalized MRI-based head model. The approach allows for modelling of the induced electric fields in biological bodies directly based on real-world magnetic field measurements.

  7. An object-based storage model for distributed remote sensing images

    NASA Astrophysics Data System (ADS)

    Yu, Zhanwu; Li, Zhongmin; Zheng, Sheng

    2006-10-01

    It is very difficult to design an integrated storage solution for distributed remote sensing images that offers high performance network storage services and secure data sharing across platforms using current network storage models such as direct attached storage, network attached storage and storage area network. Object-based storage, a new generation of network storage technology that has emerged recently, separates the data path, the control path and the management path, which solves the metadata bottleneck of traditional storage models, and has the characteristics of parallel data access, data sharing across platforms, intelligence of storage devices and security of data access. We use object-based storage in the storage management of remote sensing images to construct an object-based storage model for distributed remote sensing images. In the storage model, remote sensing images are organized as remote sensing objects stored in the object-based storage devices. According to the storage model, we present the architecture of a distributed remote sensing images application system based on object-based storage, and give some test results comparing the write performance of the traditional network storage model and the object-based storage model.

  8. Differential geometry based solvation model. III. Quantum formulation

    PubMed Central

    Chen, Zhan; Wei, Guo-Wei

    2011-01-01

    Solvation is of fundamental importance to biomolecular systems. Implicit solvent models, particularly those based on the Poisson-Boltzmann equation for electrostatic analysis, are established approaches for solvation analysis. However, ad hoc solvent-solute interfaces are commonly used in the implicit solvent theory. Recently, we have introduced differential geometry based solvation models which allow the solvent-solute interface to be determined by the variation of a total free energy functional. Atomic fixed partial charges (point charges) are used in our earlier models, which depend on existing molecular mechanical force field software packages for partial charge assignments. As most force field models are parameterized for a certain class of molecules or materials, the use of partial charges limits the accuracy and applicability of our earlier models. Moreover, fixed partial charges do not account for the charge rearrangement during the solvation process. The present work proposes a differential geometry based multiscale solvation model which makes use of the electron density computed directly from the quantum mechanical principle. To this end, we construct a new multiscale total energy functional which consists of not only polar and nonpolar solvation contributions, but also the electronic kinetic and potential energies. By using the Euler-Lagrange variation, we derive a system of three coupled governing equations, i.e., the generalized Poisson-Boltzmann equation for the electrostatic potential, the generalized Laplace-Beltrami equation for the solvent-solute boundary, and the Kohn-Sham equations for the electronic structure. We develop an iterative procedure to solve the three coupled equations and to minimize the solvation free energy. The present multiscale model is numerically validated for its stability, consistency and accuracy, and is applied to a few sets of molecules, including a case which is difficult for existing solvation models. Comparison is made to many other classic and quantum models. By using experimental data, we show that the present quantum formulation of our differential geometry based multiscale solvation model improves the prediction of our earlier models, and outperforms some explicit solvation models. PMID:22112067

  9. Application of natural analog studies to exploration for ore deposits

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gustafson, D.L.

    1995-09-01

    Natural analogs are viewed as similarities in nature and are routinely utilized by exploration geologists in their search for economic mineral deposits. Ore deposit modeling is undertaken by geologists to direct their exploration activities toward favorable geologic environments and, therefore, successful programs. Two types of modeling are presented: (i) empirical model development based on the study of known ore deposit characteristics, and (ii) concept model development based on theoretical considerations and field observations that suggest a new deposit type, not known to exist in nature, may exist and justifies an exploration program. Key elements that are important in empirical model development are described, and examples of successful applications of these natural analogs to exploration are presented. A classical example of successful concept model development, the discovery of the McLaughlin gold mine in California, is presented. The utilization of natural analogs is an important facet of mineral exploration. Natural analogs guide explorationists in their search for new discoveries, increase the probability of success, and may decrease overall exploration expenditure.

  10. Sequence memory based on coherent spin-interaction neural networks.

    PubMed

    Xia, Min; Wong, W K; Wang, Zhijie

    2014-12-01

    Sequence information processing, for instance, the sequence memory, plays an important role in many functions of the brain. In the workings of the human brain, the steady-state period is alterable. However, in the existing sequence memory models using heteroassociations, the steady-state period cannot be changed in the sequence recall. In this work, a novel neural network model for sequence memory with a controllable steady-state period based on coherent spin interaction is proposed. In the proposed model, neurons fire collectively in a phase-coherent manner, which lets a neuron group respond differently to different patterns and also lets different neuron groups respond differently to one pattern. The simulation results demonstrating the performance of the sequence memory are presented. By introducing a new coherent spin-interaction sequence memory model, the steady-state period can be controlled by dimension parameters and the overlap between the input pattern and the stored patterns. The sequence storage capacity is enlarged by coherent spin interaction compared with the existing sequence memory models.
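
    For contrast, the classical heteroassociative scheme that the proposed model generalizes can be written in a few lines: asymmetric Hebbian couplings map each stored pattern to its successor, so synchronous updates step through the sequence with a fixed steady-state period of one update per pattern, which is precisely the rigidity the coherent spin-interaction model is designed to overcome. Patterns and sizes below are arbitrary.

```python
"""Classical heteroassociative sequence memory (fixed period)."""
import numpy as np

rng = np.random.default_rng(5)
N, L = 400, 6                           # neurons, sequence length
xi = rng.choice([-1, 1], size=(L, N))   # random bipolar patterns

# asymmetric Hebbian couplings: pattern k points to pattern k+1 (cyclic)
W = sum(np.outer(xi[(k + 1) % L], xi[k]) for k in range(L)) / N

s = xi[0].astype(float).copy()
for step in range(8):
    s = np.sign(W @ s)
    s[s == 0] = 1.0                     # break ties deterministically
    overlaps = xi @ s / N               # overlap with each stored pattern
    print(f"step {step + 1}: closest pattern = {int(np.argmax(overlaps))}, "
          f"overlap = {overlaps.max():.2f}")
```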

  11. A simple, analytic 3-dimensional downburst model based on boundary layer stagnation flow

    NASA Technical Reports Server (NTRS)

    Oseguera, Rosa M.; Bowles, Roland L.

    1988-01-01

    A simple downburst model is developed for use in batch and real-time piloted simulation studies of guidance strategies for terminal area transport aircraft operations in wind shear conditions. The model represents an axisymmetric stagnation point flow, based on velocity profiles from the Terminal Area Simulation System (TASS) model developed by Proctor, and satisfies the mass continuity equation in cylindrical coordinates. Altitude dependence, including boundary layer effects near the ground, closely matches real-world measurements, as do the increase, peak, and decay of outflow and downflow with increasing distance from the downburst center. Equations for horizontal and vertical winds were derived and found to be infinitely differentiable, with no singular points in the flow field. In addition, a simple relationship exists among the ratio of maximum horizontal to vertical velocities, the downdraft radius, the depth of outflow, and the altitude of maximum outflow. In use, a microburst can be modeled by specifying four characteristic parameters; velocity components in the x, y and z directions and the corresponding nine partial derivatives are obtained easily from the velocity equations.
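
    The construction can be illustrated by building an axisymmetric wind field from a streamfunction, which satisfies mass continuity in cylindrical coordinates identically; the Gaussian radial shape and exponential boundary-layer profile below are stand-ins chosen for the sketch, not the exact profiles of the published model.

```python
"""Divergence-free axisymmetric downburst-like wind field (sketch)."""
import numpy as np

LAM, R0, DELTA = 0.2, 1000.0, 200.0   # strength (1/s), core radius, BL depth (m)

def winds(r, z):
    """Field from the streamfunction psi = -(LAM/2) G(r) F(z), with
    u_r = -(1/r) dpsi/dz and w = (1/r) dpsi/dr, so continuity holds
    by construction."""
    G = r ** 2 * np.exp(-(r / R0) ** 2)
    dG = (2.0 * r - 2.0 * r ** 3 / R0 ** 2) * np.exp(-(r / R0) ** 2)
    F = z + DELTA * np.exp(-z / DELTA) - DELTA     # boundary-layer shape
    dF = 1.0 - np.exp(-z / DELTA)                  # zero outflow at ground
    u_r = 0.5 * LAM * (G / r) * dF                 # radial outflow
    w = -0.5 * LAM * (dG / r) * F                  # downdraft core
    return u_r, w

# numerical check: (1/r) d(r u_r)/dr + dw/dz should vanish
r, z, h = 700.0, 150.0, 0.01
ur_p, _ = winds(r + h, z)
ur_m, _ = winds(r - h, z)
_, w_p = winds(r, z + h)
_, w_m = winds(r, z - h)
div = ((r + h) * ur_p - (r - h) * ur_m) / (2 * h * r) + (w_p - w_m) / (2 * h)
u_r, w = winds(r, z)
print(f"u_r = {u_r:.2f} m/s, w = {w:.2f} m/s, continuity residual = {div:.1e}")
```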

  12. Modeling and prediction of ionospheric scintillation

    NASA Technical Reports Server (NTRS)

    Fremouw, E. J.

    1974-01-01

    Scintillation modeling performed thus far is based on the theory of diffraction by a weakly modulating phase screen developed by Briggs and Parkin (1963). Shortcomings of the existing empirical model for the scintillation index are discussed together with questions of channel modeling, giving attention to the needs of the communication engineers. It is pointed out that much improved scintillation index models may be available in a matter of a year or so.

  13. Resilient Software Systems

    DTIC Science & Technology

    2015-06-01

    and tools, called model-integrated computing (MIC) [3], relies on the use of domain-specific modeling languages for creating models of the system to be... hence giving reflective capabilities to it. We have followed the MIC method here: we designed a domain-specific modeling language for modeling... are produced one-off and not for the mass market, the scope for price reduction based on the market demands is non-existent. Processes to create

  14. Updating Known Distribution Models for Forecasting Climate Change Impact on Endangered Species

    PubMed Central

    Muñoz, Antonio-Román; Márquez, Ana Luz; Real, Raimundo

    2013-01-01

    To plan endangered species conservation and to design adequate management programmes, it is necessary to predict their distributional response to climate change, especially under the current situation of rapid change. However, these predictions are customarily done by relating de novo the distribution of the species with climatic conditions with no regard of previously available knowledge about the factors affecting the species distribution. We propose to take advantage of known species distribution models, but proceeding to update them with the variables yielded by climatic models before projecting them to the future. To exemplify our proposal, the availability of suitable habitat across Spain for the endangered Bonelli's Eagle (Aquila fasciata) was modelled by updating a pre-existing model based on current climate and topography to a combination of different general circulation models and Special Report on Emissions Scenarios. Our results suggested that the main threat for this endangered species would not be climate change, since all forecasting models show that its distribution will be maintained and increased in mainland Spain for all the XXI century. We remark on the importance of linking conservation biology with distribution modelling by updating existing models, frequently available for endangered species, considering all the known factors conditioning the species' distribution, instead of building new models that are based on climate change variables only. PMID:23840330

  15. Updating known distribution models for forecasting climate change impact on endangered species.

    PubMed

    Muñoz, Antonio-Román; Márquez, Ana Luz; Real, Raimundo

    2013-01-01

    To plan endangered species conservation and to design adequate management programmes, it is necessary to predict their distributional response to climate change, especially under the current situation of rapid change. However, these predictions are customarily done by relating de novo the distribution of the species with climatic conditions with no regard of previously available knowledge about the factors affecting the species distribution. We propose to take advantage of known species distribution models, but proceeding to update them with the variables yielded by climatic models before projecting them to the future. To exemplify our proposal, the availability of suitable habitat across Spain for the endangered Bonelli's Eagle (Aquila fasciata) was modelled by updating a pre-existing model based on current climate and topography to a combination of different general circulation models and Special Report on Emissions Scenarios. Our results suggested that the main threat for this endangered species would not be climate change, since all forecasting models show that its distribution will be maintained and increased in mainland Spain for all the XXI century. We remark on the importance of linking conservation biology with distribution modelling by updating existing models, frequently available for endangered species, considering all the known factors conditioning the species' distribution, instead of building new models that are based on climate change variables only.

  16. Weather-based forecasts of California crop yields

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lobell, D B; Cahill, K N; Field, C B

    2005-09-26

    Crop yield forecasts provide useful information to a range of users. Yields for several crops in California are currently forecast based on field surveys and farmer interviews, while for many crops official forecasts do not exist. As broad-scale crop yields are largely dependent on weather, measurements from existing meteorological stations have the potential to provide a reliable, timely, and cost-effective means to anticipate crop yields. We developed weather-based models of state-wide yields for 12 major California crops (wine grapes, lettuce, almonds, strawberries, table grapes, hay, oranges, cotton, tomatoes, walnuts, avocados, and pistachios), and tested their accuracy using cross-validation over the 1980-2003 period. Many crops were forecast with high accuracy, as judged by the percent of yield variation explained by the forecast, the number of yields with correctly predicted direction of yield change, or the number of yields with correctly predicted extreme yields. The most successfully modeled crop was almonds, with 81% of yield variance captured by the forecast. Predictions for most crops relied on weather measurements well before harvest time, allowing for lead times that were longer than those of existing procedures in many cases.
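
    The study design translates directly into a small cross-validation exercise; the sketch below uses synthetic stand-ins for the station records and yields, with leave-one-out cross-validation and the two skill measures named above (variance explained and direction of yield change).

```python
"""Weather-based yield forecast with leave-one-out cross-validation."""
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(42)

# 24 seasons of pre-harvest predictors (synthetic stand-ins)
n = 24
X = np.column_stack([
    rng.normal(8.0, 1.2, n),      # spring Tmin, deg C
    rng.normal(24.0, 1.5, n),     # spring Tmax, deg C
    rng.normal(300.0, 60.0, n),   # Oct-Apr precipitation, mm
])
# synthetic "true" yield response plus noise (t/ha)
y = (2.0 + 0.15 * X[:, 0] - 0.05 * X[:, 1] + 0.004 * X[:, 2]
     + rng.normal(0.0, 0.15, n))

pred = cross_val_predict(LinearRegression(), X, y, cv=LeaveOneOut())

ss_res = np.sum((y - pred) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
direction = np.mean(np.sign(np.diff(pred)) == np.sign(np.diff(y)))
print(f"cross-validated variance explained: {1 - ss_res / ss_tot:.2f}")
print(f"fraction of year-to-year changes with correct sign: {direction:.2f}")
```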

  17. Development and validation of a Markov microsimulation model for the economic evaluation of treatments in osteoporosis.

    PubMed

    Hiligsmann, Mickaël; Ethgen, Olivier; Bruyère, Olivier; Richy, Florent; Gathon, Henry-Jean; Reginster, Jean-Yves

    2009-01-01

    Markov models are increasingly used in economic evaluations of treatments for osteoporosis. Most of the existing evaluations are cohort-based Markov models lacking comprehensive memory management and versatility. In this article, we describe and validate an original Markov microsimulation model to accurately assess the cost-effectiveness of prevention and treatment of osteoporosis. We developed a Markov microsimulation model with a lifetime horizon and a direct health-care cost perspective. The patient history was recorded and was used in calculations of transition probabilities, utilities, and costs. To test the internal consistency of the model, we carried out an example calculation for alendronate therapy. Then, external consistency was investigated by comparing absolute lifetime risk of fracture estimates with epidemiologic data. For women at age 70 years, with a twofold increase in the fracture risk of the average population, the costs per quality-adjusted life-year gained for alendronate therapy versus no treatment were estimated at €9,105 and €15,325, respectively, under full and realistic adherence assumptions. All the sensitivity analyses in terms of model parameters and modeling assumptions were coherent with expected conclusions, and absolute lifetime risk of fracture estimates were within the range of previous estimates, which confirmed both the internal and external consistency of the model. Microsimulation models present some major advantages over cohort-based models, increasing the reliability of the results and being largely compatible with the existing state-of-the-art, evidence-based literature. The developed model appears to be a valid model for use in economic evaluations in osteoporosis.
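
    The defining feature of the microsimulation approach, individual histories feeding back into transition probabilities, utilities, and costs, can be sketched in a few lines. All probabilities, costs, and utilities below are invented (lifetime therapy is assumed for simplicity), and the same random seed in both arms acts as common random numbers to stabilize the estimated increments.

```python
"""Markov microsimulation sketch with history-dependent transitions."""
import numpy as np

def p_fracture(age, prior, on_therapy):
    p = 0.005 * np.exp(0.07 * (age - 70))        # rises with age (toy)
    if prior:
        p *= 1.8                                 # history-dependent risk
    if on_therapy:
        p *= 0.6                                 # relative risk on therapy
    return min(p, 1.0)

def p_death(age):
    return min(0.01 * np.exp(0.09 * (age - 70)), 1.0)  # toy mortality

def simulate(treated, n=10000, start_age=70, seed=7):
    rng = np.random.default_rng(seed)   # same seed in both arms (CRN)
    cost = qaly = 0.0
    for _ in range(n):
        age, prior, disc = start_age, False, 1.0
        while age < 100:
            if treated:
                cost += 450.0 * disc             # annual drug cost
            if rng.random() < p_fracture(age, prior, treated):
                prior = True
                cost += 12000.0 * disc           # fracture cost
                qaly += 0.65 * disc              # fracture-year utility
            else:
                qaly += (0.75 if prior else 0.82) * disc
            if rng.random() < p_death(age):
                break
            age += 1
            disc /= 1.03                         # 3% annual discounting
    return cost / n, qaly / n

c0, q0 = simulate(treated=False)
c1, q1 = simulate(treated=True)
print(f"incremental cost-effectiveness ratio: "
      f"{(c1 - c0) / (q1 - q0):,.0f} per QALY gained")
```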

  18. A Recipe for implementing the Arrhenius-Shock-Temperature State Sensitive WSD (AWSD) model, with parameters for PBX 9502

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aslam, Tariq Dennis

    2017-10-03

    A reactive flow model for the tri-amino-tri-nitro-benzene (TATB) based plastic bonded explosive PBX 9502 is presented. This newly devised model is based primarily on the shock temperature of the material, along with local pressure, and accurately models a broader range of detonation and initiation scenarios. The equation of state for the reactants and products, as well as the thermodynamic closure of pressure and temperature equilibration, are carried over from the Wescott-Stewart-Davis (WSD) model [7,8]. Thus, modifying an existing WSD model in a hydrocode should be rather straightforward.

  19. Analytical approximation of the InGaZnO thin-film transistors surface potential

    NASA Astrophysics Data System (ADS)

    Colalongo, Luigi

    2016-10-01

    Surface-potential-based mathematical models are among the most accurate and physically based compact models of thin-film transistors, and in turn of indium gallium zinc oxide TFTs, available today. However, the need for iterative computation of the surface potential limits their computational efficiency and diffusion in CAD applications. The existing closed-form approximations of the surface potential are based on regional approximations and empirical smoothing functions that may not be accurate enough, in particular for modelling transconductances and transcapacitances. In this work we present an extremely accurate (in the range of nV) and computationally efficient non-iterative approximation of the surface potential that can serve as a basis for advanced surface-potential-based indium gallium zinc oxide TFT models.
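
    The computational cost the paper removes comes from solving an implicit surface-potential equation at every bias point. The sketch below does this with a bracketing root solver for a generic MOSFET-like implicit relation (a textbook stand-in, not the IGZO-TFT formulation); a closed-form model replaces this per-point iteration with a single analytic evaluation.

```python
"""Iterative surface-potential solve for a generic implicit relation."""
import numpy as np
from scipy.optimize import brentq

PHI_T = 0.0259      # thermal voltage, V
GAMMA = 0.5         # body factor, V^0.5 (illustrative)

def f(psi, vg):
    # implicit relation between gate voltage and surface potential
    charge = GAMMA ** 2 * PHI_T * (np.exp(psi / PHI_T) + psi / PHI_T - 1.0)
    return (vg - psi) ** 2 - charge

for vg in (0.2, 0.5, 1.0, 2.0):
    # f(0) > 0 and f(vg) < 0, so a root is bracketed in (0, vg)
    psi_s = brentq(f, 0.0, vg, args=(vg,))
    print(f"Vg = {vg:4.1f} V  ->  psi_s = {psi_s:.4f} V")
```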

  20. Modeling the interdependent network based on two-mode networks

    NASA Astrophysics Data System (ADS)

    An, Feng; Gao, Xiangyun; Guan, Jianhe; Huang, Shupei; Liu, Qian

    2017-10-01

    Among heterogeneous networks there exist obvious and close interdependent linkages. Unlike existing research, which focuses primarily on theoretical models of physical interdependent networks, we propose a two-layer interdependent network model based on two-mode networks to explore interdependent features in reality. Specifically, we construct a two-layer interdependent loan network and develop several dependent-feature indices. The model is shown to capture the loan dependence of listed companies based on loan behaviors and shared shareholders. Taking the Chinese debit and credit market as a case study, the main conclusions are: (1) only a few listed companies shoulder the main capital transmission (20% of listed companies account for almost 70% of the dependent degree). (2) Controlling these key listed companies would be more effective in preventing the spread of financial risks. (3) Identifying the companies with high betweenness centrality and controlling them could help monitor the spread of financial risk. (4) The capital transmission channels between Chinese financial listed companies and Chinese non-financial listed companies are relatively strong. However, under great demand pressure for capital transmission (70% of edges failed), the transmission channels constructed by debit and credit behavior eventually collapse.
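
    The two-mode construction can be sketched with a toy example: companies are linked in one layer when they share a lender and in the other when they share a major shareholder, and the layers are then combined to compute dependence degrees and betweenness. The edge lists below are invented; only the construction follows the description above.

```python
"""Two-layer interdependent network from two-mode (bipartite) data."""
import networkx as nx
from networkx.algorithms import bipartite

# two-mode data: company-bank loans and company-shareholder ties (toy)
loans = [("A", "bank1"), ("B", "bank1"), ("B", "bank2"), ("C", "bank2")]
shares = [("A", "fund1"), ("C", "fund1"), ("C", "fund2"), ("D", "fund2")]

companies = {"A", "B", "C", "D"}

def one_mode_layer(edges):
    g = nx.Graph(edges)
    g.add_nodes_from(companies)
    # project the bipartite graph onto the company node set
    return bipartite.weighted_projected_graph(g, companies)

loan_layer = one_mode_layer(loans)      # linked via shared banks
share_layer = one_mode_layer(shares)    # linked via shared funds

# a company's "dependence degree" aggregates both layers
for c in sorted(companies):
    dep = (loan_layer.degree(c, weight="weight")
           + share_layer.degree(c, weight="weight"))
    print(f"dependence degree of {c}: {dep}")

# betweenness in the combined view flags key capital transmitters
combined = nx.compose(loan_layer, share_layer)
print(nx.betweenness_centrality(combined))
```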

  1. Towards Open-World Person Re-Identification by One-Shot Group-Based Verification.

    PubMed

    Zheng, Wei-Shi; Gong, Shaogang; Xiang, Tao

    2016-03-01

    Solving the problem of matching people across non-overlapping multi-camera views, known as person re-identification (re-id), has received increasing interest in computer vision. In a real-world application scenario, a watch-list (gallery set) of a handful of known target people is provided with very few images (in many cases only a single shot) per target. Existing re-id methods are largely unsuitable to address this open-world re-id challenge because they are designed for (1) a closed-world scenario where the gallery and probe sets are assumed to contain exactly the same people, (2) person-wise identification whereby the model attempts to verify exhaustively against each individual in the gallery set, and (3) learning a matching model using multi-shots. In this paper, a novel transfer local relative distance comparison (t-LRDC) model is formulated to address the open-world person re-identification problem by one-shot group-based verification. The model is designed to mine and transfer useful information from a labelled open-world non-target dataset. Extensive experiments demonstrate that the proposed approach outperforms both non-transfer learning and existing transfer learning based re-id methods.

  2. Catchment-scale modeling of nitrogen dynamics in a temperate forested watershed, Oregon. An interdisciplinary communication strategy.

    Treesearch

    Kellie Vache; Lutz Breuer; Julia Jones; Phil Sollins

    2015-01-01

    We present a systems modeling approach to the development of a place-based ecohydrological model. The conceptual model is calibrated to a variety of existing observations, taken in watershed 10 (WS10) at the HJ Andrews Experimental Forest (HJA) in Oregon, USA, a long term ecological research (LTER) site with a long history of catchment-...

  3. Allometric Equations for Aboveground and Belowground Biomass Estimations in an Evergreen Forest in Vietnam.

    PubMed

    Nam, Vu Thanh; van Kuijk, Marijke; Anten, Niels P R

    2016-01-01

    Allometric regression models are widely used to estimate tropical forest biomass, but balancing model accuracy with efficiency of implementation remains a major challenge. In addition, while numerous models exist for aboveground mass, very few exist for roots. We developed allometric equations for aboveground biomass (AGB) and root biomass (RB) based on 300 (of 45 species) and 40 (of 25 species) sample trees respectively, in an evergreen forest in Vietnam. The biomass estimations from these local models were compared to regional and pan-tropical models. For AGB we also compared local models that distinguish functional types to an aggregated model, to assess the degree of specificity needed in local models. Besides diameter at breast height (DBH) and tree height (H), wood density (WD) was found to be an important parameter in AGB models. Existing pan-tropical models resulted in up to 27% higher estimates of AGB, and overestimated RB by nearly 150%, indicating the greater accuracy of local models at the plot level. Our functional group aggregated local model which combined data for all species, was as accurate in estimating AGB as functional type specific models, indicating that a local aggregated model is the best choice for predicting plot level AGB in tropical forests. Finally our study presents the first allometric biomass models for aboveground and root biomass in forests in Vietnam.
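
    The fitting step behind such equations is the standard log-log regression on the covariates named above (DBH, H, WD). The sketch below uses invented sample data and adds the usual back-transformation correction; it illustrates the method, not the paper's fitted coefficients.

```python
"""Local allometric model: ln(AGB) = a + b ln(DBH) + c ln(H) + d ln(WD)."""
import numpy as np

rng = np.random.default_rng(3)
n = 300
dbh = rng.uniform(5, 80, n)                          # cm
h = 1.5 * dbh ** 0.7 * rng.lognormal(0, 0.05, n)     # m, tied to DBH
wd = rng.uniform(0.4, 0.8, n)                        # g/cm^3
# synthetic "true" biomass (kg) with lognormal error
agb = 0.06 * wd * (dbh ** 2 * h) ** 0.95 * rng.lognormal(0, 0.2, n)

X = np.column_stack([np.ones(n), np.log(dbh), np.log(h), np.log(wd)])
coef, *_ = np.linalg.lstsq(X, np.log(agb), rcond=None)
a, b, c, d = coef
print(f"ln(AGB) = {a:.2f} + {b:.2f} ln(DBH) + {c:.2f} ln(H) + {d:.2f} ln(WD)")

# back-transformation to the arithmetic scale needs a correction
# factor for the lognormal error (Baskerville correction)
resid = np.log(agb) - X @ coef
cf = np.exp(resid.var() / 2.0)
print(f"back-transformation correction factor: {cf:.3f}")
```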

  4. Allometric Equations for Aboveground and Belowground Biomass Estimations in an Evergreen Forest in Vietnam

    PubMed Central

    Nam, Vu Thanh; van Kuijk, Marijke; Anten, Niels P. R.

    2016-01-01

    Allometric regression models are widely used to estimate tropical forest biomass, but balancing model accuracy with efficiency of implementation remains a major challenge. In addition, while numerous models exist for aboveground mass, very few exist for roots. We developed allometric equations for aboveground biomass (AGB) and root biomass (RB) based on 300 (of 45 species) and 40 (of 25 species) sample trees respectively, in an evergreen forest in Vietnam. The biomass estimations from these local models were compared to regional and pan-tropical models. For AGB we also compared local models that distinguish functional types to an aggregated model, to assess the degree of specificity needed in local models. Besides diameter at breast height (DBH) and tree height (H), wood density (WD) was found to be an important parameter in AGB models. Existing pan-tropical models resulted in up to 27% higher estimates of AGB, and overestimated RB by nearly 150%, indicating the greater accuracy of local models at the plot level. Our functional group aggregated local model which combined data for all species, was as accurate in estimating AGB as functional type specific models, indicating that a local aggregated model is the best choice for predicting plot level AGB in tropical forests. Finally our study presents the first allometric biomass models for aboveground and root biomass in forests in Vietnam. PMID:27309718

  5. A New Framework for Cumulus Parametrization - A CPT in action

    NASA Astrophysics Data System (ADS)

    Jakob, C.; Peters, K.; Protat, A.; Kumar, V.

    2016-12-01

    The representation of convection in climate models remains a major Achilles heel in our pursuit of better predictions of global and regional climate. The basic principle underpinning the parametrisation of tropical convection in global weather and climate models is that there exist discernible interactions between the resolved model scale and the parametrised cumulus scale. Furthermore, there must be at least some predictive power in the larger scales for the statistical behaviour on small scales for us to be able to formally close the parametrised equations. The presentation will discuss a new framework for cumulus parametrisation based on the idea of separating the prediction of cloud area from that of velocity. This idea is put into practice by combining an existing multi-scale stochastic cloud model with observations to arrive at the prediction of the area fraction for deep precipitating convection. Using mid-tropospheric humidity and vertical motion as predictors, the model is shown to reproduce the observed behaviour of both the mean and the variability of deep convective area fraction well. The framework allows for the inclusion of convective organisation and can - in principle - be made resolution-aware or resolution-independent. When combined with simple assumptions about cloud-base vertical motion, the model can be used as a closure assumption in any existing cumulus parametrisation. Results of applying this idea in the ECHAM model indicate significant improvements in the simulation of tropical variability, including but not limited to the MJO. This presentation will highlight how the close collaboration of the observational, theoretical and model development communities in the spirit of the climate process teams can lead to significant progress in long-standing issues in climate modelling while preserving the freedom of individual groups in pursuing their specific implementation of an agreed framework.

  6. Models and theories of prescribing decisions: A review and suggested a new model

    PubMed Central

    Mohaidin, Zurina

    2017-01-01

    To date, research on the prescribing decisions of physicians lacks sound theoretical foundations. In fact, drug prescribing by doctors is a complex phenomenon influenced by various factors. Most of the existing studies in the area of drug prescription explain the process of decision-making by physicians via an exploratory approach rather than a theoretical one. Therefore, this review is an attempt to suggest a conceptual model that explains the theoretical linkages existing between marketing efforts, the patient, the pharmacist and the physician's decision to prescribe a drug. The paper follows an inclusive review approach and applies previous theoretical models of prescribing behaviour to identify the relational factors. More specifically, the report identifies and uses several valuable perspectives such as the 'persuasion theory - elaboration likelihood model', the 'stimulus-response marketing model', the 'agency theory', the 'theory of planned behaviour' and 'social power theory' in developing an innovative conceptual paradigm. Based on the combination of existing methods and previous models, this paper suggests a new conceptual model of the physician decision-making process. This unique model has the potential for use in further research. PMID:28690701

  7. Action Reflected and Project Based Combined Methodology for the Appropriate Comprehension of Mechanisms in Industrial Design Education

    ERIC Educational Resources Information Center

    Yavuzcan, H. Güçlü; Sahin, Damla

    2017-01-01

    In industrial design (ID) education, mechanics-based courses rely mainly on a traditional lecture approach and are highly abstract for ID students to comprehend. Existing studies highlight the need for a new approach to mechanics-based courses in ID departments. This study presents a combined teaching model for mechanisms…

  8. Model-Driven Architecture for Agent-Based Systems

    NASA Technical Reports Server (NTRS)

    Gradanin, Denis; Singh, H. Lally; Bohner, Shawn A.; Hinchey, Michael G.

    2004-01-01

    The Model Driven Architecture (MDA) approach uses a platform-independent model to define system functionality, or requirements, using some specification language. The requirements are then translated to a platform-specific model for implementation. An agent architecture based on the human cognitive model of planning, the Cognitive Agent Architecture (Cougaar), is selected as the implementation platform. The resulting Cougaar MDA prescribes certain kinds of models to be used, how those models may be prepared, and the relationships of the different kinds of models. Using the existing Cougaar architecture, the level of application composition is elevated from individual components to domain-level model specifications in order to generate software artifacts. The software artifact generation is based on a metamodel. Each component maps to a UML structured component, which is then converted into multiple artifacts: Cougaar/Java code, documentation, and test cases.

  9. Spatiotemporal patterns of terrestrial gross primary production: A review

    NASA Astrophysics Data System (ADS)

    Anav, Alessandro; Friedlingstein, Pierre; Beer, Christian; Ciais, Philippe; Harper, Anna; Jones, Chris; Murray-Tortarolo, Guillermo; Papale, Dario; Parazoo, Nicholas C.; Peylin, Philippe; Piao, Shilong; Sitch, Stephen; Viovy, Nicolas; Wiltshire, Andy; Zhao, Maosheng

    2015-09-01

    Great advances have been made in the last decade in quantifying and understanding the spatiotemporal patterns of terrestrial gross primary production (GPP) with ground, atmospheric, and space observations. However, although global GPP estimates exist, each data set relies upon assumptions and none of the available data are based only on measurements. Consequently, there is no consensus on the global total GPP and large uncertainties exist in its benchmarking. The objective of this review is to assess how the different available data sets predict the spatiotemporal patterns of GPP, identify the differences among data sets, and highlight the main advantages/disadvantages of each data set. We compare GPP estimates for the historical period (1990-2009) from two observation-based data sets (Model Tree Ensemble and Moderate Resolution Imaging Spectroradiometer) to coupled carbon-climate models and terrestrial carbon cycle models from the Fifth Climate Model Intercomparison Project and TRENDY projects and to a new hybrid data set (CARBONES). Results show a large range in the mean global GPP estimates. The different data sets broadly agree on the GPP seasonal cycle in terms of phasing, while there is still discrepancy in the amplitude. For interannual variability (IAV) and trends, there is a clear separation between the observation-based data, which show little IAV and trend, and the process-based models, which have large GPP variability and significant trends. These results suggest that there is an urgent need to improve observation-based data sets and develop carbon cycle modeling with processes that are currently treated very simplistically, in order to correctly estimate present GPP and better quantify the future uptake of carbon dioxide by the world's vegetation.

  10. Matching Aerial Images to 3D Building Models Using Context-Based Geometric Hashing

    PubMed Central

    Jung, Jaewook; Sohn, Gunho; Bang, Kiin; Wichmann, Andreas; Armenakis, Costas; Kada, Martin

    2016-01-01

    A city is a dynamic entity whose environment is continuously changing over time. Accordingly, its virtual city models also need to be regularly updated to support accurate model-based decisions for various applications, including urban planning, emergency response and autonomous navigation. A concept of continuous city modeling is to progressively reconstruct city models by accommodating their changes recognized in the spatio-temporal domain, while preserving unchanged structures. A first critical step for continuous city modeling is to coherently register remotely sensed data taken at different epochs with existing building models. This paper presents a new model-to-image registration method using a context-based geometric hashing (CGH) method to align a single image with existing 3D building models. This model-to-image registration process consists of three steps: (1) feature extraction; (2) similarity measurement and matching; and (3) estimating exterior orientation parameters (EOPs) of a single image. For feature extraction, we propose two types of matching cues: edged corner features representing the saliency of building corner points with associated edges, and contextual relations among the edged corner features within an individual roof. A set of matched corners is found with a given proximity measure through geometric hashing, and optimal matches are then finally determined by maximizing the matching cost encoding contextual similarity between matching candidates. Final matched corners are used for adjusting the EOPs of the single airborne image by the least squares method based on collinearity equations. The result shows that acceptable accuracy of the EOPs of a single image can be achieved using the proposed registration approach as an alternative to a labor-intensive manual registration process. PMID:27338410
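
    The hash-and-vote core of geometric hashing can be sketched compactly; the toy below handles bare 2D corner points under a similarity transform, whereas the full method described above adds edge context, contextual similarity scoring, and EOP estimation. All data are invented.

```python
"""Minimal 2D geometric hashing: hash model points in pairwise bases,
then let scene points vote for the best matching model basis."""
import numpy as np
from collections import defaultdict
from itertools import permutations

def basis_coords(points, i, j):
    """Coordinates of all points in the similarity-invariant frame
    defined by the ordered basis pair (i, j)."""
    o = points[i]
    e1 = points[j] - o
    s = e1 @ e1
    e2 = np.array([-e1[1], e1[0]])
    rel = points - o
    return np.column_stack([rel @ e1 / s, rel @ e2 / s])

def build_table(model, step=0.25):
    table = defaultdict(list)
    for i, j in permutations(range(len(model)), 2):
        for u, v in basis_coords(model, i, j):
            table[(round(u / step), round(v / step))].append((i, j))
    return table

def vote(table, scene, step=0.25):
    votes = defaultdict(int)
    for i, j in permutations(range(len(scene)), 2):
        for u, v in basis_coords(scene, i, j):
            for b in table.get((round(u / step), round(v / step)), ()):
                votes[b] += 1
    return max(votes.items(), key=lambda kv: kv[1])

# toy "building corners" and a rotated, scaled, noisy observation
model = np.array([[0, 0], [4, 0], [4, 3], [0, 3], [2, 5]], float)
t = 0.7
R = np.array([[np.cos(t), -np.sin(t)], [np.sin(t), np.cos(t)]])
scene = 1.8 * model @ R.T + np.array([10.0, -3.0])
scene += np.random.default_rng(0).normal(0, 0.02, scene.shape)

best_basis, n_votes = vote(build_table(model), scene)
print(f"best model basis {best_basis} with {n_votes} votes")
```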

  11. A Collaborative Location Based Travel Recommendation System through Enhanced Rating Prediction for the Group of Users.

    PubMed

    Ravi, Logesh; Vairavasundaram, Subramaniyaswamy

    2016-01-01

    The rapid growth of the web and its applications has made recommender systems critically important. Applied across various domains, recommender systems are designed to generate suggestions, such as items or services, based on user interests. However, recommender systems face many issues that reduce their effectiveness. Integrating powerful data management techniques into recommender systems can address such issues, and the quality of recommendations can be increased significantly. Recent research on recommender systems reveals an idea of utilizing social network data to enhance traditional recommender systems with better prediction and improved accuracy. This paper expresses views on social network data based recommender systems by considering usage of various recommendation algorithms, functionalities of systems, different types of interfaces, filtering techniques, and artificial intelligence techniques. After examining the depths of objectives, methodologies, and data sources of the existing models, the paper helps anyone interested in the development of travel recommendation systems and facilitates future research direction. We also propose a location recommendation system based on a social pertinent trust walker (SPTW) and compare the results with existing baseline random walk models. We then enhance the SPTW model to produce recommendations for groups of users. The results obtained from the experiments are presented.
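
    The walker idea can be sketched as a trust-directed random walk that collects ratings of a target location from trusted neighbours; the walk policy and the toy trust/rating data below are invented for illustration rather than taken from the SPTW specification. A group variant could aggregate such predictions across the members of a group.

```python
"""Trust-directed random-walk rating prediction (illustrative)."""
import numpy as np

rng = np.random.default_rng(9)

trust = {            # directed trust weights between users (toy)
    "u1": {"u2": 0.8, "u3": 0.2},
    "u2": {"u3": 0.6, "u1": 0.4},
    "u3": {"u1": 1.0},
}
ratings = {          # known user -> {location: rating}
    "u2": {"cafe": 4.0, "museum": 3.0},
    "u3": {"museum": 5.0, "park": 4.0},
}

def predict(source, item, walks=2000, stop=0.15, max_depth=6):
    """Average the rating of `item` found by trust walks from `source`."""
    found = []
    for _ in range(walks):
        u, depth = source, 0
        while depth < max_depth:
            if u != source and u in ratings and item in ratings[u]:
                found.append(ratings[u][item])
                break
            if u not in trust or rng.random() < stop:
                break
            nbrs, w = zip(*trust[u].items())
            u = rng.choice(nbrs, p=np.array(w) / sum(w))
            depth += 1
    return np.mean(found) if found else None

for loc in ("cafe", "museum", "park"):
    print(loc, predict("u1", loc))
```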

  12. Benefit-cost estimation for alternative drinking water maximum contaminant levels

    NASA Astrophysics Data System (ADS)

    Gurian, Patrick L.; Small, Mitchell J.; Lockwood, John R.; Schervish, Mark J.

    2001-08-01

    A simulation model for estimating compliance behavior and resulting costs at U.S. Community Water Suppliers is developed and applied to the evaluation of a more stringent maximum contaminant level (MCL) for arsenic. Probability distributions of source water arsenic concentrations are simulated using a statistical model conditioned on system location (state) and source water type (surface water or groundwater). This model is fit to two recent national surveys of source waters, then applied with the model explanatory variables for the population of U.S. Community Water Suppliers. Existing treatment types and arsenic removal efficiencies are also simulated. Utilities with finished water arsenic concentrations above the proposed MCL are assumed to select the least cost option compatible with their existing treatment from among 21 available compliance strategies and processes for meeting the standard. Estimated costs and arsenic exposure reductions at individual suppliers are aggregated to estimate the national compliance cost, arsenic exposure reduction, and resulting bladder cancer risk reduction. Uncertainties in the estimates are characterized based on uncertainties in the occurrence model parameters, existing treatment types, treatment removal efficiencies, costs, and the bladder cancer dose-response function for arsenic.

  13. A Physically Based Coupled Chemical and Physical Weathering Model for Simulating Soilscape Evolution

    NASA Astrophysics Data System (ADS)

    Willgoose, G. R.; Welivitiya, D.; Hancock, G. R.

    2015-12-01

    A critical missing link in existing landscape evolution models is a dynamic soil evolution model in which soils co-evolve with the landform. Work by the authors over the last decade has demonstrated a computationally manageable model for soil profile evolution (soilscape evolution) based on physical weathering. For chemical weathering it is clear that full geochemistry models such as CrunchFlow and PHREEQC are too computationally intensive to be coupled to existing soilscape and landscape evolution models. This paper presents a simplification of CrunchFlow chemistry and physics that makes the task feasible, and generalises it for hillslope geomorphology applications. Results from this simplified model will be compared with field data for soil pedogenesis. Other researchers have previously proposed a number of very simple weathering functions (e.g. exponential, humped, reverse exponential) as conceptual models of the in-profile weathering process. The paper will show that all of these functions are possible for specific combinations of in-soil environmental, geochemical and geologic conditions, and the presentation will outline the key variables controlling which of these conceptual models can be realistic models of in-profile processes and under what conditions. The presentation will finish by discussing the coupling of this model with a physical weathering model, and will show sample results from our SSSPAM soilscape evolution model to illustrate the implications of including chemical weathering in the soilscape evolution model.
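
    For readers unfamiliar with the conceptual depth functions named above, the sketch below gives one common parameterization of each; the functional forms and constants are illustrative choices, not the equations used in SSSPAM.

```python
import math

def exponential(depth, w0=1.0, scale=0.5):
    """Weathering rate decays monotonically with depth below the surface."""
    return w0 * math.exp(-depth / scale)

def reverse_exponential(depth, w0=1.0, scale=0.5, zmax=2.0):
    """Rate increases toward a weathering front at depth zmax."""
    return w0 * math.exp(-(zmax - depth) / scale)

def humped(depth, w0=1.0, zpeak=0.3, spread=0.2):
    """Rate peaks at an intermediate depth zpeak (a 'humped' profile)."""
    return w0 * math.exp(-((depth - zpeak) / spread) ** 2)

for z in (0.0, 0.3, 1.0, 2.0):   # depths in metres, rates in arbitrary units
    print(z, round(exponential(z), 3),
             round(reverse_exponential(z), 3),
             round(humped(z), 3))
```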

  14. Application of multiple modelling to hyperthermia estimation: reducing the effects of model mismatch.

    PubMed

    Potocki, J K; Tharp, H S

    1993-01-01

    Multiple model estimation is a viable technique for dealing with the spatial perfusion model mismatch associated with hyperthermia dosimetry. Using multiple models, spatial discrimination can be obtained without increasing the number of unknown perfusion zones. Two multiple model estimators based on the extended Kalman filter (EKF) are designed and compared with two EKFs based on single models having greater perfusion zone segmentation. Results given here indicate that multiple modelling is advantageous when the number of thermal sensors is insufficient for convergence of single model estimators having greater perfusion zone segmentation. In situations where sufficient measured outputs exist for greater unknown perfusion parameter estimation, the multiple model estimators and the single model estimators yield equivalent results.

  15. Diffusion of innovations in social interaction systems. An agent-based model for the introduction of new drugs in markets.

    PubMed

    Pombo-Romero, Julio; Varela, Luis M; Ricoy, Carlos J

    2013-06-01

    The existence of imitative behavior among consumers is a well-known phenomenon in the field of Economics. This behavior is especially common in markets determined by a high degree of innovation, asymmetric information and/or price-inelastic demand, features that exist in the pharmaceutical market. This paper presents evidence of the existence of imitative behavior among primary care physicians in Galicia (Spain) when choosing treatments for their patients. From this and other evidence, we propose a dynamic model for determining the entry of new drugs into the market. To do this, we introduce the structure of the organization of primary health care centers and the presence of groups of doctors who are specially interrelated, as well as the existence of commercial pressure on doctors. For modeling purposes, physicians are treated as spins connected in an exponentially distributed complex network of the Watts-Strogatz type. The proposed model provides an explanation for the differences observed in the patterns of the introduction of technological innovations in different regions. The main cause of these differences is the different structure of relationships among consumers, where the existence of small groups that show a higher degree of coordination over the average is particularly influential. The evidence presented, together with the proposed model, might be useful for the design of optimal strategies for the introduction of new drugs, as well as for planning policies to manage pharmaceutical expenditure.
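
    A minimal agent-based sketch in the spirit of the model described above: binary prescriber states on a Watts-Strogatz small-world graph, updated by neighbourhood imitation plus a small probability of commercially driven adoption. The update rule and all parameter values are illustrative assumptions rather than the paper's calibrated dynamics.

```python
import random

import networkx as nx

def prescribing_dynamics(n=200, k=6, p=0.1, pressure=0.02, steps=50, seed=7):
    """Binary prescriber 'spins' (0 = incumbent drug, 1 = new drug) on a
    Watts-Strogatz small-world network.  Each update, a random physician
    either yields to commercial pressure (probability `pressure`) or
    imitates the majority choice of their network neighbours."""
    rng = random.Random(seed)
    g = nx.watts_strogatz_graph(n, k, p, seed=seed)
    state = {i: 0 for i in g}
    state[rng.randrange(n)] = 1                     # a single early adopter
    adoption = []
    for _ in range(steps):
        for _ in range(n):
            i = rng.randrange(n)
            if rng.random() < pressure:
                state[i] = 1                        # commercial pressure
            else:
                nbrs = list(g[i])
                state[i] = int(2 * sum(state[j] for j in nbrs) > len(nbrs))
        adoption.append(sum(state.values()) / n)
    return adoption

# Adoption tends to spread fastest where tightly interrelated groups flip together.
print(prescribing_dynamics()[-5:])
```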

  16. Effect law of Damage Characteristics of Rock Similar Material with Pre-Existing Cracks

    NASA Astrophysics Data System (ADS)

    Li, S. G.; Cheng, X. Y.; Liu, C.

    2017-11-01

    In order to further study the failure mechanism of rock similar materials, this study established a damage model based on accumulated AE events and investigated the damage characteristics of rock similar material samples with pre-existing cracks of varying width under uniaxial compression load. The equipment used in this study is the self-developed YYW-II strain controlled unconfined compression apparatus and the PCIE-8 acoustic emission (AE) monitoring system. The influences of the width of the pre-existing cracks on the damage characteristics of rock similar materials are analyzed. Results show that (1) the damage model can well describe the damage characteristics of rock similar materials; (2) the tested samples pass through three stages during failure: an initial damage stage, a stable damage development stage, and an accelerated damage development stage; (3) as the width of the pre-existing cracks varies from 3 mm to 5 mm, the damage of the rock similar materials increases gradually. The outcomes of this study provide additional value to research on the failure mechanism of geotechnical similar material models.
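
    The abstract does not give the damage model's exact form; a common AE-based choice, shown below purely as an illustration, defines the damage variable as the normalized cumulative event count.

```python
def damage(cum_events, events_at_failure):
    """Damage variable D in [0, 1] from accumulated acoustic-emission events.

    One common AE-based definition is D = N / N_f, the ratio of AE counts so
    far to the count at failure; the paper's exact model may differ.
    """
    return min(cum_events / events_at_failure, 1.0)

# Example: the three stages appear as slow, steady, then accelerating growth of D.
counts = [5, 12, 30, 80, 240, 600, 1000]
print([round(damage(n, 1000), 2) for n in counts])
```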

  17. Quantifying the transport impacts of domestic waste collection strategies.

    PubMed

    McLeod, Fraser; Cherrett, Tom

    2008-11-01

    This paper models the effects of three different options for domestic waste collection using data from three Hampshire authorities: (i) joint working between neighbouring waste collection authorities; (ii) basing vehicles at waste disposal sites; and (iii) alternate weekly collection of residual waste and dry recyclables. A vehicle mileage savings of 3% was modelled for joint working, where existing vehicle allocations to depots were maintained, which increased to 5.9% when vehicles were re-allocated to depots optimally. Vehicle mileage was reduced by 13.5% when the collection rounds were based out of the two waste disposal sites rather than out of the existing depots, suggesting that the former could be the most effective place to keep vehicles providing that travel arrangements for the crews could be made. Alternate weekly collection was modelled to reduce vehicle mileage by around 8% and time taken by 14%, when compared with a typical scenario of weekly collection of residual and fortnightly collection of recyclable waste. These results were based on an assumption that 20% of the residual waste would be directly diverted into the dry recyclables waste stream.

  18. Monte Carlo approach in assessing damage in higher order structures of DNA

    NASA Technical Reports Server (NTRS)

    Chatterjee, A.; Schmidt, J. B.; Holley, W. R.

    1994-01-01

    We have developed a computer model of nuclear DNA in the form of chromatin fibre. The fibres are modeled as an ideal solenoid consisting of twenty helical turns with six nucleosomes per turn. The chromatin model, in combination with a Monte Carlo theory of radiation damage induced by charged particles, based on general features of track structure and stopping power theory, has been used to evaluate the influence of DNA structure on initial damage. An interesting result has emerged from our calculations: our calculated results predict the existence of strong spatial correlations in damage sites associated with the symmetries in the solenoidal model. We have calculated spectra of short fragments of double stranded DNA produced by multiple double strand breaks induced by both high and low LET radiation. The spectra exhibit peaks at multiples of approximately 85 base pairs (the nucleosome periodicity) and approximately 1000 base pairs (solenoid periodicity). Preliminary experiments to investigate the fragment distributions from irradiated DNA, made by B. Rydberg at Lawrence Berkeley Laboratory, confirm the existence of short DNA fragments and are in substantial agreement with the predictions of our theory.

  19. An improved parameter estimation scheme for image modification detection based on DCT coefficient analysis.

    PubMed

    Yu, Liyang; Han, Qi; Niu, Xiamu; Yiu, S M; Fang, Junbin; Zhang, Ye

    2016-02-01

    Most of the existing image modification detection methods which are based on DCT coefficient analysis model the distribution of DCT coefficients as a mixture of a modified and an unchanged component. To separate the two components, two parameters, which are the primary quantization step, Q1, and the portion of the modified region, α, have to be estimated, and more accurate estimations of α and Q1 lead to better detection and localization results. Existing methods estimate α and Q1 in a completely blind manner, without considering the characteristics of the mixture model and the constraints to which α should conform. In this paper, we propose a more effective scheme for estimating α and Q1, based on the observations that the curves on the surface of the likelihood function corresponding to the mixture model are largely smooth, and that α can take values only in a discrete set. We conduct extensive experiments to evaluate the proposed method, and the experimental results confirm the efficacy of our method. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
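
    The estimation scheme lends itself to a compact sketch: exhaustively search candidate Q1 values and the discrete grid of admissible α, keeping the pair that maximizes the mixture likelihood. The component densities below are Gaussian placeholders; the real method models the distributions of double-quantized DCT coefficients.

```python
import numpy as np

def mixture_loglik(coeffs, alpha, pdf_modified, pdf_unchanged):
    """Log-likelihood of DCT coefficients under a two-component mixture."""
    dens = alpha * pdf_modified(coeffs) + (1 - alpha) * pdf_unchanged(coeffs)
    return np.sum(np.log(np.maximum(dens, 1e-300)))

def estimate_alpha_q1(coeffs, q1_candidates, make_pdfs, alpha_grid=None):
    """Exhaustive search: alpha is restricted to a discrete grid, as in the paper."""
    if alpha_grid is None:
        alpha_grid = np.linspace(0.05, 0.95, 19)
    best = (-np.inf, None, None)
    for q1 in q1_candidates:
        pdf_mod, pdf_unch = make_pdfs(q1)   # component densities depend on Q1
        for a in alpha_grid:
            ll = mixture_loglik(coeffs, a, pdf_mod, pdf_unch)
            if ll > best[0]:
                best = (ll, a, q1)
    return best  # (loglik, alpha_hat, q1_hat)

# Toy usage with Gaussian placeholders (the real densities model double quantization):
rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(0, 1, 700), rng.normal(4, 1, 300)])
toy = lambda q1: (lambda x: np.exp(-(x - q1) ** 2 / 2) / np.sqrt(2 * np.pi),
                  lambda x: np.exp(-x ** 2 / 2) / np.sqrt(2 * np.pi))
print(estimate_alpha_q1(data, q1_candidates=[2, 4, 6], make_pdfs=toy)[1:])
```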

  20. Education of Blind Persons in Ethiopia.

    ERIC Educational Resources Information Center

    Maru, A. A.; Cook, M. J.

    1990-01-01

    The paper reviews the historical and cultural attitudes of Ethiopians toward blind children, the education of blind children, the special situation of orphaned blind children, limitations of existing educational models, and development of a new model that relies on elements of community-based rehabilitation and the employment of blind high school…

  1. Contributions of Ecological School Mental Health Services to Students' Academic Success

    ERIC Educational Resources Information Center

    Doll, Beth; Spies, Rob; Champion, Allison

    2012-01-01

    This article describes an ecological framework for school mental health services that differs in important ways from existing service delivery models. The model is based on research describing ecological frameworks underlying students' school success. Ecological characteristics of schools and classrooms that promote academic success are described…

  2. From Conceptual Frameworks to Mental Models for Astronomy: Students' Perceptions

    ERIC Educational Resources Information Center

    Pundak, David; Liberman, Ido; Shacham, Miri

    2017-01-01

    Considerable debate exists among discipline-based astronomy education researchers about how students change their perceptions in science and astronomy. The study questioned the development of astronomical models among students in institutions of higher education by examining how college students change their initial conceptual frameworks and…

  3. Looking Both Ways through Time: The Janus Model of Lateralized Cognition

    ERIC Educational Resources Information Center

    Dien, Joseph

    2008-01-01

    Existing models of laterality, while often successful at describing circumscribed domains, have not been successful as explanations of the overall patterns of hemispheric asymmetries. It is therefore suggested that a new approach is needed based on shared contributions to adaptive hemispheric roles rather than functional and structural…

  4. Schoolwide Enrichment Model: Challenging All Children to Excel

    ERIC Educational Resources Information Center

    Beecher, Margaret

    2010-01-01

    This article summarizes how the components of the Schoolwide Enrichment Model were used to dramatically reduce the achievement gap in a school with a high at-risk student population. The theories of enrichment and instructional differentiation replaced an existing remedial paradigm and a strength-based methodology was embraced by the school…

  5. Adapting the Individual Placement and Support Model with Homeless Young Adults

    ERIC Educational Resources Information Center

    Ferguson, Kristin M.; Xie, Bin; Glynn, Shirley

    2012-01-01

    Background: Prior research reveals high unemployment rates among homeless young adults. The literature offers many examples of using evidence-based supported employment models with vulnerable populations to assist them in obtaining and maintaining competitive employment; yet few examples exist to date with homeless young adults with mental…

  6. Precision Optics Curriculum.

    ERIC Educational Resources Information Center

    Reid, Robert L.; And Others

    This guide outlines the competency-based, two-year precision optics curriculum that the American Precision Optics Manufacturers Association has proposed to fill the void that it suggests will soon exist as many of the master opticians currently employed retire. The model, which closely resembles the old European apprenticeship model, calls for 300…

  7. Early Grade Writing Assessment: An Instrument Model

    ERIC Educational Resources Information Center

    Jiménez, Juan E.

    2017-01-01

    The United Nations Educational, Scientific, and Cultural Organization promoted the creation of a model instrument for individual assessment of students' foundational writing skills in the Spanish language that was based on a literature review and existing writing tools and assessments. The purpose of the "Early Grade Writing Assessment"…

  8. Data publication and dissemination of interactive keys under the open access model

    USDA-ARS?s Scientific Manuscript database

    The concepts of publication, citation and dissemination of interactive keys and other online keys are discussed and illustrated by a sample paper published in the present issue (doi: 10.3897/zookeys.21.271). The present model is based on previous experience with several existing examples of publishi...

  9. "No Soy de Aqui ni Soy de Alla": Transgenerational Cultural Identity Formation

    ERIC Educational Resources Information Center

    Cardona, Jose Ruben Parra; Busby, Dean M.; Wampler, Richard S.

    2004-01-01

    The transgenerational cultural identity model offers a detailed understanding of the immigration experience by challenging agendas of assimilation and by expanding on existing theories of cultural identity. Based on this model, immigration is a complex phenomenon influenced by many variables such as sociopsychological dimensions, family,…

  10. Predicting the regeneration of Appalachian hardwoods: adapting the REGEN model for the Appalachian Plateau

    Treesearch

    Lance A. Vickers; Thomas R. Fox; David L. Loftis; David A. Boucugnani

    2013-01-01

    The difficulty of achieving reliable oak (Quercus spp.) regeneration is well documented. Application of silvicultural techniques to facilitate oak regeneration largely depends on current regeneration potential. A computer model to assess regeneration potential based on existing advanced reproduction in Appalachian hardwoods was developed by David...

  11. Learning to Teach: Pedagogical Content Knowledge in Adventure-Based Learning

    ERIC Educational Resources Information Center

    Sutherland, Sue; Stuhr, Paul T.; Ayvazo, Shiri

    2016-01-01

    Background: Many alternative curricular models exist in physical education to better meet the needs of students than the multi-activity team sports curriculum that dominates in the USA. These alternative curricular models typically require different content knowledge (CK) and pedagogical CK (PCK) to implement successfully. One of the complexities…

  12. Assessment "as" Learning: Enhancing Discourse, Understanding, and Achievement in Innovative Science Curricula

    ERIC Educational Resources Information Center

    Hickey, Daniel T.; Taasoobshirazi, Gita; Cross, Dionne

    2012-01-01

    An assessment-oriented design-based research model was applied to existing inquiry-oriented multimedia programs in astronomy, biology, and ecology. Building on emerging situative theories of assessment, the model extends prevailing views of formative assessment "for" learning by embedding "discursive" formative assessment more directly into the…

  13. Location-Based Services in Vehicular Networks

    ERIC Educational Resources Information Center

    Wu, Di

    2013-01-01

    Location-based services have been identified as a promising communication paradigm in highly mobile and dynamic vehicular networks. However, existing mobile ad hoc networking cannot be directly applied to vehicular networking due to differences in traffic conditions, mobility models and network topologies. On the other hand, hybrid architectures…

  14. The Principal's Role in Site-Based Management.

    ERIC Educational Resources Information Center

    Drury, William R.

    1993-01-01

    In existing school-based management models, the principal's role ranges from chairing the local council to being a coach/facilitator. With teachers and parents assuming greater control over governance, curriculum, and budgeting, paranoid principals may establish more formal bargaining relationships with district boards. Caution is advised, because…

  15. A cross-correlation-based estimate of the galaxy luminosity function

    NASA Astrophysics Data System (ADS)

    van Daalen, Marcel P.; White, Martin

    2018-06-01

    We extend existing methods for using cross-correlations to derive redshift distributions for photometric galaxies, without using photometric redshifts. The model presented in this paper simultaneously yields highly accurate and unbiased redshift distributions and, for the first time, redshift-dependent luminosity functions, using only clustering information and the apparent magnitudes of the galaxies as input. In contrast to many existing techniques for recovering unbiased redshift distributions, the output of our method is not degenerate with the galaxy bias b(z), which is achieved by modelling the shape of the luminosity bias. We successfully apply our method to a mock galaxy survey and discuss improvements to be made before applying our model to real data.

  16. Efficient Translation of LTL Formulae into Buchi Automata

    NASA Technical Reports Server (NTRS)

    Giannakopoulou, Dimitra; Lerda, Flavio

    2001-01-01

    Model checking is a fully automated technique for checking that a system satisfies a set of required properties. With explicit-state model checkers, properties are typically defined in linear-time temporal logic (LTL), and are translated into Büchi automata in order to be checked. This report presents how we have combined and improved existing techniques to obtain an efficient LTL to Büchi automata translator. In particular, we optimize the core of existing tableau-based approaches to generate significantly smaller automata. Our approach has been implemented and is being released as part of the Java PathFinder software (JPF), an explicit state model checker under development at the NASA Ames Research Center.

  17. Software risk management through independent verification and validation

    NASA Technical Reports Server (NTRS)

    Callahan, John R.; Zhou, Tong C.; Wood, Ralph

    1995-01-01

    Software project managers need tools to estimate and track project goals in a continuous fashion before, during, and after development of a system. In addition, they need an ability to compare the current project status with past project profiles to validate management intuition, identify problems, and then direct appropriate resources to the sources of problems. This paper describes a measurement-based approach to calculating the risk inherent in meeting project goals that leverages past project metrics and existing estimation and tracking models. We introduce the IV&V Goal/Questions/Metrics model, explain its use in the software development life cycle, and describe our attempts to validate the model through the reverse engineering of existing projects.

  18. “That model is sooooo last millennium!” Residential long term care as a system, not a place

    PubMed Central

    Ziemba, Rosemary; Perry, Tam E.; Takahashi, Beverly; Algase, Donna

    2010-01-01

    The current quandary with the design of existing long term care (LTC) settings results from focus on structures (“institutions”) instead of on a system of supports and services that transcends physical and traditional boundaries across settings, including nursing homes, assisted living residences and the home. Supported by analysis of the commonalities, socio-historical and political contexts, core values and fallacies of social and medical models in existing and emerging LTC options, a holistic model is proposed based on new core values which facilitate community and family integration, and which asserts dignity and personhood as universal attributes in an array of settings. PMID:20640176

  19. Hierarchical Task Network Prototyping In Unity3d

    DTIC Science & Technology

    2016-06-01

    …visually debug. Here we present a solution for prototyping HTNs by extending an existing commercial implementation of Behavior Trees within the Unity3D game engine prior to building the HTN in COMBATXXI. Existing HTNs were emulated within… Keywords: HTN, dynamic behaviors, behavior prototyping, agent-based simulation, entity-level combat model, game engine, discrete event simulation, virtual…

  20. Using Item-Type Performance Covariance to Improve the Skill Model of an Existing Tutor

    ERIC Educational Resources Information Center

    Pavlik, Philip I., Jr.; Cen, Hao; Wu, Lili; Koedinger, Kenneth R.

    2008-01-01

    Using data from an existing pre-algebra computer-based tutor, we analyzed the covariance of item-types with the goal of describing a more effective way to assign skill labels to item-types. Analyzing covariance is important because it allows us to place the skills in a related network in which we can identify the role each skill plays in learning…

  1. Model-based learning and the contribution of the orbitofrontal cortex to the model-free world

    PubMed Central

    McDannald, Michael A.; Takahashi, Yuji K.; Lopatina, Nina; Pietras, Brad W.; Jones, Josh L.; Schoenbaum, Geoffrey

    2012-01-01

    Learning is proposed to occur when there is a discrepancy between reward prediction and reward receipt. At least two separate systems are thought to exist: one in which predictions are proposed to be based on model-free or cached values; and another in which predictions are model-based. A basic neural circuit for model-free reinforcement learning has already been described. In the model-free circuit the ventral striatum (VS) is thought to supply a common-currency reward prediction to midbrain dopamine neurons that compute prediction errors and drive learning. In a model-based system, predictions can include more information about an expected reward, such as its sensory attributes or current, unique value. This detailed prediction allows for both behavioral flexibility and learning driven by changes in sensory features of rewards alone. Recent evidence from animal learning and human imaging suggests that, in addition to model-free information, the VS also signals model-based information. Further, there is evidence that the orbitofrontal cortex (OFC) signals model-based information. Here we review these data and suggest that the OFC provides model-based information to this traditional model-free circuitry and offer possibilities as to how this interaction might occur. PMID:22487030

  2. Applying an Archetype-Based Approach to Electroencephalography/Event-Related Potential Experiments in the EEGBase Resource.

    PubMed

    Papež, Václav; Mouček, Roman

    2017-01-01

    The purpose of this study is to investigate the feasibility of applying openEHR (an archetype-based approach for electronic health records representation) to modeling data stored in EEGBase, a portal for experimental electroencephalography/event-related potential (EEG/ERP) data management. The study evaluates re-usage of existing openEHR archetypes and proposes a set of new archetypes together with the openEHR templates covering the domain. The main goals of the study are to (i) link existing EEGBase data/metadata and openEHR archetype structures and (ii) propose a new openEHR archetype set describing the EEG/ERP domain since this set of archetypes currently does not exist in public repositories. The main methodology is based on the determination of the concepts obtained from EEGBase experimental data and metadata that are expressible structurally by the openEHR reference model and semantically by openEHR archetypes. In addition, templates as the third openEHR resource allow us to define constraints over archetypes. Clinical Knowledge Manager (CKM), a public openEHR archetype repository, was searched for the archetypes matching the determined concepts. According to the search results, the archetypes already existing in CKM were applied and the archetypes not existing in the CKM were newly developed. openEHR archetypes support linkage to external terminologies. To increase semantic interoperability of the new archetypes, binding with the existing odML electrophysiological terminology was assured. Further, to increase structural interoperability, also other current solutions besides EEGBase were considered during the development phase. Finally, a set of templates using the selected archetypes was created to meet EEGBase requirements. A set of eleven archetypes that encompassed the domain of experimental EEG/ERP measurements were identified. Of these, six were reused without changes, one was extended, and four were newly created. All archetypes were arranged in the templates reflecting the EEGBase metadata structure. A mechanism of odML terminology referencing was proposed to assure semantic interoperability of the archetypes. The openEHR approach was found to be useful not only for clinical purposes but also for experimental data modeling.

  3. Fault Diagnosis approach based on a model-based reasoner and a functional designer for a wind turbine. An approach towards self-maintenance

    NASA Astrophysics Data System (ADS)

    Echavarria, E.; Tomiyama, T.; van Bussel, G. J. W.

    2007-07-01

    The objective of this on-going research is to develop a design methodology to increase the availability for offshore wind farms, by means of an intelligent maintenance system capable of responding to faults by reconfiguring the system or subsystems, without increasing service visits, complexity, or costs. The idea is to make use of the existing functional redundancies within the system and sub-systems to keep the wind turbine operational, even at a reduced capacity if necessary. Re-configuration is intended to be a built-in capability to be used as a repair strategy, based on these existing functionalities provided by the components. The possible solutions can range from using information from adjacent wind turbines, such as wind speed and direction, to setting up different operational modes, for instance re-wiring, re-connecting, changing parameters or control strategy. The methodology described in this paper is based on qualitative physics and consists of a fault diagnosis system based on a model-based reasoner (MBR), and on a functional redundancy designer (FRD). Both design tools make use of a function-behaviour-state (FBS) model. A design methodology based on the re-configuration concept to achieve self-maintained wind turbines is an interesting and promising approach to reduce stoppage rate, failure events, maintenance visits, and to maintain energy output possibly at reduced rate until the next scheduled maintenance.

  4. Bioethics: A Rationale and a Model

    ERIC Educational Resources Information Center

    Barman, Charles R.; Rusch, John J.

    1978-01-01

    Discusses the rationale for and development of an undergraduate bioethics course. Based on experiences with the course, general suggestions are offered to instructors planning to add bioethics to existing curricula. (MA)

  5. Mechanical modeling for magnetorheological elastomer isolators based on constitutive equations and electromagnetic analysis

    NASA Astrophysics Data System (ADS)

    Wang, Qi; Dong, Xufeng; Li, Luyu; Ou, Jinping

    2018-06-01

    As constitutive models are too complicated and existing mechanical models lack universality, neither is satisfactory for magnetorheological elastomer (MRE) devices. In this article, a novel universal method is proposed to build concise mechanical models. A constitutive model and electromagnetic analysis were applied in this method to ensure universality, while a series of derivations and simplifications were carried out to obtain a concise formulation. To illustrate the proposed modeling method, a conical MRE isolator was introduced. Its basic mechanical equations were built based on equilibrium, deformation compatibility, constitutive equations and electromagnetic analysis. An iteration model and a highly efficient differential-equation-editor based model were then derived to solve the basic mechanical equations. The final simplified mechanical equations were obtained by re-fitting the simulations with a novel optimal algorithm. In the end, a verification test of the isolator proved the accuracy of the derived mechanical model and the modeling method.

  6. Cost-effectiveness of a National Telemedicine Diabetic Retinopathy Screening Program in Singapore.

    PubMed

    Nguyen, Hai V; Tan, Gavin Siew Wei; Tapp, Robyn Jennifer; Mital, Shweta; Ting, Daniel Shu Wei; Wong, Hon Tym; Tan, Colin S; Laude, Augustinus; Tai, E Shyong; Tan, Ngiap Chuan; Finkelstein, Eric A; Wong, Tien Yin; Lamoureux, Ecosse L

    2016-12-01

    To determine the incremental cost-effectiveness of a new telemedicine technician-based assessment relative to an existing model of family physician (FP)-based assessment of diabetic retinopathy (DR) in Singapore from the health system and societal perspectives. Model-based, cost-effectiveness analysis of the Singapore Integrated Diabetic Retinopathy Program (SiDRP). A hypothetical cohort of patients aged 55 years with type 2 diabetes previously not screened for DR. The SiDRP is a new telemedicine-based DR screening program using trained technicians to assess retinal photographs. We compared the cost-effectiveness of SiDRP with the existing model in which FPs assess photographs. We developed a hybrid decision tree/Markov model to simulate the costs, effectiveness, and incremental cost-effectiveness ratio (ICER) of SiDRP relative to FP-based DR screening over a lifetime horizon. We estimated the costs from the health system and societal perspectives. Effectiveness was measured in terms of quality-adjusted life-years (QALYs). Robustness of the results was assessed using deterministic and probabilistic sensitivity analyses. The main outcome measure was the ICER. From the societal perspective that takes into account all costs and effects, the telemedicine-based DR screening model had significantly lower costs (total cost savings of S$173 per person) while generating similar QALYs compared with the physician-based model (i.e., 13.1 QALYs). From the health system perspective that includes only direct medical costs, the cost savings are S$144 per person. By extrapolating these data to approximately 170 000 patients with diabetes currently being screened yearly for DR in Singapore's primary care polyclinics, the present value of future cost savings associated with the telemedicine-based model is estimated to be S$29.4 million over a lifetime horizon. While generating similar health outcomes, the telemedicine-based DR screening using technicians in the primary care setting saves costs for Singapore compared with the FP model. Our data provide a strong economic rationale to expand the telemedicine-based DR screening program in Singapore and elsewhere. Copyright © 2016 American Academy of Ophthalmology. Published by Elsevier Inc. All rights reserved.
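
    The ICER arithmetic underlying such comparisons is simple enough to show directly. In the sketch below the absolute cost level is invented; only the S$173 saving and the 13.1 QALYs come from the abstract, and the `None` return encodes the cost-saving, equal-effect case in which an ICER is not reported.

```python
def icer(cost_new, qaly_new, cost_old, qaly_old):
    """Incremental cost-effectiveness ratio: extra cost per extra QALY."""
    d_cost, d_qaly = cost_new - cost_old, qaly_new - qaly_old
    if d_qaly == 0:
        # Equal effectiveness: report the cost difference instead (as in SiDRP,
        # where telemedicine saves money while generating similar QALYs).
        return None, d_cost
    return d_cost / d_qaly, d_cost

# Illustrative per-person figures, societal perspective (baseline cost invented):
ratio, saving = icer(cost_new=1000 - 173, qaly_new=13.1,
                     cost_old=1000, qaly_old=13.1)
print(ratio, saving)   # None -173  -> cheaper at similar QALYs (cost-saving)
```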

  7. Hazards and Possibilities of Optical Breakdown Effects Below the Threshold for Shockwave and Bubble Formation

    DTIC Science & Technology

    2006-07-01

    …precision of the determination of Rmax, we established a refined method based on the model of bubble formation described above in section 3.6.1 and the… development can be modeled by hydrodynamic codes based on tabulated equation-of-state data. This has previously been demonstrated on ps optical breakdown…

  8. A Novel Extreme Learning Machine Classification Model for e-Nose Application Based on the Multiple Kernel Approach

    PubMed Central

    Jian, Yulin; Huang, Daoyu; Yan, Jia; Lu, Kun; Huang, Ying; Wen, Tailai; Zeng, Tanyue; Zhong, Shijie; Xie, Qilong

    2017-01-01

    A novel classification model, named the quantum-behaved particle swarm optimization (QPSO)-based weighted multiple kernel extreme learning machine (QWMK-ELM), is proposed in this paper. Experimental validation is carried out with two different electronic nose (e-nose) datasets. Being different from the existing multiple kernel extreme learning machine (MK-ELM) algorithms, the combination coefficients of base kernels are regarded as external parameters of single-hidden layer feedforward neural networks (SLFNs). The combination coefficients of base kernels, the model parameters of each base kernel, and the regularization parameter are optimized by QPSO simultaneously before implementing the kernel extreme learning machine (KELM) with the composite kernel function. Four types of common single kernel functions (Gaussian kernel, polynomial kernel, sigmoid kernel, and wavelet kernel) are utilized to constitute different composite kernel functions. Moreover, the method is also compared with other existing classification methods: extreme learning machine (ELM), kernel extreme learning machine (KELM), k-nearest neighbors (KNN), support vector machine (SVM), multi-layer perceptron (MLP), radial basis function neural network (RBFNN), and probabilistic neural network (PNN). The results have demonstrated that the proposed QWMK-ELM outperforms the aforementioned methods, not only in precision, but also in efficiency for gas classification. PMID:28629202
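
    The core of the multiple-kernel ELM idea, a weighted combination of base kernel matrices plugged into the closed-form KELM solution, can be sketched as below. The kernel weights here are fixed by hand; in QWMK-ELM they, together with the kernel and regularization parameters, are optimized by QPSO. The kernel choices, parameter values and toy data are illustrative.

```python
import numpy as np

def gaussian(X, Y, gamma=0.5):                    # base kernel 1
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def poly(X, Y, degree=2, c0=1.0):                 # base kernel 2
    return (X @ Y.T + c0) ** degree

def composite(X, Y, weights):
    """Weighted sum of base kernels; QWMK-ELM tunes `weights` (and the kernel
    parameters) with QPSO rather than fixing them as done here."""
    return weights[0] * gaussian(X, Y) + weights[1] * poly(X, Y)

def kelm_fit(X, T, weights, C=100.0):
    """Closed-form KELM training: solve (I/C + K) alpha = T."""
    K = composite(X, X, weights)
    return np.linalg.solve(np.eye(len(X)) / C + K, T)

def kelm_predict(Xtest, Xtrain, alpha, weights):
    return composite(Xtest, Xtrain, weights) @ alpha

# Tiny two-class demo with one-hot targets (XOR labels).
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
T = np.array([[1, 0], [0, 1], [0, 1], [1, 0]], dtype=float)
alpha = kelm_fit(X, T, weights=(0.7, 0.3))
print(kelm_predict(X, X, alpha, weights=(0.7, 0.3)).argmax(1))  # [0 1 1 0]
```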

  9. Laser Induced Aluminum Surface Breakdown Model

    NASA Technical Reports Server (NTRS)

    Chen, Yen-Sen; Liu, Jiwen; Zhang, Sijun; Wang, Ten-See (Technical Monitor)

    2002-01-01

    Laser powered propulsion systems involve complex fluid dynamics, thermodynamics and radiative transfer processes. Based on an unstructured-grid, pressure-based computational aerothermodynamics platform, several sub-models describing such underlying physics as laser ray tracing and focusing, thermal non-equilibrium, plasma radiation and air spark ignition have been developed. This proposed work shall extend the numerical platform and existing sub-models to include the aluminum wall surface Inverse Bremsstrahlung (IB) effect, from which surface ablation and free-electron generation can be initiated without relying on the air spark ignition sub-model. The following tasks will be performed to accomplish the research objectives.

  10. Template-based structure modeling of protein-protein interactions

    PubMed Central

    Szilagyi, Andras; Zhang, Yang

    2014-01-01

    The structure of protein-protein complexes can be constructed by using the known structure of other protein complexes as a template. The complex structure templates are generally detected either by homology-based sequence alignments or, given the structure of monomer components, by structure-based comparisons. Critical improvements have been made in recent years by utilizing interface recognition and by recombining monomer and complex template libraries. Encouraging progress has also been witnessed in genome-wide applications of template-based modeling, with modeling accuracy comparable to high-throughput experimental data. Nevertheless, bottlenecks exist due to the incompleteness of the protein-protein complex structure library and the lack of methods for distant homologous template identification and full-length complex structure refinement. PMID:24721449

  11. School Reform for Youth At Risk: Analysis of Six Change Models. Volume I: Summary and Analysis.

    ERIC Educational Resources Information Center

    McCollum, Heather

    This document analyzes six school-reform models for at-risk youth. Part 1 examines three curriculum-based reform programs that explicitly target curriculum and instruction: Reading Recovery; Success for All; and the Academy model. These programs focus on changes in student achievement and work within the structure of existing schools. Part 2…

  12. Diagnosing Alzheimer's disease: a systematic review of economic evaluations.

    PubMed

    Handels, Ron L H; Wolfs, Claire A G; Aalten, Pauline; Joore, Manuela A; Verhey, Frans R J; Severens, Johan L

    2014-03-01

    The objective of this study is to systematically review the literature on economic evaluations of interventions for the early diagnosis of Alzheimer's disease (AD) and related disorders and to describe their general and methodological characteristics. We focused on the diagnostic aspects of the decision models to assess the applicability of existing decision models for the evaluation of the recently revised diagnostic research criteria for AD. PubMed and the National Institute for Health Research Economic Evaluation database were searched for English-language publications related to economic evaluations on diagnostic technologies. Trial-based economic evaluations were assessed using the Consensus on Health Economic Criteria list. Modeling studies were assessed using the framework for quality assessment of decision-analytic models. The search retrieved 2109 items, from which eight decision-analytic modeling studies and one trial-based economic evaluation met all eligibility criteria. Diversity among the study objective and characteristics was considerable and, despite considerable methodological quality, several flaws were indicated. Recommendations were focused on diagnostic aspects and the applicability of existing models for the evaluation of recently revised diagnostic research criteria for AD. Copyright © 2014 The Alzheimer's Association. Published by Elsevier Inc. All rights reserved.

  13. Hypothesis testing of a change point during cognitive decline among Alzheimer's disease patients.

    PubMed

    Ji, Ming; Xiong, Chengjie; Grundman, Michael

    2003-10-01

    In this paper, we present a statistical hypothesis test for detecting a change point over the course of cognitive decline among Alzheimer's disease patients. The model under the null hypothesis assumes a constant rate of cognitive decline over time, and the model under the alternative hypothesis is a general bilinear model with an unknown change point. When the change point is unknown, however, the null distribution of the test statistic is not analytically tractable and has to be simulated by parametric bootstrap. When the alternative hypothesis that a change point exists is accepted, we propose an estimate of its location based on Akaike's Information Criterion. We applied our method to a data set from the Neuropsychological Database Initiative by implementing our hypothesis testing method to analyze Mini-Mental Status Exam (MMSE) scores based on a random-slope and random-intercept model with a bilinear fixed effect. Our result shows that, despite a large amount of missing data, accelerated decline did occur in MMSE scores among AD patients. Our finding supports the clinical belief of the existence of a change point during cognitive decline among AD patients and suggests the use of change point models for the longitudinal modeling of cognitive decline in AD research.
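
    The test logic can be sketched on a single series. The paper fits random-intercept/random-slope mixed models; the simplified version below uses ordinary least squares, profiles the change point over a grid, and bootstraps the likelihood-ratio statistic under the null, all on invented data.

```python
import numpy as np

def fit_ols(X, y):
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = float(((y - X @ beta) ** 2).sum())
    return beta, rss

def bilinear_fit(t, y, taus):
    """Profile the change point tau over a grid; return the best (rss, tau)."""
    best = (np.inf, None)
    for tau in taus:
        X = np.column_stack([np.ones_like(t), t, np.maximum(t - tau, 0.0)])
        _, rss = fit_ols(X, y)
        if rss < best[0]:
            best = (rss, tau)
    return best

def changepoint_test(t, y, taus, n_boot=500, seed=0):
    """Parametric-bootstrap p-value for H0: single slope vs H1: bilinear decline."""
    rng = np.random.default_rng(seed)
    X0 = np.column_stack([np.ones_like(t), t])
    beta0, rss0 = fit_ols(X0, y)
    rss1, tau_hat = bilinear_fit(t, y, taus)
    stat = len(t) * np.log(rss0 / rss1)          # LR statistic up to constants
    sigma = np.sqrt(rss0 / len(t))
    boot = []
    for _ in range(n_boot):                       # simulate under the null
        yb = X0 @ beta0 + rng.normal(0, sigma, len(t))
        b_rss0 = fit_ols(X0, yb)[1]
        b_rss1 = bilinear_fit(t, yb, taus)[0]
        boot.append(len(t) * np.log(b_rss0 / b_rss1))
    return stat, tau_hat, float(np.mean([b >= stat for b in boot]))

# Invented MMSE-like decline accelerating after year 3.
t = np.linspace(0, 6, 60)
y = 28 - 0.5 * t - 2.0 * np.maximum(t - 3, 0) + np.random.default_rng(1).normal(0, 1, 60)
print(changepoint_test(t, y, taus=np.linspace(1, 5, 17)))
```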

  14. Light transport in turbid media with non-scattering, low-scattering and high absorption heterogeneities based on hybrid simplified spherical harmonics with radiosity model

    PubMed Central

    Yang, Defu; Chen, Xueli; Peng, Zhen; Wang, Xiaorui; Ripoll, Jorge; Wang, Jing; Liang, Jimin

    2013-01-01

    Modeling light propagation in the whole body is essential and necessary for optical imaging. However, non-scattering, low-scattering and high absorption regions commonly exist in biological tissues, which lead to inaccuracy of the existing light transport models. In this paper, a novel hybrid light transport model that couples the simplified spherical harmonics approximation (SPN) with the radiosity theory (HSRM) was presented, to accurately describe light transport in turbid media with non-scattering, low-scattering and high absorption heterogeneities. In the model, the radiosity theory was used to characterize the light transport in non-scattering regions and the SPN was employed to handle the scattering problems, including subsets of low-scattering and high absorption. A Neumann source constructed by the light transport in the non-scattering region and formed at the interface between the non-scattering and scattering regions was superposed into the original light source, to couple the SPN with the radiosity theory. The accuracy and effectiveness of the HSRM was first verified with both regular and digital mouse model based simulations and a physical phantom based experiment. The feasibility and applicability of the HSRM was then investigated by a broad range of optical properties. Lastly, the influence of depth of the light source on the model was also discussed. Primary results showed that the proposed model provided high performance for light transport in turbid media with non-scattering, low-scattering and high absorption heterogeneities. PMID:24156077

  15. A systematic comparison of recurrent event models for application to composite endpoints.

    PubMed

    Ozga, Ann-Kathrin; Kieser, Meinhard; Rauch, Geraldine

    2018-01-04

    Many clinical trials focus on the comparison of the treatment effect between two or more groups concerning a rarely occurring event. In this situation, showing a relevant effect with an acceptable power requires the observation of a large number of patients over a long period of time. For feasibility issues, it is therefore often considered to include several event types of interest, non-fatal or fatal, and to combine them within a composite endpoint. Commonly, a composite endpoint is analyzed with standard survival analysis techniques by assessing the time to the first occurring event. This approach neglects that an individual may experience more than one event, which leads to a loss of information. As an alternative, composite endpoints could be analyzed by models for recurrent events. A number of such models exist, e.g. regression models based on count data or Cox-based models such as the approaches of Andersen and Gill; Prentice, Williams and Peterson; or Wei, Lin and Weissfeld. Although some of the methods have already been compared within the literature, there exists no systematic investigation for the special requirements regarding composite endpoints. Within this work a simulation-based comparison of recurrent event models applied to composite endpoints is provided for different realistic clinical trial scenarios. We demonstrate that the Andersen-Gill model and the Prentice-Williams-Peterson models show similar results under various data scenarios, whereas the Wei-Lin-Weissfeld model delivers effect estimators which can considerably deviate under commonly met data scenarios. Based on the conducted simulation study, this paper helps to understand the pros and cons of the investigated methods in the context of composite endpoints and provides recommendations for an adequate statistical analysis strategy and a meaningful interpretation of results.
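
    The practical difference between these models lies largely in how the risk sets are constructed. The sketch below builds the (start, stop, event) counting-process rows used by Andersen-Gill and PWP fits, and the marginal per-rank rows used by Wei-Lin-Weissfeld; the event times are invented, and fitting the actual Cox models is left to a survival library.

```python
def andersen_gill_rows(event_times, follow_up):
    """(start, stop, event) rows: the counting-process layout used by the
    Andersen-Gill model (PWP additionally stratifies by event number)."""
    rows, start = [], 0.0
    for t in sorted(event_times):
        rows.append((start, t, 1))
        start = t
    rows.append((start, follow_up, 0))        # censored tail
    return rows

def wlw_rows(event_times, follow_up, k_max):
    """Marginal (time-from-entry, event) rows per event rank, as in
    Wei-Lin-Weissfeld: a subject is 'at risk' for event k from time 0,
    whether or not event k-1 has occurred."""
    times = sorted(event_times)
    return [(times[k], 1) if k < len(times) else (follow_up, 0)
            for k in range(k_max)]

print(andersen_gill_rows([2.0, 5.5], follow_up=10.0))
# [(0.0, 2.0, 1), (2.0, 5.5, 1), (5.5, 10.0, 0)]
print(wlw_rows([2.0, 5.5], follow_up=10.0, k_max=3))
# [(2.0, 1), (5.5, 1), (10.0, 0)]
```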

  16. A Non-Intrusive Pressure Sensor by Detecting Multiple Longitudinal Waves

    PubMed Central

    Zhou, Hongliang; Lin, Weibin; Ge, Xiaocheng; Zhou, Jian

    2016-01-01

    Pressure vessels are widely used in industrial fields, and some of them are safety-critical components in the system, for example those which contain flammable or explosive material. Therefore, the pressure of these vessels becomes one of the critical measurements for operational management. In this paper, we introduce a new approach to the design of non-intrusive pressure sensors based on ultrasonic waves. The sensor model is built upon the change, with pressure, of the travel times of the critically refracted longitudinal wave (LCR wave) and the reflected longitudinal waves. To evaluate the model, experiments are carried out to compare the proposed model with other existing models. The results show that the proposed model can improve the accuracy compared to models based on a single wave. PMID:27527183
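
    A minimal sketch of the travel-time-to-pressure idea: calibrate a linear map from observed travel-time change to pressure, then invert it for new readings. The calibration numbers are invented, and the paper's actual model fuses several longitudinal waves rather than the single LCR wave used here.

```python
import numpy as np

# Calibration pairs (invented): applied pressure in MPa vs. measured travel-time
# change of the LCR wave in nanoseconds.  A near-linear relation is assumed.
pressure_mpa = np.array([0.0, 2.0, 4.0, 6.0, 8.0])
dt_ns = np.array([0.0, 3.1, 6.0, 9.2, 12.1])

slope, intercept = np.polyfit(dt_ns, pressure_mpa, deg=1)

def pressure_from_dt(dt):
    """Infer pressure (MPa) from an observed LCR travel-time change (ns).
    Fusing estimates from additional reflected waves would reduce the error."""
    return slope * dt + intercept

print(round(pressure_from_dt(7.5), 2), "MPa")
```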

  17. Options for developing modernized geodetic datum for Nepal following the April 25, 2015 Mw7.8 Gorkha earthquake

    NASA Astrophysics Data System (ADS)

    Pearson, Chris; Manandhar, Niraj; Denys, Paul

    2017-09-01

    Along with the damage to buildings and infrastructure, the April 25, 2015 Mw7.8 Gorkha earthquake caused significant deformation over a large area of eastern Nepal, with displacements of over 2 m recorded in the vicinity of Kathmandu. Nepal currently uses a classical datum developed in 1984 by the Royal (UK) Engineers in collaboration with the Nepal Survey Department. It has served Nepal well; however, the recent earthquakes have provided an impetus for developing a semi-dynamic datum that will be based on ITRF2014 and have the capacity to correct for tectonic deformation. In the scenario we present here, the datum would be based on ITRF2014 with a reference epoch set some time after the end of the current sequence of earthquakes. The deformation model contains a grid of the secular velocity field combined with models of the Gorkha earthquake and the May 12 Mw7.3 aftershock. We have developed a preliminary velocity field by collating GPS-derived crustal velocities from four previous studies for Nepal and adjacent parts of China and India and aligning them to the ITRF. Patches for the co-seismic part of the deformation for the Gorkha earthquake and the May 12, 2015 Mw7.3 aftershock are based on published dislocation models. High-order control would be a CORS network based around the existing Nepal GPS Array. Coordinates for existing lower-order control would be determined by readjusting existing survey measurements, and these would be combined with a series of new control stations spread throughout Nepal.
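
    The correction a semi-dynamic datum applies can be sketched in a few lines: propagate an observed coordinate to the reference epoch using the secular velocity plus any coseismic patch displacement. All values below are placeholders; a real datum would interpolate both the velocity grid and the Gorkha/aftershock dislocation patches at the station location.

```python
def to_reference_epoch(coord_obs, t_obs, t_ref, velocity, coseismic=(0.0, 0.0, 0.0)):
    """Propagate an observed ITRF-style coordinate to the datum reference epoch.

    coord_obs : (x, y, z) in metres at observation epoch t_obs (decimal years)
    velocity  : secular velocity (m/yr) interpolated from the deformation grid
    coseismic : signed patch correction for earthquakes between the two epochs
    """
    return tuple(c + v * (t_ref - t_obs) + d
                 for c, v, d in zip(coord_obs, velocity, coseismic))

# A point observed in 2016.5 propagated back to a 2015.0 reference epoch, with
# a few-cm/yr secular velocity and an illustrative 1.2 m coseismic correction.
print(to_reference_epoch((100.0, 200.0, 50.0), 2016.5, 2015.0,
                         velocity=(0.03, 0.025, 0.0), coseismic=(0.0, -1.2, 0.0)))
```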

  18. Fire risk in San Diego County, California: A weighted Bayesian model approach

    USGS Publications Warehouse

    Kolden, Crystal A.; Weigel, Timothy J.

    2007-01-01

    Fire risk models are widely utilized to mitigate wildfire hazards, but models are often based on expert opinions of less understood fire-ignition and spread processes. In this study, we used an empirically derived weights-of-evidence model to assess what factors produce fire ignitions east of San Diego, California. We created and validated a dynamic model of fire-ignition risk based on land characteristics and existing fire-ignition history data, and predicted ignition risk for a future urbanization scenario. We then combined our empirical ignition-risk model with a fuzzy fire behavior-risk model developed by wildfire experts to create a hybrid model of overall fire risk. We found that roads influence fire ignitions and that future growth will increase risk in new rural development areas. We conclude that empirically derived risk models and hybrid models offer an alternative method to assess current and future fire risk based on management actions.
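
    The weights-of-evidence calculation at the heart of such models reduces to log-ratios of conditional probabilities for each binary evidence layer. The counts below are invented; a real application would tabulate ignition points against gridded evidence layers such as road proximity.

```python
import math

def weights_of_evidence(fires_in, fires_out, area_in, area_out):
    """Weights of evidence for a binary predictor pattern B (e.g. 'within
    100 m of a road'): W+ = ln[P(B|fire)/P(B|no fire)], W- likewise for
    absence of B.  Cell counts are used here as a simple non-fire proxy.
    """
    p_b_fire = fires_in / (fires_in + fires_out)
    p_b_nofire = area_in / (area_in + area_out)
    w_plus = math.log(p_b_fire / p_b_nofire)
    w_minus = math.log((1 - p_b_fire) / (1 - p_b_nofire))
    return w_plus, w_minus, w_plus - w_minus   # contrast = evidence strength

# Invented counts: ignitions cluster near roads -> positive W+ and contrast.
print(weights_of_evidence(fires_in=80, fires_out=20, area_in=3000, area_out=7000))
```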

  19. NASA Integrated Network Monitor and Control Software Architecture

    NASA Technical Reports Server (NTRS)

    Shames, Peter; Anderson, Michael; Kowal, Steve; Levesque, Michael; Sindiy, Oleg; Donahue, Kenneth; Barnes, Patrick

    2012-01-01

    The National Aeronautics and Space Administration (NASA) Space Communications and Navigation office (SCaN) has commissioned a series of trade studies to define a new architecture intended to integrate the three existing networks that it operates, the Deep Space Network (DSN), Space Network (SN), and Near Earth Network (NEN), into one integrated network that offers users a set of common, standardized services and interfaces. The integrated monitor and control architecture utilizes common software and common operator interfaces that can be deployed at all three network elements. This software uses state-of-the-art concepts such as a pool of re-programmable equipment that acts like a configurable software radio, distributed hierarchical control, and centralized management of the whole SCaN integrated network. For this trade space study a model-based approach using SysML was adopted to describe and analyze several possible options for the integrated network monitor and control architecture. This model was used to refine the design and to drive the costing of the four different software options. This trade study modeled the three existing self-standing network elements at the point of departure, and then described how to integrate them using variations of new and existing monitor and control system components for the different proposed deployments under consideration. This paper will describe the trade space explored, the selected system architecture, the modeling and trade study methods, and some observations on useful approaches to implementing such model-based trade space representation and analysis.

  1. SECURITY MODELING FOR MARITIME PORT DEFENSE RESOURCE ALLOCATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, S.; Dunn, D.

    2010-09-07

    Redeployment of existing law enforcement resources and optimal use of geographic terrain are examined for countering the threat of a maritime-based small-vessel radiological or nuclear attack. The evaluation was based on modeling conducted by the Savannah River National Laboratory that involved the development of options for defensive resource allocation that can reduce the risk of a maritime-based radiological or nuclear threat. A diverse range of potential attack scenarios has been assessed. As a result of identifying vulnerable pathways, effective countermeasures can be deployed using current resources. The modeling involved the use of the Automated Vulnerability Evaluation for Risks of Terrorism (AVERT®) software to conduct computer-based simulation modeling. The models provided estimates for the probability of encountering an adversary based on allocated resources including response boats, patrol boats and helicopters over various environmental conditions including day, night, rough seas and various traffic flow rates.

  2. A Stratified Acoustic Model Accounting for Phase Shifts for Underwater Acoustic Networks

    PubMed Central

    Wang, Ping; Zhang, Lin; Li, Victor O. K.

    2013-01-01

    Accurate acoustic channel models are critical for the study of underwater acoustic networks. Existing models include physics-based models and empirical approximation models. The former enjoy good accuracy, but incur heavy computational load, rendering them impractical in large networks. On the other hand, the latter are computationally inexpensive but inaccurate since they do not account for the complex effects of boundary reflection losses, the multi-path phenomenon and ray bending in the stratified ocean medium. In this paper, we propose a Stratified Acoustic Model (SAM) based on frequency-independent geometrical ray tracing, accounting for each ray's phase shift during the propagation. It is a feasible channel model for large scale underwater acoustic network simulation, allowing us to predict the transmission loss with much lower computational complexity than the traditional physics-based models. The accuracy of the model is validated via comparisons with the experimental measurements in two different oceans. Satisfactory agreements with the measurements and with other computationally intensive classical physics-based models are demonstrated. PMID:23669708
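
    The phase-aware ray summation that distinguishes SAM from incoherent ray models can be illustrated in a few lines. The eigenray list, sound speed and frequency below are invented; a real implementation would trace rays through the stratified sound-speed profile to obtain amplitudes, path lengths and boundary phase shifts.

```python
import cmath
import math

def coherent_tl(arrivals, freq_hz, c=1500.0):
    """Transmission loss from eigenray arrivals, keeping each ray's phase.

    arrivals: list of (amplitude, path_length_m, phase_shift_rad) per eigenray,
    where phase_shift_rad collects boundary-reflection and caustic phase changes.
    Incoherent models drop the phase term and sum |amplitude|^2 instead.
    """
    k = 2 * math.pi * freq_hz / c                 # acoustic wavenumber
    field = sum(a * cmath.exp(1j * (k * L + phi)) for a, L, phi in arrivals)
    return -20 * math.log10(max(abs(field), 1e-12))

# Two invented eigenrays: a direct path and a surface bounce with a pi phase flip.
rays = [(1.0 / 1000.0, 1000.0, 0.0), (0.8 / 1010.0, 1010.0, math.pi)]
print(round(coherent_tl(rays, freq_hz=1000.0), 1), "dB")
```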

  3. A novel client service quality measuring model and an eHealthcare mitigating approach.

    PubMed

    Cheng, L M; Choi, Wai Ping Choi; Wong, Anita Yiu Ming

    2016-07-01

    Facing population ageing in Hong Kong, the demand for long-term elderly health care services is increasing. The challenge is to support a good quality service, under the constraints of the recent shortage of nursing and care services professionals, without redesigning the workflow operated in the existing elderly health care industries. The total QoS measure based on a finite capacity queuing model is a reliable and effective measurement of quality of service. The value is good for measuring the staffing level and offers a measurement for efficiency enhancement when incorporating new technologies such as ICT. The implemented system has improved the quality of service by more than 14%, and the released manpower resource will allow clinical care providers to offer further value-added services without actually increasing head count. We have developed a novel quality of service measurement for clinical care services based on multiple queues using the finite capacity queue model M/M/c/K/n, and the measurement is useful for estimating the shortage of staff resources in a caring institution. It is essential for future integration with the existing widely used assessment model to develop reliable measuring limits which allow an effective measurement of public funds used in health care industries. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
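
    As a sketch of the finite-capacity queueing measure behind such a QoS index, the code below computes steady-state metrics of an M/M/c/K queue. The paper's M/M/c/K/n model additionally has a finite source population n, which this simplification drops; the example arrival and service rates are invented.

```python
import math

def mmck_metrics(lam, mu, c, K):
    """Steady-state blocking probability p_K and mean occupancy L of an
    M/M/c/K queue (c servers, total capacity K, Poisson arrivals at rate
    lam, exponential service at rate mu per server)."""
    a = lam / mu
    raw = []
    for n in range(K + 1):
        if n <= c:
            raw.append(a ** n / math.factorial(n))
        else:
            raw.append(a ** n / (math.factorial(c) * c ** (n - c)))
    norm = sum(raw)
    p = [x / norm for x in raw]
    L = sum(n * pn for n, pn in enumerate(p))
    return p[K], L

# Example: 3 carers, room for 8 clients in total, arrivals 2/hr, service 1/hr each.
print(mmck_metrics(lam=2.0, mu=1.0, c=3, K=8))
```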

  5. Parcels versus pixels: modeling agricultural land use across broad geographic regions using parcel-based field boundaries

    USGS Publications Warehouse

    Sohl, Terry L.; Dornbierer, Jordan; Wika, Steve; Sayler, Kristi L.; Quenzer, Robert

    2017-01-01

    Land use and land cover (LULC) change occurs at a local level within contiguous ownership and management units (parcels), yet LULC models primarily use pixel-based spatial frameworks. The few parcel-based models being used overwhelmingly focus on small geographic areas, limiting the ability to assess LULC change impacts at regional to national scales. We developed a modified version of the Forecasting Scenarios of land use change model to project parcel-based agricultural change across a large region in the United States Great Plains. An agricultural biofuel scenario was modeled from 2012 to 2030, using real parcel boundaries based on contiguous ownership and land management units. The resulting LULC projection provides a vastly improved representation of landscape pattern over existing pixel-based models, while simultaneously providing an unprecedented combination of thematic detail and broad geographic extent. The conceptual approach is practical and scalable, with potential use for national-scale projections.

  6. The effects of spatial dynamics on a wormhole throat

    NASA Astrophysics Data System (ADS)

    Alias, Anuar; Wan Abdullah, Wan Ahmad Tajuddin

    2018-02-01

    Previous studies on dynamic wormholes focused on the dynamics of the wormhole itself, be it rotating or evolutionary in character, and worked in various frameworks from classical to braneworld cosmological models. In this work, we modeled a dynamic factor that represents the spatial dynamics, in terms of spacetime expansion and contraction, surrounding the wormhole itself. Using an RS2-based braneworld cosmological model, we modified the spacetime metric of Wong and subsequently employed the method of Bronnikov. We observe that a traversable wormhole exists more readily in an expanding brane universe, but only with difficulty in a contracting brane universe, owing to the stress-energy tensor requirements. This model of a spatial dynamic factor affecting the wormhole throat can also be applied to cyclic or bounce universe models.

  7. Pursuing the method of multiple working hypotheses to understand differences in process-based snow models

    NASA Astrophysics Data System (ADS)

    Clark, Martyn; Essery, Richard

    2017-04-01

    When faced with the complex and interdisciplinary challenge of building process-based land models, different modelers make different decisions at different points in the model development process. These modeling decisions are generally based on several considerations, including fidelity (e.g., what approaches faithfully simulate observed processes), complexity (e.g., which processes should be represented explicitly), practicality (e.g., what is the computational cost of the model simulations; are there sufficient resources to implement the desired modeling concepts), and data availability (e.g., is there sufficient data to force and evaluate models). Consequently, the research community, comprising modelers of diverse background, experience, and modeling philosophy, has amassed a wide range of models, which differ in almost every aspect of their conceptualization and implementation. Model comparison studies have been undertaken to explore model differences, but have not been able to meaningfully attribute inter-model differences in predictive ability to individual model components because there are often too many structural and implementation differences among the different models considered. As a consequence, model comparison studies to date have provided limited insight into the causes of differences in model behavior, and model development has often relied on the inspiration and experience of individual modelers rather than on a systematic analysis of model shortcomings. This presentation will summarize the use of "multiple-hypothesis" modeling frameworks to understand differences in process-based snow models. Multiple-hypothesis frameworks define a master modeling template, and include a wide variety of process parameterizations and spatial configurations that are used in existing models. Such frameworks provide the capability to decompose complex models into the individual decisions that are made as part of model development, and evaluate each decision in isolation. It is hence possible to attribute differences in system-scale model predictions to individual modeling decisions, providing scope to mimic the behavior of existing models, understand why models differ, characterize model uncertainty, and identify productive pathways to model improvement. Results will be presented applying multiple hypothesis frameworks to snow model comparison projects, including PILPS, SnowMIP, and the upcoming ESM-SnowMIP project.
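
    The "master template" idea can be illustrated with a toy sketch in which one modeling decision, here the snow albedo parameterization, is swapped while everything else is held fixed. The parameterizations and constants below are hypothetical stand-ins, not those of any actual snow model.

        import math

        # Toy multiple-hypothesis template: the model structure is fixed,
        # while one modeling decision (the albedo parameterization) is
        # pluggable and can be evaluated in isolation.

        def albedo_linear_decay(age_days):
            # Hypothesis 1: albedo decays linearly with snow age.
            return max(0.5, 0.85 - 0.02 * age_days)

        def albedo_exponential_decay(age_days):
            # Hypothesis 2: albedo relaxes exponentially toward an old-snow value.
            return 0.5 + (0.85 - 0.5) * math.exp(-age_days / 10.0)

        def absorbed_shortwave(sw_down, albedo_fn, age_days):
            # Fixed part of the template: shortwave energy absorbed by the pack.
            return (1.0 - albedo_fn(age_days)) * sw_down

        for hypothesis in (albedo_linear_decay, albedo_exponential_decay):
            print(hypothesis.__name__,
                  [round(absorbed_shortwave(200.0, hypothesis, d), 1)
                   for d in (0, 5, 15)])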

  8. The Asthma Dialogues: A Model of Interactive Education for Skills

    ERIC Educational Resources Information Center

    Morrow, Robert; Fletcher, Jason; Mulvihill, Michael; Park, Heidi

    2007-01-01

    Introduction: A gap exists between asthma guidelines and actual care delivered. We developed an educational intervention using simulated physician-patient encounters as part of a project to improve asthma management by community-based primary care providers. We hypothesized that this type of skills-based interactive training would improve…

  9. Designing a Competency-Based Program in Veterinary Public Health and Preventive Medicine for the Professional Curriculum

    ERIC Educational Resources Information Center

    Selby, Lloyd A.; And Others

    1976-01-01

    A five-day workshop was successful in fulfilling its prime objective, development of a competency-based curriculum for veterinary public health and preventive medicine (VPH & PM). The model now may be used to re-evaluate and, where necessary, revise existing curriculums. (LBH)

  10. A two-fluid model of the solar wind

    NASA Technical Reports Server (NTRS)

    Sandbaek, O.; Leer, E.; Holzer, T. E.

    1992-01-01

    A method is presented for the integration of the two-fluid solar-wind equations which is applicable to a wide variety of coronal base densities and temperatures. The method involves proton heat conduction, and may be applied to coronal base conditions for which subsonic-supersonic solar wind solutions exist.

  11. An Open Trial of an Acceptance-Based Behavior Therapy for Generalized Anxiety Disorder

    ERIC Educational Resources Information Center

    Roemer, Lizabeth; Orsillo, Susan M.

    2007-01-01

    Research suggests that experiential avoidance may play an important role in generalized anxiety disorder (GAD; see Roemer, L., & Orsillo, S.M. (2002). "Expanding our conceptualization of and treatment for generalized anxiety disorder: Integrating mindfulness/acceptance-based approaches with existing cognitive-behavioral models." "Clinical…

  12. Combinational Reasoning of Quantitative Fuzzy Topological Relations for Simple Fuzzy Regions

    PubMed Central

    Liu, Bo; Li, Dajun; Xia, Yuanping; Ruan, Jian; Xu, Lili; Wu, Huanyi

    2015-01-01

    In recent years, formalization and reasoning of topological relations have become a hot topic as a means to generate knowledge about the relations between spatial objects at the conceptual and geometrical levels. These mechanisms have been widely used in spatial data query, spatial data mining, evaluation of equivalence and similarity in a spatial scene, as well as for consistency assessment of the topological relations of multi-resolution spatial databases. The concept of computational fuzzy topological space is applied to simple fuzzy regions to efficiently and more accurately solve fuzzy topological relations. Thus, extending the existing research and improving upon the previous work, this paper presents a new method to describe fuzzy topological relations between simple spatial regions in Geographic Information Sciences (GIS) and Artificial Intelligence (AI). First, we propose new definitions for simple fuzzy line segments and simple fuzzy regions based on the computational fuzzy topology. Then, based on the new definitions, we propose a new combinational reasoning method to compute the topological relations between simple fuzzy regions. This study finds that there are (1) 23 different topological relations between a simple crisp region and a simple fuzzy region and (2) 152 different topological relations between two simple fuzzy regions. Finally, we discuss some examples to demonstrate the validity of the new method; comparisons with existing fuzzy models show that the proposed method can compute more relations, as it is more expressive than the existing fuzzy models. PMID:25775452
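
    The intersection-matrix style of reasoning that the paper extends can be seen in its crisp form with standard GIS tooling: the DE-9IM matrix records how the interiors, boundaries, and exteriors of two crisp regions intersect, and the fuzzy definitions enlarge this taxonomy to the 23 and 152 relations reported above. A minimal crisp-analogue sketch, assuming the shapely library is available:

        from shapely.geometry import Polygon

        # Two crisp regions; the paper's fuzzy definitions generalize the
        # interior/boundary/exterior decomposition used here.
        a = Polygon([(0, 0), (4, 0), (4, 4), (0, 4)])
        b = Polygon([(2, 2), (6, 2), (6, 6), (2, 6)])

        # DE-9IM: a 3x3 matrix over {interior, boundary, exterior} pairs,
        # serialized as a 9-character string of intersection dimensions.
        print(a.relate(b))    # '212101212' for two overlapping squares
        print(a.overlaps(b))  # a named predicate derived from the matrix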

  13. Cost-effectiveness of possible future smoking cessation strategies in Hungary: results from the EQUIPTMOD.

    PubMed

    Németh, Bertalan; Józwiak-Hagymásy, Judit; Kovács, Gábor; Kovács, Attila; Demjén, Tibor; Huber, Manuel B; Cheung, Kei-Long; Coyle, Kathryn; Lester-George, Adam; Pokhrel, Subhash; Vokó, Zoltán

    2018-01-25

    To evaluate potential health and economic returns from implementing smoking cessation interventions in Hungary. The EQUIPTMOD, a Markov-based economic model, was used to assess the cost-effectiveness of three implementation scenarios: (a) introducing a social marketing campaign; (b) doubling the reach of existing group-based behavioural support therapies and proactive telephone support; and (c) a combination of the two scenarios. All three scenarios were compared with current practice. The scenarios were chosen as feasible options available for Hungary based on the outcome of interviews with local stakeholders. Life-time costs and quality-adjusted life years (QALYs) were calculated from a health-care perspective. The analyses used various return on investment (ROI) estimates, including incremental cost-effectiveness ratios (ICERs), to compare the scenarios. Probabilistic sensitivity analyses assessed the extent to which the estimated mean ICERs were sensitive to the model input values. Introducing a social marketing campaign resulted in an increase of 0.3014 additional quitters per 1,000 smokers, translating to health-care cost-savings of €0.6495 per smoker compared with current practice. When the value of QALY gains was considered, cost-savings increased to €14.1598 per smoker. Doubling the reach of existing group-based behavioural support therapies and proactive telephone support resulted in health-care savings of €0.2539 per smoker (€3.9620 with the value of QALY gains), compared with current practice. The respective figures for the combined scenario were €0.8960 and €18.0062. Results were sensitive to model input values. According to the EQUIPTMOD modelling tool, it would be cost-effective for the Hungarian authorities to introduce a social marketing campaign and double the reach of existing group-based behavioural support therapies and proactive telephone support. Such policies would more than pay for themselves in the long term. © 2018 The Authors. Addiction published by John Wiley & Sons Ltd on behalf of Society for the Study of Addiction.
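
    The ROI arithmetic behind such comparisons reduces to incremental cost-effectiveness ratios. The sketch below shows the generic ICER calculation with made-up numbers; it is not the EQUIPTMOD itself, which is a full Markov state-transition model.

        def icer(cost_new, qaly_new, cost_cur, qaly_cur):
            """Incremental cost-effectiveness ratio: extra cost per QALY gained.

            A negative cost difference with a positive QALY gain means the new
            strategy dominates (cheaper and more effective), as in the
            cost-saving scenarios reported above.
            """
            d_cost = cost_new - cost_cur
            d_qaly = qaly_new - qaly_cur
            if d_qaly <= 0:
                raise ValueError("no QALY gain; ICER not informative")
            return d_cost / d_qaly

        # Hypothetical per-smoker values, not EQUIPTMOD outputs.
        print(icer(cost_new=99.35, qaly_new=12.347,
                   cost_cur=100.00, qaly_cur=12.345))  # negative => dominant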

  14. Novel approach for dam break flow modeling using computational intelligence

    NASA Astrophysics Data System (ADS)

    Seyedashraf, Omid; Mehrabi, Mohammad; Akhtari, Ali Akbar

    2018-04-01

    A new methodology based on the computational intelligence (CI) system is proposed and tested for modeling the classic 1D dam-break flow problem. The reason to seek a new solution lies in the shortcomings of the existing analytical and numerical models. This includes the difficulty of using the exact solutions and the unwanted fluctuations which arise in the numerical results. In this research, the application of the radial-basis-function (RBF) and multi-layer-perceptron (MLP) systems is detailed for the solution of twenty-nine dam-break scenarios. The models are developed using seven variables, i.e. the length of the channel, the depths of the up- and downstream sections, time, and distance as the inputs. Moreover, the depths and velocities of each computational node in the flow domain are considered as the model outputs. The models are validated against the analytical solution and the Lax-Wendroff and MacCormack FDM schemes. The findings indicate that the employed CI models are able to replicate the overall shape of the shock and rarefaction waves. Furthermore, the MLP system outperforms RBF and the tested numerical schemes. A new monolithic equation is proposed based on the best fitting model, which can be used as an efficient alternative to the existing piecewise analytic equations.
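
    The CI setup described, a handful of geometric and temporal inputs mapped to nodal depth and velocity, can be sketched with a standard multi-layer perceptron. The feature layout, synthetic data, and network size below are illustrative assumptions; the paper's twenty-nine scenarios and exact architecture are not reproduced.

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(0)

        # Illustrative stand-in data: columns play the role of channel length,
        # upstream depth, downstream depth, time, and distance (a subset of
        # the seven inputs named in the abstract).
        X = rng.uniform(size=(2000, 5))
        h = 0.5 * X[:, 1] + 0.2 * X[:, 2] - 0.1 * X[:, 4]   # fake depth field
        v = 0.3 * X[:, 1] - 0.2 * X[:, 2] + 0.05 * X[:, 3]  # fake velocity field
        Y = np.column_stack([h, v])

        model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000,
                             random_state=0).fit(X[:1500], Y[:1500])
        print("R^2 on held-out scenarios:", model.score(X[1500:], Y[1500:]))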

  15. The species-area relationship and evolution.

    PubMed

    Lawson, Daniel; Jensen, Henrik Jeldtoft

    2006-08-07

    Models relating to the species-area curve usually assume the existence of species, and are concerned mainly with ecological timescales. We examine an individual-based model of co-evolution on a spatial lattice based on the tangled nature model in which species are emergent structures, and show that reproduction, mutation and dispersion by diffusion, with interaction via genotype space, produces power-law species-area relations as observed in ecological measurements at medium scales. We find that long-lasting co-evolutionary habitats form, allowing high diversity levels in a spatially homogeneous system.
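
    The power-law species-area relation S = cA^z that the model reproduces is conventionally fitted on log-log axes. A minimal sketch with synthetic data (the areas, counts, and exponent below are invented for illustration):

        import numpy as np

        # Synthetic survey: species counts following S = c * A^z plus noise.
        rng = np.random.default_rng(1)
        areas = np.array([1, 4, 16, 64, 256, 1024], dtype=float)
        species = 3.0 * areas ** 0.25 * np.exp(rng.normal(0, 0.05, areas.size))

        # Linear regression in log space: log S = log c + z log A.
        z, log_c = np.polyfit(np.log(areas), np.log(species), 1)
        print(f"estimated exponent z = {z:.3f}, prefactor c = {np.exp(log_c):.3f}")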

  16. Vision-based building energy diagnostics and retrofit analysis using 3D thermography and building information modeling

    NASA Astrophysics Data System (ADS)

    Ham, Youngjib

    The emerging energy crisis in the building sector and the legislative measures on improving energy efficiency are steering the construction industry towards adopting new energy-efficient design concepts and construction methods that decrease the overall energy loads. However, the problems of energy efficiency are not limited to the design and construction of new buildings. Today, a significant amount of input energy in existing buildings is still being wasted during the operational phase. One primary source of this energy waste is unnecessary heat flow through building envelopes during hot and cold seasons. This inefficiency increases the operational frequency of heating and cooling systems to maintain the desired thermal comfort of building occupants, and ultimately results in excessive energy use. Improving the thermal performance of building envelopes can reduce the energy consumption required for space conditioning and in turn provide building occupants with optimal thermal comfort at a lower energy cost. In this sense, energy diagnostics and retrofit analysis for existing building envelopes are key enablers for improving energy efficiency. Since proper retrofit decisions for existing buildings directly translate into energy cost savings in the future, building practitioners are increasingly interested in methods for reliable identification of potential performance problems so that they can take timely corrective actions. However, sensing what and where energy problems are emerging or are likely to emerge and then analyzing how the problems influence the energy consumption are not trivial tasks. The overarching goal of this dissertation focuses on understanding the gaps in knowledge in methods for building energy diagnostics and retrofit analysis, and filling these gaps by devising a new method for multi-modal visual sensing and analytics using thermography and Building Information Modeling (BIM). First, to address the scaling and localization challenges of 2D thermal image-based inspection, a new computer vision-based method is presented for automated 3D spatio-thermal modeling of building environments from images and localizing the thermal images into the 3D reconstructed scenes, which helps better characterize the as-is condition of existing buildings in 3D. By using these models, auditors can conduct virtual walk-throughs of buildings and explore the as-is condition of building geometry and the associated thermal conditions in 3D. Second, to address the challenges in qualitative and subjective interpretation of visual data, a new model-based method is presented to convert the 3D thermal profiles of building environments into their associated energy performance metrics. More specifically, the Energy Performance Augmented Reality (EPAR) models are formed, which integrate the actual 3D spatio-thermal models ('as-is') with energy performance benchmarks ('as-designed') in 3D. In the EPAR models, the presence and location of potential energy problems in building environments are inferred based on performance deviations. The as-is thermal resistances of the building assemblies are also calculated at the mesh-vertex level in 3D. Then, based on the historical weather data reflecting energy load for space conditioning, the amount of heat transfer that can be saved by improving the as-is thermal resistances of the defective areas to the recommended level is calculated, and the equivalent energy cost for this saving is estimated. 
The outcome provides building practitioners with unique information that can facilitate energy-efficient retrofit decision-making. This is a major departure from offhand calculations that are based on historical cost data of industry best practices. Finally, to improve the reliability of BIM-based energy performance modeling and analysis for existing buildings, a new model-based automated method is presented to map actual thermal resistance measurements at the level of 3D vertexes to the associated BIM elements and update their corresponding thermal properties in the gbXML schema. By reflecting the as-is building condition in the BIM-based energy modeling process, this method bridges the gap between the architectural information in the as-designed BIM and the as-is building condition for accurate energy performance analysis. The performance of each method was validated on ten case studies from interiors and exteriors of existing residential and instructional buildings in IL and VA. The extensive experimental results show the promise of the proposed methods in addressing the fundamental challenges of (1) visual sensing: scaling 2D visual assessments to real-world building environments and localizing energy problems; (2) analytics: subjective and qualitative assessments; and (3) BIM-based building energy analysis: a lack of procedures for reflecting the as-is building condition in the energy modeling process. Beyond the technical contributions, the domain expert surveys conducted in this dissertation show that the proposed methods have potential to improve the quality of thermographic inspection processes and complement the current building energy analysis tools.
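
    The "as-is" thermal resistance recovered at each mesh vertex follows from a steady-state surface energy balance. One common simplified form, shown below with hypothetical temperatures and an assumed interior film coefficient (this is a generic thermography formula, not necessarily the dissertation's exact procedure), divides the air-to-air temperature difference by the heat flux inferred at the interior surface:

        def as_is_r_value(t_in_air, t_out_air, t_in_surface, h_in=7.7):
            """Simplified steady-state estimate of wall thermal resistance.

            The flux is inferred from the interior film:
            q = h_in * (T_in,air - T_in,surface), where h_in (W/m^2K) is an
            assumed standard interior film coefficient. Returns R in m^2K/W.
            """
            q = h_in * (t_in_air - t_in_surface)  # heat flux, W/m^2
            if q <= 0:
                raise ValueError("no measurable flux; need a temperature gap")
            return (t_in_air - t_out_air) / q

        # Hypothetical thermography reading at one mesh vertex (deg C).
        print(as_is_r_value(t_in_air=21.0, t_out_air=-2.0, t_in_surface=18.5))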

  17. A Kernel-Based Low-Rank (KLR) Model for Low-Dimensional Manifold Recovery in Highly Accelerated Dynamic MRI.

    PubMed

    Nakarmi, Ukash; Wang, Yanhua; Lyu, Jingyuan; Liang, Dong; Ying, Leslie

    2017-11-01

    While many low rank and sparsity-based approaches have been developed for accelerated dynamic magnetic resonance imaging (dMRI), they all use low rankness or sparsity in input space, overlooking the intrinsic nonlinear correlation in most dMRI data. In this paper, we propose a kernel-based framework to allow nonlinear manifold models in reconstruction from sub-Nyquist data. Within this framework, many existing algorithms can be extended to kernel framework with nonlinear models. In particular, we have developed a novel algorithm with a kernel-based low-rank model generalizing the conventional low rank formulation. The algorithm consists of manifold learning using kernel, low rank enforcement in feature space, and preimaging with data consistency. Extensive simulation and experiment results show that the proposed method surpasses the conventional low-rank-modeled approaches for dMRI.

  18. Proton channel models

    PubMed Central

    Pupo, Amaury; Baez-Nieto, David; Martínez, Agustín; Latorre, Ramón; González, Carlos

    2014-01-01

    Voltage-gated proton channels are integral membrane proteins with the capacity to permeate elementary particles in a voltage and pH dependent manner. These proteins have been found in several species and are involved in various physiological processes. Although their primary topology is known, lack of details regarding their structures in the open conformation has limited analyses toward a deeper understanding of the molecular determinants of their function and regulation. Consequently, the function-structure relationships have been inferred based on homology models. In the present work, we review the existing proton channel models, their assumptions, predictions and the experimental facts that support them. Modeling proton channels is not a trivial task due to the lack of a close homolog template. Hence, there are important differences between published models. This work attempts to critically review existing proton channel models toward the aim of contributing to a better understanding of the structural features of these proteins. PMID:24755912

  19. Mesoscopic and continuum modelling of angiogenesis

    PubMed Central

    Spill, F.; Guerrero, P.; Alarcon, T.; Maini, P. K.; Byrne, H. M.

    2016-01-01

    Angiogenesis is the formation of new blood vessels from pre-existing ones in response to chemical signals secreted by, for example, a wound or a tumour. In this paper, we propose a mesoscopic lattice-based model of angiogenesis, in which processes that include proliferation and cell movement are considered as stochastic events. By studying the dependence of the model on the lattice spacing and the number of cells involved, we are able to derive the deterministic continuum limit of our equations and compare it to similar existing models of angiogenesis. We further identify conditions under which the use of continuum models is justified, and others for which stochastic or discrete effects dominate. We also compare different stochastic models for the movement of endothelial tip cells which have the same macroscopic, deterministic behaviour, but lead to markedly different behaviour in terms of production of new vessel cells. PMID:24615007

  20. Modeling highly transient flow, mass, and heat transport in the Chattahoochee River near Atlanta, Georgia

    USGS Publications Warehouse

    Jobson, Harvey E.; Keefer, Thomas N.

    1979-01-01

    A coupled flow-temperature model has been developed and verified for a 27.9-km reach of the Chattahoochee River between Buford Dam and Norcross, Ga. Flow in this reach of the Chattahoochee is continuous but highly regulated by Buford Dam, a flood-control and hydroelectric facility located near Buford, Ga. Calibration and verification utilized two sets of data collected under highly unsteady discharge conditions. Existing solution techniques, with certain minor improvements, were applied to verify the existing technology of flow and transport modeling. A linear, implicit finite-difference flow model was coupled with implicit, finite-difference transport and temperature models. Both the conservative and nonconservative forms of the transport equation were solved, and the difference in the predicted concentrations of dye were found to be insignificant. The temperature model, therefore, was based on the simpler nonconservative form of the transport equation. (Woodard-USGS)
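
    The two forms of the one-dimensional transport equation compared in the study can be written, in a common notation with cross-sectional area A, discharge Q, velocity u = Q/A, dispersion coefficient D, and concentration C (a sketch of the standard textbook forms, not the report's exact notation):

        % Conservative form
        \frac{\partial (AC)}{\partial t} + \frac{\partial (QC)}{\partial x}
          = \frac{\partial}{\partial x}\left( A D \frac{\partial C}{\partial x} \right)

        % Nonconservative form (expand and apply continuity)
        \frac{\partial C}{\partial t} + u \frac{\partial C}{\partial x}
          = \frac{1}{A}\frac{\partial}{\partial x}\left( A D \frac{\partial C}{\partial x} \right),
        \qquad u = Q/A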

  1. Non-minimally coupled tachyon field in teleparallel gravity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fazlpour, Behnaz; Banijamali, Ali, E-mail: b.fazlpour@umz.ac.ir, E-mail: a.banijamali@nit.ac.ir

    2015-04-01

    We perform a full investigation of the dynamics of a new dark energy model in which the four-derivative of a non-canonical scalar field (tachyon) is non-minimally coupled to the vector torsion. Our analysis is done in the framework of the teleparallel equivalent of general relativity, which is based on torsion instead of curvature. We show that in our model there exists a late-time scaling attractor (point P_4), corresponding to an accelerating universe with the property that the dark energy and dark matter densities are of the same order. Such a point can help to alleviate the cosmological coincidence problem. The existence of this point is the most significant difference between our model and another model in which a canonical scalar field (quintessence) is used instead of the tachyon field.

  2. Usability evaluation of mobile applications; where do we stand?

    NASA Astrophysics Data System (ADS)

    Zahra, Fatima; Hussain, Azham; Mohd, Haslina

    2017-10-01

    The range and availability of mobile applications are expanding rapidly. With the increased processing power available on portable devices, developers are expanding the range of services by embracing smartphones in their extensive and diverse practices. However, usability testing and evaluation of mobile applications have not yet reached the level of accuracy achieved for web-based applications. The existing usability models do not adequately capture the complexities of interacting with applications on a mobile platform. Therefore, this study presents a review of existing usability models for mobile applications. These models are in their infancy, but with time and more research they may eventually be adopted. Moreover, different categories of mobile apps (medical, entertainment, education) possess different functional and non-functional requirements, so customized models are required for diverse mobile applications.

  3. Prospects for rebuilding primary care using the patient-centered medical home.

    PubMed

    Landon, Bruce E; Gill, James M; Antonelli, Richard C; Rich, Eugene C

    2010-05-01

    Existing research suggests that models of enhanced primary care lead to health care systems with better performance. What the research does not show is whether such an approach is feasible or likely to be effective within the U.S. health care system. Many commentators have adopted the model of the patient-centered medical home as policy shorthand to address the reinvention of primary care in the United States. We analyze potential barriers to implementing the medical home model for policy makers and practitioners. Among others, these include developing new payment models, as well as the need for up-front funding to assemble the personnel and infrastructure required by an enhanced non-visit-based primary care practice and methods to facilitate transformation of existing practices to functioning medical homes.

  4. Research and Implementation of Tibetan Word Segmentation Based on Syllable Methods

    NASA Astrophysics Data System (ADS)

    Jiang, Jing; Li, Yachao; Jiang, Tao; Yu, Hongzhi

    2018-03-01

    Tibetan word segmentation (TWS) is an important problem in Tibetan information processing, and abbreviated word recognition is one of the key and most difficult problems in TWS. Most of the existing methods of Tibetan abbreviated word recognition are rule-based approaches, which need vocabulary support. In this paper, we propose a method based on a sequence tagging model for abbreviated word recognition, and then implement it in TWS systems with sequence labeling models. The experimental results show that our abbreviated word recognition method is fast and effective and can be combined easily with the segmentation model. This significantly improves the effectiveness of Tibetan word segmentation.

  5. Approaching mathematical model of the immune network based DNA Strand Displacement system.

    PubMed

    Mardian, Rizki; Sekiyama, Kosuke; Fukuda, Toshio

    2013-12-01

    One of the biggest obstacles in molecular programming is that there is still no direct method to compile an existing mathematical model into biochemical reactions in order to solve a computational problem. In this paper, the implementation of a DNA Strand Displacement system based on nature-inspired computation is examined. Using the Immune Network Theory and a Chemical Reaction Network, the compilation of the DNA-based operation is defined and the formulation of its mathematical model is derived. Furthermore, the implementation of this system is compared with a conventional implementation using silicon-based programming. From the obtained results, we can see a positive correlation between the two. One possible application of this DNA-based model is a decision-making scheme for an intelligent computer or molecular robot. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  6. Query Language for Location-Based Services: A Model Checking Approach

    NASA Astrophysics Data System (ADS)

    Hoareau, Christian; Satoh, Ichiro

    We present a model checking approach to the rationale, implementation, and applications of a query language for location-based services. Such query mechanisms are necessary so that users, objects, and/or services can effectively benefit from the location-awareness of their surrounding environment. The underlying data model is founded on a symbolic model of space organized in a tree structure. Once extended to a semantic model for modal logic, we regard location query processing as a model checking problem, and thus define location queries as hybrid logic-based formulas. Our approach differs from existing research because it explores the connection between location models and query processing in ubiquitous computing systems, relies on a sound theoretical basis, and provides modal logic-based query mechanisms for expressive searches over a decentralized data structure. A prototype implementation is also presented and will be discussed.
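
    The tree-structured symbolic space model can be pictured with a minimal containment hierarchy and a query that checks whether one location lies within another, which is the crisp kernel of the model-checking formulation. All names below are invented for illustration.

        # Symbolic space as a parent-pointer tree: child -> containing location.
        PARENT = {
            "printer-42": "room-301",
            "room-301": "floor-3",
            "floor-3": "building-A",
            "building-A": None,
        }

        def is_within(loc, region):
            """True if `region` appears on loc's ancestor chain (reflexive)."""
            while loc is not None:
                if loc == region:
                    return True
                loc = PARENT.get(loc)
            return False

        print(is_within("printer-42", "building-A"))  # True
        print(is_within("room-301", "printer-42"))    # False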

  7. The propagation of inventory-based positional errors into statistical landslide susceptibility models

    NASA Astrophysics Data System (ADS)

    Steger, Stefan; Brenning, Alexander; Bell, Rainer; Glade, Thomas

    2016-12-01

    There is unanimous agreement that a precise spatial representation of past landslide occurrences is a prerequisite to produce high quality statistical landslide susceptibility models. Even though perfectly accurate landslide inventories rarely exist, investigations of how landslide inventory-based errors propagate into subsequent statistical landslide susceptibility models are scarce. The main objective of this research was to systematically examine whether and how inventory-based positional inaccuracies of different magnitudes influence modelled relationships, validation results, variable importance and the visual appearance of landslide susceptibility maps. The study was conducted for a landslide-prone site located in the districts of Amstetten and Waidhofen an der Ybbs, eastern Austria, where an earth-slide point inventory was available. The methodological approach comprised an artificial introduction of inventory-based positional errors into the present landslide data set and an in-depth evaluation of subsequent modelling results. Positional errors were introduced by artificially changing the original landslide position by a mean distance of 5, 10, 20, 50 and 120 m. The resulting differently precise response variables were separately used to train logistic regression models. Odds ratios of predictor variables provided insights into modelled relationships. Cross-validation and spatial cross-validation enabled an assessment of predictive performances and permutation-based variable importance. All analyses were additionally carried out with synthetically generated data sets to further verify the findings under rather controlled conditions. The results revealed that an increasing positional inventory-based error was generally related to increasing distortions of modelling and validation results. However, the findings also highlighted that interdependencies between inventory-based spatial inaccuracies and statistical landslide susceptibility models are complex. The systematic comparisons of 12 models provided valuable evidence that the respective error-propagation was not only determined by the degree of positional inaccuracy inherent in the landslide data, but also by the spatial representation of landslides and the environment, landslide magnitude, the characteristics of the study area, the selected classification method and an interplay of predictors within multiple variable models. Based on the results, we deduced that a direct propagation of minor to moderate inventory-based positional errors into modelling results can be partly counteracted by adapting the modelling design (e.g. generalization of input data, opting for strongly generalizing classifiers). Since positional errors within landslide inventories are common and subsequent modelling and validation results are likely to be distorted, the potential existence of inventory-based positional inaccuracies should always be considered when assessing landslide susceptibility by means of empirical models.
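
    The core experiment, artificially displacing inventory points and retraining, can be emulated in a few lines. Everything below is synthetic: a fake susceptibility surface driven by one predictor, random jitter of the "landslide" coordinates, and a logistic regression scored by apparent AUC on the training data.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(42)

        def slope(x, y):
            # Synthetic smooth terrain attribute standing in for slope.
            return np.sin(x / 50.0) + np.cos(y / 70.0)

        xy = rng.uniform(0, 500, size=(3000, 2))
        p_slide = 1 / (1 + np.exp(-(2.0 * slope(xy[:, 0], xy[:, 1]) - 0.5)))
        label = rng.uniform(size=3000) < p_slide

        for err_m in (0, 5, 20, 120):
            # Displace only the landslide points by random offsets of ~err_m.
            jitter = rng.normal(0, err_m, size=xy.shape)
            xy_err = np.where(label[:, None], xy + jitter, xy)
            X = slope(xy_err[:, 0], xy_err[:, 1]).reshape(-1, 1)
            scores = LogisticRegression().fit(X, label).predict_proba(X)[:, 1]
            print(f"positional error {err_m:>3} m -> "
                  f"apparent AUC {roc_auc_score(label, scores):.3f}")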

  8. Parameter estimation for lithium ion batteries

    NASA Astrophysics Data System (ADS)

    Santhanagopalan, Shriram

    With an increase in the demand for lithium-based batteries at the rate of about 7% per year, the amount of effort put into improving the performance of these batteries from both experimental and theoretical perspectives is increasing. There exist a number of mathematical models ranging from simple empirical models to complicated physics-based models to describe the processes leading to failure of these cells. The literature is also rife with experimental studies that characterize the various properties of the system in an attempt to improve the performance of lithium ion cells. However, very little has been done to quantify the experimental observations and relate these results to the existing mathematical models. In fact, the best of the physics-based models in the literature show as much as 20% discrepancy when compared to experimental data. The reasons for such a big difference include, but are not limited to, numerical complexities involved in extracting parameters from experimental data and inconsistencies in interpreting directly measured values for the parameters. In this work, an attempt has been made to implement simplified models to extract parameter values that accurately characterize the performance of lithium ion cells. The validity of these models under a variety of experimental conditions is verified using a model discrimination procedure. Transport and kinetic properties are estimated using a non-linear estimation procedure. The initial state of charge inside each electrode is also maintained as an unknown parameter, since this value plays a significant role in accurately matching experimental charge/discharge curves with model predictions and is not readily known from experimental data. The second part of the dissertation focuses on parameters that change rapidly with time. For example, in the case of lithium ion batteries used in Hybrid Electric Vehicle (HEV) applications, the prediction of the State of Charge (SOC) of the cell under a variety of road conditions is important. An algorithm to predict the SOC in time intervals as small as 5 ms is in critical demand. In such cases, the conventional non-linear estimation procedure is not time-effective. There exist methodologies in the literature, such as those based on fuzzy logic; however, these techniques require a lot of computational storage space. Consequently, it is not possible to implement such techniques on a micro-chip for integration as a part of a real-time device. The Extended Kalman Filter (EKF) based approach presented in this work is a first step towards developing an efficient method to predict the State of Charge of a lithium ion cell online, based on an electrochemical model. The final part of the dissertation focuses on incorporating uncertainty in parameter values into electrochemical models using the polynomial chaos theory (PCT).
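
    The EKF idea for online SOC tracking can be reduced to a one-state sketch: SOC propagates by Coulomb counting, and a measured terminal voltage corrects the estimate through a linearized open-circuit-voltage curve. The OCV curve, capacity, and noise levels below are invented placeholders, not parameters from this dissertation, and a real electrochemical model would carry many more states.

        import numpy as np

        rng = np.random.default_rng(3)

        # Hypothetical monotone OCV(SOC) curve; its slope is the
        # measurement Jacobian H. A real cell would use a lookup table.
        def ocv(soc):
            return 3.0 + 1.2 * soc

        Q_CAP = 3600.0      # assumed capacity: 1 Ah in ampere-seconds
        Q, R = 1e-7, 1e-3   # assumed process / measurement noise variances

        soc_est, P = 0.5, 0.01       # deliberately wrong initial estimate
        true_soc = 0.8
        dt, current = 1.0, -1.0      # 1 A discharge, 1 s time steps

        for _ in range(200):
            # Simulated truth and noisy terminal-voltage measurement.
            true_soc += current * dt / Q_CAP
            z = ocv(true_soc) + rng.normal(0.0, R ** 0.5)

            # Predict: Coulomb counting, f(x) = x + I*dt/Q_cap (Jacobian = 1).
            soc_est += current * dt / Q_CAP
            P += Q
            # Update: linearized measurement model, H = d(OCV)/d(SOC).
            H = 1.2
            K = P * H / (H * P * H + R)
            soc_est += K * (z - ocv(soc_est))
            P *= (1.0 - K * H)

        print(f"true SOC {true_soc:.3f}, EKF estimate {soc_est:.3f}")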

  9. COSMOS: accurate detection of somatic structural variations through asymmetric comparison between tumor and normal samples.

    PubMed

    Yamagata, Koichi; Yamanishi, Ayako; Kokubu, Chikara; Takeda, Junji; Sese, Jun

    2016-05-05

    An important challenge in cancer genomics is precise detection of structural variations (SVs) by high-throughput short-read sequencing, which is hampered by the high false discovery rates of existing analysis tools. Here, we propose an accurate SV detection method named COSMOS, which compares the statistics of the mapped read pairs in tumor samples with isogenic normal control samples in a distinct asymmetric manner. COSMOS also prioritizes the candidate SVs using strand-specific read-depth information. Performance tests on modeled tumor genomes revealed that COSMOS outperformed existing methods in terms of F-measure. We also applied COSMOS to an experimental mouse cell-based model, in which SVs were induced by genome engineering and gamma-ray irradiation, followed by polymerase chain reaction-based confirmation. The precision of COSMOS was 84.5%, while the next best existing method was 70.4%. Moreover, the sensitivity of COSMOS was the highest, indicating that COSMOS has great potential for cancer genome analysis. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  10. Modular Architecture for Integrated Model-Based Decision Support.

    PubMed

    Gaebel, Jan; Schreiber, Erik; Oeser, Alexander; Oeltze-Jafra, Steffen

    2018-01-01

    Model-based decision support systems promise to be a valuable addition to oncological treatments and the implementation of personalized therapies. For the integration and sharing of decision models, the involved systems must be able to communicate with each other. In this paper, we propose a modularized architecture of dedicated systems for the integration of probabilistic decision models into existing hospital environments. These systems interconnect via web services and provide model sharing and processing capabilities for clinical information systems. Along the lines of IHE integration profiles from other disciplines and the meaningful reuse of routinely recorded patient data, our approach aims for the seamless integration of decision models into hospital infrastructure and the physicians' daily work.

  11. Monte Carlo based statistical power analysis for mediation models: methods and software.

    PubMed

    Zhang, Zhiyong

    2014-12-01

    The existing literature on statistical power analysis for mediation models often assumes data normality and is based on a less powerful Sobel test instead of the more powerful bootstrap test. This study proposes to estimate statistical power to detect mediation effects on the basis of the bootstrap method through Monte Carlo simulation. Nonnormal data with excessive skewness and kurtosis are allowed in the proposed method. A free R package called bmem is developed to conduct the power analysis discussed in this study. Four examples, including a simple mediation model, a multiple-mediator model with a latent mediator, a multiple-group mediation model, and a longitudinal mediation model, are provided to illustrate the proposed method.
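
    The proposed approach, estimating power as the proportion of Monte Carlo replications in which a bootstrap confidence interval for the indirect effect excludes zero, can be sketched directly. The bmem package itself is in R; the sketch below is a language-agnostic illustration of the simple mediation case with assumed path coefficients, and the replication counts are kept small for speed.

        import numpy as np

        rng = np.random.default_rng(7)
        a, b, n = 0.3, 0.3, 100      # assumed paths X->M, M->Y; sample size
        n_rep, n_boot = 100, 300     # Monte Carlo and bootstrap sizes

        hits = 0
        for _ in range(n_rep):
            x = rng.normal(size=n)
            m = a * x + rng.normal(size=n)
            y = b * m + rng.normal(size=n)
            # Percentile bootstrap of the indirect effect a_hat * b_hat.
            idx = rng.integers(0, n, size=(n_boot, n))
            ab = []
            for ii in idx:
                xs, ms, ys = x[ii], m[ii], y[ii]
                a_hat = np.cov(xs, ms)[0, 1] / np.var(xs, ddof=1)
                b_hat = np.cov(ms, ys)[0, 1] / np.var(ms, ddof=1)
                ab.append(a_hat * b_hat)
            lo, hi = np.percentile(ab, [2.5, 97.5])
            hits += (lo > 0) or (hi < 0)

        print("estimated power:", hits / n_rep)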

  12. An Event-driven, Value-based, Pull Systems Engineering Scheduling Approach

    DTIC Science & Technology

    2012-03-01

    Systems engineering in rapid response environments has been difficult, particularly where large, complex brownfield systems or systems of systems exist and are constantly being updated with both short- and long-term software enhancements.

  13. A Model-Driven Visualization Tool for Use with Model-Based Systems Engineering Projects

    NASA Technical Reports Server (NTRS)

    Trase, Kathryn; Fink, Eric

    2014-01-01

    Model-Based Systems Engineering (MBSE) promotes increased consistency between a system's design and its design documentation through the use of an object-oriented system model. The creation of this system model facilitates data presentation by providing a mechanism from which information can be extracted by automated manipulation of model content. Existing MBSE tools enable model creation, but are often too complex for the unfamiliar model viewer to easily use. These tools do not yet provide many opportunities for easing into the development and use of a system model when system design documentation already exists. This study creates a Systems Modeling Language (SysML) Document Traceability Framework (SDTF) for integrating design documentation with a system model, and develops an Interactive Visualization Engine for SysML Tools (InVEST) that exports consistent, clear, and concise views of SysML model data. These exported views are each meaningful to a variety of project stakeholders with differing subjects of concern and depth of technical involvement. InVEST allows a model user to generate multiple views and reports from a MBSE model, including wiki pages and interactive visualizations of data. System data can also be filtered to present only the information relevant to the particular stakeholder, resulting in a view that is both consistent with the larger system model and other model views. Viewing the relationships between system artifacts and documentation, and filtering through data to see specialized views, improves the value of the system as a whole, as data becomes information.

  14. The research on medical image classification algorithm based on PLSA-BOW model.

    PubMed

    Cao, C H; Cao, H L

    2016-04-29

    With the rapid development of modern medical imaging technology, medical image classification has become more important for medical diagnosis and treatment. To address the problem of polysemous words and synonyms, this study combines the bag-of-words model with PLSA (Probabilistic Latent Semantic Analysis) and proposes the PLSA-BOW (Probabilistic Latent Semantic Analysis-Bag of Words) model. In this paper we carry the bag-of-words model over from the text domain to the image domain and build a visual bag-of-words model. The method enables the accuracy of bag-of-words-based classification to be further improved. The experimental results show that the PLSA-BOW model for medical image classification can lead to a more accurate classification.

  15. Solving coupled groundwater flow systems using a Jacobian Free Newton Krylov method

    NASA Astrophysics Data System (ADS)

    Mehl, S.

    2012-12-01

    Jacobian Free Newton Krylov (JFNK) methods can have several advantages for simulating coupled groundwater flow processes versus conventional methods. Conventional methods are defined here as those based on an iterative coupling (rather than a direct coupling) and/or that use Picard iteration rather than Newton iteration. In an iterative coupling, the systems are solved separately, coupling information is updated and exchanged between the systems, and the systems are re-solved, etc., until convergence is achieved. Trusted simulators, such as Modflow, are based on these conventional methods of coupling and work well in many cases. An advantage of the JFNK method is that it only requires calculation of the residual vector of the system of equations and thus can make use of existing simulators regardless of how the equations are formulated. This opens the possibility of coupling different process models via augmentation of a residual vector by each separate process, which often requires substantially fewer changes to the existing source code than if the processes were directly coupled. However, appropriate perturbation sizes need to be determined for accurate approximations of the Frechet derivative, which is not always straightforward. Furthermore, preconditioning is necessary for reasonable convergence of the linear solution required at each Krylov iteration. Existing preconditioners can be used and applied separately to each process, which maximizes use of existing code and robust preconditioners. In this work, iteratively coupled parent-child local grid refinement models of groundwater flow and groundwater flow models with nonlinear exchanges to streams are used to demonstrate the utility of the JFNK approach for Modflow models. Use of incomplete Cholesky preconditioners with various levels of fill is examined on a suite of nonlinear and linear models to analyze the effect of the preconditioner. Comparisons of convergence and computer simulation time are made between conventional iteratively coupled methods based on Picard iteration and those formulated with JFNK, to gain insights on the types of nonlinearities and system features that make one approach advantageous. Results indicate that nonlinearities associated with stream/aquifer exchanges are more problematic than those resulting from unconfined flow.
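
    The mechanics, Newton iterations in which each Krylov solve needs only residual evaluations (the Jacobian-vector product being approximated by finite differences), are available off the shelf. A minimal sketch with SciPy on a toy nonlinear system standing in for an augmented coupled-flow residual; the toy residual is an assumption for illustration, not a groundwater model:

        import numpy as np
        from scipy.optimize import newton_krylov

        def residual(u):
            # Toy stand-in for an augmented residual vector of coupled
            # processes: the solver never sees a Jacobian, only residuals.
            r = np.empty_like(u)
            r[0] = u[0] - 1.0
            # Interior: a nonlinear balance with neighbor coupling.
            r[1:-1] = u[:-2] - 2.0 * u[1:-1] + u[2:] - 0.1 * u[1:-1] ** 2
            r[-1] = u[-1] - 2.0
            return r

        u0 = np.linspace(1.0, 2.0, 50)   # initial guess
        sol = newton_krylov(residual, u0, f_tol=1e-10)
        print("max residual:", np.abs(residual(sol)).max())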

  16. Integrating Technology into Classroom: The Learner-Centered Instructional Design

    ERIC Educational Resources Information Center

    Sezer, Baris; Karaoglan Yilmaz, Fatma Gizem; Yilmaz, Ramazan

    2013-01-01

    In this study, to present an instructional model by considering the existing models of instructional design (ARCS, ADDIE, ASSURE, Dick and Carey, Seels and Glasgow, Smith and Ragan etc.) with the nature of technology-based education and to reveal analysis, design, development, implementation, evaluation, and to revise levels with lower levels of…

  17. Wellness: An Alternative Paradigm for Violence Prevention.

    ERIC Educational Resources Information Center

    Makinson, Linda S.; Myers, Jane E.

    2003-01-01

    The authors address adolescent violence by promoting holistic health before symptoms occur and by using strength-based interventions to combat problems that already exist. A wellness model is presented, as is a discussion of research on the components of the model that are related to violence and violence prevention with adolescents. (Contains 49…

  18. The prediction of acoustical particle motion using an efficient polynomial curve fit procedure

    NASA Technical Reports Server (NTRS)

    Marshall, S. E.; Bernhard, R.

    1984-01-01

    A procedure is examined whereby the acoustic modal parameters, natural frequencies and mode shapes, in the cavities of transportation vehicles are determined experimentally. The acoustic mode shapes are described in terms of the particle motion. The acoustic modal analysis procedure is tailored to existing minicomputer-based spectral analysis systems.

  19. Research on Model of Student Engagement in Online Learning

    ERIC Educational Resources Information Center

    Peng, Wang

    2017-01-01

    In this study, online learning refers students under the guidance of teachers through the online learning platform for organized learning. Based on the analysis of related research results, considering the existing problems, the main contents of this paper include the following aspects: (1) Analyze and study the current student engagement model.…

  20. Classifying Correlation Matrices into Relatively Homogeneous Subgroups: A Cluster Analytic Approach

    ERIC Educational Resources Information Center

    Cheung, Mike W.-L.; Chan, Wai

    2005-01-01

    Researchers are becoming interested in combining meta-analytic techniques and structural equation modeling to test theoretical models from a pool of studies. Most existing procedures are based on the assumption that all correlation matrices are homogeneous. Few studies have addressed what the next step should be when studies being analyzed are…

  1. Applications of Location Similarity Measures and Conceptual Spaces to Event Coreference and Classification

    ERIC Educational Resources Information Center

    McConky, Katie Theresa

    2013-01-01

    This work covers topics in event coreference and event classification from spoken conversation. Event coreference is the process of identifying descriptions of the same event across sentences, documents, or structured databases. Existing event coreference work focuses on sentence similarity models or feature based similarity models requiring slot…

  2. A Conceptual Three-Dimensional Model for Evaluating Community-Based Substance Abuse Prevention Programs.

    ERIC Educational Resources Information Center

    Albers, Eric C.; Santangelo, Linda K.; McKinlay, George; Cavote, Steve; Rock, Stephen L.; Evans, William

    2002-01-01

    Presents a three-dimensional model for conceptualizing existing prevention programs, defining and measuring effects of prevention programs, and making a connection between those programmatic effects, and the interests of the funder. This paper describes the methodology and its use for promoting the efficiency and effectiveness of substance abuse…

  3. A Best-Practice Model for Academic Advising of University Biology Majors

    ERIC Educational Resources Information Center

    Heekin, Jonathan Ralph Calvin

    2013-01-01

    Biology faculty at an East Coast university believed their undergraduate students were not being well served by the existing academic advising program. The purpose of this mixed methods project study was to evaluate the effectiveness of the academic advising model in a biology department. Guided by system-based organizational theory, a learning…

  4. A Model for Designing Peer-Initiated Activities to Promote Racial Awareness and an Appreciation of Differences.

    ERIC Educational Resources Information Center

    Mann, Barbara A.; Moser, Rita M.

    1991-01-01

    Presents a theoretical framework suggesting ways to design peer intervention programs and group existing programs. Suggests criteria for effective racial awareness programs, discussing examples of successful college prejudice activities. Notes diversity education efforts are most successful when based on a theoretical model that recognizes the…

  5. More than Words: Towards a Development-Based Approach to Language Revitalization

    ERIC Educational Resources Information Center

    Henderson, Brent; Rohloff, Peter; Henderson, Robert

    2014-01-01

    Existing models for language revitalization focus almost exclusively on language learning and use. While recognizing the value of these models, we argue that their effective application is largely limited to situations in which languages have low numbers of speakers. For languages that are rapidly undergoing language shift, but which still…

  6. Pedagogical Catalysts of Civic Competence: The Development of a Critical Epistemological Model for Community-Based Learning

    ERIC Educational Resources Information Center

    Stokamer, Stephanie

    2013-01-01

    Democratic problem-solving necessitates an active and informed citizenry, but existing research on service-learning has shed little light on the relationship between pedagogical practices and civic competence outcomes. This study developed and tested a model to represent that relationship and identified pedagogical catalysts of civic competence…

  7. Weighted Least Squares Fitting Using Ordinary Least Squares Algorithms.

    ERIC Educational Resources Information Center

    Kiers, Henk A. L.

    1997-01-01

    A general approach for fitting a model to a data matrix by weighted least squares (WLS) is studied. The approach consists of iteratively performing steps of existing algorithms for ordinary least squares fitting of the same model and is based on maximizing a function that majorizes WLS loss function. (Author/SLD)

  8. Assessing the New Competencies for Resident Education: A Model from an Emergency Medicine Program.

    ERIC Educational Resources Information Center

    Reisdorff, Earl J.; Hayes, Oliver W.; Carlson, Dale J.; Walker, Gregory L.

    2001-01-01

    Based on the experience of Michigan State University's emergency medicine residency program, proposes a practical method for modifying an existing student evaluation format. The model provides a template other programs could use in assessing residents' acquisition of the knowledge, skills, and attitudes reflected in the six general competencies…

  9. SEASONAL NH3 EMISSION ESTIMATES FOR THE EASTERN UNITED STATES BASED ON AMMONIUM WET CONCENTRATIONS AND AN INVERSE MODELING METHOD

    EPA Science Inventory

    Significant uncertainty exists in the magnitude and variability of ammonia (NH3) emissions. NH3 emissions are needed as input for air quality modeling of aerosols and deposition of nitrogen compounds. Approximately 85% of NH3 emissions are estimated to come from agricultural ...

  10. Model-based evaluation of two BNR processes--UCT and A2N.

    PubMed

    Hao, X; Van Loosdrecht, M C; Meijer, S C; Qian, Y

    2001-08-01

    The activity of denitrifying P-accumulating bacteria (DPB) has been verified to exist in most WWTPs with biological nutrient removal (BNR). The modified UCT process has a high content of DPB. A new BNR process with a two-sludge system, named A2N, was especially developed to exploit denitrifying dephosphatation. With identical inflow and effluent standards, an existing full-scale UCT-type WWTP and a designed A2N process were evaluated by simulation. The model used is based on the Delft metabolic model for bio-P removal and the ASM2d model for COD and N removal. Both processes accommodate denitrifying dephosphatation, but the A2N process has a more stable performance in N removal. Although excess sludge is increased by 6%, the A2N process leads to savings of 35, 85 and 30% in aeration energy, mixed liquor internal recirculation and land occupation, respectively, as compared to the UCT process. Low temperature has a negative effect on the growth of poly-P bacteria, which is especially apparent in the A2N process.

  11. Informing Environmental Water Management Decisions: Using Conditional Probability Networks to Address the Information Needs of Planning and Implementation Cycles.

    PubMed

    Horne, Avril C; Szemis, Joanna M; Webb, J Angus; Kaur, Simranjit; Stewardson, Michael J; Bond, Nick; Nathan, Rory

    2018-03-01

    One important aspect of adaptive management is the clear and transparent documentation of hypotheses, together with the use of predictive models (complete with any assumptions) to test those hypotheses. Documentation of such models can improve the ability to learn from management decisions and supports dialog between stakeholders. A key challenge is how best to represent the existing scientific knowledge to support decision-making. Such challenges are currently emerging in the field of environmental water management in Australia, where managers are required to prioritize the delivery of environmental water on an annual basis, using a transparent and evidence-based decision framework. We argue that the development of models of ecological responses to environmental water use needs to support both the planning and implementation cycles of adaptive management. Here we demonstrate an approach based on the use of Conditional Probability Networks to translate existing ecological knowledge into quantitative models that include temporal dynamics to support adaptive environmental flow management. It equally extends to other applications where knowledge is incomplete, but decisions must still be made.

  12. Informing Environmental Water Management Decisions: Using Conditional Probability Networks to Address the Information Needs of Planning and Implementation Cycles

    NASA Astrophysics Data System (ADS)

    Horne, Avril C.; Szemis, Joanna M.; Webb, J. Angus; Kaur, Simranjit; Stewardson, Michael J.; Bond, Nick; Nathan, Rory

    2018-03-01

    One important aspect of adaptive management is the clear and transparent documentation of hypotheses, together with the use of predictive models (complete with any assumptions) to test those hypotheses. Documentation of such models can improve the ability to learn from management decisions and supports dialog between stakeholders. A key challenge is how best to represent the existing scientific knowledge to support decision-making. Such challenges are currently emerging in the field of environmental water management in Australia, where managers are required to prioritize the delivery of environmental water on an annual basis, using a transparent and evidence-based decision framework. We argue that the development of models of ecological responses to environmental water use needs to support both the planning and implementation cycles of adaptive management. Here we demonstrate an approach based on the use of Conditional Probability Networks to translate existing ecological knowledge into quantitative models that include temporal dynamics to support adaptive environmental flow management. It equally extends to other applications where knowledge is incomplete, but decisions must still be made.

  13. Meta-Theoretical Contributions to the Constitution of a Model-Based Didactics of Science

    NASA Astrophysics Data System (ADS)

    Ariza, Yefrin; Lorenzano, Pablo; Adúriz-Bravo, Agustín

    2016-10-01

    There is nowadays consensus in the community of didactics of science (i.e. science education understood as an academic discipline) regarding the need to include the philosophy of science in didactical research, science teacher education, curriculum design, and the practice of science education in all educational levels. Some authors have identified an ever-increasing use of the concept of `theoretical model', stemming from the so-called semantic view of scientific theories. However, it can be recognised that, in didactics of science, there are over-simplified transpositions of the idea of model (and of other meta-theoretical ideas). In this sense, contemporary philosophy of science is often blurred or distorted in the science education literature. In this paper, we address the discussion around some meta-theoretical concepts that are introduced into didactics of science due to their perceived educational value. We argue for the existence of a `semantic family', and we characterise four different versions of semantic views existing within the family. In particular, we seek to contribute to establishing a model-based didactics of science mainly supported by this semantic family.

  14. The role of simulation in neurosurgery.

    PubMed

    Rehder, Roberta; Abd-El-Barr, Muhammad; Hooten, Kristopher; Weinstock, Peter; Madsen, Joseph R; Cohen, Alan R

    2016-01-01

    In an era of residency duty-hour restrictions, there has been a recent effort to implement simulation-based training methods in neurosurgery teaching institutions. Several surgical simulators have been developed, ranging from physical models to sophisticated virtual reality systems. To date, there is a paucity of information describing the clinical benefits of existing simulators and the assessment strategies to help implement them into neurosurgical curricula. Here, based on a retrospective literature review, we present a systematic review of the current models of simulation and discuss the state of the art and future directions for simulation in neurosurgery. Multiple simulators have been developed for neurosurgical training, including those for minimally invasive procedures, vascular, skull base, pediatric, tumor resection, functional neurosurgery, and spine surgery. The pros and cons of existing systems are reviewed. Advances in imaging and computer technology have led to the development of different simulation models to complement traditional surgical training. Sophisticated virtual reality (VR) simulators with haptic feedback and impressive imaging technology have provided novel options for training in neurosurgery. Breakthrough training simulation using 3D printing technology holds promise for future simulation practice, providing high-fidelity patient-specific models to complement residency surgical learning.

  15. Prediction of Protein-Protein Interaction Sites with Machine-Learning-Based Data-Cleaning and Post-Filtering Procedures.

    PubMed

    Liu, Guang-Hui; Shen, Hong-Bin; Yu, Dong-Jun

    2016-04-01

    Accurately predicting protein-protein interaction sites (PPIs) is currently a hot topic because it has been demonstrated to be very useful for understanding disease mechanisms and designing drugs. Machine-learning-based computational approaches have been broadly utilized and demonstrated to be useful for PPI prediction. However, directly applying traditional machine learning algorithms, which often assume that samples in different classes are balanced, often leads to poor performance because of the severe class imbalance that exists in the PPI prediction problem. In this study, we propose a novel method for improving PPI prediction performance by relieving the severity of class imbalance using a data-cleaning procedure and reducing predicted false positives with a post-filtering procedure: First, a machine-learning-based data-cleaning procedure is applied to remove those marginal targets, which may potentially have a negative effect on training a model with a clear classification boundary, from the majority samples to relieve the severity of class imbalance in the original training dataset; then, a prediction model is trained on the cleaned dataset; finally, an effective post-filtering procedure is further used to reduce potential false positive predictions. Stringent cross-validation and independent validation tests on benchmark datasets demonstrated the efficacy of the proposed method, which exhibits highly competitive performance compared with existing state-of-the-art sequence-based PPIs predictors and should supplement existing PPI prediction methods.
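
    A minimal sketch of the two ideas, under illustrative assumptions (a random forest as the base learner and invented thresholds; the authors' actual pipeline details differ):

```python
# Hedged sketch of the data-cleaning + post-filtering idea for imbalanced
# PPI site prediction. The base learner and all thresholds are
# illustrative assumptions, not the authors' exact pipeline.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def clean_majority(X, y, margin=0.35, seed=0):
    """Remove 'marginal' majority-class (non-interface, y == 0) samples
    whose predicted interface probability is close to the boundary."""
    probe = RandomForestClassifier(n_estimators=100, random_state=seed)
    probe.fit(X, y)
    p_pos = probe.predict_proba(X)[:, 1]
    marginal = (y == 0) & (np.abs(p_pos - 0.5) < margin)
    keep = ~marginal
    return X[keep], y[keep]

def post_filter(pred, min_run=2):
    """Suppress isolated positive predictions along the sequence: an
    interface patch is assumed to span at least `min_run` residues."""
    pred = pred.copy()
    run = 0
    for i, p in enumerate(pred):
        run = run + 1 if p else 0
        if p and run < min_run and (i + 1 == len(pred) or not pred[i + 1]):
            pred[i - run + 1 : i + 1] = 0
    return pred

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 8))
y = (rng.random(400) < 0.1).astype(int)      # ~10% interface residues
Xc, yc = clean_majority(X, y)
print(len(y), "->", len(yc), "training samples after cleaning")
print(post_filter(np.array([0, 1, 0, 1, 1, 0])))
```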

  16. Enlarged leukocyte referent libraries can explain additional variance in blood-based epigenome-wide association studies.

    PubMed

    Kim, Stephanie; Eliot, Melissa; Koestler, Devin C; Houseman, Eugene A; Wetmur, James G; Wiencke, John K; Kelsey, Karl T

    2016-09-01

    We examined whether variation in blood-based epigenome-wide association studies could be more completely explained by augmenting existing reference DNA methylation libraries. We compared existing and enhanced libraries in predicting variability in three publicly available 450K methylation datasets that collected whole-blood samples. Models were fit separately to each CpG site and used to estimate the additional variability when adjustments for cell composition were made with each library. Calculation of the mean difference in the CpG-specific residual sums of squares error between models for an arthritis, aging and metabolic syndrome dataset indicated that an enhanced library explained significantly more variation across all three datasets (p < 10^-3). Pathologically important immune cell subtypes can explain important variability in epigenome-wide association studies done in blood.

  17. Land-use threats and protected areas: a scenario-based, landscape level approach

    USGS Publications Warehouse

    Wilson, Tamara S.; Sleeter, Benjamin M.; Sleeter, Rachel R.; Soulard, Christopher E.

    2014-01-01

    Anthropogenic land use will likely present a greater challenge to biodiversity than climate change this century in the Pacific Northwest, USA. Even if species are equipped with the adaptive capacity to migrate in the face of a changing climate, they will likely encounter a human-dominated landscape as a major dispersal obstacle. Our goal was to identify, at the ecoregion-level, protected areas in close proximity to lands with a higher likelihood of future land-use conversion. Using a state-and-transition simulation model, we modeled spatially explicit (1 km²) land use from 2000 to 2100 under seven alternative land-use and emission scenarios for ecoregions in the Pacific Northwest. We analyzed scenario-based land-use conversion threats from logging, agriculture, and development near existing protected areas. A conversion threat index (CTI) was created to identify ecoregions with highest projected land-use conversion potential within closest proximity to existing protected areas. Our analysis indicated nearly 22% of land area in the Coast Range, over 16% of land area in the Puget Lowland, and nearly 11% of the Cascades had very high CTI values. Broader regional-scale land-use change is projected to impact nearly 40% of the Coast Range, 30% of the Puget Lowland, and 24% of the Cascades (i.e., two highest CTI classes). A landscape level, scenario-based approach to modeling future land use helps identify ecoregions with existing protected areas at greater risk from regional land-use threats and can help prioritize future conservation efforts.
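
    As an illustration of the kind of index involved, the sketch below weights a per-cell conversion probability by proximity to protected land. The weighting scheme is an assumption for illustration only, not the authors' published CTI formula.

```python
# Illustrative conversion-threat-style index: projected land-use
# conversion probability weighted by proximity to protected areas.
# The proximity weighting is an invented placeholder.
import numpy as np

def cti(p_convert, dist_km, max_dist_km=10.0):
    """p_convert: per-cell probability of conversion by 2100.
    dist_km: per-cell distance to the nearest protected area."""
    proximity = np.clip(1.0 - dist_km / max_dist_km, 0.0, None)
    return p_convert * proximity

grid_p = np.array([[0.8, 0.2], [0.5, 0.9]])   # conversion probabilities
grid_d = np.array([[1.0, 12.0], [4.0, 0.5]])  # km to protected area
print(cti(grid_p, grid_d))                    # high where both align
```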

  18. Initial state with shear in peripheral heavy ion collisions

    NASA Astrophysics Data System (ADS)

    Magas, V. K.; Gordillo, J.; Strottman, D.; Xie, Y. L.; Csernai, L. P.

    2018-06-01

    In the present work we propose a new way of constructing the initial state for further hydrodynamic simulation of relativistic heavy ion collisions, based on a Bjorken-like solution applied streak by streak in the transverse plane. Previous fluid dynamical calculations in Cartesian coordinates with an initial state based on a streak-by-streak Yang-Mills field led, for peripheral higher-energy collisions, to large angular momentum, initial shear flow and significant local vorticity. Recent experiments verified the existence of this vorticity via the resulting polarization of emitted Λ and Λ̄ particles. At the same time, parton cascade models indicated the existence of more compact initial state configurations, which we are going to simulate in our approach. The proposed model satisfies all the conservation laws, including conservation of the strong initial angular momentum that is present in noncentral collisions. As a consequence of this large initial angular momentum we observe the rotation of the whole system as well as fluid shear in the initial state, which leads to large flow vorticity. Another advantage of the proposed model is that the initial state can be given in both [t, x, y, z] and [τ, x, y, η] coordinates and thus can be tested by all 3+1D hydrodynamical codes which exist in the field.

  19. Competitive Exclusion and Coexistence of Pathogens in a Homosexually-Transmitted Disease Model

    PubMed Central

    Chai, Caichun; Jiang, Jifa

    2011-01-01

    A sexually-transmitted disease model for two strains of pathogen in a one-sex, heterogeneously-mixing population was studied completely by Jiang and Chai in (J Math Biol 56:373–390, 2008). In this paper, we give an analysis for an SIS STD with two competing strains, where the population is divided into three groups based on susceptibility to the two distinct pathogenic strains. We investigate the existence and stability of the boundary equilibria that characterize competitive exclusion of the two competing strains; we also investigate the existence and stability of the positive coexistence equilibrium, which characterizes the possibility of coexistence of the two strains. We obtain sufficient and necessary conditions for the existence and global stability of these equilibria under some assumptions. We verify that there is a strong connection between the stability of the boundary equilibria and the existence of the coexistence equilibrium: there exists a unique coexistence equilibrium if and only if the boundary equilibria both exist and have the same stability, and the coexistence equilibrium is globally stable or unstable if and only if the two boundary equilibria are both unstable or both stable. PMID:21347222
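
    For orientation, a single-group two-strain SIS skeleton (the paper's model distinguishes three susceptibility groups, so this reduction is only illustrative; symbols are generic) reads:

```latex
\begin{aligned}
\dot{S}   &= \mu - \mu S + \gamma_1 I_1 + \gamma_2 I_2
             - (\beta_1 I_1 + \beta_2 I_2)\,S,\\
\dot{I}_i &= \beta_i S I_i - (\mu + \gamma_i) I_i, \qquad
R_i = \frac{\beta_i}{\mu + \gamma_i}, \quad i = 1, 2.
\end{aligned}
```

    Each boundary equilibrium (one strain absent) exists when the corresponding $R_i > 1$; in this reduced skeleton coexistence is non-generic (it requires $R_1 = R_2$), which is precisely why the heterogeneous susceptibility groups of the full model matter for the coexistence results.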

  20. CELDA – an ontology for the comprehensive representation of cells in complex systems

    PubMed Central

    2013-01-01

    Background: The need for detailed description and modeling of cells drives the continuous generation of large and diverse datasets. Unfortunately, there exists no systematic and comprehensive way to organize these datasets and their information. CELDA (Cell: Expression, Localization, Development, Anatomy) is a novel ontology for associating primary experimental data and derived knowledge with various types of cells of organisms. Results: CELDA is a structure that can help to categorize cell types based on species, anatomical localization, subcellular structures, developmental stages and origin. It targets cells in vitro as well as in vivo. Instead of developing a novel ontology from scratch, we carefully designed CELDA in such a way that existing ontologies were integrated as much as possible, and only minimal extensions were performed to cover those classes and areas not present in any existing model. Currently, ten existing ontologies and models are linked to CELDA through the top-level ontology BioTop. Together with 15,439 newly created classes, CELDA contains more than 196,000 classes and 233,670 relationship axioms. CELDA is primarily used as a representational framework for modeling, analyzing and comparing cells within and across species in CellFinder, a web-based data repository on cells (http://cellfinder.org). Conclusions: CELDA can semantically link diverse types of information about cell types. It has been integrated within the research platform CellFinder, where it exemplarily relates cell types from liver and kidney during development on the one hand and anatomical locations in humans on the other, integrating information on all spatial and temporal stages. CELDA is available from the CellFinder website: http://cellfinder.org/about/ontology. PMID:23865855

  1. Diagnostic evaluation of the Community Earth System Model in simulating mineral dust emission with insight into large-scale dust storm mobilization in the Middle East and North Africa (MENA)

    NASA Astrophysics Data System (ADS)

    Parajuli, Sagar Prasad; Yang, Zong-Liang; Lawrence, David M.

    2016-06-01

    Large amounts of mineral dust are injected into the atmosphere during dust storms, which are common in the Middle East and North Africa (MENA) where most of the global dust hotspots are located. In this work, we present simulations of dust emission using the Community Earth System Model Version 1.2.2 (CESM 1.2.2) and evaluate how well it captures the spatio-temporal characteristics of dust emission in the MENA region, with a focus on large-scale dust storm mobilization. We explicitly focus our analysis on the model's two major input parameters that affect the vertical mass flux of dust: surface winds and the soil erodibility factor. We analyze dust emissions in simulations with both prognostic CESM winds and with CESM winds that are nudged towards ERA-Interim reanalysis values. Simulations with three existing erodibility maps and a new observation-based erodibility map are also conducted. We compare the simulated results with MODIS satellite data, MACC reanalysis data, AERONET station data, and CALIPSO 3D aerosol profile data. The dust emission simulated by CESM, when driven by nudged reanalysis winds, compares reasonably well with observations on daily to monthly time scales, despite CESM being a global General Circulation Model. However, considerable bias exists around known high dust source locations in northwest/northeast Africa and over the Arabian Peninsula, where recurring large-scale dust storms are common. The new observation-based erodibility map, which can represent anthropogenic dust sources that are not directly represented by existing erodibility maps, shows improved performance in terms of the simulated dust optical depth (DOD) and aerosol optical depth (AOD) compared to existing erodibility maps, although the performance of different erodibility maps varies by region.
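
    For orientation, saltation-based emission schemes of the kind used in global models tie the vertical dust flux to exactly these two inputs; a schematic form (illustrative, not the precise CESM expression) is:

```latex
F_d \;\propto\; S \, U^4 \left(1 - \frac{U_t}{U}\right), \qquad U > U_t,
```

    where $S$ is the soil erodibility factor, $U$ the surface wind speed, and $U_t$ an emission threshold. The steep wind dependence is why nudging winds toward reanalysis changes the simulated emission so strongly.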

  2. Physics of Accretion in X-Ray Binaries

    NASA Technical Reports Server (NTRS)

    Vrtilek, Saeqa D.

    2004-01-01

    This project consists of several related investigations directed to the study of mass transfer processes in X-ray binaries. Models developed over several years incorporating highly detailed physics will be tested on a balanced mix of existing data and planned observations with both ground and space-based observatories. The extended time coverage of the observations and the existence of simultaneous X-ray, ultraviolet, and optical observations will be particularly beneficial for studying the accretion flows. These investigations, which take as detailed a look at the accretion process in X-ray binaries as is now possible, test current models to their limits, and force us to extend them. We now have the ability to do simultaneous ultraviolet/X-ray/optical spectroscopy with HST, Chandra, XMM, and ground-based observatories. The rich spectroscopy that these observations give us must be interpreted principally by reference to detailed models, the development of which is already well underway; tests of these essential interpretive tools are an important product of the proposed investigations.

  3. The Physics of Accretion in X-Ray Binaries

    NASA Technical Reports Server (NTRS)

    Vrtilek, S.; Oliversen, Ronald (Technical Monitor)

    2001-01-01

    This project consists of several related investigations directed to the study of mass transfer processes in X-ray binaries. Models developed over several years incorporating highly detailed physics will be tested on a balanced mix of existing data and planned observations with both ground and space-based observatories. The extended time coverage of the observations and the existence of simultaneous X-ray, ultraviolet, and optical observations will be particularly beneficial for studying the accretion flows. These investigations, which take as detailed a look at the accretion process in X-ray binaries as is now possible, test current models to their limits, and force us to extend them. We now have the ability to do simultaneous ultraviolet/X-ray/optical spectroscopy with HST, Chandra, XMM, and ground-based observatories. The rich spectroscopy that these observations give us must be interpreted principally by reference to detailed models, the development of which is already well underway; tests of these essential interpretive tools are an important product of the proposed investigations.

  4. Artificial neural networks in mammography interpretation and diagnostic decision making.

    PubMed

    Ayer, Turgay; Chen, Qiushi; Burnside, Elizabeth S

    2013-01-01

    Screening mammography is the most effective means for early detection of breast cancer. Although general rules for discriminating malignant and benign lesions exist, radiologists are unable to perfectly detect and classify all lesions as malignant and benign, for many reasons which include, but are not limited to, overlap of features that distinguish malignancy, difficulty in estimating disease risk, and variability in recommended management. When predictive variables are numerous and interact, ad hoc decision making strategies based on experience and memory may lead to systematic errors and variability in practice. The integration of computer models to help radiologists increase the accuracy of mammography examinations in diagnostic decision making has gained increasing attention in the last two decades. In this study, we provide an overview of one of the most commonly used models, artificial neural networks (ANNs), in mammography interpretation and diagnostic decision making and discuss important features in mammography interpretation. We conclude by discussing several common limitations of existing research on ANN-based detection and diagnostic models and provide possible future research directions.
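
    A minimal sketch of the kind of model discussed: a small feed-forward ANN mapping lesion descriptors to a malignancy probability. The features and data below are synthetic placeholders; real systems use BI-RADS descriptors and clinically validated training sets.

```python
# Hedged sketch: a small feed-forward ANN that maps mammographic
# descriptors to a probability of malignancy. All data are synthetic.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
# Columns: mass margin spiculation, calcification score, patient age
# (all scaled to [0, 1] as illustrative stand-ins).
X = rng.random((500, 3))
# Synthetic rule standing in for ground-truth biopsy outcomes.
y = (0.6 * X[:, 0] + 0.3 * X[:, 1] + 0.1 * X[:, 2]
     + 0.1 * rng.standard_normal(500)) > 0.55

clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                    random_state=0).fit(X, y)
print(clf.predict_proba(X[:3])[:, 1])  # malignancy risk estimates
```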

  5. Treatment of Adolescent Substance Use Disorders and Co-Occurring Internalizing Disorders: A Critical Review and Proposed Model.

    PubMed

    Hulvershorn, Leslie A; Quinn, Patrick D; Scott, Eric L

    2015-01-01

    The past several decades have seen dramatic growth in empirically supported treatments for adolescent substance use disorders (SUDs), yet even the most well-established approaches struggle to produce large or long-lasting improvements. These difficulties may stem, in part, from the high rates of comorbidity between SUDs and other psychiatric disorders. We critically reviewed the treatment outcome literature for adolescents with co-occurring SUDs and internalizing disorders. Our review identified components of existing treatments that might be included in an integrated, evidence-based approach to the treatment of SUDs and internalizing disorders. An effective program may involve careful assessment, inclusion of parents or guardians, and tailoring of interventions via a modular strategy. The existing literature guides the development of a conceptual evidence-based, modular treatment model targeting adolescents with co-occurring internalizing and SUDs. With empirical study, such a model may better address treatment outcomes for both disorder types in adolescents.

  6. Treatment of Adolescent Substance Use Disorders and Co-Occurring Internalizing Disorders: A Critical Review and Proposed Model

    PubMed Central

    Hulvershorn, Leslie A.; Quinn, Patrick D.; Scott, Eric L.

    2016-01-01

    Background: The past several decades have seen dramatic growth in empirically supported treatments for adolescent substance use disorders (SUDs), yet even the most well-established approaches struggle to produce large or long-lasting improvements. These difficulties may stem, in part, from the high rates of comorbidity between SUDs and other psychiatric disorders. Method: We critically reviewed the treatment outcome literature for adolescents with co-occurring SUDs and internalizing disorders. Results: Our review identified components of existing treatments that might be included in an integrated, evidence-based approach to the treatment of SUDs and internalizing disorders. An effective program may involve careful assessment, inclusion of parents or guardians, and tailoring of interventions via a modular strategy. Conclusions: The existing literature guides the development of a conceptual evidence-based, modular treatment model targeting adolescents with co-occurring internalizing and SUDs. With empirical study, such a model may better address treatment outcomes for both disorder types in adolescents. PMID:25973718

  7. Navy Sea Ice Prediction Systems

    DTIC Science & Technology

    2002-01-01

    [Figure caption fragment: trajectories for the IABP drifting buoys (red), the model (green), and the model with assimilation (black).] ... ice extent and/or ice thickness. A general trend ... most often based on a combination of models and data. Modeling sea ice can be a difficult problem, as it exists in many different forms (Figure 1).

  8. The importance of explicitly mapping instructional analogies in science education

    NASA Astrophysics Data System (ADS)

    Asay, Loretta Johnson

    Analogies are ubiquitous during instruction in science classrooms, yet research about the effectiveness of using analogies has produced mixed results. An aspect seldom studied is a model of instruction when using analogies. The few existing models for instruction with analogies have not often been examined quantitatively. The Teaching With Analogies (TWA) model (Glynn, 1991) is one of the models frequently cited in the variety of research about analogies. The TWA model outlines steps for instruction, including the step of explicitly mapping the features of the source to the target. An experimental study was conducted to examine the effects of explicitly mapping the features of the source and target in an analogy during computer-based instruction about electrical circuits. Explicit mapping was compared to no mapping and to a control with no analogy. Participants were ninth- and tenth-grade biology students who were each randomly assigned to one of three conditions (no analogy module, analogy module, or explicitly mapped analogy module) for computer-based instruction. Subjects took a pre-test before the instruction, which was used to assign them to a level of previous knowledge about electrical circuits for analysis of any differential effects. After the instruction modules, students took a post-test about electrical circuits. Two weeks later, they took a delayed post-test. No advantage was found for explicitly mapping the analogy. Learning patterns were the same, regardless of the type of instruction. Those who knew the least about electrical circuits, based on the pre-test, made the most gains. After the two-week delay, this group maintained the largest amount of their gain. Implications exist for science education classrooms, as analogy use should be based on research about effective practices. Further studies are suggested to foster the building of research-based models for classroom instruction with analogies.

  9. Convex Formulations of Learning from Crowds

    NASA Astrophysics Data System (ADS)

    Kajino, Hiroshi; Kashima, Hisashi

    It has attracted considerable attention to use crowdsourcing services to collect a large amount of labeled data for machine learning, since crowdsourcing services allow one to ask the general public to label data at very low cost through the Internet. The use of crowdsourcing has introduced a new challenge in machine learning: coping with the low quality of crowd-generated data. There have been many recent attempts to address the quality problem of multiple labelers; however, two serious drawbacks remain in the existing approaches, namely (i) non-convexity and (ii) task homogeneity. Most of the existing methods consider true labels as latent variables, which results in non-convex optimization problems. Also, the existing models assume only single homogeneous tasks, while in realistic situations clients can offer multiple tasks to crowds and crowd workers can work on different tasks in parallel. In this paper, we propose a convex optimization formulation of learning from crowds by introducing personal models of individual crowd workers without estimating true labels. We further extend the proposed model to multi-task learning, based on the resemblance between the proposed formulation and that for an existing multi-task learning model. We also devise efficient iterative methods for solving the convex optimization problems by exploiting conditional independence structures in multiple classifiers.
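
    A minimal sketch of a convex "personal classifier" formulation in this spirit: worker j's labels are fit with w_j = w0 + v_j, and the regularizer ties workers to a shared model w0, so the joint objective is convex and no latent true labels or EM steps are needed. Losses, regularization weights, and data are illustrative assumptions, not the paper's exact formulation.

```python
# Hedged sketch of a convex learning-from-crowds objective with
# per-worker "personal" classifiers tied to a shared model.
import numpy as np
from scipy.optimize import minimize

def fit_crowd(X, workers, labels, n_workers, lam=1.0, gamma=1.0):
    d = X.shape[1]

    def loss(theta):
        w0, V = theta[:d], theta[d:].reshape(n_workers, d)
        # Logistic loss of each worker's label under w0 + v_j: convex.
        z = labels * np.sum(X * (w0 + V[workers]), axis=1)
        logistic = np.sum(np.logaddexp(0.0, -z))
        return logistic + lam * w0 @ w0 + gamma * np.sum(V * V)

    theta0 = np.zeros(d * (1 + n_workers))
    res = minimize(loss, theta0, method="L-BFGS-B")
    return res.x[:d]  # shared classifier for unseen data

# Toy usage: 2 workers labeling 4 points (labels in {-1, +1}).
X = np.array([[1.0, 0.0], [0.9, 0.1], [0.0, 1.0], [0.1, 0.9]])
w0 = fit_crowd(X, np.array([0, 0, 1, 1]),
               np.array([1, 1, -1, -1]), n_workers=2)
print(X @ w0)   # positive scores for the first class, negative after
```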

  10. Using Video Self-Modeling via iPads to Increase Academic Responding of an Adolescent with Autism Spectrum Disorder and Intellectual Disability

    ERIC Educational Resources Information Center

    Hart, Juliet E.; Whalon, Kelly J.

    2012-01-01

    Recent investigations on effective interventions for students with autism spectrum disorder (ASD) have focused on video modeling (VM) and video self-modeling (VSM) to teach a variety of skills. While a considerable literature base exists on VM/VSM to address the social communication, functional, vocational, and behavioral needs of this student…

  11. Continuing Development of a Hybrid Model (VSH) of the Neutral Thermosphere

    NASA Technical Reports Server (NTRS)

    Burns, Alan

    1996-01-01

    We propose to continue the development of a new operational model of neutral thermospheric density, composition, temperatures and winds to improve current engineering environment definitions of the neutral thermosphere. This model will be based on simulations made with the National Center for Atmospheric Research (NCAR) Thermosphere-Ionosphere-Electrodynamics General Circulation Model (TIEGCM) and on empirical data. It will be capable of using real-time geophysical indices or data from ground-based and satellite inputs, and will provide neutral variables at specified locations and times. This "hybrid" model will be based on a Vector Spherical Harmonic (VSH) analysis technique, developed over the last 8 years at the University of Michigan, that permits the incorporation of the TIEGCM outputs and data into the model. The VSH model will be a more accurate version of existing models of the neutral thermosphere, and will thus improve density specification for satellites flying in low Earth orbit (LEO).

  12. Community-Based Participatory Research Conceptual Model: Community Partner Consultation and Face Validity.

    PubMed

    Belone, Lorenda; Lucero, Julie E; Duran, Bonnie; Tafoya, Greg; Baker, Elizabeth A; Chan, Domin; Chang, Charlotte; Greene-Moton, Ella; Kelley, Michele A; Wallerstein, Nina

    2016-01-01

    A national community-based participatory research (CBPR) team developed a conceptual model of CBPR partnerships to understand the contribution of partnership processes to improved community capacity and health outcomes. With the model primarily developed through academic literature and expert consensus building, we sought community input to assess face validity and acceptability. Our research team conducted semi-structured focus groups with six partnerships nationwide. Participants validated and expanded on existing model constructs and identified new constructs based on "real-world" praxis, resulting in a revised model. Four cross-cutting constructs were identified: trust development, capacity, mutual learning, and power dynamics. By empirically testing the model, we found community face validity and capacity to adapt the model to diverse contexts. We recommend partnerships use and adapt the CBPR model and its constructs, for collective reflection and evaluation, to enhance their partnering practices and achieve their health and research goals. © The Author(s) 2014.

  13. Cross-Species Extrapolation of Uptake and Disposition of Neutral Organic Chemicals in Fish Using a Multispecies Physiologically-Based Toxicokinetic Model Framework.

    PubMed

    Brinkmann, Markus; Schlechtriem, Christian; Reininghaus, Mathias; Eichbaum, Kathrin; Buchinger, Sebastian; Reifferscheid, Georg; Hollert, Henner; Preuss, Thomas G

    2016-02-16

    The potential to bioconcentrate is generally considered to be an unwanted property of a substance. Consequently, chemical legislation, including the European REACH regulations, requires the chemical industry to provide bioconcentration data for chemicals that are produced or imported at volumes exceeding 100 tons per annum, or if there is a concern that a substance is persistent, bioaccumulative, and toxic. To fill the existing data gap for chemicals produced or imported at volumes below this threshold, without the need for additional animal experiments, physiologically-based toxicokinetic (PBTK) models can be used to predict whole-body and tissue concentrations of neutral organic chemicals in fish. PBTK models have been developed for many different fish species with promising results. In this study, we developed PBTK models for zebrafish (Danio rerio) and roach (Rutilus rutilus) and combined them with existing models for rainbow trout (Onchorhynchus mykiss), lake trout (Salvelinus namaycush), and fathead minnow (Pimephales promelas). The resulting multispecies model framework allows for cross-species extrapolation of the bioaccumulative potential of neutral organic compounds. Predictions were compared with experimental data and were accurate for most substances. Our model can be used for probabilistic risk assessment of chemical bioaccumulation, with particular emphasis on cross-species evaluations.
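
    A heavily reduced sketch of the toxicokinetics underlying such models: a single well-mixed compartment with gill uptake and elimination. Real PBTK models resolve several organ compartments and physiological blood flows; all parameter values here are invented for illustration.

```python
# One-compartment reduction of fish toxicokinetics:
#   dC/dt = k_uptake * Cw - k_elim * C
# so C approaches BCF * Cw with BCF = k_uptake / k_elim.
import numpy as np

def simulate(c_water, k_uptake=50.0, k_elim=0.05, days=60, dt=0.1):
    """Whole-body concentration C(t) by explicit Euler integration."""
    steps = int(days / dt)
    c = np.zeros(steps + 1)
    for i in range(steps):
        c[i + 1] = c[i] + dt * (k_uptake * c_water - k_elim * c[i])
    return c

c = simulate(c_water=0.001)
print(c[-1], 50.0 / 0.05 * 0.001)  # approaches BCF * Cw at steady state
```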

  14. A generalised individual-based algorithm for modelling the evolution of quantitative herbicide resistance in arable weed populations.

    PubMed

    Liu, Chun; Bridges, Melissa E; Kaundun, Shiv S; Glasgow, Les; Owen, Micheal Dk; Neve, Paul

    2017-02-01

    Simulation models are useful tools for predicting and comparing the risk of herbicide resistance in weed populations under different management strategies. Most existing models assume a monogenic mechanism governing herbicide resistance evolution. However, growing evidence suggests that herbicide resistance is often inherited in a polygenic or quantitative fashion. Therefore, we constructed a generalised modelling framework to simulate the evolution of quantitative herbicide resistance in summer annual weeds. Real-field management parameters based on Amaranthus tuberculatus (Moq.) Sauer (syn. rudis) control with glyphosate and mesotrione in Midwestern US maize-soybean agroecosystems demonstrated that the model can represent evolved herbicide resistance in realistic timescales. Sensitivity analyses showed that genetic and management parameters were impactful on the rate of quantitative herbicide resistance evolution, whilst biological parameters such as emergence and seed bank mortality were less important. The simulation model provides a robust and widely applicable framework for predicting the evolution of quantitative herbicide resistance in summer annual weed populations. The sensitivity analyses identified weed characteristics that would favour herbicide resistance evolution, including high annual fecundity, large resistance phenotypic variance and pre-existing herbicide resistance. Implications for herbicide resistance management and potential use of the model are discussed. © 2016 Society of Chemical Industry.
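
    A minimal sketch of the core loop of such an individual-based model: resistance is a polygenic (Gaussian) trait, survival under spraying increases with the trait, and survivors breed. Parameter values and the trait model are illustrative, not the paper's calibrated parameters.

```python
# Hedged sketch of individual-based quantitative resistance evolution.
import numpy as np

rng = np.random.default_rng(1)
traits = rng.normal(0.0, 1.0, 5000)          # initial resistance trait z

for season in range(20):
    # Dose-response: survival probability rises with the trait value.
    survival = 1.0 / (1.0 + np.exp(-(traits - 2.0)))
    survivors = traits[rng.random(traits.size) < survival]
    if survivors.size < 2:
        break
    # Random mating: offspring = midparent value + segregation variance.
    mothers = rng.choice(survivors, 5000)
    fathers = rng.choice(survivors, 5000)
    traits = 0.5 * (mothers + fathers) + rng.normal(0.0, 0.5, 5000)
    print(season, round(traits.mean(), 2))   # mean resistance rises
```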

  15. Prediction of total organic carbon content in shale reservoir based on a new integrated hybrid neural network and conventional well logging curves

    NASA Astrophysics Data System (ADS)

    Zhu, Linqi; Zhang, Chong; Zhang, Chaomo; Wei, Yang; Zhou, Xueqing; Cheng, Yuan; Huang, Yuyang; Zhang, Le

    2018-06-01

    There is increasing interest in shale gas reservoirs due to their abundant reserves. As a key evaluation criterion, the total organic carbon content (TOC) of a reservoir reflects its hydrocarbon generation potential. Existing TOC calculation models are not very accurate, and there is still room for improvement. In this paper, an integrated hybrid neural network (IHNN) model is proposed for predicting the TOC. It is motivated by the observation that TOC information for low-TOC reservoirs, where the TOC is easy to evaluate, still comes from a prediction step, which is an inherent limitation of the existing algorithms. Comparing prediction models established on 132 rock samples from the shale gas reservoir within the Jiaoshiba area shows that the accuracy of the proposed IHNN model is much higher than that of the other prediction models. The mean square error on samples withheld from model building was reduced from 0.586 to 0.442. The results show that TOC prediction becomes easier once the logging-based prediction is improved. Furthermore, this paper puts forward the next research direction for the prediction model. The IHNN algorithm can help evaluate the TOC of a shale gas reservoir.

  16. On the Relationship between Observed NLDN Lightning ...

    EPA Pesticide Factsheets

    Lightning-produced nitrogen oxides (NOX = NO + NO2) in the middle and upper troposphere play an essential role in the production of ozone (O3) and influence the oxidizing capacity of the troposphere. Despite much effort in both observing and modeling lightning NOX during the past decade, considerable uncertainties still exist in the quantification of lightning NOX production and distribution in the troposphere. It is even more challenging for regional chemistry and transport models to accurately parameterize lightning NOX production and distribution in time and space. The Community Multiscale Air Quality Model (CMAQ) parameterizes lightning NO emissions using local scaling factors adjusted by the convective precipitation rate predicted by the upstream meteorological model; the adjustment is based on observed lightning strikes from the National Lightning Detection Network (NLDN). For this parameterization to be valid, an a priori reasonable relationship between the observed lightning strikes and the modeled convective precipitation rates must exist. In this study, we present an analysis leveraging the observed NLDN lightning strikes and CMAQ model simulations over the continental United States for a time period spanning over a decade. Based on the analysis, a new parameterization scheme for lightning NOX will be proposed and the results will be evaluated. The proposed scheme will be beneficial to modeling exercises where the obs

  17. Application of a passivity based control methodology for flexible joint robots to a simplified Space Shuttle RMS

    NASA Technical Reports Server (NTRS)

    Sicard, Pierre; Wen, John T.

    1992-01-01

    A passivity approach for the control design of flexible joint robots is applied to the rate control of a three-link arm modeled after the shoulder yaw joint of the Space Shuttle Remote Manipulator System (RMS). The system model includes friction and elastic joint couplings modeled as nonlinear springs. The basic structure of the proposed controller is the sum of a model-based feedforward and a model-independent feedback. A regulator approach with link state feedback is employed to define the desired motor state. Passivity theory is used to design a motor state-based controller to stabilize the error system formed by the feedforward. Simulation results show that greatly improved performance was obtained by using the proposed controller over the existing RMS controller.
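
    A minimal simulation sketch of the controller structure described above: a model-independent motor-state feedback tracking a desired motor trajectory on a single flexible joint (motor inertia Jm, link inertia Jl, joint stiffness k, link friction b). All gains and parameters are illustrative, not Space Shuttle RMS values, and the feedforward/regulator details of the paper are simplified away.

```python
# Flexible-joint rate control sketch: PD feedback on the motor state,
# with the link driven through the elastic joint coupling.
Jm, Jl, k, b = 0.5, 2.0, 40.0, 1.5
Kp, Kd = 80.0, 15.0
dt, w_ref = 0.001, 0.2                    # step size, commanded link rate

qm = wm = ql = wl = 0.0                   # motor/link angle and rate
for i in range(20000):
    qm_ref = w_ref * i * dt               # desired motor trajectory
    tau = Kp * (qm_ref - qm) + Kd * (w_ref - wm)   # motor-state feedback
    tau_s = k * (qm - ql)                 # elastic joint coupling torque
    wm += dt * (tau - tau_s) / Jm
    qm += dt * wm
    wl += dt * (tau_s - b * wl) / Jl
    ql += dt * wl
print(round(wl, 3))                       # link rate settles near w_ref
```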

  18. Testing the Digital Thread in Support of Model-Based Manufacturing and Inspection

    PubMed Central

    Hedberg, Thomas; Lubell, Joshua; Fischer, Lyle; Maggiano, Larry; Feeney, Allison Barnard

    2016-01-01

    A number of manufacturing companies have reported anecdotal evidence describing the benefits of Model-Based Enterprise (MBE). Based on this evidence, major players in industry have embraced a vision to deploy MBE. In our view, the best chance of realizing this vision is the creation of a single "digital thread." Under MBE, there exists a Model-Based Definition (MBD), created by the Engineering function, that downstream functions reuse to complete Model-Based Manufacturing and Model-Based Inspection activities. The ensemble of data that enables the combination of model-based definition, manufacturing, and inspection defines this digital thread. Such a digital thread would enable real-time design and analysis, collaborative process-flow development, automated artifact creation, and full-process traceability in seamless, collaborative development among project participants. This paper documents the strengths and weaknesses of current industry strategies for implementing MBE. It also identifies gaps in the transition and/or exchange of data between various manufacturing processes. Lastly, this paper presents measured results from a study of model-based processes compared to drawing-based processes, providing evidence to support the anecdotal reports and the vision put forward by industry. PMID:27325911

  19. Model-based semantic dictionaries for medical language understanding.

    PubMed Central

    Rassinoux, A. M.; Baud, R. H.; Ruch, P.; Trombert-Paviot, B.; Rodrigues, J. M.

    1999-01-01

    Semantic dictionaries are emerging as a major cornerstone towards achieving sound natural language understanding. Indeed, they constitute the main bridge between words and conceptual entities that reflect their meanings. Nowadays, more and more wide-coverage lexical dictionaries are electronically available in the public domain. However, associating a semantic content with lexical entries is not a straightforward task as it is subordinate to the existence of a fine-grained concept model of the treated domain. This paper presents the benefits and pitfalls in building and maintaining multilingual dictionaries, the semantics of which is directly established on an existing concept model. Concrete cases, handled through the GALEN-IN-USE project, illustrate the use of such semantic dictionaries for the analysis and generation of multilingual surgical procedures. PMID:10566333

  20. Flavor instabilities in the neutrino line model

    NASA Astrophysics Data System (ADS)

    Duan, Huaiyu; Shalgar, Shashank

    2015-07-01

    A dense neutrino medium can experience collective flavor oscillations through nonlinear neutrino-neutrino refraction. To make this multi-dimensional flavor transport problem more tractable, all existing studies have assumed certain symmetries (e.g., the spatial homogeneity and directional isotropy in the early universe) to reduce the dimensionality of the problem. In this work we show that, if both the directional and spatial symmetries are not enforced in the neutrino line model, collective oscillations can develop in the physical regimes where the symmetry-preserving oscillation modes are stable. Our results suggest that collective neutrino oscillations in real astrophysical environments (such as core-collapse supernovae and black-hole accretion discs) can be qualitatively different from the predictions based on existing models in which spatial and directional symmetries are artificially imposed.

  1. Out of This World: A University Partnership Model for Functional Clothing Design

    NASA Technical Reports Server (NTRS)

    Dunne, Lucy E.; Simon, Cory

    2013-01-01

    University collaborations with external partners can be difficult to initiate, especially in early-stage or emerging topics. External collaborators may be reluctant to commit the level of funding required to ensure that the topic is given adequate attention, and low-stakes mechanisms are relatively rare. Here, we present a successful model for collaboration between universities and NASA, which uses existing project-based coursework as a vehicle for exploration of emerging topics. This model leverages existing structures, reducing the financial and intellectual commitment of both University and NASA research partners, and facilitating pilot investigations for exploration of potential areas for more in-depth research. We outline the logistical structure and benefits for University and NASA partners over 1.5 years of collaboration.

  2. Dependability modeling and assessment in UML-based software development.

    PubMed

    Bernardi, Simona; Merseguer, José; Petriu, Dorina C

    2012-01-01

    Assessment of software nonfunctional properties (NFP) is an important problem in software development. In the context of model-driven development, an emerging approach for the analysis of different NFPs consists of the following steps: (a) to extend the software models with annotations describing the NFP of interest; (b) to transform automatically the annotated software model to the formalism chosen for NFP analysis; (c) to analyze the formal model using existing solvers; (d) to assess the software based on the results and give feedback to designers. Such a modeling→analysis→assessment approach can be applied to any software modeling language, be it general purpose or domain specific. In this paper, we focus on UML-based development and on the dependability NFP, which encompasses reliability, availability, safety, integrity, and maintainability. The paper presents the profile used to extend UML with dependability information, the model transformation to generate a DSPN formal model, and the assessment of the system properties based on the DSPN results.

  3. Dependability Modeling and Assessment in UML-Based Software Development

    PubMed Central

    Bernardi, Simona; Merseguer, José; Petriu, Dorina C.

    2012-01-01

    Assessment of software nonfunctional properties (NFP) is an important problem in software development. In the context of model-driven development, an emerging approach for the analysis of different NFPs consists of the following steps: (a) to extend the software models with annotations describing the NFP of interest; (b) to transform automatically the annotated software model to the formalism chosen for NFP analysis; (c) to analyze the formal model using existing solvers; (d) to assess the software based on the results and give feedback to designers. Such a modeling→analysis→assessment approach can be applied to any software modeling language, be it general purpose or domain specific. In this paper, we focus on UML-based development and on the dependability NFP, which encompasses reliability, availability, safety, integrity, and maintainability. The paper presents the profile used to extend UML with dependability information, the model transformation to generate a DSPN formal model, and the assessment of the system properties based on the DSPN results. PMID:22988428

  4. Extending data worth methods to select multiple observations targeting specific hydrological predictions of interest

    NASA Astrophysics Data System (ADS)

    Vilhelmsen, Troels N.; Ferré, Ty P. A.

    2016-04-01

    Hydrological models are often developed to forecast future behavior in response to natural or human-induced changes in the stresses affecting hydrologic systems. Commonly, these models are conceptualized and calibrated based on existing data and information about the hydrological conditions. However, most hydrologic systems lack sufficient data to constrain models with adequate certainty to support robust decision making. Therefore, a key element of a hydrologic study is the selection of additional data to improve model performance. Given the nature of hydrologic investigations, it is not practical to select data sequentially, i.e. to choose the next observation, collect it, refine the model, and then repeat the process. Rather, for timing and financial reasons, measurement campaigns include multiple wells or sampling points. There is a growing body of literature aimed at estimating the expected data worth based on existing models. However, these studies are almost all limited to identifying single additional observations. In this study, we present a methodology for simultaneously selecting multiple potential new observations based on their expected ability to reduce the uncertainty of the forecasts of interest. The methodology is based on linear estimates of the predictive uncertainty, and it can be used to determine the optimal combinations of measurements (location and number) to reduce the uncertainty of multiple predictions. The outcome of the analysis is an estimate of the optimal sampling locations and the optimal number of samples, as well as a probability map showing the locations within the investigated area that are most likely to provide useful information about the forecasts of interest.
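
    A minimal sketch of linear (first-order second-moment) data-worth analysis in this spirit: for a linearized model, adding observations with Jacobian rows J updates the parameter covariance, and the worth of a candidate observation set is the resulting drop in forecast variance. The matrices below are random stand-ins for a real model's sensitivities.

```python
# Rank candidate observation pairs by expected forecast-variance reduction.
import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)
n_par, n_obs = 6, 5
C_prior = np.eye(n_par)                 # prior parameter covariance
J = rng.normal(size=(n_obs, n_par))     # candidate-observation Jacobian
s = rng.normal(size=n_par)              # forecast sensitivity vector
r_noise = 0.1                           # observation error variance

def forecast_var(obs_idx):
    Ji = J[list(obs_idx)]
    C_d = r_noise * np.eye(len(obs_idx))
    gain = C_prior @ Ji.T @ np.linalg.inv(Ji @ C_prior @ Ji.T + C_d)
    C_post = C_prior - gain @ Ji @ C_prior      # Bayes-linear update
    return s @ C_post @ s

# Best pair of new observations = largest reduction in forecast variance.
base = s @ C_prior @ s
best = min(combinations(range(n_obs), 2), key=forecast_var)
print(best, base - forecast_var(best))
```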

  5. A neural network model of causative actions.

    PubMed

    Lee-Hand, Jeremy; Knott, Alistair

    2015-01-01

    A common idea in models of action representation is that actions are represented in terms of their perceptual effects (see e.g., Prinz, 1997; Hommel et al., 2001; Sahin et al., 2007; Umiltà et al., 2008; Hommel, 2013). In this paper we extend existing models of effect-based action representations to account for a novel distinction. Some actions bring about effects that are independent events in their own right: for instance, if John smashes a cup, he brings about the event of the cup smashing. Other actions do not bring about such effects. For instance, if John grabs a cup, this action does not cause the cup to "do" anything: a grab action has well-defined perceptual effects, but these are not registered by the perceptual system that detects independent events involving external objects in the world. In our model, effect-based actions are implemented in several distinct neural circuits, which are organized into a hierarchy based on the complexity of their associated perceptual effects. The circuit at the top of this hierarchy is responsible for actions that bring about independently perceivable events. This circuit receives input from the perceptual module that recognizes arbitrary events taking place in the world, and learns movements that reliably cause such events. We assess our model against existing experimental observations about effect-based motor representations, and make some novel experimental predictions. We also consider the possibility that the "causative actions" circuit in our model can be identified with a motor pathway reported in other work, specializing in "functional" actions on manipulable tools (Bub et al., 2008; Binkofski and Buxbaum, 2013).

  6. Applying Model Based Systems Engineering to NASA's Space Communications Networks

    NASA Technical Reports Server (NTRS)

    Bhasin, Kul; Barnes, Patrick; Reinert, Jessica; Golden, Bert

    2013-01-01

    System engineering practices for complex systems and networks now require that requirement, architecture, and concept of operations product development teams, simultaneously harmonize their activities to provide timely, useful and cost-effective products. When dealing with complex systems of systems, traditional systems engineering methodology quickly falls short of achieving project objectives. This approach is encumbered by the use of a number of disparate hardware and software tools, spreadsheets and documents to grasp the concept of the network design and operation. In case of NASA's space communication networks, since the networks are geographically distributed, and so are its subject matter experts, the team is challenged to create a common language and tools to produce its products. Using Model Based Systems Engineering methods and tools allows for a unified representation of the system in a model that enables a highly related level of detail. To date, Program System Engineering (PSE) team has been able to model each network from their top-level operational activities and system functions down to the atomic level through relational modeling decomposition. These models allow for a better understanding of the relationships between NASA's stakeholders, internal organizations, and impacts to all related entities due to integration and sustainment of existing systems. Understanding the existing systems is essential to accurate and detailed study of integration options being considered. In this paper, we identify the challenges the PSE team faced in its quest to unify complex legacy space communications networks and their operational processes. We describe the initial approaches undertaken and the evolution toward model based system engineering applied to produce Space Communication and Navigation (SCaN) PSE products. We will demonstrate the practice of Model Based System Engineering applied to integrating space communication networks and the summary of its results and impact. We will highlight the insights gained by applying the Model Based System Engineering and provide recommendations for its applications and improvements.

  7. Setting priorities in health research using the model proposed by the World Health Organization: development of a quantitative methodology using tuberculosis in South Africa as a worked example.

    PubMed

    Hacking, Damian; Cleary, Susan

    2016-02-09

    Setting priorities is important in health research given the limited resources available for research. Various guidelines exist to assist in the priority setting process; however, priority setting still faces significant challenges, such as the clear ranking of identified priorities. The World Health Organization (WHO) proposed a Disability Adjusted Life Year (DALY)-based model to rank priorities by research area (basic, health systems and biomedical) by dividing the DALYs into 'unavertable with existing interventions', 'avertable with improved efficiency' and 'avertable with existing but non-cost-effective interventions', respectively. However, the model has conceptual flaws and no clear methodology for its construction. Therefore, the aim of this paper was to amend the model to address these flaws, and to develop a clear methodology by using tuberculosis in South Africa as a worked example. An amended model was constructed to represent total DALYs as the product of DALYs per person and the absolute burden of disease. These figures were calculated for all countries from WHO datasets. The lowest figures achieved by any country were assumed to represent 'unavertable with existing interventions' if extrapolated to South Africa. The ratio of 'cost per patient treated' (adjusted for purchasing power and outcome weighted) between South Africa and the best country was used to calculate the 'avertable with improved efficiency' section. Finally, 'avertable with existing but non-cost-effective interventions' was calculated using Disease Control Priorities Project efficacy data and the ratio between the best intervention and South Africa's current intervention, irrespective of cost. The amended model shows that South Africa has a tuberculosis burden of 1,009,837.3 DALYs; 0.009% of DALYs are unavertable with existing interventions and 96.3% of DALYs could be averted with improvements in efficiency. Of the remaining DALYs, a further 56.9% could be averted with existing but non-cost-effective interventions. The amended model was successfully constructed using limited data sources. The generalizability of the data used is the main limitation of the model. More complex formulas are required to deal with such potential confounding variables; however, the results act as a starting point for the development of a more robust model.
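
    A small arithmetic check of the partition, reading the percentages straight from the abstract:

```python
# Worked arithmetic for the amended model's partition of the South
# African tuberculosis burden, using the figures quoted above.
total = 1_009_837.3                       # total TB DALYs
unavertable = total * 0.00009             # 0.009%: unavertable
avert_eff = total * 0.963                 # 96.3%: avertable w/ efficiency
remaining = total - unavertable - avert_eff
avert_noncost = remaining * 0.569         # 56.9% of the remainder
print(round(avert_eff), round(avert_noncost),
      round(remaining - avert_noncost))   # ~972473, ~21209, ~16064
```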

  8. A Generalized Framework for Modeling Next Generation 911 Implementations.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kelic, Andjelka; Aamir, Munaf Syed

    This document summarizes the current state of Sandia 911 modeling capabilities and then addresses key aspects of Next Generation 911 (NG911) architectures for expansion of existing models. Analysis of three NG911 implementations was used to inform heuristics, associated key data requirements, and assumptions needed to capture NG911 architectures in the existing models. Modeling of NG911 necessitates careful consideration of its complexity and the diversity of implementations. Draft heuristics for constructing NG911 models are presented based on the analysis, along with a summary of current challenges and ways to improve future NG911 modeling efforts. We found that NG911 relies on Enhanced 911 (E911) assets such as 911 selective routers to route calls originating from traditional telephony service, which are a majority of 911 calls. We also found that the diversity and transitional nature of NG911 implementations necessitates significant and frequent data collection to ensure that adequate models are available for crisis action support.

  9. Measuring and modeling polymer concentration profiles near spindle boundaries argues that spindle microtubules regulate their own nucleation

    NASA Astrophysics Data System (ADS)

    Kaye, Bryan; Stiehl, Olivia; Foster, Peter J.; Shelley, Michael J.; Needleman, Daniel J.; Fürthauer, Sebastian

    2018-05-01

    Spindles are self-organized microtubule-based structures that segregate chromosomes during cell division. The mass of the spindle is controlled by the balance between microtubule turnover and nucleation. The mechanisms that control the spatial regulation of microtubule nucleation remain poorly understood. While previous work found that microtubule nucleators bind to pre-existing microtubules in the spindle, it is still unclear whether this binding regulates the activity of those nucleators. Here we use a combination of experiments and mathematical modeling to investigate this issue. We measured the concentration of microtubules and soluble tubulin in and around the spindle. We found a very sharp decay in the concentration of microtubules at the spindle interface. This is inconsistent with a model in which the activity of nucleators is independent of their association with microtubules but consistent with a model in which microtubule nucleators are only active when bound to pre-existing microtubules. This argues that the activity of microtubule nucleators is greatly enhanced when bound to pre-existing microtubules. Thus, microtubule nucleators are both localized and activated by the microtubules they generate.
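
    A schematic way to see the argument (symbols illustrative, not the paper's full model): let $m(x)$ be microtubule density, $c$ the roughly uniform soluble nucleator pool, $\tau$ the turnover time, and $D$ an effective spreading coefficient. The two hypotheses differ only in the nucleation term:

```latex
\partial_t m = D\,\partial_x^2 m + k\,c\,m - m/\tau
\quad \text{(nucleators active only when microtubule-bound)},
\qquad
\partial_t m = D\,\partial_x^2 m + k\,c - m/\tau
\quad \text{(nucleator activity independent of binding)}.
```

    The autocatalytic first form supports a sharp, front-like boundary in $m$, while the second yields $m$ varying as smoothly as $c$, so the measured sharp decay at the spindle interface favors the bound-and-activated picture.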

  10. Collaborative Drug Therapy Management: Case Studies of Three Community-Based Models of Care

    PubMed Central

    Snyder, Margie E.; Earl, Tara R.; Greenberg, Michael; Heisler, Holly; Revels, Michelle; Matson-Koffman, Dyann

    2015-01-01

    Collaborative drug therapy management agreements are a strategy for expanding the role of pharmacists in team-based care with other providers. However, these agreements have not been widely implemented. This study describes the features of existing provider–pharmacist collaborative drug therapy management practices and identifies the facilitators and barriers to implementing such services in community settings. We conducted in-depth, qualitative interviews in 2012 in a federally qualified health center, an independent pharmacy, and a retail pharmacy chain. Facilitators included 1) ensuring pharmacists were adequately trained; 2) obtaining stakeholder (eg, physician) buy-in; and 3) leveraging academic partners. Barriers included 1) lack of pharmacist compensation; 2) hesitation among providers to trust pharmacists; 3) lack of time and resources; and 4) existing informal collaborations that resulted in reduced interest in formal agreements. The models described in this study could be used to strengthen clinical–community linkages through team-based care, particularly for chronic disease prevention and management. PMID:25811494

  11. MathWorks Simulink and C++ integration with the new VLT PLC-based standard development platform for instrument control systems

    NASA Astrophysics Data System (ADS)

    Kiekebusch, Mario J.; Di Lieto, Nicola; Sandrock, Stefan; Popovic, Dan; Chiozzi, Gianluca

    2014-07-01

    ESO is in the process of implementing a new development platform, based on PLCs, for upcoming VLT control systems (new instruments and refurbishing of existing systems to manage obsolescence issues). In this context, we have evaluated the integration and reuse of existing C++ libraries and Simulink models in the real-time environment of BECKHOFF Embedded PCs, using the capabilities of the latest version of the TwinCAT software and MathWorks Embedded Coder. While doing so, the aim was to minimize the impact of the new platform by adopting fully tested solutions implemented in C++. This allows us to reuse the in-house expertise, as well as to extend the normal capabilities of traditional PLC programming environments. We present the progress of this work and its application in two concrete cases: 1) field rotation compensation for instrument tracking devices such as derotators; 2) the ESO standard axis controller (ESTAC), a generic model-based controller implemented in Simulink and used for the control of telescope main axes.
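
    As an aside on the first use case: field rotation compensation on an alt-azimuth telescope amounts to tracking the parallactic angle of the target. The sketch below shows that computation in Python purely for illustration; the actual controllers are generated from Simulink and C++ for the TwinCAT runtime, and the numerical values here are assumptions.

```python
# Illustrative computation of the parallactic angle that a derotator must
# track to compensate field rotation on an alt-azimuth telescope. The
# formula is standard; the numbers below are assumptions.
import math

def parallactic_angle(hour_angle, dec, lat):
    """Angle between celestial north and the local vertical (radians)."""
    return math.atan2(
        math.sin(hour_angle),
        math.cos(dec) * math.tan(lat) - math.sin(dec) * math.cos(hour_angle),
    )

lat = math.radians(-24.63)   # Paranal latitude, approximate
q = parallactic_angle(math.radians(15.0), math.radians(-30.0), lat)
print(f"derotator demand: {math.degrees(q):.2f} deg")
```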

  12. A Two-Phase Space Resection Model for Accurate Topographic Reconstruction from Lunar Imagery with Pushbroom Scanners

    PubMed Central

    Xu, Xuemiao; Zhang, Huaidong; Han, Guoqiang; Kwan, Kin Chung; Pang, Wai-Man; Fang, Jiaming; Zhao, Gansen

    2016-01-01

    Exterior orientation parameters’ (EOP) estimation using space resection plays an important role in topographic reconstruction for push broom scanners. However, existing models of space resection are highly sensitive to errors in data. Unfortunately, for lunar imagery, the altitude data at the ground control points (GCPs) for space resection are error-prone. Thus, existing models fail to produce reliable EOPs. Motivated by the finding that, for push broom scanners, the angular rotations of the EOPs can be estimated independently of the altitude data, using only the geographic data at the GCPs, which are already provided, we divide the modeling of space resection into two phases. First, we estimate the angular rotations from the reliable geographic data using our proposed mathematical model. Then, with accurate angular rotations, the collinearity equations for space resection simplify into a linear problem, and the globally optimal solution for the spatial position of the EOPs can always be achieved. Moreover, a certainty term is integrated to penalize the unreliable altitude data, increasing the error tolerance. Experimental results show that our model obtains more accurate EOPs and topographic maps than the existing space resection model, not only for simulated data but also for real data from Chang’E-1. PMID:27077855
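
    The payoff of the second phase is that, with the rotation fixed, each GCP contributes constraints that are linear in the unknown sensor position, so ordinary least squares suffices. A minimal sketch of that phase on synthetic frame-camera data; the real model handles line-by-line pushbroom geometry, and the certainty weighting is omitted here.

```python
# Phase-2 sketch: once the rotation R is fixed (phase 1), the collinearity
# condition becomes linear in the sensor position C. Synthetic data; an
# illustration of the two-phase idea, not the authors' full model.
import numpy as np

rng = np.random.default_rng(0)
R = np.eye(3)                          # phase-1 rotation, assumed known here
f = 1.0                                # focal length (normalized), assumed
C_true = np.array([10.0, -5.0, 500.0]) # sensor position to recover

# Ground control points and their (noise-free) image coordinates.
X = rng.uniform(-100, 100, size=(8, 3)); X[:, 2] = 0.0
rays = (X - C_true) @ R.T              # ray directions in the camera frame
xy = -f * rays[:, :2] / rays[:, 2:3]   # perspective projection

# Each GCP gives (X - C) x d = 0 with d = R.T @ [x, y, -f]: linear in C.
A_rows, b_rows = [], []
for Xi, (xi, yi) in zip(X, xy):
    d = R.T @ np.array([xi, yi, -f])
    S = np.array([[0, -d[2], d[1]], [d[2], 0, -d[0]], [-d[1], d[0], 0]])
    A_rows.append(S)                   # S @ (X - C) = 0  ->  S @ C = S @ X
    b_rows.append(S @ Xi)
C_hat, *_ = np.linalg.lstsq(np.vstack(A_rows), np.concatenate(b_rows), rcond=None)
print("recovered C:", np.round(C_hat, 3))
```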

  14. On the use of Schwarz-Christoffel conformal mappings to the grid generation for global ocean models

    NASA Astrophysics Data System (ADS)

    Xu, S.; Wang, B.; Liu, J.

    2015-10-01

    In this article we propose two grid generation methods for global ocean general circulation models. Contrary to conventional dipolar or tripolar grids, the proposed methods are based on Schwarz-Christoffel conformal mappings that map areas with user-prescribed, irregular boundaries to those with regular boundaries (e.g., disks and slits). The first method aims at improving existing dipolar grids. Compared with existing grids, the sample grid achieves a better trade-off between the enlargement of the latitudinal-longitudinal portion and overall smooth grid-cell size transition. The second method addresses more modern and advanced grid design requirements arising from high-resolution and multi-scale ocean modeling. The generated grids could potentially achieve the alignment of grid lines with large-scale coastlines, enhanced spatial resolution in coastal regions, and easier computational load balancing. Since the grids are orthogonal curvilinear, they can be readily used by the majority of ocean general circulation models that are based on finite differences and require grid orthogonality. The proposed grid generation algorithms can also be applied to grid generation for regional ocean modeling, where complex land-sea distributions are present.
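
    The orthogonality these methods rely on follows from conformality alone: any analytic map sends an orthogonal computational grid to an orthogonal physical grid. The sketch below checks this numerically with a simple analytic map; an actual Schwarz-Christoffel transform additionally requires solving the parameter problem for the target polygon, which is not attempted here.

```python
# The image of an orthogonal computational grid under any analytic map
# stays orthogonal. Demonstrated with exp(z), not a true Schwarz-
# Christoffel transform (that requires solving for polygon parameters).
import numpy as np

xi = np.linspace(0.1, 1.0, 30)          # computational coordinates
eta = np.linspace(0.0, np.pi / 2, 30)
z = xi[None, :] + 1j * eta[:, None]     # rectangular computational grid
w = np.exp(z)                           # analytic map -> annular sector grid

# Verify orthogonality numerically: grid-line tangents meet at ~90 degrees.
du = np.diff(w, axis=1)[:-1, :]         # tangents along xi-lines
dv = np.diff(w, axis=0)[:, :-1]         # tangents along eta-lines
cosang = np.abs((du * dv.conj()).real) / (np.abs(du) * np.abs(dv))
print("max |cos(angle)| between grid lines:", cosang.max())  # ~0 => orthogonal
```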

  15. Robust foreground detection: a fusion of masked grey world, probabilistic gradient information and extended conditional random field approach.

    PubMed

    Zulkifley, Mohd Asyraf; Moran, Bill; Rawlinson, David

    2012-01-01

    Foreground detection has been used extensively in many applications such as people counting, traffic monitoring and face recognition. However, most of the existing detectors can only work under limited conditions. This happens because of the inability of the detector to distinguish foreground and background pixels, especially in complex situations. Our aim is to improve the robustness of foreground detection under sudden and gradual illumination change, colour similarity, moving background and shadow noise. Since it is hard to achieve robustness using a single model, we have combined several methods into an integrated system. The masked grey world algorithm is introduced to handle sudden illumination change. Colour co-occurrence modelling is then fused with probabilistic edge-based background modelling. Colour co-occurrence modelling is good at filtering moving background and robust to gradual illumination change, while edge-based modelling is used to solve the colour similarity problem. Finally, an extended conditional random field approach is used to filter out shadow and afterimage noise. Simulation results show that our algorithm performs better than the existing methods, which makes it suitable for higher-level applications.
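
    For reference, the grey-world step used here to absorb sudden illumination changes rescales each colour channel so that its mean matches the overall mean intensity; the masked variant computes those means over background pixels only. A minimal sketch, with the mask handling and data types as our assumptions:

```python
# Grey-world colour normalization, the idea behind the paper's "masked
# grey world" step; the mask handling here is our assumption.
import numpy as np

def grey_world(img, mask=None):
    """Scale each channel so its mean matches the global mean intensity."""
    pixels = img[mask] if mask is not None else img.reshape(-1, 3)
    means = pixels.mean(axis=0)                    # per-channel means
    gain = means.mean() / means                    # grey-world gains
    return np.clip(img * gain, 0, 255).astype(np.uint8)

frame = np.random.default_rng(1).uniform(0, 255, (120, 160, 3))
bg_mask = np.ones(frame.shape[:2], dtype=bool)     # e.g. current background
print(grey_world(frame, bg_mask).shape)
```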

  16. Finding the top influential bloggers based on productivity and popularity features

    NASA Astrophysics Data System (ADS)

    Khan, Hikmat Ullah; Daud, Ali

    2017-07-01

    A blog acts as a platform of virtual communication to share comments or views about products, events and social issues. Like other social web activities, blogging actions spread to a large number of people. Users influence others in many ways, such as buying a product, holding a particular political or social opinion or initiating a new activity. Finding the top influential bloggers is an active research domain, as it helps in various fields such as online marketing, e-commerce, product search and e-advertisements. Various models exist to find influential bloggers, but they consider a limited set of features and use a non-modular approach. This paper proposes a new model, the Popularity and Productivity Model (PPM), based on a modular approach to find the top influential bloggers. It consists of popularity and productivity modules which exploit various features. We discuss the role of each proposed and existing feature and evaluate the proposed model against standard baseline models using datasets from real-world blogs. The analysis, using standard performance evaluation measures, verifies that both the productivity and popularity modules play a vital role in finding influential bloggers in the blogging community in an effective manner.
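
    A minimal sketch of the modular structure described here: one module scores productivity, one scores popularity, and a top-level function combines them. The specific features, their normalization, and the equal weights are illustrative assumptions, not the published PPM definitions.

```python
# Modular influence scoring in the spirit of PPM; feature names and
# weights are illustrative assumptions.
def productivity(blogger):
    return 0.5 * blogger["n_posts_norm"] + 0.5 * blogger["posting_rate_norm"]

def popularity(blogger):
    return 0.5 * blogger["comments_norm"] + 0.5 * blogger["inlinks_norm"]

def influence(blogger, w_prod=0.5, w_pop=0.5):
    return w_prod * productivity(blogger) + w_pop * popularity(blogger)

bloggers = [
    {"name": "a", "n_posts_norm": 0.9, "posting_rate_norm": 0.7,
     "comments_norm": 0.4, "inlinks_norm": 0.6},
    {"name": "b", "n_posts_norm": 0.3, "posting_rate_norm": 0.2,
     "comments_norm": 0.9, "inlinks_norm": 0.8},
]
top = sorted(bloggers, key=influence, reverse=True)
print([b["name"] for b in top])
```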

  17. Using Geometry-Based Metrics as Part of Fitness-for-Purpose Evaluations of 3D City Models

    NASA Astrophysics Data System (ADS)

    Wong, K.; Ellul, C.

    2016-10-01

    Three-dimensional geospatial information is being increasingly used in a range of tasks beyond visualisation. 3D datasets, however, are often produced without exact specifications and at mixed levels of geometric complexity. This leads to variations in the models' geometric and semantic complexity as well as in the degree of deviation from the corresponding real-world objects. Existing descriptors and measures of 3D data, such as CityGML's level of detail, are perhaps only partially sufficient in communicating data quality and fitness-for-purpose. This study investigates whether alternative, automated, geometry-based metrics describing the variation of complexity within 3D datasets could provide additional relevant information as part of a fitness-for-purpose evaluation. The metrics include: mean vertex/edge/face counts per building; vertex/face ratio; minimum 2D footprint area; and minimum feature length. Each metric was tested on six 3D city models from international locations. The results show that geometry-based metrics can provide additional information on 3D city models as part of fitness-for-purpose evaluations. While the metrics cannot be used in isolation, they may complement and enhance existing data descriptors if backed up with local knowledge, where possible.
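
    Most of these metrics can be computed directly from per-building meshes. A sketch, assuming a simple vertex/face-list representation; the footprint is approximated here by the xy bounding box rather than a true 2D footprint polygon.

```python
# Per-building geometry metrics from a vertex/face-list mesh; the mesh
# format and the bounding-box footprint proxy are our assumptions.
import numpy as np

def mesh_metrics(vertices, faces):
    edges = {tuple(sorted(e)) for f in faces for e in zip(f, f[1:] + f[:1])}
    verts = np.asarray(vertices, dtype=float)
    lengths = [np.linalg.norm(verts[a] - verts[b]) for a, b in edges]
    footprint = np.ptp(verts[:, 0]) * np.ptp(verts[:, 1])  # xy bounding box
    return {"vertices": len(vertices), "edges": len(edges), "faces": len(faces),
            "vertex_face_ratio": len(vertices) / len(faces),
            "min_feature_length": min(lengths),
            "footprint_area_proxy": footprint}

# A unit tetrahedron as a stand-in for one building.
verts = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]
faces = [[0, 1, 2], [0, 1, 3], [0, 2, 3], [1, 2, 3]]
print(mesh_metrics(verts, faces))
```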

  18. Electronic medical record-based multicondition models to predict the risk of 30 day readmission or death among adult medicine patients: validation and comparison to existing models.

    PubMed

    Amarasingham, Ruben; Velasco, Ferdinand; Xie, Bin; Clark, Christopher; Ma, Ying; Zhang, Song; Bhat, Deepa; Lucena, Brian; Huesch, Marco; Halm, Ethan A

    2015-05-20

    There is increasing interest in using prediction models to identify patients at risk of readmission or death after hospital discharge, but existing models have significant limitations. Electronic medical record (EMR)-based models that can predict risk across multiple disease conditions, among a wide range of patient demographics, and early in the hospitalization are needed. The objective of this study was to evaluate the degree to which EMR-based risk models for 30-day readmission or mortality accurately identify high-risk patients, and to compare these models with published claims-based models. Data were analyzed from all consecutive adult patients admitted to internal medicine services at 7 large hospitals belonging to 3 health systems in Dallas/Fort Worth between November 2009 and October 2010, split randomly into derivation and validation cohorts. Performance of the model was evaluated against the Canadian LACE mortality or readmission model and the Centers for Medicare and Medicaid Services (CMS) Hospital Wide Readmission model. Among the 39,604 adults hospitalized for a broad range of medical reasons, 2.8% of patients died, 12.7% were readmitted, and 14.7% were readmitted or died within 30 days after discharge. The electronic multicondition models for the composite outcome of 30-day mortality or readmission had good discrimination using data available within 24 h of admission (C statistic 0.69; 95% CI, 0.68-0.70) or at discharge (0.71; 95% CI, 0.70-0.72), and were significantly better than the LACE model (0.65; 95% CI, 0.64-0.66; P = 0.02), with significant NRI (0.16) and IDI (0.039; 95% CI, 0.035-0.044). The electronic multicondition model for 30-day readmission alone had good discrimination using data available within 24 h of admission (C statistic 0.66; 95% CI, 0.65-0.67) or at discharge (0.68; 95% CI, 0.67-0.69), and performed significantly better than the CMS model (0.61; 95% CI, 0.59-0.62; P < 0.01), with significant NRI (0.20) and IDI (0.037; 95% CI, 0.033-0.041). A new electronic multicondition model based on information derived from the EMR predicted mortality and readmission at 30 days and was superior to previously published claims-based models.
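
    The C statistics used throughout this comparison are areas under the ROC curve. A sketch of how such a comparison is computed, on synthetic risk scores (scikit-learn assumed; the numbers below are illustrative, not the study's data):

```python
# The reported C statistics are ROC areas. Comparison sketch on synthetic
# risk scores; a stronger signal yields a higher C statistic.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(7)
y = rng.binomial(1, 0.147, size=5000)          # 30-day readmission or death
emr_risk = 0.8 * y + rng.normal(0, 1, 5000)    # made-up EMR-based score
claims_risk = 0.4 * y + rng.normal(0, 1, 5000) # made-up claims-based score

print("EMR-based model    C = %.2f" % roc_auc_score(y, emr_risk))
print("claims-based model C = %.2f" % roc_auc_score(y, claims_risk))
```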

  19. SPARK: A Framework for Multi-Scale Agent-Based Biomedical Modeling.

    PubMed

    Solovyev, Alexey; Mikheev, Maxim; Zhou, Leming; Dutta-Moscato, Joyeeta; Ziraldo, Cordelia; An, Gary; Vodovotz, Yoram; Mi, Qi

    2010-01-01

    Multi-scale modeling of complex biological systems remains a central challenge in the systems biology community. A method of dynamic knowledge representation known as agent-based modeling enables the study of higher level behavior emerging from discrete events performed by individual components. With the advancement of computer technology, agent-based modeling has emerged as an innovative technique to model the complexities of systems biology. In this work, the authors describe SPARK (Simple Platform for Agent-based Representation of Knowledge), a framework for agent-based modeling specifically designed for systems-level biomedical model development. SPARK is a stand-alone application written in Java. It provides a user-friendly interface, and a simple programming language for developing Agent-Based Models (ABMs). SPARK has the following features specialized for modeling biomedical systems: 1) continuous space that can simulate real physical space; 2) flexible agent size and shape that can represent the relative proportions of various cell types; 3) multiple spaces that can concurrently simulate and visualize multiple scales in biomedical models; 4) a convenient graphical user interface. Existing ABMs of diabetic foot ulcers and acute inflammation were implemented in SPARK. Models of identical complexity were run in both NetLogo and SPARK; the SPARK-based models ran two to three times faster.
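
    SPARK itself is a stand-alone Java application, but the agent-based pattern it supports (continuous space, heterogeneous agent sizes, a tick-driven update loop) is easy to sketch; the Python below is a generic illustration, not SPARK's API.

```python
# Generic sketch of the agent-based pattern SPARK supports: agents with
# continuous positions and heterogeneous sizes, advanced in discrete ticks.
import random

class Cell:
    def __init__(self, x, y, radius):
        self.x, self.y, self.radius = x, y, radius  # continuous space + size

    def step(self, width, height):
        # Random walk clipped to the simulation space.
        self.x = min(max(self.x + random.uniform(-1, 1), 0.0), width)
        self.y = min(max(self.y + random.uniform(-1, 1), 0.0), height)

random.seed(0)
cells = [Cell(random.uniform(0, 50), random.uniform(0, 50), r)
         for r in (0.5, 1.0, 2.0) for _ in range(10)]  # three cell types
for tick in range(100):
    for c in cells:
        c.step(50.0, 50.0)
print(len(cells), "agents simulated on a continuous 50x50 space")
```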

  20. Investigating Alfvénic wave propagation in coronal open-field regions

    PubMed Central

    Morton, R. J.; Tomczyk, S.; Pinto, R.

    2015-01-01

    The physical mechanisms behind accelerating solar and stellar winds are a long-standing astrophysical mystery, although recent breakthroughs have come from models invoking the turbulent dissipation of Alfvén waves. The existence of Alfvén waves far from the Sun has been known since the 1970s, and recently the presence of ubiquitous Alfvénic waves throughout the solar atmosphere has been confirmed. However, the presence of atmospheric Alfvénic waves does not, alone, provide sufficient support for wave-based models; the existence of counter-propagating Alfvénic waves is crucial for the development of turbulence. Here, we demonstrate that counter-propagating Alfvénic waves exist in open coronal magnetic fields and reveal key observational insights into the details of their generation, reflection in the upper atmosphere and outward propagation into the solar wind. The results enhance our knowledge of Alfvénic wave propagation in the solar atmosphere, providing support and constraints for some of the recent Alfvén wave turbulence models. PMID:26213234

  1. Creating COMFORT: A Communication-Based Model for Breaking Bad News

    ERIC Educational Resources Information Center

    Villagran, Melinda; Goldsmith, Joy; Wittenberg-Lyles, Elaine; Baldwin, Paula

    2010-01-01

    This study builds upon existing protocols for breaking bad news (BBN), and offers an interaction-based approach to communicating comfort to patients and their families. The goal was to analyze medical students' (N = 21) videotaped standardized patient BBN interactions after completing an instructional unit on a commonly used BBN protocol, commonly…

  2. Video Self-Modeling on an iPad to Teach Functional Math Skills to Adolescents with Autism and Intellectual Disability

    ERIC Educational Resources Information Center

    Burton, Cami E.; Anderson, Darlene H.; Prater, Mary Anne; Dyches, Tina T.

    2013-01-01

    Researchers suggest that video-based interventions can provide increased opportunity for students with disabilities to acquire important academic and functional skills; however, little research exists regarding video-based interventions on the academic skills of students with autism and intellectual disability. We used a…

  3. Cortical Bases of Elementary Deductive Reasoning: Inference, Memory, and Metadeduction

    ERIC Educational Resources Information Center

    Reverberi, Carlo; Shallice, Tim; D'Agostini, Serena; Skrap, Miran; Bonatti, Luca L.

    2009-01-01

    Elementary deduction is the ability of unreflectively drawing conclusions from explicit or implicit premises, on the basis of their logical forms. This ability is involved in many aspects of human cognition and interactions. To date, limited evidence exists on its cortical bases. We propose a model of elementary deduction in which logical…

  4. Metis Hub: The Development of an Intuitive Project Planning System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McConnell, Rachael M.; Lawrence Livermore National Lab.

    2015-08-26

    The goal is to develop an intuitive, dynamic, and consistent interface for the Metis Planning System by combining user requirements and human engineering concepts. The system is largely based upon existing systems, so some tools already have working models that we can follow. However, the web-based interface is completely new.

  5. Manufacturing vegetable oil based biodiesel: An engineering management perspective

    USDA-ARS?s Scientific Manuscript database

    According to the USDA, 6.45 million tons of cottonseed was produced in 2007. Each ton will yield approximately 44 to 46 gallons unrefined oil. Cottonseed oil bio-diesel could have the potential to create a more competitive oil market for oil mills. The proposed cost model is based on an existing cot...

  6. Testing Theories of Recognition Memory by Predicting Performance Across Paradigms

    ERIC Educational Resources Information Center

    Smith, David G.; Duncan, Matthew J. J.

    2004-01-01

    Signal-detection theory (SDT) accounts of recognition judgments depend on the assumption that recognition decisions result from a single familiarity-based process. However, fits of a hybrid SDT model, called dual-process theory (DPT), have provided evidence for the existence of a second, recollection-based process. In 2 experiments, the authors…

  7. Improving the Effectiveness of English Vocabulary Review by Integrating ARCS with Mobile Game-Based Learning

    ERIC Educational Resources Information Center

    Wu, Ting-Ting

    2018-01-01

    Memorizing English vocabulary is often considered uninteresting, and a lack of motivation exists during learning activities. Moreover, most vocabulary practice systems automatically select words from articles and do not provide integrated model methods for students. Therefore, this study constructed a mobile game-based English vocabulary practice…

  8. An evidence-based approach to case management model selection for an acute care facility: is there really a preferred model?

    PubMed

    Terra, Sandra M

    2007-01-01

    This research seeks to determine whether there is adequate evidence-based justification for selecting one acute care case management model over another. Setting: acute inpatient hospital. This article presents a systematic review of published case management literature, classifying the results by level of evidence. The review examines the best available evidence in an effort to select an acute care case management model. Although no single case management model can be identified as preferred, adequate evidence-based literature exists to acknowledge key factors driving the acute care model and to form a foundation for the efficacy of hospital case management practice. Distinctive aspects of case management frameworks can be used to guide the development of an acute care case management model. The study illustrates:
    * The effectiveness of case management when there is direct patient contact by the case manager, regardless of disease condition: not only does the quality of care increase, but length of stay (LOS) decreases, care is defragmented, and both patient and physician satisfaction can increase.
    * The preferred case management models result in measurable outcomes that directly relate to, and demonstrate alignment with, organizational strategy.
    * Acute care management programs reduce cost and LOS and improve outcomes.
    * An integrated case management program that includes social workers, as well as nursing, is the most effective acute care management model.
    * A successful case management model recognizes physicians, as well as patients, as valued customers with whom partnership can positively affect financial outcomes in terms of reduced LOS, improved quality, and better delivery of care.

  9. Explicating an Evidence-Based, Theoretically Informed, Mobile Technology-Based System to Improve Outcomes for People in Recovery for Alcohol Dependence

    PubMed Central

    Gustafson, David H.; Isham, Andrew; Baker, Timothy; Boyle, Michael G.; Levy, Michael

    2011-01-01

    Post treatment relapse to uncontrolled alcohol use is common. More cost-effective approaches are needed. We believe currently available communication technology can use existing models for relapse prevention to cost-effectively improve long-term relapse prevention. This paper describes: 1) research-based elements of alcohol related relapse prevention and how they can be encompassed in Self Determination Theory (SDT) and Marlatt’s Cognitive Behavioral Relapse Prevention Model, 2) how technology could help address the needs of people seeking recovery, 3) a technology-based prototype, organized around Self Determination Theory and Marlatt’s model and 4) how we are testing a system based on the ideas in this article and related ethical and operational considerations. PMID:21190410

  10. Whole-farm models to quantify greenhouse gas emissions and their potential use for linking climate change mitigation and adaptation in temperate grassland ruminant-based farming systems.

    PubMed

    Del Prado, A; Crosson, P; Olesen, J E; Rotz, C A

    2013-06-01

    The farm level is the most appropriate scale for evaluating options for mitigating greenhouse gas (GHG) emissions, because the farm represents the unit at which management decisions in livestock production are made. To date, a number of whole farm modelling approaches have been developed to quantify GHG emissions and explore climate change mitigation strategies for livestock systems. This paper analyses the limitations and strengths of the different existing approaches for modelling GHG mitigation by considering basic model structures, approaches for simulating GHG emissions from various farm components and the sensitivity of GHG outputs and mitigation measures to different approaches. Potential challenges for linking existing models with the simulation of impacts and adaptation measures under climate change are explored along with a brief discussion of the effects on other ecosystem services.

  11. River Inflows into Lakes: Basin Temperature Profiles Driven By Peeling Detrainment from Dense Underflows

    NASA Astrophysics Data System (ADS)

    Hogg, C. A. R.; Huppert, H. E.; Imberger, J.; Dalziel, S. B.

    2014-12-01

    Dense gravity currents from river inflows feed fluid into confined basins in lakes. Large inflows can influence temperature profiles in the basins. Existing parameterisations of the circulation and mixing of such inflows are often based on the entrainment of ambient fluid into the underflowing gravity currents. However, recent observations have suggested that uni-directional entrainment into a gravity current does not fully describe the transfer between such gravity currents and the ambient water. Laboratory experiments visualised peeling detrainment from the gravity current occurring when the ambient fluid was stratified. A theoretical model of the observed peeling detrainment was developed to predict the temperature profile in the basin. This new model gives a better approximation of the temperature profile observed in the experiments than the pre-existing entraining model. The model can now be developed such that it integrates into operational models of lake basins.

  12. Generic Business Model Types for Enterprise Mashup Intermediaries

    NASA Astrophysics Data System (ADS)

    Hoyer, Volker; Stanoevska-Slabeva, Katarina

    The huge demand for situational and ad-hoc applications from the mass of business end users has led to a new kind of Web application, well known as Enterprise Mashups. Users with little or no programming skill are empowered to collaboratively combine and reuse existing Mashup components and company-internal and external resources, creating new value-added applications within minutes. Enterprise Mashup environments thereby act as intermediaries matching the supply of providers with the demand of consumers. Following the design science approach, we propose an interaction phase model artefact based on market transaction phases to structure the required intermediary features. By means of five case studies, we demonstrate the application of the designed model and identify three generic business model types for Enterprise Mashup intermediaries (directory, broker, and marketplace). So far, intermediaries following a real marketplace business model do not exist in the context of Enterprise Mashups, and this emerging paradigm requires further research.

  13. Astrobiological Research on Tardigrades: Implications for Extraterrestrial Life Forms

    NASA Astrophysics Data System (ADS)

    Horikawa, D. D.

    2013-11-01

    Tardigrades have been considered a model for astrobiological studies based on their tolerance of extreme environments. Future research on tardigrades might provide important insight into the possible existence of extraterrestrial multicellular life forms.

  14. The Sulfur Cycle

    ERIC Educational Resources Information Center

    Kellogg, W. W.; And Others

    1972-01-01

    A model estimating the contributions of sulfur compounds by natural and human activities, and the rate of removal of sulfur from the atmosphere, is based on a review of the existing literature. Areas requiring additional research are identified. (AL)

  15. Characterization of Orbital Debris via Hyper-Velocity Ground-Based Tests

    NASA Technical Reports Server (NTRS)

    Cowardin, Heather

    2016-01-01

    The purpose of the DebriSat project is to replicate a hyper-velocity fragmentation event using modern-day spacecraft materials and construction techniques to improve the existing DoD and NASA breakup models.

  16. Performance Model of Intercity Ground Passenger Transportation Systems

    DOT National Transportation Integrated Search

    1975-08-01

    A preliminary examination of the problems associated with mixed-traffic operations - conventional freight and high speed passenger trains - is presented. Approaches based upon a modest upgrading of existing signal systems are described. Potential cos...

  17. Adopting Internet Standards for Orbital Use

    NASA Technical Reports Server (NTRS)

    Wood, Lloyd; Ivancic, William; da Silva Curiel, Alex; Jackson, Chris; Stewart, Dave; Shell, Dave; Hodgson, Dave

    2005-01-01

    After a year of testing and demonstrating a Cisco mobile access router intended for terrestrial use onboard the low-Earth-orbiting UK-DMC satellite as part of a larger merged ground/space IP-based internetwork, we reflect on and discuss the benefits and drawbacks of integration and standards reuse for small satellite missions. Benefits include ease of operation and the ability to leverage existing systems and infrastructure designed for general use, as well as reuse of existing, known, and well-understood security and operational models. Drawbacks include cases where integration work was needed to bridge the gaps in assumptions between different systems, and where performance considerations outweighed the benefits of reuse of pre-existing file transfer protocols. We find similarities with the terrestrial IP networks whose technologies we have adopted and also some significant differences in operational models and assumptions that must be considered.

  18. A label field fusion bayesian model and its penalized maximum rand estimator for image segmentation.

    PubMed

    Mignotte, Max

    2010-06-01

    This paper presents a novel segmentation approach based on a Markov random field (MRF) fusion model which aims at combining several segmentation results, associated with simpler clustering models, in order to achieve a more reliable and accurate segmentation result. The proposed fusion model is derived from the recently introduced probabilistic Rand measure for comparing one segmentation result to one or more manual segmentations of the same image. This non-parametric measure allows us to easily derive an appealing fusion model of label fields, expressed as a Gibbs distribution, or as a nonstationary MRF model defined on a complete graph. Concretely, this Gibbs energy model encodes the set of binary constraints, in terms of pairs of pixel labels, provided by each segmentation result to be fused. Combined with a prior distribution, this energy-based Gibbs model also allows for the definition of an interesting penalized maximum probabilistic Rand estimator, with which the fusion of simple, quickly estimated segmentation results appears as an interesting alternative to the complex segmentation models existing in the literature. This fusion framework has been successfully applied on the Berkeley image database. The experiments reported in this paper demonstrate that the proposed method is efficient in terms of visual evaluation and quantitative performance measures, and performs well compared to the best existing state-of-the-art segmentation methods recently proposed in the literature.
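
    The pairwise constraints at the heart of the fusion energy are those counted by the Rand measure: two label fields agree on a pixel pair when both assign the pair to the same segment or both to different segments. A direct, deliberately naive sketch (quadratic in the number of pixels, for illustration only):

```python
# Rand index between two label fields as pairwise agreement; O(n^2),
# fine only for tiny illustrative inputs.
import itertools
import numpy as np

def rand_index(seg_a, seg_b):
    a, b = np.ravel(seg_a), np.ravel(seg_b)
    agree = total = 0
    for i, j in itertools.combinations(range(a.size), 2):
        agree += (a[i] == a[j]) == (b[i] == b[j])
        total += 1
    return agree / total

s1 = np.array([[0, 0, 1], [0, 1, 1]])   # one segmentation of a 2x3 image
s2 = np.array([[0, 0, 0], [1, 1, 1]])   # another segmentation to compare
print(f"Rand index: {rand_index(s1, s2):.3f}")
```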

  19. Three-dimensional finite element modeling of a maxillary premolar tooth based on the micro-CT scanning: a detailed description.

    PubMed

    Huang, Zheng; Chen, Zhi

    2013-10-01

    This study describes the details of how to construct a three-dimensional (3D) finite element model of a maxillary first premolar tooth based on a micro-CT data acquisition technique, MIMICS software and ANSYS software. The tooth was scanned by micro-CT, yielding 1295 slices, of which 648 were selected for modeling. The 3D surface mesh models of enamel and dentin were created in MIMICS (STL file). The solid mesh model was constructed in ANSYS. After the material properties and boundary conditions were set, a loading analysis was performed to demonstrate the applicability of the resulting model. The first and third principal stresses were then evaluated. The results showed that the finite element model comprised 56,618 nodes and 311,801 elements. The geometric form of the model was highly consistent with that of the real tooth, with a deviation of -0.28% between them. The loading analysis revealed the typical stress patterns in the contour map. The maximum compressive stress occurred at the contact points and the maximum tensile stress in the deep fissure between the two cusps. It is concluded that, using micro-CT and highly integrated software, clinical researchers can construct high-quality 3D finite element models without difficulty.
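
    The first and third principal stresses reported here are the largest and smallest eigenvalues of the symmetric stress tensor at a point. A sketch with made-up tensor values:

```python
# Principal stresses as eigenvalues of a symmetric stress tensor; the
# tensor entries below are illustrative, not values from the study.
import numpy as np

sigma = np.array([[12.0,  3.0, -1.0],   # MPa, made-up nodal stress tensor
                  [ 3.0, -8.0,  2.0],
                  [-1.0,  2.0,  4.0]])
principal = np.sort(np.linalg.eigvalsh(sigma))[::-1]
print(f"first principal (max tensile):     {principal[0]:6.2f} MPa")
print(f"third principal (max compressive): {principal[2]:6.2f} MPa")
```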

  20. Model-based object classification using unification grammars and abstract representations

    NASA Astrophysics Data System (ADS)

    Liburdy, Kathleen A.; Schalkoff, Robert J.

    1993-04-01

    The design and implementation of a high level computer vision system which performs object classification is described. General object labelling and functional analysis require models of classes which display a wide range of geometric variations. A large representational gap exists between abstract criteria such as `graspable' and current geometric image descriptions. The vision system developed and described in this work addresses this problem and implements solutions based on a fusion of semantics, unification, and formal language theory. Object models are represented using unification grammars, which provide a framework for the integration of structure and semantics. A methodology for the derivation of symbolic image descriptions capable of interacting with the grammar-based models is described and implemented. A unification-based parser developed for this system achieves object classification by determining if the symbolic image description can be unified with the abstract criteria of an object model. Future research directions are indicated.
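
    The classification step reduces to feature-structure unification: the symbolic image description unifies with an object model only if no feature conflicts. A minimal sketch of recursive unification over nested feature structures; the feature names are invented for illustration.

```python
# Recursive unification of nested feature structures, the core operation
# behind unification-grammar matching; feature names are invented here.
def unify(f1, f2):
    if isinstance(f1, dict) and isinstance(f2, dict):
        merged = dict(f1)
        for key, val in f2.items():
            merged[key] = unify(merged[key], val) if key in merged else val
            if merged[key] is None:
                return None                      # conflict below: fail
        return merged
    return f1 if f1 == f2 else None              # atoms must match exactly

model = {"shape": "cylinder", "function": {"graspable": True}}
image = {"shape": "cylinder", "function": {"graspable": True}, "height": 12}
print(unify(model, image))    # classification succeeds iff unification does
```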
