Lightweight approach to model traceability in a CASE tool
NASA Astrophysics Data System (ADS)
Vileiniskis, Tomas; Skersys, Tomas; Pavalkis, Saulius; Butleris, Rimantas; Butkiene, Rita
2017-07-01
The term "model-driven" is by no means a new buzzword within the system development community. Nevertheless, the ever-increasing complexity of model-driven approaches keeps fueling discussions around this paradigm and pushes researchers to develop new and more effective approaches to system development. With this increasing complexity, model traceability, and model management as a whole, become indispensable activities of the model-driven system development process. The main goal of this paper is to present a conceptual design and implementation of a practical lightweight approach to model traceability in a CASE tool.
NASA Astrophysics Data System (ADS)
Oskouie, M. Faraji; Ansari, R.; Rouhi, H.
2018-04-01
Eringen's nonlocal elasticity theory is extensively employed for the analysis of nanostructures because it is able to capture nanoscale effects. Previous studies have revealed that using the differential form of the strain-driven version of this theory leads to paradoxical results in some cases, such as the bending analysis of cantilevers, and recourse must be made to the integral version. In this article, a novel numerical approach is developed for the bending analysis of Euler-Bernoulli nanobeams in the context of strain- and stress-driven integral nonlocal models. This numerical approach is proposed for the direct solution, bypassing the difficulties related to converting the integral governing equation into a differential equation. First, the governing equation is derived based on both strain-driven and stress-driven nonlocal models by means of the minimum total potential energy principle. In each case, the governing equation is obtained in both strong and weak forms. To solve the derived equations numerically, matrix differential and integral operators are constructed based upon the finite difference technique and the trapezoidal integration rule. It is shown that the proposed numerical approach can be efficiently applied to the strain-driven nonlocal model with the aim of resolving the aforementioned paradoxes. It is also able to solve the problem based on the strain-driven model without the inconsistencies reported in the literature for applications of this model.
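The matrix operators the abstract mentions can be sketched concretely. Below is a minimal, illustrative construction (not the authors' code) of a finite-difference second-derivative operator and a trapezoidal-rule integration operator on a uniform grid, the two building blocks named above; the grid size and test function are assumptions.

```python
import numpy as np

# Build the two matrix operators described in the abstract on a uniform grid:
# a finite-difference differential operator and a trapezoidal integral operator.
n = 101
x = np.linspace(0.0, 1.0, n)
h = x[1] - x[0]

# Second-derivative operator via central differences (interior rows only;
# boundary rows would be filled by the boundary conditions of the beam problem).
D2 = np.zeros((n, n))
for i in range(1, n - 1):
    D2[i, i - 1], D2[i, i], D2[i, i + 1] = 1.0, -2.0, 1.0
D2 /= h ** 2

# Trapezoidal integration operator: w @ f approximates the integral of f on [0, 1].
w = np.full(n, h)
w[0] = w[-1] = h / 2.0

f = x ** 2
integral = w @ f          # ~ 1/3
second_deriv = D2 @ f     # ~ 2 at interior nodes
```

In the paper these operators are assembled into the discrete strong- and weak-form governing equations; here they are only verified on a simple polynomial.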
Data-driven Modelling for decision making under uncertainty
NASA Astrophysics Data System (ADS)
Angria S, Layla; Dwi Sari, Yunita; Zarlis, Muhammad; Tulus
2018-01-01
Decision making under uncertainty has become a lively topic in operations research. Many models have been presented, one of which is data-driven modelling (DDM). The purpose of this paper is to extract and recognize patterns in data, and to find the best model for decision-making problems under uncertainty, by using a data-driven modelling approach with linear programming, linear and nonlinear differential equations, and a Bayesian approach. Model criteria are tested to determine the smallest error; the model with the smallest error is selected as the best model.
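The selection criterion described above, fit several candidate data-driven models and keep the one with the smallest error, can be sketched in a few lines. This is an illustrative stand-in (the candidate forms and synthetic data are assumptions, not the paper's models):

```python
import numpy as np

# Fit several candidate data-driven models to the same data and select the
# one with the smallest mean squared error, as the abstract's criterion states.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = 2.0 * x + 0.5 + rng.normal(0.0, 0.05, x.size)   # synthetic observations

candidates = {
    "linear": np.polyfit(x, y, 1),
    "quadratic": np.polyfit(x, y, 2),
}
errors = {name: float(np.mean((np.polyval(c, x) - y) ** 2))
          for name, c in candidates.items()}
best = min(errors, key=errors.get)   # smallest-error model wins
```

In practice the candidate set would include the paper's linear programs, differential-equation models, and Bayesian models rather than polynomials.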
A Model-Driven Development Method for Management Information Systems
NASA Astrophysics Data System (ADS)
Mizuno, Tomoki; Matsumoto, Keinosuke; Mori, Naoki
Traditionally, a Management Information System (MIS) has been developed without formal methods. With informal methods, the MIS is developed over its lifecycle without any models, which causes many problems, such as unreliable system design specifications. In order to overcome these problems, a model theory approach was proposed, based on the idea that a system can be modeled by automata and set theory. However, it is very difficult to generate automata of the system to be developed right from the start. On the other hand, model-driven development can flexibly accommodate changes in business logic or implementation technologies. In model-driven development, a system is modeled using a modeling language such as UML. This paper proposes a new development method for management information systems that applies model-driven development to a component of the model theory approach. An experiment has shown that the method reduces development effort by more than 30%.
A Model-Driven Approach to e-Course Management
ERIC Educational Resources Information Center
Savic, Goran; Segedinac, Milan; Milenkovic, Dušica; Hrin, Tamara; Segedinac, Mirjana
2018-01-01
This paper presents research on using a model-driven approach to the development and management of electronic courses. We propose a course management system which stores a course model represented as distinct machine-readable components containing domain knowledge of different course aspects. Based on this formally defined platform-independent…
Teachers Develop CLIL Materials in Argentina: A Workshop Experience
ERIC Educational Resources Information Center
Banegas, Darío Luis
2016-01-01
Content and language integrated learning (CLIL) is a Europe-born approach. Nevertheless, CLIL as a language learning approach has been implemented in Latin America in different ways and models: content-driven models and language-driven models. As regards the latter, new school curricula demand that CLIL be used in secondary education in Argentina…
NASA Astrophysics Data System (ADS)
Skersys, Tomas; Butleris, Rimantas; Kapocius, Kestutis
2013-10-01
Approaches to the analysis and specification of business vocabularies and rules are highly relevant topics in both the Business Process Management and Information Systems Development disciplines. However, in common Information Systems Development practice, business modeling activities are still mostly empirical in nature. In this paper, basic aspects of an approach for the semi-automated extraction of business vocabularies from business process models are presented. The approach is based on the business-modeling-level OMG standards "Business Process Model and Notation" (BPMN) and "Semantics of Business Vocabulary and Business Rules" (SBVR), thus contributing to OMG's vision of Model-Driven Architecture (MDA) and to model-driven development in general.
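The extraction idea can be illustrated with a deliberately simplified sketch: derive candidate vocabulary entries (noun concepts and verb concepts, in SBVR terms) from BPMN-style task labels. Real BPMN/SBVR tooling is far richer; the "verb followed by noun phrase" label convention assumed here is an illustration only.

```python
# Derive business vocabulary candidates from BPMN-style task labels,
# assuming each label is a verb followed by a noun phrase (an assumption).
task_labels = ["Register order", "Check customer credit", "Ship order"]

terms, verb_concepts = set(), set()
for label in task_labels:
    verb, *rest = label.lower().split()
    noun = " ".join(rest)       # noun concept candidate
    terms.add(noun)
    verb_concepts.add(f"{verb} {noun}")   # verb concept candidate
```

The paper's approach works on full BPMN models (pools, lanes, data objects), not bare label strings, but the term/verb-concept split mirrors the SBVR vocabulary structure.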
Reflection of a Year Long Model-Driven Business and UI Modeling Development Project
NASA Astrophysics Data System (ADS)
Sukaviriya, Noi; Mani, Senthil; Sinha, Vibha
Model-driven software development enables users to specify an application at a high level - a level that better matches the problem domain. It also promises better analysis and automation. Our work brings together two collaborating domains - business processes and human interactions - to build an application. Business modeling expresses business operations and flows, then creates the business flow implementation. Human interaction modeling expresses a UI design and its relationship with business data, logic, and flow, and can generate a working UI. This double modeling approach automates the production of a working system with UI and business logic connected. This paper discusses the human aspects of this modeling approach after a year-long effort building a procurement outsourcing contract application with it; the result was deployed in December 2008. The paper discusses the happy endings, and some heartache, in multiple areas. We end with insights on how a model-driven approach could serve the humans in the process better.
Hanuschkin, Alexander; Kunkel, Susanne; Helias, Moritz; Morrison, Abigail; Diesmann, Markus
2010-01-01
Traditionally, event-driven simulations have been limited to the very restricted class of neuronal models for which the timing of future spikes can be expressed in closed form. Recently, the class of models that is amenable to event-driven simulation has been extended by the development of techniques to accurately calculate firing times for some integrate-and-fire neuron models that do not enable the prediction of future spikes in closed form. The motivation of this development is the general perception that time-driven simulations are imprecise. Here, we demonstrate that a globally time-driven scheme can calculate firing times that cannot be discriminated from those calculated by an event-driven implementation of the same model; moreover, the time-driven scheme incurs lower computational costs. The key insight is that time-driven methods are based on identifying a threshold crossing in the recent past, which can be implemented by a much simpler algorithm than the techniques for predicting future threshold crossings that are necessary for event-driven approaches. As run time is dominated by the cost of the operations performed at each incoming spike, which includes spike prediction in the case of event-driven simulation and retrospective detection in the case of time-driven simulation, the simple time-driven algorithm outperforms the event-driven approaches. Additionally, our method is generally applicable to all commonly used integrate-and-fire neuronal models; we show that a non-linear model employing a standard adaptive solver can reproduce a reference spike train with a high degree of precision. PMID:21031031
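The key insight above, that a time-driven scheme only needs to detect a threshold crossing retrospectively, can be sketched with a minimal leaky integrate-and-fire loop. This is an illustrative toy (forward-Euler, constant input), not the paper's NEST implementation; all parameter values are assumptions.

```python
# Time-driven leaky integrate-and-fire simulation: advance the membrane
# potential on a fixed time grid and detect threshold crossings
# retrospectively, i.e. only after they have occurred.
def simulate_lif(input_current, dt=0.1, tau=10.0, v_th=1.0, v_reset=0.0):
    v, spikes = 0.0, []
    for k, i_ext in enumerate(input_current):
        v += dt * (-v / tau + i_ext)     # forward-Euler membrane update
        if v >= v_th:                    # retrospective threshold detection
            spikes.append(k * dt)        # spike time on the grid
            v = v_reset
    return spikes

spikes = simulate_lif([0.5] * 1000)      # 100 time units of constant drive
```

Note how the per-step work is a single update plus a comparison; an event-driven scheme would instead have to predict the future crossing time at every incoming spike, which is the cost asymmetry the abstract describes.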
Formalism Challenges of the Cougaar Model Driven Architecture
NASA Technical Reports Server (NTRS)
Bohner, Shawn A.; George, Boby; Gracanin, Denis; Hinchey, Michael G.
2004-01-01
The Cognitive Agent Architecture (Cougaar) is one of the most sophisticated distributed agent architectures developed to date. As part of its research and evolution, Cougaar is being studied for application to large, logistics-based applications for the Department of Defense (DoD). Anticipating future complex applications of Cougaar, we are investigating the Model Driven Architecture (MDA) approach to understand how effective it would be for increasing productivity in Cougaar-based development efforts. Recognizing the sophistication of the Cougaar development environment and the limitations of transformation technologies for agents, we have systematically developed an approach that combines component assembly in the large and transformation in the small. This paper describes some of the key elements that went into the Cougaar Model Driven Architecture approach and the characteristics that drove the approach.
The "Village" Model: A Consumer-Driven Approach for Aging in Place
ERIC Educational Resources Information Center
Scharlach, Andrew; Graham, Carrie; Lehning, Amanda
2012-01-01
Purpose of the Study: This study examines the characteristics of the "Village" model, an innovative consumer-driven approach that aims to promote aging in place through a combination of member supports, service referrals, and consumer engagement. Design and Methods: Thirty of 42 fully operational Villages completed 2 surveys. One survey examined…
Accurate position estimation methods based on electrical impedance tomography measurements
NASA Astrophysics Data System (ADS)
Vergara, Samuel; Sbarbaro, Daniel; Johansen, T. A.
2017-08-01
Electrical impedance tomography (EIT) is a technology that estimates the electrical properties of a body or a cross section. Its main advantages are its non-invasiveness, low cost, and radiation-free operation. Estimation of the conductivity field leads to low-resolution images compared with other technologies, and entails high computational cost. However, in many applications the target information has a low intrinsic dimensionality within the conductivity field, and estimating this low-dimensional information is the problem addressed in this work. It proposes optimization-based and data-driven approaches for estimating this low-dimensional information. The accuracy of the results obtained with these approaches depends on modelling and experimental conditions. Optimization approaches are sensitive to model discretization, the type of cost function, and the search algorithm. Data-driven methods are sensitive to the assumed model structure and the data set used for parameter estimation. The system configuration and experimental conditions, such as the number of electrodes and the signal-to-noise ratio (SNR), also have an impact on the results. In order to illustrate the effects of all these factors, the position estimation of a circular anomaly is addressed. Optimization methods based on weighted error cost functions and derivative-free optimization algorithms provided the best results. Data-driven approaches based on linear models provided good estimates in this case, but the use of nonlinear models enhanced the estimation accuracy. The results obtained by optimization-based algorithms were less sensitive to experimental conditions, such as the number of electrodes and SNR, than those of data-driven approaches. Position estimation mean squared errors, under both simulated and experimental conditions, were more than twice as large for the optimization-based approaches as for the data-driven ones.
The experimental position estimation mean squared error of the data-driven models using a 16-electrode setup was less than 0.05% of the tomograph radius value. These results demonstrate that the proposed approaches can estimate an object’s position accurately based on EIT measurements if enough process information is available for training or modelling. Since they do not require complex calculations it is possible to use them in real-time applications without requiring high-performance computers.
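The data-driven route described above, learning a direct map from boundary measurements to a low-dimensional target such as an anomaly position, can be sketched with a linear model. The synthetic forward map standing in for the EIT physics, the 16-measurement setup, and the noise level are all assumptions for illustration:

```python
import numpy as np

# Learn a linear map from boundary measurements to anomaly position by
# least squares, mimicking the data-driven EIT approach with synthetic data.
rng = np.random.default_rng(1)
n_train, n_meas = 200, 16                 # e.g. a 16-electrode setup
A_true = rng.normal(size=(n_meas, 2))     # stand-in forward map (assumed)

pos_train = rng.uniform(-1.0, 1.0, size=(n_train, 2))
v_train = pos_train @ A_true.T + rng.normal(0.0, 0.01, (n_train, n_meas))

# Fit the inverse map: position ~ voltages @ W.
W, *_ = np.linalg.lstsq(v_train, pos_train, rcond=None)

pos_test = np.array([0.3, -0.4])
v_test = A_true @ pos_test                # noiseless test measurement
pos_hat = v_test @ W
```

As the abstract notes, accuracy hinges on the training data covering the operating conditions; a nonlinear model (e.g. with polynomial features) would replace the single `lstsq` fit.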
Johnson, Douglas H.; Cook, R.D.
2013-01-01
In her AAAS News & Notes piece "Can the Southwest manage its thirst?" (26 July, p. 362), K. Wren quotes Ajay Kalra, who advocates a particular method for predicting Colorado River streamflow "because it eschews complex physical climate models for a statistical data-driven modeling approach." A preference for data-driven models may be appropriate in this individual situation, but it is not so generally. Data-driven models often come with a warning against extrapolating beyond the range of the data used to develop them. When the future is like the past, data-driven models can work well for prediction, but it is easy to over-model local or transient phenomena, often leading to predictive inaccuracy (1). Mechanistic models are built on established knowledge of the process that connects the response variables with the predictors, using information obtained outside of an extant data set. One may shy away from a mechanistic approach when the underlying process is judged to be too complicated, but good predictive models can be constructed with statistical components that account for ingredients missing in the mechanistic analysis. Models with sound mechanistic components are more generally applicable and robust than data-driven models.
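The extrapolation warning above is easy to demonstrate with a toy example: a flexible data-driven fit matches the training range, but a model with the correct mechanistic form (here, exponential decay, an assumed stand-in process) extrapolates far better.

```python
import numpy as np

# A flexible polynomial (data-driven) vs. the correct functional form
# (mechanistic) fitted on [0, 1], then evaluated far outside the data range.
rng = np.random.default_rng(2)
t = np.linspace(0.0, 1.0, 30)
y = np.exp(-t) + rng.normal(0.0, 0.01, t.size)

poly = np.polyfit(t, y, 6)                      # over-flexible data-driven fit
log_slope, log_int = np.polyfit(t, np.log(np.clip(y, 1e-9, None)), 1)

t_new = 4.0                                     # far outside the data
data_driven = np.polyval(poly, t_new)
mechanistic = np.exp(log_int + log_slope * t_new)
truth = np.exp(-t_new)
```

Both models fit the training interval almost equally well; only the extrapolation separates them, which is exactly the point of the letter.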
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lu, Qingda; Gao, Xiaoyang; Krishnamoorthy, Sriram
Empirical optimizers like ATLAS have been very effective in optimizing computational kernels in libraries. The best choice of parameters such as tile size and degree of loop unrolling is determined by executing different versions of the computation. In contrast, optimizing compilers use a model-driven approach to program transformation. While the model-driven approach of optimizing compilers is generally orders of magnitude faster than ATLAS-like library generators, its effectiveness can be limited by the accuracy of the performance models used. In this paper, we describe an approach where a class of computations is modeled in terms of constituent operations that are empirically measured, thereby allowing modeling of the overall execution time. The performance model with empirically determined cost components is used to perform data layout optimization together with the selection of library calls and layout transformations in the context of the Tensor Contraction Engine, a compiler for a high-level domain-specific language for expressing computational models in quantum chemistry. The effectiveness of the approach is demonstrated through experimental measurements on representative computations from quantum chemistry.
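The ATLAS-style empirical strategy described above, run candidate versions and keep the fastest, can be sketched in miniature. The blocked-sum kernel and the candidate block sizes are stand-ins, not the paper's tensor contraction kernels:

```python
import time

# Empirically select a tuning parameter (block size) by timing candidate
# versions of a kernel and keeping the fastest, ATLAS-style.
def blocked_sum(data, block):
    total = 0
    for start in range(0, len(data), block):
        total += sum(data[start:start + block])
    return total

data = list(range(100_000))
timings = {}
for block in (64, 256, 1024, 4096):
    t0 = time.perf_counter()
    blocked_sum(data, block)
    timings[block] = time.perf_counter() - t0

best_block = min(timings, key=timings.get)
```

The paper's hybrid replaces the full search with a model whose cost components are empirically measured once, which is why it can be orders of magnitude faster than exhaustive timing.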
Huang, Weidong; Li, Kun; Wang, Gan; Wang, Yingzhe
2013-11-01
In this article, we present a newly designed inverse umbrella surface aerator, and tested its performance in driving flow of an oxidation ditch. Results show that it has a better performance in driving the oxidation ditch than the original one with higher average velocity and more uniform flow field. We also present a computational fluid dynamics model for predicting the flow field in an oxidation ditch driven by a surface aerator. The improved momentum source term approach to simulate the flow field of the oxidation ditch driven by an inverse umbrella surface aerator was developed and validated through experiments. Four kinds of turbulent models were investigated with the approach, including the standard k-ε model, RNG k-ε model, realizable k-ε model, and Reynolds stress model, and the predicted data were compared with those calculated with the multiple rotating reference frame approach (MRF) and sliding mesh approach (SM). Results of the momentum source term approach are in good agreement with the experimental data, and its prediction accuracy is better than MRF, close to SM. It is also found that the momentum source term approach has lower computational expenses, is simpler to preprocess, and is easier to use.
NASA Astrophysics Data System (ADS)
Revunova, Svetlana; Vlasenko, Vyacheslav; Bukreev, Anatoly
2017-10-01
The article proposes the models of innovative activity development, which is driven by the formation of “points of innovation-driven growth”. The models are based on the analysis of the current state and dynamics of innovative development of construction enterprises in the transport sector and take into account a number of essential organizational and economic changes in management. The authors substantiate implementing such development models as an organizational innovation that has a communication genesis. The use of the communication approach to the formation of “points of innovation-driven growth” allowed the authors to apply the mathematical tools of the graph theory in order to activate the innovative activity of the transport industry in the region. As a result, the authors have proposed models that allow constructing an optimal mechanism for the formation of “points of innovation-driven growth”.
Coates, James; Jeyaseelan, Asha K; Ybarra, Norma; David, Marc; Faria, Sergio; Souhami, Luis; Cury, Fabio; Duclos, Marie; El Naqa, Issam
2015-04-01
We explore analytical and data-driven approaches to investigate the integration of genetic variations (single nucleotide polymorphisms [SNPs] and copy number variations [CNVs]) with dosimetric and clinical variables in modeling radiation-induced rectal bleeding (RB) and erectile dysfunction (ED) in prostate cancer patients. Sixty-two patients who underwent curative hypofractionated radiotherapy (66 Gy in 22 fractions) between 2002 and 2010 were retrospectively genotyped for CNV and SNP rs5489 in the xrcc1 DNA repair gene. Fifty-four patients had full dosimetric profiles. Two parallel modeling approaches were compared to assess the risk of severe RB (Grade⩾3) and ED (Grade⩾1): maximum-likelihood-estimated generalized Lyman-Kutcher-Burman (LKB) modeling and logistic regression. Statistical resampling based on cross-validation was used to evaluate model predictive power and generalizability to unseen data. Integration of the biological variables xrcc1 CNV and SNP improved the fit of the RB and ED analytical and data-driven models. Cross-validation of the generalized LKB models yielded increases in classification performance of 27.4% for RB and 14.6% for ED when xrcc1 CNV and SNP were included, respectively. Biological variables added to logistic regression modeling improved classification performance over standard dosimetric models by 33.5% for RB and 21.2% for ED models. As a proof-of-concept, we demonstrated that the combination of genetic and dosimetric variables can provide significant improvement in NTCP prediction using analytical and data-driven approaches. The improvement in prediction performance was more pronounced in the data-driven approaches. Moreover, we have shown that CNVs, in addition to SNPs, may be useful structural genetic variants in predicting radiation toxicities. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
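The analytical side of the comparison above rests on the Lyman-Kutcher-Burman NTCP form. A hedged sketch of the generalized LKB computation follows; the formula NTCP = Φ((gEUD − TD50)/(m·TD50)) is the standard one, but all parameter values below are illustrative, not the paper's fitted estimates, and the genetic covariates are omitted.

```python
import math

# Generalized LKB NTCP: gEUD is the generalized equivalent uniform dose,
# TD50 the dose giving 50% complication probability, m the response slope.
def geud(doses, volumes, a):
    # volumes are fractional (summing to 1); a controls organ seriality
    return sum(v * d ** a for d, v in zip(doses, volumes)) ** (1.0 / a)

def lkb_ntcp(doses, volumes, td50, m, a):
    t = (geud(doses, volumes, a) - td50) / (m * td50)
    # standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

# Uniform whole-volume dose exactly at TD50 gives NTCP = 0.5 by construction.
p = lkb_ntcp([60.0], [1.0], td50=60.0, m=0.15, a=8.0)
```

In the paper, the genetic variables enter by modifying such model parameters (and, in the data-driven arm, as extra logistic regression covariates).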
NASA Astrophysics Data System (ADS)
Carton, Andrew; Driver, Cormac; Jackson, Andrew; Clarke, Siobhán
Theme/UML is an existing approach to aspect-oriented modelling that supports the modularisation and composition of concerns, including crosscutting ones, in design. To date, its lack of integration with model-driven engineering (MDE) techniques has limited its benefits across the development lifecycle. Here, we describe our work on facilitating the use of Theme/UML as part of an MDE process. We have developed a transformation tool that adopts model-driven architecture (MDA) standards. It defines a concern composition mechanism, implemented as a model transformation, to support the enhanced modularisation features of Theme/UML. We evaluate our approach by applying it to the development of mobile, context-aware applications-an application area characterised by many non-functional requirements that manifest themselves as crosscutting concerns.
NASA Astrophysics Data System (ADS)
Gaševic, Dragan; Djuric, Dragan; Devedžic, Vladan
A relevant initiative from the software engineering community called Model Driven Engineering (MDE) is being developed in parallel with the Semantic Web (Mellor et al. 2003a). The MDE approach to software development suggests that one should first develop a model of the system under study, which is then transformed into the real thing (i.e., an executable software entity). The most important research initiative in this area is the Model Driven Architecture (MDA), which is being developed under the umbrella of the Object Management Group (OMG). This chapter describes the basic concepts of this software engineering effort.
Biomimetics and the case of the remarkable ragworms.
Hesselberg, Thomas
2007-08-01
Biomimetics is a rapidly growing field both as an academic and as an applied discipline. This paper gives a short introduction to the current status of the discipline before it describes three approaches to biomimetics: the mechanism-driven, which is based on the study of a specific mechanism; the focused organism-driven, which is based on the study of one function in a model organism; and the integrative organism-driven approach, where multiple functions of a model organism provide inspiration. The first two are established approaches and include many modern studies and the famous biomimetic discoveries of Velcro and the Lotus-Effect, whereas the last approach is not yet well recognized. The advantages of the integrative organism-driven approach are discussed using the ragworms as a case study. A morphological and locomotory study of these marine polychaetes reveals their biomimetic potential, which includes using their ability to move in slippery substrates as inspiration for novel endoscopes, using their compound setae as models for passive friction structures and using their three gaits, slow crawling, fast crawling, and swimming as well as their rapid burrowing technique to provide inspiration for the design of displacement pumps and multifunctional robots.
A data driven control method for structure vibration suppression
NASA Astrophysics Data System (ADS)
Xie, Yangmin; Wang, Chao; Shi, Hang; Shi, Junwei
2018-02-01
High radio-frequency space applications have motivated continuous research on vibration suppression of large space structures in both academia and industry. This paper introduces a novel data-driven control method to suppress vibrations of flexible structures and experimentally validates the suppression performance. Unlike model-based control approaches, the data-driven control method designs a controller directly from the input-output test data of the structure, without requiring parametric dynamics, and is hence free of system modeling. It uses the discrete frequency response obtained via spectral analysis and formulates a non-convex optimization problem to obtain optimized controller parameters for a predefined controller structure. The approach is then experimentally applied to an end-driven flexible beam-mass structure. The experimental results show that the presented method can achieve disturbance rejection competitive with a model-based mixed-sensitivity controller under the same design criterion, but with much lower controller order and design effort, demonstrating that the proposed data-driven control is an effective approach for vibration suppression of flexible structures.
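The first step of the design described above, estimating the plant's frequency response directly from input-output test data via spectral analysis, can be sketched as follows. The first-order plant used to generate the data is purely illustrative; a real run would use the structure's measured test data and typically Welch-style spectral averaging.

```python
import numpy as np

# Estimate an empirical frequency response function (FRF) from input-output
# data as the ratio of cross- to auto-spectrum, with no parametric model.
rng = np.random.default_rng(4)
n = 4096
u = rng.normal(size=n)                 # broadband excitation signal

a = 0.9                                # illustrative plant: y[k] = a*y[k-1] + u[k-1]
y = np.zeros(n)
for k in range(1, n):
    y[k] = a * y[k - 1] + u[k - 1]

U, Y = np.fft.rfft(u), np.fft.rfft(y)
frf = (Y * np.conj(U)) / (U * np.conj(U))   # empirical FRF, bin by bin

# True response of the illustrative plant at one frequency bin, for comparison.
w = 2.0 * np.pi * 100 / n
true_gain = abs(np.exp(-1j * w) / (1.0 - a * np.exp(-1j * w)))
```

The paper's method then feeds such an FRF into a non-convex parameter optimization for a fixed controller structure; that second stage is omitted here.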
NASA Astrophysics Data System (ADS)
Forkel, Matthias; Dorigo, Wouter; Lasslop, Gitta; Teubner, Irene; Chuvieco, Emilio; Thonicke, Kirsten
2017-12-01
Vegetation fires affect human infrastructure, ecosystems, global vegetation distribution, and atmospheric composition. However, the climatic, environmental, and socioeconomic factors that control global fire activity in vegetation are only poorly understood, and they are represented in global process-oriented vegetation-fire models with varying complexity and formulation. Data-driven model approaches such as machine learning algorithms have successfully been used to identify and better understand the factors controlling fire activity. However, such machine learning models cannot easily be adapted to, or even implemented within, process-oriented global vegetation-fire models. To bridge this gap between machine learning-based approaches and process-oriented global fire models, we introduce a new flexible data-driven fire modelling approach (Satellite Observations to predict FIre Activity, SOFIA, approach version 1). SOFIA models can use several predictor variables and functional relationships to estimate burned area, and can be easily integrated with more complex process-oriented vegetation-fire models. We created an ensemble of SOFIA models to test the importance of several predictor variables. SOFIA models achieve the highest performance in predicting burned area if they account for a direct restriction of fire activity under wet conditions, and if they include a land cover-dependent restriction or allowance of fire activity by vegetation density and biomass. The use of vegetation optical depth data from microwave satellite observations, a proxy for vegetation biomass and water content, yields higher model performance than commonly used vegetation variables from optical sensors. We further analyse spatial patterns of the sensitivity between anthropogenic, climate, and vegetation predictor variables and burned area.
We finally discuss how multiple observational datasets on climate, hydrological, vegetation, and socioeconomic variables together with data-driven modelling and model-data integration approaches can guide the future development of global process-oriented vegetation-fire models.
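The SOFIA idea described above, burned area as a product of simple functional responses to predictor variables, can be sketched schematically. The logistic response forms, thresholds, and the wetness/vegetation-density predictors below are illustrative assumptions, not SOFIA v1's fitted relationships:

```python
import math

# Burned-area fraction as a product of functional responses: fire is
# suppressed under wet conditions and requires sufficient fuel (vegetation).
def logistic(x, x0, k):
    return 1.0 / (1.0 + math.exp(-k * (x - x0)))

def burned_area_fraction(wetness, vegetation_density, max_fraction=0.02):
    dry_factor = 1.0 - logistic(wetness, 0.5, 10.0)        # wet -> less fire
    fuel_factor = logistic(vegetation_density, 0.3, 10.0)  # need biomass
    return max_fraction * dry_factor * fuel_factor

dry_fuelled = burned_area_fraction(0.1, 0.8)   # dry, well-vegetated cell
wet = burned_area_fraction(0.9, 0.8)           # wet cell, same vegetation
```

Because each factor is an explicit, interpretable function of one predictor, such a model can be transplanted into a process-oriented vegetation-fire model in a way a black-box machine learning model cannot.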
Data Driven Model Development for the Supersonic Semispan Transport (S(sup 4)T)
NASA Technical Reports Server (NTRS)
Kukreja, Sunil L.
2011-01-01
We investigate two common approaches to model development for robust control synthesis in the aerospace community: namely, reduced-order aeroservoelastic modelling based on structural finite-element and computational fluid dynamics based aerodynamic models, and a data-driven system identification procedure. Analysis of experimental SuperSonic SemiSpan Transport (S4T) wind-tunnel data with a system identification approach shows that it is possible to estimate a model at a fixed Mach number that is parsimonious and robust across varying dynamic pressures.
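The data-driven route mentioned above, fitting a parsimonious model from input-output records, can be sketched with a least-squares ARX identification. The second-order system generating the data is a stand-in for the S4T wind-tunnel records, and the "true" parameters are illustrative:

```python
import numpy as np

# Identify a parsimonious ARX model, y[k] = a1*y[k-1] + a2*y[k-2] + b1*u[k-1],
# from input-output data by linear least squares.
rng = np.random.default_rng(5)
n = 2000
u = rng.normal(size=n)                 # excitation input
y = np.zeros(n)
a1, a2, b1 = 1.5, -0.7, 0.5            # illustrative "true" parameters
for k in range(2, n):
    y[k] = a1 * y[k - 1] + a2 * y[k - 2] + b1 * u[k - 1]

# Regressor matrix: each row is [y[k-1], y[k-2], u[k-1]] for k = 2..n-1.
Phi = np.column_stack([y[1:-1], y[:-2], u[1:-1]])
theta, *_ = np.linalg.lstsq(Phi, y[2:], rcond=None)   # recovers (a1, a2, b1)
```

Model order selection (keeping the model parsimonious) and validation against data at other dynamic pressures would follow the fit; both are omitted here.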
Combining Domain-driven Design and Mashups for Service Development
NASA Astrophysics Data System (ADS)
Iglesias, Carlos A.; Fernández-Villamor, José Ignacio; Del Pozo, David; Garulli, Luca; García, Boni
This chapter presents the Romulus project approach to Service Development using Java-based web technologies. Romulus aims at improving productivity of service development by providing a tool-supported model to conceive Java-based web applications. This model follows a Domain Driven Design approach, which states that the primary focus of software projects should be the core domain and domain logic. Romulus proposes a tool-supported model, Roma Metaframework, that provides an abstraction layer on top of existing web frameworks and automates the application generation from the domain model. This metaframework follows an object centric approach, and complements Domain Driven Design by identifying the most common cross-cutting concerns (security, service, view, ...) of web applications. The metaframework uses annotations for enriching the domain model with these cross-cutting concerns, so-called aspects. In addition, the chapter presents the usage of mashup technology in the metaframework for service composition, using the web mashup editor MyCocktail. This approach is applied to a scenario of the Mobile Phone Service Portability case study for the development of a new service.
Driven-dissipative quantum Monte Carlo method for open quantum systems
NASA Astrophysics Data System (ADS)
Nagy, Alexandra; Savona, Vincenzo
2018-05-01
We develop a real-time full configuration-interaction quantum Monte Carlo approach to model driven-dissipative open quantum systems with Markovian system-bath coupling. The method enables stochastic sampling of the Liouville-von Neumann time evolution of the density matrix thanks to a massively parallel algorithm, thus providing estimates of observables on the nonequilibrium steady state. We present the underlying theory and introduce an initiator technique and importance sampling to reduce the statistical error. Finally, we demonstrate the efficiency of our approach by applying it to the driven-dissipative two-dimensional XYZ spin-1/2 model on a lattice.
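The Liouville-von Neumann (Lindblad) dynamics sampled by the method above can be illustrated with a minimal deterministic integration for a single driven-dissipative two-level system. The Hamiltonian, decay rate, and step sizes below are illustrative assumptions, not the paper's setup, and the paper's stochastic FCIQMC sampling is replaced here by plain Euler time stepping:

```python
import numpy as np

# Pauli-type operators for a two-level system; |0> = [1, 0].
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sm = np.array([[0, 1], [0, 0]], dtype=complex)   # lowering operator |1> -> |0>

def lindblad_rhs(rho, H, L, gamma):
    """drho/dt = -i[H, rho] + gamma (L rho L+ - 1/2 {L+L, rho})."""
    comm = H @ rho - rho @ H
    LdL = L.conj().T @ L
    diss = L @ rho @ L.conj().T - 0.5 * (LdL @ rho + rho @ LdL)
    return -1j * comm + gamma * diss

def evolve(rho0, H, L, gamma, dt=1e-3, steps=20000):
    rho = rho0.copy()
    for _ in range(steps):
        rho = rho + dt * lindblad_rhs(rho, H, L, gamma)   # explicit Euler step
    return rho

H = 0.5 * sx                                      # assumed coherent drive term
rho0 = np.array([[1, 0], [0, 0]], dtype=complex)  # start in the ground state
rho_ss = evolve(rho0, H, sm, gamma=1.0)
print(np.trace(rho_ss).real)                      # Lindblad evolution preserves the trace
```

Evolving long past the relaxation time 1/gamma approximates the nonequilibrium steady state whose observables the Monte Carlo method estimates stochastically.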
Model‐Based Approach to Predict Adherence to Protocol During Antiobesity Trials
Sharma, Vishnu D.; Combes, François P.; Vakilynejad, Majid; Lahu, Gezim; Lesko, Lawrence J.
2017-01-01
Development of antiobesity drugs is continuously challenged by high dropout rates during clinical trials. The objective was to develop a population pharmacodynamic model that describes the temporal changes in body weight, considering disease progression, lifestyle intervention, and drug effects. Markov modeling (MM) was applied for quantification and characterization of responder and nonresponder as key drivers of dropout rates, to ultimately support the clinical trial simulations and the outcome in terms of trial adherence. Subjects (n = 4591) from 6 Contrave® trials were included in this analysis. An indirect‐response model developed by van Wart et al was used as a starting point. Inclusion of drug effect was dose driven using a population dose‐ and time‐dependent pharmacodynamic (DTPD) model. Additionally, a population‐pharmacokinetic parameter‐ and data (PPPD)‐driven model was developed using the final DTPD model structure and final parameter estimates from a previously developed population pharmacokinetic model based on available Contrave® pharmacokinetic concentrations. Last, MM was developed to predict transition rate probabilities among responder, nonresponder, and dropout states driven by the pharmacodynamic effect resulting from the DTPD or PPPD model. Covariates included in the models and parameters were diabetes mellitus and race. The linked DTPD‐MM and PPPD‐MM were able to predict transition rates among responder, nonresponder, and dropout states well. The analysis concluded that body‐weight change is an important factor influencing dropout rates, and the MM depicted that overall a DTPD model‐driven approach provides a reasonable prediction of clinical trial outcome probabilities similar to a pharmacokinetic‐driven approach. PMID:28858397
Human systems immunology: hypothesis-based modeling and unbiased data-driven approaches.
Arazi, Arnon; Pendergraft, William F; Ribeiro, Ruy M; Perelson, Alan S; Hacohen, Nir
2013-10-31
Systems immunology is an emerging paradigm that aims at a more systematic and quantitative understanding of the immune system. Two major approaches have been utilized to date in this field: unbiased data-driven modeling to comprehensively identify molecular and cellular components of a system and their interactions; and hypothesis-based quantitative modeling to understand the operating principles of a system by extracting a minimal set of variables and rules underlying them. In this review, we describe applications of the two approaches to the study of viral infections and autoimmune diseases in humans, and discuss possible ways by which these two approaches can synergize when applied to human immunology. Copyright © 2012 Elsevier Ltd. All rights reserved.
Model-Based Approach to Predict Adherence to Protocol During Antiobesity Trials.
Sharma, Vishnu D; Combes, François P; Vakilynejad, Majid; Lahu, Gezim; Lesko, Lawrence J; Trame, Mirjam N
2018-02-01
Development of antiobesity drugs is continuously challenged by high dropout rates during clinical trials. The objective was to develop a population pharmacodynamic model that describes the temporal changes in body weight, considering disease progression, lifestyle intervention, and drug effects. Markov modeling (MM) was applied for quantification and characterization of responder and nonresponder as key drivers of dropout rates, to ultimately support the clinical trial simulations and the outcome in terms of trial adherence. Subjects (n = 4591) from 6 Contrave® trials were included in this analysis. An indirect-response model developed by van Wart et al was used as a starting point. Inclusion of drug effect was dose driven using a population dose- and time-dependent pharmacodynamic (DTPD) model. Additionally, a population-pharmacokinetic parameter- and data (PPPD)-driven model was developed using the final DTPD model structure and final parameter estimates from a previously developed population pharmacokinetic model based on available Contrave® pharmacokinetic concentrations. Last, MM was developed to predict transition rate probabilities among responder, nonresponder, and dropout states driven by the pharmacodynamic effect resulting from the DTPD or PPPD model. Covariates included in the models and parameters were diabetes mellitus and race. The linked DTPD-MM and PPPD-MM were able to predict transition rates among responder, nonresponder, and dropout states well. The analysis concluded that body-weight change is an important factor influencing dropout rates, and the MM depicted that overall a DTPD model-driven approach provides a reasonable prediction of clinical trial outcome probabilities similar to a pharmacokinetic-driven approach. © 2017, The Authors. The Journal of Clinical Pharmacology published by Wiley Periodicals, Inc. on behalf of American College of Clinical Pharmacology.
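The responder/nonresponder/dropout structure described above can be sketched as a three-state Markov chain with an absorbing dropout state. The transition probabilities below are hypothetical placeholders (in the paper's model they are driven by the pharmacodynamic weight-change effect), shown only to illustrate how a state distribution propagates over a trial:

```python
import numpy as np

# States: 0 = responder, 1 = nonresponder, 2 = dropout (absorbing).
# Weekly transition probabilities are hypothetical placeholders; in the
# paper they are driven by the pharmacodynamic (weight-change) effect.
P = np.array([
    [0.85, 0.10, 0.05],   # from responder
    [0.15, 0.70, 0.15],   # from nonresponder
    [0.00, 0.00, 1.00],   # dropout is absorbing
])

def state_distribution(p0, P, n_steps):
    """Propagate a state distribution through n_steps Markov transitions."""
    p = np.asarray(p0, dtype=float)
    for _ in range(n_steps):
        p = p @ P
    return p

p52 = state_distribution([0.5, 0.5, 0.0], P, 52)
print("P(dropout) after 52 weeks:", round(float(p52[2]), 3))
```

Clinical trial simulation then amounts to sampling subject trajectories from such a chain, with the transition rates updated each step from the fitted pharmacodynamic model.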
Data Driven Model Development for the SuperSonic SemiSpan Transport (S4T)
NASA Technical Reports Server (NTRS)
Kukreja, Sunil L.
2011-01-01
In this report, we will investigate two common approaches to model development for robust control synthesis in the aerospace community; namely, reduced order aeroservoelastic modelling based on structural finite-element and computational fluid dynamics based aerodynamic models, and a data-driven system identification procedure. It is shown via analysis of experimental SuperSonic SemiSpan Transport (S4T) wind-tunnel data that by using a system identification approach it is possible to estimate a parsimonious model at a fixed Mach number that is robust across varying dynamic pressures.
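A minimal sketch of the data-driven system identification route: simulate a hypothetical second-order ARX system and recover a parsimonious model by linear least squares. The system coefficients and data are synthetic illustrations, not the S4T wind-tunnel measurements:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a hypothetical, stable 2nd-order ARX system:
# y[k] = a1*y[k-1] + a2*y[k-2] + b1*u[k-1] + noise
a1, a2, b1 = 1.5, -0.7, 0.5
u = rng.standard_normal(2000)
y = np.zeros(2000)
for k in range(2, 2000):
    y[k] = a1 * y[k - 1] + a2 * y[k - 2] + b1 * u[k - 1] + 0.01 * rng.standard_normal()

# Build the regressor matrix [y[k-1], y[k-2], u[k-1]] and solve by least squares.
Phi = np.column_stack([y[1:-1], y[:-2], u[1:-1]])
theta, *_ = np.linalg.lstsq(Phi, y[2:], rcond=None)
print("estimated [a1, a2, b1]:", np.round(theta, 3))
```

The same idea scales up: fix the flight condition (Mach), choose a low model order, and check that the identified coefficients remain stable as the operating parameter (dynamic pressure) varies.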
A Model-Driven Approach for Telecommunications Network Services Definition
NASA Astrophysics Data System (ADS)
Chiprianov, Vanea; Kermarrec, Yvon; Alff, Patrick D.
Present day Telecommunications market imposes a short concept-to-market time for service providers. To reduce it, we propose a computer-aided, model-driven, service-specific tool, with support for collaborative work and for checking properties on models. We started by defining a prototype of the Meta-model (MM) of the service domain. Using this prototype, we defined a simple graphical modeling language specific for service designers. We are currently enlarging the MM of the domain using model transformations from Network Abstractions Layers (NALs). In the future, we will investigate approaches to ensure the support for collaborative work and for checking properties on models.
Use case driven approach to develop simulation model for PCS of APR1400 simulator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dong Wook, Kim; Hong Soo, Kim; Hyeon Tae, Kang
2006-07-01
The full-scope simulator is being developed to evaluate specific design features and to support the iterative design and validation in the Man-Machine Interface System (MMIS) design of Advanced Power Reactor (APR) 1400. The simulator consists of process model, control logic model, and MMI for the APR1400 as well as the Power Control System (PCS). In this paper, a use case driven approach is proposed to develop a simulation model for PCS. In this approach, a system is considered from the point of view of its users. User's view of the system is based on interactions with the system and the resultant responses. In the use case driven approach, we initially consider the system as a black box and look at its interactions with the users. From these interactions, use cases of the system are identified. Then the system is modeled using these use cases as functions. Lower levels expand the functionalities of each of these use cases. Hence, starting from the topmost level view of the system, we proceeded down to the lowest level (the internal view of the system). The model of the system thus developed is use case driven. This paper will introduce the functionality of the PCS simulation model, including a requirement analysis based on use case and the validation result of development of PCS model. The PCS simulation model using use case will be first used during the full-scope simulator development for nuclear power plant and will be supplied to the Shin-Kori 3 and 4 plants. The use case based simulation model development can be useful for the design and implementation of simulation models. (authors)
Data-driven integration of genome-scale regulatory and metabolic network models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Imam, Saheed; Schauble, Sascha; Brooks, Aaron N.
Microbes are diverse and extremely versatile organisms that play vital roles in all ecological niches. Understanding and harnessing microbial systems will be key to the sustainability of our planet. One approach to improving our knowledge of microbial processes is through data-driven and mechanism-informed computational modeling. Individual models of biological networks (such as metabolism, transcription, and signaling) have played pivotal roles in driving microbial research through the years. These networks, however, are highly interconnected and function in concert, a fact that has led to the development of a variety of approaches aimed at simulating the integrated functions of two or more network types. Though the task of integrating these different models is fraught with new challenges, the large amounts of high-throughput data sets being generated, and algorithms being developed, means that the time is at hand for concerted efforts to build integrated regulatory-metabolic networks in a data-driven fashion. Lastly, in this perspective, we review current approaches for constructing integrated regulatory-metabolic models and outline new strategies for future development of these network models for any microbial system.
Data-driven integration of genome-scale regulatory and metabolic network models
Imam, Saheed; Schauble, Sascha; Brooks, Aaron N.; ...
2015-05-05
Microbes are diverse and extremely versatile organisms that play vital roles in all ecological niches. Understanding and harnessing microbial systems will be key to the sustainability of our planet. One approach to improving our knowledge of microbial processes is through data-driven and mechanism-informed computational modeling. Individual models of biological networks (such as metabolism, transcription, and signaling) have played pivotal roles in driving microbial research through the years. These networks, however, are highly interconnected and function in concert, a fact that has led to the development of a variety of approaches aimed at simulating the integrated functions of two or more network types. Though the task of integrating these different models is fraught with new challenges, the large amounts of high-throughput data sets being generated, and algorithms being developed, means that the time is at hand for concerted efforts to build integrated regulatory-metabolic networks in a data-driven fashion. Lastly, in this perspective, we review current approaches for constructing integrated regulatory-metabolic models and outline new strategies for future development of these network models for any microbial system.
Reverse engineering biomolecular systems using -omic data: challenges, progress and opportunities.
Quo, Chang F; Kaddi, Chanchala; Phan, John H; Zollanvari, Amin; Xu, Mingqing; Wang, May D; Alterovitz, Gil
2012-07-01
Recent advances in high-throughput biotechnologies have led to the rapid growing research interest in reverse engineering of biomolecular systems (REBMS). 'Data-driven' approaches, i.e. data mining, can be used to extract patterns from large volumes of biochemical data at molecular-level resolution while 'design-driven' approaches, i.e. systems modeling, can be used to simulate emergent system properties. Consequently, both data- and design-driven approaches applied to -omic data may lead to novel insights in reverse engineering biological systems that could not be expected before using low-throughput platforms. However, there exist several challenges in this fast growing field of reverse engineering biomolecular systems: (i) to integrate heterogeneous biochemical data for data mining, (ii) to combine top-down and bottom-up approaches for systems modeling and (iii) to validate system models experimentally. In addition to reviewing progress made by the community and opportunities encountered in addressing these challenges, we explore the emerging field of synthetic biology, which is an exciting approach to validate and analyze theoretical system models directly through experimental synthesis, i.e. analysis-by-synthesis. The ultimate goal is to address the present and future challenges in reverse engineering biomolecular systems (REBMS) using integrated workflow of data mining, systems modeling and synthetic biology.
A model-driven approach to information security compliance
NASA Astrophysics Data System (ADS)
Correia, Anacleto; Gonçalves, António; Teodoro, M. Filomena
2017-06-01
The availability, integrity and confidentiality of information are fundamental to the long-term survival of any organization. Information security is a complex issue that must be holistically approached, combining assets that support corporate systems, in an extended network of business partners, vendors, customers and other stakeholders. This paper addresses the conception and implementation of information security systems, conforming to the ISO/IEC 27000 set of standards, using the model-driven approach. The process begins with the conception of a domain-level model (computation independent model) based on the information security vocabulary present in the ISO/IEC 27001 standard. Based on this model, after embedding mandatory rules for attaining ISO/IEC 27001 conformance, a platform independent model is derived. Finally, a platform specific model serves as the basis for testing the compliance of information security systems with the ISO/IEC 27000 set of standards.
Linking Goal-Oriented Requirements and Model-Driven Development
NASA Astrophysics Data System (ADS)
Pastor, Oscar; Giachetti, Giovanni
In the context of Goal-Oriented Requirement Engineering (GORE) there are interesting modeling approaches for the analysis of complex scenarios that are oriented to obtain and represent the relevant requirements for the development of software products. However, the way to use these GORE models in an automated Model-Driven Development (MDD) process is not clear, and, in general terms, the translation of these models into the final software products is still manually performed. Therefore, in this chapter, we show an approach to automatically link GORE models and MDD processes, which has been elaborated by considering the experience obtained from linking the i * framework with an industrially applied MDD approach. The linking approach proposed is formulated by means of a generic process that is based on current modeling standards and technologies in order to facilitate its application for different MDD and GORE approaches. Special attention is paid to how this process generates appropriate model transformation mechanisms to automatically obtain MDD conceptual models from GORE models, and how it can be used to specify validation mechanisms to assure the correct model transformations.
NASA Technical Reports Server (NTRS)
Narasimhan, Sriram; Roychoudhury, Indranil; Balaban, Edward; Saxena, Abhinav
2010-01-01
Model-based diagnosis typically uses analytical redundancy to compare predictions from a model against observations from the system being diagnosed. However, this approach does not work very well when it is not feasible to create analytic relations describing all the observed data, e.g., for vibration data, which is usually sampled at very high rates and requires very detailed finite element models to describe its behavior. In such cases, features (in time and frequency domains) that contain diagnostic information are extracted from the data. Since this is a computationally intensive process, it is not efficient to extract all the features all the time. In this paper we present an approach that combines the analytic model-based and feature-driven diagnosis approaches. The analytic approach is used to reduce the set of possible faults and then features are chosen to best distinguish among the remaining faults. We describe an implementation of this approach on the Flyable Electro-mechanical Actuator (FLEA) test bed.
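The combined strategy, analytic consistency checks to prune the fault set followed by feature selection over the survivors, can be sketched as below. All residual names, fault labels, and separability scores are hypothetical, standing in for the model predictions and learned feature statistics:

```python
# Hypothetical two-stage diagnosis: cheap analytic residuals prune the
# candidate fault set; an expensive signal feature is then chosen to
# distinguish the remaining faults.

# Stage 1: faults consistent with each triggered analytic residual.
residual_signatures = {
    "r_position": {"motor_fault", "sensor_bias", "gear_wear"},
    "r_current":  {"motor_fault", "gear_wear"},
}
triggered = ["r_position", "r_current"]
candidates = set.intersection(*(residual_signatures[r] for r in triggered))

# Stage 2: separability score of each candidate feature for the surviving
# fault set -- placeholder numbers standing in for learned values.
feature_separability = {
    ("spectral_peak", frozenset({"motor_fault", "gear_wear"})): 0.9,
    ("rms_vibration", frozenset({"motor_fault", "gear_wear"})): 0.4,
}
best = max(
    (f for (f, s) in feature_separability if s == frozenset(candidates)),
    key=lambda f: feature_separability[(f, frozenset(candidates))],
)
print(candidates, "->", best)
```

Only the single best-discriminating feature is then extracted from the raw signal, which is the efficiency gain the paper describes.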
Data-Driven Learning of Q-Matrix
ERIC Educational Resources Information Center
Liu, Jingchen; Xu, Gongjun; Ying, Zhiliang
2012-01-01
The recent surge of interests in cognitive assessment has led to developments of novel statistical models for diagnostic classification. Central to many such models is the well-known "Q"-matrix, which specifies the item-attribute relationships. This article proposes a data-driven approach to identification of the "Q"-matrix and estimation of…
Putting the psychology back into psychological models: mechanistic versus rational approaches.
Sakamoto, Yasuaki; Jones, Matt; Love, Bradley C
2008-09-01
Two basic approaches to explaining the nature of the mind are the rational and the mechanistic approaches. Rational analyses attempt to characterize the environment and the behavioral outcomes that humans seek to optimize, whereas mechanistic models attempt to simulate human behavior using processes and representations analogous to those used by humans. We compared these approaches with regard to their accounts of how humans learn the variability of categories. The mechanistic model departs in subtle ways from rational principles. In particular, the mechanistic model incrementally updates its estimates of category means and variances through error-driven learning, based on discrepancies between new category members and the current representation of each category. The model yields a prediction, which we verify, regarding the effects of order manipulations that the rational approach does not anticipate. Although both rational and mechanistic models can successfully postdict known findings, we suggest that psychological advances are driven primarily by consideration of process and representation and that rational accounts trail these breakthroughs.
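The incremental, error-driven update of category means and variances described above can be sketched as a delta rule; the learning rate and data here are illustrative assumptions. Because each update depends on the current estimate, later items are weighted more heavily, which is the source of the order effects the mechanistic model predicts:

```python
import numpy as np

def update(mean, var, x, lr=0.05):
    """Error-driven (delta-rule) update of a category's mean and variance,
    based on the discrepancy between a new member x and current estimates."""
    err = x - mean
    mean = mean + lr * err                 # move the mean toward x
    var = var + lr * (err ** 2 - var)      # move the variance toward err^2
    return mean, var

rng = np.random.default_rng(1)
mean, var = 0.0, 1.0                       # deliberately wrong initial beliefs
for x in rng.normal(10.0, 2.0, size=5000): # category: true mean 10, sd 2
    mean, var = update(mean, var, x)
print(round(mean, 2), round(var, 2))
```

A rational estimator would weight all items equally; presenting the same items in a different order leaves it unchanged but shifts the delta-rule estimates, which is the kind of prediction the paper verifies.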
Chen, Gang; Glen, Daniel R.; Saad, Ziad S.; Hamilton, J. Paul; Thomason, Moriah E.; Gotlib, Ian H.; Cox, Robert W.
2011-01-01
Vector autoregression (VAR) and structural equation modeling (SEM) are two popular brain-network modeling tools. VAR, which is a data-driven approach, assumes that connected regions exert time-lagged influences on one another. In contrast, the hypothesis-driven SEM is used to validate an existing connectivity model where connected regions have contemporaneous interactions among them. We present the two models in detail and discuss their applicability to FMRI data, and interpretational limits. We also propose a unified approach that models both lagged and contemporaneous effects. The unifying model, structural vector autoregression (SVAR), may improve statistical and explanatory power, and avoids some prevalent pitfalls that can occur when VAR and SEM are utilized separately. PMID:21975109
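The VAR side of the comparison can be sketched as a lag-1 model fit by least squares on synthetic two-region data; the coefficient matrix below is a hypothetical example, not an FMRI result:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical 2-region VAR(1): x[t] = A x[t-1] + noise, i.e. connected
# regions exert time-lagged influences on one another.
A_true = np.array([[0.5, 0.2],
                   [0.0, 0.4]])
T = 5000
x = np.zeros((T, 2))
for t in range(1, T):
    x[t] = A_true @ x[t - 1] + 0.1 * rng.standard_normal(2)

# Least-squares estimate of the lag-1 coefficient matrix.
A_hat = np.linalg.lstsq(x[:-1], x[1:], rcond=None)[0].T
print(np.round(A_hat, 2))
```

SEM, by contrast, constrains contemporaneous (lag-0) paths from a prior model; SVAR as described above augments the lagged matrix with a contemporaneous one so both kinds of influence are estimated jointly.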
NASA Astrophysics Data System (ADS)
Barretta, Raffaele; Fabbrocino, Francesco; Luciano, Raimondo; Sciarra, Francesco Marotti de
2018-03-01
Strain-driven and stress-driven integral elasticity models are formulated for the analysis of the structural behaviour of functionally graded nano-beams. An innovative stress-driven two-phase constitutive mixture defined by a convex combination of local and nonlocal phases is presented. The analysis reveals that the Eringen strain-driven fully nonlocal model cannot be used in Structural Mechanics since it is ill-posed, and the local-nonlocal mixtures based on the Eringen integral model only partially resolve the ill-posedness of the model. In fact, a singular behaviour of continuous nano-structures appears if the local fraction tends to vanish, so that the ill-posedness of the Eringen integral model is not eliminated. On the contrary, local-nonlocal mixtures based on the stress-driven theory are mathematically and mechanically appropriate for nanosystems. Exact solutions of inflected functionally graded nanobeams of technical interest are established by adopting the new local-nonlocal mixture stress-driven integral relation. Effectiveness of the new nonlocal approach is tested by comparing the contributed results with the ones corresponding to the mixture Eringen theory.
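In uniaxial schematic form (with elastic modulus $E$, averaging kernel $\phi_\lambda$ of nonlocal length-scale $\lambda$, and mixture fractions $\xi_1 + \xi_2 = 1$; notation generic, not the paper's exact expressions), the two constitutive mixtures contrasted above read:

```latex
% strain-driven (Eringen-type) two-phase mixture
\sigma(x) \;=\; \xi_1\, E\,\varepsilon(x)
  \;+\; \xi_2 \int_0^L \phi_\lambda(x-\bar{x})\, E\,\varepsilon(\bar{x})\, \mathrm{d}\bar{x}

% stress-driven two-phase mixture
\varepsilon(x) \;=\; \frac{\xi_1}{E}\,\sigma(x)
  \;+\; \frac{\xi_2}{E} \int_0^L \phi_\lambda(x-\bar{x})\, \sigma(\bar{x})\, \mathrm{d}\bar{x},
\qquad \xi_1 + \xi_2 = 1
```

The singular limit discussed in the abstract is $\xi_1 \to 0$, where the strain-driven mixture degenerates to the ill-posed fully nonlocal Eringen model, while the stress-driven mixture remains well-posed.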
DOE Office of Scientific and Technical Information (OSTI.GOV)
Santillán, David; Juanes, Ruben; Cueto-Felgueroso, Luis
Propagation of fluid-driven fractures plays an important role in natural and engineering processes, including transport of magma in the lithosphere, geologic sequestration of carbon dioxide, and oil and gas recovery from low-permeability formations, among many others. The simulation of fracture propagation poses a computational challenge as a result of the complex physics of fracture and the need to capture disparate length scales. Phase field models represent fractures as a diffuse interface and enjoy the advantage that fracture nucleation, propagation, branching, or twisting can be simulated without ad hoc computational strategies like remeshing or local enrichment of the solution space. Here we propose a new quasi-static phase field formulation for modeling fluid-driven fracturing in elastic media at small strains. The approach fully couples the fluid flow in the fracture (described via the Reynolds lubrication approximation) and the deformation of the surrounding medium. The flow is solved on a lower dimensionality mesh immersed in the elastic medium. This approach leads to accurate coupling of both physics. We assessed the performance of the model extensively by comparing results for the evolution of fracture length, aperture, and fracture fluid pressure against analytical solutions under different fracture propagation regimes. Thus, the excellent performance of the numerical model in all regimes builds confidence in the applicability of phase field approaches to simulate fluid-driven fracture.
Santillán, David; Juanes, Ruben; Cueto-Felgueroso, Luis
2017-04-20
Propagation of fluid-driven fractures plays an important role in natural and engineering processes, including transport of magma in the lithosphere, geologic sequestration of carbon dioxide, and oil and gas recovery from low-permeability formations, among many others. The simulation of fracture propagation poses a computational challenge as a result of the complex physics of fracture and the need to capture disparate length scales. Phase field models represent fractures as a diffuse interface and enjoy the advantage that fracture nucleation, propagation, branching, or twisting can be simulated without ad hoc computational strategies like remeshing or local enrichment of the solution space. Here we propose a new quasi-static phase field formulation for modeling fluid-driven fracturing in elastic media at small strains. The approach fully couples the fluid flow in the fracture (described via the Reynolds lubrication approximation) and the deformation of the surrounding medium. The flow is solved on a lower dimensionality mesh immersed in the elastic medium. This approach leads to accurate coupling of both physics. We assessed the performance of the model extensively by comparing results for the evolution of fracture length, aperture, and fracture fluid pressure against analytical solutions under different fracture propagation regimes. Thus, the excellent performance of the numerical model in all regimes builds confidence in the applicability of phase field approaches to simulate fluid-driven fracture.
Data-driven Modeling of Metal-oxide Sensors with Dynamic Bayesian Networks
NASA Astrophysics Data System (ADS)
Gosangi, Rakesh; Gutierrez-Osuna, Ricardo
2011-09-01
We present a data-driven probabilistic framework to model the transient response of MOX sensors modulated with a sequence of voltage steps. Analytical models of MOX sensors are usually built based on the physico-chemical properties of the sensing materials. Although building these models provides an insight into the sensor behavior, they also require a thorough understanding of the underlying operating principles. Here we propose a data-driven approach to characterize the dynamical relationship between sensor inputs and outputs. Namely, we use dynamic Bayesian networks (DBNs), probabilistic models that represent temporal relations between a set of random variables. We identify a set of control variables that influence the sensor responses, create a graphical representation that captures the causal relations between these variables, and finally train the model with experimental data. We validated the approach on experimental data in terms of predictive accuracy and classification performance. Our results show that DBNs can accurately predict the dynamic response of MOX sensors, as well as capture the discriminatory information present in the sensor transients.
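The data-driven idea above, learning the temporal dependence of the sensor state on its previous state and the applied voltage step rather than modeling the physico-chemistry, can be sketched with a discrete conditional distribution estimated from counts. The discretization into three response levels and the generating dynamics below are invented for illustration:

```python
import numpy as np
from collections import defaultdict

rng = np.random.default_rng(3)

# Hypothetical data generator: a sensor state discretized to 3 levels,
# pushed up by a high voltage step (v=1) and down by a low one (v=0).
def step(x, v):
    target = 2 if v == 1 else 0
    return int(min(2, max(0, x + np.sign(target - x) * (rng.random() < 0.8))))

states, volts = [1], rng.integers(0, 2, 500)
for v in volts[1:]:
    states.append(step(states[-1], v))

# Learn the conditional P(x_t | x_{t-1}, v_t) by counting transitions --
# the core of fitting a discrete dynamic Bayesian network.
counts = defaultdict(lambda: np.zeros(3))
for x_prev, x_now, v in zip(states[:-1], states[1:], volts[1:]):
    counts[(x_prev, v)][x_now] += 1
cpd = {k: c / c.sum() for k, c in counts.items()}
print(np.round(cpd[(1, 1)], 2))   # next-state distribution from x=1, v=1
```

Prediction and classification then reduce to inference in the learned model, with no analytical sensor model required.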
Documentation Driven Development for Complex Real-Time Systems
2004-12-01
This paper presents a novel approach for development of complex real-time systems, called the documentation-driven development (DDD) approach. This... time systems. DDD will also support automated software generation based on a computational model and some relevant techniques. DDD includes two main... stakeholders to be easily involved in development processes and, therefore, significantly improve the agility of software development for complex real
Gerwin, Philip M; Norinsky, Rada M; Tolwani, Ravi J
2018-03-01
Laboratory animal programs and core laboratories often set service rates based on cost estimates. However, actual costs may be unknown, and service rates may not reflect the actual cost of services. Accurately evaluating the actual costs of services can be challenging and time-consuming. We used a time-driven activity-based costing (ABC) model to determine the cost of services provided by a resource laboratory at our institution. The time-driven approach is a more efficient approach to calculating costs than using a traditional ABC model. We calculated only 2 parameters: the time required to perform an activity and the unit cost of the activity based on employee cost. This method allowed us to rapidly and accurately calculate the actual cost of services provided, including microinjection of a DNA construct, microinjection of embryonic stem cells, embryo transfer, and in vitro fertilization. We successfully implemented a time-driven ABC model to evaluate the cost of these services and the capacity of labor used to deliver them. We determined how actual costs compared with current service rates. In addition, we determined that the labor supplied to conduct all services (10,645 min/wk) exceeded the practical labor capacity (8400 min/wk), indicating that the laboratory team was highly efficient and that additional labor capacity was needed to prevent overloading of the current team. Importantly, this time-driven ABC approach allowed us to establish a baseline model that can easily be updated to reflect operational changes or changes in labor costs. We demonstrated that a time-driven ABC model is a powerful management tool that can be applied to other core facilities as well as to entire animal programs, providing valuable information that can be used to set rates based on the actual cost of services and to improve operating efficiency.
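The two-parameter structure of time-driven ABC (minutes per activity, cost per minute of labor) makes the computation almost trivial, which is the method's appeal. The per-minute rate and activity times below are hypothetical placeholders, while the weekly capacity figures are the ones reported above:

```python
# Time-driven activity-based costing: only two parameters per activity,
# the minutes it takes and the cost per minute of the labor performing it.
# Rates and times below are hypothetical, not the paper's figures.

cost_per_min = 1.20            # fully loaded labor cost, $/minute
services = {                   # activity -> minutes per unit of service
    "DNA microinjection": 90,
    "embryo transfer": 45,
    "in vitro fertilization": 120,
}
unit_costs = {s: m * cost_per_min for s, m in services.items()}

# Capacity check, as in the paper: labor supplied vs practical capacity.
supplied, practical = 10645, 8400          # minutes per week (from the text)
utilization = supplied / practical
print(unit_costs)
print(f"utilization: {utilization:.2f}")   # > 1 means the team is overloaded
```

Updating the model after an operational change means editing one time or one rate, which is why the authors describe it as an easily maintained baseline.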
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chikkagoudar, Satish; Chatterjee, Samrat; Thomas, Dennis G.
The absence of a robust and unified theory of cyber dynamics presents challenges and opportunities for using machine learning based data-driven approaches to further the understanding of the behavior of such complex systems. Analysts can also use machine learning approaches to gain operational insights. In order to be operationally beneficial, cybersecurity machine learning based models need to have the ability to: (1) represent a real-world system, (2) infer system properties, and (3) learn and adapt based on expert knowledge and observations. Probabilistic models and probabilistic graphical models provide these necessary properties and are further explored in this chapter. Bayesian Networks and Hidden Markov Models are introduced as an example of a widely used data driven classification/modeling strategy.
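As a concrete instance of the models named above, a minimal hidden Markov model forward filter can infer a posterior over an unobserved system state from a stream of observations; all states, matrices, and probabilities below are invented for illustration:

```python
import numpy as np

# Minimal HMM forward filter: infer P(hidden state | observations), the
# kind of inference a data-driven cyber model might run over event logs.
T = np.array([[0.95, 0.05],     # transitions: benign <-> compromised
              [0.10, 0.90]])
E = np.array([[0.8, 0.2],       # emissions: P(alert level | state)
              [0.3, 0.7]])
p = np.array([0.99, 0.01])      # prior: almost surely benign

obs = [1, 1, 1]                 # three high-alert observations in a row
for o in obs:
    p = (p @ T) * E[:, o]       # predict, then weight by the likelihood
    p /= p.sum()                # normalize to a probability distribution
print(np.round(p, 3))           # posterior over [benign, compromised]
```

Property (3) above, learning from expert knowledge and observations, corresponds to setting these matrices from elicited priors and then re-estimating them from logged data.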
NASA Astrophysics Data System (ADS)
Vandermeulen, J.; Nasseri, S. A.; Van de Wiele, B.; Durin, G.; Van Waeyenberge, B.; Dupré, L.
2018-03-01
Lagrangian-based collective coordinate models for magnetic domain wall (DW) motion rely on an ansatz for the DW profile and a Lagrangian approach to describe the DW motion in terms of a set of time-dependent collective coordinates: the DW position, the DW magnetization angle, the DW width and the DW tilting angle. Another approach was recently used to derive similar equations of motion by averaging the Landau-Lifshitz-Gilbert equation without any ansatz, and identifying the relevant collective coordinates afterwards. In this paper, we use an updated version of the semi-analytical equations to compare the Lagrangian-based collective coordinate models with micromagnetic simulations for field- and STT-driven (spin-transfer torque-driven) DW motion in Pt/CoFe/MgO and Pt/Co/AlOx nanostrips. Through this comparison, we assess the accuracy of the different models, and provide insight into the deviations of the models from simulations. It is found that the lack of terms related to DW asymmetry in the Lagrangian-based collective coordinate models significantly contributes to the discrepancy between the predictions of the most accurate Lagrangian-based model and the micromagnetic simulations in the field-driven case. This is in contrast to the STT-driven case where the DW remains symmetric.
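Schematically, the Lagrangian-based route discussed above obtains the equations of motion for the collective coordinates from the Euler-Lagrange equations with a Rayleigh dissipation functional accounting for Gilbert damping (generic form, not the paper's specific expressions):

```latex
\frac{\mathrm{d}}{\mathrm{d}t}\!\left(\frac{\partial L}{\partial \dot{\xi}_i}\right)
  - \frac{\partial L}{\partial \xi_i}
  = -\,\frac{\partial \mathcal{F}}{\partial \dot{\xi}_i},
\qquad \xi = \left(q,\ \phi,\ \Delta,\ \chi\right)
```

where $L$ results from inserting the assumed DW profile (the ansatz) into the magnetic Lagrangian, $\mathcal{F}$ is the dissipation functional, and the coordinates are the DW position $q$, magnetization angle $\phi$, width $\Delta$, and tilting angle $\chi$. The ansatz-free route mentioned above instead averages the Landau-Lifshitz-Gilbert equation directly, which is why it can retain asymmetry terms the ansatz discards.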
Sun, Jimeng; Hu, Jianying; Luo, Dijun; Markatou, Marianthi; Wang, Fei; Edabollahi, Shahram; Steinhubl, Steven E.; Daar, Zahra; Stewart, Walter F.
2012-01-01
Background: The ability to identify the risk factors related to an adverse condition, e.g., a heart failure (HF) diagnosis, is very important for improving care quality and reducing cost. Existing approaches for risk factor identification are either knowledge driven (from guidelines or the literature) or data driven (from observational data). No existing method provides a model to effectively combine expert knowledge with data driven insight for risk factor identification. Methods: We present a systematic approach to enhance known knowledge-based risk factors with additional potential risk factors derived from data. The core of our approach is a sparse regression model with regularization terms that correspond to both knowledge and data driven risk factors. Results: The approach is validated using a large dataset containing 4,644 heart failure cases and 45,981 controls. The outpatient electronic health records (EHRs) for these patients include diagnosis, medication, lab results from 2003–2010. We demonstrate that the proposed method can identify complementary risk factors that are not in the existing known factors and can better predict the onset of HF. We quantitatively compare different sets of risk factors in the context of predicting onset of HF using the performance metric, the Area Under the ROC Curve (AUC). The combined risk factors between knowledge and data significantly outperform knowledge-based risk factors alone. Furthermore, those additional risk factors are confirmed to be clinically meaningful by a cardiologist. Conclusion: We present a systematic framework for combining knowledge and data driven insights for risk factor identification. We demonstrate the power of this framework in the context of predicting onset of HF, where our approach can successfully identify intuitive and predictive risk factors beyond a set of known HF risk factors. PMID:23304365
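The core idea above, a sparse regression whose L1 penalty is relaxed for knowledge-based risk factors and kept strong for data-driven candidates, can be sketched with a small proximal-gradient solver. The data, penalty weights, and feature split are hypothetical, standing in for the paper's EHR-derived design matrix:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical cohort: features 0-1 are known (guideline) risk factors,
# feature 2 is a true but unknown factor, features 3-9 are noise.
n, p = 500, 10
X = rng.standard_normal((n, p))
beta_true = np.array([1.0, 0.8, 0.6, 0, 0, 0, 0, 0, 0, 0], dtype=float)
y = X @ beta_true + 0.1 * rng.standard_normal(n)

# Per-feature L1 weights: light penalty on known risk factors, heavier
# penalty on data-driven candidates -- a simple stand-in for the paper's
# combined knowledge/data regularization terms.
w = np.array([0.01, 0.01] + [0.2] * 8)

def weighted_lasso(X, y, w, lr=0.5, iters=500):
    """Proximal gradient descent for 0.5*||y - Xb||^2 / n + sum_j w_j*|b_j|."""
    b = np.zeros(X.shape[1])
    for _ in range(iters):
        grad = X.T @ (X @ b - y) / len(y)
        b = b - lr * grad
        b = np.sign(b) * np.maximum(np.abs(b) - lr * w, 0.0)  # soft-threshold
    return b

b = weighted_lasso(X, y, w)
print(np.round(b, 2))   # known factors kept, one new factor found, noise zeroed
```

The known factors survive almost unshrunk, a genuinely predictive new factor enters despite the heavier penalty, and the noise features are zeroed out, mirroring how the paper's model surfaces complementary risk factors beyond the known set.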
Hervatis, Vasilis; Loe, Alan; Barman, Linda; O'Donoghue, John; Zary, Nabil
2015-10-06
Preparing the future health care professional workforce in a changing world is a significant undertaking. Educators and other decision makers look to evidence-based knowledge to improve the quality of education. Analytics, the use of data to generate insights and support decisions, has been applied successfully across numerous application domains. Health care professional education is one area where great potential is yet to be realized. Previous research on Academic and Learning analytics has mainly focused on technical issues. The focus of this study relates to its practical implementation in the setting of health care education. The aim of this study is to create a conceptual model for a deeper understanding of the synthesizing process, transforming data into information to support educators' decision making. A deductive case study approach was applied to develop the conceptual model. The analytics loop works both in theory and in practice. The conceptual model encompasses the underlying data, the quality indicators, and decision support for educators. The model illustrates how a theory can be applied to a traditional data-driven analytics approach, and alongside the context- or need-driven analytics approach.
Loe, Alan; Barman, Linda; O'Donoghue, John; Zary, Nabil
2015-01-01
Background Preparing the future health care professional workforce in a changing world is a significant undertaking. Educators and other decision makers look to evidence-based knowledge to improve the quality of education. Analytics, the use of data to generate insights and support decisions, has been applied successfully across numerous application domains. Health care professional education is one area where great potential is yet to be realized. Previous research on Academic and Learning analytics has mainly focused on technical issues. The focus of this study relates to its practical implementation in the setting of health care education. Objective The aim of this study is to create a conceptual model for a deeper understanding of the synthesizing process, transforming data into information to support educators’ decision making. Methods A deductive case study approach was applied to develop the conceptual model. Results The analytics loop works both in theory and in practice. The conceptual model encompasses the underlying data, the quality indicators, and decision support for educators. Conclusions The model illustrates how a theory can be applied to a traditional data-driven analytics approach, and alongside the context- or need-driven analytics approach. PMID:27731840
Kozinszky, Zoltan; Töreki, Annamária; Hompoth, Emőke A; Dudas, Robert B; Németh, Gábor
2017-04-01
We endeavoured to analyze the factor structure of the Edinburgh Postnatal Depression Scale (EPDS) during a screening programme in Hungary, using exploratory (EFA) and confirmatory factor analysis (CFA), testing both previously published models and newly developed theory-driven ones, after a critical analysis of the literature. Between April 2011 and January 2015, a sample of 2967 pregnant women (between the 12th and 30th weeks of gestation) and 714 women 6 weeks after delivery completed the Hungarian version of the EPDS in South-East Hungary. EFAs suggested unidimensionality in both samples. In CFAs, 33 out of 42 previously published models showed good and 6 acceptable fit with our antepartum data, whilst 10 of them showed good and 28 acceptable fit in our postpartum sample. Using multiple fit indices, our theory-driven anhedonia (items 1, 2) - anxiety (items 4, 5) - low mood (items 8, 9) model provided the best fit in the antepartum sample. In the postpartum sample, our theory-driven models were again among the best performing models, including an anhedonia and an anxiety factor together with either a low mood or a suicidal risk factor (items 3, 6, 10). The EPDS showed moderate within- and between-culture invariability, although this would also need to be re-examined with a theory-driven approach. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.
Wu, Xiao; Shen, Jiong; Li, Yiguo; Lee, Kwang Y
2014-05-01
This paper develops a novel data-driven fuzzy modeling strategy and predictive controller for a boiler-turbine unit using fuzzy clustering and subspace identification (SID) methods. To deal with the nonlinear behavior of the boiler-turbine unit, fuzzy clustering is used to provide an appropriate division of the operation region and develop the structure of the fuzzy model. Then, by combining the input data with the corresponding fuzzy membership functions, the SID method is extended to extract the local state-space model parameters. Owing to the advantages of both methods, the resulting fuzzy model can represent the boiler-turbine unit very closely, and a fuzzy model predictive controller is designed based on this model. As an alternative approach, a direct data-driven fuzzy predictive control is also developed following the same clustering and subspace methods, where intermediate subspace matrices developed during the identification procedure are utilized directly as the predictor. Simulation results show the advantages and effectiveness of the proposed approach. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.
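The overall recipe, dividing the operating region by fuzzy clustering, identifying one local linear model per fuzzy region, and blending by membership, can be illustrated on a scalar toy system. The hand-rolled one-dimensional fuzzy c-means and the sine "plant" below are stand-ins for the paper's actual subspace identification of a boiler-turbine unit:

```python
import numpy as np

def fuzzy_cmeans(x, c=3, m=2.0, iters=60, seed=0):
    """Minimal 1-D fuzzy c-means: returns cluster centers and memberships."""
    rng = np.random.default_rng(seed)
    centers = rng.choice(x, c, replace=False)
    for _ in range(iters):
        d = np.abs(x[:, None] - centers[None, :]) + 1e-12          # (n, c)
        u = 1.0 / ((d[:, :, None] / d[:, None, :]) ** (2 / (m - 1))).sum(axis=2)
        centers = (u ** m).T @ x / (u ** m).sum(axis=0)
    return centers, u

# toy nonlinear "unit": steady-state output vs. load command u
rng = np.random.default_rng(1)
u_data = rng.uniform(0, 1, 400)
y_data = np.sin(2.5 * u_data) + 0.01 * rng.normal(size=400)

centers, memb = fuzzy_cmeans(u_data, c=3)
# one local linear model per fuzzy region via membership-weighted least squares
A = np.c_[u_data, np.ones_like(u_data)]
models = []
for i in range(3):
    sw = np.sqrt(memb[:, i])
    theta, *_ = np.linalg.lstsq(A * sw[:, None], y_data * sw, rcond=None)
    models.append(theta)

def predict(u):
    wt = 1.0 / (np.abs(u - centers) + 1e-12) ** 2   # blend local models
    wt /= wt.sum()
    return sum(w * (a * u + b) for w, (a, b) in zip(wt, models))
```

The same blending weights would, in the paper's direct variant, weight the subspace predictor matrices rather than explicit local models.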
A data-driven approach for modeling post-fire debris-flow volumes and their uncertainty
Friedel, Michael J.
2011-01-01
This study demonstrates the novel application of genetic programming to evolve nonlinear post-fire debris-flow volume equations from variables associated with a data-driven conceptual model of the western United States. The search space is constrained using a multi-component objective function that simultaneously minimizes root-mean squared and unit errors for the evolution of fittest equations. An optimization technique is then used to estimate the limits of nonlinear prediction uncertainty associated with the debris-flow equations. In contrast to a published multiple linear regression three-variable equation, linking basin area with slopes greater or equal to 30 percent, burn severity characterized as area burned moderate plus high, and total storm rainfall, the data-driven approach discovers many nonlinear and several dimensionally consistent equations that are unbiased and have less prediction uncertainty. Of the nonlinear equations, the best performance (lowest prediction uncertainty) is achieved when using three variables: average basin slope, total burned area, and total storm rainfall. Further reduction in uncertainty is possible for the nonlinear equations when dimensional consistency is not a priority and by subsequently applying a gradient solver to the fittest solutions. The data-driven modeling approach can be applied to nonlinear multivariate problems in all fields of study.
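Genetic programming evolves free-form expression trees, which is beyond the scope of a short sketch; the toy below caricatures only the multi-component objective (log-space RMSE plus a unit-consistency penalty) by hill-climbing the exponents of a fixed power-law form. All variables, dimensions, and the hidden "truth" are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
# synthetic "observations": volumes from a hidden, dimensionally consistent law
n = 200
A = rng.uniform(1, 50, n)       # burned area    [L^2]
S = rng.uniform(0.1, 0.8, n)    # basin slope    [-]
R = rng.uniform(5, 80, n)       # storm rainfall [L]
V = 0.3 * A**1.0 * S**0.5 * R**1.0 * rng.lognormal(0.0, 0.1, n)

def fitness(p):
    """Multi-component objective: prediction error + unit-consistency error."""
    c, a, b, d = p
    pred = c * A**a * S**b * R**d
    rmse = np.sqrt(np.mean((np.log(pred) - np.log(V)) ** 2))
    unit_err = abs(2 * a + d - 3)      # A^a R^d must combine to volume [L^3]
    return rmse + unit_err

# crude evolutionary search: mutate the fittest candidate, keep improvements
best = np.array([1.0, 1.0, 0.0, 1.0])
best_fit = fitness(best)
for _ in range(3000):
    trial = best + rng.normal(0.0, 0.05, 4)
    if trial[0] <= 0:
        continue
    f = fitness(trial)
    if f < best_fit:
        best, best_fit = trial, f
```

The unit-error term steers the search toward dimensionally consistent equations, mirroring the paper's preference for such solutions among the evolved candidates.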
NASA Technical Reports Server (NTRS)
Pena, Joaquin; Hinchey, Michael G.; Sterritt, Roy; Ruiz-Cortes, Antonio; Resinas, Manuel
2006-01-01
Autonomic Computing (AC), self-management based on high level guidance from humans, is increasingly gaining momentum as the way forward in designing reliable systems that hide complexity and conquer IT management costs. Effectively, AC may be viewed as Policy-Based Self-Management. The Model Driven Architecture (MDA) approach focuses on building models that can be transformed into code in an automatic manner. In this paper, we look at ways to implement Policy-Based Self-Management by means of models that can be converted to code using transformations that follow the MDA philosophy. We propose a set of UML-based models to specify autonomic and autonomous features along with the necessary procedures, based on modification and composition of models, to deploy a policy as an executing system.
Dynamic simulation of storm-driven barrier island morphology under future sea level rise
NASA Astrophysics Data System (ADS)
Passeri, D. L.; Long, J.; Plant, N. G.; Bilskie, M. V.; Hagen, S. C.
2016-12-01
The impacts of short-term processes such as tropical and extratropical storms have the potential to alter barrier island morphology. On the event scale, the effects of storm-driven morphology may result in damage or loss of property, infrastructure and habitat. On the decadal scale, the combination of storms and sea level rise (SLR) will evolve barrier islands. The effects of SLR on hydrodynamics and coastal morphology are dynamic and inter-related; nonlinearities in SLR can cause larger peak surges, lengthier inundation times and additional inundated land, which may result in increased erosion, overwash or breaching along barrier islands. This study uses a two-dimensional morphodynamic model (XBeach) to examine the response of Dauphin Island, AL to storm surge under future SLR. The model is forced with water levels and waves provided by a large-domain hydrodynamic model. A historic validation of hurricanes Ivan and Katrina indicates the model is capable of predicting morphologic response with high skill (0.5). The validated model is used to simulate storm surge driven by Ivan and Katrina under four future SLR scenarios, ranging from 20 cm to 2 m. Each SLR scenario is implemented using a static or "bathtub" approach (in which water levels are increased linearly by the amount of SLR) versus a dynamic approach (in which SLR is applied at the open ocean boundary of the hydrodynamic model and allowed to propagate through the domain as guided by the governing equations). Results illustrate that higher amounts of SLR result in additional shoreline change, dune erosion, overwash and breaching. Compared to the dynamic approach, the static approach over-predicts inundation, dune erosion, overwash and breaching of the island. Overall, results provide a better understanding of the effects of SLR on storm-driven barrier island morphology and support a paradigm shift away from the "bathtub" approach, towards considering the integrated, dynamic effects of SLR.
Autonomous Soil Assessment System: A Data-Driven Approach to Planetary Mobility Hazard Detection
NASA Astrophysics Data System (ADS)
Raimalwala, K.; Faragalli, M.; Reid, E.
2018-04-01
The Autonomous Soil Assessment System predicts mobility hazards for rovers. Its development and performance are presented, with focus on its data-driven models, machine learning algorithms, and real-time sensor data fusion for predictive analytics.
A Model-Driven Approach to Teaching Concurrency
ERIC Educational Resources Information Center
Carro, Manuel; Herranz, Angel; Marino, Julio
2013-01-01
We present an undergraduate course on concurrent programming where formal models are used in different stages of the learning process. The main practical difference with other approaches lies in the fact that the ability to develop correct concurrent software relies on a systematic transformation of formal models of inter-process interaction (so…
NASA Astrophysics Data System (ADS)
Ihsani, Alvin; Farncombe, Troy
2016-02-01
The modelling of the projection operator in tomographic imaging is of critical importance, especially when working with algebraic methods of image reconstruction. This paper proposes a distance-driven projection method targeted at single-pinhole single-photon emission computed tomography (SPECT) imaging, since it accounts for the finite size of the pinhole and the possible tilting of the detector surface, in addition to other collimator-specific factors such as geometric sensitivity. The accuracy and execution time of the proposed method are evaluated by comparison to a ray-driven approach in which the pinhole is sub-sampled with various sampling schemes. A point-source phantom whose projections were generated using OpenGATE was first used to compare the resolution of images reconstructed with each method using the full width at half maximum (FWHM). Furthermore, a high-activity Mini Deluxe Phantom (Data Spectrum Corp., Durham, NC, USA) SPECT resolution phantom was scanned using a Gamma Medica X-SPECT system, and the signal-to-noise ratio (SNR) and structural similarity of reconstructed images were compared at various projection counts. Based on the reconstructed point-source phantom, the proposed distance-driven approach results in a lower FWHM than the ray-driven approach even when using a smaller detector resolution. Furthermore, based on the Mini Deluxe Phantom, it is shown that the distance-driven approach has consistently higher SNR and structural similarity compared to the ray-driven approach as the counts in measured projections deteriorate.
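The defining idea of a distance-driven projector, mapping pixel and detector-cell boundaries onto a common axis and weighting by boundary overlap instead of sampling rays, fits in a few lines in one dimension. The magnification factor and grids below are arbitrary; a real pinhole SPECT projector additionally folds in the sensitivity and tilt terms described in the paper:

```python
import numpy as np

def distance_driven_project(vals, img_edges, det_edges):
    """Overlap-weighted 1-D forward projection (distance-driven style):
    each pixel deposits its value into detector cells in proportion to the
    overlap between mapped pixel boundaries and detector-cell boundaries."""
    proj = np.zeros(len(det_edges) - 1)
    for i, v in enumerate(vals):
        lo, hi = img_edges[i], img_edges[i + 1]
        for j in range(len(proj)):
            overlap = max(0.0, min(hi, det_edges[j + 1]) - max(lo, det_edges[j]))
            proj[j] += v * overlap / (hi - lo)
    return proj

# pinhole magnification m maps object-plane boundaries onto the detector
m = 2.0
img_edges = np.linspace(0.0, 1.0, 6)       # 5 object pixels
det_edges = np.linspace(0.0, 2.0, 11)      # 10 detector cells
vals = np.array([0.0, 1.0, 2.0, 1.0, 0.0])
p = distance_driven_project(vals, m * img_edges, det_edges)
```

Because each pixel's overlaps sum to its own width, total intensity is conserved exactly, a property that ray-driven sub-sampling only approximates.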
Development of a Stochastically-driven, Forward Predictive Performance Model for PEMFCs
NASA Astrophysics Data System (ADS)
Harvey, David Benjamin Paul
A one-dimensional, multi-scale, coupled, transient, and mechanistic performance model for a PEMFC membrane electrode assembly has been developed. The model explicitly includes each of the five layers within a membrane electrode assembly and solves for the transport of charge, heat, mass, species, dissolved water, and liquid water. Key features of the model include a multi-step implementation of the HOR reaction on the anode, agglomerate catalyst sub-models for both the anode and cathode catalyst layers, a unique approach that links the composition of the catalyst layer to key properties within the agglomerate model, and the implementation of a stochastic input-based approach for component material properties. The model employs a new methodology for validation using statistically varying input parameters and statistically-based experimental performance data; it represents the first stochastic-input-driven unit cell performance model. The stochastic-input-driven performance model was used to identify optimal ionomer content within the cathode catalyst layer, demonstrate the role of material variation in potentially low-performing MEA materials, provide an explanation for the performance of low-Pt-loaded MEAs, and investigate the validity of transient-sweep experimental diagnostic methods.
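The stochastic-input methodology, propagating distributions of component properties through the performance model and reading off a performance band rather than a single polarization curve, can be sketched with a toy polarization model standing in for the full 1-D multi-layer MEA model. The distributions and constants are assumed, illustrative values:

```python
import numpy as np

rng = np.random.default_rng(3)

def cell_voltage(i, r_ohm, i0, e_ocv=1.0, b=0.05):
    """Toy polarization curve: OCV minus ohmic and Tafel activation losses.
    A stand-in for the full 1-D multi-layer MEA model of the thesis."""
    return e_ocv - i * r_ohm - b * np.log(i / i0)

# stochastic inputs: component properties sampled from assumed distributions
n = 5000
r_ohm = rng.normal(0.08, 0.01, n).clip(1e-3)    # ohmic ASR [ohm cm^2]
i0 = rng.lognormal(np.log(1e-3), 0.3, n)        # exchange c.d. [A/cm^2]
V = cell_voltage(1.0, r_ohm, i0)                # voltages at 1 A/cm^2

v_lo, v_hi = np.percentile(V, [2.5, 97.5])      # performance band
```

The 2.5–97.5% band is the kind of statistical performance envelope, driven by material variation, that a single deterministic run cannot provide.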
Open data models for smart health interconnected applications: the example of openEHR.
Demski, Hans; Garde, Sebastian; Hildebrand, Claudia
2016-10-22
Smart Health is known as a concept that enhances networking, intelligent data processing and combining patient data with other parameters. Open data models can play an important role in creating a framework for providing interoperable data services that support the development of innovative Smart Health applications profiting from data fusion and sharing. This article describes a model-driven engineering approach based on standardized clinical information models and explores its application for the development of interoperable electronic health record systems. The following possible model-driven procedures were considered: provision of data schemes for data exchange, automated generation of artefacts for application development and native platforms that directly execute the models. The applicability of the approach in practice was examined using the openEHR framework as an example. A comprehensive infrastructure for model-driven engineering of electronic health records is presented using the example of the openEHR framework. It is shown that data schema definitions to be used in common practice software development processes can be derived from domain models. The capabilities for automatic creation of implementation artefacts (e.g., data entry forms) are demonstrated. Complementary programming libraries and frameworks that foster the use of open data models are introduced. Several compatible health data platforms are listed. They provide standard based interfaces for interconnecting with further applications. Open data models help build a framework for interoperable data services that support the development of innovative Smart Health applications. Related tools for model-driven application development foster semantic interoperability and interconnected innovative applications.
Statistical and engineering methods for model enhancement
NASA Astrophysics Data System (ADS)
Chang, Chia-Jung
Models which describe the performance of a physical process are essential for quality prediction, experimental planning, process control, and optimization. Engineering models developed from the underlying physics/mechanics of the process, such as analytic models or finite element models, are widely used to capture the deterministic trend of the process. However, there usually exists stochastic randomness in the system, which may introduce discrepancy between physics-based model predictions and observations in reality. Alternatively, statistical models can be developed to obtain predictions purely from the data generated by the process. However, such models tend to perform poorly when predictions are made away from the observed data points. This dissertation contributes to model enhancement research by integrating physics-based and statistical models to mitigate their individual drawbacks and provide models with better accuracy by combining the strengths of both. The proposed model enhancement methodologies include the following two streams: (1) a data-driven enhancement approach and (2) an engineering-driven enhancement approach. Through these efforts, more adequate models are obtained, which leads to better performance in system forecasting, process monitoring, and decision optimization. Among data-driven enhancement approaches, the Gaussian Process (GP) model provides a powerful methodology for calibrating a physical model in the presence of model uncertainties. However, if the data contain systematic experimental errors, the GP model can lead to an unnecessarily complex adjustment of the physical model. In Chapter 2, we propose a novel enhancement procedure, named “Minimal Adjustment”, which brings the physical model closer to the data by making minimal changes to it.
This is achieved by approximating the GP model with a linear regression model and then applying simultaneous variable selection over the model and experimental bias terms. Two real examples and simulations are presented to demonstrate the advantages of the proposed approach. Rather than enhancing the model from a data-driven perspective, an alternative approach is to adjust the model by incorporating additional domain or engineering knowledge when available. This often leads to models that are very simple and easy to interpret. The concepts of engineering-driven enhancement are carried out through two applications to demonstrate the proposed methodologies. In the first application, which focuses on polymer composite quality, nanoparticle dispersion has been identified as a crucial factor affecting the mechanical properties. Transmission Electron Microscopy (TEM) images are commonly used to represent nanoparticle dispersion without further quantification of its characteristics. In Chapter 3, we develop an engineering-driven nonhomogeneous Poisson random field modeling strategy to characterize the nanoparticle dispersion status of nanocomposite polymers, quantitatively representing the nanomaterial quality presented through image data. The model parameters are estimated through the Bayesian MCMC technique to overcome the challenge of a limited amount of accessible data due to time-consuming sampling schemes. The second application statistically calibrates the engineering-driven force models of the laser-assisted micro milling (LAMM) process, which facilitates a systematic understanding and optimization of the targeted process. In Chapter 4, the force prediction interval is derived by incorporating the variability in the runout parameters as well as the variability in the measured cutting forces. The experimental results indicate that the model predicts the cutting force profile with good accuracy using a 95% confidence interval.
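The "Minimal Adjustment" idea, approximating the GP correction with a linear basis and letting simultaneous sparse selection decide which adjustments belong to the model and which to experimental bias, can be sketched with a lasso. The basis terms, the batch-bias indicator, and all numbers below are invented for illustration:

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(4)
x = rng.uniform(0, 1, 80)

def physical(x):
    return 2.0 * x                       # physics-based model prediction

# reality adds a small quadratic model discrepancy, and the first batch of
# runs carries a systematic experimental offset
batch = (np.arange(80) < 20).astype(float)
y = physical(x) + 0.3 * x**2 + 0.5 * batch + 0.05 * rng.normal(size=80)

# candidate adjustment terms for the model plus an experimental-bias term;
# L1 selection keeps only the terms the data actually demand
basis = np.c_[x, x**2, np.sin(2 * np.pi * x), batch]
fit = Lasso(alpha=0.01).fit(basis, y - physical(x))
coef = fit.coef_   # [x, x^2, sin(2*pi*x), batch-bias] adjustment weights
```

A nonzero batch coefficient flags systematic experimental error, so the physical model itself receives only the minimal (here quadratic) adjustment, while spurious terms such as the sine are suppressed.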
To conclude, this dissertation draws attention to model enhancement, which has considerable impact on the modeling, design, and optimization of various processes and systems. The fundamental methodologies of model enhancement are developed and further applied to various applications. These research activities deliver engineering-compliant models for adequate system prediction based on observational data with complex variable relationships and uncertainty, facilitating process planning, monitoring, and real-time control.
NASA Astrophysics Data System (ADS)
Persano Adorno, Dominique; Pizzolato, Nicola; Fazio, Claudio
2015-09-01
Within the context of higher education for science or engineering undergraduates, we present an inquiry-driven learning path aimed at developing a more meaningful conceptual understanding of the electron dynamics in semiconductors in the presence of applied electric fields. The electron transport in a nondegenerate n-type indium phosphide bulk semiconductor is modelled using a multivalley Monte Carlo approach. The main characteristics of the electron dynamics are explored under different values of the driving electric field, lattice temperature and impurity density. Simulation results are presented by following a question-driven path of exploration, starting from the validation of the model and moving up to reasoned inquiries about the observed characteristics of electron dynamics. Our inquiry-driven learning path, based on numerical simulations, represents a viable example of how to integrate a traditional lecture-based teaching approach with effective learning strategies, providing science or engineering undergraduates with practical opportunities to enhance their comprehension of the physics governing the electron dynamics in semiconductors. Finally, we present a general discussion about the advantages and disadvantages of using an inquiry-based teaching approach within a learning environment based on semiconductor simulations.
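A multivalley Monte Carlo simulator is well beyond an abstract, but its core loop, exponentially distributed free flights interrupted by scattering events, is compact. The single-valley toy below, with a constant made-up scattering rate and an InP-like effective mass, merely recovers the textbook drift-velocity estimate v_d ≈ qE/(m*γ), and is offered only as a classroom-style illustration of the simulation idea:

```python
import numpy as np

rng = np.random.default_rng(5)
q = 1.602e-19                  # elementary charge [C]
m_eff = 0.08 * 9.109e-31       # Gamma-valley effective mass, InP-like [kg]
E_field = 1.0e5                # applied electric field [V/m]
gamma = 1.0e13                 # total scattering rate, toy constant [1/s]
a = q * E_field / m_eff        # acceleration along the field

# ensemble of free flights: durations are exponential with mean 1/gamma;
# each (isotropic, elastic) scattering event resets the drift component
n_flights = 20000
t_total, displacement = 0.0, 0.0
for _ in range(n_flights):
    dt = -np.log(rng.random()) / gamma
    displacement += 0.5 * a * dt**2    # flight starts at rest on average
    t_total += dt

v_drift = displacement / t_total       # ~ q * E_field / (m_eff * gamma)
```

A real multivalley code replaces the constant rate with energy-dependent phonon and impurity scattering rates and tracks inter-valley transfer, which is what produces the field-dependent behavior students explore in the inquiry path.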
Aspect-Oriented Model-Driven Software Product Line Engineering
NASA Astrophysics Data System (ADS)
Groher, Iris; Voelter, Markus
Software product line engineering aims to reduce development time, effort, cost, and complexity by taking advantage of the commonality within a portfolio of similar products. The effectiveness of a software product line approach directly depends on how well feature variability within the portfolio is implemented and managed throughout the development lifecycle, from early analysis through maintenance and evolution. This article presents an approach that facilitates variability implementation, management, and tracing by integrating model-driven and aspect-oriented software development. Features are separated in models and composed using aspect-oriented composition techniques at the model level. Model transformations support the transition from problem-space to solution-space models. Aspect-oriented techniques enable the explicit expression and modularization of variability at the model, template, and code levels. The presented concepts are illustrated with a case study of a home automation system.
A Hybrid Physics-Based Data-Driven Approach for Point-Particle Force Modeling
NASA Astrophysics Data System (ADS)
Moore, Chandler; Akiki, Georges; Balachandar, S.
2017-11-01
This study improves upon the physics-based pairwise interaction extended point-particle (PIEP) model. The PIEP model leverages a physical framework to predict fluid-mediated interactions between solid particles. While the PIEP model is a powerful tool, its pairwise assumption leads to increased error in flows with high particle volume fractions. To reduce this error, a regression algorithm is used to model the differences between the current PIEP model's predictions and the results of direct numerical simulations (DNS) for an array of monodisperse solid particles subjected to various flow conditions. The resulting statistical model and the physical PIEP model are superimposed to construct a hybrid, physics-based data-driven PIEP model. It must be noted that the performance of a pure data-driven approach without the model form provided by the physical PIEP model is substantially inferior. The hybrid model's predictive capabilities are analyzed using additional DNS. In every case tested, the hybrid PIEP model's predictions are more accurate than those of the physical PIEP model. This material is based upon work supported by the National Science Foundation Graduate Research Fellowship Program under Grant No. DGE-1315138 and the U.S. DOE, NNSA, ASC Program, as a Cooperative Agreement under Contract No. DE-NA0002378.
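The hybrid construction, keeping the physics-based prediction as the baseline and training a regressor only on its residual against DNS, is generic and easy to sketch. The stand-in "pairwise" model, the crowding correction playing the role of DNS truth, and the random forest are illustrative choices, not the paper's PIEP formulation:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(6)
# flow features: particle volume fraction and Reynolds number (toy ranges)
phi = rng.uniform(0.05, 0.4, 500)
Re = rng.uniform(10.0, 100.0, 500)

def physical_model(phi, Re):
    """Stand-in pairwise-style force model: ignores particle crowding."""
    return 1.0 + 0.1 * Re**0.7

# "DNS truth" includes a crowding effect the pairwise model misses
truth = (1.0 + 0.1 * Re**0.7) * (1.0 - phi) ** -2.5
resid = truth - physical_model(phi, Re)

# learn only the residual; the physics supplies the baseline prediction
corr = RandomForestRegressor(n_estimators=100, random_state=0)
corr.fit(np.c_[phi, Re], resid)

def hybrid_model(phi, Re):
    return physical_model(phi, Re) + corr.predict(np.c_[phi, Re])
```

Because the regressor only has to capture what the physics gets wrong, the hybrid degrades gracefully, consistent with the paper's observation that a purely data-driven model without the physical model form performs substantially worse.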
Alderman, Phillip D.; Stanfill, Bryan
2016-10-06
Recent international efforts have brought renewed emphasis on the comparison of different agricultural systems models. Thus far, analysis of model-ensemble simulated results has not clearly differentiated between ensemble prediction uncertainties due to model structural differences per se and those due to parameter value uncertainties. Additionally, despite increasing use of Bayesian parameter estimation approaches with field-scale crop models, inadequate attention has been given to the full posterior distributions for estimated parameters. The objectives of this study were to quantify the impact of parameter value uncertainty on prediction uncertainty for modeling spring wheat phenology using Bayesian analysis and to assess the relative contributions of model-structure-driven and parameter-value-driven uncertainty to overall prediction uncertainty. This study used a random walk Metropolis algorithm to estimate parameters for 30 spring wheat genotypes using nine phenology models based on multi-location trial data for days to heading and days to maturity. Across all cases, parameter-driven uncertainty accounted for between 19 and 52% of predictive uncertainty, while model-structure-driven uncertainty accounted for between 12 and 64%. This study demonstrated the importance of quantifying both model-structure- and parameter-value-driven uncertainty when assessing overall prediction uncertainty in modeling spring wheat phenology. More generally, Bayesian parameter estimation provided a useful framework for quantifying and analyzing sources of prediction uncertainty.
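The estimation machinery named above, a random walk Metropolis sampler yielding a full posterior for a phenology parameter, can be sketched with a one-parameter thermal-time toy model. The model form, the flat prior, and all numbers are invented for illustration, not taken from the study's nine phenology models:

```python
import numpy as np

rng = np.random.default_rng(7)
# toy phenology model: days to heading = required thermal time / daily GDD
t_base, sigma = 4.0, 2.0
temps = rng.uniform(12, 22, 30)              # mean temperature per trial [C]
theta_true = 800.0                           # degree-days required to heading
days_obs = theta_true / (temps - t_base) + rng.normal(0, sigma, 30)

def log_post(theta):
    """Gaussian likelihood with a flat positive prior on theta."""
    if theta <= 0:
        return -np.inf
    pred = theta / (temps - t_base)
    return -0.5 * np.sum((days_obs - pred) ** 2) / sigma**2

# random walk Metropolis: propose, accept with probability min(1, ratio)
theta, chain = 500.0, []
for _ in range(20000):
    prop = theta + rng.normal(0.0, 10.0)
    if np.log(rng.random()) < log_post(prop) - log_post(theta):
        theta = prop
    chain.append(theta)
post = np.array(chain[5000:])                # discard burn-in
```

The retained chain approximates the full posterior, so its spread directly quantifies the parameter-value-driven share of prediction uncertainty that the study emphasizes.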
Mujtaba, Ghulam; Shuib, Liyana; Raj, Ram Gopal; Rajandram, Retnagowri; Shaikh, Khairunisa; Al-Garadi, Mohammed Ali
2017-01-01
Widespread implementation of electronic databases has improved the accessibility of plaintext clinical information for supplementary use. Numerous machine learning techniques, such as supervised machine learning approaches or ontology-based approaches, have been employed to obtain useful information from plaintext clinical data. This study proposes an automatic multi-class classification system to predict accident-related causes of death from plaintext autopsy reports through expert-driven feature selection with supervised automatic text classification decision models. Accident-related autopsy reports were obtained from one of the largest hospitals in Kuala Lumpur. These reports belong to nine different accident-related causes of death. A master feature vector was prepared by extracting features from the collected autopsy reports using unigrams with lexical categorization. This master feature vector was used to detect cause of death [according to the International Classification of Diseases version 10 (ICD-10) classification system] through five automated feature selection schemes, the proposed expert-driven approach, five subset sizes of features, and five machine learning classifiers. Model performance was evaluated using macro-averaged precision, recall, and F-measure, as well as accuracy and area under the ROC curve. Four baselines were used to compare the results with the proposed system. Random forest and J48 decision models parameterized using expert-driven feature selection yielded the highest evaluation measures, approaching 85% to 90% for most metrics, with a feature subset size of 30. The proposed system also showed approximately 14% to 16% improvement in overall accuracy compared with the existing techniques and four baselines. The proposed system is feasible and practical to use for automatic classification of ICD-10-related cause of death from autopsy reports. The proposed system assists pathologists to accurately and rapidly determine the underlying cause of death based on autopsy findings.
Furthermore, the proposed expert-driven feature selection approach and the findings are generally applicable to other kinds of plaintext clinical reports.
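The pipeline, unigram features restricted by expert-driven feature selection and fed to a decision-model classifier, can be sketched with scikit-learn. The miniature corpus, the curated term list, and the use of a random forest (standing in for the paper's tuned random forest and J48 models) are all illustrative:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.ensemble import RandomForestClassifier

# miniature corpus standing in for plaintext autopsy reports
reports = [
    "skull fracture subdural hemorrhage road traffic collision",
    "multiple rib fractures lung laceration vehicle impact",
    "drowning water in lungs submersion",
    "asphyxia submersion pulmonary edema water",
    "thermal burns smoke inhalation soot in airways",
    "extensive burns carboxyhemoglobin smoke",
] * 5
labels = ["traffic", "traffic", "drowning", "drowning", "fire", "fire"] * 5

# expert-driven feature selection: unigrams restricted to a curated term list
expert_terms = ["fracture", "fractures", "hemorrhage", "collision", "vehicle",
                "submersion", "water", "asphyxia", "burns", "smoke", "soot"]
vec = CountVectorizer(vocabulary=expert_terms)
X = vec.transform(reports)

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, labels)
```

Fixing the vocabulary to clinician-curated terms is what distinguishes the expert-driven scheme from the automated feature selection schemes the paper compares against.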
Mujtaba, Ghulam; Shuib, Liyana; Raj, Ram Gopal; Rajandram, Retnagowri; Shaikh, Khairunisa; Al-Garadi, Mohammed Ali
2017-01-01
Objectives Widespread implementation of electronic databases has improved the accessibility of plaintext clinical information for supplementary use. Numerous machine learning techniques, such as supervised machine learning approaches or ontology-based approaches, have been employed to obtain useful information from plaintext clinical data. This study proposes an automatic multi-class classification system to predict accident-related causes of death from plaintext autopsy reports through expert-driven feature selection with supervised automatic text classification decision models. Methods Accident-related autopsy reports were obtained from one of the largest hospitals in Kuala Lumpur. These reports belong to nine different accident-related causes of death. A master feature vector was prepared by extracting features from the collected autopsy reports using unigrams with lexical categorization. This master feature vector was used to detect cause of death [according to the International Classification of Diseases version 10 (ICD-10) classification system] through five automated feature selection schemes, the proposed expert-driven approach, five subset sizes of features, and five machine learning classifiers. Model performance was evaluated using macro-averaged precision, recall, and F-measure, as well as accuracy and area under the ROC curve. Four baselines were used to compare the results with the proposed system. Results Random forest and J48 decision models parameterized using expert-driven feature selection yielded the highest evaluation measures, approaching 85% to 90% for most metrics, with a feature subset size of 30. The proposed system also showed approximately 14% to 16% improvement in overall accuracy compared with the existing techniques and four baselines. Conclusion The proposed system is feasible and practical to use for automatic classification of ICD-10-related cause of death from autopsy reports.
The proposed system assists pathologists to accurately and rapidly determine underlying cause of death based on autopsy findings. Furthermore, the proposed expert-driven feature selection approach and the findings are generally applicable to other kinds of plaintext clinical reports. PMID:28166263
Toward a user-driven approach to radiology software solutions: putting the wag back in the dog.
Morgan, Matthew; Mates, Jonathan; Chang, Paul
2006-09-01
The relationship between healthcare providers and the software industry is evolving. In many cases, industry's traditional, market-driven model is failing to meet the increasingly sophisticated and appropriately individualized needs of providers. Advances in both technology infrastructure and development methodologies have set the stage for the transition from a vendor-driven to a more user-driven process of solution engineering. To make this transition, providers must take an active role in the development process and vendors must provide flexible frameworks on which to build. Only then can the provider/vendor relationship mature from a purchaser/supplier to a codesigner/partner model, where true insight and innovation can occur.
Self-consistent modelling of line-driven hot-star winds with Monte Carlo radiation hydrodynamics
NASA Astrophysics Data System (ADS)
Noebauer, U. M.; Sim, S. A.
2015-11-01
Radiative pressure exerted by line interactions is a prominent driver of outflows in astrophysical systems, being at work in the outflows emerging from hot stars or from the accretion discs of cataclysmic variables, massive young stars and active galactic nuclei. In this work, a new radiation hydrodynamical approach to model line-driven hot-star winds is presented. By coupling a Monte Carlo radiative transfer scheme with a finite volume fluid dynamical method, line-driven mass outflows may be modelled self-consistently, benefiting from the advantages of Monte Carlo techniques in treating multiline effects, such as multiple scatterings, and in dealing with arbitrary multidimensional configurations. In this work, we introduce our approach in detail by highlighting the key numerical techniques and verifying their operation in a number of simplified applications, specifically in a series of self-consistent, one-dimensional, Sobolev-type, hot-star wind calculations. The utility and accuracy of our approach are demonstrated by comparing the obtained results with the predictions of various formulations of the so-called CAK theory and by confronting the calculations with modern sophisticated techniques of predicting the wind structure. Using these calculations, we also point out some useful diagnostic capabilities our approach provides. Finally, we discuss some of the current limitations of our method, some possible extensions and potential future applications.
Design and experiment of data-driven modeling and flutter control of a prototype wing
NASA Astrophysics Data System (ADS)
Lum, Kai-Yew; Xu, Cai-Lin; Lu, Zhenbo; Lai, Kwok-Leung; Cui, Yongdong
2017-06-01
This paper presents an approach for data-driven modeling of aeroelasticity and its application to flutter control design of a wind-tunnel wing model. Modeling is centered on system identification of unsteady aerodynamic loads using computational fluid dynamics data, and adopts a nonlinear multivariable extension of the Hammerstein-Wiener system. The formulation is in modal coordinates of the elastic structure, and yields a reduced-order model of the aeroelastic feedback loop that is parametrized by airspeed. Flutter suppression is thus cast as a robust stabilization problem over uncertain airspeed, for which a low-order H∞ controller is computed. The paper discusses in detail parameter sensitivity and observability of the model, the former to justify the chosen model structure, and the latter to provide a criterion for physical sensor placement. Wind tunnel experiments confirm the validity of the modeling approach and the effectiveness of the control design.
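The Hammerstein-Wiener structure mentioned above chains a static input nonlinearity, a linear dynamic block, and a static output nonlinearity. A minimal scalar sketch of that structure (the nonlinearities and system matrices below are illustrative assumptions, not the identified wing model):

```python
import numpy as np

def simulate_hw(u, A, B, C, f=np.tanh, g=lambda y: y + 0.1 * y**3):
    """Simulate a Hammerstein-Wiener model: static input nonlinearity f,
    linear discrete-time state-space dynamics (A, B, C), output nonlinearity g."""
    x = np.zeros(A.shape[0])
    ys = []
    for uk in u:
        v = f(uk)            # input nonlinearity
        x = A @ x + B * v    # linear dynamic block
        ys.append(g(C @ x))  # output nonlinearity
    return np.array(ys)

A = np.array([[0.9, 0.1], [0.0, 0.8]])  # illustrative stable dynamics
B = np.array([0.0, 1.0])
C = np.array([1.0, 0.0])
y = simulate_hw(np.ones(50), A, B, C)   # step response of the cascade
```

In the paper's setting, the linear block would be identified from CFD data in modal coordinates and parametrized by airspeed; this sketch shows only the block structure.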
A Systems' Biology Approach to Study MicroRNA-Mediated Gene Regulatory Networks
Kunz, Manfred; Vera, Julio; Wolkenhauer, Olaf
2013-01-01
MicroRNAs (miRNAs) are potent effectors in gene regulatory networks where aberrant miRNA expression can contribute to human diseases such as cancer. For a better understanding of the regulatory role of miRNAs in coordinating gene expression, we here present a systems biology approach combining data-driven modeling and model-driven experiments. Such an approach is characterized by an iterative process, including biological data acquisition and integration, network construction, mathematical modeling and experimental validation. To demonstrate the application of this approach, we adopt it to investigate mechanisms of collective repression on p21 by multiple miRNAs. We first construct a p21 regulatory network based on data from the literature and further expand it using algorithms that predict molecular interactions. Based on the network structure, a detailed mechanistic model is established and its parameter values are determined using data. Finally, the calibrated model is used to study the effect of different miRNA expression profiles and cooperative target regulation on p21 expression levels in different biological contexts. PMID:24350286
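As a hedged toy illustration of the kind of mechanistic model calibrated in such studies (the two-miRNA setup and all parameter values are invented for illustration, not the paper's calibrated p21 model), collective repression of a target mRNA and its protein can be written as a small ODE system:

```python
from scipy.integrate import solve_ivp

def model(t, y, k_m, d_m, k_p, d_p, gammas, mirnas):
    """Toy target-gene model: each miRNA adds a repression term on the mRNA."""
    m, p = y
    repression = sum(g * mi for g, mi in zip(gammas, mirnas)) * m
    dm = k_m - d_m * m - repression   # mRNA: synthesis, decay, miRNA repression
    dp = k_p * m - d_p * p            # protein: translation and decay
    return [dm, dp]

params = dict(k_m=1.0, d_m=0.1, k_p=0.5, d_p=0.2,
              gammas=[0.05, 0.08], mirnas=[1.0, 1.0])
sol = solve_ivp(model, (0, 100), [0.0, 0.0], args=tuple(params.values()))
print(sol.y[1, -1])  # steady-state protein level under combined repression
```

Varying the `mirnas` vector mimics different miRNA expression profiles, which is the kind of in-silico experiment the calibrated model is used for.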
Can We Practically Bring Physics-based Modeling Into Operational Analytics Tools?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Granderson, Jessica; Bonvini, Marco; Piette, Mary Ann
Analytics software is increasingly used to improve and maintain operational efficiency in commercial buildings. Energy managers, owners, and operators are using a diversity of commercial offerings, often referred to as Energy Information Systems, Fault Detection and Diagnostic (FDD) systems, or, more broadly, Energy Management and Information Systems, to cost-effectively enable savings on the order of ten to twenty percent. Most of these systems use data from meters and sensors, with rule-based and/or data-driven models, to characterize system and building behavior. In contrast, physics-based modeling uses first-principles and engineering models (e.g., efficiency curves) to characterize system and building behavior. Historically, these physics-based approaches have been used in the design phase of the building life cycle or in retrofit analyses. Researchers have begun exploring the benefits of integrating physics-based models with operational data analytics tools, bridging the gap between design and operations. In this paper, we detail the development and operator use of a software tool that uses hybrid data-driven and physics-based approaches to cooling plant FDD and optimization. Specifically, we describe the system architecture, models, and FDD and optimization algorithms; advantages and disadvantages with respect to purely data-driven approaches; and practical implications for scaling and replicating these techniques. Finally, we conclude with an evaluation of the future potential for such tools and future research opportunities.
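As a hedged sketch of the rule-based side of such FDD systems (the sensor names and threshold below are illustrative assumptions, not fields or rules from the tool described above), a single diagnostic rule might look like:

```python
def check_chiller_approach(samples, max_approach_k=2.0):
    """Flag time steps where the condenser approach temperature is excessive,
    a common rule-based symptom of condenser fouling. Each sample maps
    sensor names to readings in kelvin (names are hypothetical)."""
    faults = []
    for i, s in enumerate(samples):
        approach = s["cond_leaving_water_t"] - s["cond_sat_refrigerant_t"]
        if approach > max_approach_k:
            faults.append(i)
    return faults

data = [{"cond_leaving_water_t": 302.0, "cond_sat_refrigerant_t": 301.2},
        {"cond_leaving_water_t": 303.5, "cond_sat_refrigerant_t": 300.9}]
print(check_chiller_approach(data))  # → [1]
```

A hybrid tool of the kind described would replace the fixed threshold with a physics-based expectation (e.g., from an efficiency curve) evaluated at the current operating conditions.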
NASA Technical Reports Server (NTRS)
Gwaltney, D. A.
2002-01-01
A FY 2001 Center Director's Discretionary Fund task to develop a test platform for the development, implementation, and evaluation of adaptive and other advanced control techniques for brushless DC (BLDC) motor-driven mechanisms is described. Important applications for BLDC motor-driven mechanisms are the translation of specimens in microgravity experiments and electromechanical actuation of nozzle and fuel valves in propulsion systems. Motor-driven aerocontrol surfaces are also being utilized in developmental X vehicles. The experimental test platform employs a linear translation stage that is mounted vertically and driven by a BLDC motor. Control approaches are implemented on a digital signal processor-based controller for real-time, closed-loop control of the stage carriage position. The goal of the effort is to explore the application of advanced control approaches that can enhance the performance of a motor-driven actuator over the performance obtained using linear control approaches with fixed gains. Adaptive controllers, including an exact-model-knowledge controller and a self-tuning controller, are implemented, and the control system performance is illustrated through the presentation of experimental results.
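As a hedged sketch of the contrast drawn above between fixed-gain and self-tuning control (the plant, gains, and adaptation law below are toy illustrations, not the platform's controllers), a position loop whose proportional gain adapts online can be written as:

```python
def run_loop(target=1.0, steps=200, dt=0.01):
    """Discrete PD position loop with a crude gradient-style gain adaptation,
    closed around a toy unit-mass plant with viscous damping."""
    pos, vel = 0.0, 0.0
    kp, kd = 20.0, 2.0          # initial gains (illustrative)
    gamma = 0.5                 # adaptation rate (assumption)
    for _ in range(steps):
        err = target - pos
        u = kp * err - kd * vel         # PD control law
        kp += gamma * err * err * dt    # self-tuning update on squared error
        acc = u - 0.5 * vel             # toy mass-damper plant dynamics
        vel += acc * dt                 # forward-Euler integration
        pos += vel * dt
    return pos

final_pos = run_loop()  # carriage settles near the commanded position
```

A fixed-gain linear controller would simply omit the `kp` update; the adaptive variants evaluated on the real platform use far more principled update laws.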
An Hypothesis-Driven, Molecular Phylogenetics Exercise for College Biology Students
ERIC Educational Resources Information Center
Parker, Joel D.; Ziemba, Robert E.; Cahan, Sara Helms; Rissing, Steven W.
2004-01-01
This hypothesis-driven laboratory exercise teaches how DNA evidence can be used to investigate an organism's evolutionary history while providing practical modeling of the fundamental processes of gene transcription and translation. We used an inquiry-based approach to construct a laboratory around a nontrivial, open-ended evolutionary question…
Constraint-Driven Software Design: An Escape from the Waterfall Model.
ERIC Educational Resources Information Center
de Hoog, Robert; And Others
1994-01-01
Presents the principles of a development methodology for software design based on a nonlinear, product-driven approach that integrates quality aspects. Two examples are given to show that the flexibility needed for building high quality systems leads to integrated development environments in which methodology, product, and tools are closely…
Geerts, Hugo; Hofmann-Apitius, Martin; Anastasio, Thomas J
2017-11-01
Neurodegenerative diseases such as Alzheimer's disease (AD) follow a slowly progressing dysfunctional trajectory, with a large presymptomatic component and many comorbidities. Using preclinical models and large-scale omics studies ranging from genetics to imaging, a large number of processes that might be involved in AD pathology at different stages and levels have been identified. The sheer number of putative hypotheses makes it almost impossible to estimate their contribution to the clinical outcome and to develop a comprehensive view on the pathological processes driving the clinical phenotype. Traditionally, bioinformatics approaches have provided correlations and associations between processes and phenotypes. Focusing on causality, a new breed of advanced and more quantitative modeling approaches that use formalized domain expertise offer new opportunities to integrate these different modalities and outline possible paths toward new therapeutic interventions. This article reviews three different computational approaches and their possible complementarities. Process algebras, implemented using declarative programming languages such as Maude, facilitate simulation and analysis of complicated biological processes on a comprehensive but coarse-grained level. A model-driven Integration of Data and Knowledge, based on the OpenBEL platform and using reverse causative reasoning and network jump analysis, can generate mechanistic knowledge and a new, mechanism-based taxonomy of disease. Finally, Quantitative Systems Pharmacology is based on formalized implementation of domain expertise in a more fine-grained, mechanism-driven, quantitative, and predictive humanized computer model. We propose a strategy to combine the strengths of these individual approaches for developing powerful modeling methodologies that can provide actionable knowledge for rational development of preventive and therapeutic interventions. 
Development of these computational approaches is likely to be required for further progress in understanding and treating AD. Copyright © 2017 the Alzheimer's Association. Published by Elsevier Inc. All rights reserved.
Modeling and Calibration of a Novel One-Mirror Galvanometric Laser Scanner
Yu, Chengyi; Chen, Xiaobo; Xi, Juntong
2017-01-01
A laser stripe sensor has limited application when a point cloud of geometric samples on the surface of an object needs to be collected, so a galvanometric laser scanner is designed using a one-mirror galvanometer element as the mechanical device that drives the laser stripe to sweep across the object. A novel mathematical model is derived for the proposed galvanometric laser scanner without any positional assumptions, and a model-driven calibration procedure is then proposed. Compared with available model-driven approaches, the influence of machining and assembly errors is considered in the proposed model. Meanwhile, a plane-constraint-based approach is proposed to extract a large number of calibration points effectively and accurately to calibrate the galvanometric laser scanner. The repeatability and accuracy of the galvanometric laser scanner are evaluated on an automobile production line to verify the efficiency and accuracy of the proposed calibration method. Experimental results show that the proposed calibration approach yields measurement performance similar to that of a look-up-table calibration method. PMID:28098844
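The plane-constraint idea above rests on a simple fact: calibration points measured on a reference flat must satisfy a single plane equation, so a least-squares plane fit exposes the calibration residuals. A hedged sketch of that fit (the points and formulation are illustrative, not the paper's procedure):

```python
import numpy as np

def fit_plane(points):
    """Fit z = a*x + b*y + c to an (N, 3) array of points by least squares."""
    A = np.c_[points[:, 0], points[:, 1], np.ones(len(points))]
    coeffs, *_ = np.linalg.lstsq(A, points[:, 2], rcond=None)
    return coeffs  # (a, b, c)

pts = np.array([[0, 0, 1.0], [1, 0, 1.5], [0, 1, 2.0], [1, 1, 2.5]])
a, b, c = fit_plane(pts)                       # here: z = 0.5*x + 1.0*y + 1.0
residuals = pts @ np.array([a, b, -1]) + c     # distance-like misfit per point
```

In a real calibration, the residuals would feed an optimizer adjusting the scanner model parameters rather than being read off directly.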
Louis R. Iverson; Anantha M. Prasad; Stephen N. Matthews; Matthew P. Peters
2011-01-01
We present an approach to modeling potential climate-driven changes in habitat for tree and bird species in the eastern United States. First, we took an empirical-statistical modeling approach, using randomForest, with species abundance data from national inventories combined with soil, climate, and landscape variables, to build abundance-based habitat models for 134...
A review of surrogate models and their application to groundwater modeling
NASA Astrophysics Data System (ADS)
Asher, M. J.; Croke, B. F. W.; Jakeman, A. J.; Peeters, L. J. M.
2015-08-01
The spatially and temporally variable parameters and inputs to complex groundwater models typically result in long runtimes which hinder comprehensive calibration, sensitivity, and uncertainty analysis. Surrogate modeling aims to provide a simpler, and hence faster, model which emulates the specified output of a more complex model as a function of its inputs and parameters. In this review paper, we summarize surrogate modeling techniques in three categories: data-driven, projection, and hierarchical-based approaches. Data-driven surrogates approximate a groundwater model through an empirical model that captures the input-output mapping of the original model. Projection-based models reduce the dimensionality of the parameter space by projecting the governing equations onto a basis of orthonormal vectors. In hierarchical or multifidelity methods the surrogate is created by simplifying the representation of the physical system, such as by ignoring certain processes, or reducing the numerical resolution. In discussing the application to groundwater modeling of these methods, we note several imbalances in the existing literature: a large body of work on data-driven approaches seemingly ignores major drawbacks to the methods; only a fraction of the literature focuses on creating surrogates to reproduce outputs of fully distributed groundwater models, despite these being ubiquitous in practice; and a number of the more advanced surrogate modeling methods are yet to be fully applied in a groundwater modeling context.
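The data-driven category above can be illustrated in a few lines: sample an expensive model, fit a cheap emulator to the input-output pairs, and evaluate the emulator instead. A hedged sketch (the "complex model" is a stand-in function, not an actual groundwater code, and a polynomial is just one possible emulator):

```python
import numpy as np

def complex_model(x):
    """Placeholder for a slow simulator (e.g., one scalar model output)."""
    return np.sin(3 * x) + 0.5 * x

# Data-driven surrogate: fit a polynomial emulator to sampled runs.
x_train = np.linspace(0, 2, 20)
y_train = complex_model(x_train)
surrogate = np.poly1d(np.polyfit(x_train, y_train, deg=6))

# The cheap surrogate now stands in for the simulator during, e.g.,
# calibration or uncertainty analysis.
x_test = np.linspace(0, 2, 100)
max_err = np.max(np.abs(surrogate(x_test) - complex_model(x_test)))
```

The review's caution applies directly here: such emulators are only trustworthy inside the sampled input range, a drawback easy to overlook.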
Quantum correlations and limit cycles in the driven-dissipative Heisenberg lattice
NASA Astrophysics Data System (ADS)
Owen, E. T.; Jin, J.; Rossini, D.; Fazio, R.; Hartmann, M. J.
2018-04-01
Driven-dissipative quantum many-body systems have attracted increasing interest in recent years as they lead to novel classes of quantum many-body phenomena. In particular, mean-field calculations predict limit cycle phases, slow oscillations instead of stationary states, in the long-time limit for a number of driven-dissipative quantum many-body systems. Using a cluster mean-field and a self-consistent Mori projector approach, we explore the persistence of such limit cycles as short range quantum correlations are taken into account in a driven-dissipative Heisenberg model.
Lee, Y; Tien, J M
2001-01-01
We present mathematical models that determine the optimal parameters for strategically routing multidestination traffic in an end-to-end network setting. Multidestination traffic refers to a traffic type that can be routed to any one of a multiple number of destinations. A growing number of communication services are based on multidestination routing. In this parameter-driven approach, a multidestination call is routed to one of the candidate destination nodes in accordance with predetermined decision parameters associated with each candidate node. We present three different approaches: (1) a link utilization (LU) approach, (2) a network cost (NC) approach, and (3) a combined parametric (CP) approach. The LU approach provides the solution that would result in an optimally balanced link utilization, whereas the NC approach provides the least expensive way to route traffic to destinations. The CP approach, on the other hand, provides multiple solutions that help leverage link utilization and cost. The LU approach has in fact been implemented by a long distance carrier, resulting in a considerable efficiency improvement in its international direct services, as summarized in the paper.
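The LU idea above can be sketched greedily: send each arriving multidestination call to the candidate whose path would end up least utilized. This hedged toy (data structures and numbers are illustrative, not the carrier's implementation, and the paper's models optimize the decision parameters rather than deciding per call) shows the selection rule:

```python
def route_call(candidates, load, capacity):
    """Pick the candidate destination minimizing the resulting link
    utilization if this call is added to its path."""
    def resulting_util(d):
        return (load[d] + 1) / capacity[d]
    return min(candidates, key=resulting_util)

load = {"A": 40, "B": 10, "C": 25}        # calls currently carried per path
capacity = {"A": 50, "B": 50, "C": 50}    # path capacities
print(route_call(["A", "B", "C"], load, capacity))  # → B
```

The NC variant would replace `resulting_util` with a per-destination cost, and the CP variant would score a weighted combination of the two.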
McFadden, David G.; Politi, Katerina; Bhutkar, Arjun; Chen, Frances K.; Song, Xiaoling; Pirun, Mono; Santiago, Philip M.; Kim-Kiselak, Caroline; Platt, James T.; Lee, Emily; Hodges, Emily; Rosebrock, Adam P.; Bronson, Roderick T.; Socci, Nicholas D.; Hannon, Gregory J.; Jacks, Tyler; Varmus, Harold
2016-01-01
Genetically engineered mouse models (GEMMs) of cancer are increasingly being used to assess putative driver mutations identified by large-scale sequencing of human cancer genomes. To accurately interpret experiments that introduce additional mutations, an understanding of the somatic genetic profile and evolution of GEMM tumors is necessary. Here, we performed whole-exome sequencing of tumors from three GEMMs of lung adenocarcinoma driven by mutant epidermal growth factor receptor (EGFR), mutant Kirsten rat sarcoma viral oncogene homolog (Kras), or overexpression of MYC proto-oncogene. Tumors from EGFR- and Kras-driven models exhibited, respectively, 0.02 and 0.07 nonsynonymous mutations per megabase, a dramatically lower average mutational frequency than observed in human lung adenocarcinomas. Tumors from models driven by strong cancer drivers (mutant EGFR and Kras) harbored few mutations in known cancer genes, whereas tumors driven by MYC, a weaker initiating oncogene in the murine lung, acquired recurrent clonal oncogenic Kras mutations. In addition, although EGFR- and Kras-driven models both exhibited recurrent whole-chromosome DNA copy number alterations, the specific chromosomes altered by gain or loss were different in each model. These data demonstrate that GEMM tumors exhibit relatively simple somatic genotypes compared with human cancers of a similar type, making these autochthonous model systems useful for additive engineering approaches to assess the potential of novel mutations on tumorigenesis, cancer progression, and drug sensitivity. PMID:27702896
Enrichment Clusters: A Practical Plan for Real-World, Student-Driven Learning.
ERIC Educational Resources Information Center
Renzulli, Joseph S.; Gentry, Marcia; Reis, Sally M.
This guidebook provides a rationale and guidelines for implementing a student-driven learning approach using enrichment clusters. Enrichment clusters allow students who share a common interest to meet each week to produce a product, performance, or targeted service based on that common interest. Chapter 1 discusses different models of learning.…
Closing the Loop: How We Better Serve Our Students through a Comprehensive Assessment Process
ERIC Educational Resources Information Center
Arcario, Paul; Eynon, Bret; Klages, Marisa; Polnariev, Bernard A.
2013-01-01
Outcomes assessment is often driven by demands for accountability. LaGuardia Community College's outcomes assessment model has advanced student learning, shaped academic program development, and created an impressive culture of faculty-driven assessment. Our inquiry-based approach uses ePortfolios for collection of student work and demonstrates…
NASA Astrophysics Data System (ADS)
Zebisch, Marc; Schneiderbauer, Stefan; Petitta, Marcello
2015-04-01
In the last decade, the scope of climate change science has broadened significantly. Fifteen years ago the focus was mainly on understanding climate change, providing climate change scenarios, and giving ideas about potential climate change impacts. Today, adaptation to climate change has become an increasingly important field of politics, and one role of science is to inform and support this process. Therefore, climate change science no longer focuses only on data-driven approaches (such as climate or climate impact models) but is progressively applying and relying on qualitative approaches, including opinion and expertise acquired through interactive processes with local stakeholders and decision makers. Furthermore, climate change science is facing the challenge of normative questions, such as 'how important is a decrease of yield in a developed country where agriculture only represents 3% of the GDP and the supply of agricultural products is strongly linked to global markets and less dependent on local production?'. In this talk we will present examples from various applied research and consultancy projects on climate change vulnerabilities, covering data-driven methods (e.g., remote sensing and modelling) as well as semi-quantitative and qualitative assessment approaches. Finally, we will discuss bottlenecks, pitfalls, and opportunities in transferring climate change science to policy- and decision-maker-oriented climate services.
NASA Astrophysics Data System (ADS)
Lu, Xiaojun; Liu, Changli; Chen, Lei
2018-04-01
In this paper, a redundant piezo-driven stage based on a 3RRR compliant mechanism is introduced, and a master-slave control with trajectory planning (MSCTP) strategy combined with a Bouc-Wen model is proposed to improve its micro-motion tracking performance. The advantage of the proposed controller lies in the fact that its implementation requires only a simple control strategy, without complex modeling, to avoid the master PEA's tracking error. The dynamic model of the slave PEA system with Bouc-Wen hysteresis is established and identified via a particle swarm optimization (PSO) approach. The piezo-driven stage is operated with periods T = 1 s and 2 s to track a prescribed circle. The simulation results show that MSCTP with the Bouc-Wen model reduces the trajectory tracking errors to within the accuracy of our available measurements.
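The Bouc-Wen model referred to above describes hysteresis through an internal state driven by the input rate. A hedged sketch of its simplest form, integrated by forward Euler for a sinusoidal drive (the parameters A, beta, gamma, n are illustrative, not the PSO-identified values from the paper):

```python
import numpy as np

def bouc_wen(u, dt, A=1.0, beta=0.5, gamma=0.3, n=1.0):
    """Integrate the Bouc-Wen hysteresis state z for a displacement
    history u sampled at interval dt (illustrative parameters)."""
    z, zs = 0.0, []
    du = np.gradient(u, dt)                    # input rate
    for dui in du:
        dz = A * dui - beta * abs(dui) * abs(z) ** (n - 1) * z \
             - gamma * dui * abs(z) ** n
        z += dz * dt                           # forward-Euler step
        zs.append(z)
    return np.array(zs)

t = np.linspace(0, 2, 2000)
u = 1e-3 * np.sin(2 * np.pi * t)               # 1 mm amplitude, 1 Hz drive
z = bouc_wen(u, t[1] - t[0])                   # hysteretic internal state
```

Plotting `z` against `u` would show the characteristic hysteresis loop; in identification, PSO searches the parameter space so that the simulated loop matches measured data.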
The application of domain-driven design in NMS
NASA Astrophysics Data System (ADS)
Zhang, Jinsong; Chen, Yan; Qin, Shengjun
2011-12-01
In the traditional data-model-driven design approach, the system analysis and design phases are often separated, which means that requirements information cannot be expressed explicitly. The method also easily leads developers toward process-oriented programming, leaving the code between modules or between hierarchy layers disordered, so it is hard to meet system scalability requirements. This paper proposes a software hierarchy based on a rich domain model according to domain-driven design, named FHRDM, and then the Webwork + Spring + Hibernate (WSH) framework is determined. Domain-driven design aims to construct a domain model that meets both the demands of the field in which the software operates and the needs of software development. In this way, problems in Navigational Maritime System (NMS) development, such as large business volumes, difficult requirements elicitation, high development costs, and long development cycles, can be resolved successfully.
Settgast, Randolph R.; Fu, Pengcheng; Walsh, Stuart D. C.; ...
2016-09-18
This study describes a fully coupled finite element/finite volume approach for simulating field-scale hydraulically driven fractures in three dimensions, using massively parallel computing platforms. The proposed method is capable of capturing realistic representations of local heterogeneities, layering and natural fracture networks in a reservoir. A detailed description of the numerical implementation is provided, along with numerical studies comparing the model with both analytical solutions and experimental results. The results demonstrate the effectiveness of the proposed method for modeling large-scale problems involving hydraulically driven fractures in three dimensions.
Parameterized data-driven fuzzy model based optimal control of a semi-batch reactor.
Kamesh, Reddi; Rani, K Yamuna
2016-09-01
A parameterized data-driven fuzzy (PDDF) model structure is proposed for semi-batch processes, and its application for optimal control is illustrated. The orthonormally parameterized input trajectories, initial states and process parameters are the inputs to the model, which predicts the output trajectories in terms of Fourier coefficients. Fuzzy rules are formulated based on the signs of a linear data-driven model, while the defuzzification step incorporates a linear regression model to shift the domain from input to output domain. The fuzzy model is employed to formulate an optimal control problem for single rate as well as multi-rate systems. Simulation study on a multivariable semi-batch reactor system reveals that the proposed PDDF modeling approach is capable of capturing the nonlinear and time-varying behavior inherent in the semi-batch system fairly accurately, and the results of operating trajectory optimization using the proposed model are found to be comparable to the results obtained using the exact first principles model, and are also found to be comparable to or better than parameterized data-driven artificial neural network model based optimization results. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.
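The orthonormal parameterization of input trajectories mentioned above can be sketched in a few lines: project a batch input profile onto a small orthonormal basis so the optimizer manipulates a handful of coefficients instead of a full trajectory. This hedged illustration uses a half-range cosine basis on a step-like profile (basis size, basis choice, and trajectory are assumptions, not the paper's setup):

```python
import numpy as np

def fourier_basis(t, n_terms, T):
    """Orthonormal half-range cosine basis {1/sqrt(T), sqrt(2/T)cos(pi k t/T)}
    sampled on the grid t over the batch horizon [0, T]."""
    cols = [np.ones_like(t) / np.sqrt(T)]
    for k in range(1, n_terms):
        cols.append(np.sqrt(2 / T) * np.cos(np.pi * k * t / T))
    return np.column_stack(cols)

T = 1.0
t = np.linspace(0, T, 200)
traj = np.where(t < 0.5, 1.0, 0.3)              # example batch input profile
Phi = fourier_basis(t, 8, T)
coeffs, *_ = np.linalg.lstsq(Phi, traj, rcond=None)  # 8-number representation
approx = Phi @ coeffs                            # low-dimensional reconstruction
```

Optimal control over `coeffs` (rather than over 200 samples) is what makes the trajectory optimization problem tractable in the parameterized approach.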
The Data-Driven Approach to Spectroscopic Analyses
NASA Astrophysics Data System (ADS)
Ness, M.
2018-01-01
I review the data-driven approach to spectroscopy, The Cannon, a method for deriving fundamental diagnostics of galaxy formation, namely precise chemical compositions and stellar ages, across the many stellar surveys that are mapping the Milky Way. With The Cannon, the abundances and stellar parameters from the multitude of stellar surveys can be placed directly on the same scale, using stars in common between the surveys. Furthermore, the information that resides in the data can be fully extracted; this has resulted in higher-precision stellar parameters and abundances being delivered from spectroscopic data and has opened up new avenues in galactic archeology, for example, in the determination of ages for red giant stars across the Galactic disk. Coupled with Gaia distances, proper motions, and derived orbit families, the stellar age and individual abundance information delivered at the precision obtained with the data-driven approach provides very strong constraints on the evolution and birthplace of stars in the Milky Way. I will review the role of data-driven spectroscopy as we enter the era where we have both the data and the tools to build the ultimate conglomerate of galactic information, and highlight further applications of data-driven models in the coming decade.
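The core mechanism of The Cannon can be sketched compactly: model each spectral pixel as a polynomial function of stellar labels, train the per-pixel coefficients on reference stars with known labels, then invert the trained model to infer labels for new stars. A hedged, deliberately simplified version (linear in the labels, noiseless synthetic data; The Cannon itself uses a quadratic model with per-pixel noise terms):

```python
import numpy as np

rng = np.random.default_rng(0)
labels = rng.uniform(-1, 1, size=(50, 2))      # e.g. scaled Teff and [Fe/H]
true_coeffs = rng.normal(size=(3, 100))        # hidden per-pixel model
design = np.c_[np.ones(50), labels]            # vectorizer: [1, l1, l2]
flux = design @ true_coeffs                    # 50 reference spectra x 100 px

# Training step: fit per-pixel coefficients from the reference set.
fit_coeffs, *_ = np.linalg.lstsq(design, flux, rcond=None)

# Test step: recover the labels of a new star from its spectrum alone.
new_labels = np.array([0.3, -0.2])
spectrum = np.r_[1.0, new_labels] @ fit_coeffs
recovered, *_ = np.linalg.lstsq(fit_coeffs[1:].T,
                                spectrum - fit_coeffs[0], rcond=None)
```

Training on stars in common between surveys is what places the different surveys' labels on a single scale.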
A systems science perspective and transdisciplinary models for food and nutrition security
Hammond, Ross A.; Dubé, Laurette
2012-01-01
We argue that food and nutrition security is driven by complex underlying systems and that both research and policy in this area would benefit from a systems approach. We present a framework for such an approach, examine key underlying systems, and identify transdisciplinary modeling tools that may prove especially useful. PMID:22826247
Evaluation of a Kinematically-Driven Finite Element Footstrike Model.
Hannah, Iain; Harland, Andy; Price, Dan; Schlarb, Heiko; Lucas, Tim
2016-06-01
A dynamic finite element model of a shod running footstrike was developed and driven with 6 degree of freedom foot segment kinematics determined from a motion capture running trial. Quadratic tetrahedral elements were used to mesh the footwear components, with material models determined from appropriate mechanical tests. Model outputs were compared with experimental high-speed video (HSV) footage, vertical ground reaction force (GRF), and center of pressure (COP) excursion to determine whether such an approach is appropriate for the development of athletic footwear. Although unquantified, good visual agreement with the HSV footage was observed, but significant discrepancies were found between the model and experimental GRF and COP readings (9% and 61% of model readings outside of the mean experimental reading ± 2 standard deviations, respectively). Model output was also found to be highly sensitive to input kinematics, with a 120% increase in maximum GRF observed when translating the force platform 2 mm vertically. While representing an alternative approach to existing dynamic finite element footstrike models, loading highly representative of an experimental trial was not found to be achievable when employing exclusively kinematic boundary conditions. This significantly limits the usefulness of employing such an approach in the footwear development process.
Simplified subsurface modelling: data assimilation and violated model assumptions
NASA Astrophysics Data System (ADS)
Erdal, Daniel; Lange, Natascha; Neuweiler, Insa
2017-04-01
Integrated models are gaining more and more attention in hydrological modelling as they can better represent the interaction between different compartments. Naturally, these models come along with larger numbers of unknowns and requirements on computational resources compared to stand-alone models. If large model domains are to be represented, e.g. on catchment scale, the resolution of the numerical grid needs to be reduced or the model itself needs to be simplified. Both approaches lead to a reduced ability to reproduce the present processes. This lack of model accuracy may be compensated by using data assimilation methods. In these methods observations are used to update the model states, and optionally model parameters as well, in order to reduce the model error induced by the imposed simplifications. What is unclear is whether these methods combined with strongly simplified models result in completely data-driven models or if they can even be used to make adequate predictions of the model state for times when no observations are available. In the current work we consider the combined groundwater and unsaturated zone, which can be modelled in a physically consistent way using 3D-models solving the Richards equation. For use in simple predictions, however, simpler approaches may be considered. The question investigated here is whether a simpler model, in which the groundwater is modelled as a horizontal 2D-model and the unsaturated zones as a few sparse 1D-columns, can be used within an Ensemble Kalman filter to give predictions of groundwater levels and unsaturated fluxes. This is tested under conditions where the feedback between the two model-compartments are large (e.g. shallow groundwater table) and the simplification assumptions are clearly violated. 
Such a case may be a steep hillslope or pumping wells creating lateral fluxes in the unsaturated zone, or strong heterogeneous structures creating unaccounted-for flows in both the saturated and unsaturated compartments. Under such circumstances, direct modelling using a simplified model will not provide good results. However, a more data-driven (e.g., grey-box) approach, driven by the filter, may still provide an improved understanding of the system. Comparisons between full 3D simulations and simplified filter-driven models will be shown, and the resulting benefits and drawbacks will be discussed.
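The ensemble Kalman filter analysis step that drives such a simplified model can be sketched briefly. This hedged example (a three-variable state with one observed groundwater head; all numbers illustrative, and a linear observation operator assumed) shows the stochastic-perturbation form of the update:

```python
import numpy as np

def enkf_update(ensemble, obs, obs_err, H):
    """Stochastic EnKF analysis step.
    ensemble: (n_ens, n_state); obs: (n_obs,); obs_err: (n_obs,) std devs;
    H: (n_obs, n_state) linear observation operator."""
    n_ens = ensemble.shape[0]
    X = ensemble - ensemble.mean(axis=0)          # state anomalies
    HX = ensemble @ H.T                           # predicted observations
    HXp = HX - HX.mean(axis=0)
    P_xy = X.T @ HXp / (n_ens - 1)                # state-obs covariance
    P_yy = HXp.T @ HXp / (n_ens - 1) + np.diag(obs_err ** 2)
    K = P_xy @ np.linalg.inv(P_yy)                # Kalman gain
    perturbed = obs + obs_err * np.random.randn(n_ens, len(obs))
    return ensemble + (perturbed - HX) @ K.T      # analysis ensemble

np.random.seed(1)
ens = 5.0 + np.random.randn(100, 3)               # prior ensemble of heads
H = np.array([[1.0, 0.0, 0.0]])                   # observe the first head only
post = enkf_update(ens, np.array([4.0]), np.array([0.1]), H)
```

The unobserved state components are corrected only through their sampled covariance with the observed head, which is exactly how the filter compensates for missing physics in a simplified model.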
Exploring a model-driven architecture (MDA) approach to health care information systems development.
Raghupathi, Wullianallur; Umar, Amjad
2008-05-01
The objective is to explore the potential of the model-driven architecture (MDA) in health care information systems development. An MDA is conceptualized and developed for a health clinic system to track patient information, and a prototype of the MDA is implemented using an advanced MDA tool. The UML provides the underlying modeling support in the form of the class diagram, and PIM-to-PSM transformation rules are applied to generate the prototype application from the model. The result of the research is a complete MDA methodology for developing health care information systems. Additional insights gained include the development of transformation rules and documentation of the challenges in applying MDA to health care; design guidelines for future MDA applications are described. The model has the potential for generalizability, and the overall approach supports limited interoperability and portability. The research demonstrates the applicability of the MDA approach to health care information systems development. When properly implemented, it has the potential to overcome the challenges of platform (vendor) dependency, lack of open standards, interoperability, portability, scalability, and the high cost of implementation.
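The PIM-to-PSM transformation at the heart of MDA can be illustrated with a deliberately tiny toy: a platform-independent class description is mechanically transformed into platform-specific code (here, a SQL DDL statement). The class name, attributes, and type map below are hypothetical; real MDA tools apply far richer, standardized rule sets:

```python
pim = {"class": "Patient",
       "attributes": [("patient_id", "Integer"), ("name", "String")]}

TYPE_MAP = {"Integer": "INT", "String": "VARCHAR(255)"}  # transformation rule

def pim_to_sql(model):
    """Apply the type-mapping rule to emit a platform-specific artifact."""
    cols = ", ".join(f"{n} {TYPE_MAP[t]}" for n, t in model["attributes"])
    return f"CREATE TABLE {model['class']} ({cols});"

print(pim_to_sql(pim))
# → CREATE TABLE Patient (patient_id INT, name VARCHAR(255));
```

Swapping `TYPE_MAP` (and the template) retargets the same PIM to a different platform, which is the portability argument the paper evaluates for health care systems.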
NASA Astrophysics Data System (ADS)
Turner, D. P.; Jacobson, A. R.; Nemani, R. R.
2013-12-01
The recent development of large spatially explicit datasets for multiple variables relevant to monitoring terrestrial carbon flux offers the opportunity to estimate the terrestrial land flux using several alternative, potentially complementary, approaches. Here we developed and compared regional estimates of net ecosystem exchange (NEE) over the Pacific Northwest region of the U.S. using three approaches. In the prognostic modeling approach, the process-based Biome-BGC model was driven by distributed meteorological station data and was informed by Landsat-based coverages of forest stand age and disturbance regime. In the diagnostic modeling approach, the quasi-mechanistic CFLUX model estimated net ecosystem production (NEP) by upscaling eddy covariance flux tower observations. The model was driven by distributed climate data and MODIS FPAR (the fraction of incident PAR that is absorbed by the vegetation canopy). It was informed by coarse-resolution (1 km) data on forest stand age. In both the prognostic and diagnostic modeling approaches, emissions estimates for biomass burning, harvested products, and river/stream evasion were added to model-based NEP to obtain NEE. The inversion model (CarbonTracker) relied on observations of atmospheric CO2 concentration to optimize prior surface carbon flux estimates. The Pacific Northwest is heterogeneous with respect to land cover and forest management, and repeated surveys of forest inventory plots support the presence of a strong regional carbon sink. The diagnostic model suggested a stronger carbon sink than the prognostic model, and a much larger sink than the inversion model. The introduction of Landsat data on disturbance history served to reduce uncertainty with respect to regional NEE in the diagnostic and prognostic modeling approaches. The FPAR data were particularly helpful in capturing the seasonality of the carbon flux in the diagnostic modeling approach.
The inversion approach took advantage of a global network of CO2 observation stations, but had difficulty resolving regional fluxes such as that in the PNW given the still sparse nature of the CO2 measurement network.
Dynamical prediction of flu seasonality driven by ambient temperature: influenza vs. common cold
NASA Astrophysics Data System (ADS)
Postnikov, Eugene B.
2016-01-01
This work presents a comparative analysis of Influenzanet data for influenza itself and common cold in the Netherlands during the last 5 years, from the point of view of modelling by linearised SIRS equations parametrically driven by the ambient temperature. It is argued that this approach allows for the forecast of common cold, but not of influenza in a strict sense. The difference in their kinetic models is discussed with reference to the clinical background.
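The flavour of the parametrically temperature-driven SIRS dynamics discussed above can be conveyed with a short numerical sketch. The integration below uses a plain (nonlinearised) SIRS model whose transmission rate is modulated by a sinusoidal annual temperature cycle; the coupling form and all parameter values are hypothetical and chosen only for demonstration, not taken from the paper.

```python
import numpy as np

def sirs_temperature(days, T_daily, beta0=0.3, alpha=0.02, gamma=1/7, xi=1/180,
                     s0=0.6, i0=0.01):
    """Forward-Euler integration (dt = 1 day) of an SIRS model with a
    temperature-driven transmission rate:

        beta(t) = beta0 * (1 + alpha * (T_ref - T(t)))

    i.e. colder-than-average days raise transmission (an illustrative,
    hypothetical coupling). Returns the infected fraction over time."""
    T_ref = np.mean(T_daily)
    s, i = s0, i0
    r = 1.0 - s0 - i0
    infected = []
    for t in range(days):
        beta = beta0 * (1.0 + alpha * (T_ref - T_daily[t]))
        ds = -beta * s * i + xi * r          # susceptibles: infection + waning immunity
        di = beta * s * i - gamma * i        # infected: infection - recovery
        dr = gamma * i - xi * r              # recovered: recovery - waning immunity
        s, i, r = s + ds, i + di, r + dr
        infected.append(i)
    return np.array(infected)

# Sinusoidal annual temperature, coldest around day 0
days = 365
temps = 10.0 - 8.0 * np.cos(2 * np.pi * np.arange(days) / 365.0)
traj = sirs_temperature(days, temps)
```

Because transmission peaks in the cold season, the infected fraction rises and falls with the temperature forcing rather than settling at a fixed endemic level.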
NASA Astrophysics Data System (ADS)
Rose, A.; McKee, J.; Weber, E.; Bhaduri, B. L.
2017-12-01
Leveraging decades of expertise in population modeling, and in response to growing demand for higher resolution population data, Oak Ridge National Laboratory is now generating LandScan HD at global scale. LandScan HD is conceived as a 90m resolution population distribution where modeling is tailored to the unique geography and data conditions of individual countries or regions by combining social, cultural, physiographic, and other information with novel geocomputation methods. Similarities among these areas are exploited in order to leverage existing training data and machine learning algorithms to rapidly scale development. Drawing on ORNL's unique set of capabilities, LandScan HD adapts highly mature population modeling methods developed for LandScan Global and LandScan USA, settlement mapping research and production in high-performance computing (HPC) environments, land use and neighborhood mapping through image segmentation, and facility-specific population density models. Adopting a flexible methodology to accommodate different geographic areas, LandScan HD accounts for the availability, completeness, and level of detail of relevant ancillary data. Beyond core population and mapped settlement inputs, these factors determine the model complexity for an area, requiring that for any given area, a data-driven model could support either a simple top-down approach, a more detailed bottom-up approach, or a hybrid approach.
Linear dynamical modes as new variables for data-driven ENSO forecast
NASA Astrophysics Data System (ADS)
Gavrilov, Andrey; Seleznev, Aleksei; Mukhin, Dmitry; Loskutov, Evgeny; Feigin, Alexander; Kurths, Juergen
2018-05-01
A new data-driven model for analysis and prediction of spatially distributed time series is proposed. The model is based on a linear dynamical mode (LDM) decomposition of the observed data which is derived from a recently developed nonlinear dimensionality reduction approach. The key point of this approach is its ability to take into account simple dynamical properties of the observed system by means of revealing the system's dominant time scales. The LDMs are used as new variables for empirical construction of a nonlinear stochastic evolution operator. The method is applied to the sea surface temperature anomaly field in the tropical belt where the El Nino Southern Oscillation (ENSO) is the main mode of variability. The advantage of LDMs versus traditionally used empirical orthogonal function decomposition is demonstrated for this data. Specifically, it is shown that the new model has a competitive ENSO forecast skill in comparison with the other existing ENSO models.
Data-driven Analysis and Prediction of Arctic Sea Ice
NASA Astrophysics Data System (ADS)
Kondrashov, D. A.; Chekroun, M.; Ghil, M.; Yuan, X.; Ting, M.
2015-12-01
We present results of data-driven predictive analyses of sea ice over the main Arctic regions. Our approach relies on the Multilayer Stochastic Modeling (MSM) framework of Kondrashov, Chekroun and Ghil [Physica D, 2015] and leads to prognostic models of sea ice concentration (SIC) anomalies on seasonal time scales. This approach is applied to monthly time series of leading principal components from the multivariate Empirical Orthogonal Function decomposition of SIC and selected climate variables over the Arctic. We evaluate the predictive skill of MSM models by performing retrospective "no look-ahead" forecasts for up to 6 months ahead. It will be shown in particular that the memory effects included in our non-Markovian linear MSM models improve predictions of large-amplitude SIC anomalies in certain Arctic regions. Further improvements within the MSM framework will adopt a nonlinear formulation, as well as alternative data-adaptive decompositions.
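The role of memory (additional time lags) in such empirical prognostic models can be illustrated with a plain linear autoregression, fitted by least squares and iterated forward with no look-ahead. This is only a toy stand-in for the MSM framework, applied here to a synthetic seasonal signal rather than to SIC principal components.

```python
import numpy as np

def fit_ar(series, n_lags):
    """Least-squares fit of x_t = sum_k a_k * x_{t-1-k} + noise."""
    X = np.column_stack([series[n_lags - k - 1 : len(series) - k - 1]
                         for k in range(n_lags)])
    y = series[n_lags:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

def forecast(series, coeffs, steps):
    """'No look-ahead' forecast: iterate the fitted model on its own output."""
    history = list(series[-len(coeffs):])
    out = []
    for _ in range(steps):
        x_next = sum(c * history[-(k + 1)] for k, c in enumerate(coeffs))
        history.append(x_next)
        out.append(x_next)
    return np.array(out)

# Toy stand-in for a seasonal anomaly index: a pure 12-month cycle,
# which satisfies an exact AR(2) recursion sin(wt) = 2cos(w)sin(w(t-1)) - sin(w(t-2))
series = np.sin(2 * np.pi * np.arange(200) / 12.0)
coeffs = fit_ar(series, n_lags=2)
pred = forecast(series, coeffs, steps=6)
```

On this noiseless signal the two-lag model is recovered exactly, so the 6-step forecast continues the cycle; with noisy data, adding lags (memory) trades bias against variance in the same way the non-Markovian terms do in the MSM setting.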
Data-driven approach to human motion modeling with Lua and gesture description language
NASA Astrophysics Data System (ADS)
Hachaj, Tomasz; Koptyra, Katarzyna; Ogiela, Marek R.
2017-03-01
The aim of this paper is to present a novel human motion modelling and recognition approach that enables real-time MoCap signal evaluation. By motion (action) recognition we mean classification. The role of this approach is to propose a syntactic description procedure that can be easily understood, learnt and used in various motion modelling and recognition tasks in all MoCap systems, whether vision- or wearable-sensor-based. To do so, we have prepared an extension of the Gesture Description Language (GDL) methodology that enables movement description and real-time recognition, so that it can use not only positional coordinates of body joints but virtually any type of discretely measured MoCap output signal, such as accelerometer, magnetometer or gyroscope data. We have also prepared and evaluated a cross-platform implementation of this approach using the Lua scripting language and JAVA technology. This implementation is called Data-Driven GDL (DD-GDL). In the tested scenarios the average execution speed is above 100 frames per second, which matches the acquisition rate of many popular MoCap solutions.
Hybrid quantum-classical modeling of quantum dot devices
NASA Astrophysics Data System (ADS)
Kantner, Markus; Mittnenzweig, Markus; Koprucki, Thomas
2017-11-01
The design of electrically driven quantum dot devices for quantum optical applications asks for modeling approaches combining classical device physics with quantum mechanics. We connect the well-established fields of semiclassical semiconductor transport theory and the theory of open quantum systems to meet this requirement. By coupling the van Roosbroeck system with a quantum master equation in Lindblad form, we introduce a new hybrid quantum-classical modeling approach, which provides a comprehensive description of quantum dot devices on multiple scales: it enables the calculation of quantum optical figures of merit and the spatially resolved simulation of the current flow in realistic semiconductor device geometries in a unified way. We construct the interface between both theories in such a way, that the resulting hybrid system obeys the fundamental axioms of (non)equilibrium thermodynamics. We show that our approach guarantees the conservation of charge, consistency with the thermodynamic equilibrium and the second law of thermodynamics. The feasibility of the approach is demonstrated by numerical simulations of an electrically driven single-photon source based on a single quantum dot in the stationary and transient operation regime.
ERIC Educational Resources Information Center
Stanton, Marina R.; Atherton, W. Leigh; Toriello, Paul J.; Hodgson, Jennifer L.
2012-01-01
Although screening, brief intervention, and referral to treatment (SBIRT) has been a popular model to address potential substance abuse issues in primary care, there is a need for innovative approaches for training providers and staff on SBIRT protocols. An interdisciplinary approach to SBIRT training, named ICARE, was implemented at 3 different…
NASA Astrophysics Data System (ADS)
Weidinger, Simon A.; Knap, Michael
2017-04-01
We study the regimes of heating in the periodically driven O(N)-model, which is a well-established model for interacting quantum many-body systems. By computing the absorbed energy with a non-equilibrium Keldysh Green's function approach, we establish three dynamical regimes: at short times a single-particle dominated regime, at intermediate times a stable Floquet prethermal regime in which the system ceases to absorb energy, and at parametrically late times a thermalizing regime. Our simulations suggest that in the thermalizing regime the absorbed energy grows algebraically in time with an exponent that approaches the universal value of 1/2, and is thus significantly slower than linear Joule heating. Our results demonstrate the parametric stability of prethermal states in a many-body system driven at frequencies that are comparable to its microscopic scales. This paves the way for realizing exotic quantum phases, such as time crystals or interacting topological phases, in the prethermal regime of interacting Floquet systems.
Working with the HL7 metamodel in a Model Driven Engineering context.
Martínez-García, A; García-García, J A; Escalona, M J; Parra-Calderón, C L
2015-10-01
HL7 (Health Level 7) International is an organization that defines health information standards. Most HL7 domain information models have been designed in a proprietary graphic language whose domain models are based on the HL7 metamodel. Many researchers have considered using HL7 in the MDE (Model-Driven Engineering) context. A limitation has been identified: all MDE tools support UML (Unified Modeling Language), which is a standard modeling language, but most do not support the HL7 proprietary model language. We want to support software engineers without HL7 experience, enabling them to model real-world problems by defining system requirements in UML that are transparently compliant with HL7 domain models. The objective of the present research is to connect HL7 with software analysis using a generic model-based approach. This paper introduces a first approach to an HL7 MDE solution that considers the MIF (Model Interchange Format) metamodel proposed by HL7, making use of a plug-in developed for the EA (Enterprise Architect) tool. Copyright © 2015 Elsevier Inc. All rights reserved.
Data-Driven H∞ Control for Nonlinear Distributed Parameter Systems.
Luo, Biao; Huang, Tingwen; Wu, Huai-Ning; Yang, Xiong
2015-11-01
The data-driven H∞ control problem of nonlinear distributed parameter systems is considered in this paper. An off-policy learning method is developed to learn the H∞ control policy from real system data rather than from a mathematical model. First, Karhunen-Loève decomposition is used to compute the empirical eigenfunctions, which are then employed to derive a reduced-order model (ROM) of the slow subsystem based on singular perturbation theory. The H∞ control problem is reformulated based on the ROM, where it can in theory be reduced to solving the Hamilton-Jacobi-Isaacs (HJI) equation. To learn the solution of the HJI equation from real system data, a data-driven off-policy learning approach is proposed based on the simultaneous policy update algorithm, and its convergence is proved. For implementation purposes, a neural network (NN)-based action-critic structure is developed, in which a critic NN and two action NNs are employed to approximate the value function and the control and disturbance policies, respectively. Subsequently, a least-squares NN weight-tuning rule is derived with the method of weighted residuals. Finally, the developed data-driven off-policy learning approach is applied to a nonlinear diffusion-reaction process, and the obtained results demonstrate its effectiveness.
Research needs for developing a commodity-driven freight modeling approach.
DOT National Transportation Integrated Search
2003-01-01
It is well known that better freight forecasting models and data are needed, but the literature does not clearly indicate which components of the modeling methodology are most in need of improvement, which is a critical need in an era of limited rese...
Robust Synchronization Models for Presentation System Using SMIL-Driven Approach
ERIC Educational Resources Information Center
Asnawi, Rustam; Ahmad, Wan Fatimah Wan; Rambli, Dayang Rohaya Awang
2013-01-01
Current common Presentation System (PS) models are slide based oriented and lack synchronization analysis either with temporal or spatial constraints. Such models, in fact, tend to lead to synchronization problems, particularly on parallel synchronization with spatial constraints between multimedia element presentations. However, parallel…
A model teaching session for the hypothesis-driven physical examination.
Nishigori, Hiroshi; Masuda, Kozo; Kikukawa, Makoto; Kawashima, Atsushi; Yudkowsky, Rachel; Bordage, Georges; Otaki, Junji
2011-01-01
The physical examination is an essential clinical competence for all physicians. Most medical schools have students learn the physical examination maneuvers using a head-to-toe approach. However, this promotes a rote approach to the physical exam, and it is not uncommon for students to later fail to appreciate the meaning of abnormal findings and their contribution to the diagnostic reasoning process. The purpose of the project was to develop a model teaching session for the hypothesis-driven physical examination (HDPE) approach, in which students could practice the physical examination in the context of diagnostic reasoning. We used an action research methodology to create this HDPE model by developing a teaching session, implementing it over 100 times with approximately 700 students, conducting internal reflection and external evaluations, and making adjustments as needed. A model nine-step HDPE teaching session was developed, including: (1) orientation, (2) anticipation, (3) preparation, (4) role play, (5) discussion-1, (6) answers, (7) discussion-2, (8) demonstration and (9) reflection. A structured model HDPE teaching session and tutor guide were developed into a workable instructional intervention. Faculty members are invited to teach the physical examination using this model.
Supported Employment Handbook: A Customer-Driven Approach for Persons with Significant Disabilities.
ERIC Educational Resources Information Center
Brooke, Valerie, Ed.; And Others
This manual provides training information for implementing supported employment by using a customer-driven approach. Chapter 1, "Supported Employment: A Customer-Driven Approach" (Valerie Brooke and others), describes current best practices, a new customer-driven approach to supported employment, and the role of the employment specialist. Chapter…
A model-driven approach for representing clinical archetypes for Semantic Web environments.
Martínez-Costa, Catalina; Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás; Maldonado, José Alberto
2009-02-01
The life-long clinical information of any person supported by electronic means configures his Electronic Health Record (EHR). This information is usually distributed among several independent and heterogeneous systems that may be syntactically or semantically incompatible. There are currently different standards for representing and exchanging EHR information among different systems. In advanced EHR approaches, clinical information is represented by means of archetypes. Most of these approaches use the Archetype Definition Language (ADL) to specify archetypes. However, ADL has some drawbacks when attempting to perform semantic activities in Semantic Web environments. In this work, Semantic Web technologies are used to specify clinical archetypes for advanced EHR architectures. The advantages of using the Ontology Web Language (OWL) instead of ADL are described and discussed in this work. Moreover, a solution combining Semantic Web and Model-driven Engineering technologies is proposed to transform ADL into OWL for the CEN EN13606 EHR architecture.
NASA Astrophysics Data System (ADS)
Hunter, Jason M.; Maier, Holger R.; Gibbs, Matthew S.; Foale, Eloise R.; Grosvenor, Naomi A.; Harders, Nathan P.; Kikuchi-Miller, Tahali C.
2018-05-01
Salinity modelling in river systems is complicated by a number of processes, including in-stream salt transport and various mechanisms of saline accession that vary dynamically as a function of water level and flow, often at different temporal scales. Traditionally, salinity models in rivers have either been process- or data-driven. The primary problem with process-based models is that in many instances, not all of the underlying processes are fully understood or able to be represented mathematically. There are also often insufficient historical data to support model development. The major limitation of data-driven models, such as artificial neural networks (ANNs) in comparison, is that they provide limited system understanding and are generally not able to be used to inform management decisions targeting specific processes, as different processes are generally modelled implicitly. In order to overcome these limitations, a generic framework for developing hybrid process and data-driven models of salinity in river systems is introduced and applied in this paper. As part of the approach, the most suitable sub-models are developed for each sub-process affecting salinity at the location of interest based on consideration of model purpose, the degree of process understanding and data availability, which are then combined to form the hybrid model. The approach is applied to a 46 km reach of the Murray River in South Australia, which is affected by high levels of salinity. In this reach, the major processes affecting salinity include in-stream salt transport, accession of saline groundwater along the length of the reach and the flushing of three waterbodies in the floodplain during overbank flows of various magnitudes. 
Based on trade-offs between the degree of process understanding and data availability, a process-driven model is developed for in-stream salt transport, an ANN model is used to model saline groundwater accession and three linear regression models are used to account for the flushing of the different floodplain storages. The resulting hybrid model performs very well on approximately 3 years of daily validation data, with a Nash-Sutcliffe efficiency (NSE) of 0.89 and a root mean squared error (RMSE) of 12.62 mg L-1 (over a range from approximately 50 to 250 mg L-1). Each component of the hybrid model results in noticeable improvements in model performance corresponding to the range of flows for which they are developed. The predictive performance of the hybrid model is significantly better than that of a benchmark process-driven model (NSE = -0.14, RMSE = 41.10 mg L-1, Gbench index = 0.90) and slightly better than that of a benchmark data-driven (ANN) model (NSE = 0.83, RMSE = 15.93 mg L-1, Gbench index = 0.36). Apart from improved predictive performance, the hybrid model also has advantages over the ANN benchmark model in terms of increased capacity for improving system understanding and greater ability to support management decisions.
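The skill scores quoted above follow standard definitions and can be computed as below. The benchmark-efficiency form given for the Gbench index is our reading of that metric and should be checked against the paper's exact definition; the observation and simulation values in the example are invented.

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of observations.
    1 is a perfect fit, 0 matches the observed mean, negative is worse
    than predicting the mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def rmse(obs, sim):
    """Root mean squared error, in the units of the observations."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return np.sqrt(np.mean((obs - sim) ** 2))

def gbench(obs, sim, bench):
    """Benchmark efficiency: improvement of sim over a benchmark simulation
    (1 = perfect, 0 = no better than the benchmark)."""
    obs, sim, bench = (np.asarray(a, float) for a in (obs, sim, bench))
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - bench) ** 2)

# Hypothetical daily salinities, mg/L
obs = np.array([60.0, 80.0, 120.0, 150.0])
sim = np.array([65.0, 78.0, 110.0, 155.0])
```

With these definitions, an NSE of 0.89 against an RMSE of 12.62 mg/L (over a roughly 50-250 mg/L range) indicates that the hybrid model explains most of the observed variance.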
Using a logical information model-driven design process in healthcare.
Cheong, Yu Chye; Bird, Linda; Tun, Nwe Ni; Brooks, Colleen
2011-01-01
A hybrid standards-based approach has been adopted in Singapore to develop a Logical Information Model (LIM) for healthcare information exchange. The Singapore LIM uses a combination of international standards, including ISO13606-1 (a reference model for electronic health record communication), ISO21090 (healthcare datatypes), SNOMED CT (healthcare terminology) and HL7 v2 (healthcare messaging). This logic-based design approach also incorporates mechanisms for achieving bi-directional semantic interoperability.
Climate-driven vital rates do not always mean climate-driven population.
Tavecchia, Giacomo; Tenan, Simone; Pradel, Roger; Igual, José-Manuel; Genovart, Meritxell; Oro, Daniel
2016-12-01
Current climatic changes have increased the need to forecast population responses to climate variability. A common approach to this question is through models that project the current population state using the functional relationship between demographic rates and climatic variables. We argue that this approach can lead to erroneous conclusions when interpopulation dispersal is not considered. We found that immigration can release the population from climate-driven trajectories even when local vital rates are climate dependent. We illustrated this using individual-based data on a trans-equatorial migratory seabird, the Scopoli's shearwater Calonectris diomedea, in which the variation of vital rates has been associated with large-scale climatic indices. We compared the annual population growth rate λi, estimated using local climate-driven parameters, with ρi, a population growth rate estimated directly from individual information that accounts for immigration. While λi varied as a function of climatic variables, reflecting the climate-dependent parameters, ρi did not, indicating that dispersal decouples the relationship between population growth and climate variables from that between climatic variables and vital rates. Our results suggest caution when assessing demographic effects of climatic variability, especially in open populations of very mobile organisms such as fish, marine mammals, bats, or birds. When a population model cannot be validated or is not detailed enough, ignoring immigration might lead to misleading climate-driven projections. © 2016 John Wiley & Sons Ltd.
Data-driven non-linear elasticity: constitutive manifold construction and problem discretization
NASA Astrophysics Data System (ADS)
Ibañez, Ruben; Borzacchiello, Domenico; Aguado, Jose Vicente; Abisset-Chavanne, Emmanuelle; Cueto, Elias; Ladeveze, Pierre; Chinesta, Francisco
2017-11-01
The use of constitutive equations calibrated from data has been implemented in standard numerical solvers for successfully addressing a variety of problems encountered in simulation-based engineering sciences (SBES). However, complexity keeps increasing due to the need for increasingly detailed models as well as the use of engineered materials. Data-driven simulation constitutes a potential change of paradigm in SBES. Standard simulation in computational mechanics is based on the use of two very different types of equations. The first, of axiomatic character, is related to balance laws (momentum, mass, energy, …), whereas the second consists of models that scientists have extracted from collected data, either natural or synthetic. Data-driven (or data-intensive) simulation consists of directly linking experimental data to computers in order to perform numerical simulations. These simulations employ the universally accepted laws while minimizing the need for explicit, often phenomenological, models. The main drawback of such an approach is the large amount of required data, some of it inaccessible to present-day testing facilities. This difficulty can be circumvented in many cases, and in any case alleviated, by considering complex tests, collecting as many data as possible, and then using a data-driven inverse approach to generate the whole constitutive manifold from a few complex experimental tests, as discussed in the present work.
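The flavour of the data-driven paradigm, keeping the balance laws while replacing the constitutive model by the data set itself, can be conveyed with a one-element 1D sketch in the spirit of distance-minimizing data-driven solvers (e.g. Kirchdoerfer and Ortiz), which the manifold construction discussed here generalizes. The material law, noise level and all numbers below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic "experimental" point cloud for a nonlinear 1D material law
# sigma = E*eps + c*eps^3 (hypothetical; the solver never sees the formula,
# only the sampled points).
E, c = 100.0, 400.0
eps_data = rng.uniform(0.0, 0.2, 500)
sig_data = E * eps_data + c * eps_data ** 3 + rng.normal(0.0, 1.0, 500)

def solve_bar(force, area, eps_data, sig_data):
    """Single-element data-driven solve: equilibrium dictates the admissible
    stress sigma* = F/A, and the solver returns the measured (strain, stress)
    pair closest to the equilibrium set. With one element and a prescribed
    force, the phase-space distance reduces to distance in stress."""
    sigma_star = force / area
    i = np.argmin((sig_data - sigma_star) ** 2)
    return eps_data[i], sig_data[i]

eps_sol, sig_sol = solve_bar(force=15.0, area=1.0,
                             eps_data=eps_data, sig_data=sig_data)
```

No constitutive equation is ever fitted: the response is read directly off the data cloud, which is exactly why the approach is data-hungry and why inverse construction of the constitutive manifold from a few rich tests matters.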
Rhodes, Scott D.; McCoy, Thomas P.
2014-01-01
This study explored correlates of condom use within a respondent-driven sample of 190 Spanish-speaking immigrant Latino sexual minorities in North Carolina, including gay and bisexual men, other men who have sex with men (MSM), and transgender persons. Five analytic approaches for modeling data collected using respondent-driven sampling (RDS) were compared. Across most approaches, knowledge of HIV and sexually transmitted infections (STIs) and increased condom use self-efficacy predicted consistent condom use, while increased homophobia predicted decreased consistent condom use. The same correlates were not significant in all analyses but were consistent in most. Clustering due to recruitment chains was low, while clustering due to recruiter was substantial. This highlights the importance of accounting for clustering when analyzing RDS data. PMID:25646728
Continuous Energy Improvement in Motor Driven Systems - A Guidebook for Industry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gilbert A. McCoy and John G. Douglass
2014-02-01
This guidebook provides a step-by-step approach to developing a motor system energy-improvement action plan. An action plan includes which motors should be repaired or replaced with higher efficiency models, recommendations on maintaining a spares inventory, and discussion of improvements in maintenance practices. The guidebook is the successor to DOE’s 1997 Energy Management for Motor Driven Systems. It builds on its predecessor publication by including topics such as power transmission systems and matching driven equipment to process requirements in addition to motors.
Knowledge-driven genomic interactions: an application in ovarian cancer.
Kim, Dokyoon; Li, Ruowang; Dudek, Scott M; Frase, Alex T; Pendergrass, Sarah A; Ritchie, Marylyn D
2014-01-01
Effective prediction of clinical cancer outcomes, pursued to understand the mechanisms of various types of cancer, has relied on molecular data such as gene expression profiles, an approach that holds promise for better diagnostics and for supporting further therapies. However, clinical outcome prediction based on gene expression profiles varies between independent data sets. Further, single-gene expression outcome prediction is limited for cancer evaluation, since genes do not act in isolation but rather interact with other genes in complex signaling or regulatory networks. In addition, since pathways are likely to co-operate, it is desirable to incorporate expert knowledge to combine pathways in a useful and informative manner. Thus, we propose a novel approach for identifying knowledge-driven genomic interactions and apply it to discover models associated with cancer clinical phenotypes using grammatical evolution neural networks (GENN). To demonstrate the utility of the proposed approach, an ovarian cancer data set from The Cancer Genome Atlas (TCGA) was used for predicting clinical stage as a pilot project. We identified knowledge-driven genomic interactions associated with cancer stage not only from single knowledge bases, such as sources of pathway-pathway interaction, but also across different sets of knowledge bases, such as pathway-protein family interactions, by integrating different types of information. Notably, an integration model drawing on different sources of biological knowledge achieved 78.82% balanced accuracy and outperformed the top models using gene expression or single knowledge-based data types alone. Furthermore, the results from these models are more interpretable because they are framed in the context of specific biological pathways or other expert knowledge.
The success of the pilot study we have presented herein will allow us to pursue further identification of models predictive of clinical cancer survival and recurrence. Understanding the underlying tumorigenesis and progression in ovarian cancer through the global view of interactions within/between different biological knowledge sources has the potential for providing more effective screening strategies and therapeutic targets for many types of cancer.
NASA Astrophysics Data System (ADS)
Bellugi, D. G.; Tennant, C.; Larsen, L.
2016-12-01
Catchment and climate heterogeneity complicate prediction of runoff across time and space, and the resulting parameter uncertainty can lead to large accumulated errors in hydrologic models, particularly in ungauged basins. Recently, data-driven modeling approaches have been shown to avoid the accumulated uncertainty associated with many physically based models, providing an appealing alternative for hydrologic prediction. However, the effectiveness of different methods in hydrologically and geomorphically distinct catchments, and the robustness of these methods to changing climate and changing hydrologic processes, remain to be tested. Here, we evaluate the use of machine learning techniques to predict daily runoff across time and space using only essential climatic forcing (e.g. precipitation, temperature, and potential evapotranspiration) time series as model input. Model training and testing were done using a high-quality dataset of daily runoff and climate forcing data spanning 25+ years for 600+ minimally disturbed catchments (drainage area range 5-25,000 km2, median size 336 km2) that cover a wide range of climatic and physical characteristics. Preliminary results using Support Vector Regression (SVR) suggest that in some catchments this nonlinear regression technique can accurately predict daily runoff, while the same approach fails in other catchments, indicating that the representation of climate inputs and/or catchment filter characteristics in the model structure needs further refinement to increase performance. We bolster this analysis by using Sparse Identification of Nonlinear Dynamics (a sparse symbolic regression technique) to uncover the governing equations that describe runoff processes in catchments where SVR performed well and in ones where it performed poorly, thereby enabling inference about governing processes.
This provides a robust means of examining how catchment complexity influences runoff prediction skill, and represents a contribution towards the integration of data-driven inference and physically-based models.
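A dependency-free sketch of the kind of nonlinear regression involved is given below. For brevity it uses RBF kernel ridge regression, a squared-loss relative of the Support Vector Regression used in the study (in practice one would reach for, e.g., scikit-learn's SVR); the synthetic forcing-runoff relationship is entirely hypothetical.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian (RBF) kernel matrix between the row vectors of A and B."""
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

class KernelRidge:
    """RBF kernel ridge regression: fit alpha = (K + lam*I)^-1 y,
    predict with k(x_new, X_train) @ alpha."""
    def __init__(self, gamma=1.0, lam=1e-4):
        self.gamma, self.lam = gamma, lam
    def fit(self, X, y):
        self.X = X
        K = rbf_kernel(X, X, self.gamma)
        self.alpha = np.linalg.solve(K + self.lam * np.eye(len(X)), y)
        return self
    def predict(self, X_new):
        return rbf_kernel(X_new, self.X, self.gamma) @ self.alpha

# Hypothetical toy: daily runoff as a nonlinear function of two scaled
# climate forcings [precipitation, temperature] in [0, 1]
rng = np.random.default_rng(1)
X = rng.uniform(0, 1, size=(200, 2))
y = X[:, 0] ** 2 * (1.2 - X[:, 1])          # synthetic "runoff" response
model = KernelRidge(gamma=10.0, lam=1e-4).fit(X[:150], y[:150])
pred = model.predict(X[150:])
```

The held-out predictions track the smooth synthetic response closely; as the abstract notes, real catchments differ in how well any single kernel representation of the climate-to-runoff filter transfers between them.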
Mackey-Glass equation driven by fractional Brownian motion
NASA Astrophysics Data System (ADS)
Nguyen, Dung Tien
2012-11-01
In this paper we introduce a fractional stochastic version of the Mackey-Glass model, which is a potential candidate for modelling objects in biology and finance. By a semimartingale approximation approach we find a semi-analytical expression for the solution.
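A numerical sketch of such a stochastically driven Mackey-Glass delay equation is given below, using a simple Euler-Maruyama scheme. For brevity the driving noise is ordinary Brownian motion; the fractional case would replace the i.i.d. increments with correlated fBm increments (generated, e.g., by the Davies-Harte method). All parameter values are illustrative only.

```python
import numpy as np

def mackey_glass_sde(T=50.0, dt=0.05, tau=2.0, a=2.0, b=1.0, n=10,
                     sigma=0.1, x0=0.5, seed=0):
    """Euler-Maruyama scheme for a stochastic Mackey-Glass equation

        dx = [ a*x(t-tau) / (1 + x(t-tau)^n) - b*x(t) ] dt + sigma dB_t

    with a constant history x = x0 on [-tau, 0]. Returns the sample path
    on [0, T]."""
    rng = np.random.default_rng(seed)
    steps = int(round(T / dt))
    hist = int(round(tau / dt))              # number of delay steps
    x = np.empty(steps + hist + 1)
    x[: hist + 1] = x0                       # constant history segment
    for t in range(hist, hist + steps):
        lagged = x[t - hist]
        drift = a * lagged / (1.0 + lagged ** n) - b * x[t]
        x[t + 1] = x[t] + drift * dt + sigma * np.sqrt(dt) * rng.normal()
    return x[hist:]

traj = mackey_glass_sde()
```

With these parameters the deterministic part has an equilibrium at x = 1 (since a/(1+1) = b), so the noisy path fluctuates around a bounded attractor rather than blowing up.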
Cotter, Christopher R.; Schüttler, Heinz-Bernd; Igoshin, Oleg A.; Shimkets, Lawrence J.
2017-01-01
Collective cell movement is critical to the emergent properties of many multicellular systems, including microbial self-organization in biofilms, embryogenesis, wound healing, and cancer metastasis. However, even the best-studied systems lack a complete picture of how diverse physical and chemical cues act upon individual cells to ensure coordinated multicellular behavior. Known for its social developmental cycle, the bacterium Myxococcus xanthus uses coordinated movement to generate three-dimensional aggregates called fruiting bodies. Despite extensive progress in identifying genes controlling fruiting body development, cell behaviors and cell–cell communication mechanisms that mediate aggregation are largely unknown. We developed an approach to examine emergent behaviors that couples fluorescent cell tracking with data-driven models. A unique feature of this approach is the ability to identify cell behaviors affecting the observed aggregation dynamics without full knowledge of the underlying biological mechanisms. The fluorescent cell tracking revealed large deviations in the behavior of individual cells. Our modeling method indicated that decreased cell motility inside the aggregates, a biased walk toward aggregate centroids, and alignment among neighboring cells in a radial direction to the nearest aggregate are behaviors that enhance aggregation dynamics. Our modeling method also revealed that aggregation is generally robust to perturbations in these behaviors and identified possible compensatory mechanisms. The resulting approach of directly combining behavior quantification with data-driven simulations can be applied to more complex systems of collective cell movement without prior knowledge of the cellular machinery and behavioral cues. PMID:28533367
Network-driven design principles for neuromorphic systems.
Partzsch, Johannes; Schüffny, Rene
2015-01-01
Synaptic connectivity is typically the most resource-demanding part of neuromorphic systems. Commonly, the architecture of these systems is chosen mainly on the basis of technical considerations. As a consequence, the potential for optimization arising from the inherent constraints of connectivity models is left unused. In this article, we develop an alternative, network-driven approach to neuromorphic architecture design. We describe methods to analyse the performance of existing neuromorphic architectures in emulating certain connectivity models. Furthermore, we show step-by-step how to derive a neuromorphic architecture from a given connectivity model. For this, we introduce a generalized description for architectures with a synapse matrix, which takes into account shared use of circuit components for reducing total silicon area. Architectures designed with this approach are fitted to a connectivity model, essentially adapting to its connection density. They guarantee faithful reproduction of the model on chip while requiring less total silicon area. In total, our methods allow designers to implement more area-efficient neuromorphic systems and to verify the usability of the connectivity resources in these systems.
NASA Astrophysics Data System (ADS)
Badrzadeh, Honey; Sarukkalige, Ranjan; Jayawardena, A. W.
2015-10-01
Reliable river flow forecasts play a key role in flood risk mitigation. Among different approaches to river flow forecasting, data-driven approaches have become increasingly popular in recent years due to their minimal information requirements and ability to simulate nonlinear and non-stationary characteristics of hydrological processes. In this study, attempts are made to apply four different types of data-driven approaches, namely traditional artificial neural networks (ANN), adaptive neuro-fuzzy inference systems (ANFIS), wavelet neural networks (WNN), and hybrid ANFIS with multiresolution analysis using wavelets (WNF). The developed models were applied to real-time flood forecasting at Casino station on the Richmond River, Australia, which is highly prone to flooding. Hourly rainfall and runoff data were used to drive the models, which were used for forecasting with 1, 6, 12, 24, 36 and 48 h lead-time. The performance of the models was further improved by adding upstream river flow data (Wiangaree station) as another effective input. All models perform satisfactorily up to 12 h lead-time. However, the hybrid wavelet-based models significantly outperform the ANFIS and ANN models in longer lead-time forecasting. The results confirm the robustness of the proposed structure of the hybrid models for real-time runoff forecasting in the study area.
ERIC Educational Resources Information Center
Kelly, Tricia
2010-01-01
Library and information management (LIM) organisations are on an almost continual path of change driven by changes in technology, service models, staffing structures, and financial allocations. The way in which LIM organisations approach change varies, as does the success rate of change management procedures undertaken. One particular approach to…
ERIC Educational Resources Information Center
Reyes-Palomares, Armando; Sanchez-Jimenez, Francisca; Medina, Miguel Angel
2009-01-01
A comprehensive understanding of biological functions requires new systemic perspectives, such as those provided by systems biology. Systems biology approaches are hypothesis-driven and involve iterative rounds of model building, prediction, experimentation, model refinement, and development. Developments in computer science are allowing for ever…
A Systems Approach to the Estimation of Ecosystem and Human Health Stressors in Air, Land and Water
A model linkage paradigm, based on the nitrogen cascade, is introduced. This general paradigm is then adapted to specific multi-media nitrogen issues and specific models to be linked. An example linked modeling system addressing potential nitrogen responses to biofuel-driven co...
Coordination control of flexible manufacturing systems
NASA Astrophysics Data System (ADS)
Menon, Satheesh R.
This work presents one of the first attempts to develop a model-driven system for the coordination control of Flexible Manufacturing Systems (FMS). The structure and activities of the FMS are modeled using a colored Petri net based system. This approach has the advantage of being able to model the concurrency inherent in the system. It provides a method for encoding the system state, state transitions and the feasible transitions at any given state. Further structural analysis (for detecting conflicting actions, deadlocks which might occur during operation, etc.) can be performed. The problem of implementing and testing the behavior of existing dynamic scheduling approaches in simulations of realistic situations is also addressed. A simulation architecture was proposed and performance evaluation was carried out to establish the correctness of the model, the stability of the system from structural (deadlocks) and temporal (boundedness of backlogs) points of view, and to collect statistics for performance measures such as machine and robot utilizations, average wait times and idle times of resources. A real-time implementation architecture for the coordination controller was also developed and implemented in a software-simulated environment. Given the current technology of FMS control, the model-driven colored Petri net based approach promises a very flexible control environment.
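The Petri net encoding of system state and feasible transitions can be sketched in a few lines. The FMS fragment below (place names, token values, the "load" transition) is a hypothetical illustration, not the paper's model, and colored tokens are represented simply as Python values:

```python
from collections import Counter

class PetriNet:
    """Minimal place/transition net; colored tokens are plain Python values."""
    def __init__(self, marking):
        self.marking = {p: Counter(toks) for p, toks in marking.items()}
        self.transitions = {}

    def add_transition(self, name, inputs, outputs):
        # inputs/outputs: {place: [token, ...]} consumed / produced on firing
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking[p][t] >= c
                   for p, toks in inputs.items()
                   for t, c in Counter(toks).items())

    def fire(self, name):
        if not self.enabled(name):
            raise RuntimeError(f"{name} not enabled")
        inputs, outputs = self.transitions[name]
        for p, toks in inputs.items():
            self.marking[p].subtract(toks)
        for p, toks in outputs.items():
            self.marking[p].update(toks)

# FMS fragment: a part waits, a machine is free, "load" moves the part onto the machine
net = PetriNet({"queue": ["partA"], "machine_free": ["M1"], "busy": []})
net.add_transition("load",
                   {"queue": ["partA"], "machine_free": ["M1"]},
                   {"busy": [("partA", "M1")]})
net.fire("load")
print(dict(net.marking["busy"]))
```

Deadlock detection in this representation amounts to searching for reachable markings in which no transition is enabled, which is exactly the structural analysis the abstract mentions.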
Nova-driven winds in globular clusters
NASA Technical Reports Server (NTRS)
Scott, E. H.; Durisen, R. H.
1978-01-01
Recent sensitive searches for H-alpha emission from ionized intracluster gas in globular clusters have set upper limits that conflict with theoretical predictions. It is suggested that nova outbursts heat the gas, producing winds that resolve this discrepancy. The incidence of novae in globular clusters, the conversion of kinetic energy of the nova shell to thermal energy of the intracluster gas, and the characteristics of the resultant winds are discussed. Calculated emission from the nova-driven models does not conflict with any observations to date. Some suggestions are made concerning the most promising approaches for future detection of intracluster gas on the basis of these models. The possible relationship of nova-driven winds to globular cluster X-ray sources is also considered.
Aspects of the BPRIM Language for Risk Driven Process Engineering
NASA Astrophysics Data System (ADS)
Sienou, Amadou; Lamine, Elyes; Pingaud, Hervé; Karduck, Achim
Nowadays organizations are exposed to frequent changes in the business environment, requiring continuous alignment of business processes with business strategies. This agility requires methods promoted in enterprise engineering approaches. Risk consideration in enterprise engineering is gaining importance since the business environment is becoming more and more competitive and unpredictable. Business processes are subject to the same quality requirements as material and human resources. Thus, process management is supposed to tackle not only value-creation challenges but also those related to value preservation. Our research considers risk-driven business process design as an integral part of enterprise engineering. A graphical modelling language for risk-driven business process engineering was introduced in earlier research. This paper extends the language and handles questions related to modelling risk in an organisational context.
Bridging paradigms: hybrid mechanistic-discriminative predictive models.
Doyle, Orla M; Tsaneva-Atansaova, Krasimira; Harte, James; Tiffin, Paul A; Tino, Peter; Díaz-Zuccarini, Vanessa
2013-03-01
Many disease processes are extremely complex and characterized by multiple stochastic processes interacting simultaneously. Current analytical approaches have included mechanistic models and machine learning (ML), which are often treated as orthogonal viewpoints. However, to facilitate truly personalized medicine, new perspectives may be required. This paper reviews the use of both mechanistic models and ML in healthcare as well as emerging hybrid methods, which are an exciting and promising approach for biologically based, yet data-driven advanced intelligent systems.
The jABC Approach to Rigorous Collaborative Development of SCM Applications
NASA Astrophysics Data System (ADS)
Hörmann, Martina; Margaria, Tiziana; Mender, Thomas; Nagel, Ralf; Steffen, Bernhard; Trinh, Hong
Our approach to the model-driven collaborative design of IKEA's P3 Delivery Management Process uses the jABC [9] for model driven mediation and choreography to complement a RUP-based (Rational Unified Process) development process. jABC is a framework for service development based on Lightweight Process Coordination. Users (product developers and system/software designers) easily develop services and applications by composing reusable building-blocks into (flow-) graph structures that can be animated, analyzed, simulated, verified, executed, and compiled. This way of handling the collaborative design of complex embedded systems has proven to be effective and adequate for the cooperation of non-programmers and non-technical people, which is the focus of this contribution, and it is now being rolled out in the operative practice.
A hybrid PCA-CART-MARS-based prognostic approach of the remaining useful life for aircraft engines.
Sánchez Lasheras, Fernando; García Nieto, Paulino José; de Cos Juez, Francisco Javier; Mayo Bayón, Ricardo; González Suárez, Victor Manuel
2015-03-23
Prognostics is an engineering discipline that predicts the future health of a system. In this research work, a data-driven approach for prognostics is proposed. Indeed, the present paper describes a data-driven hybrid model for the successful prediction of the remaining useful life of aircraft engines. The approach combines the multivariate adaptive regression splines (MARS) technique with the principal component analysis (PCA), dendrograms and classification and regression trees (CARTs). Elements extracted from sensor signals are used to train this hybrid model, representing different levels of health for aircraft engines. In this way, this hybrid algorithm is used to predict the trends of these elements. Based on this fitting, one can determine the future health state of a system and estimate its remaining useful life (RUL) with accuracy. To evaluate the proposed approach, a test was carried out using aircraft engine signals collected from physical sensors (temperature, pressure, speed, fuel flow, etc.). Simulation results show that the PCA-CART-MARS-based approach can forecast faults long before they occur and can predict the RUL. The proposed hybrid model presents as its main advantage the fact that it does not require information about the previous operation states of the input variables of the engine. The performance of this model was compared with those obtained by other benchmark models (multivariate linear regression and artificial neural networks) also applied in recent years for the modeling of remaining useful life. Therefore, the PCA-CART-MARS-based approach is very promising in the field of prognostics of the RUL for aircraft engines.
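A condensed sketch of the hybrid idea above using scikit-learn, with PCA feeding a CART regressor on synthetic sensor snapshots. The MARS stage is omitted here because scikit-learn ships no MARS implementation, and the degradation model is an invented stand-in for real engine data:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.tree import DecisionTreeRegressor
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)
# hypothetical engine snapshots: 12 sensor channels driven by a latent degradation level
n, p = 500, 12
degradation = rng.uniform(0, 1, n)
X = np.outer(degradation, rng.normal(1, 0.3, p)) + rng.normal(0, 0.05, (n, p))
# remaining useful life decreases as degradation grows (toy relation)
rul = 300 * (1 - degradation) + rng.normal(0, 5, n)

# PCA compresses the correlated channels; CART maps components to RUL
model = make_pipeline(PCA(n_components=3),
                      DecisionTreeRegressor(max_depth=6, random_state=0))
model.fit(X[:400], rul[:400])
pred = model.predict(X[400:])
mae = np.mean(np.abs(pred - rul[400:]))
print(f"MAE: {mae:.1f} cycles")
```

As in the paper, the pipeline needs only current sensor readings, not the engine's operating history; the tree stage could be swapped for a MARS model (e.g. the third-party py-earth package) to match the full PCA-CART-MARS chain.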
Kalantari, Zahra; Cavalli, Marco; Cantone, Carolina; Crema, Stefano; Destouni, Georgia
2017-03-01
Climate-driven increase in the frequency of extreme hydrological events is expected to impose greater strain on the built environment and major transport infrastructure, such as roads and railways. This study develops a data-driven spatial-statistical approach to quantifying and mapping the probability of flooding at critical road-stream intersection locations, where water flow and sediment transport may accumulate and cause serious road damage. The approach is based on novel integration of key watershed and road characteristics, including measures of sediment connectivity. The approach is concretely applied to and quantified for two case studies in southwest Sweden, with documented road flooding effects of recorded extreme rainfall. The novel contributions of this study, in combining a sediment connectivity account with that of soil type, land use, spatial precipitation-runoff variability and road drainage in catchments, and in extending the use of the connectivity measure to different types of catchments, improve the accuracy of model results for road flood probability. Copyright © 2016 Elsevier B.V. All rights reserved.
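The spatial-statistical flavor of such a model can be sketched as a logistic regression over per-intersection descriptors. Every feature name, coefficient, and the flood-generating rule below are invented for illustration; the study's actual predictors include sediment connectivity, soil type, land use, and road drainage:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n = 400
# hypothetical descriptors per road-stream intersection
upstream_area = rng.lognormal(1.0, 1.0, n)      # km^2
connectivity = rng.uniform(0, 1, n)             # sediment connectivity index
rain_intensity = rng.gamma(2.0, 10.0, n)        # mm/h design storm
culvert_capacity = rng.uniform(0.5, 5.0, n)     # m^3/s

# invented latent rule: flooding when the hazard score exceeds logistic noise
score = (0.03 * rain_intensity + 2.0 * connectivity
         + 0.2 * np.log(upstream_area) - 0.8 * culvert_capacity)
flooded = (score + rng.logistic(0, 1, n) > 0).astype(int)

X = np.column_stack([upstream_area, connectivity, rain_intensity, culvert_capacity])
clf = LogisticRegression(max_iter=1000).fit(X[:300], flooded[:300])
proba = clf.predict_proba(X[300:])[:, 1]        # mapped flood probability per site
acc = clf.score(X[300:], flooded[300:])
print(f"held-out accuracy: {acc:.2f}")
```

The fitted probabilities are what would be mapped over the road network; calibration against documented flood events, as done for the Swedish case studies, is the step that makes such a map trustworthy.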
Franco, Natália M; Medeiros, Gabriel F; Silva, Edson A; Murta, Angela S; Machado, Aydano P; Fidalgo, Robson N
2015-01-01
This work presents a Modeling Language and its technological infrastructure to customize the vocabulary of Communication Boards (CB), which are important tools to provide more humanization of health care. Using a technological infrastructure based on the Model-Driven Development (MDD) approach, our Modeling Language (ML) creates an abstraction layer between users (e.g., health professionals such as an audiologist or speech therapist) and application code. Moreover, the use of a metamodel enables a syntactic corrector for preventing the creation of wrong models. Our ML and metamodel enable more autonomy for health professionals in creating customized CB because they abstract complexities and permit professionals to deal only with the domain concepts (e.g., vocabulary and patient needs). Additionally, our infrastructure provides a configuration file that can be used to share and reuse models. This way, the vocabulary modeling effort will decrease over time as people share vocabulary models. Our study provides an infrastructure that aims to abstract the complexity of CB vocabulary customization, giving more autonomy to health professionals when they need to customize, share and reuse vocabularies for CB.
Theory-Based Stakeholder Evaluation
ERIC Educational Resources Information Center
Hansen, Morten Balle; Vedung, Evert
2010-01-01
This article introduces a new approach to program theory evaluation called theory-based stakeholder evaluation or the TSE model for short. Most theory-based approaches are program theory driven and some are stakeholder oriented as well. Practically, all of the latter fuse the program perceptions of the various stakeholder groups into one unitary…
NASA Technical Reports Server (NTRS)
Allan, Brian G.
2000-01-01
A reduced-order modeling approach of the Navier-Stokes equations is presented for the design of a distributed optimal feedback kernel. This approach is based on a Krylov subspace method where significant modes of the flow are captured in the model. This model is then used in an optimal feedback control design where sensing and actuation are performed on the entire flow field. This control design approach yields an optimal feedback kernel which provides insight into the placement of sensors and actuators in the flow field. As an evaluation of this approach, a two-dimensional shear layer and driven cavity flow are investigated.
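The optimal-feedback step can be illustrated on a toy reduced-order model: solve the continuous algebraic Riccati equation (LQR) for a small linear system with one unstable mode. The three-state system below is an invented stand-in, not the paper's Navier-Stokes reduction:

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# toy reduced-order model: a few "flow modes" with linear dynamics x' = Ax + Bu
A = np.array([[-0.1,  2.0, 0.0],
              [-2.0, -0.1, 0.0],
              [ 0.0,  0.0, 0.5]])   # third mode is unstable
B = np.array([[0.0], [0.0], [1.0]]) # actuation reaches only the unstable mode
Q = np.eye(3)                       # penalize modal (flow) energy
R = np.array([[1.0]])               # penalize control effort

# Riccati solution P yields the optimal feedback kernel u = -Kx
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)
eigs = np.linalg.eigvals(A - B @ K)
print(K, eigs.real)
```

The magnitudes of the entries of K over the state (mode) space are what give the placement insight mentioned above: states with negligible feedback weight need little sensing or actuation.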
Real-time Social Internet Data to Guide Forecasting Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Del Valle, Sara Y.
Our goal is to improve decision support by monitoring and forecasting events using social media, mathematical models, and quantifying model uncertainty. Our approach is real-time, data-driven forecasts with quantified uncertainty: Not just for weather anymore. Information flow from human observations of events through an Internet system and classification algorithms is used to produce quantitatively uncertain forecasts. In summary, we want to develop new tools to extract useful information from Internet data streams, develop new approaches to assimilate real-time information into predictive models, validate approaches by forecasting events, and our ultimate goal is to develop an event forecasting system using mathematical approaches and heterogeneous data streams.
A Data-Driven Approach to Interactive Visualization of Power Grids
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhu, Jun
Driven by emerging industry standards, electric utilities and grid coordination organizations are eager to seek advanced tools to assist grid operators to perform mission-critical tasks and enable them to make quick and accurate decisions. The emerging field of visual analytics holds tremendous promise for improving the business practices in today's electric power industry. The conducted investigation, however, has revealed that the existing commercial power grid visualization tools heavily rely on human designers, hindering user's ability to discover. Additionally, for a large grid, it is very labor-intensive and costly to build and maintain the pre-designed visual displays. This project proposes a data-driven approach to overcome the common challenges. The proposed approach relies on developing powerful data manipulation algorithms to create visualizations based on the characteristics of empirically or mathematically derived data. The resulting visual presentations emphasize what the data is rather than how the data should be presented, thus fostering comprehension and discovery. Furthermore, the data-driven approach formulates visualizations on-the-fly. It does not require a visualization design stage, completely eliminating or significantly reducing the cost for building and maintaining visual displays. The research and development (R&D) conducted in this project is mainly divided into two phases. The first phase (Phase I & II) focuses on developing data driven techniques for visualization of power grid and its operation. Various data-driven visualization techniques were investigated, including pattern recognition for auto-generation of one-line diagrams, fuzzy model based rich data visualization for situational awareness, etc. The R&D conducted during the second phase (Phase IIB) focuses on enhancing the prototyped data driven visualization tool based on the gathered requirements and use cases.
The goal is to evolve the prototyped tool developed during the first phase into a commercial grade product. We will use one of the identified application areas as an example to demonstrate how research results achieved in this project are successfully utilized to address an emerging industry need. In summary, the data-driven visualization approach developed in this project has proven to be promising for building the next-generation power grid visualization tools. Application of this approach has resulted in a state-of-the-art commercial tool currently being leveraged by more than 60 utility organizations in North America and Europe.
Magnetically-driven medical robots: An analytical magnetic model for endoscopic capsules design
NASA Astrophysics Data System (ADS)
Li, Jing; Barjuei, Erfan Shojaei; Ciuti, Gastone; Hao, Yang; Zhang, Peisen; Menciassi, Arianna; Huang, Qiang; Dario, Paolo
2018-04-01
Magnetic-based approaches are highly promising for providing innovative solutions in the design of medical devices for diagnostic and therapeutic procedures, such as in endoluminal districts. Due to their intrinsic magnetic properties (no current needed) and high strength-to-size ratio compared with electromagnetic solutions, permanent magnets are usually embedded in medical devices. In this paper, a set of analytical formulas has been derived to model the magnetic forces and torques exerted by an arbitrary external magnetic field on a permanent magnetic source embedded in a medical robot. In particular, the authors modelled cylindrical permanent magnets, a general solution often embedded in magnetically-driven medical devices. The analytical model can be applied to axially and diametrically magnetized, solid and annular cylindrical permanent magnets without severe calculation complexity. Using a cylindrical permanent magnet as the selected solution, the model has been applied to a robotic endoscopic capsule as a pilot study in the design of magnetically-driven robots.
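A sketch of the dipole-level version of such formulas: the force F = ∇(m·B) and torque τ = m×B on a capsule magnet in the field of an external driving magnet, with both magnets approximated as point dipoles. The magnetic moments and separation below are illustrative; the paper's formulas handle finite cylindrical magnets without this far-field approximation:

```python
import numpy as np

MU0 = 4e-7 * np.pi  # vacuum permeability, T*m/A

def dipole_field(m_src, r):
    """Field of a point dipole m_src evaluated at displacement r from it."""
    rn = np.linalg.norm(r)
    rhat = r / rn
    return MU0 / (4 * np.pi * rn**3) * (3 * np.dot(m_src, rhat) * rhat - m_src)

def force_torque(m_cap, m_src, r, h=1e-6):
    """Force F = grad(m.B) (central differences) and torque T = m x B on the capsule."""
    B = dipole_field(m_src, r)
    torque = np.cross(m_cap, B)
    grad = np.zeros(3)
    for i in range(3):
        dr = np.zeros(3)
        dr[i] = h
        grad[i] = np.dot(m_cap, dipole_field(m_src, r + dr)
                                - dipole_field(m_src, r - dr)) / (2 * h)
    return grad, torque

# capsule 15 cm from the external driving magnet, both magnetized along z
m_src = np.array([0.0, 0.0, 50.0])   # A*m^2, external permanent magnet (assumed)
m_cap = np.array([0.0, 0.0, 0.05])   # A*m^2, capsule magnet (assumed)
F, T = force_torque(m_cap, m_src, np.array([0.0, 0.0, 0.15]))
print(F, T)
```

With the moments aligned on-axis the torque vanishes and the axial force is attractive, which is the configuration typically used to drag a capsule along the lumen.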
ODISEES: Ontology-Driven Interactive Search Environment for Earth Sciences
NASA Technical Reports Server (NTRS)
Rutherford, Matthew T.; Huffer, Elisabeth B.; Kusterer, John M.; Quam, Brandi M.
2015-01-01
This paper discusses the Ontology-driven Interactive Search Environment for Earth Sciences (ODISEES) project currently being developed to aid researchers attempting to find usable data among an overabundance of closely related data. ODISEES' ontological structure relies on a modular, adaptable concept modeling approach, which allows the domain to be modeled more or less as it is without worrying about terminology or external requirements. In the model, variables are individually assigned semantic content based on the characteristics of the measurements they represent, allowing intuitive discovery and comparison of data without requiring the user to sift through large numbers of data sets and variables to find the desired information.
Dynamic Data Driven Methods for Self-aware Aerospace Vehicles
2015-04-08
structural response model that incorporates multiple degradation or failure modes including damaged panel strength (BVID, thru-hole), damaged panel...stiffness (BVID, thru-hole), loose fastener, fretted fastener hole, and disbonded surface. • A new data-driven approach for the online updating of the flight...between the first and second plies. The panels were reinforced around the borders of the panel with through holes to simulate mounting the wing skins to
Alpha-fetoprotein-targeted reporter gene expression imaging in hepatocellular carcinoma.
Kim, Kwang Il; Chung, Hye Kyung; Park, Ju Hui; Lee, Yong Jin; Kang, Joo Hyun
2016-07-21
Hepatocellular carcinoma (HCC) is one of the most common cancers in Eastern Asia, and its incidence is increasing globally. Numerous experimental models have been developed to better our understanding of the pathogenic mechanism of HCC and to evaluate novel therapeutic approaches. Molecular imaging is a convenient and up-to-date biomedical tool that enables the visualization, characterization and quantification of biologic processes in a living subject. Molecular imaging based on reporter gene expression, in particular, can elucidate tumor-specific events or processes by acquiring images of a reporter gene's expression driven by tumor-specific enhancers/promoters. In this review, we discuss the advantages and disadvantages of various experimental HCC mouse models and we present in vivo images of tumor-specific reporter gene expression driven by an alpha-fetoprotein (AFP) enhancer/promoter system in a mouse model of HCC. The current mouse models of HCC development are established by xenograft, carcinogen induction and genetic engineering, representing the spectrum of tumor-inducing factors and tumor locations. The imaging analysis approach of reporter genes driven by AFP enhancer/promoter is presented for these different HCC mouse models. Such molecular imaging can provide longitudinal information about carcinogenesis and tumor progression. We expect that clinical application of AFP-targeted reporter gene expression imaging systems will be useful for the detection of AFP-expressing HCC tumors and screening of increased/decreased AFP levels due to disease or drug treatment.
Motakis, E S; Nason, G P; Fryzlewicz, P; Rutter, G A
2006-10-15
Many standard statistical techniques are effective on data that are normally distributed with constant variance. Microarray data typically violate these assumptions since they come from non-Gaussian distributions with a non-trivial mean-variance relationship. Several methods have been proposed that transform microarray data to stabilize variance and draw its distribution towards the Gaussian. Some methods, such as log or generalized log, rely on an underlying model for the data. Others, such as the spread-versus-level plot, do not. We propose an alternative data-driven multiscale approach, called the Data-Driven Haar-Fisz for microarrays (DDHFm) with replicates. DDHFm has the advantage of being 'distribution-free' in the sense that no parametric model for the underlying microarray data is required to be specified or estimated; hence, DDHFm can be applied very generally, not just to microarray data. DDHFm achieves very good variance stabilization of microarray data with replicates and produces transformed intensities that are approximately normally distributed. Simulation studies show that it performs better than other existing methods. Application of DDHFm to real one-color cDNA data validates these results. The R package of the Data-Driven Haar-Fisz transform (DDHFm) for microarrays is available in Bioconductor and CRAN.
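The idea can be sketched with the classical (non-data-driven) Haar-Fisz transform for Poisson-like counts, where the mean-variance relation var = mean is assumed; DDHFm instead estimates that relation from replicates, which is what makes it distribution-free:

```python
import numpy as np

def haar_fisz(x):
    """Haar-Fisz variance stabilization for Poisson-like counts (length a power of 2).
    Assumes var(x) = mean(x); the data-driven DDHFm variant estimates the
    mean-variance function from replicates instead."""
    x = np.asarray(x, dtype=float)
    levels = int(np.log2(x.size))
    s = x.copy()
    details = []
    for _ in range(levels):
        d = (s[0::2] - s[1::2]) / 2.0          # Haar detail coefficients
        s = (s[0::2] + s[1::2]) / 2.0          # Haar smooth coefficients
        # Fisz step: divide each detail by the local sd implied by var = mean
        sd = np.sqrt(np.maximum(s, 1e-12))
        details.append(np.where(s > 0, d / sd, 0.0))
    # invert the Haar transform using the stabilized details
    for d in reversed(details):
        up = np.empty(2 * s.size)
        up[0::2] = s + d
        up[1::2] = s - d
        s = up
    return s

rng = np.random.default_rng(3)
lam = np.repeat([2.0, 20.0], 256)              # step change in intensity
y = rng.poisson(lam).astype(float)
z = haar_fisz(y)
# after stabilization the two halves should have comparable variance
print(z[:256].var(), z[256:].var())
```

On the raw counts the high-intensity half has roughly ten times the variance of the low-intensity half; the transform pulls the two close together, which is the property that lets downstream Gaussian-based methods be applied.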
Wessells, Michael G
2015-05-01
Efforts to strengthen national child protection systems have frequently taken a top-down approach of imposing formal, government-managed services. Such expert-driven approaches are often characterized by low use of formal services and the misalignment of the nonformal and formal aspects of the child protection system. This article examines an alternative approach of community-driven, bottom-up work that enables nonformal-formal collaboration and alignment, greater use of formal services, internally driven social change, and high levels of community ownership. The dominant approach of reliance on expert-driven Child Welfare Committees produces low levels of community ownership. Using an approach developed and tested in rural Sierra Leone, community-driven action, including collaboration and linkages with the formal system, promoted the use of formal services and achieved increased ownership, effectiveness, and sustainability of the system. The field needs less reliance on expert-driven approaches and much wider use of slower, community-driven, bottom-up approaches to child protection. Copyright © 2015 The Author. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Oses, Corey; Isayev, Olexandr; Toher, Cormac; Curtarolo, Stefano; Tropsha, Alexander
Historically, materials discovery has been driven by a laborious trial-and-error process. The growth of materials databases and emerging informatics approaches finally offer the opportunity to transform this practice into data- and knowledge-driven rational design, accelerating the discovery of novel materials exhibiting desired properties. By using data from the AFLOW repository of high-throughput ab initio calculations, we have generated Quantitative Materials Structure-Property Relationship (QMSPR) models to predict critical materials properties, including metal/insulator classification, band gap energy, and bulk modulus. The prediction accuracy obtained with these QMSPR models approaches that of the training data for virtually any stoichiometric inorganic crystalline material. We attribute the success and universality of these models to the construction of new materials descriptors, referred to as universal Property-Labeled Material Fragments (PLMF). This representation affords straightforward model interpretation in terms of simple heuristic design rules that could guide rational materials design. This proof-of-concept study demonstrates the power of materials informatics to dramatically accelerate the search for new materials.
Data-driven Inference and Investigation of Thermosphere Dynamics and Variations
NASA Astrophysics Data System (ADS)
Mehta, P. M.; Linares, R.
2017-12-01
This paper presents a methodology for data-driven inference and investigation of thermosphere dynamics and variations. The approach uses data-driven modal analysis to extract the most energetic modes of variation for neutral thermospheric species using proper orthogonal decomposition, where the time-independent modes or basis represent the dynamics and the time-dependent coefficients or amplitudes represent the model parameters. The data-driven modal analysis approach combined with sparse, discrete observations is used to infer amplitudes for the dynamic modes and to calibrate the energy content of the system. In this work, two different data types, namely the number density measurements from TIMED/GUVI and the mass density measurements from CHAMP/GRACE, are simultaneously ingested for an accurate and self-consistent specification of the thermosphere. The assimilation process is achieved with a non-linear least squares solver and allows estimation/tuning of the model parameters or amplitudes rather than the driver. In this work, we use the Naval Research Lab's MSIS model to derive the most energetic modes for six different species: He, O, N2, O2, H, and N. We examine the dominant drivers of variation for helium in MSIS and observe that seasonal latitudinal variation accounts for about 80% of the dynamic energy, with a strong preference of helium for the winter hemisphere. We also observe enhanced helium presence near the poles at GRACE altitudes during periods of low solar activity (Feb 2007), as previously deduced. We will also examine the storm-time response of helium derived from observations. The results are expected to be useful in tuning/calibration of the physics-based models.
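The core workflow described above, POD modes from snapshots, then amplitudes inferred from sparse observations, can be sketched with synthetic data. All names and the toy field below are illustrative, and a plain linear least-squares fit stands in for the paper's non-linear solver:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "training" snapshots: n_grid spatial points x n_time snapshots
n_grid, n_time, r = 200, 50, 3
grid = np.linspace(0, 2 * np.pi, n_grid)
true_modes = np.stack([np.sin(grid), np.sin(2 * grid), np.cos(3 * grid)], axis=1)
amps = rng.normal(size=(n_time, r))
X = true_modes @ amps.T                      # snapshot matrix

# Proper orthogonal decomposition: left singular vectors = energetic modes
U, S, Vt = np.linalg.svd(X, full_matrices=False)
modes = U[:, :r]                             # time-independent basis

# Sparse observations of a new state (e.g. along a satellite track)
a_true = np.array([1.0, -0.5, 0.25])
state = true_modes @ a_true
obs_idx = rng.choice(n_grid, size=20, replace=False)

# Infer amplitudes from the observed entries only; the inferred state is
# reconstructed over the full grid from the sparse data
a_hat, *_ = np.linalg.lstsq(modes[obs_idx], state[obs_idx], rcond=None)
recon = modes @ a_hat
```

Because the new state lies in the span of the training modes, 20 scattered observations suffice to recover the full field, which is the sense in which modal amplitudes "calibrate" the system from sparse data.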
A biologically inspired approach to modeling unmanned vehicle teams
NASA Astrophysics Data System (ADS)
Cortesi, Roger S.; Galloway, Kevin S.; Justh, Eric W.
2008-04-01
Cooperative motion control of teams of agile unmanned vehicles presents modeling challenges at several levels. The "microscopic equations" describing individual vehicle dynamics and their interaction with the environment may be known fairly precisely, but are generally too complicated to yield qualitative insights at the level of multi-vehicle trajectory coordination. Interacting particle models are suitable for coordinating trajectories, but require care to ensure that individual vehicles are not driven in a "costly" manner. From the point of view of the cooperative motion controller, the individual vehicle autopilots serve to "shape" the microscopic equations, and we have been exploring the interplay between autopilots and cooperative motion controllers using a multivehicle hardware-in-the-loop simulator. Specifically, we seek refinements to interacting particle models in order to better describe observed behavior, without sacrificing qualitative understanding. A recent analogous example from biology involves introducing a fixed delay into a curvature-control-based feedback law for prey capture by an echolocating bat. This delay captures both neural processing time and the flight-dynamic response of the bat as it uses sensor-driven feedback. We propose a comparable approach for unmanned vehicle modeling; however, in contrast to the bat, with unmanned vehicles we have an additional freedom to modify the autopilot. Simulation results demonstrate the effectiveness of this biologically guided modeling approach.
Reducing usage of the computational resources by event driven approach to model predictive control
NASA Astrophysics Data System (ADS)
Misik, Stefan; Bradac, Zdenek; Cela, Arben
2017-08-01
This paper deals with the real-time and optimal control of dynamic systems while also considering the constraints to which these systems might be subject. The main objective of this work is to propose a simple modification of the existing Model Predictive Control approach to better suit the needs of computational resource-constrained real-time systems. An example using a model of a mechanical system is presented and the performance of the proposed method is evaluated in a simulated environment.
Reverse engineering systems models of regulation: discovery, prediction and mechanisms.
Ashworth, Justin; Wurtmann, Elisabeth J; Baliga, Nitin S
2012-08-01
Biological systems can now be understood in comprehensive and quantitative detail using systems biology approaches. Putative genome-scale models can be built rapidly based upon biological inventories and strategic system-wide molecular measurements. Current models combine statistical associations, causative abstractions, and known molecular mechanisms to explain and predict quantitative and complex phenotypes. This top-down 'reverse engineering' approach generates useful organism-scale models despite noise and incompleteness in data and knowledge. Here we review and discuss the reverse engineering of biological systems using top-down data-driven approaches, in order to improve discovery, hypothesis generation, and the inference of biological properties. Copyright © 2011 Elsevier Ltd. All rights reserved.
Incremental checking of Master Data Management model based on contextual graphs
NASA Astrophysics Data System (ADS)
Lamolle, Myriam; Menet, Ludovic; Le Duc, Chan
2015-10-01
The validation of models is a crucial step in distributed heterogeneous systems. In this paper, an incremental validation method is proposed in the scope of a Model Driven Engineering (MDE) approach, which is used to develop the Master Data Management (MDM) field represented by XML Schema models. The MDE approach presented in this paper is based on the definition of an abstraction layer using UML class diagrams. The validation method aims to minimise the model errors and to optimise the process of model checking. Therefore, the notion of validation contexts is introduced, allowing the verification of data model views. Description logics specify constraints that the models have to check. An experimentation of the approach is presented through an application developed in the ArgoUML IDE.
Process-Driven Culture Learning in American KFL Classroom Settings
ERIC Educational Resources Information Center
Byon, Andrew Sangpil
2007-01-01
Teaching second language (L2) culture can be either content- or process-driven. The content-driven approach refers to explicit instruction of L2 cultural information. On the other hand, the process-driven approach focuses on students' active participation in cultural learning processes. In this approach, teachers are not only information…
Model-centric approaches for the development of health information systems.
Tuomainen, Mika; Mykkänen, Juha; Luostarinen, Heli; Pöyhölä, Assi; Paakkanen, Esa
2007-01-01
Modeling is used increasingly in healthcare to increase shared knowledge, to improve the processes, and to document the requirements of the solutions related to health information systems (HIS). There are numerous modeling approaches which aim to support these aims, but a careful assessment of their strengths, weaknesses and deficiencies is needed. In this paper, we compare three model-centric approaches in the context of HIS development: the Model-Driven Architecture, Business Process Modeling with BPMN and BPEL and the HL7 Development Framework. The comparison reveals that all these approaches are viable candidates for the development of HIS. However, they have distinct strengths and abstraction levels, they require local and project-specific adaptation and offer varying levels of automation. In addition, illustration of the solutions to the end users must be improved.
Protein-Protein Interface Predictions by Data-Driven Methods: A Review
Xue, Li C; Dobbs, Drena; Bonvin, Alexandre M.J.J.; Honavar, Vasant
2015-01-01
Reliably pinpointing which specific amino acid residues form the interface(s) between a protein and its binding partner(s) is critical for understanding the structural and physicochemical determinants of protein recognition and binding affinity, and has wide applications in modeling and validating protein interactions predicted by high-throughput methods, in engineering proteins, and in prioritizing drug targets. Here, we review the basic concepts, principles and recent advances in computational approaches to the analysis and prediction of protein-protein interfaces. We point out caveats for objectively evaluating interface predictors, and discuss various applications of data-driven interface predictors for improving energy model-driven protein-protein docking. Finally, we stress the importance of exploiting binding partner information in reliably predicting interfaces and highlight recent advances in this emerging direction. PMID:26460190
Initial conditions and modeling for simulations of shock driven turbulent material mixing
Grinstein, Fernando F.
2016-11-17
Here, we focus on the simulation of shock-driven material mixing driven by flow instabilities and initial conditions (IC). Beyond complex multi-scale resolution issues of shocks and variable density turbulence, we must address the equally difficult problem of predicting flow transition promoted by energy deposited at the material interfacial layer during the shock-interface interactions. Transition involves unsteady large-scale coherent-structure dynamics capturable by a large eddy simulation (LES) strategy, but not by an unsteady Reynolds-Averaged Navier-Stokes (URANS) approach based on developed equilibrium turbulence assumptions and single-point-closure modeling. On the engineering end of computations, such URANS approaches, with reduced 1D/2D dimensionality and coarser grids, tend to be preferred for faster turnaround in full-scale configurations.
Shock dynamics of two-lane driven lattice gases
NASA Astrophysics Data System (ADS)
Schiffmann, Christoph; Appert-Rolland, Cécile; Santen, Ludger
2010-06-01
Driven lattice gases such as those of the ASEP model are useful tools for the modelling of various stochastic transport processes carried out by self-driven particles, such as molecular motors or vehicles in road traffic. Often these processes take place in one-dimensional systems offering several tracks to the particles, and in many cases the particles are able to change track with a given rate. In this work we consider the case of strong coupling where the rate of hopping along the tracks and the exchange rates are of the same order, and show how a phenomenological approach based on a domain wall theory can be used to describe the dynamics of the system. In particular, the domain walls on the different tracks form pairs, whose dynamics dominate the behaviour of the system.
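A minimal single-lane version of the driven lattice gases discussed above can be simulated directly; the sketch below is the open-boundary TASEP (the paper's setting couples two such lanes with exchange rates of the same order as the hopping rate, which this toy omits):

```python
import numpy as np

def tasep(L=100, alpha=0.3, beta=0.3, steps=200000, seed=1):
    """Monte Carlo simulation of the single-lane open-boundary TASEP.

    alpha/beta are the entry/exit rates.  Returns the time-averaged
    steady-state density profile.
    """
    rng = np.random.default_rng(seed)
    tau = np.zeros(L, dtype=int)          # occupation numbers
    profile = np.zeros(L)
    for t in range(steps):
        i = rng.integers(-1, L)           # -1 encodes the entry move
        if i == -1:
            if tau[0] == 0 and rng.random() < alpha:
                tau[0] = 1
        elif i == L - 1:
            if tau[i] == 1 and rng.random() < beta:
                tau[i] = 0
        elif tau[i] == 1 and tau[i + 1] == 0:
            tau[i], tau[i + 1] = 0, 1     # hop to the right
        if t > steps // 2:                # accumulate steady-state density
            profile += tau
    return profile / (steps - steps // 2)
```

On the coexistence line alpha = beta < 1/2 the averaged profile interpolates between the low density alpha and the high density 1 - beta: the fingerprint of a freely diffusing domain wall, the object whose pairing across tracks the phenomenological theory above exploits.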
Market Model of Financing Higher Education in Sub-Saharan Africa: Examples from Kenya.
ERIC Educational Resources Information Center
Oketch, Moses O.
2003-01-01
Examines some of the rationales for financial diversification and partial privatization of state universities in Kenya and the different manifestations of market-driven approaches to university education. Explores whether the market model can address increased demand while maintaining educational quality. (EV)
HMM-based lexicon-driven and lexicon-free word recognition for online handwritten Indic scripts.
Bharath, A; Madhvanath, Sriganesh
2012-04-01
Research on recognizing online handwritten words in Indic scripts is at its early stages when compared to Latin and Oriental scripts. In this paper, we address this problem specifically for two major Indic scripts: Devanagari and Tamil. In contrast to previous approaches, the techniques we propose are largely data-driven and script-independent. We propose two different techniques for word recognition based on Hidden Markov Models (HMM): lexicon-driven and lexicon-free. The lexicon-driven technique models each word in the lexicon as a sequence of symbol HMMs according to a standard symbol writing order derived from the phonetic representation. The lexicon-free technique uses a novel Bag-of-Symbols representation of the handwritten word that is independent of symbol order and allows rapid pruning of the lexicon. On handwritten Devanagari word samples featuring both standard and nonstandard symbol writing orders, a combination of the lexicon-driven and lexicon-free recognizers significantly outperforms either of them used in isolation. In contrast, most Tamil word samples feature the standard symbol order, and the lexicon-driven recognizer outperforms the lexicon-free one as well as their combination. The best recognition accuracies obtained for 20,000-word lexicons are 87.13 percent for Devanagari when the two recognizers are combined, and 91.8 percent for Tamil using the lexicon-driven technique.
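The order-free pruning that a Bag-of-Symbols representation enables can be illustrated with a toy multiset filter. This is a hypothetical sketch, not the paper's implementation: each lexicon word is reduced to a multiset of symbols so candidates can be discarded cheaply before any expensive HMM scoring of the survivors:

```python
from collections import Counter

def prune_lexicon(recognized_symbols, lexicon, max_dist=1):
    """Keep only lexicon words whose symbol multiset is within max_dist
    insertions/deletions of the recognized bag, ignoring symbol order."""
    bag = Counter(recognized_symbols)
    survivors = []
    for word in lexicon:
        wbag = Counter(word)
        # symmetric multiset difference = insertions + deletions needed
        dist = sum((bag - wbag).values()) + sum((wbag - bag).values())
        if dist <= max_dist:
            survivors.append(word)
    return survivors
```

Note that anagrams survive pruning (order is deliberately ignored), which is why the pruned shortlist is then re-scored by an order-sensitive recognizer.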
The LOM Approach--A CALL for Concern?
ERIC Educational Resources Information Center
Armitage, Nicholas; Bowerman, Chris
2005-01-01
The LOM (Learning Object Model) approach to courseware design seems to be driven by a desire to increase access to education as well as use technology to enable a higher staff-student ratio than is currently possible. The LOM standard involves the use of standard metadata descriptions of content and adaptive content engines to deliver the…
The "Learning Games Design Model": Immersion, Collaboration, and Outcomes-Driven Development
ERIC Educational Resources Information Center
Chamberlin, Barbara; Trespalacios, Jesús; Gallagher, Rachel
2012-01-01
Instructional designers in the Learning Games Lab at New Mexico State University have developed a specific approach for the creation of educational games, one that has been used successfully in over 20 instructional design projects and is extensible to other developers. Using this approach, game developers and content experts (a) work…
Xu, Wenjun; Chen, Jie; Lau, Henry Y K; Ren, Hongliang
2017-09-01
Accurate motion control of flexible surgical manipulators is crucial in tissue manipulation tasks. The tendon-driven serpentine manipulator (TSM) is one of the most widely adopted flexible mechanisms in minimally invasive surgery because of its enhanced maneuverability in tortuous environments. The TSM, however, exhibits high nonlinearities, and the conventional analytical kinematics model is insufficient to achieve high accuracy. To account for the system nonlinearities, we applied a data-driven approach to encode the system inverse kinematics. Three regression methods, extreme learning machine (ELM), Gaussian mixture regression (GMR) and K-nearest neighbors regression (KNNR), were implemented to learn a nonlinear mapping from the robot's 3D position states to the control inputs. The performance of the three algorithms was evaluated both in simulation and in physical trajectory-tracking experiments. KNNR performed the best in the tracking experiments, with the lowest RMSE of 2.1275 mm. The proposed inverse kinematics learning methods provide an alternative and efficient way to accurately model the tendon-driven flexible manipulator. Copyright © 2016 John Wiley & Sons, Ltd.
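The best-performing method above, KNN regression, is simple enough to sketch end to end. The forward model and all parameters below are synthetic stand-ins, not TSM data; the sketch uses uniform-weight averaging over the k nearest training samples:

```python
import numpy as np

def knn_regress(X_train, y_train, x_query, k=3):
    """K-nearest-neighbors regression: average the targets of the k
    training points closest to the query in input space."""
    d = np.linalg.norm(X_train - x_query, axis=1)
    idx = np.argsort(d)[:k]
    return y_train[idx].mean(axis=0)

# Synthetic forward model standing in for the robot: controls -> positions
rng = np.random.default_rng(42)
controls = rng.uniform(-1, 1, size=(500, 2))           # tendon inputs
positions = np.column_stack([
    np.sin(controls[:, 0]), np.cos(controls[:, 1]), controls.sum(axis=1)])

# Inverse-kinematics query: which control produces a desired 3D position?
# Note the roles are swapped: positions are the inputs, controls the targets.
target = positions[0]
u_hat = knn_regress(positions, controls, target, k=3)
```

Swapping inputs and targets like this is what makes the approach an *inverse* kinematics learner: no analytical model inversion is needed, only recorded (control, position) pairs.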
Parametric instabilities in resonantly-driven Bose–Einstein condensates
NASA Astrophysics Data System (ADS)
Lellouch, S.; Goldman, N.
2018-04-01
Shaking optical lattices in a resonant manner offers an efficient and versatile method to devise artificial gauge fields and topological band structures for ultracold atomic gases. This was recently demonstrated through the experimental realization of the Harper–Hofstadter model, which combined optical superlattices and resonant time-modulations. Adding inter-particle interactions to these engineered band systems is expected to lead to strongly-correlated states with topological features, such as fractional Chern insulators. However, the interplay between interactions and external time-periodic drives typically triggers violent instabilities and uncontrollable heating, hence potentially ruling out the possibility of accessing such intriguing states of matter in experiments. In this work, we study the early-stage parametric instabilities that occur in systems of resonantly-driven Bose–Einstein condensates in optical lattices. We apply and extend an approach based on Bogoliubov theory (Lellouch et al 2017 Phys. Rev. X 7 021015) to a variety of resonantly-driven band models, from a simple shaken Wannier–Stark ladder to the more intriguing driven-induced Harper–Hofstadter model. In particular, we provide ab initio numerical and analytical predictions for the stability properties of these topical models. This work sheds light on general features that could guide current experiments to stable regimes of operation.
NASA Astrophysics Data System (ADS)
Bouya, Zahra; Terkildsen, Michael
2016-07-01
The Australian Space Forecast Centre (ASFC) provides space weather forecasts to a diverse group of customers. Space Weather Services (SWS) within the Australian Bureau of Meteorology is focussed both on developing tailored products and services for the key customer groups and on supporting ASFC operations. Research in SWS is largely centred on the development of data-driven models using a range of solar-terrestrial data. This paper will cover some data requirements, approaches and recent SWS activities for data-driven modelling, with a focus on regional ionospheric specification and forecasting.
Towards a Pattern-Driven Topical Ontology Modeling Methodology in Elderly Care Homes
NASA Astrophysics Data System (ADS)
Tang, Yan; de Baer, Peter; Zhao, Gang; Meersman, Robert; Pudkey, Kevin
This paper presents a pattern-driven ontology modeling methodology, which is used to create topical ontologies in the human resource management (HRM) domain. An ontology topic is used to group concepts from different contexts (or even from different domain ontologies). We use the Organization for Economic Co-operation and Development (OECD) and the National Vocational Qualification (NVQ) as the resources to create the topical ontologies in this paper. The methodology is implemented in a tool called the PAD-ON suite. The approach is illustrated with a use case from elderly care homes in the UK.
KRAS-driven lung adenocarcinoma: combined DDR1/Notch inhibition as an effective therapy
Ambrogio, Chiara; Nadal, Ernest; Villanueva, Alberto; Gómez-López, Gonzalo; Cash, Timothy P; Barbacid, Mariano; Santamaría, David
2016-01-01
Understanding the early evolution of cancer heterogeneity during the initial steps of tumorigenesis can uncover vulnerabilities of cancer cells that may be masked at later stages. We describe a comprehensive approach employing gene expression analysis in early lesions to identify novel therapeutic targets and the use of mouse models to test synthetic lethal drug combinations to treat human Kirsten rat sarcoma viral oncogene homologue (KRAS)-driven lung adenocarcinoma. PMID:27843638
Collision-model approach to steering of an open driven qubit
NASA Astrophysics Data System (ADS)
Beyer, Konstantin; Luoma, Kimmo; Strunz, Walter T.
2018-03-01
We investigate quantum steering of an open quantum system by measurements on its environment in the framework of collision models. As an example we consider a coherently driven qubit dissipatively coupled to a bath. We construct local nonadaptive and adaptive as well as nonlocal measurement scenarios specifying explicitly the measured observable on the environment. Our approach shows transparently how the conditional evolution of the open system depends on the type of the measurement scenario and the measured observables. These can then be optimized for steering. The nonlocal measurement scenario leads to maximal violation of the used steering inequality at zero temperature. Further, we investigate the robustness of the constructed scenarios against thermal noise. We find generally that steering becomes harder at higher temperatures. Surprisingly, the system can be steered even when bipartite entanglement between the system and individual subenvironments vanishes.
Using connectome-based predictive modeling to predict individual behavior from brain connectivity
Shen, Xilin; Finn, Emily S.; Scheinost, Dustin; Rosenberg, Monica D.; Chun, Marvin M.; Papademetris, Xenophon; Constable, R Todd
2017-01-01
Neuroimaging is a fast-developing research area in which anatomical and functional images of human brains are collected using techniques such as functional magnetic resonance imaging (fMRI), diffusion tensor imaging (DTI), and electroencephalography (EEG). Technical advances and large-scale datasets have allowed for the development of models capable of predicting individual differences in traits and behavior using brain connectivity measures derived from neuroimaging data. Here, we present connectome-based predictive modeling (CPM), a data-driven protocol for developing predictive models of brain-behavior relationships from connectivity data using cross-validation. This protocol includes the following steps: 1) feature selection, 2) feature summarization, 3) model building, and 4) assessment of prediction significance. We also include suggestions for visualizing the most predictive features (i.e., brain connections). The final result should be a generalizable model that takes brain connectivity data as input and generates predictions of behavioral measures in novel subjects, accounting for a significant amount of the variance in these measures. It has been demonstrated that the CPM protocol performs equivalently or better than most of the existing approaches in brain-behavior prediction. However, because CPM focuses on linear modeling and a purely data-driven approach, neuroscientists with limited or no experience in machine learning or optimization will find it easy to implement the protocol. Depending on the volume of data to be processed, the protocol can take 10-100 minutes for model building, 1-48 hours for permutation testing, and 10-20 minutes for visualization of results. PMID:28182017
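Steps 1-3 of the protocol can be sketched compactly on synthetic data. This is a minimal positive-network-only illustration under stated simplifications: a fixed correlation threshold instead of a p-value threshold, no negative network, and no cross-validation or permutation testing (step 4):

```python
import numpy as np

def cpm_fit(conn, behav, r_thresh=0.2):
    """Minimal CPM sketch.

    conn:  (n_subjects, n_edges) connectivity matrix
    behav: (n_subjects,) behavioral scores
    """
    # 1) feature selection: edges whose correlation with behavior
    #    exceeds the threshold (positive network only)
    cz = (conn - conn.mean(0)) / conn.std(0)
    bz = (behav - behav.mean()) / behav.std()
    r = cz.T @ bz / len(behav)
    mask = r > r_thresh
    # 2) feature summarization: one summary score per subject
    summary = conn[:, mask].sum(axis=1)
    # 3) model building: simple linear model behav ~ summary
    slope, intercept = np.polyfit(summary, behav, 1)
    return mask, slope, intercept

# Synthetic data: 40 subjects, 100 edges, 5 of which carry signal
rng = np.random.default_rng(7)
conn = rng.normal(size=(40, 100))
behav = conn[:, :5].sum(axis=1) + 0.5 * rng.normal(size=40)
mask, slope, intercept = cpm_fit(conn, behav)
pred = conn[:, mask].sum(axis=1) * slope + intercept
```

In the real protocol the selection and fitting are done inside a cross-validation loop and significance is assessed by permuting behavioral labels; fitting and predicting on the same subjects, as here, overstates accuracy.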
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saurav, Kumar; Chandan, Vikas
District-heating-and-cooling (DHC) systems are a proven energy solution that has been deployed for many years in a growing number of urban areas worldwide. They comprise a variety of technologies that seek to develop synergies between the production and supply of heat, cooling, domestic hot water and electricity. Although the benefits of DHC systems are significant and have been widely acclaimed, the full potential of modern DHC systems remains largely untapped. There are several opportunities for the development of energy-efficient DHC systems, which will enable the effective exploitation of alternative renewable resources, waste heat recovery, etc., in order to increase the overall efficiency and facilitate the transition towards the next generation of DHC systems. This motivates the need for modelling these complex systems. Large-scale modelling of DHC networks is challenging, as they have several components, such as buildings, pipes, valves and heating sources, interacting with each other. In this paper, we focus on building modelling. In particular, we present a gray-box methodology for thermal modelling of buildings. Gray-box modelling is a hybrid of data-driven and physics-based models in which coefficients of the equations from physics-based models are learned from data. This approach allows us to capture the dynamics of the buildings more effectively than a pure data-driven approach. Additionally, it results in simpler models than pure physics-based models. We first develop the individual components of the building, such as temperature evolution and the flow controller. These individual models are then integrated into the complete gray-box model of the building. The model is validated using data collected from one of the buildings at Luleå, a city on the coast of northern Sweden.
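The gray-box idea, a physics-shaped equation with data-fitted coefficients, can be sketched with a first-order thermal model. The structure, coefficients, and data below are all synthetic stand-ins, not the paper's model or the Luleå measurements:

```python
import numpy as np

# Gray-box thermal sketch: a first-order building model
#   T[k+1] = a*T[k] + b*T_out[k] + c*Q[k]
# whose coefficients (physically, functions of thermal resistance and
# capacitance) are learned from data by least squares.
rng = np.random.default_rng(3)
n = 500
T_out = 5 + 3 * np.sin(np.arange(n) / 24 * 2 * np.pi)   # outdoor temperature
Q = rng.uniform(0, 1, n)                                # heating power
a_true, b_true, c_true = 0.90, 0.08, 2.0

T = np.empty(n)
T[0] = 20.0
for k in range(n - 1):                                  # "physics" simulator
    T[k + 1] = (a_true * T[k] + b_true * T_out[k] + c_true * Q[k]
                + 0.05 * rng.normal())                  # measurement noise

# The gray-box step: recover the physical coefficients from data
A = np.column_stack([T[:-1], T_out[:-1], Q[:-1]])
coef, *_ = np.linalg.lstsq(A, T[1:], rcond=None)
```

Because the regression is constrained to the physics-given structure, the learned coefficients stay interpretable, which is the advantage over a pure black-box fit noted in the abstract.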
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oldenburg, C.M.
2011-06-01
The need for risk-driven field experiments for CO{sub 2} geologic storage processes to complement ongoing pilot-scale demonstrations is discussed. These risk-driven field experiments would be aimed at understanding the circumstances under which things can go wrong with a CO{sub 2} capture and storage (CCS) project and cause it to fail, as distinguished from accomplishing this end using demonstration and industrial-scale sites. Such risk-driven tests would complement risk-assessment efforts that have already been carried out by providing opportunities to validate risk models. In addition to experimenting with high-risk scenarios, these controlled field experiments could help validate monitoring approaches to improve performance assessment and guide development of mitigation strategies.
Data-Driven Modeling and Prediction of Arctic Sea Ice
NASA Astrophysics Data System (ADS)
Kondrashov, Dmitri; Chekroun, Mickael; Ghil, Michael
2016-04-01
We present results of data-driven predictive analyses of sea ice over the main Arctic regions. Our approach relies on the Multilayer Stochastic Modeling (MSM) framework of Kondrashov, Chekroun and Ghil [Physica D, 2015] and it leads to probabilistic prognostic models of sea ice concentration (SIC) anomalies on seasonal time scales. This approach is applied to monthly time series of state-of-the-art data-adaptive decompositions of SIC and selected climate variables over the Arctic. We evaluate the predictive skill of MSM models by performing retrospective forecasts with "no-look ahead" for up to 6-months ahead. It will be shown in particular that the memory effects included intrinsically in the formulation of our non-Markovian MSM models allow for improvements of the prediction skill of large-amplitude SIC anomalies in certain Arctic regions on the one hand, and of September Sea Ice Extent, on the other. Further improvements allowed by the MSM framework will adopt a nonlinear formulation and explore next-generation data-adaptive decompositions, namely modification of Principal Oscillation Patterns (POPs) and rotated Multichannel Singular Spectrum Analysis (M-SSA).
Imaging plus X: multimodal models of neurodegenerative disease.
Oxtoby, Neil P; Alexander, Daniel C
2017-08-01
This article argues that the time is approaching for data-driven disease modelling to take centre stage in the study and management of neurodegenerative disease. The snowstorm of data now available to the clinician defies qualitative evaluation; the heterogeneity of data types complicates integration through traditional statistical methods; and the large datasets becoming available remain far from the big-data sizes necessary for fully data-driven machine-learning approaches. The recent emergence of data-driven disease progression models provides a balance between imposed knowledge of disease features and patterns learned from data. The resulting models are both predictive of disease progression in individual patients and informative in terms of revealing underlying biological patterns. Largely inspired by observational models, data-driven disease progression models have emerged in the last few years as a feasible means for understanding the development of neurodegenerative diseases. These models have revealed insights into frontotemporal dementia, Huntington's disease, multiple sclerosis, Parkinson's disease and other conditions. For example, event-based models have revealed finer graded understanding of progression patterns; self-modelling regression and differential equation models have provided data-driven biomarker trajectories; spatiotemporal models have shown that brain shape changes, for example of the hippocampus, can occur before detectable neurodegeneration; and network models have provided some support for prion-like mechanistic hypotheses of disease propagation. The most mature results are in sporadic Alzheimer's disease, in large part because of the availability of the Alzheimer's disease neuroimaging initiative dataset. Results generally support the prevailing amyloid-led hypothetical model of Alzheimer's disease, while revealing finer detail and insight into disease progression. 
The emerging field of disease progression modelling provides a natural mechanism to integrate different kinds of information, for example from imaging, serum and cerebrospinal fluid markers and cognitive tests, to obtain new insights into progressive diseases. Such insights include fine-grained longitudinal patterns of neurodegeneration, from early stages, and the heterogeneity of these trajectories over the population. More pragmatically, such models enable finer precision in patient staging and stratification, prediction of progression rates and earlier and better identification of at-risk individuals. We argue that this will make disease progression modelling invaluable for recruitment and end-points in future clinical trials, potentially ameliorating the high failure rate in trials of, e.g., Alzheimer's disease therapies. We review the state of the art in these techniques and discuss the future steps required to translate the ideas to front-line application.
A Knowledge-Based and Model-Driven Requirements Engineering Approach to Conceptual Satellite Design
NASA Astrophysics Data System (ADS)
Dos Santos, Walter A.; Leonor, Bruno B. F.; Stephany, Stephan
Satellite systems are becoming even more complex, making technical issues a significant cost driver. The increasing complexity of these systems makes requirements engineering activities both more important and difficult. Additionally, today's competitive pressures and other market forces drive manufacturing companies to improve the efficiency with which they design and manufacture space products and systems. This imposes a heavy burden on systems-of-systems engineering skills and particularly on requirements engineering which is an important phase in a system's life cycle. When this is poorly performed, various problems may occur, such as failures, cost overruns and delays. One solution is to underpin the preliminary conceptual satellite design with computer-based information reuse and integration to deal with the interdisciplinary nature of this problem domain. This can be attained by taking a model-driven engineering approach (MDE), in which models are the main artifacts during system development. MDE is an emergent approach that tries to address system complexity by the intense use of models. This work outlines the use of SysML (Systems Modeling Language) and a novel knowledge-based software tool, named SatBudgets, to deal with these and other challenges confronted during the conceptual phase of a university satellite system, called ITASAT, currently being developed by INPE and some Brazilian universities.
ERIC Educational Resources Information Center
Schellekens, Ad; Paas, Fred; Verbraeck, Alexander; van Merrienboer, Jeroen J. G.
2010-01-01
In a preceding case study, a process-focused demand-driven approach for organising flexible educational programmes in higher professional education (HPE) was developed. Operations management and instructional design contributed to designing a flexible educational model by means of discrete-event simulation. Educational experts validated the model…
Bordier, Cecile; Puja, Francesco; Macaluso, Emiliano
2013-01-01
The investigation of brain activity using naturalistic, ecologically-valid stimuli is becoming an important challenge for neuroscience research. Several approaches have been proposed, primarily relying on data-driven methods (e.g. independent component analysis, ICA). However, data-driven methods often require some post-hoc interpretation of the imaging results to draw inferences about the underlying sensory, motor or cognitive functions. Here, we propose using a biologically-plausible computational model to extract (multi-)sensory stimulus statistics that can be used for standard hypothesis-driven analyses (general linear model, GLM). We ran two separate fMRI experiments, which both involved subjects watching an episode of a TV-series. In Exp 1, we manipulated the presentation by switching color, motion and/or sound on and off at variable intervals, whereas in Exp 2, the video was played in the original version, with all the consequent continuous changes of the different sensory features intact. For both vision and audition, we extracted stimulus statistics corresponding to spatial and temporal discontinuities of low-level features, as well as a combined measure related to the overall stimulus saliency. Results showed that activity in occipital visual cortex and the superior temporal auditory cortex co-varied with changes of low-level features. Visual saliency was found to further boost activity in extra-striate visual cortex plus posterior parietal cortex, while auditory saliency was found to enhance activity in the superior temporal cortex. Data-driven ICA analyses of the same datasets also identified “sensory” networks comprising visual and auditory areas, but without providing specific information about the possible underlying processes, e.g., these processes could relate to modality, stimulus features and/or saliency.
We conclude that the combination of computational modeling and GLM enables the tracking of the impact of bottom–up signals on brain activity during viewing of complex and dynamic multisensory stimuli, beyond the capability of purely data-driven approaches. PMID:23202431
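The hypothesis-driven step described above can be sketched as follows: a model-derived stimulus statistic is convolved with a canonical haemodynamic response and entered as a GLM regressor. Everything here is synthetic: the "saliency" series is a random stand-in, the HRF is a simplified double-gamma form, and the voxel time course is generated with a known weight so the fit can be checked.

```python
import math
import numpy as np

def hrf(t, peak=6.0, under=16.0, ratio=1.0 / 6.0):
    """Simplified double-gamma haemodynamic response function (assumed form)."""
    g = lambda x, a: (x ** (a - 1)) * np.exp(-x) / math.gamma(a)
    return g(t, peak) - ratio * g(t, under)

rng = np.random.default_rng(0)
tr, n = 2.0, 200
h = hrf(np.arange(0.0, 32.0, tr))          # HRF sampled at the scan interval

saliency = rng.random(n)                   # model-derived stimulus statistic (synthetic)
x = np.convolve(saliency, h)[:n]           # predicted BOLD response
X = np.column_stack([x, np.ones(n)])       # GLM design matrix with intercept
y = 0.8 * x + rng.normal(0.0, 0.01, n)     # synthetic voxel time course
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
```

The estimated `beta[0]` recovers the planted weight of 0.8, which is the sense in which activity "co-varies" with the model-derived statistic.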
Scenario driven data modelling: a method for integrating diverse sources of data and data streams
2011-01-01
Background Biology is rapidly becoming a data intensive, data-driven science. It is essential that data is represented and connected in ways that best represent its full conceptual content and allow both automated integration and data-driven decision-making. Recent advancements in distributed multi-relational directed graphs, implemented in the form of the Semantic Web, make it possible to deal with complicated heterogeneous data in new and interesting ways. Results This paper presents a new approach, scenario driven data modelling (SDDM), that integrates multi-relational directed graphs with data streams. SDDM can be applied to virtually any data integration challenge with widely divergent types of data and data streams. In this work, we explored integrating genetics data with reports from traditional media. SDDM was applied to the New Delhi metallo-beta-lactamase gene (NDM-1), an emerging global health threat. The SDDM process constructed a scenario, created an RDF multi-relational directed graph that linked diverse types of data to the Semantic Web, implemented RDF conversion tools (RDFizers) to bring content into the Semantic Web, identified data streams and analytical routines to analyse those streams, and identified user requirements and graph traversals to meet end-user requirements. Conclusions We provided an example where SDDM was applied to a complex data integration challenge. The process created a model of the emerging NDM-1 health threat, identified and filled gaps in that model, and constructed reliable software that monitored data streams based on the scenario-derived multi-relational directed graph. The SDDM process significantly reduced the software requirements phase by letting the scenario and resulting multi-relational directed graph define what is possible and then set the scope of the user requirements.
Approaches like SDDM will be critical to the future of data intensive, data-driven science because they automate the process of converting massive data streams into usable knowledge. PMID:22165854
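The core data structure here, a multi-relational directed graph of (subject, predicate, object) triples, can be illustrated without any RDF tooling. The node and edge names below are invented for illustration, not taken from the NDM-1 study's actual model.

```python
# Toy multi-relational directed graph in the SDDM spirit: each triple is one
# labelled edge linking genetics data to media reports (names are invented).
triples = {
    ("NDM-1", "is_a", "resistance_gene"),
    ("NDM-1", "found_in", "Klebsiella_pneumoniae"),
    ("report_42", "mentions", "NDM-1"),
    ("report_42", "reported_from", "New_Delhi"),
}

def follow(graph, start, predicate):
    """Traverse one predicate outward from a node, as an end-user query might."""
    return sorted(o for s, p, o in graph if s == start and p == predicate)
```

For example, `follow(triples, "report_42", "mentions")` returns `["NDM-1"]`; in a real SDDM deployment the triples would live in an RDF store and the traversals would be SPARQL queries.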
Managing business compliance using model-driven security management
NASA Astrophysics Data System (ADS)
Lang, Ulrich; Schreiner, Rudolf
Compliance with regulatory and governance standards is rapidly becoming one of the hot topics of information security today. This is because, especially with regulatory compliance, both business and government have to expect large financial and reputational losses if compliance cannot be ensured and demonstrated. One major difficulty of implementing such regulations is caused by the fact that they are captured at a high level of abstraction that is business-centric and not IT-centric. This means that the abstract intent needs to be translated in a trustworthy, traceable way into compliance and security policies that the IT security infrastructure can enforce. Carrying out this mapping process manually is time-consuming, maintenance-intensive, costly, and error-prone. Compliance monitoring is also critical in order to be able to demonstrate compliance at any given point in time. The problem is further complicated by the need for business-driven IT agility, where IT policies and enforcement can change frequently, e.g. in Business Process Modelling (BPM)-driven Service Oriented Architecture (SOA). Model Driven Security (MDS) is an innovative technology approach that can solve these problems as an extension of identity and access management (IAM) and authorization management (also called entitlement management). In this paper we illustrate the theory behind Model Driven Security for compliance, provide an improved and extended architecture, as well as a case study in the healthcare industry using our OpenPMF 2.0 technology.
A pivotal-based approach for enterprise business process and IS integration
NASA Astrophysics Data System (ADS)
Ulmer, Jean-Stéphane; Belaud, Jean-Pierre; Le Lann, Jean-Marc
2013-02-01
A company must be able to describe and react to any endogenous or exogenous event. Such flexibility can be achieved through business process management (BPM). Nevertheless, a BPM approach highlights complex relations between business and IT domains. A non-alignment is exposed between heterogeneous models: this is the 'business-IT gap' as described in the literature. Through concepts from business engineering and information systems driven by models and IT, we define a generic approach ensuring multi-view consistency. Its role is to maintain and provide all information related to the structure and semantics of models. Allowing the full return of a transformed model in the sense of reverse engineering, our platform enables synchronisation between the analysis model and the implementation model.
Moving Beyond ERP Components: A Selective Review of Approaches to Integrate EEG and Behavior
Bridwell, David A.; Cavanagh, James F.; Collins, Anne G. E.; Nunez, Michael D.; Srinivasan, Ramesh; Stober, Sebastian; Calhoun, Vince D.
2018-01-01
Relationships between neuroimaging measures and behavior provide important clues about brain function and cognition in healthy and clinical populations. While electroencephalography (EEG) provides a portable, low cost measure of brain dynamics, it has been somewhat underrepresented in the emerging field of model-based inference. We seek to address this gap in this article by highlighting the utility of linking EEG and behavior, with an emphasis on approaches for EEG analysis that move beyond focusing on peaks or “components” derived from averaging EEG responses across trials and subjects (generating the event-related potential, ERP). First, we review methods for deriving features from EEG in order to enhance the signal within single-trials. These methods include filtering based on user-defined features (i.e., frequency decomposition, time-frequency decomposition), filtering based on data-driven properties (i.e., blind source separation, BSS), and generating more abstract representations of data (e.g., using deep learning). We then review cognitive models which extract latent variables from experimental tasks, including the drift diffusion model (DDM) and reinforcement learning (RL) approaches. Next, we discuss ways to assess associations among these measures, including statistical models, data-driven joint models and cognitive joint modeling using hierarchical Bayesian models (HBMs). We think that these methodological tools are likely to contribute to theoretical advancements, and will help inform our understanding of brain dynamics that contribute to moment-to-moment cognitive function. PMID:29632480
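The first step the review describes, deriving a single-trial feature via frequency decomposition and relating it to behavior, can be sketched with synthetic data. The alpha band, sampling rate, and the planted amplitude-to-reaction-time link below are all assumptions made for the sketch.

```python
import numpy as np

def alpha_power(trial, fs=250.0, band=(8.0, 12.0)):
    """Single-trial alpha-band power via the FFT (one simple derived feature)."""
    freqs = np.fft.rfftfreq(trial.size, 1.0 / fs)
    spectrum = np.abs(np.fft.rfft(trial)) ** 2
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return spectrum[mask].mean()

rng = np.random.default_rng(1)
t = np.arange(0.0, 1.0, 1.0 / 250.0)            # one second of "EEG" per trial
amps = rng.uniform(0.5, 2.0, 40)                # latent alpha amplitude per trial
trials = amps[:, None] * np.sin(2 * np.pi * 10 * t) \
         + 0.1 * rng.standard_normal((40, t.size))
rts = 300.0 + 50.0 * amps                       # synthetic reaction times (ms)

powers = np.array([alpha_power(tr) for tr in trials])
r = np.corrcoef(powers, rts)[0, 1]              # single-trial brain-behavior link
```

Because the reaction times were generated from the same latent amplitude that drives alpha power, the correlation `r` is strongly positive; in real data this association (or its absence) is exactly what joint models formalize.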
Huang, Ri-Bo; Du, Qi-Shi; Wei, Yu-Tuo; Pang, Zong-Wen; Wei, Hang; Chou, Kuo-Chen
2009-02-07
Predicting the bioactivity of peptides and proteins is an important challenge in drug development and protein engineering. In this study we introduce a novel approach, the so-called "physics and chemistry-driven artificial neural network (Phys-Chem ANN)", to deal with such problems. Unlike the existing ANN approaches, which were designed under the inspiration of the biological neural system, the Phys-Chem ANN approach is based on physical and chemical principles, as well as the structural features of proteins. In the Phys-Chem ANN model the "hidden layers" are no longer virtual "neurons", but real structural units of proteins and peptides. It is a hybridization approach, which combines the linear free energy concept of quantitative structure-activity relationship (QSAR) with the advanced mathematical technique of ANN. The Phys-Chem ANN approach has adopted an iterative and feedback procedure, incorporating both machine-learning and artificial intelligence capabilities. In addition to making more accurate predictions for the bioactivities of proteins and peptides than is possible with the traditional QSAR approach, the Phys-Chem ANN approach can also provide more insight into the relationship between bioactivities and the structures involved than the ANN approach does. As an example of the application of the Phys-Chem ANN approach, a predictive model for the conformational stability of human lysozyme is presented.
NASA Astrophysics Data System (ADS)
Ulerich, J.; Göktepe, S.; Kuhl, E.
This manuscript presents a continuum approach towards cardiac growth and remodeling that is capable of predicting chronic maladaptation of the heart in response to changes in mechanical loading. It is based on the multiplicative decomposition of the deformation gradient into an elastic and a growth part. Motivated by morphological changes in cardiomyocyte geometry, we introduce an anisotropic growth tensor that can capture both hypertrophic wall thickening and ventricular dilation within one generic concept. In agreement with clinical observations, we propose wall thickening to be a stress-driven phenomenon whereas dilation is introduced as a strain-driven process. The features of the proposed approach are illustrated in terms of the adaptation of thin heart slices and in terms of overload-induced dilation in a generic bi-ventricular heart model.
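The kinematics sketched above can be written compactly. The following uses standard continuum-growth notation; the fibre-aligned form of the growth tensor is one common choice and not necessarily the paper's exact formulation.

```latex
\mathbf{F} = \mathbf{F}^{e}\,\mathbf{F}^{g},
\qquad
\mathbf{F}^{g} = \mathbf{I} + \left(\vartheta - 1\right)\,\mathbf{f}_{0} \otimes \mathbf{f}_{0},
\qquad
\dot{\vartheta} = k(\vartheta)\,\phi(\cdot),
```

where \(\vartheta\) is the growth multiplier along the structural direction \(\mathbf{f}_{0}\), and the driving force \(\phi\) is a stress measure for wall thickening but a strain measure for dilation, matching the stress-driven/strain-driven distinction drawn in the abstract.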
Supervised dictionary learning for inferring concurrent brain networks.
Zhao, Shijie; Han, Junwei; Lv, Jinglei; Jiang, Xi; Hu, Xintao; Zhao, Yu; Ge, Bao; Guo, Lei; Liu, Tianming
2015-10-01
Task-based fMRI (tfMRI) has been widely used to explore functional brain networks via a predefined stimulus paradigm in the fMRI scan. Traditionally, the general linear model (GLM) has been a dominant approach to detect task-evoked networks. However, GLM focuses on task-evoked or event-evoked brain responses and possibly ignores the intrinsic brain functions. In comparison, dictionary learning and sparse coding methods have attracted much attention recently, and these methods have shown the promise of automatically and systematically decomposing fMRI signals into meaningful task-evoked and intrinsic concurrent networks. Nevertheless, two notable limitations of current data-driven dictionary learning methods are that the prior knowledge of the task paradigm is not sufficiently utilized and that the establishment of correspondences among dictionary atoms in different brains has been challenging. In this paper, we propose a novel supervised dictionary learning and sparse coding method for inferring functional networks from tfMRI data, which combines the advantages of model-driven and data-driven methods. The basic idea is to fix the task stimulus curves as predefined model-driven dictionary atoms and only optimize the other portion of data-driven dictionary atoms. Application of this novel methodology on the publicly available human connectome project (HCP) tfMRI datasets has achieved promising results.
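The "fix some atoms, learn the rest" idea can be sketched with an alternating least-squares loop. This is a crude stand-in for the paper's method: the data are random, the sparsification is a simple threshold rather than a proper sparse coder, and the update rule is only illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
t, k_fix, k_free, v = 100, 2, 3, 50

D_fix = rng.standard_normal((t, k_fix))      # task stimulus curves (held fixed)
D_free = rng.standard_normal((t, k_free))    # intrinsic-network atoms (learned)
X = rng.standard_normal((t, v))              # fMRI signal matrix, time x voxels
D_fix_orig = D_fix.copy()

for _ in range(5):
    D = np.hstack([D_fix, D_free])
    A, *_ = np.linalg.lstsq(D, X, rcond=None)     # coding step for all atoms
    A = np.where(np.abs(A) > 0.05, A, 0.0)        # crude sparsification
    R = X - D_fix @ A[:k_fix]                     # signal unexplained by task atoms
    Z, *_ = np.linalg.lstsq(A[k_fix:].T, R.T, rcond=None)
    D_free = Z.T                                  # update only the free atoms
```

The key structural point survives the simplifications: the model-driven atoms `D_fix` are never touched by the update, while `D_free` absorbs the residual structure.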
Effective Rating Scale Development for Speaking Tests: Performance Decision Trees
ERIC Educational Resources Information Center
Fulcher, Glenn; Davidson, Fred; Kemp, Jenny
2011-01-01
Rating scale design and development for testing speaking is generally conducted using one of two approaches: the measurement-driven approach or the performance data-driven approach. The measurement-driven approach prioritizes the ordering of descriptors onto a single scale. Meaning is derived from the scaling methodology and the agreement of…
Magidson, Jessica F.; Roberts, Brent; Collado-Rodriguez, Anahi; Lejuez, C.W.
2013-01-01
Considerable evidence suggests that personality traits may be changeable, raising the possibility that personality traits most linked to health problems can be modified with intervention. A growing body of research suggests that problematic personality traits may be altered with behavioral intervention using a bottom-up approach: that is, by targeting core behaviors that underlie personality traits with the goal of engendering new, healthier patterns of behavior that over time become automatized and manifest in changes in personality traits. Nevertheless, a bottom-up model for changing personality traits is somewhat diffuse and requires clearer integration of theory and relevant interventions to enable real clinical application. As such, this manuscript proposes a set of guiding principles for theory-driven modification of targeted personality traits using a bottom-up approach, focusing specifically on targeting the trait of conscientiousness using a relevant behavioral intervention, Behavioral Activation (BA), considered within the motivational framework of Expectancy Value Theory (EVT). We conclude with a real case example of the application of BA to alter behaviors counter to conscientiousness in a substance dependent patient, highlighting the EVT principles most relevant to the approach and the importance and viability of a theoretically-driven, bottom-up approach to changing personality traits. PMID:23106844
Model-Driven Development of Safety Architectures
NASA Technical Reports Server (NTRS)
Denney, Ewen; Pai, Ganesh; Whiteside, Iain
2017-01-01
We describe the use of model-driven development for safety assurance of a pioneering NASA flight operation involving a fleet of small unmanned aircraft systems (sUAS) flying beyond visual line of sight. The central idea is to develop a safety architecture that provides the basis for risk assessment and visualization within a safety case, the formal justification of acceptable safety required by the aviation regulatory authority. A safety architecture is composed from a collection of bow tie diagrams (BTDs), a practical approach to manage safety risk by linking the identified hazards to the appropriate mitigation measures. The safety justification for a given unmanned aircraft system (UAS) operation can have many related BTDs. In practice, however, each BTD is independently developed, which poses challenges with respect to incremental development, maintaining consistency across different safety artifacts when changes occur, and in extracting and presenting stakeholder specific information relevant for decision making. We show how a safety architecture reconciles the various BTDs of a system, and, collectively, provide an overarching picture of system safety, by considering them as views of a unified model. We also show how it enables model-driven development of BTDs, replete with validations, transformations, and a range of views. Our approach, which we have implemented in our toolset, AdvoCATE, is illustrated with a running example drawn from a real UAS safety case. The models and some of the innovations described here were instrumental in successfully obtaining regulatory flight approval.
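A bow tie diagram's basic shape, threats and consequences around a top event with barriers on each side, can be captured in a small data structure. The field names and UAS hazard examples below are assumptions for illustration, not AdvoCATE's actual schema or the safety case's content.

```python
# Minimal bow tie diagram (BTD) structure; fields and example entries are
# hypothetical, not drawn from the NASA safety case described in the abstract.
btd = {
    "top_event": "loss of separation",
    "threats": [
        {"name": "C2 link loss", "barriers": ["lost-link procedure"]},
    ],
    "consequences": [
        {"name": "midair collision", "barriers": ["see-and-avoid", "geofence"]},
    ],
}

def all_barriers(diagram):
    """Collect every mitigation barrier across both sides of the bow tie."""
    sides = diagram["threats"] + diagram["consequences"]
    return sorted({b for item in sides for b in item["barriers"]})
```

A safety architecture in the paper's sense would treat many such BTDs as views of one unified model, so that a barrier shared between diagrams is represented once rather than duplicated.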
Consistent model driven architecture
NASA Astrophysics Data System (ADS)
Niepostyn, Stanisław J.
2015-09-01
The goal of MDA is to produce software systems from abstract models in a way where human interaction is restricted to a minimum. These abstract models are based on the UML language. However, the semantics of UML models is defined in a natural language. Consequently, verification of the consistency of these diagrams is needed in order to identify errors in requirements at an early stage of the development process. This verification is difficult due to the semi-formal nature of UML diagrams. We propose automatic verification of the consistency of a series of UML diagrams originating from abstract models, implemented with our consistency rules. This Consistent Model Driven Architecture approach enables us to automatically generate complete workflow applications from consistent and complete models developed from abstract models (e.g. a Business Context Diagram). Therefore, our method can be used to check the practicability (feasibility) of software architecture models.
Cryo-EM Data Are Superior to Contact and Interface Information in Integrative Modeling.
de Vries, Sjoerd J; Chauvot de Beauchêne, Isaure; Schindler, Christina E M; Zacharias, Martin
2016-02-23
Protein-protein interactions carry out a large variety of essential cellular processes. Cryo-electron microscopy (cryo-EM) is a powerful technique for the modeling of protein-protein interactions at a wide range of resolutions, and recent developments have caused a revolution in the field. At low resolution, cryo-EM maps can drive integrative modeling of the interaction, assembling existing structures into the map. Other experimental techniques can provide information on the interface or on the contacts between the monomers in the complex. This inevitably raises the question regarding which type of data is best suited to drive integrative modeling approaches. Systematic comparison of the prediction accuracy and specificity of the different integrative modeling paradigms is unavailable to date. Here, we compare EM-driven, interface-driven, and contact-driven integrative modeling paradigms. Models were generated for the protein docking benchmark using the ATTRACT docking engine and evaluated using the CAPRI two-star criterion. At 20 Å resolution, EM-driven modeling achieved a success rate of 100%, outperforming the other paradigms even with perfect interface and contact information. Therefore, even very low-resolution cryo-EM data is superior in predicting heterodimeric and heterotrimeric protein assemblies. Our study demonstrates that a force field is not necessary: cryo-EM data alone is sufficient to accurately guide the monomers into place. The resulting rigid models successfully identify regions of conformational change, opening up perspectives for targeted flexible remodeling. Copyright © 2016 Biophysical Society. Published by Elsevier Inc. All rights reserved.
LexValueSets: An Approach for Context-Driven Value Sets Extraction
Pathak, Jyotishman; Jiang, Guoqian; Dwarkanath, Sridhar O.; Buntrock, James D.; Chute, Christopher G.
2008-01-01
The ability to model, share and re-use value sets across multiple medical information systems is an important requirement. However, generating value sets semi-automatically from a terminology service is still an unresolved issue, in part due to the lack of linkage to clinical context patterns that provide the constraints in defining a concept domain and invocation of value sets extraction. Towards this goal, we develop and evaluate an approach for context-driven automatic value sets extraction based on a formal terminology model. The crux of the technique is to identify and define the context patterns from various domains of discourse and leverage them for value set extraction using two complementary ideas based on (i) local terms provided by the Subject Matter Experts (extensional) and (ii) semantic definition of the concepts in coding schemes (intensional). A prototype was implemented based on SNOMED CT rendered in the LexGrid terminology model and a preliminary evaluation is presented. PMID:18998955
Sutherland, Clare A M; Liu, Xizi; Zhang, Lingshan; Chu, Yingtung; Oldmeadow, Julian A; Young, Andrew W
2018-04-01
People form first impressions from facial appearance rapidly, and these impressions can have considerable social and economic consequences. Three dimensions can explain Western perceivers' impressions of Caucasian faces: approachability, youthful-attractiveness, and dominance. Impressions along these dimensions are theorized to be based on adaptive cues to threat detection or sexual selection, making it likely that they are universal. We tested whether the same dimensions of facial impressions emerge across culture by building data-driven models of first impressions of Asian and Caucasian faces derived from Chinese and British perceivers' unconstrained judgments. We then cross-validated the dimensions with computer-generated average images. We found strong evidence for common approachability and youthful-attractiveness dimensions across perceiver and face race, with some evidence of a third dimension akin to capability. The models explained ~75% of the variance in facial impressions. In general, the findings demonstrate substantial cross-cultural agreement in facial impressions, especially on the most salient dimensions.
Gulati, Sanchita; During, David; Mainland, Jeff; Wong, Agnes M F
2018-01-01
One of the key challenges for healthcare organizations is the development of relevant and accurate cost information. In this paper, we used the time-driven activity-based costing (TDABC) method to calculate the costs of treating individual patients with specific medical conditions over their full cycle of care. We discussed how TDABC provides a critical, systematic and data-driven approach to estimate costs accurately and dynamically, as well as its potential to enable structural and rational cost reduction to bring about a sustainable healthcare system. © 2018 Longwoods Publishing.
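TDABC's arithmetic is simple: each resource gets a capacity cost rate (cost per available minute), and a patient's cost is the sum of rate times minutes over every activity in the care cycle. The rates and durations below are invented figures for illustration only.

```python
# Capacity cost rate = resource cost per available minute (assumed figures).
rate_per_min = {"nurse": 1.2, "physician": 4.5, "mri": 8.0}

# Minutes each resource spends on one hypothetical patient's care cycle.
episode = [("nurse", 30), ("physician", 15), ("mri", 20)]

# TDABC cost of the full cycle of care: sum of rate * time over activities.
total = sum(rate_per_min[resource] * minutes for resource, minutes in episode)
```

Here `total` comes to 263.5 cost units; the method's appeal is that changing a process time or a capacity cost rate immediately and transparently updates the episode cost.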
A necessary condition for dispersal driven growth of populations with discrete patch dynamics.
Guiver, Chris; Packman, David; Townley, Stuart
2017-07-07
We revisit the question of when dispersal-induced coupling between discrete sink populations can cause overall population growth. Such a phenomenon is called dispersal driven growth and provides a simple explanation of how dispersal can allow populations to persist across discrete, spatially heterogeneous, environments even when individual patches are adverse or unfavourable. For two classes of mathematical models, one linear and one non-linear, we provide necessary conditions for dispersal driven growth in terms of the non-existence of a common linear Lyapunov function, which we describe. Our approach draws heavily upon the underlying positive dynamical systems structure. Our results apply to both discrete- and continuous-time models. The theory is illustrated with examples and both biological and mathematical conclusions are drawn. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
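The phenomenon itself is easy to exhibit numerically. Below, two stylized two-stage patches are each sinks (spectral radius below one), yet coupling them by dispersal yields growth; the matrices and the dispersal scheme are invented for illustration and are not the paper's examples.

```python
import numpy as np

rho = lambda A: max(abs(np.linalg.eigvals(A)))   # spectral radius

# Two stylized 2-stage sink patches (entries invented): one patch is good
# for reproduction, the other for maturation, but neither persists alone.
A1 = np.array([[0.0, 4.0], [0.1, 0.0]])
A2 = np.array([[0.0, 0.1], [4.0, 0.0]])
d = 0.5                                          # dispersal fraction per step

# Dispersal mixes the patches, then local dynamics act:
#   x1+ = A1((1-d)x1 + d x2),  x2+ = A2(d x1 + (1-d)x2).
M = np.block([[(1 - d) * A1, d * A1],
              [d * A2, (1 - d) * A2]])
```

Each patch alone decays (rho(A1) = rho(A2) ≈ 0.63 < 1), but the coupled system grows: with d = 0.5 the total population follows the averaged matrix (A1 + A2)/2, so rho(M) = 2.05 > 1. This is exactly dispersal driven growth.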
Quantitative metrics for evaluating the phased roll-out of clinical information systems.
Wong, David; Wu, Nicolas; Watkinson, Peter
2017-09-01
We introduce a novel quantitative approach for evaluating the order of roll-out during phased introduction of clinical information systems. Such roll-outs are associated with unavoidable risk due to patients transferring between clinical areas using both the old and new systems. We proposed a simple graphical model of patient flow through a hospital. Using a simple instance of the model, we showed how a roll-out order can be generated by minimising the flow of patients from the new system to the old system. The model was applied to admission and discharge data acquired from 37,080 patient journeys at the Churchill Hospital, Oxford between April 2013 and April 2014. The resulting order was evaluated empirically and found to be acceptable. The development of data-driven approaches to clinical information system roll-out provides insights that may not necessarily be ascertained through clinical judgment alone. Such methods could make a significant contribution to the smooth running of an organisation during the roll-out of a potentially disruptive technology. Unlike previous approaches, which are based on clinical opinion, the approach described here quantitatively assesses the appropriateness of competing roll-out strategies. The data-driven approach was shown to produce strategies that matched clinical intuition and provides a flexible framework that may be used to plan and monitor clinical information system roll-out. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.
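The objective, choose a conversion order minimising flow from new-system to old-system areas, can be sketched by brute force on a tiny hypothetical flow graph. The ward names, transfer counts, and the exact staging cost are assumptions for illustration, not the paper's model or data.

```python
from itertools import permutations

# Hypothetical daily patient-transfer counts between clinical areas.
flow = {
    "ED":   {"ICU": 5, "Ward": 20},
    "ICU":  {"Ward": 8},
    "Ward": {"ICU": 2},
}
wards = list(flow)

def cost(order):
    """Flow from converted (new-system) to unconverted (old-system) areas,
    summed over the intermediate stages of the phased roll-out."""
    total, converted = 0, set()
    for w in order[:-1]:                 # after the final conversion, no mismatch
        converted.add(w)
        total += sum(flow.get(a, {}).get(b, 0)
                     for a in converted for b in wards if b not in converted)
    return total

best = min(permutations(wards), key=cost)
```

Here `best` is `("Ward", "ICU", "ED")` with cost 2: converting downstream areas first means most transfers run old-to-new, which matches the clinical intuition the paper reports.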
Power-Law Modeling of Cancer Cell Fates Driven by Signaling Data to Reveal Drug Effects
Zhang, Fan; Wu, Min; Kwoh, Chee Keong; Zheng, Jie
2016-01-01
Extracellular signals are captured and transmitted by signaling proteins inside a cell. An important type of cellular response to the signals is the cell fate decision, e.g., apoptosis. However, the underlying mechanisms of cell fate regulation are still unclear, thus comprehensive and detailed kinetic models are not yet available. Alternatively, data-driven models are promising to bridge signaling data with the phenotypic measurements of cell fates. The traditional linear model for data-driven modeling of signaling pathways has its limitations because it assumes that a cell fate is proportional to the activities of signaling proteins, which is unlikely in complex biological systems. Therefore, we propose a power-law model to relate the activities of all the measured signaling proteins to the probabilities of cell fates. In our experiments, we compared our nonlinear power-law model with the linear model on three cancer datasets with phosphoproteomics and cell fate measurements, which demonstrated that the nonlinear model has superior performance on cell fate prediction. By in silico simulation of virtual protein knock-down, the proposed model is able to reveal drug effects which can complement traditional approaches such as binding affinity analysis. Moreover, our model is able to capture cell line specific information to distinguish one cell line from another in cell fate prediction. Our results show that the power-law data-driven model is able to perform better in cell fate prediction and provide more insights into the signaling pathways for cancer cell fates than the linear model. PMID:27764199
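A power-law model of the kind described, fate probability proportional to a product of protein activities raised to exponents, becomes linear after taking logarithms, so it can be fit by ordinary least squares. The activities, exponents and constant below are synthetic so that the recovery can be checked exactly; this is a sketch of the model class, not the paper's estimation procedure.

```python
import numpy as np

rng = np.random.default_rng(4)
n, p = 200, 3
act = rng.uniform(0.5, 2.0, (n, p))            # measured protein activities
w_true = np.array([1.5, -0.7, 0.3])            # invented power-law exponents
fate = 0.1 * np.prod(act ** w_true, axis=1)    # fate probability (noiseless)

# Taking logs turns the power law into a linear model: log y = (log a) w + log c.
X = np.column_stack([np.log(act), np.ones(n)])
coef, *_ = np.linalg.lstsq(X, np.log(fate), rcond=None)
```

With noiseless synthetic data the fit recovers `w_true` and the constant exactly; the nonlinearity in the activities is what lets such a model outperform a plain linear one.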
NASA Astrophysics Data System (ADS)
Chen, Jingbo; Wang, Chengyi; Yue, Anzhi; Chen, Jiansheng; He, Dongxu; Zhang, Xiuyan
2017-10-01
The tremendous success of deep learning models such as convolutional neural networks (CNNs) in computer vision provides a method for similar problems in the field of remote sensing. Although research on repurposing pretrained CNNs for remote sensing tasks is emerging, the scarcity of labeled samples and the complexity of remote sensing imagery still pose challenges. We developed a knowledge-guided golf course detection approach using a CNN fine-tuned on temporally augmented data. The proposed approach is a combination of knowledge-driven region proposal, data-driven detection based on CNN, and knowledge-driven postprocessing. To confront data complexity, knowledge-derived co-occurrence, composition, and area-based rules are applied sequentially to propose candidate golf regions. To confront sample scarcity, we employed data augmentation in the temporal domain, which extracts samples from multitemporal images. The augmented samples were then used to fine-tune a pretrained CNN for golf detection. Finally, commission error was further suppressed by postprocessing. Experiments conducted on GF-1 imagery prove the effectiveness of the proposed approach.
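The knowledge-driven region-proposal stage amounts to applying domain rules before any CNN runs. The region attributes, thresholds and rule set below are invented stand-ins for the paper's actual co-occurrence, composition and area rules.

```python
# Knowledge-driven screening of candidate regions before the CNN stage;
# attributes and thresholds are hypothetical, for illustration only.
candidates = [
    {"id": 1, "area_ha": 45,  "veg_frac": 0.70, "water_frac": 0.05},
    {"id": 2, "area_ha": 800, "veg_frac": 0.90, "water_frac": 0.00},  # too large
    {"id": 3, "area_ha": 60,  "veg_frac": 0.20, "water_frac": 0.30},  # too little turf
]

def plausible_golf(c):
    """Area, composition and co-occurrence rules a golf course should satisfy."""
    return (20 <= c["area_ha"] <= 120
            and c["veg_frac"] >= 0.5
            and c["water_frac"] >= 0.02)

proposals = [c["id"] for c in candidates if plausible_golf(c)]
```

Only region 1 survives the rules and would be passed to the fine-tuned CNN, which is how the knowledge-driven stage reduces both data complexity and false candidates.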
A data-driven emulation framework for representing water-food nexus in a changing cold region
NASA Astrophysics Data System (ADS)
Nazemi, A.; Zandmoghaddam, S.; Hatami, S.
2017-12-01
Water resource systems are under increasing pressure globally. Growing population along with competition between water demands and emerging effects of climate change have caused enormous vulnerabilities in water resource management across many regions. Diagnosing such vulnerabilities and provision of effective adaptation strategies requires the availability of simulation tools that can adequately represent the interactions between competing water demands for limited water resources and inform decision makers about the critical vulnerability thresholds under a range of potential natural and anthropogenic conditions. Despite significant progress in integrated modeling of water resource systems, regional models are often unable to fully represent the dynamics within the key elements of water resource systems locally. Here we propose a data-driven approach to emulate a complex regional water resource system model developed for the Oldman River Basin in southern Alberta, Canada. The aim of the emulation is to provide a detailed understanding of the trade-offs and interactions at the Oldman Reservoir, which is key to flood control and irrigated agriculture in this over-allocated semi-arid cold region. Different surrogate models are developed to represent the dynamics of irrigation demand and withdrawal as well as reservoir evaporation and release individually. The non-falsified offline models are then integrated through the water balance equation at the reservoir location to provide a coupled model for representing the dynamics of reservoir operation and water allocation at the local scale. The performance of the individual and integrated models is rigorously examined and sources of uncertainty are highlighted.
To demonstrate the practical utility of such a surrogate modeling approach, we use the integrated data-driven model to examine the trade-offs in irrigation water supply, reservoir storage and release under a range of changing climate, upstream streamflow and local irrigation conditions.
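The coupling step described in the abstract, in which individually fitted surrogates are tied together through the reservoir water balance, can be sketched in Python. The component callables below are placeholders for the paper's data-driven surrogates; all names and functional forms are illustrative assumptions, not the actual Oldman Basin models.

```python
import numpy as np

def simulate_reservoir(storage0, inflow, demand_model, evap_model, release_model):
    """Step a reservoir water balance forward, coupling surrogate components.

    Each surrogate is a callable of the current storage and time step; in
    the paper these would be the fitted demand/withdrawal, evaporation and
    release models, integrated here via the mass balance
    S[t+1] = S[t] + inflow - withdrawal - evaporation - release.
    """
    n_steps = len(inflow)
    storage = np.empty(n_steps + 1)
    storage[0] = storage0
    for t in range(n_steps):
        withdrawal = demand_model(storage[t], t)   # irrigation withdrawal surrogate
        evap = evap_model(storage[t], t)           # reservoir evaporation surrogate
        release = release_model(storage[t], t)     # operating-rule surrogate
        storage[t + 1] = max(storage[t] + inflow[t] - withdrawal - evap - release, 0.0)
    return storage
```

With simple stand-in surrogates (constant withdrawal and release, evaporation proportional to storage), the coupled trajectory stays physically plausible, which is the kind of sanity check the integrated emulator must pass before trade-off analysis.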
Composable Framework Support for Software-FMEA Through Model Execution
NASA Astrophysics Data System (ADS)
Kocsis, Imre; Patricia, Andras; Brancati, Francesco; Rossi, Francesco
2016-08-01
Performing Failure Modes and Effects Analysis (FMEA) during software architecture design is becoming a basic requirement in an increasing number of domains; however, due to the lack of standardized early design phase model execution, classic SW-FMEA approaches carry significant risks and are human effort-intensive even in processes that use Model-Driven Engineering. Recently, modelling languages with standardized executable semantics have emerged. Building on earlier results, this paper describes framework support for generating executable error propagation models from such models during software architecture design. The approach carries the promise of increased precision, decreased risk and more automated execution for SW-FMEA during dependability-critical system development.
Multidimensional Data Modeling for Business Process Analysis
NASA Astrophysics Data System (ADS)
Mansmann, Svetlana; Neumuth, Thomas; Scholl, Marc H.
The emerging area of business process intelligence attempts to enhance the analytical capabilities of business process management systems by employing data warehousing and mining technologies. This paper presents an approach to re-engineering the business process modeling in conformity with the multidimensional data model. Since the business process and the multidimensional model are driven by rather different objectives and assumptions, there is no straightforward solution to converging these models.
Towards Current Profile Control in ITER: Potential Approaches and Research Needs
NASA Astrophysics Data System (ADS)
Schuster, E.; Barton, J. E.; Wehner, W. P.
2014-10-01
Many challenging plasma control problems still need to be addressed in order for the ITER Plasma Control System (PCS) to be able to successfully achieve the ITER project goals. For instance, setting up a suitable toroidal current density profile is key for one possible advanced scenario characterized by noninductive sustainment of the plasma current and steady-state operation. The nonlinearity and high dimensionality exhibited by the plasma demand a model-based current-profile control synthesis procedure that can accommodate this complexity through embedding the known physics within the design. The development of a model capturing the dynamics of the plasma relevant for control design enables not only the design of feedback controllers for regulation or tracking but also the design of optimal feedforward controllers for a systematic model-based approach to scenario planning, the design of state estimators for a reliable real-time reconstruction of the plasma internal profiles based on limited and noisy diagnostics, and the development of a fast predictive simulation code for closed-loop performance evaluation before implementation. Progress towards control-oriented modeling of the current profile evolution and associated control design has been reported following both data-driven and first-principles-driven approaches. An overview of these two approaches will be provided, as well as a discussion on research needs associated with each one of the model applications described above. Supported by the US Department of Energy under DE-SC0001334 and DE-SC0010661.
Extended-Range Prediction with Low-Dimensional, Stochastic-Dynamic Models: A Data-driven Approach
2012-09-30
characterization of extratropical storms and extremes and link these to LFV modes. Mingfang Ting, Yochanan Kushnir, Andrew W. Robertson... simulating and predicting a wide range of climate phenomena including ENSO, tropical Atlantic sea surface temperatures (SSTs), storm track variability... into empirical prediction models. Use observations to improve low-order dynamical MJO models. Adam Sobel, Daehyun Kim. Extratropical variability
Classroom Strategies Coaching Model: Integration of Formative Assessment and Instructional Coaching
ERIC Educational Resources Information Center
Reddy, Linda A.; Dudek, Christopher M.; Lekwa, Adam
2017-01-01
This article describes the theory, key components, and empirical support for the Classroom Strategies Coaching (CSC) Model, a data-driven coaching approach that systematically integrates data from multiple observations to identify teacher practice needs and goals, design practice plans, and evaluate progress towards goals. The primary aim of the…
Measuring Experiential Avoidance: A Preliminary Test of a Working Model
ERIC Educational Resources Information Center
Hayes, Steven C.; Strosahl, Kirk; Wilson, Kelly G.; Bissett, Richard T.; Pistorello, Jacqueline; Toarmino, Dosheen; Polusny, Melissa A.; Dykstra, Thane A.; Batten, Sonja V.; Bergan, John; Stewart, Sherry H.; Zvolensky, Michael J.; Eifert, Georg H.; Bond, Frank W.; Forsyth, John P.; Karekla, Maria; Mccurry, Susan M.
2004-01-01
The present study describes the development of a short, general measure of experiential avoidance, based on a specific theoretical approach to this process. A theoretically driven iterative exploratory analysis using structural equation modeling on data from a clinical sample yielded a single factor comprising 9 items. A fully confirmatory factor…
A Platform Independent Game Technology Model for Model Driven Serious Games Development
ERIC Educational Resources Information Center
Tang, Stephen; Hanneghan, Martin; Carter, Christopher
2013-01-01
Game-based learning (GBL) combines pedagogy and interactive entertainment to create a virtual learning environment in an effort to motivate and regain the interest of a new generation of "digital native" learners. However, this approach is impeded by the limited availability of suitable "serious" games and high-level design…
Information Technology Management: Course Re-Design Using an Assessment Driven Approach
ERIC Educational Resources Information Center
Schwieger, Dana; Surendran, Ken
2013-01-01
One of the core courses in the IS2010 Model Curriculum Guideline is "IS Strategy, Management and Acquisition" ("ISMA"). The authors redesigned their pre-IS2010 model Information Technology Management (ITM) course to meet the skills development stated in the ISMA course. Since the IT discipline is changing rapidly, the technical…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Henry, Christopher S.; Bernstein, Hans C.; Weisenhorn, Pamela
Metabolic network modeling of microbial communities provides an in-depth understanding of community-wide metabolic and regulatory processes. Compared to single-organism analyses, community metabolic network modeling is more complex because it needs to account for interspecies interactions. To date, most approaches focus on reconstruction of high-quality individual networks so that, when combined, they can predict community behaviors that result from interspecies interactions. However, this conventional method becomes ineffective for communities whose members are not well characterized and cannot be experimentally interrogated in isolation. Here, we tested a new approach that uses community-level data as a critical input for the network reconstruction process. This method focuses on directly predicting interspecies metabolic interactions in a community when axenic information is insufficient. We validated our method through the case study of a bacterial photoautotroph-heterotroph consortium that was used to provide data needed for a community-level metabolic network reconstruction. The resulting simulations provided experimentally validated predictions of how a photoautotrophic cyanobacterium supports the growth of an obligate heterotrophic species by providing organic carbon and nitrogen sources.
Innovation diffusion on time-varying activity driven networks
NASA Astrophysics Data System (ADS)
Rizzo, Alessandro; Porfiri, Maurizio
2016-01-01
Since its introduction in the 1960s, the theory of innovation diffusion has contributed to the advancement of several research fields, such as marketing management and consumer behavior. The 1969 seminal paper by Bass [F.M. Bass, Manag. Sci. 15, 215 (1969)] introduced a model of product growth for consumer durables, which has been extensively used to predict innovation diffusion across a range of applications. Here, we propose a novel approach to study innovation diffusion, where interactions among individuals are mediated by the dynamics of a time-varying network. Our approach is based on the Bass model and overcomes key limitations of previous studies, which assumed timescale separation between the individual dynamics and the evolution of the connectivity patterns. Thus, we do not hypothesize homogeneous mixing among individuals or the existence of a fixed interaction network. We formulate our approach in the framework of activity-driven networks to enable the analysis of the concurrent evolution of the interaction and individual dynamics. Numerical simulations offer a systematic analysis of the model behavior and highlight the role of individual activity on market penetration when targeted advertisement campaigns are designed, or a competition between two different products takes place.
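A minimal simulation in the spirit of this setup might combine a Bass-style innovation probability p and imitation probability q with an instantaneous activity-driven contact network. The sketch below is illustrative only; the paper's actual model, parameters, and analysis differ.

```python
import numpy as np

rng = np.random.default_rng(0)

def bass_on_activity_driven(n=500, m=5, p=0.01, q=0.2, steps=150):
    """Bass-type adoption dynamics on an activity-driven temporal network.

    Each step, every non-adopter may adopt spontaneously with probability p
    (the innovation term). Then each node fires according to its activity
    rate and contacts m random others; adoption spreads through these
    short-lived links with probability q (the imitation term).
    """
    activity = rng.uniform(0.01, 0.2, size=n)   # heterogeneous firing rates
    adopted = np.zeros(n, dtype=bool)
    fraction = []
    for _ in range(steps):
        # innovation: spontaneous adoption by non-adopters
        adopted |= (~adopted) & (rng.random(n) < p)
        # imitation over the instantaneous contact network
        for i in np.flatnonzero(rng.random(n) < activity):
            contacts = rng.choice(n, size=m, replace=False)
            if adopted[i]:
                adopted[contacts[rng.random(m) < q]] = True
            elif adopted[contacts].any() and rng.random() < q:
                adopted[i] = True
        fraction.append(adopted.mean())
    return np.array(fraction)
```

Because links are regenerated every step, there is no fixed network and no homogeneous-mixing assumption; heterogeneity enters through the per-node activity rates, which is the feature the activity-driven framework is designed to capture.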
Integrating geo web services for a user driven exploratory analysis
NASA Astrophysics Data System (ADS)
Moncrieff, Simon; Turdukulov, Ulanbek; Gulland, Elizabeth-Kate
2016-04-01
In data exploration, several online data sources may need to be dynamically aggregated or summarised over a spatial region, time interval, or set of attributes. With respect to thematic data, web services are mainly used to present results, leading to a supplier-driven service model that limits exploration of the data. In this paper we propose a user-need-driven service model based on geo web processing services. The aim of the framework is to provide a method for scalable and interactive access to various geographic data sources on the web. The architecture combines a data query, a processing technique and a visualisation methodology to rapidly integrate and visually summarise properties of a dataset. We illustrate the environment on a health-related use case that derives the Age Standardised Rate, a dynamic index whose computation requires integrating existing interoperable web services of demographic data with standalone non-spatial secure database servers used in health research. Although the example is specific to the health field, the architecture and the proposed approach are relevant and applicable to other fields that require integration and visualisation of geo datasets from various web services, and thus, we believe, the approach is generic.
NASA Astrophysics Data System (ADS)
Weidinger, Simon; Knap, Michael
We study the regimes of heating in the periodically driven O(N) model, which represents a generic model for interacting quantum many-body systems. By computing the absorbed energy with a non-equilibrium Keldysh Green's function approach, we establish three dynamical regimes: at short times a single-particle dominated regime, at intermediate times a stable Floquet prethermal regime in which the system ceases to absorb energy, and at parametrically late times a thermalizing regime. Our simulations suggest that in the thermalizing regime the absorbed energy grows algebraically in time with an exponent that approaches the universal value of 1/2, and is thus significantly slower than linear Joule heating. Our results demonstrate the parametric stability of prethermal states in a generic many-body system driven at frequencies that are comparable to its microscopic scales. This paves the way for realizing exotic quantum phases, such as time crystals or interacting topological phases, in the prethermal regime of interacting Floquet systems. We acknowledge support from the Technical University of Munich - Institute for Advanced Study, funded by the German Excellence Initiative and the European Union FP7 under Grant agreement 291763, and from the DFG Grant No. KN 1254/1-1.
A Machine-Learning-Driven Sky Model.
Satylmys, Pynar; Bashford-Rogers, Thomas; Chalmers, Alan; Debattista, Kurt
2017-01-01
Sky illumination is responsible for much of the lighting in a virtual environment. A machine-learning-based approach can compactly represent sky illumination from both existing analytic sky models and from captured environment maps. The proposed approach can approximate the captured lighting at a significantly reduced memory cost and enable smooth transitions of sky lighting to be created from a small set of environment maps captured at discrete times of day. The authors' results demonstrate accuracy close to the ground truth for both analytical and capture-based methods. The approach has a low runtime overhead, so it can be used as a generic approach for both offline and real-time applications.
NASA Astrophysics Data System (ADS)
Kidon, Lyran; Wilner, Eli Y.; Rabani, Eran
2015-12-01
The generalized quantum master equation provides a powerful tool to describe the dynamics in quantum impurity models driven away from equilibrium. Two complementary approaches, one based on the Nakajima-Zwanzig-Mori time-convolution (TC) formulation and the other on the Tokuyama-Mori time-convolutionless (TCL) formulation, provide a starting point to describe the time-evolution of the reduced density matrix. A key step in both approaches is to obtain the so-called "memory kernel" or "generator," going beyond second- or fourth-order perturbation techniques. While numerically converged techniques are available for the TC memory kernel, the canonical approach to obtain the TCL generator is based on inverting a super-operator in the full Hilbert space, which is difficult to perform; thus, nearly all applications of the TCL approach rely on a perturbative scheme of some sort. Here, the TCL generator is expressed using a reduced system propagator which can be obtained from system observables alone and requires the calculation of super-operators and their inverse in the reduced Hilbert space rather than the full one. This makes the formulation amenable to quantum impurity solvers or to diagrammatic techniques, such as the nonequilibrium Green's function. We implement the TCL approach for the resonant level model driven away from equilibrium and compare the time scales for the decay of the generator with that of the memory kernel in the TC approach. Furthermore, the effects of temperature, source-drain bias, and gate potential on the TCL/TC generators are discussed.
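The central object here can be stated compactly. With the reduced propagator \(\mathcal{U}_S(t)\) defined by \(\rho_S(t)=\mathcal{U}_S(t)\,\rho_S(0)\), the time-convolutionless generator that reproduces this evolution satisfies (a generic TCL identity, consistent with the abstract but not quoted from the paper):

```latex
\frac{d\rho_S(t)}{dt} = \mathcal{G}(t)\,\rho_S(t),
\qquad
\mathcal{G}(t) = \dot{\mathcal{U}}_S(t)\,\mathcal{U}_S^{-1}(t).
```

In this form, constructing \(\mathcal{G}(t)\) requires differentiating and inverting operators only in the reduced Hilbert space, which is why the propagator-based route avoids the full-space super-operator inversion of the canonical TCL approach.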
NASA Astrophysics Data System (ADS)
Nyawira, S. S.; Nabel, J. E. M. S.; Brovkin, V.; Pongratz, J.
2017-08-01
Historical changes in soil carbon associated with land-use change (LUC) result mainly from the changes in the quantity of litter inputs to the soil and the turnover of carbon in soils. We use a factor separation technique to assess how the input-driven and turnover-driven controls, as well as their synergies, have contributed to historical changes in soil carbon associated with LUC. We apply this approach to equilibrium simulations of present-day and pre-industrial land use performed using the dynamic global vegetation model JSBACH. Our results show that both the input-driven and turnover-driven changes generally contribute to a gain in soil carbon in afforested regions and a loss in deforested regions. However, in regions where grasslands have been converted to croplands, we find an input-driven loss that is partly offset by a turnover-driven gain, which stems from a decrease in the fire-related carbon losses. Omitting land management through crop and wood harvest substantially reduces the global losses through the input-driven changes. Our study thus suggests that the dominating control of soil carbon losses is via the input-driven changes, which are more directly accessible to human management than the turnover-driven ones.
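The factor separation used above can be illustrated with the standard two-factor (Stein-Alpert-style) decomposition into pure contributions and a synergy term. The function and variable names below are illustrative, not taken from JSBACH.

```python
def factor_separation(f00, f10, f01, f11):
    """Two-factor separation (after Stein and Alpert).

    f00 is the control simulation; f10 has only factor A switched on
    (here, the input-driven change); f01 only factor B (the turnover-driven
    change); f11 both. Returns the pure contributions of A and B and their
    synergy, which by construction sum to the total change f11 - f00.
    """
    pure_a = f10 - f00
    pure_b = f01 - f00
    synergy = f11 - f10 - f01 + f00
    return pure_a, pure_b, synergy
```

The identity pure_a + pure_b + synergy = f11 - f00 is what lets the total soil carbon change be attributed unambiguously to the input-driven control, the turnover-driven control, and their interaction.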
Previdelli, Ágatha Nogueira; de Andrade, Samantha Caesar; Fisberg, Regina Mara; Marchioni, Dirce Maria
2016-09-23
The use of dietary patterns to assess dietary intake has become increasingly common in nutritional epidemiology studies due to the complexity and multidimensionality of the diet. Currently, two main approaches have been widely used to assess dietary patterns: data-driven and hypothesis-driven analysis. Since the methods explore different angles of dietary intake, using both approaches simultaneously might yield complementary and useful information; thus, we aimed to use both approaches to gain knowledge of adolescents' dietary patterns. Food intake from a cross-sectional survey with 295 adolescents was assessed by 24 h dietary recall (24HR). In the hypothesis-driven analysis, based on the American National Cancer Institute method, the usual intake of Brazilian Healthy Eating Index Revised components was estimated. In the data-driven approach, the usual intake of foods/food groups was estimated by the Multiple Source Method. In the results, the hypothesis-driven analysis showed low scores for the Whole grains, Total vegetables, Total fruit and Whole fruits components, while, in the data-driven analysis, fruits and whole grains did not appear in any pattern. High intakes of sodium, fats and sugars were observed in the hypothesis-driven analysis, with low total scores for the Sodium, Saturated fat and SoFAA (calories from solid fat, alcohol and added sugar) components in agreement, while the data-driven approach showed the intake of several foods/food groups rich in these nutrients, such as butter/margarine, cookies, chocolate powder, whole milk, cheese, processed meat/cold cuts and candies. In this study, using both approaches at the same time provided consistent and complementary information with regard to assessing overall dietary habits, which will be important for driving public health programs and improving their efficiency in monitoring and evaluating the dietary patterns of populations.
Livestock Helminths in a Changing Climate: Approaches and Restrictions to Meaningful Predictions
Fox, Naomi J.; Marion, Glenn; Davidson, Ross S.; White, Piran C. L.; Hutchings, Michael R.
2012-01-01
Simple Summary: Parasitic helminths represent one of the most pervasive challenges to livestock, and their intensity and distribution will be influenced by climate change. There is a need for long-term predictions to identify potential risks and highlight opportunities for control. We explore approaches to modelling future helminth risk to livestock under climate change. One of the limitations to model creation is the lack of purpose-driven data collection. We also conclude that models need to include a broad view of the livestock system to generate meaningful predictions. Abstract: Climate change is a driving force for livestock parasite risk. This is especially true for helminths, including the nematodes Haemonchus contortus, Teladorsagia circumcincta, Nematodirus battus, and the trematode Fasciola hepatica, since survival and development of free-living stages is chiefly affected by temperature and moisture. The paucity of long-term predictions of helminth risk under climate change has driven us to explore optimal modelling approaches and identify current bottlenecks to generating meaningful predictions. We classify approaches as correlative or mechanistic, exploring their strengths and limitations. Climate is one aspect of a complex system and, at the farm level, husbandry has a dominant influence on helminth transmission. Continuing environmental change will necessitate the adoption of mitigation and adaptation strategies in husbandry. Long-term predictive models need to have the architecture to incorporate these changes. Ultimately, an optimal modelling approach is likely to combine mechanistic processes and physiological thresholds with correlative bioclimatic modelling, incorporating changes in livestock husbandry and disease control. Irrespective of approach, the principal limitation to parasite predictions is the availability of active surveillance data and empirical data on physiological responses to climate variables.
By combining improved empirical data and refined models with a broad view of the livestock system, robust projections of helminth risk can be developed. PMID:26486780
Community-Based Participatory Evaluation: The Healthy Start Approach
Braithwaite, Ronald L.; McKenzie, Robetta D.; Pruitt, Vikki; Holden, Kisha B.; Aaron, Katrina; Hollimon, Chavone
2013-01-01
The use of community-based participatory research has gained momentum as a viable approach to academic and community engagement for research over the past 20 years. This article discusses an approach for extending the process with an emphasis on evaluation of a community partnership–driven initiative and thus advances the concept of conducting community-based participatory evaluation (CBPE) through a model used by the Healthy Start project of the Augusta Partnership for Children, Inc., in Augusta, Georgia. Application of the CBPE approach advances the importance of bilateral engagements with consumers and academic evaluators. The CBPE model shows promise as a reliable and credible evaluation approach for community-level assessment of health promotion programs. PMID:22461687
Model-driven discovery of underground metabolic functions in Escherichia coli.
Guzmán, Gabriela I; Utrilla, José; Nurk, Sergey; Brunk, Elizabeth; Monk, Jonathan M; Ebrahim, Ali; Palsson, Bernhard O; Feist, Adam M
2015-01-20
Enzyme promiscuity toward substrates has been discussed in evolutionary terms as providing the flexibility to adapt to novel environments. In the present work, we describe an approach toward exploring such enzyme promiscuity in the space of a metabolic network. This approach leverages genome-scale models, which have been widely used for predicting growth phenotypes in various environments or following a genetic perturbation; however, these predictions occasionally fail. Failed predictions of gene essentiality offer an opportunity for targeting biological discovery, suggesting the presence of unknown underground pathways stemming from enzymatic cross-reactivity. We demonstrate a workflow that couples constraint-based modeling and bioinformatic tools with KO strain analysis and adaptive laboratory evolution for the purpose of predicting promiscuity at the genome scale. Three cases of genes that are incorrectly predicted as essential in Escherichia coli--aspC, argD, and gltA--are examined, and isozyme functions are uncovered for each to a different extent. Seven isozyme functions based on genetic and transcriptional evidence are suggested between the genes aspC and tyrB, argD and astC, gabT and puuE, and gltA and prpC. This study demonstrates how a targeted model-driven approach to discovery can systematically fill knowledge gaps, characterize underground metabolism, and elucidate regulatory mechanisms of adaptation in response to gene KO perturbations.
Combining Model-driven and Schema-based Program Synthesis
NASA Technical Reports Server (NTRS)
Denney, Ewen; Whittle, John
2004-01-01
We describe ongoing work which aims to extend the schema-based program synthesis paradigm with explicit models. In this context, schemas can be considered as model-to-model transformations. The combination of schemas with explicit models offers a number of advantages, namely, that building synthesis systems becomes much easier since the models can be used in verification and in adaptation of the synthesis systems. We illustrate our approach using an example from signal processing.
Extreme learning machine for reduced order modeling of turbulent geophysical flows.
San, Omer; Maulik, Romit
2018-04-01
We investigate the application of artificial neural networks to stabilize proper orthogonal decomposition-based reduced order models for quasistationary geophysical turbulent flows. An extreme learning machine concept is introduced for computing an eddy-viscosity closure dynamically to incorporate the effects of the truncated modes. We consider a four-gyre wind-driven ocean circulation problem as our prototype setting to assess the performance of the proposed data-driven approach. Our framework provides a significant reduction in computational time and effectively retains the dynamics of the full-order model during the forward simulation period beyond the training data set. Furthermore, we show that the method is robust for larger choices of time steps and can be used as an efficient and reliable tool for long time integration of general circulation models.
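The core of an extreme learning machine is a random, untrained hidden layer followed by a single least-squares solve for the output weights. The sketch below is a generic ELM regressor under those assumptions, not the paper's eddy-viscosity closure model.

```python
import numpy as np

rng = np.random.default_rng(1)

def train_elm(X, y, hidden=50):
    """Train an extreme learning machine: random fixed hidden weights,
    then one linear least-squares solve (no backpropagation)."""
    W = rng.normal(size=(X.shape[1], hidden))   # random input weights (fixed)
    b = rng.normal(size=hidden)                 # random biases (fixed)
    H = np.tanh(X @ W + b)                      # random nonlinear feature map
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)
    return W, b, beta

def predict_elm(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta
```

Because training reduces to a single linear solve, the closure can be refit or evaluated cheaply inside a forward simulation, which is the property that makes the approach attractive for long time integration.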
Data-driven outbreak forecasting with a simple nonlinear growth model
Lega, Joceline; Brown, Heidi E.
2016-01-01
Recent events have thrown the spotlight on infectious disease outbreak response. We developed a data-driven method, EpiGro, which can be applied to cumulative case reports to estimate the order of magnitude of the duration, peak and ultimate size of an ongoing outbreak. It is based on a surprisingly simple mathematical property of many epidemiological data sets, does not require knowledge or estimation of disease transmission parameters, is robust to noise and to small data sets, and runs quickly due to its mathematical simplicity. Using data from historic and ongoing epidemics, we present the model. We also provide modeling considerations that justify this approach and discuss its limitations. In the absence of other information or in conjunction with other models, EpiGro may be useful to public health responders. PMID:27770752
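One simple mathematical property of logistic-type outbreaks is that daily incidence plotted against cumulative cases is a downward parabola through the origin, so a quadratic least-squares fit yields the growth rate and final size. The sketch below is in the spirit of EpiGro; the published method differs in detail.

```python
import numpy as np

def fit_outbreak(cumulative):
    """Estimate outbreak growth rate r and final size K from cumulative
    case counts, using the logistic property dC/dt = r*C*(1 - C/K):
    incidence ~ a*C + b*C^2 with r = a and K = -a/b."""
    incidence = np.diff(cumulative)
    C = cumulative[:-1]
    A = np.column_stack([C, C ** 2])            # no intercept: curve passes through 0
    (a, b), *_ = np.linalg.lstsq(A, incidence, rcond=None)
    return a, -a / b
```

Note that the fit uses only the case-report time series itself; no transmission parameters need to be known or estimated, which mirrors the stated advantage of the approach.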
Reconstruction of fire regimes through integrated paleoecological proxy data and ecological modeling
Iglesias, Virginia; Yospin, Gabriel I.; Whitlock, Cathy
2015-01-01
Fire is a key ecological process affecting vegetation dynamics and land cover. The characteristic frequency, size, and intensity of fire are driven by interactions between top-down climate-driven and bottom-up fuel-related processes. Disentangling climatic from non-climatic drivers of past fire regimes is a grand challenge in Earth systems science, and a topic where both paleoecology and ecological modeling have made substantial contributions. In this manuscript, we (1) review the use of sedimentary charcoal as a fire proxy and the methods used in charcoal-based fire history reconstructions; (2) identify existing techniques for paleoecological modeling; and (3) evaluate opportunities for coupling of paleoecological and ecological modeling approaches to better understand the causes and consequences of past, present, and future fire activity. PMID:25657652
Zhang, Huaguang; Cui, Lili; Zhang, Xin; Luo, Yanhong
2011-12-01
In this paper, a novel data-driven robust approximate optimal tracking control scheme is proposed for unknown general nonlinear systems by using the adaptive dynamic programming (ADP) method. In the design of the controller, only available input-output data is required instead of known system dynamics. A data-driven model is established by a recurrent neural network (NN) to reconstruct the unknown system dynamics using available input-output data. By adding a novel adjustable term related to the modeling error, the resultant modeling error is first guaranteed to converge to zero. Then, based on the obtained data-driven model, the ADP method is utilized to design the approximate optimal tracking controller, which consists of the steady-state controller and the optimal feedback controller. Further, a robustifying term is developed to compensate for the NN approximation errors introduced by implementing the ADP method. Based on a Lyapunov approach, stability analysis of the closed-loop system is performed to show that the proposed controller guarantees the system state asymptotically tracking the desired trajectory. Additionally, the obtained control input is proven to be close to the optimal control input within a small bound. Finally, two numerical examples are used to demonstrate the effectiveness of the proposed control scheme.
NASA Astrophysics Data System (ADS)
Yao, Bing; Yang, Hui
2016-12-01
This paper presents a novel physics-driven spatiotemporal regularization (STRE) method for high-dimensional predictive modeling in complex healthcare systems. This model not only captures the physics-based interrelationship between time-varying explanatory and response variables that are distributed in the space, but also addresses the spatial and temporal regularizations to improve the prediction performance. The STRE model is implemented to predict the time-varying distribution of electric potentials on the heart surface based on the electrocardiogram (ECG) data from the distributed sensor network placed on the body surface. The model performance is evaluated and validated in both a simulated two-sphere geometry and a realistic torso-heart geometry. Experimental results show that the STRE model significantly outperforms other regularization models that are widely used in current practice such as Tikhonov zero-order, Tikhonov first-order and L1 first-order regularization methods.
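For reference, the Tikhonov zero-order baseline mentioned above solves a ridge-regularized least-squares problem; a minimal sketch under that standard formulation (not the STRE model itself):

```python
import numpy as np

def tikhonov_solve(A, b, lam):
    """Zero-order Tikhonov regularization: minimize ||Ax - b||^2 + lam*||x||^2
    by solving the regularized normal equations (A^T A + lam*I) x = A^T b."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)
```

The regularization parameter lam trades data fidelity against solution norm; STRE extends this idea by also penalizing spatial and temporal roughness of the reconstructed potentials.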
Exploring galaxy evolution with latent space walks
NASA Astrophysics Data System (ADS)
Schawinski, Kevin; Turp, Dennis; Zhang, Ce
2018-01-01
We present a new approach that uses artificial intelligence to build data-driven forward models of astrophysical phenomena. We describe how a variational autoencoder can be used to encode galaxies into a latent space, independently manipulate properties such as the specific star formation rate, and return them to real space. Such transformations can be used to forward-model phenomena using data as the only constraint. We demonstrate the utility of this approach on the question of the quenching of star formation in galaxies.
On the Stability of Jump-Linear Systems Driven by Finite-State Machines with Markovian Inputs
NASA Technical Reports Server (NTRS)
Patilkulkarni, Sudarshan; Herencia-Zapana, Heber; Gray, W. Steven; Gonzalez, Oscar R.
2004-01-01
This paper presents two mean-square stability tests for a jump-linear system driven by a finite-state machine with a first-order Markovian input process. The first test is based on conventional Markov jump-linear theory and avoids the use of any higher-order statistics. The second test is developed directly using the higher-order statistics of the machine's output process. The two approaches are illustrated with a simple model for a recoverable computer control system.
A survey of the three-dimensional high Reynolds number transonic wind tunnel
NASA Technical Reports Server (NTRS)
Takashima, K.; Sawada, H.; Aoki, T.
1982-01-01
The facilities for aerodynamic testing of airplane models at transonic speeds and high Reynolds numbers are surveyed. The need for high Reynolds number testing is reviewed using some experimental results. Approaches to high Reynolds number testing such as the cryogenic wind tunnel, the induction-driven wind tunnel, the Ludwieg tube, the Evans clean tunnel and the hydraulically driven wind tunnel are described. The level of development of high Reynolds number testing facilities in Japan is discussed.
A data-driven approach for retrieving temperatures and abundances in brown dwarf atmospheres
DOE Office of Scientific and Technical Information (OSTI.GOV)
Line, Michael R.; Fortney, Jonathan J.; Marley, Mark S.
2014-09-20
Brown dwarf spectra contain a wealth of information about their molecular abundances, temperature structure, and gravity. We present a new data-driven retrieval approach, previously used in planetary atmosphere studies, to extract the molecular abundances and temperature structure from brown dwarf spectra. The approach makes few a priori physical assumptions about the state of the atmosphere. The feasibility of the approach is first demonstrated on a synthetic brown dwarf spectrum. Given typical spectral resolutions, wavelength coverage, and noise, precisions of tens of percent can be obtained for the molecular abundances and tens to hundreds of K on the temperature profile. The technique is then applied to the well-studied brown dwarf Gl 570D. From this spectral retrieval, the spectroscopic radius is constrained to be 0.75-0.83 R_J, log(g) to be 5.13-5.46, and T_eff to be between 804 and 849 K. Estimates for the range of abundances and allowed temperature profiles are also derived. The results from our retrieval approach are in agreement with the self-consistent grid modeling results of Saumon et al. This new approach will allow us to address issues of compositional differences between brown dwarfs and possibly their formation environments, disequilibrium chemistry, and missing physics in current grid modeling approaches, as well as many other issues.
Lin, Chih-Tin; Meyhofer, Edgar; Kurabayashi, Katsuo
2010-01-01
Directional control of microtubule shuttles via microfabricated tracks is key to the development of controlled nanoscale mass transport by kinesin motor molecules. Here we develop and test a model to quantitatively predict the stochastic behavior of microtubule guiding when they mechanically collide with the sidewalls of lithographically patterned tracks. By taking into account appropriate probability distributions of microscopic states of the microtubule system, the model allows us to theoretically analyze the roles of collision conditions and kinesin surface densities in determining how the motion of microtubule shuttles is controlled. In addition, we experimentally observe the statistics of microtubule collision events and compare our theoretical prediction with experimental data to validate our model. The model will direct the design of future hybrid nanotechnology devices that integrate nanoscale transport systems powered by kinesin-driven molecular shuttles.
NASA Astrophysics Data System (ADS)
Sboev, A.; Moloshnikov, I.; Gudovskikh, D.; Rybka, R.
2017-12-01
In this work we compare several data-driven approaches to the task of identifying an author's gender in texts with or without gender imitation. The data corpus was specially gathered through crowdsourcing for this task. The best models are a convolutional neural network with morphological input data (F1-measure: 88% ± 3) for texts without imitation, and a gradient boosting model with a vector of character n-gram frequencies as input (F1-measure: 64% ± 3) for texts with gender imitation. A method for filtering the crowdsourced corpus with a limited reference sample of texts to increase the accuracy of the results is also discussed.
FIELD-DRIVEN APPROACHES TO SUBSURFACE CONTAMINANT TRANSPORT MODELING.
Observations from field sites provide a means for prioritizing research activities. In the case of petroleum releases, observations may include spiking of concentration distributions that may be related to water table fluctuation, co-location of contaminant plumes with geochemi...
Exploring business process modelling paradigms and design-time to run-time transitions
NASA Astrophysics Data System (ADS)
Caron, Filip; Vanthienen, Jan
2016-09-01
The business process management literature describes a multitude of approaches (e.g. imperative, declarative or event-driven) that each result in a different mix of process flexibility, compliance, effectiveness and efficiency. Although the use of a single approach over the process lifecycle is often assumed, transitions between approaches at different phases in the process lifecycle may also be considered. This article explores several business process strategies by analysing the approaches at different phases in the process lifecycle as well as the various transitions.
ERIC Educational Resources Information Center
Wolf, Peter
2007-01-01
In the fall of 2003, Teaching Support Services (TSS), a department at the University of Guelph, was approached by a faculty member in the department of food sciences. Professor Art Hill was interested in seeking support in systematically assessing the department's undergraduate curriculum and using that assessment to trigger further improvement of…
ERIC Educational Resources Information Center
Bowen, Natasha K.; Powers, Joelle D.
2011-01-01
Evidence-based practice and data-driven decision making (DDDM) are two approaches to accountability that have been promoted in the school literature. In spite of the push to promote these approaches in schools, barriers to their widespread, appropriate, and effective use have limited their impact on practice and student outcomes. This article…
Theoretical Framework for Interaction Game Design
2016-05-19
modeling. We take a data-driven quantitative approach to understand conversational behaviors by measuring conversational behaviors using advanced sensing...current state of the art, human computing is considered to be a reasonable approach to break through the current limitation. To solicit high quality and...proper resources in conversation to enable smooth and effective interaction. The last technique is about conversation measurement, analysis, and
Wall Driven Cavity Approach to Slug Flow Modeling In a Micro channel
NASA Astrophysics Data System (ADS)
Sahu, Avinash; Kulkarni, Shekhar; Pushpavanam, Subramaniam; Pushpavanam Research League Team, Prof.
2014-03-01
Slug flow is a commonly observed stable regime and occurs at relatively low flow rates of the fluids. The wettability of the channel determines the continuous and dispersed phases. In these biphasic flows, the fluid-fluid interface acts as a barrier that prohibits species movement across the interface. The flow inside a slug is qualitatively similar to the well-known shallow cavity flow: in shallow cavities the flow mimics the ``fully developed'' internal circulation of slug flows. Another approach to slug flow modeling works in a moving reference frame. Here the wall boundary moves in the direction opposite to that of the flow and hence induces circulation within the phases, analogous to the well-known lid-driven cavity. The two parallel walls are moved in opposite directions, generating circulation patterns equivalent to those regularly observed in slug flow in micro channels. A fourth-order stream function equation is solved using a finite difference approach. The flow field obtained using the two approaches will be used to analyze the effect on mass transfer and chemical reactions in the micro channel. The internal circulations and the performance of these systems will be validated experimentally.
Evaluating Model-Driven Development for large-scale EHRs through the openEHR approach.
Christensen, Bente; Ellingsen, Gunnar
2016-05-01
In healthcare, the openEHR standard is a promising Model-Driven Development (MDD) approach for electronic healthcare records. This paper aims to identify key socio-technical challenges when the openEHR approach is put to use in Norwegian hospitals. More specifically, key fundamental assumptions are investigated empirically. These assumptions promise a clear separation of technical and domain concerns, users being in control of the modelling process, and widespread user commitment. Finally, these assumptions promise an easy way to model and map complex organizations. This longitudinal case study is based on an interpretive approach, whereby data were gathered through 440h of participant observation, 22 semi-structured interviews and extensive document studies over 4 years. The separation of clinical and technical concerns seemed to be aspirational, because both designing the technical system and modelling the domain required technical and clinical competence. Hence developers and clinicians found themselves working together in both arenas. User control and user commitment seemed not to apply in large-scale projects, as modelling the domain turned out to be too complicated and hence to appeal only to especially interested users worldwide, not the local end-users. Modelling proved to be a complex standardization process that shaped both the actual modelling and healthcare practice itself. A broad assemblage of contributors seems to be needed for developing an archetype-based system, in which roles, responsibilities and contributions cannot be clearly defined and delimited. The way MDD occurs has implications for medical practice per se in the form of the need to standardize practices to ensure that medical concepts are uniform across practices. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Schellart, Wouter P.; Strak, Vincent
2016-10-01
We present a review of the analogue modelling method, which has been used for 200 years, and continues to be used, to investigate geological phenomena and geodynamic processes. We particularly focus on the following four components: (1) the different fundamental modelling approaches that exist in analogue modelling; (2) the scaling theory and scaling of topography; (3) the different materials and rheologies that are used to simulate the complex behaviour of rocks; and (4) a range of recording techniques that are used for qualitative and quantitative analyses and interpretations of analogue models. Furthermore, we apply these four components to laboratory-based subduction models and describe some of the issues at hand with modelling such systems. Over the last 200 years, a wide variety of analogue materials have been used with different rheologies, including viscous materials (e.g. syrups, silicones, water), brittle materials (e.g. granular materials such as sand, microspheres and sugar), plastic materials (e.g. plasticine), visco-plastic materials (e.g. paraffin, waxes, petrolatum) and visco-elasto-plastic materials (e.g. hydrocarbon compounds and gelatins). These materials have been used in many different set-ups to study processes from the microscale, such as porphyroclast rotation, to the mantle scale, such as subduction and mantle convection. Despite the wide variety of modelling materials and great diversity in model set-ups and processes investigated, all laboratory experiments can be classified into one of three different categories based on three fundamental modelling approaches that have been used in analogue modelling: (1) The external approach, (2) the combined (external + internal) approach, and (3) the internal approach. In the external approach and combined approach, energy is added to the experimental system through the external application of a velocity, temperature gradient or a material influx (or a combination thereof), and so the system is open. 
In the external approach, all deformation in the system is driven by the externally imposed condition, while in the combined approach, part of the deformation is driven by buoyancy forces internal to the system. In the internal approach, all deformation is driven by buoyancy forces internal to the system and so the system is closed and no energy is added during an experimental run. In the combined approach, the externally imposed force or added energy is generally not quantified nor compared to the internal buoyancy force or potential energy of the system, and so it is not known if these experiments are properly scaled with respect to nature. The scaling theory requires that analogue models are geometrically, kinematically and dynamically similar to the natural prototype. Direct scaling of topography in laboratory models indicates that it is often significantly exaggerated. This can be ascribed to (1) The lack of isostatic compensation, which causes topography to be too high. (2) The lack of erosion, which causes topography to be too high. (3) The incorrect scaling of topography when density contrasts are scaled (rather than densities); In isostatically supported models, scaling of density contrasts requires an adjustment of the scaled topography by applying a topographic correction factor. (4) The incorrect scaling of externally imposed boundary conditions in isostatically supported experiments using the combined approach; When externally imposed forces are too high, this creates topography that is too high. Other processes that also affect surface topography in laboratory models but not in nature (or only in a negligible way) include surface tension (for models using fluids) and shear zone dilatation (for models using granular material), but these will generally only affect the model surface topography on relatively short horizontal length scales of the order of several mm across material boundaries and shear zones, respectively.
A novel model for simulating the racing effect in capillary-driven underfill process in flip chip
NASA Astrophysics Data System (ADS)
Zhu, Wenhui; Wang, Kanglun; Wang, Yan
2018-04-01
Underfill is typically applied in flip chips to increase the reliability of the electronic packages. In this paper, the evolution of the melt-front shape of the capillary-driven underfill flow is studied through 3D numerical analysis. Two different models, the prevailing surface force model and the capillary model based on the wetted-wall boundary condition, are introduced to test their applicability, with the level set method used to track the interface of the two-phase flow. The comparison between the simulation results and experimental data indicates that the surface force model produces a better prediction of the melt-front shape, especially in the central area of the flip chip. Nevertheless, neither of these models can properly simulate the racing effect phenomenon that appears during underfill encapsulation. A novel ‘dynamic pressure boundary condition’ method is proposed based on the validated surface force model. Utilizing this approach, the racing effect phenomenon is simulated with high precision. In addition, a linear relationship is derived from this model between the flow front location at the edge of the flip chip and the filling time. Using the proposed approach, the impact of the underfill-dispensing length on the melt-front shape is also studied.
Relational machine learning for electronic health record-driven phenotyping.
Peissig, Peggy L; Santos Costa, Vitor; Caldwell, Michael D; Rottscheit, Carla; Berg, Richard L; Mendonca, Eneida A; Page, David
2014-12-01
Electronic health records (EHR) offer medical and pharmacogenomics research unprecedented opportunities to identify and classify patients at risk. EHRs are collections of highly inter-dependent records that include biological, anatomical, physiological, and behavioral observations. They comprise a patient's clinical phenome, where each patient has thousands of date-stamped records distributed across many relational tables. Development of EHR computer-based phenotyping algorithms requires time and medical insight from clinical experts, who most often can only review a small patient subset representative of the total EHR records, to identify phenotype features. In this research we evaluate whether relational machine learning (ML) using inductive logic programming (ILP) can contribute to addressing these issues as a viable approach for EHR-based phenotyping. Two relational learning ILP approaches and three well-known WEKA (Waikato Environment for Knowledge Analysis) implementations of non-relational approaches (PART, J48, and JRIP) were used to develop models for nine phenotypes. International Classification of Diseases, Ninth Revision (ICD-9) coded EHR data were used to select training cohorts for the development of each phenotypic model. Accuracy, precision, recall, F-measure, and area under the receiver operating characteristic (AUROC) curve statistics were measured for each phenotypic model based on independent manually verified test cohorts. A two-sided binomial distribution test (sign test) compared the five ML approaches across phenotypes for statistical significance. We developed an approach to automatically label training examples using ICD-9 diagnosis codes for the ML approaches being evaluated. Nine phenotypic models for each ML approach were evaluated, resulting in better overall model performance in AUROC using ILP when compared to PART (p=0.039), J48 (p=0.003) and JRIP (p=0.003).
ILP has the potential to improve phenotyping by independently delivering clinically expert interpretable rules for phenotype definitions, or intuitive phenotypes to assist experts. Relational learning using ILP offers a viable approach to EHR-driven phenotyping. Copyright © 2014 Elsevier Inc. All rights reserved.
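The two-sided sign test used above to compare the ML approaches across the nine phenotypes can be sketched as an exact binomial computation. The win/loss split below is a hypothetical example, though an 8-of-9 split does yield p ≈ 0.039, consistent with the reported PART comparison.

```python
from math import comb

def sign_test_p(wins, losses):
    """Two-sided exact sign test: probability under H0 (p = 0.5) of a
    win/loss split at least as extreme as the one observed (ties dropped)."""
    n = wins + losses
    k = min(wins, losses)
    p_one_sided = sum(comb(n, i) for i in range(k + 1)) / 2 ** n
    return min(1.0, 2 * p_one_sided)

# hypothetical example: one method wins on 8 of 9 phenotypes
p = sign_test_p(wins=8, losses=1)   # = 2 * (1 + 9) / 2**9 = 0.0390625
```

Because only nine phenotypes are compared, the smallest achievable two-sided p-value is 2/2⁹ ≈ 0.004, which is why the reported p-values cluster near that floor.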
Data-driven train set crash dynamics simulation
NASA Astrophysics Data System (ADS)
Tang, Zhao; Zhu, Yunrui; Nie, Yinyu; Guo, Shihui; Liu, Fengjia; Chang, Jian; Zhang, Jianjun
2017-02-01
Traditional finite element (FE) methods are computationally expensive for simulating train crashes. The high computational cost limits their direct application to investigating the dynamic behaviour of an entire train set for crashworthiness design and structural optimisation. In contrast, multi-body modelling is widely used because of its low computational cost, with a trade-off in accuracy. In this study, a data-driven train crash modelling method is proposed to improve the performance of a multi-body dynamics simulation of a train set crash without increasing the computational burden. This is achieved with the parallel random forest algorithm, a machine learning approach that extracts useful patterns from force-displacement curves and predicts a force-displacement relation for a given collision condition from a collection of offline FE simulation data on various collision conditions, namely different crash velocities in our analysis. Using the FE simulation results as a benchmark, we compared our method with traditional multi-body modelling methods; the results show that our data-driven method improves on the accuracy of traditional multi-body models in train crash simulation while running at the same level of efficiency.
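The idea of learning a force-displacement relation from offline FE runs can be sketched with a small bagged regression-tree ensemble. This is a toy stand-in for the authors' parallel random forest: the force-displacement function, the sampled crash velocities, and all hyperparameters below are invented for illustration.

```python
import random

def sse(rows):
    # sum of squared errors of forces about their mean
    ys = [y for _, y in rows]
    m = sum(ys) / len(ys)
    return sum((y - m) ** 2 for y in ys)

def fit_tree(rows, depth=0, max_depth=5, min_size=3):
    # rows: list of ((velocity, displacement), force); greedy regression tree
    ys = [y for _, y in rows]
    if depth == max_depth or len(rows) <= min_size or max(ys) == min(ys):
        return sum(ys) / len(ys)                 # leaf: mean force
    best = None
    for j in (0, 1):                             # try both features
        for t in sorted({x[j] for x, _ in rows}):
            left = [r for r in rows if r[0][j] <= t]
            right = [r for r in rows if r[0][j] > t]
            if not left or not right:
                continue
            err = sse(left) + sse(right)
            if best is None or err < best[0]:
                best = (err, j, t, left, right)
    if best is None:
        return sum(ys) / len(ys)
    _, j, t, left, right = best
    return (j, t, fit_tree(left, depth + 1), fit_tree(right, depth + 1))

def predict(node, x):
    while isinstance(node, tuple):
        j, t, left, right = node
        node = left if x[j] <= t else right
    return node

# synthetic "offline FE" force-displacement curves at a few crash velocities;
# fe_force is a made-up stand-in for an expensive FE simulation
def fe_force(v, d):
    return (1.0 + 0.3 * v) * d * (2.0 - d)

data = [((v, d), fe_force(v, d))
        for v in (5, 10, 15, 20) for d in [i / 20 for i in range(21)]]

random.seed(0)
forest = [fit_tree(random.choices(data, k=len(data))) for _ in range(20)]
pred = lambda x: sum(predict(t, x) for t in forest) / len(forest)
```

The ensemble can then be queried at an unseen crash velocity (e.g. `pred((12, 0.5))`), which is the role the learned model plays inside the multi-body simulation.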
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou, Ping; Lv, Youbin; Wang, Hong
Optimal operation of a practical blast furnace (BF) ironmaking process depends largely on a good measurement of molten iron quality (MIQ) indices. However, measuring the MIQ online is not feasible using the available techniques. In this paper, a novel data-driven robust modeling approach is proposed for online estimation of MIQ using improved random vector functional-link networks (RVFLNs). Since the output weights of traditional RVFLNs are obtained by the least squares approach, a robustness problem may occur when the training dataset is contaminated with outliers. This affects the modeling accuracy of RVFLNs. To solve this problem, a Cauchy distribution weighted M-estimation based robust RVFLNs is proposed. Since the weights of different outlier data are properly determined by the Cauchy distribution, their corresponding contributions to modeling can be properly distinguished, and thus robust and better modeling results can be achieved. Moreover, given that the BF is a complex nonlinear system with numerous coupled variables, data-driven canonical correlation analysis is employed to identify the most influential components from the multitudinous factors that affect the MIQ indices, so as to reduce the model dimension. Finally, experiments using industrial data and comparative studies have demonstrated that the obtained model produces better modeling and estimation accuracy and stronger robustness than other modeling methods.
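A Cauchy-weighted M-estimation fit of RVFLN output weights can be sketched as iteratively reweighted least squares. Everything below (the toy data, the Cauchy scale constant, the network width) is an illustrative assumption, not the authors' formulation.

```python
import numpy as np

rng = np.random.default_rng(1)

# toy data: one quality index driven by two process variables, with gross outliers
X = rng.uniform(-1, 1, size=(200, 2))
y = 1.5 * X[:, 0] - 0.8 * X[:, 1] + 0.05 * rng.standard_normal(200)
y[:5] += 25.0                        # outliers contaminating the training set

# random vector functional-link net: fixed random hidden layer + direct input links
W_in = rng.standard_normal((2, 30))
b_hid = rng.standard_normal(30)
H = np.hstack([np.tanh(X @ W_in + b_hid), X])

def fit_output_weights(H, y, robust=False, c=1.0, iters=20):
    """Plain least squares, or Cauchy-weighted IRLS when robust=True."""
    w = np.ones(len(y))
    beta = None
    for _ in range(iters if robust else 1):
        Hw = H * w[:, None]          # apply per-sample weights
        beta = np.linalg.solve(Hw.T @ H + 1e-6 * np.eye(H.shape[1]), Hw.T @ y)
        if not robust:
            break
        r = y - H @ beta
        w = 1.0 / (1.0 + (r / c) ** 2)   # Cauchy weights down-weight outliers
    return beta

beta_ls = fit_output_weights(H, y)                 # least squares fit
beta_rob = fit_output_weights(H, y, robust=True)   # Cauchy-weighted M-estimation
```

The Cauchy weight 1/(1 + (r/c)²) shrinks toward zero for large residuals, so the few contaminated samples barely influence the robust output weights.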
Albert, Carlo; Ulzega, Simone; Stoop, Ruedi
2016-04-01
Parameter inference is a fundamental problem in data-driven modeling. Given observed data that is believed to be a realization of some parameterized model, the aim is to find parameter values that are able to explain the observed data. In many situations, the dominant sources of uncertainty must be included into the model for making reliable predictions. This naturally leads to stochastic models. Stochastic models render parameter inference much harder, as the aim then is to find a distribution of likely parameter values. In Bayesian statistics, which is a consistent framework for data-driven learning, this so-called posterior distribution can be used to make probabilistic predictions. We propose a novel, exact, and very efficient approach for generating posterior parameter distributions for stochastic differential equation models calibrated to measured time series. The algorithm is inspired by reinterpreting the posterior distribution as a statistical mechanics partition function of an object akin to a polymer, where the measurements are mapped on heavier beads compared to those of the simulated data. To arrive at distribution samples, we employ a Hamiltonian Monte Carlo approach combined with a multiple time-scale integration. A separation of time scales naturally arises if either the number of measurement points or the number of simulation points becomes large. Furthermore, at least for one-dimensional problems, we can decouple the harmonic modes between measurement points and solve the fastest part of their dynamics analytically. Our approach is applicable to a wide range of inference problems and is highly parallelizable.
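The core Hamiltonian Monte Carlo loop (leapfrog integration plus a Metropolis correction) can be sketched on a one-dimensional toy posterior. This omits the paper's polymer reinterpretation and multiple-time-scale integration; the step size, trajectory length, and standard-normal target are illustrative choices.

```python
import numpy as np

def hmc_1d(logp, grad, n=2000, eps=0.1, L=20, seed=0):
    """Minimal 1D Hamiltonian Monte Carlo: leapfrog dynamics with a
    Metropolis accept/reject step on the Hamiltonian H = -logp(q) + p^2/2."""
    rng = np.random.default_rng(seed)
    q, out = 0.0, []
    for _ in range(n):
        p = rng.standard_normal()          # resample momentum
        qn, pn = q, p
        pn += 0.5 * eps * grad(qn)         # leapfrog: half momentum step
        for _ in range(L - 1):
            qn += eps * pn
            pn += eps * grad(qn)
        qn += eps * pn
        pn += 0.5 * eps * grad(qn)         # final half momentum step
        # accept with probability min(1, exp(H_old - H_new))
        dH = (-logp(q) + 0.5 * p**2) - (-logp(qn) + 0.5 * pn**2)
        if np.log(rng.uniform()) < dH:
            q = qn
        out.append(q)
    return np.array(out)

# toy target: standard normal posterior, logp(q) = -q^2/2 (up to a constant)
samples = hmc_1d(lambda q: -0.5 * q**2, lambda q: -q)
```

In the paper's setting the position variable is the whole discretized system path (the "polymer"), but the leapfrog/accept structure is the same.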
Recommendations for Model Driven Paradigms for Integrated Approaches to Cyber Defense
2017-03-06
analogy (e.g., Susceptible, Infected, Recovered [SIR]) • Abstract wargaming: game-theoretic model of cyber conflict without modeling the underlying...malware. 3.7 Abstract Wargaming Here, a game-theoretic process is modeled with moves and effects inspired by cyber conflict but without modeling the...underlying processes of cyber attack and defense. Examples in literature include the following: • Cho J-H, Gao J. Cyber war game in temporal networks
NASA Technical Reports Server (NTRS)
Roberts, Christopher J.; Morgenstern, Robert M.; Israel, David J.; Borky, John M.; Bradley, Thomas H.
2017-01-01
NASA's next generation space communications network will involve dynamic and autonomous services analogous to services provided by current terrestrial wireless networks. This architecture concept, known as the Space Mobile Network (SMN), is enabled by several technologies now in development. A pillar of the SMN architecture is the establishment and utilization of a continuous bidirectional control plane space link channel and a new User Initiated Service (UIS) protocol to enable more dynamic and autonomous mission operations concepts, reduced user space communications planning burden, and more efficient and effective provider network resource utilization. This paper provides preliminary results from the application of model driven architecture methodology to develop UIS. Such an approach is necessary to ensure systematic investigation of several open questions concerning the efficiency, robustness, interoperability, scalability and security of the control plane space link and UIS protocol.
Sensor modeling and demonstration of a multi-object spectrometer for performance-driven sensing
NASA Astrophysics Data System (ADS)
Kerekes, John P.; Presnar, Michael D.; Fourspring, Kenneth D.; Ninkov, Zoran; Pogorzala, David R.; Raisanen, Alan D.; Rice, Andrew C.; Vasquez, Juan R.; Patel, Jeffrey P.; MacIntyre, Robert T.; Brown, Scott D.
2009-05-01
A novel multi-object spectrometer (MOS) is being explored for use as an adaptive performance-driven sensor that tracks moving targets. Developed originally for astronomical applications, the instrument utilizes an array of micromirrors to reflect light to a panchromatic imaging array. When an object of interest is detected, the individual micromirrors imaging the object are tilted to reflect the light to a spectrometer and collect a full spectrum. This paper presents example sensor performance from empirical data collected in laboratory experiments, as well as our approach to designing optical and radiometric models of the MOS channels and the micromirror array. Simulation of moving vehicles in a high-fidelity hyperspectral scene is used to generate a dynamic video input for the adaptive sensor. Performance-driven algorithms for feature-aided target tracking and modality selection exploit multiple electromagnetic observables to track moving vehicle targets.
NASA Technical Reports Server (NTRS)
Denney, Ewen W.; Fischer, Bernd
2009-01-01
Model-based development and automated code generation are increasingly used for production code in safety-critical applications, but since code generators are typically not qualified, the generated code must still be fully tested, reviewed, and certified. This is particularly arduous for mathematical and control engineering software, which requires reviewers to trace subtle details of textbook formulas and algorithms to the code, and to match requirements (e.g., physical units or coordinate frames) not represented explicitly in models or code. Both tasks are complicated by the often opaque nature of auto-generated code. We address these problems by developing a verification-driven approach to traceability and documentation. We apply the AUTOCERT verification system to identify and then verify mathematical concepts in the code, based on a mathematical domain theory, and then use these verified traceability links between concepts, code, and verification conditions to construct a natural language report that provides a high-level structured argument explaining why and how the code uses the assumptions and complies with the requirements. We have applied our approach to generate review documents for several sub-systems of NASA's Project Constellation.
NASA Astrophysics Data System (ADS)
Wang, Hai-Feng; Lin, Zhen-Quan; Gao, Yan; Xu, Chao
2009-08-01
We propose a catalytically activated duplication model to mimic the coagulation and duplication of a DNA polymer system under the catalysis of primer RNA. In the model, two aggregates of the same species can coagulate, and a DNA aggregate of any size can yield a new monomer or double itself with the help of RNA aggregates. By employing the mean-field rate equation approach we analytically investigate the evolution behaviour of the system. For the system with catalysis-driven monomer duplication, the aggregate size distribution of DNA polymers a_k(t) always follows a power law in size in the long-time limit; it decreases with time or approaches a time-independent steady-state form when the duplication rate is independent of the size of the mother aggregate, while it increases with time when the duplication rate is proportional to the size of the mother aggregate. For the system with complete catalysis-driven duplication, the aggregate size distribution a_k(t) approaches a generalized or modified scaling form.
Stakeholder-Driven Quality Improvement: A Compelling Force for Clinical Practice Guidelines.
Rosenfeld, Richard M; Wyer, Peter C
2018-01-01
Clinical practice guideline development should be driven by rigorous methodology, but what is less clear is where quality improvement enters the process: should it be a priority-guiding force, or should it enter only after recommendations are formulated? We argue for a stakeholder-driven approach to guideline development, with an overriding goal of quality improvement based on stakeholder perceptions of needs, uncertainties, and knowledge gaps. In contrast, the widely used topic-driven approach, which often makes recommendations based only on randomized controlled trials, is driven by epidemiologic purity and evidence rigor, with quality improvement a downstream consideration. The advantages of a stakeholder-driven versus a topic-driven approach are highlighted by comparisons of guidelines for otitis media with effusion, thyroid nodules, sepsis, and acute bacterial rhinosinusitis. These comparisons show that stakeholder-driven guidelines are more likely to address the quality improvement needs and pressing concerns of clinicians and patients, including understudied populations and patients with multiple chronic conditions. Conversely, a topic-driven approach often addresses "typical" patients, based on research that may not reflect the needs of high-risk groups excluded from studies because of ethical issues or a desire for purity of research design.
Valkenborg, Dirk; Baggerman, Geert; Vanaerschot, Manu; Witters, Erwin; Dujardin, Jean-Claude; Burzykowski, Tomasz; Berg, Maya
2013-01-01
Combining liquid chromatography-mass spectrometry (LC-MS)-based metabolomics experiments that were collected over a long period of time remains problematic due to systematic variability between LC-MS measurements. Until now, most normalization methods for LC-MS data have been model-driven, based on internal standards or intermediate quality control runs, where an external model is extrapolated to the dataset of interest. In the first part of this article, we evaluate several existing data-driven normalization approaches on LC-MS metabolomics experiments, which do not require the use of internal standards. According to variability measures, each normalization method performs relatively well, showing that the use of any normalization method will greatly improve data analysis originating from multiple experimental runs. In the second part, we apply cyclic-Loess normalization to a Leishmania sample. This normalization method allows the removal of systematic variability between two measurement blocks over time while maintaining the differential metabolites. In conclusion, normalization allows for pooling datasets from different measurement blocks over time and increases the statistical power of the analysis, hence paving the way to increase the scale of LC-MS metabolomics experiments. Based on our investigation, we recommend data-driven normalization methods over model-driven normalization methods if only a few internal standards were used. Moreover, data-driven normalization methods are the best option for normalizing datasets from untargeted LC-MS experiments. PMID:23808607
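A single cyclic-Loess-style pass between two measurement blocks can be sketched as follows. This is a simplified illustration, not the authors' pipeline: a running median stands in for the loess smoother, and the drifting toy data are invented.

```python
import numpy as np

def ma_normalize_pair(a, b, window=31):
    """One MA-normalization pass for two log-intensity vectors: estimate a
    smooth trend of M = a - b against A = (a + b)/2 and remove half of it
    from each sample (a running median replaces the usual loess fit)."""
    A = 0.5 * (a + b)
    M = a - b
    order = np.argsort(A)
    M_sorted = M[order]
    trend = np.empty_like(M)
    half = window // 2
    for i in range(len(M)):
        lo, hi = max(0, i - half), min(len(M), i + half + 1)
        trend[order[i]] = np.median(M_sorted[lo:hi])
    return a - 0.5 * trend, b + 0.5 * trend

# toy example: two measurement blocks of the same features, block b drifted
rng = np.random.default_rng(2)
true = rng.normal(10, 2, size=500)                 # log feature intensities
a = true + 0.05 * rng.standard_normal(500)
b = true + 0.5 + 0.1 * (true - 10) + 0.05 * rng.standard_normal(500)
a_n, b_n = ma_normalize_pair(a, b)
```

In a full cyclic-Loess scheme this pairwise correction is repeated over all sample pairs for several cycles; the single pass above already removes most of the intensity-dependent offset between the two blocks.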
NASA Astrophysics Data System (ADS)
Nyawira, Sylvia; Nabel, Julia; Brovkin, Victor; Pongratz, Julia
2017-04-01
Modelling studies estimate a global loss in soil carbon caused by land-use changes (LUCs) over the last century. Although it is known that this loss stems from changes in the quantity of litter inputs from the vegetation to the soil (input-driven) and changes in the turnover of carbon in the soil (turnover-driven) associated with LUC, the individual contributions of these two controls to the total change have not been assessed. Using the dynamic global vegetation model JSBACH, we apply a factor separation approach to isolate the contributions of the input-driven and turnover-driven changes, as well as their synergies, to the total changes in soil carbon from LUC. To assess how land management through crop and wood harvest influences these controls, we compare our results for simulations with and without land management. Our results reveal that for afforested regions both the input-driven and turnover-driven changes generally result in soil carbon gain, whereas deforested regions exhibit a loss. However, in regions where croplands have expanded at the expense of grasslands and pastures, the input-driven changes result in a loss that is partly offset by a gain via the turnover-driven changes. This gain stems from a decrease in fire-related carbon losses when grasslands or pastures are replaced with croplands. Omitting land management reduces the carbon losses in regions where natural vegetation has been converted to croplands and enhances the gain in afforested regions: the global simulated losses are substantially reduced from 54.0 Pg C to 22.0 Pg C, with the input-driven losses decreasing from 54.7 Pg C to 24.9 Pg C. Our study shows that the dominant control on soil carbon losses is the input-driven changes, which are more directly influenced by human management than the turnover-driven ones.
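The factor separation used here (Stein-Alpert style, for two switchable factors) amounts to a few lines of arithmetic: run the model with neither, each, and both factors active, then difference the results. The four values below are invented placeholders loosely inspired by the reported magnitudes, not actual JSBACH output.

```python
def factor_separation(f00, f10, f01, f11):
    """Decompose a total change into two single-factor contributions and
    their synergy. f00: neither factor active; f10: input-driven changes
    only; f01: turnover-driven changes only; f11: both active."""
    input_driven = f10 - f00
    turnover_driven = f01 - f00
    synergy = f11 - f10 - f01 + f00
    total = f11 - f00
    return input_driven, turnover_driven, synergy, total

# Hypothetical soil-carbon changes in Pg C (placeholders, not model output).
inp, turn, syn, total = factor_separation(f00=0.0, f10=-24.9, f01=4.0, f11=-22.0)
```

By construction the decomposition is exact: the two single-factor contributions plus the synergy term always sum to the total change, which is what makes attribution statements like "the dominant control is input-driven" well defined.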
Bayerstadler, Andreas; Benstetter, Franz; Heumann, Christian; Winter, Fabian
2014-09-01
Predictive Modeling (PM) techniques are gaining importance in the worldwide health insurance business. Modern PM methods are used for customer relationship management, risk evaluation, or medical management. This article illustrates a PM approach that enables the economic potential of (cost-)effective disease management programs (DMPs) to be fully exploited through optimized candidate selection, as an example of successful data-driven business management. The approach is based on a Generalized Linear Model (GLM) that is easy for health insurance companies to apply. Using a small portfolio from an emerging country, we show that our GLM approach is stable compared to more sophisticated regression techniques despite the difficult data environment. Additionally, we demonstrate in this setting that our model can compete with the expensive solutions offered by professional PM vendors and outperforms the non-predictive standard approaches for DMP selection commonly used in the market.
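As a sketch of what "optimized candidate selection" can mean in practice, the snippet below ranks members by an expected-savings score and enrolls the top candidates within a program budget. Everything here is invented for illustration (member records, field names, probabilities); in a real application p_impact would come from the fitted GLM on claims history, not be supplied by hand.

```python
def select_dmp_candidates(members, budget):
    """Rank insured members by model-predicted expected savings
    (probability of benefiting from the DMP times expected claims cost)
    and enroll the top ones that fit within the program budget.
    All field names are illustrative assumptions."""
    scored = sorted(
        members,
        key=lambda m: m["p_impact"] * m["expected_cost"],
        reverse=True,
    )
    selected, spent = [], 0.0
    for m in scored:
        if spent + m["program_cost"] <= budget:
            selected.append(m["id"])
            spent += m["program_cost"]
    return selected

members = [
    {"id": "A", "p_impact": 0.6, "expected_cost": 9000.0, "program_cost": 500.0},
    {"id": "B", "p_impact": 0.2, "expected_cost": 20000.0, "program_cost": 500.0},
    {"id": "C", "p_impact": 0.9, "expected_cost": 1000.0, "program_cost": 500.0},
]
chosen = select_dmp_candidates(members, budget=1000.0)
```

The point of the ranking is that a high probability of impact on a cheap member (C) can be worth less than a moderate probability on an expensive one (B), which is why selection on predicted savings can outperform non-predictive enrollment rules.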
Yim, Sunghoon; Jeon, Seokhee; Choi, Seungmoon
2016-01-01
In this paper, we present an extended data-driven haptic rendering method capable of reproducing force responses during pushing and sliding interaction on a large surface area. The main part of the approach is a novel input variable set for the training of an interpolation model, which incorporates the position of a proxy - an imaginary contact point on the undeformed surface. This allows us to estimate friction in both sliding and sticking states in a unified framework. Estimating the proxy position is done in real-time based on simulation using a sliding yield surface - a surface defining a border between the sliding and sticking regions in the external force space. During modeling, the sliding yield surface is first identified via an automated palpation procedure. Then, through manual palpation on a target surface, input data and resultant force data are acquired. The data are used to build a radial basis interpolation model. During rendering, this input-output mapping interpolation model is used to estimate force responses in real-time in accordance with the interaction input. Physical performance evaluation demonstrates that our approach achieves reasonably high estimation accuracy. A user study also shows plausible perceptual realism under diverse and extensive exploration.
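The core modeling device above, radial basis interpolation of an input-output mapping, can be sketched in one dimension as follows. This is a toy Gaussian-RBF interpolant with a hand-rolled dense solver, not the authors' multi-dimensional force model; the sample points are invented.

```python
import math

def rbf_fit(points, values, eps=1.0):
    """Fit a Gaussian radial-basis interpolant
    f(x) = sum_i w_i * exp(-(eps * |x - x_i|)^2)
    through the training samples by solving the dense linear system with
    Gaussian elimination and partial pivoting."""
    n = len(points)
    A = [[math.exp(-(eps * abs(points[i] - points[j])) ** 2) for j in range(n)]
         for i in range(n)]
    b = list(values)
    for col in range(n):                       # forward elimination
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    w = [0.0] * n                              # back substitution
    for r in range(n - 1, -1, -1):
        w[r] = (b[r] - sum(A[r][c] * w[c] for c in range(r + 1, n))) / A[r][r]
    return lambda x: sum(wi * math.exp(-(eps * abs(x - xi)) ** 2)
                         for wi, xi in zip(w, points))

# Invented 1D training data; the interpolant reproduces it exactly.
interp = rbf_fit([0.0, 1.0, 2.0], [0.0, 1.0, 4.0])
```

The defining property exploited in data-driven haptics is visible even in this toy: the fitted model passes through every recorded input-output sample and smoothly interpolates between them at render time.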
Data to Decisions: Creating a Culture of Model-Driven Drug Discovery.
Brown, Frank K; Kopti, Farida; Chang, Charlie Zhenyu; Johnson, Scott A; Glick, Meir; Waller, Chris L
2017-09-01
Merck & Co., Inc., Kenilworth, NJ, USA, is undergoing a transformation in the way that it prosecutes R&D programs. Through the adoption of a "model-driven" culture, enhanced R&D productivity is anticipated, both in the form of decreased attrition at each stage of the process and by providing a rational framework for understanding and learning from the data generated along the way. This new approach focuses on the concept of a "Design Cycle" that makes use of all the data possible, internally and externally, to drive decision-making. These data can take the form of bioactivity, 3D structures, genomics, pathway, PK/PD, safety data, etc. Synthesis of high-quality data into models utilizing both well-established and cutting-edge methods has been shown to yield high confidence predictions to prioritize decision-making and efficiently reposition resources within R&D. The goal is to design an adaptive research operating plan that uses both modeled data and experiments, rather than just testing, to drive project decision-making. To support this emerging culture, an ambitious information management (IT) program has been initiated to implement a harmonized platform to facilitate the construction of cross-domain workflows to enable data-driven decision-making and the construction and validation of predictive models. These goals are achieved through depositing model-ready data, agile persona-driven access to data, a unified cross-domain predictive model lifecycle management platform, and support for flexible scientist-developed workflows that simplify data manipulation and consume model services. The end-to-end nature of the platform, in turn, not only supports but also drives the culture change by enabling scientists to apply predictive sciences throughout their work and over the lifetime of a project. 
This shift in mindset for both scientists and IT was driven by an early impactful demonstration of the potential benefits of the platform, in which expert-level early discovery predictive models were made available from familiar desktop tools, such as ChemDraw. This was built using a workflow-driven service-oriented architecture (SOA) on top of the rigorous registration of all underlying model entities.
A New Definition of Models and Modeling in Chemistry's Teaching
ERIC Educational Resources Information Center
Chamizo, José A.
2013-01-01
The synthesis of new chemical compounds makes chemistry the most productive science. Unfortunately, chemistry education practice has not been driven to any great extent by research findings, philosophical positions, or advances in new ways of approaching knowledge. The changes that have occurred in textbooks during the past three decades do not show any…
USDA-ARS?s Scientific Manuscript database
Information embodied in ecological site descriptions and their state-and-transition models is crucial to effective land management, and as such is needed now. There is not time (or money) to employ a traditional research-based approach (i.e., inductive/deductive, hypothesis driven inference) to addr...
Magidson, Jessica F; Roberts, Brent W; Collado-Rodriguez, Anahi; Lejuez, C W
2014-05-01
Considerable evidence suggests that personality traits may be changeable, raising the possibility that the personality traits most linked to health problems can be modified with intervention. A growing body of research suggests that problematic personality traits may be altered with behavioral intervention using a bottom-up approach: that is, by targeting core behaviors that underlie personality traits, with the goal of engendering new, healthier patterns of behavior that, over time, become automatized and manifest as changes in personality traits. Nevertheless, a bottom-up model for changing personality traits is somewhat diffuse and requires clearer integration of theory and relevant interventions to enable real clinical application. As such, this article proposes a set of guiding principles for theory-driven modification of targeted personality traits using a bottom-up approach, focusing specifically on targeting the trait of conscientiousness using a relevant behavioral intervention, Behavioral Activation (BA), considered within the motivational framework of expectancy value theory (EVT). We conclude with a real case example of the application of BA to alter behaviors counter to conscientiousness in a substance-dependent patient, highlighting the EVT principles most relevant to the approach and the importance and viability of a theoretically driven, bottom-up approach to changing personality traits.
As above, so below? Towards understanding inverse models in BCI
NASA Astrophysics Data System (ADS)
Lindgren, Jussi T.
2018-02-01
Objective. In brain-computer interfaces (BCI), measurements of the user’s brain activity are classified into commands for the computer. With EEG-based BCIs, the origins of the classified phenomena are often considered to be spatially localized in the cortical volume and mixed in the EEG. We investigate if more accurate BCIs can be obtained by reconstructing the source activities in the volume. Approach. We contrast the physiology-driven source reconstruction with data-driven representations obtained by statistical machine learning. We explain these approaches in a common linear dictionary framework and review the different ways to obtain the dictionary parameters. We consider the effect of source reconstruction on some major difficulties in BCI classification, namely information loss, feature selection and nonstationarity of the EEG. Main results. Our analysis suggests that the approaches differ mainly in their parameter estimation. Physiological source reconstruction may thus be expected to improve BCI accuracy if machine learning is not used or where it produces less optimal parameters. We argue that the considered difficulties of surface EEG classification can remain in the reconstructed volume and that data-driven techniques are still necessary. Finally, we provide some suggestions for comparing approaches. Significance. The present work illustrates the relationships between source reconstruction and machine learning-based approaches for EEG data representation. The provided analysis and discussion should help in understanding, applying, comparing and improving such techniques in the future.
Lightning-driven electric and magnetic fields measured in the stratosphere: Implications for sprites
NASA Astrophysics Data System (ADS)
Thomas, Jeremy Norman
A well accepted model for sprite production involves quasi-electrostatic fields (QSF) driven by large positive cloud-to-ground (+CG) strokes that can cause electrical breakdown in the middle atmosphere. A new high voltage, high impedance, double Langmuir probe instrument is designed specifically for measuring these large lightning-driven electric field changes at altitudes above 30 km. This High Voltage (HV) Electric Field Detector measured 200 nearby (<75 km) lightning-driven electric field changes, up to 140 V/m in magnitude, during the Brazil Sprite Balloon Campaign 2002--03. A numerical QSF model is developed and compared to the in situ measurements. It is found that the amplitudes and relaxation times of the electric fields driven by these nearby lightning events generally agree with the numerical QSF model, which suggests that the QSF approach is valid for modeling lightning-driven fields. Using the best fit parameters of this comparison, it is predicted that the electric fields at sprite altitudes (60--90 km) never surpass conventional breakdown in the mesosphere for each of these 200 nearby lightning events. Lightning-driven ELF to VLF (25 Hz--8 kHz) electric field changes were measured for each of the 2467 cloud-to-ground lightning (CGs) detected by the Brazilian Integrated Lightning Network (BIN) at distances of 75--600 km, and magnetic field changes (300 Hz--8 kHz) above the background noise were measured for about 35% (858) of these CGs. ELF pulses that occur 4--12 ms after the retarded time of the lightning sferic, which have been previously attributed to sprites, were found for 1.4% of 934 CGs examined with a strong bias towards +CGs (4.9% or 9/184) compared to -CGs (0.5% or 4/750). These results disagree with results from the Sprites99 Balloon Campaign [Bering et al., 2004b], in which the lightning-driven electric and magnetic field changes were rare, while the CG delayed ELF pulses were frequent. 
The Brazil Campaign results thus suggest that mesospheric currents are likely the result of the QSF driven by large charge moment strokes, which are usually +CG strokes, initiating breakdown in the middle atmosphere.
Biased Competition in Visual Processing Hierarchies: A Learning Approach Using Multiple Cues.
Gepperth, Alexander R T; Rebhan, Sven; Hasler, Stephan; Fritsch, Jannik
2011-03-01
In this contribution, we present a large-scale hierarchical system for object detection fusing bottom-up (signal-driven) processing results with top-down (model or task-driven) attentional modulation. Specifically, we focus on the question of how the autonomous learning of invariant models can be embedded into a performing system and how such models can be used to define object-specific attentional modulation signals. Our system implements bi-directional data flow in a processing hierarchy. The bottom-up data flow proceeds from a preprocessing level to the hypothesis level where object hypotheses created by exhaustive object detection algorithms are represented in a roughly retinotopic way. A competitive selection mechanism is used to determine the most confident hypotheses, which are used on the system level to train multimodal models that link object identity to invariant hypothesis properties. The top-down data flow originates at the system level, where the trained multimodal models are used to obtain space- and feature-based attentional modulation signals, providing biases for the competitive selection process at the hypothesis level. This results in object-specific hypothesis facilitation/suppression in certain image regions which we show to be applicable to different object detection mechanisms. In order to demonstrate the benefits of this approach, we apply the system to the detection of cars in a variety of challenging traffic videos. Evaluating our approach on a publicly available dataset containing approximately 3,500 annotated video images from more than 1 h of driving, we can show strong increases in performance and generalization when compared to object detection in isolation. Furthermore, we compare our results to a late hypothesis rejection approach, showing that early coupling of top-down and bottom-up information is a favorable approach especially when processing resources are constrained.
Model-Driven Useware Engineering
NASA Astrophysics Data System (ADS)
Meixner, Gerrit; Seissler, Marc; Breiner, Kai
User-oriented hardware and software development relies on a systematic development process based on a comprehensive analysis focusing on the users' requirements and preferences. Such a development process calls for the integration of numerous disciplines, from psychology and ergonomics to computer sciences and mechanical engineering. Hence, a correspondingly interdisciplinary team must be equipped with suitable software tools to allow it to handle the complexity of a multimodal and multi-device user interface development approach. An abstract, model-based development approach seems to be adequate for handling this complexity. This approach comprises different levels of abstraction requiring adequate tool support. Thus, in this chapter, we present the current state of our model-based software tool chain. We introduce the use model as the core model of our model-based process, transformation processes, and a model-based architecture, and we present different software tools that provide support for creating and maintaining the models or performing the necessary model transformations.
NASA Astrophysics Data System (ADS)
Grzegożek, W.; Dobaj, K.; Kot, A.
2016-09-01
The paper analyzes the cooperation of a rubber V-belt with the pulleys of a CVT transmission. The forces and torques acting in the CVT transmission were analyzed on the basis of calculated characteristics of the centrifugal regulator and the torque regulator. Accurate estimation of the regulator surface curvature allowed calculation of the relation between the driving-wheel axial force, the engine rotational speed, and the gear ratio of the CVT transmission. Simplified analytical models of rubber V-belt and pulley cooperation follow three basic approaches. The Dittrich model assumes two contact regions on the driven and driving wheels. The Kim-Kim model additionally considers radial friction; the radial friction results in the lack of a developed friction area on the driving pulley. The third approach, formulated in the Cammalleri model, assumes a variable sliding angle along the wrap arc and describes it as a result of the belt's longitudinal and transverse flexibility. Theoretical torque on the driven and driving wheels was calculated on the basis of the known regulator characteristics. The calculated torque was compared to the measured loading torque. The best agreement, over the working range of the centrifugal regulator, was obtained for the Kim-Kim model.
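The classical starting point shared by such belt models is Euler's capstan relation, which for a V-belt uses an effective friction coefficient raised by the wedging action of the groove. The sketch below shows that relation with generic textbook inputs, not data or equations from the paper itself.

```python
import math

def v_belt_tension_ratio(mu, wrap_angle_rad, groove_angle_rad):
    """Euler (capstan) relation for a V-belt at the point of slipping:
    T_tight / T_slack = exp(mu_eff * wrap_angle), where the groove's
    wedging action raises the effective friction to mu / sin(beta / 2).
    Classical textbook relation, not one of the cited models."""
    mu_eff = mu / math.sin(groove_angle_rad / 2.0)
    return math.exp(mu_eff * wrap_angle_rad)

# Generic illustrative values: mu = 0.3, 180-degree wrap, 34-degree groove.
ratio_v = v_belt_tension_ratio(0.3, math.pi, math.radians(34.0))
# Setting the groove angle to 180 degrees (sin(90°) = 1) recovers a flat belt.
ratio_flat = v_belt_tension_ratio(0.3, math.pi, math.pi)
```

The comparison makes the design rationale concrete: for the same material friction and wrap angle, the V-groove multiplies the transmissible tension ratio severalfold over a flat belt, which is why CVT transmissions use V-belts at all.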
Uneri, Ali; Nithiananthan, Sajendra; Schafer, Sebastian; Otake, Yoshito; Stayman, J. Webster; Kleinszig, Gerhard; Sussman, Marc S.; Prince, Jerry L.; Siewerdsen, Jeffrey H.
2013-01-01
Purpose: Surgical resection is the preferred modality for curative treatment of early stage lung cancer, but localization of small tumors (<10 mm diameter) during surgery presents a major challenge that is likely to increase as more early-stage disease is detected incidentally and in low-dose CT screening. To overcome the difficulty of manual localization (fingers inserted through intercostal ports) and the cost, logistics, and morbidity of preoperative tagging (coil or dye placement under CT-fluoroscopy), the authors propose the use of intraoperative cone-beam CT (CBCT) and deformable image registration to guide targeting of small tumors in video-assisted thoracic surgery (VATS). A novel algorithm is reported for registration of the lung from its inflated state (prior to pleural breach) to the deflated state (during resection) to localize surgical targets and adjacent critical anatomy. Methods: The registration approach geometrically resolves images of the inflated and deflated lung using a coarse model-driven stage followed by a finer image-driven stage. The model-driven stage uses image features derived from the lung surfaces and airways: triangular surface meshes are morphed to capture bulk motion; concurrently, the airways generate graph structures from which corresponding nodes are identified. Interpolation of the sparse motion fields computed from the bounding surface and interior airways provides a 3D motion field that coarsely registers the lung and initializes the subsequent image-driven stage. The image-driven stage employs an intensity-corrected, symmetric form of the Demons method. The algorithm was validated over 12 datasets, obtained from porcine specimen experiments emulating CBCT-guided VATS. Geometric accuracy was quantified in terms of target registration error (TRE) in anatomical targets throughout the lung, and normalized cross-correlation. 
Variations of the algorithm were investigated to study the behavior of the model- and image-driven stages by modifying individual algorithmic steps and examining the effect in comparison to the nominal process. Results: The combined model- and image-driven registration process demonstrated accuracy consistent with the requirements of minimally invasive VATS in both target localization (∼3–5 mm within the target wedge) and critical structure avoidance (∼1–2 mm). The model-driven stage initialized the registration to within a median TRE of 1.9 mm (95% confidence interval (CI) maximum = 5.0 mm), while the subsequent image-driven stage yielded higher accuracy localization with 0.6 mm median TRE (95% CI maximum = 4.1 mm). The variations assessing the individual algorithmic steps elucidated the role of each step and in some cases identified opportunities for further simplification and improvement in computational speed. Conclusions: The initial studies show the proposed registration method to successfully register CBCT images of the inflated and deflated lung. Accuracy appears sufficient to localize the target and adjacent critical anatomy within ∼1–2 mm and guide localization under conditions in which the target cannot be discerned directly in CBCT (e.g., subtle, nonsolid tumors). The ability to directly localize tumors in the operating room could provide a valuable addition to the VATS arsenal, obviate the cost, logistics, and morbidity of preoperative tagging, and improve patient safety. Future work includes in vivo testing, optimization of workflow, and integration with a CBCT image guidance system. PMID:23298134
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yan, Wen; Sang, Chaofeng; Wang, Dezhen, E-mail: wangdez@dlut.edu.cn
In this paper, a computational study of two counter-propagating helium plasma jets in ambient air is presented. A two-dimensional fluid model is applied to investigate the physical processes of the two-plasma-jet interaction (PJI) driven by equal and unequal voltages, respectively. In all studied cases, the PJI results in a decrease of the propagation velocity of both plasma bullets. When the two plasma jets are driven by equal voltages, they never merge but rather approach each other around the middle of the gas gap at a minimum approach distance; this minimal distance decreases with increasing applied voltage and initial electron density, but increases with increasing relative permittivity. When the two plasma jets are driven by unequal voltages, we observe that they merge at a position away from the middle of the gas gap. The effect of the applied voltage difference on the PJI is also studied.
Paving the COWpath: data-driven design of pediatric order sets
Zhang, Yiye; Padman, Rema; Levin, James E
2014-01-01
Objective Evidence indicates that users incur significant physical and cognitive costs in the use of order sets, a core feature of computerized provider order entry systems. This paper develops data-driven approaches for automating the construction of order sets that match closely with user preferences and workflow while minimizing physical and cognitive workload. Materials and methods We developed and tested optimization-based models embedded with clustering techniques using physical and cognitive click cost criteria. By judiciously learning from users’ actual actions, our methods identify items for constituting order sets that are relevant according to historical ordering data and grouped on the basis of order similarity and ordering time. We evaluated performance of the methods using 47 099 orders from the year 2011 for asthma, appendectomy and pneumonia management in a pediatric inpatient setting. Results In comparison with existing order sets, those developed using the new approach significantly reduce the physical and cognitive workload associated with usage by 14–52%. This approach is also capable of accommodating variations in clinical conditions that affect order set usage and development. Discussion There is a critical need to investigate the cognitive complexity imposed on users by complex clinical information systems, and to design their features according to ‘human factors’ best practices. Optimizing order set generation using cognitive cost criteria introduces a new approach that can potentially improve ordering efficiency, reduce unintended variations in order placement, and enhance patient safety. Conclusions We demonstrate that data-driven methods offer a promising approach for designing order sets that are generalizable, data-driven, condition-based, and up to date with current best practices. PMID:24674844
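A heavily simplified, frequency-only version of the data-driven idea can be sketched as follows: keep the items most often ordered for a condition, subject to a minimum support threshold. The order names, sessions, and thresholds are invented, and the paper's actual method additionally optimizes over click-cost criteria and clusters items by order similarity and ordering time.

```python
from collections import Counter

def build_order_set(order_histories, top_n=5, min_support=0.3):
    """Construct a candidate order set from historical ordering sessions:
    keep the items most frequently ordered for the condition, requiring a
    minimum fraction of sessions (support). A frequency-based
    simplification of the paper's optimization/clustering formulation."""
    counts = Counter(item for session in order_histories for item in set(session))
    n = len(order_histories)
    frequent = [(item, c) for item, c in counts.most_common() if c / n >= min_support]
    return [item for item, _ in frequent[:top_n]]

# Invented pediatric asthma ordering sessions.
sessions = [
    ["albuterol", "prednisone", "o2_sat_monitoring"],
    ["albuterol", "prednisone"],
    ["albuterol", "ipratropium"],
    ["albuterol", "prednisone", "o2_sat_monitoring"],
]
asthma_set = build_order_set(sessions, top_n=3)
```

Even this crude version illustrates the workload trade-off the paper quantifies: items that nearly every session needs land in the set (cheap single clicks), while rarely used items are excluded rather than cluttering the set.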
Computational neuroscience approach to biomarkers and treatments for mental disorders.
Yahata, Noriaki; Kasai, Kiyoto; Kawato, Mitsuo
2017-04-01
Psychiatry research has long experienced a stagnation stemming from a lack of understanding of the neurobiological underpinnings of phenomenologically defined mental disorders. Recently, the application of computational neuroscience to psychiatry research has shown great promise in establishing a link between phenomenological and pathophysiological aspects of mental disorders, thereby recasting current nosology in more biologically meaningful dimensions. In this review, we highlight recent investigations into computational neuroscience that have undertaken either theory- or data-driven approaches to quantitatively delineate the mechanisms of mental disorders. The theory-driven approach, including reinforcement learning models, plays an integrative role in this process by enabling correspondence between behavior and disorder-specific alterations at multiple levels of brain organization, ranging from molecules to cells to circuits. Previous studies have explicated a plethora of defining symptoms of mental disorders, including anhedonia, inattention, and poor executive function. The data-driven approach, on the other hand, is an emerging field in computational neuroscience seeking to identify disorder-specific features among high-dimensional big data. Remarkably, various machine-learning techniques have been applied to neuroimaging data, and the extracted disorder-specific features have been used for automatic case-control classification. For many disorders, the reported accuracies have reached 90% or more. However, we note that rigorous tests on independent cohorts are critically required to translate this research into clinical applications. Finally, we discuss the utility of the disorder-specific features found by the data-driven approach to psychiatric therapies, including neurofeedback. 
Such developments will allow simultaneous diagnosis and treatment of mental disorders using neuroimaging, thereby establishing 'theranostics' for the first time in clinical psychiatry.
Mao, Ningying; Lesher, Beth; Liu, Qifa; Qin, Lei; Chen, Yixi; Gao, Xin; Earnshaw, Stephanie R; McDade, Cheryl L; Charbonneau, Claudie
2016-01-01
Invasive fungal infections (IFIs) require rapid diagnosis and treatment. A decision-analytic model was used to estimate total costs and survival associated with a diagnostic-driven (DD) or an empiric treatment approach in neutropenic patients with hematological malignancies receiving chemotherapy or autologous/allogeneic stem cell transplants in Shanghai, Beijing, Chengdu, and Guangzhou, the People's Republic of China. Treatment initiation for the empiric approach occurred after clinical suspicion of an IFI; treatment initiation for the DD approach occurred after clinical suspicion and a positive IFI diagnostic test result. Model inputs were obtained from the literature; treatment patterns and resource use were based on clinical opinion. Total costs were lower for the DD versus the empiric approach in Shanghai (¥3,232 vs ¥4,331), Beijing (¥3,894 vs ¥4,864), Chengdu, (¥4,632 vs ¥5,795), and Guangzhou (¥8,489 vs ¥9,795). Antifungal administration was lower using the DD (5.7%) than empiric (9.8%) approach, with similar survival rates. Results from one-way and probabilistic sensitivity analyses were most sensitive to changes in diagnostic test sensitivity and IFI incidence; the DD approach dominated the empiric approach in 88% of scenarios. These results suggest that a DD compared to an empiric treatment approach in the People's Republic of China may be cost saving, with similar overall survival in immunocompromised patients with suspected IFIs.
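The comparison above rests on a standard expected-value calculation over the two treatment strategies. The sketch below shows the skeleton of such a two-arm decision tree with invented inputs; the published model also prices adverse events, survival, and the downstream outcomes of false negatives, none of which are modeled here.

```python
def arm_cost(p_ifi, sens, treat_cost, diag_cost, diagnostic_driven):
    """Expected per-patient cost of one strategy in a two-branch decision
    tree. Empiric: every clinically suspected patient is treated.
    Diagnostic-driven (DD): every suspected patient is tested, and only
    test-positive true IFIs are treated. All inputs are illustrative,
    not the Chinese cost data from the paper."""
    if diagnostic_driven:
        # Pay for the test; treat only the true IFIs the test catches.
        # (A full model would also price false-negative outcomes.)
        return diag_cost + p_ifi * sens * treat_cost
    return treat_cost  # empiric: everyone suspected receives antifungals

kwargs = dict(p_ifi=0.1, sens=0.9, treat_cost=4000.0, diag_cost=300.0)
empiric = arm_cost(diagnostic_driven=False, **kwargs)
dd = arm_cost(diagnostic_driven=True, **kwargs)
```

The structure explains the sensitivity-analysis finding reported above: the DD arm's advantage shrinks as IFI incidence rises or test sensitivity falls, because both push its expected treatment spending toward the empiric arm's.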
Model-free information-theoretic approach to infer leadership in pairs of zebrafish.
Butail, Sachit; Mwaffo, Violet; Porfiri, Maurizio
2016-04-01
Collective behavior affords several advantages to fish in avoiding predators, foraging, mating, and swimming. Although fish schools have been traditionally considered egalitarian superorganisms, a number of empirical observations suggest the emergence of leadership in gregarious groups. Detecting and classifying leader-follower relationships is central to elucidating the behavioral and physiological causes of leadership and understanding its consequences. Here, we demonstrate an information-theoretic approach to infer leadership from positional data of fish swimming. In this framework, we measure social interactions between fish pairs through the mathematical construct of transfer entropy, which quantifies the predictive power of a time series to anticipate another, possibly coupled, time series. We focus on the zebrafish model organism, which is rapidly emerging as a species of choice in preclinical research for its genetic similarity to humans and reduced neurobiological complexity with respect to mammals. To overcome experimental confounds and generate test data sets on which we can thoroughly assess our approach, we adapt and calibrate a data-driven stochastic model of zebrafish motion for the simulation of a coupled dynamical system of zebrafish pairs. In this synthetic data set, the extent and direction of the coupling between the fish are systematically varied across a wide parameter range to demonstrate the accuracy and reliability of transfer entropy in inferring leadership. Our approach is expected to aid in the analysis of collective behavior, providing a data-driven perspective to understand social interactions.
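The transfer-entropy construct used here can be illustrated with a minimal plug-in estimator. This is a generic sketch, not the authors' implementation: positions are discretized into quantile bins, and the entropy of the target's next state conditioned on its own past, with and without the source's past, is compared.

```python
import numpy as np
from collections import Counter

def transfer_entropy(src, dst, bins=4):
    """Plug-in estimate (in nats) of transfer entropy from src to dst:
    how much src's past improves prediction of dst beyond dst's own past."""
    s = np.digitize(src, np.quantile(src, np.linspace(0, 1, bins + 1))[1:-1])
    d = np.digitize(dst, np.quantile(dst, np.linspace(0, 1, bins + 1))[1:-1])
    triples = Counter(zip(d[1:], d[:-1], s[:-1]))   # (d_{t+1}, d_t, s_t)
    pairs_ds = Counter(zip(d[:-1], s[:-1]))         # (d_t, s_t)
    pairs_dd = Counter(zip(d[1:], d[:-1]))          # (d_{t+1}, d_t)
    singles = Counter(d[:-1])                       # d_t
    n = len(d) - 1
    te = 0.0
    for (d1, d0, s0), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_ds[(d0, s0)]
        p_cond_self = pairs_dd[(d1, d0)] / singles[d0]
        te += p_joint * np.log(p_cond_full / p_cond_self)
    return te
```

Applied to the tracked positions of a fish pair, the direction with the larger transfer entropy would be read as the leader-to-follower direction; in practice, bias correction and significance testing against surrogate data are needed.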
Moving university hydrology education forward with geoinformatics, data and modeling approaches
NASA Astrophysics Data System (ADS)
Merwade, V.; Ruddell, B. L.
2012-02-01
In this opinion paper, we review recent literature related to data and modeling driven instruction in hydrology, and present our findings from surveying the hydrology education community in the United States. This paper presents an argument that Data and Modeling Driven Geoscience Cybereducation (DMDGC) approaches are valuable for teaching the conceptual and applied aspects of hydrology, as a part of the broader effort to improve Science, Technology, Engineering, and Mathematics (STEM) education at the university level. The authors have undertaken a series of surveys and a workshop involving the community of university hydrology educators to determine the state of the practice of DMDGC approaches to hydrology. We identify the most common tools and approaches currently utilized, quantify the extent of the adoption of DMDGC approaches in the university hydrology classroom, and explain the community's views on the challenges and barriers preventing DMDGC approaches from wider use. DMDGC approaches are currently emphasized at the graduate level of the curriculum, and only the most basic modeling and visualization tools are in widespread use. The community identifies the greatest barriers to greater adoption as a lack of access to easily adoptable curriculum materials and a lack of time and training to learn constantly changing tools and methods. The community's current consensus is that DMDGC approaches should emphasize conceptual learning, and should be used to complement rather than replace lecture-based pedagogies. Inadequate online material-publication and sharing systems, and a lack of incentives for faculty to develop and publish materials via such systems, are also identified as challenges. 
Based on these findings, we suggest that a number of steps should be taken by the community to develop the potential of DMDGC in university hydrology education, including formal development and assessment of curriculum materials integrating lecture-format and DMDGC approaches, incentivizing the publication by faculty of excellent DMDGC curriculum materials, and implementing the publication and dissemination cyberinfrastructure necessary to support the unique DMDGC digital curriculum materials.
A zebrafish model of chordoma initiated by notochord-driven expression of HRASV12
Burger, Alexa; Vasilyev, Aleksandr; Tomar, Ritu; Selig, Martin K.; Nielsen, G. Petur; Peterson, Randall T.; Drummond, Iain A.; Haber, Daniel A.
2014-01-01
Chordoma is a malignant tumor thought to arise from remnants of the embryonic notochord, with its origin in the bones of the axial skeleton. Surgical resection is the standard treatment, usually in combination with radiation therapy, but neither chemotherapeutic nor targeted therapeutic approaches have demonstrated success. No animal model and only a few chordoma cell lines are available for preclinical drug testing, and, although no druggable genetic drivers have been identified, activation of EGFR and downstream AKT-PI3K pathways has been described. Here, we report a zebrafish model of chordoma, based on stable transgene-driven expression of HRASV12 in notochord cells during development. Extensive intra-notochordal tumor formation is evident within days of transgene expression, ultimately leading to larval death. The zebrafish tumors share characteristics of human chordoma as demonstrated by immunohistochemistry and electron microscopy. The mTORC1 inhibitor rapamycin, which has some demonstrated activity in a chordoma cell line, delays the onset of tumor formation in our zebrafish model, and improves survival of tumor-bearing fish. Consequently, the HRASV12-driven zebrafish model of chordoma could enable high-throughput screening of potential therapeutic agents for the treatment of this refractory cancer. PMID:24311731
Model-driven approach to data collection and reporting for quality improvement
Curcin, Vasa; Woodcock, Thomas; Poots, Alan J.; Majeed, Azeem; Bell, Derek
2014-01-01
Continuous data collection and analysis have been shown essential to achieving improvement in healthcare. However, the data required for local improvement initiatives are often not readily available from hospital Electronic Health Record (EHR) systems or not routinely collected. Furthermore, improvement teams are often restricted in time and funding thus requiring inexpensive and rapid tools to support their work. Hence, the informatics challenge in healthcare local improvement initiatives consists of providing a mechanism for rapid modelling of the local domain by non-informatics experts, including performance metric definitions, and grounded in established improvement techniques. We investigate the feasibility of a model-driven software approach to address this challenge, whereby an improvement model designed by a team is used to automatically generate required electronic data collection instruments and reporting tools. To that goal, we have designed a generic Improvement Data Model (IDM) to capture the data items and quality measures relevant to the project, and constructed Web Improvement Support in Healthcare (WISH), a prototype tool that takes user-generated IDM models and creates a data schema, data collection web interfaces, and a set of live reports, based on Statistical Process Control (SPC) for use by improvement teams. The software has been successfully used in over 50 improvement projects, with more than 700 users. We present in detail the experiences of one of those initiatives, the Chronic Obstructive Pulmonary Disease project in Northwest London hospitals. The specific challenges of improvement in healthcare are analysed and the benefits and limitations of the approach are discussed. PMID:24874182
NASA Astrophysics Data System (ADS)
van Maanen, Barend; Nicholls, Robert J.; French, Jon R.; Barkwith, Andrew; Bonaldo, Davide; Burningham, Helene; Brad Murray, A.; Payo, Andres; Sutherland, James; Thornhill, Gillian; Townend, Ian H.; van der Wegen, Mick; Walkden, Mike J. A.
2016-03-01
Coastal and shoreline management increasingly needs to consider morphological change occurring at decadal to centennial timescales, especially that related to climate change and sea-level rise. This requires the development of morphological models operating at a mesoscale, defined by time and length scales of the order 10^1 to 10^2 years and 10^1 to 10^2 km. So-called 'reduced complexity' models that represent critical processes at scales not much smaller than the primary scale of interest, and are regulated by capturing the critical feedbacks that govern landform behaviour, are proving effective as a means of exploring emergent coastal behaviour at a landscape scale. Such models tend to be computationally efficient and are thus easily applied within a probabilistic framework. At the same time, reductionist models, built upon a more detailed description of hydrodynamic and sediment transport processes, are capable of application at increasingly broad spatial and temporal scales. More qualitative modelling approaches are also emerging that can guide the development and deployment of quantitative models, and these can be supplemented by varied data-driven modelling approaches that can achieve new explanatory insights from observational datasets. Such disparate approaches have hitherto been pursued largely in isolation by mutually exclusive modelling communities. Brought together, they have the potential to facilitate a step change in our ability to simulate the evolution of coastal morphology at scales that are most relevant to managing erosion and flood risk. Here, we advocate and outline a new integrated modelling framework that deploys coupled mesoscale reduced complexity models, reductionist coastal area models, data-driven approaches, and qualitative conceptual models. 
Integration of these heterogeneous approaches gives rise to model compositions that can potentially resolve decadal- to centennial-scale behaviour of diverse coupled open coast, estuary and inner shelf settings. This vision is illustrated through an idealised composition of models for a ~ 70 km stretch of the Suffolk coast, eastern England. A key advantage of model linking is that it allows a wide range of real-world situations to be simulated from a small set of model components. However, this process involves more than just the development of software that allows for flexible model coupling. The compatibility of radically different modelling assumptions remains to be carefully assessed, and the testing and uncertainty evaluation of model compositions are areas that require further attention.
Data-Driven Learning of Q-Matrix
Liu, Jingchen; Xu, Gongjun; Ying, Zhiliang
2013-01-01
The recent surge of interest in cognitive assessment has led to developments of novel statistical models for diagnostic classification. Central to many such models is the well-known Q-matrix, which specifies the item–attribute relationships. This article proposes a data-driven approach to identification of the Q-matrix and estimation of related model parameters. A key ingredient is a flexible T-matrix that relates the Q-matrix to response patterns. The flexibility of the T-matrix allows the construction of a natural criterion function as well as a computationally amenable algorithm. Simulation results are presented to demonstrate the usefulness and applicability of the proposed method. Extension to handling of the Q-matrix with partial information is presented. The proposed method also provides a platform on which important statistical issues, such as hypothesis testing and model selection, may be formally addressed. PMID:23926363
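The Q-matrix-to-response-pattern relationship at the heart of this approach can be made concrete with a small sketch. The Q-matrix below is hypothetical, and the conjunctive (DINA-style) ideal-response rule is one standard choice, not necessarily the exact model of the paper; the enumeration of attribute profiles against expected response patterns is the kind of structure a T-matrix summarizes.

```python
import numpy as np
from itertools import product

# Hypothetical 3-item, 2-attribute Q-matrix: rows = items, columns = attributes.
Q = np.array([[1, 0],
              [0, 1],
              [1, 1]])

def ideal_response(alpha, Q):
    """DINA-style ideal response: an item is answered correctly iff the
    subject masters every attribute the Q-matrix requires for that item."""
    return np.all(alpha >= Q, axis=1).astype(int)

# Enumerate all attribute profiles -> noise-free expected response patterns.
patterns = {a: tuple(ideal_response(np.array(a), Q))
            for a in product([0, 1], repeat=2)}
```

A data-driven identification procedure then asks which Q reproduces the observed frequencies of response patterns best under a chosen criterion function.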
Application of a New Hybrid RANS/LES Modeling Paradigm to Compressible Flow
NASA Astrophysics Data System (ADS)
Oliver, Todd; Pederson, Clark; Haering, Sigfried; Moser, Robert
2017-11-01
It is well-known that traditional hybrid RANS/LES modeling approaches suffer from a number of deficiencies. These deficiencies often stem from overly simplistic blending strategies based on scalar measures of turbulence length scale and grid resolution and from use of isotropic subgrid models in LES regions. A recently developed hybrid modeling approach has shown promise in overcoming these deficiencies in incompressible flows [Haering, 2015]. In the approach, RANS/LES blending is accomplished using a hybridization parameter that is governed by an additional model transport equation and is driven to achieve equilibrium between the resolved and unresolved turbulence for the given grid. Further, the model uses a tensor eddy viscosity that is formulated to represent the effects of anisotropic grid resolution on subgrid quantities. In this work, this modeling approach is extended to compressible flows and implemented in the compressible flow solver SU2 (http://su2.stanford.edu/). We discuss both modeling and implementation challenges and show preliminary results for compressible flow test cases with smooth wall separation.
Jang, In Sock; Dienstmann, Rodrigo; Margolin, Adam A; Guinney, Justin
2015-01-01
Complex mechanisms involving genomic aberrations in numerous proteins and pathways are believed to be a key cause of many diseases such as cancer. With recent advances in genomics, elucidating the molecular basis of cancer at a patient level is now feasible, and has led to personalized treatment strategies whereby a patient is treated according to his or her genomic profile. However, there is growing recognition that existing treatment modalities are overly simplistic, and do not fully account for the deep genomic complexity associated with sensitivity or resistance to cancer therapies. To overcome these limitations, large-scale pharmacogenomic screens of cancer cell lines, in conjunction with modern statistical learning approaches, have been used to explore the genetic underpinnings of drug response. While these analyses have demonstrated the ability to infer genetic predictors of compound sensitivity, to date most modeling approaches have been data-driven; that is, they do not explicitly incorporate domain-specific knowledge (priors) in the process of learning a model. While a purely data-driven approach offers an unbiased perspective of the data, and may yield unexpected or novel insights, this strategy introduces challenges for both model interpretability and accuracy. In this study, we propose a novel prior-incorporated sparse regression model in which the choice of informative predictor sets is carried out by knowledge-driven priors (gene sets) in a stepwise fashion. Under regularization in a linear regression model, our algorithm is able to incorporate prior biological knowledge across the predictive variables thereby improving the interpretability of the final model with no loss, and often an improvement, in predictive performance. 
We evaluate the performance of our algorithm compared to well-known regularization methods such as LASSO, Ridge and Elastic net regression in the Cancer Cell Line Encyclopedia (CCLE) and Genomics of Drug Sensitivity in Cancer (Sanger) pharmacogenomics datasets, demonstrating that incorporation of the biological priors selected by our model confers improved predictability and interpretability, despite much fewer predictors, over existing state-of-the-art methods.
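The stepwise, prior-guided selection idea can be sketched in a simplified form. This is a stand-in for the paper's algorithm, not its implementation: predictor groups play the role of gene sets, a closed-form ridge fit replaces the paper's regularized regression, and groups are added greedily while they reduce residual error by more than a tolerance.

```python
import numpy as np

def ridge_fit(X, y, lam=1.0):
    """Closed-form ridge regression coefficients."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

def stepwise_group_ridge(X, y, groups, lam=1.0, tol=0.05):
    """Greedy, prior-guided selection: at each step add the predictor group
    (e.g. a gene set) that most reduces in-sample residual error, stopping
    when the relative improvement drops below `tol`."""
    selected, cols = [], []
    best_err = np.mean((y - y.mean()) ** 2)
    while True:
        gains = {}
        for g, idx in enumerate(groups):
            if g in selected:
                continue
            trial = cols + list(idx)
            beta = ridge_fit(X[:, trial], y, lam)
            gains[g] = np.mean((y - X[:, trial] @ beta) ** 2)
        if not gains:
            break
        g_best = min(gains, key=gains.get)
        if best_err - gains[g_best] < tol * best_err:
            break
        selected.append(g_best)
        cols += list(groups[g_best])
        best_err = gains[g_best]
    return selected
```

In practice the stopping rule would use cross-validated rather than in-sample error; the sketch keeps the structural point that prior-defined groups, not individual features, are the unit of selection.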
NASA Astrophysics Data System (ADS)
Kafka, Orion L.; Yu, Cheng; Shakoor, Modesar; Liu, Zeliang; Wagner, Gregory J.; Liu, Wing Kam
2018-04-01
A data-driven mechanistic modeling technique is applied to a system representative of a broken-up inclusion ("stringer") within drawn nickel-titanium wire or tube, e.g., as used for arterial stents. The approach uses a decomposition of the problem into a training stage and a prediction stage. It is applied to compute the fatigue crack incubation life of a microstructure of interest under high-cycle fatigue. A parametric study of a matrix-inclusion-void microstructure is conducted. The results indicate that, within the range studied, a larger void between halves of the inclusion increases fatigue life, while larger inclusion diameter reduces fatigue life.
Statistical mechanics of the mixed majority minority game with random external information
NASA Astrophysics Data System (ADS)
DeMartino, A.; Giardina, I.; Mosetti, G.
2003-08-01
We study the asymptotic macroscopic properties of the mixed majority-minority game, modelling a population in which two types of heterogeneous adaptive agents, namely 'fundamentalists' driven by differentiation and 'trend-followers' driven by imitation, interact. The presence of a fraction f of trend-followers is shown to induce (a) a significant loss of informational efficiency with respect to a pure minority game (in particular, an efficient, unpredictable phase exists only for f < 1/2), and (b) a catastrophic increase of global fluctuations for f > 1/2. We solve the model by means of an approximate static (replica) theory and by a direct dynamical (generating functional) technique. The two approaches coincide and match numerical results convincingly.
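The model's macroscopic behaviour can also be probed with a toy Monte Carlo, quite apart from the replica and generating-functional analyses of the paper. The sketch below makes simplifying assumptions (two strategies per agent, illustrative sizes, endpoint fractions f = 0 and f = 1 to show the contrast): trend-followers score strategies that agree with the aggregate action, fundamentalists score those that oppose it.

```python
import numpy as np

def mixed_game(n_agents=51, f=0.5, p_states=16, n_steps=4000, seed=0):
    """Toy Monte Carlo of a mixed majority-minority game with random external
    information. Returns the normalized volatility sigma^2/N of the aggregate
    action A. Fraction f of agents are trend-followers (majority players)."""
    rng = np.random.default_rng(seed)
    n_trend = int(f * n_agents)
    # Two strategies per agent: lookup tables from info state -> +/-1.
    strat = rng.choice([-1, 1], size=(n_agents, 2, p_states))
    scores = np.zeros((n_agents, 2))
    # +1 for trend-followers (reward agreement with A), -1 for fundamentalists.
    sign = np.where(np.arange(n_agents) < n_trend, 1.0, -1.0)
    a_hist = []
    for _ in range(n_steps):
        mu = rng.integers(p_states)                  # random external information
        best = scores.argmax(axis=1)
        a = strat[np.arange(n_agents), best, mu]     # actions played
        A = a.sum()
        a_hist.append(A)
        scores += sign[:, None] * strat[:, :, mu] * A / n_agents  # virtual scoring
    a_hist = np.array(a_hist[n_steps // 2:])         # discard transient
    return np.mean(a_hist ** 2) / n_agents

vol_majority = mixed_game(f=1.0)   # pure trend-followers
vol_minority = mixed_game(f=0.0)   # pure fundamentalists
```

A pure majority game exhibits the self-reinforcing large fluctuations the abstract describes for f > 1/2, while the pure minority game suppresses fluctuations in its efficient phase.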
Adapting the Biome-BGC Model to New Zealand Pastoral Agriculture: Climate Change and Land-Use Change
NASA Astrophysics Data System (ADS)
Keller, E. D.; Baisden, W. T.; Timar, L.
2011-12-01
We have adapted the Biome-BGC model to make climate change and land-use scenario estimates of New Zealand's pasture production in 2020 and 2050, with comparison to a 2005 baseline. We take an integrated modelling approach with the aim of enabling the model's use for policy assessments across broadly related issues such as climate change mitigation and adaptation, land-use change, and greenhouse gas projections. The Biome-BGC model is a biogeochemical model that simulates carbon, water, and nitrogen cycles in terrestrial ecosystems. We introduce two new 'ecosystems', sheep/beef and dairy pasture, within the existing structure of the Biome-BGC model and calibrate its ecophysiological parameters against pasture clipping data from diverse sites around New Zealand to form a baseline estimate of total New Zealand pasture production. Using downscaled AR4 climate projections, we construct mid- and upper-range climate change scenarios in 2020 and 2050. We produce land-use change scenarios in the same years by combining the Biome-BGC model with the Land Use in Rural New Zealand (LURNZ) model. The LURNZ model uses econometric approaches to predict future land-use change driven by changes in net profits under expected pricing, including the introduction of an emission trading system. We estimate the relative change in national pasture production from our 2005 baseline levels for both sheep/beef and dairy systems under each scenario.
An action potential-driven model of soleus muscle activation dynamics for locomotor-like movements
NASA Astrophysics Data System (ADS)
Kim, Hojeong; Sandercock, Thomas G.; Heckman, C. J.
2015-08-01
Objective. The goal of this study was to develop a physiologically plausible, computationally robust model for muscle activation dynamics (A(t)) under physiologically relevant excitation and movement. Approach. The interaction of excitation and movement on A(t) was investigated comparing the force production between a cat soleus muscle and its Hill-type model. For capturing A(t) under excitation and movement variation, a modular modeling framework was proposed, comprising three compartments: (1) spikes-to-[Ca2+]; (2) [Ca2+]-to-A; and (3) A-to-force transformation. The individual signal transformations were modeled based on physiological factors so that the parameter values could be separately determined for individual modules directly based on experimental data. Main results. The strong dependency of A(t) on excitation frequency and muscle length was found during both isometric and dynamically-moving contractions. The identified dependencies of A(t) under the static and dynamic conditions could be incorporated in the modular modeling framework by modulating the model parameters as a function of movement input. The new modeling approach was also applicable to cat soleus muscles producing waveforms independent of those used to set the model parameters. Significance. This study provides a modeling framework for spike-driven muscle responses during movement that is suitable not only for insights into molecular mechanisms underlying muscle behaviors but also for large scale simulations.
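The three-compartment cascade can be sketched with deliberately simple stand-in transformations. The functional forms and parameter values below are illustrative assumptions, not the paper's calibrated modules: spikes produce decaying [Ca2+] transients, a Hill-type saturation maps [Ca2+] to activation, and a linear scaling (omitting length and velocity dependence) maps activation to force.

```python
import numpy as np

def spikes_to_ca(spike_times, t, tau=0.05, amp=1.0):
    """Module 1: each spike adds an exponentially decaying [Ca2+] transient
    (tau and amp are illustrative)."""
    ca = np.zeros_like(t)
    for ts in spike_times:
        ca += amp * np.exp(-(t - ts) / tau) * (t >= ts)
    return ca

def ca_to_activation(ca, k=0.5, n=2.0):
    """Module 2: Hill-type saturation of activation with [Ca2+]."""
    return ca**n / (k**n + ca**n)

def activation_to_force(a, f_max=10.0):
    """Module 3: linear scaling to force; length/velocity scaling omitted."""
    return f_max * a
```

Because each module is a separate transformation, its parameters can in principle be fit from data for that stage alone, which is the structural point of the framework; higher spike rates summate [Ca2+] transients and thus yield larger peak force.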
Velikina, Julia V; Samsonov, Alexey A
2015-11-01
To accelerate dynamic MR imaging through development of a novel image reconstruction technique using low-rank temporal signal models preestimated from training data. We introduce the model consistency condition (MOCCO) technique, which utilizes temporal models to regularize reconstruction without constraining the solution to be low-rank, as is performed in related techniques. This is achieved by using a data-driven model to design a transform for compressed sensing-type regularization. The enforcement of general compliance with the model without excessively penalizing deviating signal allows recovery of a full-rank solution. Our method was compared with a standard low-rank approach utilizing model-based dimensionality reduction in phantoms and patient examinations for time-resolved contrast-enhanced angiography (CE-MRA) and cardiac CINE imaging. We studied the sensitivity of all methods to rank reduction and temporal subspace modeling errors. MOCCO demonstrated reduced sensitivity to modeling errors compared with the standard approach. Full-rank MOCCO solutions showed significantly improved preservation of temporal fidelity and aliasing/noise suppression in highly accelerated CE-MRA (acceleration up to 27) and cardiac CINE (acceleration up to 15) data. MOCCO overcomes several important deficiencies of previously proposed methods based on pre-estimated temporal models and allows high quality image restoration from highly undersampled CE-MRA and cardiac CINE data. © 2014 Wiley Periodicals, Inc.
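The distinction between MOCCO-style regularization and a hard low-rank constraint can be shown in a small linear toy problem. Everything below is an illustrative sketch, not the MRI reconstruction itself: a temporal subspace is learned from training signals by SVD, and the reconstruction penalizes the component of the solution outside that subspace instead of forcing the solution into it.

```python
import numpy as np

rng = np.random.default_rng(1)

# "Training" signals: smooth temporal curves; their SVD basis is the model.
t = np.linspace(0, 1, 64)
train = np.array([np.exp(-t / tau) for tau in (0.1, 0.2, 0.4, 0.8)])
U, _, _ = np.linalg.svd(train.T, full_matrices=False)
Phi = U[:, :2]                                   # rank-2 temporal subspace

# Undersampled measurements y = A x of an unknown temporal signal x.
A = rng.normal(size=(24, 64)) / np.sqrt(24)
x_true = np.exp(-t / 0.3) + 0.05 * np.sin(20 * t)  # mostly, not exactly, in subspace
y = A @ x_true

# MOCCO-flavoured reconstruction: penalize the out-of-subspace component,
# allowing a full-rank solution that can deviate from the model.
P_perp = np.eye(64) - Phi @ Phi.T
lam = 1.0
x_hat = np.linalg.solve(A.T @ A + lam * P_perp.T @ P_perp, A.T @ y)

# Hard low-rank alternative for comparison: solve within the subspace only.
c = np.linalg.lstsq(A @ Phi, y, rcond=None)[0]
x_low = Phi @ c
```

The hard low-rank solution is confined to the 2-dimensional model subspace by construction, whereas the penalized solution retains an out-of-subspace component and can therefore fit the data at least as well.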
Air-Breathing Hypersonic Vehicle Tracking Control Based on Adaptive Dynamic Programming.
Mu, Chaoxu; Ni, Zhen; Sun, Changyin; He, Haibo
2017-03-01
In this paper, we propose a data-driven supplementary control approach with adaptive learning capability for air-breathing hypersonic vehicle tracking control based on action-dependent heuristic dynamic programming (ADHDP). The control action is generated by the combination of sliding mode control (SMC) and the ADHDP controller to track the desired velocity and the desired altitude. In particular, the ADHDP controller observes the differences between the actual velocity/altitude and the desired velocity/altitude, and then provides a supplementary control action accordingly. The ADHDP controller does not rely on the accurate mathematical model function and is data driven. Meanwhile, it is capable of adjusting its parameters online over time under various working conditions, which is very suitable for hypersonic vehicle systems with parameter uncertainties and disturbances. We verify the adaptive supplementary control approach versus the traditional SMC in the cruising flight, and provide three simulation studies to illustrate the improved performance with the proposed approach.
The Evolution of System Safety at NASA
NASA Technical Reports Server (NTRS)
Dezfuli, Homayoon; Everett, Chris; Groen, Frank
2014-01-01
The NASA system safety framework is in the process of change, motivated by the desire to promote an objectives-driven approach to system safety that explicitly focuses system safety efforts on system-level safety performance, and serves to unify, in a purposeful manner, safety-related activities that otherwise might be done in a way that results in gaps, redundancies, or unnecessary work. An objectives-driven approach to system safety affords more flexibility to determine, on a system-specific basis, the means by which adequate safety is achieved and verified. Such flexibility and efficiency is becoming increasingly important in the face of evolving engineering modalities and acquisition models, where, for example, NASA will increasingly rely on commercial providers for transportation services to low-earth orbit. A key element of this objectives-driven approach is the use of the risk-informed safety case (RISC): a structured argument, supported by a body of evidence, that provides a compelling, comprehensible and valid case that a system is or will be adequately safe for a given application in a given environment. The RISC addresses each of the objectives defined for the system, providing a rational basis for making informed risk acceptance decisions at relevant decision points in the system life cycle.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kidon, Lyran; The Sackler Center for Computational Molecular and Materials Science, Tel Aviv University, Tel Aviv 69978; Wilner, Eli Y.
2015-12-21
The generalized quantum master equation provides a powerful tool to describe the dynamics in quantum impurity models driven away from equilibrium. Two complementary approaches, one based on Nakajima–Zwanzig–Mori time-convolution (TC) and the other on the Tokuyama–Mori time-convolutionless (TCL) formulations, provide a starting point to describe the time-evolution of the reduced density matrix. A key in both approaches is to obtain the so-called "memory kernel" or "generator," going beyond second or fourth order perturbation techniques. While numerically converged techniques are available for the TC memory kernel, the canonical approach to obtain the TCL generator is based on inverting a super-operator in the full Hilbert space, which is difficult to perform; thus, nearly all applications of the TCL approach rely on a perturbative scheme of some sort. Here, the TCL generator is expressed using a reduced system propagator which can be obtained from system observables alone and requires the calculation of super-operators and their inverse in the reduced Hilbert space rather than the full one. This makes the formulation amenable to quantum impurity solvers or to diagrammatic techniques, such as the nonequilibrium Green's function. We implement the TCL approach for the resonant level model driven away from equilibrium and compare the time scales for the decay of the generator with that of the memory kernel in the TC approach. Furthermore, the effects of temperature, source-drain bias, and gate potential on the TCL/TC generators are discussed.
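The central relation can be stated compactly. Writing the reduced dynamics in terms of a reduced propagator, the time-local TCL generator follows from that propagator alone, which is what makes the construction accessible from system observables:

```latex
\rho_S(t) = \mathcal{U}(t)\,\rho_S(0),
\qquad
\frac{d\rho_S(t)}{dt} = \mathcal{K}(t)\,\rho_S(t),
\qquad
\mathcal{K}(t) = \dot{\mathcal{U}}(t)\,\mathcal{U}^{-1}(t).
```

Here \(\mathcal{U}(t)\) acts on the reduced Hilbert space, so the inverse required for \(\mathcal{K}(t)\) is taken in the reduced space rather than the full one, consistent with the approach described above.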
On the formation of Friedlander waves in a compressed-gas-driven shock tube
Tasissa, Abiy F.; Hautefeuille, Martin; Fitek, John H.; Radovitzky, Raúl A.
2016-01-01
Compressed-gas-driven shock tubes have become popular as a laboratory-scale replacement for field blast tests. The well-known initial structure of the Riemann problem eventually evolves into a shock structure thought to resemble a Friedlander wave, although this remains to be demonstrated theoretically. In this paper, we develop a semi-analytical model to predict the key characteristics of pseudo blast waves forming in a shock tube: location where the wave first forms, peak over-pressure, decay time and impulse. The approach is based on combining the solutions of the two different types of wave interactions that arise in the shock tube after the family of rarefaction waves in the Riemann solution interacts with the closed end of the tube. The results of the analytical model are verified against numerical simulations obtained with a finite volume method. The model furnishes a rational approach to relate shock tube parameters to desired blast wave characteristics, and thus constitutes a useful tool for the design of shock tubes for blast testing. PMID:27118888
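The Friedlander waveform that the shock structure is compared against has a standard closed form, and the positive-phase impulse follows by integration. The sketch below uses the classic one-parameter form with illustrative values; the paper's semi-analytical model predicts the parameters (peak over-pressure, decay time, impulse) rather than assuming them.

```python
import numpy as np

def friedlander(t, p_peak, t_d, b=1.0):
    """Friedlander overpressure waveform: instantaneous rise to p_peak at t=0,
    then decay through zero at the positive-phase duration t_d."""
    return p_peak * (1.0 - t / t_d) * np.exp(-b * t / t_d)

def positive_impulse(p_peak, t_d, b=1.0, n=100_000):
    """Positive-phase impulse: trapezoidal integration of the waveform
    over [0, t_d]."""
    t = np.linspace(0.0, t_d, n)
    f = friedlander(t, p_peak, t_d, b)
    dt = t[1] - t[0]
    return dt * (0.5 * (f[0] + f[-1]) + f[1:-1].sum())
```

For b = 1 the integral has the closed form p_peak * t_d * e^{-1}, which provides a quick check of the numerical impulse.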
Three-dimensional Cascaded Lattice Boltzmann Model for Thermal Convective Flows
NASA Astrophysics Data System (ADS)
Hajabdollahi, Farzaneh; Premnath, Kannan
2017-11-01
Fluid motion driven by thermal effects, such as due to buoyancy in differentially heated enclosures arise in several natural and industrial settings, whose understanding can be achieved via numerical simulations. Lattice Boltzmann (LB) methods are efficient kinetic computational approaches for coupled flow physics problems. In this study, we develop three-dimensional (3D) LB models based on central moments and multiple relaxation times for D3Q7 and D3Q15 lattices to solve the energy transport equations in a double distribution function approach. Their collision operators lead to a cascaded structure involving higher order terms resulting in improved stability. This is coupled to a central moment based LB flow solver with source terms. The new 3D cascaded LB models for the convective flows are first validated for natural convection of air driven thermally on two vertically opposite faces in a cubic cavity at different Rayleigh numbers against prior numerical and experimental data, which show good quantitative agreement. Then, the detailed structure of the 3D flow and thermal fields and the heat transfer rates at different Rayleigh numbers are analyzed and interpreted.
Liang, Yunlei; Du, Zhijiang; Sun, Lining
2017-01-01
The tendon-driven mechanism using a cable and pulley to transmit power is adopted by many surgical robots. However, backlash hysteresis is inherent in cable-pulley mechanisms, and this nonlinearity is a great challenge for precise position control during surgical procedures. Previous studies mainly focused on the transmission characteristics of the cable-driven system and constructed transmission models under particular assumptions to solve nonlinear problems. However, these approaches are limited because the modeling process is complex and the transmission models lack general applicability. This paper presents a novel position compensation control scheme to reduce the impact of backlash hysteresis on the positioning accuracy of surgical robots' end-effectors: a support vector machine based on feedforward control is used to reduce the position tracking error. To validate the proposed approach, experimental validations are conducted on our cable-pulley system and comparative experiments are carried out. The results show that the proposed scheme markedly reduces the positioning error. PMID:28974011
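A minimal sketch of the feedforward idea, with an assumed dead-band backlash model and synthetic data standing in for the real cable-pulley measurements; the SVR hyperparameters and the feature choice (command plus direction of motion) are illustrative, not the paper's.

```python
import numpy as np
from sklearn.svm import SVR

def backlash(cmd, width=0.05):
    # hypothetical cable-pulley response: dead-band of half-width `width`
    out, y = [], 0.0
    for x in cmd:
        if x > y + width:
            y = x - width
        elif x < y - width:
            y = x + width
        out.append(y)
    return np.array(out)

# training data: commanded trajectory and the resulting tracking error
cmd = np.sin(np.linspace(0, 6 * np.pi, 600))
err = cmd - backlash(cmd)
# features: command value and its direction of motion
X = np.column_stack([cmd, np.sign(np.gradient(cmd))])
model = SVR(kernel="rbf", C=10.0, epsilon=0.005).fit(X, err)

# feedforward compensation: add the predicted error to the command
comp_cmd = cmd + model.predict(X)
rms_before = float(np.sqrt(np.mean(err ** 2)))
rms_after = float(np.sqrt(np.mean((cmd - backlash(comp_cmd)) ** 2)))
```

On this toy dead-band plant the compensated command tracks the target except briefly at motion reversals, so the residual RMS error drops well below the uncompensated value.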
Noise-driven bias in the non-local voter model
NASA Astrophysics Data System (ADS)
Minors, Kevin; Rogers, Tim; Yates, Christian A.
2018-04-01
Is it more effective to have a strong influence over a small domain, or a weaker influence over a larger one? Here, we introduce and analyse an off-lattice generalisation of the voter model, in which the range and strength of agents' influence are control parameters. We consider both low- and high-density regimes and, using distinct mathematical approaches, derive analytical predictions for the evolution of agent densities. We find that, even when the agents are equally persuasive on average, those whose influence is wider but weaker have an overall noise-driven advantage allowing them to reliably dominate the entire population. We discuss the implications of our results and the potential of our model (or adaptations thereof) to improve the understanding of political campaign strategies and the evolution of disease.
NASA Astrophysics Data System (ADS)
Abellán-Nebot, J. V.; Liu, J.; Romero, F.
2009-11-01
The State Space modelling approach has been recently proposed as an engineering-driven technique for part quality prediction in Multistage Machining Processes (MMP). Current State Space models incorporate fixture and datum variations in the multi-stage variation propagation, without explicitly considering common operation variations such as machine-tool thermal distortions, cutting-tool wear, cutting-tool deflections, etc. This paper shows the limitations of the current State Space model through an experimental case study where the effect of the spindle thermal expansion, cutting-tool flank wear and locator errors are introduced. The paper also discusses the extension of the current State Space model to include operation variations and its potential benefits.
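The State Space (stream-of-variation) form the paper builds on is a linear stage-to-stage recursion, x_k = A_k x_{k-1} + B_k u_k + w_k. A toy instance with hypothetical propagation matrices, fixture-error inputs, and a small noise term standing in for the unmodeled operation variations the paper discusses:

```python
import numpy as np

rng = np.random.default_rng(1)
n_stages, dim = 3, 2   # hypothetical: two part-quality features, three stations

# stage-to-stage propagation (A) and fixture/datum error input (B) matrices
A = [np.array([[1.0, 0.1], [0.0, 1.0]])] * n_stages
B = [np.eye(dim)] * n_stages
u = [np.array([0.02, 0.0]),    # fixture errors introduced at each station
     np.array([0.0, 0.01]),
     np.array([0.01, 0.01])]

x = np.zeros(dim)
for k in range(n_stages):
    w = rng.normal(0.0, 1e-4, dim)      # unmodeled operation variation
    x = A[k] @ x + B[k] @ u[k] + w      # state-space variation propagation
y = x                                    # part deviation after the last stage
```

The paper's point is that effects like spindle thermal expansion and tool wear currently live only in the small `w` term; extending the model means giving them explicit input matrices of their own.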
Velpuri, N.M.; Senay, G.B.; Asante, K.O.
2012-01-01
Lake Turkana is one of the largest desert lakes in the world and is characterized by a high degree of inter- and intra-annual fluctuation. The hydrology and water balance of this lake have not been well understood due to its remote location and the unavailability of reliable ground truth datasets. Managing surface water resources is a great challenge in areas where in-situ data are either limited or unavailable. In this study, multi-source satellite-driven data such as satellite-based rainfall estimates, modelled runoff, evapotranspiration, and a digital elevation dataset were used to model Lake Turkana water levels from 1998 to 2009. Due to the unavailability of reliable lake level data, an approach is presented to calibrate and validate the water balance model of Lake Turkana using a composite lake level product of TOPEX/Poseidon, Jason-1, and ENVISAT satellite altimetry data. Model validation results showed that the satellite-driven water balance model can satisfactorily capture the patterns and seasonal variations of the Lake Turkana water level fluctuations, with a Pearson's correlation coefficient of 0.90 and a Nash-Sutcliffe Coefficient of Efficiency (NSCE) of 0.80 during the validation period (2004-2009). Model error estimates were within 10% of the natural variability of the lake. Our analysis indicated that fluctuations in Lake Turkana water levels are mainly driven by lake inflows and over-the-lake evaporation. Over-the-lake rainfall contributes only up to 30% of the lake's evaporative demand. During the modelling period, Lake Turkana showed seasonal variations of 1-2 m. The lake level fluctuated over a range of up to 4 m between 1998 and 2009. This study demonstrated the usefulness of satellite altimetry data to calibrate and validate the satellite-driven hydrological model for Lake Turkana without using any in-situ data.
Furthermore, for Lake Turkana, we identified and outlined opportunities and challenges of using a calibrated satellite-driven water balance model for (i) quantitative assessment of the impact of basin developmental activities on lake levels and for (ii) forecasting lake level changes and their impact on fisheries. From this study, we suggest that globally available satellite altimetry data provide a unique opportunity for calibration and validation of hydrologic models in ungauged basins. © Author(s) 2012.
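A lumped water-balance update of the kind the study describes can be sketched in a few lines; the numbers below are illustrative stand-ins for the satellite-derived inputs, and the real model also evolves lake surface area with level via the digital elevation dataset.

```python
def update_level(level_m, inflow_m3, rain_m, evap_m, lake_area_m2):
    """One time-step of a simple lumped lake water balance:
    level change = inflow volume spread over the lake area,
    plus over-the-lake rainfall, minus over-the-lake evaporation."""
    return level_m + inflow_m3 / lake_area_m2 + rain_m - evap_m

level = 360.0      # hypothetical starting level (m a.s.l.)
area = 6.75e9      # ~6750 km^2, approximate Lake Turkana surface area (m^2)
for month in range(12):
    # illustrative monthly forcing: river inflow, rainfall, evaporation
    level = update_level(level, 1.2e9, 0.015, 0.19, area)
```

Even in this toy form the structure mirrors the paper's finding: the level is dominated by the inflow and evaporation terms, with direct rainfall a minor contribution.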
Numerical Modeling and Testing of Inductively-Driven and High-Energy Pulsed Plasma Thrusters
NASA Technical Reports Server (NTRS)
Parma, Brian
2004-01-01
Pulsed Plasma Thrusters (PPTs) are advanced electric space propulsion devices characterized by simplicity and robustness. They suffer, however, from low thrust efficiencies. This summer, two approaches to improve the thrust efficiency of PPTs will be investigated through both numerical modeling and experimental testing. The first approach, an inductively-driven PPT, uses a double-ignition circuit to fire two PPTs in succession. This effectively changes the PPT's configuration from an LRC circuit to an LR circuit. The LR circuit is expected to provide better impedance matching and improve the efficiency of the energy transfer to the plasma. An added benefit of the LR circuit is an exponential decay of the current, whereas a traditional PPT's underdamped LRC circuit experiences the characteristic "ringing" of its current. The exponential decay may provide improved lifetime and sustained electromagnetic acceleration. The second approach, a high-energy PPT, is a traditional PPT with a variable-size capacitor bank. This PPT will be simulated and tested at energy levels between 100 and 450 joules in order to investigate the relationship between efficiency and energy level. The Multi-block Arbitrary Coordinate Hydromagnetic (MACH2) code is used for the simulations. The MACH2 code, designed by the Center for Plasma Theory and Computation at the Air Force Research Laboratory, has been used to gain insight into a variety of plasma problems, including electric plasma thrusters. The goals for this summer include numerical predictions of performance for both the inductively-driven PPT and the high-energy PPT, experimental validation of the numerical models, and numerical optimization of the designs. These goals will be met through numerical and experimental investigation of the PPTs' current waveforms, mass loss (or ablation), and impulse bit characteristics.
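The circuit-level argument, monotone LR decay versus underdamped LRC ringing, can be checked with the textbook discharge solutions; the component values below are illustrative, not the thruster's.

```python
import math

def lrc_current(t, V0, L, R, C):
    """Underdamped series LRC capacitor discharge: the oscillatory
    "ringing" current of a traditional PPT."""
    w0 = 1.0 / math.sqrt(L * C)
    a = R / (2.0 * L)
    wd = math.sqrt(w0 ** 2 - a ** 2)   # assumes underdamped (w0 > a)
    return (V0 / (wd * L)) * math.exp(-a * t) * math.sin(wd * t)

def lr_current(t, I0, L, R):
    # LR discharge: monotone exponential decay, no current reversal
    return I0 * math.exp(-R * t / L)
```

The sign change of the LRC solution within a period is exactly the current reversal that the LR configuration is meant to eliminate.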
Bauer, Julia; Chen, Wenjing; Nischwitz, Sebastian; Liebl, Jakob; Rieken, Stefan; Welzel, Thomas; Debus, Juergen; Parodi, Katia
2018-04-24
A reliable Monte Carlo prediction of proton-induced brain tissue activation used for comparison to particle therapy positron-emission-tomography (PT-PET) measurements is crucial for in vivo treatment verification. Major limitations to overcome in current approaches are the CT-based patient model and the description of activity washout due to tissue perfusion. Two approaches were studied to improve the activity prediction for brain irradiation: (i) a refined patient model using tissue classification based on MR information and (ii) a PT-PET data-driven refinement of washout model parameters. Improvements of the activity predictions compared to post-treatment PT-PET measurements were assessed in terms of activity profile similarity for six patients treated with a single field or two almost parallel fields delivered by active proton beam scanning. The refined patient model yields generally higher similarity for most of the patients, except in highly pathological areas leading to tissue misclassification. Using washout model parameters deduced from clinical patient data considerably improved the activity profile similarity for all patients. Current methods used to predict proton-induced brain tissue activation can be improved with MR-based tissue classification and data-driven washout parameters, thus providing a more reliable basis for PT-PET verification. Copyright © 2018 Elsevier B.V. All rights reserved.
Assessment of cardiovascular risk based on a data-driven knowledge discovery approach.
Mendes, D; Paredes, S; Rocha, T; Carvalho, P; Henriques, J; Cabiddu, R; Morais, J
2015-01-01
The cardioRisk project addresses the development of personalized risk assessment tools for patients who have been admitted to the hospital with acute myocardial infarction. Although there are models available that assess the short-term risk of death/new events for such patients, these models were established in circumstances that do not take into account present clinical interventions and, in some cases, the risk factors used by such models are not easily available in clinical practice. The integration of the existing risk tools (applied in the clinician's daily practice) with data-driven knowledge discovery mechanisms based on data routinely collected during hospitalizations would be a breakthrough in overcoming some of these difficulties. In this context, the development of simple and interpretable models (based on recent datasets) will facilitate this integration and build confidence in it. In this work, a simple and interpretable model based on a real dataset is proposed. It consists of a decision tree model structure that uses a reduced set of six binary risk factors. The validation is performed using a recent dataset provided by the Portuguese Society of Cardiology (11,113 patients), which originally comprised 77 risk factors. A sensitivity, specificity and accuracy of, respectively, 80.42%, 77.25% and 78.80% were achieved, showing the effectiveness of the approach.
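A sketch of a decision-tree risk model on six binary risk factors, using synthetic patients in place of the Portuguese Society of Cardiology registry; the outcome rule, tree depth, and sample size are assumptions, and the metrics are computed on the training data purely for illustration.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(2)
# six hypothetical binary risk factors for 500 synthetic patients
X = rng.integers(0, 2, size=(500, 6))
# synthetic outcome: event risk rises with the number of factors present
y = (X.sum(axis=1) + rng.normal(0, 1, 500) > 3.5).astype(int)

tree = DecisionTreeClassifier(max_depth=4).fit(X, y)
pred = tree.predict(X)

tp = int(((pred == 1) & (y == 1)).sum())
fn = int(((pred == 0) & (y == 1)).sum())
tn = int(((pred == 0) & (y == 0)).sum())
fp = int(((pred == 1) & (y == 0)).sum())
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
```

The shallow depth is the point: a tree over six binary inputs stays small enough to be read as explicit clinical rules, which is the interpretability the paper argues for.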
Process-driven and biological characterisation and mapping of seabed habitats sensitive to trawling.
Foveau, Aurélie; Vaz, Sandrine; Desroy, Nicolas; Kostylev, Vladimir E
2017-01-01
The increase of anthropogenic pressures on the marine environment, together with the necessity of sustainable management of marine living resources, has underlined the need to map and model coastal environments, particularly for the purposes of spatial planning and for the implementation of an integrated ecosystem-based management approach. The present study compares outputs of a process-driven benthic habitat sensitivity (PDS) model to the structure, composition and distribution of benthic invertebrates in the Eastern English Channel and southern part of the North Sea. Trawl disturbance indicators (TDI) computed from species biological traits and benthic community composition were produced from samples collected with a bottom trawl. The TDI was found to be highly correlated with the PDS, further validating the latter's purpose of identifying natural process-driven patterns of sensitivity. PDS was found to reflect an environmental potential that may no longer be fully observable in the field, and differences from in situ biological observations could be partially explained by the spatial distribution of fishery pressure on the seafloor. The management implications of these findings are discussed, and we suggest that, used in conjunction with TDI approaches, PDS may help monitor management effort by evaluating the difference between the current state and the presumed optimal environmental status of marine benthic habitats.
Modeling of the illumination driven coma of 67P/Churyumov-Gerasimenko
NASA Astrophysics Data System (ADS)
Bieler, André
2015-04-01
In this paper, we present results of modeling 67P/Churyumov-Gerasimenko's (C-G) neutral coma properties observed by the Rosetta ROSINA experiment with three different model approaches. The basic assumption for all models is that the outgassing properties of C-G are mainly illumination driven. With this assumption, all models are capable of reproducing most features in the neutral coma signature detected by the ROSINA-COPS instrument over several months. The models use a realistic shape model of the nucleus to calculate the illumination conditions over time, which define the boundary conditions for the hydrodynamic (BATS-R-US code) and Direct Simulation Monte Carlo (AMPS code) simulations. The third model computes the projection of the total illumination of the comet surface towards the spacecraft. Our results indicate that at large heliocentric distances (3.5 to 2.8 AU), most gas coma structures observed by the in-situ instruments can be explained by uniformly distributed activity regions spread over the whole nucleus surface.
Geometrical Optimization Approach to Isomerization: Models and Limitations.
Chang, Bo Y; Shin, Seokmin; Engel, Volker; Sola, Ignacio R
2017-11-02
We study laser-driven isomerization reactions through an excited electronic state using the recently developed Geometrical Optimization procedure. Our goal is to analyze whether an initial wave packet in the ground state, with optimized amplitudes and phases, can be used to enhance the yield of the reaction at faster rates, driven by a single picosecond pulse or a pair of femtosecond pulses resonant with the electronic transition. We show that the symmetry of the system imposes limitations in the optimization procedure, such that the method rediscovers the pump-dump mechanism.
Novel Approaches to Breast Cancer Prevention and Inhibition of Metastases
2014-10-01
...functional characterization of candidate breast cancer genes. The transgenic RNAi library covers the whole Drosophila genome, giving us an... cancer prevention trials in BRCA1 carriers using RANKL blockade. Using Drosophila modeling of Ras-driven transformation, we performed a near-genome... Keywords: genome-wide functional genetics, haploid stem cells, Drosophila cancer modeling, breast cancer prevention, BRCA1 carriers.
Multilingual Phoneme Models for Rapid Speech Processing System Development
2006-09-01
...processes are used to develop an Arabic speech recognition system starting from monolingual English models, International Phonetic Association (IPA)... clusters. It was found that multilingual bootstrapping methods outperform monolingual English bootstrapping methods on the Arabic evaluation data initially... Topics covered include the International Phonetic Alphabet, multilingual vs. monolingual speech recognition, and data-driven approaches.
Numerical investigation of coupled density-driven flow and hydrogeochemical processes below playas
NASA Astrophysics Data System (ADS)
Hamann, Enrico; Post, Vincent; Kohfahl, Claus; Prommer, Henning; Simmons, Craig T.
2015-11-01
Numerical modeling approaches of varying complexity were explored to investigate coupled groundwater flow and geochemical processes in saline basins. Long-term model simulations of a playa system provide insights into the complex feedback mechanisms between density-driven flow and the spatiotemporal patterns of precipitating evaporites and evolving brines. Using a reactive multicomponent transport model approach, the simulations reproduced, for the first time in a numerical study, the evaporite precipitation sequences frequently observed in saline basins ("bull's eyes"). Playa-specific flow, evapoconcentration, and chemical divides were found to be the primary controls on the location of evaporites formed and the resulting brine chemistry. Comparative simulations with the computationally far less demanding surrogate single-species transport models showed that these were still able to replicate the major flow patterns obtained by the more complex reactive transport simulations. However, the simulated degree of salinization was clearly lower than in the reactive multicomponent transport simulations. For example, in the late stages of the simulations, when the brine becomes halite-saturated, the nonreactive simulation underestimated the solute mass by almost 20%. The simulations highlight the importance of considering reactive transport processes for understanding and quantifying geochemical patterns, concentrations of individual dissolved solutes, and evaporite evolution.
de Vries, Natalie Jane; Carlson, Jamie; Moscato, Pablo
2014-01-01
Online consumer behavior in general, and online customer engagement with brands in particular, has become a major focus of research activity, fuelled by the exponential increase of interactive functions of the internet and social media platforms and applications. Current research in this area is mostly hypothesis-driven, and much debate about the concept of Customer Engagement and its related constructs persists in the literature. In this paper, we propose a novel methodology for reverse engineering a consumer behavior model for online customer engagement, based on a computational and data-driven perspective. This methodology could be generalized and prove useful for future research in the field of consumer behavior using questionnaire data, or for studies investigating other types of human behavior. The method we propose contains five main stages: symbolic regression analysis, graph building, community detection, evaluation of results and, finally, investigation of directed cycles and common feedback loops. The 'communities' of questionnaire items that emerge from our community detection method form possible 'functional constructs' inferred from data rather than assumed from literature and theory. Our results show consistent partitioning of questionnaire items into such 'functional constructs', suggesting the method proposed here could be adopted as a new data-driven way of human behavior modeling. PMID:25036766
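Stages two and three of the proposed pipeline, graph building and community detection, can be illustrated on synthetic questionnaire data. The sketch below uses thresholded correlations and connected components as a crude stand-in for the authors' symbolic-regression-based graph and their community-detection method; all data is fabricated.

```python
import numpy as np

rng = np.random.default_rng(3)
# synthetic questionnaire: 9 items generated from 3 latent "constructs"
latent = rng.normal(size=(200, 3))
items = np.repeat(latent, 3, axis=1) + 0.3 * rng.normal(size=(200, 9))

# graph building: connect items whose |correlation| exceeds a threshold
corr = np.corrcoef(items.T)
adj = (np.abs(corr) > 0.6) & ~np.eye(9, dtype=bool)

def components(adj):
    # crude "community detection": connected components of the threshold graph
    seen, comps = set(), []
    for s in range(adj.shape[0]):
        if s in seen:
            continue
        stack, comp = [s], set()
        while stack:
            v = stack.pop()
            if v in comp:
                continue
            comp.add(v)
            stack.extend(int(n) for n in np.flatnonzero(adj[v]))
        seen |= comp
        comps.append(sorted(comp))
    return comps
```

On this synthetic data the recovered components coincide with the three latent constructs, which is the "functional constructs from data" outcome the paper reports for its questionnaire items.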
NASA Astrophysics Data System (ADS)
Merwade, V.; Ruddell, B. L.
2012-08-01
In this opinion paper, we review recent literature related to data- and modeling-driven instruction in hydrology, and present our findings from surveying the hydrology education community in the United States. This paper argues that data- and modeling-driven geoscience cybereducation (DMDGC) approaches are essential for teaching the conceptual and applied aspects of hydrology, as part of the broader effort to improve science, technology, engineering, and mathematics (STEM) education at the university level. The authors have undertaken a series of surveys and a workshop involving university hydrology educators to determine the state of the practice of DMDGC approaches to hydrology. We identify the most common tools and approaches currently utilized, quantify the extent of the adoption of DMDGC approaches in the university hydrology classroom, and explain the community's views on the challenges and barriers preventing DMDGC approaches from wider use. DMDGC approaches are currently emphasized at the graduate level of the curriculum, and only the most basic modeling and visualization tools are in widespread use. The community identifies the greatest barriers to greater adoption as a lack of access to easily adoptable curriculum materials and a lack of time and training to learn constantly changing tools and methods. The community's current consensus is that DMDGC approaches should emphasize conceptual learning and should be used to complement rather than replace lecture-based pedagogies. Inadequate online material publication and sharing systems, and a lack of incentives for faculty to develop and publish materials via such systems, are also identified as challenges.
Based on these findings, we suggest that a number of steps should be taken by the community to develop the potential of DMDGC in university hydrology education, including formal development and assessment of curriculum materials, integrating lecture-format and DMDGC approaches, incentivizing the publication by faculty of excellent DMDGC curriculum materials, and implementing the publication and dissemination cyberinfrastructure necessary to support the unique DMDGC digital curriculum materials.
SATware: A Semantic Approach for Building Sentient Spaces
NASA Astrophysics Data System (ADS)
Massaguer, Daniel; Mehrotra, Sharad; Vaisenberg, Ronen; Venkatasubramanian, Nalini
This chapter describes the architecture of a semantic-based middleware environment for building sensor-driven sentient spaces. The proposed middleware explicitly models sentient space semantics (i.e., entities, spaces, activities) and supports mechanisms to map sensor observations to the state of the sentient space. We argue that such a semantic approach provides a powerful programming environment for building sentient spaces. In addition, the approach provides natural ways to exploit semantics for a variety of purposes, including scheduling under resource constraints and sensor recalibration.
Identification of tumor evolution patterns by means of inductive logic programming.
Bevilacqua, Vitoantonio; Chiarappa, Patrizia; Mastronardi, Giuseppe; Menolascina, Filippo; Paradiso, Angelo; Tommasi, Stefania
2008-06-01
In considering key events of genomic disorders in the development and progression of cancer, the correlation between genomic instability and carcinogenesis is currently under investigation. In this work, we propose an inductive logic programming approach to the problem of modeling evolution patterns for breast cancer. Using this approach, it is possible to extract fingerprints of stages of the disease that can be used to develop and deliver the most adequate therapies to patients. Furthermore, such a model can help physicians and biologists elucidate the molecular dynamics underlying the aberrations-waterfall model behind carcinogenesis. By showing results obtained on a real-world dataset, we give some hints about further knowledge-driven validation of such hypotheses.
AOP-driven Predictive Models for Carcinogenicity: an exercise in interoperable data application.
Traditional methods and data sources for risk assessment are resource-intensive, retrospective, and not a feasible approach to address the tremendous regulatory burden of unclassified chemicals. As a result, the adverse outcome pathway (AOP) concept was developed to facilitate a ...
Evaluating the utility of dynamical downscaling in agricultural impacts projections
Glotter, Michael; Elliott, Joshua; McInerney, David; Best, Neil; Foster, Ian; Moyer, Elisabeth J.
2014-01-01
Interest in estimating the potential socioeconomic costs of climate change has led to the increasing use of dynamical downscaling—nested modeling in which regional climate models (RCMs) are driven with general circulation model (GCM) output—to produce fine-spatial-scale climate projections for impacts assessments. We evaluate here whether this computationally intensive approach significantly alters projections of agricultural yield, one of the greatest concerns under climate change. Our results suggest that it does not. We simulate US maize yields under current and future CO2 concentrations with the widely used Decision Support System for Agrotechnology Transfer crop model, driven by a variety of climate inputs including two GCMs, each in turn downscaled by two RCMs. We find that no climate model output can reproduce yields driven by observed climate unless a bias correction is first applied. Once a bias correction is applied, GCM- and RCM-driven US maize yields are essentially indistinguishable in all scenarios (<10% discrepancy, equivalent to error from observations). Although RCMs correct some GCM biases related to fine-scale geographic features, errors in yield are dominated by broad-scale (100s of kilometers) GCM systematic errors that RCMs cannot compensate for. These results support previous suggestions that the benefits for impacts assessments of dynamically downscaling raw GCM output may not be sufficient to justify its computational demands. Progress on fidelity of yield projections may benefit more from continuing efforts to understand and minimize systematic error in underlying climate projections. PMID:24872455
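The bias correction the study finds essential can be as simple as a mean-shift ("delta") adjustment; the sketch below shows that form on made-up temperature series (operational corrections for crop modeling are often quantile-based, but the principle is the same).

```python
import numpy as np

def delta_bias_correct(model_future, model_hist, obs_hist):
    """Mean-shift ("delta") bias correction of climate model output:
    subtract the model's historical bias relative to observations."""
    return model_future - (np.mean(model_hist) - np.mean(obs_hist))

obs = np.array([20.0, 22.0, 21.0])      # observed historical temperature
hist = np.array([23.0, 25.0, 24.0])     # model historical (3 degC warm bias)
future = np.array([26.0, 28.0, 27.0])   # raw model projection
corrected = delta_bias_correct(future, hist, obs)
```

The correction removes the broad-scale systematic error while leaving the projected change signal (here +6 degC relative to observations) untouched, which is why bias-corrected GCM and RCM outputs end up giving such similar yields.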
Master stability functions reveal diffusion-driven pattern formation in networks
NASA Astrophysics Data System (ADS)
Brechtel, Andreas; Gramlich, Philipp; Ritterskamp, Daniel; Drossel, Barbara; Gross, Thilo
2018-03-01
We study diffusion-driven pattern formation in networks of networks, a class of multilayer systems, where different layers have the same topology, but different internal dynamics. Agents are assumed to disperse within a layer by undergoing random walks, while they can be created or destroyed by reactions between or within a layer. We show that the stability of homogeneous steady states can be analyzed with a master stability function approach that reveals a deep analogy between pattern formation in networks and pattern formation in continuous space. For illustration, we consider a generalized model of ecological meta-food webs. This fairly complex model describes the dispersal of many different species across a region consisting of a network of individual habitats while subject to realistic, nonlinear predator-prey interactions. In this example, the method reveals the intricate dependence of the dynamics on the spatial structure. The ability of the proposed approach to deal with this fairly complex system highlights it as a promising tool for ecology and other applications.
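The master stability function reduces network pattern formation to evaluating the reaction Jacobian shifted by each Laplacian eigenvalue, in close analogy to Turing analysis in continuous space. A minimal sketch with a hypothetical two-species (activator-inhibitor) Jacobian on a 4-node ring; the matrices and coupling strength are illustrative, not the meta-food-web model of the paper.

```python
import numpy as np

def msf(J, D, g, kappas):
    """Master stability function: largest real part of eig(J - g*kappa*D)
    over the Laplacian eigenvalues kappa; a positive value signals
    diffusion-driven (Turing-like) pattern formation on the network."""
    return [float(np.linalg.eigvals(J - g * k * D).real.max()) for k in kappas]

# hypothetical activator-inhibitor Jacobian, stable without dispersal
J = np.array([[1.0, -2.0], [2.0, -3.0]])
D = np.diag([1.0, 10.0])      # the inhibitor disperses much faster
g = 0.15                      # dispersal (coupling) strength

# graph Laplacian of a 4-node ring; its eigenvalues are 0, 2, 2, 4
L = 2 * np.eye(4) - np.roll(np.eye(4), 1, 0) - np.roll(np.eye(4), -1, 0)
kappas = np.sort(np.linalg.eigvalsh(L))
growth = msf(J, D, g, kappas)   # growth[0] corresponds to the uniform mode
```

Here the uniform mode (kappa = 0) is stable while an intermediate Laplacian eigenvalue destabilizes, so the homogeneous steady state gives way to a pattern on the network.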
DOE Office of Scientific and Technical Information (OSTI.GOV)
Popova, Evdokia; Rodgers, Theron M.; Gong, Xinyi
A novel data science workflow is developed and demonstrated to extract process-structure linkages (i.e., reduced-order models) for microstructure evolution problems in which the final microstructure depends on (simulation or experimental) processing parameters. Our workflow consists of four main steps: data pre-processing, microstructure quantification, dimensionality reduction, and extraction/validation of process-structure linkages. The methods that can be employed within each step vary based on the type and amount of available data. In this paper, this data-driven workflow is applied to a set of synthetic additive manufacturing microstructures obtained using the Potts-kinetic Monte Carlo (kMC) approach. Additive manufacturing techniques inherently produce complex microstructures that can vary significantly with processing conditions. Using the developed workflow, a low-dimensional data-driven model was established to correlate process parameters with the predicted final microstructure. In addition, the modular workflows developed and presented in this work facilitate easy dissemination and curation by the broader community.
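Steps three and four of such a workflow, dimensionality reduction and linkage extraction, might look like the sketch below: PCA via SVD on synthetic "microstructure statistics" vectors, followed by a low-order polynomial fit against a single process parameter. All data here is fabricated for illustration and is not the Potts-kMC dataset.

```python
import numpy as np

rng = np.random.default_rng(4)
# 30 synthetic "microstructures": 100-bin statistics-like vectors whose
# shape varies smoothly with a single process parameter p
p = np.linspace(0.0, 1.0, 30)
bins = np.linspace(-3.0, 3.0, 100)
stats = np.exp(-(bins[None, :] - p[:, None]) ** 2) \
        + 0.01 * rng.normal(size=(30, 100))

# dimensionality reduction: project onto the first two principal components
Xc = stats - stats.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T

# process-structure linkage: low-order polynomial from p to the PC scores
coeffs = np.polyfit(p, scores[:, 0], deg=2)
pred = np.polyval(coeffs, p)
r2 = 1 - np.sum((scores[:, 0] - pred) ** 2) \
        / np.sum((scores[:, 0] - scores[:, 0].mean()) ** 2)
```

The few-coefficient polynomial in PC space is the "reduced-order model": it replaces the full microstructure with a handful of numbers tied directly to the process parameter.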
Reiner, Bruce I
2017-12-01
In conventional radiology peer review practice, a small number of exams (routinely 5% of the total volume) is randomly selected, which may significantly underestimate the true error rate within a given radiology practice. An alternative and preferable approach would be to create a data-driven model which mathematically quantifies a peer review risk score for each individual exam and uses this data to identify high risk exams and readers, and selectively target these exams for peer review. An analogous model can also be created to assist in the assignment of these peer review cases in keeping with specific priorities of the service provider. An additional option to enhance the peer review process would be to assign the peer review cases in a truly blinded fashion. In addition to eliminating traditional peer review bias, this approach has the potential to better define exam-specific standard of care, particularly when multiple readers participate in the peer review process.
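A toy version of the proposed exam-level risk score; the attribute names and weights below are hypothetical, and in practice the weights would be fit to observed discrepancy data rather than set by hand.

```python
import math

# hypothetical peer-review risk score: weighted sum of exam attributes,
# squashed to [0, 1] with a logistic function
WEIGHTS = {"exam_complexity": 0.8, "reader_fatigue": 0.5,
           "off_hours": 0.4, "reader_discordance_rate": 1.2}

def risk_score(exam):
    z = sum(WEIGHTS[k] * exam.get(k, 0.0) for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-(z - 1.0)))

def select_for_review(exams, budget):
    # target the highest-risk exams instead of a 5% random sample
    return sorted(exams, key=risk_score, reverse=True)[:budget]
```

Replacing random sampling with this kind of ranked selection is the article's core proposal: the same review budget is spent where discrepancies are most likely.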
Koch, Alex; Imhoff, Roland; Dotsch, Ron; Unkelbach, Christian; Alves, Hans
2016-05-01
Previous research argued that stereotypes differ primarily on the 2 dimensions of warmth/communion and competence/agency. We identify an empirical gap in support for this notion. The theoretical model constrains stereotypes a priori to these 2 dimensions; without this constraint, participants might spontaneously employ other relevant dimensions. We fill this gap by complementing the existing theory-driven approaches with a data-driven approach that allows an estimation of the spontaneously employed dimensions of stereotyping. Seven studies (total N = 4,451) show that people organize social groups primarily based on their agency/socioeconomic success (A), and as a second dimension, based on their conservative-progressive beliefs (B). Communion (C) is not found as a dimension by its own, but rather as an emergent quality in the two-dimensional space of A and B, resulting in a 2D ABC model of stereotype content about social groups. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
Popova, Evdokia; Rodgers, Theron M.; Gong, Xinyi; ...
2017-03-13
A novel data science workflow is developed and demonstrated to extract process-structure linkages (i.e., reduced-order model) for microstructure evolution problems when the final microstructure depends on (simulation or experimental) processing parameters. Our workflow consists of four main steps: data pre-processing, microstructure quantification, dimensionality reduction, and extraction/validation of process-structure linkages. The methods that can be employed within each step vary based on the type and amount of available data. In this paper, this data-driven workflow is applied to a set of synthetic additive manufacturing microstructures obtained using the Potts-kinetic Monte Carlo (kMC) approach. Additive manufacturing techniques inherently produce complex microstructures that can vary significantly with processing conditions. Using the developed workflow, a low-dimensional data-driven model was established to correlate process parameters with the predicted final microstructure. In addition, the modular workflows developed and presented in this work facilitate easy dissemination and curation by the broader community.
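The dimensionality-reduction and linkage steps of such a workflow can be sketched in a few lines. This is a minimal, hedged illustration, not the authors' code: the microstructure statistics, the number of process parameters, and all data here are synthetic stand-ins.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
# Hypothetical stand-in data: 30 simulated microstructures, each
# quantified as a 1000-dim statistics vector, generated from 2
# processing parameters plus a little noise.
params = rng.uniform(0, 1, size=(30, 2))
basis = rng.normal(size=(2, 1000))
stats = params @ basis + 0.01 * rng.normal(size=(30, 1000))

# Dimensionality reduction: keep a few principal component scores.
pca = PCA(n_components=2).fit(stats)
scores = pca.transform(stats)

# Process-structure linkage: a reduced-order model mapping process
# parameters to the low-dimensional microstructure representation.
link = LinearRegression().fit(params, scores)
print(link.score(params, scores))  # near 1 for this synthetic example
```

In practice the quantification step (e.g., spatial statistics of the microstructure) and the choice of regression model would depend on the data, as the abstract notes.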
A Finite Element Model for Mixed Porohyperelasticity with Transport, Swelling, and Growth.
Armstrong, Michelle Hine; Buganza Tepole, Adrián; Kuhl, Ellen; Simon, Bruce R; Vande Geest, Jonathan P
2016-01-01
The purpose of this manuscript is to establish a unified theory of porohyperelasticity with transport and growth and to demonstrate the capability of this theory using a finite element model developed in MATLAB. We combine the theories of volumetric growth and mixed porohyperelasticity with transport and swelling (MPHETS) to derive a new method that models growth of biological soft tissues. The conservation equations and constitutive equations are developed for both solid-only growth and solid/fluid growth. An axisymmetric finite element framework is introduced for the new theory of growing MPHETS (GMPHETS). To illustrate the capabilities of this model, several example finite element test problems are considered using model geometry and material parameters based on experimental data from a porcine coronary artery. Multiple growth laws are considered, including time-driven, concentration-driven, and stress-driven growth. Time-driven growth is compared against an exact analytical solution to validate the model. For concentration-dependent growth, changing the diffusivity (representing a change in drug) fundamentally changes growth behavior. We further demonstrate that for stress-dependent, solid-only growth of an artery, growth of an MPHETS model results in a more uniform hoop stress than growth in a hyperelastic model for the same amount of growth time using the same growth law. This may have implications in the context of developing residual stresses in soft tissues under intraluminal pressure. To our knowledge, this manuscript provides the first full description of an MPHETS model with growth. The developed computational framework can be used in concert with novel in-vitro and in-vivo experimental approaches to identify the governing growth laws for various soft tissues.
Carey, Ryan M.; Sherwood, William Erik; Shipley, Michael T.; Borisyuk, Alla
2015-01-01
Olfaction in mammals is a dynamic process driven by the inhalation of air through the nasal cavity. Inhalation determines the temporal structure of sensory neuron responses and shapes the neural dynamics underlying central olfactory processing. Inhalation-linked bursts of activity among olfactory bulb (OB) output neurons [mitral/tufted cells (MCs)] are temporally transformed relative to those of sensory neurons. We investigated how OB circuits shape inhalation-driven dynamics in MCs using a modeling approach that was highly constrained by experimental results. First, we constructed models of canonical OB circuits that included mono- and disynaptic feedforward excitation, recurrent inhibition and feedforward inhibition of the MC. We then used experimental data to drive inputs to the models and to tune parameters; inputs were derived from sensory neuron responses during natural odorant sampling (sniffing) in awake rats, and model output was compared with recordings of MC responses to odorants sampled with the same sniff waveforms. This approach allowed us to identify OB circuit features underlying the temporal transformation of sensory inputs into inhalation-linked patterns of MC spike output. We found that realistic input-output transformations can be achieved independently by multiple circuits, including feedforward inhibition with slow onset and decay kinetics and parallel feedforward MC excitation mediated by external tufted cells. We also found that recurrent and feedforward inhibition had differential impacts on MC firing rates and on inhalation-linked response dynamics. These results highlight the importance of investigating neural circuits in a naturalistic context and provide a framework for further explorations of signal processing by OB networks. PMID:25717156
NASA Astrophysics Data System (ADS)
Sadler, J. M.; Goodall, J. L.; Morsy, M. M.; Spencer, K.
2018-04-01
Sea level rise has already caused more frequent and severe coastal flooding, and this trend will likely continue. Flood prediction is an essential part of a coastal city's capacity to adapt to and mitigate this growing problem. Complex coastal urban hydrological systems, however, do not always lend themselves easily to physically-based flood prediction approaches. This paper presents a method for using a data-driven approach to estimate flood severity in an urban coastal setting using crowd-sourced data, a non-traditional but growing data source, along with environmental observation data. Two data-driven models, Poisson regression and Random Forest regression, are trained to predict the number of flood reports per storm event as a proxy for flood severity, given extensive environmental data (i.e., rainfall, tide, groundwater table level, and wind conditions) as input. The method is demonstrated using data from Norfolk, Virginia USA from September 2010 to October 2016. Quality-controlled, crowd-sourced street flooding reports ranging from 1 to 159 per storm event for 45 storm events are used to train and evaluate the models. Random Forest performed better than Poisson regression at predicting the number of flood reports and had a lower false negative rate. From the Random Forest model, total cumulative rainfall was by far the most dominant input variable in predicting flood severity, followed by low tide and lower low tide. These methods serve as a first step toward using data-driven methods for spatially and temporally detailed coastal urban flood prediction.
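The modeling setup described here (count of flood reports regressed on environmental features, with feature importances from the forest) can be sketched as follows. This is a hedged illustration on synthetic data, not the Norfolk dataset; the feature layout and coefficients are assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import PoissonRegressor

rng = np.random.default_rng(0)
# Synthetic stand-in for the per-storm data: 4 environmental features
# (say rainfall, tide, groundwater level, wind) for 45 storm events.
n = 45
X = rng.uniform(0, 1, size=(n, 4))
# Assume report counts rise with rainfall (col 0) and tide (col 1).
y = np.round(150 * X[:, 0] * X[:, 1] + rng.poisson(2, n)).astype(int)

rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
pr = PoissonRegressor().fit(X, y)  # count-data baseline

# Forest feature importances indicate which inputs dominate the
# prediction, analogous to rainfall dominating in the study.
print(rf.feature_importances_)
```

A real application would evaluate both models on held-out storm events and compare false negative rates, as the paper does.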
Bouska, Kristen; Whitledge, Gregory W.; Lant, Christopher; Schoof, Justin
2018-01-01
Land cover is an important determinant of aquatic habitat and is projected to shift with climate changes, yet climate-driven land cover changes are rarely factored into climate assessments. To quantify impacts and uncertainty of coupled climate and land cover change on warm-water fish species’ distributions, we used an ensemble model approach to project distributions of 14 species. For each species, current range projections were compared to 27 scenario-based projections and aggregated to visualize uncertainty. Multiple regression and model selection techniques were used to identify drivers of range change. Novel, or no-analogue, climates were assessed to evaluate transferability of models. Changes in total probability of occurrence ranged widely across species, from a 63% increase to a 65% decrease. Distributional gains and losses were largely driven by temperature and flow variables and underscore the importance of habitat heterogeneity and connectivity to facilitate adaptation to changing conditions. Finally, novel climate conditions were driven by mean annual maximum temperature, which stresses the importance of understanding the role of temperature on fish physiology and the role of temperature-mitigating management practices.
Corredor, Iván; Bernardos, Ana M; Iglesias, Josué; Casar, José R
2012-01-01
Advances in electronics nowadays facilitate the design of smart spaces based on physical mash-ups of sensor and actuator devices. At the same time, software paradigms such as Internet of Things (IoT) and Web of Things (WoT) are motivating the creation of technology to support the development and deployment of web-enabled embedded sensor and actuator devices with two major objectives: (i) to integrate sensing and actuating functionalities into everyday objects, and (ii) to easily allow a diversity of devices to plug into the Internet. Currently, developers who are applying this Internet-oriented approach need to have solid understanding about specific platforms and web technologies. In order to alleviate this development process, this research proposes a Resource-Oriented and Ontology-Driven Development (ROOD) methodology based on the Model Driven Architecture (MDA). This methodology aims at enabling the development of smart spaces through a set of modeling tools and semantic technologies that support the definition of the smart space and the automatic generation of code at hardware level. ROOD feasibility is demonstrated by building an adaptive health monitoring service for a Smart Gym.
NASA Astrophysics Data System (ADS)
Nelson, Kevin; Corbin, George; Blowers, Misty
2014-05-01
Machine learning is continuing to gain popularity due to its ability to solve problems that are difficult to model using conventional computer programming logic. Much of the current and past work has focused on algorithm development, data processing, and optimization. Lately, a subset of research has emerged which explores issues related to security. This research is gaining traction as systems employing these methods are being applied to both secure and adversarial environments. One of machine learning's biggest benefits, its data-driven versus logic-driven approach, is also a weakness if the data on which the models rely are corrupted. Adversaries could maliciously influence systems which address drift and data distribution changes using re-training and online learning. Our work is focused on exploring the resilience of various machine learning algorithms to these data-driven attacks. In this paper, we present our initial findings using Monte Carlo simulations, and statistical analysis, to explore the maximal achievable shift to a classification model, as well as the required amount of control over the data.
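The kind of Monte Carlo experiment described, measuring how far an adversary controlling a fraction of the training data can shift a model, can be sketched as a label-flipping attack on a simple classifier. This is a hedged, generic illustration; the paper's specific algorithms, data, and attack model are not reproduced here.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# Clean two-class training data and a reference model.
X = np.vstack([rng.normal(-1, 1, (200, 2)), rng.normal(1, 1, (200, 2))])
y = np.array([0] * 200 + [1] * 200)
clean = LogisticRegression().fit(X, y)

# Monte Carlo trials: flip the labels of a random 10% of the data
# (the adversary's assumed level of control) and record how far the
# retrained model's coefficients shift from the clean model's.
shifts = []
for _ in range(50):
    yp = y.copy()
    idx = rng.choice(len(y), size=40, replace=False)
    yp[idx] = 1 - yp[idx]
    poisoned = LogisticRegression().fit(X, yp)
    shifts.append(np.linalg.norm(poisoned.coef_ - clean.coef_))

print(np.mean(shifts), np.max(shifts))
```

Sweeping the fraction of controlled points and summarizing the shift distribution gives an empirical estimate of the maximal achievable shift under this attack model.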
How Complex, Probable, and Predictable is Genetically Driven Red Queen Chaos?
Duarte, Jorge; Rodrigues, Carla; Januário, Cristina; Martins, Nuno; Sardanyés, Josep
2015-12-01
Coevolution between two antagonistic species has been widely studied theoretically for both ecologically- and genetically-driven Red Queen dynamics. A typical outcome of these systems is an oscillatory behavior causing an endless series of one species' adaptation and the other's counter-adaptation. More recently, a mathematical model combining a three-species food chain system with an adaptive dynamics approach revealed genetically driven chaotic Red Queen coevolution. In the present article, we analyze this mathematical model mainly focusing on the impact of species rates of evolution (mutation rates) in the dynamics. Firstly, we analytically prove the boundedness of the trajectories of the chaotic attractor. The complexity of the coupling between the dynamical variables is quantified using observability indices. By using symbolic dynamics theory, we quantify the complexity of genetically driven Red Queen chaos computing the topological entropy of existing one-dimensional iterated maps using Markov partitions. Codimension-two bifurcation diagrams are also built from the period ordering of the orbits of the maps. Then, we study the predictability of the Red Queen chaos, found in narrow regions of mutation rates. To extend the previous analyses, we also computed the likelihood of finding chaos in a given region of the parameter space varying other model parameters simultaneously. Such analyses allowed us to compute a mean predictability measure for the system in the explored region of the parameter space. We found that genetically driven Red Queen chaos, although being restricted to small regions of the analyzed parameter space, might be highly unpredictable.
Reis, Yara; Wolf, Thomas; Brors, Benedikt; Hamacher-Brady, Anne; Eils, Roland; Brady, Nathan R.
2012-01-01
Mitochondria exist as a network of interconnected organelles undergoing constant fission and fusion. Current approaches to study mitochondrial morphology are limited by low data sampling coupled with manual identification and classification of complex morphological phenotypes. Here we propose an integrated mechanistic and data-driven modeling approach to analyze heterogeneous, quantified datasets and infer relations between mitochondrial morphology and apoptotic events. We initially performed high-content, multi-parametric measurements of mitochondrial morphological, apoptotic, and energetic states by high-resolution imaging of human breast carcinoma MCF-7 cells. Subsequently, decision tree-based analysis was used to automatically classify networked, fragmented, and swollen mitochondrial subpopulations, at the single-cell level and within cell populations. Our results revealed subtle but significant differences in morphology class distributions in response to various apoptotic stimuli. Furthermore, key mitochondrial functional parameters including mitochondrial membrane potential and Bax activation, were measured under matched conditions. Data-driven fuzzy logic modeling was used to explore the non-linear relationships between mitochondrial morphology and apoptotic signaling, combining morphological and functional data as a single model. Modeling results are in accordance with previous studies, where Bax regulates mitochondrial fragmentation, and mitochondrial morphology influences mitochondrial membrane potential. In summary, we established and validated a platform for mitochondrial morphological and functional analysis that can be readily extended with additional datasets. We further discuss the benefits of a flexible systematic approach for elucidating specific and general relationships between mitochondrial morphology and apoptosis. PMID:22272225
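The automated decision-tree classification step, assigning each mitochondrion to a networked, fragmented, or swollen subpopulation from quantified morphology features, can be sketched as below. The feature names and values are hypothetical stand-ins, not the study's measurements.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(2)
# Hypothetical per-object morphology features:
# [mean branch length, connected-component count, mean width].
networked = rng.normal([5.0, 2.0, 0.5], 0.3, (100, 3))
fragmented = rng.normal([1.0, 20.0, 0.5], 0.3, (100, 3))
swollen = rng.normal([1.0, 5.0, 2.0], 0.3, (100, 3))
X = np.vstack([networked, fragmented, swollen])
y = np.array([0] * 100 + [1] * 100 + [2] * 100)  # class labels

# A shallow tree suffices when the morphology classes are well
# separated in feature space, enabling automatic classification at
# the single-cell level.
clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(clf.score(X, y))
```

In the study, class distributions from such a classifier are then aggregated per condition and related to functional readouts via fuzzy logic modeling.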
Murphy, Matthew C; Poplawsky, Alexander J; Vazquez, Alberto L; Chan, Kevin C; Kim, Seong-Gi; Fukuda, Mitsuhiro
2016-08-15
Functional MRI (fMRI) is a popular and important tool for noninvasive mapping of neural activity. As fMRI measures the hemodynamic response, the resulting activation maps do not perfectly reflect the underlying neural activity. The purpose of this work was to design a data-driven model to improve the spatial accuracy of fMRI maps in the rat olfactory bulb. This system is an ideal choice for this investigation since the bulb circuit is well characterized, allowing for an accurate definition of activity patterns in order to train the model. We generated models for both cerebral blood volume weighted (CBVw) and blood oxygen level dependent (BOLD) fMRI data. The results indicate that the spatial accuracy of the activation maps is either significantly improved or at worst not significantly different when using the learned models compared to a conventional general linear model approach, particularly for BOLD images and activity patterns involving deep layers of the bulb. Furthermore, the activation maps computed by CBVw and BOLD data show increased agreement when using the learned models, lending more confidence to their accuracy. The models presented here could have an immediate impact on studies of the olfactory bulb, but perhaps more importantly, demonstrate the potential for similar flexible, data-driven models to improve the quality of activation maps calculated using fMRI data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gang, G; Stayman, J; Ouadah, S
2015-06-15
Purpose: This work introduces a task-driven imaging framework that utilizes a patient-specific anatomical model, mathematical definition of the imaging task, and a model of the imaging system to prospectively design acquisition and reconstruction techniques that maximize task-based imaging performance. Utility of the framework is demonstrated in the joint optimization of tube current modulation and view-dependent reconstruction kernel in filtered-backprojection reconstruction and non-circular orbit design in model-based reconstruction. Methods: The system model is based on a cascaded systems analysis of cone-beam CT capable of predicting the spatially varying noise and resolution characteristics as a function of the anatomical model and a wide range of imaging parameters. Detectability index for a non-prewhitening observer model is used as the objective function in a task-driven optimization. The combination of tube current and reconstruction kernel modulation profiles was identified through an alternating optimization algorithm where tube current was updated analytically followed by a gradient-based optimization of reconstruction kernel. The non-circular orbit is first parameterized as a linear combination of basis functions and the coefficients were then optimized using an evolutionary algorithm. The task-driven strategy was compared with conventional acquisitions without modulation, using automatic exposure control, and in a circular orbit. Results: The task-driven strategy outperformed conventional techniques in all tasks investigated, improving the detectability of a spherical lesion detection task by an average of 50% in the interior of a pelvis phantom. The non-circular orbit design successfully mitigated photon starvation effects arising from a dense embolization coil in a head phantom, improving the conspicuity of an intracranial hemorrhage proximal to the coil.
Conclusion: The task-driven imaging framework leverages a knowledge of the imaging task within a patient-specific anatomical model to optimize image acquisition and reconstruction techniques, thereby improving imaging performance beyond that achievable with conventional approaches. 2R01-CA-112163; R01-EB-017226; U01-EB-018758; Siemens Healthcare (Forcheim, Germany)
Drought resilience across ecologically dominant species: An experiment-model integration approach
NASA Astrophysics Data System (ADS)
Felton, A. J.; Warren, J.; Ricciuto, D. M.; Smith, M. D.
2017-12-01
The mechanisms contributing to variability in ecosystem recovery following drought are poorly understood. Grasslands of the central U.S. are ecologically and economically important ecosystems, yet are also highly sensitive to drought. Although characteristics of these ecosystems change across gradients of temperature and precipitation, a consistent feature among these systems is the presence of highly abundant, dominant grass species that control biomass production. As a result, the incorporation of these species' traits into terrestrial biosphere models may constrain predictions amid increases in climatic variability. Here we report the results of a modeling-experiment (MODEX) research approach. We investigated the physiological, morphological, and growth responses of the dominant grass species from each of the four major grasslands of the central U.S. (ranging from tallgrass prairie to desert grassland) following severe drought. Despite significant differences in baseline values, full recovery in leaf physiological function was evident across species and was consistently driven by the production of new leaves. Further, recovery in whole-plant carbon uptake tended to be driven by shifts in allocation from belowground to aboveground structures. However, there was clear variability among species in the magnitude of this dynamic as well as in the relative allocation to stem versus leaf production. As a result, all species harbored the physiological capacity to recover from drought, yet we posit that variability in the recovery of whole-plant carbon uptake is more strongly driven by variability in the sensitivity of species' morphology to soil moisture increases. The next step of this project will be to incorporate these and other existing data on these species and ecosystems into the Community Land Model in an effort to test the sensitivity of this model to these data.
NASA Astrophysics Data System (ADS)
Shao, Xinxin; Naghdy, Fazel; Du, Haiping
2017-03-01
A fault-tolerant fuzzy H∞ control design approach for active suspension of in-wheel motor driven electric vehicles in the presence of sprung mass variation, actuator faults and control input constraints is proposed. The controller is designed based on the quarter-car active suspension model with a dynamic-damping-in-wheel-motor-driven-system, in which the suspended motor is operated as a dynamic absorber. The Takagi-Sugeno (T-S) fuzzy model is used to model this suspension with possible sprung mass variation. The parallel-distributed compensation (PDC) scheme is deployed to derive a fault-tolerant fuzzy controller for the T-S fuzzy suspension model. In order to reduce the motor wear caused by the dynamic force transmitted to the in-wheel motor, the dynamic force is taken as an additional controlled output besides the traditional optimization objectives such as sprung mass acceleration, suspension deflection and actuator saturation. The H∞ performance of the proposed controller is derived as linear matrix inequalities (LMIs) comprising three equality constraints which are solved efficiently by means of MATLAB LMI Toolbox. The proposed controller is applied to an electric vehicle suspension and its effectiveness is demonstrated through computer simulation.
Depinning and heterogeneous dynamics of colloidal crystal layers under shear flow
NASA Astrophysics Data System (ADS)
Gerloff, Sascha; Klapp, Sabine H. L.
2016-12-01
Using Brownian dynamics (BD) simulations and an analytical approach we investigate the shear-induced, nonequilibrium dynamics of dense colloidal suspensions confined to a narrow slit-pore. Focusing on situations where the colloids arrange in well-defined layers with solidlike in-plane structure, the confined films display complex, nonlinear behavior such as collective depinning and local transport via density excitations. These phenomena are reminiscent of colloidal monolayers driven over a periodic substrate potential. In order to deepen this connection, we present an effective model that maps the dynamics of the shear-driven colloidal layers to the motion of a single particle driven over an effective substrate potential. This model allows us to estimate the critical shear rate of the depinning transition based on the equilibrium configuration, revealing the impact of important parameters, such as the slit-pore width and the interaction strength. We then turn to heterogeneous systems where a layer of small colloids is sheared with respect to bottom layers of large particles. For these incommensurate systems we find that the particle transport is dominated by density excitations resembling the so-called "kink" solutions of the Frenkel-Kontorova (FK) model. In contrast to the FK model, however, the corresponding "antikinks" do not move.
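The effective single-particle picture invoked here, a particle driven over a periodic substrate potential with a depinning threshold, can be illustrated with a minimal overdamped simulation. All parameters below are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Overdamped particle driven by a constant force F over a periodic
# substrate: dx/dt = F - f_p * sin(2*pi*x/a). For F < f_p the particle
# is pinned (zero mean velocity); for F > f_p it depins and slides.
def mean_velocity(F, f_p=1.0, a=1.0, dt=1e-3, steps=50_000):
    x = 0.0
    for _ in range(steps):
        x += dt * (F - f_p * np.sin(2 * np.pi * x / a))
    return x / (steps * dt)

print(mean_velocity(F=0.5))  # below threshold: pinned, velocity near 0
print(mean_velocity(F=1.5))  # above threshold: finite sliding velocity
```

The paper's effective model plays an analogous role: it lets the critical shear rate of the depinning transition be estimated from the equilibrium configuration of the confined layers.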
Shlizerman, Eli; Riffell, Jeffrey A.; Kutz, J. Nathan
2014-01-01
The antennal lobe (AL), the olfactory processing center in insects, is able to process stimuli into distinct neural activity patterns, called olfactory neural codes. To model their dynamics we perform multichannel recordings from the projection neurons in the AL driven by different odorants. We then derive a dynamic neuronal network from the electrophysiological data. The network consists of lateral-inhibitory neurons and excitatory neurons (modeled as firing-rate units), and is capable of producing unique olfactory neural codes for the tested odorants. To construct the network, we (1) design a projection, an odor space, for the neural recordings from the AL that discriminates between distinct odorant trajectories; (2) characterize scent recognition, i.e., decision-making based on olfactory signals; and (3) infer the wiring of the neural circuit, the connectome of the AL. We show that the constructed model is consistent with biological observations, such as contrast enhancement and robustness to noise. The study suggests a data-driven approach to answer a key biological question in identifying how lateral inhibitory neurons can be wired to excitatory neurons to permit robust activity patterns. PMID:25165442
Network Model-Assisted Inference from Respondent-Driven Sampling Data
Gile, Krista J.; Handcock, Mark S.
2015-01-01
Respondent-Driven Sampling is a widely-used method for sampling hard-to-reach human populations by link-tracing over their social networks. Inference from such data requires specialized techniques because the sampling process is both partially beyond the control of the researcher, and partially implicitly defined. Therefore, it is not generally possible to directly compute the sampling weights for traditional design-based inference, and likelihood inference requires modeling the complex sampling process. As an alternative, we introduce a model-assisted approach, resulting in a design-based estimator leveraging a working network model. We derive a new class of estimators for population means and a corresponding bootstrap standard error estimator. We demonstrate improved performance compared to existing estimators, including adjustment for an initial convenience sample. We also apply the method and an extension to the estimation of HIV prevalence in a high-risk population. PMID:26640328
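For context, a classical design-based baseline that the model-assisted approach refines is the inverse-degree-weighted mean (the Volz-Heckathorn estimator), which treats each respondent's sampling probability as roughly proportional to their reported network degree. The sketch below uses synthetic data and is not the paper's estimator.

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical RDS-style sample: each of 500 respondents reports a
# network degree and a binary outcome (e.g., HIV status).
degree = rng.integers(1, 20, size=500)
outcome = rng.binomial(1, 0.2, size=500)

# Volz-Heckathorn baseline: weight each respondent by 1/degree to
# offset the higher inclusion probability of well-connected people.
weights = 1.0 / degree
vh_estimate = np.sum(weights * outcome) / np.sum(weights)
print(vh_estimate)
```

The model-assisted estimator in the paper replaces these heuristic weights with ones derived from a working network model fit to the observed link-tracing process.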
A generative tool for building health applications driven by ISO 13606 archetypes.
Menárguez-Tortosa, Marcos; Martínez-Costa, Catalina; Fernández-Breis, Jesualdo Tomás
2012-10-01
The use of Electronic Healthcare Records (EHR) standards in the development of healthcare applications is crucial for achieving the semantic interoperability of clinical information. Advanced EHR standards make use of the dual model architecture, which provides a solution for clinical interoperability based on the separation of the information and knowledge. However, the impact of such standards is biased by the limited availability of tools that facilitate their usage and practical implementation. In this paper, we present an approach for the automatic generation of clinical applications for the ISO 13606 EHR standard, which is based on the dual model architecture. This generator has been generically designed, so it can be easily adapted to other dual model standards and can generate applications for multiple technological platforms. Such good properties are based on the combination of standards for the representation of generic user interfaces and model-driven engineering techniques.
Data-driven outbreak forecasting with a simple nonlinear growth model.
Lega, Joceline; Brown, Heidi E
2016-12-01
Recent events have thrown the spotlight on infectious disease outbreak response. We developed a data-driven method, EpiGro, which can be applied to cumulative case reports to estimate the order of magnitude of the duration, peak and ultimate size of an ongoing outbreak. It is based on a surprisingly simple mathematical property of many epidemiological data sets, does not require knowledge or estimation of disease transmission parameters, is robust to noise and to small data sets, and runs quickly due to its mathematical simplicity. Using data from historic and ongoing epidemics, we present the model. We also provide modeling considerations that justify this approach and discuss its limitations. In the absence of other information or in conjunction with other models, EpiGro may be useful to public health responders.
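One simple property in this spirit (a hedged sketch of the idea, not the authors' implementation) is that for logistic-type outbreaks, incidence plotted against cumulative cases traces an inverted parabola, dC/dt = r·C·(1 - C/K), so fitting that parabola to case reports yields the final size K and the peak (at C = K/2) without any transmission parameters. The data below are synthetic.

```python
import numpy as np

# Synthetic cumulative case counts from a logistic outbreak.
t = np.arange(60)
K_true, r_true = 1000.0, 0.3
C = K_true / (1 + np.exp(-r_true * (t - 30)))

inc = np.diff(C)              # daily incidence
mid = (C[:-1] + C[1:]) / 2    # cumulative cases at interval midpoints

# Least-squares fit of inc = a*C + b*C**2, where a = r and b = -r/K.
A = np.vstack([mid, mid**2]).T
(a, b), *_ = np.linalg.lstsq(A, inc, rcond=None)

K_est = -a / b  # estimated final outbreak size, close to 1000 here
print(K_est)
```

On noisy real data the same quadratic fit gives order-of-magnitude estimates of the outbreak's size and peak timing while it is still ongoing.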
Random Matrix Approach to Quantum Adiabatic Evolution Algorithms
NASA Technical Reports Server (NTRS)
Boulatov, Alexei; Smelyanskiy, Vadier N.
2004-01-01
We analyze the power of quantum adiabatic evolution algorithms (QAEA) for solving random NP-hard optimization problems within a theoretical framework based on random matrix theory (RMT). We present two types of driven RMT models. In the first model, the driving Hamiltonian is represented by Brownian motion in the matrix space. We use the Brownian motion model to obtain a description of multiple avoided-crossing phenomena. We show that the failure mechanism of the QAEA is due to the interaction of the ground state with the "cloud" formed by all the excited states, confirming that in the driven RMT models the Landau-Zener mechanism of dissipation is not important. We show that the QAEA has a finite probability of success in a certain range of parameters, implying the polynomial complexity of the algorithm. The second model corresponds to the standard QAEA with the problem Hamiltonian taken from the Gaussian Unitary Ensemble (GUE). We show that the level dynamics in this model can be mapped onto the dynamics in the Brownian motion model. However, the driven RMT model always leads to the exponential complexity of the algorithm due to the presence of long-range intertemporal correlations of the eigenvalues. Our results indicate that the weakness of effective transitions is the leading effect that can make the Markovian-type QAEA successful.
Modeling Particle Acceleration and Transport at a 2-D CME-Driven Shock
NASA Astrophysics Data System (ADS)
Hu, Junxiang; Li, Gang; Ao, Xianzhi; Zank, Gary P.; Verkhoglyadova, Olga
2017-11-01
We extend our earlier Particle Acceleration and Transport in the Heliosphere (PATH) model to study particle acceleration and transport at a coronal mass ejection (CME)-driven shock. We model the propagation of a CME-driven shock in the ecliptic plane using the ZEUS-3D code from 20 solar radii to 2 AU. As in the previous PATH model, the initiation of the CME-driven shock is simplified and modeled as a disturbance at the inner boundary. Different from the earlier PATH model, the disturbance is now longitudinally dependent. Particles are accelerated at the 2-D shock via the diffusive shock acceleration mechanism. The acceleration depends on both the parallel and perpendicular diffusion coefficients κ|| and κ⊥ and is therefore shock-obliquity dependent. Following the procedure used in Li, Shalchi, et al., we obtain the particle injection energy, the maximum energy, and the accelerated particle spectra at the shock front. Once accelerated, particles diffuse and convect in the shock complex. The diffusion and convection of these particles are treated using a refined 2-D shell model in an approach similar to Zank et al. When particles escape from the shock, they propagate along and across the interplanetary magnetic field. The propagation is modeled using a focused transport equation with the addition of perpendicular diffusion. We solve the transport equation using a backward stochastic differential equation method where adiabatic cooling, focusing, pitch angle scattering, and cross-field diffusion effects are all included. Time intensity profiles and instantaneous particle spectra as well as particle pitch angle distributions are shown for two example CME shocks.
A Consumer-Driven Approach To Increase Suggestive Selling.
ERIC Educational Resources Information Center
Rohn, Don; Austin, John; Sanford, Alison
2003-01-01
Discussion of the effectiveness of behavioral interventions in improving suggestive selling behavior of sales staff focuses on a study that examined the efficacy of a consumer-driven approach to improve suggestive selling behavior of three employees of a fast food franchise. Reports that consumer-driven intervention increased suggestive selling…
Dynamic Emulation Modelling (DEMo) of large physically-based environmental models
NASA Astrophysics Data System (ADS)
Galelli, S.; Castelletti, A.
2012-12-01
In environmental modelling, large, spatially-distributed, physically-based models are widely adopted to describe the dynamics of physical, social and economic processes. Such an accurate process characterization comes, however, at a price: the computational requirements of these models are considerably high and prevent their use in any problem requiring hundreds or thousands of model runs to be satisfactorily solved. Typical examples include optimal planning and management, data assimilation, inverse modelling and sensitivity analysis. An effective approach to overcome this limitation is to perform a top-down reduction of the physically-based model by identifying a simplified, computationally efficient emulator, constructed from and then used in place of the original model in highly resource-demanding tasks. The underlying idea is that not all the process details in the original model are equally important and relevant to the dynamics of the outputs of interest for the type of problem considered. Emulation modelling has been successfully applied in many environmental applications; however, most of the literature considers non-dynamic emulators (e.g. metamodels, response surfaces and surrogate models), where the original dynamical model is reduced to a static map between the input and the output of interest. In this study we focus on Dynamic Emulation Modelling (DEMo), a methodological approach that preserves the dynamic nature of the original physically-based model, with consequent advantages in a wide variety of problem areas. In particular, we propose a new data-driven DEMo approach that combines the many advantages of data-driven modelling in representing complex, non-linear relationships, while preserving the state-space representation typical of process-based models, which is both particularly effective in some applications (e.g. optimal management and data assimilation) and facilitates the ex-post physical interpretation of the emulator structure, thus enhancing the credibility of the model to stakeholders and decision-makers. Numerical results from the application of the approach to the reduction of 3D coupled hydrodynamic-ecological models in several real-world case studies, including Marina Reservoir (Singapore) and Googong Reservoir (Australia), are illustrated.
Mass imbalances in EPANET water-quality simulations
NASA Astrophysics Data System (ADS)
Davis, Michael J.; Janke, Robert; Taxon, Thomas N.
2018-04-01
EPANET is widely employed to simulate water quality in water distribution systems. However, in general, the time-driven simulation approach used to determine concentrations of water-quality constituents provides accurate results only for short water-quality time steps. Overly long time steps can yield errors in concentration estimates and can result in situations in which constituent mass is not conserved. The use of a time step that is sufficiently short to avoid these problems may not always be feasible. The absence of EPANET errors or warnings does not ensure conservation of mass. This paper provides examples illustrating mass imbalances and explains how such imbalances can occur because of fundamental limitations in the water-quality routing algorithm used in EPANET. In general, these limitations cannot be overcome by the use of improved water-quality modeling practices. This paper also presents a preliminary event-driven approach that conserves mass with a water-quality time step that is as long as the hydraulic time step. Results obtained using the current approach converge, or tend to converge, toward those obtained using the preliminary event-driven approach as the water-quality time step decreases. Improving the water-quality routing algorithm used in EPANET could eliminate mass imbalances and related errors in estimated concentrations. The results presented in this paper should be of value to those who perform water-quality simulations using EPANET or use the results of such simulations, including utility managers and engineers.
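The mass-imbalance issue can be reproduced in a toy plug-flow pipe: event-driven bookkeeping of parcel boundaries conserves tracer mass exactly, while sampling the outlet concentration on a fixed water-quality time step does not. The numbers below are hypothetical and the routing is deliberately simplified; this is not EPANET's actual algorithm.

```python
# Toy illustration (not EPANET's algorithm): a 30 s pulse of tracer at
# 1 mg/L enters a pipe with a 100 s travel time and a flow of 1 L/s.
travel, flow = 100.0, 1.0
pulse_start, pulse_end, conc = 0.0, 30.0, 1.0
mass_in = conc * flow * (pulse_end - pulse_start)          # 30 mg

# Event-driven: the parcel leaves the pipe over [100 s, 130 s); integrating
# concentration over the parcel's exact boundaries conserves mass.
mass_event = conc * flow * ((pulse_end + travel) - (pulse_start + travel))

def outlet_conc(t):
    # Plug flow: the inlet pulse reappears at the outlet, shifted by `travel`.
    return conc if pulse_start + travel <= t < pulse_end + travel else 0.0

# Time-driven: sample the outlet concentration every dt seconds and sum
# C(t_k) * Q * dt. A step that does not line up with the pulse edges
# misestimates the delivered mass.
dt = 7.0
mass_time = sum(outlet_conc(k * dt) * flow * dt for k in range(int(200.0 / dt) + 1))

print(mass_in, mass_event, mass_time)   # 30.0 30.0 28.0
```

With dt = 7 s only four samples fall inside the 30 s outlet pulse, so the time-driven tally reports 28 mg for 30 mg injected, with no error or warning raised anywhere; shrinking dt makes the estimate converge toward the event-driven value, mirroring the convergence behavior reported above.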
Agile IT: Thinking in User-Centric Models
NASA Astrophysics Data System (ADS)
Margaria, Tiziana; Steffen, Bernhard
We advocate a new teaching direction for modern CS curricula: extreme model-driven development (XMDD), a new development paradigm designed to continuously involve the customer/application expert throughout the whole system's life cycle. Based on the `One-Thing Approach', which works by successively enriching and refining one single artifact, system development becomes in essence a user-centric orchestration of intuitive service functionality. XMDD differs radically from classical software development, which, in our opinion, is no longer adequate for the bulk of application programming - in particular when it comes to heterogeneous, cross-organizational systems which must adapt to rapidly changing market requirements. Thus there is a need for new curricula addressing this model-driven, lightweight, and cooperative development paradigm that puts the user process at the center of development and the application expert in control of process evolution.
2014-04-25
EA’s Java application programming interface (API), the team built a tool called OWL2EA that can ingest an OWL file and generate the corresponding UML...ObjectItemStructure specification shown in Figure 10. Running this script in the relational database server MySQL creates the physical schema that
Kamesh, Reddi; Rani, Kalipatnapu Yamuna
2017-12-01
In this paper, a novel formulation for nonlinear model predictive control (MPC) has been proposed incorporating the extended Kalman filter (EKF) control concept using a purely data-driven artificial neural network (ANN) model based on measurements for supervisory control. The proposed scheme consists of two modules focusing on online parameter estimation based on past measurements and control estimation over control horizon based on minimizing the deviation of model output predictions from set points along the prediction horizon. An industrial case study for temperature control of a multiproduct semibatch polymerization reactor posed as a challenge problem has been considered as a test bed to apply the proposed ANN-EKFMPC strategy at supervisory level as a cascade control configuration along with proportional integral controller [ANN-EKFMPC with PI (ANN-EKFMPC-PI)]. The proposed approach is formulated incorporating all aspects of MPC including move suppression factor for control effort minimization and constraint-handling capability including terminal constraints. The nominal stability analysis and offset-free tracking capabilities of the proposed controller are proved. Its performance is evaluated by comparison with a standard MPC-based cascade control approach using the same adaptive ANN model. The ANN-EKFMPC-PI control configuration has shown better controller performance in terms of temperature tracking, smoother input profiles, as well as constraint-handling ability compared with the ANN-MPC with PI approach for two products in summer and winter. The proposed scheme is found to be versatile although it is based on a purely data-driven model with online parameter estimation.
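The receding-horizon idea underlying any MPC scheme, including the move-suppression penalty mentioned above, can be sketched with a first-order linear model standing in for the paper's ANN; the model coefficients, setpoint and weight below are invented for illustration.

```python
# Minimal receding-horizon sketch of the MPC idea (generic; not the
# paper's ANN-EKF formulation): a first-order plant, one control move
# optimized per step, with a move-suppression penalty lam*du^2.
a, b = 0.9, 0.1           # hypothetical linear model: y[k+1] = a*y[k] + b*u[k]
setpoint, lam = 50.0, 0.01
y, u = 20.0, 0.0
for _ in range(100):
    # Choose du minimizing (a*y + b*(u + du) - setpoint)^2 + lam*du^2.
    # The objective is quadratic in du, so the minimizer is closed-form.
    du = b * (setpoint - a * y - b * u) / (b * b + lam)
    u += du
    y = a * y + b * u     # apply the move to the "plant" (here, the model itself)
print(round(y, 2))        # settles at the setpoint
```

Increasing lam trades tracking speed for smoother input profiles, which is exactly the role move suppression plays in the controller comparison described above.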
Model-driven approach to data collection and reporting for quality improvement.
Curcin, Vasa; Woodcock, Thomas; Poots, Alan J; Majeed, Azeem; Bell, Derek
2014-12-01
Continuous data collection and analysis have been shown to be essential to achieving improvement in healthcare. However, the data required for local improvement initiatives are often not readily available from hospital Electronic Health Record (EHR) systems or are not routinely collected. Furthermore, improvement teams are often restricted in time and funding, thus requiring inexpensive and rapid tools to support their work. Hence, the informatics challenge in healthcare local improvement initiatives consists of providing a mechanism for rapid modelling of the local domain by non-informatics experts, including performance metric definitions, that is grounded in established improvement techniques. We investigate the feasibility of a model-driven software approach to address this challenge, whereby an improvement model designed by a team is used to automatically generate the required electronic data collection instruments and reporting tools. To that end, we have designed a generic Improvement Data Model (IDM) to capture the data items and quality measures relevant to the project, and constructed Web Improvement Support in Healthcare (WISH), a prototype tool that takes user-generated IDM models and creates a data schema, data collection web interfaces, and a set of live reports, based on Statistical Process Control (SPC), for use by improvement teams. The software has been successfully used in over 50 improvement projects, with more than 700 users. We present in detail the experiences of one of those initiatives, the Chronic Obstructive Pulmonary Disease project in Northwest London hospitals. The specific challenges of improvement in healthcare are analysed and the benefits and limitations of the approach are discussed. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Othman, M. F.; Kurniawan, R.; Schramm, D.; Ariffin, A. K.
2018-05-01
Modeling a cable whose length, mass and stiffness vary dynamically in a multibody dynamics simulation tool is a challenging task. Simulation of cable-driven parallel robots (CDPR), for instance, requires a cable model that can dynamically change in length for every desired pose of the platform. Thus, in this paper, a detailed procedure for modeling and simulation of a dynamic cable model in Dymola is proposed. The approach is also applicable to other Modelica simulation environments. The cable is modeled using standard mechanical elements such as mass, spring, damper and joint. The parameters of the cable model are based on the manufacturer's factsheet and experimental results. Its dynamic ability is tested by applying it to a complete planar CDPR model whose parameters are based on a prototype named CABLAR, developed at the Chair of Mechatronics, University of Duisburg-Essen. The prototype demonstrates an application of CDPR as a goods storage and retrieval machine. The performance of the cable model during the simulation is analyzed and discussed.
Modeling of damage driven fracture failure of fiber post-restored teeth.
Xu, Binting; Wang, Yining; Li, Qing
2015-09-01
Mechanical failure of biomaterials, which can be initiated either by violent force or by progressive stress fatigue, is a serious issue. Great efforts have been made to improve the mechanical performance of dental restorations. Virtual simulation is a promising approach for biomechanical investigations, offering significant efficiency advantages over traditional in vivo/in vitro studies. Over the past few decades, a number of virtual studies have investigated biomechanical issues concerning dental biomaterials, but only with limited incorporation of brittle failure phenomena. Motivated by the contradictory findings between several finite element analyses and common clinical observations on the fracture resistance of post-restored teeth, this study aimed to provide an approach using numerical simulations for investigating the fracture failure process through a non-linear fracture mechanics model. The ability of this approach to predict fracture initiation and propagation in a complex biomechanical state based on intrinsic material properties was investigated. Results of the virtual simulations matched the findings of experimental tests in terms of ultimate fracture failure strengths and predicted areas at risk of clinical failure. This study revealed that the failure of post-restored teeth is a typical damage-driven continuum-to-discrete process. This approach is anticipated to have ramifications not only for modeling fracture events, but also for the design and optimization of the mechanical properties of biomaterials for specific clinically determined requirements. Copyright © 2015 Elsevier Ltd. All rights reserved.
Activity-Centered Domain Characterization for Problem-Driven Scientific Visualization
Marai, G. Elisabeta
2018-01-01
Although visualization design models exist in the literature in the form of higher-level methodological frameworks, these models do not present a clear methodological prescription for the domain characterization step. This work presents a framework and end-to-end model for requirements engineering in problem-driven visualization application design. The framework and model are based on the activity-centered design paradigm, which is an enhancement of human-centered design. The proposed activity-centered approach focuses on user tasks and activities, and allows an explicit link between the requirements engineering process and the abstraction stage—and its evaluation—of existing, higher-level visualization design models. In a departure from existing visualization design models, the resulting model: assigns value to a visualization based on user activities; ranks user tasks before the user data; partitions requirements into activity-related capabilities and nonfunctional characteristics and constraints; and explicitly incorporates the user workflows into the requirements process. A further merit of this model is its explicit integration of functional specifications, a concept this work adapts from the software engineering literature, into the visualization design nested model. A quantitative evaluation using two sets of interdisciplinary projects supports the merits of the activity-centered model. The result is a practical roadmap to the domain characterization step of visualization design for problem-driven data visualization. Following this domain characterization model can help remove a number of pitfalls that have been identified multiple times in the visualization design literature. PMID:28866550
Simple Kinematic Pathway Approach (KPA) to Catchment-scale Travel Time and Water Age Distributions
NASA Astrophysics Data System (ADS)
Soltani, S. S.; Cvetkovic, V.; Destouni, G.
2017-12-01
The distribution of catchment-scale water travel times is strongly influenced by morphological dispersion and is partitioned between hillslope and larger, regional scales. We explore whether hillslope travel times are predictable using a simple semi-analytical "kinematic pathway approach" (KPA) that accounts for two levels of dispersion: morphological dispersion and macro-dispersion. The study gives new insights into shallow (hillslope) and deep (regional) groundwater travel times by comparing numerical simulations of travel time distributions, referred to as the "dynamic model", with corresponding KPA computations for three real catchment case studies in Sweden. KPA uses basic structural and hydrological data to compute transient water travel time (forward mode) and age (backward mode) distributions at the catchment outlet. Longitudinal and morphological dispersion components are reflected in KPA computations by assuming an effective Peclet number and topographically driven pathway length distributions, respectively. Numerical simulations of advective travel times are obtained by means of particle tracking using the fully-integrated flow model MIKE SHE. The comparison of computed cumulative distribution functions of travel times shows a significant influence of morphological dispersion and groundwater recharge rate on the compatibility of the "kinematic pathway" and "dynamic" models. Zones of high recharge rate in the "dynamic" models are associated with topographically driven groundwater flow paths to adjacent discharge zones, e.g. rivers and lakes, through relatively shallow pathway compartments. These zones exhibit more compatible behavior between the "dynamic" and "kinematic pathway" models than zones of low recharge rate. Interestingly, the travel time distributions of hillslope compartments remain almost unchanged with increasing recharge rates in the "dynamic" models.
This robust "dynamic" model behavior suggests that flow path lengths and travel times in shallow hillslope compartments are controlled by topography, and therefore application and further development of the simple "kinematic pathway" approach is promising for their modeling.
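A kinematic-pathway-style computation can be sketched by Monte Carlo: draw pathway lengths from an assumed topography-driven distribution (morphological dispersion) and, for each path, an inverse-Gaussian travel time whose spread is set by an effective Peclet number (longitudinal dispersion). All distributions and parameter values below are illustrative assumptions, not the catchment data of the study.

```python
import math, random

random.seed(11)

def sample_inverse_gaussian(mu, lam):
    """Michael-Schucany-Haas sampler for the inverse-Gaussian (Wald) law."""
    y = random.gauss(0.0, 1.0) ** 2
    x = mu + mu * mu * y / (2 * lam) \
        - (mu / (2 * lam)) * math.sqrt(4 * lam * mu * y + (mu * y) ** 2)
    return x if random.random() <= mu / (mu + x) else mu * mu / x

# Illustrative inputs: exponential path-length distribution with mean 500 m
# (morphological dispersion) and advective velocity 1 m/d. Longitudinal
# dispersion enters through an effective Peclet number Pe: the travel time
# along a path of length L is inverse-Gaussian with mean tau = L/v and
# shape Pe*tau/2 (variance 2*tau^2/Pe).
v, mean_length, Pe = 1.0, 500.0, 10.0
times = []
for _ in range(50000):
    tau = random.expovariate(1.0 / mean_length) / v
    times.append(sample_inverse_gaussian(tau, Pe * tau / 2.0))

mean_tt = sum(times) / len(times)
median_tt = sorted(times)[len(times) // 2]
print(mean_tt, median_tt)   # mean travel time ~ mean_length / v; median is smaller
```

The resulting sample can be turned directly into the cumulative distribution functions compared against the "dynamic" model above; the right-skew (median well below the mean) is the signature of the dispersive pathway mixture.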
A zebrafish model of chordoma initiated by notochord-driven expression of HRASV12.
Burger, Alexa; Vasilyev, Aleksandr; Tomar, Ritu; Selig, Martin K; Nielsen, G Petur; Peterson, Randall T; Drummond, Iain A; Haber, Daniel A
2014-07-01
Chordoma is a malignant tumor thought to arise from remnants of the embryonic notochord, originating in the bones of the axial skeleton. Surgical resection is the standard treatment, usually in combination with radiation therapy, but neither chemotherapeutic nor targeted therapeutic approaches have demonstrated success. No animal model and only a few chordoma cell lines are available for preclinical drug testing, and, although no druggable genetic drivers have been identified, activation of EGFR and downstream AKT-PI3K pathways has been described. Here, we report a zebrafish model of chordoma, based on stable transgene-driven expression of HRASV12 in notochord cells during development. Extensive intra-notochordal tumor formation is evident within days of transgene expression, ultimately leading to larval death. The zebrafish tumors share characteristics of human chordoma, as demonstrated by immunohistochemistry and electron microscopy. The mTORC1 inhibitor rapamycin, which has some demonstrated activity in a chordoma cell line, delays the onset of tumor formation in our zebrafish model and improves survival of tumor-bearing fish. Consequently, the HRASV12-driven zebrafish model of chordoma could enable high-throughput screening of potential therapeutic agents for the treatment of this refractory cancer. © 2014. Published by The Company of Biologists Ltd.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gastelum, Zoe N.; Whitney, Paul D.; White, Amanda M.
2013-07-15
Pacific Northwest National Laboratory has spent several years researching, developing, and validating large Bayesian network models to support integration of open source data sets for nuclear proliferation research. Our current work focuses on generating a set of interrelated models for multi-source assessment of nuclear programs, as opposed to a single comprehensive model. By using this approach, we can break down the models to cover logical sub-problems that can utilize different expertise and data sources. This approach allows researchers to utilize the models individually or in combination to detect and characterize a nuclear program and identify data gaps. The models operate at various levels of granularity, covering a combination of state-level assessments with more detailed models of site or facility characteristics. This paper will describe the current open source-driven, nuclear nonproliferation models under development, the pros and cons of the analytical approach, and areas for additional research.
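At its core, each such model combines indicator likelihoods into a posterior assessment. A deliberately tiny two-indicator sketch of Bayesian updating by enumeration follows; the probabilities and indicator names are invented for illustration and bear no relation to the actual PNNL models.

```python
# Minimal two-indicator Bayesian-network sketch (hypothetical numbers):
# infer P(program | observed indicators) by enumerating both hypotheses.
p_program = 0.1   # prior probability of a program

# P(indicator observed | program), P(indicator observed | no program),
# with the indicators assumed conditionally independent given the hypothesis.
p_ind = {
    "procurement": (0.7, 0.2),
    "publications": (0.6, 0.3),
}
observed = {"procurement": True, "publications": False}

def likelihood(program):
    l = 1.0
    for name, (p_yes, p_no) in p_ind.items():
        p = p_yes if program else p_no
        l *= p if observed[name] else (1.0 - p)
    return l

num = p_program * likelihood(True)
den = num + (1.0 - p_program) * likelihood(False)
posterior = num / den
print(round(posterior, 3))   # ~0.182: the positive indicator outweighs the negative one
```

Chaining many such sub-models, each covering a logical sub-problem at its own granularity, is the modular structure the paragraph above describes; a missing entry in `observed` would likewise flag a data gap.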
Developing a habitat-driven approach to CWWT design
Sartoris, James J.; Thullen, Joan S.
1998-01-01
A habitat-driven approach to CWWT design is defined as designing the constructed wetland to maximize habitat values for a given site within the constraints of meeting specified treatment criteria. This is in contrast to the more typical approach of designing the CWWT to maximize treatment efficiency, and then, perhaps, adding wildlife habitat features. The habitat-driven approach is advocated for two reasons: (1) because good wetland habitat is critically lacking, and (2) because it is hypothesized that well-designed habitat will result in good, sustainable wastewater treatment.
Bustamante, Carlos D.; Valero-Cuevas, Francisco J.
2010-01-01
The field of complex biomechanical modeling has begun to rely on Monte Carlo techniques to investigate the effects of parameter variability and measurement uncertainty on model outputs, search for optimal parameter combinations, and define model limitations. However, advanced stochastic methods to perform data-driven explorations, such as Markov chain Monte Carlo (MCMC), become necessary as the number of model parameters increases. Here, we demonstrate the feasibility of, and what is to our knowledge the first use of, an MCMC approach to improve the fitness of realistically large biomechanical models. We used a Metropolis–Hastings algorithm to search increasingly complex parameter landscapes (3, 8, 24, and 36 dimensions) to uncover underlying distributions of anatomical parameters of a “truth model” of the human thumb on the basis of simulated kinematic data (thumbnail location, orientation, and linear and angular velocities) polluted by zero-mean, uncorrelated multivariate Gaussian “measurement noise.” Driven by these data, ten Markov chains searched each model parameter space for the subspace that best fit the data (posterior distribution). As expected, the convergence time increased, more local minima were found, and marginal distributions broadened as the parameter space complexity increased. In the 36-D scenario, some chains found local minima but the majority of chains converged to the true posterior distribution (confirmed using a cross-validation dataset), thus demonstrating the feasibility and utility of these methods for realistically large biomechanical problems. PMID:19272906
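A minimal Metropolis–Hastings loop of the kind described can be shown for a one-dimensional parameter with a Gaussian likelihood, rather than the 36-D biomechanical model; the data and tuning constants below are invented for illustration.

```python
import math, random

random.seed(1)

# Simulated "measurements": 50 draws from N(mu_true, 1).
mu_true = 2.0
data = [random.gauss(mu_true, 1.0) for _ in range(50)]
xbar = sum(data) / len(data)

def log_post(mu):
    # Flat prior; Gaussian likelihood with unit variance.
    return -0.5 * sum((x - mu) ** 2 for x in data)

# Metropolis-Hastings with a symmetric Gaussian random-walk proposal:
# accept with probability min(1, posterior_ratio).
chain, mu = [], 0.0
lp = log_post(mu)
for _ in range(20000):
    prop = mu + random.gauss(0.0, 0.5)
    lp_prop = log_post(prop)
    if math.log(random.random()) < lp_prop - lp:
        mu, lp = prop, lp_prop
    chain.append(mu)

# Discard burn-in; the retained samples approximate the posterior N(xbar, 1/n).
post_mean = sum(chain[5000:]) / len(chain[5000:])
print(post_mean)   # near the sample mean of the data
```

Running several independent chains from dispersed starting points, as the study does, is the standard check that the sampler has found the posterior rather than a local mode.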
Insights into Departure Intention: A Qualitative Case Study
ERIC Educational Resources Information Center
Natoli, Riccardo; Jackling, Beverley; Siddique, Salina
2015-01-01
Efforts to address attrition rates at universities have been driven by Tinto's (1975) model of student engagement with its focus on student: (a) pre-entry attributes; (b) academic engagement; and (c) social engagement. Using an ethnographic approach, the study involves interviews with business students to explore the links between these aspects…
NASA Astrophysics Data System (ADS)
González, D. L., II; Angus, M. P.; Tetteh, I. K.; Bello, G. A.; Padmanabhan, K.; Pendse, S. V.; Srinivas, S.; Yu, J.; Semazzi, F.; Kumar, V.; Samatova, N. F.
2014-04-01
Decades of hypothesis-driven and/or first-principles research have been applied towards the discovery and explanation of the mechanisms that drive climate phenomena, such as western African Sahel summer rainfall variability. Although connections between various climate factors have been theorized, not all of the key relationships are fully understood. We propose a data-driven approach to identify candidate players in this climate system, which can help explain underlying mechanisms and/or even suggest new relationships, to facilitate building a more comprehensive and predictive model of the modulatory relationships influencing a climate phenomenon of interest. We applied coupled heterogeneous association rule mining (CHARM), Lasso multivariate regression, and Dynamic Bayesian networks to find relationships within a complex system, and explored means with which to obtain a consensus result from the application of such varied methodologies. Using this fusion of approaches, we identified relationships among climate factors that modulate Sahel rainfall, including well-known associations from prior climate knowledge, as well as promising discoveries that invite further research by the climate science community.
The dynamics of information-driven coordination phenomena: A transfer entropy analysis
Borge-Holthoefer, Javier; Perra, Nicola; Gonçalves, Bruno; González-Bailón, Sandra; Arenas, Alex; Moreno, Yamir; Vespignani, Alessandro
2016-01-01
Data from social media provide unprecedented opportunities to investigate the processes that govern the dynamics of collective social phenomena. We consider an information theoretical approach to define and measure the temporal and structural signatures typical of collective social events as they arise and gain prominence. We use the symbolic transfer entropy analysis of microblogging time series to extract directed networks of influence among geolocalized subunits in social systems. This methodology captures the emergence of system-level dynamics close to the onset of socially relevant collective phenomena. The framework is validated against a detailed empirical analysis of five case studies. In particular, we identify a change in the characteristic time scale of the information transfer that flags the onset of information-driven collective phenomena. Furthermore, our approach identifies an order-disorder transition in the directed network of influence between social subunits. In the absence of clear exogenous driving, social collective phenomena can be represented as endogenously driven structural transitions of the information transfer network. This study provides results that can help define models and predictive algorithms for the analysis of societal events based on open source data. PMID:27051875
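A symbolic transfer entropy estimate of the sort described can be sketched by mapping each series to ordinal (permutation) patterns and counting symbol transitions. The coupled pair below is synthetic, with X driving Y by construction, so the directionality the method should recover is known in advance.

```python
import math, random

random.seed(7)

# Two coupled series: y copies x with a one-step lag (X drives Y).
n = 6000
x = [random.gauss(0, 1) for _ in range(n)]
y = [0.0] + [0.8 * x[t - 1] + 0.2 * random.gauss(0, 1) for t in range(1, n)]

def symbolize(z, m=3):
    # Map each length-m window to its ordinal (permutation) pattern.
    return [tuple(sorted(range(m), key=lambda k: z[t + k]))
            for t in range(len(z) - m + 1)]

def transfer_entropy(src, dst):
    # TE = sum p(d+, d, s) * log[ p(d+ | d, s) / p(d+ | d) ], in nats,
    # estimated from plug-in counts over the symbol sequences.
    s, d = symbolize(src), symbolize(dst)
    joint, pair, cond, marg = {}, {}, {}, {}
    for t in range(len(d) - 1):
        for tbl, k in ((joint, (d[t + 1], d[t], s[t])), (pair, (d[t], s[t])),
                       (cond, (d[t + 1], d[t])), (marg, (d[t],))):
            tbl[k] = tbl.get(k, 0) + 1
    total = len(d) - 1
    te = 0.0
    for (dn, dp, sp), c in joint.items():
        te += (c / total) * math.log((c / pair[(dp, sp)]) / (cond[(dn, dp)] / marg[(dp,)]))
    return te

te_xy = transfer_entropy(x, y)
te_yx = transfer_entropy(y, x)
print(te_xy, te_yx)   # TE(X->Y) should clearly dominate TE(Y->X)
```

The asymmetry between the two directions is what lets the method extract a directed network of influence among subunits; in real microblogging data the small residual TE in the non-driving direction would be handled with surrogate-data significance tests.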
On square-wave-driven stochastic resonance for energy harvesting in a bistable system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Su, Dongxu, E-mail: sudx@iis.u-tokyo.ac.jp; Zheng, Rencheng; Nakano, Kimihiko
Stochastic resonance is a physical phenomenon through which the throughput of energy within an oscillator excited by a stochastic source can be boosted by adding a small modulating excitation. This study investigates the feasibility of implementing square-wave-driven stochastic resonance to enhance energy harvesting. The motivating hypothesis was that such stochastic resonance can be efficiently realized in a bistable mechanism. However, the condition for the occurrence of stochastic resonance is conventionally defined by the Kramers rate; this definition is inadequate because of the necessity and difficulty of estimating the white noise density. A bistable mechanism has been designed using an explicit analytical model, which implies a new approach for achieving stochastic resonance. Experimental tests confirm that the addition of a small-scale force to the bistable system excited by a random signal leads to a corresponding amplification of the response, which we now term square-wave-driven stochastic resonance. The study therefore indicates that this approach may be a promising way to improve the performance of an energy harvester under certain forms of random excitation.
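The effect can be reproduced numerically with an overdamped bistable (double-well) system: a sub-threshold square wave plus noise produces far more inter-well transitions than noise alone. Parameter values below are illustrative, not those of the harvester in the paper.

```python
import math, random

def count_hops(amp, seed=3):
    """Euler-Maruyama for dx = (x - x^3 + amp*s(t)) dt + sigma dW, where
    s(t) is a slow square wave; count transitions between the two wells."""
    random.seed(seed)
    sigma, dt, T, period = 0.25, 0.01, 2000.0, 200.0
    x, t, hops, well = -1.0, 0.0, 0, -1
    sqdt = math.sqrt(dt)
    for _ in range(int(T / dt)):
        s = 1.0 if (t % period) < period / 2 else -1.0
        x += (x - x ** 3 + amp * s) * dt + sigma * sqdt * random.gauss(0, 1)
        t += dt
        # Hysteretic well detection to avoid counting jitter at the barrier.
        if well == -1 and x > 0.5:
            well, hops = 1, hops + 1
        elif well == 1 and x < -0.5:
            well, hops = -1, hops + 1
    return hops

# The static tipping amplitude for this potential is 2/(3*sqrt(3)) ~ 0.385,
# so amp = 0.3 alone cannot force a well change; with noise it synchronizes
# hopping to the square wave, greatly boosting the inter-well response.
hops_noise, hops_sr = count_hops(0.0), count_hops(0.3)
print(hops_noise, hops_sr)
```

The inter-well excursions are what carry most of the harvestable energy in a bistable harvester, so the jump in hop count with the sub-threshold modulation is the signature of the square-wave-driven stochastic resonance described above.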
Is Slow Slip a Cause or a Result of Tremor?
NASA Astrophysics Data System (ADS)
Luo, Y.; Ampuero, J. P.
2017-12-01
While various modeling efforts have been conducted to reproduce subsets of observations of tremor and slow-slip events (SSE), a fundamental yet unanswered question is whether slow slip is a cause or a result of tremor. Tremor is commonly regarded as driven by SSE. This view is mainly based on observations of SSE without detected tremors and on (frequency-limited) estimates of total tremor seismic moment being lower than 1% of the concomitant SSE moment. In previous studies we showed that models of heterogeneous faults, composed of seismic asperities embedded in an aseismic fault zone matrix, quantitatively reproduce the hierarchical patterns of tremor migration observed in Cascadia and Shikoku. To address the title question, we design two end-member models of a heterogeneous fault. In the SSE-driven-tremor model, slow slip events are spontaneously generated by the matrix (even in the absence of seismic asperities) and drive tremor. In the Tremor-driven-SSE model the matrix is stable (it slips steadily in the absence of asperities) and slow slip events result from the collective behavior of tremor asperities interacting via transient creep (local afterslip fronts). We study these two end-member models through 2D quasi-dynamic multi-cycle simulations of faults governed by rate-and-state friction with heterogeneous frictional properties and effective normal stress, using the earthquake simulation software QDYN (https://zenodo.org/record/322459). We find that both models reproduce first-order observations of SSE and tremor and have a very low seismic to aseismic moment ratio. However, the Tremor-driven-SSE model assumes a simpler rheology than the SSE-driven-tremor model and matches key observations better and without fine tuning, including the ratio of propagation speeds of forward SSE and rapid tremor reversals and the decay of inter-event times of Low Frequency Earthquakes.
These modeling results indicate that, in contrast to a common view, SSE could be a result of tremor activity. We also find that, despite important interactions between asperities, tremor activity rates are proportional to the underlying aseismic slip rate, supporting an approach to estimate SSE properties with high spatio-temporal resolution via tremor activity.
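As a concrete illustration of the rheology driving these simulations, the rate-and-state friction law with the "aging" state-evolution equation can be sketched as follows. All parameter values are illustrative defaults, not those of the paper:

```python
import numpy as np

# Rate-and-state friction with the "aging" state-evolution law, the rheology
# used in quasi-dynamic earthquake-cycle codes such as QDYN. All parameter
# values below are illustrative defaults, not those of the paper.

def friction(V, theta, mu0=0.6, a=0.01, b=0.015, V0=1e-6, Dc=1e-3):
    """Friction coefficient as a function of slip rate V and state theta."""
    return mu0 + a * np.log(V / V0) + b * np.log(V0 * theta / Dc)

def theta_dot(V, theta, Dc=1e-3):
    """Aging-law evolution of the state variable: d(theta)/dt."""
    return 1.0 - V * theta / Dc

def mu_ss(V, mu0=0.6, a=0.01, b=0.015, V0=1e-6):
    """Steady-state friction: mu0 + (a - b) * ln(V / V0).

    a - b < 0 gives the velocity-weakening behavior of seismic asperities;
    a - b > 0 gives a stable, steadily creeping matrix.
    """
    return mu0 + (a - b) * np.log(V / V0)
```

The sign of a - b is precisely what distinguishes the two end-member models: asperities are velocity-weakening patches, while the matrix is either unstable (SSE-driven-tremor) or stable (Tremor-driven-SSE).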
A composite computational model of liver glucose homeostasis. I. Building the composite model.
Hetherington, J; Sumner, T; Seymour, R M; Li, L; Rey, M Varela; Yamaji, S; Saffrey, P; Margoninski, O; Bogle, I D L; Finkelstein, A; Warner, A
2012-04-07
A computational model of the glucagon/insulin-driven liver glucohomeostasis function, focusing on the buffering of glucose into glycogen, has been developed. The model exemplifies an 'engineering' approach to modelling in systems biology, and was produced by linking together seven component models of separate aspects of the physiology. The component models use a variety of modelling paradigms and degrees of simplification. Model parameters were determined by an iterative hybrid of fitting to high-scale physiological data, and determination from small-scale in vitro experiments or molecular biological techniques. The component models were not originally designed for inclusion within such a composite model, but were integrated, with modification, using our published modelling software and computational frameworks. This approach facilitates the development of large and complex composite models, although, inevitably, some compromises must be made when composing the individual models. Composite models of this form have not previously been demonstrated.
A Comparison and Evaluation of Real-Time Software Systems Modeling Languages
NASA Technical Reports Server (NTRS)
Evensen, Kenneth D.; Weiss, Kathryn Anne
2010-01-01
A model-driven approach to real-time software systems development enables the conceptualization of software, fostering a more thorough understanding of its often complex architecture and behavior while promoting the documentation and analysis of concerns common to real-time embedded systems such as scheduling, resource allocation, and performance. Several modeling languages have been developed to assist in the model-driven software engineering effort for real-time systems, and these languages are beginning to gain traction with practitioners throughout the aerospace industry. This paper presents a survey of several real-time software system modeling languages, namely the Architecture Analysis and Design Language (AADL), the Unified Modeling Language (UML), the Systems Modeling Language (SysML), the Modeling and Analysis of Real-Time Embedded Systems (MARTE) UML profile, and the AADL for UML profile. Each language has its advantages and disadvantages, and in order to adequately describe a real-time software system's architecture, a complementary use of multiple languages is almost certainly necessary. This paper aims to explore these languages in the context of understanding the value each brings to the model-driven software engineering effort and to determine if it is feasible and practical to combine aspects of the various modeling languages to achieve more complete coverage in architectural descriptions. To this end, each language is evaluated with respect to a set of criteria such as scope, formalisms, and architectural coverage. An example is used to help illustrate the capabilities of the various languages.
Data-driven parameterization of the generalized Langevin equation
Lei, Huan; Baker, Nathan A.; Li, Xiantao
2016-11-29
We present a data-driven approach to determine the memory kernel and random noise of the generalized Langevin equation. To facilitate practical implementations, we parameterize the kernel function in the Laplace domain by a rational function, with coefficients directly linked to the equilibrium statistics of the coarse-grained variables. Further, we show that such an approximation can be constructed to arbitrarily high order. Within these approximations, the generalized Langevin dynamics can be embedded in an extended stochastic model without memory. We demonstrate how to introduce the stochastic noise so that the fluctuation-dissipation theorem is exactly satisfied.
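The embedding idea can be sketched at the lowest order: a single exponential memory kernel K(t) = c² exp(−γt) is represented by one auxiliary Markovian variable z, so the non-Markovian friction term −∫K(t−s)v(s)ds becomes simply −c·z(t). The paper constructs higher-order rational (Laplace-domain) approximations; this is the order-one case with invented constants, and the random-noise term is omitted:

```python
import numpy as np

# Minimal sketch: one exponential kernel K(t) = c**2 * exp(-gamma * t)
# embedded as a single auxiliary Markovian variable z. Constants invented.

def integrate_aux(v, dt, c=2.0, gamma=1.5):
    """Euler-integrate the auxiliary equation dz/dt = -gamma*z + c*v(t)."""
    z = np.zeros(len(v))
    for i in range(1, len(v)):
        z[i] = z[i - 1] + dt * (-gamma * z[i - 1] + c * v[i - 1])
    return z

def memory_integral(v, dt, c=2.0, gamma=1.5):
    """Direct quadrature of int_0^t c**2 * exp(-gamma*(t-s)) v(s) ds."""
    n = len(v)
    t = np.arange(n) * dt
    out = np.zeros(n)
    for i in range(1, n):
        kern = np.exp(-gamma * (t[i] - t[:i]))
        out[i] = c**2 * np.sum(kern * v[:i]) * dt
    return out

# c * z(t) converges to the memory integral as dt -> 0, which is what makes
# the memoryless extended system reproduce the GLE friction term.
```

Higher-order rational approximations of the kernel's Laplace transform correspond to chains of such auxiliary variables.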
Developing a Data Driven Process-Based Model for Remote Sensing of Ecosystem Production
NASA Astrophysics Data System (ADS)
Elmasri, B.; Rahman, A. F.
2010-12-01
Estimating ecosystem carbon fluxes at various spatial and temporal scales is essential for quantifying the global carbon cycle. Numerous models have been developed for this purpose using several environmental variables as well as vegetation indices derived from remotely sensed data. Here we present a data-driven modeling approach for gross primary production (GPP) that is based on the process-based model BIOME-BGC. The proposed model was run using available remote sensing data and does not depend on look-up tables. Furthermore, this approach combines the merits of both empirical and process models: empirical models were used to estimate certain input variables such as light use efficiency (LUE). This was achieved by applying remotely sensed data to the mathematical equations that represent biophysical photosynthesis processes in the BIOME-BGC model. Moreover, a new spectral index for estimating maximum photosynthetic activity, the maximum photosynthetic rate index (MPRI), is also developed and presented here. This new index is based on the ratio between the near-infrared and green bands (ρ858.5/ρ555). The model was tested and validated against the MODIS GPP product and flux measurements from two eddy covariance flux towers located at Morgan Monroe State Forest (MMSF) in Indiana and Harvard Forest in Massachusetts. Satellite data acquired by the Advanced Microwave Scanning Radiometer (AMSR-E) and MODIS were used. The data-driven model showed a strong correlation between predicted and measured GPP at the two eddy covariance flux tower sites. This methodology produced better predictions of GPP than did the MODIS GPP product. Moreover, the proportion of error in the predicted GPP for MMSF and Harvard Forest was dominated by unsystematic errors, suggesting that the results are unbiased.
The analysis indicated that maintenance respiration is one of the main factors dominating the overall model outcome errors, and improvement in maintenance respiration estimation will result in improved GPP predictions. Although there might be room for improvement in our model outcomes through improved parameterization, our results suggest that such a methodology for running the BIOME-BGC model based entirely on routinely available data can produce good predictions of GPP.
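The proposed index itself is a simple band ratio. The sketch below is a direct reading of the definition given above (ρ858.5/ρ555); real MODIS products store reflectances as scaled integers with quality flags that would also need handling:

```python
import numpy as np

# Sketch of the proposed MPRI spectral index: near-infrared (858.5 nm)
# reflectance divided by green (555 nm) reflectance. Illustrative only;
# real MODIS surface-reflectance bands are scaled integers with QC flags.

def mpri(nir_858, green_555):
    """Maximum photosynthetic rate index: rho_858.5 / rho_555."""
    nir = np.atleast_1d(np.asarray(nir_858, dtype=float))
    green = np.atleast_1d(np.asarray(green_555, dtype=float))
    out = np.full(nir.shape, np.nan)      # NaN where the green band is empty
    np.divide(nir, green, out=out, where=green > 0)
    return out
```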
CP violation in multibody B decays from QCD factorization
NASA Astrophysics Data System (ADS)
Klein, Rebecca; Mannel, Thomas; Virto, Javier; Vos, K. Keri
2017-10-01
We test a data-driven approach based on QCD factorization for charmless three-body B decays by confronting it with measurements of CP violation in B⁻ → π⁻π⁺π⁻. While some of the needed non-perturbative objects can be directly extracted from data, others can, so far, only be modelled. Although this approach is currently model dependent, we comment on the perspectives to reduce this model dependence. While our model naturally accommodates the gross features of the Dalitz distribution, it cannot quantitatively explain the details seen in the current experimental data on local CP asymmetries. We comment on possible refinements of our simple model and conclude by briefly discussing a possible extension of the model to large invariant masses, where large local CP asymmetries have been measured.
Taylor, Lyla L; Banwart, Steve A; Valdes, Paul J; Leake, Jonathan R; Beerling, David J
2012-02-19
Global weathering of calcium and magnesium silicate rocks provides the long-term sink for atmospheric carbon dioxide (CO(2)) on a timescale of millions of years by causing precipitation of calcium carbonates on the seafloor. Catchment-scale field studies consistently indicate that vegetation increases silicate rock weathering, but incorporating the effects of trees and fungal symbionts into geochemical carbon cycle models has relied upon simple empirical scaling functions. Here, we describe the development and application of a process-based approach to deriving quantitative estimates of weathering by plant roots, associated symbiotic mycorrhizal fungi and climate. Our approach accounts for the influence of terrestrial primary productivity via nutrient uptake on soil chemistry and mineral weathering, driven by simulations using a dynamic global vegetation model coupled to an ocean-atmosphere general circulation model of the Earth's climate. The strategy is successfully validated against observations of weathering in watersheds around the world, indicating that it may have some utility when extrapolated into the past. When applied to a suite of six global simulations from 215 to 50 Ma, we find significantly larger effects over the past 220 Myr relative to the present day. Vegetation and mycorrhizal fungi enhanced climate-driven weathering by a factor of up to 2. Overall, we demonstrate a more realistic process-based treatment of plant fungal-geosphere interactions at the global scale, which constitutes a first step towards developing 'next-generation' geochemical models.
Electric-field-driven electron-transfer in mixed-valence molecules.
Blair, Enrique P; Corcelli, Steven A; Lent, Craig S
2016-07-07
Molecular quantum-dot cellular automata is a computing paradigm in which digital information is encoded by the charge configuration of a mixed-valence molecule. General-purpose computing can be achieved by arranging these compounds on a substrate and exploiting intermolecular Coulombic coupling. The operation of such a device relies on nonequilibrium electron transfer (ET), whereby the time-varying electric field of one molecule induces an ET event in a neighboring molecule. The magnitude of the electric fields can be quite large because of close spatial proximity, and the induced ET rate is a measure of the nonequilibrium response of the molecule. We calculate the electric-field-driven ET rate for a model mixed-valence compound. The mixed-valence molecule is regarded as a two-state electronic system coupled to a molecular vibrational mode, which is, in turn, coupled to a thermal environment. Both the electronic and vibrational degrees of freedom are treated quantum mechanically, and the dissipative vibrational-bath interaction is modeled with the Lindblad equation. This approach captures both tunneling and nonadiabatic dynamics. Relationships between microscopic molecular properties and the driven ET rate are explored for two time-dependent applied fields: an abruptly switched field and a linearly ramped field. In both cases, the driven ET rate is only weakly temperature dependent. When the model is applied using parameters appropriate to a specific mixed-valence molecule, diferrocenylacetylene, terahertz-range ET rates are predicted.
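A stripped-down sketch of the dissipative dynamics described above: Lindblad evolution of the two-state (donor/acceptor) electronic system alone, with an abruptly switched bias standing in for the neighboring molecule's field. The paper's model additionally couples a quantum vibrational mode to the thermal bath; all parameters here are illustrative:

```python
import numpy as np

# Lindblad evolution of a driven two-level system: the electronic part of
# the mixed-valence model only. Delta = tunneling matrix element, eps = bias
# from the applied field (abruptly switched at t = 0), gamma = dissipation.

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
L = np.array([[0, 1], [0, 0]], dtype=complex)      # relaxation operator

def lindblad_rhs(rho, t, Delta=1.0, eps=2.0, gamma=0.1):
    """d(rho)/dt = -i[H, rho] + gamma*(L rho L+ - {L+ L, rho}/2)."""
    H = 0.5 * Delta * sx + 0.5 * (eps if t > 0 else -eps) * sz
    comm = -1j * (H @ rho - rho @ H)
    Ld = L.conj().T
    diss = gamma * (L @ rho @ Ld - 0.5 * (Ld @ L @ rho + rho @ Ld @ L))
    return comm + diss

def evolve(rho0, t_end=10.0, dt=1e-3):
    """Forward-Euler integration, adequate for a qualitative sketch."""
    rho, t = rho0.astype(complex), 0.0
    while t < t_end:
        rho = rho + dt * lindblad_rhs(rho, t)
        t += dt
    return rho

# The population transferred to the second basis state after the switch is
# the field-driven ET event this sketch illustrates.
```

The Lindblad form guarantees trace preservation, which the commutator and dissipator terms each satisfy separately.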
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kasilov, Sergei V.; Institute of Plasma Physics National Science Center “Kharkov Institute of Physics and Technology” ul. Akademicheskaya 1, 61108 Kharkov; Kernbichler, Winfried
2014-09-15
The toroidal torque driven by external non-resonant magnetic perturbations (neoclassical toroidal viscosity) is an important momentum source affecting the toroidal plasma rotation in tokamaks. The well-known force-flux relation directly links this torque to the non-ambipolar neoclassical particle fluxes arising due to the violation of the toroidal symmetry of the magnetic field. Here, a quasilinear approach for the numerical computation of these fluxes is described, which reduces the dimension of a standard neoclassical transport problem by one without model simplifications of the linearized drift kinetic equation. The only limiting condition is that the non-axisymmetric perturbation field is small enough such that the effect of the perturbation field on particle motion within the flux surface is negligible. Therefore, in addition to most of the transport regimes described by the banana (bounce averaged) kinetic equation, such regimes as, e.g., the ripple-plateau and resonant diffusion regimes are naturally included in this approach. Based on this approach, a quasilinear version of the code NEO-2 [W. Kernbichler et al., Plasma Fusion Res. 3, S1061 (2008)] has been developed and benchmarked against a few analytical and numerical models. Results from NEO-2 stay in good agreement with results from these models in their pertinent range of validity.
Loss of Pin1 Suppresses Hedgehog-Driven Medulloblastoma Tumorigenesis.
Xu, Tao; Zhang, Honglai; Park, Sung-Soo; Venneti, Sriram; Kuick, Rork; Ha, Kimberly; Michael, Lowell Evan; Santi, Mariarita; Uchida, Chiyoko; Uchida, Takafumi; Srinivasan, Ashok; Olson, James M; Dlugosz, Andrzej A; Camelo-Piragua, Sandra; Rual, Jean-François
2017-03-01
Medulloblastoma is the most common malignant brain tumor in children. Therapeutic approaches to medulloblastoma (combination of surgery, radiotherapy, and chemotherapy) have led to significant improvements, but these are achieved at a high cost to quality of life. Alternative therapeutic approaches are needed. Genetic mutations leading to the activation of the Hedgehog pathway drive tumorigenesis in ~30% of medulloblastomas. In a yeast two-hybrid proteomic screen, we discovered a novel interaction between GLI1, a key transcription factor for the mediation of Hedgehog signals, and PIN1, a peptidylprolyl cis/trans isomerase that regulates the postphosphorylation fate of its targets. The GLI1/PIN1 interaction was validated by reciprocal pulldowns using epitope-tagged proteins in HEK293T cells as well as by co-immunoprecipitations of the endogenous proteins in a medulloblastoma cell line. Our results support a molecular model in which PIN1 promotes GLI1 protein abundance, thus contributing to the positive regulation of Hedgehog signals. Most importantly, in vivo functional analyses of Pin1 in the GFAP-tTA;TRE-SmoA1 mouse model of Hedgehog-driven medulloblastoma demonstrate that the loss of Pin1 impairs tumor development and dramatically increases survival. In summary, the discovery of the GLI1/PIN1 interaction uncovers PIN1 as a novel therapeutic target in Hedgehog-driven medulloblastoma tumorigenesis. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
Díaz-Rodríguez, Natalia; Cadahía, Olmo León; Cuéllar, Manuel Pegalajar; Lilius, Johan; Calvo-Flores, Miguel Delgado
2014-01-01
Human activity recognition is a key task in ambient intelligence applications to achieve proper ambient assisted living. There has been remarkable progress in this domain, but some challenges still remain to obtain robust methods. Our goal in this work is to provide a system that allows the modeling and recognition of a set of complex activities in real life scenarios involving interaction with the environment. The proposed framework is a hybrid model that comprises two main modules: a low level sub-activity recognizer, based on data-driven methods, and a high-level activity recognizer, implemented with a fuzzy ontology to include the semantic interpretation of actions performed by users. The fuzzy ontology is fed by the sub-activities recognized by the low level data-driven component and provides fuzzy ontological reasoning to recognize both the activities and their influence in the environment with semantics. An additional benefit of the approach is the ability to handle vagueness and uncertainty in the knowledge-based module, which substantially outperforms the treatment of incomplete and/or imprecise data with respect to classic crisp ontologies. We validate these advantages with the public CAD-120 dataset (Cornell Activity Dataset), achieving an accuracy of 90.1% and 91.07% for low-level and high-level activities, respectively. This entails an improvement over fully data-driven or ontology-based approaches. PMID:25268914
Authoring and verification of clinical guidelines: a model driven approach.
Pérez, Beatriz; Porres, Ivan
2010-08-01
The goal of this research is to provide a framework to enable authoring and verification of clinical guidelines. The framework is part of a larger research project aimed at improving the representation, quality and application of clinical guidelines in daily clinical practice. The verification process of a guideline is based on (1) model checking techniques to verify guidelines against semantic errors and inconsistencies in their definition, (2) combined with Model Driven Development (MDD) techniques, which enable us to automatically process manually created guideline specifications and the temporal-logic statements to be checked against them, making the verification process faster and more cost-effective. In particular, we use UML statecharts to represent the dynamics of guidelines and, based on these manually defined guideline specifications, we use an MDD-based tool chain to automatically generate the input model of a model checker. The model checker takes the resulting model together with the specific guideline requirements and verifies whether the guideline fulfils such properties. The overall framework has been implemented as an Eclipse plug-in named GBDSSGenerator which, starting from the UML statechart representing a guideline, allows the verification of the guideline against specific requirements. Additionally, we have established a pattern-based approach for defining commonly occurring types of requirements in guidelines. We have successfully validated our overall approach by verifying properties in different clinical guidelines, resulting in the detection of some inconsistencies in their definition. The proposed framework allows (1) the authoring and (2) the verification of clinical guidelines against specific requirements defined using a set of property specification patterns, enabling non-experts to easily write formal specifications and thus easing the verification process.
Copyright 2010 Elsevier Inc. All rights reserved.
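A toy illustration of the kind of property such a verification step catches (invented, far simpler than the statechart models the framework checks): a guideline as a plain state machine, and one sample consistency property, "every state can reach the terminal state". Model checkers verify much richer temporal-logic formulas over such models:

```python
# Toy reachability check over an invented guideline state machine; a real
# model checker verifies full temporal-logic requirements over statecharts.

def states_not_reaching(transitions, goal):
    """Return the states from which `goal` is unreachable.

    transitions: dict mapping each state to a list of successor states.
    """
    reverse = {s: set() for s in transitions}
    for state, succs in transitions.items():
        for succ in succs:
            reverse.setdefault(succ, set()).add(state)
    seen, stack = {goal}, [goal]
    while stack:                  # walk the transition relation backwards
        for pred in reverse.get(stack.pop(), ()):
            if pred not in seen:
                seen.add(pred)
                stack.append(pred)
    return set(transitions) - seen

guideline = {
    "assess": ["treat", "refer"],
    "treat": ["follow_up"],
    "refer": ["follow_up"],
    "follow_up": [],
    "orphan": ["orphan"],   # inconsistency: this state never terminates
}
```

Here `states_not_reaching(guideline, "follow_up")` flags the `orphan` state, the analogue of the definition inconsistencies the framework detected in real guidelines.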
A Cohesive Zone Approach for Fatigue-Driven Delamination Analysis in Composite Materials
NASA Astrophysics Data System (ADS)
Amiri-Rad, Ahmad; Mashayekhi, Mohammad
2017-08-01
A new model for the prediction of fatigue-driven delamination in laminated composites is proposed using cohesive interface elements. The presented model provides a link between the damage evolution rate of the cohesive elements and the crack growth rate of the Paris law. This is beneficial since no additional material parameters are required and the well-known Paris law constants are used. The link between the cohesive zone method and fracture mechanics is achieved without use of an effective length, which leads to more accurate results. The problem of the unknown failure path in the calculation of the energy release rate is solved by imposing a condition on the damage model which leads to a completely vertical failure path. A global measure of energy release rate is used for the whole cohesive zone, which is computationally more efficient compared to previous similar models. The performance of the proposed model is investigated by simulating well-known delamination tests and comparing against experimental data from the literature.
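The fracture-mechanics side of the link is Paris-type growth, da/dN = C·(ΔG)^m, which the cohesive damage-evolution rate is calibrated to reproduce. A minimal sketch, with invented constants rather than material data:

```python
# Paris-law crack growth da/dN = C * (Delta G)**m, the relation the cohesive
# damage evolution is tied to. C and m below are illustrative, not material data.

def crack_growth(a0, cycles, delta_G, C=1e-4, m=2.5):
    """Integrate crack length a over `cycles` load cycles at fixed Delta G."""
    a = a0
    for _ in range(cycles):
        a += C * delta_G ** m     # crack-length increment per cycle
    return a
```

In the cohesive-element setting, ΔG is evaluated globally over the whole cohesive zone rather than per element, which is what makes the scheme cheaper than earlier local formulations.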
NASA Astrophysics Data System (ADS)
Halimah, B. Z.; Azlina, A.; Sembok, T. M.; Sufian, I.; Sharul Azman, M. N.; Azuraliza, A. B.; Zulaiha, A. O.; Nazlia, O.; Salwani, A.; Sanep, A.; Hailani, M. T.; Zaher, M. Z.; Azizah, J.; Nor Faezah, M. Y.; Choo, W. O.; Abdullah, Chew; Sopian, B.
The Holistic Islamic Banking System (HiCORE), a banking system suitable for a virtual banking environment, was created under a university-industry collaboration initiative between Universiti Kebangsaan Malaysia (UKM) and Fuziq Software Sdn Bhd. HiCORE was modeled on a multi-tiered Simple Services-Oriented Architecture (S-SOA), using the parameter-based semantic approach. HiCORE's existence is timely, as the financial world is looking for a new approach to creating banking and financial products that are interest-free or based on Islamic Syariah principles and jurisprudence. An interest-free banking system has currently caught the interest of bankers and financiers all over the world. HiCORE's parameter-based module houses the Customer Information File (CIF), Deposit and Financing components, and represents the third tier of the multi-tiered Simple SOA approach. This paper highlights the multi-tiered, parameter-driven approach to the creation of new Islamic products based on the 'dalil' (Quran), 'syarat' (rules) and 'rukun' (procedures) required by Syariah principles and jurisprudence, reflected in the semantic ontology embedded in the parameter module of the system.
Inter-subject phase synchronization for exploratory analysis of task-fMRI.
Bolt, Taylor; Nomi, Jason S; Vij, Shruti G; Chang, Catie; Uddin, Lucina Q
2018-08-01
Analysis of task-based fMRI data is conventionally carried out using a hypothesis-driven approach, where blood-oxygen-level dependent (BOLD) time courses are correlated with a hypothesized temporal structure. In some experimental designs, this temporal structure can be difficult to define. In other cases, experimenters may wish to take a more exploratory, data-driven approach to detecting task-driven BOLD activity. In this study, we demonstrate the efficiency and power of an inter-subject synchronization approach for exploratory analysis of task-based fMRI data. Combining the tools of instantaneous phase synchronization and independent component analysis, we characterize whole-brain task-driven responses in terms of group-wise similarity in temporal signal dynamics of brain networks. We applied this framework to fMRI data collected during performance of a simple motor task and a social cognitive task. Analyses using an inter-subject phase synchronization approach revealed a large number of brain networks that dynamically synchronized to various features of the task, often not predicted by the hypothesized temporal structure of the task. We suggest that this methodological framework, along with readily available tools in the fMRI community, provides a powerful exploratory, data-driven approach for analysis of task-driven BOLD activity. Copyright © 2018 Elsevier Inc. All rights reserved.
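The core measurement combines two standard tools named above. A compact sketch: instantaneous phase per subject via the analytic signal (an FFT-based Hilbert transform), then group phase agreement at each time point as the length of the mean resultant vector. The study applies this to ICA network time courses; here the input is simply an array of per-subject signals:

```python
import numpy as np

# Inter-subject phase synchronization: analytic-signal phase per subject,
# then the mean-resultant-vector length across subjects at each time point.

def instantaneous_phase(x):
    """Phase of the analytic signal of a real 1-D signal."""
    n = len(x)
    spectrum = np.fft.fft(x)
    h = np.zeros(n)                  # analytic-signal frequency weights
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.angle(np.fft.ifft(spectrum * h))

def phase_synchrony(signals):
    """signals: (n_subjects, n_timepoints) array. Returns values in [0, 1]."""
    phases = np.array([instantaneous_phase(s) for s in signals])
    return np.abs(np.mean(np.exp(1j * phases), axis=0))
```

A value near 1 at a time point means subjects' network dynamics are phase-locked there, without any hypothesized task regressor.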
[Urban ecological risk assessment: a review].
Wang, Mei-E; Chen, Wei-Ping; Peng, Chi
2014-03-01
With the development of urbanization and the degradation of the urban living environment, urban ecological risks caused by urbanization have attracted increasing attention. Based on urban ecology principles and ecological risk assessment frameworks, the contents of urban ecological risk assessment were reviewed in terms of driving forces, risk sources, risk receptors, endpoints and integrated approaches for risk assessment. It was suggested that the types and degrees of urban economic and social activities are the driving forces for urban ecological risks. Ecological functional components at different levels in urban ecosystems, as well as the urban system as a whole, are the risk receptors. Assessment endpoints involve changes in urban ecological structures, processes and functional components, and in the integrity of ecosystem character and function. Social-ecological models should be the major approaches for urban ecological risk assessment. Future urban ecological risk assessment studies should focus on setting definite protection targets and criteria corresponding to assessment endpoints, and on establishing a multiple-parameter assessment system and integrative assessment approaches.
Impact-Driven Overturn of Lunar Regolith: A Refreshed Approach
NASA Astrophysics Data System (ADS)
Costello, E.; Ghent, R. R.; Lucey, P. G.; Tai Udovicic, C. J.
2016-12-01
Meteoritic impactors churn up lunar regolith, the layer of heterogeneous grains that covers nearly the entire lunar surface to a depth of tens to hundreds of meters, and affect its geologic, petrographic and chemical makeup. An understanding of the physical characteristics of the regolith and how they change through time is fundamentally important to our ability to interpret underlying geological processes from surface observations. Characterizing impact-driven regolith overturn in particular could help us understand the lifetime of rays, ejecta blankets, and stratigraphic layering. Several probabilistic models exist that describe the meteoritic impact-driven overturn process, including that presented by Gault et al. in their paper 'Mixing of the Lunar Regolith.' We re-visit this oft-cited model, updating its constants with more modern laboratory impact experiments and time-variable meteoritic flux estimates. Further, we compare the results of Gault's model to new approaches using remote sensing datasets and Monte Carlo cratering simulations that include conditions Gault's model did not, such as the erosion, seismic settling, and degradation that result from the superposition of craters. From this work we present an updated understanding of overturn as a function of time and depth. Gault et al. showed that the upper millimeter of regolith is mixed with great frequency and that the rate of overturn drops off sharply with depth. Our work elaborates on this idea, addressing the sensitivity of this result to variations in parameters including meteoritic flux, impactor mass, velocity, angle of impact and crater geometry. In addition, we use these new methods and parameters to characterize the "mixing layer," as well as the less mixed layers below, in an attempt to quantitatively match new insights on the spatial variation of the change in density with depth derived from the Diviner Lunar Radiometer.
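A heavily hedged Monte Carlo sketch in the spirit of Gault-style probabilistic overturn models: craters with power-law-distributed diameters strike random points of a unit area, and a buried marker counts as overturned when it lies inside a crater's excavation zone (taken here, purely for illustration, as depth < D/10 within one crater radius). Every number below is invented:

```python
import random

# Toy Monte Carlo of impact-driven overturn. The size-frequency exponent,
# excavation-depth rule (D/10), marker count, and flux are all illustrative.

def overturn_fraction(depth, n_impacts=20000, b=2.0,
                      d_min=0.001, d_max=1.0, seed=0):
    rng = random.Random(seed)
    markers = [(rng.random(), rng.random()) for _ in range(200)]
    turned = [False] * len(markers)
    for _ in range(n_impacts):
        # inverse-CDF sample of a power law N(>D) ~ D**-b on [d_min, d_max]
        u = rng.random()
        D = (d_min**-b - u * (d_min**-b - d_max**-b)) ** (-1.0 / b)
        x, y = rng.random(), rng.random()
        if depth < D / 10.0:                      # crater digs deep enough
            r2 = (D / 2.0) ** 2
            for i, (mx, my) in enumerate(markers):
                if (mx - x) ** 2 + (my - y) ** 2 < r2:
                    turned[i] = True
    return sum(turned) / len(turned)

# Shallow material is overturned far more often than deep material, the
# sharp depth dependence the abstract describes.
```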
Extended-Range Prediction with Low-Dimensional, Stochastic-Dynamic Models: A Data-driven Approach
2013-09-30
statistically extratropical storms and extremes, and link these to LFV modes. Mingfang Ting, Yochanan Kushnir, Andrew W. Robertson, Lei Wang...forecast models, as well as in the understanding they have generated. Adam Sobel, Daehyun Kim and Shuguang Wang. Extratropical variability and...predictability. Determine the extent to which extratropical monthly and seasonal low-frequency variability (LFV, i.e. PNA, NAO, as well as other regional
NASA Astrophysics Data System (ADS)
House, Thomas
2016-09-01
Chowell et al. [1] consider the early growth behaviour of various epidemic models that range from phenomenological approaches driven by data to mechanistic descriptions of complex interactions between individuals. This is particularly timely given the recent Ebola epidemic, although non-exponential early growth may be more common (but less immediately evident) than we realise.
2014-06-01
from the ODM standard. Leveraging SPARX EA’s Java application programming interface (API), the team built a tool called OWL2EA that can ingest an OWL...server MySQL creates the physical schema that enables a user to store and retrieve data conforming to the vocabulary of the JC3IEDM. 6. GENERATING AN
Helmer, Markus; Kozyrev, Vladislav; Stephan, Valeska; Treue, Stefan; Geisel, Theo; Battaglia, Demian
2016-01-01
Tuning curves are the functions that relate the responses of sensory neurons to various values within one continuous stimulus dimension (such as the orientation of a bar in the visual domain or the frequency of a tone in the auditory domain). They are commonly determined by fitting a model, e.g. a Gaussian or another bell-shaped curve, to the measured responses to a small subset of discrete stimuli in the relevant dimension. However, as neuronal responses are irregular and experimental measurements noisy, it is often difficult to reliably determine the appropriate model from the data. We illustrate this general problem by fitting diverse models to representative recordings from area MT in rhesus monkey visual cortex during multiple attentional tasks involving complex composite stimuli. We find that all models can be well-fitted, that the best model generally varies between neurons and that statistical comparisons between neuronal responses across different experimental conditions are affected quantitatively and qualitatively by specific model choices. As a robust alternative to an often arbitrary model selection, we introduce a model-free approach, in which features of interest are extracted directly from the measured response data without the need of fitting any model. In our attentional datasets, we demonstrate that data-driven methods provide descriptions of tuning curve features such as preferred stimulus direction or attentional gain modulations which are in agreement with fit-based approaches when a good fit exists. Furthermore, these methods naturally extend to the frequent cases of uncertain model selection. We show that model-free approaches can identify attentional modulation patterns, such as general alterations of the irregular shape of tuning curves, which cannot be captured by fitting stereotyped conventional models. Finally, by comparing datasets across different conditions, we demonstrate effects of attention that are cell- and even stimulus-specific.
Based on these proofs-of-concept, we conclude that our data-driven methods can reliably extract relevant tuning information from neuronal recordings, including cells whose seemingly haphazard response curves defy conventional fitting approaches.
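One such model-free feature extraction can be sketched directly: the preferred direction taken from the measured responses as the response-weighted circular (vector-average) mean over the tested directions, with no curve fitted. The numbers in the test case are invented:

```python
import numpy as np

# Model-free estimate of a tuning-curve feature: the preferred direction as
# the response-weighted circular mean of the tested stimulus directions.

def preferred_direction(directions_deg, responses):
    """Vector-average preferred direction, in degrees in [0, 360)."""
    theta = np.deg2rad(np.asarray(directions_deg, dtype=float))
    r = np.asarray(responses, dtype=float)
    resultant = np.sum(r * np.exp(1j * theta))
    return np.rad2deg(np.angle(resultant)) % 360.0
```

Unlike a Gaussian fit, this estimate needs no initial guesses and cannot fail to converge, which is the practical advantage the abstract argues for when no single model fits all neurons.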
Helmer, Markus; Kozyrev, Vladislav; Stephan, Valeska; Treue, Stefan; Geisel, Theo; Battaglia, Demian
2016-01-01
Tuning curves are the functions that relate the responses of sensory neurons to various values within one continuous stimulus dimension (such as the orientation of a bar in the visual domain or the frequency of a tone in the auditory domain). They are commonly determined by fitting a model e.g. a Gaussian or other bell-shaped curves to the measured responses to a small subset of discrete stimuli in the relevant dimension. However, as neuronal responses are irregular and experimental measurements noisy, it is often difficult to determine reliably the appropriate model from the data. We illustrate this general problem by fitting diverse models to representative recordings from area MT in rhesus monkey visual cortex during multiple attentional tasks involving complex composite stimuli. We find that all models can be well-fitted, that the best model generally varies between neurons and that statistical comparisons between neuronal responses across different experimental conditions are affected quantitatively and qualitatively by specific model choices. As a robust alternative to an often arbitrary model selection, we introduce a model-free approach, in which features of interest are extracted directly from the measured response data without the need of fitting any model. In our attentional datasets, we demonstrate that data-driven methods provide descriptions of tuning curve features such as preferred stimulus direction or attentional gain modulations which are in agreement with fit-based approaches when a good fit exists. Furthermore, these methods naturally extend to the frequent cases of uncertain model selection. We show that model-free approaches can identify attentional modulation patterns, such as general alterations of the irregular shape of tuning curves, which cannot be captured by fitting stereotyped conventional models. Finally, by comparing datasets across different conditions, we demonstrate effects of attention that are cell- and even stimulus-specific. 
Based on these proofs-of-concept, we conclude that our data-driven methods can reliably extract relevant tuning information from neuronal recordings, including cells whose seemingly haphazard response curves defy conventional fitting approaches. PMID:26785378
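As a concrete illustration of the model-free idea, the sketch below estimates a preferred direction and an attentional gain directly from measured firing rates, with no curve fitting. The weighted circular-mean estimator and all numbers are illustrative assumptions, not the paper's exact feature extractors:

```python
import numpy as np

def preferred_direction(angles_deg, responses):
    """Model-free preferred-direction estimate: circular mean of the
    stimulus angles, weighted by the measured firing rates."""
    ang = np.deg2rad(np.asarray(angles_deg, float))
    w = np.asarray(responses, float)
    z = np.sum(w * np.exp(1j * ang))          # resultant vector
    return np.rad2deg(np.angle(z)) % 360.0

def attentional_gain(attended, baseline):
    """Gain modulation as the ratio of mean responses across conditions."""
    return np.mean(attended) / np.mean(baseline)

# Hypothetical responses of one MT neuron at 8 motion directions (spikes/s).
angles = np.arange(0, 360, 45)
rates = np.array([5.0, 9.0, 22.0, 48.0, 21.0, 8.0, 4.0, 3.0])  # peak near 135 deg

pd = preferred_direction(angles, rates)
gain = attentional_gain(rates * 1.3, rates)   # e.g. a 30% multiplicative gain
```

Because the estimator uses the raw responses directly, it also yields a value for irregular tuning curves that defy a bell-shaped fit, which is the point of the data-driven approach.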
An Observation-Driven Agent-Based Modeling and Analysis Framework for C. elegans Embryogenesis.
Wang, Zi; Ramsey, Benjamin J; Wang, Dali; Wong, Kwai; Li, Husheng; Wang, Eric; Bao, Zhirong
2016-01-01
With cutting-edge live microscopy and image analysis, biologists can now systematically track individual cells in complex tissues and quantify cellular behavior over extended time windows. Computational approaches that utilize the systematic and quantitative data are needed to understand how cells interact in vivo to give rise to the different cell types and 3D morphology of tissues. An agent-based, minimum descriptive modeling and analysis framework is presented in this paper to study C. elegans embryogenesis. The framework is designed to incorporate the large amounts of experimental observations on cellular behavior and reserve data structures/interfaces that allow regulatory mechanisms to be added as more insights are gained. Observed cellular behaviors are organized into lineage identity, timing and direction of cell division, and path of cell movement. The framework also includes global parameters such as the eggshell and a clock. Division and movement behaviors are driven by statistical models of the observations. Data structures/interfaces are reserved for gene list, cell-cell interaction, cell fate and landscape, and other global parameters until the descriptive model is replaced by a regulatory mechanism. This approach provides a framework to handle the ongoing experiments of single-cell analysis of complex tissues where mechanistic insights lag data collection and need to be validated on complex observations.
A New Multivariate Approach for Prognostics Based on Extreme Learning Machine and Fuzzy Clustering.
Javed, Kamran; Gouriveau, Rafael; Zerhouni, Noureddine
2015-12-01
Prognostics is a core process of the prognostics and health management (PHM) discipline that estimates the remaining useful life (RUL) of degrading machinery to optimize its service delivery potential. However, machinery operates in a dynamic environment, and the acquired condition monitoring data are usually noisy and subject to a high level of uncertainty/unpredictability, which complicates prognostics. The complexity increases further when prior knowledge about the ground truth (or failure definition) is absent. For such issues, data-driven prognostics can be a valuable solution that does not require a deep understanding of the system physics. This paper contributes a new data-driven prognostics approach, namely an "enhanced multivariate degradation modeling," which enables modeling the degrading states of machinery without assuming a homogeneous pattern. In brief, a predictability scheme is introduced to reduce the dimensionality of the data. Following that, the proposed prognostics model is achieved by integrating two new algorithms, namely the summation wavelet-extreme learning machine and subtractive-maximum entropy fuzzy clustering, to show the evolution of machine degradation by simultaneous predictions and discrete state estimation. The prognostics model is equipped with a dynamic failure threshold assignment procedure to estimate RUL in a realistic manner. To validate the proposition, a case study is performed on turbofan engine data from the PHM challenge 2008 (NASA), and the results are compared with recent publications.
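The summation wavelet-extreme learning machine is specific to the paper, but the ELM principle it builds on, a random hidden layer with a closed-form least-squares readout, can be sketched briefly. The toy health indicator and the plain tanh activation below are assumptions for illustration:

```python
import numpy as np

class ELMRegressor:
    """Basic extreme learning machine: random hidden weights are fixed and
    only the linear readout is trained, in closed form via least squares.
    (The paper's SW-ELM adds wavelet activations; plain tanh is used here.)"""
    def __init__(self, n_hidden=30, seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def fit(self, X, y):
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = np.tanh(X @ self.W + self.b)                    # hidden activations
        self.beta, *_ = np.linalg.lstsq(H, y, rcond=None)   # readout weights
        return self

    def predict(self, X):
        return np.tanh(X @ self.W + self.b) @ self.beta

# Toy degradation trend: noisy exponential health indicator over time.
t = np.linspace(0.0, 1.0, 200)[:, None]
y = np.exp(2.0 * t[:, 0]) + 0.05 * np.random.default_rng(1).normal(size=200)
model = ELMRegressor().fit(t, y)
rmse = float(np.sqrt(np.mean((model.predict(t) - y) ** 2)))
```

The appeal for prognostics is speed: training costs one least-squares solve, so the model can be refit as new condition-monitoring data arrive.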
Learning clinically useful information from images: Past, present and future.
Rueckert, Daniel; Glocker, Ben; Kainz, Bernhard
2016-10-01
Over the last decade, research in medical imaging has made significant progress in addressing challenging tasks such as image registration and image segmentation. In particular, the use of model-based approaches has been key in numerous, successful advances in methodology. The advantage of model-based approaches is that they allow the incorporation of prior knowledge acting as a regularisation that favours plausible solutions over implausible ones. More recently, medical imaging has moved away from hand-crafted and often explicitly designed models towards data-driven, implicit models that are constructed using machine learning techniques. This has led to major improvements in all stages of the medical imaging pipeline, from acquisition and reconstruction to analysis and interpretation. As more and more imaging data become available, e.g. from large population studies, this trend is likely to continue and accelerate. At the same time new developments in machine learning, e.g. deep learning, as well as significant improvements in computing power, e.g. parallelisation on graphics hardware, offer new potential for data-driven, semantic and intelligent medical imaging. This article outlines the work of the BioMedIA group in this area and highlights some of the challenges and opportunities for future work. Copyright © 2016 Elsevier B.V. All rights reserved.
Assessing the predictive capability of randomized tree-based ensembles in streamflow modelling
NASA Astrophysics Data System (ADS)
Galelli, S.; Castelletti, A.
2013-02-01
Combining randomization methods with ensemble prediction is emerging as an effective option to balance accuracy and computational efficiency in data-driven modeling. In this paper we investigate the prediction capability of extremely randomized trees (Extra-Trees), in terms of accuracy, explanation ability and computational efficiency, in a streamflow modeling exercise. Extra-Trees are a totally randomized tree-based ensemble method that (i) alleviates the poor generalization property and tendency to overfit of traditional standalone decision trees (e.g. CART); (ii) is computationally very efficient; and (iii) allows one to infer the relative importance of the input variables, which might help in the ex-post physical interpretation of the model. The potential of Extra-Trees is analyzed on two real-world case studies, the Marina catchment (Singapore) and the Canning River (Western Australia), representing two different morphoclimatic contexts. The evaluation is performed against other tree-based methods (CART and M5) and parametric data-driven approaches (ANNs and multiple linear regression). Results show that Extra-Trees perform comparably to the best of the benchmarks (i.e. M5) in both watersheds, while outperforming the other approaches in terms of computational requirements when adopted on large datasets. In addition, the input-variable ranking provided can be given a physically meaningful interpretation.
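The core Extra-Trees idea, choosing both the split feature and the cut-point at random and averaging many such trees, can be sketched in a few lines. The rainfall-runoff data below are synthetic stand-ins, not the Marina or Canning datasets:

```python
import numpy as np

def build_tree(X, y, rng, min_samples=5, max_depth=8, depth=0):
    # Extra-Trees principle: pick the split feature AND the cut-point at
    # random instead of optimizing them; averaging over many such trees
    # smooths out the extra randomness.
    if len(y) < min_samples or depth >= max_depth or np.all(y == y[0]):
        return float(np.mean(y))
    f = int(rng.integers(X.shape[1]))
    lo, hi = X[:, f].min(), X[:, f].max()
    if lo == hi:
        return float(np.mean(y))
    cut = rng.uniform(lo, hi)
    mask = X[:, f] <= cut
    if mask.all() or not mask.any():
        return float(np.mean(y))
    return (f, cut,
            build_tree(X[mask], y[mask], rng, min_samples, max_depth, depth + 1),
            build_tree(X[~mask], y[~mask], rng, min_samples, max_depth, depth + 1))

def predict_tree(node, x):
    while isinstance(node, tuple):
        f, cut, left, right = node
        node = left if x[f] <= cut else right
    return node

def extra_trees(X, y, X_new, n_trees=100, seed=0):
    rng = np.random.default_rng(seed)
    forest = [build_tree(X, y, rng) for _ in range(n_trees)]
    return np.array([np.mean([predict_tree(tr, x) for tr in forest])
                     for x in X_new])

# Toy rainfall-runoff relation: streamflow as a nonlinear function of
# rainfall depth and antecedent wetness (purely illustrative).
rng = np.random.default_rng(42)
X = rng.uniform(0.0, 1.0, size=(300, 2))
y = np.sin(3.0 * X[:, 0]) * X[:, 1] + 0.05 * rng.normal(size=300)
pred = extra_trees(X, y, X[:50])
rmse = float(np.sqrt(np.mean((pred - y[:50]) ** 2)))
```

In practice one would use a library implementation (e.g. scikit-learn's `ExtraTreesRegressor`), which also exposes the input-variable importances mentioned in the abstract.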
SST-Forced Seasonal Simulation and Prediction Skill for Versions of the NCEP/MRF Model.
NASA Astrophysics Data System (ADS)
Livezey, Robert E.; Masutani, Michiko; Jil, Ming
1996-03-01
The feasibility of using a two-tier approach to provide guidance to operational long-lead seasonal prediction is explored. The approach includes first a forecast of global sea surface temperatures (SSTs) using a coupled general circulation model, followed by an atmospheric forecast using an atmospheric general circulation model (AGCM). For this exploration, ensembles of decade-long integrations of the AGCM driven by observed SSTs and ensembles of integrations of select cases driven by forecast SSTs have been conducted. The ability of the model in these sets of runs to reproduce observed atmospheric conditions has been evaluated with a multiparameter performance analysis. Results have identified performance and skill levels in the specified SST runs, for winters and springs over the Pacific/North America region, that are sufficient to impact operational seasonal predictions in years with major El Niño-Southern Oscillation (ENSO) episodes. Further, these levels were substantially reproduced in the forecast SST runs for 1-month leads and in many instances for up to one-season leads. In fact, overall the 0- and 1-month-lead forecasts of seasonal temperature over the United States for three falls and winters with major ENSO episodes were substantially better than corresponding official forecasts. Thus, there is considerable reason to develop a dynamical component for the official seasonal forecast process.
Simulation-Driven Design Approach for Design and Optimization of Blankholder
NASA Astrophysics Data System (ADS)
Sravan, Tatipala; Suddapalli, Nikshep R.; Johan, Pilthammar; Mats, Sigvant; Christian, Johansson
2017-09-01
Reliable design of stamping dies is desired for efficient and safe production. The design of stamping dies is today based mostly on casting feasibility, although it can also be based on criteria for fatigue, stiffness, safety, and economy. The current work presents an approach built on Simulation-Driven Design, enabling design optimization to address this issue. A structural finite element model of a stamping die, used to produce doors for the Volvo V70/S80 car models, is studied. This die had developed cracks during its usage. To understand the stress distribution in the stamping die, a structural analysis of the die is conducted and critical regions with high stresses are identified. The results from the structural FE-models are compared with analytical calculations pertaining to the fatigue properties of the material. To arrive at an optimum design with increased stiffness and lifetime, topology and free-shape optimization are performed. In the optimization routine, the identified critical regions of the die are set as design variables. Other optimization variables are set to maintain the manufacturability of the resultant stamping die. Thereafter a CAD model is built based on the geometrical results from the topology and free-shape optimizations. The CAD model is then subjected to structural analysis to visualize the new stress distribution. This process is iterated until a satisfactory result is obtained. The final results show a 70% reduction in stress levels, with a more homogeneous distribution. Even though the mass of the die increases by 17%, overall a stiffer die with a longer lifetime is obtained. Finally, by reflecting on the entire process, a coordinated approach to handle such situations efficiently is presented.
Official conceptualizations of person-centered care: which person counts?
O'Dwyer, Ciara
2013-08-01
Numerous studies have indicated that a "psycho-social" person-centered care approach, involving the delivery of a compassionate, respectful model of care, leads to a high quality of life, particularly for older people living in residential care. This has prompted policy-makers to endorse this approach. Yet, some commentators have argued that the model of person-centered care in official government policies equates to a "consumer-based" rather than a psycho-social approach, as it focuses solely on offering service-users more choice and on promoting independence. However, as such arguments are made in the absence of any empirical analysis, it is unclear both whether such a distinction exists in practice and, if so, how this alternative model developed. This study explores the development of minimum standards for residential care settings for older people in Ireland in order to address this gap in our understanding of person-centered care. Findings confirm that a consumer-driven model of person-centered care underpins the Irish Standards; residential care is portrayed as a hotel-like service and residents as discerning consumers, which may be unsuitable for older people in residential care with limited capacity to make key choices. Analysis indicates that this model can be seen both as an extension of consumer-driven policies endorsed by many neo-liberal governments, and also of policy-makers' fears of losing their autonomy when they reach the "Fourth Age". This study is particularly illuminating, given the similarities between the Irish care system and those of England, Scotland, Wales, Northern Ireland and Australia. Copyright © 2013 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Held, H.; Gerstengarbe, F.-W.; Hattermann, F.; Pinto, J. G.; Ulbrich, U.; Böhm, U.; Born, K.; Büchner, M.; Donat, M. G.; Kücken, M.; Leckebusch, G. C.; Nissen, K.; Nocke, T.; Österle, H.; Pardowitz, T.; Werner, P. C.; Burghoff, O.; Broecker, U.; Kubik, A.
2012-04-01
We present an overview of an impact project combining complementary approaches to assess the consequences of climate change for the natural hazard branch of the insurance industry in Germany. The project was conducted by four academic institutions together with the German Insurance Association (GDV) and finalized in autumn 2011. A causal chain is modeled that runs from global warming projections through regional meteorological impacts to regional economic losses for private buildings, thereby covering the whole area of Germany. This presentation focuses on wind storm related losses, although the method developed was also applied in part to hail and flood impact losses. For the first time, the GDV supplied its collected set of insurance cases, dating back decades, for such an impact study. These data were used to calibrate and validate event-based damage functions, which in turn were driven by three different types of regional climate models to generate storm loss projections. The regional models were driven by a triplet of ECHAM5 experiments following the A1B scenario, which were found representative in the recent ENSEMBLES intercomparison study. In our multi-modeling approach we used two types of regional climate models that differ conceptually as much as possible: a dynamical model (CCLM) and a statistical model based on the idea of biased bootstrapping (STARS). As a third option we pursued a hybrid approach (statistical-dynamical downscaling). For the assessment of climate change impacts, the buildings' infrastructure and their economic value are kept at current values. For all three approaches, a significant increase in average storm losses and extreme event return levels in the German private building sector is found for future decades under the A1B scenario. However, the three projections differ somewhat in terms of magnitude and regional differentiation.
We have developed a formalism that allows us to express the combined effect of multi-source uncertainty on return levels within the framework of a generalized Pareto distribution.
Transformation of Graphical ECA Policies into Executable PonderTalk Code
NASA Astrophysics Data System (ADS)
Romeikat, Raphael; Sinsel, Markus; Bauer, Bernhard
Rules are becoming more and more important in business modeling and systems engineering and are recognized as a high-level programming paradigm. For the effective development of rules it is desirable to start at a high level, e.g. with graphical rules, and to refine them later into code of a particular rule language for implementation purposes. A model-driven approach is presented in this paper to transform graphical rules into executable code in a fully automated way. The focus is on event-condition-action policies as a special rule type. These are modeled graphically and translated into the PonderTalk language. The approach may be extended to integrate other rule types and languages as well.
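A minimal model-to-text transformation of the kind described can be sketched as follows. The ECA rule model is hypothetical, and the emitted text is schematic pseudo-code in a PonderTalk-like style rather than exact PonderTalk syntax:

```python
from dataclasses import dataclass

@dataclass
class EcaPolicy:
    # Minimal event-condition-action rule model, as might be captured
    # from a graphical rule editor (all fields hypothetical).
    name: str
    event: str
    condition: str
    action: str

def emit(policy: EcaPolicy) -> str:
    """Model-to-text transformation step: serialize the abstract rule
    into target-language source text (schematic, not exact PonderTalk)."""
    return (f'policy := root/factory/ecapolicy create.\n'
            f'policy event: "{policy.event}";\n'
            f'       condition: "{policy.condition}";\n'
            f'       action: "{policy.action}".\n'
            f'root/policies at: "{policy.name}" put: policy.')

rule = EcaPolicy("highLoad", "cpuLoadChanged", "load > 0.9", "notifyAdmin")
print(emit(rule))
```

A real MDA pipeline would split this into a model-to-model step (graphical model to ECA metamodel instance) followed by a templated model-to-text step like `emit`.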
NASA Astrophysics Data System (ADS)
Albano, Raffaele; Manfreda, Salvatore; Celano, Giuseppe
The paper introduces a minimalist water-driven crop model for sustainable irrigation management using an eco-hydrological approach. This model, called MY SIRR, uses a relatively small number of parameters and attempts to balance simplicity, accuracy, and robustness. MY SIRR is a quantitative tool to assess water requirements and agricultural production across different climates, soil types, crops, and irrigation strategies. The MY SIRR source code is published under a copyleft license. The FOSS approach could lower the financial barriers for smallholders, especially in developing countries, to using tools for better decision-making on strategies for short- and long-term water resource management.
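A minimalist water-driven model of this flavour can be sketched as a daily soil-water bucket with a demand-driven irrigation rule. All parameter values and the refill strategy below are illustrative assumptions, not MY SIRR's actual formulation:

```python
def simulate_season(rain, et_max, irrigate_below=0.35, capacity=150.0, s0=0.5):
    """Minimal daily soil-water bucket model with a demand-driven irrigation
    rule (all parameter values illustrative; depths in mm)."""
    s = s0                         # relative soil moisture (0..1)
    irrigation_total = 0.0
    for p, et in zip(rain, et_max):
        if s < irrigate_below:     # irrigation strategy: refill toward 0.7
            dose = (0.7 - s) * capacity
            irrigation_total += dose
            s += dose / capacity
        losses = et * s            # ET scaled by moisture availability
        s = min(1.0, s + (p - losses) / capacity)
        s = max(0.0, s)
    return s, irrigation_total

# 120-day dry season: sparse rain events, steady evaporative demand (mm/day).
rain = [2.0 if d % 10 == 0 else 0.0 for d in range(120)]
et = [5.0] * 120
final_s, total_irr = simulate_season(rain, et)
```

Comparing `total_irr` across alternative `irrigate_below` thresholds is the kind of strategy assessment the abstract describes, just with a far richer crop and soil description in the actual model.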
NASA Astrophysics Data System (ADS)
Xu, T.; Valocchi, A. J.; Ye, M.; Liang, F.
2016-12-01
Due to simplification and/or misrepresentation of the real aquifer system, numerical groundwater flow and solute transport models are usually subject to model structural error. During model calibration, the hydrogeological parameters may be overly adjusted to compensate for unknown structural error. This may result in biased predictions when models are used to forecast aquifer response to new forcing. In this study, we extend a fully Bayesian method [Xu and Valocchi, 2015] to calibrate a real-world, regional groundwater flow model. The method uses a data-driven error model to describe model structural error and jointly infers model parameters and structural error. In this study, Bayesian inference is facilitated using high-performance computing and fast surrogate models. The surrogate models are constructed using machine learning techniques to emulate the response simulated by the computationally expensive groundwater model. We demonstrate in the real-world case study that explicitly accounting for model structural error yields parameter posterior distributions that are substantially different from those derived by the classical Bayesian calibration that does not account for model structural error. In addition, the Bayesian method with the error model gives significantly more accurate predictions along with reasonable credible intervals.
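The joint inference of model parameters and structural error can be illustrated on a toy problem: a deliberately mis-specified linear model is calibrated against data containing a structural feature it cannot represent, with a crude error model (a constant bias plus an inferred noise scale) standing in for the paper's data-driven error model. Everything below is a sketch under those assumptions, not the groundwater application:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: the "truth" contains a periodic component that the
# deliberately mis-specified linear model a*x cannot represent.
x = np.linspace(0.0, 10.0, 40)
obs = 2.0 * x + 3.0 * np.sin(0.8 * x) + rng.normal(0.0, 0.5, x.size)

def log_post(theta):
    """Joint log-posterior of the model parameter a and a crude structural
    error model: constant bias b and inferred noise scale exp(log_s)."""
    a, b, log_s = theta
    s = np.exp(log_s)
    resid = obs - (a * x + b)
    return (-0.5 * np.sum((resid / s) ** 2) - x.size * log_s
            - 0.5 * (a ** 2 + b ** 2 + log_s ** 2) / 100.0)   # weak priors

def metropolis(theta0, n=6000, step=0.08):
    theta = np.array(theta0, float)
    lp = log_post(theta)
    chain = []
    for _ in range(n):
        prop = theta + rng.normal(0.0, step, 3)
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:   # Metropolis acceptance
            theta, lp = prop, lp_prop
        chain.append(theta.copy())
    return np.array(chain[n // 2:])                # drop burn-in

chain = metropolis([1.0, 0.0, 0.0])
a_mean = float(chain[:, 0].mean())
```

With the error model in place, the posterior noise scale inflates to absorb the unresolved structure instead of forcing the slope to compensate, which is the qualitative effect the abstract reports at field scale.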
Capa, Rémi L; Audiffren, Michel
2009-12-01
We tested whether the effect of achievement motivation on effort is modulated by two possible factors of the motivational intensity theory (Wright and Kirby, 2001): perceived difficulty and maximally justified effort. Approach-driven (N=16) and avoidance-driven (N=16) participants were first instructed to perform a reaction time task to the best of their abilities. Next, the participants were instructed to consistently beat the performance standard established in the first condition. Approach-driven participants showed a stronger decrease in the mid-frequency band of heart rate variability, used as an index of mental effort, than avoidance-driven participants in the second instruction condition. Moreover, avoidance-driven participants showed higher corrugator supercilii reactivity, used as an index of negative affect, than approach-driven participants in the second instruction condition. No difference in perceived difficulty between the groups was observed. The results suggest that avoidance-driven participants developed negative affect in the second instruction condition, decreasing the maximally justified effort and their level of engagement.
Biswas, Rakesh; Maniam, Jayanthy; Lee, Edwin Wen Huo; Gopal, Premalatha; Umakanth, Shashikiran; Dahiya, Sumit; Ahmed, Sayeed
2008-10-01
The hypothesis in the conceptual model was that a user-driven innovation in presently available information and communication technology infrastructure would be able to meet patients' and health professionals' information needs and help them attain better health outcomes. An operational model was created to plan a trial on a sample diabetic population using a randomized controlled trial design, assigning one randomly selected group of diabetics to receive an electronic information intervention and analysing whether it would improve their health outcomes in comparison with a matched diabetic population receiving only regular medical intervention. Diabetes was chosen for this particular trial as it is a major chronic illness in Malaysia, as elsewhere in the world. This is in essence a position paper on how the study concept should be organized, intended to stimulate wider discussion prior to beginning the study.
Metabolic network reconstruction of Chlamydomonas offers insight into light-driven algal metabolism
Chang, Roger L; Ghamsari, Lila; Manichaikul, Ani; Hom, Erik F Y; Balaji, Santhanam; Fu, Weiqi; Shen, Yun; Hao, Tong; Palsson, Bernhard Ø; Salehi-Ashtiani, Kourosh; Papin, Jason A
2011-01-01
Metabolic network reconstruction encompasses existing knowledge about an organism's metabolism and genome annotation, providing a platform for omics data analysis and phenotype prediction. The model alga Chlamydomonas reinhardtii is employed to study diverse biological processes from photosynthesis to phototaxis. Recent heightened interest in this species results from an international movement to develop algal biofuels. Integrating biological and optical data, we reconstructed a genome-scale metabolic network for this alga and devised a novel light-modeling approach that enables quantitative growth prediction for a given light source, resolving wavelength and photon flux. We experimentally verified transcripts accounted for in the network and physiologically validated model function through simulation and generation of new experimental growth data, providing high confidence in network contents and predictive applications. The network offers insight into algal metabolism and potential for genetic engineering and efficient light source design, a pioneering resource for studying light-driven metabolism and quantitative systems biology. PMID:21811229
Model-Driven Configuration of SELinux Policies
NASA Astrophysics Data System (ADS)
Agreiter, Berthold; Breu, Ruth
The need for access control in computer systems is inherent. However, the complexity of configuring such systems is constantly increasing, which affects the overall security of a system negatively. We think that it is important to define security requirements on a non-technical level, while taking the application domain into account, in order to have a clear and separated view on security configuration (i.e. unblurred by technical details). On the other hand, security functionality has to be tightly integrated with the system and its development process in order to provide comprehensive means of enforcement. In this paper, we propose a systematic approach based on model-driven security configuration to leverage existing operating system security mechanisms (SELinux) for realising access control. We use UML models and develop a UML profile to satisfy these needs. Our goal is to exploit a comprehensive protection mechanism while rendering its security policy manageable by a domain specialist.
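The transformation from a domain-level access rule to a concrete SELinux type-enforcement statement can be sketched as follows. The rule model and the type names are hypothetical; only the `allow source target:class { perms };` output shape follows standard SELinux policy syntax:

```python
from dataclasses import dataclass

@dataclass
class AccessRule:
    # Platform-independent access rule, as a domain specialist might model
    # it (e.g. via a UML profile); all names here are hypothetical.
    subject: str        # application domain type
    resource: str       # resource label type
    obj_class: str      # SELinux object class, e.g. "file"
    permissions: tuple  # permissions granted on that class

def to_selinux(rule: AccessRule) -> str:
    """Generate an SELinux type-enforcement allow statement from the
    abstract rule (the model-to-policy step of the approach)."""
    perms = " ".join(rule.permissions)
    return f"allow {rule.subject} {rule.resource}:{rule.obj_class} {{ {perms} }};"

rule = AccessRule("webapp_t", "webapp_log_t", "file", ("read", "append"))
print(to_selinux(rule))
```

The value of the model-driven step is that the specialist edits the `AccessRule` abstraction, while the generated policy stays syntactically consistent by construction.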
Data-driven non-Markovian closure models
NASA Astrophysics Data System (ADS)
Kondrashov, Dmitri; Chekroun, Mickaël D.; Ghil, Michael
2015-03-01
This paper has two interrelated foci: (i) obtaining stable and efficient data-driven closure models by using a multivariate time series of partial observations from a large-dimensional system; and (ii) comparing these closure models with the optimal closures predicted by the Mori-Zwanzig (MZ) formalism of statistical physics. Multilayer stochastic models (MSMs) are introduced as both a generalization and a time-continuous limit of existing multilevel, regression-based approaches to closure in a data-driven setting; these approaches include empirical model reduction (EMR), as well as more recent multi-layer modeling. It is shown that the multilayer structure of MSMs can provide a natural Markov approximation to the generalized Langevin equation (GLE) of the MZ formalism. A simple correlation-based stopping criterion for an EMR-MSM model is derived to assess how well it approximates the GLE solution. Sufficient conditions are derived on the structure of the nonlinear cross-interactions between the constitutive layers of a given MSM to guarantee the existence of a global random attractor. This existence ensures that no blow-up can occur for a broad class of MSM applications, a class that includes non-polynomial predictors and nonlinearities that do not necessarily preserve quadratic energy invariants. The EMR-MSM methodology is first applied to a conceptual, nonlinear, stochastic climate model of coupled slow and fast variables, in which only slow variables are observed. It is shown that the resulting closure model with energy-conserving nonlinearities efficiently captures the main statistical features of the slow variables, even when there is no formal scale separation and the fast variables are quite energetic. Second, an MSM is shown to successfully reproduce the statistics of a partially observed, generalized Lotka-Volterra model of population dynamics in its chaotic regime. 
The challenges here include the rarity of strange attractors in the model's parameter space and the existence of multiple attractor basins with fractal boundaries. The positivity constraint on the solutions' components replaces here the quadratic-energy-preserving constraint of fluid-flow problems and it successfully prevents blow-up.
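The multilevel idea behind EMR/MSMs, modeling the main-level residual with its own dynamical layer, can be shown on a toy partially observed system. The AR(1) hidden forcing and the single extra layer below are illustrative assumptions, not the paper's stochastic climate model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Partially observed toy system: observed x is driven by a hidden,
# autocorrelated fast variable r that the level-0 model cannot see.
n = 5000
r = np.zeros(n)
x = np.zeros(n)
for t in range(1, n):
    r[t] = 0.9 * r[t - 1] + rng.normal(0.0, 0.3)   # hidden AR(1) forcing
    x[t] = 0.8 * x[t - 1] + r[t - 1]

# Level 0: regress the next observed state on the current one alone.
xp, xc = x[1:], x[:-1]
a = float(np.dot(xc, xp) / np.dot(xc, xc))
res0 = xp - a * xc                                  # level-0 residual

# Level 1 (closure layer): model the residual's own memory, as in EMR/MSMs,
# instead of treating it as white noise.
b = float(np.dot(res0[:-1], res0[1:]) / np.dot(res0[:-1], res0[:-1]))
res1 = res0[1:] - b * res0[:-1]

var_drop = float(1.0 - res1.var() / res0.var())     # variance explained by layer 1
```

The drop in residual variance is the signature that the closure layer is capturing hidden memory; an MSM stacks such layers until the remaining residual is effectively white.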
NASA Astrophysics Data System (ADS)
Tellman, B.; Schwarz, B.
2014-12-01
This talk describes the development of a web application to predict and communicate vulnerability to floods given publicly available data, disaster science, and geospatial cloud capabilities. A proof of concept in the Google Earth Engine API, with initial testing on case studies in New York and Uttarakhand, India, demonstrates the potential of highly parallelized cloud computing to model socio-ecological disaster vulnerability at high spatial and temporal resolution and in near real time. Cloud computing facilitates statistical modeling with variables derived from large public social and ecological datasets, including census data, nighttime lights (NTL), and WorldPop to derive social parameters, together with elevation, satellite imagery, rainfall, and observed flood data from the Dartmouth Flood Observatory to derive biophysical parameters. While more traditional, physically based hydrological models that rely on flow algorithms and numerical methods are currently unavailable in parallelized computing platforms like Google Earth Engine, there is high potential to explore "data-driven" modeling that trades physics for statistics in a parallelized environment. A data-driven approach to flood modeling with geographically weighted logistic regression has been initially tested on Hurricane Irene in southeastern New York. Comparison of model results with observed flood data reveals a 97% accuracy of the model in predicting flooded pixels. Testing on multiple storms is required to further validate this initially promising approach. A statistical social-ecological flood model that could produce rapid vulnerability assessments, predicting who might require immediate evacuation and where, could serve as an early warning. This type of early warning system would be especially relevant in data-poor places lacking the computing power, high-resolution data such as LiDAR and stream gauges, or hydrologic expertise to run physically based models in real time.
As the data-driven model presented relies on globally available data, the only real-time data input required would be typical data from a weather service, e.g. precipitation or a coarse-resolution flood prediction. However, model uncertainty will vary locally depending upon the resolution and frequency of observed flood and socio-economic damage impact data.
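A stripped-down version of such a data-driven flood classifier can be sketched with a plain (globally fitted, rather than geographically weighted) logistic regression on synthetic pixel features. The predictors and coefficients below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "pixels": low elevation plus high rainfall raises flood odds
# (a toy stand-in for the terrain/rainfall/census predictors above).
n = 2000
elev = rng.uniform(0.0, 100.0, n)          # m above channel
rain = rng.uniform(0.0, 50.0, n)           # storm total, mm
logit_true = 4.0 - 0.12 * elev + 0.10 * rain
flooded = (rng.uniform(size=n) < 1.0 / (1.0 + np.exp(-logit_true))).astype(float)

# Standardize features and fit logistic regression by gradient ascent.
Xs = np.column_stack([np.ones(n),
                      (elev - elev.mean()) / elev.std(),
                      (rain - rain.mean()) / rain.std()])

def fit_logistic(X, y, lr=0.1, steps=2000):
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        w += lr * X.T @ (y - p) / len(y)   # gradient of the log-likelihood
    return w

w = fit_logistic(Xs, flooded)
pred = 1.0 / (1.0 + np.exp(-Xs @ w)) > 0.5
accuracy = float((pred == flooded.astype(bool)).mean())
```

A geographically weighted version would refit `w` per location with distance-decaying sample weights, letting the elevation and rainfall coefficients vary across the study area.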
Geurts, Aron M; Collier, Lara S; Geurts, Jennifer L; Oseth, Leann L; Bell, Matthew L; Mu, David; Lucito, Robert; Godbout, Susan A; Green, Laura E; Lowe, Scott W; Hirsch, Betsy A; Leinwand, Leslie A; Largaespada, David A
2006-01-01
Previous studies of the Sleeping Beauty (SB) transposon system as an insertional mutagen in the mouse germline have used reverse-genetic approaches. These studies led to its proposed use for regional saturation mutagenesis via a forward-genetic approach. Thus, we used the SB system to mutate a region of mouse Chromosome 11 in a forward-genetic screen for recessive lethal and viable phenotypes. This work represents the first reported use of an insertional mutagen in a phenotype-driven approach. The phenotype-driven approach was successful both in recovering visible and behavioral mutants, including dominant limb and recessive behavioral phenotypes, and in allowing the rapid identification of candidate gene disruptions. In addition, a high frequency of recessive lethal mutations arose as a result of genomic rearrangements near the site of transposition, resulting from transposon mobilization. The results suggest that the SB system could be used in a forward-genetic approach to recover interesting phenotypes, but that local chromosomal rearrangements should be anticipated in conjunction with single-copy, local transposon insertions in chromosomes. Additionally, these mice may serve as a model for chromosome rearrangements caused by transposable elements during the evolution of vertebrate genomes. PMID:17009875
Model-Driven Development for scientific computing. An upgrade of the RHEEDGr program
NASA Astrophysics Data System (ADS)
Daniluk, Andrzej
2009-11-01
Model-Driven Engineering (MDE) is the software engineering discipline that considers models the most important element for software development, and for the maintenance and evolution of software, through model transformation. Model-Driven Architecture (MDA) is an approach to software development under the Model-Driven Engineering framework. This paper surveys the core MDA technology that was used to upgrade the RHEEDGR program to the C++0x language standard.
New version program summary
Program title: RHEEDGR-09
Catalogue identifier: ADUY_v3_0
Program summary URL:
NASA Astrophysics Data System (ADS)
Kovalskyy, V.; Henebry, G. M.
2012-01-01
Phenologies of the vegetated land surface are being used increasingly for diagnosis and prognosis of climate change consequences. Current prospective and retrospective phenological models stand far apart in their approaches to the subject. We report on an exploratory attempt to implement a phenological model based on a new event driven concept which has both diagnostic and prognostic capabilities in the same modeling framework. This Event Driven Phenological Model (EDPM) is shown to simulate land surface phenologies and phenophase transition dates in agricultural landscapes based on assimilation of weather data and land surface observations from spaceborne sensors. The model enables growing season phenologies to develop in response to changing environmental conditions and disturbance events. It also has the ability to ingest remotely sensed data to adjust its output to improve representation of the modeled variable. We describe the model and report results of initial testing of the EDPM using Level 2 flux tower records from the Ameriflux sites at Mead, Nebraska, USA, and at Bondville, Illinois, USA. Simulating the dynamics of normalized difference vegetation index based on flux tower data, the predictions by the EDPM show good agreement (RMSE < 0.08; r2 > 0.8) for maize and soybean during several growing seasons at different locations. This study presents the EDPM used in the companion paper (Kovalskyy and Henebry, 2011) in a coupling scheme to estimate daily actual evapotranspiration over multiple growing seasons.
NASA Astrophysics Data System (ADS)
Kovalskyy, V.; Henebry, G. M.
2011-05-01
Phenologies of the vegetated land surface are being used increasingly for diagnosis and prognosis of climate change consequences. Current prospective and retrospective phenological models stand far apart in their approaches to the subject. We report on an exploratory attempt to implement a phenological model based on a new event driven concept which has both diagnostic and prognostic capabilities in the same modeling framework. This Event Driven Phenological Model (EDPM) is shown to simulate land surface phenologies and phenophase transition dates in agricultural landscapes based on assimilation of weather data and land surface observations from spaceborne sensors. The model enables growing season phenologies to develop in response to changing environmental conditions and disturbance events. It also has the ability to ingest remotely sensed data to adjust its output to improve representation of the modeled variable. We describe the model and report results of initial testing of the EDPM using Level 2 flux tower records from the Ameriflux sites at Mead, Nebraska, USA, and at Bondville, Illinois, USA. Simulating the dynamics of normalized difference vegetation index based on flux tower data, the predictions by the EDPM show good agreement (RMSE < 0.08; r2 > 0.8) for maize and soybean during several growing seasons at different locations. This study presents the EDPM used in the companion paper (Kovalskyy and Henebry, 2011) in a coupling scheme to estimate daily actual evapotranspiration over multiple growing seasons.
NASA Technical Reports Server (NTRS)
Celaya, Jose; Saxena, Abhinav; Saha, Sankalita; Goebel, Kai F.
2011-01-01
An approach for predicting the remaining useful life of power MOSFET (metal-oxide-semiconductor field-effect transistor) devices has been developed. Power MOSFETs are semiconductor switching devices that are instrumental in electronics equipment such as those used in the operation and control of modern aircraft and spacecraft. The MOSFETs examined here were aged under thermal overstress in a controlled experiment, and continuous performance degradation data were collected from the accelerated aging experiment. Die-attach degradation was determined to be the primary failure mode. The collected run-to-failure data were analyzed, revealing that ON-state resistance increased as the die-attach degraded under high thermal stress. Results from finite element simulation analysis support the observations from the experimental data. Data-driven and model-based prognostics algorithms were investigated in which ON-state resistance was used as the primary precursor-of-failure feature. A Gaussian process regression algorithm was explored as an example of a data-driven technique, and an extended Kalman filter and a particle filter were used as examples of model-based techniques. Both methods were able to provide valid results. Prognostic performance metrics were employed to evaluate and compare the algorithms.
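The data-driven branch above can be illustrated with a bare-bones Gaussian process regression over an ON-resistance trend. Everything here is an assumption for illustration: the degradation curve, the squared-exponential kernel, and the hyperparameter values; the paper's actual data and GP configuration would differ.

```python
import numpy as np

def rbf_kernel(a, b, length=8.0, amp=0.1):
    """Squared-exponential covariance between two 1-D input sets."""
    d = a[:, None] - b[None, :]
    return amp**2 * np.exp(-0.5 * (d / length) ** 2)

# Synthetic ON-resistance degradation trend under thermal stress
# (illustrative shape only; real run-to-failure data would be used instead)
rng = np.random.default_rng(1)
t_all = np.arange(0.0, 100.0, 2.0)
r_all = 0.05 * (1.0 + 0.004 * t_all**1.3) + rng.normal(0.0, 5e-4, t_all.size)

# Fit on the first 40 samples, predict the held-out remainder
t_tr, r_tr = t_all[:40], r_all[:40]
t_te, r_te = t_all[40:], r_all[40:]

r_mean = r_tr.mean()                                   # simple constant mean
K = rbf_kernel(t_tr, t_tr) + 1e-6 * np.eye(t_tr.size)  # jitter for stability
alpha = np.linalg.solve(K, r_tr - r_mean)
pred = r_mean + rbf_kernel(t_te, t_tr) @ alpha          # posterior mean
err_first = abs(pred[0] - r_te[0])
```

The posterior mean tracks the degradation trend near the training data; far extrapolation reverts toward the constant mean, which is one reason model-based filters (EKF, particle filter) are attractive for long-horizon RUL prediction.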
Corredor, Iván; Bernardos, Ana M.; Iglesias, Josué; Casar, José R.
2012-01-01
Advances in electronics nowadays facilitate the design of smart spaces based on physical mash-ups of sensor and actuator devices. At the same time, software paradigms such as Internet of Things (IoT) and Web of Things (WoT) are motivating the creation of technology to support the development and deployment of web-enabled embedded sensor and actuator devices with two major objectives: (i) to integrate sensing and actuating functionalities into everyday objects, and (ii) to easily allow a diversity of devices to plug into the Internet. Currently, developers who are applying this Internet-oriented approach need to have solid understanding about specific platforms and web technologies. In order to alleviate this development process, this research proposes a Resource-Oriented and Ontology-Driven Development (ROOD) methodology based on the Model Driven Architecture (MDA). This methodology aims at enabling the development of smart spaces through a set of modeling tools and semantic technologies that support the definition of the smart space and the automatic generation of code at hardware level. ROOD feasibility is demonstrated by building an adaptive health monitoring service for a Smart Gym. PMID:23012544
Dynamically adaptive data-driven simulation of extreme hydrological flows
NASA Astrophysics Data System (ADS)
Kumar Jain, Pushkar; Mandli, Kyle; Hoteit, Ibrahim; Knio, Omar; Dawson, Clint
2018-02-01
Hydrological hazards such as storm surges, tsunamis, and rainfall-induced flooding are physically complex events that are costly in loss of human life and economic productivity. Many such disasters could be mitigated through improved emergency evacuation in real-time and through the development of resilient infrastructure based on knowledge of how systems respond to extreme events. Data-driven computational modeling is a critical technology underpinning these efforts. This investigation focuses on the novel combination of methodologies in forward simulation and data assimilation. The forward geophysical model utilizes adaptive mesh refinement (AMR), a process by which a computational mesh can adapt in time and space based on the current state of a simulation. The forward solution is combined with ensemble based data assimilation methods, whereby observations from an event are assimilated into the forward simulation to improve the veracity of the solution, or used to invert for uncertain physical parameters. The novelty in our approach is the tight two-way coupling of AMR and ensemble filtering techniques. The technology is tested using actual data from the Chile tsunami event of February 27, 2010. These advances offer the promise of significantly transforming data-driven, real-time modeling of hydrological hazards, with potentially broader applications in other science domains.
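The ensemble-based assimilation step mentioned above can be sketched with a minimal stochastic ensemble Kalman filter update. The two-state toy system, observation operator, and all parameter values are assumptions for illustration, not the paper's tsunami configuration:

```python
import numpy as np

def enkf_update(ensemble, obs, obs_std, H):
    """Stochastic EnKF analysis step; ensemble has shape (n_members, n_state)."""
    rng = np.random.default_rng(42)
    n = ensemble.shape[0]
    X = ensemble - ensemble.mean(axis=0)            # state anomalies
    Y = (H @ ensemble.T).T
    Y = Y - Y.mean(axis=0)                          # predicted-obs anomalies
    P_xy = X.T @ Y / (n - 1)                        # state-observation covariance
    P_yy = Y.T @ Y / (n - 1) + obs_std**2 * np.eye(Y.shape[1])
    K = P_xy @ np.linalg.inv(P_yy)                  # Kalman gain
    perturbed = obs + rng.normal(0.0, obs_std, (n, obs.size))
    innovation = perturbed - (H @ ensemble.T).T
    return ensemble + innovation @ K.T

# Toy two-state system: only the first component is observed
H = np.array([[1.0, 0.0]])
prior = np.random.default_rng(0).normal([0.0, 0.0], 1.0, (200, 2))
posterior = enkf_update(prior, np.array([1.5]), 0.1, H)
```

The analysis pulls the ensemble toward the observation and shrinks its spread; in the coupled AMR setting this update would be applied on the refined mesh at each assimilation time.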
Models and Frameworks: A Synergistic Association for Developing Component-Based Applications
Sánchez-Ledesma, Francisco; Sánchez, Pedro; Pastor, Juan A.; Álvarez, Bárbara
2014-01-01
The use of frameworks and components has been shown to be effective in improving software productivity and quality. However, the results in terms of reuse and standardization show a dearth of portability either of designs or of component-based implementations. This paper, which is based on the model driven software development paradigm, presents an approach that separates the description of component-based applications from their possible implementations for different platforms. This separation is supported by automatic integration of the code obtained from the input models into frameworks implemented using object-oriented technology. Thus, the approach combines the benefits of modeling applications from a higher level of abstraction than objects, with the higher levels of code reuse provided by frameworks. In order to illustrate the benefits of the proposed approach, two representative case studies that use both an existing framework and an ad hoc framework, are described. Finally, our approach is compared with other alternatives in terms of the cost of software development. PMID:25147858
Models and frameworks: a synergistic association for developing component-based applications.
Alonso, Diego; Sánchez-Ledesma, Francisco; Sánchez, Pedro; Pastor, Juan A; Álvarez, Bárbara
2014-01-01
The use of frameworks and components has been shown to be effective in improving software productivity and quality. However, the results in terms of reuse and standardization show a dearth of portability either of designs or of component-based implementations. This paper, which is based on the model driven software development paradigm, presents an approach that separates the description of component-based applications from their possible implementations for different platforms. This separation is supported by automatic integration of the code obtained from the input models into frameworks implemented using object-oriented technology. Thus, the approach combines the benefits of modeling applications from a higher level of abstraction than objects, with the higher levels of code reuse provided by frameworks. In order to illustrate the benefits of the proposed approach, two representative case studies that use both an existing framework and an ad hoc framework, are described. Finally, our approach is compared with other alternatives in terms of the cost of software development.
Application of Petri Nets in Bone Remodeling
Li, Lingxi; Yokota, Hiroki
2009-01-01
Understanding the mechanism of bone remodeling is a challenging task for both life scientists and model builders, since this highly interactive and nonlinear process can seldom be grasped by simple intuition. A set of ordinary differential equations (ODEs) has been built for simulating bone formation as well as bone resorption. Although solving ODEs numerically can provide useful predictions of dynamical behaviors in a continuous time frame, an actual bone remodeling process in living tissues is driven by discrete events of molecular and cellular interactions. Thus, an event-driven tool such as Petri nets (PNs), which may dynamically and graphically mimic individual molecular collisions or cellular interactions, seems to augment the existing ODE-based systems analysis. Here, we applied PNs to expand the ODE-based approach and examined discrete, dynamical behaviors of key regulatory molecules and bone cells. PNs have been used in many engineering areas, but their application to biological systems needs to be explored. Our PN model was based on 8 ODEs that described an osteoprotegerin-linked molecular pathway consisting of 4 types of bone cells. The models allowed us to conduct both qualitative and quantitative evaluations and to examine homeostatic equilibrium states. The results support that application of PN models assists understanding of an event-driven bone remodeling mechanism through PN-specific constructs such as places, transitions, and firings. PMID:19838338
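The PN-specific constructs named above (places, transitions, firings) fit in a few lines of code. The toy resorption/formation cycle below is illustrative only; it is not the paper's 8-ODE osteoprotegerin pathway, and all place and transition names are invented:

```python
# Minimal Petri net: places hold token counts; a transition is enabled
# when every input place holds enough tokens, and firing moves tokens.
class PetriNet:
    def __init__(self, marking):
        self.marking = dict(marking)      # place name -> token count
        self.transitions = []             # list of (inputs, outputs) dicts

    def add_transition(self, inputs, outputs):
        self.transitions.append((inputs, outputs))

    def enabled(self, t):
        inputs, _ = self.transitions[t]
        return all(self.marking[p] >= n for p, n in inputs.items())

    def fire(self, t):
        if not self.enabled(t):
            raise ValueError("transition not enabled")
        inputs, outputs = self.transitions[t]
        for p, n in inputs.items():
            self.marking[p] -= n
        for p, n in outputs.items():
            self.marking[p] = self.marking.get(p, 0) + n

# Toy remodeling cycle: resorption consumes bone, formation rebuilds it
net = PetriNet({"bone": 3, "osteoclast": 1, "osteoblast": 1, "resorbed": 0})
net.add_transition({"bone": 1, "osteoclast": 1},
                   {"osteoclast": 1, "resorbed": 1})   # t0: resorption event
net.add_transition({"resorbed": 1, "osteoblast": 1},
                   {"osteoblast": 1, "bone": 1})       # t1: formation event
net.fire(0)
net.fire(1)
```

Firing t0 then t1 returns the marking to its starting state, a discrete-event analogue of the homeostatic equilibrium the abstract describes.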
Indicators of ecosystem function identify alternate states in the sagebrush steppe.
Kachergis, Emily; Rocca, Monique E; Fernandez-Gimenez, Maria E
2011-10-01
Models of ecosystem change that incorporate nonlinear dynamics and thresholds, such as state-and-transition models (STMs), are increasingly popular tools for land management decision-making. However, few models are based on systematic collection and documentation of ecological data, and of these, most rely solely on structural indicators (species composition) to identify states and transitions. As STMs are adopted as an assessment framework throughout the United States, finding effective and efficient ways to create data-driven models that integrate ecosystem function and structure is vital. This study aims to (1) evaluate the utility of functional indicators (indicators of rangeland health, IRH) as proxies for more difficult ecosystem function measurements and (2) create a data-driven STM for the sagebrush steppe of Colorado, USA, that incorporates both ecosystem structure and function. We sampled soils, plant communities, and IRH at 41 plots with similar clayey soils but different site histories to identify potential states and infer the effects of management practices and disturbances on transitions. We found that many IRH were correlated with quantitative measures of functional indicators, suggesting that the IRH can be used to approximate ecosystem function. In addition to a reference state that functions as expected for this soil type, we identified four biotically and functionally distinct potential states, consistent with the theoretical concept of alternate states. Three potential states were related to management practices (chemical and mechanical shrub treatments and seeding history) while one was related only to ecosystem processes (erosion). IRH and potential states were also related to environmental variation (slope, soil texture), suggesting that there are environmental factors within areas with similar soils that affect ecosystem dynamics and should be noted within STMs. 
Our approach generated an objective, data-driven model of ecosystem dynamics for rangeland management. Our findings suggest that the IRH approximate ecosystem processes and can distinguish between alternate states and communities and identify transitions when building data-driven STMs. Functional indicators are a simple, efficient way to create data-driven models that are consistent with alternate state theory. Managers can use them to improve current model-building methods and thus apply state-and-transition models more broadly for land management decision-making.
Contextualizing Learning Scenarios According to Different Learning Management Systems
ERIC Educational Resources Information Center
Drira, R.; Laroussi, M.; Le Pallec, X.; Warin, B.
2012-01-01
In this paper, we first demonstrate that an instructional design process of Technology Enhanced Learning (TEL) systems based on a Model Driven Approach (MDA) addresses the limits of Learning Technology Standards (LTS), such as SCORM and IMS-LD. Although these standards ensure the interoperability of TEL systems across different Learning Management…
Empirical Analysis of Exploiting Review Helpfulness for Extractive Summarization of Online Reviews
ERIC Educational Resources Information Center
Xiong, Wenting; Litman, Diane
2014-01-01
We propose a novel unsupervised extractive approach for summarizing online reviews by exploiting review helpfulness ratings. In addition to using the helpfulness ratings for review-level filtering, we suggest using them as the supervision of a topic model for sentence-level content scoring. The proposed method is metadata-driven, requiring no…
Towards Model-Driven End-User Development in CALL
ERIC Educational Resources Information Center
Farmer, Rod; Gruba, Paul
2006-01-01
The purpose of this article is to introduce end-user development (EUD) processes to the CALL software development community. EUD refers to the active participation of end-users, as non-professional developers, in the software development life cycle. Unlike formal software engineering approaches, the focus in EUD on means/ends development is…
Community College Dual Enrollment Faculty Orientation: A Utilization-Focused Approach
ERIC Educational Resources Information Center
Charlier, Hara D.; Duggan, Molly H.
2010-01-01
The current climate of accountability demands that institutions engage in data-driven program evaluation. In order to promote quality dual enrollment (DE) programs, institutions must support the adjunct faculty teaching college courses in high schools. This study uses Patton's utilization-focused model (1997) to conduct a formative evaluation of a…
Data-Driven Approaches for Paraphrasing across Language Variations
ERIC Educational Resources Information Center
Xu, Wei
2014-01-01
Our language changes very rapidly, accompanying political, social and cultural trends, as well as the evolution of science and technology. The Internet, especially the social media, has accelerated this process of change. This poses a severe challenge for both human beings and natural language processing (NLP) systems, which usually only model a…
USDA-ARS?s Scientific Manuscript database
Bovine mastitis is an inflammation-driven disease of the bovine mammary gland that costs the global dairy industry several billion dollars per annum. Because disease susceptibility is a multi-factorial complex phenotype, a multi-omic integrative biology approach is required to dissect the multilayer...
Tuition Pricing and Aid Strategies: A Practical Approach. AIR 1994 Annual Forum Paper.
ERIC Educational Resources Information Center
Fine, Paul L.
This paper examines the applicability of net tuition revenue models for a highly selective, elite priced, private research university in the southern U.S. Pricing and aid strategies for this university seem to be driven by intuitive assumptions about the economy, market forces, needs-blind admissions, student satisfaction, net price…
Model and Subcomponent Development for a Pulse-Combustor-Driven Microgenerator
2004-08-31
sputtering of thin magnetic and dielectric layers [4]; and mechanical lamination of polymer-coated NiFe foils [5]. Although these approaches have... A photomicrograph of the fabricated device is given in Figure 4.2-6, showing a 3D solenoid-like Cu coil, SU8 epoxy, and a laminated NiFe core.
Academic Procrastinators, Strategic Delayers and Something Betwixt and Between: An Interview Study
ERIC Educational Resources Information Center
Lindblom-Ylänne, Sari; Saariaho, Emmi; Inkinen, Mikko; Haarala-Muhonen, Anne; Hailikari, Telle
2015-01-01
The study explored university undergraduates' dilatory behaviour, more precisely, procrastination and strategic delaying. Using qualitative interview data, we applied a theory-driven and person-oriented approach to test the theoretical model of Klingsieck (2013). The sample consisted of 28 Bachelor students whose study pace had been slow during…
ERIC Educational Resources Information Center
Song, Ji Hoon; Kim, Hye Kyoung; Park, Sunyoung; Bae, Sang Hoon
2014-01-01
The purpose of this study was to develop an empirical data-driven model for a knowledge creation school system in career technical education (CTE) by identifying supportive and hindering factors influencing knowledge creation practices in CTE schools. Nonaka and colleagues' (Nonaka & Konno, 1998; Nonaka & Takeuchi, 1995) knowledge…
The Tough Road to Better Science Teaching
ERIC Educational Resources Information Center
Brainard, Jeffrey
2007-01-01
For decades introductory science courses have relied largely on lectures and tests that reward memorization of facts and formulas, an approach that has driven away many talented students. While new teaching models have shown success in engaging and retaining undergraduates, they have yet to be widely adopted in academe. For one thing, the tenure…
Strategic Industrial Alliances in Paper Industry: XML- vs Ontology-Based Integration Platforms
ERIC Educational Resources Information Center
Naumenko, Anton; Nikitin, Sergiy; Terziyan, Vagan; Zharko, Andriy
2005-01-01
Purpose: To identify cases related to design of ICT platforms for industrial alliances, where the use of Ontology-driven architectures based on Semantic web standards is more advantageous than application of conventional modeling together with XML standards. Design/methodology/approach: A comparative analysis of the two latest and the most obvious…
Minnows as a Classroom Model for Human Environmental Health
ERIC Educational Resources Information Center
Weber, Daniel N.; Hesselbach, Renee; Kane, Andrew S.; Petering, David H.; Petering, Louise; Berg, Craig A.
2013-01-01
Understanding human environmental health is difficult for high school students, as is the process of scientific investigation. This module provides a framework to address both concerns through an inquiry-based approach using a hypothesis-driven set of experiments that draws upon a real-life concern, environmental exposures to lead (Pb2+). Students…
NASA Astrophysics Data System (ADS)
Gang, Grace J.; Siewerdsen, Jeffrey H.; Webster Stayman, J.
2017-06-01
Tube current modulation (TCM) is routinely adopted on diagnostic CT scanners for dose reduction. Conventional TCM strategies are generally designed for filtered-backprojection (FBP) reconstruction to satisfy simple image quality requirements based on noise. This work investigates TCM designs for model-based iterative reconstruction (MBIR) to achieve optimal imaging performance as determined by a task-based image quality metric. Additionally, regularization is an important aspect of MBIR that is jointly optimized with TCM, and includes both the regularization strength that controls overall smoothness as well as directional weights that permit control of the isotropy/anisotropy of the local noise and resolution properties. Initial investigations focus on a known imaging task at a single location in the image volume. The framework adopts Fourier and analytical approximations for fast estimation of the local noise power spectrum (NPS) and modulation transfer function (MTF), each carrying dependencies on TCM and regularization. For the single-location optimization, the local detectability index (d') of the specific task was directly adopted as the objective function. A covariance matrix adaptation evolution strategy (CMA-ES) algorithm was employed to identify the optimal combination of imaging parameters. Evaluations of both conventional and task-driven approaches were performed in an abdomen phantom for a mid-frequency discrimination task in the kidney. Among the conventional strategies, the TCM pattern optimal for FBP using a minimum variance criterion yielded a worse task-based performance compared to an unmodulated strategy when applied to MBIR. Moreover, task-driven TCM designs for MBIR were found to have the opposite behavior from conventional designs for FBP, with greater fluence assigned to the less attenuating views of the abdomen and less fluence to the more attenuating lateral views.
Such TCM patterns exaggerate the intrinsic anisotropy of the MTF and NPS as a result of the data weighting in MBIR. Directional penalty design was found to reinforce the same trend. The task-driven approaches outperform conventional approaches, with the maximum improvement in d‧ of 13% given by the joint optimization of TCM and regularization. This work demonstrates that the TCM optimal for MBIR is distinct from conventional strategies proposed for FBP reconstruction and strategies optimal for FBP are suboptimal and may even reduce performance when applied to MBIR. The task-driven imaging framework offers a promising approach for optimizing acquisition and reconstruction for MBIR that can improve imaging performance and/or dose utilization beyond conventional imaging strategies.
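A task-based detectability index of the kind used as the objective above can be sketched with a non-prewhitening observer computed from local MTF and NPS estimates. The MTF, NPS, and task-function shapes below are illustrative stand-ins, and the paper's exact observer model may differ:

```python
import numpy as np

# Non-prewhitening detectability: d'^2 = [∫ (MTF·W)^2 df]^2 / ∫ NPS·(MTF·W)^2 df
f = np.linspace(0.01, 1.0, 200)            # spatial frequency axis (assumed units)
df = f[1] - f[0]
mtf = np.exp(-2.0 * f)                     # illustrative MTF roll-off
nps = 1e-3 * f * np.exp(-1.5 * f)          # illustrative ramp-like NPS
w_task = np.exp(-((f - 0.3) ** 2) / 0.01)  # mid-frequency discrimination task

signal = (mtf * w_task) ** 2
d_prime = np.sqrt((np.sum(signal) * df) ** 2 / (np.sum(nps * signal) * df))
```

Because TCM and regularization both reshape the MTF and NPS entering this expression, d' gives a single scalar objective that a search method such as CMA-ES can maximize.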
Multi-source micro-friction identification for a class of cable-driven robots with passive backbone
NASA Astrophysics Data System (ADS)
Tjahjowidodo, Tegoeh; Zhu, Ke; Dailey, Wayne; Burdet, Etienne; Campolo, Domenico
2016-12-01
This paper analyses the dynamics of cable-driven robots with a passive backbone and develops techniques for their dynamic identification, which are tested on the H-Man, a planar cabled differential transmission robot for haptic interaction. The mechanism is optimized for human-robot interaction by accounting for the cost-benefit-ratio of the system, specifically by eliminating the necessity of an external force sensor to reduce the overall cost. As a consequence, this requires an effective dynamic model for accurate force feedback applications which include friction behavior in the system. We first consider the significance of friction in both the actuator and backbone spaces. Subsequently, we study the required complexity of the stiction model for the application. Different models representing different levels of complexity are investigated, ranging from the conventional approach of Coulomb to an advanced model which includes hysteresis. The results demonstrate each model's ability to capture the dynamic behavior of the system. In general, it is concluded that there is a trade-off between model accuracy and the model cost.
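The trade-off between simple and richer friction models can be illustrated by comparing a Coulomb-plus-viscous law with a Stribeck-type stiction curve (hysteresis, the most complex model considered in the paper, is omitted here). All parameter values are invented for illustration, not the identified H-Man values:

```python
import numpy as np

def coulomb_viscous(v, f_c=0.8, b=0.15):
    """Classic Coulomb + viscous friction force as a function of velocity."""
    return f_c * np.sign(v) + b * v

def stribeck(v, f_c=0.8, f_s=1.2, v_s=0.05, b=0.15):
    """Stribeck curve: stiction level decays toward the Coulomb level."""
    return (f_c + (f_s - f_c) * np.exp(-(v / v_s) ** 2)) * np.sign(v) + b * v

v = np.linspace(-1.0, 1.0, 201)
tau_c = coulomb_viscous(v)
tau_s = stribeck(v)
```

Near zero velocity the Stribeck model predicts a higher breakaway force than the Coulomb model, which is exactly the low-speed regime that matters for accurate sensorless force feedback.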
Modelling irradiation-induced softening in BCC iron by crystal plasticity approach
NASA Astrophysics Data System (ADS)
Xiao, Xiazi; Terentyev, Dmitry; Yu, Long; Song, Dingkun; Bakaev, A.; Duan, Huiling
2015-11-01
A crystal plasticity model (CPM) for BCC iron that accounts for radiation-induced strain softening is proposed. The CPM is based on the plastically-driven and thermally-activated removal of dislocation loops. Atomistic simulations are applied to parameterize dislocation-defect interactions. Combining experimental microstructures, defect-hardening/absorption rules from atomistic simulations, and a CPM fitted to the properties of non-irradiated iron, the model achieves good agreement with experimental data regarding radiation-induced strain softening and the flow stress increase under neutron irradiation.
Sinker tectonics - An approach to the surface of Miranda
NASA Technical Reports Server (NTRS)
Janes, D. M.; Melosh, H. J.
1988-01-01
Two of the proposed explanations for the coronae seen on Miranda involve mantle convection driven by density anomalies. In the sinker model, the coronae result from late-accreting large silicate bodies slowly sinking through an icy mantle toward the body's center; in the riser model, they result from a compositionally produced, low-density, rising diapir. The present study determines the surface stresses induced by such density anomalies and the expected surface expressions. The results are in good agreement with the predictions of the sinker model.
Airborne Detection and Tracking of Geologic Leakage Sites
NASA Astrophysics Data System (ADS)
Jacob, Jamey; Allamraju, Rakshit; Axelrod, Allan; Brown, Calvin; Chowdhary, Girish; Mitchell, Taylor
2014-11-01
Safe storage of CO2 to reduce greenhouse gas emissions without adversely affecting energy use or hindering economic growth requires development of monitoring technology that is capable of validating storage permanence while ensuring the integrity of sequestration operations. Soil gas monitoring has difficulty accurately distinguishing gas flux signals related to leakage from those associated with meteorologically driven changes of soil moisture and temperature. Integrated ground and airborne monitoring systems are being deployed capable of directly detecting CO2 concentration in storage sites. Two complementary approaches to detecting leaks in the carbon sequestration fields are presented. The first approach focuses on reducing the requisite network communication for fusing individual Gaussian Process (GP) CO2 sensing models into a global GP CO2 model. The GP fusion approach learns how to optimally allocate the static and mobile sensors. The second approach leverages a hierarchical GP-Sigmoidal Gaussian Cox Process for airborne predictive mission planning to optimally reduce the entropy of the global CO2 model. Results from the approaches will be presented.
NASA Astrophysics Data System (ADS)
Crouch, Dustin L.; Huang, He (Helen)
2017-06-01
Objective. We investigated the feasibility of a novel, customizable, simplified EMG-driven musculoskeletal model for estimating coordinated hand and wrist motions during a real-time path tracing task. Approach. A two-degree-of-freedom computational musculoskeletal model was implemented for real-time EMG-driven control of a stick figure hand displayed on a computer screen. After 5-10 minutes of undirected practice, subjects were given three attempts to trace 10 straight paths, one at a time, with the fingertip of the virtual hand. Able-bodied subjects completed the task on two separate test days. Main results. Across subjects and test days, there was a significant linear relationship between log-transformed measures of accuracy and speed (Pearson’s r = 0.25, p < 0.0001). The amputee subject could coordinate movement between the wrist and MCP joints, but favored metacarpophalangeal joint motion more highly than able-bodied subjects in 8 of 10 trials. For able-bodied subjects, tracing accuracy was lower at the extremes of the model’s range of motion, though there was no apparent relationship between tracing accuracy and fingertip location for the amputee. Our result suggests that, unlike able-bodied subjects, the amputee’s motor control patterns were not accustomed to the multi-joint dynamics of the wrist and hand, possibly as a result of post-amputation cortical plasticity, disuse, or sensory deficits. Significance. To our knowledge, our study is one of very few that have demonstrated the real-time simultaneous control of multi-joint movements, especially wrist and finger movements, using an EMG-driven musculoskeletal model, which differs from the many data-driven algorithms that dominate the literature on EMG-driven prosthesis control. Real-time control was achieved with very little training and simple, quick (~15 s) calibration. 
Thus, our model is potentially a practical and effective control platform for multifunctional myoelectric prostheses that could restore more life-like hand function for individuals with upper limb amputation.
Towards predictive models of the human gut microbiome
2014-01-01
The intestinal microbiota is an ecosystem susceptible to external perturbations such as dietary changes and antibiotic therapies. Mathematical models of microbial communities could be of great value in the rational design of microbiota-tailoring diets and therapies. Here, we discuss how advances in another field, engineering of microbial communities for wastewater treatment bioreactors, could inspire development of mechanistic mathematical models of the gut microbiota. We review the current state-of-the-art in bioreactor modeling and current efforts in modeling the intestinal microbiota. Mathematical modeling could benefit greatly from the deluge of data emerging from metagenomic studies, but data-driven approaches such as network inference that aim to predict microbiome dynamics without explicit mechanistic knowledge seem better suited to model these data. Finally, we discuss how the integration of microbiome shotgun sequencing and metabolic modeling approaches such as flux balance analysis may fulfill the promise of a mechanistic model of the intestinal microbiota. PMID:24727124
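A common mechanistic starting point for such community models is generalized Lotka-Volterra (gLV) dynamics, in which each species' growth rate depends linearly on the abundances of the others. The two-species system below, with invented growth rates and interaction coefficients, is a minimal sketch rather than any fitted gut-microbiome model:

```python
import numpy as np

def glv_step(x, r, A, dt=0.01):
    """One forward-Euler step of generalized Lotka-Volterra dynamics."""
    return x + dt * x * (r + A @ x)

r = np.array([1.0, 0.8])          # intrinsic growth rates (assumed)
A = np.array([[-1.0, -0.5],       # self-limitation and competition (assumed)
              [-0.4, -1.0]])
x = np.array([0.1, 0.1])          # initial abundances
for _ in range(5000):             # integrate to t = 50
    x = glv_step(x, r, A)
```

The trajectory settles at the coexistence equilibrium x* = -A^{-1} r, here (0.75, 0.5); network-inference approaches effectively try to recover r and A from abundance time series.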
Tjiam, Irene M; Schout, Barbara M A; Hendrikx, Ad J M; Scherpbier, Albert J J M; Witjes, J Alfred; van Merriënboer, Jeroen J G
2012-01-01
Most studies of simulator-based surgical skills training have focused on the acquisition of psychomotor skills, but surgical procedures are complex tasks requiring both psychomotor and cognitive skills. As skills training is modelled on expert performance consisting partly of unconscious automatic processes that experts are not always able to explicate, simulator developers should collaborate with educational experts and physicians in developing efficient and effective training programmes. This article presents an approach to designing simulator-based skill training comprising cognitive task analysis integrated with instructional design according to the four-component instructional design (4C/ID) model. This theory-driven approach is illustrated by a description of how it was used in the development of simulator-based training for the nephrostomy procedure.
A Model to Translate Evidence-Based Interventions Into Community Practice
Christiansen, Ann L.; Peterson, Donna J.; Guse, Clare E.; Maurana, Cheryl A.; Brandenburg, Terry
2012-01-01
There is a tension between 2 alternative approaches to implementing community-based interventions. The evidence-based public health movement emphasizes the scientific basis of prevention by disseminating rigorously evaluated interventions from academic and governmental agencies to local communities. Models used by local health departments to incorporate community input into their planning, such as the community health improvement process (CHIP), emphasize community leadership in identifying health problems and developing and implementing health improvement strategies. Each approach has limitations. Modifying CHIP to formally include consideration of evidence-based interventions in both the planning and evaluation phases leads to an evidence-driven community health improvement process that can serve as a useful framework for uniting the different approaches while emphasizing community ownership, priorities, and wisdom. PMID:22397341
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koven, C. D.; Chambers, J. Q.; Georgiou, K.
To better understand sources of uncertainty in projections of terrestrial carbon cycle feedbacks, we present an approach to separate the controls on modeled carbon changes. We separate carbon changes into four categories using a linearized, equilibrium approach: those arising from changed inputs (productivity-driven changes), and outputs (turnover-driven changes), of both the live and dead carbon pools. Using Coupled Model Intercomparison Project Phase 5 (CMIP5) simulations for five models, we find that changes to the live pools are primarily explained by productivity-driven changes, with only one model showing large compensating changes to live carbon turnover times. For dead carbon pools, the situation is more complex as all models predict a large reduction in turnover times in response to increases in productivity. This response arises from the common representation of a broad spectrum of decomposition turnover times via a multi-pool approach, in which flux-weighted turnover times are faster than mass-weighted turnover times. This leads to a shift in the distribution of carbon among dead pools in response to changes in inputs, and therefore a transient but long-lived reduction in turnover times. Since this behavior, a reduction in inferred turnover times resulting from an increase in inputs, is superficially similar to priming processes, but occurring without the mechanisms responsible for priming, we call the phenomenon "false priming", and show that it masks much of the intrinsic changes to dead carbon turnover times as a result of changing climate. These patterns hold across the fully coupled, biogeochemically coupled, and radiatively coupled 1% yr−1 increasing CO2 experiments. We disaggregate inter-model uncertainty in the globally integrated equilibrium carbon responses to initial turnover times, initial productivity, fractional changes in turnover, and fractional changes in productivity.
For both the live and dead carbon pools, inter-model spread in carbon changes arising from initial conditions is dominated by model disagreement on turnover times, whereas inter-model spread in carbon changes from fractional changes to these terms is dominated by model disagreement on changes to productivity in response to both warming and CO 2 fertilization. However, the lack of changing turnover time control on carbon responses, for both live and dead carbon pools, in response to the imposed forcings may arise from a common lack of process representation behind changing turnover times (e.g., allocation and mortality for live carbon; permafrost, microbial dynamics, and mineral stabilization for dead carbon), rather than a true estimate of the importance of these processes.« less
2015-09-07
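The multi-pool mechanism behind "false priming" can be sketched numerically. A minimal two-pool sketch, with illustrative turnover times, input partitioning, and step change (not values from any CMIP5 model):

```python
import numpy as np

# Two dead-carbon pools with fixed turnover times (years).
# All values are illustrative, not taken from any CMIP5 model.
tau = np.array([5.0, 500.0])      # fast and slow pool turnover times
frac = np.array([0.9, 0.1])       # fraction of inputs routed to each pool

def steady_state(inputs):
    """Equilibrium pool sizes: m_i = f_i * I * tau_i."""
    return frac * inputs * tau

def bulk_turnover(m):
    """Inferred (mass / outflow) turnover time of the aggregate pool."""
    outflow = np.sum(m / tau)
    return np.sum(m) / outflow

I0 = 1.0
m = steady_state(I0)
tau_eq = bulk_turnover(m)          # mass-weighted turnover before perturbation

# Step-increase inputs by 30% and integrate dm/dt = f_i*I - m_i/tau_i.
I1, dt = 1.3, 0.5
taus = []
for _ in range(400):               # 200 simulated years
    m = m + dt * (frac * I1 - m / tau)
    taus.append(bulk_turnover(m))

# "False priming": the inferred turnover time drops transiently because new
# carbon accumulates first in the fast pool, then slowly recovers toward tau_eq.
assert min(taus) < tau_eq
```

The dip in inferred turnover occurs with no change at all to the pools' intrinsic turnover times, which is the point of the "false priming" terminology.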
Closed-loop suppression of chaos in nonlinear driven oscillators
NASA Astrophysics Data System (ADS)
Aguirre, L. A.; Billings, S. A.
1995-05-01
This paper discusses the suppression of chaos in nonlinear driven oscillators via the addition of a periodic perturbation. Given a system originally undergoing chaotic motions, it is desired that such a system be driven to some periodic orbit. This can be achieved by the addition of a weak periodic signal to the oscillator input. This is usually accomplished in open loop, but this procedure presents some difficulties which are discussed in the paper. To ensure that this is attained despite uncertainties and possible disturbances on the system, a procedure is suggested to perform control in closed loop. In addition, it is illustrated how a model, estimated from input/output data, can be used in the design. Numerical examples which use the Duffing-Ueda and modified van der Pol oscillators are included to illustrate some of the properties of the new approach.
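The open-loop idea, adding a weak periodic signal to the oscillator input, can be sketched for the Duffing-Ueda system. The perturbation amplitude and frequency below are illustrative assumptions, and this is not the paper's closed-loop controller:

```python
import math

# Duffing-Ueda oscillator x'' + k x' + x^3 = A cos(t), integrated with RK4.
# A weak secondary periodic signal eps*cos(w*t) is added to the input, as in
# the open-loop suppression idea; eps and w are illustrative choices.
def duffing_rhs(t, x, v, k=0.05, A=7.5, eps=0.5, w=2.0):
    drive = A * math.cos(t) + eps * math.cos(w * t)   # input + weak perturbation
    return v, drive - k * v - x**3

def rk4(t, x, v, h):
    k1x, k1v = duffing_rhs(t, x, v)
    k2x, k2v = duffing_rhs(t + h/2, x + h/2*k1x, v + h/2*k1v)
    k3x, k3v = duffing_rhs(t + h/2, x + h/2*k2x, v + h/2*k2v)
    k4x, k4v = duffing_rhs(t + h, x + h*k1x if False else x + h*k3x, v + h*k3v)
    return (x + h/6*(k1x + 2*k2x + 2*k3x + k4x),
            v + h/6*(k1v + 2*k2v + 2*k3v + k4v))

t, x, v, h = 0.0, 0.0, 0.0, 0.01
samples = []
for i in range(200_000):                 # 2000 s of simulated time
    x, v = rk4(t, x, v, h)
    t += h
    if i % 628 == 0:                     # roughly one sample per forcing period
        samples.append(x)

# The trajectory stays bounded; inspecting `samples` (a stroboscopic section)
# shows whether the motion has collapsed onto a periodic orbit.
assert all(abs(s) < 10 for s in samples)
```

In the paper's closed-loop variant, the perturbation would instead be computed from measured output, which is what provides robustness to disturbances.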
qPortal: A platform for data-driven biomedical research.
Mohr, Christopher; Friedrich, Andreas; Wojnar, David; Kenar, Erhan; Polatkan, Aydin Can; Codrea, Marius Cosmin; Czemmel, Stefan; Kohlbacher, Oliver; Nahnsen, Sven
2018-01-01
Modern biomedical research aims at drawing biological conclusions from large, highly complex biological datasets. It has become common practice to make extensive use of high-throughput technologies that produce large amounts of heterogeneous data. In addition to the ever-improving accuracy, methods are getting faster and cheaper, resulting in a steadily increasing need for scalable data management and easily accessible means of analysis. We present qPortal, a platform providing users with an intuitive way to manage and analyze quantitative biological data. The backend leverages a variety of concepts and technologies, such as relational databases, data stores, data models and means of data transfer, as well as front-end solutions to give users access to data management and easy-to-use analysis options. Users are empowered to conduct their experiments from the experimental design to the visualization of their results through the platform. Here, we illustrate the feature-rich portal by simulating a biomedical study based on publicly available data. We demonstrate the software's strength in supporting the entire project life cycle. The software supports the project design and registration, empowers users to do all-digital project management, and finally provides means to perform analysis. We compare our approach to Galaxy, one of the most widely used scientific workflow and analysis platforms in computational biology. Application of both systems to a small case study shows the differences between a data-driven approach (qPortal) and a workflow-driven approach (Galaxy). qPortal, a one-stop-shop solution for biomedical projects, offers up-to-date analysis pipelines, quality control workflows, and visualization tools. Through intensive user interactions, appropriate data models have been developed.
These models build the foundation of our biological data management system and provide possibilities to annotate data, query metadata for statistics, and enable future re-analysis on high-performance computing systems via coupling of workflow management systems. Integration of project and data management as well as workflow resources in one place presents clear advantages over existing solutions.
A data-driven prediction method for fast-slow systems
NASA Astrophysics Data System (ADS)
Groth, Andreas; Chekroun, Mickael; Kondrashov, Dmitri; Ghil, Michael
2016-04-01
In this work, we present a prediction method for processes that exhibit a mixture of variability on slow and fast scales. The method relies on combining empirical model reduction (EMR) with singular spectrum analysis (SSA). EMR is a data-driven methodology for constructing stochastic low-dimensional models that account for nonlinearity and serial correlation in the estimated noise, while SSA provides a decomposition of the complex dynamics into low-order components that capture spatio-temporal behavior on different time scales. Our study focuses on the data-driven modeling of partial observations from dynamical systems that exhibit power spectra with broad peaks. The main result in this talk is that the combination of SSA pre-filtering with EMR modeling improves, under certain circumstances, the modeling and prediction skill of such a system, as compared to a standard EMR prediction based on raw data. Specifically, it is the separation into "fast" and "slow" temporal scales by the SSA pre-filtering that achieves the improvement. We show, in particular, that the resulting EMR-SSA emulators help predict intermittent behavior such as rapid transitions between specific regions of the system's phase space. This capability of the EMR-SSA prediction will be demonstrated on two low-dimensional models: the Rössler system and a Lotka-Volterra model for interspecies competition. In either case, the chaotic dynamics is produced through a Shilnikov-type mechanism, and we argue that the latter seems to be an important ingredient for the good prediction skills of EMR-SSA emulators. Shilnikov-type behavior has been shown to arise in various complex geophysical fluid models, such as baroclinic quasi-geostrophic flows in the mid-latitude atmosphere and wind-driven double-gyre ocean circulation models. This pervasiveness of the Shilnikov mechanism of fast-slow transition opens interesting perspectives for the extension of the proposed EMR-SSA approach to more realistic situations.
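The SSA pre-filtering step can be sketched as follows; the synthetic series, window length, and number of retained components are illustrative assumptions, not the study's configuration:

```python
import numpy as np

# Singular spectrum analysis (SSA) sketch: separate a slow oscillation from
# fast noise before fitting any reduced (EMR-style) model.
rng = np.random.default_rng(0)
n, M = 500, 60
t = np.arange(n)
slow = np.sin(2*np.pi*t/100)                 # "slow" component
series = slow + 0.3*rng.standard_normal(n)   # plus fast noise

# Embedding: trajectory matrix of lagged copies of the series.
X = np.column_stack([series[i:n-M+1+i] for i in range(M)])
C = X.T @ X / X.shape[0]                     # lag-covariance matrix
eigval, eigvec = np.linalg.eigh(C)
order = np.argsort(eigval)[::-1]
E = eigvec[:, order[:2]]                     # leading EOF pair -> slow oscillation

# Reconstruction: project onto the leading EOFs, then diagonal-average.
approx = (X @ E) @ E.T                       # rank-2 approximation of X
rec = np.zeros(n)
count = np.zeros(n)
for i in range(X.shape[0]):
    rec[i:i+M] += approx[i]
    count[i:i+M] += 1
rec /= count

# The reconstructed component tracks the slow signal far better than the
# raw noisy series does.
err_raw = np.mean((series - slow)**2)
err_rec = np.mean((rec - slow)**2)
assert err_rec < err_raw
```

In the EMR-SSA chain, a stochastic low-order model would then be fitted to `rec` (and to the residual) rather than to the raw series.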
Comparison of driven and simulated "free" stall flutter in a wind tunnel
NASA Astrophysics Data System (ADS)
Culler, Ethan; Farnsworth, John; Fagley, Casey; Seidel, Jurgen
2016-11-01
Stall flutter and dynamic stall have received a significant amount of attention over the years. To experimentally study this problem, the body undergoing stall flutter is typically driven at a characteristic, single-frequency sinusoid with a prescribed pitching amplitude and mean angle of attack offset. This approach allows for testing with repeatable kinematics; however, it effectively decouples the structural motion from the aerodynamic forcing. Recent results suggest that this driven approach could misrepresent the forcing observed in a "free" stall flutter scenario. Specifically, a dynamically pitched rigid NACA 0018 wing section was tested in the wind tunnel under two modes of operation: (1) Cyber-Physical, where "free" stall flutter was physically simulated through a custom motor-control system modeling a torsional spring, and (2) Direct Motor-Driven Dynamic Pitch at a single-frequency sinusoid representative of the cyber-physical motion. The time-resolved pitch angle and moment were directly measured and compared for each case. It was found that small deviations in the pitch angle trajectory between these two operational cases generate significantly different aerodynamic pitching moments on the wing section, with the pitching moments nearly 180° out of phase in some cases. This work is supported by the Air Force Office of Scientific Research through the Flow Interactions and Control Program and by the National Defense Science and Engineering Graduate Fellowship Program.
Time Series Modeling of Army Mission Command Communication Networks: An Event-Driven Analysis
2013-06-01
ERIC Educational Resources Information Center
Sampson, Victor; Enderle, Patrick; Grooms, Jonathon; Witte, Shelbie
2013-01-01
This study examined how students' science-specific argumentative writing skills and understanding of core ideas changed over the course of a school year as they participated in a series of science laboratories designed using the Argument-Driven Inquiry (ADI) instructional model. The ADI model is a student-centered and writing-intensive approach to…
An analytically solvable three-body break-up model problem in hyperspherical coordinates
NASA Astrophysics Data System (ADS)
Ancarani, L. U.; Gasaneo, G.; Mitnik, D. M.
2012-10-01
An analytically solvable S-wave model for three-particle break-up processes is presented. The scattering process is represented by a non-homogeneous Coulombic Schrödinger equation, where the driven term is given by a Coulomb-like interaction multiplied by the product of a continuum wave function and a bound state in the particles' coordinates. The closed-form solution is derived in hyperspherical coordinates, leading to an analytic expression for the associated scattering transition amplitude. The proposed scattering model contains most of the difficulties encountered in real three-body scattering problems, e.g., non-separability in the electrons' spherical coordinates and Coulombic asymptotic behavior. Since the coordinates' coupling is completely different, the model provides an alternative test to that given by the Temkin-Poet model. The knowledge of the analytic solution provides an interesting benchmark to test numerical methods dealing with the double continuum, in particular in the asymptotic regions. A hyperspherical Sturmian approach recently developed for three-body collisional problems is used to reproduce the analytical results to high accuracy. In addition, we generalize the model by generating an approximate wave function possessing the correct radial asymptotic behavior corresponding to an S-wave three-body Coulomb problem. The model allows us to explore the typical structure of the solution of a three-body driven equation, to identify three regions (the driven, the Coulombic and the asymptotic), and to analyze how far one has to go to extract the transition amplitude.
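A schematic form of the driven equation described above, with assumed notation (the symbols below are not taken from the paper: $W$ a Coulomb-like interaction, $\varphi_c$ a continuum wave, $\varphi_b$ a bound state):

```latex
% Schematic non-homogeneous (driven) Schroedinger equation; notation assumed.
\left[ E - \hat{H} \right] \Phi_{\mathrm{sc}}(r_1, r_2)
  \;=\; W(r_1, r_2)\, \varphi_c(r_1)\, \varphi_b(r_2)
```

The transition amplitude is then read off from the asymptotic behavior of $\Phi_{\mathrm{sc}}$, which is why the asymptotic region figures prominently in the benchmark.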
Yun Chen; Hui Yang
2014-01-01
The rapid advancements of biomedical instrumentation and healthcare technology have resulted in data-rich environments in hospitals. However, the meaningful information extracted from rich datasets is limited. There is a dire need to go beyond current medical practices and develop data-driven methods and tools that enable and support (i) the handling of big data, (ii) the extraction of data-driven knowledge, and (iii) the exploitation of acquired knowledge for optimizing clinical decisions. The present study focuses on the prediction of mortality rates in Intensive Care Units (ICU) using patient-specific healthcare recordings. It is worth mentioning that postsurgical monitoring in the ICU leads to massive datasets with unique properties, e.g., variable heterogeneity, patient heterogeneity, and time asynchronization. To cope with the challenges in ICU datasets, we developed a postsurgical decision support system with a series of analytical tools, including data categorization, data pre-processing, feature extraction, feature selection, and predictive modeling. Experimental results show that the proposed data-driven methodology outperforms traditional approaches and yields better results based on the evaluation of real-world ICU data from 4000 subjects in the database. This research shows great potential for the use of data-driven analytics to improve the quality of healthcare services.
Dynamic modeling and motion simulation for a winged hybrid-driven underwater glider
NASA Astrophysics Data System (ADS)
Wang, Shu-Xin; Sun, Xiu-Jun; Wang, Yan-Hui; Wu, Jian-Guo; Wang, Xiao-Ming
2011-03-01
PETREL, a winged hybrid-driven underwater glider, is a novel and practical marine survey platform which combines the features of a legacy underwater glider and a conventional AUV (autonomous underwater vehicle). It can be treated as a multi-rigid-body system with a floating base and a particular hydrodynamic profile. In this paper, theorems on linear and angular momentum are used to establish the dynamic equations of motion of each rigid body, and the effect of translational and rotational motion of internal masses on the attitude control is taken into consideration. In addition, due to the unique external shape with fixed wings and deflectable rudders and the dual-drive operation in thrust and glide modes, the approaches used for building dynamic models of conventional AUVs and hydrodynamic models of submarines are introduced, and the tailored dynamic equations of the hybrid glider are formulated. Moreover, the behaviors of motion in glide and thrust operation are analyzed based on the simulation, and the feasibility of the dynamic model is validated by data from lake field trials.
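The momentum theorems invoked above take the following schematic form per rigid body (notation assumed, not the paper's: $\mathbf{P}$ linear momentum, $\mathbf{H}_G$ angular momentum about the center of mass):

```latex
% Linear and angular momentum theorems for each rigid body; schematic only.
\frac{d\mathbf{P}}{dt} = \sum \mathbf{F}_{\mathrm{ext}},
\qquad
\frac{d\mathbf{H}_G}{dt} = \sum \mathbf{M}_{G}
```

Summing these over the hull, wings, rudders, and movable internal masses, with the hydrodynamic forces on the right-hand side, yields the coupled equations of the multi-rigid-body system.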
Time scale of random sequential adsorption.
Erban, Radek; Chapman, S Jonathan
2007-04-01
A simple multiscale approach to the diffusion-driven adsorption from a solution to a solid surface is presented. The model combines two important features of the adsorption process: (i) The kinetics of the chemical reaction between adsorbing molecules and the surface and (ii) geometrical constraints on the surface made by molecules which are already adsorbed. The process (i) is modeled in a diffusion-driven context, i.e., the conditional probability of adsorbing a molecule provided that the molecule hits the surface is related to the macroscopic surface reaction rate. The geometrical constraint (ii) is modeled using random sequential adsorption (RSA), which is the sequential addition of molecules at random positions on a surface; one attempt to attach a molecule is made per one RSA simulation time step. By coupling RSA with the diffusion of molecules in the solution above the surface the RSA simulation time step is related to the real physical time. The method is illustrated on a model of chemisorption of reactive polymers to a virus surface.
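A minimal sketch of RSA with a reaction probability, assuming a 1-D lattice with nearest-neighbour exclusion (the paper's setting is a continuum surface and links the time step to diffusion; the parameters here are illustrative):

```python
import random

# Random sequential adsorption (RSA) sketch: one adsorption attempt per
# time step at a uniformly random site. An attempt succeeds only if the
# site and its neighbours are empty (geometric constraint), and then only
# with probability p_react, standing in for the surface reaction kinetics.
# The mapping of simulation steps to physical time is not modeled here.
def rsa(n_sites=1000, n_steps=20000, p_react=0.5, seed=1):
    occupied = [False] * n_sites
    coverage = []
    rng = random.Random(seed)
    for _ in range(n_steps):
        i = rng.randrange(n_sites)
        free = (not occupied[i]
                and not (i > 0 and occupied[i-1])
                and not (i < n_sites - 1 and occupied[i+1]))
        if free and rng.random() < p_react:
            occupied[i] = True
        coverage.append(sum(occupied) / n_sites)
    return coverage

cov = rsa()
# Coverage grows monotonically and saturates below full occupancy because
# of the exclusion rule (the known jamming limit for this 1-D lattice rule
# is (1 - e^-2)/2, about 0.43).
assert cov[-1] <= 0.5
```

In the paper's multiscale version, `p_react` would be derived from the macroscopic surface reaction rate via the diffusion model, which is what ties a simulation step to real time.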
The Impact of Heterogeneity and Awareness in Modeling Epidemic Spreading on Multiplex Networks
Scatà, Marialisa; Di Stefano, Alessandro; Liò, Pietro; La Corte, Aurelio
2016-01-01
In the real world, dynamic processes involving human beings are not disjoint. To capture the real complexity of such dynamics, we propose a novel model of the coevolution of epidemic and awareness spreading processes on a multiplex network, also introducing a preventive isolation strategy. Our aim is to evaluate and quantify the joint impact of heterogeneity and awareness under different socioeconomic conditions. Considering, as a case study, an emerging public health threat, Zika virus, we introduce a data-driven analysis by exploiting multiple sources and different types of data, ranging from Big Five personality traits to Google Trends, related to different world countries where there is an ongoing epidemic outbreak. Our findings demonstrate how the proposed model allows delaying the epidemic outbreak and increasing the resilience of nodes, especially under critical economic conditions. Simulation results, using a data-driven approach on Zika virus, which has attracted growing scientific research interest, are consistent with the proposed analytic model. PMID:27848978
Hanan, Erin J; Tague, Christina; Choate, Janet; Liu, Mingliang; Kolden, Crystal; Adam, Jennifer
2018-03-24
Disturbances such as wildfire, insect outbreaks, and forest clearing play an important role in regulating carbon, nitrogen, and hydrologic fluxes in terrestrial watersheds. Evaluating how watersheds respond to disturbance requires understanding mechanisms that interact over multiple spatial and temporal scales. Simulation modeling is a powerful tool for bridging these scales; however, model projections are limited by uncertainties in the initial state of plant carbon and nitrogen stores. Watershed models typically use one of two methods to initialize these stores: spin-up to steady state or remote sensing with allometric relationships. Spin-up involves running a model until vegetation reaches equilibrium based on climate. This approach assumes that vegetation across the watershed has reached maturity and is of uniform age, which fails to account for landscape heterogeneity and non-steady-state conditions. By contrast, remote sensing can provide data for initializing such conditions. However, methods for assimilating remote sensing into model simulations can also be problematic. They often rely on empirical allometric relationships between a single vegetation variable and modeled carbon and nitrogen stores. Because allometric relationships are species- and region-specific, they do not account for the effects of local resource limitation, which can influence carbon allocation (to leaves, stems, roots, etc.). To address this problem, we developed a new initialization approach using the catchment-scale ecohydrologic model RHESSys. The new approach merges the mechanistic stability of spin-up with the spatial fidelity of remote sensing. It uses remote sensing to define spatially explicit targets for one or several vegetation state variables, such as leaf area index, across a watershed. The model then simulates the growth of carbon and nitrogen stores until the defined targets are met for all locations.
We evaluated this approach in a mixed pine-dominated watershed in central Idaho, and a chaparral-dominated watershed in southern California. In the pine-dominated watershed, model estimates of carbon, nitrogen, and water fluxes varied among methods, while the target-driven method increased correspondence between observed and modeled streamflow. In the chaparral watershed, where vegetation was more homogeneously aged, there were no major differences among methods. Thus, in heterogeneous, disturbance-prone watersheds, the target-driven approach shows potential for improving biogeochemical projections. © 2018 by the Ecological Society of America.
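The target-driven idea can be sketched as a simple growth-to-target loop; the growth model, allometry, and targets below are placeholders, not RHESSys internals:

```python
# Target-driven initialization sketch: grow each grid cell's carbon store
# until a remote-sensing target (e.g., leaf area index) is met, instead of
# spinning every cell up to the same climate equilibrium. The growth model
# and the LAI-carbon relationship here are illustrative placeholders.
def target_driven_spinup(targets, growth_rate=0.05, c0=0.1, max_years=2000):
    """Return per-cell carbon stores grown until their LAI targets are reached."""
    lai_per_c = 1.0          # placeholder allometry: LAI proportional to carbon
    stores = []
    for tgt in targets:
        c, years = c0, 0
        while c * lai_per_c < tgt and years < max_years:
            c += growth_rate * c          # simple exponential growth step
            years += 1
        stores.append(c)
    return stores

# Heterogeneous targets (young vs. mature patches) yield heterogeneous
# initial stores, unlike uniform steady-state spin-up.
stores = target_driven_spinup([0.5, 2.0, 5.0])
assert stores[0] < stores[1] < stores[2]
```

In the actual model, the "growth step" is the full coupled carbon-nitrogen simulation, so the resulting stores respect local resource limitation rather than a fixed allometry.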
de Lusignan, Simon; Cashman, Josephine; Poh, Norman; Michalakidis, Georgios; Mason, Aaron; Desombre, Terry; Krause, Paul
2012-01-01
Medical research increasingly requires the linkage of data from different sources. Conducting a requirements analysis for a new application is an established part of software engineering, but is rarely reported in the biomedical literature; and no generic approaches have been published on how to link heterogeneous health data. We conducted a literature review, followed by a consensus process, to define how requirements for research using multiple data sources might be modeled. We have developed a requirements analysis approach, i-ScheDULEs. The first components of the modeling process are indexing and creating a rich picture of the research study. Secondly, we developed a series of reference models of progressive complexity: data flow diagrams (DFD) to define data requirements; unified modeling language (UML) use case diagrams to capture study-specific and governance requirements; and finally, business process models, using business process modeling notation (BPMN). These requirements and their associated models should become part of research study protocols.
Modeling interdependencies between business and communication processes in hospitals.
Brigl, Birgit; Wendt, Thomas; Winter, Alfred
2003-01-01
The optimization and redesign of business processes in hospitals is an important challenge for hospital information management, which has to design and implement a suitable HIS architecture. Nevertheless, there are no tools available that specialize in modeling information-driven business processes and their consequences for the communication between information processing tools. Therefore, we present an approach which facilitates the representation and analysis of business processes and the resulting communication processes between application components, together with their interdependencies. This approach aims not only to visualize those processes, but also to evaluate whether there are weaknesses in the information processing infrastructure that hinder the smooth implementation of the business processes.
NASA Astrophysics Data System (ADS)
El Houda Thabet, Rihab; Combastel, Christophe; Raïssi, Tarek; Zolghadri, Ali
2015-09-01
The paper develops a set membership detection methodology which is applied to the detection of abnormal positions of aircraft control surfaces. Robust and early detection of such abnormal positions is an important issue for early system reconfiguration and overall optimisation of aircraft design. In order to improve fault sensitivity while ensuring a high level of robustness, the method combines a data-driven characterisation of noise and a model-driven approach based on interval prediction. The efficiency of the proposed methodology is illustrated through simulation results obtained based on data recorded in several flight scenarios of a highly representative aircraft benchmark.
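The combination of a data-driven noise bound with interval-based detection can be sketched as follows; the residuals and the threshold rule are illustrative, not the paper's interval predictor:

```python
# Set-membership detection sketch: bound the noise from fault-free training
# data, predict an interval around the model output, and flag a fault when a
# measurement falls outside that interval. Data and model are illustrative.
def detect(measurements, predictions, noise_bound):
    """Return a fault flag per sample: True if outside the predicted interval."""
    return [abs(y - yhat) > noise_bound
            for y, yhat in zip(measurements, predictions)]

# Noise bound estimated as the largest residual over a fault-free training set
# (a simple data-driven characterisation of the noise).
train_residuals = [0.1, -0.2, 0.15, -0.05]
bound = max(abs(r) for r in train_residuals)

flags = detect(measurements=[1.0, 1.1, 2.0],
               predictions=[1.0, 1.0, 1.0],
               noise_bound=bound)
assert flags == [False, False, True]
```

The trade-off named in the abstract is visible here: a tighter data-driven bound increases fault sensitivity, while the interval construction keeps the detector robust to admissible noise.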
NASA Astrophysics Data System (ADS)
Pathiraja, S. D.; Moradkhani, H.; Marshall, L. A.; Sharma, A.; Geenens, G.
2016-12-01
Effective combination of model simulations and observations through Data Assimilation (DA) depends heavily on uncertainty characterisation. Many traditional methods for quantifying model uncertainty in DA require some level of subjectivity (by way of tuning parameters or by assuming Gaussian statistics). Furthermore, the focus is typically on only estimating the first and second moments. We propose a data-driven methodology to estimate the full distributional form of model uncertainty, i.e. the transition density p(xt|xt-1). All sources of uncertainty associated with the model simulations are considered collectively, without needing to devise stochastic perturbations for individual components (such as model input, parameter and structural uncertainty). A training period is used to derive the distribution of errors in observed variables conditioned on hidden states. Errors in hidden states are estimated from the conditional distribution of observed variables using non-linear optimization. The theory behind the framework and case study applications are discussed in detail. Results demonstrate improved predictions and more realistic uncertainty bounds compared to a standard perturbation approach.
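A minimal sketch of characterising model uncertainty non-parametrically from a training period, assuming a toy scalar system (the histogram density below stands in for the full estimation of the transition density):

```python
import numpy as np

# During a training period, collect one-step errors of an imperfect model
# and use their empirical distribution instead of an assumed Gaussian.
# The toy system and the imperfect model are illustrative.
rng = np.random.default_rng(2)
truth = np.cumsum(rng.standard_normal(2000))   # hidden "true" state trajectory

def model(x):
    return 0.95 * x                            # deliberately imperfect forecast

errors = truth[1:] - model(truth[:-1])         # training-period errors
hist, edges = np.histogram(errors, bins=50, density=True)

def sample_error(n):
    """Draw from the empirical (non-parametric) error distribution."""
    idx = rng.choice(len(hist), size=n, p=hist * np.diff(edges))
    return rng.uniform(edges[idx], edges[idx + 1])

draws = sample_error(5000)
# The sampled errors reproduce the training-error spread without any
# Gaussian assumption.
assert abs(np.std(draws) - np.std(errors)) < 0.5 * np.std(errors)
```

In a DA setting these draws would perturb the forecast ensemble, replacing hand-tuned Gaussian perturbations of individual model components.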
Collaborated Architecture Framework for Composition of UML 2.0 in the Zachman Framework
NASA Astrophysics Data System (ADS)
Hermawan; Hastarista, Fika
2016-01-01
The Zachman Framework (ZF) is the enterprise architecture framework most widely adopted in Enterprise Information System (EIS) development. In this study, a Collaborated Architecture Framework (CAF) has been developed to combine ZF with Unified Modeling Language (UML) 2.0 modeling. The CAF provides a composition of the ZF matrix in which each cell consists of Model Driven Architecture (MDA) artifacts drawn from various UML models and Software Requirement Specification (SRS) documents. This modeling is applied to the development of Enterprise Resource Planning (ERP) systems. Because ERP covers a large number of complexly related application modules, an Agile Model Driven Design (AMDD) approach is used as an advanced method to transform the MDA into application-module components efficiently and accurately. Finally, the use of the CAF achieved good fulfilment of the needs of all stakeholders involved across the stages of the Rational Unified Process (RUP), as well as high satisfaction with the functional features of the ERP software at PT. Iglas (Persero) Gresik.
SEAPODYM-LTL: a parsimonious zooplankton dynamic biomass model
NASA Astrophysics Data System (ADS)
Conchon, Anna; Lehodey, Patrick; Gehlen, Marion; Titaud, Olivier; Senina, Inna; Séférian, Roland
2017-04-01
Mesozooplankton organisms are of critical importance for the understanding of early life history of most fish stocks, as well as the nutrient cycles in the ocean. Ongoing climate change and the need for improved approaches to the management of living marine resources has driven recent advances in zooplankton modelling. The classical modeling approach tends to describe the whole biogeochemical and plankton cycle with increasing complexity. We propose here a different and parsimonious zooplankton dynamic biomass model (SEAPODYM-LTL) that is cost efficient and can be advantageously coupled with primary production estimated either from satellite derived ocean color data or biogeochemical models. In addition, the adjoint code of the model is developed allowing a robust optimization approach for estimating the few parameters of the model. In this study, we run the first optimization experiments using a global database of climatological zooplankton biomass data and we make a comparative analysis to assess the importance of resolution and primary production inputs on model fit to observations. We also compare SEAPODYM-LTL outputs to those produced by a more complex biogeochemical model (PISCES) but sharing the same physical forcings.
On the Conditioning of Machine-Learning-Assisted Turbulence Modeling
NASA Astrophysics Data System (ADS)
Wu, Jinlong; Sun, Rui; Wang, Qiqi; Xiao, Heng
2017-11-01
Recently, several researchers have demonstrated that machine learning techniques can be used to improve the RANS-modeled Reynolds stress by training on available databases of high-fidelity simulations. However, obtaining an improved mean velocity field remains an unsolved challenge, restricting the predictive capability of current machine-learning-assisted turbulence modeling approaches. In this work we define a condition number to evaluate the model conditioning of data-driven turbulence modeling approaches, and propose a stability-oriented machine learning framework to model the Reynolds stress. Two canonical flows, the flow in a square duct and the flow over periodic hills, are investigated to demonstrate the predictive capability of the proposed framework. The satisfactory prediction of the mean velocity field for both flows demonstrates the predictive capability of the proposed framework for machine-learning-assisted turbulence modeling. By improving the prediction of the mean flow field, the proposed stability-oriented machine learning framework bridges the gap between existing machine-learning-assisted turbulence modeling approaches and the demand for predictive capability of turbulence models in real applications.
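The role of a condition number can be sketched with a generic linear model A u = b(tau), where a small error in the modeled forcing tau may be amplified in the solved-for mean field u; the matrices below are illustrative, not an actual RANS discretisation:

```python
import numpy as np

# Conditioning sketch: for A u = b, the condition number of A bounds the
# relative amplification ||du||/||u|| <= cond(A) * ||db||/||b|| of a small
# perturbation db of the right-hand side. Matrices are illustrative.
rng = np.random.default_rng(3)

def amplification(A, n_trials=200, eps=1e-6):
    """Largest observed relative amplification over random perturbations."""
    b = rng.standard_normal(A.shape[0])
    u = np.linalg.solve(A, b)
    worst = 0.0
    for _ in range(n_trials):
        db = eps * rng.standard_normal(A.shape[0])
        du = np.linalg.solve(A, b + db) - u
        rel_out = np.linalg.norm(du) / np.linalg.norm(u)
        rel_in = np.linalg.norm(db) / np.linalg.norm(b)
        worst = max(worst, rel_out / rel_in)
    return worst

well = np.eye(5)
ill = np.diag([1.0, 1.0, 1.0, 1.0, 1e-4])   # one nearly singular direction

# Observed amplification never exceeds the condition-number bound, and the
# ill-conditioned operator admits far larger amplification.
assert amplification(well) <= np.linalg.cond(well) * (1 + 1e-9)
assert amplification(ill) <= np.linalg.cond(ill) * (1 + 1e-9)
assert np.linalg.cond(ill) > np.linalg.cond(well)
```

The analogy to the abstract: even a well-trained Reynolds-stress model can yield a poor mean velocity field if the stress-to-velocity propagation is ill-conditioned, which motivates a stability-oriented learning target.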
NASA Astrophysics Data System (ADS)
Butler, Samuel D.; Marciniak, Michael A.
2014-09-01
Since the development of the Torrance-Sparrow bidirectional reflectance distribution function (BRDF) model in 1967, several BRDF models have been created. Previous attempts to categorize BRDF models have relied upon somewhat vague descriptors, such as empirical, semi-empirical, and experimental. Our approach is to instead categorize BRDF models based on functional form: microfacet normal distribution, geometric attenuation, directional-volumetric and Fresnel terms, and cross section conversion factor. Several popular microfacet models are compared to a standardized notation for a microfacet BRDF model. A library of microfacet model components is developed, allowing for the creation of unique microfacet models driven by experimentally measured BRDFs.
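The component-library idea can be sketched by assembling a microfacet BRDF of the standard form f = D·G·F / (4 cosθi cosθo) from interchangeable terms; the particular component choices below (Beckmann D, Schlick F, V-cavity G) are illustrative, not the paper's standardized notation:

```python
import math

# Microfacet BRDF assembled from interchangeable components:
# f = D(h) * G * F / (4 cos_i cos_o). Angles are passed as cosines:
# cos_i, cos_o (incident/outgoing vs. surface normal), cos_h (half vector
# vs. normal), cos_d (incident vs. half vector).
def beckmann_D(cos_h, m=0.3):
    """Beckmann microfacet normal distribution with roughness m."""
    c2 = cos_h * cos_h
    tan2 = (1.0 - c2) / c2
    return math.exp(-tan2 / (m * m)) / (math.pi * m * m * c2 * c2)

def schlick_F(cos_d, f0=0.04):
    """Schlick approximation to the Fresnel reflectance."""
    return f0 + (1.0 - f0) * (1.0 - cos_d) ** 5

def vcavity_G(cos_i, cos_o, cos_h, cos_d):
    """Torrance-Sparrow V-cavity geometric attenuation."""
    return min(1.0, 2 * cos_h * cos_i / cos_d, 2 * cos_h * cos_o / cos_d)

def microfacet_brdf(cos_i, cos_o, cos_h, cos_d):
    return (beckmann_D(cos_h) * vcavity_G(cos_i, cos_o, cos_h, cos_d)
            * schlick_F(cos_d)) / (4.0 * cos_i * cos_o)

# The specular peak (near the mirror direction) dominates an off-specular
# geometry, as expected for a rough specular surface.
peak = microfacet_brdf(cos_i=0.9, cos_o=0.9, cos_h=1.0, cos_d=0.9)
tail = microfacet_brdf(cos_i=0.9, cos_o=0.3, cos_h=0.8, cos_d=0.6)
assert peak > tail > 0.0
```

Swapping any one of `beckmann_D`, `schlick_F`, or `vcavity_G` for another component with the same signature yields a new model, which is exactly the mix-and-match categorization the abstract proposes.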
A Null Space Control of Two Wheels Driven Mobile Manipulator Using Passivity Theory
NASA Astrophysics Data System (ADS)
Shibata, Tsuyoshi; Murakami, Toshiyuki
This paper describes a control strategy for the null space motion of a two-wheel-driven mobile manipulator. Recently, robots have been utilized in various industrial fields, and it is preferable for a robot manipulator to have multiple degrees of freedom of motion. Several studies of kinematics for null space motion have been proposed; however, the stability analysis of null space motion has not been sufficiently addressed. Furthermore, these approaches apply to stable systems, but not to unstable ones. In this research, the base of the manipulator is a two-wheel-driven mobile robot. This robot is called a two-wheel-driven mobile manipulator, which is an unstable system. In the proposed approach, the control design of the null space uses passivity-based stabilization: the controller is designed so that the closed-loop system of the robot dynamics satisfies passivity. The control strategy is to stabilize the robot system through a work-space-observer-based approach and null space control while keeping the end-effector position. The validity of the proposed approach is verified by simulations and experiments with the two-wheel-driven mobile manipulator.
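Null-space motion itself (independent of the passivity-based stabilization) can be sketched with a pseudo-inverse projection; the Jacobian and velocities below are illustrative:

```python
import numpy as np

# Null-space motion sketch: joint velocities that realize a desired
# task-space velocity while an extra (secondary) motion is projected into
# the Jacobian's null space, leaving the end-effector unaffected.
J = np.array([[1.0, 0.5, 0.2],
              [0.0, 1.0, 0.4]])        # 2-D task, 3 joints -> 1-D null space
x_dot = np.array([0.3, -0.1])          # desired end-effector velocity
z = np.array([1.0, 1.0, 1.0])          # arbitrary secondary joint motion

J_pinv = np.linalg.pinv(J)
N = np.eye(3) - J_pinv @ J             # null-space projector
q_dot = J_pinv @ x_dot + N @ z         # task motion + null-space motion

# The null-space term does not disturb the end-effector velocity.
assert np.allclose(J @ q_dot, x_dot)
assert np.allclose(J @ (N @ z), 0.0)
```

In the paper's setting, the secondary motion `z` is what the passivity-based controller shapes to stabilize the unstable wheeled base while the end-effector position is held.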
New approaches in clinical application of laser-driven ionizing radiation
NASA Astrophysics Data System (ADS)
Hideghéty, Katalin; Szabó, Rita Emilia; Polanek, Róbert; Szabó, Zoltán.; Brunner, Szilvia; Tőkés, Tünde
2017-05-01
The planned laser-driven ionizing beams (photon, very high energy electron, proton, carbon ion) at laser facilities have the unique properties of ultra-high dose rate (>10 Gy/s), short pulses and, at ELI-ALPS, high repetition rate, and carry the potential to develop novel laser-driven methods towards compact, hospital-based clinical application. The enhanced flexibility in particle and energy selection, the high spatial and time resolution, and the extreme dose rate could be highly beneficial in radiotherapy. These approaches may significantly increase the therapeutic index over the currently available advanced radiation oncology methods. We highlight two nuclear reaction-based binary modalities and the planned radiobiology research. Boron Neutron Capture Therapy is an advanced cell-targeted modality requiring a 10B-enriched boron carrier and an appropriate neutron beam. The development of a laser-based thermal and epithermal neutron source with a fluence rate as high as 10^10 could enhance research activity in this promising field. The Boron-Proton Fusion reaction (BPFR) is likewise a binary approach, in which 11B-containing compounds are accumulated in the cells and the tumour is selectively irradiated with protons. The additional high linear energy transfer alpha particles released by the BPFR increase the dose maximum at the Bragg peak, resulting in significant biological effect enhancement. Research at ELI-ALPS on the detection of differences in biological effect between modified or different-quality radiation will be presented, using recently developed zebrafish embryo and rodent models.
Location-Driven Image Retrieval for Images Collected by a Mobile Robot
NASA Astrophysics Data System (ADS)
Tanaka, Kanji; Hirayama, Mitsuru; Okada, Nobuhiro; Kondo, Eiji
Mobile robot teleoperation is a method for a human user to interact with a mobile robot over time and distance. Successful teleoperation depends on how well images taken by the mobile robot are visualized to the user. To enhance the efficiency and flexibility of the visualization, an image retrieval system over such a robot's image database would be very useful. The main difference between the robot's image database and standard image databases is that multiple relevant images exist, due to the variety of viewing conditions. The main contribution of this paper is to propose an efficient retrieval approach, named the location-driven approach, which utilizes the correlation between visual features and the real-world locations of images. Combining the location-driven approach with the conventional feature-driven approach, our goal can be viewed as finding an optimal classifier between relevant and irrelevant feature-location pairs. An active learning technique based on support vector machines is extended for this aim.
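To make the idea concrete, here is a minimal sketch of location-driven ranking. The fixed weighting parameter `alpha` and the plain distance-based score are illustrative stand-ins: the paper actually learns the feature-location classifier with an SVM-based active learning technique.

```python
import numpy as np

def location_driven_rank(feat_db, loc_db, feat_q, loc_q, alpha=0.5):
    """Rank database images by a combined visual-feature / location score.
    A fixed alpha is a simplification of the paper's learned combination."""
    d_feat = np.linalg.norm(feat_db - feat_q, axis=1)   # feature distance
    d_loc = np.linalg.norm(loc_db - loc_q, axis=1)      # real-world distance
    score = (alpha * d_feat / (d_feat.max() + 1e-12)
             + (1 - alpha) * d_loc / (d_loc.max() + 1e-12))
    return np.argsort(score)  # most relevant images first

rng = np.random.default_rng(4)
feat_db = rng.standard_normal((100, 16))  # visual descriptors of 100 images
loc_db = rng.uniform(0, 10, (100, 2))     # (x, y) capture locations
order = location_driven_rank(feat_db, loc_db, feat_db[7], loc_db[7])
# querying with image 7 itself ranks image 7 first
```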
Gaussian Processes for Data-Efficient Learning in Robotics and Control.
Deisenroth, Marc Peter; Fox, Dieter; Rasmussen, Carl Edward
2015-02-01
Autonomous learning has been a promising direction in control and robotics for more than a decade, since data-driven learning makes it possible to reduce the amount of engineering knowledge that is otherwise required. However, autonomous reinforcement learning (RL) approaches typically require many interactions with the system to learn controllers, which is a practical limitation in real systems such as robots, where many interactions can be impractical and time-consuming. To address this problem, current learning approaches typically require task-specific knowledge in the form of expert demonstrations, realistic simulators, pre-shaped policies, or specific knowledge about the underlying dynamics. In this paper, we follow a different approach and speed up learning by extracting more information from data. In particular, we learn a probabilistic, non-parametric Gaussian process transition model of the system. By explicitly incorporating model uncertainty into long-term planning and controller learning, our approach reduces the effects of model errors, a key problem in model-based learning. Compared to state-of-the-art RL, our model-based policy search method achieves an unprecedented speed of learning. We demonstrate its applicability to autonomous learning in real robot and control tasks.
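The core ingredient, a non-parametric GP transition model, can be sketched as follows. This is a toy illustration, not the paper's implementation: the squared-exponential kernel hyperparameters, the synthetic 1-D dynamics and the noise level are all assumptions of the sketch, and the actual method additionally learns hyperparameters and propagates the predictive uncertainty through multi-step planning.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel between two sets of row vectors."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def gp_posterior(X_train, y_train, X_query, noise=1e-2):
    """Posterior mean and variance of a zero-mean GP at the query points."""
    K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    Ks = rbf_kernel(X_query, X_train)
    Kss = rbf_kernel(X_query, X_query)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = Ks @ alpha
    v = np.linalg.solve(L, Ks.T)
    var = np.diag(Kss) - (v ** 2).sum(0)   # predictive model uncertainty
    return mean, var

# Toy 1-D dynamics: next state = f(state, action), observed with noise.
rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(30, 2))                 # (state, action) pairs
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] + 0.05 * rng.standard_normal(30)
Xq = np.array([[0.0, 0.0], [1.0, -1.0]])
mu, var = gp_posterior(X, y, Xq)
```

The variance output is what distinguishes this from a plain regressor: it quantifies how much the learned transition model should be trusted at each query, which the policy search exploits.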
Rossa, Carlos; Lehmann, Thomas; Sloboda, Ronald; Usmani, Nawaid; Tavakoli, Mahdi
2017-08-01
Global modelling has traditionally been the approach taken to estimate needle deflection in soft tissue. In this paper, we propose a new method based on local data-driven modelling of needle deflection. External measurements of needle-tissue interactions are collected from several insertions in ex vivo tissue to form a cloud of data. Inputs to the system are the needle insertion depth, axial rotations, and the forces and torques measured at the needle base by a force sensor. When a new insertion is performed, the just-in-time learning method estimates the model outputs given the current inputs to the needle-tissue system and the historical database. The query is compared to every observation in the database and given weights according to similarity criteria. Only the subset of historical data that is most relevant to the query is selected, and a local linear model is fit to the selected points to estimate the query output. The model outputs the 3D deflection of the needle tip and the needle insertion force. The proposed approach is validated in ex vivo multilayered biological tissue in different needle insertion scenarios. Experimental results in five different case studies indicate an accuracy in predicting needle deflection of 0.81 and 1.24 mm in the horizontal and vertical planes, respectively, and an accuracy of 0.5 N in predicting the needle insertion force over 216 needle insertions.
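A minimal sketch of the just-in-time learning step: select the historical observations most similar to the query, weight them, and fit a local linear model. The Euclidean nearest-neighbour selection, Gaussian similarity kernel and synthetic linear data are assumptions of this sketch; the paper's actual similarity criteria, inputs and outputs are richer.

```python
import numpy as np

def jit_predict(X_db, Y_db, x_query, k=20, bandwidth=1.0):
    """Just-in-time learning: fit a weighted local linear model to the k
    database points most similar to the query and predict its output."""
    d = np.linalg.norm(X_db - x_query, axis=1)
    idx = np.argsort(d)[:k]                        # most relevant subset
    w = np.exp(-0.5 * (d[idx] / bandwidth) ** 2)   # similarity weights
    Xl = np.hstack([X_db[idx], np.ones((k, 1))])   # affine local model
    sw = np.sqrt(w)[:, None]
    beta, *_ = np.linalg.lstsq(sw * Xl, sw[:, 0] * Y_db[idx], rcond=None)
    return np.append(x_query, 1.0) @ beta

rng = np.random.default_rng(1)
X = rng.uniform(0, 1, size=(200, 4))   # e.g. depth, rotation, force, torque
Y = X @ np.array([2.0, -1.0, 0.5, 0.0]) + 0.3   # synthetic "deflection"
x = np.array([0.5, 0.5, 0.5, 0.5])
y_hat = jit_predict(X, Y, x)           # local fit recovers the linear map
```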
NASA Astrophysics Data System (ADS)
González, D. L., II; Angus, M. P.; Tetteh, I. K.; Bello, G. A.; Padmanabhan, K.; Pendse, S. V.; Srinivas, S.; Yu, J.; Semazzi, F.; Kumar, V.; Samatova, N. F.
2015-01-01
Decades of hypothesis-driven and/or first-principles research have been applied towards the discovery and explanation of the mechanisms that drive climate phenomena, such as western African Sahel summer rainfall variability. Although connections between various climate factors have been theorized, not all of the key relationships are fully understood. We propose a data-driven approach to identify candidate players in this climate system, which can help explain underlying mechanisms and/or even suggest new relationships, to facilitate building a more comprehensive and predictive model of the modulatory relationships influencing a climate phenomenon of interest. We applied coupled heterogeneous association rule mining (CHARM), Lasso multivariate regression, and dynamic Bayesian networks to find relationships within a complex system, and explored means with which to obtain a consensus result from the application of such varied methodologies. Using this fusion of approaches, we identified relationships among climate factors that modulate Sahel rainfall. These relationships fall into two categories: well-known associations from prior climate knowledge, such as the relationship with the El Niño-Southern Oscillation (ENSO) and putative links, such as North Atlantic Oscillation, that invite further research.
Gonzalez, II, D. L.; Angus, M. P.; Tetteh, I. K.; ...
2015-01-13
Decades of hypothesis-driven and/or first-principles research have been applied towards the discovery and explanation of the mechanisms that drive climate phenomena, such as western African Sahel summer rainfall variability. Although connections between various climate factors have been theorized, not all of the key relationships are fully understood. We propose a data-driven approach to identify candidate players in this climate system, which can help explain underlying mechanisms and/or even suggest new relationships, to facilitate building a more comprehensive and predictive model of the modulatory relationships influencing a climate phenomenon of interest. We applied coupled heterogeneous association rule mining (CHARM), Lasso multivariate regression, and dynamic Bayesian networks to find relationships within a complex system, and explored means with which to obtain a consensus result from the application of such varied methodologies. Using this fusion of approaches, we identified relationships among climate factors that modulate Sahel rainfall. As a result, these relationships fall into two categories: well-known associations from prior climate knowledge, such as the relationship with the El Niño–Southern Oscillation (ENSO) and putative links, such as North Atlantic Oscillation, that invite further research.
Electric-field-driven electron-transfer in mixed-valence molecules
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blair, Enrique P., E-mail: enrique-blair@baylor.edu; Corcelli, Steven A., E-mail: scorcell@nd.edu; Lent, Craig S., E-mail: lent@nd.edu
2016-07-07
Molecular quantum-dot cellular automata is a computing paradigm in which digital information is encoded by the charge configuration of a mixed-valence molecule. General-purpose computing can be achieved by arranging these compounds on a substrate and exploiting intermolecular Coulombic coupling. The operation of such a device relies on nonequilibrium electron transfer (ET), whereby the time-varying electric field of one molecule induces an ET event in a neighboring molecule. The magnitude of the electric fields can be quite large because of close spatial proximity, and the induced ET rate is a measure of the nonequilibrium response of the molecule. We calculate the electric-field-driven ET rate for a model mixed-valence compound. The mixed-valence molecule is regarded as a two-state electronic system coupled to a molecular vibrational mode, which is, in turn, coupled to a thermal environment. Both the electronic and vibrational degrees-of-freedom are treated quantum mechanically, and the dissipative vibrational-bath interaction is modeled with the Lindblad equation. This approach captures both tunneling and nonadiabatic dynamics. Relationships between microscopic molecular properties and the driven ET rate are explored for two time-dependent applied fields: an abruptly switched field and a linearly ramped field. In both cases, the driven ET rate is only weakly temperature dependent. When the model is applied using parameters appropriate to a specific mixed-valence molecule, diferrocenylacetylene, terahertz-range ET transfer rates are predicted.
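As a rough illustration of the modeling machinery, the sketch below integrates a Lindblad equation for a bare two-state system, with a lowering-operator dissipator standing in for the vibrational bath. The Hamiltonian, rates and operators are assumptions of the sketch; the paper's model additionally couples the electronic states to a quantized vibrational mode and drives the system with time-dependent fields.

```python
import numpy as np

# Two-state (donor/acceptor) system: H = -(t) sigma_x + (Delta/2) sigma_z,
# with detuning Delta mimicking a neighbor's static field. Dissipation via
# a lowering operator is a coarse stand-in for the vibrational bath.
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
sm = np.array([[0, 0], [1, 0]], dtype=complex)   # lowering operator

def lindblad_rhs(rho, H, L, gamma):
    """d(rho)/dt = -i[H, rho] + gamma (L rho L+ - {L+L, rho}/2)."""
    comm = -1j * (H @ rho - rho @ H)
    diss = gamma * (L @ rho @ L.conj().T
                    - 0.5 * (L.conj().T @ L @ rho + rho @ L.conj().T @ L))
    return comm + diss

def evolve(rho, H, L, gamma, dt, steps):
    """Fourth-order Runge-Kutta integration of the Lindblad equation."""
    for _ in range(steps):
        k1 = lindblad_rhs(rho, H, L, gamma)
        k2 = lindblad_rhs(rho + 0.5 * dt * k1, H, L, gamma)
        k3 = lindblad_rhs(rho + 0.5 * dt * k2, H, L, gamma)
        k4 = lindblad_rhs(rho + dt * k3, H, L, gamma)
        rho = rho + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
    return rho

H = -0.5 * sx + 1.0 * sz                          # tunneling + detuning
rho0 = np.array([[1, 0], [0, 0]], dtype=complex)  # electron on donor site
rho = evolve(rho0, H, sm, gamma=0.3, dt=0.01, steps=2000)
# dissipation transfers the electron to the acceptor site, trace preserved
```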
Hayes, Daniel J.; Turner, David P.; Stinson, Graham; McGuire, A. David; Wei, Yaxing; West, Tristram O.; Heath, Linda S.; de Jong, Bernardus; McConkey, Brian G.; Birdsey, Richard A.; Kurz, Werner A.; Jacobson, Andrew R.; Huntzinger, Deborah N.; Pan, Yude; Post, W. Mac; Cook, Robert B.
2012-01-01
We develop an approach for estimating net ecosystem exchange (NEE) using inventory-based information over North America (NA) for a recent 7-year period (ca. 2000–2006). The approach notably retains information on the spatial distribution of NEE, or the vertical exchange between land and atmosphere of all non-fossil fuel sources and sinks of CO2, while accounting for lateral transfers of forest and crop products as well as their eventual emissions. The total NEE estimate of a -327 ± 252 TgC yr⁻¹ sink for NA was driven primarily by CO2 uptake in the Forest Lands sector (-248 TgC yr⁻¹), largely in the Northwest and Southeast regions of the US, and in the Crop Lands sector (-297 TgC yr⁻¹), predominantly in the Midwest US states. These sinks are counteracted by the carbon source estimated for the Other Lands sector (+218 TgC yr⁻¹), where much of the forest and crop products are assumed to be returned to the atmosphere (through livestock and human consumption). The ecosystems of Mexico are estimated to be a small net source (+18 TgC yr⁻¹) due to land use change between 1993 and 2002. We compare these inventory-based estimates with results from a suite of terrestrial biosphere and atmospheric inversion models, where the mean continental-scale NEE estimate for each ensemble is -511 TgC yr⁻¹ and -931 TgC yr⁻¹, respectively. In the modeling approaches, all sectors, including Other Lands, were generally estimated to be a carbon sink, driven in part by assumed CO2 fertilization and/or lack of consideration of carbon sources from disturbances and product emissions. Additional fluxes not measured by the inventories, although highly uncertain, could add an additional -239 TgC yr⁻¹ to the inventory-based NA sink estimate, thus suggesting some convergence with the modeling approaches.
Weather models as virtual sensors to data-driven rainfall predictions in urban watersheds
NASA Astrophysics Data System (ADS)
Cozzi, Lorenzo; Galelli, Stefano; Pascal, Samuel Jolivet De Marc; Castelletti, Andrea
2013-04-01
Weather and climate predictions are a key element of urban hydrology, where they are used to inform water management and assist in delivering flood warnings. Indeed, the modelling of the very fast dynamics of urbanized catchments can be substantially improved by the use of weather/rainfall predictions. For example, in Singapore's Marina Reservoir catchment, runoff processes have a very short time of concentration (roughly one hour); observational data are thus nearly useless for runoff prediction, and weather predictions are required. Unfortunately, radar nowcasting methods do not allow long-term weather predictions to be carried out, whereas numerical models are limited by their coarse spatial scale. Moreover, numerical models are usually of limited reliability because of the fast motion and limited spatial extension of rainfall events. In this study we investigate the combined use of data-driven modelling techniques and weather variables observed/simulated with a numerical model as a way to improve rainfall prediction accuracy and lead time in the Singapore metropolitan area. To explore the feasibility of the approach, we use a Weather Research and Forecasting (WRF) model as a virtual sensor network for the input variables (the states of the WRF model) to a machine learning rainfall prediction model. More precisely, we combine an input variable selection method and a non-parametric tree-based model to characterize the empirical relation between the rainfall measured at the catchment level and all possible weather input variables provided by the WRF model. We explore different lead times to evaluate the model's reliability for longer-term predictions, as well as different time lags to see how past information can improve results. Results show that the proposed approach allows a significant improvement of the prediction accuracy of the WRF model over the Singapore urban area.
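As a sketch of the input variable selection idea, the example below uses greedy correlation-based forward selection over synthetic "virtual sensor" variables, followed by a linear fit. The study's actual selection method and tree-based model are not reproduced here; this greedy procedure merely stands in for them.

```python
import numpy as np

def forward_select(X, y, n_keep=3):
    """Greedy input variable selection: repeatedly add the candidate
    variable most correlated with the current fit residual."""
    selected, resid = [], y - y.mean()
    beta = None
    for _ in range(n_keep):
        corrs = [0.0 if j in selected else
                 abs(np.corrcoef(X[:, j], resid)[0, 1])
                 for j in range(X.shape[1])]
        selected.append(int(np.argmax(corrs)))
        Xs = np.column_stack([X[:, selected], np.ones(len(y))])
        beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
        resid = y - Xs @ beta                 # refit and update residual
    return selected, beta

# Synthetic virtual-sensor data: 20 candidate model-state variables,
# "rainfall" actually depends on only two of them.
rng = np.random.default_rng(2)
X = rng.standard_normal((500, 20))
y = 3.0 * X[:, 4] - 2.0 * X[:, 11] + 0.1 * rng.standard_normal(500)
selected, beta = forward_select(X, y, n_keep=3)
# the two informative variables (4 and 11) are recovered
```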
Genetic Programming for Automatic Hydrological Modelling
NASA Astrophysics Data System (ADS)
Chadalawada, Jayashree; Babovic, Vladan
2017-04-01
One of the recent challenges for the hydrologic research community is the need for the development of coupled systems that involve the integration of hydrologic, atmospheric and socio-economic relationships. This poses a requirement for novel modelling frameworks that can accurately represent complex systems, given the limited understanding of underlying processes, the increasing volume of data and high levels of uncertainty. Each of the existing hydrological models varies in terms of conceptualization and process representation and is best suited to capture the environmental dynamics of a particular hydrological system. Data-driven approaches can be used in the integration of alternative process hypotheses in order to achieve a unified theory at catchment scale. The key steps in the implementation of an integrated modelling framework informed by both prior understanding and data include: the choice of the technique for the induction of knowledge from data; the identification of alternative structural hypotheses; the definition of rules and constraints for meaningful, intelligent combination of model component hypotheses; and the definition of evaluation metrics. This study aims at defining a Genetic Programming based modelling framework that tests different conceptual model constructs against a wide range of objective functions and evolves accurate and parsimonious models that capture the dominant hydrological processes at catchment scale. In this paper, GP initializes the evolutionary process using the modelling decisions inspired by the Superflex framework [Fenicia et al., 2011] and automatically combines them into model structures that are scrutinized against observed data using statistical, hydrological and flow duration curve based performance metrics. The collaboration between data-driven and physical, conceptual modelling paradigms improves the ability to model and manage hydrologic systems.
Fenicia, F., D. Kavetski, and H. H. Savenije (2011), Elements of a flexible approach for conceptual hydrological modeling: 1. Motivation and theoretical development, Water Resources Research, 47(11).
Competency--and Process-Driven e-Learning--A Model-Based Approach
ERIC Educational Resources Information Center
Leyking, Katrina; Chikova, Pavlina; Loos, Peter
2007-01-01
As a matter of fact e-Learning still has not really caught on for corporate training purposes. Investigations on the reasons reveal that e-Learning modules like WBTs often miss any relevance for the tasks to be accomplished in the day-to-day workplace settings. The very learning needs both from an organizational and individual perspective are…
John G. Michopoulos; Tomonari Furukawa; John C. Hermanson; Samuel G. Lambrakos
2011-01-01
The goal of this paper is to propose and demonstrate a multi level design optimization approach for the coordinated determination of a material constitutive model synchronously to the design of the experimental procedure needed to acquire the necessary data. The methodology achieves both online (real-time) and offline design of optimum experiments required for...
ERIC Educational Resources Information Center
Huang, Zuqing; Qiu, Robin G.
2016-01-01
University ranking or higher education assessment in general has been attracting more and more public attention over the years. However, the subjectivity-based evaluation index and indicator selections and weights that are widely adopted in most existing ranking systems have been called into question. In other words, the objectivity and…
Satellite detection of land-use change and effects on regional forest aboveground biomass estimates
Daolan Zheng; Linda S. Heath; Mark J. Ducey
2008-01-01
We used remote-sensing-driven models to detect land-cover change effects on forest aboveground biomass (AGB) density (Mg·ha−1, dry weight) and total AGB (Tg) in Minnesota, Wisconsin, and Michigan USA, between the years 1992-2001, and conducted an evaluation of the approach. Inputs included remotely-sensed 1992 reflectance data...
Functional language and data flow architectures
NASA Technical Reports Server (NTRS)
Ercegovac, M. D.; Patel, D. R.; Lang, T.
1983-01-01
This is a tutorial article about language and architecture approaches for highly concurrent computer systems based on the functional style of programming. The discussion concentrates on the basic aspects of functional languages, and sequencing models such as data-flow, demand-driven and reduction which are essential at the machine organization level. Several examples of highly concurrent machines are described.
ERIC Educational Resources Information Center
Tang, Stephen; Hanneghan, Martin
2011-01-01
Game-based learning harnesses the advantages of computer games technology to create a fun, motivating and interactive virtual learning environment that promotes problem-based experiential learning. Such an approach is advocated by many commentators to provide an enhanced learning experience than those based on traditional didactic methods.…
Wind-driven rain and its implications for natural hazard management
NASA Astrophysics Data System (ADS)
Marzen, Miriam; Iserloh, Thomas; de Lima, João L. M. P.; Fister, Wolfgang; Ries, Johannes B.
2017-04-01
Prediction and risk assessment of hydrological extremes are great challenges. According to climate predictions, frequent and violent rainstorms will become a new hazard to several regions in the medium term. Agricultural soils in particular will be severely threatened by the combined action of heavy rainfall and accompanying winds on bare soil surfaces. Because the effect of wind on rain erosion is generally underestimated, conventional soil erosion measurements and modelling approaches lack the information needed to adequately calculate its impact. The experimental-empirical approach presented here shows the powerful impact of wind on the erosive potential of rain. The tested soils had properties characteristic of three different environments: (1) silty loam of semi-arid Mediterranean dry farming and fallow, (2) clayey loam of humid agricultural sites, and (3) cohesionless sandy substrates as found at coasts, dune fields and drift-sand areas. Erosion was found to increase by a factor of 1.3 to 7.1, depending on site characteristics. Complementary tests with a laboratory procedure were used to quantify explicitly the effect of wind on raindrop erosion, as well as the influence of substrate, surface structure and slope on particle displacement. These tests confirmed that the impact of wind-driven rain on total erosion rates is of great importance compared with all the other tested factors. To successfully adapt soil erosion models to the near-future challenges of rainstorms induced by climate change, wind-driven rain should be introduced into the hazard management agenda.
NASA Astrophysics Data System (ADS)
Qin, Yuan; Yao, Man; Hao, Ce; Wan, Lijun; Wang, Yunhe; Chen, Ting; Wang, Dong; Wang, Xudong; Chen, Yonggang
2017-09-01
The two-dimensional (2D) chiral self-assembly system of a 5-(benzyloxy)-isophthalic acid derivative/(S)-(+)-2-octanol/highly oriented pyrolytic graphite was studied. A combined density functional theory/molecular mechanics/molecular dynamics (DFT/MM/MD) approach for systems of 2D chiral molecular self-assembly driven by hydrogen bonds at the liquid/solid interface was thus proposed. Structural models of the chiral assembly were built on the basis of scanning tunneling microscopy (STM) images and simplified for DFT geometry optimization. The Merck Molecular Force Field (MMFF) was singled out as the suitable force field by comparing the optimized configurations of MM and DFT. MM and MD simulations for a hexagonal unit model, which better represents the 2D assembly network, were then performed with MMFF. The adhesion energy, the evolution of the self-assembly process and the characteristic parameters of the hydrogen bonds were obtained and analyzed. On the basis of these simulations, the stabilities of the clockwise and counterclockwise enantiomorphous networks were evaluated. The computational results were supported by STM observations, and the feasibility of the simulation method was confirmed by two other systems, in the presence of the chiral co-adsorber (R)-(-)-2-octanol and the achiral co-adsorber 1-octanol. This theoretical simulation method assesses the stability trend of 2D enantiomorphous assemblies at the atomic scale and can be applied to similar hydrogen-bond-driven 2D chiral molecular self-assembly systems.
Caignard, Grégory; Eva, Megan M.; van Bruggen, Rebekah; Eveleigh, Robert; Bourque, Guillaume; Malo, Danielle; Gros, Philippe; Vidal, Silvia M.
2014-01-01
Infectious diseases are responsible for over 25% of deaths globally, but many more individuals are exposed to deadly pathogens. The outcome of infection results from a set of diverse factors including pathogen virulence factors, the environment, and the genetic make-up of the host. The completion of the human reference genome sequence in 2004, along with technological advances, has tremendously accelerated and modernized the tools to study the genetic etiology of infectious diseases in humans and its best characterized mammalian model, the mouse. Advancements in mouse genomic resources have accelerated genome-wide functional approaches, such as gene-driven and phenotype-driven mutagenesis, bringing to the fore the use of mouse models that accurately reproduce many aspects of the pathogenesis of human infectious diseases. Treatment with the mutagen N-ethyl-N-nitrosourea (ENU) has become the most popular phenotype-driven approach. Our team and others have employed mouse ENU mutagenesis to identify host genes that directly impact susceptibility to pathogens of global significance. In this review, we first describe the strategies and tools used in mouse genetics to understand immunity to infection, with special emphasis on chemical mutagenesis of the mouse germ-line together with current strategies to efficiently identify functional mutations using next generation sequencing. Then, we highlight illustrative examples of genes, proteins, and cellular signatures that have been revealed by ENU screens and have been shown to be involved in susceptibility or resistance to infectious diseases caused by parasites, bacteria, and viruses. PMID:25268389
A methodology proposal for collaborative business process elaboration using a model-driven approach
NASA Astrophysics Data System (ADS)
Mu, Wenxin; Bénaben, Frédérick; Pingaud, Hervé
2015-05-01
Business process management (BPM) principles are commonly used to improve processes within an organisation. But they can equally be applied to supporting the design of an Information System (IS). In a collaborative situation involving several partners, this type of BPM approach may be useful to support the design of a Mediation Information System (MIS), which would ensure interoperability between the partners' ISs (which are assumed to be service oriented). To achieve this objective, the first main task is to build a collaborative business process cartography. The aim of this article is to present a method for bringing together collaborative information and elaborating collaborative business processes from the information gathered (by using a collaborative situation framework, an organisational model, an informational model, a functional model and a metamodel and by using model transformation rules).
Experimental Definition and Validation of Protein Coding Transcripts in Chlamydomonas reinhardtii
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kourosh Salehi-Ashtiani; Jason A. Papin
Algal fuel sources promise unsurpassed yields in a carbon neutral manner that minimizes resource competition between agriculture and fuel crops. Many challenges must be addressed before algal biofuels can be accepted as a component of the fossil fuel replacement strategy. One significant challenge is that the cost of algal fuel production must become competitive with existing fuel alternatives. Algal biofuel production presents the opportunity to fine-tune microbial metabolic machinery for an optimal blend of biomass constituents and desired fuel molecules. Genome-scale model-driven algal metabolic design promises to facilitate both goals by directing the utilization of metabolites in the complex, interconnected metabolic networks to optimize production of the compounds of interest. Using Chlamydomonas reinhardtii as a model, we developed a systems-level methodology bridging metabolic network reconstruction with annotation and experimental verification of enzyme encoding open reading frames. We reconstructed a genome-scale metabolic network for this alga and devised a novel light-modeling approach that enables quantitative growth prediction for a given light source, resolving wavelength and photon flux. We experimentally verified transcripts accounted for in the network and physiologically validated model function through simulation and generation of new experimental growth data, providing high confidence in network contents and predictive applications. The network offers insight into algal metabolism and potential for genetic engineering and efficient light source design, a pioneering resource for studying light-driven metabolism and quantitative systems biology. Our approach to generate a predictive metabolic model integrated with cloned open reading frames provides a cost-effective platform to generate metabolic engineering resources.
While the generated resources are specific to algal systems, the approach that we have developed is not specific to algae and can be readily expanded to other microbial systems as well as higher plants and animals.
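Genome-scale model-driven design of the kind described rests on flux balance analysis: maximizing a biomass objective subject to the steady-state mass balance S·v = 0 and flux bounds. The toy three-reaction network below is hypothetical (not taken from the C. reinhardtii reconstruction) and is solved as a linear program.

```python
import numpy as np
from scipy.optimize import linprog

# Stoichiometric matrix: rows = internal metabolites (A, B),
# columns = reactions (uptake -> A, A -> B, B -> biomass).
S = np.array([[ 1.0, -1.0,  0.0],
              [ 0.0,  1.0, -1.0]])

c = np.array([0.0, 0.0, -1.0])        # linprog minimizes, so negate biomass
bounds = [(0, 5), (0, 10), (0, 10)]   # uptake capped at 5 flux units

# Steady state (S v = 0) forces v1 = v2 = v3, so the uptake bound
# limits the optimal biomass flux to 5.
res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
fluxes = res.x                        # optimal flux distribution
```

In a genome-scale model, S has thousands of reactions and the bounds encode measured uptake rates and the light model's photon flux, but the optimization has exactly this structure.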
Modeling of fast neutral-beam-generated ions and rotation effects on RWM stability in DIII-D plasmas
Turco, Francesca; Turnbull, Alan D.; Hanson, Jeremy M.; ...
2015-10-15
Here, validation results for the MARS-K code for DIII-D equilibria predict that the absence of fast Neutral Beam (NB) generated ions leads to a plasma response ~40–60% higher than in NB-sustained H-mode plasmas when the no-wall β_N limit is reached. In a β_N scan, the MARS-K model with thermal and fast ions reproduces the experimental measurements above the no-wall limit, except at the highest β_N, where the phase of the plasma response is overestimated. The dependencies extrapolate unfavorably to machines such as ITER with smaller fast-ion fractions, since elevated responses in the absence of fast ions indicate the potential onset of a resistive wall mode (RWM). The model was also tested for the effects of rotation at high β_N, and recovers the measured response even when fast ions are neglected, reversing the effect found in lower β_N cases but consistent with the higher β_N results above the no-wall limit. The agreement in the response amplitude and phase for the rotation scan is not as good, and additional work will be needed to reproduce the experimental trends. In the case of current-driven instabilities, the magnetohydrodynamic spectroscopy system used to measure the plasma response reacts differently from that for pressure-driven instabilities: the response amplitude remains low up to ~93% of the current limit, showing an abrupt increase only in the last ~5% of the current ramp. This makes it much less effective as a diagnostic for the approach to an ideal limit. However, the mode structure of the current-driven RWM extends radially inwards, consistent with that in the pressure-driven case for plasmas with q_edge ~ 2. This suggests that previously developed RWM feedback techniques, together with the additional optimizations that enabled q_edge ~ 2 operation, can be applied to control both current-driven and pressure-driven modes at high β_N.
Estimating daily forest carbon fluxes using a combination of ground and remotely sensed data
NASA Astrophysics Data System (ADS)
Chirici, Gherardo; Chiesi, Marta; Corona, Piermaria; Salvati, Riccardo; Papale, Dario; Fibbi, Luca; Sirca, Costantino; Spano, Donatella; Duce, Pierpaolo; Marras, Serena; Matteucci, Giorgio; Cescatti, Alessandro; Maselli, Fabio
2016-02-01
Several studies have demonstrated that Monteith's approach can efficiently predict forest gross primary production (GPP), while the modeling of net ecosystem production (NEP) is more critical, requiring the additional simulation of forest respirations. The NEP of different forest ecosystems in Italy was simulated here by the combined use of a remote sensing driven parametric model (modified C-Fix) and a biogeochemical model (BIOME-BGC). The outputs of the two models, which simulate forests in quasi-equilibrium conditions, are combined to estimate the carbon fluxes of actual conditions using information on the existing woody biomass. The estimates derived from the methodology have been tested against daily reference GPP and NEP data collected through the eddy correlation technique at five study sites in Italy. The first test concerned the theoretical validity of the simulation approach at both annual and daily time scales and was performed using optimal model drivers (i.e., collected or calibrated over the site measurements). Next, the test was repeated to assess the operational applicability of the methodology, which was driven by spatially extended data sets (i.e., data derived from existing wall-to-wall digital maps). A good estimation accuracy was generally obtained for GPP and NEP when using optimal model drivers. The use of spatially extended data sets worsens the accuracy to a varying degree, which is properly characterized. The model drivers with the most influence on the flux modeling strategy are, in increasing order of importance, forest type, soil features, meteorology, and forest woody biomass (growing stock volume).
Data-Driven Modeling of Complex Systems by means of a Dynamical ANN
NASA Astrophysics Data System (ADS)
Seleznev, A.; Mukhin, D.; Gavrilov, A.; Loskutov, E.; Feigin, A.
2017-12-01
Data-driven methods for the modeling and prognosis of complex dynamical systems are becoming increasingly popular in various fields owing to the growth of high-resolution data. We distinguish two basic steps in such an approach: (i) determining the phase subspace of the system, or embedding, from available time series, and (ii) constructing an evolution operator acting in this reduced subspace. In this work we suggest a novel approach that combines these two steps through the construction of an artificial neural network (ANN) with a special topology. The proposed ANN-based model, on the one hand, projects the data onto a low-dimensional manifold and, on the other hand, models a dynamical system on this manifold. It is effectively a recurrent multilayer ANN which has internal dynamics and is capable of generating time series. A key point of the proposed methodology is model optimization to avoid overfitting: we use a Bayesian criterion to optimize the ANN structure and to estimate both the degree of nonlinearity of the evolution operator and the complexity of the nonlinear manifold onto which the data are projected. The proposed modeling technique will be applied to the analysis of high-dimensional dynamical systems: the Lorenz'96 model of atmospheric turbulence, producing high-dimensional space-time chaos, and a quasi-geostrophic three-layer model of the Earth's atmosphere with natural orography, describing the dynamics of synoptic vortexes as well as mesoscale blocking systems. The possibility of applying the proposed methodology to real measured data is also discussed. The study was supported by the Russian Science Foundation (grant #16-12-10198).
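The two-step scheme described above (embedding, then an evolution operator in the reduced subspace) can be illustrated with a deliberately simplified sketch: plain PCA stands in for the paper's ANN-based nonlinear projection, and a linear least-squares operator stands in for the nonlinear evolution model; all data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic high-dimensional time series driven by a 2-D latent rotation.
T, D = 500, 20
t = np.linspace(0, 50, T)
latent = np.column_stack([np.sin(t), np.cos(t)])        # (T, 2)
mixing = rng.normal(size=(2, D))
X = latent @ mixing + 0.01 * rng.normal(size=(T, D))    # (T, D)

# Step (i): embedding -- project onto the leading principal components.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 2
Z = Xc @ Vt[:k].T                                       # reduced coordinates

# Step (ii): evolution operator Z[t+1] ~ Z[t] @ A, fitted by least squares.
A, *_ = np.linalg.lstsq(Z[:-1], Z[1:], rcond=None)

# Relative one-step prediction error in the reduced subspace.
err = np.linalg.norm(Z[1:] - Z[:-1] @ A) / np.linalg.norm(Z[1:])
print(f"relative one-step prediction error: {err:.4f}")
```

Because the latent dynamics here are an exact rotation, the linear operator recovers them almost perfectly; the paper's contribution is learning both steps jointly, and nonlinearly, inside a single ANN.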
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-03
... by Index and Model-Driven Funds ACTION: Notice. SUMMARY: The Department of Labor (DOL) is submitting...) titled, ``Prohibited Transaction Class Exemption for Cross-Trades of Securities by Index and Model-Driven... and Model-Driven Funds permits cross-trades of securities between index and model-driven funds managed...
Jiang, Jiehui; Sun, Yiwu; Zhou, Hucheng; Li, Shaoping; Huang, Zhemin; Wu, Ping; Shi, Kuangyu; Zuo, Chuantao; Neuroimaging Initiative, Alzheimer's Disease
2018-01-01
The 18F-FDG PET scan is one of the most frequently used neuroimaging scans. However, age has proven to be the greatest interfering factor in many clinical dementia diagnoses based on 18F-FDG PET images, since radiologists encounter difficulties when deciding whether abnormalities in specific regions correlate with normal aging, disease, or both. In the present paper, the authors aimed to define specific brain regions and determine an age-correction mathematical model. A data-driven approach was used based on 255 healthy subjects. The inferior frontal gyrus, the left medial part and the left medial orbital part of the superior frontal gyrus, the right insula, the left anterior cingulate, the left median cingulate and paracingulate gyri, and the bilateral superior temporal gyri were found to have a strong negative correlation with age. For evaluation, the age-correction model was applied to 262 healthy subjects and 50 AD subjects selected from the ADNI database, and partial correlations between the mean SUVR and three clinical results were computed before and after age correction. All correlation coefficients were significantly improved after the age correction. The proposed model was effective in the age correction of both healthy and AD subjects.
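The core age-correction idea, removing a fitted linear age trend from regional uptake values, can be sketched as follows. The cohort, slope, and reference age below are synthetic stand-ins, not the paper's actual ADNI-derived values.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic healthy cohort: regional SUVR declines linearly with age.
age = rng.uniform(55, 90, size=255)
suvr = 1.30 - 0.004 * (age - 70) + rng.normal(0, 0.02, size=age.size)

# Fit the age trend (slope and intercept) on the healthy controls.
slope, intercept = np.polyfit(age, suvr, 1)

def age_correct(suvr_obs, age_obs, ref_age=70.0):
    """Remove the estimated linear age effect, referenced to ref_age."""
    return suvr_obs - slope * (age_obs - ref_age)

corrected = age_correct(suvr, age)

# After correction, the residual correlation with age should vanish.
r_before = np.corrcoef(age, suvr)[0, 1]
r_after = np.corrcoef(age, corrected)[0, 1]
print(f"corr(age, SUVR) before: {r_before:.3f}, after: {r_after:.3f}")
```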
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou, Ping; Song, Heda; Wang, Hong
Blast furnace (BF) ironmaking is a nonlinear dynamic process with complicated physical-chemical reactions, in which multi-phase and multi-field coupling and large time delays occur during operation. In BF operation, the molten iron temperature (MIT) as well as the Si, P and S contents of molten iron are the most essential molten iron quality (MIQ) indices, whose measurement, modeling and control have long been important issues in the metallurgical engineering and automation fields. This paper develops a novel data-driven nonlinear state space model for the prediction and control of multivariate MIQ indices by integrating hybrid modeling and control techniques. First, to improve modeling efficiency, a data-driven hybrid method combining canonical correlation analysis and correlation analysis is proposed to identify, from among the multitudinous factors that affect the MIQ indices, the most influential controllable variables as the modeling inputs. Then, a Hammerstein model for the prediction of MIQ indices is established using the LS-SVM based nonlinear subspace identification method. This model is further simplified by using the piecewise cubic Hermite interpolating polynomial method to fit the complex nonlinear kernel function. Compared to the original Hammerstein model, the simplified model not only significantly reduces the computational complexity, but also retains almost the same reliability and accuracy for a stable prediction of the MIQ indices. Finally, to verify the practicability of the developed model, it is applied in designing a genetic algorithm based nonlinear predictive controller for the multivariate MIQ indices by directly taking the established model as a predictor. Industrial experiments show the advantages and effectiveness of the proposed approach.
Sierra-de-Grado, Rosario; Pando, Valentín; Martínez-Zurimendi, Pablo; Peñalvo, Alejandro; Báscones, Esther; Moulia, Bruno
2008-06-01
Stem straightness is an important selection trait in Pinus pinaster Ait. breeding programs. Despite the stability of stem straightness rankings in provenance trials, the efficiency of breeding programs based on a quantitative index of stem straightness remains low. An alternative approach is to analyze biomechanical processes that underlie stem form. The rationale for this selection method is that genetic differences in the biomechanical processes that maintain stem straightness in young plants will continue to control stem form throughout the life of the tree. We analyzed the components contributing most to genetic differences among provenances in stem straightening processes by kinetic analysis and with a biomechanical model defining the interactions between the variables involved (Fournier's model). This framework was tested on three P. pinaster provenances differing in adult stem straightness and growth. One-year-old plants were tilted at 45 degrees, and individual stem positions and sizes were recorded weekly for 5 months. We measured the radial extension of reaction wood and the anatomical features of wood cells in serial stem cross sections. The integral effect of reaction wood on stem leaning was computed with Fournier's model. Responses driven by both primary and secondary growth were involved in the stem straightening process, but secondary-growth-driven responses accounted for most differences among provenances. Plants from the straight-stemmed provenance showed a greater capacity for stem straightening than plants from the sinuous provenances mainly because of (1) more efficient reaction wood (higher maturation strains) and (2) more pronounced secondary-growth-driven autotropic decurving. These two process-based traits are thus good candidates for early selection of stem straightness, but additional tests on a greater number of genotypes over a longer period are required.
Chen, Young-Bin; Lan, Ying-Wei; Hung, Tsai-Hsien; Chen, Lih-Geeng; Choo, Kong-Bung; Cheng, Winston T K; Lee, Hsuan-Shu; Chong, Kowit-Yu
2015-07-01
Several studies of stem cell-based gene therapy have indicated that long-lasting regeneration following vessel ischemia may be stimulated through VEGFA gene therapy and/or MSC transplantation for reduction of ischemic injury in limb ischemia and heart failure. The therapeutic potential of MSC transplantation can be further improved by genetically modifying MSCs with genes which enhance angiogenesis following ischemic injury. In the present study, we aimed to develop an approach to MSC-based therapy for repair and mitigation of ischemic injury and regeneration of damaged tissues in ischemic disease. HSP70 promoter-driven VEGFA expression was induced by resveratrol (RSV) in MSCs, and in combination with known RSV biological functions, the protective effects of our approach were investigated using an ex vivo aortic ring coculture system and a 3D scaffold in vivo model. Results of this investigation demonstrated that HSP70 promoter-driven VEGFA expression in MSCs increased approximately 2-fold over background VEGFA levels upon HSP70 promoter induction by RSV. Exposure of HUVEC cells to medium containing MSCs in which VEGFA had been induced by cis-RSV enhanced tube formation in the treated HUVEC cells. RSV-treated MSCs differentiated into endothelial-like phenotypes, exhibiting markedly elevated expression of endothelial cell markers. These MSCs also induced aortic ring sprouting, characteristic of neovascular formation from pre-existing vessels, and additionally promoted neovascularization at the MSC transplantation site in a mouse model. These observations support the hypothesis that VEGFA expression induced by cis-RSV acting on the HSP70 promoter in transplanted MSCs augments the angiogenic effects of stem cell gene therapy. The use of an inducible system also vastly reduces the possible clinical risks associated with constitutive VEGFA expression.
A time domain frequency-selective multivariate Granger causality approach.
Leistritz, Lutz; Witte, Herbert
2016-08-01
The investigation of effective connectivity is one of the major topics in computational neuroscience to understand the interaction between spatially distributed neuronal units of the brain. Thus, a wide variety of methods has been developed during the last decades to investigate functional and effective connectivity in multivariate systems. Their spectrum ranges from model-based to model-free approaches with a clear separation into time and frequency range methods. We present in this simulation study a novel time domain approach based on Granger's principle of predictability, which allows frequency-selective considerations of directed interactions. It is based on a comparison of prediction errors of multivariate autoregressive models fitted to systematically modified time series. These modifications are based on signal decompositions, which enable a targeted cancellation of specific signal components with specific spectral properties. Depending on the embedded signal decomposition method, a frequency-selective or data-driven signal-adaptive Granger Causality Index may be derived.
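The classical time-domain construction that the proposed index builds on, comparing the prediction errors of restricted and full autoregressive models, can be sketched as follows. The paper's distinctive signal-decomposition step is omitted here; the coupled test signals and model orders are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic coupled pair: y is driven by past values of x, but not vice versa.
T = 2000
x = rng.normal(size=T)
y = np.zeros(T)
for t in range(2, T):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.normal()

def ar_residual_var(target, regressors, p=2):
    """Residual variance of a least-squares AR fit of target on p lags
    of each regressor series."""
    n = len(target)
    rows = [np.concatenate([r[t - p:t][::-1] for r in regressors])
            for t in range(p, n)]
    Phi = np.asarray(rows)
    b, *_ = np.linalg.lstsq(Phi, target[p:], rcond=None)
    return np.var(target[p:] - Phi @ b)

# Granger causality index: log ratio of restricted vs. full prediction error.
gci_x_to_y = np.log(ar_residual_var(y, [y]) / ar_residual_var(y, [y, x]))
gci_y_to_x = np.log(ar_residual_var(x, [x]) / ar_residual_var(x, [x, y]))
print(f"GCI x->y: {gci_x_to_y:.2f}, GCI y->x: {gci_y_to_x:.3f}")
```

The index is large in the driving direction and near zero in the reverse direction; the frequency-selective variant replaces the raw series with versions in which specific spectral components have been cancelled before the same comparison is made.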
The TIPS Evaluation Project: A Theory-Driven Approach to Dissemination Research.
ERIC Educational Resources Information Center
Mulvey, Kevin P.; Hayashi, Susan W.; Hubbard, Susan M.; Kopstien, Andrea; Huang, Judy Y.
2003-01-01
Introduces the special section that focuses on four major studies under the treatment improvement protocols (TIPs) evaluation project. Provides an overview of each article, and addresses the value of using a theory-driven approach to dissemination research. (SLD)
Data driven propulsion system weight prediction model
NASA Astrophysics Data System (ADS)
Gerth, Richard J.
1994-10-01
The objective of the research was to develop a method to predict the weight of paper engines, i.e., engines that are in the early stages of development. The impetus for the project was the Single Stage To Orbit (SSTO) project, where engineers need to evaluate alternative engine designs. Since the SSTO is a performance-driven project, the performance models for alternative designs were well understood. The next tradeoff is weight. Since it is known that engine weight varies with thrust level, a model is required that allows discrimination between engines that produce the same thrust. Above all, the model had to be rooted in data, with assumptions that could be justified based on the data. The general approach was to collect data on as many existing engines as possible and build a statistical model of engine weight as a function of various component performance parameters. This was considered a reasonable level at which to begin the project because the data would be readily available, and it would be at the level of most paper engines, prior to detailed component design.
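A minimal sketch of such a statistical weight model follows. The engine catalog and the chosen regressors (thrust and chamber pressure) are hypothetical, invented purely to illustrate the regression idea; the study's actual database and parameters are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical catalog of existing engines: weight grows with thrust
# and with chamber pressure (both relationships are invented).
n = 40
thrust = rng.uniform(100, 2000, n)        # kN
pc = rng.uniform(50, 250, n)              # chamber pressure, bar
weight = 0.9 * thrust + 2.0 * pc + 150 + rng.normal(0, 25, n)   # kg

# Ordinary least squares on [thrust, pc, 1].
X = np.column_stack([thrust, pc, np.ones(n)])
coef, *_ = np.linalg.lstsq(X, weight, rcond=None)

def predict_weight(thrust_kn, pc_bar):
    """Predict the weight of a paper engine from component parameters."""
    return coef @ np.array([thrust_kn, pc_bar, 1.0])

# Two paper engines with equal thrust can now be discriminated by a
# second component parameter, which is the point of the approach.
w_low = predict_weight(1000.0, 80.0)
w_high = predict_weight(1000.0, 220.0)
print(f"predicted weights: {w_low:.0f} kg vs {w_high:.0f} kg")
```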
Ren, Jiaping; Wang, Xinjie; Manocha, Dinesh
2016-01-01
We present a biologically plausible dynamics model to simulate swarms of flying insects. Our formulation, which is based on biological conclusions and experimental observations, is designed to simulate large insect swarms of varying densities. We use a force-based model that captures different interactions between the insects and the environment and computes collision-free trajectories for each individual insect. Furthermore, we model the noise as a constructive force at the collective level and present a technique to generate noise-induced insect movements in a large swarm that are similar to those observed in real-world trajectories. We use a data-driven formulation that is based on pre-recorded insect trajectories. We also present a novel evaluation metric and a statistical validation approach that takes into account various characteristics of insect motions. In practice, the combination of a curl noise function with our dynamics model is used to generate realistic swarm simulations and emergent behaviors. We highlight its performance for simulating large flying swarms of midges, fruit flies, locusts, and moths and demonstrate many collective behaviors, including aggregation, migration, phase transition, and escape responses. PMID:27187068
Formanowicz, Dorota; Radom, Marcin; Rybarczyk, Agnieszka; Formanowicz, Piotr
2018-03-01
The superoxide-driven Fenton reaction plays an important role in the transformation of poorly reactive radicals into highly reactive ones. These highly reactive oxygen species (ROS), especially hydroxyl radicals, can lead to many disturbances contributing to the endothelial dysfunction that is a starting point for atherosclerosis. Although iron has been identified as a possible culprit influencing the formation of ROS, its significance in this process is still debatable. To better understand this phenomenon, the influence of a blockade of the Fenton reaction was evaluated in a proposed Petri net-based model of selected aspects of iron ROS-induced toxicity in atherosclerosis. As a result of the blockade of iron ion formation in the model, up to 70% of the paths leading to the progression of atherosclerosis were blocked. In addition, after adding a blockade of the lipid peroxidation paths to the model, progression of atherosclerotic plaque was no longer observed. This allows the conclusion that the superoxide-driven Fenton reaction plays a significant role in atherosclerosis. Copyright © 2018 Elsevier B.V. All rights reserved.
Numerical modeling of the solar wind flow with observational boundary conditions
Pogorelov, N. V.; Borovikov, S. N.; Burlaga, L. F.; ...
2012-11-20
In this paper we describe our group's efforts to develop a self-consistent, data-driven model of the solar wind (SW) interaction with the local interstellar medium. The motion of plasma in this model is described with the MHD approach, while the transport of neutral atoms is addressed by either kinetic or multi-fluid equations. The model and its implementation in the Multi-Scale Fluid-Kinetic Simulation Suite (MS-FLUKSS) are continuously tested and validated by comparing our results with other models and spacecraft measurements. In particular, the model was successfully applied to explain an unusual SW behavior discovered by the Voyager 1 spacecraft, i.e., the development of a substantial negative radial velocity component and a flow turning in the transverse direction, while the latitudinal velocity component drops to very small values. We explain recent SW velocity measurements at Voyager 1 in the context of our 3-D MHD modeling. We also present a comparison of different turbulence models in their ability to reproduce the SW temperature profile from Voyager 2 measurements. Lastly, boundary conditions obtained at 50 solar radii from data-driven numerical simulations are used to model a CME event throughout the heliosphere.
Designing an optimal software intensive system acquisition: A game theoretic approach
NASA Astrophysics Data System (ADS)
Buettner, Douglas John
The development of schedule-constrained software-intensive space systems is challenging. Case study data from national security space programs developed at the U.S. Air Force Space and Missile Systems Center (USAF SMC) provide evidence of a strong tendency for contractors to skip or severely reduce software development design and early defect detection methods in these schedule-constrained environments. The research findings suggest recommendations to fully address these issues at numerous levels. However, the observations led us to investigate modeling and theoretical methods to understand what motivated this behavior in the first place. As a result, Madachy's inspection-based system dynamics model is modified to include unit testing and an integration test feedback loop. This Modified Madachy Model (MMM) is used as a tool to investigate the consequences of this behavior on the observed defect dynamics of two remarkably different case study software projects. Latin Hypercube sampling of the MMM with sample distributions for quality-, schedule- and cost-driven strategies demonstrates that the higher-cost, higher-effort quality-driven strategies provide consistently better schedule performance than the schedule-driven up-front effort-reduction strategies. Game-theoretic reasoning for schedule-driven engineers cutting corners on inspections and unit testing is based on the case study evidence and Austin's agency model of the observed phenomena. Game theory concepts are then used to argue that the source of the problem, and hence the solution to developers cutting corners on quality in schedule-driven system acquisitions, ultimately lies with the government. The game theory arguments also lead to the suggestion that a multi-player dynamic Nash bargaining game provides a solution to the observed lack-of-quality game between the government (the acquirer) and "large-corporation" software developers.
A note is provided that argues this multi-player dynamic Nash bargaining game also provides the solution to Freeman Dyson's problem, for a way to place a label of good or bad on systems.
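The Latin Hypercube sampling used to drive the MMM runs can be sketched with a minimal sampler: each of the d parameters is stratified into n equal-probability bins, with one sample per bin and an independently shuffled bin ordering per dimension. The implementation details below are illustrative, not taken from the dissertation.

```python
import numpy as np

def latin_hypercube(n, d, rng):
    """n samples in [0, 1)^d with exactly one sample per bin per dimension."""
    u = rng.random((n, d))                                  # jitter within bins
    bins = np.array([rng.permutation(n) for _ in range(d)]).T
    return (bins + u) / n

rng = np.random.default_rng(4)
samples = latin_hypercube(100, 3, rng)

# Stratification check: every dimension places one sample in each bin,
# which is what gives LHS better coverage than plain Monte Carlo.
occupancy = np.floor(samples * 100).astype(int)
print(all(sorted(occupancy[:, j].tolist()) == list(range(100))
          for j in range(3)))
```

In practice each unit-cube coordinate would then be mapped through the inverse CDF of the corresponding strategy parameter's distribution before being fed to the simulation model.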
Interpretable Deep Models for ICU Outcome Prediction
Che, Zhengping; Purushotham, Sanjay; Khemani, Robinder; Liu, Yan
2016-01-01
The exponential surge in health care data, such as longitudinal data from electronic health records (EHR) and sensor data from intensive care units (ICU), is providing new opportunities to discover meaningful data-driven characteristics and patterns of diseases. Recently, deep learning models have been employed for many computational phenotyping and healthcare prediction tasks to achieve state-of-the-art performance. However, deep models lack the interpretability that is crucial for wide adoption in medical research and clinical decision-making. In this paper, we introduce a simple yet powerful knowledge-distillation approach called interpretable mimic learning, which uses gradient boosting trees to learn interpretable models while achieving prediction performance as strong as deep learning models. Experimental results on a pediatric ICU dataset for acute lung injury (ALI) show that our proposed method not only outperforms state-of-the-art approaches for mortality and ventilator-free days prediction tasks but can also provide interpretable models to clinicians. PMID:28269832
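The mimic-learning recipe, distilling a deep model into gradient boosting trees by fitting the trees to the deep model's soft predictions, can be sketched as follows. A small MLP on synthetic data stands in for the paper's deep models and ICU dataset.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=400, n_features=10, random_state=0)

# Teacher: a neural network produces soft probability targets.
teacher = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000,
                        random_state=0).fit(X, y)
soft = teacher.predict_proba(X)[:, 1]

# Student: gradient boosting trees are trained to mimic the soft outputs,
# not the hard labels, so they inherit the teacher's learned function.
student = GradientBoostingRegressor(random_state=0).fit(X, soft)
mimic = student.predict(X)

# The student tracks the teacher while retaining tree-based
# interpretability (e.g. feature importances) that the MLP lacks.
agreement = np.mean((mimic > 0.5) == (soft > 0.5))
print(f"student/teacher decision agreement: {agreement:.2f}")
```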
Robust PLS approach for KPI-related prediction and diagnosis against outliers and missing data
NASA Astrophysics Data System (ADS)
Yin, Shen; Wang, Guang; Yang, Xu
2014-07-01
In practical industrial applications, key performance indicator (KPI)-related prediction and diagnosis are quite important for product quality and economic benefits. To meet these requirements, many advanced prediction and monitoring approaches have been developed, which can be classified into model-based and data-driven techniques. Among these approaches, partial least squares (PLS) is one of the most popular data-driven methods due to its simplicity and easy implementation in large-scale industrial processes. As PLS is based entirely on the measured process data, the characteristics of those data are critical to its success. Outliers and missing values are two common characteristics of measured data that can severely affect the effectiveness of PLS. To ensure the applicability of PLS in practical industrial applications, this paper introduces a robust version of PLS that deals with outliers and missing values simultaneously. The effectiveness of the proposed method is finally demonstrated by the application results of KPI-related prediction and diagnosis on the industrial benchmark of the Tennessee Eastman process.
Coastal upwelling by wind-driven forcing in Jervis Bay, New South Wales: A numerical study for 2011
NASA Astrophysics Data System (ADS)
Sun, Youn-Jong; Jalón-Rojas, Isabel; Wang, Xiao Hua; Jiang, Donghui
2018-06-01
The Princeton Ocean Model (POM) was used to investigate an upwelling event in Jervis Bay, New South Wales (SE Australia), with varying wind directions and strengths. The POM was adopted with a downscaling approach, with the regional ocean model one-way nested to a global ocean model. The upwelling event was detected from observed wind data and satellite sea surface temperature images. The validated model reproduced the upwelling event, showing the wind-driven input of bottom cold water to the bay, its subsequent deflection to the south, and its outcropping to the surface along the west and south coasts. Nevertheless, the behavior of the bottom water that intruded into the bay varied with wind direction and strength. Upwelling-favorable wind directions for flushing efficiency within the bay were ranked in the following order: N (0°; northerly) > NNE (30°; north-northeasterly) > NW (315°; northwesterly) > NE (45°; northeasterly) > ENE (60°; east-northeasterly). Increasing wind strength also enhances cold water penetration and water exchange. It was determined that wind-driven downwelling within the bay, which occurred with NNE, NE and ENE winds, played a key role in blocking the intrusion of the cold water upwelled through the bay entrance. A northerly wind stress higher than 0.3 N m-2 was required for the cold water to reach the northern innermost bay.
Modeling Quasi-Static and Fatigue-Driven Delamination Migration
NASA Technical Reports Server (NTRS)
De Carvalho, N. V.; Ratcliffe, J. G.; Chen, B. Y.; Pinho, S. T.; Baiz, P. M.; Tay, T. E.
2014-01-01
An approach was proposed and assessed for the high-fidelity modeling of progressive damage and failure in composite materials. It combines the Floating Node Method (FNM) and the Virtual Crack Closure Technique (VCCT) to represent multiple interacting failure mechanisms in a mesh-independent fashion. Delamination, matrix cracking, and migration were captured using failure and migration criteria based on fracture mechanics. Quasi-static and fatigue loading were modeled within the same overall framework. The proposed methodology was illustrated by simulating the delamination migration test, showing good agreement with the available experimental data.
Zgonnikov, Arkady; Lubashevsky, Ihor
2015-11-01
When facing the task of balancing a dynamic system near an unstable equilibrium, humans often adopt an intermittent control strategy: instead of continuously controlling the system, they repeatedly switch the control on and off. A paradigmatic example of such a task is stick balancing. Despite the simplicity of the task itself, the complexity of human intermittent control dynamics in stick balancing still puzzles researchers in motor control. Here we attempt to model one of the key mechanisms of human intermittent control, control activation, using as an example the task of overdamped stick balancing. In doing so, we focus on the concept of noise-driven activation, a more general alternative to the conventional threshold-driven activation. We describe control activation as a random walk in an energy potential which changes in response to the state of the controlled system. By way of numerical simulations, we show that the developed model captures the core properties of human control activation observed previously in experiments on overdamped stick balancing. Our results demonstrate that the double-well potential model provides a tractable mathematical description of human control activation, at least in the considered task, and suggest that the adopted approach can potentially aid in understanding human intermittent control in more complex processes.
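The noise-driven activation mechanism can be sketched as overdamped Langevin dynamics in a double-well potential. In the paper's model the controlled system's state modulates the potential; the fixed potential shape and all parameters below are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(6)

def simulate(a=1.0, steps=20000, dt=1e-3, noise=0.3):
    """Overdamped Langevin walk in V(x) = x**4/4 - a*x**2/2.
    The two wells at x = +-sqrt(a) stand for the 'control off' and
    'control on' states of the activation variable."""
    x = np.empty(steps)
    x[0] = 1.0
    for i in range(1, steps):
        drift = -(x[i - 1] ** 3 - a * x[i - 1])   # -V'(x)
        x[i] = x[i - 1] + drift * dt + noise * np.sqrt(dt) * rng.normal()
    return x

x = simulate()
# The walker dwells near one of the wells at x = +-1; stronger noise or a
# shallower barrier makes well-to-well switches (on/off events) more frequent.
print(f"mean |x|: {np.mean(np.abs(x)):.2f}")
```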
Nama, Nitesh; Barnkob, Rune; Mao, Zhangming; Kähler, Christian J; Costanzo, Francesco; Huang, Tony Jun
2015-06-21
We present a numerical study of the acoustophoretic motion of particles suspended in a liquid-filled PDMS microchannel on a lithium niobate substrate acoustically driven by surface acoustic waves. We employ a perturbation approach where the flow variables are divided into first- and second-order fields. We use impedance boundary conditions to model the PDMS microchannel walls and we model the acoustic actuation by a displacement function from the literature based on a numerical study of piezoelectric actuation. Consistent with the type of actuation, the obtained first-order field is a horizontal standing wave that travels vertically from the actuated wall towards the upper PDMS wall. This is in contrast to what is observed in bulk acoustic wave devices. The first-order fields drive the acoustic streaming, as well as the time-averaged acoustic radiation force acting on suspended particles. We analyze the motion of suspended particles driven by the acoustic streaming drag and the radiation force. We examine a range of particle diameters to demonstrate the transition from streaming-drag-dominated acoustophoresis to radiation-force-dominated acoustophoresis. Finally, as an application of our numerical model, we demonstrate the capability to tune the position of the vertical pressure node along the channel width by tuning the phase difference between two incoming surface acoustic waves.