Sample records for proposed modeling framework

  1. An Integrated Qualitative and Quantitative Biochemical Model Learning Framework Using Evolutionary Strategy and Simulated Annealing.

    PubMed

    Wu, Zujian; Pang, Wei; Coghill, George M

    2015-01-01

    Both qualitative and quantitative model learning frameworks for biochemical systems have been studied in computational systems biology. In this research, after introducing two forms of pre-defined component patterns to represent biochemical models, we propose an integrative qualitative and quantitative modelling framework for inferring biochemical systems. In the proposed framework, interactions between reactants in the candidate models for a target biochemical system are evolved and eventually identified by the application of a qualitative model learning approach with an evolution strategy. Kinetic rates of the models generated from qualitative model learning are then further optimised by employing a quantitative approach with simulated annealing. Experimental results indicate that the proposed integrative framework can learn the relationships between biochemical reactants qualitatively and make the model replicate the behaviours of the target system by optimising the kinetic rates quantitatively. Moreover, potential reactants of a target biochemical system can be discovered by hypothesising complex reactants in the synthetic models. Based on the biochemical models learned with the proposed framework, biologists can further perform experimental studies in the wet laboratory. In this way, natural biochemical systems can be better understood.
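
    The quantitative optimisation step can be sketched as a generic simulated-annealing loop. The toy objective below fits a single kinetic rate of a first-order decay model; the paper's actual model patterns, cooling schedule and parameters are not given in the abstract, so every name and constant here is illustrative.

```python
import math
import random

def simulated_annealing(objective, x0, step=0.1, t0=1.0, cooling=0.95,
                        iters=500, seed=0):
    """Minimise `objective` over one parameter by simulated annealing."""
    rng = random.Random(seed)
    x, fx = x0, objective(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(iters):
        cand = x + rng.uniform(-step, step)
        fc = objective(cand)
        # accept downhill moves always, uphill moves with Boltzmann probability
        if fc < fx or rng.random() < math.exp(-(fc - fx) / max(t, 1e-12)):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling  # geometric cooling schedule
    return best, fbest

# toy target: recover the kinetic rate k* = 0.7 of dX/dt = -k X
# from "observed" decay values X(t) = exp(-k* t)
target_k = 0.7
times = [0.5 * i for i in range(10)]
observed = [math.exp(-target_k * t) for t in times]

def error(k):
    return sum((math.exp(-k * t) - o) ** 2 for t, o in zip(times, observed))

k_hat, _ = simulated_annealing(error, x0=2.0)
```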

  2. A smoothed particle hydrodynamics framework for modelling multiphase interactions at meso-scale

    NASA Astrophysics Data System (ADS)

    Li, Ling; Shen, Luming; Nguyen, Giang D.; El-Zein, Abbas; Maggi, Federico

    2018-01-01

    A smoothed particle hydrodynamics (SPH) framework is developed for modelling multiphase interactions at meso-scale, including the liquid-solid interaction induced deformation of the solid phase. With an inter-particle force formulation that mimics the inter-atomic force in molecular dynamics, the proposed framework includes the long-range attractions between particles, and more importantly, the short-range repulsive forces to avoid particle clustering and instability problems. Three-dimensional numerical studies have been conducted to demonstrate the capabilities of the proposed framework to quantitatively replicate the surface tension of water, to model the interactions between immiscible liquids and solid, and more importantly, to simultaneously model the deformation of solid and liquid induced by the multiphase interaction. By varying inter-particle potential magnitude, the proposed SPH framework has successfully simulated various wetting properties ranging from hydrophobic to hydrophilic surfaces. The simulation results demonstrate the potential of the proposed framework to genuinely study complex multiphase interactions in wet granular media.
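
    The inter-particle force described above, long-range attraction plus short-range repulsion in the manner of molecular dynamics, can be illustrated with a Lennard-Jones-style pair force. This is a stand-in for the idea, not the paper's exact formulation or parameters.

```python
def pair_force(r, epsilon=1.0, sigma=1.0):
    """Lennard-Jones-style pair force magnitude (positive = repulsive).
    The steep r^-13 term gives short-range repulsion that prevents particle
    clustering; the r^-7 term gives the long-range attraction that produces
    surface-tension-like behaviour."""
    return 24 * epsilon * (2 * (sigma / r) ** 13 - (sigma / r) ** 7) / sigma

# equilibrium spacing where repulsion and attraction balance exactly
r_eq = 2 ** (1 / 6)
```

Close pairs are pushed apart, distant pairs pulled together, and the force vanishes at r_eq, which is what stabilises particle spacing in such schemes.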

  3. Koopman Operator Framework for Time Series Modeling and Analysis

    NASA Astrophysics Data System (ADS)

    Surana, Amit

    2018-01-01

    We propose an interdisciplinary framework for time series classification, forecasting, and anomaly detection by combining concepts from Koopman operator theory, machine learning, and linear systems and control theory. At the core of this framework is nonlinear dynamic generative modeling of time series using the Koopman operator, which is an infinite-dimensional but linear operator. Rather than working with the underlying nonlinear model, we propose two simpler linear representations or model forms based on Koopman spectral properties. We show that these model forms are invariants of the generative model and can be readily identified directly from data using techniques for computing Koopman spectral properties, without requiring explicit knowledge of the generative model. We also introduce different notions of distances on the space of such model forms, which is essential for model comparison/clustering. We employ the space of Koopman model forms equipped with a distance in conjunction with classical machine learning techniques to develop a framework for automatic feature generation for time series classification. The forecasting/anomaly detection framework is based on using Koopman model forms along with classical linear systems and control approaches. We demonstrate the proposed framework for human activity classification, and for time series forecasting/anomaly detection in a power grid application.
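
    One common way to identify such a linear model form directly from data is a least-squares fit of a one-step linear operator over snapshots, in the spirit of dynamic mode decomposition. The sketch below is a finite-dimensional illustration of that idea, not the paper's exact procedure; the eigenvalues of the fitted operator are the kind of spectral property usable as features.

```python
import numpy as np

def fit_linear_operator(snapshots):
    """Least-squares fit of A with x_{k+1} ~ A x_k from a snapshot matrix
    whose columns are successive states (a finite-dimensional stand-in for
    the Koopman operator restricted to the chosen observables)."""
    X, Y = snapshots[:, :-1], snapshots[:, 1:]
    A_T, *_ = np.linalg.lstsq(X.T, Y.T, rcond=None)  # solves A X = Y
    return A_T.T

# toy data: trajectory of a known rotation-and-decay linear system
theta, rho = 0.3, 0.95
A_true = rho * np.array([[np.cos(theta), -np.sin(theta)],
                         [np.sin(theta),  np.cos(theta)]])
x = np.array([1.0, 0.0])
traj = [x]
for _ in range(50):
    x = A_true @ x
    traj.append(x)
snapshots = np.array(traj).T  # shape (2, 51)

A_hat = fit_linear_operator(snapshots)
eigvals = np.linalg.eigvals(A_hat)  # spectral properties for model comparison
```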

  4. On the Conditioning of Machine-Learning-Assisted Turbulence Modeling

    NASA Astrophysics Data System (ADS)

    Wu, Jinlong; Sun, Rui; Wang, Qiqi; Xiao, Heng

    2017-11-01

    Recently, several researchers have demonstrated that machine learning techniques can be used to improve the RANS-modeled Reynolds stress by training on available databases of high-fidelity simulations. However, obtaining an improved mean velocity field remains an unsolved challenge, restricting the predictive capability of current machine-learning-assisted turbulence modeling approaches. In this work we define a condition number to evaluate the model conditioning of data-driven turbulence modeling approaches, and propose a stability-oriented machine learning framework to model the Reynolds stress. Two canonical flows, the flow in a square duct and the flow over periodic hills, are investigated to demonstrate the predictive capability of the proposed framework. The satisfactory prediction of the mean velocity field for both flows demonstrates the predictive capability of the proposed framework for machine-learning-assisted turbulence modeling. By improving the prediction of the mean flow field, the proposed stability-oriented machine learning framework bridges the gap between existing machine-learning-assisted turbulence modeling approaches and the predictive capability demanded of turbulence models in real applications.
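
    The paper defines its own condition number for model conditioning; as generic background only, the classical matrix condition number plays the same role for linear systems, bounding how strongly perturbations in the input (here, errors in the modeled Reynolds stress) can be amplified in the solution (the mean velocity field).

```python
import numpy as np

def condition_number(A):
    """kappa(A) = sigma_max / sigma_min in the 2-norm: an upper bound on the
    relative amplification of perturbations in b when solving A x = b."""
    s = np.linalg.svd(A, compute_uv=False)
    return s[0] / s[-1]

# a well-conditioned operator vs. a nearly singular one
A_good = np.array([[2.0, 0.0], [0.0, 1.0]])
A_bad = np.array([[1.0, 1.0], [1.0, 1.0001]])
kappa_good = condition_number(A_good)
kappa_bad = condition_number(A_bad)
```

For the ill-conditioned matrix, even a small error in the modeled input can produce a large error in the recovered field, which is exactly the failure mode a stability-oriented framework tries to avoid.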

  5. NoSQL Based 3D City Model Management System

    NASA Astrophysics Data System (ADS)

    Mao, B.; Harrie, L.; Cao, J.; Wu, Z.; Shen, J.

    2014-04-01

    To manage increasingly complicated 3D city models, a framework based on a NoSQL database is proposed in this paper. The framework supports import and export of 3D city models according to international standards such as CityGML, KML/COLLADA and X3D. We also suggest and implement 3D model analysis and visualization in the framework. For city model analysis, 3D geometry data and semantic information (such as name, height, area, price and so on) are stored and processed separately. We use a Map-Reduce method to deal with the 3D geometry data since it is more complex, while the semantic analysis is mainly based on database query operations. For visualization, a multiple-representation 3D city structure, CityTree, is implemented within the framework to support dynamic LODs based on the user viewpoint. Also, the proposed framework is easily extensible and supports geoindexes to speed up querying. Our experimental results show that the proposed 3D city management system can efficiently fulfil the analysis and visualization requirements.
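
    The Map-Reduce treatment of geometry data can be sketched in miniature: per-building geometry is mapped to partial results (as would happen on separate database shards) and the partials are reduced to a city-wide aggregate. The record layout and building IDs below are hypothetical.

```python
from functools import reduce

# hypothetical records: building id -> list of triangle areas (m^2)
city_geometry = {
    "bldg_001": [12.5, 30.0, 7.5],
    "bldg_002": [45.0, 5.0],
    "bldg_003": [20.0, 20.0, 10.0],
}

# map step: compute a per-building partial result independently
mapped = [(bid, sum(triangles)) for bid, triangles in city_geometry.items()]

# reduce step: combine the partial results into a city-wide total
total_area = reduce(lambda acc, kv: acc + kv[1], mapped, 0.0)
```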

  6. A Regularized Volumetric Fusion Framework for Large-Scale 3D Reconstruction

    NASA Astrophysics Data System (ADS)

    Rajput, Asif; Funk, Eugen; Börner, Anko; Hellwich, Olaf

    2018-07-01

    Modern computational resources combined with low-cost depth sensing systems have enabled mobile robots to reconstruct 3D models of surrounding environments in real time. Unfortunately, low-cost depth sensors are prone to produce undesirable estimation noise in depth measurements, which results in depth outliers or surface deformations in the reconstructed model. Conventional 3D fusion frameworks integrate multiple error-prone depth measurements over time to reduce noise effects; therefore, additional constraints such as steady sensor movement and high frame rates are required for high-quality 3D models. In this paper we propose a generic 3D fusion framework with a controlled regularization parameter which inherently reduces noise at the time of data fusion. This allows the proposed framework to generate high-quality 3D models without enforcing additional constraints. Evaluation of the reconstructed 3D models shows that the proposed framework outperforms state-of-the-art techniques in terms of both absolute reconstruction error and processing time.

  7. A Unified Framework for Complex Networks with Degree Trichotomy Based on Markov Chains.

    PubMed

    Hui, David Shui Wing; Chen, Yi-Chao; Zhang, Gong; Wu, Weijie; Chen, Guanrong; Lui, John C S; Li, Yingtao

    2017-06-16

    This paper establishes a Markov chain model as a unified framework for describing the evolution processes in complex networks. The unique feature of the proposed model is its capability of addressing the formation mechanism that can reflect the "trichotomy" observed in degree distributions, based on which closed-form solutions can be derived. Important special cases of the proposed unified framework are the classical models, including Poisson, exponential, and power-law distributed networks. Both simulation and experimental results demonstrate a good match of the proposed model with real datasets, showing its superiority over the classical models. Implications of the model for various applications, including citation analysis, online social networks, and vehicular network design, are also discussed in the paper.
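
    As a minimal illustration of the Markov chain view of network evolution, the stationary distribution of a small chain over degree classes can be obtained by power iteration. The three-state chain and its rates below are invented; the paper's closed-form solutions are not reproduced here.

```python
def stationary_distribution(P, iters=1000):
    """Power-iterate a row-stochastic transition matrix P (list of lists)
    toward its stationary distribution pi, which satisfies pi = pi P."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

# toy 3-state chain over degree classes {low, mid, high}
P = [[0.7, 0.3, 0.0],
     [0.2, 0.6, 0.2],
     [0.0, 0.4, 0.6]]
pi = stationary_distribution(P)
# exact answer for this chain: pi = (4/13, 6/13, 3/13)
```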

  8. A Model Independent S/W Framework for Search-Based Software Testing

    PubMed Central

    Baik, Jongmoon

    2014-01-01

    In the Model-Based Testing (MBT) area, Search-Based Software Testing (SBST) has been employed to generate test cases from the model of a system under test. However, many types of models have been used in MBT. If the model type changes, all functions of a search technique must be reimplemented, even when the same search technique is applied, because the model types differ. It requires too much time and effort to implement the same algorithm over and over again. We propose a model-independent software framework for SBST, which can reduce such redundant work. The framework provides a reusable common software platform to reduce time and effort. The software framework not only presents design patterns to find test cases for a target model but also reduces development time by using common functions provided in the framework. We show the effectiveness and efficiency of the proposed framework with two case studies. The framework improves productivity by about 50% when changing the type of a model. PMID:25302314
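
    The core idea, keeping the search technique independent of the model type, can be sketched with a strategy-pattern interface plus per-model fitness adapters: the same search implementation is reused while only the adapter changes. All interfaces and fitness functions below are hypothetical, not the framework's actual API.

```python
from abc import ABC, abstractmethod

class SearchTechnique(ABC):
    """Search strategy decoupled from the model type, so switching models
    does not force reimplementing the algorithm."""
    @abstractmethod
    def search(self, fitness, candidates):
        ...

class HillClimbing(SearchTechnique):
    def search(self, fitness, candidates):
        # trivial stand-in: pick the fittest candidate in the pool
        return max(candidates, key=fitness)

def generate_test_case(model_fitness, technique, candidates):
    """The adapter `model_fitness` maps a model-specific test case to a
    fitness value; the technique itself never sees the model type."""
    return technique.search(model_fitness, candidates)

# two different "model types" reuse the same technique via adapters
state_machine_fitness = lambda tc: len(set(tc))  # e.g. distinct states covered
dataflow_fitness = lambda tc: sum(tc)            # e.g. def-use pairs reached
cands = [(1, 2, 2), (1, 2, 3), (3, 3, 3)]
best_sm = generate_test_case(state_machine_fitness, HillClimbing(), cands)
best_df = generate_test_case(dataflow_fitness, HillClimbing(), cands)
```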

  9. Generic framework for mining cellular automata models on protein-folding simulations.

    PubMed

    Diaz, N; Tischer, I

    2016-05-13

    Cellular automata model identification is an important way of building simplified simulation models. In this study, we describe a generic architectural framework to ease the development process of new metaheuristic-based algorithms for cellular automata model identification in protein-folding trajectories. Our framework was developed using a methodology based on design patterns that eases the development of new algorithms. The usefulness of the proposed framework is demonstrated by the implementation of four algorithms, able to obtain extremely precise cellular automata models of the protein-folding process with a protein contact map representation. Dynamic rules obtained by the proposed approach are discussed, and future uses for the new tool are outlined.

  10. An active monitoring method for flood events

    NASA Astrophysics Data System (ADS)

    Chen, Zeqiang; Chen, Nengcheng; Du, Wenying; Gong, Jianya

    2018-07-01

    Timely and active detection and monitoring of a flood event are critical for a quick response, effective decision-making and disaster reduction. To this end, this paper proposes an active service framework for flood monitoring based on Sensor Web services, together with an active model for the concrete implementation of the framework. The framework consists of two core components: active warning and active planning. The active warning component is based on a publish-subscribe mechanism implemented by the Sensor Event Service. The active planning component employs the Sensor Planning Service to control the execution of the schemes and models and to plan the model input data. The active model, called SMDSA, defines the quantitative calculation method for five elements, scheme, model, data, sensor, and auxiliary information, as well as their associations. Experimental monitoring of the Liangzi Lake flood in the summer of 2010 is conducted to test the proposed framework and model. The results show that 1) the proposed active service framework is efficient for timely and automated flood monitoring; 2) the active model, SMDSA, provides a quantitative calculation method that moves flood monitoring from manual intervention to automatic computation; and 3) as much preliminary work as possible should be done in advance to take full advantage of the active service framework and the active model.
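
    The active-warning component's publish-subscribe mechanism can be sketched as a minimal in-process event bus. This is a deliberate simplification: the actual Sensor Event Service is a web service interface, and the topic names, thresholds and payload fields below are invented.

```python
class EventBus:
    """Minimal publish-subscribe hub: subscribers register callbacks per
    topic; publishers push events without knowing who is listening."""
    def __init__(self):
        self._subscribers = {}

    def subscribe(self, topic, callback):
        self._subscribers.setdefault(topic, []).append(callback)

    def publish(self, topic, event):
        for callback in self._subscribers.get(topic, []):
            callback(event)

# a water-level subscription triggers an active flood warning
bus = EventBus()
warnings = []
bus.subscribe("water_level",
              lambda e: warnings.append(e) if e["level_m"] > 5.0 else None)
bus.publish("water_level", {"station": "Liangzi Lake", "level_m": 6.2})
bus.publish("water_level", {"station": "Liangzi Lake", "level_m": 4.1})
```

Only the first reading crosses the (invented) 5 m threshold, so exactly one warning event is delivered without any polling by the consumer.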

  11. Physiome-model-based state-space framework for cardiac deformation recovery.

    PubMed

    Wong, Ken C L; Zhang, Heye; Liu, Huafeng; Shi, Pengcheng

    2007-11-01

    To more reliably recover cardiac information from noise-corrupted, patient-specific measurements, it is essential to employ meaningful constraining models and adopt appropriate optimization criteria to couple the models with the measurements. Although biomechanical models have been extensively used for myocardial motion recovery with encouraging results, the passive nature of such constraints limits their ability to fully account for the deformation caused by active forces of the myocytes. To overcome such limitations, we propose to adopt a cardiac physiome model as the prior constraint for cardiac motion analysis. The cardiac physiome model comprises an electric wave propagation model, an electromechanical coupling model, and a biomechanical model, which are connected through cardiac system dynamics for a more complete description of the macroscopic cardiac physiology. Embedded within a multiframe state-space framework, the uncertainties of the model and the patient's measurements are systematically dealt with to arrive at optimal cardiac kinematic estimates and possibly beyond. Experiments have been conducted to compare our proposed cardiac-physiome-model-based framework with the solely biomechanical-model-based framework. The results show that our proposed framework recovers more accurate cardiac deformation from synthetic data and obtains more sensible estimates from real magnetic resonance image sequences. With the active components introduced by the cardiac physiome model, cardiac deformations recovered from a patient's medical images are more physiologically plausible.
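
    The multiframe state-space treatment of model and measurement uncertainties follows the same principle as Kalman-type filtering: each source of information is weighted by its uncertainty. A scalar sketch of that principle (vastly simpler than the cardiac setting; all noise levels below are illustrative):

```python
def kalman_1d(measurements, x0=0.0, p0=1.0, q=0.01, r=0.25):
    """Minimal scalar Kalman filter for a constant state: fuse the model
    prediction with each noisy measurement, weighting by uncertainty."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p += q                 # predict: model uncertainty grows
        k = p / (p + r)        # Kalman gain: trust in the measurement
        x += k * (z - x)       # update state with the innovation
        p *= (1 - k)           # updated (reduced) uncertainty
        estimates.append(x)
    return estimates

# noisy observations of a true value of 1.0
zs = [1.1, 0.9, 1.05, 0.95, 1.02, 0.98]
est = kalman_1d(zs)
```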

  12. Material and morphology parameter sensitivity analysis in particulate composite materials

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaoyu; Oskay, Caglar

    2017-12-01

    This manuscript presents a novel parameter sensitivity analysis framework for damage and failure modeling of particulate composite materials subjected to dynamic loading. The proposed framework employs global sensitivity analysis to study the variance in the failure response as a function of model parameters. In view of the computational complexity of performing thousands of detailed microstructural simulations to characterize sensitivities, Gaussian process (GP) surrogate modeling is incorporated into the framework. In order to capture the discontinuity in response surfaces, the GP models are integrated with a support vector machine classification algorithm that identifies the discontinuities within response surfaces. The proposed framework is employed to quantify variability and sensitivities in the failure response of polymer bonded particulate energetic materials under dynamic loads to material properties and morphological parameters that define the material microstructure. Particular emphasis is placed on the identification of sensitivity to interfaces between the polymer binder and the energetic particles. The proposed framework has been demonstrated to identify the most consequential material and morphological parameters under vibrational and impact loads.
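
    Variance-based global sensitivity analysis can be illustrated with a plain pick-freeze Monte Carlo estimate of first-order Sobol indices. The paper's GP surrogates and SVM classification of discontinuities are omitted here, and the toy response function is invented so the exact indices are known.

```python
import random

def sobol_first_order(f, dim, n=50000, seed=7):
    """Pick-freeze Monte Carlo estimate of first-order Sobol indices for a
    function of `dim` independent Uniform(0, 1) inputs."""
    rng = random.Random(seed)
    A = [[rng.random() for _ in range(dim)] for _ in range(n)]
    B = [[rng.random() for _ in range(dim)] for _ in range(n)]
    fA = [f(x) for x in A]
    mean = sum(fA) / n
    var = sum((y - mean) ** 2 for y in fA) / n
    indices = []
    for i in range(dim):
        # freeze coordinate i from A, resample all other coordinates from B
        fABi = [f(B[k][:i] + [A[k][i]] + B[k][i + 1:]) for k in range(n)]
        cov = sum(fA[k] * (fABi[k] - mean) for k in range(n)) / n
        indices.append(cov / var)
    return indices

# toy response: x1 carries 16/17 of the output variance, x2 only 1/17
S = sobol_first_order(lambda x: 4.0 * x[0] + x[1], dim=2)
```

The estimated indices rank the first input as far more consequential, which is the kind of conclusion the framework draws about material and morphological parameters.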

  13. Learning situation models in a smart home.

    PubMed

    Brdiczka, Oliver; Crowley, James L; Reignier, Patrick

    2009-02-01

    This paper addresses the problem of learning situation models for providing context-aware services. Context for modeling human behavior in a smart environment is represented by a situation model describing the environment, users, and their activities. A framework for acquiring and evolving different layers of a situation model in a smart environment is proposed. Different learning methods are presented as part of this framework: role detection per entity, unsupervised extraction of situations from multimodal data, supervised learning of situation representations, and evolution of a predefined situation model with feedback. The situation model serves as frame and support for the different methods, permitting them to stay within an intuitive declarative framework. The proposed methods have been integrated into a complete system for a smart home environment. The implementation is detailed, and two evaluations are conducted in the smart home environment. The obtained results validate the proposed approach.

  14. Multimodal Speaker Diarization.

    PubMed

    Noulas, A; Englebienne, G; Krose, B J A

    2012-01-01

    We present a novel probabilistic framework that fuses information coming from the audio and video modality to perform speaker diarization. The proposed framework is a Dynamic Bayesian Network (DBN) that is an extension of a factorial Hidden Markov Model (fHMM) and models the people appearing in an audiovisual recording as multimodal entities that generate observations in the audio stream, the video stream, and the joint audiovisual space. The framework is very robust to different contexts, makes no assumptions about the location of the recording equipment, and does not require labeled training data as it acquires the model parameters using the Expectation Maximization (EM) algorithm. We apply the proposed model to two meeting videos and a news broadcast video, all of which come from publicly available data sets. The results acquired in speaker diarization are in favor of the proposed multimodal framework, which outperforms the single modality analysis results and improves over the state-of-the-art audio-based speaker diarization.

  15. College-"Conocimiento": Toward an Interdisciplinary College Choice Framework for Latinx Students

    ERIC Educational Resources Information Center

    Acevedo-Gil, Nancy

    2017-01-01

    This paper builds upon Perna's college choice model by integrating Anzaldúa's theory of "conocimiento" to propose an interdisciplinary college choice framework for Latinx students. Using previous literature, this paper proposes college-"conocimiento" as a framework that contextualizes Latinx student college choices within the…

  16. Translation from UML to Markov Model: A Performance Modeling Framework

    NASA Astrophysics Data System (ADS)

    Khan, Razib Hayat; Heegaard, Poul E.

    Performance engineering focuses on the quantitative investigation of the behavior of a system during the early phases of the system development life cycle. Bearing this in mind, we delineate a performance modeling framework for communication system applications that proposes a translation process from high-level UML notation to a Continuous Time Markov Chain (CTMC) model and solves the model for relevant performance metrics. The framework utilizes UML collaborations, activity diagrams and deployment diagrams to generate a performance model for a communication system. The system dynamics are captured by UML collaboration and activity diagrams as reusable specification building blocks, while the deployment diagram highlights the components of the system. The collaboration and activity diagrams show how reusable building blocks, in the form of collaborations, can compose the service components through input and output pins by highlighting the behavior of the components; a mapping between the collaborations and the system components identified by the deployment diagram is then delineated. Moreover, the UML models are annotated to associate performance-related quality of service (QoS) information, which is necessary for solving the performance model for relevant performance metrics through our proposed framework. The applicability of our proposed performance modeling framework in performance evaluation is delineated in the context of modeling a communication system.

  17. Enterprise application architecture development based on DoDAF and TOGAF

    NASA Astrophysics Data System (ADS)

    Tao, Zhi-Gang; Luo, Yun-Feng; Chen, Chang-Xin; Wang, Ming-Zhe; Ni, Feng

    2017-05-01

    For the purpose of supporting the design and analysis of enterprise application architecture, here we report a tailored enterprise application architecture description framework and its corresponding design method. The presented framework can effectively support service-oriented architecting and cloud computing by creating the metadata model based on the architecture content framework (ACF), the DoDAF metamodel (DM2) and the Cloud Computing Modelling Notation (CCMN). The framework also makes an effort to extend and improve the mapping between The Open Group Architecture Framework (TOGAF) application architectural inputs/outputs, deliverables and Department of Defense Architecture Framework (DoDAF)-described models. The roadmap of 52 DoDAF-described models is constructed by creating the metamodels of these described models and analysing the constraint relationships among the metamodels. By combining the tailored framework and the roadmap, this article proposes a service-oriented enterprise application architecture development process. Finally, a case study is presented to illustrate the results of implementing the tailored framework in the Southern Base Management Support and Information Platform construction project using the development process proposed by the paper.

  18. Using subject-specific three-dimensional (3D) anthropometry data in digital human modelling: case study in hand motion simulation.

    PubMed

    Tsao, Liuxing; Ma, Liang

    2016-11-01

    Digital human modelling enables ergonomists and designers to consider ergonomic concerns and design alternatives in a timely and cost-efficient manner in the early stages of design. However, the reliability of the simulation could be limited due to the percentile-based approach used in constructing the digital human model. To enhance the accuracy of the size and shape of the models, we proposed a framework to generate digital human models using three-dimensional (3D) anthropometric data. The 3D scan data from specific subjects' hands were segmented based on the estimated centres of rotation. The segments were then driven in forward kinematics to perform several functional postures. The constructed hand models were then verified, thereby validating the feasibility of the framework. The proposed framework helps generate accurate subject-specific digital human models, which can be utilised to guide product design and workspace arrangement. Practitioner Summary: Subject-specific digital human models can be constructed under the proposed framework based on three-dimensional (3D) anthropometry. This approach enables more reliable digital human simulation to guide product design and workspace arrangement.

  19. A Framework of Multi Objectives Negotiation for Dynamic Supply Chain Model

    NASA Astrophysics Data System (ADS)

    Chai, Jia Yee; Sakaguchi, Tatsuhiko; Shirase, Keiichi

    Trends of globalization and advances in Information Technology (IT) have created opportunities in collaborative manufacturing across national borders. A dynamic supply chain utilizes these advances to enable more flexibility in business cooperation. This research proposes a concurrent decision-making framework for a three-echelon dynamic supply chain model. The dynamic supply chain is formed by autonomous negotiation among agents based on a multi-agent approach. Instead of generating negotiation aspects (such as amount, price and due date) arbitrarily, this framework proposes to utilize the information available at the operational level of an organization in order to generate realistic negotiation aspects. The effectiveness of the proposed model is demonstrated by various case studies.

  20. Common and Innovative Visuals: A sparsity modeling framework for video.

    PubMed

    Abdolhosseini Moghadam, Abdolreza; Kumar, Mrityunjay; Radha, Hayder

    2014-05-02

    Efficient video representation models are critical for many video analysis and processing tasks. In this paper, we present a framework based on the concept of finding the sparsest solution to model video frames. To model the spatio-temporal information, frames from one scene are decomposed into two components: (i) a common frame, which describes the visual information common to all the frames in the scene/segment, and (ii) a set of innovative frames, which depicts the dynamic behaviour of the scene. The proposed approach exploits and builds on recent results in the field of compressed sensing to jointly estimate the common frame and the innovative frames for each video segment. We refer to the proposed modeling framework as CIV (Common and Innovative Visuals). We show how the proposed model can be utilized to find scene change boundaries and extend CIV to videos from multiple scenes. Furthermore, the proposed model is robust to noise and can be used for various video processing applications without relying on motion estimation and detection or image segmentation. Results for object tracking, video editing (object removal, inpainting) and scene change detection are presented to demonstrate the efficiency and the performance of the proposed model.
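
    The common/innovative split can be mimicked cheaply with a per-pixel median as the common frame and residuals as the innovative frames. This is a simplified stand-in for CIV's joint compressed-sensing estimation, but it shows why the innovative component is naturally sparse when the scene is mostly static.

```python
import numpy as np

def decompose(frames):
    """Split a scene's frames into a common component (per-pixel median,
    a crude surrogate for the sparsest-solution estimate) and innovative
    residuals that carry the scene's dynamics."""
    stack = np.stack(frames)
    common = np.median(stack, axis=0)
    innovative = stack - common
    return common, innovative

# toy "video": static dark background with one bright moving pixel
frames = []
for t in range(3):
    f = np.zeros((4, 4))
    f[0, t] = 1.0  # the moving object
    frames.append(f)

common, innovative = decompose(frames)
```

Here the common frame is the empty background and each innovative frame contains a single nonzero pixel, i.e. exactly the moving object.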

  1. Evolutionary game based control for biological systems with applications in drug delivery.

    PubMed

    Li, Xiaobo; Lenaghan, Scott C; Zhang, Mingjun

    2013-06-07

    Control engineering and analysis of biological systems have become increasingly important for systems and synthetic biology. Unfortunately, no widely accepted control framework is currently available for these systems, especially at the cell and molecular levels. This is partially due to the lack of appropriate mathematical models to describe the unique dynamics of biological systems, and the lack of implementation techniques, such as ultra-fast and ultra-small devices and corresponding control algorithms. This paper proposes a control framework for biological systems subject to dynamics that exhibit adaptive behavior under evolutionary pressures. The control framework was formulated based on evolutionary game based modeling, which integrates both the internal dynamics and the population dynamics. In the proposed control framework, the adaptive behavior was characterized as an internal dynamic, and the external environment was regarded as an external control input. The proposed open-interface control framework can be integrated with additional control algorithms for control of biological systems. To demonstrate the effectiveness of the proposed framework, an optimal control strategy was developed and validated for drug delivery using the pathogen Giardia lamblia as a test case. In principle, the proposed control framework can be applied to any biological system exhibiting adaptive behavior under evolutionary pressures. Copyright © 2013 Elsevier Ltd. All rights reserved.

  2. The Development of a Proposed Global Work-Integrated Learning Framework

    ERIC Educational Resources Information Center

    McRae, Norah; Johnston, Nancy

    2016-01-01

    Building on the work completed in BC that resulted in the development of a WIL Matrix for comparing and contrasting various forms of WIL with the Canadian co-op model, this paper proposes a Global Work-Integrated Learning Framework that allows for the comparison of a variety of models of work-integrated learning found in the international…

  3. Modeling Nonlinear Change via Latent Change and Latent Acceleration Frameworks: Examining Velocity and Acceleration of Growth Trajectories

    ERIC Educational Resources Information Center

    Grimm, Kevin; Zhang, Zhiyong; Hamagami, Fumiaki; Mazzocco, Michele

    2013-01-01

    We propose the use of the latent change and latent acceleration frameworks for modeling nonlinear growth in structural equation models. Moving to these frameworks allows for the direct identification of "rates of change" and "acceleration" in latent growth curves--information available indirectly through traditional growth…

  4. An integrated framework for detecting suspicious behaviors in video surveillance

    NASA Astrophysics Data System (ADS)

    Zin, Thi Thi; Tin, Pyke; Hama, Hiromitsu; Toriu, Takashi

    2014-03-01

    In this paper, we propose an integrated framework for detecting suspicious behaviors in video surveillance systems established in public places such as railway stations, airports and shopping malls. In particular, people loitering suspiciously, unattended objects left behind and suspicious objects exchanged between persons are common security concerns in airports and other transit scenarios. These involve understanding the scene/event, analyzing human movements, recognizing controllable objects, and observing the effect of the human movement on those objects. In the proposed framework, a multiple-background modeling technique, a high-level motion feature extraction method and embedded Markov chain models are integrated for detecting suspicious behaviors in real-time video surveillance systems. Specifically, the proposed framework employs a probability-based multiple-background modeling technique to detect moving objects. Then the velocity and distance measures are computed as the high-level motion features of interest. By integrating the computed features with the first passage time probabilities of the embedded Markov chain, the suspicious behaviors in video surveillance are analyzed for detecting loitering persons, objects left behind and human interactions such as fighting. The proposed framework has been tested using standard public datasets and our own video surveillance scenarios.
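
    The first passage time idea can be illustrated directly: for a small chain over behavior states, the expected number of steps to first reach a flagged state solves a linear system. The states and transition rates below are invented, not taken from the paper.

```python
import numpy as np

def expected_first_passage(P, target):
    """Expected number of steps to first reach `target` from each state of
    a chain with row-stochastic transition matrix P, via (I - Q) h = 1
    where Q restricts P to the non-target states."""
    n = len(P)
    others = [i for i in range(n) if i != target]
    Q = np.array([[P[i][j] for j in others] for i in others])
    h = np.linalg.solve(np.eye(len(others)) - Q, np.ones(len(others)))
    result = {target: 0.0}
    for idx, i in enumerate(others):
        result[i] = h[idx]
    return result

# states: 0 = passing through, 1 = lingering, 2 = loitering threshold crossed
P = [[0.8, 0.2, 0.0],
     [0.3, 0.5, 0.2],
     [0.0, 0.0, 1.0]]
h = expected_first_passage(P, target=2)
```

A short observed passage time relative to these expectations is the kind of statistical evidence such a detector can flag.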

  5. Locomotion Dynamics for Bio-inspired Robots with Soft Appendages: Application to Flapping Flight and Passive Swimming

    NASA Astrophysics Data System (ADS)

    Boyer, Frédéric; Porez, Mathieu; Morsli, Ferhat; Morel, Yannick

    2017-08-01

    In animal locomotion, whether in fish or in flying insects, the use of flexible terminal organs or appendages greatly improves locomotion performance (thrust and lift). In this article, we propose a general unified framework for modeling and simulating the (bio-inspired) locomotion of robots using soft organs. The proposed approach is based on the model of Mobile Multibody Systems (MMS). The distributed flexibilities are modeled according to two major approaches: the Floating Frame Approach (FFA) and the Geometrically Exact Approach (GEA). Encompassing these two approaches in the Newton-Euler modeling formalism of robotics, this article proposes a single modeling framework suited to the fast numerical integration of the dynamics of an MMS in both the FFA and the GEA. This general framework is applied to two illustrative examples drawn from bio-inspired locomotion: passive swimming in a von Karman vortex street, and hovering flight with flexible flapping wings.

  6. Load Model Verification, Validation and Calibration Framework by Statistical Analysis on Field Data

    NASA Astrophysics Data System (ADS)

    Jiao, Xiangqing; Liao, Yuan; Nguyen, Thai

    2017-11-01

    Accurate load models are critical for power system analysis and operation. A large amount of research has been done on load modeling. Most existing research focuses on developing load models, while little has been done on formal load model verification and validation (V&V) methodologies or procedures. Most existing load model validation is based on qualitative rather than quantitative analysis, and not all aspects of the model V&V problem have been addressed by existing approaches. To complement existing methods, this paper proposes a novel load model verification and validation framework that can systematically and more comprehensively examine a load model's effectiveness and accuracy. Statistical analysis, instead of visual checks, quantifies the load model's accuracy and provides model users with a confidence level for the developed load model. The analysis results can also be used to calibrate load models. The proposed framework can serve as guidance for utility engineers and researchers in systematically examining load models. The proposed method is demonstrated through analysis of field measurements collected from a utility system.
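
    The abstract does not spell out which statistics are used; a minimal sketch of the kind of quantitative check it describes (RMSE, range-normalized RMSE, and a confidence interval on the mean error) might look like this. The metric choices are illustrative assumptions, not the paper's method:

```python
import numpy as np

def validate_model(measured, simulated, z=1.96):
    """Quantify model accuracy against field measurements:
    RMSE, range-normalized RMSE, and a 95% CI on the mean error (bias)."""
    measured = np.asarray(measured, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    err = simulated - measured
    rmse = float(np.sqrt(np.mean(err ** 2)))
    nrmse = rmse / float(measured.max() - measured.min())
    bias = float(err.mean())
    se = float(err.std(ddof=1)) / np.sqrt(len(err))
    return {"rmse": rmse, "nrmse": nrmse,
            "bias_ci95": (bias - z * se, bias + z * se)}
```

    A model whose bias confidence interval excludes zero, for example, would be a candidate for calibration.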

  7. A unified framework for image retrieval using keyword and visual features.

    PubMed

    Jing, Feng; Li, Mingling; Zhang, Hong-Jiang; Zhang, Bo

    2005-07-01

    In this paper, a unified image retrieval framework based on both keyword annotations and visual features is proposed. In this framework, a set of statistical models is built from the visual features of a small set of manually labeled images to represent semantic concepts, and these models are used to propagate keywords to other, unlabeled images. The models are updated periodically as more images, implicitly labeled by users through relevance feedback, become available. In this sense, the keyword models serve to accumulate and memorize knowledge learned from user-provided relevance feedback. Furthermore, two sets of effective and efficient similarity measures and relevance feedback schemes are proposed for the query-by-keyword and query-by-image-example scenarios, respectively. Keyword models are combined with visual features in these schemes. In particular, a new entropy-based active learning strategy is introduced to improve the efficiency of relevance feedback for query by keyword, and a new algorithm is proposed to estimate the keyword features of the search concept for query by image example; the latter is shown to be more appropriate than two existing relevance feedback algorithms. Experimental results demonstrate the effectiveness of the proposed framework.
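
    The entropy-based active learning idea (ask the user about the images the system is least certain about) can be sketched as follows. The probability rows and selection size are illustrative, not the paper's data:

```python
import numpy as np

def entropy(p, eps=1e-12):
    """Shannon entropy of a discrete distribution (natural log)."""
    p = np.asarray(p, dtype=float)
    return float(-np.sum(p * np.log(p + eps)))

def select_for_feedback(keyword_probs, k):
    """Pick the k images whose predicted keyword distributions have the
    highest entropy, i.e. where user feedback is most informative."""
    scores = np.array([entropy(row) for row in keyword_probs])
    return np.argsort(scores)[::-1][:k].tolist()
```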

  8. A Spatiotemporal Prediction Framework for Air Pollution Based on Deep RNN

    NASA Astrophysics Data System (ADS)

    Fan, J.; Li, Q.; Hou, J.; Feng, X.; Karimian, H.; Lin, S.

    2017-10-01

    Time series data in practical applications often contain missing values due to sensor malfunction, network failure, outliers, etc. To handle missing values in time series, and to address the lack of temporal modeling in conventional machine learning models, we propose a spatiotemporal prediction framework based on missing-value processing algorithms and a deep recurrent neural network (DRNN). Using a missing tag and a missing interval to represent time series patterns, we implement three different missing-value fixing algorithms, which are further incorporated into a deep neural network consisting of LSTM (Long Short-Term Memory) layers and fully connected layers. Real-world air quality and meteorological datasets (Jingjinji area, China) are used for model training and testing. Deep feed-forward neural networks (DFNN) and gradient boosting decision trees (GBDT) are trained as baseline models against the proposed DRNN. The performances of the three missing-value fixing algorithms, as well as of the different machine learning models, are evaluated and analysed. Experiments show that the proposed DRNN framework outperforms both DFNN and GBDT, validating its capacity. Our results also provide useful insights for better understanding the different strategies for handling missing values.
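
    The abstract does not define the missing tag and missing interval precisely; one plausible encoding, in which each step carries a flag for missingness and a count of steps since the last observation, is sketched below. The tuple layout is an assumption for illustration:

```python
def missing_features(series):
    """For each time step, emit (value, missing_tag, missing_interval):
    tag = 1 when the value is missing, interval = number of steps since
    the last observed value (0 at an observed step). Missing values are
    placed at 0.0 so a downstream network can rely on the tag instead."""
    out, gap = [], 0
    for v in series:
        if v is None:
            gap += 1
            out.append((0.0, 1, gap))
        else:
            gap = 0
            out.append((float(v), 0, 0))
    return out
```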

  9. Proposed evaluation framework for assessing operator performance with multisensor displays

    NASA Technical Reports Server (NTRS)

    Foyle, David C.

    1992-01-01

    Despite aggressive work on the development of sensor fusion algorithms and techniques, no formal evaluation procedures have been proposed. Based on existing integration models in the literature, an evaluation framework is developed to assess an operator's ability to use multisensor, or sensor fusion, displays. The proposed evaluation framework is a normative approach: the operator's performance with the sensor fusion display is compared to the models' predictions derived from the operator's performance when viewing the original sensor displays prior to fusion. This allows one to determine when a sensor fusion system leads to: 1) poorer performance than one of the original sensor displays (clearly an undesirable system, in which the fused sensor system causes some distortion or interference); 2) better performance than with either single-sensor system alone, but at a sub-optimal level compared to the model predictions; 3) optimal performance (matching model predictions); or 4) super-optimal performance, which may occur if the operator is able to exploit highly diagnostic 'emergent features' in the sensor fusion display that were unavailable in the original sensor displays. An experiment demonstrating the usefulness of the proposed evaluation framework is discussed.
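
    The four outcome categories can be expressed as a simple decision rule. The score convention (higher is better) and the tolerance are assumptions for illustration, not part of the proposed framework:

```python
def classify_fusion_outcome(fused, singles, predicted, tol=1e-9):
    """Categorize fused-display performance against single-sensor
    baselines and a model prediction (all scores: higher is better)."""
    best_single = max(singles)
    if fused < best_single - tol:
        return "poorer than a single sensor"
    if fused < predicted - tol:
        return "better than singles, sub-optimal"
    if abs(fused - predicted) <= tol:
        return "optimal"
    return "super-optimal"
```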

  10. Stochastic filtering for damage identification through nonlinear structural finite element model updating

    NASA Astrophysics Data System (ADS)

    Astroza, Rodrigo; Ebrahimian, Hamed; Conte, Joel P.

    2015-03-01

    This paper describes a novel framework that combines advanced mechanics-based nonlinear (hysteretic) finite element (FE) models and stochastic filtering techniques to estimate unknown time-invariant parameters of the nonlinear inelastic material models used in the FE model. Using input-output data recorded during earthquake events, the proposed framework updates the nonlinear FE model of the structure. The updated FE model can be used directly for damage identification and further for damage prognosis. To update the unknown time-invariant parameters of the FE model, two alternative stochastic filtering methods are used: the extended Kalman filter (EKF) and the unscented Kalman filter (UKF). A three-dimensional, 5-story, 2-by-1 bay reinforced concrete (RC) frame is used to verify the proposed framework. The RC frame is modeled using fiber-section displacement-based beam-column elements with distributed plasticity and is subjected to the ground motion recorded at the Sylmar station during the 1994 Northridge earthquake. The results indicate that the proposed framework accurately estimates the unknown material parameters of the nonlinear FE model. The UKF outperforms the EKF when the relative root-mean-square errors of the recorded responses are compared. In addition, the results suggest that the convergence of the modeling parameter estimates is smoother and faster when the UKF is used.
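
    The core idea of treating a time-invariant parameter as a random-walk state and updating it with a Kalman filter can be illustrated with a deliberately scalar EKF. The measurement model h and all numbers below are toy assumptions, not the paper's FE model:

```python
import numpy as np

def ekf_parameter_update(theta, P, y, h, dh, Q=1e-6, R=1e-2):
    """One EKF step treating the unknown parameter as a random-walk state."""
    P = P + Q                          # time update (parameter ~ constant)
    H = dh(theta)                      # linearized measurement sensitivity
    S = H * P * H + R                  # innovation covariance
    K = P * H / S                      # Kalman gain
    theta = theta + K * (y - h(theta)) # measurement update
    P = (1.0 - K * H) * P
    return theta, P

# Toy problem: estimate theta from noisy observations of h(theta) = theta^2.
rng = np.random.default_rng(0)
true_theta = 2.0
h = lambda t: t ** 2
dh = lambda t: 2.0 * t
theta, P = 1.0, 1.0
for _ in range(200):
    y = h(true_theta) + rng.normal(0.0, 0.1)
    theta, P = ekf_parameter_update(theta, P, y, h, dh)
```

    In the paper's setting the "measurement" is the recorded structural response and h is the nonlinear FE model, with vector-valued states and covariances.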

  11. A Framework for Translating a High Level Security Policy into Low Level Security Mechanisms

    NASA Astrophysics Data System (ADS)

    Hassan, Ahmed A.; Bahgat, Waleed M.

    2010-01-01

    Security policies have different components; firewalls, active directories, and IDSs are some examples. Enforcing network security policies through low-level security mechanisms faces some essential difficulties, of which consistency, verification, and maintenance are the major ones. One approach to overcoming these difficulties is to automate the translation of high-level security policy into low-level security mechanisms. This paper introduces a framework for an automated process that translates a high-level security policy into low-level security mechanisms. The framework is described in terms of three phases. In the first phase, all network assets are categorized according to their roles in network security, and the relations between them are identified to constitute the network security model. This model is based on organization-based access control (OrBAC); however, it extends the OrBAC model to include not only access control policy but also other administrative security policies, such as auditing policy. In addition, the proposed model enables matching each rule of the high-level security policy with the corresponding rules of the low-level security policy. In the second phase, the high-level security policy is mapped onto the network security model; this phase can be considered a translation of the high-level security policy into an intermediate model level. Finally, the intermediate model level is translated automatically into low-level security mechanisms. The paper illustrates the applicability of the proposed approach through an application example.

  12. Large scale air pollution estimation method combining land use regression and chemical transport modeling in a geostatistical framework.

    PubMed

    Akita, Yasuyuki; Baldasano, Jose M; Beelen, Rob; Cirach, Marta; de Hoogh, Kees; Hoek, Gerard; Nieuwenhuijsen, Mark; Serre, Marc L; de Nazelle, Audrey

    2014-04-15

    In recognition that intraurban exposure gradients may be as large as between-city variations, recent air pollution epidemiologic studies have become increasingly interested in capturing within-city exposure gradients. In addition, because of the rapidly accumulating health data, recent studies also need to handle large study populations distributed over large geographic domains. Even though several modeling approaches have been introduced, a consistent modeling framework capturing within-city exposure variability and applicable to large geographic domains is still missing. To address these needs, we proposed a modeling framework based on the Bayesian Maximum Entropy method that integrates monitoring data and outputs from existing air quality models based on Land Use Regression (LUR) and Chemical Transport Models (CTM). The framework was applied to estimate the yearly average NO2 concentrations over the region of Catalunya in Spain. By jointly accounting for the global scale variability in the concentration from the output of CTM and the intraurban scale variability through LUR model output, the proposed framework outperformed more conventional approaches.

  13. Word-level language modeling for P300 spellers based on discriminative graphical models

    NASA Astrophysics Data System (ADS)

    Delgado Saa, Jaime F.; de Pesters, Adriana; McFarland, Dennis; Çetin, Müjdat

    2015-04-01

    Objective. In this work we propose a probabilistic graphical model framework that uses language priors at the level of words as a mechanism to increase the performance of P300-based spellers. Approach. This paper is concerned with brain-computer interfaces based on P300 spellers. Motivated by P300 spelling scenarios involving communication based on a limited vocabulary, we propose a probabilistic graphical model framework and an associated classification algorithm that uses learned statistical models of language at the level of words. Exploiting such high-level contextual information helps reduce the error rate of the speller. Main results. Our experimental results demonstrate that the proposed approach offers several advantages over existing methods. Most importantly, it increases the classification accuracy while reducing the number of times the letters need to be flashed, increasing the communication rate of the system. Significance. The proposed approach models all the variables in the P300 speller in a unified framework and has the capability to correct errors in previous letters in a word, given the data for the current one. The structure of the model we propose allows the use of efficient inference algorithms, which in turn makes it possible to use this approach in real-time applications.
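
    The benefit of a word-level language prior can be sketched with a simple scorer that combines per-letter evidence with a vocabulary prior; weak evidence for one letter can be overridden by the rest of the word. This is a naive Bayes style illustration, not the paper's graphical model; all scores and the vocabulary are invented:

```python
import math

def best_word(letter_scores, vocab, prior):
    """Pick the vocabulary word maximizing the sum of per-letter log
    evidence plus a log word prior. `letter_scores` is a list of dicts,
    one per letter position, mapping letters to P300 classifier scores."""
    def score(word):
        s = math.log(prior.get(word, 1e-9))
        for pos, ch in enumerate(word):
            s += math.log(letter_scores[pos].get(ch, 1e-9))
        return s
    return max(vocab, key=score)
```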

  14. Formulation of Higher Education Institutional Strategy Using Operational Research Approaches

    ERIC Educational Resources Information Center

    Labib, Ashraf; Read, Martin; Gladstone-Millar, Charlotte; Tonge, Richard; Smith, David

    2014-01-01

    In this paper a framework is proposed for the formulation of a higher education institutional (HEI) strategy. This work provides a practical example, through a case study, to demonstrate how the proposed framework can be applied to the issue of formulation of HEI strategy. The proposed hybrid model is based on two operational research…

  15. Computer-aided pulmonary image analysis in small animal models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Ziyue; Mansoor, Awais; Mollura, Daniel J.

    Purpose: To develop an automated pulmonary image analysis framework for infectious lung diseases in small animal models. Methods: The authors describe a novel pathological lung and airway segmentation method for small animals. The proposed framework includes identification of abnormal imaging patterns pertaining to infectious lung diseases. First, the authors’ system estimates an expected lung volume by utilizing a regression function between total lung capacity and approximated rib cage volume. A significant difference between the expected lung volume and the initial lung segmentation indicates the presence of severe pathology, and invokes a machine learning based abnormal imaging pattern detection system next. The final stage of the proposed framework is the automatic extraction of the airway tree, for which new affinity relationships within the fuzzy connectedness image segmentation framework are proposed by combining Hessian and gray-scale morphological reconstruction filters. Results: 133 CT scans were collected from four different studies encompassing a wide spectrum of pulmonary abnormalities pertaining to two commonly used small animal models (ferret and rabbit). Sensitivity and specificity were greater than 90% for pathological lung segmentation (average dice similarity coefficient > 0.9). While qualitative visual assessments of airway tree extraction were performed by the participating expert radiologists, for quantitative evaluation the authors validated the proposed airway extraction method by using the publicly available EXACT’09 data set. Conclusions: The authors developed a comprehensive computer-aided pulmonary image analysis framework for preclinical research applications. The proposed framework consists of automatic pathological lung segmentation and accurate airway tree extraction. The framework has high sensitivity and specificity; therefore, it can contribute advances in preclinical research in pulmonary diseases.
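
    The expected-volume check can be sketched as a one-line regression comparison. The linear form, coefficients, and tolerance below are hypothetical stand-ins for the fitted regression in the paper:

```python
def pathology_flag(rib_cage_vol, lung_vol, slope, intercept, rel_tol=0.2):
    """Flag severe pathology when the segmented lung volume falls short
    of the volume expected from a rib-cage-volume regression by more
    than rel_tol (relative deficit)."""
    expected = slope * rib_cage_vol + intercept
    deficit = (expected - lung_vol) / expected
    return deficit > rel_tol, expected
```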

  16. A methodology to model causal relationships on offshore safety assessment focusing on human and organizational factors.

    PubMed

    Ren, J; Jenkinson, I; Wang, J; Xu, D L; Yang, J B

    2008-01-01

    Focusing on people and organizations, this paper aims to contribute to offshore safety assessment by proposing a methodology to model causal relationships. The methodology is proposed in a general sense: it is capable of accommodating the modeling of the multiple risk factors considered in offshore operations, and it can deal with different types of data that may come from different sources. Reason's "Swiss cheese" model is used to form a generic offshore safety assessment framework, and Bayesian Networks (BN) are tailored to fit into the framework to construct a causal relationship model. The proposed framework uses a five-level structure to address latent failures within the causal sequence of events; the five levels are the Root causes, Trigger events, Incidents, Accidents, and Consequences levels. To analyze and model the safety of a specified offshore installation, a BN model was established following the guideline of the proposed five-level framework. A range of events was specified, and the related prior and conditional probabilities of the BN model were assigned based on the inherent characteristics of each event. This paper shows that Reason's "Swiss cheese" model and BN can be used jointly in offshore safety assessment. On the one hand, the five-level conceptual model is enhanced by BNs, which provide a graphical demonstration of inter-relationships and calculate numerical occurrence likelihoods for each failure event; the Bayesian inference mechanism also makes it possible to monitor how a safety situation changes as information flows forwards and backwards within the network. On the other hand, BN modeling relies heavily on experts' personal experience and is therefore highly domain specific; the "Swiss cheese" model, being grounded in solid behavioral theory, can provide industry with a roadmap for BN modeling and its implications. A case study of the collision risk between a Floating Production, Storage and Offloading (FPSO) unit and authorized vessels caused by human and organizational factors (HOFs) during operations is used to illustrate an industrial application of the proposed methodology.
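
    As a deliberately simplified illustration of how probabilities might propagate through the five levels, consider a serial chain in which each level occurs only if the previous one did. A real BN would have multiple parents per node and full conditional probability tables; all probabilities here are invented:

```python
def chain_risk(p_root, p_cond):
    """Probability of reaching each level of a Root -> Trigger ->
    Incident -> Accident -> Consequence chain, given the root-cause
    probability and one conditional probability per transition."""
    levels = ["root", "trigger", "incident", "accident", "consequence"]
    probs = {"root": p_root}
    p = p_root
    for level, c in zip(levels[1:], p_cond):
        p *= c               # P(level) = P(previous level) * P(level | previous)
        probs[level] = p
    return probs
```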

  17. ECG Denoising Using Marginalized Particle Extended Kalman Filter With an Automatic Particle Weighting Strategy.

    PubMed

    Hesar, Hamed Danandeh; Mohebbi, Maryam

    2017-05-01

    In this paper, a model-based Bayesian filtering framework called the "marginalized particle-extended Kalman filter (MP-EKF) algorithm" is proposed for electrocardiogram (ECG) denoising. Owing to its nonlinear framework, this algorithm does not share the extended Kalman filter (EKF) shortcoming in handling non-Gaussian nonstationary situations, and it has lower computational complexity than the particle filter. The filter improves ECG denoising performance by implementing a marginalized particle filter framework while reducing its computational complexity using the EKF framework. An automatic particle weighting strategy is also proposed that controls the reliance of the framework on the acquired measurements. We evaluated the proposed filter on several normal ECGs selected from the MIT-BIH normal sinus rhythm database. To do so, artificial white Gaussian and colored noises, as well as nonstationary real muscle artifact (MA) noise, over a range of low SNRs from 10 to -5 dB were added to these normal ECG segments. The benchmark methods were the EKF and extended Kalman smoother (EKS) algorithms, the first model-based Bayesian algorithms introduced in the field of ECG denoising. From an SNR viewpoint, the experiments showed that in the presence of white Gaussian noise, the proposed framework outperforms the EKF and EKS algorithms at lower input SNRs, where the measurements and state model are not reliable. Owing to its nonlinear framework and particle weighting strategy, the proposed algorithm attained better results at all input SNRs in non-Gaussian nonstationary situations (such as the presence of pink noise, brown noise, and real MA). In addition, the impact of the proposed filtering method on the distortion of diagnostic features of the ECG was investigated and compared with the EKF/EKS methods using an ECG diagnostic distortion measure called the "Multi-Scale Entropy Based Weighted Distortion Measure" or MSEWPRD. The results revealed that our proposed algorithm had the lowest MSEWPRD for all noise types at low input SNRs. Therefore, the morphology and diagnostic information of ECG signals were much better conserved than with the EKF/EKS frameworks, especially in non-Gaussian nonstationary situations.
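
    Adding noise "at a given SNR", as in the evaluation above, means scaling the noise power to the signal power. A small helper illustrating the arithmetic (not the paper's code; white Gaussian noise only) is:

```python
import numpy as np

def add_noise_at_snr(signal, snr_db, rng=None):
    """Add white Gaussian noise scaled so the result has the requested
    SNR in dB relative to the clean signal power."""
    rng = rng or np.random.default_rng(0)
    signal = np.asarray(signal, dtype=float)
    p_signal = np.mean(signal ** 2)
    p_noise = p_signal / 10 ** (snr_db / 10.0)  # SNR = 10*log10(Ps/Pn)
    noise = rng.normal(0.0, np.sqrt(p_noise), signal.shape)
    return signal + noise
```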

  18. Exploiting Acoustic and Syntactic Features for Automatic Prosody Labeling in a Maximum Entropy Framework

    PubMed Central

    Sridhar, Vivek Kumar Rangarajan; Bangalore, Srinivas; Narayanan, Shrikanth S.

    2009-01-01

    In this paper, we describe a maximum entropy-based automatic prosody labeling framework that exploits both language and speech information. We apply the proposed framework to both prominence and phrase structure detection within the Tones and Break Indices (ToBI) annotation scheme. Our framework utilizes novel syntactic features in the form of supertags and a quantized acoustic–prosodic feature representation that is similar to linear parameterizations of the prosodic contour. The proposed model is trained discriminatively and is robust in the selection of appropriate features for the task of prosody detection. The proposed maximum entropy acoustic–syntactic model achieves pitch accent and boundary tone detection accuracies of 86.0% and 93.1% on the Boston University Radio News corpus, and, 79.8% and 90.3% on the Boston Directions corpus. The phrase structure detection through prosodic break index labeling provides accuracies of 84% and 87% on the two corpora, respectively. The reported results are significantly better than previously reported results and demonstrate the strength of maximum entropy model in jointly modeling simple lexical, syntactic, and acoustic features for automatic prosody labeling. PMID:19603083

  19. A Formal Theory for Modular ERDF Ontologies

    NASA Astrophysics Data System (ADS)

    Analyti, Anastasia; Antoniou, Grigoris; Damásio, Carlos Viegas

    The Semantic Web cannot succeed without some form of modularity, encapsulation, and access control. In an earlier paper, we extended RDF graphs with weak and strong negation, as well as derivation rules, and defined the ERDF #n-stable model semantics of the extended RDF framework (ERDF), extending RDF(S) semantics. In this paper, we propose a framework for modular ERDF ontologies, called the modular ERDF framework, which enables collaborative reasoning over a set of ERDF ontologies while also providing support for hidden knowledge. In particular, the modular ERDF stable model semantics of modular ERDF ontologies is defined, extending the ERDF #n-stable model semantics. Our proposed framework supports local semantics and different points of view, local closed-world and open-world assumptions, and scoped negation-as-failure. Several complexity results are provided.

  20. A Framework for Curriculum Research.

    ERIC Educational Resources Information Center

    Kimpston, Richard D.; Rogers, Karen B.

    1986-01-01

    A framework for generating curriculum research is proposed from a synthesis of Dunkin and Biddle's model of teaching variables with Beauchamp's "curriculum system" planning functions. The framework systematically defines variables that delineate curriculum planning processes. (CJH)

  1. A Framework for Studying Minority Youths' Transitions to Fatherhood: The Case of Puerto Rican Adolescents

    ERIC Educational Resources Information Center

    Erkut, Sumru; Szalacha, Laura A.; Coll, Cynthia Garcia

    2005-01-01

    A theoretical framework is proposed for studying minority young men's involvement with their babies that combines the integrative model of minority youth development and a life course developmental perspective with Lamb's revised four-factor model of father involvement. This framework posits a relationship between demographic and family background…

  2. Conceptual Modeling Framework for E-Area PA HELP Infiltration Model Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dyer, J. A.

    A conceptual modeling framework based on the proposed E-Area Low-Level Waste Facility (LLWF) closure cap design is presented for conducting Hydrologic Evaluation of Landfill Performance (HELP) model simulations of intact and subsided cap infiltration scenarios for the next E-Area Performance Assessment (PA).

  3. Tracking Skill Acquisition with Cognitive Diagnosis Models: A Higher-Order, Hidden Markov Model with Covariates

    ERIC Educational Resources Information Center

    Wang, Shiyu; Yang, Yan; Culpepper, Steven Andrew; Douglas, Jeffrey A.

    2018-01-01

    A family of learning models that integrates a cognitive diagnostic model and a higher-order, hidden Markov model in one framework is proposed. This new framework includes covariates to model skill transition in the learning environment. A Bayesian formulation is adopted to estimate parameters from a learning model. The developed methods are…

  4. Cross-cultural re-entry for missionaries: a new application for the Dual Process Model.

    PubMed

    Selby, Susan; Clark, Sheila; Braunack-Mayer, Annette; Jones, Alison; Moulding, Nicole; Beilby, Justin

    Nearly half a million foreign aid workers currently work worldwide, including over 140,000 missionaries. During re-entry these workers may experience significant psychological distress. This article positions previous research about psychological distress during re-entry, emphasizing loss and grief. At present there is no identifiable theoretical framework to provide a basis for assessment, management, and prevention of re-entry distress in the clinical setting. The development of theoretical concepts and frameworks surrounding loss and grief including the Dual Process Model (DPM) are discussed. All the parameters of the DPM have been shown to be appropriate for the proposed re-entry model, the Dual Process Model applied to Re-entry (DPMR). It is proposed that the DPMR is an appropriate framework to address the processes and strategies of managing re-entry loss and grief. Possible future clinical applications and limitations of the proposed model are discussed. The DPMR is offered for further validation and use in clinical practice.

  5. A framework for service enterprise workflow simulation with multi-agents cooperation

    NASA Astrophysics Data System (ADS)

    Tan, Wenan; Xu, Wei; Yang, Fujun; Xu, Lida; Jiang, Chuanqun

    2013-11-01

    Dynamic process modelling for service businesses is the key technique for service-oriented information systems and service business management, and the workflow model of business processes is the core part of service systems. Service business workflow simulation is the prevalent approach to analysing service business processes dynamically. The generic method for service business workflow simulation is based on discrete-event queuing theory, which lacks flexibility and scalability. In this paper, we propose a service-workflow-oriented framework for the process simulation of service businesses that uses multi-agent cooperation to address these issues. The social rationality of agents is introduced into the proposed framework: by adopting rationality as a social factor in decision-making strategies, flexible scheduling of activity instances is implemented. A system prototype has been developed to validate the proposed simulation framework through a business case study.
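
    One way rationality could enter an activity-dispatch decision is as a weight blending an agent's self-interested capability score with a social factor such as load balancing. The bid formula, agent fields, and weights below are entirely hypothetical, sketched only to make the idea concrete:

```python
def assign_activities(activities, agents):
    """Greedy dispatch: each activity goes to the agent with the highest
    bid, where the bid blends capability (self-interest) with a social
    load-balancing factor weighted by the agent's rationality."""
    schedule = {}
    load = {a["name"]: 0 for a in agents}
    for act in activities:
        def bid(agent):
            utility = agent["capability"].get(act, 0.0)
            social = 1.0 / (1 + load[agent["name"]])  # prefer idle agents
            r = agent["rationality"]
            return r * social + (1 - r) * utility
        best = max(agents, key=bid)
        schedule[act] = best["name"]
        load[best["name"]] += 1
    return schedule
```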

  6. Helping Health Care Providers and Clinical Scientists Understand Apparently Irrational Policy Decisions.

    PubMed

    Demeter, Sandor J

    2016-12-21

    Health care providers (HCPs) and clinical scientists (CSs) are generally most comfortable using evidence-based rational decision-making models. They become very frustrated when policymakers make decisions that, on the surface, seem irrational and unreasonable; however, such decisions usually make sense when analysed properly. The goal of this paper is to provide a basic theoretical understanding of the major policy models, to illustrate which models are most prevalent in publicly funded health care systems, and to propose a policy analysis framework for better understanding the elements that drive policy decision-making. The proposed policy framework will also help HCPs and CSs achieve greater success with their own proposals.

  7. The Role of Interpersonal Relations in Healthcare Team Communication and Patient Safety: A Proposed Model of Interpersonal Process in Teamwork.

    PubMed

    Lee, Charlotte Tsz-Sum; Doran, Diane Marie

    2017-06-01

    Patient safety is compromised by medical errors and adverse events related to miscommunication among healthcare providers. Communication among healthcare providers is affected by human factors such as interpersonal relations, yet discussions of interpersonal relations and communication are lacking in the healthcare team literature. This paper proposes a theoretical framework that explains how interpersonal relations among healthcare team members affect communication and team performance outcomes such as patient safety. We synthesized studies from the health and social science disciplines to construct a theoretical framework that explicates the links among these constructs. From our synthesis, we identified two relevant theories: the framework on interpersonal processes based on the social relation model, and the theory of relational coordination. The former involves three steps: perception, evaluation, and feedback; the latter captures relational communicative behavior. We propose that manifestations of provider relations are embedded in the third step of the framework on interpersonal processes: feedback. Thus, varying team-member relationships lead to varying collaborative behavior, which affects patient-safety outcomes via a change in team communication. The proposed framework offers new perspectives for understanding how workplace relations affect healthcare team performance, and it can be used by nurses, administrators, and educators to improve patient safety and team communication, and to resolve conflicts.

  8. Theories and Frameworks for Online Education: Seeking an Integrated Model

    ERIC Educational Resources Information Center

    Picciano, Anthony G.

    2017-01-01

    This article examines theoretical frameworks and models that focus on the pedagogical aspects of online education. After a review of learning theory as applied to online education, a proposal for an integrated "Multimodal Model for Online Education" is provided based on pedagogical purpose. The model attempts to integrate the work of…

  9. Modeling asset price processes based on mean-field framework

    NASA Astrophysics Data System (ADS)

    Ieda, Masashi; Shiino, Masatoshi

    2011-12-01

    We propose a model of the dynamics of financial assets based on the mean-field framework. This framework allows us to construct a model that includes the interactions among financial assets, reflecting the market structure. Our study takes a microscopic approach to modeling the financial market. To demonstrate the effectiveness of our model concretely, we provide a case study: the pricing problem of a European call option with short-time-memory noise.
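
    A mean-field interaction can be sketched as each asset's log-price drifting toward the cross-sectional mean, simulated with the Euler-Maruyama scheme. The drift form and all parameters below are illustrative assumptions, not the authors' model:

```python
import numpy as np

def simulate_mean_field(n_assets=5, n_steps=100, dt=0.01,
                        mu=0.05, kappa=1.0, sigma=0.2, seed=0):
    """Euler-Maruyama simulation of log-prices that each revert toward
    the cross-sectional mean (the mean-field interaction term)."""
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, 0.1, n_assets)           # initial log-prices
    path = [x.copy()]
    for _ in range(n_steps):
        m = x.mean()                             # the mean field
        drift = mu + kappa * (m - x)             # pull toward the mean
        x = x + drift * dt + sigma * np.sqrt(dt) * rng.normal(size=n_assets)
        path.append(x.copy())
    return np.array(path)
```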

  10. Health level 7 development framework for medication administration.

    PubMed

    Kim, Hwa Sun; Cho, Hune

    2009-01-01

    We propose the creation of a standard data model for medication administration activities through the development of a clinical document architecture using the Health Level 7 Development Framework process, based on object-oriented analysis and the development method of Health Level 7 Version 3. Medication administration is the most common activity performed by clinical professionals in healthcare settings, and a standardized information model and structured hospital information system are necessary to achieve evidence-based clinical activities. A virtual scenario is used to demonstrate the proposed method of administering medication. We used the Health Level 7 Development Framework and other tools to create the clinical document architecture, which allowed us to illustrate each step of the framework in the administration of medication. We generated an information model of the medication administration process as one clinical activity. It should become a fundamental conceptual model that helps healthcare professionals and nursing practitioners understand international-standard methodology with the objective of modeling healthcare information systems.

  11. A Historical Forcing Ice Sheet Model Validation Framework for Greenland

    NASA Astrophysics Data System (ADS)

    Price, S. F.; Hoffman, M. J.; Howat, I. M.; Bonin, J. A.; Chambers, D. P.; Kalashnikova, I.; Neumann, T.; Nowicki, S.; Perego, M.; Salinger, A.

    2014-12-01

    We propose an ice sheet model testing and validation framework for Greenland for the years 2000 to the present. Following Perego et al. (2014), we start with a realistic ice sheet initial condition that is in quasi-equilibrium with climate forcing from the late 1990s. This initial condition is integrated forward in time while simultaneously applying (1) surface mass balance forcing (van Angelen et al., 2013) and (2) outlet glacier flux anomalies, defined using a new dataset of Greenland outlet glacier flux for the past decade (Enderlin et al., 2014). Modeled rates of mass and elevation change are compared directly to remote sensing observations obtained from GRACE and ICESat. Here, we present a detailed description of the proposed validation framework including the ice sheet model and model forcing approach, the model-to-observation comparison process, and initial results comparing model output and observations for the time period 2000-2013.

  12. Processing SPARQL queries with regular expressions in RDF databases

    PubMed Central

    2011-01-01

    Background As the Resource Description Framework (RDF) data model is widely used for modeling and sharing many online bioinformatics resources, such as Uniprot (dev.isb-sib.ch/projects/uniprot-rdf) or Bio2RDF (bio2rdf.org), SPARQL, the W3C-recommended query language for RDF databases, has become an important language for querying bioinformatics knowledge bases. Moreover, owing to the diversity of users' requests for extracting information from RDF data, as well as users' incomplete knowledge of the exact value of each fact in the RDF databases, it is desirable to use SPARQL queries with regular expression patterns for querying RDF data. To the best of our knowledge, no existing work efficiently supports regular expression processing in SPARQL over RDF databases. Most existing techniques for processing regular expressions are designed for querying a text corpus, or only support matching over paths in an RDF graph. Results In this paper, we propose a novel framework for supporting regular expression processing in SPARQL queries. Our contributions can be summarized as follows. 1) We propose an efficient framework for processing SPARQL queries with regular expression patterns in RDF databases. 2) We propose a cost model so that the proposed framework can be adopted by existing query optimizers. 3) We build a prototype of the proposed framework in C++ and conduct extensive experiments demonstrating the efficiency and effectiveness of our technique. Conclusions Experiments with a full-blown RDF engine show that our framework outperforms existing ones by up to two orders of magnitude in processing SPARQL queries with regular expression patterns. PMID:21489225
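The idea of regex-filtered pattern matching over triples can be sketched in miniature (a conceptual stand-in using an in-memory list, not the paper's RDF engine or its optimized algorithms; the triples and names below are invented for illustration):

```python
import re

# Tiny in-memory triple store (subject, predicate, object) -- a stand-in
# for an RDF database; the entries are illustrative, not from the paper.
triples = [
    ("uniprot:P01308", "rdfs:label", "Insulin"),
    ("uniprot:P68871", "rdfs:label", "Hemoglobin subunit beta"),
    ("uniprot:P01308", "up:organism", "Homo sapiens"),
]

def match(pattern_s, pattern_p, obj_regex):
    """Return objects of triples that match fixed s/p terms (None = wildcard)
    and whose object matches the regular expression -- the analogue of a
    SPARQL FILTER regex(?o, "...") clause."""
    rx = re.compile(obj_regex)
    return [o for (s, p, o) in triples
            if (pattern_s is None or s == pattern_s)
            and (pattern_p is None or p == pattern_p)
            and rx.search(o)]

# Analogue of: SELECT ?o WHERE { ?s rdfs:label ?o . FILTER regex(?o, "^Hemo") }
print(match(None, "rdfs:label", r"^Hemo"))  # → ['Hemoglobin subunit beta']
```

A real engine would evaluate the FILTER against candidate bindings produced by the triple-pattern matcher; the paper's contribution is doing this efficiently inside the engine rather than by naive post-filtering as here.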

  13. Processing SPARQL queries with regular expressions in RDF databases.

    PubMed

    Lee, Jinsoo; Pham, Minh-Duc; Lee, Jihwan; Han, Wook-Shin; Cho, Hune; Yu, Hwanjo; Lee, Jeong-Hoon

    2011-03-29

    As the Resource Description Framework (RDF) data model is widely used for modeling and sharing many online bioinformatics resources, such as Uniprot (dev.isb-sib.ch/projects/uniprot-rdf) or Bio2RDF (bio2rdf.org), SPARQL, the W3C-recommended query language for RDF databases, has become an important language for querying bioinformatics knowledge bases. Moreover, owing to the diversity of users' requests for extracting information from RDF data, as well as users' incomplete knowledge of the exact value of each fact in the RDF databases, it is desirable to use SPARQL queries with regular expression patterns for querying RDF data. To the best of our knowledge, no existing work efficiently supports regular expression processing in SPARQL over RDF databases. Most existing techniques for processing regular expressions are designed for querying a text corpus, or only support matching over paths in an RDF graph. In this paper, we propose a novel framework for supporting regular expression processing in SPARQL queries. Our contributions can be summarized as follows. 1) We propose an efficient framework for processing SPARQL queries with regular expression patterns in RDF databases. 2) We propose a cost model so that the proposed framework can be adopted by existing query optimizers. 3) We build a prototype of the proposed framework in C++ and conduct extensive experiments demonstrating the efficiency and effectiveness of our technique. Experiments with a full-blown RDF engine show that our framework outperforms existing ones by up to two orders of magnitude in processing SPARQL queries with regular expression patterns.

  14. A Framework for the Study of Emotions in Organizational Contexts.

    ERIC Educational Resources Information Center

    Fiebig, Greg V.; Kramer, Michael W.

    1998-01-01

    Approaches the study of emotions in organizations holistically, based on a proposed framework. Provides descriptive data that suggests the presence of the framework's major elements. States that future examination of emotions based on this framework should assist in understanding emotions, which are frequently ignored in a rational model. (PA)

  15. FRAMEWORK FOR EVALUATION OF PHYSIOLOGICALLY-BASED PHARMACOKINETIC MODELS FOR USE IN SAFETY OR RISK ASSESSMENT

    EPA Science Inventory

    ABSTRACT

    Proposed applications of increasingly sophisticated biologically-based computational models, such as physiologically-based pharmacokinetic (PBPK) models, raise the issue of how to evaluate whether the models are adequate for proposed uses including safety or risk ...

  16. Determination of sample size for higher volatile data using new framework of Box-Jenkins model with GARCH: A case study on gold price

    NASA Astrophysics Data System (ADS)

    Roslindar Yaziz, Siti; Zakaria, Roslinazairimah; Hura Ahmad, Maizah

    2017-09-01

    The Box-Jenkins - GARCH model has been shown to be a promising tool for forecasting highly volatile time series. In this study, a framework for determining the optimal sample size using the Box-Jenkins model with GARCH is proposed for practical application in analysing and forecasting highly volatile data. The proposed framework is applied to the daily world gold price series from 1971 to 2013. The data are divided into 12 different sample sizes (from 30 to 10200 observations), and each sample is tested using different combinations of the hybrid Box-Jenkins - GARCH model. Our study shows that the optimal sample size for forecasting the gold price using the framework of the hybrid model is 1250 observations, i.e. a 5-year sample. Hence, the empirical results of the model selection criteria and the 1-step-ahead forecasting evaluations suggest that the latest 12.25% (5 years) of the 10200 observations is sufficient for the Box-Jenkins - GARCH model, with forecasting performance similar to that obtained using the full 41-year data set.
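The sample-size question can be illustrated with a toy experiment (a synthetic AR(1) series and plain OLS, not the paper's Box-Jenkins - GARCH pipeline or the gold-price data): estimates from short trailing windows are noisy, while sufficiently long windows stabilize near the true coefficient, which is the trade-off the proposed framework quantifies.

```python
import random

random.seed(1)

# Synthetic AR(1) series standing in for a volatile financial series;
# the coefficient and length are illustrative assumptions.
phi_true, n = 0.6, 2000
series = [0.0]
for _ in range(n - 1):
    series.append(phi_true * series[-1] + random.gauss(0.0, 1.0))

def fit_ar1(window):
    """OLS estimate of the AR(1) coefficient on the last `window` points."""
    w = series[-window:]
    num = sum(w[t] * w[t - 1] for t in range(1, len(w)))
    den = sum(x * x for x in w[:-1])
    return num / den

# Larger trailing windows give estimates closer to the true coefficient.
for window in (30, 200, 1500):
    print(window, round(fit_ar1(window), 2))
```

The framework in the paper performs the analogous comparison with full model selection criteria and out-of-sample forecast evaluation for each candidate sample size.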

  17. A Kernel-Based Low-Rank (KLR) Model for Low-Dimensional Manifold Recovery in Highly Accelerated Dynamic MRI.

    PubMed

    Nakarmi, Ukash; Wang, Yanhua; Lyu, Jingyuan; Liang, Dong; Ying, Leslie

    2017-11-01

    While many low-rank and sparsity-based approaches have been developed for accelerated dynamic magnetic resonance imaging (dMRI), they all exploit low-rankness or sparsity in the input space, overlooking the intrinsic nonlinear correlation in most dMRI data. In this paper, we propose a kernel-based framework that allows nonlinear manifold models in reconstruction from sub-Nyquist data. Within this framework, many existing algorithms can be extended to kernel frameworks with nonlinear models. In particular, we have developed a novel algorithm with a kernel-based low-rank model generalizing the conventional low-rank formulation. The algorithm consists of manifold learning using a kernel, low-rank enforcement in the feature space, and preimaging with data consistency. Extensive simulation and experimental results show that the proposed method surpasses conventional low-rank-modeled approaches for dMRI.

  18. Using coupled hydrogeophysical models and data assimilation to enhance the information content in geoelectrical leak detection

    NASA Astrophysics Data System (ADS)

    Tso, C. H. M.; Johnson, T. C.; Song, X.; Chen, X.; Binley, A. M.

    2017-12-01

    Time-lapse electrical resistivity tomography (ERT) measurements provide indirect observations of hydrological processes in the Earth's shallow subsurface at high spatial and temporal resolutions. ERT has been used for decades to detect leaks and monitor the evolution of associated contaminant plumes, although only at a few hazardous environmental sites. Furthermore, assessment of uncertainty in such applications has thus far been neglected, despite the clear need to provide site managers with appropriate information for decision-making purposes. There is a need for a framework that allows leak detection with uncertainty assessment from geophysical observations; ideally, such a framework should allow the incorporation of additional data sources in order to reduce uncertainty in predictions. To tackle these issues, we propose an ensemble-based data assimilation framework that evaluates proposed hydrological models (e.g. different hydrogeological units, different leak locations and loads) against observed time-lapse ERT measurements. Each proposed hydrological model is run through the parallel coupled hydrogeophysical code PFLOTRAN-E4D (Johnson et al 2016) to obtain simulated ERT measurements, and the ensemble of model proposals is then updated based on data misfit. Our approach does not focus on obtaining detailed images of hydraulic properties or plume movement; rather, it seeks to estimate the contaminant mass discharge (CMD) across a user-defined plane in space probabilistically. This avoids the ambiguity of interpreting detailed hydrological processes from geophysical images, and the resulting distributions of CMD give a straightforward metric, with realistic uncertainty bounds, for decision making. The proposed framework is also computationally efficient, so it can exploit large, long-term ERT datasets, making it possible to track time-varying loadings of plume sources.
In this presentation, we illustrate our framework on synthetic data and on field data collected from an ERT trial simulating a leak at the Sellafield nuclear facility in the UK (Kuras et al 2016). We compare our results to interpretations from geophysical inversion and discuss the additional information that hydrological model proposals provide.
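The ensemble update described above can be sketched schematically (entirely hypothetical numbers; in the real framework, PFLOTRAN-E4D simulations replace the random "predictions" used here): each model proposal is weighted by a Gaussian likelihood of its data misfit, and the weighted ensemble yields a probabilistic CMD estimate.

```python
import math
import random

random.seed(2)

# Hypothetical ensemble: each model proposal carries a predicted geophysical
# datum (stand-in for simulated ERT data) and a contaminant mass discharge.
ensemble = [{"pred": random.gauss(10.0, 3.0), "cmd": random.uniform(0.0, 5.0)}
            for _ in range(500)]
observed, noise_sd = 12.0, 1.0

# Weight each proposal by a Gaussian likelihood of its data misfit,
# then form the weighted (posterior-like) CMD estimate and its spread.
weights = [math.exp(-0.5 * ((m["pred"] - observed) / noise_sd) ** 2)
           for m in ensemble]
total = sum(weights)
cmd_mean = sum(w * m["cmd"] for w, m in zip(weights, ensemble)) / total
cmd_var = sum(w * (m["cmd"] - cmd_mean) ** 2
              for w, m in zip(weights, ensemble)) / total
print(round(cmd_mean, 2), round(math.sqrt(cmd_var), 2))
```

The weighted spread is what delivers the "realistic uncertainty bounds" for decision making: proposals that poorly fit the ERT data contribute little to the CMD distribution.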

  19. An Epistemological Analysis of the Evolution of Didactical Activities in Teaching-Learning Sequences: The Case of Fluids. Special Issue

    ERIC Educational Resources Information Center

    Psillos, D.; Tselfes, Vassilis; Kariotoglou, Petros

    2004-01-01

    In the present paper we propose a theoretical framework for an epistemological modelling of teaching-learning (didactical) activities, which draws on recent studies of scientific practice. We present and analyse the framework, which includes three categories: namely, Cosmos-Evidence-Ideas (CEI). We also apply this framework in order to model a…

  20. CLAss-Specific Subspace Kernel Representations and Adaptive Margin Slack Minimization for Large Scale Classification.

    PubMed

    Yu, Yinan; Diamantaras, Konstantinos I; McKelvey, Tomas; Kung, Sun-Yuan

    2018-02-01

    In kernel-based classification models, given limited computational power and storage capacity, operations over the full kernel matrix become prohibitive. In this paper, we propose a new supervised learning framework using kernel models for sequential data processing. The framework is based on two components, both of which aim to enhance classification capability with a subset selection scheme. The first is a subspace projection technique in the reproducing kernel Hilbert space using a CLAss-specific Subspace Kernel representation for kernel approximation. The second is a novel structural risk minimization algorithm, called adaptive margin slack minimization, which iteratively improves classification accuracy through adaptive data selection. We motivate each part separately and then integrate them into learning frameworks for large-scale data. We propose two such frameworks: memory-efficient sequential processing for sequential data processing, and parallelized sequential processing for distributed computing with sequential data acquisition. We test our methods on several benchmark data sets and compare them with state-of-the-art techniques to verify the validity of the proposed techniques.

  1. Leveraging the Zachman framework implementation using action-research methodology - a case study: aligning the enterprise architecture and the business goals

    NASA Astrophysics Data System (ADS)

    Nogueira, Juan Manuel; Romero, David; Espadas, Javier; Molina, Arturo

    2013-02-01

    With the emergence of new enterprise models, such as technology-based enterprises, and the large quantity of information generated through technological advances, the Zachman framework continues to represent a modelling tool of great utility and value to construct an enterprise architecture (EA) that can integrate and align the IT infrastructure and business goals. Nevertheless, implementing an EA requires an important effort within an enterprise. Small technology-based enterprises and start-ups can take advantage of EAs and frameworks but, because these enterprises have limited resources to allocate for this task, an enterprise framework implementation is not feasible in most cases. This article proposes a new action-research-based methodology to assist and facilitate implementation of the business, system and technology models of the Zachman framework. Following an explanation of the cycles of the proposed methodology, a case study is presented to illustrate the results of implementing the Zachman framework in a technology-based enterprise, PyME CREATIVA, using the action-research approach.

  2. A 3D Human-Machine Integrated Design and Analysis Framework for Squat Exercises with a Smith Machine.

    PubMed

    Lee, Haerin; Jung, Moonki; Lee, Ki-Kwang; Lee, Sang Hun

    2017-02-06

    In this paper, we propose a three-dimensional design and evaluation framework and process based on a probabilistic motion synthesis algorithm and biomechanical analysis system for the design of the Smith machine and squat training programs. Moreover, we implemented a prototype system to validate the proposed framework. The framework consists of an integrated human-machine-environment model as well as a squat motion synthesis system and biomechanical analysis system. In the design and evaluation process, we created an integrated model in which interactions between a human body and machine or the ground are modeled as joints with constraints at contact points. Next, we generated Smith squat motion using the motion synthesis program based on a Gaussian process regression algorithm with a set of given values for independent variables. Then, using the biomechanical analysis system, we simulated joint moments and muscle activities from the input of the integrated model and squat motion. We validated the model and algorithm through physical experiments measuring the electromyography (EMG) signals, ground forces, and squat motions as well as through a biomechanical simulation of muscle forces. The proposed approach enables the incorporation of biomechanics in the design process and reduces the need for physical experiments and prototypes in the development of training programs and new Smith machines.

  3. A 3D Human-Machine Integrated Design and Analysis Framework for Squat Exercises with a Smith Machine

    PubMed Central

    Lee, Haerin; Jung, Moonki; Lee, Ki-Kwang; Lee, Sang Hun

    2017-01-01

    In this paper, we propose a three-dimensional design and evaluation framework and process based on a probabilistic motion synthesis algorithm and biomechanical analysis system for the design of the Smith machine and squat training programs. Moreover, we implemented a prototype system to validate the proposed framework. The framework consists of an integrated human–machine–environment model as well as a squat motion synthesis system and biomechanical analysis system. In the design and evaluation process, we created an integrated model in which interactions between a human body and machine or the ground are modeled as joints with constraints at contact points. Next, we generated Smith squat motion using the motion synthesis program based on a Gaussian process regression algorithm with a set of given values for independent variables. Then, using the biomechanical analysis system, we simulated joint moments and muscle activities from the input of the integrated model and squat motion. We validated the model and algorithm through physical experiments measuring the electromyography (EMG) signals, ground forces, and squat motions as well as through a biomechanical simulation of muscle forces. The proposed approach enables the incorporation of biomechanics in the design process and reduces the need for physical experiments and prototypes in the development of training programs and new Smith machines. PMID:28178184

  4. Consensus Statement on Electronic Health Predictive Analytics: A Guiding Framework to Address Challenges

    PubMed Central

    Amarasingham, Ruben; Audet, Anne-Marie J.; Bates, David W.; Glenn Cohen, I.; Entwistle, Martin; Escobar, G. J.; Liu, Vincent; Etheredge, Lynn; Lo, Bernard; Ohno-Machado, Lucila; Ram, Sudha; Saria, Suchi; Schilling, Lisa M.; Shahi, Anand; Stewart, Walter F.; Steyerberg, Ewout W.; Xie, Bin

    2016-01-01

    Context: The recent explosion in available electronic health record (EHR) data is motivating a rapid expansion of electronic health care predictive analytic (e-HPA) applications, defined as the use of electronic algorithms that forecast clinical events in real time with the intent to improve patient outcomes and reduce costs. There is an urgent need for a systematic framework to guide the development and application of e-HPA to ensure that the field develops in a scientifically sound, ethical, and efficient manner. Objectives: Building upon earlier frameworks of model development and utilization, we identify the emerging opportunities and challenges of e-HPA, propose a framework that enables us to realize these opportunities, address these challenges, and motivate e-HPA stakeholders to both adopt and continuously refine the framework as the applications of e-HPA emerge. Methods: To achieve these objectives, 17 experts with diverse expertise including methodology, ethics, legal, regulation, and health care delivery systems were assembled to identify emerging opportunities and challenges of e-HPA and to propose a framework to guide the development and application of e-HPA. Findings: The framework proposed by the panel includes three key domains where e-HPA differs qualitatively from earlier generations of models and algorithms (Data Barriers, Transparency, and Ethics) and areas where current frameworks are insufficient to address the emerging opportunities and challenges of e-HPA (Regulation and Certification; and Education and Training). 
The following list of recommendations summarizes the key points of the framework:
Data Barriers: Establish mechanisms within the scientific community to support data sharing for predictive model development and testing.
Transparency: Set standards around e-HPA validation based on principles of scientific transparency and reproducibility.
Ethics: Develop both individual-centered and society-centered risk-benefit approaches to evaluate e-HPA.
Regulation and Certification: Construct a self-regulation and certification framework within e-HPA.
Education and Training: Make significant changes to medical, nursing, and paraprofessional curricula by including training for understanding, evaluating, and utilizing predictive models. PMID:27141516

  5. Consensus Statement on Electronic Health Predictive Analytics: A Guiding Framework to Address Challenges.

    PubMed

    Amarasingham, Ruben; Audet, Anne-Marie J; Bates, David W; Glenn Cohen, I; Entwistle, Martin; Escobar, G J; Liu, Vincent; Etheredge, Lynn; Lo, Bernard; Ohno-Machado, Lucila; Ram, Sudha; Saria, Suchi; Schilling, Lisa M; Shahi, Anand; Stewart, Walter F; Steyerberg, Ewout W; Xie, Bin

    2016-01-01

    The recent explosion in available electronic health record (EHR) data is motivating a rapid expansion of electronic health care predictive analytic (e-HPA) applications, defined as the use of electronic algorithms that forecast clinical events in real time with the intent to improve patient outcomes and reduce costs. There is an urgent need for a systematic framework to guide the development and application of e-HPA to ensure that the field develops in a scientifically sound, ethical, and efficient manner. Building upon earlier frameworks of model development and utilization, we identify the emerging opportunities and challenges of e-HPA, propose a framework that enables us to realize these opportunities, address these challenges, and motivate e-HPA stakeholders to both adopt and continuously refine the framework as the applications of e-HPA emerge. To achieve these objectives, 17 experts with diverse expertise including methodology, ethics, legal, regulation, and health care delivery systems were assembled to identify emerging opportunities and challenges of e-HPA and to propose a framework to guide the development and application of e-HPA. The framework proposed by the panel includes three key domains where e-HPA differs qualitatively from earlier generations of models and algorithms (Data Barriers, Transparency, and Ethics) and areas where current frameworks are insufficient to address the emerging opportunities and challenges of e-HPA (Regulation and Certification; and Education and Training). The following list of recommendations summarizes the key points of the framework:
Data Barriers: Establish mechanisms within the scientific community to support data sharing for predictive model development and testing.
Transparency: Set standards around e-HPA validation based on principles of scientific transparency and reproducibility.
Ethics: Develop both individual-centered and society-centered risk-benefit approaches to evaluate e-HPA.
Regulation and Certification: Construct a self-regulation and certification framework within e-HPA.
Education and Training: Make significant changes to medical, nursing, and paraprofessional curricula by including training for understanding, evaluating, and utilizing predictive models.

  6. Implicit kernel sparse shape representation: a sparse-neighbors-based object segmentation framework.

    PubMed

    Yao, Jincao; Yu, Huimin; Hu, Roland

    2017-01-01

    This paper introduces a new implicit-kernel-sparse-shape-representation-based object segmentation framework. Given an input object whose shape is similar to some of the elements in the training set, the proposed model can automatically find a cluster of implicit kernel sparse neighbors to approximately represent the input shape and guide the segmentation. A distance-constrained probabilistic definition together with a dualization energy term is developed to connect high-level shape representation and low-level image information. We theoretically prove that our model not only derives from two projected convex sets but is also equivalent to a sparse-reconstruction-error-based representation in the Hilbert space. Finally, a "wake-sleep"-based segmentation framework is applied to drive the evolutionary curve to recover the original shape of the object. We test our model on two public datasets. Numerical experiments on both synthetic images and real applications show the superior capabilities of the proposed framework.

  7. Surrogate assisted multidisciplinary design optimization for an all-electric GEO satellite

    NASA Astrophysics Data System (ADS)

    Shi, Renhe; Liu, Li; Long, Teng; Liu, Jian; Yuan, Bin

    2017-09-01

    State-of-the-art all-electric geostationary earth orbit (GEO) satellites use electric thrusters to execute all propulsive duties, and thus differ significantly from traditional all-chemical satellites in orbit-raising, station-keeping, radiation damage protection, power budget, etc. The design optimization of an all-electric GEO satellite is therefore a complex multidisciplinary design optimization (MDO) problem involving unique design considerations. However, solving the all-electric GEO satellite MDO problem poses major challenges in disciplinary modeling techniques and efficient optimization strategy. To address these challenges, we present a surrogate-assisted MDO framework consisting of several modules: MDO problem definition, multidisciplinary modeling, multidisciplinary analysis (MDA), and a surrogate-assisted optimizer. Based on the proposed framework, the all-electric GEO satellite MDO problem is formulated to minimize the total mass of the satellite system under a number of practical constraints. Considerable effort is then spent on multidisciplinary modeling involving the geosynchronous transfer, GEO station-keeping, power, thermal control, attitude control, and structure disciplines. Since the orbit dynamics models and the finite element structural model are computationally expensive, an adaptive response surface surrogate based optimizer is incorporated in the proposed framework to solve the satellite MDO problem at moderate computational cost, where a response surface surrogate is gradually refined to represent the computationally expensive MDA process. After optimization, the total mass of the studied GEO satellite is decreased by 185.3 kg (i.e., 7.3% of the total mass). Finally, the optimal design is further discussed to demonstrate the effectiveness of the proposed framework in coping with all-electric GEO satellite system design optimization problems.
The proposed surrogate-assisted MDO framework can also provide a valuable reference for the design of other all-electric spacecraft systems.
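The surrogate-refinement loop can be illustrated in one dimension (a deliberately simple sketch: a cheap quadratic stands in for the response surface, and a known analytic function stands in for the expensive multidisciplinary analysis; none of this is the paper's actual model):

```python
def expensive(x):
    """Stand-in for a costly multidisciplinary analysis (illustrative only)."""
    return (x - 1.7) ** 2 + 0.3

def quad_coeffs(pts):
    """a and b of the exact quadratic a*x^2 + b*x + c through three points."""
    (x0, y0), (x1, y1), (x2, y2) = pts
    d0 = (x0 - x1) * (x0 - x2)
    d1 = (x1 - x0) * (x1 - x2)
    d2 = (x2 - x0) * (x2 - x1)
    a = y0 / d0 + y1 / d1 + y2 / d2
    b = -(y0 * (x1 + x2) / d0 + y1 * (x0 + x2) / d1 + y2 * (x0 + x1) / d2)
    return a, b

# Surrogate-assisted loop: fit a cheap quadratic through the three best
# samples, then evaluate the expensive model only at the surrogate minimiser.
samples = [(x, expensive(x)) for x in (0.0, 1.0, 3.0)]
for _ in range(5):
    pts = sorted(samples, key=lambda p: p[1])[:3]
    a, b = quad_coeffs(pts)
    if a <= 0:
        break  # surrogate has no interior minimum
    x_new = -b / (2 * a)
    if any(abs(x_new - x) < 1e-9 for x, _ in samples):
        break  # surrogate minimiser already evaluated: converged
    samples.append((x_new, expensive(x_new)))

best = min(samples, key=lambda p: p[1])
print(round(best[0], 3), round(best[1], 3))  # → 1.7 0.3
```

The same pattern scales up in the paper: an adaptive response surface replaces the quadratic, and each "expensive" call is a full MDA involving orbit dynamics and finite element models, so keeping the evaluation count low is what makes the optimization tractable.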

  8. Argumentation in Science Education: A Model-Based Framework

    ERIC Educational Resources Information Center

    Bottcher, Florian; Meisert, Anke

    2011-01-01

    The goal of this article is threefold: First, the theoretical background for a model-based framework of argumentation to describe and evaluate argumentative processes in science education is presented. Based on the general model-based perspective in cognitive science and the philosophy of science, it is proposed to understand arguments as reasons…

  9. HEAVY-DUTY DIESEL VEHICLE MODAL EMISSION MODEL (HDDV-MEM): VOLUME I: MODAL EMISSION MODELING FRAMEWORK; VOLUME II: MODAL COMPONENTS AND OUTPUTS

    EPA Science Inventory

    This research outlines a proposed Heavy-Duty Diesel Vehicle Modal Emission Modeling Framework (HDDV-MEMF) for heavy-duty diesel-powered trucks and buses. The heavy-duty vehicle modal modules being developed under this research effort, although different, should be compatible wi...

  10. Hierarchical kernel mixture models for the prediction of AIDS disease progression using HIV structural gp120 profiles

    PubMed Central

    2010-01-01

    Changes to the glycosylation profile on HIV gp120 can influence viral pathogenesis and alter AIDS disease progression. Characterization of glycosylation differences at the sequence level is inadequate, as the placement of carbohydrates is structurally complex; however, no structural framework is available to date for the study of HIV disease progression. In this study, we propose a novel machine-learning-based framework for the prediction of AIDS disease progression in three stages (RP, SP, and LTNP) using the HIV structural gp120 profile. This new intelligent framework proves to be accurate and provides an important benchmark for predicting AIDS disease progression computationally. The model is trained using a novel HIV gp120 glycosylation structural profile to detect possible stages of AIDS disease progression for the target sequences of HIV+ individuals. The performance of the proposed model was compared to seven existing machine-learning models on the newly proposed gp120-Benchmark_1 dataset in terms of error rate (MSE), accuracy (CCI), stability (STD), and complexity (TBM). The novel framework showed better predictive performance, with 67.82% CCI, 30.21 MSE, 0.8 STD, and 2.62 TBM, on the three stages of AIDS disease progression of 50 HIV+ individuals. This framework is an invaluable bioinformatics tool that will be useful for the clinical assessment of viral pathogenesis. PMID:21143806

  11. Fast image interpolation via random forests.

    PubMed

    Huang, Jun-Jie; Siu, Wan-Chi; Liu, Tian-Rui

    2015-10-01

    This paper proposes a two-stage framework for fast image interpolation via random forests (FIRF). The proposed FIRF method achieves high accuracy while requiring little computation. The underlying idea is to apply random forests to classify the natural image patch space into numerous subspaces and to learn a linear regression model for each subspace that maps a low-resolution image patch to a high-resolution image patch. The FIRF framework consists of two stages: Stage 1 removes most of the ringing and aliasing artifacts in the initial bicubic-interpolated image, while Stage 2 further refines the Stage 1 result. By varying the number of decision trees in the random forests and the number of stages applied, the proposed FIRF method can realize computationally scalable image interpolation. Extensive experimental results show that the proposed FIRF(3, 2) method achieves more than 0.3 dB improvement in peak signal-to-noise ratio over the state-of-the-art nonlocal autoregressive modeling (NARM) method. Moreover, the proposed FIRF(1, 1) obtains similar or better results than NARM while requiring only 0.3% of its computation time.

  12. Modelling Framework and the Quantitative Analysis of Distributed Energy Resources in Future Distribution Networks

    NASA Astrophysics Data System (ADS)

    Han, Xue; Sandels, Claes; Zhu, Kun; Nordström, Lars

    2013-08-01

    Many claims have been made that the large-scale deployment of Distributed Energy Resources (DERs) could eventually reshape future distribution grid operation in numerous ways. It is therefore necessary to introduce a framework to measure to what extent power system operation will be changed by various parameters of DERs. This article proposes a modelling framework for an overview analysis of the correlation between DERs and distribution network operation. Furthermore, to validate the framework, the authors describe reference models of different categories of DERs with their unique characteristics, comprising distributed generation, active demand, and electric vehicles. Subsequently, a quantitative analysis was made on the basis of the current and envisioned DER deployment scenarios proposed for Sweden, with simulations performed in two typical distribution network models for four seasons. The simulation results show that, in general, DER deployment brings opportunities to reduce power losses and voltage drops by compensating power from local generation and optimizing local load profiles.
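The loss-reduction mechanism the simulations point to can be shown with a toy two-bus feeder (illustrative numbers, not the Swedish network models used in the article): local generation reduces the power imported over the line, and line losses fall with the square of the current.

```python
# Toy two-bus feeder: a substation supplies a load over a line with
# resistance R; local DER generation at the load bus offsets the power
# drawn through the line. All numbers are illustrative assumptions.
V = 400.0          # line voltage (V)
R = 0.5            # line resistance (ohm)
load_kw = 50.0     # demand at the load bus

def line_loss_kw(local_gen_kw):
    """I^2 * R loss for the power imported over the line (unity power factor)."""
    imported_kw = load_kw - local_gen_kw
    current = abs(imported_kw) * 1000.0 / V      # line current in A
    return current ** 2 * R / 1000.0             # loss in kW

print(round(line_loss_kw(0.0), 2), round(line_loss_kw(30.0), 2))  # → 7.81 1.25
```

Because losses scale with the square of the imported power, even partial local generation (30 kW of a 50 kW load here) cuts line losses by far more than its share of the load, which is the qualitative effect the article's seasonal simulations quantify.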

  13. Generalized multiple kernel learning with data-dependent priors.

    PubMed

    Mao, Qi; Tsang, Ivor W; Gao, Shenghua; Wang, Li

    2015-06-01

    Multiple kernel learning (MKL) and classifier ensemble are two mainstream methods for solving learning problems in which some sets of features/views are more informative than others, or the features/views within a given set are inconsistent. In this paper, we first present a novel probabilistic interpretation of MKL such that maximum entropy discrimination with a noninformative prior over multiple views is equivalent to the formulation of MKL. Instead of using the noninformative prior, we introduce a novel data-dependent prior based on an ensemble of kernel predictors, which enhances the prediction performance of MKL by leveraging the merits of the classifier ensemble. With the proposed probabilistic framework of MKL, we propose a hierarchical Bayesian model to learn the proposed data-dependent prior and classification model simultaneously. The resultant problem is convex and other information (e.g., instances with either missing views or missing labels) can be seamlessly incorporated into the data-dependent priors. Furthermore, a variety of existing MKL models can be recovered under the proposed MKL framework and can be readily extended to incorporate these priors. Extensive experiments demonstrate the benefits of our proposed framework in supervised and semisupervised settings, as well as in tasks with partial correspondence among multiple views.

  14. Multilevel analysis of sports video sequences

    NASA Astrophysics Data System (ADS)

    Han, Jungong; Farin, Dirk; de With, Peter H. N.

    2006-01-01

    We propose a fully automatic and flexible framework for analysis and summarization of tennis broadcast video sequences, using visual features and specific game-context knowledge. Our framework can analyze a tennis video sequence at three levels, which provides a broad range of different analysis results. The proposed framework includes novel pixel-level and object-level tennis video processing algorithms, such as moving-player detection that takes both the color and the court (playing-field) information into account, and a player-position tracking algorithm based on a 3-D camera model. Additionally, we employ scene-level models for detecting events, like service, base-line rally and net-approach, based on a number of real-world visual features. The system can summarize three forms of information: (1) all court-view playing frames in a game, (2) the moving trajectory and real speed of each player, as well as the relative position between player and court, and (3) the semantic event segments in a game. The proposed framework is flexible in choosing the level of analysis that is desired. It is effective because the framework makes use of several visual cues obtained from the real-world domain to model important events like service, thereby increasing the accuracy of the scene-level analysis. The paper presents attractive experimental results highlighting the system efficiency and analysis capabilities.

  15. Wavelet neural networks: a practical guide.

    PubMed

    Alexandridis, Antonios K; Zapranis, Achilleas D

    2013-06-01

    Wavelet networks (WNs) are a new class of networks which have been used with great success in a wide range of applications. However, a generally accepted framework for applying WNs is missing from the literature. In this study, we present a complete statistical model identification framework for applying WNs in various applications. The following subjects were thoroughly examined: the structure of a WN, training methods, initialization algorithms, variable significance and variable selection algorithms, model selection methods and, finally, methods to construct confidence and prediction intervals. In addition, the complexity of each algorithm is discussed. Our proposed framework was tested on two simulated cases, one chaotic time series described by the Mackey-Glass equation, and three real datasets: daily temperatures in Berlin, daily wind speeds in New York, and breast cancer classification. Our results show that the proposed algorithms produce stable and robust results, indicating that the proposed framework can be applied in various applications. Copyright © 2013 Elsevier Ltd. All rights reserved.
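
    The structural idea of a WN can be illustrated in a few lines: hidden units ("wavelons") are dilated and translated copies of a mother wavelet, combined linearly at the output. The Mexican-hat wavelet below is a common choice, but the specific translations, dilations, and weights are illustrative assumptions, not values from the paper.

```python
import math

def mexican_hat(t):
    """'Mexican hat' mother wavelet (second derivative of a Gaussian, unnormalized)."""
    return (1.0 - t * t) * math.exp(-t * t / 2.0)

def wavelet_network(x, hidden, w_out, bias):
    """One-input WN: weighted sum of dilated/translated wavelons.
    hidden is a list of (translation, dilation) pairs."""
    z = [mexican_hat((x - t) / d) for t, d in hidden]
    return bias + sum(w * zi for w, zi in zip(w_out, z))

hidden = [(0.0, 1.0), (1.0, 0.5)]          # two wavelons
y = wavelet_network(0.0, hidden, [1.0, 1.0], 0.0)
# wavelon 1: psi(0) = 1; wavelon 2: psi((0-1)/0.5) = psi(-2) = (1-4)e^{-2}
print(round(y, 4))
```

    In a full identification framework, the translations, dilations, and output weights would be initialized (e.g., from the data's wavelet decomposition) and then trained, which is what the surveyed initialization and training methods address.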

  16. Towards a Theoretical Framework for Educational Simulations.

    ERIC Educational Resources Information Center

    Winer, Laura R.; Vazquez-Abad, Jesus

    1981-01-01

    Discusses the need for a sustained and systematic effort toward establishing a theoretical framework for educational simulations, proposes the adaptation of models borrowed from the natural and applied sciences, and describes three simulations based on such a model adapted using Brunerian learning theory. Sixteen references are listed. (LLS)

  17. Nonlinear and non-Gaussian Bayesian based handwriting beautification

    NASA Astrophysics Data System (ADS)

    Shi, Cao; Xiao, Jianguo; Xu, Canhui; Jia, Wenhua

    2013-03-01

    A framework is proposed in this paper to effectively and efficiently beautify handwriting by means of a novel nonlinear and non-Gaussian Bayesian algorithm. In the proposed framework, the format and size of the handwriting image are first normalized, and a system typeface is then applied to optimize the visual effect of the handwriting. Bayesian statistics is exploited to characterize the handwriting beautification process as a Bayesian dynamic model. The model parameters that translate, rotate, and scale the typeface are controlled by the state equation, and the matching between handwriting and transformed typeface is optimized through the measurement equation. Finally, the new typeface, which is transformed from the original one and achieves the best nonlinear and non-Gaussian optimization, is the beautification result. Experimental results demonstrate that the proposed framework provides a creative handwriting beautification methodology that improves visual acceptance.

  18. Unified Program Design: Organizing Existing Programming Models, Delivery Options, and Curriculum

    ERIC Educational Resources Information Center

    Rubenstein, Lisa DaVia; Ridgley, Lisa M.

    2017-01-01

    A persistent problem in the field of gifted education has been the lack of categorization and delineation of gifted programming options. To address this issue, we propose Unified Program Design as a structural framework for gifted program models. This framework defines gifted programs as the combination of delivery methods and curriculum models.…

  19. An Odds Ratio Approach for Detecting DDF under the Nested Logit Modeling Framework

    ERIC Educational Resources Information Center

    Terzi, Ragip; Suh, Youngsuk

    2015-01-01

    An odds ratio approach (ORA) under the framework of a nested logit model was proposed for evaluating differential distractor functioning (DDF) in multiple-choice items and was compared with an existing ORA developed under the nominal response model. The performances of the two ORAs for detecting DDF were investigated through an extensive…

  20. Modeling and Detecting Feature Interactions among Integrated Services of Home Network Systems

    NASA Astrophysics Data System (ADS)

    Igaki, Hiroshi; Nakamura, Masahide

    This paper presents a framework for formalizing and detecting feature interactions (FIs) in the emerging smart home domain. We first establish a model of home network system (HNS), where every networked appliance (or the HNS environment) is characterized as an object consisting of properties and methods. Then, every HNS service is defined as a sequence of method invocations of the appliances. Within the model, we next formalize two kinds of FIs: (a) appliance interactions and (b) environment interactions. An appliance interaction occurs when two method invocations conflict on the same appliance, whereas an environment interaction arises when two method invocations conflict indirectly via the environment. Finally, we propose offline and online methods that detect FIs before service deployment and during execution, respectively. Through a case study with seven practical services, it is shown that the proposed framework is generic enough to capture feature interactions in HNS integrated services. We also discuss several FI resolution schemes within the proposed framework.
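
    The two interaction kinds formalized above can be checked mechanically once services are modeled as method-invocation sequences. The sketch below follows that object model with hypothetical appliances, services, and environment effects (the names and the conflict/effect tables are illustrative, not from the paper).

```python
# Hypothetical services: each is a sequence of (appliance, method) invocations.
services = {
    "AirCond": [("aircond", "cool")],
    "Ventilation": [("window", "open")],
    "DVDTheater": [("window", "close"), ("tv", "on")],
}
# Assumed effect of each method on a shared environment property (name, direction).
effects = {
    ("aircond", "cool"): ("room_temp", "down"),
    ("window", "open"): ("room_temp", "up"),
    ("window", "close"): ("room_temp", "none"),
    ("tv", "on"): (None, None),
}
conflicting_methods = {("open", "close"), ("close", "open"), ("on", "off"), ("off", "on")}

def appliance_interactions(s1, s2):
    """(a) Two method invocations conflict directly on the same appliance."""
    found = []
    for dev1, m1 in services[s1]:
        for dev2, m2 in services[s2]:
            if dev1 == dev2 and (m1, m2) in conflicting_methods:
                found.append((dev1, m1, m2))
    return found

def environment_interactions(s1, s2):
    """(b) Two invocations push the same environment property in opposite directions."""
    found = []
    for inv1 in services[s1]:
        for inv2 in services[s2]:
            p1, d1 = effects[inv1]; p2, d2 = effects[inv2]
            if p1 is not None and p1 == p2 and {d1, d2} == {"up", "down"}:
                found.append((p1, inv1, inv2))
    return found

print(appliance_interactions("Ventilation", "DVDTheater"))  # window: open vs close
print(environment_interactions("AirCond", "Ventilation"))   # room_temp: down vs up
```

    Run offline over all service pairs, this corresponds to detection before deployment; the same predicates applied to the currently executing invocations give the online variant.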

  1. Using Qualitative Hazard Analysis to Guide Quantitative Safety Analysis

    NASA Technical Reports Server (NTRS)

    Shortle, J. F.; Allocco, M.

    2005-01-01

    Quantitative methods can be beneficial in many types of safety investigations. However, there are many difficulties in using quantitative methods. For example, there may be little relevant data available. This paper proposes a framework for using qualitative hazard analysis to prioritize hazard scenarios most suitable for quantitative analysis. The framework first categorizes hazard scenarios by severity and likelihood. We then propose another metric, "modeling difficulty," that describes the complexity of modeling a given hazard scenario quantitatively. The combined metrics of severity, likelihood, and modeling difficulty help to prioritize hazard scenarios for which quantitative analysis should be applied. We have applied this methodology to proposed concepts of operations for reduced wake separation for airplane operations at closely spaced parallel runways.
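
    The three-metric prioritization can be sketched as a simple scoring rule. The scenarios, ordinal scales, and the weighted combination below are illustrative assumptions (the paper does not specify a combination formula here): high severity and likelihood raise priority, while high modeling difficulty lowers it.

```python
# Hypothetical ordinal ratings (1 = low, 3 = high): (name, severity, likelihood, difficulty).
scenarios = [
    ("wake encounter on parallel approach", 3, 2, 1),
    ("runway incursion", 3, 1, 3),
    ("minor taxi delay", 1, 3, 1),
]

def priority(severity, likelihood, difficulty, w=(1.0, 1.0, 1.0)):
    """Toy combination: reward severity/likelihood, penalize modeling difficulty."""
    return w[0] * severity + w[1] * likelihood - w[2] * difficulty

ranked = sorted(scenarios, key=lambda s: priority(*s[1:]), reverse=True)
for name, *_ in ranked:
    print(name)
```

    With these toy numbers, the severe-but-tractable scenario ranks first and the severe-but-hard-to-model one last, which is the intended effect of folding modeling difficulty into the prioritization.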

  2. Machine Learning-based discovery of closures for reduced models of dynamical systems

    NASA Astrophysics Data System (ADS)

    Pan, Shaowu; Duraisamy, Karthik

    2017-11-01

    Despite the successful application of machine learning (ML) in fields such as image processing and speech recognition, only a few attempts have been made toward employing ML to represent the dynamics of complex physical systems. Previous attempts mostly focus on parameter calibration or data-driven augmentation of existing models. In this work we present a ML framework to discover closure terms in reduced models of dynamical systems and provide insights into potential problems associated with data-driven modeling. Based on exact closure models for linear systems, we propose a general linear closure framework from the viewpoint of optimization. The framework is based on a trapezoidal approximation of the convolution term. Hyperparameters that need to be determined include the temporal length of the memory effect, the number of sampling points, and the dimensions of hidden states. To circumvent the explicit specification of the memory effect, a general framework inspired by neural networks is also proposed. We conduct both a priori and a posteriori evaluations of the resulting model on a number of non-linear dynamical systems. This work was supported in part by AFOSR under the project ``LES Modeling of Non-local effects using Statistical Coarse-graining'' with Dr. Jean-Luc Cambier as the technical monitor.
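
    The trapezoidal approximation of a convolution memory term can be written out directly. This minimal sketch evaluates c(t) = ∫ K(τ) u(t − τ) dτ over a finite memory window on a uniform grid; the exponential kernel and constant history are illustrative choices, not from the paper.

```python
import math

def trapezoid_convolution(K, u_hist, dt):
    """Trapezoidal quadrature of int_0^{N*dt} K(tau) * u(t - tau) dtau.
    u_hist[k] = u(t - k*dt) for k = 0..N."""
    n = len(u_hist) - 1
    total = 0.5 * (K(0.0) * u_hist[0] + K(n * dt) * u_hist[n])
    total += sum(K(k * dt) * u_hist[k] for k in range(1, n))
    return total * dt

K = lambda tau: math.exp(-tau)     # exponentially decaying memory kernel (assumed)
dt, N = 0.01, 500                  # memory length T = N*dt = 5 (hyperparameters)
u_hist = [1.0] * (N + 1)           # constant history u = 1
approx = trapezoid_convolution(K, u_hist, dt)
exact = 1.0 - math.exp(-5.0)       # closed form of int_0^5 e^{-tau} dtau
print(round(approx, 4), round(exact, 4))
```

    The memory length and number of sampling points shown as `dt` and `N` are exactly the hyperparameters the abstract says must be determined.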

  3. Integral Nursing: An Emerging Framework for Engaging the Evolution of the Profession.

    ERIC Educational Resources Information Center

    Fiandt, Kathryn; Forman, John; Megel, Mary Erickson; Pakieser, Ruth A.; Burge, Stephanie

    2003-01-01

    Proposes the Integral Nursing framework, which combines Wilber's All-Quadrant/All-Level model, a heuristic device to organize human experience, and the Spiral Dynamics model of human development organized around value memes or cultural units of information. Includes commentary by Beth L. Rodgers. (Contains 17 references.) (JOW)

  4. Modelling Diffusion of a Personalized Learning Framework

    ERIC Educational Resources Information Center

    Karmeshu; Raman, Raghu; Nedungadi, Prema

    2012-01-01

    A new modelling approach for diffusion of personalized learning as an educational process innovation in social group comprising adopter-teachers is proposed. An empirical analysis regarding the perception of 261 adopter-teachers from 18 schools in India about a particular personalized learning framework has been made. Based on this analysis,…

  5. A Unified Framework for Monetary Theory and Policy Analysis.

    ERIC Educational Resources Information Center

    Lagos, Ricardo; Wright, Randall

    2005-01-01

    Search-theoretic models of monetary exchange are based on explicit descriptions of the frictions that make money essential. However, tractable versions of these models typically make strong assumptions that render them ill suited for monetary policy analysis. We propose a new framework, based on explicit micro foundations, within which macro…

  6. Development of agent-based on-line adaptive signal control (ASC) framework using connected vehicle (CV) technology.

    DOT National Transportation Integrated Search

    2016-04-01

    In this study, we developed an adaptive signal control (ASC) framework for connected vehicles (CVs) using agent-based modeling technique. : The proposed framework consists of two types of agents: 1) vehicle agents (VAs); and 2) signal controller agen...

  7. Research and Design of the Three-tier Distributed Network Management System Based on COM / COM + and DNA

    NASA Astrophysics Data System (ADS)

    Liang, Likai; Bi, Yushen

    Considering the distributed network management system's demands for high distribution, extensibility, and reusability, a framework model for a three-tier distributed network management system based on COM/COM+ and DNA is proposed, adopting software component technology and N-tier application framework design ideas. We also give a concrete design plan for each layer of this model. Finally, we discuss the internal running process of each layer in the distributed network management system's framework model.

  8. Feature selection and classifier parameters estimation for EEG signals peak detection using particle swarm optimization.

    PubMed

    Adam, Asrul; Shapiai, Mohd Ibrahim; Tumari, Mohd Zaidi Mohd; Mohamad, Mohd Saberi; Mubin, Marizan

    2014-01-01

    Electroencephalogram (EEG) signal peak detection is widely used in clinical applications. The peak point can be detected using several approaches, including time, frequency, time-frequency, and nonlinear domains, depending on various peak features from several models. However, no study has established the importance of each peak feature in contributing to a good and generalized model. In this study, feature selection and classifier parameter estimation based on particle swarm optimization (PSO) are proposed as a framework for peak detection on EEG signals in time-domain analysis. Two versions of PSO are used in the study: (1) standard PSO and (2) random asynchronous particle swarm optimization (RA-PSO). The proposed framework searches for the combination of available features that offers good peak detection and a high classification rate in the conducted experiments. The evaluation results indicate that the accuracy of the peak detection can be improved up to 99.90% and 98.59% for training and testing, respectively, as compared to the framework without feature selection adaptation. Additionally, the proposed framework based on RA-PSO offers a better and more reliable classification rate than standard PSO, as it produces a lower-variance model.
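
    Binary PSO for feature selection can be sketched as follows: each particle encodes a feature mask, velocities are squashed through a sigmoid into bit probabilities, and the swarm minimizes a fitness function. The toy fitness below (penalizing missed "informative" features and selected noise features) stands in for the peak-detection classification error; the problem size, constants, and ground truth are all illustrative.

```python
import math, random

random.seed(42)

INFORMATIVE = {0, 1}          # hypothetical ground truth: features 0 and 1 matter
N_FEATURES, SWARM, ITERS = 4, 20, 60

def fitness(mask):
    """Lower is better: penalize missed informative features and selected noise."""
    missed = sum(1 for f in INFORMATIVE if not mask[f])
    noise = sum(1 for f in range(N_FEATURES) if f not in INFORMATIVE and mask[f])
    return 10 * missed + noise

def sample(vel):
    """Binary PSO: squash velocity through a sigmoid to get bit probabilities."""
    return [1 if random.random() < 1 / (1 + math.exp(-v)) else 0 for v in vel]

pos = [[random.randint(0, 1) for _ in range(N_FEATURES)] for _ in range(SWARM)]
vel = [[random.uniform(-1, 1) for _ in range(N_FEATURES)] for _ in range(SWARM)]
pbest = [p[:] for p in pos]
gbest = min(pbest, key=fitness)[:]

for _ in range(ITERS):
    for i in range(SWARM):
        for d in range(N_FEATURES):
            r1, r2 = random.random(), random.random()
            vel[i][d] = (0.7 * vel[i][d]                       # inertia
                         + 1.4 * r1 * (pbest[i][d] - pos[i][d])  # cognitive pull
                         + 1.4 * r2 * (gbest[d] - pos[i][d]))    # social pull
            vel[i][d] = max(-4.0, min(4.0, vel[i][d]))           # velocity clamp
        pos[i] = sample(vel[i])
        if fitness(pos[i]) < fitness(pbest[i]):
            pbest[i] = pos[i][:]
            if fitness(pbest[i]) < fitness(gbest):
                gbest = pbest[i][:]

print(gbest, fitness(gbest))
```

    The RA-PSO variant of the paper differs mainly in its asynchronous, randomized update order; the fitness in the actual framework would be the peak-detection model's validation error.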

  9. Improve Biomedical Information Retrieval using Modified Learning to Rank Methods.

    PubMed

    Xu, Bo; Lin, Hongfei; Lin, Yuan; Ma, Yunlong; Yang, Liang; Wang, Jian; Yang, Zhihao

    2016-06-14

    In recent years, the number of biomedical articles has increased exponentially, making it infeasible for biologists to capture all the needed information manually. Information retrieval technologies, as the core of search engines, can address the problem automatically, providing users with the needed information. However, it is a great challenge to apply these technologies directly to biomedical retrieval because of the abundance of domain-specific terminologies. To enhance biomedical retrieval, we propose a novel framework based on learning to rank. Learning to rank comprises a series of state-of-the-art information retrieval techniques and has proven effective in many information retrieval tasks. In the proposed framework, we tackle the problem of abundant terminologies by constructing ranking models, which focus not only on retrieving the most relevant documents but also on diversifying the search results to increase the completeness of the resulting list for a given query. During model training, we propose two novel document labeling strategies and combine several traditional retrieval models as learning features. We also investigate the usefulness of different learning to rank approaches in our framework. Experimental results on TREC Genomics datasets demonstrate the effectiveness of our framework for biomedical information retrieval.
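
    The core mechanic of learning to rank can be illustrated with a minimal pairwise sketch (a ranking-perceptron-style learner, not the paper's exact method): learn a linear scoring function so that, for each training pair, the more relevant document outscores the less relevant one. The two features and all values are hypothetical stand-ins for the "traditional retrieval models as learning features" mentioned above.

```python
def score(w, feats):
    return sum(wi * fi for wi, fi in zip(w, feats))

def train_pairwise(pairs, n_features, epochs=50, lr=0.1):
    """pairs: (better, worse) feature-vector pairs; perceptron-style updates
    whenever the preferred document does not outscore the other."""
    w = [0.0] * n_features
    for _ in range(epochs):
        for better, worse in pairs:
            if score(w, better) <= score(w, worse):   # ordering violated
                for d in range(n_features):
                    w[d] += lr * (better[d] - worse[d])
    return w

# Hypothetical features per document: (BM25-style score, terminology-match score),
# given as (relevant, non-relevant) training pairs.
pairs = [
    ([2.0, 1.0], [1.0, 0.5]),
    ([1.5, 2.0], [1.8, 0.2]),
    ([0.5, 1.5], [0.6, 0.1]),
]
w = train_pairwise(pairs, 2)
ranked = sorted([[1.0, 0.2], [1.2, 1.8]], key=lambda f: score(w, f), reverse=True)
print(ranked[0])  # the document stronger on both features ranks first
```

    Production learning-to-rank methods (e.g., listwise objectives, gradient-boosted trees) refine this idea, and the diversification objective in the paper would further reorder the scored list.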

  10. On Connectivity of Wireless Sensor Networks with Directional Antennas

    PubMed Central

    Wang, Qiu; Dai, Hong-Ning; Zheng, Zibin; Imran, Muhammad; Vasilakos, Athanasios V.

    2017-01-01

    In this paper, we investigate the network connectivity of wireless sensor networks with directional antennas. In particular, we establish a general framework to analyze the network connectivity while considering various antenna models and channel randomness. Since existing directional antenna models trade off accuracy in reflecting realistic antennas against computational complexity, we propose a new analytical directional antenna model, called the iris model, to balance accuracy against complexity. We conduct extensive simulations to evaluate the analytical framework. Our results show that our proposed analytical model of network connectivity is accurate, and that our iris antenna model provides a better approximation to realistic directional antennas than other existing antenna models. PMID:28085081

  11. Formal Models of Word Recognition. Final Report.

    ERIC Educational Resources Information Center

    Travers, Jeffrey R.

    Existing mathematical models of word recognition are reviewed and a new theory is proposed in this research. The new theory integrates earlier proposals within a single framework, sacrificing none of the predictive power of the earlier proposals, but offering a gain in theoretical economy. The theory holds that word recognition is accomplished by…

  12. A novel framework of tissue membrane systems for image fusion.

    PubMed

    Zhang, Zulin; Yi, Xinzhong; Peng, Hong

    2014-01-01

    This paper proposes a tissue membrane system-based framework to deal with the optimal image fusion problem. A spatial domain fusion algorithm is given, and a tissue membrane system of multiple cells is used as its computing framework. Based on the multicellular structure and inherent communication mechanism of the tissue membrane system, an improved velocity-position model is developed. The performance of the fusion framework is studied with comparison of several traditional fusion methods as well as genetic algorithm (GA)-based and differential evolution (DE)-based spatial domain fusion methods. Experimental results show that the proposed fusion framework is superior or comparable to the other methods and can be efficiently used for image fusion.
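
    The spatial-domain fusion idea can be shown in miniature: the fused image is a weighted combination of the sources, and the weight is chosen to maximize a quality metric. Here the membrane-computing search of the paper is replaced, purely for illustration, by a plain grid search, and the "images" are toy 1-D signals with variance as the quality metric.

```python
def variance(img):
    m = sum(img) / len(img)
    return sum((p - m) ** 2 for p in img) / len(img)

def fuse(a, b, w):
    """Spatial-domain fusion: per-pixel weighted average of two sources."""
    return [w * pa + (1 - w) * pb for pa, pb in zip(a, b)]

# Two toy 1-D "images": one sharp (high contrast), one blurred (low contrast).
sharp = [0, 255, 0, 255, 0, 255]
blurred = [120, 135, 120, 135, 120, 135]

# Grid search over the fusion weight, maximizing fused-image variance.
best_w = max((i / 10 for i in range(11)), key=lambda w: variance(fuse(sharp, blurred, w)))
print(best_w)
```

    As expected, the search drives the weight toward the high-contrast source; the tissue membrane system in the paper performs this optimization with a population of cells exchanging candidate solutions (its improved velocity-position model) rather than by enumeration.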

  13. An Optimization-Based State Estimation Framework for Large-Scale Natural Gas Networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jalving, Jordan; Zavala, Victor M.

    We propose an optimization-based state estimation framework to track internal space-time flow and pressure profiles of natural gas networks during dynamic transients. We find that the estimation problem is ill-posed (because of the infinite-dimensional nature of the states) and that this leads to instability of the estimator when short estimation horizons are used. To circumvent this issue, we propose moving horizon strategies that incorporate prior information. In particular, we propose a strategy that initializes the prior using steady-state information and compare its performance against a strategy that does not initialize the prior. We find that both strategies are capable of tracking the state profiles but that superior performance is obtained with steady-state prior initialization. We also find that, under the proposed framework, pressure sensor information at junctions is sufficient to track the state profiles. We also derive approximate transport models and show that some of these can be used to achieve significant computational speed-ups without sacrificing estimation performance. We show that the estimator can be easily implemented in the graph-based modeling framework Plasmo.jl and use a multipipeline network study to demonstrate the developments.

  14. A new framework to enhance the interpretation of external validation studies of clinical prediction models.

    PubMed

    Debray, Thomas P A; Vergouwe, Yvonne; Koffijberg, Hendrik; Nieboer, Daan; Steyerberg, Ewout W; Moons, Karel G M

    2015-03-01

    It is widely acknowledged that the performance of diagnostic and prognostic prediction models should be assessed in external validation studies with independent data from "different but related" samples as compared with that of the development sample. We developed a framework of methodological steps and statistical methods for analyzing and enhancing the interpretation of results from external validation studies of prediction models. We propose to quantify the degree of relatedness between development and validation samples on a scale ranging from reproducibility to transportability by evaluating their corresponding case-mix differences. We subsequently assess the models' performance in the validation sample and interpret the performance in view of the case-mix differences. Finally, we may adjust the model to the validation setting. We illustrate this three-step framework with a prediction model for diagnosing deep venous thrombosis using three validation samples with varying case mix. While one external validation sample merely assessed the model's reproducibility, two other samples rather assessed model transportability. The performance in all validation samples was adequate, and the model did not require extensive updating to correct for miscalibration or poor fit to the validation settings. The proposed framework enhances the interpretation of findings at external validation of prediction models. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  15. Towards a Framework for Evolvable Network Design

    NASA Astrophysics Data System (ADS)

    Hassan, Hoda; Eltarras, Ramy; Eltoweissy, Mohamed

    The layered Internet architecture that had long guided network design and protocol engineering was an “interconnection architecture” defining a framework for interconnecting networks rather than a model for generic network structuring and engineering. We claim that the approach of abstracting the network in terms of an internetwork hinders a thorough understanding of the network's salient characteristics and emergent behavior, impeding the design evolution required to address extreme scale, heterogeneity, and complexity. This paper reports on our work in progress that aims to: 1) investigate the problem space in terms of the factors and decisions that influenced the design and development of computer networks; 2) sketch the core principles for designing complex computer networks; and 3) propose a model and related framework for building evolvable, adaptable and self-organizing networks. We will adopt a bottom-up strategy primarily focusing on the building unit of the network model, which we call the “network cell”. The model is inspired by natural complex systems. A network cell is intrinsically capable of specialization, adaptation and evolution. Subsequently, we propose CellNet, a framework for evolvable network design. We outline scenarios for using the CellNet framework to enhance the legacy Internet protocol stack.

  16. Obesity in sub-Saharan Africa: development of an ecological theoretical framework.

    PubMed

    Scott, Alison; Ejikeme, Chinwe Stella; Clottey, Emmanuel Nii; Thomas, Joy Goens

    2013-03-01

    The prevalence of overweight and obesity is increasing in sub-Saharan Africa (SSA). There is a need for theoretical frameworks to catalyze further research and to inform the development of multi-level, context-appropriate interventions. In this commentary, we propose a preliminary ecological theoretical framework to conceptualize factors that contribute to increases in overweight and obesity in SSA. The framework is based on a Causality Continuum model [Coreil et al. Social and Behavioral Foundations of Public Health. Sage Publications, Thousand Oaks] that considers distant, intermediate and proximate influences. The influences incorporated in the model include globalization and urbanization as distant factors; occupation, social relationships, built environment and cultural perceptions of weight as intermediate factors and caloric intake, physical inactivity and genetics as proximate factors. The model illustrates the interaction of factors along a continuum, from the individual to the global marketplace, in shaping trends in overweight and obesity in SSA. The framework will be presented, each influence elucidated and implications for research and intervention development discussed. There is a tremendous need for further research on obesity in SSA. An improved evidence base will serve to validate and develop the proposed framework further.

  17. A framework for m-health service development and success evaluation.

    PubMed

    Sadegh, S Saeedeh; Khakshour Saadat, Parisa; Sepehri, Mohammad Mehdi; Assadi, Vahid

    2018-04-01

    The emergence of mobile technology has influenced many service industries, including health care. Mobile health (m-Health) applications have been used widely, and many services have been developed that have changed delivery systems and improved the effectiveness of health care services. Stakeholders of m-Health services have various resources and rights, which lends complexity to service delivery. In addition, the abundance of different m-Health services makes it difficult for stakeholders, including customers, patients, users, and even providers, to choose an appropriate service. Moreover, a comprehensive framework that would help manage and evaluate m-Health services, considering the benefits of various stakeholders, is not yet provided in the literature. In this paper, a comprehensive literature review has been conducted on well-known frameworks and models in the fields of information technology and electronic health, with the aim of identifying different aspects of developing and managing m-Health services. Using the results of the literature review and a stakeholder analysis, we propose an m-Health evaluation framework that evaluates the success of a given m-Health service through a three-stage life cycle: (1) Service Requirement Analysis, (2) Service Development, and (3) Service Delivery. Key factors of m-Health evaluation in each step are introduced in the proposed framework, considering the benefits of key m-Health stakeholders. The proposed framework is validated via expert interviews, and the key factors in each evaluation step are validated using a PLS model. Results show that path coefficients are higher than their thresholds, which supports the validity of the proposed framework. Copyright © 2018 Elsevier B.V. All rights reserved.

  18. Development and Implementation of a Telecommuting Evaluation Framework, and Modeling the Executive Telecommuting Adoption Process

    NASA Astrophysics Data System (ADS)

    Vora, V. P.; Mahmassani, H. S.

    2002-02-01

    This work proposes and implements a comprehensive evaluation framework to document the telecommuter, organizational, and societal impacts of telecommuting through telecommuting programs. Evaluation processes and materials within the outlined framework are also proposed and implemented. As the first component of the evaluation process, the executive survey is administered within a public sector agency. The survey data is examined through exploratory analysis and is compared to a previous survey of private sector executives. The ordinal probit, dynamic probit, and dynamic generalized ordinal probit (DGOP) models of telecommuting adoption are calibrated to identify factors which significantly influence executive adoption preferences and to test the robustness of such factors. The public sector DGOP model of executive willingness to support telecommuting under different program scenarios is compared with an equivalent private sector DGOP model. Through the telecommuting program, a case study of telecommuting travel impacts is performed to further substantiate research.

  19. Entity-Centric Abstraction and Modeling Framework for Transportation Architectures

    NASA Technical Reports Server (NTRS)

    Lewe, Jung-Ho; DeLaurentis, Daniel A.; Mavris, Dimitri N.; Schrage, Daniel P.

    2007-01-01

    A comprehensive framework for representing transportation architectures is presented. After discussing a series of preceding perspectives and formulations, the intellectual underpinning of the novel framework, using an entity-centric abstraction of transportation, is described. The entities include endogenous and exogenous factors, and functional expressions are offered that relate these and their evolution. The end result is a Transportation Architecture Field which permits analysis of future concepts under a holistic perspective. A simulation model which stems from the framework is presented and exercised, producing results which quantify improvements in air transportation due to advanced aircraft technologies. Finally, a modeling hypothesis and its accompanying criteria are proposed to test further use of the framework for evaluating new transportation solutions.

  20. a Framework for Voxel-Based Global Scale Modeling of Urban Environments

    NASA Astrophysics Data System (ADS)

    Gehrung, Joachim; Hebel, Marcus; Arens, Michael; Stilla, Uwe

    2016-10-01

    The generation of 3D city models is a very active field of research. Modeling environments as point clouds may be fast, but it has disadvantages that are readily addressed by volumetric representations, especially when considering selective data acquisition, change detection, and fast-changing environments. Therefore, this paper proposes a framework for the volumetric modeling and visualization of large-scale urban environments. Besides an architecture and the right mix of algorithms for the task, two compression strategies for volumetric models as well as a data-quality-based approach for the import of range measurements are proposed. The capabilities of the framework are shown on a mobile laser scanning dataset of the Technical University of Munich. Furthermore, the loss introduced by the compression techniques is evaluated, and their memory consumption is compared to that of raw point clouds. The presented results show that generation, storage, and real-time rendering of even large urban models are feasible, even with off-the-shelf hardware.
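
    A toy illustration of why volumetric models compress well: when an occupancy grid is stored as a nested tree, any block whose cells all share one state collapses to a single leaf (the octree idea, shown here on a 1-D grid of 16 "voxels" for brevity; the grid contents are made up).

```python
def compress(cells):
    """Recursively collapse uniform blocks into single leaves (octree-style)."""
    if len(cells) == 1 or all(c == cells[0] for c in cells):
        return cells[0]                      # uniform block -> one leaf
    mid = len(cells) // 2
    return (compress(cells[:mid]), compress(cells[mid:]))

def count_leaves(node):
    if isinstance(node, tuple):
        return sum(count_leaves(child) for child in node)
    return 1

grid = [1] * 8 + [0, 1, 0, 1] + [0] * 4      # 16 voxels: occupied/free
tree = compress(grid)
print(count_leaves(tree), "leaves for", len(grid), "voxels")
```

    Large homogeneous regions (empty air, solid building interiors) dominate urban volumes, which is why such pruning, together with the lossy strategies evaluated in the paper, keeps memory consumption competitive with raw point clouds.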

  1. Scene-based nonuniformity correction and enhancement: pixel statistics and subpixel motion.

    PubMed

    Zhao, Wenyi; Zhang, Chao

    2008-07-01

    We propose a framework for scene-based nonuniformity correction (NUC) and nonuniformity correction and enhancement (NUCE), which is required for focal-plane-array-like sensors to obtain clean, enhanced-quality images. The core of the proposed framework is a novel registration-based nonuniformity-correction super-resolution (NUCSR) method that is bootstrapped by statistical scene-based NUC methods. Based on a comprehensive imaging model and accurate parametric motion estimation, we are able to remove severe/structured nonuniformity and, in the presence of subpixel motion, to simultaneously improve image resolution. One important feature of our NUCSR method is the adoption of a parametric motion model that allows us to (1) handle many practical scenarios where parametric motions are present and (2) carry out, in principle, perfect super-resolution by exploiting available subpixel motions. Experiments with real data demonstrate the efficiency of the proposed NUCE framework and the effectiveness of the NUCSR method.
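
    To see what statistical scene-based NUC does, here is a constant-statistics sketch (a common statistical baseline of the kind used for bootstrapping, not the paper's registration-based NUCSR): assuming every detector observes the same temporal scene statistics, each pixel's temporal mean and standard deviation yield its offset and gain. The toy gains, offsets, and scene values are made up.

```python
import statistics

def estimate_gain_offset(frames_per_pixel):
    """Per-pixel temporal mean -> offset estimate; temporal std -> gain estimate."""
    gains, offsets = [], []
    for samples in frames_per_pixel:
        offsets.append(statistics.fmean(samples))
        gains.append(statistics.pstdev(samples))
    return gains, offsets

def correct(raw, gains, offsets):
    return [(r - o) / g for r, g, o in zip(raw, gains, offsets)]

# Toy data: the true scene passes through pixel-wise response y = g*x + o.
true_gain, true_offset = [2.0, 0.5], [10.0, -3.0]
scene_samples = [1.0, 2.0, 3.0, 4.0]
frames = [[g * s + o for s in scene_samples] for g, o in zip(true_gain, true_offset)]
gains, offsets = estimate_gain_offset(frames)

# Both pixels view the same new scene value; after correction they agree.
raw = [g * 2.5 + o for g, o in zip(true_gain, true_offset)]
corrected = correct(raw, gains, offsets)
print([round(c, 3) for c in corrected])
```

    The corrected value reduces to the standardized scene value (s − mean)/std, independent of the pixel, which is exactly the fixed-pattern removal that bootstraps the registration-based stage.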

  2. A Lightweight Hierarchical Activity Recognition Framework Using Smartphone Sensors

    PubMed Central

    Han, Manhyung; Bang, Jae Hun; Nugent, Chris; McClean, Sally; Lee, Sungyoung

    2014-01-01

    Activity recognition for the purpose of inferring a user's intentions from multimodal sensors is becoming a widely researched topic, largely owing to the prevalence of the smartphone. Previous studies have reported the difficulty of recognizing life-logs using only a smartphone, due to the challenges of activity modeling and real-time recognition. In addition, recognizing life-logs is difficult due to the absence of an established framework that enables the use of different sources of sensor data. In this paper, we propose a smartphone-based Hierarchical Activity Recognition Framework which extends the Naïve Bayes approach for activity modeling and real-time activity recognition. The proposed algorithm demonstrates higher accuracy than the Naïve Bayes approach and also enables the recognition of a user's activities within a mobile environment. The proposed algorithm can classify fifteen activities with an average classification accuracy of 92.96%. PMID:25184486
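    The flat Gaussian Naïve Bayes classifier that the proposed hierarchical framework extends can be sketched minimally as follows; the one-dimensional feature and the two "activities" are synthetic stand-ins for real smartphone sensor data, and the paper's hierarchy is not shown.

```python
import numpy as np

class GaussianNB:
    """Minimal Gaussian Naive Bayes (the flat baseline the paper extends;
    its hierarchical extension and smartphone features are not shown here)."""
    def fit(self, X, y):
        self.classes = np.unique(y)
        self.mu = np.array([X[y == c].mean(axis=0) for c in self.classes])
        self.var = np.array([X[y == c].var(axis=0) + 1e-9 for c in self.classes])
        self.prior = np.array([np.mean(y == c) for c in self.classes])
        return self

    def predict(self, X):
        # log p(c|x) is proportional to log p(c) + sum_d log N(x_d; mu_cd, var_cd)
        ll = -0.5 * (np.log(2 * np.pi * self.var)[None]
                     + (X[:, None, :] - self.mu[None]) ** 2 / self.var[None]).sum(-1)
        return self.classes[np.argmax(ll + np.log(self.prior), axis=1)]

# Two synthetic "activities" separable by a single accelerometer-like feature.
X = np.array([[0.1], [0.2], [0.15], [2.0], [2.1], [1.9]])
y = np.array([0, 0, 0, 1, 1, 1])
pred = GaussianNB().fit(X, y).predict(np.array([[0.12], [2.05]]))
```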

  3. A robust multi-kernel change detection framework for detecting leaf beetle defoliation using Landsat 7 ETM+ data

    NASA Astrophysics Data System (ADS)

    Anees, Asim; Aryal, Jagannath; O'Reilly, Małgorzata M.; Gale, Timothy J.; Wardlaw, Tim

    2016-12-01

    A robust non-parametric framework, based on multiple Radial Basis Function (RBF) kernels, is proposed in this study for detecting land/forest cover changes using Landsat 7 ETM+ images. One widely used approach is to compute change vectors (a difference image) and use a supervised classifier to differentiate between change and no-change. Bayesian classifiers, e.g. the Maximum Likelihood Classifier (MLC) and Naive Bayes (NB), are widely used probabilistic classifiers that assume parametric models, e.g. Gaussian functions, for the class-conditional distributions. However, their performance can be limited if the data set deviates from the assumed model. The proposed framework exploits the useful properties of the Least Squares Probabilistic Classifier (LSPC) formulation, i.e. its non-parametric and probabilistic nature, to model the class posterior probabilities of the difference image using a linear combination of a large number of Gaussian kernels. To this end, a simple technique based on 10-fold cross-validation is also proposed for tuning model parameters automatically instead of selecting a (possibly) suboptimal combination from pre-specified lists of values. The proposed framework has been tested and compared with the Support Vector Machine (SVM) and NB for detection of defoliation caused by leaf beetles (Paropsisterna spp.) in Eucalyptus nitens and Eucalyptus globulus plantations of two test areas in Tasmania, Australia, using raw bands and band-combination indices of Landsat 7 ETM+. Owing to its multi-kernel non-parametric formulation and probabilistic nature, the LSPC outperforms the parametric NB with its Gaussian assumption in the change detection framework, with Overall Accuracy (OA) ranging from 93.6% (κ = 0.87) to 97.4% (κ = 0.94) against 85.3% (κ = 0.69) to 93.4% (κ = 0.85), and is more robust to changing data distributions. Its performance was comparable to SVM, with the added advantages of being probabilistic and of handling multi-class problems naturally in its original formulation.
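    The core LSPC idea, modeling class posteriors as a least-squares-fitted linear combination of Gaussian kernels, can be sketched as below. This is a stripped-down illustration under strong assumptions: a single fixed kernel width, no 10-fold cross-validation, and a toy one-band "difference image" in place of real Landsat data.

```python
import numpy as np

def rbf(X, C, sigma):
    """Gaussian (RBF) kernel matrix between rows of X and centres C."""
    d2 = ((X[:, None, :] - C[None]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def fit_lspc(X, y, sigma=0.5, lam=1e-3):
    """Least-squares fit of p(c|x) ~ sum_j a_cj k(x, x_j): a stripped-down
    version of the LSPC idea (one kernel width, no cross-validation)."""
    K = rbf(X, X, sigma)                       # kernel design matrix
    Y = np.eye(y.max() + 1)[y]                 # one-hot class indicators
    A = np.linalg.solve(K.T @ K + lam * np.eye(len(X)), K.T @ Y)
    return X, sigma, A

def predict(model, Xnew):
    C, sigma, A = model
    P = np.clip(rbf(Xnew, C, sigma) @ A, 0, None)
    return P / P.sum(axis=1, keepdims=True)    # renormalised posteriors

# Toy "difference image" pixels: no-change near 0, change near 2.
X = np.array([[0.0], [0.1], [-0.1], [2.0], [2.1], [1.9]])
y = np.array([0, 0, 0, 1, 1, 1])
post = predict(fit_lspc(X, y), np.array([[0.05], [2.0]]))
```

    Because the model is a kernel expansion fitted by regularised least squares, no parametric form is imposed on the class-conditional distributions, which is the property the abstract contrasts with NB.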

  4. A Bayesian framework based on a Gaussian mixture model and radial-basis-function Fisher discriminant analysis (BayGmmKda V1.1) for spatial prediction of floods

    NASA Astrophysics Data System (ADS)

    Tien Bui, Dieu; Hoang, Nhat-Duc

    2017-09-01

    In this study, a probabilistic model, named BayGmmKda, is proposed for flood susceptibility assessment in a study area in central Vietnam. The new model is a Bayesian framework constructed from a combination of a Gaussian mixture model (GMM), radial-basis-function Fisher discriminant analysis (RBFDA), and a geographic information system (GIS) database. In the Bayesian framework, the GMM is used to model the data distribution of flood-influencing factors in the GIS database, whereas RBFDA is utilized to construct a latent variable that aims at enhancing the model performance. As a result, the posterior probabilistic output of the BayGmmKda model is used as a flood susceptibility index. Experimental results showed that the proposed hybrid framework is superior to other benchmark models, including the adaptive neuro-fuzzy inference system and the support vector machine. To facilitate the model's implementation, a software program for BayGmmKda has been developed in MATLAB. The BayGmmKda program can accurately establish a flood susceptibility map for the study region. Accordingly, local authorities can overlay this susceptibility map onto various land-use maps for the purpose of land-use planning or management.
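    The Bayesian use of class-conditional GMMs can be illustrated with a one-dimensional toy: mixture densities for a "flood" and a "non-flood" class are combined via Bayes' rule into a susceptibility index. All mixture parameters below are invented for illustration; the paper fits its GMMs to GIS factor data and adds an RBFDA latent variable, which is omitted here.

```python
import numpy as np

def gauss(x, mu, var):
    """Univariate normal density N(x; mu, var)."""
    return np.exp(-(x - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

def gmm_pdf(x, weights, mus, vars_):
    """Mixture density sum_k w_k N(x; mu_k, var_k)."""
    return sum(w * gauss(x, m, v) for w, m, v in zip(weights, mus, vars_))

def flood_posterior(x, prior_flood=0.5):
    """Posterior P(flood|x) from class-conditional GMMs via Bayes' rule.
    All parameters here are made up for illustration only."""
    p_f = gmm_pdf(x, [0.6, 0.4], [2.0, 3.0], [0.3, 0.5])   # flood class
    p_n = gmm_pdf(x, [0.7, 0.3], [0.0, 1.0], [0.4, 0.4])   # non-flood class
    num = prior_flood * p_f
    return num / (num + (1 - prior_flood) * p_n)

low, high = flood_posterior(0.2), flood_posterior(2.5)
```

    A factor value near the non-flood mixture yields a low susceptibility index, and one near the flood mixture a high index; mapping this posterior over a study region is what produces a susceptibility map.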

  5. Clique Relaxations in Biological and Social Network Analysis Foundations and Algorithms

    DTIC Science & Technology

    2015-10-26

    This project presents a study of clique relaxation models arising in biological and social networks. It analyzes the elementary clique-defining properties implicitly exploited in the available clique relaxation models and proposes a taxonomic framework based on these properties.

  6. Boolean Dynamic Modeling Approaches to Study Plant Gene Regulatory Networks: Integration, Validation, and Prediction.

    PubMed

    Velderraín, José Dávila; Martínez-García, Juan Carlos; Álvarez-Buylla, Elena R

    2017-01-01

    Mathematical models based on dynamical systems theory are well-suited tools for integrating available molecular experimental data into coherent frameworks in order to propose hypotheses about the cooperative regulatory mechanisms driving developmental processes. Computational analysis of the proposed models using well-established methods enables testing these hypotheses by contrasting predictions with observations. Within such a framework, Boolean gene regulatory network dynamical models have been used extensively in modeling plant development. Boolean models are simple and intuitively appealing, making them ideal tools for collaborative efforts between theorists and experimentalists. In this chapter we present protocols used in our group for the study of diverse plant developmental processes. We focus on conceptual clarity and practical implementation, providing directions to the corresponding technical literature.
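    A Boolean gene regulatory network model of the kind discussed reduces to logical update rules over binary gene states. The toy two-gene mutual-repression circuit below (not a model from the chapter) shows a synchronous update step and a fixed-point search.

```python
def step(state, rules):
    """One synchronous update of a Boolean gene regulatory network:
    every gene's next state is computed from the current global state."""
    return {gene: rule(state) for gene, rule in rules.items()}

def attractor(state, rules, max_steps=100):
    """Iterate until a fixed point is reached (cyclic attractors are
    not handled in this minimal sketch)."""
    for _ in range(max_steps):
        nxt = step(state, rules)
        if nxt == state:
            return state
        state = nxt
    return state

# Toy two-gene mutual-repression circuit (a classic bistable switch).
rules = {
    "A": lambda s: not s["B"],   # A is expressed unless B represses it
    "B": lambda s: not s["A"],   # B is expressed unless A represses it
}
fixed = attractor({"A": True, "B": False}, rules)
```

    The two fixed points of this circuit ({A on, B off} and {A off, B on}) are the attractors; in developmental models such attractors are commonly interpreted as cell fates.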

  7. Feedback control by online learning an inverse model.

    PubMed

    Waegeman, Tim; Wyffels, Francis; Schrauwen, Francis

    2012-10-01

    A model, predictor, or error estimator is often used by a feedback controller to control a plant. Creating such a model is difficult when the plant exhibits nonlinear behavior. In this paper, a novel online learning control framework is proposed that does not require explicit knowledge about the plant. This framework uses two learning modules, one for creating an inverse model, and the other for actually controlling the plant. Except for their inputs, they are identical. The inverse model learns by the exploration performed by the not yet fully trained controller, while the actual controller is based on the currently learned model. The proposed framework allows fast online learning of an accurate controller. The controller can be applied on a broad range of tasks with different dynamic characteristics. We validate this claim by applying our control framework on several control tasks: 1) the heating tank problem (slow nonlinear dynamics); 2) flight pitch control (slow linear dynamics); and 3) the balancing problem of a double inverted pendulum (fast linear and nonlinear dynamics). The results of these experiments show that fast learning and accurate control can be achieved. Furthermore, a comparison is made with some classical control approaches, and observations concerning convergence and stability are made.
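    The idea of learning an inverse model online from the controller's own exploration can be sketched with a scalar toy plant. This heavily simplifies the paper's two-module framework (which handles dynamic, nonlinear plants with richer learning modules); the learning rule shown is a plain normalised LMS update, and all constants are illustrative.

```python
import numpy as np

def plant(u):
    """Unknown (to the controller) plant; a static linear map for simplicity."""
    return 2.0 * u

# Online inverse-model learning: after each interaction, regress the applied
# input u on the observed output y, so the inverse model learns u ~ w*y.
w, lr = 0.0, 0.5
rng = np.random.default_rng(2)
for _ in range(300):
    u = w * rng.uniform(-1, 1) + 0.5 * rng.standard_normal()  # act + explore
    y = plant(u)
    w += lr * (u - w * y) * y / (1.0 + y * y)  # normalised LMS update

control = w * 1.0  # inverse-model input proposed for a desired output of 1.0
```

    As in the paper's scheme, exploration by the not-yet-trained controller supplies the (u, y) pairs from which the inverse mapping is learned; here the learned gain converges to the true inverse 0.5.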

  8. Statistical label fusion with hierarchical performance models

    PubMed Central

    Asman, Andrew J.; Dagley, Alexander S.; Landman, Bennett A.

    2014-01-01

    Label fusion is a critical step in many image segmentation frameworks (e.g., multi-atlas segmentation) as it provides a mechanism for generalizing a collection of labeled examples into a single estimate of the underlying segmentation. In the multi-label case, typical label fusion algorithms treat all labels equally – fully neglecting the known, yet complex, anatomical relationships exhibited in the data. To address this problem, we propose a generalized statistical fusion framework using hierarchical models of rater performance. Building on the seminal work in statistical fusion, we reformulate the traditional rater performance model from a multi-tiered hierarchical perspective. This new approach provides a natural framework for leveraging known anatomical relationships and accurately modeling the types of errors that raters (or atlases) make within a hierarchically consistent formulation. Herein, we describe several contributions. First, we derive a theoretical advancement to the statistical fusion framework that enables the simultaneous estimation of multiple (hierarchical) performance models within the statistical fusion context. Second, we demonstrate that the proposed hierarchical formulation is highly amenable to the state-of-the-art advancements that have been made to the statistical fusion framework. Lastly, in an empirical whole-brain segmentation task we demonstrate substantial qualitative and significant quantitative improvement in overall segmentation accuracy. PMID:24817809
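    Label fusion in its simplest form is performance-weighted voting. The sketch below (toy labels and weights, far simpler than the paper's hierarchical rater-performance model) conveys the basic mechanism that statistical fusion generalizes.

```python
import numpy as np

def weighted_label_fusion(labels, weights):
    """Fuse per-voxel labels from several raters by performance-weighted
    voting: a bare-bones stand-in for statistical fusion (the paper's
    hierarchical rater-performance model is far richer than this)."""
    labels = np.asarray(labels)               # shape: (raters, voxels)
    n_classes = labels.max() + 1
    votes = np.zeros((n_classes, labels.shape[1]))
    for r, w in enumerate(weights):
        for c in range(n_classes):
            votes[c] += w * (labels[r] == c)  # weighted vote per class
    return votes.argmax(axis=0)

# Three raters over five voxels; rater 2 is less reliable (lower weight).
raters = [[0, 1, 1, 0, 2],
          [0, 1, 1, 0, 2],
          [1, 0, 1, 2, 0]]
fused = weighted_label_fusion(raters, weights=[1.0, 1.0, 0.4])
```

    Statistical fusion replaces the fixed scalar weights with estimated per-rater (or per-atlas) performance models, which is what the hierarchical formulation in the paper refines further.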

  9. Concept-oriented indexing of video databases: toward semantic sensitive retrieval and browsing.

    PubMed

    Fan, Jianping; Luo, Hangzai; Elmagarmid, Ahmed K

    2004-07-01

    Digital video now plays an important role in medical education, health care, telemedicine and other medical applications. Several content-based video retrieval (CBVR) systems have been proposed in the past, but they still suffer from the following challenging problems: semantic gap, semantic video concept modeling, semantic video classification, and concept-oriented video database indexing and access. In this paper, we propose a novel framework to make some advances toward the final goal to solve these problems. Specifically, the framework includes: 1) a semantic-sensitive video content representation framework by using principal video shots to enhance the quality of features; 2) semantic video concept interpretation by using flexible mixture model to bridge the semantic gap; 3) a novel semantic video-classifier training framework by integrating feature selection, parameter estimation, and model selection seamlessly in a single algorithm; and 4) a concept-oriented video database organization technique through a certain domain-dependent concept hierarchy to enable semantic-sensitive video retrieval and browsing.

  10. A Hierarchical Building Segmentation in Digital Surface Models for 3D Reconstruction.

    PubMed

    Yan, Yiming; Gao, Fengjiao; Deng, Shupei; Su, Nan

    2017-01-24

    In this study, a hierarchical method for segmenting buildings in a digital surface model (DSM), used in a novel framework for 3D reconstruction, is proposed. Most 3D reconstructions of buildings are model-based. However, these methods rely heavily on the completeness of offline-constructed building models, and this completeness is not easily guaranteed, since buildings in modern cities can be of a wide variety of types. Therefore, a model-free framework using a high-precision DSM and texture images of buildings is introduced. There are two key problems with this framework. The first is how to accurately extract the buildings from the DSM. Most segmentation methods are limited either by terrain factors or by a difficult choice of parameter settings. A level-set method is employed to roughly find the building regions in the DSM, and then a recently proposed 'occlusions of random textures model' is used to enhance the local segmentation of the buildings. The second problem is how to generate the facades of buildings. Synergizing with the corresponding texture images, we propose a roof-contour-guided interpolation of building facades. The 3D reconstruction results achieved with airborne-like and satellite images are compared. Experiments show that the segmentation method performs well, that 3D reconstruction is easily performed with our framework, and that better visualization results are obtained with airborne-like images, which can further be replaced by UAV images.

  11. Real-Time Reliability Verification for UAV Flight Control System Supporting Airworthiness Certification.

    PubMed

    Xu, Haiyang; Wang, Ping

    2016-01-01

    In order to verify the real-time reliability of an unmanned aerial vehicle (UAV) flight control system and comply with the airworthiness certification standard, we proposed a model-based integration framework for the modeling and verification of time properties. Drawing on the advantages of MARTE, this framework uses class diagrams to create the static model of the software system and state charts to create the dynamic model. Under the defined transformation rules, the MARTE model can be transformed into a formal integrated model, and the different parts of the model can be verified using existing formal tools. For the real-time specifications of the software system, we also proposed a generation algorithm for temporal logic formulas, which automatically extracts real-time properties from a time-sensitive live sequence chart (TLSC). Finally, we modeled the simplified flight control system of a UAV to check its real-time properties. The results showed that the framework can be used to create the system model, as well as to precisely analyze and verify the real-time reliability of the UAV flight control system.

  12. Real-Time Reliability Verification for UAV Flight Control System Supporting Airworthiness Certification

    PubMed Central

    Xu, Haiyang; Wang, Ping

    2016-01-01

    In order to verify the real-time reliability of an unmanned aerial vehicle (UAV) flight control system and comply with the airworthiness certification standard, we proposed a model-based integration framework for the modeling and verification of time properties. Drawing on the advantages of MARTE, this framework uses class diagrams to create the static model of the software system and state charts to create the dynamic model. Under the defined transformation rules, the MARTE model can be transformed into a formal integrated model, and the different parts of the model can be verified using existing formal tools. For the real-time specifications of the software system, we also proposed a generation algorithm for temporal logic formulas, which automatically extracts real-time properties from a time-sensitive live sequence chart (TLSC). Finally, we modeled the simplified flight control system of a UAV to check its real-time properties. The results showed that the framework can be used to create the system model, as well as to precisely analyze and verify the real-time reliability of the UAV flight control system. PMID:27918594

  13. A Model Driven Framework to Address Challenges in a Mobile Learning Environment

    ERIC Educational Resources Information Center

    Khaddage, Ferial; Christensen, Rhonda; Lai, Wing; Knezek, Gerald; Norris, Cathie; Soloway, Elliot

    2015-01-01

    In this paper a review of the pedagogical, technological, policy and research challenges and concepts underlying mobile learning is presented, followed by a brief description of categories of implementations. A model Mobile learning framework and dynamic criteria for mobile learning implementations are proposed, along with a case study of one site…

  14. Demographic Accounting and Model-Building. Education and Development Technical Reports.

    ERIC Educational Resources Information Center

    Stone, Richard

    This report describes and develops a model for coordinating a variety of demographic and social statistics within a single framework. The framework proposed, together with its associated methods of analysis, serves both general and specific functions. The general aim of these functions is to give numerical definition to the pattern of society and…

  15. Narrative review of frameworks for translating research evidence into policy and practice.

    PubMed

    Milat, Andrew J; Li, Ben

    2017-02-15

    A significant challenge in research translation is that interested parties interpret and apply the associated terms and conceptual frameworks in different ways. The purpose of this review was to: a) examine different research translation frameworks; b) examine the similarities and differences between the frameworks; and c) identify key strengths and weaknesses of the models when they are applied in practice. The review involved a keyword search of PubMed. The search string was (translational research OR knowledge translation OR evidence to practice) AND (framework OR model OR theory) AND (public health OR health promotion OR medicine). Included studies were published in English between January 1990 and December 2014, and described frameworks, models or theories associated with research translation. The final review included 98 papers, and 41 different frameworks and models were identified. The most frequently applied knowledge translation framework in the literature was RE-AIM, followed by the knowledge translation continuum or 'T' models, the Knowledge to Action framework, the PARiHS framework, evidence based public health models, and the stages of research and evaluation model. The models identified in this review stem from different fields, including implementation science, basic and medical sciences, health services research and public health, and propose different but related pathways to closing the research-practice gap.

  16. A Hierarchical Learning Control Framework for an Aerial Manipulation System

    NASA Astrophysics Data System (ADS)

    Ma, Le; Chi, Yanxun; Li, Jiapeng; Li, Zhongsheng; Ding, Yalei; Liu, Lixing

    2017-07-01

    A hierarchical learning control framework for an aerial manipulation system is proposed. First, the mechanical design of the aerial manipulation system is introduced and analyzed, and the kinematics and dynamics are modeled based on the Newton-Euler equations. Second, the hierarchical learning framework for this system is presented, in which the flight platform and the manipulator are controlled by separate controllers. RBF (Radial Basis Function) neural networks are employed for parameter estimation and control. Simulations and experiments demonstrate that the proposed methods are effective.

  17. Generic framework for the secure Yuen 2000 quantum-encryption protocol employing the wire-tap channel approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mihaljevic, Miodrag J.

    2007-05-15

    It is shown that the security of the Yuen 2000 (Y00) quantum-encryption protocol against known-plaintext attacks can be analyzed via the wire-tap channel model, assuming that the heterodyne measurement yields the sample for security evaluation. Employing reported results on the wire-tap channel, a generic framework is proposed for developing secure Y00 instantiations. The proposed framework employs a dedicated encoding which, together with the inherent quantum noise at the attacker's side, provides Y00 security.

  18. Predictor-Based Model Reference Adaptive Control

    NASA Technical Reports Server (NTRS)

    Lavretsky, Eugene; Gadient, Ross; Gregory, Irene M.

    2010-01-01

    This paper is devoted to the design and analysis of a predictor-based model reference adaptive control. Stable adaptive laws are derived using the Lyapunov framework. The proposed architecture is compared with the now-classical model reference adaptive control. A simulation example is presented in which numerical evidence indicates that the proposed controller yields improved transient characteristics.
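    For background, the now-classical (non-predictor-based) first-order MRAC scheme that the paper compares against can be simulated in a few lines; all plant, reference-model, and adaptation-gain values below are illustrative, and the paper's predictor-based architecture is not reproduced.

```python
import numpy as np

# Classical first-order MRAC, Euler-discretised. Plant: xdot = a*x + b*u,
# with a and b unknown to the adaptive laws (only sign(b) is assumed known).
a, b = 1.0, 3.0                 # true plant parameters (open-loop unstable)
am, bm = -4.0, 4.0              # stable reference model xm_dot = am*xm + bm*r
gamma, dt = 10.0, 1e-3          # adaptation gain, integration step
x = xm = 0.0
kx = kr = 0.0                   # adaptive feedback / feedforward gains
for _ in range(20_000):         # 20 s of simulated time
    r = 1.0                     # step reference command
    u = kx * x + kr * r
    e = x - xm                  # tracking error drives adaptation
    kx -= gamma * e * x * np.sign(b) * dt   # Lyapunov-based adaptive laws
    kr -= gamma * e * r * np.sign(b) * dt
    x += (a * x + b * u) * dt
    xm += (am * xm + bm * r) * dt

# Matching gains would be kx* = (am - a)/b and kr* = bm/b; the Lyapunov
# argument guarantees e -> 0 even if the gains themselves do not converge.
```

    The predictor-based variant studied in the paper modifies this architecture to improve the transient behaviour of e; that modification is what the abstract's simulation example evaluates.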

  19. Towards a Model of Technology Adoption: A Conceptual Model Proposition

    NASA Astrophysics Data System (ADS)

    Costello, Pat; Moreton, Rob

    A conceptual model for Information Communication Technology (ICT) adoption by Small and Medium Enterprises (SMEs) is proposed. The research uses several ICT adoption models as its basis, with theoretical underpinning provided by the Diffusion of Innovation theory and the Technology Acceptance Model (TAM). Taking an exploratory research approach, the model was investigated amongst 200 SMEs whose core business is ICT. Evidence from this study demonstrates that these SMEs face the same issues as all other industry sectors. This work points out weaknesses in SMEs' environments regarding ICT adoption and suggests what they may need to do to increase the success rate of any proposed adoption. The methodology for development of the framework is described and recommendations are made for improved Government-led ICT adoption initiatives. Application of the general methodology has resulted in new opportunities to embed the surrounding ethos and culture into the framework of new projects developed as a result of Government intervention. A conceptual model is proposed that may lead to a deeper understanding of the issues under consideration.

  20. Geomechanical Modeling of Gas Hydrate Bearing Sediments

    NASA Astrophysics Data System (ADS)

    Sanchez, M. J.; Gai, X., Sr.

    2015-12-01

    This contribution focuses on an advanced geomechanical model for methane hydrate-bearing soils that is based on concepts of elasto-plasticity for strain hardening/softening soils and incorporates bonding and damage effects. The core of the proposed model includes a hierarchical single-surface critical state framework, sub-loading concepts for modeling the plastic strains generally observed inside the yield surface, and a hydrate enhancement factor to account for the cementing effects of hydrates in sediments. The proposed framework has been validated against recently published experiments involving both synthetic and natural hydrate soils, as well as different sediment types (i.e., different hydrate saturations and hydrate morphologies) and confinement conditions. The performance of the model in these case studies was very satisfactory.

  1. Psychosocial Pain Management Moderation: The Limit, Activate, and Enhance Model.

    PubMed

    Day, Melissa A; Ehde, Dawn M; Jensen, Mark P

    2015-10-01

    There is a growing emphasis in the pain literature on understanding the following second-order research questions: Why do psychosocial pain treatments work? For whom do various treatments work? This critical review summarizes research that addresses the latter question and proposes a moderation model to help guide future research. A theoretical moderation framework for matching individuals to specific psychosocial pain interventions has been lacking. However, several such frameworks have been proposed in the broad psychotherapy and implementation science literature. Drawing on these theories and adapting them specifically for psychosocial pain treatment, here we propose a Limit, Activate, and Enhance model of pain treatment moderation. This model is unique in that it includes algorithms not only for matching treatments on the basis of patient weaknesses but also for directing patients to interventions that build on their strengths. Critically, this model provides a basis for specific a priori hypothesis generation, and a selection of the possible hypotheses drawn from the model are proposed and discussed. Future research considerations are presented that could refine and expand the model based on theoretically driven empirical evidence. The Limit, Activate, and Enhance model presented here is a theoretically derived framework that provides an a priori basis for hypothesis generation regarding psychosocial pain treatment moderators. The model will advance moderation research via its unique focus on matching patients to specific treatments that (1) limit maladaptive responses, (2) activate adaptive responses, and (3) enhance treatment outcomes based on patient strengths and resources. Copyright © 2015 American Pain Society. Published by Elsevier Inc. All rights reserved.

  2. Developing and Implementing a Framework of Participatory Simulation for Mobile Learning Using Scaffolding

    ERIC Educational Resources Information Center

    Yin, Chengjiu; Song, Yanjie; Tabata, Yoshiyuki; Ogata, Hiroaki; Hwang, Gwo-Jen

    2013-01-01

    This paper proposes a conceptual framework, scaffolding participatory simulation for mobile learning (SPSML), used on mobile devices for helping students learn conceptual knowledge in the classroom. As the pedagogical design, the framework adopts an experiential learning model, which consists of five sequential but cyclic steps: the initial stage,…

  3. QoS Composition and Decomposition Model in Uniframe

    DTIC Science & Technology

    2003-08-01

    Approaches to analyzing non-functional requirements at the early design phase, such as the Architecture Tradeoff Analysis Method, are discussed. In [22], the Parmenides framework, an architecture-based framework, is proposed for this analysis.

  4. Episodic Laryngeal Breathing Disorders: Literature Review and Proposal of Preliminary Theoretical Framework.

    PubMed

    Shembel, Adrianna C; Sandage, Mary J; Verdolini Abbott, Katherine

    2017-01-01

    The purposes of this literature review were (1) to identify and assess frameworks for the clinical characterization of episodic laryngeal breathing disorders (ELBD) and their subtypes, (2) to integrate concepts from these frameworks into a novel theoretical paradigm, and (3) to provide a preliminary algorithm for classifying clinical features of ELBD for future study of its clinical manifestations and underlying pathophysiological mechanisms. This is a literature review. Peer-reviewed literature from 1983 to 2015 pertaining to models for ELBD was searched using PubMed, Ovid, ProQuest, the Cochrane Database of Systematic Reviews, and Google Scholar. Theoretical models for ELBD were identified, evaluated, and integrated into a novel comprehensive framework. Consensus across three salient models provided a working definition and inclusionary criteria for ELBD within the new framework; inconsistencies and discrepancies within the models provided an analytic platform for future research. Comparison among the three conceptual models, (1) irritable larynx syndrome, (2) dichotomous triggers, and (3) periodic occurrence of laryngeal obstruction, showed that the models uniformly consider ELBD to involve episodic laryngeal obstruction causing dyspnea. The models differed in their description of the source of dyspnea, in their inclusion of corollary behaviors, in their inclusion of other laryngeal-based behaviors (e.g., cough), and in the types of triggers. The proposed integrated theoretical framework for ELBD provides a preliminary systematic platform for identifying key clinical feature patterns indicative of ELBD and associated clinical subgroups. This algorithmic paradigm should evolve with better understanding of this spectrum of disorders and its underlying pathophysiological mechanisms. Copyright © 2017 The Voice Foundation. Published by Elsevier Inc. All rights reserved.

  5. Overarching framework for data-based modelling

    NASA Astrophysics Data System (ADS)

    Schelter, Björn; Mader, Malenka; Mader, Wolfgang; Sommerlade, Linda; Platt, Bettina; Lai, Ying-Cheng; Grebogi, Celso; Thiel, Marco

    2014-02-01

    One of the main modelling paradigms for complex physical systems is networks. When estimating the network structure from measured signals, several assumptions, such as stationarity, are typically made in the estimation process. Violating these assumptions renders standard analysis techniques fruitless. We here propose a framework to estimate the network structure from measurements of arbitrary non-linear, non-stationary, stochastic processes, together with a rigorous mathematical theory that underlies it. Based on this theory, we present a highly efficient algorithm and the corresponding statistics, which are directly applicable to measured signals. We demonstrate its performance in a simulation study. In experiments on transitions between vigilance stages in rodents, we infer small network structures with complex, time-dependent interactions; this suggests biomarkers for such transitions, key to understanding and diagnosing numerous diseases such as dementia. We argue that the suggested framework combines features that approaches followed so far have lacked.

  6. Linking service quality, customer satisfaction, and behavioral intention.

    PubMed

    Woodside, A G; Frey, L L; Daly, R T

    1989-12-01

    Based on the service quality and script theory literature, a framework of relationships among service quality, customer satisfaction, and behavioral intention for service purchases is proposed. Specific models are developed from the general framework and the models are applied and tested for the highly complex and divergent consumer service of overnight hospital care. Service quality, customer satisfaction, and behavioral intention data were collected from recent patients of two hospitals. The findings support the specific models and general framework. Implications for theory, service marketing, and future research are discussed.

  7. A Survey of Statistical Models for Reverse Engineering Gene Regulatory Networks

    PubMed Central

    Huang, Yufei; Tienda-Luna, Isabel M.; Wang, Yufeng

    2009-01-01

    Statistical models for reverse engineering gene regulatory networks are surveyed in this article. To provide readers with a system-level view of the modeling issues in this research, a graphical modeling framework is proposed. This framework serves as the scaffolding on which the review of different models can be systematically assembled. Based on the framework, we review many existing models for many aspects of gene regulation; the pros and cons of each model are discussed. In addition, network inference algorithms are also surveyed under the graphical modeling framework by the categories of point solutions and probabilistic solutions and the connections and differences among the algorithms are provided. This survey has the potential to elucidate the development and future of reverse engineering GRNs and bring statistical signal processing closer to the core of this research. PMID:20046885

  8. A Hierarchical Building Segmentation in Digital Surface Models for 3D Reconstruction

    PubMed Central

    Yan, Yiming; Gao, Fengjiao; Deng, Shupei; Su, Nan

    2017-01-01

    In this study, a hierarchical method for segmenting buildings in a digital surface model (DSM), used in a novel framework for 3D reconstruction, is proposed. Most 3D reconstructions of buildings are model-based. However, these methods rely heavily on the completeness of offline-constructed building models, and this completeness is not easily guaranteed, since buildings in modern cities can be of a wide variety of types. Therefore, a model-free framework using a high-precision DSM and texture images of buildings is introduced. There are two key problems with this framework. The first is how to accurately extract the buildings from the DSM. Most segmentation methods are limited either by terrain factors or by a difficult choice of parameter settings. A level-set method is employed to roughly find the building regions in the DSM, and then a recently proposed ‘occlusions of random textures model’ is used to enhance the local segmentation of the buildings. The second problem is how to generate the facades of buildings. Synergizing with the corresponding texture images, we propose a roof-contour-guided interpolation of building facades. The 3D reconstruction results achieved with airborne-like and satellite images are compared. Experiments show that the segmentation method performs well, that 3D reconstruction is easily performed with our framework, and that better visualization results are obtained with airborne-like images, which can further be replaced by UAV images. PMID:28125018

  9. A model for AGN variability on multiple time-scales

    NASA Astrophysics Data System (ADS)

    Sartori, Lia F.; Schawinski, Kevin; Trakhtenbrot, Benny; Caplar, Neven; Treister, Ezequiel; Koss, Michael J.; Urry, C. Megan; Zhang, C. E.

    2018-05-01

    We present a framework to link and describe active galactic nuclei (AGN) variability on a wide range of time-scales, from days to billions of years. In particular, we concentrate on the AGN variability features related to changes in black hole fuelling and accretion rate. In our framework, the variability features observed in different AGN at different time-scales may be explained as realisations of the same underlying statistical properties. In this context, we propose a model to simulate the evolution of AGN light curves with time based on the probability density function (PDF) and power spectral density (PSD) of the Eddington ratio (L/LEdd) distribution. Motivated by general galaxy population properties, we propose that the PDF may be informed by the Eddington ratio distribution function (ERDF), and that a single ERDF+PSD set (or a limited number of them) may explain all observed variability features. After outlining the framework and the model, we compile a set of variability measurements in terms of structure function (SF) and magnitude difference, and combine them on a single SF plot ranging from days to Gyr. The proposed framework enables constraints on the underlying PSD and links AGN variability on different time-scales, thereby providing new insights into AGN variability and black hole growth phenomena.
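
    One ingredient of such a model, drawing a random light curve with a prescribed power-law PSD, can be sketched with the standard Timmer & Koenig (1995) recipe. The slope and length below are illustrative, and the paper's additional step of matching a prescribed Eddington-ratio PDF is not reproduced.

```python
import numpy as np

def simulate_lightcurve(n, dt, psd_slope=-2.0, seed=0):
    """Draw a light curve whose PSD follows a power law f**psd_slope by
    assigning Gaussian random Fourier amplitudes scaled by the PSD and
    inverse-transforming (Timmer & Koenig style sketch)."""
    rng = np.random.default_rng(seed)
    freqs = np.fft.rfftfreq(n, dt)[1:]          # skip the zero frequency
    amp = np.sqrt(0.5 * freqs**psd_slope)
    re = rng.normal(size=freqs.size) * amp
    im = rng.normal(size=freqs.size) * amp
    spectrum = np.concatenate(([0.0], re + 1j * im))
    return np.fft.irfft(spectrum, n=n)          # zero-mean real light curve

lc = simulate_lightcurve(1024, 1.0)
```

    Structure functions like those compiled in the paper can then be measured directly from such simulated curves.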

  10. A proposed analytic framework for determining the impact of an antimicrobial resistance intervention.

    PubMed

    Grohn, Yrjo T; Carson, Carolee; Lanzas, Cristina; Pullum, Laura; Stanhope, Michael; Volkova, Victoriya

    2017-06-01

    The efficacy of antimicrobial use (AMU) is increasingly threatened by antimicrobial resistance (AMR). The FDA is implementing risk mitigation measures that promote prudent AMU in food animals. Evaluating these measures is crucial, but the AMU/AMR relationship is complex and a suitable framework for analyzing interventions has been unavailable. A systems science analysis, depicting the variables and their associations, would help integrate mathematics and epidemiology to evaluate the relationship, and would identify the data and models needed to evaluate interventions. This report of the National Institute for Mathematical and Biological Synthesis AMR Working Group proposes a systems framework to address the methodological gap linking livestock AMU and AMR in foodborne bacteria; it could evaluate how AMU (and interventions) impact AMR. The framework will evaluate pharmacokinetic/pharmacodynamic modeling techniques for projecting AMR selection pressure on enteric bacteria, study two methods for modeling phenotypic AMR changes in bacteria in the food supply, and use evolutionary genotypic analyses to determine the molecular changes underlying phenotypic AMR. Systems science analysis integrates these methods, showing how resistance in the food supply is explained by AMU and the concurrent factors influencing the whole system. The process is updated with new data and techniques to improve prediction and to inform improvements in AMU/AMR surveillance. The proposed framework reflects both the complexity of the AMR system and the desire for simple, reliable conclusions.

  11. Feature Selection and Classifier Parameters Estimation for EEG Signals Peak Detection Using Particle Swarm Optimization

    PubMed Central

    Adam, Asrul; Mohd Tumari, Mohd Zaidi; Mohamad, Mohd Saberi

    2014-01-01

    Electroencephalogram (EEG) signal peak detection is widely used in clinical applications. The peak point can be detected using several approaches, including time, frequency, time-frequency, and nonlinear domains, depending on the peak features drawn from several models. However, no study has established the importance of each peak feature in contributing to a good and generalized model. In this study, feature selection and classifier parameter estimation based on particle swarm optimization (PSO) are proposed as a framework for peak detection of EEG signals in time-domain analysis. Two versions of PSO are used: (1) standard PSO and (2) random asynchronous particle swarm optimization (RA-PSO). The proposed framework searches for the combination of available features that offers good peak detection and a high classification rate in the conducted experiments. The evaluation results indicate that the accuracy of peak detection can be improved up to 99.90% and 98.59% for training and testing, respectively, compared to the framework without feature selection adaptation. Additionally, the framework based on RA-PSO offers a better and more reliable classification rate than standard PSO, as it produces a low-variance model. PMID:25243236
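
    A minimal sketch of binary PSO for feature selection follows; the toy data, the correlation-based fitness, and all swarm parameters are stand-ins for the paper's EEG peak features and classifier-based fitness.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: 6 features, only the first two actually predict the target.
X = rng.normal(size=(200, 6))
y = 2 * X[:, 0] - 3 * X[:, 1] + 0.1 * rng.normal(size=200)

def fitness(mask):
    """Feature-subset score: summed |correlation| with the target minus a
    small per-feature cost (a stand-in for a classifier's accuracy)."""
    if not mask.any():
        return -1.0
    idx = np.where(mask)[0]
    return sum(abs(np.corrcoef(X[:, j], y)[0, 1]) for j in idx) - 0.05 * mask.sum()

def binary_pso(n_particles=20, n_features=6, iters=40, seed=2):
    """Standard (synchronous) binary PSO: velocities pass through a
    sigmoid that gives the probability of selecting each feature."""
    rng = np.random.default_rng(seed)
    pos = (rng.random((n_particles, n_features)) < 0.5).astype(int)
    vel = rng.normal(scale=0.1, size=(n_particles, n_features))
    pbest = pos.copy()
    pbest_f = np.array([fitness(p) for p in pos])
    g = pbest[pbest_f.argmax()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(vel.shape), rng.random(vel.shape)
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (g - pos)
        pos = (rng.random(vel.shape) < 1 / (1 + np.exp(-vel))).astype(int)
        f = np.array([fitness(p) for p in pos])
        improved = f > pbest_f
        pbest[improved], pbest_f[improved] = pos[improved], f[improved]
        g = pbest[pbest_f.argmax()].copy()
    return g

best = binary_pso()
```

    The RA-PSO variant in the paper updates particles asynchronously in random order; the synchronous loop above is only the baseline scheme.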

  12. Logic Modeling as a Tool to Prepare to Evaluate Disaster and Emergency Preparedness, Response, and Recovery in Schools

    ERIC Educational Resources Information Center

    Zantal-Wiener, Kathy; Horwood, Thomas J.

    2010-01-01

    The authors propose a comprehensive evaluation framework to prepare for evaluating school emergency management programs. This framework involves a logic model that incorporates Government Performance and Results Act (GPRA) measures as a foundation for comprehensive evaluation that complements performance monitoring used by the U.S. Department of…

  13. A conceptual framework for ranking crown fire potential in wildland fuelbeds.

    Treesearch

    Mark D. Schaaf; David V. Sandberg; Maarten D. Schreuder; Cynthia L. Riccardi

    2007-01-01

    This paper presents a conceptual framework for ranking the crown fire potential of wildland fuelbeds with forest canopies. The approach extends the work of Van Wagner and Rothermel, and introduces several new physical concepts to the modeling of crown fire behavior, derived from the reformulated Rothermel surface fire modeling concepts proposed by Sandberg et al. This...

  14. Ecological Dynamics as a Theoretical Framework for Development of Sustainable Behaviours towards the Environment

    ERIC Educational Resources Information Center

    Brymer, Eric; Davids, Keith

    2013-01-01

    This paper proposes how the theoretical framework of ecological dynamics can provide an influential model of the learner and the learning process to pre-empt effective behaviour changes. Here we argue that ecological dynamics supports a well-established model of the learner ideally suited to the environmental education context because of its…

  15. Toward a Model Framework of Generalized Parallel Componential Processing of Multi-Symbol Numbers

    ERIC Educational Resources Information Center

    Huber, Stefan; Cornelsen, Sonja; Moeller, Korbinian; Nuerk, Hans-Christoph

    2015-01-01

    In this article, we propose and evaluate a new model framework of parallel componential multi-symbol number processing, generalizing the idea of parallel componential processing of multi-digit numbers to the case of negative numbers by considering the polarity signs similar to single digits. In a first step, we evaluated this account by defining…

  16. Towards a Pedagogical Model for Science Education: Bridging Educational Contexts through a Blended Learning Approach

    ERIC Educational Resources Information Center

    Bidarra, José; Rusman, Ellen

    2017-01-01

    This paper proposes a design framework to support science education through blended learning, based on a participatory and interactive approach supported by ICT-based tools, called "Science Learning Activities Model" (SLAM). The development of this design framework started as a response to complex changes in society and education (e.g.…

  17. A Framework for Sharing and Integrating Remote Sensing and GIS Models Based on Web Service

    PubMed Central

    Chen, Zeqiang; Lin, Hui; Chen, Min; Liu, Deer; Bao, Ying; Ding, Yulin

    2014-01-01

    Sharing and integrating Remote Sensing (RS) and Geographic Information System/Science (GIS) models are critical for developing practical application systems. Facilitating model sharing and model integration is a problem for model publishers and model users, respectively. To address this problem, a framework based on a Web service for sharing and integrating RS and GIS models is proposed in this paper. The fundamental idea of the framework is to publish heterogeneous RS and GIS models into standard Web services for sharing and interoperation and then to integrate the RS and GIS models using Web services. For the former, a “black box” and a visual method are employed to facilitate the publishing of the models as Web services. For the latter, model integration based on the geospatial workflow and a semantics-supported matching method is introduced. Under this framework, model sharing and integration are applied in developing the Pearl River Delta water environment monitoring system. The results show that the framework can facilitate model sharing and model integration for model publishers and model users. PMID:24901016

  18. A framework for sharing and integrating remote sensing and GIS models based on Web service.

    PubMed

    Chen, Zeqiang; Lin, Hui; Chen, Min; Liu, Deer; Bao, Ying; Ding, Yulin

    2014-01-01

    Sharing and integrating Remote Sensing (RS) and Geographic Information System/Science (GIS) models are critical for developing practical application systems. Facilitating model sharing and model integration is a problem for model publishers and model users, respectively. To address this problem, a framework based on a Web service for sharing and integrating RS and GIS models is proposed in this paper. The fundamental idea of the framework is to publish heterogeneous RS and GIS models into standard Web services for sharing and interoperation and then to integrate the RS and GIS models using Web services. For the former, a "black box" and a visual method are employed to facilitate the publishing of the models as Web services. For the latter, model integration based on the geospatial workflow and a semantics-supported matching method is introduced. Under this framework, model sharing and integration are applied in developing the Pearl River Delta water environment monitoring system. The results show that the framework can facilitate model sharing and model integration for model publishers and model users.

  19. Multitask TSK fuzzy system modeling by mining intertask common hidden structure.

    PubMed

    Jiang, Yizhang; Chung, Fu-Lai; Ishibuchi, Hisao; Deng, Zhaohong; Wang, Shitong

    2015-03-01

    The classical fuzzy system modeling methods implicitly assume that data are generated from a single task, which is at odds with many practical scenarios in which data are acquired from multiple tasks. Although one can build an individual fuzzy system model for each task, such individual modeling generalizes poorly because it ignores the hidden intertask correlation. To circumvent this shortcoming, we consider a general framework for preserving the independent information among different tasks while mining the hidden correlation information shared by all tasks in multitask fuzzy modeling. In this framework, a low-dimensional subspace (structure) is assumed to be shared among all tasks and hence to constitute their hidden correlation information. Under this framework, a multitask Takagi-Sugeno-Kang (TSK) fuzzy system model called MTCS-TSK-FS (TSK-FS for multiple tasks with common hidden structure), based on the classical L2-norm TSK fuzzy system, is proposed in this paper. The proposed model not only takes advantage of the independent sample information in the original space for each task, but also effectively uses the intertask common hidden structure to enhance the generalization performance of the built fuzzy systems. Experiments on synthetic and real-world datasets demonstrate the applicability and distinctive performance of the proposed multitask fuzzy system model in multitask regression learning scenarios.
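
    The TSK building block itself can be sketched compactly. The snippet below implements a zero-order, single-input TSK inference step (Gaussian antecedents, firing-strength-weighted output) with made-up rule parameters; the paper's multitask structure sharing is omitted entirely.

```python
import numpy as np

def tsk_predict(x, centers, sigmas, consequents):
    """Zero-order TSK fuzzy inference for a scalar input: Gaussian rule
    antecedents fire, and the output is the firing-strength-weighted
    average of the (constant) rule consequents."""
    w = np.exp(-0.5 * ((x - centers) / sigmas) ** 2)   # firing strengths
    return float(np.sum(w * consequents) / np.sum(w))

# Three illustrative rules crudely approximating y = x^2 on [-1, 1].
centers = np.array([-1.0, 0.0, 1.0])
sigmas = np.array([0.5, 0.5, 0.5])
consequents = np.array([1.0, 0.0, 1.0])
y0 = tsk_predict(0.0, centers, sigmas, consequents)
```

    A first-order (classical L2-norm) TSK system would replace the constant consequents with linear functions of the input; the aggregation step is the same.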

  20. The Cusp Catastrophe Model as Cross-Sectional and Longitudinal Mixture Structural Equation Models

    PubMed Central

    Chow, Sy-Miin; Witkiewitz, Katie; Grasman, Raoul P. P. P.; Maisto, Stephen A.

    2015-01-01

    Catastrophe theory (Thom, 1972, 1993) is the study of the many ways in which continuous changes in a system’s parameters can result in discontinuous changes in one or several outcome variables of interest. Catastrophe theory–inspired models have been used to represent a variety of change phenomena in the realm of social and behavioral sciences. Despite their promise, widespread applications of catastrophe models have been impeded, in part, by difficulties in performing model fitting and model comparison procedures. We propose a new modeling framework for testing one kind of catastrophe model — the cusp catastrophe model — as a mixture structural equation model (MSEM) when cross-sectional data are available; or alternatively, as an MSEM with regime-switching (MSEM-RS) when longitudinal panel data are available. The proposed models and the advantages offered by this alternative modeling framework are illustrated using two empirical examples and a simulation study. PMID:25822209
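
    For reference, the deterministic core of the cusp model can be written down compactly; this is one common parameterization, not necessarily the notation used by the authors:

```latex
% Cusp potential for outcome y, with asymmetry \alpha and bifurcation \beta:
V(y;\alpha,\beta) = -\alpha y - \tfrac{1}{2}\beta y^{2} + \tfrac{1}{4}y^{4}
% Equilibria satisfy \partial V/\partial y = 0:
y^{3} - \beta y - \alpha = 0
% Discontinuous jumps occur where the equilibrium surface folds, i.e. on
% the bifurcation set 27\alpha^{2} = 4\beta^{3}.
```

    The MSEM formulation in the paper estimates this equilibrium structure from latent-variable data rather than fitting the cubic directly.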

  1. A FRAMEWORK TO DESIGN AND OPTIMIZE CHEMICAL FLOODING PROCESSES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mojdeh Delshad; Gary A. Pope; Kamy Sepehrnoori

    2005-07-01

    The goal of this proposed research is to provide an efficient and user-friendly simulation framework for screening and optimizing chemical/microbial enhanced oil recovery processes. The framework will include (1) a user-friendly interface to identify the variables that have the most impact on oil recovery using the concepts of experimental design and response surface maps, (2) the UTCHEM reservoir simulator to perform the numerical simulations, and (3) an economic model that automatically imports the simulated production data to evaluate the profitability of a particular design. Such a reservoir simulation framework is not currently available to the oil industry. The objective of Task 1 is to develop three primary modules representing reservoir, chemical, and well data; the modules will be interfaced with an already available experimental design model. The objective of Task 2 is to incorporate the UTCHEM reservoir simulator with the modules and the strategic variables, and to develop response surface maps that identify the significant variables from each module. The objective of Task 3 is to develop an economic model designed specifically for the chemical processes targeted in this proposal and to interface it with UTCHEM production output. Task 4 covers validation of the framework and simulations of oil reservoirs to screen, design, and optimize the chemical processes.
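
    The economic module described in item (3) amounts to discounting simulated production; a minimal net-present-value sketch is below, with all prices, costs, and production figures hypothetical rather than taken from the report.

```python
def npv(cashflows, rate):
    """Discounted net present value of yearly cash flows (year 0 first)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Hypothetical chemical-flood design: upfront chemical cost, then revenue
# from incremental oil production as reported by the simulator.
oil_bbl_per_year = [0, 40_000, 60_000, 35_000, 15_000]  # simulator output
price, injection_cost = 50.0, 2_500_000.0               # assumed economics
cash = [-injection_cost] + [q * price for q in oil_bbl_per_year[1:]]
value = npv(cash, rate=0.10)
```

    In the framework, this calculation would be repeated across the experimental-design cases to build profitability response surfaces.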

  2. A Framework to Design and Optimize Chemical Flooding Processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mojdeh Delshad; Gary A. Pope; Kamy Sepehrnoori

    2006-08-31

    The goal of this proposed research is to provide an efficient and user-friendly simulation framework for screening and optimizing chemical/microbial enhanced oil recovery processes. The framework will include (1) a user-friendly interface to identify the variables that have the most impact on oil recovery using the concepts of experimental design and response surface maps, (2) the UTCHEM reservoir simulator to perform the numerical simulations, and (3) an economic model that automatically imports the simulated production data to evaluate the profitability of a particular design. Such a reservoir simulation framework is not currently available to the oil industry. The objective of Task 1 is to develop three primary modules representing reservoir, chemical, and well data; the modules will be interfaced with an already available experimental design model. The objective of Task 2 is to incorporate the UTCHEM reservoir simulator with the modules and the strategic variables, and to develop response surface maps that identify the significant variables from each module. The objective of Task 3 is to develop an economic model designed specifically for the chemical processes targeted in this proposal and to interface it with UTCHEM production output. Task 4 covers validation of the framework and simulations of oil reservoirs to screen, design, and optimize the chemical processes.

  3. A FRAMEWORK TO DESIGN AND OPTIMIZE CHEMICAL FLOODING PROCESSES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mojdeh Delshad; Gary A. Pope; Kamy Sepehrnoori

    2004-11-01

    The goal of this proposed research is to provide an efficient and user-friendly simulation framework for screening and optimizing chemical/microbial enhanced oil recovery processes. The framework will include (1) a user-friendly interface to identify the variables that have the most impact on oil recovery using the concepts of experimental design and response surface maps, (2) the UTCHEM reservoir simulator to perform the numerical simulations, and (3) an economic model that automatically imports the simulated production data to evaluate the profitability of a particular design. Such a reservoir simulation framework is not currently available to the oil industry. The objective of Task 1 is to develop three primary modules representing reservoir, chemical, and well data; the modules will be interfaced with an already available experimental design model. The objective of Task 2 is to incorporate the UTCHEM reservoir simulator with the modules and the strategic variables, and to develop response surface maps that identify the significant variables from each module. The objective of Task 3 is to develop an economic model designed specifically for the chemical processes targeted in this proposal and to interface it with UTCHEM production output. Task 4 covers validation of the framework and simulations of oil reservoirs to screen, design, and optimize the chemical processes.

  4. Distributed Peer-to-Peer Target Tracking in Wireless Sensor Networks

    PubMed Central

    Wang, Xue; Wang, Sheng; Bi, Dao-Wei; Ma, Jun-Jie

    2007-01-01

    Target tracking is a challenging application for wireless sensor networks (WSNs) because it is computation-intensive and requires real-time processing. This paper proposes a practical target tracking system based on the auto-regressive moving average (ARMA) model in a distributed peer-to-peer (P2P) signal processing framework. In the proposed framework, wireless sensor nodes act as peers that perform target detection, feature extraction, classification and tracking, whereas target localization requires collaboration between nodes to improve accuracy and robustness. To carry out target tracking under the constraints imposed by the limited capabilities of the sensor nodes, practically feasible algorithms, such as the ARMA model and the 2-D integer lifting wavelet transform, are adopted on single nodes for their good performance and light computational burden. Furthermore, a progressive multi-view localization algorithm is proposed within the distributed P2P signal processing framework, considering the trade-off between accuracy and energy consumption. Finally, a real-world target tracking experiment is presented. Results from experimental implementations demonstrate that the proposed system makes efficient use of scarce energy and communication resources while achieving target tracking successfully.
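
    The AR part of the ARMA tracker can be sketched in a few lines. The snippet below fits an AR(2) model by least squares on a toy constant-velocity track; the MA term and all sensor-network aspects are omitted.

```python
import numpy as np

def fit_ar(series, p=2):
    """Least-squares fit of an AR(p) model: regress each sample on its
    p most recent predecessors."""
    y = series[p:]
    X = np.column_stack([series[p - k: len(series) - k] for k in range(1, p + 1)])
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

def predict_next(series, coeffs):
    """One-step-ahead prediction from the most recent p samples."""
    p = len(coeffs)
    return float(np.dot(coeffs, series[-1:-p - 1:-1]))

# Toy track: a target moving at constant velocity; AR(2) captures it exactly.
positions = np.arange(20.0)
coeffs = fit_ar(positions)
pred = predict_next(positions, coeffs)   # next position should be 20
```

    In a WSN setting the appeal of AR models is exactly this light computational footprint: a fit is a small least-squares solve and a prediction is a dot product.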

  5. Numerical Coupling and Simulation of Point-Mass System with the Turbulent Fluid Flow

    NASA Astrophysics Data System (ADS)

    Gao, Zheng

    A computational framework that combines the Eulerian description of the turbulence field with a Lagrangian point-mass ensemble is proposed in this dissertation. Depending on the Reynolds number, the turbulence field is simulated using Direct Numerical Simulation (DNS) or an eddy viscosity model. Meanwhile, particle systems, such as spring-mass systems and cloud droplets, are modeled with ordinary differential equations, which are stiff and hence pose a challenge to the stability of the coupled system. This computational framework is applied to the numerical study of parachute deceleration and cloud microphysics. These two distinct problems can be uniformly modeled with Partial Differential Equations (PDEs) and Ordinary Differential Equations (ODEs), and numerically solved in the same framework. For the parachute simulation, a novel porosity model is proposed to simulate the porous effects of the parachute canopy. This model is easy to implement with the projection method and is able to reproduce Darcy's law observed in the experiment. Moreover, the impacts of using different versions of the k-epsilon turbulence model in the parachute simulation are investigated: the standard and Re-Normalisation Group (RNG) models may overestimate turbulence effects when the Reynolds number is small, while the Realizable model performs consistently at both large and small Reynolds numbers. For the second application, cloud microphysics, the cloud entrainment-mixing problem is studied in the same numerical framework. Three sets of DNS are carried out with both decaying and forced turbulence. The numerical results suggest a new way to parameterize the degree of cloud mixing using dynamical measures, and verify the negative relationship between droplet number concentration and the vorticity field. The results also imply that gravity has less impact on forced turbulence than on decaying turbulence. In summary, the proposed framework can be used to solve physics problems that involve a turbulence field and a point-mass system, and therefore has broad applications.
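
    The stiffness issue mentioned above can be illustrated with the simplest point-mass equation, Stokes drag dv/dt = (u - v)/tau. The implicit-Euler step below stays stable even when the time step far exceeds tau; all values are illustrative, not taken from the dissertation.

```python
def step_droplet(v, u, tau, dt):
    """One implicit (backward) Euler step of the stiff drag equation
    dv/dt = (u - v)/tau.  Solving v_new = v + dt*(u - v_new)/tau for
    v_new keeps the step stable even when dt >> tau, a regime where
    explicit Euler would blow up."""
    return (v + dt * u / tau) / (1 + dt / tau)

# Stiff case: particle response time tau = 1e-4 s, time step dt = 1e-2 s.
v, u, tau, dt = 0.0, 1.0, 1e-4, 1e-2
for _ in range(5):
    v = step_droplet(v, u, tau, dt)   # v relaxes monotonically toward u
```

    Treating the particle ODEs implicitly while advancing the turbulence field explicitly is one standard way to couple the two descriptions without shrinking the global time step to the particle response time.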

  6. Video event classification and image segmentation based on noncausal multidimensional hidden Markov models.

    PubMed

    Ma, Xiang; Schonfeld, Dan; Khokhar, Ashfaq A

    2009-06-01

    In this paper, we propose a novel solution to an arbitrary noncausal, multidimensional hidden Markov model (HMM) for image and video classification. First, we show that the noncausal model can be solved by splitting it into multiple causal HMMs and simultaneously solving each causal HMM using a fully synchronous distributed computing framework, therefore referred to as distributed HMMs. Next we present an approximate solution to the multiple causal HMMs that is based on an alternating updating scheme and assumes a realistic sequential computing framework. The parameters of the distributed causal HMMs are estimated by extending the classical 1-D training and classification algorithms to multiple dimensions. The proposed extension to arbitrary causal, multidimensional HMMs allows state transitions that are dependent on all causal neighbors. We, thus, extend three fundamental algorithms to multidimensional causal systems, i.e., 1) expectation-maximization (EM), 2) general forward-backward (GFB), and 3) Viterbi algorithms. In the simulations, we choose to limit ourselves to a noncausal 2-D model whose noncausality is along a single dimension, in order to significantly reduce the computational complexity. Simulation results demonstrate the superior performance, higher accuracy rate, and applicability of the proposed noncausal HMM framework to image and video classification.
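
    The 1-D building block that the paper generalizes can be sketched directly. Below is a standard log-space Viterbi decoder on a two-state toy HMM; the parameters are illustrative, not from the paper.

```python
import numpy as np

def viterbi(obs, log_A, log_B, log_pi):
    """Classical 1-D Viterbi decoding: dynamic programming over the
    most-likely-path score delta, with backpointers for the traceback."""
    T, S = len(obs), log_pi.size
    delta = log_pi + log_B[:, obs[0]]
    back = np.zeros((T, S), dtype=int)
    for t in range(1, T):
        scores = delta[:, None] + log_A          # scores[i, j]: state i -> j
        back[t] = scores.argmax(axis=0)
        delta = scores.max(axis=0) + log_B[:, obs[t]]
    path = [int(delta.argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t][path[-1]]))
    return path[::-1]

# Two sticky states; state 0 mostly emits symbol 0, state 1 mostly symbol 1.
A = np.array([[0.9, 0.1], [0.1, 0.9]])
B = np.array([[0.9, 0.1], [0.1, 0.9]])
pi = np.array([0.5, 0.5])
states = viterbi([0, 0, 1, 1, 1], np.log(A), np.log(B), np.log(pi))
```

    The paper's multidimensional extension replaces the single predecessor state with all causal neighbors, which is what makes the distributed/alternating solution strategy necessary.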

  7. Sustained sensorimotor control as intermittent decisions about prediction errors: computational framework and application to ground vehicle steering.

    PubMed

    Markkula, Gustav; Boer, Erwin; Romano, Richard; Merat, Natasha

    2018-06-01

    A conceptual and computational framework is proposed for modelling of human sensorimotor control and is exemplified for the sensorimotor task of steering a car. The framework emphasises control intermittency and extends existing models by suggesting that the nervous system implements intermittent control using a combination of (1) motor primitives, (2) prediction of sensory outcomes of motor actions, and (3) evidence accumulation of prediction errors. It is shown that approximate but useful sensory predictions in the intermittent control context can be constructed without detailed forward models, as a superposition of simple prediction primitives, resembling neurobiologically observed corollary discharges. The proposed mathematical framework allows straightforward extension to intermittent behaviour from existing one-dimensional continuous models in the linear control and ecological psychology traditions. Empirical data from a driving simulator are used in model-fitting analyses to test some of the framework's main theoretical predictions: it is shown that human steering control, in routine lane-keeping and in a demanding near-limit task, is better described as a sequence of discrete stepwise control adjustments, than as continuous control. Results on the possible roles of sensory prediction in control adjustment amplitudes, and of evidence accumulation mechanisms in control onset timing, show trends that match the theoretical predictions; these warrant further investigation. The results for the accumulation-based model align with other recent literature, in a possibly converging case against the type of threshold mechanisms that are often assumed in existing models of intermittent control.
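
    The accumulate-then-adjust idea can be caricatured in a few lines: a leaky accumulator integrates the prediction error, and a discrete stepwise control adjustment is issued whenever the accumulated evidence crosses a threshold. All parameters are illustrative, and this is not the authors' fitted model.

```python
def intermittent_control(error, threshold=1.0, gain=0.5, leak=0.8):
    """Leaky evidence accumulation on the prediction error; a threshold
    crossing triggers one stepwise control adjustment proportional to
    the current error, then the accumulator resets."""
    acc, control, adjustments = 0.0, 0.0, []
    for t, e in enumerate(error):
        acc = leak * acc + e
        if abs(acc) >= threshold:
            control += gain * e          # discrete stepwise adjustment
            adjustments.append(t)
            acc = 0.0
    return control, adjustments

# A constant small error produces intermittent, not continuous, adjustments.
control, times = intermittent_control([0.3] * 20)
```

    The steering output is thus piecewise constant between adjustment onsets, which is the qualitative signature the driving-simulator analyses test for.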

  8. Performance measurement integrated information framework in e-Manufacturing

    NASA Astrophysics Data System (ADS)

    Teran, Hilaida; Hernandez, Juan Carlos; Vizán, Antonio; Ríos, José

    2014-11-01

    The implementation of Internet technologies has led to e-Manufacturing technologies becoming more widely used and to the development of tools for compiling, transforming and synchronising manufacturing data through the Web. In this context, a promising direction is the extension of virtual manufacturing to performance measurement (PM) processes, which are critical for decision making and for implementing improvement actions in manufacturing. This paper proposes a PM information framework to integrate decision support systems in e-Manufacturing. Specifically, the proposed framework offers a homogeneous PM information exchange model that can be applied to decision support in an e-Manufacturing environment. Its application improves the interoperability needed in decision-making data processing tasks. It comprises three sub-systems: a data model, a PM information platform and a PM Web services architecture. A practical example of data exchange for measurement processes in the area of equipment maintenance demonstrates the utility of the model.

  9. A deep learning framework for financial time series using stacked autoencoders and long-short term memory.

    PubMed

    Bao, Wei; Yue, Jun; Rao, Yulei

    2017-01-01

    The application of deep learning approaches to finance has received a great deal of attention from both investors and researchers. This study presents a novel deep learning framework in which wavelet transforms (WT), stacked autoencoders (SAEs) and long short-term memory (LSTM) are combined for stock price forecasting. SAEs for hierarchically extracting deep features are introduced into stock price forecasting for the first time. The framework comprises three stages. First, the stock price time series is decomposed by WT to eliminate noise. Second, SAEs are applied to generate deep high-level features for predicting the stock price. Third, the high-level denoised features are fed into LSTM to forecast the next day's closing price. Six market indices and their corresponding index futures are chosen to examine the performance of the proposed model. Results show that the proposed model outperforms other similar models in both predictive accuracy and profitability performance.
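
    Only the first stage of the pipeline lends itself to a short sketch. Below is a one-level Haar wavelet denoiser in plain NumPy; the threshold and the price series are made up, and the SAE and LSTM stages are omitted since they require a deep-learning library.

```python
import numpy as np

def haar_denoise(x, thresh):
    """One-level Haar wavelet denoising of an even-length series:
    decompose into approximation (a) and detail (d) coefficients,
    hard-threshold the details, and reconstruct."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2)       # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2)       # detail coefficients
    d = np.where(np.abs(d) > thresh, d, 0.0)   # hard-threshold the details
    y = np.empty_like(x)
    y[0::2] = (a + d) / np.sqrt(2)
    y[1::2] = (a - d) / np.sqrt(2)
    return y

prices = np.array([10.0, 10.1, 10.0, 10.2, 14.0, 14.1, 14.0, 14.2])
smooth = haar_denoise(prices, thresh=0.2)      # small wiggles removed
```

    The paper's multi-level WT works the same way, recursing on the approximation coefficients before the denoised series is passed to the SAE stage.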

  10. Framework for the mapping of the monthly average daily solar radiation using an advanced case-based reasoning and a geostatistical technique.

    PubMed

    Lee, Minhyun; Koo, Choongwan; Hong, Taehoon; Park, Hyo Seon

    2014-04-15

    For an effective photovoltaic (PV) system, it is necessary to accurately determine the monthly average daily solar radiation (MADSR) and to develop an accurate MADSR map, which can simplify the decision-making process for selecting a suitable location for PV system installation. This study therefore aimed to develop a framework for mapping the MADSR using advanced case-based reasoning (CBR) and a geostatistical technique. The proposed framework consists of the following procedures: (i) the geographic scope for mapping the MADSR is set, and the measured MADSR and meteorological data within that scope are collected; (ii) using the collected data, the advanced CBR model is developed; (iii) using the advanced CBR model, the MADSR at unmeasured locations is estimated; and (iv) by applying the measured and estimated MADSR data to a geographic information system, the MADSR map is developed. A practical validation was conducted by applying the proposed framework to South Korea, and the resulting MADSR map showed improved accuracy. The developed map can be used to estimate the MADSR at unmeasured locations and to determine the optimal location for PV system installation.
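
    As a simple stand-in for the geostatistical mapping step, the sketch below uses inverse-distance weighting to estimate the MADSR at an unmeasured location. Station coordinates and values are hypothetical, and the paper's advanced CBR model is not reproduced.

```python
import numpy as np

def idw(xy_known, values, xy_query, power=2.0):
    """Inverse-distance-weighted interpolation: unmeasured locations get
    a distance-weighted average of the measured station values."""
    d = np.linalg.norm(xy_known - xy_query, axis=1)
    if np.any(d == 0):                      # query coincides with a station
        return float(values[d.argmin()])
    w = 1.0 / d**power
    return float(np.sum(w * values) / np.sum(w))

# Hypothetical MADSR measurements (kWh/m^2/day) at three station locations.
stations = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
madsr = np.array([3.2, 3.8, 3.5])
estimate = idw(stations, madsr, np.array([5.0, 5.0]))
```

    Evaluating such an interpolator over a grid of query points and loading the result into a GIS is, in miniature, step (iv) of the framework.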

  11. Placing Health Trajectories in Family and Historical Context: A Proposed Enrichment of the Life Course Health and Development Model.

    PubMed

    Jones, Marian Moser; Roy, Kevin

    2017-10-01

    Purpose: This article offers constructive commentary on the Life Course Health and Development (LCHD) model as an organizing framework for MCH research. Description: The LCHD has recently been proposed as an organizing framework for MCH research. This model integrates biomedical, biopsychosocial, and life course frameworks to explain how "individual health trajectories" develop over time. In this article, we propose that the LCHD can improve its relevance to MCH policy and practice by: (1) placing individual health trajectories within the context of family health trajectories, which unfold within communities and societies over historical and generational time; and (2) placing greater weight on the social determinants that shape the health development trajectories of individuals and families to produce greater or lesser health equity. Assessment: We argue that emphasizing these nested, historically specific social contexts in life course models will enrich study design and data analysis in future developmental science research, will make the LCHD model more relevant in shaping MCH policy and interventions, and will guard against its application as a deterministic framework. Specific ways to measure these contexts, and examples of how they can be integrated into the LCHD model, are articulated. Conclusion: Research applying the LCHD should incorporate the specific family and socio-historical contexts in which development occurs in order to serve as a useful basis for policy and interventions. Future longitudinal studies of maternal and child health should collect time-dependent data on family environment and other social determinants of health, and analyze the impact of historical events and trends on specific cohorts.

  12. Ecological Modelling of Individual and Contextual Influences: A Person-in-Environment Framework for Hypothetico-Deductive Information Behaviour Research

    ERIC Educational Resources Information Center

    Sin, Sei-Ching Joanna

    2015-01-01

    Introduction: This paper discusses the person-in-environment framework, which proposes the inclusion of environmental factors, alongside personal factors, as the explanatory factors of individual-level information behaviour and outcome. Method: The paper first introduces the principles and schematic formulas of the person-in-environment framework.…

  13. Generating action descriptions from statistically integrated representations of human motions and sentences.

    PubMed

    Takano, Wataru; Kusajima, Ikuo; Nakamura, Yoshihiko

    2016-08-01

    It is desirable for robots to be able to linguistically understand human actions during human-robot interaction. Previous research has developed frameworks for encoding human full-body motion into model parameters and for classifying motion into specific categories. For full understanding, the motion categories need to be connected to natural language so that robots can interpret human motions as linguistic expressions. This paper proposes a novel framework for integrating observation of human motion with natural language. The framework consists of two models: the first statistically learns the relations between motions and their relevant words, and the second statistically learns sentence structures as word n-grams. Integrating the two models allows a robot to generate a sentence from a human motion by retrieving words relevant to the motion with the first model and then arranging those words in the most likely order with the second. The proposed framework was tested on human full-body motion measured by an optical motion capture system, with descriptive sentences manually attached to the motions, and the validity of the system was demonstrated. Copyright © 2016 Elsevier Ltd. All rights reserved.
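The two-model pipeline described above (motion-relevant words plus an n-gram language model) can be sketched in miniature. The vocabulary, relevance scores, and bigram counts below are invented for illustration and are not taken from the paper's system:

```python
from itertools import permutations

# Model 1 (stand-in): words judged relevant to an observed motion, with scores.
relevant_words = {"person": 0.9, "walks": 0.8, "slowly": 0.6}

# Model 2 (stand-in): bigram counts learned from a sentence corpus ("<s>" = start).
bigram_counts = {
    ("<s>", "person"): 5, ("person", "walks"): 4,
    ("walks", "slowly"): 3, ("slowly", "walks"): 1,
}

def sentence_score(words):
    """Score an ordering by the product of its bigram counts (0 if unseen)."""
    score = 1
    prev = "<s>"
    for w in words:
        score *= bigram_counts.get((prev, w), 0)
        prev = w
    return score

def generate_sentence(words):
    """Search all orderings of the relevant words for the most likely one."""
    best = max(permutations(words), key=sentence_score)
    return " ".join(best)

print(generate_sentence(relevant_words))  # person walks slowly
```

Exhaustive search over orderings is only viable for a handful of words; the actual framework scores candidate word sequences with learned statistical models rather than raw counts.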

  14. Extraction of information from major element chemical analyses of lunar basalts

    NASA Technical Reports Server (NTRS)

    Butler, J. C.

    1985-01-01

    Major element chemical analyses often form the framework within which similarities and differences among analyzed specimens are noted and used to propose or refine models. When percentages are formed, the ratios of pairs of components are preserved, whereas many familiar statistical and geometrical descriptors are likely to exhibit major changes. This ratio-preserving property forms the basis for the proposed framework. An analysis of compositional variability within a data set of 42 major element analyses of lunar reference samples was selected to investigate this proposal.

  15. Towards an interactive electromechanical model of the heart

    PubMed Central

    Talbot, Hugo; Marchesseau, Stéphanie; Duriez, Christian; Sermesant, Maxime; Cotin, Stéphane; Delingette, Hervé

    2013-01-01

    In this work, we develop an interactive framework for rehearsal of and training in cardiac catheter ablation, and for planning cardiac resynchronization therapy. To this end, an interactive and real-time electrophysiology model of the heart is developed to fit patient-specific data. The proposed interactive framework relies on two main contributions. First, an efficient implementation of cardiac electrophysiology is proposed, using the latest graphics processing unit computing techniques. Second, a mechanical simulation is then coupled to the electrophysiological signals to produce realistic motion of the heart. We demonstrate that pathological mechanical and electrophysiological behaviour can be simulated. PMID:24427533

  16. The pitch of vibrato tones: a model based on instantaneous frequency decomposition.

    PubMed

    Mesz, Bruno A; Eguia, Manuel C

    2009-07-01

    We study vibrato as the more ubiquitous manifestation of a nonstationary tone that can evoke a single overall pitch. Some recent results using nonsymmetrical vibrato tones suggest that the perceived pitch could be governed by some stability-sensitive mechanism. For nonstationary sounds the adequate tools are time-frequency representations (TFRs). We show that a recently proposed TFR could be the simplest framework to explain this hypothetical stability-sensitive mechanism. We propose a one-parameter model within this framework that is able to predict previously reported results and we present new results obtained from psychophysical experiments performed in our laboratory.

  17. A security framework for nationwide health information exchange based on telehealth strategy.

    PubMed

    Zaidan, B B; Haiqi, Ahmed; Zaidan, A A; Abdulnabi, Mohamed; Kiah, M L Mat; Muzamel, Hussaen

    2015-05-01

    This study focuses on the situation of health information exchange (HIE) in the context of a nationwide network. It aims to create a security framework that can be implemented to ensure the safe transmission of health information across the boundaries of care providers in Malaysia and other countries. First, a critique of the major elements of nationwide health information networks is presented from the perspective of security, along with such topics as the importance of HIE, issues, and main approaches. Second, a systematic evaluation is conducted on the security solutions that can be utilized in the proposed nationwide network. Finally, a secure framework for health information transmission is proposed within a central cloud-based model, which is compatible with the Malaysian telehealth strategy. The outcome of this analysis indicates that a complete security framework for a global structure of HIE is yet to be defined and implemented. Our proposed framework represents such an endeavor and suggests specific techniques to achieve this goal.

  18. An epidemiological modeling and data integration framework.

    PubMed

    Pfeifer, B; Wurz, M; Hanser, F; Seger, M; Netzer, M; Osl, M; Modre-Osprian, R; Schreier, G; Baumgartner, C

    2010-01-01

    In this work, a cellular automaton software package is proposed for simulating different infectious diseases, storing the simulation results in a data warehouse system, and analyzing the obtained results to generate prediction models as well as contingency plans. The Brisbane H3N2 flu virus, which spread during the 2009 winter season, was used for simulation in the federal state of Tyrol, Austria. The simulation-modeling framework consists of an underlying cellular automaton, parameterized by known disease parameters; geographical and demographical conditions are included for simulating the spreading. The data generated by simulation are stored in the back room of the data warehouse using the Talend Open Studio software package, and subsequent statistical and data mining tasks are performed using a tool termed Knowledge Discovery in Database Designer (KD3). The obtained simulation results were used for generating prediction models for all nine federal states of Austria. The proposed framework provides a powerful and easy-to-handle interface for parameterizing and simulating different infectious diseases in order to generate prediction models and improve contingency plans for future events.
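As a toy illustration of this kind of simulation engine, the following minimal SIR cellular automaton spreads an infection over a grid; the grid size, infection probability, and recovery time are illustrative, not the calibrated disease parameters used in the study:

```python
import random

S, I, R = 0, 1, 2           # susceptible, infected, recovered
SIZE, P_INFECT, T_RECOVER = 20, 0.3, 3   # illustrative parameters

random.seed(42)
grid = [[S] * SIZE for _ in range(SIZE)]
timer = [[0] * SIZE for _ in range(SIZE)]
grid[SIZE // 2][SIZE // 2] = I  # seed one infection in the centre

def step(grid, timer):
    new = [row[:] for row in grid]
    for y in range(SIZE):
        for x in range(SIZE):
            if grid[y][x] == I:
                timer[y][x] += 1
                if timer[y][x] >= T_RECOVER:
                    new[y][x] = R
                # try to infect the four direct neighbours
                for dy, dx in ((0, 1), (0, -1), (1, 0), (-1, 0)):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < SIZE and 0 <= nx < SIZE and grid[ny][nx] == S:
                        if random.random() < P_INFECT:
                            new[ny][nx] = I
    return new

for _ in range(10):
    grid = step(grid, timer)

counts = [sum(row.count(s) for row in grid) for s in (S, I, R)]
print(dict(zip("SIR", counts)))
```

A real model would replace the uniform grid with geographically and demographically weighted cells, as the abstract describes.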

  19. A scalable delivery framework and a pricing model for streaming media with advertisements

    NASA Astrophysics Data System (ADS)

    Al-Hadrusi, Musab; Sarhan, Nabil J.

    2008-01-01

    This paper presents a delivery framework for streaming media with advertisements and an associated pricing model. The delivery model combines the benefits of periodic broadcasting and stream merging. The advertisements' revenues are used to subsidize the price of the media content, and the pricing is determined based on the total ad viewing time. Moreover, this paper presents an efficient ad allocation scheme and three modified scheduling policies that are well suited to the proposed delivery framework. Furthermore, we study the effectiveness of the delivery framework and the various scheduling policies through extensive simulation in terms of numerous metrics, including customer defection probability, average number of ads viewed per client, price, arrival rate, profit, and revenue.

  20. Development of reliable pavement models.

    DOT National Transportation Integrated Search

    2011-05-01

    The current report proposes a framework for estimating the reliability of a given pavement structure as analyzed by : the Mechanistic-Empirical Pavement Design Guide (MEPDG). The methodology proposes using a previously fit : response surface, in plac...

  1. An ice sheet model validation framework for the Greenland ice sheet.

    PubMed

    Price, Stephen F; Hoffman, Matthew J; Bonin, Jennifer A; Howat, Ian M; Neumann, Thomas; Saba, Jack; Tezaur, Irina; Guerber, Jeffrey; Chambers, Don P; Evans, Katherine J; Kennedy, Joseph H; Lenaerts, Jan; Lipscomb, William H; Perego, Mauro; Salinger, Andrew G; Tuminaro, Raymond S; van den Broeke, Michiel R; Nowicki, Sophie M J

    2017-01-01

    We propose a new ice sheet model validation framework - the Cryospheric Model Comparison Tool (CmCt) - that takes advantage of ice sheet altimetry and gravimetry observations collected over the past several decades and is applied here to modeling of the Greenland ice sheet. We use realistic simulations performed with the Community Ice Sheet Model (CISM) along with two idealized, non-dynamic models to demonstrate the framework and its use. Dynamic simulations with CISM are forced from 1991 to 2013 using combinations of reanalysis-based surface mass balance and observations of outlet glacier flux change. We propose and demonstrate qualitative and quantitative metrics for use in evaluating the different model simulations against the observations. We find that the altimetry observations used here are largely ambiguous in terms of their ability to distinguish one simulation from another. Based on basin- and whole-ice-sheet scale metrics, we find that simulations using both idealized conceptual models and dynamic, numerical models provide an equally reasonable representation of the ice sheet surface (mean elevation differences of <1 m). This is likely due to their short period of record, biases inherent to digital elevation models used for model initial conditions, and biases resulting from firn dynamics, which are not explicitly accounted for in the models or observations. On the other hand, we find that the gravimetry observations used here are able to unambiguously distinguish between simulations of varying complexity, and along with the CmCt, can provide a quantitative score for assessing a particular model and/or simulation. The new framework demonstrates that our proposed metrics can distinguish relatively better from relatively worse simulations and that dynamic ice sheet models, when appropriately initialized and forced with the right boundary conditions, demonstrate predictive skill with respect to observed dynamic changes occurring on Greenland over the past few decades. An extensible design will allow for continued use of the CmCt as future altimetry, gravimetry, and other remotely sensed data become available for use in ice sheet model validation.
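A quantitative validation metric of the kind described above can be as simple as a basin-wise RMSE between modelled and observed elevation change; the numbers below are invented for illustration and are not CmCt output:

```python
import math

# Illustrative skill metric: RMSE between modelled and observed
# surface-elevation change per basin (all values made up).
observed = [0.4, -1.2, 0.1, -0.8, 0.3]   # m, per basin
model_a  = [0.5, -1.0, 0.0, -0.9, 0.2]   # "dynamic simulation"
model_b  = [0.0,  0.0, 0.0,  0.0, 0.0]   # idealized "no change" model

def rmse(pred, obs):
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs))

for name, pred in [("dynamic", model_a), ("no-change", model_b)]:
    print(name, round(rmse(pred, observed), 3))
```

A lower score ranks the dynamic model above the no-change baseline, which is the kind of relative scoring the CmCt metrics formalize.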

  2. Functional Additive Mixed Models

    PubMed Central

    Scheipl, Fabian; Staicu, Ana-Maria; Greven, Sonja

    2014-01-01

    We propose an extensive framework for additive regression models for correlated functional responses, allowing for multiple partially nested or crossed functional random effects with flexible correlation structures for, e.g., spatial, temporal, or longitudinal functional data. Additionally, our framework includes linear and nonlinear effects of functional and scalar covariates that may vary smoothly over the index of the functional response. It accommodates densely or sparsely observed functional responses and predictors which may be observed with additional error and includes both spline-based and functional principal component-based terms. Estimation and inference in this framework is based on standard additive mixed models, allowing us to take advantage of established methods and robust, flexible algorithms. We provide easy-to-use open source software in the pffr() function for the R-package refund. Simulations show that the proposed method recovers relevant effects reliably, handles small sample sizes well and also scales to larger data sets. Applications with spatially and longitudinally observed functional data demonstrate the flexibility in modeling and interpretability of results of our approach. PMID:26347592

  3. From Pixels to Response Maps: Discriminative Image Filtering for Face Alignment in the Wild.

    PubMed

    Asthana, Akshay; Zafeiriou, Stefanos; Tzimiropoulos, Georgios; Cheng, Shiyang; Pantic, Maja

    2015-06-01

    We propose a face alignment framework that relies on the texture model generated by the responses of discriminatively trained part-based filters. Unlike standard texture models built from pixel intensities or responses generated by generic filters (e.g. Gabor), our framework has two important advantages. First, by virtue of discriminative training, invariance to external variations (like identity, pose, illumination and expression) is achieved. Second, we show that the responses generated by discriminatively trained filters (or patch-experts) are sparse and can be modeled using a very small number of parameters. As a result, the optimization methods based on the proposed texture model can better cope with unseen variations. We illustrate this point by formulating both part-based and holistic approaches for generic face alignment and show that our framework outperforms the state-of-the-art on multiple "wild" databases. The code and dataset annotations are available for research purposes from http://ibug.doc.ic.ac.uk/resources.

  4. Functional Additive Mixed Models.

    PubMed

    Scheipl, Fabian; Staicu, Ana-Maria; Greven, Sonja

    2015-04-01

    We propose an extensive framework for additive regression models for correlated functional responses, allowing for multiple partially nested or crossed functional random effects with flexible correlation structures for, e.g., spatial, temporal, or longitudinal functional data. Additionally, our framework includes linear and nonlinear effects of functional and scalar covariates that may vary smoothly over the index of the functional response. It accommodates densely or sparsely observed functional responses and predictors which may be observed with additional error and includes both spline-based and functional principal component-based terms. Estimation and inference in this framework is based on standard additive mixed models, allowing us to take advantage of established methods and robust, flexible algorithms. We provide easy-to-use open source software in the pffr() function for the R-package refund. Simulations show that the proposed method recovers relevant effects reliably, handles small sample sizes well and also scales to larger data sets. Applications with spatially and longitudinally observed functional data demonstrate the flexibility in modeling and interpretability of results of our approach.

  5. Bayesian Image Segmentations by Potts Prior and Loopy Belief Propagation

    NASA Astrophysics Data System (ADS)

    Tanaka, Kazuyuki; Kataoka, Shun; Yasuda, Muneki; Waizumi, Yuji; Hsu, Chiou-Ting

    2014-12-01

    This paper presents a Bayesian image segmentation model based on a Potts prior and loopy belief propagation. The proposed Bayesian model involves several terms, including the pairwise interactions of Potts models and the mean vectors and covariance matrices of Gaussian distributions in color image modeling. These terms are often referred to as hyperparameters in statistical machine learning theory. In order to determine these hyperparameters, we propose a new scheme for hyperparameter estimation based on conditional maximization of entropy in the Potts prior. The algorithm is given based on loopy belief propagation. In addition, we compare our conditional maximum entropy framework with the conventional maximum likelihood framework, and also clarify how the first-order phase transitions in loopy belief propagation for Potts models influence our hyperparameter estimation procedures.
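For intuition about the Potts-prior energy (though not the paper's loopy-belief-propagation inference), the following sketch segments a tiny grayscale image with iterated conditional modes (ICM), a much simpler optimizer; the image, class means, noise level, and Potts weight are all illustrative:

```python
# Potts + Gaussian-data energy, minimized by ICM instead of loopy BP.
image = [
    [0.1, 0.2, 0.8, 0.9],
    [0.2, 0.1, 0.9, 0.8],
    [0.1, 0.2, 0.7, 0.9],
]
means, sigma, beta = [0.15, 0.85], 0.1, 1.0  # class means, noise, Potts weight
H, W = len(image), len(image[0])

def energy(labels, y, x, k):
    """Gaussian data term plus a Potts penalty per disagreeing 4-neighbour."""
    e = (image[y][x] - means[k]) ** 2 / (2 * sigma ** 2)
    for dy, dx in ((0, 1), (0, -1), (1, 0), (-1, 0)):
        ny, nx = y + dy, x + dx
        if 0 <= ny < H and 0 <= nx < W and labels[ny][nx] != k:
            e += beta
    return e

# initialize by nearest class mean, then sweep ICM until no label changes
labels = [[min((0, 1), key=lambda k: abs(v - means[k])) for v in row]
          for row in image]
changed = True
while changed:
    changed = False
    for y in range(H):
        for x in range(W):
            best = min((0, 1), key=lambda k: energy(labels, y, x, k))
            if best != labels[y][x]:
                labels[y][x] = best
                changed = True

for row in labels:
    print(row)
```

The Potts weight `beta` plays the role of one of the hyperparameters the paper estimates; here it is simply fixed by hand.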

  6. A framework model for water-sharing among co-basin states of a river basin

    NASA Astrophysics Data System (ADS)

    Garg, N. K.; Azad, Shambhu

    2018-05-01

    A new framework model is presented in this study for sharing water in a river basin using certain governing variables, in an effort to enhance objectivity for a reasonable and equitable allocation of water among co-basin states. The governing variables were normalised to bring the governing variables of different co-basin states onto the same scale. In the absence of objective methods for evaluating the weights to be assigned to co-basin states for water allocation, a framework was conceptualised and formulated to determine the normalised weighting factors of different co-basin states as a function of the governing variables. The water allocation to any co-basin state was assumed to be proportional to its struggle for equity, which in turn was assumed to be a function of the normalised discontent, satisfaction, and weighting factors of each co-basin state. System dynamics was used effectively to represent and solve the proposed model formulation. The proposed model was successfully applied to the Vamsadhara river basin located in the south-eastern part of India, and a sensitivity analysis of the model parameters was carried out to prove its robustness in terms of convergence and validity over a broad spectrum of parameter values. The solution converged quickly to a final allocation of 1444 million cubic metres (MCM) for the Odisha co-basin state and 1067 MCM for the Andhra Pradesh co-basin state. The sensitivity analysis showed that the model's allocation varied from 1584 to 1336 MCM for Odisha and from 927 to 1175 MCM for Andhra Pradesh, depending upon the importance weights given to the governing variables in the calculation of the weighting factors. The proposed model was thus found to be very flexible for exploring various policy options to arrive at a decision in a water-sharing problem. It can therefore be effectively applied to any trans-boundary problem where there is conflict over water-sharing among co-basin states.

  7. Microeconomics of the ideal gas like market models

    NASA Astrophysics Data System (ADS)

    Chakrabarti, Anindya S.; Chakrabarti, Bikas K.

    2009-10-01

    We develop a framework based on microeconomic theory from which ideal-gas-like market models can be addressed. A kinetic exchange model based on this framework is proposed, and its distributional features have been studied by considering its moments. Next, we derive the moments of the CC model (Eur. Phys. J. B 17 (2000) 167) as well. Some precise solutions are obtained which conform with the solutions obtained earlier. Finally, an output market is introduced with global price determination in the model with some necessary modifications.
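A minimal kinetic exchange simulation in this spirit can be written in a few lines; this is a sketch of a CC-type model with a uniform saving propensity, and the agent count, step count, and saving fraction are illustrative:

```python
import random

# Agents save a fraction `lam` of their money and randomly re-split
# the rest pairwise; total money is conserved by construction.
random.seed(1)
N, STEPS, lam = 1000, 100000, 0.5
money = [1.0] * N  # everyone starts with one unit

for _ in range(STEPS):
    i, j = random.randrange(N), random.randrange(N)
    if i == j:
        continue
    pot = (1 - lam) * (money[i] + money[j])   # tradable part of the pair's holdings
    eps = random.random()
    money[i] = lam * money[i] + eps * pot
    money[j] = lam * money[j] + (1 - eps) * pot

print(round(sum(money), 6), round(max(money), 3))
```

The stationary distribution of such models is the object whose moments the paper derives analytically.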

  8. Towards a cyber-physical era: soft computing framework based multi-sensor array for water quality monitoring

    NASA Astrophysics Data System (ADS)

    Bhardwaj, Jyotirmoy; Gupta, Karunesh K.; Gupta, Rajiv

    2018-02-01

    New concepts and techniques are replacing traditional methods of measuring water quality parameters. This paper introduces a cyber-physical system (CPS) approach for water quality assessment in a distribution network. Cyber-physical systems with embedded sensors, processors, and actuators can be designed to sense and interact with the water environment. The proposed CPS comprises a sensing framework integrated with five water quality parameter sensor nodes and a soft computing framework for computational modelling. The soft computing framework uses Python for the user interface and fuzzy logic for decision making. Introducing multiple sensors into a water distribution network generates large volumes of data matrices that are often complex, difficult to interpret, and ill-suited to direct decision making. The proposed system framework therefore also aims to simplify the obtained sensor data matrices and to support decision making for water engineers through soft computing. The goal of this research is to provide a simple and efficient method to identify and detect the presence of contamination in a water distribution network using CPS applications.
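A fuzzy decision step over multi-sensor readings might look like the sketch below. The membership breakpoints, parameter names, and max-combiner are illustrative assumptions, not the paper's calibrated rules:

```python
def trapezoid(x, a, b, c, d):
    """Trapezoidal membership: 0 below a, ramps up to 1 on [b, c], 0 above d."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

# "unsafe" memberships for three parameters a sensor node might report
unsafe = {
    "turbidity_ntu": lambda v: trapezoid(v, 1, 5, 100, 101),
    "ph":            lambda v: max(trapezoid(v, -1, 0, 6.0, 6.5),    # too acidic
                                   trapezoid(v, 8.5, 9.0, 14, 15)),  # too alkaline
    "chlorine_mgL":  lambda v: trapezoid(v, -1, 0, 0.15, 0.2),       # too little residual
}

reading = {"turbidity_ntu": 7.0, "ph": 7.2, "chlorine_mgL": 0.5}
scores = {k: f(reading[k]) for k, f in unsafe.items()}
alarm = max(scores.values())   # simple max-combiner over parameters
print(scores, "ALARM" if alarm > 0.5 else "ok")
```

Collapsing a matrix of readings into a single graded alarm score is one way such a framework can simplify sensor data for a water engineer.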

  9. A post-Bertalanffy Systemics Healthcare Competitive Framework Proposal.

    PubMed

    Fiorini, Rodolfo A; Santacroce, Giulia F

    2014-01-01

    The health information community can take advantage of a new evolutive, cybernetic categorization framework. A systemic concept of principles organizing nature is proposed; it can serve as a multiscaling reference framework for developing competitive, antifragile systems and new information management strategies in advanced healthcare organizations (HOs) and high-reliability organizations (HROs). The expected impacts are varied and operate at different system scale levels. The major one is that, for the first time, ideal biomedical engineering system categorization levels can be matched exactly to practical system modeling interaction styles, with no paradigmatic operational ambiguity and no information loss.

  10. A quasi-likelihood approach to non-negative matrix factorization

    PubMed Central

    Devarajan, Karthik; Cheung, Vincent C.K.

    2017-01-01

    A unified approach to non-negative matrix factorization based on the theory of generalized linear models is proposed. This approach embeds a variety of statistical models, including the exponential family, within a single theoretical framework and provides a unified view of such factorizations from the perspective of quasi-likelihood. Using this framework, a family of algorithms for handling signal-dependent noise is developed and its convergence proven using the Expectation-Maximization algorithm. In addition, a measure to evaluate the goodness-of-fit of the resulting factorization is described. The proposed methods allow modeling of non-linear effects via appropriate link functions and are illustrated using an application in biomedical signal processing. PMID:27348511
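One member of this quasi-likelihood family corresponds to signal-dependent Poisson noise, for which the classical Lee-Seung multiplicative updates minimize the KL divergence. The sketch below implements that special case on a toy matrix (the paper's framework is far more general):

```python
import random

random.seed(0)
EPS = 1e-9

def kl_nmf(V, rank, iters=200):
    """KL-divergence NMF via multiplicative updates (Lee-Seung)."""
    n, m = len(V), len(V[0])
    W = [[random.random() + 0.1 for _ in range(rank)] for _ in range(n)]
    H = [[random.random() + 0.1 for _ in range(m)] for _ in range(rank)]
    for _ in range(iters):
        WH = [[sum(W[i][k] * H[k][j] for k in range(rank)) for j in range(m)]
              for i in range(n)]
        for k in range(rank):           # update H
            colsum = sum(W[i][k] for i in range(n)) + EPS
            for j in range(m):
                num = sum(W[i][k] * V[i][j] / (WH[i][j] + EPS) for i in range(n))
                H[k][j] *= num / colsum
        WH = [[sum(W[i][k] * H[k][j] for k in range(rank)) for j in range(m)]
              for i in range(n)]
        for i in range(n):              # update W
            for k in range(rank):
                rowsum = sum(H[k][j] for j in range(m)) + EPS
                num = sum(H[k][j] * V[i][j] / (WH[i][j] + EPS) for j in range(m))
                W[i][k] *= num / rowsum
    return W, H

V = [[1, 0, 2], [2, 0, 4], [0, 3, 0]]   # exactly rank-2 non-negative toy matrix
W, H = kl_nmf(V, rank=2)
WH = [[sum(W[i][k] * H[k][j] for k in range(2)) for j in range(3)] for i in range(3)]
err = sum(abs(V[i][j] - WH[i][j]) for i in range(3) for j in range(3))
print(round(err, 3))
```

Swapping the KL objective for another quasi-likelihood changes only the update rule, which is the unification the paper formalizes.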

  11. A data management infrastructure for bridge monitoring

    NASA Astrophysics Data System (ADS)

    Jeong, Seongwoon; Byun, Jaewook; Kim, Daeyoung; Sohn, Hoon; Bae, In Hwan; Law, Kincho H.

    2015-04-01

    This paper discusses a data management infrastructure framework for bridge monitoring applications. As sensor technologies mature and become economically affordable, their deployment for bridge monitoring will continue to grow. Data management becomes a critical issue, not only for storing the sensor data but also for integrating it with the bridge model to support other functions, such as management, maintenance, and inspection. The focus of this study is the effective management of bridge information and sensor data, which is crucial to structural health monitoring and life cycle management of bridge structures. We review the state of the art of bridge information modeling and sensor data management, and propose a data management framework for bridge monitoring based on NoSQL database technologies, which have been shown to be useful for handling high-volume time-series data and for flexibly dealing with unstructured data schemas. Specifically, Apache Cassandra and MongoDB are deployed for the prototype implementation of the framework. This paper describes the database design for an XML-based Bridge Information Modeling (BrIM) schema and the representation of sensor data using Sensor Model Language (SensorML). The proposed prototype data management framework is validated using data collected from the Yeongjong Bridge in Incheon, Korea.

  12. Towards a formal semantics for Ada 9X

    NASA Technical Reports Server (NTRS)

    Guaspari, David; Mchugh, John; Wolfgang, Polak; Saaltink, Mark

    1995-01-01

    The Ada 9X language precision team was formed during the revisions of Ada 83, with the goal of analyzing the proposed design, identifying problems, and suggesting improvements, through the use of mathematical models. This report defines a framework for formally describing Ada 9X, based on Kahn's 'natural semantics', and applies the framework to portions of the language. The proposals for exceptions and optimization freedoms are also analyzed, using a different technique.

  13. The forecasting of menstruation based on a state-space modeling of basal body temperature time series.

    PubMed

    Fukaya, Keiichi; Kawamori, Ai; Osada, Yutaka; Kitazawa, Masumi; Ishiguro, Makio

    2017-09-20

    Women's basal body temperature (BBT) shows a periodic pattern associated with the menstrual cycle. Although this fact suggests that daily BBT time series can be useful for estimating the underlying phase state as well as for predicting the length of the current menstrual cycle, little attention has been paid to modeling BBT time series. In this study, we propose a state-space model that involves the menstrual phase as a latent state variable to explain the daily fluctuation of BBT and the menstrual cycle length. Conditional distributions of the phase are obtained by using sequential Bayesian filtering techniques. A predictive distribution of the next menstruation day can be derived from this conditional distribution and the model, leading to a novel statistical framework that provides a sequentially updated prediction of the upcoming menstruation day. We applied this framework to a real data set of women's BBT and menstruation days and compared the prediction accuracy of the proposed method with that of previous methods, showing that the proposed method generally provides better predictions. Because BBT can be obtained with relatively little cost and effort, the proposed method can be useful for women's health management. Potential extensions of this framework as the basis for modeling and predicting events associated with the menstrual cycle are discussed. © 2017 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
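The idea of filtering a latent phase from noisy daily BBT can be sketched with a grid-based Bayesian filter. The cycle length, two-level temperature template, and noise levels below are illustrative, not the paper's fitted model:

```python
import math, random

random.seed(3)
CYCLE = 28
# simple two-level BBT template: follicular low, luteal high (illustrative)
mu = [36.3 if d < 14 else 36.7 for d in range(CYCLE)]
SIGMA = 0.15

def normal_pdf(x, m, s):
    return math.exp(-((x - m) ** 2) / (2 * s * s)) / (s * math.sqrt(2 * math.pi))

def filter_step(belief, temp):
    # predict: advance the phase by one day, with slight timing noise
    pred = [0.0] * CYCLE
    for d, p in enumerate(belief):
        pred[(d + 1) % CYCLE] += 0.8 * p
        pred[d] += 0.1 * p
        pred[(d + 2) % CYCLE] += 0.1 * p
    # update: weight by the likelihood of today's BBT reading
    post = [p * normal_pdf(temp, mu[d], SIGMA) for d, p in enumerate(pred)]
    z = sum(post)
    return [p / z for p in post]

belief = [1.0 / CYCLE] * CYCLE          # start fully uncertain about the phase
true_phase = 5
for _ in range(20):                     # 20 days of simulated noisy readings
    true_phase = (true_phase + 1) % CYCLE
    temp = random.gauss(mu[true_phase], SIGMA)
    belief = filter_step(belief, temp)

est = max(range(CYCLE), key=belief.__getitem__)
print("estimated phase:", est, "true phase:", true_phase)
```

From the filtered phase distribution, the days remaining until the phase wraps around give a predictive distribution of the next menstruation day, which is the quantity the paper's framework updates sequentially.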

  14. A methodological framework to support the initiation, design and institutionalization of participatory modeling processes in water resources management

    NASA Astrophysics Data System (ADS)

    Halbe, Johannes; Pahl-Wostl, Claudia; Adamowski, Jan

    2018-01-01

    Multiple barriers constrain the widespread application of participatory methods in water management, including the more technical focus of most water agencies, additional cost and time requirements for stakeholder involvement, as well as institutional structures that impede collaborative management. This paper presents a stepwise methodological framework that addresses the challenges of context-sensitive initiation, design and institutionalization of participatory modeling processes. The methodological framework consists of five successive stages: (1) problem framing and stakeholder analysis, (2) process design, (3) individual modeling, (4) group model building, and (5) institutionalized participatory modeling. The Management and Transition Framework is used for problem diagnosis (Stage One), context-sensitive process design (Stage Two) and analysis of requirements for the institutionalization of participatory water management (Stage Five). Conceptual modeling is used to initiate participatory modeling processes (Stage Three) and ensure a high compatibility with quantitative modeling approaches (Stage Four). This paper describes the proposed participatory model building (PMB) framework and provides a case study of its application in Québec, Canada. The results of the Québec study demonstrate the applicability of the PMB framework for initiating and designing participatory model building processes and analyzing barriers towards institutionalization.

  15. A MULTISCALE FRAMEWORK FOR THE STOCHASTIC ASSIMILATION AND MODELING OF UNCERTAINTY ASSOCIATED WITH NCF COMPOSITE MATERIALS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mehrez, Loujaine; Ghanem, Roger; McAuliffe, Colin

    A multiscale framework to construct stochastic macroscopic constitutive material models is proposed. A spectral projection approach, specifically polynomial chaos expansion, has been used to construct explicit functional relationships between the homogenized properties and input parameters from finer scales. A homogenization engine embedded in Multiscale Designer, a software package for composite materials, has been used for the upscaling process. The framework is demonstrated on non-crimp fabric composite materials by constructing probabilistic models of the homogenized properties of a non-crimp fabric laminate in terms of the input parameters together with the homogenized properties from finer scales.

  16. The Roy Adaptation Model: A Theoretical Framework for Nurses Providing Care to Individuals With Anorexia Nervosa.

    PubMed

    Jennings, Karen M

    Using a nursing theoretical framework to understand, elucidate, and propose nursing research is fundamental to knowledge development. This article presents the Roy Adaptation Model as a theoretical framework to better understand individuals with anorexia nervosa during acute treatment, and the role of nursing assessments and interventions in the promotion of weight restoration. Nursing assessments and interventions situated within the Roy Adaptation Model take into consideration how weight restoration does not occur in isolation but rather reflects an adaptive process within external and internal environments, and has the potential for more holistic care.

  17. A Penalized Likelihood Framework For High-Dimensional Phylogenetic Comparative Methods And An Application To New-World Monkeys Brain Evolution.

    PubMed

    Clavel, Julien; Aristide, Leandro; Morlon, Hélène

    2018-06-19

    Working with high-dimensional phylogenetic comparative datasets is challenging because likelihood-based multivariate methods suffer from poor statistical performance as the number of traits p approaches the number of species n, and because computational complications occur when p exceeds n. Alternative phylogenetic comparative methods have recently been proposed to deal with the large-p, small-n scenario, but their use and performance are limited. Here we develop a penalized likelihood framework to deal with high-dimensional comparative datasets. We propose various penalizations and methods for selecting the intensity of the penalties. We apply this general framework to the estimation of parameters (the evolutionary trait covariance matrix and parameters of the evolutionary model) and model comparison for the high-dimensional multivariate Brownian (BM), Early-burst (EB), Ornstein-Uhlenbeck (OU), and Pagel's lambda models. We show using simulations that our penalized likelihood approach dramatically improves the estimation of evolutionary trait covariance matrices and model parameters when p approaches n, and allows for their accurate estimation when p equals or exceeds n. In addition, we show that penalized likelihood models can be efficiently compared using the Generalized Information Criterion (GIC). We implement these methods, as well as the related estimation of ancestral states and the computation of phylogenetic PCA, in the R packages RPANDA and mvMORPH. Finally, we illustrate the utility of the proposed framework by evaluating evolutionary model fit, analyzing integration patterns, and reconstructing evolutionary trajectories for a high-dimensional 3-D dataset of brain shape in New World monkeys. We find clear support for an Early-burst model, suggesting an early diversification of brain morphology during the ecological radiation of the clade. Penalized likelihood offers an efficient way to deal with high-dimensional multivariate comparative data.
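Why penalization helps when p approaches n can be seen with a linear-shrinkage sketch: blending the sample covariance with a diagonal target improves conditioning. This is a simplified stand-in for the paper's penalized-likelihood machinery, and the shrinkage target and tuning values are illustrative:

```python
import random

random.seed(7)
n, p = 10, 8                      # few "species", almost as many "traits"

# sample covariance of a small random trait matrix
X = [[random.gauss(0, 1) for _ in range(p)] for _ in range(n)]
means = [sum(X[i][j] for i in range(n)) / n for j in range(p)]
S = [[sum((X[i][a] - means[a]) * (X[i][b] - means[b]) for i in range(n)) / n
      for b in range(p)] for a in range(p)]

def shrink(S, lam):
    """Blend the sample covariance with its diagonal: (1-lam)*S + lam*diag(S)."""
    return [[(1 - lam) * S[a][b] + (lam * S[a][a] if a == b else 0.0)
             for b in range(p)] for a in range(p)]

def min_eigen_lower_bound(M):
    """Gershgorin lower bound on the smallest eigenvalue."""
    return min(M[a][a] - sum(abs(M[a][b]) for b in range(p) if b != a)
               for a in range(p))

for lam in (0.0, 0.5, 0.9):
    print(lam, round(min_eigen_lower_bound(shrink(S, lam)), 3))
```

Increasing the penalty pushes the Gershgorin bound up, keeping the estimate well conditioned; the paper's framework additionally ties such penalties to the phylogenetic likelihood and selects their intensity by criteria such as the GIC.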

  18. Developing a theoretical framework for complex community-based interventions.

    PubMed

    Angeles, Ricardo N; Dolovich, Lisa; Kaczorowski, Janusz; Thabane, Lehana

    2014-01-01

    Applying existing theories to research, in the form of a theoretical framework, is necessary to advance knowledge from what is already known toward the next steps to be taken. This article proposes a guide on how to develop a theoretical framework for complex community-based interventions, using the Cardiovascular Health Awareness Program as an example. Developing a theoretical framework starts with identifying the intervention's essential elements. Subsequent steps include the following: (a) identifying and defining the different variables (independent, dependent, mediating/intervening, moderating, and control); (b) postulating the mechanisms by which the independent variables lead to the dependent variables; (c) identifying existing theoretical models supporting the theoretical framework under development; (d) scripting the theoretical framework into a figure or sets of statements as a series of hypotheses, if/then logic statements, or a visual model; (e) content and face validation of the theoretical framework; and (f) revising the theoretical framework. In our example, we combined the "diffusion of innovation theory" and the "health belief model" to develop our framework. Using the Cardiovascular Health Awareness Program as the model, we demonstrated a stepwise process of developing a theoretical framework. The challenges encountered are described, and an overview of the strategies employed to overcome these challenges is presented.

  19. Toward a New Paradigm: Governance in a Broader Framework.

    ERIC Educational Resources Information Center

    Deegan, William L.

    1985-01-01

    Argues that the issues and trends of the past decade make it necessary to reconsider governance processes and the way substantive issues are generated. Reviews major models for governance and proposes a broader, more integrated framework for analyzing governance issues. (DMM)

  20. A SVM framework for fault detection of the braking system in a high speed train

    NASA Astrophysics Data System (ADS)

    Liu, Jie; Li, Yan-Fu; Zio, Enrico

    2017-03-01

    As of April 2015, the number of operating High Speed Trains (HSTs) in the world had reached 3603. An efficient, effective and highly reliable braking system is evidently critical for trains running at speeds around 300 km/h. Failure of a highly reliable braking system is a rare event and, consequently, informative recorded data on fault conditions are scarce. This renders fault detection a classification problem with highly unbalanced data. In this paper, a Support Vector Machine (SVM) framework, including feature selection, feature vector selection, model construction and decision boundary optimization, is proposed for tackling this problem. Feature vector selection can largely reduce the data size and, thus, the computational burden. The constructed model is a modified version of the least squares SVM, in which a higher cost is assigned to misclassification of faulty conditions than to misclassification of normal conditions. The proposed framework is successfully validated on a number of public unbalanced datasets. It is then applied to fault detection of braking systems in HSTs: in comparison with several SVM approaches for unbalanced datasets, the proposed framework gives better results.
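
    The cost-sensitive idea (assigning a higher misclassification cost to the rare faulty class) can be sketched with a class-weighted SVM. This uses scikit-learn's standard `class_weight` mechanism as a stand-in for the authors' modified least-squares SVM; the synthetic data and the 20:1 cost ratio are invented for illustration:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)

# Synthetic unbalanced data: 500 "normal" samples vs. 25 "faulty" ones.
X_normal = rng.normal(loc=0.0, scale=1.0, size=(500, 2))
X_fault = rng.normal(loc=3.0, scale=1.0, size=(25, 2))
X = np.vstack([X_normal, X_fault])
y = np.array([0] * 500 + [1] * 25)

# Assign a 20x higher cost to errors on the rare faulty class, mirroring
# the paper's idea of penalizing faulty-condition misclassification more.
clf = SVC(kernel="rbf", class_weight={0: 1, 1: 20}).fit(X, y)

faults_found = int(clf.predict(X_fault).sum())
print(faults_found, "of 25 faults detected")
```

    Without the class weighting, a standard SVM trained on such skewed data tends to favor the majority (normal) class and miss faults.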

  1. A novel framework for virtual prototyping of rehabilitation exoskeletons.

    PubMed

    Agarwal, Priyanshu; Kuo, Pei-Hsin; Neptune, Richard R; Deshpande, Ashish D

    2013-06-01

    Human-worn rehabilitation exoskeletons have the potential to make therapeutic exercises increasingly accessible to disabled individuals while reducing the cost and labor involved in rehabilitation therapy. In this work, we propose a novel human-model-in-the-loop framework for virtual prototyping (design, control and experimentation) of rehabilitation exoskeletons by merging computational musculoskeletal analysis with simulation-based design techniques. The framework allows the design and control algorithm of an exoskeleton to be iteratively optimized through simulation. We introduce biomechanical, morphological, and controller measures to quantify the performance of the device for the optimization study. Furthermore, the framework allows one to carry out virtual experiments for testing specific "what-if" scenarios to quantify device performance and recovery progress. To illustrate the application of the framework, we present a case study wherein the design and analysis of an index-finger exoskeleton is carried out using the proposed framework.

  2. Analysis of Implicit Uncertain Systems. Part 1: Theoretical Framework

    DTIC Science & Technology

    1994-12-07

    Analysis of Implicit Uncertain Systems, Part I: Theoretical Framework. Fernando Paganini, John Doyle. December 7, 1994. Abstract: This paper ... model and a number of constraints relevant to the analysis problem under consideration. In Part I of this paper we propose a theoretical framework which ...

  3. A statistical framework for biomedical literature mining.

    PubMed

    Chung, Dongjun; Lawson, Andrew; Zheng, W Jim

    2017-09-30

    In systems biology, it is of great interest to identify new genes that were not previously reported to be associated with biological pathways related to various functions and diseases. Identification of these new pathway-modulating genes not only promotes understanding of pathway regulation mechanisms but also allows identification of novel targets for therapeutics. Recently, biomedical literature has been considered a valuable resource for investigating pathway-modulating genes. While the majority of currently available approaches are based on the co-occurrence of genes within an abstract, it has been reported that these approaches show only sub-optimal performance because 70% of abstracts contain information for only a single gene. To overcome this limitation, we propose a novel statistical framework based on the concept of the ontology fingerprint, which uses gene ontology to extract information from large biomedical literature data. The proposed framework simultaneously identifies pathway-modulating genes and facilitates interpreting the functions of these new genes. We also propose a computationally efficient posterior inference procedure based on Metropolis-Hastings within Gibbs sampling for parameter updates and the poor man's reversible jump Markov chain Monte Carlo approach for model selection. We evaluate the proposed statistical framework with simulation studies, experimental validation, and an application to studies of pathway-modulating genes in yeast. The R implementation of the proposed model is currently available at https://dongjunchung.github.io/bayesGO/. Copyright © 2017 John Wiley & Sons, Ltd.
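
    The flavor of the Metropolis-Hastings updates used inside such a Gibbs sampler can be illustrated on a deliberately tiny toy posterior (a single location parameter with a Gaussian prior). This is not the ontology-fingerprint model; all distributions and numbers below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)

def log_post(theta, data):
    # Toy log-posterior: N(theta | 0, 10^2) prior, N(data | theta, 1) likelihood.
    return -0.5 * theta**2 / 100.0 - 0.5 * np.sum((data - theta) ** 2)

def mh_step(theta, data, step=0.5):
    """One Metropolis-Hastings update with a Gaussian random-walk proposal."""
    prop = theta + step * rng.normal()
    if np.log(rng.uniform()) < log_post(prop, data) - log_post(theta, data):
        return prop
    return theta

data = rng.normal(loc=2.0, size=50)
theta, draws = 0.0, []
for _ in range(5000):
    theta = mh_step(theta, data)
    draws.append(theta)

# After burn-in, the chain concentrates near the posterior mode
# (here approximately the sample mean of the data).
print(np.mean(draws[1000:]))
```

    In a Metropolis-Hastings-within-Gibbs scheme, an update like `mh_step` replaces direct sampling for any parameter whose full conditional cannot be sampled in closed form.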

  4. What can we learn from international comparisons of health systems and health system reform?

    PubMed Central

    McPake, B.; Mills, A.

    2000-01-01

    Most commonly, lessons derived from comparisons of international health sector reform can only be generalized in a limited way to similar countries. However, there is little guidance as to what constitutes "similarity" in this respect. We propose that a framework for assessing similarity could be derived from the performance of individual policies in different contexts, and from the cause-and-effect processes related to the policies. We demonstrate this process by considering research evidence on the "public-private mix", and propose variables for an initial framework that we believe determine private involvement in the public health sector. The most influential model of public leadership places the private role in a contracting framework. Research in countries that have adopted this model suggests an additional list of variables to add to the framework. The variables can be grouped under the headings "demand factors", "supply factors", and "strength of the public sector". These illustrate the nature of a framework that could emerge, and which would help countries aiming to learn from international experience. PMID:10916918

  5. An Ensemble-Based Forecasting Framework to Optimize Reservoir Releases

    NASA Astrophysics Data System (ADS)

    Ramaswamy, V.; Saleh, F.

    2017-12-01

    The increasing frequency of extreme precipitation events is stressing the need to manage water resources on shorter timescales. Short-term management of water resources becomes proactive when inflow forecasts are available and this information can be used effectively in the control strategy. This work investigates the utility of short-term hydrological ensemble forecasts for operational decision making during extreme weather events. An advanced automated hydrologic prediction framework integrating a regional-scale hydrologic model, GIS datasets and the meteorological ensemble predictions from the European Centre for Medium-Range Weather Forecasts (ECMWF) was coupled to an implicit multi-objective dynamic programming model to optimize releases from a water supply reservoir. The proposed methodology was evaluated by retrospectively forecasting the inflows to the Oradell reservoir in the Hackensack River basin in New Jersey during an extreme hydrologic event, Hurricane Irene. Additionally, the flexibility of the forecasting framework was investigated by forecasting the inflows from a moderate rainfall event, to provide important perspectives on using the framework to assist reservoir operations during moderate events. The proposed forecasting framework seeks to provide a flexible, assistive tool to alleviate the complexity of operational decision-making.
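
    The release-optimization component can be illustrated with a deterministic, single-objective dynamic program over discretized storage levels. The real framework is an implicit multi-objective model driven by ensemble forecasts; the inflows, capacity, demand, and quadratic cost below are invented for illustration:

```python
import numpy as np

# Hypothetical 6-step inflow forecast (units: storage volumes per step).
inflows = [3, 8, 12, 6, 2, 1]
CAPACITY, MAX_RELEASE, DEMAND = 20, 10, 4

# Backward dynamic programming over discretized storage levels:
# minimize squared deviation of releases from demand, subject to
# storage staying within [0, CAPACITY] (no spills, no emptying).
cost = np.zeros(CAPACITY + 1)
policy = []
for q in reversed(inflows):
    new_cost = np.full(CAPACITY + 1, np.inf)
    step_policy = np.zeros(CAPACITY + 1, dtype=int)
    for s in range(CAPACITY + 1):
        for r in range(MAX_RELEASE + 1):
            nxt = s + q - r                  # mass balance
            if 0 <= nxt <= CAPACITY:
                c = (r - DEMAND) ** 2 + cost[nxt]
                if c < new_cost[s]:
                    new_cost[s], step_policy[s] = c, r
    cost, policy = new_cost, [step_policy] + policy

# Simulate forward from half-full storage using the optimal policy.
s, releases = 10, []
for t, q in enumerate(inflows):
    r = int(policy[t][s])
    releases.append(r)
    s = s + q - r
print(releases, "final storage:", s)
```

    In an ensemble setting, a program like this would be solved (or evaluated) across the forecast members to characterize the spread of recommended releases.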

  6. Skew-t partially linear mixed-effects models for AIDS clinical studies.

    PubMed

    Lu, Tao

    2016-01-01

    We propose partially linear mixed-effects models with asymmetry and missingness to investigate the relationship between two biomarkers in clinical studies. The proposed models take into account irregular time effects commonly observed in clinical studies under a semiparametric model framework. In addition, the commonly assumed symmetric distributions for model errors are replaced by asymmetric distributions to account for skewness, and an informative missing-data mechanism is accounted for. A Bayesian approach is developed to perform parameter estimation simultaneously. The proposed model and method are applied to an AIDS dataset, and comparisons with alternative models are performed.

  7. Models for truth-telling in physician-patient encounters: what can we learn from Yoruba concept of Ooto?

    PubMed

    Ewuoso, Cornelius

    2017-09-29

    Empirical studies have now established that many patients make clinical decisions based on models other than the Anglo-American model of truth-telling and patient autonomy. Some scholars also add that current medical ethics frameworks and recent proposals for enhancing communication in the health professional-patient relationship have not adequately accommodated these models. In certain clinical contexts where health professionals and patients are motivated by significant cultural and religious values, these current frameworks cannot prevent communication breakdown, which can, in turn, jeopardize patient care, cause undue distress to a patient, or negatively impact his/her relationship with the community. These empirical studies therefore recommend that additional frameworks be developed around other models of truth-telling: frameworks that take seriously the significant value differences that sometimes exist between health professionals and patients, as well as patients' cultural/religious values and relational capacities. This paper contributes towards the development of one. Specifically, this study proposes a framework for truth-telling developed around an African model of truth-telling by drawing insights from the communitarian concept of ootọ́ amongst the Yoruba people of southwest Nigeria. I am optimistic that if this model is incorporated into current medical ethics codes and curricula, it will significantly enhance health professional-patient communication. © 2017 John Wiley & Sons Ltd.

  8. Automatic Earth observation data service based on reusable geo-processing workflow

    NASA Astrophysics Data System (ADS)

    Chen, Nengcheng; Di, Liping; Gong, Jianya; Yu, Genong; Min, Min

    2008-12-01

    A common Sensor Web data service framework for Geo-Processing Workflow (GPW) is presented as part of the NASA Sensor Web project. This framework consists of a data service node, a data processing node, a data presentation node, a Catalogue Service node and a BPEL engine. An abstract model designer is used to design the top-level GPW model, a model instantiation service is used to generate the concrete BPEL, and the BPEL execution engine is used to run it. The framework is used to generate several kinds of data: raw data from live sensors, coverage or feature data, geospatial products, or sensor maps. A scenario for an EO-1 Sensor Web data service for fire classification is used to test the feasibility of the proposed framework. The execution time and influences of the service framework are evaluated. The experiments show that this framework can improve the quality of services for sensor data retrieval and processing.

  9. A framework to simulate small shallow inland water bodies in semi-arid regions

    NASA Astrophysics Data System (ADS)

    Abbasi, Ali; Ohene Annor, Frank; van de Giesen, Nick

    2017-12-01

    In this study, a framework for simulating the flow field and heat transfer processes in small shallow inland water bodies has been developed. As the dynamics and thermal structure of these water bodies are crucial in studying the quality of the stored water, and in assessing the heat fluxes from their surfaces as well, heat transfer and temperature were simulated. The proposed model is able to simulate the full 3-D water flow and heat transfer in the water body by applying complex and time-varying boundary conditions. In this model, the continuity, momentum and temperature equations, together with the turbulence equations, which incorporate the buoyancy effect, have been solved. The model is built on the Reynolds-Averaged Navier-Stokes (RANS) equations with the widely used Boussinesq approach to treat the turbulence of the flow field. Micrometeorological data were obtained from an Automatic Weather Station (AWS) installed on the site and combined with field bathymetric measurements for the model. In the framework developed, a simple, applicable and generalizable approach is proposed for preparing the geometry of small shallow water bodies using coarsely measured bathymetry. All parts of the framework are based on open-source tools, which is essential for developing countries.

  10. An Evaluation Model for a Multidisciplinary Chronic Pelvic Pain Clinic: Application of the RE-AIM Framework.

    PubMed

    Chen, Innie; Money, Deborah; Yong, Paul; Williams, Christina; Allaire, Catherine

    2015-09-01

    Chronic pelvic pain (CPP) is a prevalent, debilitating, and costly condition. Although national guidelines and empiric evidence support the use of a multidisciplinary model of care for such patients, such clinics are uncommon in Canada. The BC Women's Centre for Pelvic Pain and Endometriosis was created to respond to this need, and there is interest in this model of care's impact on the burden of disease in British Columbia. We sought to create an approach to its evaluation using the RE-AIM (Reach, Efficacy, Adoption, Implementation, Maintenance) evaluation framework to assess the impact of the care model and to guide clinical decision-making and policy. The RE-AIM evaluation framework was applied to consider the different dimensions of impact of the BC Centre. The proposed measures, data sources, and data management strategies for this mixed-methods approach were identified. The five dimensions of impact were considered at individual and organizational levels, and corresponding indicators were proposed to enable integration into existing data infrastructure to facilitate collection and early program evaluation. The RE-AIM framework can be applied to the evaluation of a multidisciplinary chronic pelvic pain clinic. This will allow better assessment of the impact of innovative models of care for women with chronic pelvic pain.

  11. Structural Equation Models in a Redundancy Analysis Framework With Covariates.

    PubMed

    Lovaglio, Pietro Giorgio; Vittadini, Giorgio

    2014-01-01

    A recent method to specify and fit structural equation models in the Redundancy Analysis framework, based on so-called Extended Redundancy Analysis (ERA), has been proposed in the literature. In this approach, the relationships between the observed exogenous variables and the observed endogenous variables are moderated by the presence of unobservable composites, estimated as linear combinations of exogenous variables. However, in the presence of direct effects linking exogenous and endogenous variables, or concomitant indicators, the composite scores are estimated by ignoring the presence of the specified direct effects. To fit structural equation models, we propose a new specification and estimation method, called Generalized Redundancy Analysis (GRA), allowing us to specify and fit a variety of relationships among composites, endogenous variables, and external covariates. The proposed methodology extends the ERA method, using a more suitable specification and estimation algorithm, by allowing for covariates that affect endogenous indicators indirectly through the composites and/or directly. To illustrate the advantages of GRA over ERA, we present a simulation study with small samples. Moreover, we present an application aimed at estimating the impact of formal human capital on the initial earnings of graduates of an Italian university, utilizing a structural model consistent with well-established economic theory.

  12. An ecological framework for informing permitting decisions on scientific activities in protected areas

    PubMed Central

    Saarman, Emily T.; Owens, Brian; Murray, Steven N.; Weisberg, Stephen B.; Field, John C.; Nielsen, Karina J.

    2018-01-01

    There are numerous reasons to conduct scientific research within protected areas, but research activities may also negatively impact organisms and habitats, and thus conflict with a protected area’s conservation goals. We developed a quantitative ecological decision-support framework that estimates these potential impacts so managers can weigh costs and benefits of proposed research projects and make informed permitting decisions. The framework generates quantitative estimates of the ecological impacts of the project and the cumulative impacts of the proposed project and all other projects in the protected area, and then compares the estimated cumulative impacts of all projects with policy-based acceptable impact thresholds. We use a series of simplified equations (models) to assess the impacts of proposed research to: a) the population of any targeted species, b) the major ecological assemblages that make up the community, and c) the physical habitat that supports protected area biota. These models consider both targeted and incidental impacts to the ecosystem and include consideration of the vulnerability of targeted species, assemblages, and habitats, based on their recovery time and ecological role. We parameterized the models for a wide variety of potential research activities that regularly occur in the study area using a combination of literature review and expert judgment with a precautionary approach to uncertainty. We also conducted sensitivity analyses to examine the relationships between model input parameters and estimated impacts to understand the dominant drivers of the ecological impact estimates. Although the decision-support framework was designed for and adopted by the California Department of Fish and Wildlife for permitting scientific studies in the state-wide network of marine protected areas (MPAs), the framework can readily be adapted for terrestrial and freshwater protected areas. PMID:29920527
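
    The cumulative-impact bookkeeping described above can be sketched as follows; the activity names, per-unit impacts, vulnerability multiplier, and threshold are all hypothetical stand-ins for the literature- and expert-derived parameters used in the actual framework:

```python
# Hypothetical proposed projects in one protected area; "take" is the
# number of organisms affected and per_unit_impact is an impact score.
projects = {
    "fish tagging":   {"take": 120, "per_unit_impact": 0.002},
    "benthic survey": {"take": 40,  "per_unit_impact": 0.001},
    "water sampling": {"take": 10,  "per_unit_impact": 0.0005},
}

def project_impact(p, vulnerability=1.5):
    # Impact scales with the number of organisms affected and a
    # vulnerability multiplier reflecting recovery time / ecological role.
    return p["take"] * p["per_unit_impact"] * vulnerability

# Cumulative impact of all projects, compared with a policy threshold.
cumulative = sum(project_impact(p) for p in projects.values())
THRESHOLD = 0.5   # policy-based acceptable-impact threshold (illustrative)

print(round(cumulative, 4), cumulative <= THRESHOLD)
```

    A manager would run this comparison for the targeted species, each ecological assemblage, and the physical habitat, permitting a new project only if every cumulative total stays below its threshold.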

  13. Secure and Efficient Regression Analysis Using a Hybrid Cryptographic Framework: Development and Evaluation

    PubMed Central

    Jiang, Xiaoqian; Aziz, Md Momin Al; Wang, Shuang; Mohammed, Noman

    2018-01-01

    Background Machine learning is an effective data-driven tool that is being widely used to extract valuable patterns and insights from data. Specifically, predictive machine learning models are very important in health care for clinical data analysis. The machine learning algorithms that generate predictive models often require pooling data from different sources to discover statistical patterns or correlations among different attributes of the input data. The primary challenge is to fulfill one major objective: preserving the privacy of individuals while discovering knowledge from data. Objective Our objective was to develop a hybrid cryptographic framework for performing regression analysis over distributed data in a secure and efficient way. Methods Existing secure computation schemes are not suitable for processing the large-scale data that are used in cutting-edge machine learning applications. We designed, developed, and evaluated a hybrid cryptographic framework, which can securely perform regression analysis, a fundamental machine learning algorithm, using somewhat homomorphic encryption and a newly introduced secure hardware component of Intel Software Guard Extensions (Intel SGX) to ensure both privacy and efficiency at the same time. Results Experimental results demonstrate that our proposed method provides a better trade-off in terms of security and efficiency than solely secure hardware-based methods. Moreover, there is no approximation error: the computed model parameters are identical to the plaintext results. Conclusions To the best of our knowledge, this kind of secure computation model using a hybrid cryptographic framework, which leverages both somewhat homomorphic encryption and Intel SGX, has not been proposed or evaluated to date. Our proposed framework ensures data security and computational efficiency at the same time. PMID:29506966

  14. Secure and Efficient Regression Analysis Using a Hybrid Cryptographic Framework: Development and Evaluation.

    PubMed

    Sadat, Md Nazmus; Jiang, Xiaoqian; Aziz, Md Momin Al; Wang, Shuang; Mohammed, Noman

    2018-03-05

    Machine learning is an effective data-driven tool that is being widely used to extract valuable patterns and insights from data. Specifically, predictive machine learning models are very important in health care for clinical data analysis. The machine learning algorithms that generate predictive models often require pooling data from different sources to discover statistical patterns or correlations among different attributes of the input data. The primary challenge is to fulfill one major objective: preserving the privacy of individuals while discovering knowledge from data. Our objective was to develop a hybrid cryptographic framework for performing regression analysis over distributed data in a secure and efficient way. Existing secure computation schemes are not suitable for processing the large-scale data that are used in cutting-edge machine learning applications. We designed, developed, and evaluated a hybrid cryptographic framework, which can securely perform regression analysis, a fundamental machine learning algorithm, using somewhat homomorphic encryption and a newly introduced secure hardware component of Intel Software Guard Extensions (Intel SGX) to ensure both privacy and efficiency at the same time. Experimental results demonstrate that our proposed method provides a better trade-off in terms of security and efficiency than solely secure hardware-based methods. Moreover, there is no approximation error: the computed model parameters are identical to the plaintext results. To the best of our knowledge, this kind of secure computation model using a hybrid cryptographic framework, which leverages both somewhat homomorphic encryption and Intel SGX, has not been proposed or evaluated to date. Our proposed framework ensures data security and computational efficiency at the same time. ©Md Nazmus Sadat, Xiaoqian Jiang, Md Momin Al Aziz, Shuang Wang, Noman Mohammed. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 05.03.2018.

  15. A Content-Adaptive Analysis and Representation Framework for Audio Event Discovery from "Unscripted" Multimedia

    NASA Astrophysics Data System (ADS)

    Radhakrishnan, Regunathan; Divakaran, Ajay; Xiong, Ziyou; Otsuka, Isao

    2006-12-01

    We propose a content-adaptive analysis and representation framework to discover events using audio features from "unscripted" multimedia such as sports and surveillance for summarization. The proposed analysis framework performs an inlier/outlier-based temporal segmentation of the content. It is motivated by the observation that "interesting" events in unscripted multimedia occur sparsely in a background of usual or "uninteresting" events. We treat the sequence of low/mid-level features extracted from the audio as a time series and identify subsequences that are outliers. The outlier detection is based on eigenvector analysis of the affinity matrix constructed from statistical models estimated from the subsequences of the time series. We define the confidence measure on each of the detected outliers as the probability that it is an outlier. Then, we establish a relationship between the parameters of the proposed framework and the confidence measure. Furthermore, we use the confidence measure to rank the detected outliers in terms of their departures from the background process. Our experimental results with sequences of low- and mid-level audio features extracted from sports video show that "highlight" events can be extracted effectively as outliers from a background process using the proposed framework. We proceed to show the effectiveness of the proposed framework in bringing out suspicious events from surveillance videos without any a priori knowledge. We show that such temporal segmentation into background and outliers, along with the ranking based on the departure from the background, can be used to generate content summaries of any desired length. Finally, we also show that the proposed framework can be used to systematically select "key audio classes" that are indicative of events of interest in the chosen domain.
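
    The inlier/outlier segmentation can be sketched on a toy 1-D series standing in for the audio features: subsequences are summarized, an affinity matrix is built from their pairwise distances, and subsequences weakly tied to the dominant eigenvector's cluster are flagged as outliers. The window statistics and Gaussian kernel below are simplifications of the paper's statistical-model-based affinities:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy "audio feature" time series: quiet background with one loud burst.
series = rng.normal(0.0, 1.0, size=400)
series[200:220] += 8.0                      # rare "interesting" event

# Split into subsequences and summarize each with simple statistics.
win = 20
windows = series.reshape(-1, win)           # 20 windows of length 20
stats = np.column_stack([windows.mean(axis=1), windows.std(axis=1)])

# Affinity matrix: Gaussian kernel on distances between window statistics.
d = np.linalg.norm(stats[:, None, :] - stats[None, :, :], axis=-1)
A = np.exp(-(d ** 2) / (2.0 * np.median(d) ** 2))

# The top eigenvector of the affinity matrix concentrates on the dominant
# background cluster; the window with the smallest component is the outlier.
w, v = np.linalg.eigh(A)
score = np.abs(v[:, -1])
outlier = int(np.argmin(score))
print(outlier)
```

    The burst was injected into samples 200-219, i.e., window 10, which is the window the eigenvector analysis singles out. A confidence measure and ranking, as in the paper, would then be derived from how far each outlier departs from the background model.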

  16. Research recruitment: A marketing framework to improve sample representativeness in health research.

    PubMed

    Howcutt, Sarah J; Barnett, Anna L; Barbosa-Boucas, Sofia; Smith, Lesley A

    2018-04-01

    This discussion paper proposes a five-part theoretical framework to inform recruitment strategies. The framework is based on a marketing model of consumer decision-making. Respondents in surveys are typically healthier than non-respondents, which has an impact on the availability of information about those most in need. Previous research has identified response patterns, provided theories about why people participate in research and evaluated different recruitment strategies. Social marketing has been applied successfully to recruitment and promotes focus on the needs of the participant, but little attention has been paid to the periods before and after participant-researcher contact (during advertising and following completion of studies). We propose a new model which conceptualises participation as a decision involving motivation, perception of information, attitude formation, integration of intention and action, and finally evaluation and sharing of experience. This discussion paper presents a critical review. No literature was excluded by date, and the included citations span the years 1981-2017. The proposed framework suggests that researchers could engage a broader demographic if they shape research design and advertising to perform functions that participants are seeking to achieve. The framework provides a novel and useful conceptualisation of recruitment which could help to inform public engagement in research design, researcher training and research policy. This framework challenges researchers to investigate the goals of potential participants when designing a study's advertising and procedures. © 2017 John Wiley & Sons Ltd.

  17. Intelligent and robust optimization frameworks for smart grids

    NASA Astrophysics Data System (ADS)

    Dhansri, Naren Reddy

    A smart grid implies a cyberspace real-time distributed power control system to optimally deliver electricity based on varying consumer characteristics. Although smart grids solve many contemporary problems, they give rise to new control and optimization problems with the growing role of renewable energy sources such as wind and solar. Under the highly dynamic nature of distributed power generation and varying consumer demand and cost requirements, the total power output of the grid should be controlled such that the load demand is met while giving higher priority to renewable energy sources. Hence, the power generated from renewable energy sources should be maximized while minimizing the generation from non-renewable energy sources. This research develops a demand-based automatic generation control and optimization framework for real-time smart grid operations by integrating conventional and renewable energy sources under varying consumer demand and cost requirements. Focusing on renewable energy sources, the intelligent and robust control frameworks optimize power generation by tracking consumer demand in a closed loop, yielding superior economic and ecological benefits, circumventing nonlinear model complexities, and handling uncertainties for superior real-time operation. The proposed intelligent system framework optimizes smart grid power generation for maximum economic and ecological benefit under an uncertain renewable wind energy source. The numerical results demonstrate that the proposed framework is a viable approach to integrating various energy sources in real-time smart grid implementations. The robust optimization framework results demonstrate the effectiveness of the robust controllers under bounded power plant model uncertainties and exogenous wind input excitation while maximizing economic and ecological performance objectives. Therefore, the proposed framework offers a new worst-case deterministic optimization algorithm for smart grid automatic generation control.

  18. Models and Frameworks: A Synergistic Association for Developing Component-Based Applications

    PubMed Central

    Sánchez-Ledesma, Francisco; Sánchez, Pedro; Pastor, Juan A.; Álvarez, Bárbara

    2014-01-01

    The use of frameworks and components has been shown to be effective in improving software productivity and quality. However, the results in terms of reuse and standardization show a dearth of portability either of designs or of component-based implementations. This paper, which is based on the model driven software development paradigm, presents an approach that separates the description of component-based applications from their possible implementations for different platforms. This separation is supported by automatic integration of the code obtained from the input models into frameworks implemented using object-oriented technology. Thus, the approach combines the benefits of modeling applications from a higher level of abstraction than objects, with the higher levels of code reuse provided by frameworks. In order to illustrate the benefits of the proposed approach, two representative case studies that use both an existing framework and an ad hoc framework, are described. Finally, our approach is compared with other alternatives in terms of the cost of software development. PMID:25147858

  19. A Theoretically Consistent Framework for Modelling Lagrangian Particle Deposition in Plant Canopies

    NASA Astrophysics Data System (ADS)

    Bailey, Brian N.; Stoll, Rob; Pardyjak, Eric R.

    2018-06-01

    We present a theoretically consistent framework for modelling Lagrangian particle deposition in plant canopies. The primary focus is on describing the probability of particles encountering canopy elements (i.e., potential deposition), and the framework provides a consistent means for including the effects of imperfect deposition through any appropriate sub-model for deposition efficiency. Some aspects of the framework draw upon an analogy to radiation propagation through a turbid medium to develop the model theory. The present method is compared against one of the most commonly used heuristic Lagrangian frameworks, namely that originally developed by Legg and Powell (Agricultural Meteorology, 1979, Vol. 20, 47-67), which is shown to be theoretically inconsistent. A recommendation is made to discontinue the use of this heuristic approach in favour of the theoretically consistent framework developed herein, which is no more difficult to apply under equivalent assumptions. The proposed framework has the additional advantage that it can be applied to arbitrary canopy geometries given readily measurable parameters describing vegetation structure.
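
The turbid-medium analogy mentioned above can be illustrated with a Beer-Lambert-type encounter law: the probability of a particle encountering foliage over a short path grows exponentially with path length and leaf area density. This is a sketch of the general idea only, with hypothetical parameter values, not the paper's specific formulation.

```python
import math

# Beer-Lambert-style encounter probability over a path segment of length ds
# (m) through foliage with leaf area density a (m^2/m^3) and projection
# coefficient G (illustrative values).
def encounter_probability(ds, leaf_area_density, G=0.5):
    return 1.0 - math.exp(-G * leaf_area_density * ds)

# Imperfect deposition: only a fraction E of encounters actually deposit,
# supplied by any appropriate deposition-efficiency sub-model.
def deposition_probability(ds, leaf_area_density, efficiency, G=0.5):
    return efficiency * encounter_probability(ds, leaf_area_density, G)

print(encounter_probability(0.1, 2.0))         # short step, sparse canopy
print(deposition_probability(0.1, 2.0, 0.8))   # with 80% deposition efficiency
```

Separating the encounter probability from the efficiency factor mirrors the framework's split between potential deposition and the pluggable sub-model for deposition efficiency.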

  20. Models and frameworks: a synergistic association for developing component-based applications.

    PubMed

    Alonso, Diego; Sánchez-Ledesma, Francisco; Sánchez, Pedro; Pastor, Juan A; Álvarez, Bárbara

    2014-01-01

    The use of frameworks and components has been shown to be effective in improving software productivity and quality. However, the results in terms of reuse and standardization show a dearth of portability either of designs or of component-based implementations. This paper, which is based on the model driven software development paradigm, presents an approach that separates the description of component-based applications from their possible implementations for different platforms. This separation is supported by automatic integration of the code obtained from the input models into frameworks implemented using object-oriented technology. Thus, the approach combines the benefits of modeling applications from a higher level of abstraction than objects, with the higher levels of code reuse provided by frameworks. In order to illustrate the benefits of the proposed approach, two representative case studies that use both an existing framework and an ad hoc framework, are described. Finally, our approach is compared with other alternatives in terms of the cost of software development.

  1. A Query Expansion Framework in Image Retrieval Domain Based on Local and Global Analysis

    PubMed Central

    Rahman, M. M.; Antani, S. K.; Thoma, G. R.

    2011-01-01

    We present an image retrieval framework based on automatic query expansion in a concept feature space by generalizing the vector space model of information retrieval. In this framework, images are represented by vectors of weighted concepts similar to the keyword-based representation used in text retrieval. To generate the concept vocabularies, a statistical model is built by utilizing Support Vector Machine (SVM)-based classification techniques. The images are represented as “bag of concepts” that comprise perceptually and/or semantically distinguishable color and texture patches from local image regions in a multi-dimensional feature space. To explore the correlation between the concepts and overcome the assumption of feature independence in this model, we propose query expansion techniques in the image domain from a new perspective based on both local and global analysis. For the local analysis, the correlations between the concepts based on the co-occurrence pattern, and the metrical constraints based on the neighborhood proximity between the concepts in encoded images, are analyzed by considering local feedback information. We also analyze the concept similarities in the collection as a whole in the form of a similarity thesaurus and propose an efficient query expansion based on the global analysis. The experimental results on a photographic collection of natural scenes and a biomedical database of different imaging modalities demonstrate the effectiveness of the proposed framework in terms of precision and recall. PMID:21822350
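
The global-analysis idea above, expanding a query with concepts that are similar across the whole collection, can be sketched as follows. This is an assumed minimal version, not the authors' exact method: concepts are compared by cosine similarity, and the top-k most similar unseen concepts are added to the query with a down-weighting factor.

```python
import math

# Cosine similarity between two concept feature vectors.
def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def expand_query(query_weights, concept_vectors, k=2, alpha=0.5):
    """Add the k concepts most similar to any query concept, weighted by alpha."""
    expanded = dict(query_weights)
    scores = {}
    for q in query_weights:
        for c, vec in concept_vectors.items():
            if c not in expanded:
                scores[c] = max(scores.get(c, 0.0), cosine(concept_vectors[q], vec))
    for c, s in sorted(scores.items(), key=lambda x: -x[1])[:k]:
        expanded[c] = alpha * s
    return expanded

# Hypothetical concept vocabulary (feature vectors are illustrative).
concepts = {"sky": [1, 0, 1], "cloud": [1, 0, 0.8], "grass": [0, 1, 0]}
print(expand_query({"sky": 1.0}, concepts, k=1))
```

A real similarity thesaurus would be precomputed from concept co-occurrence over the encoded collection rather than from raw feature vectors, but the expansion step has this shape.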

  2. Incorporating coping into an expectancy framework for explaining drinking behaviour.

    PubMed

    Hasking, Penelope A; Oei, Tian P S

    2008-01-01

    Expectancy Theory has offered much in the way of understanding alcohol use and abuse, and has contributed greatly to prevention and treatment initiatives. However, although many cognitive-behavioural treatment approaches are based on expectancy constructs, such as outcome expectancies and self-efficacy, high relapse rates imply that expectancy theory may be too narrow in scope, and that additional variables need to be examined if a comprehensive understanding of drinking behaviour, and better treatment outcomes, are to be achieved. We suggest that the coping strategies an individual employs represent one such set of variables, one that has largely been neglected within the expectancy framework. Although coping skills training is routinely used in the prevention and treatment of alcohol problems, coping research has suffered from a poor theoretical framework. In this paper we review the existing research relating expectancies, self-efficacy and coping to drinking behaviour and propose a model which explains both social and dependent drinking by incorporating coping into an expectancy theory framework. We also outline the research and clinical implications of the proposed model.

  3. Development of a new integrated local trajectory planning and tracking control framework for autonomous ground vehicles

    NASA Astrophysics Data System (ADS)

    Li, Xiaohui; Sun, Zhenping; Cao, Dongpu; Liu, Daxue; He, Hangen

    2017-03-01

    This study proposes a novel integrated local trajectory planning and tracking control (ILTPTC) framework for autonomous vehicles driving along a reference path with obstacle avoidance. For this ILTPTC framework, an efficient state-space sampling-based trajectory planning scheme is employed to smoothly follow the reference path. A model-based predictive path generation algorithm is applied to produce a set of smooth and kinematically-feasible paths connecting the initial state with the sampling terminal states. A velocity control law is then designed to assign a speed value at each of the points along the generated paths. An objective function considering both safety and comfort performance is carefully formulated for assessing the generated trajectories and selecting the optimal one. For accurately tracking the optimal trajectory while overcoming external disturbances and model uncertainties, a combined feedforward and feedback controller is developed. Both simulation analyses and vehicle testing are performed to verify the effectiveness of the proposed ILTPTC framework, and future research is also briefly discussed.
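
The trajectory-selection step described above (score each candidate by a weighted safety-plus-comfort objective, pick the minimizer) can be sketched as follows. The cost terms, weights, and candidate structure are hypothetical, chosen only to show the pattern.

```python
# Weighted objective: penalize low obstacle clearance (safety) and high
# accumulated curvature (comfort). Weights are illustrative.
def trajectory_cost(traj, w_safety=1.0, w_comfort=0.2):
    clearance = min(traj["obstacle_clearance"])        # metres to nearest obstacle
    safety_penalty = 0.0 if clearance > 2.0 else (2.0 - clearance) ** 2
    comfort_penalty = sum(abs(k) for k in traj["curvature"])
    return w_safety * safety_penalty + w_comfort * comfort_penalty

def select_trajectory(candidates):
    return min(candidates, key=trajectory_cost)

candidates = [
    {"id": "swerve", "obstacle_clearance": [3.0, 2.5], "curvature": [0.3, -0.3]},
    {"id": "tight",  "obstacle_clearance": [0.5, 1.0], "curvature": [0.1, 0.1]},
]
print(select_trajectory(candidates)["id"])
```

The comfortable but unsafe "tight" path is rejected because its safety penalty dominates; tuning the weights trades the two objectives off, which is the role of the carefully formulated objective function in the paper.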

  4. A Proposed Theoretical Model Using the Work of Thomas Kuhn, David Ausubel, and Mauritz Johnson as a Basis for Curriculum and Instruction Decisions in Science Education.

    ERIC Educational Resources Information Center

    Bowen, Barbara Lynn

    This study presents a holistic framework which can be used as a basis for decision-making at various points in the curriculum-instruction development process as described by Johnson in a work published in 1967. The proposed framework has conceptual bases in the work of Thomas S. Kuhn and David P. Ausubel and utilizes the work of several perceptual…

  5. An RFID-Based Manufacturing Control Framework for Loosely Coupled Distributed Manufacturing System Supporting Mass Customization

    NASA Astrophysics Data System (ADS)

    Chen, Ruey-Shun; Tsai, Yung-Shun; Tu, Arthur

    In this study we propose a manufacturing control framework based on radio-frequency identification (RFID) technology and a distributed information system to construct a mass-customization production process in a loosely coupled shop-floor control environment. On the basis of this framework, we developed RFID middleware and an integrated information system for tracking and controlling the manufacturing process flow. A bicycle manufacturer was used to demonstrate the prototype system. The findings of this study were that the proposed framework can improve the visibility and traceability of the manufacturing process as well as enhance process quality control and real-time production pedigree access. Using this framework, an enterprise can easily integrate an RFID-based system into its manufacturing environment to facilitate mass customization and a just-in-time production model.

  6. Supporting capacity sharing in the cloud manufacturing environment based on game theory and fuzzy logic

    NASA Astrophysics Data System (ADS)

    Argoneto, Pierluigi; Renna, Paolo

    2016-02-01

    This paper proposes a Framework for Capacity Sharing in Cloud Manufacturing (FCSCM) to support capacity sharing among independent firms. The success of geographically distributed plants depends strongly on the use of appropriate tools to integrate their resources and demand forecasts in order to achieve a specific production objective. The proposed framework is based on two different tools: a cooperative game algorithm, based on the Gale-Shapley model, and a fuzzy engine. The capacity allocation policy takes into account the utility functions of the involved firms, and it is shown that the proposed policy induces all firms to report their capacity requirements truthfully. A discrete event simulation environment has been developed to test the proposed FCSCM. The numerical results show the drastic reduction in unsatisfied capacity obtained by the cooperation model implemented in this work.
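
The Gale-Shapley model underlying the cooperative game step is the classic deferred-acceptance algorithm. The sketch below matches capacity-requesting firms to capacity-offering plants; the plant names and preference lists are hypothetical, and the real framework layers utility functions and a fuzzy engine on top of this matching core.

```python
# Gale-Shapley deferred acceptance: requesters propose in preference order,
# providers tentatively accept their best proposal so far.
def gale_shapley(requester_prefs, provider_prefs):
    free = list(requester_prefs)
    next_choice = {r: 0 for r in requester_prefs}
    engaged = {}                                   # provider -> requester
    rank = {p: {r: i for i, r in enumerate(prefs)}
            for p, prefs in provider_prefs.items()}
    while free:
        r = free.pop(0)
        p = requester_prefs[r][next_choice[r]]     # next provider to try
        next_choice[r] += 1
        if p not in engaged:
            engaged[p] = r
        elif rank[p][r] < rank[p][engaged[p]]:     # provider prefers newcomer
            free.append(engaged[p])
            engaged[p] = r
        else:
            free.append(r)                         # rejected, tries next choice
    return {r: p for p, r in engaged.items()}

requesters = {"R1": ["P1", "P2"], "R2": ["P1", "P2"]}
providers  = {"P1": ["R2", "R1"], "P2": ["R1", "R2"]}
print(gale_shapley(requesters, providers))
```

The resulting matching is stable: no requester-provider pair would both prefer each other over their assigned partners, which is the property that makes truthful reporting attractive in this class of mechanisms.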

  7. Applicability Analysis of Validation Evidence for Biomedical Computational Models

    DOE PAGES

    Pathmanathan, Pras; Gray, Richard A.; Romero, Vicente J.; ...

    2017-09-07

    Computational modeling has the potential to revolutionize medicine the way it transformed engineering. However, despite decades of work, there has only been limited progress to successfully translate modeling research to patient care. One major difficulty which often occurs with biomedical computational models is an inability to perform validation in a setting that closely resembles how the model will be used. For example, for a biomedical model that makes in vivo clinically relevant predictions, direct validation of predictions may be impossible for ethical, technological, or financial reasons. Unavoidable limitations inherent to the validation process lead to challenges in evaluating the credibility of biomedical model predictions. Therefore, when evaluating biomedical models, it is critical to rigorously assess applicability, that is, the relevance of the computational model, and its validation evidence to the proposed context of use (COU). However, there are no well-established methods for assessing applicability. In this paper, we present a novel framework for performing applicability analysis and demonstrate its use with a medical device computational model. The framework provides a systematic, step-by-step method for breaking down the broad question of applicability into a series of focused questions, which may be addressed using supporting evidence and subject matter expertise. The framework can be used for model justification, model assessment, and validation planning. While motivated by biomedical models, it is relevant to a broad range of disciplines and underlying physics. Finally, the proposed applicability framework could help overcome some of the barriers inherent to validation of, and aid clinical implementation of, biomedical models.

  8. Applicability Analysis of Validation Evidence for Biomedical Computational Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pathmanathan, Pras; Gray, Richard A.; Romero, Vicente J.

    Computational modeling has the potential to revolutionize medicine the way it transformed engineering. However, despite decades of work, there has only been limited progress to successfully translate modeling research to patient care. One major difficulty which often occurs with biomedical computational models is an inability to perform validation in a setting that closely resembles how the model will be used. For example, for a biomedical model that makes in vivo clinically relevant predictions, direct validation of predictions may be impossible for ethical, technological, or financial reasons. Unavoidable limitations inherent to the validation process lead to challenges in evaluating the credibility of biomedical model predictions. Therefore, when evaluating biomedical models, it is critical to rigorously assess applicability, that is, the relevance of the computational model, and its validation evidence to the proposed context of use (COU). However, there are no well-established methods for assessing applicability. In this paper, we present a novel framework for performing applicability analysis and demonstrate its use with a medical device computational model. The framework provides a systematic, step-by-step method for breaking down the broad question of applicability into a series of focused questions, which may be addressed using supporting evidence and subject matter expertise. The framework can be used for model justification, model assessment, and validation planning. While motivated by biomedical models, it is relevant to a broad range of disciplines and underlying physics. Finally, the proposed applicability framework could help overcome some of the barriers inherent to validation of, and aid clinical implementation of, biomedical models.

  9. Quantified Choice of Root-Mean-Square Errors of Approximation for Evaluation and Power Analysis of Small Differences between Structural Equation Models

    ERIC Educational Resources Information Center

    Li, Libo; Bentler, Peter M.

    2011-01-01

    MacCallum, Browne, and Cai (2006) proposed a new framework for evaluation and power analysis of small differences between nested structural equation models (SEMs). In their framework, the null and alternative hypotheses for testing a small difference in fit and its related power analyses were defined by some chosen root-mean-square error of…
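
In the RMSEA-based framework referenced here, a chosen RMSEA value implies a noncentrality parameter for the chi-square difference statistic. The sketch below uses the standard relation from that literature, lambda = (N - 1) * df * epsilon^2 (this formula is assumed from the general RMSEA framework, not reproduced from the paper), to compute the noncentrality parameters under illustrative null and alternative RMSEA choices.

```python
# Noncentrality implied by an RMSEA value e, degrees of freedom df, and
# sample size N: lambda = (N - 1) * df * e^2.
def noncentrality(rmsea, df, n):
    return (n - 1) * df * rmsea ** 2

df, n = 20, 400                          # illustrative test with df = 20, N = 400
lam_null = noncentrality(0.05, df, n)    # "small difference" null hypothesis
lam_alt  = noncentrality(0.08, df, n)    # alternative hypothesis
print(lam_null, lam_alt)
```

A power analysis would then compare the noncentral chi-square distributions with these two noncentrality parameters at the chosen critical value; that final step needs a noncentral chi-square CDF (e.g., from a statistics library) and is omitted here.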

  10. Can model-free reinforcement learning explain deontological moral judgments?

    PubMed

    Ayars, Alisabeth

    2016-05-01

    Dual-systems frameworks propose that moral judgments are derived from both an immediate emotional response, and controlled/rational cognition. Recently Cushman (2013) proposed a new dual-system theory based on model-free and model-based reinforcement learning. Model-free learning attaches values to actions based on their history of reward and punishment, and explains some deontological, non-utilitarian judgments. Model-based learning involves the construction of a causal model of the world and allows for far-sighted planning; this form of learning fits well with utilitarian considerations that seek to maximize certain kinds of outcomes. I present three concerns regarding the use of model-free reinforcement learning to explain deontological moral judgment. First, many actions that humans find aversive from model-free learning are not judged to be morally wrong. Moral judgment must require something in addition to model-free learning. Second, there is a dearth of evidence for central predictions of the reinforcement account, e.g., that people with different reinforcement histories will, all else being equal, make different moral judgments. Finally, to account for the effect of intention within the framework requires certain assumptions which lack support. These challenges are reasonable foci for future empirical/theoretical work on the model-free/model-based framework. Copyright © 2016 Elsevier B.V. All rights reserved.
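
The model-free learner at the center of this debate can be illustrated with a minimal temporal-difference value update: an action's cached value is driven only by its history of reward and punishment, with no causal model of outcomes. Actions, rewards, and the learning rate below are hypothetical.

```python
import random

# TD(0)-style model-free learning: q[a] tracks the running reward history of
# action a, with no representation of why the reward occurs.
def train(action_rewards, episodes=1000, alpha=0.1, seed=0):
    rng = random.Random(seed)
    q = {a: 0.0 for a in action_rewards}
    for _ in range(episodes):
        a = rng.choice(list(action_rewards))   # sample actions uniformly
        r = action_rewards[a]
        q[a] += alpha * (r - q[a])             # TD(0) update toward the reward
    return q

# An action repeatedly punished acquires a negative cached value, which the
# reinforcement account maps onto an intuitive "wrongness" judgment.
q = train({"harm": -1.0, "help": +1.0})
print(q)
```

The article's first objection is visible even in this toy: the learner assigns negative value to anything punished, whether or not people would judge it morally wrong, so model-free value alone cannot be the whole story.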

  11. Generalized Aggregation and Coordination of Residential Loads in a Smart Community

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hao, He; Somani, Abhishek; Lian, Jianming

    2015-11-02

    Flexibility from residential loads presents an enormous potential to provide various services to the smart grid. In this paper, we propose a unified hierarchical framework for aggregation and coordination of various residential loads in a smart community, such as Thermostatically Controlled Loads (TCLs), Distributed Energy Storages (DESs), residential Pool Pumps (PPs), and Electric Vehicles (EVs). A central idea of this framework is a virtual battery model, which provides a simple and intuitive tool to aggregate the flexibility of distributed loads. Moreover, a multi-stage Nash-bargaining-based coordination strategy is proposed to coordinate different aggregations of residential loads for demand response. Case studies are provided to demonstrate the efficacy of our proposed framework and coordination strategy in managing peak power demand in a smart residential community.
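
The virtual battery abstraction can be sketched very simply: each flexible load contributes an energy capacity and charge/discharge power limits, and the aggregation is itself a "battery" whose parameters are the element-wise sums. The load classes and parameter values below are hypothetical, and the real model also tracks state-of-charge dynamics and dissipation, which this sketch omits.

```python
# Aggregate the flexibility of many loads into one virtual battery by
# summing capacities and power limits.
def aggregate(loads):
    return {
        "capacity_kwh":     sum(l["capacity_kwh"] for l in loads),
        "max_charge_kw":    sum(l["max_charge_kw"] for l in loads),
        "max_discharge_kw": sum(l["max_discharge_kw"] for l in loads),
    }

tcls = [{"capacity_kwh": 2.0, "max_charge_kw": 4.0, "max_discharge_kw": 4.0}
        for _ in range(100)]                    # e.g., air conditioners
evs  = [{"capacity_kwh": 24.0, "max_charge_kw": 6.6, "max_discharge_kw": 0.0}
        for _ in range(20)]                     # charge-only electric vehicles
print(aggregate(tcls + evs))
```

A grid operator can then treat the whole community as one dispatchable battery, which is what makes the abstraction useful for demand response coordination.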

  12. Web of Objects Based Ambient Assisted Living Framework for Emergency Psychiatric State Prediction

    PubMed Central

    Alam, Md Golam Rabiul; Abedin, Sarder Fakhrul; Al Ameen, Moshaddique; Hong, Choong Seon

    2016-01-01

    Ambient assisted living can facilitate optimum health and wellness by aiding physical, mental and social well-being. In this paper, patients’ psychiatric symptoms are collected through lightweight biosensors and web-based psychiatric screening scales in a smart home environment and then analyzed through machine learning algorithms to provide ambient intelligence in a psychiatric emergency. The psychiatric states are modeled through a Hidden Markov Model (HMM), and the model parameters are estimated using a Viterbi path counting and scalable Stochastic Variational Inference (SVI)-based training algorithm. The most likely psychiatric state sequence of the corresponding observation sequence is determined, and an emergency psychiatric state is predicted through the proposed algorithm. Moreover, to enable personalized psychiatric emergency care, a web of objects-based service framework is proposed for a smart-home environment. In this framework, the biosensor observations and the psychiatric rating scales are objectified and virtualized in the web space. Then, the web of objects of sensor observations and psychiatric rating scores are used to assess the dweller’s mental health status and to predict an emergency psychiatric state. The proposed psychiatric state prediction algorithm reported 83.03 percent prediction accuracy in an empirical performance study. PMID:27608023

  13. Multiple hypothesis tracking for cluttered biological image sequences.

    PubMed

    Chenouard, Nicolas; Bloch, Isabelle; Olivo-Marin, Jean-Christophe

    2013-11-01

    In this paper, we present a method for simultaneously tracking thousands of targets in biological image sequences, which is of major importance in modern biology. The complexity and inherent randomness of the problem lead us to propose a unified probabilistic framework for tracking biological particles in microscope images. The framework includes realistic models of particle motion and existence and of fluorescence image features. For the track extraction process per se, the very cluttered conditions motivate the adoption of a multiframe approach that enforces tracking decision robustness to poor imaging conditions and to random target movements. We tackle the large-scale nature of the problem by adapting the multiple hypothesis tracking algorithm to the proposed framework, resulting in a method with a favorable tradeoff between the model complexity and the computational cost of the tracking procedure. When compared to the state-of-the-art tracking techniques for bioimaging, the proposed algorithm is shown to be the only method providing high-quality results despite the critically poor imaging conditions and the dense target presence. We thus demonstrate the benefits of advanced Bayesian tracking techniques for the accurate computational modeling of dynamical biological processes, which is promising for further developments in this domain.

  14. The application of data mining and cloud computing techniques in data-driven models for structural health monitoring

    NASA Astrophysics Data System (ADS)

    Khazaeli, S.; Ravandi, A. G.; Banerji, S.; Bagchi, A.

    2016-04-01

    Recently, data-driven models for Structural Health Monitoring (SHM) have been of great interest among many researchers. In data-driven models, the sensed data are processed to determine the structural performance and evaluate the damages of an instrumented structure without necessitating the mathematical modeling of the structure. A framework of data-driven models for online assessment of the condition of a structure has been developed here. The developed framework is intended for automated evaluation of the monitoring data and structural performance by the Internet technology and resources. The main challenges in developing such framework include: (a) utilizing the sensor measurements to estimate and localize the induced damage in a structure by means of signal processing and data mining techniques, and (b) optimizing the computing and storage resources with the aid of cloud services. The main focus in this paper is to demonstrate the efficiency of the proposed framework for real-time damage detection of a multi-story shear-building structure in two damage scenarios (change in mass and stiffness) in various locations. Several features are extracted from the sensed data by signal processing techniques and statistical methods. Machine learning algorithms are deployed to select damage-sensitive features as well as classifying the data to trace the anomaly in the response of the structure. Here, the cloud computing resources from Amazon Web Services (AWS) have been used to implement the proposed framework.
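
The feature-extraction-plus-classification pipeline described above can be sketched minimally: compute a few statistical features from each sensed window, then classify against per-condition centroids learned from labelled data. This is a hypothetical stand-in for the paper's signal-processing and machine-learning stages; the signals and the nearest-centroid classifier are illustrative only.

```python
import math
import statistics

# Damage-sensitive features from one window of sensor data: mean, standard
# deviation, and peak absolute amplitude.
def features(signal):
    mu = statistics.mean(signal)
    sd = statistics.pstdev(signal)
    peak = max(abs(x) for x in signal)
    return (mu, sd, peak)

# Nearest-centroid classification in feature space.
def nearest_centroid(x, centroids):
    return min(centroids, key=lambda label: math.dist(x, centroids[label]))

# Centroids learned from labelled baseline and damaged responses (toy data).
centroids = {
    "healthy": features([0.0, 0.1, -0.1, 0.05, -0.05]),
    "damaged": features([0.0, 0.9, -0.8, 0.7, -0.9]),
}
window = [0.0, 0.8, -0.7, 0.9, -0.8]
print(nearest_centroid(features(window), centroids))
```

In the paper this logic runs on cloud resources over streaming monitoring data, with feature selection choosing which statistics are actually damage-sensitive for the structure at hand.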

  15. Advanced Information Technology in Simulation Based Life Cycle Design

    NASA Technical Reports Server (NTRS)

    Renaud, John E.

    2003-01-01

    In this research a Collaborative Optimization (CO) approach for multidisciplinary systems design is used to develop a decision based design framework for non-deterministic optimization. To date CO strategies have been developed for use in application to deterministic systems design problems. In this research the decision based design (DBD) framework proposed by Hazelrigg is modified for use in a collaborative optimization framework. The Hazelrigg framework as originally proposed provides a single level optimization strategy that combines engineering decisions with business decisions in a single level optimization. By transforming this framework for use in collaborative optimization one can decompose the business and engineering decision making processes. In the new multilevel framework of Decision Based Collaborative Optimization (DBCO) the business decisions are made at the system level. These business decisions result in a set of engineering performance targets that disciplinary engineering design teams seek to satisfy as part of subspace optimizations. The Decision Based Collaborative Optimization framework more accurately models the existing relationship between business and engineering in multidisciplinary systems design.

  16. Model-Based Policymaking: A Framework to Promote Ethical "Good Practice" in Mathematical Modeling for Public Health Policymaking.

    PubMed

    Boden, Lisa A; McKendrick, Iain J

    2017-01-01

    Mathematical models are increasingly relied upon as decision support tools, which estimate risks and generate recommendations to underpin public health policies. However, there are no formal agreements about what constitutes professional competencies or duties in mathematical modeling for public health. In this article, we propose a framework to evaluate whether mathematical models that assess human and animal disease risks and control strategies meet standards consistent with ethical "good practice" and are thus "fit for purpose" as evidence in support of policy. This framework is derived from principles of biomedical ethics: independence, transparency (autonomy), beneficence/non-maleficence, and justice. We identify ethical risks associated with model development and implementation and consider the extent to which scientists are accountable for the translation and communication of model results to policymakers so that the strengths and weaknesses of the scientific evidence base and any socioeconomic and ethical impacts of biased or uncertain predictions are clearly understood. We propose principles to operationalize a framework for ethically sound model development and risk communication between scientists and policymakers. These include the creation of science-policy partnerships to mutually define policy questions and communicate results; development of harmonized international standards for model development; and data stewardship and improvement of the traceability and transparency of models via a searchable archive of policy-relevant models. Finally, we suggest that bespoke ethical advisory groups, with relevant expertise and access to these resources, would be beneficial as a bridge between science and policy, advising modelers of potential ethical risks and providing overview of the translation of modeling advice into policy.

  17. A multimodal 3D framework for fire characteristics estimation

    NASA Astrophysics Data System (ADS)

    Toulouse, T.; Rossi, L.; Akhloufi, M. A.; Pieri, A.; Maldague, X.

    2018-02-01

    In the last decade we have witnessed an increasing interest in using computer vision and image processing in forest fire research. Image processing techniques have been successfully used in different fire analysis areas such as early detection, monitoring, modeling and fire front characteristics estimation. While the majority of the work deals with the use of 2D visible spectrum images, recent work has introduced the use of 3D vision in this field. This work proposes a new multimodal vision framework permitting the extraction of the three-dimensional geometrical characteristics of fires captured by multiple 3D vision systems. The 3D system is a multispectral stereo system operating in both the visible and near-infrared (NIR) spectral bands. The framework supports the use of multiple stereo pairs positioned so as to capture complementary views of the fire front during its propagation. Multimodal registration is conducted using the captured views in order to build a complete 3D model of the fire front. The registration process is achieved using multisensory fusion based on visual data (2D and NIR images), GPS positions and IMU inertial data. Experiments were conducted outdoors in order to show the performance of the proposed framework. The obtained results are promising and show the potential of using the proposed framework in operational scenarios for wildland fire research and as a decision management system in fire fighting.

  18. A Framework for Developing the Structure of Public Health Economic Models.

    PubMed

    Squires, Hazel; Chilcott, James; Akehurst, Ronald; Burr, Jennifer; Kelly, Michael P

    2016-01-01

    A conceptual modeling framework is a methodology that assists modelers through the process of developing a model structure. Public health interventions tend to operate in dynamically complex systems. Modeling public health interventions requires broader considerations than clinical ones. Inappropriately simple models may lead to poor validity and credibility, resulting in suboptimal allocation of resources. This article presents the first conceptual modeling framework for public health economic evaluation. The framework presented here was informed by literature reviews of the key challenges in public health economic modeling and existing conceptual modeling frameworks; qualitative research to understand the experiences of modelers when developing public health economic models; and piloting a draft version of the framework. The conceptual modeling framework comprises four key principles of good practice and a proposed methodology. The key principles are that 1) a systems approach to modeling should be taken; 2) a documented understanding of the problem is imperative before and alongside developing and justifying the model structure; 3) strong communication with stakeholders and members of the team throughout model development is essential; and 4) a systematic consideration of the determinants of health is central to identifying the key impacts of public health interventions. The methodology consists of four phases: phase A, aligning the framework with the decision-making process; phase B, identifying relevant stakeholders; phase C, understanding the problem; and phase D, developing and justifying the model structure. Key areas for further research involve evaluation of the framework in diverse case studies and the development of methods for modeling individual and social behavior. This approach could improve the quality of Public Health economic models, supporting efficient allocation of scarce resources. 
Copyright © 2016 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  19. Reservoir Modeling by Data Integration via Intermediate Spaces and Artificial Intelligence Tools in MPS Simulation Frameworks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ahmadi, Rouhollah, E-mail: rouhollahahmadi@yahoo.com; Khamehchi, Ehsan

    Conditioning stochastic simulations is very important in many geostatistical applications that call for the introduction of nonlinear and multiple-point data in reservoir modeling. Here, a new methodology is proposed for the incorporation of different data types into multiple-point statistics (MPS) simulation frameworks. Unlike previous techniques that call for an approximate forward model (filter) to integrate secondary data into geologically constructed models, the proposed approach develops an intermediate space onto which all the primary and secondary data are easily mapped. Definition of the intermediate space, as may be achieved via application of artificial intelligence tools like neural networks and fuzzy inference systems, eliminates the need for using filters as in previous techniques. The applicability of the proposed approach in conditioning MPS simulations to static and geologic data is verified by modeling a real example of discrete fracture networks using conventional well-log data. The training patterns are well reproduced in the realizations, while the model is also consistent with the map of secondary data.

  20. Non-Gaussian spatiotemporal simulation of multisite daily precipitation: downscaling framework

    NASA Astrophysics Data System (ADS)

    Ben Alaya, M. A.; Ouarda, T. B. M. J.; Chebana, F.

    2018-01-01

    Probabilistic regression approaches for downscaling daily precipitation are very useful. They provide the whole conditional distribution at each forecast step to better represent the temporal variability. The question addressed in this paper is: how to simulate the spatiotemporal characteristics of multisite daily precipitation from probabilistic regression models? Recent publications point out the complexity of the multisite properties of daily precipitation and highlight the need for a flexible non-Gaussian tool. This work proposes a reasonable compromise between simplicity and flexibility that avoids model misspecification. A suitable nonparametric bootstrapping (NB) technique is adopted. A downscaling model which merges a vector generalized linear model (VGLM, as a probabilistic regression tool) and the proposed bootstrapping technique is introduced to simulate realistic multisite precipitation series. The model is applied to data sets from the southern part of the province of Quebec, Canada. It is shown that the model is capable of reproducing both the at-site properties and the spatial structure of daily precipitation. Results indicate the superiority of the proposed NB technique over a multivariate autoregressive Gaussian framework (i.e., a Gaussian copula).
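
The nonparametric bootstrap idea can be illustrated with its simplest form: whole days are resampled jointly across all sites, so the spatial dependence between stations on any given day is preserved exactly while no parametric (e.g., Gaussian) dependence model is assumed. This is a generic sketch of day-block bootstrapping, with hypothetical station data, not the paper's full VGLM-driven scheme.

```python
import random

# Resample whole multisite days with replacement: each drawn element keeps
# all stations' values for that day together, preserving spatial structure.
def bootstrap_days(multisite_series, n_days, seed=0):
    """multisite_series: list of days, each a dict {station: precipitation (mm)}."""
    rng = random.Random(seed)
    return [rng.choice(multisite_series) for _ in range(n_days)]

observed = [
    {"A": 0.0, "B": 0.0},    # dry everywhere
    {"A": 5.2, "B": 4.8},    # widespread rain
    {"A": 1.1, "B": 0.0},    # localised shower
]
sim = bootstrap_days(observed, n_days=5)
print(sim)
```

In the paper's model the resampling is conditioned on the VGLM's predicted distributions rather than drawn uniformly, which is what lets the simulated series track the large-scale predictors while keeping the observed spatial dependence.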

  1. A Learning Framework for Control-Oriented Modeling of Buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rubio-Herrero, Javier; Chandan, Vikas; Siegel, Charles M.

    Buildings consume a significant amount of energy worldwide. Several building optimization and control use cases require models of energy consumption which are control oriented, have high predictive capability, impose minimal data pre-processing requirements, and can be adapted continuously to account for changing conditions as new data becomes available. Data-driven modeling techniques investigated so far, while promising in the context of buildings, have been unable to simultaneously satisfy all the requirements mentioned above. In this context, deep learning techniques such as Recurrent Neural Networks (RNNs) hold promise, empowered by advanced computational capabilities and big data opportunities. In this paper, we propose a deep learning based methodology for the development of control oriented models for building energy management and test it on data from a real building. Results show that the proposed methodology significantly outperforms other data-driven modeling techniques. We perform a detailed analysis of the proposed methodology along dimensions such as topology, sensitivity, and downsampling. Lastly, we conclude by envisioning a building analytics suite empowered by the proposed deep framework, which can drive several use cases related to building energy management.
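The control-oriented appeal of recurrent models is that a hidden state can carry building "memory" (e.g. thermal inertia) across time steps. A minimal single-unit recurrent cell illustrates the mechanics; the weights and inputs below are illustrative stand-ins, not a trained building model:

```python
import math

def rnn_predict(inputs, w_in, w_rec, w_out, b=0.0):
    """Minimal vanilla RNN: the hidden state h summarizes past inputs,
    so the prediction at each step depends on the whole input history."""
    h = 0.0
    preds = []
    for x in inputs:
        h = math.tanh(w_in * x + w_rec * h + b)
        preds.append(w_out * h)
    return preds
```

Even after the input drops to zero, the recurrent weight lets the state (and hence the prediction) decay gradually rather than instantly, which is the behavior a control-oriented energy model needs.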

  2. Semantic-based surveillance video retrieval.

    PubMed

    Hu, Weiming; Xie, Dan; Fu, Zhouyu; Zeng, Wenrong; Maybank, Steve

    2007-04-01

    Visual surveillance produces large amounts of video data. Effective indexing and retrieval from surveillance video databases are very important. Although there are many ways to represent the content of video clips in current video retrieval algorithms, there still exists a semantic gap between users and retrieval systems. Visual surveillance systems supply a platform for investigating semantic-based video retrieval. In this paper, a semantic-based video retrieval framework for visual surveillance is proposed. A cluster-based tracking algorithm is developed to acquire motion trajectories. The trajectories are then clustered hierarchically using the spatial and temporal information, to learn activity models. A hierarchical structure of semantic indexing and retrieval of object activities, where each individual activity automatically inherits all the semantic descriptions of the activity model to which it belongs, is proposed for accessing video clips and individual objects at the semantic level. The proposed retrieval framework supports various queries including queries by keywords, multiple object queries, and queries by sketch. For multiple object queries, succession and simultaneity restrictions, together with depth and breadth first orders, are considered. For sketch-based queries, a method for matching trajectories drawn by users to spatial trajectories is proposed. The effectiveness and efficiency of our framework are tested in a crowded traffic scene.
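The trajectory-clustering stage can be illustrated with a greedy agglomerative scheme over a mean pointwise distance. The distance measure, single linkage, and threshold here are illustrative assumptions, not the paper's exact cluster-based algorithm:

```python
def traj_dist(a, b):
    """Mean pointwise Euclidean distance between equal-length 2-D trajectories."""
    return sum(((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5
               for (x1, y1), (x2, y2) in zip(a, b)) / len(a)

def agglomerate(trajs, threshold):
    """Greedy agglomerative clustering: repeatedly merge the two closest
    clusters (single linkage) until no pair is closer than threshold."""
    clusters = [[i] for i in range(len(trajs))]

    def linkage(c1, c2):
        return min(traj_dist(trajs[i], trajs[j]) for i in c1 for j in c2)

    while len(clusters) > 1:
        d, i, j = min((linkage(clusters[i], clusters[j]), i, j)
                      for i in range(len(clusters))
                      for j in range(i + 1, len(clusters)))
        if d > threshold:
            break
        clusters[i] += clusters.pop(j)
    return clusters
```

Each resulting cluster would then be a candidate activity model to which semantic labels can be attached.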

  3. Localizing text in scene images by boundary clustering, stroke segmentation, and string fragment classification.

    PubMed

    Yi, Chucai; Tian, Yingli

    2012-09-01

    In this paper, we propose a novel framework to extract text regions from scene images with complex backgrounds and multiple text appearances. This framework consists of three main steps: boundary clustering (BC), stroke segmentation, and string fragment classification. In BC, we propose a new bigram-color-uniformity-based method to model both text and attachment surface, and cluster edge pixels based on color pairs and spatial positions into boundary layers. Then, stroke segmentation is performed at each boundary layer by color assignment to extract character candidates. We propose two algorithms to combine the structural analysis of text stroke with color assignment and filter out background interferences. Further, we design a robust string fragment classification based on Gabor-based text features. The features are obtained from feature maps of gradient, stroke distribution, and stroke width. The proposed framework of text localization is evaluated on scene images, born-digital images, broadcast video images, and images of handheld objects captured by blind persons. Experimental results on respective datasets demonstrate that the framework outperforms state-of-the-art localization algorithms.

  4. Shape Distributions of Nonlinear Dynamical Systems for Video-Based Inference.

    PubMed

    Venkataraman, Vinay; Turaga, Pavan

    2016-12-01

    This paper presents a shape-theoretic framework for dynamical analysis of nonlinear dynamical systems which appear frequently in several video-based inference tasks. Traditional approaches to dynamical modeling have included linear and nonlinear methods with their respective drawbacks. A novel approach we propose is the use of descriptors of the shape of the dynamical attractor as a feature representation of the nature of the dynamics. The proposed framework has two main advantages over traditional approaches: a) the representation of the dynamical system is derived directly from the observational data, without any inherent assumptions, and b) the proposed features show stability under different time-series lengths where traditional dynamical invariants fail. We illustrate our idea using nonlinear dynamical models such as Lorenz and Rossler systems, where our feature representations (shape distribution) support our hypothesis that the local shape of the reconstructed phase space can be used as a discriminative feature. Our experimental analyses on these models also indicate that the proposed framework shows stability for different time-series lengths, which is useful when the available number of samples is small or variable. The specific applications of interest in this paper are: 1) activity recognition using motion capture and RGBD sensors, 2) activity quality assessment for applications in stroke rehabilitation, and 3) dynamical scene classification. We provide experimental validation through action and gesture recognition experiments on motion capture and Kinect datasets. In all these scenarios, we show experimental evidence of the favorable properties of the proposed representation.
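The two building blocks, phase-space reconstruction and a shape descriptor of the reconstructed points, can be sketched generically: a Takens delay embedding turns a scalar series into phase-space points, and a D2-style histogram of sampled pairwise distances acts as a simple shape distribution. This is an illustration of the general technique, not the authors' exact feature:

```python
import math
import random

def delay_embed(series, dim, tau):
    """Takens delay embedding: map a scalar series to dim-dimensional
    phase-space points [x(t), x(t+tau), ..., x(t+(dim-1)*tau)]."""
    n = len(series) - (dim - 1) * tau
    return [tuple(series[t + k * tau] for k in range(dim)) for t in range(n)]

def shape_distribution(points, bins=4, samples=200, seed=0):
    """D2-style shape descriptor: normalised histogram of distances
    between randomly sampled pairs of reconstructed phase-space points."""
    rng = random.Random(seed)
    dists = [math.dist(rng.choice(points), rng.choice(points))
             for _ in range(samples)]
    top = max(dists) or 1.0
    hist = [0] * bins
    for d in dists:
        hist[min(int(d / top * bins), bins - 1)] += 1
    return [h / samples for h in hist]
```

Because the descriptor is a normalised histogram, two reconstructions of different lengths remain directly comparable, which is the stability property the abstract emphasises.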

  5. Action Understanding as Inverse Planning

    ERIC Educational Resources Information Center

    Baker, Chris L.; Saxe, Rebecca; Tenenbaum, Joshua B.

    2009-01-01

    Humans are adept at inferring the mental states underlying other agents' actions, such as goals, beliefs, desires, emotions and other thoughts. We propose a computational framework based on Bayesian inverse planning for modeling human action understanding. The framework represents an intuitive theory of intentional agents' behavior based on the…
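The core of inverse planning is a single Bayesian inversion: a forward model gives the probability of actions given a goal, and Bayes' rule recovers a posterior over goals given an observed action. The sketch below is a one-step toy with made-up goals and costs, not the paper's full planning model:

```python
import math

def action_likelihood(costs, beta=1.0):
    """Soft rationality: actions with lower cost toward a goal are
    exponentially more probable, P(a|g) proportional to exp(-beta * cost)."""
    w = {a: math.exp(-beta * c) for a, c in costs.items()}
    z = sum(w.values())
    return {a: v / z for a, v in w.items()}

def infer_goal(priors, likelihoods, observed_action):
    """Bayesian inverse planning in one step:
    P(goal | action) proportional to P(action | goal) * P(goal)."""
    post = {g: priors[g] * likelihoods[g][observed_action] for g in priors}
    z = sum(post.values())
    return {g: p / z for g, p in post.items()}
```

Observing an action that is cheap under goal A but costly under goal B shifts the posterior toward A, mirroring how people attribute intentions from behavior.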

  6. Facilitative Leadership: A Framework for the Creative Arts Therapies

    ERIC Educational Resources Information Center

    Kaimal, Girija; Metzl, Einat; Millrod, Eri

    2017-01-01

    We propose a leadership framework for the creative art therapies (CATs) as a means to affect the sociopolitical contexts of our clinical and scholarly practices. The new model of facilitative leadership includes 3 aspects: developing the self, developing others, and envisioning a creative and just future.

  7. Clustering of financial time series

    NASA Astrophysics Data System (ADS)

    D'Urso, Pierpaolo; Cappelli, Carmela; Di Lallo, Dario; Massari, Riccardo

    2013-05-01

    This paper addresses the topic of classifying financial time series in a fuzzy framework, proposing two fuzzy clustering models both based on GARCH models. In general, clustering of financial time series, due to their peculiar features, requires the definition of suitable distance measures. To this end, the first fuzzy clustering model exploits the autoregressive representation of GARCH models and employs, in the framework of a partitioning around medoids algorithm, the classical autoregressive metric. The second fuzzy clustering model, also based on the partitioning around medoids algorithm, uses the Caiado distance, a Mahalanobis-like distance based on estimated GARCH parameters and covariances that takes into account the information about the volatility structure of the time series. In order to illustrate the merits of the proposed fuzzy approaches, an application to the problem of classifying 29 time series of Euro exchange rates against international currencies is presented and discussed, also comparing the fuzzy models with their crisp version.
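The autoregressive-metric idea can be sketched in miniature: represent each series by fitted AR coefficients, measure distance in coefficient space, and assign fuzzy memberships to medoids. Here an AR(1) coefficient stands in for the full autoregressive representation, and the membership formula is the standard fuzzy-c-medoids update; both are illustrative simplifications of the paper's GARCH-based models:

```python
def ar1_coef(x):
    """Least-squares AR(1) coefficient, a one-parameter stand-in for the
    autoregressive representation underlying the AR metric."""
    num = sum(x[t] * x[t - 1] for t in range(1, len(x)))
    den = sum(x[t - 1] ** 2 for t in range(1, len(x)))
    return num / den

def fuzzy_memberships(series, medoids, m=2.0):
    """Fuzzy memberships of each series in each medoid cluster, driven by
    distances between fitted AR(1) coefficients (fuzzifier m > 1)."""
    coefs = [ar1_coef(s) for s in series]
    meds = [ar1_coef(s) for s in medoids]
    out = []
    for c in coefs:
        d = [abs(c - mu) + 1e-12 for mu in meds]  # avoid division by zero
        u = [1.0 / sum((d[i] / d[j]) ** (2 / (m - 1)) for j in range(len(d)))
             for i in range(len(d))]
        out.append(u)
    return out
```

Unlike crisp assignment, each series gets a degree of membership in every cluster, which is what the fuzzy framework buys for ambiguous volatility structures.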

  8. Dynamic Infinite Mixed-Membership Stochastic Blockmodel.

    PubMed

    Fan, Xuhui; Cao, Longbing; Xu, Richard Yi Da

    2015-09-01

    Directional and pairwise measurements are often used to model interactions in a social network setting. The mixed-membership stochastic blockmodel (MMSB) was a seminal work in this area, and its capabilities have been extended in subsequent work. However, models such as MMSB face particular challenges in modeling dynamic networks, for example, when the number of communities is unknown. Accordingly, this paper proposes a dynamic infinite mixed-membership stochastic blockmodel, a generalized framework that extends the existing work to potentially infinite communities inside a network in dynamic settings (i.e., networks observed over time). Additional model parameters are introduced to reflect the degree of persistence among one's memberships at consecutive time stamps. Under this framework, two specific models, namely mixture time variant and mixture time invariant models, are proposed to depict two different time correlation structures. Two effective posterior sampling strategies and their results are presented, respectively, using synthetic and real-world data.

  9. Combining Fog Computing with Sensor Mote Machine Learning for Industrial IoT.

    PubMed

    Lavassani, Mehrzad; Forsström, Stefan; Jennehag, Ulf; Zhang, Tingting

    2018-05-12

    Digitalization is a global trend becoming ever more important to our connected and sustainable society. This trend also affects industry where the Industrial Internet of Things is an important part, and there is a need to conserve spectrum as well as energy when communicating data to a fog or cloud back-end system. In this paper we investigate the benefits of fog computing by proposing a novel distributed learning model on the sensor device and simulating the data stream in the fog, instead of transmitting all raw sensor values to the cloud back-end. To save energy and to communicate as few packets as possible, the updated parameters of the learned model at the sensor device are communicated in longer time intervals to a fog computing system. The proposed framework is implemented and tested in a real world testbed in order to make quantitative measurements and evaluate the system. Our results show that the proposed model can achieve a 98% decrease in the number of packets sent over the wireless link, and the fog node can still simulate the data stream with an acceptable accuracy of 97%. We also observe an end-to-end delay of 180 ms in our proposed three-layer framework. Hence, the framework shows that a combination of fog and cloud computing with a distributed data modeling at the sensor device for wireless sensor networks can be beneficial for Industrial Internet of Things applications.
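The split between sensor-side learning and fog-side stream simulation can be sketched with a deliberately simple stand-in model: the sensor maintains an incremental estimate of the stream's mean and variance (Welford's online algorithm) and only those two parameters cross the wireless link; the fog node regenerates a plausible stream from them. The paper's actual learned model is more elaborate; this only illustrates the communication pattern:

```python
import random

class SensorModel:
    """Sensor-side incremental model: only (mean, variance) parameters are
    transmitted, not the raw samples, saving packets and energy."""
    def __init__(self):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0

    def update(self, x):
        # Welford's online algorithm for running mean and variance
        self.n += 1
        d = x - self.mean
        self.mean += d / self.n
        self.m2 += d * (x - self.mean)

    def params(self):
        var = self.m2 / (self.n - 1) if self.n > 1 else 0.0
        return self.mean, var

def fog_simulate(params, steps, seed=0):
    """Fog-side reconstruction: simulate the data stream from the last
    received model parameters instead of forwarding raw sensor values."""
    mean, var = params
    rng = random.Random(seed)
    return [rng.gauss(mean, var ** 0.5) for _ in range(steps)]
```

If the sensor syncs its two parameters once per hundred samples, the packet count drops by roughly the same factor the abstract reports, at the cost of the fog seeing a statistically similar rather than identical stream.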

  10. Combining Fog Computing with Sensor Mote Machine Learning for Industrial IoT

    PubMed Central

    Lavassani, Mehrzad; Jennehag, Ulf; Zhang, Tingting

    2018-01-01

    Digitalization is a global trend becoming ever more important to our connected and sustainable society. This trend also affects industry where the Industrial Internet of Things is an important part, and there is a need to conserve spectrum as well as energy when communicating data to a fog or cloud back-end system. In this paper we investigate the benefits of fog computing by proposing a novel distributed learning model on the sensor device and simulating the data stream in the fog, instead of transmitting all raw sensor values to the cloud back-end. To save energy and to communicate as few packets as possible, the updated parameters of the learned model at the sensor device are communicated in longer time intervals to a fog computing system. The proposed framework is implemented and tested in a real world testbed in order to make quantitative measurements and evaluate the system. Our results show that the proposed model can achieve a 98% decrease in the number of packets sent over the wireless link, and the fog node can still simulate the data stream with an acceptable accuracy of 97%. We also observe an end-to-end delay of 180 ms in our proposed three-layer framework. Hence, the framework shows that a combination of fog and cloud computing with a distributed data modeling at the sensor device for wireless sensor networks can be beneficial for Industrial Internet of Things applications. PMID:29757227

  11. Proposal for a Spatial Organization Model in Soil Science (The Example of the European Communities Soil Map).

    ERIC Educational Resources Information Center

    King, D.; And Others

    1994-01-01

    Discusses the computational problems of automating paper-based spatial information. A new relational structure for soil science information based on the main conceptual concepts used during conventional cartographic work is proposed. This model is a computerized framework for coherent description of the geographical variability of soils, combined…

  12. The Relationship between BIBFRAME and OCLC's Linked-Data Model of Bibliographic Description: A Working Paper

    ERIC Educational Resources Information Center

    Godby, Carol Jean

    2013-01-01

    This document describes a proposed alignment between BIBFRAME (Bibliographic Framework) and a model being explored by the Online Computer Library Center (OCLC) with extensions proposed by the Schema Bib Extend project, a Worldwide Web Consortium sponsored (W3C-sponsored) community group tasked with enhancing Schema.org to the description of…

  13. A thermo-chemo-mechanically coupled constitutive model for curing of glassy polymers

    NASA Astrophysics Data System (ADS)

    Sain, Trisha; Loeffel, Kaspar; Chester, Shawn

    2018-07-01

    Curing of a polymer is the process through which a polymer liquid transitions into a solid polymer, capable of bearing mechanical loads. The curing process is a coupled thermo-chemo-mechanical conversion process which requires a thorough understanding of the system behavior to predict the cure dependent mechanical behavior of the solid polymer. In this paper, a thermodynamically consistent, frame indifferent, thermo-chemo-mechanically coupled continuum level constitutive framework is proposed for thermally cured glassy polymers. The constitutive framework considers the thermodynamics of chemical reactions, as well as the material behavior for a glassy polymer. A stress-free intermediate configuration is introduced within a finite deformation setting to capture the formation of the network in a stress-free configuration. This work considers a definition for the degree of cure based on the chemistry of the curing reactions. A simplified version of the proposed model has been numerically implemented, and simulations are used to understand the capabilities of the model and framework.

  14. A deep learning framework for financial time series using stacked autoencoders and long-short term memory

    PubMed Central

    Bao, Wei; Rao, Yulei

    2017-01-01

    The application of deep learning approaches to finance has received a great deal of attention from both investors and researchers. This study presents a novel deep learning framework in which wavelet transforms (WT), stacked autoencoders (SAEs) and long-short term memory (LSTM) are combined for stock price forecasting. SAEs for hierarchically extracted deep features are introduced into stock price forecasting for the first time. The deep learning framework comprises three stages. First, the stock price time series is decomposed by WT to eliminate noise. Second, SAEs are applied to generate deep high-level features for predicting the stock price. Third, high-level denoising features are fed into LSTM to forecast the next day’s closing price. Six market indices and their corresponding index futures are chosen to examine the performance of the proposed model. Results show that the proposed model outperforms other similar models in both predictive accuracy and profitability performance. PMID:28708865
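The first stage, wavelet denoising, can be illustrated with a one-level Haar transform: split the series into pairwise averages (approximation) and differences (detail), zero out small detail coefficients, and reconstruct. The paper likely uses a deeper multi-level transform; this minimal version only shows the principle:

```python
def haar_denoise(x, threshold):
    """One-level Haar wavelet denoising of an even-length series: keep the
    approximation coefficients, zero out detail coefficients below the
    threshold, then invert the transform."""
    approx = [(a + b) / 2 for a, b in zip(x[::2], x[1::2])]
    detail = [(a - b) / 2 for a, b in zip(x[::2], x[1::2])]
    detail = [d if abs(d) > threshold else 0.0 for d in detail]
    out = []
    for a, d in zip(approx, detail):
        out += [a + d, a - d]
    return out
```

With a zero threshold the reconstruction is exact; raising the threshold smooths out small fluctuations while preserving the local level, which is the "noise elimination" role WT plays before the SAE and LSTM stages.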

  15. Software Engineering Support of the Third Round of Scientific Grand Challenge Investigations: An Earth Modeling System Software Framework Strawman Design that Integrates Cactus and UCLA/UCB Distributed Data Broker

    NASA Technical Reports Server (NTRS)

    Talbot, Bryan; Zhou, Shu-Jia; Higgins, Glenn

    2002-01-01

    One of the most significant challenges in large-scale climate modeling, as well as in high-performance computing in other scientific fields, is that of effectively integrating many software models from multiple contributors. A software framework facilitates the integration task, both in the development and runtime stages of the simulation. Effective software frameworks reduce the programming burden for the investigators, freeing them to focus more on the science and less on the parallel communication implementation, while maintaining high performance across numerous supercomputer and workstation architectures. This document proposes a strawman framework design for the climate community based on the integration of Cactus, from the relativistic physics community, and the UCLA/UCB Distributed Data Broker (DDB) from the climate community. This design is the result of an extensive survey of climate models and frameworks in the climate community as well as frameworks from many other scientific communities. The design addresses fundamental development and runtime needs using Cactus, a framework with interfaces for FORTRAN and C-based languages, and high-performance model communication needs using DDB. This document also specifically explores object-oriented design issues in the context of climate modeling as well as climate modeling issues in terms of object-oriented design.

  16. Niche construction game cancer cells play

    NASA Astrophysics Data System (ADS)

    Bergman, Aviv; Gligorijevic, Bojana

    2015-10-01

    The niche construction concept was originally defined in evolutionary biology as the continuous interplay between natural selection via environmental conditions and the modification of these conditions by the organism itself. Processes unraveling during cancer metastasis include the construction of niches, which cancer cells use towards more efficient survival, transport into new environments and preparation of the remote sites for their arrival. Many elegant experiments have recently illustrated, for example, premetastatic niche construction, but there is practically no mathematical modeling that applies the niche construction framework. To create models useful for understanding the role of niche construction in cancer progression, we argue that a) genetic, b) phenotypic and c) ecological levels are to be included. While the model proposed here is phenomenological in its current form, it can be converted into a predictive outcome model via experimental measurement of the model parameters. Here we give an overview of an experimentally formulated problem in cancer metastasis and propose how the niche construction framework can be utilized and broadened to model it. Other life science disciplines, such as host-parasite coevolution, may also benefit from adapting the niche construction framework, to satisfy the growing need for theoretical consideration of data collected by experimental biology.

  17. Niche construction game cancer cells play.

    PubMed

    Bergman, Aviv; Gligorijevic, Bojana

    2015-10-01

    The niche construction concept was originally defined in evolutionary biology as the continuous interplay between natural selection via environmental conditions and the modification of these conditions by the organism itself. Processes unraveling during cancer metastasis include the construction of niches, which cancer cells use towards more efficient survival, transport into new environments and preparation of the remote sites for their arrival. Many elegant experiments have recently illustrated, for example, premetastatic niche construction, but there is practically no mathematical modeling that applies the niche construction framework. To create models useful for understanding the role of niche construction in cancer progression, we argue that a) genetic, b) phenotypic and c) ecological levels are to be included. While the model proposed here is phenomenological in its current form, it can be converted into a predictive outcome model via experimental measurement of the model parameters. Here we give an overview of an experimentally formulated problem in cancer metastasis and propose how the niche construction framework can be utilized and broadened to model it. Other life science disciplines, such as host-parasite coevolution, may also benefit from adapting the niche construction framework, to satisfy the growing need for theoretical consideration of data collected by experimental biology.

  18. Abstraction and Assume-Guarantee Reasoning for Automated Software Verification

    NASA Technical Reports Server (NTRS)

    Chaki, S.; Clarke, E.; Giannakopoulou, D.; Pasareanu, C. S.

    2004-01-01

    Compositional verification and abstraction are the key techniques to address the state explosion problem associated with model checking of concurrent software. A promising compositional approach is to prove properties of a system by checking properties of its components in an assume-guarantee style. This article proposes a framework for performing abstraction and assume-guarantee reasoning of concurrent C code in an incremental and fully automated fashion. The framework uses predicate abstraction to extract and refine finite state models of software, and it uses an automata learning algorithm to incrementally construct assumptions for the compositional verification of the abstract models. The framework can be instantiated with different assume-guarantee rules. We have implemented our approach in the COMFORT reasoning framework, and we show how COMFORT outperforms several previous software model checking approaches when checking safety properties of non-trivial concurrent programs.
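Predicate abstraction's core move is to collapse concrete states into boolean valuations of a few predicates and then analyze the resulting finite abstract system. The sketch below builds an abstract transition relation from an explicitly enumerated concrete domain and computes the reachable abstract states; real tools derive the abstract relation symbolically from C code rather than by enumeration:

```python
def abstract_reach(init, step, preds, domain):
    """Predicate abstraction sketch: map each concrete state to the tuple
    of its predicate truth values, lift concrete transitions to abstract
    ones, and compute the abstract reachability fixpoint."""
    alpha = lambda s: tuple(p(s) for p in preds)
    # abstract transition relation, induced by concrete transitions
    trans = {}
    for s in domain:
        trans.setdefault(alpha(s), set()).add(alpha(step(s)))
    seen, frontier = {alpha(init)}, [alpha(init)]
    while frontier:
        a = frontier.pop()
        for b in trans.get(a, ()):
            if b not in seen:
                seen.add(b)
                frontier.append(b)
    return seen
```

A safety check then amounts to asking whether any reachable abstract state satisfies the error predicate; spurious abstract counterexamples drive the refinement loop the abstract mentions.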

  19. Nonlinear finite element model updating for damage identification of civil structures using batch Bayesian estimation

    NASA Astrophysics Data System (ADS)

    Ebrahimian, Hamed; Astroza, Rodrigo; Conte, Joel P.; de Callafon, Raymond A.

    2017-02-01

    This paper presents a framework for structural health monitoring (SHM) and damage identification of civil structures. This framework integrates advanced mechanics-based nonlinear finite element (FE) modeling and analysis techniques with a batch Bayesian estimation approach to estimate time-invariant model parameters used in the FE model of the structure of interest. The framework uses input excitation and dynamic response of the structure and updates a nonlinear FE model of the structure to minimize the discrepancies between predicted and measured response time histories. The updated FE model can then be interrogated to detect, localize, classify, and quantify the state of damage and predict the remaining useful life of the structure. As opposed to recursive estimation methods, in the batch Bayesian estimation approach, the entire time history of the input excitation and output response of the structure are used as a batch of data to estimate the FE model parameters through a number of iterations. In the case of non-informative prior, the batch Bayesian method leads to an extended maximum likelihood (ML) estimation method to estimate jointly time-invariant model parameters and the measurement noise amplitude. The extended ML estimation problem is solved efficiently using a gradient-based interior-point optimization algorithm. Gradient-based optimization algorithms require the FE response sensitivities with respect to the model parameters to be identified. The FE response sensitivities are computed accurately and efficiently using the direct differentiation method (DDM). The estimation uncertainties are evaluated based on the Cramer-Rao lower bound (CRLB) theorem by computing the exact Fisher Information matrix using the FE response sensitivities with respect to the model parameters. The accuracy of the proposed uncertainty quantification approach is verified using a sampling approach based on the unscented transformation. 
Two validation studies, based on realistic structural FE models of a bridge pier and a moment-resisting steel frame, are performed to validate the performance and accuracy of the presented nonlinear FE model updating approach and demonstrate its application to SHM. These validation studies show the excellent performance of the proposed framework for SHM and damage identification even in the presence of high measurement noise and/or poor initial estimates of the model parameters. Furthermore, the detrimental effects of the input measurement noise on the performance of the proposed framework are illustrated and quantified through one of the validation studies.
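The batch-estimation idea, fitting time-invariant parameters against the whole response history at once using response sensitivities, can be reduced to a toy scalar model y = theta * u. Here the sensitivity dy/dtheta = u is available analytically, playing the role the direct differentiation method plays for FE responses; the model and learning rate are illustrative, not the paper's FE formulation:

```python
def ml_estimate(u, y, theta0=1.0, lr=0.01, iters=500):
    """Batch gradient descent on the mean squared discrepancy between the
    predicted (theta * u) and measured (y) response histories. The factor
    `ut` in the gradient is the response sensitivity dy/dtheta."""
    theta = theta0
    for _ in range(iters):
        grad = sum(2 * (theta * ut - yt) * ut for ut, yt in zip(u, y)) / len(u)
        theta -= lr * grad
    return theta
```

With noise-free data the estimate converges to the true parameter; with noisy data it converges to the least-squares (maximum-likelihood under Gaussian noise) value, mirroring the extended ML estimation described above.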

  20. Rapid development of entity-based data models for bioinformatics with persistence object-oriented design and structured interfaces.

    PubMed

    Ezra Tsur, Elishai

    2017-01-01

    Databases are imperative for research in bioinformatics and computational biology. Current challenges in database design include data heterogeneity and context-dependent interconnections between data entities. These challenges drove the development of unified data interfaces and specialized databases. The curation of specialized databases is an ever-growing challenge due to the introduction of new data sources and the emergence of new relational connections between established datasets. Here, an open-source framework for the curation of specialized databases is proposed. The framework supports user-designed models of data encapsulation, object persistency and structured interfaces to local and external data sources such as MalaCards, Biomodels and the National Centre for Biotechnology Information (NCBI) databases. The proposed framework was implemented using Java as the development environment, EclipseLink as the data persistency agent and Apache Derby as the database manager. Syntactic analysis was based on the J3D, jsoup, Apache Commons and w3c.dom open libraries. Finally, the construction of a specialized database for aneurysm-associated vascular diseases is demonstrated. This database contains 3-dimensional geometries of aneurysms, patients' clinical information, articles, biological models, related diseases and our recently published model of aneurysms' risk of rupture. The framework is available at: http://nbel-lab.com.

  1. The Curriculum Innovation Canvas: A Design Thinking Framework for the Engaged Educational Entrepreneur

    ERIC Educational Resources Information Center

    Willness, Chelsea; Bruni-Bossio, Vince

    2017-01-01

    Integrating literature on entrepreneurial business models and community-based experiential learning, we propose a new framework to advance the practice of curriculum innovation. Grounded in principles of design thinking, the curriculum innovation canvas provides a human-centered, collaborative, and holistic platform for instructors, curriculum…

  2. Progressive Education Standards: A Neuroscience Framework

    ERIC Educational Resources Information Center

    O'Grady, Patty

    2011-01-01

    This paper proposes a coherent and unique set of 12 standards, adopting a neuroscience framework for biologically based school reform. This model of educational principles and practices aligns with the long-standing principles and practices of the Progressive Education Movement in the United States and the emerging principles of neuroscience.…

  3. Multicriteria framework for selecting a process modelling language

    NASA Astrophysics Data System (ADS)

    Scanavachi Moreira Campos, Ana Carolina; Teixeira de Almeida, Adiel

    2016-01-01

    The choice of process modelling language can affect business process management (BPM) since each modelling language shows different features of a given process and may limit the ways in which a process can be described and analysed. However, choosing the appropriate modelling language for process modelling has become a difficult task because of the availability of a large number of modelling languages and the lack of guidelines on evaluating and comparing languages so as to assist in selecting the most appropriate one. This paper proposes a framework for selecting a modelling language in accordance with the purposes of modelling. This framework is based on the semiotic quality framework (SEQUAL) for evaluating process modelling languages and a multicriteria decision aid (MCDA) approach in order to select the most appropriate language for BPM. This study does not attempt to set out new forms of assessment and evaluation criteria, but does attempt to demonstrate how two existing approaches can be combined so as to solve the problem of selection of modelling language. The framework is described in this paper and then demonstrated by means of an example. Finally, the advantages and disadvantages of using SEQUAL and MCDA in an integrated manner are discussed.
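An MCDA selection step can be as simple as a weighted-sum aggregation over SEQUAL-style quality criteria. The criteria names, scores and weights below are made-up placeholders, and real MCDA methods (outranking, additive value functions with elicitation, etc.) are richer than a plain weighted sum:

```python
def weighted_score(scores, weights):
    """Additive MCDA aggregation: the overall score of each modelling
    language is the weight-normalised sum over the quality criteria."""
    total_w = sum(weights.values())
    return {alt: sum(weights[c] * s[c] for c in weights) / total_w
            for alt, s in scores.items()}

def select_language(scores, weights):
    """Return the modelling language with the highest aggregate score."""
    ranked = sorted(weighted_score(scores, weights).items(),
                    key=lambda kv: kv[1], reverse=True)
    return ranked[0][0]
```

Changing the weights to reflect a different modelling purpose can change which language wins, which is exactly why the framework ties the MCDA step to the purposes of modelling.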

  4. More of the same? Comment on "An integrated framework for the optimisation of sport and athlete development: a practitioner approach".

    PubMed

    MacNamara, Aine; Collins, Dave

    2014-01-01

    Gulbin and colleagues (Gulbin, J. P., Croser, M. J., Morley, E. J., & Weissensteiner, J. R. (2013). An integrated framework for the optimisation of sport and athlete development: A practitioner approach. Journal of Sports Sciences) present a new sport and athlete development framework that evolved from empirical observations from working with the Australian Institute of Sport. The FTEM (Foundations, Talent, Elite, Mastery) framework is proposed to integrate general and specialised phases of development for participants within the active lifestyle, sport participation and sport excellence pathways. A number of issues concerning the FTEM framework are presented. We also propose the need to move beyond prescriptive models of talent identification and development towards a consideration of features of best practice and process markers of development together with robust guidelines about the implementation of these in applied practice.

  5. An Enterprise Architecture Perspective to Electronic Health Record Based Care Governance.

    PubMed

    Motoc, Bogdan

    2017-01-01

    This paper proposes an Enterprise Architecture viewpoint of Electronic Health Record (EHR) based care governance. The improvements expected are derived from the collaboration framework and the clinical health model proposed as foundation for the concept of EHR.

  6. A system framework of inter-enterprise machining quality control based on fractal theory

    NASA Astrophysics Data System (ADS)

    Zhao, Liping; Qin, Yongtao; Yao, Yiyong; Yan, Peng

    2014-03-01

    In order to meet the quality control requirements of dynamic and complicated product machining processes among enterprises, a system framework of inter-enterprise machining quality control based on fractal theory was proposed. In this system framework, the fractal-specific characteristic of the inter-enterprise machining quality control function was analysed, and the model of inter-enterprise machining quality control was constructed from the nature of fractal structures. Furthermore, the goal-driven strategy of inter-enterprise quality control and the dynamic organisation strategy of inter-enterprise quality improvement were constructed through characteristic analysis of this model. In addition, the architecture of inter-enterprise machining quality control based on fractal theory was established by means of Web services. Finally, a case study for application was presented. The result showed that the proposed method was available, and could provide guidance for quality control and support for product reliability in inter-enterprise machining processes.

  7. Market-Based Coordination of Thermostatically Controlled Loads—Part I: A Mechanism Design Formulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Sen; Zhang, Wei; Lian, Jianming

    This paper focuses on the coordination of a population of Thermostatically Controlled Loads (TCLs) with unknown parameters to achieve group objectives. The problem involves designing the bidding and market clearing strategy to motivate self-interested users to realize efficient energy allocation subject to a peak power constraint. Using the mechanism design approach, we propose a market-based coordination framework, which can effectively incorporate heterogeneous load dynamics, systematically deal with user preferences, account for the unknown load model parameters, and enable real-world implementation with limited communication resources. This paper is divided into two parts. Part I presents a mathematical formulation of the problem and develops a coordination framework using the mechanism design approach. Part II presents a learning scheme to account for the unknown load model parameters, and evaluates the proposed framework through realistic simulations.

  8. A Conceptual Framework for Decision-making Support in Uncertainty- and Risk-based Diagnosis of Rare Clinical Cases by Specialist Physicians.

    PubMed

    Santos, Adriano A; Moura, J Antão B; de Araújo, Joseana Macêdo Fechine Régis

    2015-01-01

    Mitigating the uncertainty and risks faced by specialist physicians in the analysis of rare clinical cases is something desired by anyone who needs health services. Clinical cases never before seen by these experts, and poorly documented, may introduce errors into decision-making. Such errors negatively affect the well-being of patients; increase procedure costs, rework, and health insurance premiums; and impair the reputation of the specialists and medical systems involved. In this context, IT and Clinical Decision Support Systems (CDSS) play a fundamental role, supporting the decision-making process, making it more efficient and effective, reducing the number of avoidable medical errors, and enhancing the quality of treatment given to patients. An investigation has been initiated to look into the characteristics and solution requirements of this problem, model it, propose a general solution in terms of a conceptual risk-based, automated framework to support rare-case medical diagnostics, and validate it by means of case studies. A preliminary validation study of the proposed framework has been carried out through interviews with experts who are practicing professionals, academics, and researchers in health care. This paper summarizes the investigation and its positive results. These results motivate continuation of the research towards development of the conceptual framework and of a software tool that implements the proposed model.

  9. Modelling volumetric growth in a thick walled fibre reinforced artery

    NASA Astrophysics Data System (ADS)

    Eriksson, T. S. E.; Watton, P. N.; Luo, X. Y.; Ventikos, Y.

    2014-12-01

    A novel framework for simulating growth and remodelling (G&R) of a fibre-reinforced artery, including volumetric adaption, is proposed. We show how to implement this model into a finite element framework and propose and examine two underlying assumptions for modelling growth, namely constant individual density (CID) or adaptive individual density (AID). Moreover, we formulate a novel approach which utilises a combination of both AID and CID to simulate volumetric G&R for a tissue composed of several different constituents. We consider a special case of the G&R of an artery subjected to prescribed elastin degradation and we theorise on the assumptions and suitability of CID, AID and the mixed approach for modelling arterial biology. For simulating the volumetric changes that occur during aneurysm enlargement, we observe that it is advantageous to describe the growth of collagen using CID whilst it is preferable to model the atrophy of elastin using AID.

  10. Flood mapping in ungauged basins using fully continuous hydrologic-hydraulic modeling

    NASA Astrophysics Data System (ADS)

    Grimaldi, Salvatore; Petroselli, Andrea; Arcangeletti, Ettore; Nardi, Fernando

    2013-04-01

    In this work, a fully continuous hydrologic-hydraulic modeling framework for flood mapping is introduced and tested. It is characterized by the simulation of a long rainfall time series at sub-daily resolution that feeds a continuous rainfall-runoff model, producing a discharge time series that is given directly as input to a two-dimensional hydraulic model. The main advantage of the proposed approach is that it avoids the design hyetograph and design hydrograph, which constitute the main sources of subjective analysis and uncertainty in standard methods. The proposed procedure is optimized for small and ungauged watersheds, where empirical models are commonly applied. Results of a simple real case study confirm that this experimental fully continuous framework may pave the way for a less subjective and potentially automated procedure for flood hazard mapping.

  11. Toward a consistent modeling framework to assess multi-sectoral climate impacts.

    PubMed

    Monier, Erwan; Paltsev, Sergey; Sokolov, Andrei; Chen, Y-H Henry; Gao, Xiang; Ejaz, Qudsia; Couzo, Evan; Schlosser, C Adam; Dutkiewicz, Stephanie; Fant, Charles; Scott, Jeffery; Kicklighter, David; Morris, Jennifer; Jacoby, Henry; Prinn, Ronald; Haigh, Martin

    2018-02-13

    Efforts to estimate the physical and economic impacts of future climate change face substantial challenges. To enrich the currently popular approaches to impact analysis, which involve evaluation of a damage function or multi-model comparisons based on a limited number of standardized scenarios, we propose integrating a geospatially resolved physical representation of impacts into a coupled human-Earth system modeling framework. Large internationally coordinated exercises cannot easily respond to new policy targets, and the implementation of standard scenarios across models, institutions and research communities can yield inconsistent estimates. Here, we argue for a shift toward the use of a self-consistent integrated modeling framework to assess climate impacts, and discuss ways the integrated assessment modeling community can move in this direction. We then demonstrate the capabilities of such a modeling framework by conducting a multi-sectoral assessment of climate impacts under a range of consistent and integrated economic and climate scenarios that are responsive to new policies and business expectations.

  12. Integrated city as a model for a new wave urban tourism

    NASA Astrophysics Data System (ADS)

    Ariani, V.

    2018-03-01

    Cities are major players among urban tourism destinations, and mass tourism flows force cities with similar characteristics to compete for visitors. A new framework model for new-wave urban tourism is therefore crucial for giving tourists richer experiences and adding value to the city itself. The integrated city is proposed as the basis for a new model of the urban tourism destination. The purpose of this preliminary research is to define an integrated-city framework for urban tourism development. It provides a rationale for tourism planners pursuing an innovative approach, competitive advantage, and a general urban tourism destination model. The methodology applied in this research includes a desk survey, a literature review and a focus group discussion. A conceptual framework is proposed, discussed and exemplified. The framework model adopts a place-based approach to the tourism destination and suggests an integrated city model for urban tourism development. This model is a tool for strategy-making in re-inventing the integrated city as an urban tourism destination.

  13. A conceptual framework for a long-term economic model for the treatment of attention-deficit/hyperactivity disorder.

    PubMed

    Nagy, Balázs; Setyawan, Juliana; Coghill, David; Soroncz-Szabó, Tamás; Kaló, Zoltán; Doshi, Jalpa A

    2017-06-01

    Models incorporating long-term outcomes (LTOs) are not available to assess the health economic impact of attention-deficit/hyperactivity disorder (ADHD). Our objective was to develop a conceptual modelling framework capable of assessing the long-term economic impact of ADHD therapies. The literature was reviewed, and a conceptual structure for the long-term model was outlined with attention to disease characteristics and the potential impact of treatment strategies. The proposed model has four layers: i) a multi-state short-term framework to differentiate between ADHD treatments; ii) multiple states merged into three core health states associated with LTOs; iii) a series of sub-models in which particular LTOs are depicted; iv) outcomes collected to be used either directly for economic analyses or translated into other relevant measures. This conceptual model provides a framework to assess relationships between short- and long-term outcomes of the disease and its treatment, and to estimate the economic impact of ADHD treatments throughout the course of the disease.

  14. Experimental analysis of chaotic neural network models for combinatorial optimization under a unifying framework.

    PubMed

    Kwok, T; Smith, K A

    2000-09-01

    The aim of this paper is to study both the theoretical and experimental properties of chaotic neural network (CNN) models for solving combinatorial optimization problems. We previously proposed a unifying framework which encompasses the three main model types, namely, Chen and Aihara's chaotic simulated annealing (CSA) with decaying self-coupling, Wang and Smith's CSA with decaying timestep, and the Hopfield network with chaotic noise. Each of these models can be represented as a special case of the framework under certain conditions. This paper combines the framework with experimental results to provide new insights into the effect of the chaotic neurodynamics of each model. By solving N-queen problems of various sizes with computer simulations, the CNN models are compared in different parameter spaces, with optimization performance measured in terms of feasibility, efficiency, robustness and scalability. Furthermore, characteristic chaotic neurodynamics crucial to effective optimization are identified, together with a guide to choosing the corresponding model parameters.
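    The decaying self-coupling that distinguishes Chen and Aihara's CSA can be illustrated with a single-neuron sketch. The update rule below is a minimal, hypothetical rendition of that model type, not the experimental setup used in the paper, and all parameter values are invented for illustration: the self-coupling term z drives chaotic search early on and is annealed away so the neuron settles into stable, Hopfield-like dynamics.

    ```python
    import math

    def csa_neuron_step(y, x, z, inp, k=0.9, alpha=0.015, I0=0.65, eps=0.02):
        """One update of a transiently chaotic neuron (Chen-Aihara style).

        y: internal state, x: output in (0, 1), z: self-coupling strength
        (the source of chaotic dynamics), inp: weighted input from other
        neurons plus bias. All parameter values here are illustrative.
        """
        y_next = k * y + alpha * inp - z * (x - I0)
        x_next = 1.0 / (1.0 + math.exp(-y_next / eps))
        return y_next, x_next

    def run(steps=200, beta=0.01):
        """Drive one neuron while annealing z: chaos early, convergence late."""
        y, x, z = 0.1, 0.5, 0.08
        trace = []
        for _ in range(steps):
            y, x = csa_neuron_step(y, x, z, inp=0.0)
            z *= 1.0 - beta  # the decaying self-coupling
            trace.append(x)
        return z, trace
    ```

    In a full network, `inp` would be the weighted sum over the other neurons' outputs encoding the optimization problem; here it is held at zero so only the annealing of z is visible.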

  15. An ice sheet model validation framework for the Greenland ice sheet

    NASA Astrophysics Data System (ADS)

    Price, Stephen F.; Hoffman, Matthew J.; Bonin, Jennifer A.; Howat, Ian M.; Neumann, Thomas; Saba, Jack; Tezaur, Irina; Guerber, Jeffrey; Chambers, Don P.; Evans, Katherine J.; Kennedy, Joseph H.; Lenaerts, Jan; Lipscomb, William H.; Perego, Mauro; Salinger, Andrew G.; Tuminaro, Raymond S.; van den Broeke, Michiel R.; Nowicki, Sophie M. J.

    2017-01-01

    We propose a new ice sheet model validation framework - the Cryospheric Model Comparison Tool (CmCt) - that takes advantage of ice sheet altimetry and gravimetry observations collected over the past several decades and is applied here to modeling of the Greenland ice sheet. We use realistic simulations performed with the Community Ice Sheet Model (CISM) along with two idealized, non-dynamic models to demonstrate the framework and its use. Dynamic simulations with CISM are forced from 1991 to 2013, using combinations of reanalysis-based surface mass balance and observations of outlet glacier flux change. We propose and demonstrate qualitative and quantitative metrics for use in evaluating the different model simulations against the observations. We find that the altimetry observations used here are largely ambiguous in terms of their ability to distinguish one simulation from another. Based on basin-scale and whole-ice-sheet-scale metrics, we find that simulations using both idealized conceptual models and dynamic, numerical models provide an equally reasonable representation of the ice sheet surface (mean elevation differences of < 1 m). This is likely due to their short period of record, biases inherent to digital elevation models used for model initial conditions, and biases resulting from firn dynamics, which are not explicitly accounted for in the models or observations. On the other hand, we find that the gravimetry observations used here are able to unambiguously distinguish between simulations of varying complexity, and along with the CmCt, can provide a quantitative score for assessing a particular model and/or simulation. 
The new framework demonstrates that our proposed metrics can distinguish relatively better from relatively worse simulations and that dynamic ice sheet models, when appropriately initialized and forced with the right boundary conditions, demonstrate a predictive skill with respect to observed dynamic changes that have occurred on Greenland over the past few decades. An extensible design will allow for continued use of the CmCt as future altimetry, gravimetry, and other remotely sensed data become available for use in ice sheet model validation.
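    The basin-scale comparison described above reduces to aggregating modeled-minus-observed elevation differences. A minimal sketch of such a metric (illustrative only, not the CmCt implementation; the function name and mask handling are assumptions) might look like:

    ```python
    def elevation_metrics(modeled, observed, basin_mask=None):
        """Mean difference and RMSE (both in metres) between modeled and
        observed surface elevations, optionally restricted to one basin."""
        if basin_mask is None:
            basin_mask = [True] * len(modeled)
        diffs = [m - o for m, o, keep in zip(modeled, observed, basin_mask) if keep]
        n = len(diffs)
        mean_diff = sum(diffs) / n
        rmse = (sum(d * d for d in diffs) / n) ** 0.5
        return mean_diff, rmse
    ```

    A mean difference near zero combined with a large RMSE would indicate compensating biases rather than genuine agreement, which is one reason the paper reports both qualitative and quantitative metrics.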

  16. An ice sheet model validation framework for the Greenland ice sheet

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Price, Stephen F.; Hoffman, Matthew J.; Bonin, Jennifer A.

    We propose a new ice sheet model validation framework, the Cryospheric Model Comparison Tool (CmCt), that takes advantage of ice sheet altimetry and gravimetry observations collected over the past several decades and is applied here to modeling of the Greenland ice sheet. We use realistic simulations performed with the Community Ice Sheet Model (CISM) along with two idealized, non-dynamic models to demonstrate the framework and its use. Dynamic simulations with CISM are forced from 1991 to 2013 using combinations of reanalysis-based surface mass balance and observations of outlet glacier flux change. We propose and demonstrate qualitative and quantitative metrics for use in evaluating the different model simulations against the observations. We find that the altimetry observations used here are largely ambiguous in terms of their ability to distinguish one simulation from another. Based on basin- and whole-ice-sheet scale metrics, the model initial condition as well as output from idealized and dynamic models all provide an equally reasonable representation of the ice sheet surface (mean elevation differences of <1 m). This is likely due to their short period of record, biases inherent to digital elevation models used for model initial conditions, and biases resulting from firn dynamics, which are not explicitly accounted for in the models or observations. On the other hand, we find that the gravimetry observations used here are able to unambiguously distinguish between simulations of varying complexity, and along with the CmCt, can provide a quantitative score for assessing a particular model and/or simulation. 
The new framework demonstrates that our proposed metrics can distinguish relatively better from relatively worse simulations and that dynamic ice sheet models, when appropriately initialized and forced with the right boundary conditions, demonstrate predictive skill with respect to observed dynamic changes occurring on Greenland over the past few decades. An extensible design will allow for continued use of the CmCt as future altimetry, gravimetry, and other remotely sensed data become available for use in ice sheet model validation.

  17. An ice sheet model validation framework for the Greenland ice sheet

    DOE PAGES

    Price, Stephen F.; Hoffman, Matthew J.; Bonin, Jennifer A.; ...

    2017-01-17

    We propose a new ice sheet model validation framework, the Cryospheric Model Comparison Tool (CmCt), that takes advantage of ice sheet altimetry and gravimetry observations collected over the past several decades and is applied here to modeling of the Greenland ice sheet. We use realistic simulations performed with the Community Ice Sheet Model (CISM) along with two idealized, non-dynamic models to demonstrate the framework and its use. Dynamic simulations with CISM are forced from 1991 to 2013 using combinations of reanalysis-based surface mass balance and observations of outlet glacier flux change. We propose and demonstrate qualitative and quantitative metrics for use in evaluating the different model simulations against the observations. We find that the altimetry observations used here are largely ambiguous in terms of their ability to distinguish one simulation from another. Based on basin- and whole-ice-sheet scale metrics, the model initial condition as well as output from idealized and dynamic models all provide an equally reasonable representation of the ice sheet surface (mean elevation differences of <1 m). This is likely due to their short period of record, biases inherent to digital elevation models used for model initial conditions, and biases resulting from firn dynamics, which are not explicitly accounted for in the models or observations. On the other hand, we find that the gravimetry observations used here are able to unambiguously distinguish between simulations of varying complexity, and along with the CmCt, can provide a quantitative score for assessing a particular model and/or simulation. 
The new framework demonstrates that our proposed metrics can distinguish relatively better from relatively worse simulations and that dynamic ice sheet models, when appropriately initialized and forced with the right boundary conditions, demonstrate predictive skill with respect to observed dynamic changes occurring on Greenland over the past few decades. An extensible design will allow for continued use of the CmCt as future altimetry, gravimetry, and other remotely sensed data become available for use in ice sheet model validation.

  18. An ice sheet model validation framework for the Greenland ice sheet

    PubMed Central

    Price, Stephen F.; Hoffman, Matthew J.; Bonin, Jennifer A.; Howat, Ian M.; Neumann, Thomas; Saba, Jack; Tezaur, Irina; Guerber, Jeffrey; Chambers, Don P.; Evans, Katherine J.; Kennedy, Joseph H.; Lenaerts, Jan; Lipscomb, William H.; Perego, Mauro; Salinger, Andrew G.; Tuminaro, Raymond S.; van den Broeke, Michiel R.; Nowicki, Sophie M. J.

    2018-01-01

    We propose a new ice sheet model validation framework – the Cryospheric Model Comparison Tool (CmCt) – that takes advantage of ice sheet altimetry and gravimetry observations collected over the past several decades and is applied here to modeling of the Greenland ice sheet. We use realistic simulations performed with the Community Ice Sheet Model (CISM) along with two idealized, non-dynamic models to demonstrate the framework and its use. Dynamic simulations with CISM are forced from 1991 to 2013 using combinations of reanalysis-based surface mass balance and observations of outlet glacier flux change. We propose and demonstrate qualitative and quantitative metrics for use in evaluating the different model simulations against the observations. We find that the altimetry observations used here are largely ambiguous in terms of their ability to distinguish one simulation from another. Based on basin- and whole-ice-sheet scale metrics, we find that simulations using both idealized conceptual models and dynamic, numerical models provide an equally reasonable representation of the ice sheet surface (mean elevation differences of <1 m). This is likely due to their short period of record, biases inherent to digital elevation models used for model initial conditions, and biases resulting from firn dynamics, which are not explicitly accounted for in the models or observations. On the other hand, we find that the gravimetry observations used here are able to unambiguously distinguish between simulations of varying complexity, and along with the CmCt, can provide a quantitative score for assessing a particular model and/or simulation. 
The new framework demonstrates that our proposed metrics can distinguish relatively better from relatively worse simulations and that dynamic ice sheet models, when appropriately initialized and forced with the right boundary conditions, demonstrate predictive skill with respect to observed dynamic changes occurring on Greenland over the past few decades. An extensible design will allow for continued use of the CmCt as future altimetry, gravimetry, and other remotely sensed data become available for use in ice sheet model validation. PMID:29697704

  19. An Ice Sheet Model Validation Framework for the Greenland Ice Sheet

    NASA Technical Reports Server (NTRS)

    Price, Stephen F.; Hoffman, Matthew J.; Bonin, Jennifer A.; Howat, Ian M.; Neumann, Thomas A.; Saba, Jack; Tezaur, Irina; Guerber, Jeffrey R.; Chambers, Don P.; Evans, Katherine J.

    2017-01-01

    We propose a new ice sheet model validation framework - the Cryospheric Model Comparison Tool (CmCt) - that takes advantage of ice sheet altimetry and gravimetry observations collected over the past several decades and is applied here to modeling of the Greenland ice sheet. We use realistic simulations performed with the Community Ice Sheet Model (CISM) along with two idealized, non-dynamic models to demonstrate the framework and its use. Dynamic simulations with CISM are forced from 1991 to 2013, using combinations of reanalysis-based surface mass balance and observations of outlet glacier flux change. We propose and demonstrate qualitative and quantitative metrics for use in evaluating the different model simulations against the observations. We find that the altimetry observations used here are largely ambiguous in terms of their ability to distinguish one simulation from another. Based on basin-scale and whole-ice-sheet-scale metrics, we find that simulations using both idealized conceptual models and dynamic, numerical models provide an equally reasonable representation of the ice sheet surface (mean elevation differences of less than 1 meter). This is likely due to their short period of record, biases inherent to digital elevation models used for model initial conditions, and biases resulting from firn dynamics, which are not explicitly accounted for in the models or observations. On the other hand, we find that the gravimetry observations used here are able to unambiguously distinguish between simulations of varying complexity, and along with the CmCt, can provide a quantitative score for assessing a particular model and/or simulation. 
The new framework demonstrates that our proposed metrics can distinguish relatively better from relatively worse simulations and that dynamic ice sheet models, when appropriately initialized and forced with the right boundary conditions, demonstrate a predictive skill with respect to observed dynamic changes that have occurred on Greenland over the past few decades. An extensible design will allow for continued use of the CmCt as future altimetry, gravimetry, and other remotely sensed data become available for use in ice sheet model validation.

  20. Development of a Neural Network-Based Renewable Energy Forecasting Framework for Process Industries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Soobin; Ryu, Jun-Hyung; Hodge, Bri-Mathias

    2016-06-25

    This paper presents a neural network-based forecasting framework for photovoltaic power (PV) generation as a decision-supporting tool to employ renewable energies in the process industry. The applicability of the proposed framework is illustrated by comparing its performance against other methodologies such as linear and nonlinear time series modelling approaches. A case study of an actual PV power plant in South Korea is presented.
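    As a rough illustration of the kind of neural network forecaster described, the sketch below trains a tiny one-hidden-layer network on lagged values of a synthetic PV-like signal. The architecture, synthetic signal, and hyperparameters are all invented for illustration and are not taken from the paper:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical hourly PV signal: clipped sinusoid (daylight) plus noise.
    t = np.arange(24 * 60)
    pv = np.clip(np.sin(2 * np.pi * t / 24.0), 0.0, None) + 0.05 * rng.normal(size=t.size)

    def lagged(series, n_lags=24):
        """Build (samples, n_lags) lag inputs and next-step targets."""
        X = np.stack([series[i:i - n_lags] for i in range(n_lags)], axis=1)
        return X, series[n_lags:]

    def train_mlp(X, y, hidden=16, lr=0.02, epochs=300):
        """One-hidden-layer tanh network, full-batch gradient descent on MSE."""
        W1 = 0.1 * rng.normal(size=(X.shape[1], hidden)); b1 = np.zeros(hidden)
        W2 = 0.1 * rng.normal(size=(hidden, 1)); b2 = 0.0
        losses = []
        for _ in range(epochs):
            H = np.tanh(X @ W1 + b1)                 # hidden activations
            pred = (H @ W2).ravel() + b2
            err = pred - y
            losses.append(float(np.mean(err ** 2)))
            dpred = 2.0 * err[:, None] / len(y)      # dMSE/dpred
            dW2, db2 = H.T @ dpred, dpred.sum()
            dH = (dpred @ W2.T) * (1.0 - H ** 2)     # backprop through tanh
            dW1, db1 = X.T @ dH, dH.sum(axis=0)
            W1 -= lr * dW1; b1 -= lr * db1
            W2 -= lr * dW2; b2 -= lr * db2
        return (W1, b1, W2, b2), losses
    ```

    The paper's comparison against linear and nonlinear time series models would correspond here to fitting, say, an AR(24) model on the same lag matrix and comparing held-out errors.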

  1. Neural Meta-Memes Framework for Combinatorial Optimization

    NASA Astrophysics Data System (ADS)

    Song, Li Qin; Lim, Meng Hiot; Ong, Yew Soon

    In this paper, we present a Neural Meta-Memes Framework (NMMF) for combinatorial optimization. NMMF is a framework which models basic optimization algorithms as memes and manages them dynamically when solving combinatorial problems. NMMF encompasses neural networks which serve as the overall planner/coordinator to balance the workload between memes. We show the efficacy of the proposed NMMF through an empirical study on a class of combinatorial problems, the quadratic assignment problem (QAP).

  2. A revised Self- and Family Management Framework.

    PubMed

    Grey, Margaret; Schulman-Green, Dena; Knafl, Kathleen; Reynolds, Nancy R

    2015-01-01

    Research on self- and family management of chronic conditions has advanced over the past 6 years, but the use of simple frameworks has hampered the understanding of the complexities involved. We sought to update our previously published model with new empirical, synthetic, and theoretical work. We used synthesis of previous studies to update the framework. We propose a revised framework that clarifies facilitators and barriers, processes, proximal outcomes, and distal outcomes of self- and family management and their relationships. We offer the revised framework as a model that can be used in studies aimed at advancing self- and family management science. The use of the framework to guide studies would allow for the design of studies that can address more clearly how self-management interventions work and under what conditions. Copyright © 2015 Elsevier Inc. All rights reserved.

  3. A Regularized Linear Dynamical System Framework for Multivariate Time Series Analysis.

    PubMed

    Liu, Zitao; Hauskrecht, Milos

    2015-01-01

    Linear Dynamical System (LDS) is an elegant mathematical framework for modeling and learning Multivariate Time Series (MTS). In general, however, it is difficult to set the dimension of an LDS's hidden state space. A small number of hidden states may not be able to model the complexities of an MTS, while a large number of hidden states can lead to overfitting. In this paper, we study learning methods that impose various regularization penalties on the transition matrix of the LDS model and propose a regularized LDS learning framework (rLDS) which aims to (1) automatically shut down the LDS's spurious and unnecessary dimensions, and consequently, address the problem of choosing the optimal number of hidden states; (2) prevent the overfitting problem given a small amount of MTS data; and (3) support accurate MTS forecasting. To learn the regularized LDS from data, we incorporate a second-order cone program and a generalized gradient descent method into the Maximum a Posteriori framework and use Expectation Maximization to obtain a low-rank transition matrix of the LDS model. We propose two priors for modeling the matrix, which lead to two instances of our rLDS. We show that our rLDS recovers the intrinsic dimensionality of the time series dynamics well and improves predictive performance when compared to baselines on both synthetic and real-world MTS datasets.
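    The core object the rLDS regularizes is the transition matrix, whose rank caps the effective hidden dimensionality. The sketch below is illustrative only: it constructs a low-rank transition directly rather than learning one via the paper's MAP/EM procedure, to show how a rank-r matrix confines the LDS dynamics to an r-dimensional subspace.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def make_low_rank_transition(dim, rank, scale=0.3):
        """Rank-deficient transition matrix: the dynamics live in a
        rank-dimensional subspace, so extra hidden dimensions are redundant."""
        U = rng.normal(size=(dim, rank))
        V = rng.normal(size=(dim, rank))
        return scale * (U @ V.T) / dim

    def forecast(A, C, x0, horizon):
        """Propagate the hidden-state mean and emit noise-free observations."""
        outputs, x = [], x0
        for _ in range(horizon):
            x = A @ x              # hidden-state transition
            outputs.append(C @ x)  # observation model
        return np.stack(outputs)
    ```

    In the paper's setting, the low-rank structure is not imposed by construction as here but recovered from data through regularization penalties on A.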

  4. A framework for extracting and representing project knowledge contexts using topic models and dynamic knowledge maps

    NASA Astrophysics Data System (ADS)

    Xu, Jin; Li, Zheng; Li, Shuliang; Zhang, Yanyan

    2015-07-01

    There is still a lack of effective paradigms and tools for analysing and discovering the contents and relationships of project knowledge contexts in the field of project management. In this paper, a new framework for extracting and representing project knowledge contexts using topic models and dynamic knowledge maps under big data environments is proposed and developed. The conceptual paradigm, theoretical underpinning, extended topic model, and illustration examples of the ontology model for project knowledge maps are presented, with further research work envisaged.

  5. Quantifying and reducing model-form uncertainties in Reynolds-averaged Navier–Stokes simulations: A data-driven, physics-informed Bayesian approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xiao, H., E-mail: hengxiao@vt.edu; Wu, J.-L.; Wang, J.-X.

    Despite their well-known limitations, Reynolds-Averaged Navier–Stokes (RANS) models are still the workhorse tools for turbulent flow simulations in today's engineering analysis, design and optimization. While the predictive capability of RANS models depends on many factors, for many practical flows the turbulence models are by far the largest source of uncertainty. As RANS models are used in the design and safety evaluation of many mission-critical systems such as airplanes and nuclear power plants, quantifying their model-form uncertainties has significant implications in enabling risk-informed decision-making. In this work we develop a data-driven, physics-informed Bayesian framework for quantifying model-form uncertainties in RANS simulations. Uncertainties are introduced directly to the Reynolds stresses and are represented with compact parameterization accounting for empirical prior knowledge and physical constraints (e.g., realizability, smoothness, and symmetry). An iterative ensemble Kalman method is used to assimilate the prior knowledge and observation data in a Bayesian framework, and to propagate them to posterior distributions of velocities and other Quantities of Interest (QoIs). We use two representative cases, the flow over periodic hills and the flow in a square duct, to evaluate the performance of the proposed framework. Both cases are challenging for standard RANS turbulence models. Simulation results suggest that, even with very sparse observations, the obtained posterior mean velocities and other QoIs have significantly better agreement with the benchmark data compared to the baseline results. At most locations the posterior distribution adequately captures the true model error within the developed model form uncertainty bounds. The framework is a major improvement over existing black-box, physics-neutral methods for model-form uncertainty quantification, where prior knowledge and details of the models are not exploited. 
This approach has potential implications in many fields in which the governing equations are well understood but the model uncertainty comes from unresolved physical processes. Highlights: • Proposed a physics-informed framework to quantify uncertainty in RANS simulations. • Framework incorporates physical prior knowledge and observation data. • Based on a rigorous Bayesian framework yet fully utilizes the physical model. • Applicable to many complex physical systems beyond turbulent flows.
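    The analysis step of an ensemble Kalman method like the one used in the framework can be sketched on a toy problem. The code below implements a standard perturbed-observation EnKF update in its generic textbook form, not the authors' iterative, RANS-specific implementation; the toy "truth" field and dimensions are invented:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    def enkf_update(X, y_obs, H, r_var):
        """One perturbed-observation ensemble Kalman analysis step.

        X: (n_ens, dim) prior ensemble; H: (m, dim) linear observation
        operator; y_obs: (m,) observations; r_var: observation error variance.
        """
        n = X.shape[0]
        Y = X @ H.T                                 # predicted observations
        Xc, Yc = X - X.mean(0), Y - Y.mean(0)
        Cxy = Xc.T @ Yc / (n - 1)                   # state-observation covariance
        Cyy = Yc.T @ Yc / (n - 1) + r_var * np.eye(H.shape[0])
        K = Cxy @ np.linalg.inv(Cyy)                # Kalman gain
        y_pert = y_obs + np.sqrt(r_var) * rng.normal(size=Y.shape)
        return X + (y_pert - Y) @ K.T               # analysis ensemble

    # Toy demo: pull a prior ensemble toward a directly observed "truth" field.
    truth = np.ones(5)
    prior = rng.normal(size=(100, 5))
    post = enkf_update(prior, truth, np.eye(5), r_var=0.01)
    ```

    In the paper's setting the state would be the parameterized Reynolds stress discrepancy, the observation operator a RANS solve, and the step applied iteratively; here everything is linear and applied once to keep the sketch self-contained.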

  6. Dealing with equality and benefit for water allocation in a lake watershed: A Gini-coefficient based stochastic optimization approach

    NASA Astrophysics Data System (ADS)

    Dai, C.; Qin, X. S.; Chen, Y.; Guo, H. C.

    2018-06-01

    A Gini-coefficient based stochastic optimization (GBSO) model was developed by integrating a hydrological model, a water balance model, the Gini coefficient and chance-constrained programming (CCP) into a general multi-objective optimization modeling framework for supporting water resources allocation at the watershed scale. The framework has the advantages of reflecting the conflicting equity and benefit objectives of water allocation, maintaining the water balance of the watershed, and dealing with system uncertainties. GBSO was solved with the Non-dominated Sorting Genetic Algorithm II (NSGA-II) after the parameter uncertainties of the hydrological model had been quantified into a probability distribution of runoff serving as input to the CCP model, and the chance constraints had been converted to their deterministic equivalents. The proposed model was applied to identify Pareto optimal water allocation schemes in the Lake Dianchi watershed, China. The Pareto-front results reflect the tradeoff between system benefit (αSB) and the Gini coefficient (αG) under different significance levels (q) and different drought scenarios, which reveals the conflicting nature of equity and efficiency in water allocation problems. A lower q generally implies a lower risk of violating the system constraints, and a worse drought intensity scenario corresponds to less available water; both lead to a decreased system benefit and a less equitable water allocation scheme. The proposed modeling framework can thus help obtain Pareto optimal schemes under complex conditions and ensure that the proposed water allocation solutions remain effective under drought, with a proper tradeoff between system benefit and water allocation equity.
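    The equity objective in GBSO is measured by the Gini coefficient of the allocation. A minimal sketch of that computation (illustrative only; the paper's objectives additionally involve system benefits and chance constraints) uses the standard sorted-rank formula:

    ```python
    def gini(allocations):
        """Gini coefficient of a water allocation: 0 is perfect equity,
        values approaching 1 indicate allocation concentrated on few users."""
        xs = sorted(allocations)
        n = len(xs)
        total = sum(xs)
        if total == 0:
            return 0.0
        # Rank-weighted form: G = 2*sum(i * x_i) / (n * sum(x)) - (n + 1)/n,
        # with x sorted ascending and ranks i = 1..n.
        weighted = sum((i + 1) * x for i, x in enumerate(xs))
        return 2.0 * weighted / (n * total) - (n + 1.0) / n
    ```

    In a multi-objective run, each candidate allocation vector would be scored both by this coefficient and by its economic benefit, yielding the kind of equity-versus-benefit Pareto front the paper reports.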

  7. A framework for the selection and ensemble development of flood vulnerability models

    NASA Astrophysics Data System (ADS)

    Figueiredo, Rui; Schröter, Kai; Kreibich, Heidi; Martina, Mario

    2017-04-01

    Effective understanding and management of flood risk require comprehensive risk assessment studies that consider not only the hazard component, but also the impacts that the phenomena may have on the built environment, economy and society. This integrated approach has gained importance over recent decades, and with it so has the scientific attention given to flood vulnerability models, which describe the relationships between flood intensity metrics and damage to physical assets and are also known as flood loss models. Despite considerable progress in this field, many challenges persist. Flood damage mechanisms are complex and depend on multiple variables, which can have different degrees of importance depending on the application setting. In addition, the data required for the development and validation of such models tend to be scarce, particularly in data-poor regions. These issues are reflected in the large number of flood vulnerability models available in the literature today, as well as in their high heterogeneity: they are built with different modelling approaches, in different geographic contexts, utilizing different explanatory variables, and with varying levels of complexity. Notwithstanding recent developments in this area, uncertainty remains high, and large disparities exist among models. For these reasons, identifying which model or models, given their properties, are appropriate for a given context is not straightforward. In the present study, we propose a framework that guides the structured selection of flood vulnerability models and enables ranking them according to their suitability for a certain application, based on expert judgement. The approach takes advantage of the current state of the art and the most up-to-date knowledge of flood vulnerability processes. Given the heterogeneity and uncertainty currently present in flood vulnerability models, we propose the use of a model ensemble.
With this in mind, the proposed approach is based on a weighting scheme within a logic-tree framework that enables the generation of such ensembles in a logically consistent manner. We test and discuss the results by applying the framework to the case study of the 2002 floods along the Mulde River in Germany. Applications of individual models and model ensembles are compared and discussed.
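    The weighting idea behind such an ensemble can be sketched as follows: expert-judgement suitability scores act as normalised branch weights of a logic tree, and the ensemble damage estimate is the weighted mean of the individual model predictions. The two depth-damage curves and the weights below are made up for illustration:

    ```python
    def ensemble_damage(depth, models, weights):
        """Weighted ensemble of flood vulnerability (depth-damage) models.

        models:  functions mapping water depth [m] to a damage fraction in [0, 1]
        weights: positive suitability scores from expert judgement; they are
                 normalised here, like branch weights on a logic tree
        """
        total = sum(weights)
        return sum(w / total * m(depth) for w, m in zip(weights, models))

    # Two illustrative (made-up) depth-damage curves
    linear = lambda d: min(d / 4.0, 1.0)            # total loss at 4 m depth
    rootlike = lambda d: min((d / 6.0) ** 0.5, 1.0)

    dmg = ensemble_damage(1.0, [linear, rootlike], [2.0, 1.0])
    ```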

  8. Moving vehicles segmentation based on Gaussian motion model

    NASA Astrophysics Data System (ADS)

    Zhang, Wei; Fang, Xiang Z.; Lin, Wei Y.

    2005-07-01

    Moving object segmentation is a challenging problem in computer vision. This paper focuses on the segmentation of moving vehicles in dynamic scenes. We analyse the psychology of human vision and present a framework for segmenting moving vehicles on the highway. The proposed framework consists of two parts. First, we propose an adaptive background update method in which the background is updated according to changes in illumination conditions, and can therefore adapt to illumination changes sensitively. Second, we construct a Gaussian motion model to segment moving vehicles: the motion vectors of moving pixels are modeled as a Gaussian, and an on-line EM algorithm is used to update the model. The Gaussian distribution of the adaptive model is evaluated to determine which motion vectors result from moving vehicles and which from other moving objects such as waving trees. Finally, the pixels whose motion vectors result from moving vehicles are segmented. Experimental results on several typical scenes show that the proposed model detects moving vehicles correctly and is immune to interference from other moving objects, such as waving trees, and from camera vibration.
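    The core idea of an adaptively updated motion Gaussian can be sketched with a deliberately simplified stand-in: a single Gaussian over motion-vector magnitudes, recursively updated with a fixed learning rate rather than the paper's on-line EM, and a k-sigma gate separating vehicle motion from background jitter. All parameter values are illustrative:

    ```python
    import math

    class OnlineGaussian:
        """Recursively updated Gaussian over motion-vector magnitudes.

        A simplified stand-in for the paper's on-line EM update: each new
        observation x nudges the mean and variance with learning rate rho.
        """
        def __init__(self, mean=0.0, var=1.0, rho=0.05):
            self.mean, self.var, self.rho = mean, var, rho

        def update(self, x):
            d = x - self.mean
            self.mean += self.rho * d
            self.var = (1 - self.rho) * self.var + self.rho * d * d

        def is_vehicle(self, x, k=2.5):
            """Accept x as vehicle motion if it lies within k std devs."""
            return abs(x - self.mean) <= k * math.sqrt(self.var)

    g = OnlineGaussian(mean=5.0, var=1.0)
    for x in [4.5, 5.5] * 100:      # stream of consistent vehicle motion
        g.update(x)
    steady = g.is_vehicle(5.2)      # near the mode: classified as vehicle
    noise = g.is_vehicle(0.3)       # small jitter (e.g. waving trees): rejected
    ```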

  9. Oirat Tones and Break Indices (O-ToBI): Intonational Structure of the Oirat Language

    ERIC Educational Resources Information Center

    Indjieva, Elena

    2009-01-01

    This doctoral dissertation describes intonation patterns in Spoken Oirat (SO) and proposes a model of the intonational structure of Oirat. The proposed prosodic model is represented in the framework of Oirat Tones and Break Indices (O-ToBI), which is based on the design principles of the original English ToBI (Beckman & Ayers 1994; Beckman…

  10. Robust Decision-making Applied to Model Selection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hemez, Francois M.

    2012-08-06

    The scientific and engineering communities are relying more and more on numerical models to simulate increasingly complex phenomena. Selecting a model from among a family of models that meet the simulation requirements presents a challenge to modern-day analysts. To address this concern, a framework anchored in info-gap decision theory is adopted. The framework proposes to select models by examining the trade-offs between prediction accuracy and sensitivity to epistemic uncertainty. The framework is demonstrated on two structural engineering applications by asking the following question: which of several numerical models best approximates the behavior of a structure when the parameters that define each of those models are unknown? One observation is that models that are nominally more accurate are not necessarily more robust, and their accuracy can deteriorate greatly depending upon the assumptions made. It is posited that, as reliance on numerical models increases, establishing robustness will become as important as demonstrating accuracy.
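    The accuracy-robustness trade-off central to info-gap theory can be sketched with toy error curves (both curves and the tolerance are invented for illustration): a model's robustness is the largest uncertainty horizon for which its worst-case error still meets the performance requirement, and a nominally more accurate model can be less robust.

    ```python
    def robustness(worst_case_error, horizons, tolerance):
        """Info-gap robustness: the largest uncertainty horizon h for which
        the model's worst-case prediction error still meets the tolerance.

        worst_case_error: function h -> worst error over all parameter
                          deviations of size <= h (monotone increasing in h)
        """
        feasible = [h for h in horizons if worst_case_error(h) <= tolerance]
        return max(feasible) if feasible else 0.0

    horizons = [i / 10 for i in range(11)]   # candidate horizons 0.0 .. 1.0
    # Nominally accurate but fragile model vs. less accurate but robust one
    fragile = lambda h: 0.01 + 0.50 * h      # error grows quickly with uncertainty
    robust = lambda h: 0.05 + 0.08 * h

    h_fragile = robustness(fragile, horizons, tolerance=0.10)
    h_robust = robustness(robust, horizons, tolerance=0.10)
    ```

    Here the fragile model wins at zero uncertainty (0.01 vs. 0.05 error) yet tolerates a much smaller uncertainty horizon, which is exactly the observation reported in the record.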

  11. MECHANISTIC-BASED DISINFECTION AND DISINFECTION BYPRODUCT MODELS

    EPA Science Inventory

    We propose developing a mechanistic-based numerical model for chlorine decay and regulated DBP (THM and HAA) formation derived from (free) chlorination; the model framework will allow future modifications for other DBPs and chloramination. Predicted chlorine residual and DBP r...

  12. Dynamics of functional failures and recovery in complex road networks

    NASA Astrophysics Data System (ADS)

    Zhan, Xianyuan; Ukkusuri, Satish V.; Rao, P. Suresh C.

    2017-11-01

    We propose a new framework for modeling the evolution of functional failures and recoveries in complex networks, with traffic congestion on road networks as the case study. Unlike conventional approaches, we transform the evolution of functional states into an equivalent dynamic structural process: dual-vertex splitting and coalescing embedded within the original network structure. The proposed model successfully explains traffic congestion and recovery patterns at the city scale based on high-resolution data from two megacities. Numerical analysis shows that certain network structural attributes can amplify or suppress cascading functional failures. Our approach represents a new general framework to model functional failures and recoveries in flow-based networks and allows understanding of the interplay between structure and function for flow-induced failure propagation and recovery.

  13. On the Formulation of Anisotropic-Polyaxial Failure Criteria: A Comparative Study

    NASA Astrophysics Data System (ADS)

    Parisio, Francesco; Laloui, Lyesse

    2018-02-01

    The correct representation of the failure of geomaterials that feature strength anisotropy and polyaxiality is crucial for many applications. In this contribution, we propose and evaluate through a comparative study a generalized framework that covers both features. Polyaxiality of strength is modeled with a modified Van Eekelen approach, while the anisotropy is modeled using a fabric tensor approach of the Pietruszczak and Mroz type. Both approaches share the same philosophy as they can be applied to simpler failure surfaces, allowing great flexibility in model formulation. The new failure surface is tested against experimental data and its performance compared against classical failure criteria commonly used in geomechanics. Our study finds that the global error between predictions and data is generally smaller for the proposed framework compared to other classical approaches.

  14. Service Contract Compliance Management in Business Process Management

    NASA Astrophysics Data System (ADS)

    El Kharbili, Marwane; Pulvermueller, Elke

    Compliance management is a critical concern for corporations, which are required to respect contracts. This concern is particularly relevant in the context of business process management (BPM), as this paradigm is being adopted more widely for designing and building IT systems. Enforcing contractual compliance needs to be modeled at different levels of a BPM framework, which also includes the service layer. In this paper, we discuss requirements and methods for modeling contractual compliance for SOA-supported BPM. We also show how business rule management integrated into an industrial BPM tool allows modeling and processing functional and non-functional property constraints that may be extracted from business process contracts. This work proposes a framework that responds to the identified requirements and an architecture implementing it. Our approach is illustrated by an example.

  15. A new fit-for-purpose model testing framework: Decision Crash Tests

    NASA Astrophysics Data System (ADS)

    Tolson, Bryan; Craig, James

    2016-04-01

    Decision-makers in water resources are often burdened with selecting appropriate multi-million dollar strategies to mitigate the impacts of climate or land use change. Unfortunately, the suitability of existing hydrologic simulation models to accurately inform decision-making is in doubt, because the testing procedures used to evaluate model utility (i.e., model validation) are insufficient. For example, many authors have noted that a sound standard framework for model testing, the Klemeš Crash Tests (KCTs), the classic model validation procedures from Klemeš (1986) that Andréassian et al. (2009) renamed as KCTs, has yet to become common practice in hydrology. Furthermore, Andréassian et al. (2009) claim that the progression of hydrological science requires widespread use of KCTs and the development of new crash tests. Existing simulation (not forecasting) model testing procedures such as KCTs look backwards (checking for consistency between simulations and past observations) rather than forwards (explicitly assessing whether the model is likely to support future decisions). We propose a fundamentally different, forward-looking, decision-oriented hydrologic model testing framework based upon the concept of fit-for-purpose model testing, which we call Decision Crash Tests (DCTs). Key DCT elements are: (i) the model purpose (i.e., the decision the model is meant to support) must be identified so that model outputs can be mapped to management decisions; and (ii) the framework evaluates not just the selected hydrologic model but the entire suite of model-building decisions associated with model discretization, calibration, etc. The framework is constructed to directly and quantitatively evaluate model suitability. The DCT framework is applied to a model building case study on the Grand River in Ontario, Canada.
A hypothetical binary decision scenario is analysed (upgrade or do not upgrade the existing flood control structure) under two different sets of model building decisions. In one case, we show that the set of model building decisions has a low probability of correctly supporting the upgrade decision. In the other, we show evidence suggesting that another set of model building decisions has a high probability of correctly supporting the decision. The proposed DCT framework focuses on what model users typically care about: the management decision in question. The DCT framework will often be very strict and will produce easy-to-interpret results enabling clear unsuitability determinations. In the past, hydrologic modelling progress has necessarily meant new models and model building methods. Continued progress in hydrologic modelling requires clear evidence to motivate researchers to discard unproductive models and methods, and the DCT framework is built to produce this kind of evidence. References: Andréassian, V., C. Perrin, L. Berthet, N. Le Moine, J. Lerat, C. Loumagne, L. Oudin, T. Mathevet, M.-H. Ramos, and A. Valéry (2009), Crash tests for a standardized evaluation of hydrological models. Hydrology and Earth System Sciences, 13, 1757-1764. Klemeš, V. (1986), Operational testing of hydrological simulation models. Hydrological Sciences Journal, 31 (1), 13-24.

  16. A Bayesian Framework for Estimating the Concordance Correlation Coefficient Using Skew-elliptical Distributions.

    PubMed

    Feng, Dai; Baumgartner, Richard; Svetnik, Vladimir

    2018-04-05

    The concordance correlation coefficient (CCC) is a widely used scaled index in the study of agreement. In this article, we propose estimating the CCC in a unified Bayesian framework that can (1) accommodate symmetric or asymmetric and light- or heavy-tailed data; (2) select a model from several candidates; and (3) address other issues frequently encountered in practice, such as confounding covariates and missing data. The performance of the proposal was studied and demonstrated using simulated as well as real-life biomarker data from a clinical study of an insomnia drug. The implementation of the proposal is accessible through a package in the Comprehensive R Archive Network.
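    For reference, the classical sample CCC (Lin's index) that the record's Bayesian framework estimates is computed as follows; this sketch shows only the standard moment-based formula, not the skew-elliptical Bayesian estimation proposed in the paper:

    ```python
    def ccc(x, y):
        """Lin's concordance correlation coefficient between paired readings:
        2*cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))**2)."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        sx = sum((v - mx) ** 2 for v in x) / n
        sy = sum((v - my) ** 2 for v in y) / n
        sxy = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
        return 2 * sxy / (sx + sy + (mx - my) ** 2)

    perfect = ccc([1.0, 2.0, 3.0], [1.0, 2.0, 3.0])   # exact agreement -> 1.0
    biased = ccc([1.0, 2.0, 3.0], [2.0, 3.0, 4.0])    # same shape, shifted up
    ```

    Unlike the Pearson correlation, the CCC penalises the constant shift in the second pair, which is why it is preferred as an agreement index.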

  17. Modeling Forest Biomass and Growth: Coupling Long-Term Inventory and Lidar Data

    NASA Technical Reports Server (NTRS)

    Babcock, Chad; Finley, Andrew O.; Cook, Bruce D.; Weiskittel, Andrew; Woodall, Christopher W.

    2016-01-01

    Combining spatially-explicit long-term forest inventory and remotely sensed information from Light Detection and Ranging (LiDAR) datasets through statistical models can be a powerful tool for predicting and mapping above-ground biomass (AGB) at a range of geographic scales. We present and examine a novel modeling approach to improve prediction of AGB and estimate AGB growth using LiDAR data. The proposed model accommodates temporal misalignment between field measurements and remotely sensed data (a problem pervasive in such settings) by including multiple time-indexed measurements at plot locations to estimate AGB growth. We pursue a Bayesian modeling framework that allows for appropriately complex parameter associations and uncertainty propagation through to prediction. Specifically, we identify a space-varying coefficients model to predict and map AGB and its associated growth simultaneously. The proposed model is assessed using LiDAR data acquired from NASA Goddard's LiDAR, Hyper-spectral & Thermal imager and field inventory data from the Penobscot Experimental Forest in Bradley, Maine. The proposed model outperformed the time-invariant counterpart models in predictive performance as indicated by a substantial reduction in root mean squared error. The proposed model adequately accounts for temporal misalignment through the estimation of forest AGB growth and accommodates residual spatial dependence. Results from this analysis suggest that future AGB models informed using remotely sensed data, such as LiDAR, may be improved by adapting traditional modeling frameworks to account for temporal misalignment and spatial dependence using random effects.

  18. Connected word recognition using a cascaded neuro-computational model

    NASA Astrophysics Data System (ADS)

    Hoya, Tetsuya; van Leeuwen, Cees

    2016-10-01

    We propose a novel framework for processing a continuous speech stream that contains a varying number of words, as well as non-speech periods. Speech samples are segmented into word-tokens and non-speech periods. An augmented version of an earlier-proposed, cascaded neuro-computational model is used for recognising individual words within the stream. Simulation studies using both a multi-speaker-dependent and speaker-independent digit string database show that the proposed method yields a recognition performance comparable to that obtained by a benchmark approach using hidden Markov models with embedded training.

  19. Model-Based Policymaking: A Framework to Promote Ethical “Good Practice” in Mathematical Modeling for Public Health Policymaking

    PubMed Central

    Boden, Lisa A.; McKendrick, Iain J.

    2017-01-01

    Mathematical models are increasingly relied upon as decision support tools, which estimate risks and generate recommendations to underpin public health policies. However, there are no formal agreements about what constitutes professional competencies or duties in mathematical modeling for public health. In this article, we propose a framework to evaluate whether mathematical models that assess human and animal disease risks and control strategies meet standards consistent with ethical “good practice” and are thus “fit for purpose” as evidence in support of policy. This framework is derived from principles of biomedical ethics: independence, transparency (autonomy), beneficence/non-maleficence, and justice. We identify ethical risks associated with model development and implementation and consider the extent to which scientists are accountable for the translation and communication of model results to policymakers so that the strengths and weaknesses of the scientific evidence base and any socioeconomic and ethical impacts of biased or uncertain predictions are clearly understood. We propose principles to operationalize a framework for ethically sound model development and risk communication between scientists and policymakers. These include the creation of science–policy partnerships to mutually define policy questions and communicate results; development of harmonized international standards for model development; and data stewardship and improvement of the traceability and transparency of models via a searchable archive of policy-relevant models. Finally, we suggest that bespoke ethical advisory groups, with relevant expertise and access to these resources, would be beneficial as a bridge between science and policy, advising modelers of potential ethical risks and providing overview of the translation of modeling advice into policy. PMID:28424768

  20. Visual Hybrid Development Learning System (VHDLS) framework for children with autism.

    PubMed

    Banire, Bilikis; Jomhari, Nazean; Ahmad, Rodina

    2015-10-01

    Education serves as a relative cure for the deficits of children with autism. As a result, compared with typical children, special techniques are required to gain their attention and interest in learning. Several studies have shown that these children are visual learners. In this study, we propose a Visual Hybrid Development Learning System (VHDLS) framework based on an instructional design model, multimedia cognitive learning theory, and learning styles, in order to guide software developers in developing learning systems for children with autism. The results of this study showed that the attention of children with autism increased more with the proposed VHDLS framework.

  1. A causal analysis framework for land-use change and the potential role of bioenergy policy

    DOE PAGES

    Efroymson, Rebecca A.; Kline, Keith L.; Angelsen, Arild; ...

    2016-10-05

    Here we propose a causal analysis framework to increase the reliability of land-use change (LUC) models and the accuracy of net greenhouse gas (GHG) emissions calculations for biofuels. The health-sciences-inspired framework is used here to determine probable causes of LUC, with an emphasis on bioenergy and deforestation. Calculations of net GHG emissions for LUC are critical in determining whether a fuel qualifies as a biofuel or advanced biofuel category under national (U.S., U.K.), state (California), and European Union regulations. Biofuel policymakers and scientists continue to discuss whether presumed indirect land-use change (ILUC) estimates, which often involve deforestation, should be included in GHG accounting for biofuel pathways. Current estimates of ILUC for bioenergy rely largely on economic simulation models that focus on causal pathways involving global commodity trade and use coarse land cover data with simple land classification systems. ILUC estimates are highly uncertain, partly because changes are not clearly defined and key causal links are not sufficiently included in the models. The proposed causal analysis framework begins with a definition of the change that has occurred and proceeds to a strength-of-evidence approach based on types of epidemiological evidence including plausibility of the relationship, completeness of the causal pathway, spatial co-occurrence, time order, analogous agents, simulation model results, and quantitative agent response relationships. Lastly, we discuss how LUC may be allocated among probable causes for policy purposes and how the application of the framework has the potential to increase the validity of LUC models and resolve ILUC and biofuel controversies.

  2. A causal analysis framework for land-use change and the potential role of bioenergy policy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Efroymson, Rebecca A.; Kline, Keith L.; Angelsen, Arild

    Here we propose a causal analysis framework to increase the reliability of land-use change (LUC) models and the accuracy of net greenhouse gas (GHG) emissions calculations for biofuels. The health-sciences-inspired framework is used here to determine probable causes of LUC, with an emphasis on bioenergy and deforestation. Calculations of net GHG emissions for LUC are critical in determining whether a fuel qualifies as a biofuel or advanced biofuel category under national (U.S., U.K.), state (California), and European Union regulations. Biofuel policymakers and scientists continue to discuss whether presumed indirect land-use change (ILUC) estimates, which often involve deforestation, should be included in GHG accounting for biofuel pathways. Current estimates of ILUC for bioenergy rely largely on economic simulation models that focus on causal pathways involving global commodity trade and use coarse land cover data with simple land classification systems. ILUC estimates are highly uncertain, partly because changes are not clearly defined and key causal links are not sufficiently included in the models. The proposed causal analysis framework begins with a definition of the change that has occurred and proceeds to a strength-of-evidence approach based on types of epidemiological evidence including plausibility of the relationship, completeness of the causal pathway, spatial co-occurrence, time order, analogous agents, simulation model results, and quantitative agent response relationships. Lastly, we discuss how LUC may be allocated among probable causes for policy purposes and how the application of the framework has the potential to increase the validity of LUC models and resolve ILUC and biofuel controversies.

  3. Theoretical Models and Operational Frameworks in Public Health Ethics

    PubMed Central

    Petrini, Carlo

    2010-01-01

    The article is divided into three sections: (i) an overview of the main ethical models in public health (theoretical foundations); (ii) a summary of several published frameworks for public health ethics (practical frameworks); and (iii) a few general remarks. Rather than maintaining the superiority of one position over the others, the main aim of the article is to summarize the basic approaches proposed thus far concerning the development of public health ethics by describing and comparing the various ideas in the literature. With this in mind, an extensive list of references is provided. PMID:20195441

  4. Frequentist Model Averaging in Structural Equation Modelling.

    PubMed

    Jin, Shaobo; Ankargren, Sebastian

    2018-06-04

    Model selection from a set of candidate models plays an important role in many structural equation modelling applications. However, traditional model selection methods introduce extra randomness that is not accounted for by post-model selection inference. In the current study, we propose a model averaging technique within the frequentist statistical framework. Instead of selecting an optimal model, the contributions of all candidate models are acknowledged. Valid confidence intervals and a [Formula: see text] test statistic are proposed. A simulation study shows that the proposed method is able to produce a robust mean-squared error, a better coverage probability, and a better goodness-of-fit test compared to model selection. It is an interesting compromise between model selection and the full model.
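    The averaging idea can be sketched as a weighted combination of the candidate models' estimates. The smoothed-AIC weights below are a common illustration of frequentist model averaging, not the specific weighting scheme of this paper, which additionally provides valid post-averaging confidence intervals; all numbers are invented:

    ```python
    import math

    def averaged_estimate(estimates, aics):
        """Model-averaged parameter estimate with smoothed AIC weights:
        w_k proportional to exp(-0.5 * (AIC_k - AIC_min))."""
        best = min(aics)
        raw = [math.exp(-0.5 * (a - best)) for a in aics]
        total = sum(raw)
        weights = [r / total for r in raw]
        return sum(w * e for w, e in zip(weights, estimates)), weights

    # Three candidate SEMs giving different estimates of one path coefficient
    est, w = averaged_estimate([0.40, 0.55, 0.52], [100.0, 101.0, 108.0])
    ```

    Rather than committing to the single best-AIC model (estimate 0.40), the averaged estimate acknowledges every candidate in proportion to its support.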

  5. Examining Prediction Models of Giving up within a Resource-Based Framework of Coping in Primary School Students with and without Learning Disabilities

    ERIC Educational Resources Information Center

    Skues, Jason L.; Cunningham, Everarda G.; Theiler, Stephen S.

    2016-01-01

    This study tests a proposed model of coping outcomes for 290 primary school students in Years 5 and 6 (mean age = 11.50 years) with and without learning disabilities (LDs) within a resource-based framework of coping. Group-administered educational and intelligence tests were used to screen students for LDs. Students also completed a questionnaire…

  6. Architectural frameworks: defining the structures for implementing learning health systems.

    PubMed

    Lessard, Lysanne; Michalowski, Wojtek; Fung-Kee-Fung, Michael; Jones, Lori; Grudniewicz, Agnes

    2017-06-23

    The vision of transforming health systems into learning health systems (LHSs) that rapidly and continuously transform knowledge into improved health outcomes at lower cost is generating increased interest in government agencies, health organizations, and health research communities. While existing initiatives demonstrate that different approaches can succeed in making the LHS vision a reality, they are too varied in their goals, focus, and scale to be reproduced without undue effort. Indeed, the structures necessary to effectively design and implement LHSs on a larger scale are lacking. In this paper, we propose the use of architectural frameworks to develop LHSs that adhere to a recognized vision while being adapted to their specific organizational context. Architectural frameworks are high-level descriptions of an organization as a system; they capture the structure of its main components at varied levels, the interrelationships among these components, and the principles that guide their evolution. Because these frameworks support the analysis of LHSs and allow their outcomes to be simulated, they act as pre-implementation decision-support tools that identify potential barriers and enablers of system development. They thus increase the chances of successful LHS deployment. We present an architectural framework for LHSs that incorporates five dimensions (goals, scientific, social, technical, and ethical) commonly found in the LHS literature. The proposed architectural framework is comprised of six decision layers that model these dimensions. The performance layer models goals, the scientific layer models the scientific dimension, the organizational layer models the social dimension, the data layer and information technology layer model the technical dimension, and the ethics and security layer models the ethical dimension. We describe the types of decisions that must be made within each layer and identify methods to support decision-making.
In this paper, we outline a high-level architectural framework grounded in conceptual and empirical LHS literature. Applying this architectural framework can guide the development and implementation of new LHSs and the evolution of existing ones, as it allows for clear and critical understanding of the types of decisions that underlie LHS operations. Further research is required to assess and refine its generalizability and methods.

  7. A general framework to test gravity using galaxy clusters - I. Modelling the dynamical mass of haloes in f(R) gravity

    NASA Astrophysics Data System (ADS)

    Mitchell, Myles A.; He, Jian-hua; Arnold, Christian; Li, Baojiu

    2018-06-01

    We propose a new framework for testing gravity using cluster observations, which aims to provide an unbiased constraint on modified gravity models from Sunyaev-Zel'dovich (SZ) and X-ray cluster counts and the cluster gas fraction, among other possible observables. Focusing on a popular f(R) model of gravity, we propose a novel procedure to recalibrate mass scaling relations from Λ cold dark matter (ΛCDM) to f(R) gravity for SZ and X-ray cluster observables. We find that the complicated modified gravity effects can be simply modelled as a dependence on a combination of the background scalar field and redshift, fR(z)/(1 + z), regardless of the f(R) model parameter. By employing a large suite of N-body simulations, we demonstrate that a theoretically derived tanh fitting formula is in excellent agreement with the dynamical mass enhancement of dark matter haloes for a large range of background field parameters and redshifts. Our framework is sufficiently flexible to allow for tests of other models and inclusion of further observables, and the one-parameter description of the dynamical mass enhancement can have important implications on the theoretical modelling of observables and on practical tests of gravity.
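    The shape of such a tanh fitting formula can be sketched schematically (p1 and p2 here are generic fit parameters, not the paper's calibrated values; consult the paper for the exact form). In fully unscreened f(R) regions gravity is enhanced by a factor of 4/3, so the dynamical mass of a low-mass, unscreened halo approaches 4/3 of its true mass, while massive, screened haloes recover the general-relativistic value:

    ```latex
    \frac{M_{\mathrm{dyn}}}{M_{\mathrm{true}}}
      = 1 + \frac{1}{6}\left[1 - \tanh\!\left(p_1\left(\log_{10} M_{\mathrm{true}} - p_2\right)\right)\right]
    ```

    The limits check out: as the halo mass falls well below the screening threshold the tanh tends to -1 and the ratio tends to 4/3, while far above it the tanh tends to +1 and the ratio tends to 1. In the paper's one-parameter description, the screening threshold (here p2) depends on the combination fR(z)/(1 + z) noted in the abstract.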

  8. High-Performance Data Analytics Beyond the Relational and Graph Data Models with GEMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Castellana, Vito G.; Minutoli, Marco; Bhatt, Shreyansh

    Graphs represent an increasingly popular data model for data analytics, since they can naturally represent relationships and interactions between entities. Relational databases and their pure table-based data model are not well suited to storing and processing sparse data. Consequently, graph databases have gained interest in the last few years, and the Resource Description Framework (RDF) became the standard data model for graph data. Nevertheless, while RDF is well suited to analyzing the relationships between entities, it is not efficient in representing their attributes and properties. In this work we propose the adoption of a new hybrid data model, based on attributed graphs, that aims at overcoming the limitations of the pure relational and graph data models. We present how we have re-designed the GEMS data-analytics framework to fully take advantage of the proposed hybrid data model. To improve analyst productivity, in addition to a C++ API for application development, we adopt GraQL as the input query language. We validate our approach by implementing a set of queries on net-flow data, and we compare our framework's performance against Neo4j. Experimental results show significant performance improvements over Neo4j, up to several orders of magnitude as the size of the input data increases.

  9. Toward a unified approach to dose-response modeling in ecotoxicology.

    PubMed

    Ritz, Christian

    2010-01-01

    This study reviews dose-response models that are used in ecotoxicology. The focus lies on clarification of differences and similarities between models, and as a side effect, their different guises in ecotoxicology are unravelled. A look at frequently used dose-response models reveals major discrepancies, among other things in naming conventions. Therefore, there is a need for a unified view on dose-response modeling in order to improve the understanding of it and to facilitate communication and comparison of findings across studies, thus realizing its full potential. This study attempts to establish a general framework that encompasses most dose-response models that are of interest to ecotoxicologists in practice. The framework includes commonly used models such as the log-logistic and Weibull models, but also features entire suites of models as found in various guidance documents. An outline on how the proposed framework can be implemented in statistical software systems is also provided.
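    One of the commonly used models the record mentions, the four-parameter log-logistic curve, can be written out directly (this parameterisation follows the form popularised in Ritz's ecotoxicology work; the parameter values below are illustrative):

    ```python
    import math

    def ll4(x, b, c, d, e):
        """Four-parameter log-logistic dose-response curve:
        lower limit c, upper limit d, ED50 e, and slope parameter b."""
        return c + (d - c) / (1 + math.exp(b * (math.log(x) - math.log(e))))

    # At the ED50 (x == e) the response is exactly halfway between the limits:
    halfway = ll4(2.0, b=1.5, c=0.1, d=0.9, e=2.0)   # -> (0.1 + 0.9) / 2
    ```

    With b > 0 the curve decreases with dose, which is the usual orientation for toxicity endpoints; the Weibull models mentioned in the abstract differ only in the link function applied to log-dose.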

  10. Quantum Gravity and Cosmology: an intimate interplay

    NASA Astrophysics Data System (ADS)

    Sakellariadou, Mairi

    2017-08-01

    I will briefly discuss three cosmological models built upon three distinct quantum gravity proposals. I will first highlight the cosmological rôle of a vector field in the framework of a string/brane cosmological model. I will then present the resolution of the big bang singularity and the occurrence of an early era of accelerated expansion of a geometric origin, in the framework of group field theory condensate cosmology. I will then summarise results from an extended gravitational model based on non-commutative spectral geometry, a model that offers a purely geometric explanation for the standard model of particle physics.

  11. Robust group-wise rigid registration of point sets using t-mixture model

    NASA Astrophysics Data System (ADS)

    Ravikumar, Nishant; Gooya, Ali; Frangi, Alejandro F.; Taylor, Zeike A.

    2016-03-01

    A probabilistic framework is proposed for robust, group-wise rigid alignment of point sets using a mixture of Student's t-distributions, which is effective especially when the point sets are of varying lengths, are corrupted by an unknown degree of outliers, or contain missing data. Medical images (in particular magnetic resonance (MR) images), their segmentations and consequently point-sets generated from these are highly susceptible to corruption by outliers. This poses a problem for robust correspondence estimation and accurate alignment of shapes, necessary for training statistical shape models (SSMs). To address these issues, this study proposes to use a t-mixture model (TMM) to approximate the underlying joint probability density of a group of similar shapes and align them to a common reference frame. The heavy-tailed nature of t-distributions provides a more robust registration framework in comparison to state-of-the-art algorithms. Significant reduction in alignment errors is achieved in the presence of outliers, using the proposed TMM-based group-wise rigid registration method, in comparison to its Gaussian mixture model (GMM) counterparts. The proposed TMM framework is compared with a group-wise variant of the well-known Coherent Point Drift (CPD) algorithm and two other group-wise methods using GMMs, using both synthetic and real data sets. Rigid alignment errors for groups of shapes are quantified using the Hausdorff distance (HD) and quadratic surface distance (QSD) metrics.

  12. A penalized framework for distributed lag non-linear models.

    PubMed

    Gasparrini, Antonio; Scheipl, Fabian; Armstrong, Ben; Kenward, Michael G

    2017-09-01

    Distributed lag non-linear models (DLNMs) are a modelling tool for describing potentially non-linear and delayed dependencies. Here, we illustrate an extension of the DLNM framework through the use of penalized splines within generalized additive models (GAM). This extension offers built-in model selection procedures and the possibility of accommodating assumptions on the shape of the lag structure through specific penalties. In addition, this framework includes, as special cases, simpler models previously proposed for linear relationships (DLMs). Alternative versions of penalized DLNMs are compared with each other and with the standard unpenalized version in a simulation study. Results show that this penalized extension to the DLNM class provides greater flexibility and improved inferential properties. The framework exploits recent theoretical developments of GAMs and is implemented using efficient routines within freely available software. Real-data applications are illustrated through two reproducible examples in time series and survival analysis. © 2017 The Authors Biometrics published by Wiley Periodicals, Inc. on behalf of International Biometric Society.
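
    The core distributed-lag idea can be sketched in a few lines: the response at time t is a weighted sum of the exposure at lags 0..L. The fixed weights below are an assumption for illustration; the penalized DLNM framework instead estimates a smooth lag curve via penalized splines.

```python
import numpy as np

def distributed_lag_effect(x, weights):
    """Simple distributed lag model sketch: the response at time t is a
    weighted sum of the exposure at lags 0..L.  Weights here are fixed
    and assumed; a DLNM estimates the lag curve from data."""
    x = np.asarray(x, float)
    y = np.zeros(len(x))
    for t in range(len(x)):
        for l, w in enumerate(weights):
            if t - l >= 0:
                y[t] += w * x[t - l]
    return y

exposure = [0, 0, 10, 0, 0, 0]      # a single exposure pulse at t=2
lag_weights = [0.5, 0.3, 0.2]       # effect decays over two lags
print(distributed_lag_effect(exposure, lag_weights))
```

    Replacing the fixed weight vector with a spline basis in both the exposure and lag dimensions, plus penalties on the basis coefficients, yields the penalized DLNM described in the abstract.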

  13. Multisensor satellite data for water quality analysis and water pollution risk assessment: decision making under deep uncertainty with fuzzy algorithm in framework of multimodel approach

    NASA Astrophysics Data System (ADS)

    Kostyuchenko, Yuriy V.; Sztoyka, Yulia; Kopachevsky, Ivan; Artemenko, Igor; Yuschenko, Maxim

    2017-10-01

    A multi-model approach for remote sensing data processing and interpretation is described. The problem of satellite data utilization in a multi-modeling approach for socio-ecological risk assessment is formally defined. A method for utilizing observation, measurement and modeling data in the framework of the multi-model approach is described. The methodology and models of risk assessment in the framework of a decision support approach are defined and described. A method of water quality assessment using satellite observation data is described. The method is based on analysis of the spectral reflectance of aquifers. Spectral signatures of freshwater bodies and offshores are analyzed. Correlations between spectral reflectance, pollution and selected water quality parameters are analyzed and quantified. Data from the MODIS, MISR, AIRS and Landsat sensors received in 2002-2014 have been utilized and verified by in-field spectrometry and lab measurements. A fuzzy logic based approach for decision support in the field of water quality degradation risk is discussed. The decision on the water quality category is made by a fuzzy algorithm using a limited set of uncertain parameters. Data from satellite observations, field measurements and modeling are utilized in the framework of the proposed approach. It is shown that this algorithm allows estimating the water quality degradation rate and pollution risks. Problems of constructing spatial and temporal distributions of the calculated parameters, as well as the problem of data regularization, are discussed. Using the proposed approach, maps of surface water pollution risk from point and diffuse sources are calculated and discussed.
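
    A fuzzy decision rule of the kind described can be sketched as follows: each category gets a membership function over a pollution index, and the category with the highest membership wins. The triangular functions, category names, and index scaling below are all hypothetical illustrations, not the authors' calibrated system.

```python
def tri(x, a, b, c):
    """Triangular fuzzy membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def water_quality_category(index):
    """Pick the category with maximum membership for a normalized
    pollution index in [0, 1].  All functions/labels are assumed."""
    memberships = {
        "good":     tri(index, -0.1, 0.0, 0.4),
        "moderate": tri(index, 0.2, 0.5, 0.8),
        "poor":     tri(index, 0.6, 1.0, 1.1),
    }
    return max(memberships, key=memberships.get)

print(water_quality_category(0.15))
```

    In the full approach, the index itself would be derived from spectral reflectance correlations, and the membership functions tuned against in-field measurements.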

  14. A Customizable Language Learning Support System Using Ontology-Driven Engine

    ERIC Educational Resources Information Center

    Wang, Jingyun; Mendori, Takahiko; Xiong, Juan

    2013-01-01

    This paper proposes a framework for web-based language learning support systems designed to provide customizable pedagogical procedures based on the analysis of characteristics of both learner and course. This framework employs a course-centered ontology and a teaching method ontology as the foundation for the student model, which includes learner…

  15. Evaluation in Cross-Cultural Contexts: Proposing a Framework for International Education and Training Project Evaluations.

    ERIC Educational Resources Information Center

    bin Yahya, Ismail; And Others

    This paper focuses on the need for increased sensitivity and responsiveness in international education and training project evaluations, particularly those in Third World countries. A conceptual-theoretical framework for designing and developing models appropriate for evaluating education and training projects in non-Western cultures is presented.…

  16. A general theoretical framework for decoherence in open and closed systems

    NASA Astrophysics Data System (ADS)

    Castagnino, Mario; Fortin, Sebastian; Laura, Roberto; Lombardi, Olimpia

    2008-08-01

    A general theoretical framework for decoherence is proposed, which encompasses formalisms originally devised to deal just with open or closed systems. The conditions for decoherence are clearly stated and the relaxation and decoherence times are compared. Finally, the spin-bath model is developed in detail from the new perspective.

  17. Transnational Corporations and Strategic Challenges: An Analysis of Knowledge Flows and Competitive Advantage

    ERIC Educational Resources Information Center

    de Pablos, Patricia Ordonez

    2006-01-01

    Purpose: The purpose of this paper is to analyse knowledge transfers in transnational corporations. Design/methodology/approach: The paper develops a conceptual framework for the analysis of knowledge flow transfers in transnationals. Based on this theoretical framework, the paper proposes research hypotheses and builds a causal model that links…

  18. Reengineering Framework for Systems in Education

    ERIC Educational Resources Information Center

    Choquet, Christophe; Corbiere, Alain

    2006-01-01

    Specifications recently proposed as standards in the domain of Technology Enhanced Learning (TEL), question the designers of TEL systems on how to put them into practice. Recent studies in Model Driven Engineering have highlighted the need for a framework which could formalize the use of these specifications as well as enhance the quality of the…

  19. An Organizational Framework for Understanding the Role of Culture in Counseling.

    ERIC Educational Resources Information Center

    Ponterotto, Joseph G.; Benesch, Kevin F.

    1988-01-01

    Proposes a direction for cross-cultural training and research different from the traditional ones which emphasize differences between groups and the need to develop culture-specific techniques. Delineates a conceptual model which provides an integrative framework for understanding the role of culture in counseling. Draws on work in transpersonal…

  20. Deep Understanding of Electromagnetism Using Crosscutting Concepts

    ERIC Educational Resources Information Center

    De Poorter, John; De Lange, Jan; Devoldere, Lies; Van Landeghem, Jouri; Strubbe, Katrien

    2017-01-01

    Crosscutting concepts like patterns and models are fundamental parts in both the American framework of science education (from the AAAS) and our proposals for a new science education framework in Flanders. These concepts deepen the insight of both students and teachers. They help students to ask relevant questions during an inquiry and they give…

  1. A Framework for Studying Organizational Innovation in Research Libraries

    ERIC Educational Resources Information Center

    Jantz, Ronald C.

    2012-01-01

    The objective of this paper is two-fold: to propose a theoretical framework and model for studying organizational innovation in research libraries and to set forth propositions that can provide directions for future empirical studies of innovation in research libraries. Research libraries can be considered members of a class of organizations…

  2. A multi-objective framework to predict flows of ungauged rivers within regions of sparse hydrometeorologic observation

    NASA Astrophysics Data System (ADS)

    Alipour, M.; Kibler, K. M.

    2017-12-01

    Despite advances in flow prediction, managers of ungauged rivers located within broad regions of sparse hydrometeorologic observation still lack prescriptive methods robust to the data challenges of such regions. We propose a multi-objective streamflow prediction framework for regions of minimum observation to select models that balance runoff efficiency with choice of accurate parameter values. We supplement sparse observed data with uncertain or low-resolution information incorporated as `soft' a priori parameter estimates. The performance of the proposed framework is tested against traditional single-objective and constrained single-objective calibrations in two catchments in a remote area of southwestern China. We find that the multi-objective approach performs well with respect to runoff efficiency in both catchments (NSE = 0.74 and 0.72), within the range of efficiencies returned by other models (NSE = 0.67 - 0.78). However, soil moisture capacity estimated by the multi-objective model resonates with a priori estimates (parameter residuals of 61 cm versus 289 and 518 cm for maximum soil moisture capacity in one catchment, and 20 cm versus 246 and 475 cm in the other; parameter residuals of 0.48 versus 0.65 and 0.7 for soil moisture distribution shape factor in one catchment, and 0.91 versus 0.79 and 1.24 in the other). Thus, optimization to a multi-criteria objective function led to very different representations of soil moisture capacity as compared to models selected by single-objective calibration, without compromising runoff efficiency. These different soil moisture representations may translate into considerably different hydrological behaviors. The proposed approach thus offers a preliminary step towards greater process understanding in regions of severe data limitations. 
For instance, the multi-objective framework may be an adept tool to discern between models of similar efficiency to select models that provide the "right answers for the right reasons". Managers may feel more confident to utilize such models to predict flows in fully ungauged areas.
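
    The trade-off described above can be sketched as an aggregate objective that balances runoff efficiency (via the Nash-Sutcliffe efficiency, NSE) against departure from soft a priori parameter estimates. The weighting scheme and normalization below are assumptions for illustration, not the authors' exact formulation.

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def multi_objective(obs, sim, theta, theta_prior, w=0.5):
    """Hypothetical aggregate objective (to be minimized): trade off
    runoff efficiency against departure from soft a priori parameter
    estimates, with an assumed weight w."""
    fit_term = 1.0 - nse(obs, sim)                 # 0 means perfect fit
    theta = np.asarray(theta, float)
    prior = np.asarray(theta_prior, float)
    prior_term = np.mean(np.abs((theta - prior) / prior))
    return w * fit_term + (1.0 - w) * prior_term

obs = [1.0, 2.0, 3.0, 2.5]
sim = [1.1, 1.9, 3.2, 2.4]
print(round(multi_objective(obs, sim, [100.0], [90.0]), 3))
```

    Minimizing such a combined objective is what lets two parameter sets with similar NSE be distinguished by their consistency with prior knowledge.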

  3. Where's the reef: the role of framework in the Holocene

    USGS Publications Warehouse

    Hubbard, D.K.; Burke, R.B.; Gill, I.P.

    1998-01-01

    Holocene reef models generally emphasize the role of in-place and interlocking framework in the creation of a rigid structure that rises above its surroundings. By extension, a number of ancient biohermal deposits have been disqualified as 'true reefs' owing to their lack of recognizable framework. Fifty-four cores from several eastern Caribbean sites clearly demonstrate that in-place and interlocking framework is not common in these reefs, which are comprised of varying mixtures of recognizable coral (primary framework), loose sediment/rubble and secondary framework made up mostly of coralgal fragments bound together by submarine cementation and biological encrustation. Recovery of primary and secondary framework ranged from 22% (avg.) in branching-coral facies to 33% in intervals dominated by head corals. Accretion rate decreases as expected with water depth. However, the recovery of recognizable coral generally increased with water depth, inversely to presumed coral-growth rates. This pattern reflects a spectrum in the relative importance of coral growth (primary construction), bioerosion, hydromechanical breakdown and the transport of sediment and detritus. The relative importance of each is controlled by the physical-oceanographic conditions at the site of reef development and will dictate both the architecture and the character of its internal fabric. We do not propose that framework reefs do not exist, as they most assuredly do. However, the fact that so many modern reefs are not dominated by in-place and interlocking framework suggests that its use as the primary determinant of ancient reefs may be unreasonable. We, therefore, propose the abandonment of framework-based models in favor of those that treat framework generation, physical/biological degradation, sedimentation, and encrustation as equal partners in the development of modern and ancient reefs alike.

  4. A Modeling Framework for Optimal Computational Resource Allocation Estimation: Considering the Trade-offs between Physical Resolutions, Uncertainty and Computational Costs

    NASA Astrophysics Data System (ADS)

    Moslehi, M.; de Barros, F.; Rajagopal, R.

    2014-12-01

    Hydrogeological models that represent flow and transport in subsurface domains are usually large-scale with excessive computational complexity and uncertain characteristics. Uncertainty quantification for predicting flow and transport in heterogeneous formations often entails utilizing a numerical Monte Carlo framework, which repeatedly simulates the model according to a random field representing hydrogeological characteristics of the field. The physical resolution (e.g. grid resolution associated with the physical space) for the simulation is customarily chosen based on recommendations in the literature, independent of the number of Monte Carlo realizations. This practice may lead to either excessive computational burden or inaccurate solutions. We propose an optimization-based methodology that considers the trade-off between the following conflicting objectives: time associated with computational costs, statistical convergence of the model predictions and physical errors corresponding to numerical grid resolution. In this research, we optimally allocate computational resources by developing a modeling framework for the overall error based on a joint statistical and numerical analysis and optimizing the error model subject to a given computational constraint. The derived expression for the overall error explicitly takes into account the joint dependence between the discretization error of the physical space and the statistical error associated with Monte Carlo realizations. The accuracy of the proposed framework is verified in this study by applying it to several computationally extensive examples. Having this framework at hand helps hydrogeologists achieve the optimum physical and statistical resolutions that minimize the error within a given computational budget. Moreover, the influence of the available computational resources and the geometric properties of the contaminant source zone on the optimum resolutions are investigated. 
We conclude that the computational cost associated with optimal allocation can be substantially reduced compared with prevalent recommendations in the literature.
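
    The allocation problem can be sketched with an assumed overall-error model: discretization error scaling like C1·h^p plus Monte Carlo statistical error scaling like C2/√N, minimized subject to a fixed budget where the cost per realization grows as h^(-d). The constants, exponents, and grid search below are illustrative assumptions, not the derived expression from the study.

```python
import numpy as np

# Assumed overall-error model: discretization error ~ C1*h**p,
# Monte Carlo statistical error ~ C2/sqrt(N).  Constants are illustrative.
C1, C2, p, d = 1.0, 1.0, 2.0, 3.0
budget = 1e6          # total cost units; cost per realization ~ h**-d

def total_error(h, N):
    return C1 * h**p + C2 / np.sqrt(N)

best = None
for h in np.logspace(-2, 0, 50):     # candidate grid spacings
    N = int(budget * h**d)           # realizations affordable at this h
    if N < 2:
        continue
    err = total_error(h, N)
    if best is None or err < best[0]:
        best = (err, h, N)

err, h_opt, N_opt = best
print(f"optimal h = {h_opt:.3f}, N = {N_opt}")
```

    The key point the sketch captures is the coupling: refining the grid (smaller h) reduces discretization error but leaves budget for fewer realizations, inflating the statistical error.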

  5. Automated numerical simulation of biological pattern formation based on visual feedback simulation framework

    PubMed Central

    Sun, Mingzhu; Xu, Hui; Zeng, Xingjuan; Zhao, Xin

    2017-01-01

    There are various fascinating phenomena in biological pattern formation. Mathematical modeling using reaction-diffusion partial differential equation systems is employed to study the mechanism of pattern formation. However, model parameter selection is both difficult and time consuming. In this paper, a visual feedback simulation framework is proposed to calculate the parameters of a mathematical model automatically based on the basic principle of feedback control. In the simulation framework, the simulation results are visualized, and the image features are extracted as the system feedback. Then, the unknown model parameters are obtained by comparing the image features of the simulation image and the target biological pattern. Considering two typical applications, the visual feedback simulation framework is applied to fulfill pattern formation simulations for vascular mesenchymal cells and lung development. In the simulation framework, the spot, stripe, labyrinthine patterns of vascular mesenchymal cells, the normal branching pattern and the branching pattern lacking side branching for lung branching are obtained in a finite number of iterations. The simulation results indicate that it is easy to achieve the simulation targets, especially when the simulation patterns are sensitive to the model parameters. Moreover, this simulation framework can expand to other types of biological pattern formation. PMID:28225811
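
    The feedback loop described can be sketched with a stand-in for the reaction-diffusion run: extract a scalar image feature from the simulation, compare it to the target pattern's feature, and adjust the parameter proportionally to the error. The linear stand-in model and the gain value are assumptions; the real framework compares features of full 2-D simulated patterns.

```python
def simulate(param):
    """Stand-in for a reaction-diffusion run: returns a scalar image
    feature (e.g. a stripe count) as an assumed monotone function of
    the model parameter."""
    return 2.0 * param + 1.0

def feedback_tune(target_feature, param=0.0, gain=0.2, tol=1e-3,
                  max_iter=200):
    """Visual-feedback loop: adjust the parameter until the simulated
    feature matches the target (basic proportional feedback control)."""
    for _ in range(max_iter):
        error = target_feature - simulate(param)
        if abs(error) < tol:
            break
        param += gain * error
    return param

print(round(feedback_tune(7.0), 2))
```

    Because the loop only needs the sign and size of the feature mismatch, it works even when the parameter-to-pattern mapping has no closed form, which is exactly the situation with PDE pattern models.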

  6. Automated numerical simulation of biological pattern formation based on visual feedback simulation framework.

    PubMed

    Sun, Mingzhu; Xu, Hui; Zeng, Xingjuan; Zhao, Xin

    2017-01-01

    There are various fascinating phenomena in biological pattern formation. Mathematical modeling using reaction-diffusion partial differential equation systems is employed to study the mechanism of pattern formation. However, model parameter selection is both difficult and time consuming. In this paper, a visual feedback simulation framework is proposed to calculate the parameters of a mathematical model automatically based on the basic principle of feedback control. In the simulation framework, the simulation results are visualized, and the image features are extracted as the system feedback. Then, the unknown model parameters are obtained by comparing the image features of the simulation image and the target biological pattern. Considering two typical applications, the visual feedback simulation framework is applied to fulfill pattern formation simulations for vascular mesenchymal cells and lung development. In the simulation framework, the spot, stripe, labyrinthine patterns of vascular mesenchymal cells, the normal branching pattern and the branching pattern lacking side branching for lung branching are obtained in a finite number of iterations. The simulation results indicate that it is easy to achieve the simulation targets, especially when the simulation patterns are sensitive to the model parameters. Moreover, this simulation framework can expand to other types of biological pattern formation.

  7. Multi-View Multi-Instance Learning Based on Joint Sparse Representation and Multi-View Dictionary Learning.

    PubMed

    Li, Bing; Yuan, Chunfeng; Xiong, Weihua; Hu, Weiming; Peng, Houwen; Ding, Xinmiao; Maybank, Steve

    2017-12-01

    In multi-instance learning (MIL), the relations among instances in a bag convey important contextual information in many applications. Previous studies on MIL either ignore such relations or simply model them with a fixed graph structure so that the overall performance inevitably degrades in complex environments. To address this problem, this paper proposes a novel multi-view multi-instance learning algorithm (M²IL) that combines multiple context structures in a bag into a unified framework. The novel aspects are: (i) we propose a sparse ε-graph model that can generate different graphs with different parameters to represent various context relations in a bag, (ii) we propose a multi-view joint sparse representation that integrates these graphs into a unified framework for bag classification, and (iii) we propose a multi-view dictionary learning algorithm to obtain a multi-view graph dictionary that considers cues from all views simultaneously to improve the discrimination of the M²IL. Experiments and analyses in many practical applications prove the effectiveness of the M²IL.

  8. Multi-sparse dictionary colorization algorithm based on the feature classification and detail enhancement

    NASA Astrophysics Data System (ADS)

    Yan, Dan; Bai, Lianfa; Zhang, Yi; Han, Jing

    2018-02-01

    To address the missing-detail and performance problems of colorization based on sparse representation, we propose a conceptual model framework for colorizing gray-scale images, and then a multi-sparse dictionary colorization algorithm based on feature classification and detail enhancement (CEMDC) is proposed based on this framework. The algorithm can achieve a natural colorized effect for a gray-scale image that is consistent with human vision. First, the algorithm establishes a multi-sparse dictionary classification colorization model. Then, to improve the accuracy rate of the classification, a corresponding local constraint algorithm is proposed. Finally, we propose a detail enhancement based on the Laplacian pyramid, which is effective in solving the problem of missing details and improving the speed of image colorization. In addition, the algorithm not only realizes the colorization of the visual gray-scale image, but also can be applied to other areas, such as color transfer between color images, colorizing gray fusion images, and infrared images.
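
    The Laplacian-pyramid detail-enhancement step can be sketched as follows: decompose the image into band-pass (detail) layers, boost those layers, and reconstruct. This is a minimal sketch using nearest-neighbour resampling in place of proper Gaussian filtering; the pyramid depth and boost factor are assumed values, not the CEMDC settings.

```python
import numpy as np

def downsample(img):
    return img[::2, ::2]

def upsample(img, shape):
    up = np.repeat(np.repeat(img, 2, axis=0), 2, axis=1)
    return up[:shape[0], :shape[1]]

def laplacian_enhance(img, levels=2, boost=1.5):
    """Detail enhancement via a Laplacian pyramid: amplify the
    band-pass layers before reconstruction.  Nearest-neighbour
    resampling stands in for Gaussian filtering in this sketch."""
    gaussians = [img]
    for _ in range(levels):
        gaussians.append(downsample(gaussians[-1]))
    laplacians = [g - upsample(gs, g.shape)
                  for g, gs in zip(gaussians[:-1], gaussians[1:])]
    out = gaussians[-1]
    for lap in reversed(laplacians):
        out = upsample(out, lap.shape) + boost * lap
    return out

img = np.arange(16.0).reshape(4, 4)
enhanced = laplacian_enhance(img)
print(enhanced.shape)
```

    With boost set to 1.0 the pyramid reconstructs the input exactly; values above 1.0 sharpen the detail layers, which is what mitigates the missing-detail problem the abstract describes.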

  9. Economic growth and CO2 emissions: an investigation with smooth transition autoregressive distributed lag models for the 1800-2014 period in the USA.

    PubMed

    Bildirici, Melike; Ersin, Özgür Ömer

    2018-01-01

    The study aims to combine the autoregressive distributed lag (ARDL) cointegration framework with smooth transition autoregressive (STAR)-type nonlinear econometric models for causal inference. Further, the proposed STAR distributed lag (STARDL) models offer new insights in terms of modeling nonlinearity in the long- and short-run relations between the analyzed variables. The STARDL method allows modeling and testing nonlinearity in the short-run parameters, the long-run parameters, or both. To this aim, the relation between CO2 emissions and economic growth rates in the USA is investigated for the 1800-2014 period, which is one of the largest data sets available. The proposed hybrid models, namely the logistic, exponential, and second-order logistic smooth transition autoregressive distributed lag (LSTARDL, ESTARDL, and LSTAR2DL) models, combine the STAR framework with nonlinear ARDL-type cointegration to augment the linear ARDL approach with smooth transitional nonlinearity. The proposed models provide a new approach to the relevant econometrics and environmental economics literature. Our results indicated the presence of asymmetric long-run and short-run relations between the analyzed variables, running from GDP towards CO2 emissions. By the use of the newly proposed STARDL models, the results are in favor of important differences in terms of the response of CO2 emissions in regimes 1 and 2 for the estimated LSTAR2DL and LSTARDL models.
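
    The regime-switching mechanism in LSTAR-type models rests on a logistic transition function G(s; γ, c) that moves smoothly from 0 to 1 as the transition variable s crosses the threshold c. The sketch below shows this function and a toy regime-weighted AR(1) mix; the parameter values are illustrative, not estimates from the study.

```python
import numpy as np

def logistic_transition(s, gamma, c):
    """First-order logistic transition G(s; gamma, c) used in LSTAR
    models: ~0 in regime 1, ~1 in regime 2, smooth in between.
    gamma controls the transition speed, c the threshold location."""
    return 1.0 / (1.0 + np.exp(-gamma * (s - c)))

def star1(y_lag, s, phi1, phi2, gamma=5.0, c=0.0):
    """STAR(1) sketch: regime-weighted mix of two AR(1) responses,
    with illustrative (not estimated) coefficients."""
    g = logistic_transition(s, gamma, c)
    return (1.0 - g) * phi1 * y_lag + g * phi2 * y_lag

print(round(logistic_transition(0.0, 5.0, 0.0), 2))  # midpoint: 0.5
```

    A second-order logistic transition (as in LSTAR2DL) replaces the single threshold with two, letting the model return to the outer regime on both sides of a central band.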

  10. Early Validation of Failure Detection, Isolation, and Recovery Design Using Heterogeneous Modelling and Simulation

    NASA Astrophysics Data System (ADS)

    van der Plas, Peter; Guerriero, Suzanne; Cristiano, Leorato; Rugina, Ana

    2012-08-01

    Modelling and simulation can support a number of use cases across the spacecraft development life-cycle. Given the increasing complexity of space missions, the observed general trend is for a more extensive usage of simulation already in the early phases. A major perceived advantage is that modelling and simulation can enable the validation of critical aspects of the spacecraft design before the actual development is started, as such reducing the risk in later phases. Failure Detection, Isolation, and Recovery (FDIR) is one of the areas with a high potential to benefit from early modelling and simulation. With the increasing level of required spacecraft autonomy, FDIR specifications can grow in such a way that the traditional document-based review process soon becomes inadequate. This paper shows that FDIR modelling and simulation in a system context can provide a powerful tool to support the FDIR verification process. It is highlighted that FDIR modelling at this early stage requires heterogeneous modelling tools and languages, in order to provide an adequate functional description of the different components (i.e. FDIR functions, environment, equipment, etc.) to be modelled. For this reason, an FDIR simulation framework is proposed in this paper. This framework is based on a number of tools already available in the Avionics Systems Laboratory at ESTEC, which are the Avionics Test Bench Functional Engineering Simulator (ATB FES), Matlab/Simulink, TASTE, and Real Time Developer Studio (RTDS). The paper then discusses the application of the proposed simulation framework to a real case study, i.e. the FDIR modelling of a satellite in support of an actual ESA mission. Challenges and benefits of the approach are described. Finally, lessons learned and the generality of the proposed approach are discussed.

  11. New theoretical framework for designing nonionic surfactant mixtures that exhibit a desired adsorption kinetics behavior.

    PubMed

    Moorkanikkara, Srinivas Nageswaran; Blankschtein, Daniel

    2010-12-21

    How does one design a surfactant mixture using a set of available surfactants such that it exhibits a desired adsorption kinetics behavior? The traditional approach used to address this design problem involves conducting trial-and-error experiments with specific surfactant mixtures. This approach is typically time-consuming and resource-intensive and becomes increasingly challenging when the number of surfactants that can be mixed increases. In this article, we propose a new theoretical framework to identify a surfactant mixture that most closely meets a desired adsorption kinetics behavior. Specifically, the new theoretical framework involves (a) formulating the surfactant mixture design problem as an optimization problem using an adsorption kinetics model and (b) solving the optimization problem using a commercial optimization package. The proposed framework aims to identify the surfactant mixture that most closely satisfies the desired adsorption kinetics behavior subject to the predictive capabilities of the chosen adsorption kinetics model. Experiments can then be conducted at the identified surfactant mixture condition to validate the predictions. We demonstrate the reliability and effectiveness of the proposed theoretical framework through a realistic case study by identifying a nonionic surfactant mixture consisting of up to four alkyl poly(ethylene oxide) surfactants (C(10)E(4), C(12)E(5), C(12)E(6), and C(10)E(8)) such that it most closely exhibits a desired dynamic surface tension (DST) profile. Specifically, we use the Mulqueen-Stebe-Blankschtein (MSB) adsorption kinetics model (Mulqueen, M.; Stebe, K. J.; Blankschtein, D. Langmuir 2001, 17, 5196-5207) to formulate the optimization problem as well as the SNOPT commercial optimization solver to identify a surfactant mixture consisting of these four surfactants that most closely exhibits the desired DST profile. 
Finally, we compare the experimental DST profile measured at the surfactant mixture condition identified by the new theoretical framework with the desired DST profile and find good agreement between the two profiles.

  12. Modelling Lean and Green Supply Chain

    NASA Astrophysics Data System (ADS)

    Duarte, Susana Carla Vieira Lino Medina

    The success of an organization depends on the effective control of its supply chain. It is important to recognize new opportunities for organization and its supply chain. In the last few years the approach to lean, agile, resilient and green supply chain paradigms has been addressed in the scientific literature. Research in this field shows that the integration of these concepts revealed some contradictions among so many paradigms. This thesis is mainly focused on the lean and green approaches. Thirteen different management frameworks, embodied in awards, standards and tools were studied to understand if they could contribute for the modelling process of a lean and green approach. The study reveals a number of categories that are common in most management frameworks, providing adequate conditions for a lean and green supply chain transformation. A conceptual framework for the evaluation of a lean and green organization's supply chain was proposed. The framework considers six key criteria, namely, leadership, people, strategic planning, stakeholders, processes and results. It was proposed an assessment method considering a criteria score for each criterion. The purpose is to understand how lean and green supply chain can be compatible, using principles, practices, techniques or tools (i.e. elements) that support both, a lean and a green approach, in all key criteria. A case study in the automotive upstream supply chain was performed to understand more deeply if the elements proposed for the conceptual framework could be implemented in a real-scenario. Based on the conceptual framework and the case study, a roadmap to achieve a lean-green transformation is presented. The proposed roadmap revealed its contribution to the understanding on how and when an organization's supply chain should apply the lean and green elements. 
This study is relevant to practice, as it may assist managers in the adoption of a lean and green supply chain approach, giving insights for the implementation of a hybrid supply chain.

  13. Brain activity and cognition: a connection from thermodynamics and information theory.

    PubMed

    Collell, Guillem; Fauquet, Jordi

    2015-01-01

    The connection between brain and mind is an important scientific and philosophical question that we are still far from completely understanding. A crucial point to our work is noticing that thermodynamics provides a convenient framework to model brain activity, whereas cognition can be modeled in information-theoretical terms. In fact, several models have been proposed so far from both approaches. A second critical remark is the existence of deep theoretical connections between thermodynamics and information theory. In fact, some well-known authors claim that the laws of thermodynamics are nothing but principles in information theory. Unlike in physics or chemistry, a formalization of the relationship between information and energy is currently lacking in neuroscience. In this paper we propose a framework to connect physical brain and cognitive models by means of the theoretical connections between information theory and thermodynamics. Ultimately, this article aims at providing further insight on the formal relationship between cognition and neural activity.

  14. Multilayer Stock Forecasting Model Using Fuzzy Time Series

    PubMed Central

    Javedani Sadaei, Hossein; Lee, Muhammad Hisyam

    2014-01-01

    After reviewing the vast body of literature on using fuzzy time series (FTS) in stock market forecasting, certain deficiencies stand out in how existing findings have been hybridized, as does the lack of a constructive, systematic framework that could indicate directions of growth for FTS forecasting systems as a whole. In this study, we propose a multilayer model for stock market forecasting comprising five logically significant layers. Each layer has its own detailed concern and assists forecast development by resolving specific problems. To verify the model, large datasets covering the Taiwan Stock Index (TAIEX), National Association of Securities Dealers Automated Quotations (NASDAQ), Dow Jones Industrial Average (DJI), and S&P 500 were chosen as experimental data. The results indicate that the proposed methodology has the potential to be accepted as a framework for model development in stock market forecasting using FTS. PMID:24605058
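    The layered system described above ultimately rests on a first-order fuzzy time series forecaster at its core. As a minimal, hedged sketch (this is a generic Chen-style FTS step, not the authors' multilayer model; the equal-width intervals and interval count are illustrative assumptions):

```python
import numpy as np

def fts_forecast(series, n_intervals=7):
    """First-order fuzzy time series forecast (Chen-style sketch).
    Fuzzify values into equal-width intervals, learn fuzzy logical
    relationship groups A_i -> {A_j}, and forecast the next value as
    the mean midpoint of the successors of the last observed state."""
    lo, hi = min(series), max(series)
    edges = np.linspace(lo, hi, n_intervals + 1)
    mids = (edges[:-1] + edges[1:]) / 2
    # interval index of each observation, clipped to valid range
    idx = np.clip(np.searchsorted(edges, series, side="right") - 1,
                  0, n_intervals - 1)
    groups = {}
    for a, b in zip(idx[:-1], idx[1:]):
        groups.setdefault(a, set()).add(b)   # fuzzy logical relationships
    last = idx[-1]
    succ = sorted(groups.get(last, {last}))  # successors of current state
    return float(np.mean(mids[succ]))
```

For an alternating series the forecaster simply predicts the midpoint of the other state's interval, which makes the mechanics easy to verify by hand.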

  15. Probabilistic Common Spatial Patterns for Multichannel EEG Analysis

    PubMed Central

    Chen, Zhe; Gao, Xiaorong; Li, Yuanqing; Brown, Emery N.; Gao, Shangkai

    2015-01-01

    Common spatial patterns (CSP) is a well-known spatial filtering algorithm for multichannel electroencephalogram (EEG) analysis. In this paper, we cast the CSP algorithm in a probabilistic modeling setting. Specifically, probabilistic CSP (P-CSP) is proposed as a generic EEG spatio-temporal modeling framework that subsumes the CSP and regularized CSP algorithms. The proposed framework enables us to resolve the overfitting issue of CSP in a principled manner. We derive statistical inference algorithms that can alleviate the issue of local optima. In particular, an efficient algorithm based on eigendecomposition is developed for maximum a posteriori (MAP) estimation in the case of isotropic noise. For more general cases, a variational algorithm is developed for group-wise sparse Bayesian learning for the P-CSP model and for automatically determining the model size. The two proposed algorithms are validated on a simulated data set. Their practical efficacy is also demonstrated by successful applications to single-trial classifications of three motor imagery EEG data sets and by the spatio-temporal pattern analysis of one EEG data set recorded in a Stroop color naming task. PMID:26005228
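    The classical (non-probabilistic) CSP filters that P-CSP subsumes can be computed by whitening the composite covariance and eigendecomposing the whitened class covariance. A minimal numpy sketch (trial arrays of shape (trials, channels, samples); function name and normalization are illustrative, not the paper's algorithm):

```python
import numpy as np

def csp_filters(X1, X2, n_filters=1):
    """Classical CSP. Rows of the returned array are spatial filters:
    the first rows give minimal projected variance for class 1 (maximal
    for class 2), the last rows the opposite."""
    def mean_cov(X):
        # trial-wise trace-normalised covariance, averaged over trials
        return np.mean([x @ x.T / np.trace(x @ x.T) for x in X], axis=0)

    C1, C2 = mean_cov(X1), mean_cov(X2)
    d, U = np.linalg.eigh(C1 + C2)
    P = np.diag(d ** -0.5) @ U.T            # whitens C1 + C2
    lam, B = np.linalg.eigh(P @ C1 @ P.T)   # eigenvalues in ascending order
    W = B.T @ P                             # all spatial filters as rows
    return np.vstack([W[:n_filters], W[-n_filters:]])
```

Log-variances of the filtered trials are then the usual CSP features fed to a classifier.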

  16. Phase noise suppression for coherent optical block transmission systems: a unified framework.

    PubMed

    Yang, Chuanchuan; Yang, Feng; Wang, Ziyu

    2011-08-29

    A unified framework for phase noise suppression is proposed in this paper, which could be applied in any coherent optical block transmission systems, including coherent optical orthogonal frequency-division multiplexing (CO-OFDM), coherent optical single-carrier frequency-domain equalization block transmission (CO-SCFDE), etc. Based on adaptive modeling of phase noise, unified observation equations for different coherent optical block transmission systems are constructed, which lead to unified phase noise estimation and suppression. Numerical results demonstrate that the proposal is powerful in mitigating laser phase noise.

  17. Model-based Bayesian filtering of cardiac contaminants from biomedical recordings.

    PubMed

    Sameni, R; Shamsollahi, M B; Jutten, C

    2008-05-01

    Electrocardiogram (ECG) and magnetocardiogram (MCG) signals are among the most considerable sources of noise for other biomedical signals. In some recent works, a Bayesian filtering framework has been proposed for denoising the ECG signals. In this paper, it is shown that this framework may be effectively used for removing cardiac contaminants such as the ECG, MCG and ballistocardiographic artifacts from different biomedical recordings such as the electroencephalogram, electromyogram and also for canceling maternal cardiac signals from fetal ECG/MCG. The proposed method is evaluated on simulated and real signals.
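    The Bayesian filtering idea can be illustrated with the simplest member of that family, a scalar Kalman filter with a random-walk state model (a generic sketch, not the ECG dynamic model of the cited framework; the noise variances q and r are assumed values):

```python
import numpy as np

def kalman_denoise(z, q=1e-4, r=0.25):
    """Scalar Kalman filter: state x_k = x_{k-1} + w_k (variance q),
    measurement z_k = x_k + v_k (variance r). Returns the sequence of
    filtered posterior means."""
    x, p = z[0], 1.0                 # initial state estimate and variance
    out = np.empty(len(z), dtype=float)
    for k, zk in enumerate(z):
        p += q                       # predict: uncertainty grows
        g = p / (p + r)              # Kalman gain
        x += g * (zk - x)            # update with the new measurement
        p *= 1.0 - g
        out[k] = x
    return out
```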

  18. An alternative to the neoliberal model in health: the case of Venezuela.

    PubMed

    Feo, Oscar; Siqueira, Carlos Eduardo

    2004-01-01

    The authors present a synthesis of the proposals put forth by the health sector of Venezuela during the framing of the new Venezuelan Constitution. They summarize the background to the National Constituent Assembly and the legal framework typical of the health sector at that time, identify the methodological aspects that substantiated the health topics included in the new Constitution, and analyze the articles that shape the current constitutional health framework in Venezuela, summarizing their most important features and comparing them with neoliberal health proposals.

  19. Semantic Framework of Internet of Things for Smart Cities: Case Studies.

    PubMed

    Zhang, Ningyu; Chen, Huajun; Chen, Xi; Chen, Jiaoyan

    2016-09-14

    In recent years, the advancement of sensor technology has led to the generation of heterogeneous Internet-of-Things (IoT) data by smart cities. Thus, the development and deployment of various aspects of IoT-based applications are necessary to mine the potential value of data to the benefit of people and their lives. However, the variety, volume, heterogeneity, and real-time nature of data obtained from smart cities pose considerable challenges. In this paper, we propose a semantic framework that integrates the IoT with machine learning for smart cities. The proposed framework retrieves and models urban data for certain kinds of IoT applications based on semantic and machine-learning technologies. Moreover, we propose two case studies: pollution detection from vehicles and traffic pattern detection. The experimental results show that our system is scalable and capable of accommodating a large number of urban regions with different types of IoT applications.

  20. Semantic Framework of Internet of Things for Smart Cities: Case Studies

    PubMed Central

    Zhang, Ningyu; Chen, Huajun; Chen, Xi; Chen, Jiaoyan

    2016-01-01

    In recent years, the advancement of sensor technology has led to the generation of heterogeneous Internet-of-Things (IoT) data by smart cities. Thus, the development and deployment of various aspects of IoT-based applications are necessary to mine the potential value of data to the benefit of people and their lives. However, the variety, volume, heterogeneity, and real-time nature of data obtained from smart cities pose considerable challenges. In this paper, we propose a semantic framework that integrates the IoT with machine learning for smart cities. The proposed framework retrieves and models urban data for certain kinds of IoT applications based on semantic and machine-learning technologies. Moreover, we propose two case studies: pollution detection from vehicles and traffic pattern detection. The experimental results show that our system is scalable and capable of accommodating a large number of urban regions with different types of IoT applications. PMID:27649185

  1. Robustness Analysis of Integrated LPV-FDI Filters and LTI-FTC System for a Transport Aircraft

    NASA Technical Reports Server (NTRS)

    Khong, Thuan H.; Shin, Jong-Yeob

    2007-01-01

    This paper proposes an analysis framework for robustness analysis of a nonlinear dynamic system that can be represented by a polynomial linear parameter-varying (PLPV) system with constant bounded uncertainty. The proposed analysis framework contains three key tools: 1) a function substitution method, which can convert a nonlinear system in polynomial form into a PLPV system; 2) a matrix-based linear fractional transformation (LFT) modeling approach, which can convert a PLPV system into an LFT system whose delta block includes the key uncertainty and scheduling parameters; and 3) μ-analysis, a well-known robustness analysis tool for linear systems. The proposed analysis framework is applied to evaluating the performance of the LPV fault detection and isolation (FDI) filters of the closed-loop system of a transport aircraft in the presence of unmodeled actuator dynamics and sensor gain uncertainty. The robustness analysis results are compared with nonlinear time simulations.

  2. A superpixel-based framework for automatic tumor segmentation on breast DCE-MRI

    NASA Astrophysics Data System (ADS)

    Yu, Ning; Wu, Jia; Weinstein, Susan P.; Gaonkar, Bilwaj; Keller, Brad M.; Ashraf, Ahmed B.; Jiang, YunQing; Davatzikos, Christos; Conant, Emily F.; Kontos, Despina

    2015-03-01

    Accurate and efficient automated tumor segmentation in breast dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) is highly desirable for computer-aided tumor diagnosis. We propose a novel automatic segmentation framework which incorporates mean-shift smoothing, superpixel-wise classification, pixel-wise graph-cuts partitioning, and morphological refinement. A set of 15 breast DCE-MR images, obtained from the American College of Radiology Imaging Network (ACRIN) 6657 I-SPY trial, were manually segmented to generate tumor masks (as ground truth) and breast masks (as regions of interest). Four state-of-the-art segmentation approaches based on diverse models were also utilized for comparison. Based on five standard evaluation metrics for segmentation, the proposed framework consistently outperformed all other approaches. The performance of the proposed framework was: 1) 0.83 for Dice similarity coefficient, 2) 0.96 for pixel-wise accuracy, 3) 0.72 for VOC score, 4) 0.79 mm for mean absolute difference, and 5) 11.71 mm for maximum Hausdorff distance, which surpassed the second best method (i.e., adaptive geodesic transformation), a semi-automatic algorithm depending on precise initialization. Our results suggest promising potential applications of our segmentation framework in assisting analysis of breast carcinomas.
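    The Dice similarity coefficient reported above compares a predicted mask against the ground-truth mask; a minimal implementation:

```python
import numpy as np

def dice_coefficient(pred, truth):
    """Dice similarity coefficient between two binary masks:
    2|A ∩ B| / (|A| + |B|), in [0, 1]."""
    pred, truth = np.asarray(pred, bool), np.asarray(truth, bool)
    denom = pred.sum() + truth.sum()
    if denom == 0:
        return 1.0                   # both masks empty: perfect agreement
    return 2.0 * np.logical_and(pred, truth).sum() / denom
```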

  3. SU-D-BRA-04: Computerized Framework for Marker-Less Localization of Anatomical Feature Points in Range Images Based On Differential Geometry Features for Image-Guided Radiation Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Soufi, M; Arimura, H; Toyofuku, F

    Purpose: To propose a computerized framework for localization of anatomical feature points on the patient surface in infrared-ray based range images by using differential geometry (curvature) features. Methods: The general concept was to reconstruct the patient surface by using a mathematical modeling technique for the computation of differential geometry features that characterize the local shapes of the patient surfaces. A region of interest (ROI) was firstly extracted based on a template matching technique applied on amplitude (grayscale) images. The extracted ROI was preprocessed for reducing temporal and spatial noises by using Kalman and bilateral filters, respectively. Next, a smooth patient surface was reconstructed by using a non-uniform rational basis spline (NURBS) model. Finally, differential geometry features, i.e. the shape index and curvedness features, were computed for localizing the anatomical feature points. The proposed framework was trained for optimizing shape index and curvedness thresholds and tested on range images of an anthropomorphic head phantom. The range images were acquired by an infrared ray-based time-of-flight (TOF) camera. The localization accuracy was evaluated by measuring the mean of minimum Euclidean distances (MMED) between reference (ground truth) points and the feature points localized by the proposed framework. The evaluation was performed for points localized on convex regions (e.g. apex of nose) and concave regions (e.g. nasofacial sulcus). Results: The proposed framework has localized anatomical feature points on convex and concave anatomical landmarks with MMEDs of 1.91±0.50 mm and 3.70±0.92 mm, respectively. A statistically significant difference was obtained between the feature points on the convex and concave regions (P<0.001). Conclusion: Our study has shown the feasibility of differential geometry features for localization of anatomical feature points on the patient surface in range images. The proposed framework might be useful for tasks involving feature-based image registration in range-image guided radiation therapy.
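    The shape index and curvedness used for landmark localization are simple functions of the two principal curvatures. A sketch under one common sign convention (conventions vary across papers, so the signs here are an assumption):

```python
import numpy as np

def shape_index_curvedness(k1, k2):
    """Koenderink-style shape index S in [-1, 1] and curvedness C from
    principal curvatures k1 >= k2. In this convention S = +1 for a
    convex cap, -1 for a concave cup, and 0 for a symmetric saddle."""
    s = (2.0 / np.pi) * np.arctan2(k1 + k2, k1 - k2)
    c = np.sqrt((k1 ** 2 + k2 ** 2) / 2.0)
    return s, c
```

Thresholding S separates convex landmarks (e.g. the nose apex) from concave ones (e.g. the nasofacial sulcus), while C separates curved regions from near-flat ones.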

  4. Assessing Vocational Interests in the Basque Country Using Paired Comparison Design

    ERIC Educational Resources Information Center

    Elosua, Paula

    2007-01-01

    This article proposes the Thurstonian paired comparison model to assess vocational preferences and uses this approach to evaluate the Realistic, Investigative, Artistic, Social, Enterprise, and Conventional (RIASEC) model in the Basque Country (Spain). First, one unrestricted model is estimated in the Structural Equation Modelling framework using…

  5. Some Statistics for Assessing Person-Fit Based on Continuous-Response Models

    ERIC Educational Resources Information Center

    Ferrando, Pere Joan

    2010-01-01

    This article proposes several statistics for assessing individual fit based on two unidimensional models for continuous responses: linear factor analysis and Samejima's continuous response model. Both models are approached using a common framework based on underlying response variables and are formulated at the individual level as fixed regression…

  6. Extrinsic local regression on manifold-valued data

    PubMed Central

    Lin, Lizhen; St Thomas, Brian; Zhu, Hongtu; Dunson, David B.

    2017-01-01

    We propose an extrinsic regression framework for modeling data with manifold valued responses and Euclidean predictors. Regression with manifold responses has wide applications in shape analysis, neuroscience, medical imaging and many other areas. Our approach embeds the manifold where the responses lie onto a higher dimensional Euclidean space, obtains a local regression estimate in that space, and then projects this estimate back onto the image of the manifold. Outside the regression setting both intrinsic and extrinsic approaches have been proposed for modeling i.i.d manifold-valued data. However, to our knowledge our work is the first to take an extrinsic approach to the regression problem. The proposed extrinsic regression framework is general, computationally efficient and theoretically appealing. Asymptotic distributions and convergence rates of the extrinsic regression estimates are derived and a large class of examples are considered indicating the wide applicability of our approach. PMID:29225385

  7. A Computational Framework to Control Verification and Robustness Analysis

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2010-01-01

    This paper presents a methodology for evaluating the robustness of a controller based on its ability to satisfy the design requirements. The framework proposed is generic since it allows for high-fidelity models, arbitrary control structures and arbitrary functional dependencies between the requirements and the uncertain parameters. The cornerstone of this contribution is the ability to bound the region of the uncertain parameter space where the degradation in closed-loop performance remains acceptable. The size of this bounding set, whose geometry can be prescribed according to deterministic or probabilistic uncertainty models, is a measure of robustness. The robustness metrics proposed herein are the parametric safety margin, the reliability index, the failure probability and upper bounds to this probability. The performance observed at the control verification setting, where the assumptions and approximations used for control design may no longer hold, will fully determine the proposed control assessment.

  8. Gauge invariance of phenomenological models of the interaction of quantum dissipative systems with electromagnetic fields

    NASA Astrophysics Data System (ADS)

    Tokman, M. D.

    2009-05-01

    We discuss specific features of the electrodynamic characteristics of quantum systems within the framework of models that include a phenomenological description of the relaxation processes. As is shown by W. E. Lamb, Jr., R. R. Schlicher, and M. O. Scully [Phys. Rev. A 36, 2763 (1987)], the use of phenomenological relaxation operators, which adequately describe the attenuation of eigenvibrations of a quantum system, may lead to incorrect solutions in the presence of external electromagnetic fields determined by the vector potential for different resonance processes. This incorrectness can be eliminated by giving a gauge-invariant form to the relaxation operator. Lamb, Jr., proposed the corresponding gauge-invariant modification for the Weisskopf-Wigner relaxation operator, which is introduced directly into the Schrödinger equation within the framework of the two-level approximation. In the present paper, this problem is studied for the von Neumann equation supplemented by a relaxation operator. First, we show that the solution of the equation for the density matrix with the relaxation operator correctly obtained “from the first principles” has properties that ensure gauge invariance for the observables. Second, we propose a common recipe for transformation of the phenomenological relaxation operator into the correct (gauge-invariant) form in the density-matrix equations for a multilevel system. Also, we discuss the methods of elimination of other inaccuracies (not related to the gauge-invariance problem) which arise if the electrodynamic response of a dissipative quantum system is calculated within the framework of simplified relaxation models (first of all, the model corresponding to constant relaxation rates of coherences in quantum transitions). Examples illustrating the correctness of the results obtained within the framework of the proposed methods in contrast to inaccuracy of the results of the standard calculation techniques are given.

  9. Error modeling for surrogates of dynamical systems using machine learning: Machine-learning-based error model for surrogates of dynamical systems

    DOE PAGES

    Trehan, Sumeet; Carlberg, Kevin T.; Durlofsky, Louis J.

    2017-07-14

    A machine learning–based framework for modeling the error introduced by surrogate models of parameterized dynamical systems is proposed. The framework entails the use of high-dimensional regression techniques (eg, random forests, and LASSO) to map a large set of inexpensively computed “error indicators” (ie, features) produced by the surrogate model at a given time instance to a prediction of the surrogate-model error in a quantity of interest (QoI). This eliminates the need for the user to hand-select a small number of informative features. The methodology requires a training set of parameter instances at which the time-dependent surrogate-model error is computed by simulating both the high-fidelity and surrogate models. Using these training data, the method first determines regression-model locality (via classification or clustering) and subsequently constructs a “local” regression model to predict the time-instantaneous error within each identified region of feature space. We consider 2 uses for the resulting error model: (1) as a correction to the surrogate-model QoI prediction at each time instance and (2) as a way to statistically model arbitrary functions of the time-dependent surrogate-model error (eg, time-integrated errors). We then apply the proposed framework to model errors in reduced-order models of nonlinear oil-water subsurface flow simulations, with time-varying well-control (bottom-hole pressure) parameters. The reduced-order models used in this work entail application of trajectory piecewise linearization in conjunction with proper orthogonal decomposition. Moreover, when the first use of the method is considered, numerical experiments demonstrate consistent improvement in accuracy in the time-instantaneous QoI prediction relative to the original surrogate model, across a large number of test cases. When the second use is considered, results show that the proposed method provides accurate statistical predictions of the time- and well-averaged errors.
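    The core regression step (error indicators → surrogate error) can be sketched with a plain ridge regression standing in for the random-forest/LASSO machinery the authors use; the features and target here are synthetic illustrations:

```python
import numpy as np

def fit_error_model(F, e, lam=1e-6):
    """Ridge regression from error-indicator features F (n, d) to the
    observed surrogate error e (n,). Returns weights incl. intercept."""
    A = np.hstack([F, np.ones((F.shape[0], 1))])    # append bias column
    return np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ e)

def predict_error(F, w):
    """Apply the trained error model to new feature vectors."""
    return np.hstack([F, np.ones((F.shape[0], 1))]) @ w
```

The prediction can then be used either to correct the surrogate QoI directly or to build statistics of time-integrated errors, mirroring the two uses discussed above.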

  10. Error modeling for surrogates of dynamical systems using machine learning: Machine-learning-based error model for surrogates of dynamical systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trehan, Sumeet; Carlberg, Kevin T.; Durlofsky, Louis J.

    A machine learning–based framework for modeling the error introduced by surrogate models of parameterized dynamical systems is proposed. The framework entails the use of high-dimensional regression techniques (eg, random forests, and LASSO) to map a large set of inexpensively computed “error indicators” (ie, features) produced by the surrogate model at a given time instance to a prediction of the surrogate-model error in a quantity of interest (QoI). This eliminates the need for the user to hand-select a small number of informative features. The methodology requires a training set of parameter instances at which the time-dependent surrogate-model error is computed by simulating both the high-fidelity and surrogate models. Using these training data, the method first determines regression-model locality (via classification or clustering) and subsequently constructs a “local” regression model to predict the time-instantaneous error within each identified region of feature space. We consider 2 uses for the resulting error model: (1) as a correction to the surrogate-model QoI prediction at each time instance and (2) as a way to statistically model arbitrary functions of the time-dependent surrogate-model error (eg, time-integrated errors). We then apply the proposed framework to model errors in reduced-order models of nonlinear oil-water subsurface flow simulations, with time-varying well-control (bottom-hole pressure) parameters. The reduced-order models used in this work entail application of trajectory piecewise linearization in conjunction with proper orthogonal decomposition. Moreover, when the first use of the method is considered, numerical experiments demonstrate consistent improvement in accuracy in the time-instantaneous QoI prediction relative to the original surrogate model, across a large number of test cases. When the second use is considered, results show that the proposed method provides accurate statistical predictions of the time- and well-averaged errors.

  11. DEFINING RECOVERY GOALS AND STRATEGIES FOR ENDANGERED SPECIES USING SPATIALLY-EXPLICIT POPULATION MODELS

    EPA Science Inventory

    We used a spatially explicit population model of wolves (Canis lupus) to propose a framework for defining rangewide recovery priorities and finer-scale strategies for regional reintroductions. The model predicts that Yellowstone and central Idaho, where wolves have recently been ...

  12. A decomposition model and voxel selection framework for fMRI analysis to predict neural response of visual stimuli.

    PubMed

    Raut, Savita V; Yadav, Dinkar M

    2018-03-28

    This paper presents an fMRI signal analysis methodology using geometric mean curve decomposition (GMCD) and a mutual-information-based voxel selection framework. Previously, fMRI signal analysis has been conducted using an empirical mean curve decomposition (EMCD) model and voxel selection on the raw fMRI signal. The former methodology loses frequency components, while the latter suffers from signal redundancy. Both challenges are addressed by our methodology: the frequency component is preserved by decomposing the raw fMRI signal using the geometric rather than the arithmetic mean, and voxels are selected from the decomposed signal using GMCD components rather than from the raw fMRI signal. The proposed methodologies are adopted for predicting the neural response. Experiments are conducted on the openly available fMRI data of six subjects, and comparisons are made with existing decomposition models and voxel selection frameworks. Subsequently, the effects of the number of selected voxels and of the selection constraints are analyzed. The comparative results and the analysis demonstrate the superiority and reliability of the proposed methodology.

  13. A definitional framework for the human/biometric sensor interaction model

    NASA Astrophysics Data System (ADS)

    Elliott, Stephen J.; Kukula, Eric P.

    2010-04-01

    Existing definitions for biometric testing and evaluation do not fully explain errors in a biometric system. This paper provides a definitional framework for the Human Biometric-Sensor Interaction (HBSI) model. This paper proposes six new definitions based around two classifications of presentations, erroneous and correct. The new terms are: defective interaction (DI), concealed interaction (CI), false interaction (FI), failure to detect (FTD), failure to extract (FTX), and successfully acquired samples (SAS). As with all definitions, the new terms require a modification to the general biometric model developed by Mansfield and Wayman [1].

  14. Monitoring alert and drowsy states by modeling EEG source nonstationarity

    NASA Astrophysics Data System (ADS)

    Hsu, Sheng-Hsiou; Jung, Tzyy-Ping

    2017-10-01

    Objective. As a human brain performs various cognitive functions within ever-changing environments, states of the brain characterized by recorded brain activities such as electroencephalogram (EEG) are inevitably nonstationary. The challenges of analyzing the nonstationary EEG signals include finding neurocognitive sources that underlie different brain states and using EEG data to quantitatively assess the state changes. Approach. This study hypothesizes that brain activities under different states, e.g. levels of alertness, can be modeled as distinct compositions of statistically independent sources using independent component analysis (ICA). This study presents a framework to quantitatively assess the EEG source nonstationarity and estimate levels of alertness. The framework was tested against EEG data collected from 10 subjects performing a sustained-attention task in a driving simulator. Main results. Empirical results illustrate that EEG signals under alert versus drowsy states, indexed by reaction speeds to driving challenges, can be characterized by distinct ICA models. By quantifying the goodness-of-fit of each ICA model to the EEG data using the model deviation index (MDI), we found that MDIs were significantly correlated with the reaction speeds (r = -0.390 with alertness models and r = 0.449 with drowsiness models); the opposite correlations indicated that the two models accounted for sources in the alert and drowsy states, respectively. Based on the observed source nonstationarity, this study also proposes an online framework using a subject-specific ICA model trained with an initial (alert) state to track the level of alertness. For classification of alert against drowsy states, the proposed online framework achieved an average area under the curve of 0.745 and compared favorably with a classic power-based approach. Significance. This ICA-based framework provides a new way to study changes of brain states and can be applied to monitoring cognitive or mental states of human operators in attention-critical settings or in passive brain-computer interfaces.

  15. Absenteeism in Undergraduate Business Education: A Proposed Model and Exploratory Investigation

    ERIC Educational Resources Information Center

    Burke, Lisa A.

    2010-01-01

    One issue in undergraduate business education remaining underexamined is student absenteeism. In this article, the literature on undergraduate absenteeism is reviewed culminating in a proposed conceptual framework to guide future research, and an exploratory investigation of management students' attitudes about absenteeism is conducted.…

  16. Person-centered nursing practice with older people in Ireland.

    PubMed

    Landers, Margaret G; McCarthy, Geraldine M

    2007-01-01

    This column presents an analysis of McCormack's conceptual framework for person-centered practice with older people as a theoretical basis for the delivery of care of older adults in an Irish context. The evaluative process is guided by the framework proposed by Fawcett (2000) for the analysis and evaluation of conceptual models of nursing. The historical evolution, philosophical claims, and an overview of the content of the model are addressed. The following criteria are then applied: logical congruence, the generation of the theory, the credibility of the model, and the contribution of the model to the discipline of nursing.

  17. A Temporal Mining Framework for Classifying Un-Evenly Spaced Clinical Data: An Approach for Building Effective Clinical Decision-Making System.

    PubMed

    Jane, Nancy Yesudhas; Nehemiah, Khanna Harichandran; Arputharaj, Kannan

    2016-01-01

    Clinical time-series data acquired from electronic health records (EHR) are liable to temporal complexities such as irregular observations, missing values and time-constrained attributes that make the knowledge discovery process challenging. This paper presents a temporal rough set induced neuro-fuzzy (TRiNF) mining framework that handles these complexities and builds an effective clinical decision-making system. TRiNF provides two functionalities, namely temporal data acquisition (TDA) and temporal classification. In TDA, a time-series forecasting model is constructed by adopting an improved double exponential smoothing method. The forecasting model is used in missing value imputation and temporal pattern extraction. The relevant attributes are selected using a temporal-pattern-based rough set approach. In temporal classification, a classification model is built with the selected attributes using a temporal-pattern-induced neuro-fuzzy classifier. For experimentation, this work uses two clinical time-series datasets, of hepatitis and thrombosis patients. The experimental results show that the proposed TRiNF framework yields a significant reduction in the error rate, achieving average classification accuracies of 92.59% on the hepatitis and 91.69% on the thrombosis dataset. The obtained classification results demonstrate the efficiency of the proposed framework in terms of improved classification accuracy.
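    The improved double exponential smoothing used in TDA builds on Holt's classic recurrences. A plain (unimproved) sketch with assumed smoothing constants; its one-step-ahead forecasts are exactly the kind of value that can stand in for a missing observation:

```python
def holt_forecasts(y, alpha=0.5, beta=0.3):
    """Holt's double exponential smoothing: returns the one-step-ahead
    forecasts for y[1:], tracking a level l and a trend b."""
    l, b = y[0], y[1] - y[0]          # initial level and trend
    forecasts = []
    for t in range(1, len(y)):
        forecasts.append(l + b)       # forecast for time t, made at t-1
        l_new = alpha * y[t] + (1 - alpha) * (l + b)
        b = beta * (l_new - l) + (1 - beta) * b
        l = l_new
    return forecasts
```

On a perfectly linear series the level and trend lock on immediately, so every forecast is exact, which makes the recurrences easy to check by hand.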

  18. Haptic simulation framework for determining virtual dental occlusion.

    PubMed

    Wu, Wen; Chen, Hui; Cen, Yuhai; Hong, Yang; Khambay, Balvinder; Heng, Pheng Ann

    2017-04-01

    The surgical treatment of many dentofacial deformities is often complex due to its three-dimensional nature. Determining the dental occlusion in its most stable position is essential for the success of the treatment. Computer-aided virtual planning on an individualized, patient-specific 3D model can help formulate the surgical plan and predict the surgical change. However, in current computer-aided planning systems, it is not possible to determine the dental occlusion of the digital models in an intuitive way during virtual surgical planning because of the absence of haptic feedback. In this paper, a physically based haptic simulation framework is proposed, which can provide surgeons with intuitive haptic feedback to determine the dental occlusion of the digital models in their most stable position. To provide physically realistic force feedback when the dental models contact each other during the searching process, a contact model is proposed to describe the dynamic and collision properties of the dental models during alignment. The simulated impulse/contact-based forces are integrated into the unified simulation framework. A validation study has been conducted on fifteen sets of virtual dental models chosen at random and covering a wide range of the dental relationships found clinically. The dental occlusions obtained by an expert were employed as a benchmark against which to compare the virtual occlusion results. The mean translational and angular deviations of the virtual occlusion results from the benchmark were small. The experimental results show the validity of our method, and the simulated forces can provide valuable insights for determining the virtual dental occlusion. The findings of this work and the validation of the proposed concept lead the way for full virtual surgical planning on patient-specific virtual models, allowing fully customized treatment plans for the surgical correction of dentofacial deformities.

  19. Incorporating physically-based microstructures in materials modeling: Bridging phase field and crystal plasticity frameworks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lim, Hojun; Abdeljawad, Fadi; Owen, Steven J.

    Here, the mechanical properties of materials systems are highly influenced by various features at the microstructural level. The ability to capture these heterogeneities and incorporate them into continuum-scale frameworks of the deformation behavior is considered a key step in the development of complex non-local models of failure. In this study, we present a modeling framework that incorporates physically-based realizations of polycrystalline aggregates from a phase field (PF) model into a crystal plasticity finite element (CP-FE) framework. Simulated annealing via the PF model yields ensembles of materials microstructures with various grain sizes and shapes. With the aid of a novel FE meshing technique, FE discretizations of these microstructures are generated, where several key features, such as conformity to interfaces and triple junction angles, are preserved. The discretizations are then used in the CP-FE framework to simulate the mechanical response of polycrystalline α-iron. It is shown that the conformal discretization across interfaces reduces artificial stress localization commonly observed in non-conformal FE discretizations. The work presented herein is a first step towards incorporating physically-based microstructures in lieu of the overly simplified representations that are commonly used. In broader terms, the proposed framework provides future avenues to explore bridging models of materials processes, e.g. additive manufacturing and microstructure evolution of multi-phase multi-component systems, into continuum-scale frameworks of the mechanical properties.

  20. Incorporating physically-based microstructures in materials modeling: Bridging phase field and crystal plasticity frameworks

    DOE PAGES

    Lim, Hojun; Abdeljawad, Fadi; Owen, Steven J.; ...

    2016-04-25

    Here, the mechanical properties of materials systems are highly influenced by various features at the microstructural level. The ability to capture these heterogeneities and incorporate them into continuum-scale frameworks of the deformation behavior is considered a key step in the development of complex non-local models of failure. In this study, we present a modeling framework that incorporates physically-based realizations of polycrystalline aggregates from a phase field (PF) model into a crystal plasticity finite element (CP-FE) framework. Simulated annealing via the PF model yields ensembles of materials microstructures with various grain sizes and shapes. With the aid of a novel FE meshing technique, FE discretizations of these microstructures are generated, where several key features, such as conformity to interfaces and triple junction angles, are preserved. The discretizations are then used in the CP-FE framework to simulate the mechanical response of polycrystalline α-iron. It is shown that the conformal discretization across interfaces reduces artificial stress localization commonly observed in non-conformal FE discretizations. The work presented herein is a first step towards incorporating physically-based microstructures in lieu of the overly simplified representations that are commonly used. In broader terms, the proposed framework provides future avenues to explore bridging models of materials processes, e.g. additive manufacturing and microstructure evolution of multi-phase multi-component systems, into continuum-scale frameworks of the mechanical properties.

  1. The Role of Twinning Deformation on the Hardening Response of Polycrystalline Magnesium from Discrete Dislocation Dynamics Simulations

    DTIC Science & Technology

    2015-01-01

    The hardening response of polycrystalline magnesium (Mg) was studied using three-dimensional discrete dislocation dynamics (DDD). A systematic interaction model between dislocations and {10-12} tension twin boundaries (TBs) was proposed and introduced into the DDD framework. In addition, a nominal grain boundary (GB) model based...

  2. Optimizing Negotiation Conflict in the Cloud Service Negotiation Framework Using Probabilistic Decision Making Model

    PubMed Central

    Rajavel, Rajkumar; Thangarathinam, Mala

    2015-01-01

    Optimization of negotiation conflict in the cloud service negotiation framework is identified as one of the major challenging issues. This negotiation conflict occurs during the bilateral negotiation process between the participants due to misperception, aggressive behavior, and uncertain preferences and goals about their opponents. Existing research focuses on the pre-request context of negotiation conflict optimization by grouping similar negotiation pairs using distance, binary, context-dependent, and fuzzy similarity approaches. To some extent, these approaches can maximize the success rate and minimize the communication overhead among the participants. To further optimize the success rate and communication overhead, the proposed research introduces a novel probabilistic decision making model for optimizing the negotiation conflict in the long-term negotiation context. This decision model formulates the problem of managing the different types of negotiation conflict that occur during the negotiation process as a multistage Markov decision problem. At each stage of the negotiation process, the proposed decision model generates a heuristic decision based on the past negotiation state information without causing any break-off among the participants. In addition, this heuristic decision, using the stochastic decision tree scenario, can maximize the revenue among the participants available in the cloud service negotiation framework. PMID:26543899

  3. Optimizing Negotiation Conflict in the Cloud Service Negotiation Framework Using Probabilistic Decision Making Model.

    PubMed

    Rajavel, Rajkumar; Thangarathinam, Mala

    2015-01-01

    Optimization of negotiation conflict in the cloud service negotiation framework is identified as one of the major challenging issues. This negotiation conflict occurs during the bilateral negotiation process between the participants due to misperception, aggressive behavior, and uncertain preferences and goals about their opponents. Existing research focuses on the pre-request context of negotiation conflict optimization by grouping similar negotiation pairs using distance, binary, context-dependent, and fuzzy similarity approaches. To some extent, these approaches can maximize the success rate and minimize the communication overhead among the participants. To further optimize the success rate and communication overhead, the proposed research introduces a novel probabilistic decision making model for optimizing the negotiation conflict in the long-term negotiation context. This decision model formulates the problem of managing the different types of negotiation conflict that occur during the negotiation process as a multistage Markov decision problem. At each stage of the negotiation process, the proposed decision model generates a heuristic decision based on the past negotiation state information without causing any break-off among the participants. In addition, this heuristic decision, using the stochastic decision tree scenario, can maximize the revenue among the participants available in the cloud service negotiation framework.
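
    The multistage Markov decision formulation described above can be illustrated with a minimal sketch (not the authors' code): the negotiation states, actions, transition probabilities, and revenues below are invented for illustration, and backward induction over the stages recovers a stage-wise heuristic decision without any break-off.

```python
# Hypothetical two-state negotiation MDP solved by backward induction.
STATES = ["low_conflict", "high_conflict"]
ACTIONS = ["concede", "hold"]

# P[s][a] -> list of (next_state, probability); R[s][a] -> expected revenue.
P = {
    "low_conflict":  {"concede": [("low_conflict", 0.9), ("high_conflict", 0.1)],
                      "hold":    [("low_conflict", 0.6), ("high_conflict", 0.4)]},
    "high_conflict": {"concede": [("low_conflict", 0.7), ("high_conflict", 0.3)],
                      "hold":    [("low_conflict", 0.2), ("high_conflict", 0.8)]},
}
R = {
    "low_conflict":  {"concede": 4.0, "hold": 6.0},
    "high_conflict": {"concede": 2.0, "hold": 1.0},
}

def solve(stages):
    """Backward induction: expected revenue and best action per state/stage."""
    V = {s: 0.0 for s in STATES}
    policy = []
    for _ in range(stages):
        new_V, decision = {}, {}
        for s in STATES:
            q = {a: R[s][a] + sum(p * V[t] for t, p in P[s][a]) for a in ACTIONS}
            decision[s] = max(q, key=q.get)
            new_V[s] = q[decision[s]]
        policy.insert(0, decision)
        V = new_V
    return V, policy
```

    With these illustrative numbers, a three-stage horizon tells the agent to hold in low-conflict states and concede in high-conflict states, which matches the intuition of defusing conflict before it causes a break-off.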

  4. Assessing a cross-border logistics policy using a performance measurement system framework: the case of Hong Kong and the Pearl River Delta region

    NASA Astrophysics Data System (ADS)

    Wong, David W. C.; Choy, K. L.; Chow, Harry K. H.; Lin, Canhong

    2014-06-01

    For China, the most rapidly growing economic entity in the world, a new logistics operation called the indirect cross-border supply chain model has recently emerged. The primary idea of this model is to reduce logistics costs by storing goods at a bonded warehouse with low storage cost in certain Chinese regions, such as the Pearl River Delta (PRD). This research proposes a performance measurement system (PMS) framework to assess the direct and indirect cross-border supply chain models. The PMS covers four categories, namely cost, time, quality and flexibility, in the assessment of the performance of the direct and indirect models. Furthermore, a survey was conducted to investigate the logistics performance of third-party logistics providers (3PLs) in the PRD region, including Guangzhou, Shenzhen and Hong Kong. The significance of the proposed PMS framework is that it allows 3PLs to accurately pinpoint the weaknesses and strengths of their current operations policy across the four major performance measurement categories. Hence, it helps 3PLs further enhance their competitiveness and operational efficiency through better resource allocation in the areas of warehousing and transportation.

  5. Model performance evaluation (validation and calibration) in model-based studies of therapeutic interventions for cardiovascular diseases : a review and suggested reporting framework.

    PubMed

    Haji Ali Afzali, Hossein; Gray, Jodi; Karnon, Jonathan

    2013-04-01

    Decision analytic models play an increasingly important role in the economic evaluation of health technologies. Given uncertainties around the assumptions used to develop such models, several guidelines have been published to identify and assess 'best practice' in the model development process, including general modelling approach (e.g., time horizon), model structure, input data and model performance evaluation. This paper focuses on model performance evaluation. In the absence of a sufficient level of detail around model performance evaluation, concerns regarding the accuracy of model outputs, and hence the credibility of such models, are frequently raised. Following presentation of its components, a review of the application and reporting of model performance evaluation is presented. Taking cardiovascular disease as an illustrative example, the review investigates the use of face validity, internal validity, external validity, and cross-model validity. As a part of the performance evaluation process, model calibration is also discussed and its use in applied studies investigated. The review found that the application and reporting of model performance evaluation across 81 studies of treatment for cardiovascular disease was variable. Cross-model validation was reported in 55% of the reviewed studies, though the level of detail provided varied considerably. We found that very few studies documented other types of validity, and only 6% of the reviewed articles reported a calibration process. Considering the above findings, we propose a comprehensive model performance evaluation framework (checklist), informed by a review of best-practice guidelines. This framework provides a basis for more accurate and consistent documentation of model performance evaluation. This will improve the peer review process and the comparability of modelling studies. Recognising the fundamental role of decision analytic models in informing public funding decisions, the proposed framework should usefully inform guidelines for preparing submissions to reimbursement bodies.

  6. A VGI data integration framework based on linked data model

    NASA Astrophysics Data System (ADS)

    Wan, Lin; Ren, Rongrong

    2015-12-01

    This paper addresses a geographic data integration and sharing method for multiple online VGI data sets. We propose a semantic-enabled framework for an online VGI-source cooperative application environment to solve a target class of geospatial problems. Based on linked data technologies, one of the core components of the semantic web, we can construct relationship links among geographic features distributed across diverse VGI platforms using linked data modelling methods, deploy these semantic-enabled entities on the web, and eventually form an interconnected geographic data network that supports cooperative geospatial applications across multiple VGI data sources. The mapping and transformation from VGI sources to the RDF linked data model is presented to guarantee a uniform data representation model across the different online social geographic data sources. We propose a mixed strategy that combines spatial-distance similarity and feature-name attribute similarity as the measure for comparing and matching geographic features across VGI data sets. Our work also focuses on applying Markov logic networks to interlink the same entities across different VGI-based linked data sets; in particular, the automatic generation of the co-reference object identification model from geographic linked data is discussed in detail. The result is a large geographic linked data network spanning loosely coupled VGI web sites. The results of an experiment built on our framework and the evaluation of our method show that the framework is reasonable and practicable.
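
    The mixed matching strategy, combining spatial-distance similarity with feature-name similarity, can be sketched as follows. This is an illustrative reading of the abstract, not the paper's implementation: the weights, the distance cap `d_max`, and the use of `difflib` for name similarity are all assumptions.

```python
import math
from difflib import SequenceMatcher

def mixed_similarity(feat_a, feat_b, w_spatial=0.5, w_name=0.5, d_max=500.0):
    """Combine an inverse-distance spatial score with a string similarity
    on feature names; features are (name, x, y) tuples with coordinates
    in metres, and d_max caps the spatial influence."""
    name_a, xa, ya = feat_a
    name_b, xb, yb = feat_b
    d = math.hypot(xa - xb, ya - yb)
    spatial = max(0.0, 1.0 - d / d_max)
    name = SequenceMatcher(None, name_a.lower(), name_b.lower()).ratio()
    return w_spatial * spatial + w_name * name

def match(feature, candidates, threshold=0.75):
    """Co-reference candidate: best-scoring feature above threshold, else None."""
    best = max(candidates, key=lambda c: mixed_similarity(feature, c))
    return best if mixed_similarity(feature, best) >= threshold else None
```

    A feature pair that is both nearby and similarly named scores high and is linked; pairs failing on both dimensions fall below the threshold and remain unlinked.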

  7. Heterogeneity, Mixing, and the Spatial Scales of Mosquito-Borne Pathogen Transmission

    PubMed Central

    Perkins, T. Alex; Scott, Thomas W.; Le Menach, Arnaud; Smith, David L.

    2013-01-01

    The Ross-Macdonald model has dominated theory for mosquito-borne pathogen transmission dynamics and control for over a century. The model, like many other basic population models, makes the mathematically convenient assumption that populations are well mixed; i.e., that each mosquito is equally likely to bite any vertebrate host. This assumption raises questions about the validity and utility of current theory because it is in conflict with preponderant empirical evidence that transmission is heterogeneous. Here, we propose a new dynamic framework that is realistic enough to describe biological causes of heterogeneous transmission of mosquito-borne pathogens of humans, yet tractable enough to provide a basis for developing and improving general theory. The framework is based on the ecological context of mosquito blood meals and the fine-scale movements of individual mosquitoes and human hosts that give rise to heterogeneous transmission. Using this framework, we describe pathogen dispersion in terms of individual-level analogues of two classical quantities: vectorial capacity and the basic reproductive number, R0. Importantly, this framework explicitly accounts for three key components of overall heterogeneity in transmission: heterogeneous exposure, poor mixing, and finite host numbers. Using these tools, we propose two ways of characterizing the spatial scales of transmission—pathogen dispersion kernels and the evenness of mixing across scales of aggregation—and demonstrate the consequences of a model's choice of spatial scale for epidemic dynamics and for estimation of R0, both by a priori model formulas and by inference of the force of infection from time-series data. PMID:24348223
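
    For context, the two classical quantities whose individual-level analogues the framework develops have standard textbook forms. A sketch using the conventional Ross-Macdonald parameterisation (mosquito density per host m, biting rate a, mosquito death rate g, extrinsic incubation period n, transmission efficiencies b and c, human recovery rate r) might look like this; it illustrates the textbook quantities only, not the paper's individual-level analogues.

```python
import math

def vectorial_capacity(m, a, g, n):
    """Textbook Ross-Macdonald vectorial capacity: expected infectious
    bites eventually arising, per infectious person per day, from the
    mosquitoes biting that person:  V = m * a^2 * exp(-g*n) / g."""
    return m * a * a * math.exp(-g * n) / g

def basic_reproductive_number(m, a, g, n, b, c, r):
    """R0 = V * b * c / r: vectorial capacity scaled by the two
    transmission efficiencies and the human infectious period 1/r."""
    return vectorial_capacity(m, a, g, n) * b * c / r
```

    The exp(-g*n) factor is the probability a mosquito survives the extrinsic incubation period, which is why mosquito lifespan dominates sensitivity analyses of R0.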

  8. Framework for e-learning assessment in dental education: a global model for the future.

    PubMed

    Arevalo, Carolina R; Bayne, Stephen C; Beeley, Josie A; Brayshaw, Christine J; Cox, Margaret J; Donaldson, Nora H; Elson, Bruce S; Grayden, Sharon K; Hatzipanagos, Stylianos; Johnson, Lynn A; Reynolds, Patricia A; Schönwetter, Dieter J

    2013-05-01

    The framework presented in this article demonstrates strategies for a global approach to e-curricula in dental education by considering a collection of outcome assessment tools. By combining the outcomes for overall assessment, a global model for a pilot project that applies e-assessment tools to virtual learning environments (VLE), including haptics, is presented. Assessment strategies from two projects, HapTEL (Haptics in Technology Enhanced Learning) and UDENTE (Universal Dental E-learning), act as case-user studies that have helped develop the proposed global framework. They incorporate additional assessment tools and include evaluations from questionnaires and stakeholders' focus groups. These measure each of the factors affecting the classical teaching/learning theory framework as defined by Entwistle in a standardized manner. A mathematical combinatorial approach is proposed to join these results together as a global assessment. With the use of haptic-based simulation learning, exercises for tooth preparation assessing enamel and dentine were compared to plastic teeth in manikins. Equivalence of student performance between haptic and traditional preparation methods was established, confirming the validity of the haptic solution for these exercises. Further data collected from HapTEL are still being analyzed, and pilots are being conducted to validate the proposed test measures. Initial results have been encouraging, but clearly the need persists to develop additional e-assessment methods for new learning domains.

  9. Extendable supervised dictionary learning for exploring diverse and concurrent brain activities in task-based fMRI.

    PubMed

    Zhao, Shijie; Han, Junwei; Hu, Xintao; Jiang, Xi; Lv, Jinglei; Zhang, Tuo; Zhang, Shu; Guo, Lei; Liu, Tianming

    2018-06-01

    Recently, a growing body of studies has demonstrated the simultaneous existence of diverse brain activities, e.g., task-evoked dominant response activities, delayed response activities and intrinsic brain activities, under specific task conditions. However, the currently dominant task-based functional magnetic resonance imaging (tfMRI) analysis approach, i.e., the general linear model (GLM), might have difficulty in discovering those diverse and concurrent brain responses sufficiently. This subtraction-based, model-driven approach focuses on the brain activities evoked directly by the task paradigm and thus likely overlooks other concurrent brain activities evoked during information processing. To deal with this problem, in this paper we propose a novel hybrid framework, called extendable supervised dictionary learning (E-SDL), to explore diverse and concurrent brain activities under task conditions. A critical difference between the E-SDL framework and previous methods is that we systematically extend the basic task-paradigm regressor into meaningful regressor groups to account for possible regressor variation during the information processing procedure in the brain. Applications of the proposed framework to five independent and publicly available tfMRI datasets from the human connectome project (HCP) simultaneously revealed more meaningful group-wise consistent task-evoked networks and common intrinsic connectivity networks (ICNs). These results demonstrate the advantage of the proposed framework in identifying the diversity of concurrent brain activities in tfMRI datasets.
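
    One simple way to picture extending a basic task-paradigm regressor into a regressor group is with time-shifted copies (to stand in for delayed responses) plus a temporal derivative. The specific shifts and the derivative term below are illustrative assumptions, not the paper's regressor definitions.

```python
def extend_regressor(base, shifts=(0, 2, 4)):
    """Extend one task-paradigm regressor (a boxcar time course) into a
    regressor group: shifted copies model delayed responses, and a
    finite-difference derivative models onset/offset variation."""
    n = len(base)
    group = []
    for s in shifts:
        # shift right by s samples, zero-padding the start
        group.append([0.0] * s + list(base[:n - s]))
    derivative = [0.0] + [base[t] - base[t - 1] for t in range(1, n)]
    group.append(derivative)
    return group
```

    Each column of the extended group then enters the dictionary-learning stage as a separate supervised regressor, so activity that lags or differs in shape from the canonical paradigm is no longer forced into a single regressor.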

  10. Working toward integrated models of alpine plant distribution.

    PubMed

    Carlson, Bradley Z; Randin, Christophe F; Boulangeat, Isabelle; Lavergne, Sébastien; Thuiller, Wilfried; Choler, Philippe

    2013-10-01

    Species distribution models (SDMs) have been frequently employed to forecast the response of alpine plants to global changes. Efforts to model alpine plant distribution have thus far been primarily based on a correlative approach, in which ecological processes are implicitly addressed through a statistical relationship between observed species occurrences and environmental predictors. Recent evidence, however, highlights the shortcomings of correlative SDMs, especially in alpine landscapes where plant species tend to be decoupled from atmospheric conditions in micro-topographic habitats and are particularly exposed to geomorphic disturbances. While alpine plants respond to the same limiting factors as plants found at lower elevations, alpine environments impose a particular set of scale-dependent and hierarchical drivers that shape the realized niche of species and that require explicit consideration in a modelling context. Several recent studies in the European Alps have successfully integrated both correlative and process-based elements into distribution models of alpine plants, but for the time being a single integrative modelling framework that includes all key drivers remains elusive. As a first step in working toward a comprehensive integrated model applicable to alpine plant communities, we propose a conceptual framework that structures the primary mechanisms affecting alpine plant distributions. We group processes into four categories, including multi-scalar abiotic drivers, gradient dependent species interactions, dispersal and spatial-temporal plant responses to disturbance. Finally, we propose a methodological framework aimed at developing an integrated model to better predict alpine plant distribution.

  11. An Ontology-Based Framework for Bridging Learning Design and Learning Content

    ERIC Educational Resources Information Center

    Knight, Colin; Gasevic, Dragan; Richards, Griff

    2006-01-01

    The paper describes an ontology-based framework for bridging learning design and learning object content. In present solutions, researchers have proposed conceptual models and developed tools for both of those subjects, but without detailed discussions of how they can be used together. In this paper we advocate the use of ontologies to explicitly…

  12. Evidence-Based Administration for Decision Making in the Framework of Knowledge Strategic Management

    ERIC Educational Resources Information Center

    Del Junco, Julio Garcia; Zaballa, Rafael De Reyna; de Perea, Juan Garcia Alvarez

    2010-01-01

    Purpose: This paper seeks to present a model based on evidence-based administration (EBA), which aims to facilitate the creation, transformation and diffusion of knowledge in learning organizations. Design/methodology/approach: A theoretical framework is proposed based on EBA and the case method. Accordingly, an empirical study was carried out in…

  13. Learning in the model space for cognitive fault diagnosis.

    PubMed

    Chen, Huanhuan; Tino, Peter; Rodan, Ali; Yao, Xin

    2014-01-01

    The emergence of large sensor networks has facilitated the collection of large amounts of real-time data to monitor and control complex engineering systems. However, in many cases the collected data may be incomplete or inconsistent, while the underlying environment may be time-varying or unformulated. In this paper, we develop an innovative cognitive fault diagnosis framework that tackles the above challenges. This framework investigates fault diagnosis in the model space instead of the signal space. Learning in the model space is implemented by fitting a series of models using a series of signal segments selected with a sliding window. By investigating the learning techniques in the fitted model space, faulty models can be discriminated from healthy models using a one-class learning algorithm. The framework enables us to construct a fault library when unknown faults occur, which can be regarded as cognitive fault isolation. This paper also theoretically investigates how to measure the pairwise distance between two models in the model space and incorporates the model distance into the learning algorithm in the model space. The results on three benchmark applications and one simulated model for the Barcelona water distribution network confirm the effectiveness of the proposed framework.
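
    A toy rendering of "learning in the model space", under strong simplifying assumptions (scalar AR(1) models fitted on sliding windows, and a distance-to-centre rule standing in for the paper's one-class learning algorithm):

```python
# Sketch, not the paper's method: map signal segments to fitted models,
# then discriminate faulty from healthy models in the model space.

def ar1_fit(segment):
    """Least-squares AR(1) coefficient c for x[t] ~ c * x[t-1]."""
    num = sum(segment[t] * segment[t - 1] for t in range(1, len(segment)))
    den = sum(x * x for x in segment[:-1]) or 1e-12
    return num / den

def window_models(signal, width, step):
    """Fit one model per sliding window (the signal-to-model-space map)."""
    return [ar1_fit(signal[i:i + width])
            for i in range(0, len(signal) - width + 1, step)]

def is_faulty(model, healthy_models, k=3.0):
    """One-class rule in model space: flag a model whose distance to the
    healthy cluster centre exceeds k times the healthy spread."""
    centre = sum(healthy_models) / len(healthy_models)
    spread = max(abs(m - centre) for m in healthy_models) + 1e-9
    return abs(model - centre) > k * spread
```

    The pairwise model distance here is just the absolute difference of AR coefficients; the paper develops a proper distance between fitted models and a full one-class learner, but the window-fit-compare structure is the same.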

  14. A Function-Behavior-State Approach to Designing Human Machine Interface for Nuclear Power Plant Operators

    NASA Astrophysics Data System (ADS)

    Lin, Y.; Zhang, W. J.

    2005-02-01

    This paper presents an approach to human-machine interface design for control room operators of nuclear power plants. The first step in designing an interface for a particular application is to determine the information content that needs to be displayed. The design methodology for this step is called the interface design framework (hereafter, the framework). Several frameworks have been proposed for applications at varying levels, including process plants. However, none is based on the design and manufacture of the plant system for which the interface is designed. This paper presents an interface design framework that originates from design theory and methodology for general technical systems. Specifically, the framework is based on a set of core concepts of a function-behavior-state model originally proposed by the artificial intelligence research community and widely applied in the design research community. Benefits of this new framework include the provision of a model-based fault diagnosis facility and the seamless integration of the design (manufacture, maintenance) of plants with the design of human-machine interfaces. The missing linkage between the design and operation of a plant was one of the causes of the Three Mile Island nuclear reactor incident. A simulated plant system is presented to explain how to apply this framework in designing an interface. The resulting human-machine interface is discussed; specifically, several fault diagnosis examples are elaborated to demonstrate how this interface could support operators' fault diagnosis in an unanticipated situation.

  15. The Elaborated Environmental Stress Hypothesis as a Framework for Understanding the Association Between Motor Skills and Internalizing Problems: A Mini-Review

    PubMed Central

    Mancini, Vincent O.; Rigoli, Daniela; Cairney, John; Roberts, Lynne D.; Piek, Jan P.

    2016-01-01

    Poor motor skills have been shown to be associated with a range of psychosocial issues, including internalizing problems (anxiety and depression). While well-documented empirically, our understanding of why this relationship occurs remains theoretically underdeveloped. The Elaborated Environmental Stress Hypothesis by Cairney et al. (2013) provides a promising framework that seeks to explain the association between motor skills and internalizing problems, specifically in children with developmental coordination disorder (DCD). The framework posits that poor motor skills predispose the development of internalizing problems via interactions with intermediary environmental stressors. At the time the model was proposed, limited direct evidence was available to support or refute the framework. Several studies and developments related to the framework have since been published. This mini-review seeks to provide an up-to-date overview of recent developments related to the Elaborated Environmental Stress Hypothesis. We briefly discuss the past research that led to its development, before moving to studies that have investigated the framework since it was proposed. While originally developed within the context of DCD in childhood, recent developments have found support for the model in community samples. Through the reviewed literature, this article provides support for the Elaborated Environmental Stress Hypothesis as a promising theoretical framework that explains the psychosocial correlates across the broader spectrum of motor ability. However, given its recent conceptualization, ongoing evaluation of the Elaborated Environmental Stress Hypothesis is recommended. PMID:26941690

  16. Volumetric image classification using homogeneous decomposition and dictionary learning: A study using retinal optical coherence tomography for detecting age-related macular degeneration.

    PubMed

    Albarrak, Abdulrahman; Coenen, Frans; Zheng, Yalin

    2017-01-01

    Three-dimensional (3D) (volumetric) diagnostic imaging techniques are indispensable with respect to the diagnosis and management of many medical conditions. However, there is a lack of automated diagnosis techniques to facilitate such 3D image analysis (although some support tools do exist). This paper proposes a novel framework for volumetric medical image classification founded on homogeneous decomposition and dictionary learning. In the proposed framework each image (volume) is recursively decomposed until homogeneous regions are reached. Each region is represented using a Histogram of Oriented Gradients (HOG), which is transformed into a set of feature vectors. The Gaussian Mixture Model (GMM) is then used to generate a "dictionary", and the Improved Fisher Kernel (IFK) approach is used to encode feature vectors so as to generate a single feature vector for each volume, which can then be fed into a classifier generator. The principal advantage offered by the framework is that it does not require the detection (segmentation) of specific objects within the input data. The nature of the framework is fully described. A wide range of experiments was conducted to analyse the operation of the proposed framework; these are reported fully in the paper. Although the proposed approach is generally applicable to 3D volumetric images, the focus for the work is 3D retinal Optical Coherence Tomography (OCT) images in the context of the diagnosis of Age-related Macular Degeneration (AMD). The results indicate that excellent diagnostic predictions can be produced using the proposed framework.

  17. A Unified Estimation Framework for State-Related Changes in Effective Brain Connectivity.

    PubMed

    Samdin, S Balqis; Ting, Chee-Ming; Ombao, Hernando; Salleh, Sh-Hussain

    2017-04-01

    This paper addresses the critical problem of estimating time-evolving effective brain connectivity. Current approaches based on sliding window analysis or time-varying coefficient models do not simultaneously capture both slow and abrupt changes in causal interactions between different brain regions. To overcome these limitations, we develop a unified framework based on a switching vector autoregressive (SVAR) model. Here, the dynamic connectivity regimes are uniquely characterized by distinct vector autoregressive (VAR) processes and allowed to switch between quasi-stationary brain states. The state evolution and the associated directed dependencies are defined by a Markov process and the SVAR parameters. We develop a three-stage estimation algorithm for the SVAR model: 1) feature extraction using time-varying VAR (TV-VAR) coefficients, 2) preliminary regime identification via clustering of the TV-VAR coefficients, 3) refined regime segmentation by Kalman smoothing and parameter estimation via expectation-maximization algorithm under a state-space formulation, using initial estimates from the previous two stages. The proposed framework is adaptive to state-related changes and gives reliable estimates of effective connectivity. Simulation results show that our method provides accurate regime change-point detection and connectivity estimates. In real applications to brain signals, the approach was able to capture directed connectivity state changes in functional magnetic resonance imaging data linked with changes in stimulus conditions, and in epileptic electroencephalograms, differentiating ictal from nonictal periods. The proposed framework accurately identifies state-dependent changes in brain network and provides estimates of connectivity strength and directionality. The proposed approach is useful in neuroscience studies that investigate the dynamics of underlying brain states.
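
    The first two estimation stages (time-varying coefficient extraction, then clustering for preliminary regime identification) can be sketched for a univariate signal; this simplifies the paper's SVAR to a switching AR(1), omits the Kalman-smoothing/EM refinement stage, and uses illustrative window sizes.

```python
def tv_ar1(signal, width, step):
    """Stage 1: time-varying AR(1) coefficient per sliding window
    (a scalar stand-in for TV-VAR coefficient features)."""
    coefs = []
    for i in range(0, len(signal) - width + 1, step):
        w = signal[i:i + width]
        num = sum(w[t] * w[t - 1] for t in range(1, len(w)))
        den = sum(x * x for x in w[:-1]) or 1e-12
        coefs.append(num / den)
    return coefs

def two_means(values, iters=50):
    """Stage 2: preliminary regime identification by clustering the
    coefficients into two candidate states (1-D two-means)."""
    lo, hi = min(values), max(values)
    for _ in range(iters):
        labels = [0 if abs(v - lo) <= abs(v - hi) else 1 for v in values]
        lo = sum(v for v, l in zip(values, labels) if l == 0) / max(labels.count(0), 1)
        hi = sum(v for v, l in zip(values, labels) if l == 1) / max(labels.count(1), 1)
    return labels
```

    On a signal whose AR coefficient switches abruptly, the window coefficients separate into two clusters whose labels already indicate the regime change-point, which the third (state-space) stage would then refine.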

  18. A Method for Evaluating Information Security Governance (ISG) Components in Banking Environment

    NASA Astrophysics Data System (ADS)

    Ula, M.; Ula, M.; Fuadi, W.

    2017-02-01

    As modern banking increasingly relies on the internet and computer technologies to operate businesses and market interactions, threats and security breaches have increased sharply in recent years. Insider and outsider attacks have caused global businesses to lose trillions of dollars a year. There is therefore a need for a proper framework to govern information security in the banking system. The aim of this research is to propose and design an enhanced method to evaluate information security governance (ISG) implementation in a banking environment. This research examines and compares elements from commonly used information security governance frameworks, standards and best practices, considering the strengths and weaknesses of their approaches. An initial framework for governing information security in the banking system was constructed from a document review. The framework was categorized into three levels: the governance level, the managerial level, and the technical level. The study further conducted an online survey of banking security professionals to obtain their professional judgment about the most critical ISG components and the importance of each ISG component that should be implemented in a banking environment. Data from the survey were used to construct a mathematical model for ISG evaluation, with the component-importance data used as weighting coefficients for the related components in the model. The research further develops a method for evaluating ISG implementation in banking based on this mathematical model. The proposed method was tested through a real bank case study in an Indonesian local bank. The study provides evidence that the proposed method has sufficient coverage of ISG in the banking environment and effectively evaluates ISG implementation.
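
    The weighted-coefficient evaluation model can be illustrated with a hypothetical sketch: the three levels mirror the paper's categorization, but the component names, weights, and scores below are invented for illustration, not the survey-derived values.

```python
# Illustrative weights (would come from the survey's importance data);
# weights within each level sum to 1.0.
ISG_WEIGHTS = {
    "governance": {"security_policy": 0.6, "risk_management": 0.4},
    "managerial": {"awareness_training": 0.5, "incident_response": 0.5},
    "technical":  {"access_control": 0.7, "network_security": 0.3},
}

def isg_score(assessment):
    """Weighted-sum evaluation: each component is scored on [0, 1],
    weighted within its level, and the three levels are averaged."""
    level_scores = []
    for level, comps in ISG_WEIGHTS.items():
        level_scores.append(sum(w * assessment[level][c] for c, w in comps.items()))
    return sum(level_scores) / len(level_scores)
```

    A bank scoring 1.0 on every component gets an overall 1.0; a shortfall on a heavily weighted component (e.g. the security policy) pulls the overall score down proportionally to its weight.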

  19. A statistical metadata model for clinical trials' data management.

    PubMed

    Vardaki, Maria; Papageorgiou, Haralambos; Pentaris, Fragkiskos

    2009-08-01

We introduce a statistical, process-oriented metadata model to describe the process of medical research data collection, management, results analysis and dissemination. Our approach explicitly provides a structure for the pieces of information used in Clinical Study Data Management Systems, enabling a more active role for any associated metadata. Using the object-oriented paradigm, we describe the classes of our model that participate in the design of a clinical trial and the subsequent collection and management of the relevant data. The advantage of our approach is its focus on the structural inter-relation of these classes during dataset manipulation, expressed through transformations that model the simultaneous processing of both data and metadata. Our solution reduces the possibility of human error and allows all changes made during a dataset's lifecycle to be tracked. The explicit modeling of processing steps improves data quality and assists in handling data collected in different clinical trials. A case study illustrates the applicability of the proposed framework, demonstrating conceptually the simultaneous handling of datasets collected during two randomized clinical studies. Finally, we outline the main considerations for implementing the proposed framework in a modern metadata-enabled information system.

  20. [Social determinants of odontalgia in epidemiological studies: theoretical review and proposed conceptual model].

    PubMed

    Bastos, João Luiz Dornelles; Gigante, Denise Petrucci; Peres, Karen Glazer; Nedel, Fúlvio Borges

    2007-01-01

The epidemiological literature has been limited by the absence of a theoretical framework reflecting the complexity of the causal mechanisms behind health and disease phenomena. The same lack of theory prevails in oral epidemiology, since dental caries, the leading topic in oral research, has often been studied from a biological and reductionist viewpoint. One of the most important consequences of dental caries is dental pain (odontalgia), which has received little attention in studies with sophisticated theoretical models and designs powerful enough to establish causal relationships. The purpose of this study is to review the scientific literature on the determinants of odontalgia and to discuss theories proposed to explain the phenomenon. Conceptual models and emerging theories on the social determinants of oral health are reviewed, in an attempt to build links with the bio-psychosocial pain model and to propose a more elaborate causal model for odontalgia. The framework suggests causal pathways between social structure and oral health through material, psychosocial and behavioral routes. Aspects of the social structure are highlighted and related to odontalgia, stressing their importance in discussions of causal relationships in oral health research.

  1. Hierarchy-associated semantic-rule inference framework for classifying indoor scenes

    NASA Astrophysics Data System (ADS)

    Yu, Dan; Liu, Peng; Ye, Zhipeng; Tang, Xianglong; Zhao, Wei

    2016-03-01

The initial task of classifying indoor scenes is typically challenging, because the spatial layout and decoration of a scene can vary considerably. Recent efforts at classifying object relationships commonly depend on the results of scene annotation and predefined rules, making classification inflexible; furthermore, annotation results are easily affected by external factors. Inspired by human cognition, a scene-classification framework was proposed that combines empirically based annotation (EBA) with a match-over rule-based (MRB) inference system. The semantic hierarchy of images is exploited by EBA to construct rules empirically for MRB classification. From a macro perspective, the problem of scene classification is divided into low-level annotation and high-level inference. Low-level annotation involves detecting the semantic hierarchy and annotating the scene with a deformable-parts model and a bag-of-visual-words model. In high-level inference, hierarchical rules are extracted to train the decision tree for classification, and the categories of testing samples are generated from the parts to the whole. Compared with traditional classification strategies, the proposed semantic hierarchy and corresponding rules reduce the effect of a variable background and improve classification performance. The proposed framework was evaluated on a popular indoor scene dataset, and the experimental results demonstrate its effectiveness.

  2. Stochastic Robust Mathematical Programming Model for Power System Optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Cong; Changhyeok, Lee; Haoyong, Chen

    2016-01-01

This paper presents a stochastic robust framework for two-stage power system optimization problems under uncertainty. The model optimizes the probabilistic expectation of worst-case scenarios across different uncertainty sets. A case study of unit commitment shows the effectiveness of the proposed model and algorithms.
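As a toy sketch of this two-stage structure (hypothetical numbers and cost terms, not the paper's unit-commitment model), the first-stage decision below minimizes the probability-weighted expectation of worst-case second-stage costs, where each scenario carries its own uncertainty interval:

```python
import numpy as np

def worst_case_cost(capacity, demand_interval, build_cost=1.0, shortfall_penalty=10.0):
    """Second stage: cost under the worst demand realisation in the uncertainty set."""
    lo, hi = demand_interval
    shortfall = max(0.0, hi - capacity)   # shortfall is worst at the highest demand
    return build_cost * capacity + shortfall_penalty * shortfall

def stochastic_robust_objective(capacity, scenarios):
    """First stage: probability-weighted expectation of worst-case costs."""
    return sum(p * worst_case_cost(capacity, interval) for p, interval in scenarios)

# Two scenarios, each carrying its own demand uncertainty interval.
scenarios = [(0.7, (0.8, 1.2)), (0.3, (1.5, 2.0))]
grid = np.linspace(0.0, 3.0, 301)
best = min(grid, key=lambda c: stochastic_robust_objective(c, scenarios))
print(round(float(best), 2))
```

Brute-force search over a grid stands in here for the decomposition algorithms such a formulation would require at realistic scale.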

  3. A Stochastic Framework for Modeling the Population Dynamics of Convective Clouds

    DOE PAGES

    Hagos, Samson; Feng, Zhe; Plant, Robert S.; ...

    2018-02-20

A stochastic prognostic framework for modeling the population dynamics of convective clouds and representing them in climate models is proposed. The framework follows the nonequilibrium statistical mechanical approach of constructing a master equation for the evolution of the number of convective cells of a specific size and their associated cloud-base mass flux, given a large-scale forcing. In this framework, referred to as the STOchastic framework for Modeling Population dynamics of convective clouds (STOMP), the evolution of convective cell size is predicted from three key characteristics of convective cells: (i) the probability of growth, (ii) the probability of decay, and (iii) the cloud-base mass flux. STOMP models are constructed and evaluated against CPOL radar observations at Darwin and convection-permitting model (CPM) simulations. Multiple models are constructed under various assumptions regarding these three key parameters and their realism is evaluated. It is shown that in a model where convective plumes prefer to aggregate spatially and the cloud-base mass flux is a nonlinear function of convective cell area, the mass flux manifests a recharge-discharge behavior under steady forcing. Such a model also reproduces the observed behavior of convective cell populations and CPM-simulated cloud-base mass flux variability under diurnally varying forcing. Finally, in addition to its use in developing understanding of convection processes and the controls on convective cell size distributions, this modeling framework is designed to serve as a nonequilibrium closure formulation for spectral mass flux parameterizations.
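The master-equation idea can be sketched for the simplest birth-death population, with a constant cell-birth rate and a per-cell decay rate (both rates invented for illustration; STOMP's size-resolved, mass-flux-coupled equations are richer). Its stationary distribution is Poisson with mean equal to the birth/decay ratio:

```python
import numpy as np

def evolve_master_equation(b=3.0, d=1.0, n_max=60, dt=0.001, steps=20000):
    """Integrate dP(n)/dt = b*P(n-1) + d*(n+1)*P(n+1) - (b + d*n)*P(n)
    on a truncated state space (no births out of the top state)."""
    n = np.arange(n_max + 1)
    p = np.zeros(n_max + 1)
    p[0] = 1.0                                # start with zero convective cells
    birth_out = np.where(n < n_max, b, 0.0)   # keep probability conserved at the cap
    for _ in range(steps):
        gain = np.zeros_like(p)
        gain[1:] += b * p[:-1]                # growth: a cell appears, n-1 -> n
        gain[:-1] += d * n[1:] * p[1:]        # decay: a cell dies, n+1 -> n
        p = p + dt * (gain - (birth_out + d * n) * p)
    return p

p = evolve_master_equation()
mean_cells = float(np.dot(np.arange(p.size), p))
print(round(mean_cells, 1))  # relaxes toward b/d = 3.0
```

Forward-Euler integration suffices at this small time step; probability mass stays normalised because every loss term has a matching gain term.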

  4. A Stochastic Framework for Modeling the Population Dynamics of Convective Clouds

    NASA Astrophysics Data System (ADS)

    Hagos, Samson; Feng, Zhe; Plant, Robert S.; Houze, Robert A.; Xiao, Heng

    2018-02-01

A stochastic prognostic framework for modeling the population dynamics of convective clouds and representing them in climate models is proposed. The framework follows the nonequilibrium statistical mechanical approach of constructing a master equation for the evolution of the number of convective cells of a specific size and their associated cloud-base mass flux, given a large-scale forcing. In this framework, referred to as the STOchastic framework for Modeling Population dynamics of convective clouds (STOMP), the evolution of convective cell size is predicted from three key characteristics of convective cells: (i) the probability of growth, (ii) the probability of decay, and (iii) the cloud-base mass flux. STOMP models are constructed and evaluated against CPOL radar observations at Darwin and convection-permitting model (CPM) simulations. Multiple models are constructed under various assumptions regarding these three key parameters and their realism is evaluated. It is shown that in a model where convective plumes prefer to aggregate spatially and the cloud-base mass flux is a nonlinear function of convective cell area, the mass flux manifests a recharge-discharge behavior under steady forcing. Such a model also reproduces the observed behavior of convective cell populations and CPM-simulated cloud-base mass flux variability under diurnally varying forcing. In addition to its use in developing understanding of convection processes and the controls on convective cell size distributions, this modeling framework is designed to serve as a nonequilibrium closure formulation for spectral mass flux parameterizations.

  5. A Stochastic Framework for Modeling the Population Dynamics of Convective Clouds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hagos, Samson; Feng, Zhe; Plant, Robert S.

A stochastic prognostic framework for modeling the population dynamics of convective clouds and representing them in climate models is proposed. The framework follows the nonequilibrium statistical mechanical approach of constructing a master equation for the evolution of the number of convective cells of a specific size and their associated cloud-base mass flux, given a large-scale forcing. In this framework, referred to as the STOchastic framework for Modeling Population dynamics of convective clouds (STOMP), the evolution of convective cell size is predicted from three key characteristics of convective cells: (i) the probability of growth, (ii) the probability of decay, and (iii) the cloud-base mass flux. STOMP models are constructed and evaluated against CPOL radar observations at Darwin and convection-permitting model (CPM) simulations. Multiple models are constructed under various assumptions regarding these three key parameters and their realism is evaluated. It is shown that in a model where convective plumes prefer to aggregate spatially and the cloud-base mass flux is a nonlinear function of convective cell area, the mass flux manifests a recharge-discharge behavior under steady forcing. Such a model also reproduces the observed behavior of convective cell populations and CPM-simulated cloud-base mass flux variability under diurnally varying forcing. Finally, in addition to its use in developing understanding of convection processes and the controls on convective cell size distributions, this modeling framework is designed to serve as a nonequilibrium closure formulation for spectral mass flux parameterizations.

  6. Automated Analysis of Stateflow Models

    NASA Technical Reports Server (NTRS)

    Bourbouh, Hamza; Garoche, Pierre-Loic; Garion, Christophe; Gurfinkel, Arie; Kahsaia, Temesghen; Thirioux, Xavier

    2017-01-01

Stateflow is a widely used modeling framework for embedded and cyber-physical systems in which control software interacts with physical processes. In this work, we present a fully automated safety verification technique for Stateflow models. Our approach is two-fold: (i) we faithfully compile Stateflow models into hierarchical state machines, and (ii) we use an automated logic-based verification engine to decide the validity of safety properties. The starting point of our approach is a denotational semantics of Stateflow. We propose a compilation process using continuation-passing style (CPS) denotational semantics. Our compilation technique preserves the structural and modal behavior of the system. The overall approach is implemented as an open-source toolbox that can be integrated into the existing MathWorks Simulink/Stateflow modeling framework. We present preliminary experimental evaluations that illustrate the effectiveness of our approach in code generation and safety verification of industrial-scale Stateflow models.

  7. Exploring Distributed Leadership for the Quality Management of Online Learning Environments

    ERIC Educational Resources Information Center

    Palmer, Stuart; Holt, Dale; Gosper, Maree; Sankey, Michael; Allan, Garry

    2013-01-01

    Online learning environments (OLEs) are complex information technology (IT) systems that intersect with many areas of university organisation. Distributed models of leadership have been proposed as appropriate for the good governance of OLEs. Based on theoretical and empirical research, a group of Australian universities proposed a framework for…

  8. Real Time 3D Facial Movement Tracking Using a Monocular Camera

    PubMed Central

    Dong, Yanchao; Wang, Yanming; Yue, Jiguang; Hu, Zhencheng

    2016-01-01

The paper proposes a robust framework for 3D facial movement tracking in real time using a monocular camera. It is designed to estimate the 3D face pose and local facial animation such as eyelid and mouth movement. The framework first utilizes the Discriminative Shape Regression method to locate facial feature points on the 2D image, then fuses the 2D data with a 3D face model using an Extended Kalman Filter to yield 3D facial movement information. An alternating optimization strategy is adopted to fit different persons automatically. Experiments show that the proposed framework can track 3D facial movement across various poses and illumination conditions. Given the real face scale, the framework tracks the eyelid with an error of 1 mm and the mouth with an error of 2 mm. The tracking result is reliable enough for expression analysis or mental state inference. PMID:27463714
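The fusion step can be illustrated in miniature with a scalar Kalman update, the one-dimensional analogue of the extended filter that fuses 2D landmark measurements into the 3D pose state (all numbers below are hypothetical):

```python
def kalman_update(x, p, z, r):
    """One measurement update: state mean x with variance p,
    measurement z with noise variance r."""
    k = p / (p + r)            # Kalman gain: trust data more when p >> r
    x_new = x + k * (z - x)    # corrected estimate
    p_new = (1.0 - k) * p      # uncertainty shrinks after each update
    return x_new, p_new

x, p = 0.0, 4.0                # vague prior on, say, head yaw in degrees
for z in (1.0, 1.2, 0.9):      # successive landmark-derived measurements
    x, p = kalman_update(x, p, z, r=1.0)
print(round(x, 2), p < 4.0)
```

The extended form used in the paper applies the same predict/update cycle after linearising the nonlinear projection from 3D model to 2D image points.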

  9. Real Time 3D Facial Movement Tracking Using a Monocular Camera.

    PubMed

    Dong, Yanchao; Wang, Yanming; Yue, Jiguang; Hu, Zhencheng

    2016-07-25

The paper proposes a robust framework for 3D facial movement tracking in real time using a monocular camera. It is designed to estimate the 3D face pose and local facial animation such as eyelid and mouth movement. The framework first utilizes the Discriminative Shape Regression method to locate facial feature points on the 2D image, then fuses the 2D data with a 3D face model using an Extended Kalman Filter to yield 3D facial movement information. An alternating optimization strategy is adopted to fit different persons automatically. Experiments show that the proposed framework can track 3D facial movement across various poses and illumination conditions. Given the real face scale, the framework tracks the eyelid with an error of 1 mm and the mouth with an error of 2 mm. The tracking result is reliable enough for expression analysis or mental state inference.

  10. A Novel Multi-Class Ensemble Model for Classifying Imbalanced Biomedical Datasets

    NASA Astrophysics Data System (ADS)

    Bikku, Thulasi; Sambasiva Rao, N., Dr; Rao, Akepogu Ananda, Dr

    2017-08-01

This paper focuses on developing a Hadoop-based framework of feature selection and classification models to classify high-dimensional data in heterogeneous biomedical databases. Extensive research has been performed in machine learning, big data and data mining for identifying patterns; the main challenge is extracting useful features from diverse biological systems. The proposed model can be used for predicting diseases in various applications and identifying the features relevant to particular diseases. With the exponential growth of biomedical repositories such as PubMed and Medline, an accurate predictive model is essential for knowledge discovery in a Hadoop environment. Extracting key features from unstructured documents often leads to uncertain results due to outliers and missing values. In this paper, we propose a two-phase map-reduce framework with a text preprocessor and a classification model. In the first phase, a mapper-based preprocessing method was designed to eliminate irrelevant features, missing values and outliers from the biomedical data. In the second phase, a map-reduce-based multi-class ensemble decision tree model was designed and applied to the preprocessed mapper output to improve the true positive rate and computational time. Experimental results on complex biomedical datasets show that the proposed Hadoop-based multi-class ensemble model significantly outperforms state-of-the-art baselines.
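The two-phase design can be sketched schematically, with toy records and a majority vote standing in for the multi-class ensemble decision tree (field names, thresholds and labels below are invented):

```python
from collections import Counter

records = [
    {"gene_score": 0.9, "label": "disease"},
    {"gene_score": None, "label": "disease"},   # missing value: dropped in phase 1
    {"gene_score": 9.5, "label": "healthy"},    # outlier: dropped in phase 1
    {"gene_score": 0.4, "label": "healthy"},
    {"gene_score": 0.8, "label": "disease"},
]

def preprocess_mapper(record, low=0.0, high=1.0):
    """Phase 1: emit only records with a valid, in-range feature value."""
    x = record["gene_score"]
    if x is None or not (low <= x <= high):
        return []
    return [(record["label"], x)]

def vote_reducer(pairs):
    """Phase 2: majority vote over mapper output, a stand-in for the ensemble."""
    return Counter(label for label, _ in pairs).most_common(1)[0][0]

mapped = [pair for r in records for pair in preprocess_mapper(r)]
print(len(mapped), vote_reducer(mapped))
```

In the actual framework both phases run as Hadoop map-reduce jobs over distributed data; the pure-Python pipeline only mirrors the dataflow.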

  11. Modeling and simulating industrial land-use evolution in Shanghai, China

    NASA Astrophysics Data System (ADS)

    Qiu, Rongxu; Xu, Wei; Zhang, John; Staenz, Karl

    2018-01-01

    This study proposes a cellular automata-based Industrial and Residential Land Use Competition Model to simulate the dynamic spatial transformation of industrial land use in Shanghai, China. In the proposed model, land development activities in a city are delineated as competitions among different land-use types. The Hedonic Land Pricing Model is adopted to implement the competition framework. To improve simulation results, the Land Price Agglomeration Model was devised to simulate and adjust classic land price theory. A new evolutionary algorithm-based parameter estimation method was devised in place of traditional methods. Simulation results show that the proposed model closely resembles actual land transformation patterns and the model can not only simulate land development, but also redevelopment processes in metropolitan areas.

  12. Enabling Big Geoscience Data Analytics with a Cloud-Based, MapReduce-Enabled and Service-Oriented Workflow Framework

    PubMed Central

    Li, Zhenlong; Yang, Chaowei; Jin, Baoxuan; Yu, Manzhu; Liu, Kai; Sun, Min; Zhan, Matthew

    2015-01-01

Geoscience observations and model simulations are generating vast amounts of multi-dimensional data. Effectively analyzing these data is essential for geoscience studies. However, the task is challenging for geoscientists because processing the massive amounts of data is both computing- and data-intensive, and data analytics requires complex procedures and multiple tools. To tackle these challenges, a scientific workflow framework is proposed for big geoscience data analytics. The framework leverages cloud computing, MapReduce, and Service-Oriented Architecture (SOA). Specifically, HBase is adopted for storing and managing big geoscience data across distributed computers; a MapReduce-based algorithm framework is developed to support parallel processing of geoscience data; and a service-oriented workflow architecture is built to support on-demand complex data analytics in the cloud environment. A proof-of-concept prototype tests the performance of the framework. Results show that this framework significantly improves the efficiency of big geoscience data analytics by reducing data processing time and simplifying analytical procedures for geoscientists. PMID:25742012

  13. Toward Global Comparability of Sexual Orientation Data in Official Statistics: A Conceptual Framework of Sexual Orientation for Health Data Collection in New Zealand's Official Statistics System

    PubMed Central

    Gray, Alistair; Veale, Jaimie F.; Binson, Diane; Sell, Randell L.

    2013-01-01

    Objective. Effectively addressing health disparities experienced by sexual minority populations requires high-quality official data on sexual orientation. We developed a conceptual framework of sexual orientation to improve the quality of sexual orientation data in New Zealand's Official Statistics System. Methods. We reviewed conceptual and methodological literature, culminating in a draft framework. To improve the framework, we held focus groups and key-informant interviews with sexual minority stakeholders and producers and consumers of official statistics. An advisory board of experts provided additional guidance. Results. The framework proposes working definitions of the sexual orientation topic and measurement concepts, describes dimensions of the measurement concepts, discusses variables framing the measurement concepts, and outlines conceptual grey areas. Conclusion. The framework proposes standard definitions and concepts for the collection of official sexual orientation data in New Zealand. It presents a model for producers of official statistics in other countries, who wish to improve the quality of health data on their citizens. PMID:23840231

  14. Enabling big geoscience data analytics with a cloud-based, MapReduce-enabled and service-oriented workflow framework.

    PubMed

    Li, Zhenlong; Yang, Chaowei; Jin, Baoxuan; Yu, Manzhu; Liu, Kai; Sun, Min; Zhan, Matthew

    2015-01-01

Geoscience observations and model simulations are generating vast amounts of multi-dimensional data. Effectively analyzing these data is essential for geoscience studies. However, the task is challenging for geoscientists because processing the massive amounts of data is both computing- and data-intensive, and data analytics requires complex procedures and multiple tools. To tackle these challenges, a scientific workflow framework is proposed for big geoscience data analytics. The framework leverages cloud computing, MapReduce, and Service-Oriented Architecture (SOA). Specifically, HBase is adopted for storing and managing big geoscience data across distributed computers; a MapReduce-based algorithm framework is developed to support parallel processing of geoscience data; and a service-oriented workflow architecture is built to support on-demand complex data analytics in the cloud environment. A proof-of-concept prototype tests the performance of the framework. Results show that this framework significantly improves the efficiency of big geoscience data analytics by reducing data processing time and simplifying analytical procedures for geoscientists.

  15. A business rules design framework for a pharmaceutical validation and alert system.

    PubMed

    Boussadi, A; Bousquet, C; Sabatier, B; Caruba, T; Durieux, P; Degoulet, P

    2011-01-01

Several alert systems have been developed to improve the patient safety aspects of clinical information systems (CIS). Most studies have focused on the evaluation of these systems, with little information provided about the methodology leading to system implementation. We propose here an 'agile' business rule design framework (BRDF) supporting both the design of alerts for the validation of drug prescriptions and the incorporation of the end user into the design process. We analyzed the unified process (UP) design life cycle and defined the activities, subactivities, actors and UML artifacts that could enhance the agility of the proposed framework. We then applied the framework to two different sets of data in the context of the Georges Pompidou University Hospital (HEGP) CIS. We introduced two new subactivities into UP: business rule specification and business rule instantiation. The pharmacist contributed effectively to five of the eight BRDF design activities. The two new subactivities were validated in the context of drug dosage adaptation to patients' clinical and biological contexts. A pilot experiment shows that business rules modeled with BRDF and implemented as an alert system triggered an alert for 5824 of the 71,413 prescriptions considered (8.16%). A business rule design framework meets one of the strategic objectives of decision-support design by taking into account three important criteria that pose a particular challenge to system designers: 1) business processes, 2) knowledge modeling of the context of application, and 3) the agility of the various design steps.
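The core of such a rule base can be sketched as named predicates over prescription records; the rules, drug names and fields below are illustrative stand-ins, not the hospital's actual rule base:

```python
# Each business rule pairs a name with a predicate over one prescription.
RULES = [
    ("max-dose", lambda p: p["dose_mg"] > p["max_daily_mg"]),
    ("renal-adjust", lambda p: p["drug"] == "metformin"
                               and p["creatinine_clearance"] < 30),
]

def validate(prescription):
    """Return the names of all rules that fire (an alert per matched rule)."""
    return [name for name, pred in RULES if pred(prescription)]

prescriptions = [
    {"drug": "amoxicillin", "dose_mg": 1500, "max_daily_mg": 3000,
     "creatinine_clearance": 90},
    {"drug": "metformin", "dose_mg": 2000, "max_daily_mg": 2550,
     "creatinine_clearance": 25},
    {"drug": "ibuprofen", "dose_mg": 4000, "max_daily_mg": 2400,
     "creatinine_clearance": 80},
]

alerts = {p["drug"]: validate(p) for p in prescriptions}
fired = sum(1 for v in alerts.values() if v)
print(fired, alerts["metformin"])
```

Keeping rules as data (name plus predicate) is what makes the "agile" part possible: the pharmacist can specify and instantiate rules without touching the validation engine.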

  16. Modelling of anisotropic growth in biological tissues. A new approach and computational aspects.

    PubMed

    Menzel, A

    2005-03-01

In this contribution, we develop a theoretical and computational framework for anisotropic growth phenomena. As a key idea of the proposed phenomenological approach, a fibre, or rather structural, tensor is introduced, which allows the description of transversely isotropic material behaviour. Based on this additional argument, anisotropic growth is modelled via appropriate evolution equations for the fibre, while volumetric remodelling is realised by an evolution of the referential density. Both the strength of the fibre and the density follow Wolff-type laws. We elaborate, however, on two different approaches for the evolution of the fibre direction, namely an alignment with respect to strain or with respect to stress. One of the main benefits of the developed framework is therefore the opportunity to address the evolution of the fibre strength and the fibre direction separately. It is then straightforward to set up appropriate integration algorithms such that the developed framework fits nicely into common finite element schemes. Finally, several numerical examples underline the applicability of the proposed formulation.

  17. Multi-Dielectric Brownian Dynamics and Design-Space-Exploration Studies of Permeation in Ion Channels.

    PubMed

    Siksik, May; Krishnamurthy, Vikram

    2017-09-01

    This paper proposes a multi-dielectric Brownian dynamics simulation framework for design-space-exploration (DSE) studies of ion-channel permeation. The goal of such DSE studies is to estimate the channel modeling-parameters that minimize the mean-squared error between the simulated and expected "permeation characteristics." To address this computational challenge, we use a methodology based on statistical inference that utilizes the knowledge of channel structure to prune the design space. We demonstrate the proposed framework and DSE methodology using a case study based on the KcsA ion channel, in which the design space is successfully reduced from a 6-D space to a 2-D space. Our results show that the channel dielectric map computed using the framework matches with that computed directly using molecular dynamics with an error of 7%. Finally, the scalability and resolution of the model used are explored, and it is shown that the memory requirements needed for DSE remain constant as the number of parameters (degree of heterogeneity) increases.

  18. Real-time GIS data model and sensor web service platform for environmental data management.

    PubMed

    Gong, Jianya; Geng, Jing; Chen, Zeqiang

    2015-01-09

Effective environmental data management is important for human health. In the past, it typically involved developing a dedicated environmental data management system, an approach that often lacks real-time data retrieval and sharing/interoperation capabilities. With the development of information technology, the Geospatial Service Web has been proposed as an approach to environmental data management. The purpose of this study is to determine how to realize environmental data management under the Geospatial Service Web framework. To that end, a real-time GIS (Geographic Information System) data model and a Sensor Web service platform are proposed. The real-time GIS data model manages real-time data, and the Sensor Web service platform, built on Sensor Web technologies, supports the realization of the data model. A Sensor Web service platform was implemented, in which real-time environmental data such as meteorological, air quality, soil moisture, soil temperature, and landslide data are managed. In addition, two use cases based on the real-time GIS data model, real-time air quality monitoring and real-time soil moisture monitoring, were realized and demonstrated on the platform. The total processing times in the two experiments were 3.7 s and 9.2 s. The experimental results show that integrating a real-time GIS data model with a Sensor Web service platform is an effective way to manage environmental data under the Geospatial Service Web framework.

  19. Predicting visual semantic descriptive terms from radiological image data: preliminary results with liver lesions in CT.

    PubMed

    Depeursinge, Adrien; Kurtz, Camille; Beaulieu, Christopher; Napel, Sandy; Rubin, Daniel

    2014-08-01

    We describe a framework to model visual semantics of liver lesions in CT images in order to predict the visual semantic terms (VST) reported by radiologists in describing these lesions. Computational models of VST are learned from image data using linear combinations of high-order steerable Riesz wavelets and support vector machines (SVM). In a first step, these models are used to predict the presence of each semantic term that describes liver lesions. In a second step, the distances between all VST models are calculated to establish a nonhierarchical computationally-derived ontology of VST containing inter-term synonymy and complementarity. A preliminary evaluation of the proposed framework was carried out using 74 liver lesions annotated with a set of 18 VSTs from the RadLex ontology. A leave-one-patient-out cross-validation resulted in an average area under the ROC curve of 0.853 for predicting the presence of each VST. The proposed framework is expected to foster human-computer synergies for the interpretation of radiological images while using rotation-covariant computational models of VSTs to 1) quantify their local likelihood and 2) explicitly link them with pixel-based image content in the context of a given imaging domain.
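The second step, deriving term relationships from distances between per-term models, can be sketched with made-up term names and weight vectors: nearly parallel models flag candidate synonyms, while near-orthogonal ones suggest complementary terms.

```python
import numpy as np

# Hypothetical per-term model weights (the paper learns these from
# steerable Riesz wavelet features with SVMs; these vectors are invented).
terms = ["hypodense", "low-attenuation", "calcified"]
models = np.array([
    [0.90, 0.10, 0.00],   # "hypodense"
    [0.85, 0.15, 0.05],   # "low-attenuation", nearly parallel to the first
    [0.00, 0.20, 0.95],   # "calcified"
])

def cosine_distance(u, v):
    return 1.0 - np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

pairs = {(terms[i], terms[j]): cosine_distance(models[i], models[j])
         for i in range(len(terms)) for j in range(i + 1, len(terms))}
closest = min(pairs, key=pairs.get)
print(closest)  # the most similar model pair suggests synonymy
```

The full nonhierarchical ontology in the paper is the complete matrix of such inter-model distances, not just the closest pair.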

  20. Towards a voxel-based geographic automata for the simulation of geospatial processes

    NASA Astrophysics Data System (ADS)

    Jjumba, Anthony; Dragićević, Suzana

    2016-07-01

    Many geographic processes evolve in a three dimensional space and time continuum. However, when they are represented with the aid of geographic information systems (GIS) or geosimulation models they are modelled in a framework of two-dimensional space with an added temporal component. The objective of this study is to propose the design and implementation of voxel-based automata as a methodological approach for representing spatial processes evolving in the four-dimensional (4D) space-time domain. Similar to geographic automata models which are developed to capture and forecast geospatial processes that change in a two-dimensional spatial framework using cells (raster geospatial data), voxel automata rely on the automata theory and use three-dimensional volumetric units (voxels). Transition rules have been developed to represent various spatial processes which range from the movement of an object in 3D to the diffusion of airborne particles and landslide simulation. In addition, the proposed 4D models demonstrate that complex processes can be readily reproduced from simple transition functions without complex methodological approaches. The voxel-based automata approach provides a unique basis to model geospatial processes in 4D for the purpose of improving representation, analysis and understanding their spatiotemporal dynamics. This study contributes to the advancement of the concepts and framework of 4D GIS.
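A minimal voxel automaton in this spirit is a 3D grid in which each voxel updates from its six face neighbours; the toy diffusion rule below (grid size, rate and periodic boundaries are arbitrary choices, not the paper's transition rules) mimics the airborne-particle example:

```python
import numpy as np

def step(grid, rate=0.1):
    """One synchronous update: each voxel exchanges a fraction of its
    content with its six face neighbours (periodic boundaries via roll)."""
    new = grid.copy()
    for axis in range(3):
        for shift in (1, -1):
            new += rate * (np.roll(grid, shift, axis=axis) - grid)
    return new

grid = np.zeros((10, 10, 10))
grid[5, 5, 5] = 1.0                 # a point release of airborne particles
for _ in range(50):
    grid = step(grid)

print(round(float(grid.sum()), 6), float(grid[5, 5, 5]) < 1.0)
```

Because every outflow term has a matching inflow term, total mass is conserved exactly while the release spreads; richer processes (landslides, object movement) swap in different transition functions over the same voxel lattice.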

  1. Enhancing a socio-hydrological modelling framework through field observations: a case study in India

    NASA Astrophysics Data System (ADS)

    den Besten, Nadja; Pande, Saket; Savenije, Huub H. G.

    2016-04-01

Recently, a smallholder socio-hydrological modelling framework was proposed and deployed to understand the dynamics underlying the agrarian crisis in the Maharashtra state of India. It was found that cotton and sugarcane smallholders who lack irrigation and storage techniques are most susceptible to distress. This study expands the application of the modelling framework to other crops abundant in Maharashtra, such as paddy, jowar and soyabean, to assess whether the conclusions on the possible causes of smallholder distress still hold. Further, fieldwork will be undertaken in March 2016 in the district of Pune, during which 50 smallholders will be interviewed and the socio-hydrological assumptions behind the hydrology and capital equations, and the corresponding closure relationships incorporated in the current model, will be put to the test. Beyond testing these assumptions, the questionnaires will be used to better understand the hydrological reality of the farm holders in terms of water usage and storage capacity. In combination with historical records of the smallholders' socio-economic data acquired over the last thirty years, available through several NGOs in the region, this will enhance the socio-hydrological realism of the modelling framework. The preliminary outcomes of a desktop study show the potential of a water-centric modelling framework for understanding the constraints on smallholder farming. The results and methods described can guide subsequent research on the modelling framework: a first step toward testing the framework in multiple rural locations around the globe.

  2. The use of technology to promote vaccination: A social ecological model based framework.

    PubMed

    Kolff, Chelsea A; Scott, Vanessa P; Stockwell, Melissa S

    2018-05-21

    Vaccinations are an important and effective cornerstone of preventive medical care. Growing technologic capabilities and use by both patients and providers present critical opportunities to leverage these tools to improve vaccination rates and public health. We propose the Social Ecological Model as a useful theoretical framework to identify areas in which technology has been or may be leveraged to target undervaccination across the individual, interpersonal, organizational, community, and society levels and the ways in which these levels interact.

  3. Quantified choice of root-mean-square errors of approximation for evaluation and power analysis of small differences between structural equation models.

    PubMed

    Li, Libo; Bentler, Peter M

    2011-06-01

    MacCallum, Browne, and Cai (2006) proposed a new framework for evaluation and power analysis of small differences between nested structural equation models (SEMs). In their framework, the null and alternative hypotheses for testing a small difference in fit and its related power analyses were defined by some chosen root-mean-square error of approximation (RMSEA) pairs. In this article, we develop a new method that quantifies those chosen RMSEA pairs and allows a quantitative comparison of them. Our method proposes the use of single RMSEA values to replace the choice of RMSEA pairs for model comparison and power analysis, thus avoiding the differential meaning of the chosen RMSEA pairs inherent in the approach of MacCallum et al. (2006). With this choice, the conventional cutoff values in model overall evaluation can directly be transferred and applied to the evaluation and power analysis of model differences. © 2011 American Psychological Association
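    The quantification idea can be sketched numerically. In the MacCallum-style setup, an RMSEA pair (ε_A, ε_B) for nested models with degrees of freedom (df_A, df_B) maps to a noncentrality parameter δ = (N − 1)(df_A·ε_A² − df_B·ε_B²), and power follows from two noncentral chi-square distributions. The sketch below assumes this standard machinery; the numeric inputs are hypothetical, not taken from the article.

    ```python
    from scipy.stats import ncx2

    def power_nested(null_pair, alt_pair, df_a, df_b, n, alpha=0.05):
        """Power for testing a small difference in fit between nested SEMs,
        given null and alternative RMSEA pairs (eps_a, eps_b)."""
        df_d = df_a - df_b                                  # df of the difference test
        ncp = lambda ea, eb: (n - 1) * (df_a * ea**2 - df_b * eb**2)
        crit = ncx2.ppf(1 - alpha, df_d, ncp(*null_pair))   # critical value under H0
        return ncx2.sf(crit, df_d, ncp(*alt_pair))          # power under H1

    # Hypothetical example: df_A = 22, df_B = 20, N = 200.
    p = power_nested((0.05, 0.05), (0.08, 0.05), 22, 20, 200)
    print(0.0 < p < 1.0)  # → True
    ```

    Replacing each RMSEA pair with a single RMSEA value, as the article proposes, amounts to constraining how the pair enters the noncentrality parameter.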

  4. Discriminative Nonlinear Analysis Operator Learning: When Cosparse Model Meets Image Classification.

    PubMed

    Wen, Zaidao; Hou, Biao; Jiao, Licheng

    2017-05-03

    Linear-synthesis-model-based dictionary learning frameworks have achieved remarkable performance in image classification in the last decade. Behaving as a generative feature model, however, such a framework suffers from some intrinsic deficiencies. In this paper, we propose a novel parametric nonlinear analysis cosparse model (NACM) with which a unique feature vector can be much more efficiently extracted. Additionally, we provide a deeper insight to demonstrate that NACM is capable of simultaneously learning the task-adapted feature transformation and regularization, encoding our preferences, domain prior knowledge and task-oriented supervised information into the features. The proposed NACM is devoted to the classification task as a discriminative feature model and yields a novel discriminative nonlinear analysis operator learning framework (DNAOL). The theoretical analysis and experimental performance clearly demonstrate that DNAOL not only achieves better, or at least competitive, classification accuracies compared with state-of-the-art algorithms, but also dramatically reduces the time complexity in both the training and testing phases.

  5. Theoretical aspect of suitable spatial boundary condition specified for adjoint model on limited area

    NASA Astrophysics Data System (ADS)

    Wang, Yuan; Wu, Rongsheng

    2001-12-01

    Theoretical argumentation for the so-called suitable spatial boundary condition is conducted with the aid of a homotopy framework, to demonstrate that the proposed boundary condition guarantees that the over-specification of boundary conditions arising from an adjoint model on a limited area is no longer an issue, while preserving well-posedness and optimal character in the boundary setting. The ill-posedness of over-specified spatial boundary conditions is, in a sense, inevitable in an adjoint model, since data assimilation processes have to adapt to prescribed observations that are typically over-specified at the spatial boundaries of the modelling domain. From the viewpoint of pragmatic implementation, the theoretical framework of our proposed condition for spatial boundaries can be reduced to a hybrid formulation of a nudging filter, a radiation condition taking account of ambient forcing, and a Dirichlet-type boundary condition compatible with the observations prescribed in the data assimilation procedure. All of these treatments are, no doubt, very familiar to mesoscale modellers.

  6. Framework for Detection and Localization of Extreme Climate Event with Pixel Recursive Super Resolution

    NASA Astrophysics Data System (ADS)

    Kim, S. K.; Lee, J.; Zhang, C.; Ames, S.; Williams, D. N.

    2017-12-01

    Deep learning techniques have been successfully applied to solve many problems in climate and geoscience using massive-scale observed and modeled data. For extreme climate event detection, several models based on deep neural networks have recently been proposed and attain superior performance that overshadows all previous handcrafted expert-based methods. The issue arising, though, is that accurate localization of events requires high-quality climate data. In this work, we propose a framework capable of detecting and localizing extreme climate events in very coarse climate data. Our framework is based on two models using deep neural networks: (1) convolutional neural networks (CNNs) to detect and localize extreme climate events, and (2) a pixel recursive super resolution model to reconstruct high-resolution climate data from low-resolution climate data. Based on our preliminary work, we present two CNNs in our framework for different purposes, detection and localization. Our results using CNNs for extreme climate event detection show that simple neural nets can capture the pattern of extreme climate events with high accuracy from very coarse reanalysis data. However, localization accuracy is relatively low due to the coarse resolution. To resolve this issue, the pixel recursive super resolution model reconstructs the resolution of the input to the localization CNNs. We present the best network using the pixel recursive super resolution model, which synthesizes details of tropical cyclones in ground truth data while enhancing their resolution. This approach not only dramatically reduces the human effort, but also suggests the possibility of reducing the computing cost required for the downscaling process used to increase the resolution of data.
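    The resolution argument can be made concrete with a toy sketch: localizing a peak on a coarse field versus on an upsampled version of it. Cubic interpolation here is only a crude stand-in for the pixel recursive super resolution model, and the synthetic field is hypothetical.

    ```python
    import numpy as np
    from scipy.ndimage import zoom

    # Synthetic coarse frame: a smooth cyclone-like peak sampled on an 8x8
    # grid, with its true centre at the fractional position (4.0, 4.4).
    yy, xx = np.mgrid[0:8, 0:8]
    coarse = np.exp(-((yy - 4.0) ** 2 + (xx - 4.4) ** 2) / 2.0)

    # On the coarse grid, localization snaps to the nearest whole cell:
    cy, cx = np.unravel_index(np.argmax(coarse), coarse.shape)
    print(cy, cx)  # → 4 4

    # After 4x upsampling (the super-resolution stand-in), the argmax lands
    # on a finer grid, giving a sub-cell estimate of the centre.
    fine = zoom(coarse, 4, order=3)
    fy, fx = np.unravel_index(np.argmax(fine), fine.shape)
    scale = (coarse.shape[0] - 1) / (fine.shape[0] - 1)  # endpoint-aligned mapping
    print(round(fy * scale, 1), round(fx * scale, 1))
    ```

    The second estimate recovers the fractional position that the coarse grid cannot express, which is the effect the framework exploits for localization.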

  7. Robust model predictive control for multi-step short range spacecraft rendezvous

    NASA Astrophysics Data System (ADS)

    Zhu, Shuyi; Sun, Ran; Wang, Jiaolong; Wang, Jihe; Shao, Xiaowei

    2018-07-01

    This work presents a robust model predictive control (MPC) approach for the multi-step short range spacecraft rendezvous problem. During the short range phase concerned, the chaser is assumed to be initially outside the line-of-sight (LOS) cone. The rendezvous process therefore naturally includes two steps: the first is to transfer the chaser into the LOS cone, and the second is to transfer the chaser into the aimed region with its motion confined within the LOS cone. A novel MPC framework named Mixed MPC (M-MPC) is proposed, which combines the Variable-Horizon MPC (VH-MPC) framework and the Fixed-Instant MPC (FI-MPC) framework. The M-MPC framework enables the optimization for the two steps to be implemented jointly rather than separated artificially, and its computation workload is acceptable for the usually low-power processors onboard spacecraft. Then, considering that disturbances including modeling error, sensor noise and thrust uncertainty may induce undesired constraint violations, a robust technique is developed and attached to the M-MPC framework to form a robust M-MPC approach. The robust technique is based on the chance-constrained idea, which ensures that constraints are satisfied with a prescribed probability. It improves on the robust technique proposed by Gavilan et al. by eliminating unnecessary conservativeness through the explicit incorporation of known statistical properties of the navigation uncertainty. The efficacy of the robust M-MPC approach is shown in a simulation study.
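    To make the fixed-horizon planning idea concrete, here is a minimal sketch of a horizon-length transfer plan for a double-integrator stand-in for the relative dynamics. This is an illustrative toy, not the authors' M-MPC: it uses unconstrained least squares instead of constrained optimization, ignores the LOS cone, thrust limits and disturbances, and the dynamics are not the Clohessy-Wiltshire equations.

    ```python
    import numpy as np

    # Double-integrator stand-in for the relative dynamics (dt = 1 s);
    # state x = [position, velocity], control u = thrust acceleration.
    A = np.array([[1.0, 1.0],
                  [0.0, 1.0]])
    B = np.array([[0.5],
                  [1.0]])

    def plan(x0, target, horizon):
        """Fixed-horizon plan: pick u_0..u_{H-1} reaching `target` with
        minimum-norm effort (unconstrained least squares)."""
        # x_H = A^H x0 + sum_k A^(H-1-k) B u_k, which is linear in u.
        G = np.hstack([np.linalg.matrix_power(A, horizon - 1 - k) @ B
                       for k in range(horizon)])
        rhs = target - np.linalg.matrix_power(A, horizon) @ x0
        u, *_ = np.linalg.lstsq(G, rhs, rcond=None)
        return u

    x0 = np.array([100.0, 0.0])                 # chaser 100 m out, at rest
    u = plan(x0, np.array([0.0, 0.0]), horizon=10)

    x = x0
    for uk in u:                                # roll the plan forward
        x = A @ x + B[:, 0] * uk
    print(np.allclose(x, [0.0, 0.0], atol=1e-6))  # → True
    ```

    An MPC scheme would re-solve this plan at every step from the measured state; the M-MPC contribution is in how the two rendezvous steps and their constraints enter one joint optimization.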

  8. Learning, Behaviour and Reaction Framework: A Model for Training Raters to Improve Assessment Quality

    ERIC Educational Resources Information Center

    Chen, Chung-Yang; Chang, Huiju; Hsu, Wen-Chin; Sheen, Gwo-Ji

    2017-01-01

    This paper proposes a training model for raters, with the goal to improve the intra- and inter-consistency of evaluation quality for higher education curricula. The model, termed the learning, behaviour and reaction (LBR) circular training model, is an interdisciplinary application from the business and organisational training domain. The…

  9. Argumentation in Science Education: A Model-based Framework

    NASA Astrophysics Data System (ADS)

    Böttcher, Florian; Meisert, Anke

    2011-02-01

    The goal of this article is threefold: First, the theoretical background for a model-based framework of argumentation to describe and evaluate argumentative processes in science education is presented. Based on the general model-based perspective in cognitive science and the philosophy of science, it is proposed that arguments be understood as reasons for the appropriateness of a theoretical model which explains a certain phenomenon. Argumentation is considered to be the process of the critical evaluation of such a model, if necessary in relation to alternative models. Second, some methodological details are exemplified for the use of a model-based analysis in the concrete classroom context. Third, the application of the approach is presented in comparison with other analytical models to demonstrate the explicatory power and depth of the model-based perspective. Primarily, Toulmin's framework for structurally analysing arguments is contrasted with the approach presented here. It is demonstrated how common methodological and theoretical problems in the context of Toulmin's framework can be overcome through a model-based perspective. Additionally, a second, more complex argumentative sequence is analysed according to the proposed analytical scheme to give a broader impression of its potential in practical use.

  10. An ontology for component-based models of water resource systems

    NASA Astrophysics Data System (ADS)

    Elag, Mostafa; Goodall, Jonathan L.

    2013-08-01

    Component-based modeling is an approach for simulating water resource systems where a model is composed of a set of components, each with a defined modeling objective, interlinked through data exchanges. Component-based modeling frameworks are used within the hydrologic, atmospheric, and earth surface dynamics modeling communities. While these efforts have been advancing, it has become clear that the water resources modeling community in particular, and arguably the larger earth science modeling community as well, faces a challenge of fully and precisely defining the metadata for model components. The lack of a unified framework for model component metadata limits interoperability between modeling communities and the reuse of models across modeling frameworks due to ambiguity about the model and its capabilities. To address this need, we propose an ontology for water resources model components that describes core concepts and relationships using the Web Ontology Language (OWL). The ontology that we present, which is termed the Water Resources Component (WRC) ontology, is meant to serve as a starting point that can be refined over time through engagement by the larger community until a robust knowledge framework for water resource model components is achieved. This paper presents the methodology used to arrive at the WRC ontology, the WRC ontology itself, and examples of how the ontology can aid in component-based water resources modeling by (i) assisting in identifying relevant models, (ii) encouraging proper model coupling, and (iii) facilitating interoperability across earth science modeling frameworks.

  11. Is there something quantum-like about the human mental lexicon?

    PubMed Central

    Bruza, Peter; Kitto, Kirsty; Nelson, Douglas; McEvoy, Cathy

    2010-01-01

    Following an early claim by Nelson & McEvoy (35) suggesting that word associations can display ‘spooky action at a distance behaviour’, a serious investigation of the potentially quantum nature of such associations is currently underway. In this paper, quantum theory is proposed as a framework suitable for modelling the human mental lexicon, specifically the results obtained from both intralist and extralist word association experiments. Some initial models exploring this hypothesis are discussed, and experiments capable of testing these models are proposed. PMID:20224806

  12. Systematic narrative review of decision frameworks to select the appropriate modelling approaches for health economic evaluations.

    PubMed

    Tsoi, B; O'Reilly, D; Jegathisawaran, J; Tarride, J-E; Blackhouse, G; Goeree, R

    2015-06-17

    In constructing or appraising a health economic model, an early consideration is whether the modelling approach selected is appropriate for the given decision problem. Frameworks and taxonomies that distinguish between modelling approaches can help make this decision more systematic, and this study aims to identify and compare the decision frameworks proposed to date on this topic. A systematic review was conducted to identify frameworks from peer-reviewed and grey literature sources. The following databases were searched: OVID Medline and EMBASE; Wiley's Cochrane Library and Health Economic Evaluation Database; PubMed; and ProQuest. Eight decision frameworks were identified, each focusing on a different set of modelling approaches and employing a different collection of selection criteria. The selection criteria can be categorized as either: (i) structural features (i.e. technical elements that are factual in nature) or (ii) practical considerations (i.e. context-dependent attributes). The most commonly mentioned structural features were population resolution (i.e. aggregate vs. individual) and interactivity (i.e. static vs. dynamic). Furthermore, understanding the needs of the end-users and stakeholders was frequently incorporated as a criterion within these frameworks. There is presently no universally accepted framework for selecting an economic modelling approach. Rather, each framework highlights different criteria that may be of importance when determining whether a modelling approach is appropriate. Further discussion is thus necessary, as the modelling approach selected will impact the validity of the underlying economic model and have downstream implications for its efficiency, transparency and relevance to decision-makers.
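    As an illustration of how the two most-cited structural features could drive a first-pass screen, consider this toy rule. The labels and the mapping are hypothetical simplifications for illustration, not drawn from any of the eight reviewed frameworks, and they ignore the practical considerations the review also identifies.

    ```python
    def suggest_approach(individual_level, dynamic_interaction):
        """Toy first-pass screen on two structural features:
        population resolution (aggregate vs. individual) and
        interactivity (static vs. dynamic)."""
        if individual_level and dynamic_interaction:
            return "agent-based simulation"
        if individual_level:
            return "individual-level state-transition (microsimulation)"
        if dynamic_interaction:
            return "dynamic transmission (compartmental) model"
        return "cohort Markov model / decision tree"

    print(suggest_approach(False, True))  # → dynamic transmission (compartmental) model
    ```

    A real framework would add the practical, context-dependent criteria (data availability, end-user needs, transparency) on top of such structural screening.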

  13. Evidencing Learning Outcomes: A Multi-Level, Multi-Dimensional Course Alignment Model

    ERIC Educational Resources Information Center

    Sridharan, Bhavani; Leitch, Shona; Watty, Kim

    2015-01-01

    This conceptual framework proposes a multi-level, multi-dimensional course alignment model to implement a contextualised constructive alignment of rubric design that authentically evidences and assesses learning outcomes. By embedding quality control mechanisms at each level for each dimension, this model facilitates the development of an aligned…

  14. Robust Decision Making in a Nonlinear World

    ERIC Educational Resources Information Center

    Dougherty, Michael R.; Thomas, Rick P.

    2012-01-01

    The authors propose a general modeling framework called the general monotone model (GeMM), which allows one to model psychological phenomena that manifest as nonlinear relations in behavior data without the need for making (overly) precise assumptions about functional form. Using both simulated and real data, the authors illustrate that GeMM…

  15. Averaging Models: Parameters Estimation with the R-Average Procedure

    ERIC Educational Resources Information Center

    Vidotto, G.; Massidda, D.; Noventa, S.

    2010-01-01

    The Functional Measurement approach, proposed within the theoretical framework of Information Integration Theory (Anderson, 1981, 1982), can be a useful multi-attribute analysis tool. Compared to the majority of statistical models, the averaging model can account for interaction effects without adding complexity. The R-Average method (Vidotto &…

  16. Using a Participatory Culture-Specific Model to Increase the Effectiveness of Social Justice Courses in School Psychology

    ERIC Educational Resources Information Center

    Graybill, Emily C.; Varjas, Kris; Meyers, Joel; Greenberg, Daphne; Roach, Andrew T.

    2013-01-01

    The Participatory Culture-Specific Model of Course Development (PCSMCD), adapted from the Participatory Culture-Specific Intervention Model, is a proposed framework to address challenges to social justice education by addressing the following four course variables: instructor characteristics, instructor experiences, student characteristics, and…

  17. A development framework for semantically interoperable health information systems.

    PubMed

    Lopez, Diego M; Blobel, Bernd G M E

    2009-02-01

    Semantic interoperability is a basic challenge to be met for new generations of distributed, communicating and co-operating health information systems (HIS) enabling shared care and e-Health. Analysis, design, implementation and maintenance of such systems and their intrinsic architectures have to follow a unified development methodology. The Generic Component Model (GCM) is used as a framework for modeling any system, to evaluate and harmonize state-of-the-art architecture development approaches and standards for health information systems, as well as to derive a coherent architecture development framework for sustainable, semantically interoperable HIS and their components. The proposed methodology is based on the Rational Unified Process (RUP), taking advantage of its flexibility to be configured for integrating other architectural approaches such as Service-Oriented Architecture (SOA), Model-Driven Architecture (MDA), ISO 10746, and the HL7 Development Framework (HDF). Existing architectural approaches have been analyzed, compared and finally harmonized towards an architecture development framework for advanced health information systems. Starting with the requirements for semantic interoperability derived from paradigm changes for health information systems, and supported by formal software process engineering methods, an appropriate development framework for semantically interoperable HIS has been provided. The usability of the framework has been exemplified in a public health scenario.

  18. A stochastically fully connected conditional random field framework for super resolution OCT

    NASA Astrophysics Data System (ADS)

    Boroomand, A.; Tan, B.; Wong, A.; Bizheva, K.

    2017-02-01

    A number of factors can degrade the resolution and contrast of OCT images, such as: (1) changes of the OCT point-spread function (PSF) resulting from wavelength-dependent scattering and absorption of light along the imaging depth; (2) speckle noise; and (3) motion artifacts. We propose a new Super Resolution OCT (SR OCT) imaging framework that takes advantage of a Stochastically Fully Connected Conditional Random Field (SF-CRF) model to generate a super-resolved OCT image of higher quality from a set of low-resolution OCT (LR OCT) images. The proposed SF-CRF SR OCT imaging is able to compensate simultaneously for all of the factors mentioned above that degrade OCT image quality, using a unified computational framework. The proposed framework was tested on a set of simulated LR human retinal OCT images generated from a high-resolution, high-contrast retinal image, and on a set of in-vivo, high-resolution, high-contrast rat retinal OCT images. The reconstructed SR OCT images show considerably higher spatial resolution, less speckle noise and higher contrast compared to other tested methods. Visual assessment of the results demonstrated the usefulness of the proposed approach in better preserving fine details and structures of the imaged sample, retaining biological tissue boundaries while reducing speckle noise. Quantitative evaluation using both the Contrast to Noise Ratio (CNR) and the Edge Preservation (EP) parameter also showed superior performance of the proposed SF-CRF SR OCT approach compared to other image processing approaches.

  19. Prediction of as-cast grain size of inoculated aluminum alloys melt solidified under non-isothermal conditions

    NASA Astrophysics Data System (ADS)

    Du, Qiang; Li, Yanjun

    2015-06-01

    In this paper, a multi-scale as-cast grain size prediction model is proposed to predict the as-cast grain size of inoculated aluminum alloy melts solidified under non-isothermal conditions, i.e., in the presence of a temperature gradient. Given the melt composition, inoculation and heat extraction boundary conditions, the model is able to predict the maximum nucleation undercooling, cooling curve, primary phase solidification path and final as-cast grain size of binary alloys. The proposed model has been applied to two Al-Mg alloys, and comparisons with laboratory and industrial solidification experiments have been carried out. The preliminary conclusion is that the proposed model is a promising microscopic model for use within the multi-scale casting simulation modelling framework.
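    Nucleation-undercooling predictions in inoculated aluminium alloys are commonly built on the free-growth criterion, under which a particle of diameter d becomes an active grain nucleus once the undercooling exceeds ΔT = 4γ/(ΔS_v·d). The sketch below assumes that standard criterion; whether this paper uses exactly it is not stated in the abstract, and the property values are illustrative, not taken from the article.

    ```python
    def free_growth_undercooling(d, gamma_sl=0.158, dS_v=1.112e6):
        """Free-growth barrier dT = 4*gamma / (dS_v * d) for an inoculant
        particle of diameter d (m). gamma_sl (J/m^2) and dS_v (J/(K m^3))
        are illustrative values for aluminium."""
        return 4.0 * gamma_sl / (dS_v * d)

    # Larger inoculant particles activate at smaller undercoolings, so the
    # largest particles in the distribution set the onset of nucleation.
    for d in (0.5e-6, 1.0e-6, 2.0e-6):
        print(f"{d * 1e6:.1f} um -> {free_growth_undercooling(d):.2f} K")
    ```

    Coupling such a rule to the particle size distribution and the local cooling curve is what lets a grain-size model respond to both inoculation and heat extraction conditions.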

  20. A novel framework for objective detection and tracking of TC center from noisy satellite imagery

    NASA Astrophysics Data System (ADS)

    Johnson, Bibin; Thomas, Sachin; Rani, J. Sheeba

    2018-07-01

    This paper proposes a novel framework for automatically determining and tracking the center of a tropical cyclone (TC) during its entire life-cycle from the thermal infrared (TIR) channel data of geostationary satellites. The proposed method handles meteorological images with noise or with missing or partial information due to seasonal variability and a lack of significant spatial or vortex features. To retrieve the cyclone center under these circumstances, a synergistic approach based on objective measures and a Numerical Weather Prediction (NWP) model is proposed. This method employs a spatial gradient scheme to process missing and noisy frames, or a spatio-temporal gradient scheme for image sequences that are continuous and contain less noise. The initial estimate of the TC center from imagery with missing data is corrected by exploiting an NWP-model-based post-processing scheme. The validity of the framework is tested on infrared images of different cyclones obtained from various geostationary satellites such as Meteosat-7, INSAT-3D and Kalpana-1. The computed track is compared with the actual track data obtained from the Joint Typhoon Warning Center (JTWC), and it shows a reduction of mean track error by 11% compared to other state-of-the-art methods in the presence of missing and noisy frames. The proposed method is also successfully tested for simultaneous retrieval of the TC center from images containing multiple non-overlapping cyclones.
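    A toy version of objective centre retrieval from a TIR frame can be sketched as a weighted centroid of the coldest cloud-top pixels. This is an illustrative stand-in for the paper's gradient-based schemes, and the synthetic brightness-temperature field is hypothetical.

    ```python
    import numpy as np

    def tc_center(tir, percentile=5):
        """Toy objective scheme: estimate the cyclone centre as the
        intensity-weighted centroid of the coldest brightness-temperature
        pixels (colder pixels get larger weights)."""
        mask = tir <= np.nanpercentile(tir, percentile)
        ys, xs = np.nonzero(mask)
        w = np.nanmax(tir) - tir[ys, xs]
        return ys @ w / w.sum(), xs @ w / w.sum()

    # Synthetic TIR frame: warm background with a cold core at (30, 40).
    yy, xx = np.mgrid[0:64, 0:64]
    tir = 280.0 - 60.0 * np.exp(-((yy - 30) ** 2 + (xx - 40) ** 2) / 50.0)
    cy, cx = tc_center(tir)
    print(round(cy), round(cx))  # → 30 40
    ```

    The paper's framework replaces this simple statistic with spatial and spatio-temporal gradients and then corrects the estimate with an NWP-based post-processing step when frames are noisy or incomplete.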

  1. Enterprise resource planning (ERP) implementation using the value engineering methodology and Six Sigma tools

    NASA Astrophysics Data System (ADS)

    Leu, Jun-Der; Lee, Larry Jung-Hsing

    2017-09-01

    Enterprise resource planning (ERP) is a software solution that integrates the operational processes of the business functions of an enterprise. However, implementing ERP systems is a complex process. In addition to the technical issues, companies must address problems associated with business process re-engineering, time and budget control, and organisational change. Numerous industrial studies have shown that the failure rate of ERP implementation is high, even for well-designed systems. Thus, ERP projects typically require a clear methodology to support the project execution and effectiveness. In this study, we propose a theoretical model for ERP implementation. The value engineering (VE) method forms the basis of the proposed framework, which integrates Six Sigma tools. The proposed framework encompasses five phases: knowledge generation, analysis, creation, development and execution. In the VE method, potential ERP problems related to software, hardware, consultation and organisation are analysed in a group-decision manner and in relation to value, and Six Sigma tools are applied to avoid any project defects. We validate the feasibility of the proposed model by applying it to an international manufacturing enterprise in Taiwan. The results show improvements in customer response time and operational efficiency in terms of work-in-process and turnover of materials. Based on the evidence from the case study, the theoretical framework is discussed together with the study's limitations and suggestions for future research.

  2. Design optimization for active twist rotor blades

    NASA Astrophysics Data System (ADS)

    Mok, Ji Won

    This dissertation introduces the process of optimizing active twist rotor blades in the presence of embedded anisotropic piezo-composite actuators. Optimum design of active twist blades is a complex task, since it involves a rich design space with tightly coupled design variables. The study presents the development of an optimization framework for active helicopter rotor blade cross-sectional design. This optimization framework allows for exploring a rich and highly nonlinear design space in order to optimize the active twist rotor blades. Different analytical components are combined in the framework: cross-sectional analysis (UM/VABS), an automated mesh generator, a beam solver (DYMORE), a three-dimensional local strain recovery module, and a gradient-based optimizer within MATLAB. Through the mathematical optimization problem, the static twist actuation performance of a blade is maximized while satisfying a series of blade constraints. These constraints are associated with the locations of the center of gravity and elastic axis, blade mass per unit span, fundamental rotating blade frequencies, and the blade strength based on local three-dimensional strain fields under worst-case loading conditions. Through pre-processing, limitations of the proposed process have been studied. When limitations were detected, resolution strategies were proposed. These include mesh overlapping, element distortion, trailing edge tab modeling, electrode modeling and foam implementation in the mesh generator, and the initial-point sensitivity of the current optimization scheme. Examples demonstrate the effectiveness of this process. Optimization studies were performed on the NASA/Army/MIT ATR blade case. Even though that design had been built and had shown significant impact in vibration reduction, the proposed optimization process showed that the design could be improved significantly. The second example, based on a model scale of the AH-64D Apache blade, emphasized the capability of this framework to explore the nonlinear design space of a complex planform. Especially for this case, detailed design is carried out to make the actual blade manufacturable. The proposed optimization framework is shown to be an effective tool for designing high-authority active twist blades to reduce vibration in future helicopter rotor blades.

  3. Effects of Cognitive Load on Driving Performance: The Cognitive Control Hypothesis.

    PubMed

    Engström, Johan; Markkula, Gustav; Victor, Trent; Merat, Natasha

    2017-08-01

    The objective of this paper was to outline an explanatory framework for understanding effects of cognitive load on driving performance and to review the existing experimental literature in the light of this framework. Although there is general consensus that taking the eyes off the forward roadway significantly impairs most aspects of driving, the effects of primarily cognitively loading tasks on driving performance are not well understood. Based on existing models of driver attention, an explanatory framework was outlined. This framework can be summarized in terms of the cognitive control hypothesis: Cognitive load selectively impairs driving subtasks that rely on cognitive control but leaves automatic performance unaffected. An extensive literature review was conducted wherein existing results were reinterpreted based on the proposed framework. It was demonstrated that the general pattern of experimental results reported in the literature aligns well with the cognitive control hypothesis and that several apparent discrepancies between studies can be reconciled based on the proposed framework. More specifically, performance on nonpracticed or inherently variable tasks, relying on cognitive control, is consistently impaired by cognitive load, whereas the performance on automatized (well-practiced and consistently mapped) tasks is unaffected and sometimes even improved. Effects of cognitive load on driving are strongly selective and task dependent. The present results have important implications for the generalization of results obtained from experimental studies to real-world driving. The proposed framework can also serve to guide future research on the potential causal role of cognitive load in real-world crashes.

  4. With the Design in Mind: "High School Reform Model Features That Matter in Implementation." Conference Paper

    ERIC Educational Resources Information Center

    Shiffman, Catherine Dunn

    2015-01-01

    This paper proposes a framework for analyzing program design features that seem to matter in implementation. The framework is based on findings from a study conducted by the Consortium for Policy Research in Education (CPRE) between 2004 and 2007 that explored how reform ideas and practices created by five external provider organizations were…

  5. Model Wind Turbine Design in a Project-Based Middle School Engineering Curriculum Built on State Frameworks

    ERIC Educational Resources Information Center

    Cogger, Steven D.; Miley, Daniel H.

    2012-01-01

    This paper proposes that project-based active learning is a key part of engineering education at the middle school level. One project from a comprehensive middle school engineering curriculum developed by the authors is described to show how active learning and state frameworks can coexist. The theoretical basis for learning and assessment in a…

  6. A proposal for a computer-based framework of support for public health in the management of biological incidents: the Czech Republic experience.

    PubMed

    Bures, Vladimír; Otcenásková, Tereza; Cech, Pavel; Antos, Karel

    2012-11-01

    Biological incidents jeopardising public health require decision-making that consists of one dominant feature: complexity. Therefore, public health decision-makers necessitate appropriate support. Based on the analogy with business intelligence (BI) principles, the contextual analysis of the environment and available data resources, and conceptual modelling within systems and knowledge engineering, this paper proposes a general framework for computer-based decision support in the case of a biological incident. At the outset, the analysis of potential inputs to the framework is conducted and several resources such as demographic information, strategic documents, environmental characteristics, agent descriptors and surveillance systems are considered. Consequently, three prototypes were developed, tested and evaluated by a group of experts. Their selection was based on the overall framework scheme. Subsequently, an ontology prototype linked with an inference engine, multi-agent-based model focusing on the simulation of an environment, and expert-system prototypes were created. All prototypes proved to be utilisable support tools for decision-making in the field of public health. Nevertheless, the research revealed further issues and challenges that might be investigated by both public health focused researchers and practitioners.

  7. A decision framework for managing risk to airports from terrorist attack.

    PubMed

    Shafieezadeh, Abdollah; Cha, Eun J; Ellingwood, Bruce R

    2015-02-01

    This article presents an asset-level security risk management framework to assist stakeholders of critical assets with allocating limited budgets for enhancing their safety and security against terrorist attack. The proposed framework models the security system of an asset, considers various threat scenarios, and models the sequential decision framework of attackers during the attack. Its novel contributions are the introduction of the notion of partial neutralization of attackers by defenders, estimation of total loss from successful, partially successful, and unsuccessful actions of attackers at various stages of an attack, and inclusion of the effects of these losses on the choices made by terrorists at various stages of the attack. The application of the proposed method is demonstrated in an example dealing with security risk management of a U.S. commercial airport, in which a set of plausible threat scenarios and risk mitigation options are considered. It is found that a combination of providing blast-resistant cargo containers and a video surveillance system on the airport perimeter fence is the best option based on minimum expected life-cycle cost considering a 10-year service period. © 2014 Society for Risk Analysis.

  8. Link prediction based on matrix factorization by fusion of multi-class organizations of the network.

    PubMed

    Jiao, Pengfei; Cai, Fei; Feng, Yiding; Wang, Wenjun

    2017-08-21

    Link prediction aims to forecast the latent or unobserved edges in complex networks and has a wide range of real-world applications. Almost all existing methods and models exploit only one class of organization of the network, losing important information hidden in its other organizations. In this paper, we propose a link prediction framework, called NMF3 here, that makes the best of the network's structure at different levels of organization based on nonnegative matrix factorization. We first map the observed network into another space by kernel functions, which yields organizations of different orders. Then we combine the adjacency matrix of the network with one of the other organizations to obtain the objective function of our link prediction framework based on nonnegative matrix factorization. Third, we derive an iterative algorithm that optimizes the objective function and converges to a local optimum, and we propose a fast optimization strategy for large networks. Lastly, we test the proposed framework with two kernel functions on a series of real-world networks under different training-set sizes, and the experimental results show the feasibility, effectiveness, and competitiveness of the proposed framework.

  9. Predicting readmission risk with institution-specific prediction models.

    PubMed

    Yu, Shipeng; Farooq, Faisal; van Esbroeck, Alexander; Fung, Glenn; Anand, Vikram; Krishnapuram, Balaji

    2015-10-01

    The ability to predict patient readmission risk is extremely valuable for hospitals, especially under the Hospital Readmission Reduction Program of the Centers for Medicare and Medicaid Services, which went into effect on October 1, 2012. There is a plethora of work in the literature on developing readmission risk prediction models, but most of them do not have sufficient prediction accuracy to be deployed in a clinical setting, partly because different hospitals may have different characteristics in their patient populations. We propose a generic framework for institution-specific readmission risk prediction, which takes patient data from a single institution and produces a statistical risk prediction model optimized for that particular institution and, optionally, for a specific condition. This provides great flexibility in model building and also yields institution-specific insights into the readmitted patient population. We have experimented with classification methods such as support vector machines, and prognosis methods such as Cox regression. We compared our methods with industry-standard methods such as the LACE model, and showed the proposed framework is not only more flexible but also more effective. We applied our framework to patient data from three hospitals, and obtained initial results for heart failure (HF), acute myocardial infarction (AMI), and pneumonia (PN) patients as well as patients with all conditions. On Hospital 2, the LACE model yielded AUC 0.57, 0.56, 0.53 and 0.55 for AMI, HF, PN and All Cause readmission prediction, respectively, while the proposed model yielded 0.66, 0.65, 0.63, 0.74 for the corresponding conditions, all significantly better than the LACE counterpart. The proposed models that leverage all features at discharge time are more accurate than the models that only leverage features at admission time (0.66 vs. 0.61 for AMI, 0.65 vs. 0.61 for HF, 0.63 vs. 0.56 for PN, 0.74 vs. 
0.60 for All Cause). Furthermore, the proposed admission-time models already outperform LACE, which is a discharge-time model (0.61 vs. 0.57 for AMI, 0.61 vs. 0.56 for HF, 0.56 vs. 0.53 for PN, 0.60 vs. 0.55 for All Cause). Similar conclusions can be drawn from the other hospitals as well. The same performance comparison also holds for precision and recall at top-decile predictions. Most of the performance improvements are statistically significant. The institution-specific readmission risk prediction framework is more flexible and more effective than one-size-fits-all models like LACE, sometimes two to three times as effective. The admission-time models are able to give early warning signs compared to the discharge-time models, and may help hospital staff intervene early while the patient is still in the hospital. Copyright © 2015 Elsevier B.V. All rights reserved.

  10. Delineating Hydrofacies Spatial Distribution by Integrating Ensemble Data Assimilation and Indicator Geostatistics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Song, Xuehang; Chen, Xingyuan; Ye, Ming

    2015-07-01

    This study develops a new framework of facies-based data assimilation for characterizing the spatial distribution of hydrofacies and estimating their associated hydraulic properties. This framework couples ensemble data assimilation with a transition-probability-based geostatistical model via a parameterization based on a level-set function. The nature of ensemble data assimilation makes the framework efficient and flexible to integrate with various types of observation data. The transition-probability-based geostatistical model keeps the updated hydrofacies distributions under geological constraints. The framework is illustrated by a two-dimensional synthetic study that estimates the hydrofacies spatial distribution and the permeability of each hydrofacies from transient head data. Our results show that the proposed framework can characterize hydrofacies distribution and the associated permeability with adequate accuracy even with limited direct measurements of hydrofacies. Our study provides a promising starting point for hydrofacies delineation in complex real problems.

  11. A framework for automatic feature extraction from airborne light detection and ranging data

    NASA Astrophysics Data System (ADS)

    Yan, Jianhua

    Recent advances in airborne Light Detection and Ranging (LIDAR) technology allow rapid and inexpensive measurements of topography over large areas. Airborne LIDAR systems usually return a 3-dimensional cloud of point measurements from reflective objects scanned by the laser beneath the flight path. This technology is becoming a primary method for extracting information about different kinds of geometrical objects, such as high-resolution digital terrain models (DTMs), buildings, and trees. In the past decade, LIDAR has attracted increasing interest from researchers in remote sensing and GIS. Compared to traditional data sources, such as aerial photography and satellite images, LIDAR measurements are not influenced by sun shadow and relief displacement. However, the voluminous data pose a new challenge for the automated extraction of geometrical information from LIDAR measurements, because many raster image processing techniques cannot be directly applied to irregularly spaced LIDAR points. In this dissertation, a framework is proposed to automatically extract information about different kinds of geometrical objects, such as terrain and buildings, from LIDAR data. These objects are essential to numerous applications such as flood modeling, landslide prediction and hurricane animation. The framework consists of several intuitive algorithms. Firstly, a progressive morphological filter was developed to detect non-ground LIDAR measurements. By gradually increasing the window size and elevation difference threshold of the filter, the measurements of vehicles, vegetation, and buildings are removed, while ground data are preserved. Then, building measurements are identified from the non-ground measurements using a region growing algorithm based on the plane-fitting technique. 
Raw footprints for segmented building measurements are derived by connecting boundary points, and are further simplified and adjusted by several proposed operations to remove the noise caused by irregularly spaced LIDAR measurements. To reconstruct 3D building models, the raw 2D topology of each building is first extracted and then adjusted. Since the adjusting operations for simple building models do not work well on 2D topology, a 2D snake algorithm is proposed to adjust it. The 2D snake algorithm consists of newly defined energy functions for topology adjustment and a linear algorithm that finds the minimal energy value of 2D snake problems. Data sets from urbanized areas including large institutional, commercial, and small residential buildings were employed to test the proposed framework. The results demonstrate that the proposed framework achieves very good performance.
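The progressive filtering idea, reduced to a 1-D elevation profile, can be sketched as follows. The window sizes and thresholds are illustrative values, not the dissertation's; the opening (erosion then dilation) removes features narrower than the window, and the growing threshold protects sloped ground.

```python
def opening(z, w):
    """1-D morphological opening: erosion (local min) then dilation (local max)."""
    n = len(z)
    def erode(a):
        return [min(a[max(0, i - w):i + w + 1]) for i in range(n)]
    def dilate(a):
        return [max(a[max(0, i - w):i + w + 1]) for i in range(n)]
    return dilate(erode(z))

def progressive_filter(z, windows=(1, 2, 4), thresholds=(0.5, 1.0, 2.0)):
    """Flag non-ground points: those far above the opened surface at any scale."""
    ground = [True] * len(z)
    surface = list(z)
    for w, t in zip(windows, thresholds):
        opened = opening(surface, w)
        for i in range(len(z)):
            if surface[i] - opened[i] > t:
                ground[i] = False   # removed as vehicle/vegetation/building
        surface = opened
    return ground

# Flat terrain near 0 m with a 5 m "building" spanning indices 4-6.
profile = [0.0, 0.1, 0.0, 0.2, 5.0, 5.1, 5.0, 0.1, 0.0, 0.1]
mask = progressive_filter(profile)   # True = ground, False = non-ground
```

The small window in the first pass removes narrow objects only; the building survives until the window grows wider than its footprint in the second pass.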

  12. A Pressure Control Method for Emulsion Pump Station Based on Elman Neural Network

    PubMed Central

    Tan, Chao; Qi, Nan; Yao, Xingang; Wang, Zhongbin; Si, Lei

    2015-01-01

    In order to realize pressure control of the emulsion pump station, a key piece of coal mine equipment for safe production, the control requirements were analyzed and a pressure control method based on the Elman neural network was proposed. The key techniques, including the system framework, the pressure prediction model, the pressure control model, and the flowchart of the proposed approach, are presented. Finally, a simulation example was carried out, and the comparison results indicated that the proposed approach is feasible and efficient and outperforms the alternatives. PMID:25861253
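A minimal Elman network, with the hidden state copied back as context input at the next step, can be sketched as below. For brevity only the output weights are trained here with the delta rule on a toy pressure signal; the paper's model would train all weights.

```python
import math, random

class Elman:
    """Minimal Elman network: hidden activations are copied to context units
    and fed back as extra input at the next time step."""
    def __init__(self, n_in, n_hid, seed=0):
        rnd = random.Random(seed)
        mat = lambda r, c: [[rnd.uniform(-0.5, 0.5) for _ in range(c)]
                            for _ in range(r)]
        self.W_in, self.W_ctx = mat(n_hid, n_in), mat(n_hid, n_hid)
        self.W_out = [rnd.uniform(-0.5, 0.5) for _ in range(n_hid)]
        self.n_hid = n_hid
        self.h = [0.0] * n_hid            # context units

    def step(self, x):
        self.h = [math.tanh(sum(w * v for w, v in zip(wi, x)) +
                            sum(w * v for w, v in zip(wc, self.h)))
                  for wi, wc in zip(self.W_in, self.W_ctx)]
        return sum(w * v for w, v in zip(self.W_out, self.h))

def one_step_mse(net, series):
    net.h = [0.0] * net.n_hid
    errs = [(series[t + 1] - net.step([series[t]])) ** 2
            for t in range(len(series) - 1)]
    return sum(errs) / len(errs)

signal = [math.sin(0.3 * t) for t in range(40)]    # toy pressure trace
net = Elman(n_in=1, n_hid=8)
before = one_step_mse(net, signal)

# Fit only the output weights with the delta rule (a shortcut for brevity).
for _ in range(300):
    net.h = [0.0] * net.n_hid
    for t in range(len(signal) - 1):
        err = signal[t + 1] - net.step([signal[t]])
        net.W_out = [w + 0.05 * err * h for w, h in zip(net.W_out, net.h)]

after = one_step_mse(net, signal)   # one-step prediction error drops
```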

  13. Online gesture spotting from visual hull data.

    PubMed

    Peng, Bo; Qian, Gang

    2011-06-01

    This paper presents a robust framework for online full-body gesture spotting from visual hull data. Using view-invariant pose features as observations, hidden Markov models (HMMs) are trained for gesture spotting from continuous movement data streams. Two major contributions of this paper are 1) view-invariant pose feature extraction from visual hulls, and 2) a systematic approach to automatically detecting and modeling specific nongesture movement patterns and using their HMMs for outlier rejection in gesture spotting. The experimental results have shown the view-invariance property of the proposed pose features for both training poses and new poses unseen in training, as well as the efficacy of using specific nongesture models for outlier rejection. Using the IXMAS gesture data set, the proposed framework has been extensively tested and the gesture spotting results are superior to those reported on the same data set obtained using existing state-of-the-art gesture spotting methods.

  14. A physics-based crystallographic modeling framework for describing the thermal creep behavior of Fe-Cr alloys

    DOE PAGES

    Wen, Wei; Capolungo, Laurent; Patra, Anirban; ...

    2017-02-23

    In this work, a physics-based thermal creep model is developed based on the understanding of the microstructure in Fe-Cr alloys. This model is associated with a transition state theory based framework that considers the distribution of internal stresses at sub-material point level. The thermally activated dislocation glide and climb mechanisms are coupled in the obstacle-bypass processes for both dislocation and precipitate-type barriers. A kinetic law is proposed to track the dislocation densities evolution in the subgrain interior and in the cell wall. The predicted results show that this model, embedded in the visco-plastic self-consistent (VPSC) framework, captures well the creepmore » behaviors for primary and steady-state stages under various loading conditions. We also discuss the roles of the mechanisms involved.« less

  15. A Bayesian approach for calibrating probability judgments

    NASA Astrophysics Data System (ADS)

    Firmino, Paulo Renato A.; Santana, Nielson A.

    2012-10-01

    Eliciting experts' opinions has been one of the main alternatives for addressing paucity of data. In the vanguard of this area is the development of calibration models (CMs). CMs are models dedicated to overcome miscalibration, i.e. judgment biases reflecting deficient strategies of reasoning adopted by the expert when inferring about an unknown. One of the main challenges of CMs is to determine how and when to intervene against miscalibration, in order to enhance the tradeoff between costs (time spent with calibration processes) and accuracy of the resulting models. The current paper dedicates special attention to this issue by presenting a dynamic Bayesian framework for monitoring, diagnosing, and handling miscalibration patterns. The framework is based on Beta-, Uniform, or Triangular-Bernoulli models and classes of judgmental calibration theories. Issues regarding the usefulness of the proposed framework are discussed and illustrated via simulation studies.
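The Beta-Bernoulli idea can be illustrated in a few lines: treat each judgment in a confidence bucket as a Bernoulli trial, update a Beta posterior over the expert's true hit rate, and replace the stated probability with the posterior mean. The numbers are invented for illustration.

```python
# Beta(1, 1) prior over the expert's true hit rate in one confidence bucket.
alpha, beta = 1.0, 1.0
stated = 0.8                                   # expert says "80% sure"
outcomes = [1, 1, 0, 1, 0, 0, 1, 0, 1, 0]      # 5 hits in 10 such judgments
for y in outcomes:
    alpha, beta = alpha + y, beta + (1 - y)    # conjugate Bernoulli update
recalibrated = alpha / (alpha + beta)          # 6/12 = 0.5: expert is overconfident
```

The gap between `stated` and `recalibrated` is exactly the miscalibration pattern such a framework would monitor and correct.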

  16. Random forest feature selection approach for image segmentation

    NASA Astrophysics Data System (ADS)

    Lefkovits, László; Lefkovits, Szidónia; Emerich, Simina; Vaida, Mircea Florin

    2017-03-01

    In the field of image segmentation, discriminative models have shown promising performance. Generally, every such model begins with the extraction of numerous features from annotated images. Most authors build their discriminative model from many features without any selection criterion. A more reliable model can be built with a framework that selects the variables that are important from the point of view of classification and eliminates the unimportant ones. In this article we present a framework for feature selection and data dimensionality reduction. The methodology is built around the random forest (RF) algorithm and its variable importance evaluation. In order to deal with datasets so large as to be practically unmanageable, we propose an RF-based algorithm that reduces the dimension of the database by eliminating irrelevant features. Furthermore, this framework is applied to optimize our discriminative model for brain tumor segmentation.
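The variable-importance idea can be sketched with permutation importance on synthetic data: shuffle one feature column and measure the accuracy drop. A nearest-centroid classifier stands in for the random forest to keep the example self-contained.

```python
import random

def nearest_centroid_fit(X, y):
    """Tiny stand-in classifier (the paper uses random forests)."""
    cents = {}
    for c in set(y):
        rows = [x for x, yi in zip(X, y) if yi == c]
        cents[c] = [sum(col) / len(rows) for col in zip(*rows)]
    return cents

def predict(cents, x):
    return min(cents, key=lambda c: sum((a - b) ** 2 for a, b in zip(x, cents[c])))

def accuracy(cents, X, y):
    return sum(predict(cents, x) == yi for x, yi in zip(X, y)) / len(y)

rnd = random.Random(1)
# Feature 0 separates the classes; features 1 and 2 are pure noise.
X = [[rnd.gauss(c * 3, 1), rnd.gauss(0, 1), rnd.gauss(0, 1)]
     for c in (0, 1) for _ in range(100)]
y = [c for c in (0, 1) for _ in range(100)]

model = nearest_centroid_fit(X, y)
base = accuracy(model, X, y)
importance = []
for j in range(3):
    col = [x[j] for x in X]
    rnd.shuffle(col)                 # destroy feature j, keep its distribution
    Xp = [x[:j] + [v] + x[j + 1:] for x, v in zip(X, col)]
    importance.append(base - accuracy(model, Xp, y))
# Only the informative feature should show a large accuracy drop.
```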

  17. An improved shuffled frog leaping algorithm based evolutionary framework for currency exchange rate prediction

    NASA Astrophysics Data System (ADS)

    Dash, Rajashree

    2017-11-01

    Forecasting purchasing power of one currency with respect to another currency is always an interesting topic in the field of financial time series prediction. Despite the existence of several traditional and computational models for currency exchange rate forecasting, there is always a need for developing simpler and more efficient model, which will produce better prediction capability. In this paper, an evolutionary framework is proposed by using an improved shuffled frog leaping (ISFL) algorithm with a computationally efficient functional link artificial neural network (CEFLANN) for prediction of currency exchange rate. The model is validated by observing the monthly prediction measures obtained for three currency exchange data sets such as USD/CAD, USD/CHF, and USD/JPY accumulated within same period of time. The model performance is also compared with two other evolutionary learning techniques such as Shuffled frog leaping algorithm and Particle Swarm optimization algorithm. Practical analysis of results suggest that, the proposed model developed using the ISFL algorithm with CEFLANN network is a promising predictor model for currency exchange rate prediction compared to other models included in the study.

  18. Fitting identity in the reasoned action framework: A meta-analysis and model comparison.

    PubMed

    Paquin, Ryan S; Keating, David M

    2017-01-01

    Several competing models have been put forth regarding the role of identity in the reasoned action framework. The standard model proposes that identity is a background variable. Under a typical augmented model, identity is treated as an additional direct predictor of intention and behavior. Alternatively, it has been proposed that identity measures are inadvertent indicators of an underlying intention factor (e.g., a manifest-intention model). In order to test these competing hypotheses, we used data from 73 independent studies (total N = 23,917) to conduct a series of meta-analytic structural equation models. We also tested for moderation effects based on whether there was a match between identity constructs and the target behaviors examined (e.g., if the study examined a "smoker identity" and "smoking behavior," there would be a match; if the study examined a "health conscious identity" and "smoking behavior," there would not be a match). Average effects among primary reasoned action variables were all substantial, rs = .37-.69. Results gave evidence for the manifest-intention model over the other explanations, and a moderation effect by identity-behavior matching.

  19. Efficient Location Uncertainty Treatment for Probabilistic Modelling of Portfolio Loss from Earthquake Events

    NASA Astrophysics Data System (ADS)

    Scheingraber, Christoph; Käser, Martin; Allmann, Alexander

    2017-04-01

    Probabilistic seismic risk analysis (PSRA) is a well-established method for modelling loss from earthquake events. In the insurance industry, it is widely employed for probabilistic modelling of loss to a distributed portfolio. In this context, precise exposure locations are often unknown, which results in considerable loss uncertainty. The treatment of exposure uncertainty has already been identified as an area where PSRA would benefit from increased research attention; so far, however, epistemic location uncertainty has received comparatively little study. We propose a new framework for the efficient treatment of location uncertainty. To demonstrate the usefulness of this novel method, a large number of synthetic portfolios resembling real-world portfolios are systematically analyzed. We investigate the effect of portfolio characteristics such as value distribution, portfolio size, or the proportion of risk items with unknown coordinates on loss variability. Several sampling criteria to increase the computational efficiency of the framework are proposed and put into the wider context of well-established Monte-Carlo variance reduction techniques. The performance of each of the proposed criteria is analyzed.
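The sampling idea can be sketched as follows: risk items with unknown coordinates are assigned a location draw in each Monte-Carlo realisation, so location uncertainty propagates into the loss distribution. The zone intensities and portfolio are hypothetical, and a real PSRA would also sample events and vulnerability.

```python
import random

rnd = random.Random(42)
# Hypothetical mean damage ratio by zone (an assumption for illustration).
intensity = {"zone_a": 0.8, "zone_b": 0.3, "zone_c": 0.1}
zones = list(intensity)

portfolio = [
    {"value": 100.0, "zone": "zone_a"},   # known location
    {"value": 200.0, "zone": None},       # unknown: sample a zone per run
    {"value": 150.0, "zone": None},
]

def sampled_loss():
    loss = 0.0
    for risk in portfolio:
        zone = risk["zone"] or rnd.choice(zones)
        loss += risk["value"] * intensity[zone]
    return loss

samples = [sampled_loss() for _ in range(10_000)]
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
# var > 0 is entirely due to the two items with unknown coordinates.
```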

  20. Real-time solution of linear computational problems using databases of parametric reduced-order models with arbitrary underlying meshes

    NASA Astrophysics Data System (ADS)

    Amsallem, David; Tezaur, Radek; Farhat, Charbel

    2016-12-01

    A comprehensive approach for real-time computations using a database of parametric, linear, projection-based reduced-order models (ROMs) based on arbitrary underlying meshes is proposed. In the offline phase of this approach, the parameter space is sampled and linear ROMs defined by linear reduced operators are pre-computed at the sampled parameter points and stored. Then, these operators and associated ROMs are transformed into counterparts that satisfy a certain notion of consistency. In the online phase of this approach, a linear ROM is constructed in real-time at a queried but unsampled parameter point by interpolating the pre-computed linear reduced operators on matrix manifolds and therefore computing an interpolated linear ROM. The proposed overall model reduction framework is illustrated with two applications: a parametric inverse acoustic scattering problem associated with a mockup submarine, and a parametric flutter prediction problem associated with a wing-tank system. The second application is implemented on a mobile device, illustrating the capability of the proposed computational framework to operate in real-time.
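The online step can be sketched for a 2x2 reduced operator. The paper interpolates pre-transformed (consistent) operators on matrix manifolds; this sketch uses plain entrywise linear interpolation between two operators assumed already consistent, which conveys the offline/online split but not the manifold machinery.

```python
# Offline: reduced operators K(mu) precomputed at sampled parameter points.
K0 = [[2.0, 0.1], [0.1, 1.0]]   # reduced operator at mu = 0.0
K1 = [[4.0, 0.3], [0.3, 2.0]]   # reduced operator at mu = 1.0

def interp_rom(mu):
    """Online: build K at an unsampled mu by interpolating stored operators."""
    return [[(1 - mu) * K0[i][j] + mu * K1[i][j] for j in range(2)]
            for i in range(2)]

def solve2(K, f):
    """Solve the 2x2 reduced system K q = f by Cramer's rule."""
    det = K[0][0] * K[1][1] - K[0][1] * K[1][0]
    return [(f[0] * K[1][1] - f[1] * K[0][1]) / det,
            (K[0][0] * f[1] - K[1][0] * f[0]) / det]

K = interp_rom(0.5)          # real-time query at an unsampled parameter
q = solve2(K, [1.0, 0.0])    # cheap 2x2 solve instead of a full-order solve
```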

  1. Network planning under uncertainties

    NASA Astrophysics Data System (ADS)

    Ho, Kwok Shing; Cheung, Kwok Wai

    2008-11-01

    One of the main focuses of network planning is the optimization of the network resources required to build a network under a certain traffic demand projection. Traditionally, the inputs to this type of network planning problem are treated as deterministic. In reality, varying traffic requirements and fluctuations in network resources can introduce uncertainties into the decision models. Failure to include these uncertainties in the network design process can severely affect the feasibility and economics of the network. Therefore, it is essential to find a solution that is insensitive to the uncertain conditions during the network planning process. As early as the 1960s, a network planning problem with traffic requirements varying over time was studied. This kind of network planning problem is still being actively researched, especially for VPN network design. Another kind of network planning problem under uncertainty that has been studied actively in the past decade addresses fluctuations in network resources. One such hotly pursued research topic is survivable network planning. It considers the design of a network, under the uncertainties brought by fluctuations in topology, to meet the requirement that the network remain intact up to a certain number of faults occurring anywhere in the network. Recently, the authors proposed a new planning methodology called Generalized Survivable Network that tackles the network design problem under both varying traffic requirements and fluctuations of topology. Although all the above network planning problems handle various kinds of uncertainties, it is hard to find a generic framework for more general uncertainty conditions that allows the problems to be solved in a more systematic way. With a unified framework, the seemingly diverse models and algorithms can be intimately related, and possibly more insights and improvements can be brought out for solving the problem. 
This motivates us to seek a generic framework for solving the network planning problem under uncertainties. In addition to reviewing the various network planning problems involving uncertainties, we also propose that a unified framework based on robust optimization can be used to solve a rather large segment of network planning problems under uncertainty. Robust optimization was first introduced in the operations research literature and is a framework that incorporates information about the uncertainty sets of the parameters in the optimization model. Even though robust optimization originated in tackling uncertainty in the optimization process, it can serve as a comprehensive and suitable framework for generic network planning problems under uncertainty. In this paper, we begin by explaining the main ideas behind the robust optimization approach. Then we demonstrate the capabilities of the proposed framework by giving examples of how robust optimization can be applied to common network planning problems in uncertain environments. Next, we list some practical considerations for solving the network planning problem under uncertainty with the proposed framework. Finally, we conclude this article with some thoughts on future directions for applying this framework to other network planning problems.
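The robust-optimization idea can be illustrated with a one-link capacity problem: minimize the worst-case cost over an uncertainty set of demands rather than the cost at the nominal demand. All numbers are invented for illustration.

```python
# Toy robust capacity planning on one link: install capacity c at unit cost 1,
# pay penalty 10 per unit of unserved demand. Demand lies in an uncertainty set.
demand_set = [80, 100, 120]          # uncertainty set (scenarios)

def cost(c, d):
    return c + 10 * max(0, d - c)    # install cost + shortfall penalty

def worst_case_cost(c):
    return max(cost(c, d) for d in demand_set)

# Nominal design sizes for the expected demand; the robust design minimizes
# the worst case over the uncertainty set.
nominal = 100
robust = min(range(0, 201), key=worst_case_cost)
```

The robust plan over-provisions to 120 units, but its worst-case cost (120) is far below that of the nominal plan (300), which is exactly the insensitivity to uncertain conditions the abstract argues for.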

  2. Structured functional additive regression in reproducing kernel Hilbert spaces.

    PubMed

    Zhu, Hongxiao; Yao, Fang; Zhang, Hao Helen

    2014-06-01

    Functional additive models (FAMs) provide a flexible yet simple framework for regressions involving functional predictors. The use of a data-driven basis in an additive rather than linear structure naturally extends the classical functional linear model. However, the critical issue of selecting the nonlinear additive components has been less studied. In this work, we propose a new regularization framework for structure estimation in the context of reproducing kernel Hilbert spaces. The proposed approach takes advantage of functional principal components, which greatly facilitates implementation and theoretical analysis. Selection and estimation are achieved by penalized least squares with a penalty that encourages a sparse structure of the additive components. Theoretical properties such as the rate of convergence are investigated. The empirical performance is demonstrated through simulation studies and a real data application.

  3. Towards a multilevel cognitive probabilistic representation of space

    NASA Astrophysics Data System (ADS)

    Tapus, Adriana; Vasudevan, Shrihari; Siegwart, Roland

    2005-03-01

    This paper addresses the problem of perception and representation of space for a mobile agent. A probabilistic hierarchical framework is suggested as a solution to this problem. The proposed method combines probabilistic belief with "Object Graph Models" (OGM). The world is viewed from a topological perspective, in terms of objects and the relationships between them. The hierarchical representation that we propose permits efficient and reliable modeling of the information that the mobile agent perceives from its environment. The integration of both navigational and interactional capabilities through efficient representation is also addressed. Experiments on a set of images taken from the real world validate the approach. This framework draws on the general understanding of human cognition and perception and contributes towards the overall effort to build cognitive robot companions.

  4. E-Services quality assessment framework for collaborative networks

    NASA Astrophysics Data System (ADS)

    Stegaru, Georgiana; Danila, Cristian; Sacala, Ioan Stefan; Moisescu, Mihnea; Mihai Stanescu, Aurelian

    2015-08-01

    In a globalised networked economy, collaborative networks (CNs) are formed to take advantage of new business opportunities. Collaboration involves shared resources and capabilities, such as e-Services that can be dynamically composed to automate CN participants' business processes. Quality is essential for the success of business process automation. Current approaches mostly focus on quality of service (QoS)-based service selection and ranking algorithms, overlooking the process of service composition which requires interoperable, adaptable and secure e-Services to ensure seamless collaboration, data confidentiality and integrity. Lack of assessment of these quality attributes can result in e-Service composition failure. The quality of e-Service composition relies on the quality of each e-Service and on the quality of the composition process. Therefore, there is the need for a framework that addresses quality from both views: product and process. We propose a quality of e-Service composition (QoESC) framework for quality assessment of e-Service composition for CNs which comprises of a quality model for e-Service evaluation and guidelines for quality of e-Service composition process. We implemented a prototype considering a simplified telemedicine use case which involves a CN in e-Healthcare domain. To validate the proposed quality-driven framework, we analysed service composition reliability with and without using the proposed framework.

  5. A framework using cluster-based hybrid network architecture for collaborative virtual surgery.

    PubMed

    Qin, Jing; Choi, Kup-Sze; Poon, Wai-Sang; Heng, Pheng-Ann

    2009-12-01

    Research on collaborative virtual environments (CVEs) opens the opportunity for simulating the cooperative work in surgical operations. It is however a challenging task to implement a high performance collaborative surgical simulation system because of the difficulty in maintaining state consistency with minimum network latencies, especially when sophisticated deformable models and haptics are involved. In this paper, an integrated framework using cluster-based hybrid network architecture is proposed to support collaborative virtual surgery. Multicast transmission is employed to transmit updated information among participants in order to reduce network latencies, while system consistency is maintained by an administrative server. Reliable multicast is implemented using distributed message acknowledgment based on cluster cooperation and sliding window technique. The robustness of the framework is guaranteed by the failure detection chain which enables smooth transition when participants join and leave the collaboration, including normal and involuntary leaving. Communication overhead is further reduced by implementing a number of management approaches such as computational policies and collaborative mechanisms. The feasibility of the proposed framework is demonstrated by successfully extending an existing standalone orthopedic surgery trainer into a collaborative simulation system. A series of experiments have been conducted to evaluate the system performance. The results demonstrate that the proposed framework is capable of supporting collaborative surgical simulation.

  6. A general framework for time series data mining based on event analysis: application to the medical domains of electroencephalography and stabilometry.

    PubMed

    Lara, Juan A; Lizcano, David; Pérez, Aurora; Valente, Juan P

    2014-10-01

    There are now domains where information is recorded over a period of time, leading to sequences of data known as time series. In many domains, like medicine, time series analysis requires focusing on certain regions of interest, known as events, rather than analyzing the whole time series. In this paper, we propose a framework for knowledge discovery in both one-dimensional and multidimensional time series containing events. We show how our approach can be used to classify medical time series by means of a process that identifies events in time series, generates time series reference models of representative events, and compares two time series by analyzing the events they have in common. We have applied our framework to time series generated in the areas of electroencephalography (EEG) and stabilometry. Framework performance was evaluated in terms of classification accuracy, and the results confirmed that the proposed scheme has potential for classifying EEG and stabilometric signals. The proposed framework is useful for discovering knowledge from medical time series containing events, such as stabilometric and electroencephalographic time series. These results would be equally applicable to other medical domains generating iconographic time series, such as electrocardiography (ECG). Copyright © 2014 Elsevier Inc. All rights reserved.
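The event-based comparison can be sketched as follows: extract threshold-crossing events as index intervals and score two series by the fraction of overlapping events. The threshold rule is a deliberate simplification of the paper's event identification process.

```python
def events(ts, thr):
    """Extract events as (start, end) index intervals where |x| exceeds thr."""
    evs, start = [], None
    for i, x in enumerate(ts):
        if abs(x) > thr and start is None:
            start = i
        elif abs(x) <= thr and start is not None:
            evs.append((start, i - 1))
            start = None
    if start is not None:
        evs.append((start, len(ts) - 1))
    return evs

def overlap(a, b):
    return max(a[0], b[0]) <= min(a[1], b[1])

def similarity(e1, e2):
    """Fraction of events the two series have in common (interval overlap)."""
    common = sum(any(overlap(a, b) for b in e2) for a in e1)
    return common / max(len(e1), len(e2), 1)

s1 = [0, 0, 3, 4, 0, 0, 0, 5, 6, 0]   # two events
s2 = [0, 0, 0, 4, 5, 0, 0, 0, 6, 0]   # two events, both overlapping s1's
sim = similarity(events(s1, 1), events(s2, 1))
```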

  7. Multi-Sensor Fusion with Interaction Multiple Model and Chi-Square Test Tolerant Filter.

    PubMed

    Yang, Chun; Mohammadi, Arash; Chen, Qing-Wei

    2016-11-02

    Motivated by the key importance of multi-sensor information fusion algorithms in state-of-the-art integrated navigation systems due to recent advancements in sensor technologies, telecommunication, and navigation systems, the paper proposes an improved and innovative fault-tolerant fusion framework. An integrated navigation system is considered consisting of four sensory sub-systems, i.e., the Strap-down Inertial Navigation System (SINS), the Global Positioning System (GPS), the Bei-Dou2 (BD2) and the Celestial Navigation System (CNS) navigation sensors. In such multi-sensor applications, on the one hand, the design of an efficient fusion methodology is extremely constrained, especially when no information regarding the system's error characteristics is available. On the other hand, the development of an accurate fault detection and integrity monitoring solution is both challenging and critical. The paper addresses the sensitivity issues of conventional fault detection solutions and the unavailability of a precisely known system model by jointly designing fault detection and information fusion algorithms. In particular, using ideas from Interacting Multiple Model (IMM) filters, the uncertainty of the system is adjusted adaptively by model probabilities and by the proposed fuzzy-based fusion framework. The paper also addresses the problem of using corrupted measurements for fault detection purposes by designing a two-state-propagator chi-square test jointly with the fusion algorithm. Two IMM predictors, running in parallel, are used and alternately reactivated based on the information received from the fusion filter to increase the reliability and accuracy of the proposed detection solution. By combining the IMM and the proposed fusion method, we increase the failure sensitivity of the detection system and thereby significantly increase the overall reliability and accuracy of the integrated navigation system.
Simulation results indicate that the proposed fault-tolerant fusion framework provides superior performance over its traditional counterparts.
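
    The chi-square integrity test at the heart of such detection designs can be illustrated with a standard normalized-innovation-squared check. This is a textbook sketch, not the paper's two-state-propagator construction; the function name and thresholds are illustrative:

```python
import numpy as np

def chi_square_fault_test(innovation, S, threshold):
    """Flag a measurement as faulty when the normalized innovation
    squared exceeds a chi-square critical value.

    innovation : residual between measurement and filter prediction (k-vector)
    S          : innovation covariance (k x k)
    threshold  : chi-square critical value for k degrees of freedom
    """
    d2 = float(innovation @ np.linalg.solve(S, innovation))
    return d2 > threshold, d2

# Example: 2-D innovation; the 95% critical value for 2 dof is ~5.99
S = np.eye(2) * 0.5
print(chi_square_fault_test(np.array([0.3, -0.2]), S, 5.99)[0])  # False: small residual
print(chi_square_fault_test(np.array([4.0, 3.5]), S, 5.99)[0])   # True: fault flagged
```

    Under a correct model the statistic is chi-square distributed with k degrees of freedom, so the threshold sets the false-alarm rate; the paper's design additionally guards against the corrupted measurement contaminating the very predictor used to form the residual.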

  8. Multi-Sensor Fusion with Interaction Multiple Model and Chi-Square Test Tolerant Filter

    PubMed Central

    Yang, Chun; Mohammadi, Arash; Chen, Qing-Wei

    2016-01-01

    Motivated by the key importance of multi-sensor information fusion algorithms in state-of-the-art integrated navigation systems due to recent advancements in sensor technologies, telecommunication, and navigation systems, the paper proposes an improved and innovative fault-tolerant fusion framework. An integrated navigation system is considered consisting of four sensory sub-systems, i.e., the Strap-down Inertial Navigation System (SINS), the Global Positioning System (GPS), the Bei-Dou2 (BD2) and the Celestial Navigation System (CNS) navigation sensors. In such multi-sensor applications, on the one hand, the design of an efficient fusion methodology is extremely constrained, especially when no information regarding the system's error characteristics is available. On the other hand, the development of an accurate fault detection and integrity monitoring solution is both challenging and critical. The paper addresses the sensitivity issues of conventional fault detection solutions and the unavailability of a precisely known system model by jointly designing fault detection and information fusion algorithms. In particular, using ideas from Interacting Multiple Model (IMM) filters, the uncertainty of the system is adjusted adaptively by model probabilities and by the proposed fuzzy-based fusion framework. The paper also addresses the problem of using corrupted measurements for fault detection purposes by designing a two-state-propagator chi-square test jointly with the fusion algorithm. Two IMM predictors, running in parallel, are used and alternately reactivated based on the information received from the fusion filter to increase the reliability and accuracy of the proposed detection solution. By combining the IMM and the proposed fusion method, we increase the failure sensitivity of the detection system and thereby significantly increase the overall reliability and accuracy of the integrated navigation system.
Simulation results indicate that the proposed fault-tolerant fusion framework provides superior performance over its traditional counterparts. PMID:27827832

  9. Geometry of behavioral spaces: A computational approach to analysis and understanding of agent based models and agent behaviors

    NASA Astrophysics Data System (ADS)

    Cenek, Martin; Dahl, Spencer K.

    2016-11-01

    Systems with non-linear dynamics frequently exhibit emergent system behavior, which is important to find and specify rigorously to understand the nature of the modeled phenomena. Through this analysis, it is possible to characterize phenomena such as how systems assemble or dissipate and what behaviors lead to specific final system configurations. Agent Based Modeling (ABM) is one of the modeling techniques used to study the interaction dynamics between a system's agents and its environment. Although the methodology of ABM construction is well understood and practiced, there are no computational, statistically rigorous, comprehensive tools to evaluate an ABM's execution. Often, a human has to observe an ABM's execution in order to analyze how the ABM functions, identify the emergent processes in the agent's behavior, or study a parameter's effect on the system-wide behavior. This paper introduces a new statistically based framework to automatically analyze agents' behavior, identify common system-wide patterns, and record the probability of agents changing their behavior from one pattern of behavior to another. We use network based techniques to analyze the landscape of common behaviors in an ABM's execution. Finally, we test the proposed framework with a series of experiments featuring increasingly emergent behavior. The proposed framework will allow computational comparison of ABM executions, exploration of a model's parameter configuration space, and identification of the behavioral building blocks in a model's dynamics.
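
    The core bookkeeping behind "recording the probability of agents changing their behavior from one pattern of behavior to another" is a first-order transition-frequency estimate. The sketch below assumes the behaviors have already been labeled; the names are illustrative:

```python
from collections import Counter

def transition_probabilities(behavior_sequence):
    """Estimate the probability of an agent switching from one labeled
    behavior pattern to another from an observed sequence."""
    # Count consecutive (from, to) pairs, then normalize by how often
    # each behavior appears as a transition source.
    pairs = Counter(zip(behavior_sequence, behavior_sequence[1:]))
    totals = Counter(behavior_sequence[:-1])
    return {(a, b): n / totals[a] for (a, b), n in pairs.items()}

seq = ["wander", "wander", "flock", "flock", "flock", "wander"]
probs = transition_probabilities(seq)
print(probs[("wander", "wander")])  # 0.5
print(probs[("flock", "flock")])
```

    Aggregating such estimates over many agents and runs yields the kind of behavior-transition landscape that the proposed framework then analyzes with network-based techniques.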

  10. Geometry of behavioral spaces: A computational approach to analysis and understanding of agent based models and agent behaviors.

    PubMed

    Cenek, Martin; Dahl, Spencer K

    2016-11-01

    Systems with non-linear dynamics frequently exhibit emergent system behavior, which is important to find and specify rigorously to understand the nature of the modeled phenomena. Through this analysis, it is possible to characterize phenomena such as how systems assemble or dissipate and what behaviors lead to specific final system configurations. Agent Based Modeling (ABM) is one of the modeling techniques used to study the interaction dynamics between a system's agents and its environment. Although the methodology of ABM construction is well understood and practiced, there are no computational, statistically rigorous, comprehensive tools to evaluate an ABM's execution. Often, a human has to observe an ABM's execution in order to analyze how the ABM functions, identify the emergent processes in the agent's behavior, or study a parameter's effect on the system-wide behavior. This paper introduces a new statistically based framework to automatically analyze agents' behavior, identify common system-wide patterns, and record the probability of agents changing their behavior from one pattern of behavior to another. We use network based techniques to analyze the landscape of common behaviors in an ABM's execution. Finally, we test the proposed framework with a series of experiments featuring increasingly emergent behavior. The proposed framework will allow computational comparison of ABM executions, exploration of a model's parameter configuration space, and identification of the behavioral building blocks in a model's dynamics.

  11. A probabilistic model framework for evaluating year-to-year variation in crop productivity

    NASA Astrophysics Data System (ADS)

    Yokozawa, M.; Iizumi, T.; Tao, F.

    2008-12-01

    Most models describing the relation between crop productivity and weather conditions have so far focused on mean changes in crop yield. To keep the food supply stable in the face of abnormal weather and climate change, however, evaluating the year-to-year variation in crop productivity is more essential than evaluating the mean changes. Here we propose a new probabilistic model framework based on Bayesian inference and Monte Carlo simulation. As an example, we first introduce a model of paddy rice production in Japan called PRYSBI (Process-based Regional rice Yield Simulator with Bayesian Inference; Iizumi et al., 2008). The model structure is the same as that of SIMRIW, which was developed and is used widely in Japan. The model includes three sub-models describing phenological development, biomass accumulation and maturing of the rice crop. These processes are formulated to capture the response of the rice plant to weather conditions. The model was originally developed to predict rice growth and yield at the paddy-plot scale; we applied it to evaluate large-scale rice production while keeping the same model structure, instead treating the parameters as stochastic variables. To let the model reproduce actual yields at the larger scale, model parameters were determined from agricultural statistical data for each prefecture of Japan together with weather data averaged over the region. The posterior probability distribution functions (PDFs) of the parameters included in the model were obtained using Bayesian inference, with a Markov Chain Monte Carlo (MCMC) algorithm used to solve Bayes' theorem numerically. To evaluate year-to-year changes in rice growth and yield under this framework, we first iterate simulations with sets of parameter values sampled from the estimated posterior PDF of each parameter and then take the ensemble mean weighted with the posterior PDFs. We will also present another example for maize productivity in China.
The framework proposed here also provides information on uncertainties, as well as the possibilities and limitations of future improvements in crop models.
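
    The MCMC step can be illustrated with a minimal random-walk Metropolis sampler. This is a generic sketch of the algorithm class the abstract names, not the PRYSBI implementation; the function names, step size and toy posterior are all assumptions:

```python
import math
import random

def metropolis(log_post, x0, n_samples, step=0.5, seed=0):
    """Random-walk Metropolis sampler: a minimal stand-in for the
    MCMC step used to draw parameter values from a posterior PDF."""
    rng = random.Random(seed)
    x = x0
    lp = log_post(x)
    samples = []
    for _ in range(n_samples):
        cand = x + rng.gauss(0.0, step)
        lp_cand = log_post(cand)
        # Accept with probability min(1, exp(lp_cand - lp)).
        if lp_cand >= lp or rng.random() < math.exp(lp_cand - lp):
            x, lp = cand, lp_cand
        samples.append(x)
    return samples

# Toy posterior: a standard normal log-density (up to a constant).
draws = metropolis(lambda t: -0.5 * t * t, x0=3.0, n_samples=5000)
print(sum(draws) / len(draws))  # sample mean, close to the posterior mean 0
```

    In the crop-model setting, `log_post` would combine the likelihood of observed prefecture-level yields under the simulator with parameter priors, and the retained draws would drive the ensemble simulations described above.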

  12. Coupled Thermo-Hydro-Mechanical Numerical Framework for Simulating Unconventional Formations

    NASA Astrophysics Data System (ADS)

    Garipov, T. T.; White, J. A.; Lapene, A.; Tchelepi, H.

    2016-12-01

    Unconventional deposits are found in all world oil provinces. Modeling these systems is challenging, however, due to complex thermo-hydro-mechanical processes that govern their behavior. As a motivating example, we consider in situ thermal processing of oil shale deposits. When oil shale is heated to sufficient temperatures, kerogen can be converted to oil and gas products over a relatively short timespan. This phase change dramatically impacts both the mechanical and hydrologic properties of the rock, leading to strongly coupled THMC interactions. Here, we present a numerical framework for simulating tightly-coupled chemistry, geomechanics, and multiphase flow within a reservoir simulator (the AD-GPRS General Purpose Research Simulator). We model changes in the constitutive behavior of the rock using a thermoplasticity model that accounts for microstructural evolution. The multi-component, multiphase flow and transport processes of both mass and heat are modeled at the macroscopic (e.g., Darcy) scale. The phase compositions and properties are described by a cubic equation of state; Arrhenius-type chemical reactions are used to represent kerogen conversion. The system of partial differential equations is discretized using finite volumes for the flow problem and finite elements for the mechanics problem. Fully implicit and sequentially implicit methods are used to solve the resulting nonlinear problem. The proposed framework is verified against available analytical and numerical benchmark cases. We demonstrate the efficiency, performance, and capabilities of the proposed simulation framework by analyzing near-well deformation in an oil shale formation.
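
    The Arrhenius-type reaction rate the abstract uses for kerogen conversion has the standard closed form k = A·exp(-Ea/(R·T)). A sketch with illustrative, uncalibrated parameters (the values below are not from the paper):

```python
import math

def arrhenius_rate(A, Ea, T, R=8.314):
    """Arrhenius reaction rate k = A * exp(-Ea / (R * T)).

    A  : pre-exponential factor (1/s)
    Ea : activation energy (J/mol)
    T  : absolute temperature (K)
    R  : universal gas constant (J/(mol K))
    """
    return A * math.exp(-Ea / (R * T))

# Illustrative parameters: the rate rises steeply with temperature,
# which is why in situ heating converts kerogen over a short timespan.
ratio = arrhenius_rate(A=1e13, Ea=2.0e5, T=600) / arrhenius_rate(A=1e13, Ea=2.0e5, T=550)
print(ratio > 1)  # True: tens of times faster for a 50 K increase
```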

  13. Using a User-Interactive QA System for Personalized E-Learning

    ERIC Educational Resources Information Center

    Hu, Dawei; Chen, Wei; Zeng, Qingtian; Hao, Tianyong; Min, Feng; Wenyin, Liu

    2008-01-01

    A personalized e-learning framework based on a user-interactive question-answering (QA) system is proposed, in which a user-modeling approach is used to capture personal information of students and a personalized answer extraction algorithm is proposed for personalized automatic answering. In our approach, a topic ontology (or concept hierarchy)…

  14. The Career Resources Model: An Integrative Framework for Career Counsellors

    ERIC Educational Resources Information Center

    Hirschi, Andreas

    2012-01-01

    Changes in the nature of work and organisations have led to an increased need for self-directed career management (SDCM). However, there is no consensus in the literature of what constitutes SDCM and many related concepts have been proposed. Integrating previous research across different conceptualisations of SDCM, the article proposes four…

  15. Establishing a Numerical Modeling Framework for Hydrologic Engineering Analyses of Extreme Storm Events

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Xiaodong; Hossain, Faisal; Leung, L. Ruby

    In this study a numerical modeling framework for simulating extreme storm events was established using the Weather Research and Forecasting (WRF) model. Such a framework is necessary for the derivation of engineering parameters such as probable maximum precipitation that are the cornerstone of large water management infrastructure design. Here this framework was built based on a heavy storm that occurred in Nashville (USA) in 2010, and verified using two other extreme storms. To achieve the optimal setup, several combinations of model resolutions, initial/boundary conditions (IC/BC), cloud microphysics and cumulus parameterization schemes were evaluated using multiple metrics of precipitation characteristics. The evaluation suggests that WRF is most sensitive to the IC/BC option. Simulation generally benefits from finer resolutions, up to 5 km. At the 15 km level, NCEP2 IC/BC produces better results, while NAM IC/BC performs best at the 5 km level. The recommended model configuration from this study is: NAM or NCEP2 IC/BC (depending on data availability), 15 km or 15 km-5 km nested grids, Morrison microphysics and the Kain-Fritsch cumulus scheme. Validation of the optimal framework suggests that these options are good starting choices for modeling extreme events similar to the test cases. This optimal framework is proposed in response to emerging engineering demands for extreme storm event forecasting and analyses for the design, operations and risk assessment of large water infrastructures.

  16. The experiential health information processing model: supporting collaborative web-based patient education

    PubMed Central

    O'Grady, Laura A; Witteman, Holly; Wathen, C Nadine

    2008-01-01

    Background First generation Internet technologies such as mailing lists or newsgroups afforded unprecedented levels of information exchange within a variety of interest groups, including those who seek health information. With the emergence of the World Wide Web, many communication applications were ported to web browsers. One of the driving factors in this phenomenon has been the exchange of experiential or anecdotal knowledge that patients share online, and there is emerging evidence that participation in these forums may be having an impact on people's health decision making. Theoretical frameworks supporting this form of information seeking and learning have yet to be proposed. Results In this article, we propose an adaptation of Kolb's experiential learning theory to begin to formulate an experiential health information processing model that may contribute to our understanding of online health information seeking behaviour in this context. Conclusion An experiential health information processing model is proposed that can be used as a research framework. Future research directions include investigating the utility of this model in the online health information seeking context and studying the impact of collaborating in these online environments on patient decision making and on health outcomes. PMID:19087353

  17. The experiential health information processing model: supporting collaborative web-based patient education.

    PubMed

    O'Grady, Laura A; Witteman, Holly; Wathen, C Nadine

    2008-12-16

    First generation Internet technologies such as mailing lists or newsgroups afforded unprecedented levels of information exchange within a variety of interest groups, including those who seek health information. With the emergence of the World Wide Web, many communication applications were ported to web browsers. One of the driving factors in this phenomenon has been the exchange of experiential or anecdotal knowledge that patients share online, and there is emerging evidence that participation in these forums may be having an impact on people's health decision making. Theoretical frameworks supporting this form of information seeking and learning have yet to be proposed. In this article, we propose an adaptation of Kolb's experiential learning theory to begin to formulate an experiential health information processing model that may contribute to our understanding of online health information seeking behaviour in this context. An experiential health information processing model is proposed that can be used as a research framework. Future research directions include investigating the utility of this model in the online health information seeking context and studying the impact of collaborating in these online environments on patient decision making and on health outcomes.

  18. A multi-stakeholder framework for urban runoff quality management: Application of social choice and bargaining techniques.

    PubMed

    Ghodsi, Seyed Hamed; Kerachian, Reza; Zahmatkesh, Zahra

    2016-04-15

    In this paper, an integrated framework is proposed for urban runoff management. To control and improve runoff quality and quantity, Low Impact Development (LID) practices are utilized. To determine the LIDs' areas and locations, the Non-dominated Sorting Genetic Algorithm-II (NSGA-II) is applied with three objective functions: minimizing runoff volume, runoff pollution and the implementation cost of LIDs. In this framework, the Storm Water Management Model (SWMM) is used for stream flow simulation. The non-dominated solutions provided by the NSGA-II are considered as management scenarios. To select the most preferred scenario, interactions among the main stakeholders in the study area with conflicting utilities are incorporated by utilizing bargaining models, including a non-cooperative game (Nash model), and the social choice procedures of Borda count and approval voting. Moreover, a new social choice procedure, named the pairwise voting method, is proposed and applied. Based on each conflict resolution approach, a scenario is identified as the ideal solution, providing the LIDs' areas, locations and implementation cost. The proposed framework is applied to urban water quality and quantity management in the northern part of the Tehran metropolitan area, Iran. Results show that the proposed pairwise voting method tends to select a scenario with a higher percentage reduction in TSS (Total Suspended Solids) load and runoff volume than the Borda count and approval voting methods. In addition, the Nash method yields the management scenario with the highest LID implementation cost and the maximum percentage reductions in runoff volume and TSS. The results also indicate that the stakeholders' selection of an appropriate management scenario depends on the available financial resources and the relative importance of improving runoff quality versus reducing runoff volume.
Copyright © 2016 Elsevier B.V. All rights reserved.
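
    The abstract names a new "pairwise voting method" but does not specify it; a generic Condorcet-style pairwise vote over stakeholder rankings can be sketched as follows. The function name, the strict-majority rule and the tie handling are assumptions, not the authors' procedure:

```python
from itertools import combinations

def pairwise_voting_winner(rankings):
    """Pick the scenario that wins the most head-to-head majority
    comparisons. rankings: one preference order per stakeholder,
    best scenario first."""
    candidates = set(rankings[0])
    wins = {c: 0 for c in candidates}
    for a, b in combinations(candidates, 2):
        # Count stakeholders who rank a above b.
        a_pref = sum(r.index(a) < r.index(b) for r in rankings)
        if a_pref > len(rankings) - a_pref:
            wins[a] += 1           # a wins this pairwise contest
        elif a_pref < len(rankings) - a_pref:
            wins[b] += 1           # b wins; exact ties score neither
    return max(wins, key=wins.get)

# Three stakeholders ranking three runoff-management scenarios
rankings = [["S1", "S2", "S3"], ["S2", "S1", "S3"], ["S1", "S3", "S2"]]
print(pairwise_voting_winner(rankings))  # S1 beats both S2 and S3 head-to-head
```

    Borda count and approval voting, by contrast, aggregate positional scores or approvals directly, which is why the three procedures can pick different NSGA-II scenarios from the same stakeholder preferences.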

  19. Concurrent enterprise: a conceptual framework for enterprise supply-chain network activities

    NASA Astrophysics Data System (ADS)

    Addo-Tenkorang, Richard; Helo, Petri T.; Kantola, Jussi

    2017-04-01

    Supply-chain management (SCM) in manufacturing industries has evolved significantly over the years, and research has increasingly focused on the development of integrated solutions: collaborative optimisation of the geographical, just-in-time (JIT), quality (customer demand/satisfaction) and return-on-investment (profit) aspects of organisational management and planning through 'best practice' business-process management concepts and applications, employing system tools such as enterprise resource planning (ERP) and SCM systems with information technology (IT) enablers to enhance enterprise integrated product development/concurrent engineering principles. This article draws on three main organisation theory applications to position its assumptions, proposing a feasible industry-specific framework not currently included in the SCOR model's level four (4) implementation level, nor in other existing SCM integration reference models such as the MIT Process Handbook's Process Interchange Format (PIF) and the TOVE project, and one that could be replicated in other supply chains. The wider contribution of this paper, however, is a framework complementary to the SCC's SCOR reference model. Quantitative empirical closed-ended questionnaires, in addition to data collected from a qualitative empirical real-life industry-based pilot case study, were used to propose a conceptual concurrent enterprise framework for SCM network activities. The research adopts a design structure matrix simulation approach to propose an optimal enterprise SCM-networked, value-adding, customised master data-management platform/portal for efficient SCM network information exchange and an effective supply-chain (SC) network systems-design team structure.
    Furthermore, social network theory analysis is employed, in a triangulation approach with statistical correlation analysis, to assess the frequency, importance, degree of collaboration, mutual trust, and roles and responsibilities within the enterprise SCM network for systems product development (PD) design teams' technical communication, complemented by extensive literature reviews.

  20. A general framework for parametric survival analysis.

    PubMed

    Crowther, Michael J; Lambert, Paul C

    2014-12-30

    Parametric survival models are being increasingly used as an alternative to the Cox model in biomedical research. Through direct modelling of the baseline hazard function, we can gain greater understanding of the risk profile of patients over time, obtaining absolute measures of risk. Commonly used parametric survival models, such as the Weibull, make restrictive assumptions about the baseline hazard function, such as monotonicity, that are often violated in clinical datasets. In this article, we extend the general framework of parametric survival models proposed by Crowther and Lambert (Journal of Statistical Software 53:12, 2013) to incorporate relative survival, and robust and cluster-robust standard errors. We describe the general framework through three applications to clinical datasets, in particular illustrating the use of restricted cubic splines, modelled on the log hazard scale, to provide a highly flexible survival modelling framework. Through the use of restricted cubic splines, we can derive the cumulative hazard function analytically beyond the boundary knots, resulting in a combined analytic/numerical approach which substantially improves the estimation process compared with using numerical integration alone. User-friendly Stata software is provided, which significantly extends the parametric survival models available in standard software. Copyright © 2014 John Wiley & Sons, Ltd.
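
    One common parameterisation of the restricted cubic spline basis (the Durrleman-Simon form) makes the linearity beyond the boundary knots easy to verify. This is a generic sketch, not the authors' Stata implementation, and the knot values are illustrative:

```python
def rcs_basis(x, knots):
    """Restricted cubic spline basis at x (Durrleman-Simon form):
    cubic between the knots, constrained to be linear beyond the
    boundary knots. Returns [x, v_1(x), ..., v_{K-2}(x)]."""
    k = sorted(knots)
    kmax, kpen = k[-1], k[-2]   # last and penultimate knots

    def p3(u):                  # truncated cubic (u)_+^3
        return u ** 3 if u > 0 else 0.0

    basis = [x]                 # linear term
    for kj in k[:-2]:
        lam = (kmax - kj) / (kmax - kpen)
        basis.append(p3(x - kj) - lam * p3(x - kpen) + (lam - 1) * p3(x - kmax))
    return basis

knots = [1.0, 2.0, 3.0, 4.0]
# Beyond the last knot the spline is linear: second differences vanish.
lo, mid, hi = (rcs_basis(x, knots) for x in (10.0, 10.5, 11.0))
print(all(abs(a - 2 * m + c) < 1e-9 for a, m, c in zip(lo, mid, hi)))  # True
```

    A log-hazard model is then a linear combination of these basis terms; the boundary-knot linearity is what lets the cumulative hazard be continued analytically beyond the last knot, as the abstract notes.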

  1. Multiresolution multiscale active mask segmentation of fluorescence microscope images

    NASA Astrophysics Data System (ADS)

    Srinivasa, Gowri; Fickus, Matthew; Kovačević, Jelena

    2009-08-01

    We propose an active mask segmentation framework that combines the advantages of statistical modeling, smoothing, speed and flexibility offered by the traditional methods of region-growing, multiscale, multiresolution and active contours respectively. At the crux of this framework is a paradigm shift from evolving contours in the continuous domain to evolving multiple masks in the discrete domain. Thus, the active mask framework is particularly suited to segment digital images. We demonstrate the use of the framework in practice through the segmentation of punctate patterns in fluorescence microscope images. Experiments reveal that statistical modeling helps the multiple masks converge from a random initial configuration to a meaningful one. This obviates the need for an involved initialization procedure germane to most of the traditional methods used to segment fluorescence microscope images. While we provide the mathematical details of the functions used to segment fluorescence microscope images, this is only an instantiation of the active mask framework. We suggest some other instantiations of the framework to segment different types of images.

  2. A unified framework for heat and mass transport at the atomic scale

    NASA Astrophysics Data System (ADS)

    Ponga, Mauricio; Sun, Dingyi

    2018-04-01

    We present a unified framework to simulate heat and mass transport in systems of particles. The proposed framework is based on kinematic mean field theory and uses a phenomenological master equation to compute effective transport rates between particles without the need to evaluate operators. We exploit this advantage and apply the model to simulate transport phenomena at the nanoscale. We demonstrate that, when calibrated to experimentally-measured transport coefficients, the model can accurately predict transient and steady state temperature and concentration profiles even in scenarios where the length of the device is comparable to the mean free path of the carriers. Through several example applications, we demonstrate the validity of our model for all classes of materials, including ones that, until now, would have been outside the domain of computational feasibility.
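
    The phenomenological master-equation idea (effective transport rates between particles, with no operators to evaluate) can be sketched as a simple explicit-Euler exchange rule. The rate constant, time step and neighbor structure below are illustrative assumptions, not the paper's kinematic mean-field rates:

```python
def relax_step(values, neighbors, rate, dt):
    """One explicit Euler step of a phenomenological master equation:
    each particle exchanges its carried quantity (e.g. temperature or
    concentration) with its neighbors at a fixed transport rate.
    The symmetric exchange conserves the total."""
    new = list(values)
    for i, nbrs in neighbors.items():
        for j in nbrs:
            new[i] += rate * dt * (values[j] - values[i])
    return new

# 1-D chain of 5 particles: a hot end relaxing toward equilibrium
T = [300.0, 100.0, 100.0, 100.0, 100.0]
nbrs = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
for _ in range(200):
    T = relax_step(T, nbrs, rate=1.0, dt=0.1)
print([round(t) for t in T])  # [140, 140, 140, 140, 140] -- the conserved mean
```

    Calibrating the exchange rate to a measured transport coefficient, as the abstract describes, is what lets such a coarse rule reproduce transient and steady-state profiles without evaluating microscopic operators.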

  3. A unified theoretical framework for mapping models for the multi-state Hamiltonian.

    PubMed

    Liu, Jian

    2016-11-28

    We propose a new unified theoretical framework to construct equivalent representations of the multi-state Hamiltonian operator and present several approaches for the mapping onto the Cartesian phase space. After mapping an F-dimensional Hamiltonian onto an F+1 dimensional space, creation and annihilation operators are defined such that the F+1 dimensional space is complete for any combined excitation. Commutation and anti-commutation relations are then naturally derived, which show that the underlying degrees of freedom are neither bosons nor fermions. This sets the scene for developing equivalent expressions of the Hamiltonian operator in quantum mechanics and their classical/semiclassical counterparts. Six mapping models are presented as examples. The framework also offers a novel way to derive mapping models such as the well-known Meyer-Miller model.

  4. Petri Net and Probabilistic Model Checking Based Approach for the Modelling, Simulation and Verification of Internet Worm Propagation

    PubMed Central

    Razzaq, Misbah; Ahmad, Jamil

    2015-01-01

    Internet worms are analogous to biological viruses, since they can infect a host and have the ability to propagate through a chosen medium. To prevent the spread of a worm or to grasp how to regulate a prevailing worm, compartmental models are commonly used as a means to examine and understand the patterns and mechanisms of a worm spread. However, one of the greatest challenges is to produce methods to verify and validate the behavioural properties of a compartmental model. In this study, we therefore suggest a framework based on Petri Nets and Model Checking through which we can meticulously examine and validate these models. We investigate the Susceptible-Exposed-Infectious-Recovered (SEIR) model and propose a new model, Susceptible-Exposed-Infectious-Recovered-Delayed-Quarantined (Susceptible/Recovered) (SEIDQR(S/I)), along with a hybrid quarantine strategy, which is then constructed and analysed using Stochastic Petri Nets and Continuous Time Markov Chains. The analysis shows that the hybrid quarantine strategy is extremely effective in reducing the risk of propagating the worm. Through Model Checking, we gained insight into the functionality of compartmental models. The Model Checking results agree well with the simulation results, fully supporting the proposed framework. PMID:26713449

  5. Petri Net and Probabilistic Model Checking Based Approach for the Modelling, Simulation and Verification of Internet Worm Propagation.

    PubMed

    Razzaq, Misbah; Ahmad, Jamil

    2015-01-01

    Internet worms are analogous to biological viruses, since they can infect a host and have the ability to propagate through a chosen medium. To prevent the spread of a worm or to grasp how to regulate a prevailing worm, compartmental models are commonly used as a means to examine and understand the patterns and mechanisms of a worm spread. However, one of the greatest challenges is to produce methods to verify and validate the behavioural properties of a compartmental model. In this study, we therefore suggest a framework based on Petri Nets and Model Checking through which we can meticulously examine and validate these models. We investigate the Susceptible-Exposed-Infectious-Recovered (SEIR) model and propose a new model, Susceptible-Exposed-Infectious-Recovered-Delayed-Quarantined (Susceptible/Recovered) (SEIDQR(S/I)), along with a hybrid quarantine strategy, which is then constructed and analysed using Stochastic Petri Nets and Continuous Time Markov Chains. The analysis shows that the hybrid quarantine strategy is extremely effective in reducing the risk of propagating the worm. Through Model Checking, we gained insight into the functionality of compartmental models. The Model Checking results agree well with the simulation results, fully supporting the proposed framework.
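
    The Continuous Time Markov Chain view of a compartmental worm model can be illustrated with Gillespie's direct method on the plain SEIR model (not the authors' SEIDQR(S/I) variant); the parameter values below are illustrative, not from the paper:

```python
import random

def seir_gillespie(beta, sigma, gamma, s0, e0, i0, r0, t_max, seed=1):
    """Simulate the SEIR compartmental model as a continuous-time
    Markov chain using Gillespie's direct method. Returns the final
    (S, E, I, R) counts."""
    rng = random.Random(seed)
    s, e, i, r, t = s0, e0, i0, r0, 0.0
    n = s0 + e0 + i0 + r0
    while t < t_max and (e or i):
        # Event rates: infection S->E, incubation E->I, recovery I->R.
        rates = [beta * s * i / n, sigma * e, gamma * i]
        total = sum(rates)
        t += rng.expovariate(total)        # time to next event
        u = rng.random() * total           # pick which event fires
        if u < rates[0]:
            s, e = s - 1, e + 1
        elif u < rates[0] + rates[1]:
            e, i = e - 1, i + 1
        else:
            i, r = i - 1, r + 1
    return s, e, i, r

print(seir_gillespie(beta=0.5, sigma=0.3, gamma=0.2,
                     s0=990, e0=0, i0=10, r0=0, t_max=200.0))
```

    A Stochastic Petri Net of the same model has one place per compartment and one transition per rate; the CTMC simulated here is exactly the net's underlying semantics, which is what the model checking in the paper analyses exhaustively rather than by sampling.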

  6. Adaptive and neuroadaptive control for nonnegative and compartmental dynamical systems

    NASA Astrophysics Data System (ADS)

    Volyanskyy, Kostyantyn Y.

    Neural networks have been extensively used for adaptive system identification as well as adaptive and neuroadaptive control of highly uncertain systems. The goal of adaptive and neuroadaptive control is to achieve system performance without excessive reliance on system models. To improve the robustness and the speed of adaptation of adaptive and neuroadaptive controllers, several controller architectures have been proposed in the literature. In this dissertation, we develop a new neuroadaptive control architecture for nonlinear uncertain dynamical systems. The proposed framework involves a novel controller architecture with additional terms in the update laws that are constructed using a moving window of the integrated system uncertainty. These terms can be used to identify the ideal system weights of the neural network as well as effectively suppress system uncertainty. Linear and nonlinear parameterizations of the system uncertainty are considered, and state and output feedback neuroadaptive controllers are developed. Furthermore, we extend the developed framework to discrete-time dynamical systems. To illustrate the efficacy of the proposed approach, we apply our results to an aircraft model with wing rock dynamics, a spacecraft model with unknown moment of inertia, and an unmanned combat aerial vehicle undergoing actuator failures, and compare our results with standard neuroadaptive control methods. Nonnegative systems are essential in capturing the behavior of a wide range of dynamical systems involving dynamic states whose values are nonnegative. A sub-class of nonnegative dynamical systems is the class of compartmental systems. These systems are derived from mass and energy balance considerations and are composed of homogeneous interconnected microscopic subsystems, or compartments, which exchange variable quantities of material via intercompartmental flow laws.
In this dissertation, we develop a direct adaptive and neuroadaptive control framework for stabilization, disturbance rejection, and noise suppression for nonnegative and compartmental dynamical systems with noise and exogenous system disturbances. We then use the developed framework to control the infusion of the anesthetic drug propofol for maintaining a desired constant level of depth of anesthesia for surgery in the face of continuing hemorrhage and hemodilution. Critical care patients, whether undergoing surgery or recovering in intensive care units, require drug administration to regulate physiological variables such as blood pressure, cardiac output, heart rate, and degree of consciousness. The rate of infusion of each administered drug is critical, requiring constant monitoring and frequent adjustments. In this dissertation, we develop a neuroadaptive output feedback control framework for nonlinear uncertain nonnegative and compartmental systems with nonnegative control inputs and noisy measurements. The proposed framework is Lyapunov-based and guarantees ultimate boundedness of the error signals. In addition, the neuroadaptive controller guarantees that the physical system states remain in the nonnegative orthant of the state space. Finally, the developed approach is used to control the infusion of the anesthetic drug propofol for maintaining a desired constant level of depth of anesthesia for surgery in the face of noisy electroencephalographic (EEG) measurements. Clinical trials demonstrate excellent regulation of unconsciousness, allowing for a safe and effective administration of the anesthetic agent propofol. Furthermore, a neuroadaptive output feedback control architecture for nonlinear nonnegative dynamical systems with input amplitude and integral constraints is developed.
Specifically, the neuroadaptive controller guarantees that the imposed amplitude and integral input constraints are satisfied and the physical system states remain in the nonnegative orthant of the state space. The proposed approach is used to control the infusion of the anesthetic drug propofol for maintaining a desired constant level of depth of anesthesia for noncardiac surgery in the face of infusion rate constraints and a drug dosing constraint over a specified period. In addition, the aforementioned control architecture is used to control lung volume and minute ventilation with input pressure constraints, while also accounting for spontaneous breathing by the patient. Specifically, we develop a pressure- and work-limited neuroadaptive controller for mechanical ventilation based on a nonlinear multi-compartmental lung model. The control framework does not rely on any averaged data and is designed to automatically adjust the input pressure to the patient's physiological characteristics, capturing lung resistance and compliance modeling uncertainty. Moreover, the controller accounts for input pressure constraints as well as work of breathing constraints. The effect of spontaneous breathing is incorporated within the lung model and the control framework. Finally, a neural network hybrid adaptive control framework for nonlinear uncertain hybrid dynamical systems is developed. The proposed hybrid adaptive control framework is Lyapunov-based and guarantees partial asymptotic stability of the closed-loop hybrid system; that is, asymptotic stability with respect to part of the closed-loop system states associated with the hybrid plant states. A numerical example is provided to demonstrate the efficacy of the proposed hybrid adaptive stabilization approach.
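The moving-window idea in this record can be sketched schematically. The following is a toy illustration only, not the dissertation's Lyapunov-based control law: the function name `neuroadaptive_update`, the gains `gamma` and `kappa`, and the contents of the window are hypothetical stand-ins for the actual update laws.

```python
def neuroadaptive_update(w, phi, e, window, gamma=0.1, kappa=0.05):
    """One discrete update of neural weights: the usual gradient-like
    correction from the current regressor phi and tracking error e, plus
    an extra term built from a moving window of past (phi, e) pairs, in
    the spirit of the architecture the dissertation describes."""
    # Standard adaptation term driven by the current error.
    w = [wi - gamma * e * pi for wi, pi in zip(w, phi)]
    # Moving-window term: replay recent (regressor, error) pairs to help
    # identify the ideal weights and suppress residual uncertainty.
    for p_old, e_old in window:
        w = [wi - kappa * e_old * pi for wi, pi in zip(w, p_old)]
    return w
```

With a nonempty window, past uncertainty information keeps shaping the weights even when the instantaneous error is small, which is the intuition behind the extra terms in the update laws.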

  7. Clinical time series prediction: Toward a hierarchical dynamical system framework.

    PubMed

    Liu, Zitao; Hauskrecht, Milos

    2015-09-01

    Developing machine learning and data mining algorithms for building temporal models of clinical time series is important for understanding the patient's condition, the dynamics of a disease, the effect of various patient management interventions, and clinical decision making. In this work, we propose and develop a novel hierarchical framework for modeling clinical time series data of varied length and with irregularly sampled observations. Our hierarchical dynamical system framework for modeling clinical time series combines the advantages of two temporal modeling approaches: the linear dynamical system and the Gaussian process. We model the irregularly sampled clinical time series by using multiple Gaussian process sequences in the lower level of our hierarchical framework and capture the transitions between Gaussian processes by utilizing the linear dynamical system. The experiments are conducted on the complete blood count (CBC) panel data of 1000 post-surgical cardiac patients during their hospitalization. Our framework is evaluated and compared to multiple baseline approaches in terms of the mean absolute prediction error and the absolute percentage error. We tested our framework by first learning the time series model from data for the patients in the training set, and then using it to predict future time series values for the patients in the test set. We show that our model outperforms multiple existing models in terms of its predictive accuracy. Our method achieved a 3.13% average prediction accuracy improvement on ten CBC lab time series when compared against the best performing baseline. A 5.25% average accuracy improvement was observed when only short-term predictions were considered. A new hierarchical dynamical system framework that lets us model irregularly sampled time series data is a promising new direction for modeling clinical time series and for improving their predictive performance. Copyright © 2014 Elsevier B.V. All rights reserved.
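The two building blocks this record combines can be sketched in a few lines of numpy. This is a minimal illustration under simplifying assumptions (an RBF kernel with fixed hyperparameters for the lower-level Gaussian process, and a known transition matrix for the linear dynamical system), not the authors' implementation; the function names are hypothetical.

```python
import numpy as np

def gp_interpolate(t_obs, y_obs, t_query, length_scale=1.0, noise=0.1):
    """Posterior mean of a GP with an RBF kernel: regularizes an
    irregularly sampled series onto arbitrary query times."""
    def k(a, b):
        return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length_scale ** 2)
    K = k(t_obs, t_obs) + noise * np.eye(len(t_obs))
    return k(t_query, t_obs) @ np.linalg.solve(K, y_obs)

def lds_predict(z, A):
    """One-step prediction of the window-level hidden state with a
    linear transition, capturing dynamics across GP segments."""
    return A @ z
```

In the hierarchical scheme, each GP summarizes one irregularly sampled segment and the linear transition links consecutive segment-level states.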

  8. Quantifying structural states of soft mudrocks

    NASA Astrophysics Data System (ADS)

    Li, B.; Wong, R. C. K.

    2016-05-01

    In this paper, a cm model is proposed to quantify structural states of soft mudrocks, which are dependent on clay fractions and porosities. Physical properties of natural and reconstituted soft mudrock samples are used to derive the two parameters in the cm model. With the cm model, a simplified homogenization approach is proposed to estimate geomechanical properties and fabric orientation distributions of soft mudrocks based on mixture theory. Soft mudrocks are treated as a mixture of nonclay minerals and clay-water composites. Nonclay minerals have a high stiffness and serve as a structural framework of mudrocks when they have a high volume fraction. Clay-water composites occupy the void space among nonclay minerals and serve as an in-fill matrix. With increasing volume fraction of clay-water composites, there is a transition in the structural state from framework supported to matrix supported. The decreases in shear strength and pore size, as well as the increases in compressibility and fabric anisotropy, are quantitatively related to this transition. The new homogenization approach based on the proposed cm model performs better than common effective medium modeling approaches because the interactions among nonclay minerals and clay-water composites are considered. With wireline logging data, the cm model is applied to quantify the structural states of Colorado shale formations at different depths in the Cold Lake area, Alberta, Canada. Key geomechanical parameters are estimated based on the proposed homogenization approach, and the critical intervals with low-strength shale formations are identified.
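The mixture-theory picture of a stiff framework phase and a soft in-fill matrix can be illustrated with the classical Voigt and Reuss bounds on the stiffness of a two-phase mixture. These are textbook bounds, not the paper's cm model or its homogenization scheme, which additionally accounts for phase interactions.

```python
def voigt(f, e1, e2):
    """Upper (Voigt) bound on mixture stiffness: volume-fraction
    weighted arithmetic mean of the phase stiffnesses e1, e2."""
    return f * e1 + (1 - f) * e2

def reuss(f, e1, e2):
    """Lower (Reuss) bound: volume-fraction weighted harmonic mean,
    appropriate when the soft in-fill matrix carries the load."""
    return 1.0 / (f / e1 + (1 - f) / e2)
```

The framework-supported and matrix-supported states correspond loosely to the mixture behaving nearer the upper or lower bound, respectively.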

  9. Coalescent: an open-science framework for importance sampling in coalescent theory.

    PubMed

    Tewari, Susanta; Spouge, John L

    2015-01-01

    Background. In coalescent theory, computer programs often use importance sampling to calculate likelihoods and other statistical quantities. An importance sampling scheme can exploit human intuition to improve the statistical efficiency of computations, but unfortunately, in the absence of general computer frameworks for importance sampling, researchers often struggle to translate new sampling schemes computationally or to benchmark against different schemes in a manner that is reliable and maintainable. Moreover, most studies use computer programs lacking a convenient user interface or the flexibility to meet the current demands of open science. In particular, current computer frameworks can only evaluate the efficiency of a single importance sampling scheme or compare the efficiencies of different schemes in an ad hoc manner. Results. We have designed a general framework (http://coalescent.sourceforge.net; language: Java; License: GPLv3) for importance sampling that computes likelihoods under the standard neutral coalescent model of a single, well-mixed population of constant size over time, following the infinite-sites model of mutation. The framework models the necessary core concepts, comes integrated with several data sets of varying size, implements the standard competing proposals, and integrates tightly with our previous framework for calculating exact probabilities. For a given dataset, it computes the likelihood and provides the maximum likelihood estimate of the mutation parameter. Well-known benchmarks in the coalescent literature validate the accuracy of the framework. The framework provides an intuitive user interface with minimal clutter. For performance, the framework switches automatically to modern multicore hardware, if available. It runs on three major platforms (Windows, Mac and Linux). Extensive tests and coverage make the framework reliable and maintainable. Conclusions.
In coalescent theory, many studies of computational efficiency consider only effective sample size. Here, we evaluate proposals in the coalescent literature and discover that the order of efficiency among the three importance sampling schemes changes when one considers running time as well as effective sample size. We also describe a computational technique called "just-in-time delegation", available to improve the trade-off between running time and precision by constructing improved importance sampling schemes from existing ones. Thus, our systems approach is a potential solution to the "2^8 programs problem" highlighted by Felsenstein, because it provides the flexibility to include or exclude various features of similar coalescent models or importance sampling schemes.
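The two quantities being compared in this record, an importance-sampling likelihood estimate and the effective sample size, can be sketched generically. The framework itself is written in Java; this Python fragment only mirrors the general idea, and the function names are hypothetical.

```python
import random

def importance_sample_likelihood(simulate_q, weight, n=10000, seed=0):
    """Monte Carlo likelihood estimate: average the importance weights
    p(data, h) / q(h) over latent histories h drawn from the proposal q."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        h = simulate_q(rng)   # sample a latent history from the proposal
        total += weight(h)    # importance weight of that history
    return total / n

def effective_sample_size(weights):
    """Kish effective sample size, the efficiency measure that (per the
    abstract) is insufficient on its own without running time."""
    s = sum(weights)
    s2 = sum(w * w for w in weights)
    return s * s / s2
```

A proposal with highly uneven weights yields a small effective sample size even when many histories are simulated, which is why running time per history also matters when ranking schemes.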

  10. Expanding on Successful Concepts, Models, and Organization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Teeguarden, Justin G.; Tan, Yu-Mei; Edwards, Stephen W.

    In her letter to the editor1 regarding our recent Feature Article “Completing the Link between Exposure Science and Toxicology for Improved Environmental Health Decision Making: The Aggregate Exposure Pathway Framework”2, Dr. von Goetz expressed several concerns about terminology, and the perception that we propose the replacement of successful approaches and models for exposure assessment with a concept. We are glad to have the opportunity to address these issues here. If the goal of the AEP framework were to replace existing exposure models or databases for organizing exposure data with a concept, we would share Dr. von Goetz's concerns. Instead, the outcome we promote is broader use of an organizational framework for exposure science. The framework would support improved generation, organization, and interpretation of data as well as modeling and prediction, not replacement of models. The field of toxicology has seen the benefits of wide use of one or more organizational frameworks (e.g., mode and mechanism of action, adverse outcome pathway). These frameworks influence how experiments are designed; how data are collected, curated, stored, and interpreted; and ultimately how data are used in risk assessment. Exposure science is poised to similarly benefit from broader use of a parallel organizational framework, which, as Dr. von Goetz correctly points out, is currently used in the exposure modeling community. In our view, the concepts used so effectively in the exposure modeling community, expanded upon in the AEP framework, could see wider adoption by the field as a whole. The value of such a framework was recognized by the National Academy of Sciences.3 Replacement of models, databases, or any application with the AEP framework was not proposed in our article.
The positive role that broader, more consistent use of such a framework might have in enabling and advancing “general activities such as data acquisition, organization…,” and exposure modeling was discussed in some detail. Like Dr. von Goetz, we recognized the challenges associated with acceptance of the terminology, definitions, and structure proposed in the paper. To address these challenges, an expert workshop was held in May 2016 to consider and revise the “basic elements” outlined in the paper. The attendees produced revisions to the terminology (e.g., key events) that align with terminology currently in use in the field. We were also careful in our paper to acknowledge a point raised by Dr. von Goetz, that the term AEP implies aggregation, providing these clarifications: “The simplest form of an AEP represents a single source and a single pathway and may more commonly be referred to as an exposure pathway”; and “An aggregate exposure pathway may represent multiple sources and transfer through single pathways to the target site exposure (TSE), single sources and transfer through multiple pathways to the TSE, or any combination of these.” These clarifications address the concern that the AEP term is not accurate or logical, and further expand upon the word “aggregate” in a broader context. Our use of AEP is consistent with the definition of “aggregate exposure”, which refers to the combined exposures to a single chemical across multiple routes and pathways.3 The AEP framework embraces existing methods for collection, prediction, organization, and interpretation of human and ecological exposure data cited by Dr. von Goetz. We remain hopeful that wider recognition and use of an organizing concept for exposure information across the exposure science, toxicology, and epidemiology communities will advance the development of the kind of infrastructure and models Dr. von Goetz discusses. This outcome would be a step forward, rather than a step backward.

  11. A general modeling framework for describing spatially structured population dynamics

    USGS Publications Warehouse

    Sample, Christine; Fryxell, John; Bieri, Joanna; Federico, Paula; Earl, Julia; Wiederholt, Ruscena; Mattsson, Brady; Flockhart, Tyler; Nicol, Sam; Diffendorfer, James E.; Thogmartin, Wayne E.; Erickson, Richard A.; Norris, D. Ryan

    2017-01-01

    Variation in movement across time and space fundamentally shapes the abundance and distribution of populations. Although a variety of approaches model structured population dynamics, they are limited to specific types of spatially structured populations and lack a unifying framework. Here, we propose a unified network-based framework flexible enough to capture a wide variety of spatiotemporal processes, including metapopulations and a range of migratory patterns. It can accommodate different kinds of age structures, forms of population growth, dispersal, nomadism and migration, and alternative life-history strategies. Our objective was to link three general elements common to all spatially structured populations (space, time and movement) under a single mathematical framework. To do this, we adopt a network modeling approach. The spatial structure of a population is represented by a weighted and directed network. Each node and each edge has a set of attributes which vary through time. The dynamics of our network-based population is modeled with discrete time steps. Using both theoretical and real-world examples, we show how common elements recur across species with disparate movement strategies and how they can be combined under a unified mathematical framework. We illustrate how metapopulations, various migratory patterns, and nomadism can be represented with this modeling approach. We also apply our network-based framework to four organisms spanning a wide range of life histories, movement patterns, and carrying capacities. General computer code to implement our framework is provided, which can be applied to almost any spatially structured population. This framework contributes to our theoretical understanding of population dynamics and has practical management applications, including understanding the impact of perturbations on population size, distribution, and movement patterns.
Working within a common framework also reduces the chance that comparative analyses are colored by model details rather than general principles.
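The core recursion described in this record, node attributes updated in discrete time with movement along weighted directed edges, can be sketched as follows. The growth-then-move ordering and all names here are illustrative assumptions, not the paper's published code.

```python
import numpy as np

def step(n, r, M):
    """One discrete time step of a network population model:
    local growth at each node, then redistribution along the
    weighted directed edges of the network.
    n: population per node; r: per-node growth multipliers;
    M: column-stochastic movement matrix, M[i, j] = share moving j -> i."""
    grown = n * r          # node-level growth (node attribute update)
    return M @ grown       # movement along directed, weighted edges
```

Metapopulations, seasonal migration, and nomadism differ only in how `M` (and possibly `r`) vary across time steps, which is the sense in which the network framework unifies them.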

  12. The Gain-Loss Model: A Probabilistic Skill Multimap Model for Assessing Learning Processes

    ERIC Educational Resources Information Center

    Robusto, Egidio; Stefanutti, Luca; Anselmi, Pasquale

    2010-01-01

    Within the theoretical framework of knowledge space theory, a probabilistic skill multimap model for assessing learning processes is proposed. The learning process of a student is modeled as a function of the student's knowledge and of an educational intervention on the attainment of specific skills required to solve problems in a knowledge…

  13. CO2-induced phase transitions in diamine-appended metal–organic frameworks

    DOE PAGES

    Vlaisavljevich, Bess; Odoh, Samuel O.; Schnell, Sondre K.; ...

    2015-06-17

    Using a combination of density functional theory and lattice models, we study the effect of CO2 adsorption in an amine-functionalized metal–organic framework. These materials exhibit a step in the adsorption isotherm indicative of a phase change. The pressure at which this step occurs is not only temperature dependent but also metal-center dependent. Likewise, the heats of adsorption vary depending on the metal center. Herein we demonstrate via quantum chemical calculations that the amines should not be considered firmly anchored to the framework, and we explore the mechanism for CO2 adsorption. An ammonium carbamate species is formed via the insertion of CO2 into the M–N amine bonds. Furthermore, we translate the quantum chemical results into isotherms using a coarse-grained Monte Carlo simulation technique and show that this adsorption mechanism can explain the characteristic step observed in the experimental isotherm while a previously proposed mechanism cannot. In addition, metal analogues have been explored, and the CO2 binding energies show a strong metal dependence corresponding to the M–N amine bond strength. We show that this difference can be exploited to tune the pressure at which the step in the isotherm occurs. Additionally, the mmen–Ni2(dobpdc) framework shows Langmuir-like behavior, and our simulations show how this can be explained by competitive adsorption between the new model and a previously proposed model.
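The contrast between Langmuir-like behavior and a step isotherm can be illustrated with two textbook isotherm forms: a cooperative (Hill-type) coverage curve sharpens into a step as the cooperativity exponent grows. These closed forms are illustrative only; they are not the coarse-grained lattice Monte Carlo model used in the paper.

```python
def langmuir(p, K):
    """Langmuir fractional coverage at pressure p with affinity K:
    smooth, non-cooperative adsorption."""
    return K * p / (1.0 + K * p)

def hill(p, K, n):
    """Hill-type coverage with cooperativity exponent n: for n > 1 the
    curve steepens toward a step, mimicking a phase-change isotherm."""
    return (K * p) ** n / (1.0 + (K * p) ** n)
```

Both forms pass through half coverage at K*p = 1, but the Hill curve stays low below that pressure and rises sharply above it, qualitatively like the step the abstract describes.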

  14. Working toward integrated models of alpine plant distribution

    PubMed Central

    Carlson, Bradley Z.; Randin, Christophe F.; Boulangeat, Isabelle; Lavergne, Sébastien; Thuiller, Wilfried; Choler, Philippe

    2014-01-01

    Species distribution models (SDMs) have been frequently employed to forecast the response of alpine plants to global changes. Efforts to model alpine plant distribution have thus far been primarily based on a correlative approach, in which ecological processes are implicitly addressed through a statistical relationship between observed species occurrences and environmental predictors. Recent evidence, however, highlights the shortcomings of correlative SDMs, especially in alpine landscapes where plant species tend to be decoupled from atmospheric conditions in micro-topographic habitats and are particularly exposed to geomorphic disturbances. While alpine plants respond to the same limiting factors as plants found at lower elevations, alpine environments impose a particular set of scale-dependent and hierarchical drivers that shape the realized niche of species and that require explicit consideration in a modelling context. Several recent studies in the European Alps have successfully integrated both correlative and process-based elements into distribution models of alpine plants, but for the time being a single integrative modelling framework that includes all key drivers remains elusive. As a first step in working toward a comprehensive integrated model applicable to alpine plant communities, we propose a conceptual framework that structures the primary mechanisms affecting alpine plant distributions. We group processes into four categories, including multi-scalar abiotic drivers, gradient dependent species interactions, dispersal and spatial–temporal plant responses to disturbance. Finally, we propose a methodological framework aimed at developing an integrated model to better predict alpine plant distribution. PMID:24790594

  15. Multiplicative Multitask Feature Learning

    PubMed Central

    Wang, Xin; Bi, Jinbo; Yu, Shipeng; Sun, Jiangwen; Song, Minghu

    2016-01-01

    We investigate a general framework of multiplicative multitask feature learning which decomposes each task's model parameters into a product of two components. One of the components is shared across all tasks and the other component is task-specific. Several previous methods can be proved to be special cases of our framework. We study the theoretical properties of this framework when different regularization conditions are applied to the two decomposed components. We prove that this framework is mathematically equivalent to the widely used multitask feature learning methods that are based on a joint regularization of all model parameters, but with a more general form of regularizers. Further, an analytical formula is derived for the across-task component as related to the task-specific component for all these regularizers, leading to a better understanding of the shrinkage effects of different regularizers. Study of this framework motivates new multitask learning algorithms. We propose two new learning formulations by varying the parameters in the proposed framework. An efficient blockwise coordinate descent algorithm is developed suitable for solving the entire family of formulations with rigorous convergence analysis. Simulation studies have identified the statistical properties of data that favor the new formulations. Extensive empirical studies on various classification and regression benchmark data sets have revealed the relative advantages of the two new formulations by comparing with the state of the art, which provides instructive insights into the feature learning problem with multiple tasks. PMID:28428735
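The central parameterization, each task's weight vector as the elementwise product of a shared component and a task-specific component, can be sketched with a toy alternating update. Plain unregularized gradient steps are used here rather than the paper's regularized blockwise coordinate descent; the names and learning rate are hypothetical.

```python
import numpy as np

def task_weights(c, V):
    """Each task's weight vector is the shared component c times its
    task-specific column of V (elementwise product): w_t = c * v_t."""
    return c[:, None] * V

def alternating_step(X_list, y_list, c, V, lr=0.01):
    """One blockwise pass for least-squares tasks: a gradient step on
    each task-specific v_t with c fixed, then on the shared c with V
    fixed. The chain rule gives dL/dv_t = g * c and dL/dc = sum g * v_t."""
    for t, (X, y) in enumerate(zip(X_list, y_list)):
        w = c * V[:, t]
        g = X.T @ (X @ w - y)          # least-squares gradient w.r.t. w
        V[:, t] -= lr * g * c
    g_c = np.zeros_like(c)
    for t, (X, y) in enumerate(zip(X_list, y_list)):
        w = c * V[:, t]
        g_c += X.T @ (X @ w - y) * V[:, t]
    c -= lr * g_c
    return c, V
```

Driving an entry of `c` to zero removes that feature from every task at once, which is how the multiplicative decomposition produces shared feature selection.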

  16. A "Uses and Gratification Expectancy Model" to Predict Students' "Perceived e-Learning Experience"

    ERIC Educational Resources Information Center

    Mondi, Makingu; Woods, Peter; Rafi, Ahmad

    2008-01-01

    This study investigates "how and why" students' "Uses and Gratification Expectancy" (UGE) for e-learning resources influences their "Perceived e-Learning Experience." A "Uses and Gratification Expectancy Model" (UGEM) framework is proposed to predict students' "Perceived e-Learning Experience," and…

  17. A nonlinear autoregressive Volterra model of the Hodgkin-Huxley equations.

    PubMed

    Eikenberry, Steffen E; Marmarelis, Vasilis Z

    2013-02-01

    We propose a new variant of the Volterra-type model with a nonlinear auto-regressive (NAR) component that is a suitable framework for describing the process of action potential (AP) generation by the neuron membrane potential, and we apply it to input-output data generated by the Hodgkin-Huxley (H-H) equations. Volterra models use a functional series expansion to describe the input-output relation for most nonlinear dynamic systems, and are applicable to a wide range of physiologic systems. It is difficult, however, to apply the Volterra methodology to the H-H model because it is characterized by distinct subthreshold and suprathreshold dynamics. When threshold is crossed, an autonomous AP is generated, the output becomes temporarily decoupled from the input, and the standard Volterra model fails. Therefore, in our framework, whenever the membrane potential exceeds some threshold, it is taken as a second input to a dual-input Volterra model. This model correctly predicts membrane voltage deflection both within the subthreshold region and during APs. Moreover, the model naturally generates a post-AP afterpotential and refractory period. It is known that the H-H model converges to a limit cycle in response to a constant current injection. This behavior is correctly predicted by the proposed model, while the standard Volterra model is incapable of generating such limit cycle behavior. The inclusion of cross-kernels, which describe the nonlinear interactions between the exogenous and autoregressive inputs, is found to be absolutely necessary. The proposed model is general, non-parametric, and data-derived.
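The dual-input idea, where the past output gated by a spike threshold re-enters the model as a second input, can be sketched with first-order (linear) kernels only. The actual model also includes higher-order kernels and the cross-kernels the abstract calls necessary; those are omitted here, and the names are hypothetical.

```python
def nar_predict(u_hist, v_hist, k_in, k_ar, k_thr, theta=1.0):
    """One-step output of a dual-input autoregressive Volterra-style
    model: a first-order convolution of the exogenous input u, the past
    output v, and a second 'input' that is the past output gated by the
    spike threshold theta. Histories are ordered oldest to newest;
    kernels are ordered most recent lag first."""
    thr = [v if v > theta else 0.0 for v in v_hist]   # threshold-gated input
    return (sum(a * b for a, b in zip(k_in, reversed(u_hist)))
            + sum(a * b for a, b in zip(k_ar, reversed(v_hist)))
            + sum(a * b for a, b in zip(k_thr, reversed(thr))))
```

Below threshold the gated term vanishes and the model reduces to a standard input-output Volterra expansion; above threshold the gated term takes over, which is how the output can decouple from the input during an AP.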

  18. Brain activity and cognition: a connection from thermodynamics and information theory

    PubMed Central

    Collell, Guillem; Fauquet, Jordi

    2015-01-01

    The connection between brain and mind is an important scientific and philosophical question that we are still far from completely understanding. A crucial point for our work is noticing that thermodynamics provides a convenient framework to model brain activity, whereas cognition can be modeled in information-theoretical terms. In fact, several models have been proposed so far from both approaches. A second critical remark is the existence of deep theoretical connections between thermodynamics and information theory. Indeed, some well-known authors claim that the laws of thermodynamics are nothing but principles in information theory. Unlike in physics or chemistry, a formalization of the relationship between information and energy is currently lacking in neuroscience. In this paper we propose a framework to connect physical brain and cognitive models by means of the theoretical connections between information theory and thermodynamics. Ultimately, this article aims at providing further insight into the formal relationship between cognition and neural activity. PMID:26136709
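One concrete bridge between thermodynamics and information theory of the kind this record builds on is Landauer's principle, which prices the erasure of one bit of information in joules. The following is a standard textbook calculation, included only to make the information-energy link tangible; it is not taken from the article.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact, by SI definition)

def landauer_limit(bits, temperature_kelvin):
    """Minimum energy in joules dissipated when erasing `bits` bits
    at temperature T: E >= bits * k_B * T * ln 2 (Landauer's principle)."""
    return bits * K_B * temperature_kelvin * math.log(2)
```

At body-like temperatures (around 300 K) the bound is on the order of 3e-21 J per bit, many orders of magnitude below the measured energy cost of neural signaling, which is one reason the information-energy relationship in the brain is nontrivial to formalize.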

  19. Landscape-level effects on aboveground biomass of tropical forests: A conceptual framework.

    PubMed

    Melito, Melina; Metzger, Jean Paul; de Oliveira, Alexandre A

    2018-02-01

    Despite the general recognition that fragmentation can reduce forest biomass through edge effects, a systematic review of the literature does not reveal a clear role of edges in modulating biomass loss. Additionally, edge effects appear to be constrained by matrix type, suggesting that landscape composition has an influence on biomass stocks. The lack of empirical evidence of pervasive edge-related biomass losses across tropical forests highlights the necessity for a general framework linking landscape structure with aboveground biomass. Here, we propose a conceptual model in which landscape composition and configuration mediate the magnitude of edge effects and the seed flux among forest patches, which ultimately influences biomass. Our model hypothesizes that a rapid reduction of biomass can occur below a threshold of forest cover loss. Just below this threshold, we predict that changes in landscape configuration can strongly influence a patch's isolation, thus enhancing biomass loss. Moreover, we expect a synergism between landscape composition and patch attributes, where matrix type mediates the effects of edges on species decline, particularly for shade-tolerant species. To test our conceptual framework, we propose a sampling protocol where the effects of edges, forest amount, forest isolation, fragment size, and matrix type on biomass stocks can be assessed both collectively and individually. The proposed model unifies the combined effects of landscape and patch structure on biomass into a single framework, providing a new set of main drivers of biomass loss in human-modified landscapes. We argue that carbon trading agendas (e.g., REDD+) and carbon-conservation initiatives must go beyond the effects of forest loss and edges on biomass, considering the whole set of effects on biomass related to changes in landscape composition and configuration. © 2017 John Wiley & Sons Ltd.

  20. A Personalized Predictive Framework for Multivariate Clinical Time Series via Adaptive Model Selection.

    PubMed

    Liu, Zitao; Hauskrecht, Milos

    2017-11-01

    Building an accurate predictive model of clinical time series for a patient is critical for understanding the patient's condition, its dynamics, and optimal patient management. Unfortunately, this process is not straightforward. First, patient-specific variations are typically large, and population-based models derived or learned from many different patients are often unable to support accurate predictions for each individual patient. Moreover, the time series observed for one patient at any point in time may be too short and insufficient to learn a high-quality patient-specific model just from the patient's own data. To address these problems we propose, develop and experiment with a new adaptive forecasting framework for building multivariate clinical time series models for a patient and for supporting patient-specific predictions. The framework relies on an adaptive model switching approach that at any point in time selects the most promising time series model out of a pool of many possible models, and consequently combines the advantages of the population, patient-specific and short-term individualized predictive models. We demonstrate that the adaptive model switching framework is a very promising approach to support personalized time series prediction, and that it is able to outperform predictions based on pure population and patient-specific models, as well as other patient-specific model adaptation strategies.
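The model-switching rule, pick whichever model in the pool has been most accurate recently, can be sketched in a few lines. The mean-absolute-error-over-a-window criterion used here is an illustrative assumption, not necessarily the paper's selection statistic, and the names are hypothetical.

```python
def switch_predict(models, history, window=3):
    """Adaptive model switching: score each candidate model by its mean
    absolute one-step error on the most recent observations, then use
    the best scorer for the next-step forecast. Each model is a callable
    mapping a prefix of the series to a one-step prediction."""
    def recent_error(m):
        errs = []
        for i in range(max(1, len(history) - window), len(history)):
            errs.append(abs(m(history[:i]) - history[i]))
        return sum(errs) / len(errs)
    best = min(models, key=recent_error)
    return best(history)
```

With a population model, a patient-specific model, and a short-term model in the pool, the same rule automatically favors whichever one currently fits the patient best.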

  1. Sparsity-aware tight frame learning with adaptive subspace recognition for multiple fault diagnosis

    NASA Astrophysics Data System (ADS)

    Zhang, Han; Chen, Xuefeng; Du, Zhaohui; Yang, Boyuan

    2017-09-01

    It is a challenging problem to design excellent dictionaries that sparsely represent diverse fault information and simultaneously discriminate different fault sources. Therefore, this paper describes and analyzes a novel multiple-feature recognition framework which incorporates the tight frame learning technique with an adaptive subspace recognition strategy. The proposed framework consists of four stages. Firstly, by introducing the tight frame constraint into the popular dictionary learning model, the proposed tight frame learning model can be formulated as a nonconvex optimization problem which is solved by alternately applying a hard thresholding operation and singular value decomposition. Secondly, noise is effectively eliminated through transform sparse coding techniques. Thirdly, the denoised signal is decoupled into discriminative feature subspaces by each tight frame filter. Finally, guided by elaborately designed fault-related sensitive indexes, latent fault feature subspaces can be adaptively recognized and multiple faults diagnosed simultaneously. Extensive numerical experiments are subsequently conducted to investigate the sparsifying capability of the learned tight frame as well as its comprehensive denoising performance. Most importantly, the feasibility and superiority of the proposed framework are verified through multiple fault diagnosis of motor bearings. Compared with the state-of-the-art fault detection techniques, some important advantages have been observed: firstly, the proposed framework incorporates physical priors with the data-driven strategy, and multiple fault features with similar oscillation morphology can naturally be adaptively decoupled. Secondly, the tight frame dictionary directly learned from the noisy observation can significantly promote the sparsity of fault features compared to analytical tight frames.
Thirdly, a satisfactory complete signal space description property is guaranteed and thus weak feature leakage problem is avoided compared to typical learning methods.
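    The alternating scheme described above (hard thresholding for sparse coding, an SVD-based update that keeps the analysis operator a Parseval tight frame) can be sketched as follows; the atom count, threshold, and iteration budget are illustrative, not taken from the paper:

```python
import numpy as np

def hard_threshold(c, lam):
    """Keep coefficients whose magnitude exceeds lam; zero the rest."""
    return c * (np.abs(c) > lam)

def learn_tight_frame(X, n_atoms, lam=0.5, n_iter=30, seed=0):
    """Alternate hard thresholding (sparse coding) with an orthogonal
    Procrustes update that keeps W a Parseval tight frame (W.T @ W = I)."""
    rng = np.random.default_rng(seed)
    d = X.shape[0]
    W = np.linalg.qr(rng.standard_normal((n_atoms, d)))[0]  # orthonormal columns
    for _ in range(n_iter):
        A = hard_threshold(W @ X, lam)                # sparse analysis coefficients
        U, _, Vt = np.linalg.svd(A @ X.T, full_matrices=False)
        W = U @ Vt                                    # Procrustes step preserves tightness
    return W
```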

  2. Proposed framework for thermomechanical life modeling of metal matrix composites

    NASA Technical Reports Server (NTRS)

    Halford, Gary R.; Lerch, Bradley A.; Saltsman, James F.

    1993-01-01

    The framework of a mechanics of materials model is proposed for thermomechanical fatigue (TMF) life prediction of unidirectional, continuous-fiber metal matrix composites (MMC's). Axially loaded MMC test samples are analyzed as structural components whose fatigue lives are governed by local stress-strain conditions resulting from combined interactions of the matrix, interfacial layer, and fiber constituents. The metallic matrix is identified as the vehicle for tracking fatigue crack initiation and propagation. The proposed framework has three major elements. First, TMF flow and failure characteristics of in situ matrix material are approximated from tests of unreinforced matrix material, and matrix TMF life prediction equations are numerically calibrated. The macrocrack initiation fatigue life of the matrix material is divided into microcrack initiation and microcrack propagation phases. Second, the influencing factors created by the presence of fibers and interfaces are analyzed, characterized, and documented in equation form. Some of the influences act on the microcrack initiation portion of the matrix fatigue life, others on the microcrack propagation life, while some affect both. Influencing factors include coefficient of thermal expansion mismatch strains, residual (mean) stresses, multiaxial stress states, off-axis fibers, internal stress concentrations, multiple initiation sites, nonuniform fiber spacing, fiber debonding, interfacial layers and cracking, fractured fibers, fiber deflections of crack fronts, fiber bridging of matrix cracks, and internal oxidation along internal interfaces. Equations exist for some, but not all, of the currently identified influencing factors. The third element is the inclusion of overriding influences such as maximum tensile strain limits of brittle fibers that could cause local fractures and ensuing catastrophic failure of surrounding matrix material. 
Some experimental data exist for assessing the plausibility of the proposed framework.

  3. From trees to forest: relational complexity network and workload of air traffic controllers.

    PubMed

    Zhang, Jingyu; Yang, Jiazhong; Wu, Changxu

    2015-01-01

    In this paper, we propose a relational complexity (RC) network framework based on the RC metric and network theory to model controllers' workload in conflict detection and resolution. We suggest that, at the sector level, air traffic showing a centralised network pattern can provide cognitive benefits in visual search and resolution decisions, which in turn result in lower workload. We found that the network centralisation index accounts for additional variance in predicting perceived workload and task completion time, beyond other aircraft-level and pair-level factors, in both a static conflict detection task (Study 1) and a dynamic one (Study 2). This finding suggests that a linear combination of aircraft-level or dyad-level information may not be adequate and that a global-pattern-based index is necessary. Theoretical and practical implications of using this framework to improve future workload modelling and management are discussed. We propose an RC network framework to model the workload of air traffic controllers. The effect of network centralisation was examined in both a static conflict detection task and a dynamic one. Network centralisation was predictive of perceived workload and task completion time over and above other control variables.
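    A global-pattern-based index of the kind the abstract describes can be computed as Freeman degree centralization of the conflict network; the formula below is the standard network-theory definition, offered as a plausible reading of the centralisation index rather than the paper's exact metric:

```python
def degree_centralization(edges, n_nodes):
    """Freeman degree centralization: 1.0 for a star (one aircraft in
    conflict with every other), 0.0 when all nodes have equal degree."""
    degree = [0] * n_nodes
    for u, v in edges:
        degree[u] += 1
        degree[v] += 1
    d_max = max(degree)
    denom = (n_nodes - 1) * (n_nodes - 2)   # maximum attainable sum, for a star graph
    if denom == 0:
        return 0.0
    return sum(d_max - d for d in degree) / denom
```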

  4. Building health behavior models to guide the development of just-in-time adaptive interventions: A pragmatic framework

    PubMed Central

    Nahum-Shani, Inbal; Hekler, Eric B.; Spruijt-Metz, Donna

    2016-01-01

    Advances in wireless devices and mobile technology offer many opportunities for delivering just-in-time adaptive interventions (JITAIs)--suites of interventions that adapt over time to an individual’s changing status and circumstances with the goal of addressing the individual’s need for support whenever it arises. A major challenge confronting behavioral scientists aiming to develop a JITAI concerns the selection and integration of existing empirical, theoretical and practical evidence into a scientific model that can inform the construction of a JITAI and help identify scientific gaps. The purpose of this paper is to establish a pragmatic framework that can be used to organize existing evidence into a useful model for JITAI construction. This framework involves clarifying the conceptual purpose of a JITAI, namely the provision of just-in-time support via adaptation, as well as describing the components of a JITAI and articulating a list of concrete questions to guide the establishment of a useful model for JITAI construction. The proposed framework includes an organizing scheme for translating the relatively static scientific models underlying many health behavior interventions into a more dynamic model that better incorporates the element of time. This framework will help to guide the next generation of empirical work to support the creation of effective JITAIs. PMID:26651462

  5. A framework to enhance security of physically unclonable functions using chaotic circuits

    NASA Astrophysics Data System (ADS)

    Chen, Lanxiang

    2018-05-01

    As a new technique for authentication and key generation, the physically unclonable function (PUF) has attracted considerable attention, and extensive research results have already been achieved. To resist popular machine-learning modeling attacks, a framework to enhance the security of PUFs is proposed. The basic idea is to combine PUFs with a chaotic system whose response is highly sensitive to initial conditions. For this framework, a specific construction which combines the common arbiter PUF circuit, a converter, and Chua's circuit is given to implement a more secure PUF. Simulation experiments are presented to further validate the framework. Finally, some practical suggestions for the framework and the specific construction are also discussed.
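    A minimal sketch of the idea combines the standard additive-delay model of an arbiter PUF with a chaotic post-processing stage; a logistic map stands in for Chua's circuit purely for brevity, and all constants are illustrative:

```python
import numpy as np

def arbiter_puf_response(challenge, delays):
    """Additive linear delay model of an arbiter PUF: the sign of the
    accumulated delay difference yields the 1-bit response."""
    # parity (feature) transform of the challenge bits
    phi = np.cumprod((1 - 2 * np.asarray(challenge))[::-1])[::-1]
    return int(phi @ delays[:-1] + delays[-1] > 0)

def chaotic_whiten(bit, x0=0.4123, r=3.99, n_iter=64):
    """Post-process the PUF bit through a chaotic map (a logistic map
    stands in for Chua's circuit here): small seed changes diverge
    rapidly, which is what frustrates modeling of the responses."""
    x = (x0 + 0.1 * bit) % 1.0
    for _ in range(n_iter):
        x = r * x * (1 - x)
    return int(x > 0.5)
```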

  6. Vocational Choice: A Decision Making Perspective

    ERIC Educational Resources Information Center

    Sauermann, Henry

    2005-01-01

    We propose a model of vocational choice that can be used for analyzing and guiding the decision processes underlying career and job choices. Our model is based on research in behavioral decision making (BDM), in particular the choice goals framework developed by Bettman, Luce, and Payne (1998). The basic model involves two major processes. First,…

  7. Reconstructing Constructivism: Causal Models, Bayesian Learning Mechanisms, and the Theory Theory

    ERIC Educational Resources Information Center

    Gopnik, Alison; Wellman, Henry M.

    2012-01-01

    We propose a new version of the "theory theory" grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework…

  8. Systematizing Web Search through a Meta-Cognitive, Systems-Based, Information Structuring Model (McSIS)

    ERIC Educational Resources Information Center

    Abuhamdieh, Ayman H.; Harder, Joseph T.

    2015-01-01

    This paper proposes a meta-cognitive, systems-based, information structuring model (McSIS) to systematize online information search behavior based on literature review of information-seeking models. The General Systems Theory's (GST) prepositions serve as its framework. Factors influencing information-seekers, such as the individual learning…

  9. Teaching Quality Management Model for the Training of Innovation Ability and the Multilevel Decomposition Indicators

    ERIC Educational Resources Information Center

    Lu, Xingjiang; Yao, Chen; Zheng, Jianmin

    2013-01-01

    This paper focuses on the training of undergraduate students' innovation ability. On top of the theoretical framework of the Quality Function Deployment (QFD), we propose a teaching quality management model. Based on this model, we establish a multilevel decomposition indicator system, which integrates innovation ability characterized by four…

  10. Achievement Goals and Discrete Achievement Emotions: A Theoretical Model and Prospective Test

    ERIC Educational Resources Information Center

    Pekrun, Reinhard; Elliot, Andrew J.; Maier, Markus A.

    2006-01-01

    A theoretical model linking achievement goals to discrete achievement emotions is proposed. The model posits relations between the goals of the trichotomous achievement goal framework and 8 commonly experienced achievement emotions organized in a 2 (activity/outcome focus) x 2 (positive/negative valence) taxonomy. Two prospective studies tested…

  11. Exploring the Argumentation Pattern in Modeling-Based Learning about Apparent Motion of Mars

    ERIC Educational Resources Information Center

    Park, Su-Kyeong

    2016-01-01

    This study proposed an analytic framework for coding students' dialogic argumentation and investigated the characteristics of the small-group argumentation pattern observed in modeling-based learning. The participants were 122 second grade high school students in South Korea divided into an experimental and a comparison group. Modeling-based…

  12. Getting a Picture that Is Both Accurate and Stable: Situation Models and Epistemic Validation

    ERIC Educational Resources Information Center

    Schroeder, Sascha; Richter, Tobias; Hoever, Inga

    2008-01-01

    Text comprehension entails the construction of a situation model that prepares individuals for situated action. In order to meet this function, situation model representations are required to be both accurate and stable. We propose a framework according to which comprehenders rely on epistemic validation to prevent inaccurate information from…

  13. The Superskills Model: A Supervisory Microskill Competency Training Model

    ERIC Educational Resources Information Center

    Destler, Dusty

    2017-01-01

    Streamlined supervision frameworks are needed to enhance and progress the practice and training of supervisors. This author proposes the SuperSkills Model (SSM), grounded in the practice of microskills and supervision common factors, with a focus on the development and foundational learning of supervisors-in-training. The SSM worksheet prompts for…

  14. A model of icebergs and sea ice in a joint continuum framework

    NASA Astrophysics Data System (ADS)

    Vaňková, Irena; Holland, David M.

    2017-04-01

    The ice mélange, a mixture of sea ice and icebergs, often present in front of tidewater glaciers in Greenland or ice shelves in Antarctica, can have a profound effect on the dynamics of the ice-ocean system. The current inability to numerically model the ice mélange motivates the new modeling approach proposed here. A continuum sea-ice model is taken as a starting point, and icebergs are represented as thick and compact pieces of sea ice held together by large tensile and shear strength selectively introduced into the sea-ice rheology. In order to modify the rheology correctly, a semi-Lagrangian time-stepping scheme is introduced, and at each time step a Lagrangian grid is constructed such that iceberg shape is preserved exactly. With the proposed treatment, sea ice and icebergs are considered a single fluid with spatially varying rheological properties; mutual interactions are thus automatically included without the need for further parametrization. An important advantage of the presented framework for an ice mélange model is its potential to be easily included in existing climate models.
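    The selectively strengthened rheology can be pictured as a spatially varying strength field on the model grid, with iceberg-covered cells given much larger strength than the surrounding sea ice; the sketch below and its magnitudes are illustrative, not the paper's implementation:

```python
import numpy as np

def strength_field(nx, ny, bergs, p_ice=2.75e4, p_berg=2.75e7):
    """Spatially varying strength on the model grid: background sea-ice
    strength everywhere, with a much larger tensile/shear strength on
    iceberg-covered cells so each berg deforms as one rigid piece
    (values here are illustrative, not taken from the paper)."""
    P = np.full((ny, nx), p_ice)
    for i0, i1, j0, j1 in bergs:     # rectangular berg footprints, cell indices
        P[j0:j1, i0:i1] = p_berg
    return P
```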

  15. A Model of Icebergs and Sea Ice in a Joint Continuum Framework

    NASA Astrophysics Data System (ADS)

    Vaňková, Irena; Holland, David M.

    2017-11-01

    The ice mélange, a mixture of sea ice and icebergs, often present in front of outlet glaciers in Greenland or ice shelves in Antarctica, can have a profound effect on the dynamics of the ice-ocean system. The current inability to numerically model the ice mélange motivates a new modeling approach proposed here. A continuum sea-ice model is taken as a starting point and icebergs are represented as thick and compact pieces of sea ice held together by large tensile and shear strength, selectively introduced into the sea-ice rheology. In order to modify the rheology correctly, an iceberg tracking procedure is implemented within a semi-Lagrangian time-stepping scheme, designed to exactly preserve iceberg shape through time. With the proposed treatment, sea ice and icebergs are considered a single fluid with spatially varying rheological properties. Mutual interactions are thus automatically included without the need for further parametrization. An important advantage of the presented framework for an ice mélange model is its potential to be easily included within sea-ice components of existing climate models.

  16. Research on regularized mean-variance portfolio selection strategy with modified Roy safety-first principle.

    PubMed

    Atta Mills, Ebenezer Fiifi Emire; Yan, Dawen; Yu, Bo; Wei, Xinyuan

    2016-01-01

    We propose a consolidated risk measure based on variance and the safety-first principle in a mean-risk portfolio optimization framework. The safety-first approach to the financial portfolio selection strategy is modified and improved. Our proposed models are subjected to norm regularization to seek near-optimal, stable, and sparse portfolios. We compare the cumulative wealth of our preferred proposed model to a benchmark, the S&P 500 index, over the same period. Our proposed portfolio strategies have better out-of-sample performance than the selected alternative portfolio rules in the literature and control the downside risk of the portfolio returns.
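    One plausible reading of such a consolidated measure, blending portfolio variance, a Roy-style shortfall penalty, and an l1 regularizer, is sketched below; the weighting scheme and parameter names are assumptions, not the paper's exact formulation:

```python
import numpy as np

def consolidated_risk(w, mu, Sigma, target=0.0, alpha=0.5, lam=0.01):
    """Illustrative consolidated objective: variance blended with a
    Roy-style safety-first shortfall penalty plus an l1 term that
    encourages sparse, stable portfolios."""
    w = np.asarray(w, float)
    variance = w @ Sigma @ w                      # mean-variance risk
    shortfall = max(target - float(w @ mu), 0.0)  # safety-first penalty
    return alpha * variance + (1 - alpha) * shortfall + lam * np.abs(w).sum()
```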

  17. Vehicle logo recognition using multi-level fusion model

    NASA Astrophysics Data System (ADS)

    Ming, Wei; Xiao, Jianli

    2018-04-01

    Vehicle logo recognition plays an important role in manufacturer identification and vehicle recognition. This paper proposes a new vehicle logo recognition algorithm with a hierarchical framework consisting of two fusion levels. At the first level, a feature fusion model maps the original features to a higher-dimensional feature space, in which the vehicle logos become more recognizable. At the second level, a weighted voting strategy improves the accuracy and robustness of the recognition results. To evaluate the performance of the proposed algorithm, extensive experiments are performed, demonstrating that the algorithm achieves high recognition accuracy and works robustly.
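    The second-level fusion can be illustrated with a simple weighted vote over per-classifier label predictions; the weights would come from training, and this sketch is not the paper's exact strategy:

```python
def weighted_vote(predictions, weights):
    """Combine per-classifier label predictions by weighted voting and
    return the label with the largest accumulated weight."""
    scores = {}
    for label, weight in zip(predictions, weights):
        scores[label] = scores.get(label, 0.0) + weight
    return max(scores, key=scores.get)
```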

  18. A framework for different levels of integration of computational models into web-based virtual patients.

    PubMed

    Kononowicz, Andrzej A; Narracott, Andrew J; Manini, Simone; Bayley, Martin J; Lawford, Patricia V; McCormack, Keith; Zary, Nabil

    2014-01-23

    Virtual patients are increasingly common tools used in health care education to foster learning of clinical reasoning skills. One potential way to expand their functionality is to augment virtual patients' interactivity by enriching them with computational models of physiological and pathological processes. The primary goal of this paper was to propose a conceptual framework for the integration of computational models within virtual patients, with particular focus on (1) characteristics to be addressed while preparing the integration, (2) the extent of the integration, (3) strategies to achieve integration, and (4) methods for evaluating the feasibility of integration. An additional goal was to pilot the first investigation of how changing framework variables alters perceptions of integration. The framework was constructed using an iterative process informed by Soft System Methodology. The Virtual Physiological Human (VPH) initiative has been used as a source of new computational models. The technical challenges associated with development of virtual patients enhanced by computational models are discussed from the perspectives of a number of different stakeholders. Concrete design and evaluation steps are discussed in the context of an exemplar virtual patient employing the results of the VPH ARCH project, as well as improvements for future iterations. The proposed framework consists of four main elements. The first element is a list of feasibility features characterizing the integration process from three perspectives: the computational modelling researcher, the health care educationalist, and the virtual patient system developer. The second element includes three integration levels: basic, where a single set of simulation outcomes is generated for specific nodes in the activity graph; intermediate, involving pre-generation of simulation datasets over a range of input parameters; and advanced, including dynamic solution of the model.
    The third element is the description of four integration strategies, and the last element consists of evaluation profiles specifying the relevant feasibility features and acceptance thresholds for specific purposes. The group of experts who evaluated the virtual patient exemplar found higher integration more interesting, but at the same time they were more concerned with the validity of the results. The observed differences were not statistically significant. This paper outlines a framework for the integration of computational models into virtual patients. The opportunities and challenges of model exploitation are discussed from a number of user perspectives, considering different levels of model integration. The long-term aim for future research is to isolate the most crucial factors in the framework and to determine their influence on the integration outcome.
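    The three integration levels can be sketched as a single dispatch function: a fixed pre-generated outcome (basic), a nearest-neighbour lookup in a pre-generated dataset (intermediate), and an on-the-fly model solve (advanced). The function and its arguments are hypothetical, not from the paper:

```python
def run_model(level, params=None, outcome=None, dataset=None, solve=None):
    """Dispatch across the three hypothetical integration levels."""
    if level == "basic":
        # a single pre-generated outcome attached to an activity-graph node
        return outcome
    if level == "intermediate":
        # nearest entry in a dataset pre-generated over a parameter range
        nearest = min(dataset, key=lambda k: abs(k - params))
        return dataset[nearest]
    if level == "advanced":
        # dynamic solution of the computational model
        return solve(params)
    raise ValueError(f"unknown integration level: {level}")
```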

  19. A Framework for Different Levels of Integration of Computational Models Into Web-Based Virtual Patients

    PubMed Central

    Narracott, Andrew J; Manini, Simone; Bayley, Martin J; Lawford, Patricia V; McCormack, Keith; Zary, Nabil

    2014-01-01

    Background Virtual patients are increasingly common tools used in health care education to foster learning of clinical reasoning skills. One potential way to expand their functionality is to augment virtual patients’ interactivity by enriching them with computational models of physiological and pathological processes. Objective The primary goal of this paper was to propose a conceptual framework for the integration of computational models within virtual patients, with particular focus on (1) characteristics to be addressed while preparing the integration, (2) the extent of the integration, (3) strategies to achieve integration, and (4) methods for evaluating the feasibility of integration. An additional goal was to pilot the first investigation of changing framework variables on altering perceptions of integration. Methods The framework was constructed using an iterative process informed by Soft System Methodology. The Virtual Physiological Human (VPH) initiative has been used as a source of new computational models. The technical challenges associated with development of virtual patients enhanced by computational models are discussed from the perspectives of a number of different stakeholders. Concrete design and evaluation steps are discussed in the context of an exemplar virtual patient employing the results of the VPH ARCH project, as well as improvements for future iterations. Results The proposed framework consists of four main elements. The first element is a list of feasibility features characterizing the integration process from three perspectives: the computational modelling researcher, the health care educationalist, and the virtual patient system developer. 
    The second element includes three integration levels: basic, where a single set of simulation outcomes is generated for specific nodes in the activity graph; intermediate, involving pre-generation of simulation datasets over a range of input parameters; and advanced, including dynamic solution of the model. The third element is the description of four integration strategies, and the last element consists of evaluation profiles specifying the relevant feasibility features and acceptance thresholds for specific purposes. The group of experts who evaluated the virtual patient exemplar found higher integration more interesting, but at the same time they were more concerned with the validity of the results. The observed differences were not statistically significant. Conclusions This paper outlines a framework for the integration of computational models into virtual patients. The opportunities and challenges of model exploitation are discussed from a number of user perspectives, considering different levels of model integration. The long-term aim for future research is to isolate the most crucial factors in the framework and to determine their influence on the integration outcome. PMID:24463466

  20. Modelling electro-active polymers with a dispersion-type anisotropy

    NASA Astrophysics Data System (ADS)

    Hossain, Mokarram; Steinmann, Paul

    2018-02-01

    We propose a novel constitutive framework for electro-active polymers (EAPs) that can take into account anisotropy with chain dispersion. To enhance actuation behaviour, particle-filled EAPs are now promising candidates. Recent studies suggest that particle-filled EAPs, which can be cured under an electric field during manufacturing, do not necessarily form perfect anisotropic composites; rather, they create composites with dispersed chains. Hence, in this contribution, an electro-mechanically coupled constitutive model is devised that accounts for chain dispersion with a probability distribution function in an integral form. To obtain relevant quantities in discrete form, numerical integration over the unit sphere is utilized. Necessary constitutive equations are derived by exploiting the basic laws of thermodynamics, resulting in a thermodynamically consistent formulation. To demonstrate the performance of the proposed electro-mechanically coupled framework, we analytically solve a non-homogeneous boundary value problem: the extension and inflation of an axisymmetric cylindrical tube under an electro-mechanically coupled load. The results capture various electro-mechanical couplings with the formulation proposed for EAP composites.

  1. Marketing and Languages: An Integrative Model.

    ERIC Educational Resources Information Center

    McCall, Ian

    1988-01-01

    A framework is proposed for an integrated course in which knowledge of a language is consciously related to the processes of interpersonal communication and the cultural aspects of marketing and negotiation. (Editor)

  2. Latent log-linear models for handwritten digit classification.

    PubMed

    Deselaers, Thomas; Gass, Tobias; Heigold, Georg; Ney, Hermann

    2012-06-01

    We present latent log-linear models, an extension of log-linear models incorporating latent variables, and we propose two applications thereof: log-linear mixture models and image deformation-aware log-linear models. The resulting models are fully discriminative, can be trained efficiently, and the model complexity can be controlled. Log-linear mixture models offer additional flexibility within the log-linear modeling framework. Unlike previous approaches, the image deformation-aware model directly considers image deformations and allows for a discriminative training of the deformation parameters. Both are trained using alternating optimization. For certain variants, convergence to a stationary point is guaranteed and, in practice, even variants without this guarantee converge and find models that perform well. We tune the methods on the USPS data set and evaluate on the MNIST data set, demonstrating the generalization capabilities of our proposed models. Our models, although using significantly fewer parameters, are able to obtain competitive results with models proposed in the literature.
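    The class posterior of a log-linear mixture model, with components summed inside each class before normalisation, can be written down directly; parameter names below are illustrative:

```python
import numpy as np

def loglinear_mixture_posterior(x, Lambda, alpha):
    """Posterior p(c|x) of a log-linear mixture: class c has components
    with weight vectors Lambda[c][k] and biases alpha[c][k]; components
    are log-sum-exp'd within the class before normalising over classes."""
    scores = np.array([
        np.logaddexp.reduce([lk @ x + ak for lk, ak in zip(Lc, ac)])
        for Lc, ac in zip(Lambda, alpha)
    ])
    scores -= scores.max()                   # numerical stability
    p = np.exp(scores)
    return p / p.sum()
```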

  3. Towards a Framework to Improve the Quality of Teaching and Learning: Consciousness and Validation in Computer Engineering Science, UCT

    ERIC Educational Resources Information Center

    Lévano, Marcos; Albornoz, Andrea

    2016-01-01

    This paper aims to propose a framework to improve the quality in teaching and learning in order to develop good practices to train professionals in the career of computer engineering science. To demonstrate the progress and achievements, our work is based on two principles for the formation of professionals, one based on the model of learning…

  4. An Argument Framework for the Application of Null Hypothesis Statistical Testing in Support of Research

    ERIC Educational Resources Information Center

    LeMire, Steven D.

    2010-01-01

    This paper proposes an argument framework for the teaching of null hypothesis statistical testing and its application in support of research. Elements of the Toulmin (1958) model of argument are used to illustrate the use of p values and Type I and Type II error rates in support of claims about statistical parameters and subject matter research…

  5. Integrating human and natural systems in community psychology: an ecological model of stewardship behavior.

    PubMed

    Moskell, Christine; Allred, Shorna Broussard

    2013-03-01

    Community psychology (CP) research on the natural environment lacks a theoretical framework for analyzing the complex relationship between human systems and the natural world. We introduce other academic fields concerned with the interactions between humans and the natural environment, including environmental sociology and coupled human and natural systems. To demonstrate how the natural environment can be included within CP's ecological framework, we propose an ecological model of urban forest stewardship action. Although ecological models of behavior in CP have previously modeled health behaviors, we argue that these frameworks are also applicable to actions that positively influence the natural environment. We chose the environmental action of urban forest stewardship because cities across the United States are planting millions of trees and increased citizen participation in urban tree planting and stewardship will be needed to sustain the benefits provided by urban trees. We used the framework of an ecological model of behavior to illustrate multiple levels of factors that may promote or hinder involvement in urban forest stewardship actions. The implications of our model for the development of multi-level ecological interventions to foster stewardship actions are discussed, as well as directions for future research to further test and refine the model.

  6. A Micro-Level Data-Calibrated Agent-Based Model: The Synergy between Microsimulation and Agent-Based Modeling.

    PubMed

    Singh, Karandeep; Ahn, Chang-Won; Paik, Euihyun; Bae, Jang Won; Lee, Chun-Hee

    2018-01-01

    Artificial life (ALife) examines systems related to natural life, its processes, and its evolution, using simulations with computer models, robotics, and biochemistry. In this article, we focus on the computer modeling, or "soft," aspects of ALife and prepare a framework for scientists and modelers to be able to support such experiments. The framework is designed and built to be a parallel as well as distributed agent-based modeling environment, and does not require end users to have expertise in parallel or distributed computing. Furthermore, we use this framework to implement a hybrid model using microsimulation and agent-based modeling techniques to generate an artificial society. We leverage this artificial society to simulate and analyze population dynamics using Korean population census data. The agents in this model derive their decisional behaviors from real data (microsimulation feature) and interact among themselves (agent-based modeling feature) to proceed in the simulation. The behaviors, interactions, and social scenarios of the agents are varied to perform an analysis of population dynamics. We also estimate the future cost of pension policies based on the future population structure of the artificial society. The proposed framework and model demonstrate how ALife techniques can be used by researchers in relation to social issues and policies.
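    The microsimulation/agent-based hybrid can be sketched as agents whose behaviour probabilities come from empirical rate tables while they act within a shared population; the rates and class layout below are illustrative, not the Korean census calibration, and agent-to-agent interaction is elided:

```python
import random

class Agent:
    """Agent whose mortality comes from empirical age-band rates (the
    microsimulation side) while it lives in a shared population (the
    agent-based side); rates here are illustrative."""
    def __init__(self, age, rates):
        self.age = age
        self.rates = rates

    def step(self, population, rng):
        band = "young" if self.age < 40 else "old"
        if rng.random() < self.rates[band]["death"]:
            population.remove(self)   # agent leaves the artificial society
        else:
            self.age += 1

def simulate(n_years, agents, seed=0):
    rng = random.Random(seed)
    population = list(agents)
    for _ in range(n_years):
        for agent in list(population):   # iterate a copy: agents may be removed
            agent.step(population, rng)
    return population
```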

  7. IntelliHealth: A medical decision support application using a novel weighted multi-layer classifier ensemble framework.

    PubMed

    Bashir, Saba; Qamar, Usman; Khan, Farhan Hassan

    2016-02-01

    Accuracy plays a vital role in the medical field as it concerns the life of an individual. Extensive research has been conducted on disease classification and prediction using machine learning techniques. However, there is no agreement on which classifier produces the best results: a specific classifier may be better than others for one dataset, while another classifier performs better on a different dataset. An ensemble of classifiers has proved to be an effective way to improve classification accuracy. In this research we present an ensemble framework with multi-layer classification using enhanced bagging and optimized weighting. The proposed model, called "HM-BagMoov", overcomes conventional performance bottlenecks by utilizing an ensemble of seven heterogeneous classifiers. The framework is evaluated on five heart disease datasets, four breast cancer datasets, two diabetes datasets, two liver disease datasets, and one hepatitis dataset obtained from public repositories. The analysis of the results shows that the ensemble framework achieved the highest accuracy, sensitivity, and F-measure when compared with individual classifiers for all the diseases, and also the highest accuracy when compared with state-of-the-art techniques. An application named "IntelliHealth" has also been developed based on the proposed model that may be used by hospitals/doctors for diagnostic advice. Copyright © 2015 Elsevier Inc. All rights reserved.
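    The weighted-ensemble idea can be illustrated by averaging per-classifier class probabilities with weights derived from validation accuracy; the weighting details are a guess at the spirit of HM-BagMoov, not its exact optimized scheme:

```python
import numpy as np

def weighted_ensemble_predict(probas, val_accuracies):
    """Average per-classifier class-probability vectors, weighted by each
    classifier's validation accuracy, and return the winning class index
    together with the averaged distribution."""
    w = np.asarray(val_accuracies, float)
    avg = np.average(np.asarray(probas, float), axis=0, weights=w / w.sum())
    return int(np.argmax(avg)), avg
```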

  8. A unified framework of unsupervised subjective optimized bit allocation for multiple video object coding

    NASA Astrophysics Data System (ADS)

    Chen, Zhenzhong; Han, Junwei; Ngan, King Ngi

    2005-10-01

    MPEG-4 treats a scene as a composition of several objects, or so-called video object planes (VOPs), that are separately encoded and decoded. Such a flexible video coding framework makes it possible to code different video objects at different distortion scales. It is necessary to analyze the priority of the video objects according to their semantic importance, intrinsic properties, and psycho-visual characteristics so that the bit budget can be distributed properly among video objects to improve the perceptual quality of the compressed video. This paper provides an automatic video object priority definition method based on an object-level visual attention model and further proposes an optimization framework for video object bit allocation. One significant contribution of this work is that human visual system characteristics are incorporated into the video coding optimization process. Another advantage is that the priority of a video object can be obtained automatically instead of fixing weighting factors before encoding or relying on user interactivity. To evaluate the performance of the proposed approach, we compare it with traditional verification model bit allocation and optimal multiple video object bit allocation algorithms. Compared with traditional bit allocation algorithms, the objective quality of objects with higher priority is significantly improved under this framework. These results demonstrate the usefulness of this unsupervised subjective quality lifting framework.
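    The simplest form of priority-driven bit allocation distributes the frame budget in proportion to the attention-derived priorities; the paper optimizes beyond this proportional rule, which is shown only as a baseline sketch:

```python
def allocate_bits(total_bits, priorities):
    """Distribute the bit budget across video objects in proportion to
    their attention-derived priorities (simple proportional baseline)."""
    s = sum(priorities)
    return [total_bits * p / s for p in priorities]
```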

  9. Multi-level multi-task learning for modeling cross-scale interactions in nested geospatial data

    USGS Publications Warehouse

    Yuan, Shuai; Zhou, Jiayu; Tan, Pang-Ning; Fergus, Emi; Wagner, Tyler; Sorrano, Patricia

    2017-01-01

    Predictive modeling of nested geospatial data is a challenging problem as the models must take into account potential interactions among variables defined at different spatial scales. These cross-scale interactions, as they are commonly known, are particularly important to understand relationships among ecological properties at macroscales. In this paper, we present a novel, multi-level multi-task learning framework for modeling nested geospatial data in the lake ecology domain. Specifically, we consider region-specific models to predict lake water quality from multi-scaled factors. Our framework enables distinct models to be developed for each region using both its local and regional information. The framework also allows information to be shared among the region-specific models through their common set of latent factors. Such information sharing helps to create more robust models especially for regions with limited or no training data. In addition, the framework can automatically determine cross-scale interactions between the regional variables and the local variables that are nested within them. Our experimental results show that the proposed framework outperforms all the baseline methods in at least 64% of the regions for 3 out of 4 lake water quality datasets evaluated in this study. Furthermore, the latent factors can be clustered to obtain a new set of regions that is more aligned with the response variables than the original regions that were defined a priori from the ecology domain.
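The shared-latent-factor idea behind the region-specific models can be sketched with a toy linear model: each region's weight vector is a mixture of a small set of factor vectors shared across regions, so data-poor regions borrow strength from the rest. This sketch is purely illustrative; the dimensions, the factorization, and all variable names are assumptions, not the authors' formulation:

```python
import numpy as np

rng = np.random.default_rng(0)
n_regions, n_features, k = 5, 8, 2   # k shared latent factors

# Shared factors capture structure common to all regions; the
# region-specific loadings mix them into one model per region.
latent = rng.normal(size=(k, n_features))    # shared across regions
mixing = rng.normal(size=(n_regions, k))     # region-specific loadings
region_weights = mixing @ latent             # (n_regions, n_features)

x = rng.normal(size=n_features)              # local predictors for one lake
predictions = region_weights @ x             # one prediction per region model
print(predictions.shape)                     # (5,)
```

In the actual framework, `mixing` and `latent` would be learned jointly from the nested data, with cross-scale interaction terms between regional and local variables.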

  10. A Hierarchical Framework for Demand-Side Frequency Control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moya, Christian; Zhang, Wei; Lian, Jianming

    2014-06-02

    With large-scale plans to integrate renewable generation, more resources will be needed to compensate for the uncertainty associated with intermittent generation resources. Under such conditions, performing frequency control using only supply-side resources becomes not only prohibitively expensive but also technically difficult. It is therefore important to explore how a sufficient proportion of the loads could assume a routine role in frequency control to maintain the stability of the system at an acceptable cost. In this paper, a novel hierarchical decentralized framework for frequency-based load control is proposed. The framework involves two decision layers. The top decision layer determines the optimal droop gain required from the aggregated load response on each bus using a robust decentralized control approach. The second layer consists of a large number of devices, which switch probabilistically during contingencies so that the aggregated power change matches the desired droop amount according to the updated gains. The proposed framework is based on the classical nonlinear multi-machine power system model, and can deal with time-varying system operating conditions while respecting the physical constraints of individual devices. Realistic simulation results based on a 68-bus system are provided to demonstrate the effectiveness of the proposed strategy.
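The second-layer mechanism, where each device switches independently with a probability chosen so that the expected aggregate change matches the requested droop response, can be sketched as follows. This is a highly simplified illustration; the function names, parameter values, and units are hypothetical, not the paper's control law:

```python
import random

def switch_probability(freq_dev, droop_gain, device_power, n_devices):
    """Per-device shedding probability so the expected aggregate load
    change equals droop_gain * freq_dev (clipped to [0, 1])."""
    target_mw = droop_gain * freq_dev
    return max(0.0, min(1.0, target_mw / (device_power * n_devices)))

def shed_devices(p, n_devices, rng):
    """Each device sheds independently with probability p; the expected
    number of shed devices is p * n_devices."""
    return sum(rng.random() < p for _ in range(n_devices))

rng = random.Random(42)
p = switch_probability(freq_dev=0.2, droop_gain=50.0,
                       device_power=0.005, n_devices=10000)
print(round(p, 6))  # 0.2
shed = shed_devices(p, 10000, rng)
```

Because devices decide independently, no central coordinator needs to address them individually; only the updated gain has to be broadcast.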

  11. Prognostic residual mean flow in an ocean general circulation model and its relation to prognostic Eulerian mean flow

    DOE PAGES

    Saenz, Juan A.; Chen, Qingshan; Ringler, Todd

    2015-05-19

    Recent work has shown that taking the thickness-weighted average (TWA) of the Boussinesq equations in buoyancy coordinates results in exact equations governing the prognostic residual mean flow, where eddy–mean flow interactions appear in the horizontal momentum equations as the divergence of the Eliassen–Palm flux tensor (EPFT). It has been proposed that, given the mathematical tractability of the TWA equations, the physical interpretation of the EPFT, and its relation to potential vorticity fluxes, the TWA is an appropriate framework for modeling ocean circulation with parameterized eddies. The authors test the feasibility of this proposition and investigate the connections between the TWA framework and the conventional framework used in models, where Eulerian mean flow prognostic variables are solved for. Using the TWA framework as a starting point, this study explores the well-known connections between vertical transfer of horizontal momentum by eddy form drag and eddy overturning by the bolus velocity, used by Greatbatch and Lamb and Gent and McWilliams to parameterize eddies. After implementing the TWA framework in an ocean general circulation model, we verify our analysis by comparing the flows in an idealized Southern Ocean configuration simulated using the TWA and conventional frameworks with the same mesoscale eddy parameterization.

  12. 3D Reconstruction of Space Objects from Multi-Views by a Visible Sensor

    PubMed Central

    Zhang, Haopeng; Wei, Quanmao; Jiang, Zhiguo

    2017-01-01

    In this paper, a novel 3D reconstruction framework is proposed to recover the 3D structural model of a space object from its multi-view images captured by a visible sensor. Given an image sequence, this framework first estimates the relative camera poses and recovers the depths of the surface points by the structure from motion (SFM) method; the patch-based multi-view stereo (PMVS) algorithm is then utilized to generate a dense 3D point cloud. To resolve the wrong matches arising from the symmetric structure and repeated textures of space objects, a new strategy is introduced in which images are added to SFM in imaging order. Meanwhile, a refining process exploiting the structural prior knowledge that most sub-components of artificial space objects are composed of basic geometric shapes is proposed and applied to the recovered point cloud. The proposed reconstruction framework is tested on both simulated and real image datasets. Experimental results illustrate that the recovered point cloud models of space objects are accurate and have complete coverage of the surface. Moreover, outliers and points with severe noise are effectively filtered out by the refinement, resulting in a distinct improvement of the structure and visualization of the recovered points. PMID:28737675
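The refinement stage removes outliers from the recovered point cloud. As a crude stand-in for the paper's shape-prior-based refinement, one can illustrate statistical outlier removal by dropping points far from the cloud centroid; the function, threshold, and sample points below are all assumptions for illustration:

```python
import math

def filter_outliers(points, k_sigma=2.0):
    """Drop 3D points whose distance to the cloud centroid exceeds
    mean + k_sigma * std of all centroid distances."""
    n = len(points)
    cx = sum(p[0] for p in points) / n
    cy = sum(p[1] for p in points) / n
    cz = sum(p[2] for p in points) / n
    d = [math.dist(p, (cx, cy, cz)) for p in points]
    mean = sum(d) / n
    std = math.sqrt(sum((x - mean) ** 2 for x in d) / n)
    cutoff = mean + k_sigma * std
    return [p for p, dist in zip(points, d) if dist <= cutoff]

points = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0),
          (0.0, 0.0, 1.0), (100.0, 100.0, 100.0)]
kept = filter_outliers(points, k_sigma=1.0)
print(len(kept))  # 4: the far point is dropped
```

Production pipelines typically use local-neighborhood statistics (e.g., k-nearest-neighbor distances) rather than a single global centroid, which this sketch uses only for brevity.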

  13. Finite element based N-Port model for preliminary design of multibody systems

    NASA Astrophysics Data System (ADS)

    Sanfedino, Francesco; Alazard, Daniel; Pommier-Budinger, Valérie; Falcoz, Alexandre; Boquet, Fabrice

    2018-02-01

    This article presents and validates a general framework for building a linear dynamic Finite Element-based model of large flexible structures for integrated Control/Structure design. An extension of the Two-Input Two-Output Port (TITOP) approach is developed here. The authors had already proposed such a framework for simple beam-like structures: each beam was considered as a TITOP sub-system that could be interconnected to another beam through its ports. The present work studies bodies with multiple attachment points by allowing complex interconnections among several sub-structures in a tree-like assembly. The TITOP approach is extended to generate NINOP (N-Input N-Output Port) models. A Matlab toolbox integrating beam and bending plate elements is developed. In particular, a NINOP formulation of bending plates is proposed to solve analytic two-dimensional problems. The computation of NINOP models from the outputs of an MSC/Nastran modal analysis is also investigated in order to directly use the results provided by commercial finite element software. The main advantage of this tool is that it provides a model of a multibody system in the form of a block diagram with a minimal number of states, which is easy to use for preliminary design and control. An illustrative example highlights the potential of the proposed approach: the synthesis of the dynamical model of a spacecraft with two deployable, flexible solar arrays.

  14. A judgment and decision-making model for plant behavior.

    PubMed

    Karban, Richard; Orrock, John L

    2018-06-12

    Recently, plant biologists have documented that plants, like animals, engage in many activities that can be considered behaviors, although plant biologists currently lack a conceptual framework for understanding these processes. Borrowing the well-established framework developed by psychologists, we propose that plant behaviors can be constructively modeled by identifying four distinct components: 1) a cue or stimulus that provides information, 2) a judgment, whereby the plant perceives and processes this informative cue, 3) a decision, whereby the plant chooses among several options based on their relative costs and benefits, and 4) an action. Judgment in plants can be determined empirically by monitoring signaling associated with electrical, calcium, or hormonal fluxes. Decision-making can be evaluated empirically by monitoring gene expression or differential allocation of resources. We provide examples of the utility of this judgment and decision-making framework by considering cases in which plants either successfully or unsuccessfully induced resistance against attacking herbivores. Separating judgment from decision-making suggests new analytical paradigms (i.e., Bayesian methods for judgment and economic utility models for decision-making). Following this framework, we propose an experimental approach to plant behavior that explicitly manipulates the stimuli provided to plants, uses plants that vary in sensory abilities, and examines how environmental context affects plant responses. The concepts and approaches that follow from the judgment and decision-making framework can shape how we study and understand plant-herbivore interactions, biological invasions, plant responses to climate change, and the susceptibility of plants to evolutionary traps. This article is protected by copyright. All rights reserved.
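The proposed pairing of Bayesian judgment with utility-based decision-making can be made concrete with a toy calculation: a Bayesian update of attack probability given a cue, followed by a cost-benefit decision on whether to induce resistance. All probabilities, costs, and function names below are hypothetical illustrations of the paradigm, not values from the paper:

```python
def posterior(prior, p_cue_given_attack, p_cue_given_no_attack):
    """Bayes' rule: probability of herbivore attack after sensing a cue
    (the 'judgment' step)."""
    num = p_cue_given_attack * prior
    den = num + p_cue_given_no_attack * (1.0 - prior)
    return num / den

def induce_defense(p_attack, cost_defense, damage_if_undefended):
    """The 'decision' step: defend when expected damage avoided
    exceeds the cost of mounting the defense."""
    return p_attack * damage_if_undefended > cost_defense

# A weakly informative cue raises a 10% prior to ~33%.
p = posterior(prior=0.1, p_cue_given_attack=0.9, p_cue_given_no_attack=0.2)
print(round(p, 3))                                        # 0.333
print(induce_defense(p, cost_defense=0.2,
                     damage_if_undefended=1.0))           # True
```

Separating the two functions mirrors the paper's point: an organism can err either in judgment (a miscalibrated posterior) or in decision (a miscalibrated cost-benefit threshold), and the two failure modes call for different experiments.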

  15. Jupiter Europa Orbiter Architecture Definition Process

    NASA Technical Reports Server (NTRS)

    Rasmussen, Robert; Shishko, Robert

    2011-01-01

    The proposed Jupiter Europa Orbiter mission, planned for launch in 2020, is using a new architectural process and framework tool to drive its model-based systems engineering effort. The process focuses on getting the architecture right before writing requirements and developing a point design. A new architecture framework tool provides for the structured entry and retrieval of architecture artifacts based on an emerging architecture meta-model. This paper describes the relationships among these artifacts and how they are used in the systems engineering effort. Some early lessons learned are discussed.

  16. A Security Audit Framework to Manage Information System Security

    NASA Astrophysics Data System (ADS)

    Pereira, Teresa; Santos, Henrique

    The widespread adoption of information and communication technology has promoted an increased dependency of organizations on the performance of their Information Systems. As a result, organizations must establish adequate security procedures to properly manage information security, in order to protect their valued or critical resources from accidental or intentional attacks and ensure their normal activity. A conceptual security framework to manage and audit Information System Security is proposed and discussed. The proposed framework intends, firstly, to assist organizations in understanding precisely which assets they need to protect and what their weaknesses (vulnerabilities) are, enabling them to perform adequate security management; and secondly, to provide a security audit framework that supports the organization in assessing the efficiency of the controls and policies adopted to prevent or mitigate the attacks, threats and vulnerabilities, promoted by the advances of new technologies and new Internet-enabled services, to which organizations are subject. The presented framework is based on a conceptual model approach, which contains the semantic description of the concepts defined in the information security domain, based on the ISO/IEC_JCT1 standards.

  17. Knowledge Discovery from Vibration Measurements

    PubMed Central

    Li, Jian; Wang, Daoyao

    2014-01-01

    The framework, as well as the particular algorithms, of the pattern recognition process is widely adopted in structural health monitoring (SHM). However, as part of the overall process of knowledge discovery from databases (KDD), the results of pattern recognition are only changes, and patterns of changes, in data features. In this paper, based on the similarity between KDD and SHM and considering the particularity of SHM problems, a four-step framework for SHM is proposed which extends the final goal of SHM from detecting damage to extracting knowledge that facilitates decision making. The purposes and proper methods of each step of this framework are discussed. To demonstrate the proposed SHM framework, a specific SHM method composed of second-order structural parameter identification, statistical control chart analysis, and system reliability analysis is then presented. To examine the performance of this SHM method, real sensor data measured from a lab-scale steel bridge model structure are used. The developed four-step framework has the potential to clarify the process of SHM and to facilitate the further development of SHM techniques. PMID:24574933
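The statistical control chart step can be illustrated with a minimal Shewhart-style chart: limits are derived from a healthy-state baseline of an identified structural feature, and later samples outside the limits are flagged as potential damage. This is a generic sketch, not the paper's specific method; the feature values and the three-sigma convention are assumptions:

```python
def control_limits(baseline, k=3.0):
    """Shewhart-style limits (mean +/- k*std) from a healthy-state
    baseline of a monitored feature, e.g. an identified stiffness."""
    n = len(baseline)
    mean = sum(baseline) / n
    std = (sum((x - mean) ** 2 for x in baseline) / n) ** 0.5
    return mean - k * std, mean + k * std

def out_of_control(samples, lo, hi):
    """Indices of monitoring samples falling outside the limits."""
    return [i for i, x in enumerate(samples) if not lo <= x <= hi]

baseline = [10.0, 10.2, 9.8, 10.1, 9.9, 10.0, 10.2, 9.8]
lo, hi = control_limits(baseline)
print(out_of_control([10.1, 9.9, 12.5, 10.0], lo, hi))  # [2]
```

In the four-step framework, such flagged excursions are only an intermediate product; the subsequent reliability analysis turns them into decision-relevant knowledge.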

  18. A COMPARISON OF INTERCELL METRICS ON DISCRETE GLOBAL GRID SYSTEMS

    EPA Science Inventory

    A discrete global grid system (DGGS) is a spatial data model that aids in global research by serving as a framework for environmental modeling, monitoring and sampling across the earth at multiple spatial scales. Topological and geometric criteria have been proposed to evaluate a...

  19. Mental Models in Expert Physics Reasoning.

    ERIC Educational Resources Information Center

    Roschelle, Jeremy; Greeno, James G.

    Proposed is a relational framework for characterizing experienced physicists' representations of physics problem situations and the process of constructing these representations. A representation includes a coherent set of relations among: (1) a mental model of the objects in the situation, along with their relevant properties and relations; (2) a…

  20. Selection Criteria for Mathematical Models Used in Exposure Assessments: Atmospheric Dispersion Models

    EPA Science Inventory

    Before the U.S. Environmental Protection Agency issued the 1988 Guidelines for Estimating Exposures, it published proposed guidelines in the Federal Register for public review and comment. The guidelines are intended to give risk analysts a basic framework and the tools they need ...
