Sample records for use-case driven approach

  1. Defining datasets and creating data dictionaries for quality improvement and research in chronic disease using routinely collected data: an ontology-driven approach.

    PubMed

    de Lusignan, Simon; Liaw, Siaw-Teng; Michalakidis, Georgios; Jones, Simon

    2011-01-01

    The burden of chronic disease is increasing, and research and quality improvement will be less effective if case finding strategies are suboptimal. To describe an ontology-driven approach to case finding in chronic disease and how this approach can be used to create a data dictionary and make the codes used in case finding transparent. A five-step process: (1) identifying a reference coding system or terminology; (2) using an ontology-driven approach to identify cases; (3) developing metadata that can be used to identify the extracted data; (4) mapping the extracted data to the reference terminology; and (5) creating the data dictionary. Hypertension is presented as an exemplar. A patient with hypertension can be represented by a range of codes including diagnostic, history and administrative. Metadata can link the coding system and data extraction queries to the correct data mapping and translation tool, which then maps it to the equivalent code in the reference terminology. The code extracted, the term, its domain and subdomain, and the name of the data extraction query can then be automatically grouped and published online as a readily searchable data dictionary. An exemplar is available online at www.clininf.eu/qickd-data-dictionary.html. Adopting an ontology-driven approach to case finding could improve the quality of disease registers and of research based on routine data. It would offer considerable advantages over using limited datasets to define cases. This approach should be considered by those involved in research and quality improvement projects which utilise routine data.
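
The mapping and grouping steps (3-5) described above can be sketched in code. This is a minimal illustration, not the authors' implementation; all coding systems, codes, and terms below are invented for the example.

```python
# Minimal sketch of steps 3-5: metadata links each extracted code to a
# reference terminology (step 4), and mapped entries are grouped into a
# searchable data dictionary (step 5). All codes/terms are illustrative.

# Step 4: mapping table from source coding systems to a reference terminology
code_map = {
    ("READ", "G20.."): ("SNOMED", "38341003", "Hypertensive disorder"),
    ("ICD10", "I10"): ("SNOMED", "38341003", "Hypertensive disorder"),
}

# Step 3: metadata describing the data extracted by each query
extracted = [
    {"query": "HT_diagnosis", "system": "READ", "code": "G20..",
     "domain": "Cardiovascular", "subdomain": "Hypertension"},
    {"query": "HT_diagnosis", "system": "ICD10", "code": "I10",
     "domain": "Cardiovascular", "subdomain": "Hypertension"},
]

def build_dictionary(records, mapping):
    """Step 5: group mapped codes by (domain, subdomain)."""
    dictionary = {}
    for rec in records:
        ref_system, ref_code, term = mapping[(rec["system"], rec["code"])]
        key = (rec["domain"], rec["subdomain"])
        dictionary.setdefault(key, []).append(
            {"source_code": rec["code"], "reference_code": ref_code,
             "term": term, "query": rec["query"]})
    return dictionary

data_dict = build_dictionary(extracted, code_map)
```

The point of the structure is that the same clinical concept, reachable through several source codes, collapses onto one reference code and is published under one searchable domain/subdomain heading.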

  2. Use case driven approach to develop simulation model for PCS of APR1400 simulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dong Wook, Kim; Hong Soo, Kim; Hyeon Tae, Kang

    2006-07-01

    The full-scope simulator is being developed to evaluate specific design features and to support the iterative design and validation in the Man-Machine Interface System (MMIS) design of Advanced Power Reactor (APR) 1400. The simulator consists of a process model, a control logic model, and the MMI for the APR1400 as well as the Power Control System (PCS). In this paper, a use case driven approach is proposed to develop a simulation model for the PCS. In this approach, a system is considered from the point of view of its users. The user's view of the system is based on interactions with the system and the resultant responses. In the use case driven approach, we initially consider the system as a black box and look at its interactions with the users. From these interactions, use cases of the system are identified. Then the system is modeled using these use cases as functions. Lower levels expand the functionalities of each of these use cases. Hence, starting from the topmost level view of the system, we proceed down to the lowest level (the internal view of the system). The model of the system thus developed is use case driven. This paper introduces the functionality of the PCS simulation model, including a requirements analysis based on use cases and validation results from the development of the PCS model. The use case based PCS simulation model will first be used during full-scope simulator development for a nuclear power plant and will be supplied to the Shin-Kori 3 and 4 plants. The use case based simulation model development can be useful for the design and implementation of simulation models. (authors)

  3. Biomimetics and the case of the remarkable ragworms.

    PubMed

    Hesselberg, Thomas

    2007-08-01

    Biomimetics is a rapidly growing field both as an academic and as an applied discipline. This paper gives a short introduction to the current status of the discipline before it describes three approaches to biomimetics: the mechanism-driven, which is based on the study of a specific mechanism; the focused organism-driven, which is based on the study of one function in a model organism; and the integrative organism-driven approach, where multiple functions of a model organism provide inspiration. The first two are established approaches and include many modern studies and the famous biomimetic discoveries of Velcro and the Lotus-Effect, whereas the last approach is not yet well recognized. The advantages of the integrative organism-driven approach are discussed using the ragworms as a case study. A morphological and locomotory study of these marine polychaetes reveals their biomimetic potential, which includes using their ability to move in slippery substrates as inspiration for novel endoscopes, using their compound setae as models for passive friction structures and using their three gaits, slow crawling, fast crawling, and swimming as well as their rapid burrowing technique to provide inspiration for the design of displacement pumps and multifunctional robots.

  4. Biomimetics and the case of the remarkable ragworms

    NASA Astrophysics Data System (ADS)

    Hesselberg, Thomas

    2007-08-01

    Biomimetics is a rapidly growing field both as an academic and as an applied discipline. This paper gives a short introduction to the current status of the discipline before it describes three approaches to biomimetics: the mechanism-driven, which is based on the study of a specific mechanism; the focused organism-driven, which is based on the study of one function in a model organism; and the integrative organism-driven approach, where multiple functions of a model organism provide inspiration. The first two are established approaches and include many modern studies and the famous biomimetic discoveries of Velcro and the Lotus-Effect, whereas the last approach is not yet well recognized. The advantages of the integrative organism-driven approach are discussed using the ragworms as a case study. A morphological and locomotory study of these marine polychaetes reveals their biomimetic potential, which includes using their ability to move in slippery substrates as inspiration for novel endoscopes, using their compound setae as models for passive friction structures and using their three gaits, slow crawling, fast crawling, and swimming as well as their rapid burrowing technique to provide inspiration for the design of displacement pumps and multifunctional robots.

  5. Bending of Euler-Bernoulli nanobeams based on the strain-driven and stress-driven nonlocal integral models: a numerical approach

    NASA Astrophysics Data System (ADS)

    Oskouie, M. Faraji; Ansari, R.; Rouhi, H.

    2018-04-01

    Eringen's nonlocal elasticity theory is extensively employed for the analysis of nanostructures because it is able to capture nanoscale effects. Previous studies have revealed that using the differential form of the strain-driven version of this theory leads to paradoxical results in some cases, such as bending analysis of cantilevers, and recourse must be made to the integral version. In this article, a novel numerical approach is developed for the bending analysis of Euler-Bernoulli nanobeams in the context of strain- and stress-driven integral nonlocal models. This numerical approach solves the integral governing equation directly, bypassing the difficulties related to converting it into a differential equation. First, the governing equation is derived based on both strain-driven and stress-driven nonlocal models by means of the minimum total potential energy. Also, in each case, the governing equation is obtained in both strong and weak forms. To numerically solve the derived equations, matrix differential and integral operators are constructed based upon the finite difference technique and the trapezoidal integration rule. It is shown that the proposed numerical approach can be efficiently applied to the strain-driven nonlocal model with the aim of resolving the mentioned paradoxes. Also, it is able to solve the problem based on the strain-driven model without the inconsistencies in applying this model that have been reported in the literature.
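
The two discrete operators named above (finite-difference differentiation and trapezoidal integration) can be sketched as matrices. This is an illustrative construction only, with an invented grid and an Eringen-type exponential kernel; it is not the paper's actual discretization.

```python
import numpy as np

# Sketch: a finite-difference differentiation matrix D and a trapezoidal
# integration operator W, which together allow an integral nonlocal
# equation to be handled directly in discrete (matrix) form.

n = 101
x = np.linspace(0.0, 1.0, n)
h = x[1] - x[0]

# Second-order central-difference matrix for d/dx (one-sided at the ends)
D = np.zeros((n, n))
D[0, :3] = np.array([-3.0, 4.0, -1.0]) / (2 * h)
D[-1, -3:] = np.array([1.0, -4.0, 3.0]) / (2 * h)
for i in range(1, n - 1):
    D[i, i - 1], D[i, i + 1] = -1.0 / (2 * h), 1.0 / (2 * h)

# Trapezoidal integration operator: (W @ f)[i] approximates the integral
# of K(x_i, s) f(s) ds over [0, 1]
w = np.full(n, h)
w[0] = w[-1] = h / 2                              # trapezoidal weights
kappa = 0.1                                       # nonlocal parameter (illustrative)
K = np.exp(-np.abs(x[:, None] - x[None, :]) / kappa) / (2 * kappa)  # Eringen-type kernel
W = K * w[None, :]

f = np.ones(n)
nonlocal_f = W @ f    # nonlocal average of a constant field (~1 in the interior)
```

Because the kernel integrates to one over an unbounded domain, applying `W` to a constant field returns values close to one away from the boundaries, which is a quick sanity check on the operator.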

  6. An exchange format for use-cases of hospital information systems.

    PubMed

    Masuda, G; Sakamoto, N; Sakai, R; Yamamoto, R

    2001-01-01

    Object-oriented software development is a powerful methodology for the development of large hospital information systems. We think the use-case driven approach is particularly useful for such development. In the use-case driven approach, use-cases are documented at the first stage of the software development process and are then used throughout all subsequent steps in a variety of ways. Therefore, it is important to exchange and share use-cases and make effective use of them over the overall lifecycle of a development process. In this paper, we propose a method of sharing and exchanging use-case models between applications, developers, and projects. We design an XML-based exchange format for use-cases. We then discuss an application of the exchange format to support several software development activities. We preliminarily implemented a support system for object-oriented analysis based on the exchange format. The results show that using the structural and semantic information in the exchange format enables the support system to assist object-oriented analysis successfully.
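
An XML exchange format for use-cases, of the kind described above, can be sketched with the standard library. The element names and the example use-case below are invented; the paper's actual schema is not reproduced here.

```python
import xml.etree.ElementTree as ET

# Hypothetical XML exchange format for a use-case; element and attribute
# names are invented for illustration, not taken from the paper.
usecase_xml = """
<useCase id="UC-01" name="Register patient">
  <actor>Receptionist</actor>
  <precondition>Patient is not yet registered</precondition>
  <flow>
    <step n="1">Receptionist enters patient demographics</step>
    <step n="2">System validates and stores the record</step>
  </flow>
</useCase>
"""

def parse_usecase(xml_text):
    """Read one use-case document back into a plain dictionary."""
    root = ET.fromstring(xml_text)
    return {
        "id": root.get("id"),
        "name": root.get("name"),
        "actor": root.findtext("actor"),
        "steps": [s.text for s in root.iter("step")],
    }

uc = parse_usecase(usecase_xml)
```

Marking up the actor, preconditions, and flow steps explicitly is what gives downstream tools (such as the object-oriented analysis support system mentioned in the abstract) structural and semantic information to work with.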

  7. Lightweight approach to model traceability in a CASE tool

    NASA Astrophysics Data System (ADS)

    Vileiniskis, Tomas; Skersys, Tomas; Pavalkis, Saulius; Butleris, Rimantas; Butkiene, Rita

    2017-07-01

    A term "model-driven" is not at all a new buzzword within the ranks of system development community. Nevertheless, the ever increasing complexity of model-driven approaches keeps fueling all kinds of discussions around this paradigm and pushes researchers forward to research and develop new and more effective ways to system development. With the increasing complexity, model traceability, and model management as a whole, becomes indispensable activities of model-driven system development process. The main goal of this paper is to present a conceptual design and implementation of a practical lightweight approach to model traceability in a CASE tool.

  8. Flood probability quantification for road infrastructure: Data-driven spatial-statistical approach and case study applications.

    PubMed

    Kalantari, Zahra; Cavalli, Marco; Cantone, Carolina; Crema, Stefano; Destouni, Georgia

    2017-03-01

    Climate-driven increase in the frequency of extreme hydrological events is expected to impose greater strain on the built environment and major transport infrastructure, such as roads and railways. This study develops a data-driven spatial-statistical approach to quantifying and mapping the probability of flooding at critical road-stream intersection locations, where water flow and sediment transport may accumulate and cause serious road damage. The approach is based on novel integration of key watershed and road characteristics, including measures of sediment connectivity. The approach is concretely applied to and quantified for two case study examples in southwest Sweden with documented road flooding during recorded extreme rainfall. The novel contributions of this study in combining a sediment connectivity account with that of soil type, land use, spatial precipitation-runoff variability and road drainage in catchments, and in extending the connectivity measure use for different types of catchments, improve the accuracy of model results for road flood probability. Copyright © 2016 Elsevier B.V. All rights reserved.
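
The data-driven core of such an approach, stripped to its simplest form, is estimating flood probability empirically from attributes of each road-stream intersection. The sketch below groups intersections by invented attributes (a sediment-connectivity class and culvert presence) and counts recorded flood events; it is a toy illustration, not the study's statistical model.

```python
from collections import defaultdict

# Toy data: each record is (intersection attributes, did it flood?).
# Attribute names and values are invented for illustration.
observations = [
    ({"connectivity": "high", "culvert": False}, True),
    ({"connectivity": "high", "culvert": False}, True),
    ({"connectivity": "high", "culvert": False}, False),
    ({"connectivity": "low", "culvert": True}, False),
    ({"connectivity": "low", "culvert": True}, False),
]

def flood_probability(observations):
    """Empirical flood probability per attribute class."""
    counts = defaultdict(lambda: [0, 0])  # class -> [floods, total]
    for attrs, flooded in observations:
        key = (attrs["connectivity"], attrs["culvert"])
        counts[key][0] += int(flooded)
        counts[key][1] += 1
    return {k: floods / total for k, (floods, total) in counts.items()}

probs = flood_probability(observations)
```

A real application would replace the categorical grouping with the study's spatial-statistical integration of watershed and road characteristics, but the mapping from attributes to an estimated probability is the same in spirit.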

  9. Inter-subject phase synchronization for exploratory analysis of task-fMRI.

    PubMed

    Bolt, Taylor; Nomi, Jason S; Vij, Shruti G; Chang, Catie; Uddin, Lucina Q

    2018-08-01

    Analysis of task-based fMRI data is conventionally carried out using a hypothesis-driven approach, where blood-oxygen-level dependent (BOLD) time courses are correlated with a hypothesized temporal structure. In some experimental designs, this temporal structure can be difficult to define. In other cases, experimenters may wish to take a more exploratory, data-driven approach to detecting task-driven BOLD activity. In this study, we demonstrate the efficiency and power of an inter-subject synchronization approach for exploratory analysis of task-based fMRI data. Combining the tools of instantaneous phase synchronization and independent component analysis, we characterize whole-brain task-driven responses in terms of group-wise similarity in temporal signal dynamics of brain networks. We applied this framework to fMRI data collected during performance of a simple motor task and a social cognitive task. Analyses using an inter-subject phase synchronization approach revealed a large number of brain networks that dynamically synchronized to various features of the task, often not predicted by the hypothesized temporal structure of the task. We suggest that this methodological framework, along with readily available tools in the fMRI community, provides a powerful exploratory, data-driven approach for analysis of task-driven BOLD activity. Copyright © 2018 Elsevier Inc. All rights reserved.
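
The instantaneous-phase machinery behind inter-subject phase synchronization can be sketched briefly: extract each subject's instantaneous phase via the analytic signal, then measure group-wise phase agreement at each time point. This is a generic illustration on synthetic signals (a self-contained FFT-based Hilbert transform rather than the authors' toolchain).

```python
import numpy as np

# Sketch of inter-subject phase synchronization: instantaneous phase from
# the analytic signal, then a phase-locking value across "subjects" at
# each time point. Synthetic sinusoids stand in for network time courses.

def analytic_signal(x):
    """FFT-based Hilbert transform (assumes even-length real input)."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = h[n // 2] = 1.0
    h[1:n // 2] = 2.0
    return np.fft.ifft(X * h)

def intersubject_plv(signals):
    """Phase-locking value across subjects per time point (1 = perfect sync)."""
    phases = np.angle([analytic_signal(s) for s in signals])
    return np.abs(np.mean(np.exp(1j * phases), axis=0))

t = np.arange(256) / 256.0
# Five "subjects" with nearly identical phase, as if driven by the same task
subjects = [np.sin(2 * np.pi * 8 * t + 0.1 * k) for k in range(5)]
plv = intersubject_plv(subjects)
```

A PLV near 1 at a given time point indicates that subjects' signals are phase-aligned there, which is the signature of task-driven synchronization the abstract exploits.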

  10. Towards Customer-Driven Management in Hospitality Education: A Case Study of the Higher Hotel Institute, Cyprus.

    ERIC Educational Resources Information Center

    Varnavas, Andreas P.; Soteriou, Andreas C.

    2002-01-01

    Presents and discusses the approach used by the Higher Hotel Institute in Cyprus to incorporate total quality management through establishment of a customer-driven management culture in its hospitality education program. Discusses how it collects and uses service-quality related data from future employers, staff, and students in pursuing this…

  11. Combining Model-Based and Feature-Driven Diagnosis Approaches - A Case Study on Electromechanical Actuators

    NASA Technical Reports Server (NTRS)

    Narasimhan, Sriram; Roychoudhury, Indranil; Balaban, Edward; Saxena, Abhinav

    2010-01-01

    Model-based diagnosis typically uses analytical redundancy to compare predictions from a model against observations from the system being diagnosed. However, this approach does not work very well when it is not feasible to create analytic relations describing all the observed data, e.g., for vibration data, which is usually sampled at very high rates and requires very detailed finite element models to describe its behavior. In such cases, features (in time and frequency domains) that contain diagnostic information are extracted from the data. Since this is a computationally intensive process, it is not efficient to extract all the features all the time. In this paper we present an approach that combines the analytic model-based and feature-driven diagnosis approaches. The analytic approach is used to reduce the set of possible faults, and then features are chosen to best distinguish among the remaining faults. We describe an implementation of this approach on the Flyable Electro-mechanical Actuator (FLEA) test bed.

  12. "Growing" a Campus Native Species Garden: Sustaining Volunteer-Driven Sustainability

    ERIC Educational Resources Information Center

    McKinne, Kristan L.; Halfacre, Angela C.

    2008-01-01

    Purpose: This paper aims to examine the challenges of volunteer-driven college campus sustainability projects through a case study of the development of an urban native plant species garden on the College of Charleston campus in Charleston, South Carolina, USA. Design/methodology/approach: The research used participant observation as the primary…

  13. A General and Efficient Method for Incorporating Precise Spike Times in Globally Time-Driven Simulations

    PubMed Central

    Hanuschkin, Alexander; Kunkel, Susanne; Helias, Moritz; Morrison, Abigail; Diesmann, Markus

    2010-01-01

    Traditionally, event-driven simulations have been limited to the very restricted class of neuronal models for which the timing of future spikes can be expressed in closed form. Recently, the class of models that is amenable to event-driven simulation has been extended by the development of techniques to accurately calculate firing times for some integrate-and-fire neuron models that do not enable the prediction of future spikes in closed form. The motivation of this development is the general perception that time-driven simulations are imprecise. Here, we demonstrate that a globally time-driven scheme can calculate firing times that cannot be discriminated from those calculated by an event-driven implementation of the same model; moreover, the time-driven scheme incurs lower computational costs. The key insight is that time-driven methods are based on identifying a threshold crossing in the recent past, which can be implemented by a much simpler algorithm than the techniques for predicting future threshold crossings that are necessary for event-driven approaches. As run time is dominated by the cost of the operations performed at each incoming spike, which includes spike prediction in the case of event-driven simulation and retrospective detection in the case of time-driven simulation, the simple time-driven algorithm outperforms the event-driven approaches. Additionally, our method is generally applicable to all commonly used integrate-and-fire neuronal models; we show that a non-linear model employing a standard adaptive solver can reproduce a reference spike train with a high degree of precision. PMID:21031031
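
The key insight above, detecting a threshold crossing in the recent past rather than predicting it in the future, can be shown with a minimal time-driven integrate-and-fire loop. This is a generic illustration with invented parameters, not the paper's simulation scheme (which treats a broader model class).

```python
import math

# Time-driven leaky integrate-and-fire neuron: propagate the membrane
# exactly over each step, then check retrospectively whether the
# threshold was crossed inside the step just completed, locating the
# firing time by linear interpolation. Parameters are illustrative.

def simulate_lif(I=2.0, tau=10.0, v_th=1.0, dt=0.1, t_end=100.0):
    """Simulate dv/dt = (-v + I)/tau with reset to 0; return spike times."""
    decay = math.exp(-dt / tau)
    v, t, spikes = 0.0, 0.0, []
    while t < t_end:
        v_prev = v
        v = v * decay + I * (1.0 - decay)   # exact solution over one step
        t += dt
        if v >= v_th:                       # crossing happened within this step
            # retrospective detection: interpolate between v_prev and v
            t_spike = t - dt + dt * (v_th - v_prev) / (v - v_prev)
            spikes.append(t_spike)
            v = 0.0                         # reset at the step boundary
    return spikes

spikes = simulate_lif()
```

For this model the first firing time is known in closed form, tau * ln(I / (I - v_th)) ≈ 6.93 here, so the accuracy of the retrospective detection is easy to check against the analytic value.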

  14. A standard-driven approach for electronic submission to pharmaceutical regulatory authorities.

    PubMed

    Lin, Ching-Heng; Chou, Hsin-I; Yang, Ueng-Cheng

    2018-03-01

    Using standards is not only useful for data interchange during the process of a clinical trial, but also useful for analyzing data in a review process. Any step that speeds up approval of new drugs may benefit patients. As a result, adopting standards for regulatory submission has become mandatory in some countries. However, preparing standard-compliant documents, such as an annotated case report form (aCRF), needs a great deal of knowledge and experience. The process is complex and labor-intensive. Therefore, there is a need to use information technology to facilitate this process. Instead of standardizing data after the completion of a clinical trial, this study proposed a standard-driven approach. This approach was achieved by implementing a computer-assisted "standard-driven pipeline (SDP)" in an existing clinical data management system. SDP used CDISC standards to drive all processes of a clinical trial, such as the design, data acquisition, tabulation, etc. A completed phase I/II trial was used to prove the concept and to evaluate the effects of this approach. By using the CDISC-compliant question library, aCRFs were generated automatically when the eCRFs were completed. For comparison purposes, the data collection process was simulated and the collected data were transformed by the SDP. This new approach reduced missing data fields from sixty-two to eight and controlled-term mismatch fields from eight to zero during data tabulation. This standard-driven approach accelerated CRF annotation and assured data tabulation integrity. The benefits of this approach include an improvement in the use of standards during the clinical trial and a reduction in missing and unexpected data during tabulation. The standard-driven approach is an advanced design idea that can be used for future clinical information system development. Copyright © 2018 Elsevier Inc. All rights reserved.

  15. Anaemia management protocols in the care of haemodialysis patients: examining patient outcomes.

    PubMed

    Saunders, Sushila; MacLeod, Martha L P; Salyers, Vince; MacMillan, Peter D; Ogborn, Malcolm R

    2013-08-01

    To determine whether the use of a nurse-driven protocol in the haemodialysis setting is as safe and effective as traditional physician-driven approaches to anaemia management. The role of haemodialysis nurses in renal anaemia management has evolved through the implementation of nurse-driven protocols, addressing the trend of exceeding haemoglobin targets and rising costs of erythropoietin-stimulating agents. Retrospective, non-equivalent case control group design. The sample was from three haemodialysis units in a control group (n = 64) and three haemodialysis units in a protocol group (n = 43). The protocol group used a nurse-driven renal anaemia management protocol, while the control group used a traditional physician-driven approach to renal anaemia management. All retrospective data were obtained from a provincial renal database. Data were analysed using chi-square tests and t-tests. Patient outcomes examined were haemoglobin levels, transferrin saturation levels, erythropoietin-stimulating agent use and intravenous iron use. Cost comparisons were determined using average use of erythropoietin-stimulating agents and intravenous iron. Both control and protocol groups reached haemoglobin target levels. In the protocol group, 75% reached transferrin saturation target levels in comparison with 25% of the control group. Iron use and costs were higher in the control group, while erythropoietin use and costs were higher in the protocol group. The higher usage of erythropoietin-stimulating agents was potentially related to comorbid conditions amongst the protocol group. A nurse-driven protocol approach to renal anaemia management was as effective as the physician-driven approach in reaching haemoglobin and transferrin saturation levels. Further examination of the use and dosing of erythropoietin-stimulating agents and intravenous iron, their impact on haemoglobin levels related to patient comorbidities and the subsequent cost effectiveness of protocols is required. Using a nurse-driven protocol in practice supports the independent nursing role while contributing to safe patient outcomes. © 2013 Blackwell Publishing Ltd.

  16. Infusing Technology Driven Design Thinking in Industrial Design Education: A Case Study

    ERIC Educational Resources Information Center

    Mubin, Omar; Novoa, Mauricio; Al Mahmud, Abdullah

    2017-01-01

    Purpose: This paper narrates a case study on design thinking-based education work in an industrial design honours program. Student projects were developed in a multi-disciplinary setting across a Computing and Engineering faculty that allowed promoting technologically and user-driven innovation strategies. Design/methodology/approach: A renewed…

  17. Shock dynamics of two-lane driven lattice gases

    NASA Astrophysics Data System (ADS)

    Schiffmann, Christoph; Appert-Rolland, Cécile; Santen, Ludger

    2010-06-01

    Driven lattice gases such as those of the ASEP model are useful tools for the modelling of various stochastic transport processes carried out by self-driven particles, such as molecular motors or vehicles in road traffic. Often these processes take place in one-dimensional systems offering several tracks to the particles, and in many cases the particles are able to change track with a given rate. In this work we consider the case of strong coupling where the rate of hopping along the tracks and the exchange rates are of the same order, and show how a phenomenological approach based on a domain wall theory can be used to describe the dynamics of the system. In particular, the domain walls on the different tracks form pairs, whose dynamics dominate the behaviour of the system.
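
A toy version of the two-lane driven lattice gas described above is easy to simulate directly. The sketch below is a two-lane TASEP on a ring with random-sequential updates and a lane-change move of comparable rate (the strong-coupling regime); rates, sizes, and the initial condition are illustrative, and it does not implement the paper's domain wall theory.

```python
import random

# Two-lane TASEP on a periodic ring: particles hop forward with rate
# p_hop and switch lanes with rate p_switch, via random-sequential
# updates. Exclusion: a move only succeeds if the target site is empty.

def step(lanes, p_hop=1.0, p_switch=1.0):
    L = len(lanes[0])
    for _ in range(2 * L):                       # one Monte Carlo sweep
        lane, site = random.randrange(2), random.randrange(L)
        if not lanes[lane][site]:
            continue
        if random.random() < p_switch / (p_hop + p_switch):
            # lane change to the adjacent site on the other lane
            if not lanes[1 - lane][site]:
                lanes[lane][site] = 0
                lanes[1 - lane][site] = 1
        else:
            # forward hop on the same lane (periodic boundary)
            nxt = (site + 1) % L
            if not lanes[lane][nxt]:
                lanes[lane][site] = 0
                lanes[lane][nxt] = 1

random.seed(0)
L = 50
# Start with a dense block on lane 0, giving a sharp density (domain wall) front
lanes = [[1 if i < L // 2 else 0 for i in range(L)], [0] * L]
for _ in range(200):
    step(lanes)
total = sum(lanes[0]) + sum(lanes[1])  # particle number is conserved on a ring
```

On a ring the dynamics conserve particle number, which is a useful invariant for checking the update rule; with open boundaries one would instead see the domain-wall-pair behaviour the abstract describes.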

  18. Emotional Connections in Higher Education Marketing

    ERIC Educational Resources Information Center

    Durkin, Mark; McKenna, Seamas; Cummins, Darryl

    2012-01-01

    Purpose: Through examination of a case study this paper aims to describe a brand re-positioning exercise and explore how an emotionally driven approach to branding can help create meaningful connections with potential undergraduate students and can positively influence choice. Design/methodology/approach: The paper's approach is a case study…

  19. The Knowledge-Based Software Assistant: Beyond CASE

    NASA Technical Reports Server (NTRS)

    Carozzoni, Joseph A.

    1993-01-01

    This paper will outline the similarities and differences between two paradigms of software development. Both support the whole software life cycle and provide automation for most of the software development process, but have different approaches. The CASE approach is based on a set of tools linked by a central data repository. This tool-based approach is data driven and views software development as a series of sequential steps, each resulting in a product. The Knowledge-Based Software Assistant (KBSA) approach, a radical departure from existing software development practices, is knowledge driven and centers around a formalized software development process. KBSA views software development as an incremental, iterative, and evolutionary process with development occurring at the specification level.

  20. Combining Domain-driven Design and Mashups for Service Development

    NASA Astrophysics Data System (ADS)

    Iglesias, Carlos A.; Fernández-Villamor, José Ignacio; Del Pozo, David; Garulli, Luca; García, Boni

    This chapter presents the Romulus project approach to Service Development using Java-based web technologies. Romulus aims at improving productivity of service development by providing a tool-supported model to conceive Java-based web applications. This model follows a Domain Driven Design approach, which states that the primary focus of software projects should be the core domain and domain logic. Romulus proposes a tool-supported model, the Roma Metaframework, that provides an abstraction layer on top of existing web frameworks and automates application generation from the domain model. This metaframework follows an object-centric approach, and complements Domain Driven Design by identifying the most common cross-cutting concerns (security, service, view, ...) of web applications. The metaframework uses annotations for enriching the domain model with these cross-cutting concerns, so-called aspects. In addition, the chapter presents the use of mashup technology in the metaframework for service composition, using the web mashup editor MyCocktail. This approach is applied to a scenario of the Mobile Phone Service Portability case study for the development of a new service.

  21. Combining Knowledge and Data Driven Insights for Identifying Risk Factors using Electronic Health Records

    PubMed Central

    Sun, Jimeng; Hu, Jianying; Luo, Dijun; Markatou, Marianthi; Wang, Fei; Edabollahi, Shahram; Steinhubl, Steven E.; Daar, Zahra; Stewart, Walter F.

    2012-01-01

    Background: The ability to identify the risk factors related to an adverse condition, e.g., a heart failure (HF) diagnosis, is very important for improving care quality and reducing cost. Existing approaches for risk factor identification are either knowledge driven (from guidelines or the literature) or data driven (from observational data). No existing method provides a model to effectively combine expert knowledge with data driven insight for risk factor identification. Methods: We present a systematic approach to enhance known knowledge-based risk factors with additional potential risk factors derived from data. The core of our approach is a sparse regression model with regularization terms that correspond to both knowledge and data driven risk factors. Results: The approach is validated using a large dataset containing 4,644 heart failure cases and 45,981 controls. The outpatient electronic health records (EHRs) for these patients include diagnoses, medications, and lab results from 2003–2010. We demonstrate that the proposed method can identify complementary risk factors that are not among the existing known factors and can better predict the onset of HF. We quantitatively compare different sets of risk factors in the context of predicting onset of HF using the performance metric, the Area Under the ROC Curve (AUC). The combined risk factors from knowledge and data significantly outperform knowledge-based risk factors alone. Furthermore, those additional risk factors are confirmed to be clinically meaningful by a cardiologist. Conclusion: We present a systematic framework for combining knowledge and data driven insights for risk factor identification. We demonstrate the power of this framework in the context of predicting onset of HF, where our approach can successfully identify intuitive and predictive risk factors beyond a set of known HF risk factors. PMID:23304365
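
The core model named above, sparse regression with regularization terms that treat knowledge-based and data-driven factors differently, can be sketched as a weighted lasso: known risk factors receive a small (here zero) L1 penalty while candidate features are penalized strongly. This is a generic illustration on synthetic data, solved by proximal gradient descent, not the authors' exact formulation.

```python
import numpy as np

# Weighted lasso via proximal gradient (ISTA): known risk factors get a
# smaller L1 penalty than data-driven candidate features. Data synthetic.

def weighted_lasso(X, y, penalties, lr=0.01, n_iter=2000):
    """Minimize ||Xw - y||^2 / (2n) + sum_j penalties[j] * |w_j|."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y) / n       # gradient of the smooth part
        w = w - lr * grad
        # feature-wise soft-thresholding (proximal step for the L1 terms)
        w = np.sign(w) * np.maximum(np.abs(w) - lr * penalties, 0.0)
    return w

rng = np.random.default_rng(0)
n, d = 200, 10
X = rng.standard_normal((n, d))
# Outcome driven by feature 0 (a "known" factor) and feature 1 (unknown)
y = 3.0 * X[:, 0] + 1.5 * X[:, 1] + 0.1 * rng.standard_normal(n)

penalties = np.full(d, 0.5)   # data-driven candidates: strong penalty
penalties[0] = 0.0            # known risk factor: no penalty
w = weighted_lasso(X, y, penalties)
```

The differential penalties encode the prior: a known factor is kept in the model essentially for free, while a data-driven feature must carry enough signal to survive the stronger shrinkage, which is how complementary risk factors emerge from the data.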

  22. Data-driven non-linear elasticity: constitutive manifold construction and problem discretization

    NASA Astrophysics Data System (ADS)

    Ibañez, Ruben; Borzacchiello, Domenico; Aguado, Jose Vicente; Abisset-Chavanne, Emmanuelle; Cueto, Elias; Ladeveze, Pierre; Chinesta, Francisco

    2017-11-01

    The use of constitutive equations calibrated from data has been implemented into standard numerical solvers for successfully addressing a variety of problems encountered in simulation-based engineering sciences (SBES). However, the complexity is constantly increasing due to the need for increasingly detailed models as well as the use of engineered materials. Data-driven simulation constitutes a potential change of paradigm in SBES. Standard simulation in computational mechanics is based on the use of two very different types of equations. The first one, of axiomatic character, is related to balance laws (momentum, mass, energy, ...), whereas the second one consists of models that scientists have extracted from collected, either natural or synthetic, data. Data-driven (or data-intensive) simulation consists of directly linking experimental data to computers in order to perform numerical simulations. These simulations will employ laws, universally recognized as epistemic, while minimizing the need for explicit, often phenomenological, models. The main drawback of such an approach is the large amount of required data, some of them inaccessible with today's testing facilities. Such difficulty can be circumvented in many cases, and in any case alleviated, by considering complex tests, collecting as many data as possible and then using a data-driven inverse approach in order to generate the whole constitutive manifold from a few complex experimental tests, as discussed in the present work.
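
The notion of a "constitutive manifold" can be made concrete in one dimension: instead of fitting a constitutive law, the material response is represented by sampled (strain, stress) pairs, and the simulation queries the nearest point on that data set. The sketch below is a deliberately minimal illustration with synthetic cubic-hardening data, not the paper's manifold-construction method.

```python
# Minimal 1D data-driven constitutive lookup: store sampled
# (strain, stress) pairs and, for a query strain, return the nearest
# point on the data "manifold". Synthetic cubic-hardening material.

# Synthetic constitutive data: stress = 2*e + 5*e^3 sampled on [-1, 1]
data = [(e / 100.0, 2.0 * (e / 100.0) + 5.0 * (e / 100.0) ** 3)
        for e in range(-100, 101)]

def nearest_state(strain, data):
    """Return the sampled (strain, stress) pair closest to the query strain."""
    return min(data, key=lambda pair: abs(pair[0] - strain))

e_q, s_q = nearest_state(0.503, data)   # snaps to the sampled point at e = 0.5
```

A full data-driven solver would minimize the distance to the data set subject to equilibrium and compatibility, but the lookup above captures the essential replacement of an explicit model by the data themselves.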

  23. Comparison between collective coordinate models for domain wall motion in PMA nanostrips in the presence of the Dzyaloshinskii-Moriya interaction

    NASA Astrophysics Data System (ADS)

    Vandermeulen, J.; Nasseri, S. A.; Van de Wiele, B.; Durin, G.; Van Waeyenberge, B.; Dupré, L.

    2018-03-01

    Lagrangian-based collective coordinate models for magnetic domain wall (DW) motion rely on an ansatz for the DW profile and a Lagrangian approach to describe the DW motion in terms of a set of time-dependent collective coordinates: the DW position, the DW magnetization angle, the DW width and the DW tilting angle. Another approach was recently used to derive similar equations of motion by averaging the Landau-Lifshitz-Gilbert equation without any ansatz, and identifying the relevant collective coordinates afterwards. In this paper, we use an updated version of the semi-analytical equations to compare the Lagrangian-based collective coordinate models with micromagnetic simulations for field- and STT-driven (spin-transfer torque-driven) DW motion in Pt/CoFe/MgO and Pt/Co/AlOx nanostrips. Through this comparison, we assess the accuracy of the different models, and provide insight into the deviations of the models from simulations. It is found that the lack of terms related to DW asymmetry in the Lagrangian-based collective coordinate models significantly contributes to the discrepancy between the predictions of the most accurate Lagrangian-based model and the micromagnetic simulations in the field-driven case. This is in contrast to the STT-driven case where the DW remains symmetric.

  4. The Evolution of System Safety at NASA

    NASA Technical Reports Server (NTRS)

    Dezfuli, Homayoon; Everett, Chris; Groen, Frank

    2014-01-01

    The NASA system safety framework is in the process of change, motivated by the desire to promote an objectives-driven approach to system safety that explicitly focuses system safety efforts on system-level safety performance, and serves to unify, in a purposeful manner, safety-related activities that otherwise might be done in a way that results in gaps, redundancies, or unnecessary work. An objectives-driven approach to system safety affords more flexibility to determine, on a system-specific basis, the means by which adequate safety is achieved and verified. Such flexibility and efficiency is becoming increasingly important in the face of evolving engineering modalities and acquisition models, where, for example, NASA will increasingly rely on commercial providers for transportation services to low-earth orbit. A key element of this objectives-driven approach is the use of the risk-informed safety case (RISC): a structured argument, supported by a body of evidence, that provides a compelling, comprehensible and valid case that a system is or will be adequately safe for a given application in a given environment. The RISC addresses each of the objectives defined for the system, providing a rational basis for making informed risk acceptance decisions at relevant decision points in the system life cycle.

  5. Using the Time-Driven Activity-Based Costing Model in the Eye Clinic at The Hospital for Sick Children: A Case Study and Lessons Learned.

    PubMed

    Gulati, Sanchita; During, David; Mainland, Jeff; Wong, Agnes M F

    2018-01-01

    One of the key challenges to healthcare organizations is the development of relevant and accurate cost information. In this paper, we used the time-driven activity-based costing (TDABC) method to calculate the costs of treating individual patients with specific medical conditions over their full cycle of care. We discussed how TDABC provides a critical, systematic and data-driven approach to estimating costs accurately and dynamically, as well as its potential to enable structural and rational cost reduction to bring about a sustainable healthcare system. © 2018 Longwoods Publishing.
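
    The arithmetic at the core of TDABC can be sketched as follows (all resource names, cost rates and times are invented for illustration and are not figures from the study):

    ```python
    # TDABC: the cost of a care episode is the sum over resources of
    # (capacity cost rate per minute) x (minutes the resource is used).
    resources = {
        # resource: (cost per available minute in $, minutes used in episode)
        "ophthalmologist": (6.00, 15),
        "technician": (1.20, 30),
        "exam_room": (0.50, 45),
    }

    def episode_cost(usage):
        """Total episode cost under the TDABC model."""
        return sum(rate * minutes for rate, minutes in usage.values())

    cost = episode_cost(resources)  # 6*15 + 1.2*30 + 0.5*45 = 148.5
    ```

    Costing the full cycle of care then amounts to summing such episode costs along the patient's pathway.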

  6. Data-driven approaches in the investigation of social perception

    PubMed Central

    Adolphs, Ralph; Nummenmaa, Lauri; Todorov, Alexander; Haxby, James V.

    2016-01-01

    The complexity of social perception poses a challenge to traditional approaches to understand its psychological and neurobiological underpinnings. Data-driven methods are particularly well suited to tackling the often high-dimensional nature of stimulus spaces and of neural representations that characterize social perception. Such methods are more exploratory, capitalize on rich and large datasets, and attempt to discover patterns often without strict hypothesis testing. We present four case studies here: behavioural studies on face judgements, two neuroimaging studies of movies, and eyetracking studies in autism. We conclude with suggestions for particular topics that seem ripe for data-driven approaches, as well as caveats and limitations. PMID:27069045

  7. Theory-Driven Intervention for Changing Personality: Expectancy Value Theory, Behavioral Activation, and Conscientiousness

    PubMed Central

    Magidson, Jessica F.; Roberts, Brent; Collado-Rodriguez, Anahi; Lejuez, C.W.

    2013-01-01

    Considerable evidence suggests that personality traits may be changeable, raising the possibility that personality traits most linked to health problems can be modified with intervention. A growing body of research suggests that problematic personality traits may be altered with behavioral intervention using a bottom-up approach; that is, by targeting core behaviors that underlie personality traits with the goal of engendering new, healthier patterns of behavior that over time become automatized and manifest in changes in personality traits. Nevertheless, a bottom-up model for changing personality traits is somewhat diffuse and requires clearer integration of theory and relevant interventions to enable real clinical application. As such, this manuscript proposes a set of guiding principles for theory-driven modification of targeted personality traits using a bottom-up approach, focusing specifically on targeting the trait of conscientiousness using a relevant behavioral intervention, Behavioral Activation (BA), considered within the motivational framework of Expectancy Value Theory (EVT). We conclude with a real case example of the application of BA to alter behaviors counter to conscientiousness in a substance dependent patient, highlighting the EVT principles most relevant to the approach and the importance and viability of a theoretically-driven, bottom-up approach to changing personality traits. PMID:23106844

  8. Illustrative Case Using the RISK21 Roadmap and Matrix: Prioritization for Evaluation of Chemicals Found in Drinking Water

    EPA Science Inventory

    The HESI-led RISK21 effort has developed a framework supporting the use of twenty first century technology in obtaining and using information for chemical risk assessment. This framework represents a problem formulation-based, exposure-driven, tiered data acquisition approach tha...

  9. Accurate position estimation methods based on electrical impedance tomography measurements

    NASA Astrophysics Data System (ADS)

    Vergara, Samuel; Sbarbaro, Daniel; Johansen, T. A.

    2017-08-01

    Electrical impedance tomography (EIT) is a technology that estimates the electrical properties of a body or a cross section. Its main advantages are its non-invasiveness, low cost and operation free of radiation. The estimation of the conductivity field leads to low-resolution images compared with other technologies, and high computational cost. However, in many applications the target information lies in a low intrinsic dimensionality of the conductivity field. The estimation of this low-dimensional information is addressed in this work, which proposes optimization-based and data-driven approaches for estimating it. The accuracy of the results obtained with these approaches depends on modelling and experimental conditions. Optimization approaches are sensitive to model discretization, the type of cost function and the search algorithm. Data-driven methods are sensitive to the assumed model structure and the data set used for parameter estimation. The system configuration and experimental conditions, such as the number of electrodes and the signal-to-noise ratio (SNR), also have an impact on the results. In order to illustrate the effects of all these factors, the position estimation of a circular anomaly is addressed. Optimization methods based on weighted error cost functions and derivative-free optimization algorithms provided the best results. Data-driven approaches based on linear models provided good estimates in this case, but the use of nonlinear models enhanced the estimation accuracy. The results obtained by optimization-based algorithms were less sensitive to experimental conditions, such as the number of electrodes and SNR, than data-driven approaches. For both simulated and experimental conditions, the position estimation mean squared errors of the optimization-based approaches were more than twice those of the data-driven ones. The experimental position estimation mean squared error of the data-driven models using a 16-electrode setup was less than 0.05% of the tomograph radius value. These results demonstrate that the proposed approaches can estimate an object's position accurately based on EIT measurements if enough process information is available for training or modelling. Since they do not require complex calculations, it is possible to use them in real-time applications without requiring high-performance computers.
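
    A data-driven linear position estimator of the kind described above can be sketched as follows (the linear forward map, dimensions and noise level are invented stand-ins for the physical EIT setup, not the paper's experiment):

    ```python
    import numpy as np

    # Invented stand-in for EIT training data: a synthetic linear map from
    # anomaly position (x, y) to 16 boundary-voltage measurements, plus noise.
    rng = np.random.default_rng(2)
    n_samples, n_electrodes = 200, 16
    A = rng.normal(0.0, 1.0, (n_electrodes, 2))         # hypothetical forward map
    positions = rng.uniform(-1.0, 1.0, (n_samples, 2))  # anomaly positions
    voltages = positions @ A.T + rng.normal(0.0, 0.01, (n_samples, n_electrodes))

    # Data-driven linear model: least-squares fit from voltages to position.
    W, *_ = np.linalg.lstsq(voltages, positions, rcond=None)

    test_pos = np.array([0.3, -0.4])
    est = (test_pos @ A.T) @ W          # estimate position from clean voltages
    err = float(np.linalg.norm(est - test_pos))
    ```

    Because estimation reduces to a single matrix-vector product, such a model is cheap enough for real-time use once trained.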

  10. Influence of Reservoirs on Pressure Driven Gas Flow in a Microchannel

    NASA Astrophysics Data System (ADS)

    Shterev, K. S.; Stefanov, S. K.

    2011-11-01

    Rapidly emerging micro-electro-mechanical devices create new potential microfluidic applications. Simulation of internal and external gas flows with accurate boundary conditions is important for the design of these devices. In this paper we study the influence of reservoirs placed at the microchannel inlet and outlet on the characteristics of the gas flow in the microchannel. The problem is solved using the finite volume method SIMPLE-TS (continuum approach), which is validated against the Direct Simulation Monte Carlo method (molecular approach). We investigate two cases: a microchannel with reservoirs and one without. We compare microchannels with different aspect ratios A = Lch/Hch = 10, 15, 20, 30, 40 and 50, where Lch is the channel length and Hch is the channel height. Comparisons of results obtained using the continuum approach for pressure-driven flow in a microchannel with and without reservoirs at the channel ends are presented.

  11. Three Research Strategies of Neuroscience and the Future of Legal Imaging Evidence.

    PubMed

    Jun, Jinkwon; Yoo, Soyoung

    2018-01-01

    Neuroscientific imaging evidence (NIE) has become an integral part of the criminal justice system in the United States. However, in most legal cases, NIE is submitted and used only to mitigate penalties because the court does not recognize it as substantial evidence, considering its lack of reliability. Nevertheless, we here discuss how neuroscience is expected to improve the use of NIE in the legal system. For this purpose, we classified the efforts of neuroscientists into three research strategies: cognitive subtraction, the data-driven approach, and the brain-manipulation approach. Cognitive subtraction is outdated and problematic; consequently, the court deemed it to be an inadequate approach in terms of legal evidence in 2012. In contrast, the data-driven and brain manipulation approaches, which are state-of-the-art approaches, have overcome the limitations of cognitive subtraction. The data-driven approach brings data science into the field and is benefiting immensely from the development of research platforms that allow automatized collection, analysis, and sharing of data. This broadens the scale of imaging evidence. The brain-manipulation approach uses high-functioning tools that facilitate non-invasive and precise human brain manipulation. These two approaches are expected to have synergistic effects. Neuroscience has strived to improve the evidential reliability of NIE, with considerable success. With the support of cutting-edge technologies, and the progress of these approaches, the evidential status of NIE will be improved and NIE will become an increasingly important part of legal practice.

  12. Impact of Data-driven Respiratory Gating in Clinical PET.

    PubMed

    Büther, Florian; Vehren, Thomas; Schäfers, Klaus P; Schäfers, Michael

    2016-10-01

    Purpose To study the feasibility and impact of respiratory gating in positron emission tomographic (PET) imaging in a clinical trial comparing conventional hardware-based gating with a data-driven approach and to describe the distribution of determined parameters. Materials and Methods This prospective study was approved by the ethics committee of the University Hospital of Münster (AZ 2014-217-f-N). Seventy-four patients suspected of having abdominal or thoracic fluorine 18 fluorodeoxyglucose (FDG)-positive lesions underwent clinical whole-body FDG PET/computed tomographic (CT) examinations. Respiratory gating was performed by using a pressure-sensitive belt system (belt gating [BG]) and an automatic data-driven approach (data-driven gating [DDG]). PET images were analyzed for lesion uptake, metabolic volumes, respiratory shifts of lesions, and diagnostic image quality. Results Forty-eight patients had at least one lesion in the field of view, resulting in a total of 164 lesions analyzed (range of number of lesions per patient, one to 13). Both gating methods revealed respiratory shifts of lesions (4.4 mm ± 3.1 for BG vs 4.8 mm ± 3.6 for DDG, P = .76). Increase in uptake of the lesions compared with nongated values did not differ significantly between both methods (maximum standardized uptake value [SUVmax], +7% ± 13 for BG vs +8% ± 16 for DDG, P = .76). Similarly, gating significantly decreased metabolic lesion volumes with both methods (-6% ± 26 for BG vs -7% ± 21 for DDG, P = .44) compared with nongated reconstructions. Blinded reading revealed significant improvements in diagnostic image quality when using gating, without significant differences between the methods (DDG was judged to be inferior to BG in 22 cases, equal in 12 cases, and superior in 15 cases; P = .32). Conclusion Respiratory gating increases diagnostic image quality and uptake values and decreases metabolic volumes compared with nongated acquisitions. 
Data-driven approaches are clinically applicable alternatives to belt-based methods and might help establish routine respiratory gating in clinical PET/CT. © RSNA, 2016. Online supplemental material is available for this article.

  13. Towards a Pattern-Driven Topical Ontology Modeling Methodology in Elderly Care Homes

    NASA Astrophysics Data System (ADS)

    Tang, Yan; de Baer, Peter; Zhao, Gang; Meersman, Robert; Pudkey, Kevin

    This paper presents a pattern-driven ontology modeling methodology, which is used to create topical ontologies in the human resource management (HRM) domain. An ontology topic is used to group concepts from different contexts (or even from different domain ontologies). We use the Organization for Economic Co-operation and Development (OECD) and the National Vocational Qualification (NVQ) as the resources to create the topical ontologies in this paper. The methodology is implemented in a tool called the PAD-ON suite. The approach is illustrated with a use case from elderly care homes in the UK.

  14. Developing the DESCARTE Model: The Design of Case Study Research in Health Care.

    PubMed

    Carolan, Clare M; Forbat, Liz; Smith, Annetta

    2016-04-01

    Case study is a long-established research tradition which predates the recent surge in mixed-methods research. Although a myriad of nuanced definitions of case study exist, seminal case study authors agree that the use of multiple data sources typify this research approach. The expansive case study literature demonstrates a lack of clarity and guidance in designing and reporting this approach to research. Informed by two reviews of the current health care literature, we posit that methodological description in case studies principally focuses on description of case study typology, which impedes the construction of methodologically clear and rigorous case studies. We draw from the case study and mixed-methods literature to develop the DESCARTE model as an innovative approach to the design, conduct, and reporting of case studies in health care. We examine how case study fits within the overall enterprise of qualitatively driven mixed-methods research, and the potential strengths of the model are considered. © The Author(s) 2015.

  15. Poetic Signs of Third Place: A Case Study of Student-Driven Imitation in a Shelter for Young Homeless People in Copenhagen

    ERIC Educational Resources Information Center

    Matthiesen, Christina

    2014-01-01

    During a series of writing workshops at a shelter for young homeless people in Copenhagen, I examined to what extent the literary practice of student-driven imitation with its emphasis on self-governance and a dialogical approach can engage marginalized learners in reading and writing. I found that student-driven imitation had the potential to…

  16. Managing and Securing Critical Infrastructure - A Semantic Policy and Trust Driven Approach

    DTIC Science & Technology

    2011-08-01

    environmental factors, then it is very likely that the corresponding device has been compromised and controlled by an adversary. In this case, the report... [Figure 7: Policy Execution in Faulty Case. (a) Environmental Factors in Faulty Case; (b) Result of Policy Execution in Faulty Case]

  17. Bottom-up approaches to strengthening child protection systems: Placing children, families, and communities at the center.

    PubMed

    Wessells, Michael G

    2015-05-01

    Efforts to strengthen national child protection systems have frequently taken a top-down approach of imposing formal, government-managed services. Such expert-driven approaches are often characterized by low use of formal services and the misalignment of the nonformal and formal aspects of the child protection system. This article examines an alternative approach of community-driven, bottom-up work that enables nonformal-formal collaboration and alignment, greater use of formal services, internally driven social change, and high levels of community ownership. The dominant approach of reliance on expert-driven Child Welfare Committees produces low levels of community ownership. Using an approach developed and tested in rural Sierra Leone, community-driven action, including collaboration and linkages with the formal system, promoted the use of formal services and achieved increased ownership, effectiveness, and sustainability of the system. The field needs less reliance on expert-driven approaches and much wider use of slower, community-driven, bottom-up approaches to child protection. Copyright © 2015 The Author. Published by Elsevier Ltd. All rights reserved.

  18. Automated segmentation of middle hepatic vein in non-contrast x-ray CT images based on an atlas-driven approach

    NASA Astrophysics Data System (ADS)

    Kitagawa, Teruhiko; Zhou, Xiangrong; Hara, Takeshi; Fujita, Hiroshi; Yokoyama, Ryujiro; Kondo, Hiroshi; Kanematsu, Masayuki; Hoshi, Hiroaki

    2008-03-01

    In order to support the diagnosis of hepatic diseases, understanding the anatomical structures of hepatic lobes and hepatic vessels is necessary. Although viewing and understanding the hepatic vessels in contrast media-enhanced CT images is easy, observing the hepatic vessels in non-contrast X-ray CT images, which are widely used for screening purposes, is difficult. We are developing a computer-aided diagnosis (CAD) system to support liver diagnosis based on non-contrast X-ray CT images. This paper proposes a new approach to segment the middle hepatic vein (MHV), a key structure (landmark) for separating the liver region into left and right lobes. Extraction and classification of hepatic vessels are difficult in non-contrast X-ray CT images because the contrast between hepatic vessels and other liver tissues is low. Our approach uses an atlas-driven method with the following three stages: (1) construction of liver atlases of left and right hepatic lobes using a learning dataset; (2) fully automated enhancement and extraction of hepatic vessels in liver regions; (3) extraction of the MHV based on the results of (1) and (2). The proposed approach was applied to 22 normal liver cases of non-contrast X-ray CT images. Preliminary results show that the proposed approach succeeded in extracting the MHV in 14 of the 22 cases.

  19. Theory-driven intervention for changing personality: expectancy value theory, behavioral activation, and conscientiousness.

    PubMed

    Magidson, Jessica F; Roberts, Brent W; Collado-Rodriguez, Anahi; Lejuez, C W

    2014-05-01

    Considerable evidence suggests that personality traits may be changeable, raising the possibility that personality traits most linked to health problems can be modified with intervention. A growing body of research suggests that problematic personality traits may be altered with behavioral intervention using a bottom-up approach. That is, by targeting core behaviors that underlie personality traits with the goal of engendering new, healthier patterns of behavior that, over time, become automatized and manifest in changes in personality traits. Nevertheless, a bottom-up model for changing personality traits is somewhat diffuse and requires clearer integration of theory and relevant interventions to enable real clinical application. As such, this article proposes a set of guiding principles for theory-driven modification of targeted personality traits using a bottom-up approach, focusing specifically on targeting the trait of conscientiousness using a relevant behavioral intervention, Behavioral Activation (BA), considered within the motivational framework of expectancy value theory (EVT). We conclude with a real case example of the application of BA to alter behaviors counter to conscientiousness in a substance-dependent patient, highlighting the EVT principles most relevant to the approach and the importance and viability of a theoretically driven, bottom-up approach to changing personality traits. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  20. Data-driven approach for assessing utility of medical tests using electronic medical records.

    PubMed

    Skrøvseth, Stein Olav; Augestad, Knut Magne; Ebadollahi, Shahram

    2015-02-01

    To precisely define the utility of tests in a clinical pathway through data-driven analysis of the electronic medical record (EMR). The information content was defined in terms of the entropy of the expected value of the test related to a given outcome. A kernel density classifier was used to estimate the necessary distributions. To validate the method, we used data from the EMR of the gastrointestinal department at a university hospital. Blood tests from patients undergoing gastrointestinal surgery were analyzed with respect to second surgery within 30 days of the index surgery. The information content is clearly reflected in the patient pathway for certain combinations of tests and outcomes. C-reactive protein tests coupled to anastomosis leakage, a severe complication, show a clear pattern of information gain through the patient trajectory, where the greatest gain from the test is 3-4 days post index surgery. We have defined the information content in a data-driven and information-theoretic way such that the utility of a test can be precisely defined. The results reflect clinical knowledge. In the case considered here, the tests carry little negative impact. The general approach can be expanded to cases that carry a substantial negative impact, such as certain radiological techniques. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
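
    The entropy-based notion of test utility can be sketched as follows (a crude threshold-based stand-in for the paper's kernel-density estimator; the CRP values and outcomes are invented toy data):

    ```python
    import math
    from collections import Counter

    def entropy(labels):
        """Shannon entropy (bits) of a list of outcome labels."""
        n = len(labels)
        return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

    def information_gain(test_values, outcomes, threshold):
        """Reduction in outcome entropy from knowing whether the test value
        exceeds a threshold (a discretized stand-in for a kernel density
        classifier)."""
        groups = {}
        for v, y in zip(test_values, outcomes):
            groups.setdefault(v > threshold, []).append(y)
        n = len(outcomes)
        conditional = sum(len(g) / n * entropy(g) for g in groups.values())
        return entropy(outcomes) - conditional

    # Invented toy data: CRP values and reoperation-within-30-days outcomes.
    crp = [5, 8, 12, 40, 180, 220, 150, 9]
    reop = [0, 0, 0, 0, 1, 1, 1, 0]
    gain = information_gain(crp, reop, threshold=100)
    ```

    Computing such a gain at each day post surgery traces out when in the patient trajectory the test is most informative.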

  1. Structural damage continuous monitoring by using a data driven approach based on principal component analysis and cross-correlation analysis

    NASA Astrophysics Data System (ADS)

    Camacho-Navarro, Jhonatan; Ruiz, Magda; Villamizar, Rodolfo; Mujica, Luis; Moreno-Beltrán, Gustavo; Quiroga, Jabid

    2017-05-01

    Continuous monitoring for damage detection in structural assessment calls for low-cost equipment and efficient algorithms. This work describes the stages involved in the design of a methodology with high feasibility for use in continuous damage assessment. Specifically, an algorithm based on a data-driven approach using principal component analysis, with acquired signals pre-processed by means of cross-correlation functions, is discussed. A carbon steel pipe section and a laboratory tower were used as test structures in order to demonstrate the feasibility of the methodology to detect abrupt changes in the structural response when damage occurs. Two damage cases are studied: a crack and a leak, one for each structure. Experimental results show that the methodology is promising for the continuous monitoring of real structures.
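
    The core of such a PCA-based detector can be sketched as follows (a minimal synthetic illustration: the baseline features, retained component count and damage shift are invented, and the cross-correlation pre-processing stage is omitted):

    ```python
    import numpy as np

    # Invented baseline feature vectors (e.g. flattened cross-correlation
    # functions); rows are measurements from the healthy structure.
    rng = np.random.default_rng(1)
    baseline = rng.normal(0.0, 1.0, (50, 20))
    mean = baseline.mean(axis=0)

    # PCA via SVD on centered data; keep 3 principal directions as the model.
    _, _, Vt = np.linalg.svd(baseline - mean, full_matrices=False)
    P = Vt[:3].T

    def q_statistic(x):
        """Squared residual of x outside the baseline PCA subspace; an abrupt
        increase flags a change in structural response (possible damage)."""
        xc = x - mean
        r = xc - P @ (P.T @ xc)
        return float(r @ r)

    healthy_q = q_statistic(baseline[0])
    damaged_q = q_statistic(baseline[0] + 5.0)  # synthetic damage-induced shift
    ```

    In continuous monitoring, the Q-statistic of each incoming measurement is compared against a threshold calibrated on the healthy baseline.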

  2. A Data-Driven Approach to Interactive Visualization of Power Grids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu, Jun

    Driven by emerging industry standards, electric utilities and grid coordination organizations are eager to seek advanced tools to assist grid operators in performing mission-critical tasks and enable them to make quick and accurate decisions. The emerging field of visual analytics holds tremendous promise for improving business practices in today's electric power industry. The investigation conducted, however, revealed that existing commercial power grid visualization tools rely heavily on human designers, hindering users' ability to discover. Additionally, for a large grid, it is very labor-intensive and costly to build and maintain pre-designed visual displays. This project proposes a data-driven approach to overcome these common challenges. The proposed approach relies on developing powerful data manipulation algorithms to create visualizations based on the characteristics of empirically or mathematically derived data. The resulting visual presentations emphasize what the data is rather than how the data should be presented, thus fostering comprehension and discovery. Furthermore, the data-driven approach formulates visualizations on the fly. It does not require a visualization design stage, completely eliminating or significantly reducing the cost of building and maintaining visual displays. The research and development (R&D) conducted in this project was divided into two phases. The first phase (Phases I & II) focused on developing data-driven techniques for visualization of the power grid and its operation. Various data-driven visualization techniques were investigated, including pattern recognition for auto-generation of one-line diagrams, fuzzy-model-based rich data visualization for situational awareness, etc. The R&D conducted during the second phase (Phase IIB) focused on enhancing the prototyped data-driven visualization tool based on the gathered requirements and use cases. The goal is to evolve the prototyped tool developed during the first phase into a commercial-grade product. We use one of the identified application areas as an example to demonstrate how research results achieved in this project were successfully utilized to address an emerging industry need. In summary, the data-driven visualization approach developed in this project has proven promising for building next-generation power grid visualization tools. Application of this approach has resulted in a state-of-the-art commercial tool currently leveraged by more than 60 utility organizations in North America and Europe.

  3. Resonance Effects in Magnetically Driven Mass-Spring Oscillations

    ERIC Educational Resources Information Center

    Taylor, Ken

    2011-01-01

    Resonance effects are among the most intriguing phenomena in physics and engineering. The classical case of a mass-spring oscillator driven at its resonant frequency is one of the earliest examples that students encounter. Perhaps the most commonly depicted method of driving the vibrating system is mechanical. An alternative approach presented in…

  4. Mass imbalances in EPANET water-quality simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, Michael J.; Janke, Robert; Taxon, Thomas N.

    EPANET is widely employed to simulate water quality in water distribution systems. However, the time-driven simulation approach used to determine concentrations of water-quality constituents provides accurate results, in general, only for small water-quality time steps; use of an adequately short time step may not be feasible. Overly long time steps can yield errors in concentrations and result in situations in which constituent mass is not conserved. Mass may not be conserved even when EPANET gives no errors or warnings. This paper explains how such imbalances can occur and provides examples of such cases; it also presents a preliminary event-driven approach that conserves mass with a water-quality time step that is as long as the hydraulic time step. Results obtained using the current approach converge, or tend to converge, to those obtained using the new approach as the water-quality time step decreases. Improving the water-quality routing algorithm used in EPANET could eliminate mass imbalances and related errors in estimated concentrations.
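
    A mass-balance audit of the kind that exposes such imbalances can be sketched as follows (the figures are invented for illustration; this is not EPANET's internal routing algorithm):

    ```python
    # Water-quality mass balance: constituent mass entering the network over a
    # simulation should equal mass leaving plus the change in stored mass.
    def mass_balance_error(mass_in, mass_out, stored_initial, stored_final):
        """Relative mass-balance error; ~0 means mass is conserved."""
        imbalance = mass_in - mass_out - (stored_final - stored_initial)
        return imbalance / mass_in

    # A conserving simulation: 100 in = 80 out + 20 gained in storage.
    ok = mass_balance_error(100.0, 80.0, 5.0, 25.0)
    # An overly long water-quality step that "loses" constituent mass:
    bad = mass_balance_error(100.0, 70.0, 5.0, 25.0)
    ```

    A nonzero error from such an audit signals a routing problem even when the simulator itself reports no warnings.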

  5. Insights into Departure Intention: A Qualitative Case Study

    ERIC Educational Resources Information Center

    Natoli, Riccardo; Jackling, Beverley; Siddique, Salina

    2015-01-01

    Efforts to address attrition rates at universities have been driven by Tinto's (1975) model of student engagement, with its focus on students': (a) pre-entry attributes; (b) academic engagement; and (c) social engagement. Using an ethnographic approach, the study involves interviews with business students to explore the links between these aspects…

  6. Supported Employment Handbook: A Customer-Driven Approach for Persons with Significant Disabilities.

    ERIC Educational Resources Information Center

    Brooke, Valerie, Ed.; And Others

    This manual provides training information for implementing supported employment by using a customer-driven approach. Chapter 1, "Supported Employment: A Customer-Driven Approach" (Valerie Brooke and others), describes current best practices, a new customer-driven approach to supported employment, and the role of the employment specialist. Chapter…

  7. Identifying Personal Goals of Patients With Long Term Condition: A Service Design Thinking Approach.

    PubMed

    Lee, Eunji; Gammon, Deede

    2017-01-01

    Care for patients with long term conditions is often characterized as fragmented and ineffective, and fails to engage the resources of patients and their families in the care process. Information and communication technology can potentially help bridge the gap between patients' lives and resources and the services provided by professionals. However, little attention has been paid to how to identify and incorporate patients' individual needs, values, preferences and care goals into digitally driven care settings. We conducted a case study, applying a service design thinking approach, in which healthcare professionals and patients participated. Using examples from their own experiences, the participants were able to elaborate personal goals of patients with long term conditions that could potentially be incorporated into digitally driven care plans.

  8. An empirical method that separates irreversible stem radial growth from bark water content changes in trees: theory and case studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mencuccini, Maurizio; Salmon, Yann; Mitchell, Patrick

    Substantial uncertainty surrounds our knowledge of tree stem growth, with some of the most basic questions, such as when stem radial growth occurs through the daily cycle, still unanswered. Here, we employed high-resolution point dendrometers, sap flow sensors, and developed theory and statistical approaches, to devise a novel method separating irreversible radial growth from elastic tension-driven and elastic osmotically driven changes in bark water content. We tested this method using data from five case study species. Experimental manipulations, namely a field irrigation experiment on Scots pine and a stem girdling experiment on red forest gum trees, were used to validate the theory. Time courses of stem radial growth following irrigation and stem girdling were consistent with a-priori predictions. Patterns of stem radial growth varied across case studies, with growth occurring during the day and/or night, consistent with the available literature. Importantly, our approach provides a valuable alternative to existing methods, as it can be approximated by a simple empirical interpolation routine that derives irreversible radial growth using standard regression techniques. In conclusion, our novel method provides an improved understanding of the relative source–sink carbon dynamics of tree stems at a sub-daily time scale.

  9. An empirical method that separates irreversible stem radial growth from bark water content changes in trees: theory and case studies.

    PubMed

    Mencuccini, Maurizio; Salmon, Yann; Mitchell, Patrick; Hölttä, Teemu; Choat, Brendan; Meir, Patrick; O'Grady, Anthony; Tissue, David; Zweifel, Roman; Sevanto, Sanna; Pfautsch, Sebastian

    2017-02-01

    Substantial uncertainty surrounds our knowledge of tree stem growth, with some of the most basic questions, such as when stem radial growth occurs through the daily cycle, still unanswered. We employed high-resolution point dendrometers and sap flow sensors, and developed theory and statistical approaches, to devise a novel method separating irreversible radial growth from elastic tension-driven and elastic osmotically driven changes in bark water content. We tested this method using data from five case study species. Experimental manipulations, namely a field irrigation experiment on Scots pine and a stem girdling experiment on red forest gum trees, were used to validate the theory. Time courses of stem radial growth following irrigation and stem girdling were consistent with a priori predictions. Patterns of stem radial growth varied across case studies, with growth occurring during the day and/or night, consistent with the available literature. Importantly, our approach provides a valuable alternative to existing methods, as it can be approximated by a simple empirical interpolation routine that derives irreversible radial growth using standard regression techniques. Our novel method provides an improved understanding of the relative source-sink carbon dynamics of tree stems at a sub-daily time scale. © 2016 The Authors Plant, Cell & Environment Published by John Wiley & Sons Ltd.
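    The "simple empirical interpolation routine" mentioned in the abstract can be sketched as a regression-based decomposition. Everything below (the linear water-content model, the no-growth fitting window, the variable names) is an illustrative assumption, not the authors' published procedure:

```python
import numpy as np

def separate_growth(dendro, water_proxy, no_growth_mask):
    """Estimate irreversible radial growth from a dendrometer series.

    dendro: stem radius readings (mm); water_proxy: a reversible
    bark-water-content proxy (e.g. derived from sap flow, arbitrary
    units); no_growth_mask: True where growth is assumed zero (the
    fitting window). The linear model is an illustrative assumption.
    """
    d = np.asarray(dendro, float)
    w = np.asarray(water_proxy, float)
    m = np.asarray(no_growth_mask, bool)
    # Regress radius on the water proxy inside the assumed no-growth window
    k, b = np.polyfit(w[m], d[m], 1)
    # Subtract the elastic (reversible) water-content component ...
    growth = d - k * w
    # ... and enforce irreversibility: growth can only accumulate
    return np.maximum.accumulate(growth - growth[0])
```

    On synthetic data where radius is a growth ramp plus a water-content oscillation, the routine recovers the ramp once the elastic coefficient is fitted in the no-growth window.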

  10. An empirical method that separates irreversible stem radial growth from bark water content changes in trees: theory and case studies

    DOE PAGES

    Mencuccini, Maurizio; Salmon, Yann; Mitchell, Patrick; ...

    2017-11-12

    Substantial uncertainty surrounds our knowledge of tree stem growth, with some of the most basic questions, such as when stem radial growth occurs through the daily cycle, still unanswered. Here, we employed high-resolution point dendrometers and sap flow sensors, and developed theory and statistical approaches, to devise a novel method separating irreversible radial growth from elastic tension-driven and elastic osmotically driven changes in bark water content. We tested this method using data from five case study species. Experimental manipulations, namely a field irrigation experiment on Scots pine and a stem girdling experiment on red forest gum trees, were used to validate the theory. Time courses of stem radial growth following irrigation and stem girdling were consistent with a priori predictions. Patterns of stem radial growth varied across case studies, with growth occurring during the day and/or night, consistent with the available literature. Importantly, our approach provides a valuable alternative to existing methods, as it can be approximated by a simple empirical interpolation routine that derives irreversible radial growth using standard regression techniques. In conclusion, our novel method provides an improved understanding of the relative source–sink carbon dynamics of tree stems at a sub-daily time scale.

  11. Redefining the Practice of Peer Review Through Intelligent Automation Part 2: Data-Driven Peer Review Selection and Assignment.

    PubMed

    Reiner, Bruce I

    2017-12-01

    In conventional radiology peer review practice, a small number of exams (routinely 5% of the total volume) is randomly selected, which may significantly underestimate the true error rate within a given radiology practice. An alternative and preferable approach would be to create a data-driven model which mathematically quantifies a peer review risk score for each individual exam and uses this data to identify high risk exams and readers, and selectively target these exams for peer review. An analogous model can also be created to assist in the assignment of these peer review cases in keeping with specific priorities of the service provider. An additional option to enhance the peer review process would be to assign the peer review cases in a truly blinded fashion. In addition to eliminating traditional peer review bias, this approach has the potential to better define exam-specific standard of care, particularly when multiple readers participate in the peer review process.
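    The risk-scoring idea can be illustrated with a minimal sketch. The feature names, weights, and linear form below are hypothetical, not taken from the article:

```python
import heapq

# Illustrative feature weights (hypothetical, not from the article):
# higher values mean higher a-priori discrepancy risk. Features are
# assumed normalized to [0, 1].
WEIGHTS = {"exam_complexity": 0.5, "reader_fatigue": 0.3, "after_hours": 0.2}

def risk_score(exam):
    """Linear peer-review risk score in [0, 1] for one exam record."""
    return sum(WEIGHTS[k] * exam[k] for k in WEIGHTS)

def select_for_review(exams, fraction=0.05):
    """Target the highest-risk fraction of exams for peer review,
    instead of the conventional random 5% sample."""
    n = max(1, int(len(exams) * fraction))
    return heapq.nlargest(n, exams, key=risk_score)
```

    The same scoring machinery could be reused for assignment, e.g. by ranking candidate reviewers per selected exam against provider priorities.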

  12. A management approach that drives actions strategically: balanced scorecard in a mental health trust case study.

    PubMed

    Schmidt, Stefan; Bateman, Ian; Breinlinger-O'Reilly, Jochen; Smith, Peter

    2006-01-01

    Achieving excellence is a current preoccupation in U.K. public health organisations. This article aims to use a case study to explain how a mental health trust delivers excellent performance using a balanced scorecard (BSC) management approach. Reports a project to implement a BSC approach in the South West Yorkshire Mental Health NHS Trust to achieve its "excellence" objectives. The authors were participants in the project. The design of the pilot project was informed theoretically by the work of Kaplan and Norton and practically by in-house discussions on a strategy to achieve excellence. Explains the process of building a BSC strategy step-by-step. Discusses how the vision and strategies of a mental health trust can be translated into tangible measures, which are the basis for actions that are driven strategically. There are many possibilities for a BSC management approach and this case study is specific to mental health trusts in the UK, although it is believed that the case has a universally applicable modus operandi. This article will help healthcare managers to evaluate the benefits of a BSC management approach. This article explains how actions can be structured in connection with a BSC management approach.

  13. Leadership and the Design of Data-Driven Professional Networks in Schools

    ERIC Educational Resources Information Center

    Liou, Yi-Hwa; Grigg, Jeffrey; Halverson, Richard

    2014-01-01

    Using data from a multi-method comparative case study of two matched schools, this paper adds to the growing body of applications of social network analysis to the study of distributed leadership and accountability. We contrast two approaches to instructional leadership, prescriptive and discretionary, to investigate how leaders design…

  14. SMART NAS Test Bed Overview

    NASA Technical Reports Server (NTRS)

    Palopo, Kee

    2016-01-01

    These slides present an overview of the SMART NAS Test Bed. The test bed is envisioned to be connected to operational systems and to allow new concepts and technologies to be evaluated in a realistic environment. Its role as an accelerator of concept and technology development, its use-case-driven development approach, and its current state are presented.

  15. Strategic Industrial Alliances in Paper Industry: XML- vs Ontology-Based Integration Platforms

    ERIC Educational Resources Information Center

    Naumenko, Anton; Nikitin, Sergiy; Terziyan, Vagan; Zharko, Andriy

    2005-01-01

    Purpose: To identify cases related to design of ICT platforms for industrial alliances, where the use of Ontology-driven architectures based on Semantic web standards is more advantageous than application of conventional modeling together with XML standards. Design/methodology/approach: A comparative analysis of the two latest and the most obvious…

  16. Desegregating Conversations about Race and Identity in Culturally Specific Museums

    ERIC Educational Resources Information Center

    Brown, Lovisa; Gutierrez, Caren; Okmin, Janine; McCullough, Susan

    2017-01-01

    Recent years have witnessed a surge in field-wide discussion about how to talk openly about race and culture within museum education. This article provides an analysis, using case studies from three culturally specific museums to explore how these identity-driven institutions navigate challenging, and often controversial, approaches to discussing…

  17. The ``Missing Compounds'' affair in functionality-driven material discovery

    NASA Astrophysics Data System (ADS)

    Zunger, Alex

    2014-03-01

    In the paradigm of ``data-driven discovery,'' underlying one of the leading streams of the Materials Genome Initiative (MGI), one attempts to compute, high-throughput style, as many of the properties of as many of the N (about 10**5-10**6) compounds listed in databases of previously known compounds as possible. One then inspects the ensuing Big Data, searching for useful trends. The alternative and complementary paradigm of ``functionality-directed search and optimization'' used here searches instead for the n (n much smaller than N) configurations and compositions that have the desired value of the target functionality. Examples include the use of genetic and other search methods that optimize the structure or identity of atoms on lattice sites, using atomistic electronic structure (such as first-principles) approaches in search of a given electronic property. This addresses a few of the bottlenecks that have faced the alternative data-driven/high-throughput/Big Data philosophy: (i) when the configuration space is theoretically of infinite size, building a complete database as in data-driven discovery is impossible, yet searching for the optimum functionality is still a well-posed problem; (ii) the configuration space that we explore might include artificially grown, kinetically stabilized systems (such as 2D layer stacks, superlattices, colloidal nanostructures and Fullerenes) that are not listed in the compound databases used by data-driven approaches; (iii) a large fraction of chemically plausible compounds have not been experimentally synthesized, so in the data-driven approach these are often skipped. In our approach we search explicitly for such ``Missing Compounds.'' It is likely that many interesting material properties will be found in cases (i)-(iii) that elude high-throughput searches based on databases encapsulating existing knowledge.
    I will illustrate (a) functionality-driven discovery of topological insulators and valley-split quantum-computer semiconductors, as well as (b) use of ``first-principles thermodynamics'' to discern which of the previously ``missing compounds'' should, in fact, exist and in which structure. Synthesis efforts by the Poeppelmeier group at NU realized 20 never-before-made half-Heusler compounds out of the 20 predicted ones, in our predicted space groups. This type of theory-led experimental search for designed materials with target functionalities may shorten the current process of discovery of interesting functional materials. Supported by DOE, Office of Science, Energy Frontier Research Center for Inverse Design.

  18. Using Two Different Approaches to Assess Dietary Patterns: Hypothesis-Driven and Data-Driven Analysis.

    PubMed

    Previdelli, Ágatha Nogueira; de Andrade, Samantha Caesar; Fisberg, Regina Mara; Marchioni, Dirce Maria

    2016-09-23

    The use of dietary patterns to assess dietary intake has become increasingly common in nutritional epidemiology studies due to the complexity and multidimensionality of the diet. Currently, two main approaches have been widely used to assess dietary patterns: data-driven and hypothesis-driven analysis. Since the methods explore different angles of dietary intake, using both approaches simultaneously might yield complementary and useful information; thus, we aimed to use both approaches to gain knowledge of adolescents' dietary patterns. Food intake from a cross-sectional survey with 295 adolescents was assessed by 24 h dietary recall (24HR). In the hypothesis-driven analysis, based on the American National Cancer Institute method, the usual intake of Brazilian Healthy Eating Index Revised components was estimated. In the data-driven approach, the usual intake of foods/food groups was estimated by the Multiple Source Method. In the results, hypothesis-driven analysis showed low scores for Whole grains, Total vegetables, Total fruit and Whole fruits, while, in data-driven analysis, fruits and whole grains did not appear in any pattern. High intakes of sodium, fats and sugars were observed in hypothesis-driven analysis, with low total scores for the Sodium, Saturated fat and SoFAA (calories from solid fat, alcohol and added sugar) components in agreement, while the data-driven approach showed the intake of several foods/food groups rich in these nutrients, such as butter/margarine, cookies, chocolate powder, whole milk, cheese, processed meat/cold cuts and candies. In this study, using both approaches at the same time provided consistent and complementary information with regard to assessing overall dietary habits, which will be important in order to drive public health programs and improve their efficiency in monitoring and evaluating the dietary patterns of populations.
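    For the hypothesis-driven side, diet-quality indices typically score each adequacy component against a dietary standard. A generic sketch of such a scoring rule (not the exact BHEI-R algorithm):

```python
def component_score(intake, standard, max_points):
    """Score one adequacy component of a diet-quality index.

    Intake at or above the standard earns max_points; below it, points
    scale proportionally. This is a generic index rule used for
    illustration, not the published BHEI-R scoring specification.
    """
    return min(intake / standard, 1.0) * max_points
```

    Summing such component scores over, e.g., whole grains, vegetables, fruit, sodium and fat components yields the total index score that the hypothesis-driven analysis compares across individuals.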

  19. Comparison of driven and simulated "free" stall flutter in a wind tunnel

    NASA Astrophysics Data System (ADS)

    Culler, Ethan; Farnsworth, John; Fagley, Casey; Seidel, Jurgen

    2016-11-01

    Stall flutter and dynamic stall have received a significant amount of attention over the years. To experimentally study this problem, the body undergoing stall flutter is typically driven at a characteristic, single frequency sinusoid with a prescribed pitching amplitude and mean angle of attack offset. This approach allows for testing with repeatable kinematics, however it effectively decouples the structural motion from the aerodynamic forcing. Recent results suggest that this driven approach could misrepresent the forcing observed in a "free" stall flutter scenario. Specifically, a dynamically pitched rigid NACA 0018 wing section was tested in the wind tunnel under two modes of operation: (1) Cyber-Physical where "free" stall flutter was physically simulated through a custom motor-control system modeling a torsional spring and (2) Direct Motor-Driven Dynamic Pitch at a single frequency sinusoid representative of the cyber-physical motion. The time-resolved pitch angle and moment were directly measured and compared for each case. It was found that small deviations in the pitch angle trajectory between these two operational cases generate significantly different aerodynamic pitching moments on the wing section, with the pitching moments nearly 180° out of phase in some cases. This work is supported by the Air Force Office of Scientific Research through the Flow Interactions and Control Program and by the National Defense Science and Engineering Graduate Fellowship Program.

  20. Computational neuroscience approach to biomarkers and treatments for mental disorders.

    PubMed

    Yahata, Noriaki; Kasai, Kiyoto; Kawato, Mitsuo

    2017-04-01

    Psychiatry research has long experienced a stagnation stemming from a lack of understanding of the neurobiological underpinnings of phenomenologically defined mental disorders. Recently, the application of computational neuroscience to psychiatry research has shown great promise in establishing a link between phenomenological and pathophysiological aspects of mental disorders, thereby recasting current nosology in more biologically meaningful dimensions. In this review, we highlight recent investigations into computational neuroscience that have undertaken either theory- or data-driven approaches to quantitatively delineate the mechanisms of mental disorders. The theory-driven approach, including reinforcement learning models, plays an integrative role in this process by enabling correspondence between behavior and disorder-specific alterations at multiple levels of brain organization, ranging from molecules to cells to circuits. Previous studies have explicated a plethora of defining symptoms of mental disorders, including anhedonia, inattention, and poor executive function. The data-driven approach, on the other hand, is an emerging field in computational neuroscience seeking to identify disorder-specific features among high-dimensional big data. Remarkably, various machine-learning techniques have been applied to neuroimaging data, and the extracted disorder-specific features have been used for automatic case-control classification. For many disorders, the reported accuracies have reached 90% or more. However, we note that rigorous tests on independent cohorts are critically required to translate this research into clinical applications. Finally, we discuss the utility of the disorder-specific features found by the data-driven approach to psychiatric therapies, including neurofeedback. 
Such developments will allow simultaneous diagnosis and treatment of mental disorders using neuroimaging, thereby establishing 'theranostics' for the first time in clinical psychiatry. © 2016 The Authors. Psychiatry and Clinical Neurosciences © 2016 Japanese Society of Psychiatry and Neurology.

  1. A data-driven multiplicative fault diagnosis approach for automation processes.

    PubMed

    Hao, Haiyang; Zhang, Kai; Ding, Steven X; Chen, Zhiwen; Lei, Yaguo

    2014-09-01

    This paper presents a new data-driven method for diagnosing multiplicative key performance degradation in automation processes. Different from the well-established additive fault diagnosis approaches, the proposed method aims at identifying those low-level components which increase the variability of process variables and cause performance degradation. Based on process data, features of multiplicative fault are extracted. To identify the root cause, the impact of fault on each process variable is evaluated in the sense of contribution to performance degradation. Then, a numerical example is used to illustrate the functionalities of the method and Monte-Carlo simulation is performed to demonstrate the effectiveness from the statistical viewpoint. Finally, to show the practical applicability, a case study on the Tennessee Eastman process is presented. Copyright © 2013. Published by Elsevier Ltd.
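    Since a multiplicative fault manifests as increased variability rather than a shifted mean, a much-simplified sketch of the idea is to rank process variables by fault-induced variance inflation (the paper's actual feature extraction and contribution analysis are more involved):

```python
import numpy as np

def variance_contributions(nominal, faulty):
    """Rank process variables by how much a fault inflates their variance.

    nominal, faulty: arrays of shape (samples, variables) of process
    data under normal and degraded operation. The per-variable variance
    ratio is used here as a simple stand-in for the paper's contribution
    analysis (an illustrative simplification).
    """
    v0 = np.var(np.asarray(nominal, float), axis=0)
    v1 = np.var(np.asarray(faulty, float), axis=0)
    ratio = v1 / v0
    order = np.argsort(ratio)[::-1]  # most-inflated variable first
    return order, ratio
```

    Variables at the top of the ranking point toward the low-level components most likely responsible for the performance degradation.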

  2. Screening of pollution control and clean-up materials for river chemical spills using the multiple case-based reasoning method with a difference-driven revision strategy.

    PubMed

    Liu, Rentao; Jiang, Jiping; Guo, Liang; Shi, Bin; Liu, Jie; Du, Zhaolin; Wang, Peng

    2016-06-01

    In-depth filtering of emergency disposal technology (EDT) and materials has been required in the process of environmental pollution emergency disposal. However, an urgent problem that must be solved is how to quickly and accurately select the most appropriate materials for treating a pollution event from the existing spill control and clean-up materials (SCCM). To meet this need, the following objectives were addressed in this study. First, the material base and a case base for environment pollution emergency disposal were established to build a foundation and provide material for SCCM screening. Second, the multiple case-based reasoning model method with a difference-driven revision strategy (DDRS-MCBR) was applied to improve the original dual case-based reasoning model method system, and screening and decision-making was performed for SCCM using this model. Third, an actual environmental pollution accident from 2012 was used as a case study to verify the material base, case base, and screening model. The results demonstrated that the DDRS-MCBR method was fast, efficient, and practical. The DDRS-MCBR method changes the passive situation in which the choice of SCCM screening depends only on the subjective experience of the decision maker and offers a new approach to screening SCCM.
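    The retrieve step of case-based reasoning can be sketched as a weighted similarity over normalized case attributes. The attributes, weights, and materials below are hypothetical examples, not the paper's actual case representation:

```python
def similarity(case_a, case_b, weights):
    """Weighted similarity in [0, 1] over numeric attributes normalized
    to [0, 1]; weights map attribute names to their importance."""
    s = sum(w * (1.0 - abs(case_a[k] - case_b[k])) for k, w in weights.items())
    return s / sum(weights.values())

def retrieve(new_case, case_base, weights):
    """Return the stored spill case most similar to the new one.

    A revision step, as in the paper's difference-driven strategy, would
    then adjust the retrieved disposal plan by the attribute differences.
    """
    return max(case_base, key=lambda c: similarity(new_case, c, weights))
```

    The retrieved case carries its SCCM recommendation, which the revision strategy then adapts rather than leaving the choice to the decision maker's subjective experience.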

  3. Effective Rating Scale Development for Speaking Tests: Performance Decision Trees

    ERIC Educational Resources Information Center

    Fulcher, Glenn; Davidson, Fred; Kemp, Jenny

    2011-01-01

    Rating scale design and development for testing speaking is generally conducted using one of two approaches: the measurement-driven approach or the performance data-driven approach. The measurement-driven approach prioritizes the ordering of descriptors onto a single scale. Meaning is derived from the scaling methodology and the agreement of…

  4. A Conceptual Analytics Model for an Outcome-Driven Quality Management Framework as Part of Professional Healthcare Education.

    PubMed

    Hervatis, Vasilis; Loe, Alan; Barman, Linda; O'Donoghue, John; Zary, Nabil

    2015-10-06

    Preparing the future health care professional workforce in a changing world is a significant undertaking. Educators and other decision makers look to evidence-based knowledge to improve quality of education. Analytics, the use of data to generate insights and support decisions, have been applied successfully across numerous application domains. Health care professional education is one area where great potential is yet to be realized. Previous research of Academic and Learning analytics has mainly focused on technical issues. The focus of this study relates to its practical implementation in the setting of health care education. The aim of this study is to create a conceptual model for a deeper understanding of the synthesizing process, and transforming data into information to support educators' decision making. A deductive case study approach was applied to develop the conceptual model. The analytics loop works both in theory and in practice. The conceptual model encompasses the underlying data, the quality indicators, and decision support for educators. The model illustrates how a theory can be applied to a traditional data-driven analytics approach, and alongside the context- or need-driven analytics approach.

  5. A Conceptual Analytics Model for an Outcome-Driven Quality Management Framework as Part of Professional Healthcare Education

    PubMed Central

    Loe, Alan; Barman, Linda; O'Donoghue, John; Zary, Nabil

    2015-01-01

    Background Preparing the future health care professional workforce in a changing world is a significant undertaking. Educators and other decision makers look to evidence-based knowledge to improve quality of education. Analytics, the use of data to generate insights and support decisions, have been applied successfully across numerous application domains. Health care professional education is one area where great potential is yet to be realized. Previous research of Academic and Learning analytics has mainly focused on technical issues. The focus of this study relates to its practical implementation in the setting of health care education. Objective The aim of this study is to create a conceptual model for a deeper understanding of the synthesizing process, and transforming data into information to support educators’ decision making. Methods A deductive case study approach was applied to develop the conceptual model. Results The analytics loop works both in theory and in practice. The conceptual model encompasses the underlying data, the quality indicators, and decision support for educators. Conclusions The model illustrates how a theory can be applied to a traditional data-driven analytics approach, and alongside the context- or need-driven analytics approach. PMID:27731840

  6. Integrating geo web services for a user driven exploratory analysis

    NASA Astrophysics Data System (ADS)

    Moncrieff, Simon; Turdukulov, Ulanbek; Gulland, Elizabeth-Kate

    2016-04-01

    In data exploration, several online data sources may need to be dynamically aggregated or summarised over a spatial region, time interval, or set of attributes. With respect to thematic data, web services are mainly used to present results, leading to a supplier-driven service model that limits the exploration of the data. In this paper we propose a user-need-driven service model based on geo web processing services. The aim of the framework is to provide a method for scalable and interactive access to various geographic data sources on the web. The architecture combines a data query, processing technique and visualisation methodology to rapidly integrate and visually summarise properties of a dataset. We illustrate the environment on a health-related use case that derives the Age Standardised Rate, a dynamic index that needs integration of existing interoperable web services of demographic data in conjunction with standalone non-spatial secure database servers used in health research. Although the example is specific to the health field, the architecture and the proposed approach are relevant and applicable to other fields that require integration and visualisation of geo datasets from various web services, and thus, we believe, the approach is generic.
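    The Age Standardised Rate named in the use case is computed by direct standardisation: weighting age-specific rates by a standard population. A minimal sketch of that calculation:

```python
def age_standardised_rate(cases, person_years, std_pop):
    """Directly age-standardised rate per 100,000 person-years.

    cases[i], person_years[i]: observed event counts and exposure in
    age band i; std_pop[i]: standard population weight for band i
    (e.g. a WHO or national standard). Band definitions are assumed
    to match across the three inputs.
    """
    rates = [c / py for c, py in zip(cases, person_years)]
    asr = sum(r * w for r, w in zip(rates, std_pop)) / sum(std_pop)
    return asr * 100_000
```

    In the proposed architecture, the counts and exposures would arrive from separate web services (demographic data plus the secure health database), and this summarisation would run in a geo web processing service.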

  7. A Case Study of the Technology Use and Information Flow at a Hospital-Driven Telemedicine Service.

    PubMed

    Smaradottir, Berglind; Fensli, Rune

    2017-01-01

    Health care services face the challenge of providing individualised treatment to a growing ageing population prone to chronic conditions and multi-morbidities. The research project Patients and Professionals in Productive Teams aims to study health care services that are run with a patient-centred teamwork approach. In this context, a case study was made of a hospital-driven telemedicine service for chronic obstructive pulmonary disease patients after hospital discharge, with a focus on information flow and technology use. The methods used were observation and interviews with key informants. The results showed that the technology was perceived as well-functioning for telemedicine support, but it was a standalone system not integrated with the electronic health record of the hospital. In addition, there was a lack of support for providing patients at home with written instructions and advice on medical treatment and care. The electronic information used for this telemedicine service allowed shared access to information for teamwork between professionals only within the hospital.

  8. Ontology-Based Retrieval of Spatially Related Objects for Location Based Services

    NASA Astrophysics Data System (ADS)

    Haav, Hele-Mai; Kaljuvee, Aivi; Luts, Martin; Vajakas, Toivo

    Advanced Location Based Service (LBS) applications have to integrate information stored in GIS, information about users' preferences (profile) as well as contextual information and information about application itself. Ontology engineering provides methods to semantically integrate several data sources. We propose an ontology-driven LBS development framework: the paper describes the architecture of ontologies and their usage for retrieval of spatially related objects relevant to the user. Our main contribution is to enable personalised ontology driven LBS by providing a novel approach for defining personalised semantic spatial relationships by means of ontologies. The approach is illustrated by an industrial case study.

  9. Low altitude remote sensing technologies for crop stress monitoring: a case study on spatial and temporal monitoring of irrigated pinto bean

    USDA-ARS?s Scientific Manuscript database

    Site-specific crop management is a promising approach to maximize crop yield with optimal use of rapidly depleting natural resources. Availability of high resolution crop data at critical growth stages is a key for real-time data-driven decisions during the production season. The goal of this study ...

  10. Quantifying model-structure- and parameter-driven uncertainties in spring wheat phenology prediction with Bayesian analysis

    DOE PAGES

    Alderman, Phillip D.; Stanfill, Bryan

    2016-10-06

    Recent international efforts have brought renewed emphasis on the comparison of different agricultural systems models. Thus far, analysis of model-ensemble simulated results has not clearly differentiated between ensemble prediction uncertainties due to model structural differences per se and those due to parameter value uncertainties. Additionally, despite increasing use of Bayesian parameter estimation approaches with field-scale crop models, inadequate attention has been given to the full posterior distributions for estimated parameters. The objectives of this study were to quantify the impact of parameter value uncertainty on prediction uncertainty for modeling spring wheat phenology using Bayesian analysis and to assess the relative contributions of model-structure-driven and parameter-value-driven uncertainty to overall prediction uncertainty. This study used a random walk Metropolis algorithm to estimate parameters for 30 spring wheat genotypes using nine phenology models based on multi-location trial data for days to heading and days to maturity. Across all cases, parameter-driven uncertainty accounted for between 19 and 52% of predictive uncertainty, while model-structure-driven uncertainty accounted for between 12 and 64%. Here, this study demonstrated the importance of quantifying both model-structure- and parameter-value-driven uncertainty when assessing overall prediction uncertainty in modeling spring wheat phenology. More generally, Bayesian parameter estimation provided a useful framework for quantifying and analyzing sources of prediction uncertainty.
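    The random walk Metropolis algorithm named above can be sketched in a few lines. The target density here is a toy stand-in for a phenology model's parameter posterior:

```python
import math
import random

def metropolis(log_post, x0, n_steps, step=0.5, seed=1):
    """Random walk Metropolis sampler for a 1-D parameter.

    log_post: log posterior density of the parameter (up to a constant);
    returns the chain of samples, whose spread quantifies the
    parameter-driven component of prediction uncertainty.
    """
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    samples = []
    for _ in range(n_steps):
        prop = x + rng.gauss(0.0, step)            # symmetric random-walk proposal
        lp_prop = log_post(prop)
        if math.log(rng.random()) < lp_prop - lp:  # Metropolis accept/reject
            x, lp = prop, lp_prop
        samples.append(x)
    return samples
```

    After discarding a burn-in, the retained samples approximate the full posterior distribution the study emphasizes, rather than a single point estimate.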

  11. Exponential integrators in time-dependent density-functional calculations

    NASA Astrophysics Data System (ADS)

    Kidd, Daniel; Covington, Cody; Varga, Kálmán

    2017-12-01

    The integrating factor and exponential time differencing methods are implemented and tested for solving the time-dependent Kohn-Sham equations. Popular time propagation methods used in physics, as well as other robust numerical approaches, are compared to these exponential integrator methods in order to judge the relative merit of the computational schemes. We determine an improvement in accuracy of multiple orders of magnitude when describing dynamics driven primarily by a nonlinear potential. For cases of dynamics driven by a time-dependent external potential, the accuracy of the exponential integrator methods is less enhanced, but they still match or outperform the best of the conventional methods tested.
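    The core of an exponential integrator for a linear step is applying the matrix exponential exactly. A minimal sketch for a time-independent Hermitian matrix (the Kohn-Sham Hamiltonian in the paper is nonlinear and time dependent, which is where the integrating-factor and time-differencing machinery comes in):

```python
import numpy as np

def exp_step(H, psi, dt):
    """One exponential-integrator step: psi(t+dt) = exp(-i H dt) psi(t).

    H: Hermitian matrix (a small stand-in for a discretized Hamiltonian);
    the exponential is applied exactly via the eigendecomposition
    H = V diag(w) V^dagger, so norm is conserved to machine precision.
    """
    w, V = np.linalg.eigh(H)
    phase = np.exp(-1j * w * dt)
    return V @ (phase * (V.conj().T @ psi))
```

    For the 2x2 case H = [[0, 1], [1, 0]] with psi(0) = (1, 0), the exact evolution is (cos t, -i sin t), which the step reproduces for any dt.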

  12. Practice and Policy to Enhance Student Induction and Transition: A Case Study of Institution-Wide Change

    ERIC Educational Resources Information Center

    Alsford, Sally; Rose, Christine

    2014-01-01

    This case study gives an analytical account of institutional development in induction provision. Driven by student experience concerns, a London post-1992 University set up an "enhanced induction project" to provide a more integrated, personalised approach through more coordinated processes. In a large, diverse context, university-wide…

  13. A Formal Model of Ambiguity and its Applications in Machine Translation

    DTIC Science & Technology

    2010-01-01

    structure indicates linguistically implausible segmentation that might be generated using dictionary-driven approaches...derivation. As was done in the monolingual case, the functions LHS, RHSi, RHSo and υ can be extended to a derivation δ. D(q) where q ∈ V denotes the... monolingual parses. My algorithm runs more efficiently than O(n^6) with many grammars (including those that required using heuristic search with other parsers

  14. Floquet spin states in graphene under ac-driven spin-orbit interaction

    NASA Astrophysics Data System (ADS)

    López, A.; Sun, Z. Z.; Schliemann, J.

    2012-05-01

    We study the role of periodically driven time-dependent Rashba spin-orbit coupling (RSOC) on a monolayer graphene sample. After recasting the originally 4×4 system of dynamical equations as two time-reversal related two-level problems, the quasienergy spectrum and the related dynamics are investigated via various techniques and approximations. In the static case, the system is gapped at the Dirac point. The rotating wave approximation (RWA) applied to the driven system unphysically preserves this feature, while the Magnus-Floquet approach as well as a numerically exact evaluation of the Floquet equation show that this gap is dynamically closed. In addition, a sizable oscillating pattern of the out-of-plane spin polarization is found in the driven case for states that are completely unpolarized in the static limit. Evaluation of the autocorrelation function shows that the original uniform interference pattern corresponding to time-independent RSOC gets distorted. The resulting structure can be qualitatively explained as a consequence of the transitions induced by the ac driving among the static eigenstates, i.e., these transitions modulate the relative phases that add up to give the quantum revivals of the autocorrelation function. Contrary to the static case, in the driven scenario, quantum revivals (suppressions) are correlated to spin-up (down) phases.
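    A numerically exact Floquet evaluation, as contrasted with the RWA in the abstract, amounts to diagonalizing the one-period propagator U(T), whose eigenvalues are exp(-i ε T) with ε the quasienergies. A generic sketch (not specific to the graphene RSOC Hamiltonian):

```python
import numpy as np

def quasienergies(h_of_t, T, n_steps=400):
    """Quasienergies of a periodically driven system from U(T).

    h_of_t(t) returns the (Hermitian) Hamiltonian at time t; U(T) is
    built by composing short exponential steps at interval midpoints,
    then eps = i log(eigvals(U)) / T gives the quasienergies (defined
    modulo 2*pi/T). Illustrative, dimension-agnostic sketch.
    """
    dim = h_of_t(0.0).shape[0]
    U = np.eye(dim, dtype=complex)
    dt = T / n_steps
    for k in range(n_steps):
        w, V = np.linalg.eigh(h_of_t((k + 0.5) * dt))  # midpoint Hamiltonian
        U = V @ (np.exp(-1j * w * dt)[:, None] * (V.conj().T @ U))
    lam = np.linalg.eigvals(U)
    return np.sort((1j * np.log(lam) / T).real)
```

    For a static Hamiltonian this reduces to the ordinary eigenvalues (folded into the Floquet Brillouin zone), which gives a quick sanity check before turning on the drive.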

  15. Managing Complex Change in Clinical Study Metadata

    PubMed Central

    Brandt, Cynthia A.; Gadagkar, Rohit; Rodriguez, Cesar; Nadkarni, Prakash M.

    2004-01-01

    In highly functional metadata-driven software, the interrelationships within the metadata become complex, and maintenance becomes challenging. We describe an approach to metadata management that uses a knowledge-base subschema to store centralized information about metadata dependencies and use cases involving specific types of metadata modification. Our system borrows ideas from production-rule systems in that some of this information is a high-level specification that is interpreted and executed dynamically by a middleware engine. Our approach is implemented in TrialDB, a generic clinical study data management system. We review approaches that have been used for metadata management in other contexts and describe the features, capabilities, and limitations of our system. PMID:15187070

  16. Toward a user-driven approach to radiology software solutions: putting the wag back in the dog.

    PubMed

    Morgan, Matthew; Mates, Jonathan; Chang, Paul

    2006-09-01

    The relationship between healthcare providers and the software industry is evolving. In many cases, industry's traditional, market-driven model is failing to meet the increasingly sophisticated and appropriately individualized needs of providers. Advances in both technology infrastructure and development methodologies have set the stage for the transition from a vendor-driven to a more user-driven process of solution engineering. To make this transition, providers must take an active role in the development process and vendors must provide flexible frameworks on which to build. Only then can the provider/vendor relationship mature from a purchaser/supplier to a codesigner/partner model, where true insight and innovation can occur.

  17. Improving Self-Assembly by Varying the Temperature Periodically with Time

    NASA Astrophysics Data System (ADS)

    Raz, Oren; Jarzynski, Christopher

    Self-assembly (SA) is the process by which basic components organize into a larger structure without external guidance. These processes are common in Nature, and also have technological applications, e.g. growing a crystal with a specific structure. So far, artificial SA processes have been designed mostly using diffusive building blocks with high specificity and directionality. The formation of the self-assembled structures is then driven by free-energy minimization into a thermodynamically stable state. In an alternative approach to SA, macroscopic parameters such as temperature, pressure, pH, magnetic field etc., are varied periodically with time. In this case, the SA structures are the stable periodic states of the driven system. Currently there are no design principles for periodically driven SA, other than in the limits of fast or weak driving. We present guiding ideas for self-assembly under periodic driving. As an example, we show a particular case in which self-assembly errors can be dramatically reduced by varying a system's temperature periodically with time. James S. McDonnell Foundation, and the US National Science Foundation: DMR-1506969.

  18. The dynamics of information-driven coordination phenomena: A transfer entropy analysis

    PubMed Central

    Borge-Holthoefer, Javier; Perra, Nicola; Gonçalves, Bruno; González-Bailón, Sandra; Arenas, Alex; Moreno, Yamir; Vespignani, Alessandro

    2016-01-01

    Data from social media provide unprecedented opportunities to investigate the processes that govern the dynamics of collective social phenomena. We consider an information theoretical approach to define and measure the temporal and structural signatures typical of collective social events as they arise and gain prominence. We use the symbolic transfer entropy analysis of microblogging time series to extract directed networks of influence among geolocalized subunits in social systems. This methodology captures the emergence of system-level dynamics close to the onset of socially relevant collective phenomena. The framework is validated against a detailed empirical analysis of five case studies. In particular, we identify a change in the characteristic time scale of the information transfer that flags the onset of information-driven collective phenomena. Furthermore, our approach identifies an order-disorder transition in the directed network of influence between social subunits. In the absence of clear exogenous driving, social collective phenomena can be represented as endogenously driven structural transitions of the information transfer network. This study provides results that can help define models and predictive algorithms for the analysis of societal events based on open source data. PMID:27051875

  19. The dynamics of information-driven coordination phenomena: A transfer entropy analysis.

    PubMed

    Borge-Holthoefer, Javier; Perra, Nicola; Gonçalves, Bruno; González-Bailón, Sandra; Arenas, Alex; Moreno, Yamir; Vespignani, Alessandro

    2016-04-01

    Data from social media provide unprecedented opportunities to investigate the processes that govern the dynamics of collective social phenomena. We consider an information theoretical approach to define and measure the temporal and structural signatures typical of collective social events as they arise and gain prominence. We use the symbolic transfer entropy analysis of microblogging time series to extract directed networks of influence among geolocalized subunits in social systems. This methodology captures the emergence of system-level dynamics close to the onset of socially relevant collective phenomena. The framework is validated against a detailed empirical analysis of five case studies. In particular, we identify a change in the characteristic time scale of the information transfer that flags the onset of information-driven collective phenomena. Furthermore, our approach identifies an order-disorder transition in the directed network of influence between social subunits. In the absence of clear exogenous driving, social collective phenomena can be represented as endogenously driven structural transitions of the information transfer network. This study provides results that can help define models and predictive algorithms for the analysis of societal events based on open source data.

  20. Data-Driven Engineering of Social Dynamics: Pattern Matching and Profit Maximization

    PubMed Central

    Peng, Huan-Kai; Lee, Hao-Chih; Pan, Jia-Yu; Marculescu, Radu

    2016-01-01

    In this paper, we define a new problem related to social media, namely, the data-driven engineering of social dynamics. More precisely, given a set of observations from the past, we aim at finding the best short-term intervention that can lead to predefined long-term outcomes. Toward this end, we propose a general formulation that covers two useful engineering tasks as special cases, namely, pattern matching and profit maximization. By incorporating a deep learning model, we derive a solution using convex relaxation and quadratic-programming transformation. Moreover, we propose a data-driven evaluation method in place of the expensive field experiments. Using a Twitter dataset, we demonstrate the effectiveness of our dynamics engineering approach for both pattern matching and profit maximization, and study the multifaceted interplay among several important factors of dynamics engineering, such as solution validity, pattern-matching accuracy, and intervention cost. Finally, the method we propose is general enough to work with multi-dimensional time series, so it can potentially be used in many other applications. PMID:26771830

  1. Data-Driven Engineering of Social Dynamics: Pattern Matching and Profit Maximization.

    PubMed

    Peng, Huan-Kai; Lee, Hao-Chih; Pan, Jia-Yu; Marculescu, Radu

    2016-01-01

    In this paper, we define a new problem related to social media, namely, the data-driven engineering of social dynamics. More precisely, given a set of observations from the past, we aim at finding the best short-term intervention that can lead to predefined long-term outcomes. Toward this end, we propose a general formulation that covers two useful engineering tasks as special cases, namely, pattern matching and profit maximization. By incorporating a deep learning model, we derive a solution using convex relaxation and quadratic-programming transformation. Moreover, we propose a data-driven evaluation method in place of the expensive field experiments. Using a Twitter dataset, we demonstrate the effectiveness of our dynamics engineering approach for both pattern matching and profit maximization, and study the multifaceted interplay among several important factors of dynamics engineering, such as solution validity, pattern-matching accuracy, and intervention cost. Finally, the method we propose is general enough to work with multi-dimensional time series, so it can potentially be used in many other applications.

  2. Designing an optimal software intensive system acquisition: A game theoretic approach

    NASA Astrophysics Data System (ADS)

    Buettner, Douglas John

    The development of schedule-constrained software-intensive space systems is challenging. Case study data from national security space programs developed at the U.S. Air Force Space and Missile Systems Center (USAF SMC) provide evidence of the strong desire by contractors to skip or severely reduce software development design and early defect detection methods in these schedule-constrained environments. The research findings suggest recommendations to fully address these issues at numerous levels. However, the observations lead us to investigate modeling and theoretical methods to fundamentally understand what motivated this behavior in the first place. As a result, Madachy's inspection-based system dynamics model is modified to include unit testing and an integration test feedback loop. This Modified Madachy Model (MMM) is used as a tool to investigate the consequences of this behavior on the observed defect dynamics for two remarkably different case study software projects. Latin Hypercube sampling of the MMM with sample distributions for quality-, schedule- and cost-driven strategies demonstrates that the higher-cost and -effort quality-driven strategies provide consistently better schedule performance than the schedule-driven up-front effort-reduction strategies. Game-theoretic reasoning, grounded in the case study evidence and Austin's agency model, describes why schedule-driven engineers cut corners on inspections and unit testing. Game theory concepts are then used to argue that the source of the problem, and hence the solution to developers cutting corners on quality in schedule-driven system acquisitions, ultimately lies with the government. The game theory arguments also lead to the suggestion that a multi-player dynamic Nash bargaining game provides a solution for the observed lack-of-quality game between the government (the acquirer) and "large-corporation" software developers. A concluding note argues that this multi-player dynamic Nash bargaining game also solves Freeman Dyson's problem of finding a way to label systems as good or bad.

  3. New variational principles for locating periodic orbits of differential equations.

    PubMed

    Boghosian, Bruce M; Fazendeiro, Luis M; Lätt, Jonas; Tang, Hui; Coveney, Peter V

    2011-06-13

    We present new methods for the determination of periodic orbits of general dynamical systems. Iterative algorithms for finding solutions by these methods, for both the exact continuum case, and for approximate discrete representations suitable for numerical implementation, are discussed. Finally, we describe our approach to the computation of unstable periodic orbits of the driven Navier-Stokes equations, simulated using the lattice Boltzmann equation.

  4. Challenges to Institutionalizing Participatory Extension: The Case of Farmer Livestock Schools in Vietnam

    ERIC Educational Resources Information Center

    Minh, Thai Thi; Larsen, Carl Erik Schou; Neef, Andreas

    2010-01-01

    Purpose: The objective of this article is to analyze the introduction of participatory extension approaches (PEA) in the predominantly supply-driven, hierarchical Vietnamese extension system. Drawing on the case of the so-called Farmer Livestock School (FLS) concept, the authors investigate the potential and challenges of scaling up and out the…

  5. The Diploma in Rehabilitation Studies--The Birth of a New Form of Industry-Driven Learning.

    ERIC Educational Resources Information Center

    Leberman, Sarah I.

    The Accident Rehabilitation and Compensation Insurance Corporation (ARCIC) provides no-fault rehabilitation and compensation to all New Zealanders. In order to meet the training needs created by ARCIC's recent shift to a case management approach, the Victoria University of Wellington instituted a program to train case managers. The 27-week program…

  6. Crystalline nucleation in undercooled liquids: a Bayesian data-analysis approach for a nonhomogeneous Poisson process.

    PubMed

    Filipponi, A; Di Cicco, A; Principi, E

    2012-12-01

    A Bayesian data-analysis approach to data sets of maximum undercooling temperatures recorded in repeated melting-cooling cycles of high-purity samples is proposed. The crystallization phenomenon is described in terms of a nonhomogeneous Poisson process driven by a temperature-dependent sample nucleation rate J(T). The method was extensively tested by computer simulations and applied to real data for undercooled liquid Ge. It proved to be particularly useful in the case of scarce data sets where the usage of binned data would degrade the available experimental information.
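
    The modelling idea can be illustrated with a toy calculation (a made-up exponential nucleation rate and a coarse likelihood grid; the paper's parameterisation of J(T) and its full Bayesian treatment differ): maximum-undercooling temperatures are treated as first events of a nonhomogeneous Poisson process during a linear cooling ramp.

```python
import math, random

def rate(T, j0, T_m=1210.0, s=25.0):
    """Toy nucleation rate J(T): grows exponentially with undercooling."""
    return j0 * math.exp((T_m - T) / s)

def log_lik(temps, j0, T_m=1210.0, s=25.0, q=1.0):
    # Cooling at rate q: the density of the first event at temperature T is
    # (J(T)/q) * exp(-∫_T^{T_m} J(u)/q du), with the integral in closed form.
    ll = 0.0
    for T in temps:
        integral = s * (rate(T, j0, T_m, s) - j0) / q
        ll += math.log(rate(T, j0, T_m, s) / q) - integral
    return ll

# Synthetic maximum-undercooling data via inverse-transform sampling.
random.seed(3)
true_j0, s, T_m, q = 1e-3, 25.0, 1210.0, 1.0
temps = [T_m - s * math.log(1.0 - q * math.log(random.random()) / (s * true_j0))
         for _ in range(200)]

# Coarse grid over the rate prefactor: the data should prefer the true value.
grid = [true_j0 * f for f in (0.25, 0.5, 1.0, 2.0, 4.0)]
best = max(grid, key=lambda j0: log_lik(temps, j0))
print(best == true_j0)  # the true rate prefactor should score highest
```

    A full Bayesian analysis would place a prior over the parameters of J(T) and normalise; the grid comparison above only shows why repeated cycles constrain the rate.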

  7. Workforce Development for Communities in Crisis and Transition: A Case Study of the Windward Islands.

    ERIC Educational Resources Information Center

    Whittington, L. Alfons

    The Windward Islands (Dominica, Grenada, St. Lucia, and St. Vincent and the Grenadines) have taken several approaches to educate the work force and prepare for the technology-driven society of the future. These approaches include government initiatives, such as the governments' commitment to primary education and more recently to secondary…

  8. A guide to using case-based learning in biochemistry education.

    PubMed

    Kulak, Verena; Newton, Genevieve

    2014-01-01

    Studies indicate that the majority of students in undergraduate biochemistry take a surface approach to learning, associated with rote memorization of material, rather than a deep approach, which implies higher cognitive processing. This behavior relates to poorer outcomes, including impaired course performance and reduced knowledge retention. Incorporating case-based learning (CBL) into biochemistry teaching may facilitate deep learning by increasing student engagement and interest. Abundant literature on CBL exists, but clear guidance on how to design and implement case studies is not readily available. This guide provides a representative review of CBL uses in science and describes the process of developing CBL modules to be used in biochemistry. Included is a framework for implementing directed CBL, supported by lectures, in a content-driven biochemistry course regardless of class size. Moreover, this guide can facilitate adapting CBL to other courses. Consequently, the information presented herein will be of value to undergraduate science educators with an interest in active-learning pedagogies. © 2014 The International Union of Biochemistry and Molecular Biology.

  9. A scan statistic for identifying optimal risk windows in vaccine safety studies using self-controlled case series design.

    PubMed

    Xu, Stanley; Hambidge, Simon J; McClure, David L; Daley, Matthew F; Glanz, Jason M

    2013-08-30

    In the examination of the association between vaccines and rare adverse events after vaccination in postlicensure observational studies, it is challenging to define appropriate risk windows because prelicensure RCTs provide little insight on the timing of specific adverse events. Past vaccine safety studies have often used prespecified risk windows based on prior publications, biological understanding of the vaccine, and expert opinion. Recently, a data-driven approach was developed to identify appropriate risk windows for vaccine safety studies that use the self-controlled case series design. This approach employs both the maximum incidence rate ratio and the linear relation between the estimated incidence rate ratio and the inverse of average person time at risk, given a specified risk window. In this paper, we present a scan statistic that can identify appropriate risk windows in vaccine safety studies using the self-controlled case series design while taking into account the dependence of time intervals within an individual and while adjusting for time-varying covariates such as age and seasonality. This approach uses the maximum likelihood ratio test based on fixed-effects models, which has been used for analyzing data from self-controlled case series design in addition to conditional Poisson models. Copyright © 2013 John Wiley & Sons, Ltd.
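
    A drastically simplified sketch of the scanning idea follows (ignoring the within-person dependence and the age/seasonality adjustment that the paper's fixed-effects models handle, and using made-up event days): for each candidate risk window after vaccination, compare the event rate inside versus outside and keep the window with the largest Poisson log-likelihood ratio.

```python
import math

def log_lik_ratio(k_in, t_in, k_out, t_out):
    """Log-likelihood ratio of a two-rate Poisson model vs a single common rate."""
    k, t = k_in + k_out, t_in + t_out
    def ll(kk, tt):
        return kk * math.log(kk / tt) if kk > 0 else 0.0
    return ll(k_in, t_in) + ll(k_out, t_out) - ll(k, t)

def scan_windows(event_days, follow_up=60, max_end=30):
    """Scan candidate windows [day 1, end] and return (end, LLR) of the best one."""
    best = None
    for end in range(1, max_end + 1):
        k_in = sum(1 for d in event_days if 1 <= d <= end)
        k_out = len(event_days) - k_in
        llr = log_lik_ratio(k_in, end, k_out, follow_up - end)
        if best is None or llr > best[1]:
            best = (end, llr)
    return best

# Synthetic post-vaccination event days, clustered in days 1-14.
events = [2, 3, 5, 6, 7, 9, 10, 12, 13, 14, 25, 41, 55]
print(scan_windows(events))  # the window ending at day 14 scores highest
```

    The paper's statistic additionally conditions on each case's total event count, which is what makes the self-controlled design robust to between-person confounding.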

  10. Comparison of Classical and Lazy Approach in SCG Compiler

    NASA Astrophysics Data System (ADS)

    Jirák, Ota; Kolář, Dušan

    2011-09-01

    The existing parsing methods for scattered context grammars usually expand nonterminals deep in the pushdown. This expansion is implemented using either a linked list or some kind of auxiliary pushdown. This paper describes a parsing algorithm for an LL(1) scattered context grammar. The algorithm merges two principles. The first is the table-driven parsing method commonly used for parsing context-free grammars. The second is delayed execution, as used in functional programming. The main part of this paper is a proof of equivalence between the common principle (the whole rule is applied at once) and our approach (execution of the rules is delayed). Thanks to this delay, the approach works with the pushdown top only. In most cases, the second approach is faster than the first. Finally, future work is discussed.
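
    The delayed-execution principle can be sketched in a few lines. This is a minimal illustration with a hypothetical `LazyPushdown` class, not the authors' implementation: instead of rewriting every nonterminal of a scattered context rule deep in the pushdown at once, the rule is merely recorded, and a symbol is actually expanded only when it reaches the pushdown top.

```python
# Sketch of lazy rule application on a pushdown (illustrative only).
class LazyPushdown:
    def __init__(self, symbols):
        self.stack = list(symbols)      # bottom ... top
        self.pending = {}               # symbol -> queue of deferred right-hand sides

    def defer(self, rule):
        """Record a whole scattered-context rule (A1->x1, ..., An->xn)
        without touching any symbol below the pushdown top."""
        for lhs, rhs in rule:
            self.pending.setdefault(lhs, []).append(rhs)

    def top(self):
        """Force deferred rewrites only for the topmost symbol."""
        sym = self.stack[-1]
        while sym in self.pending and self.pending[sym]:
            rhs = self.pending[sym].pop(0)
            self.stack.pop()
            self.stack.extend(reversed(rhs))  # push rhs, leftmost symbol on top
            sym = self.stack[-1]
        return sym

pd = LazyPushdown(["C", "B", "A"])           # "A" is on top
pd.defer([("A", ["a"]), ("B", ["b", "B"])])  # whole rule recorded at once
print(pd.top())   # prints "a"; only A was actually expanded, B's rewrite is still pending
```

    The eager variant would have walked the pushdown and rewritten both A and B immediately; here the rewrite of B is paid for only if B ever surfaces.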

  11. FIELD-DRIVEN APPROACHES TO SUBSURFACE CONTAMINANT TRANSPORT MODELING.

    EPA Science Inventory

    Observations from field sites provide a means for prioritizing research activities. In the case of petroleum releases, observations may include spiking of concentration distributions that may be related to water table fluctuation, co-location of contaminant plumes with geochemi...

  12. An Adaptation of the Distance Driven Projection Method for Single Pinhole Collimators in SPECT Imaging

    NASA Astrophysics Data System (ADS)

    Ihsani, Alvin; Farncombe, Troy

    2016-02-01

    The modelling of the projection operator in tomographic imaging is of critical importance, especially when working with algebraic methods of image reconstruction. This paper proposes a distance-driven projection method targeted at single-pinhole single-photon emission computed tomography (SPECT) imaging, since it accounts for the finite size of the pinhole and the possible tilting of the detector surface, in addition to other collimator-specific factors such as geometric sensitivity. The accuracy and execution time of the proposed method are evaluated by comparing it to a ray-driven approach in which the pinhole is sub-sampled with various sampling schemes. A point-source phantom whose projections were generated using OpenGATE was first used to compare the resolution of images reconstructed with each method using the full width at half maximum (FWHM). Furthermore, a high-activity Mini Deluxe Phantom (Data Spectrum Corp., Durham, NC, USA) SPECT resolution phantom was scanned using a Gamma Medica X-SPECT system, and the signal-to-noise ratio (SNR) and structural similarity of reconstructed images were compared at various projection counts. Based on the reconstructed point-source phantom, the proposed distance-driven approach results in a lower FWHM than the ray-driven approach even when using a smaller detector resolution. Furthermore, based on the Mini Deluxe Phantom, the distance-driven approach shows consistently higher SNR and structural similarity than the ray-driven approach as the counts in the measured projections deteriorate.

  13. Model-Driven Development of Safety Architectures

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Pai, Ganesh; Whiteside, Iain

    2017-01-01

    We describe the use of model-driven development for safety assurance of a pioneering NASA flight operation involving a fleet of small unmanned aircraft systems (sUAS) flying beyond visual line of sight. The central idea is to develop a safety architecture that provides the basis for risk assessment and visualization within a safety case, the formal justification of acceptable safety required by the aviation regulatory authority. A safety architecture is composed from a collection of bow tie diagrams (BTDs), a practical approach to manage safety risk by linking the identified hazards to the appropriate mitigation measures. The safety justification for a given unmanned aircraft system (UAS) operation can have many related BTDs. In practice, however, each BTD is independently developed, which poses challenges with respect to incremental development, maintaining consistency across different safety artifacts when changes occur, and in extracting and presenting stakeholder specific information relevant for decision making. We show how a safety architecture reconciles the various BTDs of a system, and, collectively, provide an overarching picture of system safety, by considering them as views of a unified model. We also show how it enables model-driven development of BTDs, replete with validations, transformations, and a range of views. Our approach, which we have implemented in our toolset, AdvoCATE, is illustrated with a running example drawn from a real UAS safety case. The models and some of the innovations described here were instrumental in successfully obtaining regulatory flight approval.

  14. Modeling of fast neutral-beam-generated ions and rotation effects on RWM stability in DIII-D plasmas

    DOE PAGES

    Turco, Francesca; Turnbull, Alan D.; Hanson, Jeremy M.; ...

    2015-10-15

    Here, validation results for the MARS-K code on DIII-D equilibria predict that the absence of fast Neutral-Beam (NB) generated ions leads to a plasma response ~40–60% higher than in NB-sustained H-mode plasmas when the no-wall β_N limit is reached. In a β_N scan, the MARS-K model with thermal and fast ions reproduces the experimental measurements above the no-wall limit, except at the highest β_N, where the phase of the plasma response is overestimated. The dependencies extrapolate unfavorably to machines such as ITER with smaller fast-ion fractions, since elevated responses in the absence of fast ions indicate the potential onset of a resistive wall mode (RWM). The model was also tested for the effects of rotation at high β_N, and recovers the measured response even when fast ions are neglected, reversing the effect found in lower β_N cases but consistent with the higher β_N results above the no-wall limit. The agreement in the response amplitude and phase for the rotation scan is not as good, and additional work will be needed to reproduce the experimental trends. In the case of current-driven instabilities, the magnetohydrodynamic spectroscopy system used to measure the plasma response reacts differently from that for pressure-driven instabilities: the response amplitude remains low up to ~93% of the current limit, showing an abrupt increase only in the last ~5% of the current ramp. This makes it much less effective as a diagnostic for the approach to an ideal limit. However, the mode structure of the current-driven RWM extends radially inwards, consistent with that in the pressure-driven case for plasmas with q_edge ~ 2. This suggests that previously developed RWM feedback techniques, together with the additional optimizations that enabled q_edge ~ 2 operation, can be applied to control of both current-driven and pressure-driven modes at high β_N.

  15. Socio-Technical Relations in the Creation of an Interest-Driven Open Course

    ERIC Educational Resources Information Center

    Ponti, Marisa

    2011-01-01

    The aim of this article is to present the findings from a small exploratory case study of an open course on cyberpunk literature conducted at the Peer 2 Peer University (P2PU), an online grass-roots organisation that runs non-accredited courses. Employing actor network theory to inform an ethnographic-inductive approach, the case study sought to…

  16. CROSS-DISCIPLINARY PHYSICS AND RELATED AREAS OF SCIENCE AND TECHNOLOGY: Kinetics of catalytically activated duplication in aggregation growth

    NASA Astrophysics Data System (ADS)

    Wang, Hai-Feng; Lin, Zhen-Quan; Gao, Yan; Xu, Chao

    2009-08-01

    We propose a catalytically activated duplication model to mimic the coagulation and duplication of a DNA polymer system under the catalysis of primer RNA. In the model, two aggregates of the same species can coagulate, and a DNA aggregate of any size can yield a new monomer or double itself with the help of RNA aggregates. By employing the mean-field rate equation approach, we analytically investigate the evolution behaviour of the system. For the system with catalysis-driven monomer duplication, the aggregate size distribution of DNA polymers a_k(t) always follows a power law in size in the long-time limit; it decreases with time or approaches a time-independent steady-state form when the duplication rate is independent of the size of the mother aggregates, while it increases over time when the duplication rate is proportional to the size of the mother aggregates. For the system with complete catalysis-driven duplication, the aggregate size distribution a_k(t) approaches a generalized or modified scaling form.

  17. A Hybrid Physics-Based Data-Driven Approach for Point-Particle Force Modeling

    NASA Astrophysics Data System (ADS)

    Moore, Chandler; Akiki, Georges; Balachandar, S.

    2017-11-01

    This study improves upon the physics-based pairwise interaction extended point-particle (PIEP) model. The PIEP model leverages a physical framework to predict fluid-mediated interactions between solid particles. While the PIEP model is a powerful tool, its pairwise assumption leads to increased error in flows with high particle volume fractions. To reduce this error, a regression algorithm is used to model the differences between the current PIEP model's predictions and the results of direct numerical simulations (DNS) for an array of monodisperse solid particles subjected to various flow conditions. The resulting statistical model and the physical PIEP model are superimposed to construct a hybrid, physics-based data-driven PIEP model. It must be noted that the performance of a pure data-driven approach without the model form provided by the physical PIEP model is substantially inferior. The hybrid model's predictive capabilities are analyzed using additional DNS. In every case tested, the hybrid PIEP model's predictions are more accurate than those of the physical PIEP model. This material is based upon work supported by the National Science Foundation Graduate Research Fellowship Program under Grant No. DGE-1315138 and the U.S. DOE, NNSA, ASC Program, as a Cooperative Agreement under Contract No. DE-NA0002378.
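
    The superposition idea can be illustrated schematically (toy one-dimensional data, not the actual PIEP model or DNS results): fit a regression model to the residuals between a physics-based prediction and the reference data, then add the learned correction back onto the physical model.

```python
import random

def physics_model(x):   # stand-in for the pairwise physics-based prediction
    return 2.0 * x

def truth(x):           # stand-in for the DNS result (physics + an unmodeled term)
    return 2.0 * x + 0.5 * x * x

random.seed(0)
xs = [random.uniform(0, 1) for _ in range(500)]
residuals = [truth(x) - physics_model(x) for x in xs]

# Tiny "regression": least-squares fit of residual ≈ c * x^2.
num = sum(x * x * r for x, r in zip(xs, residuals))
den = sum(x ** 4 for x in xs)
c = num / den

def hybrid_model(x):    # physics prediction + learned residual correction
    return physics_model(x) + c * x * x

x0 = 0.8
print(abs(truth(x0) - hybrid_model(x0)) < abs(truth(x0) - physics_model(x0)))  # True
```

    The key design choice mirrors the paper's: the statistical part only has to capture what the physics misses, which is why a pure data-driven fit without the physical model form does worse.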

  18. Data-driven outbreak forecasting with a simple nonlinear growth model

    PubMed Central

    Lega, Joceline; Brown, Heidi E.

    2016-01-01

    Recent events have thrown the spotlight on infectious disease outbreak response. We developed a data-driven method, EpiGro, which can be applied to cumulative case reports to estimate the order of magnitude of the duration, peak and ultimate size of an ongoing outbreak. It is based on a surprisingly simple mathematical property of many epidemiological data sets, does not require knowledge or estimation of disease transmission parameters, is robust to noise and to small data sets, and runs quickly due to its mathematical simplicity. Using data from historic and ongoing epidemics, we present the model. We also provide modeling considerations that justify this approach and discuss its limitations. In the absence of other information or in conjunction with other models, EpiGro may be useful to public health responders. PMID:27770752
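
    The flavour of such a fit can be sketched as follows, assuming a logistic-type relation between incidence and cumulative cases (EpiGro's exact functional form may differ): regress daily incidence on a quadratic in the cumulative count and read off the outbreak size as the root of the fitted curve.

```python
import math

def estimate_final_size(cum_cases):
    """Fit dC/dt ≈ a*C + b*C^2 to cumulative counts and return -a/b."""
    dc = [y - x for x, y in zip(cum_cases, cum_cases[1:])]      # daily incidence
    cm = [(x + y) / 2 for x, y in zip(cum_cases, cum_cases[1:])]  # midpoint counts
    # Least squares via 2x2 normal equations (no intercept term).
    s11 = sum(x * x for x in cm)
    s12 = sum(x ** 3 for x in cm)
    s22 = sum(x ** 4 for x in cm)
    t1 = sum(x * d for x, d in zip(cm, dc))
    t2 = sum(x * x * d for x, d in zip(cm, dc))
    det = s11 * s22 - s12 * s12
    a = (t1 * s22 - t2 * s12) / det
    b = (t2 * s11 - t1 * s12) / det
    return -a / b   # incidence vanishes here: the ultimate outbreak size

# Synthetic logistic outbreak: final size K = 1000, growth rate 0.3/day.
K, r = 1000.0, 0.3
cases = [K / (1 + (K - 10) / 10 * math.exp(-r * t)) for t in range(60)]
print(round(estimate_final_size(cases)))  # close to 1000
```

    No transmission parameters are needed, which matches the abstract's point; the robustness to noise and short data sets claimed for EpiGro is not demonstrated by this toy.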

  19. Aspiration dynamics of multi-player games in finite populations

    PubMed Central

    Du, Jinming; Wu, Bin; Altrock, Philipp M.; Wang, Long

    2014-01-01

    On studying strategy update rules in the framework of evolutionary game theory, one can differentiate between imitation processes and aspiration-driven dynamics. In the former case, individuals imitate the strategy of a more successful peer. In the latter case, individuals adjust their strategies based on a comparison of their pay-offs from the evolutionary game to a value they aspire, called the level of aspiration. Unlike imitation processes of pairwise comparison, aspiration-driven updates do not require additional information about the strategic environment and can thus be interpreted as being more spontaneous. Recent work has mainly focused on understanding how aspiration dynamics alter the evolutionary outcome in structured populations. However, the baseline case for understanding strategy selection is the well-mixed population case, which is still lacking sufficient understanding. We explore how aspiration-driven strategy-update dynamics under imperfect rationality influence the average abundance of a strategy in multi-player evolutionary games with two strategies. We analytically derive a condition under which a strategy is more abundant than the other in the weak selection limiting case. This approach has a long-standing history in evolutionary games and is mostly applied for its mathematical approachability. Hence, we also explore strong selection numerically, which shows that our weak selection condition is a robust predictor of the average abundance of a strategy. The condition turns out to differ from that of a wide class of imitation dynamics, as long as the game is not dyadic. Therefore, a strategy favoured under imitation dynamics can be disfavoured under aspiration dynamics. This does not require any population structure, and thus highlights the intrinsic difference between imitation and aspiration dynamics. PMID:24598208

  20. Aspiration dynamics of multi-player games in finite populations.

    PubMed

    Du, Jinming; Wu, Bin; Altrock, Philipp M; Wang, Long

    2014-05-06

    On studying strategy update rules in the framework of evolutionary game theory, one can differentiate between imitation processes and aspiration-driven dynamics. In the former case, individuals imitate the strategy of a more successful peer. In the latter case, individuals adjust their strategies based on a comparison of their pay-offs from the evolutionary game to a value they aspire, called the level of aspiration. Unlike imitation processes of pairwise comparison, aspiration-driven updates do not require additional information about the strategic environment and can thus be interpreted as being more spontaneous. Recent work has mainly focused on understanding how aspiration dynamics alter the evolutionary outcome in structured populations. However, the baseline case for understanding strategy selection is the well-mixed population case, which is still lacking sufficient understanding. We explore how aspiration-driven strategy-update dynamics under imperfect rationality influence the average abundance of a strategy in multi-player evolutionary games with two strategies. We analytically derive a condition under which a strategy is more abundant than the other in the weak selection limiting case. This approach has a long-standing history in evolutionary games and is mostly applied for its mathematical approachability. Hence, we also explore strong selection numerically, which shows that our weak selection condition is a robust predictor of the average abundance of a strategy. The condition turns out to differ from that of a wide class of imitation dynamics, as long as the game is not dyadic. Therefore, a strategy favoured under imitation dynamics can be disfavoured under aspiration dynamics. This does not require any population structure, and thus highlights the intrinsic difference between imitation and aspiration dynamics.
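
    The aspiration-driven update rule described above is straightforward to simulate in a well-mixed population. The sketch below uses made-up payoffs and parameters (a linear public-goods game and a Fermi-type switching probability) purely for illustration; note that no co-player's payoff is ever consulted, which is the contrast with imitation dynamics.

```python
import random, math

def simulate(n=200, d=5, rounds=5000, aspiration=1.0, w=1.0,
             r_factor=3.0, cost=1.0, seed=1):
    """Average abundance of cooperation under aspiration-driven updating."""
    rng = random.Random(seed)
    strat = [rng.random() < 0.5 for _ in range(n)]   # True = cooperate
    coop_time = 0.0
    for _ in range(rounds):
        i = rng.randrange(n)
        # Random d-player group containing the focal player.
        group = [i] + rng.sample([j for j in range(n) if j != i], d - 1)
        nc = sum(strat[j] for j in group)
        payoff = r_factor * cost * nc / d - (cost if strat[i] else 0.0)
        # Fermi rule: the more the payoff falls short of the aspiration,
        # the more likely the player is to switch strategy (w = rationality).
        p_switch = 1.0 / (1.0 + math.exp(-w * (aspiration - payoff)))
        if rng.random() < p_switch:
            strat[i] = not strat[i]
        coop_time += sum(strat) / n
    return coop_time / rounds

print(simulate())  # a cooperation level strictly between 0 and 1
```

    With imperfect rationality (finite w) the dynamics never fixes on a pure state, so the long-run average abundance is the natural quantity to compare against the paper's weak-selection condition.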

  1. Data Albums: An Event Driven Search, Aggregation and Curation Tool for Earth Science

    NASA Technical Reports Server (NTRS)

    Ramachandran, Rahul; Kulkarni, Ajinkya; Maskey, Manil; Bakare, Rohan; Basyal, Sabin; Li, Xiang; Flynn, Shannon

    2014-01-01

    Approaches used in Earth science research, such as case study analysis and climatology studies, involve discovering and gathering diverse data sets and information to support the research goals. Gathering relevant data and information for case studies and climatology analysis is both tedious and time consuming. Current Earth science data systems are designed with the assumption that researchers access data primarily by instrument or geophysical parameter. In cases where researchers are interested in studying a significant event, they have to manually assemble a variety of datasets relevant to it by searching the different distributed data systems. This paper presents a specialized search, aggregation and curation tool for Earth science to address these challenges. The search tool automatically creates curated 'Data Albums', aggregated collections of information related to a specific event, containing links to relevant data files [granules] from different instruments, tools and services for visualization and analysis, and information about the event contained in news reports, images or videos to supplement research analysis. Curation in the tool is driven by an ontology-based relevancy ranking algorithm to filter out non-relevant information and data.

  2. A new practice-driven approach to develop software in a cyber-physical system environment

    NASA Astrophysics Data System (ADS)

    Jiang, Yiping; Chen, C. L. Philip; Duan, Junwei

    2016-02-01

    Cyber-physical system (CPS) is an emerging area, which cannot work efficiently without proper software handling of the data and business logic. Software and middleware are the soul of the CPS. The software development of CPS is a critical issue because of its complexity in a large-scale realistic system. Furthermore, the object-oriented approach (OOA) is often used to develop CPS software, which needs some improvements according to the characteristics of CPS. To develop software in a CPS environment, a new systematic approach is proposed in this paper. It comes from practice, and has evolved in software companies. It consists of (A) requirement analysis in an event-oriented way, (B) architecture design in a data-oriented way, (C) detailed design and coding in an object-oriented way and (D) testing in an event-oriented way. It is a new approach based on OOA; the difference compared with OOA is that the proposed approach has different emphases and measures in every stage. It is more in accord with the characteristics of event-driven CPS. In CPS software development, one should focus on the events more than the functions or objects. A case study of a smart home system is designed to reveal the effectiveness of the approach. It shows that the approach is also easy to operate in practice owing to some simplifications. The running result illustrates the validity of this approach.

  3. Yarkovsky-driven Impact Predictions: Apophis and 1950 DA

    NASA Astrophysics Data System (ADS)

    Farnocchia, Davide; Chesley, S. R.; Chodas, P.; Milani, A.

    2013-05-01

    Orbit determination for Near-Earth Asteroids presents unique technical challenges due to the imperative of early detection and careful assessment of the risk posed by specific Earth close approaches. The occurrence of an Earth impact can be decisively driven by the Yarkovsky effect, which is the most important nongravitational perturbation as it causes asteroids to undergo a secular variation in semimajor axis, resulting in a quadratic effect in anomaly. We discuss the cases of (99942) Apophis and (29075) 1950 DA. The relevance of the Yarkovsky effect for Apophis is due to a scattering close approach in 2029 with minimum geocentric distance ~38000 km. For 1950 DA the influence of the Yarkovsky effect in 2880 is due to the long time interval preceding the impact. We use the available information on the asteroids' physical models as a starting point for a Monte Carlo method that allows us to measure how the Yarkovsky effect affects orbital predictions. For Apophis we map onto the 2029 close approach b-plane and analyze the keyholes corresponding to resonant close approaches. For 1950 DA we use the b-plane corresponding to the possible impact in 2880. We finally compute the impact probability from the mapped probability density function on the considered b-plane.

  4. Stakeholder-Driven Quality Improvement: A Compelling Force for Clinical Practice Guidelines.

    PubMed

    Rosenfeld, Richard M; Wyer, Peter C

    2018-01-01

    Clinical practice guideline development should be driven by rigorous methodology, but what is less clear is where quality improvement enters the process: should it be a priority-guiding force, or should it enter only after recommendations are formulated? We argue for a stakeholder-driven approach to guideline development, with an overriding goal of quality improvement based on stakeholder perceptions of needs, uncertainties, and knowledge gaps. In contrast, the widely used topic-driven approach, which often makes recommendations based only on randomized controlled trials, is driven by epidemiologic purity and evidence rigor, with quality improvement a downstream consideration. The advantages of a stakeholder-driven versus a topic-driven approach are highlighted by comparisons of guidelines for otitis media with effusion, thyroid nodules, sepsis, and acute bacterial rhinosinusitis. These comparisons show that stakeholder-driven guidelines are more likely to address the quality improvement needs and pressing concerns of clinicians and patients, including understudied populations and patients with multiple chronic conditions. Conversely, a topic-driven approach often addresses "typical" patients, based on research that may not reflect the needs of high-risk groups excluded from studies because of ethical issues or a desire for purity of research design.

  5. Integrated Optoelectronic Networks for Application-Driven Multicore Computing

    DTIC Science & Technology

    2017-05-08

    hybrid photonic torus, the all-optical Corona crossbar, and the hybrid hierarchical Firefly crossbar. • The key challenges for waveguide photonics...improves SXR but with relatively higher EDP overhead. Our evaluation results indicate that the encoding schemes improve worst-case-SXR in Corona and...photonic crossbar architectures ( Corona and Firefly) indicate that our approach improves worst-case signal-to-noise ratio (SNR) by up to 51.7

  6. The Future of Low-Wage Jobs: Case Studies in the Retail Industry. IEE Working Paper No. 10.

    ERIC Educational Resources Information Center

    Bernhardt, Annette

    The future of low-wage jobs is examined through a case study of firm restructuring in the retail industry. The study confirms that the retailing sector has come to be dominated by the Wal-Mart model, which emphasizes an efficient technology-driven inventory management system and a human resource approach that includes the following elements:…

  7. Phenotype-driven molecular autopsy for sudden cardiac death.

    PubMed

    Cann, F; Corbett, M; O'Sullivan, D; Tennant, S; Hailey, H; Grieve, J H K; Broadhurst, P; Rankin, R; Dean, J C S

    2017-01-01

    A phenotype-driven approach to molecular autopsy based in a multidisciplinary team comprising clinical and laboratory genetics, forensic medicine and cardiology is described. Over a 13-year period, molecular autopsy was undertaken in 96 sudden cardiac death cases. A total of 46 cases aged 1-40 years had normal hearts and suspected arrhythmic death. Seven (15%) had likely pathogenic variants in ion channelopathy genes [KCNQ1 (1), KCNH2 (4), SCN5A (1), RyR2 (1)]. Fifty cases aged between 2 and 67 had a cardiomyopathy. Twenty-five had arrhythmogenic right ventricular cardiomyopathy (ARVC), 10 dilated cardiomyopathy (DCM) and 15 hypertrophic cardiomyopathy (HCM). Likely pathogenic variants were found in three ARVC cases (12%) in PKP2, DSC2 or DSP, two DCM cases (20%) in MYH7, and four HCM cases (27%) in MYBPC3 (3) or MYH7 (1). Uptake of cascade screening in relatives was higher when a molecular diagnosis was made at autopsy. In three families, variants previously published as pathogenic were detected, but clinical investigation revealed no abnormalities in carrier relatives. With a conservative approach to defining pathogenicity of sequence variants incorporating family phenotype information and population genomic data, a molecular diagnosis was made in 15% of sudden arrhythmic deaths and 18% of cardiomyopathy deaths. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  8. Novel Formulation of Adaptive MPC as EKF Using ANN Model: Multiproduct Semibatch Polymerization Reactor Case Study.

    PubMed

    Kamesh, Reddi; Rani, Kalipatnapu Yamuna

    2017-12-01

    In this paper, a novel formulation for nonlinear model predictive control (MPC) has been proposed incorporating the extended Kalman filter (EKF) control concept using a purely data-driven artificial neural network (ANN) model based on measurements for supervisory control. The proposed scheme consists of two modules focusing on online parameter estimation based on past measurements and control estimation over the control horizon based on minimizing the deviation of model output predictions from set points along the prediction horizon. An industrial case study for temperature control of a multiproduct semibatch polymerization reactor, posed as a challenge problem, has been considered as a test bed to apply the proposed ANN-EKFMPC strategy at the supervisory level in a cascade control configuration along with a proportional integral (PI) controller (ANN-EKFMPC-PI). The proposed approach is formulated incorporating all aspects of MPC, including a move suppression factor for control effort minimization and constraint-handling capability including terminal constraints. The nominal stability analysis and offset-free tracking capabilities of the proposed controller are proved. Its performance is evaluated by comparison with a standard MPC-based cascade control approach using the same adaptive ANN model. The ANN-EKFMPC-PI control configuration has shown better controller performance in terms of temperature tracking, smoother input profiles, and constraint-handling ability compared with the ANN-MPC with PI approach for two products in summer and winter. The proposed scheme is found to be versatile although it is based on a purely data-driven model with online parameter estimation.

  9. A primer on theory-driven web scraping: Automatic extraction of big data from the Internet for use in psychological research.

    PubMed

    Landers, Richard N; Brusso, Robert C; Cavanaugh, Katelyn J; Collmus, Andrew B

    2016-12-01

    The term big data encompasses a wide range of approaches of collecting and analyzing data in ways that were not possible before the era of modern personal computing. One approach to big data of great potential to psychologists is web scraping, which involves the automated collection of information from webpages. Although web scraping can create massive big datasets with tens of thousands of variables, it can also be used to create modestly sized, more manageable datasets with tens of variables but hundreds of thousands of cases, well within the skillset of most psychologists to analyze, in a matter of hours. In this article, we demystify web scraping methods as currently used to examine research questions of interest to psychologists. First, we introduce an approach called theory-driven web scraping in which the choice to use web-based big data must follow substantive theory. Second, we introduce data source theories, a term used to describe the assumptions a researcher must make about a prospective big data source in order to meaningfully scrape data from it. Critically, researchers must derive specific hypotheses to be tested based upon their data source theory, and if these hypotheses are not empirically supported, plans to use that data source should be changed or eliminated. Third, we provide a case study and sample code in Python demonstrating how web scraping can be conducted to collect big data along with links to a web tutorial designed for psychologists. Fourth, we describe a 4-step process to be followed in web scraping projects. Fifth and finally, we discuss legal, practical and ethical concerns faced when conducting web scraping projects. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
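The extraction step of web scraping can be illustrated with nothing more than the Python standard library. The class name, CSS class, and sample markup below are hypothetical; a real project would add an HTTP client, rate limiting, and robots.txt checks, and would typically use a parser such as BeautifulSoup:

```python
from html.parser import HTMLParser

class TitleScraper(HTMLParser):
    """Collects text inside elements whose class attribute contains
    'post-title' (a hypothetical class name for illustration)."""

    def __init__(self):
        super().__init__()
        self._in_title = False
        self.titles = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs.
        classes = dict(attrs).get("class", "")
        if "post-title" in classes.split():
            self._in_title = True

    def handle_endtag(self, tag):
        self._in_title = False

    def handle_data(self, data):
        if self._in_title and data.strip():
            self.titles.append(data.strip())

# Usage on a fetched page body (here an inline sample for illustration):
html = '<div class="post-title">First post</div><div class="post-title">Second</div>'
scraper = TitleScraper()
scraper.feed(html)
print(scraper.titles)  # ['First post', 'Second']
```

In the article's terms, the choice of which elements to extract would be dictated by the researcher's data source theory, not by what happens to be easy to scrape.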

  10. A comparison of Data Driven models of solving the task of gender identification of author in Russian language texts for cases without and with the gender deception

    NASA Astrophysics Data System (ADS)

    Sboev, A.; Moloshnikov, I.; Gudovskikh, D.; Rybka, R.

    2017-12-01

    In this work we compare several data-driven approaches to the task of author's gender identification for texts with or without gender imitation. The data corpus has been specially gathered with crowdsourcing for this task. The best models are a convolutional neural network with input of morphological data (F1-measure: 88% ± 3) for texts without imitation, and a gradient boosting model with a vector of character n-gram frequencies as input data (F1-measure: 64% ± 3) for texts with gender imitation. A method to filter the crowdsourced corpus using a limited reference sample of texts to increase the accuracy of the result is discussed.
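The character n-gram representation used as input to the gradient boosting model can be sketched as follows. This is a minimal illustration; the n value and normalization are assumptions, not the authors' exact pipeline:

```python
from collections import Counter

def char_ngram_vector(text, n=3, vocab=None):
    """Relative character n-gram frequencies for a text.

    If vocab is given, the vector is aligned to it (so vectors from
    different texts are comparable); otherwise the text's own sorted
    n-grams are used.
    """
    grams = Counter(text[i:i + n] for i in range(len(text) - n + 1))
    total = sum(grams.values())
    if vocab is None:
        vocab = sorted(grams)
    return [grams.get(g, 0) / total for g in vocab]

print(char_ngram_vector("abab"))  # ['aba', 'bab'] each once -> [0.5, 0.5]
```

In practice a shared vocabulary would be built from the training corpus, and the resulting vectors fed to the gradient boosting classifier.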

  11. Feedback-Driven Mode Rotation Control by Electro-Magnetic Torque

    NASA Astrophysics Data System (ADS)

    Okabayashi, M.; Strait, E. J.; Garofalo, A. M.; La Haye, R. J.; in, Y.; Hanson, J. M.; Shiraki, D.; Volpe, F.

    2013-10-01

    The recent experimental discovery of feedback-driven mode rotation control, supported by modeling, opens new approaches for avoidance of locked tearing modes that otherwise lead to disruptions. This approach is an application of electro-magnetic (EM) torque using 3D fields, routinely maximized through a simple feedback system. In DIII-D, it is observed that a feedback-applied radial field can be synchronized in phase with the poloidal field component of a large amplitude tearing mode, producing the maximum EM torque input. The mode frequency can be maintained in the 10 Hz to 100 Hz range in a well controlled manner, sustaining the discharges. Presently, the ITER internal coils designed for edge localized mode (ELM) control can only be varied at a few Hz, well below the inverse wall time constant. Hence, the ELM control system could in principle be used for this feedback-driven mode control in various ways. For instance, the locking of MHD modes can be avoided during the controlled shutdown of multiple hundreds of megajoules of EM stored energy in case of emergency. Feedback could also be useful to minimize mechanical resonances during disruption events by forcing the MHD frequency away from dangerous ranges. Work supported by the US DOE under DE-AC02-09CH11466, DE-FC-02-04ER54698, DE-FG02-08ER85195, and DE-FG02-04ER54761.

  12. Effects-Driven Participatory Design: Learning from Sampling Interruptions.

    PubMed

    Brandrup, Morten; Østergaard, Kija Lin; Hertzum, Morten; Karasti, Helena; Simonsen, Jesper

    2017-01-01

    Participatory design (PD) can play an important role in obtaining benefits from healthcare information technologies, but we contend that to fulfil this role PD must incorporate feedback from real use of the technologies. In this paper we describe an effects-driven PD approach that revolves around a sustained focus on pursued effects and uses the experience sampling method (ESM) to collect real-use feedback. To illustrate the use of the method we analyze a case that involves the organizational implementation of electronic whiteboards at a Danish hospital to support the clinicians' intra- and interdepartmental coordination. The hospital aimed to reduce the number of phone calls involved in coordinating work because many phone calls were seen as unnecessary interruptions. To learn about the interruptions we introduced an app for capturing quantitative data and qualitative feedback about the phone calls. The investigation showed that the electronic whiteboards had little potential for reducing the number of phone calls at the operating ward. The combination of quantitative data and qualitative feedback worked both as a basis for aligning assumptions to data and showed ESM as an instrument for triggering in-situ reflection. The participant-driven design and redesign of the way data were captured by means of ESM is a central contribution to the understanding of how to conduct effects-driven PD.

  13. Knee fusion--a new technique using an old Belgian surgical approach and a new intramedullary nail.

    PubMed

    Alt, V; Seligson, D

    2001-02-01

    Knee arthrodesis is a useful procedure in difficult cases such as failed total knee arthroplasty, severe articular trauma, bone tumors, and infected knee joints. The most common techniques for knee fusion include external fixation and intramedullary nailing. Küntscher's nail is driven antegrade from the intertrochanteric region into the knee. We describe a new technique for knee arthrodesis using a new intramedullary nail and an old Belgian surgical approach to the knee joint published by Lambotte in 1913. This approach provides excellent exposure for the implantation of the nail by osteotomizing the patella vertically. The nail is implanted using the Hey Groves method, whereby the nail is inserted retrograde into the femur and pulled distally anterograde into the tibia. We now use this technique as our standard procedure for knee fusion.

  14. More than Just Test Scores: Leading for Improvement with an Alternative Community-Driven Accountability Metric

    ERIC Educational Resources Information Center

    Spain, Angeline; McMahon, Kelly

    2016-01-01

    In this case, Sharon Rowley, a veteran principal, volunteers to participate in a new community-driven accountability initiative and encounters dilemmas about what it means to be a "data-driven" instructional leader. This case provides an opportunity for aspiring school leaders to explore and apply data-use theory to the work of leading…

  15. Learning Biological Networks via Bootstrapping with Optimized GO-based Gene Similarity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taylor, Ronald C.; Sanfilippo, Antonio P.; McDermott, Jason E.

    2010-08-02

    Microarray gene expression data provide a unique information resource for learning biological networks using "reverse engineering" methods. However, there are a variety of cases in which we know which genes are involved in a given pathology of interest, but we do not have enough experimental evidence to support the use of fully-supervised/reverse-engineering learning methods. In this paper, we explore a novel semi-supervised approach in which biological networks are learned from a reference list of genes and a partial set of links for these genes extracted automatically from PubMed abstracts, using a knowledge-driven bootstrapping algorithm. We show how new relevant links across genes can be iteratively derived using a gene similarity measure based on the Gene Ontology that is optimized on the input network at each iteration. We describe an application of this approach to the TGFB pathway as a case study and show how the ensuing results prove the feasibility of the approach as an alternate or complementary technique to fully supervised methods.

  16. How does achievement motivation influence mental effort mobilization? Physiological evidence of deteriorative effects of negative affects on the level of engagement.

    PubMed

    Capa, Rémi L; Audiffren, Michel

    2009-12-01

    We tested whether the effect of achievement motivation on effort is modulated by two possible factors of the motivational intensity theory (Wright and Kirby, 2001): perceived difficulty and maximally justified effort. Approach-driven (N=16) and avoidance-driven (N=16) participants were first instructed to perform a reaction time task to the best of their abilities. Next, the participants were instructed to consistently beat their performance standard established in the first condition. Approach-driven participants showed a stronger decrease of midfrequency band of heart rate variability, which was used as an index of mental effort, than avoidance-driven participants in the second instruction condition. Moreover, avoidance-driven participants showed a higher corrugator supercilii reactivity, which was used as an index of negative affects, than approach-driven participants in the second instruction condition. No difference of perceived difficulty between groups was observed. Results suggested that avoidance-driven participants developed negative affects in the second instruction condition decreasing the maximally justified effort and their level of engagement.

  17. On Mixed Data and Event Driven Design for Adaptive-Critic-Based Nonlinear $H_{\\infty}$ Control.

    PubMed

    Wang, Ding; Mu, Chaoxu; Liu, Derong; Ma, Hongwen

    2018-04-01

    In this paper, based on the adaptive critic learning technique, the control for a class of unknown nonlinear dynamic systems is investigated by adopting a mixed data and event driven design approach. The nonlinear control problem is formulated as a two-player zero-sum differential game and the adaptive critic method is employed to cope with the data-based optimization. The novelty lies in that the data driven learning identifier is combined with the event driven design formulation, in order to develop the adaptive critic controller, thereby accomplishing the nonlinear control. The event driven optimal control law and the time driven worst case disturbance law are approximated by constructing and tuning a critic neural network. Applying the event driven feedback control, the closed-loop system is built with stability analysis. Simulation studies are conducted to verify the theoretical results and illustrate the control performance. It is significant to observe that the present research provides a new avenue of integrating data-based control and event-triggering mechanism into establishing advanced adaptive critic systems.

  18. A Framework of Knowledge Integration and Discovery for Supporting Pharmacogenomics Target Prediction of Adverse Drug Events: A Case Study of Drug-Induced Long QT Syndrome.

    PubMed

    Jiang, Guoqian; Wang, Chen; Zhu, Qian; Chute, Christopher G

    2013-01-01

    Knowledge-driven text mining is becoming an important research area for identifying pharmacogenomics target genes. However, few such studies have focused on the pharmacogenomics targets of adverse drug events (ADEs). The objective of the present study is to build a framework of knowledge integration and discovery that aims to support pharmacogenomics target prediction of ADEs. We integrate a semantically annotated literature corpus, Semantic MEDLINE, with a semantically coded ADE knowledgebase known as ADEpedia using a semantic-web-based framework. We developed a knowledge discovery approach combining network analysis of a protein-protein interaction (PPI) network with a gene functional classification approach. We performed a case study of drug-induced long QT syndrome to demonstrate the usefulness of the framework in predicting potential pharmacogenomics targets of ADEs.

  19. Overcoming Barriers to Integrating Behavioral Health and Primary Care Services

    PubMed Central

    Grazier, Kyle L.; Smiley, Mary L.; Bondalapati, Kirsten S.

    2016-01-01

    Objective: Despite barriers, organizations with varying characteristics have achieved full integration of primary care services with providers and services that identify, treat, and manage those with mental health and substance use disorders. What are the key factors and common themes in stories of this success? Methods: A systematic literature review and snowball sampling technique was used to identify organizations. Site visits and key informant interviews were conducted with 6 organizations that had over time integrated behavioral health and primary care services. Case studies of each organization were independently coded to identify traits common to multiple organizations. Results: Common characteristics include prioritized vulnerable populations, extensive community collaboration, team approaches that included the patient and family, diversified funding streams, and data-driven approaches and practices. Conclusions: While significant barriers to integrating behavioral health and primary care services exist, case studies of organizations that have successfully overcome these barriers share certain common factors. PMID:27380923

  20. Data-driven risk identification in phase III clinical trials using central statistical monitoring.

    PubMed

    Timmermans, Catherine; Venet, David; Burzykowski, Tomasz

    2016-02-01

    Our interest lies in quality control for clinical trials, in the context of risk-based monitoring (RBM). We specifically study the use of central statistical monitoring (CSM) to support RBM. Under an RBM paradigm, we claim that CSM has a key role to play in identifying the "risks to the most critical data elements and processes" that will drive targeted oversight. In order to support this claim, we first see how to characterize the risks that may affect clinical trials. We then discuss how CSM can be understood as a tool for providing a set of data-driven key risk indicators (KRIs), which help to organize adaptive targeted monitoring. Several case studies are provided where issues in a clinical trial have been identified thanks to targeted investigation after the identification of a risk using CSM. Using CSM to build data-driven KRIs helps to identify different kinds of issues in clinical trials. This ability is directly linked with the exhaustiveness of the CSM approach and its flexibility in the definition of the risks that are searched for when identifying the KRIs. In practice, a CSM assessment of the clinical database seems essential to ensure data quality. The atypical data patterns found in some centers and variables are seen as KRIs under an RBM approach. Targeted monitoring or data management queries can be used to confirm whether the KRIs point to an actual issue or not.
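A toy version of the center-level screening behind CSM can be sketched as follows. The z-score statistic, threshold, and data are illustrative assumptions, not the actual CSM methodology, which applies a much larger battery of tests across all variables:

```python
from statistics import mean, stdev

def atypical_centers(values_by_center, z_threshold=2.0):
    """Flag centers whose mean of a variable deviates from the other
    centers' means by more than z_threshold standard deviations.

    A minimal data-driven KRI: each flag marks a center worth a
    targeted investigation, not a confirmed issue.
    """
    center_means = {c: mean(v) for c, v in values_by_center.items()}
    flags = {}
    for c, m in center_means.items():
        others = [x for k, x in center_means.items() if k != c]
        mu, sd = mean(others), stdev(others)
        flags[c] = abs(m - mu) / sd > z_threshold if sd > 0 else False
    return flags

# Hypothetical per-center measurements of one variable; center E is atypical.
flags = atypical_centers({
    "A": [1.0, 1.1, 0.9], "B": [1.0, 1.05], "C": [0.95, 1.0],
    "D": [1.02, 0.98], "E": [5.0, 5.1],
})
print(flags["E"], flags["A"])  # True False
```

As the abstract notes, a flagged center is only a KRI: targeted monitoring or data management queries are then used to confirm whether it points to an actual issue.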

  1. A defect-driven diagnostic method for machine tool spindles

    PubMed Central

    Vogl, Gregory W.; Donmez, M. Alkan

    2016-01-01

    Simple vibration-based metrics are, in many cases, insufficient to diagnose machine tool spindle condition. These metrics couple defect-based motion with spindle dynamics; diagnostics should be defect-driven. A new method and spindle condition estimation device (SCED) were developed to acquire data and to separate system dynamics from defect geometry. Based on this method, a spindle condition metric relying only on defect geometry is proposed. Application of the SCED on various milling and turning spindles shows that the new approach is robust for diagnosing the machine tool spindle condition. PMID:28065985

  2. LeadMine: a grammar and dictionary driven approach to entity recognition.

    PubMed

    Lowe, Daniel M; Sayle, Roger A

    2015-01-01

    Chemical entity recognition has traditionally been performed by machine learning approaches. Here we describe an approach using grammars and dictionaries. This approach has the advantage that the entities found can be directly related to a given grammar or dictionary, which allows the type of an entity to be known and, if an entity is misannotated, indicates which resource should be corrected. As recognition is driven by what is expected, if spelling errors occur, they can be corrected. Correcting such errors is highly useful when attempting to look up an entity in a database or, in the case of chemical names, converting them to structures. Our system uses a mixture of expertly curated grammars and dictionaries, as well as dictionaries automatically derived from public resources. We show that the heuristics developed to filter our dictionary of trivial chemical names (from PubChem) yield a better performing dictionary than the previously published Jochem dictionary. Our final system performs post-processing steps to modify the boundaries of entities and to detect abbreviations. These steps are shown to significantly improve performance (2.6% and 4.0% F1-score, respectively). Our complete system, with incremental post-BioCreative workshop improvements, achieves 89.9% precision and 85.4% recall (87.6% F1-score) on the CHEMDNER test set. Grammar and dictionary approaches can produce results at least as good as the current state of the art in machine learning approaches. While machine learning approaches are commonly thought of as "black box" systems, our approach directly links the output entities to the input dictionaries and grammars. Our approach also allows correction of errors in detected entities, which can assist with entity resolution.
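The dictionary side of such a recognizer can be sketched as greedy longest-match tagging. This is a toy stand-in for illustration only; the actual LeadMine system also uses grammars, spelling correction, and post-processing of entity boundaries:

```python
def dictionary_entities(text, dictionary):
    """Greedy longest-match dictionary tagging over whitespace tokens.

    Each detected entity maps directly back to a dictionary entry,
    which is the transparency advantage the abstract describes.
    """
    entities = []
    words = text.split()
    i = 0
    while i < len(words):
        match_end = None
        for j in range(len(words), i, -1):  # prefer the longest span
            candidate = " ".join(words[i:j])
            if candidate.lower() in dictionary:
                entities.append(candidate)
                match_end = j
                break
        i = match_end if match_end is not None else i + 1
    return entities

# Hypothetical two-entry dictionary of chemical names.
drugs = {"acetic acid", "ethanol"}
print(dictionary_entities("Dilute acetic acid and ethanol were used", drugs))
# ['acetic acid', 'ethanol']
```

Because every hit names its source entry, a misannotation immediately identifies which dictionary entry needs fixing, unlike a statistical tagger's opaque decision.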

  4. Effects of Corpus-Aided Language Learning in the EFL Grammar Classroom: A Case Study of Students' Learning Attitudes and Teachers' Perceptions in Taiwan

    ERIC Educational Resources Information Center

    Lin, Ming Huei

    2016-01-01

    This study employed a blended approach to form an extensive assessment of the pedagogical suitability of data-driven learning (DDL) in Taiwan's EFL grammar classrooms. On the one hand, the study quantitatively investigated the effects of DDL compared with that of a traditional deductive approach on the learning motivation and self-efficacy of…

  5. Applying cost accounting to operating room staffing in otolaryngology: time-driven activity-based costing and outpatient adenotonsillectomy.

    PubMed

    Balakrishnan, Karthik; Goico, Brian; Arjmand, Ellis M

    2015-04-01

    (1) To describe the application of a detailed cost-accounting method (time-driven activity-based costing) to operating room personnel costs, avoiding the proxy use of hospital and provider charges. (2) To model potential cost efficiencies using different staffing models with the case study of outpatient adenotonsillectomy. Prospective cost analysis case study. Tertiary pediatric hospital. All otolaryngology providers and otolaryngology operating room staff at our institution. Time-driven activity-based costing demonstrated precise per-case and per-minute calculation of personnel costs. We identified several areas of unused personnel capacity in a basic staffing model. Per-case personnel costs decreased by 23.2% by allowing a surgeon to run 2 operating rooms, despite doubling all other staff. Further cost reductions up to a total of 26.4% were predicted with additional staffing rearrangements. Time-driven activity-based costing allows detailed understanding of not only personnel costs but also how personnel time is used. This in turn allows testing of alternative staffing models to decrease unused personnel capacity and increase efficiency. © American Academy of Otolaryngology—Head and Neck Surgery Foundation 2015.
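    The arithmetic behind time-driven activity-based costing is simple: a case's personnel cost is the sum, over roles, of a per-minute capacity cost rate times the minutes that role spends on the case. A minimal sketch with entirely hypothetical rates and times (not the paper's figures):

```python
# Hypothetical per-minute personnel cost rates (dollars/minute).
RATE_PER_MIN = {"surgeon": 8.0, "anesthesiologist": 6.0, "nurse": 1.5, "tech": 1.0}

def case_cost(minutes_by_role):
    """TDABC: cost = sum over roles of (capacity cost rate x minutes used)."""
    return sum(RATE_PER_MIN[role] * mins for role, mins in minutes_by_role.items())

one_room = {"surgeon": 30, "anesthesiologist": 30, "nurse": 30, "tech": 30}
# Surgeon running two rooms: surgeon minutes per case halve; other staff are
# doubled, but so is case throughput, so their per-case minutes are unchanged.
two_rooms = {"surgeon": 15, "anesthesiologist": 30, "nurse": 30, "tech": 30}

print(case_cost(one_room))   # 495.0
print(case_cost(two_rooms))  # 375.0
```

    With these made-up numbers the two-room model cuts per-case cost by about 24%, in the same spirit (but not with the same data) as the 23.2% reduction reported in the abstract.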

  6. Microbial Community Metabolic Modeling: A Community Data-Driven Network Reconstruction: COMMUNITY DATA-DRIVEN METABOLIC NETWORK MODELING

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Henry, Christopher S.; Bernstein, Hans C.; Weisenhorn, Pamela

    Metabolic network modeling of microbial communities provides an in-depth understanding of community-wide metabolic and regulatory processes. Compared to single organism analyses, community metabolic network modeling is more complex because it needs to account for interspecies interactions. To date, most approaches focus on reconstruction of high-quality individual networks so that, when combined, they can predict community behaviors as a result of interspecies interactions. However, this conventional method becomes ineffective for communities whose members are not well characterized and cannot be experimentally interrogated in isolation. Here, we tested a new approach that uses community-level data as a critical input for the network reconstruction process. This method focuses on directly predicting interspecies metabolic interactions in a community, when axenic information is insufficient. We validated our method through the case study of a bacterial photoautotroph-heterotroph consortium that was used to provide data needed for a community-level metabolic network reconstruction. Resulting simulations provided experimentally validated predictions of how a photoautotrophic cyanobacterium supports the growth of an obligate heterotrophic species by providing organic carbon and nitrogen sources.

  7. A Case Study: Analyzing City Vitality with Four Pillars of Activity-Live, Work, Shop, and Play.

    PubMed

    Griffin, Matt; Nordstrom, Blake W; Scholes, Jon; Joncas, Kate; Gordon, Patrick; Krivenko, Elliott; Haynes, Winston; Higdon, Roger; Stewart, Elizabeth; Kolker, Natali; Montague, Elizabeth; Kolker, Eugene

    2016-03-01

    This case study evaluates and tracks vitality of a city (Seattle), based on a data-driven approach, using strategic, robust, and sustainable metrics. This case study was collaboratively conducted by the Downtown Seattle Association (DSA) and CDO Analytics teams. The DSA is a nonprofit organization focused on making the city of Seattle and its Downtown a healthy and vibrant place to Live, Work, Shop, and Play. DSA primarily operates through public policy advocacy, community and business development, and marketing. In 2010, the organization turned to CDO Analytics ( cdoanalytics.org ) to develop a process that can guide and strategically focus DSA efforts and resources for maximal benefit to the city of Seattle and its Downtown. CDO Analytics was asked to develop clear, easily understood, and robust metrics for a baseline evaluation of the health of the city, as well as for ongoing monitoring and comparisons of the vitality, sustainability, and growth. The DSA and CDO Analytics teams strategized on how to effectively assess and track the vitality of Seattle and its Downtown. The two teams filtered a variety of data sources, and evaluated the veracity of multiple diverse metrics. This iterative process resulted in the development of a small number of strategic, simple, reliable, and sustainable metrics across four pillars of activity: Live, Work, Shop, and Play. Data during the 5 years before 2010 were used for the development of the metrics and model and its training, and data during the 5 years from 2010 and on were used for testing and validation. This work enabled DSA to routinely track these strategic metrics, use them to monitor the vitality of Downtown Seattle, prioritize improvements, and identify new value-added programs. As a result, the four-pillar approach became an integral part of the data-driven decision-making and execution of the Seattle community's improvement activities. 
The approach described in this case study is actionable, robust, inexpensive, and easy to adopt and sustain. It can be applied to cities, districts, counties, regions, states, or countries, enabling cross-comparisons and improvements of vitality, sustainability, and growth.

  8. Mammographic phenotypes of breast cancer risk driven by breast anatomy

    NASA Astrophysics Data System (ADS)

    Gastounioti, Aimilia; Oustimov, Andrew; Hsieh, Meng-Kang; Pantalone, Lauren; Conant, Emily F.; Kontos, Despina

    2017-03-01

    Image-derived features of breast parenchymal texture patterns have emerged as promising risk factors for breast cancer, paving the way towards personalized recommendations regarding women's cancer risk evaluation and screening. The main steps to extract texture features of the breast parenchyma are the selection of regions of interest (ROIs) where texture analysis is performed, the texture feature calculation and the texture feature summarization in case of multiple ROIs. In this study, we incorporate breast anatomy in these three key steps by (a) introducing breast anatomical sampling for the definition of ROIs, (b) texture feature calculation aligned with the structure of the breast and (c) weighted texture feature summarization considering the spatial position and the underlying tissue composition of each ROI. We systematically optimize this novel framework for parenchymal tissue characterization in a case-control study with digital mammograms from 424 women. We also compare the proposed approach with a conventional methodology, not considering breast anatomy, recently shown to enhance the case-control discriminatory capacity of parenchymal texture analysis. The case-control classification performance is assessed using elastic-net regression with 5-fold cross validation, where the evaluation measure is the area under the curve (AUC) of the receiver operating characteristic. Upon optimization, the proposed breast-anatomy-driven approach demonstrated a promising case-control classification performance (AUC=0.87). In the same dataset, the performance of conventional texture characterization was found to be significantly lower (AUC=0.80, DeLong's test p-value<0.05). Our results suggest that breast anatomy may further leverage the associations of parenchymal texture features with breast cancer, and may therefore be a valuable addition in pipelines aiming to elucidate quantitative mammographic phenotypes of breast cancer risk.
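    The evaluation protocol described here (elastic-net regression with 5-fold cross-validation, scored by AUC) can be sketched with scikit-learn. The data below are random stand-ins for the study's 424-woman texture dataset, so the AUC hovers near chance rather than the reported 0.87; the penalty mix and regularization strength are arbitrary assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(424, 20))       # stand-in parenchymal texture features
y = rng.integers(0, 2, size=424)     # stand-in case/control labels

# Elastic-net-penalized logistic regression, evaluated by 5-fold CV AUC.
clf = LogisticRegression(penalty="elasticnet", solver="saga",
                         l1_ratio=0.5, C=1.0, max_iter=5000)
aucs = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
print(round(aucs.mean(), 3))         # chance-level (~0.5) on random stand-ins
```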

  9. Two-dimensional numerical study of two counter-propagating helium plasma jets in air at atmospheric pressure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yan, Wen; Sang, Chaofeng; Wang, Dezhen, E-mail: wangdez@dlut.edu.cn

    In this paper, a computational study of two counter-propagating helium plasma jets in ambient air is presented. A two-dimensional fluid model is applied to investigate the physical processes of the interaction of the two plasma jets (PJI) driven by equal and unequal voltages, respectively. In all studied cases, the PJI results in a decrease of the propagation velocity of both plasma bullets. When the two plasma jets are driven by equal voltages, they never merge but rather approach each other around the middle of the gas gap, reaching a minimum approach distance; this minimal distance decreases with increasing applied voltage and initial electron density, but increases with increasing relative permittivity. When the two plasma jets are driven by unequal voltages, we observe that they merge at a position away from the middle of the gas gap. The effect of the applied voltage difference on the PJI is also studied.

  10. A Hybrid Approach to Clinical Question Answering

    DTIC Science & Technology

    2014-11-01

    In our participation in TREC, we submitted a single run using a hybrid Natural Language Processing (NLP)-driven approach to accomplish the given task. Our system for the CDS track uses a variety of NLP-based techniques to address the clinical questions provided. We present a description of our approach, and discuss our experimental setup, results, and evaluation in the subsequent sections.

  11. Logistic-based patient grouping for multi-disciplinary treatment.

    PubMed

    Maruşter, Laura; Weijters, Ton; de Vries, Geerhard; van den Bosch, Antal; Daelemans, Walter

    2002-01-01

    Present-day healthcare witnesses a growing demand for coordination of patient care. Coordination is needed especially in those cases in which hospitals have structured healthcare into specialty-oriented units, while a substantial portion of patient care is not limited to single units. From a logistic point of view, this multi-disciplinary patient care creates a tension between controlling the hospital's units, and the need for a control of the patient flow between units. A possible solution is the creation of new units in which different specialties work together for specific groups of patients. A first step in this solution is to identify the salient patient groups in need of multi-disciplinary care. Grouping techniques seem to offer a solution. However, most grouping approaches in medicine are driven by a search for pathophysiological homogeneity. In this paper, we present an alternative logistic-driven grouping approach. The starting point of our approach is a database with medical cases for 3,603 patients with peripheral arterial vascular (PAV) diseases. For these medical cases, six basic logistic variables (such as the number of visits to different specialists) are selected. Using these logistic variables, clustering techniques are used to group the medical cases into logistically homogeneous groups. In our approach, the quality of the resulting grouping is not measured by statistical significance, but by (i) the usefulness of the grouping for the creation of new multi-disciplinary units; (ii) how well patients can be selected for treatment in the new units. Given a priori knowledge of a patient (e.g. age, diagnosis), machine learning techniques are employed to induce rules that can be used for the selection of the patients eligible for treatment in the new units. In the paper, we describe the results of the above-proposed methodology for patients with PAV diseases. Two groupings and the accompanying classification rule sets are presented. One grouping is based on all the logistic variables, and another grouping is based on two latent factors found by applying factor analysis. On the basis of the experimental results, we can conclude that it is possible to search for medically and logistically homogeneous groups (i) that can be characterized by rules based on the aggregated logistic variables; (ii) for which we can formulate rules to predict to which cluster new patients belong.
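    The two-step pipeline (cluster cases on logistic variables, then induce selection rules from a-priori patient attributes) might look like the following sketch. The data, cluster count, and tree depth are illustrative assumptions, not the paper's choices, and a decision tree stands in for its rule-induction technique.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(1)
# Stand-ins: six logistic variables (e.g. visit counts per specialty) per case.
logistic = rng.poisson(lam=3, size=(300, 6))
# A-priori patient attributes (e.g. age, diagnosis code) known before treatment.
apriori = rng.normal(size=(300, 4))

# Step 1: group cases into logistically homogeneous clusters.
clusters = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(logistic)

# Step 2: learn interpretable rules predicting cluster membership from
# a-priori knowledge only, so new patients can be assigned up front.
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(apriori, clusters)
print(tree.score(apriori, clusters))
```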

  12. Managing business compliance using model-driven security management

    NASA Astrophysics Data System (ADS)

    Lang, Ulrich; Schreiner, Rudolf

    Compliance with regulatory and governance standards is rapidly becoming one of the hot topics of information security today. This is because, especially with regulatory compliance, both business and government have to expect large financial and reputational losses if compliance cannot be ensured and demonstrated. One major difficulty of implementing such regulations is caused by the fact that they are captured at a high level of abstraction that is business-centric and not IT-centric. This means that the abstract intent needs to be translated in a trustworthy, traceable way into compliance and security policies that the IT security infrastructure can enforce. Carrying out this mapping process manually is time consuming, maintenance-intensive, costly, and error-prone. Compliance monitoring is also critical in order to be able to demonstrate compliance at any given point in time. The problem is further complicated because of the need for business-driven IT agility, where IT policies and enforcement can change frequently, e.g. Business Process Modelling (BPM)-driven Service Oriented Architecture (SOA). Model Driven Security (MDS) is an innovative technology approach that can solve these problems as an extension of identity and access management (IAM) and authorization management (also called entitlement management). In this paper we will illustrate the theory behind Model Driven Security for compliance, provide an improved and extended architecture, as well as a case study in the healthcare industry using our OpenPMF 2.0 technology.

  13. Flexible Programmes in Higher Professional Education: Expert Validation of a Flexible Educational Model

    ERIC Educational Resources Information Center

    Schellekens, Ad; Paas, Fred; Verbraeck, Alexander; van Merrienboer, Jeroen J. G.

    2010-01-01

    In a preceding case study, a process-focused demand-driven approach for organising flexible educational programmes in higher professional education (HPE) was developed. Operations management and instructional design contributed to designing a flexible educational model by means of discrete-event simulation. Educational experts validated the model…

  14. AnchorDock: Blind and Flexible Anchor-Driven Peptide Docking.

    PubMed

    Ben-Shimon, Avraham; Niv, Masha Y

    2015-05-05

    The huge conformational space stemming from the inherent flexibility of peptides is among the main obstacles to successful and efficient computational modeling of protein-peptide interactions. Current peptide docking methods typically overcome this challenge using prior knowledge from the structure of the complex. Here we introduce AnchorDock, a peptide docking approach, which automatically targets the docking search to the most relevant parts of the conformational space. This is done by precomputing the free peptide's structure and by computationally identifying anchoring spots on the protein surface. Next, a free peptide conformation undergoes anchor-driven simulated annealing molecular dynamics simulations around the predicted anchoring spots. In the challenging task of a completely blind docking test, AnchorDock produced exceptionally good results (backbone root-mean-square deviation ≤ 2.2 Å, rank ≤ 15) for 10 of 13 unbound cases tested. The impressive performance of AnchorDock supports a molecular recognition pathway that is driven via pre-existing local structural elements. Copyright © 2015 Elsevier Ltd. All rights reserved.
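    The backbone RMSD criterion used to score docking poses is conventionally computed after optimal rigid superposition. A minimal sketch of that standard calculation (the Kabsch algorithm, not AnchorDock's own code) with made-up coordinates:

```python
import numpy as np

def kabsch_rmsd(P, Q):
    """RMSD between two Nx3 coordinate sets after optimal rigid superposition."""
    P = P - P.mean(axis=0)                      # remove translation
    Q = Q - Q.mean(axis=0)
    V, S, Wt = np.linalg.svd(P.T @ Q)
    d = np.sign(np.linalg.det(V @ Wt))          # avoid improper reflections
    R = V @ np.diag([1.0, 1.0, d]) @ Wt         # rotation taking P onto Q
    return float(np.sqrt(np.mean(np.sum((P @ R - Q) ** 2, axis=1))))

# Sanity check: a rotated and translated copy of the same coordinates -> RMSD ~0.
rng = np.random.default_rng(0)
P = rng.normal(size=(8, 3))                     # stand-in backbone coordinates
c, s = np.cos(0.7), np.sin(0.7)
Rz = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
Q = P @ Rz + np.array([1.0, 2.0, 3.0])
print(round(kabsch_rmsd(P, Q), 6))              # 0.0
```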

  15. Confinement effects in premelting dynamics

    NASA Astrophysics Data System (ADS)

    Pramanik, Satyajit; Wettlaufer, John

    2017-11-01

    We examine the effects of confinement on the dynamics of premelted films driven by thermomolecular pressure gradients. Our approach is to modify a well-studied setting in which the thermomolecular pressure gradient is driven by a temperature gradient parallel to an interfacially premelted elastic wall. The modification treats the increase in viscosity associated with the thinning of films studied in a wide variety of materials using a power law and we examine the consequent evolution of the elastic wall. We treat (i) a range of interactions that are known to underlie interfacial premelting and (ii) a constant temperature gradient wherein the thermomolecular pressure gradient is a constant. The difference between the cases with and without the proximity effect arises in the volume flux of premelted liquid. The proximity effect increases the viscosity as the film thickness decreases thereby requiring the thermomolecular pressure driven flux to be accommodated at larger temperatures where the premelted film thickness is the largest. Implications for experiment and observations of frost heave are discussed.

  16. Confinement effects in premelting dynamics

    NASA Astrophysics Data System (ADS)

    Pramanik, Satyajit; Wettlaufer, John S.

    2017-11-01

    We examine the effects of confinement on the dynamics of premelted films driven by thermomolecular pressure gradients. Our approach is to modify a well-studied setting in which the thermomolecular pressure gradient is driven by a temperature gradient parallel to an interfacially premelted elastic wall. The modification treats the increase in viscosity associated with the thinning of films, studied in a wide variety of materials, using a power law and we examine the consequent evolution of the confining elastic wall. We treat (1) a range of interactions that are known to underlie interfacial premelting and (2) a constant temperature gradient wherein the thermomolecular pressure gradient is a constant. The difference between the cases with and without the proximity effect arises in the volume flux of premelted liquid. The proximity effect increases the viscosity as the film thickness decreases thereby requiring the thermomolecular pressure driven flux to be accommodated at higher temperatures where the premelted film thickness is the largest. Implications for experiment and observations of frost heave are discussed.

  17. Appraisal of UTIAS implosion-driven hypervelocity launchers and shock tubes.

    NASA Technical Reports Server (NTRS)

    Glass, I. I.

    1972-01-01

    A critical appraisal is made of the design, research, development, and operation of the novel UTIAS implosion-driven hypervelocity launchers and shock tubes. Explosively driven (PbN6-lead azide, PETN-pentaerythritetetranitrate) implosions in detonating stoichiometric hydrogen-oxygen mixtures have been successfully developed as drivers for hypervelocity launchers and shock tubes in a safe and reusable facility. Intense loadings at very high calculated pressures, densities, and temperatures, at the implosion center, cause severe problems with projectile integrity. Misalignment of the focal point can occur and add to the difficulty in using small caliber projectiles. In addition, the extreme driving conditions cause barrel expansion, erosion, and possible gas leakage from the base to the head of the projectile which cut the predicted muzzle velocities to half or a third of the lossless calculated values. However, in the case of a shock-tube operation these difficulties are minimized or eliminated and the possibilities of approaching Jovian reentry velocities are encouraging.

  18. Absorption/transmission measurements of PSAP particle-laden filters from the Biomass Burning Observation Project (BBOP) field campaign

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Presser, Cary; Nazarian, Ashot; Conny, Joseph M.

    Absorptivity measurements with a laser-heating approach, referred to as the laser-driven thermal reactor (LDTR), were carried out in the infrared and applied at ambient (laboratory) nonreacting conditions to particle-laden filters from a three-wavelength (visible) particle/soot absorption photometer (PSAP). Here, the particles were obtained during the Biomass Burning Observation Project (BBOP) field campaign. The focus of this study was to determine the particle absorption coefficient from field-campaign filter samples using the LDTR approach, and compare results with other commercially available instrumentation (in this case with the PSAP, which has been compared with numerous other optical techniques).

  19. Absorption/transmission measurements of PSAP particle-laden filters from the Biomass Burning Observation Project (BBOP) field campaign

    DOE PAGES

    Presser, Cary; Nazarian, Ashot; Conny, Joseph M.; ...

    2016-12-02

    Absorptivity measurements with a laser-heating approach, referred to as the laser-driven thermal reactor (LDTR), were carried out in the infrared and applied at ambient (laboratory) nonreacting conditions to particle-laden filters from a three-wavelength (visible) particle/soot absorption photometer (PSAP). Here, the particles were obtained during the Biomass Burning Observation Project (BBOP) field campaign. The focus of this study was to determine the particle absorption coefficient from field-campaign filter samples using the LDTR approach, and compare results with other commercially available instrumentation (in this case with the PSAP, which has been compared with numerous other optical techniques).

  20. A Null Space Control of Two Wheels Driven Mobile Manipulator Using Passivity Theory

    NASA Astrophysics Data System (ADS)

    Shibata, Tsuyoshi; Murakami, Toshiyuki

    This paper describes a control strategy for the null space motion of a two-wheel-driven mobile manipulator. Robots are now utilized in various industrial fields, and it is preferable for a robot manipulator to have multiple degrees of freedom of motion. Several studies of kinematics for null space motion have been proposed, but the stability analysis of null space motion remains insufficient. Furthermore, these approaches apply to stable systems; they do not apply to unstable systems. In this research, the base of the manipulator is a two-wheel-driven mobile robot, so the combined system, called a two-wheel-driven mobile manipulator, is unstable. In the proposed approach, the null space control design uses passivity-based stabilization: the controller is chosen so that the closed-loop robot dynamics satisfy passivity. The robot system is then stabilized through a work-space-observer-based approach combined with null space control that keeps the end-effector position. The validity of the proposed approach is verified by simulations and experiments on the two-wheel-driven mobile manipulator.
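    The null-space machinery underlying such redundancy control can be sketched with a pseudoinverse-based projector: joint velocities projected through N = I - J⁺J do not move the end-effector. The Jacobian and velocities below are made-up numbers, and the sketch omits the paper's passivity-based controller design entirely.

```python
import numpy as np

# Hypothetical Jacobian for a 2-D task with 3 joint DOF: one redundant DOF.
J = np.array([[1.0, 0.5, 0.2],
              [0.0, 1.0, 0.8]])

J_pinv = np.linalg.pinv(J)
N = np.eye(3) - J_pinv @ J          # null-space projector

xdot_des = np.array([0.1, 0.0])     # desired end-effector velocity
qdot0 = np.array([0.0, 0.0, 1.0])   # secondary motion (e.g. posture control)

# Joint command: task-space tracking plus null-space motion that does not
# disturb the end-effector.
qdot = J_pinv @ xdot_des + N @ qdot0

print(np.allclose(J @ (N @ qdot0), 0.0))  # True: projected motion is task-invisible
```

    Since J(I - J⁺J) = J - JJ⁺J = 0, the projected secondary motion is invisible in task space, which is what lets the null space be used for stabilization while holding the end-effector position.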

  1. Audiologist-driven versus patient-driven fine tuning of hearing instruments.

    PubMed

    Boymans, Monique; Dreschler, Wouter A

    2012-03-01

    Two methods of fine tuning the initial settings of hearing aids were compared: an audiologist-driven approach using real-ear measurements, and a patient-driven fine-tuning approach using feedback from real-life situations. The patient-driven fine tuning was conducted with the Amplifit(®) II system using audiovisual clips. The audiologist-driven fine tuning was based on the NAL-NL1 prescription rule. Both settings were compared using the same hearing aids in two 6-week trial periods following a randomized blinded cross-over design. After each trial period, the settings were evaluated by insertion-gain measurements. Performance was evaluated by speech tests in quiet, in noise, and in time-reversed speech, presented at 0° and with spatially separated sound sources. Subjective results were evaluated using extensive questionnaires and audiovisual video clips. A total of 73 participants were included. On average, higher gain values were found for the audiologist-driven settings than for the patient-driven settings, especially at 1000 and 2000 Hz. Better objective performance was obtained with the audiologist-driven settings for speech perception in quiet and in time-reversed speech. This was supported by better scores on a number of subjective judgments and in the subjective ratings of video clips. Loud sounds were rated as louder with the audiologist-driven settings than with the patient-driven settings, but the overall preference was in favor of the audiologist-driven settings for 67% of the participants.

  2. Using model based systems engineering for the development of the Large Synoptic Survey Telescope's operational plan

    NASA Astrophysics Data System (ADS)

    Selvy, Brian M.; Claver, Charles; Willman, Beth; Petravick, Don; Johnson, Margaret; Reil, Kevin; Marshall, Stuart; Thomas, Sandrine; Lotz, Paul; Schumacher, German; Lim, Kian-Tat; Jenness, Tim; Jacoby, Suzanne; Emmons, Ben; Axelrod, Tim

    2016-08-01

    We provide an overview of the Model Based Systems Engineering (MBSE) language, tool, and methodology being used in our development of the Operational Plan for Large Synoptic Survey Telescope (LSST) operations. LSST's Systems Engineering (SE) team is using a model-based approach to operational plan development to: 1) capture the top-down stakeholders' needs and functional allocations defining the scope, required tasks, and personnel needed for operations, and 2) capture the bottom-up operations and maintenance activities required to conduct the LSST survey across its distributed operations sites for the full ten-year survey duration. To accomplish these complementary goals and ensure self-consistent results, we have developed a holistic approach using the Sparx Enterprise Architect modeling tool and the Systems Modeling Language (SysML). This approach utilizes SysML Use Cases, Actors, associated relationships, and Activity Diagrams to document and refine all of the major operations and maintenance activities that will be required to successfully operate the observatory and meet stakeholder expectations. We have developed several customized extensions of the SysML language, including the creation of a custom stereotyped Use Case element with unique tagged values, as well as unique association connectors and Actor stereotypes. We demonstrate that this customized MBSE methodology enables us to define: 1) the roles each human Actor must take on to successfully carry out the activities associated with the Use Cases; 2) the skills each Actor must possess; 3) the functional allocation of all required stakeholder activities and Use Cases to organizational entities tasked with carrying them out; and 4) the organization structure required to successfully execute the operational survey. Our approach allows for continual refinement utilizing the systems engineering spiral method to expose finer levels of detail as necessary. For example, the bottom-up, Use Case-driven approach will be deployed in the future to develop the detailed work procedures required to successfully execute each operational activity.

  3. Airway-Specific Inducible Transgene Expression Using Aerosolized Doxycycline

    PubMed Central

    Tata, Purushothama Rao; Pardo-Saganta, Ana; Prabhu, Mythili; Vinarsky, Vladimir; Law, Brandon M.; Fontaine, Benjamin A.; Tager, Andrew M.

    2013-01-01

    Tissue-specific transgene expression using tetracycline (tet)-regulated promoter/operator elements has been used to revolutionize our understanding of cellular and molecular processes. However, because most tet-regulated mouse strains use promoters of genes expressed in multiple tissues, to achieve exclusive expression in an organ of interest is often impossible. Indeed, in the extreme case, unwanted transgene expression in other organ systems causes lethality and precludes the study of the transgene in the actual organ of interest. Here, we describe a novel approach to activating tet-inducible transgene expression solely in the airway by administering aerosolized doxycycline. By optimizing the dose and duration of aerosolized doxycycline exposure in mice possessing a ubiquitously expressed Rosa26 promoter–driven reverse tet-controlled transcriptional activator (rtTA) element, we induce transgene expression exclusively in the airways. We detect no changes in the cellular composition or proliferative behavior of airway cells. We used this newly developed method to achieve airway basal stem cell–specific transgene expression using a cytokeratin 5 (also known as keratin 5)–driven rtTA driver line to induce Notch pathway activation. We observed a more robust mucous metaplasia phenotype than in mice receiving doxycycline systemically. In addition, unwanted phenotypes outside of the lung that were evident when doxycycline was received systemically were now absent. Thus, our approach allows for rapid and efficient airway-specific transgene expression. After the careful strain by strain titration of the dose and timing of doxycycline inhalation, a suite of preexisting transgenic mice can now be used to study airway biology specifically in cases where transient transgene expression is sufficient to induce a phenotype. PMID:23848320

  4. Lifestyle Approaches for People With Intellectual Disabilities: A Systematic Multiple Case Analysis.

    PubMed

    Steenbergen, Henderika Annegien; Van der Schans, Cees P; Van Wijck, Ruud; De Jong, Johan; Waninge, Aly

    2017-11-01

    Health care organizations supporting individuals with intellectual disabilities (IDs) carry out a range of interventions to support and improve a healthy lifestyle. However, it is difficult to implement an active and healthy lifestyle into daily support. The presence of numerous intervention components, multiple levels of influence, and the explicit use of theory are factors that are considered to be essential for implementation in practice. A comprehensive written lifestyle policy provides for sustainability of a lifestyle approach. It is unknown to what extent these crucial factors for successful implementation are taken into consideration by health care organizations supporting this population. To analyze the intervention components, levels of influence, explicit use of theory, and conditions for sustainability of currently used lifestyle interventions within lifestyle approaches aiming at physical activity and nutrition in health care organizations supporting people with ID. In this descriptive multiple case study of 9 health care organizations, qualitative data of the lifestyle approaches with accompanying interventions and their components were compiled with a newly developed online inventory form. From 9 health care organizations, 59 interventions were included, of which 31% aimed to improve physical activity, 10% nutrition, and 59% a combination of both. Most (49%) interventions aimed at the educational component and less at daily (19%) and generic activities (16%) and the evaluation component (16%). Most interventions targeted individuals with ID and the professionals whereas social levels were underrepresented. Although 52% of the interventions were structurally embedded, only 10 of the 59 interventions were theory-driven. 
Health care organizations could improve their lifestyle approaches by using an explicit theoretical basis by expanding the current focus of the interventions that primarily concentrate on their clients and professionals toward also targeting the social and external environment as well as the introduction of a written lifestyle policy. This policy should encompass all interventions and should be the responsibility of those in the organization working with individuals with ID. In conclusion, comprehensive, integrated, and theory-driven approaches at multiple levels should be promoted. Copyright © 2017 AMDA – The Society for Post-Acute and Long-Term Care Medicine. Published by Elsevier Inc. All rights reserved.

  5. The TIPS Evaluation Project: A Theory-Driven Approach to Dissemination Research.

    ERIC Educational Resources Information Center

    Mulvey, Kevin P.; Hayashi, Susan W.; Hubbard, Susan M.; Kopstien, Andrea; Huang, Judy Y.

    2003-01-01

    Introduces the special section that focuses on four major studies under the treatment improvement protocols (TIPs) evaluation project. Provides an overview of each article, and addresses the value of using a theory-driven approach to dissemination research. (SLD)

  6. Dissipation-induced dipole blockade and antiblockade in driven Rydberg systems

    NASA Astrophysics Data System (ADS)

    Young, Jeremy T.; Boulier, Thomas; Magnan, Eric; Goldschmidt, Elizabeth A.; Wilson, Ryan M.; Rolston, Steven L.; Porto, James V.; Gorshkov, Alexey V.

    2018-02-01

    We study theoretically and experimentally the competing blockade and antiblockade effects induced by spontaneously generated contaminant Rydberg atoms in driven Rydberg systems. These contaminant atoms provide a source of strong dipole-dipole interactions and play a crucial role in the system's behavior. We study this problem theoretically using two different approaches. The first is a cumulant expansion approximation, in which we ignore third-order and higher connected correlations. Using this approach for the case of resonant drive, a many-body blockade radius picture arises, and we find qualitative agreement with previous experimental results. We further predict that as the atomic density is increased, the Rydberg population's dependence on Rabi frequency will transition from quadratic to linear dependence at lower Rabi frequencies. We study this behavior experimentally by observing this crossover at two different atomic densities. We confirm that the larger density system has a smaller crossover Rabi frequency than the smaller density system. The second theoretical approach is a set of phenomenological inhomogeneous rate equations. We compare the results of our rate-equation model to the experimental observations [E. A. Goldschmidt et al., Phys. Rev. Lett. 116, 113001 (2016), 10.1103/PhysRevLett.116.113001] and find that these rate equations provide quantitatively good scaling behavior of the steady-state Rydberg population for both resonant and off-resonant drives.

  7. Combining Theory-Driven Evaluation and Causal Loop Diagramming for Opening the 'Black Box' of an Intervention in the Health Sector: A Case of Performance-Based Financing in Western Uganda.

    PubMed

    Renmans, Dimitri; Holvoet, Nathalie; Criel, Bart

    2017-09-03

    Increased attention on "complexity" in health systems evaluation has resulted in many different methodological responses. Theory-driven evaluations and systems thinking are two such responses that aim for better understanding of the mechanisms underlying given outcomes. Here, we studied the implementation of a performance-based financing intervention by the Belgian Technical Cooperation in Western Uganda to illustrate a methodological strategy of combining these two approaches. We utilized a systems dynamics tool called causal loop diagramming (CLD) to generate hypotheses feeding into a theory-driven evaluation. Semi-structured interviews were conducted with 30 health workers from two districts (Kasese and Kyenjojo) and with 16 key informants. After CLD, we identified three relevant hypotheses: "success to the successful", "growth and underinvestment", and "supervision conundrum". The first hypothesis leads to increasing improvements in performance, as better performance leads to more incentives, which in turn leads to better performance. The latter two hypotheses point to potential bottlenecks. Thus, the proposed methodological strategy was a useful tool for identifying hypotheses that can inform a theory-driven evaluation. The hypotheses are represented in a comprehensible way while highlighting the underlying assumptions, and are more easily falsifiable than hypotheses identified without using CLD.

  8. Data-driven outbreak forecasting with a simple nonlinear growth model.

    PubMed

    Lega, Joceline; Brown, Heidi E

    2016-12-01

    Recent events have thrown the spotlight on infectious disease outbreak response. We developed a data-driven method, EpiGro, which can be applied to cumulative case reports to estimate the order of magnitude of the duration, peak and ultimate size of an ongoing outbreak. It is based on a surprisingly simple mathematical property of many epidemiological data sets, does not require knowledge or estimation of disease transmission parameters, is robust to noise and to small data sets, and runs quickly due to its mathematical simplicity. Using data from historic and ongoing epidemics, we present the model. We also provide modeling considerations that justify this approach and discuss its limitations. In the absence of other information or in conjunction with other models, EpiGro may be useful to public health responders. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
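    The abstract does not restate the mathematical property EpiGro exploits, so the sketch below substitutes a generic nonlinear growth model: a logistic curve fitted to cumulative case counts by a coarse grid search. The data, parameter ranges, and function names are hypothetical; the point is only to illustrate how an order-of-magnitude estimate of final outbreak size can be read off a fitted growth curve, without disease transmission parameters.

```python
import numpy as np

def logistic(t, K, r, t0):
    # Cumulative cases under logistic growth: C(t) = K / (1 + exp(-r (t - t0))).
    return K / (1.0 + np.exp(-r * (t - t0)))

def fit_logistic(t, cases):
    # Coarse grid search for the least-squares fit; a stand-in for a real optimizer.
    best, best_err = None, np.inf
    for K in np.linspace(cases[-1], 3.0 * cases[-1], 30):
        for r in np.linspace(0.05, 1.0, 30):
            for t0 in np.linspace(t[0], t[-1], 30):
                err = np.sum((logistic(t, K, r, t0) - cases) ** 2)
                if err < best_err:
                    best, best_err = (K, r, t0), err
    return best

# Hypothetical outbreak: 30 days of noisy cumulative counts from a known curve.
t = np.arange(30.0)
rng = np.random.default_rng(0)
cases = logistic(t, 1000.0, 0.3, 15.0) + rng.normal(0.0, 5.0, size=t.size)

K, r, t0 = fit_logistic(t, cases)
print(f"estimated final outbreak size: {K:.0f} cases")
```

    The fitted K recovers the true final size (1000) to within the grid resolution, which is the kind of order-of-magnitude estimate the abstract describes.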

  9. Bifurcation Theory of the Transition to Collisionless Ion-temperature-gradient-driven Plasma Turbulence

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kolesnikov, R.A.; Krommes, J.A.

    The collisionless limit of the transition to ion-temperature-gradient-driven plasma turbulence is considered with a dynamical-systems approach. The importance of systematic analysis for understanding the differences in the bifurcations and dynamics of linearly damped and undamped systems is emphasized. A model with ten degrees of freedom is studied as a concrete example. A four-dimensional center manifold (CM) is analyzed, and fixed points of its dynamics are identified and used to predict a "Dimits shift" of the threshold for turbulence due to the excitation of zonal flows. The exact value of that shift in terms of physical parameters is established for the model; the effects of higher-order truncations on the dynamics are noted. Multiple-scale analysis of the CM equations is used to discuss possible effects of modulational instability on scenarios for the transition to turbulence in both collisional and collisionless cases.

  10. A modelling approach to assessing the timescale uncertainties in proxy series with chronological errors

    NASA Astrophysics Data System (ADS)

    Divine, D. V.; Godtliebsen, F.; Rue, H.

    2012-01-01

    The paper proposes an approach to assessment of timescale errors in proxy-based series with chronological uncertainties. The method relies on approximation of the physical process(es) forming a proxy archive by a random Gamma process. Parameters of the process are partly data-driven and partly determined from prior assumptions. For a particular case of a linear accumulation model and absolutely dated tie points an analytical solution is found suggesting the Beta-distributed probability density on age estimates along the length of a proxy archive. In a general situation of uncertainties in the ages of the tie points the proposed method employs MCMC simulations of age-depth profiles yielding empirical confidence intervals on the constructed piecewise linear best guess timescale. It is suggested that the approach can be further extended to a more general case of a time-varying expected accumulation between the tie points. The approach is illustrated by using two ice and two lake/marine sediment cores representing the typical examples of paleoproxy archives with age models based on tie points of mixed origin.
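    As a rough illustration of the abstract's setup (not the authors' code), the sketch below simulates a linear-accumulation age-depth model between two absolutely dated tie points: layer accumulation times are drawn as independent Gamma increments, each simulated profile is rescaled to honour the tie points, and an empirical confidence interval on the age at a given depth is read off the ensemble. All numbers (layer count, Gamma shape, tie-point ages) are hypothetical; conditioning on the tie points is what yields the Beta-type age distribution the abstract mentions.

```python
import numpy as np

rng = np.random.default_rng(42)

# Two absolutely dated tie points: the top of the archive at age 0 yr and the
# bottom at age 1000 yr, with 100 layers of random accumulation between them.
n_layers, total_age = 100, 1000.0
shape = 5.0  # Gamma shape per layer; larger values mean more regular accumulation

# Monte Carlo age-depth profiles: Gamma increments rescaled to hit the tie points.
n_sims = 2000
ages = np.empty((n_sims, n_layers + 1))
for i in range(n_sims):
    inc = rng.gamma(shape, 1.0, size=n_layers)
    ages[i] = np.concatenate(([0.0], np.cumsum(inc))) * (total_age / inc.sum())

# Empirical 95% confidence interval on the age at mid-depth.
mid = n_layers // 2
lo, hi = np.percentile(ages[:, mid], [2.5, 97.5])
print(f"age at mid-depth: best guess 500 yr, 95% CI [{lo:.0f}, {hi:.0f}] yr")
```

    Uncertain tie-point ages, as in the paper's general case, could be added by also sampling the end ages inside the Monte Carlo loop.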

  11. A Qualitative Case Study of Teachers' Perceptions of Professional Learning through Mandated Collaboration

    ERIC Educational Resources Information Center

    Wilt, Barbara C.

    2016-01-01

    Teacher collaboration is a school improvement priority that has the potential to positively impact student learning by building the capacity of teachers. In some states, teacher collaboration is mandated by legislation. The literature indicates that policy-driven collaboration in a top-down approach results in unintentional consequences and…

  12. Research Knowledge Transfer through Business-Driven Student Assignment

    ERIC Educational Resources Information Center

    Sas, Corina

    2009-01-01

    Purpose: The purpose of this paper is to present a knowledge transfer method that capitalizes on both research and teaching dimensions of academic work. It also aims to propose a framework for evaluating the impact of such a method on the involved stakeholders. Design/methodology/approach: The case study outlines and evaluates the six-stage…

  13. The Relationship of Competition and Choice to Innovation in Education Markets: A Review of Research on Four Cases.

    ERIC Educational Resources Information Center

    Lubienski, Chris

    Concerned about the deadening effects of standardization imposed by monopolistic education bureaucracies, policymakers in many different countries endorse economic-style mechanisms of consumer choice and competition between autonomous providers as the key elements of "market-driven" education. The reasoning behind this approach is that market…

  14. Application Development Methodology Appropriateness: An Exploratory Case Study Bridging the Gap between Framework Characteristics and Selection

    ERIC Educational Resources Information Center

    Williams, Lawrence H., Jr.

    2013-01-01

    This qualitative study analyzed experiences of twenty software developers. The research showed that all software development methodologies are distinct from each other. While some, such as waterfall, focus on traditional, plan-driven approaches that allow software requirements and design to evolve; others facilitate ambiguity and uncertainty by…

  15. Using a multi-scale approach to identify and quantify oil and gas emissions: a case study for GHG emissions verification

    NASA Astrophysics Data System (ADS)

    Sweeney, C.; Kort, E. A.; Rella, C.; Conley, S. A.; Karion, A.; Lauvaux, T.; Frankenberg, C.

    2015-12-01

    Along with a boom in oil and natural gas production in the US, there has been a substantial effort to understand the true environmental impact of these operations on air and water quality, as well as net radiation balance. This multi-institution effort, funded by both governmental and non-governmental agencies, has provided a case study for identification and verification of emissions using a multi-scale, top-down approach. This approach leverages a combination of remote sensing to identify areas that need specific focus and airborne in-situ measurements to quantify both regional and large- to mid-size single-point emitters. Ground-based networks of mobile and stationary measurements provide the bottom tier of measurements from which process-level information can be gathered to better understand the specific sources and temporal distribution of the emitters. The motivation for this type of approach is largely driven by recent work in the Barnett Shale region in Texas as well as the San Juan Basin in New Mexico and Colorado; these studies suggest that relatively few single-point emitters dominate the regional emissions of CH4.

  16. Graph reconstruction using covariance-based methods.

    PubMed

    Sulaimanov, Nurgazy; Koeppl, Heinz

    2016-12-01

    Methods based on correlation and partial correlation are today employed in the reconstruction of a statistical interaction graph from high-throughput omics data. These dedicated methods work well even for the case when the number of variables exceeds the number of samples. In this study, we investigate how the graphs extracted from covariance and concentration matrix estimates are related by using Neumann series and transitive closure and through discussing concrete small examples. Considering the ideal case where the true graph is available, we also compare correlation and partial correlation methods for large realistic graphs. In particular, we perform the comparisons with optimally selected parameters based on the true underlying graph and with data-driven approaches where the parameters are directly estimated from the data.
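    A minimal sketch of the contrast the abstract studies, using hypothetical data from a known sparse Gaussian graphical model: thresholding the correlation matrix links variables that are connected only through an indirect path, while thresholding the partial correlations (the scaled inverse covariance) recovers only the direct interactions. The chain model, sample size, and threshold below are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# A known sparse Gaussian graphical model: a 4-node chain 0-1-2-3.
# Nonzero off-diagonals of the precision matrix are the direct interactions.
prec = np.array([[ 2.0, -0.9,  0.0,  0.0],
                 [-0.9,  2.0, -0.9,  0.0],
                 [ 0.0, -0.9,  2.0, -0.9],
                 [ 0.0,  0.0, -0.9,  2.0]])
X = rng.multivariate_normal(np.zeros(4), np.linalg.inv(prec), size=5000)

thresh = 0.2

# Correlation graph: threshold the absolute sample correlations.
corr = np.corrcoef(X, rowvar=False)
corr_graph = (np.abs(corr) > thresh) & ~np.eye(4, dtype=bool)

# Partial-correlation graph: threshold the scaled inverse sample covariance.
theta = np.linalg.inv(np.cov(X, rowvar=False))
d = np.sqrt(np.diag(theta))
pcorr = -theta / np.outer(d, d)
pcorr_graph = (np.abs(pcorr) > thresh) & ~np.eye(4, dtype=bool)

# Nodes 0 and 2 interact only indirectly: the correlation graph links them,
# the partial-correlation graph does not.
print(corr_graph[0, 2], pcorr_graph[0, 2])
```

    Here the true graph is known, so the two estimates can be checked against it directly, mirroring the "ideal case" comparison in the abstract.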

  17. Biological control of weeds: research by the United States Department of Agriculture-Agricultural Research Service: selected case studies.

    PubMed

    Quimby, Paul C; DeLoach, C Jack; Wineriter, Susan A; Goolsby, John A; Sobhian, Rouhollah; Boyette, C Douglas; Abbas, Hamed K

    2003-01-01

    Research by the USDA-Agricultural Research Service (ARS) on biological control of weeds has been practiced for many years because of its inherent ecological and economic advantages. Today, it is further driven by ARS adherence to Presidential Executive Order 13112 (3 February 1999) on invasive species and to USDA-ARS policy toward developing technology in support of sustainable agriculture with reduced dependence on non-renewable petrochemical resources. This paper reports examples or case studies selected to demonstrate the traditional or classical approach for biological control programs using Old World arthropods against Tamarix spp, Melaleuca quinquenervia (Cav) ST Blake and Galium spurium L/G aparine L, and the augmentative approach with a native plant pathogen against Pueraria lobata Ohwi = P montana. The examples illustrated various conflicts of interest with endangered species and ecological complexities of arthropods with associated microbes such as nematodes.

  18. Data-Driven Learning of Speech Acts Based on Corpora of DVD Subtitles

    ERIC Educational Resources Information Center

    Kitao, S. Kathleen; Kitao, Kenji

    2013-01-01

    Data-driven learning (DDL) is an inductive approach to language learning in which students study examples of authentic language and use them to find patterns of language use. This inductive approach to learning has the advantages of being learner-centered, encouraging hypothesis testing and learner autonomy, and helping develop learning skills.…

  19. Strategies for Early Outbreak Detection of Malaria in the Amhara Region of Ethiopia

    NASA Astrophysics Data System (ADS)

    Nekorchuk, D.; Gebrehiwot, T.; Mihretie, A.; Awoke, W.; Wimberly, M. C.

    2017-12-01

    Traditional epidemiological approaches to early detection of disease outbreaks are based on relatively straightforward thresholds (e.g. 75th percentile, standard deviations) estimated from historical case data. For diseases with strong seasonality, these can be modified to create separate thresholds for each seasonal time step. However, for disease processes that are non-stationary, more sophisticated techniques are needed to more accurately estimate outbreak threshold values. Early detection for geohealth-related diseases that also have environmental drivers, such as vector-borne diseases, may also benefit from the integration of time-lagged environmental data and disease ecology models into the threshold calculations. The Epidemic Prognosis Incorporating Disease and Environmental Monitoring for Integrated Assessment (EPIDEMIA) project has been integrating malaria case surveillance with remotely-sensed environmental data for early detection, warning, and forecasting of malaria epidemics in the Amhara region of Ethiopia, and has five years of weekly time series data from 47 woredas (districts). Efforts to reduce the burden of malaria in Ethiopia have been met with notable success in the past two decades, with major reductions in cases and deaths. However, malaria remains a significant public health threat: 60% of the population live in malarious areas, and because of the seasonal and unstable transmission patterns with cyclic outbreaks, protective immunity is generally low, which could cause high morbidity and mortality during epidemics. This study compared several approaches for defining outbreak thresholds and for identifying a potential outbreak based on deviations from these thresholds. We found that model-based approaches that accounted for climate-driven seasonality in malaria transmission were most effective, and that incorporating a trend component improved outbreak detection in areas with active malaria elimination efforts.
An advantage of these early detection techniques is that they can detect climate-driven outbreaks as well as outbreaks driven by social factors such as human migration.
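    The simple baseline the abstract starts from (a separate percentile threshold per seasonal time step) can be sketched as follows. The weekly case series, its seasonal shape, and the 75th-percentile cutoff are hypothetical stand-ins, not the EPIDEMIA project's actual algorithm, which the abstract says outperforms this rule.

```python
import numpy as np

rng = np.random.default_rng(7)

# Five years of weekly malaria case counts with a seasonal cycle (invented data).
weeks = np.arange(5 * 52)
seasonal_mean = 50 + 30 * np.sin(2 * np.pi * weeks / 52)
history = rng.poisson(seasonal_mean)

# One threshold per week-of-year: the 75th percentile of past counts for that week.
week_of_year = weeks % 52
thresholds = np.array([np.percentile(history[week_of_year == w], 75)
                       for w in range(52)])

# Flag a new observation as a potential outbreak if it exceeds its week's threshold.
def flag_outbreak(week, count):
    return count > thresholds[week % 52]

print(flag_outbreak(10, 200))  # a count far above the seasonal norm for week 10
```

    The model-based approaches the study favours replace the raw percentile with a fitted seasonal (and trend) model, but the flagging step is the same: compare an observation to its expected-range threshold.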

  20. Clinical reasoning and case-based decision making: the fundamental challenge to veterinary educators.

    PubMed

    May, Stephen A

    2013-01-01

    Confusion about the nature of human reasoning and its appropriate application to patients has hampered veterinary students' development of these skills. Expertise is associated with greater ability to deploy pattern recognition (type 1 reasoning), which is aided by progressive development of data-driven, forward reasoning (in contrast to scientific, backward reasoning), analytical approaches that lead to schema acquisition. The associative nature of type 1 reasoning makes it prone to bias, particularly in the face of "cognitive miserliness," when clues that indicate the need for triangulation with an analytical approach are ignored. However, combined reasoning approaches, from the earliest stages, are more successful than one approach alone, so it is important that those involved in curricular design and delivery promote student understanding of reasoning generally, and the situations in which reasoning goes awry, and develop students' ability to reason safely and accurately whether presented with a familiar case or with a case that they have never seen before.

  1. Data-driven Modelling for decision making under uncertainty

    NASA Astrophysics Data System (ADS)

    Angria S, Layla; Dwi Sari, Yunita; Zarlis, Muhammad; Tulus

    2018-01-01

    The rise of issues around uncertainty in decision making has become a lively topic in operations research. Many models have been presented, one of which is data-driven modelling (DDM). The purpose of this paper is to extract and recognize patterns in data, and to find the best model for decision-making problems under uncertainty, using a data-driven modelling approach with linear programming, linear and nonlinear differential equations, and Bayesian approaches. Model criteria are tested to determine the smallest error; the model with the smallest error is selected as the best model.
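    A toy version of the selection rule described above (fit several candidate models, keep the one with the smallest error) might look like this. The data-generating process, the polynomial candidates, and the held-out evaluation grid are all invented for illustration and are not the paper's models.

```python
import numpy as np

rng = np.random.default_rng(3)

# Noisy observations of a process that is (unknown to the modeller) quadratic.
x = np.linspace(0, 10, 50)
y = 0.5 * x**2 - 2 * x + 3 + rng.normal(0, 1, x.size)

# Candidate data-driven models: polynomial fits of increasing complexity.
candidates = {deg: np.polyfit(x, y, deg) for deg in (1, 2, 3)}

# Score each candidate on held-out points; the smallest error picks the model.
x_test = np.linspace(0.25, 9.75, 20)
y_test = 0.5 * x_test**2 - 2 * x_test + 3
errors = {deg: np.mean((np.polyval(c, x_test) - y_test) ** 2)
          for deg, c in candidates.items()}
best = min(errors, key=errors.get)
print("selected model: polynomial degree", best)
```

    The linear candidate cannot capture the curvature and is rejected; the rule keeps a model that can.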

  2. Water crises—a new approach

    NASA Astrophysics Data System (ADS)

    Bhattacharya, Atreyee

    2012-11-01

    Water crises are one of the biggest challenges facing humanity in the 21st century. But what exactly is the nature of these crises? Scientists investigated the underlying causes driving water scarcity in 22 of the best-studied cases across India, China, South America, Russia, and Australia using a quantitative technique that breaks down exhaustive case studies into measurable parameters. Srinivasan et al. show that in spite of the numerous ways in which humans interact with fresh water, each shortage or crisis is driven by one or more of eight underlying causes—which can be grouped into six “syndromes.” The authors found that just as declining natural supply can drive water shortages, so can human consumption or lack of proper policies.

  3. The Role of Community-Driven Data Curation for Enterprises

    NASA Astrophysics Data System (ADS)

    Curry, Edward; Freitas, Andre; O'Riáin, Sean

    With increased utilization of data within their operational and strategic processes, enterprises need to ensure data quality and accuracy. Data curation is a process that can ensure the quality of data and its fitness for use. Traditional approaches to curation are struggling with increased data volumes, and near real-time demands for curated data. In response, curation teams have turned to community crowd-sourcing and semi-automated metadata tools for assistance. This chapter provides an overview of data curation, discusses the business motivations for curating data and investigates the role of community-based data curation, focusing on internal communities and pre-competitive data collaborations. The chapter is supported by case studies from Wikipedia, The New York Times, Thomson Reuters, Protein Data Bank and ChemSpider upon which best practices for both social and technical aspects of community-driven data curation are described.

  4. SPH with dynamical smoothing length adjustment based on the local flow kinematics

    NASA Astrophysics Data System (ADS)

    Olejnik, Michał; Szewc, Kamil; Pozorski, Jacek

    2017-11-01

    Due to the Lagrangian nature of Smoothed Particle Hydrodynamics (SPH), the adaptive resolution remains a challenging task. In this work, we first analyse the influence of the simulation parameters and the smoothing length on solution accuracy, in particular in high strain regions. Based on this analysis we develop a novel approach to dynamically adjust the kernel range for each SPH particle separately, accounting for the local flow kinematics. We use the Okubo-Weiss parameter that distinguishes the strain and vorticity dominated regions in the flow domain. The proposed development is relatively simple and implies only a moderate computational overhead. We validate the modified SPH algorithm for a selection of two-dimensional test cases: the Taylor-Green flow, the vortex spin-down, the lid-driven cavity and the dam-break flow against a sharp-edged obstacle. The simulation results show good agreement with the reference data and improvement of the long-term accuracy for unsteady flows. For the lid-driven cavity case, the proposed dynamical adjustment remedies the problem of tensile instability (particle clustering).
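    The Okubo-Weiss parameter the authors use to separate strain- from vorticity-dominated regions can be computed directly from a velocity field. The finite-difference sketch below (the function name and test flows are ours, not the paper's SPH implementation) evaluates W = s_n^2 + s_s^2 - omega^2 on two analytic flows where the sign of W is known.

```python
import numpy as np

def okubo_weiss(u, v, dx, dy):
    # W = s_n^2 + s_s^2 - omega^2: positive where strain dominates,
    # negative where vorticity dominates.
    du_dy, du_dx = np.gradient(u, dy, dx)
    dv_dy, dv_dx = np.gradient(v, dy, dx)
    s_n = du_dx - dv_dy        # normal strain
    s_s = dv_dx + du_dy        # shear strain
    omega = dv_dx - du_dy      # vorticity
    return s_n**2 + s_s**2 - omega**2

n = 64
h = 2.0 / (n - 1)
y, x = np.mgrid[-1:1:complex(0, n), -1:1:complex(0, n)]

# Solid-body rotation u = -y, v = x: pure vorticity, so W = -4 everywhere.
w_rot = okubo_weiss(-y, x, h, h)

# Pure straining flow u = x, v = -y: no vorticity, so W = +4 everywhere.
w_str = okubo_weiss(x, -y, h, h)
print(w_rot.mean(), w_str.mean())
```

    In the SPH setting the same quantity would be evaluated per particle from kernel-estimated velocity gradients rather than on a grid; the sign of W then steers the smoothing-length adjustment.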

  5. Optimized operation of dielectric laser accelerators: Multibunch

    NASA Astrophysics Data System (ADS)

    Hanuka, Adi; Schächter, Levi

    2018-06-01

    We present a self-consistent analysis to determine the optimal charge, gradient, and efficiency for laser-driven accelerators operating with a train of microbunches. Specifically, we account for the beam loading reduction on the material occurring at the dielectric-vacuum interface. In the case of a train of microbunches, such a beam loading effect could be detrimental due to energy spread; however, this may be compensated by a tapered laser pulse. We ultimately propose an optimization procedure with an analytical solution for a group velocity equal to half the speed of light. This optimization results in a maximum efficiency 20% lower than in the single-bunch case, and a total accelerated charge of 10^6 electrons in the train. The approach holds promise for improving operations of dielectric laser accelerators and may have an impact on emerging laser accelerators driven by high-power optical lasers.

  6. Driven and decaying turbulence simulations of low-mass star formation: From clumps to cores to protostars

    DOE PAGES

    Offner, Stella S. R.; Klein, Richard I.; McKee, Christopher F.

    2008-10-20

    Molecular clouds are observed to be turbulent, but the origin of this turbulence is not well understood. As a result, there are two different approaches to simulating molecular clouds, one in which the turbulence is allowed to decay after it is initialized, and one in which it is driven. We use the adaptive mesh refinement (AMR) code, Orion, to perform high-resolution simulations of molecular cloud cores and protostars in environments with both driven and decaying turbulence. We include self-gravity, use a barotropic equation of state, and represent regions exceeding the maximum grid resolution with sink particles. We analyze the properties of bound cores such as size, shape, line width, and rotational energy, and we find reasonable agreement with observation. At high resolution the different rates of core accretion in the two cases have a significant effect on protostellar system development. Clumps forming in a decaying turbulence environment produce high-multiplicity protostellar systems with Toomre Q unstable disks that exhibit characteristics of the competitive accretion model for star formation. In contrast, cores forming in the context of continuously driven turbulence and virial equilibrium form smaller protostellar systems with fewer low-mass members. Furthermore, our simulations of driven and decaying turbulence show some statistically significant differences, particularly in the production of brown dwarfs and core rotation, but the uncertainties are large enough that we are not able to conclude whether observations favor one or the other.

  7. Process-Driven Culture Learning in American KFL Classroom Settings

    ERIC Educational Resources Information Center

    Byon, Andrew Sangpil

    2007-01-01

    Teaching second language (L2) culture can be either content- or process-driven. The content-driven approach refers to explicit instruction of L2 cultural information. On the other hand, the process-driven approach focuses on students' active participation in cultural learning processes. In this approach, teachers are not only information…

  8. Hypothesis-driven physical examination curriculum.

    PubMed

    Allen, Sharon; Olson, Andrew; Menk, Jeremiah; Nixon, James

    2017-12-01

    Medical students traditionally learn physical examination skills as a rote list of manoeuvres. Alternatives like hypothesis-driven physical examination (HDPE) may promote students' understanding of the contribution of the physical examination to diagnostic reasoning. We sought to determine whether first-year medical students can effectively learn to perform a physical examination using an HDPE approach, and then tailor the examination to specific clinical scenarios. First-year medical students at the University of Minnesota were taught both traditional and HDPE approaches during a required 17-week clinical skills course in their first semester. The end-of-course evaluation assessed HDPE skills: students were assigned one of two cardiopulmonary cases. Each case included two diagnostic hypotheses. During an interaction with a standardised patient, students were asked to select physical examination manoeuvres in order to make a final diagnosis. Items were weighted and selection order was recorded. First-year students with minimal pathophysiology training performed well. All students selected the correct diagnosis. Importantly, students varied the order when selecting examination manoeuvres depending on the diagnoses under consideration, demonstrating early clinical decision-making skills. An early introduction to HDPE may reinforce physical examination skills for hypothesis generation and testing, and can foster early clinical decision-making skills. This has important implications for further research in physical examination instruction. © 2016 John Wiley & Sons Ltd and The Association for the Study of Medical Education.

  9. 4D Interconnect Experimental Development

    DTIC Science & Technology

    1993-06-29

    Polymerization of MMA using benzoyl peroxide; polymerization using benzoin; adding the SHB molecules. Initiator breakdown can be driven thermally (as in the case of benzoyl peroxide) or via UV illumination (as in the case of benzoin). A commonly used alternative to thermally driven initiation is to use a photoinitiator.

  10. Building multi-country collaboration on watershed ...

    EPA Pesticide Factsheets

    Community-based watershed resilience programs that bridge public health and environmental outcomes often require cross-boundary, multi-country collaboration. The CRESSIDA project, led by the Regional Environmental Center for Central and Eastern Europe (REC) and supported by the US Environmental Protection Agency (EPA), advances a resilience-focused approach for Western Balkan communities in the Drini and Drina river watersheds with the goal of safeguarding public health and the environment. The initial phases of this project give a contextualized example of how to advance resilience-driven environmental health goals in Western Balkan communities, and experience within the region has identified several theme areas that require focus in order to promote a holistic watershed management program. In this paper, using CRESSIDA as a case study, we (1) show how watershed projects designed with resilience-driven environmental health goals can work in context, (2) provide data surrounding contextualized problems with resilience and suggest tools and strategies for the implementation of projects to address these problems, and (3) explore how cross-boundary foci are central to the success of these approaches in watersheds that comprise several countries. Published in the journal, Reviews on Environmental Health.

  11. Dynamics and inertia of a skyrmion in chiral magnets and interfaces: A linear response approach based on magnon excitations

    DOE PAGES

    Lin, Shi-Zeng

    2017-07-06

    We derive the skyrmion dynamics in response to a weak external drive, taking all the magnon modes into account. A skyrmion has rotational symmetry, and the magnon modes can be characterized by an angular momentum. For a weak distortion of a skyrmion, only the magnon modes with an angular momentum |m| = 1 govern the dynamics of the skyrmion topological center. We also determine that the skyrmion inertia comes by way of the magnon modes in the continuum spectrum. For a skyrmion driven by a magnetic field gradient or by a spin transfer torque generated by a current, the dynamical response is practically instantaneous. This justifies the rigid skyrmion approximation used in Thiele's collective coordinate approach. For a skyrmion driven by a spin Hall torque, the torque couples to the skyrmion motion through the magnons in the continuum and damping; therefore the skyrmion dynamics shows sizable inertia in this case. The trajectory of a skyrmion is an ellipse for an ac drive of spin Hall torque.

  12. Aspect-Oriented Model-Driven Software Product Line Engineering

    NASA Astrophysics Data System (ADS)

    Groher, Iris; Voelter, Markus

    Software product line engineering aims to reduce development time, effort, cost, and complexity by taking advantage of the commonality within a portfolio of similar products. The effectiveness of a software product line approach directly depends on how well feature variability within the portfolio is implemented and managed throughout the development lifecycle, from early analysis through maintenance and evolution. This article presents an approach that facilitates variability implementation, management, and tracing by integrating model-driven and aspect-oriented software development. Features are separated into models and composed using aspect-oriented composition techniques at the model level. Model transformations support the transition from problem-space to solution-space models. Aspect-oriented techniques enable the explicit expression and modularization of variability at the model, template, and code levels. The presented concepts are illustrated with a case study of a home automation system.

  13. Intrinsic Decomposition of The Stretch Tensor for Fibrous Media

    NASA Astrophysics Data System (ADS)

    Kellermann, David C.

    2010-05-01

    This paper presents a novel mechanism for the description of fibre reorientation based on the decomposition of the stretch tensor according to a given material's intrinsic constitutive properties. This approach avoids the necessity for fibre directors, structural tensors, or specialised models such as the ideal fibre-reinforced model, which are commonly applied to the analysis of fibre kinematics in the finite deformation of fibrous media for biomechanical problems. The proposed approach uses Intrinsic-Field Tensors (IFTs) that build upon the linear orthotropic theory presented in a previous paper entitled Strongly orthotropic continuum mechanics and finite element treatment. The intrinsic decomposition of the stretch tensor therein provides superior capacity to represent the intermediary kinematics driven by finite orthotropic ratios, where the benefits are predominantly expressed in cases of large deformation, as is typical in biomechanical studies. Satisfaction of requirements such as Material Frame-Indifference (MFI) and Euclidean objectivity is demonstrated here; these factors are necessary for the proposed IFTs to be valid tensorial quantities. The resultant tensors, initially for the simplest case of linear elasticity, are able to describe the same fibre reorientation as would the contemporary approaches such as the use of structural tensors and the like, while additionally being capable of showing results intermediary to classical isotropy and the infinitely orthotropic representations. This intermediary case is previously unreported.

  14. High Performance Visualization using Query-Driven Visualizationand Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bethel, E. Wes; Campbell, Scott; Dart, Eli

    2006-06-15

    Query-driven visualization and analytics is a unique approach for high-performance visualization that offers new capabilities for knowledge discovery and hypothesis testing. These new capabilities, akin to finding needles in haystacks, are the result of combining technologies from the fields of scientific visualization and scientific data management. This approach is crucial for rapid data analysis and visualization in the petascale regime. This article describes how query-driven visualization is applied to a hero-sized network traffic analysis problem.
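    The core idea described in this abstract, evaluating a compound range query in the data-management layer so that only matching records ever reach the visualization stage, can be sketched with a toy example (the flow records, thresholds, and port number below are invented for illustration, not the authors' data):

```python
import numpy as np

# toy connection log: columns are [duration_s, bytes_transferred, dst_port]
flows = np.array([
    [0.2,       900.0,  80],
    [120.0, 5_000_000,  22],
    [0.1,       400.0, 443],
    [300.0, 9_000_000,  22],
    [1.5,      2000.0,  80],
])

# compound range query: long-lived, high-volume traffic to port 22;
# only the matching "needles" are handed on for visualization
mask = (flows[:, 2] == 22) & (flows[:, 1] > 1e6) & (flows[:, 0] > 60)
needles = flows[mask]
```

    At scale, the filtering would be done by an indexed query engine rather than an in-memory mask, but the contract is the same: the visualization code sees only `needles`, never the haystack.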

  15. A Model-Driven Approach to e-Course Management

    ERIC Educational Resources Information Center

    Savic, Goran; Segedinac, Milan; Milenkovic, Dušica; Hrin, Tamara; Segedinac, Mirjana

    2018-01-01

    This paper presents research on using a model-driven approach to the development and management of electronic courses. We propose a course management system which stores a course model represented as distinct machine-readable components containing domain knowledge of different course aspects. Based on this formally defined platform-independent…

  16. Location-Driven Image Retrieval for Images Collected by a Mobile Robot

    NASA Astrophysics Data System (ADS)

    Tanaka, Kanji; Hirayama, Mitsuru; Okada, Nobuhiro; Kondo, Eiji

    Mobile robot teleoperation is a method for a human user to interact with a mobile robot over time and distance. Successful teleoperation depends on how well images taken by the mobile robot are visualized to the user. To enhance the efficiency and flexibility of the visualization, an image retrieval system over such a robot’s image database would be very useful. The main difference between the robot’s image database and standard image databases is that many relevant images exist for a given query, owing to the variety of viewing conditions. The main contribution of this paper is an efficient retrieval approach, named the location-driven approach, that exploits the correlation between visual features and the real-world locations of images. Combining the location-driven approach with the conventional feature-driven approach, our goal can be viewed as finding an optimal classifier between relevant and irrelevant feature-location pairs. An active learning technique based on support vector machines is extended for this purpose.

  17. Resistance noise spectroscopy across the thermally and electrically driven metal-insulator transitions in VO2 nanobeams

    NASA Astrophysics Data System (ADS)

    Alsaqqa, Ali; Kilcoyne, Colin; Singh, Sujay; Horrocks, Gregory; Marley, Peter; Banerjee, Sarbajit; Sambandamurthy, G.

    Vanadium dioxide (VO2) is a strongly correlated material that exhibits a sharp thermally driven metal-insulator transition at Tc ~ 340 K. The transition can also be triggered by a DC voltage in the insulating phase with a threshold (Vth) behavior. The mechanisms behind these transitions are actively debated, and resistance noise spectroscopy is a suitable tool to delineate different transport mechanisms in correlated systems. We present results from a systematic study of the low-frequency (1 mHz < f < 10 Hz) noise behavior in VO2 nanobeams across the thermally and electrically driven transitions. In the thermal transition, the power spectral density (PSD) of the resistance noise is unchanged as we approach Tc from 300 K; an abrupt drop in magnitude is seen above Tc, after which it remains unchanged up to 400 K. However, the noise behavior in the electrically driven case is distinctly different: as the voltage is ramped from zero, the PSD gradually increases by an order of magnitude before reaching Vth, and an abrupt increase is seen at Vth. The noise magnitude decreases above Vth, approaching the V = 0 value. The individual roles of percolation, Joule heating, and signatures of correlated behavior will be discussed. This work is supported by NSF DMR 0847324.
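    The power spectral density referred to in this abstract can be estimated, in the simplest case, with a one-sided periodogram; the following sketch uses an invented resistance trace (a single 2 Hz fluctuation on a DC level), not the authors' data or analysis pipeline:

```python
import numpy as np

def psd(x, fs):
    """One-sided periodogram estimate of the power spectral density."""
    x = np.asarray(x, float) - np.mean(x)
    n = len(x)
    spec = np.abs(np.fft.rfft(x)) ** 2 / (fs * n)
    spec[1:-1] *= 2   # fold negative frequencies into the one-sided estimate
    return np.fft.rfftfreq(n, d=1 / fs), spec

fs = 20.0                       # sampling rate, Hz
t = np.arange(0, 50, 1 / fs)    # 1000 samples
# synthetic "resistance" trace with a 2 Hz fluctuation riding on a DC level
r = 1.0 + 0.05 * np.sin(2 * np.pi * 2.0 * t)
f, s = psd(r, fs)
peak = f[np.argmax(s)]          # the estimate peaks at the 2 Hz component
```

    In practice, noise studies like this one average over many windows (Welch-style) to reduce the variance of the estimate; a raw periodogram is just the simplest starting point.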

  18. Models and Frameworks: A Synergistic Association for Developing Component-Based Applications

    PubMed Central

    Sánchez-Ledesma, Francisco; Sánchez, Pedro; Pastor, Juan A.; Álvarez, Bárbara

    2014-01-01

    The use of frameworks and components has been shown to be effective in improving software productivity and quality. However, the results in terms of reuse and standardization show a dearth of portability, either of designs or of component-based implementations. This paper, which is based on the model-driven software development paradigm, presents an approach that separates the description of component-based applications from their possible implementations for different platforms. This separation is supported by the automatic integration of the code obtained from the input models into frameworks implemented using object-oriented technology. Thus, the approach combines the benefits of modeling applications at a higher level of abstraction than objects with the higher levels of code reuse provided by frameworks. To illustrate the benefits of the proposed approach, two representative case studies, one using an existing framework and the other an ad hoc framework, are described. Finally, our approach is compared with other alternatives in terms of the cost of software development. PMID:25147858

  19. Application of a New Hybrid RANS/LES Modeling Paradigm to Compressible Flow

    NASA Astrophysics Data System (ADS)

    Oliver, Todd; Pederson, Clark; Haering, Sigfried; Moser, Robert

    2017-11-01

    It is well known that traditional hybrid RANS/LES modeling approaches suffer from a number of deficiencies. These deficiencies often stem from overly simplistic blending strategies based on scalar measures of turbulence length scale and grid resolution, and from the use of isotropic subgrid models in LES regions. A recently developed hybrid modeling approach has shown promise in overcoming these deficiencies in incompressible flows [Haering, 2015]. In this approach, RANS/LES blending is accomplished using a hybridization parameter that is governed by an additional model transport equation and is driven to achieve equilibrium between the resolved and unresolved turbulence for the given grid. Further, the model uses a tensor eddy viscosity that is formulated to represent the effects of anisotropic grid resolution on subgrid quantities. In this work, this modeling approach is extended to compressible flows and implemented in the compressible flow solver SU2 (http://su2.stanford.edu/). We discuss both modeling and implementation challenges and show preliminary results for compressible flow test cases with smooth-wall separation.

  20. Models and frameworks: a synergistic association for developing component-based applications.

    PubMed

    Alonso, Diego; Sánchez-Ledesma, Francisco; Sánchez, Pedro; Pastor, Juan A; Álvarez, Bárbara

    2014-01-01

    The use of frameworks and components has been shown to be effective in improving software productivity and quality. However, the results in terms of reuse and standardization show a dearth of portability, either of designs or of component-based implementations. This paper, which is based on the model-driven software development paradigm, presents an approach that separates the description of component-based applications from their possible implementations for different platforms. This separation is supported by the automatic integration of the code obtained from the input models into frameworks implemented using object-oriented technology. Thus, the approach combines the benefits of modeling applications at a higher level of abstraction than objects with the higher levels of code reuse provided by frameworks. To illustrate the benefits of the proposed approach, two representative case studies, one using an existing framework and the other an ad hoc framework, are described. Finally, our approach is compared with other alternatives in terms of the cost of software development.

  1. Applying Program Theory-Driven Approach to Design and Evaluate a Teacher Professional Development Program

    ERIC Educational Resources Information Center

    Lin, Su-ching; Wu, Ming-sui

    2016-01-01

    This study was the first year of a two-year project which applied a program theory-driven approach to evaluating the impact of teachers' professional development interventions on students' learning by using a mix of methods, qualitative inquiry, and quasi-experimental design. The current study was to show the results of using the method of…

  2. Extracting a respiratory signal from raw dynamic PET data that contain tracer kinetics.

    PubMed

    Schleyer, P J; Thielemans, K; Marsden, P K

    2014-08-07

    Data-driven gating (DDG) methods provide an alternative to hardware-based respiratory gating for PET imaging. Several existing DDG approaches obtain a respiratory signal by observing the change in PET counts within specific regions of acquired PET data. Currently, these methods do not allow for tracer kinetics, which can interfere with the respiratory signal and introduce error. In this work, we produced a DDG method for dynamic PET studies that exhibit tracer kinetics. Our method is based on an existing approach that uses frequency-domain analysis to locate regions within raw PET data that are subject to respiratory motion. In the new approach, an optimised non-stationary short-time Fourier transform was used to create a time-varying 4D map of motion-affected regions. Additional processing was required to ensure that the relationship between the sign of the respiratory signal and the physical direction of movement remained consistent for each temporal segment of the 4D map. The change in PET counts within the 4D map during the PET acquisition was then used to generate a respiratory curve. Using 26 min dynamic cardiac NH3 PET acquisitions which included a hardware-derived respiratory measurement, we show that tracer kinetics can severely degrade the respiratory signal generated by the original DDG method. In some cases, the transition of tracer from the liver to the lungs caused the respiratory signal to invert. The new approach successfully compensated for tracer kinetics and improved the correlation between the data-driven and hardware-based signals. On average, good correlation was maintained throughout the PET acquisitions.
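    The frequency-domain step this abstract builds on, scoring a region by how much of its count-rate variance falls in the respiratory frequency band, can be illustrated roughly as follows. The band limits, the linear detrending step (a crude stand-in for tracer kinetics), and both signals are illustrative assumptions, not the authors' exact method:

```python
import numpy as np

def respiratory_band_power(counts, dt, band=(0.1, 0.5)):
    """Fraction of (non-DC) spectral power of one region's PET-count time
    series that falls inside the assumed respiratory band, after removing
    a linear trend as a crude stand-in for slow tracer kinetics."""
    x = np.asarray(counts, float)
    idx = np.arange(len(x))
    x = x - np.polyval(np.polyfit(idx, x, 1), idx)  # detrend
    freqs = np.fft.rfftfreq(len(x), d=dt)
    power = np.abs(np.fft.rfft(x)) ** 2
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    return power[in_band].sum() / power[1:].sum()   # exclude the DC term

dt = 0.5                     # seconds per dynamic frame
t = np.arange(0, 120, dt)    # 2 min of frames
# region modulated by breathing (0.25 Hz) on top of a linear kinetic drift
breathing = 100 + 10 * np.sin(2 * np.pi * 0.25 * t) + 0.2 * t
# region with only a slow, kinetics-like variation (0.025 Hz)
static = 100 + 5 * np.sin(2 * np.pi * 0.025 * t)

band_frac_breathing = respiratory_band_power(breathing, dt)  # close to 1
band_frac_static = respiratory_band_power(static, dt)        # close to 0
```

    The paper's contribution is precisely that a single stationary transform like this is not enough when kinetics vary over the scan, hence its time-varying short-time Fourier analysis.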

  3. Scenario-based and scenario-neutral assessment of climate change impacts on operational performance of a multipurpose reservoir

    Treesearch

    Allison G. Danner; Mohammad Safeeq; Gordon E. Grant; Charlotte Wickham; Desirée Tullos; Mary V. Santelmann

    2017-01-01

    Scenario-based and scenario-neutral impacts assessment approaches provide complementary information about how climate change-driven effects on streamflow may change the operational performance of multipurpose dams. Examining a case study of Cougar Dam in Oregon, United States, we simulated current reservoir operations under scenarios of plausible future hydrology....

  4. Application of a Flexible, Clinically Driven Approach for Anger Reduction in the Case of Mr. P

    ERIC Educational Resources Information Center

    Kassinove, Howard; Tafrate, Raymond Chip

    2011-01-01

    We treat maladaptive anger in adults with a program based on traditional behavior therapy and cognitive behavior therapy. To these, we add client-centered motivational interviewing techniques. With the goal of modifying maladaptive stimulus-response relationships, our specific aim is to reduce anger reactivity to aversive triggers. Thus, in daily…

  5. Professional Learning Communities by Design: Putting the Learning Back into PLCs

    ERIC Educational Resources Information Center

    Easton, Lois Brown

    2011-01-01

    If you are looking for an organic approach to purpose-driven professional learning, this is the book for you. Award-winning educator Lois Brown Easton's latest work provides a compelling case study in narrative form, a chronological PLC planning outline, and first-hand "lessons learned" about how PLCs develop, mature, and sustain themselves. You…

  6. Strategic Planning in the Business Enterprise of Christian Colleges and Universities: A Multi-Case Study Approach

    ERIC Educational Resources Information Center

    Fletcher, Wayne Lewis

    2013-01-01

    Many tuition-driven private colleges and universities struggled for economic survival in the first decade of this millennium. The current study views higher education from a two-good framework that posits that every college and university provides teaching and service for the societal good while generating revenue from traditional business-like…

  7. Design-Driven Innovation as Seen in a Worldwide Values-Based Curriculum

    ERIC Educational Resources Information Center

    Hadlock, Camey Andersen; McDonald, Jason K.

    2014-01-01

    While instructional design's technological roots have given it many approaches for process and product improvement, in most cases designers still rely on instructional forms that do not allow them to develop instruction of a quality consistent with that expressed by the field's visionary leaders. As a result, often the teachers and students using…

  8. Wetting Behavior in Colloid-Polymer Mixtures at Different Substrates.

    PubMed

    Wijting, Willem K; Besseling, Nicolaas A M; Cohen Stuart, Martien A

    2003-09-25

    We present experimental observations on wetting phenomena in depletion-interaction-driven, phase-separated colloidal dispersions. The contact angle of the colloidal liquid-gas interface at a solid substrate was determined for a series of compositions. Upon approach to the critical point, a transition occurs from partial to complete wetting. The interaction with the substrate was manipulated by modifying the substrate with a polymer. In that case, a transition from partial to complete drying is observed upon approach to the critical point.

  9. A New Path-Constrained Rendezvous Planning Approach for Large-Scale Event-Driven Wireless Sensor Networks

    PubMed Central

    Zhang, Gongxuan; Wang, Yongli; Wang, Tianshu

    2018-01-01

    We study the problem of employing a mobile sink in large-scale Event-Driven Wireless Sensor Networks (EWSNs) for the purpose of harvesting data from sensor nodes. Generally, this employment mitigates the main weakness of WSNs, namely energy consumption in battery-driven sensor nodes. The main motivation of our work is to address challenges related to a network’s topology by adopting a mobile sink that moves along a predefined trajectory in the environment. Since, in this fashion, it is not possible to gather data from sensor nodes individually, we adopt the approach of designating some of the sensor nodes as Rendezvous Points (RPs) in the network. We argue that RP-planning in this case is a tradeoff between minimizing the number of RPs and decreasing the number of hops a sensor node needs for data transmission to its RP, which leads to minimizing average energy consumption in the network. We address the problem by formulating the challenges and expectations as a Mixed Integer Linear Program (MILP). Then, after proving the NP-hardness of the problem, we propose three effective and distributed heuristics for RP-planning, identifying sojourn locations, and constructing routing trees. Finally, experimental results prove the effectiveness of our approach. PMID:29734718
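    The RP-planning tradeoff described in this abstract, fewer RPs versus fewer hops per sensor node, can be made concrete with a deliberately tiny toy. The per-RP weight `alpha`, the Manhattan hop proxy, and the exhaustive search below are illustrative stand-ins for the paper's MILP formulation and distributed heuristics:

```python
import itertools

def plan_rps(nodes, trajectory_y, reach, alpha):
    """Exhaustively pick Rendezvous Points among nodes within `reach` of a
    horizontal mobile-sink trajectory, minimising
    alpha * (#RPs) + total hop distance from every node to its nearest RP."""
    candidates = [n for n in nodes if abs(n[1] - trajectory_y) <= reach]

    def hops(a, b):
        # Manhattan distance on a grid as a crude hop-count proxy
        return abs(a[0] - b[0]) + abs(a[1] - b[1])

    best = None
    for k in range(1, len(candidates) + 1):
        for rps in itertools.combinations(candidates, k):
            cost = alpha * k + sum(min(hops(n, r) for r in rps) for n in nodes)
            if best is None or cost < best[0]:
                best = (cost, rps)
    return best

# five sensor nodes; the sink travels along the line y = 0
nodes = [(0, 0), (2, 0), (4, 0), (0, 3), (4, 3)]
cost, rps = plan_rps(nodes, trajectory_y=0, reach=1, alpha=5)
# here two outer on-trajectory RPs beat any single central RP
```

    Raising `alpha` (the cost of maintaining an RP) pushes the optimum toward fewer RPs and more hops, which is exactly the tradeoff the abstract argues about; the exhaustive loop is exponential, which is why the paper resorts to heuristics after proving NP-hardness.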

  10. A New Path-Constrained Rendezvous Planning Approach for Large-Scale Event-Driven Wireless Sensor Networks.

    PubMed

    Vajdi, Ahmadreza; Zhang, Gongxuan; Zhou, Junlong; Wei, Tongquan; Wang, Yongli; Wang, Tianshu

    2018-05-04

    We study the problem of employing a mobile sink in large-scale Event-Driven Wireless Sensor Networks (EWSNs) for the purpose of harvesting data from sensor nodes. Generally, this employment mitigates the main weakness of WSNs, namely energy consumption in battery-driven sensor nodes. The main motivation of our work is to address challenges related to a network’s topology by adopting a mobile sink that moves along a predefined trajectory in the environment. Since, in this fashion, it is not possible to gather data from sensor nodes individually, we adopt the approach of designating some of the sensor nodes as Rendezvous Points (RPs) in the network. We argue that RP-planning in this case is a tradeoff between minimizing the number of RPs and decreasing the number of hops a sensor node needs for data transmission to its RP, which leads to minimizing average energy consumption in the network. We address the problem by formulating the challenges and expectations as a Mixed Integer Linear Program (MILP). Then, after proving the NP-hardness of the problem, we propose three effective and distributed heuristics for RP-planning, identifying sojourn locations, and constructing routing trees. Finally, experimental results prove the effectiveness of our approach.

  11. Increase of stagnation pressure and enthalpy in shock tunnels

    NASA Technical Reports Server (NTRS)

    Bogdanoff, David W.; Cambier, Jean-Luc

    1992-01-01

    High stagnation pressures and enthalpies are required for the testing of aerospace vehicles such as aerospace planes, aeroassist vehicles, and reentry vehicles. Among the most useful ground test facilities for performing such tests are shock tunnels. With a given driver gas condition, the enthalpy and pressure at the driven tube nozzle reservoir can be varied by changing the driven tube geometry and initial gas fill pressure. Reducing the driven tube diameter yields only very modest increases in reservoir pressure and enthalpy. Reducing the driven tube initial gas fill pressure can increase the reservoir enthalpy significantly, but at the cost of reduced reservoir pressure and useful test time. A new technique, the insertion of a converging section in the driven tube, is found to produce substantial increases in both reservoir pressure and enthalpy. Using a one-dimensional inviscid full-kinetics code, a number of different locations and shapes for the converging driven tube section were studied and the best cases found. For these best cases, for driven tube diameter reductions by factors of 2 and 3, the reservoir pressure can be increased by factors of 2.1 and 3.2, respectively, and the enthalpy can be increased by factors of 1.5 and 2.1, respectively.

  12. Proposed Requirements-driven User-scenario Development Protocol for the Belmont Forum E-Infrastructure and Data Management Cooperative Research Agreement

    NASA Astrophysics Data System (ADS)

    Wee, B.; Car, N.; Percivall, G.; Allen, D.; Fitch, P. G.; Baumann, P.; Waldmann, H. C.

    2014-12-01

    The Belmont Forum E-Infrastructure and Data Management Cooperative Research Agreement (CRA) is designed to foster a global community to collaborate on e-infrastructure challenges. One of the deliverables is an implementation plan to address global data infrastructure interoperability challenges and align existing domestic and international capabilities. Work package three (WP3) of the CRA focuses on the harmonization of global data infrastructure for sharing environmental data. One of the subtasks under WP3 is the development of user scenarios that guide the development of applicable deliverables. This paper describes the proposed protocol for user scenario development. It enables the solicitation of user scenarios from a broad constituency, and exposes the mechanisms by which those solicitations are evaluated against requirements that map to the Belmont Challenge. The underlying principle of traceability forms the basis for a structured, requirements-driven approach resulting in work products amenable to trade-off analyses and objective prioritization. The protocol adopts the ISO Reference Model for Open Distributed Processing (RM-ODP) as a top-level framework. User scenarios are developed within RM-ODP's "Enterprise Viewpoint". To harmonize with existing frameworks, the protocol utilizes the conceptual constructs of "scenarios", "use cases", "use case categories", and use case templates as adopted by recent GEOSS Architecture Implementation Project (AIP) deliverables and CSIRO's eReefs project. These constructs are encapsulated under the larger construct of "user scenarios". Once user scenarios are ranked by goodness-of-fit to the Belmont Challenge, secondary scoring metrics may be generated, such as goodness-of-fit to FutureEarth science themes. The protocol also facilitates an assessment of the ease of implementing a given user scenario using existing GEOSS AIP deliverables. In summary, the protocol results in a traceability graph that can be extended to coordinate across research programmes. If implemented using appropriate technologies and harmonized with existing ontologies, this approach enables queries, sensitivity analyses, and visualization of complex relationships.

  13. Algorithms for optimization of branching gravity-driven water networks

    NASA Astrophysics Data System (ADS)

    Dardani, Ian; Jones, Gerard F.

    2018-05-01

    The design of a water network involves the selection of pipe diameters that satisfy pressure and flow requirements while considering cost. A variety of design approaches can be used to optimize for hydraulic performance or reduce costs. To help designers select an appropriate approach in the context of gravity-driven water networks (GDWNs), this work assesses three cost-minimization algorithms on six moderate-scale GDWN test cases. Two algorithms, a backtracking algorithm and a genetic algorithm, use a set of discrete pipe diameters, while a new calculus-based algorithm produces a continuous-diameter solution which is mapped onto a discrete-diameter set. The backtracking algorithm finds the global optimum for all but the largest of cases tested, for which its long runtime makes it an infeasible option. The calculus-based algorithm's discrete-diameter solution produced slightly higher-cost results but was more scalable to larger network cases. Furthermore, the new calculus-based algorithm's continuous-diameter and mapped solutions provided lower and upper bounds, respectively, on the discrete-diameter global optimum cost, where the mapped solutions were typically within one diameter size of the global optimum. The genetic algorithm produced solutions even closer to the global optimum with consistently short run times, although slightly higher solution costs were seen for the larger network cases tested. The results of this study highlight the advantages and weaknesses of each GDWN design method including closeness to the global optimum, the ability to prune the solution space of infeasible and suboptimal candidates without missing the global optimum, and algorithm run time. We also extend an existing closed-form model of Jones (2011) to include minor losses and a more comprehensive two-part cost model, which realistically applies to pipe sizes that span a broad range typical of GDWNs of interest in this work, and for smooth and commercial steel roughness values.
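    A minimal sketch of the discrete-diameter backtracking idea discussed in this abstract, assuming a toy head-loss model of the form h = k·L·Q²/D⁵ and invented lengths, flows, diameters, and costs (not the paper's GDWN test cases or its Hazen-Williams-style formulation):

```python
def headloss(L, Q, D, k=1e-7):
    """Toy head-loss model: h = k * L * Q**2 / D**5 (Darcy-like scaling)."""
    return k * L * Q ** 2 / D ** 5

def optimize(links, paths, head_limit, diameters, costs):
    """Backtracking over discrete pipe diameters with cost-bound pruning:
    abandon a partial assignment as soon as it cannot beat the incumbent."""
    names = list(links)
    best = {"cost": float("inf"), "choice": None}

    def feasible(choice):
        return all(
            sum(headloss(links[n]["L"], links[n]["Q"], choice[n]) for n in path)
            <= head_limit
            for path in paths)

    def search(i, choice, cost):
        if cost >= best["cost"]:
            return  # prune: already at least as expensive as the incumbent
        if i == len(names):
            if feasible(choice):
                best["cost"], best["choice"] = cost, dict(choice)
            return
        n = names[i]
        for D, c in zip(diameters, costs):
            choice[n] = D
            search(i + 1, choice, cost + c * links[n]["L"])
        del choice[n]

    search(0, {}, 0.0)
    return best

# a trunk feeding two branch endpoints; head budget of 4.3 m per path
links = {"trunk": {"L": 100, "Q": 2.0},
         "branchA": {"L": 40, "Q": 1.0},
         "branchB": {"L": 40, "Q": 1.0}}
paths = [["trunk", "branchA"], ["trunk", "branchB"]]
result = optimize(links, paths, head_limit=4.3,
                  diameters=[0.10, 0.15, 0.20], costs=[10, 18, 30])
# in this toy, upgrading both branches is cheaper than upgrading the trunk
```

    Even this toy shows why the paper compares pruning-based search against genetic and continuous-relaxation methods: the search tree grows as (number of diameters)^(number of links), and the quality of the pruning bound decides how much of it must be visited.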

  14. Hierarchically-driven Approach for Quantifying Materials Uncertainty in Creep Deformation and Failure of Aerospace Materials

    DTIC Science & Technology

    2016-07-01

    characteristics and to examine the sensitivity of using such techniques for evaluating microstructure. In addition to the GUI tool, a manual describing its use has... Evaluating Local Primary Dendrite Arm Spacing Characterization Techniques Using Synthetic Directionally Solidified Dendritic Microstructures, Metallurgical and... driven approach for quantifying materials uncertainty in creep deformation and failure of aerospace materials, Multi-scale Structural Mechanics and

  15. How to develop a theory-driven evaluation design? Lessons learned from an adolescent sexual and reproductive health programme in West Africa.

    PubMed

    Van Belle, Sara B; Marchal, Bruno; Dubourg, Dominique; Kegels, Guy

    2010-11-30

    This paper presents the development of a study design built on the principles of theory-driven evaluation. The theory-driven evaluation approach was used to evaluate an adolescent sexual and reproductive health intervention in Mali, Burkina Faso and Cameroon aimed at improving continuity of care through the creation of networks of social and health care providers. Based on our experience and the existing literature, we developed a six-step framework for the design of theory-driven evaluations, which we applied in the ex-post evaluation of the networking component of the intervention. The protocol was drafted with the input of the intervention designer. The programme theory, the central element of theory-driven evaluation, was constructed on the basis of semi-structured interviews with designers, implementers and beneficiaries and an analysis of the intervention's logical framework. The six-step framework proved useful, as it allowed for a systematic development of the protocol. We describe the challenges at each step. We found that there is little practical guidance in the existing literature, as well as a mix-up of terminology across theory-driven evaluation approaches. There is a need for empirical methodological development in order to refine the tools to be used in theory-driven evaluation. We conclude that ex-post evaluations of programmes can be based on such an approach if the required information on context and mechanisms is collected during the programme.

  16. How to develop a theory-driven evaluation design? Lessons learned from an adolescent sexual and reproductive health programme in West Africa

    PubMed Central

    2010-01-01

    Background This paper presents the development of a study design built on the principles of theory-driven evaluation. The theory-driven evaluation approach was used to evaluate an adolescent sexual and reproductive health intervention in Mali, Burkina Faso and Cameroon aimed at improving continuity of care through the creation of networks of social and health care providers. Methods/design Based on our experience and the existing literature, we developed a six-step framework for the design of theory-driven evaluations, which we applied in the ex-post evaluation of the networking component of the intervention. The protocol was drafted with the input of the intervention designer. The programme theory, the central element of theory-driven evaluation, was constructed on the basis of semi-structured interviews with designers, implementers and beneficiaries and an analysis of the intervention's logical framework. Discussion The six-step framework proved useful, as it allowed for a systematic development of the protocol. We describe the challenges at each step. We found that there is little practical guidance in the existing literature, as well as a mix-up of terminology across theory-driven evaluation approaches. There is a need for empirical methodological development in order to refine the tools to be used in theory-driven evaluation. We conclude that ex-post evaluations of programmes can be based on such an approach if the required information on context and mechanisms is collected during the programme. PMID:21118510

  17. Illustrative case using the RISK21 roadmap and matrix: prioritization for evaluation of chemicals found in drinking water

    PubMed Central

    Wolf, Douglas C.; Bachman, Ammie; Barrett, Gordon; Bellin, Cheryl; Goodman, Jay I.; Jensen, Elke; Moretto, Angelo; McMullin, Tami; Pastoor, Timothy P.; Schoeny, Rita; Slezak, Brian; Wend, Korinna; Embry, Michelle R.

    2016-01-01

    The HESI-led RISK21 effort has developed a framework supporting the use of twenty-first century technology in obtaining and using information for chemical risk assessment. This framework represents a problem formulation-based, exposure-driven, tiered data acquisition approach that leads to an informed decision on human health safety to be made when sufficient evidence is available. It provides a transparent and consistent approach to evaluate information in order to maximize the ability of assessments to inform decisions and to optimize the use of resources. To demonstrate the application of the framework’s roadmap and matrix, this case study evaluates a large number of chemicals that could be present in drinking water. The focus is to prioritize which of these should be considered for human health risk as individual contaminants. The example evaluates 20 potential drinking water contaminants, using the tiered RISK21 approach in combination with graphical representation of information at each step, using the RISK21 matrix. Utilizing the framework, 11 of the 20 chemicals were assigned low priority based on available exposure data alone, which demonstrated that exposure was extremely low. The remaining nine chemicals were further evaluated, using refined estimates of toxicity based on readily available data, with three deemed high priority for further evaluation. In the present case study, it was determined that the greatest value of additional information would be from improved exposure models and not from additional hazard characterization. PMID:26451723

  18. Illustrative case using the RISK21 roadmap and matrix: prioritization for evaluation of chemicals found in drinking water.

    PubMed

    Wolf, Douglas C; Bachman, Ammie; Barrett, Gordon; Bellin, Cheryl; Goodman, Jay I; Jensen, Elke; Moretto, Angelo; McMullin, Tami; Pastoor, Timothy P; Schoeny, Rita; Slezak, Brian; Wend, Korinna; Embry, Michelle R

    2016-01-01

    The HESI-led RISK21 effort has developed a framework supporting the use of twenty-first century technology in obtaining and using information for chemical risk assessment. This framework represents a problem formulation-based, exposure-driven, tiered data acquisition approach that leads to an informed decision on human health safety to be made when sufficient evidence is available. It provides a transparent and consistent approach to evaluate information in order to maximize the ability of assessments to inform decisions and to optimize the use of resources. To demonstrate the application of the framework's roadmap and matrix, this case study evaluates a large number of chemicals that could be present in drinking water. The focus is to prioritize which of these should be considered for human health risk as individual contaminants. The example evaluates 20 potential drinking water contaminants, using the tiered RISK21 approach in combination with graphical representation of information at each step, using the RISK21 matrix. Utilizing the framework, 11 of the 20 chemicals were assigned low priority based on available exposure data alone, which demonstrated that exposure was extremely low. The remaining nine chemicals were further evaluated, using refined estimates of toxicity based on readily available data, with three deemed high priority for further evaluation. In the present case study, it was determined that the greatest value of additional information would be from improved exposure models and not from additional hazard characterization.

  19. Observational study using the tools of lean six sigma to improve the efficiency of the resident rounding process.

    PubMed

    Chand, David V

    2011-06-01

    Recent focus on resident work hours has challenged residency programs to modify their curricula to meet established duty hour restrictions and fulfill their mission to develop the next generation of clinicians. Simultaneously, health care systems strive to deliver efficient, high-quality care to patients and families. The primary goal of this observational study was to use a data-driven approach to eliminate examples of waste and variation identified in resident rounding using Lean Six Sigma methodology. A secondary goal was to improve the efficiency of the rounding process, as measured by the reduction in nonvalue-added time. We used the "DMAIC" methodology: define, measure, analyze, improve, and control. Pediatric and family medicine residents rotating on the pediatric hospitalist team participated in the observation phase. Residents, nurses, hospitalists, and parents of patients completed surveys to gauge their attitudes toward rounds. The Mann-Whitney test was used to test for differences in the median times measured during the preimprovement and postimprovement phases, and the Student t test was used for comparison of survey data. Collaborative, family-centered rounding with elimination of the "prerounding" process, as well as standard work instructions and pacing the process to meet customer demand (takt time), were implemented. Nonvalue-added time per patient was reduced by 64% (P  =  .005). Survey data suggested that team members preferred the collaborative, family-centered approach to the traditional model of rounding. Lean Six Sigma provides tools, a philosophy, and a structured, data-driven approach to address a problem. In our case this facilitated an effort to adhere to duty hour restrictions while promoting education and quality care. Such approaches will become increasingly useful as health care delivery and education continue to transform.
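    The Mann-Whitney comparison of pre- and post-improvement times mentioned in this abstract can be reproduced in miniature. The statistic and its large-sample normal approximation are standard; the timing data below are invented for illustration, not the study's measurements:

```python
import math

def mann_whitney_u(pre, post):
    """Mann-Whitney U statistic for the `pre` sample (midranks for ties),
    with a two-sided p-value from the large-sample normal approximation
    (no tie correction in the variance)."""
    combined = sorted((v, g) for g, vals in ((0, pre), (1, post)) for v in vals)
    values = [v for v, _ in combined]
    n = len(values)
    rank_sum_pre = 0.0
    i = 0
    while i < n:                   # walk over blocks of tied values
        j = i
        while j < n and values[j] == values[i]:
            j += 1
        midrank = (i + 1 + j) / 2  # average of ranks i+1 .. j
        rank_sum_pre += sum(midrank for k in range(i, j) if combined[k][1] == 0)
        i = j
    n1, n2 = len(pre), len(post)
    u = rank_sum_pre - n1 * (n1 + 1) / 2
    mu = n1 * n2 / 2
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (u - mu) / sigma
    p = math.erfc(abs(z) / math.sqrt(2))  # two-sided
    return u, p

pre = [12, 15, 11, 14, 13, 16]    # nonvalue-added minutes per patient, before
post = [5, 4, 6, 3, 7, 5]         # after the redesigned rounding process
u, p = mann_whitney_u(pre, post)  # every post time beats every pre time
```

    With samples this small, an exact test (as in `scipy.stats.mannwhitneyu`) is preferable to the normal approximation; the sketch only shows why the rank-based statistic, unlike a t test, needs no assumption about the shape of the timing distributions.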

  20. Observational Study Using the Tools of Lean Six Sigma to Improve the Efficiency of the Resident Rounding Process

    PubMed Central

    Chand, David V.

    2011-01-01

    Background Recent focus on resident work hours has challenged residency programs to modify their curricula to meet established duty hour restrictions and fulfill their mission to develop the next generation of clinicians. Simultaneously, health care systems strive to deliver efficient, high-quality care to patients and families. The primary goal of this observational study was to use a data-driven approach to eliminate examples of waste and variation identified in resident rounding using Lean Six Sigma methodology. A secondary goal was to improve the efficiency of the rounding process, as measured by the reduction in nonvalue-added time. Methods We used the “DMAIC” methodology: define, measure, analyze, improve, and control. Pediatric and family medicine residents rotating on the pediatric hospitalist team participated in the observation phase. Residents, nurses, hospitalists, and parents of patients completed surveys to gauge their attitudes toward rounds. The Mann-Whitney test was used to test for differences in the median times measured during the preimprovement and postimprovement phases, and the Student t test was used for comparison of survey data. Results and Discussion Collaborative, family-centered rounding with elimination of the “prerounding” process, as well as standard work instructions and pacing the process to meet customer demand (takt time), were implemented. Nonvalue-added time per patient was reduced by 64% (P  =  .005). Survey data suggested that team members preferred the collaborative, family-centered approach to the traditional model of rounding. Conclusions Lean Six Sigma provides tools, a philosophy, and a structured, data-driven approach to address a problem. In our case this facilitated an effort to adhere to duty hour restrictions while promoting education and quality care. Such approaches will become increasingly useful as health care delivery and education continue to transform. PMID:22655134

  1. Incorporating Semantics into Data Driven Workflows for Content Based Analysis

    NASA Astrophysics Data System (ADS)

    Argüello, M.; Fernandez-Prieto, M. J.

    Finding meaningful associations between text elements and knowledge structures within clinical narratives in a highly verbal domain, such as psychiatry, is a challenging goal. The research presented here uses a small corpus of case histories and brings into play pre-existing knowledge, and therefore complements other approaches that use large corpora (millions of words) and no pre-existing knowledge. The paper describes a variety of experiments for content-based analysis: Linguistic Analysis using NLP-oriented approaches, Sentiment Analysis, and Semantically Meaningful Analysis. Although it is not standard practice, the paper advocates providing automatic support to annotate the functionality as well as the data for each experiment by performing semantic annotation that uses OWL and OWL-S. Lessons learnt can be transmitted to legacy clinical databases facing the conversion of clinical narratives according to prominent Electronic Health Records standards.

  2. Using Telephone Conversations to Develop Awareness of Pragmatic Skills: An Activity-Theory-Driven Approach

    ERIC Educational Resources Information Center

    Xia, Saihua

    2009-01-01

    This paper investigates ESL learners' awareness of pragmatic skills utilizing an activity-theory driven approach to perform an inquiry task into problem-solving service call conversations (PSSCs) between native speakers (NS) and non-native speakers of English (NNSs). Eight high-intermediate ESL learners, from five different language backgrounds,…

  3. Teachers Develop CLIL Materials in Argentina: A Workshop Experience

    ERIC Educational Resources Information Center

    Banegas, Darío Luis

    2016-01-01

    Content and language integrated learning (CLIL) is a Europe-born approach. Nevertheless, CLIL as a language learning approach has been implemented in Latin America in different ways and models: content-driven models and language-driven models. As regards the latter, new school curricula demand that CLIL be used in secondary education in Argentina…

  4. Designing an International Business Curriculum: A Market-Driven Approach.

    ERIC Educational Resources Information Center

    Javalgi, Rajshekhar G.; Vogelsang-Coombs, Vera; Lawson, Diana A.; White, D. Steven

    1997-01-01

    Describes a market-driven approach to international business curriculum development used in creation of a curriculum for a new school of international business. The school's leadership sought input from over 100 chief executives to determine how to align the curriculum with international business needs, and conducted structured focus groups and a…

  5. Combining Theory-Driven Evaluation and Causal Loop Diagramming for Opening the ‘Black Box’ of an Intervention in the Health Sector: A Case of Performance-Based Financing in Western Uganda

    PubMed Central

    Holvoet, Nathalie; Criel, Bart

    2017-01-01

    Increased attention on “complexity” in health systems evaluation has resulted in many different methodological responses. Theory-driven evaluations and systems thinking are two such responses that aim for better understanding of the mechanisms underlying given outcomes. Here, we studied the implementation of a performance-based financing intervention by the Belgian Technical Cooperation in Western Uganda to illustrate a methodological strategy of combining these two approaches. We utilized a systems dynamics tool called causal loop diagramming (CLD) to generate hypotheses feeding into a theory-driven evaluation. Semi-structured interviews were conducted with 30 health workers from two districts (Kasese and Kyenjojo) and with 16 key informants. After CLD, we identified three relevant hypotheses: “success to the successful”, “growth and underinvestment”, and “supervision conundrum”. The first hypothesis leads to increasing improvements in performance, as better performance leads to more incentives, which in turn leads to better performance. The latter two hypotheses point to potential bottlenecks. Thus, the proposed methodological strategy was a useful tool for identifying hypotheses that can inform a theory-driven evaluation. The hypotheses are represented in a comprehensible way while highlighting the underlying assumptions, and are more easily falsifiable than hypotheses identified without using CLD. PMID:28869518

  6. Piecewise adiabatic following in non-Hermitian cycling

    NASA Astrophysics Data System (ADS)

    Gong, Jiangbin; Wang, Qing-hai

    2018-05-01

    The time evolution of periodically driven non-Hermitian systems is in general nonunitary but can be stable. It is hence of considerable interest to examine the adiabatic following dynamics in periodically driven non-Hermitian systems. We show in this work the possibility of piecewise adiabatic following interrupted by hopping between instantaneous system eigenstates. This phenomenon is first observed in a computational model and then theoretically explained, using an exactly solvable model, in terms of the Stokes phenomenon. In the latter case, the piecewise adiabatic following is shown to be a genuine critical behavior and the precise phase boundary in the parameter space is located. Interestingly, the critical boundary for piecewise adiabatic following is found to be unrelated to the domain for exceptional points. To characterize the adiabatic following dynamics, we also advocate a simple definition of the Aharonov-Anandan (AA) phase for nonunitary cyclic dynamics, which always yields real AA phases. In the slow driving limit, the AA phase reduces to the Berry phase if adiabatic following persists throughout the driving without hopping, but oscillates violently and does not approach any limit in cases of piecewise adiabatic following. This work exposes the rich features of nonunitary dynamics in cases of slow cycling and should stimulate future applications of nonunitary dynamics.

  7. Implementation of an Ada real-time executive: A case study

    NASA Technical Reports Server (NTRS)

    Laird, James D.; Burton, Bruce A.; Koppes, Mary R.

    1986-01-01

    Current Ada language implementations and runtime environments are immature and unproven, and are a key risk area for real-time embedded computer systems (ECS). A test-case environment is provided in which the concerns of the real-time ECS community are addressed. A priority-driven executive is selected to be implemented in the Ada programming language. The model selected is representative of real-time executives tailored for embedded systems used in missile, spacecraft, and avionics applications. An Ada-based design methodology is utilized, and two designs are considered. The first of these designs requires the use of vendor-supplied runtime and tasking support. An alternative high-level design is also considered for an implementation requiring no vendor-supplied runtime or tasking support. The former approach is carried through to implementation.

  8. Simplified subsurface modelling: data assimilation and violated model assumptions

    NASA Astrophysics Data System (ADS)

    Erdal, Daniel; Lange, Natascha; Neuweiler, Insa

    2017-04-01

    Integrated models are gaining more and more attention in hydrological modelling as they can better represent the interaction between different compartments. Naturally, these models come with larger numbers of unknowns and greater demands on computational resources than stand-alone models. If large model domains are to be represented, e.g. on the catchment scale, the resolution of the numerical grid needs to be reduced or the model itself needs to be simplified. Both approaches reduce the ability to reproduce the processes at play. This lack of model accuracy may be compensated for by data assimilation methods, in which observations are used to update the model states, and optionally the model parameters, in order to reduce the model error induced by the imposed simplifications. What is unclear is whether these methods, combined with strongly simplified models, result in completely data-driven models, or whether they can even be used to make adequate predictions of the model state for times when no observations are available. In the current work we consider the combined groundwater and unsaturated zone, which can be modelled in a physically consistent way using 3D models solving the Richards equation. For simple predictions, however, simpler approaches may be considered. The question investigated here is whether a simpler model, in which the groundwater is modelled as a horizontal 2D model and the unsaturated zone as a few sparse 1D columns, can be used within an Ensemble Kalman filter to give predictions of groundwater levels and unsaturated fluxes. This is tested under conditions where the feedbacks between the two model compartments are large (e.g. a shallow groundwater table) and the simplification assumptions are clearly violated. Such a case may be a steep hillslope or pumping wells creating lateral fluxes in the unsaturated zone, or strong heterogeneous structures creating unaccounted-for flows in both the saturated and unsaturated compartments. Under such circumstances, direct modelling using a simplified model will not provide good results. However, a more data-driven (e.g. grey-box) approach, driven by the filter, may still provide an improved understanding of the system. Comparisons between full 3D simulations and simplified filter-driven models will be shown and the resulting benefits and drawbacks discussed.
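
    A minimal sketch of the Ensemble Kalman filter analysis step that such an approach relies on, reduced to a single scalar state (a groundwater level); the scalar formulation and all numbers are illustrative assumptions, not the authors' setup:

```python
import random

def enkf_update(ensemble, obs, obs_err_std, rng):
    """One EnKF analysis step for a scalar state.

    Each member is nudged toward a perturbed copy of the observation,
    weighted by the Kalman gain K = P / (P + R), with P the ensemble
    variance and R the observation-error variance.
    """
    n = len(ensemble)
    mean = sum(ensemble) / n
    var = sum((x - mean) ** 2 for x in ensemble) / (n - 1)
    gain = var / (var + obs_err_std ** 2)
    return [x + gain * ((obs + rng.gauss(0.0, obs_err_std)) - x)
            for x in ensemble]

rng = random.Random(1)
# hypothetical forecast ensemble of groundwater levels (m) and one observation
prior = [rng.gauss(10.0, 1.0) for _ in range(200)]
posterior = enkf_update(prior, obs=12.0, obs_err_std=0.5, rng=rng)
```

    After the update, the ensemble mean moves toward the observation and the ensemble spread shrinks, which is exactly the mechanism expected to compensate for the error introduced by the model simplifications.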

  9. Model-driven discovery of underground metabolic functions in Escherichia coli.

    PubMed

    Guzmán, Gabriela I; Utrilla, José; Nurk, Sergey; Brunk, Elizabeth; Monk, Jonathan M; Ebrahim, Ali; Palsson, Bernhard O; Feist, Adam M

    2015-01-20

    Enzyme promiscuity toward substrates has been discussed in evolutionary terms as providing the flexibility to adapt to novel environments. In the present work, we describe an approach toward exploring such enzyme promiscuity in the space of a metabolic network. This approach leverages genome-scale models, which have been widely used for predicting growth phenotypes in various environments or following a genetic perturbation; however, these predictions occasionally fail. Failed predictions of gene essentiality offer an opportunity for targeting biological discovery, suggesting the presence of unknown underground pathways stemming from enzymatic cross-reactivity. We demonstrate a workflow that couples constraint-based modeling and bioinformatic tools with KO strain analysis and adaptive laboratory evolution for the purpose of predicting promiscuity at the genome scale. Three cases of genes that are incorrectly predicted as essential in Escherichia coli--aspC, argD, and gltA--are examined, and isozyme functions are uncovered for each to a different extent. Seven isozyme functions based on genetic and transcriptional evidence are suggested between the genes aspC and tyrB, argD and astC, gabT and puuE, and gltA and prpC. This study demonstrates how a targeted model-driven approach to discovery can systematically fill knowledge gaps, characterize underground metabolism, and elucidate regulatory mechanisms of adaptation in response to gene KO perturbations.

  10. SST-Forced Seasonal Simulation and Prediction Skill for Versions of the NCEP/MRF Model.

    NASA Astrophysics Data System (ADS)

    Livezey, Robert E.; Masutani, Michiko; Jil, Ming

    1996-03-01

    The feasibility of using a two-tier approach to provide guidance to operational long-lead seasonal prediction is explored. The approach includes first a forecast of global sea surface temperatures (SSTs) using a coupled general circulation model, followed by an atmospheric forecast using an atmospheric general circulation model (AGCM). For this exploration, ensembles of decade-long integrations of the AGCM driven by observed SSTs and ensembles of integrations of select cases driven by forecast SSTs have been conducted. The ability of the model in these sets of runs to reproduce observed atmospheric conditions has been evaluated with a multiparameter performance analysis. Results have identified performance and skill levels in the specified SST runs, for winters and springs over the Pacific/North America region, that are sufficient to impact operational seasonal predictions in years with major El Niño-Southern Oscillation (ENSO) episodes. Further, these levels were substantially reproduced in the forecast SST runs for 1-month leads and in many instances for up to one-season leads. In fact, overall the 0- and 1-month-lead forecasts of seasonal temperature over the United States for three falls and winters with major ENSO episodes were substantially better than corresponding official forecasts. Thus, there is considerable reason to develop a dynamical component for the official seasonal forecast process.

  11. Assessment of cardiovascular risk based on a data-driven knowledge discovery approach.

    PubMed

    Mendes, D; Paredes, S; Rocha, T; Carvalho, P; Henriques, J; Cabiddu, R; Morais, J

    2015-01-01

    The cardioRisk project addresses the development of personalized risk assessment tools for patients who have been admitted to the hospital with acute myocardial infarction. Although there are models available that assess the short-term risk of death/new events for such patients, these models were established in circumstances that do not take into account the present clinical interventions and, in some cases, the risk factors used by such models are not easily available in clinical practice. The integration of the existing risk tools (applied in the clinician's daily practice) with data-driven knowledge discovery mechanisms based on data routinely collected during hospitalizations, will be a breakthrough in overcoming some of these difficulties. In this context, the development of simple and interpretable models (based on recent datasets), unquestionably will facilitate and will introduce confidence in this integration process. In this work, a simple and interpretable model based on a real dataset is proposed. It consists of a decision tree model structure that uses a reduced set of six binary risk factors. The validation is performed using a recent dataset provided by the Portuguese Society of Cardiology (11113 patients), which originally comprised 77 risk factors. A sensitivity, specificity and accuracy of, respectively, 80.42%, 77.25% and 78.80% were achieved showing the effectiveness of the approach.
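
    The reported sensitivity (80.42%), specificity (77.25%) and accuracy (78.80%) follow directly from confusion-matrix counts. A sketch with hypothetical counts (not the Portuguese Society of Cardiology data):

```python
def classification_metrics(tp, fn, tn, fp):
    """Sensitivity, specificity and accuracy from confusion-matrix counts."""
    sensitivity = tp / (tp + fn)               # true-positive rate
    specificity = tn / (tn + fp)               # true-negative rate
    accuracy = (tp + tn) / (tp + fn + tn + fp) # fraction of correct calls
    return sensitivity, specificity, accuracy

# hypothetical counts, for illustration only
sens, spec, acc = classification_metrics(tp=80, fn=20, tn=77, fp=23)
```

    Reporting all three together matters here because a risk model tuned only for accuracy could still miss many high-risk patients; sensitivity and specificity expose that trade-off.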

  12. Crohn's disease complicated by Epstein-Barr virus-driven haemophagocytic lymphohistiocytosis successfully treated with rituximab.

    PubMed

    Thompson, Grace; Pepperell, Dominic; Lawrence, Ian; McGettigan, Benjamin David

    2017-02-22

    We report a case of Epstein-Barr virus (EBV)-driven haemophagocytic lymphohistiocytosis (HLH) in a man with Crohn's disease treated with 6-mercaptopurine and adalimumab therapy who was successfully treated with rituximab therapy alone. This is the first published case in an adult patient with EBV-driven HLH in the setting of thiopurine use and inflammatory bowel disease to be successfully treated with rituximab therapy alone. Here, we will discuss putative immunological mechanisms which may contribute to this potentially life-threatening complication. 2017 BMJ Publishing Group Ltd.

  13. A Hybrid Adaptive Routing Algorithm for Event-Driven Wireless Sensor Networks

    PubMed Central

    Figueiredo, Carlos M. S.; Nakamura, Eduardo F.; Loureiro, Antonio A. F.

    2009-01-01

    Routing is a basic function in wireless sensor networks (WSNs). For these networks, routing algorithms depend on the characteristics of the applications and, consequently, there is no self-contained algorithm suitable for every case. In some scenarios, the network behavior (traffic load) may vary a lot, such as an event-driven application, favoring different algorithms at different instants. This work presents a hybrid and adaptive algorithm for routing in WSNs, called Multi-MAF, that adapts its behavior autonomously in response to the variation of network conditions. In particular, the proposed algorithm applies both reactive and proactive strategies for routing infrastructure creation, and uses an event-detection estimation model to change between the strategies and save energy. To show the advantages of the proposed approach, it is evaluated through simulations. Comparisons with independent reactive and proactive algorithms show improvements on energy consumption. PMID:22423207

  14. A hybrid adaptive routing algorithm for event-driven wireless sensor networks.

    PubMed

    Figueiredo, Carlos M S; Nakamura, Eduardo F; Loureiro, Antonio A F

    2009-01-01

    Routing is a basic function in wireless sensor networks (WSNs). For these networks, routing algorithms depend on the characteristics of the applications and, consequently, there is no self-contained algorithm suitable for every case. In some scenarios, the network behavior (traffic load) may vary a lot, such as an event-driven application, favoring different algorithms at different instants. This work presents a hybrid and adaptive algorithm for routing in WSNs, called Multi-MAF, that adapts its behavior autonomously in response to the variation of network conditions. In particular, the proposed algorithm applies both reactive and proactive strategies for routing infrastructure creation, and uses an event-detection estimation model to change between the strategies and save energy. To show the advantages of the proposed approach, it is evaluated through simulations. Comparisons with independent reactive and proactive algorithms show improvements on energy consumption.
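
    The core idea, switching between reactive and proactive routing based on an estimate of event activity, can be sketched as follows. The smoothing estimator and the threshold below are hypothetical stand-ins; the actual Multi-MAF algorithm uses its own event-detection estimation model:

```python
class HybridRouter:
    """Sketch of a reactive/proactive routing switch for an event-driven WSN.

    Proactive route maintenance pays off only when events are frequent, so
    the router tracks a smoothed event rate and changes strategy when it
    crosses a threshold.
    """

    def __init__(self, threshold=0.5, alpha=0.2):
        self.threshold = threshold  # switch point (events per interval)
        self.alpha = alpha          # smoothing factor for the rate estimate
        self.rate = 0.0

    def observe(self, events_this_interval):
        # exponentially weighted moving estimate of the event rate
        self.rate = (1 - self.alpha) * self.rate + self.alpha * events_this_interval
        return self.mode()

    def mode(self):
        # maintain routes proactively only when events are frequent enough
        return "proactive" if self.rate >= self.threshold else "reactive"

r = HybridRouter()
quiet = [r.observe(0) for _ in range(5)]  # idle network: stay reactive
busy = [r.observe(3) for _ in range(5)]   # burst of events: go proactive
```

    The energy saving in the abstract comes from exactly this behavior: during quiet periods no routing infrastructure is maintained, and it is built up only when the estimator signals sustained event traffic.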

  15. Open-Inquiry Driven Overcoming of Epistemological Difficulties in Engineering Undergraduates: A Case Study in the Context of Thermal Science

    ERIC Educational Resources Information Center

    Pizzolato, Nicola; Fazio, Claudio; Sperandeo Mineo, Rosa Maria; Persano Adorno, Dominique

    2014-01-01

    This paper addresses the efficacy of an open-inquiry approach that allows students to build on traditionally received knowledge. A sample of thirty engineering undergraduates, having already attended traditional university physics instruction, was selected for this study. The students were involved in a six-week long learning experience of…

  16. Comparing University Performance by Legal Status: A Malmquist-Type Index Approach for the Case of the Spanish Higher Education System

    ERIC Educational Resources Information Center

    de la Torre, Eva M.; Gómez-Sancho, José-María; Perez-Esparrells, Carmen

    2017-01-01

    New public management and increasing levels of competition driven by global rankings are bringing the managerial practices of public and private higher education institutions closer together. However, these two types of institutions still maintain different objectives and traditions and enjoy different degrees of autonomy that are reflected in…

  17. A Model-Driven Development Method for Management Information Systems

    NASA Astrophysics Data System (ADS)

    Mizuno, Tomoki; Matsumoto, Keinosuke; Mori, Naoki

    Traditionally, Management Information Systems (MIS) have been developed without formal methods. With informal methods, the MIS is developed over its lifecycle without any models, which causes many problems, such as unreliable system design specifications. To overcome these problems, a model theory approach was proposed, based on the idea that a system can be modeled by automata and set theory. However, it is very difficult to generate automata of the system to be developed right from the start. On the other hand, model-driven development can flexibly accommodate changes of business logic or implementation technologies; in model-driven development, a system is modeled using a modeling language such as UML. This paper proposes a new development method for management information systems that applies model-driven development to a component of the model theory approach. The experiment showed that development effort was reduced by more than 30%.

  18. Large deviation function for a driven underdamped particle in a periodic potential

    NASA Astrophysics Data System (ADS)

    Fischer, Lukas P.; Pietzonka, Patrick; Seifert, Udo

    2018-02-01

    Employing large deviation theory, we explore current fluctuations of underdamped Brownian motion for the paradigmatic example of a single particle in a one-dimensional periodic potential. Two different approaches to the large deviation function of the particle current are presented. First, we derive an explicit expression for the large deviation functional of the empirical phase space density, which replaces the level 2.5 functional used for overdamped dynamics. Using this approach, we obtain several bounds on the large deviation function of the particle current. We compare these to bounds for overdamped dynamics that have recently been derived, motivated by the thermodynamic uncertainty relation. Second, we provide a method to calculate the large deviation function via the cumulant generating function. We use this method to assess the tightness of the bounds in a numerical case study for a cosine potential.
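
    The second route mentioned here, via the cumulant generating function, rests on the standard Gärtner-Ellis relation between the scaled cumulant generating function and the rate function. In assumed notation, with $J_t$ the time-integrated particle current and $j = J_t/t$ its empirical average:

```latex
\lambda(s) \;=\; \lim_{t\to\infty} \frac{1}{t} \ln \bigl\langle e^{\,s J_t} \bigr\rangle ,
\qquad
I(j) \;=\; \sup_{s} \bigl[\, s\, j \;-\; \lambda(s) \,\bigr]
```

    so computing $\lambda(s)$ (e.g. as the dominant eigenvalue of a tilted generator) gives the large deviation function $I(j)$ by Legendre-Fenchel transform, which is how the tightness of the bounds can be assessed numerically.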

  19. Drug reformulations and repositioning in the pharmaceutical industry and their impact on market access: regulatory implications

    PubMed Central

    Murteira, Susana; Millier, Aurélie; Ghezaiel, Zied; Lamure, Michel

    2014-01-01

    Background Repurposing has become a mainstream strategy in drug development, but it faces multiple challenges, amongst them the increasing and ever-changing regulatory framework. This is the second study in a three-part publication series, with the ultimate goal of understanding the market access rationale and conditions attributed to drug repurposing in the United States and in Europe. The aim of the current study is to evaluate the regulatory path associated with each type of repurposing strategy according to the nomenclature proposed in the first article of this series. Methods From the cases identified, a selection process retrieved a total of 141 case studies in all countries, harmonized for data availability and common approval in the United States and in Europe. Regulatory information for each original and repurposed drug product was extracted, along with several related regulatory attributes, such as designation change and filing before or after patent expiry. Descriptive analyses were conducted to determine trends and to investigate potential associations between the different regulatory paths and attributes of interest, for reformulation and repositioning cases separately. Results Within the studied European countries, most of the applications for reformulated products were filed through national applications. In contrast, for repositioned products, the centralized procedure was the most frequent regulatory pathway. Most of the repurposing cases were approved before patent expiry, and those cases have followed more complex regulatory pathways in the United States and in Europe. For new molecular entities filed in the United States, a similar number of cases were developed by serendipity and by a hypothesis-driven approach. However, for the new indication's regulatory pathway in the United States, most of the cases were developed through a hypothesis-driven approach. Conclusion Review of the regulations in the United States and in Europe for drug repositionings and reformulations confirmed that repositioning strategies were usually filed under a more complex regulatory process than reformulations. Also, it seems that parameters such as patent expiry and the type of repositioning approach or reformulation affect the regulatory pathways chosen for each case. PMID:27226839

  20. Organizational impact of evidence-informed decision making training initiatives: a case study comparison of two approaches.

    PubMed

    Champagne, François; Lemieux-Charles, Louise; Duranceau, Marie-France; MacKean, Gail; Reay, Trish

    2014-05-02

    The impact of efforts by healthcare organizations to enhance the use of evidence to improve organizational processes through training programs has seldom been assessed. We therefore endeavored to assess whether and how the training of mid- and senior-level healthcare managers could lead to organizational change. We conducted a theory-driven evaluation of the organizational impact of healthcare leaders' participation in two training programs using a logic model based on Nonaka's theory of knowledge conversion. We analyzed six case studies nested within the two programs using three embedded units of analysis (individual, group and organization). Interviews were conducted during intensive one-week data collection site visits. A total of 84 people were interviewed. We found that the impact of training could primarily be felt in trainees' immediate work environments. The conversion of attitudes was found to be easier to achieve than the conversion of skills. Our results show that, although socialization and externalization were common in all cases, a lack of combination impeded the conversion of skills. We also identified several individual, organizational and program design factors that facilitated and/or impeded the dissemination of the attitudes and skills gained by trainees to other organizational members. Our theory-driven evaluation showed that factors before, during and after training can influence the extent of skills and knowledge transfer. Our evaluation went further than previous research by revealing the influence--both positive and negative--of specific organizational factors on extending the impact of training programs.

  1. A multi-band environment-adaptive approach to noise suppression for cochlear implants.

    PubMed

    Saki, Fatemeh; Mirzahasanloo, Taher; Kehtarnavaz, Nasser

    2014-01-01

    This paper presents an improved environment-adaptive noise suppression solution for the cochlear implants speech processing pipeline. This improvement is achieved by using a multi-band data-driven approach in place of a previously developed single-band data-driven approach. Seven commonly encountered noisy environments of street, car, restaurant, mall, bus, pub and train are considered to quantify the improvement. The results obtained indicate about 10% improvement in speech quality measures.

  2. Relationships, variety & synergy: the vital ingredients for scholarship in engineering education? A case study

    NASA Astrophysics Data System (ADS)

    Clark, Robin; Andrews, Jane

    2014-11-01

    This paper begins with the argument that within modern-day society, engineering has shifted from being the scientific and technical mainstay of industrial, and more recently digital, change to become the most vital driver of future advancement. In order to meet the inevitable challenges resulting from this role, the nature of engineering education is constantly evolving, and as such engineering education has to change. The paper argues that what is needed is a fresh approach to engineering education - one that is sufficiently flexible so as to capture the fast-changing needs of engineering education as a discipline, whilst being pedagogically suitable for use with a range of engineering epistemologies. It provides an overview of a case study in which a new approach to engineering education has been developed and evaluated. The approach, which is based on the concept of scholarship, is described in detail. This is followed by a discussion of how the approach has been put into practice and evaluated. The paper concludes by arguing that within today's market-driven university world, the need for effective learning and teaching practice, based in good scholarship, is fundamental to student success.

  3. Reliable estimates of predictive uncertainty for an Alpine catchment using a non-parametric methodology

    NASA Astrophysics Data System (ADS)

    Matos, José P.; Schaefli, Bettina; Schleiss, Anton J.

    2017-04-01

    Uncertainty affects hydrological modelling efforts from the very measurements (or forecasts) that serve as inputs to the more or less inaccurate predictions that are produced. Uncertainty is truly inescapable in hydrology and yet, due to the theoretical and technical hurdles associated with its quantification, it is at times still neglected or estimated only qualitatively. In recent years the scientific community has made a significant effort towards quantifying this hydrologic prediction uncertainty. Despite this, most of the developed methodologies can be computationally demanding, are complex from a theoretical point of view, require substantial expertise to be employed, and are constrained by a number of assumptions about the model error distribution. These assumptions limit the reliability of many methods in case of errors that show particular cases of non-normality, heteroscedasticity, or autocorrelation. The present contribution builds on a non-parametric data-driven approach that was developed for uncertainty quantification in operational (real-time) forecasting settings. The approach is based on the concept of Pareto optimality and can be used as a standalone forecasting tool or as a postprocessor. By virtue of its non-parametric nature and a general operating principle, it can be applied directly and with ease to predictions of streamflow, water stage, or even accumulated runoff. Also, it is a methodology capable of coping with high heteroscedasticity and seasonal hydrological regimes (e.g. snowmelt and rainfall driven events in the same catchment). Finally, the training and operation of the model are very fast, making it a tool particularly adapted to operational use. To illustrate its practical use, the uncertainty quantification method is coupled with a process-based hydrological model to produce statistically reliable forecasts for an Alpine catchment located in Switzerland. 
Results are presented and discussed in terms of their reliability and resolution.
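The core idea of such a non-parametric, data-driven postprocessor can be illustrated compactly. The sketch below is not the paper's Pareto-optimality method; it is a minimal stand-in, with made-up numbers, showing how prediction intervals can be built from empirical quantiles of past forecast errors without any distributional assumption:

```python
# Non-parametric uncertainty postprocessor (illustrative sketch, not the
# authors' Pareto-optimality method): derive prediction intervals from the
# empirical quantiles of historical forecast errors.

def empirical_quantile(sorted_xs, q):
    """Linear-interpolation quantile of a pre-sorted list (0 <= q <= 1)."""
    n = len(sorted_xs)
    pos = q * (n - 1)
    lo = int(pos)
    hi = min(lo + 1, n - 1)
    frac = pos - lo
    return sorted_xs[lo] * (1 - frac) + sorted_xs[hi] * frac

def interval_from_errors(forecast, past_errors, coverage=0.9):
    """Wrap a point forecast in a (lower, upper) band built from past
    observed-minus-forecast errors; no distributional assumption is made,
    so skewed or heavy-tailed errors are handled as-is."""
    errs = sorted(past_errors)
    alpha = (1 - coverage) / 2
    return (forecast + empirical_quantile(errs, alpha),
            forecast + empirical_quantile(errs, 1 - alpha))

# Example: streamflow forecast of 120 m^3/s with skewed historical errors.
errors = [-30, -12, -8, -5, -2, 0, 1, 3, 6, 9, 15, 40]
lo, hi = interval_from_errors(120.0, errors, coverage=0.8)
```

Because the band comes straight from observed errors, asymmetric error distributions are accommodated automatically; maintaining separate error pools per season or flow regime would extend the idea to heteroscedastic, seasonal settings.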

  4. Large-scale disruptions in a current-carrying magnetofluid

    NASA Technical Reports Server (NTRS)

    Dahlburg, J. P.; Montgomery, D.; Doolen, G. D.; Matthaeus, W. H.

    1986-01-01

    Internal disruptions in a strongly magnetized electrically conducting fluid contained within a rigid conducting cylinder of square cross section are investigated theoretically, both with and without an externally applied axial electric field, by means of computer simulations using the pseudospectral three-dimensional Strauss-equations code of Dahlburg et al. (1985). Results from undriven inviscid, driven inviscid, and driven viscid simulations are presented graphically, and the significant effects of low-order truncations on the modeling accuracy are considered. A helical current filament about the cylinder axis is observed. The ratio of turbulent kinetic energy to total poloidal magnetic energy is found to undergo cyclic bounces in the undriven inviscid case, to exhibit one large bounce followed by decay to a quasi-steady state with poloidal fluid velocity flow in the driven inviscid case, and to show one large bounce followed by further sawtoothlike bounces in the driven viscid case.

  5. Off-equatorial current-driven instabilities ahead of approaching dipolarization fronts

    NASA Astrophysics Data System (ADS)

    Zhang, Xu; Angelopoulos, V.; Pritchett, P. L.; Liu, Jiang

    2017-05-01

    Recent kinetic simulations have revealed that electromagnetic instabilities near the ion gyrofrequency and slightly away from the equatorial plane can be driven by a current parallel to the magnetic field prior to the arrival of dipolarization fronts. Such instabilities are important because of their potential contribution to global electromagnetic energy conversion near dipolarization fronts. Of the several instabilities that may be consistent with such waves, the most notable are the current-driven electromagnetic ion cyclotron instability and the current-driven kink-like instability. To confirm the existence and characteristics of these instabilities, we used observations by two Time History of Events and Macroscale Interactions during Substorms satellites, one near the neutral sheet observing dipolarization fronts and the other at the boundary layer observing precursor waves and currents. We found that such instabilities with monochromatic signatures are rare, but one of the few cases was selected for further study. Two different instabilities, one at about 0.3 Hz and the other at a much lower frequency, 0.02 Hz, were seen in the data from the off-equatorial spacecraft. A parallel current attributed to an electron beam coexisted with the waves. Our instability analysis attributes the higher-frequency instability to a current-driven ion cyclotron instability and the lower frequency instability to a kink-like instability. The current-driven kink-like instability we observed is consistent with the instabilities observed in the simulation. We suggest that the currents needed to excite these low-frequency instabilities are so intense that the associated electron beams are easily thermalized and hence difficult to observe.

  6. Life-times of quantum resonances through the Geometrical Phase Propagator Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pavlou, G.E.; Karanikas, A.I.; Diakonos, F.K., E-mail: fdiakono@phys.uoa.gr

    We employ the recently introduced Geometric Phase Propagator Approach (GPPA) (Diakonos et al., 2012) to develop an improved perturbative scheme for the calculation of lifetimes in driven quantum systems. This incorporates a resummation of the contributions of virtual processes starting and ending at the same state in the considered time interval. The proposed procedure allows for a strict determination of the conditions leading to finite lifetimes in a general driven quantum system by isolating the resummed terms in the perturbative expansion contributing to their generation. To illustrate how the derived conditions apply in practice, we consider the effect of driving in a system with a purely discrete energy spectrum, as well as in a system for which the eigenvalue spectrum contains a continuous part. We show that in the first case, when the driving contains a dense set of frequencies acting as noise on the system, the corresponding bound states acquire a finite lifetime. When the energy spectrum also contains a continuous set of eigenvalues, the bound states, due to the driving, couple to the continuum and become quasi-bound resonances. The benchmark of this change is the appearance of a Fano-type peak in the associated transmission profile. In both cases the corresponding lifetime can be efficiently estimated within the reformulated GPPA approach.

  7. Computational modeling of electrically-driven deposition of ionized polydisperse particulate powder mixtures in advanced manufacturing processes

    NASA Astrophysics Data System (ADS)

    Zohdi, T. I.

    2017-07-01

    A key part of emerging advanced additive manufacturing methods is the deposition of specialized particulate mixtures of materials on substrates. For example, in many cases these materials are polydisperse powder mixtures whereby one set of particles is chosen with the objective to electrically, thermally or mechanically functionalize the overall mixture material and another set of finer-scale particles serves as an interstitial filler/binder. Often, achieving controllable, precise, deposition is difficult or impossible using mechanical means alone. It is for this reason that electromagnetically-driven methods are being pursued in industry, whereby the particles are ionized and an electromagnetic field is used to guide them into place. The goal of this work is to develop a model and simulation framework to investigate the behavior of a deposition as a function of an applied electric field. The approach develops a modular discrete-element type method for the simulation of the particle dynamics, which provides researchers with a framework to construct computational tools for this growing industry.
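As a rough illustration of the discrete-element viewpoint, the following sketch integrates the motion of a single ionized particle steered toward a substrate by a uniform electric field with a Stokes-like drag term. All parameter values are hypothetical, and the paper's actual modular framework is far richer (many particles, contacts, near-field interactions):

```python
# Toy discrete-element-style integrator (hypothetical parameters): a single
# ionized particle is guided toward a substrate at z = 0 by a uniform
# electric field, with a Stokes-like drag term. Semi-implicit Euler update.

def deposit(z0, v0, q=1e-15, m=1e-12, E=-2.0e5, c_drag=1e-9, dt=1e-5):
    """Integrate m*dv/dt = q*E - c*v until the particle reaches z <= 0.
    E is chosen negative so the field force pushes toward the substrate.
    Returns (time_of_deposition, impact_speed)."""
    z, v, t = z0, v0, 0.0
    while z > 0.0:
        a = (q * E - c_drag * v) / m   # field force plus drag
        v += a * dt                     # semi-implicit Euler: velocity first
        z += v * dt
        t += dt
        if t > 10.0:                    # safety cap for the sketch
            break
    return t, abs(v)

# Release a particle 1 cm above the substrate, initially at rest.
t_dep, speed = deposit(z0=0.01, v0=0.0)
```

With these numbers the particle relaxes to a terminal speed q*E/c of 0.2 m/s on a millisecond timescale and deposits after roughly 50 ms; varying E in such a model is the knob that would let one study deposition behavior as a function of the applied field.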

  8. Automatic ICD-10 multi-class classification of cause of death from plaintext autopsy reports through expert-driven feature selection.

    PubMed

    Mujtaba, Ghulam; Shuib, Liyana; Raj, Ram Gopal; Rajandram, Retnagowri; Shaikh, Khairunisa; Al-Garadi, Mohammed Ali

    2017-01-01

    Widespread implementation of electronic databases has improved the accessibility of plaintext clinical information for supplementary use. Numerous machine learning techniques, such as supervised machine learning approaches or ontology-based approaches, have been employed to obtain useful information from plaintext clinical data. This study proposes an automatic multi-class classification system to predict accident-related causes of death from plaintext autopsy reports through expert-driven feature selection with supervised automatic text classification decision models. Accident-related autopsy reports were obtained from one of the largest hospitals in Kuala Lumpur. These reports cover nine different accident-related causes of death. A master feature vector was prepared by extracting features from the collected autopsy reports using unigrams with lexical categorization. This master feature vector was used to detect the cause of death [according to the International Classification of Diseases, 10th revision (ICD-10)] through five automated feature selection schemes, the proposed expert-driven approach, five feature subset sizes, and five machine learning classifiers. Model performance was evaluated using precisionM, recallM, F-measureM, accuracy, and area under the ROC curve. Four baselines were used to compare the results with the proposed system. Random forest and J48 decision models parameterized using expert-driven feature selection yielded the highest evaluation measures (approaching 85% to 90% for most metrics) with a feature subset size of 30. The proposed system also showed an approximately 14% to 16% improvement in overall accuracy compared with the existing techniques and the four baselines. The proposed system is feasible and practical to use for automatic classification of ICD-10-coded causes of death from autopsy reports.
    The proposed system assists pathologists in accurately and rapidly determining the underlying cause of death from autopsy findings. Furthermore, the proposed expert-driven feature selection approach and the findings are generally applicable to other kinds of plaintext clinical reports.
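The master-feature-vector idea, unigram features extracted from plaintext reports and then reduced to a small subset, can be sketched as follows. The selection rule here (a simple class-spread score) is a hypothetical stand-in for the paper's expert-driven and automated schemes, and the reports are invented:

```python
from collections import Counter

# Illustrative sketch of a "master feature vector": unigram features from
# plaintext reports, scored per class so that only a small subset is kept
# for classification. The scoring rule (difference between per-class counts)
# is a stand-in, not the paper's expert-driven or automated selection.

def unigrams(text):
    return [w.strip(".,;:").lower() for w in text.split() if w.strip(".,;:")]

def select_features(reports, labels, k=3):
    """Score each unigram by how unevenly it is spread across classes and
    keep the top-k; returns the selected vocabulary."""
    per_class = {}
    for text, label in zip(reports, labels):
        per_class.setdefault(label, Counter()).update(unigrams(text))
    vocab = set().union(*per_class.values())
    def spread(w):
        counts = [c[w] for c in per_class.values()]  # Counter gives 0 if absent
        return max(counts) - min(counts)
    return sorted(vocab, key=spread, reverse=True)[:k]

reports = ["blunt head trauma with skull fracture",
           "drowning with water in lungs",
           "head injury and skull fracture"]
labels = ["fall", "drowning", "fall"]
top = select_features(reports, labels, k=2)
```

Words shared across classes (such as "with") score zero and drop out, while class-discriminative terms survive; the retained vocabulary would then define the feature subset fed to a downstream classifier.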

  9. Automatic ICD-10 multi-class classification of cause of death from plaintext autopsy reports through expert-driven feature selection

    PubMed Central

    Mujtaba, Ghulam; Shuib, Liyana; Raj, Ram Gopal; Rajandram, Retnagowri; Shaikh, Khairunisa; Al-Garadi, Mohammed Ali

    2017-01-01

    Objectives Widespread implementation of electronic databases has improved the accessibility of plaintext clinical information for supplementary use. Numerous machine learning techniques, such as supervised machine learning approaches or ontology-based approaches, have been employed to obtain useful information from plaintext clinical data. This study proposes an automatic multi-class classification system to predict accident-related causes of death from plaintext autopsy reports through expert-driven feature selection with supervised automatic text classification decision models. Methods Accident-related autopsy reports were obtained from one of the largest hospitals in Kuala Lumpur. These reports cover nine different accident-related causes of death. A master feature vector was prepared by extracting features from the collected autopsy reports using unigrams with lexical categorization. This master feature vector was used to detect the cause of death [according to the International Classification of Diseases, 10th revision (ICD-10)] through five automated feature selection schemes, the proposed expert-driven approach, five feature subset sizes, and five machine learning classifiers. Model performance was evaluated using precisionM, recallM, F-measureM, accuracy, and area under the ROC curve. Four baselines were used to compare the results with the proposed system. Results Random forest and J48 decision models parameterized using expert-driven feature selection yielded the highest evaluation measures (approaching 85% to 90% for most metrics) with a feature subset size of 30. The proposed system also showed an approximately 14% to 16% improvement in overall accuracy compared with the existing techniques and the four baselines. Conclusion The proposed system is feasible and practical to use for automatic classification of ICD-10-coded causes of death from autopsy reports.
    The proposed system assists pathologists in accurately and rapidly determining the underlying cause of death from autopsy findings. Furthermore, the proposed expert-driven feature selection approach and the findings are generally applicable to other kinds of plaintext clinical reports. PMID:28166263

  10. Supporting read-across using biological data.

    PubMed

    Zhu, Hao; Bouhifd, Mounir; Donley, Elizabeth; Egnash, Laura; Kleinstreuer, Nicole; Kroese, E Dinant; Liu, Zhichao; Luechtefeld, Thomas; Palmer, Jessica; Pamies, David; Shen, Jie; Strauss, Volker; Wu, Shengde; Hartung, Thomas

    2016-01-01

    Read-across, i.e. filling toxicological data gaps by reference to similar chemicals for which test data are available, is usually based on chemical similarity. Besides structure and physico-chemical properties, however, biological similarity based on biological data adds extra strength to this process. In the context of developing Good Read-Across Practice guidance, a number of case studies were evaluated to demonstrate the use of biological data to enrich read-across. In the simplest case, chemically similar substances also show similar test results in relevant in vitro assays. This is a well-established method for the read-across of e.g. genotoxicity assays. Larger datasets of biological and toxicological properties of hundreds and thousands of substances are becoming increasingly available, enabling big data approaches in read-across studies. Several case studies using various big data sources are described in this paper. An example is given for the US EPA's ToxCast dataset, which allows read-across for high-quality uterotrophic assays for estrogenic endocrine disruption. Similarly, an example is given for REACH registration data enhancing read-across for acute toxicity studies. A different approach is taken using omics data to establish biological similarity: examples are given for stem cell models in vitro and short-term repeated dose studies in rats in vivo to support read-across and category formation. These preliminary biological data-driven read-across studies highlight the road to a new generation of read-across approaches that can be applied in chemical safety assessment.

  11. Magnetism in curved geometries

    NASA Astrophysics Data System (ADS)

    Streubel, Robert; Fischer, Peter; Kronast, Florian; Kravchuk, Volodymyr P.; Sheka, Denis D.; Gaididei, Yuri; Schmidt, Oliver G.; Makarov, Denys

    2016-09-01

    Extending planar two-dimensional structures into the three-dimensional space has become a general trend in multiple disciplines, including electronics, photonics, plasmonics and magnetics. This approach provides means to modify conventional or to launch novel functionalities by tailoring the geometry of an object, e.g. its local curvature. In a generic electronic system, curvature results in the appearance of scalar and vector geometric potentials inducing anisotropic and chiral effects. In the specific case of magnetism, even in the simplest case of a curved anisotropic Heisenberg magnet, the curvilinear geometry manifests two exchange-driven interactions, namely effective anisotropy and antisymmetric exchange, i.e. a Dzyaloshinskii-Moriya-like interaction. As a consequence, a family of novel curvature-driven effects emerges, which includes magnetochiral effects and topologically induced magnetization patterning, resulting in theoretically predicted unlimited domain wall velocities, chirality symmetry breaking and Cherenkov-like effects for magnons. The broad range of altered physical properties makes these curved architectures appealing in view of fundamental research on e.g. skyrmionic systems, magnonic crystals or exotic spin configurations. In addition to this rich physics, the application potential of three-dimensionally shaped objects is currently being explored as magnetic field sensorics for magnetofluidic applications, spin-wave filters, advanced magneto-encephalography devices for diagnosis of epilepsy or for energy-efficient racetrack memory devices. These recent developments, ranging from theoretical predictions over the fabrication of three-dimensionally curved magnetic thin films, hollow cylinders or wires, to their characterization using integral means as well as the development of advanced tomography approaches, are the focus of this review.

  12. qPortal: A platform for data-driven biomedical research.

    PubMed

    Mohr, Christopher; Friedrich, Andreas; Wojnar, David; Kenar, Erhan; Polatkan, Aydin Can; Codrea, Marius Cosmin; Czemmel, Stefan; Kohlbacher, Oliver; Nahnsen, Sven

    2018-01-01

    Modern biomedical research aims at drawing biological conclusions from large, highly complex biological datasets. It has become common practice to make extensive use of high-throughput technologies that produce large amounts of heterogeneous data. In addition to the ever-improving accuracy, methods are getting faster and cheaper, resulting in a steadily increasing need for scalable data management and easily accessible means of analysis. We present qPortal, a platform providing users with an intuitive way to manage and analyze quantitative biological data. The backend leverages a variety of concepts and technologies, such as relational databases, data stores, data models and means of data transfer, as well as front-end solutions to give users access to data management and easy-to-use analysis options. Users are empowered to conduct their experiments from the experimental design to the visualization of their results through the platform. Here, we illustrate the feature-rich portal by simulating a biomedical study based on publicly available data. We demonstrate the software's strength in supporting the entire project life cycle. The software supports the project design and registration, empowers users to do all-digital project management and finally provides means to perform analysis. We compare our approach to Galaxy, one of the most widely used scientific workflow and analysis platforms in computational biology. Application of both systems to a small case study shows the differences between a data-driven approach (qPortal) and a workflow-driven approach (Galaxy). qPortal, a one-stop-shop solution for biomedical projects, offers up-to-date analysis pipelines, quality control workflows, and visualization tools. Through intensive user interactions, appropriate data models have been developed.
    These models build the foundation of our biological data management system and provide possibilities to annotate data, query metadata for statistics and future re-analysis on high-performance computing systems via coupling of workflow management systems. Integration of project and data management as well as workflow resources in one place presents clear advantages over existing solutions.

  13. Numerical simulation of the sedimentation of a sphere in a sheared granular fluid: a granular Stokes experiment.

    PubMed

    Tripathi, Anurag; Khakhar, D V

    2011-09-02

    We study, computationally, the sedimentation of a sphere of higher mass in a steady, gravity-driven granular flow of otherwise identical spheres, on a rough inclined plane. Taking a hydrodynamic approach at the scale of the particle, we find the drag force to be given by a modified Stokes law and the buoyancy force by the Archimedes principle, with excluded volume effects taken into account. We also find significant differences between the hydrodynamic case and the granular case, which are highlighted.

  14. Numerical Simulation of the Sedimentation of a Sphere in a Sheared Granular Fluid: A Granular Stokes Experiment

    NASA Astrophysics Data System (ADS)

    Tripathi, Anurag; Khakhar, D. V.

    2011-09-01

    We study, computationally, the sedimentation of a sphere of higher mass in a steady, gravity-driven granular flow of otherwise identical spheres, on a rough inclined plane. Taking a hydrodynamic approach at the scale of the particle, we find the drag force to be given by a modified Stokes law and the buoyancy force by the Archimedes principle, with excluded volume effects taken into account. We also find significant differences between the hydrodynamic case and the granular case, which are highlighted.

  15. Conducting requirements analyses for research using routinely collected health data: a model driven approach.

    PubMed

    de Lusignan, Simon; Cashman, Josephine; Poh, Norman; Michalakidis, Georgios; Mason, Aaron; Desombre, Terry; Krause, Paul

    2012-01-01

    Medical research increasingly requires the linkage of data from different sources. Conducting a requirements analysis for a new application is an established part of software engineering, but rarely reported in the biomedical literature; and no generic approaches have been published as to how to link heterogeneous health data. Literature review, followed by a consensus process to define how requirements for research using multiple data sources might be modeled. We have developed a requirements analysis method: i-ScheDULEs. The first components of the modeling process are indexing and the creation of a rich picture of the research study. Secondly, we developed a series of reference models of progressive complexity: data flow diagrams (DFD) to define data requirements; Unified Modeling Language (UML) use case diagrams to capture study-specific and governance requirements; and finally, business process models, using Business Process Modeling Notation (BPMN). These requirements and their associated models should become part of research study protocols.

  16. Integrated rare variant-based risk gene prioritization in disease case-control sequencing studies.

    PubMed

    Lin, Jhih-Rong; Zhang, Quanwei; Cai, Ying; Morrow, Bernice E; Zhang, Zhengdong D

    2017-12-01

    Rare variants of major effect play an important role in human complex diseases and can be discovered by sequencing-based genome-wide association studies. Here, we introduce an integrated approach that combines the rare variant association test with gene network and phenotype information to identify risk genes implicated by rare variants for human complex diseases. Our data integration method follows a 'discovery-driven' strategy without relying on prior knowledge about the disease and thus maintains the unbiased character of genome-wide association studies. Simulations reveal that our method can outperform a widely used rare variant association test method by a factor of 2 to 3. In a case study of a small disease cohort, we uncovered putative risk genes and the corresponding rare variants that may act as genetic modifiers of congenital heart disease in 22q11.2 deletion syndrome patients. These variants were missed by a conventional approach that relied on the rare variant association test alone.
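The integration strategy, combining a per-gene association signal with network-derived evidence, can be caricatured with a simple rank-product combination. This is an illustrative stand-in, not the paper's actual method, and all gene names and scores are invented:

```python
# Illustrative sketch of integrating rare-variant association evidence with a
# gene-network score. The combination rule (a rank product) is a simple
# stand-in for the paper's integration method; data are invented.

def rank(values, reverse=False):
    """Return 1-based ranks; by default, smaller values rank first."""
    order = sorted(range(len(values)), key=lambda i: values[i], reverse=reverse)
    ranks = [0] * len(values)
    for r, i in enumerate(order, start=1):
        ranks[i] = r
    return ranks

def prioritize(genes, assoc_pvals, network_scores):
    """Lower p-value and higher network score both push a gene up the list."""
    r_assoc = rank(assoc_pvals)                  # small p -> good rank
    r_net = rank(network_scores, reverse=True)   # high score -> good rank
    combined = [ra * rn for ra, rn in zip(r_assoc, r_net)]
    return [g for _, g in sorted(zip(combined, genes))]

genes = ["GENE_A", "GENE_B", "GENE_C"]
pvals = [0.04, 0.20, 0.01]          # per-gene rare variant association test
net = [0.9, 0.2, 0.5]               # connectivity to phenotype-linked genes
ranked = prioritize(genes, pvals, net)
```

Note how GENE_A, with only modest association evidence but strong network support, ties GENE_C at the top of the list; this is the kind of gene a test-only analysis would miss.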

  17. Can phronesis save the life of medical ethics?

    PubMed

    Beresford, E B

    1996-09-01

    There has been a growing interest in casuistry since the groundbreaking work of Jonsen and Toulmin. Casuistry, in their view, offers the possibility of securing the moral agreement that policy makers desire but which has proved elusive to theory-driven approaches to ethics. However, their account of casuistry is dependent upon the exercise of phronesis. As recent discussions of phronesis make clear, this requires attention not only to the particulars of the case, but also to the substantive goods at stake in the case. Without agreement on these goods, attention to cases is unlikely to secure the productive consensus that Jonsen and Toulmin seek.

  18. Condom Use among Immigrant Latino Sexual Minorities: Multilevel Analysis after Respondent-Driven Sampling

    PubMed Central

    Rhodes, Scott D.; McCoy, Thomas P.

    2014-01-01

    This study explored correlates of condom use within a respondent-driven sample of 190 Spanish-speaking immigrant Latino sexual minorities, including gay and bisexual men, other men who have sex with men (MSM), and transgender persons, in North Carolina. Five analytic approaches for modeling data collected using respondent-driven sampling (RDS) were compared. Across most approaches, knowledge of HIV and sexually transmitted infections (STIs) and increased condom use self-efficacy predicted consistent condom use, and increased homophobia predicted decreased consistent condom use. The same correlates were not significant in all analyses but were consistent in most. Clustering due to recruitment chains was low, while clustering due to recruiter was substantial. This highlights the importance of accounting for clustering when analyzing RDS data. PMID:25646728
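A standard way to account for the non-random nature of RDS data at the estimation stage is inverse-degree weighting, as in the RDS-II (Volz-Heckathorn) estimator, since respondent-driven samples over-represent people with large networks. The following minimal sketch uses invented data:

```python
# RDS-II (Volz-Heckathorn) style estimator sketch: weight each respondent by
# the inverse of their self-reported network degree. Data below are invented.

def rds_ii_proportion(outcomes, degrees):
    """Inverse-degree-weighted proportion of a binary outcome."""
    w = [1.0 / d for d in degrees]
    num = sum(wi for wi, y in zip(w, outcomes) if y)
    return num / sum(w)

outcomes = [1, 0, 1, 1, 0, 1]       # consistent condom use (yes/no)
degrees = [10, 2, 5, 20, 4, 8]      # self-reported network sizes
p_hat = rds_ii_proportion(outcomes, degrees)
```

Here the weighted estimate falls below the naive sample proportion because the positive outcomes cluster among high-degree (hence down-weighted) respondents; clustering by recruiter, as the abstract stresses, would additionally require multilevel or robust-variance modeling.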

  19. Keratinocyte-driven contraction of reconstructed human skin.

    PubMed

    Chakrabarty, K H; Heaton, M; Dalley, A J; Dawson, R A; Freedlander, E; Khaw, P T; Mac Neil, S

    2001-01-01

    We have previously reported that reconstructed human skin, using deepidermized acellular sterilized dermis and allogeneic keratinocytes and fibroblasts, significantly contracts in vitro. Contracture of split skin grafts in burns injuries remains a serious problem, and this in vitro model provides an opportunity to study keratinocyte/mesenchymal cell interactions and cell interactions with extracted normal human dermis. The aim of this study was to investigate the nature of this in vitro contraction and explore several approaches to prevent or reduce contraction. Three different methodologies for sterilization of the dermal matrix were examined: glycerol, ethylene oxide and a combination of glycerol and ethylene oxide. While the nature of the sterilization technique influenced the extent of contraction, and thinner dermal matrices contracted proportionately more than thicker matrices, in all cases contraction was driven by the keratinocytes with relatively little influence from the fibroblasts. The contraction of the underlying dermis did not represent any change in tissue mass but rather a reorganization of the dermis, which was rapidly reversed (within minutes) when the epidermal layer was removed. Pharmacological approaches to block contraction showed forskolin and mannose-6-phosphate to be ineffective and ascorbic acid-2-phosphate to exacerbate contraction. However, Galardin, a matrix metalloproteinase inhibitor, and keratinocyte-conditioned media both inhibited contraction.

  20. Parametric Instability Rates in Periodically Driven Band Systems

    NASA Astrophysics Data System (ADS)

    Lellouch, S.; Bukov, M.; Demler, E.; Goldman, N.

    2017-04-01

    In this work, we analyze the dynamical properties of periodically driven band models. Focusing on the case of Bose-Einstein condensates, and using a mean-field approach to treat interparticle collisions, we identify the origin of dynamical instabilities arising from the interplay between the external drive and interactions. We present a widely applicable generic numerical method to extract instability rates and link parametric instabilities to uncontrolled energy absorption at short times. Based on the existence of parametric resonances, we then develop an analytical approach within Bogoliubov theory, which quantitatively captures the instability rates of the system and provides an intuitive picture of the relevant physical processes, including an understanding of how transverse modes affect the formation of parametric instabilities. Importantly, our calculations demonstrate an agreement between the instability rates determined from numerical simulations and those predicted by theory. To determine the validity regime of the mean-field analysis, we compare the latter to the weakly coupled conserving approximation. The tools developed and the results obtained in this work are directly relevant to present-day ultracold-atom experiments based on shaken optical lattices and are expected to provide an insightful guidance in the quest for Floquet engineering.

  1. Propensity approach to nonequilibrium thermodynamics of a chemical reaction network: Controlling single E. coli β-galactosidase enzyme catalysis through the elementary reaction steps

    NASA Astrophysics Data System (ADS)

    Das, Biswajit; Banerjee, Kinshuk; Gangopadhyay, Gautam

    2013-12-01

    In this work, we develop an approach to the nonequilibrium thermodynamics of an open chemical reaction network in terms of the elementary reaction propensities. The method is akin to the microscopic formulation of the dissipation function in terms of the Kullback-Leibler distance of phase space trajectories in Hamiltonian systems. The formalism is applied to single oligomeric enzyme kinetics under chemiostatic conditions, which lead the reaction system to a nonequilibrium steady state characterized by a positive total entropy production rate. Analytical expressions are derived relating the individual reaction contributions towards the total entropy production rate to the experimentally measurable reaction velocity. Taking the real case of the Escherichia coli β-galactosidase enzyme obeying Michaelis-Menten kinetics, we thoroughly analyze the temporal as well as the steady state behavior of various thermodynamic quantities for each elementary reaction. This gives useful insight into the relative magnitudes of the various energy terms and the dissipated heat required to sustain a steady state of a reaction system operating far from equilibrium. It is also observed that the reaction is entropy-driven at low substrate concentration and becomes energy-driven as the substrate concentration rises.
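The sign structure behind a positive total entropy production rate is easy to exhibit: for each elementary step with forward and backward fluxes J+ and J-, the contribution (J+ - J-) ln(J+/J-) is non-negative, vanishing only at detailed balance. The sketch below uses hypothetical steady-state fluxes (in units with k_B = 1), not values fitted to β-galactosidase:

```python
import math

# Per-reaction entropy production rate in stochastic thermodynamics of
# reaction networks: sigma_i = (J+ - J-) * ln(J+/J-) >= 0 for each
# elementary step (k_B = 1). Flux values below are hypothetical.

def reaction_epr(j_plus, j_minus):
    """Entropy production rate of one elementary reaction step."""
    return (j_plus - j_minus) * math.log(j_plus / j_minus)

# Michaelis-Menten-like scheme E + S <-> ES <-> E + P at steady state:
fluxes = [(5.0, 2.0),   # binding/unbinding step
          (3.5, 0.5)]   # catalytic/reverse step
total_epr = sum(reaction_epr(jp, jm) for jp, jm in fluxes)
```

Each term is a product of a flux difference and an affinity-like logarithm with the same sign, so every elementary step contributes non-negatively, and the total vanishes only when all steps satisfy J+ = J- (equilibrium).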

  2. Yarkovsky-driven Impact Predictions: Apophis and 1950 DA

    NASA Astrophysics Data System (ADS)

    Chesley, Steven R.; Farnocchia, D.; Chodas, P. W.; Milani, A.

    2013-10-01

    Orbit determination for Near-Earth Asteroids presents unique technical challenges due to the imperative of early detection and careful assessment of the risk posed by specific Earth close approaches. The occurrence of an Earth impact can be decisively driven by the Yarkovsky effect, which is the most important nongravitational perturbation, as it causes asteroids to undergo a secular variation in semimajor axis resulting in a quadratic effect in anomaly. We discuss the cases of (99942) Apophis and (29075) 1950 DA. The relevance of the Yarkovsky effect for Apophis is due to a scattering close approach in 2029 with minimum geocentric distance ~38000 km. For 1950 DA the influence of the Yarkovsky effect in 2880 is due to the long time interval preceding the impact. We use the available information from the astrometry and the asteroids' physical models and dynamical evolution as a starting point for a Monte Carlo method that allows us to measure how the Yarkovsky effect affects orbital predictions. We also find that 1950 DA has a 98% likelihood of being a retrograde rotator. For Apophis we map onto the 2029 close approach b-plane and analyze the keyholes corresponding to resonant close approaches. For 1950 DA we use the b-plane corresponding to the possible impact in 2880. We finally compute the impact probability from the mapped probability density function on the considered b-plane. For Apophis we find a 4-in-a-million chance of an impact in 2068, while the probability of an Earth impact in 2880 for 1950 DA is 0.04%.
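The b-plane keyhole computation lends itself to a compact Monte Carlo caricature: sample the uncertain, Yarkovsky-driven coordinate on the b-plane and count the fraction of samples that falls inside a keyhole. All numbers below are invented for illustration and bear no relation to the Apophis or 1950 DA solutions:

```python
import random

# Monte Carlo sketch of the b-plane keyhole idea (all numbers hypothetical):
# sample the uncertain along-track drift mapped to a b-plane coordinate and
# count the fraction of samples landing inside a keyhole.

def impact_probability(n_samples, keyhole, mean_drift, sigma_drift, seed=1):
    rng = random.Random(seed)
    lo, hi = keyhole
    hits = 0
    for _ in range(n_samples):
        xi = rng.gauss(mean_drift, sigma_drift)  # b-plane coordinate, km
        if lo <= xi <= hi:
            hits += 1
    return hits / n_samples

# A narrow 0.6 km keyhole sitting about two sigma from the nominal solution.
p = impact_probability(200_000, keyhole=(19.7, 20.3), mean_drift=0.0,
                       sigma_drift=10.0, seed=1)
```

Because the keyhole is far narrower than the orbital uncertainty, the probability is set almost entirely by the local probability density at the keyhole, which is why constraining the Yarkovsky drift (and hence the mean and spread of the sampled coordinate) changes impact predictions so strongly.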

  3. Lead optimization attrition analysis (LOAA): a novel and general methodology for medicinal chemistry.

    PubMed

    Munson, Mark; Lieberman, Harvey; Tserlin, Elina; Rocnik, Jennifer; Ge, Jie; Fitzgerald, Maria; Patel, Vinod; Garcia-Echeverria, Carlos

    2015-08-01

    Herein, we report a novel and general method, lead optimization attrition analysis (LOAA), to benchmark two distinct small-molecule lead series using a relatively unbiased, simple technique and commercially available software. We illustrate this approach with data collected during lead optimization of two independent oncology programs as a case study. Easily generated graphics and attrition curves enabled us to calibrate progress and support go/no-go decisions on each program. We believe that this data-driven technique could be used broadly by medicinal chemists and management to guide strategic decisions during drug discovery.

  4. A quantum framework for likelihood ratios

    NASA Astrophysics Data System (ADS)

    Bond, Rachael L.; He, Yang-Hui; Ormerod, Thomas C.

    The ability to calculate precise likelihood ratios is fundamental to science, from Quantum Information Theory through to Quantum State Estimation. However, there is no assumption-free statistical methodology to achieve this. For instance, in the absence of data relating to covariate overlap, the widely used Bayes’ theorem either defaults to the marginal-probability-driven “naive Bayes’ classifier”, or requires the use of compensatory expectation-maximization techniques. This paper takes an information-theoretic approach in developing a new statistical formula for the calculation of likelihood ratios based on the principles of quantum entanglement, and demonstrates that Bayes’ theorem is a special case of a more general quantum mechanical expression.
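For contrast with the quantum formula, the classical likelihood-ratio update that the paper generalizes can be stated in two lines: posterior odds equal prior odds times the likelihood ratio. A minimal sketch with invented numbers:

```python
# Classical Bayes / likelihood-ratio update (not the paper's quantum
# expression): posterior odds = prior odds * likelihood ratio, where the
# likelihood ratio is P(evidence | H1) / P(evidence | H2).

def posterior_prob(prior, lr):
    """Update a prior probability for H1 using a likelihood ratio."""
    odds = prior / (1 - prior) * lr
    return odds / (1 + odds)

# An unlikely hypothesis (1% prior) confronted with strong evidence (LR = 20).
p = posterior_prob(prior=0.01, lr=20.0)
```

Even a likelihood ratio of 20 lifts a 1% prior only to about 17%, which illustrates why the precise value of the likelihood ratio, the quantity the paper's quantum formula targets, matters so much in evidence evaluation.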

  5. Subxiphoid complex uniportal video-assisted major pulmonary resections.

    PubMed

    Gonzalez-Rivas, Diego; Lirio, Francisco; Sesma, Julio; Abu Akar, Firas

    2017-01-01

    In recent years, the search for a less invasive and thus, less painful approach has driven technical innovation in modern thoracic surgery. In this context, subxiphoid uniportal approach has emerged as an alternative to avoid intercostal space manipulation and decrease postoperative pain and intercostal nerve chronic impairment. Subxiphoid uniportal major lung resections have been safe and effective procedures when performed by experienced surgeons even in complex cases or unexpected intraoperative situations. We present six of these surgical scenarios such as big tumors, incomplete or absent fissures, hilar calcified lymph nodes, active bleeding and massive adhesions to show the feasibility of subxiphoid approach to manage even these conditions.

  6. Subxiphoid complex uniportal video-assisted major pulmonary resections

    PubMed Central

    Lirio, Francisco; Sesma, Julio; Abu Akar, Firas

    2017-01-01

    In recent years, the search for a less invasive and thus, less painful approach has driven technical innovation in modern thoracic surgery. In this context, subxiphoid uniportal approach has emerged as an alternative to avoid intercostal space manipulation and decrease postoperative pain and intercostal nerve chronic impairment. Subxiphoid uniportal major lung resections have been safe and effective procedures when performed by experienced surgeons even in complex cases or unexpected intraoperative situations. We present six of these surgical scenarios such as big tumors, incomplete or absent fissures, hilar calcified lymph nodes, active bleeding and massive adhesions to show the feasibility of subxiphoid approach to manage even these conditions. PMID:29078655

  7. Not Driven by High-Stakes Tests: Exploring Science Assessment and College Readiness of Students from an Urban Portfolio Community High School

    ERIC Educational Resources Information Center

    Fleshman, Robin Earle

    2017-01-01

    This case study seeks to explore three research questions: (1) What science teaching and learning processes, perspectives, and cultures exist within the science classroom of an urban portfolio community high school? (2) In what ways does the portfolio-based approach prepare high school students of color for college level science coursework,…

  8. Results-driven approach to improving quality and productivity

    Treesearch

    John Dramm

    2000-01-01

    Quality control (QC) programs do not often realize their full potential. Elaborate and expensive QC programs can easily get side tracked by the process of building a program with promises of “Someday, this will all pay off.” Training employees in QC methods is no guarantee that quality will improve. Several documented cases show that such activity-centered efforts...

  9. Estimating a Service-Life Distribution Based on Production Counts and a Failure Database

    DOE PAGES

    Ryan, Kenneth J.; Hamada, Michael Scott; Vardeman, Stephen B.

    2017-04-01

    A manufacturer wanted to compare the service-life distributions of two similar products. These concern product lifetimes after installation (not manufacture). For each product, there were available production counts and an imperfect database providing information on failing units. In the real case, these units were expensive repairable units warrantied against repairs. Failure (of interest here) was relatively rare and driven by a different mode/mechanism than ordinary repair events (not of interest here). Approach: Data models for the service life based on a standard parametric lifetime distribution and a related limited failure population were developed. These models were used to develop expressions for the likelihood of the available data that properly account for information missing in the failure database. Results: A Bayesian approach was employed to obtain estimates of model parameters (with associated uncertainty) in order to investigate characteristics of the service-life distribution. Custom software was developed and is included as Supplemental Material to this case study. One part of a responsible approach to the original case was a simulation experiment used to validate the correctness of the software and the behavior of the statistical methodology before using its results in the application, and an example of such an experiment is included here. Because of confidentiality issues that prevent use of the original data, simulated data with characteristics like the manufacturer’s proprietary data are used to illustrate some aspects of our real analyses. Lastly, we also note that, although this case focuses on rare and complete product failure, the statistical methodology provided is directly applicable to more standard warranty data problems involving typically much larger warranty databases where entries are warranty claims (often for repairs) rather than reports of complete failures.
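
    A stripped-down version of the limited-failure-population setup described above can be simulated in a few lines: only a small fraction of produced units is susceptible to the rare failure mode, lifetimes are Weibull, and the susceptible fraction is back-estimated from the observed failure count. All parameters here are invented; this is not the authors' supplemental software:

```python
import math
import random

random.seed(0)

N_PRODUCED = 10_000      # known production count
P_SUSCEPTIBLE = 0.02     # hypothetical fraction subject to the rare failure mode
SHAPE, SCALE = 1.5, 8.0  # hypothetical Weibull lifetime parameters (years)
WINDOW = 5.0             # length of the field-observation window (years)

# Simulate which produced units fail inside the observation window.
failures = sum(
    1
    for _ in range(N_PRODUCED)
    if random.random() < P_SUSCEPTIBLE
    and random.weibullvariate(SCALE, SHAPE) <= WINDOW
)

# Point estimate of the susceptible fraction, correcting the raw failure
# count by the Weibull CDF evaluated at the window edge.
cdf_at_window = 1.0 - math.exp(-((WINDOW / SCALE) ** SHAPE))
p_hat = failures / (N_PRODUCED * cdf_at_window)
print(f"observed failures: {failures}, estimated susceptible fraction: {p_hat:.4f}")
```

    A full treatment would place priors on the shape, scale, and susceptible fraction and sample the posterior, as the Bayesian analysis in the case study does.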

  10. Estimating a Service-Life Distribution Based on Production Counts and a Failure Database

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ryan, Kenneth J.; Hamada, Michael Scott; Vardeman, Stephen B.

    A manufacturer wanted to compare the service-life distributions of two similar products. These concern product lifetimes after installation (not manufacture). For each product, there were available production counts and an imperfect database providing information on failing units. In the real case, these units were expensive repairable units warrantied against repairs. Failure (of interest here) was relatively rare and driven by a different mode/mechanism than ordinary repair events (not of interest here). Approach: Data models for the service life based on a standard parametric lifetime distribution and a related limited failure population were developed. These models were used to develop expressions for the likelihood of the available data that properly account for information missing in the failure database. Results: A Bayesian approach was employed to obtain estimates of model parameters (with associated uncertainty) in order to investigate characteristics of the service-life distribution. Custom software was developed and is included as Supplemental Material to this case study. One part of a responsible approach to the original case was a simulation experiment used to validate the correctness of the software and the behavior of the statistical methodology before using its results in the application, and an example of such an experiment is included here. Because of confidentiality issues that prevent use of the original data, simulated data with characteristics like the manufacturer’s proprietary data are used to illustrate some aspects of our real analyses. Lastly, we also note that, although this case focuses on rare and complete product failure, the statistical methodology provided is directly applicable to more standard warranty data problems involving typically much larger warranty databases where entries are warranty claims (often for repairs) rather than reports of complete failures.

  11. Test Platform for Advanced Digital Control of Brushless DC Motors (MSFC Center Director's Discretionary Fund)

    NASA Technical Reports Server (NTRS)

    Gwaltney, D. A.

    2002-01-01

    A FY 2001 Center Director's Discretionary Fund task to develop a test platform for the development, implementation, and evaluation of adaptive and other advanced control techniques for brushless DC (BLDC) motor-driven mechanisms is described. Important applications for BLDC motor-driven mechanisms are the translation of specimens in microgravity experiments and electromechanical actuation of nozzle and fuel valves in propulsion systems. Motor-driven aerocontrol surfaces are also being utilized in developmental X vehicles. The experimental test platform employs a linear translation stage that is mounted vertically and driven by a BLDC motor. Control approaches are implemented on a digital signal processor-based controller for real-time, closed-loop control of the stage carriage position. The goal of the effort is to explore the application of advanced control approaches that can enhance the performance of a motor-driven actuator over the performance obtained using linear control approaches with fixed gains. Adaptive controllers utilizing an exact model knowledge controller and a self-tuning controller are implemented and the control system performance is illustrated through the presentation of experimental results.
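
    For contrast with the adaptive schemes described above, the fixed-gain linear baseline they are compared against can be sketched as a proportional-derivative position loop on a toy motor-driven stage (gains and plant model are invented, not the MSFC hardware):

```python
# Fixed-gain PD position loop on a toy plant: unit inertia plus viscous
# drag (acc = u - 2*vel). An adaptive controller would instead update
# the gains online from an identified plant model.
def simulate_pd(kp, kd, setpoint=1.0, dt=0.001, steps=5000):
    pos, vel, prev_err = 0.0, 0.0, setpoint
    for _ in range(steps):
        err = setpoint - pos
        deriv = (err - prev_err) / dt   # discrete derivative of the error
        u = kp * err + kd * deriv       # motor command
        prev_err = err
        acc = u - 2.0 * vel             # toy second-order plant
        vel += acc * dt
        pos += vel * dt
    return pos

final = simulate_pd(kp=50.0, kd=5.0)
print(f"final carriage position: {final:.3f}")
```

    With these gains the closed loop is stable and well damped, so the carriage settles essentially on the setpoint; the point of the adaptive work above is to keep that behavior when the plant parameters drift.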

  12. Undergraduate Labs for Biological Physics: Brownian Motion and Optical Trapping

    NASA Astrophysics Data System (ADS)

    Chu, Kelvin; Laughney, A.; Williams, J.

    2006-12-01

    We describe a set of case-study driven labs for an upper-division biological physics course. These labs are motivated by case-studies and consist of inquiry-driven investigations of Brownian motion and optical-trapping experiments. Each lab incorporates two innovative educational techniques to drive the process and application aspects of scientific learning. Case studies are used to encourage students to think independently and apply the scientific method to a novel lab situation. Student input from this case study is then used to decide how to best do the measurement, guide the project and ultimately evaluate the success of the program. Where appropriate, visualization and simulation using VPython is used. Direct visualization of Brownian motion allows students to directly calculate Avogadro's number or the Boltzmann constant. Following case-study driven discussion, students use video microscopy to measure the motion of latex spheres in different viscosity fluids to arrive at a good approximation of NA or kB. Optical trapping (laser tweezer) experiments allow students to investigate the consequences of 100-pN forces on small particles. The case study consists of a discussion of the Boltzmann distribution and equipartition theorem followed by a consideration of the shape of the potential. Students can then use video capture to measure the distribution of bead positions to determine the shape and depth of the trap. This work was supported by NSF DUE-0536773.
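
    The Boltzmann-constant estimate in the video-microscopy lab above follows from the 2D Einstein relation MSD = 4*D*t plus Stokes-Einstein D = kB*T/(6*pi*eta*r). A sketch with simulated bead tracks standing in for the video data (bead and fluid parameters are invented):

```python
import math
import random

random.seed(1)

# Hypothetical experimental parameters (SI units).
T = 298.0          # temperature, K
ETA = 1.0e-3       # viscosity of water, Pa*s
R = 0.5e-6         # bead radius, m
DT = 0.1           # video frame interval, s
K_B_TRUE = 1.380649e-23

# Stokes-Einstein diffusion coefficient, used only to *generate* the
# synthetic tracks; the estimate below must recover k_B from the tracks.
D_TRUE = K_B_TRUE * T / (6 * math.pi * ETA * R)
step_sigma = math.sqrt(2 * D_TRUE * DT)   # per-axis displacement std dev

N_BEADS, N_STEPS = 200, 100
sq_disp = []
for _ in range(N_BEADS):
    x = y = 0.0
    for _ in range(N_STEPS):
        x += random.gauss(0.0, step_sigma)
        y += random.gauss(0.0, step_sigma)
    sq_disp.append(x * x + y * y)

# In 2D, MSD(t) = 4*D*t: invert for D, then for k_B via Stokes-Einstein.
msd = sum(sq_disp) / len(sq_disp)
d_est = msd / (4 * N_STEPS * DT)
k_b_est = 6 * math.pi * ETA * R * d_est / T
print(f"estimated k_B ~ {k_b_est:.2e} J/K")
```

    With a few hundred tracks the estimate lands within a few percent of 1.38e-23 J/K, which mirrors the accuracy students can reach with real video data.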

  13. Forecasting Chikungunya spread in the Americas via data-driven empirical approaches.

    PubMed

    Escobar, Luis E; Qiao, Huijie; Peterson, A Townsend

    2016-02-29

    Chikungunya virus (CHIKV) is endemic to Africa and Asia, but the Asian genotype invaded the Americas in 2013. The fast increase of human infections in the American epidemic emphasized the urgency of developing detailed predictions of case numbers and the potential geographic spread of this disease. We developed a simple model incorporating cases generated locally and cases imported from other countries, and forecasted transmission hotspots at the level of countries and at finer scales, in terms of ecological features. By late January 2015, >1.2 M CHIKV cases were reported from the Americas, with country-level prevalences between nil and more than 20 %. In the early stages of the epidemic, exponential growth in case numbers was common; later, however, poor and uneven reporting became more common, in a phenomenon we term "surveillance fatigue." Economic activity of countries was not associated with prevalence, but diverse social factors may be linked to surveillance effort and reporting. Our model predictions were initially quite inaccurate, but improved markedly as more data accumulated within the Americas. The data-driven methodology explored in this study provides an opportunity to generate descriptive and predictive information on spread of emerging diseases in the short-term under simple models based on open-access tools and data that can inform early-warning systems and public health intelligence.
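
    The model structure described above (cases generated locally plus cases imported from other countries) can be sketched as a one-line recursion with toy parameters, not the fitted model from the study:

```python
def forecast_cases(initial_cases, r_local, imports_per_step, steps):
    """Toy case-count recursion: next = r_local * current + imports.

    r_local > 1 reproduces the exponential-growth phase noted in the
    abstract; r_local < 1 gives imports-sustained low-level transmission.
    """
    cases = [float(initial_cases)]
    for _ in range(steps):
        cases.append(r_local * cases[-1] + imports_per_step)
    return cases

growth = forecast_cases(10, r_local=1.4, imports_per_step=5, steps=8)
print([round(c) for c in growth])
# → [10, 19, 32, 49, 74, 109, 157, 225, 320]
```

    The real forecasts additionally weight transmission hotspots by ecological suitability, but the local-plus-imported bookkeeping is the core of the country-level model.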

  14. Testing by artificial intelligence: computational alternatives to the determination of mutagenicity.

    PubMed

    Klopman, G; Rosenkranz, H S

    1992-08-01

    In order to develop methods for evaluating the predictive performance of computer-driven structure-activity methods (SAR) as well as to determine the limits of predictivity, we investigated the behavior of two Salmonella mutagenicity data bases: (a) a subset from the Genetox Program and (b) one from the U.S. National Toxicology Program (NTP). For molecules common to the two data bases, the experimental concordance was 76% when "marginals" were included and 81% when they were excluded. Three SAR methods were evaluated: CASE, MULTICASE and CASE/Graph Indices (CASE/GI). The programs "learned" the Genetox data base and used it to predict NTP molecules that were not present in the Genetox compilation. The concordances were 72, 80 and 47% respectively. Obviously, the MULTICASE version is superior and approaches the 85% interlaboratory variability observed for the Salmonella mutagenicity assays when the latter was carried out under carefully controlled conditions.

  15. Psychological therapy for psychogenic amnesia: Successful treatment in a single case study.

    PubMed

    Cassel, Anneli; Humphreys, Kate

    2016-01-01

    Psychogenic amnesia is widely understood to be a memory impairment of psychological origin that occurs as a response to severe stress. However, there is a paucity of evidence regarding the effectiveness of psychological therapy approaches in the treatment of this disorder. The current article describes a single case, "Ben", who was treated with formulation-driven psychological therapy using techniques drawn from cognitive behavioural therapy (CBT) and acceptance and commitment therapy (ACT) for psychogenic amnesia. Before treatment, Ben exhibited isolated retrograde and anterograde memory impairments. He received 12 therapy sessions that targeted experiential avoidance followed by two review sessions, six weeks and five months later. Ben's retrograde and anterograde memory impairments improved following therapy to return to within the "average" to "superior" ranges, which were maintained at follow-up. Further experimental single case study designs and larger group studies are required to advance the understanding of the effectiveness and efficacy of psychological therapy for psychogenic amnesia.

  16. Knowledge Driven Variable Selection (KDVS) – a new approach to enrichment analysis of gene signatures obtained from high–throughput data

    PubMed Central

    2013-01-01

    Background High–throughput (HT) technologies provide huge amount of gene expression data that can be used to identify biomarkers useful in the clinical practice. The most frequently used approaches first select a set of genes (i.e. gene signature) able to characterize differences between two or more phenotypical conditions, and then provide a functional assessment of the selected genes with an a posteriori enrichment analysis, based on biological knowledge. However, this approach comes with some drawbacks. First, gene selection procedure often requires tunable parameters that affect the outcome, typically producing many false hits. Second, a posteriori enrichment analysis is based on mapping between biological concepts and gene expression measurements, which is hard to compute because of constant changes in biological knowledge and genome analysis. Third, such mapping is typically used in the assessment of the coverage of gene signature by biological concepts, that is either score–based or requires tunable parameters as well, limiting its power. Results We present Knowledge Driven Variable Selection (KDVS), a framework that uses a priori biological knowledge in HT data analysis. The expression data matrix is transformed, according to prior knowledge, into smaller matrices, easier to analyze and to interpret from both computational and biological viewpoints. Therefore KDVS, unlike most approaches, does not exclude a priori any function or process potentially relevant for the biological question under investigation. Differently from the standard approach where gene selection and functional assessment are applied independently, KDVS embeds these two steps into a unified statistical framework, decreasing the variability derived from the threshold–dependent selection, the mapping to the biological concepts, and the signature coverage. We present three case studies to assess the usefulness of the method. 
Conclusions We showed that KDVS not only enables the selection of known biological functionalities with accuracy, but also the identification of new ones. An efficient implementation of KDVS was devised to obtain results in a fast and robust way. Computing time is drastically reduced by the effective use of distributed resources. Finally, integrated visualization techniques immediately increase the interpretability of results. Overall, the KDVS approach can be considered as a viable alternative to enrichment–based approaches. PMID:23302187

  17. t4 report

    PubMed Central

    Zhu, Hao; Bouhifd, Mounir; Kleinstreuer, Nicole; Kroese, E. Dinant; Liu, Zhichao; Luechtefeld, Thomas; Pamies, David; Shen, Jie; Strauss, Volker; Wu, Shengde; Hartung, Thomas

    2016-01-01

    Summary Read-across, i.e. filling toxicological data gaps by relating to similar chemicals, for which test data are available, is usually done based on chemical similarity. Besides structure and physico-chemical properties, however, biological similarity based on biological data adds extra strength to this process. In the context of developing Good Read-Across Practice guidance, a number of case studies were evaluated to demonstrate the use of biological data to enrich read-across. In the simplest case, chemically similar substances also show similar test results in relevant in vitro assays. This is a well-established method for the read-across of e.g. genotoxicity assays. Larger datasets of biological and toxicological properties of hundreds and thousands of substances become increasingly available enabling big data approaches in read-across studies. Several case studies using various big data sources are described in this paper. An example is given for the US EPA’s ToxCast dataset allowing read-across for high quality uterotrophic assays for estrogenic endocrine disruption. Similarly, an example for REACH registration data enhancing read-across for acute toxicity studies is given. A different approach is taken using omics data to establish biological similarity: Examples are given for stem cell models in vitro and short-term repeated dose studies in rats in vivo to support read-across and category formation. These preliminary biological data-driven read-across studies highlight the road to the new generation of read-across approaches that can be applied in chemical safety assessment. PMID:26863516

  18. Dynamics of a coherently driven micromaser by the Monte Carlo wavefunction approach

    NASA Astrophysics Data System (ADS)

    Bonacina, L.; Casagrande, F.; Lulli, A.

    2000-08-01

    Using a Monte Carlo wavefunction approach we investigate the dynamics of a micromaser driven by a resonant coherent field. At steady state, for increasing interaction times, the system exhibits driven Rabi oscillations, followed by collapse as the range of micromaser trapping states is approached. The system operates in regimes ranging from a strong to a weak amplifier. In the strong-amplifier regime the cavity mode shows a preferred phase and can exhibit quadrature squeezing and sub-Poissonian photon statistics. In the weak-amplifier regime the cavity mode has no preferred phase, is super-Poissonian and is influenced by trapping effects; no revival of Rabi oscillations occurs. The main predictions can be compared with experimental measurements on the populations of atoms leaving the cavity.

  19. Learning from Learners: A Non-Standard Direct Approach to the Teaching of Writing Skills in EFL in a University Context

    ERIC Educational Resources Information Center

    Fuster-Márquez, Miguel; Gregori-Signes, Carmen

    2018-01-01

    Corpora have been used in English as a foreign language materials for decades, and native corpora have been present in the classroom by means of direct approaches such as Data-Driven Learning (Johns, T., and P. King 1991. "'Should you be Persuaded'- Two Samples of Data-Driven Learning Materials." In "Classroom Concordancing,"…

  20. The Healthy Start Initiative: A Community-Driven Approach to Infant Mortality Reduction. Volume V: Collaboration with Managed Care Organizations.

    ERIC Educational Resources Information Center

    Joffe, Mark S.; Back, Kelli

    The Healthy Start Initiative is a national 5-year demonstration program that uses a broad range of community-driven, system development approaches to reduce infant mortality and improve the health and well-being of women, infants, children, and families. This volume, fifth in the series, deals with the topic of collaborating with managed care…

  1. A novel quantification-driven proteomic strategy identifies an endogenous peptide of pleiotrophin as a new biomarker of Alzheimer's disease.

    PubMed

    Skillbäck, Tobias; Mattsson, Niklas; Hansson, Karl; Mirgorodskaya, Ekaterina; Dahlén, Rahil; van der Flier, Wiesje; Scheltens, Philip; Duits, Floor; Hansson, Oskar; Teunissen, Charlotte; Blennow, Kaj; Zetterberg, Henrik; Gobom, Johan

    2017-10-17

    We present a new, quantification-driven proteomic approach to identifying biomarkers. In contrast to the identification-driven approach, limited in scope to peptides that are identified by database searching in the first step, all MS data are considered to select biomarker candidates. The endopeptidome of cerebrospinal fluid from 40 Alzheimer's disease (AD) patients, 40 subjects with mild cognitive impairment, and 40 controls with subjective cognitive decline was analyzed using multiplex isobaric labeling. Spectral clustering was used to match MS/MS spectra. The top biomarker candidate cluster (215% higher in AD compared to controls, area under ROC curve = 0.96) was identified as a fragment of pleiotrophin located near the protein's C-terminus. Analysis of another cohort (n = 60 over four clinical groups) verified that the biomarker was increased in AD patients, while no change was observed in controls or in Parkinson's disease or progressive supranuclear palsy. The identification of the novel biomarker pleiotrophin 151-166 demonstrates that our quantification-driven proteomic approach is a promising method for biomarker discovery, which may be universally applicable in clinical proteomics.

  2. Rethinking the cost of healthcare in low-resource settings: the value of time-driven activity-based costing

    PubMed Central

    McBain, Ryan K; Jerome, Gregory; Warsh, Jonathan; Browning, Micaela; Mistry, Bipin; Faure, Peterson Abnis I; Pierre, Claire; Fang, Anna P; Mugunga, Jean Claude; Rhatigan, Joseph; Leandre, Fernet; Kaplan, Robert

    2016-01-01

    Low-income and middle-income countries account for over 80% of the world's infectious disease burden, but <20% of global expenditures on health. In this context, judicious resource allocation can mean the difference between life and death, not just for individual patients, but entire patient populations. Understanding the cost of healthcare delivery is a prerequisite for allocating health resources, such as staff and medicines, in a way that is effective, efficient, just and fair. Nevertheless, health costs are often poorly understood, undermining effectiveness and efficiency of service delivery. We outline shortcomings, and consequences, of common approaches to estimating the cost of healthcare in low-resource settings, as well as advantages of a newly introduced approach in healthcare known as time-driven activity-based costing (TDABC). TDABC is a patient-centred approach to cost analysis, meaning that it begins by studying the flow of individual patients through the health system, and measuring the human, equipment and facility resources used to treat the patients. The benefits of this approach are numerous: fewer assumptions need to be made, heterogeneity in expenditures can be studied, service delivery can be modelled and streamlined and stronger linkages can be established between resource allocation and health outcomes. TDABC has demonstrated significant benefits for improving health service delivery in high-income countries but has yet to be adopted in resource-limited settings. We provide an illustrative case study of its application throughout a network of hospitals in Haiti, as well as a simplified framework for policymakers to apply this approach in low-resource settings around the world. PMID:28588971
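
    The arithmetic at the heart of TDABC as described above is a sum, over the resources a patient uses, of capacity cost rate times time consumed. A minimal sketch with invented numbers:

```python
# Time-driven activity-based costing: the cost of a care episode is the
# sum over resources of (capacity cost rate) x (minutes of use).
def capacity_cost_rate(cost_per_period, available_minutes):
    """Cost per minute of keeping the resource available."""
    return cost_per_period / available_minutes

# Hypothetical monthly resource costs and practical capacities (minutes).
resources = {
    "nurse":     capacity_cost_rate(3_000.0, 9_000),
    "physician": capacity_cost_rate(10_000.0, 8_000),
    "exam_room": capacity_cost_rate(1_200.0, 12_000),
}

# Minutes each resource is used in one (hypothetical) patient visit,
# taken from mapping the patient's flow through the facility.
visit = {"nurse": 15, "physician": 10, "exam_room": 30}

episode_cost = sum(resources[r] * minutes for r, minutes in visit.items())
print(f"cost per visit: ${episode_cost:.2f}")
# → cost per visit: $20.50
```

    Because the inputs are the patient's observed path and resource times, heterogeneity across patients falls out directly, which is the advantage the abstract highlights over top-down cost allocation.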

  3. Rethinking the cost of healthcare in low-resource settings: the value of time-driven activity-based costing.

    PubMed

    McBain, Ryan K; Jerome, Gregory; Warsh, Jonathan; Browning, Micaela; Mistry, Bipin; Faure, Peterson Abnis I; Pierre, Claire; Fang, Anna P; Mugunga, Jean Claude; Rhatigan, Joseph; Leandre, Fernet; Kaplan, Robert

    2016-01-01

    Low-income and middle-income countries account for over 80% of the world's infectious disease burden, but <20% of global expenditures on health. In this context, judicious resource allocation can mean the difference between life and death, not just for individual patients, but entire patient populations. Understanding the cost of healthcare delivery is a prerequisite for allocating health resources, such as staff and medicines, in a way that is effective, efficient, just and fair. Nevertheless, health costs are often poorly understood, undermining effectiveness and efficiency of service delivery. We outline shortcomings, and consequences, of common approaches to estimating the cost of healthcare in low-resource settings, as well as advantages of a newly introduced approach in healthcare known as time-driven activity-based costing (TDABC). TDABC is a patient-centred approach to cost analysis, meaning that it begins by studying the flow of individual patients through the health system, and measuring the human, equipment and facility resources used to treat the patients. The benefits of this approach are numerous: fewer assumptions need to be made, heterogeneity in expenditures can be studied, service delivery can be modelled and streamlined and stronger linkages can be established between resource allocation and health outcomes. TDABC has demonstrated significant benefits for improving health service delivery in high-income countries but has yet to be adopted in resource-limited settings. We provide an illustrative case study of its application throughout a network of hospitals in Haiti, as well as a simplified framework for policymakers to apply this approach in low-resource settings around the world.

  4. Effectiveness and Usability of the Sensory Processing Measure-Preschool Quick Tips: Data-Driven Intervention Following the Use of the SPM-Preschool in an Early Childhood, Multiple-Case Study

    ERIC Educational Resources Information Center

    Olson, Carol H.; Henry, Diana A.; Kliner, Ashley Peck; Kyllo, Alissa; Richter, Chelsea Munson; Charley, Jane; Whitcher, Meagan Chapman; Reinke, Katherine Roth; Tysver, Chelsay Horner; Wagner, Lacey; Walworth, Jessica

    2016-01-01

    This pre- and posttest multiple-case study examined the effectiveness and usability of the Sensory Processing Measure-Preschool Quick Tips (SPM-P QT) by key stakeholders (parents and teachers) for implementing data-driven intervention to address sensory processing challenges. The Sensory Processing Measure-Preschool (SPM-P) was administered as an…

  5. Numerical simulation and comparison of two ventilation methods for a restaurant - displacement vs mixed flow ventilation

    NASA Astrophysics Data System (ADS)

    Chitaru, George; Berville, Charles; Dogeanu, Angel

    2018-02-01

    This paper presents a comparison between a displacement ventilation method and a mixed flow ventilation method using a computational fluid dynamics (CFD) approach. The paper analyses different aspects of the two systems, like the draft effect in certain areas, the air temperature and velocity distribution in the occupied zone. The results highlighted that the displacement ventilation system presents an advantage for the current scenario, due to the increased buoyancy-driven flows caused by the interior heat sources. For the displacement ventilation case the draft effect was less prone to appear in the occupied zone, but the high heat emissions from the interior sources have increased the temperature gradient in the occupied zone. Both systems have been studied in similar conditions, concentrating only on the flow patterns for each case.

  6. Electric-field-driven electron-transfer in mixed-valence molecules.

    PubMed

    Blair, Enrique P; Corcelli, Steven A; Lent, Craig S

    2016-07-07

    Molecular quantum-dot cellular automata is a computing paradigm in which digital information is encoded by the charge configuration of a mixed-valence molecule. General-purpose computing can be achieved by arranging these compounds on a substrate and exploiting intermolecular Coulombic coupling. The operation of such a device relies on nonequilibrium electron transfer (ET), whereby the time-varying electric field of one molecule induces an ET event in a neighboring molecule. The magnitude of the electric fields can be quite large because of close spatial proximity, and the induced ET rate is a measure of the nonequilibrium response of the molecule. We calculate the electric-field-driven ET rate for a model mixed-valence compound. The mixed-valence molecule is regarded as a two-state electronic system coupled to a molecular vibrational mode, which is, in turn, coupled to a thermal environment. Both the electronic and vibrational degrees-of-freedom are treated quantum mechanically, and the dissipative vibrational-bath interaction is modeled with the Lindblad equation. This approach captures both tunneling and nonadiabatic dynamics. Relationships between microscopic molecular properties and the driven ET rate are explored for two time-dependent applied fields: an abruptly switched field and a linearly ramped field. In both cases, the driven ET rate is only weakly temperature dependent. When the model is applied using parameters appropriate to a specific mixed-valence molecule, diferrocenylacetylene, terahertz-range ET transfer rates are predicted.
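
    The master-equation machinery named above (a two-state system with a dissipative bath modeled by the Lindblad equation) can be illustrated with a minimal driven two-level toy integrated by Euler steps. This is a generic sketch of the Lindblad formalism, not the paper's vibronic model, and all parameters are arbitrary:

```python
# Euler integration of the Lindblad master equation for a resonantly
# driven, dissipative two-level system. Basis order: [ground, excited].

def mul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def add(a, b, s=1.0):
    return [[a[i][j] + s * b[i][j] for j in range(2)] for i in range(2)]

def dag(a):
    return [[a[j][i].conjugate() for j in range(2)] for i in range(2)]

OMEGA, GAMMA, DT, STEPS = 1.0, 0.2, 0.002, 25_000

H = [[0.0, OMEGA / 2], [OMEGA / 2, 0.0]]   # resonant coherent drive
L = [[0.0, 1.0], [0.0, 0.0]]               # sigma_minus: decay operator
LdL = mul(dag(L), L)

rho = [[1.0 + 0j, 0j], [0j, 0j]]           # start in the ground state
for _ in range(STEPS):
    comm = add(mul(H, rho), mul(rho, H), s=-1.0)        # [H, rho]
    anti = add(mul(LdL, rho), mul(rho, LdL))            # {L^dag L, rho}
    diss = add(mul(mul(L, rho), dag(L)), anti, s=-0.5)  # Lindblad dissipator
    drho = add([[-1j * comm[i][j] for j in range(2)] for i in range(2)],
               diss, s=GAMMA)
    rho = add(rho, drho, s=DT)

p_excited = rho[1][1].real
print(f"steady-state excited population ~ {p_excited:.2f}")
```

    For a resonant drive the excited population settles near (OMEGA^2/4)/(OMEGA^2/2 + GAMMA^2/4), about 0.49 for these parameters, and the trace of rho stays 1 throughout, which is a quick correctness check on any Lindblad integrator.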

  7. Architectural Strategies for Enabling Data-Driven Science at Scale

    NASA Astrophysics Data System (ADS)

    Crichton, D. J.; Law, E. S.; Doyle, R. J.; Little, M. M.

    2017-12-01

    The analysis of large data collections from NASA or other agencies is often executed through traditional computational and data analysis approaches, which require users to bring data to their desktops and perform local data analysis. Alternatively, data are hauled to large computational environments that provide centralized data analysis via traditional High Performance Computing (HPC). Scientific data archives, however, are not only growing massive, but are also becoming highly distributed. Neither traditional approach provides a good solution for optimizing analysis into the future. Assumptions across the NASA mission and science data lifecycle, which historically assume that all data can be collected, transmitted, processed, and archived, will not scale as more capable instruments stress legacy-based systems. New paradigms are needed to increase the productivity and effectiveness of scientific data analysis. This paradigm must recognize that architectural and analytical choices are interrelated, and must be carefully coordinated in any system that aims to allow efficient, interactive scientific exploration and discovery to exploit massive data collections, from point of collection (e.g., onboard) to analysis and decision support. The most effective approach to analyzing a distributed set of massive data may involve some exploration and iteration, putting a premium on the flexibility afforded by the architectural framework. The framework should enable scientist users to assemble workflows efficiently, manage the uncertainties related to data analysis and inference, and optimize deep-dive analytics to enhance scalability. In many cases, this "data ecosystem" needs to be able to integrate multiple observing assets, ground environments, archives, and analytics, evolving from stewardship of measurements of data to using computational methodologies to better derive insight from the data that may be fused with other sets of data. 
This presentation will discuss architectural strategies, including a 2015-2016 NASA AIST Study on Big Data, for evolving scientific research towards massively distributed data-driven discovery. It will include example use cases across earth science, planetary science, and other disciplines.

  8. Empirical Performance Model-Driven Data Layout Optimization and Library Call Selection for Tensor Contraction Expressions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, Qingda; Gao, Xiaoyang; Krishnamoorthy, Sriram

    Empirical optimizers like ATLAS have been very effective in optimizing computational kernels in libraries. The best choice of parameters such as tile size and degree of loop unrolling is determined by executing different versions of the computation. In contrast, optimizing compilers use a model-driven approach to program transformation. While the model-driven approach of optimizing compilers is generally orders of magnitude faster than ATLAS-like library generators, its effectiveness can be limited by the accuracy of the performance models used. In this paper, we describe an approach where a class of computations is modeled in terms of constituent operations that are empirically measured, thereby allowing modeling of the overall execution time. The performance model with empirically determined cost components is used to perform data layout optimization together with the selection of library calls and layout transformations in the context of the Tensor Contraction Engine, a compiler for a high-level domain-specific language for expressing computational models in quantum chemistry. The effectiveness of the approach is demonstrated through experimental measurements on representative computations from quantum chemistry.
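
The measurement-driven selection described in this abstract can be illustrated with a small sketch: candidate implementations of the same computation are timed on representative input, and the empirically cheapest one is selected. The candidates and names below are invented for illustration and are not the Tensor Contraction Engine's actual code generator.

```python
# Empirical selection among candidate implementations: time each variant on
# representative input, pick the fastest (hypothetical toy candidates).
import timeit

def contract_loops(a, b, n):
    # naive triple loop over a row-major layout
    c = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            s = 0.0
            for k in range(n):
                s += a[i][k] * b[k][j]
            c[i][j] = s
    return c

def contract_transposed(a, b, n):
    # same product after a layout transformation: transpose b first
    bt = [[b[k][j] for k in range(n)] for j in range(n)]
    return [[sum(a[i][k] * bt[j][k] for k in range(n)) for j in range(n)]
            for i in range(n)]

def select_variant(variants, a, b, n, reps=3):
    """Empirically time each candidate; return the fastest one's name."""
    costs = {name: min(timeit.repeat(lambda f=f: f(a, b, n),
                                     number=1, repeat=reps))
             for name, f in variants.items()}
    return min(costs, key=costs.get), costs

n = 30
a = [[float(i + j) for j in range(n)] for i in range(n)]
b = [[float(i - j) for j in range(n)] for i in range(n)]
best, costs = select_variant(
    {"loops": contract_loops, "transposed": contract_transposed}, a, b, n)
print(best in costs)  # True: the chosen variant is one of the measured ones
```

The point of the empirical step is that the relative cost of the two layouts depends on the machine, so it is measured rather than predicted by a closed-form model.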

  9. Organizational impact of evidence-informed decision making training initiatives: a case study comparison of two approaches

    PubMed Central

    2014-01-01

    Background The impact of efforts by healthcare organizations to enhance the use of evidence to improve organizational processes through training programs has seldom been assessed. We therefore endeavored to assess whether and how the training of mid- and senior-level healthcare managers could lead to organizational change. Methods We conducted a theory-driven evaluation of the organizational impact of healthcare leaders’ participation in two training programs using a logic model based on Nonaka’s theory of knowledge conversion. We analyzed six case studies nested within the two programs using three embedded units of analysis (individual, group and organization). Interviews were conducted during intensive one-week data collection site visits. A total of 84 people were interviewed. Results We found that the impact of training could primarily be felt in trainees’ immediate work environments. The conversion of attitudes was found to be easier to achieve than the conversion of skills. Our results show that, although socialization and externalization were common in all cases, a lack of combination impeded the conversion of skills. We also identified several individual, organizational and program design factors that facilitated and/or impeded the dissemination of the attitudes and skills gained by trainees to other organizational members. Conclusions Our theory-driven evaluation showed that factors before, during and after training can influence the extent of skills and knowledge transfer. Our evaluation went further than previous research by revealing the influence—both positive and negative—of specific organizational factors on extending the impact of training programs. PMID:24885800

  10. Context-sensitive network-based disease genetics prediction and its implications in drug discovery

    PubMed Central

    Chen, Yang; Xu, Rong

    2017-01-01

    Motivation: Disease phenotype networks play an important role in computational approaches to identifying new disease-gene associations. Current disease phenotype networks often model disease relationships based on pairwise similarities, and therefore ignore the specific context in which two diseases are connected. In this study, we propose a new strategy to model disease associations using context-sensitive networks (CSNs). We developed a CSN-based phenome-driven approach for disease genetics prediction, and investigated the translational potential of the predicted genes in drug discovery. Results: We constructed CSNs by directly connecting diseases with associated phenotypes. Here, we constructed two CSNs using different data sources; the two networks contain 26 790 and 13 822 nodes respectively. We integrated the CSNs with a genetic functional relationship network and predicted disease genes using a network-based ranking algorithm. For comparison, we built Similarity-Based disease Networks (SBN) using the same disease phenotype data. In a de novo cross validation for 3324 diseases, the CSN-based approach significantly increased the average rank from top 12.6 to top 8.8% for all tested genes compared with the SBN-based approach (p
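
A network-based ranking algorithm of the kind mentioned here is often realized as a random walk with restart (RWR) from the disease node. The toy graph, node names, and parameters below are invented, not the paper's CSN data; this is only a sketch of the ranking idea.

```python
# Random walk with restart on a small disease-phenotype-gene graph:
# genes better connected to the disease's phenotypes accumulate more
# stationary probability and rank higher.
def rwr(neighbors, seeds, restart=0.3, iters=200):
    nodes = list(neighbors)
    p = {n: (1.0 / len(seeds) if n in seeds else 0.0) for n in nodes}
    for _ in range(iters):
        q = {n: 0.0 for n in nodes}
        for n in nodes:
            deg = len(neighbors[n])
            for m in neighbors[n]:
                q[m] += (1.0 - restart) * p[n] / deg  # spread along edges
        for s in seeds:
            q[s] += restart / len(seeds)              # teleport to seeds
        p = q
    return p

# hypothetical context-sensitive graph: disease linked to phenotypes,
# phenotypes linked to candidate genes
neighbors = {
    "disease": ["phen1", "phen2"],
    "phen1": ["disease", "geneA"],
    "phen2": ["disease", "geneA", "geneB"],
    "geneA": ["phen1", "phen2"],
    "geneB": ["phen2"],
}
scores = rwr(neighbors, seeds=["disease"])
ranked = sorted(["geneA", "geneB"], key=scores.get, reverse=True)
print(ranked[0])  # geneA: it is linked to both of the disease's phenotypes
```

Because the walk conserves probability, the scores can be compared directly across candidate genes.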

  11. Reverse engineering biomolecular systems using -omic data: challenges, progress and opportunities.

    PubMed

    Quo, Chang F; Kaddi, Chanchala; Phan, John H; Zollanvari, Amin; Xu, Mingqing; Wang, May D; Alterovitz, Gil

    2012-07-01

    Recent advances in high-throughput biotechnologies have led to the rapid growing research interest in reverse engineering of biomolecular systems (REBMS). 'Data-driven' approaches, i.e. data mining, can be used to extract patterns from large volumes of biochemical data at molecular-level resolution while 'design-driven' approaches, i.e. systems modeling, can be used to simulate emergent system properties. Consequently, both data- and design-driven approaches applied to -omic data may lead to novel insights in reverse engineering biological systems that could not be expected before using low-throughput platforms. However, there exist several challenges in this fast growing field of reverse engineering biomolecular systems: (i) to integrate heterogeneous biochemical data for data mining, (ii) to combine top-down and bottom-up approaches for systems modeling and (iii) to validate system models experimentally. In addition to reviewing progress made by the community and opportunities encountered in addressing these challenges, we explore the emerging field of synthetic biology, which is an exciting approach to validate and analyze theoretical system models directly through experimental synthesis, i.e. analysis-by-synthesis. The ultimate goal is to address the present and future challenges in reverse engineering biomolecular systems (REBMS) using integrated workflow of data mining, systems modeling and synthetic biology.

  12. Spatially adapted augmentation of age-specific atlas-based segmentation using patch-based priors

    NASA Astrophysics Data System (ADS)

    Liu, Mengyuan; Seshamani, Sharmishtaa; Harrylock, Lisa; Kitsch, Averi; Miller, Steven; Chau, Van; Poskitt, Kenneth; Rousseau, Francois; Studholme, Colin

    2014-03-01

    One of the most common approaches to MRI brain tissue segmentation is to employ an atlas prior to initialize an Expectation-Maximization (EM) image labeling scheme using a statistical model of MRI intensities. This prior is commonly derived from a set of manually segmented training data from the population of interest. However, in cases where subject anatomy varies significantly from the prior anatomical average model (for example in the case where extreme developmental abnormalities or brain injuries occur), the prior tissue map does not provide adequate information about the observed MRI intensities to ensure the EM algorithm converges to an anatomically accurate labeling of the MRI. In this paper, we present a novel approach for automatic segmentation of such cases. This approach augments the atlas-based EM segmentation by exploring methods to build a hybrid tissue segmentation scheme that seeks to learn where an atlas prior fails (due to inadequate representation of anatomical variation in the statistical atlas) and utilize an alternative prior derived from a patch-driven search of the atlas data. We describe a framework for incorporating this patch-based augmentation of EM (PBAEM) into a 4D age-specific atlas-based segmentation of developing brain anatomy. The proposed approach was evaluated on a set of MRI brain scans of premature neonates with ages ranging from 27.29 to 46.43 gestational weeks (GWs). Results indicated superior performance compared to the conventional atlas-based segmentation method, providing improved segmentation accuracy for gray matter, white matter, ventricles and sulcal CSF regions.
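
The hybrid-prior idea can be sketched at the level of a single voxel: where the atlas prior is judged unreliable, an alternative (e.g. patch-derived) prior is blended in before the EM-style posterior update. The intensities, Gaussian parameters, and the blending weight below are illustrative only, not the PBAEM implementation.

```python
# One posterior update for a voxel, blending an atlas prior with a
# patch-derived prior according to a spatial confidence weight alpha.
import math

def gauss(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def posterior(intensity, atlas_prior, patch_prior, alpha, params):
    """alpha in [0, 1] is the confidence in the atlas prior at this voxel."""
    post = {}
    for tissue, (mu, sigma) in params.items():
        prior = alpha * atlas_prior[tissue] + (1.0 - alpha) * patch_prior[tissue]
        post[tissue] = prior * gauss(intensity, mu, sigma)
    z = sum(post.values())
    return {t: v / z for t, v in post.items()}

params = {"gm": (60.0, 10.0), "wm": (90.0, 10.0)}  # per-tissue intensity model
atlas = {"gm": 0.9, "wm": 0.1}   # atlas insists on grey matter here
patch = {"gm": 0.2, "wm": 0.8}   # patch search suggests white matter
# the observed intensity looks like white matter; low atlas confidence
# lets the patch prior dominate and flip the label
p = posterior(88.0, atlas, patch, alpha=0.1, params=params)
print(max(p, key=p.get))  # wm
```

With alpha near 1 the same voxel would be labeled according to the atlas, which is the failure mode the paper's learned confidence map is designed to avoid.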

  13. Understanding Whole Systems Change in Health Care: Insights into System Level Diffusion from Nursing Service Delivery Innovations--A Multiple Case Study

    ERIC Educational Resources Information Center

    Berta, Whitney; Virani, Tazim; Bajnok, Irmajean; Edwards, Nancy; Rowan, Margo

    2014-01-01

    Our study responds to calls for theory-driven approaches to studying innovation diffusion processes in health care. While most research on diffusion in health care is situated at the service delivery level, we study innovations and associated processes that have diffused to the system level, and refer to work on complex adaptive systems and whole…

  14. The temporal evolution of the resistive pressure-gradient-driven turbulence and anomalous transport in shear flow across the magnetic field

    NASA Astrophysics Data System (ADS)

    Lee, Hae June; Mikhailenko, Vladimir; Mikhailenko, Vladimir

    2017-10-01

    The temporal evolution of the resistive pressure-gradient-driven mode in sheared flow is investigated by employing the shearing-modes approach. It reveals an essential difference between the processes that occur in flows whose velocity shearing rate is less than the growth rate of the instability in steady plasmas, and those in flows whose velocity shear exceeds that growth rate. This clarifies the physical content of the empirical ``quench rule'', which predicts suppression of the turbulence in sheared flows once the velocity shearing rate becomes larger than the maximum growth rate of the possible instability. We found that the distortion of the perturbations by a sheared flow with such velocity shear introduces time dependencies into the governing equations, which prohibits the application of the eigenmode formalism and requires the solution of an initial value problem.
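
The empirical "quench rule" invoked in this abstract is commonly stated as a threshold condition; schematically, using conventional symbols rather than notation from the paper:

```latex
% Quench rule: sheared-flow suppression of the turbulence once the velocity
% shearing rate \omega_s exceeds the maximum linear growth rate \gamma_{max}
% that the instability would have in a steady (shear-free) plasma
\omega_{s} \;>\; \gamma_{\max}
```

The paper's point is that in this regime the shear makes the problem explicitly time-dependent, so the condition marks where eigenmode analysis stops being applicable.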

  15. Floating rGO-based black membranes for solar driven sterilization.

    PubMed

    Zhang, Yao; Zhao, Dengwu; Yu, Fan; Yang, Chao; Lou, Jinwei; Liu, Yanming; Chen, Yingying; Wang, Zhongyong; Tao, Peng; Shang, Wen; Wu, Jianbo; Song, Chengyi; Deng, Tao

    2017-12-14

    This paper presents a new steam sterilization approach that uses a solar-driven evaporation system at the water/air interface. Compared to the conventional solar autoclave, this new steam sterilization approach via interfacial evaporation requires no complex system design to bear high steam pressure. In such a system, a reduced graphene oxide/polytetrafluoroethylene composite membrane floating at the water/air interface serves as a light-to-heat conversion medium to harvest and convert incident solar light into localized heat. Such localized heat raises the temperature of the membrane substantially and helps generate steam with a temperature higher than 120 °C. A sterilization device that takes advantage of the interfacial solar-driven evaporation system was built and its successful sterilization capability was demonstrated through both chemical and biological sterilization tests. The interfacial evaporation-based solar driven sterilization approach offers a potential low cost solution to meet the need for sterilization in undeveloped areas that lack electrical power but have ample solar radiation.

  16. An Hypothesis-Driven, Molecular Phylogenetics Exercise for College Biology Students

    ERIC Educational Resources Information Center

    Parker, Joel D.; Ziemba, Robert E.; Cahan, Sara Helms; Rissing, Steven W.

    2004-01-01

    This hypothesis-driven laboratory exercise teaches how DNA evidence can be used to investigate an organism's evolutionary history while providing practical modeling of the fundamental processes of gene transcription and translation. We used an inquiry-based approach to construct a laboratory around a nontrivial, open-ended evolutionary question…

  17. Using exposure bands for rapid decision making in the ...

    EPA Pesticide Factsheets

    The ILSI Health and Environmental Sciences Institute (HESI) Risk Assessment in the 21st Century (RISK21) project was initiated to address and catalyze improvements in human health risk assessment. RISK21 is a problem formulation-based conceptual roadmap and risk matrix visualization tool, facilitating transparent evaluation of both hazard and exposure components. The RISK21 roadmap is exposure-driven, i.e. exposure is used as the second step (after problem formulation) to define and focus the assessment. This paper describes the exposure tiers of the RISK21 matrix and the approaches to adapting readily available information to more quickly inform exposure at a screening level. In particular, exposure look-up tables developed from available exposure tools (the European Centre for Ecotoxicology and Toxicology of Chemicals (ECETOC) Targeted Risk Assessment (TRA) for worker exposure, the ECETOC TRA and the European Solvents Industry Group (ESIG) Generic Exposure Scenario (GES) Risk and Exposure Tool (EGRET) for consumer exposure, and USEtox for indirect exposure to humans via the environment) were tested in a hypothetical mosquito bed netting case study. A detailed WHO risk assessment for a similar mosquito net use served as a benchmark for the performance of the RISK21 approach. The case study demonstrated that the screening methodologies provided suitably conservative exposure estimates for risk assessment. The results of this effort showed that the RISK21 approach is useful f
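
The look-up-table idea can be sketched in a few lines: a screening-level assessment reads a deliberately conservative exposure estimate from a pre-computed band table and compares it against a hazard threshold. The bands, routes, and numbers below are invented for illustration and are not RISK21 values.

```python
# Screening with exposure bands: look up a conservative upper-bound exposure
# and decide whether the assessment can stop or must be refined.
BANDS = {  # hypothetical conservative exposure bands, mg/kg-bw/day
    ("consumer", "dermal"): 0.5,
    ("consumer", "inhalation"): 0.1,
    ("worker", "dermal"): 2.0,
}

def screen(population, route, hazard_threshold):
    """Return 'refine' when the banded exposure reaches the threshold."""
    exposure = BANDS[(population, route)]
    return "refine" if exposure >= hazard_threshold else "low concern"

print(screen("consumer", "inhalation", hazard_threshold=1.0))  # low concern
print(screen("worker", "dermal", hazard_threshold=1.0))        # refine
```

Because the banded value is an upper bound, a "low concern" result is protective, while "refine" only means a higher-tier estimate is needed, not that risk exists.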

  18. Thermodynamics of a periodically driven qubit

    NASA Astrophysics Data System (ADS)

    Donvil, Brecht

    2018-04-01

    We present a new approach to the open-system dynamics of a periodically driven qubit in contact with a thermal bath. We are specifically interested in the thermodynamics of the qubit. It is well known that by combining the Markovian approximation with Floquet theory it is possible to derive a stochastic Schrödinger equation for the state of the qubit. We follow here a different approach. We use Floquet theory to embed the time-nonautonomous qubit dynamics into time-autonomous yet infinite-dimensional dynamics. We refer to the resulting infinite-dimensional system as the dressed qubit. Using the Markovian approximation we derive the stochastic Schrödinger equation for the dressed qubit. The advantage of our approach is that the jump operators are ladder operators of the Hamiltonian. This simplifies the formulation of the thermodynamics. We use the thermodynamics of the infinite-dimensional system to recover the thermodynamical description of the driven qubit. We compare our results with the existing literature and recover the known results.
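
The Floquet embedding sketched in this abstract replaces the T-periodic Hamiltonian by a time-independent block operator on an enlarged space. Schematically, in standard Floquet notation (assumed, not quoted from the paper), with H(t) expanded in Fourier modes H(t) = \sum_n H^{(n)} e^{i n \omega t} and \omega = 2\pi/T:

```latex
% Time-independent Floquet Hamiltonian acting on the enlarged space
% (system space tensored with the space of Fourier mode indices n):
\left(\mathcal{H}_F\right)_{nm} \;=\; H^{(n-m)} \;-\; n\,\hbar\omega\,\delta_{nm}
```

Diagonalizing this operator yields the ladder structure that makes the jump operators of the dressed system simple to state.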

  19. Model-Driven Safety Analysis of Closed-Loop Medical Systems

    PubMed Central

    Pajic, Miroslav; Mangharam, Rahul; Sokolsky, Oleg; Arney, David; Goldman, Julian; Lee, Insup

    2013-01-01

    In modern hospitals, patients are treated using a wide array of medical devices that are increasingly interacting with each other over the network, thus offering a perfect example of a cyber-physical system. We study the safety of a medical device system for the physiologic closed-loop control of drug infusion. The main contribution of the paper is the verification approach for the safety properties of closed-loop medical device systems. We demonstrate, using a case study, that the approach can be applied to a system of clinical importance. Our method combines simulation-based analysis of a detailed model of the system that contains continuous patient dynamics with model checking of a more abstract timed automata model. We show that the relationship between the two models preserves the crucial aspect of the timing behavior that ensures the conservativeness of the safety analysis. We also describe system design that can provide open-loop safety under network failure. PMID:24177176

  20. Model-Driven Safety Analysis of Closed-Loop Medical Systems.

    PubMed

    Pajic, Miroslav; Mangharam, Rahul; Sokolsky, Oleg; Arney, David; Goldman, Julian; Lee, Insup

    2012-10-26

    In modern hospitals, patients are treated using a wide array of medical devices that are increasingly interacting with each other over the network, thus offering a perfect example of a cyber-physical system. We study the safety of a medical device system for the physiologic closed-loop control of drug infusion. The main contribution of the paper is the verification approach for the safety properties of closed-loop medical device systems. We demonstrate, using a case study, that the approach can be applied to a system of clinical importance. Our method combines simulation-based analysis of a detailed model of the system that contains continuous patient dynamics with model checking of a more abstract timed automata model. We show that the relationship between the two models preserves the crucial aspect of the timing behavior that ensures the conservativeness of the safety analysis. We also describe system design that can provide open-loop safety under network failure.

  1. Magnetism in curved geometries

    DOE PAGES

    Streubel, Robert; Fischer, Peter; Kronast, Florian; ...

    2016-08-17

    Extending planar two-dimensional structures into the three-dimensional space has become a general trend in multiple disciplines, including electronics, photonics, plasmonics and magnetics. This approach provides means to modify conventional or to launch novel functionalities by tailoring the geometry of an object, e.g. its local curvature. In a generic electronic system, curvature results in the appearance of scalar and vector geometric potentials inducing anisotropic and chiral effects. In the specific case of magnetism, even in the simplest case of a curved anisotropic Heisenberg magnet, the curvilinear geometry manifests two exchange-driven interactions, namely effective anisotropy and antisymmetric exchange, i.e. Dzyaloshinskii–Moriya-like interaction. As a consequence, a family of novel curvature-driven effects emerges, which includes magnetochiral effects and topologically induced magnetization patterning, resulting in theoretically predicted unlimited domain wall velocities, chirality symmetry breaking and Cherenkov-like effects for magnons. The broad range of altered physical properties makes these curved architectures appealing in view of fundamental research on e.g. skyrmionic systems, magnonic crystals or exotic spin configurations. In addition to these rich physics, the application potential of three-dimensionally shaped objects is currently being explored as magnetic field sensorics for magnetofluidic applications, spin-wave filters, advanced magneto-encephalography devices for diagnosis of epilepsy or for energy-efficient racetrack memory devices. Finally, these recent developments ranging from theoretical predictions over fabrication of three-dimensionally curved magnetic thin films, hollow cylinders or wires, to their characterization using integral means as well as the development of advanced tomography approaches are in the focus of this review.


  2. Magnetism in curved geometries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Streubel, Robert; Fischer, Peter; Kronast, Florian

    Extending planar two-dimensional structures into the three-dimensional space has become a general trend in multiple disciplines, including electronics, photonics, plasmonics and magnetics. This approach provides means to modify conventional or to launch novel functionalities by tailoring the geometry of an object, e.g. its local curvature. In a generic electronic system, curvature results in the appearance of scalar and vector geometric potentials inducing anisotropic and chiral effects. In the specific case of magnetism, even in the simplest case of a curved anisotropic Heisenberg magnet, the curvilinear geometry manifests two exchange-driven interactions, namely effective anisotropy and antisymmetric exchange, i.e. Dzyaloshinskii–Moriya-like interaction. As a consequence, a family of novel curvature-driven effects emerges, which includes magnetochiral effects and topologically induced magnetization patterning, resulting in theoretically predicted unlimited domain wall velocities, chirality symmetry breaking and Cherenkov-like effects for magnons. The broad range of altered physical properties makes these curved architectures appealing in view of fundamental research on e.g. skyrmionic systems, magnonic crystals or exotic spin configurations. In addition to these rich physics, the application potential of three-dimensionally shaped objects is currently being explored as magnetic field sensorics for magnetofluidic applications, spin-wave filters, advanced magneto-encephalography devices for diagnosis of epilepsy or for energy-efficient racetrack memory devices. Finally, these recent developments ranging from theoretical predictions over fabrication of three-dimensionally curved magnetic thin films, hollow cylinders or wires, to their characterization using integral means as well as the development of advanced tomography approaches are in the focus of this review.

  3. Approaches to rationing antiretroviral treatment: ethical and equity implications.

    PubMed Central

    Bennett, Sara; Chanfreau, Catherine

    2005-01-01

    Despite a growing global commitment to the provision of antiretroviral therapy (ART), its availability is still likely to be less than the need. This imbalance raises ethical dilemmas about who should be granted access to publicly-subsidized ART programmes. This paper reviews the eligibility and targeting criteria used in four case-study countries at different points in the scale-up of ART, with the aim of drawing lessons regarding ethical approaches to rationing. Mexico, Senegal, Thailand and Uganda have each made an explicit policy commitment to provide antiretrovirals to all those in need, but are achieving this goal in steps--beginning with explicit rationing of access to care. Drawing upon the case-studies and experiences elsewhere, categories of explicit rationing criteria have been identified. These include biomedical factors, adherence to treatment, prevention-driven factors, social and economic benefits, financial factors and factors driven by ethical arguments. The initial criteria for determining eligibility are typically clinical criteria and assessment of adherence prospects, followed by a number of other factors. Rationing mechanisms reflect several underlying ethical theories and the ethical underpinnings of explicit rationing criteria should reflect societal values. In order to ensure this alignment, widespread consultation with a variety of stakeholders, and not only policy-makers or physicians, is critical. Without such explicit debate, more rationing will occur implicitly and this may be more inequitable. The effects of rationing mechanisms upon equity are critically dependent upon the implementation processes. As antiretroviral programmes are implemented it is crucial to monitor who gains access to these programmes. PMID:16175829

  4. Input-driven versus turnover-driven controls of simulated changes in soil carbon due to land-use change

    NASA Astrophysics Data System (ADS)

    Nyawira, S. S.; Nabel, J. E. M. S.; Brovkin, V.; Pongratz, J.

    2017-08-01

    Historical changes in soil carbon associated with land-use change (LUC) result mainly from the changes in the quantity of litter inputs to the soil and the turnover of carbon in soils. We use a factor separation technique to assess how the input-driven and turnover-driven controls, as well as their synergies, have contributed to historical changes in soil carbon associated with LUC. We apply this approach to equilibrium simulations of present-day and pre-industrial land use performed using the dynamic global vegetation model JSBACH. Our results show that both the input-driven and turnover-driven changes generally contribute to a gain in soil carbon in afforested regions and a loss in deforested regions. However, in regions where grasslands have been converted to croplands, we find an input-driven loss that is partly offset by a turnover-driven gain, which stems from a decrease in the fire-related carbon losses. Omitting land management through crop and wood harvest substantially reduces the global losses through the input-driven changes. Our study thus suggests that the dominating control of soil carbon losses is via the input-driven changes, which are more directly accessible to human management than the turnover-driven ones.
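
The factor separation technique used here follows the standard Stein-Alpert decomposition: running the model with each control switched on and off splits the total change into an input-driven part, a turnover-driven part, and their synergy. The numbers below are invented for illustration.

```python
# Stein-Alpert factor separation for two controls (input and turnover):
# c[ij] is the simulated soil carbon with input change i and turnover
# change j applied (0 = off, 1 = on).
def factor_separation(c00, c10, c01, c11):
    input_driven = c10 - c00
    turnover_driven = c01 - c00
    synergy = c11 - c10 - c01 + c00
    total = c11 - c00
    # the three contributions reconstruct the total change exactly
    assert abs(total - (input_driven + turnover_driven + synergy)) < 1e-12
    return input_driven, turnover_driven, synergy

# hypothetical grassland-to-cropland conversion: input-driven loss partly
# offset by a turnover-driven gain, as in the abstract
inp, turn, syn = factor_separation(c00=100.0, c10=92.0, c01=103.0, c11=96.0)
print(inp, turn, syn)  # -8.0 3.0 1.0
```

The synergy term is what the abstract refers to when it assesses the controls "as well as their synergies": it captures the part of the change that neither factor produces alone.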

  5. The Data-Driven Approach to Spectroscopic Analyses

    NASA Astrophysics Data System (ADS)

    Ness, M.

    2018-01-01

    I review the data-driven approach to spectroscopy, The Cannon, a method for deriving fundamental diagnostics of galaxy formation, namely precise chemical compositions and stellar ages, across the many stellar surveys that are mapping the Milky Way. With The Cannon, the abundances and stellar parameters from the multitude of stellar surveys can be placed directly on the same scale, using stars in common between the surveys. Furthermore, the information that resides in the data can be fully extracted; this has resulted in higher-precision stellar parameters and abundances being delivered from spectroscopic data and has opened up new avenues in galactic archeology, for example in the determination of ages for red giant stars across the Galactic disk. Coupled with Gaia distances, proper motions and derived orbit families, the stellar age and individual abundance information delivered at the precision obtained with the data-driven approach provides very strong constraints on the evolution and birthplaces of stars in the Milky Way. I review the role of data-driven spectroscopy as we enter the era in which we have both the data and the tools to build the ultimate conglomerate of galactic information, and highlight further applications of data-driven models in the coming decade.
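
The core of a Cannon-style data-driven model can be sketched in miniature: learn per-pixel coefficients of a flux model from reference stars with known labels, then invert that model to estimate a new star's label. This sketch uses one label and a linear flux model with noise-free toy data; the real method fits a quadratic model in several labels with per-pixel scatter.

```python
# Training step: fit flux[pixel] = a[pixel] + b[pixel] * label by least
# squares over reference stars. Test step: least-squares label inference.
def train(labels, spectra):
    n = len(labels)
    mean_l = sum(labels) / n
    coeffs = []
    for px in range(len(spectra[0])):
        flux = [s[px] for s in spectra]
        mean_f = sum(flux) / n
        cov = sum((l - mean_l) * (f - mean_f) for l, f in zip(labels, flux))
        var = sum((l - mean_l) ** 2 for l in labels)
        b = cov / var
        coeffs.append((mean_f - b * mean_l, b))  # intercept, slope per pixel
    return coeffs

def infer(coeffs, spectrum):
    """Least-squares label estimate given the trained per-pixel model."""
    num = sum(b * (f - a) for (a, b), f in zip(coeffs, spectrum))
    den = sum(b * b for (a, b) in coeffs)
    return num / den

# toy "survey": one metallicity-like label; two pixels respond linearly
labels = [-1.0, 0.0, 1.0]
spectra = [[1.0 - 0.1 * l, 0.8 + 0.05 * l] for l in labels]
coeffs = train(labels, spectra)
print(round(infer(coeffs, [1.0 - 0.1 * 0.5, 0.8 + 0.05 * 0.5]), 3))  # 0.5
```

Putting two surveys on the same scale amounts to training on their stars in common, so the learned coefficients translate one survey's spectra into the other's label system.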

  6. When benefits outweigh costs: reconsidering "automatic" phonological recoding when reading aloud.

    PubMed

    Robidoux, Serje; Besner, Derek

    2011-06-01

    Skilled readers are slower to read aloud exception words (e.g., PINT) than regular words (e.g., MINT). In the case of exception words, sublexical knowledge competes with the correct pronunciation driven by lexical knowledge, whereas no such competition occurs for regular words. The dominant view is that the cost of this "regularity" effect is evidence that sublexical spelling-sound conversion is impossible to prevent (i.e., is "automatic"). This view has become so reified that the field rarely questions it. However, the results of simulations from the most successful computational models on the table suggest that the claim of "automatic" sublexical phonological recoding is premature given that there is also a benefit conferred by sublexical processing. Taken together with evidence from skilled readers that sublexical phonological recoding can be stopped, we suggest that the field is too narrowly focused when it asserts that sublexical phonological recoding is "automatic" and that a broader, more nuanced and contextually driven approach provides a more useful framework.

  7. Force Evaluation in the Lattice Boltzmann Method Involving Curved Geometry

    NASA Technical Reports Server (NTRS)

    Mei, Renwei; Yu, Dazhi; Shyy, Wei; Luo, Li-Shi; Bushnell, Dennis M. (Technical Monitor)

    2002-01-01

    The present work investigates two approaches for force evaluation in the lattice Boltzmann equation: the momentum- exchange method and the stress-integration method on the surface of a body. The boundary condition for the particle distribution functions on curved geometries is handled with second order accuracy based on our recent works. The stress-integration method is computationally laborious for two-dimensional flows and in general difficult to implement for three-dimensional flows, while the momentum-exchange method is reliable, accurate, and easy to implement for both two-dimensional and three-dimensional flows. Several test cases are selected to evaluate the present methods, including: (i) two-dimensional pressure-driven channel flow; (ii) two-dimensional uniform flow past a column of cylinders; (iii) two-dimensional flow past a cylinder asymmetrically placed in a channel (with vortex shedding); (iv) three-dimensional pressure-driven flow in a circular pipe; and (v) three-dimensional flow past a sphere. The drag evaluated by using the momentum-exchange method agrees well with the exact or other published results.
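
The momentum-exchange method favored in this abstract reduces to a sum over boundary links: each link contributes the momentum carried into the wall plus the momentum carried back out after bounce-back. The distributions and lattice velocities below are a tiny invented example, not a full lattice Boltzmann solver.

```python
# Momentum-exchange force evaluation over a set of boundary links.
# Each link is (f_in, f_out, c): the population streaming from the fluid
# node into the wall, the bounced-back population, and the lattice
# velocity c pointing from the fluid node toward the wall.
def momentum_exchange_force(boundary_links):
    fx = fy = 0.0
    for f_in, f_out, (cx, cy) in boundary_links:
        # momentum given to the body per time step on this link
        fx += (f_in + f_out) * cx
        fy += (f_in + f_out) * cy
    return fx, fy

# two links on the upstream face of a body in a channel flow (toy values)
links = [
    (0.4, 0.1, (1, 0)),
    (0.3, 0.1, (1, 0)),
]
fx, fy = momentum_exchange_force(links)
print(fx > 0.0 and fy == 0.0)  # True: the net drag acts along +x only
```

This locality is why the abstract calls the method easy to implement in 3D: no surface stress integration or interpolation of gradients is needed, only the populations already stored on boundary links.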

  8. Activation rates for nonlinear stochastic flows driven by non-Gaussian noise

    NASA Astrophysics Data System (ADS)

    van den Broeck, C.; Hänggi, P.

    1984-11-01

    Activation rates are calculated for stochastic bistable flows driven by asymmetric dichotomic Markov noise (a two-state Markov process). This noise contains as limits both a particular type of non-Gaussian white shot noise and white Gaussian noise. Apart from investigating the role of colored noise on the escape rates, one can thus also study the influence of the non-Gaussian nature of the noise on these rates. The rate for white shot noise differs in leading order (Arrhenius factor) from the corresponding rate for white Gaussian noise of equal strength. In evaluating the rates we demonstrate the advantage of using transport theory over a mean first-passage time approach for cases with generally non-white and non-Gaussian noise sources. For white shot noise with exponentially distributed weights we succeed in evaluating the mean first-passage time of the corresponding integro-differential master-equation dynamics. The rate is shown to coincide in the weak noise limit with the inverse mean first-passage time.
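
For the white-Gaussian-noise limit discussed here, the mean first-passage time (MFPT) admits the standard double-integral quadrature, which a short numerical sketch can evaluate for a bistable flow. The potential and parameters below are a generic choice for illustration, not the paper's dichotomic-noise model.

```python
# MFPT for overdamped escape in the bistable potential
# V(x) = -x^2/2 + x^4/4 under white Gaussian noise of strength D:
#   T(a -> b) = (1/D) * int_a^b e^{V(x)/D} int_{-L}^{x} e^{-V(y)/D} dy dx
import math

def V(x):
    return -0.5 * x * x + 0.25 * x ** 4

def mfpt(a, b, D, lo=-4.0, n=2000):
    dx = (b - lo) / n
    xs = [lo + (i + 0.5) * dx for i in range(n)]  # midpoint rule
    inner = 0.0   # running value of the inner integral from lo to x
    total = 0.0
    for x in xs:
        inner += math.exp(-V(x) / D) * dx
        if x >= a:                       # outer integral runs from a to b
            total += math.exp(V(x) / D) * inner * dx
    return total / D

# escape from the left well (x = -1) over the barrier top (x = 0)
t_weak = mfpt(a=-1.0, b=0.0, D=0.05)
t_strong = mfpt(a=-1.0, b=0.0, D=0.2)
print(t_weak > t_strong > 0.0)  # True: weaker noise, exponentially slower escape
```

The Arrhenius factor the abstract refers to is visible here as the exp(ΔV/D) growth of the escape time as D shrinks; the paper's point is that shot noise changes this leading-order factor relative to Gaussian noise of equal strength.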

  9. The Impact of Heterogeneity and Awareness in Modeling Epidemic Spreading on Multiplex Networks

    PubMed Central

    Scatà, Marialisa; Di Stefano, Alessandro; Liò, Pietro; La Corte, Aurelio

    2016-01-01

    In the real world, dynamic processes involving human beings are not disjoint. To capture the real complexity of such dynamics, we propose a novel model of the coevolution of epidemic and awareness spreading processes on a multiplex network, also introducing a preventive isolation strategy. Our aim is to evaluate and quantify the joint impact of heterogeneity and awareness, under different socioeconomic conditions. Considering, as a case study, an emerging public health threat, Zika virus, we introduce a data-driven analysis exploiting multiple sources and different types of data, ranging from Big Five personality traits to Google Trends, related to different world countries where there is an ongoing epidemic outbreak. Our findings demonstrate how the proposed model allows delaying the epidemic outbreak and increasing the resilience of nodes, especially under critical economic conditions. Simulation results, using a data-driven approach on Zika virus, which is attracting growing scientific research interest, are coherent with the proposed analytic model. PMID:27848978
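
The coevolution of awareness and infection on two layers can be caricatured with a mean-field sketch in the spirit of multiplex awareness-epidemic models: awareness spreads on one layer and reduces the infection probability on the other. All rates and the shielding factor below are invented; this is not the paper's model or its Zika-calibrated parameters.

```python
# Mean-field toy of coupled awareness/epidemic dynamics: rho_i is the
# infected fraction, rho_a the aware fraction; aware nodes are infected
# at the reduced rate beta * shield.
def step(rho_i, rho_a, beta=0.3, gamma=0.1, lam=0.2, delta=0.05, shield=0.5):
    beta_eff = beta * ((1.0 - rho_a) + shield * rho_a)
    new_i = rho_i + beta_eff * rho_i * (1.0 - rho_i) - gamma * rho_i
    new_a = rho_a + lam * rho_a * (1.0 - rho_a) - delta * rho_a
    return (min(max(new_i, 0.0), 1.0), min(max(new_a, 0.0), 1.0))

# with awareness spreading
rho_i, rho_a = 0.01, 0.01
for _ in range(300):
    rho_i, rho_a = step(rho_i, rho_a)

# baseline: awareness pinned at zero
base_i = 0.01
for _ in range(300):
    base_i, _ = step(base_i, 0.0)

print(rho_i < base_i)  # True: awareness lowers the endemic infection level
```

Even this caricature reproduces the qualitative finding quoted above: coupling an awareness layer to the epidemic layer delays and lowers the endemic state.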

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, Shi-Zeng

    We derive the skyrmion dynamics in response to a weak external drive, taking all the magnon modes into account. A skyrmion has rotational symmetry, and the magnon modes can be characterized by an angular momentum. For a weak distortion of a skyrmion, only the magnon modes with an angular momentum |m| = 1 govern the dynamics of the skyrmion topological center. We also determine that the skyrmion inertia comes by way of the magnon modes in the continuum spectrum. For a skyrmion driven by a magnetic field gradient or by a spin transfer torque generated by a current, the dynamical response is practically instantaneous. This justifies the rigid skyrmion approximation used in Thiele's collective coordinate approach. For a skyrmion driven by a spin Hall torque, the torque couples to the skyrmion motion through the magnons in the continuum and damping; therefore the skyrmion dynamics shows sizable inertia in this case. The trajectory of a skyrmion is an ellipse for an ac drive of spin Hall torque.

  11. Methodological and practical viewpoints of qualitative-driven mixed method design: the case of decentralisation of primary healthcare services in Nepal.

    PubMed

    Regmi, Krishna

    2018-01-01

    Although considerable attention has been paid to the use of quantitative methods in health research, there has been limited focus on decentralisation research using a qualitative-driven mixed method design. Decentralisation is both a problematic concept and a source of methodological challenges; it is highly context-specific and often multi-dimensional. Researchers often consider using more than one method design when the phenomena being researched are complex in nature. Aim To explore the effects of decentralisation on the provision of primary healthcare services. A qualitative-driven mixed method design was used, employing three methods of data collection: focus group discussions (FGDs), semi-structured interviews (SSIs) and participant observation, organised into a core component and supplementary components. Four FGDs with health service practitioners, three FGDs with district stakeholders, 20 SSIs with health service users and 20 SSIs with national stakeholders were carried out. These were conducted sequentially. NVivo10, a data management program, was used to code the field data, employing a content analysis method to search for underlying themes and concepts in the text material. Findings Both positive and negative experiences related to access, quality, planning, supplies, coordination and supervision were identified. This study provides some evidence of the effects of decentralisation on health outcomes in general, and fills a gap in understanding and examining healthcare through a qualitative-driven mixed methods approach in particular. Future research offering a qualitative in-depth understanding of the problems (why decentralisation, why now and what for) would provide an important data set for researchers and policy-makers planning and implementing effective health services.

  12. A generic architecture for an adaptive, interoperable and intelligent type 2 diabetes mellitus care system.

    PubMed

    Uribe, Gustavo A; Blobel, Bernd; López, Diego M; Schulz, Stefan

    2015-01-01

    Chronic diseases such as Type 2 Diabetes Mellitus (T2DM) constitute a major burden on the global health economy. T2DM care management requires a multi-disciplinary and multi-organizational approach. Because of different languages and terminologies, education, experiences, skills, etc., such an approach poses a special interoperability challenge. The solution is a flexible, scalable, business-controlled, adaptive, knowledge-based, intelligent system following a systems-oriented, architecture-centric, ontology-based and policy-driven approach. The architecture of real systems is described using the basics and principles of the Generic Component Model (GCM). The functional aspects of the system are represented with the Business Process Modeling Notation (BPMN). The resulting system architecture is presented using GCM graphical notation, class diagrams and BPMN diagrams. The architecture-centric approach reflects the compositional nature of the real-world system and its functionalities, guarantees coherence, and enables correct inferences. The level of generality provided in this paper facilitates use-case-specific adaptations of the system. In this way, intelligent, adaptive and interoperable T2DM care systems can be derived from the model, as presented in another publication.

  13. Conditional Outlier Detection for Clinical Alerting

    PubMed Central

    Hauskrecht, Milos; Valko, Michal; Batal, Iyad; Clermont, Gilles; Visweswaran, Shyam; Cooper, Gregory F.

    2010-01-01

    We develop and evaluate a data-driven approach for detecting unusual (anomalous) patient-management actions using past patient cases stored in an electronic health record (EHR) system. Our hypothesis is that patient-management actions that are unusual with respect to past patients may be due to a potential error and that it is worthwhile to raise an alert if such a condition is encountered. We evaluate this hypothesis using data obtained from the electronic health records of 4,486 post-cardiac surgical patients. We base the evaluation on the opinions of a panel of experts. The results support that anomaly-based alerting can have reasonably low false alert rates and that stronger anomalies are correlated with higher alert rates. PMID:21346986
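    The alerting logic described above can be sketched with a minimal frequency-based stand-in for the authors' statistical models (the patient grouping, drug names and threshold below are all hypothetical): estimate how often an action was taken for similar past patients, and alert when the current action is rare or unseen.

```python
from collections import defaultdict

def fit_conditional_rates(cases):
    """Estimate P(action | patient group) from historical (group, action) pairs."""
    counts = defaultdict(lambda: defaultdict(int))
    totals = defaultdict(int)
    for group, action in cases:
        counts[group][action] += 1
        totals[group] += 1
    return {g: {a: n / totals[g] for a, n in acts.items()}
            for g, acts in counts.items()}

def alert(rates, group, action, threshold=0.05):
    """Alert when an action is rare (or unseen) among similar past patients."""
    p = rates.get(group, {}).get(action, 0.0)
    return p < threshold, p

# Hypothetical history: 100 similar post-surgical cases.
history = ([("post-cardiac", "beta-blocker")] * 95
           + [("post-cardiac", "drug-X")] * 5)
rates = fit_conditional_rates(history)
print(alert(rates, "post-cardiac", "drug-X", threshold=0.03))  # seen in 5% of cases: no alert
print(alert(rates, "post-cardiac", "drug-Y", threshold=0.03))  # never seen: alert
```

    The paper's models condition on rich EHR state rather than a single discrete group, but the alert decision has the same shape: flag actions whose conditional probability given the patient context falls below a threshold.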

  14. Conditional outlier detection for clinical alerting.

    PubMed

    Hauskrecht, Milos; Valko, Michal; Batal, Iyad; Clermont, Gilles; Visweswaran, Shyam; Cooper, Gregory F

    2010-11-13

    We develop and evaluate a data-driven approach for detecting unusual (anomalous) patient-management actions using past patient cases stored in an electronic health record (EHR) system. Our hypothesis is that patient-management actions that are unusual with respect to past patients may be due to a potential error and that it is worthwhile to raise an alert if such a condition is encountered. We evaluate this hypothesis using data obtained from the electronic health records of 4,486 post-cardiac surgical patients. We base the evaluation on the opinions of a panel of experts. The results support that anomaly-based alerting can have reasonably low false alert rates and that stronger anomalies are correlated with higher alert rates.

  15. Paving the COWpath: data-driven design of pediatric order sets

    PubMed Central

    Zhang, Yiye; Padman, Rema; Levin, James E

    2014-01-01

    Objective Evidence indicates that users incur significant physical and cognitive costs in the use of order sets, a core feature of computerized provider order entry systems. This paper develops data-driven approaches for automating the construction of order sets that match closely with user preferences and workflow while minimizing physical and cognitive workload. Materials and methods We developed and tested optimization-based models embedded with clustering techniques using physical and cognitive click cost criteria. By judiciously learning from users’ actual actions, our methods identify items for constituting order sets that are relevant according to historical ordering data and grouped on the basis of order similarity and ordering time. We evaluated performance of the methods using 47 099 orders from the year 2011 for asthma, appendectomy and pneumonia management in a pediatric inpatient setting. Results In comparison with existing order sets, those developed using the new approach significantly reduce the physical and cognitive workload associated with usage by 14–52%. This approach is also capable of accommodating variations in clinical conditions that affect order set usage and development. Discussion There is a critical need to investigate the cognitive complexity imposed on users by complex clinical information systems, and to design their features according to ‘human factors’ best practices. Optimizing order set generation using cognitive cost criteria introduces a new approach that can potentially improve ordering efficiency, reduce unintended variations in order placement, and enhance patient safety. Conclusions We demonstrate that data-driven methods offer a promising approach for designing order sets that are generalizable, data-driven, condition-based, and up to date with current best practices. PMID:24674844
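    A toy version of the grouping step, assuming order items are grouped when they co-occur in the same encounters (the Jaccard criterion and threshold here are illustrative stand-ins, not the authors' optimization-based clustering objective):

```python
def jaccard(a, b):
    """Overlap of two encounter sets: |A ∩ B| / |A ∪ B|."""
    return len(a & b) / len(a | b)

def group_orders(item_encounters, threshold=0.5):
    """Greedily group order items whose encounter sets overlap strongly.

    item_encounters maps each order item to the set of encounter IDs in
    which it was placed; items join a group only if they are similar to
    every current member (complete-linkage flavour).
    """
    groups = []
    for item, encs in item_encounters.items():
        for g in groups:
            if all(jaccard(encs, item_encounters[m]) >= threshold for m in g):
                g.append(item)
                break
        else:
            groups.append([item])
    return groups

# Hypothetical pediatric data: two asthma drugs co-occur; an antibiotic does not.
encounters = {
    "albuterol": {1, 2, 3},
    "prednisone": {1, 2, 3},
    "ceftriaxone": {4, 5},
}
print(group_orders(encounters))  # [['albuterol', 'prednisone'], ['ceftriaxone']]
```

    The study additionally weighs physical and cognitive click costs when choosing which groups become order sets; the co-occurrence structure above is only the input to that optimization.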

  16. The Use of Multiple Evaluation Approaches in Program Evaluation

    ERIC Educational Resources Information Center

    Bledsoe, Katrina L.; Graham, James A.

    2005-01-01

    The authors discuss the use of multiple evaluation approaches in conducting program evaluations. Specifically, they illustrate four evaluation approaches (theory-driven, consumer-based, empowerment, and inclusive evaluation) and briefly discuss a fifth (use-focused evaluation) as a side effect of the use of the others. The authors also address the…

  17. A modelling approach to assessing the timescale uncertainties in proxy series with chronological errors

    NASA Astrophysics Data System (ADS)

    Divine, D.; Godtliebsen, F.; Rue, H.

    2012-04-01

    Detailed knowledge of past climate variations is of great importance for gaining insight into possible future climate scenarios. The relative shortness of available high-quality instrumental climate records necessitates the use of various climate proxy archives when making inferences about past climate evolution. This, however, requires an accurate assessment of timescale errors in proxy-based paleoclimatic reconstructions. We here propose an approach to assessing timescale errors in proxy-based series with chronological uncertainties. The method relies on approximating the physical process(es) forming a proxy archive by a random Gamma process. Parameters of the process are partly data-driven and partly determined from prior assumptions. For the particular case of a linear accumulation model and absolutely dated tie points, an analytical solution is found, yielding a Beta-distributed probability density on age estimates along the length of a proxy archive. In the general situation of uncertainties in the ages of the tie points, the proposed method employs MCMC simulations of age-depth profiles, yielding empirical confidence intervals on the constructed piecewise-linear best-guess timescale. We suggest that the approach can be further extended to the more general case of a time-varying expected accumulation between the tie points. The approach is illustrated using two ice cores and two lake/marine sediment cores, typical examples of paleoproxy archives with age models constructed using tie points of mixed origin.
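    The core idea, a random Gamma accumulation process pinned at dated tie points, can be sketched as follows (a simulation-based illustration, not the authors' analytical solution or full MCMC treatment; all shapes and counts are illustrative). Percentiles of the resulting ensemble would give empirical confidence intervals on the age-depth relation.

```python
import random

def simulate_ages(depths, t0, t1, shape=1.5, n_sims=500, rng=None):
    """Monte-Carlo age-depth profiles between two absolutely dated tie points.

    Accumulation between successive depths is modelled as i.i.d. Gamma
    increments; normalizing them to span [t0, t1] pins the tie-point ages
    exactly while leaving the interior ages random.
    """
    rng = rng or random.Random(0)
    profiles = []
    for _ in range(n_sims):
        inc = [rng.gammavariate(shape, 1.0) for _ in range(len(depths) - 1)]
        total = sum(inc)
        age, profile = t0, [t0]
        for g in inc:
            age += (t1 - t0) * g / total
            profile.append(age)
        profiles.append(profile)
    return profiles

# Five depth horizons between tie points dated 0 and 1000 years.
profiles = simulate_ages([0, 1, 2, 3, 4], 0.0, 1000.0)
```

    Each simulated profile is monotone and hits both tie-point ages; the spread of interior ages across simulations quantifies the timescale uncertainty.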

  18. Discovery and characterization of natural products that act as pheromones in fish.

    PubMed

    Li, Ke; Buchinger, Tyler J; Li, Weiming

    2018-06-20

    Covering: up to 2018. Fish use a diverse collection of molecules to communicate with conspecifics. Since Karlson and Lüscher termed these molecules 'pheromones', chemists and biologists have joined efforts to characterize their structures and functions. In particular, the understanding of insect pheromones developed at a rapid pace, set, in part, by the use of bioassay-guided fractionation and natural product chemistry. Research on vertebrate pheromones, however, has progressed more slowly. Initially, biologists characterized fish pheromones by screening commercially available compounds suspected to act as pheromones based upon their physiological function. Such biology-driven screening has proven a productive approach to studying pheromones in fish. However, the many functions of fish pheromones and diverse metabolites that fish release make predicting pheromone identity difficult and necessitate approaches led by chemistry. Indeed, the few cases in which pheromone identification was led by natural product chemistry indicated novel or otherwise unpredicted compounds act as pheromones. Here, we provide a brief review of the approaches to identifying pheromones, placing particular emphasis on the promise of using natural product chemistry together with assays of biological activity. Several case studies illustrate bioassay-guided fractionation as an approach to pheromone identification in fish and the unexpected diversity of pheromone structures discovered by natural product chemistry. With recent advances in natural product chemistry, bioassay-guided fractionation is likely to unveil an even broader collection of pheromone structures and enable research that spans across disciplines.

  19. Visual Analytics Tools for Sustainable Lifecycle Design: Current Status, Challenges, and Future Opportunities.

    PubMed

    Ramanujan, Devarajan; Bernstein, William Z; Chandrasegaran, Senthil K; Ramani, Karthik

    2017-01-01

    The rapid rise in technologies for data collection has created an unmatched opportunity to advance the use of data-rich tools for lifecycle decision-making. However, the usefulness of these technologies is limited by the ability to translate lifecycle data into actionable insights for human decision-makers. This is especially true in the case of sustainable lifecycle design (SLD), as the assessment of environmental impacts, and the feasibility of making corresponding design changes, often relies on human expertise and intuition. Supporting human sense-making in SLD requires the use of both data-driven and user-driven methods while exploring lifecycle data. A promising approach for combining the two is the use of visual analytics (VA) tools. Such tools can leverage the ability of computer-based tools to gather, process, and summarize data along with the ability of human experts to guide analyses through domain knowledge or data-driven insight. In this paper, we review previous research that has created VA tools in SLD. We also highlight existing challenges and future opportunities for such tools in different lifecycle stages: design, manufacturing, distribution & supply chain, use-phase, end-of-life, as well as life cycle assessment. Our review shows that while the number of VA tools in SLD is relatively small, researchers are increasingly focusing on the subject. Our review also suggests that VA tools can address existing challenges in SLD and that significant future opportunities exist.

  20. A Consumer-Driven Approach To Increase Suggestive Selling.

    ERIC Educational Resources Information Center

    Rohn, Don; Austin, John; Sanford, Alison

    2003-01-01

    Discussion of the effectiveness of behavioral interventions in improving suggestive selling behavior of sales staff focuses on a study that examined the efficacy of a consumer-driven approach to improve suggestive selling behavior of three employees of a fast food franchise. Reports that consumer-driven intervention increased suggestive selling…

  1. Contrasting analytical and data-driven frameworks for radiogenomic modeling of normal tissue toxicities in prostate cancer.

    PubMed

    Coates, James; Jeyaseelan, Asha K; Ybarra, Norma; David, Marc; Faria, Sergio; Souhami, Luis; Cury, Fabio; Duclos, Marie; El Naqa, Issam

    2015-04-01

    We explore analytical and data-driven approaches to investigate the integration of genetic variations (single nucleotide polymorphisms [SNPs] and copy number variations [CNVs]) with dosimetric and clinical variables in modeling radiation-induced rectal bleeding (RB) and erectile dysfunction (ED) in prostate cancer patients. Sixty-two patients who underwent curative hypofractionated radiotherapy (66 Gy in 22 fractions) between 2002 and 2010 were retrospectively genotyped for CNV and SNP rs5489 in the xrcc1 DNA repair gene. Fifty-four patients had full dosimetric profiles. Two parallel modeling approaches were compared to assess the risk of severe RB (Grade ≥ 3) and ED (Grade ≥ 1): maximum-likelihood-estimated generalized Lyman-Kutcher-Burman (LKB) modeling and logistic regression. Statistical resampling based on cross-validation was used to evaluate model predictive power and generalizability to unseen data. Integration of the biological variables xrcc1 CNV and SNP improved the fit of the RB and ED analytical and data-driven models. Cross-validation of the generalized LKB models yielded increases in classification performance of 27.4% for RB and 14.6% for ED when xrcc1 CNV and SNP were included, respectively. Biological variables added to logistic regression modeling improved classification performance over standard dosimetric models by 33.5% for RB and 21.2% for ED models. As a proof of concept, we demonstrated that the combination of genetic and dosimetric variables can provide significant improvement in NTCP prediction using analytical and data-driven approaches. The improvement in prediction performance was more pronounced in the data-driven approaches. Moreover, we have shown that CNVs, in addition to SNPs, may be useful structural genetic variants in predicting radiation toxicities. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
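    The generalized LKB model mentioned above has a standard closed form, NTCP = Φ((gEUD − TD50) / (m · TD50)), with gEUD the generalized equivalent uniform dose of the dose-volume histogram. A minimal sketch with illustrative parameter values (not those fitted in the study):

```python
from math import erf, sqrt

def geud(doses, volumes, n):
    """Generalized equivalent uniform dose: (sum_i v_i * D_i^(1/n))^n
    for a dose-volume histogram given as parallel (dose, fractional-volume) lists."""
    return sum(v * d ** (1.0 / n) for d, v in zip(doses, volumes)) ** n

def lkb_ntcp(doses, volumes, td50, m, n):
    """LKB normal-tissue complication probability via the standard normal CDF."""
    t = (geud(doses, volumes, n) - td50) / (m * td50)
    return 0.5 * (1.0 + erf(t / sqrt(2.0)))

# Uniform irradiation at exactly TD50 gives NTCP = 0.5 by construction.
print(lkb_ntcp([66.0], [1.0], td50=66.0, m=0.15, n=0.09))  # 0.5 (up to rounding)
```

    The study's extension folds genetic variables (xrcc1 CNV and SNP status) into the model parameters; the dosimetric backbone above is unchanged.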

  2. Automated interpretation of 3D laserscanned point clouds for plant organ segmentation.

    PubMed

    Wahabzada, Mirwaes; Paulus, Stefan; Kersting, Kristian; Mahlein, Anne-Katrin

    2015-08-08

    Plant organ segmentation from 3D point clouds is a relevant task for plant phenotyping and plant growth observation. Automated solutions are required to increase the efficiency of recent high-throughput plant phenotyping pipelines. However, plant geometrical properties vary with time, among observation scales and between plant types. The main objective of the present research is to develop a fully automated, fast and reliable data-driven approach for plant organ segmentation. The automated segmentation of plant organs using unsupervised clustering methods is crucial in cases where the goal is to gain fast insights into the data, or where no labeled data are available or labeling is costly. For this we propose and compare data-driven approaches that are easy to realize and make the use of standard algorithms possible. Since normalized histograms acquired from 3D point clouds can be seen as samples from a probability simplex, we propose to map the data from the simplex space into Euclidean space using Aitchison's log-ratio transformation, or into the positive quadrant of the unit sphere using the square-root transformation. This, in turn, paves the way to a wide range of commonly used analysis techniques that are based on measuring the similarities between data points using Euclidean distance. We investigate the performance of the resulting approaches in the practical context of grouping 3D point clouds and demonstrate empirically that they lead to clustering results with high accuracy for monocotyledonous and dicotyledonous plant species with diverse shoot architecture. An automated segmentation of 3D point clouds is demonstrated in the present work: within seconds, first insights into plant data can be derived, even from non-labelled data. The approach is applicable to different plant species with high accuracy. The analysis cascade can be implemented in future high-throughput phenotyping scenarios and will support the evaluation of the performance of different plant genotypes exposed to stress or to different environmental scenarios.
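    The two simplex-to-Euclidean maps named in the abstract have compact standard forms; a minimal sketch (the epsilon guard against zero histogram bins is an implementation choice, not from the paper):

```python
import math

def clr(hist, eps=1e-9):
    """Aitchison's centred log-ratio: map a normalized histogram into Euclidean
    space by taking logs and subtracting their mean (components sum to zero)."""
    logs = [math.log(p + eps) for p in hist]
    mean = sum(logs) / len(logs)
    return [x - mean for x in logs]

def sqrt_map(hist):
    """Square-root map: send the probability simplex onto the positive quadrant
    of the unit sphere (the transformed vector has unit Euclidean norm)."""
    return [math.sqrt(p) for p in hist]

# After either map, plain Euclidean distance (and hence standard clustering
# such as k-means) becomes a sensible similarity measure between histograms.
print(clr([0.2, 0.3, 0.5]))
print(sqrt_map([0.25, 0.25, 0.5]))
```

    Both transforms are cheap and invertible on the open simplex, which is what lets the paper reuse off-the-shelf Euclidean clustering for organ segmentation.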

  3. Elucidating the electron transport in semiconductors via Monte Carlo simulations: an inquiry-driven learning path for engineering undergraduates

    NASA Astrophysics Data System (ADS)

    Persano Adorno, Dominique; Pizzolato, Nicola; Fazio, Claudio

    2015-09-01

    Within the context of higher education for science or engineering undergraduates, we present an inquiry-driven learning path aimed at developing a more meaningful conceptual understanding of the electron dynamics in semiconductors in the presence of applied electric fields. The electron transport in a nondegenerate n-type indium phosphide bulk semiconductor is modelled using a multivalley Monte Carlo approach. The main characteristics of the electron dynamics are explored under different values of the driving electric field, lattice temperature and impurity density. Simulation results are presented by following a question-driven path of exploration, starting from the validation of the model and moving up to reasoned inquiries about the observed characteristics of electron dynamics. Our inquiry-driven learning path, based on numerical simulations, represents a viable example of how to integrate a traditional lecture-based teaching approach with effective learning strategies, providing science or engineering undergraduates with practical opportunities to enhance their comprehension of the physics governing the electron dynamics in semiconductors. Finally, we present a general discussion about the advantages and disadvantages of using an inquiry-based teaching approach within a learning environment based on semiconductor simulations.
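    The multivalley Monte Carlo machinery of the paper is far beyond a short sketch, but the core free-flight/scattering loop can be caricatured with a single-valley, constant-scattering-rate toy model (all values illustrative; a real simulator tracks valleys, band structure and several scattering mechanisms):

```python
import random

def drift_velocity(e_field, tau, mass, n_flights=20000, rng=None):
    """Toy single-particle Monte Carlo: free flight under the field, then an
    isotropizing collision after an exponentially distributed free time.
    The time-averaged velocity approaches the Drude result q*E*tau/m."""
    rng = rng or random.Random(42)
    q = -1.602e-19                 # electron charge (C)
    total_v_dt, total_t = 0.0, 0.0
    v = 0.0
    for _ in range(n_flights):
        dt = rng.expovariate(1.0 / tau)      # free-flight duration, mean tau
        a = q * e_field / mass
        total_v_dt += (v + 0.5 * a * dt) * dt  # average velocity over the flight
        total_t += dt
        v = 0.0                    # collision randomizes direction; mean is zero
    return total_v_dt / total_t

# Illustrative numbers (free-electron mass, not the InP effective mass).
print(drift_velocity(1.0e4, 1.0e-13, 9.11e-31))
```

    This is the kind of validation step the learning path starts from: check the simulated drift against the known low-field analytical limit before exploring hot-electron regimes.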

  4. Improving Forecasts Through Realistic Uncertainty Estimates: A Novel Data Driven Method for Model Uncertainty Quantification in Data Assimilation

    NASA Astrophysics Data System (ADS)

    Pathiraja, S. D.; Moradkhani, H.; Marshall, L. A.; Sharma, A.; Geenens, G.

    2016-12-01

    Effective combination of model simulations and observations through Data Assimilation (DA) depends heavily on uncertainty characterisation. Many traditional methods for quantifying model uncertainty in DA require some level of subjectivity (by way of tuning parameters or by assuming Gaussian statistics). Furthermore, the focus is typically on estimating only the first and second moments. We propose a data-driven methodology to estimate the full distributional form of model uncertainty, i.e. the transition density p(x_t | x_{t-1}). All sources of uncertainty associated with the model simulations are considered collectively, without needing to devise stochastic perturbations for individual components (such as model input, parameter and structural uncertainty). A training period is used to derive the distribution of errors in observed variables conditioned on hidden states. Errors in hidden states are estimated from the conditional distribution of observed variables using non-linear optimization. The theory behind the framework and case study applications are discussed in detail. Results demonstrate improved predictions and more realistic uncertainty bounds compared to a standard perturbation approach.
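    The distribution-free flavour of the method can be illustrated with a much-simplified sketch (hypothetical helper names; the paper's conditioning on hidden states via non-linear optimization is not reproduced): collect one-step model errors over a training period and resample them, instead of assuming Gaussian perturbations.

```python
import random

def fit_error_samples(obs, model_step):
    """Collect one-step model errors over a training period; their empirical
    distribution stands in for the transition density p(x_t | x_{t-1})."""
    return [obs[t] - model_step(obs[t - 1]) for t in range(1, len(obs))]

def sample_transition(x_prev, model_step, errors, rng=random):
    """Propagate a state and add a resampled historical error, so skewed or
    heavy-tailed model errors are represented without a Gaussian assumption."""
    return model_step(x_prev) + rng.choice(errors)

# A perfect model (errors all zero) propagates states exactly.
obs = [1.0, 2.0, 3.0, 4.0]
step = lambda x: x + 1.0
errs = fit_error_samples(obs, step)
print(sample_transition(10.0, step, errs))  # 11.0
```

    In an ensemble DA scheme, each ensemble member would draw its own error sample, yielding non-Gaussian spread wherever the training errors were non-Gaussian.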

  5. Future-oriented tweets predict lower county-level HIV prevalence in the United States.

    PubMed

    Ireland, Molly E; Schwartz, H Andrew; Chen, Qijia; Ungar, Lyle H; Albarracín, Dolores

    2015-12-01

    Future orientation promotes health and well-being at the individual level. Computerized text analysis of a dataset encompassing billions of words used across the United States on Twitter tested whether community-level rates of future-oriented messages correlated with lower human immunodeficiency virus (HIV) rates and moderated the association between behavioral risk indicators and HIV. Over 150 million tweets mapped to U.S. counties were analyzed using 2 methods of text analysis. First, county-level HIV rates (cases per 100,000) were regressed on aggregate usage of future-oriented language (e.g., will, gonna). A second data-driven method regressed HIV rates on individual words and phrases. Results showed that counties with higher rates of future tense on Twitter had fewer HIV cases, independent of strong structural predictors of HIV such as population density. Future-oriented messages also appeared to buffer health risk: Sexually transmitted infection rates and references to risky behavior on Twitter were associated with higher HIV prevalence in all counties except those with high rates of future orientation. Data-driven analyses likewise showed that words and phrases referencing the future (e.g., tomorrow, would be) correlated with lower HIV prevalence. Integrating big data approaches to text analysis and epidemiology with psychological theory may provide an inexpensive, real-time method of anticipating outbreaks of HIV and etiologically similar diseases. (PsycINFO Database Record (c) 2015 APA, all rights reserved).
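    The first analysis step, regressing county HIV rates on aggregate future-tense frequency, reduces to ordinary least squares; a minimal sketch with entirely hypothetical county data (the study additionally controls for structural predictors such as population density):

```python
def ols_slope(x, y):
    """Least-squares slope of y on x: cov(x, y) / var(x)."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var = sum((a - mx) ** 2 for a in x)
    return cov / var

# Hypothetical counties: future-tense tweet frequency (%) vs HIV cases per 100,000.
future_pct = [1.0, 2.0, 3.0, 4.0, 5.0]
hiv_rate = [220.0, 180.0, 150.0, 110.0, 90.0]
print(ols_slope(future_pct, hiv_rate))  # -33.0: more future talk, fewer cases
```

    A negative slope is the paper's headline association; the moderation finding would require adding an interaction term between future orientation and the behavioral risk indicators.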

  6. Future-Oriented Tweets Predict Lower County-Level HIV Prevalence in the United States

    PubMed Central

    Ireland, Molly E.; Schwartz, Hansen A.; Chen, Qijia; Ungar, Lyle; Albarracín, Dolores

    2016-01-01

    Objective Future orientation promotes health and well-being at the individual level. Computerized text analysis of a dataset encompassing billions of words used across the United States on Twitter tested whether community-level rates of future-oriented messages correlated with lower HIV rates and moderated the association between behavioral risk indicators and HIV. Method Over 150 million Tweets mapped to US counties were analyzed using two methods of text analysis. First, county-level HIV rates (cases per 100,000) were regressed on aggregate usage of future-oriented language (e.g., will, gonna). A second data-driven method regressed HIV rates on individual words and phrases. Results Results showed that counties with higher rates of future tense on Twitter had fewer HIV cases, independent of strong structural predictors of HIV such as population density. Future-oriented messages also appeared to buffer health risk: Sexually transmitted infection rates and references to risky behavior on Twitter were associated with higher HIV prevalence in all counties except those with high rates of future orientation. Data-driven analyses likewise showed that words and phrases referencing the future (e.g., tomorrow, would be) correlated with lower HIV prevalence. Conclusion Integrating big data approaches to text analysis and epidemiology with psychological theory may provide an inexpensive, real-time method of anticipating outbreaks of HIV and etiologically similar diseases. PMID:26651466

  7. Cost evaluation to optimise radiation therapy implementation in different income settings: A time-driven activity-based analysis.

    PubMed

    Van Dyk, Jacob; Zubizarreta, Eduardo; Lievens, Yolande

    2017-11-01

    With increasing recognition of growing cancer incidence globally, efficient means of expanding radiotherapy capacity is imperative, and understanding the factors impacting human and financial needs is valuable. A time-driven activity-based costing analysis was performed, using a base case of 2-machine departments, with defined cost inputs and operating parameters. Four income groups were analysed, ranging from low to high income. Scenario analyses included department size, operating hours, fractionation, treatment complexity, efficiency, and centralised versus decentralised care. The base case cost/course is US$5,368 in HICs, US$2,028 in LICs; the annual operating cost is US$4,595,000 and US$1,736,000, respectively. Economies of scale show cost/course decreasing with increasing department size, mainly related to the equipment cost and most prominent up to 3 linacs. The cost in HICs is two or three times as high as in U-MICs or LICs, respectively. Decreasing operating hours below 8h/day has a dramatic impact on the cost/course. IMRT increases the cost/course by 22%. Centralising preparatory activities has a moderate impact on the costs. The results indicate trends that are useful for optimising local and regional circumstances. This methodology can provide input into a uniform and accepted approach to evaluating the cost of radiotherapy. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.
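    Time-driven activity-based costing itself is simple arithmetic: each activity consumes minutes of a resource priced at a capacity cost rate. A sketch with hypothetical resources and rates (not the study's actual inputs):

```python
def tdabc_cost(activities, cost_rates):
    """Time-driven ABC: cost of one treatment course =
    sum over activities of (minutes used) * (cost per minute of the resource)."""
    return sum(minutes * cost_rates[resource] for resource, minutes in activities)

# Hypothetical radiotherapy course: (resource, minutes) with illustrative $/minute rates.
course = [("linac", 300), ("physician", 120), ("physicist", 60)]
rates = {"linac": 8.0, "physician": 4.0, "physicist": 3.0}
print(tdabc_cost(course, rates))  # 3060.0
```

    The economies of scale in the abstract arise because equipment cost rates fall as fixed capacity is spread over more operating minutes, which this formulation makes explicit: halving operating hours roughly doubles the per-minute linac rate.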

  8. System driven technology selection for future European launch systems

    NASA Astrophysics Data System (ADS)

    Baiocco, P.; Ramusat, G.; Sirbi, A.; Bouilly, Th.; Lavelle, F.; Cardone, T.; Fischer, H.; Appel, S.

    2015-02-01

    In the framework of the next generation launcher activity at ESA, a top-down approach and a bottom-up approach have been performed for the identification of promising technologies and alternative conceptions of future European launch vehicles. The top-down approach looks for system-driven design solutions, while the bottom-up approach features design solutions leading to substantial advantages for the system. The main investigations have been focused on future launch vehicle technologies. Preliminary specifications have been used in order to allow sub-system design to deliver the greatest benefit to the overall launch system. Development cost, non-recurring and recurring cost, industrialization and operational aspects have been considered as competitiveness factors for the identification and down-selection of the most interesting technologies. The recurring cost per unit payload mass has been evaluated. The TRL/IRL has been assessed and a preliminary development plan has been traced for the most promising technologies. The potentially applicable launch systems are Ariane and VEGA evolutions. The main FLPP technologies aim at reducing overall structural mass; increasing structural margins for robustness; metallic and composite containment of cryogenic hydrogen and oxygen propellants; propellant management subsystems; elements significantly reducing fabrication and operational costs; avionics; pyrotechnics; etc., in order to derive performing upper and booster stages. Application of the system-driven approach allows creating performing technology demonstrators in terms of need, demonstration objective, size and cost. This paper outlines the process of technology down-selection using a system-driven approach, the accomplishments already achieved in the various technology fields up to now, as well as the potential associated benefits in terms of competitiveness factors.

  9. Deriving Flood-Mediated Connectivity between River Channels and Floodplains: Data-Driven Approaches

    NASA Astrophysics Data System (ADS)

    Zhao, Tongtiegang; Shao, Quanxi; Zhang, Yongyong

    2017-03-01

    The flood-mediated connectivity between river channels and floodplains plays a fundamental role in flood hazard mapping and exerts profound ecological effects. The classic nearest neighbor search (NNS) fails to derive this connectivity because of spatial heterogeneity and continuity. We develop two novel data-driven connectivity-deriving approaches, namely progressive nearest neighbor search (PNNS) and progressive iterative nearest neighbor search (PiNNS). These approaches are illustrated through a case study in Northern Australia. First, PNNS and PiNNS are employed to identify flood pathways on floodplains through forward tracking: progressive search is performed to associate newly inundated cells in each time step with previously inundated cells. In particular, iterations in PiNNS ensure that the connectivity is continuous, i.e. the connection between any two cells along the pathway is built through intermediate inundated cells. Second, inundated floodplain cells are collectively connected to river channel cells through backward tracing. Certain river channel sections are identified that connect to a large number of inundated floodplain cells; the floodwater from these sections causes widespread floodplain inundation. Our proposed approaches take advantage of spatial-temporal data. They can be applied to derive connectivity from hydrodynamic and remote sensing data and assist in river basin planning and management.
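    The continuity constraint behind PiNNS, that every newly inundated cell must connect to the earlier flood front through inundated cells only, can be sketched as a breadth-first search on the grid (an illustrative reconstruction, not the authors' implementation):

```python
from collections import deque

def connect_new_cells(prev_wet, new_wet):
    """Return the newly inundated cells that have a continuous inundated path
    (4-connected) back to the previously inundated front, enforcing the
    continuity constraint: connections pass only through wet cells."""
    wet = prev_wet | new_wet
    reached = set()
    queue = deque(prev_wet)
    seen = set(prev_wet)
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nb = (r + dr, c + dc)
            if nb in wet and nb not in seen:
                seen.add(nb)
                queue.append(nb)
                if nb in new_wet:
                    reached.add(nb)
    return reached

front = {(0, 0)}
newly = {(0, 1), (0, 2), (5, 5)}
print(connect_new_cells(front, newly))  # (5, 5) is excluded: no wet path to the front
```

    Running this per time step (forward tracking) and then tracing each floodplain cell's chain of predecessors back to a channel cell (backward tracing) reproduces the two-pass structure described in the abstract.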

  10. Model-driven Service Engineering with SoaML

    NASA Astrophysics Data System (ADS)

    Elvesæter, Brian; Carrez, Cyril; Mohagheghi, Parastoo; Berre, Arne-Jørgen; Johnsen, Svein G.; Solberg, Arnor

    This chapter presents a model-driven service engineering (MDSE) methodology that uses OMG MDA specifications such as BMM, BPMN and SoaML to identify and specify services within a service-oriented architecture. The methodology takes advantage of business modelling practices and provides a guide to service modelling with SoaML. The presentation is case-driven and illuminated using the telecommunication example. The chapter focuses in particular on the use of the SoaML modelling language as a means for expressing service specifications that are aligned with business models and can be realized in different platform technologies.

  11. Retrospective cost adaptive Reynolds-averaged Navier-Stokes k-ω model for data-driven unsteady turbulent simulations

    NASA Astrophysics Data System (ADS)

    Li, Zhiyong; Hoagg, Jesse B.; Martin, Alexandre; Bailey, Sean C. C.

    2018-03-01

    This paper presents a data-driven computational model for simulating unsteady turbulent flows, where sparse measurement data are available. The model uses the retrospective cost adaptation (RCA) algorithm to automatically adjust the closure coefficients of the Reynolds-averaged Navier-Stokes (RANS) k-ω turbulence equations to improve agreement between the simulated flow and the measurements. The RCA-RANS k-ω model is verified for steady flow using a pipe-flow test case and for unsteady flow using a surface-mounted-cube test case. Measurements used for adaptation of the verification cases are obtained from baseline simulations with known closure coefficients. These verification test cases demonstrate that the RCA-RANS k-ω model can successfully adapt the closure coefficients to improve agreement between the simulated flow field and a set of sparse flow-field measurements. Furthermore, the RCA-RANS k-ω model improves agreement between the simulated flow and the baseline flow at locations at which measurements do not exist. The RCA-RANS k-ω model is also validated with experimental data from two test cases: steady pipe flow and unsteady flow past a square cylinder. In both test cases, the adaptation improves agreement with experimental data in comparison to the results from a non-adaptive RANS k-ω model that uses the standard values of the k-ω closure coefficients. For the steady pipe flow, adaptation is driven by mean stream-wise velocity measurements at 24 locations along the pipe radius. The RCA-RANS k-ω model reduces the average velocity error at these locations by over 35%. For the unsteady flow over a square cylinder, adaptation is driven by time-varying surface pressure measurements at two locations on the square cylinder. The RCA-RANS k-ω model reduces the average surface-pressure error at these locations by 88.8%.

  12. A study on a robot arm driven by three-dimensional trajectories predicted from non-invasive neural signals.

    PubMed

    Kim, Yoon Jae; Park, Sung Woo; Yeom, Hong Gi; Bang, Moon Suk; Kim, June Sic; Chung, Chun Kee; Kim, Sungwan

    2015-08-20

    A brain-machine interface (BMI) should be able to help people with disabilities by replacing their lost motor functions. To replace lost functions, robot arms have been developed that are controlled by invasive neural signals. Although invasive neural signals have a high spatial resolution, non-invasive neural signals are valuable because they provide an interface without surgery. Thus, various researchers have developed robot arms driven by non-invasive neural signals. However, robot arm control based on the imagined trajectory of a human hand can be more intuitive for patients. In this study, therefore, an integrated robot arm-gripper system (IRAGS) that is driven by three-dimensional (3D) hand trajectories predicted from non-invasive neural signals was developed and verified. The IRAGS was developed by integrating a six-degree of freedom robot arm and adaptive robot gripper. The system was used to perform reaching and grasping motions for verification. The non-invasive neural signals, magnetoencephalography (MEG) and electroencephalography (EEG), were obtained to control the system. The 3D trajectories were predicted by multiple linear regressions. A target sphere was placed at the terminal point of the real trajectories, and the system was commanded to grasp the target at the terminal point of the predicted trajectories. The average correlation coefficient between the predicted and real trajectories in the MEG case was [Formula: see text] ([Formula: see text]). In the EEG case, it was [Formula: see text] ([Formula: see text]). The success rates in grasping the target plastic sphere were 18.75 and 7.50 % with MEG and EEG, respectively. The success rates of touching the target were 52.50 and 58.75 % respectively. A robot arm driven by 3D trajectories predicted from non-invasive neural signals was implemented, and reaching and grasping motions were performed. 
    In most cases, the robot closely approached the target, but the success rate was not very high because non-invasive neural signals are less accurate. However, the success rate could be sufficiently improved for practical applications by using additional sensors. Robot arm control based on hand trajectories predicted from EEG would allow for portability, and the performance with EEG was comparable to that with MEG.
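
    The prediction step described above - 3D trajectories from neural signals by multiple linear regressions - can be sketched on synthetic data (all shapes and values are invented for illustration):

```python
# Multiple linear regression from neural features to 3-D hand position,
# as in the abstract; X holds per-sample neural features, Y the positions.
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_features = 200, 10

X = rng.standard_normal((n_samples, n_features))        # neural features per sample
W_true = rng.standard_normal((n_features, 3))           # unknown feature-to-position map
Y = X @ W_true + 0.1 * rng.standard_normal((n_samples, 3))  # noisy 3-D hand positions

# append an intercept column and fit one linear model per axis by least squares
Xa = np.hstack([X, np.ones((n_samples, 1))])
W, *_ = np.linalg.lstsq(Xa, Y, rcond=None)

Y_hat = Xa @ W
# per-axis correlation between real and predicted trajectories,
# the same figure of merit the study reports
r = [np.corrcoef(Y[:, k], Y_hat[:, k])[0, 1] for k in range(3)]
print(r)  # all three close to 1 for this low-noise synthetic example
```

    Real MEG/EEG features are far noisier than this synthetic X, which is why the reported correlations are well below 1.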

  13. Information Technology in University-Level Mathematics Teaching and Learning: A Mathematician's Point of View

    ERIC Educational Resources Information Center

    Borovik, Alexandre

    2011-01-01

    Although mathematicians frequently use specialist software in the direct teaching of mathematics, e-learning technologies have so far been less widely used as a means of delivery. We (mathematicians) insist that teaching methods should be subject-specific and content-driven, not delivery-driven. We oppose generic approaches to teaching, including…

  14. Development and evaluation of SOA-based AAL services in real-life environments: a case study and lessons learned.

    PubMed

    Stav, Erlend; Walderhaug, Ståle; Mikalsen, Marius; Hanke, Sten; Benc, Ivan

    2013-11-01

    The proper use of ICT services can support seniors in living independently longer. While such services are starting to emerge, current proprietary solutions are often expensive, covering only isolated parts of seniors' needs, and lack support for sharing information between services and between users. For developers, the challenge is that it is complex and time-consuming to develop high-quality, interoperable services, and new techniques are needed to simplify the development and reduce the development costs. This paper provides the complete view of the experiences gained in the MPOWER project with respect to using model-driven development (MDD) techniques for Service Oriented Architecture (SOA) system development in the Ambient Assisted Living (AAL) domain. To address this challenge, the approach of the European research project MPOWER (2006-2009) was to investigate and record the user needs, define a set of reusable software services based on these needs, and then implement pilot systems using these services. Further, a model-driven toolchain covering key development phases was developed to support software developers through this process. Evaluations were conducted both on the technical artefacts (methodology and tools), and on end user experience from using the pilot systems in trial sites. The outcome of the work on the user needs is a knowledge base recorded as a Unified Modeling Language (UML) model. This comprehensive model describes actors, use cases, and features derived from these. The model further includes the design of a set of software services, including full trace information back to the features and use cases motivating their design. Based on the model, the services were implemented for use in Service Oriented Architecture (SOA) systems, and are publicly available as open source software. The services were successfully used in the realization of two pilot applications.
There is therefore a direct and traceable link from the user needs of the elderly, through the service design knowledge base, to the service and pilot implementations. The evaluation of the SOA approach on the developers in the project revealed that SOA is useful with respect to job performance and quality. Furthermore, the developers found SOA easy to use and supportive of AAL application development. An important finding is that the developers clearly report that they intend to use SOA in the future, but not for all types of projects. With respect to using model-driven development in web service design and implementation, the developers reported that it was useful. However, the code generated from the models must be correct if the full potential of MDD is to be achieved. The pilots and their evaluation in the trial sites showed that the services of the platform are sufficient to create suitable systems for end users in the domain. A SOA platform with a set of reusable domain services is a suitable foundation for more rapid development and tailoring of assisted living systems covering recurring needs among elderly users. It is feasible to realize a tool-chain for model-driven development of SOA applications in the AAL domain, and such a tool-chain can be accepted and found useful by software developers. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Knirsch, Fabian; Engel, Dominik; Neureiter, Christian

    In a smart grid, data and information are transported, transmitted, stored, and processed with various stakeholders having to cooperate effectively. Furthermore, personal data is the key to many smart grid applications and therefore privacy impacts have to be taken into account. For an effective smart grid, well integrated solutions are crucial and for achieving a high degree of customer acceptance, privacy should already be considered at design time of the system. To assist system engineers in the early design phase, frameworks for the automated privacy evaluation of use cases are important. For evaluation, use cases for services and software architectures need to be formally captured in a standardized and commonly understood manner. In order to ensure this common understanding for all kinds of stakeholders, reference models have recently been developed. In this paper we present a model-driven approach for the automated assessment of such services and software architectures in the smart grid that builds on the standardized reference models. The focus of qualitative and quantitative evaluation is on privacy. For evaluation, the framework draws on use cases from the University of Southern California microgrid.

  16. A data-driven approach for modeling post-fire debris-flow volumes and their uncertainty

    USGS Publications Warehouse

    Friedel, Michael J.

    2011-01-01

    This study demonstrates the novel application of genetic programming to evolve nonlinear post-fire debris-flow volume equations from variables associated with a data-driven conceptual model of the western United States. The search space is constrained using a multi-component objective function that simultaneously minimizes root-mean-squared and unit errors for the evolution of the fittest equations. An optimization technique is then used to estimate the limits of nonlinear prediction uncertainty associated with the debris-flow equations. In contrast to a published multiple linear regression three-variable equation, linking basin area with slopes greater than or equal to 30 percent, burn severity characterized as area burned moderate plus high, and total storm rainfall, the data-driven approach discovers many nonlinear and several dimensionally consistent equations that are unbiased and have less prediction uncertainty. Of the nonlinear equations, the best performance (lowest prediction uncertainty) is achieved when using three variables: average basin slope, total burned area, and total storm rainfall. Further reduction in uncertainty is possible for the nonlinear equations when dimensional consistency is not a priority and by subsequently applying a gradient solver to the fittest solutions. The data-driven modeling approach can be applied to nonlinear multivariate problems in all fields of study.

  17. Numerical Simulation of Interacting Magnetic Flux Ropes

    NASA Astrophysics Data System (ADS)

    Odstrcil, Dusan; Vandas, Marek; Pizzo, Victor J.; MacNeice, Peter

    2003-09-01

    A 2.5-D MHD numerical model is used to investigate the dynamic interaction between two flux ropes (clouds) in a homogeneous magnetized plasma. One cloud is set into motion while the other is initially at rest. The moving cloud generates a shock which interacts with the second cloud. Two cases with different characteristic speeds within the second cloud are presented. The shock front is significantly distorted when it propagates faster (slower) in the cloud with larger (smaller) characteristic speed. Correspondingly, the density behind the shock front becomes smaller (larger). Later, the clouds approach each other and by a momentum exchange they come to a common speed. The oppositely directed magnetic fields are pushed together, a driven magnetic reconnection takes place, and the two flux ropes gradually coalesce into a single flux rope.

  18. Maximum Principle for General Controlled Systems Driven by Fractional Brownian Motions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Han Yuecai; Hu Yaozhong; Song Jian, E-mail: jsong2@math.rutgers.edu

    2013-04-15

    We obtain a maximum principle for the stochastic control problem of general controlled stochastic differential systems driven by fractional Brownian motions (of Hurst parameter H>1/2). This maximum principle specifies a system of equations that the optimal control must satisfy (necessary condition for the optimal control). This system of equations consists of a backward stochastic differential equation driven by both fractional Brownian motions and the corresponding underlying standard Brownian motions. In addition to this backward equation, the maximum principle also involves the Malliavin derivatives. Our approach is to use conditioning and Malliavin calculus. To arrive at our maximum principle we need to develop some new results of stochastic analysis of the controlled systems driven by fractional Brownian motions via fractional calculus. Our approach of conditioning and Malliavin calculus is also applied to classical systems driven by standard Brownian motions while the controller has only partial information. As a straightforward consequence, the classical maximum principle is also deduced in this more natural and simpler way.

  19. A New Multivariate Approach for Prognostics Based on Extreme Learning Machine and Fuzzy Clustering.

    PubMed

    Javed, Kamran; Gouriveau, Rafael; Zerhouni, Noureddine

    2015-12-01

    Prognostics is a core process of the prognostics and health management (PHM) discipline that estimates the remaining useful life (RUL) of degrading machinery to optimize its service delivery potential. However, machinery operates in a dynamic environment and the acquired condition monitoring data are usually noisy and subject to a high level of uncertainty/unpredictability, which complicates prognostics. The complexity further increases when there is an absence of prior knowledge about ground truth (or failure definition). For such issues, data-driven prognostics can be a valuable solution without deep understanding of system physics. This paper contributes a new data-driven prognostics approach, namely an "enhanced multivariate degradation modeling," which enables modeling degrading states of machinery without assuming a homogeneous pattern. In brief, a predictability scheme is introduced to reduce the dimensionality of the data. Following that, the proposed prognostics model is achieved by integrating two new algorithms, namely the summation wavelet-extreme learning machine and subtractive-maximum entropy fuzzy clustering, to show the evolution of machine degradation by simultaneous predictions and discrete state estimation. The prognostics model is equipped with a dynamic failure threshold assignment procedure to estimate RUL in a realistic manner. To validate the proposition, a case study is performed on turbofan engine data from the PHM challenge 2008 (NASA), and results are compared with recent publications.
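
    The extreme learning machine at the core of the proposed model trains only its output layer; a minimal regression sketch (synthetic target and parameters invented for illustration, plain tanh activation standing in for the paper's summation wavelet activations):

```python
# Extreme learning machine in its simplest form: a random, untrained hidden
# layer followed by a least-squares fit of the output weights only.
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, (300, 1))
y = np.sin(3 * X[:, 0])                       # smooth target to regress

n_hidden = 50
W = rng.standard_normal((1, n_hidden))        # random input weights, never trained
b = rng.standard_normal(n_hidden)

H = np.tanh(X @ W + b)                        # random hidden-layer features
beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # only the output weights are fitted

y_hat = H @ beta
rmse = float(np.sqrt(np.mean((y - y_hat) ** 2)))
print(rmse)  # well under 0.1 for this smooth one-dimensional target
```

    Because only a linear solve is involved, training is fast, which is one reason ELM variants are attractive for repeated prognostic updates.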

  20. Evolutionary programming for goal-driven dynamic planning

    NASA Astrophysics Data System (ADS)

    Vaccaro, James M.; Guest, Clark C.; Ross, David O.

    2002-03-01

    Many complex artificial intelligence (AI) problems are goal-driven in nature and the opportunity exists to realize the benefits of a goal-oriented solution. In many cases, such as in command and control, a goal-oriented approach may be the only option. One of many appropriate applications for such an approach is War Gaming. War Gaming is an important tool for command and control because it provides a set of alternative courses of action so that military leaders can contemplate their next move on the battlefield. For instance, when making decisions that save lives, it is necessary to completely understand the consequences of a given order. A goal-oriented approach provides a slowly evolving, tractably reasoned solution that inherently follows one of the principles of war: namely, concentration on the objective. Future decision-making will depend not only on the battlefield, but also on a virtual world where military leaders can wage wars and determine their options by playing computer war games much like the real world. The problem with these games is that the built-in AI neither learns nor adapts, and often cheats, because the intelligent player has access to all the information, while the user has access to limited information provided on a display. These games are written for the purpose of entertainment, and actions are calculated a priori and off-line, prior to or during their development. With these games getting more sophisticated in structure and less domain-specific in scope, there needs to be a more general intelligent player that can adapt and learn in case the battlefield situations or the rules of engagement change. One such war game that might be considered is Risk. Risk incorporates the principles of war, is a top-down scalable model, and provides a good application for testing a variety of goal-oriented AI approaches.
By integrating a goal-oriented hybrid approach, one can develop a program that plays the Risk game effectively and move one step closer to solving more difficult real-world AI problems. A hybrid approach that uses evolutionary computation to adaptively plan a Risk player's turn provides better dynamic intelligent planning than more uniform approaches.

  1. A prototype case-based reasoning human assistant for space crew assessment and mission management

    NASA Technical Reports Server (NTRS)

    Owen, Robert B.; Holland, Albert W.; Wood, Joanna

    1993-01-01

    We present a prototype human assistant system for space crew assessment and mission management. Our system is based on case episodes from American and Russian space missions and analog environments such as polar stations and undersea habitats. The general domain of small groups in isolated and confined environments represents a near ideal application area for case-based reasoning (CBR) - there are few reliable rules to follow, and most domain knowledge is in the form of cases. We define the problem domain and outline a unique knowledge representation system driven by conflict and communication triggers. The prototype system is able to represent, index, and retrieve case studies of human performance. We index by social, behavioral, and environmental factors. We present the problem domain, our current implementation, our research approach for an operational system, and prototype performance and results.

  2. Optimal filter design with progressive genetic algorithm for local damage detection in rolling bearings

    NASA Astrophysics Data System (ADS)

    Wodecki, Jacek; Michalak, Anna; Zimroz, Radoslaw

    2018-03-01

    Harsh industrial conditions present in underground mining cause many difficulties for local damage detection in heavy-duty machinery. For vibration signals, one of the most intuitive approaches to obtaining a signal with the expected properties, such as clearly visible informative features, is prefiltration with an appropriately prepared filter. The design of such filters is a broad field of research in its own right. In this paper, the authors propose a novel approach to dedicated optimal filter design using a progressive genetic algorithm. The presented method is fully data-driven and requires no prior knowledge of the signal. It has been tested against a set of real and simulated data, and its effectiveness has been proven for both healthy and damaged cases. A termination criterion for the evolution process was developed, and a diagnostic decision-making feature is proposed for determining the final result.
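
    As a sketch of the idea (synthetic vibration signal, toy GA, all parameters invented; the paper's progressive GA is considerably more elaborate), candidate prefilters can be represented as frequency bands and evolved to maximize the kurtosis of the filtered signal, which rewards the impulsive content associated with local damage:

```python
# Data-driven filter selection with a plain genetic algorithm: evolve a
# (low, high) passband so that the bandpass-filtered signal is maximally
# impulsive (high kurtosis), with no prior knowledge of the fault band.
import numpy as np

rng = np.random.default_rng(2)
fs, n = 1000, 4000
t = np.arange(n) / fs

# fault-like impulses exciting a 150 Hz resonance, buried under a strong
# 20 Hz harmonic interference and white noise
impulses = np.zeros(n)
impulses[::250] = 1.0
ring = np.exp(-np.arange(100) / 10.0) * np.sin(2 * np.pi * 150 * np.arange(100) / fs)
signal = np.convolve(impulses, ring)[:n] + 3 * np.sin(2 * np.pi * 20 * t)
signal = signal + 0.2 * rng.standard_normal(n)

def bandpass(x, lo, hi):
    """Ideal FFT bandpass keeping frequencies in [lo, hi] Hz."""
    spec = np.fft.rfft(x)
    f = np.fft.rfftfreq(len(x), 1 / fs)
    spec[(f < lo) | (f > hi)] = 0
    return np.fft.irfft(spec, len(x))

def kurtosis(x):
    x = x - x.mean()
    return np.mean(x ** 4) / np.mean(x ** 2) ** 2

def fitness(band):
    lo, hi = band
    return kurtosis(bandpass(signal, lo, hi)) if hi - lo > 10 else -np.inf

# plain GA: keep the fittest half of the population, mutate it to refill
pop = [np.sort(rng.uniform(5, 450, 2)) for _ in range(20)]
for _ in range(30):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]
    children = [np.sort(np.clip(p + rng.normal(0, 15, 2), 5, 450)) for p in parents]
    pop = parents + children

best = max(pop, key=fitness)
print(best, fitness(best))  # the selected band typically brackets 150 Hz
```

    The evolved band isolates the resonance excited by the impulses while rejecting the dominant harmonic interference, which is the behavior the paper's optimal prefilter is designed to achieve.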

  3. A coarse-grained Monte Carlo approach to diffusion processes in metallic nanoparticles

    NASA Astrophysics Data System (ADS)

    Hauser, Andreas W.; Schnedlitz, Martin; Ernst, Wolfgang E.

    2017-06-01

    A kinetic Monte Carlo approach on a coarse-grained lattice is developed for the simulation of surface diffusion processes of Ni, Pd and Au structures with diameters in the range of a few nanometers. Intensity information obtained via standard two-dimensional transmission electron microscopy imaging techniques is used to create three-dimensional structure models as input for a cellular automaton. A series of update rules based on reaction kinetics is defined to allow for a stepwise evolution in time with the aim of simulating surface diffusion phenomena such as Rayleigh breakup and surface wetting. The material flow, in our case represented by the hopping of discrete portions of metal on a given grid, is driven by the attempt to minimize the surface energy, which can be achieved by maximizing the number of filled neighbor cells.

  4. Developing a habitat-driven approach to CWWT design

    USGS Publications Warehouse

    Sartoris, James J.; Thullen, Joan S.

    1998-01-01

    A habitat-driven approach to CWWT design is defined as designing the constructed wetland to maximize habitat values for a given site within the constraints of meeting specified treatment criteria. This is in contrast to the more typical approach of designing the CWWT to maximize treatment efficiency, and then, perhaps, adding wildlife habitat features. The habitat-driven approach is advocated for two reasons: (1) because good wetland habitat is critically lacking, and (2) because it is hypothesized that well-designed habitat will result in good, sustainable wastewater treatment.

  5. Joint sparsity based heterogeneous data-level fusion for target detection and estimation

    NASA Astrophysics Data System (ADS)

    Niu, Ruixin; Zulch, Peter; Distasio, Marcello; Blasch, Erik; Shen, Dan; Chen, Genshe

    2017-05-01

    Typical surveillance systems employ decision- or feature-level fusion approaches to integrate heterogeneous sensor data, which are sub-optimal and incur information loss. In this paper, we investigate data-level heterogeneous sensor fusion. Since the sensors monitor the common targets of interest, whose states can be determined by only a few parameters, it is reasonable to assume that the measurement domain has a low intrinsic dimensionality. For heterogeneous sensor data, we develop a joint-sparse data-level fusion (JSDLF) approach based on the emerging joint sparse signal recovery techniques by discretizing the target state space. This approach is applied to fuse signals from multiple distributed radio frequency (RF) signal sensors and a video camera for joint target detection and state estimation. The JSDLF approach is data-driven and requires minimum prior information, since there is no need to know the time-varying RF signal amplitudes, or the image intensity of the targets. It can handle non-linearity in the sensor data due to state space discretization and the use of frequency/pixel selection matrices. Furthermore, for a multi-target case with J targets, the JSDLF approach only requires discretization in a single-target state space, instead of discretization in a J-target state space, as in the case of the generalized likelihood ratio test (GLRT) or the maximum likelihood estimator (MLE). Numerical examples are provided to demonstrate that the proposed JSDLF approach achieves excellent performance with near real-time accurate target position and velocity estimates.
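
    The sparse-recovery core of the approach can be sketched with a single linear sensor and a discretized state grid (dimensions and values invented for illustration; the JSDLF approach solves a joint-sparse, multi-sensor version of this problem). Orthogonal matching pursuit stands in here for the joint sparse recovery techniques the paper builds on:

```python
# Sparse recovery over a discretized target-state grid: the target occupies
# one grid cell, so the state vector is 1-sparse and can be recovered from
# few measurements by a greedy pursuit.
import numpy as np

rng = np.random.default_rng(4)
n_grid, n_meas = 100, 30

A = rng.standard_normal((n_meas, n_grid)) / np.sqrt(n_meas)  # sensing matrix
true_idx = 42                          # the target occupies one grid cell
x = np.zeros(n_grid)
x[true_idx] = 1.0
y = A @ x + 0.01 * rng.standard_normal(n_meas)

# OMP: greedily pick the column most correlated with the residual, refit, repeat
support, residual = [], y.copy()
for _ in range(2):                     # assume at most two targets
    k = int(np.argmax(np.abs(A.T @ residual)))
    if k not in support:
        support.append(k)
    coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
    residual = y - A[:, support] @ coef

print(support)  # the true grid index 42 is in the recovered support
```

    In the multi-sensor setting, the joint-sparsity constraint forces all sensors to agree on the same support, which is what couples the heterogeneous RF and video data.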

  6. A Fast Surrogate-facilitated Data-driven Bayesian Approach to Uncertainty Quantification of a Regional Groundwater Flow Model with Structural Error

    NASA Astrophysics Data System (ADS)

    Xu, T.; Valocchi, A. J.; Ye, M.; Liang, F.

    2016-12-01

    Due to simplification and/or misrepresentation of the real aquifer system, numerical groundwater flow and solute transport models are usually subject to model structural error. During model calibration, the hydrogeological parameters may be overly adjusted to compensate for unknown structural error. This may result in biased predictions when models are used to forecast aquifer response to new forcing. In this study, we extend a fully Bayesian method [Xu and Valocchi, 2015] to calibrate a real-world, regional groundwater flow model. The method uses a data-driven error model to describe model structural error and jointly infers model parameters and structural error. In this study, Bayesian inference is facilitated using high performance computing and fast surrogate models. The surrogate models are constructed using machine learning techniques to emulate the response simulated by the computationally expensive groundwater model. We demonstrate in the real-world case study that explicitly accounting for model structural error yields parameter posterior distributions that are substantially different from those derived by the classical Bayesian calibration that does not account for model structural error. In addition, the Bayesian method with the error model gives significantly more accurate predictions along with reasonable credible intervals.

  7. High-performance computational fluid dynamics: a custom-code approach

    NASA Astrophysics Data System (ADS)

    Fannon, James; Loiseau, Jean-Christophe; Valluri, Prashant; Bethune, Iain; Náraigh, Lennon Ó.

    2016-07-01

    We introduce a modified and simplified version of the pre-existing fully parallelized three-dimensional Navier-Stokes flow solver known as TPLS. We demonstrate how the simplified version can be used as a pedagogical tool for the study of computational fluid dynamics (CFD) and parallel computing. TPLS is at its heart a two-phase flow solver, and uses calls to a range of external libraries to accelerate its performance. However, in the present context we narrow the focus of the study to basic hydrodynamics and parallel computing techniques, and the code is therefore simplified and modified to simulate pressure-driven single-phase flow in a channel, using only relatively simple Fortran 90 code with MPI parallelization, but no calls to any other external libraries. The modified code is analysed in order to both validate its accuracy and investigate its scalability up to 1000 CPU cores. Simulations are performed for several benchmark cases in pressure-driven channel flow, including a turbulent simulation, wherein the turbulence is incorporated via the large-eddy simulation technique. The work may be of use to advanced undergraduate and graduate students as an introductory study in CFD, while also providing insight for those interested in more general aspects of high-performance computing.
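
    The pressure-driven single-phase channel flow the simplified solver targets has a well-known steady laminar limit; a minimal finite-difference sketch (parameters invented for illustration, reduced to one dimension) recovers the analytic Poiseuille profile:

```python
# Steady laminar channel flow: solve mu * u''(y) = dp/dx with no-slip walls
# by central differences and compare against the analytic parabolic profile.
import numpy as np

H, n = 1.0, 101            # channel height and number of grid points
mu, dpdx = 1.0, -2.0       # viscosity and applied pressure gradient
y = np.linspace(0.0, H, n)
h = y[1] - y[0]

# tridiagonal system for interior nodes: (u[i-1] - 2 u[i] + u[i+1]) / h^2 = dpdx / mu
A = np.zeros((n - 2, n - 2))
np.fill_diagonal(A, -2.0)
np.fill_diagonal(A[1:], 1.0)       # subdiagonal
np.fill_diagonal(A[:, 1:], 1.0)    # superdiagonal
b = np.full(n - 2, dpdx / mu * h ** 2)

u = np.zeros(n)                    # no-slip walls: u = 0 at y = 0 and y = H
u[1:-1] = np.linalg.solve(A, b)

u_exact = dpdx / (2 * mu) * y * (y - H)    # analytic Poiseuille profile
err = np.max(np.abs(u - u_exact))
print(err)  # near machine precision: the parabola is exact on this stencil
```

    Validation against this exact solution is the standard first benchmark for a channel-flow code before moving on to unsteady and turbulent cases.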

  8. Test-driven programming

    NASA Astrophysics Data System (ADS)

    Georgiev, Bozhidar; Georgieva, Adriana

    2013-12-01

    In this paper, some possibilities concerning the implementation of test-driven development as a programming method are presented. A different point of view on the creation of advanced programming techniques is offered: tests are built before the program source, together with all the necessary software tools and modules. This nontraditional approach, which eases the programmer's work by building the tests first, is a preferable way of software development. It allows comparatively simple programming (applied with different object-oriented programming languages such as Java and Python, together with supporting technologies such as XML), and it is a predictable way to develop software tools and to help create better software that is also easier to maintain. Test-driven programming is able to replace more complicated conventional paradigms used by many programmers.
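
    The test-first workflow can be illustrated with a tiny example (function name and behavior invented for illustration): the tests are written first and fix the contract, and the implementation is then the minimum needed to make them pass:

```python
# Test-driven development in miniature: in the workflow, the TestCase below
# exists (and fails) before normalize_whitespace is implemented.
import unittest

def normalize_whitespace(text):
    """Collapse runs of whitespace into single spaces and trim both ends."""
    return " ".join(text.split())

class TestNormalizeWhitespace(unittest.TestCase):
    # written first: these assertions define the contract the code must satisfy
    def test_collapses_runs(self):
        self.assertEqual(normalize_whitespace("a   b\t c"), "a b c")

    def test_trims_ends(self):
        self.assertEqual(normalize_whitespace("  hello  "), "hello")

    def test_empty_input(self):
        self.assertEqual(normalize_whitespace(""), "")
```

    Running the suite (for example with `python -m unittest`) then drives each small implementation step, which is the discipline the paper advocates.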

  9. Marketing the pathology practice.

    PubMed

    Berkowitz, E N

    1995-07-01

    Effective marketing of the pathology practice is essential in the face of an increasingly competitive market. Successful marketing begins with a market-driven planning process. As opposed to the traditional planning process used in health care organizations, a market-driven approach is externally driven. Implementing a market-driven plan also requires recognition of the definition of the service. Each market to which pathologists direct their service defines the service differently. Recognition of these different service definitions and creation of a product to meet these needs could lead to competitive advantages in the marketplace.

  10. Quantum correlations and limit cycles in the driven-dissipative Heisenberg lattice

    NASA Astrophysics Data System (ADS)

    Owen, E. T.; Jin, J.; Rossini, D.; Fazio, R.; Hartmann, M. J.

    2018-04-01

    Driven-dissipative quantum many-body systems have attracted increasing interest in recent years as they lead to novel classes of quantum many-body phenomena. In particular, mean-field calculations predict limit cycle phases, slow oscillations instead of stationary states, in the long-time limit for a number of driven-dissipative quantum many-body systems. Using a cluster mean-field and a self-consistent Mori projector approach, we explore the persistence of such limit cycles as short range quantum correlations are taken into account in a driven-dissipative Heisenberg model.

  11. Electric-field-driven electron-transfer in mixed-valence molecules

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blair, Enrique P., E-mail: enrique-blair@baylor.edu; Corcelli, Steven A., E-mail: scorcell@nd.edu; Lent, Craig S., E-mail: lent@nd.edu

    2016-07-07

    Molecular quantum-dot cellular automata is a computing paradigm in which digital information is encoded by the charge configuration of a mixed-valence molecule. General-purpose computing can be achieved by arranging these compounds on a substrate and exploiting intermolecular Coulombic coupling. The operation of such a device relies on nonequilibrium electron transfer (ET), whereby the time-varying electric field of one molecule induces an ET event in a neighboring molecule. The magnitude of the electric fields can be quite large because of close spatial proximity, and the induced ET rate is a measure of the nonequilibrium response of the molecule. We calculate the electric-field-driven ET rate for a model mixed-valence compound. The mixed-valence molecule is regarded as a two-state electronic system coupled to a molecular vibrational mode, which is, in turn, coupled to a thermal environment. Both the electronic and vibrational degrees-of-freedom are treated quantum mechanically, and the dissipative vibrational-bath interaction is modeled with the Lindblad equation. This approach captures both tunneling and nonadiabatic dynamics. Relationships between microscopic molecular properties and the driven ET rate are explored for two time-dependent applied fields: an abruptly switched field and a linearly ramped field. In both cases, the driven ET rate is only weakly temperature dependent. When the model is applied using parameters appropriate to a specific mixed-valence molecule, diferrocenylacetylene, terahertz-range ET transfer rates are predicted.

  12. Double-well dynamics of noise-driven control activation in human intermittent control: the case of stick balancing.

    PubMed

    Zgonnikov, Arkady; Lubashevsky, Ihor

    2015-11-01

    When facing a task of balancing a dynamic system near an unstable equilibrium, humans often adopt an intermittent control strategy: Instead of continuously controlling the system, they repeatedly switch the control on and off. A paradigmatic example of such a task is stick balancing. Despite the simplicity of the task itself, the complexity of human intermittent control dynamics in stick balancing still puzzles researchers in motor control. Here we attempt to model one of the key mechanisms of human intermittent control, control activation, using as an example the task of overdamped stick balancing. In doing so, we focus on the concept of noise-driven activation, a more general alternative to the conventional threshold-driven activation. We describe control activation as a random walk in an energy potential, which changes in response to the state of the controlled system. By way of numerical simulations, we show that the developed model captures the core properties of human control activation previously observed in experiments on overdamped stick balancing. Our results demonstrate that the double-well potential model provides a tractable mathematical description of human control activation, at least in the considered task, and suggest that the adopted approach can potentially aid in understanding human intermittent control in more complex processes.
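
    The noise-driven activation mechanism lends itself to a compact simulation: an overdamped random walk in the double-well potential V(x) = x**4/4 - x**2/2, whose wells stand for the "control off" and "control on" phases. The potential shape, noise intensity, and time step below are illustrative assumptions, not the paper's fitted parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def dV(x):
    # Gradient of the double-well potential V(x) = x**4/4 - x**2/2
    return x**3 - x

D, dt, n_steps = 0.25, 1e-3, 200_000
noise = np.sqrt(2 * D * dt) * rng.standard_normal(n_steps)

x = -1.0                       # start in the "control off" well at x = -1
xs = np.empty(n_steps)
for i in range(n_steps):
    # Euler-Maruyama step of the overdamped Langevin equation
    x += -dV(x) * dt + noise[i]
    xs[i] = x

# Activation events = noise-induced transitions between the two wells
signs = np.sign(xs[np.abs(xs) > 0.5])   # ignore the barrier region near x = 0
switches = int(np.count_nonzero(np.diff(signs)))
print(switches)
```

    Making the potential depend on the stick's state, as in the paper, would couple this walk to the balancing dynamics; with the static wells used here, the switching statistics simply follow a Kramers-type escape law.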

  13. Development and validation of a diagnostic model for early differentiation of sepsis and non-infectious SIRS in critically ill children - a data-driven approach using machine-learning algorithms.

    PubMed

    Lamping, Florian; Jack, Thomas; Rübsamen, Nicole; Sasse, Michael; Beerbaum, Philipp; Mikolajczyk, Rafael T; Boehne, Martin; Karch, André

    2018-03-15

    Since early antimicrobial therapy is mandatory in septic patients, immediate diagnosis and distinction from non-infectious SIRS is essential but hampered by the similarity of symptoms between both entities. We aimed to develop a diagnostic model for differentiation of sepsis and non-infectious SIRS in critically ill children based on routinely available parameters (baseline characteristics, clinical/laboratory parameters, technical/medical support). This is a secondary analysis of a randomized controlled trial conducted at a German tertiary-care pediatric intensive care unit (PICU). Two hundred thirty-eight cases of non-infectious SIRS and 58 cases of sepsis (as defined by IPSCC criteria) were included. We applied a Random Forest approach to identify the best set of predictors out of 44 variables measured on the day of onset of the disease. The developed diagnostic model was validated in a temporal split-sample approach. A model including four clinical (length of PICU stay until onset of non-infectious SIRS/sepsis, central line, core temperature, number of non-infectious SIRS/sepsis episodes prior to diagnosis) and four laboratory parameters (interleukin-6, platelet count, procalcitonin, CRP) was identified in the training dataset. Validation in the test dataset revealed an AUC of 0.78 (95% CI: 0.70-0.87). Our model was superior to previously proposed biomarkers such as CRP, interleukin-6, procalcitonin or a combination of CRP and procalcitonin (maximum AUC = 0.63; 95% CI: 0.52-0.74). When aiming at a complete identification of sepsis cases (100%; 95% CI: 87-100%), 28% (95% CI: 20-38%) of non-infectious SIRS cases were classified correctly. Our approach allows early recognition of sepsis with an accuracy superior to previously described biomarkers, and could potentially reduce antibiotic use by 30% in non-infectious SIRS cases. External validation studies are necessary to confirm the generalizability of our approach across populations and treatment practices.
ClinicalTrials.gov number: NCT00209768; registration date: September 21, 2005.
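
    The validation metric quoted above (AUC of 0.78) is the probability that a randomly chosen sepsis case receives a higher predicted risk than a randomly chosen non-infectious SIRS case. A minimal sketch of that computation, using hypothetical scores rather than the study's data:

```python
def auc(pos_scores, neg_scores):
    """Empirical AUC: P(pos > neg), counting ties as one half."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# Hypothetical predicted risks for 5 sepsis and 8 non-infectious SIRS cases
sepsis = [0.9, 0.8, 0.75, 0.6, 0.4]
sirs = [0.7, 0.5, 0.45, 0.3, 0.3, 0.2, 0.15, 0.1]
print(auc(sepsis, sirs))  # → 0.9
```

    The study's Random Forest supplies the risk scores; any classifier that produces a ranking can be evaluated the same way.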

  14. Learning tactile skills through curious exploration

    PubMed Central

    Pape, Leo; Oddo, Calogero M.; Controzzi, Marco; Cipriani, Christian; Förster, Alexander; Carrozza, Maria C.; Schmidhuber, Jürgen

    2012-01-01

    We present curiosity-driven, autonomous acquisition of tactile exploratory skills on a biomimetic robot finger equipped with an array of microelectromechanical touch sensors. Instead of building tailored algorithms for solving a specific tactile task, we employ a more general curiosity-driven reinforcement learning approach that autonomously learns a set of motor skills in absence of an explicit teacher signal. In this approach, the acquisition of skills is driven by the information content of the sensory input signals relative to a learner that aims at representing sensory inputs using fewer and fewer computational resources. We show that, from initially random exploration of its environment, the robotic system autonomously develops a small set of basic motor skills that lead to different kinds of tactile input. Next, the system learns how to exploit the learned motor skills to solve supervised texture classification tasks. Our approach demonstrates the feasibility of autonomous acquisition of tactile skills on physical robotic platforms through curiosity-driven reinforcement learning, overcomes typical difficulties of engineered solutions for active tactile exploration and underactuated control, and provides a basis for studying developmental learning through intrinsic motivation in robots. PMID:22837748

  15. Reflection of a Year Long Model-Driven Business and UI Modeling Development Project

    NASA Astrophysics Data System (ADS)

    Sukaviriya, Noi; Mani, Senthil; Sinha, Vibha

    Model-driven software development enables users to specify an application at a high level - a level that better matches the problem domain. It also promises users better analysis and automation. Our work brings together two collaborating domains - business process and human interactions - to build an application. Business modeling expresses business operations and flows, then generates the business flow implementation. Human interaction modeling expresses a UI design, its relationship with business data, logic, and flow, and can generate a working UI. This double modeling approach automates the production of a working system with UI and business logic connected. This paper discusses the human aspects of this modeling approach after a year of building a procurement outsourcing contract application using the approach - the result of which was deployed in December 2008. The paper discusses the happy endings, and some heartache, across multiple areas. We end with insights on how a model-driven approach could do better for humans in the process.

  16. Current uses of 2.0 applications in transportation : case studies of select state departments of transportation.

    DOT National Transportation Integrated Search

    2010-03-01

    Web 2.0 is an umbrella term for websites or online applications that are user-driven and emphasize collaboration and user interactivity. The trend away from static web pages to a more user-driven Internet model has also occurred in the public s...

  17. Integrative Systems Biology for Data Driven Knowledge Discovery

    PubMed Central

    Greene, Casey S.; Troyanskaya, Olga G.

    2015-01-01

    Integrative systems biology is an approach that brings together diverse high throughput experiments and databases to gain new insights into biological processes or systems at molecular through physiological levels. These approaches rely on diverse high-throughput experimental techniques that generate heterogeneous data by assaying varying aspects of complex biological processes. Computational approaches are necessary to provide an integrative view of these experimental results and enable data-driven knowledge discovery. Hypotheses generated from these approaches can direct definitive molecular experiments in a cost effective manner. Using integrative systems biology approaches, we can leverage existing biological knowledge and large-scale data to improve our understanding of yet unknown components of a system of interest and how its malfunction leads to disease. PMID:21044756

  18. Using a Time-Driven Activity-Based Costing Model To Determine the Actual Cost of Services Provided by a Transgenic Core.

    PubMed

    Gerwin, Philip M; Norinsky, Rada M; Tolwani, Ravi J

    2018-03-01

    Laboratory animal programs and core laboratories often set service rates based on cost estimates. However, actual costs may be unknown, and service rates may not reflect the actual cost of services. Accurately evaluating the actual costs of services can be challenging and time-consuming. We used a time-driven activity-based costing (ABC) model to determine the cost of services provided by a resource laboratory at our institution. The time-driven approach is a more efficient approach to calculating costs than using a traditional ABC model. We calculated only 2 parameters: the time required to perform an activity and the unit cost of the activity based on employee cost. This method allowed us to rapidly and accurately calculate the actual cost of services provided, including microinjection of a DNA construct, microinjection of embryonic stem cells, embryo transfer, and in vitro fertilization. We successfully implemented a time-driven ABC model to evaluate the cost of these services and the capacity of labor used to deliver them. We determined how actual costs compared with current service rates. In addition, we determined that the labor supplied to conduct all services (10,645 min/wk) exceeded the practical labor capacity (8400 min/wk), indicating that the laboratory team was highly efficient and that additional labor capacity was needed to prevent overloading of the current team. Importantly, this time-driven ABC approach allowed us to establish a baseline model that can easily be updated to reflect operational changes or changes in labor costs. We demonstrated that a time-driven ABC model is a powerful management tool that can be applied to other core facilities as well as to entire animal programs, providing valuable information that can be used to set rates based on the actual cost of services and to improve operating efficiency.
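
    The two time-driven ABC parameters combine as service cost = activity minutes × capacity cost rate, where the rate divides employee cost by practical labor capacity. In the sketch below, only the 8,400 and 10,645 min/wk figures come from the abstract; the weekly labor cost and per-service minutes are invented for illustration.

```python
weekly_labor_cost = 1400.00      # $ per week for the team (assumed)
practical_capacity = 8400        # practical labor capacity, min/wk (from the abstract)

capacity_cost_rate = weekly_labor_cost / practical_capacity   # $ per minute

# Assumed hands-on minutes per service
service_minutes = {
    "DNA construct microinjection": 300,
    "ES cell microinjection": 360,
    "embryo transfer": 120,
    "in vitro fertilization": 240,
}
service_cost = {name: round(m * capacity_cost_rate, 2)
                for name, m in service_minutes.items()}

labor_supplied = 10645           # min/wk actually delivered (from the abstract)
overloaded = labor_supplied > practical_capacity
print(service_cost["embryo transfer"], overloaded)  # → 20.0 True
```

    Comparing each computed cost with the current service rate then shows which rates under- or over-recover, and the capacity check reproduces the abstract's finding that supplied labor exceeds practical capacity.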

  19. Mass imbalances in EPANET water-quality simulations

    NASA Astrophysics Data System (ADS)

    Davis, Michael J.; Janke, Robert; Taxon, Thomas N.

    2018-04-01

    EPANET is widely employed to simulate water quality in water distribution systems. However, in general, the time-driven simulation approach used to determine concentrations of water-quality constituents provides accurate results only for short water-quality time steps. Overly long time steps can yield errors in concentration estimates and can result in situations in which constituent mass is not conserved. The use of a time step that is sufficiently short to avoid these problems may not always be feasible. The absence of EPANET errors or warnings does not ensure conservation of mass. This paper provides examples illustrating mass imbalances and explains how such imbalances can occur because of fundamental limitations in the water-quality routing algorithm used in EPANET. In general, these limitations cannot be overcome by the use of improved water-quality modeling practices. This paper also presents a preliminary event-driven approach that conserves mass with a water-quality time step that is as long as the hydraulic time step. Results obtained using the current approach converge, or tend to converge, toward those obtained using the preliminary event-driven approach as the water-quality time step decreases. Improving the water-quality routing algorithm used in EPANET could eliminate mass imbalances and related errors in estimated concentrations. The results presented in this paper should be of value to those who perform water-quality simulations using EPANET or use the results of such simulations, including utility managers and engineers.
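
    A mass imbalance of the kind described is simple to audit once per-element masses are tallied: over a reporting interval, inflow minus outflow, reaction losses, and the change in stored mass should net to zero. This is generic bookkeeping, not EPANET's routing algorithm:

```python
def mass_imbalance(inflow, outflow, reacted, stored_initial, stored_final):
    """Constituent mass (e.g. grams) unaccounted for over an interval.

    Zero when the water-quality routing conserves mass.
    """
    return inflow - outflow - reacted - (stored_final - stored_initial)

# Hypothetical element: 10 g enters, 7 g leaves, 1 g decays, storage rises by 2 g
print(mass_imbalance(10.0, 7.0, 1.0, 0.0, 2.0))   # → 0.0 (mass conserved)

# The same element when the 1 g of reacted mass goes untracked
print(mass_imbalance(10.0, 7.0, 0.0, 0.0, 2.0))   # → 1.0 (imbalance)
```

    As the paper notes, the absence of solver warnings does not guarantee this audit comes out to zero, which is why an explicit check is worth running.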

  20. Meta-Analysis for Sociology – A Measure-Driven Approach

    PubMed Central

    Roelfs, David J.; Shor, Eran; Falzon, Louise; Davidson, Karina W.; Schwartz, Joseph E.

    2013-01-01

    Meta-analytic methods are becoming increasingly important in sociological research. In this article we present an approach for meta-analysis which is especially helpful for sociologists. Conventional approaches to meta-analysis often prioritize “concept-driven” literature searches. However, in disciplines with high theoretical diversity, such as sociology, this search approach might constrain the researcher’s ability to fully exploit the entire body of relevant work. We explicate a “measure-driven” approach, in which iterative searches and new computerized search techniques are used to increase the range of publications found (and thus the range of possible analyses) and to traverse time and disciplinary boundaries. We demonstrate this measure-driven search approach with two meta-analytic projects, examining the effects of various social variables on all-cause mortality. PMID:24163498

  1. Real Time Energy Management Control Strategies for Hybrid Powertrains

    NASA Astrophysics Data System (ADS)

    Zaher, Mohamed Hegazi Mohamed

    In order to improve fuel efficiency and reduce emissions of mobile vehicles, various hybrid power-train concepts have been developed over the years. This thesis focuses on embedded control of hybrid powertrain concepts for mobile vehicle applications. An optimal robust control approach is used to develop a real time energy management strategy for continuous operations. The main idea is to store the normally wasted mechanical regenerative energy in energy storage devices for later usage. The regenerative energy recovery opportunity exists in any condition where the speed of motion is in the opposite direction to the applied force or torque. This is the case when the vehicle is braking, decelerating, or the motion is driven by gravitational force, or load driven. There are three main concepts for regenerative energy storage devices in hybrid vehicles: electric, hydraulic, and flywheel. The real time control challenge is to balance the system power demand from the engine and the hybrid storage device, without depleting the energy storage device or stalling the engine in any work cycle, while making optimal use of the energy saving opportunities in a given operational, often repetitive cycle. In the worst-case scenario, only the engine is used and the hybrid system is completely disabled. A rule-based control is developed, tuned for different work cycles, and linked to a gain-scheduling algorithm, which identifies the cycle being performed by the machine and its position via GPS, and maps them to the appropriate gains.

  2. Design and Data in Balance: Using Design-Driven Decision Making to Enable Student Success

    ERIC Educational Resources Information Center

    Fairchild, Susan; Farrell, Timothy; Gunton, Brad; Mackinnon, Anne; McNamara, Christina; Trachtman, Roberta

    2014-01-01

    Data-driven approaches to school decision making have come into widespread use in the past decade, nationally and in New York City. New Visions has been at the forefront of those developments: in New Visions schools, teacher teams and school teams regularly examine student performance data to understand patterns and drive classroom- and…

  3. Insights and participatory actions driven by a socio-hydrogeological approach for groundwater management: the Grombalia Basin case study (Tunisia)

    NASA Astrophysics Data System (ADS)

    Tringali, C.; Re, V.; Siciliano, G.; Chkir, N.; Tuci, C.; Zouari, K.

    2017-08-01

    Sustainable groundwater management strategies in water-scarce countries need to guide future decision-making processes pragmatically, by simultaneously considering local needs, environmental problems and economic development. The socio-hydrogeological approach named 'Bir Al-Nas' has been tested in the Grombalia region (Cap Bon Peninsula, Tunisia), to evaluate the effectiveness of complementing hydrogeochemical and hydrogeological investigations with the social dimension of the issue at stake (which, in this case, is the identification of groundwater pollution sources). Within this approach, the social appraisal, performed through social network analysis and public engagement of water end-users, allowed hydrogeologists to get acquainted with the institutional dimension of local groundwater management, identifying issues, potential gaps (such as weak knowledge transfer among concerned stakeholders), and the key actors likely to support the implementation of the new science-based management practices resulting from the ongoing hydrogeological investigation. Results hence go beyond their specific relevance for the Grombalia basin, showing the effectiveness of the proposed approach and the importance of including social assessment in any given hydrogeological research aimed at supporting local development through groundwater protection measures.

  4. COLLABORATE©: a universal competency-based paradigm for professional case management, Part III: key considerations for making the paradigm shift.

    PubMed

    Treiger, Teresa M; Fink-Samnick, Ellen

    2014-01-01

    The purpose of the third of this 3-article series is to provide context and justification for a new paradigm of case management built upon a value-driven foundation that * improves the patient's experience of health care delivery, * provides consistency in approach applicable across health care populations, and * optimizes the potential for return on investment. Applicable to all health care sectors where case management is practiced. In moving forward, the one fact that rings true is that there will be constant change in our industry. As the health care terrain shifts and new influences continually surface, there will be consequences for case management practice. These impacts require nimble clinical professionals in possession of recognized and firmly established competencies. They must be agile to frame (and reframe) their professional practice to facilitate the best possible outcomes for their patients. Case managers can choose to be Gumby or Pokey. This is exactly why the time has come to define a competency-based case management model, one sufficiently fluid to fit into any setting of care. The practice of case management transcends the vast array of representative professional disciplines and educational levels. A majority of current models are driven by business priorities rather than the competencies critical to successful practice and quality patient outcomes. This results in a fragmented professional case management identity. While there is inherent value in what each discipline brings to the table, this advanced model unifies behind case management's unique, strengths-based identity instead of continuing to align within traditional divisions (e.g., discipline, work setting, population served). This model fosters case management's expanding career advancement opportunities, including a reflective clinical ladder.

  5. A model of strength

    USGS Publications Warehouse

    Johnson, Douglas H.; Cook, R.D.

    2013-01-01

    In her AAAS News & Notes piece "Can the Southwest manage its thirst?" (26 July, p. 362), K. Wren quotes Ajay Kalra, who advocates a particular method for predicting Colorado River streamflow "because it eschews complex physical climate models for a statistical data-driven modeling approach." A preference for data-driven models may be appropriate in this individual situation, but it is not so generally. Data-driven models often come with a warning against extrapolating beyond the range of the data used to develop the models. When the future is like the past, data-driven models can work well for prediction, but it is easy to over-model local or transient phenomena, often leading to predictive inaccuracy (1). Mechanistic models are built on established knowledge of the process that connects the response variables with the predictors, using information obtained outside of an extant data set. One may shy away from a mechanistic approach when the underlying process is judged to be too complicated, but good predictive models can be constructed with statistical components that account for ingredients missing in the mechanistic analysis. Models with sound mechanistic components are more generally applicable and robust than data-driven models.

  6. Progress and Challenges in Short to Medium Range Coupled Prediction

    NASA Technical Reports Server (NTRS)

    Brassington, G. B.; Martin, M. J.; Tolman, H. L.; Akella, Santha; Balmeseda, M.; Chambers, C. R. S.; Cummings, J. A.; Drillet, Y.; Jansen, P. A. E. M.; Laloyaux, P.

    2014-01-01

    The availability of GODAE Oceanview-type ocean forecast systems provides the opportunity to develop high-resolution, short- to medium-range coupled prediction systems. Several groups have undertaken the first experiments based on relatively unsophisticated approaches. Progress is being driven at the institutional level targeting a range of applications that represent their respective national interests with clear overlaps and opportunities for information exchange and collaboration. These include general circulation, hurricanes, extra-tropical storms, high-latitude weather and sea-ice forecasting as well as coastal air-sea interaction. In some cases, research has moved beyond case and sensitivity studies to controlled experiments to obtain statistically significant metrics.

  7. Data-Driven Model Uncertainty Estimation in Hydrologic Data Assimilation

    NASA Astrophysics Data System (ADS)

    Pathiraja, S.; Moradkhani, H.; Marshall, L.; Sharma, A.; Geenens, G.

    2018-02-01

    The increasing availability of earth observations necessitates mathematical methods to optimally combine such data with hydrologic models. Several algorithms exist for such purposes, under the umbrella of data assimilation (DA). However, DA methods are often applied in a suboptimal fashion for complex real-world problems, due largely to several practical implementation issues. One such issue is error characterization, which is known to be critical for a successful assimilation. Mischaracterized errors lead to suboptimal forecasts, and in the worst case, to degraded estimates even compared to the no assimilation case. Model uncertainty characterization has received little attention relative to other aspects of DA science. Traditional methods rely on subjective, ad hoc tuning factors or parametric distribution assumptions that may not always be applicable. We propose a novel data-driven approach (named SDMU) to model uncertainty characterization for DA studies where (1) the system states are partially observed and (2) minimal prior knowledge of the model error processes is available, except that the errors display state dependence. It includes an approach for estimating the uncertainty in hidden model states, with the end goal of improving predictions of observed variables. The SDMU is therefore suited to DA studies where the observed variables are of primary interest. Its efficacy is demonstrated through a synthetic case study with low-dimensional chaotic dynamics and a real hydrologic experiment for one-day-ahead streamflow forecasting. In both experiments, the proposed method leads to substantial improvements in the hidden states and observed system outputs over a standard method involving perturbation with Gaussian noise.

  8. Modeling the sustainable development of innovation in transport construction based on the communication approach

    NASA Astrophysics Data System (ADS)

    Revunova, Svetlana; Vlasenko, Vyacheslav; Bukreev, Anatoly

    2017-10-01

    The article proposes models of innovative activity development driven by the formation of “points of innovation-driven growth”. The models are based on the analysis of the current state and dynamics of innovative development of construction enterprises in the transport sector and take into account a number of essential organizational and economic changes in management. The authors substantiate implementing such development models as an organizational innovation that has a communication genesis. The use of the communication approach to the formation of “points of innovation-driven growth” allowed the authors to apply the mathematical tools of graph theory in order to activate the innovative activity of the transport industry in the region. As a result, the authors have proposed models that allow the construction of an optimal mechanism for the formation of “points of innovation-driven growth”.

  9. Simple Kinematic Pathway Approach (KPA) to Catchment-scale Travel Time and Water Age Distributions

    NASA Astrophysics Data System (ADS)

    Soltani, S. S.; Cvetkovic, V.; Destouni, G.

    2017-12-01

    The distribution of catchment-scale water travel times is strongly influenced by morphological dispersion and is partitioned between hillslope and larger, regional scales. We explore whether hillslope travel times are predictable using a simple semi-analytical "kinematic pathway approach" (KPA) that accounts for dispersion on two levels of morphological and macro-dispersion. The study gives new insights into shallow (hillslope) and deep (regional) groundwater travel times by comparing numerical simulations of travel time distributions, referred to as the "dynamic model", with corresponding KPA computations for three different real catchment case studies in Sweden. KPA uses basic structural and hydrological data to compute transient water travel time (forward mode) and age (backward mode) distributions at the catchment outlet. Longitudinal and morphological dispersion components are reflected in KPA computations by assuming an effective Peclet number and topographically driven pathway length distributions, respectively. Numerical simulations of advective travel times are obtained by means of particle tracking using the fully-integrated flow model MIKE SHE. The comparison of computed cumulative distribution functions of travel times shows significant influence of morphological dispersion and groundwater recharge rate on the compatibility of the "kinematic pathway" and "dynamic" models. Zones of high recharge rate in "dynamic" models are associated with topographically driven groundwater flow paths to adjacent discharge zones, e.g. rivers and lakes, through relatively shallow pathway compartments. These zones exhibit more compatible behavior between "dynamic" and "kinematic pathway" models than the zones of low recharge rate. Interestingly, the travel time distributions of hillslope compartments remain almost unchanged with increasing recharge rates in the "dynamic" models.
This robust "dynamic" model behavior suggests that flow path lengths and travel times in shallow hillslope compartments are controlled by topography, and therefore application and further development of the simple "kinematic pathway" approach is promising for their modeling.

  10. Pressure driven currents near magnetic islands in 3D MHD equilibria: Effects of pressure variation within flux surfaces and of symmetry

    NASA Astrophysics Data System (ADS)

    Reiman, Allan H.

    2016-07-01

    In toroidal, magnetically confined plasmas, the heat and particle transport is strongly anisotropic, with transport along the field lines sufficiently strong relative to cross-field transport that the equilibrium pressure can generally be regarded as constant on the flux surfaces in much of the plasma. The regions near small magnetic islands, and those near the X-lines of larger islands, are exceptions, having a significant variation of the pressure within the flux surfaces. It is shown here that the variation of the equilibrium pressure within the flux surfaces in those regions has significant consequences for the pressure driven currents. It is further shown that the consequences are strongly affected by the symmetry of the magnetic field if the field is invariant under combined reflection in the poloidal and toroidal angles. (This symmetry property is called "stellarator symmetry.") In non-stellarator-symmetric equilibria, the pressure-driven currents have logarithmic singularities at the X-lines. In stellarator-symmetric MHD equilibria, the singular components of the pressure-driven currents vanish. These equilibria are to be contrasted with equilibria having B·∇p = 0, where the singular components of the pressure-driven currents vanish regardless of the symmetry. They are also to be contrasted with 3D MHD equilibrium solutions that are constrained to have simply nested flux surfaces, where the pressure-driven current goes like 1/x near rational surfaces, where x is the distance from the rational surface, except in the case of quasi-symmetric flux surfaces. For the purpose of calculating the pressure-driven currents near magnetic islands, we work with a closed subset of the MHD equilibrium equations that involves only perpendicular force balance, and is decoupled from parallel force balance. It is not correct to use the parallel component of the conventional MHD force balance equation, B·∇p = 0, near magnetic islands. 
    Small but nonzero values of B·∇p are important in this region, and small non-MHD contributions to the parallel force balance equation cannot be neglected there. Two approaches are pursued to solve our equations for the pressure driven currents. First, the equilibrium equations are applied to an analytically tractable magnetic field with an island, obtaining explicit expressions for the rotational transform and magnetic coordinates, and for the pressure-driven current and its limiting behavior near the X-line. The second approach utilizes an expansion about the X-line to provide a more general calculation of the pressure-driven current near an X-line and of the rotational transform near a separatrix. The study presented in this paper is motivated, in part, by tokamak experiments with nonaxisymmetric magnetic perturbations, where significant differences are observed between the behavior of stellarator-symmetric and non-stellarator-symmetric configurations with regard to stabilization of edge localized modes by resonant magnetic perturbations. Implications for the coupling between neoclassical tearing modes, and for magnetic island stability calculations, are also discussed.

  11. Base Realignments and Closures: Report of the Defense Secretary’s Commission

    DTIC Science & Technology

    1988-12-29

    recommended to the Secretary of Defense several actions that should be taken to aid When it was determined that an local communities in their redevelopment...approved plans. Unlike some previous In many cases, they brought to the task a reviews, the Commission’s approach , first-hand knowledge of military...the by a professional staff (see Appendix D). individual Services. For example, the Air Force, driven by severe current and In organizing to accomplish

  12. Cost of outpatient endoscopic sinus surgery from the perspective of the Canadian government: a time-driven activity-based costing approach.

    PubMed

    Au, Jennifer; Rudmik, Luke

    2013-09-01

    The time-driven activity-based costing (TD-ABC) method is a novel approach to quantify the costs of a complex system. The aim of this study was to apply the TD-ABC technique to define the overall cost of a routine outpatient endoscopic sinus surgery (ESS) from the perspective of the Canadian government payer. Costing perspective was the Canadian government payer. All monetary values are in Canadian dollars as of December 2012. Costs were obtained by contacting staff unions, reviewing purchasing databases and provincial physician fee schedules. Practical capacity time values were collected from the College and Association of Registered Nurses of Alberta. Capacity cost rates ($/min) were calculated for all staff, capital equipment, and hospital space. The overall cost for routine outpatient ESS was $3510.31. The cost per ESS case for each clinical pathway encounter was as follows: preoperative holding ($49.19); intraoperative ($3296.60); sterilization ($90.20); postanesthesia care unit ($28.64); and postoperative day ward ($45.68). The 3 major cost drivers were physician fees, disposable equipment, and nursing costs. The intraoperative phase contributed to 94.5% of the overall cost. This study applied the TD-ABC method to evaluate the cost of outpatient ESS from the perspective of the Canadian government payer and defined the overall cost to be $3510.31 per case. © 2013 ARS-AAOA, LLC.
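
    As an arithmetic check, the per-encounter costs quoted above do sum to the stated overall figure:

```python
# Per-case encounter costs (CAD) as reported in the abstract
encounters = {
    "preoperative holding": 49.19,
    "intraoperative": 3296.60,
    "sterilization": 90.20,
    "post-anesthesia care unit": 28.64,
    "postoperative day ward": 45.68,
}
total = round(sum(encounters.values()), 2)
print(total)  # → 3510.31
```

    The same tally also makes the intraoperative phase's dominance of the overall cost immediately visible.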

  13. Theory of Change: a theory-driven approach to enhance the Medical Research Council's framework for complex interventions

    PubMed Central

    2014-01-01

    Background The Medical Research Council's framework for complex interventions has been criticized for not including theory-driven approaches to evaluation. Although the framework does include broad guidance on the use of theory, it contains little practical guidance for implementers and there have been calls to develop a more comprehensive approach. A prospective, theory-driven process of intervention design and evaluation is required to develop complex healthcare interventions which are more likely to be effective, sustainable and scalable. Methods We propose a theory-driven approach to the design and evaluation of complex interventions by adapting and integrating a programmatic design and evaluation tool, Theory of Change (ToC), into the MRC framework for complex interventions. We provide a guide to what ToC is, how to construct one, and how to integrate its use into research projects seeking to design, implement and evaluate complex interventions using the MRC framework. We test this approach by using ToC within two randomized controlled trials and one non-randomized evaluation of complex interventions. Results Our application of ToC in three research projects has shown that ToC can strengthen key stages of the MRC framework. It can aid the development of interventions by providing a framework for enhanced stakeholder engagement and by explicitly designing an intervention that is embedded in the local context. For the feasibility and piloting stage, ToC enables the systematic identification of knowledge gaps to generate research questions that strengthen intervention design. ToC may improve the evaluation of interventions by providing a comprehensive set of indicators to evaluate all stages of the causal pathway through which an intervention achieves impact, combining evaluations of intervention effectiveness with detailed process evaluations into one theoretical framework.
Conclusions Incorporating a ToC approach into the MRC framework holds promise for improving the design and evaluation of complex interventions, thereby increasing the likelihood that the intervention will be ultimately effective, sustainable and scalable. We urge researchers developing and evaluating complex interventions to consider using this approach, to evaluate its usefulness and to build an evidence base to further refine the methodology. Trial registration ClinicalTrials.gov: NCT02160249. PMID:24996765

  14. Theory of Change: a theory-driven approach to enhance the Medical Research Council's framework for complex interventions.

    PubMed

    De Silva, Mary J; Breuer, Erica; Lee, Lucy; Asher, Laura; Chowdhary, Neerja; Lund, Crick; Patel, Vikram

    2014-07-05

    The Medical Research Council's framework for complex interventions has been criticized for not including theory-driven approaches to evaluation. Although the framework does include broad guidance on the use of theory, it contains little practical guidance for implementers and there have been calls to develop a more comprehensive approach. A prospective, theory-driven process of intervention design and evaluation is required to develop complex healthcare interventions which are more likely to be effective, sustainable and scalable. We propose a theory-driven approach to the design and evaluation of complex interventions by adapting and integrating a programmatic design and evaluation tool, Theory of Change (ToC), into the MRC framework for complex interventions. We provide a guide to what ToC is, how to construct one, and how to integrate its use into research projects seeking to design, implement and evaluate complex interventions using the MRC framework. We test this approach by using ToC within two randomized controlled trials and one non-randomized evaluation of complex interventions. Our application of ToC in three research projects has shown that ToC can strengthen key stages of the MRC framework. It can aid the development of interventions by providing a framework for enhanced stakeholder engagement and by explicitly designing an intervention that is embedded in the local context. For the feasibility and piloting stage, ToC enables the systematic identification of knowledge gaps to generate research questions that strengthen intervention design. ToC may improve the evaluation of interventions by providing a comprehensive set of indicators to evaluate all stages of the causal pathway through which an intervention achieves impact, combining evaluations of intervention effectiveness with detailed process evaluations into one theoretical framework.
Incorporating a ToC approach into the MRC framework holds promise for improving the design and evaluation of complex interventions, thereby increasing the likelihood that the intervention will be ultimately effective, sustainable and scalable. We urge researchers developing and evaluating complex interventions to consider using this approach, to evaluate its usefulness and to build an evidence base to further refine the methodology. ClinicalTrials.gov: NCT02160249.

  15. Knowledge-guided golf course detection using a convolutional neural network fine-tuned on temporally augmented data

    NASA Astrophysics Data System (ADS)

    Chen, Jingbo; Wang, Chengyi; Yue, Anzhi; Chen, Jiansheng; He, Dongxu; Zhang, Xiuyan

    2017-10-01

    The tremendous success of deep learning models such as convolutional neural networks (CNNs) in computer vision suggests an approach to similar problems in the field of remote sensing. Although research on repurposing pretrained CNNs for remote sensing tasks is emerging, the scarcity of labeled samples and the complexity of remote sensing imagery still pose challenges. We developed a knowledge-guided golf course detection approach using a CNN fine-tuned on temporally augmented data. The proposed approach is a combination of knowledge-driven region proposal, data-driven CNN-based detection, and knowledge-driven postprocessing. To confront data complexity, knowledge-derived cooccurrence, composition, and area-based rules are applied sequentially to propose candidate golf regions. To confront sample scarcity, we employed data augmentation in the temporal domain, which extracts samples from multitemporal images. The augmented samples were then used to fine-tune a pretrained CNN for golf detection. Finally, commission error was further suppressed by postprocessing. Experiments conducted on GF-1 imagery demonstrate the effectiveness of the proposed approach.
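
    The temporal augmentation step lends itself to a short sketch. This is a minimal illustration, not the authors' code: the stack shape, patch size, and coordinates are invented, and the point is only that one labeled golf-course footprint yields one training chip per acquisition date, multiplying the training set without new annotation effort.

```python
import numpy as np

# Temporal-domain data augmentation: sample the same labeled location
# from each acquisition date of a multitemporal image stack.
rng = np.random.default_rng(0)
stack = rng.random((4, 256, 256, 3))   # (dates, height, width, bands) -- illustrative

def temporal_chips(stack, row, col, size=64):
    """Cut one training chip per date around a labeled location."""
    return [img[row:row + size, col:col + size] for img in stack]

chips = temporal_chips(stack, row=100, col=50)
print(len(chips), chips[0].shape)  # 4 (64, 64, 3)
```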

  16. Using a Technology-Based Case to Aid in Improving Assessment

    ERIC Educational Resources Information Center

    Zelin, Robert C., II

    2008-01-01

    This paper describes how a technology-based case using Microsoft Access can aid in the assessment process. A case was used in lieu of giving a final examination in an Accounting Information Systems course. Students worked in small groups to design a database-driven payroll system for a hypothetical company. Each group submitted its results along…

  17. Model‐Based Approach to Predict Adherence to Protocol During Antiobesity Trials

    PubMed Central

    Sharma, Vishnu D.; Combes, François P.; Vakilynejad, Majid; Lahu, Gezim; Lesko, Lawrence J.

    2017-01-01

    Development of antiobesity drugs is continuously challenged by high dropout rates during clinical trials. The objective was to develop a population pharmacodynamic model that describes the temporal changes in body weight, considering disease progression, lifestyle intervention, and drug effects. Markov modeling (MM) was applied to quantify and characterize responder and nonresponder status as key drivers of dropout, ultimately to support clinical trial simulation of outcomes in terms of trial adherence. Subjects (n = 4591) from 6 Contrave® trials were included in this analysis. An indirect-response model developed by van Wart et al. was used as a starting point. The drug effect was included as dose-driven, using a population dose- and time-dependent pharmacodynamic (DTPD) model. Additionally, a population-pharmacokinetic parameter- and data (PPPD)-driven model was developed using the final DTPD model structure and final parameter estimates from a previously developed population pharmacokinetic model based on available Contrave® pharmacokinetic concentrations. Last, an MM was developed to predict transition rate probabilities among responder, nonresponder, and dropout states driven by the pharmacodynamic effect resulting from the DTPD or PPPD model. Diabetes mellitus and race were included as covariates on the model parameters. The linked DTPD-MM and PPPD-MM predicted transition rates among responder, nonresponder, and dropout states well. The analysis concluded that body-weight change is an important factor influencing dropout rates, and the MM showed that, overall, a DTPD model-driven approach provides a reasonable prediction of clinical trial outcome probabilities similar to a pharmacokinetic-driven approach. PMID:28858397
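
    The Markov model (MM) component can be made concrete with a toy transition matrix. The probabilities below are illustrative assumptions, not the fitted values from the Contrave® analysis; in the actual model the transition rates are driven by the DTPD/PPPD pharmacodynamic effect rather than held constant.

```python
import numpy as np

# Subjects move among responder (R), nonresponder (N), and dropout (D)
# states; dropout is absorbing. Rows are from-states, columns to-states.
P = np.array([
    [0.80, 0.15, 0.05],   # from R
    [0.25, 0.55, 0.20],   # from N
    [0.00, 0.00, 1.00],   # from D (absorbing)
])
assert np.allclose(P.sum(axis=1), 1.0)  # each row is a probability distribution

state = np.array([0.5, 0.5, 0.0])       # start: half responders, half nonresponders
for _ in range(12):                      # e.g. 12 assessment intervals
    state = state @ P
print(state.round(3))                    # population share in R, N, D at follow-up end
```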

  18. Designing Educative Curriculum Materials: A Theoretically and Empirically Driven Process

    ERIC Educational Resources Information Center

    Davis, Elizabeth A.; Palincsar, Annemarie Sullivan; Arias, Anna Maria; Bismack, Amber Schultz; Marulis, Loren M.; Iwashyna, Stefanie K.

    2014-01-01

    In this article, the authors argue for a design process in the development of educative curriculum materials that is theoretically and empirically driven. Using a design-based research approach, they describe their design process for incorporating educative features intended to promote teacher learning into existing, high-quality curriculum…

  19. A data-driven approach to identify controls on global fire activity from satellite and climate observations (SOFIA V1)

    NASA Astrophysics Data System (ADS)

    Forkel, Matthias; Dorigo, Wouter; Lasslop, Gitta; Teubner, Irene; Chuvieco, Emilio; Thonicke, Kirsten

    2017-12-01

    Vegetation fires affect human infrastructures, ecosystems, global vegetation distribution, and atmospheric composition. However, the climatic, environmental, and socioeconomic factors that control global fire activity in vegetation are only poorly understood and are represented in global process-oriented vegetation-fire models with varying complexity and formulation. Data-driven model approaches such as machine learning algorithms have successfully been used to identify and better understand controlling factors for fire activity. However, such machine learning models cannot be easily adapted or even implemented within process-oriented global vegetation-fire models. To overcome this gap between machine learning-based approaches and process-oriented global fire models, we introduce here a new, flexible, data-driven fire modelling approach (Satellite Observations to predict FIre Activity, SOFIA, version 1). SOFIA models estimate burned area from several predictor variables through functional relationships that can be easily transferred to more complex process-oriented vegetation-fire models. We created an ensemble of SOFIA models to test the importance of several predictor variables. SOFIA models result in the highest performance in predicting burned area if they account for a direct restriction of fire activity under wet conditions and if they include a land cover-dependent restriction or allowance of fire activity by vegetation density and biomass. The use of vegetation optical depth data from microwave satellite observations, a proxy for vegetation biomass and water content, yields higher model performance than commonly used vegetation variables from optical sensors. We further analyse spatial patterns of the sensitivity of burned area to anthropogenic, climate, and vegetation predictor variables.
We finally discuss how multiple observational datasets on climate, hydrological, vegetation, and socioeconomic variables together with data-driven modelling and model-data integration approaches can guide the future development of global process-oriented vegetation-fire models.
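
    A minimal sketch of the SOFIA idea, under assumed functional forms: burned fraction is estimated as a product of simple responses to predictors, so each response could later be transplanted into a process-oriented fire model. The logistic shapes and all parameter values below are invented for illustration, not the fitted ensemble.

```python
import numpy as np

def logistic(x, k, x0):
    return 1.0 / (1.0 + np.exp(-k * (x - x0)))

def burned_fraction(soil_moisture, biomass, pop_density, f_max=0.3):
    """Product of functional responses, each one a separable hypothesis."""
    dryness = 1.0 - logistic(soil_moisture, k=12.0, x0=0.4)  # wet conditions suppress fire
    fuel = logistic(biomass, k=2.0, x0=1.0)                  # enough fuel must be present
    human = 1.0 - logistic(pop_density, k=0.01, x0=300.0)    # dense settlement suppresses fire
    return f_max * dryness * fuel * human

# Wet, fuel-poor, densely settled grid cell -> near-zero burned fraction.
print(burned_fraction(0.9, 0.1, 1000.0))
# Dry, fuel-rich, sparsely settled cell -> close to f_max.
print(burned_fraction(0.1, 3.0, 1.0))
```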

  20. Deformable registration of the inflated and deflated lung in cone-beam CT-guided thoracic surgery: Initial investigation of a combined model- and image-driven approach

    PubMed Central

    Uneri, Ali; Nithiananthan, Sajendra; Schafer, Sebastian; Otake, Yoshito; Stayman, J. Webster; Kleinszig, Gerhard; Sussman, Marc S.; Prince, Jerry L.; Siewerdsen, Jeffrey H.

    2013-01-01

    Purpose: Surgical resection is the preferred modality for curative treatment of early stage lung cancer, but localization of small tumors (<10 mm diameter) during surgery presents a major challenge that is likely to increase as more early-stage disease is detected incidentally and in low-dose CT screening. To overcome the difficulty of manual localization (fingers inserted through intercostal ports) and the cost, logistics, and morbidity of preoperative tagging (coil or dye placement under CT-fluoroscopy), the authors propose the use of intraoperative cone-beam CT (CBCT) and deformable image registration to guide targeting of small tumors in video-assisted thoracic surgery (VATS). A novel algorithm is reported for registration of the lung from its inflated state (prior to pleural breach) to the deflated state (during resection) to localize surgical targets and adjacent critical anatomy. Methods: The registration approach geometrically resolves images of the inflated and deflated lung using a coarse model-driven stage followed by a finer image-driven stage. The model-driven stage uses image features derived from the lung surfaces and airways: triangular surface meshes are morphed to capture bulk motion; concurrently, the airways generate graph structures from which corresponding nodes are identified. Interpolation of the sparse motion fields computed from the bounding surface and interior airways provides a 3D motion field that coarsely registers the lung and initializes the subsequent image-driven stage. The image-driven stage employs an intensity-corrected, symmetric form of the Demons method. The algorithm was validated over 12 datasets, obtained from porcine specimen experiments emulating CBCT-guided VATS. Geometric accuracy was quantified in terms of target registration error (TRE) in anatomical targets throughout the lung, and normalized cross-correlation. 
Variations of the algorithm were investigated to study the behavior of the model- and image-driven stages by modifying individual algorithmic steps and examining the effect in comparison to the nominal process. Results: The combined model- and image-driven registration process demonstrated accuracy consistent with the requirements of minimally invasive VATS in both target localization (∼3–5 mm within the target wedge) and critical structure avoidance (∼1–2 mm). The model-driven stage initialized the registration to within a median TRE of 1.9 mm (95% confidence interval (CI) maximum = 5.0 mm), while the subsequent image-driven stage yielded higher accuracy localization with 0.6 mm median TRE (95% CI maximum = 4.1 mm). The variations assessing the individual algorithmic steps elucidated the role of each step and in some cases identified opportunities for further simplification and improvement in computational speed. Conclusions: The initial studies show the proposed registration method to successfully register CBCT images of the inflated and deflated lung. Accuracy appears sufficient to localize the target and adjacent critical anatomy within ∼1–2 mm and guide localization under conditions in which the target cannot be discerned directly in CBCT (e.g., subtle, nonsolid tumors). The ability to directly localize tumors in the operating room could provide a valuable addition to the VATS arsenal, obviate the cost, logistics, and morbidity of preoperative tagging, and improve patient safety. Future work includes in vivo testing, optimization of workflow, and integration with a CBCT image guidance system. PMID:23298134
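
    The accuracy metric used above, target registration error (TRE), is simple to make concrete: it is the distance between a target's true position and its position mapped through the estimated registration. The point coordinates (mm) below are invented for illustration, not the porcine data.

```python
import numpy as np

# True and registration-mapped positions of two anatomical targets (mm).
true_targets = np.array([[12.0, 40.0, 9.0], [30.0, 22.0, 15.0]])
mapped_targets = np.array([[12.5, 40.2, 8.7], [29.1, 22.4, 15.3]])

# TRE: Euclidean distance per target; studies typically report the median.
tre = np.linalg.norm(mapped_targets - true_targets, axis=1)
print(tre.round(2), "median:", np.median(tre).round(2))  # [0.62 1.03] median: 0.82
```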

  1. Deformable registration of the inflated and deflated lung in cone-beam CT-guided thoracic surgery: initial investigation of a combined model- and image-driven approach.

    PubMed

    Uneri, Ali; Nithiananthan, Sajendra; Schafer, Sebastian; Otake, Yoshito; Stayman, J Webster; Kleinszig, Gerhard; Sussman, Marc S; Prince, Jerry L; Siewerdsen, Jeffrey H

    2013-01-01

    Surgical resection is the preferred modality for curative treatment of early stage lung cancer, but localization of small tumors (<10 mm diameter) during surgery presents a major challenge that is likely to increase as more early-stage disease is detected incidentally and in low-dose CT screening. To overcome the difficulty of manual localization (fingers inserted through intercostal ports) and the cost, logistics, and morbidity of preoperative tagging (coil or dye placement under CT-fluoroscopy), the authors propose the use of intraoperative cone-beam CT (CBCT) and deformable image registration to guide targeting of small tumors in video-assisted thoracic surgery (VATS). A novel algorithm is reported for registration of the lung from its inflated state (prior to pleural breach) to the deflated state (during resection) to localize surgical targets and adjacent critical anatomy. The registration approach geometrically resolves images of the inflated and deflated lung using a coarse model-driven stage followed by a finer image-driven stage. The model-driven stage uses image features derived from the lung surfaces and airways: triangular surface meshes are morphed to capture bulk motion; concurrently, the airways generate graph structures from which corresponding nodes are identified. Interpolation of the sparse motion fields computed from the bounding surface and interior airways provides a 3D motion field that coarsely registers the lung and initializes the subsequent image-driven stage. The image-driven stage employs an intensity-corrected, symmetric form of the Demons method. The algorithm was validated over 12 datasets, obtained from porcine specimen experiments emulating CBCT-guided VATS. Geometric accuracy was quantified in terms of target registration error (TRE) in anatomical targets throughout the lung, and normalized cross-correlation. 
Variations of the algorithm were investigated to study the behavior of the model- and image-driven stages by modifying individual algorithmic steps and examining the effect in comparison to the nominal process. The combined model- and image-driven registration process demonstrated accuracy consistent with the requirements of minimally invasive VATS in both target localization (∼3-5 mm within the target wedge) and critical structure avoidance (∼1-2 mm). The model-driven stage initialized the registration to within a median TRE of 1.9 mm (95% confidence interval (CI) maximum = 5.0 mm), while the subsequent image-driven stage yielded higher accuracy localization with 0.6 mm median TRE (95% CI maximum = 4.1 mm). The variations assessing the individual algorithmic steps elucidated the role of each step and in some cases identified opportunities for further simplification and improvement in computational speed. The initial studies show the proposed registration method to successfully register CBCT images of the inflated and deflated lung. Accuracy appears sufficient to localize the target and adjacent critical anatomy within ∼1-2 mm and guide localization under conditions in which the target cannot be discerned directly in CBCT (e.g., subtle, nonsolid tumors). The ability to directly localize tumors in the operating room could provide a valuable addition to the VATS arsenal, obviate the cost, logistics, and morbidity of preoperative tagging, and improve patient safety. Future work includes in vivo testing, optimization of workflow, and integration with a CBCT image guidance system.

  2. Piloted Simulation to Evaluate the Utility of a Real Time Envelope Protection System for Mitigating In-Flight Icing Hazards

    NASA Technical Reports Server (NTRS)

    Ranaudo, Richard J.; Martos, Borja; Norton, Bill W.; Gingras, David R.; Barnhart, Billy P.; Ratvasky, Thomas P.; Morelli, Eugene

    2011-01-01

    The utility of the Icing Contamination Envelope Protection (ICEPro) system for mitigating a potentially hazardous icing condition was evaluated by 29 pilots using the NASA Ice Contamination Effects Flight Training Device (ICEFTD). ICEPro provides real-time envelope protection cues and alerting messages on pilot displays. The pilots participating in this test were divided into two groups: a control group using baseline displays without ICEPro, and an experimental group using ICEPro-driven display cueing. Each group flew identical precision approach and missed approach procedures with a simulated failure-case icing condition. Pilot performance, workload, and survey questionnaires were collected for both groups of pilots. Results showed that real-time assessment cues were effective in reducing the number of potentially hazardous upset events and in lessening exposure to loss of control following an incipient upset condition. Pilot workload with the added ICEPro displays was not measurably affected, but pilot opinion surveys showed that real-time cueing greatly improved their situation awareness of a hazardous aircraft state.

  3. Combining density functional theory calculations, supercomputing, and data-driven methods to design new materials (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Jain, Anubhav

    2017-04-01

    Density functional theory (DFT) simulations solve for the electronic structure of materials starting from the Schrödinger equation. Many case studies have now demonstrated that researchers can often use DFT to design new compounds in the computer (e.g., for batteries, catalysts, and hydrogen storage) before synthesis and characterization in the lab. In this talk, I will focus on how DFT calculations can be executed on large supercomputing resources in order to generate very large data sets on new materials for functional applications. First, I will briefly describe the Materials Project, an effort at LBNL that has virtually characterized over 60,000 materials using DFT and has shared the results with over 17,000 registered users. Next, I will talk about how such data can help discover new materials, describing how preliminary computational screening led to the identification and confirmation of a new family of bulk AMX2 thermoelectric compounds with measured zT reaching 0.8. I will outline future plans for how such data-driven methods can be used to better understand the factors that control thermoelectric behavior, e.g., for the rational design of electronic band structures, in ways that are different from conventional approaches.

  4. Reality check of socio-hydrological interactions in water quality and ecosystem management

    NASA Astrophysics Data System (ADS)

    Destouni, Georgia; Fischer, Ida; Prieto, Carmen

    2017-04-01

    Socio-hydrological interactions in water management for improving water quality and ecosystem status include as key components both (i) the societal measures taken for mitigation and control, and (ii) the societal characterization and monitoring efforts made for choosing management targets and checking the effects of measures taken to reach the targets. This study investigates such monitoring, characterization and management efforts and effects over the first six-year management cycle of the EU Water Framework Directive (WFD). The investigation uses Sweden and the WFD-regulated management of its stream and lake waters as a concrete quantification example, with focus on the nutrient and eutrophication conditions that determine the most prominent water quality and ecosystem problems in need of mitigation in the Swedish waters. The case results show a relatively small available monitoring base for determination of these nutrient and eutrophication conditions, even though they constitute key parts in the overall WFD-based approach to classification and management of ecosystem status. Specifically, actual nutrient monitoring exists in only around 1% (down to 0.2% for nutrient loads) of the Swedish stream and lake water bodies; modeling is used to fill the gaps for the remaining unmonitored fraction of classified and managed waters. The available data show that the hydro-climatically driven stream water discharge is a primary explanatory variable for the resulting societal classification of ecosystem status in Swedish waters; this may be due to the discharge magnitude being dominant in determining nutrient loading to these waters. At any rate, with such a hydro-climatically related, rather than human-pressure related, determinant of the societal ecosystem-status classification, the main human-driven causes and effects of eutrophication may not be appropriately identified, and the measures taken for mitigating these may not be well chosen. 
The available monitoring data from Swedish waters support this hypothesis, by showing that the first WFD management cycle 2009-2015 has led to only slight changes in measured nutrient concentrations, with moderate-to-bad status waters mostly undergoing concentration increases. These management results are in direct contrast to the WFD management goals that ecosystem status in all member-state waters must be improved to at least good level, and in any case not be allowed to further deteriorate. In general, the present results show that societal approaches to ecosystem status classification, monitoring and improvement may need a focus shift for improved identification and quantification of the human-driven components of nutrient inputs, concentrations and loads in water environments. Dominant hydro-climatic change drivers and effects must of course also be understood and accounted for. However, adaptation to hydro-climatic changes should be additional to and aligned with, rather than instead of, necessary mitigation of human-driven eutrophication. The present case results call for further science-based testing and evidence of societal water quality and ecosystem management actually targeting and following up the potential achievement of such mitigation.

  5. Simulating large-scale pedestrian movement using CA and event driven model: Methodology and case study

    NASA Astrophysics Data System (ADS)

    Li, Jun; Fu, Siyao; He, Haibo; Jia, Hongfei; Li, Yanzhong; Guo, Yi

    2015-11-01

    Large-scale regional evacuation is an important part of a national security emergency response plan, and the emergency evacuation of large commercial shopping areas, a typical service system, is an active research topic. A systematic methodology based on Cellular Automata with a Dynamic Floor Field and an event-driven model has been proposed and examined within the context of a case study involving evacuation of a commercial shopping mall. Pedestrian movement is modeled with Cellular Automata and the event-driven model; the simulation process is divided into a normal situation and an emergency evacuation. The model is composed of four layers: environment layer, customer layer, clerk layer and trajectory layer. When simulating pedestrian routes, the model takes into account customers' purchase intentions and pedestrian density. The combined evacuation model of Cellular Automata with Dynamic Floor Field and event-driven scheduling reflects the behavioral characteristics of customers and clerks in both normal and emergency situations. The distribution of individual evacuation time as a function of initial positions and the dynamics of the evacuation process are studied. Our results indicate that this combined model can be used to simulate the evacuation of pedestrian flows in indoor areas with complicated surroundings and to investigate the layout of shopping malls.
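
    The cellular-automaton core of such a model can be sketched minimally: each pedestrian moves to the neighboring free cell with the smallest distance-to-exit value in a static floor field. The grid size and exit position are invented for illustration, and the paper's dynamic floor field and event scheduling are omitted.

```python
import numpy as np

H, W = 5, 5
exit_cell = (0, 2)
# Static floor field: Chebyshev distance of every cell to the exit.
field = np.fromfunction(
    lambda r, c: np.maximum(abs(r - exit_cell[0]), abs(c - exit_cell[1])), (H, W))

def step(peds):
    """Move each pedestrian greedily downhill on the floor field."""
    occupied = set(peds)
    new = []
    for r, c in peds:
        best = (r, c)
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                nr, nc = r + dr, c + dc
                if 0 <= nr < H and 0 <= nc < W and (nr, nc) not in occupied:
                    if field[nr, nc] < field[best]:
                        best = (nr, nc)
        occupied.discard((r, c))
        occupied.add(best)
        new.append(best)
    return new

peds = [(4, 0), (4, 4)]
for _ in range(4):
    peds = step(peds)
print(peds)  # both pedestrians have converged on the exit region
```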

  6. A Hypothesis-Driven Approach to Site Investigation

    NASA Astrophysics Data System (ADS)

    Nowak, W.

    2008-12-01

    Variability of subsurface formations and the scarcity of data lead to the notion of aquifer parameters as geostatistical random variables. Given an information need and limited resources for field campaigns, site investigation is often put into the context of optimal design. In optimal design, the types, numbers and positions of samples are optimized under case-specific objectives to meet the information needs. Past studies feature optimal data worth (balancing maximum financial profit in an engineering task versus the cost of additional sampling), or aim at a minimum prediction uncertainty of stochastic models for a prescribed investigation budget. Recent studies also account for other sources of uncertainty outside the hydrogeological range, such as uncertain toxicity, ingestion and behavioral parameters of the affected population when predicting the human health risk from groundwater contaminations. The current study looks at optimal site investigation from a new angle. Answering a yes/no question under uncertainty directly requires recasting the original question as a hypothesis test. Otherwise, false confidence in the resulting answer would be pretended. A straightforward example is whether a recent contaminant spill will cause contaminant concentrations in excess of a legal limit at a nearby drinking water well. This question can only be answered down to a specified chance of error, i.e., based on the significance level used in hypothesis tests. Optimal design is placed into the hypothesis-driven context by using the chance of providing a false yes/no answer as new criterion to be minimized. Different configurations apply for one-sided and two-sided hypothesis tests. If a false answer entails financial liability, the hypothesis-driven context can be re-cast in the context of data worth. The remaining difference is that failure is a hard constraint in the data worth context versus a monetary punishment term in the hypothesis-driven context. 
    The basic principle is discussed and illustrated for the case of a hypothetical contaminant spill and the exceedance of critical contaminant levels at a downstream location. A tempting and important side question is whether site investigation could be tweaked towards a yes or no answer in maliciously biased campaigns by unfair formulation of the optimization objective.
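
    The hypothesis-driven design criterion, the chance of giving a false yes/no answer, can be sketched with a toy Monte Carlo. Here the predicted concentration at a well is uncertain; we estimate the chance of a false "no exceedance" answer and show how it shrinks as sampling reduces predictive uncertainty. The limit, true concentration, and uncertainty values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)
limit = 10.0       # legal concentration limit (e.g. mg/L) -- illustrative
true_conc = 12.0   # hypothetical true concentration (it does exceed the limit)

def false_negative_rate(sigma, n=100_000):
    """Chance the uncertain prediction stays below the limit even though
    the true concentration exceeds it (a false 'safe' answer)."""
    predicted = rng.normal(true_conc, sigma, size=n)
    return np.mean(predicted <= limit)

# More (or better-placed) samples shrink sigma, and with it the error chance:
for s in (3.0, 1.5, 0.5):
    print(f"sigma={s}: P(false 'safe' answer) = {false_negative_rate(s):.3f}")
```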

  7. Computational Fluid Dynamics Simulation of Flows in an Oxidation Ditch Driven by a New Surface Aerator.

    PubMed

    Huang, Weidong; Li, Kun; Wang, Gan; Wang, Yingzhe

    2013-11-01

    In this article, we present a newly designed inverse umbrella surface aerator and test its performance in driving the flow of an oxidation ditch. Results show that it drives the oxidation ditch better than the original design, with a higher average velocity and a more uniform flow field. We also present a computational fluid dynamics model for predicting the flow field in an oxidation ditch driven by a surface aerator. The improved momentum source term approach to simulate the flow field of the oxidation ditch driven by an inverse umbrella surface aerator was developed and validated through experiments. Four turbulence models were investigated with the approach: the standard k-ε model, RNG k-ε model, realizable k-ε model, and Reynolds stress model, and the predicted data were compared with those calculated with the multiple rotating reference frame approach (MRF) and sliding mesh approach (SM). Results of the momentum source term approach are in good agreement with the experimental data, and its prediction accuracy is better than MRF and close to SM. It is also found that the momentum source term approach has lower computational expenses, is simpler to preprocess, and is easier to use.
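
    The momentum source term idea can be sketched in one dimension: instead of resolving the rotating aerator geometry (as MRF or SM do), the aerator's thrust is added as a body force in the momentum equation over the cells it occupies. The toy periodic channel with linear drag below is an illustration of the concept only; all coefficients are invented, not the paper's calibrated values.

```python
import numpy as np

n, dx, dt = 50, 0.1, 0.01
nu, drag = 0.01, 0.5        # viscosity and linear drag coefficient
source = np.zeros(n)
source[23:27] = 2.0         # aerator represented as a local momentum source

u = np.zeros(n)
for _ in range(5000):       # march to a steady state
    lap = (np.roll(u, -1) - 2 * u + np.roll(u, 1)) / dx**2
    u = u + dt * (nu * lap - drag * u + source)

# Velocity peaks under the aerator; the mean flow balances drag against
# the mean source (here mean(source)/drag = 0.32).
print(u.max().round(3), u.mean().round(3))
```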

  8. Adapting Rational Unified Process (RUP) approach in designing a secure e-Tendering model

    NASA Astrophysics Data System (ADS)

    Mohd, Haslina; Robie, Muhammad Afdhal Muhammad; Baharom, Fauziah; Darus, Norida Muhd; Saip, Mohamed Ali; Yasin, Azman

    2016-08-01

    e-Tendering is the electronic processing of tender documents via the internet, allowing tenderers to publish, communicate, access, receive and submit all tender-related information and documentation online. This study aims to design an e-Tendering system using the Rational Unified Process (RUP) approach. RUP provides a disciplined approach to assigning tasks and responsibilities within the software development process. RUP has four phases that can assist researchers in adjusting the requirements of projects with different scopes, problems and sizes. RUP is characterized as a use case driven, architecture-centered, iterative and incremental process model. However, the scope of this study covers only the Inception and Elaboration phases as steps to develop the model, and performs only three of the nine workflows (business modeling, requirements, and analysis and design). RUP has a strong focus on documents, and the activities in the inception and elaboration phases mainly concern the creation of diagrams and the writing of textual descriptions. The UML notation and the software program Star UML are used to support the design of e-Tendering. The e-Tendering design based on the RUP approach can contribute to e-Tendering developers and researchers in the e-Tendering domain. In addition, this study shows that RUP is one of the best system development methodologies for use as a research methodology in the Software Engineering domain related to the secure design of any observed application. This methodology has been tested in various studies in certain domains, such as simulation-based decision support, security requirement engineering, business modeling and secure system requirements.
In conclusion, these studies show that RUP is a good research methodology that can be adapted in any Software Engineering (SE) research domain that requires a few artifacts to be generated, such as use case models, misuse case models, activity diagrams, and an initial class diagram derived from a list of requirements identified earlier by the SE researchers.

  9. The simulation of shock- and impact-driven flows with Mie-Gruneisen equations of state

    NASA Astrophysics Data System (ADS)

    Ward, Geoffrey M.

    An investigation of shock- and impact-driven flows with Mie-Gruneisen equation of state derived from a linear shock-particle speed Hugoniot relationship is presented. Cartesian mesh methods using structured adaptive refinement are applied to simulate several flows of interest in an Eulerian frame of reference. The flows central to the investigation include planar Richtmyer-Meshkov instability, the impact of a sphere with a plate, and an impact-driven Mach stem. First, for multicomponent shock-driven flows, a dimensionally unsplit, spatially high-order, hybrid, center-difference, limiter methodology is developed. Effective switching between center-difference and upwinding schemes is achieved by a set of robust tolerance and Lax-entropy-based criteria [49]. Oscillations that result from such a mixed stencil scheme are minimized by requiring that the upwinding method approaches the center-difference method in smooth regions. The solver is then applied to investigate planar Richtmyer-Meshkov instability in the context of an equation of state comparison. Comparisons of simulations with materials modeled by isotropic stress Mie-Gruneisen equations of state derived from a linear shock-particle speed Hugoniot relationship [36,52] to those of perfect gases are made with the intention of exposing the role of the equation of state. First, results for single- and triple-mode planar Richtmyer-Meshkov instability between mid-ocean ridge basalt (MORB) and molybdenum modeled by Mie-Gruneisen equations of state are presented for the case of a reflected shock. The single-mode case is explored for incident shock Mach numbers of 1.5 and 2.5. Single-mode Richtmyer-Meshkov instability with a reflected expansion wave is also examined for incident Mach numbers of 1.5 and 2.5. Comparison to perfect gas solutions in such cases yields a higher degree of similarity in start-up time and growth rate oscillations.
Vorticity distribution and corrugation centerline shortly after shock interaction is also examined. The formation of incipient weak shock waves in the heavy fluid driven by waves emanating from the perturbed transmitted shock is observed when an expansion wave is reflected. Next, the ghost fluid method [83] is explored for application to impact-driven flows with Mie-Gruneisen equations of state in a vacuum. Free surfaces are defined utilizing a level-set approach. The level-set is reinitialized to the signed distance function periodically by solution to a Hamilton-Jacobi differential equation in artificial time. Flux reconstruction along each Cartesian direction of the domain is performed by subdividing in a way that allows for robust treatment of grid-scale sized voids. Ghost cells in voided regions near the material-vacuum interface are determined from surface-normal Riemann problem solution. The method is then applied to several impact problems of interest. First, a one-dimensional impact problem is examined in Mie-Gruneisen aluminum with simple point erosion used to model separation by spallation under high tension. A similar three-dimensional axisymmetric simulation of two rods impacting is then performed without a model for spallation. Further results for three-dimensional axisymmetric simulation of a sphere hitting a plate are then presented. Finally, a brief investigation of the assumptions utilized in modeling solids as isotropic fluids is undertaken. An Eulerian solver approach to handling elastic and elastic-plastic solids is utilized for comparison to the simple fluid model assumption. First, in one dimension an impact problem is examined for elastic, elastic-plastic, and fluid equations of state for aluminum. The results demonstrate that in one dimension the fluid models the plastic shock structure of the flow well. 
Further investigation is made using a three-dimensional axisymmetric simulation of an impact problem involving a copper cylinder surrounded by aluminum. An aluminum slab impact drives a faster shock in the outer aluminum region, yielding a Mach reflection in the copper. The results demonstrate similar plastic shock structures. Several differences are also notable, including a lack of the roll-up instability at the material interface and of the slip line emanating from the Mach stem's triple point. (Abstract shortened by UMI.)

  10. Modelling Conditions and Health Care Processes in Electronic Health Records: An Application to Severe Mental Illness with the Clinical Practice Research Datalink.

    PubMed

    Olier, Ivan; Springate, David A; Ashcroft, Darren M; Doran, Tim; Reeves, David; Planner, Claire; Reilly, Siobhan; Kontopantelis, Evangelos

    2016-01-01

    The use of Electronic Health Records databases for medical research has become mainstream. In the UK, increasing use of Primary Care Databases is largely driven by almost complete computerisation and uniform standards within the National Health Service. Electronic Health Records research often begins with the development of a list of clinical codes with which to identify cases with a specific condition. We present a methodology and accompanying Stata and R commands (pcdsearch/Rpcdsearch) to help researchers in this task. We present severe mental illness (SMI) as an example. We used the Clinical Practice Research Datalink, a UK Primary Care Database in which clinical information is largely organised using Read codes, a hierarchical clinical coding system. Pcdsearch is used to identify potentially relevant clinical codes and/or product codes from word-stubs and code-stubs suggested by clinicians. The returned code-lists are reviewed and codes relevant to the condition of interest are selected. The final code-list is then used to identify patients. We identified 270 Read codes linked to SMI and used them to identify cases in the database. We observed that our approach identified cases that would have been missed with a simpler approach using SMI registers defined within the UK Quality and Outcomes Framework. We described a framework for researchers of Electronic Health Records databases, for identifying patients with a particular condition or matching certain clinical criteria. The method is invariant to coding system or database and can be used with SNOMED CT, ICD or other medical classification code-lists.
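The stub-search step can be sketched in a few lines. The real pcdsearch/Rpcdsearch commands operate on full CPRD lookup files; the mini dictionary and stubs below are fabricated for illustration only:

```python
import re

# Fabricated mini code dictionary; real CPRD lookups hold far more entries.
code_dictionary = {
    "E10..": "Schizophrenic disorders",
    "E101.": "Schizo-affective schizophrenia",
    "Eu20.": "[X]Schizophrenia",
    "H33..": "Asthma",
}

def search(dictionary, word_stubs=(), code_stubs=()):
    """Return entries whose term matches any word-stub (case-insensitive)
    or whose code starts with any code-stub."""
    return {
        code: term
        for code, term in dictionary.items()
        if any(re.search(w, term, re.I) for w in word_stubs)
        or any(code.startswith(c) for c in code_stubs)
    }

# Clinicians suggest the stubs; the returned candidate list is then
# manually reviewed before building the final case-finding code-list.
candidates = search(code_dictionary, word_stubs=["schizo"], code_stubs=["Eu2"])
```

The review step matters: stub matching deliberately over-retrieves, and irrelevant hits are pruned by hand.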

  11. Quantifying and reducing model-form uncertainties in Reynolds-averaged Navier-Stokes simulations: A data-driven, physics-informed Bayesian approach

    NASA Astrophysics Data System (ADS)

    Xiao, H.; Wu, J.-L.; Wang, J.-X.; Sun, R.; Roy, C. J.

    2016-11-01

    Despite their well-known limitations, Reynolds-Averaged Navier-Stokes (RANS) models are still the workhorse tools for turbulent flow simulations in today's engineering analysis, design and optimization. While the predictive capability of RANS models depends on many factors, for many practical flows the turbulence models are by far the largest source of uncertainty. As RANS models are used in the design and safety evaluation of many mission-critical systems such as airplanes and nuclear power plants, quantifying their model-form uncertainties has significant implications in enabling risk-informed decision-making. In this work we develop a data-driven, physics-informed Bayesian framework for quantifying model-form uncertainties in RANS simulations. Uncertainties are introduced directly to the Reynolds stresses and are represented with compact parameterization accounting for empirical prior knowledge and physical constraints (e.g., realizability, smoothness, and symmetry). An iterative ensemble Kalman method is used to assimilate the prior knowledge and observation data in a Bayesian framework, and to propagate them to posterior distributions of velocities and other Quantities of Interest (QoIs). We use two representative cases, the flow over periodic hills and the flow in a square duct, to evaluate the performance of the proposed framework. Both cases are challenging for standard RANS turbulence models. Simulation results suggest that, even with very sparse observations, the obtained posterior mean velocities and other QoIs have significantly better agreement with the benchmark data compared to the baseline results. At most locations the posterior distribution adequately captures the true model error within the developed model form uncertainty bounds. The framework is a major improvement over existing black-box, physics-neutral methods for model-form uncertainty quantification, where prior knowledge and details of the models are not exploited. 
This approach has potential implications in many fields in which the governing equations are well understood but the model uncertainty comes from unresolved physical processes.
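At the heart of the framework is an ensemble Kalman analysis step. A minimal sketch on a linear toy problem (synthetic data and a made-up observation operator; the paper applies an iterative variant to Reynolds-stress discrepancy parameters) looks like:

```python
import numpy as np

# Linear toy: infer two parameters x from one noisy observation y = H x.
# Each analysis step nudges the ensemble toward the data using the
# ensemble-estimated covariances; iterating sharpens the posterior.
rng = np.random.default_rng(1)
H = np.array([[1.0, 0.5]])                  # observation operator (assumed)
x_true = np.array([2.0, -1.0])
obs_err = 0.1
y_obs = (H @ x_true).item() + rng.normal(0.0, obs_err)

ens = rng.normal(0.0, 1.0, size=(100, 2))   # prior parameter ensemble
for _ in range(5):                          # iterative assimilation
    Y = ens @ H.T                           # predicted observations (100, 1)
    Ax, Ay = ens - ens.mean(0), Y - Y.mean(0)
    P_xy = Ax.T @ Ay / (len(ens) - 1)       # cross-covariance (2, 1)
    P_yy = (Ay.ravel() @ Ay.ravel()) / (len(ens) - 1) + obs_err**2
    K = P_xy / P_yy                         # Kalman gain (2, 1)
    perturbed = y_obs + rng.normal(0.0, obs_err, size=(100, 1))
    ens = ens + (perturbed - Y) @ K.T       # analysis update
posterior_pred = (H @ ens.mean(0)).item()   # should sit near y_obs
```

In the paper's setting the "parameters" are the compact Reynolds-stress discrepancy coefficients and the forward map is a RANS solve rather than a matrix multiply, but the update algebra is of this form.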

  12. Large-Scale Discovery of Disease-Disease and Disease-Gene Associations

    PubMed Central

    Gligorijevic, Djordje; Stojanovic, Jelena; Djuric, Nemanja; Radosavljevic, Vladan; Grbovic, Mihajlo; Kulathinal, Rob J.; Obradovic, Zoran

    2016-01-01

    Data-driven phenotype analyses on Electronic Health Record (EHR) data have recently drawn benefits across many areas of clinical practice, uncovering new links in the medical sciences that can potentially affect the well-being of millions of patients. In this paper, EHR data is used to discover novel relationships between diseases by studying their comorbidities (co-occurrences in patients). A novel embedding model is designed to extract knowledge from disease comorbidities by learning from a large-scale EHR database comprising more than 35 million inpatient cases spanning nearly a decade, revealing significant improvements on disease phenotyping over current computational approaches. In addition, the use of the proposed methodology is extended to discover novel disease-gene associations by including valuable domain knowledge from genome-wide association studies. To evaluate our approach, its effectiveness is compared against a held-out set where, again, it revealed very compelling results. For selected diseases, we further identify candidate gene lists for which disease-gene associations were not studied previously. Thus, our approach provides biomedical researchers with new tools to filter genes of interest, thus, reducing costly lab studies. PMID:27578529
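The comorbidity-embedding idea can be caricatured with a PPMI weighting of a tiny, fabricated co-occurrence matrix (the paper's learned embedding model and its 35-million-case data are of course far richer):

```python
import numpy as np

# Fabricated counts: diagonal = cases carrying the disease, off-diagonal =
# comorbid co-occurrences. Positive-PMI rows then act as crude disease
# vectors whose similarity reflects shared comorbidity structure.
diseases = ["diabetes", "hypertension", "asthma", "eczema"]
C = np.array([[40, 30,  2,  1],
              [30, 45,  3,  1],
              [ 2,  3, 30, 20],
              [ 1,  1, 20, 28]], dtype=float)
p_ij = C / C.sum()
p_i = C.sum(axis=1) / C.sum()
pmi = np.log(np.maximum(p_ij, 1e-12) / np.outer(p_i, p_i))
ppmi = np.maximum(pmi, 0.0)                 # positive PMI "embedding" rows

def sim(a, b):
    va, vb = ppmi[diseases.index(a)], ppmi[diseases.index(b)]
    return va @ vb / (np.linalg.norm(va) * np.linalg.norm(vb))

# Diseases that co-occur often end up with similar vectors.
```

A learned low-dimensional embedding replaces these raw PPMI rows in the paper, which is what makes large-scale disease-disease and disease-gene discovery tractable.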

  13. Modeling dust growth in protoplanetary disks: The breakthrough case

    NASA Astrophysics Data System (ADS)

    Drążkowska, J.; Windmark, F.; Dullemond, C. P.

    2014-07-01

    Context. Dust coagulation in protoplanetary disks is one of the initial steps toward planet formation. Simple toy models are often not sufficient to cover the complexity of the coagulation process, and a number of numerical approaches are therefore used, among which integration of the Smoluchowski equation and various versions of the Monte Carlo algorithm are the most popular. Aims: Recent progress in understanding the processes involved in dust coagulation have caused a need for benchmarking and comparison of various physical aspects of the coagulation process. In this paper, we directly compare the Smoluchowski and Monte Carlo approaches to show their advantages and disadvantages. Methods: We focus on the mechanism of planetesimal formation via sweep-up growth, which is a new and important aspect of the current planet formation theory. We use realistic test cases that implement a distribution in dust collision velocities. This allows a single collision between two grains to have a wide range of possible outcomes but also requires a very high numerical accuracy. Results: For most coagulation problems, we find a general agreement between the two approaches. However, for the sweep-up growth driven by the "lucky" breakthrough mechanism, the methods exhibit very different resolution dependencies. With too few mass bins, the Smoluchowski algorithm tends to overestimate the growth rate and the probability of breakthrough. The Monte Carlo method is less dependent on the number of particles in the growth timescale aspect but tends to underestimate the breakthrough chance due to its limited dynamic mass range. Conclusions: We find that the Smoluchowski approach, which is generally better for the breakthrough studies, is sensitive to low mass resolutions in the high-mass, low-number tail that is important in this scenario. To study the low number density features, a new modulation function has to be introduced to the interaction probabilities. 
As the minimum resolution needed for breakthrough studies depends strongly on setup, verification has to be performed on a case by case basis.
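For readers unfamiliar with the bin-based approach, a minimal discrete Smoluchowski integration with a constant kernel (illustrative parameters, not the paper's setup) shows the mass bookkeeping that such benchmark comparisons hinge on:

```python
import numpy as np

# Discrete Smoluchowski coagulation, constant kernel K, explicit Euler.
# Bin i holds the number density n[i] of clusters of mass i+1 monomers.
K, nbins, dt = 1.0, 32, 0.01
n = np.zeros(nbins)
n[0] = 1.0                                  # all mass starts as monomers
masses = np.arange(1, nbins + 1)
for _ in range(200):                        # integrate to t = 2
    dn = np.zeros_like(n)
    for i in range(nbins):
        for j in range(nbins):
            rate = K * n[i] * n[j]
            dn[i] -= rate                   # partner i consumed
            if i + j + 1 < nbins:           # merged mass (i+1) + (j+1)
                dn[i + j + 1] += 0.5 * rate # count each pair once
    n += dt * dn
total_mass = (n * masses).sum()             # conserved while mass stays in range
total_number = n.sum()                      # decays as clusters merge
```

The resolution sensitivities discussed above arise precisely because the high-mass tail carries few particles per bin, so coarse mass grids distort rare "breakthrough" collisions.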

  14. Model-Driven Theme/UML

    NASA Astrophysics Data System (ADS)

    Carton, Andrew; Driver, Cormac; Jackson, Andrew; Clarke, Siobhán

    Theme/UML is an existing approach to aspect-oriented modelling that supports the modularisation and composition of concerns, including crosscutting ones, in design. To date, its lack of integration with model-driven engineering (MDE) techniques has limited its benefits across the development lifecycle. Here, we describe our work on facilitating the use of Theme/UML as part of an MDE process. We have developed a transformation tool that adopts model-driven architecture (MDA) standards. It defines a concern composition mechanism, implemented as a model transformation, to support the enhanced modularisation features of Theme/UML. We evaluate our approach by applying it to the development of mobile, context-aware applications, an application area characterised by many non-functional requirements that manifest themselves as crosscutting concerns.

  15. Visualising landscape evolution: the effects of resolution on soil redistribution

    NASA Astrophysics Data System (ADS)

    Schoorl, Jeroen M.; Claessens, Lieven; (A) Veldkamp, Tom

    2017-04-01

    Landscape-forming processes such as erosion by water, landsliding driven by water and gravity, and ploughing driven by gravity are closely related to resolution and land use changes. These processes may be controlled and influenced by multiple bio-physical and socio-economic driving factors, resulting in a complex multi-scale system. Consequently, in landscapes where water-driven and/or gravity-driven processes are very active, land use changes should not be analysed in isolation without accounting for both the on-site and off-site effects of these landscape processes. Visualising these on- and off-site effects as a movie of evolving time series and changes is a particularly valuable possibility in DEM-based modelling approaches. To investigate the interactions between land use, land use change, DEM resolution and landscape processes, a case study for the Álora region in southern Spain is presented, mainly as movies of modelling time series. Starting from a baseline scenario of land use change, different levels of resolution, interactions and feedbacks are added to the coupled LAPSUS model framework. Quantities and spatial patterns of both land use change and soil redistribution are compared between the baseline scenario without interactions and scenarios with each of the interaction mechanisms implemented consecutively, all as a function of spatial resolution. Keywords: LAPSUS; land use change; soil erosion; movie

  16. Current uses of Web 2.0 applications in transportation : case studies of select state departments of transportation

    DOT National Transportation Integrated Search

    2010-03-01

    Web 2.0 is an umbrella term for websites or online applications that are user-driven and emphasize collaboration and user interactivity. The trend away from static web pages to a more user-driven Internet model has also occurred in the public s...

  17. The role of parasite-driven selection in shaping landscape genomic structure in red grouse (Lagopus lagopus scotica).

    PubMed

    Wenzel, Marius A; Douglas, Alex; James, Marianne C; Redpath, Steve M; Piertney, Stuart B

    2016-01-01

    Landscape genomics promises to provide novel insights into how neutral and adaptive processes shape genome-wide variation within and among populations. However, there has been little emphasis on examining whether individual-based phenotype-genotype relationships derived from approaches such as genome-wide association (GWAS) manifest themselves as a population-level signature of selection in a landscape context. The two may prove irreconcilable as individual-level patterns become diluted by high levels of gene flow and complex phenotypic or environmental heterogeneity. We illustrate this issue with a case study that examines the role of the highly prevalent gastrointestinal nematode Trichostrongylus tenuis in shaping genomic signatures of selection in red grouse (Lagopus lagopus scotica). Individual-level GWAS involving 384 SNPs has previously identified five SNPs that explain variation in T. tenuis burden. Here, we examine whether these same SNPs display population-level relationships between T. tenuis burden and genetic structure across a small-scale landscape of 21 sites with heterogeneous parasite pressure. Moreover, we identify adaptive SNPs showing signatures of directional selection using F(ST) outlier analysis and relate population- and individual-level patterns of multilocus neutral and adaptive genetic structure to T. tenuis burden. The five candidate SNPs for parasite-driven selection were neither associated with T. tenuis burden on a population level, nor under directional selection. Similarly, there was no evidence of parasite-driven selection in SNPs identified as candidates for directional selection. We discuss these results in the context of red grouse ecology and highlight the broader consequences for the utility of landscape genomics approaches for identifying signatures of selection. © 2015 John Wiley & Sons Ltd.
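As background, the population-level F(ST) scan used in such outlier analyses can be sketched as follows (a simplified variance-based estimator on made-up allele frequencies; dedicated tools additionally fit a neutral null distribution to call outliers):

```python
import numpy as np

# Rows: sampled sites; columns: SNPs; entries: allele frequencies.
# SNP 0 is constructed to vary strongly among sites.
p = np.array([[0.10, 0.50, 0.48],
              [0.90, 0.52, 0.55],
              [0.15, 0.47, 0.50]])

def fst_per_snp(freqs):
    """Crude Fst = Var(p) / (pbar * (1 - pbar)) across sites."""
    pbar = freqs.mean(axis=0)
    return freqs.var(axis=0) / (pbar * (1.0 - pbar))

fst = fst_per_snp(p)
# An outlier test would flag SNP 0 as a candidate for directional selection.
```

The study's negative result is that the GWAS-derived candidate SNPs did not stand out in such a population-level scan, which is the individual-versus-population discrepancy the abstract highlights.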

  18. Driving for successful change processes in healthcare by putting staff at the wheel.

    PubMed

    Erlingsdottir, Gudbjörg; Ersson, Anders; Borell, Jonas; Rydenfält, Christofer

    2018-03-19

    Purpose The purpose of this paper is to describe five salient factors that emerge in two successful change processes in healthcare. Organizational changes in healthcare are often characterized by problems and solutions that have been formulated by higher levels of management. This top-down management approach has not been well received by the professional community. As a result, improvement processes are frequently abandoned, resulting in disrupted and dysfunctional organizations. This paper presents two successful change processes where managerial leadership was used to coach the change processes by distributing mandates and resources. After being managerially initiated, both processes were driven by local agency, decisions, planning and engagement. Design/methodology/approach The data in the paper derive from two qualitative case studies. Data were collected through in-depth interviews, observations and document studies. The cases are presented as process descriptions covering the different phases of the change processes. The focus in the studies is on the roles and interactions of the actors involved, the type of leadership and the distribution of agency. Findings Five factors emerged as paramount to the successful change processes in the two cases: local ownership of problems; a coached process where management initiates the change process and the problem recognition, and then lets the staff define the problems, formulate solutions and drive necessary changes; distributed leadership directed at enabling and supporting the staff's intentions and long-term self-leadership; mutually formulated norms and values that serve as a unifying force for the staff; and generous time allocation and planning, which allows the process to take time, and creates room for reevaluation. The authors also noted that in both cases, reorganization into multi-professional teams lent stability and endurance to the completed changes. 
Originality/value The research shows how management can initiate and support successful change processes that are staff driven and characterized by local agency, decisions, planning and engagement. Empirical descriptions of successful change processes are rare, which is why the description of such processes in this research increases the value of the paper.

  19. Neoclassic drug discovery: the case for lead generation using phenotypic and functional approaches.

    PubMed

    Lee, Jonathan A; Berg, Ellen L

    2013-12-01

    Innovation and new molecular entity production by the pharmaceutical industry has been below expectations. Surprisingly, more first-in-class small-molecule drugs approved by the U.S. Food and Drug Administration (FDA) between 1999 and 2008 were identified by functional phenotypic lead generation strategies reminiscent of pre-genomics pharmacology than contemporary molecular targeted strategies that encompass the vast majority of lead generation efforts. This observation, in conjunction with the difficulty in validating molecular targets for drug discovery, has diminished the impact of the "genomics revolution" and has led to a growing grassroots movement and now broader trend in pharma to reconsider the use of modern physiology-based or phenotypic drug discovery (PDD) strategies. This "From the Guest Editors" column provides an introduction and overview of the two-part special issues of Journal of Biomolecular Screening on PDD. Terminology and the business case for use of PDD are defined. Key issues such as assay performance, chemical optimization, target identification, and challenges to the organization and implementation of PDD are discussed. Possible solutions for these challenges and a new neoclassic vision for PDD that combines phenotypic and functional approaches with technology innovations resulting from the genomics-driven era of target-based drug discovery (TDD) are also described. Finally, an overview of the manuscripts in this special edition is provided.

  20. Phenome-driven disease genetics prediction toward drug discovery

    PubMed Central

    Chen, Yang; Li, Li; Zhang, Guo-Qiang; Xu, Rong

    2015-01-01

    Motivation: Discerning genetic contributions to diseases not only enhances our understanding of disease mechanisms, but also leads to translational opportunities for drug discovery. Recent computational approaches incorporate disease phenotypic similarities to improve the prediction power of disease gene discovery. However, most current studies used only one data source of human disease phenotype. We present an innovative and generic strategy for combining multiple different data sources of human disease phenotype and predicting disease-associated genes from integrated phenotypic and genomic data. Results: To demonstrate our approach, we explored a new phenotype database from biomedical ontologies and constructed Disease Manifestation Network (DMN). We combined DMN with mimMiner, which was a widely used phenotype database in disease gene prediction studies. Our approach achieved significantly improved performance over a baseline method, which used only one phenotype data source. In the leave-one-out cross-validation and de novo gene prediction analysis, our approach achieved the area under the curves of 90.7% and 90.3%, which are significantly higher than 84.2% (P < e−4) and 81.3% (P < e−12) for the baseline approach. We further demonstrated that our predicted genes have the translational potential in drug discovery. We used Crohn’s disease as an example and ranked the candidate drugs based on the rank of drug targets. Our gene prediction approach prioritized druggable genes that are likely to be associated with Crohn’s disease pathogenesis, and our rank of candidate drugs successfully prioritized the Food and Drug Administration-approved drugs for Crohn’s disease. We also found literature evidence to support a number of drugs among the top 200 candidates. In summary, we demonstrated that a novel strategy combining unique disease phenotype data with system approaches can lead to rapid drug discovery. 
Availability and implementation: nlp.case.edu/public/data/DMN Contact: rxx@case.edu PMID:26072493

  1. A knowledge-driven approach to cluster validity assessment.

    PubMed

    Bolshakova, Nadia; Azuaje, Francisco; Cunningham, Pádraig

    2005-05-15

    This paper presents an approach to assessing cluster validity based on similarity knowledge extracted from the Gene Ontology. The program is freely available for non-profit use on request from the authors.

  2. Totally Asymmetric Limit for Models of Heat Conduction

    NASA Astrophysics Data System (ADS)

    De Carlo, Leonardo; Gabrielli, Davide

    2017-08-01

    We consider one-dimensional weakly asymmetric boundary driven models of heat conduction. In the cases of a constant diffusion coefficient and of a quadratic mobility we compute the quasi-potential, a non-local functional obtained by the solution of a variational problem. This is done using the dynamic variational approach of the macroscopic fluctuation theory (Bertini et al. in Rev Mod Phys 87:593, 2015). The case of a concave mobility corresponds essentially to the exclusion model that has been discussed in Bertini et al. (J Stat Mech L11001, 2010; Pure Appl Math 64(5):649-696, 2011; Commun Math Phys 289(1):311-334, 2009) and Enaud and Derrida (J Stat Phys 114:537-562, 2004). We consider here the convex case that includes for example the Kipnis-Marchioro-Presutti (KMP) model and its dual (KMPd) (Kipnis et al. in J Stat Phys 27:6574, 1982). This extends to the weakly asymmetric regime the computations in Bertini et al. (J Stat Phys 121(5/6):843-885, 2005). We consider then, both microscopically and macroscopically, the limit of large external fields. Microscopically we discuss some possible totally asymmetric limits of the KMP model. In one case the totally asymmetric dynamics has a product invariant measure. Another possible limit dynamics has instead a non-trivial invariant measure for which we give a duality representation. Macroscopically we show that the quasi-potentials of KMP and KMPd, which are non-local for any value of the external field, become local in the limit. Moreover the dependence on one of the external reservoirs disappears. For models having strictly positive quadratic mobilities we obtain instead in the limit a non-local functional having a structure similar to that of the boundary driven asymmetric exclusion process.

  3. Pathways-Driven Sparse Regression Identifies Pathways and Genes Associated with High-Density Lipoprotein Cholesterol in Two Asian Cohorts

    PubMed Central

    Silver, Matt; Chen, Peng; Li, Ruoying; Cheng, Ching-Yu; Wong, Tien-Yin; Tai, E-Shyong; Teo, Yik-Ying; Montana, Giovanni

    2013-01-01

    Standard approaches to data analysis in genome-wide association studies (GWAS) ignore any potential functional relationships between gene variants. In contrast gene pathways analysis uses prior information on functional structure within the genome to identify pathways associated with a trait of interest. In a second step, important single nucleotide polymorphisms (SNPs) or genes may be identified within associated pathways. The pathways approach is motivated by the fact that genes do not act alone, but instead have effects that are likely to be mediated through their interaction in gene pathways. Where this is the case, pathways approaches may reveal aspects of a trait's genetic architecture that would otherwise be missed when considering SNPs in isolation. Most pathways methods begin by testing SNPs one at a time, and so fail to capitalise on the potential advantages inherent in a multi-SNP, joint modelling approach. Here, we describe a dual-level, sparse regression model for the simultaneous identification of pathways and genes associated with a quantitative trait. Our method takes account of various factors specific to the joint modelling of pathways with genome-wide data, including widespread correlation between genetic predictors, and the fact that variants may overlap multiple pathways. We use a resampling strategy that exploits finite sample variability to provide robust rankings for pathways and genes. We test our method through simulation, and use it to perform pathways-driven gene selection in a search for pathways and genes associated with variation in serum high-density lipoprotein cholesterol levels in two separate GWAS cohorts of Asian adults. By comparing results from both cohorts we identify a number of candidate pathways including those associated with cardiomyopathy, and T cell receptor and PPAR signalling. 
Highlighted genes include those associated with the L-type calcium channel, adenylate cyclase, integrin, laminin, MAPK signalling and immune function. PMID:24278029
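The resampling idea behind the robust rankings can be sketched on simulated data. Here a crude marginal-correlation screen stands in for the paper's dual-level sparse regression (the grouping, data and selector are all fabricated for illustration); the point is that selection frequency across bootstrap resamples ranks pathways stably:

```python
import numpy as np

# Simulated toy: 30 SNPs in three "pathways" of 10; only pathway 0
# contains truly associated SNPs.
rng = np.random.default_rng(2)
n, p = 200, 30
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:3] = 1.0                               # causal SNPs sit in pathway 0
y = X @ beta + rng.normal(size=n)
pathways = {0: set(range(0, 10)), 1: set(range(10, 20)), 2: set(range(20, 30))}

def selected_pathways(Xb, yb, top_k=3):
    corr = np.abs([np.corrcoef(Xb[:, j], yb)[0, 1] for j in range(p)])
    top = set(np.argsort(corr)[-top_k:])
    return {g for g, snps in pathways.items() if top & snps}

counts = {g: 0 for g in pathways}
for _ in range(100):                          # bootstrap resamples
    idx = rng.integers(0, n, size=n)
    for g in selected_pathways(X[idx], y[idx]):
        counts[g] += 1
# Selection frequency across resamples ranks pathway 0 first.
```

The paper replaces the marginal screen with a joint sparse regression that penalises at both the pathway and SNP level, which is what handles correlated predictors and overlapping pathways.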

  4. Pathways-driven sparse regression identifies pathways and genes associated with high-density lipoprotein cholesterol in two Asian cohorts.

    PubMed

    Silver, Matt; Chen, Peng; Li, Ruoying; Cheng, Ching-Yu; Wong, Tien-Yin; Tai, E-Shyong; Teo, Yik-Ying; Montana, Giovanni

    2013-11-01

    Standard approaches to data analysis in genome-wide association studies (GWAS) ignore any potential functional relationships between gene variants. In contrast gene pathways analysis uses prior information on functional structure within the genome to identify pathways associated with a trait of interest. In a second step, important single nucleotide polymorphisms (SNPs) or genes may be identified within associated pathways. The pathways approach is motivated by the fact that genes do not act alone, but instead have effects that are likely to be mediated through their interaction in gene pathways. Where this is the case, pathways approaches may reveal aspects of a trait's genetic architecture that would otherwise be missed when considering SNPs in isolation. Most pathways methods begin by testing SNPs one at a time, and so fail to capitalise on the potential advantages inherent in a multi-SNP, joint modelling approach. Here, we describe a dual-level, sparse regression model for the simultaneous identification of pathways and genes associated with a quantitative trait. Our method takes account of various factors specific to the joint modelling of pathways with genome-wide data, including widespread correlation between genetic predictors, and the fact that variants may overlap multiple pathways. We use a resampling strategy that exploits finite sample variability to provide robust rankings for pathways and genes. We test our method through simulation, and use it to perform pathways-driven gene selection in a search for pathways and genes associated with variation in serum high-density lipoprotein cholesterol levels in two separate GWAS cohorts of Asian adults. By comparing results from both cohorts we identify a number of candidate pathways including those associated with cardiomyopathy, and T cell receptor and PPAR signalling. 
Highlighted genes include those associated with the L-type calcium channel, adenylate cyclase, integrin, laminin, MAPK signalling and immune function.
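The dual-level sparse regression described above selects whole pathways (groups of SNP coefficients) as well as individual variants. The core operation of any such sparse group penalty is a group soft-thresholding (proximal) step; a minimal numpy sketch of that step follows (illustrative only, not the authors' implementation):

```python
import numpy as np

def group_soft_threshold(beta, lam):
    """Proximal operator of the group-lasso penalty lam * ||beta||_2.

    The whole coefficient group is shrunk towards zero; a group whose
    norm falls below lam is zeroed out entirely, which is how an entire
    pathway (a group of SNP coefficients) drops out of the model.
    """
    norm = np.linalg.norm(beta)
    if norm <= lam:
        return np.zeros_like(beta)
    return (1.0 - lam / norm) * beta

# A "pathway" carrying weak signal is removed outright; a strong one
# is merely shrunk (norm 5 -> scaled by 1 - 0.5/5 = 0.9).
weak = group_soft_threshold(np.array([0.10, -0.05]), lam=0.5)
strong = group_soft_threshold(np.array([3.0, 4.0]), lam=0.5)
```

Iterating this operator within a coordinate-descent loop, with a second individual-level penalty, gives the dual-level selection behaviour the abstract describes.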

  5. Optimal Force Control of Vibro-Impact Systems for Autonomous Drilling Applications

    NASA Technical Reports Server (NTRS)

    Aldrich, Jack B.; Okon, Avi B.

    2012-01-01

    The need to maintain optimal energy efficiency is critical during the drilling operations performed on future and current planetary rover missions (see figure). Specifically, this innovation seeks to solve the following problem. Given a spring-loaded percussive drill driven by a voice-coil motor, one needs to determine the optimal input voltage waveform (periodic function) and the optimal hammering period that minimizes the dissipated energy, while ensuring that the hammer-to-rock impacts are made with sufficient (user-defined) impact velocity (or impact energy). To solve this problem, it was first observed that when voice-coil-actuated percussive drills are driven at high power, it is of paramount importance to ensure that the electrical current of the device remains in phase with the velocity of the hammer. Otherwise, negative work is performed and the drill experiences a loss of performance (i.e., reduced impact energy) and an increase in Joule heating (i.e., reduction in energy efficiency). This observation has motivated many drilling products to incorporate the standard bang-bang control approach for driving their percussive drills. However, the bang-bang control approach is significantly less efficient than the optimal energy-efficient control approach solved herein. To obtain this solution, the standard tools of classical optimal control theory were applied. It is worth noting that these tools inherently require the solution of a two-point boundary value problem (TPBVP), i.e., a system of differential equations where half the equations have unknown boundary conditions. Typically, the TPBVP is impossible to solve analytically for high-dimensional dynamic systems. However, for the case of the spring-loaded vibro-impactor, this approach yields the exact optimal control solution as the sum of four analytic functions whose coefficients are determined using a simple, easy-to-implement algorithm. 
Once the optimal control waveform is determined, it can be used optimally in the context of both open-loop and closed-loop control modes (using standard real-time control hardware).
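The central observation, that the drive current (and hence force) should stay in phase with hammer velocity, can be illustrated with a toy numerical quadrature (assumed sinusoidal motion, not the article's drill dynamics):

```python
import numpy as np

# Work delivered to the hammer over one period: W = sum F(t)*v(t)*dt.
# Assumed sinusoidal hammer velocity; two candidate forcing waveforms.
N = 10_000
t = np.linspace(0.0, 2.0 * np.pi, N, endpoint=False)
dt = 2.0 * np.pi / N
v = np.cos(t)                                  # hammer velocity

W_in_phase = np.sum(np.cos(t) * v) * dt        # force in phase with v
W_quadrature = np.sum(np.sin(t) * v) * dt      # force 90 deg out of phase

# In-phase forcing delivers net positive work (~pi per period); the
# quadrature component averages to zero, i.e. input power is wasted as
# Joule heating rather than converted into impact energy.
```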

  6. On the nonlinear interaction of Goertler vortices and Tollmien-Schlichting waves in curved channel flows at finite Reynolds numbers

    NASA Technical Reports Server (NTRS)

    Daudpota, Q. Isa; Zang, Thomas A.; Hall, Philip

    1988-01-01

    The flow in a two-dimensional curved channel driven by an azimuthal pressure gradient can become linearly unstable due to axisymmetric perturbations and/or nonaxisymmetric perturbations depending on the curvature of the channel and the Reynolds number. For a particular small value of curvature, in the neighborhood of this critical curvature value and the critical Reynolds number, nonlinear interactions occur between these perturbations. The Stuart-Watson approach is used to derive two coupled Landau equations for the amplitudes of these perturbations. The stability of the various possible states of these perturbations is shown through bifurcation diagrams. Emphasis is given to those cases which have relevance to external flows.
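Coupled Landau equations of the kind produced by the Stuart-Watson expansion are straightforward to integrate numerically to locate stable states; the coefficients below are illustrative placeholders, not values from the paper:

```python
import numpy as np

# Illustrative coefficients (NOT the paper's values) for the coupled
# Landau equations of the two perturbation amplitudes A and B:
#   dA/dt = s1*A - a*A^3 - b*A*B^2
#   dB/dt = s2*B - c*B^3 - d*B*A^2
s1 = s2 = 1.0
a, b, c, d = 1.0, 0.5, 1.0, 0.5

A, B = 0.1, 0.1              # small initial perturbations
dt = 0.01
for _ in range(5000):        # forward-Euler integration to t = 50
    dA = s1 * A - a * A**3 - b * A * B**2
    dB = s2 * B - c * B**3 - d * B * A**2
    A, B = A + dt * dA, B + dt * dB

# For this parameter choice (b*d < a*c) the stable state is the mixed
# mode with A^2 = B^2 = 2/3: both perturbations coexist at finite
# amplitude, one branch of the bifurcation diagram.
```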

  7. On the nonlinear interaction of Gortler vortices and Tollmien-Schlichting waves in curved channel flows at finite Reynolds numbers

    NASA Technical Reports Server (NTRS)

    Daudpota, Q. Isa; Hall, Philip; Zang, Thomas A.

    1987-01-01

    The flow in a two-dimensional curved channel driven by an azimuthal pressure gradient can become linearly unstable due to axisymmetric perturbations and/or nonaxisymmetric perturbations depending on the curvature of the channel and the Reynolds number. For a particular small value of curvature, in the neighborhood of this critical curvature value and the critical Reynolds number, nonlinear interactions occur between these perturbations. The Stuart-Watson approach is used to derive two coupled Landau equations for the amplitudes of these perturbations. The stability of the various possible states of these perturbations is shown through bifurcation diagrams. Emphasis is given to those cases which have relevance to external flows.

  8. Enrichment Clusters: A Practical Plan for Real-World, Student-Driven Learning.

    ERIC Educational Resources Information Center

    Renzulli, Joseph S.; Gentry, Marcia; Reis, Sally M.

    This guidebook provides a rationale and guidelines for implementing a student-driven learning approach using enrichment clusters. Enrichment clusters allow students who share a common interest to meet each week to produce a product, performance, or targeted service based on that common interest. Chapter 1 discusses different models of learning.…

  9. Closing the Loop: How We Better Serve Our Students through a Comprehensive Assessment Process

    ERIC Educational Resources Information Center

    Arcario, Paul; Eynon, Bret; Klages, Marisa; Polnariev, Bernard A.

    2013-01-01

    Outcomes assessment is often driven by demands for accountability. LaGuardia Community College's outcomes assessment model has advanced student learning, shaped academic program development, and created an impressive culture of faculty-driven assessment. Our inquiry-based approach uses ePortfolios for collection of student work and demonstrates…

  10. Eukaryotic Cell Cycle as a Test Case for Modeling Cellular Regulation in a Collaborative Problem-Solving Environment

    DTIC Science & Technology

    2007-03-01

    mitoses, some cells arrest in G2 while other cells continue to divide. In sea urchin and frog embryos, the first 12 cell cycles are known to be driven...with interlaced feedback and feed forward control loops, the hand-waving approach flounders in a stormy sea of conflicting signals, endless...we reduced the rate constants for degradation of Clb2, as described in the publication. Experiment Copies/cell, mean ± SEM (fold increase

  11. A Study of Enabling Factors for Rapid Fielding: Combined Practices to Balance Speed and Stability

    DTIC Science & Technology

    2013-05-01

    as contributors to the success of Agile projects, such as Scrum status meetings, continuous integration, test-driven development, etc. A second...Management Approach Type Product Size Team Size Sprint length / Prod Release Cycle A-P1 Pre- release Scrum Case management system ᝺M...SLOC 10-20 2 weeks/ TBD B-P1 12 years Scrum Analysis support system ᝺M SLOC 10-20 2 weeks/ 6 months – 1 year C-P1 3 years Scrum Training

  12. Evaluating Model-Driven Development for large-scale EHRs through the openEHR approach.

    PubMed

    Christensen, Bente; Ellingsen, Gunnar

    2016-05-01

    In healthcare, the openEHR standard is a promising Model-Driven Development (MDD) approach for electronic healthcare records. This paper aims to identify key socio-technical challenges when the openEHR approach is put to use in Norwegian hospitals. More specifically, key fundamental assumptions are investigated empirically. These assumptions promise a clear separation of technical and domain concerns, users being in control of the modelling process, and widespread user commitment. Finally, these assumptions promise an easy way to model and map complex organizations. This longitudinal case study is based on an interpretive approach, whereby data were gathered through 440h of participant observation, 22 semi-structured interviews and extensive document studies over 4 years. The separation of clinical and technical concerns seemed to be aspirational, because both designing the technical system and modelling the domain required technical and clinical competence. Hence developers and clinicians found themselves working together in both arenas. User control and user commitment seemed not to apply in large-scale projects, as modelling the domain turned out to be too complicated and hence to appeal only to especially interested users worldwide, not the local end-users. Modelling proved to be a complex standardization process that shaped both the actual modelling and healthcare practice itself. A broad assemblage of contributors seems to be needed for developing an archetype-based system, in which roles, responsibilities and contributions cannot be clearly defined and delimited. The way MDD occurs has implications for medical practice per se in the form of the need to standardize practices to ensure that medical concepts are uniform across practices. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  13. Ontology development for provenance tracing in National Climate Assessment of the US Global Change Research Program

    NASA Astrophysics Data System (ADS)

    Fu, Linyun; Ma, Xiaogang; Zheng, Jin; Goldstein, Justin; Duggan, Brian; West, Patrick; Aulenbach, Steve; Tilmes, Curt; Fox, Peter

    2014-05-01

    This poster will show how we used a case-driven iterative methodology to develop an ontology to represent the content structure and the associated provenance information in a National Climate Assessment (NCA) report of the US Global Change Research Program (USGCRP). We applied the W3C PROV-O ontology to implement a formal representation of provenance. We argue that the use case-driven, iterative development process and the application of a formal provenance ontology help efficiently incorporate domain knowledge from earth and environmental scientists in a well-structured model interoperable in the context of the Web of Data.

  14. Context-sensitive network-based disease genetics prediction and its implications in drug discovery.

    PubMed

    Chen, Yang; Xu, Rong

    2017-04-01

    Disease phenotype networks play an important role in computational approaches to identifying new disease-gene associations. Current disease phenotype networks often model disease relationships based on pairwise similarities, and therefore ignore the specific context of how two diseases are connected. In this study, we propose a new strategy to model disease associations using context-sensitive networks (CSNs). We developed a CSN-based phenome-driven approach for disease genetics prediction, and investigated the translational potential of the predicted genes in drug discovery. We constructed CSNs by directly connecting diseases with associated phenotypes. Here, we constructed two CSNs using different data sources; the two networks contain 26 790 and 13 822 nodes respectively. We integrated the CSNs with a genetic functional relationship network and predicted disease genes using a network-based ranking algorithm. For comparison, we built Similarity-Based disease Networks (SBN) using the same disease phenotype data. In a de novo cross validation for 3324 diseases, the CSN-based approach significantly increased the average rank from top 12.6% to top 8.8% for all tested genes compared with the SBN-based approach ( p
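Network-based disease-gene ranking of this kind is commonly implemented as a random walk with restart from a disease's seed nodes; a minimal sketch on an invented toy network (the CSN construction itself is not reproduced):

```python
import numpy as np

def rank_by_rwr(adj, seeds, restart=0.3, iters=200):
    """Random walk with restart: steady-state visiting probabilities
    rank nodes by network proximity to the seed set."""
    adj = np.asarray(adj, dtype=float)
    W = adj / adj.sum(axis=0, keepdims=True)   # column-stochastic
    p0 = np.zeros(len(adj))
    p0[list(seeds)] = 1.0 / len(seeds)
    p = p0.copy()
    for _ in range(iters):
        p = (1.0 - restart) * W @ p + restart * p0
    return np.argsort(-p)                      # best-ranked first

# Invented 5-node network: node 0 is the seed "disease" node, node 4
# is the most remote candidate.
adj = [[0, 1, 1, 0, 0],
       [1, 0, 1, 0, 0],
       [1, 1, 0, 1, 0],
       [0, 0, 1, 0, 1],
       [0, 0, 0, 1, 0]]
ranking = rank_by_rwr(adj, seeds=[0])
```

Nodes close to the seed in the network rank first; remote nodes rank last, which is the basic mechanism behind the average-rank comparison reported above.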

  15. Dynamic behavior of the interface of striplike structures in driven lattice gases

    NASA Astrophysics Data System (ADS)

    Saracco, Gustavo P.; Albano, Ezequiel V.

    2008-09-01

    In this work, the dynamic behavior of the interfaces in both the standard and random driven lattice gas models (DLG and RDLG, respectively) is investigated via numerical Monte Carlo simulations in two dimensions. These models consider a lattice gas of density ρ=1/2 with nearest-neighbor attractive interactions between particles under the influence of an external driving field applied along one fixed direction in the case of the DLG model, and along a randomly varying direction in the case of the RDLG model. The systems are also in contact with a reservoir at temperature T. These systems undergo a second-order nonequilibrium phase transition between an ordered state, characterized by high-density strips crossing the sample along the driving field, and a quasilattice gas disordered state. For T≲Tc, the average interface width of the strips (W) was measured as a function of the lattice size and the anisotropic shape factor. It was found that the saturation value Wsat2 depends only on the lattice size parallel to the external field axis, Ly, and exhibits two distinct regimes: Wsat2∝lnLy at low temperatures, which crosses over to Wsat2∝Ly2αI near the critical zone, αI=1/2 being the roughness exponent of the interface. By using the relationship αI=1/(1+ΔI), the anisotropic exponent for the interface of the DLG model was estimated, giving ΔI≃1, in agreement with the computed value of the anisotropic bulk exponent ΔB in a recently proposed theoretical approach. At the crossover region between the two regimes, we observed indications of bulk criticality. The time evolution of W at Tc was also monitored and shows two growth stages: first W∝lnt for several decades, and at later times W∝tβI, where βI is the dynamic exponent of the interface width. By using this value we estimated the dynamic critical exponent of the correlation length in the direction perpendicular to the external field, giving z⊥I≈4, which is consistent with the dynamic exponent of the bulk critical transition z⊥B in both theoretical approaches developed for the standard model. A similar scenario was also observed in the RDLG model, suggesting that both models may belong to the same universality class.
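The roughness exponent αI discussed above is typically extracted from a log-log fit of the saturated width against system size; a sketch on synthetic, noise-free data generated with α = 1/2 for illustration:

```python
import numpy as np

# Synthetic saturated widths obeying Wsat^2 = A * L^(2*alpha) with
# alpha = 1/2, mimicking the near-critical regime described above.
alpha_true = 0.5
L = np.array([16.0, 32.0, 64.0, 128.0, 256.0])
W_sat_sq = 0.8 * L ** (2.0 * alpha_true)

# The slope of log(Wsat^2) versus log(L) is 2*alpha.
slope, _ = np.polyfit(np.log(L), np.log(W_sat_sq), 1)
alpha_est = slope / 2.0
```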

  16. A Novel Approach for Determining Source–Receptor Relationships in Model Simulations: A Case Study of Black Carbon Transport in Northern Hemisphere Winter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ma, Po-Lun; Gattiker, J. R.; Liu, Xiaohong

    2013-06-27

    A Gaussian process (GP) emulator is applied to quantify the contribution of local and remote emissions of black carbon (BC) to the BC concentrations in different regions, using a Latin Hypercube sampling strategy for emission perturbations in the offline version of the Community Atmosphere Model Version 5.1 (CAM5) simulations. The source-receptor relationships are computed based on simulations constrained by a standard free-running CAM5 simulation and the ERA-Interim reanalysis product. The analysis demonstrates that the emulator is capable of retrieving the source-receptor relationships based on a small number of CAM5 simulations. Most regions are found to be susceptible to their local emissions. The emulator also finds that the source-receptor relationships retrieved from the model-driven and the reanalysis-driven simulations are very similar, suggesting that the simulated circulation in CAM5 resembles the assimilated meteorology in ERA-Interim. The robustness of the results provides confidence for applying the emulator to detect dose-response signals in the climate system.
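A Gaussian process emulator of the kind applied here can be sketched in a few lines of numpy: condition a GP with an RBF kernel on a handful of "model runs", then predict the response at untried inputs. The toy 1-D response surface, the kernel choice, and its hyperparameters below are all assumptions for illustration:

```python
import numpy as np

def rbf(a, b, length=0.2):
    """Squared-exponential kernel between two 1-D point sets."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length**2)

# "Model runs": a toy response surface evaluated at a few design points.
X = np.linspace(0.0, 1.0, 15)
y = np.sin(2.0 * np.pi * X)

# GP posterior mean at an untried input (noise-free runs, with a small
# jitter term added to the kernel matrix for numerical stability).
Xs = np.array([0.25])
K = rbf(X, X) + 1e-6 * np.eye(len(X))
mean = rbf(Xs, X) @ np.linalg.solve(K, y)
# mean[0] emulates the true response sin(2*pi*0.25) = 1 closely.
```

The appeal, as in the abstract, is that once trained on a small design (e.g. a Latin hypercube), the emulator predicts the response to arbitrary emission perturbations without further expensive model runs.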

  17. The Use of Linking Adverbials in Academic Essays by Non-Native Writers: How Data-Driven Learning Can Help

    ERIC Educational Resources Information Center

    Garner, James Robert

    2013-01-01

    Over the past several decades, the TESOL community has seen an increased interest in the use of data-driven learning (DDL) approaches. Most studies of DDL have focused on the acquisition of vocabulary items, including a wide range of information necessary for their correct usage. One type of vocabulary that has yet to be properly investigated has…

  18. Data Driven Model Development for the Supersonic Semispan Transport (S(sup 4)T)

    NASA Technical Reports Server (NTRS)

    Kukreja, Sunil L.

    2011-01-01

    We investigate two common approaches to model development for robust control synthesis in the aerospace community: reduced-order aeroservoelastic modelling based on structural finite-element and computational-fluid-dynamics-based aerodynamic models, and a data-driven system identification procedure. Analysis of experimental SuperSonic SemiSpan Transport (S4T) wind-tunnel data using a system identification approach shows that it is possible to estimate a model at a fixed Mach number which is parsimonious and robust across varying dynamic pressures.

  19. Metal-driven and covalent synthesis of supramolecular grids from racks: a convergent approach to heterometallic and heteroleptic nanostructures.

    PubMed

    Schmittel, Michael; Kalsani, Venkateshwarlu; Bats, Jan W

    2005-06-13

    Supramolecular nanogrids were prepared from dynamic supramolecular racks through the coupling of terminal alkynes using either a covalent (with CuCl/O(2)) or a coordinative (with [trans-(PEt(3))(2)PtCl(2)]) approach. Because of the rapid equilibration of the racks (as tested by exchange reactions), oligomeric adducts potentially formed in the coupling process will selectively furnish the nanogrids through an entropically driven self-repair mechanism. To ascertain the structural assignment, the nanogrids were also synthesized by an independent strategy.

  20. Model-Based Approach to Predict Adherence to Protocol During Antiobesity Trials.

    PubMed

    Sharma, Vishnu D; Combes, François P; Vakilynejad, Majid; Lahu, Gezim; Lesko, Lawrence J; Trame, Mirjam N

    2018-02-01

    Development of antiobesity drugs is continuously challenged by high dropout rates during clinical trials. The objective was to develop a population pharmacodynamic model that describes the temporal changes in body weight, considering disease progression, lifestyle intervention, and drug effects. Markov modeling (MM) was applied for quantification and characterization of responders and nonresponders as key drivers of dropout rates, to ultimately support the clinical trial simulations and the outcome in terms of trial adherence. Subjects (n = 4591) from 6 Contrave® trials were included in this analysis. An indirect-response model developed by van Wart et al. was used as a starting point. Inclusion of drug effect was dose driven using a population dose- and time-dependent pharmacodynamic (DTPD) model. Additionally, a population-pharmacokinetic parameter- and data (PPPD)-driven model was developed using the final DTPD model structure and final parameter estimates from a previously developed population pharmacokinetic model based on available Contrave® pharmacokinetic concentrations. Lastly, MM was developed to predict transition rate probabilities among responder, nonresponder, and dropout states driven by the pharmacodynamic effect resulting from the DTPD or PPPD model. Covariates included in the models and parameters were diabetes mellitus and race. The linked DTPD-MM and PPPD-MM was able to predict transition rates among responder, nonresponder, and dropout states well. The analysis concluded that body-weight change is an important factor influencing dropout rates, and the MM depicted that overall a DTPD model-driven approach provides a reasonable prediction of clinical trial outcome probabilities similar to a pharmacokinetic-driven approach. © 2017, The Authors. The Journal of Clinical Pharmacology published by Wiley Periodicals, Inc. on behalf of American College of Clinical Pharmacology.
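The Markov layer of such a model can be sketched as a three-state chain (responder, nonresponder, dropout). In the actual model the transition probabilities are driven by the predicted pharmacodynamic effect; the matrix below is purely illustrative:

```python
import numpy as np

# States: 0 = responder, 1 = nonresponder, 2 = dropout (absorbing).
# Illustrative per-cycle transition probabilities; each row sums to 1.
P = np.array([[0.85, 0.10, 0.05],
              [0.15, 0.70, 0.15],
              [0.00, 0.00, 1.00]])

start = np.array([0.5, 0.5, 0.0])       # enrolled, nobody dropped out
after = start @ np.linalg.matrix_power(P, 26)
dropout_fraction = after[2]             # predicted cumulative dropout
```

Propagating the state distribution forward in this way is what lets the model turn per-cycle response probabilities into a predicted trial-adherence curve.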

  1. Ability Grouping and Differentiated Instruction in an Era of Data-Driven Decision Making

    ERIC Educational Resources Information Center

    Park, Vicki; Datnow, Amanda

    2017-01-01

    Despite data-driven decision making being a ubiquitous part of policy and school reform efforts, little is known about how teachers use data for instructional decision making. Drawing on data from a qualitative case study of four elementary schools, we examine the logic and patterns of teacher decision making about differentiation and ability…

  2. Replacement model of city bus: A dynamic programming approach

    NASA Astrophysics Data System (ADS)

    Arifin, Dadang; Yusuf, Edhi

    2017-06-01

    This paper aims to develop a replacement model for the city bus vehicles operated in Bandung City. The study is driven by real cases encountered by the Damri Company in its efforts to improve services to the public. The replacement model weighs two policy alternatives: first, to maintain or keep a vehicle; second, to replace it with a new one, taking into account operating costs, revenue, salvage value, and the acquisition cost of a new vehicle. A deterministic dynamic programming approach is used to solve the model. The optimization process was heuristically executed using empirical data from Perum Damri. The output of the model is the replacement schedule and the best policy once a vehicle has passed its economic life. Based on the results, the technical life of a bus is approximately 20 years, while its economic life averages 9 (nine) years. This means that after a bus has been operated for 9 (nine) years, managers should consider the policy of rejuvenation.
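The keep-or-replace recursion behind such a replacement model can be sketched with backward dynamic programming over vehicle age; all costs below are invented toy numbers, not Damri data:

```python
# Backward dynamic programming for the keep-vs-replace decision.
# Toy assumptions: operating cost rises and salvage value falls with
# age; a new bus costs `acquisition`.
HORIZON = 12        # planning years
MAX_AGE = 20        # technical life of a bus
acquisition = 100.0

def operating_cost(age):
    return 10.0 + 3.0 * age

def salvage(age):
    return max(60.0 - 5.0 * age, 5.0)

# value[t][age]: minimal cost from year t onward, holding a bus of `age`.
value = [[0.0] * (MAX_AGE + 1) for _ in range(HORIZON + 1)]
policy = [[""] * (MAX_AGE + 1) for _ in range(HORIZON)]
for age in range(MAX_AGE + 1):
    value[HORIZON][age] = -salvage(age)       # sell at end of horizon

for t in range(HORIZON - 1, -1, -1):
    for age in range(MAX_AGE + 1):
        # Keep: pay this year's operating cost, bus ages by one year.
        keep = operating_cost(age) + value[t + 1][min(age + 1, MAX_AGE)]
        # Replace: sell old bus, buy and operate a new one this year.
        replace = (acquisition - salvage(age)
                   + operating_cost(0) + value[t + 1][1])
        if keep <= replace:
            value[t][age], policy[t][age] = keep, "keep"
        else:
            value[t][age], policy[t][age] = replace, "replace"
```

The age at which `policy[0][age]` first switches from "keep" to "replace" plays the role of the economic life in this toy setting.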

  3. Development of the supply chain oriented quality assurance system for aerospace manufacturing SMEs and its implementation perspectives

    NASA Astrophysics Data System (ADS)

    Hussein, Abdullahi; Cheng, Kai

    2016-10-01

    Aerospace manufacturing SMEs continuously face the challenge of managing their supply chains and complying with aerospace manufacturing quality standard requirements, owing to their limited resources and the nature of their business. In this paper, an ERP-system-based approach to quality control and assurance is presented, built on the seamless internal integration of in-process production data and information, so that suppliers can be managed more effectively and efficiently. The Aerospace Manufacturing Quality Assurance Standard (BS/EN9100) is one of the most recognised and essential protocols for developing industry-operated-and-driven quality assurance systems. The research investigates using an ERP-based system as an enabler for implementing a BS/EN9100 quality management system at manufacturing SMEs, together with the associated implementation and application perspectives. An application case study on a manufacturing SME using an SAP-based implementation is presented, which helps further evaluate and validate the approach and the application system development.

  4. Decontamination of soil washing wastewater using solar driven advanced oxidation processes.

    PubMed

    Bandala, Erick R; Velasco, Yuridia; Torres, Luis G

    2008-12-30

    Decontamination of soil washing wastewater was performed using two different solar driven advanced oxidation processes (AOPs): the photo-Fenton reaction and the cobalt/peroxymonosulfate/ultraviolet (Co/PMS/UV) process. Complete degradation of sodium dodecyl sulphate (SDS), the surfactant used to enhance the soil washing process, was achieved when the Co/PMS/UV process was used. In the case of the photo-Fenton reaction, almost complete SDS degradation required nearly four times the energy needed by the Co/PMS/UV process. The initial reaction rate in the first 15 min (IR15) was determined for each process in order to compare them. The highest IR15 value was found for the Co/PMS/UV process (0.011 mmol/min), followed by the photo-Fenton reaction (0.0072 mmol/min) and the dark Co/PMS and Fenton processes (IR15 = 0.002 mmol/min in both cases). Organic matter depletion in the wastewater, as the sum of the surfactant and total petroleum hydrocarbons present (measured as chemical oxygen demand, COD), was also determined for both solar driven processes. For COD, the highest removal (69%) was achieved when the photo-Fenton reaction was used, whereas the Co/PMS/UV process yielded a slightly lower removal (51%). In both cases the organic matter removal achieved was over 50%, which can be considered suitable for coupling the tested AOPs with conventional wastewater treatment processes such as biodegradation.
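The initial reaction rate IR15 used to compare the processes is simply the magnitude of the slope of surfactant concentration versus time over the first 15 min; a sketch with made-up concentration data (not measurements from the study):

```python
import numpy as np

# Made-up concentration-time data (mmol vs. min), NOT measurements
# from the study: roughly linear surfactant decay early in the run.
t_min = np.array([0.0, 5.0, 10.0, 15.0])
conc = np.array([1.00, 0.95, 0.89, 0.84])

# IR15 = magnitude of the least-squares slope over the first 15 min.
slope, _ = np.polyfit(t_min, conc, 1)
IR15 = -slope            # mmol/min
```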

  5. Adiabatic Markovian dynamics.

    PubMed

    Oreshkov, Ognyan; Calsamiglia, John

    2010-07-30

    We propose a theory of adiabaticity in quantum Markovian dynamics based on a decomposition of the Hilbert space induced by the asymptotic behavior of the Lindblad semigroup. A central idea of our approach is that the natural generalization of the concept of an eigenspace of the Hamiltonian in the case of Markovian dynamics is a noiseless subsystem with a minimal noisy cofactor. Unlike previous attempts to define adiabaticity for open systems, our approach deals exclusively with physical entities and provides a simple, intuitive picture at the Hilbert-space level, linking the notion of adiabaticity to the theory of noiseless subsystems. As two applications of our theory, we propose a general framework for decoherence-assisted computation in noiseless codes and a dissipation-driven approach to holonomic computation based on adiabatic dragging of subsystems that is generally not achievable by nondissipative means.

  6. Taking Advantage of Selective Change Driven Processing for 3D Scanning

    PubMed Central

    Vegara, Francisco; Zuccarello, Pedro; Boluda, Jose A.; Pardo, Fernando

    2013-01-01

    This article deals with the application of the principles of SCD (Selective Change Driven) vision to 3D laser scanning. Two experimental sets have been implemented: one with a classical CMOS (Complementary Metal-Oxide Semiconductor) sensor, and the other one with a recently developed CMOS SCD sensor for comparative purposes, both using the technique known as Active Triangulation. An SCD sensor only delivers the pixels that have changed most, ordered by the magnitude of their change since their last readout. The 3D scanning method is based on the systematic search through the entire image to detect pixels that exceed a certain threshold, showing the SCD approach to be ideal for this application. Several experiments for both capturing strategies have been performed to try to find the limitations in high speed acquisition/processing. The classical approach is limited by the sequential array acquisition, as predicted by the Nyquist–Shannon sampling theorem, and this has been experimentally demonstrated in the case of a rotating helix. These limitations are overcome by the SCD 3D scanning prototype achieving a significantly higher performance. The aim of this article is to compare both capturing strategies in terms of performance in the time and frequency domains, so they share all the static characteristics including resolution, 3D scanning method, etc., thus yielding the same 3D reconstruction in static scenes. PMID:24084110

  7. General Aviation Interior Noise. Part 1; Source/Path Identification

    NASA Technical Reports Server (NTRS)

    Unruh, James F.; Till, Paul D.; Palumbo, Daniel L. (Technical Monitor)

    2002-01-01

    There were two primary objectives of the research effort reported herein. The first objective was to identify and evaluate noise source/path identification technology applicable to single engine propeller driven aircraft that can be used to identify interior noise sources originating from structure-borne engine/propeller vibration, airborne propeller transmission, airborne engine exhaust noise, and engine case radiation. The approach taken to identify the contributions of each of these possible sources was first to conduct a Principal Component Analysis (PCA) of an in-flight noise and vibration database acquired on a Cessna Model 182E aircraft. The second objective was to develop and evaluate advanced technology for noise source ranking of interior panel groups such as the aircraft windshield, instrument panel, firewall, and door/window panels within the cabin of a single engine propeller driven aircraft. The technology employed was that of Acoustic Holography (AH). AH was applied to the test aircraft by acquiring a series of in-flight microphone array measurements within the aircraft cabin and correlating the measurements via PCA. The source contributions of the various panel groups leading to the array measurements were then synthesized by solving the inverse problem using the boundary element model.

  8. Project Career: An individualized postsecondary approach to promoting independence, functioning, and employment success among students with traumatic brain injuries.

    PubMed

    Minton, Deborah; Elias, Eileen; Rumrill, Phillip; Hendricks, Deborah J; Jacobs, Karen; Leopold, Anne; Nardone, Amanda; Sampson, Elaine; Scherer, Marcia; Gee Cormier, Aundrea; Taylor, Aiyana; DeLatte, Caitlin

    2017-09-14

    Project Career is a five-year interdisciplinary demonstration project funded by NIDILRR. It provides technology-driven supports, merging Cognitive Support Technology (CST) evidence-based practices and rehabilitation counseling, to improve postsecondary and employment outcomes for veteran and civilian undergraduate students with traumatic brain injury (TBI). Its aim is to provide a technology-driven, individualized support program that improves career and employment outcomes for students with TBI. Project staff assess students' needs relative to assistive technology, academic achievement, and career preparation; provide CST training to 150 students; match students with mentors; provide vocational case management; deliver job development and placement assistance; and maintain an electronic portal of accommodation and career resources. Participating students receive cognitive support technology training, academic enrichment, and career preparatory assistance from trained professionals at three implementation sites. Staff address cognitive challenges using the 'Matching Person with Technology' assessment to accommodate CST use (iPad and selected applications (apps)). JBS International (JBS) provides the project's evaluation. To date, 117 students have participated, with 63% reporting improved quality of life and 75% reporting improved academic performance. Project Career provides a national model based on best practices for enabling postsecondary students with TBI to attain academic, employment, and career goals.

  9. Inertial migration of elastic particles in a pressure-driven power-law fluid

    NASA Astrophysics Data System (ADS)

    Bowie, Samuel; Alexeev, Alexander

    2016-11-01

    Using three-dimensional computer simulations, we study the cross-stream migration of deformable particles in a channel filled with a non-Newtonian fluid driven by a pressure gradient. Our numerical approach integrates the lattice Boltzmann method and the lattice spring method in order to model the fluid-structure interactions of the elastic particle and the surrounding power-law fluid in the channel. The particles are modeled as initially spherical elastic shells filled with a viscous fluid. We focus on the regimes where inertial effects cannot be neglected and cause cross-stream drift of particles. We probe the flow with different power-law indexes, including both shear-thickening and shear-thinning fluids. We also examine the migration of particles with different elasticity and relative size. To isolate the non-Newtonian effects on particle migration, we compare the results with the inertial migration results found in the case where the channel is filled with a simple Newtonian fluid. The results can be useful for applications requiring high-throughput separation, sorting, and focusing of both synthetic particles and biological cells in microfluidic devices. Financial support provided by National Science Foundation (NSF) Grant No. CMMI1538161.
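The base flow in such a channel, pressure-driven flow of a power-law fluid between parallel plates, has a closed-form velocity profile that is a standard verification case for lattice Boltzmann solvers; the parameter values below are illustrative, not those of the study:

```python
import numpy as np

def power_law_profile(y, H=1.0, G=1.0, k=1.0, n=0.5):
    """Velocity of a power-law fluid (index n) in plane Poiseuille flow
    between plates at y = -H and y = +H, with pressure gradient G and
    consistency index k; n < 1 is shear-thinning, n > 1 thickening."""
    e = (n + 1.0) / n
    return (n / (n + 1.0)) * (G / k) ** (1.0 / n) * (H**e - np.abs(y) ** e)

y = np.linspace(-1.0, 1.0, 201)
u_thinning = power_law_profile(y, n=0.5)   # blunted, plug-like profile
u_newton = power_law_profile(y, n=1.0)     # classic parabola
```

For n = 1 the expression reduces to the Newtonian parabola with centerline velocity G*H^2/(2k); shear-thinning indexes flatten the profile near the centerline, which changes the lift landscape that drives cross-stream migration.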

  10. Climate-driven vital rates do not always mean climate-driven population.

    PubMed

    Tavecchia, Giacomo; Tenan, Simone; Pradel, Roger; Igual, José-Manuel; Genovart, Meritxell; Oro, Daniel

    2016-12-01

    Current climatic changes have increased the need to forecast population responses to climate variability. A common approach to address this question is through models that project current population state using the functional relationship between demographic rates and climatic variables. We argue that this approach can lead to erroneous conclusions when interpopulation dispersal is not considered. We found that immigration can release the population from climate-driven trajectories even when local vital rates are climate dependent. We illustrated this using individual-based data on a trans-equatorial migratory seabird, the Scopoli's shearwater Calonectris diomedea, in which the variation of vital rates has been associated with large-scale climatic indices. We compared the population annual growth rate λi, estimated using local climate-driven parameters, with ρi, a population growth rate directly estimated from individual information that accounts for immigration. While λi varied as a function of climatic variables, reflecting the climate-dependent parameters, ρi did not, indicating that dispersal decouples the relationship between population growth and climate variables from that between climatic variables and vital rates. Our results suggest caution when assessing demographic effects of climatic variability, especially in open populations of very mobile organisms such as fish, marine mammals, bats, or birds. When a population model cannot be validated or is not detailed enough, ignoring immigration might lead to misleading climate-driven projections. © 2016 John Wiley & Sons Ltd.

  11. Identifying species threat hotspots from global supply chains.

    PubMed

    Moran, Daniel; Kanemoto, Keiichiro

    2017-01-04

    Identifying hotspots of species threat has been a successful approach for setting conservation priorities. One important challenge in conservation is that, in many hotspots, export industries continue to drive overexploitation. Conservation measures must consider not just the point of impact, but also the consumer demand that ultimately drives resource use. To understand which species threat hotspots are driven by which consumers, we have developed a new approach to link a set of biodiversity footprint accounts to the hotspots of threatened species on the IUCN Red List of Threatened Species. The result is a map connecting consumption to spatially explicit hotspots driven by production on a global scale. Locating biodiversity threat hotspots driven by consumption of goods and services can help to connect conservationists, consumers, companies and governments in order to better target conservation actions.

  12. Sequence-of-events-driven automation of the deep space network

    NASA Technical Reports Server (NTRS)

    Hill, R., Jr.; Fayyad, K.; Smyth, C.; Santos, T.; Chen, R.; Chien, S.; Bevan, R.

    1996-01-01

    In February 1995, sequence-of-events (SOE)-driven automation technology was demonstrated for a Voyager telemetry downlink track at DSS 13. This demonstration entailed automated generation of an operations procedure (in the form of a temporal dependency network) from project SOE information using artificial intelligence planning technology and automated execution of the temporal dependency network using the link monitor and control operator assistant system. This article describes the overall approach to SOE-driven automation that was demonstrated, identifies gaps in SOE definitions and project profiles that hamper automation, and provides detailed measurements of the knowledge engineering effort required for automation.

  14. Efficacy of ACA strategies in biography-driven science teaching: an investigation

    NASA Astrophysics Data System (ADS)

    MacDonald, Grizelda L.; Miller, Stuart S.; Murry, Kevin; Herrera, Socorro; Spears, Jacqueline D.

    2013-12-01

    This study explored the biography-driven approach to teaching culturally and linguistically diverse students in science education. Biography-driven instruction (BDI) embraces student diversity by incorporating the sociocultural, linguistic, cognitive, and academic dimensions of students' biographies into the learning process (Herrera in Biography-driven culturally responsive teaching. Teachers College Press, New York, 2010). Strategies have been developed (Herrera, Kavimandan and Holmes in Crossing the vocabulary bridge: differentiated strategies for diverse secondary classrooms. Teachers College Press, New York, 2011) that provide teachers with instructional routines that facilitate BDI. Using systematic classroom observations, we empirically demonstrate that these activate-connect-affirm (ACA) strategies are likely to be effective in increasing teachers' biography-driven practices. Implications for theory and practice are discussed.

  15. Flexible cue combination in the guidance of attention in visual search

    PubMed Central

    Brand, John; Oriet, Chris; Johnson, Aaron P.; Wolfe, Jeremy M.

    2014-01-01

    Hodsoll and Humphreys (2001) assessed the relative contributions of stimulus-driven and user-driven knowledge in linearly and nonlinearly separable search. However, the target feature used to determine linear separability in their task (i.e., target size) was required to locate the target. In the present work, we investigated the contributions of stimulus-driven and user-driven knowledge when a linearly or nonlinearly separable feature is available but not required for target identification. We asked observers to complete a series of standard color × orientation conjunction searches in which target size was either linearly or nonlinearly separable from the size of the distractors. When guidance by color × orientation and by size information are both available, observers rely on whichever information results in the best search efficiency. This is the case irrespective of whether we provide target foreknowledge by blocking stimulus conditions, suggesting that feature information is used in both a stimulus-driven and user-driven fashion. PMID:25463553

  16. Limitations Of The Current State Space Modelling Approach In Multistage Machining Processes Due To Operation Variations

    NASA Astrophysics Data System (ADS)

    Abellán-Nebot, J. V.; Liu, J.; Romero, F.

    2009-11-01

    The State Space modelling approach has been recently proposed as an engineering-driven technique for part quality prediction in Multistage Machining Processes (MMP). Current State Space models incorporate fixture and datum variations in the multi-stage variation propagation, without explicitly considering common operation variations such as machine-tool thermal distortions, cutting-tool wear, cutting-tool deflections, etc. This paper shows the limitations of the current State Space model through an experimental case study where the effect of the spindle thermal expansion, cutting-tool flank wear and locator errors are introduced. The paper also discusses the extension of the current State Space model to include operation variations and its potential benefits.

  17. A Model-Driven Approach for Telecommunications Network Services Definition

    NASA Astrophysics Data System (ADS)

    Chiprianov, Vanea; Kermarrec, Yvon; Alff, Patrick D.

    The present-day telecommunications market imposes a short concept-to-market time on service providers. To reduce it, we propose a computer-aided, model-driven, service-specific tool with support for collaborative work and for checking properties on models. We started by defining a prototype of the Meta-model (MM) of the service domain. Using this prototype, we defined a simple graphical modeling language specific to service designers. We are currently enlarging the MM of the domain using model transformations from Network Abstraction Layers (NALs). In the future, we will investigate approaches to ensure support for collaborative work and for checking properties on models.

  18. A hypothesis-driven physical examination learning and assessment procedure for medical students: initial validity evidence.

    PubMed

    Yudkowsky, Rachel; Otaki, Junji; Lowenstein, Tali; Riddle, Janet; Nishigori, Hiroshi; Bordage, Georges

    2009-08-01

    Diagnostic accuracy is maximised by having clinical signs and diagnostic hypotheses in mind during the physical examination (PE). This diagnostic reasoning approach contrasts with the rote, hypothesis-free screening PE learned by many medical students. A hypothesis-driven PE (HDPE) learning and assessment procedure was developed to provide targeted practice and assessment in anticipating, eliciting and interpreting critical aspects of the PE in the context of diagnostic challenges. This study was designed to obtain initial content validity evidence, performance and reliability estimates, and impact data for the HDPE procedure. Nineteen clinical scenarios were developed, covering 160 PE manoeuvres. A total of 66 Year 3 medical students prepared for and encountered three clinical scenarios during required formative assessments. For each case, students listed anticipated positive PE findings for two plausible diagnoses before examining the patient; examined a standardised patient (SP) simulating one of the diagnoses; received immediate feedback from the SP, and documented their findings and working diagnosis. The same students later encountered some of the scenarios during their Year 4 clinical skills examination. On average, Year 3 students anticipated 65% of the positive findings, correctly performed 88% of the PE manoeuvres and documented 61% of the findings. Year 4 students anticipated and elicited fewer findings overall, but achieved proportionally more discriminating findings, thereby more efficiently achieving a diagnostic accuracy equivalent to that of students in Year 3. Year 4 students performed better on cases on which they had received feedback as Year 3 students. Twelve cases would provide a reliability of 0.80, based on discriminating checklist items only. The HDPE provided medical students with a thoughtful, deliberate approach to learning and assessing PE skills in a valid and reliable manner.
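The projection that "twelve cases would provide a reliability of 0.80" is the kind of estimate typically obtained with the Spearman-Brown prophecy formula. A sketch, assuming a single-case reliability of roughly 0.25 (the study's actual input value is not reported here):

```python
def spearman_brown(r_single, k):
    """Projected reliability when a test is lengthened by a factor of k.

    r_single -- reliability of a single case (or current test length)
    k        -- factor by which the number of cases is multiplied
    """
    return k * r_single / (1 + (k - 1) * r_single)

# With an assumed per-case reliability of 0.25, twelve cases project to 0.80.
projected = spearman_brown(0.25, 12)
```

The formula also works in reverse: solving for k tells you how many cases are needed to reach a target reliability.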

  19. Chronic care coordination by integrating care through a team-based, population-driven approach: a case study.

    PubMed

    van Eeghen, Constance O; Littenberg, Benjamin; Kessler, Rodger

    2018-05-23

    Patients with chronic conditions frequently experience behavioral comorbidities to which primary care cannot easily respond. This study observed a Vermont family medicine practice with integrated medical and behavioral health services that use a structured approach to implement a chronic care management system with Lean. The practice chose to pilot a population-based approach to improve outcomes for patients with poorly controlled Type 2 diabetes using a stepped-care model with an interprofessional team including a community health nurse. This case study observed the team's use of Lean, with which it designed and piloted a clinical algorithm composed of patient self-assessment, endorsement of behavioral goals, shared documentation of goals and plans, and follow-up. The team redesigned workflows and measured reach (patients who engaged to the end of the pilot), outcomes (HbA1c results), and process (days between HbA1c tests). The researchers evaluated practice member self-reports about the use of Lean and facilitators and barriers to move from pilot to larger scale applications. Of 20 eligible patients recruited over 3 months, 10 agreed to participate and 9 engaged fully (45%); 106 patients were controls. Relative to controls, outcomes and process measures improved but lacked significance. Practice members identified barriers that prevented implementation of all changes needed but were in agreement that the pilot produced useful outcomes. A systematized, population-based, chronic care management service is feasible in a busy primary care practice. To test at scale, practice leadership will need to allocate staffing, invest in shared documentation, and standardize workflows to streamline office practice responsibilities.

  20. Computational Fluid Dynamics Simulation of Flows in an Oxidation Ditch Driven by a New Surface Aerator

    PubMed Central

    Huang, Weidong; Li, Kun; Wang, Gan; Wang, Yingzhe

    2013-01-01

    In this article, we present a newly designed inverse umbrella surface aerator and test its performance in driving flow in an oxidation ditch. Results show that it drives the oxidation ditch better than the original design, with higher average velocity and a more uniform flow field. We also present a computational fluid dynamics model for predicting the flow field in an oxidation ditch driven by a surface aerator. An improved momentum source term approach to simulating the flow field of the oxidation ditch driven by an inverse umbrella surface aerator was developed and validated through experiments. Four turbulence models were investigated with the approach: the standard k−ɛ model, RNG k−ɛ model, realizable k−ɛ model, and Reynolds stress model, and the predicted data were compared with those calculated with the multiple rotating reference frame (MRF) and sliding mesh (SM) approaches. Results of the momentum source term approach are in good agreement with the experimental data, and its prediction accuracy is better than MRF and close to SM. The momentum source term approach also has lower computational expense, is simpler to preprocess, and is easier to use. PMID:24302850

  1. Designing a curriculum for communication skills training from a theory and evidence-based perspective.

    PubMed

    Street, Richard L; De Haes, Hanneke C J M

    2013-10-01

    Because quality health care delivery requires effective clinician-patient communication, successful training of health professionals requires communication skill curricula of the highest quality. Two approaches for developing medical communication curricula are a consensus approach and a theory-driven approach. We propose a theory-driven, communication function framework for identifying important communication skills, one that is focused on the key goals and outcomes that need to be accomplished in clinical encounters. We discuss 7 communication functions important to medical encounters and the types of skills needed to accomplish each. The functional approach has important pedagogical implications, including the importance of distinguishing the performance of a behavior (capacity) from the outcome of that behavior in context (effectiveness), and the recognition that what counts as effective communication depends on perspective (e.g., observer, patient). Consensus and theory-driven approaches to medical communication curricula are not necessarily contradictory and can be integrated to further enhance ongoing development and improvements in medical communication education. A functional approach should resonate with practicing clinicians and continuing education initiatives in that it embraces the notion that competent communication is situation-specific, as clinicians creatively use communicative skills to accomplish the key goals of the encounter. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  2. Infection Control for Drug-Resistant Tuberculosis: Early Diagnosis and Treatment Is the Key

    PubMed Central

    van Cutsem, Gilles; Isaakidis, Petros; Farley, Jason; Nardell, Ed; Volchenkov, Grigory; Cox, Helen

    2016-01-01

    Multidrug-resistant (MDR) tuberculosis, “Ebola with wings,” is a significant threat to tuberculosis control efforts. Previous prevailing views that resistance was mainly acquired through poor treatment led to decades of focus on drug-sensitive rather than drug-resistant (DR) tuberculosis, driven by the World Health Organization's directly observed therapy, short course strategy. The paradigm has shifted toward recognition that most DR tuberculosis is transmitted and that there is a need for increased efforts to control DR tuberculosis. Yet most people with DR tuberculosis are untested and untreated, driving transmission in the community and in health systems in high-burden settings. The risk of nosocomial transmission is high for patients and staff alike. Lowering transmission risk for MDR tuberculosis requires a combination approach centered on rapid identification of active tuberculosis disease and tuberculosis drug resistance, followed by rapid initiation of appropriate treatment and adherence support, complemented by universal tuberculosis infection control measures in healthcare facilities. It also requires a second paradigm shift, from the classic infection control hierarchy to a novel, decentralized approach across the continuum from early diagnosis and treatment to community awareness and support. A massive scale-up of rapid diagnosis and treatment is necessary to control the MDR tuberculosis epidemic. This will not be possible without intense efforts toward the implementation of decentralized, ambulatory models of care. Increasing political will and resources need to be accompanied by a paradigm shift. Instead of focusing on diagnosed cases, recognition that transmission is driven largely by undiagnosed, untreated cases, both in the community and in healthcare settings, is necessary. This article discusses this comprehensive approach, strategies available, and associated challenges. PMID:27118853

  3. Machine Learning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chikkagoudar, Satish; Chatterjee, Samrat; Thomas, Dennis G.

    The absence of a robust and unified theory of cyber dynamics presents challenges and opportunities for using machine learning based data-driven approaches to further the understanding of the behavior of such complex systems. Analysts can also use machine learning approaches to gain operational insights. In order to be operationally beneficial, cybersecurity machine learning based models need to have the ability to: (1) represent a real-world system, (2) infer system properties, and (3) learn and adapt based on expert knowledge and observations. Probabilistic models and probabilistic graphical models provide these necessary properties and are further explored in this chapter. Bayesian Networks and Hidden Markov Models are introduced as examples of widely used data-driven classification/modeling strategies.
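The Hidden Markov Models mentioned in this record score observation sequences with the forward algorithm. A minimal sketch with a toy two-state "system behavior" model; all probabilities below are invented for illustration, not drawn from the chapter:

```python
def hmm_forward(obs, start_p, trans_p, emit_p):
    """Forward algorithm: P(observation sequence) under an HMM.

    start_p[s]    -- initial probability of hidden state s
    trans_p[s][t] -- probability of moving from state s to state t
    emit_p[s][o]  -- probability of emitting observation o in state s
    """
    states = list(start_p)
    # alpha[s] = P(obs seen so far, current hidden state = s)
    alpha = {s: start_p[s] * emit_p[s][obs[0]] for s in states}
    for o in obs[1:]:
        alpha = {t: sum(alpha[s] * trans_p[s][t] for s in states) * emit_p[t][o]
                 for t in states}
    return sum(alpha.values())

# Hypothetical two-state model of network behavior (numbers are illustrative).
start = {"normal": 0.8, "attack": 0.2}
trans = {"normal": {"normal": 0.9, "attack": 0.1},
         "attack": {"normal": 0.3, "attack": 0.7}}
emit = {"normal": {"quiet": 0.7, "noisy": 0.3},
        "attack": {"quiet": 0.1, "noisy": 0.9}}
likelihood = hmm_forward(["quiet", "noisy"], start, trans, emit)
```

Comparing such likelihoods across candidate models is one simple way the classification strategy described above can be operationalized.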

  4. An interactive ontology-driven information system for simulating background radiation and generating scenarios for testing special nuclear materials detection algorithms

    DOE PAGES

    Sorokine, Alexandre; Schlicher, Bob G.; Ward, Richard C.; ...

    2015-05-22

    This paper describes an original approach to generating scenarios for the purpose of testing the algorithms used to detect special nuclear materials (SNM) that incorporates the use of ontologies. Separating the signal of SNM from the background requires sophisticated algorithms. To assist in developing such algorithms, there is a need for scenarios that capture a very wide range of variables affecting the detection process, depending on the type of detector being used. To provide such a capability, we developed an ontology-driven information system (ODIS) for generating scenarios that can be used to test algorithms for SNM detection. The ontology-driven scenario generator (ODSG) is an ODIS based on information supplied by subject matter experts and other documentation. The details of the creation of the ontology, the development of the ontology-driven information system, and the design of the web user interface (UI) are presented, along with specific examples of scenarios generated using the ODSG. We demonstrate that the paradigm behind the ODSG is capable of addressing the problem of semantic complexity at both the user and developer levels. Compared to traditional approaches, an ODIS provides benefits such as faithful representation of the users' domain conceptualization, simplified management of very large and semantically diverse datasets, and the ability to handle frequent changes to the application and the UI. Furthermore, the approach makes possible the generation of a much larger number of specific scenarios based on limited user-supplied information.

  5. Data-driven diagnostics of terrestrial carbon dynamics over North America

    Treesearch

    Jingfeng Xiao; Scott V. Ollinger; Steve Frolking; George C. Hurtt; David Y. Hollinger; Kenneth J. Davis; Yude Pan; Xiaoyang Zhang; Feng Deng; Jiquan Chen; Dennis D. Baldocchi; Beverly E. Law; M. Altaf Arain; Ankur R. Desai; Andrew D. Richardson; Ge Sun; Brian Amiro; Hank Margolis; Lianhong Gu; Russell L. Scott; Peter D. Blanken; Andrew E. Suyker

    2014-01-01

    The exchange of carbon dioxide is a key measure of ecosystem metabolism and a critical intersection between the terrestrial biosphere and the Earth's climate. Despite the general agreement that the terrestrial ecosystems in North America provide a sizeable carbon sink, the size and distribution of the sink remain uncertain. We use a data-driven approach to upscale...

  6. Intensity and angle-of-arrival spectra of laser light propagating through axially homogeneous buoyancy-driven turbulence.

    PubMed

    Pawar, Shashikant S; Arakeri, Jaywant H

    2016-08-01

    Frequency spectra obtained from the measurements of light intensity and angle of arrival (AOA) of parallel laser light propagating through the axially homogeneous, axisymmetric buoyancy-driven turbulent flow at high Rayleigh numbers in a long (length-to-diameter ratio of about 10) vertical tube are reported. The flow is driven by an unstable density difference created across the tube ends using brine and fresh water. The highest Rayleigh number is about 8×10⁹. The aim of the present work is to find whether the conventional Obukhov-Corrsin scaling or Bolgiano-Obukhov (BO) scaling is obtained for the intensity and AOA spectra in the case of light propagation in a buoyancy-driven turbulent medium. Theoretical relations for the frequency spectra of log amplitude and AOA fluctuations developed for homogeneous isotropic turbulent media are modified for the buoyancy-driven flow in the present case to obtain the asymptotic scalings for the high and low frequency ranges. For low frequencies, the spectra of intensity and vertical AOA fluctuations obtained from measurements follow BO scaling, while scaling for the spectra of horizontal AOA fluctuations shows a small departure from BO scaling.

  7. Comparison of the Effectiveness of Interactive Didactic Lecture Versus Online Simulation-Based CME Programs Directed at Improving the Diagnostic Capabilities of Primary Care Practitioners.

    PubMed

    McFadden, Pam; Crim, Andrew

    2016-01-01

    Diagnostic errors in primary care contribute to increased morbidity and mortality, and billions in costs each year. Improvements in the way practicing physicians are taught so as to optimally perform differential diagnosis can increase patient safety and lower the costs of care. This study represents a comparison of the effectiveness of two approaches to CME training directed at improving the primary care practitioner's diagnostic capabilities against seven common and important causes of joint pain. Using a convenience sampling methodology, one group of primary care practitioners was trained by a traditional live, expert-led, multimedia-based training activity supplemented with interactive practice opportunities and feedback (control group). The second group was trained online with a multimedia-based training activity supplemented with interactive practice opportunities and feedback delivered by an artificial intelligence-driven simulation/tutor (treatment group). Before their respective instructional intervention, there were no significant differences in the diagnostic performance of the two groups against a battery of case vignettes presenting with joint pain. Using the same battery of case vignettes to assess postintervention diagnostic performance, there was a slight but not statistically significant improvement in the control group's diagnostic accuracy (P = .13). The treatment group, however, demonstrated a significant improvement in accuracy (P < .02; Cohen d, effect size = 0.79). These data indicate that within the context of a CME activity, a significant improvement in diagnostic accuracy can be achieved by the use of a web-delivered, multimedia-based instructional activity supplemented by practice opportunities and feedback delivered by an artificial intelligence-driven simulation/tutor.
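The effect size reported above (Cohen d = 0.79) is a standardized mean difference with a pooled standard deviation. A generic computation, using made-up scores rather than the study's data:

```python
import math

def cohens_d(a, b):
    """Cohen's d: standardized mean difference between two samples,
    using the pooled sample standard deviation."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)   # sample variance of a
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)   # sample variance of b
    pooled = math.sqrt(((len(a) - 1) * va + (len(b) - 1) * vb)
                       / (len(a) + len(b) - 2))
    return (mb - ma) / pooled

# Hypothetical pre/post diagnostic-accuracy scores (illustrative only).
pre = [60, 62, 58, 61, 59]
post = [66, 68, 64, 67, 65]
d = cohens_d(pre, post)
```

By the usual rule of thumb, d near 0.8, as in this study's treatment group, is considered a large effect.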

  8. Climate Risk Informed Decision Analysis: A Hypothetical Application to the Waas Region

    NASA Astrophysics Data System (ADS)

    Gilroy, Kristin; Mens, Marjolein; Haasnoot, Marjolijn; Jeuken, Ad

    2016-04-01

    More frequent and intense hydrologic events under climate change are expected to intensify water security and flood risk management challenges worldwide. Traditional planning approaches must be adapted to address climate change and to develop solutions with an appropriate level of robustness and flexibility. The Climate Risk Informed Decision Analysis (CRIDA) method is a novel planning approach embodying a suite of complementary methods, including decision scaling and adaptation pathways. Decision scaling offers a bottom-up approach to assess risk and tailors the complexity of the analysis to the problem at hand and the available capacity. Through adaptation pathways, an array of future strategies towards climate robustness is developed, ranging in flexibility and immediacy of investments. Flexible pathways include transfer points to other strategies to ensure that the system can be adapted if future conditions vary from those expected. CRIDA combines these two approaches in a stakeholder-driven process which guides decision makers through the planning and decision process, taking into account how the confidence in the available science, the consequences in the system, and the capacity of institutions should influence strategy selection. In this presentation, we will explain the CRIDA method and compare it to existing planning processes, such as the US Army Corps of Engineers Principles and Guidelines as well as Integrated Water Resources Management Planning. Then, we will apply the approach to a hypothetical case study for the Waas Region, a large downstream river basin facing rapid development and threatened by increased flood risks.
Through the case study, we will demonstrate how a stakeholder-driven process can be used to evaluate system robustness to climate change; develop adaptation pathways for multiple objectives and criteria; and illustrate how varying levels of confidence, consequences, and capacity would play a role in the decision-making process, specifically with regard to the level of robustness and flexibility in the selected strategy. This work will equip practitioners and decision makers with an example of a structured process for decision making under climate uncertainty that can be scaled as needed to the problem at hand. This presentation builds further on another submitted abstract, "Climate Risk Informed Decision Analysis (CRIDA): A novel practical guidance for Climate Resilient Investments and Planning" by Jeuken et al.

  9. Multiscale-Driven approach to detecting change in Synthetic Aperture Radar (SAR) imagery

    NASA Astrophysics Data System (ADS)

    Gens, R.; Hogenson, K.; Ajadi, O. A.; Meyer, F. J.; Myers, A.; Logan, T. A.; Arnoult, K., Jr.

    2017-12-01

    Detecting changes between Synthetic Aperture Radar (SAR) images can be a useful but challenging exercise. SAR with its all-weather capabilities can be an important resource in identifying and estimating the expanse of events such as flooding, river ice breakup, earthquake damage, oil spills, and forest growth, as it can overcome shortcomings of optical methods related to cloud cover. However, detecting change in SAR imagery can be impeded by many factors including speckle, complex scattering responses, low temporal sampling, and difficulty delineating boundaries. In this presentation we use a change detection method based on a multiscale-driven approach. By using information at different resolution levels, we attempt to obtain more accurate change detection maps in both heterogeneous and homogeneous regions. Integrated within the processing flow are processes that 1) improve classification performance by combining Expectation-Maximization algorithms with mathematical morphology, 2) achieve high accuracy in preserving boundaries using measurement level fusion techniques, and 3) combine modern non-local filtering and 2D-discrete stationary wavelet transform to provide robustness against noise. This multiscale-driven approach to change detection has recently been incorporated into the Alaska Satellite Facility (ASF) Hybrid Pluggable Processing Pipeline (HyP3) using radiometrically terrain corrected SAR images. Examples primarily from natural hazards are presented to illustrate the capabilities and limitations of the change detection method.
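The Expectation-Maximization step mentioned in this record is commonly used to separate "change" from "no change" pixels in a SAR log-ratio image. A generic one-dimensional two-component Gaussian mixture EM fit, sketched from scratch rather than taken from the ASF HyP3 implementation:

```python
import math

def em_two_gaussians(xs, iters=50):
    """Fit a two-component 1-D Gaussian mixture with EM.

    Returns (means, variances, mixing weights) for the two components.
    """
    xs = sorted(xs)
    # Crude initialization: split the sorted data in half.
    half = len(xs) // 2
    mu = [sum(xs[:half]) / half, sum(xs[half:]) / (len(xs) - half)]
    var = [1.0, 1.0]
    pi = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibility of each component for each point.
        resp = []
        for x in xs:
            p = [pi[k] / math.sqrt(2 * math.pi * var[k])
                 * math.exp(-(x - mu[k]) ** 2 / (2 * var[k])) for k in range(2)]
            s = p[0] + p[1]
            resp.append([p[0] / s, p[1] / s])
        # M-step: re-estimate weights, means, and variances.
        for k in range(2):
            nk = sum(r[k] for r in resp)
            pi[k] = nk / len(xs)
            mu[k] = sum(r[k] * x for r, x in zip(resp, xs)) / nk
            var[k] = max(sum(r[k] * (x - mu[k]) ** 2
                             for r, x in zip(resp, xs)) / nk, 1e-6)
    return mu, var, pi

# Two well-separated clusters standing in for "no change" and "change" pixels.
data = [0.0, 0.1, -0.1, 0.05, 5.0, 5.1, 4.9, 5.05]
mu, var, pi = em_two_gaussians(data)
```

Pixels are then labeled by whichever component gives them the higher responsibility; the morphology and wavelet steps described above refine that raw labeling.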

  10. A unified approach for numerical simulation of viscous compressible and incompressible flows over adiabatic and isothermal walls

    NASA Technical Reports Server (NTRS)

    Hafez, M.; Soliman, M.; White, S.

    1992-01-01

    A new formulation (including the choice of variables, their non-dimensionalization, and the form of the artificial viscosity) is proposed for the numerical solution of the full Navier-Stokes equations for compressible and incompressible flows with heat transfer. With the present approach, the same code can be used for constant as well as variable density flows. The changes of the density due to pressure and temperature variations are identified and it is shown that the low Mach number approximation is a special case. At zero Mach number, the density changes due to the temperature variation are accounted for, mainly through a body force term in the momentum equation. It is also shown that the Boussinesq approximation of the buoyancy effects in an incompressible flow is a special case. To demonstrate the new capability, three examples are tested. Flows in driven cavities with adiabatic and isothermal walls are simulated with the same code as well as incompressible and supersonic flows over a wall with and without a groove. Finally, viscous flow simulations of an oblique shock reflection from a flat plate are shown to be in good agreement with the solutions available in literature.

  11. Interactions in the microbiome: communities of organisms and communities of genes

    PubMed Central

    Boon, Eva; Meehan, Conor J; Whidden, Chris; Wong, Dennis H-J; Langille, Morgan GI; Beiko, Robert G

    2014-01-01

    A central challenge in microbial community ecology is the delineation of appropriate units of biodiversity, which can be taxonomic, phylogenetic, or functional in nature. The term ‘community’ is applied ambiguously; in some cases, the term refers simply to a set of observed entities, while in other cases, it requires that these entities interact with one another. Microorganisms can rapidly gain and lose genes, potentially decoupling community roles from taxonomic and phylogenetic groupings. Trait-based approaches offer a useful alternative, but many traits can be defined based on gene functions, metabolic modules, and genomic properties, and the optimal set of traits to choose is often not obvious. An analysis that considers taxon assignment and traits in concert may be ideal, with the strengths of each approach offsetting the weaknesses of the other. Individual genes also merit consideration as entities in an ecological analysis, with characteristics such as diversity, turnover, and interactions modeled using genes rather than organisms as entities. We identify some promising avenues of research that are likely to yield a deeper understanding of microbial communities that shift from observation-based questions of ‘Who is there?’ and ‘What are they doing?’ to the mechanistically driven question of ‘How will they respond?’ PMID:23909933

  12. Investigative Primary Science: A Problem-Based Learning Approach

    ERIC Educational Resources Information Center

    Etherington, Matthew B.

    2011-01-01

    This study reports on the success of using a problem-based learning (PBL) approach as a pedagogical mode of learning open inquiry science within a traditional four-year undergraduate elementary teacher education program. In 2010, a problem-based learning approach to teaching primary science replaced the traditional content-driven syllabus. During…

  13. Model-Driven Architecture for Agent-Based Systems

    NASA Technical Reports Server (NTRS)

    Gradanin, Denis; Singh, H. Lally; Bohner, Shawn A.; Hinchey, Michael G.

    2004-01-01

    The Model Driven Architecture (MDA) approach uses a platform-independent model to define system functionality, or requirements, using some specification language. The requirements are then translated to a platform-specific model for implementation. An agent architecture based on the human cognitive model of planning, the Cognitive Agent Architecture (Cougaar) is selected for the implementation platform. The resulting Cougaar MDA prescribes certain kinds of models to be used, how those models may be prepared and the relationships of the different kinds of models. Using the existing Cougaar architecture, the level of application composition is elevated from individual components to domain level model specifications in order to generate software artifacts. The software artifacts generation is based on a metamodel. Each component maps to a UML structured component which is then converted into multiple artifacts: Cougaar/Java code, documentation, and test cases.

  14. Knowledge-driven binning approach for rare variant association analysis: application to neuroimaging biomarkers in Alzheimer's disease.

    PubMed

    Kim, Dokyoon; Basile, Anna O; Bang, Lisa; Horgusluoglu, Emrin; Lee, Seunggeun; Ritchie, Marylyn D; Saykin, Andrew J; Nho, Kwangsik

    2017-05-18

    Rapid advancement of next generation sequencing technologies such as whole genome sequencing (WGS) has facilitated the search for genetic factors that influence disease risk in the field of human genetics. To identify rare variants associated with human diseases or traits, an efficient genome-wide binning approach is needed. In this study we developed a novel biological knowledge-based binning approach for rare-variant association analysis and then applied the approach to structural neuroimaging endophenotypes related to late-onset Alzheimer's disease (LOAD). For rare-variant analysis, we used the knowledge-driven binning approach implemented in Bin-KAT, an automated tool that provides 1) binning/collapsing methods for multi-level variant aggregation with a flexible, biologically informed binning strategy and 2) an option of performing unified collapsing and statistical rare variant analyses in one tool. A total of 750 non-Hispanic Caucasian participants from the Alzheimer's Disease Neuroimaging Initiative (ADNI) cohort who had both WGS data and magnetic resonance imaging (MRI) scans were used in this study. Mean bilateral cortical thickness of the entorhinal cortex extracted from MRI scans was used as an AD-related neuroimaging endophenotype. SKAT was used for a genome-wide gene- and region-based association analysis of rare variants (MAF (minor allele frequency) < 0.05), and potential confounding factors for entorhinal cortex thickness (age, gender, years of education, intracranial volume (ICV) and MRI field strength) were used as covariates. Significant associations were determined using FDR adjustment for multiple comparisons. Our knowledge-driven binning approach identified 16 functional exonic rare variants in FANCC significantly associated with entorhinal cortex thickness (FDR-corrected p-value < 0.05). 
In addition, the approach identified 7 evolutionary conserved regions, which were mapped to FAF1, RFX7, LYPLAL1 and GOLGA3, significantly associated with entorhinal cortex thickness (FDR-corrected p-value < 0.05). In further analysis, the functional exonic rare variants in FANCC were also significantly associated with hippocampal volume and cerebrospinal fluid (CSF) Aβ 1-42 (p-value < 0.05). Our novel binning approach identified rare variants in FANCC as well as 7 evolutionary conserved regions significantly associated with a LOAD-related neuroimaging endophenotype. FANCC (Fanconi anemia complementation group C) has been shown to modulate TLR and p38 MAPK-dependent expression of IL-1β in macrophages. Our results warrant further investigation in a larger independent cohort and demonstrate that the biological knowledge-driven binning approach is a powerful strategy to identify rare variants associated with AD and other complex disease.
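The collapsing step at the core of this kind of analysis can be sketched in a few lines. This is a hypothetical illustration of the general burden-style binning idea (sum minor-allele counts of rare variants within a biologically defined bin into one per-sample score), not the actual Bin-KAT or SKAT implementation; the toy genotypes, variant names, and the 0.1 cutoff are all invented:

```python
def minor_allele_freq(variant, genotypes):
    """MAF of a variant across diploid samples.
    genotypes maps sample -> {variant: minor-allele count (0, 1 or 2)}."""
    counts = [g.get(variant, 0) for g in genotypes.values()]
    return sum(counts) / (2 * len(counts))

def burden_scores(bin_variants, genotypes, maf_cutoff):
    """Collapse the rare variants (MAF < cutoff) of one bin into a
    per-sample burden score; return the scores and the retained variants."""
    rare = [v for v in bin_variants
            if minor_allele_freq(v, genotypes) < maf_cutoff]
    scores = {s: sum(g.get(v, 0) for v in rare)
              for s, g in genotypes.items()}
    return scores, rare

# Toy data: six samples; v3 is common and should be excluded from the bin.
genotypes = {
    "s1": {"v1": 1}, "s2": {"v2": 1}, "s3": {},
    "s4": {"v3": 2}, "s5": {"v3": 1}, "s6": {"v3": 2},
}
scores, rare = burden_scores(["v1", "v2", "v3"], genotypes, maf_cutoff=0.1)
```

In a real analysis the per-bin scores would then feed a region-based test such as SKAT, adjusted for the covariates listed in the abstract.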

  15. Comparative regulatory approaches for groups of new plant breeding techniques.

    PubMed

    Lusser, Maria; Davies, Howard V

    2013-06-25

    This manuscript provides insights into ongoing debates on the regulatory issues surrounding groups of biotechnology-driven 'New Plant Breeding Techniques' (NPBTs). It presents the outcomes of preliminary discussions and in some cases the initial decisions taken by regulators in the following countries: Argentina, Australia, Canada, EU, Japan, South Africa and USA. In the light of these discussions we suggest in this manuscript a structured approach to make the evaluation more consistent and efficient. The issue appears to be complex as these groups of new technologies vary widely in both the technologies deployed and their impact on heritable changes in the plant genome. An added complication is that the legislation, definitions and regulatory approaches for biotechnology-derived crops differ significantly between these countries. There are therefore concerns that this situation will lead to non-harmonised regulatory approaches and asynchronous development and marketing of such crops resulting in trade disruptions. Copyright © 2013 Elsevier B.V. All rights reserved.

  16. Safe and sustainable: the extracranial approach toward frontoethmoidal meningoencephalocele repair.

    PubMed

    Heidekrueger, Paul I; Thu, Myat; Mühlbauer, Wolfgang; Holm-Mühlbauer, Charlotte; Schucht, Philippe; Anderl, Hans; Schoeneich, Heinrich; Aung, Kyawzwa; Mg Ag, Mg; Thu Soe Myint, Ag; Juran, Sabrina; Aung, Thiha; Ehrl, Denis; Ninkovic, Milomir; Broer, P Niclas

    2017-10-01

    OBJECTIVE Although rare, frontoethmoidal meningoencephaloceles continue to pose a challenge to neurosurgeons and plastic reconstructive surgeons. Especially when faced with limited infrastructure and resources, establishing reliable and safe surgical techniques is of paramount importance. The authors present a case series in order to evaluate a previously proposed concise approach for meningoencephalocele repair, with a focus on sustainability of internationally driven surgical efforts. METHODS Between 2001 and 2016, a total of 246 patients with frontoethmoidal meningoencephaloceles were treated using a 1-stage extracranial approach by a single surgeon in the Department of Neurosurgery of the Yangon General Hospital in Yangon, Myanmar, initially assisted by European surgeons. Outcomes and complications were evaluated. RESULTS A total of 246 patients (138 male and 108 female) were treated. Their ages ranged from 75 days to 32 years (median 8 years). The duration of follow-up ranged between 4 weeks and 16 years (median 4 months). Eighteen patients (7.3%) showed signs of increased intracranial pressure postoperatively, and early CSF rhinorrhea was observed in 27 patients (11%), with 5 (2%) of them requiring operative dural repair. In 8 patients, a decompressive lumbar puncture was performed. There were 8 postoperative deaths (3.3%) due to meningitis. In 15 patients (6.1%), recurrent herniation of brain tissue was observed; this herniation led to blindness in 1 case. The remaining patients all showed good to very good aesthetic and functional results. CONCLUSIONS A minimally invasive, purely extracranial approach to frontoethmoidal meningoencephalocele repair may serve well, especially in middle- and low-income countries. 
This case series points out how the frequently critiqued lack of sustainability in the field of humanitarian surgical missions, as well as the often-cited missing aftercare and dependence on foreign supporters, can be circumvented by meticulous training of local surgeons.

  17. Socratic Seminar with Data: A Strategy to Support Student Discourse and Understanding

    PubMed Central

    Griswold, Joan; Shaw, Loren; Munn, Maureen

    2017-01-01

    A Socratic seminar can be a powerful tool for increasing students’ ability to analyze and interpret data. Most commonly used for text-based discussion, we found that using Socratic seminar to engage students with data contributes to student understanding by allowing them to reason through and process complex information as a group. This approach also provides teachers with insights about student misconceptions and understanding of concepts by listening to the student-driven discussion. This article reports on Socratic seminar in the context of a high school type 2 diabetes curriculum that explores gene and environment interactions. A case study illustrates how Socratic seminar is applied in a classroom and how students engage with the process. General characteristics of Socratic seminar are discussed at the end of the article. PMID:29147033

  18. Electro-Fermentation - Merging Electrochemistry with Fermentation in Industrial Applications.

    PubMed

    Schievano, Andrea; Pepé Sciarria, Tommy; Vanbroekhoven, Karolien; De Wever, Heleen; Puig, Sebastià; Andersen, Stephen J; Rabaey, Korneel; Pant, Deepak

    2016-11-01

    Electro-fermentation (EF) merges traditional industrial fermentation with electrochemistry. An imposed electrical field influences the fermentation environment and microbial metabolism in either a reductive or oxidative manner. The benefit of this approach is to produce target biochemicals with improved selectivity, increase carbon efficiency, limit the use of additives for redox balance or pH control, enhance microbial growth, or in some cases enhance product recovery. We discuss the principles of electrically driven fermentations and how EF can be used to steer both pure culture and microbiota-based fermentations. An overview is given on which advantages EF may bring to both existing and innovative industrial fermentation processes, and which doors might be opened in waste biomass utilization towards added-value biorefineries. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. Self-consistent modelling of line-driven hot-star winds with Monte Carlo radiation hydrodynamics

    NASA Astrophysics Data System (ADS)

    Noebauer, U. M.; Sim, S. A.

    2015-11-01

    Radiative pressure exerted by line interactions is a prominent driver of outflows in astrophysical systems, being at work in the outflows emerging from hot stars or from the accretion discs of cataclysmic variables, massive young stars and active galactic nuclei. In this work, a new radiation hydrodynamical approach to model line-driven hot-star winds is presented. By coupling a Monte Carlo radiative transfer scheme with a finite volume fluid dynamical method, line-driven mass outflows may be modelled self-consistently, benefiting from the advantages of Monte Carlo techniques in treating multiline effects, such as multiple scatterings, and in dealing with arbitrary multidimensional configurations. In this work, we introduce our approach in detail by highlighting the key numerical techniques and verifying their operation in a number of simplified applications, specifically in a series of self-consistent, one-dimensional, Sobolev-type, hot-star wind calculations. The utility and accuracy of our approach are demonstrated by comparing the obtained results with the predictions of various formulations of the so-called CAK theory and by confronting the calculations with modern sophisticated techniques of predicting the wind structure. Using these calculations, we also point out some useful diagnostic capabilities our approach provides. Finally, we discuss some of the current limitations of our method, some possible extensions and potential future applications.

  20. The importance of multidisciplinary team management of patients with non-small-cell lung cancer

    PubMed Central

    Ellis, P.M.

    2012-01-01

    Historically, a simple approach to the treatment of non-small-cell lung cancer (nsclc) was applicable to nearly all patients. Recently, a more complex treatment algorithm has emerged, driven by both pathologic and molecular phenotype. This increasing complexity underscores the importance of a multidisciplinary team approach to the diagnosis, treatment, and supportive care of patients with nsclc. A team approach to management is important at all points: from diagnosis, through treatment, to end-of-life care. It also needs to be patient-centred and must involve the patient in decision-making concerning treatment. Multidisciplinary case conferencing is becoming an integral part of care. Early integration of palliative care into the team approach appears to contribute significantly to quality of life and potentially extends overall survival for these patients. Supportive approaches, including psychosocial and nutrition support, should be routinely incorporated into the team approach. Challenges to the implementation of multidisciplinary care require institutional commitment and support. PMID:22787414

  1. Putting the psychology back into psychological models: mechanistic versus rational approaches.

    PubMed

    Sakamoto, Yasuaki; Jones, Matt; Love, Bradley C

    2008-09-01

    Two basic approaches to explaining the nature of the mind are the rational and the mechanistic approaches. Rational analyses attempt to characterize the environment and the behavioral outcomes that humans seek to optimize, whereas mechanistic models attempt to simulate human behavior using processes and representations analogous to those used by humans. We compared these approaches with regard to their accounts of how humans learn the variability of categories. The mechanistic model departs in subtle ways from rational principles. In particular, the mechanistic model incrementally updates its estimates of category means and variances through error-driven learning, based on discrepancies between new category members and the current representation of each category. The model yields a prediction, which we verify, regarding the effects of order manipulations that the rational approach does not anticipate. Although both rational and mechanistic models can successfully postdict known findings, we suggest that psychological advances are driven primarily by consideration of process and representation and that rational accounts trail these breakthroughs.
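The incremental, error-driven updating of category means and variances described above can be made concrete. A minimal generic delta-rule sketch (not the authors' exact model; the 0.5 learning rate and starting estimates are arbitrary), which also exhibits the order effects the abstract highlights:

```python
def update_category(mean, var, x, lr=0.5):
    """Error-driven update: nudge the category mean toward the new member,
    then nudge the variance toward the squared deviation from the new mean."""
    mean += lr * (x - mean)
    var += lr * ((x - mean) ** 2 - var)
    return mean, var

def learn(items, mean=0.0, var=1.0):
    """Present category members one at a time, updating incrementally."""
    for x in items:
        mean, var = update_category(mean, var, x)
    return mean, var

# The same three members, presented in two orders, yield different
# final estimates -- an order effect a rational running-average
# estimator would not show.
m_fwd, _ = learn([1.0, 2.0, 3.0])
m_rev, _ = learn([3.0, 2.0, 1.0])
```

A rational estimator based on full sample statistics would return the same mean for both presentation orders; the discrepancy here is exactly the kind of order-manipulation effect the mechanistic model predicts.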

  2. An approach for software-driven and standard-based support of cross-enterprise tumor boards.

    PubMed

    Mangesius, Patrick; Fischer, Bernd; Schabetsberger, Thomas

    2015-01-01

    For tumor boards, the networking of different medical disciplines' expertise continues to gain importance. However, interdisciplinary tumor boards spread across several institutions are rarely supported by information technology tools today. The aim of this paper is to point out an approach for a tumor board management system prototype. For analyzing the requirements, an incremental process was used. The requirements were surveyed using Informal Conversational Interview and documented with Use Case Diagrams defined by the Unified Modeling Language (UML). Analyses of current EHR standards were conducted to evaluate technical requirements. Functional and technical requirements of clinical conference applications were evaluated and documented. In several steps, workflows were derived and application mockups were created. Although there is a vast amount of common understanding concerning how clinical conferences should be conducted and how their workflows should be structured, these are hardly standardized, neither on a functional nor on a technical level. This results in drawbacks for participants and patients. Using modern EHR technologies based on profiles such as IHE Cross Enterprise document sharing (XDS), these deficits could be overcome.

  3. Reference hydrologic networks II. Using reference hydrologic networks to assess climate-driven changes in streamflow

    USGS Publications Warehouse

    Burn, Donald H.; Hannaford, Jamie; Hodgkins, Glenn A.; Whitfield, Paul H.; Thorne, Robin; Marsh, Terry

    2012-01-01

    Reference hydrologic networks (RHNs) can play an important role in monitoring for changes in the hydrological regime related to climate variation and change. Currently, the literature concerning hydrological response to climate variations is complex and confounded by the combinations of many methods of analysis, wide variations in hydrology, and the inclusion of data series that include changes in land use, storage regulation and water use in addition to those of climate. Three case studies that illustrate a variety of approaches to the analysis of data from RHNs are presented and used, together with a summary of studies from the literature, to develop approaches for the investigation of changes in the hydrological regime at a continental or global scale, particularly for international comparison. We present recommendations for an analysis framework and the next steps to advance such an initiative. There is a particular focus on the desirability of establishing standardized procedures and methodologies for both the creation of new national RHNs and the systematic analysis of data derived from a collection of RHNs.
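One candidate for the standardized, systematic analysis the authors call for is a distribution-free monotonic-trend test; the Mann-Kendall test is widely used on streamflow series, though the abstract itself does not prescribe a specific method, so this choice is ours. A minimal sketch without tie correction:

```python
import math

def mann_kendall(series):
    """Mann-Kendall S statistic (concordant minus discordant pairs) and
    its normal-approximation Z score, ignoring tie corrections."""
    n = len(series)
    s = sum(
        (series[j] > series[i]) - (series[j] < series[i])
        for i in range(n - 1) for j in range(i + 1, n)
    )
    var_s = n * (n - 1) * (2 * n + 5) / 18
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z
```

Applied uniformly across the stations of several RHNs, such a test supports exactly the kind of continental- or global-scale intercomparison the paper recommends.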

  4. Future projections of insured losses in the German private building sector following the A1B climatic change scenario

    NASA Astrophysics Data System (ADS)

    Held, H.; Gerstengarbe, F.-W.; Hattermann, F.; Pinto, J. G.; Ulbrich, U.; Böhm, U.; Born, K.; Büchner, M.; Donat, M. G.; Kücken, M.; Leckebusch, G. C.; Nissen, K.; Nocke, T.; Österle, H.; Pardowitz, T.; Werner, P. C.; Burghoff, O.; Broecker, U.; Kubik, A.

    2012-04-01

    We present an overview of a multi-approach impact project dealing with the consequences of climate change for the natural hazard branch of the insurance industry in Germany. The project was conducted by four academic institutions together with the German Insurance Association (GDV) and finalized in autumn 2011. A causal chain is modeled that runs from global warming projections through regional meteorological impacts to regional economic losses for private buildings, thereby fully covering the area of Germany. This presentation focuses on wind storm related losses, although the method developed was also applied in part to hail and flood impact losses. For the first time, the GDV supplied their collected set of insurance cases, dating back for decades, for such an impact study. These data were used to calibrate and validate event-based damage functions, which in turn were driven by three different types of regional climate models to generate storm loss projections. The regional models were driven by a triplet of ECHAM5 experiments following the A1B scenario which were found representative in the recent ENSEMBLES intercomparison study. In our multi-modeling approach we used two conceptually very different types of regional climate models: a dynamical model (CCLM) and a statistical model based on the idea of biased bootstrapping (STARS). As a third option we pursued a hybrid approach (statistical-dynamical downscaling). For the assessment of climate change impacts, the buildings' infrastructure and their economic value are kept at current levels. For all three approaches, a significant increase of average storm losses and extreme event return levels in the German private building sector is found for future decades assuming the A1B scenario. However, the three projections differ somewhat in terms of magnitude and regional differentiation. 
We have developed a formalism that allows us to express the combined effect of multi-source uncertainty on return levels within the framework of a generalized Pareto distribution.
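To make the return-level statement concrete: in a peaks-over-threshold loss model, the m-year return level under a generalized Pareto fit has a standard closed form. A sketch under illustrative parameter values (the threshold, scale, shape, and exceedance rate below are invented, not the study's fitted values):

```python
import math

def gpd_return_level(u, sigma, xi, m_years, exceed_per_year):
    """m-year return level of a GPD threshold-exceedance model:
    z_m = u + (sigma / xi) * ((m * lambda)^xi - 1), with the Gumbel
    limit u + sigma * log(m * lambda) as xi -> 0."""
    n = m_years * exceed_per_year
    if abs(xi) < 1e-12:
        return u + sigma * math.log(n)
    return u + (sigma / xi) * (n ** xi - 1)
```

The multi-source uncertainty the authors formalize would enter by propagating distributions over u, sigma, and xi through this formula rather than plugging in point estimates.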

  5. Exploring Techniques of Developing Writing Skill in IELTS Preparatory Courses: A Data-Driven Study

    ERIC Educational Resources Information Center

    Ostovar-Namaghi, Seyyed Ali; Safaee, Seyyed Esmail

    2017-01-01

    Being driven by the hypothetico-deductive mode of inquiry, previous studies have tested the effectiveness of theory-driven interventions under controlled experimental conditions to come up with universally applicable generalizations. To make a case in the opposite direction, this data-driven study aims at uncovering techniques and strategies…

  6. Modelling Conditions and Health Care Processes in Electronic Health Records: An Application to Severe Mental Illness with the Clinical Practice Research Datalink

    PubMed Central

    Olier, Ivan; Springate, David A.; Ashcroft, Darren M.; Doran, Tim; Reeves, David; Planner, Claire; Reilly, Siobhan; Kontopantelis, Evangelos

    2016-01-01

    Background The use of Electronic Health Records databases for medical research has become mainstream. In the UK, increasing use of Primary Care Databases is largely driven by almost complete computerisation and uniform standards within the National Health Service. Electronic Health Records research often begins with the development of a list of clinical codes with which to identify cases with a specific condition. We present a methodology and accompanying Stata and R commands (pcdsearch/Rpcdsearch) to help researchers in this task. We present severe mental illness as an example. Methods We used the Clinical Practice Research Datalink, a UK Primary Care Database in which clinical information is largely organised using Read codes, a hierarchical clinical coding system. Pcdsearch is used to identify potentially relevant clinical codes and/or product codes from word-stubs and code-stubs suggested by clinicians. The returned code-lists are reviewed and codes relevant to the condition of interest are selected. The final code-list is then used to identify patients. Results We identified 270 Read codes linked to SMI and used them to identify cases in the database. We observed that our approach identified cases that would have been missed with a simpler approach using SMI registers defined within the UK Quality and Outcomes Framework. Conclusion We described a framework for researchers of Electronic Health Records databases, for identifying patients with a particular condition or matching certain clinical criteria. The method is invariant to coding system or database and can be used with SNOMED CT, ICD or other medical classification code-lists. PMID:26918439
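The word-stub/code-stub search at the heart of pcdsearch/Rpcdsearch can be illustrated with a toy dictionary. Everything below is invented for illustration (the codes and terms are not taken from the CPRD, and the real commands do considerably more, including product-code search and iterative review):

```python
# Hypothetical mini Read-code dictionary: code -> term. Code-stubs exploit
# the hierarchical prefix structure; word-stubs match term text.
READ_DICT = {
    "E11..": "Schizophrenic disorders",
    "E110.": "Simple schizophrenia",
    "E12..": "Paranoid states",
    "Eu20.": "[X]Schizophrenia",
}

def search(code_stubs=(), word_stubs=()):
    """Return codes whose code matches any code-stub prefix or whose
    term contains any word-stub (case-insensitive)."""
    hits = set()
    for code, term in READ_DICT.items():
        if any(code.startswith(s) for s in code_stubs):
            hits.add(code)
        if any(w.lower() in term.lower() for w in word_stubs):
            hits.add(code)
    return sorted(hits)
```

In the published workflow, the returned candidate list is then reviewed by clinicians, and only the approved codes form the final case-finding code-list.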

  7. Evolutionary pruning of transfer learned deep convolutional neural network for breast cancer diagnosis in digital breast tomosynthesis.

    PubMed

    Samala, Ravi K; Chan, Heang-Ping; Hadjiiski, Lubomir M; Helvie, Mark A; Richter, Caleb; Cha, Kenny

    2018-05-01

    Deep learning models are highly parameterized, resulting in difficulty in inference and transfer learning for image recognition tasks. In this work, we propose a layered pathway evolution method to compress a deep convolutional neural network (DCNN) for classification of masses in digital breast tomosynthesis (DBT). The objective is to prune the number of tunable parameters while preserving the classification accuracy. In the first stage transfer learning, 19 632 augmented regions-of-interest (ROIs) from 2454 mass lesions on mammograms were used to train a pre-trained DCNN on ImageNet. In the second stage transfer learning, the DCNN was used as a feature extractor followed by feature selection and random forest classification. The pathway evolution was performed using genetic algorithm in an iterative approach with tournament selection driven by count-preserving crossover and mutation. The second stage was trained with 9120 DBT ROIs from 228 mass lesions using leave-one-case-out cross-validation. The DCNN was reduced by 87% in the number of neurons, 34% in the number of parameters, and 95% in the number of multiply-and-add operations required in the convolutional layers. The test AUC on 89 mass lesions from 94 independent DBT cases before and after pruning were 0.88 and 0.90, respectively, and the difference was not statistically significant (p  >  0.05). The proposed DCNN compression approach can reduce the number of required operations by 95% while maintaining the classification performance. The approach can be extended to other deep neural networks and imaging tasks where transfer learning is appropriate.
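The genetic-algorithm machinery named above (tournament selection, count-preserving crossover) can be sketched generically over binary pruning masks. This is a hypothetical re-creation of the operators' intent, keeping the number of retained units fixed while recombining which ones survive, and not the authors' code:

```python
import random

def tournament_select(population, fitness, k=3, rng=random):
    """Return the fittest of k candidates drawn at random."""
    return max(rng.sample(population, k), key=fitness)

def count_preserving_crossover(a, b, rng=random):
    """Child of two pruning masks: keeps exactly sum(a) active units,
    drawn from the union of the two parents' active units, so the
    pruned network size is preserved across generations."""
    union = [i for i, (x, y) in enumerate(zip(a, b)) if x or y]
    keep = set(rng.sample(union, min(sum(a), len(union))))
    return [1 if i in keep else 0 for i in range(len(a))]

rng = random.Random(0)
child = count_preserving_crossover([1, 1, 0, 0], [0, 0, 1, 1], rng)
```

Iterating selection, crossover, and a count-preserving mutation over masks, with cross-validated AUC as the fitness, gives the layered pathway evolution described in the abstract.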

  8. Evolutionary pruning of transfer learned deep convolutional neural network for breast cancer diagnosis in digital breast tomosynthesis

    NASA Astrophysics Data System (ADS)

    Samala, Ravi K.; Chan, Heang-Ping; Hadjiiski, Lubomir M.; Helvie, Mark A.; Richter, Caleb; Cha, Kenny

    2018-05-01

    Deep learning models are highly parameterized, resulting in difficulty in inference and transfer learning for image recognition tasks. In this work, we propose a layered pathway evolution method to compress a deep convolutional neural network (DCNN) for classification of masses in digital breast tomosynthesis (DBT). The objective is to prune the number of tunable parameters while preserving the classification accuracy. In the first stage transfer learning, 19 632 augmented regions-of-interest (ROIs) from 2454 mass lesions on mammograms were used to train a pre-trained DCNN on ImageNet. In the second stage transfer learning, the DCNN was used as a feature extractor followed by feature selection and random forest classification. The pathway evolution was performed using genetic algorithm in an iterative approach with tournament selection driven by count-preserving crossover and mutation. The second stage was trained with 9120 DBT ROIs from 228 mass lesions using leave-one-case-out cross-validation. The DCNN was reduced by 87% in the number of neurons, 34% in the number of parameters, and 95% in the number of multiply-and-add operations required in the convolutional layers. The test AUC on 89 mass lesions from 94 independent DBT cases before and after pruning were 0.88 and 0.90, respectively, and the difference was not statistically significant (p  >  0.05). The proposed DCNN compression approach can reduce the number of required operations by 95% while maintaining the classification performance. The approach can be extended to other deep neural networks and imaging tasks where transfer learning is appropriate.

  9. The Potential of Knowing More: A Review of Data-Driven Urban Water Management.

    PubMed

    Eggimann, Sven; Mutzner, Lena; Wani, Omar; Schneider, Mariane Yvonne; Spuhler, Dorothee; Moy de Vitry, Matthew; Beutler, Philipp; Maurer, Max

    2017-03-07

    The promise of collecting and utilizing large amounts of data has never been greater in the history of urban water management (UWM). This paper reviews several data-driven approaches which play a key role in bringing forward a sea change. It critically investigates whether data-driven UWM offers a promising foundation for addressing current challenges and supporting fundamental changes in UWM. We discuss the examples of better rain-data management, urban pluvial flood-risk management and forecasting, drinking water and sewer network operation and management, integrated design and management, increasing water productivity, wastewater-based epidemiology and on-site water and wastewater treatment. The accumulated evidence from literature points toward a future UWM that offers significant potential benefits thanks to increased collection and utilization of data. The findings show that data-driven UWM allows us to develop and apply novel methods, to optimize the efficiency of the current network-based approach, and to extend functionality of today's systems. However, generic challenges related to data-driven approaches (e.g., data processing, data availability, data quality, data costs) and the specific challenges of data-driven UWM need to be addressed, namely data access and ownership, current engineering practices and the difficulty of assessing the cost benefits of data-driven UWM.

  10. Next-generation analysis of cataracts: determining knowledge driven gene-gene interactions using Biofilter, and gene-environment interactions using the PhenX Toolkit.

    PubMed

    Pendergrass, Sarah A; Verma, Shefali S; Holzinger, Emily R; Moore, Carrie B; Wallace, John; Dudek, Scott M; Huggins, Wayne; Kitchner, Terrie; Waudby, Carol; Berg, Richard; McCarty, Catherine A; Ritchie, Marylyn D

    2013-01-01

    Investigating the association between biobank derived genomic data and the information of linked electronic health records (EHRs) is an emerging area of research for dissecting the architecture of complex human traits, where cases and controls for study are defined through the use of electronic phenotyping algorithms deployed in large EHR systems. For our study, 2580 cataract cases and 1367 controls were identified within the Marshfield Personalized Medicine Research Project (PMRP) Biobank and linked EHR, which is a member of the NHGRI-funded electronic Medical Records and Genomics (eMERGE) Network. Our goal was to explore potential gene-gene and gene-environment interactions within these data for 529,431 single nucleotide polymorphisms (SNPs) with minor allele frequency > 1%, in order to explore higher level associations with cataract risk beyond investigations of single SNP-phenotype associations. To build our SNP-SNP interaction models we utilized a prior-knowledge driven filtering method called Biofilter to minimize the multiple testing burden of exploring the vast array of interaction models possible from our extensive number of SNPs. Using the Biofilter, we developed 57,376 prior-knowledge directed SNP-SNP models to test for association with cataract status. We selected models that required 6 sources of external domain knowledge. We identified 5 statistically significant models with an interaction term with p-value < 0.05, as well as an overall model with p-value < 0.05 associated with cataract status. We also conducted gene-environment interaction analyses for all GWAS SNPs and a set of environmental factors from the PhenX Toolkit: smoking, UV exposure, and alcohol use; these environmental factors have been previously associated with the formation of cataracts. We found a total of 288 models that exhibit an interaction term with a p-value ≤ 1×10⁻⁴ associated with cataract status. 
Our results show these approaches enable advanced searches for epistasis and gene-environment interactions beyond GWAS, and that the EHR based approach provides an additional source of data for seeking these advanced explanatory models of the etiology of complex disease/outcome such as cataracts.
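At heart, each SNP-SNP model above is a regression with a product term. A minimal pure-Python sketch of fitting logit P(case) = b0 + b1·g1 + b2·g2 + b3·g1·g2 by batch gradient ascent, where b3 is the interaction effect (the toy cohort, learning rate, and iteration count are invented; a real analysis would use standard statistical software and report a p-value for b3):

```python
import math

def logistic_interaction_fit(snp1, snp2, case, iters=2000, lr=0.1):
    """Fit logit P(case) = b0 + b1*g1 + b2*g2 + b3*(g1*g2) by batch
    gradient ascent on the log-likelihood; b3 is the interaction term."""
    b = [0.0] * 4
    n = len(case)
    for _ in range(iters):
        grad = [0.0] * 4
        for g1, g2, y in zip(snp1, snp2, case):
            x = (1.0, g1, g2, g1 * g2)
            p = 1 / (1 + math.exp(-sum(bi * xi for bi, xi in zip(b, x))))
            for j in range(4):
                grad[j] += (y - p) * x[j]
        b = [bi + lr * gj / n for bi, gj in zip(b, grad)]
    return b

# Toy cohort: risk is elevated only when both variants are carried,
# so the fitted interaction coefficient should come out positive.
rows = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 1)] * 25
snp1, snp2, case = (list(t) for t in zip(*rows))
betas = logistic_interaction_fit(snp1, snp2, case)
```

Biofilter's contribution in the study is upstream of this step: it restricts which (g1, g2) pairs are ever fitted, which is what keeps the multiple-testing burden manageable.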

  11. Australian diagnosis related groups: Drivers of complexity adjustment.

    PubMed

    Jackson, Terri; Dimitropoulos, Vera; Madden, Richard; Gillett, Steve

    2015-11-01

    In undertaking a major revision to the Australian Refined Diagnosis Related Group (ARDRG) classification, we set out to contrast Australia's approach to using data on additional (not principal) diagnoses with major international approaches in splitting base or Adjacent Diagnosis Related Groups (ADRGs). We conducted a comparative policy analysis/narrative review of peer-reviewed and grey literature on international approaches to the use of additional (secondary) diagnoses in the development of Australian and international DRG systems. European and US approaches to characterise complexity of inpatient care are well-documented, providing useful points of comparison with Australia's. Australia, with good data sources, has continued to refine its national DRG classification using increasingly sophisticated approaches. Hospital funders in Australia and in other systems are often under pressure from provider groups to expand classifications to reflect clinical complexity. DRG development in most healthcare systems reviewed here reflects four critical factors: such socio-political pressures, the quality and depth of the coded data available to characterise the mix of cases in a healthcare system, the size of the underlying population, and the intended scope and use of the classification. Australia's relatively small national population has constrained the size of its DRG classifications, and development has been concentrated on inpatient care in public hospitals. Development of casemix classifications in health care is driven by both technical and socio-political factors. Use of additional diagnoses to adjust for patient complexity and cost needs to respond to these in each casemix application. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  12. Morphologically occult systemic mastocytosis in bone marrow: clinicopathologic features and an algorithmic approach to diagnosis.

    PubMed

    Reichard, Kaaren K; Chen, Dong; Pardanani, Animesh; McClure, Rebecca F; Howard, Matthew T; Kurtin, Paul J; Wood, Adam J; Ketterling, Rhett P; King, Rebecca L; He, Rong; Morice, William G; Hanson, Curtis A

    2015-09-01

    Bone marrow (BM) biopsy specimens involved by systemic mastocytosis (SM) typically show multifocal, compact, dense aggregates of spindled mast cells (MCs). However, some cases lack aggregate formation and fulfill the World Health Organization 2008 criteria for SM based on minor criteria. We identified 26 BM cases of KIT D816V-mutated, morphologically occult SM. All patients had some combination of allergic/MC activation symptoms. Peripheral blood counts were generally normal. BM aspirates showed 5% or fewer MCs, which were only occasionally spindled. BM biopsy specimens showed no classic morphologic MC lesions. Tryptase immunohistochemistry (IHC) demonstrated interstitial, individually distributed MCs (up to 5%) with prominent spindling, lacking aggregate formation. MCs coexpressed CD25 by IHC and/or flow cytometry. Spindled MCs constituted more than 25% of total MCs in all cases and more than 50% in 20 of 26 cases. Morphologically occult involvement of normal-appearing BM by SM will be missed without appropriate clinical suspicion and pathologic evaluation by tryptase and CD25 IHC and KIT D816V mutation analysis. On the basis of these findings, we propose a cost-effective, data-driven, evidence-based algorithmic approach to the workup of these cases. Copyright © by the American Society for Clinical Pathology.

  13. Research misconduct oversight: defining case costs.

    PubMed

    Gammon, Elizabeth; Franzini, Luisa

    2013-01-01

    This study uses a sequential mixed-method study design to define the cost elements of research misconduct among faculty at academic medical centers. Using time-driven activity-based costing, the model estimates a per-case cost for 17 cases of research misconduct reported by the Office of Research Integrity for the period 2000-2005. The per-case cost of research misconduct was found to range from $116,160 to $2,192,620. Research misconduct cost drivers are identified.
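
The costing arithmetic behind this record can be sketched in a few lines: in time-driven activity-based costing, a case's cost is the sum over activities of time consumed multiplied by a capacity cost rate. The activity names and figures below are hypothetical illustrations, not the study's data.

```python
# Illustrative time-driven activity-based costing (TDABC) sketch:
# per-case cost = sum over activities of (time consumed x capacity cost rate).
# Activities and rates are hypothetical, not taken from the study.

def case_cost(activities):
    """activities: list of (hours, cost_rate_per_hour) tuples."""
    return sum(hours * rate for hours, rate in activities)

# A hypothetical misconduct inquiry: committee time, legal review, admin support.
inquiry = [
    (120.0, 150.0),   # faculty committee hours at a loaded hourly rate
    (40.0, 250.0),    # legal counsel review
    (80.0, 60.0),     # administrative support
]

print(case_cost(inquiry))  # 18000 + 10000 + 4800 = 32800.0
```

The wide per-case range the study reports would emerge from exactly this structure: cases differ in which activities are triggered and for how long.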

  14. Deep Change: Cases and Commentary on Schools and Programs of Successful Reform in High Stakes States. Research in Curriculum and Instruction

    ERIC Educational Resources Information Center

    Ponder, Gerald, Ed.; Strahan, David, Ed.

    2005-01-01

    This book presents cases of schools (Part One) and programs at the district level and beyond (Part Two) in which reform, while driven by high-stakes accountability, became larger and deeper through data-driven dialogue, culture change, organizational learning, and other elements of high performing cultures. Commentaries on cross-case patterns by…

  15. Dynamic simulation of storm-driven barrier island morphology under future sea level rise

    NASA Astrophysics Data System (ADS)

    Passeri, D. L.; Long, J.; Plant, N. G.; Bilskie, M. V.; Hagen, S. C.

    2016-12-01

    Short-term processes such as tropical and extratropical storms have the potential to alter barrier island morphology. On the event scale, storm-driven morphologic change may result in damage or loss of property, infrastructure and habitat. On the decadal scale, the combination of storms and sea level rise (SLR) will evolve barrier islands. The effects of SLR on hydrodynamics and coastal morphology are dynamic and inter-related; nonlinearities in SLR can cause larger peak surges, lengthier inundation times and additional inundated land, which may result in increased erosion, overwash or breaching along barrier islands. This study uses a two-dimensional morphodynamic model (XBeach) to examine the response of Dauphin Island, AL, to storm surge under future SLR. The model is forced with water levels and waves provided by a large-domain hydrodynamic model. A historic validation against Hurricanes Ivan and Katrina indicates the model is capable of predicting morphologic response with high skill (0.5). The validated model is used to simulate storm surge driven by Ivan and Katrina under four future SLR scenarios, ranging from 20 cm to 2 m. Each SLR scenario is implemented using a static or "bathtub" approach (in which water levels are increased linearly by the amount of SLR) versus a dynamic approach (in which SLR is applied at the open ocean boundary of the hydrodynamic model and allowed to propagate through the domain as guided by the governing equations). Results illustrate that higher amounts of SLR result in additional shoreline change, dune erosion, overwash and breaching. Compared to the dynamic approach, the static approach over-predicts inundation, dune erosion, overwash and breaching of the island. Overall, results provide a better understanding of the effects of SLR on storm-driven barrier island morphology and support a paradigm shift away from the "bathtub" approach, towards considering the integrated, dynamic effects of SLR.

  16. Spin Seebeck effect in a metal-single-molecule-magnet-metal junction

    NASA Astrophysics Data System (ADS)

    Niu, Pengbin; Liu, Lixiang; Su, Xiaoqiang; Dong, Lijuan; Luo, Hong-Gang

    2018-01-01

    We investigate the nonlinear regime of temperature-driven spin-related currents through a single-molecule magnet (SMM) connected to two metal electrodes. Under a large-spin approximation, the SMM is simplified to a natural two-channel model possessing a spin-opposite configuration and Coulomb interaction. We find that in the temperature-driven case the system can generate spin-polarized currents. More interestingly, at the electron-hole symmetry point, the competition between the two channels induces a temperature-driven pure spin current. This device demonstrates that a temperature-driven SMM junction yields results different from those of the usual quantum dot model, which may be useful in the future design of thermal-based molecular spintronic devices.

  17. Improved Quantitative Plant Proteomics via the Combination of Targeted and Untargeted Data Acquisition

    PubMed Central

    Hart-Smith, Gene; Reis, Rodrigo S.; Waterhouse, Peter M.; Wilkins, Marc R.

    2017-01-01

    Quantitative proteomics strategies – which are playing important roles in the expanding field of plant molecular systems biology – are traditionally designated as either hypothesis driven or non-hypothesis driven. Many of these strategies aim to select individual peptide ions for tandem mass spectrometry (MS/MS), and to do this mixed hypothesis driven and non-hypothesis driven approaches are theoretically simple to implement. In-depth investigations into the efficacies of such approaches have, however, yet to be described. In this study, using combined samples of unlabeled and metabolically 15N-labeled Arabidopsis thaliana proteins, we investigate the mixed use of targeted data acquisition (TDA) and data dependent acquisition (DDA) – referred to as TDA/DDA – to facilitate both hypothesis driven and non-hypothesis driven quantitative data collection in individual LC-MS/MS experiments. To investigate TDA/DDA for hypothesis driven data collection, 7 miRNA target proteins of differing size and abundance were targeted using inclusion lists comprised of 1558 m/z values, using 3 different TDA/DDA experimental designs. In samples in which targeted peptide ions were of particularly low abundance (i.e., predominantly only marginally above mass analyser detection limits), TDA/DDA produced statistically significant increases in the number of targeted peptides identified (230 ± 8 versus 80 ± 3 for DDA; p = 1.1 × 10-3) and quantified (35 ± 3 versus 21 ± 2 for DDA; p = 0.038) per experiment relative to the use of DDA only. These expected improvements in hypothesis driven data collection were observed alongside unexpected improvements in non-hypothesis driven data collection. 
Untargeted peptide ions with m/z values matching those in inclusion lists were repeatedly identified and quantified across technical replicate TDA/DDA experiments, resulting in significant increases in the percentages of proteins repeatedly quantified in TDA/DDA experiments only relative to DDA experiments only (33.0 ± 2.6% versus 8.0 ± 2.7%, respectively; p = 0.011). These results were observed together with uncompromised broad-scale MS/MS data collection in TDA/DDA experiments relative to DDA experiments. Using our observations we provide guidelines for TDA/DDA method design for quantitative plant proteomics studies, and suggest that TDA/DDA is a broadly underutilized proteomics data acquisition strategy. PMID:29021799
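
The TDA/DDA mixing described above can be sketched as a precursor-selection rule: ions matching an inclusion list within a ppm tolerance are always targeted, and any remaining selection capacity is filled data-dependently by intensity. The m/z values, tolerance, and list sizes below are illustrative, not the study's 1558-entry lists.

```python
# Sketch of mixed targeted (TDA) / data-dependent (DDA) precursor selection:
# observed ions matching the inclusion list within a ppm tolerance are
# selected first; remaining slots go to the most intense untargeted ions.
# All values are hypothetical.

def matches_inclusion_list(mz, inclusion_list, tol_ppm=10.0):
    return any(abs(mz - target) / target * 1e6 <= tol_ppm
               for target in inclusion_list)

def select_precursors(observed, inclusion_list, n_dda=3):
    """observed: list of (m/z, intensity). Targeted ions first, then top-N DDA."""
    targeted = [(mz, i) for mz, i in observed
                if matches_inclusion_list(mz, inclusion_list)]
    rest = sorted((p for p in observed if p not in targeted),
                  key=lambda p: -p[1])[:n_dda]
    return targeted + rest

ions = [(500.2501, 1e4), (622.0289, 5e5), (785.8421, 2e5), (433.1001, 8e4)]
targets = [500.2500, 785.8420]
print(select_precursors(ions, targets, n_dda=1))
```

Note that the low-intensity targeted ion at m/z 500.2501 is selected even though plain DDA, ranking by intensity alone, would have skipped it; this is the mechanism behind the improved coverage of low-abundance targets reported above.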

  18. Qualitatively modelling and analysing genetic regulatory networks: a Petri net approach.

    PubMed

    Steggles, L Jason; Banks, Richard; Shaw, Oliver; Wipat, Anil

    2007-02-01

    New developments in post-genomic technology now provide researchers with the data necessary to study regulatory processes in a holistic fashion at multiple levels of biological organization. One of the major challenges for the biologist is to integrate and interpret these vast data resources to gain a greater understanding of the structure and function of the molecular processes that mediate adaptive and cell cycle driven changes in gene expression. In order to achieve this biologists require new tools and techniques to allow pathway related data to be modelled and analysed as network structures, providing valuable insights which can then be validated and investigated in the laboratory. We propose a new technique for constructing and analysing qualitative models of genetic regulatory networks based on the Petri net formalism. We take as our starting point the Boolean network approach of treating genes as binary switches and develop a new Petri net model which uses logic minimization to automate the construction of compact qualitative models. Our approach addresses the shortcomings of Boolean networks by providing access to the wide range of existing Petri net analysis techniques and by using non-determinism to cope with incomplete and inconsistent data. The ideas we present are illustrated by a case study in which the genetic regulatory network controlling sporulation in the bacterium Bacillus subtilis is modelled and analysed. The Petri net model construction tool and the data files for the B. subtilis sporulation case study are available at http://bioinf.ncl.ac.uk/gnapn.
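
The Boolean-network starting point the authors generalize can be sketched directly: genes are binary switches updated synchronously by logic rules. The three-gene network below is hypothetical, not the B. subtilis sporulation model, and the sketch omits the Petri net translation and the non-determinism the paper adds.

```python
# A minimal synchronous Boolean network. Genes are binary switches; each
# rule computes a gene's next state from the current global state.
# The three-gene regulatory logic here is hypothetical.

rules = {
    "a": lambda s: s["c"],                 # a is activated by c
    "b": lambda s: s["a"] and not s["c"],  # b needs a and is repressed by c
    "c": lambda s: not s["b"],             # c is repressed by b
}

def step(state):
    """One synchronous update: all genes read the old state, then switch."""
    return {gene: int(rule(state)) for gene, rule in rules.items()}

state = {"a": 0, "b": 0, "c": 1}
for _ in range(4):
    state = step(state)
    print(state)  # settles into the fixed point {'a': 1, 'b': 0, 'c': 1}
```

Attractors of such networks (fixed points and cycles) are what the Petri net analysis machinery then makes accessible via standard reachability and invariant techniques.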

  19. A land classification protocol for pollinator ecology research: An urbanization case study.

    PubMed

    Samuelson, Ash E; Leadbeater, Ellouise

    2018-06-01

    Land-use change is one of the most important drivers of widespread declines in pollinator populations. Comprehensive quantitative methods for land classification are critical to understanding these effects, but co-option of existing human-focussed land classifications is often inappropriate for pollinator research. Here, we present a flexible GIS-based land classification protocol for pollinator research using a bottom-up approach driven by reference to pollinator ecology, with urbanization as a case study. Our multistep method involves manually generating land cover maps at multiple biologically relevant radii surrounding study sites using GIS, with a focus on identifying land cover types that have a specific relevance to pollinators. This is followed by a three-step refinement process using statistical tools: (i) definition of land-use categories, (ii) principal components analysis on the categories, and (iii) cluster analysis to generate a categorical land-use variable for use in subsequent analysis. Model selection is then used to determine the appropriate spatial scale for analysis. We demonstrate an application of our protocol using a case study of 38 sites across a gradient of urbanization in South-East England. In our case study, the land classification generated a categorical land-use variable at each of four radii based on the clustering of sites with different degrees of urbanization, open land, and flower-rich habitat. Studies of land-use effects on pollinators have historically employed a wide array of land classification techniques from descriptive and qualitative to complex and quantitative. We suggest that land-use studies in pollinator ecology should broadly adopt GIS-based multistep land classification techniques to enable robust analysis and aid comparative research. 
Our protocol offers a customizable approach that combines specific relevance to pollinator research with the potential for application to a wide range of ecological questions, including agroecological studies of pest control.
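
Step (iii) of the refinement process, deriving a categorical land-use variable by clustering, can be sketched with a small pure-Python k-means; the two scores per site stand in for the principal-component axes of step (ii), and the site data and initial centres are hypothetical.

```python
# Sketch of the clustering step: k-means on site-level land-cover scores
# to derive a categorical land-use variable. Fixed initial centres keep
# the demo deterministic; all values are hypothetical.

def kmeans(points, centres, iters=10):
    for _ in range(iters):
        groups = [[] for _ in centres]
        for p in points:
            d = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centres]
            groups[d.index(min(d))].append(p)          # assign to nearest centre
        centres = [tuple(sum(col) / len(g) for col in zip(*g))
                   for g in groups if g]               # recompute centres
    labels = [min(range(len(centres)),
                  key=lambda i: sum((a - b) ** 2 for a, b in zip(p, centres[i])))
              for p in points]
    return labels, centres

# Sites scored on (urbanization, flower-rich habitat), e.g. PCA axes 1 and 2.
sites = [(0.9, 0.1), (0.8, 0.2), (0.1, 0.9), (0.2, 0.8)]
labels, centres = kmeans(sites, centres=[(1.0, 0.0), (0.0, 1.0)])
print(labels)  # the two urban-leaning sites share one label, the others the second
```

The resulting integer labels are exactly the kind of categorical land-use variable the protocol feeds into subsequent models, with model selection across radii choosing the spatial scale.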

  20. Chemical potential of quasi-equilibrium magnon gas driven by pure spin current.

    PubMed

    Demidov, V E; Urazhdin, S; Divinskiy, B; Bessonov, V D; Rinkevich, A B; Ustinov, V V; Demokritov, S O

    2017-11-17

    Pure spin currents provide the possibility to control the magnetization state of conducting and insulating magnetic materials. They allow one to increase or reduce the density of magnons, and achieve coherent dynamic states of magnetization reminiscent of the Bose-Einstein condensation. However, until now there was no direct evidence that the state of the magnon gas subjected to spin current can be treated thermodynamically. Here, we show experimentally that the spin current generated by the spin-Hall effect drives the magnon gas into a quasi-equilibrium state that can be described by the Bose-Einstein statistics. The magnon population function is characterized either by an increased effective chemical potential or by a reduced effective temperature, depending on the spin current polarization. In the former case, the chemical potential can closely approach, at large driving currents, the lowest-energy magnon state, indicating the possibility of spin current-driven Bose-Einstein condensation.
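
The quasi-equilibrium description rests on the Bose-Einstein population function n(E) = 1/(exp((E - μ)/kT) - 1); a short sketch shows how raising the effective chemical potential μ toward the lowest magnon energy sharply increases that state's occupation, the precursor of condensation described above. Units and values are illustrative.

```python
# Sketch of the Bose-Einstein population function with an effective
# chemical potential mu. As mu approaches the lowest magnon energy E_min
# from below, the occupation of that state grows sharply. Energies are in
# arbitrary illustrative units.

import math

def bose_einstein(E, mu, kT):
    assert E > mu, "BE statistics requires E > mu"
    return 1.0 / (math.exp((E - mu) / kT) - 1.0)

E_min, kT = 1.0, 0.5
for mu in (0.0, 0.5, 0.9, 0.99):
    print(mu, bose_einstein(E_min, mu, kT))  # occupation rises as mu -> E_min
```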

  1. Lazy evaluation of FP programs: A data-flow approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wei, Y.H.; Gaudiot, J.L.

    1988-12-31

    This paper presents a lazy evaluation system for Backus' list-based functional language FP in a data-driven environment. A superset language of FP, called DFP (Demand-driven FP), is introduced. FP eager programs are transformed into DFP lazy programs which contain the notion of demands. The data-driven execution of DFP programs has the same effect as lazy evaluation. DFP lazy programs have the property of always evaluating a necessary and sufficient result. The infinite sequence generator is used to demonstrate the eager-lazy program transformation and the execution of the lazy programs.
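
The demand-driven evaluation the abstract describes maps naturally onto generators: elements of an infinite sequence are computed only when a consumer demands them. A sketch in Python (not DFP itself) of the infinite sequence generator idea:

```python
# Demand-driven evaluation of an infinite sequence via generators:
# only the elements a consumer actually requests are ever computed,
# which is the effect DFP's demands achieve in a data-driven setting.

from itertools import islice

def naturals(start=0):
    n = start
    while True:        # conceptually infinite; elements exist only on demand
        yield n
        n += 1

def lazy_map(f, seq):
    for x in seq:
        yield f(x)     # f is applied only when a value is demanded

squares = lazy_map(lambda x: x * x, naturals())  # nothing computed yet
print(list(islice(squares, 5)))  # demand exactly five elements -> [0, 1, 4, 9, 16]
```

An eager evaluation of the same pipeline would never terminate; the demand (`islice`) makes the result both necessary and sufficient, mirroring the property claimed for DFP programs.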

  2. India Commercial Buildings Data Framework: A Summary of Potential Use Cases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mathew, Paul; Mathew, Sangeeta; Kumar, Satish

    This report details a potential set of use cases for India’s Commercial Buildings Data Framework. The use cases are aimed at enabling data-driven, evidence-based policy making and at transforming the market for energy efficiency in the building sector by facilitating the adoption of (1) superior energy-efficient building design and operation and maintenance practices, and (2) better specification and procurement of end-use equipment and systems.

  3. Switching Kalman filter for failure prognostic

    NASA Astrophysics Data System (ADS)

    Lim, Chi Keong Reuben; Mba, David

    2015-02-01

    The use of condition monitoring (CM) data to predict remaining useful life has been growing with the increasing use of health and usage monitoring systems on aircraft. Many data-driven methodologies are available for prediction; popular ones include artificial intelligence and statistically based approaches. The drawback of such approaches is that they require a lot of failure data for training, which can be scarce in practice. In lieu of this, methods using state-space and regression-based models that extract information from the data history itself have been explored. However, such methods have their own limitations, as they utilize a single time-invariant model which does not represent changing degradation paths well. This causes most degradation modeling studies to focus only on segments of their CM data that behave close to the assumed model. In this paper, a state-space-based method, the Switching Kalman Filter (SKF), is adopted for model estimation and life prediction. The SKF approach, however, uses multiple models, from which the most probable model is inferred from the CM data using Bayesian estimation before it is applied for prediction. At the same time, the inference of the degradation model itself can provide maintainers with more information for their planning. This SKF approach is demonstrated with a case study on gearbox bearings that were found defective in Republic of Singapore Air Force AH64D helicopters. The use of in-service CM data allows the approach to be applied in a practical scenario, and results showed that the developed SKF approach is a promising tool to support maintenance decision-making.
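
A much-simplified multiple-model sketch conveys the SKF idea: several Kalman filters with different degradation hypotheses run in parallel, and Bayes' rule reweights them by the likelihood of each measurement's innovation. The noise levels, drift hypotheses, and data below are hypothetical, and the sketch omits the full switching dynamics of a true SKF.

```python
# Simplified multiple-model sketch in the spirit of the SKF: two scalar
# Kalman filters (a "stable" model and a "degrading" trend model) run in
# parallel; Bayes' rule weighs them by the Gaussian likelihood of each
# innovation. All numbers are hypothetical.

import math

def kf_step(x, P, z, drift, Q=0.01, R=0.25):
    """One predict/update cycle of a scalar Kalman filter with known drift."""
    x_pred, P_pred = x + drift, P + Q
    innov, S = z - x_pred, P_pred + R
    K = P_pred / S
    x_new, P_new = x_pred + K * innov, (1 - K) * P_pred
    lik = math.exp(-innov * innov / (2 * S)) / math.sqrt(2 * math.pi * S)
    return x_new, P_new, lik

models = {"stable": 0.0, "degrading": 0.5}        # per-step drift hypotheses
state = {m: (0.0, 1.0) for m in models}           # (x, P) per model
prob = {m: 0.5 for m in models}                   # prior model probabilities

measurements = [0.4, 1.1, 1.4, 2.1, 2.4]          # clearly trending upward
for z in measurements:
    liks = {}
    for m, drift in models.items():
        x, P = state[m]
        x, P, liks[m] = kf_step(x, P, z, drift)
        state[m] = (x, P)
    total = sum(prob[m] * liks[m] for m in models)
    prob = {m: prob[m] * liks[m] / total for m in models}

print(prob)  # the "degrading" model dominates on this trending data
```

The inferred model probability is exactly the extra diagnostic the abstract highlights: beyond the life prediction itself, maintainers learn which degradation regime the bearing is most probably in.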

  4. Community dialogues for child health: results from a qualitative process evaluation in three countries.

    PubMed

    Martin, Sandrine; Leitão, Jordana; Muhangi, Denis; Nuwa, Anthony; Magul, Dieterio; Counihan, Helen

    2017-06-05

    Across the developing world, countries are increasingly adopting the integrated community case management of childhood illnesses (iCCM) strategy in efforts to reduce child mortality. This intervention's effectiveness is dependent on community adoption and changes in care-seeking practices. We assessed the implementation process of a theory-driven community dialogue (CD) intervention specifically designed to strengthen the support and uptake of the newly introduced iCCM services and related behaviours in three African countries. A qualitative process evaluation methodology was chosen, drawing on secondary project data and primary data collected in two districts of each of the three countries, in purposefully sampled communities. The final data set included 67 focus group discussions and 57 key informant interviews, totalling 642 respondents, including caregivers, CD facilitators, community leaders, and trainers. Thematic analysis of the data followed the 'Framework Approach', utilising both deductive and inductive processes. Results show that CDs contribute to triggering community uptake of and support for iCCM services by filling health information gaps and building cooperation within communities. We found it to be an effective approach for addressing social norms around child care practices. This approach was embraced by communities for its flexibility and value in planning individual and collective change. Regular CDs can contribute to the formation of new habits, particularly in relation to seeking timely care in case of child sickness. This study also confirms the value of process evaluation in unwrapping the mechanisms of community mobilisation approaches in context, and provides key insights for improving the CD approach.

  5. Solar wind driven empirical forecast models of the time derivative of the ground magnetic field

    NASA Astrophysics Data System (ADS)

    Wintoft, Peter; Wik, Magnus; Viljanen, Ari

    2015-03-01

    Empirical models are developed to provide 10-30-min forecasts of the magnitude of the time derivative of the local horizontal ground geomagnetic field (|dBh/dt|) over Europe. The models are driven by ACE solar wind data. A major part of the work has been devoted to the search for and selection of datasets to support the model development. To simplify the problem, but at the same time capture sudden changes, 30-min maximum values of |dBh/dt| are forecast with a cadence of 1 min. Models are tested both with and without the use of ACE SWEPAM plasma data. It is shown that the models generally capture sudden increases in |dBh/dt| that are associated with sudden impulses (SI). The SI is the dominant disturbance source for geomagnetic latitudes below 50° N, with only a minor contribution from substorms. However, on occasion, large disturbances can be seen associated with geomagnetic pulsations. At higher latitudes, longer-lasting disturbances associated with substorms are generally also captured. It is also shown that the models using only solar wind magnetic field data as input perform in most cases equally well as models with plasma data. The models have been verified using different approaches, including the extremal dependence index, which is suitable for rare events.
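
The forecast target, 30-min maximum values of |dBh/dt| at a 1-min cadence, amounts to a sliding-window maximum over the 1-min series; a sketch with a monotonic deque computes it in O(n). The series below is illustrative.

```python
# Sliding-window maximum of a 1-min |dB_h/dt| series, the quantity the
# empirical models are trained to forecast (here over a trailing window).
# A monotonic deque of indices gives O(n) total work. Data are illustrative.

from collections import deque

def sliding_max(values, window=30):
    out, dq = [], deque()                    # dq holds indices; values decreasing
    for i, v in enumerate(values):
        while dq and values[dq[-1]] <= v:    # drop entries dominated by v
            dq.pop()
        dq.append(i)
        if dq[0] <= i - window:              # expire entries outside the window
            dq.popleft()
        if i >= window - 1:
            out.append(values[dq[0]])        # front of deque is the window max
    return out

series = [1, 3, 2, 5, 4, 1, 1, 6]
print(sliding_max(series, window=3))  # [3, 5, 5, 5, 4, 6]
```

Taking the window maximum rather than the raw 1-min value is what lets the models flag short-lived spikes, such as those from sudden impulses, without having to predict their exact minute of arrival.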

  6. Driven by Data: How Three Districts Are Successfully Using Data, Rather than Gut Feelings, to Align Staff Development with School Needs

    ERIC Educational Resources Information Center

    Gold, Stephanie

    2005-01-01

    The concept of data-driven professional development is both straight-forward and sensible. Implementing this approach is another story, which is why many administrators are turning to sophisticated tools to help manage data collection and analysis. These tools allow educators to assess and correlate student outcomes, instructional methods, and…

  7. Think to Learn (Creating a Standards-Driven Thinking Classroom). Occasional Paper Series. Volume 1, Number 2

    ERIC Educational Resources Information Center

    Fluellen, Jerry E., Jr.

    2006-01-01

    Think to Learn. That's how Robert Sternberg boils down his approach to teaching thinking. In an urban technology high school, two Teacher Consultants in the District of Columbia Area Writing Project at Howard University co-constructed a prototype for creating standards-driven thinking classrooms. With 132 high school students, they used the…

  8. Automated control of hierarchical systems using value-driven methods

    NASA Technical Reports Server (NTRS)

    Pugh, George E.; Burke, Thomas E.

    1990-01-01

    An introduction is given to the Value-driven methodology, which has been successfully applied to solve a variety of difficult decision, control, and optimization problems. Many real-world decision processes (e.g., those encountered in scheduling, allocation, and command and control) involve a hierarchy of complex planning considerations. For such problems it is virtually impossible to define a fixed set of rules that will operate satisfactorily over the full range of probable contingencies. Decision Science Applications' value-driven methodology offers a systematic way of automating the intuitive, common-sense approach used by human planners. The inherent responsiveness of value-driven systems to user-controlled priorities makes them particularly suitable for semi-automated applications in which the user must remain in command of the system's operation. Three examples of the practical application of the approach in the automation of hierarchical decision processes are discussed: the TAC Brawler air-to-air combat simulation is a four-level computerized hierarchy; the autonomous underwater vehicle mission planning system is a three-level control system; and the Space Station Freedom electrical power control and scheduling system is designed as a two-level hierarchy. The methodology is compared with rule-based systems and with other more widely known optimization techniques.

  9. Data Driven Model Development for the SuperSonic SemiSpan Transport (S4T)

    NASA Technical Reports Server (NTRS)

    Kukreja, Sunil L.

    2011-01-01

    In this report, we investigate two common approaches to model development for robust control synthesis in the aerospace community: reduced-order aeroservoelastic modelling based on structural finite-element and computational-fluid-dynamics-based aerodynamic models, and a data-driven system identification procedure. It is shown via analysis of experimental SuperSonic SemiSpan Transport (S4T) wind-tunnel data that a system identification approach makes it possible to estimate a model, at a fixed Mach number, which is parsimonious and robust across varying dynamic pressures.
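
The data-driven system identification route can be sketched at its simplest: fit a first-order ARX model y[k] = a·y[k-1] + b·u[k-1] by least squares, solving the 2×2 normal equations directly. The system and signals below are illustrative, not the S4T aeroservoelastic model.

```python
# Sketch of data-driven system identification: least-squares fit of a
# first-order ARX model y[k] = a*y[k-1] + b*u[k-1], with the 2x2 normal
# equations solved in closed form. The system and signals are illustrative.

def fit_arx(y, u):
    # Regressors: phi[k] = (y[k-1], u[k-1]); target y[k].
    Syy = Syu = Suu = Sy_t = Su_t = 0.0
    for k in range(1, len(y)):
        p, q, t = y[k - 1], u[k - 1], y[k]
        Syy += p * p; Syu += p * q; Suu += q * q
        Sy_t += p * t; Su_t += q * t
    det = Syy * Suu - Syu * Syu
    a = (Suu * Sy_t - Syu * Su_t) / det
    b = (Syy * Su_t - Syu * Sy_t) / det
    return a, b

# Synthetic, noise-free data from a known system a=0.8, b=0.5.
u = [1, 0, -1, 2, 0, 1, -2, 1, 0, 3]
y = [0.0]
for k in range(1, len(u)):
    y.append(0.8 * y[k - 1] + 0.5 * u[k - 1])

a, b = fit_arx(y, u)
print(round(a, 6), round(b, 6))  # recovers 0.8 0.5
```

A parsimonious model in this sense is one with few such coefficients; the report's point is that a low-order identified model can remain valid across dynamic pressures at a fixed Mach number.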

  10. Dynamic reflexivity in action: an armchair walkthrough of a qualitatively driven mixed-method and multiple methods study of mindfulness training in schoolchildren.

    PubMed

    Cheek, Julianne; Lipschitz, David L; Abrams, Elizabeth M; Vago, David R; Nakamura, Yoshio

    2015-06-01

    Dynamic reflexivity is central to enabling flexible and emergent qualitatively driven inductive mixed-method and multiple methods research designs. Yet too often, such reflexivity, and how it is used at various points of a study, is absent when we write our research reports. Instead, reports of mixed-method and multiple methods research focus on what was done rather than how it came to be done. This article seeks to redress this absence of emphasis on the reflexive thinking underpinning the way that mixed- and multiple methods, qualitatively driven research approaches are thought about and subsequently used throughout a project. Using Morse's notion of an armchair walkthrough, we excavate and explore the layers of decisions we made about how, and why, to use qualitatively driven mixed-method and multiple methods research in a study of mindfulness training (MT) in schoolchildren. © The Author(s) 2015.

  11. Understanding Preprocedure Patient Flow in IR.

    PubMed

    Zafar, Abdul Mueed; Suri, Rajeev; Nguyen, Tran Khanh; Petrash, Carson Cope; Fazal, Zanira

    2016-08-01

    To quantify preprocedural patient flow in interventional radiology (IR) and to identify potential contributors to preprocedural delays. An administrative dataset was used to compute the time intervals required for various preprocedural patient-flow processes. These time intervals were compared across on-time/delayed cases and inpatient/outpatient cases by the Mann-Whitney U test. The Spearman ρ was used to assess any correlation of the rank of a procedure on a given day, and of the procedure duration, with the preprocedure time. A linear-regression model of preprocedure time was used to further explore potential contributing factors. Any identified reason(s) for delay were collated. P < .05 was considered statistically significant. Of the total 1,091 cases, 65.8% (n = 718) were delayed. Significantly more outpatient cases started late compared with inpatient cases (81.4% vs 45.0%; P < .001, χ² test). The multivariate linear-regression model showed outpatient status, length of delay in arrival, and longer procedure times to be significantly associated with longer preprocedure times. Late arrival of patients (65.9%), unavailability of physicians (18.4%), and unavailability of the procedure room (13.0%) were the three most frequently identified reasons for delay. The delay was multifactorial in 29.6% of cases (n = 213). Objective measurement of preprocedural IR patient flow demonstrated considerable waste and highlighted high-yield areas of possible improvement. A data-driven approach may aid efficient delivery of IR care. Copyright © 2016 SIR. Published by Elsevier Inc. All rights reserved.
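
The reported chi-squared comparison of delayed starts (outpatient vs inpatient) is a 2×2 test with a closed-form statistic. The cell counts below are hypothetical, chosen only to mimic the reported 81.4% vs 45.0% split, not taken from the study's tables.

```python
# Sketch of a 2x2 chi-squared test of delayed vs on-time starts by patient
# type, using chi2 = N(ad - bc)^2 / ((a+b)(c+d)(a+c)(b+d)).
# Cell counts are hypothetical, chosen to mimic the reported percentages.

def chi2_2x2(a, b, c, d):
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# rows: outpatient, inpatient; columns: delayed, on-time
outpatient_delayed, outpatient_ontime = 489, 112   # ~81.4% delayed
inpatient_delayed, inpatient_ontime = 220, 270     # ~45.0% delayed

stat = chi2_2x2(outpatient_delayed, outpatient_ontime,
                inpatient_delayed, inpatient_ontime)
print(round(stat, 1))  # well above 10.83, the P = .001 cutoff at 1 df
```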

  12. A model-driven approach for representing clinical archetypes for Semantic Web environments.

    PubMed

    Martínez-Costa, Catalina; Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás; Maldonado, José Alberto

    2009-02-01

    The life-long clinical information of any person supported by electronic means constitutes his or her Electronic Health Record (EHR). This information is usually distributed among several independent and heterogeneous systems that may be syntactically or semantically incompatible. There are currently different standards for representing and exchanging EHR information among different systems. In advanced EHR approaches, clinical information is represented by means of archetypes. Most of these approaches use the Archetype Definition Language (ADL) to specify archetypes. However, ADL has some drawbacks when attempting to perform semantic activities in Semantic Web environments. In this work, Semantic Web technologies are used to specify clinical archetypes for advanced EHR architectures. The advantages of using the Web Ontology Language (OWL) instead of ADL are described and discussed. Moreover, a solution combining Semantic Web and Model-driven Engineering technologies is proposed to transform ADL into OWL for the CEN EN13606 EHR architecture.

  13. Can We Practically Bring Physics-based Modeling Into Operational Analytics Tools?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Granderson, Jessica; Bonvini, Marco; Piette, Mary Ann

    Analytics software is increasingly used to improve and maintain operational efficiency in commercial buildings. Energy managers, owners, and operators are using a diversity of commercial offerings often referred to as Energy Information Systems, Fault Detection and Diagnostic (FDD) systems, or more broadly Energy Management and Information Systems, to cost-effectively enable savings on the order of ten to twenty percent. Most of these systems use data from meters and sensors, with rule-based and/or data-driven models to characterize system and building behavior. In contrast, physics-based modeling uses first-principles and engineering models (e.g., efficiency curves) to characterize system and building behavior. Historically, these physics-based approaches have been used in the design phase of the building life cycle or in retrofit analyses. Researchers have begun exploring the benefits of integrating physics-based models with operational data analytics tools, bridging the gap between design and operations. In this paper, we detail the development and operator use of a software tool that uses hybrid data-driven and physics-based approaches to cooling plant FDD and optimization. Specifically, we describe the system architecture, models, and FDD and optimization algorithms; advantages and disadvantages with respect to purely data-driven approaches; and practical implications for scaling and replicating these techniques. Finally, we conclude with an evaluation of the future potential for such tools and future research opportunities.
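
A hybrid check of the kind described, with a physics-based efficiency curve supplying the expected value and a data-driven threshold flagging deviations, can be sketched as follows; the curve coefficients, readings, and threshold are hypothetical, not the paper's cooling-plant models.

```python
# Hybrid FDD sketch: a physics-based part-load efficiency curve predicts
# expected power, and a residual threshold flags faults. The quadratic
# coefficients, readings, and threshold are hypothetical.

def expected_power(load_frac, rated_kw=100.0, coeffs=(0.1, 0.5, 0.4)):
    """Quadratic part-load model: P = rated * (c0 + c1*x + c2*x^2)."""
    c0, c1, c2 = coeffs
    return rated_kw * (c0 + c1 * load_frac + c2 * load_frac ** 2)

def fdd(readings, threshold_kw=8.0):
    """Flag readings where measured power deviates from the physics model."""
    return [i for i, (load, measured) in enumerate(readings)
            if abs(measured - expected_power(load)) > threshold_kw]

readings = [(0.5, 45.0), (0.8, 76.0), (0.9, 105.0), (1.0, 101.0)]
print(fdd(readings))  # [2]: the third reading sits far off the curve
```

In practice the threshold itself would be tuned from operational data, which is where the data-driven half of the hybrid earns its keep.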

  14. SuperJet International case study: a business network start-up in the aeronautics industry

    NASA Astrophysics Data System (ADS)

    Corallo, Angelo; de Maggio, Marco; Storelli, Davide

    This chapter presents the SuperJet International case study, a start-up in the aeronautics industry characterized by a process-oriented approach and a complex, still-evolving network of partnerships and collaborations. The chapter aims to describe the key points of the start-up process, highlighting common factors and differences compared to the TEKNE Methodology of Change, with particular reference to the second and third phases, namely, the design and deployment of new techno-organizational systems. The SuperJet International start-up is presented as a case study where strategic and organizational aspects have been jointly conceived from a network-driven perspective. The chapter compares some of the guidelines of the TEKNE Methodology of Change with experiences and actual practices deriving from interviews with key players in SJI's start-up process.

  15. Troubleshooting Portfolios

    ERIC Educational Resources Information Center

    Crismond, David; Peterie, Matthew

    2017-01-01

    The Troubleshooting Portfolios approach was developed at the Olathe Northwest High School in Olathe, Kansas. This approach supports integrated STEM and "informed design" thinking and learning, in which students: (1) use design strategies effectively; (2) work creatively and collaboratively in teams; (3) make knowledge-driven decisions;…

  16. System performance predictions for Space Station Freedom's electric power system

    NASA Technical Reports Server (NTRS)

    Kerslake, Thomas W.; Hojnicki, Jeffrey S.; Green, Robert D.; Follo, Jeffrey C.

    1993-01-01

    The Space Station Freedom Electric Power System (EPS) capability to effectively deliver power to housekeeping and user loads continues to strongly influence Freedom's design and planned approaches for assembly and operations. The EPS design consists of silicon photovoltaic (PV) arrays, nickel-hydrogen batteries, and direct current power management and distribution hardware and cabling. To properly characterize the inherent EPS design capability, detailed system performance analyses must be performed for early assembly stages as well as for the fully assembled station up to 15 years after beginning of life. Such analyses were repeatedly performed using the FORTRAN code SPACE (Station Power Analysis for Capability Evaluation) developed at the NASA Lewis Research Center over a 10-year period. SPACE combines orbital mechanics routines, station orientation/pointing routines, PV array and battery performance models, and a distribution system load-flow analysis to predict EPS performance. Time-dependent performance degradation, low-Earth-orbit environmental interactions, and EPS architecture build-up are incorporated in SPACE. Results from two typical SPACE analytical cases are presented: (1) an electric load driven case and (2) a maximum EPS capability case.
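
    The orbit-by-orbit power balance that such an analysis rests on can be sketched in a few lines. The following is a simplified illustration only; the numbers, names, and efficiencies are assumptions made for the example, not values from SPACE or the Freedom EPS design.

```python
# Hypothetical single-orbit power balance for a PV-array/battery system,
# illustrating the kind of capability analysis described above.
def orbit_power_balance(array_kw, load_kw, batt_kwh, period_min=92.0,
                        eclipse_min=36.0, charge_eff=0.85, dt_min=1.0):
    """Step through one orbit; return minimum battery state of charge (0-1)."""
    soc = 1.0                                  # start fully charged
    min_soc = soc
    t = 0.0
    while t < period_min:
        in_eclipse = t < eclipse_min
        source = 0.0 if in_eclipse else array_kw
        net_kw = source - load_kw              # surplus charges, deficit discharges
        if net_kw >= 0:
            soc += net_kw * charge_eff * (dt_min / 60.0) / batt_kwh
        else:
            soc += net_kw * (dt_min / 60.0) / batt_kwh
        soc = min(soc, 1.0)
        min_soc = min(min_soc, soc)
        t += dt_min
    return min_soc
```

    For example, a 30 kW array carrying an 18.75 kW load with a 30 kWh battery bottoms out at 62.5% state of charge at the end of eclipse, a figure that would then be checked against a battery depth-of-discharge limit.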

  17. MATTS- A Step Towards Model Based Testing

    NASA Astrophysics Data System (ADS)

    Herpel, H.-J.; Willich, G.; Li, J.; Xie, J.; Johansen, B.; Kvinnesland, K.; Krueger, S.; Barrios, P.

    2016-08-01

    In this paper we describe a model-based approach to testing of on-board software and compare it with the traditional validation strategy currently applied to satellite software. The major problems that software engineering will face over at least the next two decades are increasing application complexity, driven by the need for autonomy, and demands for serious application robustness. In other words, how do we actually get to declare success when trying to build applications one or two orders of magnitude more complex than today's applications? To solve these problems the software engineering process has to be improved in at least two respects: (1) software design and (2) software testing. The software design process has to evolve towards model-based approaches with extensive use of code generators. Today, testing is an essential, but time- and resource-consuming activity in the software development process. Generating a short but effective test suite usually requires a lot of manual work and expert knowledge. In a model-based process, among other subtasks, test construction and test execution can be partially automated. The basic idea behind the presented study was to start from a formal model (e.g. state machines), generate abstract test cases, and then convert them into concrete executable test cases (input and expected output pairs). The generated concrete test cases were applied to an on-board software. Results were collected and evaluated with respect to applicability, cost-efficiency, effectiveness at fault finding, and scalability.
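
    The flow described above (formal model, abstract test generation, concretization into input/expected-output pairs) can be sketched as follows. The state machine, its events, and its outputs are invented for illustration and bear no relation to the actual on-board software model.

```python
from collections import deque

# Illustrative formal model: (state, event) -> (next_state, output).
FSM = {
    ("OFF", "power_on"): ("STANDBY", "booted"),
    ("STANDBY", "arm"): ("ARMED", "armed"),
    ("ARMED", "disarm"): ("STANDBY", "disarmed"),
    ("STANDBY", "power_off"): ("OFF", "shutdown"),
}

def abstract_tests(fsm, start="OFF"):
    """BFS from the start state; return one event path covering each transition."""
    tests = []
    seen = {start}
    queue = deque([(start, [])])
    while queue:
        state, path = queue.popleft()
        for (s, event), (nxt, _) in fsm.items():
            if s != state:
                continue
            tests.append(path + [event])       # abstract case: an event sequence
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, path + [event]))
    return tests

def concretize(fsm, events, start="OFF"):
    """Run a path through the model to obtain the expected output sequence."""
    state, outputs = start, []
    for e in events:
        state, out = fsm[(state, e)]
        outputs.append(out)
    return events, outputs
```

    Each abstract case is an event sequence covering one transition; running it through the model yields the expected outputs against which the implementation under test would be compared.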

  18. Quasi-Static Electric Field Generator

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R. (Inventor)

    2017-01-01

    A generator for producing an electric field for use with an inspection technology system is provided. The generator provides the required variable-magnitude quasi-static electric fields for the "illumination" of objects, areas and volumes to be inspected by the system, and produces human-safe electric fields that are visible only to the system. The generator includes a casing; a driven, non-conducting and triboelectrically neutral rotation shaft mounted therein; an ungrounded electrostatic dipole element which works in the quasi-static range; and a non-conducting support for mounting the dipole element to the shaft. The dipole element has a wireless motor system and a charging system which are wholly contained within the dipole element and the support, and which use an electrostatic approach to charge the dipole element.

  19. Interaction of Gortler vortices and Tollmien-Schlichting waves in curved channel flow

    NASA Technical Reports Server (NTRS)

    Daudpota, Q. Isa; Zang, Thomas A.; Hall, Philip

    1987-01-01

    The flow in a two-dimensional curved channel driven by an azimuthal pressure gradient can become linearly unstable due to axisymmetric perturbations and/or nonaxisymmetric perturbations depending on the curvature of the channel and the Reynolds number. For a particular small value of curvature, the critical Reynolds number for both these perturbations becomes identical. In the neighborhood of this curvature value and critical Reynolds number, nonlinear interactions occur between these perturbations. The Stuart-Watson approach is used to derive two coupled Landau equations for the amplitudes of these perturbations. The stability of the various possible states of these perturbations is shown through bifurcation diagrams. Emphasis is given to those cases which have relevance to external flows.
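
    The selection between such states can be made concrete by integrating a generic pair of coupled Landau equations directly. The coefficients below are arbitrary illustrative choices, not those derived in the paper.

```python
# Two coupled Landau equations of the generic form produced by a
# Stuart-Watson expansion (real amplitudes, made-up coefficients):
#   dA/dt = s1*A - A*(a11*A^2 + a12*B^2)
#   dB/dt = s2*B - B*(a21*A^2 + a22*B^2)
def evolve(s1=1.0, s2=0.5, a11=1.0, a12=1.5, a21=1.5, a22=1.0,
           A=0.1, B=0.1, dt=1e-3, steps=200_000):
    """Forward-Euler integration from small initial amplitudes."""
    for _ in range(steps):
        A2, B2 = A * A, B * B
        A += dt * A * (s1 - a11 * A2 - a12 * B2)
        B += dt * B * (s2 - a21 * A2 - a22 * B2)
    return A, B
```

    With these coefficients no mixed equilibrium exists and the pure first-mode state A^2 = s1/a11 = 1 attracts all positive initial conditions, so the integration settles to (A, B) close to (1, 0); changing the coefficients moves the bifurcation diagram between pure and mixed states.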

  20. Principled Principals? Values-Driven Leadership: Evidence from Ten Case Studies of 'Outstanding' School Leaders.

    ERIC Educational Resources Information Center

    Gold, Anne; Evans, Jennifer; Earley, Peter; Halpin, David; Collarbone, Patricia

    2003-01-01

    Case studies of English schools with "outstanding leaders" finds them avoiding doing "bastard leadership" by mediating government policy through their own values. Explores ways in which school leaders promote and encourage shared values. Discusses how these leaders enact values-driven leadership by, for example, developing…

  1. Arc-based smoothing of ion beam intensity on targets

    DOE PAGES

    Friedman, Alex

    2012-06-20

    Manipulating a set of ion beams upstream of a target makes it possible to arrange a smoother deposition pattern, so as to achieve more uniform illumination of the target. A uniform energy deposition pattern is important for applications including ion-beam-driven high energy density physics and heavy-ion beam-driven inertial fusion energy (“heavy-ion fusion”). Here, we consider an approach to such smoothing that is based on rapidly “wobbling” each of the beams back and forth along a short arc-shaped path, via oscillating fields applied upstream of the final pulse compression. In this technique, uniformity is achieved in the time-averaged sense; this is sufficient provided the beam oscillation timescale is short relative to the hydrodynamic timescale of the target implosion. This work builds on two earlier concepts: elliptical beams applied to a distributed-radiator target [D. A. Callahan and M. Tabak, Phys. Plasmas 7, 2083 (2000)] and beams that are wobbled so as to trace a number of full rotations around a circular or elliptical path [R. C. Arnold et al., Nucl. Instrum. Methods 199, 557 (1982)]. Here, we describe the arc-based smoothing approach and compare it to results obtainable using an elliptical-beam prescription. In particular, we assess the potential of these approaches for minimization of azimuthal asymmetry, for the case of a ring of beams arranged on a cone. We also found that, for small numbers of beams on the ring, the arc-based smoothing approach offers superior uniformity. In contrast with the full-rotation approach, arc-based smoothing remains usable when the geometry precludes wobbling the beams around a full circle, e.g., for the X-target [E. Henestroza, B. G. Logan, and L. J. Perkins, Phys. Plasmas 18, 032702 (2011)] and some classes of distributed-radiator targets.

  2. A more rational, theory-driven approach to analysing the factor structure of the Edinburgh Postnatal Depression Scale.

    PubMed

    Kozinszky, Zoltan; Töreki, Annamária; Hompoth, Emőke A; Dudas, Robert B; Németh, Gábor

    2017-04-01

    We endeavoured to analyze the factor structure of the Edinburgh Postnatal Depression Scale (EPDS) during a screening programme in Hungary, using exploratory (EFA) and confirmatory factor analysis (CFA), testing both previously published models and newly developed theory-driven ones, after a critical analysis of the literature. Between April 2011 and January 2015, a sample of 2967 pregnant women (between 12th and 30th weeks of gestation) and 714 women 6 weeks after delivery completed the Hungarian version of the EPDS in South-East Hungary. EFAs suggested unidimensionality in both samples. 33 out of 42 previously published models showed good and 6 acceptable fit with our antepartum data in CFAs, whilst 10 of them showed good and 28 acceptable fit in our postpartum sample. Using multiple fit indices, our theory-driven anhedonia (items 1,2) - anxiety (items 4,5) - low mood (items 8,9) model provided the best fit in the antepartum sample. In the postpartum sample, our theory-driven models were again among the best performing models, including an anhedonia and an anxiety factor together with either a low mood or a suicidal risk factor (items 3,6,10). The EPDS showed moderate within- and between-culture invariability, although this would also need to be re-examined with a theory-driven approach. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.

  3. Comparison of CRISPR/Cas9 expression constructs for efficient targeted mutagenesis in rice.

    PubMed

    Mikami, Masafumi; Toki, Seiichi; Endo, Masaki

    2015-08-01

    The CRISPR/Cas9 system is an efficient tool used for genome editing in a variety of organisms. Despite several recent reports of successful targeted mutagenesis using the CRISPR/Cas9 system in plants, in each case the target gene of interest, the Cas9 expression system and guide-RNA (gRNA) used, and the tissues used for transformation and subsequent mutagenesis differed, hence the reported frequencies of targeted mutagenesis cannot be compared directly. Here, we evaluated mutation frequency in rice using different Cas9 and/or gRNA expression cassettes under standardized experimental conditions. We introduced Cas9 and gRNA expression cassettes separately or sequentially into rice calli, and assessed the frequency of mutagenesis at the same endogenous targeted sequences. Mutation frequencies differed significantly depending on the Cas9 expression cassette used. In addition, a gRNA driven by the OsU6 promoter was superior to one driven by the OsU3 promoter. Using an all-in-one expression vector harboring the best combined Cas9/gRNA expression cassette resulted in a much improved frequency of targeted mutagenesis in rice calli, and bi-allelic mutant plants were produced in the T0 generation. The approach presented here could be adapted to optimize the construction of Cas9/gRNA cassettes for genome editing in a variety of plants.

  4. Quantitative and qualitative assessment of the bovine abortion surveillance system in France.

    PubMed

    Bronner, Anne; Gay, Emilie; Fortané, Nicolas; Palussière, Mathilde; Hendrikx, Pascal; Hénaux, Viviane; Calavas, Didier

    2015-06-01

    Bovine abortion is the main clinical sign of bovine brucellosis, a disease of which France has been declared officially free since 2005. To ensure the early detection of any brucellosis outbreak, event-driven surveillance relies on the mandatory notification of bovine abortions and the brucellosis testing of aborting cows. However, the under-reporting of abortions appears frequent. Our objectives were to assess the ability of the bovine abortion surveillance system to detect each and every bovine abortion and to identify factors influencing the system's effectiveness. We evaluated five attributes defined by the U.S. Centers for Disease Control with a method suited to each attribute: (1) data quality was studied quantitatively and qualitatively, as this factor considerably influences data analysis and results; (2) sensitivity and representativeness were estimated using a unilist capture-recapture approach to quantify the surveillance system's effectiveness; (3) acceptability and simplicity were studied through qualitative interviews of actors in the field, given that the surveillance system relies heavily on abortion notifications by farmers and veterinarians. Our analysis showed that (1) data quality was generally satisfactory even though some errors might be due to actors' lack of awareness of the need to collect accurate data; (2) from 2006 to 2011, the mean annual sensitivity - i.e. the proportion of farmers who reported at least one abortion out of all those who detected such events - was around 34%, but was significantly higher in dairy than beef cattle herds (highlighting a lack of representativeness); (3) overall, the system's low sensitivity was related to its low acceptability and lack of simplicity. This study showed that, in contrast to policy-makers, most farmers and veterinarians perceived the risk of a brucellosis outbreak as negligible. They did not consider sporadic abortions as a suspected case of brucellosis and usually reported abortions only to identify their cause rather than to rule out brucellosis. The system proved too complex, especially for beef cattle farmers, as they may fail to detect aborting cows at pasture or have difficulties catching them for sampling. By investigating critical attributes, our evaluation highlighted the surveillance system's strengths and needed improvements. We believe our comprehensive approach can be used to assess other event-driven surveillance systems. In addition, some of our recommendations on increasing the effectiveness of event-driven brucellosis surveillance may be useful in improving the notification rate for suspected cases of other exotic diseases. Copyright © 2015 Elsevier B.V. All rights reserved.
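
    The unilist capture-recapture idea can be illustrated with Chao's (1987) lower-bound estimator, which infers the number of never-reporting farms from the counts of farms reporting exactly once and exactly twice. This is a sketch of the general technique on assumed data, not necessarily the exact model the authors fitted.

```python
from collections import Counter

# Unilist capture-recapture sketch: estimate surveillance sensitivity from
# the distribution of notification counts per reporting farm.
def chao_sensitivity(reports_per_farm):
    counts = Counter(reports_per_farm)       # k notifications -> number of farms
    f1, f2 = counts.get(1, 0), counts.get(2, 0)
    n_observed = len(reports_per_farm)
    if f2 == 0:
        raise ValueError("Chao estimator needs farms with exactly two reports")
    n_total = n_observed + f1 * f1 / (2.0 * f2)   # Chao (1987) lower bound
    return n_observed / n_total
```

    Here "sensitivity" is the estimated proportion of farms that detected at least one abortion and notified at least once; for 100 single reporters, 50 double reporters and 10 triple reporters it comes out at about 62%.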

  5. Impact of real-time traffic characteristics on crash occurrence: Preliminary results of the case of rare events.

    PubMed

    Theofilatos, Athanasios; Yannis, George; Kopelias, Pantelis; Papadimitriou, Fanis

    2018-01-04

    Considerable efforts have been made by researchers and policy makers to explain road crash occurrence and improve the road safety performance of highways. However, there are cases when crashes are so few that they can be considered rare events. In such cases, the binary dependent variable is characterized by dozens to thousands of times fewer events (crashes) than non-events (non-crashes). This paper attempts to add to the current knowledge by investigating crash likelihood utilizing real-time traffic data and by proposing a framework driven by appropriate statistical models (bias correction and the Firth method) in order to overcome the problems that arise when the number of crashes is very low. Under this approach, instead of using traditional logistic regression methods, crashes are treated as rare events. To demonstrate this approach, traffic data were collected from three random loop detectors in the Attica Tollway ("Attiki Odos") located in the Greater Athens Area in Greece for the 2008-2011 period. The traffic dataset consists of hourly aggregated traffic data such as flow, occupancy, time mean speed and percentage of trucks in traffic. This study demonstrates the application and findings of our approach, which revealed a negative relationship between crash occurrence and speed at crash locations. The method and findings of the study attempt to provide insights into the mechanism of crash occurrence and to overcome data considerations for the first time in the safety evaluation of motorways. Copyright © 2017 Elsevier Ltd. All rights reserved.
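
    Firth's penalized likelihood is the standard remedy when events are rare enough to cause quasi-complete separation, under which ordinary maximum likelihood diverges. The following is a minimal numpy sketch of the general method, not the authors' exact specification or data.

```python
import numpy as np

# Firth-penalized logistic regression: Newton iteration on the score modified
# by the Jeffreys-prior correction, which keeps estimates finite under
# separation (Heinze & Schemper style; illustrative implementation).
def firth_logit(X, y, n_iter=50, tol=1e-8):
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        mu = 1.0 / (1.0 + np.exp(-(X @ beta)))
        W = mu * (1.0 - mu)
        XtWX = X.T @ (W[:, None] * X)            # Fisher information
        # Diagonal of the hat matrix H = W^(1/2) X (X'WX)^-1 X' W^(1/2)
        h = np.einsum("ij,ij->i", X @ np.linalg.inv(XtWX), X) * W
        # Firth-modified score: X' (y - mu + h*(1/2 - mu))
        score = X.T @ (y - mu + h * (0.5 - mu))
        step = np.linalg.solve(XtWX, score)
        beta += step
        if np.max(np.abs(step)) < tol:
            break
    return beta
```

    On perfectly separated data (e.g. a binary covariate that is 1 exactly when the crash indicator is 1) the ordinary logistic MLE is infinite, while the Firth estimates remain finite.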

  6. The nonlinear behavior of whistler waves at the reconnecting dayside magnetopause as observed by the Magnetospheric Multiscale mission: A case study

    NASA Astrophysics Data System (ADS)

    Wilder, F. D.; Ergun, R. E.; Newman, D. L.; Goodrich, K. A.; Trattner, K. J.; Goldman, M. V.; Eriksson, S.; Jaynes, A. N.; Leonard, T.; Malaspina, D. M.; Ahmadi, N.; Schwartz, S. J.; Burch, J. L.; Torbert, R. B.; Argall, M. R.; Giles, B. L.; Phan, T. D.; Le Contel, O.; Graham, D. B.; Khotyaintsev, Yu V.; Strangeway, R. J.; Russell, C. T.; Magnes, W.; Plaschke, F.; Lindqvist, P.-A.

    2017-05-01

    We show observations of whistler mode waves in both the low-latitude boundary layer (LLBL) and on closed magnetospheric field lines during a crossing of the dayside reconnecting magnetopause by the Magnetospheric Multiscale (MMS) mission on 11 October 2015. The whistlers in the LLBL were on the electron edge of the magnetospheric separatrix and exhibited high propagation angles with respect to the background field, approaching 40°, with bursty and nonlinear parallel electric field signatures. The whistlers in the closed magnetosphere had Poynting flux that was more field aligned. Comparing the reduced electron distributions for each event, the magnetospheric whistlers appear to be consistent with anisotropy-driven waves, while the distribution in the LLBL case includes anisotropic backward resonant electrons and a forward resonant beam at near half the electron-Alfvén speed. Results are compared with the previously published observations by MMS on 19 September 2015 of LLBL whistler waves. The observations suggest that whistlers in the LLBL can be both beam and anisotropy driven, and the relative contribution of each might depend on the distance from the X line.

  7. Using subject-specific three-dimensional (3D) anthropometry data in digital human modelling: case study in hand motion simulation.

    PubMed

    Tsao, Liuxing; Ma, Liang

    2016-11-01

    Digital human modelling enables ergonomists and designers to consider ergonomic concerns and design alternatives in a timely and cost-efficient manner in the early stages of design. However, the reliability of the simulation could be limited due to the percentile-based approach used in constructing the digital human model. To enhance the accuracy of the size and shape of the models, we proposed a framework to generate digital human models using three-dimensional (3D) anthropometric data. The 3D scan data from specific subjects' hands were segmented based on the estimated centres of rotation. The segments were then driven in forward kinematics to perform several functional postures. The constructed hand models were then verified, thereby validating the feasibility of the framework. The proposed framework helps generate accurate subject-specific digital human models, which can be utilised to guide product design and workspace arrangement. Practitioner Summary: Subject-specific digital human models can be constructed under the proposed framework based on three-dimensional (3D) anthropometry. This approach enables more reliable digital human simulation to guide product design and workspace arrangement.

  8. Matrix-product-operator approach to the nonequilibrium steady state of driven-dissipative quantum arrays

    NASA Astrophysics Data System (ADS)

    Mascarenhas, Eduardo; Flayac, Hugo; Savona, Vincenzo

    2015-08-01

    We develop a numerical procedure to efficiently model the nonequilibrium steady state of one-dimensional arrays of open quantum systems based on a matrix-product operator ansatz for the density matrix. The procedure searches for the null eigenvalue of the Liouvillian superoperator by sweeping along the system while carrying out a partial diagonalization of the single-site stationary problem. It bears full analogy to the density-matrix renormalization-group approach to the ground state of isolated systems, and its numerical complexity scales as a power law with the bond dimension. The method brings considerable advantage when compared to the integration of the time-dependent problem via Trotter decomposition, as it can address arbitrarily long-ranged couplings. Additionally, it ensures numerical stability in the case of weakly dissipative systems thanks to a slow tuning of the dissipation rates along the sweeps. We have tested the method on a driven-dissipative spin chain, under various assumptions for the Hamiltonian, drive, and dissipation parameters, and compared the results to those obtained both by Trotter dynamics and Monte Carlo wave function methods. Accurate and numerically stable convergence was always achieved when applying the method to systems with a gapped Liouvillian and a nondegenerate steady state.
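
    The target of the sweep, the null eigenvalue of the Liouvillian, can be made concrete on a single driven-dissipative two-level system, where the steady state is obtainable by dense diagonalization rather than an MPO ansatz. The operators and parameters below are an illustrative assumption, not the paper's model.

```python
import numpy as np

# Steady state of a coherently driven, dissipative qubit as the null
# eigenvector of the vectorized Liouvillian (row-major vec convention:
# vec(A rho B) = kron(A, B^T) vec(rho)).
def steady_state(omega=1.0, gamma=0.5):
    sm = np.array([[0.0, 1.0], [0.0, 0.0]])     # lowering operator
    sp = sm.T
    H = omega * (sm + sp)                        # resonant coherent drive
    I = np.eye(2)
    def left(A):  return np.kron(A, I)           # A rho
    def right(A): return np.kron(I, A.T)         # rho A
    L = (-1j * (left(H) - right(H))
         + gamma * (np.kron(sm, sm.conj())       # sm rho sp
                    - 0.5 * (left(sp @ sm) + right(sp @ sm))))
    vals, vecs = np.linalg.eig(L)
    rho = vecs[:, np.argmin(np.abs(vals))].reshape(2, 2)
    rho /= np.trace(rho)                         # normalize to unit trace
    return rho
```

    The eigenvector of L with eigenvalue closest to zero, reshaped and trace-normalized, is the steady-state density matrix; the MPO method finds the same object variationally for long chains where dense diagonalization is impossible.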

  9. The potential of targeting Ras proteins in lung cancer.

    PubMed

    McCormick, Frank

    2015-04-01

    The Ras pathway is a major driver in lung adenocarcinoma: over 75% of all cases harbor mutations that activate this pathway. While spectacular clinical successes have been achieved by targeting activated receptor tyrosine kinases in this pathway, little, if any, significant progress has been achieved targeting Ras proteins themselves or cancers driven by oncogenic Ras mutants. New approaches to drug discovery, new insights into Ras function, new ways of attacking undruggable proteins through RNA interference and new ways of harnessing the immune system could change this landscape in the relatively near future.

  10. Emergent Structural Mechanisms for High-Density Collective Motion Inspired by Human Crowds

    NASA Astrophysics Data System (ADS)

    Bottinelli, Arianna; Sumpter, David T. J.; Silverberg, Jesse L.

    2016-11-01

    Collective motion of large human crowds often depends on their density. In extreme cases like heavy metal concerts and Black Friday sales events, motion is dominated by physical interactions instead of conventional social norms. Here, we study an active matter model inspired by situations when large groups of people gather at a point of common interest. Our analysis takes an approach developed for jammed granular media and identifies Goldstone modes, soft spots, and stochastic resonance as structurally driven mechanisms for potentially dangerous emergent collective motion.

  11. Novel and emerging approaches to combat adolescent obesity

    PubMed Central

    Sharma, Manoj; Branscum, Paul

    2010-01-01

    Overweight and obesity continue to be health concerns facing today’s adolescent population. Along with metabolic and physical problems associated with obesity, today’s obese adolescents also face many psychological issues such as high rates of depression, anxiety, and social discrimination. Obesity is commonly recognized as having many causes, such as genetic, lifestyle and environmental. There are four major modalities for management of overweight and obesity in adolescents: dietary management, increasing physical activity, pharmacological therapy, and bariatric surgery. The purpose of this study was to conduct a review of novel and emerging approaches for preventing and managing adolescent obesity. It was found that while not always the case, theory driven approaches are being better utilized in newer interventions especially by those directed toward prevention. New theories that are being used are the theories of reasoned action, planned behavior, intervention mapping, and social marketing. Schools are found to be the most common place for such interventions, which is appropriate since virtually all children attend some form of private or public school. Limitations found in many studies include the underuse of process evaluations, the low number of studies attempted, environmental or policy changes, and that not all studies used a similar control group for comparison. PMID:24600257

  12. Novel and emerging approaches to combat adolescent obesity.

    PubMed

    Sharma, Manoj; Branscum, Paul

    2010-01-01

    Overweight and obesity continue to be health concerns facing today's adolescent population. Along with metabolic and physical problems associated with obesity, today's obese adolescents also face many psychological issues such as high rates of depression, anxiety, and social discrimination. Obesity is commonly recognized as having many causes, such as genetic, lifestyle and environmental. There are four major modalities for management of overweight and obesity in adolescents: dietary management, increasing physical activity, pharmacological therapy, and bariatric surgery. The purpose of this study was to conduct a review of novel and emerging approaches for preventing and managing adolescent obesity. It was found that while not always the case, theory driven approaches are being better utilized in newer interventions especially by those directed toward prevention. New theories that are being used are the theories of reasoned action, planned behavior, intervention mapping, and social marketing. Schools are found to be the most common place for such interventions, which is appropriate since virtually all children attend some form of private or public school. Limitations found in many studies include the underuse of process evaluations, the low number of studies attempted, environmental or policy changes, and that not all studies used a similar control group for comparison.

  13. Learning and Information Approaches for Inference in Dynamic Data-Driven Geophysical Applications

    NASA Astrophysics Data System (ADS)

    Ravela, S.

    2015-12-01

    Many geophysical inference problems are characterized by non-linear processes, high-dimensional models and complex uncertainties. A dynamic coupling between models, estimation, and sampling is typically sought to efficiently characterize and reduce uncertainty. This process is however fraught with several difficulties, chief among them the ability to deal with model errors and the efficacy of uncertainty quantification and data assimilation. In this presentation, we present three key ideas from learning and intelligent systems theory and apply them to two geophysical applications. The first idea is the use of ensemble learning to compensate for model error, the second is to develop tractable information-theoretic learning to deal with non-Gaussianity in inference, and the third is a manifold resampling technique for effective uncertainty quantification. First, we apply these methods to the development of a cooperative autonomous observing system using sUAS for studying coherent structures. Second, we apply them to the problem of quantifying risk from hurricanes and storm surges in a changing climate. Results indicate that learning approaches can enable new effectiveness in cases where standard approaches to model reduction, uncertainty quantification and data assimilation fail.

  14. Gene Mutations and Genomic Rearrangements in the Mouse as a Result of Transposon Mobilization from Chromosomal Concatemers

    PubMed Central

    Geurts, Aron M; Collier, Lara S; Geurts, Jennifer L; Oseth, Leann L; Bell, Matthew L; Mu, David; Lucito, Robert; Godbout, Susan A; Green, Laura E; Lowe, Scott W; Hirsch, Betsy A; Leinwand, Leslie A; Largaespada, David A

    2006-01-01

    Previous studies of the Sleeping Beauty (SB) transposon system, as an insertional mutagen in the germline of mice, have used reverse genetic approaches. These studies have led to its proposed use for regional saturation mutagenesis by taking a forward-genetic approach. Thus, we used the SB system to mutate a region of mouse Chromosome 11 in a forward-genetic screen for recessive lethal and viable phenotypes. This work represents the first reported use of an insertional mutagen in a phenotype-driven approach. The phenotype-driven approach was successful in both recovering visible and behavioral mutants, including dominant limb and recessive behavioral phenotypes, and allowing for the rapid identification of candidate gene disruptions. In addition, a high frequency of recessive lethal mutations arose as a result of genomic rearrangements near the site of transposition, resulting from transposon mobilization. The results suggest that the SB system could be used in a forward-genetic approach to recover interesting phenotypes, but that local chromosomal rearrangements should be anticipated in conjunction with single-copy, local transposon insertions in chromosomes. Additionally, these mice may serve as a model for chromosome rearrangements caused by transposable elements during the evolution of vertebrate genomes. PMID:17009875

  15. Role of electromagnetic wave in mode selection of magnetically driven instabilities

    NASA Astrophysics Data System (ADS)

    Dan, J. K.; Ren, X. D.; Duan, S. C.; Ouyang, K.; Chen, G. H.; Huang, X. B.

    2014-12-01

    The fundamental wavelength of the instability along two 25-μm-diameter aluminum wires using a 100 ns rise time, 220 kA pulsed power facility is measured for two different load configurations. In one case the wires are perpendicular to the end surface of the electrodes, and in the other the wires are oblique to the electrodes' end surface. The primary diagnostic used to measure the time evolution of instability wavelength and amplitude is laser shadowgraphy. The end surface of the electrodes appears to be responsible for the differences in the dominant instability wavelength between the two load configurations. The experimental result that the fundamental wavelength in the oblique case is about one half of that in the perpendicular case indicates that ionic electromagnetic waves may play a key role in mode selection of magnetically driven instabilities. Conclusions drawn from this paper may help explain why instabilities along wires manifest themselves as a quasiperiodic pattern.

  16. QA-driven Guidelines Generation for Bacteriotherapy

    PubMed Central

    Pasche, Emilie; Teodoro, Douglas; Gobeill, Julien; Ruch, Patrick; Lovis, Christian

    2009-01-01

    PURPOSE: We propose a question-answering (QA) driven generation approach for automatic acquisition of structured rules that can be used in a knowledge authoring tool for antibiotic prescription guidelines management. METHODS: The rule generation is seen as a question-answering problem, where the parameters of the questions are known items of the rule (e.g. an infectious disease, caused by a given bacterium) and answers (e.g. some antibiotics) are obtained by a question-answering engine. RESULTS: When looking for a drug given a pathogen and a disease, top precision of 0.55 is obtained by the combination of the Boolean engine (PubMed) and the relevance-driven engine (easyIR), which means that for more than half of our evaluation benchmark at least one of the recommended antibiotics was automatically acquired by the rule generation method. CONCLUSION: These results suggest that such an automatic text mining approach could provide a useful tool for guidelines management, by improving knowledge update and discovery. PMID:20351908

  17. A biomechanical modeling-guided simultaneous motion estimation and image reconstruction technique (SMEIR-Bio) for 4D-CBCT reconstruction

    NASA Astrophysics Data System (ADS)

    Huang, Xiaokun; Zhang, You; Wang, Jing

    2018-02-01

    Reconstructing four-dimensional cone-beam computed tomography (4D-CBCT) images directly from respiratory phase-sorted traditional 3D-CBCT projections can capture target motion trajectory, reduce motion artifacts, and reduce imaging dose and time. However, the limited numbers of projections in each phase after phase-sorting decreases CBCT image quality under traditional reconstruction techniques. To address this problem, we developed a simultaneous motion estimation and image reconstruction (SMEIR) algorithm, an iterative method that can reconstruct higher quality 4D-CBCT images from limited projections using an inter-phase intensity-driven motion model. However, the accuracy of the intensity-driven motion model is limited in regions with fine details whose quality is degraded due to insufficient projection number, which consequently degrades the reconstructed image quality in corresponding regions. In this study, we developed a new 4D-CBCT reconstruction algorithm by introducing biomechanical modeling into SMEIR (SMEIR-Bio) to boost the accuracy of the motion model in regions with small fine structures. The biomechanical modeling uses tetrahedral meshes to model organs of interest and solves internal organ motion using tissue elasticity parameters and mesh boundary conditions. This physics-driven approach enhances the accuracy of solved motion in the organ’s fine structures regions. This study used 11 lung patient cases to evaluate the performance of SMEIR-Bio, making both qualitative and quantitative comparisons between SMEIR-Bio, SMEIR, and the algebraic reconstruction technique with total variation regularization (ART-TV). The reconstruction results suggest that SMEIR-Bio improves the motion model’s accuracy in regions containing small fine details, which consequently enhances the accuracy and quality of the reconstructed 4D-CBCT images.

  18. Metropolitan Transportation Commission, San Francisco Bay area : developing regional objectives and performance measures to improve system operations

    DOT National Transportation Integrated Search

    2009-04-01

    The Metropolitan Transportation Commission (MTC) uses an objectives-driven, performance-based approach in its transportation planning for the San Francisco Bay Area. This approach focuses attention on transportation investments of highest priority. T...

  19. An update on vulvar intraepithelial neoplasia: terminology and a practical approach to diagnosis.

    PubMed

    Reyes, M Carolina; Cooper, Kumarasen

    2014-04-01

There are two distinct types of vulvar intraepithelial neoplasia (VIN), which differ in their clinical presentation, aetiology, pathogenesis and histological/immunophenotypical features. One form, driven by high-risk human papillomavirus (HPV) infection, usually occurs in young women and has been termed classic or usual VIN (uVIN). The other, not related to viral infection, occurs in postmenopausal women with chronic skin conditions such as lichen sclerosus and lichen simplex chronicus, and is termed differentiated or simplex-type VIN. The latter is the precursor lesion of the most common type of squamous cell carcinoma (SCC) of the vulva, namely keratinizing SCC (representing 60% of cases). In contrast, uVIN usually gives rise to basaloid or warty SCC (40% of cases). The histological features of uVIN are similar to those of high-grade lesions encountered at other lower anogenital tract sites (hyperchromatic nuclei with high nuclear-to-cytoplasmic ratios and increased mitotic activity). However, differentiated VIN shows very subtle histopathological changes and often escapes diagnosis. Since uVIN is driven by high-risk HPV infection, p16 immunohistochemistry is diffusely positive in these lesions, which are also characterized by a high Ki-67 proliferation index. In contrast, differentiated or simplex-type VIN is consistently negative for p16, and the majority of cases harbour TP53 mutations, correlating with p53 positivity by immunohistochemistry.

  20. Integrating Undergraduate Students in Faculty-Driven Motor Behavior Research

    ERIC Educational Resources Information Center

    Robinson, Leah E.

    2013-01-01

This article describes the faculty-sponsored, faculty-driven approach to undergraduate research (UGR) at Auburn University. This approach is centered on research in the Pediatric Movement and Physical Activity Laboratory, and students can earn elective course credit for their participation in UGR. The article also describes how students' roles…
