Sample records for standard model framework

  1. Advances in Geoscience Modeling: Smart Modeling Frameworks, Self-Describing Models and the Role of Standardized Metadata

    NASA Astrophysics Data System (ADS)

    Peckham, Scott

    2016-04-01

    Over the last decade, model coupling frameworks like CSDMS (Community Surface Dynamics Modeling System) and ESMF (Earth System Modeling Framework) have developed mechanisms that make it much easier for modelers to connect heterogeneous sets of process models in a plug-and-play manner to create composite "system models". These mechanisms greatly simplify code reuse, but must simultaneously satisfy many different design criteria. They must be able to mediate or compensate for differences between the process models, such as their different programming languages, computational grids, time-stepping schemes, variable names and variable units. However, they must achieve this interoperability in a way that: (1) is noninvasive, requiring only relatively small and isolated changes to the original source code, (2) does not significantly reduce performance, (3) is not time-consuming or confusing for a model developer to implement, (4) can very easily be updated to accommodate new versions of a given process model and (5) does not shift the burden of providing model interoperability to the model developers. In tackling these design challenges, model framework developers have learned that the best solution is to provide each model with a simple, standardized interface, i.e. a set of standardized functions that make the model: (1) fully-controllable by a caller (e.g. a model framework) and (2) self-describing with standardized metadata. Model control functions are separate functions that allow a caller to initialize the model, advance the model's state variables in time and finalize the model. Model description functions allow a caller to retrieve detailed information on the model's input and output variables, its computational grid and its timestepping scheme. If the caller is a modeling framework, it can use the self description functions to learn about each process model in a collection to be coupled and then automatically call framework service components (e.g. regridders, time interpolators and unit converters) as necessary to mediate the differences between them so they can work together. This talk will first review two key products of the CSDMS project, namely a standardized model interface called the Basic Model Interface (BMI) and the CSDMS Standard Names. The standard names are used in conjunction with BMI to provide a semantic matching mechanism that allows output variables from one process model or data set to be reliably used as input variables to other process models in a collection. They include not just a standardized naming scheme for model variables, but also a standardized set of terms for describing the attributes and assumptions of a given model. Recent efforts to bring powerful uncertainty analysis and inverse modeling toolkits such as DAKOTA into modeling frameworks will also be described. This talk will conclude with an overview of several related modeling projects that have been funded by NSF's EarthCube initiative, namely the Earth System Bridge, OntoSoft and GeoSemantics projects.
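
    As a concrete sketch of what such a standardized interface looks like, the toy Python component below exposes BMI-style control and description functions (the class, its variables and its standard names are illustrative assumptions, not part of the published BMI specification):

      import numpy as np

      class HeatModel:
          """Toy diffusion component exposing BMI-style control and description functions."""

          def initialize(self, cfg_file=None):     # control: set up model state
              self.dt = 1.0
              self.time = 0.0
              self.temperature = np.zeros((10, 10))

          def update(self):                        # control: advance state one time step
              self.time += self.dt

          def finalize(self):                      # control: release resources
              self.temperature = None

          # --- description functions: let a caller interrogate the model ---
          def get_output_var_names(self):
              return ("land_surface__temperature",)   # CSDMS-style standard name

          def get_input_var_names(self):
              return ("land_surface_air__temperature",)

          def get_time_step(self):
              return self.dt

          def get_value(self, name):
              return self.temperature

          def set_value(self, name, values):
              self.temperature[:] = values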

  2. EarthCube - Earth System Bridge: Spanning Scientific Communities with Interoperable Modeling Frameworks

    NASA Astrophysics Data System (ADS)

    Peckham, S. D.; DeLuca, C.; Gochis, D. J.; Arrigo, J.; Kelbert, A.; Choi, E.; Dunlap, R.

    2014-12-01

    In order to better understand and predict environmental hazards of weather/climate, ecology and deep earth processes, geoscientists develop and use physics-based computational models. These models are used widely both in academic and federal communities. Because of the large effort required to develop and test models, there is widespread interest in component-based modeling, which promotes model reuse and simplified coupling to tackle problems that often cross discipline boundaries. In component-based modeling, the goal is to make relatively small changes to models that make it easy to reuse them as "plug-and-play" components. Sophisticated modeling frameworks exist to rapidly couple these components to create new composite models. They allow component models to exchange variables while accommodating different programming languages, computational grids, time-stepping schemes, variable names and units. Modeling frameworks have arisen in many modeling communities. CSDMS (Community Surface Dynamics Modeling System) serves the academic earth surface process dynamics community, while ESMF (Earth System Modeling Framework) serves many federal Earth system modeling projects. Others exist in both the academic and federal domains and each satisfies design criteria that are determined by the community they serve. While they may use different interface standards or semantic mediation strategies, they share fundamental similarities. The purpose of the Earth System Bridge project is to develop mechanisms for interoperability between modeling frameworks, such as the ability to share a model or service component. This project has three main goals: (1) Develop a Framework Description Language (ES-FDL) that allows modeling frameworks to be described in a standard way so that their differences and similarities can be assessed. (2) Demonstrate that if a model is augmented with a framework-agnostic Basic Model Interface (BMI), then simple, universal adapters can go from BMI to a modeling framework's native component interface. (3) Create semantic mappings between modeling frameworks that support semantic mediation. This third goal involves creating a crosswalk between the CF Standard Names and the CSDMS Standard Names (a set of naming conventions). This talk will summarize progress towards these goals.
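
    As a sketch of the second goal, the adapter below wraps any BMI-style component behind a hypothetical framework-native run interface, and a small crosswalk table stands in for the CF-to-CSDMS semantic mapping (the class, method names and name pairs are illustrative assumptions, not the project's published artifacts):

      # Illustrative crosswalk between naming conventions (not the official mapping).
      CF_TO_CSDMS = {
          "surface_runoff_flux": "land_surface_water__runoff_volume_flux",
          "precipitation_flux": "atmosphere_water__precipitation_leq-volume_flux",
      }

      class NativeComponentAdapter:
          """Adapts a BMI-style model to a hypothetical framework-native interface."""

          def __init__(self, bmi_model):
              self.model = bmi_model

          def run(self, start_time, stop_time):
              # Drive the component through its standardized control functions.
              self.model.initialize()
              t = start_time
              while t < stop_time:
                  self.model.update()
                  t += self.model.get_time_step()
              self.model.finalize()

          def get_field(self, cf_name):
              # Semantic mediation: translate the framework's CF-style name
              # into the model's CSDMS name before querying the component.
              return self.model.get_value(CF_TO_CSDMS[cf_name])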

  3. Smart Frameworks and Self-Describing Models: Model Metadata for Automated Coupling of Hydrologic Process Components (Invited)

    NASA Astrophysics Data System (ADS)

    Peckham, S. D.

    2013-12-01

    Model coupling frameworks like CSDMS (Community Surface Dynamics Modeling System) and ESMF (Earth System Modeling Framework) have developed mechanisms that allow heterogeneous sets of process models to be assembled in a plug-and-play manner to create composite "system models". These mechanisms facilitate code reuse, but must simultaneously satisfy many different design criteria. They must be able to mediate or compensate for differences between the process models, such as their different programming languages, computational grids, time-stepping schemes, variable names and variable units. However, they must achieve this interoperability in a way that: (1) is noninvasive, requiring only relatively small and isolated changes to the original source code, (2) does not significantly reduce performance, (3) is not time-consuming or confusing for a model developer to implement, (4) can very easily be updated to accommodate new versions of a given process model and (5) does not shift the burden of providing model interoperability to the model developers, e.g. by requiring them to provide their output in specific forms that meet the input requirements of other models. In tackling these design challenges, model framework developers have learned that the best solution is to provide each model with a simple, standardized interface, i.e. a set of standardized functions that make the model: (1) fully-controllable by a caller (e.g. a model framework) and (2) self-describing. Model control functions are separate functions that allow a caller to initialize the model, advance the model's state variables in time and finalize the model. Model description functions allow a caller to retrieve detailed information on the model's input and output variables, its computational grid and its timestepping scheme. If the caller is a modeling framework, it can compare the answers to these queries with similar answers from other process models in a collection and then automatically call framework service components as necessary to mediate the differences between the coupled models. This talk will first review two key products of the CSDMS project, namely a standardized model interface called the Basic Model Interface (BMI) and the CSDMS Standard Names. The standard names are used in conjunction with BMI to provide a semantic matching mechanism that allows output variables from one process model to be reliably used as input variables to other process models in a collection. They include not just a standardized naming scheme for model variables, but also a standardized set of terms for describing the attributes and assumptions of a given model. To illustrate the power of standardized model interfaces and metadata, a smart, light-weight modeling framework written in Python will be introduced that can automatically (without user intervention) couple a set of BMI-enabled hydrologic process components together to create a spatial hydrologic model. The same mechanisms could also be used to provide seamless integration (import/export) of data and models.
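
    A sketch of the automatic matching step such a framework performs, assuming each component exposes BMI-style name queries as in the sketch above (the function and its behavior are illustrative; the real framework also mediates grids, time steps and units):

      def wire_components(components):
          """Connect BMI-style components by matching standard names of outputs to inputs."""
          providers = {}
          for comp in components:
              for name in comp.get_output_var_names():
                  providers[name] = comp          # exact semantic match on standard name

          couplings = []
          for consumer in components:
              for name in consumer.get_input_var_names():
                  producer = providers.get(name)
                  if producer is not None and producer is not consumer:
                      couplings.append((producer, name, consumer))
                  # unmatched names would be routed to a service component
                  # (regridder, unit converter) or to an external data source
          return couplings

      # At run time the framework repeatedly executes, for each coupling:
      #   consumer.set_value(name, producer.get_value(name))
      # before advancing each component with update().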

  4. EPA'S NEW EMISSIONS MODELING FRAMEWORK

    EPA Science Inventory

    EPA's Office of Air Quality Planning and Standards is building a new Emissions Modeling Framework that will solve many of the long-standing difficulties of emissions modeling. The goals of the Framework are to (1) prevent bottlenecks and errors caused by emissions modeling activi...

  5. Health level 7 development framework for medication administration.

    PubMed

    Kim, Hwa Sun; Cho, Hune

    2009-01-01

    We propose the creation of a standard data model for medication administration activities through the development of a clinical document architecture using the Health Level 7 Development Framework process based on an object-oriented analysis and the development method of Health Level 7 Version 3. Medication administration is the most common activity performed by clinical professionals in healthcare settings. A standardized information model and structured hospital information system are necessary to achieve evidence-based clinical activities. A virtual scenario is used to demonstrate the proposed method of administering medication. We used the Health Level 7 Development Framework and other tools to create the clinical document architecture, which allowed us to illustrate each step of the Health Level 7 Development Framework in the administration of medication. We generated an information model of the medication administration process as one clinical activity. It should become a fundamental conceptual model for understanding international-standard methodology by healthcare professionals and nursing practitioners with the objective of modeling healthcare information systems.

  6. The Biosurveillance Analytics Resource Directory (BARD): Facilitating the use of epidemiological models for infectious disease surveillance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Margevicius, Kristen J.; Generous, Nicholas; Abeyta, Esteban

    Epidemiological modeling for infectious disease is important for disease management and its routine implementation needs to be facilitated through better description of models in an operational context. A standardized model characterization process that allows selection or making manual comparisons of available models and their results is currently lacking. A key need is a universal framework to facilitate model description and understanding of its features. Los Alamos National Laboratory (LANL) has developed a comprehensive framework that can be used to characterize an infectious disease model in an operational context. The framework was developed through a consensus among a panel of subject matter experts. In this paper, we describe the framework, its application to model characterization, and the development of the Biosurveillance Analytics Resource Directory (BARD; http://brd.bsvgateway.org/brd/), to facilitate the rapid selection of operational models for specific infectious/communicable diseases. We offer this framework and associated database to stakeholders of the infectious disease modeling field as a tool for standardizing model description and facilitating the use of epidemiological models.

  7. The Biosurveillance Analytics Resource Directory (BARD): Facilitating the Use of Epidemiological Models for Infectious Disease Surveillance

    PubMed Central

    Margevicius, Kristen J; Generous, Nicholas; Abeyta, Esteban; Althouse, Ben; Burkom, Howard; Castro, Lauren; Daughton, Ashlynn; Del Valle, Sara Y.; Fairchild, Geoffrey; Hyman, James M.; Kiang, Richard; Morse, Andrew P.; Pancerella, Carmen M.; Pullum, Laura; Ramanathan, Arvind; Schlegelmilch, Jeffrey; Scott, Aaron; Taylor-McCabe, Kirsten J; Vespignani, Alessandro; Deshpande, Alina

    2016-01-01

    Epidemiological modeling for infectious disease is important for disease management and its routine implementation needs to be facilitated through better description of models in an operational context. A standardized model characterization process that allows selection or making manual comparisons of available models and their results is currently lacking. A key need is a universal framework to facilitate model description and understanding of its features. Los Alamos National Laboratory (LANL) has developed a comprehensive framework that can be used to characterize an infectious disease model in an operational context. The framework was developed through a consensus among a panel of subject matter experts. In this paper, we describe the framework, its application to model characterization, and the development of the Biosurveillance Analytics Resource Directory (BARD; http://brd.bsvgateway.org/brd/), to facilitate the rapid selection of operational models for specific infectious/communicable diseases. We offer this framework and associated database to stakeholders of the infectious disease modeling field as a tool for standardizing model description and facilitating the use of epidemiological models. PMID:26820405

  8. The Biosurveillance Analytics Resource Directory (BARD): Facilitating the Use of Epidemiological Models for Infectious Disease Surveillance.

    PubMed

    Margevicius, Kristen J; Generous, Nicholas; Abeyta, Esteban; Althouse, Ben; Burkom, Howard; Castro, Lauren; Daughton, Ashlynn; Del Valle, Sara Y; Fairchild, Geoffrey; Hyman, James M; Kiang, Richard; Morse, Andrew P; Pancerella, Carmen M; Pullum, Laura; Ramanathan, Arvind; Schlegelmilch, Jeffrey; Scott, Aaron; Taylor-McCabe, Kirsten J; Vespignani, Alessandro; Deshpande, Alina

    2016-01-01

    Epidemiological modeling for infectious disease is important for disease management and its routine implementation needs to be facilitated through better description of models in an operational context. A standardized model characterization process that allows selection or making manual comparisons of available models and their results is currently lacking. A key need is a universal framework to facilitate model description and understanding of its features. Los Alamos National Laboratory (LANL) has developed a comprehensive framework that can be used to characterize an infectious disease model in an operational context. The framework was developed through a consensus among a panel of subject matter experts. In this paper, we describe the framework, its application to model characterization, and the development of the Biosurveillance Analytics Resource Directory (BARD; http://brd.bsvgateway.org/brd/), to facilitate the rapid selection of operational models for specific infectious/communicable diseases. We offer this framework and associated database to stakeholders of the infectious disease modeling field as a tool for standardizing model description and facilitating the use of epidemiological models.

  9. The Biosurveillance Analytics Resource Directory (BARD): Facilitating the use of epidemiological models for infectious disease surveillance

    DOE PAGES

    Margevicius, Kristen J.; Generous, Nicholas; Abeyta, Esteban; ...

    2016-01-28

    Epidemiological modeling for infectious disease is important for disease management and its routine implementation needs to be facilitated through better description of models in an operational context. A standardized model characterization process that allows selection or making manual comparisons of available models and their results is currently lacking. A key need is a universal framework to facilitate model description and understanding of its features. Los Alamos National Laboratory (LANL) has developed a comprehensive framework that can be used to characterize an infectious disease model in an operational context. The framework was developed through a consensus among a panel of subject matter experts. In this paper, we describe the framework, its application to model characterization, and the development of the Biosurveillance Analytics Resource Directory (BARD; http://brd.bsvgateway.org/brd/), to facilitate the rapid selection of operational models for specific infectious/communicable diseases. We offer this framework and associated database to stakeholders of the infectious disease modeling field as a tool for standardizing model description and facilitating the use of epidemiological models.

  10. Teaching Scientific Practices: Meeting the Challenge of Change

    ERIC Educational Resources Information Center

    Osborne, Jonathan

    2014-01-01

    This paper provides a rationale for the changes advocated by the Framework for K-12 Science Education and the Next Generation Science Standards. It provides an argument for why the model embedded in the Next Generation Science Standards is seen as an improvement. The case made here is that the underlying model that the new Framework presents of…

  11. Higher order QCD predictions for associated Higgs production with anomalous couplings to gauge bosons

    NASA Astrophysics Data System (ADS)

    Mimasu, Ken; Sanz, Verónica; Williams, Ciaran

    2016-08-01

    We present predictions for the associated production of a Higgs boson at NLO+PS accuracy, including the effect of anomalous interactions between the Higgs and gauge bosons. We present our results in different frameworks, one in which the interaction vertex between the Higgs boson and Standard Model W and Z bosons is parameterized in terms of general Lorentz structures, and one in which electroweak symmetry breaking is manifestly linear and the resulting operators arise through a dimension-six effective field theory framework. We present analytic calculations of the Standard Model and Beyond the Standard Model contributions, and discuss the phenomenological impact of the higher order pieces. Our results are implemented in the NLO Monte Carlo program MCFM, and interfaced to shower Monte Carlos through the POWHEG BOX framework.
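
    A common general parameterization of the HVV interaction vertex of the kind referred to here (V = W, Z with four-momenta k_1, k_2; the normalization and coefficient names are schematic, since conventions differ between papers) is

      \Gamma^{\mu\nu}(k_1, k_2) = g\, m_V \left[ a_1\, g^{\mu\nu}
          + a_2\, \frac{(k_1 \cdot k_2)\, g^{\mu\nu} - k_1^{\nu} k_2^{\mu}}{m_V^2}
          + a_3\, \frac{\epsilon^{\mu\nu\rho\sigma} k_{1\rho} k_{2\sigma}}{m_V^2} \right],

    where the Standard Model is recovered for a_1 = 1 and a_2 = a_3 = 0; a_2 parameterizes the CP-even higher-derivative structure and a_3 the CP-odd one.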

  12. Frequency Spectrum Neutrality Tests: One for All and All for One

    PubMed Central

    Achaz, Guillaume

    2009-01-01

    Neutrality tests based on the frequency spectrum (e.g., Tajima's D or Fu and Li's F) are commonly used by population geneticists as routine tests to assess the goodness-of-fit of the standard neutral model on their data sets. Here, I show that these neutrality tests are specific instances of a general model that encompasses them all. I illustrate how this general framework can be taken advantage of to devise new more powerful tests that better detect deviations from the standard model. Finally, I exemplify the usefulness of the framework on SNP data by showing how it supports the selection hypothesis in the lactase human gene by overcoming the ascertainment bias. The framework presented here paves the way for constructing novel tests optimized for specific violations of the standard model that ultimately will help to unravel scenarios of evolution. PMID:19546320
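
    The unifying construction can be stated compactly. Writing \xi_i for the number of sites at derived-allele frequency i in a sample of size n, a family of estimators of \theta is obtained from weighted sums of the frequency spectrum, and the classical tests are standardized differences of two such estimators (weights shown schematically; the exact normalizations and variance terms are given in the paper):

      \hat{\theta}_{\omega} = \sum_{i=1}^{n-1} \omega_i \, i \, \xi_i ,
      \qquad
      T = \frac{\hat{\theta}_{\omega_1} - \hat{\theta}_{\omega_2}}
               {\sqrt{\widehat{\mathrm{Var}}\left(\hat{\theta}_{\omega_1} - \hat{\theta}_{\omega_2}\right)}} .

    Tajima's D is the instance with \hat{\theta}_{\omega_1} the pairwise-difference estimator \hat{\theta}_{\pi} and \hat{\theta}_{\omega_2} Watterson's \hat{\theta}_W, while Fu and Li's tests use weights that emphasize singletons.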

  13. A UML profile for framework modeling.

    PubMed

    Xu, Xiao-liang; Wang, Le-yu; Zhou, Hong

    2004-01-01

    The current standard Unified Modeling Language (UML) could not model framework flexibility and extendability adequately due to a lack of appropriate constructs to distinguish framework hot-spots from kernel elements. A new UML profile that may customize UML for framework modeling was presented using the extension mechanisms of UML, providing a group of UML extensions to meet the needs of framework modeling. In this profile, the extended class diagrams and sequence diagrams were defined to straightforwardly identify the hot-spots and describe their instantiation restrictions. A transformation model based on design patterns was also put forward, such that the profile-based framework design diagrams could be automatically mapped to the corresponding implementation diagrams. It was proved that the presented profile makes framework modeling more straightforward and therefore easier to understand and instantiate.

  14. European standardization effort: interworking the goal

    NASA Astrophysics Data System (ADS)

    Mattheus, Rudy A.

    1993-09-01

    Within the European Standardization Committee (CEN), the technical committee responsible for standardization activities in Medical Informatics (CEN TC 251) has agreed upon the directions and scopes to follow in this field. They are described in the Directory of the European Standardization Requirements for Healthcare Informatics and Programme for the Development of Standards, adopted on 02-28-1991 by CEN/TC 251 and approved by CEN/BT. Top-down objectives describe the common framework and items such as terminology and security, while more bottom-up oriented items describe fields such as medical imaging and multimedia. The draft standard is described: the general framework model and object-oriented model, the interworking aspects, the relation to ISO standards, and the DICOM proposal. This paper also focuses on the boundaries in the standardization work, which also influence the standardization process.

  15. Toward a consistent modeling framework to assess multi-sectoral climate impacts.

    PubMed

    Monier, Erwan; Paltsev, Sergey; Sokolov, Andrei; Chen, Y-H Henry; Gao, Xiang; Ejaz, Qudsia; Couzo, Evan; Schlosser, C Adam; Dutkiewicz, Stephanie; Fant, Charles; Scott, Jeffery; Kicklighter, David; Morris, Jennifer; Jacoby, Henry; Prinn, Ronald; Haigh, Martin

    2018-02-13

    Efforts to estimate the physical and economic impacts of future climate change face substantial challenges. To enrich the currently popular approaches to impact analysis, which involve evaluation of a damage function or multi-model comparisons based on a limited number of standardized scenarios, we propose integrating a geospatially resolved physical representation of impacts into a coupled human-Earth system modeling framework. Large internationally coordinated exercises cannot easily respond to new policy targets and the implementation of standard scenarios across models, institutions and research communities can yield inconsistent estimates. Here, we argue for a shift toward the use of a self-consistent integrated modeling framework to assess climate impacts, and discuss ways the integrated assessment modeling community can move in this direction. We then demonstrate the capabilities of such a modeling framework by conducting a multi-sectoral assessment of climate impacts under a range of consistent and integrated economic and climate scenarios that are responsive to new policies and business expectations.

  16. Predictive Models and Computational Embryology

    EPA Science Inventory

    EPA’s ‘virtual embryo’ project is building an integrative systems biology framework for predictive models of developmental toxicity. One schema involves a knowledge-driven adverse outcome pathway (AOP) framework utilizing information from public databases, standardized ontologies...

  17. Calibration and Propagation of Uncertainty for Independence

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holland, Troy Michael; Kress, Joel David; Bhat, Kabekode Ghanasham

    This document reports on progress and methods for the calibration and uncertainty quantification of the Independence model developed at UT Austin. The Independence model is an advanced thermodynamic and process model framework for piperazine solutions as a high-performance CO2 capture solvent. Progress is presented within the CCSI standard basic data model inference framework. Recent work has largely focused on the thermodynamic submodels of Independence.

  18. Predictive Models and Computational Toxicology (II IBAMTOX)

    EPA Science Inventory

    EPA’s ‘virtual embryo’ project is building an integrative systems biology framework for predictive models of developmental toxicity. One schema involves a knowledge-driven adverse outcome pathway (AOP) framework utilizing information from public databases, standardized ontologies...

  19. Progressive Education Standards: A Neuroscience Framework

    ERIC Educational Resources Information Center

    O'Grady, Patty

    2011-01-01

    This paper proposes a coherent and unique set of 12 standards, adopting a neuroscience framework for biologically based school reform. This model of educational principles and practices aligns with the long-standing principles and practices of the Progressive Education Movement in the United States and the emerging principles of neuroscience.…

  20. SAS- Semantic Annotation Service for Geoscience resources on the web

    NASA Astrophysics Data System (ADS)

    Elag, M.; Kumar, P.; Marini, L.; Li, R.; Jiang, P.

    2015-12-01

    There is a growing need for increased integration across the data and model resources that are disseminated on the web to advance their reuse across different earth science applications. Meaningful reuse of resources requires semantic metadata to realize the semantic web vision for allowing pragmatic linkage and integration among resources. Semantic metadata associates standard metadata with resources to turn them into semantically-enabled resources on the web. However, the lack of a common standardized metadata framework as well as the uncoordinated use of metadata fields across different geo-information systems has led to a situation in which standards and related Standard Names abound. To address this need, we have designed SAS to provide a bridge between the core ontologies required to annotate resources and information systems in order to enable queries and analysis over annotation from a single environment (web). SAS is one of the services provided by the Geosemantic framework, a decentralized semantic framework that supports the integration between models and data and allows semantically heterogeneous resources to interact with minimum human intervention. Here we present the design of SAS and demonstrate its application for annotating data and models. First, we describe how predicates and their attributes are extracted from standards and ingested into the knowledge base of the Geosemantic framework. Then we illustrate the application of SAS in annotating data managed by SEAD and annotating simulation models that have a web interface. SAS is a step in a broader approach to raise the quality of geoscience data and models that are published on the web and to allow users to better search, access, and use the existing resources based on standard vocabularies that are encoded and published using semantic technologies.
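
    A minimal sketch of the kind of annotation and query such a service enables, using plain subject-predicate-object triples with hypothetical resource identifiers and a CSDMS-style standard name (this is not the actual SAS API):

      # Knowledge base as simple triples: (subject, predicate, object).
      triples = []

      def annotate(resource_uri, predicate, value):
          triples.append((resource_uri, predicate, value))

      def query(predicate=None, value=None):
          return [t for t in triples
                  if (predicate is None or t[1] == predicate)
                  and (value is None or t[2] == value)]

      # Annotate a dataset and a model with a shared standard variable name.
      annotate("urn:sead:dataset/42", "hasOutputVariable",
               "land_surface_water__runoff_volume_flux")
      annotate("urn:model/topoflow", "hasInputVariable",
               "land_surface_water__runoff_volume_flux")

      # Find every resource that produces or consumes this variable.
      print(query(value="land_surface_water__runoff_volume_flux"))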

  1. Composable Framework Support for Software-FMEA Through Model Execution

    NASA Astrophysics Data System (ADS)

    Kocsis, Imre; Patricia, Andras; Brancati, Francesco; Rossi, Francesco

    2016-08-01

    Performing Failure Modes and Effect Analysis (FMEA) during software architecture design is becoming a basic requirement in an increasing number of domains; however, due to the lack of standardized early design phase model execution, classic SW-FMEA approaches carry significant risks and are human effort-intensive even in processes that use Model-Driven Engineering. Recently, modelling languages with standardized executable semantics have emerged. Building on earlier results, this paper describes framework support for generating executable error propagation models from such models during software architecture design. The approach carries the promise of increased precision, decreased risk and more automated execution for SW-FMEA during dependability-critical system development.

  2. The SGML Standardization Framework and the Introduction of XML

    PubMed Central

    Grütter, Rolf

    2000-01-01

    Extensible Markup Language (XML) is on its way to becoming a global standard for the representation, exchange, and presentation of information on the World Wide Web (WWW). More than that, XML is creating a standardization framework, in terms of an open network of meta-standards and mediators that allows for the definition of further conventions and agreements in specific business domains. Such an approach is particularly needed in the healthcare domain; XML promises to especially suit the particularities of patient records and their lifelong storage, retrieval, and exchange. At a time when change rather than steadiness is becoming the faithful feature of our society, standardization frameworks which support a diversified growth of specifications that are appropriate to the actual needs of the users are becoming more and more important; and efforts should be made to encourage this new attempt at standardization to grow in a fruitful direction. Thus, the introduction of XML reflects a standardization process which is neither exclusively based on an acknowledged standardization authority, nor a pure market standard. Instead, a consortium of companies, academic institutions, and public bodies has agreed on a common recommendation based on an existing standardization framework. The consortium's process of agreeing to a standardization framework will doubtlessly be successful in the case of XML, and it is suggested that it should be considered as a generic model for standardization processes in the future. PMID:11720931

  3. The SGML standardization framework and the introduction of XML.

    PubMed

    Fierz, W; Grütter, R

    2000-01-01

    Extensible Markup Language (XML) is on its way to becoming a global standard for the representation, exchange, and presentation of information on the World Wide Web (WWW). More than that, XML is creating a standardization framework, in terms of an open network of meta-standards and mediators that allows for the definition of further conventions and agreements in specific business domains. Such an approach is particularly needed in the healthcare domain; XML promises to especially suit the particularities of patient records and their lifelong storage, retrieval, and exchange. At a time when change rather than steadiness is becoming the faithful feature of our society, standardization frameworks which support a diversified growth of specifications that are appropriate to the actual needs of the users are becoming more and more important; and efforts should be made to encourage this new attempt at standardization to grow in a fruitful direction. Thus, the introduction of XML reflects a standardization process which is neither exclusively based on an acknowledged standardization authority, nor a pure market standard. Instead, a consortium of companies, academic institutions, and public bodies has agreed on a common recommendation based on an existing standardization framework. The consortium's process of agreeing to a standardization framework will doubtlessly be successful in the case of XML, and it is suggested that it should be considered as a generic model for standardization processes in the future.

  4. Developing Global Standards Framework and Quality Integrated Models for Cooperative and Work-Integrated Education Programs

    ERIC Educational Resources Information Center

    Khampirat, Buratin; McRae, Norah

    2016-01-01

    Cooperative and Work-integrated Education (CWIE) programs have been widely accepted as educational programs that can effectively connect what students are learning to the world of work through placements. Because a global quality standards framework could be a very valuable resource and guide to establishing, developing, and accrediting quality…

  5. A general framework for parametric survival analysis.

    PubMed

    Crowther, Michael J; Lambert, Paul C

    2014-12-30

    Parametric survival models are being increasingly used as an alternative to the Cox model in biomedical research. Through direct modelling of the baseline hazard function, we can gain greater understanding of the risk profile of patients over time, obtaining absolute measures of risk. Commonly used parametric survival models, such as the Weibull, make restrictive assumptions of the baseline hazard function, such as monotonicity, which is often violated in clinical datasets. In this article, we extend the general framework of parametric survival models proposed by Crowther and Lambert (Journal of Statistical Software 53:12, 2013), to incorporate relative survival, and robust and cluster robust standard errors. We describe the general framework through three applications to clinical datasets, in particular, illustrating the use of restricted cubic splines, modelled on the log hazard scale, to provide a highly flexible survival modelling framework. Through the use of restricted cubic splines, we can derive the cumulative hazard function analytically beyond the boundary knots, resulting in a combined analytic/numerical approach, which substantially improves the estimation process compared with only using numerical integration. User-friendly Stata software is provided, which significantly extends parametric survival models available in standard software. Copyright © 2014 John Wiley & Sons, Ltd.
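
    The central device is a restricted cubic spline s(·) of log time acting on the log-hazard scale; in outline (notation schematic),

      \log h(t) = s(\log t; \boldsymbol{\gamma}), \qquad H(t) = \int_0^t h(u)\, du .

    Because a restricted cubic spline is constrained to be linear beyond its boundary knots, \log h(t) = \gamma_0 + \gamma_1 \log t there, so h(t) = e^{\gamma_0} t^{\gamma_1} and the tail contribution to H(t) integrates analytically, while the region inside the knots is handled by numerical quadrature; this is the combined analytic/numerical approach the abstract refers to.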

  6. Design of a Model Execution Framework: Repetitive Object-Oriented Simulation Environment (ROSE)

    NASA Technical Reports Server (NTRS)

    Gray, Justin S.; Briggs, Jeffery L.

    2008-01-01

    The ROSE framework was designed to facilitate complex system analyses. It completely divorces the model execution process from the model itself. By doing so, ROSE frees the modeler to develop a library of standard modeling processes, such as Design of Experiments, optimizers, parameter studies, and sensitivity studies, which can then be applied to any of their available models. The ROSE framework accomplishes this by means of a well-defined API and object structure. Both the API and object structure are presented here with enough detail to implement ROSE in any object-oriented language or modeling tool.
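
    A sketch of the separation such a design enforces, with a reusable process object driving any model that implements a small execution API (all names here are illustrative, not the published ROSE API or object structure):

      class ParameterStudy:
          """A reusable 'process': runs any conforming model over a sweep of inputs."""

          def __init__(self, name, values):
              self.name, self.values = name, values

          def execute(self, model):
              results = []
              for v in self.values:
                  model.set_input(self.name, v)
                  model.run()
                  results.append(model.get_output("response"))
              return results

      class QuadraticModel:
          """Any model exposing set_input/run/get_output can be driven by any process."""
          def __init__(self):
              self.inputs, self.outputs = {}, {}
          def set_input(self, name, value):
              self.inputs[name] = value
          def run(self):
              x = self.inputs.get("x", 0.0)
              self.outputs["response"] = x * x
          def get_output(self, name):
              return self.outputs[name]

      print(ParameterStudy("x", [0, 1, 2, 3]).execute(QuadraticModel()))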

  7. The Standard Model from LHC to future colliders.

    PubMed

    Forte, S; Nisati, A; Passarino, G; Tenchini, R; Calame, C M Carloni; Chiesa, M; Cobal, M; Corcella, G; Degrassi, G; Ferrera, G; Magnea, L; Maltoni, F; Montagna, G; Nason, P; Nicrosini, O; Oleari, C; Piccinini, F; Riva, F; Vicini, A

    This review summarizes the results of the activities which have taken place in 2014 within the Standard Model Working Group of the "What Next" Workshop organized by INFN, Italy. We present a framework, general questions, and some indications of possible answers on the main issue for Standard Model physics in the LHC era and in view of possible future accelerators.

  8. An E-Learning Framework for Assessment (FREMA)

    ERIC Educational Resources Information Center

    Wills, Gary B.; Bailey, Christopher P.; Davis, Hugh C.; Gilbert, Lester; Howard, Yvonne; Jeyes, Steve; Millard, David E.; Price, Joseph; Sclater, Niall; Sherratt, Robert; Tulloch, Iain; Young, Rowin

    2009-01-01

    This article reports on the e-Framework Reference Model for Assessment (FREMA) project that aimed at creating a reference model for the assessment domain: a guide to what resources (standards, projects, people, organisations, software, services and use cases) exist for the domain, aimed at helping strategists understand the state of e-learning…

  9. Open data models for smart health interconnected applications: the example of openEHR.

    PubMed

    Demski, Hans; Garde, Sebastian; Hildebrand, Claudia

    2016-10-22

    Smart Health is a concept built on networking, intelligent data processing and the combination of patient data with other parameters. Open data models can play an important role in creating a framework for providing interoperable data services that support the development of innovative Smart Health applications profiting from data fusion and sharing. This article describes a model-driven engineering approach based on standardized clinical information models and explores its application for the development of interoperable electronic health record systems. The following possible model-driven procedures were considered: provision of data schemes for data exchange, automated generation of artefacts for application development and native platforms that directly execute the models. The applicability of the approach in practice was examined using the openEHR framework as an example. A comprehensive infrastructure for model-driven engineering of electronic health records is presented using the example of the openEHR framework. It is shown that data schema definitions to be used in common practice software development processes can be derived from domain models. The capabilities for automatic creation of implementation artefacts (e.g., data entry forms) are demonstrated. Complementary programming libraries and frameworks that foster the use of open data models are introduced. Several compatible health data platforms are listed. They provide standard based interfaces for interconnecting with further applications. Open data models help build a framework for interoperable data services that support the development of innovative Smart Health applications. Related tools for model-driven application development foster semantic interoperability and interconnected innovative applications.
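
    A toy illustration of the model-driven idea: a single declarative clinical information model from which both a data schema and a data-entry-form artefact are generated (the mini-model format is invented for this sketch and is far simpler than openEHR archetypes):

      # A miniature 'archetype': field name -> (type, constraints).
      blood_pressure_model = {
          "systolic":  ("int", {"min": 0, "max": 300, "unit": "mm[Hg]"}),
          "diastolic": ("int", {"min": 0, "max": 200, "unit": "mm[Hg]"}),
      }

      def to_json_schema(model):
          # Artefact 1: a data schema for exchange between systems.
          return {"type": "object",
                  "properties": {f: {"type": "integer",
                                     "minimum": c["min"], "maximum": c["max"]}
                                 for f, (_, c) in model.items()}}

      def to_form(model):
          # Artefact 2: a data-entry form definition for an application.
          return [{"label": f, "widget": "number", "suffix": c["unit"]}
                  for f, (_, c) in model.items()]

      print(to_json_schema(blood_pressure_model))
      print(to_form(blood_pressure_model))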

  10. A conceptual modeling framework for discrete event simulation using hierarchical control structures.

    PubMed

    Furian, N; O'Sullivan, M; Walker, C; Vössner, S; Neubacher, D

    2015-08-01

    Conceptual Modeling (CM) is a fundamental step in a simulation project. Nevertheless, it is only recently that structured approaches towards the definition and formulation of conceptual models have gained importance in the Discrete Event Simulation (DES) community. As a consequence, frameworks and guidelines for applying CM to DES have emerged and discussion of CM for DES is increasing. However, both the organization of model-components and the identification of behavior and system control from standard CM approaches have shortcomings that limit CM's applicability to DES. Therefore, we discuss the different aspects of previous CM frameworks and identify their limitations. Further, we present the Hierarchical Control Conceptual Modeling framework that pays more attention to the identification of a model's system behavior, control policies and dispatching routines and their structured representation within a conceptual model. The framework guides the user step-by-step through the modeling process and is illustrated by a worked example.

  11. Modular modelling with Physiome standards

    PubMed Central

    Nickerson, David P.; Nielsen, Poul M. F.; Hunter, Peter J.

    2016-01-01

    Key points: The complexity of computational models is increasing, supported by research in modelling tools and frameworks. But relatively little thought has gone into design principles for complex models. We propose a set of design principles for complex model construction with the Physiome standard modelling protocol CellML. By following the principles, models are generated that are extensible and are themselves suitable for reuse in larger models of increasing complexity. We illustrate these principles with examples including an architectural prototype linking, for the first time, electrophysiology, thermodynamically compliant metabolism, signal transduction, gene regulation and synthetic biology. The design principles complement other Physiome research projects, facilitating the application of virtual experiment protocols and model analysis techniques to assist the modelling community in creating libraries of composable, characterised and simulatable quantitative descriptions of physiology. Abstract: The ability to produce and customise complex computational models has great potential to have a positive impact on human health. As the field develops towards whole-cell models and linking such models in multi-scale frameworks to encompass tissue, organ, or organism levels, reuse of previous modelling efforts will become increasingly necessary. Any modelling group wishing to reuse existing computational models as modules for their own work faces many challenges in the context of construction, storage, retrieval, documentation and analysis of such modules. Physiome standards, frameworks and tools seek to address several of these challenges, especially for models expressed in the modular protocol CellML. Aside from providing a general ability to produce modules, there has been relatively little research work on architectural principles of CellML models that will enable reuse at larger scales. To complement and support the existing tools and frameworks, we develop a set of principles to address this consideration. The principles are illustrated with examples that couple electrophysiology, signalling, metabolism, gene regulation and synthetic biology, together forming an architectural prototype for whole-cell modelling (including human intervention) in CellML. Such models illustrate how testable units of quantitative biophysical simulation can be constructed. Finally, future relationships between modular models so constructed and Physiome frameworks and tools are discussed, with particular reference to how such frameworks and tools can in turn be extended to complement and gain more benefit from the results of applying the principles. PMID:27353233

  12. Fermion hierarchy from sfermion anarchy

    DOE PAGES

    Altmannshofer, Wolfgang; Frugiuele, Claudia; Harnik, Roni

    2014-12-31

    We present a framework to generate the hierarchical flavor structure of Standard Model quarks and leptons from loops of superpartners. The simplest model consists of the minimal supersymmetric standard model with tree level Yukawa couplings for the third generation only and anarchic squark and slepton mass matrices. Agreement with constraints from low energy flavor observables, in particular Kaon mixing, is obtained for supersymmetric particles with masses at the PeV scale or above. In our framework both the second and the first generation fermion masses are generated at 1-loop. Despite this, a novel mechanism generates a hierarchy among the first and second generations without imposing a symmetry or small parameters. A second-to-first generation mass ratio of order 100 is typical. The minimal supersymmetric standard model thus includes all the necessary ingredients to realize a fermion spectrum that is qualitatively similar to observation, with hierarchical masses and mixing. The minimal framework produces only a few quantitative discrepancies with observation, most notably the muon mass is too low. Furthermore, we discuss simple modifications which resolve this and also investigate the compatibility of our model with gauge and Yukawa coupling unification.

  13. Field Markup Language: biological field representation in XML.

    PubMed

    Chang, David; Lovell, Nigel H; Dokos, Socrates

    2007-01-01

    With an ever increasing number of biological models available on the internet, a standardized modeling framework is required to allow information to be accessed or visualized. Based on the Physiome Modeling Framework, the Field Markup Language (FML) is being developed to describe and exchange field information for biological models. In this paper, we describe the basic features of FML, its supporting application framework and its ability to incorporate CellML models to construct tissue-scale biological models. As a typical application example, we present a spatially-heterogeneous cardiac pacemaker model which utilizes both FML and CellML to describe and solve the underlying equations of electrical activation and propagation.

  14. A framework for analysis of research risks and benefits to participants in standard of care pragmatic clinical trials.

    PubMed

    Chen, Stephanie C; Kim, Scott Yh

    2016-12-01

    Standard of care pragmatic clinical trials that compare treatments already in use could improve care and reduce costs, but there is considerable debate about the research risks of standard of care pragmatic clinical trials and how to apply informed consent regulations to such trials. We sought to develop a framework integrating the insights from opposing sides of the debate. We developed a formal risk-benefit analysis framework for standard of care pragmatic clinical trials and then applied it to key provisions of the US federal regulations. Our formal framework for standard of care pragmatic clinical trial risk-benefit analysis takes into account three key considerations: the ex ante estimates of risks and benefits of the treatments to be compared in a standard of care pragmatic clinical trial, the allocation ratios of treatments inside and outside such a trial, and the significance of some participants receiving a different treatment inside a trial than outside the trial. The framework provides practical guidance on how the research ethics regulations on informed consent should be applied to standard of care pragmatic clinical trials. Our proposed formal model makes explicit the relationship between the concepts used by opposing sides of the debate about the research risks of standard of care pragmatic clinical trials and can be used to clarify the implications for informed consent. © The Author(s) 2016.
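
    A minimal formalization of the three considerations, under simplifying assumptions and with notation invented here (two treatments A and B; R_A and R_B their ex ante expected net risks; p and q the probabilities of receiving A inside and outside the trial):

      \Delta R = \left[ p\,R_A + (1-p)\,R_B \right] - \left[ q\,R_A + (1-q)\,R_B \right]
               = (p - q)\,(R_A - R_B) ,

    so enrolment adds incremental research risk only insofar as the trial's allocation ratio differs from usual care (p \neq q) and the treatments genuinely differ ex ante (R_A \neq R_B); this is the sense in which some participants receiving a different treatment inside the trial than outside it matters for informed consent.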

  15. A conceptual modeling framework for discrete event simulation using hierarchical control structures

    PubMed Central

    Furian, N.; O’Sullivan, M.; Walker, C.; Vössner, S.; Neubacher, D.

    2015-01-01

    Conceptual Modeling (CM) is a fundamental step in a simulation project. Nevertheless, it is only recently that structured approaches towards the definition and formulation of conceptual models have gained importance in the Discrete Event Simulation (DES) community. As a consequence, frameworks and guidelines for applying CM to DES have emerged and discussion of CM for DES is increasing. However, both the organization of model-components and the identification of behavior and system control from standard CM approaches have shortcomings that limit CM’s applicability to DES. Therefore, we discuss the different aspects of previous CM frameworks and identify their limitations. Further, we present the Hierarchical Control Conceptual Modeling framework that pays more attention to the identification of a model’s system behavior, control policies and dispatching routines and their structured representation within a conceptual model. The framework guides the user step-by-step through the modeling process and is illustrated by a worked example. PMID:26778940

  16. Standardized Competencies for Parenteral Nutrition Prescribing: The American Society for Parenteral and Enteral Nutrition Model.

    PubMed

    Guenter, Peggi; Boullata, Joseph I; Ayers, Phil; Gervasio, Jane; Malone, Ainsley; Raymond, Erica; Holcombe, Beverly; Kraft, Michael; Sacks, Gordon; Seres, David

    2015-08-01

    Parenteral nutrition (PN) provision is complex, as it is a high-alert medication and prone to a variety of potential errors. With changes in clinical practice models and recent federal rulings, the number of PN prescribers may be increasing. Safe prescribing of this therapy requires that competency for prescribers from all disciplines be demonstrated using a standardized process. A standardized model for PN prescribing competency is proposed based on a competency framework, the American Society for Parenteral and Enteral Nutrition (A.S.P.E.N.)-published interdisciplinary core competencies, safe practice recommendations, and clinical guidelines. This framework will guide institutions and agencies in developing and maintaining competency for safe PN prescription by their staff. © 2015 American Society for Parenteral and Enteral Nutrition.

  17. Rethinking the Introduction of Particle Theory: A Substance-Based Framework

    ERIC Educational Resources Information Center

    Johnson, Philip; Papageorgiou, George

    2010-01-01

    In response to extensive research exposing students' poor understanding of the particle theory of matter, this article argues that the conceptual framework within which the theory is introduced could be a limiting factor. The standard school particle model is characterized as operating within a "solids, liquids, and gases" framework.…

  18. The SCEC Unified Community Velocity Model (UCVM) Software Framework for Distributing and Querying Seismic Velocity Models

    NASA Astrophysics Data System (ADS)

    Maechling, P. J.; Taborda, R.; Callaghan, S.; Shaw, J. H.; Plesch, A.; Olsen, K. B.; Jordan, T. H.; Goulet, C. A.

    2017-12-01

    Crustal seismic velocity models and datasets play a key role in regional three-dimensional numerical earthquake ground-motion simulation, full waveform tomography, modern physics-based probabilistic earthquake hazard analysis, as well as in other related fields including geophysics, seismology, and earthquake engineering. The standard material properties provided by a seismic velocity model are P- and S-wave velocities and density for any arbitrary point within the geographic volume for which the model is defined. Many seismic velocity models and datasets are constructed by synthesizing information from multiple sources and the resulting models are delivered to users in multiple file formats, such as text files, binary files, HDF-5 files, structured and unstructured grids, and through computer applications that allow for interactive querying of material properties. The Southern California Earthquake Center (SCEC) has developed the Unified Community Velocity Model (UCVM) software framework to facilitate the registration and distribution of existing and future seismic velocity models to the SCEC community. The UCVM software framework is designed to provide a standard query interface to multiple, alternative velocity models, even if the underlying velocity models are defined in different formats or use different geographic projections. The UCVM framework provides a comprehensive set of open-source tools for querying seismic velocity model properties, combining regional 3D models and 1D background models, visualizing 3D models, and generating computational models in the form of regular grids or unstructured meshes that can be used as inputs for ground-motion simulations. The UCVM framework helps researchers compare seismic velocity models and build equivalent simulation meshes from alternative velocity models. These capabilities enable researchers to evaluate the impact of alternative velocity models in ground-motion simulations and seismic hazard analysis applications. In this poster, we summarize the key components of the UCVM framework and describe the impact it has had in various computational geoscientific applications.
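
    A sketch of the kind of uniform query interface described, combining a regional 3D model with a 1D background model behind one lookup function (the model classes, coverage box and material values are invented placeholders, not the actual UCVM API or any real velocity model):

      # Each registered model answers (Vp [m/s], Vs [m/s], density [kg/m^3]).
      class Background1D:
          def query(self, lon, lat, depth_m):
              return 5000.0 + 0.5 * depth_m, 2900.0 + 0.3 * depth_m, 2700.0

      class RegionalModel:
          def covers(self, lon, lat):
              return -121.0 < lon < -114.0 and 32.0 < lat < 37.0  # illustrative box
          def query(self, lon, lat, depth_m):
              return 4800.0, 2700.0, 2600.0

      REGIONAL, BACKGROUND = RegionalModel(), Background1D()

      def ucvm_like_query(lon, lat, depth_m):
          """Prefer the regional 3D model where defined; fall back to the 1D background."""
          if REGIONAL.covers(lon, lat):
              return REGIONAL.query(lon, lat, depth_m)
          return BACKGROUND.query(lon, lat, depth_m)

      print(ucvm_like_query(-118.2, 34.0, 1000.0))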

  19. Reusable Models of Pedagogical Concepts--A Framework for Pedagogical and Content Design.

    ERIC Educational Resources Information Center

    Pawlowski, Jan M.

    Standardization initiatives in the field of learning technologies have produced standards for the interoperability of learning environments and learning management systems. Learning resources based on these standards can be reused, recombined, and adapted to the user. However, these standards follow a content-oriented approach; the process of…

  20. A Framework for the Optimization of Discrete-Event Simulation Models

    NASA Technical Reports Server (NTRS)

    Joshi, B. D.; Unal, R.; White, N. H.; Morris, W. D.

    1996-01-01

    With the growing use of computer modeling and simulation, in all aspects of engineering, the scope of traditional optimization has to be extended to include simulation models. Some unique aspects have to be addressed while optimizing via stochastic simulation models. The optimization procedure has to explicitly account for the randomness inherent in the stochastic measures predicted by the model. This paper outlines a general-purpose framework for optimization of terminating discrete-event simulation models. The methodology combines a chance constraint approach for problem formulation, together with standard statistical estimation and analysis techniques. The applicability of the optimization framework is illustrated by minimizing the operation and support resources of a launch vehicle, through a simulation model.
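
    The chance-constraint formulation referred to can be written schematically as

      \min_{x} \; \mathbb{E}\left[ f(x, \omega) \right]
      \quad \text{subject to} \quad
      \Pr\left[ g(x, \omega) \le 0 \right] \ge 1 - \alpha ,

    where \omega denotes the randomness of a single simulation replication; both the expected objective and the constraint probability are estimated from replicated runs of the terminating simulation using standard point estimates and confidence intervals.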

  1. Methodology for setting risk-based concentrations of contaminants in soil and groundwater and application to a model contaminated site.

    PubMed

    Fujinaga, Aiichiro; Uchiyama, Iwao; Morisawa, Shinsuke; Yoneda, Minoru; Sasamoto, Yuzuru

    2012-01-01

    In Japan, environmental standards for contaminants in groundwater and in leachate from soil are set with the assumption that they are used for drinking water over a human lifetime. Where there is neither a well nor groundwater used for drinking, the standard is too severe. Therefore, remediation based on these standards incurs excessive effort and cost. In contrast, the environmental-assessment procedure used in the United States and the Netherlands considers the site conditions (land use, existing wells, etc.); however, a risk assessment is required for each site. Therefore, this study proposes a new framework for judging contamination in Japan that combines the merits of the existing environmental standards with a method for risk assessment. The framework involves setting risk-based concentrations that are attainable remediation goals for contaminants in soil and groundwater. The framework was then applied to a model contaminated site for risk management, and the results are discussed regarding the effectiveness and applicability of the new methodology. © 2011 Society for Risk Analysis.
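
    For a non-carcinogen reaching people through drinking water, a risk-based concentration of the kind discussed is conventionally derived from an intake balance (generic textbook form, not the paper's site-specific model):

      C_{\mathrm{RBC}} = \frac{\mathrm{RfD} \times BW}{\mathrm{IR}} ,

    with RfD the reference dose (mg/kg-day), BW body weight (kg) and IR daily water intake (L/day). When site conditions rule out the drinking-water pathway or reduce exposure frequency and duration, the effective intake shrinks and the allowable concentration rises, which is the effect the proposed framework exploits.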

  2. A Framework for Sharing and Integrating Remote Sensing and GIS Models Based on Web Service

    PubMed Central

    Chen, Zeqiang; Lin, Hui; Chen, Min; Liu, Deer; Bao, Ying; Ding, Yulin

    2014-01-01

    Sharing and integrating Remote Sensing (RS) and Geographic Information System/Science (GIS) models are critical for developing practical application systems. Facilitating model sharing and model integration is a problem for model publishers and model users, respectively. To address this problem, a framework based on a Web service for sharing and integrating RS and GIS models is proposed in this paper. The fundamental idea of the framework is to publish heterogeneous RS and GIS models into standard Web services for sharing and interoperation and then to integrate the RS and GIS models using Web services. For the former, a “black box” and a visual method are employed to facilitate the publishing of the models as Web services. For the latter, model integration based on the geospatial workflow and semantic supported marching method is introduced. Under this framework, model sharing and integration is applied for developing the Pearl River Delta water environment monitoring system. The results show that the framework can facilitate model sharing and model integration for model publishers and model users. PMID:24901016
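
    A minimal sketch of the "publish a model as a Web service" half of the idea, exposing a trivial remote-sensing computation over HTTP/JSON with the Python standard library (the NDVI function and endpoint are illustrative; the paper's framework targets standard geospatial Web services rather than this bare protocol):

      import json
      from http.server import BaseHTTPRequestHandler, HTTPServer

      def ndvi(red, nir):
          """A trivial RS 'model': normalized difference vegetation index."""
          return (nir - red) / (nir + red)

      class ModelService(BaseHTTPRequestHandler):
          def do_POST(self):
              # Read a JSON request body, run the model, return a JSON response.
              body = json.loads(self.rfile.read(int(self.headers["Content-Length"])))
              payload = json.dumps({"ndvi": ndvi(body["red"], body["nir"])}).encode()
              self.send_response(200)
              self.send_header("Content-Type", "application/json")
              self.end_headers()
              self.wfile.write(payload)

      if __name__ == "__main__":
          HTTPServer(("localhost", 8000), ModelService).serve_forever()

    A workflow engine can then chain such services by posting one model's output as the next model's input, which is the integration half of the framework.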

  3. A framework for sharing and integrating remote sensing and GIS models based on Web service.

    PubMed

    Chen, Zeqiang; Lin, Hui; Chen, Min; Liu, Deer; Bao, Ying; Ding, Yulin

    2014-01-01

    Sharing and integrating Remote Sensing (RS) and Geographic Information System/Science (GIS) models are critical for developing practical application systems. Facilitating model sharing and model integration is a problem for model publishers and model users, respectively. To address this problem, a framework based on a Web service for sharing and integrating RS and GIS models is proposed in this paper. The fundamental idea of the framework is to publish heterogeneous RS and GIS models into standard Web services for sharing and interoperation and then to integrate the RS and GIS models using Web services. For the former, a "black box" and a visual method are employed to facilitate the publishing of the models as Web services. For the latter, model integration based on the geospatial workflow and semantic supported marching method is introduced. Under this framework, model sharing and integration is applied for developing the Pearl River Delta water environment monitoring system. The results show that the framework can facilitate model sharing and model integration for model publishers and model users.

  4. Electroweak baryogenesis and the standard model effective field theory

    NASA Astrophysics Data System (ADS)

    de Vries, Jordy; Postma, Marieke; van de Vis, Jorinde; White, Graham

    2018-01-01

    We investigate electroweak baryogenesis within the framework of the Standard Model Effective Field Theory. The Standard Model Lagrangian is supplemented by dimension-six operators that facilitate a strong first-order electroweak phase transition and provide sufficient CP violation. Two explicit scenarios are studied that are related via the classical equations of motion and are therefore identical at leading order in the effective field theory expansion. We demonstrate that formally higher-order dimension-eight corrections lead to large modifications of the matter-antimatter asymmetry. The effective field theory expansion breaks down in the modified Higgs sector due to the requirement of a first-order phase transition. We investigate the source of the breakdown in detail and show how it is transferred to the CP-violating sector. We briefly discuss possible modifications of the effective field theory framework.
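
    For concreteness, the canonical dimension-six example in this context (a standard illustration; the paper's specific operator choices may differ) is the $(H^\dagger H)^3$ correction to the Higgs potential, which can strengthen the phase transition:

    $$ V(H) = -\mu^2\, H^\dagger H + \lambda\, (H^\dagger H)^2 + \frac{c_6}{\Lambda^2}\, (H^\dagger H)^3 $$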

  5. Software design and implementation concepts for an interoperable medical communication framework.

    PubMed

    Besting, Andreas; Bürger, Sebastian; Kasparick, Martin; Strathen, Benjamin; Portheine, Frank

    2018-02-23

    The new IEEE 11073 service-oriented device connectivity (SDC) standard proposals for networked point-of-care and surgical devices constitute the basis for improved interoperability due to their independence of vendors. To accelerate the distribution of the standard, a reference implementation is indispensable. However, the implementation of such a framework has to overcome several non-trivial challenges. First, the high level of complexity of the underlying standard must be reflected in the software design. An efficient implementation has to consider the limited resources of the underlying hardware. Moreover, the framework's purpose of realizing a distributed system demands a high degree of reliability of the framework itself and its internal mechanisms. Additionally, a framework must provide an easy-to-use and fail-safe application programming interface (API). In this work, we address these challenges by discussing suitable software engineering principles and practical coding guidelines. A descriptive model is developed that identifies key strategies. General feasibility is shown by outlining environments in which our implementation has been utilized.

  6. Methods for Specifying Scientific Data Standards and Modeling Relationships with Applications to Neuroscience

    PubMed Central

    Rübel, Oliver; Dougherty, Max; Prabhat; Denes, Peter; Conant, David; Chang, Edward F.; Bouchard, Kristofer

    2016-01-01

    Neuroscience continues to experience tremendous growth in data, in terms of the volume and variety of data, the velocity at which data are acquired, and in turn the veracity of data. These challenges are a serious impediment to sharing of data, analyses, and tools within and across labs. Here, we introduce BRAINformat, a novel data standardization framework for the design and management of scientific data formats. The BRAINformat library defines application-independent design concepts and modules that together create a general framework for standardization of scientific data. We describe the formal specification of scientific data standards, which facilitates sharing and verification of data and formats. We introduce the concept of Managed Objects, enabling semantic components of data formats to be specified as self-contained units, supporting modular and reusable design of data format components and file storage. We also introduce the novel concept of Relationship Attributes for modeling and use of semantic relationships between data objects. Based on these concepts we demonstrate the application of our framework to design and implement a standard format for electrophysiology data and show how data standardization and relationship modeling facilitate data analysis and sharing. The format uses HDF5, enabling portable, scalable, and self-describing data storage and integration with modern high-performance computing for data-driven discovery. The BRAINformat library is open source and easy to use; it provides detailed user and developer documentation and is freely available at: https://bitbucket.org/oruebel/brainformat. PMID:27867355
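
    The managed-object idea can be illustrated in HDF5 directly (a minimal sketch with hypothetical group and attribute names, not BRAINformat's actual schema): a self-contained group carries its own type tag, and a relationship attribute points at a related object.

```python
# Minimal sketch of a "managed object" plus a relationship attribute in HDF5.
# Group names, type tags, and the relationship key are hypothetical.
import h5py
import numpy as np

with h5py.File("session.h5", "w") as f:
    # A managed object for a recording, tagged with its format type.
    rec = f.create_group("recording_01")
    rec.attrs["managed_type"] = "ElectrophysiologyRecording"
    data = rec.create_dataset("voltage", data=np.zeros((4, 1000)), dtype="f4")
    data.attrs["unit"] = "volt"

    # A second object, plus a relationship attribute linking the two.
    stim = f.create_group("stimulus_01")
    stim.attrs["managed_type"] = "Stimulus"
    rec.attrs["relationship:aligned_with"] = stim.name  # stores "/stimulus_01"
```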

  7. A logical foundation for representation of clinical data.

    PubMed Central

    Campbell, K E; Das, A K; Musen, M A

    1994-01-01

    OBJECTIVE: A general framework for representation of clinical data that provides a declarative semantics of terms and that allows developers to define explicitly the relationships among both terms and combinations of terms. DESIGN: Use of conceptual graphs as a standard representation of logic and of an existing standardized vocabulary, the Systematized Nomenclature of Medicine (SNOMED International), for lexical elements. Concepts such as time, anatomy, and uncertainty must be modeled explicitly in a way that allows relation of these foundational concepts to surface-level clinical descriptions in a uniform manner. RESULTS: The proposed framework was used to model a simple radiology report, which included temporal references. CONCLUSION: Formal logic provides a framework for formalizing the representation of medical concepts. Actual implementations will be required to evaluate the practicality of this approach. PMID:7719805

  8. Wavelet-Bayesian inference of cosmic strings embedded in the cosmic microwave background

    NASA Astrophysics Data System (ADS)

    McEwen, J. D.; Feeney, S. M.; Peiris, H. V.; Wiaux, Y.; Ringeval, C.; Bouchet, F. R.

    2017-12-01

    Cosmic strings are a well-motivated extension to the standard cosmological model and could induce a subdominant component in the anisotropies of the cosmic microwave background (CMB), in addition to the standard inflationary component. The detection of strings, while observationally challenging, would provide a direct probe of physics at very high energy scales. We develop a framework for cosmic string inference from observations of the CMB made over the celestial sphere, performing a Bayesian analysis in wavelet space, where the string-induced CMB component has statistical properties distinct from the standard inflationary component. Our wavelet-Bayesian framework provides a principled approach to compute the posterior distribution of the string tension Gμ and the Bayesian evidence ratio comparing the string model to the standard inflationary model. Furthermore, we present a technique to recover an estimate of any string-induced CMB map embedded in observational data. Using Planck-like simulations, we demonstrate the application of our framework and evaluate its performance. The method is sensitive to Gμ ∼ 5 × 10⁻⁷ for Nambu-Goto string simulations that include an integrated Sachs-Wolfe contribution only and do not include any recombination effects, before any parameters of the analysis are optimized. The sensitivity of the method compares favourably with other techniques applied to the same simulations.
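
    Schematically (our notation, not necessarily the paper's), the framework computes the wavelet-space posterior of the string tension and the model-comparison evidence ratio:

    $$ P(G\mu \mid d) \propto \mathcal{L}(d \mid G\mu)\, \pi(G\mu), \qquad B = \frac{\Pr(d \mid M_{\text{string}})}{\Pr(d \mid M_{\text{inflation}})} $$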

  9. Comparison of Optimal Design Methods in Inverse Problems

    PubMed Central

    Banks, H. T.; Holm, Kathleen; Kappel, Franz

    2011-01-01

    Typical optimal design methods for inverse or parameter estimation problems are designed to choose optimal sampling distributions through minimization of a specific cost function related to the resulting error in parameter estimates. It is hoped that the inverse problem will produce parameter estimates with increased accuracy using data collected according to the optimal sampling distribution. Here we formulate the classical optimal design problem in the context of general optimization problems over distributions of sampling times. We present a new Prohorov metric-based theoretical framework that permits one to treat succinctly and rigorously any optimal design criteria based on the Fisher Information Matrix (FIM). A fundamental approximation theory is also included in this framework. A new optimal design, SE-optimal design (standard error optimal design), is then introduced in the context of this framework. We compare this new design criterion with the more traditional D-optimal and E-optimal designs. The optimal sampling distributions from each design are used to compute and compare standard errors; the standard errors for parameters are computed using asymptotic theory or bootstrapping and the optimal mesh. We use three examples to illustrate ideas: the Verhulst-Pearl logistic population model [13], the standard harmonic oscillator model [13] and a popular glucose regulation model [16, 19, 29]. PMID:21857762
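
    As a schematic summary of the criteria being compared (standard formulations, with $F(\tau)$ the FIM under sampling distribution $\tau$ and $\theta_i$ the parameters; the paper's precise definitions may differ in detail):

    $$ \tau_D = \arg\max_{\tau} \det F(\tau), \qquad \tau_E = \arg\max_{\tau} \lambda_{\min}\big(F(\tau)\big), \qquad \tau_{SE} = \arg\min_{\tau} \sum_{i=1}^{p} \frac{\big[F(\tau)^{-1}\big]_{ii}}{\theta_i^{2}} $$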

  10. Estimating social carrying capacity through computer simulation modeling: an application to Arches National Park, Utah

    Treesearch

    Benjamin Wang; Robert E. Manning; Steven R. Lawson; William A. Valliere

    2001-01-01

    Recent research and management experience has led to several frameworks for defining and managing carrying capacity of national parks and related areas. These frameworks rely on monitoring indicator variables to ensure that standards of quality are maintained. The objective of this study was to develop a computer simulation model to estimate the relationships between...

  11. Building energy simulation in real time through an open standard interface

    DOE PAGES

    Pang, Xiufeng; Nouidui, Thierry S.; Wetter, Michael; ...

    2015-10-20

    Building energy models (BEMs) are typically used for design and code compliance for new buildings and in the renovation of existing buildings to predict energy use. The increasing adoption of BEM as standard practice in the building industry presents an opportunity to extend the use of BEMs into construction, commissioning and operation. In 2009, the authors developed a real-time simulation framework to execute an EnergyPlus model in real time to improve building operation. This paper reports an enhancement of that real-time energy simulation framework. The previous version only works with software tools that implement the custom co-simulation interface of the Building Controls Virtual Test Bed (BCVTB), such as EnergyPlus, Dymola and TRNSYS. The new version uses an open standard interface, the Functional Mockup Interface (FMI), to provide a generic interface to any application that supports the FMI protocol. In addition, the new version utilizes the Simple Measurement and Actuation Profile (sMAP) tool as the data acquisition system to acquire, store and present data. Lastly, this paper introduces the updated architecture of the real-time simulation framework using FMI and presents proof-of-concept demonstration results which validate the new framework.
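
    For a sense of what FMI-based coupling looks like in practice, here is a minimal sketch using the open-source FMPy library (not the paper's code); "building.fmu" is a hypothetical model exported from any FMI-compliant tool:

```python
# Minimal sketch: inspect and simulate an FMU through the FMI standard.
from fmpy import read_model_description, simulate_fmu

desc = read_model_description("building.fmu")
print(desc.modelName, [v.name for v in desc.modelVariables][:5])

result = simulate_fmu(
    "building.fmu",
    start_time=0.0,
    stop_time=3600.0,          # one hour of simulated operation
    output_interval=60.0,      # sample outputs every minute
)
print(result.dtype.names)      # time plus the FMU's output variables
```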

  12. Generic Software Architecture for Prognostics (GSAP) User Guide

    NASA Technical Reports Server (NTRS)

    Teubert, Christopher Allen; Daigle, Matthew John; Watkins, Jason; Sankararaman, Shankar; Goebel, Kai

    2016-01-01

    The Generic Software Architecture for Prognostics (GSAP) is a framework for applying prognostics. It makes applying prognostics easier by implementing many of the common elements across prognostic applications. The standard interface enables reuse of prognostic algorithms and models across systems using the GSAP framework.
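
    The standard-interface idea can be illustrated as follows (our own Python sketch; GSAP itself is a C++ framework and its actual API differs): every model exposes the same surface, so prognostic algorithms can be reused across systems.

```python
# Minimal sketch of a standard prognostic-model interface. Class and method
# names are hypothetical illustrations of the pattern, not GSAP's API.
from abc import ABC, abstractmethod

class PrognosticModel(ABC):
    @abstractmethod
    def initialize(self, config: dict) -> None: ...
    @abstractmethod
    def step(self, inputs: dict) -> dict: ...      # advance internal state
    @abstractmethod
    def predict_eol(self) -> float: ...            # remaining-useful-life estimate

class BatteryModel(PrognosticModel):
    def initialize(self, config):
        self.capacity = config.get("capacity", 1.0)
    def step(self, inputs):
        self.capacity -= 1e-4 * inputs.get("load", 0.0)  # toy degradation law
        return {"capacity": self.capacity}
    def predict_eol(self):
        # Steps until capacity crosses a 0.5 threshold at unit load.
        return max(self.capacity - 0.5, 0.0) / 1e-4
```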

  13. Models and Frameworks: A Synergistic Association for Developing Component-Based Applications

    PubMed Central

    Sánchez-Ledesma, Francisco; Sánchez, Pedro; Pastor, Juan A.; Álvarez, Bárbara

    2014-01-01

    The use of frameworks and components has been shown to be effective in improving software productivity and quality. However, the results in terms of reuse and standardization show a dearth of portability either of designs or of component-based implementations. This paper, which is based on the model driven software development paradigm, presents an approach that separates the description of component-based applications from their possible implementations for different platforms. This separation is supported by automatic integration of the code obtained from the input models into frameworks implemented using object-oriented technology. Thus, the approach combines the benefits of modeling applications from a higher level of abstraction than objects, with the higher levels of code reuse provided by frameworks. In order to illustrate the benefits of the proposed approach, two representative case studies that use both an existing framework and an ad hoc framework, are described. Finally, our approach is compared with other alternatives in terms of the cost of software development. PMID:25147858

  14. Models and frameworks: a synergistic association for developing component-based applications.

    PubMed

    Alonso, Diego; Sánchez-Ledesma, Francisco; Sánchez, Pedro; Pastor, Juan A; Álvarez, Bárbara

    2014-01-01

    The use of frameworks and components has been shown to be effective in improving software productivity and quality. However, the results in terms of reuse and standardization show a dearth of portability either of designs or of component-based implementations. This paper, which is based on the model driven software development paradigm, presents an approach that separates the description of component-based applications from their possible implementations for different platforms. This separation is supported by automatic integration of the code obtained from the input models into frameworks implemented using object-oriented technology. Thus, the approach combines the benefits of modeling applications from a higher level of abstraction than objects, with the higher levels of code reuse provided by frameworks. In order to illustrate the benefits of the proposed approach, two representative case studies that use both an existing framework and an ad hoc framework, are described. Finally, our approach is compared with other alternatives in terms of the cost of software development.

  15. Model-Based Policymaking: A Framework to Promote Ethical "Good Practice" in Mathematical Modeling for Public Health Policymaking.

    PubMed

    Boden, Lisa A; McKendrick, Iain J

    2017-01-01

    Mathematical models are increasingly relied upon as decision support tools, which estimate risks and generate recommendations to underpin public health policies. However, there are no formal agreements about what constitutes professional competencies or duties in mathematical modeling for public health. In this article, we propose a framework to evaluate whether mathematical models that assess human and animal disease risks and control strategies meet standards consistent with ethical "good practice" and are thus "fit for purpose" as evidence in support of policy. This framework is derived from principles of biomedical ethics: independence, transparency (autonomy), beneficence/non-maleficence, and justice. We identify ethical risks associated with model development and implementation and consider the extent to which scientists are accountable for the translation and communication of model results to policymakers so that the strengths and weaknesses of the scientific evidence base and any socioeconomic and ethical impacts of biased or uncertain predictions are clearly understood. We propose principles to operationalize a framework for ethically sound model development and risk communication between scientists and policymakers. These include the creation of science-policy partnerships to mutually define policy questions and communicate results; development of harmonized international standards for model development; and data stewardship and improvement of the traceability and transparency of models via a searchable archive of policy-relevant models. Finally, we suggest that bespoke ethical advisory groups, with relevant expertise and access to these resources, would be beneficial as a bridge between science and policy, advising modelers of potential ethical risks and providing overview of the translation of modeling advice into policy.

  16. NoSQL Based 3D City Model Management System

    NASA Astrophysics Data System (ADS)

    Mao, B.; Harrie, L.; Cao, J.; Wu, Z.; Shen, J.

    2014-04-01

    To manage increasingly complicated 3D city models, a framework based on a NoSQL database is proposed in this paper. The framework supports import and export of 3D city models according to international standards such as CityGML, KML/COLLADA and X3D. We also suggest and implement 3D model analysis and visualization in the framework. For city model analysis, 3D geometry data and semantic information (such as name, height, area, price and so on) are stored and processed separately. We use a Map-Reduce method to deal with the 3D geometry data since it is more complex, while the semantic analysis is mainly based on database query operations. For visualization, a multiple-representation 3D city structure, CityTree, is implemented within the framework to support dynamic levels of detail (LODs) based on the user viewpoint. Also, the proposed framework is easily extensible and supports geoindexes to speed up querying. Our experimental results show that the proposed 3D city management system can efficiently fulfil the analysis and visualization requirements.
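
    A minimal sketch of the semantic-query side (hypothetical field names; the geometry side would be handled separately, e.g. by a Map-Reduce job):

```python
# Semantic analysis as plain document-store queries, plus a geospatial index.
# Database, collection, and field names are hypothetical illustrations.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
buildings = client.city3d.buildings

# Semantic query: tall buildings in one district, returning a few attributes.
for b in buildings.find({"district": "Centrum", "height": {"$gt": 50}},
                        {"name": 1, "height": 1, "price": 1}):
    print(b)

# A geoindex to speed up spatial queries on building footprints.
buildings.create_index([("footprint", "2dsphere")])
```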

  17. Simulation and Modeling Capability for Standard Modular Hydropower Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stewart, Kevin M.; Smith, Brennan T.; Witt, Adam M.

    Grounded in the stakeholder-validated framework established in Oak Ridge National Laboratory’s SMH Exemplary Design Envelope Specification, this report on Simulation and Modeling Capability for Standard Modular Hydropower (SMH) Technology provides insight into the concepts, use cases, needs, gaps, and challenges associated with modeling and simulating SMH technologies. The SMH concept envisions a network of generation, passage, and foundation modules that achieve environmentally compatible, cost-optimized hydropower using standardization and modularity. The development of standardized modeling approaches and simulation techniques for SMH (as described in this report) will pave the way for reliable, cost-effective methods for technology evaluation, optimization, and verification.

  18. A Comprehensive Stress Education and Reduction Program Utilizing a Well-Being Model: Incorporating the ASCA Student Standards

    ERIC Educational Resources Information Center

    Tarabochia, Dawn S.

    2013-01-01

    The American School Counselor Association developed national standards for students to provide a framework for a holistic approach to student academic, career, and personal/social development. While the ASCA Student Standards are comprehensive, little attention is given to stress. Adolescents are experiencing greater stress associated with…

  19. Acceptance Factors Influencing Adoption of National Institute of Standards and Technology Information Security Standards: A Quantitative Study

    ERIC Educational Resources Information Center

    Kiriakou, Charles M.

    2012-01-01

    Adoption of a comprehensive information security governance model and security controls is the best option organizations may have to protect their information assets and comply with regulatory requirements. Understanding acceptance factors of the National Institute of Standards and Technology (NIST) Risk Management Framework (RMF) comprehensive…

  20. Feature selection and classifier parameters estimation for EEG signals peak detection using particle swarm optimization.

    PubMed

    Adam, Asrul; Shapiai, Mohd Ibrahim; Tumari, Mohd Zaidi Mohd; Mohamad, Mohd Saberi; Mubin, Marizan

    2014-01-01

    Electroencephalogram (EEG) signal peak detection is widely used in clinical applications. The peak point can be detected using several approaches, including time, frequency, time-frequency, and nonlinear domains, depending on various peak features from several models. However, no study has established the importance of each peak feature in contributing to a good and generalized model. In this study, feature selection and classifier parameter estimation based on particle swarm optimization (PSO) are proposed as a framework for peak detection on EEG signals in time-domain analysis. Two versions of PSO are used in the study: (1) standard PSO and (2) random asynchronous particle swarm optimization (RA-PSO). The proposed framework tries to find the best combination of all the available features that offers good peak detection and a high classification rate in the conducted experiments. The evaluation results indicate that the accuracy of the peak detection can be improved up to 99.90% and 98.59% for training and testing, respectively, as compared to the framework without feature selection adaptation. Additionally, the proposed framework based on RA-PSO offers a better and more reliable classification rate as compared to standard PSO, as it produces a low-variance model.
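
    The core of such a framework can be sketched as follows (a simplified stand-in, not the authors' code): standard synchronous binary PSO over feature masks, with evaluate() standing in for the peak-detection classifier's accuracy on a candidate subset.

```python
# Minimal sketch of binary PSO for feature selection. evaluate() is a
# hypothetical stand-in for training and scoring the classifier.
import numpy as np

rng = np.random.default_rng(0)

def evaluate(mask):
    # Stand-in fitness: reward a few "informative" features, penalize size.
    if not mask.any():
        return 0.0
    return float(mask[:3].sum() - 0.01 * mask.sum())

def binary_pso(n_features, n_particles=20, iters=50, w=0.7, c1=1.5, c2=1.5):
    pos = (rng.random((n_particles, n_features)) < 0.5).astype(float)
    vel = rng.normal(0.0, 1.0, (n_particles, n_features))
    pbest = pos.copy()
    pbest_fit = np.array([evaluate(p) for p in pos])
    gbest = pbest[pbest_fit.argmax()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(vel.shape), rng.random(vel.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        prob = 1.0 / (1.0 + np.exp(-vel))          # sigmoid transfer function
        pos = (rng.random(vel.shape) < prob).astype(float)
        fit = np.array([evaluate(p) for p in pos])
        better = fit > pbest_fit
        pbest[better], pbest_fit[better] = pos[better], fit[better]
        gbest = pbest[pbest_fit.argmax()].copy()
    return gbest.astype(bool), float(pbest_fit.max())

mask, score = binary_pso(n_features=10)
print(mask.astype(int), score)
```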

  1. Predicting Rib Fracture Risk With Whole-Body Finite Element Models: Development and Preliminary Evaluation of a Probabilistic Analytical Framework

    PubMed Central

    Forman, Jason L.; Kent, Richard W.; Mroz, Krystoffer; Pipkorn, Bengt; Bostrom, Ola; Segui-Gomez, Maria

    2012-01-01

    This study sought to develop a strain-based probabilistic method to predict rib fracture risk with whole-body finite element (FE) models, and to describe a method to combine the results with collision exposure information to predict injury risk and potential intervention effectiveness in the field. An age-adjusted ultimate strain distribution was used to estimate local rib fracture probabilities within an FE model. These local probabilities were combined to predict injury risk and severity within the whole ribcage. The ultimate strain distribution was developed from a literature dataset of 133 tests. Frontal collision simulations were performed with the THUMS (Total HUman Model for Safety) model with four levels of delta-V and two restraints: a standard 3-point belt and a progressive 3.5–7 kN force-limited, pretensioned (FL+PT) belt. The results of three simulations (29 km/h standard, 48 km/h standard, and 48 km/h FL+PT) were compared to matched cadaver sled tests. The numbers of fractures predicted for the comparison cases were consistent with those observed experimentally. Combining these results with field exposure information (ΔV, NASS-CDS 1992–2002) suggests an 8.9% probability of incurring AIS3+ rib fractures for a 60-year-old restrained by a standard belt in a tow-away frontal collision with this vehicle and occupant configuration, compared to 4.6% for the FL+PT belt. This is the first study to describe a probabilistic framework to predict rib fracture risk based on strains observed in human-body FE models. Using this analytical framework, future efforts may incorporate additional subject or collision factors for multi-variable probabilistic injury prediction. PMID:23169122
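
    In outline (our notation, not the paper's): if element $i$ reaches peak strain $\varepsilon_i$ and the age-adjusted ultimate strain has cumulative distribution $F_u$, then, treating elements as approximately independent,

    $$ p_i = F_u(\varepsilon_i), \qquad P(\text{at least one rib fracture}) = 1 - \prod_{i} (1 - p_i) $$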

  2. Collaborative Project. A Flexible Atmospheric Modeling Framework for the Community Earth System Model (CESM)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gettelman, Andrew

    2015-10-01

    In this project we have been upgrading the Multiscale Modeling Framework (MMF) in the Community Atmosphere Model (CAM), also known as Super-Parameterized CAM (SP-CAM). This has included a major effort to update the coding standards and interface with CAM so that it can be placed on the main development trunk. It has also included development of a new software structure for CAM to be able to handle sub-grid column information. These efforts have formed the major thrust of the work.

  3. Stress distribution in Co-Cr implant frameworks after laser or TIG welding.

    PubMed

    de Castro, Gabriela Cassaro; de Araújo, Cleudmar Amaral; Mesquita, Marcelo Ferraz; Consani, Rafael Leonardo Xediek; Nóbilo, Mauro Antônio de Arruda

    2013-01-01

    Lack of passivity has been associated with biomechanical problems in implant-supported prostheses. The aim of this study was to evaluate, by photoelasticity, the passivity of three techniques for fabricating an implant framework from a Co-Cr alloy. The model was obtained from a steel die simulating an edentulous mandible with 4 external-hexagon analog implants with a standard platform. On this model, five frameworks were fabricated for each group: a monoblock framework (control) and laser- and TIG-welded frameworks. The photoelastic model was made from a flexible epoxy resin. In the photoelastic analysis, the frameworks were bolted onto the model for the verification of maximum shear stress at 34 selected points around the implants and 5 points in the middle of the model. The stresses were compared over the whole photoelastic model, between the right, left, and center regions and between the cervical and apical regions. The values were subjected to two-way ANOVA and Tukey's test (α=0.05). There was no significant difference among the groups and studied areas (p>0.05). It was concluded that the stresses generated around the implants were similar for all techniques.

  4. NADM Conceptual Model 1.0 -- A Conceptual Model for Geologic Map Information

    USGS Publications Warehouse

    2004-01-01

    Executive Summary -- The NADM Data Model Design Team was established in 1999 by the North American Geologic Map Data Model Steering Committee (NADMSC) with the purpose of drafting a geologic map data model for consideration as a standard for developing interoperable geologic map-centered databases by state, provincial, and federal geological surveys. The model is designed to be a technology-neutral conceptual model that can form the basis for a web-based interchange format using evolving information technology (e.g., XML, RDF, OWL), and guide implementation of geoscience databases in a common conceptual framework. The intended purpose is to allow geologic information sharing between geologic map data providers and users, independent of local information system implementation. The model emphasizes geoscience concepts and relationships related to information presented on geologic maps. Design has been guided by an informal requirements analysis, documentation of existing databases, technology developments, and other standardization efforts in the geoscience and computer-science communities. A key aspect of the model is the notion that representation of the conceptual framework (ontology) that underlies geologic map data must be part of the model, because this framework changes with time and understanding, and varies between information providers. The top level of the model distinguishes geologic concepts, geologic representation concepts, and metadata. The geologic representation part of the model provides a framework for representing the ontology that underlies geologic map data through a controlled vocabulary, and for establishing the relationships between this vocabulary and a geologic map visualization or portrayal. Top-level geologic classes in the model are Earth material (substance), geologic unit (parts of the Earth), geologic age, geologic structure, fossil, geologic process, geologic relation, and geologic event.
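
    As a toy rendering of the top-level split the model describes (our own Python illustration; NADM itself is technology-neutral and prescribes no language):

```python
# Minimal sketch of NADM's separation of geologic concepts, representation
# vocabulary, and metadata. Class and field names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class GeologicConcept:
    kind: str                     # e.g. "EarthMaterial", "GeologicUnit", "GeologicAge"
    name: str

@dataclass
class RepresentationConcept:
    term: str                     # controlled-vocabulary term underlying the map
    portrayal: str                # how the term is drawn on the map

@dataclass
class GeologicMapFeature:
    concept: GeologicConcept
    representation: RepresentationConcept
    metadata: dict = field(default_factory=dict)
```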

  5. Model-Based Policymaking: A Framework to Promote Ethical “Good Practice” in Mathematical Modeling for Public Health Policymaking

    PubMed Central

    Boden, Lisa A.; McKendrick, Iain J.

    2017-01-01

    Mathematical models are increasingly relied upon as decision support tools, which estimate risks and generate recommendations to underpin public health policies. However, there are no formal agreements about what constitutes professional competencies or duties in mathematical modeling for public health. In this article, we propose a framework to evaluate whether mathematical models that assess human and animal disease risks and control strategies meet standards consistent with ethical “good practice” and are thus “fit for purpose” as evidence in support of policy. This framework is derived from principles of biomedical ethics: independence, transparency (autonomy), beneficence/non-maleficence, and justice. We identify ethical risks associated with model development and implementation and consider the extent to which scientists are accountable for the translation and communication of model results to policymakers so that the strengths and weaknesses of the scientific evidence base and any socioeconomic and ethical impacts of biased or uncertain predictions are clearly understood. We propose principles to operationalize a framework for ethically sound model development and risk communication between scientists and policymakers. These include the creation of science–policy partnerships to mutually define policy questions and communicate results; development of harmonized international standards for model development; and data stewardship and improvement of the traceability and transparency of models via a searchable archive of policy-relevant models. Finally, we suggest that bespoke ethical advisory groups, with relevant expertise and access to these resources, would be beneficial as a bridge between science and policy, advising modelers of potential ethical risks and providing overview of the translation of modeling advice into policy. PMID:28424768

  6. Modeling treatment of ischemic heart disease with partially observable Markov decision processes.

    PubMed

    Hauskrecht, M; Fraser, H

    1998-01-01

    Diagnosis of a disease and its treatment are not separate, one-shot activities. Instead they are very often dependent and interleaved over time, mostly due to uncertainty about the underlying disease, uncertainty associated with the response of a patient to the treatment, and the varying cost of different diagnostic (investigative) and treatment procedures. The framework of partially observable Markov decision processes (POMDPs), developed and used in the operations research, control theory and artificial intelligence communities, is particularly suitable for modeling such a complex decision process. In the paper, we show how the POMDP framework can be used to model and solve the problem of the management of patients with ischemic heart disease, and point out the modeling advantages of the framework over standard decision formalisms.
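
    For reference, a POMDP maintains a belief state $b$ over the hidden disease states; after taking action $a$ (e.g., a test or therapy) and observing outcome $o$, the standard belief update is

    $$ b'(s') = \frac{Z(o \mid s', a) \sum_{s \in S} T(s' \mid s, a)\, b(s)}{\Pr(o \mid b, a)} $$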

  7. A new fit-for-purpose model testing framework: Decision Crash Tests

    NASA Astrophysics Data System (ADS)

    Tolson, Bryan; Craig, James

    2016-04-01

    Decision-makers in water resources are often burdened with selecting appropriate multi-million dollar strategies to mitigate the impacts of climate or land use change. Unfortunately, the suitability of existing hydrologic simulation models to accurately inform decision-making is in doubt because the testing procedures used to evaluate model utility (i.e., model validation) are insufficient. For example, many authors have identified that a good standard framework for model testing, the Klemeš Crash Tests (KCTs), namely the classic model validation procedures from Klemeš (1986) that Andréassian et al. (2009) renamed as KCTs, has yet to become common practice in hydrology. Furthermore, Andréassian et al. (2009) claim that the progression of hydrological science requires widespread use of KCTs and the development of new crash tests. Existing simulation (not forecasting) model testing procedures such as KCTs look backwards (checking for consistency between simulations and past observations) rather than forwards (explicitly assessing if the model is likely to support future decisions). We propose a fundamentally different, forward-looking, decision-oriented hydrologic model testing framework based upon the concept of fit-for-purpose model testing that we call Decision Crash Tests or DCTs. Key DCT elements are: (i) the model purpose (i.e., the decision the model is meant to support) must be identified so that model outputs can be mapped to management decisions; and (ii) the framework evaluates not just the selected hydrologic model but the entire suite of model-building decisions associated with model discretization, calibration, etc. The framework is constructed to directly and quantitatively evaluate model suitability. The DCT framework is applied to a model-building case study on the Grand River in Ontario, Canada. A hypothetical binary decision scenario is analysed (upgrade or do not upgrade the existing flood control structure) under two different sets of model-building decisions. In one case, we show that the set of model-building decisions has a low probability of correctly supporting the upgrade decision. In the other case, we show evidence suggesting that another set of model-building decisions has a high probability of correctly supporting the decision. The proposed DCT framework focuses on what model users typically care about: the management decision in question. The DCT framework will often be very strict and will produce easy-to-interpret results, enabling clear determinations of unsuitability. In the past, hydrologic modelling progress has necessarily meant new models and model-building methods. Continued progress in hydrologic modelling requires finding clear evidence to motivate researchers to disregard unproductive models and methods, and the DCT framework is built to produce this kind of evidence. References: Andréassian, V., C. Perrin, L. Berthet, N. Le Moine, J. Lerat, C. Loumagne, L. Oudin, T. Mathevet, M.-H. Ramos, and A. Valéry (2009), Crash tests for a standardized evaluation of hydrological models. Hydrology and Earth System Sciences, 13, 1757-1764. Klemeš, V. (1986), Operational testing of hydrological simulation models. Hydrological Sciences Journal, 31 (1), 13-24.

  8. Quantum Gravity and Cosmology: an intimate interplay

    NASA Astrophysics Data System (ADS)

    Sakellariadou, Mairi

    2017-08-01

    I will briefly discuss three cosmological models built upon three distinct quantum gravity proposals. I will first highlight the cosmological rôle of a vector field in the framework of a string/brane cosmological model. I will then present the resolution of the big bang singularity and the occurrence of an early era of accelerated expansion of a geometric origin, in the framework of group field theory condensate cosmology. I will then summarise results from an extended gravitational model based on non-commutative spectral geometry, a model that offers a purely geometric explanation for the standard model of particle physics.

  9. A Descriptive Case Study of the Implementation of the Departmentalized Looping Team Model

    ERIC Educational Resources Information Center

    Miller, Cody R.

    2011-01-01

    The conceptual framework guiding this study focuses on local, state, and federal standards as well as demands on schools to improve performance of underserved student populations as impetuses for school structure changes. As related to the aforementioned framework, many schools have developed innovative school restructuring methods such as the…

  10. Application of Resource Description Framework to Personalise Learning: Systematic Review and Methodology

    ERIC Educational Resources Information Center

    Jevsikova, Tatjana; Berniukevicius, Andrius; Kurilovas, Eugenijus

    2017-01-01

    The paper is aimed to present a methodology of learning personalisation based on applying Resource Description Framework (RDF) standard model. Research results are two-fold: first, the results of systematic literature review on Linked Data, RDF "subject-predicate-object" triples, and Web Ontology Language (OWL) application in education…
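
    As a toy illustration of the RDF triple model the methodology builds on (hypothetical vocabulary; any learner-profile ontology could stand in):

```python
# Minimal sketch of RDF "subject-predicate-object" triples with rdflib.
# The learner-profile vocabulary is a hypothetical illustration.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF

EX = Namespace("http://example.org/learning#")
g = Graph()
g.add((EX.learner1, RDF.type, EX.Learner))               # subject-predicate-object
g.add((EX.learner1, EX.prefersStyle, Literal("visual")))
g.add((EX.resource7, EX.suitsStyle, Literal("visual")))

# Personalisation as a query: match resources to the learner's preferred style.
style = g.value(EX.learner1, EX.prefersStyle)
for res in g.subjects(EX.suitsStyle, style):
    print(res)
```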

  11. Information Literacy for Archives and Special Collections: Defining Outcomes

    ERIC Educational Resources Information Center

    Carini, Peter

    2016-01-01

    This article provides the framework for a set of standards and outcomes that would constitute information literacy with primary sources. Based on a working model used at Dartmouth College's Rauner Special Collections Library in Hanover, New Hampshire, these concepts create a framework for teaching with primary source materials intended to produce…

  12. Reengineering Framework for Systems in Education

    ERIC Educational Resources Information Center

    Choquet, Christophe; Corbiere, Alain

    2006-01-01

    Specifications recently proposed as standards in the domain of Technology Enhanced Learning (TEL), question the designers of TEL systems on how to put them into practice. Recent studies in Model Driven Engineering have highlighted the need for a framework which could formalize the use of these specifications as well as enhance the quality of the…

  13. Unified Computational Methods for Regression Analysis of Zero-Inflated and Bound-Inflated Data

    PubMed Central

    Yang, Yan; Simpson, Douglas

    2010-01-01

    Bounded data with excess observations at the boundary are common in many areas of application. Various individual cases of inflated mixture models have been studied in the literature for bound-inflated data, yet the computational methods have been developed separately for each type of model. In this article we use a common framework for computing these models, and expand the range of models for both discrete and semi-continuous data with point inflation at the lower boundary. The quasi-Newton and EM algorithms are adapted and compared for estimation of model parameters. The numerical Hessian and generalized Louis method are investigated as means for computing standard errors after optimization. Correlated data are included in this framework via generalized estimating equations. The estimation of parameters and effectiveness of standard errors are demonstrated through simulation and in the analysis of data from an ultrasound bioeffect study. The unified approach enables reliable computation for a wide class of inflated mixture models and comparison of competing models. PMID:20228950
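
    The common structure behind these models (standard form; the article's cases vary the boundary and the component density $f$) is a two-part mixture with point mass $\pi$ at the lower boundary:

    $$ P(Y = y) = \pi\, \mathbf{1}\{y = 0\} + (1 - \pi)\, f(y;\theta), \qquad \text{so that } P(Y = 0) = \pi + (1 - \pi)\, f(0;\theta) $$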

  14. A penalized framework for distributed lag non-linear models.

    PubMed

    Gasparrini, Antonio; Scheipl, Fabian; Armstrong, Ben; Kenward, Michael G

    2017-09-01

    Distributed lag non-linear models (DLNMs) are a modelling tool for describing potentially non-linear and delayed dependencies. Here, we illustrate an extension of the DLNM framework through the use of penalized splines within generalized additive models (GAM). This extension offers built-in model selection procedures and the possibility of accommodating assumptions on the shape of the lag structure through specific penalties. In addition, this framework includes, as special cases, simpler models previously proposed for linear relationships (DLMs). Alternative versions of penalized DLNMs are compared with each other and with the standard unpenalized version in a simulation study. Results show that this penalized extension to the DLNM class provides greater flexibility and improved inferential properties. The framework exploits recent theoretical developments of GAMs and is implemented using efficient routines within freely available software. Real-data applications are illustrated through two reproducible examples in time series and survival analysis. © 2017 The Authors Biometrics published by Wiley Periodicals, Inc. on behalf of International Biometric Society.
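
    In the standard DLNM formulation (notation ours), the outcome at time $t$ depends on exposure $x$ over lags $0,\dots,L$ through a bidimensional exposure-lag-response function $f$, whose spline coefficients $\boldsymbol{\eta}$ are what the GAM extension penalizes:

    $$ g\big(\mathbb{E}[Y_t]\big) = \alpha + \sum_{\ell=0}^{L} f(x_{t-\ell}, \ell;\ \boldsymbol{\eta}) + \text{confounders} $$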

  15. A Comparison of Career-Related Assessment Tools/Models. Final [Report].

    ERIC Educational Resources Information Center

    WestEd, San Francisco, CA.

    This document contains charts that evaluate career-related assessment items. Chart categories include: Purpose/Current Uses/Format; Intended Population; Oregon Career Related Learning Standards Addressed; Relationship to the Standards; Relationship to Endorsement Area Frameworks; Evidence of Validity; Evidence of Reliability; Evidence of Fairness…

  16. Open Systems Interconnection.

    ERIC Educational Resources Information Center

    Denenberg, Ray

    1985-01-01

    Discusses the need for standards allowing computer-to-computer communication and gives examples of technical issues. The seven-layer framework of the Open Systems Interconnection (OSI) Reference Model is explained and illustrated. Sidebars feature public data networks and Recommendation X.25, OSI standards, OSI layer functions, and a glossary.…

  17. A computational fluid dynamics simulation framework for ventricular catheter design optimization.

    PubMed

    Weisenberg, Sofy H; TerMaath, Stephanie C; Barbier, Charlotte N; Hill, Judith C; Killeffer, James A

    2017-11-10

    OBJECTIVE Cerebrospinal fluid (CSF) shunts are the primary treatment for patients suffering from hydrocephalus. While proven effective in symptom relief, these shunt systems are plagued by high failure rates and often require repeated revision surgeries to replace malfunctioning components. One of the leading causes of CSF shunt failure is obstruction of the ventricular catheter by aggregations of cells, proteins, blood clots, or fronds of choroid plexus that occlude the catheter's small inlet holes or even the full internal catheter lumen. Such obstructions can disrupt CSF diversion out of the ventricular system or impede it entirely. Previous studies have suggested that altering the catheter's fluid dynamics may help to reduce the likelihood of complete ventricular catheter failure caused by obstruction. However, systematic correlation between a ventricular catheter's design parameters and its performance, specifically its likelihood to become occluded, still remains unknown. Therefore, an automated, open-source computational fluid dynamics (CFD) simulation framework was developed for use in the medical community to determine optimized ventricular catheter designs and to rapidly explore parameter influence for a given flow objective. METHODS The computational framework was developed by coupling a 3D CFD solver and an iterative optimization algorithm and was implemented in a high-performance computing environment. The capabilities of the framework were demonstrated by computing an optimized ventricular catheter design that provides uniform flow rates through the catheter's inlet holes, a common design objective in the literature. The baseline computational model was validated using 3D nuclear imaging to provide flow velocities at the inlet holes and through the catheter. RESULTS The optimized catheter design achieved through use of the automated simulation framework improved significantly on previous attempts to reach a uniform inlet flow rate distribution using the standard catheter hole configuration as a baseline. While the standard ventricular catheter design featuring uniform inlet hole diameters and hole spacing has a standard deviation of 14.27% for the inlet flow rates, the optimized design has a standard deviation of 0.30%. CONCLUSIONS This customizable framework, paired with high-performance computing, provides a rapid method of design testing to solve complex flow problems. While a relatively simplified ventricular catheter model was used to demonstrate the framework, the computational approach is applicable to any baseline catheter model, and it is easily adapted to optimize catheters for the unique needs of different patients as well as for other fluid-based medical devices.
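
    The coupling of flow evaluation and optimizer can be sketched as follows (our stand-in, not the authors' framework; cfd_inlet_flows() is a hypothetical stub for the expensive 3D CFD solve the paper runs on HPC resources):

```python
# Minimal sketch: choose hole diameters that minimize the relative spread of
# inlet flow rates. The "physics" below is a toy stand-in for a CFD solve.
import numpy as np
from scipy.optimize import minimize

def cfd_inlet_flows(diameters):
    # Stand-in model: flow grows with hole area but is damped for holes
    # nearer the catheter tip (index 0 = most distal).
    areas = np.pi * (diameters / 2.0) ** 2
    damping = np.linspace(1.0, 0.6, diameters.size)
    return areas * damping

def objective(diameters):
    flows = cfd_inlet_flows(diameters)
    return np.std(flows) / np.mean(flows)   # relative spread, driven to zero

x0 = np.full(8, 0.5)                         # eight equal-diameter holes (mm)
res = minimize(objective, x0, method="Nelder-Mead", bounds=[(0.1, 1.0)] * 8)
print(res.x, objective(res.x))
```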

  18. NASA Software Documentation Standard

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The NASA Software Documentation Standard (hereinafter referred to as "Standard") is designed to support the documentation of all software developed for NASA; its goal is to provide a framework and model for recording the essential information needed throughout the development life cycle and maintenance of a software system. The NASA Software Documentation Standard can be applied to the documentation of all NASA software. The Standard is limited to documentation format and content requirements. It does not mandate specific management, engineering, or assurance standards or techniques. This Standard defines the format and content of documentation for software acquisition, development, and sustaining engineering. Format requirements address where information shall be recorded and content requirements address what information shall be recorded. This Standard provides a framework to allow consistency of documentation across NASA and visibility into the completeness of project documentation. The basic framework consists of four major sections (or volumes). The Management Plan contains all planning and business aspects of a software project, including engineering and assurance planning. The Product Specification contains all technical engineering information, including software requirements and design. The Assurance and Test Procedures contains all technical assurance information, including Test, Quality Assurance (QA), and Verification and Validation (V&V). The Management, Engineering, and Assurance Reports is the library and/or listing of all project reports.

  19. Computing decay rates for new physics theories with FEYNRULES and MADGRAPH 5_AMC@NLO

    NASA Astrophysics Data System (ADS)

    Alwall, Johan; Duhr, Claude; Fuks, Benjamin; Mattelaer, Olivier; Öztürk, Deniz Gizem; Shen, Chia-Hsien

    2015-12-01

    We present new features of the FEYNRULES and MADGRAPH 5_AMC@NLO programs for the automatic computation of decay widths that consistently include channels of arbitrary final-state multiplicity. The implementations are generic enough so that they can be used in the framework of any quantum field theory, possibly including higher-dimensional operators. We extend at the same time the conventions of the Universal FEYNRULES Output (or UFO) format to include decay tables and information on the total widths. We finally provide a set of representative examples of the usage of the new functions of the different codes in the framework of the Standard Model, the Higgs Effective Field Theory, the Strongly Interacting Light Higgs model and the Minimal Supersymmetric Standard Model and compare the results to available literature and programs for validation purposes.
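
    The underlying computation is the standard partial-width integral over the $n$-body phase space, summed over all open channels:

    $$ \Gamma_{1 \to n} = \frac{1}{2M} \int \overline{|\mathcal{M}|^2}\, d\Phi_n, \qquad \Gamma_{\text{tot}} = \sum_i \Gamma_i, \qquad \mathrm{BR}_i = \Gamma_i / \Gamma_{\text{tot}} $$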

  20. Feature Selection and Classifier Parameters Estimation for EEG Signals Peak Detection Using Particle Swarm Optimization

    PubMed Central

    Adam, Asrul; Mohd Tumari, Mohd Zaidi; Mohamad, Mohd Saberi

    2014-01-01

    Electroencephalogram (EEG) signal peak detection is widely used in clinical applications. The peak point can be detected using several approaches, including time, frequency, time-frequency, and nonlinear domains, depending on various peak features from several models. However, no study has established the importance of each peak feature in contributing to a good and generalized model. In this study, feature selection and classifier parameter estimation based on particle swarm optimization (PSO) are proposed as a framework for peak detection on EEG signals in time-domain analysis. Two versions of PSO are used in the study: (1) standard PSO and (2) random asynchronous particle swarm optimization (RA-PSO). The proposed framework tries to find the best combination of all the available features that offers good peak detection and a high classification rate in the conducted experiments. The evaluation results indicate that the accuracy of the peak detection can be improved up to 99.90% and 98.59% for training and testing, respectively, as compared to the framework without feature selection adaptation. Additionally, the proposed framework based on RA-PSO offers a better and more reliable classification rate as compared to standard PSO, as it produces a low-variance model. PMID:25243236

  1. Reliable structural interpretation of small-angle scattering data from bio-molecules in solution--the importance of quality control and a standard reporting framework.

    PubMed

    Jacques, David A; Guss, Jules Mitchell; Trewhella, Jill

    2012-05-17

    Small-angle scattering is becoming an increasingly popular tool for the study of bio-molecular structures in solution. The large number of publications with 3D-structural models generated from small-angle solution scattering data has led to a growing consensus for the need to establish a standard reporting framework for their publication. The International Union of Crystallography recently established a set of guidelines for the necessary information required for the publication of such structural models. Here we describe the rationale for these guidelines and the importance of standardising the way in which small-angle scattering data from bio-molecules and associated structural interpretations are reported.

  2. Prospective Teachers' Perspectives on Mathematics Teaching and Learning: Lens for Interpreting Experiences in a Standards-Based Mathematics Course

    ERIC Educational Resources Information Center

    Chamberlin, Michelle T.

    2013-01-01

    In a mathematics course for prospective elementary teachers, we strove to model standards-based pedagogy. However, an end-of-class reflection revealed the prospective teachers were considering incorporating standards-based strategies in their future classrooms in ways different from our intent. Thus, we drew upon the framework presented by Simon,…

  3. Automated next-to-leading order predictions for new physics at the LHC: The case of colored scalar pair production

    DOE PAGES

    Degrande, Céline; Fuks, Benjamin; Hirschi, Valentin; ...

    2015-05-05

    We present for the first time the full automation of collider predictions matched with parton showers at next-to-leading-order accuracy in QCD within nontrivial extensions of the standard model. The sole inputs required from the user are the model Lagrangian and the process of interest. As an application of the above, we explore scenarios beyond the standard model where new colored scalar particles can be pair produced in hadron collisions. Using simplified models to describe the new field interactions with the standard model, we present precision predictions for the LHC within the MadGraph5_aMC@NLO framework.

  4. Estimating and modelling cure in population-based cancer studies within the framework of flexible parametric survival models.

    PubMed

    Andersson, Therese M L; Dickman, Paul W; Eloranta, Sandra; Lambert, Paul C

    2011-06-22

    When the mortality among a cancer patient group returns to the same level as in the general population, that is, the patients no longer experience excess mortality, the patients still alive are considered "statistically cured". Cure models can be used to estimate the cure proportion as well as the survival function of the "uncured". One limitation of parametric cure models is that the functional form of the survival of the "uncured" has to be specified. It can sometimes be hard to find a survival function flexible enough to fit the observed data, for example, when there is high excess hazard within a few months from diagnosis, which is common among older age groups. This has led to the exclusion of older age groups in population-based cancer studies using cure models. Here we have extended the flexible parametric survival model to incorporate cure as a special case to estimate the cure proportion and the survival of the "uncured". Flexible parametric survival models use splines to model the underlying hazard function, and therefore no parametric distribution has to be specified. We have compared the fit from standard cure models to our flexible cure model, using data on colon cancer patients in Finland. This new method gives similar results to a standard cure model, when it is reliable, and better fit when the standard cure model gives biased estimates. Cure models within the framework of flexible parametric models enables cure modelling when standard models give biased estimates. These flexible cure models enable inclusion of older age groups and can give stage-specific estimates, which is not always possible from parametric cure models. © 2011 Andersson et al; licensee BioMed Central Ltd.
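
    For reference, the standard mixture cure formulation that this work extends (notation ours) writes the all-cause survival in terms of the expected survival $S^*(t)$ of the general population, the cure proportion $\pi$, and the net survival $S_u(t)$ of the "uncured":

    $$ S(t) = S^*(t)\,\big[\pi + (1 - \pi)\, S_u(t)\big] $$

    The flexible parametric variant models the log cumulative excess hazard with restricted cubic splines rather than assuming a parametric form for $S_u$.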

  5. Estimating and modelling cure in population-based cancer studies within the framework of flexible parametric survival models

    PubMed Central

    2011-01-01

    Background When the mortality among a cancer patient group returns to the same level as in the general population, that is, the patients no longer experience excess mortality, the patients still alive are considered "statistically cured". Cure models can be used to estimate the cure proportion as well as the survival function of the "uncured". One limitation of parametric cure models is that the functional form of the survival of the "uncured" has to be specified. It can sometimes be hard to find a survival function flexible enough to fit the observed data, for example, when there is high excess hazard within a few months from diagnosis, which is common among older age groups. This has led to the exclusion of older age groups in population-based cancer studies using cure models. Methods Here we have extended the flexible parametric survival model to incorporate cure as a special case to estimate the cure proportion and the survival of the "uncured". Flexible parametric survival models use splines to model the underlying hazard function, and therefore no parametric distribution has to be specified. Results We have compared the fit from standard cure models to our flexible cure model, using data on colon cancer patients in Finland. This new method gives similar results to a standard cure model, when it is reliable, and better fit when the standard cure model gives biased estimates. Conclusions Cure models within the framework of flexible parametric models enables cure modelling when standard models give biased estimates. These flexible cure models enable inclusion of older age groups and can give stage-specific estimates, which is not always possible from parametric cure models. PMID:21696598

  6. Probabilistic Graphical Model Representation in Phylogenetics

    PubMed Central

    Höhna, Sebastian; Heath, Tracy A.; Boussau, Bastien; Landis, Michael J.; Ronquist, Fredrik; Huelsenbeck, John P.

    2014-01-01

    Recent years have seen a rapid expansion of the model space explored in statistical phylogenetics, emphasizing the need for new approaches to statistical model representation and software development. Clear communication and representation of the chosen model is crucial for: (i) reproducibility of an analysis, (ii) model development, and (iii) software design. Moreover, a unified, clear and understandable framework for model representation lowers the barrier for beginners and nonspecialists to grasp complex phylogenetic models, including their assumptions and parameter/variable dependencies. Graphical modeling is a unifying framework that has gained in popularity in the statistical literature in recent years. The core idea is to break complex models into conditionally independent distributions. The strength lies in the comprehensibility, flexibility, and adaptability of this formalism, and the large body of computational work based on it. Graphical models are well-suited to teach statistical models, to facilitate communication among phylogeneticists and in the development of generic software for simulation and statistical inference. Here, we provide an introduction to graphical models for phylogeneticists and extend the standard graphical model representation to the realm of phylogenetics. We introduce a new graphical model component, tree plates, to capture the changing structure of the subgraph corresponding to a phylogenetic tree. We describe a range of phylogenetic models using the graphical model framework and introduce modules to simplify the representation of standard components in large and complex models. Phylogenetic model graphs can be readily used in simulation, maximum likelihood inference, and Bayesian inference using, for example, Metropolis–Hastings or Gibbs sampling of the posterior distribution. [Computation; graphical models; inference; modularization; statistical phylogenetics; tree plate.] PMID:24951559
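
    The core factorization idea, with $\mathrm{pa}(x_i)$ denoting the parents of node $x_i$ in the graph (in phylogenetics, e.g., branch lengths, substitution-rate parameters, or the tree itself):

    $$ p(x_1, \dots, x_n) = \prod_{i=1}^{n} p\big(x_i \mid \mathrm{pa}(x_i)\big) $$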

  7. Archetype Model-Driven Development Framework for EHR Web System.

    PubMed

    Kobayashi, Shinji; Kimura, Eizen; Ishihara, Ken

    2013-12-01

    This article describes the Web application framework for Electronic Health Records (EHRs) we have developed to reduce construction costs for EHR systems. The openEHR project has developed a clinical-model-driven architecture for future-proof interoperable EHR systems. This project provides the specifications to standardize clinical domain model implementations, upon which the ISO/CEN 13606 standards are based. The reference implementation has been formally described in Eiffel. Moreover, C# and Java implementations have been developed as references. While scripting languages have become more popular in recent years because of their higher efficiency and faster development, they had not been involved in the openEHR implementations. From 2007, we have used the Ruby language and Ruby on Rails (RoR) as an agile development platform to implement EHR systems in conformity with the openEHR specifications. We implemented almost all of the specifications, an Archetype Definition Language parser, and an RoR scaffold generator from archetypes. Although some problems have emerged, most of them have been resolved. We have provided an agile EHR Web framework that can build up Web systems from archetype models using RoR. The feasibility of the archetype model to provide semantic interoperability of EHRs has been demonstrated, and we have verified that it is suitable for the construction of EHR systems.

  8. External Modeling Framework And The OpenUTF

    DTIC Science & Technology

    2012-01-24

    Only report-documentation-page fragments were recovered for this record (paper 12S-SIW-034, WarpIV Technologies, Inc., 2012): "External Modeling Framework and the OpenUTF," Jeffrey S. Steinman, Ph.D., and Craig N. Lammers. The recoverable text notes that full visualization was performed at the Naval Research Laboratory (NRL) in Washington DC.

  9. Comparison of optimal design methods in inverse problems

    NASA Astrophysics Data System (ADS)

    Banks, H. T.; Holm, K.; Kappel, F.

    2011-07-01

    Typical optimal design methods for inverse or parameter estimation problems are designed to choose optimal sampling distributions through minimization of a specific cost function related to the resulting error in parameter estimates. It is hoped that the inverse problem will produce parameter estimates with increased accuracy using data collected according to the optimal sampling distribution. Here we formulate the classical optimal design problem in the context of general optimization problems over distributions of sampling times. We present a new Prohorov metric-based theoretical framework that permits one to treat succinctly and rigorously any optimal design criteria based on the Fisher information matrix. A fundamental approximation theory is also included in this framework. A new optimal design, SE-optimal design (standard error optimal design), is then introduced in the context of this framework. We compare this new design criterion with the more traditional D-optimal and E-optimal designs. The optimal sampling distributions from each design are used to compute and compare standard errors; the standard errors for parameters are computed using asymptotic theory or bootstrapping and the optimal mesh. We use three examples to illustrate ideas: the Verhulst-Pearl logistic population model (Banks H T and Tran H T 2009 Mathematical and Experimental Modeling of Physical and Biological Processes (Boca Raton, FL: Chapman and Hall/CRC)), the standard harmonic oscillator model (Banks H T and Tran H T 2009) and a popular glucose regulation model (Bergman R N, Ider Y Z, Bowden C R and Cobelli C 1979 Am. J. Physiol. 236 E667-77; De Gaetano A and Arino O 2000 J. Math. Biol. 40 136-68; Toffolo G, Bergman R N, Finegood D T, Bowden C R and Cobelli C 1980 Diabetes 29 979-90).
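
    For orientation, the design criteria compared in this entry can all be phrased in terms of the Fisher information matrix (FIM). A hedged sketch in generic notation, not necessarily the authors' exact formulation: for a model f(t; \theta) observed at sampling times \tau = \{t_i\} with observation variance \sigma^2,

        F(\tau, \theta) = \frac{1}{\sigma^2} \sum_i \nabla_\theta f(t_i; \theta) \, \nabla_\theta f(t_i; \theta)^{\mathsf T},

    D-optimal design maximizes \det F, E-optimal design optimizes the extreme eigenvalue of F, and SE-optimal design minimizes the sum of squared normalized asymptotic standard errors, roughly \sum_k (F^{-1})_{kk} / \theta_k^2.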

  10. Assessment of the Draft AIAA S-119 Flight Dynamic Model Exchange Standard

    NASA Technical Reports Server (NTRS)

    Jackson, E. Bruce; Murri, Daniel G.; Hill, Melissa A.; Jessick, Matthew V.; Penn, John M.; Hasan, David A.; Crues, Edwin Z.; Falck, Robert D.; McCarthy, Thomas G.; Vuong, Nghia

    2011-01-01

    An assessment of a draft AIAA standard for flight dynamics model exchange, ANSI/AIAA S-119-2011, was conducted on behalf of NASA by a team from the NASA Engineering and Safety Center. The assessment included adding the capability of importing standard models into real-time simulation facilities at several NASA Centers as well as into analysis simulation tools. All participants were successful at importing two example models into their respective simulation frameworks by using existing software libraries or by writing new import tools. Deficiencies in the libraries and format documentation were identified and fixed; suggestions for improvements to the standard were provided to the AIAA. An innovative tool to generate C code directly from such a model was developed. Performance of the software libraries compared favorably with compiled code. As a result of this assessment, several NASA Centers can now import standard models directly into their simulations. NASA is considering adopting the now-published S-119 standard as an internal recommended practice.

  11. Using Standards and High-Stakes Testing for Students: Exploiting Power with Critical Pedagogy. Counterpoints: Studies in the Postmodern Theory of Education. Volume 425

    ERIC Educational Resources Information Center

    Gorlewski, Julie A., Ed.; Porfilio, Brad J., Ed.; Gorlewski, David A., Ed.

    2012-01-01

    This book overturns the typical conception of standards, empowering educators by providing concrete examples of how top-down models of assessment can be embraced and used in ways that are consistent with critical pedagogies. Although standards, as broad frameworks for setting learning targets, are not necessarily problematic, when they are…

  12. Business Education. Vocational Education Program Courses Standards.

    ERIC Educational Resources Information Center

    Florida State Dept. of Education, Tallahassee. Div. of Vocational, Adult, and Community Education.

    This document contains vocational education program courses standards (curriculum frameworks and student performance standards) for business technology education programs in Florida. Each program courses standard is composed of two parts: a curriculum framework and student performance standards. The curriculum framework includes four major…

  13. A Unified Framework for Association Analysis with Multiple Related Phenotypes

    PubMed Central

    Stephens, Matthew

    2013-01-01

    We consider the problem of assessing associations between multiple related outcome variables and a single explanatory variable of interest. This problem arises in many settings, including genetic association studies, where the explanatory variable is genotype at a genetic variant. We outline a framework for conducting this type of analysis, based on Bayesian model comparison and model averaging for multivariate regressions. This framework unifies several common approaches to this problem, and includes both standard univariate and standard multivariate association tests as special cases. The framework also unifies the problems of testing for associations and explaining associations – that is, identifying which outcome variables are associated with genotype. This provides an alternative to the usual, but conceptually unsatisfying, approach of resorting to univariate tests when explaining and interpreting significant multivariate findings. The method is computationally tractable genome-wide for modest numbers of phenotypes (e.g. 5–10), and can be applied to summary data, without access to raw genotype and phenotype data. We illustrate the methods on both simulated examples and on a genome-wide association study of blood lipid traits, where we identify 18 potential novel genetic associations that were not identified by univariate analyses of the same data. PMID:23861737
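
    At its core, the framework rests on Bayesian model comparison for multivariate regression. A hedged sketch of the flavor of the computation, in generic notation rather than the paper's exact parameterization: for phenotype matrix Y and genotype g, each candidate model M_\gamma specifies which outcomes are associated with g, and models are compared and averaged via

        \mathrm{BF}_\gamma = \frac{p(Y \mid g, M_\gamma)}{p(Y \mid g, M_0)}, \qquad p(M_\gamma \mid Y, g) \propto \mathrm{BF}_\gamma \, \pi(M_\gamma),

    where M_0 is the global null; standard univariate and multivariate tests then correspond to particular choices of \gamma.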

  14. Bayesian hierarchical models for cost-effectiveness analyses that use data from cluster randomized trials.

    PubMed

    Grieve, Richard; Nixon, Richard; Thompson, Simon G

    2010-01-01

    Cost-effectiveness analyses (CEA) may be undertaken alongside cluster randomized trials (CRTs) where randomization is at the level of the cluster (for example, the hospital or primary care provider) rather than the individual. Costs (and outcomes) within clusters may be correlated so that the assumption made by standard bivariate regression models, that observations are independent, is incorrect. This study develops a flexible modeling framework to acknowledge the clustering in CEA that use CRTs. The authors extend previous Bayesian bivariate models for CEA of multicenter trials to recognize the specific form of clustering in CRTs. They develop new Bayesian hierarchical models (BHMs) that allow mean costs and outcomes, and also variances, to differ across clusters. They illustrate how each model can be applied using data from a large (1732 cases, 70 primary care providers) CRT evaluating alternative interventions for reducing postnatal depression. The analyses compare cost-effectiveness estimates from BHMs with standard bivariate regression models that ignore the data hierarchy. The BHMs show high levels of cost heterogeneity across clusters (intracluster correlation coefficient, 0.17). Compared with standard regression models, the BHMs yield substantially increased uncertainty surrounding the cost-effectiveness estimates, and altered point estimates. The authors conclude that ignoring clustering can lead to incorrect inferences. The BHMs that they present offer a flexible modeling framework that can be applied more generally to CEA that use CRTs.
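
    The hierarchical structure the authors describe can be sketched as follows, in generic notation and assuming bivariate normal costs and effects (the paper's richer variants also let variances differ across clusters):

        (c_{kj}, e_{kj})^{\mathsf T} \sim N_2(\mu_k, \Sigma_k), \qquad \mu_k \sim N_2(\mu, \Omega),

    for individual j in cluster k, so that mean costs and outcomes (\mu_k), and in the extended models the within-cluster covariance (\Sigma_k), vary across clusters; cost-effectiveness summaries such as the incremental net benefit \lambda \, \Delta e - \Delta c are then computed from the posterior of the top-level parameters.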

  15. A development framework for semantically interoperable health information systems.

    PubMed

    Lopez, Diego M; Blobel, Bernd G M E

    2009-02-01

    Semantic interoperability is a basic challenge to be met for new generations of distributed, communicating and co-operating health information systems (HIS) enabling shared care and e-Health. Analysis, design, implementation and maintenance of such systems and intrinsic architectures have to follow a unified development methodology. The Generic Component Model (GCM) is used as a framework for modeling any system to evaluate and harmonize state-of-the-art architecture development approaches and standards for health information systems, as well as to derive a coherent architecture development framework for sustainable, semantically interoperable HIS and their components. The proposed methodology is based on the Rational Unified Process (RUP), taking advantage of its flexibility to be configured for integrating other architectural approaches such as Service-Oriented Architecture (SOA), Model-Driven Architecture (MDA), ISO 10746, and HL7 Development Framework (HDF). Existing architectural approaches have been analyzed, compared and finally harmonized towards an architecture development framework for advanced health information systems. Starting with the requirements for semantic interoperability derived from paradigm changes for health information systems, and supported by formal software process engineering methods, an appropriate development framework for semantically interoperable HIS has been provided. The usability of the framework has been exemplified in a public health scenario.

  16. pyomo.dae: a modeling and automatic discretization framework for optimization with differential and algebraic equations

    DOE PAGES

    Nicholson, Bethany; Siirola, John D.; Watson, Jean-Paul; ...

    2017-12-20

    We describe pyomo.dae, an open source Python-based modeling framework that enables high-level abstract specification of optimization problems with differential and algebraic equations. The pyomo.dae framework is integrated with the Pyomo open source algebraic modeling language, and is available at http://www.pyomo.org. One key feature of pyomo.dae is that it does not restrict users to standard, predefined forms of differential equations, providing a high degree of modeling flexibility and the ability to express constraints that cannot be easily specified in other modeling frameworks. Other key features of pyomo.dae are the ability to specify optimization problems with high-order differential equations and partial differential equations, defined on restricted domain types, and the ability to automatically transform high-level abstract models into finite-dimensional algebraic problems that can be solved with off-the-shelf solvers. Moreover, pyomo.dae users can leverage existing capabilities of Pyomo to embed differential equation models within stochastic and integer programming models and mathematical programs with equilibrium constraint formulations. Collectively, these features enable the exploration of new modeling concepts, discretization schemes, and the benchmarking of state-of-the-art optimization solvers.
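
    The workflow the abstract describes (declare a continuous domain, state its differential equations symbolically, then automatically discretize) can be made concrete with a short example. A minimal sketch, assuming a working Pyomo installation with pyomo.dae; the toy ODE is ours, not from the paper:

        from pyomo.environ import ConcreteModel, Var, Constraint, TransformationFactory
        from pyomo.dae import ContinuousSet, DerivativeVar

        m = ConcreteModel()
        m.t = ContinuousSet(bounds=(0, 10))    # continuous time domain
        m.x = Var(m.t)                         # state trajectory x(t)
        m.dxdt = DerivativeVar(m.x, wrt=m.t)   # declares dx/dt symbolically

        # The differential equation dx/dt = -0.5 x, imposed over the whole domain
        def _ode(m, t):
            return m.dxdt[t] == -0.5 * m.x[t]
        m.ode = Constraint(m.t, rule=_ode)
        m.x[0].fix(1.0)                        # initial condition x(0) = 1

        # Automatic transformation into a finite-dimensional algebraic problem
        # (orthogonal collocation over 20 finite elements, 3 collocation points)
        TransformationFactory('dae.collocation').apply_to(m, nfe=20, ncp=3)
        # The discretized model can now be handed to an off-the-shelf NLP solver,
        # e.g. SolverFactory('ipopt').solve(m) after importing SolverFactory.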

  17. pyomo.dae: a modeling and automatic discretization framework for optimization with differential and algebraic equations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nicholson, Bethany; Siirola, John D.; Watson, Jean-Paul

    We describe pyomo.dae, an open source Python-based modeling framework that enables high-level abstract specification of optimization problems with differential and algebraic equations. The pyomo.dae framework is integrated with the Pyomo open source algebraic modeling language, and is available at http://www.pyomo.org. One key feature of pyomo.dae is that it does not restrict users to standard, predefined forms of differential equations, providing a high degree of modeling flexibility and the ability to express constraints that cannot be easily specified in other modeling frameworks. Other key features of pyomo.dae are the ability to specify optimization problems with high-order differential equations and partial differential equations, defined on restricted domain types, and the ability to automatically transform high-level abstract models into finite-dimensional algebraic problems that can be solved with off-the-shelf solvers. Moreover, pyomo.dae users can leverage existing capabilities of Pyomo to embed differential equation models within stochastic and integer programming models and mathematical programs with equilibrium constraint formulations. Collectively, these features enable the exploration of new modeling concepts, discretization schemes, and the benchmarking of state-of-the-art optimization solvers.

  18. Probing Supersymmetry with Neutral Current Scattering Experiments

    NASA Astrophysics Data System (ADS)

    Kurylov, A.; Ramsey-Musolf, M. J.; Su, S.

    2004-02-01

    We compute the supersymmetric contributions to the weak charges of the electron (Q_W^e) and proton (Q_W^p) in the framework of the Minimal Supersymmetric Standard Model. We also consider the ratios of neutral current to charged current cross sections, R_ν and R_ν̄, in ν(ν̄)-nucleus deep inelastic scattering, and compare the supersymmetric corrections with the deviations of these quantities from the Standard Model predictions implied by the recent NuTeV measurement.
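
    For context, at tree level in the Standard Model the weak charges discussed here take the textbook forms

        Q_W^e = -1 + 4\sin^2\theta_W, \qquad Q_W^p = 1 - 4\sin^2\theta_W,

    so both are numerically suppressed (since \sin^2\theta_W \approx 0.23) and are therefore unusually sensitive probes: small supersymmetric loop corrections can produce fractionally large shifts relative to these Standard Model values.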

  19. A Systems Approach to Designing Effective Clinical Trials Using Simulations

    PubMed Central

    Fusaro, Vincent A.; Patil, Prasad; Chi, Chih-Lin; Contant, Charles F.; Tonellato, Peter J.

    2013-01-01

    Background Pharmacogenetics in warfarin clinical trials has failed to show a significant benefit compared to standard clinical therapy. This study demonstrates a computational framework to systematically evaluate pre-clinical trial design of target population, pharmacogenetic algorithms, and dosing protocols to optimize primary outcomes. Methods and Results We programmatically created an end-to-end framework that systematically evaluates warfarin clinical trial designs. The framework includes options to create a patient population, multiple dosing strategies including genetic-based and non-genetic clinical-based, multiple dose adjustment protocols, pharmacokinetic/pharmacodynamic (PK/PD) modeling and international normalized ratio (INR) prediction, as well as various types of outcome measures. We validated the framework by conducting 1,000 simulations of the CoumaGen clinical trial primary endpoints. The simulation predicted a mean time in therapeutic range (TTR) of 70.6% and 72.2% (P = 0.47) in the standard and pharmacogenetic arms, respectively. Then, we evaluated another dosing protocol under the same original conditions and found a significant difference in TTR between the pharmacogenetic and standard arms (78.8% vs. 73.8%, respectively; P = 0.0065). Conclusions We demonstrate that this simulation framework is useful in the pre-clinical assessment phase to study and evaluate design options and provide evidence to optimize the clinical trial for patient efficacy and reduced risk. PMID:23261867
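
    The end-to-end structure described above (population, dosing algorithm, dose-adjustment protocol, PK/PD/INR model, outcome measure) can be sketched as a simulation harness. Everything below is hypothetical scaffolding for illustration, not the authors' code; the response model in particular is a stand-in for a real PK/PD model, and the effect sizes are arbitrary placeholders:

        import random

        def patient_ttr(dosing, protocol_gain):
            """Return a simulated time-in-therapeutic-range (TTR) fraction for
            one patient. 'dosing' is 'standard' or 'pgx'; the numbers below
            are placeholders, not trial results."""
            base = 0.72 if dosing == 'pgx' else 0.70
            ttr = random.gauss(base + protocol_gain, 0.10)
            return min(1.0, max(0.0, ttr))

        def simulate_trial(n=1000, protocol_gain=0.0, seed=42):
            """Compare mean TTR between the two arms over n patients per arm."""
            random.seed(seed)
            std = [patient_ttr('standard', protocol_gain) for _ in range(n)]
            pgx = [patient_ttr('pgx', protocol_gain) for _ in range(n)]
            return sum(std) / n, sum(pgx) / n

        print(simulate_trial())  # roughly (0.70, 0.72), echoing the endpoint structure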

  20. Can a Competence or Standards Model Facilitate an Inclusive Approach to Teacher Education?

    ERIC Educational Resources Information Center

    Moran, Anne

    2009-01-01

    The paper seeks to determine whether programmes of initial teacher education (ITE) can contribute to the development of beginning teachers' inclusive attitudes, values and practices. The majority of ITE programmes are based on government prescribed competence or standards frameworks, which are underpinned by Codes of Professional Values. It is…

  1. Left Ventricular Endocardium Tracking by Fusion of Biomechanical and Deformable Models

    PubMed Central

    Gu, Jason

    2014-01-01

    This paper presents a framework for tracking the left ventricular (LV) endocardium through a 2D echocardiography image sequence. The framework is based on fusion of a biomechanical (BM) model of the heart with a parametric deformable model. The BM model constitutive equation consists of passive and active strain energy functions. The deformations of the LV are obtained by solving the constitutive equations using the ABAQUS FEM in each frame of the cardiac cycle. The strain energy functions are defined in two user subroutines for the active and passive phases. An average fusion technique is used to fuse the BM and deformable model contours. Experiments were conducted to verify the detected contours, and the results are evaluated by comparing them to a created gold standard. The results and the evaluation show that the framework has tremendous potential to track and segment the LV through the whole cardiac cycle. PMID:24587814
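
    The "average fusion" step is the simplest part of the pipeline and is easy to make concrete. A minimal sketch, assuming the two contours have already been resampled so that their points correspond one-to-one (which the real pipeline must guarantee):

        import numpy as np

        def fuse_contours(bm_contour, deformable_contour):
            """Average-fuse two (N, 2) arrays of corresponding (x, y) points."""
            a = np.asarray(bm_contour, dtype=float)
            b = np.asarray(deformable_contour, dtype=float)
            assert a.shape == b.shape, "contours must be in point correspondence"
            return 0.5 * (a + b)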

  2. Setting performance standards for medical practice: a theoretical framework.

    PubMed

    Southgate, L; Hays, R B; Norcini, J; Mulholland, H; Ayers, B; Woolliscroft, J; Cusimano, M; McAvoy, P; Ainsworth, M; Haist, S; Campbell, M

    2001-05-01

    The assessment of performance in the real world of medical practice is now widely accepted as the goal of assessment at the postgraduate level. This is largely a validity issue, as it is recognised that tests of knowledge and clinical simulations cannot on their own really measure how medical practitioners function in the broader health care system. However, the development of standards for performance-based assessment is not as well understood as in competency assessment, where simulations can more readily reflect narrower issues of knowledge and skills. This paper proposes a theoretical framework for the development of standards that reflect the more complex world in which experienced medical practitioners work. The paper reflects the combined experiences of a group of education researchers and the results of literature searches that included identifying current health system data sources that might contribute information to the measurement of standards. Standards that reflect the complexity of medical practice may best be developed through an "expert systems" analysis of clinical conditions for which desired health care outcomes reflect the contribution of several health professionals within a complex, three-dimensional, contextual model. Examples of the model are provided, but further work is needed to test validity and measurability.

  3. Planning treatment of ischemic heart disease with partially observable Markov decision processes.

    PubMed

    Hauskrecht, M; Fraser, H

    2000-03-01

    Diagnosis of a disease and its treatment are not separate, one-shot activities. Instead, they are very often dependent and interleaved over time. This is mostly due to uncertainty about the underlying disease, uncertainty associated with the response of a patient to the treatment and varying cost of different diagnostic (investigative) and treatment procedures. The framework of partially observable Markov decision processes (POMDPs) developed and used in the operations research, control theory and artificial intelligence communities is particularly suitable for modeling such a complex decision process. In this paper, we show how the POMDP framework can be used to model and solve the problem of the management of patients with ischemic heart disease (IHD), and demonstrate the modeling advantages of the framework over standard decision formalisms.
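
    The heart of any POMDP controller is the Bayesian belief update, which folds a diagnostic observation into the current distribution over disease states. A minimal sketch in generic POMDP notation (dictionary-based for clarity; real solvers vectorize this, and the state/observation structure for IHD is the paper's, not shown here):

        def belief_update(belief, action, obs, T, O):
            """One step of the POMDP Bayes filter.
            belief: {state: prob}; T[s][a][s2]: transition probability;
            O[s2][a][o]: probability of observing o after action a ends in s2."""
            states = list(belief)
            unnorm = {
                s2: O[s2][action][obs] * sum(T[s][action][s2] * belief[s] for s in states)
                for s2 in states
            }
            z = sum(unnorm.values())  # P(obs | belief, action); assumed positive here
            return {s2: p / z for s2, p in unnorm.items()}

    The treatment policy then maps beliefs, not raw observations, to actions, which is what lets diagnosis and treatment interleave over time.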

  4. Towards a common oil spill risk assessment framework – Adapting ISO 31000 and addressing uncertainties.

    PubMed

    Sepp Neves, Antonio Augusto; Pinardi, Nadia; Martins, Flavio; Janeiro, Joao; Samaras, Achilleas; Zodiatis, George; De Dominicis, Michela

    2015-08-15

    Oil spills are a transnational problem, and establishing a common standard methodology for Oil Spill Risk Assessments (OSRAs) is thus paramount in order to protect marine environments and coastal communities. In this study we firstly identified the strengths and weaknesses of the OSRAs carried out in various parts of the globe. We then searched for a generic and recognized standard, i.e. ISO 31000, in order to design a method to perform OSRAs in a scientific and standard way. The new framework was tested for the Lebanon oil spill that occurred in 2006, employing ensemble oil spill modeling to quantify the risks and uncertainties due to unknown spill characteristics. The application of the framework generated valuable visual instruments for the transparent communication of the risks, replacing the use of risk tolerance levels, and thus highlighting the priority areas to protect in case of an oil spill.

  5. Coalescent: an open-science framework for importance sampling in coalescent theory.

    PubMed

    Tewari, Susanta; Spouge, John L

    2015-01-01

    Background. In coalescent theory, computer programs often use importance sampling to calculate likelihoods and other statistical quantities. An importance sampling scheme can exploit human intuition to improve statistical efficiency of computations, but unfortunately, in the absence of general computer frameworks on importance sampling, researchers often struggle to translate new sampling schemes computationally or benchmark against different schemes, in a manner that is reliable and maintainable. Moreover, most studies use computer programs lacking a convenient user interface or the flexibility to meet the current demands of open science. In particular, current computer frameworks can only evaluate the efficiency of a single importance sampling scheme or compare the efficiencies of different schemes in an ad hoc manner. Results. We have designed a general framework (http://coalescent.sourceforge.net; language: Java; License: GPLv3) for importance sampling that computes likelihoods under the standard neutral coalescent model of a single, well-mixed population of constant size over time, following the infinite sites model of mutation. The framework models the necessary core concepts, comes integrated with several data sets of varying size, implements the standard competing proposals, and integrates tightly with our previous framework for calculating exact probabilities. For a given dataset, it computes the likelihood and provides the maximum likelihood estimate of the mutation parameter. Well-known benchmarks in the coalescent literature validate the accuracy of the framework. The framework provides an intuitive user interface with minimal clutter. For performance, the framework switches automatically to modern multicore hardware, if available. It runs on three major platforms (Windows, Mac and Linux). Extensive tests and coverage make the framework reliable and maintainable. Conclusions. In coalescent theory, many studies of computational efficiency consider only effective sample size. Here, we evaluate proposals in the coalescent literature and discover that the order of efficiency among the three importance sampling schemes changes when one considers running time as well as effective sample size. We also describe a computational technique called "just-in-time delegation" available to improve the trade-off between running time and precision by constructing improved importance sampling schemes from existing ones. Thus, our systems approach is a potential solution to the "2^8 programs problem" highlighted by Felsenstein, because it provides the flexibility to include or exclude various features of similar coalescent models or importance sampling schemes.
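
    For readers outside coalescent theory, the estimator at the center of this framework is the standard importance-sampling identity. In generic notation (not the package's API):

        \hat{L}(\theta) = \frac{1}{N} \sum_{i=1}^{N} \frac{p_\theta(H_i)}{q(H_i)}, \qquad H_i \sim q,

    where the H_i are genealogical histories compatible with the data, p_\theta is their probability under the coalescent model, and q is the proposal distribution; the effective sample size the authors discuss is \mathrm{ESS} = (\sum_i w_i)^2 / \sum_i w_i^2 with weights w_i = p_\theta(H_i)/q(H_i).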

  6. RRegrs: an R package for computer-aided model selection with multiple regression models.

    PubMed

    Tsiliki, Georgia; Munteanu, Cristian R; Seoane, Jose A; Fernandez-Lozano, Carlos; Sarimveis, Haralambos; Willighagen, Egon L

    2015-01-01

    Predictive regression models can be created with many different modelling approaches. Choices need to be made for data set splitting, cross-validation methods, specific regression parameters and best model criteria, as they all affect the accuracy and efficiency of the produced predictive models, and therefore raise model reproducibility and comparison issues. Cheminformatics and bioinformatics use predictive modelling extensively and exhibit a need for standardization of these methodologies in order to assist model selection and speed up the process of predictive model development. A tool accessible to all users, irrespective of their statistical knowledge, would be valuable if it tested several simple and complex regression models and validation schemes, produced unified reports, and offered the option to be integrated into more extensive studies. Additionally, such a methodology should be implemented as a free programming package, in order to be continuously adapted and redistributed by others. We propose an integrated framework for creating multiple regression models, called RRegrs. The tool offers the option of ten simple and complex regression methods combined with repeated 10-fold and leave-one-out cross-validation. Methods include Multiple Linear regression, Generalized Linear Model with Stepwise Feature Selection, Partial Least Squares regression, Lasso regression, and Support Vector Machines Recursive Feature Elimination. The new framework is an automated, fully validated procedure which produces standardized reports to quickly oversee the impact of choices in modelling algorithms and assess the model and cross-validation results. The methodology was implemented as an open source R package, available at https://www.github.com/enanomapper/RRegrs, by reusing and extending the caret package. The universality of the new methodology is demonstrated using five standard data sets from different scientific fields. Its efficiency in cheminformatics and QSAR modelling is shown with three use cases: proteomics data for surface-modified gold nanoparticles, nano-metal oxides descriptor data, and molecular descriptors for acute aquatic toxicity data. The results show that for all data sets RRegrs reports models with equal or better performance for both training and test sets than those reported in the original publications. Its good performance, as well as its adaptability in terms of parameter optimization, could make RRegrs a popular framework to assist the initial exploration of predictive models, and with that, the design of more comprehensive in silico screening applications. Graphical abstract: RRegrs is a computer-aided model selection framework for R multiple regression models; it is a fully validated procedure with application to QSAR modelling.

  7. An assisted navigation training framework based on judgment theory using sparse and discrete human-machine interfaces.

    PubMed

    Lopes, Ana C; Nunes, Urbano

    2009-01-01

    This paper presents a new framework to train people with severe motor disabilities to steer an assisted mobile robot (AMR), such as a powered wheelchair. Users with a high level of motor disability are not able to use standard HMIs, which provide a continuous command signal (e.g., a standard joystick). For this reason, HMIs providing a small set of simple commands, sparse and discrete in time, must be used (e.g., a scanning interface or a brain-computer interface), which makes it very difficult to steer the AMR. The assisted navigation training framework (ANTF) is therefore designed to train users to drive the AMR in indoor structured environments using this type of HMI. Additionally, it characterizes how users steer the robot; this characterization is later used to adapt the AMR navigation system to the user's competence in steering the AMR. A rule-based lens (RBL) model is used to characterize users' driving of the AMR. Individual judgment performance in choosing the best manoeuvres is modeled using a genetic-based policy capturing (GBPC) technique designed to infer non-compensatory judgment strategies from human decision data. Three user models, at three different learning stages, using the RBL paradigm, are presented.

  8. Outputs as Educator Effectiveness in the United States: Shifting towards Political Accountability

    ERIC Educational Resources Information Center

    Piro, Jody S.; Mullen, Laurie

    2013-01-01

    The definition of educator effectiveness is being redefined by econometric modeling that evidences student achievement on standardized tests. While the reasons that econometric frameworks are in vogue are many, it is clear that the strength of such models lies in the quantifiable evidence of student learning. Current accountability models frame…

  9. Application of ''Earl's Assessment "as", Assessment "for", and Assessment "of" Learning Model'' with Orthopaedic Assessment Clinical Competence

    ERIC Educational Resources Information Center

    Lafave, Mark R.; Katz, Larry; Vaughn, Norman

    2013-01-01

    Context: In order to study the efficacy of assessment methods, a theoretical framework of Earl's model of assessment was introduced. Objective: (1) Introduce the predictive learning assessment model (PLAM) as an application of Earl's model of learning; (2) test Earl's model of learning through the use of the Standardized Orthopedic Assessment Tool…

  10. Use of Annotations for Component and Framework Interoperability

    NASA Astrophysics Data System (ADS)

    David, O.; Lloyd, W.; Carlson, J.; Leavesley, G. H.; Geter, F.

    2009-12-01

    The popular programming languages Java and C# provide annotations, a form of meta-data construct. Software frameworks for web integration, web services, database access, and unit testing now take advantage of annotations to reduce the complexity of APIs and the quantity of integration code between the application and framework infrastructure. Adopting annotation features in frameworks has been observed to lead to cleaner and leaner application code. The USDA Object Modeling System (OMS) version 3.0 fully embraces the annotation approach and additionally defines a meta-data standard for components and models. In version 3.0 framework/model integration previously accomplished using API calls is now achieved using descriptive annotations. This enables the framework to provide additional functionality non-invasively such as implicit multithreading, and auto-documenting capabilities while achieving a significant reduction in the size of the model source code. Using a non-invasive methodology leads to models and modeling components with only minimal dependencies on the modeling framework. Since models and modeling components are not directly bound to framework by the use of specific APIs and/or data types they can more easily be reused both within the framework as well as outside of it. To study the effectiveness of an annotation based framework approach with other modeling frameworks, a framework-invasiveness study was conducted to evaluate the effects of framework design on model code quality. A monthly water balance model was implemented across several modeling frameworks and several software metrics were collected. The metrics selected were measures of non-invasive design methods for modeling frameworks from a software engineering perspective. It appears that the use of annotations positively impacts several software quality measures. In a next step, the PRMS model was implemented in OMS 3.0 and is currently being implemented for water supply forecasting in the western United States at the USDA NRCS National Water and Climate Center. PRMS is a component based modular precipitation-runoff model developed to evaluate the impacts of various combinations of precipitation, climate, and land use on streamflow and general basin hydrology. The new OMS 3.0 PRMS model source code is more concise and flexible as a result of using the new framework’s annotation based approach. The fully annotated components are now providing information directly for (i) model assembly and building, (ii) dataflow analysis for implicit multithreading, (iii) automated and comprehensive model documentation of component dependencies, physical data properties, (iv) automated model and component testing, and (v) automated audit-traceability to account for all model resources leading to a particular simulation result. Experience to date has demonstrated the multi-purpose value of using annotations. Annotations are also a feasible and practical method to enable interoperability among models and modeling frameworks. As a prototype example, model code annotations were used to generate binding and mediation code to allow the use of OMS 3.0 model components within the OpenMI context.
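
    OMS 3.0 itself is Java-based, but the annotation idea is easy to convey in spirit with Python decorators. The sketch below is an illustrative analogy only; the names are invented and do not reflect the OMS API:

        def component(description):
            """Class decorator attaching framework-readable metadata,
            playing the role of a Java annotation."""
            def wrap(cls):
                cls._meta = {"description": description}
                return cls
            return wrap

        @component("Toy monthly water balance")
        class WaterBalance:
            inputs = {"precip": "mm", "pet": "mm"}   # declared, not wired via API calls
            outputs = {"runoff": "mm"}

            def execute(self, precip, pet):
                return {"runoff": max(0.0, precip - pet)}   # toy process logic

        # A framework can discover and wire the component purely by introspection,
        # leaving the model code free of framework-specific API calls:
        print(WaterBalance._meta, WaterBalance.inputs, WaterBalance.outputs)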

  11. Investing in innovation: trade-offs in the costs and cost-efficiency of school feeding using community-based kitchens in Bangladesh.

    PubMed

    Gelli, Aulo; Suwa, Yuko

    2014-09-01

    School feeding programs have been a key response to the recent food and economic crises and function to some degree in nearly every country in the world. However, school feeding programs are complex and exhibit different, context-specific models or configurations. To examine the trade-offs, including the costs and cost-efficiency, of an innovative cluster kitchen implementation model in Bangladesh using a standardized framework. A supply chain framework based on international standards was used to provide benchmarks for meaningful comparisons across models. Implementation processes specific to the program in Bangladesh were mapped against this reference to provide a basis for standardized performance measures. Qualitative and quantitative data on key metrics were collected retrospectively using semistructured questionnaires following an ingredients approach, including both financial and economic costs. Costs were standardized to a 200-feeding-day year and 700 kcal daily. The cluster kitchen model had similarities with the semidecentralized model and outsourced models in the literature, the main differences involving implementation scale, scale of purchasing volumes, and frequency of purchasing. Two important features stand out in terms of implementation: the nutritional quality of meals and the level of community involvement. The standardized full cost per child per year was US$110. Despite the nutritious content of the meals, the overall cost-efficiency in cost per nutrient output was lower than the benchmark for centralized programs, due mainly to support and start-up costs. Cluster kitchens provide an example of an innovative implementation model, combining an emphasis on quality meal delivery with strong community engagement. However, the standardized costs-per child were above the average benchmarks for both low-and middle-income countries. In contrast to the existing benchmark data from mature, centralized models, the main cost drivers of the program were associated with support and start-up activities. Further research is required to better understand changes in cost drivers as programs mature.

  12. Meta-Modeling-Based Groundwater Remediation Optimization under Flexibility in Environmental Standard.

    PubMed

    He, Li; Xu, Zongda; Fan, Xing; Li, Jing; Lu, Hongwei

    2017-05-01

    This study develops a meta-modeling-based mathematical programming approach with flexibility in environmental standards. It integrates numerical simulation, meta-modeling analysis, and fuzzy programming within a general framework. A set of meta-models relating remediation strategies to remediation performance substantially reduces the computational effort of the simulation and optimization process. To prevent over-optimistic or over-pessimistic optimization strategies, the satisfaction level attained under a flexible standard indicates the degree to which the environmental standard is met. The proposed approach is applied to a naphthalene-contaminated site in China. Results show that a longer remediation period corresponds to a lower total pumping rate, and a stringent risk standard implies a high total pumping rate. The wells located near, or down-gradient of, the contaminant sources show the greatest efficiency among all remediation schemes.

  13. Measuring Up: Are You Looking for a Way to Assess Your Teaching Practice? These Three Frameworks Can Help You Evaluate Your Performance

    ERIC Educational Resources Information Center

    Carrero, Jacqueline

    2015-01-01

    The author discusses how teachers can measure their effectiveness and discusses three frameworks they can use to do so: the National Board for Professional Teaching Standards's Core Propositions, Robert J. Marzano's Teacher Evaluation Model, and John Hattie's eight mind frames. She also gives examples from her own experience to show how she…

  14. Changing pattern in the basal ganglia: motor switching under reduced dopaminergic drive

    PubMed Central

    Fiore, Vincenzo G.; Rigoli, Francesco; Stenner, Max-Philipp; Zaehle, Tino; Hirth, Frank; Heinze, Hans-Jochen; Dolan, Raymond J.

    2016-01-01

    Action selection in the basal ganglia is often described within the framework of a standard model, associating low dopaminergic drive with motor suppression. Whilst powerful, this model does not explain several clinical and experimental findings, including varying therapeutic efficacy across movement disorders. We tested the predictions of this model in patients with Parkinson’s disease, on and off subthalamic deep brain stimulation (DBS), focussing on adaptive sensory-motor responses to a changing environment and maintenance of an action until it is no longer suitable. Surprisingly, we observed prolonged perseverance under on-stimulation, and high inter-individual variability in the motor selections performed when comparing the two conditions. To account for these data, we revised the standard model, exploring its space of parameters and associated motor functions, and found that, depending on the effective connectivity between the external and internal parts of the globus pallidus and the saliency of the sensory input, a low dopaminergic drive can result in increased, dysfunctional motor switching, besides motor suppression. This new framework provides insight into the biophysical mechanisms underlying DBS, allowing a description in terms of alteration of the signal-to-baseline ratio in the indirect pathway, which accounts for known electrophysiological data better than the standard model. PMID:27004463

  15. Metadata Design in the New PDS4 Standards - Something for Everybody

    NASA Astrophysics Data System (ADS)

    Raugh, Anne C.; Hughes, John S.

    2015-11-01

    The Planetary Data System (PDS) archives, supports, and distributes data of diverse targets, from diverse sources, to diverse users. One of the core problems addressed by the PDS4 data standard redesign was that of metadata - how to accommodate the increasingly sophisticated demands of search interfaces, analytical software, and observational documentation into label standards without imposing limits and constraints that would impinge on the quality or quantity of metadata that any particular observer or team could supply. And yet, as an archive, PDS must have detailed documentation for the metadata in the labels it supports, or the institutional knowledge encoded into those attributes will be lost - putting the data at risk. The PDS4 metadata solution is based on a three-step approach. First, it is built on two key ISO standards: ISO 11179 "Information Technology - Metadata Registries", which provides a common framework and vocabulary for defining metadata attributes; and ISO 14721 "Space Data and Information Transfer Systems - Open Archival Information System (OAIS) Reference Model", which provides the framework for the information architecture that enforces the object-oriented paradigm for metadata modeling. Second, PDS has defined a hierarchical system that allows it to divide its metadata universe into namespaces ("data dictionaries", conceptually), and more importantly to delegate stewardship for a single namespace to a local authority. This means that a mission can develop its own data model with a high degree of autonomy and effectively extend the PDS model to accommodate its own metadata needs within the common ISO 11179 framework. Finally, within a single namespace - even the core PDS namespace - existing metadata structures can be extended and new structures added to the model as new needs are identified. This poster illustrates the PDS4 approach to metadata management and highlights the expected return on the development investment for PDS, users and data preparers.

  16. The CRISP theory of hippocampal function in episodic memory

    PubMed Central

    Cheng, Sen

    2013-01-01

    Over the past four decades, a “standard framework” has emerged to explain the neural mechanisms of episodic memory storage. This framework has been instrumental in driving hippocampal research forward and now dominates the design and interpretation of experimental and theoretical studies. It postulates that cortical inputs drive plasticity in the recurrent cornu ammonis 3 (CA3) synapses to rapidly imprint memories as attractor states in CA3. Here we review a range of experimental studies and argue that the evidence against the standard framework is mounting, notwithstanding the considerable evidence in its support. We propose CRISP as an alternative theory to the standard framework. CRISP is based on Context Reset by dentate gyrus (DG), Intrinsic Sequences in CA3, and Pattern completion in cornu ammonis 1 (CA1). Compared to previous models, CRISP uses a radically different mechanism for storing episodic memories in the hippocampus. Neural sequences are intrinsic to CA3, and inputs are mapped onto these intrinsic sequences through synaptic plasticity in the feedforward projections of the hippocampus. Hence, CRISP does not require plasticity in the recurrent CA3 synapses during the storage process. As in other theories, DG and CA1 play supporting roles; however, their functions in CRISP have distinct implications. For instance, CA1 performs pattern completion in the absence of CA3, and DG contributes to episodic memory retrieval, increasing the speed, precision, and robustness of retrieval. We propose the conceptual theory, discuss its implications for experimental results, and suggest testable predictions. It appears that CRISP not only accounts for those experimental results that are consistent with the standard framework, but also for results that are at odds with the standard framework. We therefore suggest that CRISP is a viable, and perhaps superior, theory for hippocampal function in episodic memory. PMID:23653597

  17. Multipartite interacting scalar dark matter in the light of updated LUX data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhattacharya, Subhaditya; Ghosh, Purusottam; Poulose, Poulose

    2017-04-01

    We explore constraints on a multipartite dark matter (DM) framework composed of singlet scalar DM interacting with the Standard Model (SM) through a Higgs portal coupling. We compute relic density and direct search constraints, including the updated LUX bound, for a two-component scenario with non-zero interactions between the two DM components in a Z_2 × Z_2' framework, in comparison with one having an O(2) symmetry. We point out the availability of a sizable region of parameter space of such a multipartite model with DM-DM interactions.
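
    The interaction structure described here is the standard Higgs-portal form. A hedged sketch of the scalar potential for two real singlets S_1, S_2 under Z_2 × Z_2', in generic couplings (the paper's exact parameterization may differ):

        V \supset \tfrac{1}{2} m_1^2 S_1^2 + \tfrac{1}{2} m_2^2 S_2^2
                  + \lambda_1 S_1^2 (H^\dagger H) + \lambda_2 S_2^2 (H^\dagger H)
                  + \lambda_{12} S_1^2 S_2^2,

    where the separate Z_2 factors (S_1 → -S_1 and S_2 → -S_2 independently) stabilize both components, and \lambda_{12} is the DM-DM interaction whose effect on relic density and direct-detection limits the authors explore.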

  18. Architectural approaches for HL7-based health information systems implementation.

    PubMed

    López, D M; Blobel, B

    2010-01-01

    Information systems integration is hard, especially when semantic and business process interoperability requirements need to be met. To succeed, a unified methodology, approaching different aspects of systems architecture such as business, information, computational, engineering and technology viewpoints, has to be considered. The paper contributes with an analysis and demonstration on how the HL7 standard set can support health information systems integration. Based on the Health Information Systems Development Framework (HIS-DF), common architectural models for HIS integration are analyzed. The framework is a standard-based, consistent, comprehensive, customizable, scalable methodology that supports the design of semantically interoperable health information systems and components. Three main architectural models for system integration are analyzed: the point to point interface, the messages server and the mediator models. Point to point interface and messages server models are completely supported by traditional HL7 version 2 and version 3 messaging. The HL7 v3 standard specification, combined with service-oriented, model-driven approaches provided by HIS-DF, makes the mediator model possible. The different integration scenarios are illustrated by describing a proof-of-concept implementation of an integrated public health surveillance system based on Enterprise Java Beans technology. Selecting the appropriate integration architecture is a fundamental issue of any software development project. HIS-DF provides a unique methodological approach guiding the development of healthcare integration projects. The mediator model - offered by the HIS-DF and supported in HL7 v3 artifacts - is the more promising one promoting the development of open, reusable, flexible, semantically interoperable, platform-independent, service-oriented and standard-based health information systems.

  19. A log-normal distribution model for the molecular weight of aquatic fulvic acids

    USGS Publications Warehouse

    Cabaniss, S.E.; Zhou, Q.; Maurice, P.A.; Chin, Y.-P.; Aiken, G.R.

    2000-01-01

    The molecular weight of humic substances influences their proton and metal binding, organic pollutant partitioning, adsorption onto minerals and activated carbon, and behavior during water treatment. We propose a log-normal model for the molecular weight distribution in aquatic fulvic acids to provide a conceptual framework for studying these size effects. The normal curve mean and standard deviation are readily calculated from measured M_n and M_w, and vary from 2.7 to 3 for the means and from 0.28 to 0.37 for the standard deviations for typical aquatic fulvic acids. The model is consistent with several types of molecular weight data, including the shapes of high-pressure size-exclusion chromatography (HP-SEC) peaks. Applications of the model to electrostatic interactions, pollutant solubilization, and adsorption are explored in illustrative calculations.
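
    The stated connection between the normal-curve parameters and (M_n, M_w) follows from standard log-normal moment identities. Assuming a log-normal number distribution of molecular weights with \ln M \sim N(\mu, \sigma^2):

        M_n = e^{\mu + \sigma^2/2}, \qquad M_w = e^{\mu + 3\sigma^2/2}
        \quad\Rightarrow\quad \sigma^2 = \ln(M_w / M_n), \quad \mu = \ln M_n - \sigma^2/2.

    (The means of 2.7 to 3 quoted in the abstract are evidently on a log10 scale, i.e., \mu / \ln 10.)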

  20. A screening-level modeling approach to estimate nitrogen ...

    EPA Pesticide Factsheets

    This paper presents a screening-level modeling approach that can be used to rapidly estimate nutrient loading and assess the risk that surface waters exceed numerical nutrient standards, leading to potential classification as impaired for designated use. It can also be used to explore best management practice (BMP) implementation to reduce loading. The modeling framework uses a hybrid statistical and process-based approach to estimate sources of pollutants and their transport and decay in the terrestrial and aquatic parts of watersheds. The framework is developed in the ArcGIS environment and is based on the total maximum daily load (TMDL) balance model. Nitrogen (N) is currently addressed in the framework, referred to as WQM-TMDL-N. Loading for each catchment includes non-point sources (NPS) and point sources (PS). NPS loading is estimated using export coefficient or event mean concentration methods, depending on the temporal scale, i.e., annual or daily. Loading from atmospheric deposition is also included. The probability that a nutrient load exceeds a target load is evaluated using probabilistic risk assessment, including the uncertainty associated with the export coefficients of various land uses. The computed risk data can be visualized as spatial maps which show the load exceedance probability for all stream segments. In an application of this modeling approach to the Tippecanoe River watershed in Indiana, USA, total nitrogen (TN) loading and risk of standard exceedance…
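
    The export-coefficient loading calculation used at annual scales has a simple closed form. A hedged sketch in generic notation (not necessarily WQM-TMDL-N's exact terms):

        L_{TN} = \sum_u e_u A_u + L_{PS} + L_{atm},

    where e_u is the export coefficient of land use u (e.g., kg ha^{-1} yr^{-1}), A_u is its area within the catchment, and L_{PS} and L_{atm} are the point-source and atmospheric-deposition loads; the exceedance risk is then P(L_{TN} > L_{target}), evaluated by propagating the uncertainty in the e_u.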

  1. Archetype Model-Driven Development Framework for EHR Web System

    PubMed Central

    Kimura, Eizen; Ishihara, Ken

    2013-01-01

    Objectives This article describes the Web application framework for Electronic Health Records (EHRs) we have developed to reduce construction costs for EHR systems. Methods The openEHR project has developed a clinical model-driven architecture for future-proof interoperable EHR systems. This project provides the specifications to standardize clinical domain model implementations, upon which the ISO/CEN 13606 standards are based. The reference implementation has been formally described in Eiffel, and C# and Java implementations have also been developed as references. Although scripting languages have become more popular in recent years because of their efficiency and faster development cycles, they had not been used in openEHR implementations. From 2007, we have used the Ruby language and Ruby on Rails (RoR) as an agile development platform to implement EHR systems in conformity with the openEHR specifications. Results We implemented almost all of the specifications, the Archetype Definition Language parser, and a RoR scaffold generator from archetypes. Although some problems have emerged, most of them have been resolved. Conclusions We have provided an agile EHR Web framework that can build up Web systems from archetype models using RoR. The feasibility of the archetype model to provide semantic interoperability of EHRs has been demonstrated, and we have verified that it is suitable for the construction of EHR systems. PMID:24523991

  2. QuantumOptics.jl: A Julia framework for simulating open quantum systems

    NASA Astrophysics Data System (ADS)

    Krämer, Sebastian; Plankensteiner, David; Ostermann, Laurin; Ritsch, Helmut

    2018-06-01

    We present an open source computational framework geared towards the efficient numerical investigation of open quantum systems written in the Julia programming language. Built exclusively in Julia and based on standard quantum optics notation, the toolbox offers speed comparable to low-level statically typed languages, without compromising on the accessibility and code readability found in dynamic languages. After introducing the framework, we highlight its features and showcase implementations of generic quantum models. Finally, we compare its usability and performance to two well-established and widely used numerical quantum libraries.

  3. Darkflation-One scalar to rule them all?

    NASA Astrophysics Data System (ADS)

    Lalak, Zygmunt; Nakonieczny, Łukasz

    2017-03-01

    We investigate the problem of explaining both inflationary and dark matter physics in the framework of a minimal extension of the Standard Model. To this end, the Standard Model completed by a real scalar singlet playing the role of a dark matter candidate has been considered. We assumed both the dark matter field and the Higgs doublet to be nonminimally coupled to gravity. Using quantum field theory in curved spacetime, we derived an effective action for the inflationary period and analyzed its consequences. In this approach, after integrating out both the dark matter and Standard Model sectors, we obtained an effective action expressed purely in terms of the gravitational field. We paid special attention to determining, by explicit calculation, the form of the coefficients controlling the higher-order-in-curvature gravitational terms. Their connection to the Standard Model coupling constants has been discussed.

  4. A Web GIS Enabled Comprehensive Hydrologic Information System for Indian Water Resources Systems

    NASA Astrophysics Data System (ADS)

    Goyal, A.; Tyagi, H.; Gosain, A. K.; Khosa, R.

    2017-12-01

    Hydrological systems across the globe are becoming increasingly water stressed with each passing season due to climate variability and snowballing water demand. Hence, to safeguard food, livelihood and economic security, it becomes imperative to employ scientific studies for holistic management of an indispensable resource like water. However, hydrological studies of any scale and purpose rely heavily on various spatio-temporal datasets, which are not only difficult to discover and access but also hard to use and manage. Besides, owing to the diversity of water sector agencies and the dearth of standard operating procedures, seamless information exchange is challenging for collaborators. Extensive research is being done worldwide to address these issues, but regrettably not much has been done in developing countries like India. Therefore, the current study endeavours to develop a Hydrological Information System framework in a Web-GIS environment for empowering Indian water resources systems. The study attempts to harmonize the standards for metadata, terminology, symbology, versioning and archiving for effective generation, processing, dissemination and mining of data required for hydrological studies. Furthermore, modelers with modest computing resources at their disposal can consume this standardized data in high-performance simulation modelling using cloud computing within the developed Web-GIS framework. They can also integrate the inputs and outputs of the different numerical models available on the platform for comprehensive analysis of the chosen hydrological system. Thus, the developed portal is an all-in-one framework that can facilitate decision makers, industry professionals and researchers in efficient water management.

  5. 78 FR 34795 - Formaldehyde; Third-Party Certification Framework for the Formaldehyde Standards for Composite...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-10

    Formaldehyde; Third-Party Certification Framework for the Formaldehyde Standards for Composite Wood Products; Formaldehyde Emissions Standards for Composite Wood Products; Proposed Rules. Federal Register, Vol. 78 (only fragments of the notice header were recovered). AGENCY: Environmental…

  6. Non-standard models and the sociology of cosmology

    NASA Astrophysics Data System (ADS)

    López-Corredoira, Martín

    2014-05-01

    I review some theoretical ideas in cosmology different from the standard "Big Bang": the quasi-steady state model, the plasma cosmology model, non-cosmological redshifts, alternatives to non-baryonic dark matter and/or dark energy, and others. Cosmologists do not usually work within the framework of alternative cosmologies because they feel that these are not at present as competitive as the standard model. Certainly, they are not so developed, and they are not so developed because cosmologists do not work on them. It is a vicious circle. The fact that most cosmologists do not pay them any attention and only dedicate their research time to the standard model is to a great extent due to a sociological phenomenon (the "snowball effect" or "groupthink"). We might well wonder whether cosmology, our knowledge of the Universe as a whole, is a science like other fields of physics or a predominant ideology.

  7. "Models Of" versus "Models For": Toward an Agent-Based Conception of Modeling in the Science Classroom

    ERIC Educational Resources Information Center

    Gouvea, Julia; Passmore, Cynthia

    2017-01-01

    The inclusion of the practice of "developing and using models" in the "Framework for K-12 Science Education" and in the "Next Generation Science Standards" provides an opportunity for educators to examine the role this practice plays in science and how it can be leveraged in a science classroom. Drawing on conceptions…

  8. The Parallel System for Integrating Impact Models and Sectors (pSIMS)

    NASA Technical Reports Server (NTRS)

    Elliott, Joshua; Kelly, David; Chryssanthacopoulos, James; Glotter, Michael; Jhunjhnuwala, Kanika; Best, Neil; Wilde, Michael; Foster, Ian

    2014-01-01

    We present a framework for massively parallel climate impact simulations: the parallel System for Integrating Impact Models and Sectors (pSIMS). This framework comprises a) tools for ingesting and converting large amounts of data to a versatile datatype based on a common geospatial grid; b) tools for translating this datatype into custom formats for site-based models; c) a scalable parallel framework for performing large ensemble simulations, using any one of a number of different impacts models, on clusters, supercomputers, distributed grids, or clouds; d) tools and data standards for reformatting outputs to common datatypes for analysis and visualization; and e) methodologies for aggregating these datatypes to arbitrary spatial scales such as administrative and environmental demarcations. By automating many time-consuming and error-prone aspects of large-scale climate impacts studies, pSIMS accelerates computational research, encourages model intercomparison, and enhances reproducibility of simulation results. We present the pSIMS design and use example assessments to demonstrate its multi-model, multi-scale, and multi-sector versatility.
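
    The aggregation step (e) is straightforward to picture in code. The following is a minimal, hypothetical sketch (not pSIMS source): gridded outputs on the common geospatial grid are averaged over arbitrary regions defined by an integer mask aligned to that grid.

    ```python
    # A minimal sketch of pSIMS-style aggregation (step e), not the project's
    # actual code: gridded model output is averaged over arbitrary regions
    # defined by an integer mask aligned to the common geospatial grid.
    import numpy as np

    def aggregate_to_regions(output_grid: np.ndarray,
                             region_mask: np.ndarray) -> dict:
        """Mean of `output_grid` over each region id in `region_mask`.

        Both arrays share the common (lat, lon) grid; NaNs mark cells
        with no simulation output.
        """
        regions = {}
        for region_id in np.unique(region_mask):
            cells = output_grid[region_mask == region_id]
            regions[int(region_id)] = float(np.nanmean(cells))
        return regions

    # Hypothetical example: a 4x4 yield grid split into two regions.
    yield_grid = np.random.default_rng(0).uniform(1.0, 5.0, (4, 4))
    mask = np.zeros((4, 4), dtype=int)
    mask[:, 2:] = 1
    print(aggregate_to_regions(yield_grid, mask))
    ```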

  9. Data Standardization for Carbon Cycle Modeling: Lessons Learned

    NASA Astrophysics Data System (ADS)

    Wei, Y.; Liu, S.; Cook, R. B.; Post, W. M.; Huntzinger, D. N.; Schwalm, C.; Schaefer, K. M.; Jacobson, A. R.; Michalak, A. M.

    2012-12-01

Terrestrial biogeochemistry modeling is a crucial component of carbon cycle research and provides unique capabilities to understand terrestrial ecosystems. The Multi-scale Synthesis and Terrestrial Model Intercomparison Project (MsTMIP) aims to identify key differences in model formulation that drive observed differences in model predictions of biospheric carbon exchange. To do so, the MsTMIP framework provides standardized prescribed environmental driver data and a standard model protocol to facilitate comparisons of modeling results from nearly 30 teams. Model performance is then evaluated against a variety of carbon-cycle related observations (remote sensing, atmospheric, and flux tower-based observations) using quantitative performance measures and metrics in an integrated evaluation framework. As part of this effort, we have harmonized highly diverse and heterogeneous environmental driver data, model outputs, and observational benchmark data sets to facilitate use and analysis by the MsTMIP team. In this presentation, we will describe the lessons learned from this data-intensive carbon cycle research. The data harmonization activity itself can be made more efficient with proper tools, version control, workflow management, and collaboration within the whole team. The adoption of on-demand and interoperable protocols (e.g. OPeNDAP and Open Geospatial Consortium standards) makes data visualization and distribution more flexible. Users can customize and download data for a specific spatial extent and temporal period, and at different resolutions. The effort to properly organize data in an open and standard format (e.g. Climate & Forecast compatible netCDF) allows the data to be analysed by a dispersed set of researchers more efficiently, and maximizes the longevity and utilization of the data. The lessons learned from this specific experience can benefit efforts by the broader community to leverage diverse data resources more efficiently in scientific research.
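
    The subsetting workflow described here maps naturally onto OPeNDAP access from analysis code. Below is a minimal, illustrative sketch (the URL and variable name are hypothetical placeholders, not actual MsTMIP endpoints) of selecting a spatial extent and temporal period from a CF-compliant netCDF dataset with xarray, which reads OPeNDAP URLs lazily.

    ```python
    # Illustrative only: subsetting a CF-compliant dataset over OPeNDAP with
    # xarray. The URL and variable name are hypothetical placeholders, not
    # actual MsTMIP endpoints.
    import xarray as xr

    URL = "https://example.org/opendap/mstmip/driver_data.nc"  # hypothetical

    ds = xr.open_dataset(URL)  # lazy; only the request below moves data
    subset = ds["air_temperature"].sel(
        lat=slice(30, 50), lon=slice(-110, -90),      # spatial extent
        time=slice("2000-01-01", "2005-12-31"),       # temporal period
    )
    monthly = subset.resample(time="1MS").mean()      # coarser resolution
    monthly.to_netcdf("subset.nc")                    # CF-style output
    ```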

  10. A new framework for evaluating the impacts of drought on net primary productivity of grassland.

    PubMed

    Lei, Tianjie; Wu, Jianjun; Li, Xiaohan; Geng, Guangpo; Shao, Changliang; Zhou, Hongkui; Wang, Qianfeng; Liu, Leizhen

    2015-12-01

This paper presents a valuable framework for evaluating the impacts of drought (a single factor) on grassland ecosystems. The framework defines the quantitative magnitude of drought impact at which ecosystems may experience unacceptable short-term and long-term effects relative to a reference standard. Net primary productivity (NPP) was selected as the response indicator of drought to assess the quantitative impact of drought on Inner Mongolia grassland based on the Standardized Precipitation Index (SPI) and the BIOME-BGC model. The framework consists of six main steps: 1) clearly defining drought scenarios, such as moderate, severe and extreme drought; 2) selecting an appropriate indicator of drought impact; 3) selecting an appropriate ecosystem model, verifying its capabilities, calibrating its bias and assessing its uncertainty; 4) assigning a level of unacceptable drought impact on the indicator; 5) determining the response of the indicator to drought and to the normal weather state under global change; and 6) investigating the unacceptable impact of drought at different spatial scales. We found that NPP losses assessed using the new framework were more sensitive to drought and had higher precision than those from the long-term average method. Moreover, the total and average losses of NPP differed among grassland types during the drought years from 1961 to 2009. NPP loss increased significantly along a gradient of increasing drought levels. Meanwhile, NPP loss under the same drought level varied among grassland types. The operational framework is particularly suited to integrated assessment of the effects of different drought events and long-term droughts at multiple spatial scales, and it provides essential insights for scientists and societies that must develop ecosystem coping strategies for such events. Copyright © 2015 Elsevier B.V. All rights reserved.
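
    As a concrete illustration of the SPI calculation that underpins the drought scenarios in step 1, here is a minimal sketch under common simplifying assumptions (gamma-distributed precipitation accumulations, no zero-precipitation correction); the data are synthetic.

    ```python
    # A minimal SPI sketch under common simplifying assumptions (gamma-
    # distributed precipitation totals, no zero-inflation handling): fit a
    # gamma distribution to an accumulation series, then map its CDF onto
    # the standard normal to get SPI values.
    import numpy as np
    from scipy import stats

    def spi(precip: np.ndarray) -> np.ndarray:
        """Standardized Precipitation Index for one accumulation window."""
        shape, loc, scale = stats.gamma.fit(precip, floc=0)  # fix location at 0
        cdf = stats.gamma.cdf(precip, shape, loc=loc, scale=scale)
        return stats.norm.ppf(cdf)  # SPI <= -2 roughly marks extreme drought

    rng = np.random.default_rng(1)
    monthly_precip = rng.gamma(shape=2.0, scale=30.0, size=600)  # synthetic
    print(spi(monthly_precip)[:5])
    ```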

  11. The CMMI Product Suite and International Standards

    DTIC Science & Technology

    2006-07-01

standards: “2.3 Reference Documents 2.3.1 Applicable ISO/IEC documents, including ISO/IEC 12207 and ISO/IEC 15504.” “3.1 Development User Requirements...related international standards such as ISO 9001:2000, 12207, 15288 © 2006 by Carnegie Mellon University Page 12 Key Supplements Needed...the Measurement Framework in ISO/IEC 15504; and • the Process Reference Model included in ISO/IEC 12207. A possible approach has been developed for

  12. OpenMI: the essential concepts and their implications for legacy software

    NASA Astrophysics Data System (ADS)

    Gregersen, J. B.; Gijsbers, P. J. A.; Westen, S. J. P.; Blind, M.

    2005-08-01

Information & Communication Technology (ICT) tools such as computational models are very helpful in designing river basin management plans (RBMPs). However, in the scientific world there is consensus that a single integrated modelling system to support, e.g., the implementation of the Water Framework Directive cannot be developed, and that integrated systems need to be very much tailored to the local situation. As a consequence there is an urgent need to increase the flexibility of modelling systems, such that dedicated model systems can be developed from available building blocks. The HarmonIT project aims at precisely that. Its objective is to develop and implement a standard interface for modelling components and other relevant tools: the Open Modelling Interface (OpenMI) standard. The OpenMI standard has been completed and documented. It relies entirely on the "pull" principle, where data are pulled by one model from the previous model in the chain. This paper gives an overview of the OpenMI standard, explains the foremost concepts and the rationale behind them.
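
    The "pull" principle is easy to see in miniature. The sketch below is a toy illustration only (the class and method names are invented, not the actual OpenMI interfaces): a downstream component obtains its input by requesting values from its upstream provider at the moment it needs them.

    ```python
    # Not the actual OpenMI API: a toy illustration of its "pull" principle,
    # where a downstream component requests values and the provider computes
    # (or forwards) them on demand.
    class RainfallComponent:
        def get_values(self, time: float) -> float:
            return 2.0 if 10.0 <= time < 20.0 else 0.0   # mm/h, toy forcing

    class RunoffComponent:
        def __init__(self, provider: RainfallComponent, coeff: float = 0.4):
            self.provider = provider
            self.coeff = coeff

        def get_values(self, time: float) -> float:
            rain = self.provider.get_values(time)  # pull from upstream model
            return self.coeff * rain               # toy rainfall-runoff law

    chain = RunoffComponent(RainfallComponent())
    print([chain.get_values(t) for t in (5.0, 15.0, 25.0)])
    ```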

  13. A Smart Modeling Framework for Integrating BMI-enabled Models as Web Services

    NASA Astrophysics Data System (ADS)

    Jiang, P.; Elag, M.; Kumar, P.; Peckham, S. D.; Liu, R.; Marini, L.; Hsu, L.

    2015-12-01

Service-oriented computing provides an opportunity to couple web service models using semantic web technology. Through this approach, models that are exposed as web services can be conserved in their own local environment, thus making it easy for modelers to maintain and update them. In integrated modeling, the service-oriented loose-coupling approach requires (1) a set of models as web services, (2) model metadata describing the external features of a model (e.g., variable names, units, computational grid, etc.) and (3) a model integration framework. We present an architecture for coupling web service models that are self-describing, utilizing a smart modeling framework. We expose models that are encapsulated with CSDMS (Community Surface Dynamics Modeling System) Basic Model Interfaces (BMI) as web services. The BMI-enabled models are self-describing, exposing their metadata through BMI functions. Once a BMI-enabled model is exposed as a service, a client can initialize, execute and retrieve the meta-information of the model by calling its BMI functions over the web. Furthermore, a revised version of EMELI (Peckham, 2014), an Experimental Modeling Environment for Linking and Interoperability, is chosen as the framework for coupling BMI-enabled web service models. EMELI allows users to combine a set of component models into a complex model by standardizing model interfaces using BMI, and provides a set of utilities that smooth the integration process (e.g., temporal interpolation). We modify the original EMELI so that the revised modeling framework is able to initialize, execute and find the dependencies of BMI-enabled web service models. Using the revised EMELI, an example will be presented on integrating a set of TopoFlow model components that are BMI-enabled and exposed as web services. Reference: Peckham, S.D. (2014) EMELI 1.0: An experimental smart modeling framework for automatic coupling of self-describing models, Proceedings of HIC 2014, 11th International Conf. on Hydroinformatics, New York, NY.
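
    To make the self-describing pattern concrete, here is a simplified sketch of a BMI-style component and a generic caller. The function names follow the published CSDMS BMI (initialize, update, finalize, get_output_var_names, get_value), but the component itself and its variable are invented for illustration.

    ```python
    # A simplified, hypothetical BMI-style component and caller. Function
    # names follow the published CSDMS BMI, but this is a sketch, not the
    # official interface definition.
    import numpy as np

    class HeatModelBMI:
        def initialize(self, config_file: str) -> None:
            self.temperature = np.zeros(10)
            self.time = 0.0

        def update(self) -> None:
            self.temperature += 0.1          # stand-in for real physics
            self.time += 1.0

        def finalize(self) -> None:
            pass

        def get_output_var_names(self) -> tuple:
            # CSDMS Standard Name, enabling semantic matching between models
            return ("land_surface__temperature",)

        def get_value(self, name: str) -> np.ndarray:
            assert name == "land_surface__temperature"
            return self.temperature.copy()

    # A caller (framework or web-service wrapper) needs no model-specific code:
    model = HeatModelBMI()
    model.initialize("config.cfg")
    for _ in range(3):
        model.update()
    print(model.get_output_var_names(),
          model.get_value("land_surface__temperature")[0])
    model.finalize()
    ```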

  14. Methods for reducing biases and errors in regional photochemical model outputs for use in emission reduction and exposure assessments

    EPA Science Inventory

    In the United States, regional-scale photochemical models are being used to design emission control strategies needed to meet the relevant National Ambient Air Quality Standards (NAAQS) within the framework of the attainment demonstration process. Previous studies have shown that...

  15. Improving Conceptual Understanding and Representation Skills through Excel-Based Modeling

    ERIC Educational Resources Information Center

    Malone, Kathy L.; Schunn, Christian D.; Schuchardt, Anita M.

    2018-01-01

The National Research Council framework for science education and the Next Generation Science Standards have created a need for additional research and development of curricula that are both technologically model-based and include engineering practices. This is especially the case for biology education. This paper describes a quasi-experimental…

  16. Modelling Method of Recursive Entity

    ERIC Educational Resources Information Center

    Amal, Rifai; Messoussi, Rochdi

    2012-01-01

    With the development of the Information and Communication Technologies, great masses of information are published in the Web. In order to reuse, to share and to organise them in distance formation and e-learning frameworks, several research projects have been achieved and various standards and modelling languages developed. In our previous…

  17. Search for the standard model Higgs boson in $$l\

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Dikai

    2013-01-01

Humans have always attempted to understand the mystery of Nature, and more recently physicists have established theories to describe the observed phenomena. The most recent theory is a gauge quantum field theory framework, called the Standard Model (SM), which proposes a model of elementary matter particles and interaction particles, the fundamental force carriers, in the most unified way. The Standard Model contains the internal symmetries of the unitary product group SU(3)_C × SU(2)_L × U(1)_Y and describes the electromagnetic, weak and strong interactions; the model also describes how quarks interact with each other through all of these three interactions, how leptons interact with each other through electromagnetic and weak forces, and how force carriers mediate the fundamental interactions.

  18. Growth Modeling with Nonignorable Dropout: Alternative Analyses of the STAR*D Antidepressant Trial

    ERIC Educational Resources Information Center

    Muthen, Bengt; Asparouhov, Tihomir; Hunter, Aimee M.; Leuchter, Andrew F.

    2011-01-01

    This article uses a general latent variable framework to study a series of models for nonignorable missingness due to dropout. Nonignorable missing data modeling acknowledges that missingness may depend not only on covariates and observed outcomes at previous time points as with the standard missing at random assumption, but also on latent…

  19. Southern Forest Resource Assessment Using the Subregional Timber Supply (SRTS) Model

    Treesearch

    Robert C. Abt; Frederick W. Cubbage; Gerardo Pacheco

    2000-01-01

    Most timber supply analyses are focused on broad regions. This paper describes a modeling system that uses a standard empirical framework applied to subregional inventory data in the South. Model results indicate significant within-region variation in supply responses across owners and regions. Projections of southern timber markets indicate that results are sensitive...

  20. Ecosystem Services and Climate Change Considerations for ...

    EPA Pesticide Factsheets

Freshwater habitats provide fishable, swimmable and drinkable resources and are a nexus of geophysical and biological processes. These processes in turn influence the persistence and sustainability of populations, communities and ecosystems. Climate change and landuse change encompass numerous stressors of potential exposure, including the introduction of toxic contaminants, invasive species, and disease in addition to physical drivers such as temperature and hydrologic regime. A systems approach that includes the scientific and technologic basis of assessing the health of ecosystems is needed to effectively protect human health and the environment. The Integrated Environmental Modeling Framework “iemWatersheds” has been developed as a consistent and coherent means of forecasting the cumulative impact of co-occurring stressors. The Framework consists of three facilitating technologies: Data for Environmental Modeling (D4EM) that automates the collection and standardization of input data; the Framework for Risk Assessment of Multimedia Environmental Systems (FRAMES) that manages the flow of information between linked models; and the Supercomputer for Model Uncertainty and Sensitivity Evaluation (SuperMUSE) that provides post-processing and analysis of model outputs, including uncertainty and sensitivity analysis. Five models are linked within the Framework to provide multimedia simulation capabilities for hydrology and water quality processes: the Soil Water Assessment Tool (SWAT)…

  1. A New Browser-based, Ontology-driven Tool for Generating Standardized, Deep Descriptions of Geoscience Models

    NASA Astrophysics Data System (ADS)

    Peckham, S. D.; Kelbert, A.; Rudan, S.; Stoica, M.

    2016-12-01

Standardized metadata for models is the key to reliable and greatly simplified coupling in model coupling frameworks like CSDMS (Community Surface Dynamics Modeling System). This model metadata also helps model users to understand the important details that underpin computational models and to compare the capabilities of different models. These details include simplifying assumptions on the physics, governing equations and the numerical methods used to solve them, discretization of space (the grid) and time (the time-stepping scheme), state variables (input or output), and model configuration parameters. This kind of metadata provides a "deep description" of a computational model that goes well beyond other types of metadata (e.g. author, purpose, scientific domain, programming language, digital rights, provenance, execution) and captures the science that underpins a model. While having this kind of standardized metadata for each model in a repository opens up a wide range of exciting possibilities, it is difficult to collect this information, and a carefully conceived "data model" or schema is needed to store it. Automated harvesting and scraping methods can provide some useful information, but they often result in metadata that is inaccurate or incomplete, which is not sufficient to enable the desired capabilities. In order to address this problem, we have developed a browser-based tool called the MCM Tool (Model Component Metadata), which runs on notebooks, tablets and smart phones. This tool was partially inspired by the TurboTax software, which greatly simplifies the necessary task of preparing tax documents. It allows a model developer or advanced user to provide a standardized, deep description of a computational geoscience model, including hydrologic models. Under the hood, the tool uses a new ontology for models built on the CSDMS Standard Names, expressed as a collection of RDF (Resource Description Framework) files. This ontology is based on core concepts such as variables, objects, quantities, operations, processes and assumptions. The purpose of this talk is to present details of the new ontology and to demonstrate the MCM Tool for several hydrologic models.
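
    As a flavor of the kind of RDF description such a tool might emit, the sketch below builds a small graph with rdflib. The namespace URI and property names are invented for illustration; only the variable name follows the CSDMS Standard Names pattern.

    ```python
    # A hedged sketch of model-component metadata as RDF; the namespace and
    # properties are hypothetical, not the MCM Tool's actual ontology.
    from rdflib import Graph, Literal, Namespace, RDF

    MDL = Namespace("http://example.org/model-ontology#")  # hypothetical

    g = Graph()
    g.bind("mdl", MDL)

    model = MDL["TopoFlowChannelFlow"]
    var = MDL["channel_water_x-section__volume_flow_rate"]  # CSDMS-style name

    g.add((model, RDF.type, MDL.ModelComponent))
    g.add((model, MDL.hasAssumption, Literal("kinematic wave approximation")))
    g.add((model, MDL.hasOutputVariable, var))
    g.add((var, MDL.hasUnits, Literal("m3 s-1")))

    print(g.serialize(format="turtle"))
    ```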

  2. Quark-lepton flavor democracy and the nonexistence of the fourth generation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cvetic, G.; Kim, C.S.

    1995-01-01

In the standard model with two Higgs doublets (type II), which, unlike the minimal standard model, shows a consistent trend toward a flavor gauge theory and its related flavor democracy in the quark and leptonic sectors as the energy of the probes increases, we impose mixed quark-lepton flavor democracy at a high "transition" energy and assume the usual seesaw mechanism, and consequently find that the existence of a fourth generation of fermions in this framework is practically ruled out.

  3. Increased flexibility for modeling telemetry and nest-survival data using the multistate framework

    USGS Publications Warehouse

    Devineau, Olivier; Kendall, William L.; Doherty, Paul F.; Shenk, Tanya M.; White, Gary C.; Lukacs, Paul M.; Burnham, Kenneth P.

    2014-01-01

    Although telemetry is one of the most common tools used in the study of wildlife, advances in the analysis of telemetry data have lagged compared to progress in the development of telemetry devices. We demonstrate how standard known-fate telemetry and related nest-survival data analysis models are special cases of the more general multistate framework. We present a short theoretical development, and 2 case examples regarding the American black duck and the mallard. We also present a more complex lynx data analysis. Although not necessary in all situations, the multistate framework provides additional flexibility to analyze telemetry data, which may help analysts and biologists better deal with the vagaries of real-world data collection.

  4. Real-Time Reliability Verification for UAV Flight Control System Supporting Airworthiness Certification.

    PubMed

    Xu, Haiyang; Wang, Ping

    2016-01-01

In order to verify the real-time reliability of an unmanned aerial vehicle (UAV) flight control system and comply with the airworthiness certification standard, we propose a model-based integration framework for the modeling and verification of timing properties. Building on the advantages of MARTE, this framework uses class diagrams to create the static model of the software system and state charts to create the dynamic model. According to the defined transformation rules, the MARTE model can be transformed into a formal integrated model, and the different parts of the model can then be verified using existing formal tools. For the real-time specifications of the software system, we also propose a generating algorithm for temporal logic formulas, which can automatically extract real-time properties from time-sensitive live sequence charts (TLSC). Finally, we modeled the simplified flight control system of a UAV to check its real-time properties. The results show that the framework can be used to create the system model, as well as to precisely analyze and verify the real-time reliability of the UAV flight control system.

  5. Real-Time Reliability Verification for UAV Flight Control System Supporting Airworthiness Certification

    PubMed Central

    Xu, Haiyang; Wang, Ping

    2016-01-01

In order to verify the real-time reliability of an unmanned aerial vehicle (UAV) flight control system and comply with the airworthiness certification standard, we propose a model-based integration framework for the modeling and verification of timing properties. Building on the advantages of MARTE, this framework uses class diagrams to create the static model of the software system and state charts to create the dynamic model. According to the defined transformation rules, the MARTE model can be transformed into a formal integrated model, and the different parts of the model can then be verified using existing formal tools. For the real-time specifications of the software system, we also propose a generating algorithm for temporal logic formulas, which can automatically extract real-time properties from time-sensitive live sequence charts (TLSC). Finally, we modeled the simplified flight control system of a UAV to check its real-time properties. The results show that the framework can be used to create the system model, as well as to precisely analyze and verify the real-time reliability of the UAV flight control system. PMID:27918594

  6. Common modeling system for digital simulation

    NASA Technical Reports Server (NTRS)

    Painter, Rick

    1994-01-01

The Joint Modeling and Simulation System (J-MASS) is a tri-service investigation into a common modeling framework for the development of digital models. The basis for the success of this framework is an X-window-based, open-systems architecture and an object-based/oriented, standard-interface approach to digital model construction, configuration, execution, and post-processing. For years, Department of Defense (DOD) agencies have produced various weapon systems/technologies and, typically, digital representations of those systems/technologies. These digital representations (models) have also been developed for other reasons, such as studies and analysis or Cost and Operational Effectiveness Analysis (COEA) tradeoffs. Unfortunately, there have been no Modeling and Simulation (M&S) standards, guidelines, or efforts toward commonality in DOD M&S. The typical scenario is that an organization hires a contractor to build hardware, and in doing so a digital model may be constructed. Until recently, this model was not even obtained by the organization. Even if it was procured, it was on a unique platform, in a unique language, with unique interfaces, with the result that unique maintenance was required. Additionally, the constructors of the model expended more effort in writing the 'infrastructure' of the model/simulation (e.g. user interface, database/database management system, data journaling/archiving, graphical presentations, environment characteristics, other components in the simulation, etc.) than in producing the model of the desired system. Other side effects include duplication of effort, varying assumptions, lack of credibility/validation, and decentralization in policy and execution. J-MASS provides the infrastructure, standards, toolset, and architecture to permit M&S developers and analysts to concentrate on their area of interest.

  7. The Shortened Raven Standard Progressive Matrices: Item Response Theory-Based Psychometric Analyses and Normative Data

    ERIC Educational Resources Information Center

    Van der Elst, Wim; Ouwehand, Carolijn; van Rijn, Peter; Lee, Nikki; Van Boxtel, Martin; Jolles, Jelle

    2013-01-01

    The purpose of the present study was to evaluate the psychometric properties of a shortened version of the Raven Standard Progressive Matrices (SPM) under an item response theory framework (the one- and two-parameter logistic models). The shortened Raven SPM was administered to N = 453 cognitively healthy adults aged between 24 and 83 years. The…

  8. From Standards to Frameworks for IL: How the ACRL Framework Addresses Critiques of the Standards

    ERIC Educational Resources Information Center

    Foasberg, Nancy M.

    2015-01-01

    The Association of College and Research Libraries (ACRL) Information Literacy Competency Standards for Higher Education, since their publication in 2000, have drawn criticism for ignoring the social and political aspects of information literacy. The ACRL Information Literacy Competency Standards Task Force responded with the Framework for…

  9. 76 FR 66040 - NIST Framework and Roadmap for Smart Grid Interoperability Standards, Release 2.0 (Draft...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-25

    ...-01] NIST Framework and Roadmap for Smart Grid Interoperability Standards, Release 2.0 (Draft... draft version of the NIST Framework and Roadmap for Smart Grid Interoperability Standards, Release 2.0... Roadmap for Smart Grid Interoperability Standards, Release 2.0 (Release 2.0) (Draft) for public review and...

  10. Fertilizer standards for controlling groundwater nitrate pollution from agriculture: El Salobral-Los Llanos case study, Spain

    NASA Astrophysics Data System (ADS)

    Peña-Haro, S.; Llopis-Albert, C.; Pulido-Velazquez, M.; Pulido-Velazquez, D.

    2010-10-01

Although the legislation on groundwater quality targets pollutant concentrations, the effects of measures for non-point source pollution control are often evaluated in terms of their emission reduction potential at the source, not their capacity to reduce the pollutant concentration in groundwater. This paper applies a hydro-economic modelling framework to the El Salobral-Los Llanos aquifer (Mancha Oriental, Spain), where nitrate concentrations higher than those allowed by the EU Water Framework Directive and Groundwater Directive are locally found due to intense fertilizer use on irrigated crops. The approach allows defining the economically optimal allocation of spatially variable fertilizer standards in agricultural basins using a hydro-economic model that links fertilizer application with groundwater nitrate concentration at different control sites while maximizing net economic benefits. The methodology incorporates results from agronomic simulations and from groundwater flow and transport modelling into a management framework that yields the fertilizer allocation that maximizes benefits in agriculture while meeting the environmental standards. The cost of applying fertilizer standards was estimated as the difference between the private net revenues from actual application and from the scenarios generated by applying the standards. Furthermore, the cost of applying fertilizer standards was compared with the cost of taxing nitrogen fertilizers so as to reduce fertilizer use to a level at which the nitrate concentration in groundwater stays below the limit. The results show the required reduction of fertilizer application in the different crop areas depending on their location with respect to the control sites, crop types and soil-plant conditions, groundwater flow and transport processes, the time horizon for meeting the standards, and the cost of implementing such a policy (as forgone benefits). According to the results, a high fertilizer price would be required to reduce nitrate concentrations in groundwater below the standard of 50 mg/l. In this particular case, it is more cost-efficient to apply standards to fertilizer use than taxes, although the instrument of fertilizer standards is more difficult to implement and control.
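
    The optimal-allocation problem has the shape of a small linear program: maximize net revenue from fertilizer use subject to simulated nitrate responses at the control sites staying below the 50 mg/l standard. The sketch below is a toy with made-up coefficients, not the paper's calibrated model:

    ```python
    # An illustrative hydro-economic toy, not the paper's calibrated model:
    # choose fertilizer rates x_j for three crop areas to maximize net revenue
    # subject to simulated nitrate responses at two control sites staying
    # below the 50 mg/l standard. All coefficients are invented.
    import numpy as np
    from scipy.optimize import linprog

    benefit = np.array([1.8, 2.2, 1.5])       # revenue per kg N, per area
    response = np.array([[0.12, 0.05, 0.02],  # mg/l at site 1 per kg N in area j
                         [0.03, 0.08, 0.10]]) # mg/l at site 2
    background = np.array([20.0, 25.0])       # mg/l without any fertilizer

    res = linprog(
        c=-benefit,                 # linprog minimizes, so negate benefits
        A_ub=response,
        b_ub=50.0 - background,     # groundwater standard: 50 mg/l
        bounds=[(0, 250)] * 3,      # agronomic maximum application
    )
    print("optimal rates (kg N/ha):", res.x, "net benefit:", -res.fun)
    ```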

  11. ATLAS particle detector CSC ROD software design and implementation, and, Addition of K physics to chi-squared analysis of FDQM

    NASA Astrophysics Data System (ADS)

    Hawkins, Donovan Lee

In this thesis I present a software framework for use on the ATLAS muon CSC readout driver. This C++ framework uses plug-in Decoders incorporating hand-optimized assembly language routines to perform sparsification and data formatting. The software is designed with both flexibility and performance in mind, and runs on a custom 9U VME board using Texas Instruments TMS320C6203 digital signal processors. I describe the requirements of the software, the methods used in its design, and the results of testing the software with simulated data. I also present modifications to a chi-squared analysis of the Standard Model and Four Down Quark Model (FDQM) originally done by Dr. Dennis Silverman. The addition of four new experiments to the analysis has little effect on the Standard Model but provides important new restrictions on the FDQM. The method used to incorporate these new experiments is presented, and the consequences of their addition are reviewed.

  12. [Expert investigation on food safety standard system framework construction in China].

    PubMed

    He, Xiang; Yan, Weixing; Fan, Yongxiang; Zeng, Biao; Peng, Zhen; Sun, Zhenqiu

    2013-09-01

We investigated the food safety standard framework among food safety experts in order to summarize the basic elements and principles of the food safety standard system and to provide policy advice for the food safety standards framework. A survey was carried out among 415 experts from government, professional institutions and the food industry/enterprises using the National Food Safety Standard System Construction Consultation Questionnaire, designed in the name of the Secretariat of the National Food Safety Standard Committee. Expert opinions differed across groups regarding the principles for food product standards, food additive product standards, food-related product standards, hygienic practice and test methods. The results not only reflect experts' awareness of the current state of food safety standards work, but also provide advice for the setting and revision of food safety standards going forward. Through this expert investigation, the framework and guiding principles of the food safety standard system were established.

  13. Overarching framework for data-based modelling

    NASA Astrophysics Data System (ADS)

    Schelter, Björn; Mader, Malenka; Mader, Wolfgang; Sommerlade, Linda; Platt, Bettina; Lai, Ying-Cheng; Grebogi, Celso; Thiel, Marco

    2014-02-01

One of the main modelling paradigms for complex physical systems is networks. When estimating the network structure from measured signals, several assumptions, such as stationarity, are typically made in the estimation process. Violating these assumptions renders standard analysis techniques fruitless. We here propose a framework to estimate the network structure from measurements of arbitrary non-linear, non-stationary, stochastic processes. To this end, we develop a rigorous mathematical theory that underlies the framework. Based on this theory, we present a highly efficient algorithm and the corresponding statistics, which can be applied directly and sensibly to measured signals. We demonstrate its performance in a simulation study. In experiments on transitions between vigilance stages in rodents, we infer small network structures with complex, time-dependent interactions; this suggests biomarkers for such transitions, the key to understanding and diagnosing numerous diseases such as dementia. We argue that the suggested framework combines features that the approaches followed so far lack.

  14. Friendship Dissolution Within Social Networks Modeled Through Multilevel Event History Analysis

    PubMed Central

    Dean, Danielle O.; Bauer, Daniel J.; Prinstein, Mitchell J.

    2018-01-01

    A social network perspective can bring important insight into the processes that shape human behavior. Longitudinal social network data, measuring relations between individuals over time, has become increasingly common—as have the methods available to analyze such data. A friendship duration model utilizing discrete-time multilevel survival analysis with a multiple membership random effect structure is developed and applied here to study the processes leading to undirected friendship dissolution within a larger social network. While the modeling framework is introduced in terms of understanding friendship dissolution, it can be used to understand microlevel dynamics of a social network more generally. These models can be fit with standard generalized linear mixed-model software, after transforming the data to a pair-period data set. An empirical example highlights how the model can be applied to understand the processes leading to friendship dissolution between high school students, and a simulation study is used to test the use of the modeling framework under representative conditions that would be found in social network data. Advantages of the modeling framework are highlighted, and potential limitations and future directions are discussed. PMID:28463022

  15. The OGC Sensor Web Enablement framework

    NASA Astrophysics Data System (ADS)

    Cox, S. J.; Botts, M.

    2006-12-01

Sensor observations are at the core of the natural sciences. Improvements in data-sharing technologies offer the promise of much greater utilisation of observational data. A key to this is interoperable data standards. The Open Geospatial Consortium's (OGC) Sensor Web Enablement initiative (SWE) is developing open standards for web interfaces for the discovery, exchange and processing of sensor observations, and the tasking of sensor systems. The goal is to support the construction of complex sensor applications through real-time composition of service chains from standard components. The framework is based around a suite of standard interfaces, and standard encodings for the messages transferred between services. The SWE interfaces include: the Sensor Observation Service (SOS), for parameterized observation requests (by observation time, feature of interest, property, sensor); the Sensor Planning Service (SPS), for tasking a sensor system to undertake future observations; and the Sensor Alert Service (SAS), for subscription to an alert, usually triggered by a sensor result exceeding some value. The interface design generally follows the pattern established in the OGC Web Map Service (WMS) and Web Feature Service (WFS) interfaces, where the interaction between a client and service follows a standard sequence of requests and responses. The first obtains a general description of the service capabilities, followed by the detail required to formulate a data request, and finally a request for a data instance or stream. These may be implemented in a stateless "REST" idiom, or using conventional "web-services" (SOAP) messaging. In a deployed system, the SWE interfaces are supplemented by Catalogue, data (WFS) and portrayal (WMS) services, as well as authentication and rights management. The standard SWE data formats are Observations and Measurements (O&M), which encodes observation metadata and results; Sensor Model Language (SensorML), which describes sensor systems; Transducer Model Language (TML), which covers low-level data streams; and domain-specific GML Application Schemas for definitions of the target feature types. The SWE framework has been demonstrated in several interoperability testbeds, based around emergency management, security, contamination and environmental monitoring scenarios.
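
    As a concrete flavor of the request-response pattern described above, here is a hedged sketch of a KVP-style SOS GetObservation call; the endpoint and the offering/property identifiers are hypothetical placeholders, and parameter details vary by service version.

    ```python
    # A hedged sketch of an SOS "GetObservation" request in the KVP style the
    # abstract describes (capabilities -> detail -> data). The endpoint and
    # offering/property identifiers are hypothetical.
    import requests

    ENDPOINT = "https://example.org/sos"  # hypothetical SWE service

    params = {
        "service": "SOS",
        "version": "1.0.0",
        "request": "GetObservation",
        "offering": "GAUGE_HEIGHT",
        "observedProperty": "urn:ogc:def:property:river_stage",
        "eventTime": "2006-01-01T00:00:00Z/2006-01-02T00:00:00Z",
        "responseFormat": 'text/xml;subtype="om/1.0.0"',
    }
    response = requests.get(ENDPOINT, params=params, timeout=30)
    print(response.status_code)  # O&M-encoded observations on success
    ```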

  16. Designing Collaborative Developmental Standards by Refactoring of the Earth Science Models, Libraries, Workflows and Frameworks.

    NASA Astrophysics Data System (ADS)

    Mirvis, E.; Iredell, M.

    2015-12-01

The operational (OPS) NOAA National Centers for Environmental Prediction (NCEP) suite traditionally consists of a large set of multi-scale HPC models, workflows, scripts, tools and utilities, which depend heavily on a variety of additional components. Namely, this suite utilizes a unique collection of 20+ in-house developed shared libraries (NCEPLIBS), specific versions of 3rd-party libraries (like netcdf, HDF, ESMF, jasper, xml, etc.), and an HPC workflow tool, all within a dedicated (sometimes even vendor-customized) homogeneous HPC environment. This domain- and site-specific setup, together with NCEP's product-driven, large-scale real-time data operations, complicates NCEP collaborative development tremendously by reducing the chances of replicating this OPS environment anywhere else. The mission of NOAA/NCEP's Environmental Modeling Center (EMC) is to develop and improve numerical weather, climate, hydrological and ocean prediction through partnership with the research community. Recognizing these difficulties, EMC has lately taken an innovative approach to improving the flexibility of the HPC environment by building the elements of, and a foundation for, an NCEP OPS functionally equivalent environment (FEE), which can also be used to ease external interface constructs. Aiming to reduce the turnaround time of community code enhancements via the Research-to-Operations (R2O) cycle, EMC has developed and deployed several project sub-set standards that have already paved the way toward NCEP OPS implementation standards. In this topic we will discuss the EMC FEE for O2R requirements and approaches in collaborative standardization, including NCEPLIBS FEE and model code version control paired with the models' derived customized HPC modules and FEE footprints. We will share NCEP/EMC experience and potential in the refactoring of EMC development processes and legacy codes, and in securing model source code quality standards by using a combination of the Eclipse IDE integrated with reverse engineering tools/APIs. We will also report on collaborative efforts in the restructuring of the NOAA Environmental Modeling System (NEMS) - the multi-model coupling framework - and on transitioning the FEE verification methodology.

  17. Simultaneously estimating evolutionary history and repeated traits phylogenetic signal: applications to viral and host phenotypic evolution

    PubMed Central

    Vrancken, Bram; Lemey, Philippe; Rambaut, Andrew; Bedford, Trevor; Longdon, Ben; Günthard, Huldrych F.; Suchard, Marc A.

    2014-01-01

Phylogenetic signal quantifies the degree to which resemblance in continuously-valued traits reflects phylogenetic relatedness. Measures of phylogenetic signal are widely used in ecological and evolutionary research, and are recently gaining traction in viral evolutionary studies. Standard estimators of phylogenetic signal frequently condition on data summary statistics of the repeated trait observations and on fixed phylogenetic trees, resulting in information loss and potential bias. To incorporate the observation process and phylogenetic uncertainty in a model-based approach, we develop a novel Bayesian inference method to simultaneously estimate the evolutionary history and phylogenetic signal from molecular sequence data and repeated multivariate traits. Our approach builds upon a phylogenetic diffusion framework that models continuous-trait evolution as a Brownian motion process and incorporates Pagel's λ transformation parameter to estimate dependence among traits. We provide a computationally efficient inference implementation in the BEAST software package. We evaluate the performance of the Bayesian estimator of phylogenetic signal against standard estimators on synthetic data, and demonstrate the use of our coherent framework to address several virus-host evolutionary questions, including virulence heritability for HIV, antigenic evolution in influenza and HIV, and Drosophila sensitivity to sigma virus infection. Finally, we discuss model extensions that will make useful contributions to our flexible framework for simultaneously studying sequence and trait evolution. PMID:25780554
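
    The role of Pagel's λ can be illustrated numerically. The toy sketch below (not the BEAST implementation; the tree and trait values are invented) scales the off-diagonal entries of the Brownian-motion phylogenetic covariance by λ and evaluates the multivariate normal trait likelihood.

    ```python
    # A toy numeric illustration (not the BEAST implementation) of Pagel's
    # lambda transform: off-diagonal entries of the Brownian-motion
    # phylogenetic covariance are scaled by lambda, and the trait likelihood
    # is a multivariate normal under that covariance.
    import numpy as np
    from scipy.stats import multivariate_normal

    def lambda_transform(C: np.ndarray, lam: float) -> np.ndarray:
        C_lam = lam * C
        np.fill_diagonal(C_lam, np.diag(C))  # tip variances are unchanged
        return C_lam

    # Shared-branch-length matrix for a hypothetical 3-taxon tree
    C = np.array([[1.0, 0.6, 0.2],
                  [0.6, 1.0, 0.2],
                  [0.2, 0.2, 1.0]])
    traits = np.array([0.3, 0.5, -0.1])

    for lam in (0.0, 0.5, 1.0):  # 0 = no signal, 1 = pure Brownian motion
        ll = multivariate_normal(mean=np.zeros(3),
                                 cov=lambda_transform(C, lam)).logpdf(traits)
        print(f"lambda={lam}: log-likelihood={ll:.3f}")
    ```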

  18. Unit Standards Catalogue. Unit Standards and Qualifications Registered on the National Qualifications Framework to March 1994.

    ERIC Educational Resources Information Center

    New Zealand Qualifications Authority, Wellington.

    This booklet includes the latest list of unit standards and qualifications registered on the New Zealand National Qualifications Framework to April 1994. Unit standards registered on the framework can be offered by private and government training establishments, polytechnics, colleges of education, and schools. This list of registered unit…

  19. Indoorgml - a Standard for Indoor Spatial Modeling

    NASA Astrophysics Data System (ADS)

    Li, Ki-Joune

    2016-06-01

With recent progress in mobile devices and indoor positioning technologies, it has become possible to provide location-based services in indoor space as well as outdoor space, either seamlessly across indoor and outdoor spaces or independently for indoor space alone. However, we cannot simply apply spatial models developed for outdoor space to indoor space, due to their differences. For example, coordinate reference systems are employed to indicate a specific position in outdoor space, while a location in indoor space is instead specified by a cell identifier, such as a room number. Unlike outdoor space, the distance between two points in indoor space is determined not by the length of the straight line between them but by the constraints imposed by indoor components such as walls, stairs, and doors. For this reason, we need to establish a new framework for indoor space, from its fundamental theoretical basis and indoor spatial data models to information systems to store, manage, and analyse indoor spatial data. In order to provide this framework, an international standard called IndoorGML has been developed and published by OGC (Open Geospatial Consortium). This standard is based on a cellular notion of space, which considers an indoor space as a set of non-overlapping cells. It consists of two types of modules: a core module and extension modules. While the core module consists of four basic conceptual and implementation modeling components (a geometric model for cells, topology between cells, a semantic model of cells, and a multi-layered space model), extension modules may be defined on top of the core module to support a given application area. In the first version of the standard, we provide an extension for indoor navigation.
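
    The cellular notion of space has a natural graph reading: cells are nodes and doors define adjacency, so indoor "distance" follows the graph rather than straight lines. A minimal sketch with invented room names:

    ```python
    # A small sketch of IndoorGML's cellular view of space: rooms are
    # non-overlapping cells (nodes), doors are adjacency (edges), and
    # "distance" follows the constrained graph rather than straight lines.
    import networkx as nx

    g = nx.Graph()
    g.add_edges_from([
        ("room_101", "corridor"),   # each edge stands for a door
        ("room_102", "corridor"),
        ("corridor", "stairwell"),
        ("stairwell", "lobby"),
    ])

    path = nx.shortest_path(g, "room_101", "lobby")
    print(path)  # ['room_101', 'corridor', 'stairwell', 'lobby']
    ```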

  20. Age-structured mark-recapture analysis: A virtual-population-analysis-based model for analyzing age-structured capture-recapture data

    USGS Publications Warehouse

    Coggins, L.G.; Pine, William E.; Walters, C.J.; Martell, S.J.D.

    2006-01-01

We present a new model to estimate capture probabilities, survival, abundance, and recruitment using traditional Jolly-Seber capture-recapture methods within a standard fisheries virtual population analysis framework. This approach compares the numbers of marked and unmarked fish at age captured in each year of sampling with predictions based on estimated vulnerabilities and abundance in a likelihood function. Recruitment to the earliest age at which fish can be tagged is estimated by using a virtual population analysis method to back-calculate the expected numbers of unmarked fish at risk of capture. By using information from both marked and unmarked animals in a standard fisheries age structure framework, this approach is well suited to the sparse data situations common in long-term capture-recapture programs with variable sampling effort. Copyright © by the American Fisheries Society 2006.
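
    The back-calculation idea borrowed from VPA can be sketched with Pope's cohort approximation, a standard fisheries formula (this illustrates the general technique, not the authors' exact likelihood; catches and mortality are invented):

    ```python
    # A hedged sketch of VPA-style back-calculation using Pope's cohort
    # approximation: numbers at age are reconstructed backwards from catches
    # and natural mortality M.
    import numpy as np

    def pope_backcalculate(catch_at_age: np.ndarray,
                           terminal_N: float,
                           M: float = 0.2) -> np.ndarray:
        """N_a = N_{a+1} * exp(M) + C_a * exp(M / 2), computed backwards."""
        ages = len(catch_at_age)
        N = np.zeros(ages + 1)
        N[-1] = terminal_N
        for a in range(ages - 1, -1, -1):
            N[a] = N[a + 1] * np.exp(M) + catch_at_age[a] * np.exp(M / 2)
        return N[:-1]

    catches = np.array([500.0, 350.0, 200.0, 90.0])  # hypothetical cohort
    print(pope_backcalculate(catches, terminal_N=150.0))
    ```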

  1. Beyond standard model calculations with Sherpa

    DOE PAGES

    Höche, Stefan; Kuttimalai, Silvan; Schumann, Steffen; ...

    2015-03-24

    We present a fully automated framework as part of the Sherpa event generator for the computation of tree-level cross sections in beyond Standard Model scenarios, making use of model information given in the Universal FeynRules Output format. Elementary vertices are implemented into C++ code automatically and provided to the matrix-element generator Comix at runtime. Widths and branching ratios for unstable particles are computed from the same building blocks. The corresponding decays are simulated with spin correlations. Parton showers, QED radiation and hadronization are added by Sherpa, providing a full simulation of arbitrary BSM processes at the hadron level.

  2. Beyond standard model calculations with Sherpa.

    PubMed

    Höche, Stefan; Kuttimalai, Silvan; Schumann, Steffen; Siegert, Frank

    We present a fully automated framework as part of the Sherpa event generator for the computation of tree-level cross sections in Beyond Standard Model scenarios, making use of model information given in the Universal FeynRules Output format. Elementary vertices are implemented into C++ code automatically and provided to the matrix-element generator Comix at runtime. Widths and branching ratios for unstable particles are computed from the same building blocks. The corresponding decays are simulated with spin correlations. Parton showers, QED radiation and hadronization are added by Sherpa, providing a full simulation of arbitrary BSM processes at the hadron level.

  3. Derivation and Implementation of a Model Teaching the Nature of Science Using Informal Science Education Venues

    ERIC Educational Resources Information Center

    Spector, Barbara S.; Burkett, Ruth; Leard, Cyndy

    2012-01-01

    This paper introduces a model for using informal science education venues as contexts within which to teach the nature of science. The model was initially developed to enable university education students to teach science in elementary schools so as to be consistent with "National Science Education Standards" (NSES) (1996) and "A Framework for…

  4. Analysis model for personal eHealth solutions and services.

    PubMed

    Mykkänen, Juha; Tuomainen, Mika; Luukkonen, Irmeli; Itälä, Timo

    2010-01-01

In this paper, we present a framework for analysing and assessing various features of personal wellbeing information management services and solutions, such as personal health records and citizen-oriented eHealth services. The model is based on general functional and interoperability standards for personal health management applications and on generic frameworks for different aspects of analysis. It has been developed and used in the MyWellbeing project in Finland to provide a baseline for the research, development and comparison of many different personal wellbeing and health management solutions, and to support the development of the unified "Coper" concept for citizen empowerment.

  5. A human rights framework for midwifery care.

    PubMed

    Thompson, Joyce Beebe

    2004-01-01

    This article presents a rights-based model for midwifery care of women and childbearing families. Salient features include discussion of the influence of values on how women are viewed within cultures and societies, universal ethical principles applicable to health care services, and human rights based on the view of women as persons rather than as objects or chattel. Examples of the health impact on women of persistent violation of basic human rights are used to support the need for using a human rights framework for midwifery care--a model supported by codes of ethics, the midwifery philosophy of care, and standards of practice.

  6. Design and Application of an Ontology for Component-Based Modeling of Water Systems

    NASA Astrophysics Data System (ADS)

    Elag, M.; Goodall, J. L.

    2012-12-01

    Many Earth system modeling frameworks have adopted an approach of componentizing models so that a large model can be assembled by linking a set of smaller model components. These model components can then be more easily reused, extended, and maintained by a large group of model developers and end users. While there has been a notable increase in component-based model frameworks in the Earth sciences in recent years, there has been less work on creating framework-agnostic metadata and ontologies for model components. Well defined model component metadata is needed, however, to facilitate sharing, reuse, and interoperability both within and across Earth system modeling frameworks. To address this need, we have designed an ontology for the water resources community named the Water Resources Component (WRC) ontology in order to advance the application of component-based modeling frameworks across water related disciplines. Here we present the design of the WRC ontology and demonstrate its application for integration of model components used in watershed management. First we show how the watershed modeling system Soil and Water Assessment Tool (SWAT) can be decomposed into a set of hydrological and ecological components that adopt the Open Modeling Interface (OpenMI) standard. Then we show how the components can be used to estimate nitrogen losses from land to surface water for the Baltimore Ecosystem study area. Results of this work are (i) a demonstration of how the WRC ontology advances the conceptual integration between components of water related disciplines by handling the semantic and syntactic heterogeneity present when describing components from different disciplines and (ii) an investigation of a methodology by which large models can be decomposed into a set of model components that can be well described by populating metadata according to the WRC ontology.

  7. Second-Language Learning through Imaginative Theory

    ERIC Educational Resources Information Center

    Broom, Catherine

    2011-01-01

    This article explores how Egan's (1997) work on imagination can enrich our understanding of teaching English as a second language (ESL). Much has been written on ESL teaching techniques; however, some of this work has been expounded in a standard educational framework, which is what Egan calls an assembly-line model. This model can easily underlie…

  8. An Integrated Modeling Framework Forecasting Ecosystem Exposure-- A Systems Approach to the Cumulative Impacts of Multiple Stressors

    NASA Astrophysics Data System (ADS)

    Johnston, J. M.

    2013-12-01

    Freshwater habitats provide fishable, swimmable and drinkable resources and are a nexus of geophysical and biological processes. These processes in turn influence the persistence and sustainability of populations, communities and ecosystems. Climate change and landuse change encompass numerous stressors of potential exposure, including the introduction of toxic contaminants, invasive species, and disease in addition to physical drivers such as temperature and hydrologic regime. A systems approach that includes the scientific and technologic basis of assessing the health of ecosystems is needed to effectively protect human health and the environment. The Integrated Environmental Modeling Framework 'iemWatersheds' has been developed as a consistent and coherent means of forecasting the cumulative impact of co-occurring stressors. The Framework consists of three facilitating technologies: Data for Environmental Modeling (D4EM) that automates the collection and standardization of input data; the Framework for Risk Assessment of Multimedia Environmental Systems (FRAMES) that manages the flow of information between linked models; and the Supercomputer for Model Uncertainty and Sensitivity Evaluation (SuperMUSE) that provides post-processing and analysis of model outputs, including uncertainty and sensitivity analysis. Five models are linked within the Framework to provide multimedia simulation capabilities for hydrology and water quality processes: the Soil Water Assessment Tool (SWAT) predicts surface water and sediment runoff and associated contaminants; the Watershed Mercury Model (WMM) predicts mercury runoff and loading to streams; the Water quality Analysis and Simulation Program (WASP) predicts water quality within the stream channel; the Habitat Suitability Index (HSI) model scores physicochemical habitat quality for individual fish species; and the Bioaccumulation and Aquatic System Simulator (BASS) predicts fish growth, population dynamics and bioaccumulation of toxic substances. The capability of the Framework to address cumulative impacts will be demonstrated for freshwater ecosystem services and mountaintop mining.
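
    The flow of information between linked models that FRAMES manages can be pictured with a toy pipeline. The sketch below is purely illustrative (the functions are invented stand-ins for SWAT, WASP and HSI, not their actual interfaces): each step consumes the shared state and adds its outputs, so exposure accumulates along the chain.

    ```python
    # A toy data-flow sketch of framework-managed model linkage, in the
    # spirit of FRAMES but not its actual API. All numbers are invented.
    def watershed_runoff(state: dict) -> dict:
        state["flow_m3s"] = 12.0 + 3.0 * state["rain_mm"]           # toy SWAT stand-in
        return state

    def stream_quality(state: dict) -> dict:
        state["nitrate_mgl"] = 40.0 / max(state["flow_m3s"], 1e-6)  # toy WASP stand-in
        return state

    def habitat_score(state: dict) -> dict:
        state["hsi"] = max(0.0, 1.0 - state["nitrate_mgl"] / 10.0)  # toy HSI stand-in
        return state

    state = {"rain_mm": 6.0}
    for model in (watershed_runoff, stream_quality, habitat_score):
        state = model(state)   # the framework mediates each hand-off
    print(state)
    ```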

  9. A new multi-layer approach for progressive damage simulation in composite laminates based on isogeometric analysis and Kirchhoff-Love shells. Part II: impact modeling

    NASA Astrophysics Data System (ADS)

    Pigazzini, M. S.; Bazilevs, Y.; Ellison, A.; Kim, H.

    2017-11-01

    In this two-part paper we introduce a new formulation for modeling progressive damage in laminated composite structures. We adopt a multi-layer modeling approach, based on isogeometric analysis, where each ply or lamina is represented by a spline surface, and modeled as a Kirchhoff-Love thin shell. Continuum damage mechanics is used to model intralaminar damage, and a new zero-thickness cohesive-interface formulation is introduced to model delamination as well as permitting laminate-level transverse shear compliance. In Part I of this series we focus on the presentation of the modeling framework, validation of the framework using standard Mode I and Mode II delamination tests, and assessment of its suitability for modeling thick laminates. In Part II of this series we focus on the application of the proposed framework to modeling and simulation of damage in composite laminates resulting from impact. The proposed approach has significant accuracy and efficiency advantages over existing methods for modeling impact damage. These stem from the use of IGA-based Kirchhoff-Love shells to represent the individual plies of the composite laminate, while the compliant cohesive interfaces enable transverse shear deformation of the laminate. Kirchhoff-Love shells give a faithful representation of the ply deformation behavior, and, unlike solids or traditional shear-deformable shells, do not suffer from transverse-shear locking in the limit of vanishing thickness. This, in combination with higher-order accurate and smooth representation of the shell midsurface displacement field, allows us to adopt relatively coarse in-plane discretizations without sacrificing solution accuracy. Furthermore, the thin-shell formulation employed does not use rotational degrees of freedom, which gives additional efficiency benefits relative to more standard shell formulations.

  10. A new multi-layer approach for progressive damage simulation in composite laminates based on isogeometric analysis and Kirchhoff-Love shells. Part I: basic theory and modeling of delamination and transverse shear

    NASA Astrophysics Data System (ADS)

    Bazilevs, Y.; Pigazzini, M. S.; Ellison, A.; Kim, H.

    2017-11-01

    In this two-part paper we introduce a new formulation for modeling progressive damage in laminated composite structures. We adopt a multi-layer modeling approach, based on Isogeometric Analysis (IGA), where each ply or lamina is represented by a spline surface, and modeled as a Kirchhoff-Love thin shell. Continuum Damage Mechanics is used to model intralaminar damage, and a new zero-thickness cohesive-interface formulation is introduced to model delamination as well as permitting laminate-level transverse shear compliance. In Part I of this series we focus on the presentation of the modeling framework, validation of the framework using standard Mode I and Mode II delamination tests, and assessment of its suitability for modeling thick laminates. In Part II of this series we focus on the application of the proposed framework to modeling and simulation of damage in composite laminates resulting from impact. The proposed approach has significant accuracy and efficiency advantages over existing methods for modeling impact damage. These stem from the use of IGA-based Kirchhoff-Love shells to represent the individual plies of the composite laminate, while the compliant cohesive interfaces enable transverse shear deformation of the laminate. Kirchhoff-Love shells give a faithful representation of the ply deformation behavior, and, unlike solids or traditional shear-deformable shells, do not suffer from transverse-shear locking in the limit of vanishing thickness. This, in combination with higher-order accurate and smooth representation of the shell midsurface displacement field, allows us to adopt relatively coarse in-plane discretizations without sacrificing solution accuracy. Furthermore, the thin-shell formulation employed does not use rotational degrees of freedom, which gives additional efficiency benefits relative to more standard shell formulations.

  11. Standardization efforts of digital pathology in Europe.

    PubMed

    Rojo, Marcial García; Daniel, Christel; Schrader, Thomas

    2012-01-01

    EURO-TELEPATH is a European COST Action IC0604. It started in 2007 and will end in November 2011. Its main objectives are evaluating and validating the common technological framework and communication standards required to access, transmit, and manage digital medical records by pathologists and other medical specialties in a networked environment. Working Group 1, "Business Modelling in Pathology," has designed main pathology processes - Frozen Study, Formalin Fixed Specimen Study, Telepathology, Cytology, and Autopsy - using Business Process Modelling Notation (BPMN). Working Group 2 has been dedicated to promoting the application of informatics standards in pathology, collaborating with Integrating Healthcare Enterprise (IHE), Digital Imaging and Communications in Medicine (DICOM), Health Level Seven (HL7), and other standardization bodies. Health terminology standardization research has become a topic of great interest. Future research work should focus on standardizing automatic image analysis and tissue microarrays imaging.

  12. Development of structured ICD-10 and its application to computer-assisted ICD coding.

    PubMed

    Imai, Takeshi; Kajino, Masayuki; Sato, Megumi; Ohe, Kazuhiko

    2010-01-01

    This paper presents: (1) a framework for the formal representation of ICD-10, which functions as a bridge between ontological information and natural language expressions; and (2) a methodology for using the formally described ICD-10 for computer-assisted ICD coding. First, we analyzed and structured the meanings of categories in 15 chapters of ICD-10. Then we expanded the structured ICD-10 (S-ICD10) by adding subordinate concepts and labels derived from Japanese Standard Disease Names. The information model used to describe the formal representation was refined repeatedly; the resulting model includes 74 types of semantic links. We also developed an ICD coding module based on S-ICD10 and a 'Coding Principle,' which achieved high accuracy (>70%) for four chapters. These results not only demonstrate the basic feasibility of our coding framework but might also inform the development of the information model for the formal description framework in the ICD-11 revision.
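
    To make the idea of a formally structured category concrete, here is a minimal, hypothetical sketch (not the authors' actual schema; the link types and the matching rule are invented for illustration, whereas the real S-ICD10 model defines 74 semantic link types):

        # Hypothetical structured representation of one ICD-10 category.
        s_icd10_entry = {
            "code": "I21",
            "label": "Acute myocardial infarction",
            "semantic_links": [
                {"type": "has_pathology", "target": "infarction"},
                {"type": "has_site", "target": "myocardium"},
                {"type": "has_course", "target": "acute"},
            ],
        }

        def match_disease_name(entry, extracted_features):
            """Toy coding rule: assign the code when every linked concept
            is found among features extracted from a disease name."""
            targets = {link["target"] for link in entry["semantic_links"]}
            return entry["code"] if targets <= set(extracted_features) else None

        print(match_disease_name(s_icd10_entry,
                                 ["acute", "myocardium", "infarction"]))  # I21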

  13. Adjusting for overdispersion in piecewise exponential regression models to estimate excess mortality rate in population-based research.

    PubMed

    Luque-Fernandez, Miguel Angel; Belot, Aurélien; Quaresma, Manuela; Maringe, Camille; Coleman, Michel P; Rachet, Bernard

    2016-10-01

    In population-based cancer research, piecewise exponential regression models are used to derive adjusted estimates of excess mortality due to cancer within the Poisson generalized linear modelling framework. However, the assumption that the conditional mean and variance of the rate parameter given the set of covariates x i are equal is strong and may fail to account for overdispersion, i.e., the variance of the rate parameter exceeding its mean. Using an empirical example, we aimed to describe simple methods to test and correct for overdispersion. We used a regression-based score test for overdispersion under the relative survival framework and proposed different approaches to correct for overdispersion, including quasi-likelihood, robust standard error estimation, negative binomial regression and flexible piecewise modelling. All piecewise exponential regression models showed significant inherent overdispersion (p-value <0.001), but the flexible piecewise exponential model showed the smallest overdispersion parameter (3.2, versus 21.3 for the non-flexible piecewise exponential models). We found no major differences between the correction methods. However, flexible piecewise regression modelling, with either quasi-likelihood or robust standard errors, was the best approach, as it deals with both overdispersion due to model misspecification and true (inherent) overdispersion.
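
    As a rough illustration of the test-then-correct workflow described above (a minimal sketch with simulated data, not the authors' code or their relative-survival setup), one can estimate the dispersion of a Poisson GLM and refit with robust standard errors or a negative binomial family:

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(1)
        n = 500
        x = rng.normal(size=n)
        X = sm.add_constant(x)
        mu = np.exp(0.5 + 0.3 * x)
        # Overdispersed counts via a gamma-Poisson (negative binomial) mixture.
        y = rng.poisson(mu * rng.gamma(shape=2.0, scale=0.5, size=n))

        pois = sm.GLM(y, X, family=sm.families.Poisson()).fit()
        dispersion = pois.pearson_chi2 / pois.df_resid
        print(f"dispersion estimate: {dispersion:.2f}")  # >> 1 signals overdispersion

        # Two simple corrections: sandwich (robust) standard errors, or a
        # negative binomial model with an extra variance parameter.
        robust = sm.GLM(y, X, family=sm.families.Poisson()).fit(cov_type="HC0")
        negbin = sm.GLM(y, X, family=sm.families.NegativeBinomial(alpha=1.0)).fit()
        print(robust.bse, negbin.bse)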

  14. 78 FR 44090 - Formaldehyde; Third-Party Certification Framework for the Formaldehyde Standards for Composite...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-23

    ... Formaldehyde; Third-Party Certification Framework for the Formaldehyde Standards for Composite Wood Products..., concerning a third-party certification framework for the formaldehyde standards for composite wood products... Environmental protection, Composite wood products, Formaldehyde, Reporting and recordkeeping, Third-party...

  15. Constraints on Lorentz symmetry violations using lunar laser ranging observations

    NASA Astrophysics Data System (ADS)

    Bourgoin, Adrien

    2016-12-01

    General Relativity (GR) and the standard model of particle physics provide a comprehensive description of the four interactions of nature. A quantum gravity theory is expected to merge these two pillars of modern physics. According to unification theories, such a merger would entail the breaking of a fundamental symmetry shared by GR and the standard model of particle physics: Lorentz symmetry. Lorentz symmetry violations across all fields of physics can be parametrized by an effective field theory framework called the standard-model extension (SME). Local Lorentz invariance violations in the gravitational sector should affect the orbital motion of bodies inside the solar system, such as the Moon. Accurate lunar laser ranging (LLR) data can therefore be analyzed to study the lunar motion precisely and look for irregularities. For this purpose, ELPN (Ephéméride Lunaire Parisienne Numérique), a new lunar ephemeris, has been integrated in the SME framework. This new numerical solution of the lunar motion provides time series dated in temps dynamique barycentrique (TDB, Barycentric Dynamical Time). Among those series are the barycentric position and velocity of the Earth-Moon vector, the lunar libration angles, the time scale difference between terrestrial time and TDB, and partial derivatives integrated from the variational equations. ELPN predictions have been used to analyze LLR observations. In the GR framework, the standard deviations of the residuals turned out to be of the same order of magnitude as those of the INPOP13b and DE430 ephemerides. In the framework of the minimal SME, LLR data analysis provided constraints on local Lorentz invariance violations. Special attention was paid to the analysis of uncertainties in order to provide the most realistic constraints. Therefore, linear combinations of SME coefficients were first derived and fitted to LLR observations; realistic uncertainties were then determined with a resampling method. LLR data analysis did not reveal local Lorentz invariance violations arising in the lunar orbit. GR predictions are therefore recovered with absolute precisions of the order of 10⁻⁹ to 10⁻¹².

  16. A Framework for Analysis of Research Risks and Benefits to Participants in Standard of Care Pragmatic Clinical Trials

    PubMed Central

    Chen, Stephanie C; Kim, Scott Y H

    2016-01-01

    Background/Aims: Standard of care pragmatic clinical trials (SCPCTs) that compare treatments already in use could improve care and reduce cost, but there is considerable debate about the research risks of SCPCTs and how to apply informed consent regulations to such trials. We sought to develop a framework integrating the insights from opposing sides of the debate. Methods: We developed a formal risk-benefit analysis framework for SCPCTs and then applied it to key provisions of the U.S. federal regulations. Results: Our formal framework for SCPCT risk-benefit analysis takes into account three key considerations: the ex ante estimates of risks and benefits of the treatments to be compared in a SCPCT, the allocation ratios of treatments inside and outside a SCPCT, and the significance of some participants receiving a different treatment inside a SCPCT than outside the trial. The framework provides practical guidance on how the research ethics regulations on informed consent should be applied to SCPCTs. Conclusions: Our proposed formal model makes explicit the relationship between the concepts used by opposing sides of the debate about the research risks of SCPCTs and can be used to clarify the implications for informed consent. PMID:27365010
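
    One way to make the role of the allocation ratios concrete is the following illustrative formalization (simplified to two treatments with known ex ante harms; this is not the paper's exact model). Let treatments A and B carry ex ante per-patient expected harms r_A and r_B, received with probabilities q_A and q_B inside the trial and p_A and p_B in usual care. The incremental research risk of enrolling is then

        \Delta
          = \underbrace{(q_A r_A + q_B r_B)}_{\text{inside SCPCT}}
          - \underbrace{(p_A r_A + p_B r_B)}_{\text{usual care}}
          = (q_A - p_A)(r_A - r_B)

    using q_A + q_B = p_A + p_B = 1. The expression vanishes when the ex ante harms are equivalent (r_A = r_B) or when trial allocation mirrors usual care (q_A = p_A), which is precisely where the two sides of the consent debate diverge.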

  17. Crossing the Virtual World Barrier with OpenAvatar

    NASA Technical Reports Server (NTRS)

    Joy, Bruce; Kavle, Lori; Tan, Ian

    2012-01-01

    There are multiple standards and formats for 3D models in virtual environments. The problem is that there is no open source platform for generating models out of discrete parts; this results in the process of having to "reinvent the wheel" when new games, virtual worlds and simulations want to enable their users to create their own avatars or easily customize in-world objects. OpenAvatar is designed to provide a framework to allow artists and programmers to create reusable assets which can be used by end users to generate vast numbers of complete models that are unique and functional. OpenAvatar serves as a framework which facilitates the modularization of 3D models allowing parts to be interchanged within a set of logical constraints.

  18. FPGA implemented testbed in 8-by-8 and 2-by-2 OFDM-MIMO channel estimation and design of baseband transceiver.

    PubMed

    Ramesh, S; Seshasayanan, R

    2016-01-01

    In this study, a baseband OFDM-MIMO system with timing synchronization and channel estimation is designed and implemented using FPGA technology. The system is prototyped according to the IEEE 802.11a standard, with signals transmitted and received over a 20 MHz bandwidth. With QPSK modulation, the system achieves a throughput of 24 Mbps. Furthermore, the LS (least-squares) algorithm is implemented and the estimation of a frequency-selective fading channel is demonstrated. For coarse timing estimation, the MNC scheme is examined and implemented. First, the entire system is modeled in MATLAB and a floating-point model is established. A fixed-point model is then created with the help of Simulink and Xilinx's System Generator for DSP. The system is subsequently synthesized and implemented within Xilinx's ISE tools and targeted to a Xilinx Virtex 5 board. In addition, hardware co-simulation is employed to reduce processing time when computing the BER of the fixed-point model. This work constitutes a first step toward designing innovative channel estimation strategies for fourth-generation (4G) mobile communication systems.
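
    For readers unfamiliar with the LS step, pilot-based least-squares channel estimation in OFDM reduces, per subcarrier, to dividing received pilots by transmitted pilots. A minimal numpy sketch under an idealized frequency-domain model (illustrative only, not the paper's fixed-point implementation):

        import numpy as np

        rng = np.random.default_rng(0)
        n_sc = 64                                   # subcarriers (802.11a FFT size)
        h = (rng.normal(size=n_sc) + 1j * rng.normal(size=n_sc)) / np.sqrt(2)

        # QPSK pilot symbols on every subcarrier.
        qpsk = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)
        x_pilot = qpsk[rng.integers(0, 4, size=n_sc)]

        noise = 0.05 * (rng.normal(size=n_sc) + 1j * rng.normal(size=n_sc))
        y_pilot = h * x_pilot + noise               # frequency-domain receive model

        h_ls = y_pilot / x_pilot                    # LS estimate per subcarrier
        print(np.max(np.abs(h_ls - h)))             # small residual estimation error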

  19. Standardized mappings--a framework to combine different semantic mappers into a standardized web-API.

    PubMed

    Neuhaus, Philipp; Doods, Justin; Dugas, Martin

    2015-01-01

    Automatic coding of medical terms is an important, but highly complicated and laborious task. To compare and evaluate different strategies a framework with a standardized web-interface was created. Two UMLS mapping strategies are compared to demonstrate the interface. The framework is a Java Spring application running on a Tomcat application server. It accepts different parameters and returns results in JSON format. To demonstrate the framework, a list of medical data items was mapped by two different methods: similarity search in a large table of terminology codes versus search in a manually curated repository. These mappings were reviewed by a specialist. The evaluation shows that the framework is flexible (due to standardized interfaces like HTTP and JSON), performant and reliable. Accuracy of automatically assigned codes is limited (up to 40%). Combining different semantic mappers into a standardized Web-API is feasible. This framework can be easily enhanced due to its modular design.
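
    Because the interface is plain HTTP plus JSON, any client can drive the mappers. A hypothetical call (the endpoint, parameter names and response fields below are invented for illustration; the paper does not publish its URL scheme):

        import requests

        # Hypothetical endpoint: submit a term and a mapper strategy,
        # receive candidate codes as JSON.
        resp = requests.get(
            "https://example.org/mapping-api/map",
            params={"term": "myocardial infarction", "mapper": "similarity"},
            timeout=10,
        )
        resp.raise_for_status()
        for candidate in resp.json().get("candidates", []):
            print(candidate["code"], candidate["score"])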

  20. Integrating Structured and Unstructured EHR Data Using an FHIR-based Type System: A Case Study with Medication Data.

    PubMed

    Hong, Na; Wen, Andrew; Shen, Feichen; Sohn, Sunghwan; Liu, Sijia; Liu, Hongfang; Jiang, Guoqian

    2018-01-01

    Standards-based modeling of electronic health records (EHR) data holds great significance for data interoperability and large-scale usage. Integration of unstructured data into a standard data model, however, poses unique challenges, partially due to the heterogeneous type systems used in existing clinical NLP systems. We introduce a scalable and standards-based framework for integrating structured and unstructured EHR data leveraging the HL7 Fast Healthcare Interoperability Resources (FHIR) specification. We implemented a clinical NLP pipeline enhanced with an FHIR-based type system and performed a case study using medication data from Mayo Clinic's EHR. Two UIMA-based NLP tools, MedXN and MedTime, were integrated into the pipeline to extract FHIR MedicationStatement resources and related attributes from unstructured medication lists. We developed a rule-based approach for assigning the NLP output types to the FHIR elements represented in the type system, and we identified the FHIR elements that are populated directly from the source structured EHR data. We used the FHIR resource MedicationStatement as an example to illustrate our integration framework and methods. For evaluation, we manually annotated FHIR elements in 166 medication statements from 14 clinical notes generated by Mayo Clinic in the course of patient care, and used standard performance measures (precision, recall and F-measure). The F-scores achieved ranged from 0.73 to 0.99 for the various FHIR element representations. The results demonstrate that our framework based on the FHIR type system is feasible for normalizing and integrating both structured and unstructured EHR data.
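
    As a reference point for the target representation, a minimal FHIR MedicationStatement resource looks roughly like the following (a simplified sketch; element names follow the FHIR specification of that era, and the RxNorm code shown is illustrative):

        medication_statement = {
            "resourceType": "MedicationStatement",
            "status": "active",
            "medicationCodeableConcept": {
                "coding": [{
                    "system": "http://www.nlm.nih.gov/research/umls/rxnorm",
                    "code": "197361",          # illustrative RxNorm code
                    "display": "amlodipine 5 MG Oral Tablet",
                }]
            },
            "dosage": [{
                "text": "5 mg orally once daily",   # NLP-extracted attribute target
            }],
        }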

  1. Maximizing your Process Improvement ROI through Harmonization

    DTIC Science & Technology

    2008-03-01

    ISO 12207 ) provide comprehensive guidance on what system and software engineering processes are needed. The frameworks of Six Sigma provide specific...reductions. Their veloci-Q Enterprise integrated system, includes ISO 9001, CMM, P-CMM, TL9000, British Standard 7799, and Six Sigma. They estimate a 30...at their discretion. And, they chose to blend process maturity models and ISO standards to support their objective regarding the establishment of

  2. An integrated framework for detecting suspicious behaviors in video surveillance

    NASA Astrophysics Data System (ADS)

    Zin, Thi Thi; Tin, Pyke; Hama, Hiromitsu; Toriu, Takashi

    2014-03-01

    In this paper, we propose an integrated framework for detecting suspicious behaviors in video surveillance systems which are established in public places such as railway stations, airports, shopping malls and etc. Especially, people loitering in suspicion, unattended objects left behind and exchanging suspicious objects between persons are common security concerns in airports and other transit scenarios. These involve understanding scene/event, analyzing human movements, recognizing controllable objects, and observing the effect of the human movement on those objects. In the proposed framework, multiple background modeling technique, high level motion feature extraction method and embedded Markov chain models are integrated for detecting suspicious behaviors in real time video surveillance systems. Specifically, the proposed framework employs probability based multiple backgrounds modeling technique to detect moving objects. Then the velocity and distance measures are computed as the high level motion features of the interests. By using an integration of the computed features and the first passage time probabilities of the embedded Markov chain, the suspicious behaviors in video surveillance are analyzed for detecting loitering persons, objects left behind and human interactions such as fighting. The proposed framework has been tested by using standard public datasets and our own video surveillance scenarios.
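
    The first-passage-time machinery is simple to compute for a small chain: the probability of first reaching a target state j at step n obeys f(n) = Q f(n-1) with f(1) = P[:, j], where Q is P with the column of j zeroed out. A generic sketch (the states and transition matrix are illustrative, not the paper's trained model):

        import numpy as np

        # Illustrative 3-state behavior chain, e.g. {moving, lingering, loitering}.
        P = np.array([[0.7, 0.2, 0.1],
                      [0.3, 0.5, 0.2],
                      [0.1, 0.3, 0.6]])

        def first_passage_probs(P, j, n_max):
            """f[n, i] = probability that the chain started in state i
            first hits state j at step n+1 (0-indexed rows)."""
            f = np.zeros((n_max, P.shape[0]))
            f[0] = P[:, j]
            Q = P.copy()
            Q[:, j] = 0.0          # forbid passing through j earlier
            for n in range(1, n_max):
                f[n] = Q @ f[n - 1]
            return f

        f = first_passage_probs(P, j=2, n_max=20)
        # An unusually long dwell before first passage can then be
        # thresholded to flag anomalous (loitering-like) behavior.
        print(f[:5, 0])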

  3. A geographic data model for representing ground water systems.

    PubMed

    Strassberg, Gil; Maidment, David R; Jones, Norm L

    2007-01-01

    The Arc Hydro ground water data model is a geographic data model for representing spatial and temporal ground water information within a geographic information system (GIS). The data model is a standardized representation of ground water systems within a spatial database that provides a public domain template for GIS users to store, document, and analyze commonly used spatial and temporal ground water data sets. This paper describes the data model framework, a simplified version of the complete ground water data model that includes two-dimensional and three-dimensional (3D) object classes for representing aquifers, wells, and borehole data, and the 3D geospatial context in which these data exist. The framework data model also includes tabular objects for representing temporal information such as water levels and water quality samples that are related with spatial features.

  4. High energy nucleus-nucleus collisions

    NASA Technical Reports Server (NTRS)

    Wosiek, B.

    1986-01-01

    Experimental results on high energy nucleus-nucleus interactions are presented. The data are discussed within the framework of standard superposition models and from the point of view of the possible formation of new states of matter in heavy ion collisions.

  5. Consistent design schematics for biological systems: standardization of representation in biological engineering

    PubMed Central

    Matsuoka, Yukiko; Ghosh, Samik; Kitano, Hiroaki

    2009-01-01

    The discovery by design paradigm driving research in synthetic biology entails the engineering of de novo biological constructs with well-characterized input–output behaviours and interfaces. The construction of biological circuits requires iterative phases of design, simulation and assembly, leading to the fabrication of a biological device. In order to represent engineered models in a consistent visual format and to further simulate them in silico, standardization of representation and model formalism is imperative. In this article, we review different efforts for standardization, particularly standards for graphical visualization and simulation/annotation schemata adopted in systems biology. We identify the importance of integrating the different standardization efforts and provide insights into potential avenues for developing a common framework for model visualization, simulation and sharing across various tools. We envision that such a synergistic approach would lead to the development of global, standardized schemata in biology, empowering deeper understanding of molecular mechanisms as well as engineering of novel biological systems. PMID:19493898

  6. 78 FR 51696 - Formaldehyde; Third-Party Certification Framework for the Formaldehyde Standards for Composite...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-21

    ... Formaldehyde; Third-Party Certification Framework for the Formaldehyde Standards for Composite Wood Products..., concerning a third-party certification framework for the formaldehyde standards for composite wood products... INFORMATION CONTACT. List of Subjects in 40 CFR Part 770 Environmental protection, Composite wood products...

  7. Functional Additive Mixed Models

    PubMed Central

    Scheipl, Fabian; Staicu, Ana-Maria; Greven, Sonja

    2014-01-01

    We propose an extensive framework for additive regression models for correlated functional responses, allowing for multiple partially nested or crossed functional random effects with flexible correlation structures for, e.g., spatial, temporal, or longitudinal functional data. Additionally, our framework includes linear and nonlinear effects of functional and scalar covariates that may vary smoothly over the index of the functional response. It accommodates densely or sparsely observed functional responses and predictors which may be observed with additional error and includes both spline-based and functional principal component-based terms. Estimation and inference in this framework is based on standard additive mixed models, allowing us to take advantage of established methods and robust, flexible algorithms. We provide easy-to-use open source software in the pffr() function for the R-package refund. Simulations show that the proposed method recovers relevant effects reliably, handles small sample sizes well and also scales to larger data sets. Applications with spatially and longitudinally observed functional data demonstrate the flexibility in modeling and interpretability of results of our approach. PMID:26347592

  8. From Pixels to Response Maps: Discriminative Image Filtering for Face Alignment in the Wild.

    PubMed

    Asthana, Akshay; Zafeiriou, Stefanos; Tzimiropoulos, Georgios; Cheng, Shiyang; Pantic, Maja

    2015-06-01

    We propose a face alignment framework that relies on the texture model generated by the responses of discriminatively trained part-based filters. Unlike standard texture models built from pixel intensities or responses generated by generic filters (e.g. Gabor), our framework has two important advantages. First, by virtue of discriminative training, invariance to external variations (like identity, pose, illumination and expression) is achieved. Second, we show that the responses generated by discriminatively trained filters (or patch-experts) are sparse and can be modeled using a very small number of parameters. As a result, the optimization methods based on the proposed texture model can better cope with unseen variations. We illustrate this point by formulating both part-based and holistic approaches for generic face alignment and show that our framework outperforms the state-of-the-art on multiple "wild" databases. The code and dataset annotations are available for research purposes from http://ibug.doc.ic.ac.uk/resources.

  9. Functional Additive Mixed Models.

    PubMed

    Scheipl, Fabian; Staicu, Ana-Maria; Greven, Sonja

    2015-04-01

    We propose an extensive framework for additive regression models for correlated functional responses, allowing for multiple partially nested or crossed functional random effects with flexible correlation structures for, e.g., spatial, temporal, or longitudinal functional data. Additionally, our framework includes linear and nonlinear effects of functional and scalar covariates that may vary smoothly over the index of the functional response. It accommodates densely or sparsely observed functional responses and predictors which may be observed with additional error and includes both spline-based and functional principal component-based terms. Estimation and inference in this framework is based on standard additive mixed models, allowing us to take advantage of established methods and robust, flexible algorithms. We provide easy-to-use open source software in the pffr() function for the R-package refund. Simulations show that the proposed method recovers relevant effects reliably, handles small sample sizes well and also scales to larger data sets. Applications with spatially and longitudinally observed functional data demonstrate the flexibility in modeling and interpretability of results of our approach.

  10. Next generation of weather generators on web service framework

    NASA Astrophysics Data System (ADS)

    Chinnachodteeranun, R.; Hung, N. D.; Honda, K.; Ines, A. V. M.

    2016-12-01

    A weather generator is a statistical model that synthesizes possible realizations of long-term weather from the historical record. It stochastically generates several tens to hundreds of realizations based on statistical analysis. These realizations are essential inputs to crop models for simulating crop growth and yield. Moreover, they can contribute to analyzing the uncertainty that weather imposes on crop development stages, and to decision support systems for, e.g., water management and fertilizer management. Performing crop modeling requires multidisciplinary skills, which has limited the use of a weather generator to the research group that developed it and raised a barrier for newcomers. To improve the procedures for running weather generators, and to standardize the way realizations are acquired, we implemented a framework that provides weather generators as web services supporting service interoperability. Legacy weather generator programs were wrapped in the web service framework. The service interfaces were implemented based on an international standard, the Sensor Observation Service (SOS) defined by the Open Geospatial Consortium (OGC). Clients can request realizations generated by the model through the SOS web service. The hierarchical data preparation processes required by the weather generator are also implemented as web services and seamlessly wired together. Analysts and applications can easily invoke the services over a network. The services facilitate the development of agricultural applications, reduce the analysts' workload of iterative data preparation, and handle the legacy weather generator program. This architectural design and implementation can serve as a prototype for constructing further services on top of an interoperable sensor network system. The framework opens an opportunity for other sectors, such as application developers and scientists in other fields, to utilize weather generators.
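
    Because the service interface follows the OGC SOS standard, realizations are retrieved with an ordinary GetObservation request. A sketch using the standard SOS 2.0 KVP binding (the endpoint and the offering/property identifiers are invented; real identifiers depend on the deployment):

        import requests

        params = {
            "service": "SOS",
            "version": "2.0.0",
            "request": "GetObservation",
            # Hypothetical identifiers for a weather-generator deployment:
            "offering": "WeatherGeneratorRealizations",
            "observedProperty": "daily_rainfall",
            "temporalFilter": "om:phenomenonTime,2020-01-01/2020-12-31",
        }
        resp = requests.get("https://example.org/sos", params=params, timeout=30)
        resp.raise_for_status()
        print(resp.text[:500])   # O&M/XML observations: synthesized realizations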

  11. A Framework for Simulation of Aircraft Flyover Noise Through a Non-Standard Atmosphere

    NASA Technical Reports Server (NTRS)

    Arntzen, Michael; Rizzi, Stephen A.; Visser, Hendrikus G.; Simons, Dick G.

    2012-01-01

    This paper describes a new framework for the simulation of aircraft flyover noise through a non-standard atmosphere. Central to the framework is a ray-tracing algorithm which defines multiple curved propagation paths, if the atmosphere allows, between the moving source and listener. Because each path has a different emission angle, synthesis of the sound at the source must be performed independently for each path. The time delay, spreading loss and absorption (ground and atmosphere) are integrated along each path, and applied to each synthesized aircraft noise source to simulate a flyover. A final step assigns each resulting signal to its corresponding receiver angle for the simulation of a flyover in a virtual reality environment. Spectrograms of the results from a straight path and a curved path modeling assumption are shown. When the aircraft is at close range, the straight path results are valid. Differences appear especially when the source is relatively far away at shallow elevation angles. These differences, however, are not significant in common sound metrics. While the framework used in this work performs off-line processing, it is conducive to real-time implementation.

  12. An interdisciplinary framework for participatory modeling design and evaluation—What makes models effective participatory decision tools?

    NASA Astrophysics Data System (ADS)

    Falconi, Stefanie M.; Palmer, Richard N.

    2017-02-01

    Increased requirements for public involvement in water resources management (WRM) over the past century have stimulated the development of more collaborative decision-making methods. Participatory modeling (PM) uses computer models to inform and engage stakeholders in the planning process in order to influence collaborative decisions in WRM. Past evaluations of participatory models focused on process and final outcomes, yet, were hindered by diversity of purpose and inconsistent documentation. This paper presents a two-stage framework for evaluating PM based on mechanisms for improving model effectiveness as participatory tools. The five dimensions characterize the "who, when, how, and why" of each participatory effort (stage 1). Models are evaluated as "boundary objects," a concept used to describe tools that bridge understanding and translate different bodies of knowledge to improve credibility, salience, and legitimacy (stage 2). This evaluation framework is applied to five existing case studies from the literature. Though the goals of participation can be diverse, the novel contribution of the two-stage proposed framework is the flexibility it has to evaluate a wide range of cases that differ in scope, modeling approach, and participatory context. Also, the evaluation criteria provide a structured vocabulary based on clear mechanisms that extend beyond previous process-based and outcome-based evaluations. Effective models are those that take advantage of mechanisms that facilitate dialogue and resolution and improve the accessibility and applicability of technical knowledge. Furthermore, the framework can help build more complete records and systematic documentation of evidence to help standardize the field of PM.

  13. Teacher Professional Development That Meets 21st Century Science Education Standards

    NASA Astrophysics Data System (ADS)

    van der Veen, Wil E.; Roelofsen Moody, T.

    2011-01-01

    The National Academies are working with several other groups to develop new National Science Education Standards, with the intention that they will be adopted by all states. It is critical that the science education community uses these new standards when planning teacher professional development and understands the potential implementation challenges. As a first step in developing these new standards, the National Research Council (NRC) recently published a draft Framework for Science Education. This framework describes the major scientific ideas and practices that all students should be familiar with by the end of high school. Following recommendations from the NRC Report "Taking Science to School" (NRC, 2007), it emphasizes the importance of integrating science practices with the learning of science content. These same recommendations influenced the recently revised New Jersey Science Education Standards. Thus, the revised New Jersey standards can be valuable as a case study for curriculum developers and professional development providers. While collaborating with the New Jersey Department of Education on the development of these revised science standards, we identified two critical needs for successful implementation. First, we found that many currently used science activities must be adapted to meet the revised standards and that new activities must be developed. Second, teacher professional development is needed to model the integration of science practices with the learning of science content. With support from the National Space Grant Foundation we developed a week-long Astronomy Institute, which was presented in the summers of 2009 and 2010. We will briefly describe our professional development model and how it helped teachers to bridge the gap between the standards and their current classroom practice. We will provide examples of astronomy activities that were either adapted or developed to meet the new standards. Finally, we will briefly discuss the evaluation results.

  14. A Model of Yeast Cell-Cycle Regulation Based on a Standard Component Modeling Strategy for Protein Regulatory Networks.

    PubMed

    Laomettachit, Teeraphan; Chen, Katherine C; Baumann, William T; Tyson, John J

    2016-01-01

    To understand the molecular mechanisms that regulate cell cycle progression in eukaryotes, a variety of mathematical modeling approaches have been employed, ranging from Boolean networks and differential equations to stochastic simulations. Each approach has its own characteristic strengths and weaknesses. In this paper, we propose a "standard component" modeling strategy that combines advantageous features of Boolean networks, differential equations and stochastic simulations in a framework that acknowledges the typical sorts of reactions found in protein regulatory networks. Applying this strategy to a comprehensive mechanism of the budding yeast cell cycle, we illustrate the potential value of standard component modeling. The deterministic version of our model reproduces the phenotypic properties of wild-type cells and of 125 mutant strains. The stochastic version of our model reproduces the cell-to-cell variability of wild-type cells and the partial viability of the CLB2-dbΔ clb5Δ mutant strain. Our simulations show that mathematical modeling with "standard components" can capture in quantitative detail many essential properties of cell cycle control in budding yeast.

  15. A Model of Yeast Cell-Cycle Regulation Based on a Standard Component Modeling Strategy for Protein Regulatory Networks

    PubMed Central

    Laomettachit, Teeraphan; Chen, Katherine C.; Baumann, William T.

    2016-01-01

    To understand the molecular mechanisms that regulate cell cycle progression in eukaryotes, a variety of mathematical modeling approaches have been employed, ranging from Boolean networks and differential equations to stochastic simulations. Each approach has its own characteristic strengths and weaknesses. In this paper, we propose a “standard component” modeling strategy that combines advantageous features of Boolean networks, differential equations and stochastic simulations in a framework that acknowledges the typical sorts of reactions found in protein regulatory networks. Applying this strategy to a comprehensive mechanism of the budding yeast cell cycle, we illustrate the potential value of standard component modeling. The deterministic version of our model reproduces the phenotypic properties of wild-type cells and of 125 mutant strains. The stochastic version of our model reproduces the cell-to-cell variability of wild-type cells and the partial viability of the CLB2-dbΔ clb5Δ mutant strain. Our simulations show that mathematical modeling with “standard components” can capture in quantitative detail many essential properties of cell cycle control in budding yeast. PMID:27187804

  16. Technical note: The Linked Paleo Data framework - a common tongue for paleoclimatology

    NASA Astrophysics Data System (ADS)

    McKay, Nicholas P.; Emile-Geay, Julien

    2016-04-01

    Paleoclimatology is a highly collaborative scientific endeavor, increasingly reliant on online databases for data sharing. Yet there is currently no universal way to describe, store and share paleoclimate data: in other words, no standard. Data standards are often regarded by scientists as mere technicalities, though they underlie much scientific and technological innovation, as well as facilitating collaborations between research groups. In this article, we propose a preliminary data standard for paleoclimate data, general enough to accommodate all the archive and measurement types encountered in a large international collaboration (PAGES 2k). We also introduce a vehicle for such structured data (Linked Paleo Data, or LiPD), leveraging recent advances in knowledge representation (Linked Open Data). The LiPD framework enables quick querying and extraction, and we expect that it will facilitate the writing of open-source community codes to access, analyze, model and visualize paleoclimate observations. We welcome community feedback on this standard, and encourage paleoclimatologists to experiment with the format for their own purposes.
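
    A schematic of what such a structured record looks like (simplified and abridged from the general LiPD layout of dataset metadata plus measurement tables; field names here should be treated as indicative rather than normative):

        lipd_like_record = {
            "dataSetName": "ExampleLake.Smith.2015",     # illustrative name
            "archiveType": "lake sediment",
            "geo": {"type": "Point", "coordinates": [-105.0, 40.0, 3100]},
            "paleoData": [{
                "measurementTable": [{
                    "columns": [
                        {"variableName": "depth", "units": "cm",
                         "values": [0.5, 1.0, 1.5]},
                        {"variableName": "d18O", "units": "permil",
                         "values": [-2.1, -2.4, -1.9]},
                    ]
                }]
            }],
        }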

  17. A Standardization Framework for Electronic Government Service Portals

    NASA Astrophysics Data System (ADS)

    Sarantis, Demetrios; Tsiakaliaris, Christos; Lampathaki, Fenareti; Charalabidis, Yannis

    Although most eGovernment interoperability frameworks (eGIFs) adequately cover the technical aspects of developing and supporting the provision of electronic services to citizens and businesses, they do not explicitly address several important areas regarding the organization, presentation, accessibility and security of the content and the electronic services offered through government portals. This chapter extends the scope of existing eGIFs by presenting the overall architecture and the basic concepts of the Greek standardization framework for electronic government service portals, which, for the first time in Europe, is part of a country's eGovernment framework. The proposed standardization framework includes standards, guidelines and recommendations regarding the design, development and operation of government portals that support the provision of administrative information and services to citizens and businesses. By applying the guidelines of the framework, the design, development and operation of portals in central, regional and municipal government can be systematically addressed, resulting in an applicable, sustainable and ever-expanding framework.

  18. Towards A Topological Framework for Integrating Semantic Information Sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joslyn, Cliff A.; Hogan, Emilie A.; Robinson, Michael

    2014-09-07

    In this position paper we argue for the role that topological modeling principles can play in providing a framework for sensor integration. While used successfully in standard (quantitative) sensors, we are developing this methodology in new directions to make it appropriate specifically for semantic information sources, including keyterms, ontology terms, and other general Boolean, categorical, ordinal, and partially-ordered data types. We illustrate the basics of the methodology in an extended use case/example, and discuss path forward.

  19. Restful API Architecture Based on Laravel Framework

    NASA Astrophysics Data System (ADS)

    Chen, Xianjun; Ji, Zhoupeng; Fan, Yu; Zhan, Yongsong

    2017-10-01

    Web services have become an industry-standard technology for message communication and integration between heterogeneous systems. The RESTful API has become the mainstream web service development paradigm after SOAP, yet how to construct RESTful APIs effectively remains a research hotspot. This paper presents a development model for RESTful API construction based on the PHP language and the LARAVEL framework. The key technical problems that need to be solved during the construction of a RESTful API are discussed, and implementation details based on LARAVEL are given.
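
    The resource-oriented routing pattern the paper discusses is framework-agnostic. A minimal sketch of the same idea, shown here in Python with Flask rather than PHP/Laravel purely to keep the examples in this collection in one language:

        from flask import Flask, jsonify, request

        app = Flask(__name__)
        books = {1: {"id": 1, "title": "Example"}}   # toy in-memory resource store

        @app.get("/api/books/<int:book_id>")         # GET maps to "read"
        def show(book_id):
            return jsonify(books[book_id])

        @app.post("/api/books")                      # POST maps to "create"
        def create():
            new_id = max(books) + 1
            books[new_id] = {"id": new_id, **request.get_json()}
            return jsonify(books[new_id]), 201

        if __name__ == "__main__":
            app.run()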

  20. Open source data assimilation framework for hydrological modeling

    NASA Astrophysics Data System (ADS)

    Ridler, Marc; Hummel, Stef; van Velzen, Nils; Katrine Falk, Anne; Madsen, Henrik

    2013-04-01

    An open-source data assimilation framework is proposed for hydrological modeling. Data assimilation (DA) in hydrodynamic and hydrological forecasting systems has great potential to improve predictions and model results. The basic principle is to incorporate measurement information into a model with the aim of improving model results through error minimization. Great strides have been made to assimilate traditional in-situ measurements such as discharge, soil moisture, hydraulic head and snowpack into hydrologic models. More recently, remotely sensed retrievals of soil moisture, snow water equivalent or snow cover area, surface water elevation, terrestrial water storage and land surface temperature have been successfully assimilated into hydrological models. The assimilation algorithms have become increasingly sophisticated to manage measurement and model bias, non-linear systems, data sparsity (in time and space) and undetermined system uncertainty. It is therefore useful to use a pre-existing DA toolbox such as OpenDA. OpenDA is an open interface standard for (and free implementation of) a set of tools to quickly implement DA and calibration for arbitrary numerical models. The basic design philosophy of OpenDA is to break DA down into a set of building blocks programmed in object-oriented languages. To implement DA, a model must interact with OpenDA to create model instances, propagate the model, get and set variables (or parameters) and free the model once DA is completed. An open-source interface for hydrological models capable of all these tasks already exists: OpenMI. OpenMI is an open-source standard interface already adopted by key hydrological model providers. It defines a universal approach for interacting with hydrological models during simulation to exchange data at runtime, thus facilitating the interactions between models and data sources. The interface is flexible enough that models can interact even if they are coded in different languages, represent processes from different domains or have different spatial and temporal resolutions. An open-source framework that bridges OpenMI and OpenDA is presented. The framework provides a generic and easy means for any OpenMI-compliant model to assimilate observation measurements. An example test case is presented using MIKE SHE, an OpenMI-compliant, fully coupled, integrated hydrological model that can accurately simulate the feedback dynamics of overland flow, the unsaturated zone and the saturated zone.
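
    The control pattern described above (create an instance, propagate, get/set variables, free) can be sketched generically. A toy illustration of how an assimilation loop drives a model through such an interface, with schematic method names rather than the literal OpenMI or OpenDA signatures:

        import numpy as np

        class ToyModel:
            """Stand-in for an OpenMI-style model exposing control functions."""
            def initialize(self):
                self.state = np.array([10.0])
            def update(self):
                self.state *= 0.95                  # trivial decay dynamics
            def get_value(self, name):
                return self.state.copy()
            def set_value(self, name, value):
                self.state = np.asarray(value, dtype=float)
            def finalize(self):
                self.state = None

        model = ToyModel()
        model.initialize()
        observations = {5: 7.2, 10: 4.9}            # time step -> observed value
        for step in range(1, 16):
            model.update()                           # propagate the model
            if step in observations:                 # assimilate: simple nudging
                x = model.get_value("discharge")
                gain = 0.5
                model.set_value("discharge", x + gain * (observations[step] - x))
        print(model.get_value("discharge"))
        model.finalize()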

  1. Designing an evaluation framework for WFME basic standards for medical education.

    PubMed

    Tackett, Sean; Grant, Janet; Mmari, Kristin

    2016-01-01

    To create an evaluation plan for the World Federation for Medical Education (WFME) accreditation standards for basic medical education, we conceptualized the 100 basic standards from "Basic Medical Education: WFME Global Standards for Quality Improvement: The 2012 Revision" as medical education program objectives. Standards were simplified into evaluable items, which were then categorized as inputs, processes, outputs and/or outcomes to generate a logic model and a corresponding plan for data collection. WFME standards posed significant challenges to evaluation due to complex wording, inconsistent formatting and a lack of existing assessment tools. Our resulting logic model contained 244 items. Standard B 5.1.1 was separated into 24 items, the most for any single standard. A large proportion of items (40%) required evaluation of more than one input, process, output and/or outcome. Only one standard (B 3.2.2) was interpreted as requiring evaluation of a program outcome. Current WFME standards are difficult to use for evaluation planning. Our analysis may guide adaptation and revision of the standards to make them more evaluable. Our logic model and data collection plan may be useful to medical schools planning an institutional self-review and to accrediting authorities wanting to provide guidance to schools under their purview.

  2. A framework to assess welfare mix and service provision models in health care and social welfare: case studies of two prominent Italian regions.

    PubMed

    Longo, Francesco; Notarnicola, Elisabetta; Tasselli, Stefano

    2015-04-09

    The mechanisms through which the relationships among public institutions, private providers and families affect care and service provision systems are puzzling. How can we understand the mechanisms in these contexts? Which elements should we explore to capture the complexity of care provision? The aim of our study is to provide a framework that can help interpret and reframe these puzzling care provision mechanisms in a welfare mix context. First, we develop a theoretical framework for understanding how service provision occurs in care systems that are characterised by a variety of relationships between multiple actors, using an evidence-based approach that looks at both public and private expenditures and the number of users relative to the level of needs coverage, compared with declared values and political rhetoric. Second, we test this framework in two case studies built on data from two prominent Italian regions, Lombardy and Emilia-Romagna. We argue that service provision models depend on the interplay among six conceptual elements: policy values, governance rules, resources, the nature of the providers, service standards and eligibility criteria. Our empirical study shows that beneath the significant differences in values and political rhetoric between the two Italian regions, there is a surprising isomorphism in service standards and in the levels of coverage of the population's needs. The suggested framework appears to be effective and feasible; it fosters interdisciplinary approaches and supports policy-making discussions. This study may contribute to deepening knowledge about public care service provision and institutional arrangements, which can be used to promote more effective reforms and may advance future research. Although the framework was tested on the Italian welfare system, it can be used to assess many different systems.

  3. Searching for new physics at the frontiers with lattice quantum chromodynamics.

    PubMed

    Van de Water, Ruth S

    2012-07-01

    Numerical lattice quantum chromodynamics (QCD) simulations, when combined with experimental measurements, allow the determination of fundamental parameters of the particle-physics Standard Model and enable searches for physics beyond the Standard Model. We present the current status of lattice-QCD weak matrix element calculations needed to obtain the elements and phase of the Cabibbo-Kobayashi-Maskawa (CKM) matrix and to test the Standard Model in the quark-flavor sector. We then discuss evidence that may hint at the presence of new physics beyond the Standard Model CKM framework. Finally, we discuss two opportunities where we expect lattice QCD to play a pivotal role in searching for, and possibly discovering, new physics at upcoming high-intensity experiments: rare decays and the muon anomalous magnetic moment. The next several years may witness the discovery of new elementary particles at the Large Hadron Collider (LHC). The interplay between lattice QCD, high-energy experiments at the LHC, and high-intensity experiments will be needed to determine the underlying structure of whatever physics beyond the Standard Model is realized in nature. © 2012 New York Academy of Sciences.

  4. A blended supervision model in Australian general practice training.

    PubMed

    Ingham, Gerard; Fry, Jennifer

    2016-05-01

    The Royal Australian College of General Practitioners' Standards for general practice training allow different models of registrar supervision, provided these models achieve the outcomes of facilitating registrars' learning and ensuring patient safety. In this article, we describe a model of supervision called 'blended supervision', and its initial implementation and evaluation. The blended supervision model integrates offsite supervision with available local supervision resources. It is a pragmatic alternative to traditional supervision. Further evaluation of the cost-effectiveness, safety and effectiveness of this model is required, as is the recruitment and training of remote supervisors. A framework of questions was developed to outline the training practice's supervision methods and explain how blended supervision is achieving supervision and teaching outcomes. The supervision and teaching framework can be used to understand the supervision methods of all practices, not just practices using blended supervision.

  5. The nature of thinking, shallow and deep

    PubMed Central

    Brase, Gary L.

    2014-01-01

    Because the criteria for success differ across various domains of life, no single normative standard will ever work for all types of thinking. One method for dealing with this apparent dilemma is to propose that the mind is made up of a large number of specialized modules. This review describes how this multi-modular framework for the mind overcomes several critical conceptual and theoretical challenges to our understanding of human thinking, and hopefully clarifies what are (and are not) some of the implications based on this framework. In particular, an evolutionarily informed “deep rationality” conception of human thinking can guide psychological research out of clusters of ad hoc models which currently occupy some fields. First, the idea of deep rationality helps theoretical frameworks in terms of orienting themselves with regard to time scale references, which can alter the nature of rationality assessments. Second, the functional domains of deep rationality can be hypothesized (non-exhaustively) to include the areas of self-protection, status, affiliation, mate acquisition, mate retention, kin care, and disease avoidance. Thus, although there is no single normative standard of rationality across all of human cognition, there are sensible and objective standards by which we can evaluate multiple, fundamental, domain-specific motives underlying human cognition and behavior. This review concludes with two examples to illustrate the implications of this framework. The first example, decisions about having a child, illustrates how competing models can be understood by realizing that different fundamental motives guiding people’s thinking can sometimes be in conflict. The second example is that of personifications within modern financial markets (e.g., in the form of corporations), which are entities specifically constructed to have just one fundamental motive. This single focus is the source of both the strengths and flaws in how such entities behave. PMID:24860542

  6. The nature of thinking, shallow and deep.

    PubMed

    Brase, Gary L

    2014-01-01

    Because the criteria for success differ across various domains of life, no single normative standard will ever work for all types of thinking. One method for dealing with this apparent dilemma is to propose that the mind is made up of a large number of specialized modules. This review describes how this multi-modular framework for the mind overcomes several critical conceptual and theoretical challenges to our understanding of human thinking, and hopefully clarifies what are (and are not) some of the implications based on this framework. In particular, an evolutionarily informed "deep rationality" conception of human thinking can guide psychological research out of clusters of ad hoc models which currently occupy some fields. First, the idea of deep rationality helps theoretical frameworks in terms of orienting themselves with regard to time scale references, which can alter the nature of rationality assessments. Second, the functional domains of deep rationality can be hypothesized (non-exhaustively) to include the areas of self-protection, status, affiliation, mate acquisition, mate retention, kin care, and disease avoidance. Thus, although there is no single normative standard of rationality across all of human cognition, there are sensible and objective standards by which we can evaluate multiple, fundamental, domain-specific motives underlying human cognition and behavior. This review concludes with two examples to illustrate the implications of this framework. The first example, decisions about having a child, illustrates how competing models can be understood by realizing that different fundamental motives guiding people's thinking can sometimes be in conflict. The second example is that of personifications within modern financial markets (e.g., in the form of corporations), which are entities specifically constructed to have just one fundamental motive. This single focus is the source of both the strengths and flaws in how such entities behave.

  7. Grade Expectations for Vermont's Framework of Standards and Learning Opportunities, Spring 2004 (Mathematics, Reading and Writing)

    ERIC Educational Resources Information Center

    Vermont Department of Education, 2004

    2004-01-01

    This document, "Grade Expectations for Vermont's Framework of Standards and Learning Opportunities" (hereafter "Vermont's Grade Expectations"), is an important companion to "Vermont's Framework." These Grade Expectations (GEs) serve the same purposes as "Vermont's Framework," but articulate learning…

  8. Implementation of the US EPA (United States Environmental Protection Agency) Regional Oxidant Modeling System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Novak, J.H.

    1984-05-01

    Model design, implementation and quality assurance procedures can have a significant impact on the effectiveness and long-term utility of any modeling approach. The Regional Oxidant Modeling System (ROMS) is exceptionally complex because it treats all chemical and physical processes thought to affect ozone concentration on a regional scale. Thus, to effectively illustrate useful design and implementation techniques, this paper describes the general modeling framework which forms the basis of the ROMS. This framework is flexible enough to allow straightforward update or replacement of the chemical kinetics mechanism and/or any theoretical formulations of the physical processes. Use of the Jackson Structured Programming (JSP) method to implement this modeling framework has not only increased programmer productivity and the quality of the resulting programs, but has also provided standardized program design, dynamic documentation, and easily maintainable and transportable code. A summary of the JSP method is presented to encourage modelers to pursue this technique in their own model development efforts. In addition, since data preparation is such an integral part of a successful modeling system, the ROMS processor network is described with emphasis on the internal quality control techniques.

  9. An overview of the model integration process: From pre ...

    EPA Pesticide Factsheets

    Integration of models requires linking models which can be developed using different tools, methodologies, and assumptions. We performed a literature review with the aim of improving our understanding of the model integration process and presenting better strategies for building integrated modeling systems. We identified five phases that characterize the integration process: pre-integration assessment, preparation of models for integration, orchestration of models during simulation, data interoperability, and testing. Commonly, there is little reuse of existing frameworks beyond the development teams and not much sharing of science components across frameworks. We believe this must change to enable researchers and assessors to form complex workflows that leverage the current environmental science available. In this paper, we characterize the model integration process and compare the integration practices of different groups. We highlight key strategies, features, standards, and practices that can be employed by developers to increase reuse and interoperability of science software components and systems. The paper provides a review of the literature regarding techniques and methods employed by various modeling system developers to facilitate science software interoperability. The intent of the paper is to illustrate the wide variation in methods and the limiting effect the variation has on inter-framework reuse and interoperability. A series of recommendation

  10. DEVELOPMENT OF A CHEMICAL PROCESS MODELING ENVIRONMENT BASED ON CAPE-OPEN INTERFACE STANDARDS AND THE MICROSOFT .NET FRAMEWORK

    EPA Science Inventory

    Chemical process simulation has long been used as a design tool in the development of chemical plants, and has long been considered a means to evaluate different design options. With the advent of large scale computer networks and interface models for program components, it is po...

  11. The Effectiveness of Three Experiential Teaching Approaches on Student Science Learning in Fifth-Grade Public School Classrooms.

    ERIC Educational Resources Information Center

    Powell, Kristin; Wells, Marcella

    2002-01-01

    Compares the effects of three experiential science lessons in meeting the objectives of the Colorado model content science standards. Uses Kolb's (1984) experiential learning model as a framework for understanding the process by which students engage in learning when participating in experiential learning activities. Uses classroom exams and…

  12. A classical regression framework for mediation analysis: fitting one model to estimate mediation effects.

    PubMed

    Saunders, Christina T; Blume, Jeffrey D

    2017-10-26

    Mediation analysis explores the degree to which an exposure's effect on an outcome is diverted through a mediating variable. We describe a classical regression framework for conducting mediation analyses in which estimates of causal mediation effects and their variance are obtained from the fit of a single regression model. The vector of changes in exposure pathway coefficients, which we named the essential mediation components (EMCs), is used to estimate standard causal mediation effects. Because these effects are often simple functions of the EMCs, an analytical expression for their model-based variance follows directly. Given this formula, it is instructive to revisit the performance of routinely used variance approximations (e.g., delta method and resampling methods). Requiring the fit of only one model reduces the computation time required for complex mediation analyses and permits the use of a rich suite of regression tools that are not easily implemented on a system of three equations, as would be required in the Baron-Kenny framework. Using data from the BRAIN-ICU study, we provide examples to illustrate the advantages of this framework and compare it with the existing approaches. © The Author 2017. Published by Oxford University Press.
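
    For orientation, the classical two-equation decomposition that the Baron-Kenny tradition builds on can be written as follows (standard textbook form, shown here only to fix notation; the paper's contribution is obtaining the same effects from a single fitted model):

        M = \alpha_0 + \alpha X + \varepsilon_M, \qquad
        Y = \beta_0  + \beta  X + \gamma M + \varepsilon_Y

    Under the usual no-unmeasured-confounding assumptions, the indirect (mediated) effect is the product αγ, the direct effect is β, and the total effect is β + αγ. Because the mediation effects are simple functions of regression coefficients, fitting one model that yields the essential mediation components lets their variance be written analytically, which is what makes the comparison with delta-method and resampling approximations instructive.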

  13. Reusable Component Model Development Approach for Parallel and Distributed Simulation

    PubMed Central

    Zhu, Feng; Yao, Yiping; Chen, Huilong; Yao, Feng

    2014-01-01

    Model reuse is a key issue to be resolved in parallel and distributed simulation at present. However, component models built by different domain experts usually have diverse interfaces, are tightly coupled, and are closely bound to particular simulation platforms. As a result, they are difficult to reuse across different simulation platforms and applications. To address this problem, this paper first proposes a reusable component model framework. Based on this framework, our reusable model development approach is then elaborated, which contains two phases: (1) domain experts create simulation computational modules, observing three principles that keep the modules independent; (2) model developers encapsulate these modules with six standard service interfaces to improve their reusability. A case study of a radar model indicates that a model developed using our approach has good reusability and is easy to use in different simulation platforms and applications. PMID:24729751
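
    A minimal sketch of the encapsulation idea in phase (2): a domain module is wrapped behind a fixed set of service interfaces so a platform can drive it without knowing its internals. The six method names below are hypothetical placeholders; the paper defines its own set of six standard service interfaces.

```python
# Sketch of wrapping an independent computational module (phase 1)
# behind standardized service interfaces (phase 2). The six interface
# names here are hypothetical, not the paper's actual set.
class RadarModuleWrapper:
    def __init__(self, module):
        self._m = module  # duck-typed, platform-independent module

    # Six illustrative standard service interfaces:
    def create(self, params): self._state = self._m.init(params)
    def set_input(self, name, value): setattr(self._state, name, value)
    def step(self, dt): self._m.advance(self._state, dt)
    def get_output(self, name): return getattr(self._state, name)
    def save(self): return self._m.snapshot(self._state)
    def destroy(self): self._m.release(self._state)
```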

  14. Security Certification Challenges in a Cloud Computing Delivery Model

    DTIC Science & Technology

    2010-04-27

    Relevant security standards, certifications, and guidance include the NIST SP 800 series, the ISO/IEC 27001 framework, Cloud Security Alliance (CSA) guidance, and Statement of... The briefing maps CSA domains and cloud features to ISO 27001, distinguishes cloud service provider responsibilities from government agency responsibilities, and analyzes security gaps and compensating controls.

  15. Design and Applications of a GeoSemantic Framework for Integration of Data and Model Resources in Hydrologic Systems

    NASA Astrophysics Data System (ADS)

    Elag, M.; Kumar, P.

    2016-12-01

    Hydrologists today have to integrate resources such as data and models that originate and reside in multiple autonomous, heterogeneous repositories on the Web. Several resource management systems have emerged within geoscience communities for sharing long-tail data, which are collected by individuals or small research groups, and long-tail models, which are developed by scientists or small modeling communities. While these systems have increased the availability of resources within geoscience domains, deficiencies remain because of the heterogeneity of the methods used to describe, encode, and publish information about resources on the Web. This heterogeneity limits our ability to access the right information in the right context so that it can be efficiently retrieved and understood without the hydrologist's mediation. A primary challenge of the Web today is the lack of semantic interoperability among the massive number of resources that already exist and that continue to be generated at rapid rates. To address this challenge, we have developed a decentralized GeoSemantic (GS) framework, which provides three sets of micro web services to support (i) semantic annotation of resources, (ii) semantic alignment between the metadata of two resources, and (iii) semantic mediation among Standard Names. Here we present the design of the framework and demonstrate its application to the semantic integration of data and models used in the IML-CZO. First we show how the IML-CZO data are annotated using the Semantic Annotation Services. Then we illustrate how the Resource Alignment Services and Knowledge Integration Services are used to create a semantic workflow between the TopoFlow model, a spatially distributed hydrologic model, and the annotated data. Results of this work are (i) a demonstration of how the GS framework advances the integration of heterogeneous data and models of water-related disciplines by seamlessly handling their semantic heterogeneity, (ii) the introduction of a new paradigm for reusing existing and new standards, tools, and models without re-implementing them in the cyberinfrastructures of water-related disciplines, and (iii) an investigation of a methodology by which distributed models can be coupled in a workflow using the GS services.

  16. Finite element analysis of an implant-assisted removable partial denture.

    PubMed

    Shahmiri, Reza; Aarts, John M; Bennani, Vincent; Atieh, Momen A; Swain, Michael V

    2013-10-01

    This study analyzes the effects of loading a Kennedy class I implant-assisted removable partial denture (IARPD) using finite element analysis (FEA). Standard RPDs are not originally designed to accommodate a posterior implant load point. The null hypothesis is that the introduction of posteriorly placed implants into an RPD has no effect on the load distribution. A Faro Arm scan was used to extract the geometric data of a human partially edentulous mandible. A Standard Plus Regular Neck (4.8 × 12 mm) Straumann® implant and titanium matrix, tooth roots, and periodontal ligaments were modeled using a combination of reverse engineering in Rapidform XOR2 and solid modeling in the SolidWorks 2008 program. The model incorporated an RPD and was loaded with a bilateral force of 120 N. ANSYS Workbench 11.0 was used to analyze deformation in the IARPD and elastic strain in the metal framework. FEA identified that the metal framework developed high strain patterns on the major and minor connectors, and that the acrylic was subject to deformation that could lead to acrylic fractures. The ideal position of the neutral axis was calculated to be 0.75 mm above the ridge. A potentially destructive mismatch of strain distribution was identified between the acrylic and the metal framework, which could be a factor in the failure of the acrylic. The metal framework showed high strain patterns on the major and minor connectors around the teeth, while the implant components transferred the load directly to the acrylic. © 2013 by the American College of Prosthodontists.

  17. Thermal Dark Matter Below a MeV

    DOE PAGES

    Berlin, Asher; Blinov, Nikita

    2018-01-08

    We consider a class of models in which thermal dark matter is lighter than a MeV. If dark matter thermalizes with the standard model below the temperature of neutrino-photon decoupling, equilibration and freeze-out cool and heat the standard model bath comparably, alleviating constraints from measurements of the effective number of neutrino species. We demonstrate this mechanism in a model consisting of fermionic dark matter coupled to a light scalar mediator. Thermal dark matter can be as light as a few keV, while remaining compatible with existing cosmological and astrophysical observations. This framework motivates new experiments in the direct search for sub-MeV thermal dark matter and light force carriers.

  18. Thermal Dark Matter Below a MeV

    NASA Astrophysics Data System (ADS)

    Berlin, Asher; Blinov, Nikita

    2018-01-01

    We consider a class of models in which thermal dark matter is lighter than a MeV. If dark matter thermalizes with the standard model below the temperature of neutrino-photon decoupling, equilibration and freeze-out cool and heat the standard model bath comparably, alleviating constraints from measurements of the effective number of neutrino species. We demonstrate this mechanism in a model consisting of fermionic dark matter coupled to a light scalar mediator. Thermal dark matter can be as light as a few keV, while remaining compatible with existing cosmological and astrophysical observations. This framework motivates new experiments in the direct search for sub-MeV thermal dark matter and light force carriers.

  19. Thermal Dark Matter Below a MeV.

    PubMed

    Berlin, Asher; Blinov, Nikita

    2018-01-12

    We consider a class of models in which thermal dark matter is lighter than a MeV. If dark matter thermalizes with the standard model below the temperature of neutrino-photon decoupling, equilibration and freeze-out cool and heat the standard model bath comparably, alleviating constraints from measurements of the effective number of neutrino species. We demonstrate this mechanism in a model consisting of fermionic dark matter coupled to a light scalar mediator. Thermal dark matter can be as light as a few keV, while remaining compatible with existing cosmological and astrophysical observations. This framework motivates new experiments in the direct search for sub-MeV thermal dark matter and light force carriers.

  20. Thermal Dark Matter Below a MeV

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berlin, Asher; Blinov, Nikita

    We consider a class of models in which thermal dark matter is lighter than a MeV. If dark matter thermalizes with the standard model below the temperature of neutrino-photon decoupling, equilibration and freeze-out cool and heat the standard model bath comparably, alleviating constraints from measurements of the effective number of neutrino species. We demonstrate this mechanism in a model consisting of fermionic dark matter coupled to a light scalar mediator. Thermal dark matter can be as light as a few keV, while remaining compatible with existing cosmological and astrophysical observations. This framework motivates new experiments in the direct search for sub-MeV thermal dark matter and light force carriers.

  1. SHINE: Strategic Health Informatics Networks for Europe.

    PubMed

    Kruit, D; Cooper, P A

    1994-10-01

    The mission of SHINE is to construct an open systems framework for the development of regional community healthcare telematic services that support and add to the strategic business objectives of European healthcare providers and purchasers. This framework will contain a Methodology, which identifies healthcare business processes and develops a supporting IT strategy, and the Open Health Environment. The latter consists of an architecture and information standards that are 'open' and will be available to any organisation wishing to construct SHINE-conformant regional healthcare telematic services. Results include: generic models, e.g., regional healthcare business networks and IT strategies; demonstrables, e.g., pilot demonstrators and application and service prototypes; reports, e.g., the SHINE Methodology and pilot specifications & evaluations; and proposals, e.g., service/interface specifications and standards conformance.

  2. Optimizing an estuarine water quality monitoring program through an entropy-based hierarchical spatiotemporal Bayesian framework

    NASA Astrophysics Data System (ADS)

    Alameddine, Ibrahim; Karmakar, Subhankar; Qian, Song S.; Paerl, Hans W.; Reckhow, Kenneth H.

    2013-10-01

    The total maximum daily load program aims to monitor more than 40,000 standard violations in around 20,000 impaired water bodies across the United States. Given resource limitations, future monitoring efforts have to be hedged against the uncertainties in the monitored system, while taking into account existing knowledge. In that respect, we have developed a hierarchical spatiotemporal Bayesian model that can be used to optimize an existing monitoring network by retaining stations that provide the maximum amount of information, while identifying locations that would benefit from the addition of new stations. The model assumes the water quality parameters are adequately described by a joint matrix normal distribution. The adopted approach allows for a reduction in redundancies, while emphasizing information richness rather than data richness. The developed approach incorporates the concept of entropy to account for the associated uncertainties. Three different entropy-based criteria are adopted: total system entropy, chlorophyll-a standard violation entropy, and dissolved oxygen standard violation entropy. A multiple attribute decision making framework is adopted to integrate the competing design criteria and to generate a single optimal design. The approach is implemented on the water quality monitoring system of the Neuse River Estuary in North Carolina, USA. The model results indicate that the high priority monitoring areas identified by the total system entropy and the dissolved oxygen violation entropy criteria are largely coincident. The monitoring design based on the chlorophyll-a standard violation entropy proved to be less informative, given the low probabilities of violating the water quality standard in the estuary.
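
    A toy sketch of the design idea, under the record's assumption that the monitored variables are adequately described by a joint normal distribution: compute the total system entropy of the station set and greedily retain the stations whose removal would lose the most information. The covariance matrix below is simulated, and the greedy rule is a simplification of the paper's multi-criteria design.

```python
# Entropy-based network reduction sketch under a multivariate-normal
# assumption: drop, one at a time, the station whose removal costs the
# least total system entropy. Covariance here is a simulated stand-in.
import numpy as np

def gaussian_entropy(cov: np.ndarray) -> float:
    """Differential entropy of a k-variate normal with covariance cov."""
    k = cov.shape[0]
    _, logdet = np.linalg.slogdet(cov)
    return 0.5 * (k * np.log(2 * np.pi * np.e) + logdet)

rng = np.random.default_rng(1)
A = rng.normal(size=(8, 8))
cov = A @ A.T + 8 * np.eye(8)          # stand-in station covariance
stations = list(range(8))

while len(stations) > 5:               # keep the 5 most informative
    full = gaussian_entropy(cov[np.ix_(stations, stations)])
    losses = []
    for s in stations:
        keep = [t for t in stations if t != s]
        sub = cov[np.ix_(keep, keep)]
        losses.append((full - gaussian_entropy(sub), s))
    _, drop = min(losses)              # least information lost
    stations.remove(drop)
print("retained stations:", stations)
```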

  3. General squark flavour mixing: constraints, phenomenology and benchmarks

    DOE PAGES

    De Causmaecker, Karen; Fuks, Benjamin; Herrmann, Bjorn; ...

    2015-11-19

    Here, we present an extensive study of non-minimal flavour violation in the squark sector in the framework of the Minimal Supersymmetric Standard Model. We investigate the effects of multiple non-vanishing flavour-violating elements in the squark mass matrices by means of a Markov Chain Monte Carlo scanning technique and identify parameter combinations that are favoured by both current data and theoretical constraints. We then detail the resulting distributions of the flavour-conserving and flavour-violating model parameters. Based on this analysis, we propose a set of benchmark scenarios relevant for future studies of non-minimal flavour violation in the Minimal Supersymmetric Standard Model.
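
    A generic Metropolis-Hastings sketch of the kind of Markov Chain Monte Carlo scan described above; the Gaussian log-likelihood is a toy stand-in for the actual experimental and theoretical flavour constraints, and the three parameters stand in for flavour-violating squark mass-matrix entries.

```python
# Toy Metropolis-Hastings parameter scan. The "likelihood" below is an
# illustrative Gaussian constraint surface, not the paper's combined
# flavour observables.
import numpy as np

rng = np.random.default_rng(2)

def log_like(theta):
    return -0.5 * np.sum((theta / 0.1) ** 2)   # toy constraint surface

theta = np.zeros(3)                    # e.g. three mixing parameters
chain, ll = [], log_like(theta)
for _ in range(10_000):
    prop = theta + rng.normal(scale=0.05, size=theta.size)
    ll_prop = log_like(prop)
    if np.log(rng.uniform()) < ll_prop - ll:   # accept/reject step
        theta, ll = prop, ll_prop
    chain.append(theta.copy())
print("posterior std per parameter:", np.std(chain, axis=0))
```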

  4. Minimal but non-minimal inflation and electroweak symmetry breaking

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marzola, Luca; Institute of Physics, University of Tartu, Ravila 14c, 50411 Tartu; Racioppi, Antonio

    2016-10-07

    We consider the most minimal scale-invariant extension of the standard model that allows for successful radiative electroweak symmetry breaking and inflation. The framework involves an extra scalar singlet that plays the role of the inflaton, and is compatible with current experimental bounds owing to the non-minimal coupling of the latter to gravity. This inflationary scenario predicts a very low tensor-to-scalar ratio r ≈ 10⁻³, typical of Higgs-inflation models, but in contrast yields a scalar spectral index n_s ≃ 0.97, which departs from the Starobinsky limit. We briefly discuss the collider phenomenology of the framework.

  5. A framework for semantic interoperability in healthcare: a service oriented architecture based on health informatics standards.

    PubMed

    Ryan, Amanda; Eklund, Peter

    2008-01-01

    Healthcare information is composed of many types of varying and heterogeneous data. Semantic interoperability in healthcare is especially important when all these different types of data need to interact. Presented in this paper is a solution to interoperability in healthcare based on a standards-based middleware software architecture used in enterprise solutions. This architecture has been translated into the healthcare domain using a messaging and modeling standard which upholds the ideals of the Semantic Web (HL7 V3) combined with a well-known standard terminology of clinical terms (SNOMED CT).

  6. Right-handed charged currents in the era of the Large Hadron Collider

    DOE PAGES

    Alioli, Simone; Cirigliano, Vincenzo; Dekens, Wouter Gerard; ...

    2017-05-16

    We discuss the phenomenology of right-handed charged currents in the framework of the Standard Model Effective Field Theory, in which they arise due to a single gauge-invariant dimension-six operator. We study the manifestations of the nine complex couplings of the W to right-handed quarks in collider physics, flavor physics, and low-energy precision measurements. We first obtain constraints on the couplings under the assumption that the right-handed operator is the dominant correction to the Standard Model at observable energies. We subsequently study the impact of degeneracies with other Beyond-the-Standard-Model effective interactions and identify observables, both at colliders and in low-energy experiments, that would uniquely point to right-handed charged currents.

  7. A reference model for space data system interconnection services

    NASA Astrophysics Data System (ADS)

    Pietras, John; Theis, Gerhard

    1993-03-01

    The widespread adoption of standard packet-based data communication protocols and services for spaceflight missions provides the foundation for other standard space data handling services. These space data handling services can be defined as increasingly sophisticated processing of data or information received from lower-level services, using a layering approach made famous in the International Organization for Standardization (ISO) Open System Interconnection Reference Model (OSI-RM). The Space Data System Interconnection Reference Model (SDSI-RM) incorporates the conventions of the OSI-RM to provide a framework within which a complete set of space data handling services can be defined. The use of the SDSI-RM is illustrated through its application to data handling services and protocols that have been defined by, or are under consideration by, the Consultative Committee for Space Data Systems (CCSDS).

  8. A reference model for space data system interconnection services

    NASA Technical Reports Server (NTRS)

    Pietras, John; Theis, Gerhard

    1993-01-01

    The widespread adoption of standard packet-based data communication protocols and services for spaceflight missions provides the foundation for other standard space data handling services. These space data handling services can be defined as increasingly sophisticated processing of data or information received from lower-level services, using a layering approach made famous in the International Organization for Standardization (ISO) Open System Interconnection Reference Model (OSI-RM). The Space Data System Interconnection Reference Model (SDSI-RM) incorporates the conventions of the OSI-RM to provide a framework within which a complete set of space data handling services can be defined. The use of the SDSI-RM is illustrated through its application to data handling services and protocols that have been defined by, or are under consideration by, the Consultative Committee for Space Data Systems (CCSDS).

  9. Embedding Quantum Mechanics Into a Broader Noncontextual Theory: A Conciliatory Result

    NASA Astrophysics Data System (ADS)

    Garola, Claudio; Sozzo, Sandro

    2010-12-01

    The extended semantic realism (ESR) model embodies the mathematical formalism of standard (Hilbert space) quantum mechanics in a noncontextual framework, reinterpreting quantum probabilities as conditional instead of absolute. We provide here an improved version of this model and show that it predicts that, whenever idealized measurements are performed, a modified Bell-Clauser-Horne-Shimony-Holt (BCHSH) inequality holds if one takes into account all individual systems that are prepared, standard quantum predictions hold if one considers only the individual systems that are detected, and a standard BCHSH inequality holds at a microscopic (purely theoretical) level. These results admit an intuitive explanation in terms of an unconventional kind of unfair sampling and constitute a first example of the unified perspective that can be attained by adopting the ESR model.
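
    For reference, the standard BCHSH inequality mentioned in the record can be stated as follows (textbook form; the ESR model's modified inequality rescales the correlation functions and is not reproduced here):

```latex
% Standard BCHSH (CHSH) inequality for correlation functions E(a,b)
% at detector settings a, a', b, b':
\[
  S = E(a,b) - E(a,b') + E(a',b) + E(a',b'), \qquad |S| \le 2 ,
\]
% whereas standard quantum mechanics allows |S| up to 2*sqrt(2)
% (the Tsirelson bound).
```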

  10. Small drinking water systems under spatiotemporal water quality variability: a risk-based performance benchmarking framework.

    PubMed

    Bereskie, Ty; Haider, Husnain; Rodriguez, Manuel J; Sadiq, Rehan

    2017-08-23

    Traditional approaches for benchmarking drinking water systems are binary, based solely on the compliance or non-compliance of one or more water quality performance indicators against defined regulatory guidelines/standards. The consequence of a water quality failure depends on the location within a water supply system as well as the time of year (i.e., season), with varying levels of water consumption. Conventional approaches used for water quality comparison fail to incorporate spatiotemporal variability and degrees of compliance or non-compliance, which can lead to misleading or inaccurate performance assessment data being used in the benchmarking process. In this research, a hierarchical risk-based water quality performance benchmarking framework is proposed to evaluate small drinking water systems (SDWSs) through cross-comparison amongst similar systems. The proposed framework (the RWQI framework) is designed to quantify the consequences associated with seasonal and location-specific water quality issues in a given drinking water supply system, to facilitate more efficient decision-making for SDWSs striving for continuous performance improvement. Fuzzy rule-based modelling is used to address the imprecision associated with measuring performance against singular water quality guidelines/standards and the uncertainties present in SDWS operations and monitoring. The RWQI framework has been demonstrated using data collected from 16 SDWSs in Newfoundland and Labrador and Quebec, Canada, and compared to the Canadian Council of Ministers of the Environment WQI, a traditional, guidelines/standards-based approach. The study found that the RWQI framework provides a more in-depth picture of the state of water quality and benchmarks SDWSs more rationally, based on the frequency of occurrence and consequence of failure events.
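
    A toy illustration of the graded, risk-based scoring idea: each measurement receives a fuzzy degree of guideline exceedance rather than a binary pass/fail, and the aggregate is weighted by a location- and season-specific consequence factor. The membership ramp and weights below are illustrative only, not the paper's fuzzy rule base.

```python
# Toy graded (fuzzy) compliance scoring, in contrast to binary pass/fail.
# Breakpoints and weights are illustrative assumptions.
def exceedance_grade(value: float, guideline: float) -> float:
    """0 = fully compliant, 1 = severe exceedance, fuzzy in between."""
    ratio = value / guideline
    if ratio <= 1.0:
        return 0.0
    return min(ratio - 1.0, 1.0)       # linear ramp up to 2x guideline

def risk_score(samples, guideline, consequence):
    grades = [exceedance_grade(v, guideline) for v in samples]
    frequency = sum(g > 0 for g in grades) / len(grades)
    severity = sum(grades) / len(grades)
    return consequence * (0.5 * frequency + 0.5 * severity)

# Three samples against a guideline of 1.0 at a high-consequence site:
print(risk_score([0.8, 1.1, 1.6], guideline=1.0, consequence=0.9))
```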

  11. An approach for quantitative image quality analysis for CT

    NASA Astrophysics Data System (ADS)

    Rahimi, Amir; Cochran, Joe; Mooney, Doug; Regensburger, Joe

    2016-03-01

    An objective and standardized approach to assessing the image quality of Computed Tomography (CT) systems is required in a wide variety of imaging processes to identify CT systems appropriate for a given application. We present an overview of the framework we have developed to help standardize and objectively assess CT image quality for different models of CT scanners used for security applications. Within this framework, we have developed methods to quantitatively measure metrics that should correlate with feature identification, detection accuracy and precision, and image registration capabilities of CT machines, and to identify strengths and weaknesses in different CT imaging technologies in transportation security. To that end we have designed, developed and constructed phantoms that allow for systematic and repeatable measurements of roughly 88 image quality metrics, representing modulation transfer function, noise equivalent quanta, noise power spectra, slice sensitivity profiles, streak artifacts, CT number uniformity, CT number consistency, object length accuracy, CT number path length consistency, and object registration. Furthermore, we have developed a sophisticated MATLAB-based image analysis toolkit to analyze CT-generated images of phantoms and report these metrics in a format that is standardized across the considered models of CT scanners, allowing for comparative image quality analysis within a CT model or between different CT models. In addition, we have developed a modified sparse principal component analysis (SPCA) method that, in contrast to standard principal component analysis (PCA), generates components with sparse loadings; it is used in conjunction with the Hotelling T² statistic to compare, qualify, and detect faults in the tested systems.
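
    A small sketch of the fault-detection step, using ordinary PCA in place of the paper's modified sparse PCA: project a new scan's metric vector onto components learned from baseline scans and flag it with a Hotelling T² statistic. Data, component count, and threshold are illustrative.

```python
# PCA + Hotelling T^2 fault-detection sketch on simulated stand-ins for
# per-image quality metrics (the paper's modified SPCA is not reproduced).
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)
baseline = rng.normal(size=(200, 10))      # reference scans x metrics
test = rng.normal(size=10)
test[2] += 3.0                             # inject a metric anomaly

pca = PCA(n_components=4).fit(baseline)
scores = pca.transform(baseline)
var = scores.var(axis=0, ddof=1)           # per-component score variance

t = pca.transform(test.reshape(1, -1)).ravel()
T2 = np.sum(t**2 / var)                    # Hotelling T^2 in PC space
print(f"T^2 = {T2:.1f}")                   # a large value flags a fault
```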

  12. Hybrid modeling in biochemical systems theory by means of functional petri nets.

    PubMed

    Wu, Jialiang; Voit, Eberhard

    2009-02-01

    Many biological systems are genuinely hybrids consisting of interacting discrete and continuous components and processes that often operate at different time scales. It is therefore desirable to create modeling frameworks capable of combining differently structured processes and permitting their analysis over multiple time horizons. During the past 40 years, Biochemical Systems Theory (BST) has been a very successful approach to elucidating metabolic, gene regulatory, and signaling systems. However, its foundation in ordinary differential equations has precluded BST from directly addressing problems containing switches, delays, and stochastic effects. In this study, we extend BST to hybrid modeling within the framework of Hybrid Functional Petri Nets (HFPN). First, we show how the canonical GMA and S-system models in BST can be directly implemented in a standard Petri Net framework. In a second step we demonstrate how to account for different types of time delays as well as for discrete, stochastic, and switching effects. Using representative test cases, we validate the hybrid modeling approach through comparative analyses and simulations with other approaches and highlight the feasibility, quality, and efficiency of the hybrid method.
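
    For concreteness, a minimal integration of the canonical S-system form mentioned above, dX_i/dt = α_i ∏_j X_j^(g_ij) − β_i ∏_j X_j^(h_ij), with illustrative parameter values; this is the continuous ODE core that the paper embeds in a hybrid Petri net, not the hybrid extension itself.

```python
# Minimal S-system integration sketch (canonical BST form).
# dX_i/dt = alpha_i * prod_j X_j^g_ij - beta_i * prod_j X_j^h_ij
# Parameter values are illustrative only.
import numpy as np
from scipy.integrate import solve_ivp

alpha = np.array([2.0, 1.0])
beta = np.array([1.0, 1.5])
g = np.array([[0.0, -0.5],   # production kinetic orders
              [0.8,  0.0]])
h = np.array([[0.5,  0.0],   # degradation kinetic orders
              [0.0,  0.5]])

def s_system(t, x):
    production = alpha * np.prod(x ** g, axis=1)
    degradation = beta * np.prod(x ** h, axis=1)
    return production - degradation

sol = solve_ivp(s_system, (0.0, 20.0), [1.0, 1.0])
print("approximate steady state:", sol.y[:, -1])
```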

  13. Beyond the SCS-CN method: A theoretical framework for spatially lumped rainfall-runoff response

    NASA Astrophysics Data System (ADS)

    Bartlett, M. S.; Parolari, A. J.; McDonnell, J. J.; Porporato, A.

    2016-06-01

    Since its introduction in 1954, the Soil Conservation Service curve number (SCS-CN) method has become the standard tool, in practice, for estimating an event-based rainfall-runoff response. However, because of its empirical origins, the SCS-CN method is restricted to certain geographic regions and land use types, and it does not describe the spatial variability of runoff. To move beyond these limitations, we present a new theoretical framework for spatially lumped, event-based rainfall-runoff modeling. In this framework, we describe the spatially lumped runoff model as a point description of runoff that is upscaled to a watershed area based on probability distributions that are representative of watershed heterogeneities. The framework accommodates different runoff concepts and distributions of heterogeneities, and in doing so, it provides an implicit spatial description of runoff variability. Heterogeneity in storage capacity and soil moisture are the basis for upscaling a point runoff response and linking ecohydrological processes to runoff modeling. Within the framework, we consider two different runoff responses for fractions of the watershed area: "prethreshold" and "threshold-excess" runoff, which occur before and after infiltration exceeds a storage capacity threshold. Our application of the framework results in a new model (called SCS-CNx) that extends the SCS-CN method with the prethreshold and threshold-excess runoff mechanisms and an implicit spatial description of runoff. We show proof of concept in four forested watersheds and further show that the resulting model may better represent geographic regions and site types that previously have been beyond the scope of the traditional SCS-CN method.
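
    The classical SCS-CN relation that the framework generalizes is compact enough to state directly; the sketch below uses the textbook form with potential maximum retention S = 1000/CN − 10 (inches) and the common initial-abstraction assumption Ia = 0.2 S.

```python
# Classical SCS-CN event runoff:
#   Q = (P - Ia)^2 / (P - Ia + S)   for P > Ia, else Q = 0,
# with S = 1000/CN - 10 (inches) and Ia = lambda * S (lambda = 0.2).
def scs_cn_runoff(p_in: float, cn: float, lam: float = 0.2) -> float:
    s = 1000.0 / cn - 10.0          # potential maximum retention (in)
    ia = lam * s                    # initial abstraction (in)
    if p_in <= ia:
        return 0.0
    return (p_in - ia) ** 2 / (p_in - ia + s)

print(scs_cn_runoff(3.0, cn=75))    # runoff depth (in) for a 3 in storm
```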

  14. Comparison of marginal and internal fit of 3-unit ceramic fixed dental prostheses made with either a conventional or digital impression.

    PubMed

    Su, Ting-Shu; Sun, Jian

    2016-09-01

    For 20 years, the intraoral digital impression technique has been applied to the fabrication of computer-aided design and computer-aided manufacturing (CAD-CAM) fixed dental prostheses (FDPs). Clinical fit is one of the main determinants of the success of an FDP. Studies of the clinical fit of 3-unit ceramic FDPs made by means of a conventional impression versus a digital impression technology are limited. The purpose of this in vitro study was to evaluate and compare the internal fit and marginal fit of CAD-CAM, 3-unit ceramic FDP frameworks fabricated from an intraoral digital impression and a conventional impression. A standard model was designed with a prepared maxillary left canine and second premolar and a missing first premolar. The model was scanned with an intraoral digital scanner, exporting stereolithography (STL) files, as the experimental group (digital group). The model was used to fabricate 10 stone casts that were scanned with an extraoral scanner, exporting STL files to a computer connected to the scanner, as the control group (conventional group). The STL files were used to produce zirconia FDP frameworks with CAD-CAM. These frameworks were seated on the standard model and evaluated for marginal and internal fit. Each framework was segmented into 4 sections per abutment tooth, resulting in 8 sections per framework, and was observed using optical microscopy at ×50 magnification. Four measurement points were selected on each section: marginal discrepancy (P1), mid-axial wall (P2), axio-occlusal edge (P3), and central-occlusal point (P4). Mean marginal fit values of the digital group (64 ±16 μm) were significantly smaller than those of the conventional group (76 ±18 μm) (P<.05). The mean internal fit values of the digital group (111 ±34 μm) were significantly smaller than those of the conventional group (132 ±44 μm) (P<.05). CAD-CAM 3-unit zirconia FDP frameworks fabricated from intraoral digital and conventional impressions showed clinically acceptable marginal and internal fit. The marginal and internal fit of frameworks fabricated from the intraoral digital impression system were better than those fabricated from conventional impressions. Copyright © 2016 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.

  15. Enterprise and system of systems capability development life-cycle processes.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beck, David Franklin

    2014-08-01

    This report and set of appendices are a collection of memoranda originally drafted circa 2007-2009 for the purpose of describing and detailing a models-based systems engineering approach for satisfying enterprise and system-of-systems life cycle process requirements. At the time there was interest and support to move from Capability Maturity Model Integration (CMMI) Level One (ad hoc processes) to Level Three. The main thrust of the material presents a rational exposé of a structured enterprise development life cycle that uses the scientific method as a framework, with further rigor added by adapting relevant portions of standard systems engineering processes. While the approach described invokes application of the Department of Defense Architecture Framework (DoDAF), it is suitable for use with other architectural description frameworks.

  16. Sustainable development induction in organizations: a convergence analysis of ISO standards management tools' parameters.

    PubMed

    Merlin, Fabrício Kurman; Pereira, Vera Lúcia Duarte do Valle; Pacheco, Waldemar

    2012-01-01

    Organizations are part of an environment in which they are pressured to meet society's demands and to act in a sustainable way. In an attempt to meet such demands, organizations make use of various management tools, among which are ISO standards. Although there is evidence of the contributions provided by these standards, it is questionable whether their parameters converge toward a possible induction of sustainable development in organizations. This work presents a theoretical study, designed from a structuralist worldview with a descriptive and deductive method, which aims to analyze the convergence of the management tool parameters in ISO standards. To support the analysis, a generic framework for possible convergence was developed, based on a systems approach, linking five ISO standards (ISO 9001, ISO 14001, OHSAS 18001, ISO 31000 and ISO 26000) with sustainable development and positioning them according to organizational levels (strategic, tactical and operational). The structure was designed based on the Brundtland report concept. The analysis was performed by exploring the generic framework for possible convergence based on the Nadler and Tushman model. The results indicate that the standards can contribute to a possible induction of sustainable development in organizations, as long as they meet certain minimum conditions related to strategic alignment.

  17. Documenting Models for Interoperability and Reusability ...

    EPA Pesticide Factsheets

    Many modeling frameworks compartmentalize science via individual models that link sets of small components to create larger modeling workflows. Developing integrated watershed models increasingly requires coupling multidisciplinary, independent models, as well as collaboration between scientific communities, since component-based modeling can integrate models from different disciplines. Integrated Environmental Modeling (IEM) systems focus on transferring information between components by capturing a conceptual site model; establishing local metadata standards for input/output of models and databases; managing data flow between models and throughout the system; facilitating quality control of data exchanges (e.g., checking units, unit conversions, transfers between software languages); warning and error handling; and coordinating sensitivity/uncertainty analyses. Although many computational software systems facilitate communication between, and execution of, components, there are no common approaches, protocols, or standards for turn-key linkages between software systems and models, especially if modifying components is not the intent. Using a standard ontology, this paper reviews how models can be described for discovery, understanding, evaluation, access, and implementation to facilitate interoperability and reusability. In the proceedings of the International Environmental Modelling and Software Society (iEMSs), 8th International Congress on Environmental Mod

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Knirsch, Fabian; Engel, Dominik; Neureiter, Christian

    In a smart grid, data and information are transported, transmitted, stored, and processed with various stakeholders having to cooperate effectively. Furthermore, personal data is the key to many smart grid applications and therefore privacy impacts have to be taken into account. For an effective smart grid, well integrated solutions are crucial and, for achieving a high degree of customer acceptance, privacy should already be considered at design time of the system. To assist system engineers in the early design phase, frameworks for the automated privacy evaluation of use cases are important. For evaluation, use cases for services and software architectures need to be formally captured in a standardized and commonly understood manner. In order to ensure this common understanding for all kinds of stakeholders, reference models have recently been developed. In this paper we present a model-driven approach for the automated assessment of such services and software architectures in the smart grid that builds on the standardized reference models. The focus of the qualitative and quantitative evaluation is on privacy. For evaluation, the framework draws on use cases from the University of Southern California microgrid.

  19. Accounting for measurement reliability to improve the quality of inference in dental microhardness research: a worked example.

    PubMed

    Sever, Ivan; Klaric, Eva; Tarle, Zrinka

    2016-07-01

    Dental microhardness experiments are influenced by unobserved factors related to varying tooth characteristics that affect measurement reproducibility. This paper explores appropriate analytical tools for modeling different sources of unobserved variability to reduce the biases encountered and increase the validity of microhardness studies. The enamel microhardness of human third molars was measured with a Vickers diamond indenter. The effects of five bleaching agents (10, 16, and 30% carbamide peroxide, and 25 and 38% hydrogen peroxide) were examined, as well as the effects of artificial saliva and amorphous calcium phosphate. To account for both between- and within-tooth heterogeneity in evaluating treatment effects, the statistical analysis was performed in the mixed-effects framework, which also included an appropriate weighting procedure to adjust for confounding. The results were compared to those of the standard ANOVA model usually applied. The weighted mixed-effects model produced parameter estimates of different magnitude and significance than the standard ANOVA model. The results of the former model were more intuitive, with more precise estimates and a better fit. Confounding could seriously bias study outcomes, highlighting the need for more robust statistical procedures in dental research that account for measurement reliability. The presented framework is more flexible and informative than existing analytical techniques and may improve the quality of inference in dental research. Reported results could be misleading if the underlying heterogeneity of microhardness measurements is not taken into account. Confidence in treatment outcomes could be increased by applying the framework presented.
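
    A minimal sketch of the mixed-effects idea on simulated data: treatment enters as a fixed effect and tooth as a random effect that absorbs between-tooth heterogeneity. The paper's additional weighting adjustment for confounding is not reproduced, and all column names and values are hypothetical.

```python
# Mixed-effects sketch: fixed treatment effect, random tooth intercept.
# Data are simulated; the paper's weighting step is omitted.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
teeth = np.repeat(np.arange(20), 5)              # 20 teeth, 5 indents each
treat = np.repeat(rng.integers(0, 2, 20), 5)     # per-tooth treatment flag
tooth_effect = np.repeat(rng.normal(0, 15, 20), 5)  # between-tooth spread
vhn = 320 - 25 * treat + tooth_effect + rng.normal(0, 8, 100)
df = pd.DataFrame({"vhn": vhn, "treat": treat, "tooth": teeth})

fit = smf.mixedlm("vhn ~ treat", df, groups=df["tooth"]).fit()
print(fit.summary())                             # treatment effect + SE
```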

  20. Development of a Curricular Framework for Pediatric Hospital Medicine Fellowships.

    PubMed

    Jerardi, Karen E; Fisher, Erin; Rassbach, Caroline; Maniscalco, Jennifer; Blankenburg, Rebecca; Chase, Lindsay; Shah, Neha

    2017-07-01

    Pediatric Hospital Medicine (PHM) is an emerging field in pediatrics and one that has experienced immense growth and maturation in a short period of time. Evolution and rapid expansion of the field invigorated the goal of standardizing PHM fellowship curricula, which naturally aligned with the field's evolving pursuit of a defined identity and consideration of certification options. The national group of PHM fellowship program directors sought to establish curricular standards that would more accurately reflect the competencies needed to practice pediatric hospital medicine and meet future board certification needs. In this manuscript, we describe the method by which we reached consensus on a 2-year curricular framework for PHM fellowship programs, detail the current model for this framework, and provide examples of how this curricular framework may be applied to meet the needs of a variety of fellows and fellowship programs. The 2-year PHM fellowship curricular framework was developed over a number of years through an iterative process and with the input of PHM fellowship program directors (PDs), PHM fellowship graduates, PHM leaders, pediatric hospitalists practicing in a variety of clinical settings, and other educators outside the field. We have developed a curricular framework for PHM Fellowships that consists of 8 education units (defined as 4 weeks each) in 3 areas: clinical care, systems and scholarship, and individualized curriculum. Copyright © 2017 by the American Academy of Pediatrics.

  1. A Comparison of Pseudo-Maximum Likelihood and Asymptotically Distribution-Free Dynamic Factor Analysis Parameter Estimation in Fitting Covariance Structure Models to Block-Toeplitz Matrices Representing Single-Subject Multivariate Time-Series.

    ERIC Educational Resources Information Center

    Molenaar, Peter C. M.; Nesselroade, John R.

    1998-01-01

    Pseudo-Maximum Likelihood (p-ML) and Asymptotically Distribution Free (ADF) estimation methods for estimating dynamic factor model parameters within a covariance structure framework were compared through a Monte Carlo simulation. Both methods appear to give consistent model parameter estimates, but only ADF gives standard errors and chi-square…
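
    For concreteness, a sketch of the block-Toeplitz matrix the title refers to: lagged covariance blocks C(0), C(1), ... of a single subject's multivariate time series, arranged so that block (i, j) depends only on |i − j|. Conventions for the off-diagonal transposes vary; this is one common arrangement.

```python
# Build a block-Toeplitz lagged-covariance matrix from one subject's
# (T, p) multivariate time series. Illustrative construction only.
import numpy as np

def block_toeplitz(x: np.ndarray, n_lags: int) -> np.ndarray:
    T, p = x.shape
    xc = x - x.mean(axis=0)
    # C[k] approximates Cov(x_{t+k}, x_t), a p x p block per lag k.
    C = [xc[k:].T @ xc[: T - k] / (T - k) for k in range(n_lags)]
    out = np.zeros((n_lags * p, n_lags * p))
    for i in range(n_lags):
        for j in range(n_lags):
            blk = C[abs(i - j)]
            out[i*p:(i+1)*p, j*p:(j+1)*p] = blk if i >= j else blk.T
    return out

x = np.random.default_rng(5).normal(size=(200, 3))
print(block_toeplitz(x, n_lags=2).shape)   # (6, 6)
```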

  2. Middleware for Plug and Play Integration of Heterogeneous Sensor Resources into the Sensor Web

    PubMed Central

    Toma, Daniel M.; Jirka, Simon; Del Río, Joaquín

    2017-01-01

    The study of global phenomena requires the combination of a considerable amount of data coming from different sources, acquired by different observation platforms and managed by institutions working in different scientific fields. Merging this data to provide extensive and complete data sets to monitor the long-term, global changes of our oceans is a major challenge. The data acquisition and data archival procedures usually vary significantly depending on the acquisition platform. This lack of standardization ultimately leads to information silos, preventing the data from being effectively shared across different scientific communities. In the past years, important steps have been taken in order to improve both standardization and interoperability, such as the Open Geospatial Consortium’s Sensor Web Enablement (SWE) framework. Within this framework, standardized models and interfaces to archive, access and visualize the data from heterogeneous sensor resources have been proposed. However, due to the wide variety of software and hardware architectures presented by marine sensors and marine observation platforms, there is still a lack of uniform procedures to integrate sensors into existing SWE-based data infrastructures. In this work, a framework aimed at enabling plug and play sensor integration into existing SWE-based data infrastructures is presented. First, the operations required to automatically identify, configure and operate a sensor are analysed. Then, the metadata required for these operations is structured in a standard way. Afterwards, a modular, plug and play, SWE-based acquisition chain is proposed. Finally, different use cases for this framework are presented. PMID:29244732

  3. The COST Action IC0604 "Telepathology Network in Europe" (EURO-TELEPATH).

    PubMed

    García-Rojo, Marcial; Gonçalves, Luís; Blobel, Bernd

    2012-01-01

    The COST Action IC0604 "Telepathology Network in Europe" (EURO-TELEPATH) is a European COST Action that ran from 2007 to 2011. COST Actions are funded by the COST (European Cooperation in the field of Scientific and Technical Research) Agency, supported by the Seventh Framework Programme for Research and Technological Development (FP7) of the European Union. EURO-TELEPATH's main objectives were evaluating and validating the common technological framework and communication standards required to access, transmit and manage digital medical records by pathologists and other medical professionals in a networked environment. The project was organized in four working groups. Working Group 1 "Business modeling in pathology" designed the main pathology processes - Frozen Study, Formalin Fixed Specimen Study, Telepathology, Cytology, and Autopsy - using Business Process Modeling Notation (BPMN). Working Group 2 "Informatics standards in pathology" was dedicated to promoting the development and application of informatics standards in pathology, collaborating with Integrating the Healthcare Enterprise (IHE), Digital Imaging and Communications in Medicine (DICOM), Health Level Seven (HL7), and other standardization bodies. Working Group 3 "Images: Analysis, Processing, Retrieval and Management" worked on the use of virtual or digital slides, which are fostering the use of image processing and analysis in pathology not only for research purposes, but also in daily practice. Working Group 4 "Technology and Automation in Pathology" focused on studying the adequacy of currently existing technical solutions, including, e.g., the quality of images obtained by slide scanners, or the efficiency of image analysis applications. Major outcomes of this action are the collaboration with international health informatics standardization bodies to foster the development of standards for digital pathology, offering a new approach for workflow analysis based on business process modeling. Health terminology standardization research has become a topic of high interest. Future research work should focus on the standardization of automatic image analysis and tissue microarray imaging.

  4. Geo3DML: A standard-based exchange format for 3D geological models

    NASA Astrophysics Data System (ADS)

    Wang, Zhangang; Qu, Honggang; Wu, Zixing; Wang, Xianghong

    2018-01-01

    A geological model (geomodel) in three-dimensional (3D) space is a digital representation of the Earth's subsurface, recognized by geologists and stored in resultant geological data (geodata). The increasing demand for data management and interoperable applications of geomodels can be addressed by developing standard-based exchange formats for the representation of not only a single geological object, but also holistic geomodels. However, current standards such as GeoSciML cannot incorporate all the geomodel-related information. This paper presents Geo3DML for the exchange of 3D geomodels, based on the existing Open Geospatial Consortium (OGC) standards. Geo3DML is based on a unified and formal representation of structural models, attribute models and hierarchical structures of interpreted resultant geodata in different dimensional views, including drills, cross-sections/geomaps and 3D models, and is compatible with the conceptual model of GeoSciML. Geo3DML aims to encode all geomodel-related information integrally in one framework, including the semantic and geometric information of geoobjects and their relationships, as well as visual information. At present, Geo3DML and some supporting tools have been released as a data-exchange standard by the China Geological Survey (CGS).
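
    A hedged sketch of what emitting a small exchange-format fragment might look like with the Python standard library; the element names below are invented placeholders for illustration and do not follow the actual Geo3DML schema.

```python
# Emit a tiny Geo3DML-like XML fragment. All element names here are
# hypothetical placeholders, not the real Geo3DML schema.
import xml.etree.ElementTree as ET

model = ET.Element("GeoModel", attrib={"name": "demo-basin"})
feat = ET.SubElement(model, "GeologicFeature", attrib={"id": "F1"})
ET.SubElement(feat, "Lithology").text = "sandstone"
geom = ET.SubElement(feat, "Geometry", attrib={"type": "TriangulatedSurface"})
ET.SubElement(geom, "Vertices").text = "0 0 0  1 0 0  0 1 0"  # x y z triples
ET.SubElement(geom, "Faces").text = "0 1 2"                   # vertex indices

print(ET.tostring(model, encoding="unicode"))
```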

  5. Closed-Loop Lifecycle Management of Service and Product in the Internet of Things: Semantic Framework for Knowledge Integration.

    PubMed

    Yoo, Min-Jung; Grozel, Clément; Kiritsis, Dimitris

    2016-07-08

    This paper describes our conceptual framework of closed-loop lifecycle information sharing for product-service in the Internet of Things (IoT). The framework is based on the ontology model of product-service and a type of IoT message standard, Open Messaging Interface (O-MI) and Open Data Format (O-DF), which ensures data communication. (1) Background: Based on an existing product lifecycle management (PLM) methodology, we enhanced the ontology model for the purpose of integrating efficiently the product-service ontology model that was newly developed; (2) Methods: The IoT message transfer layer is vertically integrated into a semantic knowledge framework inside which a Semantic Info-Node Agent (SINA) uses the message format as a common protocol of product-service lifecycle data transfer; (3) Results: The product-service ontology model facilitates information retrieval and knowledge extraction during the product lifecycle, while making more information available for the sake of service business creation. The vertical integration of IoT message transfer, encompassing all semantic layers, helps achieve a more flexible and modular approach to knowledge sharing in an IoT environment; (4) Contribution: A semantic data annotation applied to IoT can contribute to enhancing collected data types, which entails a richer knowledge extraction. The ontology-based PLM model enables as well the horizontal integration of heterogeneous PLM data while breaking traditional vertical information silos; (5) Conclusion: The framework was applied to a fictive case study with an electric car service for the purpose of demonstration. For the purpose of demonstrating the feasibility of the approach, the semantic model is implemented in Sesame APIs, which play the role of an Internet-connected Resource Description Framework (RDF) database.

  6. Closed-Loop Lifecycle Management of Service and Product in the Internet of Things: Semantic Framework for Knowledge Integration

    PubMed Central

    Yoo, Min-Jung; Grozel, Clément; Kiritsis, Dimitris

    2016-01-01

    This paper describes our conceptual framework of closed-loop lifecycle information sharing for product-service in the Internet of Things (IoT). The framework is based on the ontology model of product-service and a type of IoT message standard, Open Messaging Interface (O-MI) and Open Data Format (O-DF), which ensures data communication. (1) Background: Based on an existing product lifecycle management (PLM) methodology, we enhanced the ontology model for the purpose of integrating efficiently the product-service ontology model that was newly developed; (2) Methods: The IoT message transfer layer is vertically integrated into a semantic knowledge framework inside which a Semantic Info-Node Agent (SINA) uses the message format as a common protocol of product-service lifecycle data transfer; (3) Results: The product-service ontology model facilitates information retrieval and knowledge extraction during the product lifecycle, while making more information available for the sake of service business creation. The vertical integration of IoT message transfer, encompassing all semantic layers, helps achieve a more flexible and modular approach to knowledge sharing in an IoT environment; (4) Contribution: A semantic data annotation applied to IoT can contribute to enhancing collected data types, which entails a richer knowledge extraction. The ontology-based PLM model enables as well the horizontal integration of heterogeneous PLM data while breaking traditional vertical information silos; (5) Conclusion: The framework was applied to a fictive case study with an electric car service for the purpose of demonstration. For the purpose of demonstrating the feasibility of the approach, the semantic model is implemented in Sesame APIs, which play the role of an Internet-connected Resource Description Framework (RDF) database. PMID:27399717

  7. A proposed application programming interface for a physical volume repository

    NASA Technical Reports Server (NTRS)

    Jones, Merritt; Williams, Joel; Wrenn, Richard

    1996-01-01

    The IEEE Storage System Standards Working Group (SSSWG) has developed the Reference Model for Open Storage Systems Interconnection, Mass Storage System Reference Model Version 5. This document provides the framework for a series of standards for application and user interfaces to open storage systems. More recently, the SSSWG has been developing Application Programming Interfaces (APIs) for the individual components defined by the model. The API for the Physical Volume Repository is the most fully developed, but work is also being done on APIs for the Physical Volume Library and for the Mover. The SSSWG meets every other month, and meetings are open to all interested parties. The Physical Volume Repository (PVR) is responsible for managing the storage of removable media cartridges and for mounting and dismounting these cartridges onto drives. This document describes a model that defines a Physical Volume Repository and gives a brief summary of the Application Programming Interface (API) that the IEEE Storage Systems Standards Working Group (SSSWG) is proposing as the standard interface for the PVR.
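
    An illustrative rendering of the PVR's core responsibilities (managing cartridge storage locations and mounting/dismounting cartridges onto drives) as a small Python class; the real SSSWG API is a formal interface specification, and none of the method names below are the standard's identifiers.

```python
# Toy model of Physical Volume Repository responsibilities. Method
# names are paraphrases for illustration, not the SSSWG API.
class PhysicalVolumeRepository:
    def __init__(self):
        self._slots = {}      # cartridge_id -> shelf location
        self._mounted = {}    # drive_id -> cartridge_id

    def enter(self, cartridge_id: str, location: str) -> None:
        """Register a cartridge as stored at a shelf location."""
        self._slots[cartridge_id] = location

    def mount(self, cartridge_id: str, drive_id: str) -> None:
        """Mount a known cartridge onto a drive."""
        if cartridge_id not in self._slots:
            raise KeyError(f"unknown cartridge {cartridge_id}")
        self._mounted[drive_id] = cartridge_id

    def dismount(self, drive_id: str) -> str:
        """Dismount and return the cartridge on the given drive."""
        return self._mounted.pop(drive_id)

pvr = PhysicalVolumeRepository()
pvr.enter("TAPE001", "shelf-A3")
pvr.mount("TAPE001", "drive-0")
print(pvr.dismount("drive-0"))
```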

  8. The Power to drive change: Working together for excellence. Creating a continuously improving consumer engagement framework for excellence in patient-centered care.

    PubMed

    Ryan, Catherine

    2016-01-01

    The World Health Organization has acknowledged patient safety during hospital care as a serious global public health issue, with patient empowerment and community engagement key to continuously improving the safety and quality of care for the best possible clinical and patient outcomes. In Australia, the introduction of ten mandatory National Safety and Quality Health Service Standards in 2011 provided the catalyst for all Australian health facilities to review their systems. Standard 2: Partnering with Consumers required health facilities across Australia to assess their commitment to, and capacity for, consumer and community engagement and participation. At this time, the Royal Brisbane and Women's Hospital did not have a strategic perspective and understanding of, or an organizational structure for, engaging with consumers (patients, families, caregivers and community members). The concept required a new model to replace the clinician-led model of healthcare historically featured in Australia, with a change in culture and core business processes to partner with consumers at all levels of the system, from individual patient care through to participation in policy development, health service planning and delivery, and evaluation and measurement processes. The challenge for the hospital was to build a sustainable framework of engagement for a genuine patient-centered model of care informed by best practice, and to provide leadership and commitment to developing as an area of excellence in patient engagement and experience. A successful and sustainable framework for consumer and community engagement has been embedded in the hospital, with resultant culture change, achieving accreditation across all core and developmental criteria for the Partnering with Consumers standard, including several Met with Merit ratings.

  9. Consensus Statement on Electronic Health Predictive Analytics: A Guiding Framework to Address Challenges

    PubMed Central

    Amarasingham, Ruben; Audet, Anne-Marie J.; Bates, David W.; Glenn Cohen, I.; Entwistle, Martin; Escobar, G. J.; Liu, Vincent; Etheredge, Lynn; Lo, Bernard; Ohno-Machado, Lucila; Ram, Sudha; Saria, Suchi; Schilling, Lisa M.; Shahi, Anand; Stewart, Walter F.; Steyerberg, Ewout W.; Xie, Bin

    2016-01-01

    Context: The recent explosion in available electronic health record (EHR) data is motivating a rapid expansion of electronic health care predictive analytic (e-HPA) applications, defined as the use of electronic algorithms that forecast clinical events in real time with the intent to improve patient outcomes and reduce costs. There is an urgent need for a systematic framework to guide the development and application of e-HPA to ensure that the field develops in a scientifically sound, ethical, and efficient manner. Objectives: Building upon earlier frameworks of model development and utilization, we identify the emerging opportunities and challenges of e-HPA, propose a framework that enables us to realize these opportunities, address these challenges, and motivate e-HPA stakeholders to both adopt and continuously refine the framework as the applications of e-HPA emerge. Methods: To achieve these objectives, 17 experts with diverse expertise including methodology, ethics, legal, regulation, and health care delivery systems were assembled to identify emerging opportunities and challenges of e-HPA and to propose a framework to guide the development and application of e-HPA. Findings: The framework proposed by the panel includes three key domains where e-HPA differs qualitatively from earlier generations of models and algorithms (Data Barriers, Transparency, and Ethics) and areas where current frameworks are insufficient to address the emerging opportunities and challenges of e-HPA (Regulation and Certification; and Education and Training). The following list of recommendations summarizes the key points of the framework: Data Barriers: Establish mechanisms within the scientific community to support data sharing for predictive model development and testing. Transparency: Set standards around e-HPA validation based on principles of scientific transparency and reproducibility. Ethics: Develop both individual-centered and society-centered risk-benefit approaches to evaluate e-HPA. Regulation and Certification: Construct a self-regulation and certification framework within e-HPA. Education and Training: Make significant changes to medical, nursing, and paraprofessional curricula by including training for understanding, evaluating, and utilizing predictive models. PMID:27141516

  10. Consensus Statement on Electronic Health Predictive Analytics: A Guiding Framework to Address Challenges.

    PubMed

    Amarasingham, Ruben; Audet, Anne-Marie J; Bates, David W; Glenn Cohen, I; Entwistle, Martin; Escobar, G J; Liu, Vincent; Etheredge, Lynn; Lo, Bernard; Ohno-Machado, Lucila; Ram, Sudha; Saria, Suchi; Schilling, Lisa M; Shahi, Anand; Stewart, Walter F; Steyerberg, Ewout W; Xie, Bin

    2016-01-01

    The recent explosion in available electronic health record (EHR) data is motivating a rapid expansion of electronic health care predictive analytic (e-HPA) applications, defined as the use of electronic algorithms that forecast clinical events in real time with the intent to improve patient outcomes and reduce costs. There is an urgent need for a systematic framework to guide the development and application of e-HPA to ensure that the field develops in a scientifically sound, ethical, and efficient manner. Building upon earlier frameworks of model development and utilization, we identify the emerging opportunities and challenges of e-HPA, propose a framework that enables us to realize these opportunities, address these challenges, and motivate e-HPA stakeholders to both adopt and continuously refine the framework as the applications of e-HPA emerge. To achieve these objectives, 17 experts with diverse expertise including methodology, ethics, legal, regulation, and health care delivery systems were assembled to identify emerging opportunities and challenges of e-HPA and to propose a framework to guide the development and application of e-HPA. The framework proposed by the panel includes three key domains where e-HPA differs qualitatively from earlier generations of models and algorithms (Data Barriers, Transparency, and Ethics) and areas where current frameworks are insufficient to address the emerging opportunities and challenges of e-HPA (Regulation and Certification; and Education and Training). The following list of recommendations summarizes the key points of the framework: Data Barriers: Establish mechanisms within the scientific community to support data sharing for predictive model development and testing. Transparency: Set standards around e-HPA validation based on principles of scientific transparency and reproducibility. Ethics: Develop both individual-centered and society-centered risk-benefit approaches to evaluate e-HPA. Regulation and Certification: Construct a self-regulation and certification framework within e-HPA. Education and Training: Make significant changes to medical, nursing, and paraprofessional curricula by including training for understanding, evaluating, and utilizing predictive models.

  11. Discovering Technicolor

    NASA Astrophysics Data System (ADS)

    Andersen, J. R.; Antipin, O.; Azuelos, G.; Del Debbio, L.; Del Nobile, E.; Di Chiara, S.; Hapola, T.; Järvinen, M.; Lowdon, P. J.; Maravin, Y.; Masina, I.; Nardecchia, M.; Pica, C.; Sannino, F.

    2011-09-01

    We provide a pedagogical introduction to extensions of the Standard Model in which the Higgs is composite. These extensions are known as models of dynamical electroweak symmetry breaking or, in brief, Technicolor. Material covered includes: motivations for Technicolor, the construction of underlying gauge theories leading to minimal models of Technicolor, the comparison with electroweak precision data, the low-energy effective theory, the spectrum of the states common to most of the Technicolor models, the decays of the composite particles and the experimental signals at the Large Hadron Collider. The level of the presentation is aimed at readers familiar with the Standard Model but who have little or no prior exposure to Technicolor. Several extensions of the Standard Model featuring a composite Higgs can be reduced to the effective Lagrangian introduced in the text. We establish the relevant experimental benchmarks for Vanilla, Running, Walking, and Custodial Technicolor, and a natural fourth family of leptons, by laying out the framework to discover these models at the Large Hadron Collider.

  12. Using frameworks to diagram value in complex policy and environmental interventions to prevent childhood obesity.

    PubMed

    Swank, Melissa Farrell; Brennan, Laura K; Gentry, Daniel; Kemner, Allison L

    2015-01-01

    To date, few tools assist policy makers and practitioners in understanding and conveying the implementation costs, potential impacts, and value of policy and environmental changes to address healthy eating, active living, and childhood obesity. For the Evaluation of Healthy Kids, Healthy Communities (HKHC), evaluators considered inputs (resources and investments) that generate costs and savings as well as benefits and harms related to social, economic, environmental, and health-related outcomes in their assessment of 49 HKHC community partnerships funded from 2009 to 2014. Using data collected through individual and group interviews and an online performance monitoring system, evaluators created a socioecological framework to assess investments, resources, costs, savings, benefits, and harms at the individual, organizational, community, and societal levels. Evaluators customized frameworks for 6 focal strategies: active transportation, parks and play spaces, child care physical activity standards, corner stores, farmers' markets, and child care nutrition standards. To illustrate the Value Frameworks, this brief highlights the 38 HKHC communities implementing at least 1 active transportation strategy. Evaluators populated this conceptual Value Framework with themes from the strategy-specific inputs and outputs. The range of factors corresponding to the implementation and impact of the HKHC community partnerships is highlighted along with the inputs and outputs. The Value Frameworks helped evaluators identify gaps in current analysis models (i.e., benefit-cost analysis, cost-effectiveness analysis) as well as paint a more complete picture of value for potential obesity prevention strategies. These frameworks provide a comprehensive understanding of investments needed, proposed costs and savings, and potential benefits and harms associated with economic, social, environmental, and health outcomes. This framing also allowed evaluators to demonstrate the interdependence of each socioecological level on the others in these multicomponent interventions. This model can be used by practitioners and community leaders to assess realistic and sustainable strategies to combat childhood obesity.

  13. Redesigning the Preparation of All Teachers within the Framework of an Integrated Program Model

    ERIC Educational Resources Information Center

    Hardman, Michael L.

    2009-01-01

    It is incumbent on universities to reflect current research on effective teacher preparation and respond to the changing needs of the 21st century. These needs include the knowledge and skills to instruct diverse students; an increasing emphasis on standards and an integrated curriculum model; and the call for all educators to work together to…

  14. An Open Source Extensible Smart Energy Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rankin, Linda

    Aggregated distributed energy resources are the subject of much interest in the energy industry and are expected to play an important role in meeting our future energy needs by changing how we use, distribute and generate electricity. This energy future includes an increased amount of energy from renewable resources, load management techniques to improve resiliency and reliability, and distributed energy storage and generation capabilities that can be managed to meet the needs of the grid as well as individual customers. These energy assets are commonly referred to as Distributed Energy Resources (DER). DERs rely on a means to communicate information between an energy provider and multitudes of devices. Today, DER control systems are typically vendor-specific, using custom hardware and software solutions. As a result, customers are locked into communication transport protocols, applications, tools, and data formats. Today’s systems are often difficult to extend to meet new application requirements, resulting in stranded assets when business requirements or energy management models evolve. By partnering with industry advisors and researchers, a DER research platform called the Smart Energy Framework (SEF) was developed. The hypothesis of this research was that an open source Internet of Things (IoT) framework could play a role in creating a commodity-based ecosystem for DER assets that would reduce costs and provide interoperable products. SEF is based on the AllJoyn™ IoT open source framework. The demonstration system incorporated DER assets, specifically batteries and smart water heaters. To verify the behavior of the distributed system, models of water heaters and batteries were also developed. An IoT interface for communicating between the assets and a control server was defined. This interface supports a series of “events” and telemetry reporting, similar to those defined by current smart grid communication standards. The results of this effort demonstrated the feasibility and application potential of using IoT frameworks for the creation of commodity-based DER systems. All of the identified commodity-based system requirements were met by the AllJoyn framework. By having commodity solutions, small vendors can enter the market and the cost of implementation for all parties is reduced. Utilities and aggregators can choose from multiple interoperable products, reducing the risk of stranded assets. Based on this research it is recommended that interfaces based on existing smart grid communication protocol standards be created for these emerging IoT frameworks. These interfaces should be standardized as part of the IoT framework, allowing for interoperability testing and certification. Similarly, IoT frameworks are introducing application-level security. This type of security is needed for protecting applications and platforms and will be important moving forward. It is also recommended that, along with DER-based data model interfaces, platform and application security requirements be prescribed when IoT devices support DER applications.
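
    The event-and-telemetry interface described above is easy to illustrate in miniature. The Python sketch below shows one plausible shape for such a DER interface; the names (DerAsset, handle_event, report_telemetry) and the event strings are invented for illustration and are not the actual SEF or AllJoyn API.

      # Minimal sketch of a DER event/telemetry interface of the kind SEF defines.
      # All names and event strings are illustrative, not the AllJoyn interface.
      import time
      from dataclasses import dataclass, field

      @dataclass
      class TelemetryReading:
          asset_id: str
          metric: str          # e.g. "power_draw_w", "state_of_charge_pct"
          value: float
          timestamp: float = field(default_factory=time.time)

      class DerAsset:
          """A grid-connected asset (battery, smart water heater) that accepts
          control events and reports telemetry, as in the SEF demonstration."""
          def __init__(self, asset_id: str):
              self.asset_id = asset_id
              self.curtailed = False

          def handle_event(self, event: str) -> None:
              # "Events" mimic smart-grid demand-response signals.
              if event == "shed_load":
                  self.curtailed = True
              elif event == "end_shed":
                  self.curtailed = False

          def report_telemetry(self) -> TelemetryReading:
              level = 20.0 if self.curtailed else 55.0   # stand-in measurement
              return TelemetryReading(self.asset_id, "power_draw_w", level)

      heater = DerAsset("water_heater_01")
      heater.handle_event("shed_load")
      print(heater.report_telemetry())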

  15. Neutrino in standard model and beyond

    NASA Astrophysics Data System (ADS)

    Bilenky, S. M.

    2015-07-01

    After the discovery of the Higgs boson at CERN, the Standard Model acquired the status of the theory of the elementary particles in the electroweak range (up to about 300 GeV). What general conclusions can be inferred from the Standard Model? It appears that the Standard Model teaches us that, in the framework of such general principles as local gauge symmetry, unification of the weak and electromagnetic interactions, and Brout-Englert-Higgs spontaneous breaking of the electroweak symmetry, nature chooses the simplest possibilities. Two-component left-handed massless neutrino fields play a crucial role in determining the charged-current structure of the Standard Model. The absence of right-handed neutrino fields in the Standard Model is the simplest, most economical possibility. In such a scenario, the Majorana mass term is the only possibility for neutrinos to be massive and mixed. Such a mass term is generated by the lepton-number-violating Weinberg effective Lagrangian. In this approach, the three Majorana neutrino masses are suppressed with respect to the masses of the other fundamental fermions by the ratio of the electroweak scale to the scale of lepton-number-violating physics. The discovery of neutrinoless double β-decay and the absence of transitions of flavor neutrinos into sterile states would be evidence in favor of the minimal scenario advocated here.

  16. Whose Choice? Developing a Unifying Ethical Framework for Conscience Laws in Health Care.

    PubMed

    Brown, Benjamin P; Hasselbacher, Lee; Chor, Julie

    2016-08-01

    Since abortion became legal nationwide, federal and state "conscience clauses" have been established to define the context in which health professionals may decline to participate in contested services. Patients and health care providers may act according to conscience in making health care decisions and in deciding whether to abstain from or to participate in contested services. Historically, however, conscience clauses largely have equated conscience in health care with provider abstinence from such services. We propose a framework to analyze the ethical implications of conscience laws. There is a rich literature on the exercise of conscience in the clinical encounter. This essay addresses the need to ensure that policy, too, is grounded in an ethical framework. We argue that the ideal law meets three standards: it protects patients' exercise of conscience, it safeguards health care providers' rights of conscience, and it does not contradict standards of ethical conduct established by professional societies. We have chosen Illinois as a test of our framework because it has one of the nation's broadest conscience clauses and because an amendment to ensure that women receive consistent access to contested services has just passed in the state legislature. Without such an amendment, Illinois law fails all three standards of our framework. If signed by the governor, the amended law will provide protections for patients' positive claims of conscience. We recommend further protections for providers' positive claims as well. Enacting such changes would offer a model for how ethics-based analysis could be applied to similar policies nationwide.

  17. Chemistry in Past and New Science Frameworks and Standards: Gains, Losses, and Missed Opportunities

    ERIC Educational Resources Information Center

    Talanquer, Vicente; Sevian, Hannah

    2014-01-01

    Science education frameworks and standards play a central role in the development of curricula and assessments, as well as in guiding teaching practices in grades K-12. Recently, the National Research Council published a new Framework for K-12 Science Education that has guided the development of the Next Generation Science Standards. In this…

  18. Q and A about the College, Career, and Civic Life (C3) Framework for Social Studies State Standards

    ERIC Educational Resources Information Center

    Herczog, Michelle

    2013-01-01

    The "College, Career, and Civic Life (C3) Framework for Social Studies State Standards: State Guidance for Enhancing the Rigor of K-12 Civics, Economics, Geography, and History" will soon be released. The C3 Framework was developed to serve two audiences: for states to upgrade their state social studies standards, and for…

  19. Framework for industry engagement and quality principles for industry-provided medical education in Europe.

    PubMed

    Allen, Tamara; Donde, Nina; Hofstädter-Thalmann, Eva; Keijser, Sandra; Moy, Veronique; Murama, Jean-Jacques; Kellner, Thomas

    2017-01-01

    Lifelong learning through continuing professional development (CPD) and medical education is critical for healthcare professionals to stay abreast of knowledge and skills and provide an optimal standard of care to patients. In Europe, CPD and medical education are fragmented as there are numerous models, providers and national regulations and a lack of harmonisation of qualitative criteria. There is continued debate on the appropriate role of pharmaceutical companies in the context of medical education. Accrediting bodies such as European Accreditation Council for Continuing Medical Education do not permit active involvement of the pharmaceutical industry due to concerns around conflicts of interest and potential for bias. However, many examples of active collaboration between pharmaceutical companies and medical societies and scientific experts exist, demonstrating high integrity, clear roles and responsibilities, and fair and balanced content. Medical education experts from 16 pharmaceutical companies met to develop a set of quality principles similar to standards that have been established for clinical trials and in alignment with existing principles of accrediting bodies. This paper outlines their proposal for a framework to improve and harmonise medical education quality standards in Europe, and is also an invitation for all stakeholders to join a discussion on this integrative model.

  20. Manual physical therapy: we speak gibberish.

    PubMed

    Flynn, Timothy W; Childs, John D; Bell, Stephania; Magel, Jake S; Rowe, Robert H; Plock, Haideh

    2008-03-01

    In December of 2006, the American Academy of Orthopaedic Manual Physical Therapists (AAOMPT) convened a task force to create a framework for standardizing manual physical therapy procedures. The impetus came from many years of frustration with our ability to precisely communicate to each other, as well as to stakeholders outside our profession. To this end, a contribution titled "A Model for Standardizing Manipulation Terminology In Physical Therapy Practice" is published in this issue of the Journal.

  1. An Air Operations Division Live, Virtual, and Constructive (LVC) Corporate Interoperability Standards Development Strategy

    DTIC Science & Technology

    2011-07-01

    Orlando, Florida, September 2009, 09F-SIW-090. [HLA (2000) - 1] - Modeling and Simulation Standard - High Level Architecture (HLA) – Framework and...Simulation Interoperability Workshop, Orlando, FL, USA, September 2009, 09F-SIW-023. [MaK] - www.mak.com [MIL-STD-3011] - MIL-STD-3011...Spring Simulation Interoperability Workshop, Norfolk, VA, USA, March 2007, 07S-SIW-072. [Ross] - Ross, P. and Clark, P. (2005), “Recommended

  2. Preparing Students for Success in a Multicultural World: Faculty Advisement and Intercultural Communication.

    ERIC Educational Resources Information Center

    Cornett-DeVito, Myrna M.; Reeves, Kenna J.

    1999-01-01

    Summarizes key findings from counseling, advisement, and intercultural communication literature that are associated with multicultural competence, including the academic and modeling role of the advisor. Offers a conceptual framework of standards for developing multicultural communication advisement competence. (Author/DB)

  3. An integrated environmental modeling framework for performing Quantitative Microbial Risk Assessments

    EPA Science Inventory

    Standardized methods are often used to assess the likelihood of a human-health effect from exposure to a specified hazard, and inform opinions and decisions about risk management and communication. A Quantitative Microbial Risk Assessment (QMRA) is specifically adapted to detail ...

  4. Race, Power, and Language Criticism: The Case of Hawai'i

    ERIC Educational Resources Information Center

    Marlow, Mikaela Loyola

    2009-01-01

    Ethnolinguistic vitality, communication accommodation, and markedness model frameworks guided research assessing language ideologies, practices, and criticism among multi-ethnic Locals in the Hawaiian Islands. Results from Study 1 indicated that respondents draw from widespread ideologies that influence them to employ Standard English in…

  5. Guidelines for Guidance Services.

    ERIC Educational Resources Information Center

    Manitoba Dept. of Education and Training, Winnipeg.

    The purpose of this booklet is to provide direction and assistance to school divisions as they develop responsive, effective, and accountable guidance services and programs at the school level. The guidelines presented provide a broad conceptual framework of definitions and goals and outline expectations for service standards. Models and…

  6. Enhanced semantic interoperability by profiling health informatics standards.

    PubMed

    López, Diego M; Blobel, Bernd

    2009-01-01

    Several standards applied to the healthcare domain support semantic interoperability. These standards are far from being completely adopted in health information system development, however. The objective of this paper is to provide a method and suggest the necessary tooling for reusing standard health information models, thereby supporting the development of semantically interoperable systems and components. The approach is based on the definition of UML Profiles. UML profiling is a formal modeling mechanism to specialize reference meta-models in such a way that it is possible to adapt those meta-models to specific platforms or domains. A health information model can be considered such a meta-model. The first step of the introduced method identifies the standard health information models and the tasks in the software development process in which healthcare information models can be reused. Then, the selected information model is formalized as a UML Profile. That Profile is finally applied to system models, annotating them with the semantics of the information model. The approach is supported by Eclipse-based UML modeling tools. The method is integrated into a comprehensive framework for health information systems development, and the feasibility of the approach is demonstrated in the analysis, design, and implementation of a public health surveillance system, reusing HL7 RIM and DIMs specifications. The paper describes a method and the necessary tooling for reusing standard healthcare information models. UML offers several advantages, such as tooling support, graphical notation, exchangeability, extensibility, semi-automatic code generation, etc. The approach presented is also applicable for harmonizing different standard specifications.
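
    The profile mechanism is easier to see with a toy analogue. The Python sketch below mimics applying a UML stereotype to a system-model class; the decorator, the stereotype name, and the tagged values are illustrative stand-ins, not the Eclipse UML tooling or an actual HL7 RIM profile.

      # Rough Python analogue of applying a UML Profile: a system-model class is
      # annotated ("stereotyped") with semantics drawn from a standard
      # information model. All names here are illustrative only.
      def stereotype(name: str, **tagged_values):
          """Attach profile metadata to a class, as a UML stereotype would."""
          def apply(cls):
              cls.__stereotype__ = {"name": name, **tagged_values}
              return cls
          return apply

      @stereotype("RIM::Observation", mood_code="EVN")
      class LabResult:
          def __init__(self, code: str, value: float):
              self.code, self.value = code, value

      print(LabResult.__stereotype__)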

  7. Standard model light-by-light scattering in SANC: Analytic and numeric evaluation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bardin, D. Yu., E-mail: bardin@nu.jinr.ru; Kalinovskaya, L. V., E-mail: kalinov@nu.jinr.ru; Uglov, E. D., E-mail: corner@nu.jinr.r

    2010-11-15

    The implementation of the Standard Model process γγ → γγ through fermion and boson loops in the framework of the SANC system, together with the additional precomputation modules used for the calculation of massive box diagrams, is described. The computation of this process takes into account the nonzero mass of the loop particles. The covariant and helicity amplitudes for this process, some particular cases of the D₀ and C₀ Passarino-Veltman functions, and numerical results of the corresponding SANC module evaluation are presented. Whenever possible, the results are compared with those existing in the literature.

  8. Production of a Scalar Boson and a Fermion Pair in Arbitrarily Polarized e-e+ Beams

    NASA Astrophysics Data System (ADS)

    Abdullayev, S. K.; Gojayev, M. Sh.; Nasibova, N. A.

    2018-05-01

    Within the framework of the Standard Model (Minimal Supersymmetric Standard Model) we consider the production of the scalar boson H_SM (h, H) and a fermion pair ff̄ in arbitrarily polarized, counterpropagating electron-positron beams: e-e+ ⇒ H_SM (h, H) ff̄. Characteristic features of the behavior of the cross sections and polarization characteristics (right-left spin asymmetry, degree of longitudinal polarization of the fermion, and transverse spin asymmetry) are investigated and elucidated as functions of the energy of the electron-positron beams and the mass of the scalar boson.

  9. `Models of' versus `Models for'. Toward an Agent-Based Conception of Modeling in the Science Classroom

    NASA Astrophysics Data System (ADS)

    Gouvea, Julia; Passmore, Cynthia

    2017-03-01

    The inclusion of the practice of "developing and using models" in the Framework for K-12 Science Education and in the Next Generation Science Standards provides an opportunity for educators to examine the role this practice plays in science and how it can be leveraged in a science classroom. Drawing on conceptions of models in the philosophy of science, we bring forward an agent-based account of models and discuss the implications of this view for enacting modeling in science classrooms. Models, according to this account, can only be understood with respect to the aims and intentions of a cognitive agent (models for), not solely in terms of how they represent phenomena in the world (models of). We present this contrast as a heuristic—models of versus models for—that can be used to help educators notice and interpret how models are positioned in standards, curriculum, and classrooms.

  10. Data Convergence - An Australian Perspective

    NASA Astrophysics Data System (ADS)

    Allen, S. S.; Howell, B.

    2012-12-01

    Coupled numerical physical, biogeochemical, and sediment models are increasingly being used as integrators to help understand the cumulative or far-field effects of change in the coastal environment. This reliance on modeling has forced observations to be delivered as data streams ingestible by modeling frameworks. This has made it easier to create near-real-time or forecasting models than to recreate the past, and has led in turn to the conversion of historical data into data streams so that they can be ingested by the same frameworks. The model and observation frameworks under development within Australia's Commonwealth Scientific and Industrial Research Organisation (CSIRO) are now feeding into the Australian Ocean Data Network's (AODN's) MARine Virtual Laboratory (MARVL). The sensor, or data stream, brokering solution is centred around the "message", and all data flowing through the gateway are wrapped as messages. Messages consist of a topic and a data object, and their routing through the gateway to pre-processors and listeners is determined by the topic. The Sensor Message Gateway (SMG) method allows data from different sensors measuring the same thing but with different temporal resolutions, units, or spatial coverage to be ingested or visualized seamlessly. At the same time, exposing model output as a virtual sensor is being explored, again enabled by the SMG. Rigorous adherence to standards is needed only for two-way communication with sensors; by accepting existing data in less-than-ideal formats but exposing them through the SMG, we can move a step closer to the Internet of Things by creating an Internet of Industries, where each vested interest can continue with business as usual, contribute to data convergence, and adopt more open standards when investment seems appropriate to that sector or business.
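
    A toy version of the topic-routed message brokering described above can be written in a few lines. The class and method names in the sketch below are invented for illustration and are not CSIRO's actual SMG API.

      # Toy topic-routed message gateway in the spirit of the SMG:
      # a message is (topic, data object); listeners subscribe by topic prefix.
      from collections import defaultdict

      class SensorMessageGateway:
          def __init__(self):
              self._listeners = defaultdict(list)

          def subscribe(self, topic_prefix, callback):
              self._listeners[topic_prefix].append(callback)

          def publish(self, topic, data):
              # Route to every listener whose prefix matches the topic.
              for prefix, callbacks in self._listeners.items():
                  if topic.startswith(prefix):
                      for cb in callbacks:
                          cb(topic, data)

      gateway = SensorMessageGateway()
      gateway.subscribe("obs/temperature", lambda t, d: print(t, d))
      # A model exposed as a "virtual sensor" publishes through the same gateway.
      gateway.publish("obs/temperature/mooring12", {"value_c": 18.4, "depth_m": 5})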

  11. Improved estimate for the muon g-2 using VMD constraints

    NASA Astrophysics Data System (ADS)

    Benayoun, M.

    2012-04-01

    The muon anomalous magnetic moment aμ and the hadronic vacuum polarization (HVP) are examined using data analyzed within the framework of a suitably broken HLS model. The analysis relies on all available scan data samples and leaves aside the existing ISR data. The framework provided by our broken HLS model allows for improved estimates of the contributions to aμ from the e+e- annihilation cross sections into π+π-, π0γ, ηγ, π+π-π0, K+K-, and K0K̄0 up to slightly above the ϕ meson mass. Within this framework, the information provided by the τ± → π±π0ν decay and by the radiative decays (VPγ and Pγγ) of light-flavor mesons serves as a strong constraint on the model parameters. The discrepancy between the theoretical estimate of the muon anomalous magnetic moment g-2 and its direct BNL measurement is shown to reach, conservatively, 4.1σ, while standard methods used under the same conditions yield 3.5σ.

  12. Industrial Education. Vocational Education Program Courses Standards.

    ERIC Educational Resources Information Center

    Florida State Dept. of Education, Tallahassee. Div. of Vocational, Adult, and Community Education.

    This document contains vocational education program courses standards (curriculum frameworks and student performance standards) for exploratory courses, practical arts courses, and job preparatory programs offered at the secondary or postsecondary level in Florida. Each program courses standard is composed of two parts: a curriculum framework and…

  13. Frameworks for Assessing the Quality of Modeling and Simulation Capabilities

    NASA Astrophysics Data System (ADS)

    Rider, W. J.

    2012-12-01

    The importance of assuring quality in modeling and simulation has spawned several frameworks for structuring the examination of quality. The format and content of these frameworks provide an emphasis, completeness, and flow to assessment activities. I will examine four frameworks that have been developed and describe how they can be improved and applied to a broader set of high-consequence applications. Perhaps the first of these frameworks was CSAU (code scaling, applicability and uncertainty) [Boyack], used for nuclear reactor safety and endorsed by the United States Nuclear Regulatory Commission (USNRC). This framework was shaped by nuclear safety practice and the practical structure needed after the Three Mile Island accident. It incorporated the dominant experimental program, the dominant analysis approach, and concerns about the quality of modeling. The USNRC gave it the force of law, which made the nuclear industry take it seriously. After the cessation of nuclear weapons testing, the United States began a program of examining the reliability of these weapons without testing. This program utilizes science including theory, modeling, simulation, and experimentation to replace underground testing. The emphasis on modeling and simulation necessitated attention to the quality of these simulations. Sandia developed the PCMM (predictive capability maturity model) to structure this attention [Oberkampf]. PCMM divides simulation into six core activities to be examined and graded relative to the needs of the modeling activity. NASA has built yet another framework in response to the tragedy of the space shuttle accidents [NASA]. Finally, Ben-Haim and Hemez focus upon modeling robustness and predictive fidelity in another approach. These frameworks are similar and are applied in a similar fashion. The adoption of these frameworks at Sandia and NASA has been slow and arduous because the force of law has not assisted acceptance. All existing frameworks are incomplete and need to be extended, incorporating elements from one another as well as new elements related to how models are solved and how the model will be applied. I will describe this merger of approaches and how it should be applied. The problems in adoption are related to basic human nature: no one likes to be graded or told they are not sufficiently quality-oriented. Rather than engage in an adversarial role, I suggest that the frameworks be viewed as a collaborative tool, used to structure collaborations that help modeling and simulation efforts achieve high quality. The framework provides a comprehensive set of modeling and simulation themes that should be explored in providing high quality. W. Oberkampf, M. Pilch, and T. Trucano, Predictive Capability Maturity Model for Computational Modeling and Simulation, SAND2007-5948, 2007. B. Boyack, Quantifying Reactor Safety Margins Part 1: An Overview of the Code Scaling, Applicability, and Uncertainty Evaluation Methodology, Nuc. Eng. Design, 119, pp. 1-15, 1990. National Aeronautics and Space Administration, Standard for Models and Simulations, NASA-STD-7009, 2008. Y. Ben-Haim and F. Hemez, Robustness, fidelity and prediction-looseness of models, Proc. R. Soc. A (2012) 468, 227-244.

  14. Framework for Architecture Trade Study Using MBSE and Performance Simulation

    NASA Technical Reports Server (NTRS)

    Ryan, Jessica; Sarkani, Shahram; Mazzuchi, Thomas

    2012-01-01

    Increasing complexity in modern systems, as well as cost and schedule constraints, requires a new paradigm of systems engineering to fulfill stakeholder needs. Challenges facing efficient trade studies include poor tool interoperability, lack of simulation coordination (design parameters), and requirements flowdown. A recent trend toward Model-Based Systems Engineering (MBSE) includes flexible architecture definition, program documentation, requirements traceability, and systems engineering reuse. As a new domain, MBSE still lacks governing standards and commonly accepted frameworks. This paper proposes a framework for efficient architecture definition using MBSE in conjunction with domain-specific simulation to evaluate trade studies. A general framework is provided, followed by a specific example including a method for designing a trade study, defining candidate architectures, planning simulations to fulfill requirements, and finally a weighted decision analysis to optimize system objectives.
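
    The final step, a weighted decision analysis, reduces to a small matrix computation. The sketch below shows the conventional weighted-sum form; the criteria, weights, and scores are invented for illustration and are not the paper's case-study values.

      # Minimal weighted-sum decision matrix of the kind the trade-study step
      # implies; architectures, criteria, weights and scores are invented.
      import numpy as np

      criteria = ["cost", "performance", "risk"]
      weights = np.array([0.3, 0.5, 0.2])          # must sum to 1
      # Rows: candidate architectures; columns: normalized scores in [0, 1].
      scores = np.array([
          [0.8, 0.6, 0.7],   # Architecture A
          [0.5, 0.9, 0.6],   # Architecture B
          [0.7, 0.7, 0.9],   # Architecture C
      ])
      totals = scores @ weights
      best = int(np.argmax(totals))
      print(f"weighted totals: {totals.round(3)}; best = architecture {best}")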

  15. Defining and Assessing Quality Improvement Outcomes: A Framework for Public Health

    PubMed Central

    Nawaz, Saira; Thomas, Craig; Young, Andrea

    2015-01-01

    We describe an evidence-based framework to define and assess the impact of quality improvement (QI) in public health. Developed to address programmatic and research-identified needs for articulating the value of public health QI in aggregate, this framework proposes a standardized set of measures to monitor and improve the efficiency and effectiveness of public health programs and operations. We reviewed the scientific literature and analyzed QI initiatives implemented through the Centers for Disease Control and Prevention’s National Public Health Improvement Initiative to inform the selection of 5 efficiency and 8 effectiveness measures. This framework provides a model for identifying the types of improvement outcomes targeted by public health QI efforts and a means to understand QI’s impact on the practice of public health. PMID:25689185

  16. Framework for Establishment of a Comprehensive and Standardized Administration System for Prevention and Control of Tuberculosis in College Student Community in China.

    PubMed

    Zhang, Shaoru; Li, Xiaohong; Zhang, Tianhua; Wang, Xiangni; Liu, Weiping; Ma, Xuexue; Li, Yuelu; Fan, Yahui

    2016-10-01

    The college student community is at high risk of tuberculosis (TB). A systematic and standardized administration model for TB prevention and control is significant for controlling the spread of TB in universities. Currently, universities in China have not established a comprehensive and standardized administration system for TB prevention and control in the college student community. Firstly, literature research and the brainstorming method (n=13) were used to construct the clause and sub-clause pool for the administration of TB prevention and control within the college student community in 2014. Secondly, a total of twenty experts in the field of TB prevention and control, representing the east, west, south, and north of China, were selected and invited to participate in the Delphi letter inquiry. After two rounds of letter inquiry, the opinions of the experts reached a consensus and the framework for the administration system was constructed. The framework included 8 first-class indexes, 26 second-class indexes, and 104 third-class indexes. The results are highly scientific and reliable, and can help improve the systematic and standardized administration of TB prevention and control in universities in China, and perhaps in other developing countries with a high TB burden as well.

  17. Framework for Establishment of a Comprehensive and Standardized Administration System for Prevention and Control of Tuberculosis in College Student Community in China

    PubMed Central

    ZHANG, Shaoru; LI, Xiaohong; ZHANG, Tianhua; WANG, Xiangni; LIU, Weiping; MA, Xuexue; LI, Yuelu; FAN, Yahui

    2016-01-01

    Background: The college student community is at high risk of tuberculosis (TB). A systematic and standardized administration model for TB prevention and control is significant for controlling the spread of TB in universities. Currently, universities in China have not established a comprehensive and standardized administration system for TB prevention and control in the college student community. Methods: Firstly, literature research and the brainstorming method (n=13) were used to construct the clause and sub-clause pool for the administration of TB prevention and control within the college student community in 2014. Secondly, a total of twenty experts in the field of TB prevention and control, representing the east, west, south, and north of China, were selected and invited to participate in the Delphi letter inquiry. After two rounds of letter inquiry, the opinions of the experts reached a consensus and the framework for the administration system was constructed. Results: A framework for the administration system was constructed, which included 8 first-class indexes, 26 second-class indexes, and 104 third-class indexes. Conclusion: The results are highly scientific and reliable, and can help improve the systematic and standardized administration of TB prevention and control in universities in China, and perhaps in other developing countries with a high TB burden as well. PMID:27957436

  18. A nonlinear autoregressive Volterra model of the Hodgkin-Huxley equations.

    PubMed

    Eikenberry, Steffen E; Marmarelis, Vasilis Z

    2013-02-01

    We propose a new variant of the Volterra-type model with a nonlinear autoregressive (NAR) component that is a suitable framework for describing the process of action potential (AP) generation by the neuron membrane potential, and we apply it to input-output data generated by the Hodgkin-Huxley (H-H) equations. Volterra models use a functional series expansion to describe the input-output relation for most nonlinear dynamic systems and are applicable to a wide range of physiologic systems. It is difficult, however, to apply the Volterra methodology to the H-H model because it is characterized by distinct subthreshold and suprathreshold dynamics. When threshold is crossed, an autonomous AP is generated, the output becomes temporarily decoupled from the input, and the standard Volterra model fails. Therefore, in our framework, whenever the membrane potential exceeds some threshold, it is taken as a second input to a dual-input Volterra model. This model correctly predicts membrane voltage deflection both within the subthreshold region and during APs. Moreover, the model naturally generates a post-AP afterpotential and refractory period. It is known that the H-H model converges to a limit cycle in response to a constant current injection. This behavior is correctly predicted by the proposed model, while the standard Volterra model is incapable of generating such limit-cycle behavior. The inclusion of cross-kernels, which describe the nonlinear interactions between the exogenous and autoregressive inputs, is found to be absolutely necessary. The proposed model is general, non-parametric, and data-derived.
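
    The threshold-gated feedback idea can be sketched numerically. The code below is a minimal first-order caricature of the dual-input scheme, not the paper's estimated Volterra kernels: when past output exceeds a threshold, that suprathreshold history is fed back through a second kernel.

      # First-order sketch of the dual-input idea: suprathreshold output is fed
      # back as a second input. All kernels and the threshold are invented.
      import numpy as np

      def dual_input_predict(x, k_x, k_y, threshold=1.0):
          """x: exogenous input; k_x, k_y: first-order kernels for the
          exogenous and autoregressive (suprathreshold feedback) inputs."""
          n, m = len(x), len(k_x)
          y = np.zeros(n)
          for t in range(n):
              past_x = x[max(0, t - m + 1):t + 1][::-1]   # x_t, x_{t-1}, ...
              y[t] = np.dot(k_x[:len(past_x)], past_x)
              # Autoregressive branch: only suprathreshold history feeds back.
              past_y = y[max(0, t - m):t][::-1]           # y_{t-1}, y_{t-2}, ...
              supra = np.where(past_y > threshold, past_y, 0.0)
              y[t] += np.dot(k_y[:len(supra)], supra)
          return y

      x = np.r_[np.zeros(5), np.ones(40)]             # step "current" input
      k_x = 0.4 * np.exp(-np.arange(10) / 3.0)        # decaying input kernel
      k_y = -0.6 * np.exp(-np.arange(10) / 2.0)       # refractory-like feedback
      print(dual_input_predict(x, k_x, k_y)[:10].round(3))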

  19. Architecting a Simulation Framework for Model Rehosting

    NASA Technical Reports Server (NTRS)

    Madden, Michael M.

    2004-01-01

    The utility of vehicle math models extends beyond human-in-the-loop simulation. It is desirable to deploy a given model across a multitude of applications that target design, analysis, and research. However, the vehicle model alone represents an incomplete simulation. One must also replicate the environment models (e.g., atmosphere, gravity, terrain) to achieve identical vehicle behavior across all applications. Environment models are increasing in complexity and represent a substantial investment to re-engineer for a new application. A software component that can be rehosted in each application is one solution to the deployment problem. The component must encapsulate both the vehicle and environment models. The component must have a well-defined interface that abstracts the bulk of the logic to operate the models. This paper examines the characteristics of a rehostable modeling component from the perspective of a human-in-the-loop simulation framework. The Langley Standard Real-Time Simulation in C++ (LaSRS++) is used as an example. LaSRS++ was recently redesigned to transform its modeling package into a rehostable component.
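
    A rehostable component of this kind reduces to a small control interface that encapsulates the vehicle together with its environment models. The sketch below is a Python analogue offered only for illustration; LaSRS++ itself is C++, and its actual interface is not reproduced here.

      # Illustrative control interface for a rehostable modeling component:
      # the component bundles vehicle and environment models so every host
      # application reproduces identical behavior. Names are invented.
      from abc import ABC, abstractmethod

      class RehostableModel(ABC):
          @abstractmethod
          def initialize(self, config: dict) -> None: ...
          @abstractmethod
          def advance(self, dt: float) -> None: ...
          @abstractmethod
          def finalize(self) -> None: ...

      class VehicleWithEnvironment(RehostableModel):
          def initialize(self, config):
              self.altitude = config.get("altitude_m", 0.0)
              self.vertical_speed = config.get("vertical_speed_ms", 0.0)

          def advance(self, dt):
              g = 9.80665                    # stand-in gravity environment model
              self.vertical_speed -= g * dt
              self.altitude += self.vertical_speed * dt

          def finalize(self):
              print(f"final altitude: {self.altitude:.1f} m")

      model = VehicleWithEnvironment()
      model.initialize({"altitude_m": 1000.0})
      for _ in range(100):                   # the host drives the time loop
          model.advance(0.1)
      model.finalize()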

  20. Building Assured Systems Framework

    DTIC Science & Technology

    2010-09-01

    of standards such as ISO 27001 as frameworks [NASCIO 2009]. In this context, a framework is a standard intended to assist in auditing and compliance...Information Security ISO/IEC 27004 Information technology – Security techniques – Information security management measurement ISO/IEC 15939, System and

  1. Health Occupations Education. Vocational Education Program Courses Standards.

    ERIC Educational Resources Information Center

    Florida State Dept. of Education, Tallahassee. Div. of Vocational, Adult, and Community Education.

    This document contains vocational education program courses standards for exploratory courses, practical arts courses, and job preparatory programs offered at the secondary or postsecondary level. Each program standard is composed of two parts: a curriculum framework and student performance standards. The curriculum framework includes four major…

  2. Assessing Concentrations and Health Impacts of Air Quality Management Strategies: Framework for Rapid Emissions Scenario and Health impact ESTimation (FRESH-EST)

    PubMed Central

    Milando, Chad W.; Martenies, Sheena E.; Batterman, Stuart A.

    2017-01-01

    In air quality management, reducing emissions from pollutant sources often forms the primary response to attaining air quality standards and guidelines. Despite the broad success of air quality management in the US, challenges remain. As examples: allocating emissions reductions among multiple sources is complex and can require many rounds of negotiation; health impacts associated with emissions, the ultimate driver for the standards, are not explicitly assessed; and long dispersion model run-times, which result from the increasing size and complexity of model inputs, limit the number of scenarios that can be evaluated, thus increasing the likelihood of missing an optimal strategy. A new modeling framework, called the "Framework for Rapid Emissions Scenario and Health impact ESTimation" (FRESH-EST), is presented to respond to these challenges. FRESH-EST estimates concentrations and health impacts of alternative emissions scenarios at the urban scale, providing efficient computations from emissions to health impacts at the Census block or other desired spatial scale. In addition, FRESH-EST can optimize emission reductions to meet specified environmental and health constraints, and a convenient user interface and graphical displays are provided to facilitate scenario evaluation. The new framework is demonstrated in an SO2 non-attainment area in southeast Michigan with two optimization strategies: the first minimizes emission reductions needed to achieve a target concentration; the second minimizes concentrations while holding constant the cumulative emissions across local sources (e.g., an emissions floor). The optimized strategies match outcomes in the proposed SO2 State Implementation Plan without the proposed stack parameter modifications or shutdowns. In addition, the lower health impacts estimated for these strategies suggest the potential for FRESH-EST to identify pollution control alternatives for air quality management planning. PMID:27318620
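
    The first optimization strategy has a natural linear-programming form if, as is common in such screening tools, a linear source-receptor relationship is assumed. The sketch below minimizes total emission reductions subject to a concentration target; the transfer coefficients, baseline, target, and bounds are invented numbers, not the case-study inputs.

      # Hedged sketch of "minimize emission reductions subject to a
      # concentration target" under an assumed linear source-receptor model.
      import numpy as np
      from scipy.optimize import linprog

      transfer = np.array([0.8, 0.5, 0.2])     # ug/m3 per ton reduced, per source
      baseline_conc = 75.0                     # ug/m3 at the critical receptor
      target_conc = 60.0                       # standard to attain
      max_reduction = np.array([30.0, 40.0, 25.0])   # tons available per source

      # minimize sum(r)  s.t.  transfer . r >= baseline - target,  0 <= r <= max
      res = linprog(c=np.ones(3),
                    A_ub=[-transfer], b_ub=[-(baseline_conc - target_conc)],
                    bounds=list(zip(np.zeros(3), max_reduction)))
      print("reductions per source (tons):", res.x.round(2))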

  3. Leadership Influence: A Core Foundation for Advocacy.

    PubMed

    Shillam, Casey R; MacLean, Lola

    As the largest segment of the health care workforce, nurses have the greatest potential for advancing systems and services to improve health care delivery in the United States. This article presents a framework for nurse administrators to use in developing direct care nurses' leadership influence competency as a means of increasing their advocacy potential. A systematic review resulted in a nurse leadership influence framework based on the Kouzes and Posner leadership model. The framework incorporates leadership competencies identified by nursing professional organizations and was validated by 2 national nurse leader focus groups. Nurse administrators have the opportunity to adopt an evidence-based leadership influence framework to ensure development of advocacy competency in direct care nurses. Systematic adoption of a standardized leadership influence framework by nurse administrators will set a strong foundation for nurse advocacy, and successful long-term impacts will result in nurses skillfully integrating leadership influence and advocacy into all aspects of daily practice.

  4. Theoretical uncertainties in the calculation of supersymmetric dark matter observables

    NASA Astrophysics Data System (ADS)

    Bergeron, Paul; Sandick, Pearl; Sinha, Kuver

    2018-05-01

    We estimate the current theoretical uncertainty in supersymmetric dark matter predictions by comparing several state-of-the-art calculations within the minimal supersymmetric standard model (MSSM). We consider standard neutralino dark matter scenarios — coannihilation, well-tempering, pseudoscalar resonance — and benchmark models both in the pMSSM framework and in frameworks with Grand Unified Theory (GUT)-scale unification of supersymmetric mass parameters. The pipelines we consider are constructed from the publicly available software packages SOFTSUSY, SPheno, FeynHiggs, SusyHD, micrOMEGAs, and DarkSUSY. We find that the theoretical uncertainty in the relic density as calculated by different pipelines, in general, far exceeds the statistical errors reported by the Planck collaboration. In GUT models, in particular, the relative discrepancies in the results reported by different pipelines can be as much as a few orders of magnitude. We find that these discrepancies are especially pronounced for cases where the dark matter physics relies critically on calculations related to electroweak symmetry breaking, which we investigate in detail, and for coannihilation models, where there is heightened sensitivity to the sparticle spectrum. The dark matter annihilation cross section today and the scattering cross section with nuclei also suffer appreciable theoretical uncertainties, which, as experiments reach the relevant sensitivities, could lead to uncertainty in conclusions regarding the viability or exclusion of particular models.

  5. Biologically driven neural platform invoking parallel electrophoretic separation and urinary metabolite screening.

    PubMed

    Page, Tessa; Nguyen, Huong Thi Huynh; Hilts, Lindsey; Ramos, Lorena; Hanrahan, Grady

    2012-06-01

    This work reveals a computational framework for parallel electrophoretic separation of complex biological macromolecules and model urinary metabolites. More specifically, the implementation of a particle swarm optimization (PSO) algorithm on a neural network platform for multiparameter optimization of multiplexed 24-capillary electrophoresis technology with UV detection is highlighted. Two experimental systems were examined: (1) separation of purified rabbit metallothioneins and (2) separation of model toluene urinary metabolites and selected organic acids. Results proved superior to the use of neural networks employing standard back propagation when examining training error, fitting response, and predictive abilities. Simulation runs were obtained as a result of metaheuristic examination of the global search space with experimental responses in good agreement with predicted values. Full separation of selected analytes was realized after employing optimal model conditions. This framework provides guidance for the application of metaheuristic computational tools to aid in future studies involving parallel chemical separation and screening. Adaptable pseudo-code is provided to enable users of varied software packages and modeling framework to implement the PSO algorithm for their desired use.
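
    For readers unfamiliar with the optimizer, the sketch below is a bare-bones particle swarm: a population of candidate parameter vectors is pulled toward personal and global bests. The quadratic objective stands in for the real separation-quality response surface, and the PSO constants are conventional defaults, not the paper's settings.

      # Bare-bones particle swarm optimization; the 2-D quadratic objective is a
      # stand-in for the multiplexed-CE response surface tuned in the paper.
      import numpy as np

      rng = np.random.default_rng(0)

      def objective(p):                      # stand-in for separation quality
          return np.sum((p - np.array([2.0, -1.0])) ** 2, axis=1)

      n, dim, iters = 20, 2, 100
      w, c1, c2 = 0.7, 1.5, 1.5              # inertia, cognitive, social weights
      pos = rng.uniform(-5, 5, (n, dim))
      vel = np.zeros((n, dim))
      pbest, pbest_val = pos.copy(), objective(pos)
      gbest = pbest[np.argmin(pbest_val)]

      for _ in range(iters):
          r1, r2 = rng.random((n, dim)), rng.random((n, dim))
          vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
          pos += vel
          val = objective(pos)
          improved = val < pbest_val
          pbest[improved], pbest_val[improved] = pos[improved], val[improved]
          gbest = pbest[np.argmin(pbest_val)]

      print("optimum found near:", gbest.round(3))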

  6. Development and application of a real-time testbed for multiagent system interoperability: A case study on hierarchical microgrid control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cintuglu, Mehmet Hazar; Youssef, Tarek; Mohammed, Osama A.

    This article presents the development and application of a real-time testbed for multiagent system interoperability. As utility-independent private microgrids are constantly being installed, standardized interoperability frameworks are required to define behavioral models of the individual agents for expandability and plug-and-play operation. In this paper, we propose a comprehensive hybrid agent framework combining the Foundation for Intelligent Physical Agents (FIPA), IEC 61850, and Data Distribution Service (DDS) standards. The IEC 61850 logical node concept is extended using FIPA-based agent communication language (ACL) with application-specific attributes and deliberative behavior modeling capability. The DDS middleware is adopted to enable a real-time publisher-subscriber interoperability mechanism between platforms. The proposed multi-agent framework was validated in a laboratory-based testbed involving developed intelligent electronic device (IED) prototypes and actual microgrid setups. Experimental results were demonstrated for both decentralized and distributed control approaches. Secondary and tertiary control levels of a microgrid were demonstrated for the decentralized hierarchical control case study. A consensus-based economic dispatch case study was demonstrated as a distributed control example. Finally, it was shown that the developed agent platform is industrially applicable for actual smart grid field deployment.

  7. Development and application of a real-time testbed for multiagent system interoperability: A case study on hierarchical microgrid control

    DOE PAGES

    Cintuglu, Mehmet Hazar; Youssef, Tarek; Mohammed, Osama A.

    2016-08-10

    This article presents the development and application of a real-time testbed for multiagent system interoperability. As utility-independent private microgrids are constantly being installed, standardized interoperability frameworks are required to define behavioral models of the individual agents for expandability and plug-and-play operation. In this paper, we propose a comprehensive hybrid agent framework combining the Foundation for Intelligent Physical Agents (FIPA), IEC 61850, and Data Distribution Service (DDS) standards. The IEC 61850 logical node concept is extended using FIPA-based agent communication language (ACL) with application-specific attributes and deliberative behavior modeling capability. The DDS middleware is adopted to enable a real-time publisher-subscriber interoperability mechanism between platforms. The proposed multi-agent framework was validated in a laboratory-based testbed involving developed intelligent electronic device (IED) prototypes and actual microgrid setups. Experimental results were demonstrated for both decentralized and distributed control approaches. Secondary and tertiary control levels of a microgrid were demonstrated for the decentralized hierarchical control case study. A consensus-based economic dispatch case study was demonstrated as a distributed control example. Finally, it was shown that the developed agent platform is industrially applicable for actual smart grid field deployment.
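
    The flavor of the hybrid messaging can be sketched compactly. Below, a FIPA-ACL-style message carries an IEC 61850 measurement reference (MMXU is the 61850 measurement logical node); the agent names and payload are invented for illustration, and a real deployment would publish through DDS rather than print JSON.

      # Illustrative FIPA-ACL-style message wrapping an IEC 61850-style
      # logical-node reference; field names follow FIPA ACL conventions,
      # while the agent names and payload are invented.
      import json
      from dataclasses import dataclass, asdict

      @dataclass
      class AclMessage:
          performative: str      # e.g. "inform", "request", "propose"
          sender: str
          receiver: str
          ontology: str          # here: an IEC 61850 logical-node reference
          content: dict

      msg = AclMessage(
          performative="inform",
          sender="feeder_agent_1",
          receiver="microgrid_controller",
          ontology="MMXU1.TotW",             # 61850 total active power object
          content={"total_active_power_kw": 412.7},
      )
      # A DDS-like middleware would publish this on a topic; serialized here:
      print(json.dumps(asdict(msg), indent=2))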

  8. Learning intervention-induced deformations for non-rigid MR-CT registration and electrode localization in epilepsy patients

    PubMed Central

    Onofrey, John A.; Staib, Lawrence H.; Papademetris, Xenophon

    2015-01-01

    This paper describes a framework for learning a statistical model of non-rigid deformations induced by interventional procedures. We make use of this learned model to perform constrained non-rigid registration of pre-procedural and post-procedural imaging. We demonstrate results applying this framework to non-rigidly register post-surgical computed tomography (CT) brain images to pre-surgical magnetic resonance images (MRIs) of epilepsy patients who had intra-cranial electroencephalography electrodes surgically implanted. Deformations caused by this surgical procedure, imaging artifacts caused by the electrodes, and the use of multi-modal imaging data make non-rigid registration challenging. Our results show that the use of our proposed framework to constrain the non-rigid registration process results in significantly improved and more robust registration performance compared to using standard rigid and non-rigid registration methods. PMID:26900569
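
    One common way to realize such a learned deformation model is principal component analysis over training displacement fields, with candidate deformations constrained to the span of the leading modes. The sketch below illustrates that general pattern on invented random stand-in data; it is not the authors' specific formulation.

      # Schematic "learn a statistical deformation model, then constrain to it":
      # PCA over training displacement fields, new deformations projected onto
      # the leading modes. Training data are random stand-ins.
      import numpy as np

      rng = np.random.default_rng(1)
      n_cases, n_points = 12, 300
      train = rng.normal(size=(n_cases, n_points))  # flattened displacement fields

      mean = train.mean(axis=0)
      U, S, Vt = np.linalg.svd(train - mean, full_matrices=False)
      modes = Vt[:3]                                # leading deformation modes

      def project_to_model(deformation):
          """Constrain an arbitrary deformation to the learned subspace."""
          coeffs = modes @ (deformation - mean)
          return mean + coeffs @ modes

      new_def = rng.normal(size=n_points)
      constrained = project_to_model(new_def)
      print("residual norm after constraining:",
            round(float(np.linalg.norm(new_def - constrained)), 2))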

  9. Design and architecture of the Mars relay network planning and analysis framework

    NASA Technical Reports Server (NTRS)

    Cheung, K. M.; Lee, C. H.

    2002-01-01

    In this paper, we describe the design and architecture of the Mars Network planning and analysis framework that supports the generation and validation of efficient planning and scheduling strategies. The goals are to minimize the transmission time, minimize the delay time, and/or maximize the network throughput. The proposed framework would require (1) a client-server architecture to support interactive, batch, Web, and distributed analysis and planning applications for the relay network analysis scheme; (2) a high-fidelity modeling and simulation environment that expresses link capabilities between spacecraft, and between spacecraft and Earth stations, as time-varying resources, and spacecraft activities, link priority, Solar System dynamic events, the laws of orbital mechanics, and other limiting factors such as spacecraft power and thermal constraints; and (3) an optimization methodology that casts the resource and constraint models into a standard linear and nonlinear constrained optimization problem that lends itself to commercial off-the-shelf (COTS) planning and scheduling algorithms.
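
    The scheduling core of item (3) can be miniaturized as weighted interval scheduling: choose non-overlapping relay passes to maximize returned data volume. The sketch below uses a simple dynamic program; the pass windows and data volumes are invented, whereas the real framework derives them from orbital mechanics and link budgets.

      # Toy relay-pass scheduler: pick non-overlapping link windows to maximize
      # data volume, via dynamic programming over windows sorted by end time.
      def best_schedule(windows):
          """windows: list of (start, end, megabits)."""
          windows = sorted(windows, key=lambda w: w[1])
          best = [(0.0, [])]                  # (volume, chosen windows so far)
          for s, e, mb in windows:
              # Best prior solution whose last chosen pass ends by this start.
              compatible = max((v for v in best if not v[1] or v[1][-1][1] <= s),
                               key=lambda v: v[0])
              take = (compatible[0] + mb, compatible[1] + [(s, e, mb)])
              best.append(max(take, best[-1], key=lambda v: v[0]))
          return best[-1]

      passes = [(0, 2, 80), (1, 4, 150), (3, 6, 90), (5, 8, 120)]
      volume, chosen = best_schedule(passes)
      print(f"max volume {volume} Mb via passes {chosen}")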

  10. A Liver-Centric Multiscale Modeling Framework for Xenobiotics.

    PubMed

    Sluka, James P; Fu, Xiao; Swat, Maciej; Belmonte, Julio M; Cosmanescu, Alin; Clendenon, Sherry G; Wambaugh, John F; Glazier, James A

    2016-01-01

    We describe a multi-scale, liver-centric in silico modeling framework for acetaminophen pharmacology and metabolism. We focus on a computational model to characterize whole body uptake and clearance, liver transport and phase I and phase II metabolism. We do this by incorporating sub-models that span three scales: Physiologically Based Pharmacokinetic (PBPK) modeling of acetaminophen uptake and distribution at the whole body level, cell and blood flow modeling at the tissue/organ level and metabolism at the sub-cellular level. We have used standard modeling modalities at each of the three scales. In particular, we have used the Systems Biology Markup Language (SBML) to create both the whole-body and sub-cellular scales. Our modeling approach allows us to run the individual sub-models separately and allows us to easily exchange models at a particular scale without the need to extensively rework the sub-models at other scales. In addition, the use of SBML greatly facilitates the inclusion of biological annotations directly in the model code. The model was calibrated using human in vivo data for acetaminophen and its sulfate and glucuronate metabolites. We then carried out extensive parameter sensitivity studies including the pairwise interaction of parameters. We also simulated population variation of exposure and sensitivity to acetaminophen. Our modeling framework can be extended to the prediction of liver toxicity following acetaminophen overdose, or used as a general purpose pharmacokinetic model for xenobiotics.

  11. A Liver-Centric Multiscale Modeling Framework for Xenobiotics

    PubMed Central

    Sluka, James P.; Fu, Xiao; Swat, Maciej; Belmonte, Julio M.; Cosmanescu, Alin; Clendenon, Sherry G.; Wambaugh, John F.; Glazier, James A.

    2016-01-01

    We describe a multi-scale, liver-centric in silico modeling framework for acetaminophen pharmacology and metabolism. We focus on a computational model to characterize whole body uptake and clearance, liver transport and phase I and phase II metabolism. We do this by incorporating sub-models that span three scales: Physiologically Based Pharmacokinetic (PBPK) modeling of acetaminophen uptake and distribution at the whole body level, cell and blood flow modeling at the tissue/organ level and metabolism at the sub-cellular level. We have used standard modeling modalities at each of the three scales. In particular, we have used the Systems Biology Markup Language (SBML) to create both the whole-body and sub-cellular scales. Our modeling approach allows us to run the individual sub-models separately and allows us to easily exchange models at a particular scale without the need to extensively rework the sub-models at other scales. In addition, the use of SBML greatly facilitates the inclusion of biological annotations directly in the model code. The model was calibrated using human in vivo data for acetaminophen and its sulfate and glucuronate metabolites. We then carried out extensive parameter sensitivity studies including the pairwise interaction of parameters. We also simulated population variation of exposure and sensitivity to acetaminophen. Our modeling framework can be extended to the prediction of liver toxicity following acetaminophen overdose, or used as a general purpose pharmacokinetic model for xenobiotics. PMID:27636091
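
    The whole-body scale of such a framework is typically a compartmental ODE system. The sketch below is a deliberately minimal one-compartment PBPK-style model for an oral dose; the rate constants, volume, and dose are invented, not the paper's calibrated acetaminophen values, and the real model is expressed in SBML rather than hand-coded.

      # Toy one-compartment PBPK-style model for an oral dose; all constants
      # are invented stand-ins, not the paper's calibrated values.
      import numpy as np
      from scipy.integrate import odeint

      ka, ke, V = 1.2, 0.3, 42.0     # absorption /h, elimination /h, volume L

      def pbpk(y, t):
          gut, plasma = y
          return [-ka * gut, ka * gut - ke * plasma]

      t = np.linspace(0, 12, 49)                       # hours
      gut, plasma = odeint(pbpk, [1000.0, 0.0], t).T   # 1000 mg oral dose
      conc = plasma / V                                # mg/L in plasma
      print(f"peak plasma concentration ~{conc.max():.1f} mg/L "
            f"at t={t[conc.argmax()]:.2f} h")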

  12. Collaborative development of predictive toxicology applications

    PubMed Central

    2010-01-01

    OpenTox provides an interoperable, standards-based Framework for the support of predictive toxicology data management, algorithms, modelling, validation and reporting. It is relevant to satisfying the chemical safety assessment requirements of the REACH legislation as it supports access to experimental data, (Quantitative) Structure-Activity Relationship models, and toxicological information through an integrating platform that adheres to regulatory requirements and OECD validation principles. Initial research defined the essential components of the Framework including the approach to data access, schema and management, use of controlled vocabularies and ontologies, architecture, web service and communications protocols, and selection and integration of algorithms for predictive modelling. OpenTox provides end-user oriented tools to non-computational specialists, risk assessors, and toxicological experts in addition to Application Programming Interfaces (APIs) for developers of new applications. OpenTox actively supports public standards for data representation, interfaces, vocabularies and ontologies, Open Source approaches to core platform components, and community-based collaboration approaches, so as to progress system interoperability goals. The OpenTox Framework includes APIs and services for compounds, datasets, features, algorithms, models, ontologies, tasks, validation, and reporting which may be combined into multiple applications satisfying a variety of different user needs. OpenTox applications are based on a set of distributed, interoperable OpenTox API-compliant REST web services. The OpenTox approach to ontology allows for efficient mapping of complementary data coming from different datasets into a unifying structure having a shared terminology and representation. Two initial OpenTox applications are presented as an illustration of the potential impact of OpenTox for high-quality and consistent structure-activity relationship modelling of REACH-relevant endpoints: ToxPredict which predicts and reports on toxicities for endpoints for an input chemical structure, and ToxCreate which builds and validates a predictive toxicity model based on an input toxicology dataset. Because of the extensible nature of the standardised Framework design, barriers of interoperability between applications and content are removed, as the user may combine data, models and validation from multiple sources in a dependable and time-effective way. PMID:20807436

  13. Collaborative development of predictive toxicology applications.

    PubMed

    Hardy, Barry; Douglas, Nicki; Helma, Christoph; Rautenberg, Micha; Jeliazkova, Nina; Jeliazkov, Vedrin; Nikolova, Ivelina; Benigni, Romualdo; Tcheremenskaia, Olga; Kramer, Stefan; Girschick, Tobias; Buchwald, Fabian; Wicker, Joerg; Karwath, Andreas; Gütlein, Martin; Maunz, Andreas; Sarimveis, Haralambos; Melagraki, Georgia; Afantitis, Antreas; Sopasakis, Pantelis; Gallagher, David; Poroikov, Vladimir; Filimonov, Dmitry; Zakharov, Alexey; Lagunin, Alexey; Gloriozova, Tatyana; Novikov, Sergey; Skvortsova, Natalia; Druzhilovsky, Dmitry; Chawla, Sunil; Ghosh, Indira; Ray, Surajit; Patel, Hitesh; Escher, Sylvia

    2010-08-31

    OpenTox provides an interoperable, standards-based Framework for the support of predictive toxicology data management, algorithms, modelling, validation and reporting. It is relevant to satisfying the chemical safety assessment requirements of the REACH legislation as it supports access to experimental data, (Quantitative) Structure-Activity Relationship models, and toxicological information through an integrating platform that adheres to regulatory requirements and OECD validation principles. Initial research defined the essential components of the Framework including the approach to data access, schema and management, use of controlled vocabularies and ontologies, architecture, web service and communications protocols, and selection and integration of algorithms for predictive modelling. OpenTox provides end-user oriented tools to non-computational specialists, risk assessors, and toxicological experts in addition to Application Programming Interfaces (APIs) for developers of new applications. OpenTox actively supports public standards for data representation, interfaces, vocabularies and ontologies, Open Source approaches to core platform components, and community-based collaboration approaches, so as to progress system interoperability goals. The OpenTox Framework includes APIs and services for compounds, datasets, features, algorithms, models, ontologies, tasks, validation, and reporting which may be combined into multiple applications satisfying a variety of different user needs. OpenTox applications are based on a set of distributed, interoperable OpenTox API-compliant REST web services. The OpenTox approach to ontology allows for efficient mapping of complementary data coming from different datasets into a unifying structure having a shared terminology and representation. Two initial OpenTox applications are presented as an illustration of the potential impact of OpenTox for high-quality and consistent structure-activity relationship modelling of REACH-relevant endpoints: ToxPredict which predicts and reports on toxicities for endpoints for an input chemical structure, and ToxCreate which builds and validates a predictive toxicity model based on an input toxicology dataset. Because of the extensible nature of the standardised Framework design, barriers of interoperability between applications and content are removed, as the user may combine data, models and validation from multiple sources in a dependable and time-effective way.

  14. An integrated environmental modeling framework for performing quantitative microbial risk assessments

    USDA-ARS?s Scientific Manuscript database

    Standardized methods are often used to assess the likelihood of a human-health effect from exposure to a specified hazard, and inform opinions and decisions about risk management and communication. A Quantitative Microbial Risk Assessment (QMRA) is specifically adapted to detail potential human-heal...

  15. Rapid Prototyping of an Aircraft Model in an Object-Oriented Simulation

    NASA Technical Reports Server (NTRS)

    Kenney, P. Sean

    2003-01-01

    A team was created to participate in the Mars Scout Opportunity. Trade studies determined that an aircraft provided the best opportunity to complete the science objectives of the team. A high fidelity six degree of freedom flight simulation was required to provide credible evidence that the aircraft design fulfilled mission objectives and to support the aircraft design process by providing performance evaluations. The team created the simulation using the Langley Standard Real-Time Simulation in C++ (LaSRS++) application framework. A rapid prototyping approach was necessary because the team had only three months to both develop the aircraft simulation model and evaluate aircraft performance as the design and mission parameters matured. The design of LaSRS++ enabled rapid prototyping in several ways. First, the framework allowed component models to be designed, implemented, unit-tested, and integrated quickly. Next, the framework provided a highly reusable infrastructure that allowed developers to maximize code reuse while concentrating on aircraft and mission specific features. Finally, the framework reduced risk by providing reusable components that allowed developers to build a quality product with a compressed testing cycle that relied heavily on unit testing of new components.
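
    The component-model pattern credited here with enabling rapid prototyping can be illustrated in miniature (a Python analogue with hypothetical class names, not LaSRS++ code): each model sits behind a small initialize/update contract and is unit-tested in isolation before integration.

    ```python
    import math
    import unittest

    class Component:
        """Minimal component contract: initialize once, then update per frame."""
        def initialize(self): ...
        def update(self, dt: float): ...

    class Atmosphere(Component):
        """Toy exponential-density atmosphere, swappable behind the Component API."""
        def initialize(self):
            self.rho0 = 0.020           # kg/m^3, Mars-like placeholder value
            self.scale_height = 11100.0 # m, placeholder value
            self.density = self.rho0
        def update(self, dt: float, altitude_m: float = 0.0):
            self.density = self.rho0 * math.exp(-altitude_m / self.scale_height)

    class AtmosphereTest(unittest.TestCase):
        """Each component can be verified on its own before integration."""
        def test_density_decreases_with_altitude(self):
            atm = Atmosphere()
            atm.initialize()
            atm.update(0.01, altitude_m=0.0)
            rho_low = atm.density
            atm.update(0.01, altitude_m=10_000.0)
            rho_high = atm.density
            self.assertGreater(rho_low, rho_high)

    if __name__ == "__main__":
        unittest.main()
    ```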

  16. Modeling Floor Effects in Standardized Vocabulary Test Scores in a Sample of Low SES Hispanic Preschool Children under the Multilevel Structural Equation Modeling Framework.

    PubMed

    Zhu, Leina; Gonzalez, Jorge

    2017-01-01

    Researchers and practitioners often use standardized vocabulary tests such as the Peabody Picture Vocabulary Test-4 (PPVT-4; Dunn and Dunn, 2007) and its companion, the Expressive Vocabulary Test-2 (EVT-2; Williams, 2007), to assess English vocabulary skills as an indicator of children's school readiness. Despite their psychometric excellence in the norm sample, issues arise when standardized vocabulary tests are used to assess children from culturally, linguistically and ethnically diverse backgrounds (e.g., Spanish-speaking English language learners) or children who are delayed in some manner. One of the biggest challenges is establishing the appropriateness of these measures with non-English or non-standard-English-speaking children, as they often score one to two standard deviations below expected levels (e.g., Lonigan et al., 2013). This study re-examines the issues in analyzing the PPVT-4 and EVT-2 scores in a sample of 4-to-5-year-old low-SES Hispanic preschool children who were part of a larger randomized clinical trial on the effects of a supplemental English shared-reading vocabulary curriculum (Pollard-Durodola et al., 2016). The data exhibited strong floor effects, and the presence of floor effects made it difficult to differentiate the intervention group from the control group on their vocabulary growth during the intervention. A simulation study is then presented under the multilevel structural equation modeling (MSEM) framework; its results revealed that in regular multilevel data analysis, ignoring floor effects in the outcome variables led to biased parameter estimates, standard error estimates, and significance tests. Our findings suggest caution in analyzing and interpreting scores of ethnically and culturally diverse children on standardized vocabulary tests when floor effects are present. It is recommended that analytical methods that take floor effects in outcome variables into account be considered.
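
    A minimal simulation (not the authors' MSEM code; sample size, effect size, and floor value are invented) makes the reported bias mechanism concrete: censoring scores at a floor pulls the naive estimate of a group difference toward zero.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 2000
    group = rng.integers(0, 2, n)                    # 0 = control, 1 = intervention
    latent = 70 + 5 * group + rng.normal(0, 15, n)   # true group effect = 5 points

    floor = 70.0                                     # the test cannot score below this
    observed = np.maximum(latent, floor)

    true_diff = latent[group == 1].mean() - latent[group == 0].mean()
    naive_diff = observed[group == 1].mean() - observed[group == 0].mean()
    print(f"latent group difference:   {true_diff:.2f}")
    print(f"observed group difference: {naive_diff:.2f}  (attenuated by the floor)")
    ```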

  17. Dynamics of sea level rise and coastal flooding on a changing landscape

    NASA Astrophysics Data System (ADS)

    Bilskie, M. V.; Hagen, S. C.; Medeiros, S. C.; Passeri, D. L.

    2014-02-01

    Standard approaches to determining the impacts of sea level rise (SLR) on storm surge flooding employ numerical models reflecting present conditions with modified sea states for a given SLR scenario. In this study, we advance this paradigm by adjusting the model framework so that it reflects not only a change in sea state but also variations to the landscape (morphologic changes and urbanization of coastal cities). We utilize a numerical model of the Mississippi and Alabama coast to simulate the response of hurricane storm surge to changes in sea level, land use/land cover, and land surface elevation for past (1960), present (2005), and future (2050) conditions. The results show that the storm surge response to SLR is dynamic and sensitive to changes in the landscape. We introduce a new modeling framework that includes modification of the landscape when producing storm surge models for future conditions.

  18. Flood mapping in ungauged basins using fully continuous hydrologic-hydraulic modeling

    NASA Astrophysics Data System (ADS)

    Grimaldi, Salvatore; Petroselli, Andrea; Arcangeletti, Ettore; Nardi, Fernando

    2013-04-01

    In this work, a fully continuous hydrologic-hydraulic modeling framework for flood mapping is introduced and tested. It is characterized by the simulation of a long rainfall time series at sub-daily resolution that feeds a continuous rainfall-runoff model producing a discharge time series that is directly given as input to a bi-dimensional hydraulic model. The main advantage of the proposed approach is that it avoids the design hyetograph and the design hydrograph, which constitute the main sources of subjective analysis and uncertainty in standard methods. The proposed procedure is optimized for small and ungauged watersheds where empirical models are commonly applied. Results of a simple real case study confirm that this experimental fully continuous framework may pave the way for the implementation of a less subjective and potentially automated procedure for flood hazard mapping.
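
    The first two stages of the pipeline can be caricatured in a few lines (illustrative parameters, not the authors' models): a long synthetic sub-daily rainfall series drives a linear-reservoir rainfall-runoff model, yielding the continuous discharge series that would feed the bi-dimensional hydraulic model.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    dt_h = 0.25                                   # 15-minute time step (hours)
    n = int(5 * 365 * 24 / dt_h)                  # five years of sub-daily rainfall
    wet = rng.random(n) < 0.05                    # 5% of steps are rainy
    rain = rng.exponential(1.0, n) * wet          # rainfall depth per step (mm)

    k = 6.0                                       # reservoir time constant (hours)
    runoff_coeff, area_km2 = 0.3, 50.0
    storage, q = 0.0, np.empty(n)
    for i in range(n):
        storage += runoff_coeff * rain[i]         # effective rainfall enters storage (mm)
        out = storage * dt_h / k                  # linear-reservoir release (mm per step)
        storage -= out
        q[i] = (out / dt_h) * area_km2 * 1000.0 / 3600.0  # mm/h over the basin -> m^3/s

    print(f"{n} steps simulated; peak discharge ~ {q.max():.1f} m^3/s")
    ```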

  19. A Regression Framework for Effect Size Assessments in Longitudinal Modeling of Group Differences

    PubMed Central

    Feingold, Alan

    2013-01-01

    The use of growth modeling analysis (GMA)--particularly multilevel analysis and latent growth modeling--to test the significance of intervention effects has increased exponentially in prevention science, clinical psychology, and psychiatry over the past 15 years. Model-based effect sizes for differences in means between two independent groups in GMA can be expressed in the same metric (Cohen’s d) commonly used in classical analysis and meta-analysis. This article first reviews conceptual issues regarding calculation of d for findings from GMA and then introduces an integrative framework for effect size assessments that subsumes GMA. The new approach uses the structure of the linear regression model, from which effect sizes for findings from diverse cross-sectional and longitudinal analyses can be calculated with familiar statistics, such as the regression coefficient, the standard deviation of the dependent measure, and study duration. PMID:23956615
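
    A hedged sketch of the central idea (our symbols, not notation copied from the article): the model-based effect size divides the estimated group-by-time slope difference, accumulated over the study's duration, by the raw standard deviation of the dependent measure,

        $$ d \;=\; \frac{\hat{\beta}_{\text{group}\times\text{time}} \cdot \text{duration}}{SD_{\text{raw}}} $$

    so that, for example, a slope difference of 0.5 points per month over a 6-month trial with a raw SD of 10 gives d = (0.5)(6)/10 = 0.30.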

  20. A Flexible Electronic Commerce Recommendation System

    NASA Astrophysics Data System (ADS)

    Gong, Songjie

    Recommendation systems have become very popular in E-commerce websites. Many of the largest commerce websites are already using recommender technologies to help their customers find products to purchase. An electronic commerce recommendation system learns from a customer and recommends products that the customer will find most valuable from among the available products. But most recommendation methods are hard-wired into the system and support only fixed recommendations. This paper presents a framework for a flexible electronic commerce recommendation system. The framework is composed of a user model interface, a recommendation engine, a recommendation strategy model, a recommendation technology group, a user interest model and a database interface. In the recommendation strategy model, the method can be collaborative filtering, content-based filtering, association rule mining, knowledge-based filtering or a hybrid of these methods. The system maps implementation to demand through the strategy model, and the whole system is designed as standard parts to adapt to changes in the recommendation strategy.
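
    The "strategy model" indirection described here is essentially the strategy design pattern; a minimal sketch (illustrative class names, not the paper's code) shows how the engine can swap recommendation methods without rewiring the rest of the system:

    ```python
    from typing import Protocol

    class RecommendationStrategy(Protocol):
        def recommend(self, user_id: str, k: int) -> list[str]: ...

    class CollaborativeFiltering:
        """Toy neighborhood method: score items rated by overlapping users."""
        def __init__(self, ratings: dict[str, dict[str, float]]):
            self.ratings = ratings
        def recommend(self, user_id: str, k: int) -> list[str]:
            mine = self.ratings.get(user_id, {})
            scores: dict[str, float] = {}
            for other, theirs in self.ratings.items():
                if other == user_id or not set(mine) & set(theirs):
                    continue                      # skip users with no overlap
                for item, r in theirs.items():
                    if item not in mine:
                        scores[item] = scores.get(item, 0.0) + r
            return sorted(scores, key=scores.get, reverse=True)[:k]

    class Engine:
        def __init__(self, strategy: RecommendationStrategy):
            self.strategy = strategy              # swappable "standard part"
        def top_k(self, user_id: str, k: int = 3) -> list[str]:
            return self.strategy.recommend(user_id, k)

    ratings = {"a": {"x": 5, "y": 4}, "b": {"x": 4, "z": 5}, "c": {"y": 5, "w": 3}}
    print(Engine(CollaborativeFiltering(ratings)).top_k("a"))   # ['z', 'w']
    ```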

  1. Reply to comment by Fred L. Ogden et al. on "Beyond the SCS-CN method: A theoretical framework for spatially lumped rainfall-runoff response"

    NASA Astrophysics Data System (ADS)

    Bartlett, M. S.; Parolari, A. J.; McDonnell, J. J.; Porporato, A.

    2017-07-01

    Though Ogden et al. list several shortcomings of the original SCS-CN method, fitness for purpose is a key consideration in hydrological modelling, as shown by the adoption of the SCS-CN method in many design standards. The theoretical framework of Bartlett et al. [2016a] reveals a family of semidistributed models, of which the SCS-CN method is just one member. Other members include event-based versions of the Variable Infiltration Capacity (VIC) model and TOPMODEL. This general model allows us to move beyond the limitations of the original SCS-CN method under different rainfall-runoff mechanisms and distributions for soil and rainfall variability. Future research should link this general model approach to different hydrogeographic settings, in line with the call for action proposed by Ogden et al.
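
    For reference, the original method under discussion takes the standard textbook form (SI units); the generalized framework of Bartlett et al. contains it as one member of a larger family:

    ```python
    # Classical SCS-CN event runoff relation (standard textbook form, SI units).
    def scs_cn_runoff(P_mm: float, CN: float, ia_ratio: float = 0.2) -> float:
        """Event runoff depth Q (mm) for rainfall depth P (mm) and curve number CN."""
        S = 25400.0 / CN - 254.0      # potential maximum retention (mm)
        Ia = ia_ratio * S             # initial abstraction (mm)
        if P_mm <= Ia:
            return 0.0                # all rainfall absorbed before runoff begins
        return (P_mm - Ia) ** 2 / (P_mm - Ia + S)

    print(round(scs_cn_runoff(P_mm=80.0, CN=75), 1))   # ~26.9 mm of runoff
    ```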

  2. Data-driven model-independent searches for long-lived particles at the LHC

    NASA Astrophysics Data System (ADS)

    Coccaro, Andrea; Curtin, David; Lubatti, H. J.; Russell, Heather; Shelton, Jessie

    2016-12-01

    Neutral long-lived particles (LLPs) are highly motivated by many beyond the Standard Model scenarios, such as theories of supersymmetry, baryogenesis, and neutral naturalness, and present both tremendous discovery opportunities and experimental challenges for the LHC. A major bottleneck for current LLP searches is the prediction of Standard Model backgrounds, which are often impossible to simulate accurately. In this paper, we propose a general strategy for obtaining differential, data-driven background estimates in LLP searches, thereby notably extending the range of LLP masses and lifetimes that can be discovered at the LHC. We focus on LLPs decaying in the ATLAS muon system, where triggers providing both signal and control samples are available at LHC run 2. While many existing searches require two displaced decays, a detailed knowledge of backgrounds will allow for very inclusive searches that require just one detected LLP decay. As we demonstrate for the h → XX signal model of LLP pair production in exotic Higgs decays, this results in dramatic sensitivity improvements for proper lifetimes ≳ 10 m. In theories of neutral naturalness, this extends reach to glueball masses far below the bb̄ threshold. Our strategy readily generalizes to other signal models and other detector subsystems. This framework therefore lends itself to the development of a systematic, model-independent LLP search program, in analogy to the highly successful simplified-model framework of prompt searches.

  3. National Framework of Professional Standards for Change Leadership in Education

    ERIC Educational Resources Information Center

    Duffy, Francis M.

    2009-01-01

    The ten professional standards form what Francis Duffy refers to as a "National Framework of Professional Standards for Change Leadership in Education." Each standard has examples of the knowledge, skills, and dispositions that the research suggests are important for effective change leadership. Duffy's hope is that this proposed…

  4. GAMBIT: the global and modular beyond-the-standard-model inference tool

    NASA Astrophysics Data System (ADS)

    Athron, Peter; Balazs, Csaba; Bringmann, Torsten; Buckley, Andy; Chrząszcz, Marcin; Conrad, Jan; Cornell, Jonathan M.; Dal, Lars A.; Dickinson, Hugh; Edsjö, Joakim; Farmer, Ben; Gonzalo, Tomás E.; Jackson, Paul; Krislock, Abram; Kvellestad, Anders; Lundberg, Johan; McKay, James; Mahmoudi, Farvah; Martinez, Gregory D.; Putze, Antje; Raklev, Are; Ripken, Joachim; Rogan, Christopher; Saavedra, Aldo; Savage, Christopher; Scott, Pat; Seo, Seon-Hee; Serra, Nicola; Weniger, Christoph; White, Martin; Wild, Sebastian

    2017-11-01

    We describe the open-source global fitting package GAMBIT: the Global And Modular Beyond-the-Standard-Model Inference Tool. GAMBIT combines extensive calculations of observables and likelihoods in particle and astroparticle physics with a hierarchical model database, advanced tools for automatically building analyses of essentially any model, a flexible and powerful system for interfacing to external codes, a suite of different statistical methods and parameter scanning algorithms, and a host of other utilities designed to make scans faster, safer and more easily extendible than in the past. Here we give a detailed description of the framework, its design and motivation, and the current models and other specific components presently implemented in GAMBIT. Accompanying papers deal with individual modules and present first GAMBIT results. GAMBIT can be downloaded from gambit.hepforge.org.

  5. Four (Algorithms) in One (Bag): An Integrative Framework of Knowledge for Teaching the Standard Algorithms of the Basic Arithmetic Operations

    ERIC Educational Resources Information Center

    Raveh, Ira; Koichu, Boris; Peled, Irit; Zaslavsky, Orit

    2016-01-01

    In this article we present an integrative framework of knowledge for teaching the standard algorithms of the four basic arithmetic operations. The framework is based on a mathematical analysis of the algorithms, a connectionist perspective on teaching mathematics and an analogy with previous frameworks of knowledge for teaching arithmetic…

  6. Bounds on the dynamics of sink populations with noisy immigration.

    PubMed

    Eager, Eric Alan; Guiver, Chris; Hodgson, Dave; Rebarber, Richard; Stott, Iain; Townley, Stuart

    2014-03-01

    Sink populations are doomed to decline to extinction in the absence of immigration. The dynamics of sink populations are not easily modelled using the standard framework of per capita rates of immigration, because numbers of immigrants are determined by extrinsic sources (for example, source populations, or population managers). Here we appeal to a systems and control framework to place upper and lower bounds on both the transient and future dynamics of sink populations that are subject to noisy immigration. Immigration has a number of interpretations and can fit a wide variety of models found in the literature. We apply the results to case studies derived from published models for Chinook salmon (Oncorhynchus tshawytscha) and blowout penstemon (Penstemon haydenii). Copyright © 2013 Elsevier Inc. All rights reserved.
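
    The bounding idea translates directly into code (a toy stage-structured example, not the paper's salmon or penstemon case studies; matrix and bounds are invented): with a nonnegative projection matrix of spectral radius below one and immigration known only to lie between fixed bounds, the extreme immigration vectors generate bounding trajectories.

    ```python
    import numpy as np

    A = np.array([[0.0, 1.2],
                  [0.3, 0.5]])                  # hypothetical stage-projection matrix
    assert max(abs(np.linalg.eigvals(A))) < 1   # spectral radius < 1: a true sink

    u_lo = np.array([0.0, 0.0])                 # lower bound on immigration
    u_hi = np.array([5.0, 0.0])                 # upper bound on immigration
    rng = np.random.default_rng(7)

    x = np.array([10.0, 10.0])
    x_lo, x_hi = x.copy(), x.copy()
    for t in range(50):
        u = rng.uniform(u_lo, u_hi)             # noisy immigration realization
        x = A @ x + u
        x_lo = A @ x_lo + u_lo                  # because A is nonnegative, these
        x_hi = A @ x_hi + u_hi                  # trajectories bound x elementwise

    print(f"population after 50 steps: {x.sum():.1f} "
          f"(bounds {x_lo.sum():.1f} .. {x_hi.sum():.1f})")
    ```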

  7. Quantifying and reducing model-form uncertainties in Reynolds-averaged Navier–Stokes simulations: A data-driven, physics-informed Bayesian approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xiao, H., E-mail: hengxiao@vt.edu; Wu, J.-L.; Wang, J.-X.

    Despite their well-known limitations, Reynolds-Averaged Navier–Stokes (RANS) models are still the workhorse tools for turbulent flow simulations in today's engineering analysis, design and optimization. While the predictive capability of RANS models depends on many factors, for many practical flows the turbulence models are by far the largest source of uncertainty. As RANS models are used in the design and safety evaluation of many mission-critical systems such as airplanes and nuclear power plants, quantifying their model-form uncertainties has significant implications in enabling risk-informed decision-making. In this work we develop a data-driven, physics-informed Bayesian framework for quantifying model-form uncertainties in RANS simulations. Uncertainties are introduced directly to the Reynolds stresses and are represented with compact parameterization accounting for empirical prior knowledge and physical constraints (e.g., realizability, smoothness, and symmetry). An iterative ensemble Kalman method is used to assimilate the prior knowledge and observation data in a Bayesian framework, and to propagate them to posterior distributions of velocities and other Quantities of Interest (QoIs). We use two representative cases, the flow over periodic hills and the flow in a square duct, to evaluate the performance of the proposed framework. Both cases are challenging for standard RANS turbulence models. Simulation results suggest that, even with very sparse observations, the obtained posterior mean velocities and other QoIs have significantly better agreement with the benchmark data compared to the baseline results. At most locations the posterior distribution adequately captures the true model error within the developed model-form uncertainty bounds. The framework is a major improvement over existing black-box, physics-neutral methods for model-form uncertainty quantification, where prior knowledge and details of the models are not exploited. This approach has potential implications in many fields in which the governing equations are well understood but the model uncertainty comes from unresolved physical processes. Highlights: (i) a physics-informed framework is proposed to quantify uncertainty in RANS simulations; (ii) the framework incorporates physical prior knowledge and observation data; (iii) it is based on a rigorous Bayesian framework yet fully utilizes the physical model; (iv) it is applicable to many complex physical systems beyond turbulent flows.
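
    The ensemble Kalman analysis step at the heart of the approach can be condensed as follows (a linear toy forward model stands in for the RANS solver; ensemble sizes and noise levels are illustrative, not the paper's settings):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_ens, n_par, n_obs = 100, 5, 3

    H = rng.normal(size=(n_obs, n_par))        # linear stand-in for the forward model
    theta_true = np.array([1.0, -0.5, 2.0, 0.0, 0.7])
    sigma = 0.05
    y_obs = H @ theta_true + rng.normal(0, sigma, n_obs)
    R = sigma**2 * np.eye(n_obs)               # observation-error covariance

    theta = rng.normal(0, 1, size=(n_par, n_ens))   # prior parameter ensemble
    for _ in range(10):                             # iterative ensemble updates
        Y = H @ theta                               # predicted observations
        th_a = theta - theta.mean(1, keepdims=True)
        Y_a = Y - Y.mean(1, keepdims=True)
        C_ty = th_a @ Y_a.T / (n_ens - 1)           # parameter-observation covariance
        C_yy = Y_a @ Y_a.T / (n_ens - 1) + R
        K = C_ty @ np.linalg.inv(C_yy)              # Kalman gain
        y_pert = y_obs[:, None] + rng.normal(0, sigma, (n_obs, n_ens))
        theta = theta + K @ (y_pert - Y)            # pull ensemble toward the data

    # with 3 observations and 5 parameters the problem is underdetermined, so only
    # the data-constrained directions of the posterior mean approach the truth
    print("posterior mean:", theta.mean(axis=1).round(2))
    print("truth:         ", theta_true)
    ```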

  8. DYNAMO-HIA--a Dynamic Modeling tool for generic Health Impact Assessments.

    PubMed

    Lhachimi, Stefan K; Nusselder, Wilma J; Smit, Henriette A; van Baal, Pieter; Baili, Paolo; Bennett, Kathleen; Fernández, Esteve; Kulik, Margarete C; Lobstein, Tim; Pomerleau, Joceline; Mackenbach, Johan P; Boshuizen, Hendriek C

    2012-01-01

    Currently, no standard tool is publicly available that allows researchers or policy-makers to quantify the impact of policies using epidemiological evidence within the causal framework of Health Impact Assessment (HIA). A standard tool should comply with three technical criteria (real-life population, dynamic projection, explicit risk-factor states) and three usability criteria (modest data requirements, rich model output, generally accessible) to be useful in the applied setting of HIA. With DYNAMO-HIA (Dynamic Modeling for Health Impact Assessment), we introduce such a generic software tool specifically designed to facilitate quantification in the assessment of the health impacts of policies. DYNAMO-HIA quantifies the impact of user-specified risk-factor changes on multiple diseases and in turn on overall population health, comparing one reference scenario with one or more intervention scenarios. The Markov-based modeling approach allows for explicit risk-factor states and simulation of a real-life population. A built-in parameter estimation module ensures that only standard population-level epidemiological evidence is required, i.e. data on incidence, prevalence, relative risks, and mortality. DYNAMO-HIA provides a rich output of summary measures--e.g. life expectancy and disease-free life expectancy--and detailed data--e.g. prevalences and mortality/survival rates--by age, sex, and risk-factor status over time. DYNAMO-HIA is controlled via a graphical user interface and is publicly available from the internet, ensuring general accessibility. We illustrate the use of DYNAMO-HIA with two example applications: a policy causing an overall increase in alcohol consumption and quantifying the disease-burden of smoking. By combining modest data needs with general accessibility and user friendliness within the causal framework of HIA, DYNAMO-HIA is a potential standard tool for health impact assessment based on epidemiologic evidence.
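
    The Markov-based projection DYNAMO-HIA performs can be stylized in a few lines (all transition and mortality rates below are invented for illustration): a cohort distributed over explicit risk-factor states, each with its own mortality, is advanced year by year.

    ```python
    import numpy as np

    states = ["never-smoker", "smoker", "ex-smoker"]
    T = np.array([[0.98, 0.02, 0.00],       # annual transitions between states
                  [0.00, 0.94, 0.06],       # (rows sum to 1)
                  [0.00, 0.01, 0.99]])
    mort = np.array([0.008, 0.016, 0.010])  # state-specific annual mortality

    pop = np.array([600_000.0, 300_000.0, 100_000.0])
    life_years = 0.0
    for year in range(10):
        pop = pop * (1 - mort)              # apply state-specific mortality
        pop = pop @ T                       # then move between risk-factor states
        life_years += pop.sum()

    print(f"population after 10 years: {pop.sum():,.0f}")
    print(f"life-years lived:          {life_years:,.0f}")
    ```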

  9. Assessing Quality of Data Standards: Framework and Illustration Using XBRL GAAP Taxonomy

    NASA Astrophysics Data System (ADS)

    Zhu, Hongwei; Wu, Harris

    The primary purpose of data standards or metadata schemas is to improve the interoperability of data created by multiple standard users. Given the high cost of developing data standards, it is desirable to assess the quality of data standards. We develop a set of metrics and a framework for assessing data standard quality. The metrics include completeness and relevancy. Standard quality can also be indirectly measured by assessing interoperability of data instances. We evaluate the framework using data from the financial sector: the XBRL (eXtensible Business Reporting Language) GAAP (Generally Accepted Accounting Principles) taxonomy and US Securities and Exchange Commission (SEC) filings produced using the taxonomy by approximately 500 companies. The results show that the framework is useful and effective. Our analysis also reveals quality issues of the GAAP taxonomy and provides useful feedback to taxonomy users. The SEC has mandated that all publicly listed companies must submit their filings using XBRL. Our findings are timely and have practical implications that will ultimately help improve the quality of financial data.
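
    The two direct metrics can be given a simple operational reading (our simplification, with toy stand-ins for the GAAP taxonomy and SEC filings): completeness asks how much of what filers report the standard covers, while relevancy asks how much of the standard filers actually use.

    ```python
    taxonomy = {"Assets", "Liabilities", "Revenues", "NetIncome", "RareItem"}
    filings = [
        {"Assets", "Liabilities", "Revenues"},
        {"Assets", "Revenues", "NetIncome", "CustomExtension1"},  # extension element
        {"Assets", "Liabilities", "NetIncome"},
    ]

    used = set().union(*filings)
    covered = used & taxonomy

    # average share of each filing's elements that the taxonomy provides
    completeness = sum(len(f & taxonomy) / len(f) for f in filings) / len(filings)
    # share of taxonomy elements that any filer actually used
    relevancy = len(covered) / len(taxonomy)

    print(f"completeness: {completeness:.2f}")   # 0.92
    print(f"relevancy:    {relevancy:.2f}")      # 0.80
    ```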

  10. Can Performance-Related Learning Outcomes Have Standards?

    ERIC Educational Resources Information Center

    Brockmann, Michaela; Clarke, Linda; Winch, Christopher

    2008-01-01

    Purpose: This paper aims to explain the distinction between educational standards and learning outcomes and to indicate the problems that potentially arise when a learning outcomes approach is applied to a qualification meta-framework like the European Qualification Framework, or indeed to national qualification frameworks.…

  11. Database integration in a multimedia-modeling environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dorow, Kevin E.

    2002-09-02

    Integration of data from disparate remote sources has direct applicability to modeling, which can support Brownfield assessments. To accomplish this task, a data integration framework needs to be established. A key element in this framework is the metadata that creates the relationship between the pieces of information that are important in the multimedia modeling environment and the information that is stored in the remote data source. The design philosophy is to allow modelers and database owners to collaborate by defining this metadata in such a way that allows interaction between their components. The main parts of this framework include tools to facilitate metadata definition, database extraction plan creation, automated extraction plan execution / data retrieval, and a central clearing house for metadata and modeling / database resources. Cross-platform compatibility (using Java) and standard communications protocols (http / https) allow these parts to run in a wide variety of computing environments (Local Area Networks, Internet, etc.), and, therefore, this framework provides many benefits. Because of the specific data relationships described in the metadata, the amount of data that have to be transferred is kept to a minimum (only the data that fulfill a specific request are provided as opposed to transferring the complete contents of a data source). This allows for real-time data extraction from the actual source. Also, the framework sets up collaborative responsibilities such that the different types of participants have control over the areas in which they have domain knowledge: the modelers are responsible for defining the data relevant to their models, while the database owners are responsible for mapping the contents of the database using the metadata definitions. Finally, the data extraction mechanism allows for the ability to control access to the data and what data are made available.

  12. Managing public health in the Army through a standard community health promotion council model.

    PubMed

    Courie, Anna F; Rivera, Moira Shaw; Pompey, Allison

    2014-01-01

    Public health processes in the US Army remain uncoordinated due to competing lines of command, funding streams and multiple subject matter experts in overlapping public health concerns. The US Army Public Health Command (USAPHC) has identified a standard model for community health promotion councils (CHPCs) as an effective framework for synchronizing and integrating these overlapping systems to ensure a coordinated approach to managing the public health process. The purpose of this study is to test a foundational assumption of the CHPC effectiveness theory: the 3 features of a standard CHPC model - a CHPC chaired by a strong leader, ie, the senior commander; a full time health promotion team dedicated to the process; and centralized management through the USAPHC - will lead to high quality health promotion councils capable of providing a coordinated approach to addressing public health on Army installations. The study employed 2 evaluation questions: (1) Do CHPCs with centralized management through the USAPHC, alignment with the senior commander, and a health promotion operations team adhere more closely to the evidence-based CHPC program framework than CHPCs without these 3 features? (2) Do members of standard CHPCs report that participation in the CHPC leads to a well-coordinated approach to public health at the installation? The results revealed that both time (F(5,76)=25.02, P<.0001) and the 3 critical features of the standard CHPC model (F(1,76)=28.40, P<.0001) independently predicted program adherence. Evaluation evidence supports the USAPHC's approach to CHPC implementation as part of public health management on Army installations. Preliminary evidence suggests that the standard CHPC model may lead to a more coordinated approach to public health and may assure that CHPCs follow an evidence-informed design. This is consistent with past research demonstrating that community coalitions and public health systems that have strong leadership; dedicated staff time and expertise; influence over policy, governance and oversight; and formalized rules and regulations function more effectively than those without. It also demonstrates the feasibility of implementing an evidence-informed approach to community coalitions in an Army environment.

  13. Mapping the function of neuronal ion channels in model and experiment

    PubMed Central

    Podlaski, William F; Seeholzer, Alexander; Groschner, Lukas N; Miesenböck, Gero; Ranjan, Rajnish; Vogels, Tim P

    2017-01-01

    Ion channel models are the building blocks of computational neuron models. Their biological fidelity is therefore crucial for the interpretation of simulations. However, the number of published models, and the lack of standardization, make the comparison of ion channel models with one another and with experimental data difficult. Here, we present a framework for the automated large-scale classification of ion channel models. Using annotated metadata and responses to a set of voltage-clamp protocols, we assigned 2378 models of voltage- and calcium-gated ion channels coded in NEURON to 211 clusters. The IonChannelGenealogy (ICGenealogy) web interface provides an interactive resource for the categorization of new and existing models and experimental recordings. It enables quantitative comparisons of simulated and/or measured ion channel kinetics, and facilitates field-wide standardization of experimentally-constrained modeling. DOI: http://dx.doi.org/10.7554/eLife.22152.001 PMID:28267430
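
    The classification pipeline can be schematized (toy kinetics in place of NEURON simulations; cluster count and parameters are invented): each model's response to a common voltage-clamp protocol becomes a feature vector, and models with similar kinetics cluster together.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(3)
    t = np.linspace(0, 1, 100)

    def response(tau, gmax):
        """Toy activation trace for a voltage step, standing in for NEURON output."""
        return gmax * (1 - np.exp(-t / tau))

    # 30 "models": two underlying kinetic families plus parameter jitter
    traces = np.array([
        response(0.05 + 0.01 * rng.random(), 1.0 + 0.1 * rng.random())
        if i < 15
        else response(0.4 + 0.05 * rng.random(), 0.8 + 0.1 * rng.random())
        for i in range(30)
    ])

    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(traces)
    print(labels)   # models with similar kinetics fall into the same cluster
    ```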

  14. A Pragmatic Application of the RE-AIM Framework for Evaluating the Implementation of Physical Activity as a Standard of Care in Health Systems

    PubMed Central

    Galaviz, Karla I.; Lobelo, Felipe; Joy, Elizabeth; Heath, Gregory W.; Hutber, Adrian; Estabrooks, Paul

    2018-01-01

    Introduction: Exercise is Medicine (EIM) is an initiative that seeks to integrate physical activity assessment, prescription, and patient referral as a standard in patient care. Methods to assess this integration have lagged behind its implementation. Purpose and Objectives: The purpose of this work is to provide a pragmatic framework to guide health care systems in assessing the implementation and impact of EIM. Evaluation Methods: A working group of experts from health care, public health, and implementation science convened to develop an evaluation model based on the RE-AIM (Reach, Effectiveness, Adoption, Implementation, and Maintenance) framework. The working group aimed to provide pragmatic guidance on operationalizing EIM across the different RE-AIM dimensions based on data typically available in health care settings. Results: The Reach of EIM can be determined by the number and proportion of patients that were screened for physical inactivity, received brief counseling and/or a physical activity prescription, and were referred to physical activity resources. Effectiveness can be assessed through self-reported changes in physical activity, cardiometabolic biometric factors, incidence/burden of chronic disease, as well as health care utilization and costs. Adoption includes assessing the number and representativeness of health care settings that adopt any component of EIM, and Implementation involves assessing the extent to which health care teams implement EIM in their clinic. Finally, Maintenance involves assessing the long-term effectiveness (patient level) and sustained implementation (clinic level) of EIM in a given health care setting. Implications for Public Health: The availability of a standardized, pragmatic, evaluation framework is critical in determining the impact of implementing EIM as a standard of care across health care systems. PMID:29752803

  15. A Pragmatic Application of the RE-AIM Framework for Evaluating the Implementation of Physical Activity as a Standard of Care in Health Systems.

    PubMed

    Stoutenberg, Mark; Galaviz, Karla I; Lobelo, Felipe; Joy, Elizabeth; Heath, Gregory W; Hutber, Adrian; Estabrooks, Paul

    2018-05-10

    Exercise is Medicine (EIM) is an initiative that seeks to integrate physical activity assessment, prescription, and patient referral as a standard in patient care. Methods to assess this integration have lagged behind its implementation. The purpose of this work is to provide a pragmatic framework to guide health care systems in assessing the implementation and impact of EIM. A working group of experts from health care, public health, and implementation science convened to develop an evaluation model based on the RE-AIM (Reach, Effectiveness, Adoption, Implementation, and Maintenance) framework. The working group aimed to provide pragmatic guidance on operationalizing EIM across the different RE-AIM dimensions based on data typically available in health care settings. The Reach of EIM can be determined by the number and proportion of patients that were screened for physical inactivity, received brief counseling and/or a physical activity prescription, and were referred to physical activity resources. Effectiveness can be assessed through self-reported changes in physical activity, cardiometabolic biometric factors, incidence/burden of chronic disease, as well as health care utilization and costs. Adoption includes assessing the number and representativeness of health care settings that adopt any component of EIM, and Implementation involves assessing the extent to which health care teams implement EIM in their clinic. Finally, Maintenance involves assessing the long-term effectiveness (patient level) and sustained implementation (clinic level) of EIM in a given health care setting. The availability of a standardized, pragmatic, evaluation framework is critical in determining the impact of implementing EIM as a standard of care across health care systems.

  16. Automated visualization of rule-based models

    PubMed Central

    Tapia, Jose-Juan; Faeder, James R.

    2017-01-01

    Frameworks such as BioNetGen, Kappa and Simmune use “reaction rules” to specify biochemical interactions compactly, where each rule specifies a mechanism such as binding or phosphorylation and its structural requirements. Current rule-based models of signaling pathways have tens to hundreds of rules, and these numbers are expected to increase as more molecule types and pathways are added. Visual representations are critical for conveying rule-based models, but current approaches to show rules and interactions between rules scale poorly with model size. Also, inferring design motifs that emerge from biochemical interactions is an open problem, so current approaches to visualize model architecture rely on manual interpretation of the model. Here, we present three new visualization tools that constitute an automated visualization framework for rule-based models: (i) a compact rule visualization that efficiently displays each rule, (ii) the atom-rule graph that conveys regulatory interactions in the model as a bipartite network, and (iii) a tunable compression pipeline that incorporates expert knowledge and produces compact diagrams of model architecture when applied to the atom-rule graph. The compressed graphs convey network motifs and architectural features useful for understanding both small and large rule-based models, as we show by application to specific examples. Our tools also produce more readable diagrams than current approaches, as we show by comparing visualizations of 27 published models using standard graph metrics. We provide an implementation in the open source and freely available BioNetGen framework, but the underlying methods are general and can be applied to rule-based models from the Kappa and Simmune frameworks also. We expect that these tools will promote communication and analysis of rule-based models and their eventual integration into comprehensive whole-cell models. PMID:29131816
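
    The atom-rule graph is straightforward to mock up (a toy three-rule model, not BioNetGen output): rules and structural "atoms" form the two sides of a bipartite network whose edges record which features each rule reads or writes.

    ```python
    import networkx as nx

    G = nx.Graph()
    rules = ["R1: L binds R", "R2: LR phosphorylates A", "R3: Ap unbinds"]
    atoms = ["L.R bond", "A~p site"]
    G.add_nodes_from(rules, bipartite="rule")
    G.add_nodes_from(atoms, bipartite="atom")

    # edges record which structural features each rule requires or modifies
    G.add_edges_from([
        ("R1: L binds R", "L.R bond"),
        ("R2: LR phosphorylates A", "L.R bond"),   # context: requires the bond
        ("R2: LR phosphorylates A", "A~p site"),   # product: sets the site
        ("R3: Ap unbinds", "A~p site"),
    ])

    for atom in atoms:
        print(atom, "->", sorted(G.neighbors(atom)))
    ```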

  17. The RITES Way for NGSS Success

    NASA Astrophysics Data System (ADS)

    Murray, D. P.; De Oliveira, G.; Caulkins, J. L.; Veeger, A. I.; McLaren, P. J.

    2012-12-01

    The NRC's Framework for Science Education describes a new vision for science education: practical experience, thought process, and connecting ideas are not lost in a sea of endless information. That is because the Framework does not emphasize broad coverage of all subfields of science. Instead, it identifies ideas in three dimensions that lend themselves to the creation of opportunities for a deeper understanding of science, namely, Science and Engineering Practices, Disciplinary Core Ideas, and Crosscutting Concepts. Developed with fidelity to the Framework, the K-12 Next Generation Science Standards (NGSS) will provide a rich, cohesive set of standards in all disciplines, designed to engage all students in the practices and to apply crosscutting concepts to deepen their understanding of the core ideas within these disciplines. In Rhode Island, for the last four years, the Rhode Island Technology Enhanced Science Project (RITES) has aimed to transform the quality of science teaching and learning at all secondary schools, with a vision similar to that of the Framework and NGSS. RITES was initially developed to closely align with existing state standards (Grade Span Expectations). As the work of developing new standards progressed, Rhode Island, as an NGSS Lead State Partner, established the RI-NGSS State Leadership Team, which was charged with providing feedback to the NGSS Writing Team. The inclusion of nine RITES personnel in this state team ensures that the project will quickly adjust to the new standards, even as they are being developed and refined. A main component of RITES is a professional development program for teachers, framed around summer workshops and projects during the school year. At the heart of the PD are Investigations, modules developed by scientist/teacher teams and designed to engage students through science practices while presenting core ideas and crosscutting concepts. Around fifty investigations, drawn from the life, physical, and earth & space sciences (ESS), employ a web-based platform to explore models and analyze data collected by students. Formative and summative assessment tools are built into the investigations. Investigation topics include: rock cycle; measurements in astronomy; plate tectonics; seasons; nuclear decay; and phases of the moon. We will showcase at least two ESS investigations that exemplify the three-dimensional components envisioned by the Framework.

  18. Grade Expectations for Vermont's Framework of Standards and Learning Opportunities, Summer 2004 (Arts)

    ERIC Educational Resources Information Center

    Vermont Department of Education, 2004

    2004-01-01

    In the fall of 1996, the State Board of Education adopted Vermont's Framework of Standards and Learning Opportunities. Over the years, the aim has been to make the standards more useful as guides to curriculum development. In 2000, the standards were formally revised and again adopted by the State Board. In 2004, another chapter in the standards,…

  19. Skew-flavored dark matter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agrawal, Prateek; Chacko, Zackaria; Fortes, Elaine C. F. S.

    We explore a novel flavor structure in the interactions of dark matter with the Standard Model. We consider theories in which both the dark matter candidate, and the particles that mediate its interactions with the Standard Model fields, carry flavor quantum numbers. The interactions are skewed in flavor space, so that a dark matter particle does not directly couple to the Standard Model matter fields of the same flavor, but only to the other two flavors. This framework respects minimal flavor violation and is, therefore, naturally consistent with flavor constraints. We study the phenomenology of a benchmark model in which dark matter couples to right-handed charged leptons. In large regions of parameter space, the dark matter can emerge as a thermal relic, while remaining consistent with the constraints from direct and indirect detection. The collider signatures of this scenario include events with multiple leptons and missing energy. These events exhibit a characteristic flavor pattern that may allow this class of models to be distinguished from other theories of dark matter.
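
    Schematically (our notation, not an equation drawn from the paper), the skewed structure amounts to a flavor-space coupling matrix with vanishing diagonal entries,

        $$ \mathcal{L} \;\supset\; \sum_{i \neq j} \lambda_{ij}\, \bar{\chi}_i\, \phi\, e_{Rj} + \text{h.c.}, \qquad \lambda_{ii} = 0, $$

    so the flavor-i dark matter state couples only to the right-handed charged leptons of the other two flavors.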

  20. Skew-flavored dark matter

    DOE PAGES

    Agrawal, Prateek; Chacko, Zackaria; Fortes, Elaine C. F. S.; ...

    2016-05-10

    We explore a novel flavor structure in the interactions of dark matter with the Standard Model. We consider theories in which both the dark matter candidate, and the particles that mediate its interactions with the Standard Model fields, carry flavor quantum numbers. The interactions are skewed in flavor space, so that a dark matter particle does not directly couple to the Standard Model matter fields of the same flavor, but only to the other two flavors. This framework respects minimal flavor violation and is, therefore, naturally consistent with flavor constraints. We study the phenomenology of a benchmark model in which dark matter couples to right-handed charged leptons. In large regions of parameter space, the dark matter can emerge as a thermal relic, while remaining consistent with the constraints from direct and indirect detection. The collider signatures of this scenario include events with multiple leptons and missing energy. These events exhibit a characteristic flavor pattern that may allow this class of models to be distinguished from other theories of dark matter.

  1. A metadata reporting framework for standardization and synthesis of ecohydrological field observations

    NASA Astrophysics Data System (ADS)

    Christianson, D. S.; Varadharajan, C.; Detto, M.; Faybishenko, B.; Gimenez, B.; Jardine, K.; Negron Juarez, R. I.; Pastorello, G.; Powell, T.; Warren, J.; Wolfe, B.; McDowell, N. G.; Kueppers, L. M.; Chambers, J.; Agarwal, D.

    2016-12-01

    The U.S. Department of Energy's (DOE) Next Generation Ecosystem Experiment (NGEE) Tropics project aims to develop a process-rich tropical forest ecosystem model that is parameterized and benchmarked by field observations. Thus, data synthesis, quality assurance and quality control (QA/QC), and data product generation of a diverse and complex set of ecohydrological observations, including sapflux, leaf surface temperature, soil water content, and leaf gas exchange from sites across the Tropics, are required to support model simulations. We have developed a metadata reporting framework, implemented in conjunction with the NGEE Tropics Data Archive tool, to enable cross-site and cross-method comparison, data interpretability, and QA/QC. We employed a modified User-Centered Design approach, which involved short development cycles based on user-identified needs, and iterative testing with data providers and users. The metadata reporting framework currently has been implemented for sensor-based observations and leverages several existing metadata protocols. The framework consists of templates that define a multi-scale measurement position hierarchy, descriptions of measurement settings, and details about data collection and data file organization. The framework also enables data providers to define data-access permission settings, provenance, and referencing to enable appropriate data usage, citation, and attribution. In addition to describing the metadata reporting framework, we discuss tradeoffs and impressions from both data providers and users during the development process, focusing on the scalability, usability, and efficiency of the framework.
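
    One hypothetical rendering of such a template (field names invented for illustration, not the NGEE Tropics schema) shows the multi-scale position hierarchy, collection details, and access settings for a single sensor stream:

    ```python
    import json

    record = {
        "variable": "sap_flux_density",
        "units": "g m-2 s-1",
        "position_hierarchy": {
            "site": "tropical_forest_site_A",   # coarsest spatial level
            "plot": "plot_03",
            "tree": "tree_117",
            "sensor_height_m": 1.3,             # finest level: sensor placement
        },
        "collection": {
            "sensor": "thermal_dissipation_probe",
            "sampling_interval_s": 30,
            "start": "2016-01-01T00:00:00Z",
        },
        "access": {"permission": "project_team", "citation_required": True},
    }

    print(json.dumps(record, indent=2))
    ```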

  2. Generic Educational Knowledge Representation for Adaptive and Cognitive Systems

    ERIC Educational Resources Information Center

    Caravantes, Arturo; Galan, Ramon

    2011-01-01

    The interoperability of educational systems, encouraged by the development of specifications, standards and tools related to the Semantic Web is limited to the exchange of information in domain and student models. High system interoperability requires that a common framework be defined that represents the functional essence of educational systems.…

  3. Linking Reflection and Technical Competence: The Logbook as an Instrument in Teacher Education.

    ERIC Educational Resources Information Center

    Korthagen, Fred A. J.

    1999-01-01

    Describes a framework for integrating reflection and teacher competency development into teacher education programs, introducing a spiral model for reflection, standard reflection questions, and a method of structuring logbooks, all designed to develop a competency for self-directed professional growth in interpersonal classroom behavior. An…

  4. Industrial Sectors Integrated Solutions (ISIS) - An Engineering–Economic and Multi-Pollutant Modeling Framework for Comprehensive Regulatory Analyses

    EPA Science Inventory

    Based on the National Academy of Science’s 2004 report, “Air Quality Management in the United States”, the National Research Council (NRC) recommended to the US Environmental Protection Agency (EPA) that standard setting, planning, and control strategy development should be based...

  5. Merging Information Literacy and Evidence-Based Practice in an Undergraduate Health Sciences Curriculum Map

    ERIC Educational Resources Information Center

    Franzen, Susan; Bannon, Colleen M.

    2016-01-01

    The ACRL's "Framework for Information Literacy for Higher Education" offers the opportunity to rethink information literacy teaching and curriculum. However, the ACRL's rescinded "Information Literacy Competency Standards for Higher Education" correlate with the preferred research and decision-making model of the health…

  6. Yoga Therapy and Polyvagal Theory: The Convergence of Traditional Wisdom and Contemporary Neuroscience for Self-Regulation and Resilience

    PubMed Central

    Sullivan, Marlysa B.; Erb, Matt; Schmalzl, Laura; Moonaz, Steffany; Noggle Taylor, Jessica; Porges, Stephen W.

    2018-01-01

    Yoga therapy is a newly emerging, self-regulating complementary and integrative healthcare (CIH) practice. It is growing in its professionalization, recognition and utilization with a demonstrated commitment to setting practice standards, educational and accreditation standards, and promoting research to support its efficacy for various populations and conditions. However, heterogeneity of practice, poor reporting standards, and lack of a broadly accepted understanding of the neurophysiological mechanisms involved in yoga therapy limits the structuring of testable hypotheses and clinical applications. Current proposed frameworks of yoga-based practices focus on the integration of bottom-up neurophysiological and top-down neurocognitive mechanisms. In addition, it has been proposed that phenomenology and first person ethical inquiry can provide a lens through which yoga therapy is viewed as a process that contributes towards eudaimonic well-being in the experience of pain, illness or disability. In this article we build on these frameworks, and propose a model of yoga therapy that converges with Polyvagal Theory (PVT). PVT links the evolution of the autonomic nervous system to the emergence of prosocial behaviors and posits that the neural platforms supporting social behavior are involved in maintaining health, growth and restoration. This explanatory model which connects neurophysiological patterns of autonomic regulation and expression of emotional and social behavior, is increasingly utilized as a framework for understanding human behavior, stress and illness. Specifically, we describe how PVT can be conceptualized as a neurophysiological counterpart to the yogic concept of the gunas, or qualities of nature. Similar to the neural platforms described in PVT, the gunas provide the foundation from which behavioral, emotional and physical attributes emerge. We describe how these two different yet analogous frameworks—one based in neurophysiology and the other in an ancient wisdom tradition—highlight yoga therapy’s promotion of physical, mental and social wellbeing for self-regulation and resilience. This parallel between the neural platforms of PVT and the gunas of yoga is instrumental in creating a translational framework for yoga therapy to align with its philosophical foundations. Consequently, yoga therapy can operate as a distinct practice rather than fitting into an outside model for its utilization in research and clinical contexts. PMID:29535617

  7. Astronomical Data Integration Beyond the Virtual Observatory

    NASA Astrophysics Data System (ADS)

    Lemson, G.; Laurino, O.

    2015-09-01

    "Data integration" generally refers to the process of combining data from different source data bases into a unified view. Much work has been devoted in this area by the International Virtual Observatory Alliance (IVOA), allowing users to discover and access databases through standard protocols. However, different archives present their data through their own schemas and users must still select, filter, and combine data for each archive individually. An important reason for this is that the creation of common data models that satisfy all sub-disciplines is fraught with difficulties. Furthermore it requires a substantial amount of work for data providers to present their data according to some standard representation. We will argue that existing standards allow us to build a data integration framework that works around these problems. The particular framework requires the implementation of the IVOA Table Access Protocol (TAP) only. It uses the newly developed VO data modelling language (VO-DML) specification, which allows one to define extensible object-oriented data models using a subset of UML concepts through a simple XML serialization language. A rich mapping language allows one to describe how instances of VO-DML data models are represented by the TAP service, bridging the possible mismatch between a local archive's schema and some agreed-upon representation of the astronomical domain. In this so called local-as-view approach to data integration, “mediators" use the mapping prescriptions to translate queries phrased in terms of the common schema to the underlying TAP service. This mapping language has a graphical representation, which we expose through a web based graphical “drag-and-drop-and-connect" interface. This service allows any user to map the holdings of any TAP service to the data model(s) of choice. The mappings are defined and stored outside of the data sources themselves, which allows the interface to be used in a kind of crowd-sourcing effort to annotate any remote database of interest. This reduces the burden of publishing one's data and allows a great flexibility in the definition of the views through which particular communities might wish to access remote archives. At the same time, the framework easies the user's effort to select, filter, and combine data from many different archives, so as to build knowledge bases for their analysis. We will present the framework and demonstrate a prototype implementation. We will discuss ideas for producing the missing elements, in particular the query language and the implementation of mediator tools to translate object queries to ADQL

  8. Standardizing the classification of abortion incidents: the Procedural Abortion Incident Reporting and Surveillance (PAIRS) Framework.

    PubMed

    Taylor, Diana; Upadhyay, Ushma D; Fjerstad, Mary; Battistelli, Molly F; Weitz, Tracy A; Paul, Maureen E

    2017-07-01

    To develop and validate standardized criteria for assessing abortion-related incidents (adverse events, morbidities, near misses) for first-trimester aspiration abortion procedures and to demonstrate the utility of a standardized framework [the Procedural Abortion Incident Reporting & Surveillance (PAIRS) Framework] for estimating serious abortion-related adverse events. As part of a California-based study of early aspiration abortion provision conducted between 2007 and 2013, we developed and validated a standardized framework for defining and monitoring first-trimester (≤14 weeks) aspiration abortion morbidity and adverse events using multiple methods: a literature review, framework criteria testing with empirical data, repeated expert reviews and data-based revisions to the framework. The final framework distinguishes incidents resulting from procedural abortion care (adverse events) from morbidity related to pregnancy, the abortion process and other non-abortion-related conditions. It further classifies incidents by diagnosis (confirmatory data, etiology, risk factors), management (treatment type and location), timing (immediate or delayed), seriousness (minor or major) and outcome. Empirical validation of the framework using data from 19,673 women receiving aspiration abortions revealed almost an equal proportion of total adverse events (n=205, 1.04%) and total abortion- or pregnancy-related morbidity (n=194, 0.99%). The majority of adverse events were due to retained products of conception (0.37%), failed attempted abortion (0.15%) and postabortion infection (0.17%). Serious or major adverse events were rare (n=11, 0.06%). Distinguishing morbidity diagnoses from adverse events using a standardized, empirically tested framework confirms the very low frequency of serious adverse events related to clinic-based abortion care. The PAIRS Framework provides a useful set of tools to systematically classify and monitor abortion-related incidents for first-trimester aspiration abortion procedures. Standardization will assist healthcare providers, researchers and policymakers to anticipate morbidity and prevent abortion adverse events, improve care metrics and enhance abortion quality. Copyright © 2017 Elsevier Inc. All rights reserved.

  9. Utilizing the National Research Council's (NRC) Conceptual Framework for the Next Generation Science Standards (NGSS): A Self-Study in My Science, Engineering, and Mathematics Classroom

    NASA Astrophysics Data System (ADS)

    Corvo, Arthur Francis

    Given the reality that active and competitive participation in the 21st century requires American students to deepen their scientific and mathematical knowledge base, the National Research Council (NRC) proposed a new conceptual framework for K--12 science education. The framework consists of an integration of what the NRC report refers to as the three dimensions: scientific and engineering practices, crosscutting concepts, and core ideas in four disciplinary areas (physical, life, and earth/space sciences, and engineering/technology). The Next Generation Science Standards (NGSS), which are derived from this new framework, were released in April 2013 and have implications for teacher learning and development in Science, Technology, Engineering, and Mathematics (STEM). Given the NGSS's recent introduction, there is little research on how teachers can prepare for its release. To meet this research need, I implemented a self-study aimed at examining my teaching practices and classroom outcomes through the lens of the NRC's conceptual framework and the NGSS. The self-study employed design-based research (DBR) methods to investigate what happened in my secondary classroom when I designed, enacted, and reflected on units of study for my science, engineering, and mathematics classes. I utilized various best practices, including the Learning for Use (LfU) and Understanding by Design (UbD) models for instructional design, talk moves as a tool for promoting discourse, and modeling instruction, for these designed units of study. The DBR strategy was chosen to promote reflective cycles, which are consistent with and in support of the self-study framework. A multiple-case, mixed-methods approach was used for data collection and analysis. The findings in the study are reported by study phase in terms of unit planning, unit enactment, and unit reflection. The findings have implications for science teaching, teacher professional development, and teacher education.

  10. BIAS: A Regional Management of Underwater Sound in the Baltic Sea.

    PubMed

    Sigray, Peter; Andersson, Mathias; Pajala, Jukka; Laanearu, Janek; Klauson, Aleksander; Tegowski, Jaroslaw; Boethling, Maria; Fischer, Jens; Tougaard, Jakob; Wahlberg, Magnus; Nikolopoulos, Anna; Folegot, Thomas; Matuschek, Rainer; Verfuss, Ursula

    2016-01-01

    Management of the impact of underwater sound is an emerging concern worldwide, and several countries are in the process of implementing regulatory legislation. In Europe, the Marine Strategy Framework Directive was launched in 2008. This framework addresses noise impacts, and the recommendation is to deal with them at a regional level. The Baltic Sea is a semi-enclosed area bordered by nine states, and its shipping traffic is among the densest in Europe; the number of ships is estimated to double by 2030. Undoubtedly, given the unbounded character of noise, efficient management of sound in the Baltic Sea must be done on a regional scale. In line with the European Union directive, the Baltic Sea Information on the Acoustic Soundscape (BIAS) project was established to implement Descriptor 11 of the Marine Strategy Framework Directive in the Baltic Sea region. BIAS will develop tools, standards, and methodologies that allow for cross-border handling of data and results; measure sound in 40 locations for 1 year; establish a seasonal soundscape map by combining measured sound with advanced three-dimensional modeling; and, finally, establish standards for measuring continuous sound. Results from the first phase of BIAS are presented here, with an emphasis on standards and soundscape mapping as well as the challenges related to regional handling.
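
    The abstract reports sound measurements and soundscape maps without giving formulas; for orientation, underwater sound pressure level is conventionally expressed in decibels referenced to 1 μPa (the reference value below is the standard underwater-acoustics convention, not a number taken from BIAS):

```latex
\[
L_p = 20\,\log_{10}\!\left(\frac{p_{\mathrm{rms}}}{p_{\mathrm{ref}}}\right)\ \mathrm{dB}
\qquad \text{re } p_{\mathrm{ref}} = 1~\mu\mathrm{Pa}
\]
```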

  11. Integration of Continuous-Time Dynamics in a Spiking Neural Network Simulator.

    PubMed

    Hahne, Jan; Dahmen, David; Schuecker, Jannis; Frommer, Andreas; Bolten, Matthias; Helias, Moritz; Diesmann, Markus

    2017-01-01

    Contemporary modeling approaches to the dynamics of neural networks include two important classes of models: biologically grounded spiking neuron models and functionally inspired rate-based units. We present a unified simulation framework that supports the combination of the two for multi-scale modeling, enables the quantitative validation of mean-field approaches by spiking network simulations, and provides an increase in reliability by usage of the same simulation code and the same network model specifications for both model classes. While most spiking simulations rely on the communication of discrete events, rate models require time-continuous interactions between neurons. Exploiting the conceptual similarity to the inclusion of gap junctions in spiking network simulations, we arrive at a reference implementation of instantaneous and delayed interactions between rate-based models in a spiking network simulator. The separation of rate dynamics from the general connection and communication infrastructure ensures flexibility of the framework. In addition to the standard implementation we present an iterative approach based on waveform-relaxation techniques to reduce communication and increase performance for large-scale simulations of rate-based models with instantaneous interactions. Finally we demonstrate the broad applicability of the framework by considering various examples from the literature, ranging from random networks to neural-field models. The study provides the prerequisite for interactions between rate-based and spiking models in a joint simulation.
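
    The paper's reference implementation lives inside a spiking network simulator (NEST, in the authors' case), but the time-continuous rate dynamics it embeds can be illustrated standalone. A minimal forward-Euler sketch of a generic rate network; all parameters and the nonlinearity are chosen arbitrarily for illustration, and this is not the paper's waveform-relaxation scheme:

```python
import numpy as np

# tau * dx/dt = -x + W @ f(x) + I_ext  -- generic rate dynamics, integrated
# with forward Euler for illustration only.
rng = np.random.default_rng(0)
n, tau, dt, steps = 50, 10.0, 0.1, 2000
W = rng.normal(0.0, 1.0 / np.sqrt(n), (n, n))   # random coupling matrix
I_ext = rng.normal(0.0, 0.5, n)                 # constant external drive
f = np.tanh                                     # rate nonlinearity
x = np.zeros(n)
for _ in range(steps):
    x += (dt / tau) * (-x + W @ f(x) + I_ext)
print("stationary mean rate:", float(f(x).mean()))
```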

  12. Integration of Continuous-Time Dynamics in a Spiking Neural Network Simulator

    PubMed Central

    Hahne, Jan; Dahmen, David; Schuecker, Jannis; Frommer, Andreas; Bolten, Matthias; Helias, Moritz; Diesmann, Markus

    2017-01-01

    Contemporary modeling approaches to the dynamics of neural networks include two important classes of models: biologically grounded spiking neuron models and functionally inspired rate-based units. We present a unified simulation framework that supports the combination of the two for multi-scale modeling, enables the quantitative validation of mean-field approaches by spiking network simulations, and provides an increase in reliability by usage of the same simulation code and the same network model specifications for both model classes. While most spiking simulations rely on the communication of discrete events, rate models require time-continuous interactions between neurons. Exploiting the conceptual similarity to the inclusion of gap junctions in spiking network simulations, we arrive at a reference implementation of instantaneous and delayed interactions between rate-based models in a spiking network simulator. The separation of rate dynamics from the general connection and communication infrastructure ensures flexibility of the framework. In addition to the standard implementation we present an iterative approach based on waveform-relaxation techniques to reduce communication and increase performance for large-scale simulations of rate-based models with instantaneous interactions. Finally we demonstrate the broad applicability of the framework by considering various examples from the literature, ranging from random networks to neural-field models. The study provides the prerequisite for interactions between rate-based and spiking models in a joint simulation. PMID:28596730

  13. High-Performance Data Analytics Beyond the Relational and Graph Data Models with GEMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Castellana, Vito G.; Minutoli, Marco; Bhatt, Shreyansh

    Graphs represent an increasingly popular data model for data analytics, since they can naturally represent relationships and interactions between entities. Relational databases and their pure table-based data model are not well suited to storing and processing sparse data. Consequently, graph databases have gained interest in the last few years, and the Resource Description Framework (RDF) became the standard data model for graph data. Nevertheless, while RDF is well suited to analyzing the relationships between entities, it is not efficient in representing their attributes and properties. In this work we propose the adoption of a new hybrid data model, based on attributed graphs, that aims at overcoming the limitations of the pure relational and graph data models. We present how we have re-designed the GEMS data-analytics framework to take full advantage of the proposed hybrid data model. To improve analysts' productivity, in addition to a C++ API for application development, we adopt GraQL as the input query language. We validate our approach by implementing a set of queries on net-flow data, and we compare our framework's performance against Neo4j. Experimental results show significant performance improvement over Neo4j, up to several orders of magnitude when increasing the size of the input data.
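
    The abstract does not reproduce any GraQL, so the sketch below uses networkx only to illustrate what an attributed-graph data model buys for the net-flow use case: relationships and per-edge properties queried together. All node names, attribute keys, and values are hypothetical:

```python
import networkx as nx

# Attributed-graph sketch of the net-flow use case: nodes carry host
# attributes, edges carry flow properties such as port and byte count.
G = nx.MultiDiGraph()
G.add_node("10.0.0.1", role="server")
G.add_node("10.0.0.2", role="client")
G.add_edge("10.0.0.2", "10.0.0.1", port=443, bytes=2_500_000)
G.add_edge("10.0.0.2", "10.0.0.1", port=53, bytes=1_200)

# "Query": large flows on port 443 -- structure and attributes in one pass,
# which is awkward in a pure table-based or pure RDF model.
big = [(u, v, d) for u, v, d in G.edges(data=True)
       if d["port"] == 443 and d["bytes"] > 1_000_000]
print(big)
```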

  14. Confirmatory Factor Analysis of the Patient Reported Outcomes Measurement Information System (PROMIS) Adult Domain Framework Using Item Response Theory Scores.

    PubMed

    Carle, Adam C; Riley, William; Hays, Ron D; Cella, David

    2015-10-01

    To guide measure development, National Institutes of Health-supported Patient Reported Outcomes Measurement Information System (PROMIS) investigators developed a hierarchical domain framework. The framework specifies health domains at multiple levels. The initial PROMIS domain framework specified that physical function and symptoms such as Pain and Fatigue indicate Physical Health (PH); Depression, Anxiety, and Anger indicate Mental Health (MH); and Social Role Performance and Social Satisfaction indicate Social Health (SH). We used confirmatory factor analyses to evaluate the fit of the hypothesized framework to data collected from a large sample. We used data (n=14,098) from PROMIS's wave 1 field test and estimated domain scores using the PROMIS item response theory parameters. We then used confirmatory factor analyses to test whether the domains corresponded to the PROMIS domain framework as expected. A model corresponding to the domain framework did not provide ideal fit [root mean square error of approximation (RMSEA)=0.13; comparative fit index (CFI)=0.92; Tucker-Lewis Index (TLI)=0.88; standardized root mean square residual (SRMR)=0.09]. On the basis of modification indices and exploratory factor analyses, we allowed Fatigue to load on both PH and MH. This model fit the data acceptably (RMSEA=0.08; CFI=0.97; TLI=0.96; SRMR=0.03). Our findings generally support the PROMIS domain framework. Allowing Fatigue to load on both PH and MH improved fit considerably.
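
    For readers interpreting the reported indices, these are the standard textbook definitions of two of them, with N the sample size and the subscripts t and 0 denoting the tested and null models (exact definitions vary slightly across software packages):

```latex
\[
\mathrm{RMSEA} = \sqrt{\frac{\max(\chi^2_t - df_t,\,0)}{df_t\,(N-1)}},
\qquad
\mathrm{CFI} = 1 - \frac{\max(\chi^2_t - df_t,\,0)}{\max(\chi^2_0 - df_0,\;\chi^2_t - df_t,\;0)}
\]
```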

  15. ICW eHealth Framework.

    PubMed

    Klein, Karsten; Wolff, Astrid C; Ziebold, Oliver; Liebscher, Thomas

    2008-01-01

    The ICW eHealth Framework (eHF) is a powerful infrastructure and platform for the development of service-oriented solutions in the health care business. It is the culmination of ICW's many years of experience in developing and using in-house health care solutions, and it forms the foundation of ICW product development based on the Java Enterprise Edition (Java EE). The ICW eHealth Framework has been opened to development by external partners, enabling adopters to integrate their applications straightforwardly into ICW solutions. The ICW eHealth Framework consists of reusable software components, development tools, and architectural guidelines and conventions defining a full software development and product lifecycle. From the perspective of a partner, the framework provides services and infrastructure capabilities for integrating applications within an eHF-based solution. This article introduces the ICW eHealth Framework's basic architectural concepts and technologies. It provides an overview of its module and component model, describes the development platform that supports the complete software development lifecycle of health care applications and outlines technological aspects, mainly focusing on application development frameworks and open standards.

  16. A framework for automatic creation of gold-standard rigid 3D-2D registration datasets.

    PubMed

    Madan, Hennadii; Pernuš, Franjo; Likar, Boštjan; Špiclin, Žiga

    2017-02-01

    Advanced image-guided medical procedures incorporate 2D intra-interventional information into the pre-interventional 3D image and procedure plan through 3D/2D image registration (32R). To enter clinical use, and even for publication purposes, novel and existing 32R methods have to be rigorously validated. The performance of a 32R method can be estimated by comparing it to an accurate reference or gold standard method (usually based on fiducial markers) on the same set of images (a gold standard dataset). Objective validation and comparison of methods are possible only if the evaluation methodology is standardized and the gold standard dataset is made publicly available. Currently, very few such datasets exist, and only one contains images of multiple patients acquired during a procedure. To encourage the creation of gold standard 32R datasets, we propose an automatic framework. The framework is based on rigid registration of fiducial markers. The main novelty is spatial grouping of fiducial markers on the carrier device, which enables automatic marker localization and identification across the 3D and 2D images. The proposed framework was demonstrated on clinical angiograms of 20 patients. Rigid 32R computed by the framework was more accurate than that obtained manually, with the respective target registration error below 0.027 mm compared to 0.040 mm. The framework is applicable to gold standard setup on any rigid anatomy, provided that the acquired images contain spatially grouped fiducial markers. The gold standard datasets and software will be made publicly available.
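
    The target registration error quoted above is conventionally computed over a set of target points by comparing the evaluated transform against the gold standard one; a common form (the exact point set and aggregation used by the authors is not stated in the abstract) is:

```latex
\[
\mathrm{TRE} = \frac{1}{n}\sum_{i=1}^{n}\bigl\lVert T(p_i) - T_{\mathrm{gs}}(p_i)\bigr\rVert_2
\]
```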

  17. ICHEP 2014 Summary: Theory Status after the First LHC Run

    NASA Astrophysics Data System (ADS)

    Pich, Antonio

    2016-04-01

    A brief overview of the main highlights discussed at ICHEP 2014 is presented. The experimental data confirm that the scalar boson discovered at the LHC couples to other particles as predicted in the Standard Model. This constitutes a great success of the present theoretical paradigm, which has been confirmed as the correct description at the electroweak scale. At the same time, the negative results of searches for signals of new phenomena tightly constrain many new-physics scenarios, challenging previous theoretical wisdom and opening new perspectives in fundamental physics. Fresh ideas are needed to face the many pending questions left unanswered within the Standard Model framework.

  18. CF Metadata Conventions: Founding Principles, Governance, and Future Directions

    NASA Astrophysics Data System (ADS)

    Taylor, K. E.

    2016-12-01

    The CF Metadata Conventions define attributes that promote the sharing of climate and forecasting data and facilitate automated processing by computers. The development, maintenance, and evolution of the conventions have mainly been provided by voluntary community contributions. Nevertheless, an organizational framework has been established, which relies on established rules and web-based discussion to ensure the smooth (and relatively efficient) evolution of the standard to accommodate new types of data. The CF standard has been essential to the success of high-profile, internationally coordinated modeling activities (e.g., the Coupled Model Intercomparison Project). A summary of CF's founding principles and the prospects for its future evolution will be discussed.
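
    To make the convention concrete, here is a minimal sketch of writing CF-style metadata with the netCDF4-python library; the file name, data values, and the CF version string are placeholders, while air_temperature is a genuine entry in the CF standard name table:

```python
import numpy as np
from netCDF4 import Dataset

# Minimal CF-style file: the standardized attributes (Conventions,
# standard_name, units) are what let software process the data automatically.
ds = Dataset("example_cf.nc", "w")
ds.Conventions = "CF-1.8"                 # placeholder version string
ds.createDimension("time", 3)
time = ds.createVariable("time", "f8", ("time",))
time.standard_name = "time"
time.units = "days since 2000-01-01"
time[:] = [0.0, 1.0, 2.0]
tas = ds.createVariable("tas", "f4", ("time",))
tas.standard_name = "air_temperature"     # from the CF standard name table
tas.units = "K"
tas[:] = np.array([287.1, 287.9, 288.4], dtype="f4")
ds.close()
```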

  19. The deegree framework - Spatial Data Infrastructure solution for end-users and developers

    NASA Astrophysics Data System (ADS)

    Kiehle, Christian; Poth, Andreas

    2010-05-01

    The open source software framework deegree is a comprehensive implementation of standards defined by ISO and the Open Geospatial Consortium (OGC). It has been developed with two goals in mind: to provide a uniform framework for implementing Spatial Data Infrastructures (SDI) and to adhere to standards as strictly as possible. Although it is open source software (GNU Lesser General Public License, LGPL), deegree has been developed with a business model in mind: the general building blocks of SDIs are provided without license fees, while specialized companies offer customization, consulting and tailoring. The core of deegree is a comprehensive Java Application Programming Interface (API) offering access to spatial features, analysis, metadata and coordinate reference systems. As a library, deegree can be, and has been, integrated as a core module inside spatial information systems. It is the reference implementation for several OGC standards and is based on an ISO 19107 geometry model. For end users, deegree is shipped as a web application providing easy-to-set-up components for web mapping and spatial analysis. Since 2000, deegree has been the backbone of many productive SDIs, first and foremost for governmental stakeholders (e.g. the Federal Agency for Cartography and Geodesy in Germany and the Ministry of Housing, Spatial Planning and the Environment in the Netherlands) as well as for research and development projects as an early adopter of standards, drafts and discussion papers. Besides mature standards like the Web Map Service, Web Feature Service and Catalogue Services, deegree also implements newer standards like the Sensor Observation Service, the Web Processing Service and the Web Coordinate Transformation Service (WCTS). While a robust background in standardization (knowledge and implementation) is a must for consultancy, standard-compliant services and encodings alone do not provide solutions for customers. The added value comes from a sophisticated set of client software for desktop and web environments. One focus lies on client solutions for specific standards like the Web Processing Service and the Web Coordinate Transformation Service. On the other hand, complex geoportal solutions that combine multiple standards and are enhanced by components for user management, security and map client functionality show the demanding requirements of real-world solutions. The XPlanGML standard defined by the German spatial planning authorities is a good example of how complex real-world requirements can become. XPlanGML is intended to provide a framework for digital spatial planning documents and requires complex Geography Markup Language (GML) features along with Symbology Encoding (SE), Filter Encoding (FE), Web Map Services (WMS) and Web Feature Services (WFS). This complex infrastructure is intended for urban and spatial planners and therefore requires a user-friendly graphical interface that hides the complexity of the underlying infrastructure. The challenges faced within customer projects underline the importance of easy-to-use software components: SDI solutions should be built upon ISO/OGC standards but, more importantly, should be user-friendly and support users in spatial data management and analysis.

  20. Modelling Common Agricultural Policy-Water Framework Directive interactions and cost-effectiveness of measures to reduce nitrogen pollution.

    PubMed

    Mouratiadou, Ioanna; Russell, Graham; Topp, Cairistiona; Louhichi, Kamel; Moran, Dominic

    2010-01-01

    Selecting cost-effective measures to regulate agricultural water pollution to conform to the Water Framework Directive presents multiple challenges. A bio-economic modelling approach is presented that has been used to explore the water quality and economic effects of the 2003 Common Agricultural Policy Reform and to assess the cost-effectiveness of input quotas and emission standards against nitrate leaching, in a representative case study catchment in Scotland. The approach combines a biophysical model (NDICEA) with a mathematical programming model (FSSIM-MP). The results indicate only small changes due to the Reform, with the main changes in farmers' decision making and the associated economic and water quality indicators depending on crop price changes, and suggest the use of target fertilisation in relation to crop and soil requirements, as opposed to measures targeting farm total or average nitrogen use.

  1. Medical Differential Diagnosis (MDD) as the Architectural Framework for a Knowledge Model: A Vulnerability Detection and Threat Identification Methodology for Cyber-Crime and Cyber-Terrorism

    ERIC Educational Resources Information Center

    Conley-Ware, Lakita D.

    2010-01-01

    This research addresses a real world cyberspace problem, where currently no cross industry standard methodology exists. The goal is to develop a model for identification and detection of vulnerabilities and threats of cyber-crime or cyber-terrorism where cyber-technology is the vehicle to commit the criminal or terrorist act (CVCT). This goal was…

  2. Operational Resiliency Management: An Introduction to the Resiliency Engineering Framework

    DTIC Science & Technology

    2006-09-20

    [Presentation slide fragments; only partial content is recoverable.] The approach builds on Capability Maturity Model Integration (CMMI). A model is needed to: identify and prioritize risk exposures; define a process improvement roadmap; measure and facilitate ... Why use a "model" approach? It provides an operational risk roadmap and is vendor-neutral, standardized, and unbiased.

  3. Quantile regression via vector generalized additive models.

    PubMed

    Yee, Thomas W

    2004-07-30

    One of the most popular methods for quantile regression is the LMS method of Cole and Green. The method falls naturally within a penalized likelihood framework, and consequently allows for considerable flexibility because all three parameters may be modelled by cubic smoothing splines. The model is also very understandable: for a given value of the covariate, the LMS method applies a Box-Cox transformation to the response in order to transform it to standard normality; to obtain the quantiles, an inverse Box-Cox transformation is applied to the quantiles of the standard normal distribution. The purposes of this article are three-fold. Firstly, LMS quantile regression is presented within the framework of the class of vector generalized additive models. This confers a number of advantages, such as a unifying theory and estimation process. Secondly, a new LMS method based on the Yeo-Johnson transformation is proposed, which has the advantage that the response is not restricted to be positive. Lastly, this paper describes a software implementation of three LMS quantile regression methods in the S language. This includes the LMS-Yeo-Johnson method, which is estimated efficiently by a new numerical integration scheme. The LMS-Yeo-Johnson method is illustrated by way of a large cross-sectional data set from a New Zealand working population. Copyright 2004 John Wiley & Sons, Ltd.
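
    The quantile construction described above can be written down directly: with L the Box-Cox power, M the median, and S a coefficient of variation at a given covariate value, the alpha-quantile is the inverse Box-Cox transform of the standard normal quantile. A minimal sketch; the parameter values are illustrative, not fitted to any data:

```python
import numpy as np
from scipy.stats import norm

def lms_quantile(alpha: float, L: float, M: float, S: float) -> float:
    """alpha-quantile from LMS parameters: inverse Box-Cox transform
    applied to the standard normal quantile z_alpha."""
    z = norm.ppf(alpha)
    if abs(L) < 1e-12:                  # L -> 0 limit (log transform)
        return M * np.exp(S * z)
    return M * (1.0 + L * S * z) ** (1.0 / L)

# Illustrative values only: median 25.0, 12% spread, mild skew.
for a in (0.05, 0.50, 0.95):
    print(a, round(lms_quantile(a, L=-0.3, M=25.0, S=0.12), 2))
```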

  4. Dark Matter "Collider" from Inelastic Boosted Dark Matter.

    PubMed

    Kim, Doojin; Park, Jong-Chul; Shin, Seodong

    2017-10-20

    We propose a novel dark matter (DM) detection strategy for models with a nonminimal dark sector. The main ingredients in the underlying DM scenario are a boosted DM particle and a heavier dark sector state. The relativistic DM particle impinging on the target material scatters inelastically to the heavier state, which subsequently decays into DM along with lighter states, including visible (standard model) particles. The expected signal event is therefore accompanied by a visible signature from the secondary cascade process associated with the recoil of the target particle, differing from the typical neutrino signal, which involves no secondary signature. We then discuss various kinematic features, followed by DM detection prospects at large-volume neutrino detectors, within a model framework where a dark gauge boson is the mediator between the standard model particles and DM.

  5. Bargaining Agents in Wireless Contexts: An Alternating-Offers Protocol for Multi-issue Bilateral Negotiation in Mobile Marketplaces

    NASA Astrophysics Data System (ADS)

    Ragone, Azzurra; Ruta, Michele; di Sciascio, Eugenio; Donini, Francesco M.

    We present an approach to multi-issue bilateral negotiation for mobile commerce scenarios. The negotiation mechanism has been integrated in a semantic-based application layer enhancing both the RFID and Bluetooth wireless standards. OWL DL has been used to model advertisements and relationships among issues within a shared common ontology. Finally, non-standard inference services integrated with utility theory help in finding suitable agreements. We illustrate and motivate the provided theoretical framework with a wireless commerce case study.

  6. System Software Framework for System of Systems Avionics

    NASA Technical Reports Server (NTRS)

    Ferguson, Roscoe C.; Peterson, Benjamin L; Thompson, Hiram C.

    2005-01-01

    Project Constellation implements NASA's vision for space exploration to expand human presence in our solar system. The engineering focus of this project is developing a system of systems architecture. This architecture allows for the incremental development of the overall program: systems can be built and connected in a "Lego style" manner to generate configurations supporting various mission objectives. The development of the avionics or control systems of such a massive project will result in concurrent engineering. Also, each system will have software and the need to communicate with other (possibly heterogeneous) systems. Fortunately, this design problem has already been solved during the creation and evolution of systems such as the Internet and the Department of Defense's successful effort to standardize distributed simulation (now IEEE 1516). The solution relies on the use of a standard layered software framework and a communication protocol. Such a framework and communication protocol are suggested for the development and maintenance of Project Constellation systems. The ARINC 653 standard is a great start for such a common software framework. This paper proposes a common system software framework that uses the Real Time Publish/Subscribe protocol for framework-to-framework communication to extend ARINC 653. It is highly recommended that such a framework be established before development, as this is important for the success of concurrent engineering. The framework provides an infrastructure for general system services and is designed for flexibility to support a spiral development effort.
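
    To make the decoupling idea concrete, here is a generic publish/subscribe sketch. It is not the Real Time Publish/Subscribe wire protocol or an ARINC 653 API, just the pattern that lets heterogeneous systems communicate without direct knowledge of each other; all names are hypothetical:

```python
from collections import defaultdict
from typing import Any, Callable

class Bus:
    """Minimal topic-based publish/subscribe bus."""
    def __init__(self) -> None:
        self._subs: dict[str, list[Callable[[Any], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[Any], None]) -> None:
        self._subs[topic].append(handler)

    def publish(self, topic: str, payload: Any) -> None:
        # Publishers need no knowledge of subscribers, only the topic name.
        for handler in self._subs[topic]:
            handler(payload)

bus = Bus()
bus.subscribe("telemetry/thermal", lambda m: print("subsystem A got:", m))
bus.publish("telemetry/thermal", {"sensor": "T1", "kelvin": 291.4})
```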

  7. Standardized Competencies for Parenteral Nutrition Order Review and Parenteral Nutrition Preparation, Including Compounding: The ASPEN Model.

    PubMed

    Boullata, Joseph I; Holcombe, Beverly; Sacks, Gordon; Gervasio, Jane; Adams, Stephen C; Christensen, Michael; Durfee, Sharon; Ayers, Phil; Marshall, Neil; Guenter, Peggi

    2016-08-01

    Parenteral nutrition (PN) is a high-alert medication with a complex drug use process. Key steps in the process include the review of each PN prescription followed by the preparation of the formulation. The preparation step includes compounding the PN or activating a standardized commercially available PN product. The verification and review, as well as preparation of this complex therapy, require competency that may be determined by using a standardized process for pharmacists and for pharmacy technicians involved with PN. An American Society for Parenteral and Enteral Nutrition (ASPEN) standardized model for PN order review and PN preparation competencies is proposed based on a competency framework, the ASPEN-published interdisciplinary core competencies, safe practice recommendations, and clinical guidelines, and is intended for institutions and agencies to use with their staff. © 2016 American Society for Parenteral and Enteral Nutrition.

  8. Framework for industry engagement and quality principles for industry-provided medical education in Europe

    PubMed Central

    Allen, Tamara; Donde, Nina; Hofstädter-Thalmann, Eva; Keijser, Sandra; Moy, Veronique; Murama, Jean-Jacques; Kellner, Thomas

    2017-01-01

    ABSTRACT Lifelong learning through continuing professional development (CPD) and medical education is critical for healthcare professionals to stay abreast of knowledge and skills and provide an optimal standard of care to patients. In Europe, CPD and medical education are fragmented as there are numerous models, providers and national regulations and a lack of harmonisation of qualitative criteria. There is continued debate on the appropriate role of pharmaceutical companies in the context of medical education. Accrediting bodies such as European Accreditation Council for Continuing Medical Education do not permit active involvement of the pharmaceutical industry due to concerns around conflicts of interest and potential for bias. However, many examples of active collaboration between pharmaceutical companies and medical societies and scientific experts exist, demonstrating high integrity, clear roles and responsibilities, and fair and balanced content. Medical education experts from 16 pharmaceutical companies met to develop a set of quality principles similar to standards that have been established for clinical trials and in alignment with existing principles of accrediting bodies. This paper outlines their proposal for a framework to improve and harmonise medical education quality standards in Europe, and is also an invitation for all stakeholders to join a discussion on this integrative model. PMID:29644135

  9. Implications of new physics in the decays Bc→(J/ψ, ηc)τν

    NASA Astrophysics Data System (ADS)

    Tran, C. T.; Ivanov, M. A.; Körner, J. G.; Santorelli, P.

    2018-03-01

    We study the semileptonic decays of the Bc meson into final charmonium states within the standard model and beyond. The relevant hadronic transition form factors are calculated in the framework of the covariant confined quark model developed by us. We focus on the tau mode of these decays, which may provide some hints of new physics effects. We extend the standard model by assuming a general effective Hamiltonian describing the b → c τ ν transition, which consists of the full set of four-fermion operators. We then obtain experimental constraints on the Wilson coefficients corresponding to each operator and provide predictions for the branching fractions and for polarization observables in different new physics scenarios.
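
    The abstract does not spell out the operator basis; the general b → c τ ν effective Hamiltonian typically assumed in such analyses has the form below (normalization and operator labels vary between papers, so treat this as a representative convention rather than the authors' exact one):

```latex
\[
\mathcal{H}_{\mathrm{eff}} = \frac{4 G_F}{\sqrt{2}}\, V_{cb}
\left[(1 + V_L)\,\mathcal{O}_{V_L} + V_R\,\mathcal{O}_{V_R}
 + S_L\,\mathcal{O}_{S_L} + S_R\,\mathcal{O}_{S_R} + T_L\,\mathcal{O}_{T_L}\right]
\]
```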

  10. BioNet Digital Communications Framework

    NASA Technical Reports Server (NTRS)

    Gifford, Kevin; Kuzminsky, Sebastian; Williams, Shea

    2010-01-01

    BioNet v2 is a peer-to-peer middleware that enables digital communication devices to talk to each other. It provides a software development framework, a standardized application interface, network-transparent device integration services, a flexible messaging model, and network communications for distributed applications. BioNet is an implementation of the Constellation Program Command, Control, Communications and Information (C3I) Interoperability specification, given in CxP 70022-01. The system architecture provides the necessary infrastructure for the integration of heterogeneous wired and wireless sensing and control devices into a unified data system with a standardized application interface, providing plug-and-play operation for hardware and software systems. BioNet v2 features a naming schema for mobility and coarse-grained localization information, data normalization within a network-transparent device driver framework, support for network communications to non-IP devices, and fine-grained application control of data subscription bandwidth usage. BioNet directly integrates Disruption Tolerant Networking (DTN) as a communications technology, enabling networked communications with assets that are only intermittently connected, including orbiting relay satellites and planetary rover vehicles.

  11. A Five-Factor Model framework for understanding childhood personality disorder antecedents.

    PubMed

    De Clercq, Barbara; De Fruyt, Filip

    2012-12-01

    The present contribution reviews evidence that supports the relevance of childhood antecedents of personality disorders, and advocates that the validity of a Five-Factor Model framework for describing general trait differences in childhood can be extended towards the field of developmental personality difficulties. In addition, we suggest that several traditional childhood Axis I conditions include a substantial trait component that may be responsible for the recurring finding that childhood Axis I disorders are predictive for adult Axis II disorders. Given the valuable information provided by a trait assessment, we further propose to integrate dimensional personality and personality pathology measures as standard tools in mental health assessments at a young age. © 2012 The Authors. Journal of Personality © 2012, Wiley Periodicals, Inc.

  12. A Model-Based Probabilistic Inversion Framework for Wire Fault Detection Using TDR

    NASA Technical Reports Server (NTRS)

    Schuet, Stefan R.; Timucin, Dogan A.; Wheeler, Kevin R.

    2010-01-01

    Time-domain reflectometry (TDR) is one of the standard methods for diagnosing faults in electrical wiring and interconnect systems, with a long-standing history focused mainly on hardware development of both high-fidelity systems for laboratory use and portable hand-held devices for field deployment. While these devices can easily assess distance to hard faults such as sustained opens or shorts, their ability to assess subtle but important degradation such as chafing remains an open question. This paper presents a unified framework for TDR-based chafing fault detection in lossy coaxial cables by combining an S-parameter based forward modeling approach with a probabilistic (Bayesian) inference algorithm. Results are presented for the estimation of nominal and faulty cable parameters from laboratory data.

  13. Beauty and the beast: Aligning national curriculum standards with state (high school) graduation requirements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Linder-Scholer, B.

    1994-12-31

    An overview of SCI/MATH/MN - Minnesota's standards-based, systemic approach to the reform and improvement of the K-12 science and mathematics education delivery system - is offered as an illustration of the challenges of aligning state educational practices with the national curriculum standards, and as a model for business involvement in state educational policy issues that will enable fundamental, across-the-system reform. SCI/MATH/MN illustrates the major challenges involved in developing a statewide vision for math and science education reform, articulating frameworks aligned with the national standards, building capacity for system-oriented change at the local level, and involving business in systemic reform.

  14. Framework for Leading Next Generation Science Standards Implementation

    ERIC Educational Resources Information Center

    Stiles, Katherine; Mundry, Susan; DiRanna, Kathy

    2017-01-01

    In response to the need to develop leaders to guide the implementation of the Next Generation Science Standards (NGSS), the Carnegie Corporation of New York provided funding to WestEd to develop a framework that defines the leadership knowledge and actions needed to effectively implement the NGSS. The development of the framework entailed…

  15. Early Learning Foundations. Indiana's Early Learning Development Framework Aligned to the Indiana Academic Standards, 2014

    ERIC Educational Resources Information Center

    Indiana Department of Education, 2015

    2015-01-01

    The "Foundations" (English/language arts, mathematics, social emotional skills, approaches to play and learning, science, social studies, creative arts, and physical health and growth) are Indiana's early learning development framework and are aligned to the 2014 Indiana Academic Standards. This framework provides core elements that…

  16. The Importance of the C3 Framework

    ERIC Educational Resources Information Center

    Social Education, 2013

    2013-01-01

    "The C3 Framework for Social Studies State Standards will soon be released under the title "The College, Career, and Civic Life (C3) Framework for Social Studies State Standards: State Guidance for Enhancing the Rigor of K-12 Civics, Economics, Geography, and History." The C3 Project Director and Lead Writer was NCSS member Kathy…

  17. Teaching ear reconstruction using an alloplastic carving model.

    PubMed

    Murabit, Amera; Anzarut, Alexander; Kasrai, Laila; Fisher, David; Wilkes, Gordon

    2010-11-01

    Ear reconstruction is challenging surgery, often with poor outcomes. Our purpose was to develop a surgical training model for auricular reconstruction. Silicone costal cartilage models were incorporated in a workshop-based instructional program. Trainees were randomly divided into two groups: the workshop group (WG) participated in an interactive session, carving frameworks under supervision, while the nonworkshop group (NWG) did not participate. Standard Nagata templates were used. Two further frameworks were created, the first with supervision and the second without. The groups were combined after the first carving because of frustration in the NWG. Assessment was completed by 3 microtia surgeons from 2 different centers, blinded to framework origin. Frameworks were rated out of 10 using Likert and visual analog scales. Results were examined using SPSS (version 14), with t test, ANOVA, and Bonferroni post hoc analyses. Cartilaginous frameworks from the WG scored better for the first carving (WG 5.5 vs NWG 4.4), the NWG improved for the second carving (WG 6.6 vs NWG 6.5), and both groups scored lower on the third, unsupervised carving (WG 5.9 vs NWG 5.6). Combined scores after 3 frameworks were not statistically significantly different between the original groups. A statistically significant improvement was demonstrated for all carvers between sessions 1 and 2 (P ≤ 0.09) and between sessions 1 and 3 (P ≤ 0.05), but not between sessions 2 and 3, suggesting the necessity of in vitro practice until high scores are achieved and maintained without supervision before embarking on in vivo carvings. Quality of carvings was not related to level of training. An appropriate and applicable surgical training model and training method can aid in attaining the skills necessary for successful auricular reconstruction.

  18. Program and Project Management Framework

    NASA Technical Reports Server (NTRS)

    Butler, Cassandra D.

    2002-01-01

    The primary objective of this project was to develop a framework and system architecture for integrating program and project management tools that may be applied consistently throughout Kennedy Space Center (KSC) to optimize planning, cost estimating, risk management, and project control. Project management methodology used in building interactive systems to accommodate the needs of the project managers is applied as a key component in assessing the usefulness and applicability of the framework and tools developed. Research for the project included investigation and analysis of industrial practices; KSC standards, policies, and techniques; input from Systems Management Office (SMO) personnel; and other documented experiences of project management experts. In addition, this project documents best practices derived from the literature as well as new or developing project management models, practices, and techniques.

  19. Bayes factors and multimodel inference

    USGS Publications Warehouse

    Link, W.A.; Barker, R.J.; Thomson, David L.; Cooch, Evan G.; Conroy, Michael J.

    2009-01-01

    Multimodel inference has two main themes: model selection and model averaging. Model averaging is a means of making inference conditional on a model set, rather than on a selected model, allowing formal recognition of the uncertainty associated with model choice. The Bayesian paradigm provides a natural framework for model averaging, and provides a context for evaluation of the commonly used AIC weights. We review Bayesian multimodel inference, noting the importance of Bayes factors. Given the sensitivity of Bayes factors to the choice of priors on parameters, we define and propose nonpreferential priors as offering a reasonable standard for objective multimodel inference.
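
    For reference, the quantities the abstract appeals to can be written compactly: the Bayes factor compares marginal likelihoods, and model-averaged inference weights each model's posterior by its posterior model probability (standard definitions, not notation taken from the chapter):

```latex
\[
B_{12} = \frac{p(y \mid M_1)}{p(y \mid M_2)}, \qquad
p(M_i \mid y) = \frac{p(M_i)\,p(y \mid M_i)}{\sum_j p(M_j)\,p(y \mid M_j)}, \qquad
p(\theta \mid y) = \sum_i p(\theta \mid y, M_i)\,p(M_i \mid y)
\]
```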

  20. A novel integrated modelling framework to assess the impacts of climate and socio-economic drivers on land use and water quality.

    PubMed

    Zessner, Matthias; Schönhart, Martin; Parajka, Juraj; Trautvetter, Helene; Mitter, Hermine; Kirchner, Mathias; Hepp, Gerold; Blaschke, Alfred Paul; Strenn, Birgit; Schmid, Erwin

    2017-02-01

    Changes in climatic conditions will directly affect the quality and quantity of water resources. They will also affect them indirectly through adaptation in land use, which ultimately influences diffuse nutrient emissions to rivers and therefore potentially compliance with good ecological status according to the EU Water Framework Directive (WFD). We present an integrated impact modelling framework (IIMF) to track and quantify direct and indirect pollution impacts along policy-economy-climate-agriculture-water interfaces. The IIMF is applied to assess impacts of climatic and socio-economic drivers on agricultural land use (crop choices, farming practices and fertilization levels), river flows and the risk of exceedance of the environmental quality standards used to determine ecological water quality status in Austria. This article also presents model interfaces as well as validation procedures and results of single models and the IIMF with respect to observed state variables such as land use, river flow and nutrient river loads. The performance of the IIMF for calculations of river nutrient loads (120 monitoring stations) shows a Nash-Sutcliffe Efficiency of 0.73 for nitrogen and 0.51 for phosphorus. Most problematic is the modelling of phosphorus loads in alpine catchments dominated by forests and mountainous landscape: about 63% of these catchments show a deviation between modelled and observed loads of 30% or more. In catchments dominated by agricultural production, the performance of the IIMF is much better, as only 30% of cropland-dominated and 23% of permanent-grassland-dominated areas have a deviation of >30% between modelled and observed loads. As the risk of exceedance of environmental quality standards is mainly recognized in catchments dominated by cropland, the IIMF is well suited for assessing the nutrient component of the WFD ecological status. Copyright © 2016 British Geological Survey, NERC. Published by Elsevier B.V. All rights reserved.
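
    The Nash-Sutcliffe Efficiency quoted for the load calculations is a standard goodness-of-fit score; a minimal sketch of its computation follows (the numbers are placeholders, not loads from the study):

```python
import numpy as np

def nash_sutcliffe(obs, sim) -> float:
    """NSE = 1 - SSE / variance of observations: 1.0 is a perfect fit,
    0.0 means the model predicts no better than the observed mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Placeholder monthly nutrient loads (t/month), not data from the paper.
obs = np.array([12.0, 15.5, 9.8, 20.1, 14.3])
sim = np.array([11.2, 16.0, 10.5, 18.7, 15.0])
print(round(nash_sutcliffe(obs, sim), 2))
```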

  1. Framework for Designing The Assessment Models of Readiness SMEs to Adopt Indonesian National Standard (SNI), Case Study: SMEs Batik in Surakarta

    NASA Astrophysics Data System (ADS)

    Fahma, Fakhrina; Zakaria, Roni; Fajar Gumilang, Royan

    2018-03-01

    Since the launch of the ASEAN Economic Community (AEC), opportunities to expand market share have opened up considerably, but the level of competition has also become very high. Standardization is believed to be an important factor in seizing opportunities in the AEC era and under other free trade agreements in the future. Standardization activities in industry can be evidenced by obtaining SNI (Indonesian National Standard) certification. This is a challenge for SMEs, considering that currently only 20% of SMEs hold SNI certification for their products or processes. This research designs a readiness assessment model for SMEs seeking SNI certification. The stages of model development follow the innovation-diffusion approach of Rogers (2003). Variables that affect the readiness of SMEs were obtained from the product certification requirements established by BSN (National Standardization Agency) and LSPro (certification bodies). The model is used for mapping the readiness of SMEs' products for SNI certification, with the level of readiness of an SME determined by its percentage of compliance with those requirements. Based on the results of this study, five variables were determined to be the main aspects for assessing SME readiness. For model validation, trials were conducted on batik SMEs in Laweyan, Surakarta.

  2. Experimental Study and Numerical Modeling of Fracture Propagation in Shale Rocks During Brazilian Disk Test

    NASA Astrophysics Data System (ADS)

    Mousavi Nezhad, Mohaddeseh; Fisher, Quentin J.; Gironacci, Elia; Rezania, Mohammad

    2018-06-01

    Reliable prediction of the fracture process in shale gas rocks remains one of the most significant challenges for establishing sustained, economic oil and gas production. This paper presents a modeling framework for the simulation of crack propagation in heterogeneous shale rocks. The framework is based on a variational approach consistent with Griffith's theory. The modeling framework is used to reproduce the fracture propagation process in shale rock samples under standard Brazilian disk test conditions. Data collected from the experiments are employed to determine the testing specimens' tensile strength and fracture toughness. To incorporate the effects of shale formation heterogeneity in the simulation of crack paths, the fracture properties of the specimens are defined as spatially random fields. A computational strategy based on stochastic finite element theory is developed that allows the effects of the heterogeneity of shale rocks on fracture evolution to be incorporated. A parametric study has been carried out to better understand how anisotropy and heterogeneity of the mechanical properties affect both the direction of cracks and the rock strength.
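
    In a Brazilian disk test, the tensile strength mentioned above is conventionally back-calculated from the failure load by the standard indirect-tension formula, where P is the load at failure and D and t are the disk diameter and thickness:

```latex
\[
\sigma_t = \frac{2P}{\pi D t}
\]
```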

  3. Probabilistic framework for product design optimization and risk management

    NASA Astrophysics Data System (ADS)

    Keski-Rahkonen, J. K.

    2018-05-01

    Probabilistic methods have gradually gained ground within engineering practice, but it is currently still the industry standard to use deterministic safety-margin approaches for dimensioning components and qualitative methods for managing product risks. These methods are suitable for baseline design work, but quantitative risk management and product reliability optimization require more advanced predictive approaches. Ample research has been published on how to predict failure probabilities for mechanical components and, furthermore, on how to optimize reliability through life cycle cost analysis. This paper reviews the literature for existing methods and tries to harness their best features and simplify the process to be applicable in practical engineering work. The recommended process applies the Monte Carlo method on top of load-resistance models to estimate failure probabilities. Furthermore, it adds to the existing literature by introducing a practical framework for using probabilistic models in quantitative risk management and product life cycle cost optimization. The main focus is on mechanical failure modes, due to the well-developed methods used to predict these types of failures. However, the same framework can be applied to any type of failure mode as long as predictive models can be developed.
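
    The core of the recommended process is easy to sketch: draw load and resistance samples and count the fraction of draws where load exceeds resistance. The distribution families and parameters below are illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000
# Resistance: lognormal around a 500 MPa median; load: normal, 350 +/- 60 MPa.
R = rng.lognormal(mean=np.log(500.0), sigma=0.10, size=n)
L = rng.normal(loc=350.0, scale=60.0, size=n)

p_fail = np.mean(L > R)                       # Monte Carlo P(load > resistance)
se = np.sqrt(p_fail * (1.0 - p_fail) / n)     # binomial standard error
print(f"P(failure) ~ {p_fail:.2e} +/- {se:.1e}")
```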

  4. Composite accidental axions

    NASA Astrophysics Data System (ADS)

    Redi, Michele; Sato, Ryosuke

    2016-05-01

    We present several models where the QCD axion arises accidentally. Confining gauge theories can generate axion candidates whose properties are uniquely determined by the quantum numbers of the new fermions under the Standard Model. The Peccei-Quinn symmetry can emerge accidentally if the gauge theory is chiral. We generalise previous constructions in a unified framework. In some cases these models can be understood as the deconstruction of 5-dimensional gauge theories where the Peccei-Quinn symmetry is protected by locality but more general constructions are possible.

  5. Developing and Testing a Robust, Multi-Scale Framework for the Recovery of Longleaf Pine Understory Communities

    DTIC Science & Technology

    2015-05-01

    [Report table captions; no abstract is recoverable from the fragment.] Model averaging for species richness on post-agricultural sites (1000 m²) with a landscape radius of 150 m. Model selection for species richness on post-agricultural sites (1000 m²) with a landscape radius of 150 m. Model averaging for proportion of reference species on ... Direct, indirect, and total standardized effects on species richness. Species and number of seeds added to the experimental plots at ...

  6. Astroparticle physics and cosmology.

    PubMed

    Mitton, Simon

    2006-05-20

    Astroparticle physics is an interdisciplinary field that explores the connections between the physics of elementary particles and the large-scale properties of the universe. Particle physicists have developed a standard model to describe the properties of matter in the quantum world. This model explains the bewildering array of particles in terms of constructs made from two or three quarks. Quarks, leptons, and three of the fundamental forces of physics are the main components of this standard model. Cosmologists have also developed a standard model to describe the bulk properties of the universe. In this new framework, ordinary matter, such as stars and galaxies, makes up only around 4% of the material universe. The bulk of the universe is dark matter (roughly 23%) and dark energy (about 73%). This dark energy drives an acceleration that means that the expanding universe will grow ever larger. String theory, in which the universe has several invisible dimensions, might offer an opportunity to unite the quantum description of the particle world with the gravitational properties of the large-scale universe.

  7. The Reliability of Setting Grade Boundaries Using Comparative Judgement

    ERIC Educational Resources Information Center

    Benton, Tom; Elliott, Gill

    2016-01-01

    In recent years the use of expert judgement to set and maintain examination standards has been increasingly criticised in favour of approaches based on statistical modelling. This paper reviews existing research on this controversy and attempts to unify the evidence within a framework where expertise is utilised in the form of comparative…

  8. An Exploration of Teachers' and Administrators' Perspectives: The Collaborative Process Using the Danielson Framework for Teaching Model

    ERIC Educational Resources Information Center

    Landolfi, Adrienne M.

    2016-01-01

    As accountability measures continue to increase within education, public school systems have integrated standards-based evaluation systems to formally assess professional practices among educators. The purpose of this study was to explore the extent in which the communication process between evaluators and teachers impacts teacher performance…

  9. Distributional Effects of Educational Improvements: Are We Using the Wrong Model?

    ERIC Educational Resources Information Center

    Bourguignon, Francois; Rogers, F. Halsey

    2007-01-01

    Measuring the incidence of public spending in education requires an intergenerational framework distinguishing between what current and future generations--that is, parents and children--give and receive. In standard distributional incidence analysis, households are assumed to receive a benefit equal to what is spent on their children enrolled in…

  10. Comparing Three Models of Achievement Goals: Goal Orientations, Goal Standards, and Goal Complexes

    ERIC Educational Resources Information Center

    Senko, Corwin; Tropiano, Katie L.

    2016-01-01

    Achievement goal theory (Dweck, 1986) initially characterized mastery goals and performance goals as opposites in a good-bad dualism of student motivation. A later revision (Harackiewicz, Barron, & Elliot, 1998) contended that both goals can provide benefits and be pursued together. Perhaps both frameworks are correct: Their contrasting views…

  11. Applying Modeling Instruction to High School Chemistry to Improve Students' Conceptual Understanding

    ERIC Educational Resources Information Center

    Dukerich, Larry

    2015-01-01

    With the release of the Next Generation Science Standards, high school chemistry teachers are now pondering the implications of their recommendations for their teaching. They may agree that traditional instruction, as the Framework points out, "emphasizes discrete facts with a focus on breadth over depth, and does not provide students with…

  12. Criteria for Accreditation in Vietnam's Higher Education: Focus on Input or Outcome?

    ERIC Educational Resources Information Center

    Nguyen, Kim D.; Oliver, Diane E.; Priddy, Lynn E.

    2009-01-01

    The purpose of this article is to analyse the development of accreditation standards and processes in Vietnam and to offer recommendations for the further progress of Vietnam's accreditation model. The authors first provide contextual details of the higher education system and then present the conceptual framework of quality assurance in relation…

  13. Framework for improved confidence in modeled nitrous oxide estimates for biofuel regulatory standards

    USDA-ARS?s Scientific Manuscript database

    There is a large variation in local commodity prices across the United States, with farmers receiving less for their soybeans produced on their farm, and paying more for meal to feed their livestock, thereby reducing their profitability. This price differential is due to a number of factors includin...

  14. Quality Assurance in University Guidance Services

    ERIC Educational Resources Information Center

    Simon, Alexandra

    2014-01-01

    In Europe there is no common quality assurance framework for the delivery of guidance in higher education. Using a case study approach in four university career guidance services in England, France and Spain, this article aims to study how quality is implemented in university career guidance services in terms of strategy, standards and models,…

  15. Judicious Discipline: Citizenship Values as a Framework for Moral Education.

    ERIC Educational Resources Information Center

    McEwan, Barbara

    When teaching moral education, the ethical dilemma often faced by educators revolves around the question of whose morals should be taught. Judicious Discipline, a constitutional model for classroom management, proposes to answer this question by offering educators the opportunity to teach the moral standards of the U.S. democratic system of…

  16. Adopting SCORM 1.2 Standards in a Courseware Production Environment

    ERIC Educational Resources Information Center

    Barker, Bradley

    2004-01-01

    The Sharable Content Object Reference Model (SCORM) is a technology framework for Web-based learning technology. Originated by the Department of Defense and accelerated by the Advanced Distributed Learning initiative SCORM was released in January of 2000 (ADL, 2003). The goals of SCORM are to decrease the cost of training, while increasing the…

  17. Conceptual framework for describing selected urban and community impacts of federal energy policies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morris, F.A.; Marcus, A.A.; Keller, D.

    1980-06-01

    A conceptual framework is presented for describing selected urban and community impacts of Federal energy policies. The framework depends on a simple causal model. The outputs of the model are impacts: changes in the state of the world of particular interest to policymakers. At any given time, a set of determinants accounts for the state of the world with respect to an impact category. Application of the model to a particular impact category requires establishing a definition and measure for the impact category and identifying the determinants of those impacts. Analysis of the impact of a particular policy requires the following: identifying the policy and its effects (as estimated by others), isolating any effects that themselves constitute an urban and community impact, identifying any effects that change the value of determinants, and describing the impact with reference to the new values of the determinants. This report provides a framework for these steps. The three impacts addressed are neighborhood stability, housing availability, and the quality and availability of public services. For each impact, a definition and measure are specified, the principal determinants are identified, and the use of the causal model to estimate impacts is demonstrated by applying it to three illustrative Federal policies (domestic oil price decontrol, building energy performance standards, and increased Federal aid for mass transit). (MCW)

  18. Treatment evolution and new standards of care: implications for cost-effectiveness analysis.

    PubMed

    Shechter, Steven M

    2011-01-01

    Traditional approaches to cost-effectiveness analysis have not considered the downstream possibility of a new standard of care coming out of the research and development pipeline. However, the treatment landscape for patients may change significantly over the course of their lifetimes. This article presents a Markov modeling framework, evaluated by matrix algebra, that incorporates the possibility of treatment evolution into the incremental cost-effectiveness ratio (ICER) comparing treatments available at the present time. The author evaluates the difference between the new and traditional ICER calculations for patients with chronic diseases facing a lifetime of treatment. The bias of the traditional ICER calculation may be substantial, and further testing reveals that it may be either positive or negative depending on the model parameters. The author also performs probabilistic sensitivity analyses with respect to the possible timing of a new treatment discovery and notes the increase in the magnitude of the bias when the new treatment is likely to appear sooner rather than later. The modeling framework is intended as a proof of concept and therefore makes simplifying assumptions, such as time stationarity of model parameters and consideration of a single new drug discovery. For diseases with a more active research and development pipeline, the possibility of a new treatment paradigm may be at least as important to consider in sensitivity analysis as other parameters that are often considered.
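
    A minimal sketch of the kind of Markov cohort calculation the article describes, using matrix algebra to accumulate discounted costs and QALYs and form an ICER. All states, transition probabilities, costs, and utilities are illustrative assumptions, not the article's values:

```python
import numpy as np

def lifetime_totals(P, cost, utility, horizon=40, disc=0.03):
    """Discounted total cost and QALYs for a cohort starting 'well'."""
    state = np.array([1.0, 0.0, 0.0])           # states: well, sick, dead
    total_cost = total_qaly = 0.0
    for t in range(horizon):
        d = 1.0 / (1.0 + disc) ** t
        total_cost += d * (state @ cost)
        total_qaly += d * (state @ utility)
        state = state @ P                        # advance cohort one cycle
    return total_cost, total_qaly

# Row-stochastic transition matrices for standard vs. new treatment.
P_std = np.array([[0.85, 0.10, 0.05], [0.00, 0.80, 0.20], [0.0, 0.0, 1.0]])
P_new = np.array([[0.90, 0.07, 0.03], [0.00, 0.85, 0.15], [0.0, 0.0, 1.0]])
cost = np.array([1_000.0, 8_000.0, 0.0])         # per-cycle cost by state
util = np.array([0.95, 0.60, 0.0])               # per-cycle QALYs by state

c0, e0 = lifetime_totals(P_std, cost, util)
c1, e1 = lifetime_totals(P_new, cost + np.array([4_000.0, 4_000.0, 0.0]), util)
print("ICER per QALY:", round((c1 - c0) / (e1 - e0)))
```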

  19. A Risk Assessment Model for Reduced Aircraft Separation: A Quantitative Method to Evaluate the Safety of Free Flight

    NASA Technical Reports Server (NTRS)

    Cassell, Rick; Smith, Alex; Connors, Mary; Wojciech, Jack; Rosekind, Mark R. (Technical Monitor)

    1996-01-01

    As new technologies and procedures are introduced into the National Airspace System, whether they are intended to improve efficiency, capacity, or safety level, the quantification of potential changes in safety levels is of vital concern. Applications of technology can improve safety levels and allow the reduction of separation standards. An excellent example is the Precision Runway Monitor (PRM). By taking advantage of the surveillance and display advances of PRM, airports can run instrument parallel approaches to runways separated by 3400 feet with the same level of safety as parallel approaches to runways separated by 4300 feet using the standard technology. Despite a wealth of information from flight operations and testing programs, there is no readily quantifiable relationship between numerical safety levels and the separation standards that apply to aircraft on final approach. This paper presents a modeling approach to quantify the risk associated with reducing separation on final approach. Reducing aircraft separation, both laterally and longitudinally, has been the goal of several aviation R&D programs over the past several years. Many of these programs have focused on technological solutions to improve navigation accuracy, surveillance accuracy, aircraft situational awareness, controller situational awareness, and other technical and operational factors that are vital to maintaining flight safety. The risk assessment model relates different types of potential aircraft accidents and incidents and their contribution to overall accident risk. The framework links accident risks to a hierarchy of failsafe mechanisms characterized by procedures and interventions. The model will be used to assess the overall level of safety associated with reducing separation standards and the introduction of new technology and procedures, as envisaged under the Free Flight concept. The model framework can be applied to various aircraft scenarios, including parallel and in-trail approaches. This research was performed under contract to NASA and in cooperation with the FAA's Safety Division (ASY).

  20. The Adoption and Diffusion of an NHRD Standard: A Conceptual Framework

    ERIC Educational Resources Information Center

    Murphy, Aileen; Garavan, Thomas N.

    2009-01-01

    This article proposes a conceptual framework to explain the adoption and diffusion of a national human resource development (NHRD) standard. NHRD standards are used by governments to promote training and development in organizations and increase the professionalization of practices used by organizations. Institutional theory suggests that adoption…

  1. "Circumstance and Proper Timing": Context and the Construction of a Standards Framework for School Principals' Performance.

    ERIC Educational Resources Information Center

    Louden, William; Wildy, Helen

    1999-01-01

    Professional standards for school principals typically describe an ideal performance in a generalized context. This article describes an alternative method of developing a standards framework, combining qualitative vignettes with probabilistic measurement techniques to provide essential or ideal performance qualities with contextually rich…

  2. Development and assessment of an integrated ecological modelling framework to assess the effect of investments in wastewater treatment on water quality.

    PubMed

    Holguin-Gonzalez, Javier E; Boets, Pieter; Everaert, Gert; Pauwels, Ine S; Lock, Koen; Gobeyn, Sacha; Benedetti, Lorenzo; Amerlinck, Youri; Nopens, Ingmar; Goethals, Peter L M

    2014-01-01

    Worldwide, large investments in wastewater treatment are made to improve water quality. However, the impacts of these investments on river water quality are often not quantified. To assess water quality, the European Water Framework Directive (WFD) requires an integrated approach. The aim of this study was to develop an integrated ecological modelling framework for the River Drava (Croatia) that includes physical-chemical and hydromorphological characteristics as well as the ecological river water quality status. The developed submodels and the integrated model showed accurate predictions when comparing the modelled results to the observations. Dissolved oxygen and nitrogen concentrations (ammonium and organic nitrogen) were the most important variables in determining the ecological water quality (EWQ). The effect of three potential investment scenarios for the wastewater treatment infrastructure in the city of Varaždin on the EWQ of the River Drava was assessed. From this scenario-based analysis, it was concluded that upgrading the existing wastewater treatment plant with nitrogen and phosphorus removal will be insufficient to reach a good EWQ. Therefore, other point and diffuse pollution sources in the area should also be monitored and remediated to meet the European WFD standards.

  3. [Computer aided design for fixed partial denture framework based on reverse engineering technology].

    PubMed

    Sun, Yu-chun; Lü, Pei-jun; Wang, Yong

    2006-03-01

    To explore a computer aided design (CAD) route for the framework of a domestic fixed partial denture (FPD) and confirm a suitable method for 3-D CAD. The working area of a dentition model was scanned with a 3-D mechanical scanner. Using reverse engineering (RE) software, margin and border curves were extracted and several reference curves were created to establish the dimension and location of the pontic framework, which was taken from the standard database. The shoulder parts of the retainers were created after the axial surfaces were constructed. The connecting areas, axial line, and curving surface of the framework connector were created last. The framework of a three-unit FPD was designed with RE technology and showed smooth surfaces and continuous contours. The design route is practical. The results of this study are significant in theory and practice and provide a reference for establishing a computer aided design/computer aided manufacture (CAD/CAM) system for domestic FPDs.

  4. A VGI data integration framework based on linked data model

    NASA Astrophysics Data System (ADS)

    Wan, Lin; Ren, Rongrong

    2015-12-01

    This paper addresses geographic data integration and sharing for multiple online VGI data sets. We propose a semantic-enabled framework for a cooperative application environment spanning online VGI sources, designed to solve a target class of geospatial problems. Based on linked data technologies, a core component of the semantic web, we construct relationship links among geographic features distributed across diverse VGI platforms using linked data modeling methods, deploy these semantic-enabled entities on the web, and eventually form an interconnected geographic data network that supports cooperative geospatial applications across multiple VGI data sources. The mapping and transformation from VGI sources to the RDF linked data model is presented to guarantee a single, uniform data representation model across different online social geographic data sources. We propose a mixed strategy that combines spatial distance similarity and feature name attribute similarity as the measure for comparing and matching geographic features across VGI data sets. Our work also applies Markov logic networks to interlink records describing the same entity in different VGI-based linked data sets; the automatic generation of a co-reference object identification model from geographic linked data is discussed in detail. The result is a large geographic linked data network spanning loosely coupled VGI web sites. Experiments built on the framework and an evaluation of the method show that the framework is reasonable and practicable.
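
    The mixed matching strategy described above can be sketched with a toy scoring function. Assuming (hypothetically) that each feature carries a name and WGS84 coordinates, the sketch combines great-circle distance with a generic string-similarity ratio using invented weights and a threshold; the paper's actual measure and the Markov-logic-network interlinking step are not reproduced.

    ```python
    import math
    from difflib import SequenceMatcher

    def haversine_m(lat1, lon1, lat2, lon2):
        """Great-circle distance between two points, in meters."""
        r = 6371000.0
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))

    def match_score(feat_a, feat_b, d_max=200.0, w_space=0.6, w_name=0.4):
        """Combine spatial proximity and name similarity into one score."""
        d = haversine_m(feat_a["lat"], feat_a["lon"], feat_b["lat"], feat_b["lon"])
        spatial = max(0.0, 1.0 - d / d_max)      # 1 at same spot, 0 beyond d_max
        name = SequenceMatcher(None, feat_a["name"].lower(),
                               feat_b["name"].lower()).ratio()
        return w_space * spatial + w_name * name

    osm = {"name": "Central Park", "lat": 40.7829, "lon": -73.9654}
    wiki = {"name": "central park nyc", "lat": 40.7825, "lon": -73.9660}
    if match_score(osm, wiki) > 0.7:             # hypothetical threshold
        print("link as owl:sameAs candidates")
    ```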

  5. Next Generation Science Partnerships

    NASA Astrophysics Data System (ADS)

    Magnusson, J.

    2016-02-01

    I will provide an overview of the Next Generation Science Standards (NGSS) and demonstrate how scientists and educators can use these standards to strengthen and enhance their collaborations. The NGSS are rich in content and practice and provide all students with an internationally-benchmarked science education. Using these state-led standards to guide outreach efforts can help develop and sustain effective and mutually beneficial teacher-researcher partnerships. Aligning outreach with the three dimensions of the standards can help make research relevant for target audiences by intentionally addressing the science practices, cross-cutting concepts, and disciplinary core ideas of the K-12 science curriculum that drives instruction and assessment. Collaborations between researchers and educators that are based on this science framework are more sustainable because they address the needs of both scientists and educators. Educators are better able to utilize science content that aligns with their curriculum. Scientists who learn about the NGSS can better understand the frameworks under which educators work, which can lead to more extensive and focused outreach with teachers as partners. Based on this model, the International Ocean Discovery Program (IODP) develops its education materials in conjunction with scientists and educators to produce accurate, standards-aligned activities and curriculum-based interactions with researchers. I will highlight examples of IODP's current, successful teacher-researcher collaborations that are intentionally aligned with the NGSS.

  6. Benchmarking hydrological model predictive capability for UK River flows and flood peaks.

    NASA Astrophysics Data System (ADS)

    Lane, Rosanna; Coxon, Gemma; Freer, Jim; Wagener, Thorsten

    2017-04-01

    Data and hydrological models are now available for national hydrological analyses. However, hydrological model performance varies between catchments, and lumped, conceptual models are not able to produce adequate simulations everywhere. This study aims to benchmark hydrological model performance for catchments across the United Kingdom within an uncertainty analysis framework. We have applied four hydrological models from the FUSE framework to 1128 catchments across the UK. These are all lumped models run at a daily timestep, but they differ in structural architecture and process parameterisations, therefore producing different but equally plausible simulations. We apply FUSE over the 20-year period 1988-2008, within a GLUE Monte Carlo uncertainty analysis framework. Model performance was evaluated for each catchment, model structure, and parameter set using standard performance metrics, calculated both for the whole time series and seasonally to assess differences in model performance. The GLUE uncertainty analysis framework was then applied to produce simulated 5th and 95th percentile uncertainty bounds for the daily flow time series, and additionally annual maximum prediction bounds for each catchment. The results show that model performance varies significantly in space and time depending on catchment characteristics including climate, geology and human impact. We identify regions where models systematically fail to produce good results, and present reasons why this could be the case. We also identify regions or catchment characteristics where one model performs better than others, and explore which structural component or parameterisation enables certain models to produce better simulations in these catchments. Model predictive capability was assessed for each catchment by examining the ability of the models to produce discharge prediction bounds that successfully bound the observed discharge. These results improve our understanding of the predictive capability of simple conceptual hydrological models across the UK and help us to identify where further effort is needed to develop modelling approaches that better represent different catchment and climate typologies.
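
    A minimal sketch of the GLUE procedure described above, with a one-parameter-store toy model standing in for a FUSE structure: sample parameters by Monte Carlo, keep "behavioural" runs above a Nash-Sutcliffe threshold, and form 5th/95th percentile prediction bounds. All numbers are invented, and plain percentiles replace GLUE's likelihood weighting for brevity.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def nse(sim, obs):
        """Nash-Sutcliffe efficiency, a standard hydrological skill score."""
        return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

    def toy_model(rain, k, s_max):
        """Stand-in for a lumped model structure: one linear store."""
        store, flow = 0.0, []
        for r in rain:
            store = min(store + r, s_max)
            q = k * store
            store -= q
            flow.append(q)
        return np.array(flow)

    rain = rng.gamma(2.0, 2.0, size=365)
    obs = toy_model(rain, 0.3, 50.0) + rng.normal(0.0, 0.2, 365)  # synthetic obs

    behavioural = []
    for _ in range(5000):                       # Monte Carlo parameter sampling
        k, s_max = rng.uniform(0.05, 0.9), rng.uniform(10.0, 100.0)
        sim = toy_model(rain, k, s_max)
        if nse(sim, obs) > 0.5:                 # behavioural threshold (a GLUE choice)
            behavioural.append(sim)

    sims = np.stack(behavioural)
    lower = np.percentile(sims, 5, axis=0)      # GLUE proper weights by likelihood;
    upper = np.percentile(sims, 95, axis=0)     # plain percentiles keep this short
    coverage = np.mean((obs >= lower) & (obs <= upper))
    print(f"observations inside the 5-95% bounds: {coverage:.0%}")
    ```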

  7. A Framework for Comprehensive Health Terminology Systems in the United States

    PubMed Central

    Chute, Christopher G.; Cohn, Simon P.; Campbell, James R.

    1998-01-01

    Health care in the United States has become an information-intensive industry, yet electronic health records represent patient data inconsistently for lack of clinical data standards. Classifications that have achieved common acceptance, such as the ICD-9-CM or ICD, aggregate heterogeneous patients into broad categories, which preclude their practical use in decision support, development of refined guidelines, or detailed comparison of patient outcomes or benchmarks. This document proposes a framework for the integration and maturation of clinical terminologies that would have practical applications in patient care, process management, outcome analysis, and decision support. Arising from the two working groups within the standards community—the ANSI (American National Standards Institute) Healthcare Informatics Standards Board Working Group and the Computer-based Patient Records Institute Working Group on Codes and Structures—it outlines policies regarding 1) functional characteristics of practical terminologies, 2) terminology models that can broaden their applications and contribute to their sustainability, 3) maintenance attributes that will enable terminologies to keep pace with rapidly changing health care knowledge and process, and 4) administrative issues that would facilitate their accessibility, adoption, and application to improve the quality and efficiency of American health care. PMID:9824798

  8. Grounding theories of W(e)Learn: a framework for online interprofessional education.

    PubMed

    Casimiro, Lynn; MacDonald, Colla J; Thompson, Terrie Lynn; Stodel, Emma J

    2009-07-01

    Interprofessional care (IPC) is a prerequisite for enhanced communication between healthcare team members, improved quality of care, and better outcomes for patients. A move to an IPC model requires changing the learning experiences of healthcare providers during and after their qualification program. With the rapid growth of online and blended approaches to learning, an educational framework that explains how to construct quality learning events in support of IPC is urgently needed. Such a framework would offer a quality standard to help educators design, develop, deliver, and evaluate online interprofessional education (IPE) programs. IPE is an extremely delicate process due to issues related to knowledge, status, power, accountability, personality traits, and culture that surround IPC. In this paper, a review of the pertinent literature that would inform the development of such a framework is presented. The review covers IPC, IPE, learning theories, and eLearning in healthcare.

  9. A Framework for Building and Reasoning with Adaptive and Interoperable PMESII Models

    DTIC Science & Technology

    2007-11-01

    Fragments from the report: acronym list - DL, Description Logic; SOA, Service Oriented Architecture; SPARQL, Simple Protocol And RDF Query Language; SQL, Standard Query Language; SROM, Stability and...another by providing a more expressive ontological structure for one of the models, e.g., semantic networks can be mapped to first-order logical...Pellet is an open-source reasoner that works with OWL-DL. It accepts the SPARQL protocol and RDF query language (SPARQL) and provides a Java API to

  10. An automated and integrated framework for dust storm detection based on ogc web processing services

    NASA Astrophysics Data System (ADS)

    Xiao, F.; Shea, G. Y. K.; Wong, M. S.; Campbell, J.

    2014-11-01

    Dust storms are known to have adverse effects on public health. Atmospheric dust loading is also one of the major uncertainties in global climatic modelling, as it is known to have a significant impact on the radiation budget and atmospheric stability. The complexity of building scientific dust storm models is coupled with advances in scientific computation, ongoing computing platform development, and the development of heterogeneous Earth Observation (EO) networks. It is a challenging task to develop an integrated and automated scheme for dust storm detection that combines geo-processing frameworks, scientific models and EO data to enable dust storm detection and tracking in a dynamic and timely manner. This study develops an automated and integrated framework for dust storm detection and tracking based on the Web Processing Service (WPS) standard initiated by the Open Geospatial Consortium (OGC). The presented WPS framework consists of EO data retrieval components, a dust storm detecting and tracking component, and a service chain orchestration engine. The EO data processing component is implemented based on the OPeNDAP standard. The dust storm detecting and tracking component combines three earth science models: the SBDART model (for computing the aerosol optical depth (AOT) of dust particles), the WRF model (for simulating meteorological parameters) and the HYSPLIT model (for simulating dust storm transport processes). The service chain orchestration engine is implemented based on the Business Process Execution Language for Web Services (BPEL4WS) using open-source software. The output results, including the horizontal and vertical AOT distribution of dust particles as well as their transport paths, were represented using KML/XML and displayed in Google Earth. A severe dust storm, which occurred over East Asia from 26 to 28 April 2012, is used to test the applicability of the proposed WPS framework. Our aim here is to solve a specific instance of a complex EO data and scientific model integration problem by using a framework and scientific workflow approach together. The experimental results show that this newly automated and integrated framework can give advance near real-time warning of dust storms to both environmental authorities and the public. The methods presented in this paper might also be generalized to other types of Earth system models, leading to improved ease of use and flexibility.
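
    For readers unfamiliar with OGC WPS, a request to such a service can be sketched using the WPS 1.0.0 key-value-pair encoding, as below. The endpoint URL, process identifier, and input names are hypothetical stand-ins; the paper's actual services and their BPEL4WS orchestration are not reproduced here.

    ```python
    from urllib.parse import urlencode

    # Hypothetical endpoint and process identifier: the services in the paper
    # are not publicly documented, so these names are illustrative only.
    WPS_URL = "http://example.org/wps"

    def wps_execute_url(identifier, data_inputs):
        """Build an OGC WPS 1.0.0 key-value-pair Execute request URL."""
        params = {
            "service": "WPS",
            "version": "1.0.0",
            "request": "Execute",
            "identifier": identifier,
            "DataInputs": ";".join(f"{k}={v}" for k, v in data_inputs.items()),
        }
        return WPS_URL + "?" + urlencode(params)

    url = wps_execute_url("DustStormDetection",
                          {"date": "2012-04-26", "region": "EastAsia"})
    print(url)  # fetching this URL would return an XML ExecuteResponse document
    ```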

  11. Accurate Modeling of Galaxy Clustering on Small Scales: Testing the Standard ΛCDM + Halo Model

    NASA Astrophysics Data System (ADS)

    Sinha, Manodeep; Berlind, Andreas A.; McBride, Cameron; Scoccimarro, Roman

    2015-01-01

    The large-scale distribution of galaxies can be explained fairly simply by assuming (i) a cosmological model, which determines the dark matter halo distribution, and (ii) a simple connection between galaxies and the halos they inhabit. This conceptually simple framework, called the halo model, has been remarkably successful at reproducing the clustering of galaxies on all scales, as observed in various galaxy redshift surveys. However, none of these previous studies have carefully modeled the systematics and thus truly tested the halo model in a statistically rigorous sense. We present a new accurate and fully numerical halo model framework and test it against clustering measurements from two luminosity samples of galaxies drawn from the SDSS DR7. We show that the simple ΛCDM cosmology + halo model is not able to simultaneously reproduce the galaxy projected correlation function and the group multiplicity function. In particular, the more luminous sample shows significant tension with theory. We discuss the implications of our findings and how this work paves the way for constraining galaxy formation by accurate simultaneous modeling of multiple galaxy clustering statistics.

  12. Towards a Framework for Developing Semantic Relatedness Reference Standards

    PubMed Central

    Pakhomov, Serguei V.S.; Pedersen, Ted; McInnes, Bridget; Melton, Genevieve B.; Ruggieri, Alexander; Chute, Christopher G.

    2010-01-01

    Our objective is to develop a framework for creating reference standards for functional testing of computerized measures of semantic relatedness. Currently, research on computerized approaches to semantic relatedness between biomedical concepts relies on reference standards created for specific purposes using a variety of methods for their analysis. In most cases, these reference standards are not publicly available and the published information provided in manuscripts that evaluate computerized semantic relatedness measurement approaches is not sufficient to reproduce the results. Our proposed framework is based on the experiences of the medical informatics and computational linguistics communities and addresses practical and theoretical issues with creating reference standards for semantic relatedness. We demonstrate the use of the framework on a pilot set of 101 medical term pairs rated for semantic relatedness by 13 medical coding experts. While the reliability of this particular reference standard is in the “moderate” range, we show that using clustering and factor analyses offers a data-driven approach to finding systematic differences among raters and identifying groups of potential outliers. We test two ontology-based measures of relatedness and provide both the reference standard containing individual ratings and the R program used to analyze the ratings as open-source. Currently, these resources are intended to be used to reproduce and compare results of studies involving computerized measures of semantic relatedness. Our framework may be extended to the development of reference standards in other research areas in medical informatics including automatic classification, information retrieval from medical records and vocabulary/ontology development. PMID:21044697

  13. Lorentz Symmetry Violations from Matter-Gravity Couplings with Lunar Laser Ranging

    NASA Astrophysics Data System (ADS)

    Bourgoin, A.; Le Poncin-Lafitte, C.; Hees, A.; Bouquillon, S.; Francou, G.; Angonin, M.-C.

    2017-11-01

    The standard-model extension (SME) is an effective field theory framework aiming at parametrizing any violation of Lorentz symmetry (LS) in all sectors of physics. In this Letter, we report the first direct experimental measurement of SME coefficients performed simultaneously within two sectors of the SME framework using lunar laser ranging observations. We consider the pure gravitational sector and the classical point-mass limit in the matter sector of the minimal SME. We report no deviation from general relativity and put new realistic stringent constraints on LS violations, improving on previous estimates by up to 3 orders of magnitude.

  14. A computational framework for converting textual clinical diagnostic criteria into the quality data model.

    PubMed

    Hong, Na; Li, Dingcheng; Yu, Yue; Xiu, Qiongying; Liu, Hongfang; Jiang, Guoqian

    2016-10-01

    Constructing standard and computable clinical diagnostic criteria is an important but challenging research field in the clinical informatics community. The Quality Data Model (QDM) is emerging as a promising information model for standardizing clinical diagnostic criteria. The objective was to develop and evaluate automated methods for converting textual clinical diagnostic criteria into a structured format using QDM. We used a clinical Natural Language Processing (NLP) tool known as cTAKES to detect sentences and annotate events in diagnostic criteria. We developed a rule-based approach for assigning the QDM datatype(s) to an individual criterion, and invoked a machine learning algorithm based on Conditional Random Fields (CRFs) for annotating attributes belonging to each particular QDM datatype. We manually developed an annotated corpus as the gold standard and used standard measures (precision, recall and f-measure) for the performance evaluation. We harvested 267 individual criteria with the datatypes of Symptom and Laboratory Test from 63 textual diagnostic criteria. We manually annotated attributes and values in 142 individual Laboratory Test criteria. The average performance of our rule-based approach was a precision of 0.84, recall of 0.86, and f-measure of 0.85; the CRFs-based classification achieved a precision of 0.95, recall of 0.88 and f-measure of 0.91. We also implemented a web-based tool that automatically translates textual Laboratory Test criteria into the QDM XML template format. The results indicate that our approaches leveraging cTAKES and CRFs are effective in facilitating diagnostic criteria annotation and classification. Our NLP-based computational framework is a feasible and useful solution for developing diagnostic criteria representation and computerization. Copyright © 2016 Elsevier Inc. All rights reserved.
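
    The rule-based datatype assignment step can be sketched as keyword rules mapping a textual criterion to QDM datatypes. The patterns below are invented stand-ins for the paper's actual rule set, and the CRF-based attribute annotation is not shown.

    ```python
    import re

    # Illustrative rules only; the paper's real rule set is richer and is
    # paired with a CRF model (not shown) for attribute annotation.
    RULES = [
        (re.compile(r"\b(serum|plasma|mg/dL|mmol/L|count|level)\b", re.I),
         "Laboratory Test, Performed"),
        (re.compile(r"\b(pain|fever|nausea|fatigue|headache)\b", re.I),
         "Symptom"),
    ]

    def assign_qdm_datatypes(criterion):
        """Assign QDM datatype(s) to one textual criterion via keyword rules."""
        hits = [dt for pattern, dt in RULES if pattern.search(criterion)]
        return hits or ["Unclassified"]

    print(assign_qdm_datatypes("Fasting plasma glucose >= 126 mg/dL"))
    print(assign_qdm_datatypes("Persistent fever for more than 3 days"))
    ```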

  15. Numerical Coupling and Simulation of Point-Mass System with the Turbulent Fluid Flow

    NASA Astrophysics Data System (ADS)

    Gao, Zheng

    A computational framework that combines an Eulerian description of the turbulence field with a Lagrangian point-mass ensemble is proposed in this dissertation. Depending on the Reynolds number, the turbulence field is simulated using Direct Numerical Simulation (DNS) or an eddy viscosity model. Meanwhile, particle systems, such as spring-mass systems and cloud droplets, are modeled using ordinary differential equations, which are stiff and hence pose a challenge to the stability of the entire system. This computational framework is applied to the numerical study of parachute deceleration and cloud microphysics. These two distinct problems can be uniformly modeled with Partial Differential Equations (PDEs) and Ordinary Differential Equations (ODEs), and numerically solved in the same framework. For the parachute simulation, a novel porosity model is proposed to simulate the porous effects of the parachute canopy. This model is easy to implement with the projection method and is able to reproduce Darcy's law observed in experiment. Moreover, the impacts of using different versions of the k-epsilon turbulence model in the parachute simulation have been investigated; the study concludes that the standard and Re-Normalisation Group (RNG) models may overestimate the turbulence effects when the Reynolds number is small, while the Realizable model performs consistently at both large and small Reynolds numbers. For the second application, cloud microphysics, the cloud entrainment-mixing problem is studied in the same numerical framework. Three sets of DNS are carried out with both decaying and forced turbulence. The numerical results suggest a new way to parameterize the cloud mixing degree using dynamical measures. The numerical experiments also verify the negative relationship between droplet number concentration and the vorticity field, and imply that gravity has less impact on forced turbulence than on decaying turbulence. In summary, the proposed framework can be used to solve physics problems that involve a turbulence field and a point-mass system, and therefore has broad application.
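
    The stiffness issue mentioned above can be illustrated in one dimension: interpolate an Eulerian velocity field to Lagrangian droplet positions and integrate a Stokes-drag ODE with an implicit (backward Euler) step, which stays stable even when the particle response time is far smaller than the timestep. The field, parameters, and drag law below are illustrative, not the dissertation's DNS setup.

    ```python
    import numpy as np

    # Minimal sketch: 1-D Eulerian velocity field coupled to Lagrangian droplets
    # with Stokes drag. Parameters are invented for illustration.
    L, N = 1.0, 64
    x_grid = np.linspace(0.0, L, N)
    u_grid = 0.1 * np.sin(2 * np.pi * x_grid / L)    # frozen "turbulent" field

    def fluid_velocity(xp):
        """Linear interpolation of the Eulerian field to particle positions."""
        return np.interp(xp % L, x_grid, u_grid)

    def step(xp, vp, tau, dt):
        """Implicit Euler for dv/dt = (u(x) - v)/tau; stable for stiff tau."""
        u = fluid_velocity(xp)
        vp_new = (vp + dt * u / tau) / (1.0 + dt / tau)
        xp_new = xp + dt * vp_new
        return xp_new, vp_new

    xp = np.random.default_rng(1).uniform(0, L, 1000)   # droplet positions
    vp = np.zeros_like(xp)                              # droplet velocities
    tau, dt = 1e-4, 1e-2                                # stiff: tau << dt
    for _ in range(100):
        xp, vp = step(xp, vp, tau, dt)
    print("mean slip velocity:", np.mean(np.abs(fluid_velocity(xp) - vp)))
    ```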

  16. Toward Global Comparability of Sexual Orientation Data in Official Statistics: A Conceptual Framework of Sexual Orientation for Health Data Collection in New Zealand's Official Statistics System

    PubMed Central

    Gray, Alistair; Veale, Jaimie F.; Binson, Diane; Sell, Randell L.

    2013-01-01

    Objective. Effectively addressing health disparities experienced by sexual minority populations requires high-quality official data on sexual orientation. We developed a conceptual framework of sexual orientation to improve the quality of sexual orientation data in New Zealand's Official Statistics System. Methods. We reviewed conceptual and methodological literature, culminating in a draft framework. To improve the framework, we held focus groups and key-informant interviews with sexual minority stakeholders and producers and consumers of official statistics. An advisory board of experts provided additional guidance. Results. The framework proposes working definitions of the sexual orientation topic and measurement concepts, describes dimensions of the measurement concepts, discusses variables framing the measurement concepts, and outlines conceptual grey areas. Conclusion. The framework proposes standard definitions and concepts for the collection of official sexual orientation data in New Zealand. It presents a model for producers of official statistics in other countries, who wish to improve the quality of health data on their citizens. PMID:23840231

  17. Purpose, Processes, Partnerships, and Products: 4Ps to advance Participatory Socio-Environmental Modeling

    NASA Astrophysics Data System (ADS)

    Gray, S. G.; Voinov, A. A.; Jordan, R.; Paolisso, M.

    2016-12-01

    Model-based reasoning is a basic part of human understanding, decision-making, and communication. Including stakeholders in environmental model building and analysis is an increasingly popular approach to understanding environmental change since stakeholders often hold valuable knowledge about socio-environmental dynamics and since collaborative forms of modeling produce important boundary objects used to collectively reason about environmental problems. Although the number of participatory modeling (PM) case studies and the number of researchers adopting these approaches has grown in recent years, the lack of standardized reporting and limited reproducibility have prevented PM's establishment and advancement as a cohesive field of study. We suggest a four dimensional framework that includes reporting on dimensions of: (1) the Purpose for selecting a PM approach (the why); (2) the Process by which the public was involved in model building or evaluation (the how); (3) the Partnerships formed (the who); and (4) the Products that resulted from these efforts (the what). We highlight four case studies that use common PM software-based approaches (fuzzy cognitive mapping, agent-based modeling, system dynamics, and participatory geospatial modeling) to understand human-environment interactions and the consequences of environmental changes, including bushmeat hunting in Tanzania and Cameroon, agricultural production and deforestation in Zambia, and groundwater management in India. We demonstrate how standardizing communication about PM case studies can lead to innovation and new insights about model-based reasoning in support of environmental policy development. We suggest that our 4P framework and reporting approach provides a way for new hypotheses to be identified and tested in the growing field of PM.

  18. Distortion of CAD-CAM-fabricated implant-fixed titanium and zirconia complete dental prosthesis frameworks.

    PubMed

    Al-Meraikhi, Hadi; Yilmaz, Burak; McGlumphy, Edwin; Brantley, William A; Johnston, William M

    2018-01-01

    Computer-aided design and computer-aided manufacturing (CAD-CAM)-fabricated titanium and zirconia implant-supported fixed dental prostheses have become increasingly popular for restoring patients with complete edentulism. However, the distortion level of these frameworks is not well known. The purpose of this in vitro study was to compare the 3-dimensional (3D) distortion of CAD-CAM zirconia and titanium implant-fixed screw-retained complete dental prostheses. A master edentulous model with 4 implants at the positions of the maxillary first molars and canines was used. Multiunit abutments (Nobel Biocare) secured to the model were digitally scanned using scan bodies and a laboratory scanner (S600 ARTI; Zirkonzahn). Titanium (n=5) and zirconia (n=5) frameworks were milled using a CAD-CAM system (Zirkonzahn M1; Zirkonzahn). All frameworks were scanned using an industrial computed tomography (CT) scanner (Nikon/X-Tek XT H 225kV MCT Micro-Focus). The direct CT scans were reconstructed to generate standard tessellation language (STL) files. To calculate the 3D distortion of the frameworks, STL files of the CT scans were aligned to the CAD model using a sum of the least squares best-fit algorithm. Surface comparison points were placed on the CAD model on the midfacial aspect of all teeth. The 3D distortion of each direct scan to the CAD model was calculated. In addition, color maps of the scan-to-CAD comparison were constructed using a ±0.500 mm color scale range. Both materials exhibited distortion; however, no significant difference was found in the amount of distortion from the CAD model between the materials (P=.747). Absolute values of deviations from the CAD model were evident in the x and y plane and less so in the z direction. Zirconia and titanium frameworks showed similar 3D distortion compared with the CAD model for the tested CAD-CAM and implant systems. The distortion was more pronounced in the horizontal and sagittal plane than in the vertical plane. Copyright © 2017 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.
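
    The "sum of the least squares best-fit" alignment step can be sketched with the standard SVD-based (Kabsch) rigid registration, here applied to synthetic point sets standing in for the CT scan and the CAD model; the study's actual workflow operates on dense surface scans inside inspection software.

    ```python
    import numpy as np

    def best_fit_transform(scan, cad):
        """Least-squares rigid alignment (Kabsch): rotate/translate scan onto CAD."""
        mu_s, mu_c = scan.mean(axis=0), cad.mean(axis=0)
        H = (scan - mu_s).T @ (cad - mu_c)
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflection
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        t = mu_c - R @ mu_s
        return R, t

    rng = np.random.default_rng(0)
    cad = rng.normal(size=(200, 3))                  # stand-in CAD surface points
    true_R, _ = np.linalg.qr(rng.normal(size=(3, 3)))
    true_R *= np.sign(np.linalg.det(true_R))         # ensure a proper rotation
    scan = cad @ true_R.T + 0.05 + rng.normal(0, 0.01, (200, 3))  # distorted scan

    R, t = best_fit_transform(scan, cad)
    aligned = scan @ R.T + t
    deviation = np.linalg.norm(aligned - cad, axis=1)
    print(f"mean 3D deviation after best fit: {deviation.mean():.4f}")
    ```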

  19. From service provision to function based performance - perspectives on public health systems from the USA and Israel

    PubMed Central

    2012-01-01

    If public health agencies are to fulfill their overall mission, they need to have defined measurable targets and should structure services to reach these targets, rather than offer a combination of ill-targeted programs. In order to do this, it is essential that there be a clear definition of what public health should do: a definition that does not ebb and flow based upon the prevailing political winds, but rather is based upon professional standards and measurements. The establishment of the Essential Public Health Services framework in the U.S.A. was a major move in that direction, and the model, or revisions of the model, have been adopted beyond the borders of the U.S. This article reviews the U.S. public health system, the needs and processes which brought about the development of the 10 Essential Public Health Services (EPHS), and historical and contemporary applications of the model. It highlights the value of establishing a common delineation of public health activities such as those contained in the EPHS, and explores the validity of using the same process in other countries through a discussion of the development in Israel of a similar model, the 10 Public Health Essential Functions (PHEF), that describes the activities of Israel’s public health system. The use of the same process and framework to develop similar yet distinct frameworks suggests that the process has wide applicability, and may be beneficial to any public health system. Once a model is developed, it can be used to measure public health performance and improve the quality of services delivered through the development of standards and measures based upon the model, which could, ultimately, improve the health of the communities that depend upon public health agencies to protect their well-being. PMID:23181452

  20. Peer Review of Assessment Network: Supporting Comparability of Standards

    ERIC Educational Resources Information Center

    Booth, Sara; Beckett, Jeff; Saunders, Cassandra

    2016-01-01

    Purpose: This paper aims to test the need in the Australian higher education (HE) sector for a national network for the peer review of assessment in response to the proposed HE standards framework and propose a sector-wide framework for calibrating and assuring achievement standards, both within and across disciplines, through the establishment of…

  1. A Community Framework for Integrative, Coupled Modeling of Human-Earth Systems

    NASA Astrophysics Data System (ADS)

    Barton, C. M.; Nelson, G. C.; Tucker, G. E.; Lee, A.; Porter, C.; Ullah, I.; Hutton, E.; Hoogenboom, G.; Rogers, K. G.; Pritchard, C.

    2017-12-01

    We live today in a humanized world, where critical zone dynamics are driven by coupled human and biophysical processes. First generation modeling platforms have been invaluable in providing insight into dynamics of biophysical systems and social systems. But to understand today's humanized planet scientifically and to manage it sustainably, we need integrative modeling of this coupled human-Earth system. To address both scientific and policy questions, we also need modeling that can represent variable combinations of human-Earth system processes at multiple scales. Simply adding more code needed to do this to large, legacy first generation models is impractical, expensive, and will make them even more difficult to evaluate or understand. We need an approach to modeling that mirrors and benefits from the architecture of the complexly coupled systems we hope to model. Building on a series of international workshops over the past two years, we present a community framework to enable and support an ecosystem of diverse models as components that can be interconnected as needed to facilitate understanding of a range of complex human-earth systems interactions. Models are containerized in Docker to make them platform independent. A Basic Modeling Interface and Standard Names ontology (developed by the Community Surface Dynamics Modeling System) is applied to make them interoperable. They are then transformed into RESTful micro-services to allow them to be connected and run in a browser environment. This enables a flexible, multi-scale modeling environment to help address diverse issues with combinations of smaller, focused, component models that are easier to understand and evaluate. We plan to develop, deploy, and maintain this framework for integrated, coupled modeling in an open-source collaborative development environment that can democratize access to advanced technology and benefit from diverse global participation in model development. We also present an initial proof-of-concept of this framework, coupling a widely used agricultural crop model (DSSAT) with a widely used hydrology model (TopoFlow).
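
    A schematic of the coupling pattern described above: two toy models expose BMI-style initialize/update calls plus variable getters and setters, and a framework-level loop passes one model's output into the other each step. This mirrors the spirit of the Basic Model Interface, not its full specification, and the models are invented stand-ins for components such as DSSAT and TopoFlow.

    ```python
    # Schematic sketch of coupling two models through a BMI-style interface.

    class ToyHydrologyModel:
        def initialize(self):
            self.time, self.moisture = 0.0, 0.3

        def get_value(self, name):
            return self.moisture                 # real BMI uses standard names

        def update(self):
            self.moisture = 0.3 + 0.1 * (self.time % 10) / 10.0  # toy forcing
            self.time += 1.0

    class ToyCropModel:
        def initialize(self):
            self.time, self.soil_moisture, self.biomass = 0.0, 0.3, 0.0

        def set_value(self, name, value):
            if name == "soil_moisture":
                self.soil_moisture = value

        def update(self):
            self.biomass += 0.1 * self.soil_moisture   # toy growth law
            self.time += 1.0

    hydro, crop = ToyHydrologyModel(), ToyCropModel()
    hydro.initialize()
    crop.initialize()
    for _ in range(30):                          # framework-driven coupling loop
        hydro.update()
        crop.set_value("soil_moisture", hydro.get_value("soil_moisture"))
        crop.update()
    print("final biomass:", round(crop.biomass, 3))
    ```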

  2. The fusion of large scale classified side-scan sonar image mosaics.

    PubMed

    Reed, Scott; Tena Ruiz, Ioseba; Capus, Chris; Petillot, Yvan

    2006-07-01

    This paper presents a unified framework for the creation of classified maps of the seafloor from sonar imagery. Significant challenges in photometric correction, classification, navigation and registration, and image fusion are addressed. The techniques described are directly applicable to a range of remote sensing problems. Recent advances in side-scan data correction are incorporated to compensate for the sonar beam pattern and motion of the acquisition platform. The corrected images are segmented using pixel-based textural features and standard classifiers. In parallel, the navigation of the sonar device is processed using Kalman filtering techniques. A simultaneous localization and mapping framework is adopted to improve the navigation accuracy and produce georeferenced mosaics of the segmented side-scan data. These are fused within a Markovian framework and two fusion models are presented. The first uses a voting scheme regularized by an isotropic Markov random field and is applicable when the reliability of each information source is unknown. The Markov model is also used to inpaint regions where no final classification decision can be reached using pixel level fusion. The second model formally introduces the reliability of each information source into a probabilistic model. Evaluation of the two models using both synthetic images and real data from a large scale survey shows significant quantitative and qualitative improvement using the fusion approach.
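
    The first fusion model, a voting scheme regularized by an isotropic Markov random field, can be sketched with iterated conditional modes (ICM) on synthetic label maps. The noise level, smoothness weight, and energy form below are illustrative choices, not the paper's exact formulation.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Three hypothetical classified mosaics of the same 40x40 seafloor patch
    # (labels 0..2), produced by different classifiers or surveys.
    truth = np.zeros((40, 40), dtype=int)
    truth[:, 20:] = 1
    truth[25:, :15] = 2
    sources = [np.where(rng.random(truth.shape) < 0.2,
                        rng.integers(0, 3, truth.shape), truth)
               for _ in range(3)]

    votes = np.zeros(truth.shape + (3,))
    for s in sources:                      # per-pixel vote counts
        for k in range(3):
            votes[..., k] += (s == k)

    labels = votes.argmax(axis=-1)
    beta = 0.8                             # isotropic smoothness weight
    for _ in range(5):                     # ICM passes for MRF regularization
        for i in range(1, 39):
            for j in range(1, 39):
                nbrs = [labels[i-1, j], labels[i+1, j],
                        labels[i, j-1], labels[i, j+1]]
                energy = [-votes[i, j, k] - beta * nbrs.count(k) for k in range(3)]
                labels[i, j] = int(np.argmin(energy))
    print("accuracy vs. truth:", (labels == truth).mean())
    ```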

  3. Search for selectron and squark production in e+p collisions at HERA

    NASA Astrophysics Data System (ADS)

    ZEUS Collaboration; Breitweg, J.; Derrick, M.; Krakauer, D.; Magill, S.; Mikunas, D.; Musgrave, B.; Repond, J.; Stanek, R.; Talaga, R. L.; Yoshida, R.; Zhang, H.; Mattingly, M. C. K.; Anselmo, F.; Antonioli, P.; Bari, G.; Basile, M.; Bellagamba, L.; Boscherini, D.; Bruni, A.; Bruni, G.; Cara Romeo, G.; Castellini, G.; Cifarelli, L.; Cindolo, F.; Contin, A.; Coppola, N.; Corradi, M.; de Pasquale, S.; Giusti, P.; Iacobucci, G.; Laurenti, G.; Levi, G.; Margotti, A.; Massam, T.; Nania, R.; Palmonari, F.; Pesci, A.; Polini, A.; Sartorelli, G.; Zamora Garcia, Y.; Zichichi, A.; Amelung, C.; Bornheim, A.; Brock, I.; Coböken, K.; Crittenden, J.; Deffner, R.; Eckert, M.; Grothe, M.; Hartmann, H.; Heinloth, K.; Heinz, L.; Hilger, E.; Jakob, H.-P.; Kappes, A.; Katz, U. F.; Kerger, R.; Paul, E.; Pfeiffer, M.; Stamm, J.; Wieber, H.; Bailey, D. S.; Campbell-Robson, S.; Cottingham, W. N.; Foster, B.; Hall-Wilton, R.; Heath, G. P.; Heath, H. F.; McFall, J. D.; Piccioni, D.; Roff, D. G.; Tapper, R. J.; Capua, M.; Iannotti, L.; Schioppa, M.; Susinno, G.; Kim, J. Y.; Lee, J. H.; Lim, I. T.; Pac, M. Y.; Caldwell, A.; Cartiglia, N.; Jing, Z.; Liu, W.; Mellado, B.; Parsons, J. A.; Ritz, S.; Sampson, S.; Sciulli, F.; Straub, P. B.; Zhu, Q.; Borzemski, P.; Chwastowski, J.; Eskreys, A.; Figiel, J.; Klimek, K.; Przybycień, M. B.; Zawiejski, L.; Adamczyk, L.; Bednarek, B.; Bukowy, M.; Czermak, A. M.; Jeleń, K.; Kisielewska, D.; Kowalski, T.; Przybycień, M.; Rulikowska-Zarębska, E.; Suszycki, L.; Zając, J.; Duliński, Z.; Kotański, A.; Abbiendi, G.; Bauerdick, L. A. T.; Behrens, U.; Beier, H.; Bienlein, J. K.; Desler, K.; Drews, G.; Fricke, U.; Gialas, I.; Goebel, F.; Göttlicher, P.; Graciani, R.; Haas, T.; Hain, W.; Hartner, G. F.; Hasell, D.; Hebbel, K.; Johnson, K. F.; Kasemann, M.; Koch, W.; Kötz, U.; Kowalski, H.; Lindemann, L.; Löhr, B.; Martínez, M.; Milewski, J.; Milite, M.; Monteiro, T.; Notz, D.; Pellegrino, A.; Pelucchi, F.; Piotrzkowski, K.; Rohde, M.; Roldán, J.; Ryan, J. J.; Saull, P. R. B.; Savin, A. A.; Schneekloth, U.; Schwarzer, O.; Selonke, F.; Stonjek, S.; Surrow, B.; Tassi, E.; Westphal, D.; Wolf, G.; Wollmer, U.; Youngman, C.; Zeuner, W.; Burow, B. D.; Coldewey, C.; Grabosch, H. J.; Meyer, A.; Schlenstedt, S.; Barbagli, G.; Gallo, E.; Pelfer, P.; Maccarrone, G.; Votano, L.; Bamberger, A.; Eisenhardt, S.; Markun, P.; Raach, H.; Trefzger, T.; Wölfle, S.; Bromley, J. T.; Brook, N. H.; Bussey, P. J.; Doyle, A. T.; Lee, S. W.; MacDonald, N.; McCance, G. J.; Saxon, D. H.; Sinclair, L. E.; Skillicorn, I. O.; Strickland, E.; Waugh, R.; Bohnet, I.; Gendner, N.; Holm, U.; Meyer-Larsen, A.; Salehi, H.; Wick, K.; Garfagnini, A.; Gladilin, L. K.; Kçira, D.; Klanner, R.; Lohrmann, E.; Poelz, G.; Zetsche, F.; Bacon, T. C.; Butterworth, I.; Cole, J. E.; Howell, G.; Lamberti, L.; Long, K. R.; Miller, D. B.; Pavel, N.; Prinias, A.; Sedgbeer, J. K.; Sideris, D.; Walker, R.; Mallik, U.; Wang, S. M.; Wu, J. T.; Cloth, P.; Filges, D.; Fleck, J. I.; Ishii, T.; Kuze, M.; Suzuki, I.; Tokushuku, K.; Yamada, S.; Yamauchi, K.; Yamazaki, Y.; Hong, S. J.; Lee, S. B.; Nam, S. W.; Park, S. K.; Lim, H.; Park, I. H.; Son, D.; Barreiro, F.; Fernández, J. P.; García, G.; Glasman, C.; Hernández, J. M.; Hervás, L.; Labarga, L.; del Peso, J.; Puga, J.; Terrón, J.; de Trocóniz, J. F.; Corriveau, F.; Hanna, D. S.; Hartmann, J.; Hung, L. W.; Murray, W. N.; Ochs, A.; Riveline, M.; Stairs, D. G.; St-Laurent, M.; Ullmann, R.; Tsurugai, T.; Bashkirov, V.; Dolgoshein, B. A.; Stifutkin, A.; Bashindzhagyan, G. L.; Ermolov, P. F.; Golubkov, Yu. A.; Khein, L. A.; Korotkova, N. A.; Korzhavina, I. A.; Kuzmin, V. A.; Lukina, O. Yu.; Proskuryakov, A. S.; Shcheglova, L. M.; Solomin, A. N.; Zotkin, S. A.; Bokel, C.; Botje, M.; Brümmer, N.; Engelen, J.; Koffeman, E.; Kooijman, P.; van Sighem, A.; Tiecke, H.; Tuning, N.; Verkerke, W.; Vossebeld, J.; Wiggers, L.; de Wolf, E.; Acosta, D.; Bylsma, B.; Durkin, L. S.; Gilmore, J.; Ginsburg, C. M.; Kim, C. L.; Ling, T. Y.; Nylander, P.; Romanowski, T. A.; Blaikley, H. E.; Cashmore, R. J.; Cooper-Sarkar, A. M.; Devenish, R. C. E.; Edmonds, J. K.; Große-Knetter, J.; Harnew, N.; Nath, C.; Noyes, V. A.; Quadt, A.; Ruske, O.; Tickner, J. R.; Walczak, R.; Waters, D. S.; Bertolin, A.; Brugnera, R.; Carlin, R.; dal Corso, F.; Dosselli, U.; Limentani, S.; Morandin, M.; Posocco, M.; Stanco, L.; Stroili, R.; Voci, C.; Bulmahn, J.; Oh, B. Y.; Okrasiński, J. R.; Toothacker, W. S.; Whitmore, J. J.; Iga, Y.; D'Agostini, G.; Marini, G.; Nigro, A.; Raso, M.; Hart, J. C.; McCubbin, N. A.; Shah, T. P.; Epperson, D.; Heusch, C.; Rahn, J. T.; Sadrozinski, H. F.-W.; Seiden, A.; Wichmann, R.; Williams, D. C.; Abramowicz, H.; Briskin, G.; Dagan, S.; Kananov, S.; Levy, A.; Abe, T.; Fusayasu, T.; Inuzuka, M.; Nagano, K.; Umemori, K.; Yamashita, T.; Hamatsu, R.; Hirose, T.; Homma, K.; Kitamura, S.; Matsushita, T.; Arneodo, M.; Cirio, R.; Costa, M.; Ferrero, M. I.; Maselli, S.; Monaco, V.; Peroni, C.; Petrucci, M. C.; Ruspa, M.; Sacchi, R.; Solano, A.; Staiano, A.; Dardo, M.; Bailey, D. C.; Fagerstroem, C.-P.; Galea, R.; Joo, K. K.; Levman, G. M.; Martin, R. S.; Orr, J. F.; Polenz, S.; Sabetfakhri, A.; Simmons, D.; Butterworth, J. M.; Catterall, C. D.; Hayes, M. E.; Jones, T. W.; Lane, J. B.; Saunders, R. L.; Sutton, M. R.; Wing, M.; Ciborowski, J.; Grzelak, G.; Kasprzak, M.; Nowak, R. J.; Pawlak, J. M.; Pawlak, R.; Smalska, B.; Tymieniecka, T.; Wróblewski, A. K.; Zakrzewski, J. A.; Żarnecki, A. F.; Adamus, M.; Deppe, O.; Eisenberg, Y.; Hochman, D.; Karshon, U.; Badgett, W. F.; Chapin, D.; Cross, R.; Dasu, S.; Foudas, C.; Loveless, R. J.; Mattingly, S.; Reeder, D. D.; Smith, W. H.; Vaiciulis, A.; Wodarczyk, M.; Deshpande, A.; Dhawan, S.; Hughes, V. W.; Bhadra, S.; Frisken, W. R.; Khakzad, M.; Schmidke, W. B.

    1998-08-01

    We have searched for the production of a selectron and a squark in e+p collisions at a center-of-mass energy of 300 GeV using the ZEUS detector at HERA. The selectron and squark are sought in the direct decay into the lightest neutralino in the framework of supersymmetric extensions to the Standard Model which conserve R-parity. No evidence for the production of supersymmetric particles has been found in a data sample corresponding to 46.6 pb⁻¹ of integrated luminosity. We express upper limits on the product of the cross section times the decay branching ratios as excluded regions in the parameter space of the Minimal Supersymmetric Standard Model.

  4. Testing for Lorentz violation: constraints on standard-model-extension parameters via lunar laser ranging.

    PubMed

    Battat, James B R; Chandler, John F; Stubbs, Christopher W

    2007-12-14

    We present constraints on violations of Lorentz invariance based on archival lunar laser-ranging (LLR) data. LLR measures the Earth-Moon separation by timing the round-trip travel of light between the two bodies and is currently accurate to the equivalent of a few centimeters (parts in 10¹¹ of the total distance). By analyzing this LLR data under the standard-model extension (SME) framework, we derived six observational constraints on dimensionless SME parameters that describe potential Lorentz violation. We found no evidence for Lorentz violation at the 10⁻⁶ to 10⁻¹¹ level in these parameters. This work constitutes the first LLR constraints on SME parameters.

  5. Effects of anisotropy on interacting ghost dark energy in Brans-Dicke theories

    NASA Astrophysics Data System (ADS)

    Hossienkhani, H.; Fayaz, V.; Azimi, N.

    2017-03-01

    In this work we concentrate on the ghost dark energy model within the framework of the Brans-Dicke theory in an anisotropic Universe. Within this framework we discuss the behavior of the equation of state, the deceleration parameter, and the dark energy density parameter of the model. We consider the squared sound speed and look for signs of stability of the model. We also probe observational constraints using the latest observational data on ghost dark energy models as a unification of dark matter and dark energy. In order to do so, we focus on observational determinations of the Hubble expansion rate (namely, the expansion history) H(z). We then evaluate the evolution of the growth of perturbations in the linear regime for both ghost DE and Brans-Dicke theory and compare the results with standard FRW and ΛCDM models. We display the effects of anisotropy on the evolutionary behavior of the ghost DE models, where the growth rate is higher in these models. Eventually, the growth factor for the ΛCDM Universe always falls behind that of the ghost DE models in an anisotropic Universe.

  6. Vulnerable Populations in Hospital and Health Care Emergency Preparedness Planning: A Comprehensive Framework for Inclusion.

    PubMed

    Kreisberg, Debra; Thomas, Deborah S K; Valley, Morgan; Newell, Shannon; Janes, Enessa; Little, Charles

    2016-04-01

    As attention to emergency preparedness becomes a critical element of health care facility operations planning, efforts to recognize and integrate the needs of vulnerable populations in a comprehensive manner have lagged. This not only results in decreased levels of equitable service, but also affects the functioning of the health care system in disasters. While this report emphasizes the United States context, the concepts and approaches apply beyond this setting. This report: (1) describes a conceptual framework that provides a model for the inclusion of vulnerable populations into integrated health care and public health preparedness; and (2) applies this model to a pilot study. The framework is derived from literature, hospital regulatory policy, and health care standards, laying out the communication and relational interfaces that must occur at the systems, organizational, and community levels for a successful multi-level health care systems response that is inclusive of diverse populations explicitly. The pilot study illustrates the application of key elements of the framework, using a four-pronged approach that incorporates both quantitative and qualitative methods for deriving information that can inform hospital and health facility preparedness planning. The conceptual framework and model, applied to a pilot project, guide expanded work that ultimately can result in methodologically robust approaches to comprehensively incorporating vulnerable populations into the fabric of hospital disaster preparedness at levels from local to national, thus supporting best practices for a community resilience approach to disaster preparedness.

  7. Identifying fMRI Model Violations with Lagrange Multiplier Tests

    PubMed Central

    Cassidy, Ben; Long, Christopher J; Rae, Caroline; Solo, Victor

    2013-01-01

    The standard modeling framework in Functional Magnetic Resonance Imaging (fMRI) is predicated on assumptions of linearity, time invariance and stationarity. These assumptions are rarely checked because doing so requires specialised software, although failure to do so can lead to bias and mistaken inference. Identifying model violations is an essential but largely neglected step in standard fMRI data analysis. Using Lagrange Multiplier testing methods we have developed simple and efficient procedures for detecting model violations such as non-linearity, non-stationarity and validity of the common Double Gamma specification for hemodynamic response. These procedures are computationally cheap and can easily be added to a conventional analysis. The test statistic is calculated at each voxel and displayed as a spatial anomaly map which shows regions where a model is violated. The methodology is illustrated with a large number of real data examples. PMID:22542665
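
    The general shape of such a test can be sketched outside the fMRI setting: fit the null model, regress its residuals on the augmented design, and use LM = n·R² of that auxiliary regression, which is asymptotically chi-square. The example below checks for a neglected quadratic term in a toy regression; the paper's voxelwise statistics and hemodynamic specifics are not reproduced.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n = 200
    x = rng.normal(size=n)                    # stand-in regressor
    y = 1.0 + 2.0 * x + 0.5 * x**2 + rng.normal(size=n)   # truth is nonlinear

    def lm_test(y, X, Z):
        """Lagrange multiplier test: does adding columns Z improve the fit?
        Fit the null model y ~ X, regress residuals on [X Z]; LM = n * R^2,
        asymptotically chi-square with Z.shape[1] degrees of freedom."""
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta
        W = np.hstack([X, Z])
        gamma, *_ = np.linalg.lstsq(W, resid, rcond=None)
        fitted = W @ gamma
        r2 = 1.0 - np.sum((resid - fitted) ** 2) / np.sum((resid - resid.mean()) ** 2)
        return len(y) * r2

    X = np.column_stack([np.ones(n), x])      # null: linear model
    Z = (x**2)[:, None]                       # alternative: quadratic term
    print("LM statistic:", lm_test(y, X, Z))  # compare to chi2(1) critical ~3.84
    ```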

  8. Conformal standard model, leptogenesis, and dark matter

    NASA Astrophysics Data System (ADS)

    Lewandowski, Adrian; Meissner, Krzysztof A.; Nicolai, Hermann

    2018-02-01

    The conformal standard model is a minimal extension of the Standard Model (SM) of particle physics based on the assumed absence of large intermediate scales between the TeV scale and the Planck scale, which incorporates only right-chiral neutrinos and a new complex scalar in addition to the usual SM degrees of freedom, but no other features such as supersymmetric partners. In this paper, we present a comprehensive quantitative analysis of this model, and show that all outstanding issues of particle physics proper can in principle be solved "in one go" within this framework. This includes in particular the stabilization of the electroweak scale, "minimal" leptogenesis and the explanation of dark matter, with a small mass and very weakly interacting Majoron as the dark matter candidate (for which we propose to use the name "minoron"). The main testable prediction of the model is a new and almost sterile scalar boson that would manifest itself as a narrow resonance in the TeV region. We give a representative range of parameter values consistent with our assumptions and with observation.

  9. Search for physics beyond the standard model in final states with a lepton and missing transverse energy in proton-proton collisions at √s = 8 TeV

    DOE PAGES

    Khachatryan, Vardan

    2015-05-22

    A search for new physics in proton-proton collisions having final states with an electron or muon and missing transverse energy is presented. The analysis uses data collected in 2012 with the CMS detector, at an LHC center-of-mass energy of 8 TeV, and corresponding to an integrated luminosity of 19.7 fb⁻¹. No significant deviation of the transverse mass distribution of the charged lepton-neutrino system from the standard model prediction is found. Mass exclusion limits of up to 3.28 TeV at a 95% confidence level for a W′ boson with the same couplings as that of the standard model W boson are determined. Results are also derived in the framework of split universal extra dimensions, and exclusion limits on Kaluza-Klein W^(2)_KK states are found. The final state with large missing transverse energy also enables a search for dark matter production with a recoiling W boson, with limits set on the mass and the production cross section of potential candidates. Finally, limits are established for a model including interference between a left-handed W′ boson and the standard model W boson, and for a compositeness model.

  10. Assessing Inter-Sectoral Climate Change Risks: The Role of ISIMIP

    NASA Technical Reports Server (NTRS)

    Rosenzweig, Cynthia; Arnell, Nigel W.; Ebi, Kristie L.; Lotze-Campen, Hermann; Raes, Frank; Rapley, Chris; Smith, Mark Stafford; Cramer, Wolfgang; Frieler, Katja; Reyer, Christopher P. O.

    2017-01-01

    The aims of the Inter-Sectoral Impact Model Intercomparison Project (ISIMIP) are to provide a framework for the intercomparison of global and regional-scale risk models within and across multiple sectors and to enable coordinated multi-sectoral assessments of different risks and their aggregated effects. The overarching goal is to use the knowledge gained to support adaptation and mitigation decisions that require regional or global perspectives within the context of facilitating transformations to enable sustainable development, despite inevitable climate shifts and disruptions. ISIMIP uses community-agreed sets of scenarios with standardized climate variables and socioeconomic projections as inputs for projecting future risks and associated uncertainties, within and across sectors. The results are consistent multi-model assessments of sectoral risks and opportunities that enable studies that integrate across sectors, providing support for implementation of the Paris Agreement under the United Nations Framework Convention on Climate Change.

  11. Nested Interrupt Analysis of Low Cost and High Performance Embedded Systems Using GSPN Framework

    NASA Astrophysics Data System (ADS)

    Lin, Cheng-Min

    Interrupt service routines are a key technology for embedded systems. In this paper, we introduce the standard approach of using Generalized Stochastic Petri Nets (GSPNs) as a high-level model for generating Continuous-Time Markov Chains (CTMCs), and then use Markov Reward Models (MRMs) to compute the performance of embedded systems. This framework is employed to analyze two low-cost, high-performance embedded controllers, ARM7 and Cortex-M3. Cortex-M3 is designed with a tail-chaining mechanism to improve on the performance of ARM7 when a nested interrupt occurs on an embedded controller. The Platform Independent Petri net Editor 2 (PIPE2) tool is used to model and evaluate the controllers in terms of power consumption and interrupt overhead performance. The numerical results show that, in terms of both power consumption and interrupt overhead, Cortex-M3 performs better than ARM7.
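
    The GSPN-to-CTMC-to-MRM pipeline can be sketched on a toy three-state interrupt chain: build a generator matrix, solve for the steady-state distribution, and take the expectation of a reward vector. The states, rates, and power rewards below are invented, not the paper's measured ARM7/Cortex-M3 figures.

    ```python
    import numpy as np

    # Toy CTMC abstracted from a GSPN: states = (idle, in ISR1, nested in ISR2).
    lam1, lam2, mu1, mu2 = 2.0, 1.0, 10.0, 20.0   # arrival and service rates
    Q = np.array([
        [-lam1,         lam1,         0.0],   # idle -> ISR1
        [  mu1, -(mu1 + lam2),       lam2],   # ISR1 -> idle, or nest into ISR2
        [  0.0,          mu2,       -mu2],    # ISR2 -> back to ISR1
    ])

    # Steady state: pi Q = 0 with sum(pi) = 1, solved as a least-squares system.
    A = np.vstack([Q.T, np.ones(3)])
    b = np.array([0.0, 0.0, 0.0, 1.0])
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)

    reward = np.array([0.0, 1.0, 1.3])        # Markov reward: relative power draw
    print("state probabilities:", pi.round(4))
    print("expected power (arbitrary units):", pi @ reward)
    ```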

  12. Evaluating Mobile Survey Tools (MSTs) for Field-Level Monitoring and Data Collection: Development of a Novel Evaluation Framework, and Application to MSTs for Rural Water and Sanitation Monitoring

    PubMed Central

    Fisher, Michael B.; Mann, Benjamin H.; Cronk, Ryan D.; Shields, Katherine F.; Klug, Tori L.; Ramaswamy, Rohit

    2016-01-01

    Information and communications technologies (ICTs) such as mobile survey tools (MSTs) can facilitate field-level data collection to drive improvements in national and international development programs. MSTs allow users to gather and transmit field data in real time, standardize data storage and management, automate routine analyses, and visualize data. Dozens of diverse MST options are available, and users may struggle to select suitable options. We developed a systematic MST Evaluation Framework (EF), based on International Organization for Standardization/International Electrotechnical Commission (ISO/IEC) software quality modeling standards, to objectively assess MSTs and assist program implementers in identifying suitable MST options. The EF is applicable to MSTs for a broad variety of applications. We also conducted an MST user survey to elucidate needs and priorities of current MST users. Finally, the EF was used to assess seven MSTs currently used for water and sanitation monitoring, as a validation exercise. The results suggest that the EF is a promising method for evaluating MSTs. PMID:27563916
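
    An ISO/IEC-style quality evaluation of this kind often reduces to weighted scoring over quality characteristics. The sketch below is a hypothetical illustration only: the characteristic names, weights, and ratings are invented and are not the EF's actual criteria.

    ```python
    # Hypothetical weighted-scoring sketch inspired by ISO/IEC quality models.
    CRITERIA = {"functional suitability": 0.3, "usability": 0.25,
                "reliability": 0.2, "portability": 0.15, "maintainability": 0.1}

    def ef_score(ratings):
        """Weighted sum of 0-5 ratings over quality characteristics."""
        assert abs(sum(CRITERIA.values()) - 1.0) < 1e-9
        return sum(CRITERIA[c] * ratings[c] for c in CRITERIA)

    mst_a = {"functional suitability": 4, "usability": 5, "reliability": 3,
             "portability": 4, "maintainability": 2}
    mst_b = {"functional suitability": 5, "usability": 3, "reliability": 4,
             "portability": 3, "maintainability": 4}
    for name, r in [("MST A", mst_a), ("MST B", mst_b)]:
        print(name, round(ef_score(r), 2))
    ```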

  13. Evaluating Mobile Survey Tools (MSTs) for Field-Level Monitoring and Data Collection: Development of a Novel Evaluation Framework, and Application to MSTs for Rural Water and Sanitation Monitoring.

    PubMed

    Fisher, Michael B; Mann, Benjamin H; Cronk, Ryan D; Shields, Katherine F; Klug, Tori L; Ramaswamy, Rohit

    2016-08-23

    Information and communications technologies (ICTs) such as mobile survey tools (MSTs) can facilitate field-level data collection to drive improvements in national and international development programs. MSTs allow users to gather and transmit field data in real time, standardize data storage and management, automate routine analyses, and visualize data. Dozens of diverse MST options are available, and users may struggle to select suitable options. We developed a systematic MST Evaluation Framework (EF), based on International Organization for Standardization/International Electrotechnical Commission (ISO/IEC) software quality modeling standards, to objectively assess MSTs and assist program implementers in identifying suitable MST options. The EF is applicable to MSTs for a broad variety of applications. We also conducted an MST user survey to elucidate needs and priorities of current MST users. Finally, the EF was used to assess seven MSTs currently used for water and sanitation monitoring, as a validation exercise. The results suggest that the EF is a promising method for evaluating MSTs.
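
    The ISO/IEC quality-model style of evaluation behind the EF reduces to scoring each tool against weighted criteria and ranking the results. The sketch below illustrates that pattern; the criteria, weights, and scores are hypothetical and are not the EF's actual rubric.

    ```python
    # Hypothetical sketch of an evaluation-framework scoring pass: criteria,
    # weights, and per-tool scores are invented for illustration.
    criteria_weights = {"functional_suitability": 0.3, "usability": 0.25,
                        "portability": 0.2, "cost": 0.25}

    mst_scores = {  # per-tool scores on a 1-5 scale (illustrative)
        "ToolA": {"functional_suitability": 4, "usability": 3, "portability": 5, "cost": 2},
        "ToolB": {"functional_suitability": 3, "usability": 5, "portability": 3, "cost": 4},
    }

    def weighted_score(scores: dict) -> float:
        return sum(criteria_weights[c] * s for c, s in scores.items())

    for tool, scores in sorted(mst_scores.items(), key=lambda kv: -weighted_score(kv[1])):
        print(f"{tool}: {weighted_score(scores):.2f}")
    ```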

  14. Improving Quality and Reducing Waste in Allied Health Workplace Education Programs: A Pragmatic Operational Education Framework Approach.

    PubMed

    Golder, Janet; Farlie, Melanie K; Sevenhuysen, Samantha

    2016-01-01

    Efficient utilisation of education resources is required for the delivery of effective learning opportunities for allied health professionals. This study aimed to develop an education framework to support delivery of high-quality education within existing education resources. This study was conducted in a large metropolitan health service. Homogenous and purposive sampling methods were utilised in Phase 1 (n=43) and 2 (n=14) consultation stages. Participants included 25 allied health professionals, 22 managers, 1 educator, and 3 executives. Field notes taken during 43 semi-structured interviews and 4 focus groups were member-checked, and semantic thematic analysis methods were utilised. Framework design was informed by existing published framework development guides. The framework model contains governance, planning, delivery, and evaluation and research elements and identifies performance indicators, practice examples, and support tools for a range of stakeholders. Themes integrated into framework content include improving quality of education and training provided and delivery efficiency, greater understanding of education role requirements, and workforce support for education-specific knowledge and skill development. This framework supports efficient delivery of allied health workforce education and training to the highest standard, whilst pragmatically considering current allied health education workforce demands.

  15. Measuring adverse events in helicopter emergency medical services: establishing content validity.

    PubMed

    Patterson, P Daniel; Lave, Judith R; Martin-Gill, Christian; Weaver, Matthew D; Wadas, Richard J; Arnold, Robert M; Roth, Ronald N; Mosesso, Vincent N; Guyette, Francis X; Rittenberger, Jon C; Yealy, Donald M

    2014-01-01

    We sought to create a valid framework for detecting adverse events (AEs) in the high-risk setting of helicopter emergency medical services (HEMS). We assembled a panel of 10 expert clinicians (n = 6 emergency medicine physicians and n = 4 prehospital nurses and flight paramedics) affiliated with a large multistate HEMS organization in the Northeast US. We used a modified Delphi technique to develop a framework for detecting AEs associated with the treatment of critically ill or injured patients. We used a widely applied measure, the content validity index (CVI), to quantify the validity of the framework's content. The expert panel of 10 clinicians reached consensus on a common AE definition and four-step protocol/process for AE detection in HEMS. The consensus-based framework is composed of three main components: (1) a trigger tool, (2) a method for rating proximal cause, and (3) a method for rating AE severity. The CVI findings isolate components of the framework considered content valid. We demonstrate a standardized process for the development of a content-valid framework for AE detection. The framework is a model for the development of a method for AE identification in other settings, including ground-based EMS.
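
    The content validity index is conventionally computed per item as the proportion of experts rating the item relevant (e.g., 3 or 4 on a 4-point scale). A minimal sketch, with invented ratings for a 10-expert panel:

    ```python
    # Item-level content validity index (I-CVI): fraction of experts rating
    # an item 3 or 4 on a 4-point relevance scale. Ratings are illustrative.
    ratings = {  # item -> list of ratings from a 10-expert panel
        "trigger_tool_item_1": [4, 4, 3, 4, 3, 4, 4, 3, 4, 4],
        "severity_item_2":     [2, 3, 4, 3, 2, 4, 3, 3, 2, 4],
    }

    def i_cvi(item_ratings):
        return sum(r >= 3 for r in item_ratings) / len(item_ratings)

    for item, rs in ratings.items():
        # values >= 0.78 are often taken as content valid for larger panels
        print(item, round(i_cvi(rs), 2))
    ```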

  16. A Classroom Entry and Exit Game of Supply with Price-Taking Firms

    ERIC Educational Resources Information Center

    Cheung, Stephen L.

    2005-01-01

    The author describes a classroom game demonstrating the process of adjustment to long-run equilibrium in a market consisting of price-taking firms. This game unites and extends key insights from several simpler games in a framework more consistent with the standard textbook model of a competitive industry. Because firms have increasing marginal…

  17. A Note on the Treatment of Uncertainty in Economics and Finance

    ERIC Educational Resources Information Center

    Carilli, Anthony M.; Dempster, Gregory M.

    2003-01-01

    The treatment of uncertainty in the business classroom has been dominated by the application of risk theory to the utility-maximization framework. Nonetheless, the relevance of the standard risk model as a positive description of economic decision making often has been called into question in theoretical work. In this article, the authors offer an…

  18. Impacts | Wind | NREL

    Science.gov Websites

    [Page image: workers in hard hats standing on top of a large wind turbine, overlooking several other wind turbines.] Featured stories: Framework Transforms FAST Wind Turbine Modeling Tool; NREL Assesses National Design Standards for Offshore Wind Resource; NREL Identifies Investments for Wind Turbine Drivetrain Technologies; R&D 100 Awards.

  19. An Evaluation of a School-Based Teenage Pregnancy Prevention Program Using a Logic Model Framework

    ERIC Educational Resources Information Center

    Hulton, Linda J.

    2007-01-01

    Teenage pregnancy and the subsequent social morbidities associated with unintended pregnancies are complex issues facing school nurses in their daily work. In contemporary practice, school nurses are being held to higher standards of accountability and being asked to demonstrate the effective outcomes of their interventions. The purpose of this…

  20. Engineering Design for Engineering Design: Benefits, Models, and Examples from Practice

    ERIC Educational Resources Information Center

    Turner, Ken L., Jr.; Kirby, Melissa; Bober, Sue

    2016-01-01

    Engineering design, a framework for studying and solving societal problems, is a key component of STEM education. It is also the area of greatest challenge within the Next Generation Science Standards, NGSS. Many teachers feel underprepared to teach or create activities that feature engineering design, and integrating a lesson plan of core content…

  1. Developing standards for a national spatial data infrastructure

    USGS Publications Warehouse

    Wortman, Kathryn C.

    1994-01-01

    The concept of a framework for data and information linkages among producers and users, known as a National Spatial Data Infrastructure (NSDI), is built upon four corners: data, technology, institutions, and standards. Standards are paramount to increasing the efficiency and effectiveness of the NSDI. Historically, data standards and specifications have been developed with a very limited scope: they were parochial, even competitive in nature, and promoted the sharing of data and information within only a small community at the expense of more open sharing across many communities. Today, an approach is needed to grow and evolve standards to support open systems and provide consistency and uniformity among data producers. There are several significant ongoing activities in geospatial data standards: transfer or exchange, metadata, and data content. In addition, standards in other areas are under discussion, including data quality, data models, and data collection.

  2. Purpose, processes, partnerships, and products: four Ps to advance participatory socio-environmental modeling

    USGS Publications Warehouse

    Gray, Steven; Voinov, Alexey; Paolisso, Michael; Jordan, Rebecca; BenDor, Todd; Bommel, Pierre; Glynn, Pierre D.; Hedelin, Beatrice; Hubacek, Klaus; Introne, Josh; Kolagani, Nagesh; Laursen, Bethany; Prell, Christina; Schmitt-Olabisi, Laura; Singer, Alison; Sterling, Eleanor J.; Zellner, Moira

    2018-01-01

    Including stakeholders in environmental model building and analysis is an increasingly popular approach to understanding ecological change. This is because stakeholders often hold valuable knowledge about socio-environmental dynamics and collaborative forms of modeling produce important boundary objects used to collectively reason about environmental problems. Although the number of participatory modeling (PM) case studies and the number of researchers adopting these approaches has grown in recent years, the lack of standardized reporting and limited reproducibility have prevented PM's establishment and advancement as a cohesive field of study. We suggest a four-dimensional framework (4P) that includes reporting on dimensions of (1) the Purpose for selecting a PM approach (the why); (2) the Process by which the public was involved in model building or evaluation (the how); (3) the Partnerships formed (the who); and (4) the Products that resulted from these efforts (the what). We highlight four case studies that use common PM software-based approaches (fuzzy cognitive mapping, agent-based modeling, system dynamics, and participatory geospatial modeling) to understand human–environment interactions and the consequences of ecological changes, including bushmeat hunting in Tanzania and Cameroon, agricultural production and deforestation in Zambia, and groundwater management in India. We demonstrate how standardizing communication about PM case studies can lead to innovation and new insights about model-based reasoning in support of ecological policy development. We suggest that our 4P framework and reporting approach provides a way for new hypotheses to be identified and tested in the growing field of PM.

  3. Purpose, processes, partnerships, and products: four Ps to advance participatory socio-environmental modeling.

    PubMed

    Gray, Steven; Voinov, Alexey; Paolisso, Michael; Jordan, Rebecca; BenDor, Todd; Bommel, Pierre; Glynn, Pierre; Hedelin, Beatrice; Hubacek, Klaus; Introne, Josh; Kolagani, Nagesh; Laursen, Bethany; Prell, Christina; Schmitt Olabisi, Laura; Singer, Alison; Sterling, Eleanor; Zellner, Moira

    2018-01-01

    Including stakeholders in environmental model building and analysis is an increasingly popular approach to understanding ecological change. This is because stakeholders often hold valuable knowledge about socio-environmental dynamics and collaborative forms of modeling produce important boundary objects used to collectively reason about environmental problems. Although the number of participatory modeling (PM) case studies and the number of researchers adopting these approaches has grown in recent years, the lack of standardized reporting and limited reproducibility have prevented PM's establishment and advancement as a cohesive field of study. We suggest a four-dimensional framework (4P) that includes reporting on dimensions of (1) the Purpose for selecting a PM approach (the why); (2) the Process by which the public was involved in model building or evaluation (the how); (3) the Partnerships formed (the who); and (4) the Products that resulted from these efforts (the what). We highlight four case studies that use common PM software-based approaches (fuzzy cognitive mapping, agent-based modeling, system dynamics, and participatory geospatial modeling) to understand human-environment interactions and the consequences of ecological changes, including bushmeat hunting in Tanzania and Cameroon, agricultural production and deforestation in Zambia, and groundwater management in India. We demonstrate how standardizing communication about PM case studies can lead to innovation and new insights about model-based reasoning in support of ecological policy development. We suggest that our 4P framework and reporting approach provides a way for new hypotheses to be identified and tested in the growing field of PM.

  4. Anisotropic neutron stars in R2 gravity

    NASA Astrophysics Data System (ADS)

    Folomeev, Vladimir

    2018-06-01

    We consider static neutron stars within the framework of R2 gravity. The neutron fluid is described by three different types of realistic equations of state (soft, moderately stiff, and stiff). Using observational data on the neutron star mass-radius relation, we demonstrate that the characteristics of objects supported by an isotropic fluid agree with the observations only for the soft equation of state. We show that including fluid anisotropy also enables one to employ stiffer equations of state to model configurations that satisfy the observational constraints. Finally, using the standard thin accretion disk model, we demonstrate potentially observable differences that would allow one to distinguish neutron stars constructed within the modified gravity framework from those described by Einstein's general relativity.
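
    For reference, the action usually meant by "R2 gravity" is the quadratic f(R) extension of general relativity sketched below; the normalization of the coupling α is illustrative and may differ from the paper's conventions.

    ```latex
    % Generic quadratic f(R) action commonly meant by "R^2 gravity"; the sign
    % and normalization conventions of the coupling \alpha are illustrative.
    S = \frac{1}{16\pi G}\int d^4x\,\sqrt{-g}\,\bigl(R + \alpha R^2\bigr)
        + S_{\mathrm{matter}}[g_{\mu\nu}, \psi]
    ```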

  5. Leverage hadoop framework for large scale clinical informatics applications.

    PubMed

    Dong, Xiao; Bahroos, Neil; Sadhu, Eugene; Jackson, Tommie; Chukhman, Morris; Johnson, Robert; Boyd, Andrew; Hynes, Denise

    2013-01-01

    In this manuscript, we present our experiences using the Apache Hadoop framework for high-data-volume and computationally intensive applications, and discuss some best-practice guidelines in a clinical informatics setting. There are three main aspects to our approach: (a) process and integrate diverse, heterogeneous data sources using standard Hadoop programming tools and customized MapReduce programs; (b) once fine-grained aggregate results are obtained, perform data analysis using the Mahout data mining library; (c) leverage the column-oriented features of HBase for patient-centric modeling and complex temporal reasoning. This framework provides a scalable solution to meet the rapidly increasing "Big Data" needs of clinical and translational research. The intrinsic fault tolerance, high availability, and scalability of the Hadoop platform make these applications readily deployable in enterprise-level cluster environments.
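
    The mapper/reducer pattern described in point (a) can be sketched in a few lines of Python in the style of Hadoop Streaming. The record format (tab-separated patient ID and event) and the counting task are invented for illustration; the local sort stands in for Hadoop's shuffle phase.

    ```python
    # Minimal Hadoop-Streaming-style mapper/reducer pair sketching the kind
    # of fine-grained aggregation described above: encounters per patient.
    # The input format (tab-separated patient_id, event) is hypothetical.
    from itertools import groupby

    def mapper(lines):
        for line in lines:
            patient_id, _event = line.rstrip("\n").split("\t", 1)
            yield patient_id, 1

    def reducer(pairs):                      # pairs arrive sorted by key
        for patient_id, group in groupby(pairs, key=lambda kv: kv[0]):
            yield patient_id, sum(count for _, count in group)

    if __name__ == "__main__":
        data = ["p1\tadmit", "p1\tlab", "p2\tadmit"]
        shuffled = sorted(mapper(data))      # stand-in for Hadoop's shuffle/sort
        for pid, total in reducer(shuffled):
            print(pid, total)
    ```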

  6. Effect of Ionic Diffusion on Extracellular Potentials in Neural Tissue

    PubMed Central

    Halnes, Geir; Mäki-Marttunen, Tuomo; Keller, Daniel; Pettersen, Klas H.; Andreassen, Ole A.

    2016-01-01

    Recorded potentials in the extracellular space (ECS) of the brain are a standard measure of population activity in neural tissue. Computational models that simulate the relationship between the ECS potential and its underlying neurophysiological processes are commonly used in the interpretation of such measurements. Standard methods, such as volume-conductor theory and current-source density theory, assume that diffusion has a negligible effect on the ECS potential, at least in the range of frequencies picked up by most recording systems. This assumption remains to be verified. We here present a hybrid simulation framework that accounts for diffusive effects on the ECS potential. The framework uses (1) the NEURON simulator to compute the activity and ionic output currents from multicompartmental neuron models, and (2) the electrodiffusive Kirchhoff-Nernst-Planck framework to simulate the resulting dynamics of the potential and ion concentrations in the ECS, accounting for the effect of electrical migration as well as diffusion. Using this framework, we explore the effect that ECS diffusion has on the electrical potential surrounding a small population of 10 pyramidal neurons. The neural model was tuned so that simulations over ∼100 seconds of biological time led to shifts in ECS concentrations by a few millimolar, similar to what has been seen in experiments. By comparing simulations where ECS diffusion was absent with simulations where ECS diffusion was included, we made the following key findings: (i) ECS diffusion shifted the local potential by up to ∼0.2 mV. (ii) The power spectral density (PSD) of the diffusion-evoked potential shifts followed a 1/f² power law. (iii) Diffusion effects dominated the PSD of the ECS potential for frequencies up to several hertz. In scenarios with large, but physiologically realistic ECS concentration gradients, diffusion was thus found to affect the ECS potential well within the frequency range picked up in experimental recordings. PMID:27820827

  7. Effect of Ionic Diffusion on Extracellular Potentials in Neural Tissue.

    PubMed

    Halnes, Geir; Mäki-Marttunen, Tuomo; Keller, Daniel; Pettersen, Klas H; Andreassen, Ole A; Einevoll, Gaute T

    2016-11-01

    Recorded potentials in the extracellular space (ECS) of the brain are a standard measure of population activity in neural tissue. Computational models that simulate the relationship between the ECS potential and its underlying neurophysiological processes are commonly used in the interpretation of such measurements. Standard methods, such as volume-conductor theory and current-source density theory, assume that diffusion has a negligible effect on the ECS potential, at least in the range of frequencies picked up by most recording systems. This assumption remains to be verified. We here present a hybrid simulation framework that accounts for diffusive effects on the ECS potential. The framework uses (1) the NEURON simulator to compute the activity and ionic output currents from multicompartmental neuron models, and (2) the electrodiffusive Kirchhoff-Nernst-Planck framework to simulate the resulting dynamics of the potential and ion concentrations in the ECS, accounting for the effect of electrical migration as well as diffusion. Using this framework, we explore the effect that ECS diffusion has on the electrical potential surrounding a small population of 10 pyramidal neurons. The neural model was tuned so that simulations over ∼100 seconds of biological time led to shifts in ECS concentrations by a few millimolar, similar to what has been seen in experiments. By comparing simulations where ECS diffusion was absent with simulations where ECS diffusion was included, we made the following key findings: (i) ECS diffusion shifted the local potential by up to ∼0.2 mV. (ii) The power spectral density (PSD) of the diffusion-evoked potential shifts followed a 1/f² power law. (iii) Diffusion effects dominated the PSD of the ECS potential for frequencies up to several hertz. In scenarios with large, but physiologically realistic ECS concentration gradients, diffusion was thus found to affect the ECS potential well within the frequency range picked up in experimental recordings.
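
    The 1/f² power law reported in finding (ii) is easy to reproduce on synthetic data: integrated white noise (a random walk) has exactly that spectrum. A short sketch using SciPy's Welch estimator, with an arbitrary sample rate:

    ```python
    import numpy as np
    from scipy.signal import welch

    # A random walk (integrated white noise) has a 1/f^2 power spectrum, the
    # same power law reported for the diffusion-evoked potential shifts.
    rng = np.random.default_rng(0)
    fs = 1000.0                            # sample rate in Hz (illustrative)
    x = np.cumsum(rng.standard_normal(200_000)) / fs

    f, psd = welch(x, fs=fs, nperseg=8192)
    mask = (f > 0.5) & (f < 50.0)          # fit in a low-frequency band
    slope, _ = np.polyfit(np.log10(f[mask]), np.log10(psd[mask]), 1)
    print(f"log-log PSD slope ~ {slope:.2f} (expect about -2 for a 1/f^2 law)")
    ```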

  8. Assessing concentrations and health impacts of air quality management strategies: Framework for Rapid Emissions Scenario and Health impact ESTimation (FRESH-EST).

    PubMed

    Milando, Chad W; Martenies, Sheena E; Batterman, Stuart A

    2016-09-01

    In air quality management, reducing emissions from pollutant sources often forms the primary response to attaining air quality standards and guidelines. Despite the broad success of air quality management in the US, challenges remain. As examples: allocating emissions reductions among multiple sources is complex and can require many rounds of negotiation; health impacts associated with emissions, the ultimate driver for the standards, are not explicitly assessed; and long dispersion model run-times, which result from the increasing size and complexity of model inputs, limit the number of scenarios that can be evaluated, thus increasing the likelihood of missing an optimal strategy. A new modeling framework, called the "Framework for Rapid Emissions Scenario and Health impact ESTimation" (FRESH-EST), is presented to respond to these challenges. FRESH-EST estimates concentrations and health impacts of alternative emissions scenarios at the urban scale, providing efficient computations from emissions to health impacts at the Census block or other desired spatial scale. In addition, FRESH-EST can optimize emission reductions to meet specified environmental and health constraints, and a convenient user interface and graphical displays are provided to facilitate scenario evaluation. The new framework is demonstrated in an SO2 non-attainment area in southeast Michigan with two optimization strategies: the first minimizes emission reductions needed to achieve a target concentration; the second minimizes concentrations while holding constant the cumulative emissions across local sources (e.g., an emissions floor). The optimized strategies match outcomes in the proposed SO2 State Implementation Plan without the proposed stack parameter modifications or shutdowns. In addition, the lower health impacts estimated for these strategies suggest that FRESH-EST could be used to identify potentially more desirable pollution control alternatives in air quality management planning.
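
    The first optimization strategy (minimize emission reductions subject to a concentration target) becomes a linear program when source-receptor relationships are approximated by fixed transfer coefficients. The toy sketch below shows that formulation; the coefficients, baseline concentrations, and bounds are hypothetical and not the Michigan SO2 case.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Toy version of the first strategy: choose per-source reductions r_j
    # (tons/yr) minimizing total reduction while bringing every receptor
    # below a target. Transfer coefficients T[i, j] (ug/m^3 per ton/yr) and
    # all other numbers are hypothetical.
    T = np.array([[0.020, 0.005, 0.010],
                  [0.004, 0.030, 0.008]])
    baseline = np.array([1.9, 2.4])        # current concentrations, ug/m^3
    target = 1.5                           # standard to attain, ug/m^3
    max_reduction = np.array([60.0, 50.0, 80.0])  # cannot cut more than emitted

    # minimize sum(r)  s.t.  baseline - T @ r <= target,  0 <= r <= max_reduction
    res = linprog(c=np.ones(3),
                  A_ub=-T, b_ub=target - baseline,
                  bounds=list(zip(np.zeros(3), max_reduction)))
    print("per-source reductions:", np.round(res.x, 1), "feasible:", res.success)
    ```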

  9. A reflection and evaluation model of comparative thinking.

    PubMed

    Markman, Keith D; McMullen, Matthew N

    2003-01-01

    This article reviews research on counterfactual, social, and temporal comparisons and proposes a Reflection and Evaluation Model (REM) as an organizing framework. At the heart of the model is the assertion that 2 psychologically distinct modes of mental simulation operate during comparative thinking: reflection, an experiential ("as if") mode of thinking characterized by vividly simulating that information about the comparison standard is true of, or part of, the self; and evaluation, an evaluative mode of thinking characterized by the use of information about the standard as a reference point against which to evaluate one's present standing. Reflection occurs when information about the standard is included in one's self-construal, and evaluation occurs when such information is excluded. The result of reflection is that standard-consistent cognitions about the self become highly accessible, thereby yielding affective assimilation; whereas the result of evaluation is that comparison information is used as a standard against which one's present standing is evaluated, thereby yielding affective contrast. The resulting affect leads to either an increase or decrease in behavioral persistence as a function of the type of task with which one is engaged, and a combination of comparison-derived causal inferences and regulatory focus strategies direct one toward adopting specific future action plans.

  10. Translating Radiometric Requirements for Satellite Sensors to Match International Standards.

    PubMed

    Pearlman, Aaron; Datla, Raju; Kacker, Raghu; Cao, Changyong

    2014-01-01

    International scientific standards organizations created standards on evaluating uncertainty in the early 1990s. Although scientists from many fields use these standards, they are not consistently implemented in the remote sensing community, where a traditional error analysis framework persists. For a satellite instrument under development, this can create confusion in showing whether requirements are met. We aim to create a methodology for translating requirements from the error analysis framework to the modern uncertainty approach using the product-level requirements of the Advanced Baseline Imager (ABI) that will fly on the Geostationary Operational Environmental Satellite R-Series (GOES-R). In this paper we prescribe a method to combine several measurement performance requirements, written using a traditional error analysis framework, into a single specification using the propagation of uncertainties formula. By using this approach, scientists can communicate requirements in a consistent uncertainty framework, leading to uniform interpretation throughout the development and operation of any satellite instrument.

  11. Translating Radiometric Requirements for Satellite Sensors to Match International Standards

    PubMed Central

    Pearlman, Aaron; Datla, Raju; Kacker, Raghu; Cao, Changyong

    2014-01-01

    International scientific standards organizations created standards on evaluating uncertainty in the early 1990s. Although scientists from many fields use these standards, they are not consistently implemented in the remote sensing community, where a traditional error analysis framework persists. For a satellite instrument under development, this can create confusion in showing whether requirements are met. We aim to create a methodology for translating requirements from the error analysis framework to the modern uncertainty approach using the product-level requirements of the Advanced Baseline Imager (ABI) that will fly on the Geostationary Operational Environmental Satellite R-Series (GOES-R). In this paper we prescribe a method to combine several measurement performance requirements, written using a traditional error analysis framework, into a single specification using the propagation of uncertainties formula. By using this approach, scientists can communicate requirements in a consistent uncertainty framework, leading to uniform interpretation throughout the development and operation of any satellite instrument. PMID:26601032
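
    The propagation-of-uncertainties formula referred to above combines independent requirement terms in quadrature (root-sum-square). A minimal worked example, with invented component magnitudes rather than the actual ABI requirements:

    ```python
    import math

    # Combining independent error/uncertainty terms into a single
    # specification via the propagation-of-uncertainties (root-sum-square)
    # formula. The three component terms below are purely illustrative.
    components_percent = {"noise": 0.30, "calibration_bias": 0.40, "striping": 0.15}

    combined = math.sqrt(sum(u**2 for u in components_percent.values()))
    print(f"combined standard uncertainty: {combined:.2f}%")  # ~0.52%
    ```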

  12. Information Object Definition–based Unified Modeling Language Representation of DICOM Structured Reporting

    PubMed Central

    Tirado-Ramos, Alfredo; Hu, Jingkun; Lee, K.P.

    2002-01-01

    Supplement 23 to DICOM (Digital Imaging and Communications in Medicine), Structured Reporting, is a specification that supports a semantically rich representation of image and waveform content, enabling experts to share image and related patient information. DICOM SR supports the representation of textual and coded data linked to images and waveforms. Nevertheless, the medical information technology community needs models that work as bridges between the DICOM relational model and open object-oriented technologies. The authors assert that representations of the DICOM Structured Reporting standard, using object-oriented modeling languages such as the Unified Modeling Language, can provide a high-level reference view of the semantically rich framework of DICOM and its complex structures. They have produced an object-oriented model to represent the DICOM SR standard and have derived XML-exchangeable representations of this model using World Wide Web Consortium specifications. They expect the model to benefit developers and system architects who are interested in developing applications that are compliant with the DICOM SR specification. PMID:11751804
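
    A DICOM SR document is essentially a tree of typed content items (containers, text, numeric measurements, and so on). The sketch below serializes a toy SR-style tree to XML; the tag and attribute names are invented for illustration and are not the paper's UML mapping or the official DICOM schema.

    ```python
    import xml.etree.ElementTree as ET

    # Minimal sketch of an SR-style content tree serialized to XML. Tag and
    # attribute names are hypothetical, not the official DICOM SR encoding.
    def content_item(value_type, concept, value=None):
        el = ET.Element("ContentItem", ValueType=value_type, ConceptName=concept)
        if value is not None:
            el.set("Value", str(value))
        return el

    root = content_item("CONTAINER", "Chest X-Ray Report")
    finding = content_item("CONTAINER", "Findings")
    finding.append(content_item("TEXT", "Finding", "No acute cardiopulmonary process"))
    finding.append(content_item("NUM", "Cardiothoracic Ratio", 0.45))
    root.append(finding)

    print(ET.tostring(root, encoding="unicode"))
    ```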

  13. A comparison of linear and nonlinear statistical techniques in performance attribution.

    PubMed

    Chan, N H; Genovese, C R

    2001-01-01

    Performance attribution is usually conducted under the linear framework of multifactor models. Although commonly used by practitioners in finance, linear multifactor models are known to be less than satisfactory in many situations. After a brief survey of nonlinear methods, nonlinear statistical techniques are applied to performance attribution of a portfolio constructed from a fixed universe of stocks, using factors derived from some commonly used cross-sectional linear multifactor models. By rebalancing this portfolio monthly, the cumulative returns for procedures based on the standard linear multifactor model and on three nonlinear techniques (model selection, additive models, and neural networks) are calculated and compared. It is found that the first two nonlinear techniques, especially in combination, outperform the standard linear model. The results in the neural-network case are inconclusive because of the great variety of possible models. Although these methods are more complicated and may require some tuning, toolboxes are developed and suggestions on calibration are proposed. This paper demonstrates the usefulness of modern nonlinear statistical techniques in performance attribution.
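
    The comparison can be reproduced in miniature on synthetic data: fit a linear factor model and a nonlinear learner to returns that depend nonlinearly on factor exposures, then compare out-of-sample fit. Here a gradient-boosted regressor stands in for the paper's additive-model and model-selection techniques, and the data are simulated.

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.ensemble import GradientBoostingRegressor
    from sklearn.model_selection import train_test_split

    # Synthetic illustration (not the paper's data): returns depend
    # nonlinearly on two factor exposures.
    rng = np.random.default_rng(1)
    X = rng.standard_normal((2000, 2))                  # factor exposures
    y = (0.5 * X[:, 0]
         - 0.3 * np.maximum(X[:, 1], 0) ** 2
         + 0.1 * rng.standard_normal(2000))

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    for model in (LinearRegression(), GradientBoostingRegressor(random_state=0)):
        score = model.fit(X_tr, y_tr).score(X_te, y_te)  # out-of-sample R^2
        print(type(model).__name__, round(score, 3))
    ```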

  14. MPEG-21 in broadcasting: the novel digital broadcast item model

    NASA Astrophysics Data System (ADS)

    Lugmayr, Artur R.; Touimi, Abdellatif B.; Kaneko, Itaru; Kim, Jong-Nam; Alberti, Claudio; Yona, Sadigurschi; Kim, Jaejoon; Andrade, Maria Teresa; Kalli, Seppo

    2004-05-01

    The MPEG experts are currently developing the MPEG-21 set of standards, which includes a framework and specifications for digital rights management (DRM), delivery of quality of service (QoS) over heterogeneous networks and terminals, packaging of multimedia content, and other elements essential to the infrastructure of multimedia content distribution. Considerable research effort is being applied to these new developments, and the capabilities of MPEG-21 technologies to address specific application areas are being investigated. One such application area is broadcasting, in particular the development of digital TV and its services. In more practical terms, digital TV addresses networking, events, channels, services, programs, signaling, encoding, bandwidth, conditional access, subscription, advertisements, and interactivity. MPEG-21 provides an excellent framework of standards to be applied in digital TV applications. Within the scope of this research work we describe a new model based on MPEG-21 and its relevance to digital TV: the digital broadcast item model (DBIM). The goal of the DBIM is to elaborate the potential of MPEG-21 for digital TV applications. Within this paper we focus on a general description of the DBIM, quality of service (QoS) management and metadata filtering, and digital rights management, and also present use cases and scenarios where the DBIM's role is explored in detail.

  15. Understanding the CCA Standard Through Decaf

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kumfert, G

    2003-04-17

    This document is a tutorial on the CCA Standard as realized through the Decaf implementation. Decaf is not identical to the CCA standard, much as Microsoft Visual C++ is not ANSI/ISO C++. This document was created because the CCA standard is evolving and still too fluid to nail down in a tutorial document. Because of this fluidity, and because the standard represents a hotbed of research and development, beginners can only start learning CCA by choosing one of the frameworks (warts and all). Decaf has just enough functionality to be a useful tool for beginners in the CCA to get started on. Though it lacks many features of the bigger CCA frameworks (CCAFE [3], XCAT [10], and SciRUN [8]), where the heavy-duty research is still going on, it is the first CCA framework that is underpinned by Babel, which provides its language interoperability features. This document can also serve the dual purpose of providing a reasonable-sized example of building an application using Babel. The entire source for Decaf is included in the examples/ subdirectory of the Babel code distribution. This manual assumes the reader is a programmer who has a conceptual understanding of the Babel Language Interoperability Tool and is proficient in two or more of the following languages: Fortran77, C, C++, Java, or Python. Furthermore, this manual assumes the reader is familiar with the SPMD programming model that pervades the scientific computing community. Knowledge of and experience with MPI programming is helpful, but not strictly required.

  16. Fitting identity in the reasoned action framework: A meta-analysis and model comparison.

    PubMed

    Paquin, Ryan S; Keating, David M

    2017-01-01

    Several competing models have been put forth regarding the role of identity in the reasoned action framework. The standard model proposes that identity is a background variable. Under a typical augmented model, identity is treated as an additional direct predictor of intention and behavior. Alternatively, it has been proposed that identity measures are inadvertent indicators of an underlying intention factor (e.g., a manifest-intention model). In order to test these competing hypotheses, we used data from 73 independent studies (total N = 23,917) to conduct a series of meta-analytic structural equation models. We also tested for moderation effects based on whether there was a match between identity constructs and the target behaviors examined (e.g., if the study examined a "smoker identity" and "smoking behavior," there would be a match; if the study examined a "health conscious identity" and "smoking behavior," there would not be a match). Average effects among primary reasoned action variables were all substantial, rs = .37-.69. Results gave evidence for the manifest-intention model over the other explanations, and a moderation effect by identity-behavior matching.

  17. Phenomenological implications of an alternative Hamiltonian constraint for quantum cosmology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kagan, Mikhail

    2005-11-15

    In this paper we review a model based on loop quantum cosmology that arises from a symmetry reduction of the self-dual Plebanski action. In this formulation the symmetry reduction leads to a very simple Hamiltonian constraint that can be quantized explicitly in the framework of loop quantum cosmology. We investigate the phenomenological implications of this model in the semiclassical regime and compare them with the known results of standard loop quantum cosmology.

  18. A Note on the Bogdanov-Takens Bifurcation in the Romer Model with Learning by Doing

    NASA Astrophysics Data System (ADS)

    Bella, Giovanni

    This paper is aimed at describing the whole set of necessary and sufficient conditions for the emergence of multiple equilibria and global indeterminacy in the standard endogenous growth framework with learning by doing. The novelty of this paper lies in the application of the original Bogdanov-Takens bifurcation theorem, which allows us to characterize the full dynamics of the model and determine the emergence of an unavoidable poverty trap.
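
    For reference, one common normal form of the Bogdanov-Takens bifurcation (Kuznetsov's convention) is given below; the unfolding parameters β₁, β₂ are the quantities onto which the model's structural parameters would be mapped.

    ```latex
    % Standard Bogdanov-Takens normal form (s = ±1), in one common
    % convention; \beta_1, \beta_2 are the unfolding parameters.
    \dot{x} = y, \qquad
    \dot{y} = \beta_1 + \beta_2\, x + x^2 + s\, x y
    ```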

  19. Higgs Discovery: Impact on Composite Dynamics Technicolor & eXtreme Compositeness Thinking Fast and Slow

    NASA Astrophysics Data System (ADS)

    Sannino, Francesco

    I discuss the impact of the discovery of a Higgs-like state on composite dynamics, starting by critically examining the reasons in favour of either an elementary or composite nature of this state. Accepting the standard model interpretation, I re-address the standard model vacuum stability within a Weyl-consistent computation. I will carefully examine the fundamental reasons why what has been discovered might not be the standard model Higgs. Dynamical electroweak breaking naturally addresses a number of the fundamental issues unsolved by the standard model interpretation. However, this paradigm has been challenged by the discovery of a not-so-heavy Higgs-like state. I will therefore review the recent discovery that the standard model top-induced radiative corrections naturally reduce the intrinsic non-perturbative mass of the composite Higgs state towards the desired experimental value. Not only do we have a natural and testable working framework, but we have also suggested specific gauge theories that can realise, at the fundamental level, these minimal models of dynamical electroweak symmetry breaking. These strongly coupled gauge theories are now being heavily investigated via first-principle lattice simulations, with encouraging results. The new findings show that the recent naive claims that new strong dynamics at the electroweak scale are disfavoured by the discovery of a not-so-heavy composite Higgs are unwarranted. I will then introduce the more speculative idea of extreme compositeness, according to which not only the Higgs sector of the standard model is composite but also the quarks and leptons, and provide a toy example in the form of gauge-gauge duality.

  20. R-IDEAL: A Framework for Systematic Clinical Evaluation of Technical Innovations in Radiation Oncology.

    PubMed

    Verkooijen, Helena M; Kerkmeijer, Linda G W; Fuller, Clifton D; Huddart, Robbert; Faivre-Finn, Corinne; Verheij, Marcel; Mook, Stella; Sahgal, Arjun; Hall, Emma; Schultz, Chris

    2017-01-01

    The pace of innovation in radiation oncology is high and the window of opportunity for evaluation narrow. Financial incentives, industry pressure, and patients' demand for high-tech treatments have led to widespread implementation of innovations before, or even without, the generation of robust evidence of improved outcomes. The standard phase I-IV framework for drug evaluation is not the most efficient and desirable framework for assessment of technological innovations. In order to provide a standard assessment methodology for clinical evaluation of innovations in radiotherapy, we adapted the surgical IDEAL framework to fit the radiation oncology setting. Like surgery, clinical evaluation of innovations in radiation oncology is complicated by continuous technical development, team and operator dependence, and differences in quality control. Contrary to surgery, radiotherapy innovations may be used in various ways, e.g., at different tumor sites and with different aims, such as radiation volume reduction and dose escalation. Also, the effect of radiation treatment can be modeled, allowing better prediction of potential benefits and improved patient selection. Key distinctive features of R-IDEAL include the important role of predicate and modeling studies (Stage 0), randomization at an early stage in the development of the technology, and long-term follow-up for late toxicity. We implemented R-IDEAL for clinical evaluation of a recent innovation in radiation oncology, the MRI-guided linear accelerator (MR-Linac). The MR-Linac combines a radiotherapy linear accelerator with a 1.5-T MRI scanner, aiming for improved targeting, dose escalation, and margin reduction; it is expected to increase the use of hypofractionation and improve tumor control, leading to higher cure rates and less toxicity. An international consortium, with participants from seven large cancer institutes from Europe and North America, has adopted the R-IDEAL framework to work toward a coordinated, evidence-based introduction of the MR-Linac. R-IDEAL holds the promise of timely, evidence-based introduction of radiotherapy innovations with proven superior effectiveness, while preventing unnecessary exposure of patients to potentially harmful interventions.

  1. Towards a framework for developing semantic relatedness reference standards.

    PubMed

    Pakhomov, Serguei V S; Pedersen, Ted; McInnes, Bridget; Melton, Genevieve B; Ruggieri, Alexander; Chute, Christopher G

    2011-04-01

    Our objective is to develop a framework for creating reference standards for functional testing of computerized measures of semantic relatedness. Currently, research on computerized approaches to semantic relatedness between biomedical concepts relies on reference standards created for specific purposes using a variety of methods for their analysis. In most cases, these reference standards are not publicly available, and the published information provided in manuscripts that evaluate computerized semantic relatedness measurement approaches is not sufficient to reproduce the results. Our proposed framework is based on the experiences of the medical informatics and computational linguistics communities and addresses practical and theoretical issues with creating reference standards for semantic relatedness. We demonstrate the use of the framework on a pilot set of 101 medical term pairs rated for semantic relatedness by 13 medical coding experts. While the reliability of this particular reference standard is in the "moderate" range, we show that using clustering and factor analyses offers a data-driven approach to finding systematic differences among raters and identifying groups of potential outliers. We test two ontology-based measures of relatedness and provide both the reference standard containing individual ratings and the R program used to analyze the ratings as open source. Currently, these resources are intended to be used to reproduce and compare results of studies involving computerized measures of semantic relatedness. Our framework may be extended to the development of reference standards in other research areas in medical informatics, including automatic classification, information retrieval from medical records, and vocabulary/ontology development.
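
    As a toy illustration of the kind of ontology-based measure being tested, the sketch below scores relatedness as the inverse of (one plus) the shortest path length between concepts in a small is-a graph; the graph and the terms are invented.

    ```python
    from collections import deque

    # Toy is-a graph and a path-based relatedness score, 1 / (1 + shortest
    # path length), one simple stand-in for ontology-based measures.
    edges = {("myocardial_infarction", "heart_disease"),
             ("heart_disease", "cardiovascular_disease"),
             ("hypertension", "cardiovascular_disease")}
    graph = {}
    for a, b in edges:
        graph.setdefault(a, set()).add(b)
        graph.setdefault(b, set()).add(a)

    def shortest_path_len(src, dst):
        seen, queue = {src}, deque([(src, 0)])
        while queue:
            node, d = queue.popleft()
            if node == dst:
                return d
            for nxt in graph.get(node, ()):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append((nxt, d + 1))
        return None

    d = shortest_path_len("myocardial_infarction", "hypertension")
    print("relatedness:", 1.0 / (1 + d))  # d == 3 -> 0.25
    ```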

  2. Information risk and security modeling

    NASA Astrophysics Data System (ADS)

    Zivic, Predrag

    2005-03-01

    This research paper presentation will feature current frameworks for addressing risk and security modeling and metrics. The paper will analyze technical-level risk and security metrics of Common Criteria/ISO15408, Centre for Internet Security guidelines, and NSA configuration guidelines, and the metrics used at this level. The view of IT operational standards on security metrics, such as GMITS/ISO13335 and ITIL/ITMS, and architectural guidelines such as ISO7498-2 will be explained. Business process level standards such as ISO17799, COSO, and CobiT will be presented with their control approach to security metrics. At the top level, maturity standards such as SSE-CMM/ISO21827, NSA Infosec Assessment, and CobiT will be explored and reviewed. For each defined level of security metrics, the research presentation will explore the appropriate usage of these standards. The paper will discuss standards approaches to conducting risk and security metrics. The research findings will demonstrate the need for a common baseline for both risk and security metrics. This paper will show the relation between the attribute-based common baseline and corporate assets and controls for risk and security metrics. It will be shown that such an approach spans all the mentioned standards. The proposed approach's 3D visual presentation and the development of the Information Security Model will be analyzed and postulated. The presentation will clearly demonstrate the benefits of the proposed attribute-based approach and a defined risk and security space for modeling and measurement.

  3. The HTA Core Model®-10 Years of Developing an International Framework to Share Multidimensional Value Assessment.

    PubMed

    Kristensen, Finn Børlum; Lampe, Kristian; Wild, Claudia; Cerbo, Marina; Goettsch, Wim; Becla, Lidia

    2017-02-01

    The HTA Core Model® as a science-based framework for assessing dimensions of value was developed as part of the European network for Health Technology Assessment project in the period 2006 to 2008 to facilitate production and sharing of health technology assessment (HTA) information, such as evidence on efficacy and effectiveness and patient aspects, to inform decisions. It covers clinical value as well as organizational, economic, and patient aspects of technologies and has been field-tested in two consecutive joint actions in the period 2010 to 2016. A large number of HTA institutions were involved in the work. The model has undergone revisions and improvement after iterations of piloting and can be used in a local, national, or international context to produce structured HTA information that can be taken forward by users into their own frameworks to fit their specific needs when informing decisions on technology. The model has a broad scope and offers a common ground to various stakeholders by providing a standard structure and a transparent set of proposed HTA questions. It consists of three main components: 1) the HTA ontology, 2) methodological guidance, and 3) a common reporting structure. It covers domains such as effectiveness, safety, and economics, and also includes domains covering organizational, patient, social, and legal aspects. There is a full model and a focused rapid relative effectiveness assessment model, and a third joint action is to continue till 2020. The HTA Core Model is now available for everyone around the world as a framework for assessing value.

  4. DYNAMO-HIA–A Dynamic Modeling Tool for Generic Health Impact Assessments

    PubMed Central

    Lhachimi, Stefan K.; Nusselder, Wilma J.; Smit, Henriette A.; van Baal, Pieter; Baili, Paolo; Bennett, Kathleen; Fernández, Esteve; Kulik, Margarete C.; Lobstein, Tim; Pomerleau, Joceline; Mackenbach, Johan P.; Boshuizen, Hendriek C.

    2012-01-01

    Background: Currently, no standard tool is publicly available that allows researchers or policy-makers to quantify the impact of policies using epidemiological evidence within the causal framework of Health Impact Assessment (HIA). A standard tool should comply with three technical criteria (real-life population, dynamic projection, explicit risk-factor states) and three usability criteria (modest data requirements, rich model output, generally accessible) to be useful in the applied setting of HIA. With DYNAMO-HIA (Dynamic Modeling for Health Impact Assessment), we introduce such a generic software tool specifically designed to facilitate quantification in the assessment of the health impacts of policies. Methods and Results: DYNAMO-HIA quantifies the impact of user-specified risk-factor changes on multiple diseases and in turn on overall population health, comparing one reference scenario with one or more intervention scenarios. The Markov-based modeling approach allows for explicit risk-factor states and simulation of a real-life population. A built-in parameter estimation module ensures that only standard population-level epidemiological evidence is required, i.e. data on incidence, prevalence, relative risks, and mortality. DYNAMO-HIA provides a rich output of summary measures – e.g. life expectancy and disease-free life expectancy – and detailed data – e.g. prevalences and mortality/survival rates – by age, sex, and risk-factor status over time. DYNAMO-HIA is controlled via a graphical user interface and is publicly available from the internet, ensuring general accessibility. We illustrate the use of DYNAMO-HIA with two example applications: a policy causing an overall increase in alcohol consumption and quantifying the disease-burden of smoking. Conclusion: By combining modest data needs with general accessibility and user friendliness within the causal framework of HIA, DYNAMO-HIA is a potential standard tool for health impact assessment based on epidemiologic evidence. PMID:22590491
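
    The Markov-based projection at the core of such tools can be sketched compactly: a transition matrix moves a cohort between risk-factor states each year, and state-specific incidence turns state occupancy into expected new cases. All numbers below are invented; this is not DYNAMO-HIA's data or code.

    ```python
    import numpy as np

    # Toy Markov projection in the spirit of a risk-factor HIA model:
    # states = {never, current, former smoker}; annual transitions and a
    # state-dependent annual disease incidence (all numbers illustrative).
    P = np.array([[0.97, 0.03, 0.00],   # never  -> current (uptake)
                  [0.00, 0.92, 0.08],   # current -> former (quitting)
                  [0.00, 0.02, 0.98]])  # former -> current (relapse)
    incidence = np.array([0.002, 0.010, 0.004])  # annual disease risk per state

    pop = np.array([600_000.0, 250_000.0, 150_000.0])  # cohort by state
    for year in range(1, 6):
        cases = pop @ incidence        # expected new cases this year
        pop = pop @ P                  # advance the cohort one year
        print(f"year {year}: expected new cases = {cases:,.0f}")
    ```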

  5. Computable visually observed phenotype ontological framework for plants

    PubMed Central

    2011-01-01

    Background: The ability to search for and precisely compare similar phenotypic appearances within and across species has vast potential in plant science and genetic research. The difficulty in doing so lies in the fact that many visual phenotypic data, especially visually observed phenotypes that often times cannot be directly measured quantitatively, are in the form of text annotations, and these descriptions are plagued by semantic ambiguity, heterogeneity, and low granularity. Though several bio-ontologies have been developed to standardize phenotypic (and genotypic) information and permit comparisons across species, these semantic issues persist and prevent precise analysis and retrieval of information. A framework suitable for the modeling and analysis of precise computable representations of such phenotypic appearances is needed. Results: We have developed a new framework called the Computable Visually Observed Phenotype Ontological Framework for plants. This work provides a novel quantitative view of descriptions of plant phenotypes that leverages existing bio-ontologies and utilizes a computational approach to capture and represent domain knowledge in a machine-interpretable form. This is accomplished by means of a robust and accurate semantic mapping module that automatically maps high-level semantics to low-level measurements computed from phenotype imagery. The framework was applied to two different plant species with semantic rules mined and an ontology constructed. Rule quality was evaluated and showed high quality rules for most semantics. This framework also facilitates automatic annotation of phenotype images and can be adopted by different plant communities to aid in their research. Conclusions: The Computable Visually Observed Phenotype Ontological Framework for plants has been developed for more efficient and accurate management of visually observed phenotypes, which play a significant role in plant genomics research. The uniqueness of this framework is its ability to bridge the knowledge of informaticians and plant science researchers by translating descriptions of visually observed phenotypes into standardized, machine-understandable representations, thus enabling the development of advanced information retrieval and phenotype annotation analysis tools for the plant science community. PMID:21702966

  6. Quantifying and reducing model-form uncertainties in Reynolds-averaged Navier-Stokes simulations: A data-driven, physics-informed Bayesian approach

    NASA Astrophysics Data System (ADS)

    Xiao, H.; Wu, J.-L.; Wang, J.-X.; Sun, R.; Roy, C. J.

    2016-11-01

    Despite their well-known limitations, Reynolds-Averaged Navier-Stokes (RANS) models are still the workhorse tools for turbulent flow simulations in today's engineering analysis, design and optimization. While the predictive capability of RANS models depends on many factors, for many practical flows the turbulence models are by far the largest source of uncertainty. As RANS models are used in the design and safety evaluation of many mission-critical systems such as airplanes and nuclear power plants, quantifying their model-form uncertainties has significant implications in enabling risk-informed decision-making. In this work we develop a data-driven, physics-informed Bayesian framework for quantifying model-form uncertainties in RANS simulations. Uncertainties are introduced directly to the Reynolds stresses and are represented with compact parameterization accounting for empirical prior knowledge and physical constraints (e.g., realizability, smoothness, and symmetry). An iterative ensemble Kalman method is used to assimilate the prior knowledge and observation data in a Bayesian framework, and to propagate them to posterior distributions of velocities and other Quantities of Interest (QoIs). We use two representative cases, the flow over periodic hills and the flow in a square duct, to evaluate the performance of the proposed framework. Both cases are challenging for standard RANS turbulence models. Simulation results suggest that, even with very sparse observations, the obtained posterior mean velocities and other QoIs have significantly better agreement with the benchmark data compared to the baseline results. At most locations the posterior distribution adequately captures the true model error within the developed model form uncertainty bounds. The framework is a major improvement over existing black-box, physics-neutral methods for model-form uncertainty quantification, where prior knowledge and details of the models are not exploited. This approach has potential implications in many fields in which the governing equations are well understood but the model uncertainty comes from unresolved physical processes.
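
    The assimilation step described above follows the usual ensemble Kalman pattern: push a prior ensemble through the forward model, compare with (perturbed) observations, and update via a gain built from ensemble covariances. The toy below applies that pattern to a single scalar parameter; it is a schematic stand-in under invented numbers, not the paper's Reynolds-stress implementation.

    ```python
    import numpy as np

    # One iterative ensemble Kalman assimilation loop for a toy model
    # y = h(theta): prior ensemble + sparse data -> posterior ensemble.
    rng = np.random.default_rng(2)

    def h(theta):                       # toy forward model
        return np.array([theta, theta**2])

    truth, obs_std = 1.3, 0.05
    y_obs = h(truth) + rng.normal(0, obs_std, 2)

    theta = rng.normal(0.5, 0.5, 100)                 # prior ensemble
    for _ in range(5):                                # iterative assimilation
        Y = np.stack([h(t) for t in theta])           # predicted observations
        C_ty = np.cov(theta, Y.T)[0, 1:]              # cross-cov of theta and Y
        C_yy = np.cov(Y.T) + obs_std**2 * np.eye(2)   # innovation covariance
        K = C_ty @ np.linalg.inv(C_yy)                # Kalman gain (per obs)
        perturbed = y_obs + rng.normal(0, obs_std, (100, 2))
        theta = theta + (perturbed - Y) @ K           # analysis update
    print(f"posterior mean {theta.mean():.3f} (truth {truth})")
    ```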

  7. Geometrothermodynamic model for the evolution of the Universe

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gruber, Christine; Quevedo, Hernando, E-mail: christine.gruber@correo.nucleares.unam.mx, E-mail: quevedo@nucleares.unam.mx

    Using the formalism of geometrothermodynamics to derive a fundamental thermodynamic equation, we construct a cosmological model in the framework of relativistic cosmology. In a first step, we describe a system without thermodynamic interaction, and show it to be equivalent to the standard ΛCDM paradigm. The second step includes thermodynamic interaction and produces a model consistent with the main features of inflation. With the proposed fundamental equation we are thus able to describe all the known epochs in the evolution of our Universe, starting from the inflationary phase.

  8. Model-based engineering for medical-device software.

    PubMed

    Ray, Arnab; Jetley, Raoul; Jones, Paul L; Zhang, Yi

    2010-01-01

    This paper demonstrates the benefits of adopting model-based design techniques for engineering medical device software. By using a patient-controlled analgesic (PCA) infusion pump as a candidate medical device, the authors show how using models to capture design information allows for (i) fast and efficient construction of executable device prototypes, (ii) creation of a standard, reusable baseline software architecture for a particular device family, (iii) formal verification of the design against safety requirements, and (iv) creation of a safety framework that reduces verification costs for future versions of the device software.
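
    Benefit (iii), checking a design model against a safety requirement, can be illustrated with a toy pump state machine and a brute-force exploration of all bounded event traces. The model, the lockout rule, and the property below are invented for illustration and are far simpler than a real PCA pump model.

    ```python
    from itertools import product

    # Toy state-machine model of a PCA pump (invented, not the paper's
    # model): state = (infusing, lockout_timer); events: button press, tick.
    # Safety property: no bolus starts while the lockout timer is active.
    LOCKOUT = 3  # ticks of lockout after each bolus (illustrative)

    def step(state, event):
        infusing, timer = state
        if event == "press":
            if timer == 0:
                return (True, LOCKOUT), "bolus"
            return (infusing, timer), "ignored"
        # event == "tick": bolus ends, lockout counts down
        return (False, max(0, timer - 1)), "tick"

    def check_all_traces(depth=8):
        for events in product(["press", "tick"], repeat=depth):
            state = (False, 0)
            for ev in events:
                prev_timer = state[1]
                state, action = step(state, ev)
                if action == "bolus" and prev_timer > 0:
                    return f"UNSAFE trace: {events}"
        return "safe: no bolus during lockout in any bounded trace"

    print(check_all_traces())
    ```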

  9. BF actions for the Husain-Kuchař model

    NASA Astrophysics Data System (ADS)

    Barbero G., J. Fernando; Villaseñor, Eduardo J.

    2001-04-01

    We show that the Husain-Kuchař model can be described in the framework of BF theories. This is a first step towards its quantization by standard perturbative quantum field theory techniques or the spin-foam formalism introduced in the space-time description of general relativity and other diff-invariant theories. The actions that we will consider are similar to the ones describing the BF-Yang-Mills model and some mass generating mechanisms for gauge fields. We will also discuss the role of diffeomorphisms in the new formulations that we propose.

  10. Constraining the top-Higgs sector of the standard model effective field theory

    NASA Astrophysics Data System (ADS)

    Cirigliano, V.; Dekens, W.; de Vries, J.; Mereghetti, E.

    2016-08-01

    Working in the framework of the Standard Model effective field theory, we study chirality-flipping couplings of the top quark to Higgs and gauge bosons. We discuss in detail the renormalization-group evolution to lower energies and investigate direct and indirect contributions to high- and low-energy CP-conserving and CP-violating observables. Our analysis includes constraints from collider observables, precision electroweak tests, flavor physics, and electric dipole moments. We find that indirect probes are competitive or dominant for both CP-even and CP-odd observables, even after accounting for uncertainties associated with hadronic and nuclear matrix elements, illustrating the importance of including operator mixing in constraining the Standard Model effective field theory. We also study scenarios where multiple anomalous top couplings are generated at the high scale, showing that while the bounds on individual couplings relax, strong correlations among couplings survive. Finally, we find that enforcing minimal flavor violation does not significantly affect the bounds on the top couplings.

  11. Feature-based component model for design of embedded systems

    NASA Astrophysics Data System (ADS)

    Zha, Xuan Fang; Sriram, Ram D.

    2004-11-01

    An embedded system is a hybrid of hardware and software that combines software's flexibility with hardware's real-time performance. Embedded systems can be considered assemblies of hardware and software components. An Open Embedded System Model (OESM) is currently being developed at NIST to provide a standard representation and exchange protocol for embedded systems and system-level design, simulation, and testing information. This paper proposes an approach to representing an embedded system feature-based model in OESM, i.e., an Open Embedded System Feature Model (OESFM), addressing models of embedded system artifacts, embedded system components, embedded system features, and embedded system configuration/assembly. The approach provides an object-oriented UML (Unified Modeling Language) representation for the embedded system feature model and defines an extension to the NIST Core Product Model. The model provides a feature-based component framework allowing the designer to develop a virtual embedded system prototype by assembling virtual components. The framework not only provides a formal, precise model of the embedded system prototype but also offers the possibility of designing variations of the prototype whose members are derived by changing certain virtual components with different features. A case study example is discussed to illustrate the embedded system model.
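
    The feature-based component idea maps naturally onto a small object model: systems aggregate components, components carry typed features, and assembly-level queries walk the structure. The sketch below uses invented class and field names; it is not the OESM/OESFM schema or the NIST Core Product Model.

    ```python
    from dataclasses import dataclass, field

    # Illustrative object model of feature-based composition (names
    # invented; not the official OESM/OESFM representation).
    @dataclass
    class Feature:
        name: str
        kind: str            # e.g. "hardware", "software", "interface"
        attributes: dict = field(default_factory=dict)

    @dataclass
    class Component:
        name: str
        features: list = field(default_factory=list)

    @dataclass
    class EmbeddedSystem:
        name: str
        components: list = field(default_factory=list)

        def features_of_kind(self, kind):
            return [f.name for c in self.components
                    for f in c.features if f.kind == kind]

    pump = EmbeddedSystem("controller", [
        Component("mcu", [Feature("uart", "interface", {"baud": 115200}),
                          Feature("rtos", "software")]),
        Component("sensor", [Feature("i2c", "interface")]),
    ])
    print(pump.features_of_kind("interface"))  # ['uart', 'i2c']
    ```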

  12. Generalized Symbolic Execution for Model Checking and Testing

    NASA Technical Reports Server (NTRS)

    Khurshid, Sarfraz; Pasareanu, Corina; Visser, Willem; Kofmeyer, David (Technical Monitor)

    2003-01-01

    Modern software systems, which are often concurrent and manipulate complex data structures, must be extremely reliable. We present a novel framework based on symbolic execution for the automated checking of such systems. We provide a two-fold generalization of traditional symbolic execution based approaches: one, we define a program instrumentation, which enables standard model checkers to perform symbolic execution; two, we give a novel symbolic execution algorithm that handles dynamically allocated structures (e.g., lists and trees), method preconditions (e.g., acyclicity of lists), data (e.g., integers and strings) and concurrency. The program instrumentation enables a model checker to automatically explore program heap configurations (using a systematic treatment of aliasing) and manipulate logical formulae on program data values (using a decision procedure). We illustrate two applications of our framework: checking correctness of multi-threaded programs that take inputs from unbounded domains with complex structure, and generation of non-isomorphic test inputs that satisfy a testing criterion. Our implementation for Java uses the Java PathFinder model checker.
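
    A minimal sketch of the instrumentation idea (illustrative only, not the actual Java PathFinder implementation): concrete values are replaced by symbolic expression objects, so exploring the program builds a path condition at each branch instead of computing a single concrete result.

      # Toy symbolic values: operations build expression strings, and branching
      # on a symbolic condition extends a path condition that a model checker
      # would use to explore both outcomes systematically.
      class Sym:
          def __init__(self, expr): self.expr = expr
          def __add__(self, other):
              return Sym(f"({self.expr} + {getattr(other, 'expr', other)})")
          def gt(self, other):
              return f"({self.expr} > {getattr(other, 'expr', other)})"

      def instrumented_inc_if_positive(x: Sym, path_condition: list):
          cond = x.gt(0)
          path_condition.append(cond)   # follow the 'true' branch here;
          return x + 1                  # a checker also explores 'not cond'

      pc = []
      result = instrumented_inc_if_positive(Sym("X"), pc)
      print(result.expr, "under path condition", pc)   # (X + 1) under ['(X > 0)']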

  13. Model Checking Degrees of Belief in a System of Agents

    NASA Technical Reports Server (NTRS)

    Raimondi, Franco; Primero, Giuseppe; Rungta, Neha

    2014-01-01

    Reasoning about degrees of belief has been investigated in the past by a number of authors and has many practical applications in real life. In this paper we present a unified framework to model and verify degrees of belief in a system of agents. In particular, we describe an extension of the temporal-epistemic logic CTLK and we introduce a semantics based on interpreted systems for this extension. In this way, degrees of belief do not need to be provided externally, but can be derived automatically from the possible executions of the system, thereby providing a computationally grounded formalism. We leverage the semantics to (a) construct a model checking algorithm, (b) investigate its complexity, (c) provide a Java implementation of the model checking algorithm, and (d) evaluate our approach using the standard benchmark of the dining cryptographers. Finally, we provide a detailed case study: using our framework and our implementation, we assess and verify the situational awareness of the pilot of Air France 447 flying in off-nominal conditions.
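
    An illustrative sketch of one computationally grounded reading of a degree of belief (hypothetical, simplified from the paper's CTLK extension): among the global states an agent cannot distinguish from the current one, take the weighted share that satisfy a formula.

      # Degree of belief as the weighted fraction of epistemically
      # indistinguishable states satisfying a property.
      def degree_of_belief(states, weight, indistinguishable, satisfies, current):
          cell = [s for s in states if indistinguishable(current, s)]
          total = sum(weight(s) for s in cell)
          believed = sum(weight(s) for s in cell if satisfies(s))
          return believed / total if total else 0.0

      # Toy run: three states the agent cannot tell apart, uniform weights,
      # two of which satisfy the formula -> degree of belief 2/3.
      states = ["s0", "s1", "s2"]
      print(degree_of_belief(states, lambda s: 1.0,
                             lambda a, b: True, lambda s: s != "s2", "s0"))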

  14. Towards a Framework for Evaluating and Comparing Diagnosis Algorithms

    NASA Technical Reports Server (NTRS)

    Kurtoglu, Tolga; Narasimhan, Sriram; Poll, Scott; Garcia,David; Kuhn, Lukas; deKleer, Johan; vanGemund, Arjan; Feldman, Alexander

    2009-01-01

    Diagnostic inference involves the detection of anomalous system behavior and the identification of its cause, possibly down to a failed unit or to a parameter of a failed unit. Traditional approaches to solving this problem include expert/rule-based, model-based, and data-driven methods. Each approach (and various techniques within each approach) uses different representations of the knowledge required to perform the diagnosis. The sensor data is expected to be combined with these internal representations to produce the diagnosis result. In spite of the availability of various diagnosis technologies, there have been only minimal efforts to develop a standardized software framework to run, evaluate, and compare different diagnosis technologies on the same system. This paper presents a framework that defines a standardized representation of the system knowledge, the sensor data, and the form of the diagnosis results and provides a run-time architecture that can execute diagnosis algorithms, send sensor data to the algorithms at appropriate time steps from a variety of sources (including the actual physical system), and collect resulting diagnoses. We also define a set of metrics that can be used to evaluate and compare the performance of the algorithms, and provide software to calculate the metrics.
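
    A minimal sketch of what such a standardized run-time interface and one metric could look like (illustrative names only, not the actual framework's API): every algorithm consumes time-stamped sensor data in a common format and emits diagnoses that a harness can score uniformly.

      from abc import ABC, abstractmethod

      class DiagnosisAlgorithm(ABC):
          @abstractmethod
          def observe(self, t: float, sensors: dict):
              """Consume one time step of standardized sensor data."""
          @abstractmethod
          def diagnose(self) -> set:
              """Return the set of components currently believed faulty."""

      def detection_latency(fault_time: float, diagnoses: list):
          # diagnoses: list of (t, fault_set) pairs emitted by an algorithm.
          # Latency of the first non-empty diagnosis at or after the true
          # fault injection time, a typical harness-level metric.
          for t, faults in diagnoses:
              if faults and t >= fault_time:
                  return t - fault_time
          return None   # fault never detected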

  15. Providing comprehensive and consistent access to astronomical observatory archive data: the NASA archive model

    NASA Astrophysics Data System (ADS)

    McGlynn, Thomas; Fabbiano, Giuseppina; Accomazzi, Alberto; Smale, Alan; White, Richard L.; Donaldson, Thomas; Aloisi, Alessandra; Dower, Theresa; Mazzerella, Joseph M.; Ebert, Rick; Pevunova, Olga; Imel, David; Berriman, Graham B.; Teplitz, Harry I.; Groom, Steve L.; Desai, Vandana R.; Landry, Walter

    2016-07-01

    Since the turn of the millennium, a constant concern of astronomical archives has been providing data to the public through standardized protocols, unifying data from disparate physical sources and wavebands across the electromagnetic spectrum into an astronomical virtual observatory (VO). In October 2014, NASA began support for the NASA Astronomical Virtual Observatories (NAVO) program to coordinate the efforts of NASA astronomy archives in providing data to users through implementation of protocols agreed within the International Virtual Observatory Alliance (IVOA). A major goal of the NAVO collaboration has been to step back from a piecemeal implementation of IVOA standards and define what the appropriate presence for the US and NASA astronomy archives in the VO should be. This includes evaluating what optional capabilities in the standards need to be supported, the specific versions of standards that should be used, and returning feedback to the IVOA, to support modifications as needed. We discuss a standard archive model developed by the NAVO for data archive presence in the virtual observatory built upon a consistent framework of standards defined by the IVOA. Our standard model provides for discovery of resources through the VO registries, access to observation and object data, downloads of image and spectral data and general access to archival datasets. It defines specific protocol versions, minimum capabilities, and all dependencies. The model will evolve as the capabilities of the virtual observatory and needs of the community change.
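
    An illustrative sketch of how such a model can be checked mechanically (the protocol names reflect common IVOA services, but the versions and structure here are hypothetical, not NAVO's actual profile): compare an archive's declared services against the minimum capabilities and protocol versions the model requires.

      # Minimum capabilities a compliant archive must declare (hypothetical).
      REQUIRED = {
          "registry": "1.0",      # resource discovery
          "cone_search": "1.03",  # object data access
          "sia": "2.0",           # image downloads
          "ssa": "1.1",           # spectral downloads
      }

      def compliance_gaps(declared: dict) -> list:
          """Return the list of missing or under-versioned capabilities."""
          gaps = []
          for proto, min_version in REQUIRED.items():
              have = declared.get(proto)
              if have is None or (tuple(map(int, have.split("."))) <
                                  tuple(map(int, min_version.split(".")))):
                  gaps.append(proto)
          return gaps

      print(compliance_gaps({"registry": "1.0", "cone_search": "1.03",
                             "sia": "1.0"}))   # -> ['sia', 'ssa']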

  16. Providing Comprehensive and Consistent Access to Astronomical Observatory Archive Data: The NASA Archive Model

    NASA Technical Reports Server (NTRS)

    McGlynn, Thomas; Fabbiano, Giuseppina; Accomazzi, Alberto; Smale, Alan; White, Richard L.; Donaldson, Thomas; Aloisi, Alessandra; Dower, Theresa; Mazzerella, Joseph M.; Ebert, Rick; et al.

    2016-01-01

    Since the turn of the millennium, a constant concern of astronomical archives has been providing data to the public through standardized protocols, unifying data from disparate physical sources and wavebands across the electromagnetic spectrum into an astronomical virtual observatory (VO). In October 2014, NASA began support for the NASA Astronomical Virtual Observatories (NAVO) program to coordinate the efforts of NASA astronomy archives in providing data to users through implementation of protocols agreed within the International Virtual Observatory Alliance (IVOA). A major goal of the NAVO collaboration has been to step back from a piecemeal implementation of IVOA standards and define what the appropriate presence for the US and NASA astronomy archives in the VO should be. This includes evaluating what optional capabilities in the standards need to be supported, the specific versions of standards that should be used, and returning feedback to the IVOA, to support modifications as needed. We discuss a standard archive model developed by the NAVO for data archive presence in the virtual observatory built upon a consistent framework of standards defined by the IVOA. Our standard model provides for discovery of resources through the VO registries, access to observation and object data, downloads of image and spectral data and general access to archival datasets. It defines specific protocol versions, minimum capabilities, and all dependencies. The model will evolve as the capabilities of the virtual observatory and needs of the community change.

  17. Enhancing CIDOC-CRM and compatible models with the concept of multiple interpretation

    NASA Astrophysics Data System (ADS)

    Van Ruymbeke, M.; Hallot, P.; Billen, R.

    2017-08-01

    Modelling cultural heritage and archaeological objects serves management as much as research purposes. To ensure the sustainable benefit of digital data, models should take the specificities of historical and archaeological data into account. Starting from a conceptual model tailored to storing these specificities, we present, in this paper, an extended mapping to CIDOC-CRM and its compatible models. Offering an ideal framework to structure and highlight the best modelling practices, these ontologies are essentially dedicated to storing semantic data which provides information about cultural heritage objects. Based on this standard, our proposal focuses on multiple interpretation and sequential reality.
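
    An illustrative sketch of the multiple-interpretation idea (simplified, with hypothetical class names rather than actual CIDOC-CRM classes): the physical object is separated from any number of competing interpretations, each carrying its own attribution and dating claim, so none overwrites the others.

      from dataclasses import dataclass

      @dataclass
      class HeritageObject:
          identifier: str                # e.g. an excavation unit

      @dataclass
      class Interpretation:
          subject: HeritageObject
          claim: str                     # what the object is read as
          proposed_by: str               # researcher or source
          period: str                    # hypothesized dating

      wall = HeritageObject("US-1024")
      readings = [
          Interpretation(wall, "phase II rampart", "excavator A", "12th c."),
          Interpretation(wall, "cellar wall",      "archivist B", "14th c."),
      ]
      # Both readings coexist against a single physical record of the object.
      print([r.claim for r in readings])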

  18. Defining a Communications Satellite Policy System for the 21st Century: A Model for an International Legal Framework and a New "Code of Conduct"

    NASA Astrophysics Data System (ADS)

    Pelton, Joseph N.

    1996-02-01

    This paper addresses the changing international communications environment and explores the key elements of a new policy framework for the 21st Century. It addresses the issues related to changing markets, trade considerations, standards, regulatory changes and international institutions and law. The most important aspects will relate to new international policy and regulatory frameworks and in particular to a new international code of ethics and behavior in the field of satellite communications. A new communications satellite policy framework requires systematically addressing the following points: • Multi-lateral agreements at the nation state and the operating entity level • Systematic means to access both private and public capital • Meshing ITU regulations with regional and national policy guidelines, including "landing rights" and national allocation procedures • A systematic approach to local partnerships • Resolving the issue of the relative standing of various satellite systems (i.e. GEO, MEO, and LEO systems) • Resolving the rights, duties, and priorities of satellite facility providers versus types of service providers. Beyond this policy framework and generalized legal infrastructure there is also another need, one that arises from both increased globalism and competitive international markets: what might quite simply be called a "code of reasonable conduct". To provide global and international communications services effectively and well in the 21st Century will require more than meeting minimum international legal requirements. A new "code of conduct" for global satellite communications will thus likely need to address: • Privacy and surveillance • Ethics of transborder data flow • Censorship and moral values • Cultural and linguistic sensitivity • Freedom of the press and respect for journalistic standards As expanding global information and telecommunications systems grow and impact every aspect of modern life, the need for new international policy and especially for suitable standards of conduct in the field of satellite communications becomes ever more apparent.

  19. Risk assessment of vector-borne diseases for public health governance.

    PubMed

    Sedda, L; Morley, D W; Braks, M A H; De Simone, L; Benz, D; Rogers, D J

    2014-12-01

    In the context of public health, risk governance (or risk analysis) is a framework for the assessment and subsequent management and/or control of the danger posed by an identified disease threat. Generic frameworks in which to carry out risk assessment have been developed by various agencies. These include monitoring, data collection, statistical analysis and dissemination. Due to the inherent complexity of disease systems, however, the generic approach must be modified for individual, disease-specific risk assessment frameworks. The analysis was based on a review of the current risk assessments of vector-borne diseases adopted by the main public health organisations (OIE, WHO, ECDC, FAO, CDC, etc.), drawing on literature, legislation and statistical assessment of the risk analysis frameworks. This review outlines the need for the development of a general public health risk assessment method for vector-borne diseases, in order to guarantee that sufficient information is gathered to apply robust models of risk assessment. Stochastic (especially spatial) methods, often in Bayesian frameworks, are now gaining prominence in standard risk assessment procedures because of their ability to accurately assess model uncertainties. Risk assessment needs to be addressed quantitatively wherever possible, and submitted with its quality assessment, in order to enable successful public health measures to be adopted. In terms of current practice, often a series of different models and analyses are applied to the same problem, with results and outcomes that are difficult to compare because of the unknown model and data uncertainties. Therefore, the risk assessment areas in need of further research are identified in this article. Copyright © 2014 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.

  20. Integrating Sediment Connectivity into Water Resources Management Through a Graph Theoretic, Stochastic Modeling Framework.

    NASA Astrophysics Data System (ADS)

    Schmitt, R. J. P.; Castelletti, A.; Bizzi, S.

    2014-12-01

    Understanding sediment transport processes at the river basin scale, their temporal spectra and spatial patterns is key to identifying and minimizing morphologic risks associated with channel adjustment processes. This work contributes a stochastic framework for modeling bed-load connectivity based on recent advances in the field (e.g., Bizzi & Lerner, 2013; Czuba & Foufoula-Georgiou, 2014). It presents river managers with novel indicators of reach-scale vulnerability to channel adjustment in large river networks with sparse hydrologic and sediment observations. The framework comprises three steps. First, based on a distributed hydrological model and remotely sensed information, the framework identifies a representative grain size class for each reach. Second, sediment residence time distributions are calculated for each reach in a Monte Carlo approach applying standard sediment transport equations driven by local hydraulic conditions. Third, a network analysis defines the up- and downstream connectivity for various travel times, resulting in characteristic up/downstream connectivity signatures for each reach. Channel vulnerability indicators quantify the imbalance between up/downstream connectivity for each travel time domain, representing process-dependent latency of morphologic response. Finally, based on the stochastic core of the model, a sensitivity analysis identifies drivers of change and major sources of uncertainty in order to target key detrimental processes and to guide effective gathering of additional data. The application, limitations and integration into a decision analytic framework are demonstrated for a major part of the Red River Basin in Northern Vietnam (179,000 km²). Here, a plethora of anthropic alterations ranging from large reservoir construction to land-use changes results in major downstream deterioration and calls for deriving concerted sediment management strategies to mitigate current and limit future morphologic alterations.
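
    A minimal sketch of the stochastic core (illustrative, with made-up numbers and a generic transport law rather than the paper's calibrated model): sample hydraulic conditions per reach, convert them to a bed-load velocity, and accumulate travel times downstream in a Monte Carlo loop to build travel-time distributions.

      import random

      reaches = [{"length_m": 2000.0}, {"length_m": 3500.0}, {"length_m": 1500.0}]

      def sample_travel_times(path, n=10000):
          times = []
          for _ in range(n):
              t = 0.0
              for reach in path:
                  # Bed-load velocity sampled from an assumed lognormal [m/s].
                  v = random.lognormvariate(-7.0, 0.8)
                  t += reach["length_m"] / v
              times.append(t / (3600 * 24 * 365))   # convert seconds to years
          return times

      tt = sorted(sample_travel_times(reaches))
      print("median downstream travel time [yr]:", tt[len(tt) // 2])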

  1. The ACRL framework for information literacy in higher education: implications for health sciences librarianship.

    PubMed

    Knapp, Maureen; Brower, Stewart

    2014-01-01

    The Association of College and Research Libraries is developing a new framework of information literacy concepts that will revise and replace the previously adopted standards. This framework consists of six threshold concepts that are more flexible than the original standards, and that work to identify both the function and the feelings behind information literacy education practices. This column outlines the new tentative framework with an eye toward its implications for health sciences libraries, and suggests ways the medical library community might work with this new document.

  2. Research governance: implications for health library and information professionals.

    PubMed

    Sen, Barbara A

    2003-03-01

    The Research Governance Framework for Health and Social Care published by the Department of Health in 2001 provides a model of best practice and a framework for research in the health and social care sector. This article reviews the Department of Health Research Governance Framework, discusses the implications of research governance for library and information professionals undertaking research in the health- and social-care sector and recommends strategies for best practice within the information profession relating to research governance. The scope of the Framework document that covers both clinical and non-clinical research is outlined. Any research involving, amongst other issues, patients, NHS staff and use or access to NHS premises may require ethics committee approval. Particular reference is made to the roles, responsibilities and professional conduct and the systems needed to support effective research practice. Issues such as these combine to encourage the development of a quality research culture which supports best practice. Questions arise regarding the training and experience of researchers, and access to the necessary information and support. The use of the Framework to guide research practice complements the quality issues within the evidence-based practice movement and supports the ongoing development of a quality research culture. Recommendations are given in relation to the document's five domains: ethics; science; information; health and safety; and finance and intellectual property. Practical recommendations are offered for incorporating research governance into research practice in ways which conform to the Framework's standards and which are particularly relevant for research practitioners in information science. Concluding comments support the use of the Research Governance Framework as a model for best practice.

  3. Parameterized post-Newtonian cosmology

    NASA Astrophysics Data System (ADS)

    Sanghai, Viraj A. A.; Clifton, Timothy

    2017-03-01

    Einstein’s theory of gravity has been extensively tested on solar system scales, and for isolated astrophysical systems, using the perturbative framework known as the parameterized post-Newtonian (PPN) formalism. This framework is designed for use in the weak-field and slow-motion limit of gravity, and can be used to constrain a large class of metric theories of gravity with data collected from the aforementioned systems. Given the potential of future surveys to probe cosmological scales to high precision, it is a topic of much contemporary interest to construct a similar framework to link Einstein’s theory of gravity and its alternatives to observations on cosmological scales. Our approach to this problem is to adapt and extend the existing PPN formalism for use in cosmology. We derive a set of equations that use the same parameters to consistently model both weak fields and cosmology. This allows us to parameterize a large class of modified theories of gravity and dark energy models on cosmological scales, using just four functions of time. These four functions can be directly linked to the background expansion of the universe, first-order cosmological perturbations, and the weak-field limit of the theory. They also reduce to the standard PPN parameters on solar system scales. We illustrate how dark energy models and scalar-tensor and vector-tensor theories of gravity fit into this framework, which we refer to as ‘parameterized post-Newtonian cosmology’ (PPNC).
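
    For orientation, the solar-system limit the authors recover is the familiar PPN weak-field metric; in its standard static form (a schematic reminder, with U the Newtonian potential, not the paper's full time-dependent construction):

      g_{00} = -1 + 2U - 2\beta U^2, \qquad g_{ij} = (1 + 2\gamma U)\,\delta_{ij},

    where \gamma and \beta are the classic PPN parameters, both equal to 1 in general relativity. The PPNC construction promotes such parameters to four functions of time that simultaneously govern the cosmological background and its first-order perturbations.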

  4. Framework for Assessing the ICT Competency in Teachers up to the Requirements of "Teacher" Occupational Standard

    ERIC Educational Resources Information Center

    Avdeeva, Svetlana; Zaichkina, Olga; Nikulicheva, Nataliya; Khapaeva, Svetlana

    2016-01-01

    The paper deals with problems of working out a test framework for the assessment of teachers' ICT competency in line with the requirements of "Teacher" occupational standard. The authors have analyzed the known approaches to assessing teachers' ICT competency--ISTE Standards and UNESCO ICT CFT and have suggested their own approach to…

  5. Measuring implementation behaviour of menu guidelines in the childcare setting: confirmatory factor analysis of a theoretical domains framework questionnaire (TDFQ).

    PubMed

    Seward, Kirsty; Wolfenden, Luke; Wiggers, John; Finch, Meghan; Wyse, Rebecca; Oldmeadow, Christopher; Presseau, Justin; Clinton-McHarg, Tara; Yoong, Sze Lin

    2017-04-04

    While there are a number of frameworks which focus on supporting the implementation of evidence based approaches, few psychometrically valid measures exist to assess constructs within these frameworks. This study aimed to develop and psychometrically assess a scale measuring each domain of the Theoretical Domains Framework for use in assessing the implementation of dietary guidelines within a non-health care setting (childcare services). A 75-item, 14-domain Theoretical Domains Framework Questionnaire (TDFQ) was developed and administered via telephone interview to 202 centre-based childcare service cooks who had a role in planning the service menu. Confirmatory factor analysis (CFA) was undertaken to assess the reliability, discriminant validity and goodness of fit of the 14-domain Theoretical Domains Framework measure. For the CFA, five iterative processes of adjustment were undertaken in which 14 items were removed, resulting in a final measure consisting of 14 domains and 61 items. For the final measure: the Chi-Square goodness of fit statistic was 3447.19; the Standardized Root Mean Square Residual (SRMR) was 0.070; the Root Mean Square Error of Approximation (RMSEA) was 0.072; and the Comparative Fit Index (CFI) had a value of 0.78. While only one of the three indices supports goodness of fit of the measurement model tested, the 14-domain model with 61 items showed good discriminant validity and internally consistent items. Future research should aim to assess the psychometric properties of the developed TDFQ in other community-based settings.
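
    For context, the fit indices quoted follow standard definitions (textbook forms given here; commonly cited cutoffs are RMSEA below roughly 0.06 and SRMR below roughly 0.08, which is consistent with only the SRMR supporting fit in this study):

      \mathrm{RMSEA} = \sqrt{\frac{\max(\chi^2 - df,\, 0)}{df\,(N - 1)}}, \qquad \mathrm{CFI} = 1 - \frac{\max(\chi^2_m - df_m,\, 0)}{\max(\chi^2_m - df_m,\ \chi^2_0 - df_0,\ 0)},

    where the subscript m denotes the tested model and 0 the null (independence) model.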

  6. A Solar Data Model for Use in Virtual Observatories

    NASA Astrophysics Data System (ADS)

    Reardon, K. P.; Bentley, R. D.; Messerotti, M.; Giordano, S.

    2004-05-01

    The creation of a virtual solar observatory relies heavily on the merging of the metadata describing different datasets into a common form, so that it can be handled in a standard way for all associated resources. In order to bring together the varied data descriptions that already exist, it is necessary to have a common framework on which all the different datasets can be represented. The definition of this framework is accomplished through a data model which attempts to provide a simplified but realistic description of the various entities that make up a data set or solar resource. We present the solar data model which has been developed as part of the European Grid of Solar Observations (EGSO) project. This model attempts to include many of the different elements in the field of solar physics, including data producers, data sets, event lists, and data providers. This global picture can then be used to focus on the particular elements required for a specific implementation. We present the different aspects of the model and describe some systems in which portions of this model have been implemented.
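
    An illustrative sketch of why such a data model helps (hypothetical entity and field names, not the actual EGSO schema): once heterogeneous resources are described with a small set of common entities, a virtual observatory can query them uniformly rather than per-archive.

      from dataclasses import dataclass

      @dataclass
      class DataProvider:
          name: str
          access_url: str

      @dataclass
      class DataSet:
          title: str
          producer: str            # observatory/instrument that made the data
          provider: DataProvider   # archive that serves it
          observable: str          # physical quantity observed
          wavelength_nm: float

      catalogue = [
          DataSet("H-alpha synoptic images", "ExampleObservatory",
                  DataProvider("ExampleArchive", "https://archive.example/query"),
                  "chromospheric intensity", 656.3),
      ]
      # A uniform query runs over model fields, not archive-specific ones:
      print([d.title for d in catalogue if abs(d.wavelength_nm - 656.3) < 1.0])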

  7. Private animal health and welfare standards in quality assurance programmes: a review and proposed framework for critical evaluation.

    PubMed

    More, S J; Hanlon, A; Marchewka, J; Boyle, L

    2017-06-24

    In recent years, 'private standards' in animal health and welfare have become increasingly common, and are often incorporated into quality assurance (QA) programmes. Here, we present an overview of the use of private animal health and welfare standards in QA programmes, and propose a generic framework to facilitate critical programme review. Private standards are being developed in direct response to consumer demand for QA, and offer an opportunity for product differentiation and a means to drive consumer choice. Nonetheless, a range of concerns have been raised, relating to the credibility of these standards, their potential as a discriminatory barrier to trade, the multiplicity of private standards that have been developed, the lack of consumer input and compliance costs. There is a need for greater scrutiny of private standards and of associated QA programmes. We propose a framework to clarify the primary programme goal(s), the measurable outputs relevant to animal health and welfare and the primary programme beneficiaries, and to determine whether the programme is effective, efficient and transparent. This paper provides a theoretical overview, noting that this framework could be used as a tool directly for programme evaluation, or as a tool to assist with programme development and review. British Veterinary Association.

  8. Exploiting salient semantic analysis for information retrieval

    NASA Astrophysics Data System (ADS)

    Luo, Jing; Meng, Bo; Quan, Changqin; Tu, Xinhui

    2016-11-01

    Recently, many Wikipedia-based methods have been proposed to improve the performance of different natural language processing (NLP) tasks, such as semantic relatedness computation, text classification and information retrieval. Among these methods, salient semantic analysis (SSA) has been proven to be an effective way to generate conceptual representations for words or documents. However, its feasibility and effectiveness in information retrieval is mostly unknown. In this paper, we study how to efficiently use SSA to improve information retrieval performance, and propose an SSA-based retrieval method under the language model framework. First, the SSA model is adopted to build conceptual representations for documents and queries. Then, these conceptual representations and the bag-of-words (BOW) representations can be used in combination to estimate the language models of queries and documents. The proposed method is evaluated on several standard Text REtrieval Conference (TREC) collections. Experimental results show that the proposed models consistently outperform the existing Wikipedia-based retrieval methods.
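
    A minimal sketch of the combination step (illustrative smoothing weight and toy distributions, not the paper's trained models): a document language model is estimated as a linear interpolation of a bag-of-words model and a concept-based (SSA-style) model, and queries are scored by log-likelihood.

      import math

      def interpolate(p_bow: dict, p_concept: dict, lam: float = 0.6):
          # Linear interpolation of two unigram language models.
          vocab = set(p_bow) | set(p_concept)
          return {w: lam * p_bow.get(w, 0.0) + (1 - lam) * p_concept.get(w, 0.0)
                  for w in vocab}

      def score(query_terms, p_doc, eps=1e-9):
          # Query log-likelihood under the document model.
          return sum(math.log(p_doc.get(w, 0.0) + eps) for w in query_terms)

      p_bow = {"solar": 0.02, "flare": 0.01}
      p_concept = {"flare": 0.03, "eruption": 0.02}   # related concepts
      p_doc = interpolate(p_bow, p_concept)
      # "eruption" now contributes even though it never occurs literally:
      print(score(["solar", "eruption"], p_doc))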

  9. Fermionic dark matter and neutrino masses in a B - L model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sánchez-Vega, B. L.; Schmitz, E. R.

    2015-09-01

    In this work we present a common framework for neutrino masses and dark matter. Specifically, we work with a local B - L extension of the standard model which has three right-handed neutrinos, n_{Ri}, and some extra scalars, Φ and φ_i, besides the standard model fields. The n_{Ri}'s have nonstandard B - L quantum numbers and thus couple to different scalars. This model has the attractive property that an almost automatic Z_2 symmetry acting only on a fermionic field, n_{R3}, is present. Taking advantage of this Z_2 symmetry, we study both the neutrino mass generation via a natural seesaw mechanism at low energy and the possibility of n_{R3} being a dark matter candidate. For this last purpose, we study its relic abundance and its compatibility with the current direct detection experiments.
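
    As a reminder of the mechanism invoked, the type-I seesaw produces light neutrino masses of the standard schematic form (the generic textbook expression, not this model's specific mass matrix):

      m_\nu \simeq -\, m_D\, M_R^{-1}\, m_D^{T}, \qquad m_D = y\, v/\sqrt{2},

    so heavy right-handed Majorana masses M_R, generated here by the vacuum expectation values of the B - L-breaking scalars, suppress the light masses, while the Z_2-odd state n_{R3} stays decoupled from this mass generation and can remain stable as the dark matter candidate.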

  10. Total Force Fitness in units part 1: military demand-resource model.

    PubMed

    Bates, Mark J; Fallesen, Jon J; Huey, Wesley S; Packard, Gary A; Ryan, Diane M; Burke, C Shawn; Smith, David G; Watola, Daniel J; Pinder, Evette D; Yosick, Todd M; Estrada, Armando X; Crepeau, Loring; Bowles, Stephen V

    2013-11-01

    The military unit is a critical center of gravity in the military's efforts to enhance resilience and the health of the force. The purpose of this article is to augment the military's Total Force Fitness (TFF) guidance with a framework of TFF in units. The framework is based on a Military Demand-Resource model that highlights the dynamic interactions across demands, resources, and outcomes. A joint team of subject-matter experts identified key variables representing unit fitness demands, resources, and outcomes. The resulting framework informs and supports leaders, support agencies, and enterprise efforts to strengthen TFF in units by (1) identifying TFF unit variables aligned with current evidence and operational practices, (2) standardizing communication about TFF in units across the Department of Defense enterprise in a variety of military organizational contexts, (3) improving current resources including evidence-based actions for leaders, (4) identifying and addressing of gaps, and (5) directing future research for enhancing TFF in units. These goals are intended to inform and enhance Service efforts to develop Service-specific TFF models, as well as provide the conceptual foundation for a follow-on article about TFF metrics for units. Reprint & Copyright © 2013 Association of Military Surgeons of the U.S.

  11. Accountability for quality of care: Monitoring all aspects of quality across a framework adapted for action.

    PubMed

    Hulton, Louise; Matthews, Zoë; Bandali, Sarah; Izge, Abubakar; Daroda, Ramatu; Stones, William

    2016-01-01

    Quality of care is essential to maternal and newborn survival. The multidimensional nature of quality of care means that frameworks are useful for capturing it. The present paper proposes an adaptation to a widely used quality of care framework for maternity services. The framework subdivides quality into two inter-related dimensions, provision and experience of care, but suggests adaptations to reflect changes in the concept of quality over the past 15 years. The application of the updated framework is presented in a case study, which uses it to measure and inform quality improvements in northern Nigeria across the reproductive, maternal, newborn, and child health continuum of care. Data from 231 sampled basic and comprehensive emergency obstetric and newborn care (BEmONC and CEmONC) facilities in six northern Nigerian states showed that only 35%-47% of facilities met minimum quality standards in infrastructure. Standards for human resources performed better, with 49%-73% reaching minimum standards. A framework like this could form the basis for a certification scheme. Certification offers a practical and concrete opportunity to drive quality standards up and reward good performance. It also offers a mechanism to strengthen accountability. Copyright © 2015 International Federation of Gynecology and Obstetrics. Published by Elsevier Ireland Ltd. All rights reserved.

  12. Payment models to support population health management.

    PubMed

    Huerta, Timothy R; Hefner, Jennifer L; McAlearney, Ann Scheck

    2014-01-01

    This paper surveys the policy-driven financial controls currently being used to drive physician change in the care of populations, offering a review of current health care payment models and discussing the impact of each on the potential success of population health management (PHM) initiatives. We present the benefits of a multi-part model, combining visit-based fee-for-service reimbursement with a monthly "care coordination payment" and a performance-based payment system. A multi-part model removes volume-based incentives and promotes efficiency. However, it is predicated on a pay-for-performance framework that requires standardized measurement. Application of this model is limited due to the current lack of standardized measurement of quality goals that are linked to payment incentives. Financial models dictated by health system payers are inextricably linked to the organization and management of health care. There is a need for better measurements and realistic targets as part of a comprehensive system of measurement assessment that focuses on practice redesign, with the goal of standardizing measurement of the structure and process of redesign. Payment reform is a necessary component of an accurate measure of the associations between practice transformation and outcomes important to both patients and society.

  13. IKOS: A Framework for Static Analysis based on Abstract Interpretation (Tool Paper)

    NASA Technical Reports Server (NTRS)

    Brat, Guillaume P.; Laserna, Jorge A.; Shi, Nija; Venet, Arnaud Jean

    2014-01-01

    The RTCA standard (DO-178C) for developing avionic software and getting certification credits includes an extension (DO-333) that describes how developers can use static analysis in certification. In this paper, we give an overview of the IKOS static analysis framework, which helps in developing static analyses that are both precise and scalable. IKOS harnesses the power of Abstract Interpretation and makes it accessible to a larger class of static analysis developers by separating concerns such as code parsing, model development, abstract domain management, results management, and analysis strategy. The benefits of the approach are demonstrated by a buffer overflow analysis applied to flight control systems.
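
    A minimal sketch of the kind of abstract domain such a framework manages (a textbook interval domain, not IKOS's actual C++ implementation): buffer overflow checks reduce to asking whether an index's interval stays within array bounds, with join and widening guaranteeing termination of the fixpoint iteration.

      import math

      class Interval:
          def __init__(self, lo, hi): self.lo, self.hi = lo, hi
          def join(self, other):
              # Least upper bound of two intervals.
              return Interval(min(self.lo, other.lo), max(self.hi, other.hi))
          def widen(self, other):
              # Jump unstable bounds to infinity to force convergence.
              return Interval(self.lo if other.lo >= self.lo else -math.inf,
                              self.hi if other.hi <= self.hi else math.inf)
          def __repr__(self): return f"[{self.lo}, {self.hi}]"

      # Abstractly analyze a counting loop: i starts at [0,0], each iteration
      # adds one; widening stabilizes the bound instead of iterating forever.
      i = Interval(0, 0)
      for _ in range(3):
          i = i.widen(i.join(Interval(i.lo + 1, i.hi + 1)))
      print(i)   # [0, inf]: flag any access a[i] whose bound is not proven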

  14. Small-scale multi-axial hybrid simulation of a shear-critical reinforced concrete frame

    NASA Astrophysics Data System (ADS)

    Sadeghian, Vahid; Kwon, Oh-Sung; Vecchio, Frank

    2017-10-01

    This study presents a numerical multi-scale simulation framework which is extended to accommodate hybrid simulation (numerical-experimental integration). The framework is enhanced with a standardized data exchange format and connected to a generalized controller interface program which facilitates communication with various types of laboratory equipment and testing configurations. A small-scale experimental program was conducted using six degree-of-freedom hydraulic testing equipment to verify the proposed framework and provide additional data for small-scale testing of shear-critical reinforced concrete structures. The specimens were tested in a multi-axial hybrid simulation manner under a reversed cyclic loading condition simulating earthquake forces. The physical models were 1/3.23-scale representations of a beam and two columns. A mixed-type modelling technique was employed to analyze the remainder of the structures. The hybrid simulation results were compared against those obtained from a large-scale test and finite element analyses. The study found that if precautions are taken in preparing model materials and if the shear-related mechanisms are accurately considered in the numerical model, small-scale hybrid simulations can adequately simulate the behaviour of shear-critical structures. Although the findings of the study are promising, additional test data are required to draw general conclusions.
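
    An illustrative sketch of one hybrid-simulation step (hypothetical interface names, not the actual controller interface program): at each integration step the numerical model issues target displacements for the physical substructure, the laboratory controller applies them, and the measured restoring forces are fed back into the time integration.

      def hybrid_step(numerical_model, controller, dt: float):
          # 1. Numerical substructure proposes displacements at interface DOFs.
          u_target = numerical_model.predict_displacements(dt)
          # 2. Standardized data exchange: command the physical specimen.
          controller.send_target(u_target)
          # 3. Read back measured restoring forces from the specimen.
          f_measured = controller.read_forces()
          # 4. Advance the numerical state using the measured forces.
          numerical_model.advance(dt, f_measured)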

  15. Neutrinos secretly converting to lighter particles to please both KATRIN and the cosmos

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Farzan, Yasaman; Hannestad, Steen, E-mail: yasaman@theory.ipm.ac.ir, E-mail: sth@phys.au.dk

    Within the framework of the Standard Model of particle physics and standard cosmology, observations of the Cosmic Microwave Background (CMB) and Baryon Acoustic Oscillations (BAO) set stringent bounds on the sum of the masses of neutrinos. If these bounds are satisfied, the upcoming KATRIN experiment, which is designed to probe neutrino mass down to ∼ 0.2 eV, will observe only a null signal. We show that the bounds can be relaxed by introducing new interactions for the massive active neutrinos, making neutrino masses in the range observable by KATRIN compatible with cosmological bounds. Within this scenario, neutrinos convert to new stable light particles by resonant production of intermediate states around a temperature of T ∼ keV in the early Universe, leading to a much less pronounced suppression of density fluctuations compared to the standard model.

  16. Study of CP-violating charge asymmetries of single muons and like-sign dimuons in $p\bar{p}$ collisions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abazov, V. M.; Abbott, B.; Acharya, B. S.

    2014-01-01

    We measure the inclusive single muon charge asymmetry and the like-sign dimuon charge asymmetry in $p\bar{p}$ collisions using the full data set of 10.4 fb$^{-1}$ collected with the D0 detector at the Fermilab Tevatron. The standard model predictions of the charge asymmetries induced by CP violation are small in magnitude compared to the current experimental precision, so non-zero measurements could indicate new sources of CP violation. The measurements differ from the standard model predictions of CP violation in these asymmetries with a significance of 3.6 standard deviations. These results are interpreted in a framework of $B$ meson mixing within the CKM formalism to measure the relative width difference $\Delta\Gamma_d/\Gamma_d$ between the mass eigenstates of the $B^0_d$ meson system, and the semileptonic charge asymmetries $a^d_{sl}$ and $a^s_{sl}$ of $B^0_d$ and $B^0_s$ mesons, respectively.
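
    For reference, the like-sign dimuon charge asymmetry has the standard generic definition (the bare form, before the background corrections applied in the analysis):

      A \equiv \frac{N^{++} - N^{--}}{N^{++} + N^{--}},

    where $N^{++}$ and $N^{--}$ count events with two positive or two negative muons; the semileptonic asymmetries $a^q_{sl}$ are defined analogously from "wrong-sign" decays of oscillated $B^0_q$ mesons.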

  17. Search for leptonic decays of W' bosons in pp collisions at $\sqrt{s} = 7$ TeV

    NASA Astrophysics Data System (ADS)

    Chatrchyan, S.; Khachatryan, V.; Sirunyan, A. M.; Tumasyan, A.; Adam, W.; Bergauer, T.; Dragicevic, M.; Erö, J.; Fabjan, C.; Friedl, M.; Frühwirth, R.; Ghete, V. M.; Hammer, J.; Hörmann, N.; Hrubec, J.; Jeitler, M.; Kiesenhofer, W.; Knünz, V.; Krammer, M.; Liko, D.; Mikulec, I.; Pernicka, M.; Rahbaran, B.; Rohringer, C.; Rohringer, H.; Schöfbeck, R.; Strauss, J.; Taurok, A.; Teischinger, F.; Wagner, P.; Waltenberger, W.; Walzel, G.; Widl, E.; Wulz, C.-E.; Mossolov, V.; Shumeiko, N.; Suarez Gonzalez, J.; Bansal, S.; Cerny, K.; Cornelis, T.; De Wolf, E. A.; Janssen, X.; Luyckx, S.; Maes, T.; Mucibello, L.; Ochesanu, S.; Roland, B.; Rougny, R.; Selvaggi, M.; Van Haevermaet, H.; Van Mechelen, P.; Van Remortel, N.; Van Spilbeeck, A.; Blekman, F.; Blyweert, S.; D'Hondt, J.; Gonzalez Suarez, R.; Kalogeropoulos, A.; Maes, M.; Olbrechts, A.; Van Doninck, W.; Van Mulders, P.; Van Onsem, G. P.; Villella, I.; Charaf, O.; Clerbaux, B.; De Lentdecker, G.; Dero, V.; Gay, A. P. R.; Hreus, T.; Léonard, A.; Marage, P. E.; Reis, T.; Thomas, L.; Vander Velde, C.; Vanlaer, P.; Adler, V.; Beernaert, K.; Cimmino, A.; Costantini, S.; Garcia, G.; Grunewald, M.; Klein, B.; Lellouch, J.; Marinov, A.; Mccartin, J.; Ocampo Rios, A. A.; Ryckbosch, D.; Strobbe, N.; Thyssen, F.; Tytgat, M.; Vanelderen, L.; Verwilligen, P.; Walsh, S.; Yazgan, E.; Zaganidis, N.; Basegmez, S.; Bruno, G.; Ceard, L.; Delaere, C.; du Pree, T.; Favart, D.; Forthomme, L.; Giammanco, A.; Hollar, J.; Lemaitre, V.; Liao, J.; Militaru, O.; Nuttens, C.; Pagano, D.; Pin, A.; Piotrzkowski, K.; Schul, N.; Beliy, N.; Caebergs, T.; Daubie, E.; Hammad, G. H.; Alves, G. A.; Correa Martins, M.; De Jesus Damiao, D.; Martins, T.; Pol, M. E.; Souza, M. H. G.; Aldá, W. L.; Carvalho, W.; Custódio, A.; Da Costa, E. M.; De Oliveira Martins, C.; Fonseca De Souza, S.; Matos Figueiredo, D.; Mundim, L.; Nogima, H.; Oguri, V.; Prado Da Silva, W. L.; Santoro, A.; Silva Do Amaral, S. M.; Soares Jorge, L.; Sznajder, A.; Anjos, T. S.; Bernardes, C. A.; Dias, F. A.; Fernandez Perez Tomei, T. R.; Gregores, E. M.; Lagana, C.; Marinho, F.; Mercadante, P. G.; Novaes, S. F.; Padula, Sandra S.; Genchev, V.; Iaydjiev, P.; Piperov, S.; Rodozov, M.; Stoykova, S.; Sultanov, G.; Tcholakov, V.; Trayanov, R.; Vutova, M.; Dimitrov, A.; Hadjiiska, R.; Kozhuharov, V.; Litov, L.; Pavlov, B.; Petkov, P.; Bian, J. G.; Chen, G. M.; Chen, H. S.; Jiang, C. H.; Liang, D.; Liang, S.; Meng, X.; Tao, J.; Wang, J.; Wang, J.; Wang, X.; Wang, Z.; Xiao, H.; Xu, M.; Zang, J.; Zhang, Z.; Asawatangtrakuldee, C.; Ban, Y.; Guo, S.; Guo, Y.; Li, W.; Liu, S.; Mao, Y.; Qian, S. J.; Teng, H.; Wang, S.; Zhu, B.; Zou, W.; Avila, C.; Gomez Moreno, B.; Osorio Oliveros, A. F.; Sanabria, J. C.; Godinovic, N.; Lelas, D.; Plestina, R.; Polic, D.; Puljak, I.; Antunovic, Z.; Dzelalija, M.; Kovac, M.; Brigljevic, V.; Duric, S.; Kadija, K.; Luetic, J.; Morovic, S.; Attikis, A.; Galanti, M.; Mavromanolakis, G.; Mousa, J.; Nicolaou, C.; Ptochos, F.; Razis, P. A.; Finger, M.; Finger, M.; Assran, Y.; Elgammal, S.; Ellithi Kamel, A.; Khalil, S.; Mahmoud, M. A.; Radi, A.; Kadastik, M.; Müntel, M.; Raidal, M.; Rebane, L.; Tiko, A.; Azzolini, V.; Eerola, P.; Fedi, G.; Voutilainen, M.; Härkönen, J.; Heikkinen, A.; Karimäki, V.; Kinnunen, R.; Kortelainen, M. 
J.; Lampén, T.; Lassila-Perini, K.; Lehti, S.; Lindén, T.; Luukka, P.; Mäenpää, T.; Peltola, T.; Tuominen, E.; Tuominiemi, J.; Tuovinen, E.; Ungaro, D.; Wendland, L.; Banzuzi, K.; Korpela, A.; Tuuva, T.; Besancon, M.; Choudhury, S.; Dejardin, M.; Denegri, D.; Fabbro, B.; Faure, J. L.; Ferri, F.; Ganjour, S.; Givernaud, A.; Gras, P.; Hamel de Monchenault, G.; Jarry, P.; Locci, E.; Malcles, J.; Millischer, L.; Nayak, A.; Rander, J.; Rosowsky, A.; Shreyber, I.; Titov, M.; Baffioni, S.; Beaudette, F.; Benhabib, L.; Bianchini, L.; Bluj, M.; Broutin, C.; Busson, P.; Charlot, C.; Daci, N.; Dahms, T.; Dobrzynski, L.; Granier de Cassagnac, R.; Haguenauer, M.; Miné, P.; Mironov, C.; Ochando, C.; Paganini, P.; Sabes, D.; Salerno, R.; Sirois, Y.; Veelken, C.; Zabi, A.; Agram, J.-L.; Andrea, J.; Bloch, D.; Bodin, D.; Brom, J.-M.; Cardaci, M.; Chabert, E. C.; Collard, C.; Conte, E.; Drouhin, F.; Ferro, C.; Fontaine, J.-C.; Gelé, D.; Goerlach, U.; Juillot, P.; Karim, M.; Le Bihan, A.-C.; Van Hove, P.; Fassi, F.; Mercier, D.; Beauceron, S.; Beaupere, N.; Bondu, O.; Boudoul, G.; Brun, H.; Chasserat, J.; Chierici, R.; Contardo, D.; Depasse, P.; El Mamouni, H.; Fay, J.; Gascon, S.; Gouzevitch, M.; Ille, B.; Kurca, T.; Lethuillier, M.; Mirabito, L.; Perries, S.; Sordini, V.; Tosi, S.; Tschudi, Y.; Verdier, P.; Viret, S.; Tsamalaidze, Z.; Anagnostou, G.; Beranek, S.; Edelhoff, M.; Feld, L.; Heracleous, N.; Hindrichs, O.; Jussen, R.; Klein, K.; Merz, J.; Ostapchuk, A.; Perieanu, A.; Raupach, F.; Sammet, J.; Schael, S.; Sprenger, D.; Weber, H.; Wittmer, B.; Zhukov, V.; Ata, M.; Caudron, J.; Dietz-Laursonn, E.; Duchardt, D.; Erdmann, M.; Güth, A.; Hebbeker, T.; Heidemann, C.; Hoepfner, K.; Klimkovich, T.; Klingebiel, D.; Kreuzer, P.; Lanske, D.; Lingemann, J.; Magass, C.; Merschmeyer, M.; Meyer, A.; Olschewski, M.; Papacz, P.; Pieta, H.; Reithler, H.; Schmitz, S. A.; Schulte, J. F.; Sonnenschein, L.; Steggemann, J.; Teyssier, D.; Thüer, S.; Weber, M.; Bontenackels, M.; Cherepanov, V.; Davids, M.; Flügge, G.; Geenen, H.; Geisler, M.; Haj Ahmad, W.; Hoehle, F.; Kargoll, B.; Kress, T.; Kuessel, Y.; Linn, A.; Nowack, A.; Perchalla, L.; Pooth, O.; Rennefeld, J.; Sauerland, P.; Stahl, A.; Aldaya Martin, M.; Behr, J.; Behrenhoff, W.; Behrens, U.; Bergholz, M.; Bethani, A.; Borras, K.; Burgmeier, A.; Cakir, A.; Calligaris, L.; Campbell, A.; Castro, E.; Costanza, F.; Dammann, D.; Eckerlin, G.; Eckstein, D.; Fischer, D.; Flucke, G.; Geiser, A.; Glushkov, I.; Habib, S.; Hauk, J.; Jung, H.; Kasemann, M.; Katsas, P.; Kleinwort, C.; Kluge, H.; Knutsson, A.; Krämer, M.; Krücker, D.; Kuznetsova, E.; Lange, W.; Lohmann, W.; Lutz, B.; Mankel, R.; Marfin, I.; Marienfeld, M.; Melzer-Pellmann, I.-A.; Meyer, A. B.; Mnich, J.; Mussgiller, A.; Naumann-Emme, S.; Olzem, J.; Perrey, H.; Petrukhin, A.; Pitzl, D.; Raspereza, A.; Ribeiro Cipriano, P. M.; Riedl, C.; Rosin, M.; Salfeld-Nebgen, J.; Schmidt, R.; Schoerner-Sadenius, T.; Sen, N.; Spiridonov, A.; Stein, M.; Walsh, R.; Wissing, C.; Autermann, C.; Blobel, V.; Bobrovskyi, S.; Draeger, J.; Enderle, H.; Erfle, J.; Gebbert, U.; Görner, M.; Hermanns, T.; Höing, R. S.; Kaschube, K.; Kaussen, G.; Kirschenmann, H.; Klanner, R.; Lange, J.; Mura, B.; Nowak, F.; Pietsch, N.; Rathjens, D.; Sander, C.; Schettler, H.; Schleper, P.; Schlieckau, E.; Schmidt, A.; Schröder, M.; Schum, T.; Seidel, M.; Stadie, H.; Steinbrück, G.; Thomsen, J.; Barth, C.; Berger, J.; Chwalek, T.; De Boer, W.; Dierlamm, A.; Feindt, M.; Guthoff, M.; Hackstein, C.; Hartmann, F.; Heinrich, M.; Held, H.; Hoffmann, K. 
H.; Honc, S.; Katkov, I.; Komaragiri, J. R.; Martschei, D.; Mueller, S.; Müller, Th.; Niegel, M.; Nürnberg, A.; Oberst, O.; Oehler, A.; Ott, J.; Peiffer, T.; Quast, G.; Rabbertz, K.; Ratnikov, F.; Ratnikova, N.; Röcker, S.; Saout, C.; Scheurer, A.; Schilling, F.-P.; Schmanau, M.; Schott, G.; Simonis, H. J.; Stober, F. M.; Troendle, D.; Ulrich, R.; Wagner-Kuhr, J.; Weiler, T.; Zeise, M.; Ziebarth, E. B.; Daskalakis, G.; Geralis, T.; Kesisoglou, S.; Kyriakis, A.; Loukas, D.; Manolakos, I.; Markou, A.; Markou, C.; Mavrommatis, C.; Ntomari, E.; Gouskos, L.; Mertzimekis, T. J.; Panagiotou, A.; Saoulidou, N.; Evangelou, I.; Foudas, C.; Kokkas, P.; Manthos, N.; Papadopoulos, I.; Patras, V.; Bencze, G.; Hajdu, C.; Hidas, P.; Horvath, D.; Krajczar, K.; Radics, B.; Sikler, F.; Veszpremi, V.; Vesztergombi, G.; Beni, N.; Czellar, S.; Molnar, J.; Palinkas, J.; Szillasi, Z.; Karancsi, J.; Raics, P.; Trocsanyi, Z. L.; Ujvari, B.; Beri, S. B.; Bhatnagar, V.; Dhingra, N.; Gupta, R.; Jindal, M.; Kaur, M.; Kohli, J. M.; Mehta, M. Z.; Nishu, N.; Saini, L. K.; Sharma, A.; Singh, J.; Singh, S. P.; Ahuja, S.; Bhardwaj, A.; Choudhary, B. C.; Kumar, A.; Kumar, A.; Malhotra, S.; Naimuddin, M.; Ranjan, K.; Sharma, V.; Shivpuri, R. K.; Banerjee, S.; Bhattacharya, S.; Dutta, S.; Gomber, B.; Jain, Sa.; Jain, Sh.; Khurana, R.; Sarkar, S.; Abdulsalam, A.; Choudhury, R. K.; Dutta, D.; Kailas, S.; Kumar, V.; Mohanty, A. K.; Pant, L. M.; Shukla, P.; Aziz, T.; Ganguly, S.; Guchait, M.; Gurtu, A.; Maity, M.; Majumder, G.; Mazumdar, K.; Mohanty, G. B.; Parida, B.; Sudhakar, K.; Wickramage, N.; Banerjee, S.; Dugad, S.; Arfaei, H.; Bakhshiansohi, H.; Etesami, S. M.; Fahim, A.; Hashemi, M.; Hesari, H.; Jafari, A.; Khakzad, M.; Mohammadi, A.; Mohammadi Najafabadi, M.; Paktinat Mehdiabadi, S.; Safarzadeh, B.; Zeinali, M.; Abbrescia, M.; Barbone, L.; Calabria, C.; Chhibra, S. S.; Colaleo, A.; Creanza, D.; De Filippis, N.; De Palma, M.; Fiore, L.; Iaselli, G.; Lusito, L.; Maggi, G.; Maggi, M.; Marangelli, B.; My, S.; Nuzzo, S.; Pacifico, N.; Pompili, A.; Pugliese, G.; Selvaggi, G.; Silvestris, L.; Singh, G.; Zito, G.; Abbiendi, G.; Benvenuti, A. C.; Bonacorsi, D.; Braibant-Giacomelli, S.; Brigliadori, L.; Capiluppi, P.; Castro, A.; Cavallo, F. R.; Cuffiani, M.; Dallavalle, G. M.; Fabbri, F.; Fanfani, A.; Fasanella, D.; Giacomelli, P.; Grandi, C.; Guiducci, L.; Marcellini, S.; Masetti, G.; Meneghelli, M.; Montanari, A.; Navarria, F. L.; Odorici, F.; Perrotta, A.; Primavera, F.; Rossi, A. M.; Rovelli, T.; Siroli, G.; Travaglini, R.; Albergo, S.; Cappello, G.; Chiorboli, M.; Costa, S.; Potenza, R.; Tricomi, A.; Tuve, C.; Barbagli, G.; Ciulli, V.; Civinini, C.; D'Alessandro, R.; Focardi, E.; Frosali, S.; Gallo, E.; Gonzi, S.; Meschini, M.; Paoletti, S.; Sguazzoni, G.; Tropiano, A.; Benussi, L.; Bianco, S.; Colafranceschi, S.; Fabbri, F.; Piccolo, D.; Fabbricatore, P.; Musenich, R.; Benaglia, A.; De Guio, F.; Di Matteo, L.; Fiorendi, S.; Gennai, S.; Ghezzi, A.; Malvezzi, S.; Manzoni, R. A.; Martelli, A.; Massironi, A.; Menasce, D.; Moroni, L.; Paganoni, M.; Pedrini, D.; Ragazzi, S.; Redaelli, N.; Sala, S.; Tabarelli de Fatis, T.; Buontempo, S.; Carrillo Montoya, C. A.; Cavallo, N.; De Cosa, A.; Dogangun, O.; Fabozzi, F.; Iorio, A. O. M.; Lista, L.; Meola, S.; Merola, M.; Paolucci, P.; Azzi, P.; Bacchetta, N.; Bellan, P.; Bisello, D.; Branca, A.; Carlin, R.; Checchia, P.; Dorigo, T.; Dosselli, U.; Gasparini, F.; Gozzelino, A.; Kanishchev, K.; Lacaprara, S.; Lazzizzera, I.; Margoni, M.; Meneguzzo, A. 
T.; Perrozzi, L.; Pozzobon, N.; Ronchese, P.; Simonetto, F.; Torassa, E.; Tosi, M.; Vanini, S.; Zotto, P.; Zumerle, G.; Gabusi, M.; Ratti, S. P.; Riccardi, C.; Torre, P.; Vitulo, P.; Bilei, G. M.; Fanò, L.; Lariccia, P.; Lucaroni, A.; Mantovani, G.; Menichelli, M.; Nappi, A.; Romeo, F.; Saha, A.; Santocchia, A.; Taroni, S.; Azzurri, P.; Bagliesi, G.; Boccali, T.; Broccolo, G.; Castaldi, R.; D'Agnolo, R. T.; Dell'Orso, R.; Fiori, F.; Foà, L.; Giassi, A.; Kraan, A.; Ligabue, F.; Lomtadze, T.; Martini, L.; Messineo, A.; Palla, F.; Palmonari, F.; Rizzi, A.; Serban, A. T.; Spagnolo, P.; Squillacioti, P.; Tenchini, R.; Tonelli, G.; Venturi, A.; Verdini, P. G.; Barone, L.; Cavallari, F.; Del Re, D.; Diemoz, M.; Fanelli, C.; Grassi, M.; Longo, E.; Meridiani, P.; Micheli, F.; Nourbakhsh, S.; Organtini, G.; Pandolfi, F.; Paramatti, R.; Rahatlou, S.; Sigamani, M.; Soffi, L.; Amapane, N.; Arcidiacono, R.; Argiro, S.; Arneodo, M.; Biino, C.; Botta, C.; Cartiglia, N.; Castello, R.; Costa, M.; Demaria, N.; Graziano, A.; Mariotti, C.; Maselli, S.; Migliore, E.; Monaco, V.; Musich, M.; Obertino, M. M.; Pastrone, N.; Pelliccioni, M.; Potenza, A.; Romero, A.; Ruspa, M.; Sacchi, R.; Sola, V.; Solano, A.; Staiano, A.; Vilela Pereira, A.; Belforte, S.; Cossutti, F.; Della Ricca, G.; Gobbo, B.; Marone, M.; Montanino, D.; Penzo, A.; Schizzi, A.; Heo, S. G.; Kim, T. Y.; Nam, S. K.; Chang, S.; Chung, J.; Kim, D. H.; Kim, G. N.; Kong, D. J.; Park, H.; Ro, S. R.; Son, D. C.; Son, T.; Kim, J. Y.; Kim, Zero J.; Song, S.; Jo, H. Y.; Choi, S.; Gyun, D.; Hong, B.; Jo, M.; Kim, H.; Kim, T. J.; Lee, K. S.; Moon, D. H.; Park, S. K.; Seo, E.; Choi, M.; Kang, S.; Kim, H.; Kim, J. H.; Park, C.; Park, I. C.; Park, S.; Ryu, G.; Cho, Y.; Choi, Y.; Choi, Y. K.; Goh, J.; Kim, M. S.; Kwon, E.; Lee, B.; Lee, J.; Lee, S.; Seo, H.; Yu, I.; Bilinskas, M. J.; Grigelionis, I.; Janulis, M.; Juodagalvis, A.; Castilla-Valdez, H.; De La Cruz-Burelo, E.; Heredia-de La Cruz, I.; Lopez-Fernandez, R.; Magaña Villalba, R.; Martínez-Ortega, J.; Sánchez-Hernández, A.; Villasenor-Cendejas, L. M.; Carrillo Moreno, S.; Vazquez Valencia, F.; Salazar Ibarguen, H. A.; Casimiro Linares, E.; Morelos Pineda, A.; Reyes-Santos, M. A.; Krofcheck, D.; Bell, A. J.; Butler, P. H.; Doesburg, R.; Reucroft, S.; Silverwood, H.; Ahmad, M.; Asghar, M. I.; Hoorani, H. R.; Khalid, S.; Khan, W. A.; Khurshid, T.; Qazi, S.; Shah, M. A.; Shoaib, M.; Brona, G.; Bunkowski, K.; Cwiok, M.; Dominik, W.; Doroba, K.; Kalinowski, A.; Konecki, M.; Krolikowski, J.; Bialkowska, H.; Boimska, B.; Frueboes, T.; Gokieli, R.; Górski, M.; Kazana, M.; Nawrocki, K.; Romanowska-Rybinska, K.; Szleper, M.; Wrochna, G.; Zalewski, P.; Almeida, N.; Bargassa, P.; David, A.; Faccioli, P.; Ferreira Parracho, P. 
G.; Gallinaro, M.; Musella, P.; Seixas, J.; Varela, J.; Vischia, P.; Belotelov, I.; Gavrilenko, M.; Golutvin, I.; Gorbunov, I.; Kamenev, A.; Karjavin, V.; Kozlov, G.; Lanev, A.; Malakhov, A.; Moisenz, P.; Palichik, V.; Perelygin, V.; Savina, M.; Shmatov, S.; Smirnov, V.; Volodko, A.; Zarubin, A.; Evstyukhin, S.; Golovtsov, V.; Ivanov, Y.; Kim, V.; Levchenko, P.; Murzin, V.; Oreshkin, V.; Smirnov, I.; Sulimov, V.; Uvarov, L.; Vavilov, S.; Vorobyev, A.; Vorobyev, An.; Andreev, Yu.; Dermenev, A.; Gninenko, S.; Golubev, N.; Kirsanov, M.; Krasnikov, N.; Matveev, V.; Pashenkov, A.; Tlisov, D.; Toropin, A.; Epshteyn, V.; Erofeeva, M.; Gavrilov, V.; Kossov, M.; Lychkovskaya, N.; Popov, V.; Safronov, G.; Semenov, S.; Stolin, V.; Vlasov, E.; Zhokin, A.; Belyaev, A.; Boos, E.; Bunichev, V.; Dubinin, M.; Dudko, L.; Ershov, A.; Gribushin, A.; Klyukhin, V.; Kodolova, O.; Lokhtin, I.; Markina, A.; Obraztsov, S.; Perfilov, M.; Petrushanko, S.; Popov, A.; Sarycheva, L.; Savrin, V.; Andreev, V.; Azarkin, M.; Dremin, I.; Kirakosyan, M.; Leonidov, A.; Mesyats, G.; Rusakov, S. V.; Vinogradov, A.; Azhgirey, I.; Bayshev, I.; Bitioukov, S.; Grishin, V.; Kachanov, V.; Konstantinov, D.; Korablev, A.; Krychkine, V.; Petrov, V.; Ryutin, R.; Sobol, A.; Tourtchanovitch, L.; Troshin, S.; Tyurin, N.; Uzunian, A.; Volkov, A.; Adzic, P.; Djordjevic, M.; Ekmedzic, M.; Krpic, D.; Milosevic, J.; Aguilar-Benitez, M.; Alcaraz Maestre, J.; Arce, P.; Battilana, C.; Calvo, E.; Cerrada, M.; Chamizo Llatas, M.; Colino, N.; De La Cruz, B.; Delgado Peris, A.; Diez Pardos, C.; Domínguez Vázquez, D.; Fernandez Bedoya, C.; Fernández Ramos, J. P.; Ferrando, A.; Flix, J.; Fouz, M. C.; Garcia-Abia, P.; Gonzalez Lopez, O.; Goy Lopez, S.; Hernandez, J. M.; Josa, M. I.; Merino, G.; Puerta Pelayo, J.; Redondo, I.; Romero, L.; Santaolalla, J.; Soares, M. S.; Willmott, C.; Albajar, C.; Codispoti, G.; de Trocóniz, J. F.; Cuevas, J.; Fernandez Menendez, J.; Folgueras, S.; Gonzalez Caballero, I.; Lloret Iglesias, L.; Piedra Gomez, J.; Vizan Garcia, J. M.; Brochero Cifuentes, J. A.; Cabrillo, I. J.; Calderon, A.; Chuang, S. H.; Duarte Campderros, J.; Felcini, M.; Fernandez, M.; Gomez, G.; Gonzalez Sanchez, J.; Jorda, C.; Lobelle Pardo, P.; Lopez Virto, A.; Marco, J.; Marco, R.; Martinez Rivero, C.; Matorras, F.; Munoz Sanchez, F. J.; Rodrigo, T.; Rodríguez-Marrero, A. Y.; Ruiz-Jimeno, A.; Scodellaro, L.; Sobron Sanudo, M.; Vila, I.; Vilar Cortabitarte, R.; Abbaneo, D.; Auffray, E.; Auzinger, G.; Baillon, P.; Ball, A. H.; Barney, D.; Bernet, C.; Bianchi, G.; Bloch, P.; Bocci, A.; Bonato, A.; Breuker, H.; Camporesi, T.; Cerminara, G.; Christiansen, T.; Coarasa Perez, J. A.; D'Enterria, D.; De Roeck, A.; Di Guida, S.; Dobson, M.; Dupont-Sagorin, N.; Elliott-Peisert, A.; Frisch, B.; Funk, W.; Georgiou, G.; Giffels, M.; Gigi, D.; Gill, K.; Giordano, D.; Giunta, M.; Glege, F.; Gomez-Reino Garrido, R.; Govoni, P.; Gowdy, S.; Guida, R.; Hansen, M.; Harris, P.; Hartl, C.; Harvey, J.; Hegner, B.; Hinzmann, A.; Innocente, V.; Janot, P.; Kaadze, K.; Karavakis, E.; Kousouris, K.; Lecoq, P.; Lenzi, P.; Lourenço, C.; Mäki, T.; Malberti, M.; Malgeri, L.; Mannelli, M.; Masetti, L.; Meijers, F.; Mersi, S.; Meschi, E.; Moser, R.; Mozer, M. 
U.; Mulders, M.; Nesvold, E.; Nguyen, M.; Orimoto, T.; Orsini, L.; Palencia Cortezon, E.; Perez, E.; Petrilli, A.; Pfeiffer, A.; Pierini, M.; Pimiä, M.; Piparo, D.; Polese, G.; Quertenmont, L.; Racz, A.; Reece, W.; Rodrigues Antunes, J.; Rolandi, G.; Rommerskirchen, T.; Rovelli, C.; Rovere, M.; Sakulin, H.; Santanastasio, F.; Schäfer, C.; Schwick, C.; Segoni, I.; Sekmen, S.; Sharma, A.; Siegrist, P.; Silva, P.; Simon, M.; Sphicas, P.; Spiga, D.; Spiropulu, M.; Stoye, M.; Tsirou, A.; Veres, G. I.; Vlimant, J. R.; Wöhri, H. K.; Worm, S. D.; Zeuner, W. D.; Bertl, W.; Deiters, K.; Erdmann, W.; Gabathuler, K.; Horisberger, R.; Ingram, Q.; Kaestli, H. C.; König, S.; Kotlinski, D.; Langenegger, U.; Meier, F.; Renker, D.; Rohe, T.; Sibille, J.; Bäni, L.; Bortignon, P.; Buchmann, M. A.; Casal, B.; Chanon, N.; Chen, Z.; Deisher, A.; Dissertori, G.; Dittmar, M.; Dünser, M.; Eugster, J.; Freudenreich, K.; Grab, C.; Lecomte, P.; Lustermann, W.; Marini, A. C.; Martinez Ruiz del Arbol, P.; Mohr, N.; Moortgat, F.; Nägeli, C.; Nef, P.; Nessi-Tedaldi, F.; Pape, L.; Pauss, F.; Peruzzi, M.; Ronga, F. J.; Rossini, M.; Sala, L.; Sanchez, A. K.; Starodumov, A.; Stieger, B.; Takahashi, M.; Tauscher, L.; Thea, A.; Theofilatos, K.; Treille, D.; Urscheler, C.; Wallny, R.; Weber, H. A.; Wehrli, L.; Aguilo, E.; Amsler, C.; Chiochia, V.; De Visscher, S.; Favaro, C.; Ivova Rikova, M.; Millan Mejias, B.; Otiougova, P.; Robmann, P.; Snoek, H.; Tupputi, S.; Verzetti, M.; Chang, Y. H.; Chen, K. H.; Go, A.; Kuo, C. M.; Li, S. W.; Lin, W.; Liu, Z. K.; Lu, Y. J.; Mekterovic, D.; Singh, A. P.; Volpe, R.; Yu, S. S.; Bartalini, P.; Chang, P.; Chang, Y. H.; Chang, Y. W.; Chao, Y.; Chen, K. F.; Dietz, C.; Grundler, U.; Hou, W.-S.; Hsiung, Y.; Kao, K. Y.; Lei, Y. J.; Lu, R.-S.; Majumder, D.; Petrakou, E.; Shi, X.; Shiu, J. G.; Tzeng, Y. M.; Wang, M.; Adiguzel, A.; Bakirci, M. N.; Cerci, S.; Dozen, C.; Dumanoglu, I.; Eskut, E.; Girgis, S.; Gokbulut, G.; Hos, I.; Kangal, E. E.; Karapinar, G.; Kayis Topaksu, A.; Onengut, G.; Ozdemir, K.; Ozturk, S.; Polatoz, A.; Sogut, K.; Sunar Cerci, D.; Tali, B.; Topakli, H.; Vergili, L. N.; Vergili, M.; Akin, I. V.; Aliev, T.; Bilin, B.; Bilmis, S.; Deniz, M.; Gamsizkan, H.; Guler, A. M.; Ocalan, K.; Ozpineci, A.; Serin, M.; Sever, R.; Surat, U. E.; Yalvac, M.; Yildirim, E.; Zeyrek, M.; Deliomeroglu, M.; Gülmez, E.; Isildak, B.; Kaya, M.; Kaya, O.; Ozkorucuklu, S.; Sonmez, N.; Cankocak, K.; Levchuk, L.; Bostock, F.; Brooke, J. J.; Clement, E.; Cussans, D.; Flacher, H.; Frazier, R.; Goldstein, J.; Grimes, M.; Heath, G. P.; Heath, H. F.; Kreczko, L.; Metson, S.; Newbold, D. M.; Nirunpong, K.; Poll, A.; Senkin, S.; Smith, V. J.; Williams, T.; Basso, L.; Bell, K. W.; Belyaev, A.; Brew, C.; Brown, R. M.; Cockerill, D. J. A.; Coughlan, J. A.; Harder, K.; Harper, S.; Jackson, J.; Kennedy, B. W.; Olaiya, E.; Petyt, D.; RadburnSmith, B. C.; Shepherd-Themistocleous, C. H.; Tomalin, I. R.; Womersley, W. J.; Bainbridge, R.; Ball, G.; Beuselinck, R.; Buchmuller, O.; Colling, D.; Cripps, N.; Cutajar, M.; Dauncey, P.; Davies, G.; Della Negra, M.; Ferguson, W.; Fulcher, J.; Futyan, D.; Gilbert, A.; Guneratne Bryer, A.; Hall, G.; Hatherell, Z.; Hays, J.; Iles, G.; Jarvis, M.; Karapostoli, G.; Lyons, L.; Magnan, A.-M.; Marrouche, J.; Mathias, B.; Nandi, R.; Nash, J.; Nikitenko, A.; Papageorgiou, A.; Pela, J.; Pesaresi, M.; Petridis, K.; Pioppi, M.; Raymond, D. M.; Rogerson, S.; Rompotis, N.; Rose, A.; Ryan, M. 
J.; Seez, C.; Sharp, P.; Sparrow, A.; Tapper, A.; Vazquez Acosta, M.; Virdee, T.; Wakefield, S.; Wardle, N.; Whyntie, T.; Barrett, M.; Chadwick, M.; Cole, J. E.; Hobson, P. R.; Khan, A.; Kyberd, P.; Leggat, D.; Leslie, D.; Martin, W.; Reid, I. D.; Symonds, P.; Teodorescu, L.; Turner, M.; Hatakeyama, K.; Liu, H.; Scarborough, T.; Henderson, C.; Rumerio, P.; Avetisyan, A.; Bose, T.; Fantasia, C.; Heister, A.; John, J. St.; Lawson, P.; Lazic, D.; Rohlf, J.; Sperka, D.; Sulak, L.; Alimena, J.; Bhattacharya, S.; Cutts, D.; Ferapontov, A.; Heintz, U.; Jabeen, S.; Kukartsev, G.; Landsberg, G.; Luk, M.; Narain, M.; Nguyen, D.; Segala, M.; Sinthuprasith, T.; Speer, T.; Tsang, K. V.; Breedon, R.; Breto, G.; Calderon De La Barca Sanchez, M.; Chauhan, S.; Chertok, M.; Conway, J.; Conway, R.; Cox, P. T.; Dolen, J.; Erbacher, R.; Gardner, M.; Houtz, R.; Ko, W.; Kopecky, A.; Lander, R.; Mall, O.; Miceli, T.; Nelson, R.; Pellett, D.; Rutherford, B.; Searle, M.; Smith, J.; Squires, M.; Tripathi, M.; Vasquez Sierra, R.; Andreev, V.; Cline, D.; Cousins, R.; Duris, J.; Erhan, S.; Everaerts, P.; Farrell, C.; Hauser, J.; Ignatenko, M.; Plager, C.; Rakness, G.; Schlein, P.; Tucker, J.; Valuev, V.; Weber, M.; Babb, J.; Clare, R.; Dinardo, M. E.; Ellison, J.; Gary, J. W.; Giordano, F.; Hanson, G.; Jeng, G. Y.; Liu, H.; Long, O. R.; Luthra, A.; Nguyen, H.; Paramesvaran, S.; Sturdy, J.; Sumowidagdo, S.; Wilken, R.; Wimpenny, S.; Andrews, W.; Branson, J. G.; Cerati, G. B.; Cittolin, S.; Evans, D.; Golf, F.; Holzner, A.; Kelley, R.; Lebourgeois, M.; Letts, J.; Macneill, I.; Mangano, B.; Muelmenstaedt, J.; Padhi, S.; Palmer, C.; Petrucciani, G.; Pieri, M.; Ranieri, R.; Sani, M.; Sharma, V.; Simon, S.; Sudano, E.; Tadel, M.; Tu, Y.; Vartak, A.; Wasserbaech, S.; Würthwein, F.; Yagil, A.; Yoo, J.; Barge, D.; Bellan, R.; Campagnari, C.; D'Alfonso, M.; Danielson, T.; Flowers, K.; Geffert, P.; Incandela, J.; Justus, C.; Kalavase, P.; Koay, S. A.; Kovalskyi, D.; Krutelyov, V.; Lowette, S.; Mccoll, N.; Pavlunin, V.; Rebassoo, F.; Ribnik, J.; Richman, J.; Rossin, R.; Stuart, D.; To, W.; West, C.; Apresyan, A.; Bornheim, A.; Chen, Y.; Di Marco, E.; Duarte, J.; Gataullin, M.; Ma, Y.; Mott, A.; Newman, H. B.; Rogan, C.; Timciuc, V.; Traczyk, P.; Veverka, J.; Wilkinson, R.; Yang, Y.; Zhu, R. Y.; Akgun, B.; Carroll, R.; Ferguson, T.; Iiyama, Y.; Jang, D. W.; Liu, Y. F.; Paulini, M.; Vogel, H.; Vorobiev, I.; Cumalat, J. P.; Drell, B. R.; Edelmaier, C. J.; Ford, W. T.; Gaz, A.; Heyburn, B.; Luiggi Lopez, E.; Smith, J. G.; Stenson, K.; Ulmer, K. A.; Wagner, S. R.; Agostino, L.; Alexander, J.; Chatterjee, A.; Eggert, N.; Gibbons, L. K.; Heltsley, B.; Hopkins, W.; Khukhunaishvili, A.; Kreis, B.; Mirman, N.; Nicolas Kaufman, G.; Patterson, J. R.; Ryd, A.; Salvati, E.; Sun, W.; Teo, W. D.; Thom, J.; Thompson, J.; Vaughan, J.; Weng, Y.; Winstrom, L.; Wittich, P.; Winn, D.; Abdullin, S.; Albrow, M.; Anderson, J.; Bauerdick, L. A. T.; Beretvas, A.; Berryhill, J.; Bhat, P. C.; Bloch, I.; Burkett, K.; Butler, J. N.; Chetluru, V.; Cheung, H. W. K.; Chlebana, F.; Elvira, V. D.; Fisk, I.; Freeman, J.; Gao, Y.; Green, D.; Gutsche, O.; Hahn, A.; Hanlon, J.; Harris, R. M.; Hirschauer, J.; Hooberman, B.; Jindariani, S.; Johnson, M.; Joshi, U.; Kilminster, B.; Klima, B.; Kunori, S.; Kwan, S.; Leonidopoulos, C.; Lincoln, D.; Lipton, R.; Lueking, L.; Lykken, J.; Maeshima, K.; Marraffino, J. 
M.; Maruyama, S.; Mason, D.; McBride, P.; Mishra, K.; Mrenna, S.; Musienko, Y.; Newman-Holmes, C.; O'Dell, V.; Prokofyev, O.; Sexton-Kennedy, E.; Sharma, S.; Spalding, W. J.; Spiegel, L.; Tan, P.; Taylor, L.; Tkaczyk, S.; Tran, N. V.; Uplegger, L.; Vaandering, E. W.; Vidal, R.; Whitmore, J.; Wu, W.; Yang, F.; Yumiceva, F.; Yun, J. C.; Acosta, D.; Avery, P.; Bourilkov, D.; Chen, M.; Das, S.; De Gruttola, M.; Di Giovanni, G. P.; Dobur, D.; Drozdetskiy, A.; Field, R. D.; Fisher, M.; Fu, Y.; Furic, I. K.; Gartner, J.; Hugon, J.; Kim, B.; Konigsberg, J.; Korytov, A.; Kropivnitskaya, A.; Kypreos, T.; Low, J. F.; Matchev, K.; Milenovic, P.; Mitselmakher, G.; Muniz, L.; Remington, R.; Rinkevicius, A.; Sellers, P.; Skhirtladze, N.; Snowball, M.; Yelton, J.; Zakaria, M.; Gaultney, V.; Lebolo, L. M.; Linn, S.; Markowitz, P.; Martinez, G.; Rodriguez, J. L.; Adams, T.; Askew, A.; Bochenek, J.; Chen, J.; Diamond, B.; Gleyzer, S. V.; Haas, J.; Hagopian, S.; Hagopian, V.; Jenkins, M.; Johnson, K. F.; Prosper, H.; Veeraraghavan, V.; Weinberg, M.; Baarmand, M. M.; Dorney, B.; Hohlmann, M.; Kalakhety, H.; Vodopiyanov, I.; Adams, M. R.; Anghel, I. M.; Apanasevich, L.; Bai, Y.; Bazterra, V. E.; Betts, R. R.; Callner, J.; Cavanaugh, R.; Dragoiu, C.; Evdokimov, O.; Garcia-Solis, E. J.; Gauthier, L.; Gerber, C. E.; Hofman, D. J.; Khalatyan, S.; Lacroix, F.; Malek, M.; O'Brien, C.; Silkworth, C.; Strom, D.; Varelas, N.; Akgun, U.; Albayrak, E. A.; Bilki, B.; Chung, K.; Clarida, W.; Duru, F.; Griffiths, S.; Lae, C. K.; Merlo, J.-P.; Mermerkaya, H.; Mestvirishvili, A.; Moeller, A.; Nachtman, J.; Newsom, C. R.; Norbeck, E.; Olson, J.; Onel, Y.; Ozok, F.; Sen, S.; Tiras, E.; Wetzel, J.; Yetkin, T.; Yi, K.; Barnett, B. A.; Blumenfeld, B.; Bolognesi, S.; Fehling, D.; Giurgiu, G.; Gritsan, A. V.; Guo, Z. J.; Hu, G.; Maksimovic, P.; Rappoccio, S.; Swartz, M.; Whitbeck, A.; Baringer, P.; Bean, A.; Benelli, G.; Grachov, O.; Kenny, R. P.; Murray, M.; Noonan, D.; Radicci, V.; Sanders, S.; Stringer, R.; Tinti, G.; Wood, J. S.; Zhukova, V.; Barfuss, A. F.; Bolton, T.; Chakaberia, I.; Ivanov, A.; Khalil, S.; Makouski, M.; Maravin, Y.; Shrestha, S.; Svintradze, I.; Gronberg, J.; Lange, D.; Wright, D.; Baden, A.; Boutemeur, M.; Calvert, B.; Eno, S. C.; Gomez, J. A.; Hadley, N. J.; Kellogg, R. G.; Kirn, M.; Kolberg, T.; Lu, Y.; Marionneau, M.; Mignerey, A. C.; Peterman, A.; Rossato, K.; Skuja, A.; Temple, J.; Tonjes, M. B.; Tonwar, S. C.; Twedt, E.; Bauer, G.; Bendavid, J.; Busza, W.; Butz, E.; Cali, I. A.; Chan, M.; Dutta, V.; Gomez Ceballos, G.; Goncharov, M.; Hahn, K. A.; Kim, Y.; Klute, M.; Lee, Y.-J.; Li, W.; Luckey, P. D.; Ma, T.; Nahn, S.; Paus, C.; Ralph, D.; Roland, C.; Roland, G.; Rudolph, M.; Stephans, G. S. F.; Stöckli, F.; Sumorok, K.; Sung, K.; Velicanu, D.; Wenger, E. A.; Wolf, R.; Wyslouch, B.; Xie, S.; Yang, M.; Yilmaz, Y.; Yoon, A. S.; Zanetti, M.; Cooper, S. I.; Cushman, P.; Dahmes, B.; De Benedetti, A.; Franzoni, G.; Gude, A.; Haupt, J.; Kao, S. C.; Klapoetke, K.; Kubota, Y.; Mans, J.; Pastika, N.; Rusack, R.; Sasseville, M.; Singovsky, A.; Tambe, N.; Turkewitz, J.; Cremaldi, L. M.; Kroeger, R.; Perera, L.; Rahmat, R.; Sanders, D. A.; Avdeeva, E.; Bloom, K.; Bose, S.; Butt, J.; Claes, D. R.; Dominguez, A.; Eads, M.; Jindal, P.; Keller, J.; Kravchenko, I.; Lazo-Flores, J.; Malbouisson, H.; Malik, S.; Snow, G. R.; Baur, U.; Godshalk, A.; Iashvili, I.; Jain, S.; Kharchilava, A.; Kumar, A.; Shipkowski, S. 
P.; Smith, K.; Alverson, G.; Barberis, E.; Baumgartel, D.; Chasco, M.; Haley, J.; Trocino, D.; Wood, D.; Zhang, J.; Anastassov, A.; Kubik, A.; Mucia, N.; Odell, N.; Ofierzynski, R. A.; Pollack, B.; Pozdnyakov, A.; Schmitt, M.; Stoynev, S.; Velasco, M.; Won, S.; Antonelli, L.; Berry, D.; Brinkerhoff, A.; Hildreth, M.; Jessop, C.; Karmgard, D. J.; Kolb, J.; Lannon, K.; Luo, W.; Lynch, S.; Marinelli, N.; Morse, D. M.; Pearson, T.; Ruchti, R.; Slaunwhite, J.; Valls, N.; Warchol, J.; Wayne, M.; Wolf, M.; Ziegler, J.; Bylsma, B.; Durkin, L. S.; Hill, C.; Hughes, R.; Killewald, P.; Kotov, K.; Ling, T. Y.; Puigh, D.; Rodenburg, M.; Vuosalo, C.; Williams, G.; Winer, B. L.; Adam, N.; Berry, E.; Elmer, P.; Gerbaudo, D.; Halyo, V.; Hebda, P.; Hegeman, J.; Hunt, A.; Laird, E.; Lopes Pegna, D.; Lujan, P.; Marlow, D.; Medvedeva, T.; Mooney, M.; Olsen, J.; Piroué, P.; Quan, X.; Raval, A.; Saka, H.; Stickland, D.; Tully, C.; Werner, J. S.; Zuranski, A.; Acosta, J. G.; Brownson, E.; Huang, X. T.; Lopez, A.; Mendez, H.; Oliveros, S.; Ramirez Vargas, J. E.; Zatserklyaniy, A.; Alagoz, E.; Barnes, V. E.; Benedetti, D.; Bolla, G.; Bortoletto, D.; De Mattia, M.; Everett, A.; Hu, Z.; Jones, M.; Koybasi, O.; Kress, M.; Laasanen, A. T.; Leonardo, N.; Maroussov, V.; Merkel, P.; Miller, D. H.; Neumeister, N.; Shipsey, I.; Silvers, D.; Svyatkovskiy, A.; Vidal Marono, M.; Yoo, H. D.; Zablocki, J.; Zheng, Y.; Guragain, S.; Parashar, N.; Adair, A.; Boulahouache, C.; Cuplov, V.; Ecklund, K. M.; Geurts, F. J. M.; Padley, B. P.; Redjimi, R.; Roberts, J.; Zabel, J.; Betchart, B.; Bodek, A.; Chung, Y. S.; Covarelli, R.; de Barbaro, P.; Demina, R.; Eshaq, Y.; Garcia-Bellido, A.; Goldenzweig, P.; Gotra, Y.; Han, J.; Harel, A.; Korjenevski, S.; Miner, D. C.; Vishnevskiy, D.; Zielinski, M.; Bhatti, A.; Ciesielski, R.; Demortier, L.; Goulianos, K.; Lungu, G.; Malik, S.; Mesropian, C.; Arora, S.; Barker, A.; Chou, J. P.; Contreras-Campana, C.; Contreras-Campana, E.; Duggan, D.; Ferencek, D.; Gershtein, Y.; Gray, R.; Halkiadakis, E.; Hidas, D.; Hits, D.; Kilic, C.; Lath, A.; Panwalkar, S.; Park, M.; Patel, R.; Rekovic, V.; Richards, A.; Robles, J.; Rose, K.; Salur, S.; Schnetzer, S.; Seitz, C.; Somalwar, S.; Stone, R.; Thomas, S.; Cerizza, G.; Hollingsworth, M.; Spanier, S.; Yang, Z. C.; York, A.; Eusebi, R.; Flanagan, W.; Gilmore, J.; Kamon, T.; Khotilovich, V.; Montalvo, R.; Osipenkov, I.; Pakhotin, Y.; Perloff, A.; Roe, J.; Safonov, A.; Sakuma, T.; Sengupta, S.; Suarez, I.; Tatarinov, A.; Toback, D.; Akchurin, N.; Damgov, J.; Dudero, P. R.; Jeong, C.; Kovitanggoon, K.; Lee, S. W.; Libeiro, T.; Roh, Y.; Volobouev, I.; Appelt, E.; Engh, D.; Florez, C.; Greene, S.; Gurrola, A.; Johns, W.; Kurt, P.; Maguire, C.; Melo, A.; Sheldon, P.; Snook, B.; Tuo, S.; Velkovska, J.; Arenton, M. W.; Balazs, M.; Boutle, S.; Cox, B.; Francis, B.; Goodell, J.; Hirosky, R.; Ledovskoy, A.; Lin, C.; Neu, C.; Wood, J.; Yohay, R.; Gollapinni, S.; Harr, R.; Karchin, P. E.; Kottachchi Kankanamge Don, C.; Lamichhane, P.; Sakharov, A.; Anderson, M.; Bachtis, M.; Belknap, D.; Borrello, L.; Carlsmith, D.; Cepeda, M.; Dasu, S.; Gray, L.; Grogg, K. S.; Grothe, M.; Hall-Wilton, R.; Herndon, M.; Hervé, A.; Klabbers, P.; Klukas, J.; Lanaro, A.; Lazaridis, C.; Leonard, J.; Loveless, R.; Mohapatra, A.; Ojalvo, I.; Pierro, G. A.; Ross, I.; Savin, A.; Smith, W. H.; Swanson, J.

    2012-08-01

    A search for a new heavy gauge boson W' decaying to an electron or muon, plus a low mass neutrino, is presented. This study uses data corresponding to an integrated luminosity of 5.0 fb-1, collected using the CMS detector in pp collisions at a centre-of-mass energy of 7 TeV at the LHC. Events containing a single electron or muon and missing transverse momentum are analyzed. No significant excess of events above the standard model expectation is found in the transverse mass distribution of the lepton-neutrino system, and upper limits for cross sections above different transverse mass thresholds are presented. Mass exclusion limits at 95% CL for a range of W' models are determined, including a limit of 2.5 TeV for right-handed W' bosons with standard-model-like couplings and limits of 2.43-2.63 TeV for left-handed W' bosons, taking into account their interference with the standard model W boson. Exclusion limits have also been set on Kaluza-Klein WKK states in the framework of split universal extra dimensions.
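
    For reference, the transverse mass variable in which the search above is performed is conventionally defined as follows (a standard definition in W' searches, not quoted from this record):

        M_T = \sqrt{ 2\, p_T^{\ell}\, E_T^{\mathrm{miss}} \left( 1 - \cos \Delta\phi_{\ell,\mathrm{miss}} \right) }

    where p_T^l is the lepton transverse momentum, E_T^miss the missing transverse momentum, and Δφ the azimuthal angle between them; a W' signal would appear as an excess of events at large M_T.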

  18. Supporting Parental Involvement in Children's Early Learning: Lessons from Community Childcare Centres in Dublin's Docklands

    ERIC Educational Resources Information Center

    Share, Michelle; Kerrins, Liz

    2013-01-01

    Recently in Ireland attention has been placed on the importance of parental involvement in early childhood care and education settings as seen in the Síolta Quality Standards and Aistear Curriculum Framework. Yet there is little Irish empirical evidence on parental involvement in childcare settings; on the involvement models being used, or on the…

  19. A standard telemental health evaluation model: the time is now.

    PubMed

    Kramer, Greg M; Shore, Jay H; Mishkind, Matt C; Friedl, Karl E; Poropatich, Ronald K; Gahm, Gregory A

    2012-05-01

    The telehealth field has advanced historic promises to improve access, cost, and quality of care. However, the extent to which it is delivering on its promises is unclear as the scientific evidence needed to justify success is still emerging. Many have identified the need to advance the scientific knowledge base to better quantify success. One method for advancing that knowledge base is a standard telemental health evaluation model. Telemental health is defined here as the provision of mental health services using live, interactive video-teleconferencing technology. Evaluation in the telemental health field largely consists of descriptive and small pilot studies, is often defined by the individual goals of the specific programs, and is typically focused on only one outcome. The field should adopt new evaluation methods that consider the co-adaptive interaction between users (patients and providers), healthcare costs and savings, and the rapid evolution in communication technologies. Acceptance of a standard evaluation model will improve perceptions of telemental health as an established field, promote development of a sounder empirical base, promote interagency collaboration, and provide a framework for more multidisciplinary research that integrates measuring the impact of the technology and the overall healthcare aspect. We suggest that consideration of a standard model is timely given the current stage of scientific progress in telemental health. We broadly recommend elements that such a standard evaluation model might include and suggest a way forward for adopting it.

  20. State-of-the-Art Calculation of the Decay Rate of Electroweak Vacuum in the Standard Model.

    PubMed

    Chigusa, So; Moroi, Takeo; Shoji, Yutaro

    2017-11-24

    The decay rate of the electroweak (EW) vacuum is calculated in the framework of the standard model (SM) of particle physics, using the recent progress in the understanding of the decay rate of metastable vacuum in gauge theories. We give a manifestly gauge-invariant expression of the decay rate. We also perform a detailed numerical calculation of the decay rate. With the best-fit values of the SM parameters, we find that the decay rate of the EW vacuum per unit volume is about 10^{-554}  Gyr^{-1} Gpc^{-3}; with the uncertainty in the top mass, the decay rate is estimated as 10^{-284}-10^{-1371}  Gyr^{-1} Gpc^{-3}.
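
    To put the quoted numbers in perspective, a rough order-of-magnitude estimate (our illustration, assuming a comoving volume of order 10^3 Gpc^3 and an age of order 10 Gyr for the observable universe) gives an expected number of nucleation events of

        N \sim \Gamma \, V \, T \approx 10^{-554}\ \mathrm{Gyr^{-1}\,Gpc^{-3}} \times 10^{3}\ \mathrm{Gpc^{3}} \times 10\ \mathrm{Gyr} \approx 10^{-550},

    i.e. the electroweak vacuum is, for all practical purposes, stable on cosmological timescales.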

  1. 76 FR 58730 - Version 4 Critical Infrastructure Protection Reliability Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-22

    ... provide a cybersecurity framework for the identification and protection of "Critical Cyber Assets" to... the identification and documentation of Critical Cyber Assets associated with Critical Assets that... Standards provide a cybersecurity framework for the identification and protection of "Critical Cyber Assets...

  2. A standardized framing for reporting protein identifications in mzIdentML 1.2

    PubMed Central

    Seymour, Sean L.; Farrah, Terry; Binz, Pierre-Alain; Chalkley, Robert J.; Cottrell, John S.; Searle, Brian C.; Tabb, David L.; Vizcaíno, Juan Antonio; Prieto, Gorka; Uszkoreit, Julian; Eisenacher, Martin; Martínez-Bartolomé, Salvador; Ghali, Fawaz; Jones, Andrew R.

    2015-01-01

    Inferring which protein species have been detected in bottom-up proteomics experiments has been a challenging problem for which solutions have been maturing over the past decade. While many inference approaches now function well in isolation, comparing and reconciling the results generated across different tools remains difficult. It presently stands as one of the greatest barriers in collaborative efforts such as the Human Proteome Project and public repositories like the PRoteomics IDEntifications (PRIDE) database. Here we present a framework for reporting protein identifications that seeks to improve capabilities for comparing results generated by different inference tools. This framework standardizes the terminology for describing protein identification results, associated with the HUPO-Proteomics Standards Initiative (PSI) mzIdentML standard, while still allowing for differing methodologies to reach that final state. It is proposed that developers of software for reporting identification results will adopt this terminology in their outputs. While the new terminology does not require any changes to the core mzIdentML model, it represents a significant change in practice, and, as such, the rules will be released via a new version of the mzIdentML specification (version 1.2) so that consumers of files are able to determine whether the new guidelines have been adopted by export software. PMID:25092112
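
    As a concrete illustration of the grouping hierarchy this terminology describes, the following minimal Python sketch walks the protein ambiguity groups of an mzIdentML file using only the standard library. The input file name and the version-1.2 namespace URI are assumptions for illustration; the element and attribute names follow the published mzIdentML schema.

        import xml.etree.ElementTree as ET

        # Assumed namespace URI for mzIdentML 1.2; match it to the xmlns
        # actually declared in the file being read.
        NS = {"mz": "http://psidev.info/psi/pi/mzIdentML/1.2"}

        tree = ET.parse("example.mzid")  # hypothetical input file
        for group in tree.getroot().iterfind(".//mz:ProteinAmbiguityGroup", NS):
            for hyp in group.iterfind("mz:ProteinDetectionHypothesis", NS):
                # passThreshold records whether this protein hypothesis met the
                # reporting threshold declared by the producing software.
                print(group.get("id"), hyp.get("dBSequence_ref"), hyp.get("passThreshold"))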

  3. Insight into model mechanisms through automatic parameter fitting: a new methodological framework for model development

    PubMed Central

    2014-01-01

    Background Striking a balance between model complexity and parameter identifiability, while still producing biologically feasible simulations, is a major challenge in computational biology. While these two elements of model development are closely coupled, parameter fitting from measured data and analysis of model mechanisms have traditionally been performed separately and sequentially. This process produces potential mismatches between model and data complexities that can compromise the ability of computational frameworks to reveal mechanistic insights or predict new behaviour. In this study we address this issue by presenting a generic framework for combined model parameterisation, comparison of model alternatives and analysis of model mechanisms. Results The presented methodology is based on a combination of multivariate metamodelling (statistical approximation of the input–output relationships of deterministic models) and a systematic zooming into biologically feasible regions of the parameter space by iterative generation of new experimental designs and look-up of simulations in the proximity of the measured data. The parameter fitting pipeline includes an implicit sensitivity analysis and analysis of parameter identifiability, making it suitable for testing hypotheses for model reduction. Using this approach, under-constrained model parameters, as well as the coupling between parameters within the model, are identified. The methodology is demonstrated by refitting the parameters of a published model of cardiac cellular mechanics using a combination of measured data and synthetic data from an alternative model of the same system. Using this approach, reduced models with simplified expressions for the tropomyosin/crossbridge kinetics were found by identification of model components that can be omitted without affecting the fit to the parameterising data. Our analysis revealed that model parameters could be constrained to a standard deviation of, on average, 15% of the mean values over the succeeding parameter sets. Conclusions Our results indicate that the presented approach is effective for comparing model alternatives and reducing models to the minimum complexity replicating measured data. We therefore believe that this approach has significant potential for reparameterising existing frameworks, for identification of redundant model components of large biophysical models and for increasing their predictive capacity. PMID:24886522
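
    The "zooming" idea described above can be sketched generically. The following Python fragment is our illustration under simplifying assumptions, not the authors' implementation (which interposes multivariate metamodels rather than calling the simulator directly): it samples a design within the current bounds, keeps the parameter sets whose simulated output lies closest to the measured data, and shrinks the bounds around them.

        import numpy as np

        def zoom_fit(simulate, data, bounds, n=500, keep=0.1, iters=5):
            """Iteratively zoom the parameter bounds toward the region of
            parameter space that best reproduces the measured data."""
            lo, hi = np.asarray(bounds, dtype=float).T
            for _ in range(iters):
                theta = lo + (hi - lo) * np.random.rand(n, lo.size)   # design points
                errors = np.array([np.linalg.norm(simulate(p) - data) for p in theta])
                best = theta[np.argsort(errors)[: int(keep * n)]]     # feasible region
                lo, hi = best.min(axis=0), best.max(axis=0)           # shrink bounds
            # Mean and spread of the surviving parameter sets; the spread plays the
            # role of the ~15% standard deviation reported above.
            return best.mean(axis=0), best.std(axis=0)

        # Hypothetical usage: fit a two-parameter exponential decay to clean data.
        t = np.linspace(0.0, 1.0, 20)
        target = 2.0 * np.exp(-3.0 * t)
        fit, spread = zoom_fit(lambda p: p[0] * np.exp(-p[1] * t), target,
                               bounds=[(0.0, 5.0), (0.0, 10.0)])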

  4. Charting the Unknown: A Hunt in the Dark

    NASA Astrophysics Data System (ADS)

    Mohlabeng, Gopolang Mokoka

    Astrophysical and cosmological observations have pointed strongly to the existence of dark matter in the Universe, yet its nature remains elusive. It may be hidden in a vast unknown parameter space in which exhaustively searching for a signal is not feasible. We are, therefore, compelled to consider a robust program based on a wide range of new theoretical ideas and complementary strategies for detection. The aim of this dissertation is to investigate the phenomenology of diverse dark sectors with the objective of understanding and characterizing dark matter. We do so by exploring dark matter phenomenology under three main frameworks of study: (I) the model-dependent approach, (II) the model-independent approach, and (III) simplified models. In each framework we focus on unexplored and well-motivated dark matter scenarios as well as their prospects of detection at current and future experiments. First, we concentrate on the model-dependent method, where we consider minimal dark matter in the form of mixed fermionic stable states in a gauge extension of the standard model. In particular, we incorporate the fermion mixings governed by gauge invariant interactions with the heavier degrees of freedom. We find that the manner of mixing has an impact on the detectability of the dark matter at experiments. Pursuing this model-dependent direction, we explore a space-time extension of the standard model which houses a vector dark matter candidate. We incorporate boundary terms arising from the topology of the model and find that these control the way dark matter may interact with baryonic matter. Next we investigate the model-independent approach, in which we examine a non-minimal dark sector in the form of boosted dark matter. In this study, we consider an effective field theory involving two stable fermionic states. We probe the sensitivity of this type of dark matter coming from the galactic center and the center of the Sun, and investigate its detection prospects at current and future large volume experiments. Finally, we explore an intermediate approach in the form of a simplified model. Here we analyze a different non-minimal dark sector whose interactions with the standard model sector are mediated primarily by the Higgs boson. We discuss for the first time vector and fermion dark matter preserved under the same stabilization symmetry. We find that the presence of both species in the early Universe results in rare processes contributing to the dark matter relic abundance. We conclude that connecting these three frameworks under one main dark matter program, instead of concentrating on them individually, could help us understand what we are missing, and may assist us to produce groundbreaking ideas which lead to the discovery of a signal in the near future.

  5. OpenARC: Extensible OpenACC Compiler Framework for Directive-Based Accelerator Programming Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Seyong; Vetter, Jeffrey S

    2014-01-01

    Directive-based accelerator programming models such as OpenACC have arisen as an alternative solution for programming emerging Scalable Heterogeneous Computing (SHC) platforms. However, the increased complexity of SHC systems incurs several challenges in terms of portability and productivity. This paper presents an open-source OpenACC compiler, called OpenARC, which serves as an extensible research framework to address those issues in directive-based accelerator programming. The paper explains the important design strategies and key compiler transformation techniques needed to implement the reference OpenACC compiler. Moreover, it demonstrates the efficacy of OpenARC as a research framework for directive-based programming studies by proposing and implementing OpenACC extensions in the OpenARC framework to (1) support hybrid programming of unified and separate memory and (2) exploit architecture-specific features in an abstract manner. Porting thirteen standard OpenACC programs and three extended OpenACC programs to CUDA GPUs shows that OpenARC performs similarly to a commercial OpenACC compiler, while serving as a high-level research framework.

  6. A Security Audit Framework to Manage Information System Security

    NASA Astrophysics Data System (ADS)

    Pereira, Teresa; Santos, Henrique

    The widespread adoption of information and communication technology has increased organizations' dependency on the performance of their Information Systems. As a result, organizations must establish adequate security procedures to properly manage information security, in order to protect their valuable or critical resources from accidental or intentional attacks and ensure their normal activity. A conceptual security framework to manage and audit Information System Security is proposed and discussed. The proposed framework is intended to assist organizations, first, in understanding precisely what assets they need to protect and where their weaknesses (vulnerabilities) lie, enabling adequate security management. Second, it provides a security audit framework to support the organization in assessing the efficiency of the controls and policies adopted to prevent or mitigate the attacks, threats, and vulnerabilities to which organizations are exposed through the advance of new technologies and new Internet-enabled services. The presented framework is based on a conceptual model approach, which contains the semantic description of the concepts defined in the information security domain, based on the ISO/IEC JTC1 standards.

  7. An Advanced Pharmacy Practice Framework for Australia

    PubMed Central

    Jackson, Shane; Martin, Grant; Bergin, Jennifer; Clark, Bronwyn; Stupans, Ieva; Yeates, Gilbert; Nissen, Lisa; Marty, Stephen; Gysslink, Paul; Matthews, Andrew; Kirsa, Sue; Deans, Kerry; Sorimachi, Kay

    2015-01-01

    The need to develop An Advanced Pharmacy Practice Framework for Australia (the “APPF”) was identified during the 2010 review of the competency standards for Australian pharmacists. The Advanced Pharmacy Practice Framework Steering Committee, a collaborative profession-wide committee comprised of representatives of ten pharmacy organisations, examined and adapted existing advanced practice frameworks, all of which were found to have been based on the Competency Development and Evaluation Group (CoDEG) Advanced and Consultant Level Framework (the “CoDEG Framework”) from the United Kingdom. Its competency standards were also found to align well with the Domains of the National Competency Standards Framework for Pharmacists in Australia (the “National Framework”). Adaptation of the CoDEG Framework created an APPF that is complementary to the National Framework, sufficiently flexible to customise for recognising advanced practice in any area of professional practice and has been approved by the boards/councils of all participating organisations. The primary purpose of the APPF is to assist the development of the profession to meet the changing health care needs of the community. However, it is also a valuable tool for assuring members of the public of the competence of an advanced practice pharmacist and the quality and safety of the services they deliver. PMID:28975900

  8. A framework for the definition of standardized protocols for measuring upper-extremity kinematics.

    PubMed

    Kontaxis, A; Cutti, A G; Johnson, G R; Veeger, H E J

    2009-03-01

    Increasing interest in upper extremity biomechanics has led to closer investigations of both segment movements and detailed joint motion. Unfortunately, conceptual and practical differences in the motion analysis protocols used to date reduce compatibility for subsequent data and cross-validation analyses and so weaken the body of knowledge. This difficulty highlights a need for standardised protocols, each addressing a set of questions of comparable content. The aim of this work is therefore to open a discussion and propose a flexible framework to support: (1) the definition of standardised protocols, (2) a standardised description of these protocols, and (3) the formulation of general recommendations. A framework for the definition of standardized protocols is proposed, composed of two nested flowcharts. The first defines what a motion analysis protocol is by pointing out its role in a motion analysis study. The second flowchart describes the steps to build a protocol, which requires decisions on the joints or segments to be investigated and the description of their mechanical equivalent model, the definition of the anatomical or functional coordinate frames, the choice of marker or sensor configuration and the validity of their use, the definition of the activities to be measured and the refinements that can be applied to the final measurements. Finally, general recommendations are proposed for each of the steps based on the current literature, and open issues are highlighted for future investigation and standardisation. Standardisation of motion analysis protocols is urgent. The proposed framework can guide this process through the rationalisation of the approach.

  9. Framework for 21st Century School Nursing Practice: Framing Professional Development.

    PubMed

    Allen-Johnson, Ann

    2017-05-01

    The NASN Code of Ethics upholds that it is the responsibility of the school nurse to maintain competency and pursue personal and professional growth. Designing professional development activities that are relevant and support the needs of the school nurse can be a challenge. The Framework for 21st Century School Nursing Practice provides a model rooted in evidence-based standards of practice that can be utilized to assess an existing professional development program and identify gaps in learning opportunities. Nurse leaders can use the Framework for 21st Century School Nursing Practice to provide a roadmap toward a professional development program that will be meaningful to school nurse staff, help restore or maintain joy in their practice, and allow them to achieve the goal of advancing the well-being, academic success, and lifelong achievement and health of students.

  10. Development of the Korean framework for senior-friendly hospitals: a Delphi study.

    PubMed

    Kim, Yoon-Sook; Han, Seol-Heui; Hwang, Jeong-Hae; Park, Jae-Min; Lee, Jongmin; Choi, Jaekyung; Moon, Yeonsil; Kim, Hee Joung; Shin, Grace Jung Eun; Lee, Ji-Sun; Choi, Ye Ji; Uhm, Kyeong Eun; Kim, In Ae; Nam, Ji-Won

    2017-08-04

    Aging is an inevitable part of life. One can maintain well-being and wellness even after discharge and/or transition if his or her functional decline is minimized, sudden decline is prevented, and functioning is promoted during hospitalization. Caring appropriately for elderly patients requires the systematic application of Senior-Friendly Hospital principles to all operating systems, including medical centres' organization and environment, as well as patient treatment processes. The Senior-Friendly Hospital framework is valid and important for patient safety and quality improvement. This study aimed to make recommendations regarding the development of the Korean Framework for Senior-Friendly Hospitals for older patients' care management, patient safety interventions, and health promotion, via a Delphi survey. Two rounds of Delphi surveying were conducted with 15 participants who had at least 3 years' experience in accreditation surveying and medical accreditation standards, survey methods, and accreditation investigator education. In each round, we calculated statistics describing each standard's validity and feasibility. The Korean Framework for Senior-Friendly Hospitals included 4 Chapters, 11 categories, and 67 standards through consensus of the Senior-Friendly Hospitals task force and experts' peer review. After the two rounds of Delphi surveying, validity evaluation led to no changes in standards of the Senior-Friendly Hospitals; however, the number of standards showing adequate validity decreased from 67 to 58. Regarding feasibility, no changes were necessary in the standards; however, the numbers of categories and standards showing adequate feasibility decreased from 11 to 8 and from 67 to 30, respectively. The excluded categories were 3.2, 4.2, and 4.3 (service, transportation, and signage and identification). The highest feasibility values were given to standards 2.1.1, 4.1.4, and 4.1.6. The highest feasibility score was given to standard 2.4.2. The Korean Framework for Senior-Friendly Hospitals needs to include 4 Chapters, 8 categories, and 30 standards. The Accreditation Program for Healthcare Organizations should include Senior-Friendly Hospital-relevant standards considering Korea's medical environment.

  11. Effect of Additional Incentives for Aviation Biofuels: Results from the Biomass Scenario Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vimmerstedt, Laura J; Newes, Emily K

    2017-12-05

    The National Renewable Energy Laboratory supported the Department of Energy, Bioenergy Technologies Office, with analysis of alternative jet fuels in collaboration with the U.S. Department of Transportation, Federal Aviation Administration. Airlines for America requested additional exploratory scenarios within the FAA analytic framework, the Biomass Scenario Model. The results were presented at a public working meeting of the California Air Resources Board on including alternative jet fuel in the Low Carbon Fuel Standard on March 17, 2017 (https://www.arb.ca.gov/fuels/lcfs/lcfs_meetings/lcfs_meetings.htm). This presentation clarifies and annotates the slides from the public working meeting and provides a link to the full data set. NREL does not advocate for or against the policies analyzed in this study.

  12. The Development of a Conceptual Framework for New K-12 Science Education Standards (Invited)

    NASA Astrophysics Data System (ADS)

    Keller, T.

    2010-12-01

    The National Academy of Sciences has created a committee of 18 National Academy of Science and Engineering members, academic scientists, cognitive and learning scientists, and educators, educational policymakers and researchers to develop a framework to guide new K-12 science education standards. The committee began its work in January, 2010, released a draft of the framework in July, 2010, and intends to have the final framework in the first quarter of 2011. The committee was helped in early phases of the work by consultant design teams. The framework is designed to help realize a vision for science and engineering education in which all students actively engage in science and engineering practices in order to deepen their understanding of core ideas in science over multiple years of school. These three dimensions - core disciplinary ideas, science and engineering practices, and cross-cutting elements - must blend together to build an exciting, relevant, and forward looking science education. The framework will be used as a base for development of next generation K-12 science education standards.

  13. A framework for understanding cancer comparative effectiveness research data needs.

    PubMed

    Carpenter, William R; Meyer, Anne-Marie; Abernethy, Amy P; Stürmer, Til; Kosorok, Michael R

    2012-11-01

    Randomized controlled trials remain the gold standard for evaluating cancer intervention efficacy. Randomized trials are not always feasible, practical, or timely and often don't adequately reflect patient heterogeneity and real-world clinical practice. Comparative effectiveness research can leverage secondary data to help fill knowledge gaps randomized trials leave unaddressed; however, comparative effectiveness research also faces shortcomings. The goal of this project was to develop a new model and inform an evolving framework articulating cancer comparative effectiveness research data needs. We examined prevalent models and conducted semi-structured discussions with 76 clinicians and comparative effectiveness research researchers affiliated with the Agency for Healthcare Research and Quality's cancer comparative effectiveness research programs. A new model was iteratively developed and presents cancer comparative effectiveness research and important measures in a patient-centered, longitudinal chronic care model better reflecting contemporary cancer care in the context of the cancer care continuum, rather than a single-episode, acute-care perspective. Immediately relevant for federally funded comparative effectiveness research programs, the model informs an evolving framework articulating cancer comparative effectiveness research data needs, including evolutionary enhancements to registries and epidemiologic research data systems. We discuss elements of contemporary clinical practice, methodology improvements, and related needs affecting comparative effectiveness research's ability to yield findings clinicians, policy makers, and stakeholders can confidently act on. Copyright © 2012 Elsevier Inc. All rights reserved.

  14. A framework for understanding cancer comparative effectiveness research data needs

    PubMed Central

    Carpenter, William R; Meyer, Anne-Marie; Abernethy, Amy P.; Stürmer, Til; Kosorok, Michael R.

    2012-01-01

    Objective Randomized controlled trials remain the gold standard for evaluating cancer intervention efficacy. Randomized trials are not always feasible, practical, or timely, and often don’t adequately reflect patient heterogeneity and real-world clinical practice. Comparative effectiveness research can leverage secondary data to help fill knowledge gaps randomized trials leave unaddressed; however, comparative effectiveness research also faces shortcomings. The goal of this project was to develop a new model and inform an evolving framework articulating cancer comparative effectiveness research data needs. Study Design and Setting We examined prevalent models and conducted semi-structured discussions with 76 clinicians and comparative effectiveness research researchers affiliated with the Agency for Healthcare Research and Quality’s cancer comparative effectiveness research programs. Results A new model was iteratively developed, and presents cancer comparative effectiveness research and important measures in a patient-centered, longitudinal chronic care model better-reflecting contemporary cancer care in the context of the cancer care continuum, rather than a single-episode, acute-care perspective. Conclusion Immediately relevant for federally-funded comparative effectiveness research programs, the model informs an evolving framework articulating cancer comparative effectiveness research data needs, including evolutionary enhancements to registries and epidemiologic research data systems. We discuss elements of contemporary clinical practice, methodology improvements, and related needs affecting comparative effectiveness research’s ability to yield findings clinicians, policymakers, and stakeholders can confidently act on. PMID:23017633

  15. System and methods of resource usage using an interoperable management framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heileman, Gregory L.; Jamkhedkar, Pramod A.; Lamb, Christopher C.

    A generic rights expression language allows interoperability across different computing environments, including resource usage by different applications. A formal framework for usage management provides scaffolding upon which interoperable usage management systems can be built. Certain features of the framework, such as the operational semantics, are standardized, while other areas are left free of standards to allow the choice and innovation needed to achieve a balance of flexibility and usability in usage management systems.

  16. A resource-oriented web service for environmental modeling

    NASA Astrophysics Data System (ADS)

    Ferencik, Ioan

    2013-04-01

    Environmental modeling is a widely adopted practice in the study of natural phenomena. Environmental models can be difficult to build and use, so sharing them within the community is important. The most common approach to sharing a model is to expose it as a web service, but in practice interaction with such a service is cumbersome due to the lack of a standardized contract and the complexity of the model being exposed. In this work we investigate a resource-oriented approach to exposing environmental models as web services. We view a model as a layered resource built atop the object concept from Object-Oriented Programming, augmented with persistence capabilities provided by an embedded object database that keeps track of its state, and implementing the four basic principles of resource-oriented architectures: addressability, statelessness, representation, and uniform interface. For the implementation we use exclusively open-source software: the Django framework, the dyBase object-oriented database, and the Python programming language. We developed a generic framework of resources structured into a hierarchy of types and extended this typology with resources specific to the domain of environmental modeling. To test our web service we used cURL, a robust command-line web client.
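
    The layering described above is easy to sketch. The skeleton below is our illustration only (class and method names are hypothetical, and the dyBase persistence layer is omitted); it shows how the four resource-oriented principles can map onto a wrapped model object in Python:

        import json

        class ModelResource:
            """Expose a model object as a resource with a uniform interface."""

            def __init__(self, model, uri):
                self.model, self.uri = model, uri  # addressability: every model has a URI

            def get(self):
                # Representation: expose model state in a standard format (JSON).
                return json.dumps({"uri": self.uri, "state": self.model.state})

            def put(self, body):
                # Uniform interface: state changes only through standard verbs, and
                # each request carries everything it needs (statelessness).
                self.model.state.update(json.loads(body))
                return self.get()

        class ToyModel:
            def __init__(self):
                self.state = {"timestep": 0}

        resource = ModelResource(ToyModel(), "/models/toy/1")
        print(resource.put('{"timestep": 1}'))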

  17. An Integrated Framework for Multipollutant Air Quality Management and Its Application in Georgia

    NASA Astrophysics Data System (ADS)

    Cohan, Daniel S.; Boylan, James W.; Marmur, Amit; Khan, Maudood N.

    2007-10-01

    Air protection agencies in the United States increasingly confront non-attainment of air quality standards for multiple pollutants sharing interrelated emission origins. Traditional approaches to attainment planning face important limitations that are magnified in the multipollutant context. Recognizing those limitations, the Georgia Environmental Protection Division has adopted an integrated framework to address ozone, fine particulate matter, and regional haze in the state. Rather than applying atmospheric modeling merely as a final check of an overall strategy, photochemical sensitivity analysis is conducted upfront to compare the effectiveness of controlling various precursor emission species and source regions. Emerging software enables the modeling of health benefits and associated economic valuations resulting from air pollution control. Photochemical sensitivity and health benefits analyses, applied together with traditional cost and feasibility assessments, provide a more comprehensive characterization of the implications of various control options. The fuller characterization both informs the selection of control options and facilitates the communication of impacts to affected stakeholders and the public. Although the integrated framework represents a clear improvement over previous attainment-planning efforts, key remaining shortcomings are also discussed.
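
    In this context, the upfront sensitivity analysis compares first-order coefficients of the standard form (a textbook definition, not quoted from this record)

        S_j = \frac{\partial C}{\partial \varepsilon_j},

    where C is an ambient pollutant concentration and ε_j is a scaling factor on emissions from precursor species or source region j; ranking the S_j indicates which controls are most effective before a full strategy is modeled.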

  18. An integrated framework for multipollutant air quality management and its application in Georgia.

    PubMed

    Cohan, Daniel S; Boylan, James W; Marmur, Amit; Khan, Maudood N

    2007-10-01

    Air protection agencies in the United States increasingly confront non-attainment of air quality standards for multiple pollutants sharing interrelated emission origins. Traditional approaches to attainment planning face important limitations that are magnified in the multipollutant context. Recognizing those limitations, the Georgia Environmental Protection Division has adopted an integrated framework to address ozone, fine particulate matter, and regional haze in the state. Rather than applying atmospheric modeling merely as a final check of an overall strategy, photochemical sensitivity analysis is conducted upfront to compare the effectiveness of controlling various precursor emission species and source regions. Emerging software enables the modeling of health benefits and associated economic valuations resulting from air pollution control. Photochemical sensitivity and health benefits analyses, applied together with traditional cost and feasibility assessments, provide a more comprehensive characterization of the implications of various control options. The fuller characterization both informs the selection of control options and facilitates the communication of impacts to affected stakeholders and the public. Although the integrated framework represents a clear improvement over previous attainment-planning efforts, key remaining shortcomings are also discussed.

  19. A review of predictive nonlinear theories for multiscale modeling of heterogeneous materials

    NASA Astrophysics Data System (ADS)

    Matouš, Karel; Geers, Marc G. D.; Kouznetsova, Varvara G.; Gillman, Andrew

    2017-02-01

    Since the beginning of the industrial age, material performance and design have been in the midst of innovation of many disruptive technologies. Today's electronics, space, medical, transportation, and other industries are enriched by development, design and deployment of composite, heterogeneous and multifunctional materials. As a result, materials innovation is now considerably outpaced by other aspects from component design to product cycle. In this article, we review predictive nonlinear theories for multiscale modeling of heterogeneous materials. Deeper attention is given to multiscale modeling in space and to computational homogenization in addressing challenging materials science questions. Moreover, we discuss a state-of-the-art platform in predictive image-based, multiscale modeling with co-designed simulations and experiments that executes on the world's largest supercomputers. Such a modeling framework consists of experimental tools, computational methods, and digital data strategies. Once fully completed, this collaborative and interdisciplinary framework can be the basis of Virtual Materials Testing standards and aids in the development of new material formulations. Moreover, it will decrease the time to market of innovative products.
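
    At the core of the computational homogenization highlighted above are the standard volume-averaging relations (textbook definitions, not specific to this review), which link macroscopic stress and strain to their microscale counterparts over a representative volume V:

        \bar{\sigma} = \frac{1}{|V|} \int_V \sigma(x) \, \mathrm{d}V, \qquad
        \bar{\varepsilon} = \frac{1}{|V|} \int_V \varepsilon(x) \, \mathrm{d}V,

    supplemented by the Hill-Mandel condition, which requires the macroscopic stress power \bar{\sigma} : \dot{\bar{\varepsilon}} to equal the volume average of its microscale counterpart.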
