Sample records for model work standards

  1. Creating Better Child Care Jobs: Model Work Standards for Teaching Staff in Center-Based Child Care.

    ERIC Educational Resources Information Center

    Center for the Child Care Workforce, Washington, DC.

    This document presents model work standards articulating components of the child care center-based work environment that enable teachers to do their jobs well. These standards establish criteria to assess child care work environments and identify areas to improve in order to assure good jobs for adults and good care for children. The standards are…

  2. Creating Better School-Age Care Jobs: Model Work Standards.

    ERIC Educational Resources Information Center

    Haack, Peggy

    Built on the premise that good school-age care jobs are the cornerstone of high-quality services for school-age youth and their families, this guide presents model work standards for school-age care providers. The guide begins with a description of the strengths and challenges of the school-age care profession. The model work standards are…

  3. Formulation of consumables management models. Development approach for the mission planning processor working model

    NASA Technical Reports Server (NTRS)

    Connelly, L. C.

    1977-01-01

    The mission planning processor is a user oriented tool for consumables management and is part of the total consumables subsystem management concept. The approach to be used in developing a working model of the mission planning processor is documented. The approach includes top-down design, structured programming techniques, and application of NASA approved software development standards. This development approach: (1) promotes cost effective software development, (2) enhances the quality and reliability of the working model, (3) encourages the sharing of the working model through a standard approach, and (4) promotes portability of the working model to other computer systems.

  4. Statewide and District Professional Development in Standards: Addressing Teacher Equity. Models of Inservice. National Writing Project at Work

    ERIC Educational Resources Information Center

    Koch, Richard; Roop, Laura; Setter, Gail

    2006-01-01

    The National Writing Project (NWP) at Work monograph series documents how the National Writing Project model is implemented and developed at local sites across the country. These monographs describe NWP work, which is often shared informally or in workshops. Richard Koch and Laura Roop present a model of standards-based professional development…

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kolda, Christopher

    In this talk, I review recent work on using a generalization of the Next-to-Minimal Supersymmetric Standard Model (NMSSM), called the Singlet-extended Minimal Supersymmetric Standard Model (SMSSM), to raise the mass of the Standard Model-like Higgs boson without requiring extremely heavy top squarks or large stop mixing. In so doing, this model solves the little hierarchy problem of the minimal model (MSSM), at the expense of leaving the μ-problem of the MSSM unresolved. This talk is based on work published in Refs. [1, 2, 3].

  6. Taming Many-Parameter BSM Models with Bayesian Neural Networks

    NASA Astrophysics Data System (ADS)

    Kuchera, M. P.; Karbo, A.; Prosper, H. B.; Sanchez, A.; Taylor, J. Z.

    2017-09-01

    The search for physics Beyond the Standard Model (BSM) is a major focus of large-scale high energy physics experiments. One method is to look for specific deviations from the Standard Model that are predicted by BSM models. In cases where the model has a large number of free parameters, standard search methods become intractable due to computation time. This talk presents results using Bayesian Neural Networks, a supervised machine learning method, to enable the study of higher-dimensional models. The popular phenomenological Minimal Supersymmetric Standard Model was studied as an example of the feasibility and usefulness of this method. Graphics Processing Units (GPUs) are used to expedite the calculations. Cross-section predictions for 13 TeV proton collisions will be presented. My participation in the Conference Experience for Undergraduates (CEU) in 2004-2006 exposed me to the national and global significance of cutting-edge research. At the 2005 CEU, I presented work from the previous summer's SULI internship at Lawrence Berkeley Laboratory, where I learned to program while working on the Majorana Project. That work inspired me to follow a similar research path, which led me to my current work on computational methods applied to BSM physics.

  7. Wisconsin's Model Academic Standards for Agricultural Education. Bulletin No. 9003.

    ERIC Educational Resources Information Center

    Fortier, John D.; Albrecht, Bryan D.; Grady, Susan M.; Gagnon, Dean P.; Wendt, Sharon W.

    These model academic standards for agricultural education in Wisconsin represent the work of a task force of educators, parents, and business people with input from the public. The introductory section of this bulletin defines the academic standards and discusses developing the standards, using the standards, relating the standards to all…

  8. Preschool Literacy and the Common Core: A Professional Development Model

    ERIC Educational Resources Information Center

    Wake, Donna G.; Benson, Tammy Rachelle

    2016-01-01

    Many states have adopted the Common Core Standards for literacy and math and have begun enacting these standards in school curriculum. In states where these standards have been adopted, professional educators working in K-12 contexts have been working to create transition plans from existing state-based standards to the Common Core standards. A…

  9. Standardization efforts of digital pathology in Europe.

    PubMed

    Rojo, Marcial García; Daniel, Christel; Schrader, Thomas

    2012-01-01

    EURO-TELEPATH is a European COST Action IC0604. It started in 2007 and will end in November 2011. Its main objectives are evaluating and validating the common technological framework and communication standards required to access, transmit, and manage digital medical records by pathologists and other medical specialties in a networked environment. Working Group 1, "Business Modelling in Pathology," has designed main pathology processes - Frozen Study, Formalin Fixed Specimen Study, Telepathology, Cytology, and Autopsy - using Business Process Modelling Notation (BPMN). Working Group 2 has been dedicated to promoting the application of informatics standards in pathology, collaborating with Integrating Healthcare Enterprise (IHE), Digital Imaging and Communications in Medicine (DICOM), Health Level Seven (HL7), and other standardization bodies. Health terminology standardization research has become a topic of great interest. Future research work should focus on standardizing automatic image analysis and tissue microarrays imaging.

  10. Multilevel Linkages between State Standards, Teacher Standards, and Student Achievement: Testing External versus Internal Standards-Based Education Models

    ERIC Educational Resources Information Center

    Lee, Jaekyung; Liu, Xiaoyan; Amo, Laura Casey; Wang, Weichun Leilani

    2014-01-01

    Drawing on national and state assessment datasets in reading and math, this study tested "external" versus "internal" standards-based education models. The goal was to understand whether and how student performance standards work in multilayered school systems under No Child Left Behind Act of 2001 (NCLB). Under the…

  11. Wisconsin Model Early Learning Standards Alignment with Wisconsin Common Core State Standards for English Language Arts and Mathematics

    ERIC Educational Resources Information Center

    Wisconsin Department of Public Instruction, 2011

    2011-01-01

    Wisconsin's adoption of the Common Core State Standards provides an excellent opportunity for Wisconsin school districts and communities to define expectations from birth through preparation for college and work. By aligning the existing Wisconsin Model Early Learning Standards with the Wisconsin Common Core State Standards, expectations can be…

  12. The HPS experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Colaneri, Luca

    2017-04-01

    With the experimental discovery of the Higgs boson, the Standard Model has been considered verified in all its predictions. The Standard Model, though, is still considered an incomplete theory, because it fails to address many theoretical and phenomenological issues. Among those, it does not provide any viable Dark Matter candidate. Many Beyond-Standard-Model theories, such as the Supersymmetric Standard Model, provide possible solutions. In this work we report the experimental observations that led to considering the existence of a new force, mediated by a new massive vector boson, that could address all the observed phenomenology. This new dark force could open an observational channel between the Standard Model and a new Dark Sector, conveyed by the interaction of the Standard Model photon with the massive dark photon, also called the A'. The purpose of this work was to develop an independent study of the background processes and to implement an independent event generator, to better understand the kinematics of the particles produced in the process e- + W → e- + W' + e+ + e- and to validate, or invalidate, the official event generator.

  13. A proposed application programming interface for a physical volume repository

    NASA Technical Reports Server (NTRS)

    Jones, Merritt; Williams, Joel; Wrenn, Richard

    1996-01-01

    The IEEE Storage System Standards Working Group (SSSWG) has developed the Reference Model for Open Storage Systems Interconnection, Mass Storage System Reference Model Version 5. This document provides the framework for a series of standards for application and user interfaces to open storage systems. More recently, the SSSWG has been developing Application Programming Interfaces (APIs) for the individual components defined by the model. The API for the Physical Volume Repository is the most fully developed, but work is also being done on APIs for the Physical Volume Library and the Mover. The SSSWG meets every other month, and meetings are open to all interested parties. The Physical Volume Repository (PVR) is responsible for managing the storage of removable media cartridges and for mounting and dismounting these cartridges onto drives. This document describes a model which defines a Physical Volume Repository and gives a brief summary of the Application Programming Interface (API) which the IEEE Storage Systems Standards Working Group (SSSWG) is proposing as the standard interface for the PVR.

  14. DICOM static and dynamic representation through unified modeling language

    NASA Astrophysics Data System (ADS)

    Martinez-Martinez, Alfonso; Jimenez-Alaniz, Juan R.; Gonzalez-Marquez, A.; Chavez-Avelar, N.

    2004-04-01

    The DICOM standard, like all standards, specifies in a generic way the management of digital medical images and their related information in network and storage-media environments. However, understanding the specifications for a particular implementation is not trivial work. This work is therefore about understanding and modelling parts of the DICOM standard using object-oriented methodologies, as part of software development processes. This has offered different static and dynamic views, in accordance with the standard specifications, and the resulting models have been represented through the Unified Modelling Language (UML). The modelled parts relate to the network conformance claim: Network Communication Support for Message Exchange, Message Exchange, Information Object Definitions, Service Class Specifications, Data Structures and Encoding, and Data Dictionary. The resulting models have given a better understanding of the DICOM parts and have opened the possibility of creating a software library to develop DICOM-conformant PACS applications.

  15. Non-standard models and the sociology of cosmology

    NASA Astrophysics Data System (ADS)

    López-Corredoira, Martín

    2014-05-01

    I review some theoretical ideas in cosmology different from the standard "Big Bang": the quasi-steady state model, the plasma cosmology model, non-cosmological redshifts, alternatives to non-baryonic dark matter and/or dark energy, and others. Cosmologists do not usually work within the framework of alternative cosmologies because they feel that these are not at present as competitive as the standard model. Certainly, they are not so developed, and they are not so developed because cosmologists do not work on them. It is a vicious circle. The fact that most cosmologists do not pay them any attention and only dedicate their research time to the standard model is to a great extent due to a sociological phenomenon (the "snowball effect" or "groupthink"). We might well wonder whether cosmology, our knowledge of the Universe as a whole, is a science like other fields of physics or a predominant ideology.

  16. Improving the Interoperability of Disaster Models: a Case Study of Proposing Fireml for Forest Fire Model

    NASA Astrophysics Data System (ADS)

    Jiang, W.; Wang, F.; Meng, Q.; Li, Z.; Liu, B.; Zheng, X.

    2018-04-01

    This paper presents a new standardized data format named Fire Markup Language (FireML), extending the Geography Markup Language (GML) of OGC, to elaborate the fire hazard model. The proposed FireML standardizes the input and output documents of a fire model so that it can communicate effectively with different disaster management systems and ensure good interoperability. To demonstrate the usage of FireML and test its feasibility, a forest fire spread model adapted to be compatible with FireML is described, and a 3D GIS disaster management system is developed to simulate the dynamic procedure of forest fire spread with the defined FireML documents. The proposed approach may inform the standardization of other disaster models.

  17. Lattice Gauge Theories Within and Beyond the Standard Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gelzer, Zechariah John

    The Standard Model of particle physics has been very successful in describing fundamental interactions up to the highest energies currently probed in particle accelerator experiments. However, the Standard Model is incomplete and currently exhibits tension with experimental data for interactions involving $B$~mesons. Consequently, $B$-meson physics is of great interest to both experimentalists and theorists. Experimentalists worldwide are studying the decay and mixing processes of $B$~mesons in particle accelerators. Theorists are working to understand the data by employing lattice gauge theories within and beyond the Standard Model. This work addresses the theoretical effort and is divided into two main parts. In the first part, I present a lattice-QCD calculation of form factors for exclusive semileptonic decays of $B$~mesons that are mediated by charged currents ($B \to \pi \ell \nu$)…

  18. The Standard Model from LHC to future colliders.

    PubMed

    Forte, S; Nisati, A; Passarino, G; Tenchini, R; Calame, C M Carloni; Chiesa, M; Cobal, M; Corcella, G; Degrassi, G; Ferrera, G; Magnea, L; Maltoni, F; Montagna, G; Nason, P; Nicrosini, O; Oleari, C; Piccinini, F; Riva, F; Vicini, A

    This review summarizes the results of the activities which took place in 2014 within the Standard Model Working Group of the "What Next" Workshop organized by INFN, Italy. We present a framework, general questions, and some indications of possible answers on the main issues for Standard Model physics in the LHC era and in view of possible future accelerators.

  19. SGML and Related Standards: New Directions as the Second Decade Begins.

    ERIC Educational Resources Information Center

    Mason, James David

    1997-01-01

    Highlights the activities of Working Group 8 (WG8) of ISO (the International Organization for Standardization) in the alignment of standards for a common tree model and common query languages. Examines how Document Style Semantics and Specification Language (DSSSL) and HyTime make documents easier to work with and more powerful in their ability to…

  20. Profile-Likelihood Approach for Estimating Generalized Linear Mixed Models with Factor Structures

    ERIC Educational Resources Information Center

    Jeon, Minjeong; Rabe-Hesketh, Sophia

    2012-01-01

    In this article, the authors suggest a profile-likelihood approach for estimating complex models by maximum likelihood (ML) using standard software and minimal programming. The method works whenever setting some of the parameters of the model to known constants turns the model into a standard model. An important class of models that can be…
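The idea in this record can be illustrated with a minimal, hypothetical sketch (not the authors' implementation): fix one parameter at a constant, fit the resulting "standard" model with off-the-shelf tools, then scan the constant over a grid and take the maximum of the resulting profile log-likelihood.

```python
import numpy as np

# Toy illustration of profile likelihood: jointly estimate the mean and
# standard deviation of a normal sample by profiling out sigma.  For any
# fixed sigma the reduced model is "standard": the ML estimate of mu is
# simply the sample mean.
rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=1.5, size=500)

def profile_loglik(sigma, x):
    """Log-likelihood maximized over mu, for a fixed sigma."""
    mu_hat = x.mean()  # inner ML fit of the reduced (standard) model
    n = x.size
    return (-n / 2 * np.log(2 * np.pi * sigma**2)
            - np.sum((x - mu_hat) ** 2) / (2 * sigma**2))

# Outer step: scan the fixed parameter and take the profile maximum.
sigmas = np.linspace(0.5, 3.0, 251)
ll = np.array([profile_loglik(s, data) for s in sigmas])
sigma_hat = sigmas[ll.argmax()]
print(sigma_hat, data.mean())
```

Here the inner fit is trivial; the point of the approach described in the record is that the same outer loop works when the inner fit is a standard mixed-model fit performed by existing software.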

  1. A Sandbox Environment for the Community Sensor Model Standard

    NASA Astrophysics Data System (ADS)

    Hare, T. M.; Laura, J. R.; Humpreys, I. R.; Wilson, T. J.; Hahn, M. A.; Shepherd, M. R.; Sides, S. C.

    2017-06-01

    Here we present ongoing work that Astrogeology is undertaking to provide a programming sandbox environment for the Community Sensor Model standard. We define a sandbox as a testing environment that allows programmers to experiment.

  2. Developing Global Standards Framework and Quality Integrated Models for Cooperative and Work-Integrated Education Programs

    ERIC Educational Resources Information Center

    Khampirat, Buratin; McRae, Norah

    2016-01-01

    Cooperative and Work-integrated Education (CWIE) programs have been widely accepted as educational programs that can effectively connect what students are learning to the world of work through placements. Because a global quality standards framework could be a very valuable resource and guide to establishing, developing, and accrediting quality…

  3. Categorical Working Memory Representations are used in Delayed Estimation of Continuous Colors

    PubMed Central

    Hardman, Kyle O; Vergauwe, Evie; Ricker, Timothy J

    2016-01-01

    In the last decade, major strides have been made in understanding visual working memory through mathematical modeling of color production responses. In the delayed color estimation task (Wilken & Ma, 2004), participants are given a set of colored squares to remember and a few seconds later asked to reproduce those colors by clicking on a color wheel. The degree of error in these responses is characterized with mathematical models that estimate working memory precision and the proportion of items remembered by participants. A standard mathematical model of color memory assumes that items maintained in memory are remembered through memory for precise details about the particular studied shade of color. We contend that this model is incomplete in its present form because no mechanism is provided for remembering the coarse category of a studied color. In the present work we remedy this omission and present a model of visual working memory that includes both continuous and categorical memory representations. In two experiments we show that our new model outperforms this standard modeling approach, which demonstrates that categorical representations should be accounted for by mathematical models of visual working memory. PMID:27797548

  4. Categorical working memory representations are used in delayed estimation of continuous colors.

    PubMed

    Hardman, Kyle O; Vergauwe, Evie; Ricker, Timothy J

    2017-01-01

    In the last decade, major strides have been made in understanding visual working memory through mathematical modeling of color production responses. In the delayed color estimation task (Wilken & Ma, 2004), participants are given a set of colored squares to remember, and a few seconds later asked to reproduce those colors by clicking on a color wheel. The degree of error in these responses is characterized with mathematical models that estimate working memory precision and the proportion of items remembered by participants. A standard mathematical model of color memory assumes that items maintained in memory are remembered through memory for precise details about the particular studied shade of color. We contend that this model is incomplete in its present form because no mechanism is provided for remembering the coarse category of a studied color. In the present work, we remedy this omission and present a model of visual working memory that includes both continuous and categorical memory representations. In 2 experiments, we show that our new model outperforms this standard modeling approach, which demonstrates that categorical representations should be accounted for by mathematical models of visual working memory. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
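The "standard model" referred to in the two records above is typically a mixture of a circular (von Mises) memory distribution and a uniform guessing component, in the spirit of Zhang and Luck (2008). The sketch below, with assumed parameter values, shows how such a mixture assigns likelihood to response errors and how the memory proportion can be recovered by maximum likelihood; it illustrates the general modeling approach, not the authors' extended categorical model.

```python
import numpy as np
from scipy.stats import vonmises

def mixture_pdf(error, p_mem, kappa):
    """Density of a response error (radians) under the standard
    continuous-report model: with probability p_mem the item is in
    memory (von Mises error around 0 with precision kappa); otherwise
    the response is a uniform guess on the color wheel."""
    return p_mem * vonmises.pdf(error, kappa) + (1 - p_mem) / (2 * np.pi)

# Simulate responses from the model, then recover the memory proportion
# by maximum likelihood on a grid (kappa held fixed for brevity).
rng = np.random.default_rng(1)
n, true_p, true_kappa = 2000, 0.7, 8.0
in_mem = rng.random(n) < true_p
errors = np.where(in_mem,
                  vonmises.rvs(true_kappa, size=n, random_state=rng),
                  rng.uniform(-np.pi, np.pi, size=n))

grid = np.linspace(0.05, 0.95, 181)
ll = [np.log(mixture_pdf(errors, p, true_kappa)).sum() for p in grid]
p_hat = grid[int(np.argmax(ll))]
print(p_hat)
```

The categorical extension proposed in these records adds a third response basis (memory for the coarse color category) alongside the continuous and guessing components.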

  5. The MP (Materialization Pattern) Model for Representing Math Educational Standards

    NASA Astrophysics Data System (ADS)

    Choi, Namyoun; Song, Il-Yeol; An, Yuan

    Representing natural languages with UML has been an important research issue for various reasons. Little work has been done on modeling imperative-mood sentences, which are the sentence structure of math educational standard statements. In this paper, we propose the MP (Materialization Pattern) model, which captures the semantics of English sentences used in math educational standards. The MP model is based on Reed-Kellogg sentence diagrams and creates MP schemas with UML notation. The MP model explicitly represents the semantics of the sentences by extracting math concepts and the cognitive processes of math concepts from math educational standard statements, and it simplifies modeling. The MP model is also developed to be used for aligning math educational standard statements via schema matching.

  6. [Monitoring of occupational activities under the risk of heat stress: use of mathematical models in the prediction of physiological parameters].

    PubMed

    Terzi, R; Catenacci, G; Marcaletti, G

    1985-01-01

    Some authors have proposed mathematical models that, starting from standardized environmental microclimate parameters, the thermal impedance of clothing, and energy expenditure, predict variations in body temperature and heart rate relative to basal values in subjects in the same environment. In the present work, we assess the usefulness of these models when applied to standardized work tasks performed under unfavourable thermal conditions. In subjects working in an electric power station, body temperature and heart rate were recorded and compared with the values predicted by the models under study. The results are discussed with a view to practical use.

  7. Improving automation standards via semantic modelling: Application to ISA88.

    PubMed

    Dombayci, Canan; Farreres, Javier; Rodríguez, Horacio; Espuña, Antonio; Graells, Moisès

    2017-03-01

    Standardization is essential for automation. Extensibility, scalability, and reusability are important features for automation software and rely on efficient modelling of the addressed systems. The work presented here comes from the ongoing development of a methodology for semi-automatic ontology construction from technical documents. The main aim of this work is to systematically check the consistency of technical documents and to support the improvement of that consistency. The formalization of conceptual models and the subsequent writing of technical standards are analyzed together, and guidelines are proposed for application to future technical standards. Three paradigms are discussed for the development of domain ontologies from technical documents, starting from the current state of the art, continuing with the intermediate method presented and used in this paper, and ending with the suggested paradigm for the future. The ISA88 Standard is taken as a representative case study. Linguistic techniques from the semi-automatic ontology construction methodology are applied to the ISA88 Standard, and different modelling and standardization aspects that are worth sharing with the automation community are addressed. This study discusses different paradigms for developing and sharing conceptual models for the subsequent development of automation software, along with presenting the systematic consistency-checking method. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  8. Les Houches ''Physics at TeV Colliders 2003'' Beyond the Standard Model Working Group: Summary Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allanach, B

    2004-03-01

    The work contained herein constitutes a report of the "Beyond the Standard Model" working group for the Workshop "Physics at TeV Colliders", Les Houches, France, 26 May-6 June, 2003. The research presented is original, and was performed specifically for the workshop. Tools for calculations in the minimal supersymmetric standard model are presented, including a comparison of the dark matter relic density predicted by public codes. Reconstruction of supersymmetric particle masses at the LHC and a future linear collider facility is examined. Less orthodox supersymmetric signals such as non-pointing photons and R-parity violating signals are studied. Features of extra-dimensional models are examined next, including measurement strategies for radions and Higgs bosons, as well as the virtual effects of Kaluza-Klein modes of gluons. Finally, there is an update on LHC Z' studies.

  9. Organizing Community-Based Data Standards: Lessons from Developing a Successful Open Standard in Systems Biology

    NASA Astrophysics Data System (ADS)

    Hucka, M.

    2015-09-01

    In common with many fields, including astronomy, a vast number of software tools for computational modeling and simulation are available today in systems biology. This wealth of resources is a boon to researchers, but it also presents interoperability problems. Despite working with different software tools, researchers want to disseminate their work widely as well as reuse and extend the models of other researchers. This situation led in the year 2000 to an effort to create a tool-independent, machine-readable file format for representing models: SBML, the Systems Biology Markup Language. SBML has since become the de facto standard for its purpose. Its success and general approach has inspired and influenced other community-oriented standardization efforts in systems biology. Open standards are essential for the progress of science in all fields, but it is often difficult for academic researchers to organize successful community-based standards. I draw on personal experiences from the development of SBML and summarize some of the lessons learned, in the hope that this may be useful to other groups seeking to develop open standards in a community-oriented fashion.

  10. Building Information Modeling (BIM): A Road Map for Implementation to Support MILCON Transformation and Civil Works Projects within the U.S. Army Corps of Engineers

    DTIC Science & Technology

    2006-10-01

    This report (ERDC TR-06-10) presents a road map for implementing Building Information Modeling (BIM) to support MILCON Transformation and Civil Works projects within the U.S. Army Corps of Engineers. Not only will design and construction benefit from BIM, but the data that can be gleaned from the BIM model will also feed many systems and users. Stated implementation goals include deliverables compliant with the National BIM Standard (NBIMS), the eight Centers of Standardization (COS) productive in BIM by 2008, and all districts productive in NBIMS.

  11. [Implementation and validation in the Italian context of the HSE management standards: a contribution to provide a practical model for the assessment of work-related stress].

    PubMed

    Iavicoli, S; Natali, E; Rondinone, B M; Castaldi, T; Persechino, B

    2010-01-01

    In recent years, stress has been recognized as a potential work-related risk factor. Unfortunately, work-related stress is a very delicate subject, especially because it is difficult to assess objectively and in broadly acceptable terms. In fact, work-related stress is a subjective personal response to a specific work environment, and is of multifactorial origin. In order to provide a practical tool for the assessment of work-related stress, the authors carried out a thorough benchmarking analysis of the various models for managing work-stress problems adopted by EU countries. As a result, the authors chose to apply and implement the Health and Safety Executive (HSE) Management Standards approach in the Italian context. In compliance with the European Framework Agreement signed on October 8, 2004, the HSE Management Standards call for the coordinated and integrated involvement of workers and safety personnel, and represent a valid assessment approach based on principles widely acknowledged in the scientific literature.

  12. The COST Action IC0604 "Telepathology Network in Europe" (EURO-TELEPATH).

    PubMed

    García-Rojo, Marcial; Gonçalves, Luís; Blobel, Bernd

    2012-01-01

    The COST Action IC0604 "Telepathology Network in Europe" (EURO-TELEPATH) is a European COST Action that ran from 2007 to 2011. COST Actions are funded by the COST (European Cooperation in the field of Scientific and Technical Research) Agency, supported by the Seventh Framework Programme for Research and Technological Development (FP7) of the European Union. EURO-TELEPATH's main objectives were evaluating and validating the common technological framework and communication standards required to access, transmit and manage digital medical records by pathologists and other medical professionals in a networked environment. The project was organized in four working groups. Working Group 1, "Business modeling in pathology," designed the main pathology processes - Frozen Study, Formalin Fixed Specimen Study, Telepathology, Cytology, and Autopsy - using Business Process Modeling Notation (BPMN). Working Group 2, "Informatics standards in pathology," was dedicated to promoting the development and application of informatics standards in pathology, collaborating with Integrating the Healthcare Enterprise (IHE), Digital Imaging and Communications in Medicine (DICOM), Health Level Seven (HL7), and other standardization bodies. Working Group 3, "Images: Analysis, Processing, Retrieval and Management," worked on the use of virtual or digital slides, which are fostering the use of image processing and analysis in pathology not only for research purposes but also in daily practice. Working Group 4, "Technology and Automation in Pathology," focused on studying the adequacy of existing technical solutions, including, for example, the quality of images obtained by slide scanners and the efficiency of image analysis applications.
    Major outcomes of this action are the collaboration with international health informatics standardization bodies to foster the development of standards for digital pathology, offering a new approach to workflow analysis based on business process modeling. Health terminology standardization research has become a topic of high interest. Future research work should focus on standardization of automatic image analysis and tissue microarrays imaging.

  13. Leaning on Mathematical Habits of Mind

    ERIC Educational Resources Information Center

    Sword, Sarah; Matsuura, Ryota; Cuoco, Al; Kang, Jane; Gates, Miriam

    2018-01-01

    Mathematical modeling has taken on increasing curricular importance in the past decade due in no small measure to the Common Core State Standards in Mathematics (CCSSM) identifying modeling as one of the Standards for Mathematical Practice (SMP 4, CCSSI 2010, p. 7). Although researchers have worked on mathematical modeling (Lesh and Doerr 2003;…

  14. Leading the Charge in Changing Times: 21st Century Learning and Leading

    ERIC Educational Resources Information Center

    Jones, Amanda Criswell

    2016-01-01

    Throughout history, educational practices have typically been modeled after economic work practices. During the agrarian age, educational practices modeled agrarian practices. Likewise, in the industrial age, education became standardized and was modeled after industrial practices to prepare students for work in factories and industrial settings.…

  15. Quality specifications in postgraduate medical e-learning: an integrative literature review leading to a postgraduate medical e-learning model.

    PubMed

    De Leeuw, R A; Westerman, Michiel; Nelson, E; Ket, J C F; Scheele, F

    2016-07-08

    E-learning is driving major shifts in medical education. Prioritizing learning theories and quality models improves the success of e-learning programs. Although many e-learning quality standards are available, few are focused on postgraduate medical education. We conducted an integrative review of the current postgraduate medical e-learning literature to identify quality specifications. The literature was thematically organized into a working model. Unique quality specifications (n = 72) were consolidated and re-organized into a six-domain model that we called the Postgraduate Medical E-learning Model (Postgraduate ME Model). This model was partially based on the ISO-19796 standard, and drew on cognitive load multimedia principles. The domains of the model are preparation, software design and system specifications, communication, content, assessment, and maintenance. This review clarified the current state of postgraduate medical e-learning standards and specifications. It also synthesized these specifications into a single working model. To validate our findings, next steps include testing the Postgraduate ME Model in controlled e-learning settings.

  16. Associations among job demands and resources, work engagement, and psychological distress: fixed-effects model analysis in Japan.

    PubMed

    Oshio, Takashi; Inoue, Akiomi; Tsutsumi, Akizumi

    2018-05-25

    We examined the associations among job demands and resources, work engagement, and psychological distress, adjusted for time-invariant individual attributes. We used data from a Japanese occupational cohort survey, which included 18,702 observations of 7,843 individuals. We investigated how work engagement, measured by the Utrecht Work Engagement Scale, was associated with key aspects of job demands and resources, using fixed-effects regression models. We further estimated the fixed-effects models to assess how work engagement moderated the association between each job characteristic and psychological distress as measured by Kessler 6 scores. The fixed-effects models showed that work engagement was positively associated with job resources, as it was in pooled cross-sectional and prospective cohort models. Specifically, the standardized regression coefficients (β) were 0.148 and 0.120 for extrinsic reward and decision latitude, respectively, compared to -0.159 and 0.020 for role ambiguity and workload and time pressure, respectively (p < 0.001 for all associations). Work engagement modestly moderated the associations of psychological distress with workload and time pressure and extrinsic reward; a one-standard-deviation increase in work engagement moderated their associations by 19.2% (p < 0.001) and 11.3% (p = 0.034), respectively. Work engagement was associated with job demands and resources, which is in line with the theoretical prediction of the job demands-resources model, even after controlling for time-invariant individual attributes. Work engagement moderated the association between selected aspects of job demands and resources and psychological distress.
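    The within-person logic of the fixed-effects models described above can be sketched in a few lines. The data below are synthetic and the variable names (`job_resources`, `engagement`, `ability`) are illustrative stand-ins, not the study's measures; the point is only that demeaning within each individual removes time-invariant attributes:

```python
import numpy as np

# Synthetic panel: 200 individuals observed over 3 waves.
rng = np.random.default_rng(0)
n_people, n_waves = 200, 3
person = np.repeat(np.arange(n_people), n_waves)

# Unobserved, time-invariant attribute that confounds a pooled regression.
ability = rng.normal(size=n_people)[person]
job_resources = rng.normal(size=n_people * n_waves) + 0.5 * ability
engagement = (0.3 * job_resources + 1.0 * ability
              + rng.normal(scale=0.5, size=n_people * n_waves))

def within_transform(v, groups):
    """Subtract each group's mean (the fixed-effects transformation)."""
    means = np.bincount(groups, weights=v) / np.bincount(groups)
    return v - means[groups]

x = within_transform(job_resources, person)
y = within_transform(engagement, person)

# OLS on demeaned data recovers the within-person slope (~0.3), purged of
# the upward bias that `ability` induces in the pooled estimate.
beta_fe = (x @ y) / (x @ x)
beta_pooled = (job_resources @ engagement) / (job_resources @ job_resources)
```

    In practice one would use a panel-regression package rather than hand-rolled demeaning, but the estimator is the same.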

  17. Travaux Neuchatelois de Linguistique (TRANEL) (Neuchatel Working Papers in Linguistics), Volume 14.

    ERIC Educational Resources Information Center

    Py, Bernard, Ed.; Rubattel, Christian, Ed.

    1989-01-01

    Three papers in linguistics, all in French, are presented. "La delocutivite lexicale en francais standard: esquisse d'un modele derivationnel" ("Lexical Delocutivity in Standard French: Sketch of a Derivational Model"), by Marc Bonhomme, examines the process by which certain expressions become neologisms. "La terminologie…

  18. Feminist Policy Analysis: Expanding Traditional Social Work Methods

    ERIC Educational Resources Information Center

    Kanenberg, Heather

    2013-01-01

    In an effort to move the methodology of policy analysis beyond the traditional and artificial position of being objective and value-free, this article is a call to those working and teaching in social work to consider a feminist policy analysis lens. A review of standard policy analysis models is presented alongside feminist models. Such a…

  19. Integrating Computer Content into Social Work Curricula: A Model for Planning

    ERIC Educational Resources Information Center

    Beaulaurier, Richard L.

    2005-01-01

    While recent CSWE standards focus on the need for including more relevant technological content in social work curricula, they do not offer guidance regarding how it is to be assessed and selected. Social work educators are in need of an analytic model of computerization to help them understand which technologies are most appropriate and relevant…

  20. Increasing EHR system usability through standards: Conformance criteria in the HL7 EHR-system functional model.

    PubMed

    Meehan, Rebecca A; Mon, Donald T; Kelly, Kandace M; Rocca, Mitra; Dickinson, Gary; Ritter, John; Johnson, Constance M

    2016-10-01

    Though substantial work has been done on the usability of health information technology, improvements in electronic health record system (EHR) usability have been slow, creating frustration, distrust of EHRs and the use of potentially unsafe work-arounds. Usability standards could be part of the solution for improving EHR usability. EHR system functional requirements and standards have been used successfully in the past to specify system behavior, the criteria of which have been gradually implemented in EHR systems through certification programs and other national health IT strategies. Similarly, functional requirements and standards for usability can help address the multitude of sequelae associated with poor usability. This paper describes the evidence-based functional requirements for usability contained in the Health Level Seven (HL7) EHR System Functional Model, and the benefits of open and voluntary EHR system usability standards. Copyright © 2016 Elsevier Inc. All rights reserved.

  1. A Standard-Based Model for Adaptive E-Learning Platform for Mauritian Academic Institutions

    ERIC Educational Resources Information Center

    Kanaksabee, P.; Odit, M. P.; Ramdoyal, A.

    2011-01-01

    The key aim of this paper is to introduce a standard-based model for an adaptive e-learning platform for Mauritian academic institutions and to investigate the conditions and tools required to implement this model. The main strengths of the system are that it allows collaborative learning and communication among users, and considerably reduces paperwork.…

  2. Airside HVAC BESTEST: HVAC Air-Distribution System Model Test Cases for ASHRAE Standard 140

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Judkoff, Ronald; Neymark, Joel; Kennedy, Mike D.

    This paper summarizes recent work to develop new airside HVAC equipment model analytical verification test cases for ANSI/ASHRAE Standard 140, Standard Method of Test for the Evaluation of Building Energy Analysis Computer Programs. The analytical verification test method allows comparison of simulation results from a wide variety of building energy simulation programs with quasi-analytical solutions, further described below. Standard 140 is widely cited for evaluating software for use with performance-path energy efficiency analysis, in conjunction with well-known energy-efficiency standards including ASHRAE Standard 90.1, the International Energy Conservation Code, and other international standards. Airside HVAC equipment is a common area of modelling not previously explicitly tested by Standard 140. Integration of the completed test suite into Standard 140 is in progress.

  3. Implementing PAT with Standards

    NASA Astrophysics Data System (ADS)

    Chandramohan, Laakshmana Sabari; Doolla, Suryanarayana; Khaparde, S. A.

    2016-02-01

    Perform Achieve Trade (PAT) is a market-based incentive mechanism to promote energy efficiency. The purpose of this work is to address the challenges inherent to inconsistent representation of business processes, and interoperability issues, in PAT-like cap-and-trade mechanisms, especially when scaled. Studies by various agencies have highlighted that as the mechanism evolves, including more industrial sectors and industries in its ambit, implementation will become more challenging. This paper analyses the major needs of PAT (namely tracking, monitoring, auditing, and verifying energy-saving reports, and providing technical support and guidance to stakeholders) and how the aforesaid challenges affect them. Though current technologies can handle these challenges to an extent, standardization activities for PAT implementation have been scanty, and this work attempts to evolve them. The paper addresses inconsistent modification of business processes, rules, and procedures across stakeholders, and interoperability among heterogeneous systems. It proposes the adoption of two standards into PAT: Business Process Model and Notation, for maintaining consistency in business process modelling, and the Common Information Model (IEC 61970, 61968, and 62325 combined), for information exchange. Detailed architecture and organization of these adoptions are reported. The work can be used by PAT implementing agencies, stakeholders, and standardization bodies.

  4. Cost minimizing of cutting process for CNC thermal and water-jet machines

    NASA Astrophysics Data System (ADS)

    Tavaeva, Anastasia; Kurennov, Dmitry

    2015-11-01

    This paper deals with the optimization of the cutting process for CNC thermal and water-jet machines. The accuracy of calculating the objective function parameters for the optimization problem is investigated. The paper shows that the working tool path speed is not constant; it depends on several parameters described herein. Relations of the working tool path speed to the number of NC program frames, the length of straight cuts, and the part configuration are presented. Based on these results, correction coefficients for the working tool speed are defined. Additionally, the optimization problem may be solved using a mathematical model that takes into account the additional restrictions of thermal cutting (choice of piercing and output tool points, precedence conditions, thermal deformations). The second part of the paper considers non-standard cutting techniques, which may reduce cutting cost and time compared with standard techniques, and evaluates the effectiveness of their application. The paper closes by indicating future research directions.

  5. 2016 Standard Scenarios Report: A U.S. Electricity Sector Outlook

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cole, Wesley; Mai, Trieu; Logan, Jeffrey

    This is the webinar presentation deck used to present the 2016 Standard Scenarios work. It discusses the Annual Technology Baseline (ATB) detailed cost and performance projections for electricity-generating technologies and the standard scenarios of the power sector modeling using ATB inputs.

  6. Relationship between non-standard work arrangements and work-related accident absence in Belgium

    PubMed Central

    Alali, Hanan; Braeckman, Lutgart; Van Hecke, Tanja; De Clercq, Bart; Janssens, Heidi; Wahab, Magd Abdel

    2017-01-01

    Objectives: The main objective of this study is to examine the relationship between indicators of non-standard work arrangements, including precarious contract, long working hours, multiple jobs, shift work, and work-related accident absence, using a representative Belgian sample and considering several socio-demographic and work characteristics. Methods: This study was based on the data of the fifth European Working Conditions Survey (EWCS). For the analysis, the sample was restricted to 3343 respondents from Belgium who were all employed workers. The associations between non-standard work arrangements and work-related accident absence were studied with multivariate logistic regression modeling techniques while adjusting for several confounders. Results: During the last 12 months, about 11.7% of workers were absent from work because of a work-related accident. A multivariate regression model showed an increased injury risk for those performing shift work (OR 1.546, 95% CI 1.074-2.224). The relationship between contract type and occupational injuries was not significant (OR 1.163, 95% CI 0.739-1.831). Furthermore, no statistically significant differences were observed for those performing long working hours (OR 1.217, 95% CI 0.638-2.321) and those performing multiple jobs (OR 1.361, 95% CI 0.827-2.240) in relation to work-related accident absence. Those who rated their health as bad, low-educated workers, workers from the construction sector, and those exposed to biomechanical exposure (BM) were more frequent victims of work-related accident absence. No significant gender difference was observed. Conclusion: Indicators of non-standard work arrangements under this study, except shift work, were not significantly associated with work-related accident absence. To reduce the burden of occupational injuries, not only are risk reduction strategies and interventions needed, but policy efforts must also be undertaken to limit shift work.
In general, preventive measures and more training on the job are needed to ensure the safety and well-being of all workers. PMID:28111414
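    The odds ratios and 95% confidence intervals reported above come from exponentiating logistic-regression coefficients. A minimal sketch follows, in which `beta` and `se` are illustrative values reconstructed from the shift-work estimate (OR 1.546, 95% CI 1.074-2.224), not figures taken from the study's output:

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Return (OR, lower, upper) for a logit coefficient and its std. error.

    The CI is computed on the log-odds scale and exponentiated, which is
    why reported CIs around an OR are asymmetric.
    """
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Illustrative inputs: log of the reported OR and a back-solved std. error.
beta, se = math.log(1.546), 0.186
or_, lo, hi = odds_ratio_ci(beta, se)
```

    An effect is conventionally called non-significant at the 5% level when the resulting interval brackets 1.0, as with the contract-type and long-hours estimates above.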

  7. Relationship between non-standard work arrangements and work-related accident absence in Belgium.

    PubMed

    Alali, Hanan; Braeckman, Lutgart; Van Hecke, Tanja; De Clercq, Bart; Janssens, Heidi; Wahab, Magd Abdel

    2017-03-28

    The main objective of this study is to examine the relationship between indicators of non-standard work arrangements, including precarious contract, long working hours, multiple jobs, shift work, and work-related accident absence, using a representative Belgian sample and considering several socio-demographic and work characteristics. This study was based on the data of the fifth European Working Conditions Survey (EWCS). For the analysis, the sample was restricted to 3343 respondents from Belgium who were all employed workers. The associations between non-standard work arrangements and work-related accident absence were studied with multivariate logistic regression modeling techniques while adjusting for several confounders. During the last 12 months, about 11.7% of workers were absent from work because of a work-related accident. A multivariate regression model showed an increased injury risk for those performing shift work (OR 1.546, 95% CI 1.074-2.224). The relationship between contract type and occupational injuries was not significant (OR 1.163, 95% CI 0.739-1.831). Furthermore, no statistically significant differences were observed for those performing long working hours (OR 1.217, 95% CI 0.638-2.321) and those performing multiple jobs (OR 1.361, 95% CI 0.827-2.240) in relation to work-related accident absence. Those who rated their health as bad, low-educated workers, workers from the construction sector, and those exposed to biomechanical exposure (BM) were more frequent victims of work-related accident absence. No significant gender difference was observed. Indicators of non-standard work arrangements under this study, except shift work, were not significantly associated with work-related accident absence. To reduce the burden of occupational injuries, not only are risk reduction strategies and interventions needed, but policy efforts must also be undertaken to limit shift work.
In general, preventive measures and more training on the job are needed to ensure the safety and well-being of all workers.

  8. Experiencing Organizational Work Design: Beyond Hackman and Oldham

    ERIC Educational Resources Information Center

    Fornaciari, Charles J.; Dean, Kathy Lund

    2005-01-01

    Standard organizational behavior survey courses usually introduce students to the "nuts and bolts" of organizational work design and models that mechanize work. This article develops an experiential exercise that simulates working conditions that can foster greater student understanding of the affective, ethical, and human aspects of work design.…

  9. Searches for Physics Beyond the Standard Model and Triggering on Proton-Proton Collisions at 14 TEV LHC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wittich, Peter

    2011-10-14

    This document describes the work achieved under the OJI award received May 2008 by Peter Wittich as Principal Investigator. The proposal covers an experimental particle physics project searching for physics beyond the standard model at the Large Hadron Collider (LHC) at the European Organization for Nuclear Research.

  10. The Science of Standards-Based Education

    ERIC Educational Resources Information Center

    Smithson, John

    2017-01-01

    A standards-based model of reform has dominated public education for 30 years. Under the Every Student Succeeds Act (ESSA), it will continue to dominate education policy. Is that model working? State boards of education share an intrinsic interest in this question. While there are many ways to investigate it, one approach that shows promise treats…

  11. Comparing satisfaction and burnout between caseload and standard care midwives: findings from two cross-sectional surveys conducted in Victoria, Australia.

    PubMed

    Newton, Michelle S; McLachlan, Helen L; Willis, Karen F; Forster, Della A

    2014-12-24

    Caseload midwifery reduces childbirth interventions and increases women's satisfaction with care. It is therefore important to understand the impact of caseload midwifery on midwives working in and alongside the model. While some studies have reported higher satisfaction for caseload compared with standard care midwives, others have suggested a need to explore midwives' work-life balance as well as potential for stress and burnout. This study explored midwives' attitudes to their professional role, and also measured burnout in caseload midwives compared to standard care midwives at two sites in Victoria, Australia with newly introduced caseload midwifery models. All midwives providing maternity care at the study sites were sent questionnaires at the commencement of the caseload midwifery model and two years later. Data items included the Midwifery Process Questionnaire (MPQ) to examine midwives' attitude to their professional role, the Copenhagen Burnout Inventory (CBI) to measure burnout, and questions about midwives' views of caseload work. Data were pooled for the two sites and comparisons made between caseload and standard care midwives. The MPQ and CBI data were summarised as individual and group means. Twenty caseload midwives (88%) and 130 standard care midwives (41%) responded at baseline and 22 caseload midwives (95%) and 133 standard care midwives (45%) at two years. Caseload and standard care midwives were initially similar across all measures except client-related burnout, which was lower for caseload midwives (12.3 vs 22.4, p = 0.02). After two years, compared to midwives in standard care, caseload midwives had higher mean scores in professional satisfaction (1.08 vs 0.76, p = 0.01), professional support (1.06 vs 0.11, p <0.01) and client interaction (1.4 vs 0.09, p <0.01) and lower scores for personal burnout (35.7 vs 47.7, p < 0.01), work-related burnout (27.3 vs 42.7, p <0.01), and client-related burnout (11.3 vs 21.4, p < 0.01). 
Caseload midwifery was associated with lower burnout scores and higher professional satisfaction. Further research should focus on understanding the key features of the caseload model that are related to these outcomes to help build a picture of what is required to ensure the long-term sustainability of the model.

  12. Associations among job demands and resources, work engagement, and psychological distress: fixed-effects model analysis in Japan

    PubMed Central

    Oshio, Takashi; Inoue, Akiomi

    2018-01-01

    Objectives: We examined the associations among job demands and resources, work engagement, and psychological distress, adjusted for time-invariant individual attributes. Methods: We used data from a Japanese occupational cohort survey, which included 18,702 observations of 7,843 individuals. We investigated how work engagement, measured by the Utrecht Work Engagement Scale, was associated with key aspects of job demands and resources, using fixed-effects regression models. We further estimated the fixed-effects models to assess how work engagement moderated the association between each job characteristic and psychological distress as measured by Kessler 6 scores. Results: The fixed-effects models showed that work engagement was positively associated with job resources, as it was in pooled cross-sectional and prospective cohort models. Specifically, the standardized regression coefficients (β) were 0.148 and 0.120 for extrinsic reward and decision latitude, respectively, compared to -0.159 and 0.020 for role ambiguity and workload and time pressure, respectively (p < 0.001 for all associations). Work engagement modestly moderated the associations of psychological distress with workload and time pressure and extrinsic reward; a one-standard-deviation increase in work engagement moderated their associations by 19.2% (p < 0.001) and 11.3% (p = 0.034), respectively. Conclusions: Work engagement was associated with job demands and resources, which is in line with the theoretical prediction of the job demands-resources model, even after controlling for time-invariant individual attributes. Work engagement moderated the association between selected aspects of job demands and resources and psychological distress. PMID:29563368

  13. International Standardization of the Clinical Dosimetry of Beta Radiation Brachytherapy Sources: Progress of an ISO Standard

    NASA Astrophysics Data System (ADS)

    Soares, Christopher

    2006-03-01

    In 2004 a new work item proposal (NWIP) was accepted by the International Organization for Standardization (ISO) Technical Committee 85 (TC85 -- Nuclear Energy), Subcommittee 2 (Radiation Protection) for the development of a standard for the clinical dosimetry of beta radiation sources used for brachytherapy. To develop this standard, a new Working Group (WG 22 - Ionizing Radiation Dosimetry and Protocols in Medical Applications) was formed. The standard is based on the work of an ad-hoc working group initiated by the Dosimetry task group of the Deutsches Institut für Normung (DIN). Initially the work was geared mainly towards the needs of intravascular brachytherapy, but with the decline of this application, more focus has been placed on the challenges of accurate dosimetry for the concave eye plaques used to treat ocular melanoma. Guidance is given for dosimetry formalisms, reference data to be used, calibrations, measurement methods, modeling, uncertainty determinations, treatment planning and reporting, and clinical quality control. The document is currently undergoing review by the ISO member bodies for acceptance as a Committee Draft (CD) with publication of the final standard expected by 2007. There are opportunities for other ISO standards for medical dosimetry within the framework of WG22.

  14. Parents' work patterns and adolescent mental health.

    PubMed

    Dockery, Alfred; Li, Jianghong; Kendall, Garth

    2009-02-01

    Previous research demonstrates that non-standard work schedules undermine the stability of marriage and reduce family cohesiveness. Limited research has investigated the effects of parents working non-standard schedules on children's health and wellbeing and no published Australian studies have addressed this important issue. This paper contributes to bridging this knowledge gap by focusing on adolescents aged 15-20 years and by including sole parent families which have been omitted in previous research, using panel data from the Household, Income and Labour Dynamics in Australia Survey. Multilevel linear regression models are estimated to analyse the association between parental work schedules and hours of work and measures of adolescents' mental health derived from the SF-36 Health Survey. Evidence of negative impacts of parents working non-standard hours upon adolescent wellbeing is found to exist primarily within sole parent families.

  15. Fatigue of Chinese railway employees and its influential factors: Structural equation modelling.

    PubMed

    Tsao, Liuxing; Chang, Jing; Ma, Liang

    2017-07-01

    Fatigue is an identifiable and preventable cause of accidents in transport operations. Regarding the railway sector, incident logs and simulation studies show that employee fatigue leads to lack of alertness, impaired performance, and occurrence of incidents. China has one of the largest rail systems in the world, and Chinese railway employees work under high fatigue risks; therefore, it is important to assess their fatigue level and find the major factors leading to fatigue. We designed a questionnaire that uses Multidimensional Fatigue Instrument (MFI-20), NASA-TLX and subjective rating of work overtime feelings to assess employee fatigue. The contribution of each influential factor of fatigue was analysed using structural equation modelling. In total, 297 employees from the rail maintenance department and 227 employees from the locomotive department returned valid responses. The average scores and standard deviations for the five subscales of MFI-20, namely General Fatigue, Physical Fatigue, Reduced Activity, Reduced Motivation, and Mental Fatigue, were 2.9 (0.8), 2.8 (0.8), 2.5 (0.8), 2.5 (0.7), and 2.4 (0.8) among the rail maintenance employees and 3.5 (0.8), 3.5 (0.7), 3.3 (0.7), 3.0 (0.6), and 3.1 (0.7), respectively, among the locomotive employees. The fatigue of the locomotive employees was influenced by feelings related to working overtime (standardized r = 0.22) and workload (standardized r = 0.27). The work overtime control and physical working environment significantly influenced subjective feelings (standardized r = -0.25 and 0.47, respectively), while improper work/rest rhythms and an adverse physical working environment significantly increased the workload (standardized r = 0.48 and 0.33, respectively). Copyright © 2017 Elsevier Ltd. All rights reserved.
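    The coefficients above are standardized, i.e., estimated on z-scored variables. A minimal sketch with synthetic data (the variable names `workload` and `fatigue` are invented stand-ins, not the questionnaire's scales) shows that, for a simple regression, the standardized slope equals the Pearson correlation:

```python
import numpy as np

# Synthetic data: fatigue partially driven by workload plus noise.
rng = np.random.default_rng(1)
workload = rng.normal(size=500)
fatigue = 0.4 * workload + rng.normal(scale=0.9, size=500)

def zscore(v):
    """Center to mean 0 and scale to unit standard deviation."""
    return (v - v.mean()) / v.std()

x, y = zscore(workload), zscore(fatigue)
beta_std = (x @ y) / (x @ x)              # standardized regression slope
r = np.corrcoef(workload, fatigue)[0, 1]  # Pearson correlation
# With both variables z-scored, beta_std and r coincide.
```

    In a full structural equation model the standardized paths are estimated jointly rather than pairwise, but the scale-free interpretation of each coefficient is the same.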

  16. Support of Multidimensional Parallelism in the OpenMP Programming Model

    NASA Technical Reports Server (NTRS)

    Jin, Hao-Qiang; Jost, Gabriele

    2003-01-01

    OpenMP is the current standard for shared-memory programming. While providing ease of parallel programming, the OpenMP programming model also has limitations which often affect the scalability of applications. Examples for these limitations are work distribution and point-to-point synchronization among threads. We propose extensions to the OpenMP programming model which allow the user to easily distribute the work in multiple dimensions and synchronize the workflow among the threads. The proposed extensions include four new constructs and the associated runtime library. They do not require changes to the source code and can be implemented based on the existing OpenMP standard. We illustrate the concept in a prototype translator and test with benchmark codes and a cloud modeling code.

  17. Compassion Fatigue and the Healthy Work Environment.

    PubMed

    Kelly, Lesly; Todd, Michael

    2017-01-01

    Burnout is a concern for critical care nurses in high-intensity environments. Studies have highlighted the importance of a healthy work environment in promoting optimal nurse and patient outcomes, but research examining the relationship between a healthy work environment and burnout is limited. To examine how healthy work environment components relate to compassion fatigue (eg, burnout, secondary trauma) and compassion satisfaction. Nurses (n = 105) in 3 intensive care units at an academic medical center completed a survey including the Professional Quality of Life and the American Association of Critical-Care Nurses' Healthy Work Environment standards. Regression models using each Healthy Work Environment component to predict each outcome, adjusting for background variables, showed that the 5 Healthy Work Environment components predicted burnout and that meaningful recognition and authentic leadership predicted compassion satisfaction. Findings on associations between healthy work environment standards and burnout suggest the potential importance of implementing the American Association of Critical-Care Nurses' Healthy Work Environment standards as a mechanism for decreasing burnout. ©2017 American Association of Critical-Care Nurses.

  18. Technical note: Harmonizing met-ocean model data via standard web services within small research groups

    NASA Astrophysics Data System (ADS)

    Signell, R. P.; Camossi, E.

    2015-11-01

    Work over the last decade has resulted in standardized web services and tools that can significantly improve the efficiency and effectiveness of working with meteorological and ocean model data. While many operational modelling centres have enabled query and access to data via common web services, most small research groups have not. The penetration of this approach into the research community, where IT resources are limited, can be dramatically improved by: (1) making it simple for providers to enable web service access to existing output files; (2) using technology that is free, and that is easy to deploy and configure; and (3) providing tools to communicate with web services that work in existing research environments. We present a simple, local brokering approach that lets modelers continue producing custom data, but virtually aggregates and standardizes the data using NetCDF Markup Language. The THREDDS Data Server is used for data delivery, pycsw for data search, NCTOOLBOX (Matlab®) and Iris (Python) for data access, and the Open Geospatial Consortium Web Map Service for data preview. We illustrate the effectiveness of this approach with two use cases involving small research modelling groups at NATO and USGS. (Mention of trade names or commercial products does not constitute endorsement or recommendation for use by the US Government.)
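    A NetCDF Markup Language aggregation of the kind described above can be only a few lines. The hypothetical NcML fragment below (file names and variable names are placeholders) virtually joins per-day model output along the time dimension for the THREDDS Data Server to serve, without touching the original files:

```xml
<netcdf xmlns="http://www.unidata.ucar.edu/namespaces/netcdf/ncml-2.2">
  <!-- Virtually concatenate existing daily files along "time". -->
  <aggregation dimName="time" type="joinExisting">
    <netcdf location="ocean_model_day1.nc"/>
    <netcdf location="ocean_model_day2.nc"/>
  </aggregation>
  <!-- Standardize metadata so generic clients recognize the variable. -->
  <variable name="temp">
    <attribute name="standard_name" value="sea_water_temperature"/>
  </variable>
</netcdf>
```

    This is the brokering step: the modeler's custom files stay as they are, while clients see one standardized, aggregated dataset.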

  19. Much ado about mice: Standard-setting in model organism research.

    PubMed

    Hardesty, Rebecca A

    2018-04-11

    Recently there has been a practice turn in the philosophy of science that has called for analyses to be grounded in the actual doings of everyday science. This paper is in furtherance of this call and it does so by employing participant-observation ethnographic methods as a tool for discovering epistemological features of scientific practice in a neuroscience lab. The case I present focuses on a group of neurobiologists researching the genetic underpinnings of cognition in Down syndrome (DS) and how they have developed a new mouse model which they argue should be regarded as the "gold standard" for all DS mouse research. Through use of ethnographic methods, interviews, and analyses of publications, I uncover how the lab constructed their new mouse model. Additionally, I describe how model organisms can serve as abstract standards for scientific work that impact the epistemic value of scientific claims, regulate practice, and constrain future work. Copyright © 2018 Elsevier Ltd. All rights reserved.

  20. A Sandbox Environment for the CSM Standard and SPICE

    NASA Astrophysics Data System (ADS)

    Hare, T. M.; Laura, J. R.

    2018-04-01

    We present ongoing work USGS is undertaking to provide a programming environment for the Camera Sensor Model (CSM) standard and associated SPICE information. This allows for instrument testing and experimentation outside a given production area.

  1. The "Communication Commando Model" Creates a Research Culture of Commitment

    ERIC Educational Resources Information Center

    Pollock, John C.

    2008-01-01

    A major dilemma faced by undergraduates is the enormous intellectual distance between standard short exercises (essays or exams) in traditional class work and more thorough, literature rich, meticulously analyzed, often empirically tested, issue-oriented work of scholars. Over the past 15 years, the author designed a "communication commando model"…

  2. Second-Language Learning through Imaginative Theory

    ERIC Educational Resources Information Center

    Broom, Catherine

    2011-01-01

    This article explores how Egan's (1997) work on imagination can enrich our understanding of teaching English as a second language (ESL). Much has been written on ESL teaching techniques; however, some of this work has been expounded in a standard educational framework, which is what Egan calls an assembly-line model. This model can easily underlie…

  3. Cultural Considerations in Advising Latino/a Students

    ERIC Educational Resources Information Center

    Negroni-Rodriguez, Lirio K.; Dicks, Barbara A.; Morales, Julio

    2006-01-01

    This paper presents a model for advising Latino/a students in graduate social work programs. The model is based on ecological-systemic and empowerment theory and ascribes to the social work values and cultural competence standards proposed by the National Association of Social Workers. It has been developed within an institution that has sought…

  4. CrowdMapping: A Crowdsourcing-Based Terminology Mapping Method for Medical Data Standardization.

    PubMed

    Mao, Huajian; Chi, Chenyang; Huang, Boyu; Meng, Haibin; Yu, Jinghui; Zhao, Dongsheng

    2017-01-01

    Standardized terminology is a prerequisite for data exchange and for the analysis of clinical processes. However, data from different electronic health record systems are based on idiosyncratic terminology systems, especially when the data come from different hospitals and healthcare organizations. Terminology standardization is therefore necessary for medical data analysis. We propose a crowdsourcing-based terminology mapping method, CrowdMapping, to standardize the terminology in medical data. CrowdMapping uses a confidence model to determine how terminologies are mapped to a standard system such as ICD-10. The model collects mappings from different healthcare organizations and evaluates their diversity to derive a more sophisticated mapping rule. Further, the CrowdMapping model enables users to rate the mapping results and interact with the model evaluation. CrowdMapping is a work-in-progress system; we present initial results of mapping terminologies.
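    The agreement-based idea behind this kind of crowdsourced mapping can be sketched in a few lines. The data, field names, and scoring rule below are invented for illustration; the actual CrowdMapping model is more sophisticated than a majority vote.

    ```python
    # Toy sketch of crowdsourced terminology mapping: aggregate candidate
    # ICD-10 mappings submitted by several organizations, and score each
    # local term's best mapping by the fraction of submitters that agree.
    from collections import Counter

    def aggregate_mappings(submissions):
        """submissions: {org: {local_term: icd10_code}}.
        Returns {local_term: (best_code, agreement_fraction)}."""
        votes = {}
        for org_map in submissions.values():
            for term, code in org_map.items():
                votes.setdefault(term, Counter())[code] += 1
        result = {}
        for term, counter in votes.items():
            code, n = counter.most_common(1)[0]
            result[term] = (code, n / sum(counter.values()))
        return result

    # Hypothetical submissions from three organizations.
    subs = {
        "hospital_a": {"heart attack": "I21"},
        "hospital_b": {"heart attack": "I21"},
        "hospital_c": {"heart attack": "I25"},
    }
    best = aggregate_mappings(subs)
    ```

    Low agreement fractions are exactly the cases the abstract's user-rating loop would route to human reviewers.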

  5. European standardization effort: interworking the goal

    NASA Astrophysics Data System (ADS)

    Mattheus, Rudy A.

    1993-09-01

    In the European Standardization Committee (CEN), the technical committee responsible for standardization activities in Medical Informatics (CEN TC 251) has agreed upon the directions of the scopes to follow in this field. They are described in the Directory of the European Standardization Requirements for Healthcare Informatics and Programme for the Development of Standards, adopted on 02-28-1991 by CEN/TC 251 and approved by CEN/BT. Top-down objectives describe the common framework and items such as terminology and security, while more bottom-up oriented items describe fields such as medical imaging and multimedia. The draft standard is described: the general framework model and object-oriented model, the interworking aspects, the relation to ISO standards, and the DICOM proposal. This paper also focuses on the boundaries of the standardization work, which also influence the standardization process.

  6. The Future of Geospatial Standards

    NASA Astrophysics Data System (ADS)

    Bermudez, L. E.; Simonis, I.

    2016-12-01

    The OGC is an international not-for-profit standards development organization (SDO) committed to making quality standards for the geospatial community. A community of more than 500 member organizations with more than 6,000 people registered at the OGC communication platform drives the development of standards that are freely available for anyone to use and to improve sharing of the world's geospatial data. OGC standards are applied in a variety of application domains including Environment, Defense and Intelligence, Smart Cities, Aviation, Disaster Management, Agriculture, Business Development and Decision Support, and Meteorology. Profiles help to apply information models to different communities, thus adapting to particular needs of that community while ensuring interoperability by using common base models and appropriate support services. Other standards address orthogonal aspects such as handling of Big Data, Crowd-sourced information, Geosemantics, or container for offline data usage. Like most SDOs, the OGC develops and maintains standards through a formal consensus process under the OGC Standards Program (OGC-SP) wherein requirements and use cases are discussed in forums generally open to the public (Domain Working Groups, or DWGs), and Standards Working Groups (SWGs) are established to create standards. However, OGC is unique among SDOs in that it also operates the OGC Interoperability Program (OGC-IP) to provide real-world testing of existing and proposed standards. The OGC-IP is considered the experimental playground, where new technologies are researched and developed in a user-driven process. Its goal is to prototype, test, demonstrate, and promote OGC Standards in a structured environment. Results from the OGC-IP often become requirements for new OGC standards or identify deficiencies in existing OGC standards that can be addressed. 
This presentation provides an analysis of the work advanced in the OGC consortium, including standards and testbeds, from which a trend for the future of geospatial standards can be extracted. We see a number of key elements in focus, but simultaneously a broadening of standards to address particular communities' needs.

  7. Professional versus Occupational Models of Work Competence

    ERIC Educational Resources Information Center

    Lester, Stan

    2014-01-01

    In addition to the familiar occupational standards that underpin National Vocational Qualifications, the UK has a parallel if less complete system of competence or practice standards that are developed and controlled by professional bodies. While there is a certain amount of overlap between the two types of standard, recent research points to a…

  8. Application of a Mixed Consequential Ethical Model to a Problem Regarding Test Standards.

    ERIC Educational Resources Information Center

    Busch, John Christian

    The work of the ethicist Charles Curran and the problem-solving strategy of the mixed consequentialist ethical model are applied to a traditional social science measurement problem--that of how to adjust a recommended standard in order to be fair to the test-taker and society. The focus is on criterion-referenced teacher certification tests.…

  9. CDISC Terminology

    Cancer.gov

    Clinical Data Interchange Standards Consortium (CDISC) is an international, non-profit organization that develops and supports global data standards for medical research. CDISC is working actively with EVS to develop and support controlled terminology in several areas, notably CDISC's Study Data Tabulation Model (SDTM).

  10. A professional development model for medical laboratory scientists working in the microbiology laboratory.

    PubMed

    Amerson, Megan H; Pulido, Lila; Garza, Melinda N; Ali, Faheem A; Greenhill, Brandy; Einspahr, Christopher L; Yarsa, Joseph; Sood, Pramilla K; Hu, Peter C

    2012-01-01

    The University of Texas M.D. Anderson Cancer Center, Division of Pathology and Laboratory Medicine, is committed to providing the best pathology and medicine through state-of-the-art techniques, progressive ground-breaking research, and education and training for the clinical diagnosis and research of cancer and related diseases. After surveying the laboratory staff and other hospital professionals, the department administrators and human resources generalists developed a professional development model (PDM) for microbiology to support laboratory skills, behavior, certification, and continuing education of its staff. This model sets high standards that allow the laboratories to work at their fullest potential, and it organizes training around complete laboratory needs rather than training technologists only in individual areas and retraining them whenever the laboratory needs them to work elsewhere. The model is a working example for all microbiology-based laboratories that want to set high standards and want their staff to be acknowledged for demonstrated excellence and professional development in the laboratory. The PDM is designed to focus on the needs of the laboratory as well as of the laboratory professionals.

  11. Implementing the HL7v3 standard in Croatian primary healthcare domain.

    PubMed

    Koncar, Miroslav

    2004-01-01

    The mission of HL7 Inc. is to provide standards for the exchange, management and integration of data that support clinical patient care and the management, delivery and evaluation of healthcare services. The scope of this work includes specifications of flexible, cost-effective approaches, standards, guidelines, methodologies, and related services for interoperability between healthcare information systems. In the field of medical information technologies, HL7 provides the world's most advanced information standards. Versions 1 and 2 of the HL7 standard solved many issues, but also demonstrated the size and complexity of the health information sharing problem. As the solution, a completely new methodology has been adopted and encompassed in the version 3 recommendations. This approach standardizes the Reference Information Model (RIM), which is the source of all domain models and message structures. Message design is now defined in detail, enabling interoperability between loosely coupled systems designed by different vendors and deployed in various environments. At the start of the Primary Healthcare Information System project, we decided to go directly to HL7v3. Implementing the HL7v3 standard in healthcare applications is a challenging task. By using standardized refinement and localization methods we were able to define information models for the Croatian primary healthcare domain. The scope of our work includes clinical, financial and administrative data management, and in some cases we were compelled to introduce new HL7v3-compliant models. All HL7v3 transactions are digitally signed using the W3C XML Digital Signature standard.
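    The sign-then-verify round trip at the end of the abstract can be illustrated schematically. Real W3C XML Digital Signature involves canonicalization (C14N), asymmetric keys, and a library such as xmlsec; the sketch below is a much-simplified stand-in that uses an HMAC over the serialized message only to show the integrity-check shape, and the message payload is an invented placeholder.

    ```python
    # Much-simplified stand-in for digitally signing an HL7v3 message.
    # NOT W3C XML DSig: real implementations canonicalize the XML and use
    # asymmetric keys. An HMAC here illustrates only sign-then-verify.
    import hashlib
    import hmac

    def sign(message_xml: bytes, key: bytes) -> str:
        """Produce a hex signature over the serialized message."""
        return hmac.new(key, message_xml, hashlib.sha256).hexdigest()

    def verify(message_xml: bytes, key: bytes, signature: str) -> bool:
        """Constant-time check that the message was not altered in transit."""
        return hmac.compare_digest(sign(message_xml, key), signature)

    msg = b"<message><patient>...</patient></message>"  # placeholder payload
    key = b"shared-secret"                               # illustrative key
    sig = sign(msg, key)
    ```

    The point the abstract makes survives the simplification: every transaction carries a verifiable signature, so a receiver can reject any message altered between systems.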

  12. Les Houches 2015: Physics at TeV Colliders Standard Model Working Group Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andersen, J.R.; et al.

    This Report summarizes the proceedings of the 2015 Les Houches workshop on Physics at TeV Colliders. Session 1 dealt with (I) new developments relevant for high precision Standard Model calculations, (II) the new PDF4LHC parton distributions, (III) issues in the theoretical description of the production of Standard Model Higgs bosons and how to relate experimental measurements, (IV) a host of phenomenological studies essential for comparing LHC data from Run I with theoretical predictions and projections for future measurements in Run II, and (V) new developments in Monte Carlo event generators.

  13. Les Houches 2017: Physics at TeV Colliders Standard Model Working Group Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andersen, J.R.; et al.

    This Report summarizes the proceedings of the 2017 Les Houches workshop on Physics at TeV Colliders. Session 1 dealt with (I) new developments relevant for high precision Standard Model calculations, (II) theoretical uncertainties and dataset dependence of parton distribution functions, (III) new developments in jet substructure techniques, (IV) issues in the theoretical description of the production of Standard Model Higgs bosons and how to relate experimental measurements, (V) phenomenological studies essential for comparing LHC data from Run II with theoretical predictions and projections for future measurements, and (VI) new developments in Monte Carlo event generators.

  14. Update Of The ACR-NEMA Standard Committee

    NASA Astrophysics Data System (ADS)

    Wang, Yen; Best, D. E.; Morse, R. R.; Horii, S. C.; Lehr, J. L.; Lodwick, G. S.; Fuscoe, C.; Nelson, O. L.; Perry, J. R.; Thompson, B. G.; Wessell, W. R.

    1988-06-01

    In January, 1984, the American College of Radiology (ACR), representing the users of imaging equipment, and the National Electrical Manufacturers Association (NEMA), representing the manufacturers of imaging equipment, joined forces to create a committee that could solve the compatibility issues surrounding the exchange of digital medical images. This committee, the ACR-NEMA Digital Imaging and Communication Standards Committee, was composed of radiologists and experts from industry who addressed the problems involved in interfacing different digital imaging modalities. In just two years, the committee and three of its working groups created an industry standard interface, the ACR-NEMA Digital Imaging and Communications Standard, Publication No. 300-1985. The ACR-NEMA interface allows digital medical images and related information to be communicated between different imaging devices, regardless of manufacturer or use of differing image formats. The interface is modeled on the International Standards Organization's Open Systems Interconnection seven-layer reference model. It is believed that the development of the Interface was the first step in the development of standards for Medical Picture Archiving and Communications Systems (PACS). Developing the interface Standard has required intensive technical analysis and examination of future trends for digital imaging in order to design a model which would not be quickly outmoded. To continue the enhancement and future development of image management systems, various working groups have been created under the direction of the ACR-NEMA Committee.

  15. NASA Standard for Models and Simulations: Credibility Assessment Scale

    NASA Technical Reports Server (NTRS)

    Babula, Maria; Bertch, William J.; Green, Lawrence L.; Hale, Joseph P.; Moser, Gary E.; Steele, Martin J.; Sylvester, Andre; Woods, Jody

    2008-01-01

    As one of its many responses to the 2003 Space Shuttle Columbia accident, NASA decided to develop a formal standard for models and simulations (M and S). Work commenced in May 2005. An interim version was issued in late 2006. This interim version underwent considerable revision following an extensive Agency-wide review in 2007, along with some additional revisions as a result of the review by the NASA Engineering Management Board (EMB) in the first half of 2008. Issuance of the revised, permanent version, hereafter referred to as the M and S Standard or just the Standard, occurred in July 2008.

  16. Standardization Process for Space Radiation Models Used for Space System Design

    NASA Technical Reports Server (NTRS)

    Barth, Janet; Daly, Eamonn; Brautigam, Donald

    2005-01-01

    The space system design community has three concerns related to models of the radiation belts and plasma: 1) AP-8 and AE-8 models are not adequate for modern applications; 2) Data that have become available since the creation of AP-8 and AE-8 are not being fully exploited for modeling purposes; 3) When new models are produced, there is no authorizing organization identified to evaluate the models or their datasets for accuracy and robustness. This viewgraph presentation provided an overview of the roadmap adopted by the Working Group Meeting on New Standard Radiation Belt and Space Plasma Models.

  17. WaterML, an Information Standard for the Exchange of in-situ hydrological observations

    NASA Astrophysics Data System (ADS)

    Valentine, D.; Taylor, P.; Zaslavsky, I.

    2012-04-01

    The WaterML 2.0 Standards Working Group (SWG), working within the Open Geospatial Consortium (OGC) and in cooperation with the joint OGC-World Meteorological Organization (WMO) Hydrology Domain Working Group (HDWG), has developed an open standard for the exchange of water observation data: WaterML 2.0. The focus of the standard is time-series data, commonly generated by in-situ monitoring. This is high-value data for hydrological applications such as flood forecasting, environmental reporting, and supporting hydrological infrastructure (e.g. dams, supply systems); it is commonly exchanged, but the lack of standards inhibits efficient reuse and automation. Developing WaterML 2.0 required a harmonization analysis of existing formats to identify overlapping concepts and agree on harmonized definitions. The formats generally captured similar requirements, all with subtle differences, such as how time-series point metadata is handled. The standard incorporates the semantics of hydrologic information (location, procedure, and observations) and is implemented as an application schema of the Geography Markup Language version 3.2.1, making use of the OGC Observations & Measurements (O&M) standards. WaterML 2.0 is designed as an extensible schema to allow encoding of data for a variety of exchange scenarios. Example areas of usage are: exchange of data for operational hydrological monitoring programs; supporting operation of infrastructure (e.g. dams, supply systems); cross-border exchange of observational data; release of data for public dissemination; enhancing disaster management through data exchange; and exchange in support of national reporting.
The first phase of WaterML 2.0 focused on structural definitions allowing for the transfer of time series, with less work on harmonization of vocabulary items such as quality codes. Vocabularies from various organizations tend to be specific, and agreement on them takes time; this will be continued in future work by the HDWG, along with extending the information model to cover additional types of hydrologic information: rating and gauging information, and water quality. Rating curves, gaugings, and river cross sections are commonly exchanged alongside standard time-series data to support conversions such as river level to discharge. Members of the HDWG plan to initiate this work in early 2012. Water quality data is varied in the way it is processed and in the number of phenomena it measures; it will require specific extensions to the WaterML 2.0 model, most likely making use of the specimen types within O&M and extensive use of controlled vocabularies. Other future work involves different target encodings for the WaterML 2.0 conceptual model. Encodings such as JSON, netCDF, and CSV are optimized for particular needs, such as compactness and ease of parsing, but may not be capable of representing the full WaterML 2.0 information model. Because certain encodings are best matched to particular needs, the community has begun investigating when and how best to implement them.
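    The trade-off between compact alternative encodings and the full information model can be seen in a toy JSON rendering of a time series. The field names below are illustrative only, not WaterML 2.0 schema terms; a real encoding would also carry the observation metadata (location, procedure, quality codes) that this sketch drops.

    ```python
    # Toy JSON encoding of a hydrologic time series: compact and trivial to
    # parse, but it omits most of the WaterML 2.0 observation metadata.
    # Field names are invented for illustration.
    import json

    series = {
        "observedProperty": "river discharge",
        "unit": "m3/s",
        "points": [
            {"time": "2012-01-01T00:00:00Z", "value": 12.4},
            {"time": "2012-01-01T01:00:00Z", "value": 13.1},
        ],
    }

    encoded = json.dumps(series)          # wire format
    decoded = json.loads(encoded)         # lossless round trip of what's kept
    ```

    The round trip is lossless for what the encoding keeps; the design question the community faces is exactly which parts of the conceptual model each lightweight encoding may safely drop.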

  18. Examples as an Instructional Tool in Mathematics and Science Classrooms: Teachers' Perceptions and Attitudes

    ERIC Educational Resources Information Center

    Huang, Xiaoxia; Cribbs, Jennifer

    2017-01-01

    This study examined mathematics and science teachers' perceptions and use of four types of examples, including typical textbook examples (standard worked examples) and erroneous worked examples in the written form as well as mastery modelling examples and peer modelling examples involving the verbalization of the problem-solving process. Data…

  19. A Mixed-Methods Evaluation of Social Work Learning Outcomes in Interprofessional Training with Medicine and Pharmacy Students

    ERIC Educational Resources Information Center

    Wharton, Tracy; Burg, Mary Ann

    2017-01-01

    Social work has moved firmly into a need for partnership training models, as our newest Educational Policy and Accreditation Standards explicitly call for interprofessional education (IPE). Although IPE is not a new model, we have not been consistently involved in training partnerships. Three professional schools formed partnerships to provide IPE…

  20. Flight Dynamic Model Exchange using XML

    NASA Technical Reports Server (NTRS)

    Jackson, E. Bruce; Hildreth, Bruce L.

    2002-01-01

    The AIAA Modeling and Simulation Technical Committee has worked for several years to develop a standard by which the information needed to develop physics-based models of aircraft can be specified. The purpose of this standard is to provide a well-defined set of information, definitions, data tables and axis systems so that cooperating organizations can transfer a model from one simulation facility to another with maximum efficiency. This paper proposes using an application of the eXtensible Markup Language (XML) to implement the AIAA simulation standard. The motivation and justification for using a standard such as XML is discussed. Necessary data elements to be supported are outlined. An example of an aerodynamic model as an XML file is given. This example includes definition of independent and dependent variables for function tables, definition of key variables used to define the model, and axis systems used. The final steps necessary for implementation of the standard are presented. Software to take an XML-defined model and import/export it to/from a given simulation facility is discussed, but not demonstrated. That would be the next step in final implementation of standards for physics-based aircraft dynamic models.
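    The function-table idea at the heart of the proposed XML exchange format can be sketched as follows. The tag and attribute names here are hypothetical, not the standard's actual schema; the sketch shows only how a consuming simulation might parse a table and interpolate in it.

    ```python
    # Sketch: parse a hypothetical XML aerodynamic function table and
    # linearly interpolate a dependent value. Tag/attribute names are
    # invented for illustration, not taken from the AIAA standard.
    import xml.etree.ElementTree as ET

    MODEL = """
    <aeroModel>
      <functionTable name="CL" independentVar="alpha_deg">
        <point alpha="0" value="0.2"/>
        <point alpha="5" value="0.6"/>
        <point alpha="10" value="1.0"/>
      </functionTable>
    </aeroModel>
    """

    def lookup(xml_text, table, alpha):
        """Linear interpolation into the named function table."""
        root = ET.fromstring(xml_text)
        tab = root.find(f"functionTable[@name='{table}']")
        pts = sorted((float(p.get("alpha")), float(p.get("value")))
                     for p in tab.findall("point"))
        for (a0, v0), (a1, v1) in zip(pts, pts[1:]):
            if a0 <= alpha <= a1:
                return v0 + (v1 - v0) * (alpha - a0) / (a1 - a0)
        raise ValueError("alpha outside table range")

    cl = lookup(MODEL, "CL", 2.5)
    ```

    Because the independent variables, breakpoints, and axis conventions travel inside the file, two facilities that agree on the schema need no further coordination to exchange a model.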

  1. Working times of elastomeric impression materials determined by dimensional accuracy.

    PubMed

    Tan, E; Chai, J; Wozniak, W T

    1996-01-01

    The working times of five poly(vinyl siloxane) impression materials were estimated by evaluating the dimensional accuracy of stone dies of impressions of a standard model made at successive time intervals. The stainless steel standard model was represented by two abutments having known distances between landmarks in three dimensions. Three dimensions in the x-, y-, and z-axes of the stone dies were measured with a traveling microscope. A time interval was rejected as being within the working time if the percentage change of the resultant dies, in any dimension, was statistically different from those measured from stone dies from previous time intervals. The absolute dimensions of those dies from the rejected time interval also must have exceeded all those from previous time intervals. Results showed that the working times estimated with this method generally were about 30 seconds longer than those recommended by the manufacturers.

  2. Radiation Environment Modeling for Spacecraft Design: New Model Developments

    NASA Technical Reports Server (NTRS)

    Barth, Janet; Xapsos, Mike; Lauenstein, Jean-Marie; Ladbury, Ray

    2006-01-01

    A viewgraph presentation on various new space radiation environment models for spacecraft design is described. The topics include: 1) The Space Radiation Environment; 2) Effects of Space Environments on Systems; 3) Space Radiation Environment Model Use During Space Mission Development and Operations; 4) Space Radiation Hazards for Humans; 5) "Standard" Space Radiation Environment Models; 6) Concerns about Standard Models; 7) Inadequacies of Current Models; 8) Development of New Models; 9) New Model Developments: Proton Belt Models; 10) Coverage of New Proton Models; 11) Comparison of TPM-1, PSB97, AP-8; 12) New Model Developments: Electron Belt Models; 13) Coverage of New Electron Models; 14) Comparison of "Worst Case" POLE, CRESELE, and FLUMIC Models with the AE-8 Model; 15) New Model Developments: Galactic Cosmic Ray Model; 16) Comparison of NASA, MSU, CIT Models with ACE Instrument Data; 17) New Model Developments: Solar Proton Model; 18) Comparison of ESP, JPL91, King/Stassinopoulos, and PSYCHIC Models; 19) New Model Developments: Solar Heavy Ion Model; 20) Comparison of CREME96 to CREDO Measurements During 2000 and 2002; 21) PSYCHIC Heavy Ion Model; 22) Model Standardization; 23) Working Group Meeting on New Standard Radiation Belt and Space Plasma Models; and 24) Summary.

  3. Standards for education and training for interagency working in child protection in the UK: implications for nurses, midwives and health visitors.

    PubMed

    Long, Tony; Davis, Cathy; Johnson, Martin; Murphy, Michael; Race, David; Shardlow, Steven M

    2006-01-01

    This article presents a discussion of key issues for the education of nurses, midwives and health visitors following the completion of a Department of Health funded project, managed by the General Social Care Council and conducted jointly by two research centres: the Salford Centre for Social Work Research and the Salford Centre for Nursing, Midwifery and Collaborative Research. The work was initiated in response to Lord Laming's report on the circumstances leading to the death of Victoria Climbié. The project was conducted in relation to specified professions and occupational groups: doctors, health visitors, midwives, nurses, police, teachers, and social workers. It was undertaken in two stages. The first stage mapped existing material about standards in relation to education and training for interagency working. The second stage engaged in an extensive consultation exercise through which a model and a set of proposed standards for education and training for interagency work were developed. The former is detailed fully in this report, while nine examples of standards are presented. The project final report was presented seven months after commencement.

  4. Globalization and workers' health.

    PubMed

    Kawachi, Ichiro

    2008-10-01

    The global integration of economies worldwide has led to increased pressure for "labor flexibility". A notable aspect of this trend has been the rise in non-standard work arrangements, which include part-time work, temporary agency-based work, fixed-term contingent work, and independent contracting. Although non-standard work arrangements are convenient for employers, they are often associated with poor pay, absence of pension and health benefits, as well as lack of protection from unions and labor laws. Studies have begun to address the question of whether these "precarious" jobs pose a health hazard for workers. The challenge for causal inference is that precarious workers are likely to differ from non-precarious workers in a variety of characteristics that also influence health outcomes, i.e. there is confounding and selection bias. However, even after taking account of these biases--through propensity score-matched analysis--there is evidence to suggest that non-standard work may be damaging to workers' health. Policies modeled after the European Union's Directive on Part-Time Work may help to mitigate some of the health hazards associated with precarious work.
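    The propensity-score matching the abstract relies on can be sketched with a greedy 1:1 nearest-neighbor matcher. The scores and outcomes below are made-up numbers, and a real analysis would first estimate the scores (e.g. with logistic regression on worker characteristics) rather than take them as given.

    ```python
    # Toy illustration of propensity-score matching: pair each precarious
    # ("treated") worker with the non-precarious worker whose propensity
    # score is closest, then compare outcomes within the matched sample.
    # All numbers are invented for the example.
    def match_and_compare(treated, controls):
        """treated/controls: lists of (propensity_score, outcome).
        Greedy 1:1 nearest-neighbor matching without replacement;
        returns the mean outcome difference (treated - matched control)."""
        pool = list(controls)
        diffs = []
        for score, outcome in treated:
            j = min(range(len(pool)), key=lambda i: abs(pool[i][0] - score))
            diffs.append(outcome - pool.pop(j)[1])
        return sum(diffs) / len(diffs)

    treated = [(0.8, 55.0), (0.6, 60.0)]                 # e.g. health scores
    controls = [(0.79, 70.0), (0.61, 72.0), (0.2, 80.0)]
    effect = match_and_compare(treated, controls)        # negative => worse health
    ```

    Matching on the propensity score compares workers who were similarly likely to end up in precarious jobs, which is how the studies cited attempt to separate the effect of the job from the selection into it.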

  5. Issues for Universities Working with K-12 Institutions Implementing Prepackaged Pre-Engineering Curricula Such as Project Lead the Way

    ERIC Educational Resources Information Center

    Reid, Kenneth J.; Feldhaus, Charles R.

    2007-01-01

    The implementation of pre-engineering, standard curricula in K-12 schools is growing at a rapid pace. One such curriculum model, Project Lead the Way, consists of six standardized courses requiring significant training for teachers, specified laboratory equipment, standard topics, exams, etc. Schools implementing Project Lead the Way implement an…

  6. Harmonization of standards for parabolic trough collector testing in solar thermal power plants

    NASA Astrophysics Data System (ADS)

    Sallaberry, Fabienne; Valenzuela, Loreto; Palacin, Luis G.; Leon, Javier; Fischer, Stephan; Bohren, Andreas

    2017-06-01

    The technology of parabolic trough collectors (PTC) is widely used in concentrating solar power (CSP) plants worldwide. However, this type of large-size collector cannot yet be officially tested by an accredited laboratory and certified by an accredited certification body, as no standard is adapted to its particularities and the currently published standards for solar thermal collectors are not completely applicable to it. Recently, some standardization committees have been working on this technology. This paper aims to give a summary of the standardized testing methodology for large-size PTCs for CSP plants, presenting the physical model chosen for modeling the thermal performance of the collector in the new revision of standard ISO 9806 and the points still to be improved in the draft standard IEC 62862-3-2. A summary of the testing validation performed with this new model on a parabolic trough collector installed in one of the test facilities at the Plataforma Solar de Almería (PSA) is also presented.

  7. 78 FR 45104 - Model Manufactured Home Installation Standards: Ground Anchor Installations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-26

    ... test methods for establishing working load design values of ground anchor assemblies used for new... anchor installations and establish standardized test methods to determine ground anchor performance and... currently no national test method for rating and certifying ground anchor assemblies in different soil...

  8. A model for a knowledge-based system's life cycle

    NASA Technical Reports Server (NTRS)

    Kiss, Peter A.

    1990-01-01

    The American Institute of Aeronautics and Astronautics has initiated a Committee on Standards for Artificial Intelligence. Presented here are the initial efforts of one of the working groups of that committee. The purpose here is to present a candidate model for the development life cycle of Knowledge Based Systems (KBS). The intent is for the model to be used by the Aerospace Community and eventually be evolved into a standard. The model is rooted in the evolutionary model, borrows from the spiral model, and is embedded in the standard Waterfall model for software development. Its intent is to satisfy the development of both stand-alone and embedded KBSs. The phases of the life cycle are detailed, as are the review points that constitute the key milestones throughout the development process. The applicability and strengths of the model are discussed along with areas needing further development and refinement by the aerospace community.

  9. Artificial Intelligence Software Engineering (AISE) model

    NASA Technical Reports Server (NTRS)

    Kiss, Peter A.

    1990-01-01

    The American Institute of Aeronautics and Astronautics has initiated a committee on standards for Artificial Intelligence. Presented are the initial efforts of one of the working groups of that committee. A candidate model is presented for the development life cycle of knowledge based systems (KBSs). The intent is for the model to be used by the aerospace community and eventually be evolved into a standard. The model is rooted in the evolutionary model, borrows from the spiral model, and is embedded in the standard Waterfall model for software development. Its intent is to satisfy the development of both stand-alone and embedded KBSs. The phases of the life cycle are shown and detailed as are the review points that constitute the key milestones throughout the development process. The applicability and strengths of the model are discussed along with areas needing further development and refinement by the aerospace community.

  10. Application of the predicted heat strain model in development of localized, threshold-based heat stress management guidelines for the construction industry.

    PubMed

    Rowlinson, Steve; Jia, Yunyan Andrea

    2014-04-01

    Existing heat stress risk management guidelines recommended by international standards are not practical for the construction industry, which needs site supervision staff to make instant managerial decisions to mitigate heat risks. The ability of the predicted heat strain (PHS) model [ISO 7933 (2004). Ergonomics of the thermal environment analytical determination and interpretation of heat stress using calculation of the predicted heat strain. Geneva: International Standard Organisation] to predict maximum allowable exposure time (Dlim) has now enabled development of localized, action-triggering and threshold-based guidelines for implementation by lay frontline staff on construction sites. This article presents a protocol for development of two heat stress management tools by applying the PHS model to its full potential. One of the tools is developed to facilitate managerial decisions on an optimized work-rest regimen for paced work. The other tool is developed to enable workers' self-regulation during self-paced work.
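The paced-work tool's logic can be illustrated with a toy calculation, assuming a PHS-predicted maximum allowable exposure time Dlim and a fixed, hypothetical 60-minute duty cycle; this is a sketch of the idea, not the article's actual tool:

```python
# Toy sketch: derive a work-rest split from a PHS-predicted Dlim.
# The function name and the fixed 60-minute duty cycle are
# illustrative assumptions, not part of ISO 7933 or this article.

def work_rest_regimen(d_lim_min: float, cycle_min: float = 60.0) -> tuple:
    """Return (work_min, rest_min) for one duty cycle.

    If the allowable exposure Dlim covers the whole cycle, no
    scheduled rest is triggered; otherwise work is capped at Dlim
    and the remainder of the cycle becomes recovery time.
    """
    work = min(d_lim_min, cycle_min)
    rest = cycle_min - work
    return work, rest

print(work_rest_regimen(45.0))  # 45 min work, 15 min rest
print(work_rest_regimen(90.0))  # full 60-min cycle, no forced rest
```

A supervisor-facing tool would add the PHS calculation itself (metabolic rate, clothing, environment) upstream of this split; here Dlim is simply taken as given.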

  11. Fermionic dark matter and neutrino masses in a B - L model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sánchez-Vega, B. L.; Schmitz, E. R.

    2015-09-01

    In this work we present a common framework for neutrino masses and dark matter. Specifically, we work with a local B - L extension of the standard model which has three right-handed neutrinos, n(Ri), and some extra scalars, Phi, phi(i), besides the standard model fields. The n(Ri)'s have nonstandard B - L quantum numbers and thus these couple to different scalars. This model has the attractive property that an almost automatic Z(2) symmetry acting only on a fermionic field, n(R3), is present. Taking advantage of this Z(2) symmetry, we study both the neutrino mass generation via a natural seesaw mechanism at low energy and the possibility of n(R3) being a dark matter candidate. For this last purpose, we study its relic abundance and its compatibility with the current direct detection experiments.

  12. To ontologise or not to ontologise: An information model for a geospatial knowledge infrastructure

    NASA Astrophysics Data System (ADS)

    Stock, Kristin; Stojanovic, Tim; Reitsma, Femke; Ou, Yang; Bishr, Mohamed; Ortmann, Jens; Robertson, Anne

    2012-08-01

    A geospatial knowledge infrastructure consists of a set of interoperable components, including software, information, hardware, procedures and standards, that work together to support advanced discovery and creation of geoscientific resources, including publications, data sets and web services. The focus of the work presented is the development of such an infrastructure for resource discovery. Advanced resource discovery is intended to support scientists in finding resources that meet their needs, and focuses on representing the semantic details of the scientific resources, including the detailed aspects of the science that led to the resource being created. This paper describes an information model for a geospatial knowledge infrastructure that uses ontologies to represent these semantic details, including knowledge about domain concepts, the scientific elements of the resource (analysis methods, theories and scientific processes) and web services. This semantic information can be used to enable more intelligent search over scientific resources, and to support new ways to infer and visualise scientific knowledge. The work describes the requirements for semantic support of a knowledge infrastructure, and analyses the different options for information storage based on the twin goals of semantic richness and syntactic interoperability to allow communication between different infrastructures. Such interoperability is achieved by the use of open standards, and the architecture of the knowledge infrastructure adopts such standards, particularly from the geospatial community. The paper then describes an information model that uses a range of different types of ontologies, explaining those ontologies and their content. The information model was successfully implemented in a working geospatial knowledge infrastructure, but the evaluation identified some issues in creating the ontologies.

  13. Parity oscillations and photon correlation functions in the Z2-U(1) Dicke model at a finite number of atoms or qubits

    NASA Astrophysics Data System (ADS)

    Yi-Xiang, Yu; Ye, Jinwu; Zhang, CunLin

    2016-08-01

    Four standard quantum optics models, that is, the Rabi, Dicke, Jaynes-Cummings, and Tavis-Cummings models, were proposed by physicists many decades ago. Despite their relatively simple forms and many previous theoretical works, their physics at a finite N, especially inside the superradiant regime, remains unknown. In this work, by using the strong-coupling expansion and exact diagonalization (ED), we study the Z2-U(1) Dicke model with independent rotating-wave coupling g and counterrotating-wave coupling g' at a finite N. This model includes the four standard quantum optics models as its various special limits. We show that in the superradiant phase, the system's energy levels are grouped into doublets with even and odd parity. Any anisotropy β = g'/g ≠ 1 leads to the oscillation of parities in both the ground and excited doublets as the atom-photon coupling strength increases. The oscillations will be pushed to the infinite coupling strength in the isotropic Z2 limit β = 1. We find nearly perfect agreement between the strong-coupling expansion and the ED in the superradiant regime when β is not too small. We also compute the photon correlation functions, squeezing spectrum, and number correlation functions that can be measured by various standard optical techniques.
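For reference, a common literature form of the anisotropic Dicke Hamiltonian with independent rotating-wave coupling g and counterrotating-wave coupling g' (written from standard conventions, not quoted from this paper) is:

```latex
H = \omega_c\, a^{\dagger} a + \omega_a J_z
  + \frac{g}{\sqrt{N}}\left(a^{\dagger} J_{-} + a J_{+}\right)
  + \frac{g'}{\sqrt{N}}\left(a^{\dagger} J_{+} + a J_{-}\right)
```

Setting g' = 0 recovers the Tavis-Cummings model (Jaynes-Cummings at N = 1), while g' = g recovers the standard Dicke model (Rabi at N = 1), which is how the Z2-U(1) model contains the four standard quantum optics models as special limits.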

  14. One-loop topological expansion for spin glasses in the large connectivity limit

    NASA Astrophysics Data System (ADS)

    Chiara Angelini, Maria; Parisi, Giorgio; Ricci-Tersenghi, Federico

    2018-01-01

    We apply for the first time a new one-loop topological expansion around the Bethe solution to the spin-glass model with a field in the high-connectivity limit, following the methodological scheme proposed in a recent work. The results are completely equivalent to the well-known ones found by standard field-theoretical expansion around the fully connected model (Bray and Roberts 1980, and following works). However, this method has the advantage that the starting point is the original Hamiltonian of the model, with no need to define an associated field theory or to know the initial values of the couplings, and the computations have a clear and simple physical meaning. Moreover, this new method can also be applied at zero temperature, where the Bethe model has a transition in field, contrary to the fully connected model, which is always in the spin-glass phase. Because it shares finite connectivity with finite-dimensional models, the Bethe lattice is clearly a better starting point for an expansion than the fully connected model. The present work is a first step towards the generalization of this new expansion to more difficult and interesting cases such as the zero-temperature limit, where the expansion could lead to different results with respect to the standard one.

  15. The Standard of Care: Legal History and Definitions: the Bad and Good News

    PubMed Central

    Moffett, Peter; Moore, Gregory

    2011-01-01

    The true meaning of the term “the standard of care” is a frequent topic of discussion among emergency physicians as they evaluate and perform care on patients. This article, using legal cases and dictums, reviews the legal history and definitions of the standard of care. The goal is to provide the working physician with a practical and useful model of the standard of care to help guide daily practice. PMID:21691483

  16. Computation of confined coflow jets with three turbulence models

    NASA Technical Reports Server (NTRS)

    Zhu, J.; Shih, T. H.

    1993-01-01

    A numerical study of confined jets in a cylindrical duct is carried out to examine the performance of two recently proposed turbulence models: an RNG-based K-epsilon model and a realizable Reynolds stress algebraic equation model. The former is of the same form as the standard K-epsilon model but has different model coefficients. The latter uses an explicit quadratic stress-strain relationship to model the turbulent stresses and is capable of ensuring the positivity of each turbulent normal stress. The flow considered involves recirculation with unfixed separation and reattachment points and severe adverse pressure gradients, thereby providing a valuable test of the predictive capability of the models for complex flows. Calculations are performed with a finite-volume procedure. Numerical credibility of the solutions is ensured by using second-order accurate differencing schemes and sufficiently fine grids. Calculations with the standard K-epsilon model are also made for comparison. Detailed comparisons with experiments show that the realizable Reynolds stress algebraic equation model consistently works better than does the standard K-epsilon model in capturing the essential flow features, while the RNG-based K-epsilon model does not seem to give improvements over the standard K-epsilon model under the flow conditions considered.
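For context, the two K-epsilon variants compared above share the same eddy-viscosity closure and differ chiefly in their coefficients (the RNG variant also adds a strain-dependent term to the epsilon equation); the values quoted here are the commonly published ones, not taken from this paper:

```latex
\nu_t = C_\mu \frac{k^2}{\varepsilon}, \qquad
C_\mu^{\mathrm{std}} = 0.09, \quad C_\mu^{\mathrm{RNG}} \approx 0.0845
```

The realizable Reynolds stress algebraic equation model departs from this linear Boussinesq relation by using a quadratic stress-strain expansion, which is what allows it to guarantee positive turbulent normal stresses.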

  17. An Ontology-Based Archive Information Model for the Planetary Science Community

    NASA Technical Reports Server (NTRS)

    Hughes, J. Steven; Crichton, Daniel J.; Mattmann, Chris

    2008-01-01

    The Planetary Data System (PDS) information model is a mature but complex model that has been used to capture over 30 years of planetary science data for the PDS archive. As the de-facto information model for the planetary science data archive, it is being adopted by the International Planetary Data Alliance (IPDA) as their archive data standard. However, after seventeen years of evolutionary change the model needs refinement. First a formal specification is needed to explicitly capture the model in a commonly accepted data engineering notation. Second, the core and essential elements of the model need to be identified to help simplify the overall archive process. A team of PDS technical staff members have captured the PDS information model in an ontology modeling tool. Using the resulting knowledge-base, work continues to identify the core elements, identify problems and issues, and then test proposed modifications to the model. The final deliverables of this work will include specifications for the next generation PDS information model and the initial set of IPDA archive data standards. Having the information model captured in an ontology modeling tool also makes the model suitable for use by Semantic Web applications.

  18. Student Affairs Case Management: Merging Social Work Theory with Student Affairs Practice

    ERIC Educational Resources Information Center

    Adams, Sharrika D.; Hazelwood, Sherry; Hayden, Bruce

    2014-01-01

    Case management is a functional area in higher education and student affairs that emerged after the mass shootings at Virginia Tech in 2007. Although new to higher education, case management emerged from established social work practice. This article compares social work theory and case management standards with a new case management model for…

  19. Spatial enhancement of ECG using diagnostic similarity score based lead selective multi-scale linear model.

    PubMed

    Nallikuzhy, Jiss J; Dandapat, S

    2017-06-01

    In this work, a new patient-specific approach to enhance the spatial resolution of ECG is proposed and evaluated. The proposed model transforms a three-lead ECG into a standard twelve-lead ECG thereby enhancing its spatial resolution. The three leads used for prediction are obtained from the standard twelve-lead ECG. The proposed model takes advantage of the improved inter-lead correlation in wavelet domain. Since the model is patient-specific, it also selects the optimal predictor leads for a given patient using a lead selection algorithm. The lead selection algorithm is based on a new diagnostic similarity score which computes the diagnostic closeness between the original and the spatially enhanced leads. Standard closeness measures are used to assess the performance of the model. The similarity in diagnostic information between the original and the spatially enhanced leads is evaluated using various diagnostic measures. Repeatability and diagnosability analyses are performed to quantify the applicability of the model. A comparison of the proposed model is performed with existing models that transform a subset of standard twelve-lead ECG into the standard twelve-lead ECG. From the analysis of the results, it is evident that the proposed model preserves diagnostic information better compared to other models. Copyright © 2017 Elsevier Ltd. All rights reserved.
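The core regression idea can be sketched in the plain time domain: model each remaining standard lead as a linear combination of three predictor leads, with weights fit by ordinary least squares. This is a minimal illustration only; the paper's model operates in the wavelet domain and adds diagnostic-similarity-based lead selection, which this sketch omits.

```python
# Toy sketch of lead reconstruction: fit one derived lead as a
# weighted sum of 3 predictor leads via least squares (normal
# equations, pure stdlib). Not the paper's wavelet-domain model.
import math

def solve(a, b):
    """Solve the small linear system a.x = b by Gauss-Jordan elimination."""
    n = len(b)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(n):
            if r != col:
                f = m[r][col] / m[col][col]
                m[r] = [x - f * y for x, y in zip(m[r], m[col])]
    return [m[i][n] / m[i][i] for i in range(n)]

def fit_lead(predictors, target):
    """OLS weights w so that target[t] ~= sum_j w[j] * predictors[j][t]."""
    ata = [[sum(p1[t] * p2[t] for t in range(len(target)))
            for p2 in predictors] for p1 in predictors]
    atb = [sum(p[t] * target[t] for t in range(len(target)))
           for p in predictors]
    return solve(ata, atb)

# Synthetic check: a "derived lead" built from known weights is recovered.
t_axis = [i / 100 for i in range(200)]
lead_I  = [math.sin(2 * math.pi * x) for x in t_axis]
lead_II = [math.sin(4 * math.pi * x + 0.3) for x in t_axis]
lead_V2 = [math.cos(2 * math.pi * x) for x in t_axis]
target  = [0.5 * a - 1.2 * b + 0.7 * c
           for a, b, c in zip(lead_I, lead_II, lead_V2)]
w = fit_lead([lead_I, lead_II, lead_V2], target)
print([round(x, 3) for x in w])  # recovers approximately [0.5, -1.2, 0.7]
```

A patient-specific system would fit these weights once from a full twelve-lead recording, then apply them to subsequent three-lead recordings.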

  20. Heliport noise model : methodology - draft report

    DOT National Transportation Integrated Search

    1988-04-30

    The Heliport Noise Model (HNM) is the United States standard for predicting civil helicopter noise exposure in the vicinity of heliports and airports. HNM Version 1 is the culmination of several years of work in helicopter noise research, field measu...

  1. ANZSoilML: An Australian - New Zealand standard for exchange of soil data

    NASA Astrophysics Data System (ADS)

    Simons, Bruce; Wilson, Peter; Ritchie, Alistair; Cox, Simon

    2013-04-01

    The Australian-New Zealand soil information exchange standard (ANZSoilML) is a GML-based standard designed to allow the discovery, query and delivery of soil and landscape data via standard Open Geospatial Consortium (OGC) Web Feature Services. ANZSoilML modifies the Australian soil exchange standard (OzSoilML), which is based on the Australian Soil Information Transfer and Evaluation System (SITES) database design and exchange protocols, to meet the New Zealand National Soils Database requirements. The most significant change was the removal of the lists of CodeList terms in OzSoilML, which were based on the field methods specified in the 'Australian Soil and Land Survey Field Handbook'. These were replaced with empty CodeLists as placeholders for external vocabularies, allowing the use of New Zealand vocabularies without violating the data model. Testing of the use of these separately governed Australian and New Zealand vocabularies has commenced. ANZSoilML attempts to accommodate the proposed International Organization for Standardization ISO/DIS 28258 standard for soil quality. For the most part, ANZSoilML is consistent with the ISO model, although major differences arise as a result of:
    • The need to specify the properties appropriate for each feature type;
    • The inclusion of soil-related 'Landscape' features;
    • Allowing the mapping of soil surfaces, bodies, layers and horizons, independent of the soil profile;
    • Allowing specification of the relationships between the various soil features;
    • Specifying soil horizons as specialisations of soil layers;
    • Removing duplication of features provided by the ISO Observations & Measurements standard.
    The International Union of Soil Sciences (IUSS) Working Group on Soil Information Standards (WG-SIS) aims to develop, promote and maintain a standard to facilitate the exchange of soils data and information. Developing an international exchange standard that is compatible with existing and emerging national and regional standards is a considerable challenge. ANZSoilML is proposed as a profile of the more generalised SoilML model being progressed through the IUSS Working Group.
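The placeholder-CodeList pattern can be illustrated with stdlib ElementTree: the document carries only a reference to an externally governed vocabulary rather than embedding the terms. The element names, namespace and vocabulary URL below are invented for illustration; they are not actual ANZSoilML schema items.

```python
# Illustration of the "empty CodeList pointing at an external
# vocabulary" pattern. Element names, the namespace URI and the
# vocabulary URL are invented; this is not the real ANZSoilML/GML schema.
import xml.etree.ElementTree as ET

NS = "urn:example:anzsoilml"   # hypothetical namespace
ET.register_namespace("sml", NS)

horizon = ET.Element(f"{{{NS}}}SoilHorizon")
texture = ET.SubElement(horizon, f"{{{NS}}}textureClass")
# The term itself lives in a separately governed vocabulary; the
# document only carries a reference, so Australian and New Zealand
# vocabularies can both be used without changing the data model.
texture.set("codeSpace", "http://example.org/nz-soil-texture")
texture.text = "silt_loam"

xml_out = ET.tostring(horizon, encoding="unicode")
print(xml_out)
```

Swapping vocabularies then means changing only the codeSpace reference, leaving the exchange schema untouched.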

  2. Modeling single event induced crosstalk in nanometer technologies

    NASA Astrophysics Data System (ADS)

    Boorla, Vijay K.

    Radiation effects become more important in combinational logic circuits with newer technologies. When a highly energetic particle strikes a sensitive region within a combinational logic circuit, a voltage pulse called a Single Event Transient is created. Recently, researchers reported Single Event Crosstalk because of increasing coupling effects. In this work, a closed-form expression for SE crosstalk noise is formulated for the first time. For all calculations, the 4-pi model is used. The crosstalk model uses a reduced transfer function between the aggressor coupling node and the victim node to reduce information loss. The aggressor coupling node waveform is obtained and then applied to the transfer function between the coupling node and the victim output to obtain the victim noise voltage. This work includes both the effect of passive aggressor loading on the victim and of victim loading on the aggressor by considering the resistive shielding effect. Noise peak expressions derived in this work show very good agreement with HSPICE results: the average error for the noise peak is 3.794%, while allowing for very fast analysis. Once the SE crosstalk noise is calculated, one can apply mitigation techniques such as driver sizing. A standard DTMOS technique along with sizing is proposed in this work to mitigate SE crosstalk. This combined approach can save area in some cases compared to driver sizing alone. Key Words: Crosstalk Noise, Closed Form Modeling, Standard DTMOS
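The coupling mechanism behind crosstalk noise can be illustrated with a deliberately crude two-node RC circuit integrated numerically: an aggressor ramp couples through a capacitor into a victim node that its driver pulls back to ground through a resistance. All component values below are arbitrary, and this sketch does not reproduce the paper's distributed 4-pi network or its closed-form peak expression.

```python
# Crude two-node capacitive crosstalk illustration, forward-Euler
# integration of (Cc + Cv) dVv/dt = Cc*dVa/dt - Vv/Rv.
# Component values are arbitrary; this is not the paper's 4-pi model.

def victim_noise_peak(vdd=1.0, t_ramp=20e-12, rv=1e3,
                      cc=10e-15, cv=20e-15, dt=1e-14, t_end=200e-12):
    vv, peak, t = 0.0, 0.0, 0.0
    while t < t_end:
        dva = (vdd / t_ramp) if t < t_ramp else 0.0  # aggressor slew
        vv += dt * (cc * dva - vv / rv) / (cc + cv)
        peak = max(peak, vv)
        t += dt
    return peak

peak = victim_noise_peak()
# Charge conservation bounds the peak by Vdd*Cc/(Cc+Cv), about 0.333 V here.
print(round(peak, 4))
```

The peak grows with the coupling ratio Cc/(Cc+Cv) and with faster aggressor slew, which is why driver sizing (lowering the effective victim resistance) mitigates it.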

  3. Analysis of Wind Turbine Simulation Models: Assessment of Simplified versus Complete Methodologies: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Honrubia-Escribano, A.; Jimenez-Buendia, F.; Molina-Garcia, A.

    This paper presents the current status of simplified wind turbine models used for power system stability analysis. It is based on ongoing work within IEC 61400-27. This international standard, for which a technical committee was convened in October 2009, is focused on defining generic (also known as simplified) simulation models for both wind turbines and wind power plants. The results of the paper provide an improved understanding of the usability of generic models to conduct power system simulations.

  4. Primordial lithium and the standard model(s)

    NASA Technical Reports Server (NTRS)

    Deliyannis, Constantine P.; Demarque, Pierre; Kawaler, Steven D.; Romanelli, Paul; Krauss, Lawrence M.

    1989-01-01

    The results of new theoretical work on surface Li-7 and Li-6 evolution in the oldest halo stars are presented, along with a new and refined analysis of the predicted primordial Li abundance resulting from big-bang nucleosynthesis. This makes it possible to determine the constraints which can be imposed on cosmology using primordial Li and both standard big-bang and stellar-evolution models. This leads to limits on the baryon density today of 0.0044-0.025 (where the Hubble constant is 100h km/s/Mpc) and imposes limitations on alternative nucleosynthesis scenarios.

  5. Setting performance standards for medical practice: a theoretical framework.

    PubMed

    Southgate, L; Hays, R B; Norcini, J; Mulholland, H; Ayers, B; Woolliscroft, J; Cusimano, M; McAvoy, P; Ainsworth, M; Haist, S; Campbell, M

    2001-05-01

    The assessment of performance in the real world of medical practice is now widely accepted as the goal of assessment at the postgraduate level. This is largely a validity issue, as it is recognised that tests of knowledge and clinical simulations cannot on their own measure how medical practitioners function in the broader health care system. However, the development of standards for performance-based assessment is not as well understood as in competency assessment, where simulations can more readily reflect narrower issues of knowledge and skills. This paper proposes a theoretical framework for the development of standards that reflect the more complex world in which experienced medical practitioners work. The paper reflects the combined experiences of a group of education researchers and the results of literature searches that included identifying current health system data sources that might contribute information to the measurement of standards. Standards that reflect the complexity of medical practice may best be developed through an "expert systems" analysis of clinical conditions for which desired health care outcomes reflect the contribution of several health professionals within a complex, three-dimensional, contextual model. Examples of the model are provided, but further work is needed to test validity and measurability.

  6. The need for GPS standardization

    NASA Technical Reports Server (NTRS)

    Lewandowski, Wlodzimierz W.; Petit, Gerard; Thomas, Claudine

    1992-01-01

    A desirable and necessary step for improvement of the accuracy of Global Positioning System (GPS) time comparisons is the establishment of common GPS standards. For this reason, the CCDS proposed the creation of a special group of experts with the objective of recommending procedures and models for operational time transfer by GPS common-view method. Since the announcement of the implementation of Selective Availability at the end of last spring, action has become much more urgent and this CCDS Group on GPS Time Transfer Standards has now been set up. It operates under the auspices of the permanent CCDS Working Group on TAI and works in close cooperation with the Sub-Committee on Time of the Civil GPS Service Interface Committee (CGSIC). Taking as an example the implementation of SA during the first week of July 1991, this paper illustrates the need to develop urgently at least two standardized procedures in GPS receiver software: monitoring GPS tracks with a common time scale and retaining broadcast ephemeris parameters throughout the duration of a track. Other matters requiring action are the adoption of common models for atmospheric delay, a common approach to hardware design and agreement about short-term data processing. Several examples of such deficiencies in standardization are presented.

  7. Preliminary work toward the development of a dimensional tolerance standard for rapid prototyping

    NASA Technical Reports Server (NTRS)

    Kennedy, W. J.

    1996-01-01

    Rapid prototyping is a new technology for building parts quickly from CAD models. It works by slicing a CAD model into layers, then building a model of the part one layer at a time. Since most parts can be sliced, most parts can be modeled using rapid prototyping. The layers themselves are created in a number of different ways - by using a laser to cure a layer of an epoxy or a resin, by depositing a layer of plastic or wax upon a surface, by using a laser to sinter a layer of powder, or by using a laser to cut a layer of paper. Rapid prototyping (RP) is new, and a standard part for use in comparing dimensional tolerances has not yet been chosen and accepted by ASTM (the American Society for Testing and Materials). Such a part is needed when RP is used to build parts for investment casting or for direct use. The objective of this project was to start the development of a standard part by using statistical techniques to choose the features of the part which show curl - the vertical deviation of a part from its intended horizontal plane.

  8. Standardized Representation of Clinical Study Data Dictionaries with CIMI Archetypes

    PubMed Central

    Sharma, Deepak K.; Solbrig, Harold R.; Prud’hommeaux, Eric; Pathak, Jyotishman; Jiang, Guoqian

    2016-01-01

    Researchers commonly use a tabular format to describe and represent clinical study data. The lack of standardization of data dictionary’s metadata elements presents challenges for their harmonization for similar studies and impedes interoperability outside the local context. We propose that representing data dictionaries in the form of standardized archetypes can help to overcome this problem. The Archetype Modeling Language (AML) as developed by the Clinical Information Modeling Initiative (CIMI) can serve as a common format for the representation of data dictionary models. We mapped three different data dictionaries (identified from dbGAP, PheKB and TCGA) onto AML archetypes by aligning dictionary variable definitions with the AML archetype elements. The near complete alignment of data dictionaries helped map them into valid AML models that captured all data dictionary model metadata. The outcome of the work would help subject matter experts harmonize data models for quality, semantic interoperability and better downstream data integration. PMID:28269909

  9. Standardized Representation of Clinical Study Data Dictionaries with CIMI Archetypes.

    PubMed

    Sharma, Deepak K; Solbrig, Harold R; Prud'hommeaux, Eric; Pathak, Jyotishman; Jiang, Guoqian

    2016-01-01

    Researchers commonly use a tabular format to describe and represent clinical study data. The lack of standardization of data dictionary's metadata elements presents challenges for their harmonization for similar studies and impedes interoperability outside the local context. We propose that representing data dictionaries in the form of standardized archetypes can help to overcome this problem. The Archetype Modeling Language (AML) as developed by the Clinical Information Modeling Initiative (CIMI) can serve as a common format for the representation of data dictionary models. We mapped three different data dictionaries (identified from dbGAP, PheKB and TCGA) onto AML archetypes by aligning dictionary variable definitions with the AML archetype elements. The near complete alignment of data dictionaries helped map them into valid AML models that captured all data dictionary model metadata. The outcome of the work would help subject matter experts harmonize data models for quality, semantic interoperability and better downstream data integration.
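The variable-to-element alignment described above can be sketched as a field-by-field mapping from tabular dictionary rows to a nested archetype-like structure. The column names ("VARNAME", "TYPE", "UNITS") and the element keys below are invented for illustration; they are not real dbGaP columns or CIMI/AML identifiers.

```python
# Illustrative only: map tabular data-dictionary rows onto a nested,
# archetype-like element structure. Field and element names are
# invented for this sketch, not actual CIMI/AML definitions.

def to_archetype_element(row: dict) -> dict:
    return {
        "id": row["VARNAME"],
        "rm_type": {"integer": "DV_COUNT",      # hypothetical type mapping
                    "decimal": "DV_QUANTITY",
                    "string": "DV_TEXT"}.get(row["TYPE"], "DV_TEXT"),
        "description": row.get("DESCRIPTION", ""),
        "units": row.get("UNITS"),
    }

dictionary = [
    {"VARNAME": "AGE", "TYPE": "integer", "DESCRIPTION": "Age at enrollment"},
    {"VARNAME": "BMI", "TYPE": "decimal", "UNITS": "kg/m2",
     "DESCRIPTION": "Body mass index"},
]
archetype = {"archetype_id": "study-dictionary.v0",   # hypothetical id
             "elements": [to_archetype_element(r) for r in dictionary]}
print(archetype["elements"][0]["rm_type"])  # DV_COUNT
```

Harmonizing several dictionaries then becomes a matter of comparing their mapped elements rather than their heterogeneous tabular layouts.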

  10. Journal article reporting standards for quantitative research in psychology: The APA Publications and Communications Board task force report.

    PubMed

    Appelbaum, Mark; Cooper, Harris; Kline, Rex B; Mayo-Wilson, Evan; Nezu, Arthur M; Rao, Stephen M

    2018-01-01

    Following a review of extant reporting standards for scientific publication, and reviewing 10 years of experience since publication of the first set of reporting standards by the American Psychological Association (APA; APA Publications and Communications Board Working Group on Journal Article Reporting Standards, 2008), the APA Working Group on Quantitative Research Reporting Standards recommended some modifications to the original standards. Examples of modifications include division of hypotheses, analyses, and conclusions into 3 groupings (primary, secondary, and exploratory) and some changes to the section on meta-analysis. Several new modules are included that report standards for observational studies, clinical trials, longitudinal studies, replication studies, and N-of-1 studies. In addition, standards for analytic methods with unique characteristics and output (structural equation modeling and Bayesian analysis) are included. These proposals were accepted by the Publications and Communications Board of APA and supersede the standards included in the 6th edition of the Publication Manual of the American Psychological Association (APA, 2010). (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  11. Generic worklist handler for workflow-enabled products

    NASA Astrophysics Data System (ADS)

    Schmidt, Joachim; Meetz, Kirsten; Wendler, Thomas

    1999-07-01

    Workflow management (WfM) is an emerging field of medical information technology. It appears as a promising key technology to model, optimize and automate processes, for the sake of improved efficiency, reduced costs and improved patient care. The application of WfM concepts requires the standardization of architectures and interfaces. A component of central interest proposed in this report is a generic work list handler: a standardized interface between a workflow enactment service and an application system. Application systems with embedded work list handlers will be called 'Workflow Enabled Application Systems'. In this paper we discuss the functional requirements of work list handlers, as well as their integration into workflow architectures and interfaces. To lay the foundation for this specification, basic workflow terminology, the fundamentals of workflow management and - later in the paper - the available standards as defined by the Workflow Management Coalition are briefly reviewed.
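The role of such a work list handler can be sketched as a narrow interface between the enactment service and an application. The class and method names below are invented for illustration; they are not the Workflow Management Coalition's actual interface definitions.

```python
# Sketch of a generic worklist-handler interface between a workflow
# enactment service and an application. All names are illustrative,
# not the WfMC's standardized interfaces.
from dataclasses import dataclass, field

@dataclass
class WorkItem:
    item_id: str
    activity: str
    state: str = "offered"   # offered -> allocated -> completed

@dataclass
class WorklistHandler:
    items: dict = field(default_factory=dict)

    def offer(self, item: WorkItem) -> None:
        """Called by the enactment service to publish a work item."""
        self.items[item.item_id] = item

    def allocate(self, item_id: str) -> WorkItem:
        """Called by the application when a user accepts an item."""
        item = self.items[item_id]
        item.state = "allocated"
        return item

    def complete(self, item_id: str) -> None:
        """Report completion back to the enactment service."""
        self.items[item_id].state = "completed"

wl = WorklistHandler()
wl.offer(WorkItem("w1", "Review CT study"))
wl.allocate("w1")
wl.complete("w1")
print(wl.items["w1"].state)  # completed
```

Embedding such a handler is what would make an application "workflow enabled": the enactment service drives the item lifecycle without knowing application internals.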

  12. Comparison of Fast Neutron Detector Technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stange, Sy; Mckigney, Edward Allen

    2015-02-09

    This report documents the work performed for the Department of Homeland Security Domestic Nuclear Detection Office as the project Fast Neutron Detection Evaluation under contract HSHQDC-14-X-00022. This study was performed as a follow-on to the project Study of Fast Neutron Signatures and Measurement Techniques for SNM Detection - DNDO CFP11-100 STA-01. That work compared various detector technologies in a portal monitor configuration, focusing on a comparison between a number of fast neutron detection techniques and two standard thermal neutron detection technologies. The conclusions of the earlier work are contained in the report Comparison of Fast Neutron Detector Technologies. This work is designed to address questions raised about assumptions underlying the models built for the earlier project. To that end, liquid scintillators of two different sizes, one a commercial, off-the-shelf (COTS) model of standard dimensions and the other a large, planar module, were characterized at Los Alamos National Laboratory. The results of those measurements were combined with the results of the earlier models to gain a more complete picture of the performance of liquid scintillator as a portal monitor technology.

  13. Using CMAQ to provide probabilistic assessment of emission control scenarios in meeting the ozone standard

    EPA Science Inventory

    In the United States, regional-scale air quality models are being used to identify emissions reductions needed to comply with the ozone National Ambient Air Quality Standard. Previous work has demonstrated that ozone extreme values (i.e., 4th highest ozone or Design Value) are c...

  14. Combining the CIDOC CRM and MPEG-7 to Describe Multimedia in Museums.

    ERIC Educational Resources Information Center

    Hunter, Jane

    This paper describes a proposal for an interoperable metadata model, based on international standards, that has been designed to enable the description, exchange and sharing of multimedia resources both within and between cultural institutions. Domain-specific ontologies have been developed by two different ISO Working Groups to standardize the…

  15. Teaching Information Ethics to High School Students

    ERIC Educational Resources Information Center

    Lehman, Kathy

    2009-01-01

    The new AASL standards clearly spell out ethical responsibilities, which school librarians strive to instill and model as they work with staff and students. In this article, the author presents the AASL standards together with some tips and lesson ideas which she and her library partner have put into practice within their library media program.

  16. Using ISO 25040 standard for evaluating electronic health record systems.

    PubMed

    Oliveira, Marília; Novaes, Magdala; Vasconcelos, Alexandre

    2013-01-01

    Quality of electronic health record systems (EHR-S) is one of the key points in the discussion about the safe use of this kind of system. It stimulates the creation of technical standards and certifications in order to establish the minimum requirements expected for these systems. [1] On the other hand, EHR-S suppliers need to invest in evaluation of their products to provide systems according to these requirements. This work presents a proposal to use the ISO 25040 standard, which focuses on the evaluation of software products, to define an evaluation model for EHR-S in relation to the Brazilian Certification for Electronic Health Record Systems - SBIS-CFM Certification. The proposal instantiates the process described in the ISO 25040 standard using the set of requirements that is the scope of the Brazilian certification. As its first results, this research has produced an evaluation model and a scale to classify an EHR-S by its compliance level with respect to the certification. This work in progress is part of the requirements for a master's degree in Computer Science at the Federal University of Pernambuco.
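The idea of mapping checklist results onto a compliance scale can be shown with a toy scorer; the thresholds and level names below are invented for illustration and are not the scale defined in this work or the SBIS-CFM requirements.

```python
# Toy compliance scorer: place an EHR-S on a compliance scale from a
# certification requirement checklist. Thresholds, level names and
# checklist items are invented, not the SBIS-CFM scale.

def compliance_level(results: dict) -> str:
    met = sum(1 for ok in results.values() if ok)
    ratio = met / len(results)
    if ratio == 1.0:
        return "fully compliant"
    if ratio >= 0.7:
        return "largely compliant"
    if ratio >= 0.4:
        return "partially compliant"
    return "non-compliant"

checklist = {"access control": True, "audit trail": True,
             "digital signature": False, "data backup": True,
             "patient identification": True}
print(compliance_level(checklist))  # largely compliant
```

An ISO 25040-style evaluation would additionally document the evaluation requirements, specification, design and execution around such a scoring step.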

  17. A Dedicated Diversity Course or an Infusion Model? Exploring Which Strategy Is More Effective in Social Work Pedagogy

    ERIC Educational Resources Information Center

    Pitner, Ronald O.; Priester, Mary Ann; Lackey, Richard; Duvall, Deborah

    2018-01-01

    The Council on Social Work Education requires schools of social work to meet diversity and social justice competencies. Many MSW programs meet these standards by having either a dedicated diversity and social justice course, or by using some form of diversity and social justice curricular infusion. The current study explored which of these…

  18. A physiome standards-based model publication paradigm.

    PubMed

    Nickerson, David P; Buist, Martin L

    2009-05-28

    In this era of widespread broadband Internet penetration and powerful Web browsers on most desktops, a shift in the publication paradigm for physiome-style models is envisaged. No longer will model authors simply submit an essentially textual description of the development and behaviour of their model. Rather, they will submit a complete working implementation of the model encoded and annotated according to the various standards adopted by the physiome project, accompanied by a traditional human-readable summary of the key scientific goals and outcomes of the work. While the final published, peer-reviewed article will look little different to the reader, in this new paradigm, both reviewers and readers will be able to interact with, use and extend the models in ways that are not currently possible. Here, we review recent developments that are laying the foundations for this new model publication paradigm. Initial developments have focused on the publication of mathematical models of cellular electrophysiology, using technology based on a CellML- or Systems Biology Markup Language (SBML)-encoded implementation of the mathematical models. Here, we review the current state of the art and what needs to be done before such a model publication becomes commonplace.

  19. Urban Climate Resilience - Connecting climate models with decision support cyberinfrastructure using open standards

    NASA Astrophysics Data System (ADS)

    Bermudez, L. E.; Percivall, G.; Idol, T. A.

    2015-12-01

    Experts in climate modeling, remote sensing of the Earth, and cyberinfrastructure must work together in order to make climate predictions available to decision makers. Such experts and decision makers worked together in the Open Geospatial Consortium's (OGC) Testbed 11 to address a scenario of population displacement by coastal inundation due to the predicted sea level rise. In a Policy Fact Sheet, "Harnessing Climate Data to Boost Ecosystem & Water Resilience", issued by the White House Office of Science and Technology Policy (OSTP) in December 2014, OGC committed to increasing access to climate change information using open standards. In July 2015, the OGC Testbed 11 Urban Climate Resilience activity delivered on that commitment with open-standards-based support for climate-change preparedness. Using open standards such as the OGC Web Coverage Service and Web Processing Service and the NetCDF and GMLJP2 encoding standards, Testbed 11 deployed an interoperable high-resolution flood model to bring climate model outputs together with global change assessment models and other remote sensing data for decision support. Methods to confirm model predictions and to allow "what-if" scenarios included in-situ sensor webs and crowdsourcing. The scenario was set in two locations: the San Francisco Bay Area and Mozambique. The scenarios demonstrated the interoperation and capabilities of open geospatial specifications in supporting data services and processing services. The resultant High Resolution Flood Information System addressed access and control of simulation models and high-resolution data in an open, worldwide, collaborative Web environment. The scenarios examined the feasibility and capability of existing OGC geospatial Web service specifications in supporting the on-demand, dynamic serving of flood information from models with forecasting capacity. Results of this testbed included the identification of standards and best practices that help researchers and cities deal with climate-related issues; these results will now be deployed in pilot applications. The testbed also identified areas of additional development needed to guide the scientific investments and cyberinfrastructure approaches required to apply climate science research results to urban climate resilience.
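    A GetCoverage request against such a Web Coverage Service can be sketched as follows; the endpoint and coverage identifier are hypothetical (no Testbed 11 endpoint is given in the abstract), while the parameter names follow the OGC WCS 2.0 key-value binding:

```python
from urllib.parse import urlencode

# Hypothetical endpoint and coverageId; the KVP parameter names follow the
# OGC WCS 2.0 GetCoverage key-value binding.
BASE = "https://example.org/wcs"  # assumption: illustrative service URL only

params = {
    "service": "WCS",
    "version": "2.0.1",
    "request": "GetCoverage",
    "coverageId": "flood_depth_sf_bay",  # hypothetical coverage name
    "format": "application/x-netcdf",    # NetCDF output, as used in the testbed
}
# Spatial subsetting is expressed as repeated 'subset' keys in WCS 2.0 KVP.
subsets = ["Lat(37.4,38.0)", "Long(-122.6,-121.7)"]

url = BASE + "?" + urlencode(params) + "".join("&subset=" + s for s in subsets)
print(url)
```

A client would then fetch `url` and hand the NetCDF payload to the flood model; the subsetting keeps the transfer to the area of interest.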

  20. The association of fatigue, pain, depression and anxiety with work and activity impairment in immune mediated inflammatory diseases.

    PubMed

    Enns, Murray W; Bernstein, Charles N; Kroeker, Kristine; Graff, Lesley; Walker, John R; Lix, Lisa M; Hitchon, Carol A; El-Gabalawy, Renée; Fisk, John D; Marrie, Ruth Ann

    2018-01-01

    Impairment in work function is a frequent outcome in patients with chronic conditions such as immune-mediated inflammatory diseases (IMID), depression and anxiety disorders. The personal and economic costs of work impairment in these disorders are immense. Symptoms of pain, fatigue, depression and anxiety are potentially remediable forms of distress that may contribute to work impairment in chronic health conditions such as IMID. The present study evaluated the association between pain [Medical Outcomes Study Pain Effects Scale], fatigue [Daily Fatigue Impact Scale], depression and anxiety [Hospital Anxiety and Depression Scale] and work impairment [Work Productivity and Activity Impairment Scale] in four patient populations: multiple sclerosis (n = 255), inflammatory bowel disease (n = 248), rheumatoid arthritis (n = 154) and a depression and anxiety group (n = 307), using quantile regression, controlling for the effects of sociodemographic factors, physical disability, and cognitive deficits. Each of pain, depression symptoms, anxiety symptoms, and fatigue individually showed significant associations with work absenteeism, presenteeism, and general activity impairment (quantile regression standardized estimates ranging from 0.3 to 1.0). When the distress variables were entered concurrently into the regression models, fatigue was a significant predictor of work and activity impairment in all models (quantile regression standardized estimates ranging from 0.2 to 0.5). These findings have important clinical implications for understanding the determinants of work impairment and for improving work-related outcomes in chronic disease.
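    The quantile-regression machinery the study relies on can be sketched, on synthetic data, as subgradient descent on the pinball loss; the study's actual predictors, scales, and software are not reproduced here:

```python
import numpy as np

# Sketch of median (q = 0.5) quantile regression via subgradient descent on
# the pinball loss. Synthetic data only.
rng = np.random.default_rng(0)
n = 300
x = rng.uniform(0, 10, n)
y = 1.0 + 2.0 * x + rng.normal(0, 1, n)        # true median line: 1 + 2x
X = np.column_stack([np.ones(n), x])

q, lr, beta = 0.5, 0.02, np.zeros(2)
for _ in range(4000):
    resid = y - X @ beta
    # Subgradient of the pinball loss rho_q(u) = u * (q - 1[u < 0]) w.r.t. beta
    grad = -X.T @ (q - (resid < 0)) / n
    beta -= lr * grad

print(beta)  # slope should land near the true value of 2
```

Conditioning on covariates, as in the study, amounts to adding the corresponding columns to `X`.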

  1. Effects of individualized electrical impedance tomography and image reconstruction settings upon the assessment of regional ventilation distribution: Comparison to 4-dimensional computed tomography in a porcine model

    PubMed Central

    Mudrak, Daniel; Kampusch, Stefan; Wielandner, Alice; Prosch, Helmut; Braun, Christina; Toemboel, Frédéric P. R.; Hofmanninger, Johannes; Kaniusas, Eugenijus

    2017-01-01

    Electrical impedance tomography (EIT) is a promising imaging technique for bedside monitoring of lung function. It is easily applicable, cheap and requires no ionizing radiation, but clinical interpretation of EIT-images is still not standardized. One of the reasons for this is the ill-posed nature of EIT, allowing a range of possible images to be produced, rather than a single explicit solution. Thus, to further advance the EIT technology for clinical application, thorough examination of EIT-image reconstruction settings, i.e., mathematical parameters and the addition of a priori (e.g., anatomical) information, is essential. In the present work, regional ventilation distribution profiles derived from different EIT finite-element reconstruction models and settings (for GREIT and Gauss Newton) were compared to regional aeration profiles assessed by the gold standard of 4-dimensional computed tomography (4DCT) by calculating the root mean squared error (RMSE). Specifically, non-individualized reconstruction models (based on circular and averaged thoracic contours) and individualized reconstruction models (based on true thoracic contours) were compared. Our results suggest that GREIT with a noise figure of 0.15 and non-uniform background works best for the assessment of regional ventilation distribution by EIT, as verified versus 4DCT. Furthermore, the RMSE of anteroposterior ventilation profiles decreased from 2.53±0.62% to 1.67±0.49% while correlation increased from 0.77 to 0.89 after embedding anatomical information into the reconstruction models. In conclusion, the present work reveals that anatomically enhanced EIT-image reconstruction is superior to non-individualized reconstruction models, but further investigation in humans, so as to standardize reconstruction settings, is warranted. PMID:28763474
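    The RMSE comparison between ventilation profiles is straightforward to sketch; the profile values below are invented for illustration:

```python
import numpy as np

# Minimal sketch of the comparison metric: RMSE between an EIT-derived and a
# 4DCT-derived anteroposterior ventilation profile (values are made up).
def rmse(a, b):
    a, b = np.asarray(a, float), np.asarray(b, float)
    return float(np.sqrt(np.mean((a - b) ** 2)))

eit_profile = [0.05, 0.15, 0.30, 0.30, 0.20]  # hypothetical fraction per region
ct_profile  = [0.06, 0.14, 0.28, 0.32, 0.20]
print(round(rmse(eit_profile, ct_profile), 4))  # -> 0.0141
```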

  2. Secular changes in standards of bodily attractiveness in women: tests of a reproductive model.

    PubMed

    Barber, N

    1998-05-01

    Since success at work is favored by a more slender body build while reproduction is favored by curvaceousness, standards of women's bodily attractiveness should be predictable from economic and reproductive variables. This hypothesis was tested in a replication and extension of a study by Silverstein, Perdue, Peterson, Vogel, and Fantini (1986) which looked at correlates of curvaceousness of Vogue models over time. As economic prosperity increased, and as women's participation in the economy, and higher education, increased, curvaceousness of the standards declined. As the proportion of single women to men, both aged 20-24 years, increased, and as the birth rate declined, curvaceousness was reduced. Results suggest that cultural standards of attractiveness are influenced by an evolved psychology of mate selection.

  3. Top-quark mass coupling and classification of weakly coupled heterotic superstring vacua

    NASA Astrophysics Data System (ADS)

    Rizos, J.

    2014-06-01

    The quest for the Standard Model among the huge number of string vacua is usually based on a set of phenomenological criteria related to the massless spectrum of string models. In this work we study criteria associated with interactions in the effective low energy theory and in particular with the presence of the coupling that provides mass to the top quark. Working in the context of the free-fermionic formulation of the heterotic superstring, we demonstrate that, in a big class of phenomenologically promising compactifications, these criteria can be expressed entirely in terms of the generalised GSO projection coefficients entering the definition of the models. They are shown to be very efficient in identifying phenomenologically viable vacua, especially in the framework of computer-based search, as they are met by approximately one every models. We apply our results in the investigation of a class of supersymmetric Pati-Salam vacua, comprising configurations, and we show that when combined with other phenomenological requirements they lead to a relatively small set of about Standard Model compatible models that can be fully classified.

  4. Breast cancer screening services: trade-offs in quality, capacity, outreach, and centralization.

    PubMed

    Güneş, Evrim D; Chick, Stephen E; Akşin, O Zeynep

    2004-11-01

    This work combines and extends previous work on breast cancer screening models by explicitly incorporating, for the first time, aspects of the dynamics of health care states, program outreach, and the screening volume-quality relationship in a service system model to examine the effect of public health policy and service capacity decisions on public health outcomes. We consider the impact of increasing standards for minimum reading volume to improve quality, expanding outreach with or without decentralization of service facilities, and the potential for queueing due to stochastic effects and limited capacity. The results indicate a strong relation between screening quality and the cost of screening and treatment, and emphasize the importance of accounting for service dynamics when assessing the performance of health care interventions. For breast cancer screening, increasing outreach without improving quality and maintaining capacity results in less benefit than predicted by standard models.
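    The capacity effect described (queueing due to stochastic arrivals and limited service capacity) is commonly captured with an Erlang-C type calculation; the sketch below uses hypothetical arrival and service rates, not the paper's calibrated model:

```python
import math

# Generic Erlang-C sketch of the queueing effect: expected waiting with c
# screening units, Poisson arrivals at rate lam and service rate mu per unit.
# The numeric figures below are hypothetical, not from the paper.
def erlang_c(lam: float, mu: float, c: int):
    a = lam / mu                      # offered load
    rho = a / c                       # utilisation; must be < 1 for stability
    if rho >= 1:
        raise ValueError("unstable system: utilisation >= 1")
    top = a**c / math.factorial(c)
    denom = (1 - rho) * sum(a**k / math.factorial(k) for k in range(c)) + top
    p_wait = top / denom              # probability an arrival must queue
    wq = p_wait / (c * mu - lam)      # mean wait in queue
    return p_wait, wq

p, w = erlang_c(lam=0.5, mu=1.0, c=1)   # c = 1 reduces to M/M/1: p = rho = 0.5
print(p, w)
```

Raising quality standards that slow reading (lower `mu`) or expanding outreach (higher `lam`) both push utilisation up, and waits grow nonlinearly as it approaches 1.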

  5. Geometric Metamorphosis

    PubMed Central

    Niethammer, Marc; Hart, Gabriel L.; Pace, Danielle F.; Vespa, Paul M.; Irimia, Andrei; Van Horn, John D.; Aylward, Stephen R.

    2013-01-01

    Standard image registration methods do not account for changes in image appearance. Hence, metamorphosis approaches have been developed which jointly estimate a space deformation and a change in image appearance to construct a spatio-temporal trajectory smoothly transforming a source to a target image. For standard metamorphosis, geometric changes are not explicitly modeled. We propose a geometric metamorphosis formulation, which explains changes in image appearance by a global deformation, a deformation of a geometric model, and an image composition model. This work is motivated by the clinical challenge of predicting the long-term effects of traumatic brain injuries based on time-series images. This work is also applicable to the quantification of tumor progression (e.g., estimating its infiltrating and displacing components) and predicting chronic blood perfusion changes after stroke. We demonstrate the utility of the method using simulated data as well as scans from a clinical traumatic brain injury patient. PMID:21995083

  6. Clinical data interoperability based on archetype transformation.

    PubMed

    Costa, Catalina Martínez; Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás

    2011-10-01

    The semantic interoperability between health information systems is a major challenge to improve the quality of clinical practice and patient safety. In recent years many projects have faced this problem and provided solutions based on specific standards and technologies in order to satisfy the needs of a particular scenario. Most of such solutions cannot be easily adapted to new scenarios, thus more global solutions are needed. In this work, we have focused on the semantic interoperability of electronic healthcare records standards based on the dual model architecture and we have developed a solution that has been applied to ISO 13606 and openEHR. The technological infrastructure combines reference models, archetypes and ontologies, with the support of Model-driven Engineering techniques. For this purpose, the interoperability infrastructure developed in previous work by our group has been reused and extended to cover the requirements of data transformation. Copyright © 2011 Elsevier Inc. All rights reserved.

  7. A comparison of BPMN 2.0 with other notations for manufacturing processes

    NASA Astrophysics Data System (ADS)

    García-Domínguez, A.; Marcos, Mariano; Medina, I.

    2012-04-01

    In order to study their current practices and improve on them, manufacturing firms need to view their processes from several viewpoints at various abstraction levels. Several notations have been developed for this purpose, such as Value Stream Mappings or IDEF models. More recently, the BPMN 2.0 standard from the Object Management Group has been proposed for modeling business processes. A process organizes several activities (manual or automatic) into a single higher-level entity, which can be reused elsewhere in the organization. Its potential for standardizing business interactions is well-known, but there is little work on using BPMN 2.0 to model manufacturing processes. In this work some of the previous notations are outlined and BPMN 2.0 is positioned among them after discussing it in more depth. Some guidelines on using BPMN 2.0 for manufacturing are offered, and its advantages and disadvantages in comparison with the other notations are presented.

  8. Estimating airline operating costs

    NASA Technical Reports Server (NTRS)

    Maddalon, D. V.

    1978-01-01

    A review was made of the factors affecting commercial aircraft operating and delay costs. From this work, an airline operating cost model was developed which includes a method for estimating the labor and material costs of individual airframe maintenance systems. The model, similar in some respects to the standard Air Transport Association of America (ATA) Direct Operating Cost Model, permits estimates of aircraft-related costs not now included in the standard ATA model (e.g., aircraft service, landing fees, flight attendants, and control fees). A study of the cost of aircraft delay was also made and a method for estimating the cost of certain types of airline delay is described.
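    The cost roll-up such a model performs can be sketched as a simple sum over per-block-hour items; every figure and item name below is hypothetical, merely illustrating the structure (standard ATA items plus the additional non-ATA items the model covers):

```python
# Back-of-the-envelope sketch of a per-block-hour operating cost roll-up in
# the spirit of an ATA-style model. Every figure below is hypothetical.
cost_items = {
    "flight_crew":          650.0,  # $/block hour (hypothetical)
    "fuel_and_oil":        1900.0,
    "maintenance_labor":    420.0,  # per-system labor estimates, summed
    "maintenance_material": 380.0,
    "depreciation":         700.0,
    "landing_fees":         150.0,  # item outside the standard ATA DOC model
    "flight_attendants":    260.0,  # likewise an added, non-ATA item
}
doc_per_hour = sum(cost_items.values())
block_hours_per_year = 3200         # hypothetical annual utilisation
print(doc_per_hour, doc_per_hour * block_hours_per_year)
```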

  9. Food-and-beverage environment and procurement policies for healthier work environments.

    PubMed

    Gardner, Christopher D; Whitsel, Laurie P; Thorndike, Anne N; Marrow, Mary W; Otten, Jennifer J; Foster, Gary D; Carson, Jo Ann S; Johnson, Rachel K

    2014-06-01

    The importance of creating healthier work environments by providing healthy foods and beverages in worksite cafeterias, in on-site vending machines, and at meetings and conferences is drawing increasing attention. Large employers, federal and state governments, and hospital systems are significant purchasers and providers of food and beverages. The American Heart Association, federal government, and other organizations have created procurement standards to guide healthy purchasing by these entities. There is a need to review how procurement standards are currently implemented, to identify important minimum criteria for evaluating health and purchasing outcomes, and to recognize significant barriers and challenges to implementation, along with success stories. The purpose of this policy paper is to describe the role of food-and-beverage environment and procurement policy standards in creating healthier worksite environments; to review recently created national model standards; to identify elements across the standards that are important to consider for incorporation into policies; and to delineate issues to address as standards are implemented across the country. © 2014 International Life Sciences Institute.

  10. Estimated work ability in warm outdoor environments depends on the chosen heat stress assessment metric.

    PubMed

    Bröde, Peter; Fiala, Dusan; Lemke, Bruno; Kjellstrom, Tord

    2018-03-01

    With a view to occupational effects of climate change, we performed a simulation study on the influence of different heat stress assessment metrics on estimated workability (WA) of labour in warm outdoor environments. Whole-day shifts with varying workloads were simulated using as input meteorological records for the hottest month from four cities with prevailing hot (Dallas, New Delhi) or warm-humid conditions (Managua, Osaka), respectively. In addition, we considered the effects of adaptive strategies like shielding against solar radiation and different work-rest schedules assuming an acclimated person wearing light work clothes (0.6 clo). We assessed WA according to Wet Bulb Globe Temperature (WBGT) by means of an empirical relation of worker performance from field studies (Hothaps), and as allowed work hours using safety threshold limits proposed by the corresponding standards. Using the physiological models Predicted Heat Strain (PHS) and Universal Thermal Climate Index (UTCI)-Fiala, we calculated WA as the percentage of working hours with body core temperature and cumulated sweat loss below standard limits (38 °C and 7.5% of body weight, respectively) recommended by ISO 7933 and below conservative (38 °C; 3%) and liberal (38.2 °C; 7.5%) limits in comparison. ANOVA results showed that the different metrics, workload, time of day and climate type determined the largest part of WA variance. WBGT-based metrics were highly correlated and indicated slightly more constrained WA for moderate workload, but were less restrictive with high workload and for afternoon work hours compared to PHS and UTCI-Fiala. Though PHS showed unrealistic dynamic responses to rest from work compared to UTCI-Fiala, differences in WA assessed by the physiological models largely depended on the applied limit criteria. In conclusion, our study showed that the choice of the heat stress assessment metric impacts notably on the estimated WA. 
Whereas PHS and UTCI-Fiala can account for cumulative physiological strain imposed by extended work hours when working heavily under high heat stress, the current WBGT standards do not include this. Advanced thermophysiological models might help developing alternatives, where not only modelling details but also the choice of physiological limit criteria will require attention. There is also an urgent need for suitable empirical data relating workplace heat exposure to workability.
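    The workability criterion used with the physiological models can be sketched as the share of simulated work hours in which core temperature and cumulative sweat loss stay below the chosen limits; the hourly trajectories below are invented for illustration:

```python
import numpy as np

# Sketch of the workability (WA) criterion: fraction of simulated work hours
# with core temperature and cumulative sweat loss below the limits recommended
# by ISO 7933 (38.0 degrees C; 7.5% of body weight). Trajectories are made up.
core_temp_c = np.array([37.1, 37.4, 37.7, 38.1, 38.3, 37.9, 37.6, 37.8])  # hourly
sweat_loss_pct = np.cumsum([0.4, 0.5, 0.6, 0.7, 0.8, 0.6, 0.5, 0.5])      # % body wt

def workability(temp, sweat, t_lim=38.0, s_lim=7.5):
    ok = (temp < t_lim) & (sweat < s_lim)
    return float(ok.mean())

print(workability(core_temp_c, sweat_loss_pct))  # -> 0.75 (6 of 8 hours)
```

Swapping in the conservative (38 °C; 3%) or liberal (38.2 °C; 7.5%) limits is just a change of `t_lim` and `s_lim`, which is how the paper compares limit criteria.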

  11. Estimated work ability in warm outdoor environments depends on the chosen heat stress assessment metric

    NASA Astrophysics Data System (ADS)

    Bröde, Peter; Fiala, Dusan; Lemke, Bruno; Kjellstrom, Tord

    2018-03-01

    With a view to occupational effects of climate change, we performed a simulation study on the influence of different heat stress assessment metrics on estimated workability (WA) of labour in warm outdoor environments. Whole-day shifts with varying workloads were simulated using as input meteorological records for the hottest month from four cities with prevailing hot (Dallas, New Delhi) or warm-humid conditions (Managua, Osaka), respectively. In addition, we considered the effects of adaptive strategies like shielding against solar radiation and different work-rest schedules assuming an acclimated person wearing light work clothes (0.6 clo). We assessed WA according to Wet Bulb Globe Temperature (WBGT) by means of an empirical relation of worker performance from field studies (Hothaps), and as allowed work hours using safety threshold limits proposed by the corresponding standards. Using the physiological models Predicted Heat Strain (PHS) and Universal Thermal Climate Index (UTCI)-Fiala, we calculated WA as the percentage of working hours with body core temperature and cumulated sweat loss below standard limits (38 °C and 7.5% of body weight, respectively) recommended by ISO 7933 and below conservative (38 °C; 3%) and liberal (38.2 °C; 7.5%) limits in comparison. ANOVA results showed that the different metrics, workload, time of day and climate type determined the largest part of WA variance. WBGT-based metrics were highly correlated and indicated slightly more constrained WA for moderate workload, but were less restrictive with high workload and for afternoon work hours compared to PHS and UTCI-Fiala. Though PHS showed unrealistic dynamic responses to rest from work compared to UTCI-Fiala, differences in WA assessed by the physiological models largely depended on the applied limit criteria. In conclusion, our study showed that the choice of the heat stress assessment metric impacts notably on the estimated WA. 
Whereas PHS and UTCI-Fiala can account for cumulative physiological strain imposed by extended work hours when working heavily under high heat stress, the current WBGT standards do not include this. Advanced thermophysiological models might help developing alternatives, where not only modelling details but also the choice of physiological limit criteria will require attention. There is also an urgent need for suitable empirical data relating workplace heat exposure to workability.

  12. Handbook of LHC Higgs Cross Sections: 4. Deciphering the Nature of the Higgs Sector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    de Florian, D.

    This Report summarizes the results of the activities of the LHC Higgs Cross Section Working Group in the period 2014-2016. The main goal of the working group was to present the state-of-the-art of Higgs physics at the LHC, integrating all new results that have appeared in the last few years. The first part compiles the most up-to-date predictions of Higgs boson production cross sections and decay branching ratios, parton distribution functions, and off-shell Higgs boson production and interference effects. The second part discusses the recent progress in Higgs effective field theory predictions, followed by the third part on pseudo-observables, simplified template cross section and fiducial cross section measurements, which give the baseline framework for Higgs boson property measurements. The fourth part deals with the beyond the Standard Model predictions of various benchmark scenarios of Minimal Supersymmetric Standard Model, extended scalar sector, Next-to-Minimal Supersymmetric Standard Model and exotic Higgs boson decays. This report follows three previous working-group reports: Handbook of LHC Higgs Cross Sections: 1. Inclusive Observables (CERN-2011-002), Handbook of LHC Higgs Cross Sections: 2. Differential Distributions (CERN-2012-002), and Handbook of LHC Higgs Cross Sections: 3. Higgs properties (CERN-2013-004). The current report serves as the baseline reference for Higgs physics in LHC Run 2 and beyond.

  13. When Work Comes First: Young Adults in Vocational Education and Training in Norway

    ERIC Educational Resources Information Center

    Tønder, Anna Hagen; Aspøy, Tove Mogstad

    2017-01-01

    Since reforms implemented in 1994, vocational education and training (VET) in Norway has been integrated and standardized as part of upper-secondary education. When young people enter upper-secondary education at the age of 15 or 16, they can choose either a vocational programme or a general academic programme. The standard model in vocational…

  14. Impersonating the Standard Model Higgs boson: Alignment without decoupling

    DOE PAGES

    Carena, Marcela; Low, Ian; Shah, Nausheen R.; ...

    2014-04-03

    In models with an extended Higgs sector there exists an alignment limit, in which the lightest CP-even Higgs boson mimics the Standard Model Higgs. The alignment limit is commonly associated with the decoupling limit, where all non-standard scalars are significantly heavier than the Z boson. However, alignment can occur irrespective of the mass scale of the rest of the Higgs sector. In this work we discuss the general conditions that lead to "alignment without decoupling", therefore allowing for the existence of additional non-standard Higgs bosons at the weak scale. The values of tan β for which this happens are derived in terms of the effective Higgs quartic couplings in general two-Higgs-doublet models as well as in supersymmetric theories, including the MSSM and the NMSSM. In addition, we study the information encoded in the variations of the SM Higgs-fermion couplings to explore regions in the m_A–tan β parameter space.

  15. PACS for surgery and interventional radiology: features of a Therapy Imaging and Model Management System (TIMMS).

    PubMed

    Lemke, Heinz U; Berliner, Leonard

    2011-05-01

    Appropriate use of information and communication technology (ICT) and mechatronic (MT) systems is viewed by many experts as a means to improve workflow and quality of care in the operating room (OR). This will require a suitable information technology (IT) infrastructure, as well as communication and interface standards, such as specialized extensions of DICOM, to allow data interchange between surgical system components in the OR. A design of such an infrastructure, sometimes referred to as surgical PACS, but better defined as a Therapy Imaging and Model Management System (TIMMS), will be introduced in this article. A TIMMS should support the essential functions that enable and advance image guided therapy, and in the future, a more comprehensive form of patient-model guided therapy. Within this concept, the "image-centric world view" of the classical PACS technology is complemented by an IT "model-centric world view". Such a view is founded in the special patient modelling needs of an increasing number of modern surgical interventions as compared to the imaging intensive working mode of diagnostic radiology, for which PACS was originally conceptualised and developed. The modelling aspects refer to both patient information and workflow modelling. Standards for creating and integrating information about patients, equipment, and procedures are vitally needed when planning for an efficient OR. The DICOM Working Group 24 (WG-24) has been established to develop DICOM objects and services related to image and model guided surgery. To determine these standards, it is important to define step-by-step surgical workflow practices and create interventional workflow models per procedure or per variable case. As the boundaries between radiation therapy, surgery and interventional radiology are becoming less well-defined, precise patient models will become the greatest common denominator for all therapeutic disciplines. 
In addition to imaging, the focus of WG-24 is to serve the therapeutic disciplines by enabling modelling technology to be based on standards. Copyright © 2011. Published by Elsevier Ireland Ltd.

  16. Participative Work Design in Lean Production: A Strategy for Dissolving the Paradox between Standardized Work and Team Proactivity by Stimulating Team Learning?

    ERIC Educational Resources Information Center

    Lantz, Annika; Hansen, Niklas; Antoni, Conny

    2015-01-01

    Purpose: The purpose of this paper is to explore job design mechanisms that enhance team proactivity within a lean production system where autonomy is uttermost restricted. We propose and test a model where the team learning process of building shared meaning of work mediates the relationship between team participative decision-making, inter team…

  17. Test of a Power Transfer Model for Standardized Electrofishing

    USGS Publications Warehouse

    Miranda, L.E.; Dolan, C.R.

    2003-01-01

    Standardization of electrofishing in waters with differing conductivities is critical when monitoring temporal and spatial differences in fish assemblages. We tested a model that can help improve the consistency of electrofishing by allowing control over the amount of power that is transferred to the fish. The primary objective was to verify, under controlled laboratory conditions, whether the model adequately described fish immobilization responses elicited with various electrical settings over a range of water conductivities. We found that the model accurately described empirical observations over conductivities ranging from 12 to 1,030 µS/cm for DC and various pulsed-DC settings. Because the model requires knowledge of a fish's effective conductivity, an attribute that is likely to vary according to species, size, temperature, and other variables, a second objective was to gather available estimates of the effective conductivity of fish to examine the magnitude of variation and to assess whether in practical applications a standard effective conductivity value for fish may be assumed. We found that applying a standard fish effective conductivity of 115 µS/cm introduced relatively little error into the estimation of the peak power density required to immobilize fish with electrofishing. However, this standard was derived from few estimates of fish effective conductivity and a limited number of species; more estimates are needed to validate our working standard.
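    The abstract does not reproduce the model's equations. As a rough illustration of why a standard fish effective conductivity (115 µS/cm) lets applied power be adjusted across water conductivities, the sketch below assumes a generic impedance-matching-style efficiency; this form is an assumption for illustration, not the paper's exact formulation:

```python
# Illustrative only: the paper's power transfer equations are not given in the
# abstract. A generic matching form, eff = 4r / (1 + r)**2 with r = Cf/Cw, is
# used as a stand-in to show how applied power would be scaled with water
# conductivity to keep the power reaching the fish constant.
def transfer_efficiency(cw_us_cm: float, cf_us_cm: float = 115.0) -> float:
    """Fraction of applied power reaching the fish (assumed matching form)."""
    r = cf_us_cm / cw_us_cm
    return 4 * r / (1 + r) ** 2

def applied_power_needed(goal_power: float, cw_us_cm: float) -> float:
    """Scale applied power so the transferred power stays at goal_power."""
    return goal_power / transfer_efficiency(cw_us_cm)

for cw in (12, 115, 1030):  # the conductivity range tested in the paper
    print(cw, round(applied_power_needed(100.0, cw), 1))
```

Transfer is best when water and fish conductivities match (here, at 115 µS/cm), so more applied power is needed at both extremes of the tested range.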

  18. Working to Full Scope: The Reorganization of Nursing Work in Two Canadian Community Hospitals

    PubMed Central

    MacKinnon, Karen; Butcher, Diane L.; Bruce, Anne

    2018-01-01

    Work relationships between registered nurses (RNs) and practical nurses (LPNs) are changing as new models of nursing care delivery are introduced to create more flexibility for employers. In Canada, a team-based, hospital nursing care delivery model, known as Care Delivery Model Redesign (CDMR), redesigned a predominantly RN-based staffing model to a functional team consisting of fewer RNs and more LPNs. The scope of practice for LPNs was expanded, and unregulated health care assistants introduced. This study began from the standpoint of RNs and LPNs to understand their experiences working on redesigned teams by focusing on discourses activated in social settings. Guided by institutional ethnography, the conceptual and textual resources nurses are drawing on to understand these changing work relationships are explicated. We show how the institutional goals embedded in CDMR not only mediate how nurses work together, but how they subordinate holistic standards of nursing toward fragmented, task-oriented divisions of care. PMID:29410976

  19. Evaluation of work stress, turnover intention, work experience, and satisfaction with preceptors of new graduate nurses using a 10-minute preceptor model.

    PubMed

    Hu, Yi-Chun; Chen, Su-Ru; Chen, I-Hui; Shen, Hsi-Che; Lin, Yen-Kuang; Chang, Wen-Yin

    2015-06-01

    Preparing new graduate nurses (NGNs) to achieve standards of nursing competence is challenging; therefore, this study developed and evaluated the effects of a 10-minute preceptor (10MP) model for assisting NGNs in their professional development and increasing their retention in hospitals. A repeated-measures design study, with an intervention and a two-group comparison, was conducted. A total of 107 NGNs participated in the study. At day 7, work stress and work experience were moderately high for the NGNs in both the 10MP and traditional preceptor model (TPM) groups. The preceptorship program showed significant differences between groups (p = 0.001) regarding work stress at months 2 and 3 and work experience at months 1, 2, and 3. The 10MP group reported lower turnover intention and higher satisfaction with the preceptors than the TPM group. The 10MP model is effective at improving training outcomes and facilitating the professional development of NGNs. Copyright 2015, SLACK Incorporated.

  1. Creating Flexible and Sustainable Work Models for Academic Obstetrician-Gynecologists Engaged in Global Health Work.

    PubMed

    Molina, Rose; Boatin, Adeline; Farid, Huma; Luckett, Rebecca; Neo, Dayna; Ricciotti, Hope; Scott, Jennifer

    2017-10-01

    To describe various work models for obstetrics and gynecology global health faculty affiliated with academic medical centers and to identify barriers and opportunities for pursuing global health work. A mixed-methods study was conducted in 2016 among obstetrics and gynecology faculty and leaders from seven academic medical institutions in Boston, Massachusetts. Global health faculty members were invited to complete an online survey about their work models and to participate in semistructured interviews about the barriers and facilitators of these models. Department chairs and residency directors were asked to participate in interviews. The survey response rate among faculty was 65.6% (21/32), and 76.2% (16/21) of respondents also completed an interview. Five department leaders (45.5% [5/11]) participated in an interview. Faculty described a range of work models with varied time and compensation, but only one third reported contracted time for global health work. The most common barriers to global health work were financial constraints, time limitations, lack of mentorship, need for specialized training, and maintenance of clinical skills. Career satisfaction, creating value for the obstetrics and gynecology department, and work model flexibility were the most important facilitators of sustainable global health careers. The study identified challenges and opportunities in creating flexible and sustainable work models for academic obstetrics and gynecology clinicians engaged in global health work. Additional research and innovation are needed to identify work models that allow for sustainable careers in global women's health. There are opportunities to create professional standards and models for academic global health work in the obstetrics and gynecology specialty.

  2. Simulating wall and corner fire tests on wood products with the OSU room fire model

    Treesearch

    H. C. Tran

    1994-01-01

    This work demonstrates the complexity of modeling wall and corner fires in a compartment. The model chosen for this purpose is the Ohio State University (OSU) room fire model. This model was designed to simulate fire growth on walls in a compartment and therefore lends itself to direct comparison with standard room test results. The model inputs were bench-scale data...

  3. Disability prevention and communication among workers, physicians, employers, and insurers--current models and opportunities for improvement.

    PubMed

    Pransky, Glenn; Shaw, William; Franche, Renee-Louise; Clarke, Andrew

    2004-06-03

    To review prevailing models of disability management and prevention with respect to communication, and to suggest alternative approaches. Review of selected articles. Effective disability management and return to work strategies have been the focus of an increasing number of intervention programmes and associated research studies, spanning a variety of worker populations and provider and business perspectives. Although primary and secondary disability prevention approaches have addressed theoretical basis, methods and costs, few identify communication as a key factor influencing disability outcomes. Four prevailing models of disability management and prevention (medical model, physical rehabilitation model, job-match model, and managed care model) are identified. The medical model emphasizes the physician's role to define functional limitations and job restrictions. In the physical rehabilitation model, rehabilitation professionals communicate the importance of exercise and muscle reconditioning for resuming normal work activities. The job-match model relies on the ability of employers to accurately communicate physical job requirements. The managed care model focuses on dissemination of acceptable standards for medical treatment and duration of work absence, and interventions by case managers when these standards are exceeded. Despite contrary evidence for many health impairments, these models share a common assumption that medical disability outcomes are highly predictable and unaffected by either individual or contextual factors. As a result, communication is often authoritative and unidirectional, with workers and employers in a passive role. Improvements in communication may be responsible for successes across a variety of new interventions. Communication-based interventions may further improve disability outcomes, reduce adversarial relationships, and prove cost-effective; however, controlled trials are needed.

  4. Designing Class Activities to Meet Specific Core Training Competencies: A Developmental Approach

    ERIC Educational Resources Information Center

    Guth, Lorraine J.; McDonnell, Kelly A.

    2004-01-01

    This article presents a developmental model for designing and utilizing class activities to meet specific Association for Specialists in Group Work (ASGW) core training competencies for group workers. A review of the relevant literature about teaching group work and meeting core training standards is provided. The authors suggest a process by…

  5. Analysis of the influence of passenger vehicles front-end design on pedestrian lower extremity injuries by means of the LLMS model.

    PubMed

    Scattina, Alessandro; Mo, Fuhao; Masson, Catherine; Avalle, Massimiliano; Arnoux, Pierre Jean

    2018-01-30

    This work investigates the influence of some front-end design parameters of a passenger vehicle on the behavior of, and damage to, the human lower limbs when impacted in an accident. The analysis is carried out by finite element analysis, using a generic car model for the vehicle and the lower limbs model for safety (LLMS) for the purpose of pedestrian safety. Considering the standardized pedestrian impact procedure (as in the 2003/12/EC Directive), a parametric analysis was performed through a design-of-experiments plan. Material properties, bumper thickness, the positions of the higher and lower bumper beams, and the position of the pedestrian were varied in order to identify how they influence injury occurrence. Injury prediction was evaluated from the knee lateral flexion, ligament elongation, and state of stress in the bone structure. The results highlighted that the offset between the higher and lower bumper beams is the most influential parameter for the knee ligament response; its influence on the other responses, and that of the other parameters, is smaller or absent. The stiffness characteristics of the bumper, instead, matter more for the tibia. Although optimal values of the variables could not be identified, trends were detected that can indicate strategies for improvement. The behavior of a vehicle front end in an impact with a pedestrian can thus be improved by optimizing its design. In this work, each parameter was changed independently, one at a time; in future work, the interaction between the design parameters could also be investigated. Moreover, a similar parametric analysis could be carried out using a standard mechanical legform model in order to understand potential differences or correlations between standard tools and human models.

  6. BIM integration in education: A case study of the construction technology project Bolt Tower Dolni Vitkovice

    NASA Astrophysics Data System (ADS)

    Venkrbec, Vaclav; Bittnerova, Lucie

    2017-12-01

    Building information modeling (BIM) can support effectiveness in many activities in the AEC industry, including the preparation of a construction-technological project. This paper presents an approach to using a building information model in higher education, especially during work on a diploma thesis and its supervision. A diploma thesis is project-based work that aims to compile a construction-technological project for a selected construction. The paper describes the use of the input data and the work with them, and compares this process with standard input data such as printed design documentation. The effectiveness of using a building information model as input data for a construction-technological project is described in the conclusion.

  7. Right-handed charged currents in the era of the Large Hadron Collider

    DOE PAGES

    Alioli, Simone; Cirigliano, Vincenzo; Dekens, Wouter Gerard; ...

    2017-05-16

    We discuss the phenomenology of right-handed charged currents in the framework of the Standard Model Effective Field Theory, in which they arise due to a single gauge-invariant dimension-six operator. We study the manifestations of the nine complex couplings of the W to right-handed quarks in collider physics, flavor physics, and low-energy precision measurements. We first obtain constraints on the couplings under the assumption that the right-handed operator is the dominant correction to the Standard Model at observable energies. We subsequently study the impact of degeneracies with other Beyond-the-Standard-Model effective interactions and identify observables, both at colliders and in low-energy experiments, that would uniquely point to right-handed charged currents.

  8. Higgs boson decays to neutralinos in low-scale gauge mediation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mason, John D.; Poland, David; Morrissey, David E.

    2009-12-01

    We study the decays of a standard model-like minimal supersymmetric standard model Higgs boson to pairs of neutralinos, each of which subsequently decays promptly to a photon and a gravitino. Such decays can arise in supersymmetric scenarios where supersymmetry breaking is mediated to us by gauge interactions with a relatively light gauge messenger sector (M_mess ≲ 100 TeV). This process gives rise to a collider signal consisting of a pair of photons and missing energy. In the present work we investigate the bounds on this scenario within the minimal supersymmetric standard model from existing collider data. We also study the prospects for discovering the Higgs boson through this decay mode with upcoming data from the Tevatron and the LHC.

  9. Modeling as an Anchoring Scientific Practice for Explaining Friction Phenomena

    NASA Astrophysics Data System (ADS)

    Neilson, Drew; Campbell, Todd

    2017-12-01

    Through examining the day-to-day work of scientists, researchers in science studies have revealed how models are a central sense-making practice of scientists as they construct and critique explanations about how the universe works. Additionally, they allow predictions to be made using the tenets of the model. Given this, alongside research suggesting that engaging students in developing and using models can have a positive effect on learning in science classrooms, the recent national standards documents in science education have identified developing and using models as an important practice students should engage in as they apply and refine their ideas with peers and teachers in explaining phenomena or solving problems in classrooms. This article details how students can be engaged in developing and using models to help them make sense of friction phenomena in a high school conceptual physics classroom in ways that align with visions for teaching and learning outlined in the Next Generation Science Standards. This particular unit has been refined over several years to build on what was initially an inquiry-based unit we have described previously. In this latest iteration of the friction unit, students developed and refined models through engaging in small group and whole class discussions and investigations.

  10. College quality and hourly wages: evidence from the self-revelation model, sibling models and instrumental variables.

    PubMed

    Borgen, Nicolai T

    2014-11-01

    This paper addresses the recent discussion on confounding in the returns-to-college-quality literature using the Norwegian case. The main advantage of studying Norway is the quality of the data. Norwegian administrative data provide information on college applications, family relations and a rich set of control variables for all Norwegian citizens applying to college between 1997 and 2004 (N = 141,319) and their succeeding wages between 2003 and 2010 (676,079 person-year observations). With these data, this paper uses a subset of the models that have rendered mixed findings in the literature in order to investigate to what extent confounding biases the returns to college quality. I compare estimates obtained using standard regression models to estimates obtained using the self-revelation model of Dale and Krueger (2002), a sibling fixed effects model and the instrumental variable model used by Long (2008). Using these methods, I consistently find increasing returns to college quality over the course of students' work careers, with positive returns emerging only later in those careers. I conclude that the standard regression estimate provides a reasonable estimate of the returns to college quality. Copyright © 2014 Elsevier Inc. All rights reserved.

  11. Aspects of the Objective Appraisal of Modeling Gypsum Quality and of Gypsum-Based Phonic-Absorbent and Orthopedic Composites

    NASA Astrophysics Data System (ADS)

    Pop, P. A.; Ungur, P. A.; Lazar, L.; Marcu, F.

    2009-11-01

    EU norms on protecting the environment, outdoor and indoor, and on human health have driven the development of new materials based on lightweight airborne material, with high thermal and phonic-absorbent properties, porous and lightweight. The quality and weight of α- and β-modeling gypsum plaster depend on many factors, such as the fabrication process, granulation, calcination temperature, working temperature, environment, additives used, breakage, etc. Objective appraisal of modeling gypsum quality likewise depends on the proper selection of test methods, which are codified in norms, standards and recommendations. In the Romanian Standards SR EN 13279-1/2005 and SR EN 13279-2/2005, adapted from the EU Norms EN 13279-1/2004 and EN 13279-2/2004, the tests for the characteristics of the gypsum family are well specified: granulometric analysis, determination of the water/plaster ratio, setting time, mechanical characteristics, adhesion and water retention. For plasters with special uses (phonic-absorbent and orthopedic materials, etc.) these determinations are not conclusive; additional parameters must be determined, such as elastic constants, the phonic-absorption coefficient, porosity and workability, which requires supplementing the norms and standards with new determinations.

  12. An In situ Analysis of the Dissolution Characteristics of Half Pitch Line and Space Extreme Ultraviolet Lithography Resist Patterns

    NASA Astrophysics Data System (ADS)

    Santillan, Julius Joseph; Itani, Toshiro

    2013-06-01

    The characterization of resist dissolution is a fundamental area of research that has been continuously investigated. This paper focuses on preliminary work on the application of the high-speed atomic force microscope (HS-AFM) to the in situ dissolution analysis of half-pitch (hp) lines and spaces (L/S) at standard developer concentration. In earlier work this had been difficult, but through extensive optimization and the use of carbon-nanofiber-tipped cantilevers, the dissolution characterization of a 32 nm hp L/S pattern in 0.26 N aqueous tetramethylammonium hydroxide developer (the standard developer concentration) was successfully achieved. Based on the results obtained using the EIDEC standard resist (ESR1), it was found that regardless of analysis conditions such as resist pattern configuration (isolated or L/S pattern) and developer concentration (diluted or standard), similar dissolution characteristics in the form of resist swelling of exposed areas were observed. Moreover, further investigations using other model resist polymer platforms, such as poly(hydroxystyrene) (PHS)-based and hybrid (PHS-methacryl)-based model resists, confirmed that the dissolution behavior is not affected by the analysis conditions applied.

  13. Simulation of springback and microstructural analysis of dual phase steels

    NASA Astrophysics Data System (ADS)

    Kalyan, T. Sri.; Wei, Xing; Mendiguren, Joseba; Rolfe, Bernard

    2013-12-01

    With increasing demand for weight reduction and better crashworthiness in car development, advanced high-strength Dual Phase (DP) steels have progressively been used for automotive parts. Higher-strength steels exhibit more springback and lower dimensional accuracy after stamping, which has necessitated simulating each stamped component prior to production to estimate the part's dimensional accuracy. Understanding the micro-mechanical behaviour of AHSS sheet may make stamping simulations more accurate. This work divides into two parts: first, modelling a standard channel forming process; second, modelling the microstructure of the material. The standard top-hat channel forming process (the NUMISHEET'93 benchmark) is used to investigate the springback of WISCO Dual Phase steels. The second part comprises finite element analysis of microstructures to understand the behaviour of the multi-phase steel at a more fundamental level. The outcomes of this work will help in the dimensional control of steels during the manufacturing stage, based on the material's microstructure.

  14. MathWorks Simulink and C++ integration with the new VLT PLC-based standard development platform for instrument control systems

    NASA Astrophysics Data System (ADS)

    Kiekebusch, Mario J.; Di Lieto, Nicola; Sandrock, Stefan; Popovic, Dan; Chiozzi, Gianluca

    2014-07-01

    ESO is in the process of implementing a new development platform, based on PLCs, for upcoming VLT control systems (new instruments and refurbishing of existing systems to manage obsolescence issues). In this context, we have evaluated the integration and reuse of existing C++ libraries and Simulink models into the real-time environment of BECKHOFF Embedded PCs using the capabilities of the latest version of TwinCAT software and MathWorks Embedded Coder. While doing so the aim was to minimize the impact of the new platform by adopting fully tested solutions implemented in C++. This allows us to reuse the in house expertise, as well as extending the normal capabilities of the traditional PLC programming environments. We present the progress of this work and its application in two concrete cases: 1) field rotation compensation for instrument tracking devices like derotators, 2) the ESO standard axis controller (ESTAC), a generic model-based controller implemented in Simulink and used for the control of telescope main axes.

  15. An audit of the laboratory service provided to the Health Service Executive Orthodontic Department, St James Hospital, Dublin.

    PubMed

    Al-Awadhi, E A; Wolstencroft, S J; Blake, M

    2006-01-01

    To evaluate the service purchased from the contracted orthodontic laboratories used by the HSE (SWA) regional orthodontic unit, St. James's Hospital, Dublin, and to identify deficiencies in the current service. A data collection questionnaire was designed and distributed to the departmental orthodontists for a period of three months (October-December 2004). Gold standards, drawn up based on the authors' ideal requirements and published guidelines, were supplied to grade the work returned. During the study period 363 items of laboratory work were requested. Twenty percent of the laboratory work arrived late, and most of the delayed work was delayed by more than 24 hours. Most laboratory delays occurred with functional appliances, retainers and study models. Prior to fit, 20% of the appliances required adjustments for more than 30 seconds. Sixty-five percent of the laboratory work returned to the department met all of the gold standards; 10% of appliances were considered unsatisfactory. Functional appliances were most often ill-fitting, accounting for almost half of the unsatisfactory laboratory work. The majority of the laboratory work returned to the department met our gold standards and arrived on time. Forty-six percent of the appliances required adjustments; functional appliances required the most, and one in five of all functional appliances ordered was considered unsatisfactory.

  16. Improving Project Management Using Formal Models and Architectures

    NASA Technical Reports Server (NTRS)

    Kahn, Theodore; Sturken, Ian

    2011-01-01

    This talk discusses the advantages that formal modeling and architectures bring to project management. These emerging technologies offer both great potential and challenges for improving the information available for decision-making. The presentation covers standards, tools and cultural issues needing consideration, and includes lessons learned from projects the presenters have worked on.

  17. The Enigmatic Neutrino

    ERIC Educational Resources Information Center

    Lincoln, Don; Miceli, Tia

    2015-01-01

    Through a century of work, physicists have refined a model to describe all fundamental particles, the forces they share, and their interactions on a microscopic scale. This masterpiece of science is called the Standard Model. While this theory is incredibly powerful, we know of at least one particle that exhibits behaviors that are outside of its…

  18. Impact of combustion products from Space Shuttle launches on ambient air quality

    NASA Technical Reports Server (NTRS)

    Dumbauld, R. K.; Bowers, J. F.; Cramer, H. E.

    1974-01-01

    The present work describes multilayer diffusion models, and a computer program implementing them, developed to predict the impact on ambient air quality of the ground clouds formed during Space Shuttle launches. The diffusion models are based on the Gaussian plume equation for an instantaneous volume source. Cloud growth is estimated on the basis of measurable meteorological parameters: the standard deviation of the wind azimuth angle, the standard deviation of the wind elevation angle, the vertical wind-speed shear, the vertical wind-direction shear, and the depth of the surface mixing layer. Calculations using these models indicate that Space Shuttle launches under a variety of meteorological regimes at Kennedy Space Center and Vandenberg AFB are unlikely to exceed the exposure standards for HCl; similar results have been obtained for CO and Al2O3. However, the possibility that precipitation scavenging of the ground cloud might produce an acidic rain that could damage vegetation has not been investigated.
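The Gaussian equation for an instantaneous volume source that these multilayer diffusion models build on can be sketched as follows. The function and parameter names are illustrative, and the ground-reflection term is a standard textbook addition rather than a detail taken from this record.

```python
import math

def puff_concentration(q, x, y, z, t, u, sx, sy, sz, h=0.0):
    """Concentration from an instantaneous (puff) release of q mass units,
    a sketch of the Gaussian instantaneous-volume-source equation.

    The puff centroid drifts downwind at mean wind speed u; sx, sy, sz are
    the puff's standard deviations in each axis; h is the release height,
    handled with a ground-reflection (image-source) term.
    """
    norm = q / ((2.0 * math.pi) ** 1.5 * sx * sy * sz)
    along_wind = math.exp(-((x - u * t) ** 2) / (2.0 * sx ** 2))
    cross_wind = math.exp(-(y ** 2) / (2.0 * sy ** 2))
    vertical = (math.exp(-((z - h) ** 2) / (2.0 * sz ** 2))
                + math.exp(-((z + h) ** 2) / (2.0 * sz ** 2)))  # reflection
    return norm * along_wind * cross_wind * vertical

# Ground-level concentration directly under the center of a ground-level
# puff (x = u*t, y = 0, z = 0), the maximum for these illustrative sigmas:
c_peak = puff_concentration(q=1.0, x=50.0, y=0.0, z=0.0, t=10.0,
                            u=5.0, sx=20.0, sy=20.0, sz=10.0)
```

In the multilayer models of the record, the sigmas would be derived from the measured meteorological parameters listed in the abstract rather than fixed as here.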

  19. Social work in dentistry: the CARES model for improving patient retention and access to care.

    PubMed

    Doris, Joan M; Davis, Elaine; Du Pont, Cynthia; Holdaway, Britt

    2009-07-01

    Social work programs in dental schools and dental clinics have been operated successfully since the 1940s, and have been documented as contributing to patients' access to care and to dental education. However, unlike medical social work, with which it has much in common, social work in dentistry has failed to become a standard feature of dental schools and clinics. Few of the social work initiatives that have been implemented in dental schools have survived after initial grant funding ran out, or the institutional supporters of the program moved on. The authors hope that the CARES program serves as a model for the successful development of other programs at the intersection of social work and dentistry to the benefit of both dental patients and providers.

  20. Volume 2: Compendium of Abstracts

    DTIC Science & Technology

    2017-06-01

    simulation work using a standard running model for legged systems, the Spring Loaded Inverted Pendulum (SLIP) Model. In this model, the dynamics of a single...bar SLIP model is analyzed using basin-of-attraction analyses to determine the optimal configuration for running at different velocities and...acquisition, and the automatic target acquisition were then compared to each other. After running trials with the current system, it will be

  1. Examination of ethical practice in nursing continuing education using the Husted model.

    PubMed

    Steckler, J

    1998-01-01

    Beliefs about human nature, adult education, adult learners, and moral commitment are at the heart of the educator-learner agreement. In continuing nursing education, it is the point where professional values, morals, and ethical principles meet. Using the Husteds' bioethical decision-making model, the values, beliefs, and actions within the educator-learner agreement are identified and organized by the bioethical standards. By relating the bioethical standards to practice, continuing nurse educators can find their own basis for practice and work toward attaining a consistent professional ethical orientation.

  2. Mental health professional experiences of the flexible assertive community treatment model: a grounded theory study.

    PubMed

    Lexén, Annika; Svensson, Bengt

    2016-08-01

    Despite the lack of evidence for the effectiveness of Flexible Assertive Community Treatment (Flexible ACT), the model is considered feasible and is well received by mental health professionals. No current studies have adequately examined mental health professionals' experiences of working with Flexible ACT. The aim of this study was to explore mental health professionals' experiences of working with the Flexible ACT model compared with standard care. The study was guided by grounded theory and based on interviews with 19 theoretically sampled mental health professionals in Swedish urban areas, primarily working with consumers with psychosis, who had worked with the Flexible ACT model for at least 6 months. The analysis resulted in the core category "Flexible ACT and the shared caseload create a common action space" and three main categories: (1) "Flexible ACT fills the need for a systematic approach to crisis intervention"; (2) "Flexible ACT has advantages in the psychosocial working environment"; and (3) "Flexible ACT increases the quality of care". Mental health professionals may benefit from working with the Flexible ACT model through decreased job strain and stress, an increased feeling of being in control of their work situation, and the experience of providing a higher quality of care.

  3. Gold-standard evaluation of a folksonomy-based ontology learning model

    NASA Astrophysics Data System (ADS)

    Djuana, E.

    2018-03-01

    Folksonomy, as one result of the collaborative tagging process, has been acknowledged for its potential to improve the categorization and searching of web resources. However, folksonomies contain ambiguities such as synonymy and polysemy, as well as the problem of differing abstractions or generality. To maximize their potential, methods for associating folksonomy tags with semantics and structural relationships have been proposed, such as ontology learning. This paper evaluates our previous work on ontology learning using the gold-standard evaluation approach, in comparison with a notable state-of-the-art work and several baselines. The results show that our method is comparable to the state-of-the-art work, which further validates our approach, as it has previously been validated using a task-based evaluation approach.

  4. Relationship between long working hours and depression: a 3-year longitudinal study of clerical workers.

    PubMed

    Amagasa, Takashi; Nakayama, Takeo

    2013-08-01

    To clarify how long working hours affect the likelihood of current and future depression. Using data from four repeated measurements collected from 218 clerical workers, four models associating work-related factors with the depressive mood scale were established. The final model was constructed after comparing and testing goodness-of-fit indices using structural equation modeling. Multiple logistic regression analysis was also performed. The final model showed the best fit (normed fit index = 0.908; goodness-of-fit index = 0.936; root-mean-square error of approximation = 0.018). Its standardized total effect indicated that long working hours affected depression at the time of evaluation and 1 to 3 years later. The odds ratio for depression risk was 14.7 in employees who did not work long hours according to the initial survey but did according to the second survey. Long working hours increase the current and future risks of depression.

  5. Process Evaluation of an Integrated Health Promotion/Occupational Health Model in WellWorks-2

    ERIC Educational Resources Information Center

    Hunt, Mary Kay; Lederman, Ruth; Stoddard, Anne M.; LaMontagne, Anthony D.; McLellan, Deborah; Combe, Candace; Barbeau, Elizabeth; Sorensen, Glorian

    2005-01-01

    Disparities in chronic disease risk by occupation call for new approaches to health promotion. WellWorks-2 was a randomized, controlled study comparing the effectiveness of a health promotion/occupational health program (HP/OHS) with a standard intervention (HP). Interventions in both studies were based on the same theoretical foundations. Results…

  6. Partnership Working in Small Rural Primary Schools: The Best of Both Worlds. Research Report

    ERIC Educational Resources Information Center

    Hill, Robert

    2014-01-01

    The aim of the research was to investigate the most effective ways for small rural primary schools to work together in order to improve provision and raise standards. The project sought to examine the circumstances and context of small rural schools in Lincolnshire and evaluate their different leadership models (such as collaborations,…

  7. Color Image Restoration Using Nonlocal Mumford-Shah Regularizers

    NASA Astrophysics Data System (ADS)

    Jung, Miyoun; Bresson, Xavier; Chan, Tony F.; Vese, Luminita A.

    We introduce several color image restoration algorithms based on the Mumford-Shah model and nonlocal image information. The standard Ambrosio-Tortorelli and Shah models are defined to work in a small local neighborhood, which is sufficient to denoise smooth regions with sharp boundaries. However, textures are not local in nature and require semi-local/non-local information to be denoised efficiently. Inspired by recent work (the NL-means of Buades, Coll and Morel, and the NL-TV of Gilboa and Osher), we extend the standard Ambrosio-Tortorelli and Shah approximations of the Mumford-Shah functional to work with nonlocal information, for better restoration of fine structures and textures. We present several applications of the proposed nonlocal MS regularizers in image processing, such as color image denoising, color image deblurring in the presence of Gaussian or impulse noise, color image inpainting, and color image super-resolution. In the formulation of nonlocal variational models for image deblurring with impulse noise, we propose an efficient preprocessing step for the computation of the weight function w. In all the applications, the proposed nonlocal regularizers produce superior results over the local ones, especially in image inpainting with large missing regions. Experimental results and comparisons between the proposed nonlocal methods and the local ones are shown.
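
    The nonlocal weight function w referenced above is, in NL-means-style methods, a decaying function of patch dissimilarity. A minimal 1-D sketch in Python (function names and parameters are illustrative, not the paper's implementation):

```python
import math

def nl_weight(signal, x, y, patch=1, h=0.5):
    """NL-means-style weight w(x, y) = exp(-d2 / h**2), where d2 is the
    squared distance between the patches centred at x and y."""
    d2 = sum((signal[x + k] - signal[y + k]) ** 2
             for k in range(-patch, patch + 1))
    return math.exp(-d2 / h ** 2)

sig = [0.0, 0.1, 0.0, 0.9, 1.0, 0.9]
w_similar = nl_weight(sig, 1, 1)      # identical patches -> weight 1
w_dissimilar = nl_weight(sig, 1, 4)   # smooth vs. edge patch -> tiny weight
```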

  8. Closed-form dynamics of a hexarot parallel manipulator by means of the principle of virtual work

    NASA Astrophysics Data System (ADS)

    Pedrammehr, Siamak; Nahavandi, Saeid; Abdi, Hamid

    2018-04-01

    In this research, a systematic approach to solving the inverse dynamics of hexarot manipulators is addressed using the methodology of virtual work. For the first time, a closed form of the mathematical formulation of the standard dynamic model is presented for this class of mechanisms. An efficient algorithm for solving this closed-form dynamic model is developed and used to simulate the dynamics of the system for different trajectories. Validation of the proposed model is performed using SimMechanics, and it is shown that the results of the proposed mathematical model match those obtained by the SimMechanics model.

  9. Experimental demonstration of nonbilocal quantum correlations.

    PubMed

    Saunders, Dylan J; Bennet, Adam J; Branciard, Cyril; Pryde, Geoff J

    2017-04-01

    Quantum mechanics admits correlations that cannot be explained by local realistic models. The most studied models are the standard local hidden variable models, which satisfy the well-known Bell inequalities. To date, most works have focused on bipartite entangled systems. We consider correlations between three parties connected via two independent entangled states. We investigate the new type of so-called "bilocal" models, which correspondingly involve two independent hidden variables. These models describe scenarios that naturally arise in quantum networks, where several independent entanglement sources are used. Using photonic qubits, we build such a linear three-node quantum network and demonstrate nonbilocal correlations by violating a Bell-like inequality tailored for bilocal models. Furthermore, we show that the demonstration of nonbilocality is more noise-tolerant than that of standard Bell nonlocality in our three-party quantum network.
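
    The Bell-like inequality tailored for bilocal models, in the Branciard-Gisin-Pironio form, bounds √|I| + √|J| ≤ 1 for all bilocal models, while quantum strategies can reach √2. A minimal arithmetic sketch (the correlator values below are the ideal quantum values for this form, used for illustration; they are not the experiment's measured data):

```python
import math

def bilocal_score(I, J):
    """Left-hand side of the bilocal inequality sqrt|I| + sqrt|J| <= 1
    (Branciard-Gisin-Pironio form); bilocal models satisfy score <= 1."""
    return math.sqrt(abs(I)) + math.sqrt(abs(J))

# Ideal quantum values with two singlets and optimal settings: I = J = 1/2,
# giving sqrt(1/2) + sqrt(1/2) = sqrt(2) > 1 (illustrative).
score = bilocal_score(0.5, 0.5)
```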

  10. Relations between mental health team characteristics and work role performance.

    PubMed

    Fleury, Marie-Josée; Grenier, Guy; Bamvita, Jean-Marie; Farand, Lambert

    2017-01-01

    Effective mental health care requires a high-performing, interprofessional team. Among 79 mental health teams in Quebec (Canada), this exploratory study aimed to 1) determine the association between work role performance and a wide range of variables related to team effectiveness according to the literature, and 2) assess, using structural equation modelling, the covariance between each of these variables as well as the correlation with other exogenous variables. Work role performance was measured with an adapted version of a work role questionnaire. Various independent variables, including team manager characteristics, user characteristics, team profiles, clinical activities, organizational culture, network integration strategies and frequency/satisfaction of interactions with other teams or services, were analyzed under the structural equation model, which provided a good fit to the data. Frequent use of standardized procedures and evaluation tools (e.g. screening and assessment tools for mental health disorders) and team manager seniority exerted the most direct effect on work role performance. While network integration strategies had little effect on work role performance, there was a high covariance between this variable and those directly affecting work role performance among mental health teams. The results suggest that the mental healthcare system should apply standardized procedures and evaluation tools and, to a lesser extent, clinical approaches to improve work role performance in mental health teams. Overall, a more systematic implementation of network integration strategies may contribute to improved work role performance in mental health care.

  11. Nursing casualization and communication: a critical ethnography.

    PubMed

    Batch, Mary; Windsor, Carol

    2015-04-01

    The aim was to explore the relationship between nursing casualization and the culture of communication for nurses in a healthcare facility. Casualization, or non-standard work, is the use of temporary, contract, part-time and casual labour. An increase in casual labour has been part of a global shift in work organization aimed at creating a more flexible and cheaper workforce. It has been argued that flexibility of labour has enabled nurses to manage both non-work related needs and an increasingly complex work environment. Yet no research has explored casualization and how it impacts on the communication culture for nurses in a healthcare facility. Critical ethnography. Methods included observation, field notes, formal interviews and focus groups. Data collection was undertaken over the 2 years 2008-2009. The concepts of knowing and belonging were perceived as important to nursing teamwork and yet the traditional time/task work model, designed for a full-time workforce, marginalized non-standard workers. The combination of medical dominance and traditional stereotyping of the nurse and work as full-time shaped the behaviours of nurses and situated casual workers on the periphery. The overall finding was that entrenched systemic structures and processes shaped the physical and cultural dimensions of a contemporary work environment and contributed to an ineffective communication culture. Flexible work is an important feature of contemporary nursing. Traditional work models and nurse attitudes and practices have not progressed and are discordant with a contemporary approach to nursing labour management. © 2014 John Wiley & Sons Ltd.

  12. Relations between mental health team characteristics and work role performance

    PubMed Central

    Grenier, Guy; Bamvita, Jean-Marie; Farand, Lambert

    2017-01-01

    Effective mental health care requires a high-performing, interprofessional team. Among 79 mental health teams in Quebec (Canada), this exploratory study aimed to 1) determine the association between work role performance and a wide range of variables related to team effectiveness according to the literature, and 2) assess, using structural equation modelling, the covariance between each of these variables as well as the correlation with other exogenous variables. Work role performance was measured with an adapted version of a work role questionnaire. Various independent variables, including team manager characteristics, user characteristics, team profiles, clinical activities, organizational culture, network integration strategies and frequency/satisfaction of interactions with other teams or services, were analyzed under the structural equation model, which provided a good fit to the data. Frequent use of standardized procedures and evaluation tools (e.g. screening and assessment tools for mental health disorders) and team manager seniority exerted the most direct effect on work role performance. While network integration strategies had little effect on work role performance, there was a high covariance between this variable and those directly affecting work role performance among mental health teams. The results suggest that the mental healthcare system should apply standardized procedures and evaluation tools and, to a lesser extent, clinical approaches to improve work role performance in mental health teams. Overall, a more systematic implementation of network integration strategies may contribute to improved work role performance in mental health care. PMID:28991923

  13. LHC benchmark scenarios for the real Higgs singlet extension of the standard model

    DOE PAGES

    Robens, Tania; Stefaniak, Tim

    2016-05-13

    Here, we present benchmark scenarios for searches for an additional Higgs state in the real Higgs singlet extension of the Standard Model in Run 2 of the LHC. The scenarios are selected such that they fulfill all relevant current theoretical and experimental constraints, but can potentially be discovered at the current LHC run. We take into account the results presented in earlier work and update the experimental constraints from relevant LHC Higgs searches and signal rate measurements. The benchmark scenarios are given separately for the low mass and high mass region, i.e. the mass range where the additional Higgs state is lighter or heavier than the discovered Higgs state at around 125 GeV. They have also been presented in the framework of the LHC Higgs Cross Section Working Group.

  14. Demographic management in a federated healthcare environment.

    PubMed

    Román, I; Roa, L M; Reina-Tosina, J; Madinabeitia, G

    2006-09-01

    The purpose of this paper is to provide a further step toward the decentralization of identification and demographic information about persons by solving issues related to the integration of demographic agents in a federated healthcare environment. The aim is to identify a particular person in every system of a federation and to obtain a unified view of his/her demographic information stored in different locations. This work is based on semantic models and techniques and pursues the reconciliation of several current standardization efforts, including ITU-T's Open Distributed Processing, CEN's prEN 12967, OpenEHR's dual and reference models, CEN's General Purpose Information Components and CORBAmed's PID service. We propose a new paradigm for the management of person identification and demographic data, based on the development of an open architecture of specialized distributed components together with the incorporation of techniques for the efficient management of domain ontologies, in order to provide a federated demographic service. This new service enhances previous correlation solutions, sharing ideas with different standards and domains such as semantic techniques and database systems. The federation philosophy requires us to devise solutions to the semantic, functional and instance incompatibilities in our approach. Although this work is based on several models and standards, we have improved on them by combining their contributions and developing a federated architecture that does not require the centralization of demographic information. The solution is thus a good approach to integration problems, and the applied methodology can easily be extended to other tasks involved in the healthcare organization.

  15. Data format standard for sharing light source measurements

    NASA Astrophysics Data System (ADS)

    Gregory, G. Groot; Ashdown, Ian; Brandenburg, Willi; Chabaud, Dominique; Dross, Oliver; Gangadhara, Sanjay; Garcia, Kevin; Gauvin, Michael; Hansen, Dirk; Haraguchi, Kei; Hasna, Günther; Jiao, Jianzhong; Kelley, Ryan; Koshel, John; Muschaweck, Julius

    2013-09-01

    Optical design requires accurate characterization of light sources for computer aided design (CAD) software. Various methods have been used to model sources, from accurate physical models to measurement of light output. It has become common practice for designers to include measured source data for design simulations. Typically, a measured source will contain rays which sample the output distribution of the source. The ray data must then be exported to various formats suitable for import into optical analysis or design software. Source manufacturers are also making measurements of their products and supplying CAD models along with ray data sets for designers. The increasing availability of data has been beneficial to the design community but has caused a large expansion in storage needs for the source manufacturers since each software program uses a unique format to describe the source distribution. In 2012, the Illuminating Engineering Society (IES) formed a working group to understand the data requirements for ray data and recommend a standard file format. The working group included representatives from software companies supplying the analysis and design tools, source measurement companies providing metrology, source manufacturers creating the data and users from the design community. Within one year the working group proposed a file format which was recently approved by the IES for publication as TM-25. This paper will discuss the process used to define the proposed format, highlight some of the significant decisions leading to the format and list the data to be included in the first version of the standard.
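
    A ray data set of the kind TM-25 standardizes stores, per ray, a start position, a direction, and a radiant flux. A minimal binary-record sketch using Python's struct module (this record layout is hypothetical and is not the TM-25 format):

```python
import struct

# One ray: start position (x, y, z), unit direction (dx, dy, dz), flux in watts.
RAY_FORMAT = "<7f"  # 7 little-endian 32-bit floats per record (hypothetical layout)

def pack_ray(pos, direction, flux):
    """Serialize one ray record to bytes."""
    return struct.pack(RAY_FORMAT, *pos, *direction, flux)

def unpack_ray(blob):
    """Deserialize one ray record back to (position, direction, flux)."""
    x, y, z, dx, dy, dz, flux = struct.unpack(RAY_FORMAT, blob)
    return (x, y, z), (dx, dy, dz), flux

blob = pack_ray((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), 0.001)
pos, direction, flux = unpack_ray(blob)
```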

  16. The influence of tie strength on evolutionary games on networks: An empirical investigation

    NASA Astrophysics Data System (ADS)

    Buesser, Pierre; Peña, Jorge; Pestelacci, Enea; Tomassini, Marco

    2011-11-01

    Extending previous work on unweighted networks, we present here a systematic numerical investigation of standard evolutionary games on weighted networks. In the absence of any reliable model for generating weighted social networks, we attribute weights to links in a few ways supported by empirical data ranging from totally uncorrelated to weighted bipartite networks. The results of the extensive simulation work on standard complex network models show that, except in a case that does not seem to be common in social networks, taking the tie strength into account does not change in a radical manner the long-run steady-state behavior of the studied games. Besides model networks, we also included a real-life case drawn from a coauthorship network. In this case also, taking the weights into account only changes the results slightly with respect to the raw unweighted graph, although to draw more reliable conclusions on real social networks many more cases should be studied as these weighted networks become available.
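
    Taking tie strength into account typically means weighting each game's payoff by the corresponding edge weight. A minimal sketch for a prisoner's dilemma on a weighted graph (the payoff values and graph are illustrative, not the paper's setup):

```python
# Prisoner's dilemma payoffs with T > R > P > S (illustrative values).
PD = {("C", "C"): 1.0, ("C", "D"): 0.0, ("D", "C"): 1.6, ("D", "D"): 0.1}

def weighted_payoff(node, strategies, weighted_edges):
    """Accumulated payoff of `node`, with each game's payoff scaled by the
    tie strength (weight) of the corresponding edge."""
    total = 0.0
    for (u, v), w in weighted_edges.items():
        if node == u:
            total += w * PD[(strategies[u], strategies[v])]
        elif node == v:
            total += w * PD[(strategies[v], strategies[u])]
    return total

strategies = {0: "C", 1: "C", 2: "D"}
edges = {(0, 1): 1.0, (0, 2): 0.5}  # node 0 has one strong and one weak tie
p_cooperator = weighted_payoff(0, strategies, edges)
p_defector = weighted_payoff(2, strategies, edges)
```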

  17. Application of RANS Simulations for Contact Time Predictions in Turbulent Reactor Tanks for Water Purification Process

    NASA Astrophysics Data System (ADS)

    Nickles, Cassandra; Goodman, Matthew; Saez, Jose; Issakhanian, Emin

    2016-11-01

    California's current drought has renewed public interest in recycled water from Water Reclamation Plants (WRPs). It is critical that the recycled water meets public health standards. This project consists of simulating the transport of an instantaneous conservative tracer through the WRP chlorine contact tanks. Local recycled water regulations stipulate a minimum 90-minute modal contact time during disinfection at peak dry weather design flow. In-situ testing is extremely difficult given flowrate dependence on real world sewage line supply and recycled water demand. Given as-built drawings and operation parameters, the chlorine contact tanks are modeled to simulate extreme situations, which may not meet regulatory standards. The turbulent flow solutions are used as the basis to model the transport of a turbulently diffusing conservative tracer added instantaneously to the inlet of the reactors. This tracer simulates the transport through advection and dispersion of chlorine in the WRPs. Previous work validated the models against experimental data. The current work shows the predictive value of the simulations.

  18. Calibration and Propagation of Uncertainty for Independence

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holland, Troy Michael; Kress, Joel David; Bhat, Kabekode Ghanasham

    This document reports on progress and methods for the calibration and uncertainty quantification of the Independence model developed at UT Austin. The Independence model is an advanced thermodynamic and process model framework for piperazine solutions as a high-performance CO2 capture solvent. Progress is presented in the framework of the CCSI standard basic data model inference framework. Recent work has largely focused on the thermodynamic submodels of Independence.

  19. The Economic Impact of the President’s 2013 Budget

    DTIC Science & Technology

    2012-04-01

    and capital. According to the Solow-type model, people base their decisions about working and saving primarily on current economic... model developed by Robert Solow. CBO's life-cycle growth model is an overlapping-generations general-equilibrium model that is based on a standard... services produced in a given period by the labor and capital supplied by the country's residents, regardless of where the labor
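
    The Solow-type growth model mentioned in the record evolves capital per effective worker by k(t+1) = k(t) + s·f(k(t)) − (n + g + δ)·k(t). A minimal sketch with Cobb-Douglas production (the parameter values are illustrative, not CBO's calibration):

```python
def solow_step(k, s=0.2, alpha=0.3, n=0.01, g=0.02, delta=0.05):
    """One-period change in capital per effective worker:
    dk = s * k**alpha - (n + g + delta) * k  (Cobb-Douglas production)."""
    return s * k ** alpha - (n + g + delta) * k

def steady_state(k0=1.0, steps=10000):
    """Iterate the law of motion until capital settles at its steady state."""
    k = k0
    for _ in range(steps):
        k += solow_step(k)
    return k

k_star = steady_state()
# Analytic steady state: k* = (s / (n + g + delta)) ** (1 / (1 - alpha))
```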

  20. Experimental Verification of an Instrument to Test Flooring Materials

    NASA Astrophysics Data System (ADS)

    Philip, Rony; Löfgren, Hans, Dr

    2018-02-01

    The focus of this work is to validate a fluid model against different flooring materials and the measurements of an instrument that tests flooring materials and their force-attenuating capabilities, using mathematical models to describe the signature and coefficients of the floor. The main contribution of the present work is the development of a mathematical fluid model for floors. The aim of the thesis was to analyze and compare different floor materials and to study the linear dynamics of falling impacts on floors. The impact of the hammer during a fall is captured by an accelerometer, and the response is collected using a PicoScope. The collected data were analyzed with a MATLAB least-squares method coded according to the fluid model. The findings from this thesis showed that the fluid model works for more elastic materials but not for rigid materials like wood. The parameters that influence the model during a falling impact on a floor, such as velocity, mass, energy loss and other coefficients of the floor, were identified, and a standardized testing method was established.
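
    The least-squares analysis described above can be illustrated with an ordinary least-squares line fit via the normal equations (the data below are hypothetical, not the thesis measurements):

```python
def least_squares_line(xs, ys):
    """Fit y = a*x + b by ordinary least squares (closed-form normal equations)."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

# Hypothetical samples lying exactly on y = 2x + 1:
slope, intercept = least_squares_line([0.0, 1.0, 2.0, 3.0], [1.0, 3.0, 5.0, 7.0])
```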

  1. An improved interfacial bonding model for material interface modeling

    PubMed Central

    Lin, Liqiang; Wang, Xiaodu; Zeng, Xiaowei

    2016-01-01

    An improved interfacial bonding model was proposed from a potential-function point of view to investigate interfacial interactions in polycrystalline materials. It characterizes both attractive and repulsive interfacial interactions and can be applied to model different material interfaces. The path-dependence study of the work of separation indicates that the transformation of separation work is smooth in the normal and tangential directions, and that the proposed model guarantees the consistency of the cohesive constitutive model. The improved interfacial bonding model was verified through a simple compression test in a standard hexagonal structure. The error between analytical solutions and numerical results from the proposed model is reasonable in the linear elastic region. Ultimately, we investigated the mechanical behavior of the extrafibrillar matrix in bone, and the simulation results agreed well with experimental observations of bone fracture. PMID:28584343
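
    A cohesive law of the kind described, attractive at moderate separation and repulsive under compression, can be sketched with an exponential (Xu-Needleman-style) normal traction. This is a generic illustration, not the paper's specific potential; parameters are illustrative:

```python
import math

def normal_traction(delta, sigma_max=1.0, delta0=1.0):
    """Exponential cohesive law: T(d) = sigma_max * e * (d/d0) * exp(-d/d0).
    Attractive (positive) for d > 0 with peak sigma_max at d = delta0;
    repulsive (negative) under compression (d < 0)."""
    return sigma_max * math.e * (delta / delta0) * math.exp(-delta / delta0)

t_peak = normal_traction(1.0)   # peak attraction at d = delta0
t_comp = normal_traction(-0.5)  # compressive branch is repulsive
```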

  2. Meeting report from the fourth meeting of the Computational Modeling in Biology Network (COMBINE)

    PubMed Central

    Waltemath, Dagmar; Bergmann, Frank T.; Chaouiya, Claudine; Czauderna, Tobias; Gleeson, Padraig; Goble, Carole; Golebiewski, Martin; Hucka, Michael; Juty, Nick; Krebs, Olga; Le Novère, Nicolas; Mi, Huaiyu; Moraru, Ion I.; Myers, Chris J.; Nickerson, David; Olivier, Brett G.; Rodriguez, Nicolas; Schreiber, Falk; Smith, Lucian; Zhang, Fengkai; Bonnet, Eric

    2014-01-01

    The Computational Modeling in Biology Network (COMBINE) is an initiative to coordinate the development of community standards and formats in computational systems biology and related fields. This report summarizes the topics and activities of the fourth edition of the annual COMBINE meeting, held in Paris during September 16-20 2013, and attended by a total of 96 people. This edition pioneered a first day devoted to modeling approaches in biology, which attracted a broad audience of scientists thanks to a panel of renowned speakers. During subsequent days, discussions were held on many subjects including the introduction of new features in the various COMBINE standards, new software tools that use the standards, and outreach efforts. Significant emphasis went into work on extensions of the SBML format, and also into community-building. This year’s edition once again demonstrated that the COMBINE community is thriving, and still manages to help coordinate activities between different standards in computational systems biology.

  3. Guide to solar reference spectra and irradiance models

    NASA Astrophysics Data System (ADS)

    Tobiska, W. Kent

    The international standard for determining solar irradiances was published by the International Standards Organization (ISO) in May 2007. The document, ISO 21348 Space Environment (natural and artificial) - Process for determining solar irradiances, describes the process for representing solar irradiances. We report on the next progression of standards work, i.e., the development of a guide that identifies solar reference spectra and irradiance models for use in engineering design or scientific research. This document will be produced as an AIAA Guideline and ISO Technical Report. It will describe the content of the reference spectra and models, uncertainties and limitations, technical basis, data bases from which the reference spectra and models are formed, publication references, and sources of computer code for reference spectra and solar irradiance models, including those which provide spectrally-resolved lines as well as solar indices and proxies and which are generally recognized in the solar sciences. The document is intended to assist aircraft and space vehicle designers and developers, heliophysicists, geophysicists, aeronomers, meteorologists, and climatologists in understanding available models, comparing sources of data, and interpreting engineering and scientific results based on different solar reference spectra and irradiance models.

  4. Mathematical outcomes and working memory in children with TBI and orthopedic injury.

    PubMed

    Raghubar, Kimberly P; Barnes, Marcia A; Prasad, Mary; Johnson, Chad P; Ewing-Cobbs, Linda

    2013-03-01

    This study compared mathematical outcomes in children with predominantly moderate to severe traumatic brain injury (TBI; n = 50) or orthopedic injury (OI; n = 47) at 2 and 24 months post-injury. Working memory and its contribution to math outcomes at 24 months post-injury were also examined. Participants were administered an experimental cognitive addition task and standardized measures of calculation, math fluency, and applied problems, as well as experimental measures of verbal and visual-spatial working memory. Although children with TBI did not have deficits in foundational math fact retrieval, they performed more poorly than the OI group on standardized measures of math. In the TBI group, performance on standardized measures was predicted by age at injury, socioeconomic status, and the duration of impaired consciousness. Children with TBI showed impairments on verbal, but not visual, working memory relative to children with OI. Verbal working memory mediated group differences on math calculations and applied problems at 24 months post-injury. Children with TBI have difficulties in mathematics, but do not have deficits in math fact retrieval, a signature deficit of math disabilities. Results are discussed with reference to models of mathematical cognition and disability and the role of working memory in math learning and performance for children with TBI.

  5. Mathematical Outcomes and Working Memory in Children With TBI and Orthopedic Injury

    PubMed Central

    Raghubar, Kimberly P.; Barnes, Marcia A.; Prasad, Mary; Johnson, Chad P.; Ewing-Cobbs, Linda

    2013-01-01

    This study compared mathematical outcomes in children with predominantly moderate to severe traumatic brain injury (TBI; n = 50) or orthopedic injury (OI; n = 47) at 2 and 24 months post-injury. Working memory and its contribution to math outcomes at 24 months post-injury were also examined. Participants were administered an experimental cognitive addition task and standardized measures of calculation, math fluency, and applied problems, as well as experimental measures of verbal and visual-spatial working memory. Although children with TBI did not have deficits in foundational math fact retrieval, they performed more poorly than the OI group on standardized measures of math. In the TBI group, performance on standardized measures was predicted by age at injury, socioeconomic status, and the duration of impaired consciousness. Children with TBI showed impairments on verbal, but not visual, working memory relative to children with OI. Verbal working memory mediated group differences on math calculations and applied problems at 24 months post-injury. Children with TBI have difficulties in mathematics, but do not have deficits in math fact retrieval, a signature deficit of math disabilities. Results are discussed with reference to models of mathematical cognition and disability and the role of working memory in math learning and performance for children with TBI. PMID:23164058

  6. A Collective Study on Modeling and Simulation of Resistive Random Access Memory

    NASA Astrophysics Data System (ADS)

    Panda, Debashis; Sahu, Paritosh Piyush; Tseng, Tseung Yuen

    2018-01-01

    In this work, we provide a comprehensive discussion of the various models proposed for the design and description of resistive random access memory (RRAM); as a nascent technology, RRAM relies heavily on accurate models to develop efficient working designs and to standardize its implementation across devices. This review provides detailed information regarding the various physical methodologies considered in developing models for RRAM devices. It covers all the important models reported to date and elucidates their features and limitations. Various additional effects and anomalies arising in memristive systems are addressed, and the solutions the models provide to these problems are shown as well. All the fundamental concepts of RRAM model development, such as device operation, switching dynamics, and current-voltage relationships, are covered in detail in this work. Popular models proposed by Chua, HP Labs, Yakopcic, TEAM, Stanford/ASU, Ielmini, Berco-Tseng, and many others are compared and analyzed extensively on various parameters. The workings and implementations of window functions such as Joglekar, Biolek, and Prodromakis are presented and compared as well. New well-defined modeling concepts are discussed that increase the applicability and accuracy of the models; their use brings several improvements to the existing models, which are enumerated in this work. Following the template presented, highly accurate models can be developed, which will greatly help future model developers and the modeling community.
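
    As an illustration of the modeling concepts surveyed, the HP Labs linear ion drift model with the Joglekar window f(x) = 1 − (2x − 1)^(2p) can be sketched in a few lines; the window forces drift to vanish at the state boundaries (parameter values are illustrative):

```python
def joglekar_window(x, p=2):
    """Joglekar window: f(x) = 1 - (2x - 1)**(2p); f(0) = f(1) = 0, f(0.5) = 1."""
    return 1.0 - (2.0 * x - 1.0) ** (2 * p)

def memristor_step(x, v, dt=1e-6, r_on=100.0, r_off=16000.0, mu=1e-14, d=1e-8):
    """One Euler step of the HP linear ion drift model with a Joglekar window.
    x is the normalized doped-region width; returns (new_x, current)."""
    r = r_on * x + r_off * (1.0 - x)          # memristance: weighted resistances
    i = v / r                                  # Ohm's law
    dx = mu * r_on / d ** 2 * i * joglekar_window(x) * dt
    return min(max(x + dx, 0.0), 1.0), i

x_new, current = memristor_step(0.5, 1.0)  # positive bias drives x upward
```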

  7. Progress Report on the Airborne Metadata and Time Series Working Groups of the 2016 ESDSWG

    NASA Astrophysics Data System (ADS)

    Evans, K. D.; Northup, E. A.; Chen, G.; Conover, H.; Ames, D. P.; Teng, W. L.; Olding, S. W.; Krotkov, N. A.

    2016-12-01

    NASA's Earth Science Data Systems Working Groups (ESDSWG) were created over 10 years ago. The role of the ESDSWG is to make recommendations relevant to NASA's Earth science data systems based on users' experiences. Each group works independently, focusing on a unique topic. Participation in ESDSWG groups comes from a variety of NASA-funded science and technology projects, including MEaSUREs and ROSS. Participants include NASA information technology experts, affiliated contractor staff and other interested community members from academia and industry. Recommendations from the ESDSWG groups will enhance NASA's efforts to develop long-term data products. The Airborne Metadata Working Group is evaluating the suitability of the current Common Metadata Repository (CMR) and Unified Metadata Model (UMM) for airborne data sets and developing new recommendations as necessary. The overarching goal is to enhance the usability, interoperability, discovery and distribution of airborne observational data sets. This will be done by assessing the gaps in the current UMM model for airborne data using lessons learned from current and past field campaigns, listening to user needs and community recommendations, and assessing the suitability of ISO metadata and other standards to fill the gaps. The Time Series Working Group (TSWG) is a continuation of the 2015 Time Series/WaterML2 Working Group. The TSWG is using a case-study-driven approach to test the new Open Geospatial Consortium (OGC) TimeseriesML standard to determine any deficiencies with respect to its ability to fully describe and encode NASA earth observation-derived time series data. To do this, the working group is engaging with the OGC TimeseriesML Standards Working Group (SWG) regarding unsatisfied needs and possible solutions. The effort will end with the drafting of an OGC Engineering Report based on the use cases and interactions with the OGC TimeseriesML SWG. Progress towards finalizing recommendations will be presented at the meeting.

  8. Marginal and internal fit of cobalt-chromium copings fabricated using the conventional and the direct metal laser sintering techniques: A comparative in vitro study.

    PubMed

    Ullattuthodi, Sujana; Cherian, Kandathil Phillip; Anandkumar, R; Nambiar, M Sreedevi

    2017-01-01

    This in vitro study sought to evaluate and compare the marginal and internal fit of cobalt-chromium copings fabricated using the conventional and direct metal laser sintering (DMLS) techniques. A master model of a prepared molar tooth was made using cobalt-chromium alloy. A silicone impression of the master model was made, and thirty standardized working models were then produced: twenty for the conventional lost-wax technique and ten for the DMLS technique. A total of twenty metal copings were fabricated using the two production techniques, ten samples in each group. The conventional and DMLS copings were cemented to the working models using glass ionomer cement. The marginal gap of each coping was measured at four predetermined points. The dies with the cemented copings were sectioned in a standardized manner with a heavy-duty lathe. Each sectioned sample was then analyzed for the internal gap between the die and the metal coping using a metallurgical microscope. Digital photographs were taken at ×50 magnification and analyzed using measurement software. Statistical analysis was done by unpaired t-test and analysis of variance (ANOVA). The results reveal no significant difference in the marginal gap of conventional and DMLS copings (P > 0.05) by ANOVA, whereas the mean internal gap of DMLS copings was significantly greater than that of conventional copings (P < 0.05). Within the limitations of this in vitro study, it was concluded that the internal fit of conventional copings was superior to that of the DMLS copings, while the marginal fit of copings fabricated by the two techniques showed no significant difference.

  9. Brian: a simulator for spiking neural networks in python.

    PubMed

    Goodman, Dan; Brette, Romain

    2008-01-01

    "Brian" is a new simulator for spiking neural networks, written in Python (http://brian. di.ens.fr). It is an intuitive and highly flexible tool for rapidly developing new models, especially networks of single-compartment neurons. In addition to using standard types of neuron models, users can define models by writing arbitrary differential equations in ordinary mathematical notation. Python scientific libraries can also be used for defining models and analysing data. Vectorisation techniques allow efficient simulations despite the overheads of an interpreted language. Brian will be especially valuable for working on non-standard neuron models not easily covered by existing software, and as an alternative to using Matlab or C for simulations. With its easy and intuitive syntax, Brian is also very well suited for teaching computational neuroscience.

  10. Information Architecture for Interactive Archives at the Community Coordinated Modeling Center

    NASA Astrophysics Data System (ADS)

    De Zeeuw, D.; Wiegand, C.; Kuznetsova, M.; Mullinix, R.; Boblitt, J. M.

    2017-12-01

The Community Coordinated Modeling Center (CCMC) is upgrading its metadata system for model simulations to be compliant with the SPASE metadata standard. This work is helping to enhance the SPASE standards for simulations to better describe the wide variety of models and their output. It will enable much more sophisticated and automated metrics and validation efforts at the CCMC, as well as far more robust searches for specific types of output. The new metadata will also allow much more tailored run submissions, as some code options will become selectable for Run-On-Request models. We will also demonstrate data accessibility through an implementation of the Heliophysics Application Programmer's Interface (HAPI) protocol for data otherwise available through the integrated Space Weather Analysis system (iSWA).

  11. Higgs Discovery: Impact on Composite Dynamics Technicolor & eXtreme Compositeness Thinking Fast and Slow

    NASA Astrophysics Data System (ADS)

    Sannino, Francesco

I discuss the impact of the discovery of a Higgs-like state on composite dynamics, starting by critically examining the reasons in favour of either an elementary or composite nature of this state. Accepting the standard model interpretation, I re-address the standard model vacuum stability within a Weyl-consistent computation. I will carefully examine the fundamental reasons why what has been discovered might not be the standard model Higgs. Dynamical electroweak breaking naturally addresses a number of the fundamental issues unsolved by the standard model interpretation. However, this paradigm has been challenged by the discovery of a not-so-heavy Higgs-like state. I will therefore review the recent discovery that the standard model top-induced radiative corrections naturally reduce the intrinsic non-perturbative mass of the composite Higgs state towards the desired experimental value. Not only do we have a natural and testable working framework, but we have also suggested specific gauge theories that can realise, at the fundamental level, these minimal models of dynamical electroweak symmetry breaking. These strongly coupled gauge theories are now being heavily investigated via first-principle lattice simulations with encouraging results. The new findings show that the recent naive claims made about new strong dynamics at the electroweak scale being disfavoured by the discovery of a not-so-heavy composite Higgs are unwarranted. I will then introduce the more speculative idea of extreme compositeness, according to which not only the Higgs sector of the standard model is composite but also quarks and leptons, and provide a toy example in the form of gauge-gauge duality.

  12. Partnership Working in Small Rural Primary Schools: The Best of Both Worlds. Supporting Report and Evidence

    ERIC Educational Resources Information Center

    Hill, Robert

    2014-01-01

    The aim of the research presented in this report was to investigate the most effective ways for small rural primary schools to work together in order to improve provision and raise standards. The project sought to examine the circumstances and context of small rural schools in Lincolnshire and evaluate their different leadership models (such as…

  13. Searching for Physics Beyond the Standard Model: Strongly-Coupled Field Theories at the Intensity and Energy Frontiers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brower, Richard C.

This proposal is to develop the software and algorithmic infrastructure needed for the numerical study of quantum chromodynamics (QCD), and of theories that have been proposed to describe physics beyond the Standard Model (BSM) of high energy physics, on current and future computers. This infrastructure will enable users (1) to improve the accuracy of QCD calculations to the point where they no longer limit what can be learned from high-precision experiments that seek to test the Standard Model, and (2) to determine the predictions of BSM theories in order to understand which of them are consistent with the data that will soon be available from the LHC. Work will include the extension and optimization of community codes for the next generation of leadership-class computers, the IBM Blue Gene/Q and the Cray XE/XK, and for the dedicated hardware funded for our field by the Department of Energy. Members of our collaboration at Brookhaven National Laboratory and Columbia University worked on the design of the Blue Gene/Q, and have begun to develop software for it. Under this grant we will build upon their experience to produce high-efficiency production codes for this machine. Cray XE/XK computers with many thousands of GPU accelerators will soon be available, and the dedicated commodity clusters we obtain with DOE funding include growing numbers of GPUs. We will work with our partners in NVIDIA's Emerging Technology group to scale our existing software to thousands of GPUs, and to produce highly efficient production codes for these machines. Work under this grant will also include the development of new algorithms for the effective use of heterogeneous computers, and their integration into our codes. It will include improvements of Krylov solvers and the development of new multigrid methods in collaboration with members of the FASTMath SciDAC Institute, using their HYPRE framework, as well as work on improved symplectic integrators.

  14. An automation of design and modelling tasks in NX Siemens environment with original software - generator module

    NASA Astrophysics Data System (ADS)

    Zbiciak, M.; Grabowik, C.; Janik, W.

    2015-11-01

Nowadays the constructional design process is almost exclusively aided with CAD/CAE/CAM systems. It is estimated that nearly 80% of design activities have a routine nature. These routine design tasks are highly susceptible to automation. Design automation is usually achieved with API tools which allow building original software aiding different engineering activities. In this paper, original software worked out to automate engineering tasks at the stage of a product's geometrical shape design is presented. The elaborated software works exclusively in the NX Siemens CAD/CAM/CAE environment and was prepared in Microsoft Visual Studio with application of the .NET technology and the NX SNAP library. The software functionality allows designing and modelling of spur and helicoidal involute gears. Moreover, it is possible to estimate relative manufacturing costs. With the Generator module it is possible to design and model both standard and non-standard gear wheels. The main advantage of a model generated in this way is its better representation of the involute curve in comparison to those drawn with the standard tools of specialized CAD systems. This comes from the fact that in CAD systems the involute curve is usually drawn through 3 points, corresponding to points located on the addendum circle, the reference diameter of the gear and the base circle, respectively. In the Generator module the involute curve is drawn through 11 points located on and above the base circle up to the addendum circle; therefore the 3D gear wheel models are highly accurate. Application of the Generator module makes the modelling process very rapid, so that the gear wheel modelling time is reduced to several seconds. During the conducted research an analysis of the differences between the standard 3-point and the 11-point involutes was made. The results and conclusions drawn from this analysis are shown in detail.
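
The denser sampling described above can be sketched directly from the parametric equations of an involute of a circle, x = r_b(cos t + t sin t), y = r_b(sin t - t cos t). The radii and point count below are hypothetical, chosen only to mirror the 11-point sampling between the base and addendum circles that the abstract describes.

```python
from math import cos, sin, sqrt

def involute_points(base_radius, tip_radius, n=11):
    """Sample n points on an involute of a circle, from the base circle
    out to the tip (addendum) radius, using the parametric form
    x = rb*(cos t + t*sin t), y = rb*(sin t - t*cos t)."""
    # roll angle t at which the involute reaches radius r: r = rb*sqrt(1+t^2)
    t_max = sqrt((tip_radius / base_radius) ** 2 - 1.0)
    pts = []
    for k in range(n):
        t = t_max * k / (n - 1)
        pts.append((base_radius * (cos(t) + t * sin(t)),
                    base_radius * (sin(t) - t * cos(t))))
    return pts

# e.g. a hypothetical gear with base radius 40 mm and addendum radius 46 mm
pts = involute_points(40.0, 46.0, n=11)
```

A 3-point spline through the addendum circle, reference diameter and base circle interpolates the same curve far more coarsely, which is the accuracy gap the Generator module exploits.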

  15. Working Alliance, Interpersonal Problems, and Depressive Symptoms in Tele-Interpersonal Psychotherapy for HIV-infected Rural Persons: Evidence for Indirect Effects.

    PubMed

    Anderson, Timothy; McClintock, Andrew S; McCarrick, Shannon S; Heckman, Timothy G; Heckman, Bernadette D; Markowitz, John C; Sutton, Mark

    2018-03-01

Interpersonal psychotherapy (IPT) has demonstrated efficacy for the treatment of depression, yet little is known about its therapeutic mechanisms. As a specific treatment, IPT has been shown to directly reduce depressive symptoms, although it is unclear whether these reductions occur via interpersonal changes. Within IPT, the potential role of the working alliance, a common factor, as a predictor of depression and interpersonal changes is also unclear. Participants were 147 depressed persons living with HIV in rural communities of 28 U.S. states enrolled in a randomized clinical trial. Seventy-five patients received up to 9 sessions of telephone-administered IPT (tele-IPT) plus standard care and 72 patients received standard care only. Two models were tested: one included treatment condition (tele-IPT vs. control) and the other included the working alliance as the independent variable. The first model found an indirect effect whereby tele-IPT reduced depression via decreased social avoidance, as well as a direct effect between tele-IPT and reduced depression. In the second model, the working alliance influenced depressive symptom relief via reductions in social avoidance. Both the goal and the task working alliance subscales were indirectly associated with reductions in depressive symptoms, also through reductions in social avoidance. There were no direct effects involving the working alliance. Tele-IPT's influence on depressive symptom reduction was primarily through a direct effect, whereas the influence of the working alliance on depression was almost entirely indirect, via interpersonal problems. Study findings have implications for IPT when intervening with depressed rural people living with HIV/AIDS over the telephone. © 2017 Wiley Periodicals, Inc.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang Lei; Yang Jinmin

Little Higgs theory naturally predicts a light Higgs boson whose most important discovery channel at the LHC is the diphoton signal pp→h→γγ. In this work, we perform a comparative study of this signal in some typical little Higgs models, namely, the littlest Higgs model, two littlest Higgs models with T-parity (named LHT-I and LHT-II), and the simplest little Higgs model. We find that compared with the standard model prediction, the diphoton signal rate is always suppressed and the extent of suppression can be quite different for different models. The suppression is mild (≲10%) in the littlest Higgs model but can be quite severe (≈90%) in the other three models. This means that discovering the light Higgs boson predicted by little Higgs theory through the diphoton channel at the LHC will be more difficult than discovering the standard model Higgs boson.

  17. Defending Against Advanced Persistent Threats Using Game-Theory.

    PubMed

    Rass, Stefan; König, Sandra; Schauer, Stefan

    2017-01-01

Advanced persistent threats (APT) combine a variety of different attack forms ranging from social engineering to technical exploits. The diversity and usual stealthiness of APTs turn them into a central problem of contemporary practical system security, since information on attacks, the current system status or the attacker's incentives is often vague, uncertain and in many cases even unavailable. Game theory is a natural approach to model the conflict between the attacker and the defender, and this work investigates a generalized class of matrix games as a risk mitigation tool for advanced persistent threat (APT) defense. Unlike standard game and decision theory, our model is tailored to capture and handle the full uncertainty that is inherent to APTs, such as disagreement among qualitative expert risk assessments, unknown adversarial incentives and uncertainty about the current system state (in terms of how deeply the attacker may have penetrated into the system's protective shells already). Practically, game-theoretic APT models can be derived straightforwardly from topological vulnerability analysis, together with risk assessments as they are done in common risk management standards like the ISO 31000 family. Theoretically, these models come with different properties than classical game-theoretic models, whose technical solution presented in this work may be of independent interest.
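
A minimal illustration of the matrix-game setting, assuming a toy 2x2 zero-sum payoff matrix with invented values (the paper's generalized games handle distribution-valued payoffs and are considerably richer). Fictitious play is used here as a generic equilibrium-approximation method, not the authors' solution technique.

```python
def fictitious_play(A, rounds=20000):
    """Approximate the mixed equilibrium of a zero-sum matrix game by
    fictitious play: each player best-responds to the opponent's
    empirical mixture so far. A[i][j] = payoff to the defender (rows)."""
    m, n = len(A), len(A[0])
    row_counts, col_counts = [0] * m, [0] * n
    row_vals, col_vals = [0.0] * m, [0.0] * n  # cumulative payoffs
    for _ in range(rounds):
        i = max(range(m), key=lambda r: row_vals[r])  # defender maximizes
        j = min(range(n), key=lambda c: col_vals[c])  # attacker minimizes
        row_counts[i] += 1
        col_counts[j] += 1
        for r in range(m):
            row_vals[r] += A[r][j]
        for c in range(n):
            col_vals[c] += A[i][c]
    return ([c / rounds for c in row_counts],
            [c / rounds for c in col_counts])

# Illustrative 2x2 defense game (made-up payoffs): rows = defender actions
# (patch, monitor), columns = attacker actions (exploit, wait).
defense, attack = fictitious_play([[1.0, -1.0], [-1.0, 1.0]])
```

For this matching-pennies-style matrix the empirical frequencies converge toward the uniform mixed strategy, the kind of randomized defense policy a game-theoretic APT model would prescribe.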

  18. Experimental demonstration of nonbilocal quantum correlations

    PubMed Central

    Saunders, Dylan J.; Bennet, Adam J.; Branciard, Cyril; Pryde, Geoff J.

    2017-01-01

    Quantum mechanics admits correlations that cannot be explained by local realistic models. The most studied models are the standard local hidden variable models, which satisfy the well-known Bell inequalities. To date, most works have focused on bipartite entangled systems. We consider correlations between three parties connected via two independent entangled states. We investigate the new type of so-called “bilocal” models, which correspondingly involve two independent hidden variables. These models describe scenarios that naturally arise in quantum networks, where several independent entanglement sources are used. Using photonic qubits, we build such a linear three-node quantum network and demonstrate nonbilocal correlations by violating a Bell-like inequality tailored for bilocal models. Furthermore, we show that the demonstration of nonbilocality is more noise-tolerant than that of standard Bell nonlocality in our three-party quantum network. PMID:28508045

  19. A participatory model for improving occupational health and safety: improving informal sector working conditions in Thailand.

    PubMed

    Manothum, Aniruth; Rukijkanpanich, Jittra; Thawesaengskulthai, Damrong; Thampitakkul, Boonwa; Chaikittiporn, Chalermchai; Arphorn, Sara

    2009-01-01

    The purpose of this study was to evaluate the implementation of an Occupational Health and Safety Management Model for informal sector workers in Thailand. The studied model was characterized by participatory approaches to preliminary assessment, observation of informal business practices, group discussion and participation, and the use of environmental measurements and samples. This model consisted of four processes: capacity building, risk analysis, problem solving, and monitoring and control. The participants consisted of four local labor groups from different regions, including wood carving, hand-weaving, artificial flower making, and batik processing workers. The results demonstrated that, as a result of applying the model, the working conditions of the informal sector workers had improved to meet necessary standards. This model encouraged the use of local networks, which led to cooperation within the groups to create appropriate technologies to solve their problems. The authors suggest that this model could effectively be applied elsewhere to improve informal sector working conditions on a broader scale.

  20. Reconceptualizing the nature and health consequences of work-related insecurity for the new economy: the decline of workers' power in the flexibility regime.

    PubMed

    Scott, Heather K

    2004-01-01

    This article aims to reconceptualize job insecurity in a manner relevant to shifts in the power relations of work that have accompanied globalization, in order to assess the implications for workers' health. The linkage between job insecurity and health has been well established, but little formal theorizing has analyzed the mechanisms responsible. Implicitly, however, the assumption remains that its role as a stressor is limited to the realm of job strain, whereby workers lack control over a threatened employment situation. Within this framework, job insecurity and related dimensions of power remain locked in the "box" of the standard employment relationship, precluding an analysis of work-related insecurity in the new context of globalization. In contrast, the author constructs a model of work-related insecurity that takes into account current shifts in the balance of power toward employers, which in turn has undermined the fundamental quid pro quo associated with the standard postwar model of employment. She proposes that job insecurity is no longer a mere temporary break in an otherwise predictable work-life pattern but rather a structural feature of the new labor market. Emerging contingencies associated with the New Economy, "flexibilized" employment relationships, and diminution of workers' power have constituted work-related insecurity as a chronic stressor with several implications for long-term health outcomes at the individual and societal levels.

  1. An assessment model for quality management

    NASA Astrophysics Data System (ADS)

    Völcker, Chr.; Cass, A.; Dorling, A.; Zilioli, P.; Secchi, P.

    2002-07-01

SYNSPACE together with InterSPICE and Alenia Spazio is developing an assessment method to determine the capability of an organisation in the area of quality management. The method, sponsored by the European Space Agency (ESA), is called S9kS (SPiCE-9000 for SPACE). S9kS is based on ISO 9001:2000 with additions from the quality standards issued by the European Committee for Space Standardization (ECSS) and ISO 15504 - Process Assessments. The result is a reference model that supports the expansion of the generic process assessment framework provided by ISO 15504 to non-software areas. In order to be compliant with ISO 15504, requirements from ISO 9001 and ECSS-Q-20 and Q-20-09 have been turned into process definitions in terms of Purpose and Outcomes, supported by a list of detailed indicators such as Practices, Work Products and Work Product Characteristics. In coordination with this project, the capability dimension of ISO 15504 has been revised to be consistent with ISO 9001. As the contributions from ISO 9001 and the space quality assurance standards are separable, the stripped-down version S9k offers organisations in all industries an assessment model based solely on ISO 9001, and is therefore of interest to all organisations which intend to improve their quality management system based on ISO 9001.

  2. Automated Transformation of CDISC ODM to OpenClinica.

    PubMed

    Gessner, Sophia; Storck, Michael; Hegselmann, Stefan; Dugas, Martin; Soto-Rey, Iñaki

    2017-01-01

Due to the increasing use of electronic data capture systems for clinical research, interest in saving resources by automatically generating and reusing case report forms in clinical studies is growing. OpenClinica, an open-source electronic data capture system, enables the reuse of metadata in its own Excel import template, hampering the reuse of metadata defined in other standard formats. One of these standard formats is the Operational Data Model for metadata, administrative and clinical data in clinical studies. This work suggests a mapping from the Operational Data Model to OpenClinica and describes the implementation of a converter to automatically generate OpenClinica-conform case report forms based upon metadata in the Operational Data Model.
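
A heavily simplified sketch of the mapping idea: parse ODM item definitions and translate their data types into OpenClinica-style rows. The ODM fragment is minimal and the type map is an assumption for illustration; the real converter handles forms, item groups, code lists and the full Excel import template.

```python
import xml.etree.ElementTree as ET

# Minimal, hypothetical ODM fragment; real ODM documents carry far more
# metadata (StudyEventDefs, FormDefs, ItemGroupDefs, CodeLists, ...).
ODM = """<ODM xmlns="http://www.cdisc.org/ns/odm/v1.3">
  <Study OID="S1"><MetaDataVersion OID="M1">
    <ItemDef OID="I.AGE" Name="AGE" DataType="integer"/>
    <ItemDef OID="I.SEX" Name="SEX" DataType="text"/>
  </MetaDataVersion></Study>
</ODM>"""

ODM_NS = "{http://www.cdisc.org/ns/odm/v1.3}"
TYPE_MAP = {"integer": "INT", "text": "ST"}  # assumed ODM -> OpenClinica types

def odm_items(xml_text):
    """Extract ItemDefs and map each to a simplified stand-in for a row
    of the OpenClinica Excel import template."""
    root = ET.fromstring(xml_text)
    return [{"ITEM_NAME": item.get("Name"),
             "DATA_TYPE": TYPE_MAP.get(item.get("DataType"), "ST")}
            for item in root.iter(ODM_NS + "ItemDef")]

rows = odm_items(ODM)
```

In the actual converter, such rows would be written into the worksheet layout that OpenClinica expects for case report form import.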

  3. Eldercare responsibilities, interrole conflict, and employee absence: a daily study.

    PubMed

    Hepburn, C G; Barling, J

    1996-07-01

    A model was developed specifying that the number of hours employees spend providing care to or interacting with elderly parents predicts conflict between the roles of employee and caregiver. Interrole conflict was subsequently expected to predict partial absence from work (e.g., arriving late). Seventeen employed eldercare providers completed a daily questionnaire for 20 work days. The data were standardized and pooled, and the proposed model was tested by using structural equation modeling. The proposed model provided a good fit to the data. A competing model that added the direct effects of hours of interacting with and hours of providing care to parents on partial absence provided a significantly better fit. The potential impact of the findings on employees and organizations is discussed.
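
The standardize-and-pool step in the abstract can be sketched as follows. The within-person z-scoring shown here is one common convention for pooled diary data, and the daily hour counts are invented for illustration.

```python
from statistics import mean, pstdev

def standardize_and_pool(daily):
    """Z-score each employee's daily values within person, then pool all
    person-days into one sample (a common prelude to fitting a pooled
    structural equation / path model on diary data; illustrative only)."""
    pooled = []
    for person_days in daily:
        m, s = mean(person_days), pstdev(person_days)
        pooled.extend([(x - m) / s if s else 0.0 for x in person_days])
    return pooled

# Hypothetical caregiving hours for three employees over five work days
hours = [[1, 2, 0, 3, 1], [4, 4, 5, 3, 4], [0, 0, 1, 0, 0]]
z = standardize_and_pool(hours)
```

Standardizing within person removes stable between-person differences, so the pooled data reflect the daily fluctuations that the model links to interrole conflict and partial absence.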

  4. Battery Ownership Model - Medium Duty HEV Battery Leasing & Standardization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kelly, Ken; Smith, Kandler; Cosgrove, Jon

    2015-12-01

    Prepared for the U.S. Department of Energy, this milestone report focuses on the economics of leasing versus owning batteries for medium-duty hybrid electric vehicles as well as various battery standardization scenarios. The work described in this report was performed by members of the Energy Storage Team and the Vehicle Simulation Team in NREL's Transportation and Hydrogen Systems Center along with members of the Vehicles Analysis Team at Ricardo.

  5. Model performance specifications for police traffic radar devices

    DOT National Transportation Integrated Search

    1982-03-01

    This report provides information about all of the research work regarding police traffic radar completed by the National Bureau of Standards (NBS) under an Inter-Agency Agreement with the National Highway Traffic Safety Administration (NHTSA). Chapte...

  6. Integrating water data, models and forecasts - the Australian Water Resources Information System (Invited)

    NASA Astrophysics Data System (ADS)

    Argent, R.; Sheahan, P.; Plummer, N.

    2010-12-01

    Under the Commonwealth Water Act 2007 the Bureau of Meteorology was given a new national role in water information, encompassing standards, water accounts and assessments, hydrological forecasting, and collecting, enhancing and making freely available Australia’s water information. The Australian Water Resources Information System (AWRIS) is being developed to fulfil part of this role, by providing foundational data, information and model structures and services. Over 250 organisations across Australia are required to provide water data and metadata to the Bureau, including federal, state and local governments, water storage management and hydroelectricity companies, rural and urban water utilities, and catchment management bodies. The data coverage includes the categories needed to assess and account for water resources at a range of scales. These categories are surface, groundwater and meteorological observations, water in storages, water restrictions, urban and irrigation water use and flows, information on rights, allocations and trades, and a limited suite of water quality parameters. These data are currently supplied to the Bureau via a file-based delivery system at various frequencies from annual to daily or finer, and contain observations taken at periods from minutes to monthly or coarser. One of the primary keys to better data access and utilisation is better data organisation, including content and markup standards. As a significant step on the path to standards for water data description, the Bureau has developed a Water Data Transfer Format (WDTF) for transmission of a variety of water data categories, including site metadata. WDTF is adapted from the OGC’s observation and sampling-features standard. The WDTF XML schema is compatible with the OGC's Web Feature Service (WFS) interchange standard, and conforms to GML Simple Features profile (GML-SF) level 1, emphasising the importance of standards in data exchange. 
In the longer term we are also working with the OGC’s Hydrology Domain Working Group on the development of WaterML 2, which will provide an international standard applicable to a sub-set of the information handled by WDTF. Making water data accessible for multiple uses, such as for predictive models and external products, has required the development of consistent data models for describing the relationships between the various data elements. Early development of the AWRIS data model has utilised a model-driven architecture approach, the benefits of which are likely to accrue in the long term, as more products and services are developed from the common core. Moving on from our initial focus on data organisation and management, the Bureau is in the early stages of developing an integrated modelling suite (the Bureau Hydrological Modelling System - BHMS) which will encompass the variety of hydrological modelling needs of the Bureau, ranging from water balances, assessments and accounts, to streamflow and hydrological forecasting over scales from hours and days to years and decades. It is envisaged that this modelling suite will also be developed, as far as possible, using standardised, discoverable services to enhance data-model and model-model integration.

  7. Classification of Brazilian and foreign gasolines adulterated with alcohol using infrared spectroscopy.

    PubMed

    da Silva, Neirivaldo C; Pimentel, Maria Fernanda; Honorato, Ricardo S; Talhavini, Marcio; Maldaner, Adriano O; Honorato, Fernanda A

    2015-08-01

The smuggling of products across the border regions of many countries is a practice to be fought. Brazilian authorities are increasingly worried about the illicit trade of fuels along the frontiers of the country. In order to confirm this as a crime, the Federal Police must have a means of identifying the origin of the fuel. This work describes the development of a rapid and nondestructive methodology to classify gasoline as to its origin (Brazil, Venezuela and Peru), using infrared spectroscopy and multivariate classification. Partial Least Squares Discriminant Analysis (PLS-DA) and Soft Independent Modeling of Class Analogy (SIMCA) models were built. Direct standardization (DS) was employed to standardize the spectra obtained in different laboratories of the border units of the Federal Police. Two approaches were considered in this work: (1) local and (2) global classification models. When using Approach 1, PLS-DA achieved 100% correct classification, and the deviation of the predicted values for the secondary instrument decreased considerably after performing DS. In this case, SIMCA models were not efficient in the classification, even after standardization. Using a global model (Approach 2), both the PLS-DA and SIMCA techniques were effective after performing DS. Considering that real situations may involve questioned samples from other nations (such as Peru), the SIMCA method developed according to Approach 2 is more adequate, since the sample will be classified as neither Brazilian nor Venezuelan. This methodology could be applied to other forensic problems involving the chemical classification of a product, provided that specific modeling is performed. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
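
The instrument-transfer idea behind direct standardization can be sketched in a deliberately simplified per-wavelength form. Full DS fits a transformation matrix across neighbouring spectral channels; the single-channel linear correction below is only an illustrative stand-in, and the spectra are invented.

```python
def fit_channel_correction(secondary, primary):
    """Fit a per-wavelength slope/offset mapping secondary-instrument
    spectra onto the primary instrument, by least squares on a set of
    transfer samples measured on both instruments. A univariate,
    simplified stand-in for direct standardization."""
    n_channels = len(primary[0])
    coefs = []
    for ch in range(n_channels):
        x = [s[ch] for s in secondary]
        y = [p[ch] for p in primary]
        mx, my = sum(x) / len(x), sum(y) / len(y)
        sxx = sum((xi - mx) ** 2 for xi in x)
        sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
        slope = sxy / sxx if sxx else 1.0
        coefs.append((slope, my - slope * mx))
    return coefs

def apply_correction(spectrum, coefs):
    """Map one secondary-instrument spectrum onto the primary domain."""
    return [a * v + b for v, (a, b) in zip(spectrum, coefs)]

# Toy transfer set: the secondary instrument reads 10% high with +0.1 offset
primary   = [[0.2, 0.5, 0.9], [0.4, 0.7, 1.1], [0.3, 0.6, 1.0]]
secondary = [[round(0.1 + 1.1 * v, 4) for v in s] for s in primary]
coefs = fit_channel_correction(secondary, primary)
```

After such a correction, spectra from the border-unit (secondary) instruments can be fed to a classification model built on the primary instrument, which is the role DS plays in this study.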

  8. Ad Hoc working group on diurnal and semi-diurnal Earth Orientation variation.

    NASA Astrophysics Data System (ADS)

    Gipson, J. M.

    2017-12-01

Diurnal and semi-diurnal Earth orientation ("HF-EOP") variations were detected in the early 1990s in SLR and VLBI data. Beginning in 1996, a model of HF-EOP variation based on ocean tides derived from Topex data was included in the IERS standards. This model has not been updated since then, with the exception of including libration effects for polar motion (2003 IERS Conventions) and UT1 (2010 IERS Conventions). The accuracy of space geodesy has increased remarkably over the last 20 years and the 1996 IERS HF-EOP model is no longer adequate. At the conclusion of the 2017 GGOS/IERS Unified Analysis Workshop an ad hoc working group was formed, including representatives of the IDS, IGS, ILRS, and IVS. The goal of the working group is to test several HF-EOP models in the different space geodesy techniques and to make a recommendation to the IERS for the adoption of a new HF-EOP model. In this presentation I will give a summary of the work on HF-EOP done to date by various scientists, which demonstrates the inadequacy of the current IERS HF-EOP model. I will then describe the goals and the progress of the working group to date, with a preview of further work.

  9. B→πll Form Factors for New Physics Searches from Lattice QCD.

    PubMed

    Bailey, Jon A; Bazavov, A; Bernard, C; Bouchard, C M; DeTar, C; Du, Daping; El-Khadra, A X; Freeland, E D; Gámiz, E; Gottlieb, Steven; Heller, U M; Kronfeld, A S; Laiho, J; Levkova, L; Liu, Yuzhi; Lunghi, E; Mackenzie, P B; Meurice, Y; Neil, E; Qiu, Si-Wei; Simone, J N; Sugar, R; Toussaint, D; Van de Water, R S; Zhou, Ran

    2015-10-09

    The rare decay B→πℓ^{+}ℓ^{-} arises from b→d flavor-changing neutral currents and could be sensitive to physics beyond the standard model. Here, we present the first ab initio QCD calculation of the B→π tensor form factor f_{T}. Together with the vector and scalar form factors f_{+} and f_{0} from our companion work [J. A. Bailey et al., Phys. Rev. D 92, 014024 (2015)], these parametrize the hadronic contribution to B→π semileptonic decays in any extension of the standard model. We obtain the total branching ratio BR(B^{+}→π^{+}μ^{+}μ^{-})=20.4(2.1)×10^{-9} in the standard model, which is the most precise theoretical determination to date, and agrees with the recent measurement from the LHCb experiment [R. Aaij et al., J. High Energy Phys. 12 (2012) 125].

  10. B→πℓℓ Form Factors for New-Physics Searches from Lattice QCD

    DOE PAGES

    Bailey, Jon A.

    2015-10-07

The rare decay B→πℓ^{+}ℓ^{-} arises from b→d flavor-changing neutral currents and could be sensitive to physics beyond the standard model. Here, we present the first ab initio QCD calculation of the B→π tensor form factor f_{T}. Together with the vector and scalar form factors f_{+} and f_{0} from our companion work [J. A. Bailey et al., Phys. Rev. D 92, 014024 (2015)], these parametrize the hadronic contribution to B→π semileptonic decays in any extension of the standard model. We obtain the total branching ratio BR(B^{+}→π^{+}μ^{+}μ^{-})=20.4(2.1)×10^{-9} in the standard model, which is the most precise theoretical determination to date, and agrees with the recent measurement from the LHCb experiment [R. Aaij et al., J. High Energy Phys. 12 (2012) 125].

  11. A Framework for Comprehensive Health Terminology Systems in the United States

    PubMed Central

    Chute, Christopher G.; Cohn, Simon P.; Campbell, James R.

    1998-01-01

    Health care in the United States has become an information-intensive industry, yet electronic health records represent patient data inconsistently for lack of clinical data standards. Classifications that have achieved common acceptance, such as the ICD-9-CM or ICD, aggregate heterogeneous patients into broad categories, which preclude their practical use in decision support, development of refined guidelines, or detailed comparison of patient outcomes or benchmarks. This document proposes a framework for the integration and maturation of clinical terminologies that would have practical applications in patient care, process management, outcome analysis, and decision support. Arising from the two working groups within the standards community—the ANSI (American National Standards Institute) Healthcare Informatics Standards Board Working Group and the Computer-based Patient Records Institute Working Group on Codes and Structures—it outlines policies regarding 1) functional characteristics of practical terminologies, 2) terminology models that can broaden their applications and contribute to their sustainability, 3) maintenance attributes that will enable terminologies to keep pace with rapidly changing health care knowledge and process, and 4) administrative issues that would facilitate their accessibility, adoption, and application to improve the quality and efficiency of American health care. PMID:9824798

  12. The Evonik-Mainz Eye Care-Study (EMECS): Development of an Expert System for Glaucoma Risk Detection in a Working Population.

    PubMed

    Wahl, Jochen; Barleon, Lorenz; Morfeld, Peter; Lichtmeß, Andrea; Haas-Brähler, Sibylle; Pfeiffer, Norbert

    2016-01-01

    To develop an expert system for glaucoma screening in a working population, based on a human expert procedure using images of the optic nerve head (ONH), visual field testing (frequency doubling technology, FDT) and intraocular pressure (IOP). 4167 of 13037 (32%) Evonik Industries employees between 40 and 65 years of age were screened. An experienced glaucoma expert (JW) assessed papilla parameters and evaluated all individual screening results. His classification into "no glaucoma", "possible glaucoma" and "probable glaucoma" was defined as the gold standard. A screening model was developed and tested against this gold standard. The model took into account the assessment of the ONH; the values and relationships of the CDR, IOP and FDT results were considered additionally, and a glaucoma score was generated. The structure of the screening model was specified a priori, whereas the values of its parameters were chosen post hoc to optimize the sensitivity and specificity of the algorithm. Simple screening models based on IOP and/or FDT alone were investigated for comparison. The expert classified 111 persons (2.66%) as glaucoma suspects, thereof 13 (0.31%) as probable and 98 (2.35%) as possible glaucoma suspects. Re-evaluation by the screening model revealed a sensitivity of 83.8% and a specificity of 99.6% for all glaucoma suspects. The positive predictive value of the model was 80.2%, the negative predictive value 99.6%. The simple screening models showed insufficient diagnostic accuracy. Adjustment of ONH and symmetry parameters with respect to excavation and IOP in an expert system produced satisfactory diagnostic accuracy. This screening model appears applicable in a working population of relatively low age and low glaucoma prevalence. Different experts should validate the model in different populations.
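
    The diagnostic accuracy figures quoted above (sensitivity, specificity, PPV, NPV) all derive from a 2x2 confusion matrix of model classifications against the expert gold standard. A minimal sketch of that computation, using invented counts rather than the study's data:

```python
# Illustrative computation of screening metrics from a 2x2 confusion
# matrix. The counts below are hypothetical, not the study's data.

def screening_metrics(tp, fp, fn, tn):
    """Return sensitivity, specificity, PPV and NPV for a confusion matrix."""
    sensitivity = tp / (tp + fn)   # true-positive rate
    specificity = tn / (tn + fp)   # true-negative rate
    ppv = tp / (tp + fp)           # positive predictive value
    npv = tn / (tn + fn)           # negative predictive value
    return sensitivity, specificity, ppv, npv

# Hypothetical counts for a low-prevalence screening population
sens, spec, ppv, npv = screening_metrics(tp=84, fp=4, fn=16, tn=996)
print(f"sensitivity={sens:.3f} specificity={spec:.3f} ppv={ppv:.3f} npv={npv:.3f}")
```

    Note how, at low disease prevalence, even a high specificity leaves the PPV sensitive to a handful of false positives.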

  13. A Cook's Tour: The Journey to Become a Model Culinary Arts Academy

    ERIC Educational Resources Information Center

    Bantang, Susan C.

    2008-01-01

    This article is about how the West Boca Raton Community High School's Culinary Arts Academy achieved national model status as it works to prepare the next generation of culinary artists. The culinary academy, established in 2004, adopted national standards that have served as a foundation for its excellence. In November 2007, the National Career…

  14. Ex-Nihilo: Obstacles Surrounding Teaching the Standard Model

    ERIC Educational Resources Information Center

    Pimbblet, Kevin A.

    2002-01-01

    The model of the Big Bang is an integral part of the national curricula in England and Wales. Previous work (e.g. Baxter 1989) has shown that pupils often come into education with many and varied prior misconceptions emanating from both internal and external sources. Whilst virtually all of these misconceptions can be remedied, there will remain…

  15. Using conceptual work products of health care to design health IT.

    PubMed

    Berry, Andrew B L; Butler, Keith A; Harrington, Craig; Braxton, Melissa O; Walker, Amy J; Pete, Nikki; Johnson, Trevor; Oberle, Mark W; Haselkorn, Jodie; Paul Nichol, W; Haselkorn, Mark

    2016-02-01

    This paper introduces a new, model-based design method for interactive health information technology (IT) systems. This method extends workflow models with models of conceptual work products. When the health care work being modeled is substantially cognitive, tacit, and complex in nature, graphical workflow models can become too complex to be useful to designers. Conceptual models complement and simplify workflows by providing an explicit specification for the information product they must produce. We illustrate how conceptual work products can be modeled using standard software modeling language, which allows them to provide fundamental requirements for what the workflow must accomplish and the information that a new system should provide. Developers can use these specifications to envision how health IT could enable an effective cognitive strategy as a workflow with precise information requirements. We illustrate the new method with a study conducted in an outpatient multiple sclerosis (MS) clinic. This study shows specifically how the different phases of the method can be carried out, how the method allows for iteration across phases, and how the method generated a health IT design for case management of MS that is efficient and easy to use. Copyright © 2015 Elsevier Inc. All rights reserved.

  16. Linear model correction: A method for transferring a near-infrared multivariate calibration model without standard samples.

    PubMed

    Liu, Yan; Cai, Wensheng; Shao, Xueguang

    2016-12-05

    Calibration transfer is essential for practical applications of near-infrared (NIR) spectroscopy, because spectra may be measured on different instruments and the differences between the instruments must be corrected. Most calibration transfer methods require standard samples, whose spectra are measured on the two instruments (named the master and slave instruments, respectively) to construct the transfer model. In this work, a method named linear model correction (LMC) is proposed for calibration transfer without standard samples. The method is based on the fact that, for samples with similar physical and chemical properties, the spectra measured on different instruments are linearly correlated. Consequently, the coefficients of the linear models constructed from the spectra measured on different instruments are similar in profile. Therefore, by using a constrained optimization method, the coefficients of the master model can be transferred into those of the slave model with only a few spectra measured on the slave instrument. Two NIR datasets, of corn and plant leaf samples measured with different instruments, are used to test the performance of the method. The results show that, for both datasets, the spectra can be correctly predicted using the transferred partial least squares (PLS) models. Because standard samples are not necessary, the method may be more useful in practical applications. Copyright © 2016 Elsevier B.V. All rights reserved.
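
    The key assumption of LMC, that master and slave coefficients are similar in profile, can be sketched as a penalized least-squares fit that pulls the slave coefficients toward the master model. This is a simplified stand-in for the paper's constrained optimization, with synthetic data and illustrative names:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "spectra": 40 master calibration samples, 120 wavelengths
X_master = rng.normal(size=(40, 120))
b_master = rng.normal(size=120)          # master model coefficients
y_master = X_master @ b_master

# Only a few samples measured on the slave instrument (slightly shifted response)
X_slave = rng.normal(size=(8, 120))
y_slave = X_slave @ (b_master + 0.05 * rng.normal(size=120))

def transfer_coefficients(X, y, b_master, lam=1.0):
    """Slave coefficients that fit the slave spectra while staying close
    in profile to the master coefficients (ridge toward b_master):
        b = argmin ||y - X b||^2 + lam * ||b - b_master||^2
    """
    n = X.shape[1]
    A = X.T @ X + lam * np.eye(n)
    return np.linalg.solve(A, X.T @ y + lam * b_master)

b_slave = transfer_coefficients(X_slave, y_slave, b_master, lam=1.0)
```

    With lam → ∞ the slave model collapses to the master coefficients; with lam → 0 it reduces to an under-determined least-squares fit on the few slave spectra.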

  17. A study of two subgrid-scale models and their effects on wake breakdown behind a wind turbine in uniform inflow

    NASA Astrophysics Data System (ADS)

    Martinez, Luis; Meneveau, Charles

    2014-11-01

    Large Eddy Simulations (LES) of the flow past a single wind turbine with uniform inflow have been performed. A goal of the simulations is to compare two turbulence subgrid-scale models and their effects in predicting the initial breakdown, transition and evolution of the wake behind the turbine; prior works have often observed negligible sensitivity to the subgrid-scale model. The flow is modeled using an in-house LES code with pseudo-spectral discretization in horizontal planes and centered finite differencing in the vertical direction. Turbines are represented using the actuator line model. We compare the standard constant-coefficient Smagorinsky subgrid-scale model with the Lagrangian Scale Dependent Dynamic model (LSDM). The LSDM predicts faster transition to turbulence in the wake, whereas the standard Smagorinsky model predicts significantly delayed transition: the specified Smagorinsky coefficient is larger than the dynamic one on average, increasing diffusion and thus delaying transition. A second goal is to compare the resulting near-blade properties, such as local aerodynamic forces, from the LES with Blade Element Momentum Theory. Results will also be compared with those of the SOWFA package, the wind energy CFD framework from NREL. This work is supported by NSF (IGERT and IIA-1243482) and computations use XSEDE resources, and has benefitted from interactions with Dr. M. Churchfield of NREL.

  18. Geophysics in INSPIRE

    NASA Astrophysics Data System (ADS)

    Sőrés, László

    2013-04-01

    INSPIRE is a European directive to harmonize spatial data in Europe. Its aim is to establish a transparent, multidisciplinary network of environmental information by using international standards and OGC web services. The spatial data themes defined in the annexes of the directive cover 34 domains closely tied to the environment and spatial information. According to the INSPIRE roadmap, all data providers must set up discovery, viewing and download services and restructure their data stores to provide spatial data as defined by the underlying specifications by 1 December 2014. More than 3000 institutions are going to be involved in this process. During the data specification process, geophysics, as an essential source of geo-information, was introduced into Annex II Geology. Within the Geology theme, Geophysics is divided into a core and an extended model. The core model contains specifications for legally binding data provisioning and is going to be part of the Implementation Rules of the INSPIRE directive. To minimize the workload of obligatory data transformations, the scope of the core model is very limited and simple. It covers the most essential geophysical feature types that are relevant in an economic and environmental context. To fully support the use cases identified by the stakeholders, the extended model was developed. It contains a wide range of spatial object types for geophysical measurements, processed and interpreted results, and wrapper classes to help data providers use the Observations and Measurements (O&M) standard for geophysical data exchange. Instead of introducing the traditional concept of "geophysical methods" at a high structural level, the data model classifies measurements and geophysical models based on their spatial characteristics. Measurements are classified as geophysical station (point), geophysical profile (curve) and geophysical swath (surface). Generic classes for processing results and interpretation models are curve model (1D), surface model (2D), and solid model (3D). Both measurements and models are derived from O&M sampling features that may be linked to sampling procedures and observation results. Geophysical products are the output of complex procedures and can be described precisely as chains of consecutive O&M observations. For describing geophysical processes and results, the data model supports both the OGC standard XML encodings (SensorML, SWE, GML) and traditional industry standards (SPS, UKOOA, SEG formats). To control the scope of the model and to harmonize terminology, an initial set of extendable code lists was developed. The attempt to create a hierarchical SKOS vocabulary of terms for geophysical methods, resource types, processes, properties and technical parameters was partly based on the work done in the eContentPlus GEOMIND project. The result is far from complete, and the work must be continued in the future.

  19. Improving Low-Dose Blood-Brain Barrier Permeability Quantification Using Sparse High-Dose Induced Prior for Patlak Model

    PubMed Central

    Fang, Ruogu; Karlsson, Kolbeinn; Chen, Tsuhan; Sanelli, Pina C.

    2014-01-01

    Blood-brain-barrier permeability (BBBP) measurements extracted from perfusion computed tomography (PCT) using the Patlak model can be a valuable indicator to predict hemorrhagic transformation in patients with acute stroke. Unfortunately, standard Patlak-model-based PCT requires excessive radiation exposure, which has raised radiation safety concerns. Minimizing radiation dose is of high value in clinical practice but can degrade image quality because of the severe noise introduced. The purpose of this work is to construct high-quality BBBP maps from low-dose PCT data by using the brain structural similarity between different individuals and the relations between the high- and low-dose maps. The proposed sparse high-dose induced (shd-Patlak) model works by building a high-dose induced prior for the Patlak model with a set of location-adaptive dictionaries, followed by an optimized estimation of the BBBP map with the prior-regularized Patlak model. Evaluation with simulated low-dose clinical brain PCT datasets clearly demonstrates that the shd-Patlak model can achieve more significant gains than the standard Patlak model, with improved visual quality, higher fidelity to the gold standard and more accurate details for clinical analysis. PMID:24200529
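
    The Patlak model underlying these BBBP maps is a linear plot: graphing C_tissue(t)/C_plasma(t) against the normalized integral of the plasma curve yields the permeability as the slope. A minimal sketch with a synthetic, noise-free curve (values are illustrative, not clinical):

```python
import numpy as np

t = np.linspace(1.0, 60.0, 60)          # time points (s), illustrative
c_plasma = np.exp(-t / 40.0) + 0.2      # synthetic arterial input curve
K_true, v0 = 0.01, 0.15                 # permeability (slope) and intercept

# Cumulative integral of the plasma curve (trapezoidal rule)
int_cp = np.concatenate(
    ([0.0], np.cumsum(0.5 * (c_plasma[1:] + c_plasma[:-1]) * np.diff(t)))
)

# Patlak model: C_tissue(t) = K * integral(C_plasma) + v0 * C_plasma(t)
c_tissue = K_true * int_cp + v0 * c_plasma

# Patlak plot: y = C_t/C_p vs x = integral(C_p)/C_p; slope recovers K
x = int_cp / c_plasma
y = c_tissue / c_plasma
K_fit, v_fit = np.polyfit(x, y, 1)
```

    With noisy low-dose data the same fit becomes unstable, which is exactly what the high-dose induced prior is meant to regularize.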

  20. Standardized dirts for testing the efficacy of workplace cleaning products: validation of their workplace relevance.

    PubMed

    Elsner, Peter; Seyfarth, Florian; Sonsmann, Flora; Strunk, Meike; John, Swen-Malte; Diepgen, Thomas; Schliemann, Sibylle

    2013-10-01

    In order to assess the cleaning efficacy of occupational skin cleansers, standardized test dirts mimicking the spectrum of skin soiling at dirty workplaces are necessary. To validate newly developed standardized test dirts (compliant with the EU Cosmetics Directive) for their occupational relevance. In this single-blinded, monocentric questionnaire-based clinical trial, 87 apprentices of three trades (household management; house painting and varnishing; and metal processing) evaluated the cleanability of six standardized test dirts in relation to their workplace dirts. In addition, they judged the similarity of the test dirts to actual dirts encountered in their working environments. Most of the household management participants assessed the hydrophilic model dirt ('mascara'), the lipophilic model dirt ('W/O cream') and a film-forming model dirt ('disperse paint') as best resembling the dirts found at their workplaces. Most of the painters and varnishers judged the filmogenic model dirts ('disperse paint' and 'acrylic paint') as best resembling the dirts found at their workplaces. For the metal workers, the lipophilic and paste-like model dirts were most similar to their workplace dirts. The spectrum of standardized test dirts developed represents well the dirts encountered at various workplaces. The test dirts may be useful in the development and in vivo efficacy testing of occupational skin cleansers. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  1. SU-E-T-223: High-Energy Photon Standard Dosimetry Data: A Quality Assurance Tool.

    PubMed

    Lowenstein, J; Kry, S; Molineu, A; Alvarez, P; Aguirre, J; Summers, P; Followill, D

    2012-06-01

    To describe the Radiological Physics Center's (RPC) extensive standard dosimetry data set determined from on-site audit measurements. Measurements were made during on-site audits of institutions participating in NCI-funded cooperative clinical trials over 44 years, using a 0.6 cc cylindrical ionization chamber placed within the RPC's water tank. Measurements were made on Varian, Siemens, and Elekta/Philips accelerators for 11 different energies from 68 models of accelerators. We have measured percent depth dose, output factors, and off-axis factors for 123 different accelerator model/energy combinations for which we have 5 or more sets of measurements. The RPC analyzed these data and determined the 'standard data' for each model/energy combination. The RPC defines 'standard data' as the mean value of 5 or more sets of dosimetry data, or agreement with published depth dose data (within 2%). The analysis of these standard data indicates that, for modern accelerator models, the dosimetry data for a particular model/energy are within ±2%. The RPC has always found accelerators of the same make/model/energy combination to have the same dosimetric properties in terms of depth dose, field size dependence and off-axis factors. Because of this consistency, the RPC can assign standard data for percent depth dose, average output factors and off-axis factors for a given combination of energy and accelerator make and model. The RPC standard data can be used as a redundant quality assurance tool to help medical physicists gain confidence in their clinical data to within 2%. The next step is for the RPC to provide a way for institutions to submit data to the RPC to determine whether their data agree with the standard data as a redundant check. This work was supported by PHS grant CA10953 awarded by NCI, DHHS. © 2012 American Association of Physicists in Medicine.
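
    The 'standard data' criterion described above, the mean of five or more measurement sets with agreement within ±2%, can be sketched as a simple consistency check; the percent-depth-dose numbers below are invented for illustration:

```python
import numpy as np

def standard_data(measurement_sets, tolerance=0.02):
    """Mean of >= 5 dosimetry measurement sets, plus a per-set flag for
    whether every point agrees with the mean within the given tolerance."""
    sets = np.asarray(measurement_sets, dtype=float)
    if sets.shape[0] < 5:
        raise ValueError("standard data requires 5 or more measurement sets")
    mean = sets.mean(axis=0)
    within = np.all(np.abs(sets - mean) / mean <= tolerance, axis=1)
    return mean, within

# Five hypothetical percent-depth-dose sets at four depths
pdd_sets = [
    [100.0, 86.1, 65.2, 49.8],
    [100.0, 86.4, 65.0, 50.1],
    [100.0, 85.9, 65.5, 49.6],
    [100.0, 86.2, 64.9, 50.0],
    [100.0, 86.0, 65.3, 49.9],
]
mean_pdd, within_2pct = standard_data(pdd_sets)
```

    An institution's submitted set could be compared against `mean_pdd` with the same 2% tolerance as a redundant check.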

  2. Open-Universe Theory for Bayesian Inference, Decision, and Sensing (OUTBIDS)

    DTIC Science & Technology

    2014-01-01

    using a novel dynamic programming algorithm [6]. The second allows for tensor data, in which observations at a given time step exhibit...unlimited. 5 We developed a dynamical tensor model that gives far better estimation and system- identification results than the standard vectorization...inference. Third, unlike prior work that learns different pieces of the model independently, use matching between 3D models and 2D views and/or voting

  3. Analyses in support of risk-informed natural gas vehicle maintenance facility codes and standards :

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ekoto, Isaac W.; Blaylock, Myra L.; LaFleur, Angela Christine

    2014-03-01

    Safety standards development for maintenance facilities of liquid and compressed gas fueled large-scale vehicles is required to ensure proper facility design and operation envelopes. Standards development organizations are utilizing risk-informed concepts to develop natural gas vehicle (NGV) codes and standards so that maintenance facilities meet acceptable risk levels. The present report summarizes Phase I work on existing NGV repair facility code requirements and highlights inconsistencies that need quantitative analysis of their effectiveness. A Hazard and Operability (HAZOP) study was performed to identify key scenarios of interest. Finally, scenario analyses were performed using detailed simulations and modeling to estimate the overpressure hazards from the HAZOP-defined scenarios. The results from Phase I will be used to identify significant risk contributors at NGV maintenance facilities, and are expected to form the basis for follow-on quantitative risk analysis work to address specific code requirements and identify effective accident prevention and mitigation strategies.

  4. New battery model considering thermal transport and partial charge stationary effects in photovoltaic off-grid applications

    NASA Astrophysics Data System (ADS)

    Sanz-Gorrachategui, Iván; Bernal, Carlos; Oyarbide, Estanis; Garayalde, Erik; Aizpuru, Iosu; Canales, Jose María; Bono-Nuez, Antonio

    2018-02-01

    The optimization of the battery pack in an off-grid photovoltaic application must consider the minimum sizing that assures the availability of the system under the worst environmental conditions. Thus, it is necessary to predict the evolution of the state of charge of the battery under incomplete daily charging and discharging processes and fluctuating temperatures over day-night cycles. Much previous development work has modeled the short-term evolution of battery variables. Many works focus on the on-line parameter estimation of available charge, using standard or advanced estimators, but they are not focused on the development of a model with predictive capabilities. Moreover, stable environmental conditions and standard charge-discharge patterns are normally assumed. As actual cycle patterns differ from the manufacturer's tests, batteries fail to perform as expected. This paper proposes a novel methodology to model these issues, with predictive capabilities to estimate the remaining charge in a battery after several solar cycles. A new non-linear state-space model is proposed as a basis, and the methodology to feed and train the model is introduced. The new methodology is validated using experimental data, showing only 5% error at temperatures higher than the nominal one.
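
    The paper's non-linear state-space model is not reproduced here, but the backbone of any such predictor is a discrete state-of-charge update applied over day-night cycles. A much simpler coulomb-counting sketch, with a hypothetical temperature-dependent capacity term, illustrates the structure:

```python
def step_soc(soc, current_a, dt_h, temp_c, q_nom_ah=100.0, eff=0.95):
    """One discrete state update: soc[k+1] = soc[k] + eff*I*dt/Q(T).
    Positive current charges the battery. The linear capacity-vs-temperature
    term is a hypothetical placeholder, not a fitted model."""
    q_eff = q_nom_ah * (1.0 + 0.005 * (temp_c - 25.0))   # illustrative
    soc_next = soc + eff * current_a * dt_h / q_eff
    return min(max(soc_next, 0.0), 1.0)                  # clamp to [0, 1]

# Simulate one incomplete solar day: 6 h of 10 A charge (warm),
# then 12 h of 4 A discharge (cool night)
soc = 0.50
for _ in range(6):
    soc = step_soc(soc, current_a=10.0, dt_h=1.0, temp_c=30.0)
for _ in range(12):
    soc = step_soc(soc, current_a=-4.0, dt_h=1.0, temp_c=15.0)
```

    Iterating such an update over many day-night cycles is what lets a state-space model predict whether a given pack sizing survives the worst-case season.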

  5. Thermodynamic Model Formulations for Inhomogeneous Solids with Application to Non-isothermal Phase Field Modelling

    NASA Astrophysics Data System (ADS)

    Gladkov, Svyatoslav; Kochmann, Julian; Reese, Stefanie; Hütter, Markus; Svendsen, Bob

    2016-04-01

    The purpose of the current work is the comparison of thermodynamic model formulations for chemically and structurally inhomogeneous solids at finite deformation based on "standard" non-equilibrium thermodynamics [SNET: e. g. S. de Groot and P. Mazur, Non-equilibrium Thermodynamics, North Holland, 1962] and the general equation for non-equilibrium reversible-irreversible coupling (GENERIC) [H. C. Öttinger, Beyond Equilibrium Thermodynamics, Wiley Interscience, 2005]. In the process, non-isothermal generalizations of standard isothermal conservative [e. g. J. W. Cahn and J. E. Hilliard, Free energy of a non-uniform system. I. Interfacial energy. J. Chem. Phys. 28 (1958), 258-267] and non-conservative [e. g. S. M. Allen and J. W. Cahn, A macroscopic theory for antiphase boundary motion and its application to antiphase domain coarsening. Acta Metall. 27 (1979), 1085-1095; A. G. Khachaturyan, Theory of Structural Transformations in Solids, Wiley, New York, 1983] diffuse interface or "phase-field" models [e. g. P. C. Hohenberg and B. I. Halperin, Theory of dynamic critical phenomena, Rev. Modern Phys. 49 (1977), 435-479; N. Provatas and K. Elder, Phase Field Methods in Material Science and Engineering, Wiley-VCH, 2010.] for solids are obtained. The current treatment is consistent with, and includes, previous works [e. g. O. Penrose and P. C. Fife, Thermodynamically consistent models of phase-field type for the kinetics of phase transitions, Phys. D 43 (1990), 44-62; O. Penrose and P. C. Fife, On the relation between the standard phase-field model and a "thermodynamically consistent" phase-field model. Phys. D 69 (1993), 107-113] on non-isothermal systems as a special case. In the context of no-flux boundary conditions, the SNET- and GENERIC-based approaches are shown to be completely consistent with each other and result in equivalent temperature evolution relations.

  6. Precursor model and preschool science learning about shadows formation

    NASA Astrophysics Data System (ADS)

    Delserieys, Alice; Jégou, Corinne; Boilevin, Jean-Marie; Ravanis, Konstantinos

    2018-04-01

    This work is based on the idea that young children benefit from early introduction of scientific concepts. Few studies describe didactical strategies focusing on physics understanding for young children and analyse their effectiveness in standard classroom environments.

  7. On modeling pressure diffusion in non-homogeneous shear flows

    NASA Technical Reports Server (NTRS)

    Demuren, A. O.; Rogers, M. M.; Durbin, P.; Lele, S. K.

    1996-01-01

    New models are proposed for the 'slow' and 'rapid' parts of the pressure diffusive transport, based on the examination of DNS databases for plane mixing layers and wakes. The model for the 'slow' part is non-local, but requires the distribution of the triple-velocity correlation as a local source. The latter can be computed accurately for the normal component from standard gradient diffusion models, but such models are inadequate for the cross component. More work is required to remedy this situation.

  8. AutoMOPS- B2B and B2C in mask making: Mask manufacturing performance and customer satisfaction improvement through better information flow management using generic models and standardized languages

    NASA Astrophysics Data System (ADS)

    Filies, Olaf; de Ridder, Luc; Rodriguez, Ben; Kujiken, Aart

    2002-03-01

    Semiconductor manufacturing has become a global business in which companies of different sizes unite in virtual enterprises to meet new opportunities. Mask manufacturing is therefore a key business, but mask ordering is a complex process and is always critical with regard to design-to-market time, even though mask complexity and the customer base are increasing. A wide variety of different mask order forms are used, which are frequently faulty and very seldom complete. This effectively blocks agile manufacturing and can tie wafer fabs to a single mask supplier. The goal of the project is the elimination of order verification through paperless, electronically linked information sharing and exchange between the chip design, mask production and production stages, which will allow automation of mask preparation. To cover these new techniques and their specifications, as well as the common ones, with automated tools, a special generic meta-model will be generated based on the current standards for mask specifications. Including the requirements from the involved partners (Alcatel Microelectronics, Altis, Compugraphics, Infineon, Nimble, Sigma-C), the project works out a pre-normative standard. The paper presents the current status of the work. This work is partly funded by the Commission of the European Union under the Fifth Framework project IST-1999-10332 AutoMOPS.

  9. Modeling the irradiance and temperature dependence of photovoltaic modules in PVsyst

    DOE PAGES

    Sauer, Kenneth J.; Roessler, Thomas; Hansen, Clifford W.

    2014-11-10

    In order to reliably simulate the energy yield of photovoltaic (PV) systems, it is necessary to have an accurate model of how the PV modules perform with respect to irradiance and cell temperature. Building on previous work that addresses the irradiance dependence, two approaches to fit the temperature dependence of module power in PVsyst have been developed and are applied here to recent multi-irradiance and multi-temperature data for a standard Yingli Solar PV module type. The results demonstrate that it is possible to match the measured irradiance and temperature dependence of PV modules in PVsyst. As a result, improvements in energy yield prediction using the optimized models relative to the PVsyst standard model are considered significant for decisions about project financing.
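
    A common single-point form of the irradiance and temperature dependence being fitted here scales maximum power linearly with irradiance and applies a relative temperature coefficient. The sketch below uses that generic form with illustrative parameter values; it is not the PVsyst model itself:

```python
def module_power(g_poa, t_cell, p_stc=250.0, gamma=-0.0040,
                 g_stc=1000.0, t_stc=25.0):
    """P = P_stc * (G/G_stc) * (1 + gamma * (T_cell - T_stc)).
    gamma is the relative power temperature coefficient (1/degC);
    the parameter values are illustrative, not a datasheet."""
    return p_stc * (g_poa / g_stc) * (1.0 + gamma * (t_cell - t_stc))

p_hot = module_power(g_poa=800.0, t_cell=50.0)   # reduced output when hot
p_stc = module_power(g_poa=1000.0, t_cell=25.0)  # rated power at STC
```

    Fitting gamma (and any irradiance-dependent efficiency terms) against multi-irradiance, multi-temperature flash data is what aligns a simulation model with measured module behavior.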

  10. Search for a Heavy Neutrino and Right-Handed W of the Left-Right Symmetric Model with the CMS Detector

    NASA Astrophysics Data System (ADS)

    Tlisov, Danila

    2013-11-01

    This work describes the first search for signals from the production of right-handed WR bosons and heavy neutrinos Nℓ (ℓ = e, μ), that arise naturally in the left-right symmetric extension to the Standard Model, with the CMS Experiment at the LHC using the 7 TeV pp collision data collected in 2010 and 2011 corresponding to an integrated luminosity of 240 pb-1. No excess over expectations from Standard Model processes is observed. For models with exact left-right symmetry (the same coupling in the left and right sectors) we exclude the region in the two-dimensional parameter space that extends to (MWR, MNℓ) = (1700 GeV, 600 GeV).

  11. On a radiative origin of the Standard Model from trinification

    NASA Astrophysics Data System (ADS)

    Camargo-Molina, José Eliel; Morais, António P.; Pasechnik, Roman; Wessén, Jonas

    2016-09-01

    In this work, we present a trinification-based grand unified theory incorporating a global SU(3) family symmetry that, after spontaneous symmetry breaking, leads to a left-right symmetric model. Already at the classical level, this model can accommodate the matter content and the quark Cabibbo mixing of the Standard Model (SM) with only one Yukawa coupling at the unification scale. Considering the minimal low-energy scenario with the least amount of light states, we show that the resulting effective theory enables dynamical breaking of its gauge group down to that of the SM by means of radiative corrections accounted for by the renormalisation group evolution at one loop. This result paves the way for a consistent explanation of the SM breaking scale and fermion mass hierarchies.

  12. Work and power analysis of the golf swing.

    PubMed

    Nesbit, Steven M; Serrano, Monika

    2005-12-01

    A work and power (energy) analysis of the golf swing is presented as a method for evaluating the mechanics of the golf swing. Two computer models were used to estimate the energy production, transfers, and conversions within the body and the golf club by employing standard methods of mechanics to calculate work of forces and torques, kinetic energies, strain energies, and power during the golf swing. A detailed model of the golf club determined the energy transfers and conversions within the club during the downswing. A full-body computer model of the golfer determined the internal work produced at the body joints during the downswing. Four diverse amateur subjects were analyzed and compared using these two models. The energy approach yielded new information on swing mechanics, determined the force and torque components that accelerated the club, illustrated which segments of the body produced work, determined the timing of internal work generation, measured swing efficiencies, calculated shaft energy storage and release, and proved that forces and range of motion were equally important in developing club head velocity. A more comprehensive description of the downswing emerged from information derived from an energy based analysis. Key points: full-body model of the golf swing; energy analysis of the golf swing; work of the body joints during the golf swing; comparisons of subject work and power characteristics.
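
    The "work of forces and torques" computed by these models rests on standard mechanics: joint power is torque times angular velocity, work is its time integral, and the total must match the change in kinetic energy. A toy constant-torque example with invented numbers:

```python
import numpy as np

# A constant torque tau spins a club-like body of moment of inertia I
# from rest; omega(t) = tau*t/I, power P = tau*omega, work W = integral(P dt).
tau, inertia = 40.0, 0.5            # N*m and kg*m^2, illustrative values
t = np.linspace(0.0, 0.25, 2501)    # downswing-like time window (s)
omega = tau * t / inertia           # angular velocity (rad/s)
power = tau * omega                 # instantaneous joint power (W)

# Numerical work via the trapezoidal rule
work = float(np.sum(0.5 * (power[1:] + power[:-1]) * np.diff(t)))

# Work-energy theorem check: W = 1/2 * I * omega_final^2
ke_final = 0.5 * inertia * omega[-1] ** 2
```

    In the full-body models the same bookkeeping is done per joint with measured torques and angular velocities, which is what exposes the timing of internal work generation and the swing efficiency.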

  13. Work and Power Analysis of the Golf Swing

    PubMed Central

    Nesbit, Steven M.; Serrano, Monika

    2005-01-01

    A work and power (energy) analysis of the golf swing is presented as a method for evaluating the mechanics of the golf swing. Two computer models were used to estimate the energy production, transfers, and conversions within the body and the golf club by employing standard methods of mechanics to calculate work of forces and torques, kinetic energies, strain energies, and power during the golf swing. A detailed model of the golf club determined the energy transfers and conversions within the club during the downswing. A full-body computer model of the golfer determined the internal work produced at the body joints during the downswing. Four diverse amateur subjects were analyzed and compared using these two models. The energy approach yielded new information on swing mechanics, determined the force and torque components that accelerated the club, illustrated which segments of the body produced work, determined the timing of internal work generation, measured swing efficiencies, calculated shaft energy storage and release, and proved that forces and range of motion were equally important in developing club head velocity. A more comprehensive description of the downswing emerged from information derived from an energy based analysis. Key points: full-body model of the golf swing; energy analysis of the golf swing; work of the body joints during the golf swing; comparisons of subject work and power characteristics. PMID:24627666

  14. Data article on the effect of work engagement strategies on faculty staff behavioural outcomes in private universities.

    PubMed

    Falola, Hezekiah Olubusayo; Olokundun, Maxwell Ayodele; Salau, Odunayo Paul; Oludayo, Olumuyiwa Akinrole; Ibidunni, Ayodotun Stephen

    2018-06-01

    The main objective of this study was to present a data article that investigates the effect of work engagement strategies on faculty behavioural outcomes. Few studies analyse how work engagement strategies could help in driving standard work behaviour, particularly in higher institutions. In an attempt to bridge this gap, this study was carried out using a descriptive research method and structural equation modelling (AMOS 22) for the analysis of four hundred and forty-one (441) valid questionnaires, which were completed by faculty members of the six selected private universities in Nigeria using stratified and simple random sampling techniques. A factor model showing high reliability and good fit was generated, while construct validity was established through convergent and discriminant analyses.

  15. Leveraging electronic healthcare record standards and semantic web technologies for the identification of patient cohorts

    PubMed Central

    Fernández-Breis, Jesualdo Tomás; Maldonado, José Alberto; Marcos, Mar; Legaz-García, María del Carmen; Moner, David; Torres-Sospedra, Joaquín; Esteban-Gil, Angel; Martínez-Salvador, Begoña; Robles, Montserrat

    2013-01-01

    Background The secondary use of electronic healthcare records (EHRs) often requires the identification of patient cohorts. In this context, an important problem is the heterogeneity of clinical data sources, which can be overcome with the combined use of standardized information models, virtual health records, and semantic technologies, since each of them contributes to solving aspects related to the semantic interoperability of EHR data. Objective To develop methods allowing for a direct use of EHR data for the identification of patient cohorts leveraging current EHR standards and semantic web technologies. Materials and methods We propose to take advantage of the best features of working with EHR standards and ontologies. Our proposal is based on our previous results and experience working with both technological infrastructures. Our main principle is to perform each activity at the abstraction level with the most appropriate technology available. This means that part of the processing will be performed using archetypes (ie, data level) and the rest using ontologies (ie, knowledge level). Our approach will start working with EHR data in proprietary format, which will be first normalized and elaborated using EHR standards and then transformed into a semantic representation, which will be exploited by automated reasoning. Results We have applied our approach to protocols for colorectal cancer screening. The results comprise the archetypes, ontologies, and datasets developed for the standardization and semantic analysis of EHR data. Anonymized real data have been used and the patients have been successfully classified by the risk of developing colorectal cancer. Conclusions This work provides new insights into how archetypes and ontologies can be effectively combined for EHR-driven phenotyping. The methodological approach can be applied to other problems provided that suitable archetypes, ontologies, and classification rules can be designed. PMID:23934950

  16. Leveraging electronic healthcare record standards and semantic web technologies for the identification of patient cohorts.

    PubMed

    Fernández-Breis, Jesualdo Tomás; Maldonado, José Alberto; Marcos, Mar; Legaz-García, María del Carmen; Moner, David; Torres-Sospedra, Joaquín; Esteban-Gil, Angel; Martínez-Salvador, Begoña; Robles, Montserrat

    2013-12-01

    The secondary use of electronic healthcare records (EHRs) often requires the identification of patient cohorts. In this context, an important problem is the heterogeneity of clinical data sources, which can be overcome with the combined use of standardized information models, virtual health records, and semantic technologies, since each of them contributes to solving aspects related to the semantic interoperability of EHR data. To develop methods allowing for a direct use of EHR data for the identification of patient cohorts leveraging current EHR standards and semantic web technologies. We propose to take advantage of the best features of working with EHR standards and ontologies. Our proposal is based on our previous results and experience working with both technological infrastructures. Our main principle is to perform each activity at the abstraction level with the most appropriate technology available. This means that part of the processing will be performed using archetypes (ie, data level) and the rest using ontologies (ie, knowledge level). Our approach will start working with EHR data in proprietary format, which will be first normalized and elaborated using EHR standards and then transformed into a semantic representation, which will be exploited by automated reasoning. We have applied our approach to protocols for colorectal cancer screening. The results comprise the archetypes, ontologies, and datasets developed for the standardization and semantic analysis of EHR data. Anonymized real data have been used and the patients have been successfully classified by the risk of developing colorectal cancer. This work provides new insights into how archetypes and ontologies can be effectively combined for EHR-driven phenotyping. The methodological approach can be applied to other problems provided that suitable archetypes, ontologies, and classification rules can be designed.
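The final step of such a pipeline, classifying normalized patient records by risk, can be sketched as a simple rule application. This is an illustrative stand-in for the paper's ontology-based reasoning, not its actual method; the field names and rules below are hypothetical:

```python
def classify_crc_risk(patient):
    """Toy colorectal cancer screening classifier over a normalized record.

    patient: dict of already-standardized EHR fields (hypothetical names).
    The rules are illustrative only; a real deployment would encode them
    as ontology axioms evaluated by an automated reasoner.
    """
    if patient.get("adenoma_history") or patient.get("family_crc"):
        return "high"       # personal or family history triggers high risk
    if patient.get("age", 0) >= 50:
        return "average"    # age-based average-risk screening group
    return "low"
```

The advantage of the knowledge-level representation described in the abstract is that such rules live in the ontology rather than in application code, so they can be audited and revised without touching the data pipeline.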

  17. Inflation in the standard cosmological model

    NASA Astrophysics Data System (ADS)

    Uzan, Jean-Philippe

    2015-12-01

    The inflationary paradigm is now part of the standard cosmological model as a description of its primordial phase. While its original motivation was to solve the standard problems of the hot big bang model, it was soon understood that it offers a natural theory for the origin of the large-scale structure of the universe. Most models rely on a slow-rolling scalar field and enjoy very generic predictions. Moreover, all the matter of the universe is produced by the decay of the inflaton field at the end of inflation during a phase of reheating. These predictions can be (and are) tested through their imprint on the large-scale structure and in particular the cosmic microwave background. Inflation stands as a window in physics where both general relativity and quantum field theory are at work and which can be observationally studied. It connects cosmology with high-energy physics. Today most models are constructed within extensions of the standard model, such as supersymmetry or string theory. Inflation also disrupts our vision of the universe, in particular with the ideas of chaotic inflation and eternal inflation that tend to promote the image of a very inhomogeneous universe with fractal structure on a large scale. This idea is also at the heart of further speculations, such as the multiverse. This introduction summarizes the connections between inflation and the hot big bang model and details the basics of its dynamics and predictions.
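The slow-rolling scalar-field dynamics mentioned above is conventionally characterized by the potential slow-roll parameters, which must remain small during inflation:

```latex
\epsilon_V \equiv \frac{M_{\mathrm{Pl}}^2}{2}\left(\frac{V'}{V}\right)^2 \ll 1,
\qquad
\eta_V \equiv M_{\mathrm{Pl}}^2\,\frac{V''}{V} \ll 1 .
```

At lowest order these give the generic prediction for the scalar spectral index, $n_s \simeq 1 - 6\epsilon_V + 2\eta_V$, one of the imprints tested against the cosmic microwave background.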

  18. A revisit to model the Cr i triplet at 5204-5208 Å and the Ba ii D2 line at 4554 Å in the Second Solar Spectrum

    NASA Astrophysics Data System (ADS)

    Smitha, H. N.; Nagendra, K. N.; Stenflo, J. O.; Bianda, M.; Sampoorna, M.; Ramelli, R.

    2015-10-01

    In our previous attempt to model the Stokes profiles of the Cr i triplet at 5204-5208 Å and the Ba ii D2 at 4554 Å, we found it necessary to slightly modify the standard FAL model atmospheres to fit the observed polarization profiles. In the case of Cr i triplet, this modification was done to reduce the theoretical continuum polarization, and in the case of Ba ii D2, it was needed to reproduce the central peak in Q/I. In this work, we revisit both these cases using different standard model atmospheres whose temperature structures closely resemble those of the modified FAL models, and explore the possibility of synthesizing the line profiles without the need for small modifications of the model atmosphere.

  19. Summary of the white paper of DICOM WG24 'DICOM in Surgery'

    NASA Astrophysics Data System (ADS)

    Lemke, Heinz U.

    2007-03-01

    Standards for creating and integrating information about patients, equipment, and procedures are vitally needed when planning for an efficient Operating Room (OR). The DICOM Working Group 24 (WG24) has been established to develop DICOM objects and services related to Image Guided Surgery (IGS). To determine these standards, it is important to define day-to-day, step-by-step surgical workflow practices and create surgical workflow models per procedure or per variable case. A well-defined workflow and a high-fidelity patient model will be the base of activities for both radiation therapy and surgery. Considering the present and future requirements for surgical planning and intervention, such a patient model must be n-dimensional, where n may include the spatial and temporal dimensions as well as a number of functional variables. As the boundaries between radiation therapy, surgery and interventional radiology are becoming less well-defined, precise patient models will become the greatest common denominator for all therapeutic disciplines. In addition to imaging, the focus of WG24 should, therefore, also be to serve the therapeutic disciplines by enabling modelling technology to be based on standards.

  20. The effects of economic deprivation on psychological well-being among the working population of Switzerland

    PubMed Central

    Vetter, Stefan; Endrass, Jerome; Schweizer, Ivo; Teng, Hsun-Mei; Rossler, Wulf; Gallo, William T

    2006-01-01

    Background The association between poverty and mental health has been widely investigated. There is, however, limited evidence of mental health implications of working poverty, despite its representing a rapidly expanding segment of impoverished populations in many developed nations. In this study, we examined whether working poverty in Switzerland, a country with substantial recent growth among the working poor, was correlated with two dependent variables of interest: psychological health and unmet mental health need. Methods This cross-sectional study used data drawn from the first 3 waves (1999–2001) of the Swiss Household Panel, a nationally representative sample of the permanent resident population of Switzerland. The study sample comprised 5453 subjects aged 20–59 years. We used Generalized Estimating Equation models to investigate the association between working poverty and psychological well-being; we applied logistic regression models to analyze the link between working poverty and unmet mental health need. Working poverty was represented by dummy variables indicating financial deficiency, restricted standard of living, or both conditions. Results After controlling for other factors, restricted standard of living was significantly (p < .001) negatively correlated with psychological well-being; it was also associated with approximately 50% increased risk of unmet mental health need (OR = 1.55; 95% CI 1.17 – 2.06). Conclusion The findings of this study contribute to our understanding of the potential psychological impact of material deprivation on working Swiss citizens. Such knowledge may aid in the design of community intervention programs to help reduce the individual and societal burdens of poverty in Switzerland. PMID:16952322
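The reported odds ratio with its confidence interval (OR = 1.55; 95% CI 1.17–2.06) comes from a fitted logistic model, but the arithmetic behind a Wald-type OR interval can be shown from a 2×2 exposure-by-outcome table. A minimal sketch; the counts below are hypothetical, not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table.

    a = exposed with outcome,   b = exposed without outcome,
    c = unexposed with outcome, d = unexposed without outcome.
    """
    or_ = (a * d) / (b * c)
    # standard error of log(OR) is sqrt of summed reciprocal cell counts
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi
```

An interval whose lower bound exceeds 1, as in the study's 1.17, indicates a statistically significant elevation in risk at the 5% level.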

  1. Changing the work environment in intensive care units to achieve patient-focused care: the time has come.

    PubMed

    McCauley, Kathleen; Irwin, Richard S

    2006-11-01

    The American Association of Critical-Care Nurses Standards for Establishing and Sustaining Healthy Work Environments and the American College of Chest Physicians Patient-Focused Care project are complementary initiatives that provide a road map for creating practice environments where interdisciplinary, patient-focused care can thrive. Healthy work environments are so influential that failure to address the issue would result in deleterious effects for every aspect of acute and critical care practice. Skilled communication and true collaboration are crucial for transforming work environments. The American College of Chest Physicians project on patient-focused care was born out of a realization that medicine as currently practiced is too fragmented, too focused on turf battles that hinder communication, and too divorced from a real understanding of what patients expect and need from their healthcare providers. Communication as well as continuity and concordance with the patients' wishes are foundational premises of care that is patient-focused and safe. Some individuals may achieve some level of genuine patient-focused care even when they practice in a toxic work environment because they are gifted communicators who embrace true collaboration. More likely, such efforts will be hit-or-miss at best, and such heroism will be impossible to sustain if the environment is not transformed into a model that reflects standards and initiatives set out by the American Association of Critical-Care Nurses and the American College of Chest Physicians. Other innovative models of care delivery remain unreported. The successes and failures of these models should be shared with the professional community.

  2. 45 CFR 2543.84 - Contract Work Hours and Safety Standards Act.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 45 Public Welfare 4 2010-10-01 2010-10-01 false Contract Work Hours and Safety Standards Act. 2543... laborer on the basis of a standard work week of 40 hours. Work in excess of the standard work week is... pay for all hours worked in excess of 40 hours in the work week. Section 107 of the Act is applicable...
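The regulation's compensation rule, a standard work week of 40 hours with at least one and one-half times the basic rate for all hours beyond it, reduces to a short computation. A sketch with illustrative figures (the hourly rate is hypothetical, not from the regulation):

```python
def weekly_pay(hours, rate):
    """Gross weekly pay under a 40-hour standard work week with
    time-and-a-half for overtime, per the Contract Work Hours and
    Safety Standards Act structure described above."""
    regular = min(hours, 40)
    overtime = max(hours - 40, 0)
    return regular * rate + overtime * rate * 1.5
```

For example, 45 hours at an illustrative $20/hour basic rate yields 40 regular hours plus 5 overtime hours at $30/hour.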

  3. Standardization as an Arena for Open Innovation

    NASA Astrophysics Data System (ADS)

    Grøtnes, Endre

    This paper argues that anticipatory standardization can be viewed as an arena for open innovation and shows this through two cases from mobile telecommunication standardization. One case is the Android initiative by Google and the Open Handset Alliance, while the second case is the general standardization work of the Open Mobile Alliance. The paper shows how anticipatory standardization intentionally uses inbound and outbound streams of research and intellectual property to create new innovations. This is at the heart of the open innovation model. The standardization activities use both pooling of R&D and the distribution of freely available toolkits to create products and architectures that can be utilized by the participants and third parties to leverage their innovation. The paper shows that the technology being standardized needs to have a systemic nature to be part of an open innovation process.

  4. Understanding of flux-limited behaviors of heat transport in nonlinear regime

    NASA Astrophysics Data System (ADS)

    Guo, Yangyu; Jou, David; Wang, Moran

    2016-01-01

    The classical Fourier's law of heat transport breaks down in highly nonequilibrium situations as in nanoscale heat transport, where nonlinear effects become important. The present work is aimed at exploring the flux-limited behaviors based on a categorization of existing nonlinear heat transport models in terms of their theoretical foundations. Different saturation heat fluxes are obtained, whereas the same qualitative variation trend of heat flux versus exerted temperature gradient is obtained in diverse nonlinear models. The phonon hydrodynamic model is proposed to act as a standard to evaluate other heat flux limiters because of its more rigorous physical foundation. Deeper knowledge is thus achieved of the phenomenological generalized heat transport models. The present work provides deeper understanding and accurate modeling of nonlocal and nonlinear heat transport beyond the diffusive limit.

  5. Stellar Structure Models of Deformed Neutron Stars

    NASA Astrophysics Data System (ADS)

    Zubairi, Omair; Wigley, David; Weber, Fridolin

    Traditional stellar structure models of non-rotating neutron stars work under the assumption that these stars are perfect spheres. This assumption of perfect spherical symmetry is not correct if the matter inside neutron stars is described by an anisotropic model for the equation of state. Certain classes of neutron stars such as Magnetars and neutron stars which contain color-superconducting quark matter cores are expected to be deformed making them oblong spheroids. In this work, we investigate the stellar structure of these deformed neutron stars by deriving stellar structure equations in the framework of general relativity. Using a non-isotropic equation of state model, we solve these structure equations numerically in two dimensions. We calculate stellar properties such as masses and radii along with pressure profiles and investigate changes from standard spherical models.
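The spherically symmetric baseline that such deformed-star models generalize is the Tolman-Oppenheimer-Volkoff (TOV) equation of general-relativistic stellar structure, written here for an isotropic pressure:

```latex
\frac{dP}{dr}
  = -\,\frac{G\left[\rho(r)+P(r)/c^{2}\right]\left[m(r)+4\pi r^{3}P(r)/c^{2}\right]}
            {r^{2}\left[1-2Gm(r)/(rc^{2})\right]},
\qquad
\frac{dm}{dr} = 4\pi r^{2}\rho(r).
```

The two-dimensional structure equations solved in this work must reduce to this form in the limit of zero deformation, which provides a useful numerical check on the spheroidal models.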

  6. Creating NDA working standards through high-fidelity spent fuel modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Skutnik, Steven E; Gauld, Ian C; Romano, Catherine E

    2012-01-01

    The Next Generation Safeguards Initiative (NGSI) is developing advanced non-destructive assay (NDA) techniques for spent nuclear fuel assemblies to advance the state-of-the-art in safeguards measurements. These measurements aim beyond the capabilities of existing methods to include the evaluation of plutonium and fissile material inventory, independent of operator declarations. Testing and evaluation of advanced NDA performance will require reference assemblies with well-characterized compositions to serve as working standards against which the NDA methods can be benchmarked and for uncertainty quantification. To support the development of standards for the NGSI spent fuel NDA project, high-fidelity modeling of irradiated fuel assemblies is being performed to characterize fuel compositions and radiation emission data. The assembly depletion simulations apply detailed operating history information and core simulation data as it is available to perform high fidelity axial and pin-by-pin fuel characterization for more than 1600 nuclides. The resulting pin-by-pin isotopic inventories are used to optimize the NDA measurements and provide information necessary to unfold and interpret the measurement data, e.g., passive gamma emitters, neutron emitters, neutron absorbers, and fissile content. A key requirement of this study is the analysis of uncertainties associated with the calculated compositions and signatures for the standard assemblies; uncertainties introduced by the calculation methods, nuclear data, and operating information. An integral part of this assessment involves the application of experimental data from destructive radiochemical assay to assess the uncertainty and bias in computed inventories, the impact of parameters such as assembly burnup gradients and burnable poisons, and the influence of neighboring assemblies on periphery rods.
This paper will present the results of high fidelity assembly depletion modeling and uncertainty analysis from independent calculations performed using SCALE and MCNP. This work is supported by the Next Generation Safeguards Initiative, Office of Nuclear Safeguards and Security, National Nuclear Security Administration.

  7. Consumer Vehicle Choice Model Documentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Changzheng; Greene, David L

    In response to the Fuel Economy and Greenhouse Gas (GHG) emissions standards, automobile manufacturers will need to adopt new technologies to improve the fuel economy of their vehicles and to reduce the overall GHG emissions of their fleets. The U.S. Environmental Protection Agency (EPA) has developed the Optimization Model for reducing GHGs from Automobiles (OMEGA) to estimate the costs and benefits of meeting GHG emission standards through different technology packages. However, the model does not simulate the impact that increased technology costs will have on vehicle sales or on consumer surplus. As the model documentation states, “While OMEGA incorporates functions which generally minimize the cost of meeting a specified carbon dioxide (CO2) target, it is not an economic simulation model which adjusts vehicle sales in response to the cost of the technology added to each vehicle.” Changes in the mix of vehicles sold, caused by the costs and benefits of added fuel economy technologies, could make it easier or more difficult for manufacturers to meet fuel economy and emissions standards, and impacts on consumer surplus could raise the costs or augment the benefits of the standards. Because the OMEGA model does not presently estimate such impacts, the EPA is investigating the feasibility of developing an adjunct to the OMEGA model to make such estimates. This project is an effort to develop and test a candidate model. The project statement of work spells out the key functional requirements for the new model.

  8. Redesigning the Preparation of All Teachers within the Framework of an Integrated Program Model

    ERIC Educational Resources Information Center

    Hardman, Michael L.

    2009-01-01

    It is incumbent on universities to reflect current research on effective teacher preparation and respond to the changing needs of the 21st century. These needs include the knowledge and skills to instruct diverse students; an increasing emphasis on standards and an integrated curriculum model; and the call for all educators to work together to…

  9. Using CONTENT 1.5 to analyze an SIR model for childhood infectious diseases

    NASA Astrophysics Data System (ADS)

    Su, Rui; He, Daihai

    2008-11-01

    In this work, we introduce CONTENT 1.5, a standard software package for the analysis of dynamical systems. A simple model for childhood infectious diseases is used as an example. The detailed steps to obtain the bifurcation structures of the system are given. These bifurcation structures can be used to explain the observed dynamical transitions in measles incidence.
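The SIR model analyzed in CONTENT is a system of three coupled ODEs. A minimal explicit-Euler step is sketched below; seasonal forcing and birth/death terms, which childhood-disease models typically add, are omitted here for clarity, and the parameter values in the usage note are illustrative:

```python
def sir_step(s, i, r, beta, gamma, dt):
    """One explicit-Euler step of the classic SIR equations.

    s, i, r: susceptible, infectious, recovered fractions of the population;
    beta: transmission rate, gamma: recovery rate. s + i + r is conserved.
    """
    ds = -beta * s * i
    di = beta * s * i - gamma * i
    dr = gamma * i
    return s + ds * dt, i + di * dt, r + dr * dt
```

Iterating this map (or handing the equivalent vector field to CONTENT) and varying beta traces out the equilibria whose bifurcations the abstract refers to; an outbreak grows whenever beta * s exceeds gamma.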

  10. Getting Started and Working with Building Information Modeling

    ERIC Educational Resources Information Center

    Smith, Dana K.

    2009-01-01

    This article will assume that one has heard of Building Information Modeling or BIM but has not developed a strategy as to how to get the most out of it. The National BIM Standard (NBIMS) has defined BIM as a digital representation of physical and functional characteristics of a facility. As such, it serves as a shared knowledge resource for…

  11. Rasch Model Parameter Estimation in the Presence of a Nonnormal Latent Trait Using a Nonparametric Bayesian Approach

    ERIC Educational Resources Information Center

    Finch, Holmes; Edwards, Julianne M.

    2016-01-01

    Standard approaches for estimating item response theory (IRT) model parameters generally work under the assumption that the latent trait being measured by a set of items follows the normal distribution. Estimation of IRT parameters in the presence of nonnormal latent traits has been shown to generate biased person and item parameter estimates. A…
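For reference, the Rasch model whose parameters are being estimated gives the probability of a correct response as a logistic function of the difference between person ability and item difficulty. A one-line sketch:

```python
import math

def rasch_prob(theta, b):
    """Rasch model: P(correct) = exp(theta - b) / (1 + exp(theta - b)),
    where theta is person ability and b is item difficulty."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))
```

The normality assumption discussed in the abstract concerns the distribution of theta across persons, not this response function; the nonparametric Bayesian approach relaxes that distributional assumption while keeping the model itself unchanged.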

  12. COSPAR/PRBEM international working group activities report

    NASA Astrophysics Data System (ADS)

    Bourdarie, S.; Blake, B.; Cao, J. B.; Friedel, R.; Miyoshi, Y.; Panasyuk, M.; Underwood, C.

    It is now clear to everybody that the current standard AE8/AP8 models for ionising particle specification in the radiation belts must be updated. But such an objective is quite difficult to reach: as a reminder, developing the AE8/AP8 models in the seventies took ten people working full time for ten years. It is clear that world-wide efforts must be combined, because no individual group has the human resources to produce these new models by itself. Under the COSPAR umbrella, an international group of experts, well distributed around the world, has been created to set up a common framework for everybody involved in this field. The planned activities of the international group of experts are to: define user needs; provide guidelines for a standard file format for ionising-radiation measurements; set up guidelines to process in-situ data on a common basis; decide in which form the new models will have to be; centralise all progress made world-wide to advise the community; and organise world-wide activities as a project to ensure complementarity and greater efficiency between all efforts. Activities of this working group since its creation will be reported, as well as future plans.

  13. Global constraints on vector-like WIMP effective interactions

    DOE PAGES

    Blennow, Mattias; Coloma, Pilar; Fernandez-Martinez, Enrique; ...

    2016-04-07

    In this work we combine information from relic abundance, direct detection, cosmic microwave background, positron fraction, gamma rays, and colliders to explore the existing constraints on couplings between Dark Matter and Standard Model constituents when no underlying model or correlation is assumed. For definiteness, we include independent vector-like effective interactions for each Standard Model fermion. Our results show that low Dark Matter masses below 20 GeV are disfavoured at the 3σ level with respect to higher masses, due to the tension between the relic abundance requirement and upper constraints on the Dark Matter couplings. Lastly, large couplings are typically only allowed in combinations which avoid effective couplings to the nuclei used in direct detection experiments.

  14. The prediction of nonlinear dynamic loads on helicopters from flight variables using artificial neural networks

    NASA Technical Reports Server (NTRS)

    Cook, A. B.; Fuller, C. R.; O'Brien, W. F.; Cabell, R. H.

    1992-01-01

    A method of indirectly monitoring component loads through common flight variables is proposed which requires an accurate model of the underlying nonlinear relationships. An artificial neural network (ANN) model learns relationships through exposure to a database of flight variable records and corresponding load histories from an instrumented military helicopter undergoing standard maneuvers. The ANN model, utilizing eight standard flight variables as inputs, is trained to predict normalized time-varying mean and oscillatory loads on two critical components over a range of seven maneuvers. Both interpolative and extrapolative capabilities are demonstrated with agreement between predicted and measured loads on the order of 90 percent to 95 percent. This work justifies pursuing the ANN method of predicting loads from flight variables.
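The mapping from flight variables to component loads is a standard feedforward network. A pure-Python forward pass for a one-hidden-layer network is sketched below; the original study's architecture and trained weights are not reproduced, so the weight matrices here are placeholders:

```python
import math

def mlp_forward(x, W1, b1, W2, b2):
    """Forward pass of a one-hidden-layer network: tanh hidden, linear output.

    x:  input vector (e.g. the eight flight variables).
    W1: hidden-layer weight rows, b1: hidden biases.
    W2: output-layer weight rows, b2: output biases (e.g. mean and
        oscillatory load channels).
    """
    h = [math.tanh(sum(w * xi for w, xi in zip(row, x)) + b)
         for row, b in zip(W1, b1)]
    return [sum(w * hi for w, hi in zip(row, h)) + b
            for row, b in zip(W2, b2)]
```

Training adjusts W1, b1, W2, b2 against the database of flight-variable records and measured load histories; once trained, only the cheap forward pass above is needed in flight.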

  15. Modeling and Simulation Plans in Support of Low Cost, Size, Weight, and Power Surveillance Systems for Detecting and Tracking Non-Cooperative Aircraft

    NASA Technical Reports Server (NTRS)

    Wu, Gilbert; Santiago, Confesor

    2017-01-01

    RTCA Special Committee (SC) 228 has initiated a second phase for the development of minimum operational performance standards (MOPS) for UAS detect and avoid (DAA) systems. Technologies to enable UAS with less available Size, Weight, and Power (SWaP) will be considered. RTCA SC-228 has established sub-working groups, one of which is focused on aligning modeling and simulation activities across all participating committee members. This briefing will describe NASA's modeling and simulation plans for the development of performance standards for low cost, size, weight, and power (C-SWaP) surveillance systems that detect and track non-cooperative aircraft. The briefing will also describe the simulation platform NASA intends to use to support end-to-end verification and validation for these DAA systems. Lastly, the briefing will highlight the experiment plan for our first simulation study and provide a high-level description of our future flight test plans. This briefing does not contain any results or data.

  16. Poor working conditions and work stress among Canadian sex workers.

    PubMed

    Duff, P; Sou, J; Chapman, J; Dobrer, S; Braschel, M; Goldenberg, S; Shannon, K

    2017-10-01

    While sex work is often considered the world's oldest profession, there remains a dearth of research on work stress among sex workers (SWs) in occupational health epidemiological literature. A better understanding of the drivers of work stress among SWs is needed to inform sex work policy, workplace models and standards. To examine the factors that influence work stress among SWs in Metro Vancouver. Analyses drew from a longitudinal cohort of SWs, known as An Evaluation of Sex Workers' Health Access (AESHA) (2010-14). Using a modified standardized 'work stress' scale, multivariable linear regression with generalized estimating equations was applied to longitudinally examine the factors associated with work stress. In multivariable analysis, poor working conditions were associated with increased work stress and included workplace physical/sexual violence (β = 0.18; 95% confidence interval (CI) 0.06, 0.29), displacement due to police (β = 0.26; 95% CI 0.14, 0.38), and working in public spaces (β = 0.73; 95% CI 0.61, 0.84). Older (β = -0.02; 95% CI -0.03, -0.01) and Indigenous SWs experienced lower work stress (β = -0.25; 95% CI -0.43, -0.08), whereas non-injection (β = 0.32; 95% CI 0.14, 0.49) and injection drug users (β = 0.17; 95% CI 0.03, 0.31) had higher work stress. Vancouver-based SWs' work stress was largely shaped by poor work conditions, such as violence, policing, and lack of safe workspaces. There is a need to move away from criminalized approaches which shape unsafe work conditions and increase work stress for SWs. Policies that promote SWs' access to the same occupational health, safety and human rights standards as workers in other labour sectors are also needed. © The Author 2017. Published by Oxford University Press on behalf of the Society of Occupational Medicine. All rights reserved. For Permissions, please email: journals.permissions@oup.com

  17. Designing testing service at baristand industri Medan’s liquid waste laboratory

    NASA Astrophysics Data System (ADS)

    Kusumawaty, Dewi; Napitupulu, Humala L.; Sembiring, Meilita T.

    2018-03-01

    Baristand Industri Medan is a technical implementation unit under the Industrial Research and Development Agency of the Ministry of Industry. One of the services most often used at Baristand Industri Medan is liquid waste testing. The company's service standard for testing is nine working days. In 2015, 89.66% of liquid waste testing jobs did not meet this standard because many samples accumulated. The purpose of this research is to design an online service that schedules incoming liquid waste samples. The method used is information system design, consisting of model design, output design, input design, database design and technology design. The resulting online liquid waste testing information system consists of three pages: one for the customer, one for the sample recipient, and one for the laboratory. Simulation results with scheduled samples show that the service standard of at most nine working days can be met.

  18. Computational Fluid Dynamics Assessment Associated with Transcatheter Heart Valve Prostheses: A Position Paper of the ISO Working Group.

    PubMed

    Wei, Zhenglun Alan; Sonntag, Simon Johannes; Toma, Milan; Singh-Gryzbon, Shelly; Sun, Wei

    2018-04-19

    The governing international standard for the development of prosthetic heart valves is International Organization for Standardization (ISO) 5840. This standard requires the assessment of the thrombus potential of transcatheter heart valve substitutes using an integrated thrombus evaluation. Besides experimental flow field assessment and ex vivo flow testing, computational fluid dynamics is a critical component of this integrated approach. This position paper is intended to provide and discuss best practices for the setup of a computational model, numerical solving, post-processing, data evaluation and reporting, as it relates to transcatheter heart valve substitutes. This paper is not intended to be a review of current computational technology; instead, it represents the position of the ISO working group consisting of experts from academia and industry with regards to considerations for computational fluid dynamic assessment of transcatheter heart valve substitutes.

  19. Standard Gibbs energy of metabolic reactions: II. Glucose-6-phosphatase reaction and ATP hydrolysis.

    PubMed

    Meurer, Florian; Do, Hoang Tam; Sadowski, Gabriele; Held, Christoph

    2017-04-01

    Hydrolysis of ATP (adenosine triphosphate) is a key reaction of metabolism. Tools from systems biology require standard reaction data in order to predict metabolic pathways accurately. However, literature values for the standard Gibbs energy of ATP hydrolysis are highly uncertain and differ strongly from each other. Further, such data usually neglect the activity coefficients of the reacting agents, so published data of this kind are apparent (condition-dependent) data rather than activity-based standard data. In this work a consistent value for the standard Gibbs energy of ATP hydrolysis was determined. The activity coefficients of the reacting agents were modeled with electrolyte Perturbed-Chain Statistical Associating Fluid Theory (ePC-SAFT). The Gibbs energy of ATP hydrolysis was calculated by combining the standard Gibbs energies of the hexokinase reaction and of glucose-6-phosphate hydrolysis. While the standard Gibbs energy of the hexokinase reaction was taken from previous work, the standard Gibbs energy of the glucose-6-phosphate hydrolysis reaction was determined in this work. For this purpose, reaction equilibrium molalities of the reacting agents were measured at pH 7 and pH 8 at 298.15 K at varying initial reacting agent molalities. The corresponding activity coefficients at experimental equilibrium molalities were predicted with ePC-SAFT, yielding a Gibbs energy of glucose-6-phosphate hydrolysis of -13.72 ± 0.75 kJ·mol⁻¹. Combined with the value for hexokinase, the standard Gibbs energy of ATP hydrolysis was finally found to be -31.55 ± 1.27 kJ·mol⁻¹. For both ATP hydrolysis and glucose-6-phosphate hydrolysis, good agreement with our own and literature values was obtained when influences of pH, temperature, and activity coefficients were explicitly taken into account in order to calculate standard Gibbs energy at pH 7, 298.15 K and standard state. Copyright © 2017 Elsevier B.V. All rights reserved.
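The combination step described above is an application of Hess's law: adding the hexokinase reaction to glucose-6-phosphate hydrolysis yields the overall ATP hydrolysis, so the standard Gibbs energies add and the independent uncertainties combine in quadrature:

```latex
\begin{aligned}
\text{Glc} + \text{ATP} &\longrightarrow \text{G6P} + \text{ADP}
  && \Delta_r G^{\circ}_{\text{hex}} \\
\text{G6P} + \text{H}_2\text{O} &\longrightarrow \text{Glc} + \text{P}_i
  && \Delta_r G^{\circ}_{\text{G6P}} = -13.72 \pm 0.75\ \text{kJ}\cdot\text{mol}^{-1} \\[4pt]
\text{ATP} + \text{H}_2\text{O} &\longrightarrow \text{ADP} + \text{P}_i
  && \Delta_r G^{\circ}_{\text{ATP}}
     = \Delta_r G^{\circ}_{\text{hex}} + \Delta_r G^{\circ}_{\text{G6P}}
     = -31.55 \pm 1.27\ \text{kJ}\cdot\text{mol}^{-1},
\end{aligned}
```

with the combined uncertainty $\sigma_{\text{ATP}} = \sqrt{\sigma_{\text{hex}}^2 + \sigma_{\text{G6P}}^2}$, assuming the two determinations are independent.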

  20. Strategic Deployment of Clinical Models.

    PubMed

    Goossen, William

    2016-01-01

    The selection, implementation, and certification of electronic health records (EHRs) could benefit from the required use of one of the established clinical model approaches. For the lifelong record of data about individuals, issues arise about the permanence and preservation of data during, or even beyond, a lifetime. Current EHRs do not fully adhere to pertinent standards for clinical data, although it has been known for some 20-plus years that standardization of health data is a cornerstone of patient safety, interoperability, data retrieval for various purposes, and the lifelong preservation of such data. This paper briefly introduces these issues and offers a recommendation for future work in this area.

  1. Advances in Geoscience Modeling: Smart Modeling Frameworks, Self-Describing Models and the Role of Standardized Metadata

    NASA Astrophysics Data System (ADS)

    Peckham, Scott

    2016-04-01

    Over the last decade, model coupling frameworks like CSDMS (Community Surface Dynamics Modeling System) and ESMF (Earth System Modeling Framework) have developed mechanisms that make it much easier for modelers to connect heterogeneous sets of process models in a plug-and-play manner to create composite "system models". These mechanisms greatly simplify code reuse, but must simultaneously satisfy many different design criteria. They must be able to mediate or compensate for differences between the process models, such as their different programming languages, computational grids, time-stepping schemes, variable names and variable units. However, they must achieve this interoperability in a way that: (1) is noninvasive, requiring only relatively small and isolated changes to the original source code, (2) does not significantly reduce performance, (3) is not time-consuming or confusing for a model developer to implement, (4) can very easily be updated to accommodate new versions of a given process model and (5) does not shift the burden of providing model interoperability to the model developers. In tackling these design challenges, model framework developers have learned that the best solution is to provide each model with a simple, standardized interface, i.e. a set of standardized functions that make the model: (1) fully controllable by a caller (e.g. a model framework) and (2) self-describing with standardized metadata. Model control functions are separate functions that allow a caller to initialize the model, advance the model's state variables in time and finalize the model. Model description functions allow a caller to retrieve detailed information on the model's input and output variables, its computational grid and its time-stepping scheme. If the caller is a modeling framework, it can use the self-description functions to learn about each process model in a collection to be coupled and then automatically call framework service components (e.g. regridders, time interpolators and unit converters) as necessary to mediate the differences between them so they can work together. This talk will first review two key products of the CSDMS project, namely a standardized model interface called the Basic Model Interface (BMI) and the CSDMS Standard Names. The standard names are used in conjunction with BMI to provide a semantic matching mechanism that allows output variables from one process model or data set to be reliably used as input variables to other process models in a collection. They include not just a standardized naming scheme for model variables, but also a standardized set of terms for describing the attributes and assumptions of a given model. Recent efforts to bring powerful uncertainty analysis and inverse modeling toolkits such as DAKOTA into modeling frameworks will also be described. This talk will conclude with an overview of several related modeling projects that have been funded by NSF's EarthCube initiative, namely the Earth System Bridge, OntoSoft and GeoSemantics projects.
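    The control/description split described above can be sketched in a few lines. The method names below follow the general BMI pattern (initialize/update/finalize plus self-description queries); the toy decaying-storage model itself is hypothetical, not a CSDMS component.

    ```python
    # A minimal BMI-style interface sketch: control functions plus
    # self-description hooks a framework can query generically.
    # The toy model (exponentially decaying storage) is made up.
    class ToyModel:
        def initialize(self, config=None):
            self.time = 0.0
            self.dt = 1.0
            self.state = {"storage": 100.0}

        def update(self):
            # advance the state variables by one time step
            self.state["storage"] *= 0.9
            self.time += self.dt

        def finalize(self):
            self.state = None

        # --- self-description functions ---
        def get_output_var_names(self):
            return ["storage"]

        def get_value(self, name):
            return self.state[name]

        def get_current_time(self):
            return self.time

    # A framework can drive any model exposing this interface without
    # knowing anything about its internals:
    m = ToyModel()
    m.initialize()
    while m.get_current_time() < 3.0:
        m.update()
    print(m.get_value("storage"))
    ```

    The point is that the caller never touches model internals: it only calls the standardized functions, which is what lets a framework couple models written in different languages behind language bindings.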

  2. Steady-State Cycle Deck Launcher Developed for Numerical Propulsion System Simulation

    NASA Technical Reports Server (NTRS)

    VanDrei, Donald E.

    1997-01-01

    One of the objectives of NASA's High Performance Computing and Communications Program's (HPCCP) Numerical Propulsion System Simulation (NPSS) is to reduce the time and cost of generating aerothermal numerical representations of engines, called customer decks. These customer decks, which are delivered to airframe companies by various U.S. engine companies, numerically characterize an engine's performance as defined by the particular U.S. airframe manufacturer. Until recently, all numerical models were provided with a Fortran-compatible interface in compliance with the Society of Automotive Engineers (SAE) document AS681F, and data communication was performed via a standard, labeled common structure in compliance with AS681F. Recently, the SAE committee began to develop a new standard: AS681G. AS681G addresses multiple language requirements for customer decks along with alternative data communication techniques. Along with the SAE committee, the NPSS Steady-State Cycle Deck project team developed a standard Application Program Interface (API) supported by a graphical user interface. This work will result in Aerospace Recommended Practice 4868 (ARP4868). The Steady-State Cycle Deck work was validated against the Energy Efficient Engine customer deck, which is publicly available. The Energy Efficient Engine wrapper was used not only to validate ARP4868 but also to demonstrate how to wrap an existing customer deck. The graphical user interface for the Steady-State Cycle Deck facilitates the use of the new standard and makes it easier to design and analyze a customer deck. This software was developed following I. Jacobson's Object-Oriented Design methodology and is implemented in C++. The AS681G standard will establish a common generic interface for U.S. engine companies and airframe manufacturers. This will lead to more accurate cycle models, quicker model generation, and faster validation leading to specifications. The standard will facilitate cooperative work between industry and NASA. The NPSS Steady-State Cycle Deck team released a batch version of the Steady-State Cycle Deck in March 1996. Version 1.1 was released in June 1996. During fiscal 1997, NPSS accepted enhancements and modifications to the Steady-State Cycle Deck launcher. Consistent with NPSS' commercialization plan, these modifications will be done by a third party that can provide long-term software support.

  3. A comparative study of various inflow boundary conditions and turbulence models for wind turbine wake predictions

    NASA Astrophysics Data System (ADS)

    Tian, Lin-Lin; Zhao, Ning; Song, Yi-Lei; Zhu, Chun-Ling

    2018-05-01

    This work performs a systematic sensitivity analysis of different turbulence models and various inflow boundary conditions in predicting the wake flow behind a horizontal-axis wind turbine represented by an actuator disc (AD). The tested turbulence models are the standard k-𝜀 model and the Reynolds Stress Model (RSM). A single wind turbine immersed both in uniform flow and in modeled atmospheric boundary layer (ABL) flow is studied. Simulation results are validated against field experimental data in terms of wake velocity and turbulence intensity.

  4. Wine Traceability: A Data Model and Prototype in Albanian Context

    PubMed Central

    Vukatana, Kreshnik; Sevrani, Kozeta; Hoxha, Elira

    2016-01-01

    Wine traceability is a critical issue that has gained interest internationally. Quality control programs and schemes are mandatory in many countries, including EU members and the USA. Albania has transformed most of the EU regulations on food into laws. Regarding the wine sector, the obligation of wine producers to keep traceability data is part of the legislation. The analysis of interviews conducted with Albanian winemakers shows that these data are currently recorded only in hard copy. Another fact that emerges from the interviews is that only two producers have implemented the ISO (International Organization for Standardization) standards on food. The purpose of this paper is to develop an agile and automated traceability system based on these standards. We propose a data model and system prototype, described in the second and third sections of this work. The data model is an adaptation of the GS1 (Global Standards One) specifications for a wine supply chain. A key component of the proposed prototype is mobile access to information about the wine through barcode technology. By using this mechanism the consumer gains transparency regarding the quality criteria he or she expects. Another important component of the proposed system is a real-time notification module that works as an alert system when a risk is identified. This can help producers and authorities to rapidly identify a contaminated product, which is important when recalling a product from the market or preventing it from reaching the consumer. PMID:28231105

  5. Wine Traceability: A Data Model and Prototype in Albanian Context.

    PubMed

    Vukatana, Kreshnik; Sevrani, Kozeta; Hoxha, Elira

    2016-02-17

    Wine traceability is a critical issue that has gained interest internationally. Quality control programs and schemes are mandatory in many countries, including EU members and the USA. Albania has transformed most of the EU regulations on food into laws. Regarding the wine sector, the obligation of wine producers to keep traceability data is part of the legislation. The analysis of interviews conducted with Albanian winemakers shows that these data are currently recorded only in hard copy. Another fact that emerges from the interviews is that only two producers have implemented the ISO (International Organization for Standardization) standards on food. The purpose of this paper is to develop an agile and automated traceability system based on these standards. We propose a data model and system prototype, described in the second and third sections of this work. The data model is an adaptation of the GS1 (Global Standards One) specifications for a wine supply chain. A key component of the proposed prototype is mobile access to information about the wine through barcode technology. By using this mechanism the consumer gains transparency regarding the quality criteria he or she expects. Another important component of the proposed system is a real-time notification module that works as an alert system when a risk is identified. This can help producers and authorities to rapidly identify a contaminated product, which is important when recalling a product from the market or preventing it from reaching the consumer.
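    The core of such a data model is a lot-level record keyed by product and batch identifiers, with an append-only event history that a barcode scan can replay. The sketch below is ours: field names, the GTIN value, and the event vocabulary are illustrative, not the paper's schema or the GS1 encoding.

    ```python
    # Hypothetical lot-traceability record in the spirit of GS1-style
    # identification (product GTIN + lot number). All names and values
    # here are illustrative, not taken from the paper.
    from dataclasses import dataclass, field

    @dataclass
    class WineLot:
        gtin: str           # product identifier (made-up value below)
        lot: str            # batch/lot number
        vineyard: str
        harvest_year: int
        events: list = field(default_factory=list)  # chain of custody

    def record_event(lot_record, step, detail):
        lot_record.events.append((step, detail))

    def trace(lot_record):
        """Replay the chain of custody, e.g. after a barcode scan."""
        return [f"{step}: {detail}" for step, detail in lot_record.events]

    lot = WineLot("05391234567894", "L2015-07", "Berat", 2015)
    record_event(lot, "harvest", "grapes picked, parcel B3")
    record_event(lot, "bottling", "bottled 2016-03-10")
    record_event(lot, "alert", "sulphite level recheck requested")

    for line in trace(lot):
        print(line)
    ```

    An "alert" event like the last one is where a real-time notification module would hook in, flagging the affected lot for recall before it reaches the consumer.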

  6. Blood flow and oxygen uptake during exercise

    NASA Technical Reports Server (NTRS)

    Mitchell, J. W.; Stolwijk, J. A. J.; Nadel, E. R.

    1973-01-01

    A model is developed for predicting oxygen uptake, muscle blood flow, and blood chemistry changes under exercise conditions. In this model, the working muscle mass system is analyzed. The conservation-of-matter principle is applied to the oxygen in a unit mass of working muscle under transient exercise conditions and is used to relate the inflow of oxygen carried with the blood to the outflow carried with the blood, the rate of change of oxygen stored in the muscle myoglobin, and the uptake by the muscle. Standard blood chemistry relations are incorporated to evaluate venous levels of oxygen, pH, and carbon dioxide.
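    The oxygen balance described above can be sketched as a single ordinary differential equation: the rate of change of the myoglobin-bound store equals blood delivery (arterial inflow minus venous outflow) minus muscle uptake. All parameter values below are illustrative placeholders, not the paper's data.

    ```python
    # Sketch of the per-unit-mass oxygen balance: d(store)/dt =
    # Q*(Ca - Cv) - VO2, integrated with forward Euler. The numbers
    # are illustrative, not taken from the paper.
    def simulate(q, ca, cv, vo2, store0, dt=0.1, t_end=5.0):
        store, t = store0, 0.0
        history = [store]
        while t < t_end:
            delivery = q * (ca - cv)   # O2 in with arterial, out with venous blood
            store = max(0.0, store + dt * (delivery - vo2))
            t += dt
            history.append(store)
        return history

    # At exercise onset, uptake (vo2) transiently exceeds delivery,
    # so the myoglobin-bound store is drawn down.
    h = simulate(q=0.5, ca=0.20, cv=0.05, vo2=0.12, store0=1.0)
    print(h[0], h[-1])
    ```

    In the transient the store buffers the mismatch between delivery and uptake, which is exactly the role the myoglobin term plays in the mass balance.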

  7. A comparison between standard methods and structural nested modelling when bias from a healthy worker survivor effect is suspected: an iron-ore mining cohort study.

    PubMed

    Björ, Ove; Damber, Lena; Jonsson, Håkan; Nilsson, Tohr

    2015-07-01

    Iron-ore miners are exposed to extremely dusty and physically arduous work environments. The demanding activities of mining select healthier workers with longer work histories (ie, the Healthy Worker Survivor Effect (HWSE)), which can have a reversing effect on the exposure-response association. The objective of this study was to evaluate an iron-ore mining cohort to determine whether the effect of respirable dust was confounded by the presence of an HWSE. When an HWSE exists, standard modelling methods, such as Cox regression analysis, produce biased results. We compared results from g-estimation of accelerated failure-time modelling adjusted for the HWSE with corresponding unadjusted Cox regression modelling results. For all-cause mortality, when adjusting for the HWSE, cumulative exposure to respirable dust was associated with a 6% decrease in life expectancy for those exposed ≥15 years compared with those never exposed. Respirable dust continued to be associated with mortality after censoring outcomes known to be associated with dust when adjusting for the HWSE. In contrast, results based on Cox regression analysis did not support the presence of an association. The adjustment for the HWSE made a difference when estimating the risk of mortality from respirable dust. The results of this study therefore support the recommendation that standard methods of analysis should be complemented with structural modelling techniques, such as g-estimation of accelerated failure-time modelling, to adjust for the HWSE. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
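    As a back-of-envelope reading of the headline result: a 6% decrease in life expectancy for ≥15 exposure-years corresponds to a survival-time ratio of 0.94, and in accelerated failure-time form T = T0·exp(−ψ·years) the implied coefficient ψ is recoverable. This is our illustration of the AFT scale, not the paper's g-estimation procedure.

    ```python
    import math

    # Illustrative only: recover the AFT coefficient psi implied by a
    # survival-time ratio of 0.94 at 15 exposure-years, using the model
    # T = T0 * exp(-psi * years). Not the paper's estimator.
    ratio = 0.94
    years = 15
    psi = -math.log(ratio) / years
    print(f"psi = {psi:.5f} per exposure-year")
    print(f"check: exp(-psi * {years}) = {math.exp(-psi * years):.2f}")
    ```

    The sign convention matters: a positive ψ shrinks survival time with exposure, which is the direction the g-estimation analysis detected and Cox regression missed.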

  8. "Someone's rooting for you": continuity, advocacy and street-level bureaucracy in UK maternal healthcare.

    PubMed

    Finlay, Susanna; Sandall, Jane

    2009-10-01

    Continuity and advocacy are widely held to be important elements in maternal healthcare, yet they are often lacking from the care women receive. In order to understand this disparity, we draw upon interviews and ethnographic observational findings from The One-to-One Caseload Project, a study exploring the impacts of a caseload model of maternity care within an urban National Health Service provider in Britain. Drawing on Lipsky's (1980) and Prottas's (1979) theories of street-level bureaucracy, this paper attempts to understand how midwives, working on the frontline within caseload and standard care models, manage the competing demands of delivering a personalised service within a bureaucratic organisation. The caseload care model serves as a case study for how a client-centred model of working can assist street-level bureaucrats to manage the administrative pressures of public service organisations and provide their clients with a personalised, responsive service. Nevertheless, despite such benefits, client-centred models of working may have unintended consequences for both health carers and healthcare systems.

  9. European approaches to work-related stress: a critical review on risk evaluation.

    PubMed

    Zoni, Silvia; Lucchini, Roberto G

    2012-03-01

    In recent years, various international organizations have raised awareness regarding psychosocial risks and work-related stress. European stakeholders have also taken action on these issues by producing important documents, such as position papers and government regulations, which are reviewed in this article. In particular, four European models that have been developed for the assessment and management of work-related stress are considered here. Although important advances have been made in the understanding of work-related stress, there are still gaps in the translation of this knowledge into effective practice at the enterprise level. There are additional problems regarding the methodology for the evaluation of work-related stress. The European models described in this article are based on holistic, global and participatory approaches, in which the active role and involvement of workers are always emphasized. The limitations of these models lie in a lack of clarity on preventive intervention and, for two of them, a lack of standardized instruments for risk evaluation. The comparison among the European approaches to work-related stress, despite their limitations and socio-cultural differences, offers the possibility of developing a social dialogue that is important in defining a correct and practical methodology for work-stress evaluation and prevention.

  10. New IEEE standard enables data collection for medical applications.

    PubMed

    Kennelly, R J; Wittenber, J

    1994-01-01

    The IEEE has gone to ballot on a "Standard for Medical Device Communications", IEEE P1073. The lower-layer, hardware portions of the standard are expected to be approved by the IEEE Standards Board at their December 11-13, 1994 meeting. Other portions of the standard are in the initial stages of the IEEE ballot process. The intent of the standard is to allow hospitals and other users to interface medical electronic devices to host computer systems in a standard, interchangeable manner. The standard is optimized for acute care environments such as ICUs, operating rooms, and emergency rooms. [1] IEEE general committee and subcommittee work has been ongoing since 1984. Significant amounts of work have been done to discover and meet the needs of the patient care setting. Surveys performed in 1989 identified the following four key user requirements for medical device communications: 1) frequent reconfiguration of the network; 2) "plug and play" operation by users; 3) association of devices with a specific bed and patient; and 4) support for a wide range of hospital computer system topologies. Additionally, the most critical difference in the acute care setting is patient safety, which has an overall effect on the standard. The standard that went to ballot meets these requirements. The standard is based on existing ISO standards. P1073 is compliant with the OSI seven-layer model. P1073 specifies the entire communication stack, from object-oriented software to hospital-unique connectors. The standard will be able to be put forward as a true international standard, much in the way that the IEEE 802.x family of standards (like Ethernet) were presented as draft ISO standards. (ABSTRACT TRUNCATED AT 250 WORDS)

  11. Association of nurse work environment and safety climate on patient mortality: A cross-sectional study.

    PubMed

    Olds, Danielle M; Aiken, Linda H; Cimiotti, Jeannie P; Lake, Eileen T

    2017-09-01

    There are two largely distinct research literatures on the associations of the nurse work environment and of safety climate with patient outcomes. The aim of this study was to determine whether hospital safety climate and the work environment make comparable or distinct contributions to patient mortality. The design was a cross-sectional secondary analysis of linked datasets of Registered Nurse survey responses, adult acute care discharge records, and hospital characteristics, covering acute care hospitals in California, Florida, New Jersey, and Pennsylvania. The sample included 600 hospitals linked to 27,009 nurse survey respondents and 852,974 surgical patients. Nurse survey data included assessments of the nurse work environment and hospital safety climate. The outcome of interest was in-hospital mortality. Data analyses included descriptive statistics and multivariate random-intercept logistic regression. In a fully adjusted model, a one-standard-deviation increase in work environment score was associated with an 8.1% decrease in the odds of mortality (OR 0.919, p<0.001). A one-standard-deviation increase in safety climate score was similarly associated with a 7.7% decrease in the odds of mortality (OR 0.923, p<0.001). However, when work environment and safety climate were modeled together, the effect of the work environment remained significant, while safety climate became a non-significant predictor of mortality odds (OR 0.940, p=0.035 vs. OR 0.971, p=0.316). We found that safety climate perception is not predictive of patient mortality beyond the effect of the nurse work environment. To advance hospital safety and quality and improve patient outcomes, organizational interventions should be directed toward improving nurse work environments. Copyright © 2017 Elsevier Ltd. All rights reserved.
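    The percentage figures quoted above follow directly from the odds ratios: a one-standard-deviation increase with odds ratio OR corresponds to a (1 − OR) × 100 percent decrease in the odds. A quick check:

    ```python
    # Percent decrease in the odds of mortality implied by an odds
    # ratio for a one-standard-deviation predictor increase.
    def pct_decrease(odds_ratio):
        return round((1 - odds_ratio) * 100, 1)

    print(pct_decrease(0.919))  # work environment -> 8.1
    print(pct_decrease(0.923))  # safety climate  -> 7.7
    ```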

  12. Defending Against Advanced Persistent Threats Using Game-Theory

    PubMed Central

    König, Sandra; Schauer, Stefan

    2017-01-01

    Advanced persistent threats (APTs) combine a variety of different attack forms, ranging from social engineering to technical exploits. The diversity and usual stealthiness of APTs turn them into a central problem of contemporary practical system security, since information on attacks, the current system status or the attacker's incentives is often vague, uncertain and in many cases even unavailable. Game theory is a natural approach to model the conflict between the attacker and the defender, and this work investigates a generalized class of matrix games as a risk mitigation tool for advanced persistent threat (APT) defense. Unlike standard game and decision theory, our model is tailored to capture and handle the full uncertainty that is immanent to APTs, such as disagreement among qualitative expert risk assessments, unknown adversarial incentives and uncertainty about the current system state (in terms of how deeply the attacker may already have penetrated the system's protective shells). Practically, game-theoretic APT models can be derived straightforwardly from topological vulnerability analysis, together with risk assessments as they are done in common risk management standards like the ISO 31000 family. Theoretically, these models have different properties than classical game-theoretic models, and the technical solution presented in this work may be of independent interest. PMID:28045922
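    A toy version of the matrix-game view illustrates the idea: the defender mixes between two monitoring configurations, the attacker between two attack vectors, and payoffs are expected damage that the defender minimizes. The damage matrix and the closed-form 2×2 indifference solution below are illustrative textbook material; the paper's distribution-valued games generalize this setting.

    ```python
    # Closed-form mixed equilibrium of a 2x2 zero-sum game with no
    # saddle point: choose the row mix that makes the column player
    # (attacker) indifferent. The damage matrix is made up.
    def solve_2x2(damage):
        (a, b), (c, d) = damage
        # indifference: a*p + c*(1-p) = b*p + d*(1-p)
        p = (d - c) / ((a - c) - (b - d))
        value = a * p + c * (1 - p)
        return p, value

    damage = [[2.0, 5.0],   # defender config 0 vs attack vectors 0/1
              [4.0, 1.0]]   # defender config 1 vs attack vectors 0/1
    p, v = solve_2x2(damage)
    print(f"play config 0 with prob {p}, guaranteed expected damage {v}")
    ```

    Randomizing the defense caps expected damage at 3.0 here, whereas committing to either single configuration lets the attacker inflict up to 5.0, which is the basic argument for mixed (randomized) defense policies against APTs.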

  13. Twenty Years On!: Updating the IEA BESTEST Building Thermal Fabric Test Cases for ASHRAE Standard 140

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Judkoff, R.; Neymark, J.

    2013-07-01

    ANSI/ASHRAE Standard 140, Standard Method of Test for the Evaluation of Building Energy Analysis Computer Programs, applies the IEA BESTEST building thermal fabric test cases and example simulation results originally published in 1995. These software accuracy test cases and their example simulation results, which comprise the first test suite adapted for the initial 2001 version of Standard 140, are approaching their 20th anniversary. In response to the evolution of the state of the art in building thermal fabric modeling since the test cases and example simulation results were developed, work is commencing to update the normative test specification and the informative example results.

  14. Twenty Years On!: Updating the IEA BESTEST Building Thermal Fabric Test Cases for ASHRAE Standard 140: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Judkoff, R.; Neymark, J.

    2013-07-01

    ANSI/ASHRAE Standard 140, Standard Method of Test for the Evaluation of Building Energy Analysis Computer Programs, applies the IEA BESTEST building thermal fabric test cases and example simulation results originally published in 1995. These software accuracy test cases and their example simulation results, which comprise the first test suite adapted for the initial 2001 version of Standard 140, are approaching their 20th anniversary. In response to the evolution of the state of the art in building thermal fabric modeling since the test cases and example simulation results were developed, work is commencing to update the normative test specification and the informative example results.

  15. A Simulation Analysis of Work Based Navy Manpower Requirements

    DTIC Science & Technology

    2012-09-01

    Workweek is also the subject of greater discussion in the 2002 CNA study. At the time, the Navy Standard Workweek was 67 hours of...policies, states that the Navy Standard Workweek includes 8 hours of sleep a day (Navy, 2007). To properly model this while also treating...system during these non-scheduled hours. Figure 4 shows an example of a sailor's schedule as built into Arena. The graph's x-axis represents the

  16. Analytical and Experimental Performance Evaluation of BLE Neighbor Discovery Process Including Non-Idealities of Real Chipsets

    PubMed Central

    Perez-Diaz de Cerio, David; Hernández, Ángela; Valenzuela, Jose Luis; Valdovinos, Antonio

    2017-01-01

    The purpose of this paper is to evaluate from a real perspective the performance of Bluetooth Low Energy (BLE) as a technology that enables fast and reliable discovery of a large number of users/devices in a short period of time. The BLE standard specifies a wide range of configurable parameter values that determine the discovery process and need to be set according to the particular application requirements. Many previous works have investigated the discovery process through analytical and simulation models, according to the ideal specification of the standard. However, measurements show that additional scanning gaps appear in the scanning process, which reduce the discovery capabilities. These gaps have been identified in all of the analyzed devices and correspond to both regular patterns and variable events associated with the decoding process. We have demonstrated that these non-idealities, which are not taken into account in other studies, have a severe impact on the discovery process performance. Extensive performance evaluation for a varying number of devices and feasible parameter combinations has been done by comparing simulations and experimental measurements. This work also includes a simple mathematical model that closely matches both the standard implementation and the different chipset peculiarities for any possible parameter value specified in the standard and for any number of simultaneous advertising devices under scanner coverage. PMID:28273801

  17. Analytical and Experimental Performance Evaluation of BLE Neighbor Discovery Process Including Non-Idealities of Real Chipsets.

    PubMed

    Perez-Diaz de Cerio, David; Hernández, Ángela; Valenzuela, Jose Luis; Valdovinos, Antonio

    2017-03-03

    The purpose of this paper is to evaluate from a real perspective the performance of Bluetooth Low Energy (BLE) as a technology that enables fast and reliable discovery of a large number of users/devices in a short period of time. The BLE standard specifies a wide range of configurable parameter values that determine the discovery process and need to be set according to the particular application requirements. Many previous works have investigated the discovery process through analytical and simulation models, according to the ideal specification of the standard. However, measurements show that additional scanning gaps appear in the scanning process, which reduce the discovery capabilities. These gaps have been identified in all of the analyzed devices and correspond to both regular patterns and variable events associated with the decoding process. We have demonstrated that these non-idealities, which are not taken into account in other studies, have a severe impact on the discovery process performance. Extensive performance evaluation for a varying number of devices and feasible parameter combinations has been done by comparing simulations and experimental measurements. This work also includes a simple mathematical model that closely matches both the standard implementation and the different chipset peculiarities for any possible parameter value specified in the standard and for any number of simultaneous advertising devices under scanner coverage.
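    The idealized discovery process that such analytical models start from can be sketched as a small Monte-Carlo simulation: advertising events recur every advertising interval plus a random 0-10 ms delay, and discovery occurs when an event falls inside a scan window. The parameter values are examples within spec-allowed ranges, and, deliberately, none of the chipset scanning gaps the paper measures are modeled here.

    ```python
    import random

    # Idealized BLE neighbor discovery timing: advertising events every
    # adv_interval plus a random 0-10 ms delay; the scanner listens for
    # scan_window out of every scan_interval. Real chipsets add extra
    # scanning gaps (the paper's point); none are modeled here.
    def discovery_latency(adv_interval, scan_interval, scan_window, rng):
        t = rng.uniform(0, adv_interval)      # first advertising event
        while True:
            phase = t % scan_interval
            if phase < scan_window:           # event lands in a scan window
                return t
            t += adv_interval + rng.uniform(0, 0.010)

    rng = random.Random(1)
    lat = [discovery_latency(adv_interval=0.1, scan_interval=1.28,
                             scan_window=0.512, rng=rng)
           for _ in range(2000)]
    print(f"mean latency ~ {sum(lat) / len(lat):.3f} s")
    ```

    Shrinking scan_window relative to scan_interval (or inserting extra gaps, as real chipsets do) stretches this latency distribution, which is exactly the degradation the measurements in the paper quantify.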

  18. Information Object Definition–based Unified Modeling Language Representation of DICOM Structured Reporting

    PubMed Central

    Tirado-Ramos, Alfredo; Hu, Jingkun; Lee, K.P.

    2002-01-01

    Supplement 23 to DICOM (Digital Imaging and Communications in Medicine), Structured Reporting, is a specification that supports a semantically rich representation of image and waveform content, enabling experts to share image and related patient information. DICOM SR supports the representation of textual and coded data linked to images and waveforms. Nevertheless, the medical information technology community needs models that work as bridges between the DICOM relational model and open object-oriented technologies. The authors assert that representations of the DICOM Structured Reporting standard, using object-oriented modeling languages such as the Unified Modeling Language (UML), can provide a high-level reference view of the semantically rich framework of DICOM and its complex structures. They have produced an object-oriented model to represent the DICOM SR standard and have derived XML-exchangeable representations of this model using World Wide Web Consortium specifications. They expect the model to benefit developers and system architects who are interested in developing applications that are compliant with the DICOM SR specification. PMID:11751804
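    The object-model-to-XML idea can be illustrated with a toy content-item tree: SR documents are essentially trees of coded concepts with values. The element and attribute names below are our own invention for illustration, not the DICOM SR encoding or the authors' schema (though 121070/121071 are real DICOM "Findings"/"Finding" codes).

    ```python
    import xml.etree.ElementTree as ET

    # Toy object model in the spirit of SR content items (a tree of
    # coded concepts, some carrying text values), serialized to XML.
    # Element/attribute names are ours, not DICOM SR's.
    class ContentItem:
        def __init__(self, concept_code, meaning, value=None):
            self.concept_code = concept_code
            self.meaning = meaning
            self.value = value
            self.children = []

        def to_xml(self):
            el = ET.Element("item", code=self.concept_code,
                            meaning=self.meaning)
            if self.value is not None:
                el.text = self.value
            for child in self.children:
                el.append(child.to_xml())
            return el

    root = ContentItem("121070", "Findings")
    root.children.append(
        ContentItem("121071", "Finding", "No acute disease."))
    print(ET.tostring(root.to_xml(), encoding="unicode"))
    ```

    The recursion in to_xml mirrors the container/child structure of SR documents, which is what makes an object-oriented model a natural bridge to XML exchange formats.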

  19. NASA's Geospatial Interoperability Office(GIO)Program

    NASA Technical Reports Server (NTRS)

    Weir, Patricia

    2004-01-01

    NASA produces vast amounts of information about the Earth from satellites, supercomputer models, and other sources. These data are most useful when made easily accessible to NASA researchers and scientists, to NASA's partner Federal Agencies, and to society as a whole. A NASA goal is to apply its data for knowledge gain, decision support and understanding of Earth, and other planetary systems. The NASA Earth Science Enterprise (ESE) Geospatial Interoperability Office (GIO) Program leads the development, promotion and implementation of information technology standards that accelerate and expand the delivery of NASA's Earth system science research through integrated systems solutions. Our overarching goal is to make it easy for decision-makers, scientists and citizens to use NASA's science information. NASA's Federal partners currently participate with NASA and one another in the development and implementation of geospatial standards to ensure the most efficient and effective access to one another's data. Through the GIO, NASA participates with its Federal partners in implementing interoperability standards in support of E-Gov and the associated President's Management Agenda initiatives by collaborating on standards development. Through partnerships with government, private industry, education and communities the GIO works towards enhancing the ESE Applications Division in the area of National Applications and decision support systems. The GIO provides geospatial standards leadership within NASA, represents NASA on the Federal Geographic Data Committee (FGDC) Coordination Working Group, chairs the FGDC's Geospatial Applications and Interoperability Working Group (GAI) and supports development and implementation efforts such as Earth Science Gateway (ESG), Space Time Tool Kit and Web Map Services (WMS) Global Mosaic. The GIO supports NASA in the collection and dissemination of geospatial interoperability standards needs and progress throughout the agency, including areas such as ESE Applications, the SEEDS Working Groups, the Facilities Engineering Division (Code JX) and NASA's Chief Information Offices (CIO). With these agency-level requirements GIO leads, brokers and facilitates efforts to develop, implement, influence and fully participate in standards development internationally, federally and locally. The GIO also represents NASA in the OpenGIS Consortium and ISO TC211. The OGC has made considerable progress in regards to relations with other open standards bodies, namely ISO, W3C and OASIS. ISO TC211 is the Geographic information/Geomatics technical committee that works towards standardization in the field of digital geographic information. The GIO focuses on seamless access to data, applications of data, and enabling technologies furthering the interoperability of distributed data. Through teaming within the Applications Directorate and partnerships with government, private industry, education and communities, GIO works towards the data application goals of NASA, the ESE Applications Directorate, and our Federal partners by managing projects in four categories: Geospatial Standards and Leadership, Geospatial One Stop, Standards Development and Implementation, and National and NASA Activities.

  20. Computational fluid dynamic on the temperature simulation of air preheat effect combustion in propane turbulent flame

    NASA Astrophysics Data System (ADS)

    Elwina; Yunardi; Bindar, Yazid

    2018-04-01

    This paper presents results obtained from the application of the computational fluid dynamics (CFD) code Fluent 6.3 to the modelling of temperature in propane flames with and without air preheat. The study focuses on investigating the effect of air preheat temperature on the temperature of the flame. A standard k-ε model and the eddy dissipation model are utilized to represent the flow field and the combustion of the flame being investigated, respectively. The results of the calculations are compared with experimental data of propane flames taken from the literature. The results of the study show that a combination of the standard k-ε turbulence model and the eddy dissipation model is capable of producing reasonable predictions of temperature, particularly in the axial profiles of all three flames. Both the experimental work and the numerical simulations showed that increasing the temperature of the combustion air significantly increases the flame temperature.
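
    The eddy dissipation model referred to above ties the mean reaction rate to the turbulent mixing time scale k/ε rather than to chemical kinetics. A minimal sketch of the Magnussen-Hjertager rate expression, with illustrative constants and flow values (not the Fluent 6.3 case setup):

```python
def eddy_dissipation_rate(rho, k, eps, Y_fu, Y_ox, Y_pr, s, A=4.0, B=0.5):
    """Magnussen-Hjertager eddy-dissipation mean fuel consumption rate
    (kg fuel / m^3 s): reaction is limited by the turbulent mixing
    rate eps/k and by whichever of fuel, oxidiser or hot products
    is scarcest."""
    return rho * A * (eps / k) * min(Y_fu, Y_ox / s, B * Y_pr / (1.0 + s))

# Propane: C3H8 + 5 O2 -> 3 CO2 + 4 H2O,
# so s = (5 * 32) / 44 ~ 3.64 kg O2 per kg fuel.
s_propane = 5 * 32.0 / 44.0

# Illustrative turbulence and composition values (not the paper's data):
r = eddy_dissipation_rate(rho=1.1, k=2.0, eps=10.0,
                          Y_fu=0.05, Y_ox=0.20, Y_pr=0.75, s=s_propane)
```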

  1. Assessing the effects of pharmacists' perceived organizational support, organizational commitment and turnover intention on provision of medication information at community pharmacies in Lithuania: a structural equation modeling approach.

    PubMed

    Urbonas, Gvidas; Kubilienė, Loreta; Kubilius, Raimondas; Urbonienė, Aušra

    2015-03-01

    As a member of a pharmacy organization, a pharmacist is not only bound to fulfill his/her professional obligations but is also affected by different personal and organizational factors that may influence his/her behavior and, consequently, the quality of the services he/she provides to patients. The main purpose of the research was to test a hypothesized model of the relationships among several organizational variables, and to investigate whether any of these variables affects the provision of medication information at community pharmacies. During the survey, pharmacists working at community pharmacies in Lithuania were asked to express their opinions on the community pharmacies at which they worked and to reflect on their actions when providing information on medicines to their patients. The statistical data were analyzed by applying a structural equation modeling technique to test the hypothesized model of the relationships among the variables of Perceived Organizational Support, Organizational Commitment, Turnover Intention, and Provision of Medication Information. The final model revealed that Organizational Commitment had a positive direct effect on Provision of Medication Information (standardized estimate = 0.27) and a negative direct effect (standardized estimate = -0.66) on Turnover Intention. Organizational Commitment mediated the indirect effects of Perceived Organizational Support on Turnover Intention (standardized estimate = -0.48) and on Provision of Medication Information (standardized estimate = 0.20). Pharmacists' Turnover Intention had no significant effect on Provision of Medication Information. Community pharmacies appear, to some extent, to encourage the provision of medication information. Pharmacists who felt higher levels of support from their organizations also expressed, to a certain extent, higher commitment to their organizations by providing more consistent medication information to patients.
However, the effect of organizational variables on the variable of Provision of Medication Information appeared to be limited.
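
    The mediated effects reported above follow the usual product-of-paths rule for standardized estimates. A minimal sketch, assuming a direct Perceived Organizational Support → Organizational Commitment path of about 0.73 (a value implied by the reported estimates, not stated explicitly in the abstract):

```python
def indirect_effect(a, b):
    """Standardized indirect effect of X on Y through mediator M:
    the product of the X -> M and M -> Y path coefficients."""
    return a * b

pos_to_commitment = 0.73        # assumed; implied by the reported estimates
commitment_to_info = 0.27       # reported: Commitment -> Provision of Information
commitment_to_turnover = -0.66  # reported: Commitment -> Turnover Intention

ind_info = indirect_effect(pos_to_commitment, commitment_to_info)
ind_turnover = indirect_effect(pos_to_commitment, commitment_to_turnover)
# These reproduce the reported indirect estimates of about 0.20 and -0.48.
```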

  2. Ocean Research - Perspectives from an international Ocean Research Coordination Network

    NASA Astrophysics Data System (ADS)

    Pearlman, Jay; Williams, Albert, III

    2013-04-01

    The need for improved coordination in ocean observations is more urgent now given the issues of climate change, sustainable food sources and the increased need for energy. Ocean researchers must work across disciplines to provide policy makers with clear and understandable assessments of the state of the ocean. With advances in technology, not only in observation but also in communication and computer science, we are in a new era where we can answer questions asked over the last 100 years at the time and space scales that are relevant. Programs like GLOBEC moved us forward, but we are still challenged by the disciplinary divide. Interdisciplinary problem solving must be addressed not only by the exchange of data between the many sides, but through levels where questions require day-to-day collaboration. A National Science Foundation-funded Research Coordination Network (RCN) is addressing approaches for improving interdisciplinary research capabilities in the ocean sciences. During the last year, the RCN had a working group for Open Data led by John Orcutt, Peter Pissierssens and Albert Williams III. The team has focused on three areas: 1. Data and Information formats and standards; 2. Data access models (including IPR, business models for open data, data policies,...); 3. Data publishing and data citation. There has been a significant trend toward free and open access to data in the last few years. In 2007, the US announced that Landsat data would be available at no charge. Float data from the US (NDBC), JCOMM and OceanSites offer web-based access. The IODE is developing its Ocean Data Portal giving immediate and free access to ocean data. However, from the aspect of long-term collaborations across communities, this global trend is less robust than might appear at the surface. While there are many standard data formats for data exchange, there is not yet widespread uniformity in their adoption. 
Use of standard data formats can be encouraged in several ways: sponsors of observational science programs can encourage or require standard formats for data storage; scientific journals can require that data in support of publication be deposited in a standard format; and finally, communities of scientists can recognize that observational or model-developed data sets are professional contributions deserving citation. Even with standards for exchange, the availability of data and models can be limited by cultural and policy issues. Investigators on NSF grants are expected to share with other researchers the primary data, samples, physical collections and other supporting materials created under their grants. Broader approaches to data availability are seen in the model of the human genome project: according to the Bermuda Agreement (1996), the funding agencies required that all scientists working on the human genome make the data quickly and openly available. Is this a model for ocean data? This presentation will examine the steps forward in stimulating interdisciplinary research through data exchange and in better addressing the gaps in communication and approaches that are still common across the ocean sciences.

  3. The trials and tribulations of a practitioner-researcher: challenges and lessons learned through testing a feminist-cognitive-relational social work model of practice.

    PubMed

    Dombo, Eileen A; Bass, Ami P

    2014-01-01

    The field of social work currently lacks an evidence-based intervention for practice with adult women who survived childhood sexual abuse. The current interventions, dating from the 1990s, come primarily from psychologists. A quasi-experimental study tested the hypothesis that the Feminist-Cognitive-Relational Social Work Model and Intervention would be more effective than the standard agency intervention in decreasing cognitive distortions and increasing intimacy and relational health. The challenges in carrying out the study in small, non-profit organizations are explored to highlight the difficulties in developing evidence-based interventions. Changes to implementation that resulted from the research findings are discussed.

  4. Assessment of a novel biomechanical fracture model for distal radius fractures

    PubMed Central

    2012-01-01

    Background: Distal radius fractures (DRF) are one of the most common fractures and often need surgical treatment, which has been validated through biomechanical tests. Currently a number of different fracture models are used, none of which resemble the in vivo fracture location. The aim of the study was to develop a new standardized fracture model for DRF (AO-23.A3) and compare its biomechanical behavior to the current gold standard. Methods: Variable angle locking volar plates (ADAPTIVE, Medartis) were mounted on 10 pairs of fresh-frozen radii. The osteotomy location was alternated within each pair (New: 10 mm wedge 8 mm / 12 mm proximal to the dorsal / volar apex of the articular surface; Gold standard: 10 mm wedge 20 mm proximal to the articular surface). Each specimen was tested in cyclic axial compression (increasing load by 100 N per cycle) until failure or −3 mm displacement. Parameters assessed were stiffness, displacement and dissipated work calculated for each cycle, as well as ultimate load. Significance was tested using a linear mixed model and Wald test as well as t-tests. Results: 7 female and 3 male pairs of radii aged 74 ± 9 years were tested. In most cases (7/10), the two groups showed similar mechanical behavior at low loads, with increasing differences at increasing loads. Overall, the novel fracture model showed significantly different biomechanical behavior from the gold standard model (p < 0.001). The average final loads resisted were significantly lower in the novel model (860 N ± 232 N vs. 1250 N ± 341 N; p = 0.001). Conclusion: The novel biomechanical fracture model for DRF more closely mimics the in vivo fracture site and shows significantly different biomechanical behavior with increasing loads when compared to the current gold standard. PMID:23244634

  5. High Performance Programming Using Explicit Shared Memory Model on Cray T3D

    NASA Technical Reports Server (NTRS)

    Simon, Horst D.; Saini, Subhash; Grassi, Charles

    1994-01-01

    The Cray T3D system is the first-phase system in Cray Research, Inc.'s (CRI) three-phase massively parallel processing (MPP) program. This system features a heterogeneous architecture that closely couples DEC's Alpha microprocessors and CRI's parallel-vector technology, i.e., the Cray Y-MP and Cray C90. An overview of the Cray T3D hardware and available programming models is presented. Under the Cray Research adaptive Fortran (CRAFT) model, four programming methods (data parallel, work sharing, message-passing using PVM, and the explicit shared memory model) are available to users. However, at this time the data parallel and work sharing programming models are not available to the user community. The differences between standard PVM and CRI's PVM are highlighted with performance measurements such as latencies and communication bandwidths. We have found that the performance of neither standard PVM nor CRI's PVM exploits the hardware capabilities of the T3D. The reasons for the poor performance of PVM as a native message-passing library are presented. This is illustrated by the performance of the NAS Parallel Benchmarks (NPB) programmed in the explicit shared memory model on the Cray T3D. In general, the performance of standard PVM is about 4 to 5 times less than that obtained by using the explicit shared memory model. This degradation in performance is also seen on the CM-5, where the performance of applications using the native message-passing library CMMD is also about 4 to 5 times less than that of data parallel methods. The issues involved in programming in the explicit shared memory model (such as barriers, synchronization, invalidating the data cache, aligning the data cache, etc.) are discussed. Comparative performance of the NPB using the explicit shared memory programming model on the Cray T3D and other highly parallel systems such as the TMC CM-5, Intel Paragon, Cray C90, IBM SP1, etc. is presented.

  6. On standardization of basic datasets of electronic medical records in traditional Chinese medicine.

    PubMed

    Zhang, Hong; Ni, Wandong; Li, Jing; Jiang, Youlin; Liu, Kunjing; Ma, Zhaohui

    2017-12-24

    Standardization of electronic medical records, enabling resource sharing and information exchange among medical institutions, has become inevitable in view of the ever-increasing volume of medical information. The current research is an effort towards the standardization of the basic dataset of electronic medical records in traditional Chinese medicine. In this work, an outpatient clinical information model and an inpatient clinical information model are created to adequately depict the diagnosis processes and treatment procedures of traditional Chinese medicine. To be backward compatible with the existing dataset standard created for western medicine, the new standard shall be a superset of the existing standard. Thus, the two models are checked against the existing standard in conjunction with 170,000 medical record cases. If a case cannot be covered by the existing standard due to the particularity of Chinese medicine, then either an existing data element is expanded with some Chinese medicine content or a new data element is created. Some dataset subsets are also created to group and record Chinese medicine special diagnoses and treatments such as acupuncture. The outcome of this research is a proposal of standardized traditional Chinese medicine medical record datasets. The proposal has been verified successfully in three medical institutions with hundreds of thousands of medical records. A new dataset standard for traditional Chinese medicine is proposed in this paper. The proposed standard, covering traditional Chinese medicine as well as western medicine, is expected to be approved soon by the authority. A widespread adoption of this proposal will enable traditional Chinese medicine hospitals and institutions to easily exchange information and share resources. Copyright © 2017. Published by Elsevier B.V.
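
    The reconciliation procedure described above (check each case against the existing standard; expand an element or create a new one when coverage fails) can be sketched as follows. All field and element names are hypothetical; the real standard's structure is far richer:

```python
def reconcile(case_fields, standard, tcm_extensions):
    """For each field in a medical-record case, check coverage by the
    existing (western-medicine) dataset standard; if not covered,
    either expand an existing data element with TCM content or
    create a new TCM data element."""
    new_standard = dict(standard)          # superset: backward compatible
    for field in case_fields:
        if field in new_standard:
            continue                       # already covered
        base = tcm_extensions.get(field)   # maps TCM field -> related element
        if base and base in new_standard:
            new_standard[base] = new_standard[base] + [field]  # expand element
        else:
            new_standard[field] = [field]  # create a new data element
    return new_standard

std = {"diagnosis": ["diagnosis"]}
case = {"diagnosis": "...", "pulse_pattern": "wiry", "acupuncture_points": "LI4"}
result = reconcile(case, std, {"pulse_pattern": "diagnosis"})
```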

  7. Toward a Geoscientific Semantic Web Based on How Geoscientists Talk Across Disciplines

    NASA Astrophysics Data System (ADS)

    Peckham, S. D.

    2015-12-01

    Are there terms and scientific concepts from math and science that almost all geoscientists understand? Is there a limited set of terms, patterns and language elements that geoscientists use for efficient, unambiguous communication that could be used to describe the variables that they measure, store in data sets and use as model inputs and outputs? In this talk it will be argued that the answer to both questions is "yes" by drawing attention to many such patterns and then showing how they have been used to create a rich set of naming conventions for variables called the CSDMS Standard Names. Variables, which store numerical quantities associated with specific objects, are the fundamental currency of science. They are the items that are measured and saved in data sets, which may then be read into models. They are the inputs and outputs of models and the items exchanged between coupled models. They also star in the equations that summarize our scientific knowledge. Carefully constructed, unambiguous and unique labels for commonly used variables therefore provide an attractive mechanism for automatic semantic mediation when variables are to be shared between heterogeneous resources. They provide a means to automatically check for semantic equivalence so that variables can be safely shared in resource compositions. A good set of standardized variable names can serve as the hub in a hub-and-spoke solution to semantic mediation, where the "internal vocabularies" of geoscience resources (i.e. data sets and models) are mapped to and from the hub to facilitate interoperability and data sharing. When built from patterns and terms that most geoscientists are already familiar with, these standardized variable names are then "readable" by both humans and machines. 
Despite the importance of variables in scientific work, most of the ontological work in the geosciences is focused at a higher level that supports finding resources (e.g. data sets) rather than describing the contents of those resources. The CSDMS Standard Names have matured continuously since they were first introduced over three years ago. Many recent extensions and applications of them (e.g. different science domains, different projects, new rules, ontological work), as well as their compatibility with the International System of Quantities (ISO 80000), will be discussed.
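
    CSDMS Standard Names pair an object part with a quantity part, joined by a double underscore (e.g. sea_water__temperature), which is what makes them parseable by both humans and machines. A minimal sketch of splitting such a name:

```python
def parse_csdms_name(name):
    """Split a CSDMS Standard Name into its object and quantity parts.
    The double underscore is the separator; single underscores are
    ordinary word separators inside each part."""
    obj, quantity = name.split("__", 1)
    return obj, quantity

obj, qty = parse_csdms_name("sea_water__temperature")
```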

  8. 40 CFR 745.85 - Work practice standards.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 31 2011-07-01 2011-07-01 false Work practice standards. 745.85... Renovation § 745.85 Work practice standards. (a) Standards for renovation activities. Renovations must be... in § 745.90(b). (1) Occupant protection. Firms must post signs clearly defining the work area and...

  9. Representing Hydrologic Models as HydroShare Resources to Facilitate Model Sharing and Collaboration

    NASA Astrophysics Data System (ADS)

    Castronova, A. M.; Goodall, J. L.; Mbewe, P.

    2013-12-01

    The CUAHSI HydroShare project is a collaborative effort that aims to provide software for sharing data and models within the hydrologic science community. One of the early focuses of this work has been establishing metadata standards for describing models and model-related data as HydroShare resources. By leveraging this metadata definition, a prototype extension has been developed to create model resources that can be shared within the community using the HydroShare system. The extension uses a general model metadata definition to create resource objects, and was designed so that model-specific parsing routines can extract and populate metadata fields from model input and output files. The long-term goal is to establish a library of supported models where, for each model, the system has the ability to extract key metadata fields automatically, thereby establishing standardized model metadata that will serve as the foundation for model sharing and collaboration within HydroShare. The Soil and Water Assessment Tool (SWAT) is used to demonstrate this concept through a case study application.
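
    The architecture described above (a general metadata definition plus model-specific parsers that populate it) can be sketched as follows. The class, its field names and the toy SWAT parser are hypothetical illustrations, not the HydroShare schema or the real SWAT file layout:

```python
class ModelResource:
    """Generic model-resource metadata (hypothetical field names,
    loosely following the Dublin Core style HydroShare builds on)."""
    def __init__(self, title, creator, model_program):
        self.title = title
        self.creator = creator
        self.model_program = model_program
        self.extra = {}  # filled in by a model-specific parser

def parse_control_file(text):
    """Toy model-specific parser: scan 'key: value' lines of a
    control file for metadata fields. A real SWAT parser would
    read the positional file.cio layout instead."""
    fields = {}
    for line in text.splitlines():
        if ":" in line:
            key, _, value = line.partition(":")
            fields[key.strip()] = value.strip()
    return fields

meta = ModelResource("Demo watershed", "A. Hydrologist", "SWAT")
meta.extra = parse_control_file("simulation_years: 10\nwarmup_years: 2")
```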

  10. Modeling Behavior of Students in E-Learning Courses on the Basis of Use Interactive Animations

    ERIC Educational Resources Information Center

    Magdin, Martin; Turcáni, Milan

    2016-01-01

    In their contribution, the authors deal with modeling the behavior of users in an e-learning course based on the use of interactive animations. Nowadays, e-learning courses form a standard part of the educational process. However, it is not so easy to determine the way students work with the study material, whether they make use of it in order to increase didactic…

  11. Animal models for microbicide studies.

    PubMed

    Veazey, Ronald S; Shattock, Robin J; Klasse, Per Johan; Moore, John P

    2012-01-01

    There have been encouraging recent successes in the development of safe and effective topical microbicides to prevent vaginal or rectal HIV-1 transmission, based on the use of anti-retroviral drugs. However, much work remains to be accomplished before a microbicide becomes a standard element of prevention science strategies. Animal models should continue to play an important role in pre-clinical testing, with emphasis on safety, pharmacokinetic and efficacy testing.

  12. Family Decisionmaking Over the Life Cycle: Some Implications for Estimating the Effects of Income Maintenance Programs.

    ERIC Educational Resources Information Center

    Smith, James P.

    The standard one-period labor supply model that economists have used is in some ways an inadequate tool to evaluate a Family Assistance Plan (FAP). The principal difficulty is that an FAP will have important interperiod or life cycle effects. The pure life cycle model, an extension of the work of Becker and Ghez, is derived here without reference…

  13. The Enabler: A concept for a lunar work vehicle

    NASA Technical Reports Server (NTRS)

    Brazell, James W.; Campbell, Craig; Kaser, Ken; Austin, James A.; Beard, Clark; Ceniza, Glenn; Hamby, Thomas; Robinson, Anne; Wooters, Dana

    1992-01-01

    The Enabler is an earthbound prototype designed to model an actual lunar work vehicle and is able to perform many of the tasks that might be expected of a lunar work vehicle. The vehicle will be constructed entirely from parts made by students and from standard stock parts. The design utilizes only four distinct chassis pieces and sixteen moving parts. The Enabler has non-orthogonal articulating joints that give the vehicle a wide range of mobility and reduce the total number of parts. Composite wheels provide the primary suspension system for the vehicle.

  14. Constraining the top-Higgs sector of the standard model effective field theory

    NASA Astrophysics Data System (ADS)

    Cirigliano, V.; Dekens, W.; de Vries, J.; Mereghetti, E.

    2016-08-01

    Working in the framework of the Standard Model effective field theory, we study chirality-flipping couplings of the top quark to Higgs and gauge bosons. We discuss in detail the renormalization-group evolution to lower energies and investigate direct and indirect contributions to high- and low-energy CP-conserving and CP-violating observables. Our analysis includes constraints from collider observables, precision electroweak tests, flavor physics, and electric dipole moments. We find that indirect probes are competitive or dominant for both CP-even and CP-odd observables, even after accounting for uncertainties associated with hadronic and nuclear matrix elements, illustrating the importance of including operator mixing in constraining the Standard Model effective field theory. We also study scenarios where multiple anomalous top couplings are generated at the high scale, showing that while the bounds on individual couplings relax, strong correlations among couplings survive. Finally, we find that enforcing minimal flavor violation does not significantly affect the bounds on the top couplings.

  15. [Three dimensional mathematical model of tooth for finite element analysis].

    PubMed

    Puskar, Tatjana; Vasiljević, Darko; Marković, Dubravka; Jevremović, Danimir; Pantelić, Dejan; Savić-Sević, Svetlana; Murić, Branka

    2010-01-01

    The mathematical model of the abutment tooth is the starting point of the finite element analysis of stress and deformation of dental structures. The simplest and easiest way is to form a model according to literature data on the dimensions and morphological characteristics of teeth. Our method is based on forming 3D models using standard geometrical forms (objects) in programs for solid modeling. The aim was to form the mathematical model of the abutment of the second upper premolar for finite element analysis of stress and deformation of dental structures. The abutment tooth has the form of a complex geometric object, which makes it suitable for modeling in the solid modeling program SolidWorks. After analysing the literature data about the morphological characteristics of teeth, we started the modeling by dividing the tooth (a complex geometric body) into simple geometric bodies (cylinder, cone, pyramid,...). By connecting simple geometric bodies together or subtracting bodies from the basic body, we formed the complex geometric body, the tooth. The model is then transferred into Abaqus, a computational program for finite element analysis. Transferring the data was done via ACIS SAT, a standard file format for transferring 3D models. Using the solid modeling program SolidWorks, we developed three models of the abutment of the second maxillary premolar: the model of the intact abutment, the model of the endodontically treated tooth with two remaining cavity walls, and the model of the endodontically treated tooth with two remaining walls and an inserted post. Mathematical models of the abutment made according to the literature data are very similar to the real abutment, and the simplifications are minimal. These models enable calculations of stress and deformation of the dental structures. The finite element analysis provides useful information in understanding biomechanical problems and gives guidance for clinical research.
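
    The modeling strategy described above is constructive solid geometry: a complex body built by uniting simple bodies and subtracting bodies from a base body. The actual work was done with B-rep solids in SolidWorks; as an illustrative stand-in, the same boolean operations can be sketched on voxel masks:

```python
import numpy as np

def sphere(center, r, shape=(40, 40, 40)):
    """Boolean voxel mask of a sphere: a stand-in for one of the
    simple geometric bodies (cylinder, cone, pyramid, ...)."""
    z, y, x = np.indices(shape)
    cz, cy, cx = center
    return (x - cx) ** 2 + (y - cy) ** 2 + (z - cz) ** 2 <= r ** 2

crown = sphere((20, 20, 20), 12)   # base body
root = sphere((20, 20, 28), 6)     # second simple body
pulp = sphere((20, 20, 20), 5)     # body to hollow out

united = crown | root              # connect two bodies together
cavity = crown & ~pulp             # subtract a body from the base body
```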

  16. A UML profile for the OBO relation ontology.

    PubMed

    Guardia, Gabriela D A; Vêncio, Ricardo Z N; de Farias, Cléver R G

    2012-01-01

    Ontologies have increasingly been used in the biomedical domain, which has prompted the emergence of different initiatives to facilitate their development and integration. The Open Biological and Biomedical Ontologies (OBO) Foundry consortium provides a repository of life-science ontologies, which are developed according to a set of shared principles. This consortium has developed an ontology called OBO Relation Ontology aiming at standardizing the different types of biological entity classes and associated relationships. Since ontologies are primarily intended to be used by humans, the use of graphical notations for ontology development facilitates the capture, comprehension and communication of knowledge between its users. However, OBO Foundry ontologies are captured and represented basically using text-based notations. The Unified Modeling Language (UML) provides a standard and widely-used graphical notation for modeling computer systems. UML provides a well-defined set of modeling elements, which can be extended using a built-in extension mechanism named Profile. Thus, this work aims at developing a UML profile for the OBO Relation Ontology to provide a domain-specific set of modeling elements that can be used to create standard UML-based ontologies in the biomedical domain.

  17. The Educator's "Action" Office.

    ERIC Educational Resources Information Center

    Martin, Dikran J.

    Design criteria, standards, and human factors related to designing and planning flexible and efficient work environments for college faculty members are overviewed with six model (example) office plans included. The physical and psychological design needs in such an office facility are given, with task performance data on student faculty…

  18. Galaxy formation through hierarchical clustering

    NASA Astrophysics Data System (ADS)

    White, Simon D. M.; Frenk, Carlos S.

    1991-09-01

    Analytic methods for studying the formation of galaxies by gas condensation within massive dark halos are presented. The present scheme applies to cosmogonies where structure grows through hierarchical clustering of a mixture of gas and dissipationless dark matter. The simplest models consistent with the current understanding of N-body work on dissipationless clustering, and of numerical and analytic work on gas evolution and cooling, are adopted. Standard models for the evolution of the stellar population are also employed, and new models for the way star formation heats and enriches the surrounding gas are constructed. Detailed results are presented for a cold dark matter universe with Omega = 1 and H(0) = 50 km/s/Mpc, but the present methods are applicable to other models. The present luminosity functions contain significantly more faint galaxies than are observed.

  19. Neutral kaon mixing beyond the Standard Model with nf = 2+1 chiral fermions. Part 2: non-perturbative renormalisation of the ΔF = 2 four-quark operators

    NASA Astrophysics Data System (ADS)

    Boyle, Peter A.; Garron, Nicolas; Hudspith, Renwick J.; Lehner, Christoph; Lytle, Andrew T.

    2017-10-01

    We compute the renormalisation factors (Z-matrices) of the ΔF = 2 four-quark operators needed for Beyond the Standard Model (BSM) kaon mixing. We work with nf = 2+1 flavours of Domain-Wall fermions whose chiral-flavour properties are essential to maintain a continuum-like mixing pattern. We introduce new RI-SMOM renormalisation schemes, which we argue are better behaved than the corresponding, commonly used RI-MOM scheme. We find that, once converted to \overline{MS}, the Z-factors computed through these RI-SMOM schemes are in good agreement but differ significantly from the ones computed through the RI-MOM scheme. The RI-SMOM Z-factors presented here have been used to compute the BSM neutral kaon mixing matrix elements in the companion paper [1]. We argue that the renormalisation procedure is responsible for the discrepancies observed by different collaborations, and we investigate and elucidate the origin of these differences throughout this work.

  20. Deep space network software cost estimation model

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1981-01-01

    A parametric software cost estimation model prepared for Deep Space Network (DSN) Data Systems implementation tasks is presented. The resource estimation model incorporates principles and data from a number of existing models. The model calibrates task magnitude and difficulty, development environment, and software technology effects through prompted responses to a set of approximately 50 questions. Parameters in the model are adjusted to fit DSN software life cycle statistics. The estimation model output scales a standard DSN Work Breakdown Structure skeleton, which is then input into a PERT/CPM system, producing a detailed schedule and resource budget for the project being planned.
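
    Parametric models of this kind typically combine a power law in estimated size with a product of multipliers derived from the questionnaire answers. A minimal sketch in that spirit; the constants and multipliers below are illustrative, not the DSN model's calibration:

```python
def estimated_effort(ksloc, multipliers, a=2.8, b=1.1):
    """Person-months as a power law in size (thousands of source
    lines) scaled by a product of environment/technology multipliers,
    in the spirit of COCOMO-style parametric models."""
    effort = a * ksloc ** b
    for m in multipliers:
        effort *= m
    return effort

# e.g. high task difficulty (1.15) partly offset by good tooling (0.9):
pm_small = estimated_effort(20.0, [1.15, 0.9])
pm_large = estimated_effort(40.0, [1.15, 0.9])
```

    Because the size exponent b exceeds 1, doubling the estimated size more than doubles the estimated effort, reflecting the diseconomies of scale such models assume.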

  1. Individual music therapy for depression: randomised controlled trial.

    PubMed

    Erkkilä, Jaakko; Punkanen, Marko; Fachner, Jörg; Ala-Ruona, Esa; Pöntiö, Inga; Tervaniemi, Mari; Vanhala, Mauno; Gold, Christian

    2011-08-01

    Music therapy has previously been found to be effective in the treatment of depression, but the studies have been methodologically insufficient and lacking in clarity about the clinical model employed. Aims: To determine the efficacy of music therapy added to standard care compared with standard care only in the treatment of depression among working-age people. Participants (n = 79) with an ICD-10 diagnosis of depression were randomised to receive individual music therapy plus standard care (20 bi-weekly sessions) or standard care only, and followed up at baseline, at 3 months (after the intervention) and at 6 months. Clinical measures included depression, anxiety, general functioning, quality of life and alexithymia. Trial registration: ISRCTN84185937. Participants receiving music therapy plus standard care showed greater improvement than those receiving standard care only in depression symptoms (mean difference 4.65, 95% CI 0.59 to 8.70), anxiety symptoms (1.82, 95% CI 0.09 to 3.55) and general functioning (-4.58, 95% CI -8.93 to -0.24) at 3-month follow-up. The response rate was significantly higher for the music therapy plus standard care group than for the standard care only group (odds ratio 2.96, 95% CI 1.01 to 9.02). Individual music therapy combined with standard care is effective for depression among working-age people. The results of this study, along with previous research, indicate that music therapy with its specific qualities is a valuable enhancement to established treatment practices.

  2. Combining dictionary techniques with extensible markup language (XML)--requirements to a new approach towards flexible and standardized documentation.

    PubMed Central

    Altmann, U.; Tafazzoli, A. G.; Noelle, G.; Huybrechts, T.; Schweiger, R.; Wächter, W.; Dudeck, J. W.

    1999-01-01

    In oncology, various international and national standards exist for the documentation of different aspects of a disease. Since elements of these standards are repeated in different contexts, a common data dictionary could support consistent representation in any context. For the construction of such a dictionary, existing documents have to be worked up in a complex procedure that considers aspects of hierarchical decomposition of documents and of domain control, as well as aspects of user presentation and of the underlying model of patient data. In contrast to other thesauri, text chunks like definitions or explanations are very important and have to be preserved, since oncologic documentation often means coding and classification on an aggregate level, and the safe use of coding systems is an important precondition for the comparability of data. This paper discusses the potential of the use of XML in combination with a dictionary for the promotion and development of standard-conformant applications for tumor documentation. PMID:10566311
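
    The combination described above (a dictionary entry whose definition text chunk is preserved alongside a controlled code list, serialized as XML) can be sketched as follows. The element and attribute names are hypothetical, not those of any published tumor-documentation standard:

```python
import xml.etree.ElementTree as ET

def dictionary_entry(name, definition, codes):
    """Build a data-dictionary entry as XML: the free-text definition
    is kept as a text chunk next to the controlled code list, since
    safe coding depends on having the explanation at hand."""
    item = ET.Element("dataElement", name=name)
    ET.SubElement(item, "definition").text = definition
    domain = ET.SubElement(item, "domain")
    for code, label in codes:
        ET.SubElement(domain, "code", value=code).text = label
    return item

entry = dictionary_entry(
    "tumourGrading",
    "Degree of differentiation of tumour tissue (G1-G4).",
    [("G1", "well differentiated"), ("G4", "undifferentiated")],
)
xml_text = ET.tostring(entry, encoding="unicode")
```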

  3. Person-centered work environments, psychological safety, and positive affect in healthcare: a theoretical framework.

    PubMed

    Rathert, Cheryl; May, Douglas R

    2008-01-01

    We propose that in order to systematically improve healthcare quality, healthcare organizations (HCOs) need work environments that are person-centered: environments that support the care provider as well as the patient. We further argue that HCOs have a moral imperative to provide a workplace where professional care standards can be achieved. We draw upon a large body of research from several disciplines to propose and articulate a theoretical framework that explains how the work environment should be related to the well-being of patients and care providers, that is, the potential mediating mechanisms. Person-centered work environments include: (1) climates for patient-centered care; (2) climates for quality improvement; and (3) benevolent ethical climates. Such a work environment should support the provision of patient-centered care and should lead to positive psychological states for care providers, including psychological safety and positive affect. The model contributes to theory by specifying relationships between important organizational variables. The model can potentially contribute to practice by linking specific work environment attributes to outcomes for care providers and patients.

  4. Australian midwifery students and the continuity of care experience--getting it right.

    PubMed

    Sidebotham, Mary

    2014-09-01

    The evidence base supporting the value to be gained by women and babies from receiving continuity of care from a known midwife is growing; it is essential, therefore, that we nurture the future workforce to work within this model of care. The Australian National Midwifery Education Standards mandate that midwifery students provide continuity of care to 20 women as part of their practice requirements. The educational value to students and the degree of preparation this provides for future work patterns are well acknowledged. There is also growing evidence that women, too, benefit from having a student follow them through the pregnancy journey. This paper examines the experience of some students working within this model and comments on the importance of providing a flexible programme delivery model and supportive midwifery educators in order to sustain and develop this innovative approach to completing clinical practice requirements within a midwifery education programme.

  5. Risk-Based Tailoring of the Verification, Validation, and Accreditation/Acceptance Processes (Adaptation fondee sur le risque, des processus de verification, de validation, et d’accreditation/d’acceptation)

    DTIC Science & Technology

    2012-04-01

  6. A new simplified volume-loaded heterotopic rabbit heart transplant model with improved techniques and a standard operating procedure.

    PubMed

    Lu, Wei; Zheng, Jun; Pan, Xu-Dong; Li, Bing; Zhang, Jin-Wei; Wang, Long-Fei; Sun, Li-Zhong

    2015-04-01

    The classic non-working (NW) heterotopic heart transplant (HTX) model in rodents has been widely used for research related to immunology, graft rejection, evaluation of immunosuppressive therapies and organ preservation, but unloaded models are considered unsuitable for some research. Accordingly, we constructed a volume-loaded (VL) model using a new and simple technique. Thirty male New Zealand White rabbits were randomly divided into two groups, group NW with 14 rabbits and group VL with 16 rabbits, which served as donors and recipients. We created a large and nonrestrictive shunt to provide the left heart with sufficient preload. The donor ascending aorta (AO) and superior vena cava were anastomosed to the recipient abdominal aorta (AAO) and inferior vena cava (IVC), respectively. No animals suffered from paralysis, pneumonia or lethal bleeding. Recipients' mortality and morbidity were 6.7% (1/15) and 13.3% (2/15), respectively. The cold ischemia time in group VL was slightly longer than that in group NW. The maximal aortic velocity (MAV) of the donor heart was approximately half that of the native heart in group VL. A similar result was obtained for the late diastolic mitral inflow velocity of donor versus native hearts in group VL. Echocardiography (ECHO) showed bidirectional flow in the donor SVC of the VL model: inflow during diastole and outflow during systole. PET-CT imaging showed that the standard uptake value (SUV) of the allograft was equal to that of the native heart in both groups on postoperative day 3. We have developed a new VL model in rabbits that imitates a native heart hemodynamically while requiring only a minor additional procedure. The surgical technique is simple compared with currently used HTX models. We also developed a standard operating procedure that significantly improved graft and recipient survival rates. This study may be useful for transplantation investigations in which a working model is required.

  7. Working with the HL7 metamodel in a Model Driven Engineering context.

    PubMed

    Martínez-García, A; García-García, J A; Escalona, M J; Parra-Calderón, C L

    2015-10-01

    HL7 (Health Level 7) International is an organization that defines health information standards. Most HL7 domain information models have been designed in a proprietary graphic language whose domain models are based on the HL7 metamodel. Many researchers have considered using HL7 in the MDE (Model-Driven Engineering) context. A limitation has been identified: all MDE tools support UML (Unified Modeling Language), which is a standard modeling language, but most do not support the HL7 proprietary model language. We want to support software engineers without HL7 experience, so that they can model real-world problems by defining system requirements in UML that are transparently compliant with HL7 domain models. The objective of the present research is to connect HL7 with software analysis using a generic model-based approach. This paper introduces a first approach to an HL7 MDE solution that considers the MIF (Model Interchange Format) metamodel proposed by HL7, making use of a plug-in developed in the EA (Enterprise Architect) tool. Copyright © 2015 Elsevier Inc. All rights reserved.

  8. Neurocognitive Predictors of Mathematical Processing in School-Aged Children with Spina Bifida and Their Typically Developing Peers: Attention, Working Memory, and Fine Motor Skills

    PubMed Central

    Raghubar, Kimberly P.; Barnes, Marcia A.; Dennis, Maureen; Cirino, Paul T.; Taylor, Heather; Landry, Susan

    2015-01-01

    Objective: Math and attention are related in neurobiological and behavioral models of mathematical cognition. This study employed model-driven assessments of attention and math in children with spina bifida myelomeningocele (SBM), who have known math difficulties and specific attentional deficits, to more directly examine putative relations between attention and mathematical processing. The relation of other domain general abilities and math was also investigated. Method: Participants were 9.5-year-old children with SBM (N = 44) and typically developing children (N = 50). Participants were administered experimental exact and approximate arithmetic tasks, and standardized measures of math fluency and calculation. Cognitive measures included the Attention Network Test (ANT), and standardized measures of fine motor skills, verbal working memory (WM), and visual-spatial WM. Results: Children with SBM performed similarly to peers on exact arithmetic but more poorly on approximate and standardized arithmetic measures. On the ANT, children with SBM differed from controls on orienting attention but not alerting and executive attention. Multiple mediation models showed that: fine motor skills and verbal WM mediated the relation of group to approximate arithmetic; fine motor skills and visual-spatial WM mediated the relation of group to math fluency; and verbal and visual-spatial WM mediated the relation of group to math calculation. Attention was not a significant mediator of the effects of group for any aspect of math in this study. Conclusions: Results are discussed with reference to models of attention, WM, and mathematical cognition. PMID:26011113

  9. Neurocognitive predictors of mathematical processing in school-aged children with spina bifida and their typically developing peers: Attention, working memory, and fine motor skills.

    PubMed

    Raghubar, Kimberly P; Barnes, Marcia A; Dennis, Maureen; Cirino, Paul T; Taylor, Heather; Landry, Susan

    2015-11-01

    Math and attention are related in neurobiological and behavioral models of mathematical cognition. This study employed model-driven assessments of attention and math in children with spina bifida myelomeningocele (SBM), who have known math difficulties and specific attentional deficits, to more directly examine putative relations between attention and mathematical processing. The relation of other domain general abilities and math was also investigated. Participants were 9.5-year-old children with SBM (n = 44) and typically developing children (n = 50). Participants were administered experimental exact and approximate arithmetic tasks, and standardized measures of math fluency and calculation. Cognitive measures included the Attention Network Test (ANT), and standardized measures of fine motor skills, verbal working memory (WM), and visual-spatial WM. Children with SBM performed similarly to peers on exact arithmetic, but more poorly on approximate and standardized arithmetic measures. On the ANT, children with SBM differed from controls on orienting attention, but not on alerting and executive attention. Multiple mediation models showed that fine motor skills and verbal WM mediated the relation of group to approximate arithmetic; fine motor skills and visual-spatial WM mediated the relation of group to math fluency; and verbal and visual-spatial WM mediated the relation of group to math calculation. Attention was not a significant mediator of the effects of group for any aspect of math in this study. Results are discussed with reference to models of attention, WM, and mathematical cognition. (c) 2015 APA, all rights reserved.

  10. Interoperability in planetary research for geospatial data analysis

    NASA Astrophysics Data System (ADS)

    Hare, Trent M.; Rossi, Angelo P.; Frigeri, Alessandro; Marmo, Chiara

    2018-01-01

    For more than a decade there has been a push in the planetary science community to support interoperable methods for accessing and working with geospatial data. Common geospatial data products for planetary research include image mosaics, digital elevation or terrain models, geologic maps, geographic location databases (e.g., craters, volcanoes) or any data that can be tied to the surface of a planetary body (including moons, comets or asteroids). Several U.S. and international cartographic research institutions have converged on mapping standards that embrace standardized geospatial image formats, geologic mapping conventions, U.S. Federal Geographic Data Committee (FGDC) cartographic and metadata standards, and notably on-line mapping services as defined by the Open Geospatial Consortium (OGC). The latter includes defined standards such as the OGC Web Map Services (simple image maps), Web Map Tile Services (cached image tiles), Web Feature Services (feature streaming), Web Coverage Services (rich scientific data streaming), and Catalog Services for the Web (data searching and discoverability). While these standards were developed for application to Earth-based data, they can be just as valuable for the planetary domain. Another initiative, VESPA (Virtual European Solar and Planetary Access), will marry several of the above geoscience standards with astronomy-based standards defined by the International Virtual Observatory Alliance (IVOA). This work outlines the current state of interoperability initiatives in use or in the process of being researched within the planetary geospatial community.

  11. Is outdoor work associated with elevated rates of cerebrovascular disease mortality? A cohort study based on iron-ore mining.

    PubMed

    Björ, Ove; Jonsson, Håkan; Damber, Lena; Burström, Lage; Nilsson, Tohr

    2016-01-01

    A cohort study that examined iron ore mining found negative associations between cumulative working time employed underground and several outcomes, including mortality from cerebrovascular diseases. In this cohort study, using the same group of miners, we examined whether work in an outdoor environment could explain elevated cerebrovascular disease rates. The study was based on a Swedish iron ore mining cohort of 13,000 workers. Poisson regression models were used to generate smoothed estimates of standardized mortality ratios (SMRs) and adjusted rate ratios, both as functions of cumulative exposure time in outdoor work. The adjusted rate ratio between employment classified as outdoor work ≥25 years and outdoor work 0-4 years was 1.62 (95% CI 1.07-2.42). The subgroup underground work ≥15 years deviated most in cerebrovascular disease mortality compared with the external reference population: SMR 0.70 (95% CI 0.56-0.85). Employment in outdoor environments was associated with elevated rates of cerebrovascular disease mortality. In contrast, work in the tempered underground environment was associated with a protective effect.
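The SMR reported above is the ratio of observed to expected deaths in the exposed group. A minimal sketch of the calculation with an approximate log-scale CI, treating the observed count as Poisson; the counts below are invented for illustration (chosen only to land near an SMR of 0.70), not the paper's data:

```python
import math

def smr_with_ci(observed, expected, z=1.96):
    """Standardized mortality ratio O/E with a Wald CI on the log scale,
    a common textbook approximation for Poisson-distributed counts."""
    smr = observed / expected
    se_log = 1.0 / math.sqrt(observed)   # SE of log(SMR) ~ 1/sqrt(O)
    lo = smr * math.exp(-z * se_log)
    hi = smr * math.exp(z * se_log)
    return smr, lo, hi

# Hypothetical: 70 observed deaths against 100 expected from reference rates.
smr, lo, hi = smr_with_ci(70, 100.0)
print(f"SMR = {smr:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```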

  12. NASA Standard for Models and Simulations: Credibility Assessment Scale

    NASA Technical Reports Server (NTRS)

    Babula, Maria; Bertch, William J.; Green, Lawrence L.; Hale, Joseph P.; Mosier, Gary E.; Steele, Martin J.; Woods, Jody

    2009-01-01

    As one of its many responses to the 2003 Space Shuttle Columbia accident, NASA decided to develop a formal standard for models and simulations (M&S). Work commenced in May 2005. An interim version was issued in late 2006. This interim version underwent considerable revision following an extensive Agency-wide review in 2007, along with some additional revisions as a result of the review by the NASA Engineering Management Board (EMB) in the first half of 2008. Issuance of the revised, permanent version, hereafter referred to as the M&S Standard or simply the Standard, occurred in July 2008. Bertch, Zang and Steele provided a summary review of the development process of this standard up through the start of the review by the EMB. A thorough account of the entire development process, major issues, key decisions, and all review processes is available in Ref. v. This is the second of a pair of papers providing a summary of the final version of the Standard. Its focus is the Credibility Assessment Scale, a key feature of the Standard, including an example of its application to a real-world M&S problem for the James Webb Space Telescope. The companion paper summarizes the overall philosophy of the Standard and gives an overview of the requirements. Verbatim quotes from the Standard are integrated into the text of this paper and are indicated by quotation marks.

  13. Search for Decays of the Λ_b^0 Baryon with the D0 Experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Camacho, Enrique

    2011-11-25

    This thesis presents work I performed within the D0 Collaboration to measure the branching ratio of the Λ_b^0 baryon in the channel Λ_b^0 → J/ψ Λ^0. b-hadrons such as the Λ_b^0 are currently the subject of much research in both the theoretical and experimental particle physics communities. Measurements of the production and decays of b-hadrons can improve the understanding of the electroweak and strong interactions described by the Standard Model of particle physics, as well as providing opportunities to search for physics beyond the Standard Model.

  14. Progress toward a new measurement of the neutron lifetime

    NASA Astrophysics Data System (ADS)

    Grammer, Kyle

    2015-10-01

    Free neutron decay is the simplest nuclear beta decay. A precise value for the neutron lifetime is valuable for standard model consistency tests and Big Bang Nucleosynthesis models. There is a disagreement between the measured neutron lifetime from cold neutron beam experiments and ultracold neutron storage experiments. A new measurement of the neutron lifetime using the beam method is planned at the National Institute of Standards and Technology Center for Neutron Research. Experimental improvements should result in a measurement of the neutron lifetime with an uncertainty of 1 s. The technical improvements, recent apparatus tests, and the path towards the new measurement will be discussed. This work is supported by DOE Office of Science, NIST, and NSF.

  15. Progress toward a new measurement of the neutron lifetime

    NASA Astrophysics Data System (ADS)

    Grammer, Kyle

    2015-04-01

    Free neutron decay is the simplest nuclear beta decay. A precise value for the neutron lifetime is valuable for standard model consistency tests and Big Bang Nucleosynthesis models. There is a disagreement between the measured neutron lifetime from cold neutron beam experiments and ultracold neutron storage experiments. A new measurement of the neutron lifetime using the beam method is planned at the National Institute of Standards and Technology Center for Neutron Research. Experimental improvements should result in a measurement of the neutron lifetime with an uncertainty of 1 s. The technical improvements and the path towards the new measurement will be discussed. This work is supported by DOE Office of Science, NIST, and NSF.

  16. Testing for Lorentz violation: constraints on standard-model-extension parameters via lunar laser ranging.

    PubMed

    Battat, James B R; Chandler, John F; Stubbs, Christopher W

    2007-12-14

    We present constraints on violations of Lorentz invariance based on archival lunar laser-ranging (LLR) data. LLR measures the Earth-Moon separation by timing the round-trip travel of light between the two bodies and is currently accurate to the equivalent of a few centimeters (parts in 10^11 of the total distance). By analyzing these LLR data under the standard-model extension (SME) framework, we derived six observational constraints on dimensionless SME parameters that describe potential Lorentz violation. We found no evidence for Lorentz violation at the 10^-6 to 10^-11 level in these parameters. This work constitutes the first LLR constraints on SME parameters.
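The quoted fractional precision is easy to verify: a few centimeters over the mean Earth-Moon distance is indeed at the parts-in-10^11 level. The 3 cm figure below is illustrative:

```python
# Fractional precision of lunar laser ranging (illustrative numbers).
mean_distance_m = 3.844e8   # mean Earth-Moon separation in meters
uncertainty_m = 0.03        # ~3 cm ranging accuracy (illustrative)

fractional = uncertainty_m / mean_distance_m
print(f"fractional precision ~ {fractional:.1e}")  # ~7.8e-11
```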

  17. 29 CFR 1926.11 - Coverage under section 103 of the act distinguished.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... of the Contract Work Hours and Safety Standards Act. The application of the overtime requirements is... Contract Work Hours and Safety Standards Act, a contract must be one which (1) is entered into under a... statute “providing wage standards for such work.” The statutes “providing wage standards for such work...

  18. 48 CFR 22.403-3 - Contract Work Hours and Safety Standards Act.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 1 2011-10-01 2011-10-01 false Contract Work Hours and... Standards for Contracts Involving Construction 22.403-3 Contract Work Hours and Safety Standards Act. The Contract Work Hours and Safety Standards Act (40 U.S.C. 3701 et seq.) requires that certain contracts (see...

  19. 29 CFR 1926.11 - Coverage under section 103 of the act distinguished.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... of the Contract Work Hours and Safety Standards Act. The application of the overtime requirements is... Contract Work Hours and Safety Standards Act, a contract must be one which (1) is entered into under a... statute “providing wage standards for such work.” The statutes “providing wage standards for such work...

  20. A Standards-Based Inventory of Foundation Competencies in Social Work with Groups

    ERIC Educational Resources Information Center

    Macgowan, Mark J.

    2012-01-01

    Objective: This article describes the development of a measure of foundation competencies in group work derived from the Standards for Social Work Practice with Groups. Developed by the Association for the Advancement of Social Work with Groups, the Standards have not been widely used. An instrument based on the Standards can help advance…

  1. 40 CFR 63.4893 - What work practice standards must I meet?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Limitations § 63.4893 What work practice standards must I meet? (a) For any coating operation or group of... 40 Protection of Environment 12 2010-07-01 2010-07-01 true What work practice standards must I... controls option to demonstrate compliance, you are not required to meet any work practice standards. (b...

  2. A thought construction of working perpetuum mobile of the second kind

    NASA Astrophysics Data System (ADS)

    Čápek, V.; Bok, J.

    1999-12-01

    The previously published model of the isothermal Maxwell demon, one of the models of open quantum systems endowed with the faculty of self-organization, is reconstructed here. It describes an open quantum system interacting with a single thermodynamic bath but otherwise not aided from outside. Its activity is given by the standard linear Liouville equation for the system and bath. Owing to its self-organization property, the model then yields cyclic conversion of heat from the bath into mechanical work without compensation. Hence, it provides an explicit thought construction of a perpetuum mobile of the second kind, thus contradicting the Thomson formulation of the second law of thermodynamics. No approximation is involved, as a special scaling procedure is used which makes the employed kinetic equations exact.

  3. Real-Time Multimedia on the Internet: What Will It Take?

    ERIC Educational Resources Information Center

    Sodergren, Mike

    1998-01-01

    Considers the requirements for real-time, interactive multimedia over the Internet. Topics include demand for interactivity; new pricing models for Internet service; knowledgeable suppliers; consumer education on standards; enhanced infrastructure, including bandwidth; and new technology, including RSVP, and end-to-end Internet-working protocol.…

  4. The performance of a reduced-order adaptive controller when used in multi-antenna hyperthermia treatments with nonlinear temperature-dependent perfusion.

    PubMed

    Cheng, Kung-Shan; Yuan, Yu; Li, Zhen; Stauffer, Paul R; Maccarini, Paolo; Joines, William T; Dewhirst, Mark W; Das, Shiva K

    2009-04-07

    In large multi-antenna systems, adaptive controllers can aid in steering the heat focus toward the tumor. However, the large number of sources can greatly increase the steering time. Additionally, controller performance can be degraded due to changes in tissue perfusion which vary non-linearly with temperature, as well as with time and spatial position. The current work investigates whether a reduced-order controller with the assumption of piecewise constant perfusion is robust to temperature-dependent perfusion and achieves steering in a shorter time than required by a full-order controller. The reduced-order controller assumes that the optimal heating setting lies in a subspace spanned by the best heating vectors (virtual sources) of an initial, approximate, patient model. An initial, approximate, reduced-order model is iteratively updated by the controller, using feedback thermal images, until convergence of the heat focus to the tumor. Numerical tests were conducted in a patient model with a right lower leg sarcoma, heated in a 10-antenna cylindrical mini-annular phased array applicator operating at 150 MHz. A half-Gaussian model was used to simulate temperature-dependent perfusion. Simulated magnetic resonance temperature images were used as feedback at each iteration step. Robustness was validated for the controller, starting from four approximate initial models: (1) a 'standard' constant perfusion lower leg model ('standard' implies a model that exactly models the patient with the exception that perfusion is considered constant, i.e., not temperature dependent), (2) a model with electrical and thermal tissue properties varied from 50% higher to 50% lower than the standard model, (3) a simplified constant perfusion pure-muscle lower leg model with +/-50% deviated properties and (4) a standard model with the tumor position in the leg shifted by 1.5 cm. Convergence to the desired focus of heating in the tumor was achieved for all four simulated models. The controller accomplished satisfactory therapeutic outcomes: approximately 80% of the tumor was heated to temperatures >43 degrees C and approximately 93% was maintained at temperatures <41 degrees C. Compared to the controller without model reduction, an approximately 9-25-fold reduction in convergence time was accomplished using approximately 2-3 orthonormal virtual sources. In the situations tested, the controller was robust to the presence of temperature-dependent perfusion. The results of this work can help to lay the foundation for real-time thermal control of multi-antenna hyperthermia systems in clinical situations where perfusion can change rapidly with temperature.
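The reduced-order idea of searching only within a few orthonormal "virtual sources" can be sketched with an SVD. Everything below (the random heating matrix, the choice of k, the coefficients) is invented for illustration; in the paper the virtual sources come from the initial patient model, not random data:

```python
import numpy as np

# Hypothetical sketch: take a (tumor-voxels x antennas) heating matrix H from
# an initial model, keep the top-k right singular vectors as "virtual
# sources", and optimize antenna drives only within their span.
rng = np.random.default_rng(0)
H = rng.standard_normal((500, 10))        # 500 tumor voxels, 10 antennas
U, s, Vt = np.linalg.svd(H, full_matrices=False)
k = 3                                     # reduced order (~2-3 in the paper)
virtual_sources = Vt[:k]                  # each row: one antenna-weight vector

# Any reduced-order drive is a combination of these k orthonormal vectors,
# so the search space shrinks from 10 dimensions to k.
coeffs = np.array([1.0, 0.5, -0.2])
drive = coeffs @ virtual_sources          # length-10 antenna excitation
```

Shrinking the search from the full antenna count to k orthonormal directions is what yields the reported order-of-magnitude reduction in convergence time.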

  5. Computer investigations of the turbulent flow around a NACA2415 airfoil wind turbine

    NASA Astrophysics Data System (ADS)

    Driss, Zied; Chelbi, Tarek; Abid, Mohamed Salah

    2015-12-01

    In this work, computer investigations are carried out to study the flow field developing around a NACA2415 airfoil wind turbine. The Navier-Stokes equations, in conjunction with the standard k-ɛ turbulence model, are considered. These equations are solved numerically to determine the local characteristics of the flow. The models tested are implemented in the software "SolidWorks Flow Simulation", which uses a finite volume scheme. The numerical results are compared with experiments conducted in an open wind tunnel to validate them. This will help improve the aerodynamic efficiency in the design of packaged installations of the NACA2415 airfoil type wind turbine.
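For reference, the standard k-ɛ closure mentioned above computes a turbulent (eddy) viscosity from the turbulent kinetic energy k and its dissipation rate ɛ. A minimal sketch; C_mu = 0.09 is the standard model constant, and the sample values are arbitrary:

```python
# Eddy viscosity of the standard k-epsilon model: nu_t = C_mu * k^2 / epsilon.
C_MU = 0.09  # standard model constant

def eddy_viscosity(k, eps):
    """Turbulent (eddy) viscosity [m^2/s] given k [m^2/s^2] and eps [m^2/s^3]."""
    return C_MU * k**2 / eps

nu_t = eddy_viscosity(k=0.5, eps=2.0)   # illustrative values
print(f"nu_t = {nu_t} m^2/s")
```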

  6. Why some colors appear more memorable than others: A model combining categories and particulars in color working memory.

    PubMed

    Bae, Gi-Yeul; Olkkonen, Maria; Allred, Sarah R; Flombaum, Jonathan I

    2015-08-01

    Categorization with basic color terms is an intuitive and universal aspect of color perception. Yet research on visual working memory capacity has largely assumed that only continuous estimates within color space are relevant to memory. As a result, the influence of color categories on working memory remains unknown. We propose a dual content model of color representation in which color matches to objects that are either present (perception) or absent (memory) integrate category representations along with estimates of specific values on a continuous scale ("particulars"). We develop and test the model through 4 experiments. In a first experiment pair, participants reproduce a color target, both with and without a delay, using a recently influential estimation paradigm. In a second experiment pair, we use standard methods in color perception to identify boundary and focal colors in the stimulus set. The main results are that responses drawn from working memory are significantly biased away from category boundaries and toward category centers. Importantly, the same pattern of results is present without a memory delay. The proposed dual content model parsimoniously explains these results, and it should replace prevailing single content models in studies of visual working memory. More broadly, the model and the results demonstrate how the main consequence of visual working memory maintenance is the amplification of category related biases and stimulus-specific variability that originate in perception. (c) 2015 APA, all rights reserved.
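One way to see how combining a category representation with a continuous "particular" biases responses toward category centers: the blending rule and every parameter below (weight, noise, hue values) are our illustrative assumptions, not the authors' fitted model:

```python
import random

def dual_content_response(true_hue, category_center, w=0.3, noise_sd=5.0):
    """Illustrative dual-content report: a weighted blend of a noisy
    continuous estimate ("particular") and the category center."""
    particular = random.gauss(true_hue, noise_sd)
    return (1 - w) * particular + w * category_center

random.seed(1)
# A target at hue 150, near a boundary, with its category centered at 180:
responses = [dual_content_response(150.0, 180.0) for _ in range(10_000)]
mean_resp = sum(responses) / len(responses)   # pulled from 150 toward 180
```

The mean response lands between the true hue and the category center, i.e. biased away from the boundary, which is the qualitative pattern the experiments report.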

  7. 28 CFR Appendix A to Part 70 - Contract Provisions

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Department. 4. Contract Work Hours and Safety Standards Act (40 U.S.C. 327-333)—Where applicable, all... standard work week of forty hours. Work in excess of the standard work week is permissible provided that... all hours worked in excess of forty hours in the work week. Section 107 of the Act is applicable to...

  8. 38 CFR Appendix A to Part 49 - Contract Provisions

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Federal awarding agency. 4. Contract Work Hours and Safety Standards Act (40 U.S.C. 327-333)—Where... the basis of a standard work week of 40 hours. Work in excess of the standard work week is permissible... hours worked in excess of 40 hours in the work week. Section 107 of the Act is applicable to...

  9. The relationship between quality of work life and turnover intention of primary health care nurses in Saudi Arabia.

    PubMed

    Almalki, Mohammed J; FitzGerald, Gerry; Clark, Michele

    2012-09-12

    Quality of work life (QWL) has been found to influence the commitment of health professionals, including nurses. However, reliable information on QWL and turnover intention of primary health care (PHC) nurses is limited. The aim of this study was to examine the relationship between QWL and turnover intention of PHC nurses in Saudi Arabia. A cross-sectional survey was used in this study. Data were collected using Brooks' survey of Quality of Nursing Work Life, the Anticipated Turnover Scale and demographic data questions. A total of 508 PHC nurses in the Jazan Region, Saudi Arabia, completed the questionnaire (RR = 87%). Descriptive statistics, t-test, ANOVA, General Linear Model (GLM) univariate analysis, standard multiple regression, and hierarchical multiple regression were applied for analysis using SPSS v17 for Windows. Findings suggested that the respondents were dissatisfied with their work life, with almost 40% indicating a turnover intention from their current PHC centres. Turnover intention was significantly related to QWL. Using standard multiple regression, 26% of the variance in turnover intention was explained by QWL, p < 0.001, with R2 = .263. Further analysis using hierarchical multiple regression found that the total variance explained by the model as a whole (demographics and QWL) was 32.1%, p < 0.001. QWL explained an additional 19% of the variance in turnover intention, after controlling for demographic variables. Creating and maintaining a healthy work life for PHC nurses is very important to improve their work satisfaction, reduce turnover, enhance productivity and improve nursing care outcomes.
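The hierarchical-regression logic above (enter demographics first, then QWL, and read off the increment in variance explained) can be sketched on synthetic data. The predictors, coefficients, and sample below are invented for illustration, not the study's data:

```python
import numpy as np

def r_squared(X, y):
    """R^2 from an ordinary least-squares fit with intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

rng = np.random.default_rng(0)
n = 508                                     # sample size matching the survey
demo = rng.standard_normal((n, 2))          # step 1: demographic predictors
qwl = rng.standard_normal((n, 1))           # step 2: QWL score
y = 0.3 * demo[:, 0] + 0.8 * qwl[:, 0] + rng.standard_normal(n)

r2_step1 = r_squared(demo, y)                       # demographics only
r2_step2 = r_squared(np.column_stack([demo, qwl]), y)  # demographics + QWL
delta_r2 = r2_step2 - r2_step1              # variance uniquely added by QWL
```

The "additional 19% of variance explained by QWL after controlling for demographics" reported in the abstract is exactly this delta_r2 quantity, computed on the real data.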

  10. The relationship between quality of work life and turnover intention of primary health care nurses in Saudi Arabia

    PubMed Central

    2012-01-01

    Background: Quality of work life (QWL) has been found to influence the commitment of health professionals, including nurses. However, reliable information on QWL and turnover intention of primary health care (PHC) nurses is limited. The aim of this study was to examine the relationship between QWL and turnover intention of PHC nurses in Saudi Arabia. Methods: A cross-sectional survey was used in this study. Data were collected using Brooks' survey of Quality of Nursing Work Life, the Anticipated Turnover Scale and demographic data questions. A total of 508 PHC nurses in the Jazan Region, Saudi Arabia, completed the questionnaire (RR = 87%). Descriptive statistics, t-test, ANOVA, General Linear Model (GLM) univariate analysis, standard multiple regression, and hierarchical multiple regression were applied for analysis using SPSS v17 for Windows. Results: Findings suggested that the respondents were dissatisfied with their work life, with almost 40% indicating a turnover intention from their current PHC centres. Turnover intention was significantly related to QWL. Using standard multiple regression, 26% of the variance in turnover intention was explained by QWL, p < 0.001, with R2 = .263. Further analysis using hierarchical multiple regression found that the total variance explained by the model as a whole (demographics and QWL) was 32.1%, p < 0.001. QWL explained an additional 19% of the variance in turnover intention, after controlling for demographic variables. Conclusions: Creating and maintaining a healthy work life for PHC nurses is very important to improve their work satisfaction, reduce turnover, enhance productivity and improve nursing care outcomes. PMID:22970764

  11. Conformance testing strategies for DICOM protocols in a heterogenous communications system

    NASA Astrophysics Data System (ADS)

    Meyer, Ralph; Hewett, Andrew J.; Cordonnier, Emmanuel; Piqueras, Joachim; Jensch, Peter F.

    1995-05-01

    The goal of the DICOM standard is to define a standard network interface and data model for imaging devices from various vendors. It shall facilitate the development and integration of information systems and picture archiving and communication systems (PACS) in a networked environment. Current activities in Oldenburg, Germany include projects to establish cooperative work applications for radiological purposes, comprising (joined) text, data, signal and image communications, based on narrowband ISDN and ATM communication for regional and Pan-European applications. In such a growing and constantly changing environment it is vital to have a solid and implementable plan for bringing standards into operation. A communication standard alone cannot ensure interoperability between different vendor implementations. Even DICOM does not specify implementation-specific requirements, nor does it specify a testing procedure to assess an implementation's conformance to the standard. The conformance statements defined in the DICOM standard only allow a user to determine which optional components are supported by an implementation. The goal of our work is to build a conformance test suite for DICOM. Conformance testing can help simplify and solve problems with multivendor systems. It will check a vendor's implementation against the DICOM standard and report the supported subset of functionality. The test suite will be built with respect to the ISO 9646 standard (OSI Conformance Testing Methodology and Framework), which is devoted to the subject of conformance testing implementations of Open Systems Interconnection (OSI) standards. For our heterogeneous communication environments we must also consider ISO 9000-9004 (quality management and quality assurance) to give users confidence in evolving applications.

  12. The forgotten realm of the new and emerging psychosocial risk factors.

    PubMed

    Chirico, Francesco

    2017-09-28

    In Europe, employers in all private and public enterprises have a legal obligation to protect their employees from all types of workplace hazards to workers' safety and health. The most important methods developed for work-related stress risk assessment are based on Cox's research commissioned by the European Agency for Safety and Health at Work (EU-OSHA): the HSE Management Standards for work-related stress in the United Kingdom, the START method in Germany, the Screening, Observation, Analysis, Expertise (SOBANE) strategy in Belgium, and the National Institute for Prevention and Safety at Work (INAIL-ISPESL) model in Italy, the latter based on the British Management Standards. Unfortunately, the definition of "work-related stress" elaborated by EU-OSHA has been criticized, because it is not equivalent to the broader concept of "psychosocial risk," which includes new and emerging psychosocial risk factors such as combined exposure to physical and psychosocial risks, job insecurity, work intensification and high demands at work, high emotional load related to burnout, work-life balance problems, and violence and harassment at work. All these new and emerging psychosocial hazards may require different and additional methodologies to safeguard workers' health and safety. For this reason, the concept that stakeholders and policy makers should keep in mind in order to develop better national regulations and strategies is that work-related stress risk and psychosocial risk factors are not the same.

  13. Cosmological Models and Stability

    NASA Astrophysics Data System (ADS)

    Andersson, Lars

    Principles in the form of heuristic guidelines or generally accepted dogma play an important role in the development of physical theories. In particular, philosophical considerations and principles figure prominently in the work of Albert Einstein. As mentioned in the talk by Jiří Bičák at this conference, Einstein formulated the equivalence principle, an essential step on the road to general relativity, during his time in Prague 1911-1912. In this talk, I would like to discuss some aspects of cosmological models. As cosmology is an area of physics where "principles" such as the "cosmological principle" or the "Copernican principle" play a prominent role in motivating the class of models which form part of the current standard model, I will start by comparing the role of the equivalence principle to that of the principles used in cosmology. I will then briefly describe the standard model of cosmology to give a perspective on some mathematical problems and conjectures on cosmological models, which are discussed in the later part of this paper.

  14. Angular dependence models for radiance to flux conversion

    NASA Technical Reports Server (NTRS)

    Green, Richard N.; Suttles, John T.; Wielicki, Bruce A.

    1990-01-01

    Angular dependence models (ADMs) used for converting measured radiance to flux at the top of the atmosphere are reviewed, with emphasis on measures of their effectiveness and the implications of requiring the ADMs to satisfy reciprocity. The overall significance of the ADMs is assessed by analyzing the same satellite data with a single Lambertian model, a single mean model, and the 12 Earth Radiation Budget Experiment (ERBE) ADMs. It is shown that the Lambertian ADM is inadequate, while the mean ADM yields nearly unbiased fluxes but substantial errors for individual pixel fluxes. The standard ERBE ADMs work well except for a 10- to 15-percent albedo growth across the scan; a modified ADM based on the standard ERBE ADMs but forced to satisfy the principle of reciprocity increases the limb brightening and reduces the albedo growth, but does not improve the scanner and nonscanner intercomparison.
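    The radiance-to-flux conversion these models perform follows the standard inversion relation F = πL/R, where L is the measured radiance and R is the scene's anisotropic factor supplied by the ADM (R = 1 recovers the Lambertian assumption). A minimal sketch, with illustrative numbers rather than actual ERBE model coefficients:

    ```python
    import math

    def radiance_to_flux(radiance, anisotropic_factor=1.0):
        # Convert a top-of-atmosphere radiance (W m^-2 sr^-1) to flux (W m^-2).
        # anisotropic_factor R = 1 is the Lambertian assumption F = pi * L;
        # an ERBE-style ADM supplies R as a function of viewing/solar geometry.
        return math.pi * radiance / anisotropic_factor

    flux_lambertian = radiance_to_flux(80.0)                      # R = 1
    flux_adm = radiance_to_flux(80.0, anisotropic_factor=1.15)    # limb-brightened scene
    ```

    For a limb-brightened scene (R > 1), the ADM-corrected flux is smaller than the Lambertian estimate, which is why the choice of ADM directly drives the albedo behaviour discussed above.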

  15. A Voltammetric Electronic Tongue for the Resolution of Ternary Nitrophenol Mixtures

    PubMed Central

    González-Calabuig, Andreu; Cetó, Xavier

    2018-01-01

    This work reports the applicability of a voltammetric sensor array able to quantify the content of 2,4-dinitrophenol, 4-nitrophenol, and picric acid in artificial samples using electronic tongue (ET) principles. The ET is based on cyclic voltammetry signals obtained from an array of metal disk electrodes and a graphite epoxy composite electrode, compressed using the discrete wavelet transform and modelled with chemometric tools such as artificial neural networks (ANNs). ANNs were employed to build the quantitative prediction model. In this manner, a set of standards based on a full factorial design, ranging from 0 to 300 mg·L−1, was prepared to build the model; afterward, the model was validated with a completely independent set of standards. The model successfully predicted the concentrations of the three considered phenols, with a normalized root mean square error of 0.030 and 0.076 for the training and test subsets, respectively, and r ≥ 0.948. PMID:29342848
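    The normalized root-mean-square error quoted above can be computed as in the following sketch. Normalizing the RMSE by the span of the true values is one common convention; the paper's exact normalization is not given here, and the concentration values below are toy numbers, not the study's data.

    ```python
    import numpy as np

    def nrmse(y_true, y_pred):
        # Root-mean-square error normalized by the span of the true values,
        # one common convention for reporting calibration-model quality.
        rmse = np.sqrt(np.mean((y_true - y_pred) ** 2))
        return rmse / (y_true.max() - y_true.min())

    # Toy check: concentrations spanning 0-300 mg/L with small prediction errors.
    y_true = np.array([0.0, 75.0, 150.0, 225.0, 300.0])
    y_pred = y_true + np.array([3.0, -4.5, 6.0, -3.0, 4.5])
    err = nrmse(y_true, y_pred)
    ```

    A value such as the reported 0.030 means the typical prediction error is about 3% of the calibrated concentration range.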

  16. Implications of AM for the Navy Supply Chain

    DTIC Science & Technology

    2016-12-01

    Cornell University and Queens University of Canada. He is the co-chair of the America Makes Working Group for Additive Manufacturing Qualification and...strategic deployment of additive manufacturing (AM) machines throughout the supply chain, coupled with the right business model, is an imperative need in...60 Table 1. Additive Manufacturing Business Model Factors to develop a standard BCA template, taking into consideration the parameters in Table 1

  17. Obtaining of Analytical Relations for Hydraulic Parameters of Channels With Two Phase Flow Using Open CFD Toolbox

    NASA Astrophysics Data System (ADS)

    Varseev, E.

    2017-11-01

    The present work is dedicated to verification of the numerical model in the standard solver of the open-source CFD code OpenFOAM for two-phase flow simulation, and to determination of so-called “baseline” model parameters. An investigation is presented of the heterogeneous coolant flow parameters that lead to an abnormal friction increase in channels carrying two-phase adiabatic “water-gas” flows with low void fractions.

  18. 2010 Anthropometric Survey of U.S. Marine Corps Personnel: Methods and Summary Statistics

    DTIC Science & Technology

    2013-06-01

    models for the ergonomic design of working environments. Today, the entire production chain for a piece of clothing, beginning with the design and...Corps 382 crewstations and workstations. Digital models are increasingly used in the design process for seated and standing workstations, as well...International Standards for Ergonomic Design: These dimensions are useful for comparing data sets between nations, and are measured according to

  19. Animal models for microbicide studies

    PubMed Central

    Veazey, Ronald S.; Shattock, Robin J; Klasse, Per Johan; Moore, John P.

    2013-01-01

    There have been encouraging recent successes in the development of safe and effective topical microbicides to prevent vaginal or rectal HIV-1 transmission, based on the use of anti-retroviral drugs. However, much work remains to be accomplished before a microbicide becomes a standard element of prevention science strategies. Animal models should continue to play an important role in pre-clinical testing, with emphasis on safety, pharmacokinetic and efficacy testing. PMID:22264049

  20. Archiving InSight Lander Science Data Using PDS4 Standards

    NASA Astrophysics Data System (ADS)

    Stein, T.; Guinness, E. A.; Slavney, S.

    2017-12-01

    The InSight Mars Lander is scheduled for launch in 2018, and science data from the mission will be archived in the NASA Planetary Data System (PDS) using the new PDS4 standards. InSight is a geophysical lander with a science payload that includes a seismometer, a probe to measure subsurface temperatures and heat flow, a suite of meteorology instruments, a magnetometer, an experiment using radio tracking, and a robotic arm that will provide soil physical property information based on interactions with the surface. InSight is not the first science mission to archive its data using PDS4. However, PDS4 archives do not currently contain examples of the kinds of data that several of the InSight instruments will produce. Whereas the existing common PDS4 standards were sufficient for most of the archiving requirements of InSight, the data generated by a few instruments required development of several extensions to the PDS4 information model. For example, the seismometer will deliver a version of its data in SEED format, which is standard for the terrestrial seismology community. This format required the design of a new product type in the PDS4 information model. A local data dictionary has also been developed for InSight that contains attributes that are not part of the common PDS4 dictionary. The local dictionary provides metadata relevant to all InSight data sets, plus attributes specific to several of the instruments. Additional classes and attributes were designed for the existing PDS4 geometry dictionary that will capture metadata for the lander position and orientation, along with camera models for stereo image processing. Much of the InSight archive planning and design work has been done by a Data Archiving Working Group (DAWG), which has members from the InSight project and the PDS. The group coordinates archive design, schedules, and peer review of the archive documentation and test products.
The InSight DAWG archiving effort for PDS is being led by the PDS Geosciences Node with several other nodes working one-on-one with instruments relevant to their disciplines. Once the InSight mission begins operations, the DAWG will continue to provide oversight on release of InSight data to PDS. Lessons learned from InSight archive work will also feed forward to planning the archives for the Mars 2020 rover.

  1. Modelling robot construction systems

    NASA Technical Reports Server (NTRS)

    Grasso, Chris

    1990-01-01

    TROTER's are small, inexpensive robots that can work together to accomplish sophisticated construction tasks. To understand the issues involved in designing and operating a team of TROTER's, the robots and their components are being modeled. A TROTER system that features standardized component behavior is introduced. An object-oriented model implemented in the Smalltalk programming language is described and the advantages of the object-oriented approach for simulating robot and component interactions are discussed. The presentation includes preliminary results and a discussion of outstanding issues.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arnowitt, R.; Nath, P.

    A survey is given of supersymmetry and supergravity and their phenomenology. Some of the topics discussed are the basic ideas of global supersymmetry, the minimal supersymmetric Standard Model (MSSM) and its phenomenology, the basic ideas of local supersymmetry (supergravity), grand unification, supersymmetry breaking in supergravity grand unified models, radiative breaking of SU(2) × U(1), proton decay, cosmological constraints, and predictions of supergravity grand unified models. While the number of detailed derivations is necessarily limited, sufficient results are given so that a reader can gain a working knowledge of this field.

  3. Towards a Credibility Assessment of Models and Simulations

    NASA Technical Reports Server (NTRS)

    Blattnig, Steve R.; Green, Lawrence L.; Luckring, James M.; Morrison, Joseph H.; Tripathi, Ram K.; Zang, Thomas A.

    2008-01-01

    A scale is presented to evaluate the rigor of modeling and simulation (M&S) practices for the purpose of supporting a credibility assessment of the M&S results. The scale distinguishes required and achieved levels of rigor for a set of M&S elements that contribute to credibility including both technical and process measures. The work has its origins in an interest within NASA to include a Credibility Assessment Scale in development of a NASA standard for models and simulations.

  4. Simplifying the Reuse and Interoperability of Geoscience Data Sets and Models with Semantic Metadata that is Human-Readable and Machine-actionable

    NASA Astrophysics Data System (ADS)

    Peckham, S. D.

    2017-12-01

    Standardized, deep descriptions of digital resources (e.g. data sets, computational models, software tools and publications) make it possible to develop user-friendly software systems that assist scientists with the discovery and appropriate use of these resources. Semantic metadata makes it possible for machines to take actions on behalf of humans, such as automatically identifying the resources needed to solve a given problem, retrieving them and then automatically connecting them (despite their heterogeneity) into a functioning workflow. Standardized model metadata also helps model users to understand the important details that underpin computational models and to compare the capabilities of different models. These details include simplifying assumptions about the physics, governing equations and the numerical methods used to solve them, discretization of space (the grid) and time (the time-stepping scheme), state variables (input or output), and model configuration parameters. This kind of metadata provides a "deep description" of a computational model that goes well beyond other types of metadata (e.g. author, purpose, scientific domain, programming language, digital rights, provenance, execution) and captures the science that underpins a model. A carefully constructed, unambiguous, rules-based schema that addresses this problem, called the Geoscience Standard Names ontology, will be presented; it utilizes Semantic Web best practices and technologies, and has been designed to work across science domains and to be readable by both humans and machines.

  5. Leveraging standards to support patient-centric interdisciplinary plans of care.

    PubMed

    Dykes, Patricia C; DaDamio, Rebecca R; Goldsmith, Denise; Kim, Hyeon-eui; Ohashi, Kumiko; Saba, Virginia K

    2011-01-01

    As health care systems and providers move towards meaningful use of electronic health records, the once distant vision of collaborative patient-centric, interdisciplinary plans of care, generated and updated across organizations and levels of care, may soon become a reality. Effective care planning is included in the proposed Stages 2-3 Meaningful Use quality measures. To facilitate interoperability, standardization of plan of care messaging, content, information and terminology models is needed. This degree of standardization requires local and national coordination. The purpose of this paper is to review some existing standards that may be leveraged to support development of interdisciplinary patient-centric plans of care. Standards are then applied to a use case to demonstrate one method for achieving patient-centric and interoperable interdisciplinary plan of care documentation. Our pilot work suggests that existing standards provide a foundation for adoption and implementation of patient-centric plans of care that are consistent with federal requirements.

  6. The decline and fall of Esperanto: lessons for standards committees.

    PubMed

    Patterson, R; Huff, S M

    1999-01-01

    In 1887, Polish physician Ludovic Zamenhof introduced Esperanto, a simple, easy-to-learn planned language. His goal was to erase communication barriers between ethnic groups by providing them with a politically neutral, culturally free standard language. His ideas received both praise and condemnation from the leaders of his time. Interest in Esperanto peaked in the 1970s but has since faded somewhat. Despite the logical concept and intellectual appeal of a standard language, Esperanto has not evolved into a dominant worldwide language. Instead, English, with all its idiosyncrasies, is closest to an international lingua franca. Like Zamenhof, standards committees in medical informatics have recognized communication chaos and have tried to establish working models, with mixed results. In some cases, previously shunned proprietary systems have become the standard. A proposed standard, no matter how simple, logical, and well designed, may have difficulty displacing an imperfect but functional "real life" system.

  7. Collaborative Project. A Flexible Atmospheric Modeling Framework for the Community Earth System Model (CESM)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gettelman, Andrew

    2015-10-01

    In this project we have been upgrading the Multiscale Modeling Framework (MMF) in the Community Atmosphere Model (CAM), also known as Super-Parameterized CAM (SP-CAM). This has included a major effort to update the coding standards and interface with CAM so that it can be placed on the main development trunk. It has also included development of a new software structure for CAM to be able to handle sub-grid column information. These efforts have formed the major thrust of the work.

  8. 40 CFR Table 3 to Subpart Kkkkk of... - Work Practice Standards

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 14 2012-07-01 2011-07-01 true Work Practice Standards 3 Table 3 to..., Subpt. KKKKK, Table 3 Table 3 to Subpart KKKKK of Part 63—Work Practice Standards As stated in § 63.8555, you must comply with each work practice standard in the following table that applies to you. For...

  9. 40 CFR Table 3 to Subpart Kkkkk of... - Work Practice Standards

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 14 2014-07-01 2014-07-01 false Work Practice Standards 3 Table 3 to..., Subpt. KKKKK, Table 3 Table 3 to Subpart KKKKK of Part 63—Work Practice Standards As stated in § 63.8555, you must comply with each work practice standard in the following table that applies to you. For...

  10. 40 CFR Table 3 to Subpart Kkkkk of... - Work Practice Standards

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 14 2013-07-01 2013-07-01 false Work Practice Standards 3 Table 3 to..., Subpt. KKKKK, Table 3 Table 3 to Subpart KKKKK of Part 63—Work Practice Standards As stated in § 63.8555, you must comply with each work practice standard in the following table that applies to you. For...

  11. 40 CFR Table 3 to Subpart Sssss of... - Work Practice Standards

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 14 2010-07-01 2010-07-01 false Work Practice Standards 3 Table 3 to.... 63, Subpt. SSSSS, Table 3 Table 3 to Subpart SSSSS of Part 63—Work Practice Standards As stated in § 63.9788, you must comply with the work practice standards for affected sources in the following table...

  12. 40 CFR Table 4 to Subpart Eeee of... - Work Practice Standards

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 12 2011-07-01 2009-07-01 true Work Practice Standards 4 Table 4 to..., Table 4 Table 4 to Subpart EEEE of Part 63—Work Practice Standards As stated in § 63.2346, you may elect to comply with one of the work practice standards for existing, reconstructed, or new affected...

  13. 40 CFR Table 3 to Subpart Kkkkk of... - Work Practice Standards

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 13 2011-07-01 2011-07-01 false Work Practice Standards 3 Table 3 to..., Subpt. KKKKK, Table 3 Table 3 to Subpart KKKKK of Part 63—Work Practice Standards As stated in § 63.8555, you must comply with each work practice standard in the following table that applies to you. For...

  14. 40 CFR Table 3 to Subpart Kkkkk of... - Work Practice Standards

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 13 2010-07-01 2010-07-01 false Work Practice Standards 3 Table 3 to..., Subpt. KKKKK, Table 3 Table 3 to Subpart KKKKK of Part 63—Work Practice Standards As stated in § 63.8555, you must comply with each work practice standard in the following table that applies to you. For...

  15. 40 CFR 63.9635 - How do I demonstrate continuous compliance with the work practice standards that apply to me?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... compliance with the work practice standards that apply to me? 63.9635 Section 63.9635 Protection of... demonstrate continuous compliance with the work practice standards that apply to me? (a) You must demonstrate continuous compliance with the work practice standard requirements in § 63.9591 by operating in accordance...

  16. An approach for the semantic interoperability of ISO EN 13606 and OpenEHR archetypes.

    PubMed

    Martínez-Costa, Catalina; Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás

    2010-10-01

    The communication between health information systems of hospitals and primary care organizations is currently an important challenge to improve the quality of clinical practice and patient safety. However, clinical information is usually distributed among several independent systems that may be syntactically or semantically incompatible. This fact prevents healthcare professionals from accessing clinical information of patients in an understandable and normalized way. In this work, we address the semantic interoperability of two EHR standards: OpenEHR and ISO EN 13606. Both standards follow the dual model approach which distinguishes information and knowledge, this being represented through archetypes. The solution presented here is capable of transforming OpenEHR archetypes into ISO EN 13606 and vice versa by combining Semantic Web and Model-driven Engineering technologies. The resulting software implementation has been tested using publicly available collections of archetypes for both standards.

  17. Minimum Information about a Cardiac Electrophysiology Experiment (MICEE): Standardised Reporting for Model Reproducibility, Interoperability, and Data Sharing

    PubMed Central

    Quinn, TA; Granite, S; Allessie, MA; Antzelevitch, C; Bollensdorff, C; Bub, G; Burton, RAB; Cerbai, E; Chen, PS; Delmar, M; DiFrancesco, D; Earm, YE; Efimov, IR; Egger, M; Entcheva, E; Fink, M; Fischmeister, R; Franz, MR; Garny, A; Giles, WR; Hannes, T; Harding, SE; Hunter, PJ; Iribe, G; Jalife, J; Johnson, CR; Kass, RS; Kodama, I; Koren, G; Lord, P; Markhasin, VS; Matsuoka, S; McCulloch, AD; Mirams, GR; Morley, GE; Nattel, S; Noble, D; Olesen, SP; Panfilov, AV; Trayanova, NA; Ravens, U; Richard, S; Rosenbaum, DS; Rudy, Y; Sachs, F; Sachse, FB; Saint, DA; Schotten, U; Solovyova, O; Taggart, P; Tung, L; Varró, A; Volders, PG; Wang, K; Weiss, JN; Wettwer, E; White, E; Wilders, R; Winslow, RL; Kohl, P

    2011-01-01

    Cardiac experimental electrophysiology is in need of a well-defined Minimum Information Standard for recording, annotating, and reporting experimental data. As a step toward establishing this, we present a draft standard, called Minimum Information about a Cardiac Electrophysiology Experiment (MICEE). The ultimate goal is to develop a useful tool for cardiac electrophysiologists which facilitates and improves dissemination of the minimum information necessary for reproduction of cardiac electrophysiology research, allowing for easier comparison and utilisation of findings by others. It is hoped that this will enhance the integration of individual results into experimental, computational, and conceptual models. In its present form, this draft is intended for assessment and development by the research community. We invite the reader to join this effort, and, if deemed productive, implement the Minimum Information about a Cardiac Electrophysiology Experiment standard in their own work. PMID:21745496

  18. Development of hybrid electric vehicle powertrain test system based on virtual instrument

    NASA Astrophysics Data System (ADS)

    Xu, Yanmin; Guo, Konghui; Chen, Liming

    2017-05-01

    Hybrid powertrains have become a standard configuration on some automobile models. A test system for the hybrid vehicle powertrain was developed based on virtual instruments, using an electric dynamometer to simulate engine operation and to test the motor and control unit of the powertrain. The test conditions include starting, acceleration, and deceleration. The results show that the test system can simulate the working conditions of the hybrid electric vehicle powertrain under various conditions.

  19. Predictors of new graduate nurses' workplace well-being: testing the job demands-resources model.

    PubMed

    Spence Laschinger, Heather K; Grau, Ashley L; Finegan, Joan; Wilk, Piotr

    2012-01-01

    New graduate nurses currently experience a stressful transition into the workforce, resulting in high levels of burnout and job turnover in their first year of practice. This study tested a theoretical model of new graduate nurses' worklife derived from the job demands-resources model to better understand how job demands (workload and bullying), job resources (job control and supportive professional practice environments), and a personal resource (psychological capital) combine to influence new graduate experiences of burnout and work engagement and, ultimately, health and job outcomes. A descriptive correlational design was used to test the hypothesized model in a sample of newly graduated nurses (N = 420) working in acute care hospitals in Ontario, Canada. Data were collected from July to November 2009. Participants were mailed questionnaires to their home address using the Total Design Method to improve response rates. All variables were measured using standardized questionnaires, and structural equation modeling was used to test the model. The final model fit statistics partially supported the original hypothesized model. In the final model, job demands (workload and bullying) predicted burnout and, subsequently, poor mental health. Job resources (supportive practice environment and control) predicted work engagement and, subsequently, lower turnover intentions. Burnout also was a significant predictor of turnover intent (a crossover effect). Furthermore, personal resources (psychological capital) significantly influenced both burnout and work engagement. The model suggests that managerial strategies targeted at specific job demands and resources can create workplace environments that promote work engagement and prevent burnout to support the retention and well-being of the new graduate nurse population.

  20. Modular modelling with Physiome standards

    PubMed Central

    Nickerson, David P.; Nielsen, Poul M. F.; Hunter, Peter J.

    2016-01-01

    Key points: The complexity of computational models is increasing, supported by research in modelling tools and frameworks, but relatively little thought has gone into design principles for complex models. We propose a set of design principles for complex model construction with the Physiome standard modelling protocol CellML. By following the principles, models are generated that are extensible and are themselves suitable for reuse in larger models of increasing complexity. We illustrate these principles with examples, including an architectural prototype linking, for the first time, electrophysiology, thermodynamically compliant metabolism, signal transduction, gene regulation and synthetic biology. The design principles complement other Physiome research projects, facilitating the application of virtual experiment protocols and model analysis techniques to assist the modelling community in creating libraries of composable, characterised and simulatable quantitative descriptions of physiology. Abstract: The ability to produce and customise complex computational models has great potential to have a positive impact on human health. As the field develops towards whole-cell models and linking such models in multi-scale frameworks to encompass tissue, organ, or organism levels, reuse of previous modelling efforts will become increasingly necessary. Any modelling group wishing to reuse existing computational models as modules for their own work faces many challenges in the context of construction, storage, retrieval, documentation and analysis of such modules. Physiome standards, frameworks and tools seek to address several of these challenges, especially for models expressed in the modular protocol CellML. Aside from providing a general ability to produce modules, there has been relatively little research work on architectural principles of CellML models that will enable reuse at larger scales. 
To complement and support the existing tools and frameworks, we develop a set of principles to address this consideration. The principles are illustrated with examples that couple electrophysiology, signalling, metabolism, gene regulation and synthetic biology, together forming an architectural prototype for whole‐cell modelling (including human intervention) in CellML. Such models illustrate how testable units of quantitative biophysical simulation can be constructed. Finally, future relationships between modular models so constructed and Physiome frameworks and tools are discussed, with particular reference to how such frameworks and tools can in turn be extended to complement and gain more benefit from the results of applying the principles. PMID:27353233

  1. Lateral Load Capacity of Piles: A Comparative Study Between Indian Standards and Theoretical Approach

    NASA Astrophysics Data System (ADS)

    Jayasree, P. K.; Arun, K. V.; Oormila, R.; Sreelakshmi, H.

    2018-05-01

    As per Indian Standards, laterally loaded piles are usually analysed using the method adopted by IS 2911-2010 (Part 1/Section 2), but practising engineers are of the opinion that the IS method is very conservative in design. This work aims at determining the extent to which the conventional IS design approach is conservative. This is done through a comparative study between the IS approach and a theoretical model based on Vesic's equation. Bore log details for six different bridges were collected from the Kerala Public Works Department. Cast-in-situ fixed-head piles embedded in three soil conditions, covering both end-bearing and friction piles, were considered and analysed separately. Piles were also modelled in the STAAD.Pro software based on the IS approach, and the results were validated using the Matlock and Reese (In Proceedings of fifth international conference on soil mechanics and foundation engineering, 1961) equation. The results were presented as the percentage variation in the values of bending moment and deflection obtained by the different methods. The results obtained from the mathematical model based on Vesic's equation and those obtained as per the IS approach were compared, and the IS method was found to be uneconomical and conservative.
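    The percentage-variation comparison described above reduces to a simple relative-difference calculation between the two methods' predictions. A minimal sketch; the moment values below are purely illustrative and are not taken from the study:

    ```python
    def percent_variation(reference, value):
        # Percentage difference of `value` relative to `reference`, as used
        # here to compare design-code and theoretical-model predictions.
        return 100.0 * (value - reference) / reference

    # Illustrative bending moments in kN.m (hypothetical numbers):
    m_vesic = 120.0   # theoretical model based on Vesic's equation
    m_is = 150.0      # conservative IS 2911 design value
    variation = percent_variation(m_vesic, m_is)
    ```

    A positive `variation` of this kind, with the IS value the larger of the two, is what the study characterizes as the IS method being conservative.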

  2. BOOK REVIEW: Structures in the Universe by Exact Methods: Formation, Evolution, Interactions (Cambridge Monographs on Mathematical Physics)

    NASA Astrophysics Data System (ADS)

    Coley, Alan

    2010-05-01

    In this book the use of inhomogeneous models in cosmology, both in modelling structure formation and in interpreting cosmological observations, is discussed. The authors concentrate on exact solutions, particularly the Lemaitre-Tolman (LT) and Szekeres models (the important topic of averaging is not discussed). The book serves to demonstrate that inhomogeneous metrics can generate realistic models of cosmic structure formation and nonlinear evolution, and shows that general relativity has a lot more to offer to cosmology than just the standard spatially homogeneous FLRW model. I would recommend this book to people working in theoretical cosmology. In the introduction (and in the concluding chapter and throughout the book) a reasonable discussion of the potential problems with the standard FLRW cosmology is presented, and a list of examples illustrating the limitations of standard FLRW cosmology is discussed (including potential problems with perturbation methods). In particular, the authors argue that the assumptions of isotropy and spatial homogeneity (and consequently the Copernican principle) must be properly challenged and revisited. Indeed, it is possible for `good old general relativity' to be used to explain cosmological observations without introducing speculative elements. In part I of the book the necessary background is presented (readers need a background in general relativity theory at an advanced undergraduate or graduate level). There is a good (and easy to read) review of the exact spherically symmetric dust Lemaitre-Tolman model (often denoted the LTB model) and of the Lemaitre and Szekeres models. Light propagation (i.e. null geodesics, for both central and off-center observers) in exact inhomogeneous (LT) models is reviewed. In part II a number of applications of exact inhomogeneous models are presented (taken mainly from the authors' own work). 
In chapter 4, the evolution of exact inhomogeneous models (primarily the LT model, but also the Szekeres model) is studied in the context of structure formation. I thought that the authors described the advantages and drawbacks of the idealized exact solutions used in the physical modelling in a reasonable manner (although more concise conclusions might have been useful). The authors also address the formation of a galaxy with a central black hole, the formation and evolution of rich galactic clusters, voids and other structures, and the effects of radiation in the models. The most interesting application is presented in chapter 5; namely, the effects of inhomogeneities on observations such as the luminosity distance relation and the explanation of the observed dimming of distant SN Ia (which is usually interpreted within the standard FLRW model in terms of the existence of dark energy). The main conclusion of this work is that the data can be reproduced within the LT model (via inhomogeneities in general relativity, but without introducing dark energy). In particular, a number of exact LT solutions are surveyed, and a full discussion of various models in the literature (and a critique of their assumptions) is presented. In the next chapter the possible resolution of the horizon problem without inflation, in terms of shell crossing in an LT model, is discussed. This is perhaps the most controversial chapter of the book. In the final chapter 7, the influence of inhomogeneous structures in the path of a light ray (for both central and off-center observers in a special Szekeres Swiss-cheese model) on the observed temperature distribution of the CMB is discussed. This is a very important topic, but only a heuristic and qualitative study is presented here; more work on higher-order multipole moments would be necessary for a more comprehensive analysis.

  3. Personalized-detailed clinical model for data interoperability among clinical standards.

    PubMed

    Khan, Wajahat Ali; Hussain, Maqbool; Afzal, Muhammad; Amin, Muhammad Bilal; Saleem, Muhammad Aamir; Lee, Sungyoung

    2013-08-01

    Data interoperability among health information exchange (HIE) systems is a major concern for healthcare practitioners to enable provisioning of telemedicine-related services. Heterogeneity exists in these systems not only at the data level but also among different heterogeneous healthcare standards with which these are compliant. The relationship between healthcare organization data and different heterogeneous standards is necessary to achieve the goal of data level interoperability. We propose a personalized-detailed clinical model (P-DCM) approach for the generation of customized mappings that creates the necessary linkage between organization-conformed healthcare standards concepts and clinical model concepts to ensure data interoperability among HIE systems. We consider electronic health record (EHR) standards, openEHR, and HL7 CDA instances transformation using P-DCM. P-DCM concepts associated with openEHR and HL7 CDA help in transformation of instances among these standards. We investigated two datasets: (1) data of 100 diabetic patients, including 50 each of type 1 and type 2, from a local hospital in Korea and (2) data of a single Alzheimer's disease patient. P-DCMs were created for both scenarios, which provided the basis for deriving instances for HL7 CDA and openEHR standards. For proof of concept, we present case studies of encounter information for type 2 diabetes mellitus patients and monitoring of daily routine activities of an Alzheimer's disease patient. These reflect P-DCM-based customized mappings generation with openEHR and HL7 CDA standards. Customized mappings are generated based on the relationship of P-DCM concepts with CDA and openEHR concepts. The objective of this work is to achieve semantic data interoperability among heterogeneous standards. This would lead to effective utilization of resources and allow timely information exchange among healthcare systems.
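
    The mapping idea described in this abstract can be sketched in a few lines. Everything below is hypothetical illustration, not the authors' implementation: the concept names and target paths are invented for the example (they are not real openEHR archetype paths or HL7 CDA template identifiers). The sketch only shows how a clinical-model concept, once linked to a representation in each standard, lets an instance captured against the model be emitted in either target format.

    ```python
    # Hypothetical sketch of a P-DCM-style customized mapping: each
    # clinical-model concept is linked to a representation in two
    # standards. All names and paths here are illustrative.

    PDCM_MAP = {
        "encounter.blood_glucose": {
            "openEHR": "OBSERVATION.blood_glucose/value",
            "HL7_CDA": "observation/value",
        },
        "encounter.diagnosis": {
            "openEHR": "EVALUATION.problem_diagnosis/name",
            "HL7_CDA": "act/code/@displayName",
        },
    }

    def to_standard(instance, standard):
        """Re-key a P-DCM instance into one standard's concept paths."""
        out = {}
        for concept, value in instance.items():
            mapping = PDCM_MAP.get(concept)
            if mapping is None:
                raise KeyError(f"no mapping for concept {concept!r}")
            out[mapping[standard]] = value
        return out

    record = {"encounter.blood_glucose": 7.8,
              "encounter.diagnosis": "type 2 diabetes"}
    print(to_standard(record, "openEHR"))
    print(to_standard(record, "HL7_CDA"))
    ```

    The same single record is thus rendered under either standard's vocabulary, which is the essence of the customized-mappings approach the abstract describes.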

  4. Personalized-Detailed Clinical Model for Data Interoperability Among Clinical Standards

    PubMed Central

    Khan, Wajahat Ali; Hussain, Maqbool; Afzal, Muhammad; Amin, Muhammad Bilal; Saleem, Muhammad Aamir

    2013-01-01

    Abstract Objective: Data interoperability among health information exchange (HIE) systems is a major concern for healthcare practitioners to enable provisioning of telemedicine-related services. Heterogeneity exists in these systems not only at the data level but also among different heterogeneous healthcare standards with which these are compliant. The relationship between healthcare organization data and different heterogeneous standards is necessary to achieve the goal of data level interoperability. We propose a personalized-detailed clinical model (P-DCM) approach for the generation of customized mappings that creates the necessary linkage between organization-conformed healthcare standards concepts and clinical model concepts to ensure data interoperability among HIE systems. Materials and Methods: We consider electronic health record (EHR) standards, openEHR, and HL7 CDA instances transformation using P-DCM. P-DCM concepts associated with openEHR and HL7 CDA help in transformation of instances among these standards. We investigated two datasets: (1) data of 100 diabetic patients, including 50 each of type 1 and type 2, from a local hospital in Korea and (2) data of a single Alzheimer's disease patient. P-DCMs were created for both scenarios, which provided the basis for deriving instances for HL7 CDA and openEHR standards. Results: For proof of concept, we present case studies of encounter information for type 2 diabetes mellitus patients and monitoring of daily routine activities of an Alzheimer's disease patient. These reflect P-DCM-based customized mappings generation with openEHR and HL7 CDA standards. Customized mappings are generated based on the relationship of P-DCM concepts with CDA and openEHR concepts. Conclusions: The objective of this work is to achieve semantic data interoperability among heterogeneous standards. This would lead to effective utilization of resources and allow timely information exchange among healthcare systems. 
PMID:23875730

  5. 34 CFR Appendix A to Part 74 - Contract Provisions

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... suspected or reported violations to the Federal awarding agency. 4. Contract Work Hours and Safety Standards... mechanic and laborer on the basis of a standard work week of 40 hours. Work in excess of the standard work... basic rate of pay for all hours worked in excess of 40 hours in the work week. Section 107 of the Act is...

  6. 14 CFR Appendix to Part 1274 - Listing of Exhibits

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... laborer on the basis of a standard work week of 40 hours. Work in excess of the standard work week is... pay for all hours worked in excess of 40 hours in the work week. Section 107 of the Act is applicable... Work Hours and Safety Standards Act (40 U.S.C. 327-333)—Where applicable, all contracts awarded by...

  7. 36 CFR Appendix A to Part 1210 - Contract Provisions

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... suspected or reported violations to the Federal awarding agency. 4. Contract Work Hours and Safety Standards... mechanic and laborer on the basis of a standard work week of 40 hours. Work in excess of the standard work... basic rate of pay for all hours worked in excess of 40 hours in the work week. Section 107 of the Act is...

  8. 10 CFR Appendix A to Subpart B of... - Contract Provisions

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... suspected or reported violations to the Federal awarding agency. 4. Contract Work Hours and Safety Standards... mechanic and laborer on the basis of a standard work week of 40 hours. Work in excess of the standard work... basic rate of pay for all hours worked in excess of 40 hours in the work week. Section 107 of the Act is...

  9. The Higgs and Supersymmetry at Run II of the LHC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shih, David

    2016-04-14

    Prof. David Shih was supported by DOE grant DE-SC0013678 from April 2015 to April 2016. His research during this year focused on the phenomenology of supersymmetry (SUSY) and maximizing its future discovery potential at Run II of the LHC. SUSY is one of the most well-motivated frameworks for physics beyond the Standard Model. It solves the "naturalness" or "hierarchy" problem by stabilizing the Higgs mass against otherwise uncontrolled quantum corrections, predicts "grand unification" of the fundamental forces, and provides many potential candidates for dark matter. However, after decades of null results from direct and indirect searches, the viable parameter space for SUSY is increasingly constrained. Also, the discovery of a Standard Model-like Higgs with a mass at 125 GeV places a stringent constraint on SUSY models. In the work supported on this grant, Shih has worked on four different projects motivated by these issues. He has built natural SUSY models that explain the Higgs mass and provide viable dark matter; he has studied the parameter space of "gauge mediated supersymmetry breaking" (GMSB) that satisfies the Higgs mass constraint; he has developed new tools for the precision calculation of flavor and CP observables in general SUSY models; and he has studied new techniques for discovery of supersymmetric partners of the top quark.

  10. [The standard of position of physician-stomatologist-therapeutist in the conditions of working together with assistant-stomatological].

    PubMed

    Kalininskaya, A A; Mescheryakov, D G; Ildarov, R B

    2013-01-01

    The article presents the scope of work, algorithms of labour operations, and work standards for a stomatologist-therapeutist working together with a dental assistant in four-handed practice. Calculations are given for the standard number of stomatologist positions under the new working conditions.

  11. Data Modeling Using Finite Differences

    ERIC Educational Resources Information Center

    Rhoads, Kathryn; Mendoza Epperson, James A.

    2017-01-01

    The Common Core State Standards for Mathematics (CCSSM) states that high school students should be able to recognize patterns of growth in linear, quadratic, and exponential functions and construct such functions from tables of data (CCSSI 2010). In their work with practicing secondary teachers, the authors found that teachers may make some tacit…
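
    The finite-difference criterion behind this CCSSM expectation can be sketched directly: for equally spaced inputs, constant first differences of the table values indicate a linear function, constant second differences a quadratic, and constant successive ratios an exponential. The function names and tolerance below are my own, for illustration only.

    ```python
    # Classify equally spaced table data by finite differences:
    # constant first differences -> linear, constant second
    # differences -> quadratic, constant successive ratios -> exponential.

    def diffs(ys):
        """First differences of a sequence."""
        return [b - a for a, b in zip(ys, ys[1:])]

    def classify(ys, tol=1e-9):
        def constant(seq):
            return len(seq) > 0 and all(abs(x - seq[0]) <= tol for x in seq)
        if constant(diffs(ys)):
            return "linear"
        if constant(diffs(diffs(ys))):
            return "quadratic"
        ratios = [b / a for a, b in zip(ys, ys[1:]) if a != 0]
        if len(ratios) == len(ys) - 1 and constant(ratios):
            return "exponential"
        return "other"

    print(classify([3, 5, 7, 9]))       # values of 2x + 3 -> "linear"
    print(classify([1, 4, 9, 16, 25]))  # values of x^2 -> "quadratic"
    print(classify([2, 6, 18, 54]))     # values of 2*3^x -> "exponential"
    ```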

  12. Getting AM Up to Speed Across the Army Life Cycle

    DTIC Science & Technology

    2016-12-01

    acquisition domain, more engineering work is needed to better define what standards should be used in Data Item Descriptions (DID) and Contract Data... compared to traditional parts replacement using procurement. While cold spray AM technology does not necessarily require the use of 3D models, to fully

  13. Variability within Systemic In Vivo Toxicity Points-of-Departure (SOT)

    EPA Science Inventory

    In vivo studies have long been considered the gold standard for toxicology screening and deriving points of departure (POD). With the push to decrease the use of animal studies, predictive models using in vivo data are being developed to estimate POD. However, recent work has il...

  14. Fractions, Number Lines, Third Graders

    ERIC Educational Resources Information Center

    Cramer, Kathleen; Ahrendt, Sue; Monson, Debra; Wyberg, Terry; Colum, Karen

    2017-01-01

    The Common Core State Standards for Mathematics (CCSSM) (CCSSI 2010) outlines ambitious goals for fraction learning, starting in third grade, that include the use of the number line model. Understanding and constructing fractions on a number line are particularly complex tasks. The current work of the authors centers on ways to successfully…

  15. Precursor Model and Preschool Science Learning about Shadows Formation

    ERIC Educational Resources Information Center

    Delserieys, Alice; Jégou, Corinne; Boilevin, Jean-Marie; Ravanis, Konstantinos

    2018-01-01

    Background: This work is based on the idea that young children benefit from early introduction of scientific concepts. Few studies describe didactical strategies focusing on physics understanding for young children and analyse their effectiveness in standard classroom environments. Purpose: The aim is to identify whether didactical strategies…

  16. Getting Out of the Way: A Lesson in Change

    ERIC Educational Resources Information Center

    McEnery, Douglas

    2005-01-01

    When the author's school implemented a model of standards-driven, research-based teaching practices, he realized that working individually with teachers on their professional development goals was not improving teacher performance or student achievement. As such, he developed a more facilitative role and listened to groups of teachers discuss…

  17. Freshman Learning Communities, College Performance, and Retention. Working Paper 2005-22

    ERIC Educational Resources Information Center

    Hotchkiss, Julie L.; Moore, Robert E.; Pitts, M. Melinda

    2005-01-01

    This paper applies a standard treatment effects model to determine that participation in Freshman Learning Communities (FLCs) improves academic performance and retention. Not controlling for individual self-selection into FLC participation leads one to incorrectly conclude that the impact is the same across race and gender groups. Accurately…

  18. Reinforcing the Afrocentric Paradigm: A Theoretical Project

    ERIC Educational Resources Information Center

    Sams, Timothy E.

    2010-01-01

    Thomas Kuhn's 1962 groundbreaking work, "The Structure of Scientific Revolutions," established the process for creating, and the components of, a disciplinary paradigm. This "scientific revolution" has evolved to become the standard for determining a field's claim to disciplinary status. In 2001 and 2003, Ama Mazama used Kuhn's model to establish the…

  19. Bringing Standardized Processes in Atom-Probe Tomography: I Establishing Standardized Terminology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, Ian M; Danoix, F; Forbes, Richard

    2011-01-01

    Defining standardized methods requires careful consideration of the entire field and its applications. The International Field Emission Society (IFES) has elected a Standards Committee, whose task is to determine the steps needed to establish atom-probe tomography as an accepted metrology technique. Specific tasks include developing protocols or standards for: terminology and nomenclature; metrology and instrumentation, including specifications for reference materials; test methodologies; modeling and simulations; and science-based health, safety, and environmental practices. The Committee is currently working on defining terminology related to atom-probe tomography, with the goal of including these terms in a document published by the International Organization for Standardization (ISO). Many terms also used in other disciplines have already been defined and will be discussed for adoption in the context of atom-probe tomography.

  20. Repopulation Kinetics and the Linear-Quadratic Model

    NASA Astrophysics Data System (ADS)

    O'Rourke, S. F. C.; McAneney, H.; Starrett, C.; O'Sullivan, J. M.

    2009-08-01

    The standard Linear-Quadratic (LQ) survival model for radiotherapy is used to investigate different schedules of radiation treatment planning for advanced head and neck cancer. We explore how these treatment protocols may be affected by different tumour repopulation kinetics between treatments. The laws for tumour cell repopulation include the logistic and Gompertz models, extending the work of Wheldon et al. [1], which was concerned with the case of exponential repopulation between treatments. Treatment schedules investigated include standardized and accelerated fractionation. Calculations based on the present work show that, even with growth laws scaled to ensure that the repopulation kinetics for advanced head and neck cancer are comparable, the survival fraction varied by orders of magnitude. Calculations show that application of the Gompertz model results in a significantly poorer prognosis for tumour eradication. Gaps in treatment also highlight the differences in the LQ model when the effect of repopulation kinetics is included.
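
    The kind of calculation described can be sketched as follows, assuming the standard LQ per-fraction survival exp(-(αd + βd²)) and the closed-form logistic and Gompertz growth solutions between fractions. All parameter values below are illustrative, not the paper's.

    ```python
    # Sketch of fractionated LQ cell kill with repopulation between
    # fractions. Parameters (alpha, beta, r, K, doses) are illustrative.

    import math

    def regrow(n, dt, law, r=0.05, K=1e9):
        """Grow n cells for dt days under the chosen repopulation law."""
        if law == "exponential":
            return n * math.exp(r * dt)
        if law == "logistic":
            # closed-form logistic solution with carrying capacity K
            return K / (1 + (K / n - 1) * math.exp(-r * dt))
        if law == "gompertz":
            # closed-form Gompertz solution: N(t) = K * (N0/K)**exp(-r t)
            return K * math.exp(math.log(n / K) * math.exp(-r * dt))
        raise ValueError(law)

    def surviving_fraction(n0, dose, fractions, gap_days, law,
                           alpha=0.3, beta=0.03):
        n = n0
        for i in range(fractions):
            n *= math.exp(-(alpha * dose + beta * dose ** 2))  # LQ kill
            if i < fractions - 1:
                n = regrow(n, gap_days, law)
        return n / n0

    for law in ("exponential", "logistic", "gompertz"):
        print(law, surviving_fraction(1e8, dose=2.0, fractions=30,
                                      gap_days=1.0, law=law))
    ```

    Well below the carrying capacity the Gompertz rate rN·ln(K/N) exceeds the exponential rate rN, so the Gompertz law regrows fastest and yields the largest surviving fraction, consistent with the abstract's conclusion of a poorer prognosis under Gompertz repopulation.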

  1. Model reference adaptive control (MRAC)-based parameter identification applied to surface-mounted permanent magnet synchronous motor

    NASA Astrophysics Data System (ADS)

    Zhong, Chongquan; Lin, Yaoyao

    2017-11-01

    In this work, a model reference adaptive control-based estimation algorithm is proposed for online multi-parameter identification of surface-mounted permanent magnet synchronous machines. By taking the dq-axis equations of a practical motor as the reference model and the dq-axis estimation equations as the adjustable model, a standard model-reference-adaptive-system-based estimator was established. Additionally, the Popov hyperstability principle was used in the design of the adaptive law to guarantee accurate convergence. In order to reduce oscillation in the identification results, this work introduces a first-order low-pass digital filter to improve the precision of the parameter estimation. The proposed scheme was then applied to an SPM synchronous motor control system without any additional circuits and implemented using a DSP TMS320LF2812. The experimental results reveal the effectiveness of the proposed method.
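
    The first-order low-pass digital filter mentioned in the abstract has the standard recurrence y[k] = y[k-1] + a·(x[k] - y[k-1]) with 0 < a ≤ 1. A minimal sketch follows; the coefficient and the test signal are illustrative, not the paper's values.

    ```python
    # First-order IIR low-pass filter smoothing an oscillating estimate.
    # Smoothing coefficient a and the synthetic signal are illustrative.

    def low_pass(samples, a=0.1):
        """Apply y[k] = y[k-1] + a*(x[k] - y[k-1]) over a sequence."""
        out = []
        y = samples[0]
        for x in samples:
            y += a * (x - y)
            out.append(y)
        return out

    # A parameter estimate oscillating +/-0.2 around the true value 1.0
    # settles to within about a*0.2/(2-a) of 1.0 after the transient.
    noisy = [1.0 + (0.2 if k % 2 else -0.2) for k in range(50)]
    smoothed = low_pass(noisy)
    print(round(smoothed[-1], 3))
    ```

    A smaller coefficient suppresses the oscillation more strongly at the cost of slower tracking, which is the usual trade-off when filtering online parameter estimates.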

  2. A novel metadata management model to capture consent for record linkage in longitudinal research studies.

    PubMed

    McMahon, Christiana; Denaxas, Spiros

    2017-11-06

    Informed consent is an important feature of longitudinal research studies, as it enables the linking of baseline participant information with administrative data. The lack of standardized models to capture consent elements can lead to substantial challenges; a structured approach to capturing consent-related metadata can address these. Our objectives were to: a) explore the state-of-the-art for recording consent; b) identify key elements of consent required for record linkage; and c) create and evaluate a novel metadata management model to capture consent-related metadata. The main methodological components of our work were: a) a systematic literature review and qualitative analysis of consent forms; and b) the development and evaluation of a novel metadata model. We qualitatively analyzed 61 manuscripts and 30 consent forms, and extracted data elements related to obtaining consent for linkage. We created a novel metadata management model for consent and evaluated it by comparison with existing standards and by iteratively applying it to case studies. The developed model can facilitate the standardized recording of consent for linkage in longitudinal research studies and enable the linkage of external participant data. Furthermore, it can provide a structured way of recording consent-related metadata and facilitate the harmonization and streamlining of processes.

  3. Intelligence Reach for Expertise (IREx)

    NASA Astrophysics Data System (ADS)

    Hadley, Christina; Schoening, James R.; Schreiber, Yonatan

    2015-05-01

    IREx is a search engine for next-generation analysts to find collaborators. U.S. Army Field Manual 2.0 (Intelligence) calls for collaboration within and outside the area of operations, but finding the best collaborator for a given task can be challenging. IREx will be demonstrated as part of the Actionable Intelligence Technology Enabled Capability Demonstration (AI-TECD) at the E15 field exercises at Ft. Dix in July 2015. It includes a Task Model for describing a task and its prerequisite competencies, plus a User Model (i.e., a user profile) for individuals to assert their capabilities and other relevant data. These models are built on a canonical suite of ontologies, which enables robust queries and also keeps the models logically consistent. IREx also supports learning validation, where a learner who has completed a course module can search for and find a suitable task to practice and demonstrate that their new knowledge can be used in the real world for its intended purpose. The IREx models are in the initial phase of a process to develop them into an IEEE standard. This initiative is currently an approved IEEE Study Group, after which follows a standards working group, then a balloting group, and, if all goes well, an IEEE standard.

  4. Ground control station software design for micro aerial vehicles

    NASA Astrophysics Data System (ADS)

    Walendziuk, Wojciech; Oldziej, Daniel; Binczyk, Dawid Przemyslaw; Slowik, Maciej

    2017-08-01

    This article describes the process of designing the hardware and software of a ground control station used for configuring and operating micro unmanned aerial vehicles (UAVs). All work was conducted on a quadrocopter model, a commonly available commercial construction. The article characterizes the research object, covers the basics of operating micro aerial vehicles (MAVs), and presents the components of the ground control station model. It also describes the communication standards used in building the station model. The latter part of the work concerns the software of the product, the GIMSO application (Generally Interactive Station for Mobile Objects), which enables the user to manage the actions, communication and control processes of the UAV. The process of creating the software and the field tests of the station model are also presented.

  5. International Oil Supplies and Demands

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1992-04-01

    The eleventh Energy Modeling Forum (EMF) working group met four times over the 1989-1990 period to compare alternative perspectives on international oil supplies and demands through 2010 and to discuss how alternative supply and demand trends influence the world's dependence upon Middle Eastern oil. Proprietors of eleven economic models of the world oil market used their respective models to simulate a dozen scenarios using standardized assumptions. From its inception, the study was not designed to focus on the short-run impacts of disruptions on oil markets. Nor did the working group attempt to provide a forecast or just a single view of the likely future path for oil prices. The model results guided the group's thinking about many important longer-run market relationships and helped to identify differences of opinion about future oil supplies, demands, and dependence.

  6. International Oil Supplies and Demands

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1991-09-01

    The eleventh Energy Modeling Forum (EMF) working group met four times over the 1989-90 period to compare alternative perspectives on international oil supplies and demands through 2010 and to discuss how alternative supply and demand trends influence the world's dependence upon Middle Eastern oil. Proprietors of eleven economic models of the world oil market used their respective models to simulate a dozen scenarios using standardized assumptions. From its inception, the study was not designed to focus on the short-run impacts of disruptions on oil markets. Nor did the working group attempt to provide a forecast or just a single view of the likely future path for oil prices. The model results guided the group's thinking about many important longer-run market relationships and helped to identify differences of opinion about future oil supplies, demands, and dependence.

  7. Comparing Standard and Selective Degradation DNA Extraction Methods: Results from a Field Experiment with Sexual Assault Kits.

    PubMed

    Campbell, Rebecca; Pierce, Steven J; Sharma, Dhruv B; Shaw, Jessica; Feeney, Hannah; Nye, Jeffrey; Schelling, Kristin; Fehler-Cabral, Giannina

    2017-01-01

    A growing number of U.S. cities have large numbers of untested sexual assault kits (SAKs) in police property facilities. Testing older kits and maintaining current case work will be challenging for forensic laboratories, creating a need for more efficient testing methods. We evaluated selective degradation methods for DNA extraction using actual case work from a sample of previously unsubmitted SAKs in Detroit, Michigan. We randomly assigned 350 kits to either standard or selective degradation testing methods and then compared DNA testing rates and CODIS entry rates between the two groups. Continuation-ratio modeling showed no significant differences, indicating that the selective degradation method had no decrement in performance relative to customary methods. Follow-up equivalence tests indicated that CODIS entry rates for the two methods could differ by more than ±5%. Selective degradation methods required less personnel time for testing and scientific review than standard testing. © 2016 American Academy of Forensic Sciences.

  8. Towards a Policy Framework for Decent Work.

    ERIC Educational Resources Information Center

    Egger, Philippe

    2002-01-01

    International Labour Organization (ILO) standards for decent work promote social justice and humane working conditions. These standards can contribute to sustainable development, macroeconomic security, and fairer distribution of benefits from growth. The ILO is working for policy integration and promotion of international labor standards as a…

  9. A randomized controlled trial of a Return-to-Work Coordinator model of care in a general hospital to facilitate return to work of injured workers.

    PubMed

    Tan, Heidi Siew Khoon; Yeo, Doreen Sai Ching; Giam, Joanna Yu Ting; Cheong, Florence Wai Fong; Chan, Kay Fei

    2016-04-07

    Return-to-work (RTW) programmes for injured workers have been prevalent in Western countries with established work injury management policies for decades. In recent years, more Asian countries have started to develop RTW programmes in the absence of work injury management policies. However, few studies have evaluated the effectiveness of RTW programmes in Asia. Return-to-work coordination has been found to be an important facilitator in RTW programmes. This study seeks to determine the effectiveness of a Return-to-work coordinator (RTWC) model of care in facilitating early RTW for injured workers in Singapore. A randomized controlled trial was used. 160 injured workers in a general hospital were randomly allocated to either a control group (receiving usual hospital standard care) or an intervention group (assigned a RTWC). The RTWC closely supported RTW arrangements and proactively liaised with employers and healthcare professionals on RTW solutions for the injured workers. At three months post injury, workers in the intervention group returned to work 10 days earlier than the control group, with a higher proportion of workers in the intervention group returning to modified jobs. There were no significant differences in the quality of life measures between the two groups. The addition of a RTWC into the hospital model of care is effective in facilitating early RTW for injured workers. This could be a potential model of care for injured workers in Asian countries where work injury management policies are not yet established.

  10. The employer's decision to provide health insurance under the health reform law.

    PubMed

    Pang, Gaobo; Warshawsky, Mark J

    2013-01-01

    This article considers the employer's decision to continue or to drop health insurance coverage for its workers under the provisions of the 2010 health reform law, on the presumption that the primary influence on that decision is what will produce a higher worker standard of living during working years and retirement. The authors incorporate the most recent empirical estimates of health care costs into their long-horizon, optimal savings consumption model for workers. Their results show that the employer sponsorship of health plans is valuable for maintaining a consistent and higher living standard over the life cycle for middle- and upper-income households considered here, whereas exchange-purchased and subsidized coverage is more beneficial for lower income households (roughly 4-6% of illustrative single workers and 15-22% of working families).

  11. A probabilistic assessment of health risks associated with short-term exposure to tropospheric ozone

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whitfield, R.G; Biller, W.F.; Jusko, M.J.

    1996-06-01

    The work described in this report is part of a larger risk assessment sponsored by the U.S. Environmental Protection Agency. Earlier efforts developed exposure-response relationships for acute health effects among populations engaged in heavy exertion. Those efforts also developed a probabilistic national ambient air quality standards exposure model and a general methodology for integrating probabilistic exposure-response relationships and exposure estimates to calculate overall risk results. Recently published data make it possible to model additional health endpoints (for exposure at moderate exertion), including hospital admissions. New air quality and exposure estimates for alternative national ambient air quality standards for ozone are combined with exposure-response models to produce the risk results for hospital admissions and acute health effects. Sample results explain the methodology and introduce risk output formats.

  12. Revisiting an old concept: the coupled oscillator model for VCD. Part 1: the generalised coupled oscillator mechanism and its intrinsic connection to the strength of VCD signals.

    PubMed

    Nicu, Valentin Paul

    2016-08-03

    Motivated by the renewed interest in the coupled oscillator (CO) model for VCD, in this work a generalised coupled oscillator (GCO) expression is derived by introducing the concept of a coupled oscillator origin. Unlike the standard CO expression, the GCO expression is exact within the harmonic approximation. Using two illustrative example molecules, the theoretical concepts introduced here are demonstrated by performing a GCO decomposition of the rotational strengths computed using DFT. This analysis shows that: (1) the contributions to the rotational strengths that are normally neglected in the standard CO model can be comparable to or larger than the CO contribution, and (2) the GCO mechanism introduced here can affect the VCD intensities of all types of modes in symmetric and asymmetric molecules.

  13. Towards Accurate Modelling of Galaxy Clustering on Small Scales: Testing the Standard ΛCDM + Halo Model

    NASA Astrophysics Data System (ADS)

    Sinha, Manodeep; Berlind, Andreas A.; McBride, Cameron K.; Scoccimarro, Roman; Piscionere, Jennifer A.; Wibking, Benjamin D.

    2018-04-01

    Interpreting the small-scale clustering of galaxies with halo models can elucidate the connection between galaxies and dark matter halos. Unfortunately, the modelling is typically not sufficiently accurate for ruling out models statistically. It is thus difficult to use the information encoded in small scales to test cosmological models or probe subtle features of the galaxy-halo connection. In this paper, we attempt to push halo modelling into the "accurate" regime with a fully numerical mock-based methodology and careful treatment of statistical and systematic errors. With our forward-modelling approach, we can incorporate clustering statistics beyond the traditional two-point statistics. We use this modelling methodology to test the standard ΛCDM + halo model against the clustering of SDSS DR7 galaxies. Specifically, we use the projected correlation function, group multiplicity function and galaxy number density as constraints. We find that while the model fits each statistic separately, it struggles to fit them simultaneously. Adding group statistics leads to a more stringent test of the model and significantly tighter constraints on model parameters. We explore the impact of varying the adopted halo definition and cosmological model and find that changing the cosmology makes a significant difference. The most successful model we tried (Planck cosmology with Mvir halos) matches the clustering of low luminosity galaxies, but exhibits a 2.3σ tension with the clustering of luminous galaxies, thus providing evidence that the "standard" halo model needs to be extended. This work opens the door to adding interesting freedom to the halo model and including additional clustering statistics as constraints.

  14. 14 CFR Appendix A to Subpart B of... - Contract Provisions

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... suspected or reported violations to the NASA. 4. Contract Work Hours and Safety Standards Act (40 U.S.C. 327... laborer on the basis of a standard work week of 40 hours. Work in excess of the standard work week is... pay for all hours worked in excess of 40 hours in the work week. Section 107 of the Act is applicable...

  15. Information object definition-based unified modeling language representation of DICOM structured reporting: a case study of transcoding DICOM to XML.

    PubMed

    Tirado-Ramos, Alfredo; Hu, Jingkun; Lee, K P

    2002-01-01

    Supplement 23 to DICOM (Digital Imaging and Communications for Medicine), Structured Reporting, is a specification that supports a semantically rich representation of image and waveform content, enabling experts to share image and related patient information. DICOM SR supports the representation of textual and coded data linked to images and waveforms. Nevertheless, the medical information technology community needs models that work as bridges between the DICOM relational model and open object-oriented technologies. The authors assert that representations of the DICOM Structured Reporting standard, using object-oriented modeling languages such as the Unified Modeling Language, can provide a high-level reference view of the semantically rich framework of DICOM and its complex structures. They have produced an object-oriented model to represent the DICOM SR standard and have derived XML-exchangeable representations of this model using World Wide Web Consortium specifications. They expect the model to benefit developers and system architects who are interested in developing applications that are compliant with the DICOM SR specification.
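    The mapping from an object-oriented SR model to an XML-exchangeable representation can be illustrated with a minimal sketch. The class and element names below are hypothetical stand-ins, not the authors' UML model or the actual DICOM SR schema:

```python
# Minimal sketch of an object-oriented view of a DICOM SR content item
# serialized to XML. Class and element names are illustrative only.
import xml.etree.ElementTree as ET
from dataclasses import dataclass, field
from typing import List

@dataclass
class ContentItem:
    value_type: str          # e.g. "TEXT", "CODE", "IMAGE"
    concept_name: str        # coded concept naming the item
    value: str               # the item's value
    children: List["ContentItem"] = field(default_factory=list)

    def to_xml(self) -> ET.Element:
        # Each content item becomes an element; children nest recursively,
        # mirroring the tree structure of an SR document.
        el = ET.Element("ContentItem", valueType=self.value_type)
        ET.SubElement(el, "ConceptName").text = self.concept_name
        ET.SubElement(el, "Value").text = self.value
        for child in self.children:
            el.append(child.to_xml())
        return el

root = ContentItem("CONTAINER", "Chest X-Ray Report", "", [
    ContentItem("TEXT", "Finding", "No acute disease."),
])
print(ET.tostring(root.to_xml(), encoding="unicode"))
```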

  16. Standardized acquisition, storing and provision of 3D enabled spatial data

    NASA Astrophysics Data System (ADS)

    Wagner, B.; Maier, S.; Peinsipp-Byma, E.

    2017-05-01

    In the area of working with spatial data, in addition to classic two-dimensional geometrical data (maps, aerial images, etc.), the need for three-dimensional spatial data (city models, digital elevation models, etc.) is increasing. Due to this increased demand, the acquisition, storage and provision of 3D-enabled spatial data in Geographic Information Systems (GIS) is more and more important. Existing proprietary solutions quickly reach their limits during data exchange and data delivery to other systems; they generate a large workload, which is very costly. However, these expenses and costs can generally be reduced significantly by using standards. The aim of this research is therefore to develop a concept in the field of three-dimensional spatial data that builds on existing standards wherever possible. In this research, military image analysts are the preferred user group of the system. To achieve the objective of the widest possible use of standards for spatial 3D data, existing standards, proprietary interfaces and standards under discussion were analyzed. Since the GIS of the Fraunhofer IOSB used here already supports OGC (Open Geospatial Consortium) and NATO STANAG (NATO Standardization Agreement) standards for the most part, special attention was paid to their standards. The most promising standard is the OGC standard 3DPS (3D Portrayal Service) with its occurrences W3DS (Web 3D Service) and WVS (Web View Service). A demo system was created using a standardized workflow from data acquisition through storage to provision, showing the benefit of our approach.

  17. DSMC study of oxygen shockwaves based on high-fidelity vibrational relaxation and dissociation models

    NASA Astrophysics Data System (ADS)

    Borges Sebastião, Israel; Kulakhmetov, Marat; Alexeenko, Alina

    2017-01-01

    This work evaluates high-fidelity vibrational-translational (VT) energy relaxation and dissociation models for pure O2 normal shockwave simulations with the direct simulation Monte Carlo (DSMC) method. The O2-O collisions are described using ab initio state-specific relaxation and dissociation models. The Macheret-Fridman (MF) dissociation model is adapted to the DSMC framework by modifying the standard implementation of the total collision energy (TCE) model. The O2-O2 dissociation is modeled with this TCE+MF approach, which is calibrated with O2-O ab initio data and experimental equilibrium dissociation rates. The O2-O2 vibrational relaxation is modeled via the Larsen-Borgnakke model, calibrated to experimental VT rates. All the present results are compared to experimental data and previous calculations available in the literature. It is found that, in general, the ab initio dissociation model is better than the TCE model at matching the shock experiments. Therefore, when available, efficient ab initio models are preferred over phenomenological models. We also show that the proposed TCE + MF formulation can be used to improve the standard TCE model results when ab initio data are not available or limited.

  18. Promoting consistent use of the communication function classification system (CFCS).

    PubMed

    Cunningham, Barbara Jane; Rosenbaum, Peter; Hidecker, Mary Jo Cooley

    2016-01-01

    We developed a Knowledge Translation (KT) intervention to standardize the way speech-language pathologists working in Ontario Canada's Preschool Speech and Language Program (PSLP) used the Communication Function Classification System (CFCS). This tool was being used as part of a provincial program evaluation, and standardizing its use was critical for establishing reliability and validity within the provincial dataset. Two theoretical foundations - Diffusion of Innovations and the Communication Persuasion Matrix - were used to develop and disseminate the intervention to standardize use of the CFCS among a cohort of speech-language pathologists. A descriptive pre-test/post-test study was used to evaluate the intervention. Fifty-two participants completed an electronic pre-test survey, reviewed intervention materials online, and then immediately completed an electronic post-test survey. The intervention improved clinicians' understanding of how the CFCS should be used, their intentions to use the tool in the standardized way, and their ability to make correct classifications using the tool. Findings from this work will be shared with representatives of the Ontario PSLP, and the intervention may be disseminated to all speech-language pathologists working in the program. This study can be used as a model for developing and disseminating KT interventions for clinicians in paediatric rehabilitation. The CFCS is a new tool that allows speech-language pathologists to classify children's skills into five meaningful levels of function. There is uncertainty and inconsistent practice in the field about the methods for using this tool. This study combined two theoretical frameworks to develop an intervention to standardize use of the CFCS among a cohort of speech-language pathologists. The intervention effectively increased clinicians' understanding of the methods for using the CFCS, their ability to make correct classifications, and their intention to use the tool in the standardized way in the future.

  19. Multivariate calibration standardization across instruments for the determination of glucose by Fourier transform near-infrared spectrometry.

    PubMed

    Zhang, Lin; Small, Gary W; Arnold, Mark A

    2003-11-01

    The transfer of multivariate calibration models is investigated between a primary (A) and two secondary Fourier transform near-infrared (near-IR) spectrometers (B, C). The application studied in this work is the use of bands in the near-IR combination region of 5000-4000 cm(-)(1) to determine physiological levels of glucose in a buffered aqueous matrix containing varying levels of alanine, ascorbate, lactate, triacetin, and urea. The three spectrometers are used to measure 80 samples produced through a randomized experimental design that minimizes correlations between the component concentrations and between the concentrations of glucose and water. Direct standardization (DS), piecewise direct standardization (PDS), and guided model reoptimization (GMR) are evaluated for use in transferring partial least-squares calibration models developed with the spectra of 64 samples from the primary instrument to the prediction of glucose concentrations in 16 prediction samples measured with each secondary spectrometer. The three algorithms are evaluated as a function of the number of standardization samples used in transferring the calibration models. Performance criteria for judging the success of the calibration transfer are established as the standard error of prediction (SEP) for internal calibration models built with the spectra of the 64 calibration samples collected with each secondary spectrometer. These SEP values are 1.51 and 1.14 mM for spectrometers B and C, respectively. When calibration standardization is applied, the GMR algorithm is observed to outperform DS and PDS. With spectrometer C, the calibration transfer is highly successful, producing an SEP value of 1.07 mM. However, an SEP of 2.96 mM indicates unsuccessful calibration standardization with spectrometer B. This failure is attributed to differences in the variance structure of the spectra collected with spectrometers A and B. Diagnostic procedures are presented for use with the GMR algorithm that forecast the successful calibration transfer with spectrometer C and the unsatisfactory results with spectrometer B.
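    Of the transfer methods compared, direct standardization (DS) is the simplest to sketch: a least-squares matrix maps spectra measured on a secondary instrument onto the primary instrument's response. The sketch below uses synthetic spectra, not the near-IR glucose data, and omits PDS and GMR:

```python
# Sketch of direct standardization (DS) for calibration transfer,
# using synthetic spectra (not the article's near-IR glucose data).
import numpy as np

rng = np.random.default_rng(0)
n_std, n_chan = 10, 50          # standardization samples, spectral channels

S_primary = rng.normal(size=(n_std, n_chan))        # spectra on instrument A
distortion = np.eye(n_chan) + 0.05 * rng.normal(size=(n_chan, n_chan))
S_secondary = S_primary @ distortion                # same samples on instrument B

# DS transform: least-squares matrix mapping secondary spectra onto primary.
# By construction, S_secondary @ F reproduces S_primary for the
# standardization set.
F = np.linalg.pinv(S_secondary) @ S_primary

# A new spectrum measured on B, corrected to resemble a measurement on A,
# so the primary instrument's calibration model can be applied to it.
x_new_B = rng.normal(size=n_chan) @ distortion
x_corrected = x_new_B @ F
```

    With few standardization samples the transform is rank-deficient, which is one reason piecewise and model-based approaches such as PDS and GMR can outperform plain DS.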

  20. Impacts of Climate Policy on Regional Air Quality, Health, and Air Quality Regulatory Procedures

    NASA Astrophysics Data System (ADS)

    Thompson, T. M.; Selin, N. E.

    2011-12-01

    Both the changing climate, and the policy implemented to address climate change can impact regional air quality. We evaluate the impacts of potential selected climate policies on modeled regional air quality with respect to national pollution standards, human health and the sensitivity of health uncertainty ranges. To assess changes in air quality due to climate policy, we couple output from a regional computable general equilibrium economic model (the US Regional Energy Policy [USREP] model), with a regional air quality model (the Comprehensive Air Quality Model with Extensions [CAMx]). USREP uses economic variables to determine how potential future U.S. climate policy would change emissions of regional pollutants (CO, VOC, NOx, SO2, NH3, black carbon, and organic carbon) from ten emissions-heavy sectors of the economy (electricity, coal, gas, crude oil, refined oil, energy intensive industry, other industry, service, agriculture, and transportation [light duty and heavy duty]). Changes in emissions are then modeled using CAMx to determine the impact on air quality in several cities in the Northeast US. We first calculate the impact of climate policy by using regulatory procedures used to show attainment with National Ambient Air Quality Standards (NAAQS) for ozone and particulate matter. Building on previous work, we compare those results with the calculated results and uncertainties associated with human health impacts due to climate policy. This work addresses a potential disconnect between NAAQS regulatory procedures and the cost/benefit analysis required for and by the Clean Air Act.

  1. Development of a standardized, citywide process for managing smart-pump drug libraries.

    PubMed

    Walroth, Todd A; Smallwood, Shannon; Arthur, Karen; Vance, Betsy; Washington, Alana; Staublin, Therese; Haslar, Tammy; Reddan, Jennifer G; Fuller, James

    2018-06-15

    Development and implementation of an interprofessional consensus-driven process for review and optimization of smart-pump drug libraries and dosing limits are described. The Indianapolis Coalition for Patient Safety (ICPS), which represents 6 Indianapolis-area health systems, identified an opportunity to reduce clinically insignificant alerts that smart infusion pumps present to end users. Through a consensus-driven process, ICPS aimed to identify best practices to implement at individual hospitals in order to establish specific action items for smart-pump drug library optimization. A work group of pharmacists, nurses, and industrial engineers met to evaluate variability within and lack of scrutiny of smart-pump drug libraries. The work group used Lean Six Sigma methodologies to generate a list of key needs and barriers to be addressed in process standardization. The group reviewed targets for smart-pump drug library optimization, including dosing limits, types of alerts reviewed, policies, and safety best practices. The work group also analyzed existing processes at each site to develop a final consensus statement outlining a model process for reviewing alerts and managing smart-pump data. Analysis of the total number of alerts per device across ICPS-affiliated health systems over a 4-year period indicated a 50% decrease (from 7.2 to 3.6 alerts per device per month) after implementation of the model by ICPS member organizations. Through implementation of a standardized, consensus-driven process for smart-pump drug library optimization, ICPS member health systems reduced clinically insignificant smart-pump alerts. Copyright © 2018 by the American Society of Health-System Pharmacists, Inc. All rights reserved.

  2. A POD reduced order model for resolving angular direction in neutron/photon transport problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buchan, A.G., E-mail: andrew.buchan@imperial.ac.uk; Calloo, A.A.; Goffin, M.G.

    2015-09-01

    This article presents the first Reduced Order Model (ROM) that efficiently resolves the angular dimension of the time-independent, mono-energetic Boltzmann Transport Equation (BTE). It is based on Proper Orthogonal Decomposition (POD) and uses the method of snapshots to form optimal basis functions for resolving the direction of particle travel in neutron/photon transport problems. A unique element of this work is that the snapshots are formed from the vector of angular coefficients relating to a high resolution expansion of the BTE's angular dimension. In addition, the individual snapshots are not recorded through time, as in standard POD, but instead they are recorded through space. In essence this work swaps the roles of the dimensions space and time in standard POD methods, with angle and space respectively. It is shown here how the POD model can be formed from the POD basis functions in a highly efficient manner. The model is then applied to two radiation problems; one involving the transport of radiation through a shield and the other through an infinite array of pins. Both problems are selected for their complex angular flux solutions in order to provide an appropriate demonstration of the model's capabilities. It is shown that the POD model can resolve these fluxes efficiently and accurately. In comparison to high resolution models this POD model can reduce the size of a problem by up to two orders of magnitude without compromising accuracy. Solving times are also reduced by similar factors.

  3. A bias correction for covariance estimators to improve inference with generalized estimating equations that use an unstructured correlation matrix.

    PubMed

    Westgate, Philip M

    2013-07-20

    Generalized estimating equations (GEEs) are routinely used for the marginal analysis of correlated data. The efficiency of GEE depends on how closely the working covariance structure resembles the true structure, and therefore accurate modeling of the working correlation of the data is important. A popular approach is the use of an unstructured working correlation matrix, as it is not as restrictive as simpler structures such as exchangeable and AR-1 and thus can theoretically improve efficiency. However, because of the potential for having to estimate a large number of correlation parameters, variances of regression parameter estimates can be larger than theoretically expected when utilizing the unstructured working correlation matrix. Therefore, standard error estimates can be negatively biased. To account for this additional finite-sample variability, we derive a bias correction that can be applied to typical estimators of the covariance matrix of parameter estimates. Via simulation and in application to a longitudinal study, we show that our proposed correction improves standard error estimation and statistical inference. Copyright © 2012 John Wiley & Sons, Ltd.
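    The unstructured working correlation discussed above is commonly estimated by averaging outer products of standardized residuals across clusters. The sketch below illustrates that moment estimator on synthetic data; the article's bias correction itself is not reproduced:

```python
# Sketch of the moment estimator for an unstructured working correlation
# in GEE: average outer products of standardized residuals across clusters.
# Synthetic data with an exchangeable true correlation of 0.5.
import numpy as np

rng = np.random.default_rng(2)
n_clusters, cluster_size = 200, 4

true_R = 0.5 * np.ones((cluster_size, cluster_size))
np.fill_diagonal(true_R, 1.0)
L = np.linalg.cholesky(true_R)
residuals = rng.normal(size=(n_clusters, cluster_size)) @ L.T

# Unstructured estimate: one free parameter per pair (j, k)
R_hat = (residuals.T @ residuals) / n_clusters
d = np.sqrt(np.diag(R_hat))
R_hat = R_hat / np.outer(d, d)   # normalize to unit diagonal
print(R_hat.round(2))
```

    The number of estimated pairs grows quadratically with cluster size, which is exactly the source of the extra finite-sample variability that the proposed covariance bias correction targets.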

  4. Modeling of Cavitating Flow through Waterjet Propulsors

    DTIC Science & Technology

    2015-02-18

    Lindau, Jules W. The Pennsylvania State University, Applied Research Laboratory, State College, PA. [Record consists of a Standard Form 298 report documentation page; no abstract is included.]

  5. A Framework for Building and Reasoning with Adaptive and Interoperable PMESII Models

    DTIC Science & Technology

    2007-11-01

    [Fragments from the report's acronym list and text:] DL, Description Logic; SOA, Service Oriented Architecture; SPARQL, Simple Protocol And RDF Query Language; SQL, Standard Query Language; SROM, Stability and ... One model can be made interoperable with another by providing a more expressive ontological structure for one of the models, e.g., semantic networks can be mapped to first-order logic. Pellet is an open-source reasoner that works with OWL-DL; it accepts the SPARQL protocol and RDF query language (SPARQL) and provides a Java API.

  6. An avionics scenario and command model description for Space Generic Open Avionics Architecture (SGOAA)

    NASA Technical Reports Server (NTRS)

    Stovall, John R.; Wray, Richard B.

    1994-01-01

    This paper presents a description of a model for a space vehicle operational scenario and the commands for avionics. This model will be used in developing a dynamic architecture simulation model using the Statemate CASE tool for validation of the Space Generic Open Avionics Architecture (SGOAA). The SGOAA has been proposed as an avionics architecture standard to NASA through its Strategic Avionics Technology Working Group (SATWG) and has been accepted by the Society of Automotive Engineers (SAE) for conversion into an SAE Avionics Standard. This architecture was developed for the Flight Data Systems Division (FDSD) of the NASA Johnson Space Center (JSC) by the Lockheed Engineering and Sciences Company (LESC), Houston, Texas. This SGOAA includes a generic system architecture for the entities in spacecraft avionics, a generic processing external and internal hardware architecture, and a nine class model of interfaces. The SGOAA is both scalable and recursive and can be applied to any hierarchical level of hardware/software processing systems.

  7. [THE MOTIVATION OF MEDICAL PERSONNEL OF MULTIFIELD HOSPITAL TO WORKING OVER STANDARDS OF SINGLE JOB POSITION].

    PubMed

    Khazov, M V; Romanov, S V; Abaeva, O P; Murigina, M M

    2015-01-01

    The article presents the results of a study on the prevalence of physicians working beyond the standards of a single job position in a multifield hospital, including the factors motivating them to take on extra work. The purpose of the research was to analyze the impact of the gender and age structure of the medical personnel of a multifield public medical organization on physicians' motivation to work beyond the standards of a single job position. The objectives were to analyze the prevalence of over-standard work among the medical personnel of a multifield public medical organization with consideration of social structure, and to study the factors motivating physicians to work beyond the standards of a single job position. The study was carried out as a questionnaire survey of physicians. The results testify to the high prevalence of working beyond the standards of a single job position in modern health care: 64.8 ± 3.6% of respondents work under conditions of internal and/or external moonlighting. Moreover, one third of physicians are enlisted for extra work. Male physicians more often than female work under conditions of moonlighting, perform extra work, and are enlisted to work on days off. Specialists aged 35 to 54 years work under conditions of external and internal moonlighting more often than younger physicians. For physicians, the most significant motive for moonlighting is additional earnings; at the same time, every fifth physician works beyond the standards of a job position in order to increase his or her professional competence. The results permit the conclusion that the social structure of the modern medical staff significantly impacts motivation, and hence the possibility of enlisting workers to work beyond the standards of a single job position.

  8. A Standard-Driven Data Dictionary for Data Harmonization of Heterogeneous Datasets in Urban Geological Information Systems

    NASA Astrophysics Data System (ADS)

    Liu, G.; Wu, C.; Li, X.; Song, P.

    2013-12-01

    The 3D urban geological information system has been a major part of the national urban geological survey project of the China Geological Survey in recent years. Large amounts of multi-source and multi-subject data are to be stored in urban geological databases, and various models and vocabularies have been drafted and applied by industrial companies for urban geological data. Issues such as duplicate and ambiguous definitions of terms and differing coding structures increase the difficulty of information sharing and data integration. To solve this problem, we proposed a national-standard-driven information classification and coding method to effectively store and integrate urban geological data, and we applied data dictionary technology to achieve structured and standard data storage. The overall purpose of this work is to set up a common data platform providing information-sharing services. Research progress is as follows: (1) A unified classification and coding method for multi-source data based on national standards. The underlying national standards include GB 9649-88 for geology and GB/T 13923-2006 for geography. Current industrial models are compared with the national standards to build a mapping table. The attributes of the various urban geological data entity models are reduced to several categories according to their application phases and domains; a logical data model is then set up as a standard format to design data file structures for a relational database. (2) A multi-level data dictionary for data standardization constraints. Three levels of data dictionary are designed: the model data dictionary manages system database files and eases maintenance of the whole database system; the attribute dictionary organizes the fields used in database tables; and the term and code dictionary provides a standard for the urban information system by adopting appropriate classification and coding methods. In addition, a comprehensive data dictionary manages system operation and security. (3) An extension of the system data management function based on the data dictionary. The data item constraint input function makes use of the standard term and code dictionary to produce standardized input. The attribute dictionary organizes all the fields of an urban geological information database to ensure consistent term use for fields. The model dictionary is used to automatically generate a database operation interface with standard semantic content via the term and code dictionary. The above method and technology have been applied to the construction of the Fuzhou Urban Geological Information System, South-East China, with satisfactory results.
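    A term-and-code dictionary of the kind described can be sketched as a simple lookup that harmonizes variant field names onto one standard term and code. The terms and codes below are hypothetical, not entries from GB 9649-88:

```python
# Illustrative sketch of a term-and-code dictionary used to harmonize
# heterogeneous field names onto one standard code. Terms and codes
# are hypothetical examples, not actual GB 9649-88 entries.
TERM_DICTIONARY = {
    # variant term        -> (standard term, standard code)
    "borehole depth":        ("drilling depth", "GE-1203"),
    "drill hole depth":      ("drilling depth", "GE-1203"),
    "elev.":                 ("elevation", "GE-0107"),
    "ground elevation":      ("elevation", "GE-0107"),
}

def standardize(field_name: str):
    """Map a source field name to its standard term and code."""
    key = field_name.strip().lower()
    if key not in TERM_DICTIONARY:
        raise KeyError(f"unmapped term: {field_name!r}")
    return TERM_DICTIONARY[key]

print(standardize("Drill Hole Depth"))   # -> ('drilling depth', 'GE-1203')
```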

  9. 24 CFR 206.47 - Property standards; repair work.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 2 2010-04-01 2010-04-01 false Property standards; repair work... Property standards; repair work. (a) Need for repairs. Properties must meet the applicable property... insured mortgage. (b) Assurance that repairs are made. The mortgage may be closed before the repair work...

  10. 77 FR 19008 - Guidelines for Home Energy Professionals: Standard Work Specifications for Single Family Energy...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-29

    ... Professionals: Standard Work Specifications for Single Family Energy Upgrades AGENCY: Office of Energy...: Standard Work Specifications for Single Family Energy Upgrades. This document is a set of work specifications applicable to energy efficiency retrofits of single family homes. These specifications are...

  11. Pleiades and OCO-2: Using Supercomputing Resources to Process OCO-2 Science Data

    NASA Technical Reports Server (NTRS)

    LaHaye, Nick

    2012-01-01

    For a period of ten weeks I had the opportunity to assist in research for the OCO-2 project on the Science Data Operations System team. This research involved writing a prototype interface that would serve as a model for the system implemented for the project's operations, provided that the system, when tested, worked properly and up to the team's standards. This paper gives the details of the research done and its results.

  12. Integrated data management for clinical studies: automatic transformation of data models with semantic annotations for principal investigators, data managers and statisticians.

    PubMed

    Dugas, Martin; Dugas-Breit, Susanne

    2014-01-01

    Design, execution and analysis of clinical studies involves several stakeholders with different professional backgrounds. Typically, principal investigators are familiar with standard office tools, data managers apply electronic data capture (EDC) systems and statisticians work with statistics software. Case report forms (CRFs) specify the data model of study subjects, evolve over time and consist of hundreds to thousands of data items per study. To avoid erroneous manual transformation work, a converting tool for different representations of study data models was designed. It can convert between office format, EDC and statistics format. In addition, it supports semantic annotations, which enable precise definitions for data items. A reference implementation is available as open source package ODMconverter at http://cran.r-project.org.

  13. 22 CFR Appendix A to Part 145 - Clauses for Contracts and Small Purchases Awarded by Recipient

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... suspected or reported violations to the Department. 4. Contract Work Hours and Safety Standards Act (40 U.S... mechanic and laborer on the basis of a standard work week of 40 hours. Work in excess of the standard work... basic rate of pay for all hours worked in excess of 40 hours in the work week. Section 107 of the Act is...

  14. Accelerated pharmacokinetic map determination for dynamic contrast enhanced MRI using frequency-domain based Tofts model.

    PubMed

    Vajuvalli, Nithin N; Nayak, Krupa N; Geethanath, Sairam

    2014-01-01

    Dynamic Contrast Enhanced Magnetic Resonance Imaging (DCE-MRI) is widely used in the diagnosis of cancer and is also a promising tool for monitoring tumor response to treatment. The Tofts model has become a standard for the analysis of DCE-MRI. The process of curve fitting employed in the Tofts equation to obtain the pharmacokinetic (PK) parameters is time-consuming for high resolution scans. Current work demonstrates a frequency-domain approach applied to the standard Tofts equation to speed-up the process of curve-fitting in order to obtain the pharmacokinetic parameters. The results obtained show that using the frequency domain approach, the process of curve fitting is computationally more efficient compared to the time-domain approach.
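    The speed-up rests on the convolution theorem: the standard Tofts model, Ct(t) = Ktrans * (Cp convolved with exp(-kep*t)), can be evaluated by multiplying FFTs instead of convolving in time, so each curve-fit iteration is cheaper. A minimal sketch with illustrative parameter values (not the paper's data):

```python
# Sketch of evaluating the standard Tofts model via FFT-based convolution:
# Ct(t) = Ktrans * (Cp (*) exp(-kep * t)). Parameter values are illustrative.
import numpy as np

dt = 1.0                              # s, sampling interval
t = np.arange(0, 300, dt)
# Toy arterial input function (gamma-variate-like shape)
Cp = np.exp(-t / 60.0) * (t / 10.0) * np.exp(1 - t / 10.0) / 10

Ktrans, kep = 0.25 / 60, 0.60 / 60    # per second (illustrative values)
kernel = np.exp(-kep * t)

# Time-domain (direct) convolution, truncated to the sampling window
Ct_direct = Ktrans * np.convolve(Cp, kernel)[: t.size] * dt

# Frequency-domain convolution: multiply FFTs, zero-padded to avoid wrap-around
n = 2 * t.size
Ct_fft = Ktrans * np.fft.irfft(np.fft.rfft(Cp, n) * np.fft.rfft(kernel, n))[: t.size] * dt

print(np.allclose(Ct_direct, Ct_fft))
```

    The direct convolution costs O(N^2) per evaluation while the FFT route costs O(N log N), which is where the computational saving during iterative fitting comes from.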

  15. Error modelling of quantum Hall array resistance standards

    NASA Astrophysics Data System (ADS)

    Marzano, Martina; Oe, Takehiko; Ortolano, Massimo; Callegaro, Luca; Kaneko, Nobu-Hisa

    2018-04-01

    Quantum Hall array resistance standards (QHARSs) are integrated circuits composed of interconnected quantum Hall effect elements that allow the realization of virtually arbitrary resistance values. In recent years, techniques were presented to efficiently design QHARS networks. An open problem is that of the evaluation of the accuracy of a QHARS, which is affected by contact and wire resistances. In this work, we present a general and systematic procedure for the error modelling of QHARSs, which is based on modern circuit analysis techniques and Monte Carlo evaluation of the uncertainty. As a practical example, this method of analysis is applied to the characterization of a 1 MΩ QHARS developed by the National Metrology Institute of Japan. Software tools are provided to apply the procedure to other arrays.
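    The Monte Carlo step can be sketched on a deliberately simple network: a plain series chain of quantized Hall elements with random contact and wire resistances. The real NMIJ 1 MΩ array uses multiple-series connections precisely to suppress such errors, and all values below are illustrative:

```python
# Monte Carlo sketch of how random contact and wire resistances perturb a
# simple series chain of quantized Hall elements. The network topology and
# parasitic values are illustrative, not the NMIJ 1 MOhm QHARS design.
import numpy as np

R_K = 25812.807              # Ohm, von Klitzing constant
R_element = R_K / 2          # nu = 2 plateau resistance per element
n_series = 10                # plain series chain (real QHARS use multiple
                             # connections that suppress these errors)

rng = np.random.default_rng(3)
n_trials = 100_000
# Each element contributes two contact/wire resistances, ~1 mOhm scale
parasitic = rng.uniform(0, 1e-3, size=(n_trials, 2 * n_series))

R_total = n_series * R_element + parasitic.sum(axis=1)
rel_dev = (R_total - n_series * R_element) / (n_series * R_element)

print(f"mean relative deviation: {rel_dev.mean():.3e}")
print(f"std  relative deviation: {rel_dev.std():.3e}")
```

    A systematic treatment, as in the article, would build the full network from circuit analysis and draw every contact and wire resistance from its measured distribution, yielding an uncertainty for the array as realized.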

  16. Progress toward a new beam measurement of the neutron lifetime

    NASA Astrophysics Data System (ADS)

    Hoogerheide, Shannon Fogwell

    2016-09-01

    Neutron beta decay is the simplest example of nuclear beta decay. A precise value of the neutron lifetime is important for consistency tests of the Standard Model and Big Bang Nucleosynthesis models. The beam neutron lifetime method requires the absolute counting of the decay protons in a neutron beam of precisely known flux. Recent work has resulted in improvements in both the neutron and proton detection systems that should permit a significant reduction in systematic uncertainties. A new measurement of the neutron lifetime using the beam method will be performed at the National Institute of Standards and Technology Center for Neutron Research. The projected uncertainty of this new measurement is 1 s. An overview of the measurement and the technical improvements will be discussed.

  17. New Evidence that CMEs are Self-Propelled Magnetic Bubbles

    NASA Technical Reports Server (NTRS)

    Moore, Ronald L.; Sterling, Alphonse C.; Seuss, Steven T.

    2007-01-01

    We briefly describe the "standard model" for the production of coronal mass ejections (CMEs), and our view of how it works. We then summarize pertinent recent results that we have found from SOHO observations of CMEs and the flares at the sources of these magnetic explosions. These results support our interpretation of the standard model: a CME is basically a self-propelled magnetic bubble, a low-beta plasmoid, that (1) is built and unleashed by the tether-cutting reconnection that builds and heats the coronal flare arcade, (2) can explode from a flare site that is far from centered under the full-blown CME in the outer corona, and (3) drives itself out into the solar wind by pushing on the surrounding coronal magnetic field.

  18. Standard model CP violation and cold electroweak baryogenesis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tranberg, Anders

    2011-10-15

    Using large-scale real-time lattice simulations, we calculate the baryon asymmetry generated at a fast, cold electroweak symmetry breaking transition. CP-violation is provided by the leading effective bosonic term resulting from integrating out the fermions in the Minimal Standard Model at zero-temperature, and performing a covariant gradient expansion [A. Hernandez, T. Konstandin, and M. G. Schmidt, Nucl. Phys. B812, 290 (2009).]. This is an extension of the work presented in [A. Tranberg, A. Hernandez, T. Konstandin, and M. G. Schmidt, Phys. Lett. B 690, 207 (2010).]. The numerical implementation is described in detail, and we address issues specifically related to using this CP-violating term in the context of Cold Electroweak Baryogenesis.

  19. Modeling PBX 9501 overdriven release experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tang, P.K.

    1997-11-01

    A high explosive (HE) performs work through the expansion of its detonation products. Along with the propagation of the detonation wave, the equation of state (EOS) of the products determines the HE performance in an engineering system. The authors show the failure of the standard Jones-Wilkins-Lee (JWL) equation of state in modeling the overdriven release experiments of PBX 9501. The deficiency can be traced back to the inability of the same EOS to match the shock pressure and the sound speed on the Hugoniot in the hydrodynamic regime above the Chapman-Jouguet pressure. After adding correction terms to the principal isentrope of the standard JWL EOS, the authors are able to remedy this shortcoming, and the simulation was successful.

  20. Dimensions of Credibility in Models and Simulations

    NASA Technical Reports Server (NTRS)

    Steele, Martin J.

    2008-01-01

    Based on the National Aeronautics and Space Administration's (NASA's) work in developing a standard for models and simulations (M&S), the subject of credibility in M&S became a distinct focus. This is an indirect result of the Space Shuttle Columbia Accident Investigation Board (CAIB), which eventually resulted in an action, among others, to improve the rigor of NASA's M&S practices. The focus of this action came to mean a standardized method for assessing and reporting results from any type of M&S. As is typical in the standards development process, this necessarily developed into defining a common terminology base, common documentation requirements (especially for M&S used in critical decision making), and a method for assessing the credibility of M&S results. What surfaced in the development of the NASA Standard was the various dimensions of credibility to consider when accepting the results from any model or simulation analysis. The eight generally relevant factors of credibility chosen in the NASA Standard proved to be only one aspect of the dimensionality of M&S credibility. At the next level of detail, full comprehension of some of the factors requires an understanding along a couple of dimensions as well. Included in this discussion are the prerequisites for the appropriate use of a given M&S, the choice of factors in credibility assessment with their inherent dimensionality, and minimum requirements for fully reporting M&S results.

  1. Development of an informatics infrastructure for data exchange of biomolecular simulations: architecture, data models and ontology

    PubMed Central

    Thibault, J. C.; Roe, D. R.; Eilbeck, K.; Cheatham, T. E.; Facelli, J. C.

    2015-01-01

    Biomolecular simulations aim to simulate structure, dynamics, interactions, and energetics of complex biomolecular systems. With the recent advances in hardware, it is now possible to use more complex and accurate models, but also reach time scales that are biologically significant. Molecular simulations have become a standard tool for toxicology and pharmacology research, but organizing and sharing data – both within the same organization and among different ones – remains a substantial challenge. In this paper we review our recent work leading to the development of a comprehensive informatics infrastructure to facilitate the organization and exchange of biomolecular simulations data. Our efforts include the design of data models and dictionary tools that allow the standardization of the metadata used to describe the biomedical simulations, the development of a thesaurus and ontology for computational reasoning when searching for biomolecular simulations in distributed environments, and the development of systems based on these models to manage and share the data at a large scale (iBIOMES), and within smaller groups of researchers at laboratory scale (iBIOMES Lite), that take advantage of the standardization of the metadata used to describe biomolecular simulations. PMID:26387907

  2. Development of an informatics infrastructure for data exchange of biomolecular simulations: Architecture, data models and ontology.

    PubMed

    Thibault, J C; Roe, D R; Eilbeck, K; Cheatham, T E; Facelli, J C

    2015-01-01

    Biomolecular simulations aim to simulate structure, dynamics, interactions, and energetics of complex biomolecular systems. With the recent advances in hardware, it is now possible to use more complex and accurate models, but also reach time scales that are biologically significant. Molecular simulations have become a standard tool for toxicology and pharmacology research, but organizing and sharing data - both within the same organization and among different ones - remains a substantial challenge. In this paper we review our recent work leading to the development of a comprehensive informatics infrastructure to facilitate the organization and exchange of biomolecular simulations data. Our efforts include the design of data models and dictionary tools that allow the standardization of the metadata used to describe the biomedical simulations, the development of a thesaurus and ontology for computational reasoning when searching for biomolecular simulations in distributed environments, and the development of systems based on these models to manage and share the data at a large scale (iBIOMES), and within smaller groups of researchers at laboratory scale (iBIOMES Lite), that take advantage of the standardization of the metadata used to describe biomolecular simulations.

  3. Difference between two species of emu hides a test for lepton flavour violation

    NASA Astrophysics Data System (ADS)

    Lester, Christopher G.; Brunt, Benjamin H.

    2017-03-01

    We argue that an LHC measurement of some simple quantities related to the ratio of rates of e+μ− to e−μ+ events is surprisingly sensitive to as-yet unexcluded R-parity violating supersymmetric models with non-zero λ′231 couplings. The search relies upon the approximate lepton universality in the Standard Model, the sign of the charge of the proton, and a collection of favourable detector biases. The proposed search is unusual because: it does not require any of the displaced vertices, hadronic neutralino decay products, or squark/gluino production relied upon by existing LHC RPV searches; it could work in cases in which the only light sparticles were smuons and neutralinos; and it could make a discovery (though not necessarily with optimal significance) without requiring the computation of a leading-order Monte Carlo estimate of any background rate. The LHC has shown no strong hints of post-Higgs physics, and so precision Standard Model measurements are becoming ever more important. We argue that in this environment growing profits are to be made from searches that place detector biases and symmetries of the Standard Model at their core: searches based around 'controls' rather than around signals.

  4. Measurement of ultra-low power oscillators using adaptive drift cancellation with applications to nano-magnetic spin torque oscillators.

    PubMed

    Tamaru, S; Ricketts, D S

    2013-05-01

    This work presents a technique for measuring ultra-low power oscillator signals using an adaptive drift cancellation method. We demonstrate this technique through spectrum measurements of a sub-pW nano-magnet spin torque oscillator (STO). We first present a detailed noise analysis of the standard STO characterization apparatus to estimate the background noise level, then compare these results to the noise level of three measurement configurations. The first and second share the standard configuration but use different spectrum analyzers (SA), an older model and a state-of-the-art model, respectively. The third is the technique proposed in this work using the same old SA as for the first. Our results show that the first and second configurations suffer from a large drift that requires ~30 min to stabilize each time the SA changes the frequency band, even though the SA has been powered on for longer than 24 h. The third configuration introduced in this work, however, shows absolutely no drift as the SA changes frequency band, and nearly the same noise performance as with a state-of-the-art SA, thus providing a reliable method for measuring very low power signals for a wide variety of applications.

  5. Prior robust empirical Bayes inference for large-scale data by conditioning on rank with application to microarray data

    PubMed Central

    Liao, J. G.; Mcmurry, Timothy; Berg, Arthur

    2014-01-01

    Empirical Bayes methods have been extensively used for microarray data analysis by modeling the large number of unknown parameters as random effects. Empirical Bayes allows borrowing information across genes and can automatically adjust for multiple testing and selection bias. However, the standard empirical Bayes model can perform poorly if the assumed working prior deviates from the true prior. This paper proposes a new rank-conditioned inference in which the shrinkage and confidence intervals are based on the distribution of the error conditioned on rank of the data. Our approach is in contrast to a Bayesian posterior, which conditions on the data themselves. The new method is almost as efficient as standard Bayesian methods when the working prior is close to the true prior, and it is much more robust when the working prior is not close. In addition, it allows a more accurate (but also more complex) non-parametric estimate of the prior to be easily incorporated, resulting in improved inference. The new method’s prior robustness is demonstrated via simulation experiments. Application to a breast cancer gene expression microarray dataset is presented. Our R package rank.Shrinkage provides a ready-to-use implementation of the proposed methodology. PMID:23934072
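
For context, the "standard empirical Bayes model" the abstract contrasts with can be sketched for the normal-means problem, where the prior's parameters are estimated from the marginal distribution of the data. This is a generic textbook sketch, not the authors' rank-conditioned method or their rank.Shrinkage package:

```python
import numpy as np

def eb_shrink(x, sigma2=1.0):
    """Parametric empirical Bayes for x_i ~ N(theta_i, sigma2),
    theta_i ~ N(mu, tau2). mu and tau2 are estimated from the data
    (method of moments); the posterior mean shrinks each x_i toward mu.
    If the assumed normal prior is far from the true prior, this
    shrinkage can perform poorly -- the failure mode the paper targets."""
    mu = x.mean()
    tau2 = max(x.var() - sigma2, 0.0)  # marginal var is tau2 + sigma2
    b = tau2 / (tau2 + sigma2)         # shrinkage factor in [0, 1)
    return mu + b * (x - mu)
```

The rank-conditioned approach instead conditions on the rank of each observation rather than its value, which is what buys the prior robustness described above.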

  6. Eye micromotions influence on an error of Zernike coefficients reconstruction in the one-ray refractometry of an eye

    NASA Astrophysics Data System (ADS)

    Osipova, Irina Y.; Chyzh, Igor H.

    2001-06-01

    The influence of eye jumps on the accuracy of estimating Zernike coefficients from eye transverse-aberration measurements was investigated. Ametropia and astigmatism were examined by computer modeling. The standard deviation of the wave aberration function was calculated. It was determined that the standard deviation of the wave aberration function reaches its minimum value when the number of scanning points is equal to the number of eye jumps in the scanning period. Recommendations for the duration of measurement were worked out.
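
Reconstructing Zernike coefficients from transverse-aberration measurements is, at its core, a least-squares fit of measured wavefront slopes to the analytic gradients of the basis modes. A minimal sketch with an illustrative two-mode basis (the paper's scanning geometry and mode set are not reproduced here):

```python
import numpy as np

def fit_modal_coeffs(slopes_x, slopes_y, grads_x, grads_y):
    """Least-squares modal wavefront fit: stack the x- and y-slope
    equations into one linear system A c = b, where column k of A
    holds the analytic gradient of mode k sampled at the scan points."""
    A = np.vstack([np.column_stack(grads_x), np.column_stack(grads_y)])
    b = np.concatenate([slopes_x, slopes_y])
    coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
    return coeffs
```

With, say, tilt (gradient (1, 0)) and defocus (gradient (2x, 2y)), the coefficients are recovered exactly from noise-free slopes; eye micromotions enter as errors in the assumed scan-point positions, which is what drives the standard deviation studied above.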

  7. Toward improved guideline quality: using the COGS statement with GEM.

    PubMed

    Shiffman, Richard N; Michel, Georges

    2004-01-01

    The Conference on Guideline Standardization (COGS) was convened to create a standardized documentation checklist for clinical practice guidelines in an effort to promote guideline quality and facilitate implementation. The statement was created by a multidisciplinary panel using a rigorous consensus development methodology. The Guideline Elements Model (GEM) provides a standardized approach to representing guideline documents using XML. In this work, we demonstrate the sufficiency of GEM for describing COGS components. Using the mapping between COGS and GEM elements we built an XSLT application to examine a guideline's adherence (or non-adherence) to the COGS checklist. Once a guideline has been marked up according to the GEM hierarchy, its knowledge content can be reused in multiple ways.

  8. Duality linking standard and tachyon scalar field cosmologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Avelino, P. P.; Bazeia, D.; Losano, L.

    2010-09-15

    In this work we investigate the duality linking standard and tachyon scalar field homogeneous and isotropic cosmologies in N+1 dimensions. We determine the transformation between standard and tachyon scalar fields and between their associated potentials, corresponding to the same background evolution. We show that, in general, the duality is broken at a perturbative level, when deviations from a homogeneous and isotropic background are taken into account. However, we find that for slow-rolling fields the duality is still preserved at a linear level. We illustrate our results with specific examples of cosmological relevance, where the correspondence between scalar and tachyon scalarmore » field models can be calculated explicitly.« less

  9. Occupational heat stress and associated productivity loss estimation using the PHS model (ISO 7933): a case study from workplaces in Chennai, India.

    PubMed

    Lundgren, Karin; Kuklane, Kalev; Venugopal, Vidhya

    2014-01-01

    Heat stress is a major occupational problem in India that can cause adverse health effects and reduce work productivity. This paper explores this problem and its impacts in selected workplaces, including the industrial, service, and agricultural sectors in Chennai, India. Quantitative measurements of heat stress, workload estimations, and clothing testing, together with qualitative information on health impacts, productivity loss, etc., were collected. Heat strain and associated impacts on labour productivity between the seasons were assessed using the International Standard ISO 7933:2004, which applies the Predicted Heat Strain (PHS) model. All workplaces surveyed had very high heat exposure in the hot season (Wet Bulb Globe Temperature = 29.7 °C), often reaching the international standard safe work values (ISO 7243:1989). Most workers had moderate to high workloads (170-220 W/m2), with some exposed to direct sun. Clothing was found to be problematic, with high insulation values in relation to the heat exposure. Females were found to be more vulnerable because of the extra insulation added by wearing a protective shirt on top of traditional clothing (0.96 clo) while working. When analysing heat strain, in terms of core temperature and dehydration, and associated productivity loss in the PHS model, the parameters showed significant impacts on productivity in all workplaces, apart from the laundry facility, especially during the hot season. For example, in the canteen, the core temperature limit of 38°C predicted by the model was reached in only 64 min for women. With the expected increases in temperature due to climate change, additional preventive actions will have to be implemented to prevent further productivity losses and adverse health impacts. Overall, this study presents insight into using a thermo-physiological model to estimate productivity loss due to heat exposure in workplaces. This is the first time the PHS model has been used for this purpose. An exploratory approach was taken for further development of the model.
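
The WBGT screening index referenced above (ISO 7243) is a fixed weighted sum of the natural wet-bulb, globe, and air temperatures; a direct transcription of the two standard formulas:

```python
def wbgt(t_nwb, t_globe, t_air=None):
    """Wet Bulb Globe Temperature (ISO 7243), in degrees C.
    Indoors or without solar load: 0.7*Tnwb + 0.3*Tg.
    Outdoors with solar load:      0.7*Tnwb + 0.2*Tg + 0.1*Ta."""
    if t_air is None:
        return 0.7 * t_nwb + 0.3 * t_globe
    return 0.7 * t_nwb + 0.2 * t_globe + 0.1 * t_air
```

The PHS model (ISO 7933) goes further: it predicts core temperature and water loss over time from metabolic rate, clothing insulation, and the individual climatic parameters, which is why it can be turned into the productivity-loss estimates used in this study while WBGT remains a screening-level index.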

  10. Dark Energy and Dark Matter Hidden in the Geometry of Space?

    NASA Astrophysics Data System (ADS)

    Buchert, Thomas

    A spatially flat and infinite Universe in the form of a "concordant" standard model of cosmology rules present-day thinking of cosmologists. The price to pay is an unknown physical origin of Dark Energy and Dark Matter that are supposed to exist and even appear to rule the dynamics of our Universe. A growing number of cosmologists question the existence of dark constituents: the standard model of cosmology may be just too simple, since it neglects the influence of structure in the Universe on its global expansion history. The key issue appears to be the curvature of space: the formation of structure interacts with the geometry of space, changing our global picture of the Universe. This chapter explains the underlying mechanism that works in the right direction to uncover the dark faces of the standard model of cosmology. If successful, this novel approach furnishes a new paradigm of modern cosmology. Hundreds of researchers have recently embarked into studies of this new subject. We understand much at present, but there are many open questions.

  11. Dark revelations of the [SU(3)]3 and [SU(3)]4 gauge extensions of the standard model

    NASA Astrophysics Data System (ADS)

    Kownacki, Corey; Ma, Ernest; Pollard, Nicholas; Popov, Oleg; Zakeri, Mohammadreza

    2018-02-01

    Two theoretically well-motivated gauge extensions of the standard model are SU(3)C × SU(3)L × SU(3)R and SU(3)q × SU(3)L × SU(3)l × SU(3)R, where SU(3)q is the same as SU(3)C and SU(3)l is its color leptonic counterpart. Each has three variations, according to how SU(3)R is broken. It is shown here for the first time that a built-in dark U(1)D gauge symmetry exists in all six versions. However, the corresponding symmetry breaking pattern does not reduce properly to that of the standard model, unless an additional Z2′ symmetry is defined, so that U(1)D × Z2′ is broken to Z2 dark parity. The available dark matter candidates in each case include fermions, scalars, as well as vector gauge bosons. This work points to the possible unity of matter with dark matter, the origin of which may not be ad hoc.

  12. Whole-Motion Model of Perception during Forward- and Backward-Facing Centrifuge Runs

    PubMed Central

    Holly, Jan E.; Vrublevskis, Arturs; Carlson, Lindsay E.

    2009-01-01

    Illusory perceptions of motion and orientation arise during human centrifuge runs without vision. Asymmetries have been found between acceleration and deceleration, and between forward-facing and backward-facing runs. Perceived roll tilt has been studied extensively during upright fixed-carriage centrifuge runs, and other components have been studied to a lesser extent. Certain, but not all, perceptual asymmetries in acceleration-vs-deceleration and forward-vs-backward motion can be explained by existing analyses. The immediate acceleration-deceleration roll-tilt asymmetry can be explained by the three-dimensional physics of the external stimulus; in addition, longer-term data has been modeled in a standard way using physiological time constants. However, the standard modeling approach is shown in the present research to predict forward-vs-backward-facing symmetry in perceived roll tilt, contradicting experimental data, and to predict perceived sideways motion, rather than forward or backward motion, around a curve. The present work develops a different whole-motion-based model taking into account the three-dimensional form of perceived motion and orientation. This model predicts perceived forward or backward motion around a curve, and predicts additional asymmetries such as the forward-backward difference in roll tilt. This model is based upon many of the same principles as the standard model, but includes an additional concept of familiarity of motions as a whole. PMID:19208962

  13. Electroweak standard model with very special relativity

    NASA Astrophysics Data System (ADS)

    Alfaro, Jorge; González, Pablo; Ávila, Ricardo

    2015-05-01

    The very special relativity electroweak Standard Model (VSR EW SM) is a theory with SU(2)L × U(1)R symmetry, with the same number of leptons and gauge fields as in the usual Weinberg-Salam model. No new particles are introduced. The model is renormalizable and unitarity is preserved. However, photons obtain mass and the massive bosons obtain different masses for different polarizations. Besides, neutrino masses are generated. A VSR-invariant term will produce neutrino oscillations, and new processes are allowed. In particular, we compute the rate of the decay μ → e + γ. All these processes, which are forbidden in the electroweak Standard Model, put stringent bounds on the parameters of our model and measure the violation of Lorentz invariance. We investigate the canonical quantization of this nonlocal model. Second quantization is carried out, and we obtain a well-defined particle content. Additionally, we count the degrees of freedom associated with the gauge bosons involved in this work, after spontaneous symmetry breaking has been realized. Violations of Lorentz invariance have been predicted by several theories of quantum gravity [J. Alfaro, H. Morales-Tecotl, and L. F. Urrutia, Phys. Rev. Lett. 84, 2318 (2000); Phys. Rev. D 65, 103509 (2002)]. It is a remarkable possibility that the low-energy effects of Lorentz violation induced by quantum gravity could be contained in the nonlocal terms of the VSR EW SM.

  14. Heat strain models applicable for protective clothing: Comparison of core temperature response. Technical report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gonzalez, R.R.; McLellan, T.M.; Withey, W.R.

    This report presents the results of TTCP-UTP6 efforts on modeling aspects that need to be considered when chemical protective ensembles are worn in warm environments. Since 1983, a significant database has been collected from human experimental studies with a wide range of clothing systems, from which predictive modeling equations have been developed for individuals working in temperate and hot environments, but few comparisons of the results from the various model outputs have ever been carried out. This initial comparison study was part of a key technical area (KTA) project for The Technical Cooperation Program (TTCP) UTP-6 working party. A modeling workshop was conducted in Toronto, Canada on 9-10 June 1994 to discuss the data reduction and results acquired in an initial TTCP clothing analysis study using various chemical protective garments. To our knowledge, no comprehensive study to date has focused on comparing experimental results obtained with an internationally standardized heat stress procedure against physiological outputs from various model predictions for individuals dressed in chemical protective clothing systems. This is the major focus of this TTCP key technical study. This technical report covers one aspect of the working party's results.

  15. Top quark rare decays via loop-induced FCNC interactions in extended mirror fermion model

    NASA Astrophysics Data System (ADS)

    Hung, P. Q.; Lin, Yu-Xiang; Nugroho, Chrisna Setyo; Yuan, Tzu-Chiang

    2018-02-01

    Flavor-changing neutral current (FCNC) interactions, in which a top quark t decays into Xq, where X represents a neutral gauge or Higgs boson and q an up or charm quark, are highly suppressed in the Standard Model (SM) due to the Glashow-Iliopoulos-Maiani mechanism. Whilst current limits on the branching ratios of these processes have been established at the order of 10^-4 by the Large Hadron Collider experiments, SM predictions are at least nine orders of magnitude below. In this work, we study some of these FCNC processes in the context of an extended mirror fermion model, originally proposed to implement the electroweak-scale seesaw mechanism for non-sterile right-handed neutrinos. We show that one can probe the process t → Zc for a wide range of parameter space, with branching ratios varying from 10^-6 to 10^-8, comparable with various new physics models including the general two-Higgs-doublet model with or without flavor violation at tree level, the minimal supersymmetric standard model with or without R-parity, and an extra-dimension model.

  16. Workflow standardization of a novel team care model to improve chronic care: a quasi-experimental study.

    PubMed

    Panattoni, Laura; Hurlimann, Lily; Wilson, Caroline; Durbin, Meg; Tai-Seale, Ming

    2017-04-19

    Team-based chronic care models have not been widely adopted in community settings, partly due to their varying effectiveness in randomized control trials, implementation challenges, and concerns about physician acceptance. The Palo Alto Medical Foundation designed and implemented "Champion," a novel team-based model that includes new standard work (e.g. proactive patient outreach, pre-visit schedule grooming, depression screening, care planning, health coaching) to support patients' self-management of hypertension and diabetes. We investigated whether Champion improved clinical outcomes. We conducted a quasi-experimental study comparing the Champion clinic-level intervention (n = 38 physicians) with a usual care clinic (n = 37 physicians) in Northern California. The primary outcomes, blood pressure and glycohemoglobin (A1c), were analyzed using a piecewise linear growth curve model for patients exposed to a Champion physician visit (n = 3156) or usual care visit (n = 8034) in the two years prior and one year post implementation. Secondary outcomes were provider experience, compared at baseline and 12 months in both the intervention and usual care clinics using multi-level ordered logistic modeling, and electronic health record based fidelity measures. Compared to usual care, in the first 6 months after a Champion physician visit, diabetes patients aged 18-75 experienced an additional -1.13 mm Hg (95% CI: -2.23 to -0.04) decline in diastolic blood pressure and -0.47 (95% CI: -0.61 to -0.33) decline in A1c. There were no additional improvements in blood pressure or A1c 6 to 12 months post physician visit. At 12 months, Champion physicians reported improved experience with managing chronic care patients in 6 of 7 survey items (p < 0.05), but compared to usual care, this difference was only statistically significant for one item (p < 0.05). 
Fidelity to standard work was uneven; depression screening was the most commonly documented element (85% of patients), while care plans were the least (30.8% of patients). Champion standard work improved glycemic control over the first 6 months and physicians' experience with managing chronic care; changes in blood pressure were not clinically meaningful. Our results suggest the need to understand the relationship between the intervention, the contextual features of implementation, and fidelity to further improve chronic disease outcomes. This study was retrospectively registered with the ISRCTN Registry on March 15, 2017 (ISRCTN11341906).

  17. Reduction of Endotracheal Tube Connector Dead Space Improves Ventilation: A Bench Test on a Model Lung Simulating an Extremely Low Birth Weight Neonate.

    PubMed

    Ivanov, Vadim A

    2016-02-01

    The reduction of instrumental dead space is a recognized approach to preventing ventilation-induced lung injury in premature infants. However, there are no published data regarding the effectiveness of instrumental dead-space reduction in endotracheal tube (ETT) connectors. We tested the impact of the Y-piece/ETT connector pairs with reduced instrumental dead space on CO2 elimination in a model of the premature neonate lung. The standard ETT connector was compared with a low-dead-space ETT connector and with a standard connector equipped with an insert. We compared the setups by measuring the CO2 elimination rate in an artificial lung ventilated via the connectors. The lung was connected to a ventilator via a standard circuit, a 2.5-mm ETT, and one of the connectors under investigation. The ventilator was run in volume-controlled continuous mandatory ventilation mode. The low-dead-space ETT connector/Y-piece and insert-equipped standard connector/Y-piece pairs had instrumental dead space reduced by 36 and 67%, respectively. With set tidal volumes (VT) of 2.5, 5, and 10 mL, in comparison with the standard ETT connector, the low-dead-space connector reduced CO2 elimination time by 4.5% (P < .05), 4.4% (P < .01), and 7.1% (not significant), respectively. The insert-equipped standard connector reduced CO2 elimination time by 13.5, 25.1, and 16.1% (all P < .01). The low-dead-space connector increased inspiratory resistance by 17.8% (P < .01), 9.6% (P < .05), and 5.0% (not significant); the insert-equipped standard connector increased inspiratory resistance by 9.1, 8.4, and 5.9% (all not significant). The low-dead-space connector decreased expiratory resistance by 6.8% (P < .01) and 1.8% (not significant) and increased it by 1.4% (not significant); the insert-equipped standard connector decreased expiratory resistance by 1.5 and 1% and increased it by 1% (all not significant). 
The low-dead-space connector increased work of breathing by 4.7% (P < .01), 3.8% (P < .01), and 2.5% (not significant); the insert-equipped standard connector increased it by 0.8% (not significant), 2.5% (P < .01), and 2.8% (P < .01). Both methods of instrumental dead-space reduction led to improvements in artificial lung ventilation. Negative effects on resistance and work of breathing appeared minimal. Further testing in vivo should be performed to confirm the lung model results and, if successful, translated into clinical practice. Copyright © 2016 by Daedalus Enterprises.
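
The reason connector dead space matters so much at these tidal volumes is simple arithmetic: only the fraction of each breath beyond the total dead space ventilates the alveoli. A sketch with illustrative numbers (not the study's measured values):

```python
def alveolar_minute_ventilation(vt_ml, vd_ml, rate_per_min):
    """V'A = (VT - VD) * f : the portion of each tidal volume VT that
    gets past the total dead space VD, times respiratory rate f."""
    return max(vt_ml - vd_ml, 0.0) * rate_per_min

# For a model neonate at VT = 5 mL and 60 breaths/min, trimming 1 mL of
# connector dead space from a 3 mL total raises alveolar ventilation
# from (5 - 3) * 60 = 120 mL/min to (5 - 2) * 60 = 180 mL/min, i.e. by 50%.
```

This is why a fixed absolute reduction in connector dead space yields the largest relative improvement at the smallest tidal volumes, consistent with the bench results above.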

  18. WAIS-IV subtest covariance structure: conceptual and statistical considerations.

    PubMed

    Ward, L Charles; Bergman, Maria A; Hebert, Katina R

    2012-06-01

    D. Wechsler (2008b) reported confirmatory factor analyses (CFAs) with standardization data (ages 16-69 years) for 10 core and 5 supplemental subtests from the Wechsler Adult Intelligence Scale-Fourth Edition (WAIS-IV). Analyses of the 15 subtests supported 4 hypothesized oblique factors (Verbal Comprehension, Working Memory, Perceptual Reasoning, and Processing Speed) but also revealed unexplained covariance between Block Design and Visual Puzzles (Perceptual Reasoning subtests). That covariance was not included in the final models. Instead, a path was added from Working Memory to Figure Weights (Perceptual Reasoning subtest) to improve fit and achieve a desired factor pattern. The present research with the same data (N = 1,800) showed that the path from Working Memory to Figure Weights increases the association between Working Memory and Matrix Reasoning. Specifying both paths improves model fit and largely eliminates unexplained covariance between Block Design and Visual Puzzles but with the undesirable consequence that Figure Weights and Matrix Reasoning are equally determined by Perceptual Reasoning and Working Memory. An alternative 4-factor model was proposed that explained theory-implied covariance between Block Design and Visual Puzzles and between Arithmetic and Figure Weights while maintaining compatibility with WAIS-IV Index structure. The proposed model compared favorably with a 5-factor model based on Cattell-Horn-Carroll theory. The present findings emphasize that covariance model comparisons should involve considerations of conceptual coherence and theoretical adherence in addition to statistical fit. (c) 2012 APA, all rights reserved

  19. How do people evaluate social sexual conduct at work? A psycholegal model.

    PubMed

    Wiener, R L; Hurt, L E

    2000-02-01

    The authors tested a psycholegal model of how people evaluate social sexual conduct at work with videotaped reenactments of interviews with alleged complainants, perpetrators, and other workers. Participants (200 full-time male and female workers) were randomly assigned to evaluate the complaints with either the reasonable person or reasonable woman legal standard. Participants answered questions about sexual harassment law and completed the Ambivalent Sexism Inventory. Participants who took the reasonable woman perspective, as compared with those who took the reasonable person perspective, were more likely to find the conduct harassing; this was especially the case among participants high in hostile sexism. Medium-sized gender effects were found in the severe case but were absent in the weaker, more ambiguous case. The implications of these findings for hostile work environment law are discussed.

  20. 76 FR 58078 - Thirteenth Meeting: RTCA Special Committee 214: Working Group 78: Standards for Air Traffic Data...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-19

    ... Committee 214: Working Group 78: Standards for Air Traffic Data Communication Services AGENCY: Federal Aviation Administration (FAA), DOT. ACTION: Notice of RTCA Special Committee 214: Working Group 78... public of a meeting of the RTCA Special Committee 214: Working Group 78: Standards for Air Traffic Data...

  1. University of Oklahoma - High Energy Physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Skubic, Patrick L.

    2013-07-31

The High Energy Physics program at the University of Oklahoma, Pat Skubic, Principal Investigator, is attempting to understand nature at the deepest level using the most advanced experimental and theoretical tools. The four experimental faculty, Brad Abbott, Phil Gutierrez, Pat Skubic, and Mike Strauss, together with post-doctoral associates and graduate students, are finishing their work as part of the D0 collaboration at Fermilab, and increasingly focusing their investigations at the Large Hadron Collider (LHC) as part of the ATLAS Collaboration. Work at the LHC has become even more exciting with the recent discovery by ATLAS and the other collaboration, CMS, of the long-sought Higgs boson, which plays a key role in generating masses for the elementary constituents of matter. Work of the OUHEP group has been in the three areas of hardware, software, and analysis. Now that the Higgs boson has been discovered, completing the Standard Model of fundamental physics, new efforts will focus on finding hints of physics beyond the Standard Model, such as supersymmetry. The OUHEP theory group (Kim Milton, PI) also consists of four faculty members, Howie Baer, Chung Kao, Kim Milton, and Yun Wang, and associated students and postdocs. They are involved in understanding fundamental issues in formulating theories of the microworld, and in proposing models that carry us past the Standard Model, which is an incomplete description of nature. They therefore work in close concert with their experimental colleagues. One can also study fundamental physics by looking at the large-scale structure of the universe; in particular, the "dark energy" that seems to be causing the universe to expand at an accelerating rate effectively makes up about 3/4 of the energy in the universe, and yet is totally unidentified.
Dark energy and dark matter, which together account for nearly all of the energy in the universe, are an important probe of fundamental physics at the very shortest distances, or at the very highest energies. The outcomes of the group's combined experimental and theoretical research will be an improved understanding of nature, at the highest energies reachable, from which applications to technological innovation will surely result, as they always have from such studies in the past.

  2. Meeting Youth in Movement and on Neutral Ground

    ERIC Educational Resources Information Center

    Nissen, Morten

    2015-01-01

    The article articulates an educational motto--expressed in the title--found in a "prototypical narrative" of social youth work carried out by activists in Copenhagen in the 1990s. This way of modeling pedagogical practice is first outlined as different from the standardizing approach dominant in science. As a prototypical narrative, the…

  3. 76 FR 46814 - Medicare Program; Evaluation Criteria and Standards for Quality Improvement Program Contracts...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-03

    ... Work) The Patient Safety initiatives are designed to help achieve the goals of improving individual... coordinating center, the Center for Medicare and Medicaid Innovation, and the Agency for Healthcare Research... outreach activities required to complete all Aims of the 10th SOW successfully. The CRISP Model is designed...

  4. Information Literacy for Archives and Special Collections: Defining Outcomes

    ERIC Educational Resources Information Center

    Carini, Peter

    2016-01-01

    This article provides the framework for a set of standards and outcomes that would constitute information literacy with primary sources. Based on a working model used at Dartmouth College's Rauner Special Collections Library in Hanover, New Hampshire, these concepts create a framework for teaching with primary source materials intended to produce…

  5. Stay on or Drop Out? The Role of Identities in Continued Participation at Technical College in Australia

    ERIC Educational Resources Information Center

    Fisher, Laurel J.

    2014-01-01

    Identities extend standard models that explain student motivations to complete courses at technical college. A differential hypothesis was that profiles of identities (individuality, belonging and place) explain the self-concepts and task values that contribute to participation, considering demographic factors (age, gender, location, paid work).…

  6. Appropriateness Measurement with Polychotomous Item Response Models and Standardized Indices. Measurement Series, 84-1.

    ERIC Educational Resources Information Center

    Drasgow, Fritz; And Others

    The test scores of some examinees on a multiple-choice test may not provide adequate measures of their abilities. The goal of appropriateness measurement is to identify such individuals. Earlier theoretical and experimental work considered examinees answering all, or almost all, test items. This article reports research that extends…

  7. Measuring Effect Sizes: The Effect of Measurement Error. Working Paper 19

    ERIC Educational Resources Information Center

    Boyd, Donald; Grossman, Pamela; Lankford, Hamilton; Loeb, Susanna; Wyckoff, James

    2008-01-01

    Value-added models in education research allow researchers to explore how a wide variety of policies and measured school inputs affect the academic performance of students. Researchers typically quantify the impacts of such interventions in terms of "effect sizes", i.e., the estimated effect of a one standard deviation change in the…
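The measurement-error issue the working paper studies can be illustrated with the classical attenuation formula: random error in test scores inflates the observed standard deviation, which shrinks a standardized effect by the square root of the score reliability. A hedged sketch with made-up numbers:

```python
import math

# Illustrative numbers (assumptions, not from the working paper): a true
# effect of 0.20 student-level standard deviations and a test-score
# reliability of 0.80.
true_effect = 0.20
reliability = 0.80

# Classical test theory: observed variance = true variance / reliability,
# so standardizing by the observed SD shrinks the effect size by
# sqrt(reliability).
observed_effect = true_effect * math.sqrt(reliability)
print(f"observed effect size ≈ {observed_effect:.3f}")
```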

  8. Digital Management of a Hysteroscopy Surgery Using Parts of the SNOMED Medical Model

    PubMed Central

    Kollias, Anastasios; Paschopoulos, Minas; Evangelou, Angelos; Poulos, Marios

    2012-01-01

    This work describes a hysteroscopy surgery management application that was designed based on the medical information standard SNOMED. We describe how the application fulfils the needs of this procedure and the way in which existing handwritten medical information is effectively transmitted to the application’s database. PMID:22848338

  9. Beyond a Politics of the Plural in World Englishes

    ERIC Educational Resources Information Center

    Porter, Curt

    2014-01-01

    This article explores three recent books related to World Englishes studies and considers ways they overtly and implicitly frame the politics of the field. The author also describes some of his own experiences working with graduate students that suggest a disruption of traditional dichotomies between single standard and pluralistic models of…

  10. Group Work during International Disaster Outreach Projects: A Model to Advance Cultural Competence

    ERIC Educational Resources Information Center

    West-Olatunji, Cirecie; Henesy, Rachel; Varney, Melanie

    2015-01-01

    Given the rise in disasters worldwide, counselors will increasingly be called upon to respond. Current accreditation standards require that programs train students to become skillful in disaster/crisis interventions. Group processing to enhance self-awareness and improve conceptualization skills is an essential element of such training. This…

  11. Education Network of Ontario: Content/Curriculum Models for the Internet-Connected Classroom.

    ERIC Educational Resources Information Center

    Beam, Mary

    The Education Network of Ontario (ENO) is a telecommunications corporation creating an access and applications network for and by Ontario's 130,000-member education community. When educators register with ENO, they receive full industry-standard Internet and Intranet services in English and French. ENO/REO works from school or home. Statistics…

  12. Knowledge, Understanding, and Behavior

    DTIC Science & Technology

    2003-10-04

UNCLASSIFIED Defense Technical Information Center Compilation Part Notice ADP021346. TITLE: Knowledge, Understanding, and Behavior. DISTRIBUTION... James Albus, Intelligent Systems Division, National Institute of Standards and Technology, Gaithersburg, MD 20899. How does understanding of how the world works, and knowledge of procedures for using models, influence behavior? These are philosophical questions...

  13. 75 FR 33565 - Notice of Intent To Prepare an Environmental Impact Statement for New Medium- and Heavy-Duty Fuel...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-14

    ...- and Heavy-Duty Fuel Efficiency Improvement Program AGENCY: National Highway Traffic Safety... efficiency improvement program for commercial medium- and heavy-duty on-highway vehicles and work trucks... efficiency standards starting with model year (MY) 2016 commercial medium- and heavy-duty on-highway vehicles...

  14. Competencies in Geriatric Nursing: Empirical Evidence from a Computer-Based Large-Scale Assessment Calibration Study

    ERIC Educational Resources Information Center

    Kaspar, Roman; Döring, Ottmar; Wittmann, Eveline; Hartig, Johannes; Weyland, Ulrike; Nauerth, Annette; Möllers, Michaela; Rechenbach, Simone; Simon, Julia; Worofka, Iberé

    2016-01-01

    Valid and reliable standardized assessment of nursing competencies is needed to monitor the quality of vocational education and training (VET) in nursing and evaluate learning outcomes for care work trainees with increasingly heterogeneous learning backgrounds. To date, however, the modeling of professional competencies has not yet evolved into…

  15. Working towards accreditation by the International Standards Organization 15189 Standard: how to validate an in-house developed method an example of lead determination in whole blood by electrothermal atomic absorption spectrometry.

    PubMed

    Garcia Hejl, Carine; Ramirez, Jose Manuel; Vest, Philippe; Chianea, Denis; Renard, Christophe

    2014-09-01

Laboratories working towards accreditation by the International Standards Organization (ISO) 15189 standard are required to demonstrate the validity of their analytical methods. The differing guidelines set by various accreditation organizations make it difficult to provide objective evidence that an in-house method is fit for the intended purpose, and the required performance characteristics tests and acceptance criteria are not always detailed. The laboratory must therefore choose the most suitable validation protocol and set its own acceptance criteria. We propose a validation protocol to evaluate the performance of an in-house method and, as an example, validate the process for the detection and quantification of lead in whole blood by electrothermal atomic absorption spectrometry (ETAAS). The fundamental parameters tested were selectivity, calibration model, precision, accuracy (and uncertainty of measurement), contamination, stability of the sample, reference interval, and analytical interference. The protocol has been applied successfully to quantify lead in whole blood by ETAAS; in particular, the method is selective, linear, accurate, and precise, making it suitable for use in routine diagnostics.
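As a hedged illustration of two of the validation parameters named above (calibration model and precision), the following sketch fits a calibration line and computes a repeatability CV; all of the data values are invented, not taken from the study:

```python
import numpy as np

# Hypothetical calibration data for blood lead (µg/L) by ETAAS --
# illustrative values, not from the study above.
conc = np.array([0.0, 25.0, 50.0, 100.0, 200.0])        # standard concentrations
signal = np.array([0.002, 0.051, 0.099, 0.203, 0.398])  # absorbance

# Calibration model: ordinary least-squares line, with the coefficient of
# determination as a first linearity check.
slope, intercept = np.polyfit(conc, signal, 1)
pred = slope * conc + intercept
r2 = 1 - np.sum((signal - pred) ** 2) / np.sum((signal - signal.mean()) ** 2)

# Repeatability: coefficient of variation of replicate measurements of a
# control sample.
replicates = np.array([98.0, 101.0, 99.5, 100.5, 97.5])  # µg/L
cv_percent = 100 * replicates.std(ddof=1) / replicates.mean()

print(f"slope = {slope:.5f}, R² = {r2:.4f}, CV = {cv_percent:.2f}%")
```

A real validation would of course test each parameter against pre-set acceptance criteria, as the abstract describes.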

  16. Neutral kaon mixing beyond the Standard Model with n f = 2 + 1 chiral fermions. Part 2: non perturbative renormalisation of the ΔF = 2 four-quark operators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boyle, Peter A.; Garron, Nicolas; Hudspith, Renwick J.

We compute the renormalisation factors (Z-matrices) of the ΔF = 2 four-quark operators needed for Beyond the Standard Model (BSM) kaon mixing. We work with nf = 2+1 flavours of Domain-Wall fermions whose chiral-flavour properties are essential to maintain a continuum-like mixing pattern. We introduce new RI-SMOM renormalisation schemes, which we argue are better behaved compared to the commonly-used corresponding RI-MOM one. We find that, once converted to MS¯, the Z-factors computed through these RI-SMOM schemes are in good agreement with each other but differ significantly from the ones computed through the RI-MOM scheme. The RI-SMOM Z-factors presented here have been used to compute the BSM neutral kaon mixing matrix elements in the companion paper. In conclusion, we argue that the renormalisation procedure is responsible for the discrepancies observed by different collaborations, and we investigate and elucidate the origin of these differences throughout this work.

  17. Stress, Health and Well-Being: The Mediating Role of Employee and Organizational Commitment

    PubMed Central

    Jain, Ajay K.; Giga, Sabir I.; Cooper, Cary L.

    2013-01-01

This study investigates the mediating impact of organizational commitment on the relationship between organizational stressors and employee health and well-being. Data were collected from 401 operator-level employees working in business process outsourcing organizations (BPOs) based in New Delhi, India. In this research several dimensions from ASSET, an organizational stress screening tool, were used to measure employee perceptions of stressors, their commitment to the organization, their perception of the organization’s commitment to them, and their health and well-being. Data were analyzed using structural equation modeling in AMOS software. Results of the mediation analysis highlight that both employee commitment to their organization and their perceptions of the organization’s commitment to them mediate the impact of stressors on physical health and psychological well-being. All indices of model fit were found to be above standard norms. Implications are discussed with a view to improving standards of health and well-being within the call center industry, a sector that has reported higher turnover rates and poor working conditions among its employees internationally. PMID:24157512

  18. Stress, health and well-being: the mediating role of employee and organizational commitment.

    PubMed

    Jain, Ajay K; Giga, Sabir I; Cooper, Cary L

    2013-10-11

This study investigates the mediating impact of organizational commitment on the relationship between organizational stressors and employee health and well-being. Data were collected from 401 operator-level employees working in business process outsourcing organizations (BPOs) based in New Delhi, India. In this research several dimensions from ASSET, an organizational stress screening tool, were used to measure employee perceptions of stressors, their commitment to the organization, their perception of the organization's commitment to them, and their health and well-being. Data were analyzed using structural equation modeling in AMOS software. Results of the mediation analysis highlight that both employee commitment to their organization and their perceptions of the organization's commitment to them mediate the impact of stressors on physical health and psychological well-being. All indices of model fit were found to be above standard norms. Implications are discussed with a view to improving standards of health and well-being within the call center industry, a sector that has reported higher turnover rates and poor working conditions among its employees internationally.
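The mediation logic described above (stressors acting on well-being through commitment) can be illustrated with a simple product-of-coefficients sketch on synthetic data; this is only a toy regression analogue, not the study's structural equation model, and the variable names merely echo its constructs:

```python
import numpy as np

# Synthetic data with a built-in mediation structure (assumed coefficients,
# nothing here reproduces the study's ASSET measures).
rng = np.random.default_rng(0)
n = 400
stressors = rng.normal(size=n)
commitment = -0.5 * stressors + rng.normal(scale=0.8, size=n)                   # path a
wellbeing = 0.6 * commitment - 0.1 * stressors + rng.normal(scale=0.8, size=n)  # paths b, c'

def coefs(y, X):
    """OLS coefficients of y on the columns of X (intercept added, then dropped)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    return np.linalg.lstsq(X1, y, rcond=None)[0][1:]

a = coefs(commitment, stressors.reshape(-1, 1))[0]
b, c_prime = coefs(wellbeing, np.column_stack([commitment, stressors]))

indirect = a * b   # mediated effect of stressors on well-being
print(f"a = {a:.2f}, b = {b:.2f}, indirect = {indirect:.2f}, direct = {c_prime:.2f}")
```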

  19. Neutral kaon mixing beyond the Standard Model with n f = 2 + 1 chiral fermions. Part 2: non perturbative renormalisation of the ΔF = 2 four-quark operators

    DOE PAGES

    Boyle, Peter A.; Garron, Nicolas; Hudspith, Renwick J.; ...

    2017-10-10

We compute the renormalisation factors (Z-matrices) of the ΔF = 2 four-quark operators needed for Beyond the Standard Model (BSM) kaon mixing. We work with nf = 2+1 flavours of Domain-Wall fermions whose chiral-flavour properties are essential to maintain a continuum-like mixing pattern. We introduce new RI-SMOM renormalisation schemes, which we argue are better behaved compared to the commonly-used corresponding RI-MOM one. We find that, once converted to MS¯, the Z-factors computed through these RI-SMOM schemes are in good agreement with each other but differ significantly from the ones computed through the RI-MOM scheme. The RI-SMOM Z-factors presented here have been used to compute the BSM neutral kaon mixing matrix elements in the companion paper. In conclusion, we argue that the renormalisation procedure is responsible for the discrepancies observed by different collaborations, and we investigate and elucidate the origin of these differences throughout this work.

  20. The Decline and Fall of Esperanto

    PubMed Central

    Patterson, Robert; Huff, Stanley M.

    1999-01-01

    In 1887, Polish physician Ludovic Zamenhof introduced Esperanto, a simple, easy-to-learn planned language. His goal was to erase communication barriers between ethnic groups by providing them with a politically neutral, culturally free standard language. His ideas received both praise and condemnation from the leaders of his time. Interest in Esperanto peaked in the 1970s but has since faded somewhat. Despite the logical concept and intellectual appeal of a standard language, Esperanto has not evolved into a dominant worldwide language. Instead, English, with all its idiosyncrasies, is closest to an international lingua franca. Like Zamenhof, standards committees in medical informatics have recognized communication chaos and have tried to establish working models, with mixed results. In some cases, previously shunned proprietary systems have become the standard. A proposed standard, no matter how simple, logical, and well designed, may have difficulty displacing an imperfect but functional “real life” system. PMID:10579602

  1. Standards for data acquisition and software-based analysis of in vivo electroencephalography recordings from animals. A TASK1-WG5 report of the AES/ILAE Translational Task Force of the ILAE.

    PubMed

    Moyer, Jason T; Gnatkovsky, Vadym; Ono, Tomonori; Otáhal, Jakub; Wagenaar, Joost; Stacey, William C; Noebels, Jeffrey; Ikeda, Akio; Staley, Kevin; de Curtis, Marco; Litt, Brian; Galanopoulou, Aristea S

    2017-11-01

    Electroencephalography (EEG)-the direct recording of the electrical activity of populations of neurons-is a tremendously important tool for diagnosing, treating, and researching epilepsy. Although standard procedures for recording and analyzing human EEG exist and are broadly accepted, there are no such standards for research in animal models of seizures and epilepsy-recording montages, acquisition systems, and processing algorithms may differ substantially among investigators and laboratories. The lack of standard procedures for acquiring and analyzing EEG from animal models of epilepsy hinders the interpretation of experimental results and reduces the ability of the scientific community to efficiently translate new experimental findings into clinical practice. Accordingly, the intention of this report is twofold: (1) to review current techniques for the collection and software-based analysis of neural field recordings in animal models of epilepsy, and (2) to offer pertinent standards and reporting guidelines for this research. Specifically, we review current techniques for signal acquisition, signal conditioning, signal processing, data storage, and data sharing, and include applicable recommendations to standardize collection and reporting. We close with a discussion of challenges and future opportunities, and include a supplemental report of currently available acquisition systems and analysis tools. This work represents a collaboration on behalf of the American Epilepsy Society/International League Against Epilepsy (AES/ILAE) Translational Task Force (TASK1-Workgroup 5), and is part of a larger effort to harmonize video-EEG interpretation and analysis methods across studies using in vivo and in vitro seizure and epilepsy models. Wiley Periodicals, Inc. © 2017 International League Against Epilepsy.

  2. "It's Not My Problem": The Growth of Non-Standard Work and Its Impact on Vocational Education and Training in Australia.

    ERIC Educational Resources Information Center

    Hall, Richard; Bretherton, Tanya; Buchanan, John

    A study investigated implications of the increase in non-standard forms of employment (casual work, working through labor-hire companies, and work that is outsourced) for vocational education and training (VET) in Australia. Data sources were published statistics on growth of non-standard work; research on reasons for the growth and the business…

  3. Authentic leadership and nurse-assessed adverse patient outcomes.

    PubMed

    Wong, Carol A; Giallonardo, Lisa M

    2013-07-01

Our purpose was to test a model examining relationships among authentic leadership, nurses' trust in their manager, areas of work life and nurse-assessed adverse patient outcomes. Although several work environment factors have been cited as critical to patient outcomes, studies linking nursing leadership styles with patient outcomes are limited, suggesting the need for additional research to investigate the mechanisms by which leadership may influence patient outcomes. Secondary analysis of data collected in a cross-sectional survey of 280 (48% response rate) registered nurses working in acute care hospitals in Ontario was conducted using structural equation modelling. The final model fit the data acceptably (χ(2) = 1.30, df = 2, P = 0.52, IFI = 0.99, CFI = 1.00, RMSEA = 0.00). Authentic leadership was significantly associated with decreased adverse patient outcomes through trust in the manager and areas of work life. The findings suggest that nurses who see their managers as demonstrating high levels of authentic leadership report increased trust, greater congruence in the areas of work life and lower frequencies of adverse patient outcomes. Managers who emphasize transparency, balanced processing, self-awareness and high ethical standards in their interactions with nurses may contribute to safer work environments for patients and nurses. © 2013 John Wiley & Sons Ltd.
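As a small sanity check, the χ² = 1.30, df = 2 statistic quoted above does imply the reported P-value: for df = 2 the chi-square survival function has the closed form exp(-x/2).

```python
import math

# Reported model fit: χ² = 1.30 with df = 2 and P = 0.52. For df = 2 the
# chi-square survival function is exp(-x/2), so the P-value can be checked
# directly.
chisq = 1.30
p_value = math.exp(-chisq / 2)   # valid only for df = 2
print(f"P = {p_value:.2f}")      # → P = 0.52
```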

  4. Evaluation of methods for characterizing surface topography of models for high Reynolds number wind-tunnels

    NASA Technical Reports Server (NTRS)

    Teague, E. C.; Vorburger, T. V.; Scire, F. E.; Baker, S. M.; Jensen, S. W.; Gloss, B. B.; Trahan, C.

    1982-01-01

Current work by the National Bureau of Standards at the NASA National Transonic Facility (NTF) to evaluate the performance of stylus instruments for determining the topography of models under investigation is described, along with instrumentation for characterization of surface microtopography. Potential areas of surface effects are reviewed, and the need for finer-surfaced models for the NTF's high Reynolds number flows is stressed. Current stylus instruments have tip radii as large as 25 microns, so three models with 4-6, 8-10, and 12-15 micro-in. rms surface finishes were fabricated for tests with a stylus with a tip radius of 1 micron and a 50 mg force. Work involving three-dimensional stylus profilometry is discussed in terms of stylus displacement being converted to digital signals, and the design of a light-scattering instrument capable of measuring the surface finish on curved objects is presented.

  5. Modeling users, context and devices for ambient assisted living environments.

    PubMed

    Castillejo, Eduardo; Almeida, Aitor; López-de-Ipiña, Diego; Chen, Liming

    2014-03-17

The participation of users within AAL environments is increasing thanks to the capabilities of current wearable devices. Furthermore, considering users' preferences, context conditions, and devices' capabilities helps smart environments personalize services and resources for them. Being aware of the different characteristics of the entities participating in these situations is vital for efficiently reaching the main goals of the corresponding systems. To collect different information from these entities, it is necessary to design formal models which help designers organize and give meaning to the gathered data. In this paper, we analyze several literature solutions for modeling users, context and devices, considering different approaches in the Ambient Assisted Living domain. We also highlight ongoing standardization efforts in this area, discuss the techniques used and the characteristics modeled, and weigh the advantages and drawbacks of each approach to draw several conclusions about the reviewed works.

  6. Modeling Users, Context and Devices for Ambient Assisted Living Environments

    PubMed Central

    Castillejo, Eduardo; Almeida, Aitor; López-de-Ipiña, Diego; Chen, Liming

    2014-01-01

The participation of users within AAL environments is increasing thanks to the capabilities of current wearable devices. Furthermore, considering users' preferences, context conditions, and devices' capabilities helps smart environments personalize services and resources for them. Being aware of the different characteristics of the entities participating in these situations is vital for efficiently reaching the main goals of the corresponding systems. To collect different information from these entities, it is necessary to design formal models which help designers organize and give meaning to the gathered data. In this paper, we analyze several literature solutions for modeling users, context and devices, considering different approaches in the Ambient Assisted Living domain. We also highlight ongoing standardization efforts in this area, discuss the techniques used and the characteristics modeled, and weigh the advantages and drawbacks of each approach to draw several conclusions about the reviewed works. PMID:24643006

  7. International Oil Supplies and Demands. Volume 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1991-09-01

The eleventh Energy Modeling Forum (EMF) working group met four times over the 1989-90 period to compare alternative perspectives on international oil supplies and demands through 2010 and to discuss how alternative supply and demand trends influence the world's dependence upon Middle Eastern oil. Proprietors of eleven economic models of the world oil market used their respective models to simulate a dozen scenarios using standardized assumptions. From its inception, the study was not designed to focus on the short-run impacts of disruptions on oil markets. Nor did the working group attempt to provide a forecast or just a single view of the likely future path for oil prices. The model results guided the group's thinking about many important longer-run market relationships and helped to identify differences of opinion about future oil supplies, demands, and dependence.

  8. International Oil Supplies and Demands. Volume 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1992-04-01

The eleventh Energy Modeling Forum (EMF) working group met four times over the 1989-1990 period to compare alternative perspectives on international oil supplies and demands through 2010 and to discuss how alternative supply and demand trends influence the world's dependence upon Middle Eastern oil. Proprietors of eleven economic models of the world oil market used their respective models to simulate a dozen scenarios using standardized assumptions. From its inception, the study was not designed to focus on the short-run impacts of disruptions on oil markets. Nor did the working group attempt to provide a forecast or just a single view of the likely future path for oil prices. The model results guided the group's thinking about many important longer-run market relationships and helped to identify differences of opinion about future oil supplies, demands, and dependence.

  9. Generational changes in materialism and work centrality, 1976-2007: associations with temporal changes in societal insecurity and materialistic role modeling.

    PubMed

    Twenge, Jean M; Kasser, Tim

    2013-07-01

    We examined whether culture-level indices of threat, instability, and materialistic modeling were linked to the materialistic values of American 12th graders between 1976 and 2007 (N = 355,296). Youth materialism (such as the importance of money and of owning expensive material items) increased over the generations, peaking in the late 1980s to early 1990s with Generation X and then staying at historically high levels for Millennials (GenMe). Societal instability and disconnection (e.g., unemployment, divorce) and social modeling (e.g., advertising spending) had both contemporaneous and lagged associations with higher levels of materialism, with advertising most influential during adolescence and instability during childhood. Societal-level living standards during childhood predicted materialism 10 years later. When materialistic values increased, work centrality steadily declined, suggesting a growing discrepancy between the desire for material rewards and the willingness to do the work usually required to earn them.

  10. Selected topics in high energy physics: Flavon, neutrino and extra-dimensional models

    NASA Astrophysics Data System (ADS)

    Dorsner, Ilja

There is already significant evidence, both experimental and theoretical, that the Standard Model of elementary particle physics is just another effective physical theory. Thus, it is crucial (a) to anticipate the experiments searching for signatures of physics beyond the Standard Model, and (b) to determine whether some theoretically preferred structure can reproduce the low-energy signature of the Standard Model. This work pursues these two directions by investigating various extensions of the Standard Model. One of them is a simple flavon model that accommodates the observed hierarchy of the charged fermion masses and mixings. We show that flavor changing and CP violating signatures of this model are equally near the present experimental limits. We find that, for a significant range of parameters, mu-e conversion can be the most sensitive place to look for such signatures. We then propose two variants of an SO(10) model in a five-dimensional framework. The first variant demonstrates that one can embed a four-dimensional flipped SU(5) model into a five-dimensional SO(10) model. This allows one to maintain the advantages of flipped SU(5) while avoiding its well-known drawbacks. The second variant shows that exact unification of the gauge couplings is possible even in the higher dimensional setting. This unification yields low-energy values of the gauge couplings that are in perfect agreement with experimental values. We show that the corrections to the usual four-dimensional running, due to the Kaluza-Klein towers of states, can be unambiguously and systematically evaluated. We also consider the various main types of models of neutrino masses and mixings from the point of view of how naturally they give the large mixing angle MSW solution to the solar neutrino problem. Special attention is given to one particular "lopsided" SU(5) model, which is then analyzed in a completely statistical manner.
We suggest that this sort of statistical analysis should be applicable to other models of neutrino mixing.
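    The gauge-coupling running mentioned above can be illustrated with the standard four-dimensional one-loop renormalisation-group equations; the sketch below uses the well-known Standard Model one-loop coefficients and approximate measured couplings at the Z mass, and does NOT include the paper's five-dimensional Kaluza-Klein corrections.

```python
import math

# One-loop running of the inverse gauge couplings in the Standard Model:
#   alpha_i^-1(mu) = alpha_i^-1(M_Z) - b_i/(2*pi) * ln(mu / M_Z)
# Inputs are approximate textbook values; the Kaluza-Klein tower
# corrections discussed in the abstract are not included here.
M_Z = 91.19  # GeV
b = {"U(1)_Y": 41.0 / 10.0,   # GUT-normalised hypercharge
     "SU(2)_L": -19.0 / 6.0,
     "SU(3)_c": -7.0}
alpha_inv_MZ = {"U(1)_Y": 59.0, "SU(2)_L": 29.6, "SU(3)_c": 8.5}

def alpha_inv(group, mu):
    """Inverse coupling of `group` at scale mu (GeV), one loop."""
    return alpha_inv_MZ[group] - b[group] / (2.0 * math.pi) * math.log(mu / M_Z)

for g in b:
    # near the conventional GUT scale the three couplings approach each other
    print(g, round(alpha_inv(g, 2.0e16), 1))
```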

  11. Interrelated structure of high altitude atmospheric profiles

    NASA Technical Reports Server (NTRS)

    Engler, N. A.; Goldschmidt, M. A.

    1972-01-01

    A preliminary development of a mathematical model to compute probabilities of thermodynamic profiles is presented. The model assumes an exponential expression for pressure and utilizes the hydrostatic law and equation of state in the determination of density and temperature. It is shown that each thermodynamic variable can be factored into the product of steady state and perturbation functions. The steady state functions have profiles similar to those of the 1962 standard atmosphere while the perturbation functions oscillate about 1. Limitations of the model and recommendations for future work are presented.
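    The stated ingredients (exponential pressure, hydrostatic law, equation of state) can be sketched directly; the constants below are illustrative assumptions, not the paper's fitted steady-state profiles.

```python
import math

# Sketch of the abstract's ingredients: exponential pressure p = p0*exp(-z/H),
# the hydrostatic law dp/dz = -rho*g, and the equation of state p = rho*R*T.
# All constants are assumed illustrative values.
g = 9.80665    # m/s^2
R = 287.05     # J/(kg K), specific gas constant for dry air
p0 = 101325.0  # Pa, surface pressure
H = 8000.0     # m, assumed scale height

def pressure(z):
    return p0 * math.exp(-z / H)

def density(z):
    # from hydrostatics: rho = -(dp/dz)/g = p/(g*H)
    return pressure(z) / (g * H)

def temperature(z):
    # from the equation of state: T = p/(rho*R) = g*H/R (isothermal here)
    return pressure(z) / (density(z) * R)

print(round(temperature(0.0), 1))  # a pure exponential profile is isothermal
```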

  12. The case for the relativistic hot big bang cosmology

    NASA Technical Reports Server (NTRS)

    Peebles, P. J. E.; Schramm, D. N.; Kron, R. G.; Turner, E. L.

    1991-01-01

    What has become the standard model in cosmology is described, and some highlights are presented of the now substantial range of evidence that most cosmologists believe convincingly establishes this model, the relativistic hot big bang cosmology. It is shown that this model has yielded a set of interpretations and successful predictions that substantially outnumber the elements used in devising the theory, with no well-established empirical contradictions. Brief speculations are made on how the open puzzles and work in progress might affect future developments in this field.

  13. Linear and non-linear perturbations in dark energy models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Escamilla-Rivera, Celia; Casarini, Luciano; Fabris, Júlio C.

    2016-11-01

    In this work we discuss observational aspects of three time-dependent parameterisations of the dark energy equation of state w(z). In order to determine the dynamics associated with these models, we calculate their background evolution and perturbations in a scalar field representation. After performing a complete treatment of linear perturbations, we also show that the non-linear contribution of the selected w(z) parameterisations to the matter power spectra is almost the same for all scales, with no significant difference from the predictions of the standard ΛCDM model.
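    The abstract does not name its three parameterisations; as an assumed example of a time-dependent w(z), the sketch below uses the widely cited CPL (Chevallier-Polarski-Linder) form, which reduces to the cosmological constant for w0 = -1, wa = 0.

```python
# Assumed example: the CPL parameterisation w(z) = w0 + wa*z/(1+z).
# This is a common time-dependent form, not necessarily one of the
# three parameterisations studied in the paper.
def w_cpl(z, w0=-1.0, wa=0.0):
    return w0 + wa * z / (1.0 + z)

# LCDM limit at any redshift, and a mildly evolving example at z = 1
print(w_cpl(0.0), w_cpl(1.0, w0=-0.9, wa=0.2))
```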

  14. Optimized survey design for electrical resistivity tomography: combined optimization of measurement configuration and electrode placement

    NASA Astrophysics Data System (ADS)

    Uhlemann, Sebastian; Wilkinson, Paul B.; Maurer, Hansruedi; Wagner, Florian M.; Johnson, Timothy C.; Chambers, Jonathan E.

    2018-07-01

    Within geoelectrical imaging, the choice of measurement configurations and electrode locations is known to control the image resolution. Previous work has shown that optimized survey designs can provide a model resolution that is superior to standard survey designs. This paper demonstrates a methodology to optimize resolution within a target area, while limiting the number of required electrodes, thereby selecting optimal electrode locations. This is achieved by extending previous work on the `Compare-R' algorithm, which optimizes the model resolution in a target area by calculating updates to the resolution matrix. Here, an additional weighting factor is introduced that allows measurement configurations that can be acquired on a given set of electrodes to be preferentially added. The performance of the optimization is tested on two synthetic examples and verified with a laboratory study. The effect of the weighting factor is investigated using an acquisition layout comprising a single line of electrodes. The results show that an increasing weight decreases the area of improved resolution, but leads to a smaller number of electrode positions. Imaging results superior to a standard survey design were achieved using 56 per cent fewer electrodes. The performance was also tested on a 3-D acquisition grid, where superior resolution within a target at the base of an embankment was achieved using 22 per cent fewer electrodes than a comparable standard survey. The effect of the underlying resistivity distribution on the performance of the optimization was investigated, and it was shown that even strong resistivity contrasts have only a minor impact. The synthetic results were verified in a laboratory tank experiment, where notable image improvements were achieved. This work shows that optimized surveys can be designed that have a resolution superior to standard survey designs, while requiring significantly fewer electrodes. 
This methodology thereby provides a means for improving the efficiency of geoelectrical imaging.
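    The weighting idea can be illustrated with a toy greedy selection: configurations are ranked by a resolution-improvement score, down-weighted for each new electrode position they would require. This is a hypothetical heuristic capturing the concept only, not the actual Compare-R algorithm, and all names and numbers are invented for illustration.

```python
# Hypothetical greedy sketch of weighted survey design: prefer measurement
# configurations that reuse electrodes already in the layout. Not the
# actual Compare-R algorithm; the scoring rule is an assumed illustration.
def select_configs(candidates, n_select, weight=0.5):
    """candidates: list of (config_id, electrode_tuple, improvement)."""
    used, chosen = set(), []
    pool = list(candidates)
    for _ in range(n_select):
        def score(c):
            n_new = len(set(c[1]) - used)        # electrodes not yet placed
            return c[2] * (weight ** n_new)      # penalise each new electrode
        best = max(pool, key=score)
        pool.remove(best)
        used.update(best[1])
        chosen.append(best[0])
    return chosen, used

cands = [("a", (1, 2, 3, 4), 1.00),
         ("b", (1, 2, 3, 5), 0.90),   # reuses three electrodes of "a"
         ("c", (6, 7, 8, 9), 0.95)]   # needs four entirely new positions
print(select_configs(cands, 2))
```

With the weight applied, "b" beats the slightly better-resolving "c" because it needs only one new electrode position.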

  15. Optimized survey design for Electrical Resistivity Tomography: combined optimization of measurement configuration and electrode placement

    NASA Astrophysics Data System (ADS)

    Uhlemann, Sebastian; Wilkinson, Paul B.; Maurer, Hansruedi; Wagner, Florian M.; Johnson, Timothy C.; Chambers, Jonathan E.

    2018-03-01

    Within geoelectrical imaging, the choice of measurement configurations and electrode locations is known to control the image resolution. Previous work has shown that optimized survey designs can provide a model resolution that is superior to standard survey designs. This paper demonstrates a methodology to optimize resolution within a target area, while limiting the number of required electrodes, thereby selecting optimal electrode locations. This is achieved by extending previous work on the `Compare-R' algorithm, which optimizes the model resolution in a target area by calculating updates to the resolution matrix. Here, an additional weighting factor is introduced that allows measurement configurations that can be acquired on a given set of electrodes to be preferentially added. The performance of the optimization is tested on two synthetic examples and verified with a laboratory study. The effect of the weighting factor is investigated using an acquisition layout comprising a single line of electrodes. The results show that an increasing weight decreases the area of improved resolution, but leads to a smaller number of electrode positions. Imaging results superior to a standard survey design were achieved using 56 per cent fewer electrodes. The performance was also tested on a 3D acquisition grid, where superior resolution within a target at the base of an embankment was achieved using 22 per cent fewer electrodes than a comparable standard survey. The effect of the underlying resistivity distribution on the performance of the optimization was investigated, and it was shown that even strong resistivity contrasts have only a minor impact. The synthetic results were verified in a laboratory tank experiment, where notable image improvements were achieved. This work shows that optimized surveys can be designed that have a resolution superior to standard survey designs, while requiring significantly fewer electrodes. 
This methodology thereby provides a means for improving the efficiency of geoelectrical imaging.

  16. Theory and Modelling Resources Cookbook

    NASA Astrophysics Data System (ADS)

    Gray, Norman

    This cookbook is intended to assemble references to resources likely to be of interest to theorists and modellers. It's not a collection of standard recipes, but instead a repository of brief introductions to facilities. It includes references to sources of authoritative information, including those Starlink documents most likely to be of interest to theorists. Although the topics are chosen for their relevance to theoretical work, a good proportion of the information should be of interest to all of the astronomical computing community.

  17. On the biomechanical analysis of the calories expended in a straight boxing jab.

    PubMed

    Zohdi, T I

    2017-04-01

    Boxing and related sports activities have become a standard workout regime at many fitness studios worldwide. Oftentimes, people are interested in the calories expended during these workouts. This note focuses on determining the calories in a boxer's jab, using kinematic vector-loop relations and basic work-energy principles. Numerical simulations are undertaken to illustrate the basic model. Multi-limb extensions of the model are also discussed. © 2017 The Author(s).
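    A back-of-envelope version of the work-energy estimate can be sketched as below; the effective arm mass and peak hand speed are assumed illustrative numbers, not values from the paper's kinematic vector-loop model.

```python
# Back-of-envelope work-energy sketch for a single jab (illustrative
# numbers; the paper uses a full kinematic vector-loop model).
m_eff = 3.5    # kg, assumed effective moving mass of the arm
v_peak = 7.0   # m/s, assumed peak hand speed

work_J = 0.5 * m_eff * v_peak ** 2   # kinetic energy delivered, joules
kcal = work_J / 4184.0               # convert to dietary Calories (kcal)
print(work_J, kcal)                  # a single jab expends only a tiny fraction of a kcal
```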

  18. Collaborative Metadata Curation in Support of NASA Earth Science Data Stewardship

    NASA Technical Reports Server (NTRS)

    Sisco, Adam W.; Bugbee, Kaylin; le Roux, Jeanne; Staton, Patrick; Freitag, Brian; Dixon, Valerie

    2018-01-01

    A growing collection of NASA Earth science data is archived and distributed by EOSDIS's 12 Distributed Active Archive Centers (DAACs). Each collection and granule is described by a metadata record housed in the Common Metadata Repository (CMR). Multiple metadata standards are in use, and the core elements of each are mapped to and from a common model, the Unified Metadata Model (UMM). This work was carried out by the Analysis and Review of CMR (ARC) Team.
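    The mapping of native metadata elements onto a common model can be sketched as a simple crosswalk table; the field names below are invented for illustration and are not the actual UMM element names.

```python
# Hypothetical sketch of mapping a native metadata record onto a common
# model via a crosswalk table. Field names are invented for illustration;
# they are NOT the real UMM element names.
NATIVE_TO_COMMON = {
    "Entry_Title": "title",
    "Summary": "abstract",
    "Temporal_Coverage": "temporal_extent",
}

def to_common_model(native_record, crosswalk):
    """Rename mapped fields onto the common model; unmapped keys are dropped."""
    return {crosswalk[k]: v for k, v in native_record.items() if k in crosswalk}

rec = {"Entry_Title": "Example dataset", "Summary": "An example.", "Internal_ID": "x1"}
print(to_common_model(rec, NATIVE_TO_COMMON))
```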

  19. Detailed Spectral Analysis of the 260 ks XMM-Newton Data of 1E 1207.4-5209 and Significance of a 2.1 keV Absorption Feature

    NASA Astrophysics Data System (ADS)

    Mori, Kaya; Chonko, James C.; Hailey, Charles J.

    2005-10-01

    We have reanalyzed the 260 ks XMM-Newton observation of 1E 1207.4-5209. There are several significant improvements over previous work. First, a much broader range of physically plausible spectral models was used. Second, we have used a more rigorous statistical analysis. The standard F-distribution was not employed, but rather the exact finite statistics F-distribution was determined by Monte Carlo simulations. This approach was motivated by the recent work of Protassov and coworkers and Freeman and coworkers. They demonstrated that the standard F-distribution is not even asymptotically correct when applied to assess the significance of additional absorption features in a spectrum. With our improved analysis we do not find a third and fourth spectral feature in 1E 1207.4-5209 but only the two broad absorption features previously reported. Two additional statistical tests, one line model dependent and the other line model independent, confirmed our modified F-test analysis. For all physically plausible continuum models in which the weak residuals are strong enough to fit, the residuals occur at the instrument Au M edge. As a sanity check we confirmed that the residuals are consistent in strength and position with the instrument Au M residuals observed in 3C 273.
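    The Monte Carlo calibration idea described above can be sketched with a toy linear model: simulate many datasets under the null (no extra spectral component), fit null and alternative models, and read the detection threshold off the empirical distribution of the F-like statistic rather than the analytic F-distribution. The continuum and line shapes below are simplified stand-ins for the paper's spectral models.

```python
import numpy as np

# Toy Monte Carlo calibration of an F-like statistic: under the null
# (constant continuum, no line), simulate data, fit both models, and
# take the empirical 95th percentile of F as the detection threshold.
rng = np.random.default_rng(0)
n, n_sim = 50, 2000
x = np.linspace(0.0, 1.0, n)
line_shape = np.exp(-0.5 * ((x - 0.5) / 0.05) ** 2)  # fixed trial "line" profile

def chi2(y, X):
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ beta
    return float(r @ r)

X0 = np.ones((n, 1))                          # null: constant continuum
X1 = np.column_stack([np.ones(n), line_shape])  # alternative: continuum + line

F = np.empty(n_sim)
for i in range(n_sim):
    y = rng.standard_normal(n)                # data simulated under the null
    c0, c1 = chi2(y, X0), chi2(y, X1)
    F[i] = (c0 - c1) / (c1 / (n - 2))

print(np.percentile(F, 95.0))                 # Monte Carlo 95% threshold
```

For this well-behaved linear case the empirical threshold lands near the analytic F(1, n-2) value; the point of the Monte Carlo approach is that it stays valid in the non-regular cases where the analytic F-distribution does not.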

  20. 29 CFR 5.15 - Limitations, variations, tolerances, and exemptions under the Contract Work Hours and Safety...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... the Contract Work Hours and Safety Standards Act. 5.15 Section 5.15 Labor Office of the Secretary of... WORK HOURS AND SAFETY STANDARDS ACT) Davis-Bacon and Related Acts Provisions and Procedures § 5.15 Limitations, variations, tolerances, and exemptions under the Contract Work Hours and Safety Standards Act. (a...

  1. EEG and MEG data analysis in SPM8.

    PubMed

    Litvak, Vladimir; Mattout, Jérémie; Kiebel, Stefan; Phillips, Christophe; Henson, Richard; Kilner, James; Barnes, Gareth; Oostenveld, Robert; Daunizeau, Jean; Flandin, Guillaume; Penny, Will; Friston, Karl

    2011-01-01

    SPM is free and open-source software written in MATLAB (The MathWorks, Inc.). In addition to standard M/EEG preprocessing, we presently offer three main analysis tools: (i) statistical analysis of scalp-maps, time-frequency images, and volumetric 3D source reconstruction images based on the general linear model, with correction for multiple comparisons using random field theory; (ii) Bayesian M/EEG source reconstruction, including support for group studies, simultaneous EEG and MEG, and fMRI priors; (iii) dynamic causal modelling (DCM), an approach combining neural modelling with data analysis for which there are several variants dealing with evoked responses, steady state responses (power spectra and cross-spectra), induced responses, and phase coupling. SPM8 is integrated with the FieldTrip toolbox, making it possible for users to combine a variety of standard analysis methods with new schemes implemented in SPM and to build custom analysis tools using powerful graphical user interface (GUI) and batching tools.
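    The mass-univariate general linear model underlying tool (i) can be sketched in a few lines: the same design matrix is fit independently to every channel (or voxel) by least squares. Dimensions and data below are illustrative, and Python/NumPy stands in for SPM's MATLAB implementation.

```python
import numpy as np

# Toy mass-univariate GLM: Y = X @ beta + noise, one independent least
# squares fit per channel. Illustrative dimensions; SPM itself is MATLAB.
rng = np.random.default_rng(2)
n_obs, n_chan = 40, 6
X = np.column_stack([np.ones(n_obs), rng.standard_normal(n_obs)])  # intercept + regressor
beta_true = np.vstack([np.full(n_chan, 1.0), np.full(n_chan, 0.5)])
Y = X @ beta_true + 0.1 * rng.standard_normal((n_obs, n_chan))

beta_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)  # all channels fit at once
print(beta_hat.shape)  # (2, 6): two regressors, six channels
```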

  2. A visual detection model for DCT coefficient quantization

    NASA Technical Reports Server (NTRS)

    Ahumada, Albert J., Jr.; Peterson, Heidi A.

    1993-01-01

    The discrete cosine transform (DCT) is widely used in image compression, and is part of the JPEG and MPEG compression standards. The degree of compression, and the amount of distortion in the decompressed image are determined by the quantization of the transform coefficients. The standards do not specify how the DCT coefficients should be quantized. Our approach is to set the quantization level for each coefficient so that the quantization error is at the threshold of visibility. Here we combine results from our previous work to form our current best detection model for DCT coefficient quantization noise. This model predicts sensitivity as a function of display parameters, enabling quantization matrices to be designed for display situations varying in luminance, veiling light, and spatial frequency related conditions (pixel size, viewing distance, and aspect ratio). It also allows arbitrary color space directions for the representation of color.
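    The quantization step described above can be sketched with an orthonormal 8×8 DCT-II: coefficients are divided by a per-coefficient quantization value, rounded, and rescaled. The uniform Q matrix below is a placeholder, not a perceptually derived matrix from the model.

```python
import numpy as np

# Sketch of DCT-coefficient quantization: forward 2-D DCT, divide by a
# quantization matrix Q, round, rescale, inverse DCT. The uniform Q here
# is a placeholder, not a visually optimised matrix.
def dct_matrix(N=8):
    n = np.arange(N)
    C = np.sqrt(2.0 / N) * np.cos(np.pi * (2 * n[None, :] + 1) * n[:, None] / (2 * N))
    C[0, :] = np.sqrt(1.0 / N)   # orthonormal DCT-II basis
    return C

def quantize_block(block, Q):
    C = dct_matrix(block.shape[0])
    coeffs = C @ block @ C.T              # forward 2-D DCT
    qcoeffs = np.round(coeffs / Q) * Q    # quantize each coefficient
    return C.T @ qcoeffs @ C              # reconstruct from quantized coefficients

rng = np.random.default_rng(1)
block = rng.uniform(0.0, 255.0, (8, 8))
Q = np.full((8, 8), 16.0)                 # placeholder uniform quantization matrix
recon = quantize_block(block, Q)
print(np.max(np.abs(recon - block)))      # distortion introduced by quantization
```

In a perceptual design, each entry of Q would instead be set so that its coefficient's quantization error sits at the visibility threshold for the given display conditions.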

  3. EEG and MEG Data Analysis in SPM8

    PubMed Central

    Litvak, Vladimir; Mattout, Jérémie; Kiebel, Stefan; Phillips, Christophe; Henson, Richard; Kilner, James; Barnes, Gareth; Oostenveld, Robert; Daunizeau, Jean; Flandin, Guillaume; Penny, Will; Friston, Karl

    2011-01-01

    SPM is free and open-source software written in MATLAB (The MathWorks, Inc.). In addition to standard M/EEG preprocessing, we presently offer three main analysis tools: (i) statistical analysis of scalp-maps, time-frequency images, and volumetric 3D source reconstruction images based on the general linear model, with correction for multiple comparisons using random field theory; (ii) Bayesian M/EEG source reconstruction, including support for group studies, simultaneous EEG and MEG, and fMRI priors; (iii) dynamic causal modelling (DCM), an approach combining neural modelling with data analysis for which there are several variants dealing with evoked responses, steady state responses (power spectra and cross-spectra), induced responses, and phase coupling. SPM8 is integrated with the FieldTrip toolbox, making it possible for users to combine a variety of standard analysis methods with new schemes implemented in SPM and to build custom analysis tools using powerful graphical user interface (GUI) and batching tools. PMID:21437221

  4. Application of micromechanics to the characterization of mortar by ultrasound.

    PubMed

    Hernández, M G; Anaya, J J; Izquierdo, M A G; Ullate, L G

    2002-05-01

    Mechanical properties of concrete and mortar structures can be estimated by ultrasonic non-destructive testing. When the ultrasonic velocity is known, there are standardized methods based on treating the concrete as a homogeneous material. Cement composites, however, are heterogeneous and porous, and porosity has a negative effect on the mechanical properties of structures. This work studies the impact of porosity on mechanical properties by considering concrete a multiphase material. A micromechanical model is applied in which the material is considered to consist of two phases: a solid matrix and pores. From this model, a set of expressions is obtained that relates the acoustic velocity to the Young's modulus of mortar. Experimental work is based on non-destructive and destructive procedures on mortar samples whose porosity is varied. A comparison is drawn between the micromechanical and standard methods, showing positive results for the method proposed here.
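    The simplest velocity-modulus link, before any micromechanical refinement, is the standard isotropic-elasticity relation between P-wave velocity and Young's modulus, with density reduced by porosity; the numbers below are illustrative assumptions, not the paper's two-phase model.

```python
# Standard isotropic-elasticity sketch (not the paper's micromechanical
# model): E = rho * v_p^2 * (1+nu)(1-2nu)/(1-nu), with bulk density
# reduced by porosity. All input values are illustrative assumptions.
def youngs_modulus(v_p, rho, nu=0.2):
    """E (Pa) from P-wave velocity (m/s), density (kg/m^3), Poisson ratio."""
    return rho * v_p ** 2 * (1 + nu) * (1 - 2 * nu) / (1 - nu)

def bulk_density(rho_solid, porosity):
    return (1.0 - porosity) * rho_solid   # pores assumed empty

rho = bulk_density(2600.0, 0.15)          # assumed solid density and porosity
E = youngs_modulus(4000.0, rho)           # for an assumed v_p of 4000 m/s
print(E / 1e9)                            # Young's modulus in GPa
```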

  5. Workplace System Factors of Obstetric Nurses in Northeastern Ontario, Canada: Using a Work Disability Prevention Approach.

    PubMed

    Nowrouzi, Behdin; Lightfoot, Nancy; Carter, Lorraine; Larivère, Michel; Rukholm, Ellen; Belanger-Gardner, Diane

    2015-12-01

    The purpose of this study was to examine the relationship between nursing personal and workplace system factors (work disability) and work ability index scores in Ontario, Canada. A total of 111 registered nurses were randomly selected from the registered nurses on staff in the labor, delivery, recovery, and postpartum areas of four northeastern Ontario hospitals, using a stratified random design approach across four northeastern Ontario cities. A total of 51 (45.9%) questionnaires were completed online and another 60 (54.1%) were completed on paper. The obstetric workforce in northeastern Ontario was predominantly female (94.6%), with a mean age of 41.9 (standard deviation = 10.2). In the personal system model, three variables, marital status (p = 0.025), respondent ethnicity (p = 0.026), and mean number of patients per shift (p = 0.049), contributed significantly to the variance in work ability scores. In the workplace system model, job and career satisfaction (p = 0.026) had a positive influence on work ability scores, while work absenteeism (p = 0.023) demonstrated an inverse relationship with work ability scores. In the combined model, all the predictors were significantly related to work ability scores. Work ability is closely related to job and career satisfaction and perceived control at work among obstetric nurses. In order to improve work ability, nurses need to work in environments that support them and allow them to be engaged in decision-making processes.

  6. Implementing the Next Generation Science Standards: How Instructional Coaches Mediate Standards-Based Educational Reform to Teacher Practice

    NASA Astrophysics Data System (ADS)

    Laxton, Katherine E.

    This dissertation takes a close look at how district-level instructional coaches support teachers in learning to shift their instructional practice in relation to the Next Generation Science Standards. It aims to address how re-structuring professional development into a job-embedded coaching model supports individual teacher learning of new reform-related instructional practice. Implementing the NGSS is a problem of supporting professional learning in a way that will enable educators to make fundamental changes to their teaching practice. However, there are few examples in the literature that explain how coaches interact with teachers to improve teacher learning of reform-related instructional practice, or that specifically address how supporting teachers with extended professional learning opportunities, aligned with high-leverage practices, tools, and curriculum, affects how teachers make sense of new standards-based educational reforms and what manifests in classroom instruction. This dissertation proposes four conceptual categories of sense-making that influence how instructional coaches interpret the nature of reform, their roles in instructional improvement, and how to work with teachers. It is important to understand how coaches interpret reform because their interpretations may have unintended consequences, such as privileging certain views about instruction or establishing priorities for how to work with teachers. We found that re-structuring professional development into a job-embedded coaching model supported teachers in learning new reform-related instructional practice; however, individual teacher interpretations of reform emerged and seemed to be linked to how instructional coaches supported teacher learning.

  7. Risk Estimation for Lung Cancer in Libya: Analysis Based on Standardized Morbidity Ratio, Poisson-Gamma Model, BYM Model and Mixture Model

    PubMed

    Alhdiri, Maryam Ahmed; Samat, Nor Azah; Mohamed, Zulkifley

    2017-03-01

    Cancer is the most rapidly spreading disease in the world, especially in developing countries, including Libya. Cancer represents a significant burden on patients, families, and their societies. This disease can be controlled if detected early. Therefore, disease mapping has recently become an important method in the fields of public health research and disease epidemiology. The correct choice of statistical model is a very important step in producing a good map of a disease. Libya was selected for this work in order to examine geographical variation in the incidence of lung cancer. The objective of this paper is to estimate the relative risk for lung cancer. Four statistical models for estimating the relative risk for lung cancer, together with population censuses of the study area for the period 2006 to 2011, were used in this work: the Standardized Morbidity Ratio (SMR), the most popular statistic used in the field of disease mapping; the Poisson-gamma model, one of the earliest applications of Bayesian methodology; the Besag, York and Mollie (BYM) model; and the Mixture model. As an initial step, this study provides a review of all the proposed models, which we then apply to lung cancer data in Libya. Maps, tables, graphs, and goodness-of-fit (GOF) measures, which are commonly used in statistical modelling to compare fitted models, were used to compare and present the preliminary results. The main results show that the Poisson-gamma, BYM, and Mixture models can overcome the problem of the first model (SMR) when there is no observed lung cancer case in certain districts. Results show that the Mixture model is the most robust and provides better relative risk estimates across the range of models.
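    The two simplest estimators above can be sketched directly: the SMR is observed over expected cases, and the Poisson-gamma posterior mean shrinks it toward the prior, staying well defined when a district has zero observed cases. The prior parameters below are illustrative assumptions.

```python
# Sketch of two relative-risk estimators: the raw SMR and the
# Poisson-gamma posterior mean. Prior parameters a, b are assumed
# illustrative values, not those fitted in the paper.
def smr(observed, expected):
    return observed / expected

def poisson_gamma_rr(observed, expected, a=1.0, b=1.0):
    # Gamma(a, b) prior on the relative risk gives posterior mean (O+a)/(E+b)
    return (observed + a) / (expected + b)

districts = [(12, 10.0), (0, 4.0), (3, 2.5)]  # (observed, expected) per district
for o, e in districts:
    print(smr(o, e), poisson_gamma_rr(o, e))
```

Note the second district: the SMR collapses to exactly zero with no observed cases, while the Poisson-gamma estimate remains a small positive value.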

  8. Risk Estimation for Lung Cancer in Libya: Analysis Based on Standardized Morbidity Ratio, Poisson-Gamma Model, BYM Model and Mixture Model

    PubMed Central

    Alhdiri, Maryam Ahmed; Samat, Nor Azah; Mohamed, Zulkifley

    2017-01-01

    Cancer is the most rapidly spreading disease in the world, especially in developing countries, including Libya. Cancer represents a significant burden on patients, families, and their societies. This disease can be controlled if detected early. Therefore, disease mapping has recently become an important method in the fields of public health research and disease epidemiology. The correct choice of statistical model is a very important step in producing a good map of a disease. Libya was selected for this work in order to examine geographical variation in the incidence of lung cancer. The objective of this paper is to estimate the relative risk for lung cancer. Four statistical models for estimating the relative risk for lung cancer, together with population censuses of the study area for the period 2006 to 2011, were used in this work: the Standardized Morbidity Ratio (SMR), the most popular statistic used in the field of disease mapping; the Poisson-gamma model, one of the earliest applications of Bayesian methodology; the Besag, York and Mollie (BYM) model; and the Mixture model. As an initial step, this study provides a review of all the proposed models, which we then apply to lung cancer data in Libya. Maps, tables, graphs, and goodness-of-fit (GOF) measures, which are commonly used in statistical modelling to compare fitted models, were used to compare and present the preliminary results. The main results show that the Poisson-gamma, BYM, and Mixture models can overcome the problem of the first model (SMR) when there is no observed lung cancer case in certain districts. Results show that the Mixture model is the most robust and provides better relative risk estimates across the range of models. PMID:28440974

  9. Towards accurate modelling of galaxy clustering on small scales: testing the standard ΛCDM + halo model

    NASA Astrophysics Data System (ADS)

    Sinha, Manodeep; Berlind, Andreas A.; McBride, Cameron K.; Scoccimarro, Roman; Piscionere, Jennifer A.; Wibking, Benjamin D.

    2018-07-01

    Interpreting the small-scale clustering of galaxies with halo models can elucidate the connection between galaxies and dark matter haloes. Unfortunately, the modelling is typically not sufficiently accurate for ruling out models statistically. It is thus difficult to use the information encoded in small scales to test cosmological models or probe subtle features of the galaxy-halo connection. In this paper, we attempt to push halo modelling into the `accurate' regime with a fully numerical mock-based methodology and careful treatment of statistical and systematic errors. With our forward-modelling approach, we can incorporate clustering statistics beyond the traditional two-point statistics. We use this modelling methodology to test the standard Λ cold dark matter (ΛCDM) + halo model against the clustering of Sloan Digital Sky Survey (SDSS) seventh data release (DR7) galaxies. Specifically, we use the projected correlation function, group multiplicity function, and galaxy number density as constraints. We find that while the model fits each statistic separately, it struggles to fit them simultaneously. Adding group statistics leads to a more stringent test of the model and significantly tighter constraints on model parameters. We explore the impact of varying the adopted halo definition and cosmological model and find that changing the cosmology makes a significant difference. The most successful model we tried (Planck cosmology with Mvir haloes) matches the clustering of low-luminosity galaxies, but exhibits a 2.3σ tension with the clustering of luminous galaxies, thus providing evidence that the `standard' halo model needs to be extended. This work opens the door to adding interesting freedom to the halo model and including additional clustering statistics as constraints.

  10. Comparison of Computational-Model and Experimental-Example Trained Neural Networks for Processing Speckled Fringe Patterns

    NASA Technical Reports Server (NTRS)

    Decker, A. J.; Fite, E. B.; Thorp, S. A.; Mehmed, O.

    1998-01-01

    The responses of artificial neural networks to experimental and model-generated inputs are compared for detection of damage in twisted fan blades using electronic holography. The training-set inputs, for this work, are experimentally generated characteristic patterns of the vibrating blades. The outputs are damage-flag indicators or second derivatives of the sensitivity-vector-projected displacement vectors from a finite element model. Artificial neural networks have been trained in the past with computational-model-generated training sets. This approach avoids the difficult inverse calculations traditionally used to compare interference fringes with the models. But the high modeling standards are hard to achieve, even with fan-blade finite-element models.

  11. Comparison of Computational-Model and Experimental-Example Trained Neural Networks for Processing Speckled Fringe Patterns

    NASA Technical Reports Server (NTRS)

    Decker, A. J.; Fite, E. B.; Thorp, S. A.; Mehmed, O.

    1998-01-01

    The responses of artificial neural networks to experimental and model-generated inputs are compared for detection of damage in twisted fan blades using electronic holography. The training-set inputs, for this work, are experimentally generated characteristic patterns of the vibrating blades. The outputs are damage-flag indicators or second derivatives of the sensitivity-vector-projected displacement vectors from a finite element model. Artificial neural networks have been trained in the past with computational-model-generated training sets. This approach avoids the difficult inverse calculations traditionally used to compare interference fringes with the models. But the high modeling standards are hard to achieve, even with fan-blade finite-element models.

  12. Translational Modeling to Guide Study Design and Dose Choice in Obesity Exemplified by AZD1979, a Melanin‐concentrating Hormone Receptor 1 Antagonist

    PubMed Central

    Trägårdh, M; Lindén, D; Ploj, K; Johansson, A; Turnbull, A; Carlsson, B; Antonsson, M

    2017-01-01

    In this study, we present the translational modeling used in the discovery of AZD1979, a melanin‐concentrating hormone receptor 1 (MCHr1) antagonist aimed for treatment of obesity. The model quantitatively connects the relevant biomarkers and thereby closes the scaling path from rodent to man, as well as from dose to effect level. The complexity of individual modeling steps depends on the quality and quantity of data as well as the prior information; from semimechanistic body‐composition models to standard linear regression. Key predictions are obtained by standard forward simulation (e.g., predicting effect from exposure), as well as non‐parametric input estimation (e.g., predicting energy intake from longitudinal body‐weight data), across species. The work illustrates how modeling integrates data from several species, fills critical gaps between biomarkers, and supports experimental design and human dose‐prediction. We believe this approach can be of general interest for translation in the obesity field, and might inspire translational reasoning more broadly. PMID:28556607

  13. Effects of tectonic plate deformation on the geodetic reference frame of Mexico

    NASA Astrophysics Data System (ADS)

    Gonzalez Franco, G. A.; Avalos, D.; Esquivel, R.

    2013-05-01

    Positioning for geodetic applications is commonly determined at one observation epoch, but tectonic drift and tectonic deformation cause the coordinates to differ at any other epoch. Finding the right coordinates at an epoch different from the observation time is necessary in Mexico in order to comply with the official reference frame, which requires all coordinates to be referred to the standard epoch 2010.0. Available models of horizontal movement in rigid tectonic plates are used to calculate the displacement of coordinates; however, for a portion of Mexico these models fail because of mis-modeled regional deformation, decreasing the quality of users' data transformed to the standard epoch. In this work we present the progress achieved in measuring actual horizontal motion towards an improved modeling of horizontal displacements for some regions. Mis-modeled velocities as large as 23 mm/a were found, significantly affecting applications such as cadastral and geodetic control. Data from a large set of GNSS permanent stations in Mexico is being analyzed to produce a preliminary model of horizontal crustal movement that will be used to minimize distortions of the reference frame.
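    Propagating an observed position to the standard epoch with a constant site velocity is a one-line computation; the coordinates and velocities below are illustrative, and a 23 mm/a velocity error shows how quickly the mis-modeling cited above accumulates.

```python
# Minimal sketch of propagating observed coordinates to the standard
# epoch 2010.0 with a constant site velocity. All values illustrative.
def to_epoch(coord_obs, velocity, t_obs, t_ref=2010.0):
    """coord in metres, velocity in metres/year, epochs in decimal years."""
    return tuple(c + v * (t_ref - t_obs) for c, v in zip(coord_obs, velocity))

# A 23 mm/a velocity error accumulates 0.115 m over five years:
x = to_epoch((1000.000, 2000.000), (0.023, -0.005), t_obs=2005.0)
print(x)
```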

  14. Longitudinal Relationship Between Sitting Time on a Working Day and Vitality, Work Performance, Presenteeism, and Sickness Absence.

    PubMed

    Hendriksen, Ingrid J M; Bernaards, Claire M; Steijn, Wouter M P; Hildebrandt, Vincent H

    2016-08-01

    The aim of this study was to explore the longitudinal relationship between sitting time on a working day and vitality, work performance, presenteeism, and sickness absence. At the start and end of a five-month intervention program at the workplace, as well as 10 months after the intervention, sitting time and work-related outcomes were measured using a standardized self-administered questionnaire and company records. Generalized linear mixed models were used to estimate the longitudinal relationship between sitting time and work-related outcomes, and possible interaction effects over time. A significant and sustainable decrease in sitting time on a working day was observed. Sitting less was significantly related to higher vitality scores, but this effect was marginal (b = -0.0006, P = 0.000). Our finding of significant though marginal associations between sitting time and important work-related outcomes justifies further research.

  15. Impact of Neutrino Opacities on Core-collapse Supernova Simulations

    NASA Astrophysics Data System (ADS)

    Kotake, Kei; Takiwaki, Tomoya; Fischer, Tobias; Nakamura, Ko; Martínez-Pinedo, Gabriel

    2018-02-01

    The accurate description of neutrino opacities is central to both the core-collapse supernova (CCSN) phenomenon and the validity of the explosion mechanism itself. In this work, we study in a systematic fashion the role of a variety of well-selected neutrino opacities in CCSN simulations where the multi-energy, three-flavor neutrino transport is solved using the isotropic diffusion source approximation (IDSA) scheme. To verify our code, we first present results from one-dimensional (1D) simulations following the core collapse, bounce, and ∼250 ms postbounce of a 15 M⊙ star using a standard set of neutrino opacities by Bruenn. A detailed comparison with published results supports the reliability of our three-flavor IDSA scheme using the standard opacity set. We then investigate in 1D simulations how individual opacity updates lead to differences with the baseline run with the standard opacity set. Through detailed comparisons with previous work, we check the validity of our implementation of each update in a step-by-step manner. Individual neutrino opacities with the largest impact on the overall evolution in 1D simulations are selected for systematic comparisons in our two-dimensional (2D) simulations. Special attention is given to the criterion of explodability in the 2D models. We discuss the implications of these results as well as their limitations and the requirements for future, more elaborate CCSN modeling.

  16. Keynote Address: ACR-NEMA standards and their implications for teleradiology

    NASA Astrophysics Data System (ADS)

    Horii, Steven C.

    1990-06-01

    The ACR-NEMA Standard was developed initially as an interface standard for the interconnection of two pieces of imaging equipment. Essentially, the Standard defines a point-to-point hardware connection with the necessary protocol and data structure so that two differing devices which meet the specification will be able to communicate with each other. The Standard does not define a particular PACS architecture, nor does it specify a database structure. In part, these are the reasons why implementers have had difficulty in using the Standard in a full PACS. Recent activity of the Working Groups formed by the Committee overseeing work on the ACR-NEMA Standard has changed some of the "flavor" of the Standard. It was realized that connection of PACS with hospital and radiology information systems (HIS and RIS) is necessary if a PACS is ever to be successful. The idea of interconnecting heterogeneous computer systems has pushed Standards development beyond the scope of the original work. Teleradiology, which inherently involves wide-area networking, may be a direct beneficiary of the new directions taken by the Standards Working Groups. This paper will give a brief history of the ACR-NEMA effort, describe the "parent" Standard and its "offspring", and describe the activity of the current Working Groups, with particular emphasis on the potential impacts on teleradiology.

  17. A comparison of selected models for estimating cable icing

    NASA Astrophysics Data System (ADS)

    McComber, Pierre; Druez, Jacques; Laflamme, Jean

    In many cold climate countries, it is becoming increasingly important to monitor transmission line icing. Indeed, by knowing in advance of localized danger of icing overloads, electric utilities can take measures in time to prevent generalized failure of the power transmission network. Recently in Canada, a study was made to compare a few icing models that estimate ice loads for freezing rain events from meteorological data. The models tested used only standard meteorological parameters, i.e. wind speed and direction, temperature and precipitation rate. This study has shown that standard meteorological parameters can only achieve very limited accuracy, especially for longer icing events. However, with the help of an additional instrument monitoring the icing rate intensity, a significant improvement in model prediction might be achieved. The icing rate meter (IRM), which counts icing and de-icing cycles per unit time on a standard probe, can be used to estimate the icing intensity. A cable icing estimation is then made by taking into consideration the accretion size, temperature, wind speed and direction, and precipitation rate. In this paper, a comparison is made between the predictions of two previously tested models (one obtained and the other reconstructed from their description in the public literature) and of a model based on the icing rate meter readings. The models are tested against nineteen events recorded on an icing test line at Mt. Valin, Canada, during the winter season 1991-1992. These events are mostly rime resulting from in-cloud icing. However, freezing rain and wet snow events were also recorded. Results indicate that a significant improvement in the estimation is attained by using the icing rate meter data together with the other standard meteorological parameters.
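    Models of the kind compared here combine a vertical flux from precipitation with a horizontal flux from wind-driven droplets. As a hedged sketch of that general form (this is the widely used "simple" freezing-rain accretion form, not a reconstruction of the specific models tested in the paper; the liquid water content default and constants are assumptions):

    ```python
    # Sketch of a simple freezing-rain radial accretion increment, combining a
    # vertical precipitation flux with a horizontal wind-driven flux.
    # Constants and the default liquid water content are illustrative.
    import math

    RHO_WATER = 1.0  # g/cm^3
    RHO_ICE = 0.9    # g/cm^3, typical glaze density

    def radial_ice_increment(precip_mm_per_h, wind_m_per_s, lwc_g_per_m3=0.1):
        """One-hour increment of equivalent radial glaze thickness (mm) on a
        cylindrical conductor, from precipitation rate and wind speed."""
        vertical = precip_mm_per_h * RHO_WATER        # flux from falling rain
        horizontal = 3.6 * wind_m_per_s * lwc_g_per_m3  # wind-driven flux
        return math.hypot(vertical, horizontal) / (math.pi * RHO_ICE)
    ```

    The abstract's point is that this kind of purely meteorological estimate has limited accuracy; the icing rate meter supplies a direct measurement of icing intensity that such formulas cannot capture.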

  18. The International Reference Ionosphere: Rawer's IRI and its status today

    NASA Astrophysics Data System (ADS)

    Bilitza, D.

    2014-11-01

    When the Committee on Space Research (COSPAR) initiated the International Reference Ionosphere (IRI) project in 1968 it wisely selected K. Rawer as its first Chairperson. With a solid footing and good contacts in both the ground-based and space-based ionospheric communities he was ideally suited to pull together colleagues and data from both communities to help build the first version of the IRI. He assembled a team of 20+ international ionospheric experts in the IRI Working Group and chaired and directed the group from 1968 to 1984. The working group has now grown to 63 members and the IRI model has undergone many revisions as new data became available and new modeling techniques were applied. This paper was presented during a special session of the Kleinheubach Tagung 2013 in honor of K. Rawer's 100th birthday. It will review the current status of the IRI model and project and the international recognition it has achieved. It is quite fitting that this year we not only celebrate K. Rawer's 100th birthday but also the exciting news that his favorite science endeavor, IRI, has been internationally recognized as an ISO (International Organization for Standardization) standard. The IRI homepage is at http://irimodel.org.

  19. Exotic equilibria of Harary graphs and a new minimum degree lower bound for synchronization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Canale, Eduardo A., E-mail: ecanale@pol.una.py; Monzón, Pablo, E-mail: monzon@fing.edu.uy

    2015-02-15

    This work is concerned with stability of equilibria in the homogeneous (equal frequencies) Kuramoto model of weakly coupled oscillators. In 2012 [R. Taylor, J. Phys. A: Math. Theor. 45, 1–15 (2012)], a sufficient condition for almost global synchronization was found in terms of the minimum degree–order ratio of the graph. In this work, a new lower bound for this ratio is given. The improvement is achieved by a concrete infinite sequence of regular graphs. In addition, non-standard unstable equilibria of the graphs studied in Wiley et al. [Chaos 16, 015103 (2006)] are shown to exist as conjectured in that work.
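    The homogeneous Kuramoto model underlying this result is dθi/dt = (K/n) Σj∈N(i) sin(θj - θi). A minimal simulation sketch on a complete graph, whose degree-order ratio is 1 and therefore trivially satisfies any such sufficient condition, so random initial phases should synchronize (the integrator, coupling constant, and step size are illustrative choices, not from the paper):

    ```python
    # Sketch: Euler integration of the homogeneous Kuramoto model on a graph.
    # On the complete graph the fully synchronized state is almost globally
    # attracting, so the order parameter r should approach 1.
    import math
    import random

    def kuramoto_step(theta, adj, k=1.0, dt=0.05):
        """One Euler step of dtheta_i/dt = (k/n) * sum_j sin(theta_j - theta_i)."""
        n = len(theta)
        return [t + dt * k / n * sum(math.sin(theta[j] - t) for j in adj[i])
                for i, t in enumerate(theta)]

    def order_parameter(theta):
        """Magnitude r of the mean phase vector; r = 1 means full sync."""
        n = len(theta)
        re = sum(math.cos(t) for t in theta) / n
        im = sum(math.sin(t) for t in theta) / n
        return math.hypot(re, im)

    random.seed(0)
    n = 10
    adj = [[j for j in range(n) if j != i] for i in range(n)]  # complete graph
    theta = [random.uniform(0, 2 * math.pi) for _ in range(n)]
    for _ in range(2000):
        theta = kuramoto_step(theta, adj)
    r = order_parameter(theta)
    ```

    On sparser graphs, such as the Harary graphs studied here, stable non-synchronized equilibria can exist, which is exactly what the degree-ratio bounds rule out.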

  20. Predictors of severe trunk postures among short-haul truck drivers during non-driving tasks: an exploratory investigation involving video-assessment and driver behavioural self-monitoring.

    PubMed

    Olson, R; Hahn, D I; Buckert, A

    2009-06-01

    Short-haul truck (lorry) drivers are particularly vulnerable to back pain and injury due to exposure to whole body vibration, prolonged sitting and demanding material handling tasks. The current project reports the results of video-based assessments (711 stops) and driver behavioural self-monitoring (BSM) (385 stops) of injury hazards during non-driving work. Participants (n = 3) worked in a trailer fitted with a camera system during baseline and BSM phases. Descriptive analyses showed that challenging customer environments and non-standard ingress/egress were prevalent. Statistical modelling of video-assessment results showed that each instance of manual material handling increased the predicted mean for severe trunk postures by 7%, while customer use of a forklift, moving standard pallets and moving non-standard pallets decreased predicted means by 12%, 20% and 22% respectively. Video and BSM comparisons showed that drivers were accurate at self-monitoring frequent environmental conditions, but less accurate at monitoring trunk postures and rare work events. The current study identified four predictors of severe trunk postures that can be modified to reduce the risk of injury among truck drivers, and showed that workers can produce reliable self-assessment data with BSM methods for frequent and easily discriminated environmental events.
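    The reported percentage effects act multiplicatively on the predicted mean count of severe trunk postures, so a stop's combined prediction is the product of the per-condition factors. A small illustrative sketch (the function and the example stop are hypothetical; the four multipliers are taken from the abstract's 7%, -12%, -20% and -22% figures):

    ```python
    # Sketch: combining the reported multiplicative effects on the predicted
    # mean of severe trunk postures per stop. Multipliers from the abstract;
    # the scenario below is illustrative only.
    EFFECTS = {
        "manual_handling_instance": 1.07,  # +7% per instance of manual handling
        "customer_forklift": 0.88,         # -12% when the customer uses a forklift
        "standard_pallets": 0.80,          # -20% when moving standard pallets
        "nonstandard_pallets": 0.78,       # -22% when moving non-standard pallets
    }

    def predicted_multiplier(conditions):
        """Combined multiplier on the predicted mean, given counts per condition."""
        m = 1.0
        for name, count in conditions.items():
            m *= EFFECTS[name] ** count
        return m

    # A stop with three manual-handling instances where the customer uses a forklift:
    mult = predicted_multiplier({"manual_handling_instance": 3,
                                 "customer_forklift": 1})
    ```

    Framing the effects this way makes the practical message visible: shifting handling work onto pallets or forklifts pulls the multiplier below 1, while each added manual-handling instance pushes it up.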

Top