Sample records for imperative compiler framework

  1. Measuring research impact in Australia's medical research institutes: a scoping literature review of the objectives for and an assessment of the capabilities of research impact assessment frameworks.

    PubMed

    Deeming, Simon; Searles, Andrew; Reeves, Penny; Nilsson, Michael

    2017-03-21

    Realising the economic potential of research institutions, including medical research institutes, represents a policy imperative for many Organisation for Economic Co-operation and Development nations. The assessment of research impact has consequently drawn increasing attention. Research impact assessment frameworks (RIAFs) provide a structure to assess research translation, but minimal research has examined whether alternative RIAFs realise the intended policy outcomes. This paper examines the objectives presented for RIAFs in light of economic imperatives to justify ongoing support for health and medical research investment, leverage productivity via commercialisation and outcome-efficiency gains in health systems, and ensure that translation and impact considerations are embedded into the research process. This paper sought to list the stated objectives for RIAFs, to identify existing frameworks and to evaluate whether the identified frameworks possessed the capabilities necessary to address the specified objectives. A scoping review of the literature was conducted to identify objectives specified for RIAFs, to inform descriptive criteria for each objective and to identify existing RIAFs. Criteria were derived for each objective. The capability of the existing RIAFs to realise the alternative objectives was evaluated based upon these criteria. The collated objectives for RIAFs included accountability (top-down), transparency/accountability (bottom-up), advocacy, steering, value for money, management/learning and feedback/allocation, prospective orientation, and speed of translation. Of the 25 RIAFs identified, most satisfied objectives such as accountability and advocacy, which are largely sufficient for the first economic imperative to justify research investment. The frameworks primarily designed to optimise the speed of translation or enable the prospective orientation of research possessed qualities most likely to optimise the productive outcomes from research.
However, the results show that few frameworks met the criteria for these objectives. It is imperative that the objective(s) for an assessment framework are explicit and that RIAFs are designed to realise these objectives. If the objectives include the capability to pro-actively drive productive research impacts, the potential for prospective orientation and a focus upon the speed of translation merits prioritisation. Frameworks designed to optimise research translation and impact, rather than simply assess impact, offer greater promise to contribute to the economic imperatives compelling their implementation.

  2. OpenARC: Extensible OpenACC Compiler Framework for Directive-Based Accelerator Programming Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Seyong; Vetter, Jeffrey S

    2014-01-01

    Directive-based, accelerator programming models such as OpenACC have arisen as an alternative solution to program emerging Scalable Heterogeneous Computing (SHC) platforms. However, the increased complexity in the SHC systems incurs several challenges in terms of portability and productivity. This paper presents an open-sourced OpenACC compiler, called OpenARC, which serves as an extensible research framework to address those issues in directive-based accelerator programming. This paper explains important design strategies and key compiler transformation techniques needed to implement the reference OpenACC compiler. Moreover, this paper demonstrates the efficacy of OpenARC as a research framework for directive-based programming study, by proposing and implementing OpenACC extensions in the OpenARC framework to 1) support hybrid programming of the unified memory and separate memory and 2) exploit architecture-specific features in an abstract manner. Porting thirteen standard OpenACC programs and three extended OpenACC programs to CUDA GPUs shows that OpenARC performs similarly to a commercial OpenACC compiler, while it serves as a high-level research framework.
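
    The OpenARC record centres on directive-based programming, where annotations rather than code rewrites select how a loop is executed. As a loose sketch (this is not OpenARC's or OpenACC's actual syntax; `acc_parallel` and `saxpy_elem` are invented names), a Python decorator can play the role of such a directive, switching the execution backend without touching the loop body:

```python
from concurrent.futures import ThreadPoolExecutor

def acc_parallel(backend="serial"):
    """Toy stand-in for a directive like '#pragma acc parallel loop':
    the annotation, not the loop body, chooses the execution target."""
    def wrap(fn):
        def run(data):
            if backend == "threads":
                with ThreadPoolExecutor() as ex:
                    return list(ex.map(fn, data))
            return [fn(x) for x in data]   # fallback: sequential host loop
        return run
    return wrap

@acc_parallel(backend="threads")
def saxpy_elem(args):
    a, x, y = args
    return a * x + y

result = saxpy_elem([(2.0, 1.0, 3.0), (2.0, 2.0, 3.0)])
```

    A real directive-based compiler performs this retargeting at compile time, including host-device data movement, which this sketch deliberately omits.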

  3. Proceedings of the workshop on Compilation of (Symbolic) Languages for Parallel Computers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Foster, I.; Tick, E.

    1991-11-01

    This report comprises the abstracts and papers for the talks presented at the Workshop on Compilation of (Symbolic) Languages for Parallel Computers, held October 31--November 1, 1991, in San Diego. These unrefereed contributions were provided by the participants for the purpose of this workshop; many of them will be published elsewhere in peer-reviewed conferences and publications. Our goal in planning this workshop was to bring together researchers from different disciplines with common problems in compilation. In particular, we wished to encourage interaction between researchers working in compilation of symbolic languages and those working on compilation of conventional, imperative languages. The fundamental problems facing researchers interested in compilation of logic, functional, and procedural programming languages for parallel computers are essentially the same. However, differences in the basic programming paradigms have led to different communities emphasizing different species of the parallel compilation problem. For example, parallel logic and functional languages provide dataflow-like formalisms in which control dependencies are unimportant. Hence, a major focus of research in compilation has been on techniques that try to infer when sequential control flow can safely be imposed. Granularity analysis for scheduling is a related problem. The single-assignment property leads to a need for analysis of memory use in order to detect opportunities for reuse. Much of the work in each of these areas relies on the use of abstract interpretation techniques.
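
    The abstract notes that much of this analysis work relies on abstract interpretation. A minimal, self-contained illustration (a toy sign domain, not any system from the workshop) shows the core idea of executing a program over abstract values instead of concrete ones:

```python
# Toy abstract interpretation over the sign domain {neg, zero, pos, top}.
NEG, ZERO, POS, TOP = "neg", "zero", "pos", "top"

def abs_const(n):
    return ZERO if n == 0 else (POS if n > 0 else NEG)

def abs_add(a, b):
    if a == ZERO: return b
    if b == ZERO: return a
    return a if a == b else TOP      # pos + neg could be anything

def abs_mul(a, b):
    if ZERO in (a, b): return ZERO
    if TOP in (a, b): return TOP
    return POS if a == b else NEG

def analyse(e):
    if e[0] == "const":
        return abs_const(e[1])
    l, r = analyse(e[1]), analyse(e[2])
    return abs_add(l, r) if e[0] == "add" else abs_mul(l, r)

# -3 * (2 + 5) is provably negative without computing it
expr = ("mul", ("const", -3), ("add", ("const", 2), ("const", 5)))
```

    The same machinery, over richer domains, underlies the memory-reuse and control-flow analyses the workshop abstract mentions.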

  4. Driving toward guiding principles: a goal for privacy, confidentiality, and security of health information.

    PubMed

    Buckovich, S A; Rippen, H E; Rozen, M J

    1999-01-01

    As health care moves from paper to electronic data collection, providing easier access and dissemination of health information, the development of guiding privacy, confidentiality, and security principles is necessary to help balance the protection of patients' privacy interests against appropriate information access. A comparative review and analysis was done, based on a compilation of privacy, confidentiality, and security principles from many sources. Principles derived from ten identified sources were compared with each of the compiled principles to assess support level, uniformity, and inconsistencies. Of 28 compiled principles, 23 were supported by at least 50 percent of the sources. Technology could address at least 12 of the principles. Notable consistencies among the principles could provide a basis for consensus for further legislative and organizational work. It is imperative that all participants in our health care system work actively toward a viable resolution of this information privacy debate.

  5. Driving Toward Guiding Principles

    PubMed Central

    Buckovich, Suzy A.; Rippen, Helga E.; Rozen, Michael J.

    1999-01-01

    As health care moves from paper to electronic data collection, providing easier access and dissemination of health information, the development of guiding privacy, confidentiality, and security principles is necessary to help balance the protection of patients' privacy interests against appropriate information access. A comparative review and analysis was done, based on a compilation of privacy, confidentiality, and security principles from many sources. Principles derived from ten identified sources were compared with each of the compiled principles to assess support level, uniformity, and inconsistencies. Of 28 compiled principles, 23 were supported by at least 50 percent of the sources. Technology could address at least 12 of the principles. Notable consistencies among the principles could provide a basis for consensus for further legislative and organizational work. It is imperative that all participants in our health care system work actively toward a viable resolution of this information privacy debate. PMID:10094065

  6. Makahiki: An Open Source Serious Game Framework for Sustainability Education and Conservation

    ERIC Educational Resources Information Center

    Xu, Yongwen; Johnson, Philip M.; Lee, George E.; Moore, Carleton A.; Brewer, Robert S.

    2014-01-01

    Sustainability education and conservation have become an international imperative due to the rising cost of energy, increasing scarcity of natural resource and irresponsible environmental practices. This paper presents Makahiki, an open source serious game framework for sustainability, which implements an extensible framework for different…

  7. ProjectQ Software Framework

    NASA Astrophysics Data System (ADS)

    Steiger, Damian S.; Haener, Thomas; Troyer, Matthias

    Quantum computers promise to transform our notions of computation by offering a completely new paradigm. A high level quantum programming language and optimizing compilers are essential components to achieve scalable quantum computation. In order to address this, we introduce the ProjectQ software framework - an open source effort to support both theorists and experimentalists by providing intuitive tools to implement and run quantum algorithms. Here, we present our ProjectQ quantum compiler, which compiles a quantum algorithm from our high-level Python-embedded language down to low-level quantum gates available on the target system. We demonstrate how this compiler can be used to control actual hardware and to run high-performance simulations.
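
    The ProjectQ record describes compiling a high-level algorithm down to the gates a backend supports. A toy decomposition pass (illustrative only; this is not ProjectQ's actual API) captures the core rewrite loop, using the well-known identity that a SWAP gate equals three CNOTs:

```python
# Toy gate-decomposition pass: rewrite gates recursively until only
# gates the backend supports remain.
def lower(circuit, rules):
    out = []
    for gate, qubits in circuit:
        if gate in rules:
            out.extend(lower(rules[gate](qubits), rules))  # expand, then recurse
        else:
            out.append((gate, qubits))                     # backend-native gate
    return out

# A standard identity: SWAP(a, b) decomposes into three CNOTs.
RULES = {"SWAP": lambda q: [("CNOT", (q[0], q[1])),
                            ("CNOT", (q[1], q[0])),
                            ("CNOT", (q[0], q[1]))]}

lowered = lower([("H", (0,)), ("SWAP", (0, 1))], RULES)
```

    A production quantum compiler layers many such passes, plus optimization and mapping to hardware connectivity, over this basic structure.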

  8. Functional Programming in Computer Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, Loren James; Davis, Marion Kei

    We explore functional programming through a 16-week internship at Los Alamos National Laboratory. Functional programming is a branch of computer science that has exploded in popularity over the past decade due to its high-level syntax, ease of parallelization, and abundant applications. First, we summarize functional programming by listing the advantages of functional programming languages over the usual imperative languages, and we introduce the concept of parsing. Second, we discuss the importance of lambda calculus in the theory of functional programming. Lambda calculus was invented by Alonzo Church in the 1930s to formalize the concept of effective computability, and every functional language is essentially some implementation of lambda calculus. Finally, we display the lasting products of the internship: additions to a compiler and runtime system for the pure functional language STG, including both a set of tests that indicate the validity of updates to the compiler and a compiler pass that checks for illegal instances of duplicate names.
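
    Since the record states that every functional language is essentially an implementation of lambda calculus, a minimal call-by-value evaluator makes the point concrete (a toy sketch, unrelated to the STG compiler mentioned):

```python
# Minimal call-by-value lambda-calculus evaluator: terms are tuples,
# lambda terms become Python closures over their environment.
def ev(t, env):
    tag = t[0]
    if tag == "lit":
        return t[1]
    if tag == "var":
        return env[t[1]]
    if tag == "lam":                      # build a closure over env
        return lambda v, x=t[1], b=t[2], e=env: ev(b, {**e, x: v})
    return ev(t[1], env)(ev(t[2], env))   # "app": evaluate, then apply

# The K combinator: (\x. \y. x) 1 2  reduces to  1
K = ("lam", "x", ("lam", "y", ("var", "x")))
r = ev(("app", ("app", K, ("lit", 1)), ("lit", 2)), {})
```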

  9. Parallelizing Compiler Framework and API for Power Reduction and Software Productivity of Real-Time Heterogeneous Multicores

    NASA Astrophysics Data System (ADS)

    Hayashi, Akihiro; Wada, Yasutaka; Watanabe, Takeshi; Sekiguchi, Takeshi; Mase, Masayoshi; Shirako, Jun; Kimura, Keiji; Kasahara, Hironori

    Heterogeneous multicores have been attracting much attention as a way to attain high performance while keeping power consumption low in a wide range of areas. However, heterogeneous multicores impose very difficult programming on programmers, and the long application development period lowers product competitiveness. To overcome this situation, this paper proposes a compilation framework which bridges the gap between programmers and heterogeneous multicores. In particular, it describes a compilation framework based on the OSCAR compiler, which realizes coarse-grain task parallel processing, data transfer using a DMA controller, and power reduction control from user programs with DVFS and clock gating on various heterogeneous multicores from different vendors. The paper also evaluates the processing performance and power reduction of the proposed framework on a newly developed 15-core heterogeneous multicore chip named RP-X, integrating 8 general-purpose processor cores and 3 types of accelerator cores, which was developed by Renesas Electronics, Hitachi, Tokyo Institute of Technology and Waseda University. The framework attains speedups of up to 32x for an optical flow program, using eight general-purpose processor cores and four DRP (Dynamically Reconfigurable Processor) accelerator cores, against sequential execution by a single processor core, and an 80% power reduction for real-time AAC encoding.
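
    The coarse-grain task parallel processing described above starts from a task graph ordered by data dependences. A minimal sketch using Python's standard `graphlib` (the task names are invented; OSCAR's actual scheduler is far more elaborate):

```python
# Toy coarse-grain task scheduling: order tasks by data dependences
# with a topological sort (stdlib graphlib, Python 3.9+).
from graphlib import TopologicalSorter

# each task maps to the set of tasks whose results it consumes
deps = {"blur": {"load"}, "edges": {"load"}, "merge": {"blur", "edges"}}
order = list(TopologicalSorter(deps).static_order())
# "load" must come first and "merge" last; "blur"/"edges" may run in parallel
```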

  10. The Pathways Framework Meets Consumer Culture: Young People, Careers, and Commitment

    ERIC Educational Resources Information Center

    Vaughan, Karen

    2005-01-01

    This article engages with current debates in New Zealand over the legitimacy of various young people's activities within a transition-to-work framework based around the metaphor of "pathways". The article argues for a more complex understanding of the imperatives young people now face in choosing careers within a deregulated, seamless…

  11. MetaJC++: A flexible and automatic program transformation technique using meta framework

    NASA Astrophysics Data System (ADS)

    Beevi, Nadera S.; Reghu, M.; Chitraprasad, D.; Vinodchandra, S. S.

    2014-09-01

    A compiler is a tool that translates abstract code containing natural-language terms into machine code, and meta compilers are available that compile more than one language. We have developed a meta framework that intends to combine two dissimilar programming languages, namely C++ and Java, to provide a flexible object-oriented programming platform for the user. Suitable constructs from both languages have been combined, thereby forming a new and stronger meta-language. The framework is developed using the compiler-writing tools Flex and Yacc to design the front end of the compiler. The lexer and parser have been developed to accommodate the complete keyword set and syntax set of both languages. Two intermediate representations are used between the source program and machine code. An abstract syntax tree is used as a high-level intermediate representation that preserves the hierarchical properties of the source program. A new machine-independent stack-based bytecode has also been devised to act as a low-level intermediate representation. The bytecode is organised into an output class file that can be used to produce an interpreted output. The results, especially in the spheres of providing C++ concepts in Java, give an insight into the potentially strong features of the resultant meta-language.
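
    The two intermediate representations described, an AST lowered to a stack-based bytecode, can be sketched in miniature (a toy expression language, not the MetaJC++ implementation):

```python
# Toy pipeline mirroring the two IRs: an expression AST is lowered to a
# machine-independent stack bytecode, which is then interpreted.
def compile_expr(ast):
    if isinstance(ast, int):
        return [("PUSH", ast)]
    op, left, right = ast
    return compile_expr(left) + compile_expr(right) + [(op, None)]

def run(code):
    stack = []
    for op, arg in code:
        if op == "PUSH":
            stack.append(arg)
        else:
            b, a = stack.pop(), stack.pop()
            stack.append(a + b if op == "ADD" else a * b)
    return stack.pop()

bytecode = compile_expr(("ADD", 2, ("MUL", 3, 4)))   # 2 + 3*4
```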

  12. Employability and Higher Education: The Follies of the "Productivity Challenge" in the Teaching Excellence Framework

    ERIC Educational Resources Information Center

    Frankham, Jo

    2017-01-01

    This article considers questions of "employability," a notion foregrounded in the Green and White Papers on the Teaching Excellence Framework (TEF). The paper first questions government imperatives concerning employability and suggests a series of mismatches that are evident in the rhetorics in this area. This summary opens up elements…

  13. Automated Analysis of Stateflow Models

    NASA Technical Reports Server (NTRS)

    Bourbouh, Hamza; Garoche, Pierre-Loic; Garion, Christophe; Gurfinkel, Arie; Kahsai, Temesghen; Thirioux, Xavier

    2017-01-01

    Stateflow is a widely used modeling framework for embedded and cyber-physical systems where control software interacts with physical processes. In this work, we present a fully automated safety verification technique for Stateflow models. Our approach is twofold: (i) we faithfully compile Stateflow models into hierarchical state machines, and (ii) we use an automated logic-based verification engine to decide the validity of safety properties. The starting point of our approach is a denotational semantics of Stateflow. We propose a compilation process using continuation-passing style (CPS) denotational semantics. Our compilation technique preserves the structural and modal behavior of the system. The overall approach is implemented as an open source toolbox that can be integrated into the existing MathWorks Simulink Stateflow modeling framework. We present preliminary experimental evaluations that illustrate the effectiveness of our approach in code generation and safety verification of industrial-scale Stateflow models.
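
    The verification flow described, compile the chart to a state machine and then decide safety properties, can be illustrated in miniature (a toy transition table and bounded reachability check; the paper's logic-based engine is far more powerful, and all names here are invented):

```python
# Toy compiled chart: a flat transition table plus a bounded
# reachability check of the kind a safety verifier performs.
TRANS = {("idle", "start"): "run",
         ("run", "stop"): "idle",
         ("run", "fault"): "bad"}

def step(state, event):
    return TRANS.get((state, event), state)   # unhandled events self-loop

def reachable(init, alphabet, bound):
    seen = frontier = {init}
    for _ in range(bound):
        frontier = {step(s, e) for s in frontier for e in alphabet} - seen
        seen = seen | frontier
    return seen

states = reachable("idle", ["start", "stop", "fault"], 3)
# a checker would flag that the unsafe state "bad" is reachable
```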

  14. Becoming a World Christian: Hospitality as a Framework for Engaging Otherness

    ERIC Educational Resources Information Center

    Arrington, Aminta

    2017-01-01

    Hospitality is the Christian imperative of welcoming the stranger to our table, which serves as a living metaphor for the salvation God extends to all of us, welcoming us as sinners to his table of abundance. As we transition from the era of missions to the era of world Christianity, a hospitality framework is helpful for the concomitant task of…

  15. Validity as a social imperative for assessment in health professions education: a concept analysis.

    PubMed

    Marceau, Mélanie; Gallagher, Frances; Young, Meredith; St-Onge, Christina

    2018-06-01

    Assessment can have far-reaching consequences for future health care professionals and for society. Thus, it is essential to establish the quality of assessment. Few modern approaches to validity are well situated to ensure the quality of complex assessment approaches, such as authentic and programmatic assessments. Here, we explore and delineate the concept of validity as a social imperative in the context of assessment in health professions education (HPE) as a potential framework for examining the quality of complex and programmatic assessment approaches. We conducted a concept analysis using Rodgers' evolutionary method to describe the concept of validity as a social imperative in the context of assessment in HPE. Supported by an academic librarian, we developed and executed a search strategy across several databases for literature published between 1995 and 2016. From a total of 321 citations, we identified 67 articles that met our inclusion criteria. Two team members analysed the texts using a specified approach to qualitative data analysis. Consensus was achieved through full team discussions. Attributes that characterise the concept were: (i) demonstration of the use of evidence considered credible by society to document the quality of assessment; (ii) validation embedded through the assessment process and score interpretation; (iii) documented validity evidence supporting the interpretation of the combination of assessment findings, and (iv) demonstration of a justified use of a variety of evidence (quantitative and qualitative) to document the quality of assessment strategies. The emerging concept of validity as a social imperative highlights some areas of focus in traditional validation frameworks, whereas some characteristics appear unique to HPE and move beyond traditional frameworks. 
The study reflects the importance of embedding consideration for society and societal concerns throughout the assessment and validation process, and may represent a potential lens through which to examine the quality of complex and programmatic assessment approaches. © 2018 John Wiley & Sons Ltd and The Association for the Study of Medical Education.

  16. GIS-assisted spatial analysis for urban regulatory detailed planning: designer's dimension in the Chinese code system

    NASA Astrophysics Data System (ADS)

    Yu, Yang; Zeng, Zheng

    2009-10-01

    By discussing the causes behind the high ratio of amendments in the implementation of urban regulatory detailed plans in China despite their law-ensured status, the study aims to reconcile the conflict between the legal authority of regulatory detailed planning and the insufficient scientific support in its decision-making and compilation. It does so by introducing spatial analysis based on GIS technology and 3D modeling into the process, thus presenting a more scientific and flexible approach to regulatory detailed planning in China. The study first points out that the current compilation process for urban regulatory detailed plans in China employs a mainly empirical approach, which leaves the plans constantly subject to amendment; it then discusses the need for and current utilization of GIS in the Chinese system and proposes the framework of a GIS-assisted 3D spatial analysis process from the designer's perspective, which can be regarded as an alternating process between descriptive codes and physical design in the compilation of regulatory detailed planning. With a case study of the processes and results from applying the framework, the paper concludes that the proposed framework can be an effective instrument that brings more rationality, flexibility and thus more efficiency to the compilation and decision-making process for urban regulatory detailed plans in China.

  17. On Fusing Recursive Traversals of K-d Trees

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rajbhandari, Samyam; Kim, Jinsung; Krishnamoorthy, Sriram

    Loop fusion is a key program transformation for data locality optimization that is implemented in production compilers. But optimizing compilers currently cannot exploit fusion opportunities across a set of recursive tree traversal computations with producer-consumer relationships. In this paper, we develop a compile-time approach to dependence characterization and program transformation to enable fusion across recursively specified traversals over k-ary trees. We present the FuseT source-to-source code transformation framework to automatically generate fused composite recursive operators from an input program containing a sequence of primitive recursive operators. We use our framework to implement fused operators for MADNESS, Multiresolution Adaptive Numerical Environment for Scientific Simulation. We show that locality optimization through fusion can offer more than an order of magnitude performance improvement.
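
    The fusion the record describes, replacing a producer traversal feeding a consumer traversal with one combined pass, can be shown on a toy binary tree (illustrative only; FuseT operates on k-ary trees with full dependence analysis):

```python
# Unfused: a producer traversal builds an intermediate tree that a
# consumer traversal then reads. Fused: one pass, no intermediate tree.
# Trees are (value, left, right) tuples or None.
def scale(t, k):                 # producer traversal
    return None if t is None else (t[0] * k, scale(t[1], k), scale(t[2], k))

def tsum(t):                     # consumer traversal
    return 0 if t is None else t[0] + tsum(t[1]) + tsum(t[2])

def fused_scale_sum(t, k):       # fused composite operator
    return 0 if t is None else (t[0] * k + fused_scale_sum(t[1], k)
                                + fused_scale_sum(t[2], k))

tree = (1, (2, None, None), (3, None, None))
```

    The fused operator visits each node once and allocates nothing, which is where the locality and performance gains come from.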

  18. Caring as an Imperative for Nursing Education.

    ERIC Educational Resources Information Center

    Cook, Patricia R.; Cullen, Janice A.

    2003-01-01

    An associate nursing degree program threads caring across the curriculum using Watson's framework of interpersonal/transpersonal processes for caring and a taxonomy of affective competencies. Ways of caring are integrated into classroom and clinical experiences. (Contains 20 references.) (SK)

  19. Literary and Personal Criticism for Preservice Teachers: A Pedagogical Imperative.

    ERIC Educational Resources Information Center

    Roberts, Sherron Killingsworth

    1998-01-01

    Provides a theoretical framework for designing a children's literature course that requires preservice teachers to critically analyze literature in ways that are personally meaningful. Suggests how preservice teachers can read children's literature intensively rather than extensively. (PA)

  20. Catheter-related bacteraemia and infective endocarditis caused by Kocuria species.

    PubMed

    Lai, C C; Wang, J Y; Lin, S H; Tan, C K; Wang, C Y; Liao, C H; Chou, C H; Huang, Y T; Lin, H I; Hsueh, P R

    2011-02-01

    We describe five patients with positive blood culture for Kocuria species. Three patients had catheter-related bacteraemia and one had infective endocarditis caused by Kocuria kristinae, and one had a K. marina isolate, which was considered to be a contaminant. Identification of the isolates was further confirmed by 16S rRNA gene sequence analysis. In conclusion, Kocuria species are an unusual cause of infection in immunocompromised patients. Accurate identification with molecular methods is imperative for the diagnosis of these unusual pathogens. © 2010 The Authors. Journal Compilation © 2010 European Society of Clinical Microbiology and Infectious Diseases.

  1. Aspect-Oriented Monitoring of C Programs

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; VanWyk, Eric

    2008-01-01

    The paper presents current work on extending ASPECTC with state machines, resulting in a framework for aspect-oriented monitoring of C programs. Such a framework can be used for testing purposes, or it can be part of a fault protection strategy. The long term goal is to explore the synergy between the fields of runtime verification, focused on program monitoring, and aspect-oriented programming, focused on more general program development issues. The work is inspired by the observation that most work in this direction has been done for JAVA, partly due to the lack of easily accessible extensible compiler frameworks for C. The work is performed using the SILVER extensible attribute grammar compiler framework, in which C has been defined as a host language. Our work consists of extending C with ASPECTC, and subsequently extending ASPECTC with state machines.
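
    Aspect-oriented monitoring as described, weaving advice at join points that feeds events into a property state machine, can be mimicked with decorators (a toy Python sketch, not ASPECTC; the property and all names are invented):

```python
# Toy aspect-style monitoring: a decorator weaves "before" advice that
# feeds call events into a property state machine ("open before read").
class Monitor:
    def __init__(self):
        self.state, self.violations = "closed", 0
    def event(self, ev):
        if ev == "open":
            self.state = "open"
        elif ev == "read" and self.state != "open":
            self.violations += 1          # safety property violated
        elif ev == "close":
            self.state = "closed"

mon = Monitor()

def advice(name):                          # the "aspect"
    def wrap(fn):
        def inner(*args):
            mon.event(name)                # advice runs at the join point
            return fn(*args)
        return inner
    return wrap

@advice("read")
def read_file():
    return "data"

read_file()    # a read before any open: the monitor records a violation
```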

  2. Further and Higher Education Partnerships. The Future for Collaboration.

    ERIC Educational Resources Information Center

    Abramson, Mike, Ed.; And Others

    The following papers are included: "Introduction" (Mike Abramson, John Bird, Anne Stennett); "Partnership Imperatives: A Critical Appraisal" (Mike Abramson); "Further and Higher Education Partnerships: The Evolution of a National Policy Framework" (John Bird); "Finance: The Bedrock of Good Partnerships"…

  3. FX-87 performance measurements: data-flow implementation. Technical report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hammel, R.T.; Gifford, D.K.

    1988-11-01

    This report documents a series of experiments performed to explore the thesis that the FX-87 effect system permits a compiler to schedule imperative programs (i.e., programs that may contain side-effects) for execution on a parallel computer. The authors analyze how much the FX-87 static effect system can improve the execution times of five benchmark programs on a parallel graph interpreter. Three of their benchmark programs do not use side-effects (factorial, fibonacci, and polynomial division) and thus did not have any effect-induced constraints. Their FX-87 performance was comparable to their performance in a purely functional language. Two of the benchmark programs use side effects (DNA sequence matching and Scheme interpretation) and the compiler was able to use effect information to reduce their execution times by factors of 1.7 to 5.4 when compared with sequential execution times. These results support the thesis that a static effect system is a powerful tool for compilation to multiprocessor computers. However, the graph interpreter we used was based on unrealistic assumptions, and thus our results may not accurately reflect the performance of a practical FX-87 implementation. The results also suggest that conventional loop analysis would complement the FX-87 effect system.
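
    The FX-87 idea the report tests, using statically inferred effects to decide which expressions may safely run in parallel, can be sketched with a toy effect inference (an invented mini-language, not FX-87 syntax):

```python
# Toy static effect system: infer (reads, writes) per expression, and
# permit parallel scheduling only when the effect sets do not interfere.
def effects(expr):
    op = expr[0]
    if op == "get":
        return ({expr[1]}, set())           # reads a region
    if op == "set":
        return (set(), {expr[1]})           # writes a region
    if op == "pure":
        return (set(), set())
    pairs = [effects(e) for e in expr[1:]]  # "seq": union of sub-effects
    return (set().union(*(p[0] for p in pairs)),
            set().union(*(p[1] for p in pairs)))

def can_parallelise(e1, e2):
    (r1, w1), (r2, w2) = effects(e1), effects(e2)
    return not (w1 & (r2 | w2)) and not (w2 & (r1 | w1))

ok = can_parallelise(("get", "a"), ("set", "b"))    # disjoint regions
bad = can_parallelise(("get", "a"), ("set", "a"))   # write/read conflict
```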

  4. BioInt: an integrative biological object-oriented application framework and interpreter.

    PubMed

    Desai, Sanket; Burra, Prasad

    2015-01-01

    BioInt, a biological programming application framework and interpreter, is an attempt to equip researchers with seamless integration, efficient extraction and effortless analysis of data from various biological databases and algorithms. Based on the type of biological data, algorithms and related functionalities, a biology-specific framework was developed which has nine modules. The modules are a compilation of numerous reusable BioADTs. This software ecosystem, containing more than 450 biological objects underneath the interpreter, makes it flexible, integrative and comprehensive. Similar to Python, BioInt eliminates the compilation and linking steps, cutting development time significantly. The researcher can write scripts using the available BioADTs (following C++ syntax) and execute them interactively or use them as a command-line application. It has features that enable automation, extension of the framework with new/external BioADTs/libraries, and deployment of complex workflows.

  5. A domain-specific compiler for a parallel multiresolution adaptive numerical simulation environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rajbhandari, Samyam; Kim, Jinsung; Krishnamoorthy, Sriram

    This paper describes the design and implementation of a layered domain-specific compiler to support MADNESS---Multiresolution ADaptive Numerical Environment for Scientific Simulation. MADNESS is a high-level software environment for the solution of integral and differential equations in many dimensions, using adaptive and fast harmonic analysis methods with guaranteed precision. MADNESS uses k-d trees to represent spatial functions and implements operators like addition, multiplication, differentiation, and integration on the numerical representation of functions. The MADNESS runtime system provides global namespace support and a task-based execution model including futures. MADNESS is currently deployed on massively parallel supercomputers and has enabled many science advances. Due to the highly irregular and statically unpredictable structure of the k-d trees representing the spatial functions encountered in MADNESS applications, only purely runtime approaches to optimization have previously been implemented in the MADNESS framework. This paper describes a layered domain-specific compiler developed to address some performance bottlenecks in MADNESS. The newly developed static compile-time optimizations, in conjunction with the MADNESS runtime support, enable significant performance improvement for the MADNESS framework.

  6. A high performance data parallel tensor contraction framework: Application to coupled electro-mechanics

    NASA Astrophysics Data System (ADS)

    Poya, Roman; Gil, Antonio J.; Ortigosa, Rogelio

    2017-07-01

    The paper presents aspects of implementation of a new high performance tensor contraction framework for the numerical analysis of coupled and multi-physics problems on streaming architectures. In addition to explicit SIMD instructions and smart expression templates, the framework introduces domain specific constructs for the tensor cross product and its associated algebra recently rediscovered by Bonet et al. (2015, 2016) in the context of solid mechanics. The two key ingredients of the presented expression template engine are as follows. First, the capability to mathematically transform complex chains of operations to simpler equivalent expressions, while potentially avoiding routes with higher levels of computational complexity and, second, to perform a compile time depth-first or breadth-first search to find the optimal contraction indices of a large tensor network in order to minimise the number of floating point operations. For optimisations of tensor contraction such as loop transformation, loop fusion and data locality optimisations, the framework relies heavily on compile time technologies rather than source-to-source translation or JIT techniques. Every aspect of the framework is examined through relevant performance benchmarks, including the impact of data parallelism on the performance of isomorphic and nonisomorphic tensor products, the FLOP and memory I/O optimality in the evaluation of tensor networks, the compilation cost and memory footprint of the framework and the performance of tensor cross product kernels. The framework is then applied to finite element analysis of coupled electro-mechanical problems to assess the speed-ups achieved in kernel-based numerical integration of complex electroelastic energy functionals. In this context, domain-aware expression templates combined with SIMD instructions are shown to provide a significant speed-up over the classical low-level style programming techniques.
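
    The compile-time search for optimal contraction indices mentioned above is, for a chain of matrices, the classic matrix-chain problem; a brute-force cost search sketches the idea (illustrative only; the framework does this with C++ expression templates at compile time):

```python
# Brute-force search for the cheapest contraction order of a matrix
# chain, a 1-D analogue of the tensor-network search described.
from functools import lru_cache

def best_cost(dims):
    """Matrix i has shape dims[i] x dims[i+1]; return the minimal FLOP count."""
    @lru_cache(maxsize=None)
    def go(i, j):
        if i == j:
            return 0
        return min(go(i, k) + go(k + 1, j)
                   + dims[i] * dims[k + 1] * dims[j + 1]
                   for k in range(i, j))
    return go(0, len(dims) - 2)

cost = best_cost((10, 30, 5, 60))   # (A*B)*C beats A*(B*C) here
```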

  7. A grim contradiction: the practice and consequences of corporate social responsibility by British American Tobacco in Malaysia.

    PubMed

    Barraclough, Simon; Morrow, Martha

    2008-04-01

    In the wake of the World Health Organization Framework Convention on Tobacco Control, corporate social responsibility (CSR) is among the few remaining mechanisms for tobacco corporations publicly to promote their interests. Health advocates may be unaware of the scale, nature and implications of tobacco industry CSR. This investigation aimed to construct a typology of tobacco industry CSR through a case study of the evolution and impact of CSR activities of a particular tobacco corporation in one country - British American Tobacco, Malaysia (BATM), the Malaysian market leader. Methods included searching, compiling and critically appraising publicly available materials from British American Tobacco, BATM, published literature and other sources. The study examined BATM's CSR strategy, the issues which it raises, consequences for tobacco control and potential responses by health advocates. The investigation found that BATM's CSR activities included assistance to tobacco growers, charitable donations, scholarships, involvement in anti-smuggling measures, 'youth smoking prevention' programs and annual Social Reports. BATM has stated that its model is predominantly motivated by social and stakeholder obligations. Its CSR activities have, however, had the additional benefits of contributing to a favourable image, deflecting criticism and establishing a modus vivendi with regulators that assists BATM's continued operations and profitability. It is imperative that health advocates highlight the potential conflicts inherent in such arrangements and develop strategies to address the concerns raised.

  8. Copilot: Monitoring Embedded Systems

    NASA Technical Reports Server (NTRS)

    Pike, Lee; Wegmann, Nis; Niller, Sebastian; Goodloe, Alwyn

    2012-01-01

    Runtime verification (RV) is a natural fit for ultra-critical systems, where correctness is imperative. In ultra-critical systems, even if the software is fault-free, the inherent unreliability of commodity hardware and the adversity of operational environments mean that processing units (and their hosted software) are replicated, and fault-tolerant algorithms are used to compare the outputs. We investigate both software monitoring in distributed fault-tolerant systems and the implementation of fault-tolerance mechanisms using RV techniques. We describe the Copilot language and compiler, specifically designed for generating monitors for distributed, hard real-time systems. We also describe two case studies in which we generated Copilot monitors in avionics systems.
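    The sampling-and-voting idea behind monitoring replicated units can be sketched in a few lines of Python. This is an illustrative toy, not Copilot, which generates constant-time C monitors from stream specifications:

```python
def majority_vote(samples, tolerance=0.0):
    """Fault-tolerant vote over replicated outputs: return a value agreed
    on by a strict majority of replicas, or None if no majority exists."""
    for candidate in samples:
        agree = sum(1 for s in samples if abs(s - candidate) <= tolerance)
        if agree > len(samples) // 2:
            return candidate
    return None

def monitor(streams, tolerance=0.0):
    """Compare replicated output streams sample-by-sample; yield a
    (step, ok) verdict for each time step."""
    for step, samples in enumerate(zip(*streams)):
        value = majority_vote(list(samples), tolerance)
        yield step, value is not None
```

With three replicas producing (1.0, 5.0, 7.0) then (2.0, 9.0, 2.0), the monitor flags the first step (no majority) and accepts the second (two of three agree).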

  9. Difference, Globalisation and the Internationalisation of Curriculum.

    ERIC Educational Resources Information Center

    Rizvi, Fazal; Walsh, Lucas

    1998-01-01

    As Australian higher education advances, new ways of thinking about the college curriculum need to be developed to meet the changing imperatives of the global environment and address the need for student-centered instruction. Internationalization tends to destabilize conventional frameworks of curriculum design and implementation at local,…

  10. ProjectQ: Compiling quantum programs for various backends

    NASA Astrophysics Data System (ADS)

    Haener, Thomas; Steiger, Damian S.; Troyer, Matthias

    In order to control quantum computers beyond the current generation, a high level quantum programming language and optimizing compilers will be essential. Therefore, we have developed ProjectQ - an open source software framework to facilitate implementing and running quantum algorithms both in software and on actual quantum hardware. Here, we introduce the backends available in ProjectQ. This includes a high-performance simulator and emulator to test and debug quantum algorithms, tools for resource estimation, and interfaces to several small-scale quantum devices. We demonstrate the workings of the framework and show how easily it can be further extended to control upcoming quantum hardware.
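    The simulator backend idea can be miniaturised as a state-vector update. The sketch below is a toy unrelated to ProjectQ's actual API; it applies a single-qubit gate to an n-qubit amplitude vector:

```python
import math

def apply_gate(state, gate, target):
    """Apply a 2x2 gate (gate[out][in]) to the `target` qubit of a
    state vector of 2**n complex amplitudes."""
    new = [0j] * len(state)
    for i, amp in enumerate(state):
        bit = (i >> target) & 1            # current value of target qubit
        base = i & ~(1 << target)          # index with target bit cleared
        new[base] += gate[0][bit] * amp    # contribution to target = 0
        new[base | (1 << target)] += gate[1][bit] * amp  # target = 1
    return new

# Hadamard gate
H = [[1 / math.sqrt(2), 1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

state = [1 + 0j, 0j]                  # one qubit in |0>
state = apply_gate(state, H, target=0)
```

Applying H to |0> yields equal amplitudes on |0> and |1>, the usual first sanity check for such a backend.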

  11. Making Policy in the Classroom

    ERIC Educational Resources Information Center

    Hohmann, Ulrike

    2016-01-01

    The concept of street-level bureaucracy (Lipsky, 1980, 2010) examines the form and extent discretion takes in teachers' and other public policy enactors' work and how they negotiate their way through sometimes contradictory policy imperatives. It provides a framework for straddling top-down and bottom-up perspectives on policy making. In this…

  12. Assessing Key Competences across the Curriculum--And Europe

    ERIC Educational Resources Information Center

    Pepper, David

    2011-01-01

    The development of key competences for lifelong learning has been an important policy imperative for EU Member States. The European Reference Framework of key competences (2006) built on previous developments by the OECD, UNESCO and Member States themselves. It defined key competences as knowledge, skills and attitudes applied appropriately to…

  13. Using Media Literacy to Explore Stereotypes of Mexican Immigrants.

    ERIC Educational Resources Information Center

    Vargas, Lucila; dePyssler, Bruce

    1998-01-01

    Examines media portrayals of Mexican immigrants, and interplay between these images and portrayals of U.S.-born Latinos. Argues that examining media images is imperative because the influence of media saturation is almost overwhelming. Suggests a media-literacy framework for developing abilities for interpreting media and giving students control…

  14. Arguing with Adversaries: Aikido, Rhetoric, and the Art of Peace

    ERIC Educational Resources Information Center

    Kroll, Barry M.

    2008-01-01

    The Japanese martial art of aikido affords a framework for understanding argument as harmonization rather than confrontation. Two movements, circling away ("tenkan") and entering in ("irimi"), suggest tactics for arguing with adversaries. The ethical imperative of aikido involves protecting one's adversary from harm, using the least force…

  15. Towards an agent-oriented programming language based on Scala

    NASA Astrophysics Data System (ADS)

    Mitrović, Dejan; Ivanović, Mirjana; Budimac, Zoran

    2012-09-01

    Scala and its multi-threaded model based on actors represent an excellent framework for developing purely reactive agents. This paper presents early research on extending Scala with declarative programming constructs, which would result in a new agent-oriented programming language suitable for developing more advanced, BDI agent architectures. The main advantage of the new language over many other existing solutions for programming BDI agents is a natural and straightforward integration of imperative and declarative programming constructs, fitted under a single development framework.
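    The integration of declarative rule selection with imperative actions that the authors aim for can be caricatured in Python (a hypothetical sketch; the paper's language is Scala-based and far richer):

```python
class BDIAgent:
    """Minimal belief-desire-intention loop: declarative rules select an
    intention from the current beliefs; an imperative action then
    updates the beliefs."""
    def __init__(self, beliefs, rules, actions):
        self.beliefs = set(beliefs)
        self.rules = rules      # list of (condition_set, goal)
        self.actions = actions  # goal -> callable(beliefs) -> new beliefs

    def step(self):
        for condition, goal in self.rules:          # declarative part
            if condition <= self.beliefs and goal not in self.beliefs:
                self.beliefs = self.actions[goal](self.beliefs)  # imperative part
                return goal
        return None  # no applicable rule

agent = BDIAgent(
    beliefs={"hungry"},
    rules=[({"hungry"}, "eat")],
    actions={"eat": lambda b: (b - {"hungry"}) | {"eat"}},
)
```

One step fires the rule and removes the triggering belief; a second step finds no applicable rule, illustrating the declarative/imperative split in a few lines.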

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Seyong; Kim, Jungwon; Vetter, Jeffrey S

    This paper presents a directive-based, high-level programming framework for high-performance reconfigurable computing. It takes a standard, portable OpenACC C program as input and generates a hardware configuration file for execution on FPGAs. We implemented this prototype system using our open-source OpenARC compiler; it performs source-to-source translation and optimization of the input OpenACC program into an OpenCL code, which is further compiled into an FPGA program by the backend Altera Offline OpenCL compiler. Internally, the design of OpenARC uses a high-level intermediate representation that separates concerns of program representation from underlying architectures, which facilitates portability of OpenARC. In fact, this design allowed us to create the OpenACC-to-FPGA translation framework with minimal extensions to our existing system. In addition, we show that our proposed FPGA-specific compiler optimizations and novel OpenACC pragma extensions assist the compiler in generating more efficient FPGA hardware configuration files. Our empirical evaluation on an Altera Stratix V FPGA with eight OpenACC benchmarks demonstrates the benefits of our strategy. To demonstrate the portability of OpenARC, we show results for the same benchmarks executing on other heterogeneous platforms, including NVIDIA GPUs, AMD GPUs, and Intel Xeon Phis. This initial evidence helps support the goal of using a directive-based, high-level programming strategy for performance portability across heterogeneous HPC architectures.
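    The first stage of any such directive-based flow is recognising annotated loops in the input source. The toy scanner below is a hypothetical Python illustration, not OpenARC, and handles only the bare `#pragma acc parallel loop` form:

```python
import re

def extract_acc_loops(source):
    """Toy directive scan: find loops annotated with
    '#pragma acc parallel loop' and return the loop headers that a
    source-to-source translator would outline into device kernels."""
    lines = source.splitlines()
    kernels = []
    for i, line in enumerate(lines):
        if line.strip() == "#pragma acc parallel loop":
            header = lines[i + 1].strip()
            match = re.match(r"for\s*\((.*)\)", header)
            if match:
                kernels.append(match.group(1))
    return kernels

code = """
#pragma acc parallel loop
for (int i = 0; i < n; i++)
    y[i] = a * x[i] + y[i];
"""
```

A real translator would, of course, parse the loop body, analyse dependences, and emit OpenCL kernel code rather than just record the header.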

  17. Enhancing Graduate Students' Reflection in E-Portfolios Using the TPACK Framework

    ERIC Educational Resources Information Center

    Ching, Yu-Hui; Yang, Dazhi; Baek, YoungKyun; Baldwin, Sally

    2016-01-01

    When electronic portfolios (e-portfolios) are employed as summative assessments for degree-granting programs, it is imperative that students demonstrate their knowledge in the field to showcase learning growth and achievement of the program learning outcomes. This study examined the extent to which graduate students in the field of educational technology…

  18. African Higher Education in Collaboration To Respond to Contemporary Development Imperatives.

    ERIC Educational Resources Information Center

    Wagaw, Teshome G.

    2001-01-01

    Sets up critical frameworks for exploring the emergence and development of higher education in Africa, focusing on ways in which higher education institutions and systems are responding to the challenges they face as agents of development and highlighting South Africa and Ethiopia. Explores the nature of the unfolding relationship between South…

  19. Philosophical Education toward Democratization and Boko Haram Insurgency in Nigeria

    ERIC Educational Resources Information Center

    Ugwuozor, Felix Okechukwu

    2016-01-01

    This paper examines Nigeria's democratization dilemmas and the imperatives of an educational framework against the backdrop of the Boko Haram insurgency. It identifies and connects the pattern, character and dynamics of the existing educational system. It also discusses the system's failure in calling for a new approach to overcome the prevailing…

  20. Advanced compilation techniques in the PARADIGM compiler for distributed-memory multicomputers

    NASA Technical Reports Server (NTRS)

    Su, Ernesto; Lain, Antonio; Ramaswamy, Shankar; Palermo, Daniel J.; Hodges, Eugene W., IV; Banerjee, Prithviraj

    1995-01-01

    The PARADIGM compiler project provides an automated means to parallelize programs, written in a serial programming model, for efficient execution on distributed-memory multicomputers. A previous implementation of the compiler based on the PTD representation allowed symbolic array sizes, affine loop bounds and array subscripts, and a variable number of processors, provided that arrays were single- or multi-dimensionally block distributed. The techniques presented here extend the compiler to also accept multidimensional cyclic and block-cyclic distributions within a uniform symbolic framework. These extensions demand more sophisticated symbolic manipulation capabilities. A novel aspect of our approach is to meet this demand by interfacing PARADIGM with a powerful off-the-shelf symbolic package, Mathematica. This paper describes some of the Mathematica routines that perform various transformations, shows how they are invoked and used by the compiler to overcome the new challenges, and presents experimental results for code involving cyclic and block-cyclic arrays as evidence of the feasibility of the approach.
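    A block-cyclic distribution assigns fixed-size blocks of the global index space to processors in round-robin order; the compiler's symbolic manipulations operate over exactly this kind of index algebra. A sketch of the standard 1-D mapping (illustrative Python, not PARADIGM code):

```python
def block_cyclic_owner(g, block, nprocs):
    """Map a global array index to (processor, local index) under a
    1-D block-cyclic distribution with the given block size."""
    blk = g // block                          # global block number
    proc = blk % nprocs                       # round-robin block ownership
    local = (blk // nprocs) * block + g % block  # position in local storage
    return proc, local
```

With block size 2 over 2 processors, global indices 0..7 land on processors 0,0,1,1,0,0,1,1; e.g. index 4 is the third element (local index 2) on processor 0. Pure block and pure cyclic distributions fall out as the special cases of one block per processor and block size 1.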

  1. Interacting Neural Processes of Feeding, Hyperactivity, Stress, Reward, and the Utility of the Activity-Based Anorexia Model of Anorexia Nervosa

    PubMed Central

    Ross, Rachel A.; Mandelblat-Cerf, Yael; Verstegen, Anne M.J.

    2017-01-01

    Anorexia nervosa (AN) is a psychiatric illness with minimal effective treatments and a very high rate of mortality. Understanding the neurobiological underpinnings of the disease is imperative for improving outcomes and can be aided by the study of animal models. The activity-based anorexia rodent model (ABA) is the current best parallel for the study of AN. This review describes the basic neurobiology of feeding and hyperactivity seen in both ABA and AN, and compiles the research on the role that stress-response and reward pathways play in modulating the homeostatic drive to eat and to expend energy, which become dysfunctional in ABA and AN. PMID:27824637

  2. Interacting Neural Processes of Feeding, Hyperactivity, Stress, Reward, and the Utility of the Activity-Based Anorexia Model of Anorexia Nervosa.

    PubMed

    Ross, Rachel A; Mandelblat-Cerf, Yael; Verstegen, Anne M J

    Anorexia nervosa (AN) is a psychiatric illness with minimal effective treatments and a very high rate of mortality. Understanding the neurobiological underpinnings of the disease is imperative for improving outcomes and can be aided by the study of animal models. The activity-based anorexia rodent model (ABA) is the current best parallel for the study of AN. This review describes the basic neurobiology of feeding and hyperactivity seen in both ABA and AN, and compiles the research on the role that stress-response and reward pathways play in modulating the homeostatic drive to eat and to expend energy, which become dysfunctional in ABA and AN.

  3. Three-dimensional hydrogeologic framework model of the Rio Grande transboundary region of New Mexico and Texas, USA, and northern Chihuahua, Mexico

    USGS Publications Warehouse

    Sweetkind, Donald S.

    2017-09-08

    As part of a U.S. Geological Survey study in cooperation with the Bureau of Reclamation, a digital three-dimensional hydrogeologic framework model was constructed for the Rio Grande transboundary region of New Mexico and Texas, USA, and northern Chihuahua, Mexico. This model was constructed to define the aquifer system geometry and subsurface lithologic characteristics and distribution for use in a regional numerical hydrologic model. The model includes five hydrostratigraphic units: river channel alluvium, three informal subdivisions of Santa Fe Group basin fill, and an undivided pre-Santa Fe Group bedrock unit. Model input data were compiled from published cross sections, well data, structure contour maps, selected geophysical data, and contiguous compilations of surficial geology and structural features in the study area. These data were used to construct faulted surfaces that represent the upper and lower subsurface hydrostratigraphic unit boundaries. The digital three-dimensional hydrogeologic framework model is constructed through combining faults, the elevation of the tops of each hydrostratigraphic unit, and boundary lines depicting the subsurface extent of each hydrostratigraphic unit. The framework also compiles a digital representation of the distribution of sedimentary facies within each hydrostratigraphic unit. The digital three-dimensional hydrogeologic model reproduces with reasonable accuracy the previously published subsurface hydrogeologic conceptualization of the aquifer system and represents the large-scale geometry of the subsurface aquifers. The model is at a scale and resolution appropriate for use as the foundation for a numerical hydrologic model of the study area.

  4. [Controlling and operation management in hospitals].

    PubMed

    Vagts, Dierk A

    2010-03-01

    The economic pressure on the health system, and especially on hospitals, is growing rapidly. Hence, economic knowledge is becoming imperative for people in medical executive positions. In advanced, forward-looking hospitals, controlling is gaining more and more weight because it takes on a coordinative responsibility. Ideally, controlling navigates the teamwork of managers (CEOs) and medical executives by weighing medical necessities against the economic framework. By providing medical and economic data on a regular basis, controlling contributes to evaluating the optimal efficiency of a hospital in a highly competitive environment. A close, open-minded and trusting cooperation between all people involved is imperative. Hence, controlling in the proper meaning of the word cannot flourish in dominant, hierarchical hospital structures. Georg Thieme Verlag Stuttgart * New York.

  5. An Open Source modular platform for hydrological model implementation

    NASA Astrophysics Data System (ADS)

    Kolberg, Sjur; Bruland, Oddbjørn

    2010-05-01

    An implementation framework for the setup and evaluation of spatio-temporal models has been developed, forming a highly modularized distributed model system. The ENKI framework allows building space-time models for hydrological or other environmental purposes from a suite of separately compiled subroutine modules. The approach makes it easy for students, researchers and other model developers to implement, exchange, and test single routines in a fixed framework. The open-source licence and modular design of ENKI will also facilitate rapid dissemination of new methods to institutions engaged in operational hydropower forecasting or other water resource management. Written in C++, ENKI uses a plug-in structure to build a complete model from separately compiled subroutine implementations. These modules contain very little code apart from the core process simulation, and are compiled as dynamic-link libraries (DLLs). A narrow interface allows the main executable to recognise the number and type of the different variables in each routine. The framework then exposes these variables to the user within the proper context, ensuring that time series exist for input variables, initialisation for states, GIS data sets for static map data, manually or automatically calibrated values for parameters, etc. ENKI is designed to meet three different levels of involvement in model construction:
    • Model application: running and evaluating a given model; regional calibration against arbitrary data using a rich suite of objective functions, including likelihood and Bayesian estimation; uncertainty analysis directed towards input or parameter uncertainty. Need not: know the model's composition of subroutines, the internal variables in the model, or the creation of method modules.
    • Model analysis: linking together different process methods, including parallel setup of alternative methods for solving the same task; investigating the effect of different spatial discretization schemes. Need not: write or compile computer code, or handle file IO for each module.
    • Routine implementation and testing: implementing new process-simulating methods/equations, specialised objective functions or quality control routines, and testing these in an existing framework. Need not: implement a user or model interface for the new routine, IO handling, administration of model setup and runs, or calibration and validation routines.
    From being developed for Norway's largest hydropower producer, Statkraft, ENKI is now being turned into an Open Source project. At the time of writing, the licence and the project administration are not established, and the application remains to be ported to other compilers and computer platforms. However, we hope that ENKI will prove useful for both academic and operational users.
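    ENKI's narrow plug-in interface, in which each routine declares its variables and the framework does the wiring, can be sketched in Python (the module names and the degree-day-style placeholder are hypothetical; the real system uses separately compiled C++ DLLs):

```python
class Module:
    """Base class for process routines: subclasses declare their inputs
    and outputs so the framework can wire them together without knowing
    anything else about the routine (mirroring ENKI's narrow interface)."""
    inputs, outputs = (), ()

    def run(self, **kwargs):
        raise NotImplementedError

class SnowMelt(Module):
    inputs = ("temperature",)
    outputs = ("melt",)

    def run(self, temperature):
        # Degree-day-style placeholder: melt only above freezing.
        return {"melt": max(0.0, 0.5 * temperature)}

def run_chain(modules, state):
    """Framework driver: feed each module its declared inputs from the
    shared state and merge its declared outputs back in."""
    for mod in modules:
        args = {name: state[name] for name in mod.inputs}
        state.update(mod.run(**args))
    return state

state = run_chain([SnowMelt()], {"temperature": 4.0})
```

The point of the design is that a new routine only implements `run` and its declarations; calibration, IO and model assembly stay in the framework.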

  6. Teaching the Relation between Solar Cell Efficiency and Annual Energy Yield

    ERIC Educational Resources Information Center

    van Sark, Wilfried G. J. H. M.

    2007-01-01

    To reach a sustainable world the use of renewable energy sources is imperative. Photovoltaics (PV) is but one of the technologies that use the power of the sun and its deployment is growing very fast. Several master programs have been developed over the world, including Utrecht University, that teach these technologies. Within the framework of a…

  7. Intrapreneurship--A New Way of Doing Business: Maintaining Academic Integrity in the Face of the Political Imperative To Make Money.

    ERIC Educational Resources Information Center

    Elford, Elsie; Hemstreet, Brad

    In the context of educational institutions, intrapreneurs are proactive and innovative educational leaders who work as entrepreneurs inside the institution. In order to successfully carry out intrapreneurial activities, community colleges must have a structural and administrative framework that supports a market orientation and a reduced…

  8. Pathologizing the Poor: Implications for Preparing Teachers to Work in High-Poverty Schools

    ERIC Educational Resources Information Center

    Ullucci, Kerri; Howard, Tyrone

    2015-01-01

    The recent economic downturn highlights that poverty continues to be a significant social problem. Mindful of this demographic reality, it is imperative for teacher educators to pay close attention to the manner in which teachers are prepared to educate students from impoverished backgrounds. Given the number of frameworks that offer reductive…

  9. Competences in Romanian Higher Education--An Empirical Investigation for the Business Sector

    ERIC Educational Resources Information Center

    Deaconu, Adela; Nistor, Cristina Silvia

    2017-01-01

    This research study particularizes the general descriptions of the European Qualifications Framework for Lifelong Learning, as compiled and developed within the Romanian qualification framework, to the business and economics field in general and to the property economic analysis and valuation field in particular. By means of an empirical analysis,…

  10. HPC Software Stack Testing Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garvey, Cormac

    The HPC Software stack testing framework (hpcswtest) is used in the INL Scientific Computing Department to test the basic sanity and integrity of the HPC Software stack (Compilers, MPI, Numerical libraries and Applications) and to quickly discover hard failures, and as a by-product it will indirectly check the HPC infrastructure (network, PBS and licensing servers).
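    The sanity-sweep idea of running every check and recording hard failures without aborting can be sketched as follows (a generic illustration; hpcswtest's actual checks invoke real compilers, MPI launchers and licensing servers):

```python
def run_smoke_tests(checks):
    """Minimal sanity harness in the spirit of a stack-testing tool:
    run each named check, record pass/fail, and never let one hard
    failure stop the rest of the sweep."""
    results = {}
    for name, check in checks.items():
        try:
            results[name] = bool(check())
        except Exception:
            results[name] = False  # hard failure recorded, sweep continues
    return results

results = run_smoke_tests({
    "compiler_present": lambda: True,   # stand-in for probing a compiler
    "mpi_ranks": lambda: 1 / 0,         # simulated hard failure
})
```

Isolating each check in its own try block is what lets a stack-wide sweep report every broken component in one pass instead of stopping at the first.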

  11. A Framework for Latino Nursing Leadership.

    PubMed

    Villarruel, Antonia M

    2017-10-01

    There is an urgent need for Latino leaders in nursing, yet little has been written about Latino leaders and leadership. Leadership comes with challenges and opportunities in particular for Latino nurses who contend with specific cultural imperatives and obstacles. In this article, I review the current healthcare environment and propose a framework for Latino nursing leadership within the context of current challenges and opportunities and my personal experience in nursing. This framework is meant to serve as a guide for the development of Latino nurses who will improve the health and well-being of those in the most vulnerable communities by utilizing their cultural strengths and professional skills to deliver quality and compassionate care.

  12. HOPE: A Python just-in-time compiler for astrophysical computations

    NASA Astrophysics Data System (ADS)

    Akeret, J.; Gamper, L.; Amara, A.; Refregier, A.

    2015-04-01

    The Python programming language is becoming increasingly popular for scientific applications due to its simplicity, versatility, and the broad range of its libraries. A drawback of this dynamic language, however, is its low runtime performance which limits its applicability for large simulations and for the analysis of large data sets, as is common in astrophysics and cosmology. While various frameworks have been developed to address this limitation, most focus on covering the complete language set, and either force the user to alter the code or are not able to reach the full speed of an optimised native compiled language. In order to combine the ease of Python and the speed of C++, we developed HOPE, a specialised Python just-in-time (JIT) compiler designed for numerical astrophysical applications. HOPE focuses on a subset of the language and is able to translate Python code into C++ while performing numerical optimisation on mathematical expressions at runtime. To enable the JIT compilation, the user only needs to add a decorator to the function definition. We assess the performance of HOPE by performing a series of benchmarks and compare its execution speed with that of plain Python, C++ and the other existing frameworks. We find that HOPE improves the performance compared to plain Python by a factor of 2 to 120, achieves speeds comparable to that of C++, and often exceeds the speed of the existing solutions. We discuss the differences between HOPE and the other frameworks, as well as future extensions of its capabilities. The fully documented HOPE package is available at http://hope.phys.ethz.ch and is published under the GPLv3 license on PyPI and GitHub.

  13. Compiler-Driven Performance Optimization and Tuning for Multicore Architectures

    DTIC Science & Technology

    2015-04-10

    develop a powerful system for auto-tuning of library routines and compute-intensive kernels, driven by the Pluto system for multicores that we are developing. The work here is motivated by recent advances in two major areas of...automatic C-to-CUDA code generation using a polyhedral compiler transformation framework. We have used and adapted PLUTO (our state-of-the-art tool...

  14. Sorption of pollutants by porous carbon, carbon nanotubes and fullerene- an overview.

    PubMed

    Gupta, Vinod K; Saleh, Tawfik A

    2013-05-01

    The quality of water is continuously deteriorating, posing an increasing toxic threat to humans and the environment. It is imperative to treat wastewater in order to remove pollutants and obtain good-quality water. Carbon materials such as porous carbon, carbon nanotubes and fullerene have been extensively used for advanced treatment of wastewaters. In recent years, carbon nanomaterials have become promising adsorbents for water treatment. This review attempts to compile relevant knowledge about the adsorption activities of porous carbon, carbon nanotubes and fullerene towards various organic and inorganic pollutants in aqueous solutions. A detailed description of the preparation and treatment methods of porous carbon, carbon nanotubes and fullerene, along with relevant applications and regeneration, is also included.

  15. Moving from Sustainable Management to Sustainable Governance of Natural Resources: The Role of Social Learning Processes in Rural India, Bolivia and Mali

    ERIC Educational Resources Information Center

    Rist, Stephan; Chidambaranathan, Mani; Escobar, Cesar; Wiesmann, Urs; Zimmermann, Anne

    2007-01-01

    The present paper discusses a conceptual, methodological and practical framework within which the limitations of the conventional notion of natural resource management (NRM) can be overcome. NRM is understood as the application of scientific ecological knowledge to resource management. By including a consideration of the normative imperatives that…

  16. The Imperatives of Challenges for Africa in the Knowledge Age: Status and Role of National Information Policy.

    ERIC Educational Resources Information Center

    Oladele, Benedict A.

    In principle, the emergence of National Information Policy (NIP) as a framework for developing information resources and institutions was welcomed by most countries in Africa with a messianic zeal. However, most of these countries, particularly those in the Sub-Sahara region, were unable to correspondingly match their zeal with concrete efforts…

  17. An Integrated Performance Support System (IPSS). How It Can Help Develop a Competitive Workforce in the '90s.

    ERIC Educational Resources Information Center

    Courseware/Andersen Consulting, San Diego, CA.

    This concept paper begins by arguing that Integrated Performance Support Systems (IPSS) are an imperative for boosting productivity in the workplace and gaining competitive advantage in the marketplace. It then presents the framework for an IPSS solution to meet the challenges of the 1990s. Discussion of the implementation of an IPSS solution…

  18. The Nanny Cam, Latex Gloves, and Two-Finger Touch: New Technologies for Disciplining Bodies

    ERIC Educational Resources Information Center

    Johnson, Richard T.

    2015-01-01

    Drawing on a range of personal caregiver experiences and risk analysis frameworks, this paper offers a critical reflection on related policy and practices in the U.S., focusing particularly on risk and its impact on early childhood education. It develops a critique of the taken-for-granted imperative to protect children from any and all risks,…

  19. Adaptation and Extension of the Framework of Reducing Abstraction in the Case of Differential Equations

    ERIC Educational Resources Information Center

    Raychaudhuri, Debasree

    2014-01-01

    Although there is no consensus in regard to a unique meaning for abstraction, there is a recognition of the existence of several theories of abstraction, and that the ability to abstract is imperative to learning and doing meaningful mathematics. The theory of "reducing abstraction" maps the abstract nature of mathematics to the nature…

  20. Promoting Global Perspective and Raising the Visibility of Asia in World History: An Assignment for Pre-Service Teachers

    ERIC Educational Resources Information Center

    Keirn, Tim; Luhr, Eileen; Escobar, Miguel; Choudhary, Manoj

    2012-01-01

    Given California's role in the Pacific economy, its historic Asian heritage, and the strong and growing presence of Asian communities and businesses in the state, it is imperative that students statewide understand the history of Asia. Unfortunately, the California state curricular framework and standards in history and social science limit the…

  1. A Multiprocessor SoC Architecture with Efficient Communication Infrastructure and Advanced Compiler Support for Easy Application Development

    NASA Astrophysics Data System (ADS)

    Urfianto, Mohammad Zalfany; Isshiki, Tsuyoshi; Khan, Arif Ullah; Li, Dongju; Kunieda, Hiroaki

    This paper presents a Multiprocessor System-on-Chip (MPSoC) architecture used as an execution platform for the new C-language-based MPSoC design framework we are currently developing. The MPSoC architecture is based on an existing SoC platform with a commercial RISC core acting as the host CPU. We extend the existing SoC with a multiprocessor-array block that is used as the main engine to run parallel applications modeled in our design framework. Utilizing several optimizations provided by our compiler, efficient inter-communication between processing elements with minimal overhead is implemented. A host interface is designed to integrate the existing RISC core with the multiprocessor array. The experimental results show that an efficacious integration is achieved, proving that the designed communication module can be used to efficiently incorporate off-the-shelf processors as processing elements for MPSoC architectures designed using our framework.
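    The PE-to-PE communication pattern can be modelled with queues and threads. The pipeline below is a generic illustration of message passing between processing elements, not the paper's hardware design:

```python
import queue
import threading

def make_links(n):
    """One FIFO per processing element, standing in for the on-chip
    communication links between PEs."""
    return [queue.Queue() for _ in range(n)]

def pe_worker(pe_id, links, out):
    """Each PE doubles the value it receives and forwards it to the
    next PE; the last PE delivers the result to the host."""
    value = links[pe_id].get()
    value *= 2
    if pe_id + 1 < len(links):
        links[pe_id + 1].put(value)
    else:
        out.put(value)

links = make_links(3)
out = queue.Queue()
threads = [threading.Thread(target=pe_worker, args=(i, links, out))
           for i in range(3)]
for t in threads:
    t.start()
links[0].put(1)          # host injects a value into PE 0
for t in threads:
    t.join()
result = out.get()       # 1 doubled by three PEs in sequence
```

Blocking FIFO reads give the same rendezvous semantics that a hardware inter-PE link provides, which is why queues are the usual software model for this topology.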

  2. Accelerating semantic graph databases on commodity clusters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morari, Alessandro; Castellana, Vito G.; Haglin, David J.

    We are developing a full software system for accelerating semantic graph databases on commodity clusters that scales to hundreds of nodes while maintaining constant query throughput. Our framework comprises a SPARQL-to-C++ compiler, a library of parallel graph methods and a custom multithreaded runtime layer, which provides a Partitioned Global Address Space (PGAS) programming model with fork/join parallelism and automatic load balancing over commodity clusters. We present preliminary results for the compiler and for the runtime.
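    At its core, a SPARQL basic graph pattern is matched by binding variables against stored triples. A toy single-pattern matcher in Python (illustrative only; the system described compiles SPARQL into parallel C++ graph traversals):

```python
def match_pattern(triples, pattern):
    """Toy SPARQL-style matcher: terms starting with '?' are variables;
    return one binding dict per matching triple."""
    results = []
    for triple in triples:
        binding = {}
        for term, value in zip(pattern, triple):
            if term.startswith("?"):
                if binding.get(term, value) != value:
                    break              # repeated variable, inconsistent
                binding[term] = value
            elif term != value:
                break                  # constant term mismatch
        else:
            results.append(binding)
    return results

graph = [("alice", "knows", "bob"), ("bob", "knows", "carol")]
```

A full engine joins many such patterns; distributing the triple store and the joins across nodes is what the PGAS runtime layer above is for.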

  3. Rapid Prototyping of Application Specific Signal Processors (RASSP)

    DTIC Science & Technology

    1993-12-23

    Compilers 2-9 - Cadre Teamwork 2-13 - CodeCenter (Centerline) 2-15 - dbx/dbxtool (UNIX) 2-17 - Falcon (Mentor) 2-19 - FrameMaker (Frame Tech) 2-21 - gprof...UNIX C debuggers; Falcon (Mentor) ECAD framework; FrameMaker (Frame Tech) word processing; gcc, GNU C/C++ compiler; gprof, GNU software profiling tool...organization can put their own documentation on-line using the BOLD Composer for FrameMaker. The AMPLE programming language is a C-like language used for

  4. A conceptual hydrogeologic model for the hydrogeologic framework, geochemistry, and groundwater-flow system of the Edwards-Trinity and related aquifers in the Pecos County region, Texas

    USGS Publications Warehouse

    Thomas, Jonathan V.; Stanton, Gregory P.; Bumgarner, Johnathan R.; Pearson, Daniel K.; Teeple, Andrew; Houston, Natalie A.; Payne, Jason; Musgrove, MaryLynn

    2013-01-01

    Several previous studies have been done to compile or collect physical and chemical data, describe the hydrogeologic processes, and develop conceptual and numerical groundwater-flow models of the Edwards-Trinity aquifer in the Trans-Pecos region. Documented methods were used to compile and collect groundwater, surface-water, geochemical, geophysical, and geologic information that subsequently were used to develop this conceptual model.

  5. Recent advances in the compilation of Holocene relative sea-level databases in North America

    NASA Astrophysics Data System (ADS)

    Horton, B.; Vacchi, M.; Engelhart, S. E.; Nikitina, D.

    2015-12-01

    Reconstruction of relative sea level (RSL) has implications for the investigation of crustal movements, the calibration of Earth rheology models, and the reconstruction of ice sheets. In recent years, efforts have been made to create RSL databases following a standardized methodology. These regional databases provide a framework for developing our understanding of the primary mechanisms of RSL change since the Last Glacial Maximum, as well as a long-term baseline against which to gauge changes in sea level during the 20th century and forecasts for the 21st. Here we present two quality-controlled Holocene RSL databases compiled for North America. Along the Pacific coast of North America (British Columbia, Canada to California, USA), our re-evaluation of sea-level indicators from geological and archaeological investigations yields 841 RSL data points, mainly from salt and freshwater wetlands or adjacent estuarine sediment, as well as from isolation basins. Along the Atlantic coast of North America (Hudson Bay, Canada to South Carolina, USA), we are currently compiling a database of more than 2000 RSL data points from isolation basins, salt and freshwater wetlands, beach ridges, and intertidal deposits. We outline the difficulties we encountered, and the solutions we adopted, in compiling databases across such different depositional environments. We address complex tectonics and the framework used to compare such a large variety of RSL data points. We discuss the implications of our results for glacio-isostatic adjustment (GIA) models in the two studied regions.

  6. Framework for managing mycotoxin risks in the food industry.

    PubMed

    Baker, Robert C; Ford, Randall M; Helander, Mary E; Marecki, Janusz; Natarajan, Ramesh; Ray, Bonnie

    2014-12-01

    We propose a methodological framework for managing mycotoxin risks in the food processing industry. Mycotoxin contamination is a well-known threat to public health that has economic significance for the food processing industry; it is imperative to address mycotoxin risks holistically, at all points in the procurement, processing, and distribution pipeline, by tracking the relevant data, adopting best practices, and providing suitable adaptive controls. The proposed framework includes (i) an information and data repository, (ii) a collaborative infrastructure with analysis and simulation tools, (iii) standardized testing and acceptance sampling procedures, and (iv) processes that link the risk assessments and testing results to the sourcing, production, and product release steps. The implementation of suitable acceptance sampling protocols for mycotoxin testing is considered in some detail.

  7. A Framework for Describing Health Care Delivery Organizations and Systems

    PubMed Central

    Cohen, Perry D.; Larson, David B.; Marion, Lucy N.; Sills, Marion R.; Solberg, Leif I.; Zerzan, Judy

    2015-01-01

    Describing, evaluating, and conducting research on the questions raised by comparative effectiveness research, and characterizing care delivery organizations of all kinds, from independent individual provider units to large integrated health systems, have become imperative. Recognizing this challenge, the Delivery Systems Committee, a subgroup of the Agency for Healthcare Research and Quality’s Effective Health Care Stakeholders Group, which represents a wide diversity of perspectives on health care, created a draft framework with domains and elements that may be useful in characterizing various sizes and types of care delivery organizations and may contribute to key outcomes of interest. The framework may open the door to further studies in areas in which clear definitions and descriptions are lacking. PMID:24922130

  8. A vision framework for the localization of soccer players and ball on the pitch using Handycams

    NASA Astrophysics Data System (ADS)

    Vilas, Tiago; Rodrigues, J. M. F.; Cardoso, P. J. S.; Silva, Bruno

    2015-03-01

    The current performance requirements in soccer make imperative the use of new technologies for game observation and analysis, so that detailed information about the teams' actions is provided. This paper summarizes a framework for collecting the positions of soccer players and the ball using one or more Full HD Handycams, placed no more than 20 cm apart in the stands, as well as how this framework connects to the FootData project. The system is based on four main modules: the detection and delimitation of the soccer pitch; the detection of the ball and players, and the assignment of players to their teams; the tracking of players and ball; and finally the computation of their localization (in meters) on the pitch.

  9. Incorporating Risk and Indicators into a Water Security Framework

    NASA Astrophysics Data System (ADS)

    Allen, D. M.; Bakker, K.; Simpson, M. W.; Norman, E.; Dunn, G.

    2010-12-01

    The concept of water security has received growing attention over the past five years in academic debates and policy circles, particularly with respect to cumulative impacts assessment and watershed management. We propose an integrative definition for water security; one that considers both stressors and impacts (or effects) on hydrological systems. We present a water security assessment framework that considers status and risk indicators for both water quality and quantity as measures of impacts. This assessment framework also integrates the social sciences with natural science, engineering, and public health, providing opportunities to address environmental challenges, including the relationship between water and land use dynamics, the integration of aquatic ecosystem and human health concerns, and the alignment of governance with water management imperatives. We argue that this framework has the potential to advance water science, the contributing disciplines, and water policy and management.

  10. A ROSE-based OpenMP 3.0 Research Compiler Supporting Multiple Runtime Libraries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liao, C; Quinlan, D; Panas, T

    2010-01-25

    OpenMP is a popular and evolving programming model for shared-memory platforms. It relies on compilers for optimal performance and to target modern hardware architectures. A variety of extensible and robust research compilers is key to OpenMP's sustainable success in the future. In this paper, we present our efforts to build an OpenMP 3.0 research compiler for C, C++, and Fortran using the ROSE source-to-source compiler framework. Our goal is to support OpenMP research for ourselves and others. We have extended ROSE's internal representation to handle all of the OpenMP 3.0 constructs and facilitate their manipulation. Since OpenMP research is often complicated by the tight coupling of the compiler translations and the runtime system, we present a set of rules to define a common OpenMP runtime library (XOMP) on top of multiple runtime libraries. These rules additionally define how to build a set of translations targeting XOMP. Our work demonstrates how to reuse OpenMP translations across different runtime libraries, and simplifies OpenMP research by decoupling the problematic dependence between the compiler translations and the runtime libraries. We present an evaluation of our work by demonstrating an analysis tool for OpenMP correctness. We also show how XOMP can be defined using both GOMP and Omni, and present comparative performance results against other OpenMP compilers.
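
    The XOMP idea, one neutral runtime interface that compiler-generated code targets while a thin adapter maps it onto whichever backend is in use, can be sketched as follows (class and function names here are hypothetical stand-ins, not the actual XOMP, GOMP, or Omni APIs):

```python
# Two mock backends with different native entry points, standing in for
# GOMP and Omni. Both just run the parallel body serially per "thread".
class GOMPBackend:
    def parallel_start(self, fn, nthreads):
        return [fn(tid) for tid in range(nthreads)]

class OmniBackend:
    def run_parallel(self, fn, nthreads):
        return [fn(tid) for tid in range(nthreads)]

class XOMP:
    """Common layer: translations call xomp_parallel_start only, so the
    same generated code works on top of either backend."""
    def __init__(self, backend):
        self.backend = backend

    def xomp_parallel_start(self, fn, nthreads):
        if isinstance(self.backend, GOMPBackend):
            return self.backend.parallel_start(fn, nthreads)
        return self.backend.run_parallel(fn, nthreads)

def body(tid):          # the outlined parallel-region body
    return tid * tid

for backend in (GOMPBackend(), OmniBackend()):
    print(XOMP(backend).xomp_parallel_start(body, 4))
```

The point of the indirection is exactly what the abstract states: the compiler translations depend only on the common layer, so swapping runtime libraries requires no change to the generated code.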

  11. Tectonic evaluation of the Nubian shield of Northeastern Sudan using thematic mapper imagery

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Bechtel is nearing completion of a one-year program that uses digitally enhanced LANDSAT Thematic Mapper (TM) data to compile the first comprehensive regional tectonic map of the Proterozoic Nubian Shield exposed in the northern Red Sea Hills of northeastern Sudan. The status of the significant objectives of this study is given. Pertinent published and unpublished geologic literature and maps of the northern Red Sea Hills were reviewed to establish the geologic framework of the region. Thematic Mapper imagery was processed for optimal base-map enhancements. Photo mosaics of enhanced images to serve as base maps for the compilation of geologic information were completed. Interpretation of TM imagery to define and delineate structural and lithologic provinces was completed. Geologic information (petrologic and radiometric data) was compiled from the literature review onto base-map overlays. Evaluation of the tectonic evolution of the Nubian Shield based on the image interpretation and the compiled tectonic maps is continuing.

  12. Design and Implementation of a Basic Cross-Compiler and Virtual Memory Management System for the TI-59 Programmable Calculator.

    DTIC Science & Technology

    1983-06-01

    previously stated requirements to construct the framework for a software solution. It is during this phase of design that many of the most critical...the linker would have to be deferred until the compiler was formalized and in the implementation phase of design. The second problem involved...memory limit was encountered. At this point a segmentation occurred. The memory limits were reset and the combining process continued until another

  13. Let the social sciences evolve.

    PubMed

    Smaldino, Paul E; Waring, Timothy M

    2014-08-01

    We agree that evolutionary perspectives may help us organize many divergent realms of the science of human behavior. Nevertheless, an imperative to unite all social science under an evolutionary framework risks turning off researchers who have their own theoretical perspectives that can be informed by evolutionary theory without being exclusively defined by it. We propose a few considerations for scholars interested in joining the evolutionary and social sciences.

  14. a Framework for Distributed Mixed Language Scientific Applications

    NASA Astrophysics Data System (ADS)

    Quarrie, D. R.

    The Object Management Group has defined an architecture (CORBA) for distributed object applications based on an Object Request Broker and Interface Definition Language. This project builds upon this architecture to establish a framework for the creation of mixed language scientific applications. A prototype compiler has been written that generates FORTRAN 90 or Eiffel stubs and skeletons and the required C++ glue code from an input IDL file that specifies object interfaces. This generated code can be used directly for non-distributed mixed language applications or in conjunction with the C++ code generated from a commercial IDL compiler for distributed applications. A feasibility study is presently underway to see whether a fully integrated software development environment for distributed, mixed-language applications can be created by modifying the back-end code generator of a commercial CASE tool to emit IDL.

  15. Compiler analysis for irregular problems in FORTRAN D

    NASA Technical Reports Server (NTRS)

    von Hanxleden, Reinhard; Kennedy, Ken; Koelbel, Charles; Das, Raja; Saltz, Joel

    1992-01-01

    We developed a dataflow framework which provides a basis for rigorously defining strategies to make use of runtime preprocessing methods for distributed memory multiprocessors. In many programs, several loops access the same off-processor memory locations. Our runtime support gives us a mechanism for tracking and reusing copies of off-processor data. A key aspect of our compiler analysis strategy is to determine when it is safe to reuse copies of off-processor data. Another crucial function of the compiler analysis is to identify situations which allow runtime preprocessing overheads to be amortized. This dataflow analysis will make it possible to effectively use the results of interprocedural analysis in our efforts to reduce interprocessor communication and the need for runtime preprocessing.

  16. Expanding roles within mental health legislation: an opportunity for professional growth or a missed opportunity?

    PubMed

    Hurley, J; Linsley, P

    2007-09-01

    This paper aims to highlight both the necessity of, and the way forward for, mental health nursing to integrate proposed legislative roles into practice. It is argued that community mental health nursing, historically absent from active participation within mental health law in the UK, is faced with new and demanding roles under proposed changes to the 1983 Mental Health Act of England and Wales. While supporting multidisciplinary training for such roles, the imperative of incorporating nursing-specific values into consequent training programs is addressed through the offered educative framework. This framework explores the issues of power, ethics, legislative thematics, and application to contemporary service structures.

  17. Leverage hadoop framework for large scale clinical informatics applications.

    PubMed

    Dong, Xiao; Bahroos, Neil; Sadhu, Eugene; Jackson, Tommie; Chukhman, Morris; Johnson, Robert; Boyd, Andrew; Hynes, Denise

    2013-01-01

    In this manuscript, we present our experiences using the Apache Hadoop framework for high-data-volume and computationally intensive applications, and discuss some best-practice guidelines in a clinical informatics setting. There are three main aspects to our approach: (a) process and integrate diverse, heterogeneous data sources using standard Hadoop programming tools and customized MapReduce programs; (b) after fine-grained aggregate results are obtained, perform data analysis using the Mahout data mining library; (c) leverage the column-oriented features in HBase for patient-centric modeling and complex temporal reasoning. This framework provides a scalable solution to meet the rapidly increasing, imperative "Big Data" needs of clinical and translational research. The intrinsic fault tolerance, high availability, and scalability of the Hadoop platform make these applications readily deployable in an enterprise-level cluster environment.
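
    The map/shuffle/reduce aggregation pattern underlying point (a) can be sketched in plain code (the record format and all names are invented for the example, not taken from the paper):

```python
from collections import defaultdict
from itertools import chain

# Invented toy records: (patient id, coded diagnosis).
records = [
    ("patient1", "diabetes"), ("patient2", "asthma"),
    ("patient1", "hypertension"), ("patient1", "diabetes"),
]

def map_phase(record):
    patient, diagnosis = record
    yield (patient, diagnosis), 1        # emit key/value pairs

def shuffle(pairs):
    """Group values by key, as the framework does between phases."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    return key, sum(values)              # aggregate per key

pairs = chain.from_iterable(map_phase(r) for r in records)
result = dict(reduce_phase(k, v) for k, v in shuffle(pairs).items())
print(result[("patient1", "diabetes")])
```

In Hadoop the same three phases run distributed and fault-tolerant over HDFS; the sketch only shows the dataflow shape of a customized MapReduce aggregation.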

  18. Effective energy data management for low-carbon growth planning: An analytical framework for assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Bo; Evans, Meredydd; Yu, Sha

    Readily available and reliable energy data is fundamental to effective analysis and policymaking for the energy sector. Energy statistics of high quality, systematically compiled and effectively disseminated, not only support governments in ensuring national security and evaluating energy policies, but also guide investment decisions in both the private and public sectors. Because of energy's close link to greenhouse gas emissions, energy data has a particularly important role in assessing emissions and strategies to reduce emissions. In this study, energy data management in four countries – Canada, Germany, the United Kingdom and the United States – is examined from both organizational and operational perspectives. With insights from these best practices, we present a framework for the evaluation of national energy data management systems. It can be used by national statistics compilers to assess their chosen model and to identify areas for improvement. We then use India as a test case for this framework. Its government is working to enhance India's energy data management to improve sustainable growth planning.

  19. Strategic Delusions - The Cold Start Doctrine: Proactive Strategy

    DTIC Science & Technology

    2016-05-26

    recent tactical strike in Myanmar is miscued as a precursor for times to come in the regional context. These tactical actions also known as Hot...Attacks Militant Camps in Myanmar, Wall Street Journal, June 10, 2015, accessed August 24, 2015, http://www.wsj.com/articles/indian-army-attacks-militant-camps-in-myanmar-1433927858. 7 imperatives associated with the Indian PAS and its validation through the framework of strategy and the

  20. Using a Framework of 21st Century Competencies to Examine Changes between China's 2001 and 2011 Mathematics Curriculum Standards for Basic Education

    ERIC Educational Resources Information Center

    Stephens, Max; Keqiang, Richard Xu

    2014-01-01

    In the Western developed world, the language of 21st century competencies, also referred to as 21st century skills or competences, is a powerful means of drawing attention to links between the secondary school curriculum, post-secondary education, and the social and economic imperatives of the developed economies. This paper will analyze different…

  1. Moral Imperatives, Professional Interventions and Resilience, and Educational Action in Chaotic Situations: The Souls of Children amidst the Horror of War

    ERIC Educational Resources Information Center

    Hill, Catherine M.

    2005-01-01

    Drawing from the research on children of war in Bogota, Beirut and Bosnia, this paper serves as a framework for dialogue about the criminalization of children by armed conflict and other forms of violence. Furthermore, it addresses the aching question of how best to care for these children so that they have every chance to become illuminated and…

  2. Managing Computer Systems Development: Understanding the Human and Technological Imperatives.

    DTIC Science & Technology

    1985-06-01

    for their organization's use? How can they predict the impact of future systems on their management control capabilities? Of equal importance is the...commercial organizations discovered that there was only a limited capability of interaction between various types of computers. These organizations were...Viewed together, these three interrelated subsystems, EDP, MIS, and DSS, establish the framework of an overall systems capability known as a Computer

  3. A moral imperative to improve the quality of work-life for nurses: building inclusive social capital capacity.

    PubMed

    Hofmeyer, Anne

    2003-08-01

    The complexity and incessant change in the corporatised health care workplace has influenced nurses' work choices, morale, quality of work-life and the wellbeing of patients. Thus, there is an urgent moral imperative to improve the quality of work-life for nurses. To this end, it is crucial to re-define progress beyond the sole economic markers of success and profit in the health care workplace. This paper argues for the identification of ethical markers and indicators of organisational success based on bridging and linking social capital which could be used to re-organise health care organisations, hence crafting inclusive moral spaces where nurses can safely work and provide quality care for patients. Social and ethical evaluation is well suited to examine current workplace dilemmas from a psychosocial perspective and provide a framework for best practice in building capacity in effective social relations and family friendly, ethical workplaces.

  4. The compatibility between Shiite and Kantian approach to passive voluntary euthanasia.

    PubMed

    Dabbagh, Soroush; Aramesh, Kiarash

    2009-01-01

    Euthanasia is one of the controversial topics in current medical ethics. Among the six well-known types of euthanasia, passive voluntary euthanasia (PVE) seems to be more plausible in comparison with the other types, from the moral point of view. According to the Kantian framework, ethical features come from 'reason'. Maxims are formulated as the categorical imperative, which has three different versions. Moreover, the second version of the categorical imperative, dubbed the 'principle of ends', is associated with human dignity. It follows from this that human dignity has an indisputable role in the Kantian story. On the other hand, there are two main theological schools in Islamic tradition, called Ash'arite and Mu'tazilite. Moreover, there are two main Islamic branches: Shiite and Sunni. From the theological point of view, the Shiite theoretical framework is similar to the Mu'tazilite one. According to the Shiite and Mu'tazilite perspectives, moral goodness and badness can be discovered by reason, on its own. Accordingly, bioethical judgments can be made based on the very concept of human dignity rather than merely resorting to the Holy Scripture or religious jurisprudential deliberations. As far as PVE is concerned, the majority of Shiite scholars do not recognize a person's right to die voluntarily. Similarly, on the basis of Kantian ethical themes, PVE is immoral, categorically speaking. According to the Shiite framework, however, PVE could be moral in some ethical contexts. In other words, in such contexts, the way in which Shiite scholars deal with PVE is more similar to Rossian ethics than to the Kantian one.

  5. The compatibility between Shiite and Kantian approach to passive voluntary euthanasia

    PubMed Central

    Dabbagh, Soroush; Aramesh, Kiarash

    2009-01-01

    Euthanasia is one of the controversial topics in current medical ethics. Among the six well-known types of euthanasia, passive voluntary euthanasia (PVE) seems to be more plausible in comparison with other types, from the moral point of view. According to the Kantian framework, ethical features come from ‘reason’. Maxims are formulated as categorical imperative which has three different versions. Moreover, the second version of categorical imperative which is dubbed ‘principle of ends’ is associated with human dignity. It follows from this that human dignity has an indisputable role in the Kantian story. On the other hand, there are two main theological schools in Islamic tradition which are called: Ash’arite and Mu’tazilite. Moreover, there are two main Islamic branches: Shiite and Sunni. From the theological point of view, Shiite’s theoretical framework is similar to the Mu’tazilite one. According to Shiite and Mu’tazilite perspectives, moral goodness and badness can be discovered by reason, on its own. Accordingly, bioethical judgments can be made based on the very concept of human dignity rather than merely resorting to the Holy Scripture or religious jurisprudential deliberations. As far as PVE is concerned, the majority of Shiite scholars do not recognize a person’s right to die voluntarily. Similarly, on the basis of Kantian ethical themes, PVE is immoral, categorically speaking. According to Shiite framework, however, PVE could be moral in some ethical contexts. In other words, in such contexts, the way in which Shiite scholars deal with PVE is more similar to Rossian ethics rather than the Kantian one. PMID:23908735

  6. Yes! An object-oriented compiler compiler (YOOCC)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Avotins, J.; Mingins, C.; Schmidt, H.

    1995-12-31

    Grammar-based processor generation is one of the most widely studied areas in language processor construction. However, there have been very few approaches to date that reconcile object-oriented principles, processor generation, and an object-oriented language. Pertinent here also is that developing a processor using the Eiffel Parse libraries currently requires far too much time to be expended on tasks that can be automated. For these reasons, we have developed YOOCC (Yes! an Object-Oriented Compiler Compiler), which produces a processor framework from a grammar using an enhanced version of the Eiffel Parse libraries, incorporating the ideas hypothesized by Meyer, and Grape and Walden, as well as many others. Various essential changes have been made to the Eiffel Parse libraries. Examples are presented to illustrate the development of a processor using YOOCC, and it is concluded that the Eiffel Parse libraries are now not only an intelligent but also a productive option for processor construction.
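
    The core idea of grammar-based, object-oriented processor generation, in which each grammar production maps to a class in the generated framework, can be sketched roughly like this (the class names are invented for illustration and are not the Eiffel Parse API):

```python
class Node:
    """Base class for generated grammar constructs."""
    def parse(self, tokens, pos):
        raise NotImplementedError

class Terminal(Node):
    """Matches one literal token."""
    def __init__(self, text):
        self.text = text
    def parse(self, tokens, pos):
        if pos < len(tokens) and tokens[pos] == self.text:
            return pos + 1        # next position on success
        return None               # failure

class Sequence(Node):
    """One class per production: A -> B C becomes Sequence(B, C)."""
    def __init__(self, *parts):
        self.parts = parts
    def parse(self, tokens, pos):
        for part in self.parts:
            pos = part.parse(tokens, pos)
            if pos is None:
                return None
        return pos

# Production: stmt -> "print" expr ";"  (expr kept trivial here)
stmt = Sequence(Terminal("print"), Terminal("x"), Terminal(";"))
print(stmt.parse(["print", "x", ";"], 0))   # position after a full match
```

A compiler compiler in this style emits such classes from the grammar automatically, so the developer subclasses or extends them rather than writing the parser by hand.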

  7. Compile-time estimation of communication costs in multicomputers

    NASA Technical Reports Server (NTRS)

    Gupta, Manish; Banerjee, Prithviraj

    1991-01-01

    An important problem facing numerous research projects on parallelizing compilers for distributed memory machines is that of automatically determining a suitable data partitioning scheme for a program. Any strategy for automatic data partitioning needs a mechanism for estimating the performance of a program under a given partitioning scheme, the most crucial part of which involves determining the communication costs incurred by the program. A methodology is described for estimating the communication costs at compile-time as functions of the numbers of processors over which various arrays are distributed. A strategy is described along with its theoretical basis, for making program transformations that expose opportunities for combining of messages, leading to considerable savings in the communication costs. For certain loops with regular dependences, the compiler can detect the possibility of pipelining, and thus estimate communication costs more accurately than it could otherwise. These results are of great significance to any parallelization system supporting numeric applications on multicomputers. In particular, they lay down a framework for effective synthesis of communication on multicomputers from sequential program references.
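
    A simplified cost model of the kind the abstract describes might look like this (the constants and the linear startup-plus-volume form are our own illustrative assumptions, not the paper's model); it shows why message combining pays off, since the per-message startup term dominates when many small messages are sent:

```python
# Illustrative constants, in arbitrary time units (assumed values).
STARTUP = 100.0    # fixed cost to initiate one message
PER_ELEM = 1.0     # cost to transfer one array element

def comm_cost(num_messages, elements):
    """Estimated communication cost: startup per message plus volume."""
    return num_messages * STARTUP + elements * PER_ELEM

# 64 one-element messages vs. one combined 64-element message:
separate = comm_cost(64, 64)   # 64 startups dominate
combined = comm_cost(1, 64)    # one startup, same data volume
print(separate, combined)
```

A compile-time estimator evaluates such expressions symbolically in the number of processors over which each array is distributed, which is what lets it rank candidate data partitioning schemes.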

  8. Defense Health Agency and the Deployment of the Electronic Health Record: Building an Organizational Framework for Implementation and Sustainment

    DTIC Science & Technology

    2016-12-01

    vital component of hospital and information system management within a healthcare system, but it may be minimally realized by healthcare systems...both government and non-government hospital systems to address them in their strategic and operational plans. Nonetheless, EHR systems are a very...strategic imperatives. Trkman (2010) suggested in the Journal of Information Management, "Implementing change within an organization is dependent

  9. [Hematologic malignancies in pregnancy].

    PubMed

    Doubek, R; Petrovová, D; Kalvodová, J; Doubek, M

    2009-04-01

    To summarize available data concerning hematologic malignancies in pregnancy. Review article. Department of Obstetrics and Gynecology, Faculty of Medicine, Masaryk University and University Hospital Brno. Compilation of published data from the scientific literature. Cancer complicating pregnancy is a rare coexistence, with an incidence of approximately 1 in 1,000 pregnancies. The most frequent hematologic malignant tumor is Hodgkin's lymphoma; leukemia is less frequent, and myeloproliferative diseases complicating pregnancy are sporadic. Symptoms of these diseases are often nonspecific and disguised in pregnancy, so the diagnosis can be late. It is imperative that a multidisciplinary team involving a hemato-oncologist and an obstetrician (and a pediatric specialist) care for patients with hematologic malignancies. Clearly, every patient has to know the whole prognosis and all risk factors of treatment. The optimum timing of delivery is after the 36th week of pregnancy (when chemotherapy ended more than two weeks earlier). We prefer vaginal delivery to caesarean section.

  10. Defining the unique role of the specialist district nurse practitioner.

    PubMed

    Barrett, Anne; Latham, Dinah; Levermore, Joy

    2007-10-01

    Due to the reorganization of primary care trusts across the country, certain trusts proposed a reduction in the specialist district nurse practitioner numbers in favour of less qualified community nurses and health care assistants. Such proposals in one PCT were blocked, partly in response to documentation compiled by practitioners at the sharp end of nursing practice. With the new agenda of practice based commissioning, it is imperative that commissioners and management alike are aware of the scope of specialist district nurse practitioners. This is the first of a series of articles looking at specific case histories where the role of the district nurse is highlighted. It is the intention to stress the importance of the clinical expertise and confidence required by the district nurse to care for patients with complex needs in the community.

  11. Health information systems: a survey of frameworks for developing countries.

    PubMed

    Marcelo, A B

    2010-01-01

    The objective of this paper is to perform a survey of excellent research on health information systems (HIS) analysis and design, and their underlying theoretical frameworks. It classifies these frameworks along major themes, and analyzes the different approaches to HIS development that are practical in resource-constrained environments. Literature review based on PubMed citations and conference proceedings, as well as Internet searches on information systems in general, and health information systems in particular. The field of health information systems development has been studied extensively. Despite this, failed implementations are still common. Theoretical frameworks for HIS development are available that can guide implementers. As awareness, acceptance, and demand for health information systems increase globally, the variety of approaches and strategies will also follow. For developing countries with scarce resources, a trial-and-error approach can be very costly. Lessons from the successes and failures of initial HIS implementations have been abstracted into theoretical frameworks. These frameworks organize complex HIS concepts into methodologies that standardize techniques in implementation. As globalization continues to impact healthcare in the developing world, demand for more responsive health systems will become urgent. More comprehensive frameworks and practical tools to guide HIS implementers will be imperative.

  12. ENKI - An Open Source environmental modelling platform

    NASA Astrophysics Data System (ADS)

    Kolberg, S.; Bruland, O.

    2012-04-01

    The ENKI software framework for implementing spatio-temporal models is now released under the LGPL license. Originally developed for the evaluation and comparison of distributed hydrological model compositions, ENKI can be used to simulate any time-evolving process over a spatial domain. The core approach is to connect a set of user-specified subroutines into a complete simulation model, and to provide all the administrative services needed to calibrate and run that model. This includes functionality for geographical region setup, all file I/O, calibration and uncertainty estimation, etc. The approach makes it easy for students, researchers and other model developers to implement, exchange, and test single routines and various model compositions in a fixed framework. The open-source license and modular design of ENKI will also facilitate rapid dissemination of new methods to institutions engaged in operational water resource management. ENKI uses a plug-in structure to invoke separately compiled subroutines built as dynamic-link libraries (DLLs). The source code of an ENKI routine is highly compact, with a narrow framework-routine interface allowing the main program to recognise the number, types, and names of the routine's variables. The framework then exposes these variables to the user within the proper context, ensuring that distributed maps coincide spatially, that time series exist for input variables, that states are initialised, that GIS data sets exist for static map data, that parameters take manually or automatically calibrated values, etc. By using function calls and in-memory data structures to invoke routines and facilitate information flow, ENKI provides good performance. For a typical distributed hydrological model setup in a spatial domain of 25000 grid cells, 3-4 simulated time steps per second should be expected. Future adaptation to parallel processing may further increase this speed.
New modifications to ENKI include a full separation of the API from the user interface, making it possible to run ENKI from GIS programs and other software environments. ENKI currently compiles only under Windows with Visual Studio, but there are plans to remove these platform and compiler dependencies.

  13. On the Drive Specificity of Freudian Drives for the Generation of SEEKING Activities: The Importance of the Underestimated Imperative Motor Factor

    PubMed Central

    Kirsch, Michael; Mertens, Wolfgang

    2018-01-01

    Critics of Freud’s theory of drives frequently claim that his approach is outdated and therefore cannot be useful for solving current problems in patients with mental disorders. At present, many scientists believe that affects rather than drives are of utmost importance for emotional life, and the theoretical framework of affective neuroscience, developed by Panksepp, strongly underpins this view. Panksepp identified seven so-called command systems, among which the SEEKING system is of central importance. Panksepp used Pankseppian drives as inputs for the SEEKING system, but his description left unexplained how drive-specific SEEKING activities are generated. Drive specificity requires dual action of the drive: the activation of a drive-specific brain area and the release of the neurotransmitter dopamine. Since Freud claimed drive specificity too, we analyzed here whether a Freudian drive can evoke the generation of drive-specific SEEKING activities. Special attention was paid to the imperative motor factor in Freud’s drive theory, because Panksepp’s formulations focused on neural pathways without specifying the underlying neurotransmitter/endocrine factors impelling motor activity. Because Panksepp classed sleep as a Pankseppian drive, we first had to classify sleep as a Freudian drive, using three established criteria for a Freudian drive. It was then possible to identify the imperative motor factors of hunger, thirst, sex, and sleep. Most importantly, all of these imperative motor factors can both activate a drive-specific brain area and release dopamine from dopaminergic neurons, i.e., they can achieve the so-called drive specificity.
Surprisingly, an impaired Freudian drive can, via endocrinological pathways, alter the concentration of the imperative motor factor of a second Freudian drive, apparently somewhat independently of the level of the metabolic deficit, thereby offering the possibility of modulating the generation of SEEKING activities of this second Freudian drive. This novel possibility might help to refine the general understanding of the action of Freudian drives. As only the imperative motor factors of Freudian drives can guarantee drive specificity for the generation of SEEKING activities, the impact of Freud’s construct Eros (with its constituents hunger, thirst, sex, and sleep) should be revisited. PMID:29774002

  14. LIBVERSIONINGCOMPILER: An easy-to-use library for dynamic generation and invocation of multiple code versions

    NASA Astrophysics Data System (ADS)

    Cherubin, S.; Agosta, G.

    2018-01-01

    We present LIBVERSIONINGCOMPILER, a C++ library designed to support the dynamic generation of multiple versions of the same compute kernel in an HPC scenario. It can be used to provide continuous optimization, code specialization based on the input data or on workload changes, or otherwise to dynamically adjust the application, without the burden of a full dynamic compiler. The library supports multiple underlying compilers but specifically targets the LLVM framework. We also provide examples of use, showing the overhead of the library and providing guidelines for its efficient use.

  15. Evolving African attitudes to European education: Resistance, pervert effects of the single system paradox, and the ubuntu framework for renewal

    NASA Astrophysics Data System (ADS)

    Assié-Lumumba, N'Dri Thérèse

    2016-02-01

    This paper is a reflection that critically examines the dynamics of education and the struggle by African people for freedom, control of the mind, self-definition and the right to determine their own destiny from the start of colonial rule to the present. The primary methodological approach is historical structuralism, which stipulates that social reality and facts are determined and created by social agents within structural and historical contingencies. It addresses some of the most powerful challenges and contradictions that explain the ineffectiveness of numerous post-independence reforms, and presents the arguments for relevance and use of African languages, for instance, that have been made since the 1960s. The first section of the paper deals with the colonial imperatives for setting new education systems in the colonised societies of Africa and the initial attitudes of the Africans towards colonial education. The second section critically examines the evolving meanings of Western education in Europeanising African societies, the articulation of their rationale and the mechanism for resistance. It analyses the turning point when Africans began to embrace European education and demand it in the colonial and post-independence era. The third section addresses the roots of the inadequacies of received post-colonial education and the imperative of deconstruction and re-appropriation of African education using an ubuntu framework for an African renewal.

  16. Geologic framework, structure, and hydrogeologic characteristics of the Knippa Gap area in eastern Uvalde and western Medina Counties, Texas

    USGS Publications Warehouse

    Clark, Allan K.; Pedraza, Diana E.; Morris, Robert R.

    2013-01-01

    By using data that were compiled and collected for this study and previous studies, a revised map was constructed depicting the geologic framework, structure, and hydrogeologic characteristics of the Knippa Gap area in eastern Uvalde and western Medina Counties, Tex. The map also shows the interpreted structural dip directions and interpreted location of a structural low (trough) in the area known as the Knippa Gap.

  17. Ageism vs the technical imperative, applying the GRADE framework to the evidence on hemodialysis in very elderly patients

    PubMed Central

    Thorsteinsdottir, Bjorg; Montori, Victor M; Prokop, Larry J; Murad, Mohammad Hassan

    2013-01-01

    Purpose Treatment intensity for elderly patients with end-stage renal disease has escalated beyond population growth. Ageism seems to have given way to a powerful imperative to treat patients irrespective of age, prognosis, or functional status. Hemodialysis (HD) is a prime example of this trend. Recent articles have questioned this practice. This paper aims to identify existing pre-synthesized evidence on HD in the very elderly and frame it from the perspective of a clinician who needs to involve their patient in a treatment decision. Patients and methods A comprehensive search of several databases from January 2002 to August 2012 was conducted for systematic reviews of clinical and economic outcomes of HD in the elderly. We also contacted experts to identify additional references. We applied the rigorous framework of decisional factors of the Grading of Recommendation, Assessment, Development and Evaluation (GRADE) to evaluate the quality of evidence and strength of recommendations. Results We found nine eligible systematic reviews. The quality of the evidence to support the current recommendation of HD initiation for most very elderly patients is very low. There is significant uncertainty in the balance of benefits and risks, patient preference, and whether default HD in this patient population is a wise use of resources. Conclusion Following the GRADE framework, recommendation for HD in this population would be weak. This means it should not be considered standard of care and should only be started based on the well-informed patient’s values and preferences. More studies are needed to delineate the true treatment effect and to guide future practice and policy. PMID:23847412

  18. Framework for adaptive interoperability of manufacturing enterprises (FAIME): a case study

    NASA Astrophysics Data System (ADS)

    Sims, John E.; Chu, Bei Tseng B.; Long, Junshen; Matthews, Mike; Barnes, Johnny G.; Jones, Chris H.; Anderson, Rayne A.; Lambert, Russ; Drake, Doug C.; Hamilton, Mark A.; Connard, Mark

    1997-01-01

    In today's global economy, manufacturing industries need to connect disparate applications seamlessly. They need not only to exchange data and transactions, but also to present a single business-process image to their employees in the office, at headquarters, and on the plant floor. It is also imperative that small and medium-sized manufacturing companies deploy manufacturing execution system applications in conjunction with modern enterprise resource planning programs for cycle-time reduction and better quality. This paper presents experiences and reflections from a project that created a tool set to help accomplish the above not only in a shorter cycle time, with more predictable quality, and with an object-oriented framework, but also while allowing the manufacturer to continue using legacy applications. The framework has plug-and-play capability so that future migrations and re-engineering of processes are more productive.

  19. Livermore Compiler Analysis Loop Suite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hornung, R. D.

    2013-03-01

    LCALS is designed to evaluate compiler optimizations and performance of a variety of loop kernels and loop traversal software constructs. Some of the loop kernels are pulled directly from "Livermore Loops Coded in C", developed at LLNL (see item 11 below for details of earlier code versions). The older suites were used to evaluate floating-point performance of hardware platforms prior to porting larger application codes. The LCALS suite is geared toward assessing C++ compiler optimizations and platform performance related to SIMD vectorization, OpenMP threading, and advanced C++ language features. LCALS contains 20 of 24 loop kernels from the older Livermore Loops suites, plus various others representative of loops found in current production application codes at LLNL. The latter loops emphasize more diverse loop constructs and data access patterns than the others, such as multi-dimensional difference stencils. The loops are included in a configurable framework, which allows control of compilation, loop sampling for execution timing, which loops are run, and their lengths. It generates timing statistics for analyzing and comparing variants of individual loops. Also, it is easy to add loops to the suite as desired.

  20. Fast interrupt platform for extended DOS

    NASA Technical Reports Server (NTRS)

    Duryea, T. W.

    1995-01-01

    Extended DOS offers the unique combination of a simple operating system which allows direct access to the interrupt tables, 32 bit protected mode access to 4096 MByte address space, and the use of industry standard C compilers. The drawback is that fast interrupt handling requires both 32 bit and 16 bit versions of each real-time process interrupt handler to avoid mode switches on the interrupts. A set of tools has been developed which automates the process of transforming the output of a standard 32 bit C compiler to 16 bit interrupt code which directly handles the real mode interrupts. The entire process compiles one set of source code via a make file, which boosts productivity by making the management of the compile-link cycle very simple. The software components are in the form of classes written mostly in C. A foreground process written as a conventional application which can use the standard C libraries can communicate with the background real-time classes via a message passing mechanism. The platform thus enables the integration of high performance real-time processing into a conventional application framework.

  1. Nitrogen Source and Loading Data for EPA Estuary Data Mapper

    EPA Science Inventory

    Nitrogen source and loading data have been compiled and aggregated at the scale of estuaries and associated watersheds of the conterminous United States, using the spatial framework in EPA's Estuary Data Mapper (EDM) to provide system boundaries. Original sources of data include...

  2. Precarious Employment and Quality of Employment in Relation to Health and Well-being in Europe.

    PubMed

    Julià, Mireia; Vanroelen, Christophe; Bosmans, Kim; Van Aerden, Karen; Benach, Joan

    2017-07-01

    This article presents an overview of the recent work on precarious employment and employment quality in relation to workers' health and well-being. More specifically, the article mainly reviews the work performed in the E.U. 7th Framework project, SOPHIE. First, we present our overarching conceptual framework. Then, we provide a compiled overview of the evidence on the sociodemographic and European cross-country distribution of employment quality and employment precariousness. Subsequently, we provide the current evidence regarding the relations with health and broader worker well-being indicators. A final section summarizes current insights on the pathways relating precarious employment and health and well-being. The article concludes with a plea for further data collection and research into the longitudinal effects of employment precariousness among emerging groups of workers. Based on the evidence compiled in this article, policymakers should be convinced of the harmful health and well-being effects of employment precariousness and (further) labor market flexibilization.

  3. Scout: high-performance heterogeneous computing made simple

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jablin, James; Mc Cormick, Patrick; Herlihy, Maurice

    2011-01-26

    Researchers must often write their own simulation and analysis software. During this process they simultaneously confront both computational and scientific problems. Current strategies for aiding the generation of performance-oriented programs do not abstract the software development from the science. Furthermore, the problem is becoming increasingly complex and pressing with the continued development of many-core and heterogeneous (CPU-GPU) architectures. To achieve high performance, scientists must expertly navigate both software and hardware. Co-design between computer scientists and research scientists can alleviate but not solve this problem. The science community requires better tools for developing, optimizing, and future-proofing codes, allowing scientists to focus on their research while still achieving high computational performance. Scout is a parallel programming language and extensible compiler framework targeting heterogeneous architectures. It provides the abstraction required to buffer scientists from the constantly shifting details of hardware while still realizing high performance by encapsulating software and hardware optimization within a compiler framework.

  4. Infusing culture into oncology research on quality of life.

    PubMed

    Ashing-Giwa, Kimlin; Kagawa-Singer, Marjorie

    2006-01-01

    To review the literature relevant to understanding culturally informed oncology research, particularly as it relates to health-related quality of life. Published articles and books. A cultural perspective to the prevailing theory and research methods used in oncology research with respect to quality of life is imperative. A multidimensional and practical framework can be applied to increase cultural competence in research by addressing the purpose of the research, theoretical framework, and methodologic approaches. Culturally competent, multicultural research will help the scientific community better comprehend disparities that exist in health-related quality of life so that benefits can be experienced by all patients. Nursing practice and research must continue its leadership role to infuse cultural competence and reduce disparities in the healthcare system.

  5. Formal Compiler Implementation in a Logical Framework

    DTIC Science & Technology

    2003-04-29

    MetaPRL is a tactic-based prover that uses OCaml [20] as its meta-language. When a rewrite is defined in MetaPRL, the framework creates an OCaml expression that can be used to apply the rewrite. Code to guide the application of rewrites is written in OCaml, using a rich set of primitives provided by MetaPRL. MetaPRL automates the construction of most guidance code.

  6. Theoretical links supporting the use of problem-based learning in the education of the nurse practitioner.

    PubMed

    Chikotas, Noreen Elaine

    2008-01-01

    The need to evaluate current strategies for educating the advanced practice nurse, specifically the nurse practitioner, is becoming ever more imperative due to the ever-changing health care environment. This article addresses the role of problem-based learning (PBL) as an instructional strategy in educating and preparing the nurse practitioner for future practice. Two theoretical frameworks supporting PBL, andragogy and constructivism, are presented as important to the use of PBL in the education of the nurse practitioner.

  7. Drug addiction and diabetes: South Asian action.

    PubMed

    Singh Balhara, Yatan Pal; Kalra, Sanjay

    2017-06-01

    Both diabetes and drug addiction are common phenomena across the world. Drug abuse impacts glycaemic control in multiple ways. It becomes imperative, therefore, to share guidance on drug deaddiction in persons with diabetes. The South Asian subcontinent is home to specific forms and patterns of drug abuse. Detailed study is needed to ensure good clinical practice regarding the same. This communication provides a simple and pragmatic framework to address this issue, while calling for concerted action on drug deaddiction in South Asia.

  8. Proposing a Universal Framework for Resilience: Optimizing Risk and Combating Human Vulnerabilities

    NASA Astrophysics Data System (ADS)

    Sarkar, Arunima

    2017-04-01

    In recent years we have seen massive losses to urban settlements and critical infrastructure as a result of disasters. Disaster risk is bound up with vulnerabilities and complexities that can disrupt the functioning of human society. The uncertain losses created by disasters present unforeseeable risks that remain beyond full human understanding. It is imperative to note that human urbanization and development are correlated with human vulnerabilities and with the challenges posed by disasters. Disaster risks are aggravated by improper planning of cities, weak frameworks for urban governance and regulation, and inequalities among citizens. The international agenda on disaster risk reduction speaks of increasing losses due to disasters associated with development and urbanization. The United Nations designated the 1990s as the International Decade for Natural Disaster Reduction. In relation to this, the "Yokohama Strategy and Plan of Action" was adopted at the first United Nations World Conference on Disaster Reduction. The United Nations Educational, Scientific and Cultural Organization's (UNESCO) Intergovernmental Oceanographic Commission coordinated the World Conference on Disaster Reduction in 2005, where the Hyogo Framework for Action was adopted. The Hyogo Framework for Action: Building the Resilience of Communities to Disasters was adopted by 168 nations after the massive losses caused by the 2004 Indian Ocean tsunami. The Hyogo Framework focuses on implementing risk and reliability systems to guard against disasters and proposes global scientific and community platforms for disaster prevention and mitigation. The importance of early warning systems as effective tools for reducing human vulnerabilities in disaster management was strongly emphasized.
A resilience framework is important for minimizing the cost of disruption to critical infrastructure and for strengthening and optimizing decision-making for a better, more sustainable society. Such a framework provides a cross-sector, multi-level analysis for tackling the vulnerabilities of essential utilities such as power, water, and transport, and of the machinery essential for human sustainability. It focuses on preventing damage and disruption from disasters, mitigating the losses caused to human society, and providing the best response for disaster resilience. The basic pillars for implementing resilience are thus a proper governance framework and transparency that take into account cost and risk analysis. A common, universal framework for resilience is therefore the main requirement for broad accessibility; its aims are universal adaptability, coherence, and validation. A mixed-method analysis has been undertaken in this research paper, focusing on the following issues: • Legal, institutional, and community frameworks for integrating the resilience frameworks of the global North and global South. • Spatial and statistical analysis to structure a disaster risk and resilience framework for disaster management. • Early warning systems and emergency response, compared across the risk and resilience models implemented in the USA, China, Nepal, and India, in order to propose an integrated resilience strategy.

  9. Block Scheduling: Teaching Strategies for the Restructured School Day.

    ERIC Educational Resources Information Center

    National Science Teachers Association, Arlington, VA.

    This book is a compilation of articles taken from the National Science Teachers Association (NSTA) journal entitled "The Science Teacher" that pertain to block scheduling and strategies for effective science instruction within this framework. Articles include "Blockbuster Ideas" (Judy Bohince and Ireve King), "Tackling Block Scheduling" (Martha M.…

  10. Assessing Multidimensional Energy Literacy of Secondary Students Using Contextualized Assessment

    ERIC Educational Resources Information Center

    Chen, Kuan-Li; Liu, Shiang-Yao; Chen, Po-Hsi

    2015-01-01

    Energy literacy is multidimensional, comprising broad content knowledge as well as affect and behavior. Our previous study has defined four core dimensions for the assessment framework, including energy concepts, reasoning on energy issues, low-carbon lifestyle, and civic responsibility for a sustainable society. The present study compiled a…

  11. C++QEDv2: The multi-array concept and compile-time algorithms in the definition of composite quantum systems

    NASA Astrophysics Data System (ADS)

    Vukics, András

    2012-06-01

    C++QED is a versatile framework for simulating open quantum dynamics. It allows the user to build arbitrarily complex quantum systems from elementary free subsystems and interactions, and to simulate their time evolution with the available time-evolution drivers. Through this framework, we introduce a design which should be generic for high-level representations of composite quantum systems. It relies heavily on the object-oriented and generic programming paradigms on the one hand, and on compile-time algorithms, in particular C++ template-metaprogramming techniques, on the other. The core of the design is the data structure which represents the state vectors of composite quantum systems. This data structure models the multi-array concept. The use of template metaprogramming is not only crucial to the design; with its use, all computations pertaining to the layout of the simulated system can be shifted to compile time, reducing runtime. Program summary: Program title: C++QED Catalogue identifier: AELU_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AELU_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: http://cpc.cs.qub.ac.uk/licence/aelu_v1_0.html. The C++QED package contains other software packages, Blitz, Boost and FLENS, all of which may be distributed freely but have individual license requirements. Please see individual packages for license conditions. No. of lines in distributed program, including test data, etc.: 597 974 No. of bytes in distributed program, including test data, etc.: 4 874 839 Distribution format: tar.gz Programming language: C++ Computer: i386-i686, x86_64 Operating system: In principle cross-platform, as yet tested only on UNIX-like systems (including Mac OS X). RAM: The framework itself takes about 60 MB, which is fully shared. The additional memory taken by the program which defines the actual physical system (script) is typically less than 1 MB.
The memory storing the actual data scales with the system dimension for state-vector manipulations, and the square of the dimension for density-operator manipulations. This might easily be GBs, and often the memory of the machine limits the size of the simulated system. Classification: 4.3, 4.13, 6.2, 20 External routines: Boost C++ libraries (http://www.boost.org/), GNU Scientific Library (http://www.gnu.org/software/gsl/), Blitz++ (http://www.oonumerics.org/blitz/), Linear Algebra Package - Flexible Library for Efficient Numerical Solutions (http://flens.sourceforge.net/). Nature of problem: Definition of (open) composite quantum systems out of elementary building blocks [1]. Manipulation of such systems, with emphasis on dynamical simulations such as Master-equation evolution [2] and Monte Carlo wave-function simulation [3]. Solution method: Master equation, Monte Carlo wave-function method. Restrictions: Total dimensionality of the system. Master equation - few thousands. Monte Carlo wave-function trajectory - several millions. Unusual features: Because of the heavy use of compile-time algorithms, compilation of programs written in the framework may take a long time and much memory (up to several GBs). Additional comments: The framework is not a program, but provides and implements an application-programming interface for developing simulations in the indicated problem domain. Supplementary information: http://cppqed.sourceforge.net/. Running time: Depending on the magnitude of the problem, can vary from a few seconds to weeks.

  12. An Evaluation Framework and Comparative Analysis of the Widely Used First Programming Languages

    PubMed Central

    Farooq, Muhammad Shoaib; Khan, Sher Afzal; Ahmad, Farooq; Islam, Saeed; Abid, Adnan

    2014-01-01

    Computer programming is the core of the computer science curriculum. Several programming languages have been used to teach the first course in computer programming, and such languages are referred to as the first programming language (FPL). The pool of programming languages has been evolving with the development of new languages, and from this pool different languages have been used as the FPL at different times. Though the selection of an appropriate FPL is very important, it has been a controversial issue in the presence of many choices. Many efforts have been made to design a good FPL; however, there is no adequate way to evaluate and compare the existing languages so as to find the most suitable FPL. In this article, we have proposed a framework to evaluate existing imperative and object-oriented languages for their suitability as an appropriate FPL. Furthermore, based on the proposed framework, we have devised a customizable scoring function to compute a quantitative suitability score for a language, which reflects its conformance to the proposed framework. Lastly, we have also evaluated the conformance of the widely used FPLs to the proposed framework, and have also computed their suitability scores. PMID:24586449

  13. An evaluation framework and comparative analysis of the widely used first programming languages.

    PubMed

    Farooq, Muhammad Shoaib; Khan, Sher Afzal; Ahmad, Farooq; Islam, Saeed; Abid, Adnan

    2014-01-01

    Computer programming is the core of the computer science curriculum. Several programming languages have been used to teach the first course in computer programming, and such languages are referred to as the first programming language (FPL). The pool of programming languages has been evolving with the development of new languages, and from this pool different languages have been used as the FPL at different times. Though the selection of an appropriate FPL is very important, it has been a controversial issue in the presence of many choices. Many efforts have been made to design a good FPL; however, there is no adequate way to evaluate and compare the existing languages so as to find the most suitable FPL. In this article, we have proposed a framework to evaluate existing imperative and object-oriented languages for their suitability as an appropriate FPL. Furthermore, based on the proposed framework, we have devised a customizable scoring function to compute a quantitative suitability score for a language, which reflects its conformance to the proposed framework. Lastly, we have also evaluated the conformance of the widely used FPLs to the proposed framework, and have also computed their suitability scores.

  14. Data sharing in stem cell translational science: policy statement by the International Stem Cell Forum Ethics Working Party.

    PubMed

    Bredenoord, Annelien L; Mostert, Menno; Isasi, Rosario; Knoppers, Bartha M

    2015-01-01

    Data and sample sharing constitute a scientific and ethical imperative but need to be conducted in a responsible manner in order to protect individual interests as well as maintain public trust. In 2014, the Global Alliance for Genomics and Health (GA4GH) adopted a common Framework for Responsible Sharing of Genomic and Health-Related Data. The GA4GH Framework is applicable to data sharing in the stem cell field, however, interpretation is required so as to provide guidance for this specific context. In this paper, the International Stem Cell Forum Ethics Working Party discusses those principles that are specific to translational stem cell science, including engagement, data quality and safety, privacy, security and confidentiality, risk-benefit analysis and sustainability.

  15. Three Sets of Case Studies Suggest Logic and Consistency Challenges with Value Frameworks.

    PubMed

    Cohen, Joshua T; Anderson, Jordan E; Neumann, Peter J

    2017-02-01

    To assess the logic and consistency of three prominent value frameworks. We reviewed the value frameworks from three organizations: the Memorial Sloan Kettering Cancer Center (DrugAbacus), the American Society of Clinical Oncologists, and the Institute for Clinical and Economic Review. For each framework, we developed case studies to explore the degree to which the frameworks have face validity in the sense that they are consistent with four important principles: value should be proportional to a therapy's benefit; components of value should matter to framework users (patients and payers); attribute weights should reflect user preferences; and value estimates used to inform therapy prices should reflect per-person benefit. All three frameworks can aid decision making by elucidating factors not explicitly addressed by conventional evaluation techniques (in particular, cost-effectiveness analyses). Our case studies identified four challenges: 1) value is not always proportional to benefit; 2) value reflects factors that may not be relevant to framework users (patients or payers); 3) attribute weights do not necessarily reflect user preferences or relate to value in ways that are transparent; and 4) value does not reflect per-person benefit. Although the value frameworks we reviewed capture value in a way that is important to various audiences, they are not always logical or consistent. Because these frameworks may have a growing influence on therapy access, it is imperative that analytic challenges be further explored. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  16. ZettaBricks: A Language Compiler and Runtime System for Anyscale Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amarasinghe, Saman

    This grant supported the ZettaBricks and OpenTuner projects. ZettaBricks is a new implicitly parallel language and compiler in which defining multiple implementations of multiple algorithms to solve a problem is the natural way of programming. ZettaBricks makes algorithmic choice a first-class construct of the language. Choices are provided in a way that also allows our compiler to tune at a finer granularity. The ZettaBricks compiler autotunes programs by making both fine-grained and algorithmic choices. Choices also include different automatic parallelization techniques, data distributions, algorithmic parameters, transformations, and blocking. Additionally, ZettaBricks introduces novel techniques to autotune algorithms for different convergence criteria. When choosing between various direct and iterative methods, the ZettaBricks compiler is able to tune a program such that it delivers near-optimal efficiency for any desired level of accuracy. The compiler has the flexibility of utilizing different convergence criteria for the various components within a single algorithm, providing the user with accuracy choice alongside algorithmic choice. OpenTuner is a generalization of the experience gained in building an autotuner for ZettaBricks. OpenTuner is a new open-source framework for building domain-specific multi-objective program autotuners. OpenTuner supports fully customizable configuration representations, an extensible technique representation to allow for domain-specific techniques, and an easy-to-use interface for communicating with the program to be autotuned. A key capability inside OpenTuner is the use of ensembles of disparate search techniques simultaneously; techniques that perform well will dynamically be allocated a larger proportion of tests.
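
    The ensemble-of-search-techniques idea can be illustrated with a small, self-contained sketch in plain Python (this is not OpenTuner's actual API): several techniques propose configurations, and a technique that finds improvements earns a larger share of the remaining test budget.

```python
import random

def autotune(objective, n_params, lo, hi, budget=300, seed=0):
    """Minimize `objective` over integer configurations using an
    ensemble of search techniques; techniques that find improvements
    are dynamically allocated a larger share of the test budget."""
    rng = random.Random(seed)

    def random_search(best):
        # propose a fresh random configuration
        return [rng.randint(lo, hi) for _ in range(n_params)]

    def hill_climb(best):
        # perturb one coordinate of the incumbent by +/-1
        cfg = list(best)
        i = rng.randrange(n_params)
        cfg[i] = min(hi, max(lo, cfg[i] + rng.choice([-1, 1])))
        return cfg

    techniques = [random_search, hill_climb]
    credit = [1.0] * len(techniques)  # bandit-style allocation weights
    best = [rng.randint(lo, hi) for _ in range(n_params)]
    best_cost = objective(best)

    for _ in range(budget):
        # pick a technique with probability proportional to its credit
        t = rng.choices(range(len(techniques)), weights=credit)[0]
        candidate = techniques[t](best)
        cost = objective(candidate)
        if cost < best_cost:
            best, best_cost = candidate, cost
            credit[t] += 1.0      # reward the technique that improved
        else:
            credit[t] = max(0.5, credit[t] * 0.99)
    return best, best_cost
```

    Tuning three integer parameters toward a quadratic optimum, for instance, converges quickly under this budget because the hill-climbing technique accumulates credit once it starts improving.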

  17. Grade Repetition in Honduran Primary Schools

    ERIC Educational Resources Information Center

    Marshall, Jeffery H.

    2003-01-01

    This paper looks at several dimensions of the grade failure issue in Honduras using a unique data set compiled by the UMCE evaluation project in 1998 and 1999. The analytical framework incorporates econometric analysis of standardized tests and teacher pass/fail decisions for roughly 13,000 second and fourth grade students. The results show that…

  18. A Vision beyond Survival: A Resource Guide for Incarcerated Women.

    ERIC Educational Resources Information Center

    Smith, Brenda V., Ed.; Dailard, Cynthia, Ed.

    This guide is a compilation of material critical to incarcerated women and to women in the community who have a history involving the criminal justice system. It provides them with a framework for analyzing problems, formulating strategies for change, and crafting solutions. Section 1 deals with negotiating the prison system. Six chapters address…

  19. Raising the Degree of Service-Orientation of a SOA-based Software System: A Case Study

    DTIC Science & Technology

    2009-12-01

    protocols, as well as executable processes that can be compiled into runtime scripts” [2] The Business Process Modeling Notation (BPMN) provides a...Notation (BPMN) 1.2. Jan. 2009. URL: http://www.omg.org/spec/BPMN/1.2/ [25] .NET Framework Developer Center. .NET Remoting Overview. 2003. URL: http

  20. Colloquial Hebrew Imperatives Revisited

    ERIC Educational Resources Information Center

    Bolozky, Shmuel

    2009-01-01

    In revisiting Bolozky's [Bolozky, Shmuel, 1979. "On the new imperative in colloquial Hebrew." "Hebrew Annual Review" 3, 17-24] and Bat-El's [Bat-El, Outi, 2002. "True truncation in colloquial Hebrew imperatives." "Language" 78(4), 651-683] analyses of colloquial Hebrew imperatives, the article argues for restricting Imperative Truncation to the…

  1. Testing the Consolidated Framework for Implementation Research on health care innovations from South Yorkshire.

    PubMed

    Ilott, Irene; Gerrish, Kate; Booth, Andrew; Field, Becky

    2013-10-01

    There is an international imperative to implement research into clinical practice to improve health care. Understanding the dynamics of change requires knowledge from theoretical and empirical studies. This paper presents a novel approach to testing a new meta-theoretical framework: the Consolidated Framework for Implementation Research. The utility of the Framework was evaluated using a post hoc, deductive analysis of 11 narrative accounts of innovation in health care services and practice from England, collected in 2010. A matrix, comprising the five domains and 39 constructs of the Framework, was developed to examine the coherence of the terminology, to compare results across contexts and to identify new theoretical developments. The Framework captured the complexity of implementation across 11 diverse examples, offering theoretically informed, comprehensive coverage. The Framework drew attention to relevant points in individual cases together with patterns across cases; for example, all were internally developed innovations that brought direct or indirect patient advantage. In 10 cases, the change was led by clinicians. Most initiatives had been maintained for several years and there was evidence of spread in six examples. Areas for further development within the Framework include sustainability and patient/public engagement in implementation. Our analysis suggests that this conceptual framework has the potential to offer useful insights, whether as part of a situational analysis or by developing context-specific propositions for hypothesis testing. Such studies are vital now that innovation is being promoted as core business for health care. © 2012 John Wiley & Sons Ltd.

  2. Chronic disease management in children based on the five domains of health.

    PubMed

    So, Wing Lung Alvin

    2013-01-01

    Through a case study of a child with cystic fibrosis, the interactions among various domains of health have been discussed-namely, biomedical, physical, psychological/behavioural, and social. In pediatrics, development is another key domain relevant to the management of a chronic disease. An individualised management plan for this case has been outlined, and consideration of this framework may be worthwhile when managing other paediatric patients with chronic disease. Patient empowerment and parental education, as well as good co-ordination of health service delivery, are imperative to holistic patient care.

  3. Toward a Framework for Multicultural STEM-Focused Career Interventions.

    PubMed

    Byars-Winston, Angela

    2014-12-14

    Numerous federal and national commissions have called for policies, funds, and initiatives aimed at expanding the nation's science, technology, engineering, and mathematics (STEM) workforce and education investments to create a significantly larger, more diverse talent pool of individuals who pursue technical careers. Career development professionals are poised to contribute to the equity discourse about broadening STEM participation. However, few are aware of STEM-related career development matters, career opportunities and pathways, or strategies for promoting STEM pursuits. The author summarizes STEM education and workforce trends and articulates an equity imperative for broadening and diversifying STEM participation. The author then offers a multicultural STEM-focused career development framework to encourage career development professionals' knowledge and awareness of STEM education and careers and delineates considerations for practice aimed at increasing the attainment and achievement of diverse groups in STEM fields.

  5. Trials and Tribulations of Protecting Children from Environmental Hazards

    PubMed Central

    Lanphear, Bruce P.; Paulson, Jerome; Beirne, Sandra

    2006-01-01

    Society is increasingly aware of the profound impact that the environment has on children’s health. Not surprisingly, there is increasing public scrutiny about children’s exposures to environmental hazards, especially for disadvantaged children. These trends underscore the ethical imperative to develop a framework to protect children from environmental hazards. Such a framework must include regulations to test new chemicals and other potential hazards before they are marketed, a strategy to conduct research necessary to protect children from persistent hazards that are widely dispersed in their environment, stronger regulatory mechanisms to eliminate human exposures to recognized or suspected toxicants, and guidelines about the ethical conduct of research and the role of experimental trials that test the efficacy and safety of interventions to prevent or ameliorate children’s exposure to persistent toxicants or hazards that are widely dispersed in their environment. PMID:17035151

  6. 2020 Vision: Envisioning a New Generation of STEM Learning Research

    ERIC Educational Resources Information Center

    Dierking, Lynn D.; Falk, John H.

    2016-01-01

    In this issue, we have compiled six original papers, outcomes from the U.S. National Science Foundation (US-NSF)-funded REESE (Research and Evaluation on Education in Science and Engineering) 2020 Vision: The Next Generation of STEM Learning Research project. The purpose of 2020 Vision was to re-envision the questions and frameworks guiding STEM…

  7. Has Technology Been Considered? A Guide for IEP Teams. CASE/TAM Assistive Technology Policy and Practice Series.

    ERIC Educational Resources Information Center

    Chambers, A. C.

    This guide compiles information essential to a working knowledge of assistive technology for children with disabilities. It addresses the definition of assistive technology and provides information on laws which direct the provision of assistive technology. The manual provides a framework to guide the Individualized Education Program (IEP) team as…

  8. Looking Out and Looking In: Exploring a Case of Faculty Perceptions during E-Learning Staff Development

    ERIC Educational Resources Information Center

    Esterhuizen, Hendrik Daniel; Blignaut, Seugnet; Ellis, Suria

    2013-01-01

    This explorative study captured the perceptions of faculty members new to technology enhanced learning and the longitudinal observations of the e-learning manager during dedicated professional development in order to compile a socially transformative emergent learning technology integration framework for open and distance learning at the School of…

  9. A Language for Specifying Compiler Optimizations for Generic Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Willcock, Jeremiah J.

    2007-01-01

    Compiler optimization is important to software performance, and modern processor architectures make optimization even more critical. However, many modern software applications use libraries providing high levels of abstraction. Such libraries often hinder effective optimization because they are difficult to analyze using current compiler technology. For example, high-level libraries often use dynamic memory allocation and indirectly expressed control structures, such as iterator-based loops. Programs using these libraries often cannot achieve an optimal level of performance. On the other hand, software libraries have also been recognized as potentially aiding in program optimization. One proposed implementation of library-based optimization is to allow the library author, or a library user, to define custom analyses and optimizations. Only limited systems have been created to take advantage of this potential, however. One problem in creating a framework for defining new optimizations and analyses is how users are to specify them: implementing them by hand inside a compiler is difficult and prone to errors. Thus, a domain-specific language for library-based compiler optimizations would be beneficial. Many optimization specification languages have appeared in the literature, but they tend to be either limited in power or unnecessarily difficult to use. Therefore, I have designed, implemented, and evaluated the Pavilion language for specifying program analyses and optimizations, designed for library authors and users. These analyses and optimizations can be based on the implementation of a particular library, its use in a specific program, or on the properties of a broad range of types, expressed through concepts. The new system is intended to provide a high level of expressiveness, even though the intended users are unlikely to be compiler experts.
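
    The kind of library-level optimization specification described here can be sketched as a toy rewrite-rule pass over an expression tree; the rule syntax and the `fuse_map_map` rule below are invented for illustration and are not Pavilion's actual notation.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Call:
    fn: str
    args: tuple

RULES = []

def rule(f):
    # registry: a library author contributes rules via this decorator
    RULES.append(f)
    return f

@rule
def fuse_map_map(e):
    # map(f, map(g, xs)) -> map(compose(f, g), xs): one traversal
    if (isinstance(e, Call) and e.fn == "map"
            and isinstance(e.args[1], Call) and e.args[1].fn == "map"):
        f, (g, xs) = e.args[0], e.args[1].args
        return Call("map", (Call("compose", (f, g)), xs))
    return None

def optimize(e):
    # bottom-up rewriting until no rule applies (a fixed point)
    if isinstance(e, Call):
        e = Call(e.fn, tuple(optimize(a) for a in e.args))
    for r in RULES:
        out = r(e)
        if out is not None:
            return optimize(out)
    return e
```

    The point of such a design is that the rules live with the library, not inside the compiler, so library authors can add domain-specific optimizations without touching compiler internals.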

  10. Towards a behavioral-matching based compilation of synthetic biology functions.

    PubMed

    Basso-Blandin, Adrien; Delaplace, Franck

    2015-09-01

    The field of synthetic biology is looking for an engineering framework for safely designing reliable de novo biological functions. In this undertaking, Computer-Aided Design (CAD) environments should play a central role in facilitating the design. Although CAD environments are widely used to engineer artificial systems, their application in synthetic biology is still in its infancy. In this article we address the design of a high-level language that lies at the core of such a CAD environment. More specifically, the Gubs (Genomic Unified Behavioural Specification) language is a specification language used to describe observations of the expected behaviour. The compiler selects components such that the observed behaviour of the synthetic biological function resulting from their assembly complies with the programmed behaviour.
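
    The component-selection step described above can be sketched as behaviour matching; the greedy set-cover below is only an illustrative stand-in for Gubs's actual compilation algorithm, and all names and behaviour strings are hypothetical.

```python
def select_components(spec, library):
    """Pick library components whose observed behaviours jointly cover
    a specified behaviour set. `spec` is a set of required observations;
    `library` maps component name -> set of behaviours it exhibits."""
    chosen, remaining = [], set(spec)
    while remaining:
        # greedily take the component covering the most unmet behaviours
        name, covered = max(library.items(),
                            key=lambda kv: len(kv[1] & remaining))
        if not covered & remaining:
            raise ValueError("spec not realisable from this library")
        chosen.append(name)
        remaining -= covered
    return chosen
```

    A real behavioural-matching compiler must also check that the assembled parts do not interfere with one another, which is what makes the actual problem much harder than set cover.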

  11. A Conceptual Framework for Planning Systemic Human Adaptation to Global Warming.

    PubMed

    Tait, Peter W; Hanna, Elizabeth G

    2015-08-31

    Human activity is having multiple, inter-related effects on ecosystems. Greenhouse gas emissions persisting along current trajectories threaten to significantly alter human society. At 0.85 °C of anthropogenic warming, deleterious human impacts are acutely evident. Additional warming of 0.5 °C-1.0 °C from already emitted CO₂ will further intensify extreme heat and damaging storm events. Failing to sufficiently address this trend will have a heavy human toll directly and indirectly on health. Along with mitigation efforts, societal adaptation to a warmer world is imperative. Adaptation efforts need to be significantly upscaled to prepare society to lessen the public health effects of rising temperatures. Modifying societal behaviour is inherently complex and presents a major policy challenge. We propose a social systems framework for conceptualizing adaptation that maps out three domains within the adaptation policy landscape: acclimatisation, behavioural adaptation and technological adaptation, which operate at societal and personal levels. We propose that overlaying this framework on a systems approach to societal change planning methods will enhance governments' capacity and efficacy in strategic planning for adaptation. This conceptual framework provides a policy oriented planning assessment tool that will help planners match interventions to the behaviours being targeted for change. We provide illustrative examples to demonstrate the framework's application as a planning tool.

  12. Evolution of the ATLAS Nightly Build System

    NASA Astrophysics Data System (ADS)

    Undrus, A.

    2012-12-01

    The ATLAS Nightly Build System is a major component in the ATLAS collaborative software organization, validation, and code approval scheme. Over more than 10 years of development, it has evolved into a factory for automatic release production and grid distribution. The 50 multi-platform branches of ATLAS releases provide vast opportunities for testing new packages, verification of patches to existing software, and migration to new platforms and compilers for ATLAS code that currently contains 2200 packages with 4 million C++ and 1.4 million Python scripting lines written by about 1000 developers. Recent development was focused on the integration of the ATLAS Nightly Build and Installation systems. The nightly releases are distributed and validated, and some are transformed into stable releases used for data processing worldwide. The ATLAS Nightly System is managed by the NICOS control tool on a computing farm with 50 powerful multiprocessor nodes. NICOS provides the fully automated framework for the release builds, testing, and creation of distribution kits. The ATN testing framework of the Nightly System runs unit and integration tests in parallel suites, fully utilizing the resources of multi-core machines, and provides the first results even before compilations complete. The NICOS error detection system is based on several techniques and classifies the compilation and test errors according to their severity. It is periodically tuned to place greater emphasis on certain software defects by highlighting the problems on NICOS web pages and sending automatic e-mail notifications to responsible developers. These and other recent developments will be presented and future plans will be described.
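
    The severity classification performed by such an error detection system can be sketched as pattern matching over build-log lines; the patterns and ranking below are hypothetical and much simpler than NICOS's actual rules.

```python
import re

# Ordered from most to least severe; first match per line wins.
SEVERITY_PATTERNS = [
    ("error",   re.compile(r"\b(error|fatal|undefined reference)\b", re.I)),
    ("warning", re.compile(r"\bwarning\b", re.I)),
]

def classify(log_lines):
    """Return the worst severity seen across a package's build log."""
    rank = {"ok": 0, "warning": 1, "error": 2}
    worst = "ok"
    for line in log_lines:
        for sev, pat in SEVERITY_PATTERNS:
            if pat.search(line):
                if rank[sev] > rank[worst]:
                    worst = sev
                break  # a line contributes only its most severe match
    return worst
```

    Keeping only the worst severity per package is what lets a dashboard highlight broken packages and route e-mail to the responsible developers.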

  13. Driver gene classification reveals a substantial overrepresentation of tumor suppressors among very large chromatin-regulating proteins.

    PubMed

    Waks, Zeev; Weissbrod, Omer; Carmeli, Boaz; Norel, Raquel; Utro, Filippo; Goldschmidt, Yaara

    2016-12-23

    Compiling a comprehensive list of cancer driver genes is imperative for oncology diagnostics and drug development. While driver genes are typically discovered by analysis of tumor genomes, infrequently mutated driver genes often evade detection due to limited sample sizes. Here, we address sample size limitations by integrating tumor genomics data with a wide spectrum of gene-specific properties to search for rare drivers, functionally classify them, and detect features characteristic of driver genes. We show that our approach, CAnceR geNe similarity-based Annotator and Finder (CARNAF), enables detection of potentially novel drivers that eluded over a dozen pan-cancer/multi-tumor type studies. In particular, feature analysis reveals a highly concentrated pool of known and putative tumor suppressors among the <1% of genes that encode very large, chromatin-regulating proteins. Thus, our study highlights the need for deeper characterization of very large, epigenetic regulators in the context of cancer causality.

  14. Advancing Value Assessment in the United States: A Multistakeholder Perspective.

    PubMed

    Sorenson, Corinna; Lavezzari, Gabriela; Daniel, Gregory; Burkholder, Randy; Boutin, Marc; Pezalla, Edmund; Sanders, Gillian; McClellan, Mark

    2017-02-01

    Rising costs without perceived proportional improvements in quality and outcomes have motivated fundamental shifts in health care delivery and payment to achieve better value. Aligned with these efforts, several value assessment frameworks have been introduced recently to help providers, patients, and payers better understand the potential value of drugs and other interventions and make informed decisions about their use. Given their early stage of development, it is imperative to evaluate these efforts on an ongoing basis to identify how best to support and improve them moving forward. This article provides a multistakeholder perspective on the key limitations and opportunities posed by the current value assessment frameworks and areas of and actions for improvement. In particular, we outline 10 fundamental guiding principles and associated strategies that should be considered in subsequent iterations of the existing frameworks or by emerging initiatives in the future. Although value assessment frameworks may not be able to meet all the needs and preferences of stakeholders, we contend that there are common elements and potential next steps that can be supported to advance value assessment in the United States. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  15. Knowledge for Healthcare: the future of health librarianship.

    PubMed

    Bryant, Sue Lacey; Stewart, David; Goswami, Louise

    2015-09-01

    Many people are still not receiving the right care. It is imperative for health care librarians to come together around a common vision to achieve Knowledge for Healthcare so that the right knowledge and evidence is used at the right time in the right place. The authors describe five workstreams within a modernisation programme: Service Transformation, Workforce Planning and Development, Quality and Impact, Resource Discovery and Optimising Investment. Communications, engagement and partnership working are central to success. The development framework sets out principles on which to base decisions, and design criteria for transforming services. © 2015 Health Libraries Group.

  16. Promoting research on research integrity in Canada.

    PubMed

    Master, Zubin; McDonald, Michael; Williams-Jones, Bryn

    2012-01-01

    Research on research integrity is an important element in building a strong national research integrity framework. There is a lack of empirical evidence and conceptual research on research integrity in Canada. To further strengthen and develop our system of research integrity, we believe that greater support is needed to promote research on research integrity. Research on research integrity is imperative in order to gain a richer understanding of the diversity of responsible conduct of research norms, practices, education and policies from a Canadian perspective. The knowledge gained would help in the development of an evidenced-based and responsive Canadian system of research integrity.

  17. The Interactive Approach in the Teaching of Mathematical Methods in Physics

    NASA Astrophysics Data System (ADS)

    Vassileva, Radost I.

    2007-04-01

    Traditional pedagogical practice is mainly directed towards the implementation of obligatory syllabuses, transfer of knowledge, formation of skills and habits in students. It is authoritative and imperative in its essence. Modern educational tendencies impose the promotion of a pedagogical process which is oriented towards the individual. The young person should enjoy a new atmosphere - creative, interesting, meaningful, and it should be based on self-cognition and the life-long emotional and intellectual development of the individual. The article discusses certain opportunities for the realization of interactive pedagogical communication within the framework of a specific university subject studied by physics majors.

  18. Complexity leadership: a healthcare imperative.

    PubMed

    Weberg, Dan

    2012-01-01

    The healthcare system is plagued with increasing cost and poor quality outcomes. A major contributing factor for these issues is that outdated leadership practices, such as leader-centricity, linear thinking, and poor readiness for innovation, are being used in healthcare organizations. Complexity leadership theory provides a new framework with which healthcare leaders may practice leadership. Complexity leadership theory conceptualizes leadership as a continual process that stems from collaboration, complex systems thinking, and innovation mindsets. Compared to transactional and transformational leadership concepts, complexity leadership practices hold promise to improve cost and quality in health care. © 2012 Wiley Periodicals, Inc.

  19. A multicriteria framework for producing local, regional, and national insect and disease risk maps

    Treesearch

    Frank J. Jr. Krist; Frank J. Sapio

    2010-01-01

    The construction of the 2006 National Insect and Disease Risk Map, compiled by the USDA Forest Service, State and Private Forestry Area, Forest Health Protection Unit, resulted in the development of a GIS-based, multicriteria approach for insect and disease risk mapping that can account for regional variations in forest health concerns and threats. This risk mapping...

  20. Compiler-based code generation and autotuning for geometric multigrid on GPU-accelerated supercomputers

    DOE PAGES

    Basu, Protonu; Williams, Samuel; Van Straalen, Brian; ...

    2017-04-05

    GPUs, with their high bandwidths and computational capabilities, are an increasingly popular target for scientific computing. Unfortunately, to date, harnessing the power of the GPU has required use of a GPU-specific programming model like CUDA, OpenCL, or OpenACC. Thus, in order to deliver portability across CPU-based and GPU-accelerated supercomputers, programmers are forced to write and maintain two versions of their applications or frameworks. In this paper, we explore the use of a compiler-based autotuning framework based on CUDA-CHiLL to deliver not only portability, but also performance portability across CPU- and GPU-accelerated platforms for the geometric multigrid linear solvers found in many scientific applications. We also show that with autotuning we can attain near-Roofline (a performance bound for a computation and target architecture) performance across the key operations in the miniGMG benchmark for both CPU- and GPU-based architectures, as well as for multiple stencil discretizations and smoothers. We show that our technology is readily interoperable with MPI, resulting in performance at scale equal to that obtained via a hand-optimized MPI+CUDA implementation.
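
    The Roofline bound referred to here is simple to compute: attainable performance is the lesser of peak compute throughput and memory bandwidth times arithmetic intensity. A minimal sketch, with illustrative (not machine-specific) numbers:

```python
def roofline_gflops(peak_gflops, bandwidth_gbs, arithmetic_intensity):
    """Attainable performance under the Roofline model.
    arithmetic_intensity is in flops per byte moved from memory;
    below the ridge point the kernel is bandwidth-bound, above it
    compute-bound."""
    return min(peak_gflops, bandwidth_gbs * arithmetic_intensity)
```

    Low-intensity stencil kernels like those in miniGMG typically sit on the bandwidth-limited slope of the roofline, which is why attaining near-Roofline performance there is a statement about memory-system efficiency rather than raw flops.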

  2. A Compilation of Provisional Karst Geospatial Data for the Interior Low Plateaus Physiographic Region, Central United States

    USGS Publications Warehouse

    Taylor, Charles J.; Nelson, Hugh L.

    2008-01-01

    Geospatial data needed to visualize and evaluate the hydrogeologic framework and distribution of karst features in the Interior Low Plateaus physiographic region of the central United States were compiled during 2004-2007 as part of the Ground-Water Resources Program Karst Hydrology Initiative (KHI) project. Because of the potential usefulness to environmental and water-resources regulators, private consultants, academic researchers, and others, the geospatial data files created during the KHI project are being made available to the public as a provisional regional karst dataset. To enhance accessibility and visualization, the geospatial data files have been compiled as ESRI ArcReader data folders and user-interactive Published Map Files (.pmf files), all of which are catalogued by the boundaries of surface watersheds using U.S. Geological Survey (USGS) eight-digit hydrologic unit codes (HUC-8s). Specific karst features included in the dataset include mapped sinkhole locations, sinking (or disappearing) streams, internally drained catchments, karst springs inventoried in the USGS National Water Information System (NWIS) database, relic stream valleys, and karst flow paths obtained from results of previously reported water-tracer tests.

  3. Social inclusion/exclusion as matters of social (in)justice: a call for nursing action.

    PubMed

    Yanicki, Sharon M; Kushner, Kaysi E; Reutter, Linda

    2015-06-01

    Social inclusion/exclusion involves just/unjust social relations and social structures enabling or constraining opportunities for participation and health. In this paper, social inclusion/exclusion is explored as a dialectic. Three discourses--discourses on recognition, capabilities, and equality and citizenship--are identified within Canadian literature. Each discourse highlights a different view of the injustices leading to social exclusion and the conditions supporting inclusion and social justice. An Integrated Framework for Social Justice that incorporates the three discourses is developed and used to critique the dominant focus on distributive justice within foundational Canadian nursing documents. We propose a broader conceptualization of social (in)justice that includes both relational and structural dimensions. Opportunities for multilevel interventions to promote social justice are identified. This framework is congruent with nursing's moral imperative to promote health equity and with the multiple roles played by nurses to promote social justice in everyday practice. © 2014 John Wiley & Sons Ltd.

  4. Commentary: Prevention of violence against children: a framework for progress in low- and middle-income countries.

    PubMed

    Chandran, Aruna; Puvanachandra, Prasanthi; Hyder, Adnan A

    2011-02-01

    Violence against children has been the least reported, studied, and understood area of child injuries. Initial awareness emerged from international conferences and resolutions, followed by national policies and statements. More effective responses around the world will require action. Although previous calls for action have pointed to important activities (gathering of baseline data, passing of legal reforms, and providing services to those who experience violence), the agenda is limited. Data collection needs to be continuous, systematic, and sustainable, and should enable ongoing evaluation of intervention programs. An inter-sectoral approach to violence against children incorporating public health, criminal justice, social services, education, non-governmental organizations, media, and businesses is imperative if the growing burden is to be mitigated. Thus we offer a framework, building on earlier recommendations, to focus on four domains: national surveillance, intervention research, legislation and policy, and partnerships and collaboration.

  5. Mainstreaming implementation science into immunization systems in the decade of vaccines: a programmatic imperative for the African Region.

    PubMed

    Adamu, Abdu A; Adamu, Aishatu L; Dahiru, Abdulkarim I; Uthman, Olalekan A; Wiysonge, Charles S

    2018-05-17

    Several innovations that can improve immunization systems already exist. Some interventions target service consumers within communities to raise awareness, build trust, improve understanding, remind caregivers, reward service users, and improve communication. Other interventions target health facilities to improve access and quality of vaccination services, among others. Despite available empirical evidence, there is a delay in translating innovations into routine practice by immunization programmes. Drawing on an existing implementation science framework, we propose an interactive and multi-perspective model to improve uptake and utilization of available immunization-related innovations in the African region. It is important to stress that our framework is by no means prescriptive. The key intention is to advocate for the entire immunization system to be viewed as an interconnected system of stakeholders, so as to foster better interaction and proactive transfer of evidence-based innovation into policy and practice.

  6. Early stage structural development of prototypical zeolitic imidazolate framework (ZIF) in solution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Terban, Maxwell W.; Banerjee, Debasis; Ghose, Sanjit

    Given the wide-ranging potential applications of metal organic frameworks (MOFs), an emerging imperative is to understand their formation with atomic-scale precision. This will aid in designing syntheses for next-generation MOFs with enhanced properties and functionalities. Major challenges are to characterize the early-stage seeds and the pathways to framework growth, which require synthesis coupled with in situ structural characterization sensitive to nanoscale structures in solution. Here we report measurements of an in situ synthesis of a prototypical MOF, ZIF-8, utilizing synchrotron X-ray atomic pair distribution function (PDF) analysis optimized for sensitivity to dilute species, complemented by mass spectrometry, electron microscopy, and density functional theory calculations. We observe that despite rapid formation of the crystalline product, a high concentration of Zn(2-MeIm)4 (2-MeIm = 2-methylimidazolate) initially forms and persists as stable clusters over long times. A secondary, amorphous phase also pervades during the synthesis, which has a structural similarity to the final ZIF-8 and may act as an intermediate to the final product.

  7. Measuring treatment effects on dual-task performance: a framework for research and clinical practice

    PubMed Central

    Plummer, Prudence; Eskes, Gail

    2015-01-01

    The relevance of dual-task walking to everyday ambulation is widely acknowledged, and numerous studies have demonstrated that dual-task interference can significantly impact recovery of functional walking in people with neurological disorders. The magnitude and direction of dual-task interference is influenced by the interaction between the two tasks, including how individuals spontaneously prioritize their attention. Therefore, to accurately interpret and characterize dual-task interference and identify changes over time, it is imperative to evaluate single and dual-task performance in both tasks, as well as the tasks relative to each other. Yet, reciprocal dual-task effects (DTE) are frequently ignored. The purpose of this perspective paper is to present a framework for measuring treatment effects on dual-task interference, specifically taking into account the interactions between the two tasks and how this can provide information on whether overall dual-task capacity has improved or a different attentional strategy has been adopted. In discussing the clinical implications of using this framework, we provide specific examples of using this method and provide some explicit recommendations for research and clinical practice. PMID:25972801
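
The single- versus dual-task comparison the authors call for is commonly summarized as a percent dual-task effect (DTE) for each task. A minimal sketch of that calculation; the sign convention and the sample numbers are illustrative, not taken from the paper itself:

```python
def dual_task_effect(single, dual, higher_is_better=True):
    """Percent change from single-task to dual-task performance.

    Follows the common convention DTE% = (dual - single) / single * 100,
    with the sign flipped for measures where lower scores are better
    (e.g., reaction time), so that negative values always indicate a
    dual-task cost and positive values a dual-task benefit.
    """
    dte = (dual - single) / single * 100.0
    return dte if higher_is_better else -dte

# Illustrative example: gait speed (m/s, higher is better) drops under
# dual-task load while cognitive accuracy is maintained, suggesting the
# participant prioritized the cognitive task.
motor_dte = dual_task_effect(single=1.20, dual=1.02)
cognitive_dte = dual_task_effect(single=0.95, dual=0.95)
print(round(motor_dte, 1), round(cognitive_dte, 1))  # → -15.0 0.0
```

Computing both DTEs, rather than only the motor one, is what exposes the reciprocal effects and attentional trade-offs the authors argue are frequently ignored.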

  8. Early stage structural development of prototypical zeolitic imidazolate framework (ZIF) in solution

    DOE PAGES

    Terban, Maxwell W.; Banerjee, Debasis; Ghose, Sanjit; ...

    2018-02-05

    Given the wide-ranging potential applications of metal organic frameworks (MOFs), an emerging imperative is to understand their formation with atomic-scale precision. This will aid in designing syntheses for next-generation MOFs with enhanced properties and functionalities. Major challenges are to characterize the early-stage seeds and the pathways to framework growth, which require synthesis coupled with in situ structural characterization sensitive to nanoscale structures in solution. Here we report measurements of an in situ synthesis of a prototypical MOF, ZIF-8, utilizing synchrotron X-ray atomic pair distribution function (PDF) analysis optimized for sensitivity to dilute species, complemented by mass spectrometry, electron microscopy, and density functional theory calculations. We observe that despite rapid formation of the crystalline product, a high concentration of Zn(2-MeIm)4 (2-MeIm = 2-methylimidazolate) initially forms and persists as stable clusters over long times. A secondary, amorphous phase also pervades during the synthesis, which has a structural similarity to the final ZIF-8 and may act as an intermediate to the final product.

  9. Run-time parallelization and scheduling of loops

    NASA Technical Reports Server (NTRS)

    Saltz, Joel H.; Mirchandaney, Ravi; Crowley, Kay

    1991-01-01

    Run-time methods are studied to automatically parallelize and schedule iterations of a do loop in certain cases where compile-time information is inadequate. The methods presented involve execution time preprocessing of the loop. At compile-time, these methods set up the framework for performing a loop dependency analysis. At run-time, wavefronts of concurrently executable loop iterations are identified. Using this wavefront information, loop iterations are reordered for increased parallelism. Symbolic transformation rules are used to produce: inspector procedures that perform execution time preprocessing, and executors or transformed versions of source code loop structures. These transformed loop structures carry out the calculations planned in the inspector procedures. Performance results are presented from experiments conducted on the Encore Multimax. These results illustrate that run-time reordering of loop indexes can have a significant impact on performance.
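
The inspector/executor scheme described above can be sketched as follows. This is a simplified illustration: the dependence information (`deps`) is assumed to be already known, whereas the real system derives it at run time by preprocessing the loop's index arrays:

```python
def inspector(deps):
    """Assign each loop iteration to a wavefront.

    deps[i] lists the iterations that iteration i depends on. All
    iterations in one wavefront are mutually independent, so they may
    be executed concurrently.
    """
    level = {}

    def wavefront_of(i):
        if i not in level:
            level[i] = max((wavefront_of(j) + 1 for j in deps[i]), default=0)
        return level[i]

    for i in deps:
        wavefront_of(i)
    # Group iterations by wavefront number, ordered by level.
    waves = {}
    for i, w in level.items():
        waves.setdefault(w, []).append(i)
    return [sorted(waves[w]) for w in sorted(waves)]

def executor(wavefronts, body):
    """Run the loop body wavefront by wavefront; within a wavefront a
    real executor would dispatch iterations to parallel workers."""
    for wave in wavefronts:
        for i in wave:
            body(i)

# A loop in which iteration i reads a value written by iteration i - 2:
deps = {0: [], 1: [], 2: [0], 3: [1], 4: [2]}
print(inspector(deps))  # → [[0, 1], [2, 3], [4]]
```

The inspector's cost is amortized when the same dependence pattern is reused across many executions of the loop, which is why run-time reordering can pay off despite the preprocessing overhead.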

  10. The research of .NET framework based on delegate of the LCE

    NASA Astrophysics Data System (ADS)

    Chen, Yi-peng

    2011-10-01

    Programmers can use the Loosely Coupled Events (LCE) enterprise services provided by the .NET framework when developing object-oriented, component-based applications in C#. Much of the boilerplate code that had to be written by hand in traditional program design can now be replaced by simple declarative attributes attached to classes, interfaces, methods, and assemblies. This paper explains the mechanism for implementing LCE event services with the delegate model in C#. It also introduces the procedure for applying the event class, event publisher, subscriber, and client in LCE technology, and analyzes the key technical points of delegate-based LCE in plain language with practical examples.
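
The delegate/event publisher-subscriber flow the paper discusses can be approximated in Python; the class and method names here are illustrative, not part of the .NET LCE API:

```python
class Event:
    """Minimal analogue of a C# multicast delegate/event: subscribers
    register callables, and raising the event invokes each in turn."""

    def __init__(self):
        self._handlers = []

    def subscribe(self, handler):
        self._handlers.append(handler)

    def unsubscribe(self, handler):
        self._handlers.remove(handler)

    def fire(self, *args, **kwargs):
        # Copy the list so handlers may unsubscribe during dispatch.
        for handler in list(self._handlers):
            handler(*args, **kwargs)

# Publisher side: raises the event without knowing its subscribers.
class StockTicker:
    def __init__(self):
        self.price_changed = Event()

    def update(self, price):
        self.price_changed.fire(price)

# Subscriber side: registers a handler, then receives notifications.
seen = []
ticker = StockTicker()
ticker.price_changed.subscribe(seen.append)
ticker.update(101.5)
ticker.update(99.0)
print(seen)  # → [101.5, 99.0]
```

The loose coupling is the point: the publisher holds only the event object, so subscribers can be added or removed without changing publisher code, mirroring the declarative style described in the abstract.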

  11. A Characteristics-Based Approach to Radioactive Waste Classification in Advanced Nuclear Fuel Cycles

    NASA Astrophysics Data System (ADS)

    Djokic, Denia

    The radioactive waste classification system currently used in the United States primarily relies on a source-based framework. This has led to numerous issues, such as wastes that are not categorized by their intrinsic risk, or wastes that do not fall under a category within the framework and therefore are without a legal imperative for responsible management. Furthermore, in the possible case that advanced fuel cycles were to be deployed in the United States, the shortcomings of the source-based classification system would be exacerbated: advanced fuel cycles implement processes such as the separation of used nuclear fuel, which introduce new waste streams of varying characteristics. To be able to manage and dispose of these potential new wastes properly, development of a classification system that would assign an appropriate level of management to each type of waste based on its physical properties is imperative. This dissertation explores how characteristics from wastes generated from potential future nuclear fuel cycles could be coupled with a characteristics-based classification framework. A static mass flow model developed under the Department of Energy's Fuel Cycle Research & Development program, called the Fuel-cycle Integration and Tradeoffs (FIT) model, was used to calculate the composition of waste streams resulting from different nuclear fuel cycle choices: two modified open fuel cycle cases (recycle in MOX reactor) and two different continuous-recycle fast reactor recycle cases (oxide and metal fuel fast reactors). This analysis focuses on the impact of waste heat load on waste classification practices, although future work could involve coupling waste heat load with metrics of radiotoxicity and longevity. The value of separation of heat-generating fission products and actinides in different fuel cycles and how it could inform long- and short-term disposal management is discussed. 
It is shown that the benefits of reducing the short-term fission-product heat load of waste destined for geologic disposal are neglected under the current source-based radioactive waste classification system, and that it is useful to classify waste streams based on how favorable the impact of interim storage is on increasing repository capacity. The need for a more diverse set of waste classes is discussed, and it is shown that the characteristics-based IAEA classification guidelines could accommodate wastes created from advanced fuel cycles more comprehensively than the U.S. classification framework.

  12. Tectonic and neotectonic framework of the Yucca Mountain Region

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schweickert, R.A.

    1992-09-30

    Highlights of major research accomplishments concerned with the tectonics and neotectonics of the Yucca Mountain Region include: structural studies in Grapevine Mountains, Bullfrog Hills, and Bare Mountain; recognition of significance of pre-Middle Miocene normal and strike-slip faulting at Bare Mountain; compilation of map of quaternary faulting in Southern Amargosa Valley; and preliminary paleomagnetic analysis of Paleozoic and Cenozoic units at Bare Mountain.

  13. Cultural group selection is plausible, but the predictions of its hypotheses should be tested with real-world data.

    PubMed

    Turchin, Peter; Currie, Thomas E

    2016-01-01

    The evidence compiled in the target article demonstrates that the assumptions of cultural group selection (CGS) theory are often met, and it is therefore a useful framework for generating plausible hypotheses. However, more can be said about how we can test the predictions of CGS hypotheses against competing explanations using historical, archaeological, and anthropological data.

  14. Procedures Manual: A Guide to Uniform Grant and Contract Management Standards and The Common Rule for Uniform Administrative Requirements for Grants and Cooperative Agreements to State and Local Governments.

    ERIC Educational Resources Information Center

    Conable, Sharon R.

    This manual has been compiled to provide consistent grant application and administrative procedures for state agencies which award grants or contracts to local governments. It provides a conceptual framework of information concerning the reporting, financial, contractual, and auditing requirements for recipients of Texas State Library grants…

  15. Pipeline Optimization Program (PLOP)

    DTIC Science & Technology

    2006-08-01

    the framework of the Dredging Operations Decision Support System (DODSS, https://dodss.wes.army.mil/wiki/0). PLOP compiles industry standards and...efficiency point (BEP). In the interest of acceptable wear rate on the pump, industrial standards dictate that the flow [Figure 2: Pump class as a function of...] percentage of the flow rate corresponding to the BEP. Pump Acceptability Rules. The facts for pump performance, industrial standards and pipeline and

  16. Man and Environment for the Intermediate Grades; A Curriculum Guide for Environmental Studies for Grades 4-8.

    ERIC Educational Resources Information Center

    National Association for Environmental Education, Miami, FL.

    This curriculum guide consists of environmental studies modules for grades 4-8. The curriculum, which is organized around major concepts, is intended to serve as a guide for program development and as a framework for compiling and sharing ideas on methods and application on a national basis. Each module may be utilized as an integral part of the…

  17. Production and Beyond: A Defining Moment for Public Sector Extension

    ERIC Educational Resources Information Center

    Rivera, William M.

    2009-01-01

    Two imperatives form the basis of the present paper. The first is the market-driven imperative, vital to production and value-chain development. The second is the knowledge imperative, central to the advancement of human capacity and institutional development. In view of these two imperatives, this paper argues for overhaul in extension toward a…

  18. A Framework for Developing the Structure of Public Health Economic Models.

    PubMed

    Squires, Hazel; Chilcott, James; Akehurst, Ronald; Burr, Jennifer; Kelly, Michael P

    2016-01-01

    A conceptual modeling framework is a methodology that assists modelers through the process of developing a model structure. Public health interventions tend to operate in dynamically complex systems. Modeling public health interventions requires broader considerations than clinical ones. Inappropriately simple models may lead to poor validity and credibility, resulting in suboptimal allocation of resources. This article presents the first conceptual modeling framework for public health economic evaluation. The framework presented here was informed by literature reviews of the key challenges in public health economic modeling and existing conceptual modeling frameworks; qualitative research to understand the experiences of modelers when developing public health economic models; and piloting a draft version of the framework. The conceptual modeling framework comprises four key principles of good practice and a proposed methodology. The key principles are that 1) a systems approach to modeling should be taken; 2) a documented understanding of the problem is imperative before and alongside developing and justifying the model structure; 3) strong communication with stakeholders and members of the team throughout model development is essential; and 4) a systematic consideration of the determinants of health is central to identifying the key impacts of public health interventions. The methodology consists of four phases: phase A, aligning the framework with the decision-making process; phase B, identifying relevant stakeholders; phase C, understanding the problem; and phase D, developing and justifying the model structure. Key areas for further research involve evaluation of the framework in diverse case studies and the development of methods for modeling individual and social behavior. This approach could improve the quality of public health economic models, supporting efficient allocation of scarce resources. 
Copyright © 2016 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  19. Increasing the Translation of Evidence Into Practice, Policy, and Public Health Improvements: A Framework for Training Health Professionals in Implementation and Dissemination Science

    PubMed Central

    Gonzales, Ralph; Handley, Margaret A.; Ackerman, Sara; O’Sullivan, Patricia S.

    2012-01-01

    The authors describe a conceptual framework for implementation and dissemination science (IDS) and propose competencies for IDS training. Their framework is designed to facilitate the application of theories and methods from the distinct domains of clinical disciplines (e.g., medicine, public health), population sciences (e.g., biostatistics, epidemiology) and translational disciplines (e.g., social and behavioral sciences, business administration, education). They explore three principles that guided the development of their conceptual framework: Behavior change among organizations and/or individuals (providers, patients) is inherent in the translation process; engagement of stakeholder organizations, health care delivery systems, and individuals is imperative to achieve effective translation and sustained improvements; and IDS research is iterative, benefiting from cycles and collaborative, bidirectional relationships. The authors propose seven domains for IDS training--team science, context identification, literature identification and assessment, community engagement, intervention design and research implementation, evaluation of effect of translational activity, behavioral change communication strategies--and define twelve IDS training competencies within these domains. As a model, they describe specific courses introduced at the University of California, San Francisco, which they designed to develop these competencies. The authors encourage other training programs and institutions to use (or adapt) the design principles, conceptual framework, and proposed competencies to evaluate their current IDS training needs and to support new program development. PMID:22373617

  20. Mothers of Obese Children Use More Direct Imperatives to Restrict Eating.

    PubMed

    Pesch, Megan H; Miller, Alison L; Appugliese, Danielle P; Rosenblum, Katherine L; Lumeng, Julie C

    2018-04-01

    To examine the association of mother and child characteristics with use of direct imperatives to restrict eating. A total of 237 mother-child dyads (mean child age, 70.9 months) participated in a video-recorded, laboratory-standardized eating protocol with 2 large portions of cupcakes. Videos were reliably coded for counts of maternal direct imperatives to restrict children's eating. Anthropometrics were measured. Regression models tested the association of participant characteristics with counts of direct imperatives. Child obese weight status and maternal white non-Hispanic race/ethnicity were associated with greater levels of direct imperatives to restrict eating (p = .0001 and .0004, respectively). Mothers of obese children may be using more direct imperatives to restrict eating so as to achieve behavioral compliance to decrease their child's food intake. Future work should consider the effects direct imperatives have on children's short- and long-term eating behaviors and weight gain trajectories. Copyright © 2017 Society for Nutrition Education and Behavior. Published by Elsevier Inc. All rights reserved.

  1. Ecoregions of Arizona (poster)

    USGS Publications Warehouse

    Griffith, Glenn E.; Omernik, James M.; Johnson, Colleen Burch; Turner, Dale S.

    2014-01-01

    Ecoregions denote areas of general similarity in ecosystems and in the type, quality, and quantity of environmental resources; they are designed to serve as a spatial framework for the research, assessment, management, and monitoring of ecosystems and ecosystem components. By recognizing the spatial differences in the capacities and potentials of ecosystems, ecoregions stratify the environment by its probable response to disturbance. These general purpose regions are critical for structuring and implementing ecosystem management strategies across federal agencies, state agencies, and nongovernment organizations that are responsible for different types of resources within the same geographical areas. The Arizona ecoregion map was compiled at a scale of 1:250,000. It revises and subdivides an earlier national ecoregion map that was originally compiled at a smaller scale. The approach used to compile this map is based on the premise that ecological regions can be identified through the analysis of the spatial patterns and the composition of biotic and abiotic phenomena that affect or reflect differences in ecosystem quality and integrity. These phenomena include geology, physiography, vegetation, climate, soils, land use, wildlife, and hydrology. The relative importance of each characteristic varies from one ecological region to another regardless of the hierarchical level. A Roman numeral hierarchical scheme has been adopted for different levels of ecological regions. Level I is the coarsest level, dividing North America into 15 ecological regions. Level II divides the continent into 50 regions. At level III, the continental United States contains 105 ecoregions and the conterminous United States has 85 ecoregions. Level IV is a further subdivision of level III ecoregions. 
Arizona contains arid deserts and canyonlands, semiarid shrub- and grass-covered plains, woodland- and shrubland-covered hills, lava fields and volcanic plateaus, forested mountains, glaciated peaks, and river alluvial floodplains. Ecological diversity is remarkably high. There are 7 level III ecoregions and 52 level IV ecoregions in Arizona and many continue into ecologically similar parts of adjacent states. This poster is part of a collaborative project primarily between the U.S. Geological Survey (USGS), USEPA National Health and Environmental Effects Research Laboratory (Corvallis, Oregon), USEPA Region IX, U.S. Department of Agriculture (USDA)–Natural Resources Conservation Service (NRCS), The Nature Conservancy, and several Arizona state agencies. The project is associated with an interagency effort to develop a common national framework of ecological regions. Reaching that objective requires recognition of the differences in the conceptual approaches and mapping methodologies applied to develop the most common ecoregion-type frameworks, including those developed by the USDA–Forest Service, the USEPA, and the NRCS. As each of these frameworks is further refined, their differences are becoming less discernible. Collaborative ecoregion projects, such as this one in Arizona, are a step toward attaining consensus and consistency in ecoregion frameworks for the entire nation.

  2. A Literature Review and Compilation of Nuclear Waste Management System Attributes for Use in Multi-Objective System Evaluations.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kalinina, Elena Arkadievna; Samsa, Michael

    The purpose of this work was to compile a comprehensive initial set of potential nuclear waste management system attributes. This initial set of attributes is intended to serve as a starting point for additional consideration by system analysts and planners to facilitate the development of a waste management system multi-objective evaluation framework based on the principles and methodology of multi-attribute utility analysis. The compilation is primarily based on a review of reports issued by the Canadian Nuclear Waste Management Organization (NWMO) and the Blue Ribbon Commission on America's Nuclear Future (BRC), but also an extensive review of the available literature for similar and past efforts as well. Numerous system attributes found in different sources were combined into a single objectives-oriented hierarchical structure. This study provides a discussion of the data sources and the descriptions of the hierarchical structure. A particular focus of this study was on collecting and compiling inputs from past studies that involved the participation of various external stakeholders. However, while the important role of stakeholder input in a country's waste management decision process is recognized in the referenced sources, there are only a limited number of in-depth studies of the stakeholders' differing perspectives. Compiling a comprehensive hierarchical listing of attributes is a complex task since stakeholders have multiple and often conflicting interests. The BRC worked for two years (January 2010 to January 2012) to "ensure it has heard from as many points of view as possible." The Canadian NWMO study took four years and ample resources, involving national and regional stakeholders' dialogs, internet-based dialogs, information and discussion sessions, open houses, workshops, round tables, public attitude research, website, and topic reports. 
The current compilation effort benefited from the distillation of these many varied inputs conducted by the previous studies.

  3. A Conceptual Framework for Planning Systemic Human Adaptation to Global Warming

    PubMed Central

    Tait, Peter W.; Hanna, Elizabeth G.

    2015-01-01

    Human activity is having multiple, inter-related effects on ecosystems. Greenhouse gas emissions persisting along current trajectories threaten to significantly alter human society. At 0.85 °C of anthropogenic warming, deleterious human impacts are acutely evident. Additional warming of 0.5 °C–1.0 °C from already emitted CO2 will further intensify extreme heat and damaging storm events. Failing to sufficiently address this trend will have a heavy human toll directly and indirectly on health. Along with mitigation efforts, societal adaptation to a warmer world is imperative. Adaptation efforts need to be significantly upscaled to prepare society to lessen the public health effects of rising temperatures. Modifying societal behaviour is inherently complex and presents a major policy challenge. We propose a social systems framework for conceptualizing adaptation that maps out three domains within the adaptation policy landscape: acclimatisation, behavioural adaptation and technological adaptation, which operate at societal and personal levels. We propose that overlaying this framework on a systems approach to societal change planning methods will enhance governments’ capacity and efficacy in strategic planning for adaptation. This conceptual framework provides a policy oriented planning assessment tool that will help planners match interventions to the behaviours being targeted for change. We provide illustrative examples to demonstrate the framework’s application as a planning tool. PMID:26334285

  4. Conceptualizing a Human Right to Prevention in Global HIV/AIDS Policy

    PubMed Central

    Meier, Benjamin Mason; Brugh, Kristen Nichole; Halima, Yasmin

    2012-01-01

    Given current constraints on universal treatment campaigns, recent advances in public health prevention initiatives have revitalized efforts to stem the tide of HIV transmission. Yet, despite a growing imperative for prevention—supported by the promise of behavioral, structural and biomedical approaches to lower the incidence of HIV—human rights frameworks remain limited in addressing collective prevention policy through global health governance. Assessing the evolution of rights-based approaches to global HIV/AIDS policy, this review finds that human rights have shifted from collective public health to individual treatment access. While the advent of the HIV/AIDS pandemic gave meaning to rights in framing global health policy, the application of rights in treatment access litigation came at the expense of public health prevention efforts. Where the human rights framework remains limited to individual rights enforced against a state duty bearer, such rights have faced constrained application in framing population-level policy to realize the public good of HIV prevention. Concluding that human rights frameworks must be developed to reflect the complementarity of individual treatment and collective prevention, this article conceptualizes collective rights to public health, structuring collective combination prevention to alleviate limitations on individual rights frameworks and frame rights-based global HIV/AIDS policy to assure research expansion, prevention access and health system integration. PMID:23226723

  5. Rapid development of entity-based data models for bioinformatics with persistence object-oriented design and structured interfaces.

    PubMed

    Ezra Tsur, Elishai

    2017-01-01

    Databases are imperative for research in bioinformatics and computational biology. Current challenges in database design include data heterogeneity and context-dependent interconnections between data entities. These challenges drove the development of unified data interfaces and specialized databases. The curation of specialized databases is an ever-growing challenge due to the introduction of new data sources and the emergence of new relational connections between established datasets. Here, an open-source framework for the curation of specialized databases is proposed. The framework supports user-designed models of data encapsulation, object persistency and structured interfaces to local and external data sources such as MalaCards, Biomodels and the National Center for Biotechnology Information (NCBI) databases. The proposed framework was implemented using Java as the development environment, EclipseLink as the data persistency agent and Apache Derby as the database manager. Syntactic analysis was based on J3D, jsoup, Apache Commons and w3c.dom open libraries. Finally, the construction of a specialized database for aneurysm-associated vascular diseases is demonstrated. This database contains 3-dimensional geometries of aneurysms, patients' clinical information, articles, biological models, related diseases and our recently published model of aneurysms' risk of rupture. The framework is available at: http://nbel-lab.com.

  6. The effects of the Chesapeake Bay impact crater on the geologic framework and the correlation of hydrogeologic units of southeastern Virginia, south of the James River

    USGS Publications Warehouse

    Powars, David S.

    2000-01-01

    About 35 million years ago, a large comet or meteor slammed into the shallow shelf on the western margin of the Atlantic Ocean, creating the Chesapeake Bay impact crater. This report, the second in a series, refines the geologic framework of southeastern Virginia, south of the James River in and near the impact crater, and presents evidence for the existence of a pre-impact James River structural zone. The report includes detailed correlations of core lithologies with borehole geophysical logs; the correlations provide the foundation for the compilation of stratigraphic cross sections. These cross sections are tied into the geologic framework of the lower York-James Peninsula as presented in the first report in the series, Professional Paper 1612

  7. Estimating Regional and National-Scale Greenhouse Gas Emissions in the Agriculture, Forestry, and Other Land Use (AFOLU) Sector using the `Agricultural and Land Use (ALU) Tool'

    NASA Astrophysics Data System (ADS)

    Spencer, S.; Ogle, S. M.; Wirth, T. C.; Sivakami, G.

    2016-12-01

    The Intergovernmental Panel on Climate Change (IPCC) provides methods and guidance for estimating anthropogenic greenhouse gas emissions for reporting to the United Nations Framework Convention on Climate Change. The methods are comprehensive and require extensive data compilation, management, aggregation, documentation and calculations of source and sink categories to achieve robust emissions estimates. IPCC Guidelines describe three estimation tiers that require increasing levels of country-specific data and method complexity. Use of higher tiers should improve overall accuracy and reduce uncertainty in estimates. The AFOLU sector represents a complex set of methods for estimating greenhouse gas emissions and carbon sinks. Major AFOLU emissions and sinks include carbon dioxide (CO2) from carbon stock change in biomass, dead organic matter and soils, urea or lime application to soils, and oxidation of carbon in drained organic soils; nitrous oxide (N2O) and methane (CH4) emissions from livestock management and biomass burning; N2O from organic amendments and fertilizer application to soils; and CH4 emissions from rice cultivation. To assist inventory compilers with calculating AFOLU-sector estimates, the Agriculture and Land Use Greenhouse Gas Inventory Tool (ALU) was designed to implement Tier 1 and 2 methods using IPCC Good Practice Guidance. It guides the compiler through activity data entry, emission factor assignment, and emissions calculations while carefully maintaining data integrity. ALU also provides IPCC defaults and can estimate uncertainty. ALU was designed to simplify the AFOLU inventory compilation process at regional or national scales; disaggregating the process into a series of steps reduces the potential for errors in the compilation process. An example application has been developed using ALU to estimate methane emissions from rice production in the United States.
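
At Tier 1, the IPCC pattern the tool implements reduces to multiplying activity data by emission factors and converting to CO2-equivalent. A minimal sketch of that calculation; the emission factors below are placeholders, not IPCC defaults, and the GWP values are illustrative:

```python
# Illustrative 100-year global warming potentials (CO2-equivalence weights).
GWP = {"CO2": 1, "CH4": 28, "N2O": 265}

def tier1_emissions(activity, factors):
    """Tier 1 estimate: sum over categories of activity * emission factor,
    weighted by GWP to express the total in t CO2-eq.

    activity: {category: amount}, e.g. hectares cultivated or t N applied.
    factors:  {category: (gas, emission factor per unit of activity)}.
    """
    total = 0.0
    for category, amount in activity.items():
        gas, ef = factors[category]
        total += amount * ef * GWP[gas]
    return total

# Hypothetical activity data and emission factors (placeholders only):
activity = {"rice_ha": 1000.0, "fertilizer_tN": 50.0}
factors = {
    "rice_ha": ("CH4", 0.13),        # t CH4 per ha-season (placeholder)
    "fertilizer_tN": ("N2O", 0.01),  # t N2O per t N applied (placeholder)
}
print(tier1_emissions(activity, factors))  # → 3772.5
```

Tier 2 follows the same structure but substitutes country-specific emission factors for the defaults, which is why a tool like ALU can share one calculation engine across both tiers.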

  8. Machine-learned and codified synthesis parameters of oxide materials

    NASA Astrophysics Data System (ADS)

    Kim, Edward; Huang, Kevin; Tomala, Alex; Matthews, Sara; Strubell, Emma; Saunders, Adam; McCallum, Andrew; Olivetti, Elsa

    2017-09-01

    Predictive materials design has rapidly accelerated in recent years with the advent of large-scale resources, such as materials structure and property databases generated by ab initio computations. In the absence of analogous ab initio frameworks for materials synthesis, high-throughput and machine learning techniques have recently been harnessed to generate synthesis strategies for select materials of interest. Still, a community-accessible, autonomously-compiled synthesis planning resource which spans across materials systems has not yet been developed. In this work, we present a collection of aggregated synthesis parameters computed using the text contained within over 640,000 journal articles using state-of-the-art natural language processing and machine learning techniques. We provide a dataset of synthesis parameters, compiled autonomously across 30 different oxide systems, in a format optimized for planning novel syntheses of materials.
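
    As a toy illustration of the kind of extraction such a pipeline performs (the real system uses trained NLP models rather than regular expressions; the pattern and sentence below are invented):

```python
import re

# Pull candidate synthesis temperatures (in degrees Celsius) out of free text.
TEMP_RE = re.compile(r"(\d+(?:\.\d+)?)\s*°\s*C")

def extract_temperatures(sentence: str) -> list[float]:
    """Return every temperature written as '<number> °C' in the sentence."""
    return [float(t) for t in TEMP_RE.findall(sentence)]

text = "The powder was calcined at 900 °C for 12 h and sintered at 1200 °C."
temps = extract_temperatures(text)   # [900.0, 1200.0]
```

    At the scale of 640,000 articles, the hard problems are disambiguation and normalization (units, ranges, which material a parameter belongs to), which is what the machine-learned components address.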

  9. Discussion on the Criterion for the Safety Certification Basis Compilation - Brazilian Space Program Case

    NASA Astrophysics Data System (ADS)

    Niwa, M.; Alves, N. C.; Caetano, A. O.; Andrade, N. S. O.

    2012-01-01

    The recent advent of commercial launch and re-entry activities, which promote the expansion of human access to space for tourism and hypersonic travel, has brought additional difficulties to the development of a harmonized framework of international safety rules in the already complex environment of global space activities. In the present work, with the purpose of providing some complementary elements for global safety rule development, the certification-related activities conducted in the Brazilian space program are described and discussed, focusing mainly on the criterion for certification basis compilation. The results suggest that the composition of a certification basis with the preferential use of internationally recognized standards, as is the case of ISO standards, can be a first step toward the development of an international safety regulation for commercial space activities.

  10. Protection: clarifying the concept for use in nursing practice.

    PubMed

    Lorenz, Susan G

    2007-01-01

    The protection of patients is integral to any healthcare setting. Healthcare organizations are increasingly held accountable for preventable medical errors, attitudes toward safety, communication among all levels of providers, collaborative practices, and recognition of risks. The concept of protection is inherent in nursing practice and provides a framework that further defines healthcare providers' roles in meeting these imperatives. The scope of protection is both global and individual: nurses protect patients from environmental hazards, from themselves, and from any perceived threat. In this analysis of the phenomenon, the concept is clarified, and an evidence-based approach to protection is used for theory development and concept measurement.

  11. Understanding the evolution of rice technology in China - from traditional agriculture to GM rice today.

    PubMed

    Shen, Xiaobai

    2010-01-01

    This paper provides an historical survey of the evolution of rice technology in China, from the traditional farming system to genetically modified rice today. Using a sociotechnological analytical framework, it analyses rice technology as a socio-technical ensemble - a complex interaction of material and social elements - and discusses the specificity of technology development and its socio-technical outcomes. It points to two imperatives in rice variety development: wholesale transplantation of agricultural technology and social mechanisms to developing countries is likely to lead to negative consequences, whereas indigenous innovation, including deploying GM technology for seed varietal development and capturing/cultivating local knowledge, will provide better solutions.

  12. Contemporary cybernetics and its facets of cognitive informatics and computational intelligence.

    PubMed

    Wang, Yingxu; Kinsner, Witold; Zhang, Du

    2009-08-01

    This paper explores the architecture, theoretical foundations, and paradigms of contemporary cybernetics from perspectives of cognitive informatics (CI) and computational intelligence. The modern domain and the hierarchical behavioral model of cybernetics are elaborated at the imperative, autonomic, and cognitive layers. The CI facet of cybernetics is presented, which explains how the brain may be mimicked in cybernetics via CI and neural informatics. The computational intelligence facet is described with a generic intelligence model of cybernetics. The compatibility between natural and cybernetic intelligence is analyzed. A coherent framework of contemporary cybernetics is presented toward the development of transdisciplinary theories and applications in cybernetics, CI, and computational intelligence.

  13. Transoral endoscopic thyroidectomy vestibular approach: a preliminary framework for assessment and safety.

    PubMed

    Russell, Jonathon; Anuwong, Angkoon; Dionigi, Gianlorenzo; Inabnet, William B; Kim, Hoon Yub; Randolph, Gregory W; Richmon, Jeremy D; Tufano, Ralph P

    2018-05-23

    Transoral endoscopic thyroidectomy vestibular approach (TOETVA) is a new approach to the central neck that avoids an anterior cervical incision. This approach can be performed with endoscopic or robotic assistance and offers access to the bilateral central neck. It has been completed safely in both North American and, even more extensively, international populations. With any new technology or approach, complications during the learning curve, expense, instrument limitations, and overall safety may affect its ultimate adoption and utility. To ensure patient safety, it is imperative to define steps that should be considered by any surgeon or group before adoption of this new approach.

  14. Board oversight of community benefit: an ethical imperative.

    PubMed

    Magill, Gerard; Prybil, Lawrence D

    2011-03-01

    Board oversight of community benefit responsibility in tax-exempt organizations in the nonprofit health care sector is attracting considerable attention. Scrutiny by the IRS and other official bodies has led to stricter measures of compliance with the community benefit standard. But stricter compliance does not sufficiently engage the underlying ethical imperative for boards to provide effective oversight--an imperative that recent research suggests has not been sufficiently honored. This analysis considers why there is a distinctively ethical imperative for board oversight, the organizational nature of the imperative involved, and practical ways to fulfill its obligations. We adopt an organizational ethics paradigm to illuminate the constituent components of the ethical imperative and to clarify emerging benchmarks as flexible guidelines. As these emerging benchmarks enhance board oversight of community benefit they also can shed light on what it means to be a virtuous organization.

  15. FAIL-SAFE: Fault Aware IntelLigent Software for Exascale

    DTIC Science & Technology

    2016-06-13

    and that these programs can continue to correct solutions. To broaden the impact of this research, we also needed to be able to ameliorate errors...designing an interface between the application and an introspection framework for resilience (IFR) based on the inference engine SHINE; (4) using...the ROSE compiler to translate annotations into reasoning rules for the IFR; and (5) designing a Knowledge/Experience Database, which will store

  16. A World Fit for Children: Millennium Development Goals; Special Session on Children Documents; The Convention on the Rights of the Child.

    ERIC Educational Resources Information Center

    United Nations Children's Fund, New York, NY.

    In May 2002, participants at the United Nations General Assembly's Special Session on Children committed to a set of specific goals for children and youth and a basic framework for meeting these goals. This report compiles the commitments that were part of the Special Session: (1) the Millennium Development Goals, earlier pledged to by all 189…

  17. Planetary Sciences: American and Soviet Research

    NASA Technical Reports Server (NTRS)

    Donahue, Thomas M. (Editor); Trivers, Kathleen Kearney (Editor); Abramson, David M. (Editor)

    1991-01-01

    Papers presented at the US-USSR Workshop on Planetary Sciences are compiled. The purpose of the workshop was to examine the current state of theoretical understanding of how the planets were formed and how they evolved to their present state. The workshop assessed the types of observations and experiments that are needed to advance understanding of the formation and evolution of the solar system based on the current theoretical framework.

  18. A Framework for an Automated Compilation System for Reconfigurable Architectures

    DTIC Science & Technology

    1997-03-01

    HDLs, Hardware C requires the designer to be thoroughly familiar with digital hardware design. Vahid, Gong, and Gajski focus on the partitioning...of hardware used. Vahid, Gong, and Gajski suggest that the greedy approach used by Gupta and De Micheli is easily trapped in local minima [46:216]...iterative algorithm. To overcome this limitation, Vahid, Gong, and Gajski suggest a binary constraint partitioning approach. The partitioning
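
    The greedy hardware/software partitioning idea criticized in the snippet above can be sketched in a few lines. This is a generic illustration of greedy partitioning under an area constraint, not code from the report; the task names and area costs are invented.

```python
def greedy_partition(tasks: dict[str, int], area_budget: int) -> tuple[set, set]:
    """tasks maps task name -> hardware area cost; returns (hw, sw) sets.

    Start with everything in hardware, then greedily move the largest
    tasks to software until the hardware area budget is satisfied.
    """
    hw, sw = set(tasks), set()
    for name in sorted(tasks, key=tasks.get, reverse=True):
        if sum(tasks[t] for t in hw) <= area_budget:
            break                       # constraint met; stop moving tasks
        hw.remove(name)
        sw.add(name)
    return hw, sw

hw, sw = greedy_partition({"fft": 40, "fir": 25, "crc": 10}, area_budget=40)
```

    A greedy pass like this commits to each move permanently, which is why it can get trapped in local minima; a binary constraint approach instead searches over the constraint value itself.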

  19. A systematic approach towards the identification and protection of vulnerable marine ecosystems

    USGS Publications Warehouse

    Ardron, Jeff A.; Clark, Malcolm R.; Penney, Andrew J.; Hourigan, Thomas F.; Rowden, Ashley A.; Dunstan, Piers K.; Watling, Les; Shank, Timothy M.; Tracey, Di M.; Dunn, Matthew R.; Parker, Steven J.

    2014-01-01

    The United Nations General Assembly in 2006 and 2009 adopted resolutions that call for the identification and protection of vulnerable marine ecosystems (VMEs) from significant adverse impacts of bottom fishing. While general criteria have been produced, there are no guidelines or protocols that elaborate on the process from initial identification through to the protection of VMEs. Here, based upon an expert review of existing practices, a 10-step framework is proposed: (1) Comparatively assess potential VME indicator taxa and habitats in a region; (2) determine VME thresholds; (3) consider areas already known for their ecological importance; (4) compile information on the distributions of likely VME taxa and habitats, as well as related environmental data; (5) develop predictive distribution models for VME indicator taxa and habitats; (6) compile known or likely fishing impacts; (7) produce a predicted VME naturalness distribution (areas of low cumulative impacts); (8) identify areas of higher value to user groups; (9) conduct management strategy evaluations to produce trade-off scenarios; (10) review and re-iterate, until spatial management scenarios are developed that fulfil international obligations and regional conservation and management objectives. To date, regional progress has been piecemeal and incremental. The proposed 10-step framework combines these various experiences into a systematic approach.
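
    Steps (2), (4) and (6) of the framework above reduce, at their simplest, to thresholding mapped data. A hedged sketch of that core operation follows; the field names and threshold values are invented for illustration, not taken from the paper.

```python
def flag_vme_cells(cells, density_threshold=0.5, max_impact=0.2):
    """Flag candidate VME grid cells.

    cells: list of dicts with 'id', 'coral_density' (indicator-taxon
    density, step 4) and 'fishing_impact' (cumulative impact, step 6).
    A cell qualifies when density exceeds the VME threshold (step 2)
    and recorded fishing impact is low (high 'naturalness', step 7).
    """
    return [c["id"] for c in cells
            if c["coral_density"] >= density_threshold
            and c["fishing_impact"] <= max_impact]

cells = [
    {"id": "A1", "coral_density": 0.8, "fishing_impact": 0.1},
    {"id": "A2", "coral_density": 0.9, "fishing_impact": 0.6},
    {"id": "A3", "coral_density": 0.1, "fishing_impact": 0.0},
]
candidates = flag_vme_cells(cells)   # ['A1']
```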

  20. YAPPA: a Compiler-Based Parallelization Framework for Irregular Applications on MPSoCs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lovergine, Silvia; Tumeo, Antonino; Villa, Oreste

    Modern embedded systems include hundreds of cores. Because of the difficulty in providing a fast, coherent memory architecture, these systems usually rely on non-coherent, non-uniform memory architectures with private memories for each core. However, programming these systems poses significant challenges. The developer must extract large amounts of parallelism, while orchestrating communication among cores to optimize application performance. These issues become even more significant with irregular applications, which present data sets that are difficult to partition, unpredictable memory accesses, unbalanced control flow and fine-grained communication. Hand-optimizing every single aspect is hard and time-consuming, and it often does not lead to the expected performance. There is a growing gap between such complex and highly parallel architectures and the high-level languages used to describe the specification, which were designed for simpler systems and do not consider these new issues. In this paper we introduce YAPPA (Yet Another Parallel Programming Approach), a compilation framework for the automatic parallelization of irregular applications on modern MPSoCs based on LLVM. We start by considering an efficient parallel programming approach for irregular applications on distributed memory systems. We then propose a set of transformations that can reduce the development and optimization effort. The results of our initial prototype confirm the correctness of the proposed approach.
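
    One pattern a parallelizing compiler can target on such non-coherent, private-memory architectures is owner-computes with explicit message batching: irregular updates destined for another core's slice of the data are queued into per-core outboxes rather than written through shared memory. The sketch below is our own illustration of that pattern, not YAPPA's actual output.

```python
def partition_updates(updates, n_cores, n_elems):
    """updates: list of (index, value) pairs with unpredictable indices.

    Each core owns a contiguous block of elements; every update is routed
    to the outbox of the core that owns its target index, so communication
    happens in batched messages instead of fine-grained remote accesses.
    """
    chunk = (n_elems + n_cores - 1) // n_cores   # block size per core
    outboxes = [[] for _ in range(n_cores)]
    for idx, val in updates:
        owner = idx // chunk                     # core owning this element
        outboxes[owner].append((idx, val))
    return outboxes

boxes = partition_updates([(0, 1.0), (7, 2.0), (3, 3.0)], n_cores=2, n_elems=8)
```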

  1. Three tumor patients with total maxillectomy rehabilitated with implant-supported frameworks and maxillary obturators: a follow-up report.

    PubMed

    Örtorp, Anders

    2010-12-01

    Few reports are available on treatment using implant-supported frameworks with maxillary obturators after total maxillectomy on tumor patients. To describe, evaluate, and report the clinical and radiographic performance of implant-supported frameworks and maxillary obturators after maxillectomy during the first years of function. Three patients with cancer in the maxillary region treated by total maxillectomy were rehabilitated. Seventeen dental and two craniofacial implants were installed, and the patients each received implant-supported, screw-retained, three-unit frameworks with a U-shaped bar and obturators retained by four magnetic attachments. Clinical and radiographic data were collected up to 7 years of follow-up. The frequency of complications was low. Two craniofacial implants and one dental implant were loose and removed at abutment connection. No implants were lost after framework connection, and the mean marginal bone loss was small. Within the limitations of this report, dental implants are useful for rehabilitation of total maxillectomy patients, and a three-unit, screw-retained, implant-supported framework with maxillary obturator retained by magnetic attachment is a successful treatment concept for this patient group. © 2009, Copyright the Author. Journal Compilation © 2010, Wiley Periodicals, Inc.

  2. Achieving Information Dominance: Seven Imperatives for Success

    DTIC Science & Technology

    2002-06-01

    ACHIEVING INFORMATION DOMINANCE: SEVEN IMPERATIVES FOR SUCCESS. Topical Area: C4ISR and Space. Authors: Dr. Tom Kaye and Mr. George Galdorisi. An integrated joint and combined C4ISR

  3. Bacterial bio-resources for remediation of hexachlorocyclohexane.

    PubMed

    Alvarez, Analía; Benimeli, Claudia S; Saez, Juliana M; Fuentes, María S; Cuozzo, Sergio A; Polti, Marta A; Amoroso, María J

    2012-11-15

    In the last few decades, highly toxic organic compounds like the organochlorine pesticide (OP) hexachlorocyclohexane (HCH) have been released into the environment. All HCH isomers are acutely toxic to mammals. Although its use is nowadays restricted or completely banned in most countries, it continues to pose serious environmental and health concerns. Since HCH toxicity is well known, it is imperative to develop methods to remove it from the environment. Bioremediation technologies, which use microorganisms and/or plants to degrade toxic contaminants, have become the focus of interest. Microorganisms play a significant role in the transformation and degradation of xenobiotic compounds. Many Gram-negative bacteria have been reported to have the metabolic ability to attack HCH. For instance, several Sphingomonas strains have been reported to degrade the pesticide. On the other hand, among Gram-positive microorganisms, actinobacteria have great potential for biodegradation of organic and inorganic toxic compounds. This review compiles and updates the information available on bacterial removal of HCH, particularly by Streptomyces strains, a prolific genus of actinobacteria. A brief account of the persistence and deleterious effects of these pollutant chemicals is also given.

  4. Programming and reprogramming sequence timing following high and low contextual interference practice.

    PubMed

    Wright, David L; Magnuson, Curt E; Black, Charles B

    2005-09-01

    Individuals practiced two unique discrete sequence production tasks that differed in their relative time profile in either a blocked or random practice schedule. Each participant was subsequently administered a "precuing" protocol to examine the cost of initially compiling or modifying the plan for an upcoming movement's relative timing. The findings indicated that, in general, random practice facilitated the programming of the required movement timing, and this was accomplished while exhibiting greater accuracy in movement production. Participants exposed to random practice exhibited the greatest motor programming benefit when a modification to an already prepared movement timing profile was required. When movement timing was only partially constructed prior to the imperative signal, individuals trained in blocked and random practice formats accrued a similar cost to complete the programming process. These data provide additional support for the recent claim of Immink & Wright (2001) that at least some of the benefit of experience in a random as opposed to blocked training context can be localized to superior development and implementation of the motor programming process before executing the movement.
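
    In a precuing design, the programming cost is read off as the reaction-time difference between a condition of interest and a fully precued baseline. Purely illustrative arithmetic follows; the millisecond values are invented, not the study's data.

```python
def programming_cost(rt_condition_ms: float, rt_full_precue_ms: float) -> float:
    """Cost (ms) of compiling or modifying a timing plan after the
    imperative signal, relative to a fully precued (pre-planned) trial."""
    return rt_condition_ms - rt_full_precue_ms

# Hypothetical condition means:
rt_full = 250.0      # full precue: plan fully prepared in advance
rt_partial = 340.0   # partial precue: plan must still be compiled
rt_modify = 310.0    # precued plan must be modified before execution

compile_cost = programming_cost(rt_partial, rt_full)   # 90.0 ms
modify_cost = programming_cost(rt_modify, rt_full)     # 60.0 ms
```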

  5. Bacterial Bio-Resources for Remediation of Hexachlorocyclohexane

    PubMed Central

    Alvarez, Analía; Benimeli, Claudia S.; Saez, Juliana M.; Fuentes, María S.; Cuozzo, Sergio A.; Polti, Marta A.; Amoroso, María J.

    2012-01-01

    In the last few decades, highly toxic organic compounds like the organochlorine pesticide (OP) hexachlorocyclohexane (HCH) have been released into the environment. All HCH isomers are acutely toxic to mammals. Although its use is nowadays restricted or completely banned in most countries, it continues to pose serious environmental and health concerns. Since HCH toxicity is well known, it is imperative to develop methods to remove it from the environment. Bioremediation technologies, which use microorganisms and/or plants to degrade toxic contaminants, have become the focus of interest. Microorganisms play a significant role in the transformation and degradation of xenobiotic compounds. Many Gram-negative bacteria have been reported to have the metabolic ability to attack HCH. For instance, several Sphingomonas strains have been reported to degrade the pesticide. On the other hand, among Gram-positive microorganisms, actinobacteria have great potential for biodegradation of organic and inorganic toxic compounds. This review compiles and updates the information available on bacterial removal of HCH, particularly by Streptomyces strains, a prolific genus of actinobacteria. A brief account of the persistence and deleterious effects of these pollutant chemicals is also given. PMID:23203113

  6. Understanding child sexual behavior problems: a developmental psychopathology framework.

    PubMed

    Elkovitch, Natasha; Latzman, Robert D; Hansen, David J; Flood, Mary Fran

    2009-11-01

    Children exhibiting sexual behavior have increasingly gained the attention of child welfare and mental health systems, as well as the scientific community. While a heterogeneous group, children with sexual behavior problems consistently demonstrate a number of problems related to adjustment and overall development. In order to appropriately intervene with these children, a comprehensive understanding of etiology is imperative. The overarching goal of the present paper is to review the extant research on mechanisms associated with the development of problematic sexual behavior in childhood within a developmental psychopathology framework. What is known about normative and nonnormative sexual behavior in childhood is reviewed, highlighting definitional challenges and age-related developmental differences. Further, the relationship between child sexual abuse and child sexual behavior problems is discussed, drawing attention to factors impacting this relationship. Risk factors for child sexual behavior problems, beyond that of sexual abuse, are also reviewed utilizing a transactional-ecological framework. Finally, we conclude with a discussion of implications of a developmental psychopathology perspective on problematic child sexual behaviors to inform future research and intervention efforts. Such implications include the need for attention to normative childhood sexual behavior, developmental sensitivity, and examinations of ecological domain in concert.

  7. Responsive space: Concept analysis and theoretical framework

    NASA Astrophysics Data System (ADS)

    Saleh, Joseph H.; Dubos, Gregory F.

    2009-08-01

    Customers' needs are dynamic and evolve in response to unfolding environmental uncertainties. The ability of a company or an industry to address these changing customers' needs in a timely and cost-effective way is a measure of its responsiveness. In the space industry, a systemic discrepancy exists between the time constants associated with the change of customers' needs, and the response time of the industry in delivering on-orbit solutions to these needs. There are important penalties associated with such delays, and space responsiveness is recognized as a strategic imperative in commercial competitive and military environments. In this paper, we provide a critical assessment of the literature on responsive space and introduce a new multi-disciplinary framework for thinking about and addressing issues of space responsiveness. Our framework advocates three levels of responsiveness: a global industry-wide responsiveness, a local stakeholder responsiveness, and an interactive or inter-stakeholder responsiveness. We introduce and motivate the use of "responsiveness maps" for multiple stakeholders. We then identify "levers of responsiveness": technical spacecraft- and launch-centric, as well as "soft" levers (e.g., acquisition policies) for improving the responsiveness of the space industry. Finally, we propose a series of research questions to aggressively tackle problems associated with space responsiveness.

  8. Performance measurement integrated information framework in e-Manufacturing

    NASA Astrophysics Data System (ADS)

    Teran, Hilaida; Hernandez, Juan Carlos; Vizán, Antonio; Ríos, José

    2014-11-01

    The implementation of Internet technologies has led to e-Manufacturing technologies becoming more widely used and to the development of tools for compiling, transforming and synchronising manufacturing data through the Web. In this context, a potential area for development is the extension of virtual manufacturing to performance measurement (PM) processes, a critical area for decision making and implementing improvement actions in manufacturing. This paper proposes a PM information framework to integrate decision support systems in e-Manufacturing. Specifically, the proposed framework offers a homogeneous PM information exchange model that can be applied to decision support in an e-Manufacturing environment. Its application improves the interoperability necessary in decision-making data processing tasks. It comprises three sub-systems: a data model, a PM information platform and a PM-Web services architecture. A practical example of data exchange for measurement processes in the area of equipment maintenance is shown to demonstrate the utility of the model.
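
    A homogeneous exchange model of this kind boils down to an agreed indicator schema that every sub-system serializes identically, so a Web service on either end can consume it. A hypothetical sketch follows; the field names are ours for illustration, not the paper's schema.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class PMIndicator:
    """One performance-measurement record exchanged between sub-systems."""
    machine_id: str
    name: str          # e.g. "mean_time_between_failures"
    value: float
    unit: str
    period: str        # interval the measurement covers, as ISO-8601 text

def to_message(ind: PMIndicator) -> str:
    """Serialize an indicator to the JSON payload a PM-Web service expects."""
    return json.dumps(asdict(ind))

msg = to_message(PMIndicator("press-07", "mean_time_between_failures",
                             412.5, "h", "2014-01/2014-06"))
```

    Fixing the schema once, rather than per tool, is what buys the interoperability the framework is after.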

  9. From Physics Model to Results: An Optimizing Framework for Cross-Architecture Code Generation

    DOE PAGES

    Blazewicz, Marek; Hinder, Ian; Koppelman, David M.; ...

    2013-01-01

    Starting from a high-level problem description in terms of partial differential equations using abstract tensor notation, the Chemora framework discretizes, optimizes, and generates complete high performance codes for a wide range of compute architectures. Chemora extends the capabilities of Cactus, facilitating the usage of large-scale CPU/GPU systems in an efficient manner for complex applications, without low-level code tuning. Chemora achieves parallelism through MPI and multi-threading, combining OpenMP and CUDA. Optimizations include high-level code transformations, efficient loop traversal strategies, dynamically selected data and instruction cache usage strategies, and JIT compilation of GPU code tailored to the problem characteristics. The discretization is based on higher-order finite differences on multi-block domains. Chemora's capabilities are demonstrated by simulations of black hole collisions. This problem provides an acid test of the framework, as the Einstein equations contain hundreds of variables and thousands of terms.
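
    The kernels such a framework generates are, at bottom, finite-difference stencils. As a hand-written stand-in for generated code (the grid and test function below are our own, not from the paper), a fourth-order central first derivative looks like:

```python
def d1_fourth_order(f, i, h):
    """Fourth-order central finite difference for f'(x_i).

    f: samples on a uniform grid of spacing h; valid for 2 <= i <= len(f)-3.
    Stencil coefficients: (-1, +8, 0, -8, +1) / (12 h).
    """
    return (-f[i + 2] + 8 * f[i + 1] - 8 * f[i - 1] + f[i - 2]) / (12 * h)

# Check on f(x) = x^2, where f'(1.0) = 2 exactly: the stencil is exact
# for polynomials up to degree four.
h = 0.1
xs = [0.8, 0.9, 1.0, 1.1, 1.2]
fs = [x * x for x in xs]
deriv = d1_fourth_order(fs, 2, h)
```

    A code generator emits thousands of expressions of this shape from the tensorial PDE description, then schedules them for cache and GPU efficiency.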

  10. Operationalizing the Learning Health Care System in an Integrated Delivery System

    PubMed Central

    Psek, Wayne A.; Stametz, Rebecca A.; Bailey-Davis, Lisa D.; Davis, Daniel; Darer, Jonathan; Faucett, William A.; Henninger, Debra L.; Sellers, Dorothy C.; Gerrity, Gloria

    2015-01-01

    Introduction: The Learning Health Care System (LHCS) model seeks to utilize sophisticated technologies and competencies to integrate clinical operations, research and patient participation in order to continuously generate knowledge, improve care, and deliver value. Transitioning from concept to practical application of an LHCS presents many challenges but can yield opportunities for continuous improvement. There is limited literature and practical experience available in operationalizing the LHCS in the context of an integrated health system. At Geisinger Health System (GHS) a multi-stakeholder group is undertaking to enhance organizational learning and develop a plan for operationalizing the LHCS system-wide. We present a framework for operationalizing continuous learning across an integrated delivery system and lessons learned through the ongoing planning process. Framework: The framework focuses attention on nine key LHCS operational components: Data and Analytics; People and Partnerships; Patient and Family Engagement; Ethics and Oversight; Evaluation and Methodology; Funding; Organization; Prioritization; and Deliverables. Definitions, key elements and examples for each are presented. The framework is purposefully broad for application across different organizational contexts. Conclusion: A realistic assessment of the culture, resources and capabilities of the organization related to learning is critical to defining the scope of operationalization. Engaging patients in clinical care and discovery, including quality improvement and comparative effectiveness research, requires a defensible ethical framework that undergirds a system of strong but flexible oversight. Leadership support is imperative for advancement of the LHCS model. Findings from our ongoing work within the proposed framework may inform other organizations considering a transition to an LHCS. PMID:25992388

  11. [Is it possible to improve the preventive usefulness of workers' health surveillance in the current regulatory framework?]

    PubMed

    Rodríguez Jareño, Mari Cruz; De Montserrat I Nonó, Jaume

    In Spain, the limited preventive usefulness of health surveillance is determined by the indiscriminate use of nonspecific "generic" health examinations aimed at producing a "fitness for work list", presumably allowing companies to comply with health and safety regulations. This study aimed to produce a technical interpretation of the Spanish Prevention of Risks at Work Act and propose a new conceptual framework to favour greater preventive usefulness of health surveillance within the current regulatory framework. Using qualitative techniques of content analysis, the text of the Law was studied, the key concepts that impeded the fulfilment of the preventive objectives of health surveillance were identified, and a technical interpretation adjusted to regulations was made in order to propose a new conceptual framework. RESULTS: This conceptual framework would include: clearly differentiating health surveillance from health examinations (one of its instruments) and from fitness for work evaluations (an independent concept in itself); restricting mandatory health surveillance to situations in which it is "imperative" to carry it out because of the existence of a substantial risk to workers or third parties, including potentially vulnerable workers; and communicating the results of health surveillance through preventive recommendations to the company, reserving fitness for duty certificates - always based on clear, pre-established and justified criteria in relation to risk - for mandatory surveillance. The proposed new conceptual framework falls within the scope of the Spanish Prevention of Risks at Work Act, and its implementation could contribute to improving the preventive usefulness of health surveillance without the need to reform the legislation. Copyright belongs to the Societat Catalana de Salut Laboral.

  12. A common body of care: the ethics and politics of teamwork in the operating theater are inseparable.

    PubMed

    Bleakley, Alan

    2006-06-01

    In the operating theater, the micro-politics of practice, such as interpersonal communications, are central to patient safety and are intimately tied with values as well as knowledge and skills. Team communication is a shared and distributed work activity. In an era of "professionalism," which must now encompass "interprofessionalism," a virtue ethics framework is often invoked to inform practice choices, with reference to phronesis or practical wisdom. However, such a framework is typically cast in individualistic terms as a character trait, rather than in terms of a distributed quality that may be constituted through intentionally collaborative practice, or that is an emerging property of a complex, adaptive system. A virtue ethics approach is a necessary but not sufficient condition for a collaborative bioethics within the operating theater. There is also an ecological imperative - the patient's entry into the household (oikos) of the operating theater invokes the need for "hospitality" as a form of ethical practice.

  13. Managing crises through organisational development: a conceptual framework.

    PubMed

    Lalonde, Carole

    2011-04-01

    This paper presents a synthesis of the guiding principles in crisis management in accordance with the four configurational imperatives (strategy, structure, leadership and environment) defined by Miller (1987) and outlines interventions in organisational development (OD) that may contribute to their achievement. The aim is to build a conceptual framework at the intersection of these two fields that could help to strengthen the resilient capabilities of individuals, organisations and communities to face crises. This incursion into the field of OD--to generate more efficient configurations of practices in crisis management--seems particularly fruitful considering the system-wide application of OD, based on open-systems theory (Burke, 2008). Various interventions proposed by OD in terms of human processes, structural designs and human resource management, as well as strategy, may help leaders, members of organisations and civil society apply effectively, and in a more sustainable way, the crisis management guiding principles defined by researchers. © 2011 The Author(s). Disasters © Overseas Development Institute, 2011.

  14. Strengthening the Health System to Better Confront Noncommunicable Diseases in India

    PubMed Central

    Duran, Antonio; Khot, Anagha

    2011-01-01

    The paper emphasizes the vital need to address the rising burden of noncommunicable diseases (NCDs) in India with a health systems approach. The authors argue that adoption of such an approach may soon be imperative. Applying the health systems framework developed by the WHO in 2000 to NCDs means, in summary, re-examining the planning and organization of the entire health system, from service provision to financing, from information generation to ensuring an adequate supply of pharmaceuticals/technologies or human resources, and from improving facility management to performance monitoring. Using this framework, the authors seek to highlight core issues and identify possible policy actions required. The challenge is to ensure the best implementation of what works, aligning the service provision function with financial incentives and ensuring leadership/stewardship by the government across the local/municipal, state/regional and national levels while involving stakeholders. A health system perspective would also ensure that action against NCDs goes hand in hand with tackling the remaining burden from communicable diseases, maternal and child health, and nutrition issues. PMID:22628908

  15. Person-centered work environments, psychological safety, and positive affect in healthcare: a theoretical framework.

    PubMed

    Rathert, Cheryl; May, Douglas R

    2008-01-01

    We propose that in order to systematically improve healthcare quality, healthcare organizations (HCOs) need work environments that are person-centered: environments that support the careprovider as well as the patient. We further argue that HCOs have a moral imperative to provide a workplace where professional care standards can be achieved. We draw upon a large body of research from several disciplines to propose and articulate a theoretical framework that explains how the work environment should be related to the well-being of patients and careproviders, that is, the potential mediating mechanisms. Person-centered work environments include: 1. Climates for patient-centered care. 2. Climates for quality improvement. 3. Benevolent ethical climates. Such a work environment should support the provision of patient-centered care, and should lead to positive psychological states for careproviders, including psychological safety and positive affect. The model contributes to theory by specifying relationships between important organizational variables. The model can potentially contribute to practice by linking specific work environment attributes to outcomes for careproviders and patients.

  16. The edge-preservation multi-classifier relearning framework for the classification of high-resolution remotely sensed imagery

    NASA Astrophysics Data System (ADS)

    Han, Xiaopeng; Huang, Xin; Li, Jiayi; Li, Yansheng; Yang, Michael Ying; Gong, Jianya

    2018-04-01

    In recent years, the availability of high-resolution imagery has enabled more detailed observation of the Earth. However, it is imperative to simultaneously achieve accurate interpretation and preserve the spatial details for the classification of such high-resolution data. To this end, we propose the edge-preservation multi-classifier relearning framework (EMRF). This multi-classifier framework is made up of support vector machine (SVM), random forest (RF), and sparse multinomial logistic regression via variable splitting and augmented Lagrangian (LORSAL) classifiers, considering their complementary characteristics. To better characterize complex scenes of remote sensing images, relearning based on landscape metrics is proposed, which iteratively quantizes both the landscape composition and spatial configuration by the use of the initial classification results. In addition, a novel tri-training strategy is proposed to solve the over-smoothing effect of relearning by means of automatic selection of training samples with low classification certainties, which are always distributed in or near the edge areas. Finally, EMRF flexibly combines the strengths of relearning and tri-training via the classification certainties calculated by the probabilistic output of the respective classifiers. It should be noted that, in order to achieve an unbiased evaluation, we assessed the classification accuracy of the proposed framework using both edge and non-edge test samples. The experimental results obtained with four multispectral high-resolution images confirm the efficacy of the proposed framework, in terms of both edge and non-edge accuracy.
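    The abstract does not spell out EMRF's exact fusion rule, so the following is only an illustrative sketch of certainty-weighted combination of probabilistic classifier outputs; the function name, array shapes, and the choice of maximum class probability as the certainty measure are assumptions for the example.

```python
import numpy as np

def combine_by_certainty(prob_maps):
    """Fuse per-pixel class probabilities from several classifiers.

    prob_maps: list of arrays, each of shape (n_pixels, n_classes),
    holding the probabilistic output of one classifier (e.g. SVM, RF).
    Each classifier's vote is weighted by its certainty, taken here as
    the maximum class probability it assigns to that pixel.
    """
    stacked = np.stack(prob_maps)            # (n_clf, n_pixels, n_classes)
    certainty = stacked.max(axis=2)          # (n_clf, n_pixels)
    weights = certainty / certainty.sum(axis=0, keepdims=True)
    fused = (weights[:, :, None] * stacked).sum(axis=0)
    return fused.argmax(axis=1)              # hard label per pixel

# Two toy "classifiers" that disagree on the second pixel; the more
# certain one (p2) wins there.
p1 = np.array([[0.9, 0.1], [0.6, 0.4]])
p2 = np.array([[0.8, 0.2], [0.2, 0.8]])
labels = combine_by_certainty([p1, p2])
```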

  17. The what, when, and why of implementation frameworks for evidence-based practices in child welfare and child mental health service systems.

    PubMed

    Hanson, Rochelle F; Self-Brown, Shannon; Rostad, Whitney L; Jackson, Matthew C

    2016-03-01

    It is widely recognized that children in the child welfare system are particularly vulnerable to the adverse physical and mental health effects associated with exposure to abuse and neglect, making it imperative to have broad-based availability of evidence-based practices (EBPs) that can prevent child maltreatment and reduce the negative mental health outcomes for youth who are victims. A variety of EBPs exist for reducing child maltreatment risk and addressing the associated negative mental health outcomes, but the reach of these practices is limited. An emerging literature documents factors that can enhance or inhibit the success of EBP implementation in community service agencies, including how the selection of a theory-driven conceptual framework, or model, might facilitate implementation planning by providing guidance for best practices during implementation phases. However, limited research is available to guide decision makers in the selection of implementation frameworks that can boost implementation success for EBPs that focus on preventing child welfare recidivism and serving the mental health needs of maltreated youth. The aims of this conceptual paper are to (1) provide an overview of existing implementation frameworks, beginning with a discussion of definitional issues and the selection criteria for frameworks included in the review; and (2) offer recommendations for practice and policy, as applicable, for professionals and systems serving victims of child maltreatment and their families. Copyright © 2015 Elsevier Ltd. All rights reserved.

  18. pWeb: A High-Performance, Parallel-Computing Framework for Web-Browser-Based Medical Simulation.

    PubMed

    Halic, Tansel; Ahn, Woojin; De, Suvranu

    2014-01-01

    This work presents pWeb, a new language and compiler for the parallelization of client-side, compute-intensive web applications such as surgical simulations. The recently introduced HTML5 standard has enabled unprecedented applications on the web. The low performance of the web browser relative to native platforms, however, remains the bottleneck for computationally intensive applications, including visualization of complex scenes, real-time physical simulations and image processing. The proposed language is built upon web workers for multithreaded programming in HTML5. The language provides the fundamental functionality of parallel programming languages as well as the fork/join parallel model, which is not supported by web workers. The language compiler automatically generates an equivalent parallel script that complies with the HTML5 standard. A case study on realistic rendering for surgical simulations demonstrates enhanced performance with a compact set of instructions.
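    pWeb's own syntax is not given in the abstract; as a language-neutral analogy, the fork/join model it layers on top of web workers can be sketched in Python, with ThreadPoolExecutor standing in for the worker pool. The names and structure below are illustrative, not pWeb's API.

```python
from concurrent.futures import ThreadPoolExecutor

def fork_join(task, chunks):
    """Fork one worker per chunk, then join by collecting results in order.

    Web workers provide only message passing; a fork/join layer like the
    one pWeb compiles to must spawn workers, dispatch the chunks, and
    block until every worker has replied.
    """
    with ThreadPoolExecutor(max_workers=len(chunks)) as pool:
        futures = [pool.submit(task, c) for c in chunks]   # fork
        return [f.result() for f in futures]               # join

# Sum four chunks of a vector in parallel, then reduce sequentially.
data = list(range(16))
chunks = [data[i:i + 4] for i in range(0, 16, 4)]
partial_sums = fork_join(sum, chunks)
total = sum(partial_sums)
```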

  19. Strengthening rehabilitation services in Indonesia: A brief situation analysis.

    PubMed

    Nugraha, Boya; Setyono, Garry Rahardian; Defi, Irma Ruslina; Gutenbrunner, Christoph

    2018-04-18

    People with disability (PWD) in Indonesia are often neglected by society. Improving their life situation towards full participation in society is crucial. As a health strategy, rehabilitation can improve func-tioning, quality of life and participation in society. However, rehabilitation services in Indonesia need improvement. Making a situation analysis of rehabilitation services and their provision in the country is a pre-requisite to taking any action towards improvement. This paper compiles available data related to disability and rehabilitation services in Indonesia, using the Rehabilitation Services Assessment Tool (RSAT) as a framework. Gaps in provision were analysed, resulting in the compilation of a list of generic recommendations to improve rehabilitation services in the country. Indonesia faces many challenges in rehabilitation services, including the health workforce and the provision of services. This situation analysis and list of generic recommendations may be used in further discussions with relevant stakeholders in the country to develop a national strategy to strengthen rehabilitation services.

  20. Run-time parallelization and scheduling of loops

    NASA Technical Reports Server (NTRS)

    Saltz, Joel H.; Mirchandaney, Ravi; Crowley, Kay

    1990-01-01

    Run-time methods are studied to automatically parallelize and schedule iterations of a do loop in certain cases where compile-time information is inadequate. The methods presented involve execution-time preprocessing of the loop. At compile time, these methods set up the framework for performing a loop dependency analysis. At run time, wavefronts of concurrently executable loop iterations are identified. Using this wavefront information, loop iterations are reordered for increased parallelism. Symbolic transformation rules are used to produce inspector procedures that perform execution-time preprocessing, and executors: transformed versions of the source code loop structures. These executors carry out the calculations planned by the inspector procedures. Performance results are presented from experiments conducted on the Encore Multimax. These results illustrate that run-time reordering of loop indices can have a significant impact on performance. Furthermore, the overheads associated with this type of reordering are amortized when the loop is executed several times with the same dependency structure.
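    A minimal sketch of the inspector/executor pattern described above, assuming only flow (read-after-write) dependences and ignoring anti-dependences for brevity; the function and variable names are illustrative:

```python
def inspector(reads, writes, n):
    """Compute wavefronts of loop iterations that may run concurrently.

    reads[i]/writes[i] give the set of array indices iteration i touches.
    Iteration i depends on the latest earlier iteration that wrote a
    location it touches; its wavefront number is one more than the
    deepest wavefront among its dependences.
    """
    last_writer = {}            # array index -> iteration that last wrote it
    wavefront = [0] * n
    for i in range(n):
        deps = [last_writer[a] for a in reads[i] | writes[i] if a in last_writer]
        wavefront[i] = 1 + max((wavefront[d] for d in deps), default=-1)
        for a in writes[i]:
            last_writer[a] = i
    return wavefront

def executor(body, wavefront):
    """Run iterations wavefront by wavefront (each front is parallelizable)."""
    for w in range(max(wavefront) + 1):
        for i in (j for j, wf in enumerate(wavefront) if wf == w):
            body(i)             # iterations within one front are independent

# Loop with indirection: iteration i reads and writes x[idx[i]], so the
# dependency structure is only known at run time.
idx = [0, 1, 0, 2]
reads = [{idx[i]} for i in range(4)]
writes = [{idx[i]} for i in range(4)]
fronts = inspector(reads, writes, 4)
```

The pay-off noted in the abstract comes from reusing `fronts` across repeated executions of the loop while the dependency structure stays fixed.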

  1. Integrated primary care, the collaboration imperative: inter-organizational cooperation in the integrated primary care field: a theoretical framework

    PubMed Central

    Valentijn, Pim P; Bruijnzeels, Marc A; de Leeuw, Rob J; Schrijvers, Guus J.P

    2012-01-01

    Purpose Capacity problems and political pressures have led to a rapid change in the organization of primary care, from monodisciplinary small businesses to complex inter-organizational relationships. It is assumed that inter-organizational collaboration is the driving force to achieve integrated (primary) care. Despite the importance of collaboration and integration of services in primary care, there is no unambiguous definition of either concept. The purpose of this study is to examine and link the conceptualisation and validation of the terms inter-organizational collaboration and integrated primary care using a theoretical framework. Theory The theoretical framework is based on the complex collaboration process of negotiation among multiple stakeholder groups in primary care. Methods A literature review of health sciences and business databases, and targeted grey literature sources. Based on the literature review, we operationalized the constructs of inter-organizational collaboration and integrated primary care in a theoretical framework. The framework is being validated in an explorative study of 80 primary care projects in the Netherlands. Results and conclusions Integrated primary care is considered a multidimensional construct based on a continuum of integration, extending from segregation to integration. The synthesis of current theories and concepts of inter-organizational collaboration is insufficient to deal with the complexity of collaborative issues in primary care. One coherent and integrated theoretical framework was found that could make the complex collaboration process in primary care transparent. The theoretical framework presented in this study is a first step towards understanding the patterns of successful collaboration and integration in primary care services. These patterns can give insights into the organizational forms needed to create a well-functioning integrated (primary) care system that fits the local needs of a population. Preliminary data on the patterns of collaboration and integration will be presented.

  2. Are Women's Orgasms Hindered by Phallocentric Imperatives?

    PubMed

    Willis, Malachi; Jozkowski, Kristen N; Lo, Wen-Juo; Sanders, Stephanie A

    2018-02-20

    Women who have sex with women (WSW) are more likely to report experiencing an orgasm during partnered sex, compared to women who have sex with men (WSM). We investigated whether this difference can be partially accounted for by phallocentric imperatives: gendered sexual scripts that prioritize men's sexual experience. For example, these imperatives emphasize vaginal-penile intercourse (i.e., the coital imperative) and men's physical pleasure (i.e., the male orgasm imperative). We reasoned that a larger variety of sexual behaviors indicates less adherence to the coital imperative and that more self-oriented orgasm goals for women indicate less adherence to the male orgasm imperative. Consistent with previous work, we expected WSW to report higher rates of orgasm than WSM when taking frequency of sex into account. We also hypothesized that this difference in orgasm rates would dissipate when controlling for variety of sexual behavior and women's self-oriented orgasm goals. In a sample of 1988 WSM and 308 WSW, we found that WSW were 1.33 times (p < .001) more likely to report experiencing an orgasm than WSM, controlling for frequency of sex. This incidence rate ratio was reduced to 1.16 (p < .001) after taking into account variety of sexual behavior and self-oriented orgasm goals. Our findings indicate that certain sexual scripts (e.g., phallocentric imperatives) help explain the orgasm discrepancy between WSW and WSM. We discuss masturbation as another male-centered practice that may be relevant to this gap, as well as implications for intervention and future research.

  3. Humanitarian responses to mass violence perpetrated against vulnerable populations.

    PubMed Central

    Gellert, G. A.

    1995-01-01

    This multidisciplinary review links three areas of legitimate inquiry for practitioners of medicine and public health. The first is occurrences of mass violence or genocide perpetrated against vulnerable populations, with a focus on the failure of national and international mechanisms to prevent or predict such violence. The second is evolving concepts of national sovereignty and an emerging framework in which the imperative to assist vulnerable populations supersedes a state's right to self determination. The last is how medical, public health, and other systems of surveillance and rapid assessment of mass violence can accelerate public awareness and facilitate structured, consistent political decision making to prevent mass violence and to provide international humanitarian assistance. PMID:7580643

  4. International Review of Frameworks for Impact Evaluation of Appliance Standards, Labeling, and Incentives

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Nan; Romankiewicz, John; Vine, Edward

    2012-12-15

    In recent years, the number of energy efficiency policies implemented has grown very rapidly as energy security and climate change have become top policy issues for many governments around the world. Within the sphere of energy efficiency policy, governments (federal and local), electric utilities, and other types of businesses and institutions are implementing a wide variety of programs to spread energy efficiency practices in industry, buildings, transport, and electricity. As programs proliferate, there is an administrative and business imperative to evaluate the savings and processes of these programs to ensure that program funds spent are indeed leading to a more energy-efficient economy.

  5. Effective communication during an influenza pandemic: the value of using a crisis and emergency risk communication framework.

    PubMed

    Reynolds, Barbara; Quinn Crouse, Sandra

    2008-10-01

    During a crisis, an open and empathetic style of communication that engenders the public's trust is the most effective when officials are attempting to galvanize the population to take a positive action or refrain from a harmful act. Although trust is imperative in a crisis, public suspicions of scientific experts and government are increasing for a variety of reasons, including access to more sources of conflicting information, a reduction in the use of scientific reasoning in decision making, and political infighting. Trust and credibility--which are demonstrated through empathy and caring, competence and expertise, honesty and openness, and dedication and commitment--are essential elements of persuasive communication.

  6. A strategic endeavor in business planning--an oncology perspective.

    PubMed

    Eck, C

    2000-06-01

    Planning is imperative to provide direction for future growth. The purpose of writing a business plan is to cultivate, analyze, and refine ideas. Planning for academic health centers has become increasingly important because of changes in the financing and delivery of health care. Gathering data related to the current patient population as well as projected future trends is necessary to establish a framework. Identifying the market and financial data and formulating the strategies needed to move forward are key elements of a business plan. The ultimate outcome of the process is to convince others that the vision is achievable and to ensure allocation of resources to carry out the plan.

  7. SMARTbot: A Behavioral Analysis Framework Augmented with Machine Learning to Identify Mobile Botnet Applications

    PubMed Central

    Karim, Ahmad; Salleh, Rosli; Khan, Muhammad Khurram

    2016-01-01

    The botnet phenomenon in smartphones is evolving with the proliferation of mobile phone technologies, after leaving an imperative impact on personal computers. It refers to a network of computers, laptops, mobile devices or tablets which is remotely controlled by cybercriminals to initiate various distributed, coordinated attacks, including spam email, ad-click fraud, Bitcoin mining, Distributed Denial of Service (DDoS), dissemination of other malware, and much more. Like traditional PC-based botnets, mobile botnets have the same operational impact, except that the target audience is smartphone users. It is therefore important to uncover this security issue prior to its widespread adoption. We propose SMARTbot, a novel dynamic analysis framework augmented with machine learning techniques to automatically detect botnet binaries from a malicious corpus. SMARTbot is a component-based, off-device behavioral analysis framework which can generate a mobile botnet learning model by inducing Artificial Neural Networks’ back-propagation method. Moreover, this framework can detect mobile botnet binaries with remarkable accuracy, even in the case of obfuscated program code. The results conclude that a classifier model based on simple logistic regression outperforms other machine learning classifiers for botnet app detection, with 99.49% accuracy achieved. Further, from manual inspection of the botnet dataset we have extracted interesting trends in those applications. As an outcome of this research, a mobile botnet dataset is devised which will become a benchmark for future studies. PMID:26978523

  8. SMARTbot: A Behavioral Analysis Framework Augmented with Machine Learning to Identify Mobile Botnet Applications.

    PubMed

    Karim, Ahmad; Salleh, Rosli; Khan, Muhammad Khurram

    2016-01-01

    The botnet phenomenon in smartphones is evolving with the proliferation of mobile phone technologies, after leaving an imperative impact on personal computers. It refers to a network of computers, laptops, mobile devices or tablets which is remotely controlled by cybercriminals to initiate various distributed, coordinated attacks, including spam email, ad-click fraud, Bitcoin mining, Distributed Denial of Service (DDoS), dissemination of other malware, and much more. Like traditional PC-based botnets, mobile botnets have the same operational impact, except that the target audience is smartphone users. It is therefore important to uncover this security issue prior to its widespread adoption. We propose SMARTbot, a novel dynamic analysis framework augmented with machine learning techniques to automatically detect botnet binaries from a malicious corpus. SMARTbot is a component-based, off-device behavioral analysis framework which can generate a mobile botnet learning model by inducing Artificial Neural Networks' back-propagation method. Moreover, this framework can detect mobile botnet binaries with remarkable accuracy, even in the case of obfuscated program code. The results conclude that a classifier model based on simple logistic regression outperforms other machine learning classifiers for botnet app detection, with 99.49% accuracy achieved. Further, from manual inspection of the botnet dataset we have extracted interesting trends in those applications. As an outcome of this research, a mobile botnet dataset is devised which will become a benchmark for future studies.
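    As an illustration of the finding that plain logistic regression suffices as a classifier, the sketch below trains a logistic model by gradient descent on two hypothetical behavioral features; the feature names and data points are invented for the example and are not taken from the SMARTbot dataset.

```python
import numpy as np

def train_logistic(X, y, lr=0.5, steps=2000):
    """Plain logistic regression fit by gradient descent (labels 0/1)."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(steps):
        z = X @ w + b
        p = 1.0 / (1.0 + np.exp(-z))         # sigmoid
        grad = p - y                          # dLoss/dz for log-loss
        w -= lr * (X.T @ grad) / len(y)
        b -= lr * grad.mean()
    return w, b

def predict(w, b, X):
    return (X @ w + b > 0).astype(int)

# Hypothetical per-app behavioral features: [network calls/min, SMS sent/min]
X = np.array([[0.1, 0.0], [0.2, 0.1], [5.0, 3.0], [6.0, 4.0]])
y = np.array([0, 0, 1, 1])                    # 1 = botnet-like behavior
w, b = train_logistic(X, y)
preds = predict(w, b, X)
```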

  9. Public trust in vaccination: an analytical framework.

    PubMed

    Gopichandran, Vijayaprasad

    2017-01-01

    While vaccination is one of the most successful public health interventions, there has always been a parallel movement against vaccines. Apart from scientific factors, the uptake of vaccinations is influenced by historical, political, sociocultural and economic factors. In India, the health system is struggling with logistical weaknesses in taking vaccination to the remotest corners, while on the other hand, some people in places where vaccination is available resist it. Unwillingness to be vaccinated is a growing problem in the developed world. This trend is gradually emerging in several parts of India as well. Other factors, such as heightened awareness of the profit motives of the vaccine industry, conflicts of interest among policy-makers, and social, cultural and religious considerations, have eroded people's trust in vaccination. This paper develops an analytical framework to assess trust in vaccination. The framework considers trust in vaccination from four perspectives: trust in the health system, the vaccine policy, vaccination providers and specific vaccines. It addresses specific issues involved in vaccination trust, including increasing scepticism towards medical technology, perceptions of conflicts of interest in vaccine policy and of a lack of transparency and openness, the presence of strong alternative schools of thought, and the influence of social media. The paper concludes by arguing that engaging with communities and having a dialogue about vaccination policy is an ethical imperative.

  10. Compiled MPI: Cost-Effective Exascale Applications Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bronevetsky, G; Quinlan, D; Lumsdaine, A

    2012-04-10

    The complexity of petascale and exascale machines makes it increasingly difficult to develop applications that can take advantage of them. Future systems are expected to feature billion-way parallelism, complex heterogeneous compute nodes and poor availability of memory (Peter Kogge, 2008). This new challenge for application development is motivating a significant amount of research and development on new programming models and runtime systems designed to simplify large-scale application development. Unfortunately, DoE has significant multi-decadal investment in a large family of mission-critical scientific applications. Scaling these applications to exascale machines will require a significant investment that will dwarf the costs of hardware procurement. A key reason for the difficulty in transitioning today's applications to exascale hardware is their reliance on explicit programming techniques, such as the Message Passing Interface (MPI) programming model, to enable parallelism. MPI provides a portable and high-performance message-passing system that enables scalable performance on a wide variety of platforms. However, it also forces developers to lock the details of parallelization together with application logic, making it very difficult to adapt the application to significant changes in the underlying system. Further, MPI's explicit interface makes it difficult to separate the application's synchronization and communication structure, reducing the amount of support that can be provided by compiler and run-time tools. This is in contrast to the recent research on more implicit parallel programming models such as Chapel, OpenMP and OpenCL, which promise to provide significantly more flexibility at the cost of reimplementing significant portions of the application.
We are developing CoMPI, a novel compiler-driven approach to enable existing MPI applications to scale to exascale systems with minimal modifications that can be made incrementally over the application's lifetime. It includes: (1) a new set of source code annotations, inserted either manually or automatically, that will clarify the application's use of MPI to the compiler infrastructure, enabling greater accuracy where needed; (2) a compiler transformation framework that leverages these annotations to transform the original MPI source code to improve its performance and scalability; (3) novel MPI runtime implementation techniques that will provide a rich set of functionality extensions to be used by applications that have been transformed by our compiler; and (4) a novel compiler analysis that leverages simple user annotations to automatically extract the application's communication structure and synthesize the most complex code annotations.
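    CoMPI's annotation syntax is not shown in the abstract. Purely as an illustration of item (4), extracting an application's communication structure from MPI source, the toy pass below scans C code for MPI_Send/MPI_Recv calls and records each call site's symbolic peer rank (the fourth argument of both calls). A real compiler analysis would work on an AST, not regexes.

```python
import re

# Point-to-point MPI calls whose peers we want to summarize.
CALL = re.compile(r'\b(MPI_Send|MPI_Recv)\s*\(([^)]*)\)')

def communication_structure(source):
    """Map each MPI call site to (line, call, peer-rank expression).

    The peer rank (dest for MPI_Send, source for MPI_Recv) is the 4th
    argument; keeping it as text mirrors how an analysis would record
    symbolic peers before annotations pin them down.
    """
    structure = []
    for lineno, line in enumerate(source.splitlines(), 1):
        for name, args in CALL.findall(line):
            peer = [a.strip() for a in args.split(',')][3]
            structure.append((lineno, name, peer))
    return structure

code = """
MPI_Send(buf, n, MPI_DOUBLE, rank + 1, 0, MPI_COMM_WORLD);
MPI_Recv(buf, n, MPI_DOUBLE, rank - 1, 0, MPI_COMM_WORLD, &st);
"""
structure = communication_structure(code)
```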

  11. KMCLib 1.1: Extended random number support and technical updates to the KMCLib general framework for kinetic Monte-Carlo simulations

    NASA Astrophysics Data System (ADS)

    Leetmaa, Mikael; Skorodumova, Natalia V.

    2015-11-01

    We here present a revised version, v1.1, of the KMCLib general framework for kinetic Monte-Carlo (KMC) simulations. The generation of random numbers in KMCLib now relies on the C++11 standard library implementation, and support has been added for the user to choose from a set of C++11-implemented random number generators. The Mersenne Twister, the 24- and 48-bit RANLUX generators and a 'minimal standard' PRNG are supported. We have also included the possibility to use true random numbers via the C++11 std::random_device generator. This release also includes technical updates to support the use of an extended range of operating systems and compilers.
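    The selectable-generator idea can be sketched outside C++. In the Python analogy below, "mt" maps to the standard library's Mersenne Twister and "device" to the OS entropy source (the counterpart of std::random_device); RANLUX has no stdlib analogue, so it is omitted. This mirrors only the interface idea, not KMCLib's actual API.

```python
import random

def make_rng(kind, seed=None):
    """Return an RNG backend chosen by name, behind one draw interface.

    The simulation core would call only rng.random(); the backend is a
    configuration choice, as in KMCLib v1.1.
    """
    if kind == "mt":
        return random.Random(seed)        # Mersenne Twister, seedable
    if kind == "device":
        return random.SystemRandom()      # OS entropy, not seedable
    raise ValueError(f"unknown RNG backend: {kind}")

rng = make_rng("mt", seed=42)
draws = [rng.random() for _ in range(3)]
```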

  12. The Imperative in Chinese.

    ERIC Educational Resources Information Center

    Hashimoto, Anne Yue

    A preliminary study of the syntactic characteristics of the imperative construction in modern Chinese is presented. The term "imperative" is used to refer to the type of syntactic construction which is marked by an implicit or explicit second person subject, and which expresses a direct command. Indirect or implied commands expressed by a…

  13. Automated Analysis of ARM Binaries using the Low-Level Virtual Machine Compiler Framework

    DTIC Science & Technology

    2011-03-01

    president to insist on keeping his smartphone [CNN09]. A self-proclaimed BlackBerry addict , President Obama fought hard to keep his mobile device after his... smartphone but renders a device non-functional on installation [FSe09][Hof07]. Complex interactions between hardware and software components both within... smartphone (which is a big assumption), the phone may still be vulnerable if the hardware or software does not correctly implement the design

  14. English-Russian, Russian-English glossary of coal-cleaning terms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pekar, J.

    1987-09-01

    The document is an English-Russian, Russian-English glossary of coal-cleaning terms, compiled as a joint U.S./Soviet effort. The need for the glossary resulted from the growing number of language-specific terms used during information exchanges within the framework of the U.S./U.S.S.R. Working Group on Stationary Source Air Pollution Control Technology, under the U.S./U.S.S.R. Agreement of Cooperation in the Field of Environmental Protection.

  15. A Framework for Violence: Clarifying the Role of Motivation in Lone-Actor Terrorism

    DTIC Science & Technology

    2017-03-01

    Within the timeframe included in this study , there was, on average, just over one case of lone-actor terrorism per year.60 Spaaij expanded his...informants and sting operations.63 Within the Becker study , twelve out of eighty-four attacks, relied on confidential informants.64 In certain cases , the...indicators of personal or ideological grievances. The data used for this research are publicly available and largely based on two case studies compiled

  16. Computer Vision Research and its Applications to Automated Cartography

    DTIC Science & Technology

    1985-09-01

    …D Scene Geometry, Thomas M. Strat and Martin A. Fischler; Appendix D, A New Sense for Depth of Field, Alex P. Pentland … A. Baseline Stereo System: As a framework for integration and evaluation of our research in modeling 3-D scene geometry, as well as a… B. New Methods for Stereo Compilation: As we previously indicated, the conventional approach to recovering scene geometry from a stereo pair of

  17. Approaches in highly parameterized inversion—PEST++ Version 3, a Parameter ESTimation and uncertainty analysis software suite optimized for large environmental models

    USGS Publications Warehouse

    Welter, David E.; White, Jeremy T.; Hunt, Randall J.; Doherty, John E.

    2015-09-18

    The PEST++ Version 3 software suite can be compiled for Microsoft Windows® and Linux® operating systems; the source code is available in a Microsoft Visual Studio® 2013 solution; Linux Makefiles are also provided. PEST++ Version 3 continues to build a foundation for an open-source framework capable of producing robust and efficient parameter estimation tools for large environmental models.

  18. Compiling a national resistivity atlas of Denmark based on airborne and ground-based transient electromagnetic data

    NASA Astrophysics Data System (ADS)

    Barfod, Adrian A. S.; Møller, Ingelise; Christiansen, Anders V.

    2016-11-01

    We present a large-scale study of the petrophysical relationship of resistivities obtained from densely sampled ground-based and airborne transient electromagnetic surveys and lithological information from boreholes. The overriding aim of this study is to develop a framework for examining the resistivity-lithology relationship in a statistical manner and apply this framework to gain a better description of the large-scale resistivity structures of the subsurface. In Denmark very large and extensive datasets are available through the national geophysical and borehole databases, GERDA and JUPITER respectively. In a 10 by 10 km grid, these data are compiled into histograms of resistivity versus lithology. To do this, the geophysical data are interpolated to the position of the boreholes, which allows for a lithological categorization of the interpolated resistivity values, yielding different histograms for a set of desired lithological categories. By applying the proposed algorithm to all available boreholes and airborne and ground-based transient electromagnetic data we build nation-wide maps of the resistivity-lithology relationships in Denmark. The presented Resistivity Atlas reveals varying patterns in the large-scale resistivity-lithology relations, reflecting geological details such as available source material for tills. The resistivity maps also reveal a clear ambiguity in the resistivity values for different lithologies. The Resistivity Atlas is highly useful when geophysical data are to be used for geological or hydrological modeling.
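    A minimal sketch of the per-grid-cell histogram step, assuming resistivities have already been interpolated to borehole sample positions and that binning is done in log10 space (an assumption for illustration; the abstract does not state the binning). Names and sample values are invented.

```python
import numpy as np

def lithology_histograms(resistivity, lithology, bins):
    """Build one resistivity histogram per lithology category.

    resistivity: values interpolated to borehole sample positions (ohm-m).
    lithology:   category label for each sample (e.g. 'clay', 'sand').
    Resistivity spans orders of magnitude, so counts are accumulated in
    log10 space.
    """
    log_r = np.log10(resistivity)
    return {
        lith: np.histogram(log_r[lithology == lith], bins=bins)[0]
        for lith in np.unique(lithology)
    }

res = np.array([12.0, 15.0, 80.0, 120.0, 9.0, 150.0])
lith = np.array(['clay', 'clay', 'sand', 'sand', 'clay', 'sand'])
bins = np.linspace(0.5, 2.5, 5)           # log10(ohm-m) bin edges
hists = lithology_histograms(res, lith, bins)
```

Repeating this per 10 by 10 km grid cell, with the geophysical model interpolated to each borehole, yields the kind of resistivity-lithology atlas the paper describes.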

  19. A framework for recognition of prior learning within a Postgraduate Diploma of Nursing Management in South Africa.

    PubMed

    Jooste, Karien; Jasper, Melanie

    2010-09-01

    The present study focuses on the development of an initial framework to guide educators in nursing management in designing a portfolio for the recognition of prior learning for accreditation of competencies within a postgraduate diploma in South Africa. In South Africa, there is a unique educational need, arising from the legacy of apartheid and previous political regimes, to facilitate educational development in groups previously unable to access higher education. Awareness of the need for continuous professional development in nursing management practice and recognition of prior learning in the educational environment has presented the possibility of using one means to accomplish both aims. Although the content of the present study is pertinent to staff development of nurse managers, it is primarily written for nurse educators in the field of nursing management. The findings identify focus areas to be addressed in a recognition of prior learning portfolio to comply with the programme specific outcomes of Nursing Service Management. Further work to refine these focus areas to criteria that specify the level of performance required to demonstrate achievement is needed. Conclusion and implications for nurse managers: managers need to facilitate continuous professional development through portfolio compilation which acknowledges the learning opportunities within the workplace and can be used as recognition of prior learning. © 2010 The Authors. Journal compilation © 2010 Blackwell Publishing Ltd.

  20. Final Report A Multi-Language Environment For Programmable Code Optimization and Empirical Tuning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yi, Qing; Whaley, Richard Clint; Qasem, Apan

    This report summarizes our effort and results in building an integrated optimization environment to effectively combine the programmable control and the empirical tuning of source-to-source compiler optimizations within the framework of multiple existing languages, specifically C, C++, and Fortran. The environment contains two main components: the ROSE analysis engine, which is based on the ROSE C/C++/Fortran2003 source-to-source compiler developed by Co-PI Dr. Quinlan et al. at DOE/LLNL, and the POET transformation engine, which is based on an interpreted program transformation language developed by Dr. Yi at the University of Texas at San Antonio (UTSA). The ROSE analysis engine performs advanced compiler analysis, identifies profitable code transformations, and then produces output in POET, a language designed to provide programmable control of compiler optimizations to application developers and to support the parameterization of architecture-sensitive optimizations so that their configurations can be empirically tuned later. This POET output can then be ported to different machines together with the user application, where a POET-based search engine empirically reconfigures the parameterized optimizations until satisfactory performance is found. Computational specialists can write POET scripts to directly control the optimization of their code. Application developers can interact with ROSE to obtain optimization feedback as well as provide domain-specific knowledge and high-level optimization strategies. The optimization environment is expected to support different levels of automation and programmer intervention, from fully-automated tuning to semi-automated development and to manual programmable control.
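    The empirical-tuning loop performed by the POET-based search engine can be illustrated in miniature: enumerate configurations of a parameterized optimization, run and time each candidate, and keep the fastest. The sketch below is a generic stand-in for that idea, not POET's actual interface; the parameter name and the blocked-sum kernel are invented for illustration.

```python
import itertools
import time

def tune(run_candidate, search_space):
    """Exhaustively time each parameter configuration; return the fastest."""
    best_cfg, best_time = None, float("inf")
    for combo in itertools.product(*search_space.values()):
        params = dict(zip(search_space.keys(), combo))
        start = time.perf_counter()
        run_candidate(**params)
        elapsed = time.perf_counter() - start
        if elapsed < best_time:
            best_cfg, best_time = params, elapsed
    return best_cfg, best_time

# Stand-in for a parameterized optimization: a blocked reduction whose
# cost depends (weakly) on the block size chosen by the tuner.
def kernel(block):
    data = list(range(10000))
    total = 0
    for i in range(0, len(data), block):
        total += sum(data[i:i + block])
    return total

best, elapsed = tune(kernel, {"block": [16, 64, 256, 1024]})
print("best configuration:", best)
```

    A real autotuner would additionally average over repeated runs and prune the search space heuristically, since exhaustive search grows combinatorially with the number of tunable parameters.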

  1. Collaboratively planning for medicines administration competency: a survey evaluation.

    PubMed

    Hemingway, Steve; Baxter, Hazel; Smith, George; Burgess-Dawson, Rebecca; Dewhirst, Kate

    2011-04-01

    This survey evaluated the experiences of mental health nurses who had undergone assessment of their competence in the administration of medicines using established assessment frameworks. Medicines management activities have at times been widely criticized. Joint collaborations between Higher Education Authorities and the National Health Service in education and training can start to address some of these criticisms. A questionnaire using 22 closed and open response questions was distributed to 827 practising mental health nurses and 44 graduate mental health nurses. A total of 70 registered and 41 graduate mental health nurses who had completed the assessment of administration competency frameworks responded to the survey. Response rates were 24 and 96%, respectively. The assessment frameworks were received positively. Environmental factors were perceived as the main barrier to medicines safety; however, this was not reflected in how this aspect of the competency framework was perceived. The administration of medicines is an area of mental health nursing, and indeed of all fields of nursing practice, that needs attention. The use of competency frameworks as outlined in the 'Medicine with Respect Project' is one strategy to achieve improvement in this essential clinical skill. © 2011 The Authors. Journal compilation © 2011 Blackwell Publishing Ltd.

  2. ProteoWizard: open source software for rapid proteomics tools development.

    PubMed

    Kessner, Darren; Chambers, Matt; Burke, Robert; Agus, David; Mallick, Parag

    2008-11-01

    The ProteoWizard software project provides a modular and extensible set of open-source, cross-platform tools and libraries. The tools perform proteomics data analyses; the libraries enable rapid tool creation by providing a robust, pluggable development framework that simplifies and unifies data file access, and performs standard proteomics and LCMS dataset computations. The library, written using modern C++ techniques and design principles, contains readers and writers for the mzML data format and supports a variety of platforms with native compilers. The software has been released under the Apache v2 license specifically to ensure it can be used in both academic and commercial projects. In addition to the library, we also introduce a rapidly growing set of companion tools whose implementation helps to illustrate the simplicity of developing applications on top of the ProteoWizard library. Cross-platform software that compiles using native compilers (i.e. GCC on Linux, MSVC on Windows and XCode on OSX) is available for download free of charge at http://proteowizard.sourceforge.net. This website also provides code examples and documentation. It is our hope that the ProteoWizard project will become a standard platform for proteomics development; consequently, code use, contribution and further development are strongly encouraged.
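    The "pluggable" file-access idea the abstract describes — format-specific readers registered behind a single entry point, so tools never depend on a particular file format — can be sketched with a simple registry pattern. All names here are illustrative; this is not ProteoWizard's actual C++ API.

```python
READERS = {}

def register(extension):
    """Class decorator: associate a reader class with a file extension."""
    def deco(cls):
        READERS[extension] = cls
        return cls
    return deco

@register(".mzml")
class MzMLReader:
    def spectra(self, path):
        # Real parsing elided; a genuine reader would stream spectra
        # from the XML-based mzML file.
        return [f"spectrum parsed from {path}"]

def open_dataset(path):
    """Dispatch to whichever registered reader matches the file extension."""
    for ext, reader_cls in READERS.items():
        if path.endswith(ext):
            return reader_cls().spectra(path)
    raise ValueError(f"no reader registered for {path}")

print(open_dataset("run01.mzml"))
```

    Adding support for a new format then means registering one new reader class; every downstream tool built on `open_dataset` gains the format for free, which is the essence of the pluggable framework.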

  3. The Global Imperative for Teacher Education: Opportunities for Comparative and International Education

    ERIC Educational Resources Information Center

    Aydarova, Elena; Marquardt, Sheila K.

    2016-01-01

    In the context of globalization, teacher education has to respond to the global imperative by helping preservice teachers develop global consciousness and awareness (Apple, 2011; Zhao, 2010). This paper addresses this imperative by first identifying the spaces for global competencies in teacher education standards at the national, regional, state,…

  4. America’s Strategic Imperative: A National Energy Policy Manhattan Project

    DTIC Science & Technology

    2005-02-25

    Air War College, Air University. "America's Strategic Imperative: A National Energy Policy Manhattan Project," by John M. Amidon, Lt Col, USAF; report dated 25 February 2005, covering the period 2005. Only standard report-form fields and table-of-contents fragments ("Peak", "A Manhattan Project for Energy") survive in this DTIC record; the abstract text is not recoverable.

  5. A Model-Based Diagnosis Framework for Distributed Systems

    DTIC Science & Technology

    2002-05-04

    The text of this DTIC record is garbled. Recoverable fragments indicate a discussion of centralized compilation techniques as applied to several areas, of which diagnosis is one (citing a survey by Marco Cadoli and Francesco M. Donini); of tree-structured systems; and of a diagnosis synthesis algorithm that assigns a likelihood weight ri to each assumable Ai, i = 1, ..., m, combined using a likelihood algebra.

  6. There’s carbon in them thar hills: But how much? Could Pacific Northwest forests store more?

    Treesearch

    Andrea Watts; Andrew Gray; Thomas Whittier

    2017-01-01

    As a signatory to the United Nations Framework Convention on Climate Change, the United States annually compiles a report on the nation’s carbon flux—the amount of carbon emitted into the atmosphere compared to the amount stored by terrestrial landscapes. Forests store vast amounts of carbon, but it’s not fully understood how a forest’s storage capacity fluctuates as...

  7. A network-based framework for assessing infrastructure resilience: a case study of the London metro system.

    PubMed

    Chopra, Shauhrat S; Dillon, Trent; Bilec, Melissa M; Khanna, Vikas

    2016-05-01

    Modern society is increasingly dependent on the stability of a complex system of interdependent infrastructure sectors. It is imperative to build resilience of large-scale infrastructures like metro systems to address the threat of natural disasters and man-made attacks in urban areas. Analysis is needed to ensure that these systems are capable of withstanding and containing unexpected perturbations, and to develop heuristic strategies for guiding the design of more resilient networks in the future. We present a comprehensive, multi-pronged framework that analyses information on network topology, spatial organization and passenger flow to understand the resilience of the London metro system. The topology of the London metro system is not fault tolerant in terms of maintaining connectivity at the periphery of the network, since it does not exhibit small-world properties. The passenger strength distribution follows a power law, suggesting that while the London metro system is robust to random failures, it is vulnerable to disruptions at a few critical stations. The analysis further identifies particular sources of structural and functional vulnerabilities that need to be mitigated for improving the resilience of the London metro network. The insights from our framework provide useful strategies to build resilience for both existing and upcoming metro systems. © 2016 The Author(s).
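    The contrast the abstract draws — robustness to random failures but vulnerability at a few critical, high-traffic nodes — is the classic behaviour of networks with power-law degree distributions, and can be demonstrated on a synthetic scale-free graph. The graph below is a generic Barabási-Albert model, not real metro data; node counts and seeds are arbitrary.

```python
import random
import networkx as nx

def giant_component(G):
    """Size of the largest connected component."""
    return max((len(c) for c in nx.connected_components(G)), default=0)

# Synthetic scale-free network standing in for a transport system.
G = nx.barabasi_albert_graph(300, 2, seed=1)

# Random failures: remove 10 arbitrary nodes.
G_rand = G.copy()
G_rand.remove_nodes_from(random.Random(1).sample(list(G.nodes), 10))

# Targeted attack: remove the 10 highest-degree hubs.
hubs = sorted(G.degree, key=lambda kv: kv[1], reverse=True)[:10]
G_targ = G.copy()
G_targ.remove_nodes_from([n for n, _ in hubs])

print("giant component after random failures:", giant_component(G_rand))
print("giant component after targeted attack:", giant_component(G_targ))
```

    Removing the same number of nodes fragments the network far more when the removals target hubs, which mirrors the paper's finding that a few critical stations dominate the system's vulnerability.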

  8. Evolution in agriculture: the application of evolutionary approaches to the management of biotic interactions in agro-ecosystems

    PubMed Central

    Thrall, Peter H; Oakeshott, John G; Fitt, Gary; Southerton, Simon; Burdon, Jeremy J; Sheppard, Andy; Russell, Robyn J; Zalucki, Myron; Heino, Mikko; Ford Denison, R

    2011-01-01

    Anthropogenic impacts increasingly drive ecological and evolutionary processes at many spatio-temporal scales, demanding greater capacity to predict and manage their consequences. This is particularly true for agro-ecosystems, which not only comprise a significant proportion of land use, but which also involve conflicting imperatives to expand or intensify production while simultaneously reducing environmental impacts. These imperatives reinforce the likelihood of further major changes in agriculture over the next 30–40 years. Key transformations include genetic technologies as well as changes in land use. The use of evolutionary principles is not new in agriculture (e.g. crop breeding, domestication of animals, management of selection for pest resistance), but given land-use trends and other transformative processes in production landscapes, ecological and evolutionary research in agro-ecosystems must consider such issues in a broader systems context. Here, we focus on biotic interactions involving pests and pathogens as exemplars of situations where integration of agronomic, ecological and evolutionary perspectives has practical value. Although their presence in agro-ecosystems may be new, many traits involved in these associations evolved in natural settings. We advocate the use of predictive frameworks based on evolutionary models as pre-emptive management tools and identify some specific research opportunities to facilitate this. We conclude with a brief discussion of multidisciplinary approaches in applied evolutionary problems. PMID:25567968

  9. Designing climate change mitigation plans that add up.

    PubMed

    Bajželj, Bojana; Allwood, Julian M; Cullen, Jonathan M

    2013-07-16

    Mitigation plans to combat climate change depend on the combined implementation of many abatement options, but the options interact. Published anthropogenic emissions inventories are disaggregated by gas, sector, country, or final energy form. This allows the assessment of novel energy supply options, but is insufficient for understanding how options for efficiency and demand reduction interact. A consistent framework for understanding the drivers of emissions is therefore developed, with a set of seven complete inventories reflecting all technical options for mitigation connected through lossless allocation matrices. The required data set is compiled and calculated from a wide range of industry, government, and academic reports. The framework is used to create a global Sankey diagram to relate human demand for services to anthropogenic emissions. The application of this framework is demonstrated through a prediction of per-capita emissions based on service demand in different countries, and through an example showing how the "technical potentials" of a set of separate mitigation options should be combined.
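    The "lossless allocation matrices" connecting the inventories can be illustrated with a toy calculation: each matrix allocates the quantities in one inventory across the categories of the next, with columns summing to one so that totals are conserved along the chain from service demand to emissions. All numbers below are invented for illustration and bear no relation to the paper's data.

```python
import numpy as np

# Service demand inventory (arbitrary units), e.g. transport and heating.
services = np.array([100.0, 50.0])

# Allocation of services to final energy forms; each COLUMN sums to 1,
# which is what makes the allocation "lossless".
svc_to_energy = np.array([[0.7, 0.1],
                          [0.3, 0.9]])

# Allocation of energy forms to emission categories; columns again sum to 1.
energy_to_emissions = np.array([[0.4, 0.8],
                                [0.6, 0.2]])

energy = svc_to_energy @ services
emissions = energy_to_emissions @ energy

# Conservation: the total is unchanged at every stage of the chain.
assert np.isclose(energy.sum(), services.sum())
assert np.isclose(emissions.sum(), services.sum())
print(emissions)  # → [90. 60.]
```

    Because every stage conserves the total, interactions between abatement options become explicit: an efficiency measure that shrinks one entry of `services` propagates consistently through every downstream inventory, rather than being double-counted.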

  10. First-principles definition and measurement of planetary electromagnetic-energy budget.

    PubMed

    Mishchenko, Michael I; Lock, James A; Lacis, Andrew A; Travis, Larry D; Cairns, Brian

    2016-06-01

    The imperative to quantify the Earth's electromagnetic-energy budget with an extremely high accuracy has been widely recognized but has never been formulated in the framework of fundamental physics. In this paper we give a first-principles definition of the planetary electromagnetic-energy budget using the Poynting-vector formalism and discuss how it can, in principle, be measured. Our derivation is based on an absolute minimum of theoretical assumptions, is free of outdated notions of phenomenological radiometry, and naturally leads to the conceptual formulation of an instrument called the double hemispherical cavity radiometer (DHCR). The practical measurement of the planetary energy budget would require flying a constellation of several dozen planet-orbiting satellites hosting identical well-calibrated DHCRs.

  11. First-principles definition and measurement of planetary electromagnetic-energy budget

    NASA Astrophysics Data System (ADS)

    Mishchenko, M. I.; James, L.; Lacis, A. A.; Travis, L. D.; Cairns, B.

    2016-12-01

    The imperative to quantify the Earth's electromagnetic-energy budget with an extremely high accuracy has been widely recognized but has never been formulated in the framework of fundamental physics. In this talk we give a first-principles definition of the planetary electromagnetic-energy budget using the Poynting-vector formalism and discuss how it can, in principle, be measured. Our derivation is based on an absolute minimum of theoretical assumptions, is free of outdated concepts of phenomenological radiometry, and naturally leads to the conceptual formulation of an instrument called the double hemispherical cavity radiometer (DHCR). The practical measurement of the planetary energy budget would require flying a constellation of several dozen planet-orbiting satellites hosting identical well-calibrated DHCRs.

  12. First-Principles Definition and Measurement of Planetary Electromagnetic-Energy Budget

    NASA Technical Reports Server (NTRS)

    Mishchenko, Michael I.; Lock, James A.; Lacis, Andrew A.; Travis, Larry D.; Cairns, Brian

    2016-01-01

    The imperative to quantify the Earth's electromagnetic-energy budget with an extremely high accuracy has been widely recognized but has never been formulated in the framework of fundamental physics. In this paper we give a first-principles definition of the planetary electromagnetic-energy budget using the Poynting-vector formalism and discuss how it can, in principle, be measured. Our derivation is based on an absolute minimum of theoretical assumptions, is free of outdated notions of phenomenological radiometry, and naturally leads to the conceptual formulation of an instrument called the double hemispherical cavity radiometer (DHCR). The practical measurement of the planetary energy budget would require flying a constellation of several dozen planet-orbiting satellites hosting identical well-calibrated DHCRs.

  13. Use of social media by healthcare professionals in Greece: an exploratory study.

    PubMed

    Apostolakis, Ioannis; Koulierakis, George; Berler, Alexander; Chryssanthou, Anargyros; Varlamis, Iraklis

    2012-01-01

    The continuously and rapidly changing landscape in the fields of communications, Internet and social media make it imperative for professionals to better understand the role of Information and Communication Technologies and their impact on everyday activities. Several frameworks have been proposed in order to capture various dimensions of social media and measure their impact on people's social, professional and other activities. The effect of social media and Web 2.0 applications on the healthcare sector is also significant. This paper examines Greek healthcare professionals' attitudes towards internet, social media and mobile technologies, explores their familiarity with social networks and associates their answers with their professional profile. The results of this exploratory study are discussed within the context of the growing international relevant literature.

  14. Endoscopic endonasal approaches for the management of skull base meningiomas. Selection criteria and clinical outcomes.

    PubMed

    Todeschini, Alexandre B; Otto, Bradley A; Carrau, Ricardo L; Prevedello, Daniel M

    2018-05-28

    Meningiomas are the most common primary intracranial tumor, arising from different locations, including the skull base. Despite advances in adjuvant treatments, surgical resection remains the main and best treatment for meningiomas. New surgical strategies, such as the endoscopic endonasal approach (EEA), have greatly contributed to achieving maximal safe resection while preserving the patient's neurological function. We have compiled this chapter based on the senior author's extensive experience and a review of the current literature. We review the surgical technique used at our institution and the most relevant aspects of patient selection when considering resecting a skull base meningioma using the EEA. Further consideration is given to some skull base meningiomas arising from specific locations, with some case examples. The EEA is not an ideal approach for every skull base meningioma. Careful evaluation of the neurovascular structures surrounding the tumor is imperative to select the appropriate surgical corridor for a safe resection. Nevertheless, for appropriately selected cases, the endoscopic technique is a very valuable tool, with some evidence that it is superior to the microscopic transcranial approach. A surgeon trained in both endoscopic and transcranial approaches is best placed to achieve the best patient outcome.

  15. Abdominal tumours in children: 3-D visualisation and surgical planning.

    PubMed

    Günther, P; Schenk, J P; Wunsch, R; Tröger, J; Waag, K L

    2004-10-01

    Solid abdominal tumours are of special importance in the field of paediatric surgery. Because it avoids cumulative irradiation and offers improved delineation of soft tissues, MRI is usually employed in children for diagnostic assessment. Compiling the radiologic information for surgical planning is often difficult with conventional methods. Newly improved and efficient 3-D volume rendering software is now available for visual reconstruction of tumour anatomy utilising segmentation and other special techniques. Because the intraoperative complication rate is close to 20% as described in the literature, optimal preoperative visualisation and planning would seem imperative. All children with solid abdominal tumours at Heidelberg University in the year 2002 were included in this study. MR examinations were performed with a 0.5 Tesla magnet using a standard protocol. All MR data were processed with VG Studio Max 1.1, converting the two-dimensional data into three-dimensional data. This report presents 15 cases using this special technique: 7 with abdominal neuroblastoma, 6 with nephroblastoma, 1 ganglioneuroma, and 1 ovarian teratoma. Our experience shows that a better understanding of the surgical anatomy, particularly regarding the surrounding organs and vasculature, can be helpful in decreasing the incidence of inadvertent intraoperative injuries to these structures.

  16. Phylogeny and Biogeography of Cyanobacteria and Their Produced Toxins

    PubMed Central

    Moreira, Cristiana; Vasconcelos, Vitor; Antunes, Agostinho

    2013-01-01

    Phylogeny is an evolutionary reconstruction of the past relationships of DNA or protein sequences and it can further be used as a tool to assess population structuring, genetic diversity and biogeographic patterns. In the microbial world, the concept that everything is everywhere is widely accepted. However, it is much debated whether microbes are easily dispersed globally or whether they, like many macro-organisms, have historical biogeographies. Biogeography can be defined as the science that documents the spatial and temporal distribution of a given taxa in the environment at local, regional and continental scales. Speciation, extinction and dispersal are proposed to explain the generation of biogeographic patterns. Cyanobacteria are a diverse group of microorganisms that inhabit a wide range of ecological niches and are well known for their toxic secondary metabolite production. Knowledge of the evolution and dispersal of these microorganisms is still limited, and further research to understand such topics is imperative. Here, we provide a compilation of the most relevant information regarding these issues to better understand the present state of the art as a platform for future studies, and we highlight examples of both phylogenetic and biogeographic studies in non-symbiotic cyanobacteria and cyanotoxins. PMID:24189276

  17. Integrated assessment of social and environmental sustainability dynamics in the Ganges-Brahmaputra-Meghna delta, Bangladesh

    NASA Astrophysics Data System (ADS)

    Nicholls, R. J.; Hutton, C. W.; Lázár, A. N.; Allan, A.; Adger, W. N.; Adams, H.; Wolf, J.; Rahman, M.; Salehin, M.

    2016-12-01

    Deltas provide diverse ecosystem services and benefits for their populations. At the same time, deltas are also recognised as one of the most vulnerable coastal environments, with a range of drivers operating at multiple scales, from global climate change and sea-level rise to deltaic-scale subsidence and land cover change. These drivers threaten these ecosystem services, which often provide livelihoods for the poorest communities in these regions. The imperative to maintain ecosystem services presents a development challenge: how to develop deltaic areas in ways that are sustainable and benefit all residents including the most vulnerable. Here we present an integrated framework to analyse changing ecosystem services in deltas and the implications for human well-being, focussing in particular on the provisioning ecosystem services of agriculture, inland and offshore capture fisheries, aquaculture and mangroves that directly support livelihoods. The framework is applied to the world's most populated delta, the Ganges-Brahmaputra-Meghna Delta within Bangladesh. The framework adopts a systemic perspective to represent the principal biophysical and socio-ecological components and their interaction. A range of methods are integrated within a quantitative framework, including biophysical and socio-economic modelling and analyses of governance through scenario development. The approach is iterative, with learning both within the project team and with national policy-making stakeholders. The analysis is used to explore physical and social outcomes for the delta under different scenarios and policy choices. We consider how the approach is transferable to other deltas and potentially other coastal areas.

  18. Beyond competencies: using a capability framework in developing practice standards for advanced practice nursing.

    PubMed

    O'Connell, Jane; Gardner, Glenn; Coyer, Fiona

    2014-12-01

    This paper presents a discussion on the application of a capability framework for advanced practice nursing standards/competencies. There is acceptance that competencies are useful and necessary for definition and education of practice-based professions. Competencies have been described as appropriate for practice in stable environments with familiar problems. Increasingly competencies are being designed for use in the health sector for advanced practice such as the nurse practitioner role. Nurse practitioners work in environments and roles that are dynamic and unpredictable necessitating attributes and skills to practice at advanced and extended levels in both familiar and unfamiliar clinical situations. Capability has been described as the combination of skills, knowledge, values and self-esteem which enables individuals to manage change, be flexible and move beyond competency. A discussion paper exploring 'capability' as a framework for advanced nursing practice standards. Data were sourced from electronic databases as described in the background section. As advanced practice nursing becomes more established and formalized, novel ways of teaching and assessing the practice of experienced clinicians beyond competency are imperative for the changing context of health services. Leading researchers into capability in health care state that traditional education and training in health disciplines concentrates mainly on developing competence. To ensure that healthcare delivery keeps pace with increasing demand and a continuously changing context there is a need to embrace capability as a framework for advanced practice and education. © 2014 John Wiley & Sons Ltd.

  19. Toward a Last Interglacial Compilation Using a Tephra-based Chronology: a Future Reference For Model-data Comparison

    NASA Astrophysics Data System (ADS)

    Bazin, L.; Govin, A.; Capron, E.; Nomade, S.; Lemieux-Dudon, B.; Landais, A.

    2017-12-01

    The Last Interglacial (LIG, 129-116 ka) is a key period to decipher the interactions between the different components of the climate system under warmer-than-preindustrial conditions. Modelling the LIG climate is now part of the CMIP6/PMIP4 targeted simulations. As a result, recent efforts have been made to propose surface temperature compilations focusing on the spatio-temporal evolution of the LIG climate, and not only on its peak warmth as previously proposed. However, the major limitation of these compilations remains the climatic alignment of records (e.g. temperature, foraminiferal δ18O) that is performed to define the sites' chronologies. Such methods prevent the proper discussion of phase relationships between the different sites. Thanks to recent developments in the Bayesian Datice dating tool, we are now able to build coherent multi-archive chronologies with a proper propagation of the associated uncertainties. We make the best use of common tephra layers identified in well-dated continental archives and marine sediment cores of the Mediterranean region to propose a coherent chronological framework for the LIG independent of any climatic assumption. We then extend this precise chronological context to the North Atlantic as a first step toward a global coherent compilation of surface temperature and stable isotope records. Based on this synthesis, we propose guidelines for the interpretation of different proxies measured from different archives that will be compared with climate model parameters. Finally, we present time-slices (e.g. 127 ka) of the preliminary regional synthesis of temperature reconstructions and stable isotopes to serve as reference for future model-data comparison of the up-coming CMIP6/PMIP4 LIG simulations.

  20. Eli Lilly and Company's bioethics framework for human biomedical research.

    PubMed

    Van Campen, Luann E; Therasse, Donald G; Klopfenstein, Mitchell; Levine, Robert J

    2015-11-01

    Current ethics and good clinical practice guidelines address various aspects of pharmaceutical research and development, but do not comprehensively address the bioethical responsibilities of sponsors. To fill this void, in 2010 Eli Lilly and Company developed and implemented a Bioethics Framework for Human Biomedical Research to guide ethical decisions. (See our companion article that describes how the framework was developed and implemented and provides a critique of its usefulness and limitations.) This paper presents the actual framework that serves as a company resource for employee education and bioethics deliberations. The framework consists of four basic ethical principles and 13 essential elements for ethical human biomedical research and resides within the context of our company's mission, vision and values. For each component of the framework, we provide a high-level overview followed by a detailed description with cross-references to relevant well regarded guidance documents. The principles and guidance described should be familiar to those acquainted with research ethics. Therefore the novelty of the framework lies not in the foundational concepts presented as much as the attempt to specify and compile a sponsor's bioethical responsibilities to multiple stakeholders into one resource. When such a framework is employed, it can serve as a bioethical foundation to inform decisions and actions throughout clinical planning, trial design, study implementation and closeout, as well as to inform company positions on bioethical issues. The framework is, therefore, a useful tool for translating ethical aspirations into action - to help ensure pharmaceutical human biomedical research is conducted in a manner that aligns with consensus ethics principles, as well as a sponsor's core values.

  1. Robust feature extraction for rapid classification of damage in composites

    NASA Astrophysics Data System (ADS)

    Coelho, Clyde K.; Reynolds, Whitney; Chattopadhyay, Aditi

    2009-03-01

    The ability to detect anomalies in signals from sensors is imperative for structural health monitoring (SHM) applications. Many of the candidate algorithms for these applications either require many training examples or are computationally inefficient for large sample sizes. The damage detection framework presented in this paper uses a combination of Linear Discriminant Analysis (LDA) along with Support Vector Machines (SVM) to obtain a computationally efficient classification scheme for rapid damage state determination. LDA was used for feature extraction of damage signals from piezoelectric sensors on a composite plate, and these features were used to train the SVM algorithm in parts, reducing the computational intensity associated with the quadratic optimization problem that needs to be solved during training. SVM classifiers were organized into a binary tree structure to speed up classification, which also reduces the total training time required. This framework was validated on composite plates that were impacted at various locations. The results show that the algorithm was able to correctly predict the different impact damage cases in composite laminates using less than 21 percent of the total available training data after data reduction.
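    The LDA-then-SVM pipeline described above can be sketched with standard tools on synthetic data standing in for the piezoelectric sensor signals. Note the simplification: the paper arranges pairwise SVMs in a binary tree to speed up classification, whereas this sketch uses scikit-learn's default one-vs-one multiclass SVC; the dataset parameters are likewise invented.

```python
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

# Synthetic stand-in for sensor features: 4 "damage state" classes.
X, y = make_classification(n_samples=300, n_features=40, n_informative=10,
                           n_classes=4, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# LDA projects to at most (n_classes - 1) discriminant directions,
# then the SVM classifies in that reduced space.
clf = make_pipeline(LinearDiscriminantAnalysis(n_components=3),
                    SVC(kernel="rbf"))
clf.fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```

    The dimensionality reduction is what buys the efficiency: the SVM's quadratic optimization runs over 3 features instead of 40, and a tree of binary classifiers (as in the paper) would further cut the number of SVM evaluations per prediction from O(k²) pairwise votes to O(log k).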

  2. Alternative synthetic approaches for metal-organic frameworks: transformation from solid matters.

    PubMed

    Zhan, Guowu; Zeng, Hua Chun

    2016-12-20

    Developing economic and sustainable synthetic strategies for metal-organic frameworks (MOFs) is imperative for promoting MOF materials into large scale industrial use. Very recently, an alternative strategy for MOF synthesis by using solvent-insoluble "solid matters" as cation reservoirs and/or templates has been developed to accomplish this goal, in which the solid matters often refer to metals, metal oxides, hydroxides, carbonates, and so forth, but excluding the soluble metal salts which have been prevailingly used in MOF synthesis. Although most of the pioneering activities in this field have just started in the past 5 years, remarkable achievements have been made covering the synthesis, functionalization, positioning, and applications. A great number of MOFs in powder form, thin-films, or membranes, have been prepared through such solid-to-MOF transformations. This field is rapidly developing and expanding, and the number of related scientific publications has strikingly increased over the last few years. The aim of this review is to summarise the latest developments, highlight the present state-of-the-art, and also provide an overview for future research directions.

  3. Virtual reality disaster training: translation to practice.

    PubMed

    Farra, Sharon L; Miller, Elaine T; Hodgson, Eric

    2015-01-01

    Disaster training is crucial to the mitigation of both mortality and morbidity associated with disasters. Just as clinical practice needs to be grounded in evidence, effective disaster education is dependent upon the development and use of andragogic and pedagogic evidence. Educational research findings must be transformed into usable education strategies. Virtual reality simulation (VRS) is a teaching methodology that has the potential to be a powerful educational tool. The purpose of this article is to translate research findings related to the use of virtual reality simulation in disaster training into education practice. The Ace Star Model serves as a valuable framework to translate the VRS teaching methodology and improve disaster training of healthcare professionals. Using the Ace Star Model as a framework to put evidence into practice, strategies for implementing a virtual reality simulation are addressed. Practice guidelines, implementation recommendations, integration into practice and evaluation are discussed. It is imperative that health educators provide more exemplars of how research evidence can be moved through the various stages of the model to advance practice and sustain learning outcomes. Copyright © 2013 Elsevier Ltd. All rights reserved.

  4. The Imperatives of Tactical Level Maintenance,

    DTIC Science & Technology

    1986-11-24

    consist of the imperatives of tactical level maintenance. These imperatives are: fix forward; provision of repair parts supply in a responsive manner...Fix forward; Provide responsive repair parts supply support; Conduct responsive recovery and evacuation operations; Establish and maintain...the primary battlefield source of supply becomes the maintenance system. The role of maintenance, therefore, is to assist in the...

  5. C++QEDv2 Milestone 10: A C++/Python application-programming framework for simulating open quantum dynamics

    NASA Astrophysics Data System (ADS)

    Sandner, Raimar; Vukics, András

    2014-09-01

    The v2 Milestone 10 release of C++QED is primarily a feature release, which also corrects some problems of the previous release, especially as regards the build system. The adoption of C++11 features has led to many simplifications in the codebase. A full doxygen-based API manual [1] is now provided together with updated user guides. A largely automated, versatile new testsuite directed both towards computational and physics features allows for quickly spotting arising errors. The states of trajectories are now savable and recoverable with full binary precision, allowing for trajectory continuation regardless of evolution method (single/ensemble Monte Carlo wave-function or Master equation trajectory). As the main new feature, the framework now presents Python bindings to the highest-level programming interface, so that actual simulations for given composite quantum systems can now be performed from Python. Catalogue identifier: AELU_v2_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AELU_v2_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: yes No. of lines in distributed program, including test data, etc.: 492422 No. of bytes in distributed program, including test data, etc.: 8070987 Distribution format: tar.gz Programming language: C++/Python. Computer: i386-i686, x86 64. Operating system: In principle cross-platform, as yet tested only on UNIX-like systems (including Mac OS X). RAM: The framework itself takes about 60MB, which is fully shared. The additional memory taken by the program which defines the actual physical system (script) is typically less than 1MB. The memory storing the actual data scales with the system dimension for state-vector manipulations, and the square of the dimension for density-operator manipulations. This might easily be GBs, and often the memory of the machine limits the size of the simulated system. Classification: 4.3, 4.13, 6.2. 
External routines: Boost C++ libraries, GNU Scientific Library, Blitz++, FLENS, NumPy, SciPy Catalogue identifier of previous version: AELU_v1_0 Journal reference of previous version: Comput. Phys. Comm. 183 (2012) 1381 Does the new version supersede the previous version?: Yes Nature of problem: Definition of (open) composite quantum systems out of elementary building blocks [2,3]. Manipulation of such systems, with emphasis on dynamical simulations such as Master-equation evolution [4] and Monte Carlo wave-function simulation [5]. Solution method: Master equation, Monte Carlo wave-function method Reasons for new version: The new version is mainly a feature release, but it does correct some problems of the previous version, especially as regards the build system. Summary of revisions: An example is given of a typical Python script implementing the ring-cavity system presented in Sec. 3.3 of Ref. [2]. Restrictions: Total dimensionality of the system. Master equation: a few thousand. Monte Carlo wave-function trajectory: several million. Unusual features: Because of the heavy use of compile-time algorithms, compilation of programs written in the framework may take a long time and much memory (up to several GBs). Additional comments: The framework is not a program, but provides and implements an application-programming interface for developing simulations in the indicated problem domain. We use several C++11 features, which limits the range of supported compilers (g++ 4.7, clang++ 3.1). Documentation: http://cppqed.sourceforge.net/ Running time: Depending on the magnitude of the problem, can vary from a few seconds to weeks. References: [1] Entry point: http://cppqed.sf.net [2] A. Vukics, C++QEDv2: The multi-array concept and compile-time algorithms in the definition of composite quantum systems, Comput. Phys. Comm. 183 (2012) 1381. [3] A. Vukics, H. Ritsch, C++QED: an object-oriented framework for wave-function simulations of cavity QED systems, Eur. Phys. J. 
D 44 (2007) 585. [4] H. J. Carmichael, An Open Systems Approach to Quantum Optics, Springer, 1993. [5] J. Dalibard, Y. Castin, K. Molmer, Wave-function approach to dissipative processes in quantum optics, Phys. Rev. Lett. 68 (1992) 580.

  6. Increasing Usability in Ocean Observing Systems

    NASA Astrophysics Data System (ADS)

    Chase, A. C.; Gomes, K.; O'Reilly, T.

    2005-12-01

    As observatory systems move to more advanced techniques for instrument configuration and data management, standardized frameworks are being developed to benefit from commodities of scale. ACE (A Configuror and Editor) is a tool that was developed for SIAM (Software Infrastructure and Application for MOOS), a framework for the seamless integration of self-describing plug-and-work instruments into the Monterey Ocean Observing System. As a comprehensive solution, the SIAM infrastructure requires a number of processes to be run to configure an instrument for use within its framework. As solutions move from the lab to the field, the steps needed to implement the solution must be made bulletproof so that they may be used in the field with confidence. Loosely defined command line interfaces don't always provide enough user feedback, and business logic can be difficult to maintain over a series of scripts. ACE is a tool developed for guiding the user through a number of complicated steps, removing the reliance on command-line utilities and reducing the difficulty of completing the necessary steps, while also preventing operator error and enforcing system constraints. Utilizing the cross-platform nature of the Java programming language, ACE provides a complete solution for deploying an instrument within the SIAM infrastructure without depending on special software being installed on the user's computer. Requirements such as the installation of a Unix emulator for users running Windows machines, and the installation of, and ability to use, a CVS client, have all been removed by providing the equivalent functionality from within ACE. 
In order to achieve a "one stop shop" for configuring instruments, ACE had to be written to handle a wide variety of functionality including: compiling Java code, interacting with a CVS server and maintaining client-side CVS information, editing XML, interacting with a server-side database, and negotiating serial port communications through Java. This paper will address the relative tradeoffs of including all the aforementioned functionality in a single tool, its effects on user adoption of the framework (SIAM) it provides access to, as well as further discussion of some of the functionality generally pertinent to data management (XML editing, source code management and compilation, etc.).

  7. Compiling Holocene RSL databases from near- to far-field regions: proxies, difficulties and possible solutions

    NASA Astrophysics Data System (ADS)

    Vacchi, M.; Horton, B.; Mann, T.; Engelhart, S. E.; Rovere, A.; Nikitina, D.; Bender, M.; Roy, K.; Peltier, W. R.

    2017-12-01

    Reconstructions of relative sea level (RSL) have implications for investigation of crustal movements, calibration of earth rheology models and the reconstruction of ice sheets. In recent years, efforts were made to create RSL databases following a standardized methodology. These regional databases provide a framework for developing our understanding of the primary mechanisms of RSL change since the Last Glacial Maximum and a long-term baseline against which to gauge changes in sea level during the 20th century and forecasts for the 21st. We report here the results of recently compiled databases in very different climatic and geographic contexts that are the northeastern Canadian coast, the Mediterranean Sea as well as the southeastern Asiatic region. Our re-evaluation of sea-level indicators from geological and archaeological investigations have yielded more than 3000 RSL data-points mainly from salt and freshwater wetlands or adjacent estuarine sediment, isolation basins, beach ridges, fixed biological indicators, beachrocks as well as coastal archaeological structures. We outline some of the inherent difficulties, and potential solutions to analyse sea-level data in such different depositional environments. In particular, we discuss problems related with the definition of standardized indicative meaning, and with the re-evaluation of old radiocarbon samples. We further address complex tectonics influences and the framework to compare such large variability of RSL data-points. Finally we discuss the implications of our results for the patterns of glacio-isostatic adjustment in these regions.
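    A standardized database of this kind reduces each indicator to a relative sea-level value with an explicit vertical uncertainty derived from its indicative meaning. As a minimal sketch (field names and the quadrature error model are illustrative conventions, not the databases' actual schema), the conversion subtracts the reference water level from the measured elevation and combines the measurement error with half the indicative range:

```python
import math

def to_rsl_datapoint(sample_elev_m, elev_err_m, rwl_m, indicative_range_m):
    """Convert a sea-level indicator to an RSL data point.

    RSL = sample elevation - reference water level (RWL); the vertical
    error combines the elevation measurement error and half the
    indicative range in quadrature.
    """
    rsl = sample_elev_m - rwl_m
    err = math.sqrt(elev_err_m**2 + (indicative_range_m / 2.0)**2)
    return rsl, err
```

    For example, a sample surveyed at 1.0 ± 0.1 m with a reference water level of 0.5 m and a 0.6 m indicative range yields RSL = 0.5 m with roughly ±0.32 m vertical uncertainty.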

  8. TOPPE: A framework for rapid prototyping of MR pulse sequences.

    PubMed

    Nielsen, Jon-Fredrik; Noll, Douglas C

    2018-06-01

    To introduce a framework for rapid prototyping of MR pulse sequences. We propose a simple file format, called "TOPPE", for specifying all details of an MR imaging experiment, such as gradient and radiofrequency waveforms and the complete scan loop. In addition, we provide a TOPPE file "interpreter" for GE scanners, which is a binary executable that loads TOPPE files and executes the sequence on the scanner. We also provide MATLAB scripts for reading and writing TOPPE files and previewing the sequence prior to hardware execution. With this setup, the task of the pulse sequence programmer is reduced to creating TOPPE files, eliminating the need for hardware-specific programming. No sequence-specific compilation is necessary; the interpreter only needs to be compiled once (for every scanner software upgrade). We demonstrate TOPPE in three different applications: k-space mapping, non-Cartesian PRESTO whole-brain dynamic imaging, and myelin mapping in the brain using inhomogeneous magnetization transfer. We successfully implemented and executed the three example sequences. By simply changing the various TOPPE sequence files, a single binary executable (interpreter) was used to execute several different sequences. The TOPPE file format is a complete specification of an MR imaging experiment, based on arbitrary sequences of a (typically small) number of unique modules. Along with the GE interpreter, TOPPE comprises a modular and flexible platform for rapid prototyping of new pulse sequences. Magn Reson Med 79:3128-3134, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
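    The core idea, a small table of reusable modules plus a scan loop that references them, can be shown schematically. The fragment below is a toy Python analogue, not TOPPE's actual file format; the module names, fields, and duration bookkeeping are invented for illustration.

```python
# Toy analogue of a module-based sequence description: a small set of
# reusable modules plus a scan loop that references them by name.
# All names and fields are illustrative, not TOPPE's actual format.
modules = {
    "tipdown": {"duration_ms": 2.0},  # RF excitation module
    "readout": {"duration_ms": 5.0},  # data-acquisition module
    "spoiler": {"duration_ms": 1.0},  # gradient spoiler module
}

def build_scanloop(n_views):
    """One iteration per phase-encode view; each entry names a module
    and carries per-repetition parameters (here just the view index)."""
    loop = []
    for view in range(n_views):
        loop += [("tipdown", {}), ("readout", {"view": view}), ("spoiler", {})]
    return loop

def total_duration_ms(modules, scanloop):
    """Validate module references and sum the sequence duration."""
    for name, _params in scanloop:
        if name not in modules:
            raise ValueError(f"scan loop references unknown module {name!r}")
    return sum(modules[name]["duration_ms"] for name, _ in scanloop)
```

    With 64 views the loop holds 192 module executions for a 512 ms total; changing the sequence means editing only the module table and the loop, which mirrors how a single interpreter binary can execute many different TOPPE sequences.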

  9. The Health Impact Assessment (HIA) Resource and Tool ...

    EPA Pesticide Factsheets

    Health Impact Assessment (HIA) is a relatively new and rapidly emerging field in the U.S. An inventory of available HIA resources and tools was conducted, with a primary focus on resources developed in the U.S. The resources and tools available to HIA practitioners in the conduct of their work were identified through multiple methods and compiled into a comprehensive list. The compilation includes tools and resources related to the HIA process itself and those that can be used to collect and analyze data, establish a baseline profile, assess potential health impacts, and establish benchmarks and indicators for monitoring and evaluation. These resources include literature and evidence bases, data and statistics, guidelines, benchmarks, decision and economic analysis tools, scientific models, methods, frameworks, indices, mapping, and various data collection tools. Understanding the data, tools, models, methods, and other resources available to perform HIAs will help to advance the HIA community of practice in the U.S., improve the quality and rigor of assessments upon which stakeholder and policy decisions are based, and potentially improve the overall effectiveness of HIA to promote healthy and sustainable communities. The Health Impact Assessment (HIA) Resource and Tool Compilation is a comprehensive list of resources and tools that can be utilized by HIA practitioners with all levels of HIA experience to guide them throughout the HIA process. The HIA Resource

  10. The research imperative revisited: considerations for advancing the debate surrounding medical research as moral imperative.

    PubMed

    Wayne, Katherine; Glass, Kathleen Cranley

    2010-01-01

    Medical research is frequently regarded as not only laudable, but even obligatory. However, the moral foundation for such an obligation is far from clear. Lively debate concerning the viability of an obligation to conduct and support medical research is transpiring among a small number of scholars speaking from a variety of backgrounds, yet the current discussion is predominantly situated within several discrete academic and professional circles, allowing only sporadic engagement within and between scholarly disciplines and the medical realm. We aim to lay the groundwork for a focused critique of the "research imperative" by examining (1) its commitments within ideologies of science, medicine, and progress; and (2) its normative theoretical underpinnings. Our analysis finds no solid grounding for the research imperative and exposes problems in the attitudes and arguments supporting it. We believe these concerns present compelling reasons for devoting greater critical attention to the research imperative and to the morality of the medical research enterprise as a whole.

  11. "Notice the Similarities between the Two Sets …": Imperative Usage in a Corpus of Upper-Level Student Papers

    ERIC Educational Resources Information Center

    Neiderhiser, Justine A.; Kelley, Patrick; Kennedy, Kohlee M.; Swales, John M.; Vergaro, Carla

    2016-01-01

    The sparse literature on the use of imperatives in research papers suggests that they are relatively common in a small number of disciplines, but rare, if used at all, in others. The present study addresses the use of imperatives in a corpus of upper-level A-graded student papers from 16 disciplines. A total of 822 papers collected within the past…

  12. War without Oil: A Catalyst for True Transformation

    DTIC Science & Technology

    2006-02-17

    of oil consumed figure from Amidon, “America’s Strategic Imperative, A ‘ Manhattan Project ’ for...Strategic Imperative, A ‘ Manhattan Project ’ for Energy”, 70. 21 2 trillion barrel figure derived from EIA, “World Proved Reserves of Oil and Gas...Imperative, A ‘ Manhattan Project ’ for Energy”, 70. - 7 - Global Oil Supply/Demand 22Table 1 – 2004 Top 10 Petroleum Producers and Consumers

  13. Can the EVIDEM Framework Tackle Issues Raised by Evaluating Treatments for Rare Diseases: Analysis of Issues and Policies, and Context-Specific Adaptation.

    PubMed

    Wagner, Monika; Khoury, Hanane; Willet, Jacob; Rindress, Donna; Goetghebeur, Mireille

    2016-03-01

    The multiplicity of issues, including uncertainty and ethical dilemmas, and policies involved in appraising interventions for rare diseases suggests that multicriteria decision analysis (MCDA) based on a holistic definition of value is uniquely suited for this purpose. The objective of this study was to analyze and further develop a comprehensive MCDA framework (EVIDEM) to address rare disease issues and policies, while maintaining its applicability across disease areas. Specific issues and policies for rare diseases were identified through literature review. Ethical and methodological foundations of the EVIDEM framework v3.0 were systematically analyzed from the perspective of these issues, and policies and modifications of the framework were performed accordingly to ensure their integration. Analysis showed that the framework integrates ethical dilemmas and issues inherent to appraising interventions for rare diseases but required further integration of specific aspects. Modification thus included the addition of subcriteria to further differentiate disease severity, disease-specific treatment outcomes, and economic consequences of interventions for rare diseases. Scoring scales were further developed to include negative scales for all comparative criteria. A methodology was established to incorporate context-specific population priorities and policies, such as those for rare diseases, into the quantitative part of the framework. This design allows making more explicit trade-offs between competing ethical positions of fairness (prioritization of those who are worst off), the goal of benefiting as many people as possible, the imperative to help, and wise use of knowledge and resources. It also allows addressing variability in institutional policies regarding prioritization of specific disease areas, in addition to existing uncertainty analysis available from EVIDEM. 
The adapted framework measures value in its widest sense, while being responsive to rare disease issues and policies. It provides an operationalizable platform to integrate values, competing ethical dilemmas, and uncertainty in appraising healthcare interventions.

  14. Intelligent microchip networks: an agent-on-chip synthesis framework for the design of smart and robust sensor networks

    NASA Astrophysics Data System (ADS)

    Bosse, Stefan

    2013-05-01

    Sensorial materials consisting of high-density, miniaturized, and embedded sensor networks require new robust and reliable data processing and communication approaches. Structural health monitoring is one major field of application for sensorial materials. Each sensor node provides some kind of sensor, electronics, data processing, and communication with a strong focus on microchip-level implementation to meet the goals of miniaturization and low-power energy environments, a prerequisite for autonomous behaviour and operation. Reliability requires robustness of the entire system in the presence of node, link, data processing, and communication failures. Interaction between nodes is required to manage and distribute information. One common interaction model is the mobile agent. An agent approach provides stronger autonomy than a traditional object or remote-procedure-call based approach. Agents can decide for themselves which actions are performed, and they are capable of flexible behaviour, reacting to the environment and other agents, providing some degree of robustness. Traditionally, multi-agent systems are abstract programming models which are implemented in software and executed on program-controlled computer architectures. This approach does not scale well to the microchip level and requires fully equipped computers and communication structures, and the hardware architecture does not consider and reflect the requirements for agent processing and interaction. We propose and demonstrate a novel design paradigm for reliable distributed data processing systems and a synthesis methodology and framework for multi-agent systems implementable entirely on the microchip level with resource- and power-constrained digital logic supporting Agent-On-Chip architectures (AoC). The agent behaviour and mobility are fully integrated on the microchip using pipelined communicating processes implemented with finite-state machines and register-transfer logic. 
The agent behaviour, interaction (communication), and mobility features are modelled and specified on a machine-independent abstract programming level using a state-based agent behaviour language (APL). With this APL, a high-level agent compiler is able to synthesize a hardware model (RTL, VHDL), a software model (C, ML), or a simulation model (XML) suitable for simulating a multi-agent system using the SeSAm simulator framework. Agent communication is provided by a simple tuple-space database implemented on the node level, providing fault-tolerant access to global data. A novel synthesis development kit (SynDK) based on a graph-structured database approach is introduced to support the rapid development of compilers and synthesis tools, used, for example, for the design and implementation of the APL compiler.
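    The tuple-space style of agent communication mentioned above decouples agents: they coordinate by publishing tuples and pattern-matching on them rather than addressing each other directly. The sketch below is a minimal software analogue in Python; the paper's node-level implementation is register-transfer logic, and this API is invented for illustration.

```python
class TupleSpace:
    """Minimal tuple-space sketch: agents coordinate by writing tuples
    and pattern-matching on them (None acts as a wildcard field)."""

    def __init__(self):
        self._store = []

    def out(self, tup):
        """Publish a tuple into the space."""
        self._store.append(tup)

    def _match(self, tup, pattern):
        return len(tup) == len(pattern) and all(
            p is None or p == v for p, v in zip(pattern, tup))

    def rd(self, pattern):
        """Non-destructive read of the first matching tuple, or None."""
        for tup in self._store:
            if self._match(tup, pattern):
                return tup
        return None

    def inp(self, pattern):
        """Destructive read: take and remove the first matching tuple."""
        tup = self.rd(pattern)
        if tup is not None:
            self._store.remove(tup)
        return tup
```

    Because readers name only a pattern, a sensor agent can publish ("temp", node_id, value) and any consumer can retrieve it without knowing which node produced it, which is the property that gives the scheme its robustness to individual node failures.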

  15. Evolving Ethical Concepts

    ERIC Educational Resources Information Center

    Potter, Van Rensselaer

    1977-01-01

    Discusses the role of the scientist in changing ethical concepts from simple interpersonal and theological imperatives towards "survival imperatives that must form the core of environmental bioethics." (CS)

  16. eXframe: reusable framework for storage, analysis and visualization of genomics experiments

    PubMed Central

    2011-01-01

    Background: Genome-wide experiments are routinely conducted to measure gene expression, DNA-protein interactions and epigenetic status. Structured metadata for these experiments is imperative for a complete understanding of experimental conditions, to enable consistent data processing and to allow retrieval, comparison, and integration of experimental results. Even though several repositories have been developed for genomics data, only a few provide annotation of samples and assays using controlled vocabularies. Moreover, many of them are tailored for a single type of technology or measurement and do not support the integration of multiple data types. Results: We have developed eXframe, a reusable web-based framework for genomics experiments that provides (1) the ability to publish structured data compliant with accepted standards, (2) support for multiple data types including microarrays and next-generation sequencing, and (3) integrated query, analysis and visualization tools (enabled by consistent processing of the raw data and annotation of samples), and is available as open-source software. We present two case studies where this software is currently being used to build repositories of genomics experiments - one contains data from hematopoietic stem cells and another from Parkinson's disease patients. Conclusion: The web-based framework eXframe offers structured annotation of experiments as well as uniform processing and storage of molecular data from microarray and next-generation sequencing platforms. The framework allows users to query and integrate information across species, technologies, measurement types and experimental conditions. Our framework is reusable and freely modifiable - other groups or institutions can deploy their own custom web-based repositories based on this software. It is interoperable with the most important data formats in this domain. We hope that other groups will not only use eXframe, but also contribute their own useful modifications. 
PMID:22103807

  17. FOAM: the modular adaptive optics framework

    NASA Astrophysics Data System (ADS)

    van Werkhoven, T. I. M.; Homs, L.; Sliepen, G.; Rodenhuis, M.; Keller, C. U.

    2012-07-01

    Control software for adaptive optics systems is mostly custom built and very specific in nature. We have developed FOAM, a modular adaptive optics framework for controlling and simulating adaptive optics systems in various environments. Portability is provided both for different control hardware and adaptive optics setups. To achieve this, FOAM is written in C++ and runs on standard CPUs. Furthermore, we use standard Unix libraries and compilation procedures and implemented a hardware abstraction layer in FOAM. We have successfully implemented FOAM on the adaptive optics system of ExPo - a high-contrast imaging polarimeter developed at our institute - in the lab and will test it on-sky in late June 2012. We also plan to implement FOAM on adaptive optics systems for microscopy and solar adaptive optics. FOAM is available under the GNU GPL license and is free to be used by anyone.

  18. Towards an International Framework for Recommendations of Core Competencies in Nursing and Inter-Professional Informatics: The TIGER Competency Synthesis Project.

    PubMed

    Hübner, Ursula; Shaw, Toria; Thye, Johannes; Egbert, Nicole; Marin, Heimar; Ball, Marion

    2016-01-01

    Informatics competencies of the health care workforce must meet the requirements of inter-professional process and outcome oriented provision of care. In order to help nursing education transform accordingly, the TIGER Initiative deployed an international survey, with participation from 21 countries, to evaluate and prioritise a broad list of core competencies for nurses in five domains: 1) nursing management, 2) information technology (IT) management in nursing, 3) interprofessional coordination of care, 4) quality management, and 5) clinical nursing. Informatics core competencies were found highly important for all domains. In addition, this project compiled eight national case studies from Austria, Finland, Germany, Ireland, New Zealand, the Philippines, Portugal, and Switzerland that reflected the country-specific perspective. These findings will lead us to an international framework of informatics recommendations.

  19. Disaster Metrics: A Comprehensive Framework for Disaster Evaluation Typologies.

    PubMed

    Wong, Diana F; Spencer, Caroline; Boyd, Lee; Burkle, Frederick M; Archer, Frank

    2017-10-01

    Introduction: The frequency of disasters is increasing around the world with more people being at risk. There is a moral imperative to improve the way in which disaster evaluations are undertaken and reported with the aim of reducing preventable mortality and morbidity in future events. Disasters are complex events and undertaking disaster evaluations is a specialized area of study at an international level. Hypothesis/Problem: While some frameworks have been developed to support consistent disaster research and evaluation, they lack validation, consistent terminology, and standards for reporting across the different phases of a disaster. There is yet to be an agreed, comprehensive framework to structure disaster evaluation typologies. The aim of this paper is to outline an evolving comprehensive framework for disaster evaluation typologies. It is anticipated that this new framework will facilitate an agreement on identifying, structuring, and relating the various evaluations found in the disaster setting with a view to better understanding the process, outcomes, and impacts of the effectiveness and efficiency of interventions. Research was undertaken in two phases: (1) a scoping literature review (peer-reviewed and "grey literature") was undertaken to identify current evaluation frameworks and typologies used in the disaster setting; and (2) a structure was developed that included the range of typologies identified in Phase One and suggests possible relationships in the disaster setting. No core, unifying framework to structure disaster evaluation and research was identified in the literature. The authors propose a "Comprehensive Framework for Disaster Evaluation Typologies" that identifies, structures, and suggests relationships for the various typologies detected. 
The proposed Comprehensive Framework for Disaster Evaluation Typologies outlines the different typologies of disaster evaluations that were identified in this study and brings them together into a single framework. This unique, unifying framework has relevance at an international level and is expected to benefit the disaster, humanitarian, and development sectors. The next step is to undertake a validation process that will include international leaders with experience in evaluation, in general, and disasters specifically. This work promotes an environment for constructive dialogue on evaluations in the disaster setting to strengthen the evidence base for interventions across the disaster spectrum. It remains a work in progress. Wong DF , Spencer C , Boyd L , Burkle FM Jr. , Archer F . Disaster metrics: a comprehensive framework for disaster evaluation typologies. Prehosp Disaster Med. 2017;32(5):501-514.

  20. Modal Composition and Age of Intrusions in North-Central and Northeast Nevada

    USGS Publications Warehouse

    du Bray, Edward A.; Crafford, A. Elizabeth Jones

    2007-01-01

    Introduction Data presented in this report characterize igneous intrusions of north-central and northeast Nevada and were compiled as part of the Metallogeny of the Great Basin project conducted by the U.S. Geological Survey (USGS) between 2001 and 2007. The compilation pertains to the area bounded by lats 38.5 and 42 N., long 118.5 W., and the Nevada-Utah border (fig. 1). The area contains numerous large plutons and smaller stocks but also contains equally numerous smaller, shallowly emplaced intrusions, including dikes, sills, and endogenous dome complexes. Igneous intrusions (hereafter, intrusions) of multiple ages are major constituents of the geologic framework of north-central and northeast Nevada (Stewart and Carlson, 1978). Mesozoic and Cenozoic intrusions are particularly numerous and considered to be related to subduction along the west edge of the North American plate during this time. Henry and Ressel (2000) and Ressel and others (2000) have highlighted the association between magmatism and ore deposits along the Carlin trend. Similarly, Theodore (2000) has demonstrated the association between intrusions and ore deposits in the Battle Mountain area. Decades of geologic investigations in north-central and northeast Nevada (hereafter, the study area) demonstrate that most hydrothermal ore deposits are spatially, and probably temporally and genetically, associated with intrusions. Because of these associations, studies of many individual intrusions have been conducted, including those by a large number of Master's and Doctoral thesis students (particularly University of Nevada at Reno students and associated faculty), economic geologists working on behalf of exploration and mining companies, and USGS earth scientists. Although the volume of study area intrusions is large and many are associated with ore deposits, no synthesis of available data that characterize these rocks has been assembled. 
Compilations that have been produced for intrusions in Nevada pertain to relatively restricted geographic areas and (or) do not include the broad array of data that would best aid interpretation of these rocks. For example, Smith and others (1971) presented potassium-argon geochronologic and basic petrographic data for a limited number of intrusions in north-central Nevada. Similarly, Silberman and McKee (1971) presented potassium-argon geochronologic data for a significant number of central Nevada intrusions. More recently, Mortensen and others (2000) presented uranium-lead geochronology for a small number of central Nevada intrusions. Sloan and others (2003) released a national geochronologic database that contains age determinations made prior to 1991 for rocks of Nevada. Finally, C.D. Henry (Nevada Bureau of Mines and Geology, written commun., 2006) has assembled geochronologic data for igneous rocks of Nevada produced subsequent to completion of the Sloan and others (2003) compilation. Consequently, although age data for igneous rocks of Nevada have been compiled, data pertaining to other features of these rocks have not been systematically synthesized. Maldonado and others (1988) compiled the distribution and some basic characteristics of intrusions throughout Nevada. Lee (1984), John (1983, 1987, and 1992), John and others (1994), and Ressel (2005) have compiled data that partially characterize intrusions in some parts of the study area. This report documents the first phase of an effort to compile a robust database for study area intrusions; in this initial phase, modal composition and age data are synthesized. In the next phase, geochemical data available for these rocks will be compiled. The ultimate goal is to compile data as a basis for an evaluation of the time-space-compositional evolution of Mesozoic and Cenozoic magmatism in the study area and identification of genetic associations between magmatism and mineralizing processes in this region.

  1. COVERT: A Framework for Finding Buffer Overflows in C Programs via Software Verification

    DTIC Science & Technology

    2010-08-01

    is greater than the allocated size of B. In the case of a type-safe language or a language with runtime bounds checking (such as Java), an overflow...leads either to a (compile-time) type error or a (runtime) exception. In such languages, a buffer overflow can lead to a denial of service attack (i.e...of current and legacy software is written in unsafe languages (such as C or C++) that allow buffers to be overflowed with impunity. For reasons such as
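The contrast the excerpt draws, bounds-checked languages turning an overflow into an exception while C and C++ silently write past the buffer, can be illustrated from the safe-language side. This is a hypothetical sketch, not part of COVERT; Python checks bounds at runtime the same way Java does:

```python
# In a bounds-checked language, an out-of-range write raises an
# exception rather than corrupting adjacent memory. In C or C++,
# the same write would silently overwrite whatever follows the buffer.

buf = [0] * 8  # buffer B with allocated size 8


def write_at(buffer, index, value):
    """Attempt the write; report whether bounds checking caught it."""
    try:
        buffer[index] = value
        return "ok"
    except IndexError:
        # Worst case here is an uncaught exception (denial of service),
        # not memory corruption.
        return "bounds error"


print(write_at(buf, 3, 42))   # → ok
print(write_at(buf, 8, 99))   # → bounds error (one past the end)
```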

  2. Reasoning with case histories of process knowledge for efficient process development

    NASA Technical Reports Server (NTRS)

    Bharwani, Seraj S.; Walls, Joe T.; Jackson, Michael E.

    1988-01-01

    The significance of compiling case histories of empirical process knowledge and the role of such histories in improving the efficiency of manufacturing process development is discussed in this paper. Methods of representing important investigations as cases and using the information from such cases to eliminate redundancy of empirical investigations in analogous process development situations are also discussed. A system is proposed that uses such methods to capture the problem-solving framework of the application domain. A conceptual design of the system is presented and discussed.

  3. The evaluation life cycle: a retrospective assessment of stages and phases of the circles of care initiative.

    PubMed

    Bess, Gary; Allen, James; Deters, Pamela B

    2004-08-12

    A life cycle metaphor characterizes the evolving relationship between the evaluator and program staff. This framework suggests that common developmental dynamics occur in roughly the same order across groups and settings. There are stage-specific dynamics that begin with Pre-History, which characterize the relationship between the grantees and evaluator. The stages are: (a) Pre-History, (b) Process, (c) Development, (d) Action, (e) Findings-Compilation, and (f) Transition. The common dynamics, expectations, and activities for each stage are discussed.

  4. Using a multi-state Learning Community as an implementation strategy for immediate postpartum long-acting reversible contraception.

    PubMed

    DeSisto, Carla L; Estrich, Cameron; Kroelinger, Charlan D; Goodman, David A; Pliska, Ellen; Mackie, Christine N; Waddell, Lisa F; Rankin, Kristin M

    2017-11-21

    Implementation strategies are imperative for the successful adoption and sustainability of complex evidence-based public health practices. Creating a learning collaborative is one strategy that was part of a recently published compilation of implementation strategy terms and definitions. In partnership with the Centers for Disease Control and Prevention and other partner agencies, the Association of State and Territorial Health Officials recently convened a multi-state Learning Community to support cross-state collaboration and provide technical assistance for improving state capacity to increase access to long-acting reversible contraception (LARC) in the immediate postpartum period, an evidence-based practice with the potential for reducing unintended pregnancy and improving maternal and child health outcomes. During 2015-2016, the Learning Community included multi-disciplinary, multi-agency teams of state health officials, payers, clinicians, and health department staff from 13 states. This qualitative study was conducted to better understand the successes, challenges, and strategies that the 13 US states in the Learning Community used for increasing access to immediate postpartum LARC. We conducted telephone interviews with each team in the Learning Community. Interviews were semi-structured and organized by the eight domains of the Learning Community. We coded transcribed interviews for facilitators, barriers, and implementation strategies, using a recent compilation of expert-defined implementation strategies as a foundation for coding the latter. Data analysis showed three ways that the activities of the Learning Community helped in policy implementation work: structure and accountability, validity, and preparing for potential challenges and opportunities. 
Further, the qualitative data demonstrated that the Learning Community integrated six other implementation strategies from the literature: organize clinician implementation team meetings, conduct educational meetings, facilitation, promote network weaving, provide ongoing consultation, and distribute educational materials. Convening a multi-state learning collaborative is a promising approach for facilitating the implementation of new reimbursement policies for evidence-based practices complicated by systems challenges. By integrating several implementation strategies, the Learning Community serves as a meta-strategy for supporting implementation.

  5. A Survey of Techniques for Approximate Computing

    DOE PAGES

    Mittal, Sparsh

    2016-03-18

    Approximate computing trades off computation quality with the effort expended, and as rising performance demands confront plateauing resource budgets, approximate computing has become not merely attractive but even imperative. Here, we present a survey of techniques for approximate computing (AC). We discuss strategies for finding approximable program portions and monitoring output quality, techniques for using AC in different processing units (e.g., CPU, GPU and FPGA), processor components, memory technologies, etc., and programming frameworks for AC. Moreover, we classify these techniques based on several key characteristics to emphasize their similarities and differences. Finally, the aim of this paper is to provide insights to researchers into the working of AC techniques and to inspire more efforts in this area to make AC the mainstream computing approach in future systems.
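One of the simplest techniques in this space is loop perforation: processing only a fraction of loop iterations to trade output quality for reduced work. A minimal sketch (the function names and skip factor are illustrative, not from the survey):

```python
def exact_mean(xs):
    """Baseline: process every element."""
    return sum(xs) / len(xs)


def perforated_mean(xs, skip=2):
    """Loop perforation: process every `skip`-th element only,
    doing roughly 1/skip of the work for an approximate answer."""
    sampled = xs[::skip]
    return sum(sampled) / len(sampled)


data = list(range(1000))
print(exact_mean(data))          # → 499.5
print(perforated_mean(data, 4))  # → 498.0 (250 elements processed instead of 1000)
```

Monitoring output quality, as the survey discusses, would here mean bounding the gap between the two results and falling back to the exact path when the error exceeds a tolerance.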

  6. Children of suicide: the telling and the knowing.

    PubMed

    Cain, Albert C

    2002-01-01

    Amidst the still limited literature on survivors of suicide, and the particularly scanty literature on children of parental suicide, little focal attention has been given to the special issues surrounding surviving parents telling the children that their deceased parent's death was a suicide. Those few papers that deal with this topic have primarily emphasized the destructive consequences of not telling of the suicidal nature of the death, with imperatives to tell the children the whole truth and do so promptly post-death. Based primarily on clinical and preventive work with children of suicide, this absolutism and one-size-fits-all approach is questioned, the difference between being told and knowing accented and illustrated, and the nature and effects of surviving parent explanatory frameworks for the suicide--the 'why' of it--explored.

  7. Dementia and representative democracy: Exploring challenges and implications for democratic citizenship.

    PubMed

    Sonnicksen, Jared

    2016-05-01

    Despite growing recognition of the rights of people with dementia for full citizenship, issues related to democracy, whether from theoretical or practical perspectives, remain neglected. Especially since discourses on dementia have expanded to this rights-based approach, it is imperative to begin to examine the meanings and practices of democracy within a context of dementia. Accordingly, the purpose of this article is to assess implications of dementia in the context of democracy. Rather than surveying the variety of democratic concepts, it will focus the analytical framework on representative democracy and then outline several challenges to and for representative democracy and citizens with dementia. The intention is to begin to identify paths for ensuring representation, inclusion and participation for those who have dementia. © The Author(s) 2016.

  8. Quantum mechanical modeling the emission pattern and polarization of nanoscale light emitting diodes.

    PubMed

    Wang, Rulin; Zhang, Yu; Bi, Fuzhen; Frauenheim, Thomas; Chen, GuanHua; Yam, ChiYung

    2016-07-21

    Understanding of the electroluminescence (EL) mechanism in optoelectronic devices is imperative for further optimization of their efficiency and effectiveness. Here, a quantum mechanical approach is formulated for modeling the EL processes in nanoscale light emitting diodes (LED). Based on non-equilibrium Green's function quantum transport equations, interactions with the electromagnetic vacuum environment are included to describe electrically driven light emission in the devices. The presented framework is illustrated by numerical simulations of a silicon nanowire LED device. EL spectra of the nanowire device under different bias voltages are obtained and, more importantly, the radiation pattern and polarization of optical emission can be determined using the current approach. This work is an important step forward towards atomistic quantum mechanical modeling of the electrically induced optical response in nanoscale systems.

  9. Food security in a perfect storm: using the ecosystem services framework to increase understanding

    PubMed Central

    Poppy, G. M.; Chiotha, S.; Eigenbrod, F.; Harvey, C. A.; Honzák, M.; Hudson, M. D.; Jarvis, A.; Madise, N. J.; Schreckenberg, K.; Shackleton, C. M.; Villa, F.; Dawson, T. P.

    2014-01-01

    Achieving food security in a ‘perfect storm’ scenario is a grand challenge for society. Climate change and an expanding global population act in concert to make global food security even more complex and demanding. As achieving food security and the millennium development goal (MDG) to eradicate hunger influences the attainment of other MDGs, it is imperative that we offer solutions which are complementary and do not oppose one another. Sustainable intensification of agriculture has been proposed as a way to address hunger while also minimizing further environmental impact. However, the desire to raise productivity and yields has historically led to a degraded environment, reduced biodiversity and a reduction in ecosystem services (ES), with the greatest impacts affecting the poor. This paper proposes that the ES framework coupled with a policy response framework, for example Driver-Pressure-State-Impact-Response (DPSIR), can allow food security to be delivered alongside healthy ecosystems, which provide many other valuable services to humankind. Too often, agro-ecosystems have been considered as separate from other natural ecosystems and insufficient attention has been paid to the way in which services can flow to and from the agro-ecosystem to surrounding ecosystems. Highlighting recent research in a large multi-disciplinary project (ASSETS), we illustrate the ES approach to food security using a case study from the Zomba district of Malawi. PMID:24535394

  10. Food security in a perfect storm: using the ecosystem services framework to increase understanding.

    PubMed

    Poppy, G M; Chiotha, S; Eigenbrod, F; Harvey, C A; Honzák, M; Hudson, M D; Jarvis, A; Madise, N J; Schreckenberg, K; Shackleton, C M; Villa, F; Dawson, T P

    2014-04-05

    Achieving food security in a 'perfect storm' scenario is a grand challenge for society. Climate change and an expanding global population act in concert to make global food security even more complex and demanding. As achieving food security and the millennium development goal (MDG) to eradicate hunger influences the attainment of other MDGs, it is imperative that we offer solutions which are complementary and do not oppose one another. Sustainable intensification of agriculture has been proposed as a way to address hunger while also minimizing further environmental impact. However, the desire to raise productivity and yields has historically led to a degraded environment, reduced biodiversity and a reduction in ecosystem services (ES), with the greatest impacts affecting the poor. This paper proposes that the ES framework coupled with a policy response framework, for example Driver-Pressure-State-Impact-Response (DPSIR), can allow food security to be delivered alongside healthy ecosystems, which provide many other valuable services to humankind. Too often, agro-ecosystems have been considered as separate from other natural ecosystems and insufficient attention has been paid to the way in which services can flow to and from the agro-ecosystem to surrounding ecosystems. Highlighting recent research in a large multi-disciplinary project (ASSETS), we illustrate the ES approach to food security using a case study from the Zomba district of Malawi.

  11. Simulation-based medical education: an ethical imperative.

    PubMed

    Ziv, Amitai; Wolpe, Paul Root; Small, Stephen D; Glick, Shimon

    2006-01-01

    Medical training must at some point use live patients to hone the skills of health professionals. But there is also an obligation to provide optimal treatment and to ensure patients' safety and well-being. Balancing these 2 needs represents a fundamental ethical tension in medical education. Simulation-based learning can help mitigate this tension by developing health professionals' knowledge, skills, and attitudes while protecting patients from unnecessary risk. Simulation-based training has been institutionalized in other high-hazard professions, such as aviation, nuclear power, and the military, to maximize training safety and minimize risk. Health care has lagged behind in simulation applications for a number of reasons, including cost, lack of rigorous proof of effect, and resistance to change. Recently, the international patient safety movement and the U.S. federal policy agenda have created a receptive atmosphere for expanding the use of simulators in medical training, stressing the ethical imperative to "first do no harm" in the face of validated, large epidemiological studies describing unacceptable preventable injuries to patients as a result of medical management. Four themes provide a framework for an ethical analysis of simulation-based medical education: best standards of care and training, error management and patient safety, patient autonomy, and social justice and resource allocation. These themes are examined from the perspectives of patients, learners, educators, and society. The use of simulation wherever feasible conveys a critical educational and ethical message to all: patients are to be protected whenever possible and they are not commodities to be used as conveniences of training.

  12. Simulation-based medical education: an ethical imperative.

    PubMed

    Ziv, Amitai; Wolpe, Paul Root; Small, Stephen D; Glick, Shimon

    2003-08-01

    Medical training must at some point use live patients to hone the skills of health professionals. But there is also an obligation to provide optimal treatment and to ensure patients' safety and well-being. Balancing these two needs represents a fundamental ethical tension in medical education. Simulation-based learning can help mitigate this tension by developing health professionals' knowledge, skills, and attitudes while protecting patients from unnecessary risk. Simulation-based training has been institutionalized in other high-hazard professions, such as aviation, nuclear power, and the military, to maximize training safety and minimize risk. Health care has lagged behind in simulation applications for a number of reasons, including cost, lack of rigorous proof of effect, and resistance to change. Recently, the international patient safety movement and the U.S. federal policy agenda have created a receptive atmosphere for expanding the use of simulators in medical training, stressing the ethical imperative to "first do no harm" in the face of validated, large epidemiological studies describing unacceptable preventable injuries to patients as a result of medical management. Four themes provide a framework for an ethical analysis of simulation-based medical education: best standards of care and training, error management and patient safety, patient autonomy, and social justice and resource allocation. These themes are examined from the perspectives of patients, learners, educators, and society. The use of simulation wherever feasible conveys a critical educational and ethical message to all: patients are to be protected whenever possible and they are not commodities to be used as conveniences of training.

  13. Livelihoods, power, and food insecurity: adaptation of social capital portfolios in protracted crises--case study Burundi.

    PubMed

    Vervisch, Thomas G A; Vlassenroot, Koen; Braeckman, Johan

    2013-04-01

    The failure of food security and livelihood interventions to adapt to conflict settings remains a key challenge in humanitarian responses to protracted crises. This paper proposes a social capital analysis to address this policy gap, adding a political economy dimension on food security and conflict to the actor-based livelihood framework. A case study of three hillsides in north Burundi provides an ethnographic basis for this hypothesis. While relying on a theoretical framework in which different combinations of social capital (bonding, bridging, and linking) account for a diverse range of outcomes, the findings offer empirical insights into how social capital portfolios adapt to a protracted crisis. It is argued that these social capital adaptations have the effect of changing livelihood policies, institutions, and processes (PIPs), and clarify the impact of the distribution of power and powerlessness on food security issues. In addition, they represent a solid way of integrating political economy concerns into the livelihood framework. © 2013 The Author(s). Journal compilation © Overseas Development Institute, 2013.

  14. Chemical contaminants entering the marine environment from sea-based sources: A review with a focus on European seas.

    PubMed

    Tornero, Victoria; Hanke, Georg

    2016-11-15

    Anthropogenic contaminants reach the marine environment mostly directly from land-based sources, but there are cases in which they are emitted or re-mobilized in the marine environment itself. This paper reviews the literature, with a predominant focus on the European environment, to compile a list of contaminants potentially released into the sea from sea-based sources and provide an overview of their consideration under existing EU regulatory frameworks. The resulting list contains 276 substances and for some of them (22 antifouling biocides, 32 aquaculture medicinal products and 34 warfare agents) concentrations and toxicity data are additionally provided. The EU Marine Strategy Framework Directive Descriptor 8, together with the Water Framework Directive and the Regional Sea Conventions, provides the provisions against pollution of marine waters by chemical substances. This literature review should inform about the current state of knowledge regarding marine contaminant sources and provide support for the setting up of monitoring approaches, including hotspots screening. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  15. A decision support framework for characterizing and managing dermal exposures to chemicals during Emergency Management and Operations.

    PubMed

    Dotson, G Scott; Hudson, Naomi L; Maier, Andrew

    2015-01-01

    Emergency Management and Operations (EMO) personnel are in need of resources and tools to assist in understanding the health risks associated with dermal exposures during chemical incidents. This article reviews available resources and presents a conceptual framework for a decision support system (DSS) that assists in characterizing and managing risk during chemical emergencies involving dermal exposures. The framework merges principles of three decision-making techniques: 1) scenario planning, 2) risk analysis, and 3) multicriteria decision analysis (MCDA). This DSS facilitates dynamic decision making during each of the distinct life cycle phases of an emergency incident (ie, preparedness, response, or recovery) and identifies EMO needs. A checklist tool provides key questions intended to guide users through the complexities of conducting a dermal risk assessment. The questions define the scope of the framework for resource identification and application to support decision-making needs. The framework consists of three primary modules: 1) resource compilation, 2) prioritization, and 3) decision. The modules systematically identify, organize, and rank relevant information resources relating to the hazards of dermal exposures to chemicals and risk management strategies. Each module is subdivided into critical elements designed to further delineate the resources based on relevant incident phase and type of information. The DSS framework provides a much needed structure based on contemporary decision analysis principles for 1) documenting key questions for EMO problem formulation and 2) a method for systematically organizing, screening, and prioritizing information resources on dermal hazards, exposures, risk characterization, and management.

  16. A decision support framework for characterizing and managing dermal exposures to chemicals during Emergency Management and Operations

    PubMed Central

    Dotson, G. Scott; Hudson, Naomi L.; Maier, Andrew

    2016-01-01

    Emergency Management and Operations (EMO) personnel are in need of resources and tools to assist in understanding the health risks associated with dermal exposures during chemical incidents. This article reviews available resources and presents a conceptual framework for a decision support system (DSS) that assists in characterizing and managing risk during chemical emergencies involving dermal exposures. The framework merges principles of three decision-making techniques: 1) scenario planning, 2) risk analysis, and 3) multicriteria decision analysis (MCDA). This DSS facilitates dynamic decision making during each of the distinct life cycle phases of an emergency incident (ie, preparedness, response, or recovery) and identifies EMO needs. A checklist tool provides key questions intended to guide users through the complexities of conducting a dermal risk assessment. The questions define the scope of the framework for resource identification and application to support decision-making needs. The framework consists of three primary modules: 1) resource compilation, 2) prioritization, and 3) decision. The modules systematically identify, organize, and rank relevant information resources relating to the hazards of dermal exposures to chemicals and risk management strategies. Each module is subdivided into critical elements designed to further delineate the resources based on relevant incident phase and type of information. The DSS framework provides a much needed structure based on contemporary decision analysis principles for 1) documenting key questions for EMO problem formulation and 2) a method for systematically organizing, screening, and prioritizing information resources on dermal hazards, exposures, risk characterization, and management. PMID:26312660
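The prioritization module's ranking of information resources can be sketched as a weighted-sum MCDA. The criteria, weights, and scores below are invented for illustration; the article does not specify a concrete scoring scheme:

```python
# Hypothetical weighted-sum MCDA: rank information resources by how
# well they fit the incident phase and the information needed.
resources = {
    "chemical hazard database": {"relevance": 0.9, "timeliness": 0.6, "specificity": 0.8},
    "generic first-aid guide":  {"relevance": 0.5, "timeliness": 0.9, "specificity": 0.3},
    "PPE selection matrix":     {"relevance": 0.8, "timeliness": 0.6, "specificity": 0.9},
}
weights = {"relevance": 0.5, "timeliness": 0.2, "specificity": 0.3}  # sum to 1


def score(criteria):
    # Weighted sum over the criteria for one resource.
    return sum(weights[c] * v for c, v in criteria.items())


ranked = sorted(resources, key=lambda r: score(resources[r]), reverse=True)
for r in ranked:
    print(r, round(score(resources[r]), 2))
```

A real DSS would derive the weights from the checklist questions (incident phase, exposure route) rather than fixing them up front, but the ranking step itself reduces to this kind of aggregation.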

  17. A Transparent and Transferable Framework for Tracking Quality Information in Large Datasets

    PubMed Central

    Smith, Derek E.; Metzger, Stefan; Taylor, Jeffrey R.

    2014-01-01

    The ability to evaluate the validity of data is essential to any investigation, and manual “eyes on” assessments of data quality have dominated in the past. Yet, as the size of collected data continues to increase, so does the effort required to assess their quality. This challenge is of particular concern for networks that automate their data collection, and has resulted in the automation of many quality assurance and quality control analyses. Unfortunately, the interpretation of the resulting data quality flags can become quite challenging with large data sets. We have developed a framework to summarize data quality information and facilitate interpretation by the user. Our framework consists of first compiling data quality information and then presenting it through 2 separate mechanisms: a quality report and a quality summary. The quality report presents the results of specific quality analyses as they relate to individual observations, while the quality summary takes a spatial or temporal aggregate of each quality analysis and provides a summary of the results. Included in the quality summary is a final quality flag, which further condenses data quality information to assess whether a data product is valid or not. This framework has the added flexibility to allow “eyes on” information on data quality to be incorporated for many data types. Furthermore, this framework can aid problem tracking and resolution, should sensor or system malfunctions arise. PMID:25379884
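The compile-then-summarize flow described above can be sketched as follows. The test names, pass/fail representation, and 10% failure threshold are assumptions for illustration, not details from the paper:

```python
# Per-observation pass/fail flags from several quality analyses
# (the "quality report") are aggregated into per-test pass rates
# and a single final flag (the "quality summary").
observations = [
    {"range_test": True,  "spike_test": True,  "persistence_test": True},
    {"range_test": True,  "spike_test": False, "persistence_test": True},
    {"range_test": True,  "spike_test": True,  "persistence_test": True},
    {"range_test": False, "spike_test": True,  "persistence_test": True},
]


def quality_summary(obs, fail_threshold=0.1):
    """Aggregate each test's pass rate over all observations; the
    final flag is 'valid' only if every test's failure rate stays
    at or below the threshold."""
    tests = obs[0].keys()
    summary = {t: sum(o[t] for o in obs) / len(obs) for t in tests}
    final = all(1 - rate <= fail_threshold for rate in summary.values())
    return summary, ("valid" if final else "suspect")


summary, final_flag = quality_summary(observations)
print(summary)      # pass rate per quality test
print(final_flag)   # → suspect (two tests fail 25% of observations)
```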

  18. Water quality and ecosystem management: Data-driven reality check of effects in streams and lakes

    NASA Astrophysics Data System (ADS)

    Destouni, Georgia; Fischer, Ida; Prieto, Carmen

    2017-08-01

    This study investigates nutrient-related water quality conditions and change trends in the first management periods of the EU Water Framework Directive (WFD; since 2009) and Baltic Sea Action Plan (BASP; since 2007). With mitigation of nutrients in inland waters and their discharges to the Baltic Sea being a common WFD and BSAP target, we use Sweden as a case study of observable effects, by compiling and analyzing all openly available water and nutrient monitoring data across Sweden since 2003. The data compilation reveals that nutrient monitoring covers only around 1% (down to 0.2% for nutrient loads) of the total number of WFD-classified stream and lake water bodies in Sweden. The data analysis further shows that the hydro-climatically driven water discharge dominates the determination of waterborne loads of both total phosphorus and total nitrogen across Sweden. Both water discharge and the related nutrient loads are in turn well correlated with the ecosystem status classification of Swedish water bodies. Nutrient concentrations do not exhibit such correlation and their changes over the study period are on average small, but concentration increases are found for moderate-to-bad status waters, for which both the WFD and the BSAP have instead targeted concentration decreases. In general, these results indicate insufficient distinction and mitigation of human-driven nutrient components in inland waters and their discharges to the sea by the internationally harmonized applications of the WFD and the BSAP. The results call for further comparative investigations of observable large-scale effects of such regulatory/management frameworks in different parts of the world.

  19. [Methodology for clinical research in Orthodontics, the assets of the beOrtho website].

    PubMed

    Ruiz, Martial; Thibult, François

    2014-06-01

    The rules applying to the "evidence-based" methodology strongly influenced clinical research in orthodontics. However, the implementation of clinical studies requires rigour, substantial statistical and methodological knowledge, and a reliable environment in which to compile and store the data obtained from research. We developed the project "beOrtho.com" (based on orthodontic evidence) to fill the gap between our desire to conduct clinical research and the necessity of methodological rigour in exploiting its results. The beOrtho website was created to address the issues of sample recruitment, data compilation, and storage, while providing help with the methodological design of clinical studies. It allows the development and monitoring of clinical studies, as well as the creation of databases. In addition, we designed an evaluation grid for clinical studies that helps in developing systematic reviews. To illustrate our point, we tested a research protocol evaluating the value of mandibular advancement in the framework of Class II treatment. © EDP Sciences, SFODF, 2014.

  20. Building Automatic Grading Tools for Basic of Programming Lab in an Academic Institution

    NASA Astrophysics Data System (ADS)

    Harimurti, Rina; Iwan Nurhidayat, Andi; Asmunin

    2018-04-01

    The skill of computer programming is a core competency that must be mastered by students majoring in computer science. The best way to improve this skill is through the practice of writing many programs to solve various problems, from simple to complex. It takes hard work and a long time to check and evaluate the results of student labs one by one, especially when the number of students is large. Based on these constraints, we propose Automatic Grading Tools (AGT), an application that can evaluate and deeply check source code in C and C++. The application architecture consists of students, a web-based application, compilers, and the operating system. Automatic Grading Tools (AGT) implements the MVC architecture and uses open source software, such as the Laravel framework version 5.4, PostgreSQL 9.6, Bootstrap 3.3.7, and the jQuery library. Automatic Grading Tools has also been tested on real problems by submitting source code in C/C++ and compiling it. The test results show that the AGT application runs well.

  1. Altitudes and thicknesses of hydrogeologic units of the Ozark Plateaus aquifer system in Arkansas, Kansas, Missouri, and Oklahoma

    USGS Publications Warehouse

    Westerman, Drew A.; Gillip, Jonathan A.; Richards, Joseph M.; Hays, Phillip D.; Clark, Brian R.

    2016-09-29

    A hydrogeologic framework was constructed to represent the altitudes and thicknesses of hydrogeologic units within the Ozark Plateaus aquifer system as part of a regional groundwater-flow model supported by the U.S. Geological Survey Water Availability and Use Science Program. The Ozark Plateaus aquifer system study area is nearly 70,000 square miles and includes parts of Arkansas, Kansas, Missouri, and Oklahoma. Nine hydrogeologic units were selected for delineation within the aquifer system and include the Western Interior Plains confining system, the Springfield Plateau aquifer, the Ozark confining unit, the Ozark aquifer, which was divided into the upper, middle, and lower Ozark aquifers to better capture the spatial variation in the hydrologic properties, the St. Francois confining unit, the St. Francois aquifer, and the basement confining unit. Geophysical and well-cutting logs, along with lithologic descriptions by well drillers, were compiled and interpreted to create hydrologic altitudes for each unit. The final compiled dataset included more than 23,000 individual altitude points (excluding synthetic points) representing the nine hydrogeologic units within the Ozark Plateaus aquifer system.

  2. Compiling, costing and funding complex packages of home-based health care.

    PubMed

    Noyes, Jane; Lewis, Mary

    2007-06-01

    Nurses play a central role in putting together complex packages of care to support children with complex healthcare needs and their families in the community. However, there is little evidence or guidance to support this area of practice. At present, the process of compiling a care package and obtaining funding takes too long, causing significant delays in discharge and great frustration for parents, children and professionals. This article presents a combination of best practice guidance and, where possible, evidence-based principles that can be adapted and applied to an individual case irrespective of the child's diagnosis. The aim is to assist nurses and other healthcare professionals in organising funding for packages of care, bringing about the desired outcomes of successful discharge and appropriate community support. To work effectively as keyworkers for these children and families, nurses need knowledge and skills in relation to: multidisciplinary assessment frameworks and processes; identifying appropriate models of service provision; costing care packages; and approaches to obtaining funding. A further article next month will address risk management and clinical governance issues in delivering complex home-based care.

  3. A Markovian state-space framework for integrating flexibility into space system design decisions

    NASA Astrophysics Data System (ADS)

    Lafleur, Jarret M.

    The past decades have seen the state of the art in aerospace system design progress from a scope of simple optimization to one including robustness, with the objective of permitting a single system to perform well even in off-nominal future environments. Integrating flexibility, or the capability to easily modify a system after it has been fielded in response to changing environments, into system design represents a further step forward. One challenge in accomplishing this rests in that the decision-maker must consider not only the present system design decision, but also sequential future design and operation decisions. Despite extensive interest in the topic, the state of the art in designing flexibility into aerospace systems, and particularly space systems, tends to be limited to analyses that are qualitative, deterministic, single-objective, and/or limited to consider a single future time period. To address these gaps, this thesis develops a stochastic, multi-objective, and multi-period framework for integrating flexibility into space system design decisions. Central to the framework are five steps. First, system configuration options are identified and costs of switching from one configuration to another are compiled into a cost transition matrix. Second, probabilities that demand on the system will transition from one mission to another are compiled into a mission demand Markov chain. Third, one performance matrix for each design objective is populated to describe how well the identified system configurations perform in each of the identified mission demand environments. The fourth step employs multi-period decision analysis techniques, including Markov decision processes from the field of operations research, to find efficient paths and policies a decision-maker may follow. The final step examines the implications of these paths and policies for the primary goal of informing initial system selection. 
Overall, this thesis unifies state-centric concepts of flexibility from economics and engineering literature with sequential decision-making techniques from operations research. The end objective of this thesis’ framework and its supporting tools is to enable selection of the next-generation space systems today, tailored to decision-maker budget and performance preferences, that will be best able to adapt and perform in a future of changing environments and requirements. Following extensive theoretical development, the framework and its steps are applied to space system planning problems of (1) DARPA-motivated multiple- or distributed-payload satellite selection and (2) NASA human space exploration architecture selection.
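    The five steps above can be condensed into a toy finite-horizon Markov decision problem: given a switching-cost matrix, a mission demand Markov chain, and a performance matrix, backward induction returns the policies a decision-maker may follow. This is a minimal single-objective sketch (the thesis framework is multi-objective), and the matrix layout and function name are illustrative assumptions.

```python
def solve_policy(cost, demand_P, perf, horizon):
    """Finite-horizon backward induction over (configuration, mission) states.

    cost[i][j]:     cost of switching from configuration i to j
    demand_P[m][n]: probability mission demand transitions from m to n
    perf[j][m]:     performance of configuration j under mission m
    Returns (value table, per-period policy[t][i][m] -> next configuration).
    """
    n_cfg, n_mis = len(cost), len(demand_P)
    V = [[0.0] * n_mis for _ in range(n_cfg)]        # terminal value = 0
    stages = []
    for _ in range(horizon):
        newV = [[0.0] * n_mis for _ in range(n_cfg)]
        act = [[0] * n_mis for _ in range(n_cfg)]
        for i in range(n_cfg):
            for m in range(n_mis):
                best_val, best_j = None, i
                for j in range(n_cfg):
                    # expected future value under the mission demand chain
                    future = sum(demand_P[m][n] * V[j][n] for n in range(n_mis))
                    val = perf[j][m] - cost[i][j] + future
                    if best_val is None or val > best_val:
                        best_val, best_j = val, j
                newV[i][m], act[i][m] = best_val, best_j
        V = newV
        stages.append(act)
    stages.reverse()  # stages[t] is the policy for period t
    return V, stages
```

    For instance, with two configurations, a single mission, and a switching cost larger than the performance gap, the computed policy keeps whichever configuration the system starts in, which is exactly the kind of path-dependence the framework is built to expose.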

  4. Police referrals to a psychiatric hospital: experiences of nurses caring for police-referred admissions.

    PubMed

    Maharaj, Reshin; O'Brien, Louise; Gillies, Donna; Andrew, Sharon

    2013-08-01

    Police are a major source of referral to psychiatric hospitals in industrialized countries with mental health legislation. However, little attention has been paid to nurses' experience of caring for police-referred patients to psychiatric hospitals. This study utilized a Heideggerian phenomenological framework to explore the experiences of nine nurses caring for patients referred by the police, through semistructured interviews. Two major themes emerged from the hermeneutic analyses of interviews conducted with nurse participants: (i) 'expecting "the worst" '; and (ii) 'balancing therapeutic care and forced treatment'. Expecting 'the worst' related to the perceptions nurse participants had about patients referred by the police. This included two sub-themes: (i) 'we are here to care for whoever they bring in'; and (ii) 'but who deserves care?' The second theme balancing therapeutic care and forced treatment included the sub-themes: (i) 'taking control, taking care'; and (ii) 'managing power'. The study raises ethical and skill challenges for nursing including struggling with the notion of who deserves care, and balancing the imperatives of legislation with the need to work within a therapeutic framework. © 2012 The Authors; International Journal of Mental Health Nursing © 2012 Australian College of Mental Health Nurses Inc.

  5. A model for AGN variability on multiple time-scales

    NASA Astrophysics Data System (ADS)

    Sartori, Lia F.; Schawinski, Kevin; Trakhtenbrot, Benny; Caplar, Neven; Treister, Ezequiel; Koss, Michael J.; Urry, C. Megan; Zhang, C. E.

    2018-05-01

    We present a framework to link and describe active galactic nuclei (AGN) variability on a wide range of time-scales, from days to billions of years. In particular, we concentrate on the AGN variability features related to changes in black hole fuelling and accretion rate. In our framework, the variability features observed in different AGN at different time-scales may be explained as realisations of the same underlying statistical properties. In this context, we propose a model to simulate the evolution of AGN light curves with time based on the probability density function (PDF) and power spectral density (PSD) of the Eddington ratio (L/LEdd) distribution. Motivated by general galaxy population properties, we propose that the PDF may be inspired by the Eddington ratio distribution function (ERDF), and that a single ERDF+PSD set (or a limited number of them) may explain all observed variability features. After outlining the framework and the model, we compile a set of variability measurements in terms of structure function (SF) and magnitude difference. We then combine the variability measurements on a SF plot ranging from days to Gyr. The proposed framework enables constraints on the underlying PSD and the ability to link AGN variability on different time-scales, therefore providing new insights into AGN variability and black hole growth phenomena.
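    A heavily simplified sketch of these ingredients, assuming a Gaussian random walk in log(L/LEdd) as a stand-in for a steep red-noise PSD (the paper's actual model is built from an ERDF-based PDF and a measured PSD): simulate a light curve, then measure its structure function at different lags.

```python
import random

def simulate_log_lc(n_steps, sigma_step, seed=0):
    """Random-walk sketch of log(L/LEdd): a crude red-noise PSD proxy."""
    rng = random.Random(seed)
    x, lc = 0.0, []
    for _ in range(n_steps):
        x += rng.gauss(0.0, sigma_step)
        lc.append(x)
    return lc

def structure_function(lc, lag):
    """SF(tau): mean absolute difference of the light curve at a given lag."""
    diffs = [abs(lc[i + lag] - lc[i]) for i in range(len(lc) - lag)]
    return sum(diffs) / len(diffs)
```

    For a random walk the SF grows roughly as the square root of the lag, which is the kind of time-scale dependence the compiled SF plot from days to Gyr is designed to expose.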

  6. Perceptions of Ebola virus disease in Nigeria: Understanding the influence of imagination on health orientation.

    PubMed

    Oduyemi, Rachael O; Ayegboyin, Matthew; Salami, Kabiru K

    2016-06-01

    The 2014 Ebola virus disease (EVD) outbreak was officially declared in the West Africa region by the World Health Organization (WHO) on 23 March 2014. This first episode of EVD in Nigeria on 20 July 2014 raised more intense panic globally than the seven occurrences of the disease in Zaire. Although Nigeria was declared Ebola free by the WHO within 3 months, it is imperative to understand people's perceptions of the disease in the country. A discussion of people's perceptions of EVD in Nigeria is the aim of this article. This discussion paper complements secondary data with grey literature to explore how people's imagination and personification of thoughts influence their health orientation. Data are sourced from secondary information compiled from 'The Nation Newspaper, 2014'; 'Nairaland online forum, 2014' and 'Giftedgreen online magazine, 2014'. Ebola virus disease was perceived as a spiritual manipulation of witchcraft activities and described as biological terrorism and a means of creating a drug market, among other issues, in the country. Public health professionals should consider the sociocultural milieu to understand and offer health-care services in epidemics. Public health orientation work is urgently required in Nigeria to forestall future occurrence of EVD and other highly infectious diseases. © 2016 John Wiley & Sons Australia, Ltd.

  7. Landslide Detection in the Carlyon Beach, WA Peninsula: Analysis Of High Resolution DEMs

    NASA Astrophysics Data System (ADS)

    Fayne, J.; Tran, C.; Mora, O. E.

    2017-12-01

    Landslides are geological events caused by slope instability and degradation, leading to the sliding of large masses of rock and soil down a mountain or hillside. These events are influenced by topography, geology, weather and human activity, and can cause extensive damage to the environment and infrastructure, such as the destruction of transportation networks, homes, and businesses. It is therefore imperative to detect early-warning signs of landslide hazards as a means of mitigation and disaster prevention. Traditional landslide surveillance consists of field mapping, but the process is expensive and time consuming. This study uses Light Detection and Ranging (LiDAR)-derived Digital Elevation Models (DEMs) together with k-means clustering and Gaussian mixture models (GMMs) to analyze surface roughness and extract spatial features and patterns of landslides and landslide-prone areas. The methodology, based on several feature extractors, employs an unsupervised classifier on the Carlyon Beach Peninsula in the state of Washington to identify terrain with slide potential. When compared with the independently compiled landslide inventory map, the proposed algorithm correctly classifies up to 87% of the terrain. These results suggest that the proposed methods and LiDAR-derived DEMs can provide important surface information and serve as efficient tools for digital terrain analysis to create accurate landslide maps.
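    The roughness-then-cluster idea can be illustrated with a small pure-Python sketch, assuming a 1-D elevation profile and plain k-means in place of the study's LiDAR DEM grids and GMM:

```python
def roughness(elev_row, window=3):
    """Local standard deviation of elevation along a DEM row."""
    half = window // 2
    out = []
    for i in range(half, len(elev_row) - half):
        w = elev_row[i - half:i + half + 1]
        mu = sum(w) / len(w)
        out.append((sum((e - mu) ** 2 for e in w) / len(w)) ** 0.5)
    return out

def kmeans_1d(values, k=2, iters=50):
    """Tiny 1-D k-means for grouping roughness values into k classes."""
    vals = sorted(values)
    n = len(vals)
    # initial centers spread across the sorted value range
    centers = [vals[(n * i) // k + n // (2 * k)] for i in range(k)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for v in values:
            idx = min(range(k), key=lambda c: abs(v - centers[c]))
            groups[idx].append(v)
        centers = [sum(g) / len(g) if g else centers[i]
                   for i, g in enumerate(groups)]
    labels = [min(range(k), key=lambda c: abs(v - centers[c]))
              for v in values]
    return labels, centers
```

    Cells assigned to the high-roughness cluster are the candidate landslide terrain to compare against an independently compiled inventory map.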

  8. CUDAMPF: a multi-tiered parallel framework for accelerating protein sequence search in HMMER on CUDA-enabled GPU.

    PubMed

    Jiang, Hanyu; Ganesan, Narayan

    2016-02-27

    The HMMER software suite is widely used for analysis of homologous protein and nucleotide sequences with high sensitivity. The latest version of hmmsearch in HMMER 3.x utilizes a heuristic pipeline which consists of the MSV/SSV (Multiple/Single ungapped Segment Viterbi) stage, the P7Viterbi stage and the Forward scoring stage to accelerate homology detection. Since the latest version is highly optimized for performance on modern multi-core CPUs with SSE capabilities, only a few acceleration attempts report speedup. However, the most compute-intensive tasks within the pipeline (viz., the MSV/SSV and P7Viterbi stages) still stand to benefit from the computational capabilities of massively parallel processors. The Multi-Tiered Parallel Framework (CUDAMPF) presented here, implemented on CUDA-enabled GPUs, offers finer-grained parallelism for the MSV/SSV and Viterbi algorithms. We couple the SIMT (Single Instruction, Multiple Threads) mechanism and SIMD (Single Instruction, Multiple Data) video instructions with warp-synchronism to achieve high-throughput processing and eliminate thread idling. We also propose a hardware-aware optimal allocation scheme for scarce resources like on-chip memory and caches in order to boost the performance and scalability of CUDAMPF. In addition, runtime compilation via NVRTC, available with CUDA 7.0, is incorporated into the presented framework; it not only helps unroll the innermost loop to yield up to a 2- to 3-fold speedup over static compilation but also enables dynamic loading and switching of kernels depending on the query model size, in order to achieve optimal performance. CUDAMPF is designed as a hardware-aware parallel framework for accelerating computational hotspots within the hmmsearch pipeline as well as other sequence alignment applications. It achieves significant speedup by exploiting hierarchical parallelism on a single GPU and takes full advantage of limited resources based on their performance features.
In addition to exceeding the performance of other acceleration attempts, comprehensive evaluations against high-end CPUs (Intel i5, i7 and Xeon) show that CUDAMPF yields up to 440 GCUPS for SSV, 277 GCUPS for MSV and 14.3 GCUPS for P7Viterbi, all with 100% accuracy, which translates to a maximum speedup of 37.5-, 23.1- and 11.6-fold for MSV, SSV and P7Viterbi, respectively. The source code is available at https://github.com/Super-Hippo/CUDAMPF.
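To make concrete what the MSV/SSV hotspot computes, here is a scalar Python sketch of SSV scoring: the best single ungapped placement of the model against the sequence, found by a Kadane-style pass along each diagonal. The scoring table is a toy assumption; CUDAMPF's contribution is evaluating such diagonals in parallel with SIMT warps and SIMD video instructions rather than this serial loop.

```python
def ssv_score(query_scores, target):
    """Single ungapped Segment Viterbi sketch.

    query_scores[i][c]: match score of model position i emitting residue c.
    Each diagonal corresponds to one ungapped placement of the model;
    a max(0, ...) running sum finds the best-scoring segment on it.
    """
    m, n = len(query_scores), len(target)
    best = 0.0
    for d in range(-(m - 1), n):      # one diagonal per model offset
        run = 0.0
        for i in range(m):
            j = i + d
            if 0 <= j < n:
                run = max(0.0, run + query_scores[i][target[j]])
                best = max(best, run)
    return best
```

    Because every diagonal is independent, the loop over d maps naturally onto parallel GPU threads, which is the structure the framework exploits.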

  9. C++ software quality in the ATLAS experiment: tools and experience

    NASA Astrophysics Data System (ADS)

    Martin-Haugh, S.; Kluth, S.; Seuster, R.; Snyder, S.; Obreshkov, E.; Roe, S.; Sherwood, P.; Stewart, G. A.

    2017-10-01

    In this paper we explain how C++ code quality is managed in ATLAS using a range of tools, from compile-time checks through to run-time testing, and reflect on the substantial progress made in the last two years, largely through the use of static analysis tools such as Coverity®, an industry-standard tool which enables quality comparison with general open-source C++ code. Other available code analysis tools are also discussed, as is the role of unit testing, with an example of how the GoogleTest framework can be applied to our codebase.

  10. Quantum Computing Architectural Design

    NASA Astrophysics Data System (ADS)

    West, Jacob; Simms, Geoffrey; Gyure, Mark

    2006-03-01

    Large scale quantum computers will invariably require scalable architectures in addition to high fidelity gate operations. Quantum computing architectural design (QCAD) addresses the problems of actually implementing fault-tolerant algorithms given physical and architectural constraints beyond those of basic gate-level fidelity. Here we introduce a unified framework for QCAD that enables the scientist to study the impact of varying error correction schemes, architectural parameters including layout and scheduling, and physical operations native to a given architecture. Our software package, aptly named QCAD, provides compilation, manipulation/transformation, multi-paradigm simulation, and visualization tools. We demonstrate various features of the QCAD software package through several examples.
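    As a flavour of the simulation layer such a package needs, a state-vector simulator can apply a single-qubit gate with a few lines of index arithmetic. This is a generic sketch (little-endian qubit indexing, Hadamard example), not QCAD's actual interface:

```python
import math

def apply_gate(state, gate, target, n_qubits):
    """Apply a 2x2 single-qubit gate to qubit `target` (little-endian)
    in a state vector of length 2**n_qubits."""
    new = state[:]
    step = 1 << target
    for i in range(1 << n_qubits):
        if not i & step:              # pair amplitudes differing in `target`
            a, b = state[i], state[i | step]
            new[i] = gate[0][0] * a + gate[0][1] * b
            new[i | step] = gate[1][0] * a + gate[1][1] * b
    return new

# Hadamard gate: |0> -> equal superposition
H = [[1 / math.sqrt(2), 1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]
```

    Applying H to |0⟩ yields the equal superposition, and applying it twice returns |0⟩; tools like QCAD layer error-correction schemes, layout, and scheduling studies on top of simulation kernels of this kind.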

  11. Enhancing infrastructure resilience through business continuity planning.

    PubMed

    Fisher, Ronald; Norman, Michael; Klett, Mary

    2017-01-01

    Critical infrastructure is crucial to the functionality and wellbeing of the world around us. It is a complex network that works together to create an efficient society. The core components of critical infrastructure are dependent on one another to function at their full potential. Organisations face unprecedented environmental risks such as increased reliance on information technology and telecommunications, increased infrastructure interdependencies and globalisation. Successful organisations should integrate the components of cyber-physical and infrastructure interdependencies into a holistic risk framework. Physical security plans, cyber security plans and business continuity plans can help mitigate environmental risks. Cyber security plans are becoming the most crucial to have, yet are the least commonly found in organisations. As the reliance on cyber continues to grow, it is imperative that organisations update their business continuity and emergency preparedness activities to include this.

  12. Judging the quality of mercy: drawing a line between palliation and euthanasia.

    PubMed

    Morrison, Wynne; Kang, Tammy

    2014-02-01

    Clinicians frequently worry that medications used to treat pain and suffering at the end of life might also hasten death. Intentionally hastening death, or euthanasia, is neither legal nor ethically appropriate in children. In this article, we explore some of the historical and legal background regarding appropriate end-of-life care and outline what distinguishes it from euthanasia. Good principles include clarity of goals and assessments, titration of medications to effect, and open communication. When used appropriately, medications to treat symptoms should rarely hasten death significantly. Medications and interventions that are not justifiable are also discussed, as are the implications of palliative sedation and withholding fluids or nutrition. It is imperative that clinicians know how to justify and use such medications to adequately treat suffering at the end of life within a relevant clinical and legal framework.

  13. Ethics of clear health communication: applying the CLEAN Look approach to communicate biobanking information for cancer research.

    PubMed

    Koskan, Alexis; Arevalo, Mariana; Gwede, Clement K; Quinn, Gwendolyn P; Noel-Thomas, Shalewa A; Luque, John S; Wells, Kristen J; Meade, Cathy D

    2012-11-01

    Cancer innovations, such as biobanking technologies, are continuously evolving to improve our understanding and knowledge about cancer prevention and treatment modalities. However, the public receives little communication about biobanking and is often unaware of this innovation until asked to donate biospecimens. It is the researchers' ethical duty to provide clear communications about biobanking and biospecimen research. Such information allows the public to understand biobanking processes and facilitates informed decision making about biospecimen donation. The aims of this paper are 1) to examine the importance of clear communication as an ethical imperative when conveying information about cancer innovations and 2) to illustrate the use of an organizing framework, the CLEAN (Culture, Literacy, Education, Assessment, and Networking) Look approach, for creating educational priming materials about the topic of biobanking.

  14. What do international ethics guidelines say in terms of the scope of medical research ethics?

    PubMed

    Bernabe, Rosemarie D L C; van Thiel, Ghislaine J M W; van Delden, Johannes J M

    2016-04-26

    In research ethics, the most basic question would always be, "which is an ethical issue, which is not?" Interestingly, depending on which ethics guideline we consult, we may have various answers to this question. Though we already have several international ethics guidelines for biomedical research involving human participants, ironically, we do not have a harmonized document which tells us what these various guidelines say and shows us the areas of consensus (or lack thereof). In this manuscript, we attempted to do just that. We extracted the imperatives from five internationally-known ethics guidelines and took note where the imperatives came from. In doing so, we gathered data on how many guidelines support a specific imperative. We found that there is no consensus on the majority of the imperatives and that in only 8.2% of the imperatives were there at least moderate consensus (i.e., consensus of at least 3 of the 5 ethics guidelines). Of the 12 clusters (Basic Principles; Research Collaboration; Social Value; Scientific Validity; Participant Selection; Favorable Benefit/Risk Ratio; Independent Review; Informed Consent; Respect for Participants; Publication and Registration; Regulatory Sanctions; and Justified Research on the Vulnerable Population), Informed Consent has the highest level of consensus and Research Collaboration and Regulatory Sanctions have the least. There was a lack of consensus in the majority of imperatives from the five internationally-known ethics guidelines. This may be partly explained by the differences among the guidelines in terms of their levels of specification as well as conceptual/ideological differences.

  15. Compiling software for a hierarchical distributed processing system

    DOEpatents

    Archer, Charles J; Blocksome, Michael A; Ratterman, Joseph D; Smith, Brian E

    2013-12-31

    Compiling software for a hierarchical distributed processing system including providing to one or more compiling nodes software to be compiled, wherein at least a portion of the software to be compiled is to be executed by one or more nodes; compiling, by the compiling node, the software; maintaining, by the compiling node, any compiled software to be executed on the compiling node; selecting, by the compiling node, one or more nodes in a next tier of the hierarchy of the distributed processing system in dependence upon whether any compiled software is for the selected node or the selected node's descendants; and sending to the selected node only the compiled software to be executed by the selected node or the selected node's descendants.
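    The routing rule in the claim (each node keeps the binaries compiled for itself and forwards to a child only those destined for that child's subtree) can be sketched as a recursion over the node tree; the tree and binary-to-node mapping below are toy assumptions for illustration.

```python
def subtree(node, children):
    """All nodes in the subtree rooted at `node` (node plus descendants)."""
    out = {node}
    for c in children.get(node, []):
        out |= subtree(c, children)
    return out

def distribute(node, children, binaries):
    """Return {node: set of binaries it keeps} for the whole subtree.

    binaries maps binary name -> target node. A node keeps what targets
    itself and forwards to each child only the binaries whose targets
    lie in that child's subtree.
    """
    plan = {node: {b for b, tgt in binaries.items() if tgt == node}}
    for c in children.get(node, []):
        reach = subtree(c, children)
        forwarded = {b: t for b, t in binaries.items() if t in reach}
        plan.update(distribute(c, children, forwarded))
    return plan
```

    Each node ends up holding exactly the software it must execute, and no compiled artifact travels down a branch of the hierarchy that does not need it.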

  16. A Global Imperative for the Environment

    ERIC Educational Resources Information Center

    Strong, Maurice F.

    1974-01-01

    Population, depleted resources, pollution and other environmental issues are reviewed from an international perspective. The imperative nature of world-wide cooperation is emphasized for dealing effectively with ecological problems. Directions in which this cooperation should be effected are presented. (JP)

  17. Higher Education's Democratic Imperative

    ERIC Educational Resources Information Center

    Thomas, Nancy L.; Hartley, Matthew

    2010-01-01

    Last summer, the Democracy Imperative and the Deliberative Democracy Consortium, two national networks linking academics and deliberative democracy practitioners, hosted a national conference, No Better Time: Promising Opportunities in Deliberative Democracy for Educators and Practitioners ("No Better Time," 2010). Over 250 civic leaders,…

  18. Our Common Cause

    ERIC Educational Resources Information Center

    Eklund, Lowell

    1976-01-01

    Adult educators, providing necessary lifelong, comprehensive education, face the major imperative of solving the problem of racism. Numerous other imperatives concerning program design and content must be fulfilled in effectively meeting adult learners' needs. Guidance from the principles expressed in a proposed adult educator's Hippocratic Oath…

  19. Some Remarks on Mood and Aspect in Russian

    ERIC Educational Resources Information Center

    Miller, J.

    1974-01-01

    An explanation is offered of aspect in imperative verb forms and in certain infinitive verb forms in Russian. Three presuppositions or conditions of appropriateness are postulated and their correlation to the aspect of an imperative or infinitive form discussed. (RM)

  20. The DECIDE evidence to recommendation framework adapted to the public health field in Sweden.

    PubMed

    Guldbrandsson, Karin; Stenström, Nils; Winzer, Regina

    2016-12-01

    Organizations worldwide compile results from scientific studies and grade the evidence of interventions in order to assist policy makers. However, quality of evidence alone is seldom sufficient to make a recommendation. The Developing and Evaluating Communication Strategies to Support Informed Decisions and Practice Based on Evidence (DECIDE) framework aims to facilitate decision making and to improve dissemination and implementation of recommendations in the healthcare and public health sectors. The aim of this study was to investigate whether the DECIDE framework is applicable in the public health field in Sweden. The DECIDE framework was presented and discussed in interviews with stakeholders and governmental organizations and tested in panels. Content analyses were performed. In general, the informants were positive about the DECIDE framework. However, two questions, the first regarding individual autonomy and the second regarding method sustainability, were felt by the stakeholders to be missing from the framework. The importance of the composition of the DECIDE stakeholder panel was raised by the informants, as was the significant role of the chair. Further, the informants raised concerns about the general lack of research evidence from RCT designs for universal methods in the public health sector. Finally, the informants raised the question of local, regional and national levels' responsibility for dissemination and implementation of recommendations. The DECIDE framework might be useful as a tool for dissemination and implementation of recommendations in the public health field in Sweden. Important questions for further research are whether these findings apply to other public health topics and in other public health settings. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  1. A system-wide analysis using a senior-friendly hospital framework identifies current practices and opportunities for improvement in the care of hospitalized older adults.

    PubMed

    Wong, Ken S; Ryan, David P; Liu, Barbara A

    2014-11-01

    Older adults are vulnerable to hospital-associated complications such as falls, pressure ulcers, functional decline, and delirium, which can contribute to prolonged hospital stay, readmission, and nursing home placement. These vulnerabilities are exacerbated when the hospital's practices, services, and physical environment are not sufficiently mindful of the complex, multidimensional needs of frail individuals. Several frameworks have emerged to help hospitals examine how organization-wide processes can be customized to avoid these complications. This article describes the application of one such framework-the Senior-Friendly Hospital (SFH) framework adopted in Ontario, Canada-which comprises five interrelated domains: organizational support, processes of care, emotional and behavioral environment, ethics in clinical care and research, and physical environment. This framework provided the blueprint for a self-assessment of all 155 adult hospitals across the province of Ontario. The system-wide analysis identified practice gaps and promising practices within each domain of the SFH framework. Taken together, these results informed 12 recommendations to support hospitals at all stages of development in becoming friendly to older adults. Priorities for system-wide action were identified, encouraging hospitals to implement or further develop their processes to better address hospital-acquired delirium and functional decline. These recommendations led to collaborative action across the province, including the development of an online toolkit and the identification of accountability indicators to support hospitals in quality improvement focusing on senior-friendly care. © 2014, Copyright the Authors Journal compilation © 2014, The American Geriatrics Society.

  2. A Needs-led Framework for Understanding the Impact of Caring for a Family Member With Dementia.

    PubMed

    Pini, Simon; Ingleson, Emma; Megson, Molly; Clare, Linda; Wright, Penny; Oyebode, Jan R

    2018-03-19

    Approximately half the care for people with dementia is provided by families. It is therefore imperative that research informs ways of maintaining such care. In this study, we propose that a needs-led approach can provide a useful, novel means of conceptualizing the impact of caring on the lives of family carers. Our aim was to develop and present a needs-led framework for understanding how providing care impacts on carers' fulfilment of needs. In this qualitative study, we conducted 42 semistructured interviews with a purposively diverse sample of family carers to generate nuanced contextualized accounts of how caring impacted on carers' lives. Our inductive thematic analysis focused upon asking: "What need is being impacted here?" in order to generate a needs-led framework for understanding. Nine themes were widely endorsed. Each completed the sentence: "Being a carer impacts on fulfilling my need to/for….": Freedom; feel close to my relative; feel in control of my life; be my own person; protect my relative; share/express my thoughts and feelings; take care of myself; feel connected to the people around me; get things done. These needs echo those from other research areas, with relational needs emerging as particularly central. The needs-led approach offers a perspective that is able to capture both stresses and positive aspects of caregiving. We recommend that clinical interviewing using Socratic questioning to discover human needs that are being impacted by caring would provide a valuable starting point for care planning.

  3. A Web GIS Enabled Comprehensive Hydrologic Information System for Indian Water Resources Systems

    NASA Astrophysics Data System (ADS)

    Goyal, A.; Tyagi, H.; Gosain, A. K.; Khosa, R.

    2017-12-01

    Hydrological systems across the globe are becoming increasingly water stressed with each passing season due to climate variability and growing water demand. Hence, to safeguard food, livelihood and economic security, it becomes imperative to employ scientific studies for the holistic management of an indispensable resource like water. However, a hydrological study of any scale and purpose is heavily reliant on various spatio-temporal datasets which are not only difficult to discover and access but are also hard to use and manage. Besides, owing to the diversity of water-sector agencies and the dearth of standard operating procedures, seamless information exchange is challenging for collaborators. Extensive research is being done worldwide to address these issues, but regrettably not much has been done in developing countries like India. Therefore, the current study endeavours to develop a Hydrological Information System framework in a Web-GIS environment for empowering Indian water resources systems. The study attempts to harmonize the standards for metadata, terminology, symbology, versioning and archiving for effective generation, processing, dissemination and mining of the data required for hydrological studies. Furthermore, modelers with modest computing resources at their disposal can consume this standardized data in high-performance simulation modelling using cloud computing within the developed Web-GIS framework. They can also integrate the inputs and outputs of the different numerical models available on the platform and combine their results for comprehensive analysis of the chosen hydrological system. Thus, the developed portal is an all-in-one framework that can facilitate decision makers, industry professionals and researchers in efficient water management.

  4. More than meets the eye. Feminist poststructuralism as a lens towards understanding obesity.

    PubMed

    Aston, Megan; Price, Sheri; Kirk, Sara Frances Louise; Penney, Tarra

    2012-05-01

    This paper presents a discussion of the application of a feminist poststructuralist-based theoretical framework as an innovative approach towards understanding and managing the complex health issue of obesity. Obesity is often viewed as a lifestyle choice for which the individual is blamed. This individualistic, dichotomous and behavioural perspective only allows for a narrow understanding of obesity and may even lead to misperceptions, stereotypes and marginalization of clients experiencing obesity. Feminist poststructuralism can provide a critical lens to understand the social construction of obesity and the broader environmental and cultural contexts of this health issue. The theoretical framework draws from the writings of Foucault, Scott, Butler, Cheek, and Powers, published between 1983 and 2005. The concepts of discourse analysis and power relations are explored and discussed in a clear manner so that nurses can easily apply this framework to their practice as they observe, question, analyse, critique and assess the care experienced by clients who are obese. The concepts of personal and social beliefs, values and stereotypes are also discussed and examples of how to apply them in practice are provided. It is imperative that we continue to question our everyday nursing practices as we work to support clients, especially those who feel marginalized. This focus on power relations and reflective practice can give direction to new possibilities for change in obesity management. © 2011 The Authors. Journal of Advanced Nursing © 2011 Blackwell Publishing Ltd.

  5. From data processing to mental organs: an interdisciplinary path to cognitive neuroscience.

    PubMed

    Patharkar, Manoj

    2011-01-01

    The human brain is a highly evolved coordinating mechanism in the species Homo sapiens. It is only in the last 100 years that extensive knowledge of the intricate structure and complex functioning of the human brain has been acquired, though a lot is yet to be known. However, from the beginning of civilisation, people have been conscious of a 'mind' which has been considered the origin of all scientific and cultural development. Philosophers have discussed at length the various attributes of consciousness. At the same time, most philosophical or scientific frameworks have directly or indirectly implied mind-body duality. It is now imperative that we develop an integrated approach to understand the interconnection between mind and consciousness on the one hand and the brain on the other. This paper begins with the proposition that the structure of the brain is analogous, at least to a certain extent, to that of a computer system. Of course, it is much more sophisticated and complex. The second proposition is that the Chomskyean concept of 'mental organs' is a good working hypothesis that tries to characterise this complexity in terms of an innate cognitive framework. By following this dual approach, treating the brain both as a data processing system and as a superstructure of intricately linked mental organs, we can move toward a better understanding of 'mind' within the framework of empirical science. The one 'mental organ' studied extensively in Chomskyean terms is the 'language faculty', which is unique in its relation to brain, mind and consciousness.

  6. NASA/DOD Aerospace Knowledge Diffusion Research Project. Paper 67: Maximizing the Results of Federally-Funded Research and Development Through Knowledge Management: A Strategic Imperative for Improving US Competitiveness

    NASA Technical Reports Server (NTRS)

    Pinelli, Thomas E.; Barclay, Rebecca O.

    1998-01-01

    Federally-funded research and development (R&D) represents a significant annual investment (approximately $79 billion in fiscal year 1996) on the part of U.S. taxpayers. Based on the results of a 10-year study of knowledge diffusion in the U.S. aerospace industry, the authors take the position that U.S. competitiveness will be enhanced if knowledge management strategies, employed within a capability-enhancing U.S. technology policy framework, are applied to diffusing the results of federally-funded R&D. In making their case, the authors stress the importance of knowledge as the source of competitive advantage in today's global economy. Next, they offer a practice-based definition of knowledge management and discuss three current approaches to knowledge management implementation: mechanistic, "the learning organization," and systemic. The authors then examine three weaknesses in existing U.S. public policy and policy implementation (the dominance of knowledge creation, the need for diffusion-oriented technology policy, and the prevalence of a dissemination model) that affect diffusion of the results of federally-funded R&D. To address these shortcomings, they propose the development of a knowledge management framework for diffusing the results of federally-funded R&D. The article closes with a discussion of some issues and challenges associated with implementing such a framework.

  7. Synthesis of Actinide Materials for the Study of Basic Actinide Science and Rapid Separation of Fission Products

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dorhout, Jacquelyn Marie

    This dissertation covers several distinct projects relating to the fields of nuclear forensics and basic actinide science. Post-detonation nuclear forensics, in particular the study of fission products resulting from a nuclear device to determine device attributes and information, often depends on the comparison of fission products to a library of known ratios. The expansion of this library is imperative as technology advances. Rapid separation of fission products from a target material, without the need to dissolve the target, is an important technique to develop to improve the library and provide a means to develop samples and standards for testing separations. Several materials were studied as a proof of concept that fission products can be extracted from a solid target, including microparticulate (< 10 μm diameter) dUO2, porous metal organic frameworks (MOFs) synthesized from depleted uranium (dU), and other organic-based frameworks containing dU. The targets were irradiated with fast neutrons from one of two different neutron sources, contacted with dilute acids to facilitate the separation of fission products, and analyzed via gamma spectroscopy for separation yields. The results indicate that smaller particle sizes of dUO2 in contact with the secondary matrix KBr yield higher separation yields than particles without a secondary matrix. It was also discovered that using 0.1 M HNO3 as a contact acid leads to the dissolution of the target material. Lower concentrations of acid were used for subsequent experiments. In the case of the MOFs, a larger pore size in the framework leads to higher separation yields when contacted with 0.01 M HNO3. Different types of frameworks also yield different results.

  8. Organizational responses to accountability requirements: Do we get what we expect?

    PubMed

    Gray, Carolyn Steele; Berta, Whitney; Deber, Raisa; Lum, Janet

    In health care, accountability is being championed as a promising approach to meeting the dual imperatives of improving care quality while managing constrained budgets. Few studies focus on public sector organizations' responsiveness to government imperatives for accountability. We applied and adapted a theory of organizational responsiveness to community care agencies operating in Ontario, Canada, asking the question: What is the array of realized organizational responses to government-imposed accountability requirements among community agencies that receive public funds to provide home and community care? A sequential complementary mixed methods approach was used. It gathered data through a survey of 114 home and community care organizations in Ontario and interviews with 20 key informants representing 13 home and community care agencies and four government agencies. It generated findings using a parallel mixed analysis technique. In addition to responses predicted by the theory, we found that organizations engage in active, as well as passive, forms of compliance; we refer to this response as internal modification in which internal policies, practices, and/or procedures are changed to meet accountability requirements. We also found that environmental factors, such as the presence of an association representing organizational interests, can influence bargaining tactics. Our study helps us to better understand the range of likely responses to accountability requirements and is a first step toward encouraging the development of accountability frameworks that favor positive outcomes for organizations and those holding them to account. Tailoring agreements to organizational environments, aligning perceived compliance with behaviors that encourage improved performance, and allowing for flexibility in accountability arrangements are suggested strategies to support beneficial outcomes.

  9. Pragmatic phenomenological types.

    PubMed

    Goranson, Ted; Cardier, Beth; Devlin, Keith

    2015-12-01

    We approach a well-known problem: how to relate component physical processes in biological systems to governing imperatives in multiple system levels. The intent is to further practical tools that can be used in the clinical context. An example proposes a formal type system that would support this kind of reasoning, including in machines. Our example is based on a model of the connection between a quality of mind associated with creativity and neuropsychiatric dynamics: constructing narrative as a form of conscious introspection, which allows the manipulation of one's own driving imperatives. In this context, general creativity is indicated by an ability to manage multiple heterogeneous worldviews simultaneously in a developing narrative. 'Narrative' in this context is framed as the organizing concept behind rational linearization that can be applied to metaphysics as well as modeling perceptive dynamics. Introspection is framed as the phenomenological 'tip' that allows a perceiver to be within experience or outside it, reflecting on and modifying it. What distinguishes the approach is its rooting in well-founded but disparate disciplines: phenomenology, ontic virtuality, two-sorted geometric logics, functional reactive programming, multi-level ontologies and narrative cognition. This paper advances the work by proposing a type strategy within a two-sorted reasoning system that supports cross-ontology structure. The paper describes influences on this approach, and presents an example that involves phenotype classes and monitored creativity enhanced by both soft methods and transcranial direct-current stimulation. The proposed solution integrates pragmatic phenomenology, situation theory, narratology and functional programming in one framework. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. Climate change induced landslide hazard mapping over Greece - A case study in Pelion Mountain (SE Thessaly, Central Greece)

    NASA Astrophysics Data System (ADS)

    Angelitsa, Varvara; Loupasakis, Constantinos; Anagnwstopoulou, Christina

    2015-04-01

    Landslides, as a major type of geological hazard, represent one of the natural events that occur most frequently worldwide after hydro-meteorological events. Landslides occur when the stability of a slope changes due to a number of factors, such as steep terrain and prolonged precipitation. The identification of landslides and the compilation of landslide susceptibility, hazard and risk maps are very important issues for the public authorities, providing substantial information for the strategic planning and management of land use. Although landslides cannot be predicted accurately, many attempts have been made to compile these maps. Important factors for the compilation of reliable maps are the quality and the amount of available data and the selection of the best method for the analysis. Numerous studies and publications providing landslide susceptibility, hazard and risk maps for different regions of Greece have been completed up to now. Their common characteristic is that they are static, taking into account parameters like geology, mean annual precipitation, slope, aspect, distance from roads, faults and drainage network, soil capability, land use etc., without introducing the dimension of time. The current study focuses on Pelion Mountain, which is located in the southeastern part of Thessaly in Central Greece, aiming to compile "dynamic" susceptibility and hazard maps depending on climate change. For this purpose, past and future precipitation data from regional climate model (RCM) datasets are introduced as input parameters for the compilation of "dynamic" landslide hazard maps. Moreover, land motion mapping data produced by Persistent Scatterer Interferometry (PSI) are used for the validation of landslide occurrence during the period from June 1992 to December 2003 and, as a result, for the calibration of the mapping procedure. 
The PSI data can be applied at a regional scale as support for land motion mapping and at a local scale for the monitoring of single well-known ground motion events. The PSI data were produced within the framework of the Terrafirma project. Terrafirma is a pan-European ground motion information service focused on seismic risk, flood defense, coastal lowland subsidence, inactive mines and hydrogeological risks. The produced maps provide substantial information for land use planning and the civil protection of an area of exceptional natural beauty with numerous preserved traditional villages. Keywords: landslide, PSI technique, regional climate models, landslide susceptibility maps, Greece
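A "dynamic" hazard layer of the kind this record describes can be illustrated with a minimal sketch in which a static susceptibility score is reweighted by precipitation drawn from a climate scenario. The function name, 800 mm threshold, and linear scaling below are invented for illustration and are not taken from the study:

```python
# Minimal sketch: reweight a static susceptibility score (0-1) by how an
# RCM precipitation scenario compares to a reference wet-year threshold.
# Threshold and linear scaling are illustrative assumptions only.

def hazard_score(static_susceptibility, annual_precip_mm, wet_threshold_mm=800.0):
    """Scale static susceptibility by a precipitation factor, capped at 1.0."""
    precip_factor = annual_precip_mm / wet_threshold_mm
    return min(static_susceptibility * precip_factor, 1.0)

# The same slope cell evaluated under a past and a future precipitation scenario:
past = hazard_score(0.6, annual_precip_mm=700)     # drier reference period
future = hazard_score(0.6, annual_precip_mm=1000)  # wetter RCM projection
```

Running the same static map through several RCM scenarios in this way is what makes the resulting hazard maps "dynamic": the geological inputs stay fixed while the climate input varies.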

  11. Cultural Identity Forum: Enacting the Self-Awareness Imperative in Intercultural Communication

    ERIC Educational Resources Information Center

    Anderson-Lain, Karen

    2017-01-01

    Courses: Intercultural Communication; any course with an intercultural communication unit. Objectives: Students will demonstrate the self-awareness imperative in intercultural communication, explore their own cultural identities, and reflect on others' cultural identities in order to build their intercultural communication competence.

  12. Transcultural Nursing Education: A Worldwide Imperative.

    ERIC Educational Resources Information Center

    Leininger, Madeleine

    1994-01-01

    The nursing profession must make changes in all aspects of nursing (curriculum development, research, teaching, clinical practice, and consultation) as it shifts from a largely unicultural to a multicultural focus, in order to prepare thousands of nurses in the growing and imperative area of transcultural nursing. (JOW)

  13. It's NOT rocket science: rethinking our metaphors for research in health professions education.

    PubMed

    Regehr, Glenn

    2010-01-01

    The health professional education community is struggling with a number of issues regarding the place and value of research in the field, including: the role of theory-building versus applied research; the relative value of generalisable versus contextually rich, localised solutions; and the relative value of local versus multi-institutional research. In part, these debates are limited by the fact that the health professional education community has become deeply entrenched in the notion of the physical sciences as presenting a model for 'ideal' research. The resulting emphasis on an 'imperative of proof' in our dominant research approaches has translated poorly to the domain of education, with a resulting denigration of the domain as 'soft' and 'unscientific' and a devaluing of knowledge acquired to date. Similarly, our adoption of the physical sciences' 'imperative of generalisable simplicity' has created difficulties for our ability to represent well the complexity of the social interactions that shape education and learning at a local level. Using references to the scientific paradigms associated with the physical sciences, this paper reconsiders the place of our current goals for education research in the production and evolution of knowledge within our community, and explores the implications for enhancing the value of research in health professional education. Reorienting education research from its alignment with the imperative of proof to an imperative of understanding, and from the imperative of simplicity to an imperative of representing complexity well, may enable a shift in research focus away from a problematic search for proofs of simple generalisable solutions to our collective problems, towards the generation of rich understandings of the complex environments in which our collective problems are uniquely embedded.

  14. Dissociating the Influence of Response Selection and Task Anticipation on Corticospinal Suppression During Response Preparation

    PubMed Central

    Duque, Julie; Labruna, Ludovica; Cazares, Christian; Ivry, Richard B.

    2014-01-01

    Motor behavior requires selecting between potential actions. The role of inhibition in response selection has frequently been examined in tasks in which participants are engaged in some advance preparation prior to the presentation of an imperative signal. Under such conditions, inhibition could be related to processes associated with response selection, or to more general inhibitory processes that are engaged in high states of anticipation. In Experiment 1, we manipulated the degree of anticipatory preparation. Participants performed a choice reaction time task that required choosing between a movement of the left or right index finger, and transcranial magnetic stimulation (TMS) was used to elicit motor evoked potentials (MEPs) in the left hand agonist. In high anticipation blocks, a non-informative cue (e.g., a fixation marker) preceded the imperative; in low anticipation blocks, there was no cue and participants were required to divide their attention between two tasks to further reduce anticipation. MEPs were substantially reduced before the imperative signal in high anticipation blocks. In contrast, in low anticipation blocks, MEPs remained unchanged before the imperative signal but showed a marked suppression right after the onset of the imperative. This effect occurred regardless of whether the imperative had signaled a left or right hand response. After this initial inhibition, left MEPs increased when the left hand was selected and remained suppressed when the right hand was selected. We obtained similar results in Experiment 2, except that the persistent left MEP suppression when the left hand was not selected was attenuated when the alternative response involved a non-homologous effector (right foot). These results indicate that, even in the absence of an anticipatory period, inhibitory mechanisms are engaged during response selection, possibly to prevent the occurrence of premature and inappropriate responses during a competitive selection process. 
PMID:25128431

  15. An expanded conceptual framework for solution-focused management of chemical pollution in European waters.

    PubMed

    Munthe, John; Brorström-Lundén, Eva; Rahmberg, Magnus; Posthuma, Leo; Altenburger, Rolf; Brack, Werner; Bunke, Dirk; Engelen, Guy; Gawlik, Bernd Manfred; van Gils, Jos; Herráez, David López; Rydberg, Tomas; Slobodnik, Jaroslav; van Wezel, Annemarie

    2017-01-01

    This paper describes a conceptual framework for solutions-focused management of chemical contaminants, built on novel and systematic approaches for identifying, quantifying and reducing the risks of these substances. The conceptual framework was developed in interaction with stakeholders representing relevant authorities and organisations responsible for managing the environmental quality of water bodies. Stakeholder needs were compiled via a survey and dialogue. The content of the conceptual framework was thereafter developed with inputs from the relevant scientific disciplines. The conceptual framework consists of four access points: Chemicals, Environment, Abatement and Society, representing different aspects of and approaches to engaging in the issue of chemical contamination of surface waters. It widens the scope for assessment and management of chemicals in comparison to traditional, mostly per-chemical, risk assessment approaches by including abatement and societal approaches as optional solutions. The solution-focused approach implies an identification of abatement and policy options upfront in the risk assessment process. The conceptual framework was designed for use in current and future chemical pollution assessments for the aquatic environment, including the specific challenges encountered in prioritising individual chemicals and mixtures, and is applicable to the development of approaches for safe chemical management in a broader sense. The four access points of the conceptual framework are interlinked by four key topics representing the main scientific challenges that need to be addressed, i.e.: identifying and prioritising hazardous chemicals at different scales; selecting relevant and efficient abatement options; providing regulatory support for chemicals management; and predicting and prioritising future chemical risks. The conceptual framework aligns with current challenges in the safe production and use of chemicals. 
The current state of knowledge and implementation of these challenges is described. The use of the conceptual framework, and addressing the challenges, is intended to support: (1) furthering the sustainable use of chemicals, (2) the identification of pollutants of priority concern for cost-effective management, (3) the selection of optimal abatement options, and (4) the development and use of optimised legal and policy instruments.

  16. Greenhouse gas emission accounting and management of low-carbon community.

    PubMed

    Song, Dan; Su, Meirong; Yang, Jin; Chen, Bin

    2012-01-01

    As the major source of greenhouse gas (GHG) emission, cities have been under tremendous pressure of energy conservation and emission reduction for decades. Community is the main unit of urban housing, public facilities, transportation, and other properties of city's land use. The construction of low-carbon community is an important pathway to realize carbon emission mitigation in the context of rapid urbanization. Therefore, an efficient carbon accounting framework should be proposed for CO₂ emissions mitigation at a subcity level. Based on life-cycle analysis (LCA), a three-tier accounting framework for the carbon emissions of the community is put forward, including emissions from direct fossil fuel combustion, purchased energy (electricity, heat, and water), and supply chain emissions embodied in the consumption of goods. By compiling a detailed CO₂ emission inventory, the magnitude of carbon emissions and the mitigation potential in a typical high-quality community in Beijing are quantified within the accounting framework proposed. Results show that emissions from supply chain emissions embodied in the consumption of goods cannot be ignored. Specific suggestions are also provided for the urban decision makers to achieve the optimal resource allocation and further promotion of low-carbon communities.
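The three-tier accounting structure described above can be sketched in a few lines. The function name, activity figures, and emission factors below are illustrative placeholders, not values from the study:

```python
# Hedged sketch of a three-tier community carbon account: direct fossil fuel
# combustion, purchased energy, and supply chain emissions embodied in goods.
# All emission factors and quantities are invented for illustration.

def community_emissions(direct_fuel_tCO2, purchased_energy, embodied_goods_tCO2):
    """Sum the three tiers and return a per-tier breakdown (tCO2)."""
    # Tier 2: purchased energy converted with illustrative emission factors.
    factors = {"electricity_MWh": 0.8, "heat_GJ": 0.06, "water_m3": 0.0003}
    tier2 = sum(qty * factors[k] for k, qty in purchased_energy.items())
    return {
        "tier1_direct": direct_fuel_tCO2,
        "tier2_purchased": tier2,
        "tier3_embodied": embodied_goods_tCO2,
        "total": direct_fuel_tCO2 + tier2 + embodied_goods_tCO2,
    }

result = community_emissions(
    direct_fuel_tCO2=120.0,
    purchased_energy={"electricity_MWh": 500, "heat_GJ": 2000, "water_m3": 10000},
    embodied_goods_tCO2=300.0,
)
```

A breakdown of this shape makes the paper's central observation easy to check: the tier-3 (embodied) share of the total is large enough that ignoring it would materially understate community emissions.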

  17. Recommendations to Improve the Accuracy of Estimates of Physical Activity Derived from Self Report

    PubMed Central

    Ainsworth, Barbara E; Caspersen, Carl J; Matthews, Charles E; Mâsse, Louise C; Baranowski, Tom; Zhu, Weimo

    2013-01-01

    Context: Assessment of physical activity using self-report has the potential for measurement error that can lead to incorrect inferences about physical activity behaviors and bias study results. Objective: To provide recommendations to improve the accuracy of physical activity estimates derived from self-report. Process: We provide an overview of presentations and a compilation of perspectives shared by the authors of this paper and workgroup members. Findings: We identified a conceptual framework for reducing errors using physical activity self-report questionnaires. The framework identifies six steps to reduce error: (1) identifying the need to measure physical activity, (2) selecting an instrument, (3) collecting data, (4) analyzing data, (5) developing a summary score, and (6) interpreting data. Underlying the first four steps are the behavioral parameters of type, intensity, frequency, and duration of physical activities performed, activity domains, and the location where activities are performed. We identified ways to reduce measurement error at each step and made recommendations for practitioners, researchers, and organizational units to reduce error in questionnaire assessment of physical activity. Conclusions: Self-report measures of physical activity have a prominent role in research and practice settings. Measurement error can be reduced by applying the framework discussed in this paper. PMID:22287451

  18. Greenhouse Gas Emission Accounting and Management of Low-Carbon Community

    PubMed Central

    Song, Dan; Su, Meirong; Yang, Jin; Chen, Bin

    2012-01-01

    As the major source of greenhouse gas (GHG) emission, cities have been under tremendous pressure of energy conservation and emission reduction for decades. Community is the main unit of urban housing, public facilities, transportation, and other properties of city's land use. The construction of low-carbon community is an important pathway to realize carbon emission mitigation in the context of rapid urbanization. Therefore, an efficient carbon accounting framework should be proposed for CO2 emissions mitigation at a subcity level. Based on life-cycle analysis (LCA), a three-tier accounting framework for the carbon emissions of the community is put forward, including emissions from direct fossil fuel combustion, purchased energy (electricity, heat, and water), and supply chain emissions embodied in the consumption of goods. By compiling a detailed CO2 emission inventory, the magnitude of carbon emissions and the mitigation potential in a typical high-quality community in Beijing are quantified within the accounting framework proposed. Results show that emissions from supply chain emissions embodied in the consumption of goods cannot be ignored. Specific suggestions are also provided for the urban decision makers to achieve the optimal resource allocation and further promotion of low-carbon communities. PMID:23251104

  19. Framework for architecture-independent run-time reconfigurable applications

    NASA Astrophysics Data System (ADS)

    Lehn, David I.; Hudson, Rhett D.; Athanas, Peter M.

    2000-10-01

    Configurable Computing Machines (CCMs) have emerged as a technology with the computational benefits of custom ASICs as well as the flexibility and reconfigurability of general-purpose microprocessors. Significant effort from the research community has focused on techniques to move this reconfigurability from a rapid application development tool to a run-time tool. This requires the ability to change the hardware design while the application is executing, and is known as Run-Time Reconfiguration (RTR). Widespread acceptance of run-time reconfigurable custom computing depends upon the existence of high-level automated design tools. Such tools must reduce the designer's effort to port applications between different platforms as the architecture, hardware, and software evolve. A Java implementation of a high-level application framework, called Janus, is presented here. In this environment, developers create Java classes that describe the structural behavior of an application. The framework allows hardware and software modules to be freely mixed and interchanged. A compilation phase of the development process analyzes the structure of the application and adapts it to the target platform. Janus is capable of structuring the run-time behavior of an application to take advantage of the memory and computational resources available.
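The compile-time adaptation described for Janus can be illustrated with a toy sketch (in Python rather than Java, with invented class and field names; this is not the actual Janus API): modules declare whether a hardware implementation exists, and a compilation pass binds each one to hardware or software depending on the target platform's resources.

```python
# Toy sketch of compile-time hardware/software binding, loosely modeled on
# the idea behind Janus. All names here are invented for illustration.

class Module:
    """One node in the structural description of an application."""
    def __init__(self, name, hw_capable=False):
        self.name = name
        self.hw_capable = hw_capable  # a configurable-hardware version exists

    def bind(self, platform):
        # Prefer a hardware binding while reconfigurable fabric remains free.
        if self.hw_capable and platform["fpga_slots"] > 0:
            platform["fpga_slots"] -= 1
            return (self.name, "hardware")
        return (self.name, "software")  # fall back to a software module

def compile_app(modules, fpga_slots):
    """Compilation phase: analyze the structure and adapt it to the target."""
    platform = {"fpga_slots": fpga_slots}
    return [m.bind(platform) for m in modules]

plan = compile_app(
    [Module("fft", hw_capable=True), Module("filter", hw_capable=True), Module("io")],
    fpga_slots=1,
)
```

Because the structural description is separate from the binding decision, the same application description adapts to a different platform simply by changing the resource figures passed to the compilation phase, which is the portability property the abstract emphasizes.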

  20. Transformation management of primary health care services in two selected local authorities in Gauteng.

    PubMed

    Sibaya, W; Muller, M

    2000-12-01

    The transformation of health services in South Africa today is governed by political, policy and legislative frameworks. This article focuses on the transformation of a primary health care service within a local authority in Gauteng. The purpose of this article is to explore and describe the perceptions (expectations and fears) of selected managers employed in this primary health care service. The results are utilised to compile a strategy (framework) for transformation management and leadership within the primary health care service. A qualitative research design was utilised and the data were collected by means of individual interviews with selected managers in the service, followed by a content analysis. The expectations and fears of managers focus mainly on personnel matters, community participation/satisfaction, salaries and parity, inadequate stocks/supplies and medication, the deterioration of quality service delivery and the need for training and empowerment. These results are divided into structure, process and outcome dimensions and are embodied in the conceptual framework for the transformation and leadership strategy. It is recommended that standards for transformation management be formulated and that the quality of transformation management be evaluated accordingly.

  1. Geological constraints on continental arc activity since 720 Ma: implications for the link between long-term climate variability and episodicity of continental arcs

    NASA Astrophysics Data System (ADS)

    Cao, W.; Lee, C. T.

    2016-12-01

    Continental arc volcanoes have been suggested to release more CO2 than island arc volcanoes due to decarbonation of wallrock carbonates in the continental upper plate through which the magmas traverse (Lee et al., 2013). Continental arcs may thus play an important role in long-term climate. To test this hypothesis, we compiled geological maps to reconstruct the surface distribution of granitoid plutons and the lengths of ancient continental arcs. These results were then compiled into a GIS framework and incorporated into GPlates plate reconstructions. Our results show an episodic nature of global continental arc activity since 720 Ma. The lengths of continental arcs were at minimums during most of the Cryogenian (~720-670 Ma), the middle Paleozoic (~460-300 Ma) and the Cenozoic (~50-0 Ma). Arc lengths were highest during the Ediacaran (~640-570 Ma), the early Paleozoic (~550-430 Ma) and the entire Mesozoic, with peaks in the Early Triassic (~250-240 Ma), Late Jurassic-Early Cretaceous (~160-130 Ma), and Late Cretaceous (~90-65 Ma). The extensive continental arcs in the Ediacaran and early Paleozoic reflect the Pan-African events and circum-Gondwana subduction during the assembly of the Gondwana supercontinent. The Early Triassic peak is coincident with the final closure of the paleo-Asian oceans and the onset of circum-Pacific subduction associated with the assembly of the Pangea supercontinent. The Jurassic-Cretaceous peaks reflect the extensive continental arcs established in the western Pacific and the North and South American Cordillera, coincident with the initial dispersal of Pangea. Continental arcs are favored during the final assembly and the early-stage dispersal of a supercontinent. Our compilation shows a temporal match between continental arc activity and long-term climate at least since 720 Ma. 
For example, continental arc activity was reduced during the Cryogenian icehouse event, and enhanced during the Early Paleozoic and Jurassic-Cretaceous greenhouse events. This coherence provides further evidence that continental arcs may play an important role in controlling long-term climate evolution. CO2 degassing fluxes from continental arcs should be incorporated into global, long-term climate models. Our work provides a quantitative framework for estimating these fluxes.

  2. Critical Thinking for the New Millennium: A Pedagogical Imperative.

    ERIC Educational Resources Information Center

    Lee, Andrew Ann Dinkins

    The pedagogical imperative to prepare students to become critical thinkers, critical readers, and critical writers for the coming millennium necessitates a comprehensive college discourse on critical thinking. The paper cites seminars and workshops that incorporate theoretical and practical dimensions of teaching critical-analytical thinking…

  3. Authoritative knowledge, the technological imperative and women's responses to prenatal diagnostic technologies.

    PubMed

    McCoyd, Judith L M

    2010-12-01

    Theories about authoritative knowledge (AK) and the technological imperative have received varying levels of interest in anthropological, feminist and science and technology studies. Although the anthropological literature abounds with empirical considerations of authoritative knowledge, few have considered both theories through an empirical, inductive lens. Data extracted from an earlier study of 30 women's responses to termination for fetal anomaly are reanalyzed to consider the women's views of, and responses to, prenatal diagnostic technologies (PNDTs). Findings indicate that a small minority embrace the societal portrayal of technology as univalently positive, while the majority have nuanced and ambivalent responses to the use of PNDTs. Further, the interface of authoritative knowledge and the technological imperative suggests that AK derives not only from medical provider status and technology use, but also from the adequacy and trustworthiness of the information. The issue of timing and uncertainty of the information also are interrogated for their impact on women's lives and what that can illuminate about the theories of AK and the technological imperative.

  4. Twitter weather warnings: Communicating risk in 140 characters: the impact of imperative and declarative message style on weather risk perception and behavioral intentions.

    PubMed

    Rainear, Adam M; Lachlan, Kenneth A; Spence, Patric R

    Understanding how individuals utilize risk messages is important for protecting lives and gaining compliance toward safe behaviors. Recent advances in technology afford users timeliness when acquiring information, and research investigating imperative and declarative message styles suggests utilizing both strategies is most effective. Similarly, the element of time can play a role when an individual engages in certain behaviors. This study employed an experimental design to better understand how imperative and declarative tweets and lead time can contribute to risk perceptions and behavioral intentions. Results indicate the most negative affect is experienced after receiving an imperative-only tweet in a short-lead time condition, whereas a tweet utilizing both message styles in a long-lead time condition induces the most fear. Future research should investigate stylistic message elements on new media platforms to better understand how messages can be effectively sent and received by the intended audience within character-limited platforms.

  5. Integrating Health Behavior Theory and Design Elements in Serious Games.

    PubMed

    Cheek, Colleen; Fleming, Theresa; Lucassen, Mathijs Fg; Bridgman, Heather; Stasiak, Karolina; Shepherd, Matthew; Orpin, Peter

    2015-01-01

    Internet interventions for improving health and well-being have the potential to reach many people and fill gaps in service provision. Serious gaming interfaces provide opportunities to optimize user adherence and impact. Health interventions based in theory and evidence and tailored to psychological constructs have been found to be more effective at promoting behavior change. Defining the design elements which engage users and help them to meet their goals can contribute to better-informed serious games. This study sought to elucidate design elements important in SPARX, a serious game for adolescents with depression, from a user-centered perspective. We proposed a model based on an established theory of health behavior change and practical features of serious game design to organize ideas and rationale. We analyzed data from 5 studies comprising a total of 22 focus groups and 66 semistructured interviews conducted with youth and families in New Zealand and Australia who had viewed or used SPARX. User perceptions of the game were applied to this framework. A coherent framework was established using the three constructs of self-determination theory (SDT), autonomy, competence, and relatedness, to organize user perceptions and design elements within four areas important in design: computer game, accessibility, working alliance, and learning in immersion. User perceptions mapped well to the framework, which may assist developers in understanding the context of user needs. By mapping these elements against the constructs of SDT, we were able to propose a sound theoretical base for the model. This study's method allowed for the articulation of design elements in a serious game from a user-centered perspective within a coherent overarching framework. 
The framework can be used to deliberately incorporate serious game design elements that support a user's sense of autonomy, competence, and relatedness, key constructs which have been found to mediate motivation at all stages of the change process. The resulting model introduces promising avenues for future exploration. Involving users in program design remains an imperative if serious games are to be fit for purpose.

  6. Global hunger: a challenge to agricultural, food, and nutritional sciences.

    PubMed

    Wu, Shiuan-Huei; Ho, Chi-Tang; Nah, Sui-Lin; Chau, Chi-Fai

    2014-01-01

    Hunger has been a concern for generations and has continued to plague hundreds of millions of people around the world. Although many efforts have been devoted to reducing hunger, challenges such as growing competition for natural resources, emerging climate change and natural disasters, poverty, illiteracy, and diseases are posing threats to food security and intensifying the hunger crisis. Concerted efforts of scientists to improve agricultural and food productivity, technology, nutrition, and education are imperative to facilitate appropriate strategies for defeating hunger and malnutrition. This paper provides some aspects of world hunger issues and summarizes the efforts and measures aimed at alleviating food problems from the food and nutritional sciences perspectives. The prospects and constraints of some implemented strategies for alleviating hunger and achieving sustainable food security are also discussed. This comprehensive information source could provide insights into the development of a complementary framework for dealing with the global hunger issue.

  7. Social capital and child nutrition in India: The moderating role of development.

    PubMed

    Vikram, Kriti

    2018-03-01

    Empirical studies of social capital rarely take into account the socioeconomic context of the region in which it operates, not least because most of this research has been conducted in high-income countries. It is imperative to investigate how development may influence the impact of social capital, especially in developing countries. This paper examines the relationship between social capital and child nutrition using the India Human Development Survey, 2005-2006. Using a multilevel framework and a sample of 6770 rural children under the age of five, it finds that household-based bridging social capital, expressed as connections with development-based organizations, is positively associated with child nutrition. Bonding social capital, expressed as ties with caste- and religion-based organizations, has the opposite impact. At the village level, contextual measures of social capital are associated with the nutritional status of children, but their influence is conditional on local development. Copyright © 2018 Elsevier Ltd. All rights reserved.

  8. Optimizing nursing care delivery systems in the Army: back to basics with care teams and peer feedback.

    PubMed

    Prue-Owens, Kathy; Watkins, Miko; Wolgast, Kelly A

    2011-01-01

    The Patient CaringTouch System emerged from a comprehensive assessment and gap analysis of clinical nursing capabilities in the Army. The Patient CaringTouch System now provides the framework and set of standards by which we drive excellence in quality nursing care for our patients and excellence in quality of life for our nurses in Army Medicine. As part of this enterprise transformation, we placed particular emphasis on the delivery of nursing care at the bedside as well as the integration of a formal professional peer feedback process in support of individual nurse practice enhancement. The Warrior Care Imperative Action Team was chartered to define and establish the standards for care teams in the clinical settings and the process by which we established formal peer feedback for our professional nurses. This back-to-basics approach is a cornerstone of the Patient CaringTouch System implementation and sustainment.

  9. Cross-Cultural Effects of Cannabis Use Disorder: Evidence to Support a Cultural Neuroscience Approach

    PubMed Central

    Prashad, Shikha; Milligan, Amber L.; Cousijn, Janna; Filbey, Francesca M.

    2017-01-01

    Purpose of review: Cannabis use disorders (CUDs) are prevalent worldwide. Current epidemiological studies underscore differences in behaviors that contribute to cannabis use across cultures that can be leveraged towards prevention and treatment of CUDs. This review proposes a framework for understanding the effects of cross-cultural differences on psychological, neural, and genomic processes underlying CUDs that has the potential to inform global policies and impact global public health. Recent findings: We found that cultural factors may influence (1) the willingness to acknowledge CUD-related symptoms among populations of different countries, and (2) neural responses related to the sense of self, perception, emotion, and attention. These findings leverage the potential effects of culture on neural mechanisms underlying CUDs. Summary: As the number of individuals seeking treatment for CUDs increases globally, it is imperative to incorporate cultural considerations to better understand and serve differing populations and develop more targeted treatment strategies and interventions. PMID:29062679

  10. The ethics of paediatric anti-depressant use: erring on the side of caution.

    PubMed

    Shearer, M C; Bermingham, S L

    2008-10-01

    This paper aims to outline the ethical concerns regarding the use of antidepressant medication in children and adolescents. Recent debates surrounding this issue have focused on the link between selective serotonin reuptake inhibitor use and an increased risk of suicidal thinking/behaviour, and weighed that against the benefit of the alleviation of depressive symptoms. It is argued here that such an approach is simplistic. There are several serious risks surrounding antidepressant use in the young that ought to be included in the equation, along with a consideration of the neuroethical concerns surrounding pharmacotherapy for affective disorders. Using the precautionary principle as a framework for analysis it is concluded that the risks are sufficiently serious and plausible that the prescribing of antidepressant medication to the young ought to be severely restricted; further it is imperative that the child and their parents are made fully aware of the risks, short-term and long-term, involved.

  11. Is Paid Surrogacy a Form of Reproductive Prostitution? A Kantian Perspective.

    PubMed

    Patrone, Tatiana

    2018-01-01

    This article reexamines the "prostitution objection" to paid surrogacy, and argues that rebuttals to this objection fail to focus on surrogates as embodied persons. This failure is based on the false distinction between "selling one's reproductive services" and "selling one's body." To ground the analysis of humans as embodied persons, this article uses Kant's late ethical theory, which develops the conceptual framework for understanding human beings as embodied selves. Literature on surrogacy commonly emphasizes that all Kantian duties answer to the categorical prohibition on treating persons as mere means. What this literature leaves out is that this imperative commands us more specifically to engage ourselves and others as embodied persons. This article aims to relate this point to a specific issue in assisted reproduction. It argues that a Kantian account of human beings as embodied persons prohibits paid surrogacy on exactly the same grounds as it prohibits prostitution.

  12. IT vendor selection model by using structural equation model & analytical hierarchy process

    NASA Astrophysics Data System (ADS)

    Maitra, Sarit; Dominic, P. D. D.

    2012-11-01

    Selecting and evaluating the right vendors is imperative for an organization's competitiveness in the global marketplace. Improper selection and evaluation of potential vendors can degrade an organization's supply chain performance. Numerous studies have demonstrated that firms consider multiple criteria when selecting key vendors. This research develops a new hybrid model for the vendor selection process to support better decision making. The proposed model provides a suitable tool for assisting decision makers and managers in making the right decisions and selecting the most suitable vendor. This paper proposes a hybrid model based on the Structural Equation Model (SEM) and the Analytical Hierarchy Process (AHP) for long-term strategic vendor selection problems. The model's five-step framework was designed after a thorough literature study. The proposed hybrid model will be applied to a real-life case study to assess its effectiveness. In addition, a what-if analysis will be used for model validation.
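The AHP stage of such a hybrid model reduces pairwise judgments to priority weights. A minimal sketch, in which the three criteria (cost, quality, delivery) and the Saaty-scale judgments are invented for illustration rather than taken from the paper's case study:

```python
# Pairwise comparison matrix over three hypothetical criteria
# (cost, quality, delivery); entries are illustrative Saaty-scale
# judgments, with A[i][j] = 1 / A[j][i].
A = [[1.0, 3.0, 5.0],
     [1 / 3, 1.0, 3.0],
     [1 / 5, 1 / 3, 1.0]]
n = len(A)

# Power iteration converges to the principal eigenvector, whose
# normalized entries are the criterion weights.
w = [1.0 / n] * n
for _ in range(100):
    v = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
    s = sum(v)
    w = [x / s for x in v]

# Estimate lambda_max and Saaty's consistency ratio (CR < 0.1 is the
# usual threshold for acceptably consistent judgments).
Aw = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
lam_max = sum(Aw[i] / w[i] for i in range(n)) / n
ci = (lam_max - n) / (n - 1)
cr = ci / 0.58  # 0.58 is Saaty's random index for n = 3
print([round(x, 3) for x in w], round(cr, 3))
```

The same weighting step would be repeated for each criterion's vendor-level comparisons, with the SEM stage supplying the criteria structure.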

  13. Cross-Cultural Effects of Cannabis Use Disorder: Evidence to Support a Cultural Neuroscience Approach.

    PubMed

    Prashad, Shikha; Milligan, Amber L; Cousijn, Janna; Filbey, Francesca M

    2017-06-01

    Cannabis use disorders (CUDs) are prevalent worldwide. Current epidemiological studies underscore differences in behaviors that contribute to cannabis use across cultures that can be leveraged towards prevention and treatment of CUDs. This review proposes a framework for understanding the effects of cross-cultural differences on psychological, neural, and genomic processes underlying CUDs that has the potential to inform global policies and impact global public health. We found that cultural factors may influence (1) the willingness to acknowledge CUD-related symptoms among populations of different countries, and (2) neural responses related to the sense of self, perception, emotion, and attention. These findings leverage the potential effects of culture on neural mechanisms underlying CUDs. As the number of individuals seeking treatment for CUDs increases globally, it is imperative to incorporate cultural considerations to better understand and serve differing populations and develop more targeted treatment strategies and interventions.

  14. A New Standard for Assessing the Performance of High Contrast Imaging Systems

    NASA Astrophysics Data System (ADS)

    Jensen-Clem, Rebecca; Mawet, Dimitri; Gomez Gonzalez, Carlos A.; Absil, Olivier; Belikov, Ruslan; Currie, Thayne; Kenworthy, Matthew A.; Marois, Christian; Mazoyer, Johan; Ruane, Garreth; Tanner, Angelle; Cantalloube, Faustine

    2018-01-01

    As planning for the next generation of high contrast imaging instruments (e.g., WFIRST, HabEx, LUVOIR, TMT-PFI, and EELT-EPICS) matures and second-generation ground-based extreme adaptive optics facilities (e.g., VLT-SPHERE, Gemini-GPI) finish their principal surveys, it is imperative that the performance of different designs, post-processing algorithms, observing strategies, and survey results be compared in a consistent, statistically robust framework. In this paper, we argue that the current industry standard for such comparisons—the contrast curve—falls short of this mandate. We propose a new figure of merit, the “performance map,” that incorporates three fundamental concepts in signal detection theory: the true positive fraction, the false positive fraction, and the detection threshold. By supplying a theoretical basis and recipe for generating the performance map, we hope to encourage the widespread adoption of this new metric across subfields in exoplanet imaging.
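The three signal-detection quantities named in the abstract relate as follows: sweeping the detection threshold trades true positive fraction (TPF) against false positive fraction (FPF). A minimal sketch of one slice of such a map, using simulated Gaussian detection statistics rather than real instrument data:

```python
import random

random.seed(0)

# Simulated detection statistics: pure-noise apertures vs apertures
# containing an injected signal. The Gaussian parameters are
# assumptions for illustration only.
noise = [random.gauss(0.0, 1.0) for _ in range(10_000)]
signal = [random.gauss(3.0, 1.0) for _ in range(10_000)]

def fractions(threshold):
    """True and false positive fractions at one detection threshold."""
    tpf = sum(s > threshold for s in signal) / len(signal)
    fpf = sum(x > threshold for x in noise) / len(noise)
    return tpf, fpf

# Sweeping the threshold traces one slice of a performance map
# (effectively an ROC curve at a fixed signal level).
thresholds = (1.0, 2.0, 3.0, 4.0)
curve = [fractions(t) for t in thresholds]
for t, (tpf, fpf) in zip(thresholds, curve):
    print(f"threshold={t}: TPF={tpf:.3f}, FPF={fpf:.4f}")
```

A full performance map would repeat this sweep across signal levels and separations; the point of the metric is that TPF, FPF, and threshold are reported jointly rather than collapsed into a single contrast value.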

  15. Methane Ebullition in Temperate Hydropower Reservoirs and Implications for US Policy on Greenhouse Gas Emissions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, Benjamin L.; Arntzen, Evan V.; Goldman, Amy E.

    The United States is home to more than 87,000 dams, 2,198 of which are actively used for hydropower production. With the December 2015 consensus adoption of the United Nations Framework Convention on Climate Change’s Paris Agreement, it is imperative for the U.S. to accurately quantify greenhouse gas fluxes from its hydropower reservoirs. Methane ebullition, or methane bubbles originating from river or lake sediments, can account for nearly all of a reservoir’s methane emissions to the atmosphere. However, methane ebullition in hydropower reservoirs has been studied in only three temperate locations, none of which are in the United States. This study measures high ebullitive methane fluxes from two hydropower reservoirs in eastern Washington, synthesizes the known information about methane ebullition from tropical, boreal, and temperate hydropower reservoirs, and investigates the implications for U.S. hydropower management and growth.

  16. Mental health and addiction workforce development: federal leadership is needed to address the growing crisis.

    PubMed

    Hoge, Michael A; Stuart, Gail W; Morris, John; Flaherty, Michael T; Paris, Manuel; Goplerud, Eric

    2013-11-01

    The mental health and addiction workforce has long been plagued by shortages, high turnover, a lack of diversity, and concerns about its effectiveness. This article presents a framework to guide workforce policy and practice, emphasizing the need to train other health care providers as well as individuals in recovery to address behavioral health needs; strengthen recruitment, retention, and training of specialist behavioral health providers; and improve the financial and technical assistance infrastructure to better support and sustain the workforce. The pressing challenge is to scale up existing plans and strategies and to implement them in ways that have a meaningful impact on the size and effectiveness of the workforce. The aging and increasing diversity of the US population, combined with the expanded access to services that will be created by health reform, make it imperative to take immediate action.

  17. [Challenges and inputs of the gender perspective to the study of vector borne diseases].

    PubMed

    Arenas-Monreal, Luz; Piña-Pozas, Maricela; Gómez-Dantés, Héctor

    2015-01-01

    The analysis of social determinants and gender within the health-disease-care process is imperative to understand the variables that define the vulnerability of populations, their exposure risks, the determinants of their care, and the organization of and participation in prevention and control programs. Ecohealth incorporates the study of social determinants and gender perspectives because the emergence of dengue, malaria and Chagas disease is bound to unplanned urbanization, deficient sanitary infrastructure, and poor housing conditions. Gender emerges as an explanatory element of the roles played by men and women in the different settings (domestic, community and social) that shape exposure risks to vectors, and offers a better prospect of success for prevention, control and care strategies. The objective is to contribute to the understanding of the gender perspective in the analysis of health risks through a conceptual framework.

  18. Security Framework for Pervasive Healthcare Architectures Utilizing MPEG-21 IPMP Components.

    PubMed

    Fragopoulos, Anastasios; Gialelis, John; Serpanos, Dimitrios

    2009-01-01

    In modern ubiquitous computing environments, the deployment of pervasive healthcare architectures is more imperative than ever. In such architectures the patient is the central point, surrounded by embedded and small computing devices that measure sensitive physical indications and interact with hospital databases, allowing urgent medical response in critical situations. Such environments must be developed to satisfy the basic security requirements of real-time secure data communication; protection of sensitive medical data and measurements; data integrity and confidentiality; and protection of the monitored patient's privacy. In this work, we argue that the MPEG-21 Intellectual Property Management and Protection (IPMP) components can be used to protect transmitted medical information and enhance the patient's privacy, since they provide selective and controlled access to the medical data sent toward the hospital's servers.

  19. Ethical budgets: a critical success factor in implementing new public management accountability in health care.

    PubMed

    Bosa, Iris M

    2010-05-01

    New public management accountability is increasingly being introduced into health-care systems throughout the world - albeit with mixed success. This paper examines the successful introduction of new management accounting systems among general practitioners (GPs) as an aspect of reform in the Italian health-care system. In particular, the study examines the critical role played by the novel concept of an 'ethical budget' in engaging the willing cooperation of the medical profession in implementing change. Utilizing a qualitative research design, with in-depth interviews with GPs, hospital doctors and managers, along with archival analysis, the present study finds that management accounting can be successfully implemented among medical professionals provided there is alignment between the management imperative and the ethical framework in which doctors practise their profession. The concept of an 'ethical budget' has been shown to be an innovative and effective tool in achieving this alignment.

  20. The equity imperative in tertiary education: Promoting fairness and efficiency

    NASA Astrophysics Data System (ADS)

    Salmi, Jamil; Bassett, Roberta Malee

    2014-06-01

    While the share of the tertiary education age cohort (19-25) which is being given the opportunity to study has increased worldwide over the past two decades, this does not in fact translate into reduced inequality. For many young people, especially in the developing world, major obstacles such as disparities in terms of gender, minority population membership or disabilities as well as academic and financial barriers are still standing in their way. The authors of this article propose a conceptual framework to analyse equity issues in tertiary education and document the scope, significance and consequences of disparities in tertiary education opportunities. They throw some light on the main determinants of these inequalities and offer suggestions about effective equity promotion policies directed towards widening participation and improving the chances of success of underprivileged youths in order to create societies which uphold humanistic values.

  1. Integrating mental health services into a general hospital in Puerto Rico.

    PubMed

    Jiménez, J; Rivera, D; Benítez, P; Tarrats, H; Ramos, A

    2013-09-01

    The prevalence of mental health problems in the general population should be carefully considered. The literature has reported a high co-morbidity of medical and mental illnesses; therefore, collaborative efforts incorporating psychological services into medical settings are imperative. In Puerto Rico, this is not a regular practice in general hospitals. Improving access to mental health services is a challenge and requires the creation of new venues within the healthcare system. This paper describes the theoretical framework, mission, and objectives of the Clinical Psychology Services Program (CPSP) implemented at Damas Hospital in Puerto Rico. From December 2002 to December 2010, a total of 13,580 visits were made to inpatients in diverse clinical units of the hospital; 61% of all inpatients evaluated met the criteria for at least one mental health disorder based on the DSM-IV-TR. The CPSP's outcomes highlight the acceptance and relevance of incorporating mental health services and clinical psychologists into general hospitals.

  2. Infrastructure for Rapid Development of Java GUI Programs

    NASA Technical Reports Server (NTRS)

    Jones, Jeremy; Hostetter, Carl F.; Wheeler, Philip

    2006-01-01

    The Java Application Shell (JAS) is a software framework that accelerates the development of Java graphical-user-interface (GUI) application programs by enabling the reuse of common, proven GUI elements, as distinguished from writing custom code for GUI elements. JAS is a software infrastructure upon which Java interactive application programs and graphical user interfaces (GUIs) for those programs can be built as sets of plug-ins. JAS provides an application-programming interface that is extensible by application-specific plug-ins that describe and encapsulate both specifications of a GUI and application-specific functionality tied to the specified GUI elements. The desired GUI elements are specified in Extensible Markup Language (XML) descriptions instead of in compiled code. JAS reads and interprets these descriptions, then creates and configures a corresponding GUI from a standard set of generic, reusable GUI elements. These elements are then attached (again, according to the XML descriptions) to application-specific compiled code and scripts. An application program constructed by use of JAS as its core can be extended by writing new plug-ins and replacing existing plug-ins. Thus, JAS solves many problems that Java programmers generally solve anew for each project, thereby reducing development and testing time.
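The pattern JAS describes, declaring GUI elements in an XML descriptor and binding them to compiled application code, can be sketched compactly. The element names, actions, and Python stand-in below are illustrative assumptions; they are not the actual JAS schema or API:

```python
import xml.etree.ElementTree as ET

# Hypothetical plug-in descriptor in the spirit of JAS: the GUI is
# declared in XML instead of compiled code.
descriptor = """
<application name="Demo">
  <toolbar>
    <button label="Open" action="open_file"/>
    <button label="Save" action="save_file"/>
  </toolbar>
</application>
"""

# Application-specific functionality that the descriptor binds to.
actions = {
    "open_file": lambda: "opening...",
    "save_file": lambda: "saving...",
}

def build_gui(xml_text, actions):
    """Interpret the XML description and wire each generic button
    element to its registered application callback."""
    root = ET.fromstring(xml_text)
    buttons = [(b.get("label"), actions[b.get("action")])
               for b in root.iter("button")]
    return root.get("name"), buttons

name, buttons = build_gui(descriptor, actions)
print(name, [(label, handler()) for label, handler in buttons])
```

Extending the application then means shipping a new descriptor plus a new entry in the action registry, with no changes to the generic GUI-building code, which is the reuse the abstract claims.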

  3. Advances through collaboration: sharing seismic reflection data via the Antarctic Seismic Data Library System for Cooperative Research (SDLS)

    USGS Publications Warehouse

    Wardell, N.; Childs, J. R.; Cooper, A. K.

    2007-01-01

    The Antarctic Seismic Data Library System for Cooperative Research (SDLS) has served for the past 16 years under the auspices of the Antarctic Treaty (ATCM Recommendation XVI-12) as a role model for collaboration and equitable sharing of Antarctic multichannel seismic reflection (MCS) data for geoscience studies. During this period, collaboration in MCS studies has advanced deciphering the seismic stratigraphy and structure of Antarctica’s continental margin more rapidly than previously. MCS data compilations provided the geologic framework for scientific drilling at several Antarctic locations and for high-resolution seismic and sampling studies to decipher Cenozoic depositional paleoenvironments. The SDLS successes come from cooperation of National Antarctic Programs and individual investigators in “on-time” submissions of their MCS data. Most do, but some do not. The SDLS community has an International Polar Year (IPY) goal of all overdue MCS data being sent to the SDLS by end of IPY. The community science objective is to compile all Antarctic MCS data to derive a unified seismic stratigraphy for the continental margin – a stratigraphy to be used with drilling data to derive Cenozoic circum-Antarctic paleobathymetry maps and local-to-regional scale paleoenvironmental histories.

  4. Ada (Trade Name) Compiler Validation Summary Report. Harris Corporation, HARRIS Ada Compiler, Version 1.0, Harris H1200 and H800.

    DTIC Science & Technology

    1987-04-30

    Ada Compiler Validation Summary Report, 30 April 1986 to 30 April 1987: Harris Corporation, HARRIS Ada Compiler, Version 1.0, Harris H1200 and H800 (host). Prepared for the United States Government (Ada Joint Program Office).

  5. Ada (Tradename) Compiler Validation Summary Report. Harris Corporation. Harris Ada Compiler, Version 1.0. Harris H700 and H60.

    DTIC Science & Technology

    1986-06-28

    Ada Compiler Validation Summary Report, 28 June 1986 to 28 June 1987: Harris Corporation, HARRIS Ada Compiler, Version 1.0, Harris H700 and H60. Report number AVF-VSR-43.1086.

  6. Beyond School Boundaries: New Health Imperatives, Families and Schools

    ERIC Educational Resources Information Center

    Rich, Emma

    2012-01-01

    This article draws upon research examining the impact of new health imperatives on schools in the United Kingdom. Specifically, it examines features of emerging surveillant relations, which not only speak to the changing nature of health-related practices in schools but have particular currency for broader understandings of theorisations of…

  7. A Call for Strategic Planning: The Two-Year College Imperative.

    ERIC Educational Resources Information Center

    Masoner, David J.; Essex, Nathan L.

    1987-01-01

    Addresses the imperative for strategic and tactical planning to support the viability of the two-year college. Describes a process for approaching strategic planning, comprising the following steps: self-identification, self-analysis, analysis of service area, informed decision making, and the development of a marketing plan. (CBC)

  8. Electronic Commerce, Digital Information, and the Firm.

    ERIC Educational Resources Information Center

    Rosenbaum, Howard

    2000-01-01

    Discussion of the social context of electronic commerce (ecommerce) focuses on information imperatives, or rules that are critical for ecommerce firms. Concludes with a discussion of the organizational changes that can be expected to accompany the incorporation of these imperatives into the mission and core business processes of ecommerce firms.…

  9. Lifelong Learning Imperative in Engineering: Sustaining American Competitiveness in the 21st Century

    ERIC Educational Resources Information Center

    Dutta, Debasish; Patil, Lalit; Porter, James B., Jr.

    2012-01-01

    The Lifelong Learning Imperative (LLI) project was initiated to assess current practices in lifelong learning for engineering professionals, reexamine the underlying assumptions behind those practices, and outline strategies for addressing unmet needs. The LLI project brought together leaders of U.S. industry, academia, government, and…

  10. Understanding Student Learning in Environmental Education in Aotearoa New Zealand

    ERIC Educational Resources Information Center

    Eames, Chris; Barker, Miles

    2011-01-01

    This paper seeks to provide a perspective on environmental education in Aotearoa New Zealand. To contextualise this perspective, it illustrates how environmental, socio-cultural and political imperatives have shaped the development of environmental education in this land. These imperatives illuminate the natural history of the country, the…

  11. The Reflexive Imperative among High-Achieving Adolescents: A Flemish Case Study

    ERIC Educational Resources Information Center

    Van Lancker, Inge

    2016-01-01

    The socio-cultural conditions of late modernity induce a "reflexive imperative" amongst young people, which also results in metapragmatic and metalinguistic behaviour, as has been demonstrated by linguistic ethnographers (LE). However, recent LE studies on reflexivity in Western European settings have mainly focused on how groups of…

  12. Compliance with Requests by Children with Autism: The Impact of Sentence Type

    ERIC Educational Resources Information Center

    Kissine, Mikhail; De Brabanter, Philippe; Leybaert, Jacqueline

    2012-01-01

    This study assesses the extent to which children with autism understand requests performed with grammatically non-imperative sentence types. Ten children with autism were videotaped in naturalistic conditions. Four grammatical sentence types were distinguished: imperative, declarative, interrogative and sub-sentential. For each category, the…

  13. Male Dentists at Midlife: An Exploration of the One Life/One Career Imperative.

    ERIC Educational Resources Information Center

    Born, David O.; Nelson, Bradley J.

    1984-01-01

    Surveyed 172 male dentists to examine the extent to which they felt trapped in a one life/one career imperative. Results showed 84 percent agreed that most professionals can pursue only one career. Dentists undergoing a midlife crisis were less satisfied with their careers. (JAC)

  14. Anchoring Democracy: The Civic Imperative for Higher Education

    ERIC Educational Resources Information Center

    Guarasci, Richard

    2018-01-01

    American colleges and universities are facing a civic imperative. The nation is floundering, if not unwinding. Democracies are vulnerable social constructions. They flourish in eras of social consensus and economic prosperity, when citizens believe in the prospect of brighter futures. They stumble, if not decay, during times of harsh economic…

  15. Ancient Maritime Fish-Traps of Brittany (France): A Reappraisal of the Relationship Between Human and Coastal Environment During the Holocene

    NASA Astrophysics Data System (ADS)

    Langouët, Loïc; Daire, Marie-Yvane

    2009-12-01

    The present-day maritime landscape of Western France forms the geographical framework for a recent research project dedicated to the archaeological study of ancient fish-traps, combining regional-scale and site-scale investigations. Based on the compilation and exploitation of a large unpublished dataset including more than 550 sites, a preliminary synthetic study allows us to present some examples of synchronic and thematic approaches, and propose a morphological classification of the weirs. These encouraging first results open up new perspectives on fish-trap chronology closely linked to wider studies on Holocene sea-level changes.

  16. Software Model Checking Without Source Code

    NASA Technical Reports Server (NTRS)

    Chaki, Sagar; Ivers, James

    2009-01-01

    We present a framework, called AIR, for verifying safety properties of assembly language programs via software model checking. AIR extends the applicability of predicate abstraction and counterexample-guided abstraction refinement to the automated verification of low-level software. By working at the assembly level, AIR allows verification of programs for which source code is unavailable (such as legacy and COTS software) and programs that use features (such as pointers, structures, and object-orientation) that are problematic for source-level software verification tools. In addition, AIR makes no assumptions about the underlying compiler technology. We have implemented a prototype of AIR and present encouraging results on several non-trivial examples.
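    The record above rests on predicate abstraction. As a hedged illustration of the underlying idea (a toy sketch, not AIR's implementation; the program, predicates, and finite domain are invented), concrete states can be mapped to predicate valuations and reachability explored in the abstract state space:

```python
# Toy predicate abstraction: the counter program "x := x + 2" starting at
# x = 0, with safety property "x != 7". Concrete states are abstracted by
# the predicates below, and reachability is explored abstractly.
# Illustrative sketch only, not the AIR tool's implementation.

CONCRETE = range(0, 20)                  # small finite concrete domain
PREDICATES = [
    ("even", lambda x: x % 2 == 0),
    ("lt10", lambda x: x < 10),
]

def alpha(x):
    """Abstraction function: concrete state -> tuple of predicate truth values."""
    return tuple(p(x) for _, p in PREDICATES)

def step(x):
    return x + 2                         # the program's single transition

def abstract_reachable(x0):
    """Existential abstraction: abstract states reachable from alpha(x0)."""
    frontier = {alpha(x0)}
    seen = set(frontier)
    while frontier:
        nxt = set()
        for a in frontier:
            # an abstract state steps to alpha(step(x)) for every concrete x it covers
            for x in CONCRETE:
                if alpha(x) == a and step(x) in CONCRETE:
                    nxt.add(alpha(step(x)))
        frontier = nxt - seen
        seen |= frontier
    return seen

reach = abstract_reachable(0)
# Safety "x != 7": 7 is odd, but every reachable abstract state satisfies
# 'even', so the abstraction proves the property without concrete runs.
assert all(a[0] for a in reach)
```

    In a full CEGAR loop, a spurious abstract counterexample would trigger the addition of new predicates and re-analysis; that refinement step is omitted from this sketch.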

  17. An Earth-Moon System Trajectory Design Reference Catalog

    NASA Technical Reports Server (NTRS)

    Folta, David; Bosanac, Natasha; Guzzetti, Davide; Howell, Kathleen C.

    2014-01-01

    As demonstrated by ongoing concept designs and the recent ARTEMIS mission, there is, currently, significant interest in exploiting three-body dynamics in the design of trajectories for both robotic and human missions within the Earth-Moon system. The concept of an interactive and 'dynamic' catalog of potential solutions in the Earth-Moon system is explored within this paper and analyzed as a framework to guide trajectory design. Characterizing and compiling periodic and quasi-periodic solutions that exist in the circular restricted three-body problem may offer faster and more efficient strategies for orbit design, while also delivering innovative mission design parameters for further examination.

  18. CIL: Compiler Implementation Language

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gries, David

    1969-03-01

    This report is a manual for the proposed Compiler Implementation Language, CIL. It is not an expository paper on the subject of compiler writing or compiler-compilers. The language definition may change as work progresses on the project. It is designed for writing compilers for the IBM 360 computers.

  19. Generating code adapted for interlinking legacy scalar code and extended vector code

    DOEpatents

    Gschwind, Michael K

    2013-06-04

    Mechanisms for intermixing code are provided. Source code is received for compilation using an extended Application Binary Interface (ABI) that extends a legacy ABI and uses a different register configuration than the legacy ABI. First compiled code is generated based on the source code, the first compiled code comprising code for accommodating the difference in register configurations used by the extended ABI and the legacy ABI. The first compiled code and second compiled code are intermixed to generate intermixed code, the second compiled code being compiled code that uses the legacy ABI. The intermixed code comprises at least one call instruction that is one of a call from the first compiled code to the second compiled code or a call from the second compiled code to the first compiled code. The code for accommodating the difference in register configurations is associated with the at least one call instruction.
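    The mechanism in this record inserts glue code at call boundaries so that registers used by the extended ABI survive calls into legacy code. A minimal Python sketch of that idea, with an invented register file and invented conventions (not the patent's actual ABIs), simulates the save/restore thunk:

```python
# Illustrative sketch only: two invented calling conventions. In the
# "extended" ABI, registers v0/v1 are additional callee-saved registers;
# the "legacy" ABI knows nothing about them and may clobber them. A thunk
# at the call boundary saves and restores the extra registers, mirroring
# the patent's "code for accommodating the difference in register
# configurations" associated with each cross-ABI call instruction.

REGISTERS = {"r0": 0, "v0": 10, "v1": 11}    # simulated register file

def legacy_callee():
    # legacy code was compiled without v0/v1 and is free to clobber them
    REGISTERS["v0"] = -1
    REGISTERS["v1"] = -1
    REGISTERS["r0"] = 42                     # its return value

def call_with_thunk(callee, extra_regs=("v0", "v1")):
    """Glue code: preserve extended-ABI registers across a legacy call."""
    saved = {r: REGISTERS[r] for r in extra_regs}   # spill before the call
    callee()
    REGISTERS.update(saved)                          # reload after the call
    return REGISTERS["r0"]

result = call_with_thunk(legacy_callee)
assert result == 42
assert REGISTERS["v0"] == 10 and REGISTERS["v1"] == 11   # preserved
```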

  20. The medico-legal prerequisite for initiating quarantine and isolation practices in public health emergency management in hospitals in Ghana.

    PubMed

    Norman, I D; Aikins, M; Binka, F N

    2011-12-01

    Hospitals and other health facilities in Ghana do not appear to have standardized practices for quarantine and isolation in public health emergency management. This paper reviews the legislative framework governing the medico-legal prerequisites for initiating quarantine and isolation procedures as articulated in the Infectious Disease Act (Cap 78) 1908 (amended 1935), the Quarantine Act (Cap 77) 1915 (amended 1938), the Emergency Powers Act, 1994 (Act 472), and the National Disaster Management Act, 1996 (Act 517), in consonance with the 1992 Constitution of Ghana. The findings show that (1) the legislative framework outlines systematic standards and protocols to be followed when committing a person or persons to quarantine and isolation during public health emergencies; (2) these standards and protocols regard the creation of standardized national templates for initiating quarantine and isolation measures as imperative; and (3) non-compliance with the standards and protocols leaves medical facilities, hospitals, and their personnel vulnerable to the threat of medical malpractice suits and breaches of professional ethics. This paper provides suggestions to hospital administrators and medical personnel on how to develop administrative templates in compliance with the law when managing public health emergencies. It also provides examples of such templates for possible adoption by hospitals and other health administrators.

  1. Ethical Challenges in the Provision of Dialysis in Resource-Constrained Environments.

    PubMed

    Luyckx, Valerie A; Miljeteig, Ingrid; Ejigu, Addisu M; Moosa, M Rafique

    2017-05-01

    The number of patients requiring dialysis by 2030 is projected to double worldwide, with the largest increase expected in low- and middle-income countries (LMICs). Dialysis is seldom considered a high priority by health care funders; consequently, few LMICs develop policies regarding dialysis allocation. Dialysis facilities may exist, but access remains highly inequitable in LMICs. High out-of-pocket payments make dialysis unsustainable and plunge many families into poverty. Patients, families, and clinicians suffer significant emotional and moral distress from daily life-and-death decisions imposed by dialysis. The health system's obligation to provide financial risk protection is an important component of global and national strategies to achieve universal health coverage. An ethical imperative therefore exists to develop transparent dialysis priority-setting guidelines to facilitate public understanding and acceptance of the realistic limits within the health system, and facilitate fair allocation of scarce resources. In this article, we present ethical challenges faced by patients, families, clinicians, and policy makers where dialysis is not universally accessible and discuss the potential ethical consequences of various dialysis allocation strategies. Finally, we suggest an ethical framework for use in policy development for priority setting of dialysis care. The accountability for reasonableness framework is proposed as a procedurally fair decision-making, priority-setting process. Copyright © 2017 Elsevier Inc. All rights reserved.

  2. Patient-centered communication in the era of electronic health records: What does the evidence say?

    PubMed

    Rathert, Cheryl; Mittler, Jessica N; Banerjee, Sudeep; McDaniel, Jennifer

    2017-01-01

    Patient-physician communication is essential for patient-centered health care. Physicians are concerned that electronic health records (EHRs) negatively affect communication with patients. This study identified a framework for understanding communication functions that influence patient outcomes. We then conducted a systematic review of the literature and organized it within the framework to better understand what is known. A comprehensive search of three databases (CINAHL, Medline, PsycINFO) yielded 41 articles for analysis. Results indicated that EHR use improves capture and sharing of certain biomedical information. However, it may interfere with collection of psychosocial and emotional information, and therefore may interfere with development of supportive, healing relationships. Patient access to the EHR and messaging functions may improve communication, patient empowerment, engagement, and self-management. More rigorous examination of EHR impacts on communication functions and their influences on patient outcomes is imperative for achieving patient-centered care. By focusing on the role of communication functions on patient outcomes, future EHRs can be developed to facilitate care. Training alone is likely to be insufficient to address disruptions to communication processes. Processes must be improved, and EHRs must be developed to capture useful data without interfering with physicians' and patients' abilities to effectively communicate. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  3. Monitoring the price and affordability of foods and diets globally.

    PubMed

    Lee, A; Mhurchu, C N; Sacks, G; Swinburn, B; Snowdon, W; Vandevijvere, S; Hawkes, C; L'abbé, M; Rayner, M; Sanders, D; Barquera, S; Friel, S; Kelly, B; Kumanyika, S; Lobstein, T; Ma, J; Macmullan, J; Mohan, S; Monteiro, C; Neal, B; Walker, C

    2013-10-01

    Food prices and food affordability are important determinants of food choices, obesity and non-communicable diseases. As governments around the world consider policies to promote the consumption of healthier foods, data on the relative price and affordability of foods, with a particular focus on the difference between 'less healthy' and 'healthy' foods and diets, are urgently needed. This paper briefly reviews past and current approaches to monitoring food prices, and identifies key issues affecting the development of practical tools and methods for food price data collection, analysis and reporting. A step-wise monitoring framework, including measurement indicators, is proposed. 'Minimal' data collection will assess the differential price of 'healthy' and 'less healthy' foods; 'expanded' monitoring will assess the differential price of 'healthy' and 'less healthy' diets; and the 'optimal' approach will also monitor food affordability, by taking into account household income. The monitoring of the price and affordability of 'healthy' and 'less healthy' foods and diets globally will provide robust data and benchmarks to inform economic and fiscal policy responses. Given the range of methodological, cultural and logistical challenges in this area, it is imperative that all aspects of the proposed monitoring framework are tested rigorously before implementation. © 2013 The Authors. Obesity Reviews published by John Wiley & Sons Ltd on behalf of the International Association for the Study of Obesity.
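    The step-wise indicators proposed in this record reduce to simple arithmetic. A sketch with invented basket costs and income (the function names and figures are illustrative, not from the paper):

```python
# Sketch of the proposed monitoring indicators (all values invented for
# illustration): the price differential between a 'healthy' and a 'less
# healthy' diet basket, and affordability as a share of household income.

def price_differential(healthy_cost, less_healthy_cost):
    """Relative price gap of the healthy diet over the less healthy one."""
    return (healthy_cost - less_healthy_cost) / less_healthy_cost

def affordability(diet_cost, household_income):
    """'Optimal'-level indicator: diet cost as a fraction of household income."""
    return diet_cost / household_income

weekly_healthy, weekly_less_healthy = 160.0, 130.0   # cost of a weekly basket
income = 800.0                                       # weekly household income

diff = price_differential(weekly_healthy, weekly_less_healthy)
share = affordability(weekly_healthy, income)
assert round(diff, 4) == 0.2308   # healthy basket ~23% more expensive
assert share == 0.2               # and costs 20% of household income
```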

  4. Survey of Cyber Crime in Big Data

    NASA Astrophysics Data System (ADS)

    Rajeswari, C.; Soni, Krishna; Tandon, Rajat

    2017-11-01

    Big data involves performing computation and database operations on large amounts of data drawn automatically from the data owner's business. Since a critical strategic benefit of big data is access to information from numerous and varied areas, security and privacy will play an imperative role in big data research and innovation. The limits of standard IT security practices are well known: adversaries can exploit software distribution channels and software developers to incorporate malicious software into applications and operating systems, a genuine and growing risk that is difficult to counter, and whose impact only accelerates with big data. One central issue, then, is whether current security and privacy technology is sufficient to provide controlled assurance for countless direct accesses. For effective use of large-scale data, access to the data of a given domain, or of any other domain, must be authorized. For a long time, trusted-system development has produced a rich set of proven security concepts for dealing with determined adversaries; however, this work has largely been dismissed as "needless excess" by vendors. In this survey, we discuss how big data can take advantage of this mature security and privacy technology, and investigate the remaining research challenges.

  5. "At My Age … ": Defining Sexual Wellness in Mid- and Later Life.

    PubMed

    Syme, Maggie L; Cohn, Tracy J; Stoffregen, Sydney; Kaempfe, Hanna; Schippers, Desiree

    2018-04-18

    Sexual wellness is integral to quality of life across the life span, despite ageist stereotypes suggesting sexual expression ends at midlife. However, conceptualizing sexual wellness in mid- and later life is complicated by a dysfunction-based narrative, lack of a sex-positive aging framework, and existing measures that are age irrelevant and limited in scope. This study aimed to address these limitations by providing a conceptualization of sexual wellness grounded in definitions from midlife and older adults. A sample of 373 midlife and older adults (M = 60, SD = 5.84) in the United States provided a definition of sexual wellness. Using thematic analysis, multiple researchers coded qualitative responses, and results suggested a biopsychosocial-cultural framework. Findings reflect that midlife and older adults provide multifaceted definitions inclusive of various behavioral experiences, including disengaging from sex. They are also keenly aware of physical and psychological limitations and strengths, and emphasize mutual experiences and synchronicity. Midlife and older adults also reflect on age, drawing comparisons to different phases of life and often displaying adaptability in adjusting expectations. When conceptualizing sexual wellness in this population it is imperative to capture this multidimensionality, include those who are not actively engaging in sex, and be aware of the influence of ageist and dys/function narratives.

  6. Moral landscapes and everyday life in families with Huntington's disease: aligning ethnographic description and bioethics.

    PubMed

    Huniche, Lotte

    2011-06-01

    This article is concerned with understanding moral aspects of everyday life in families with Huntington's Disease (HD). It draws on findings from an empirical research project in Denmark in 1998-2002 involving multi-sited ethnography to argue that medical genetics provides a particular framework for conducting life in an HD family, a framework that implies that being informed and making use of genetic services expresses greater moral responsibility than conducting life without drawing on these resources. The moral imperative of engagement in medical genetics is challenged here by two pieces of ethnographic analysis. The first concerns a person who, as described by a family member, does not engage with medical genetics but who brings to the fore other culturally legitimate concerns, priorities and areas of responsibility. The second concerns a genetic counselling session where neither the knowledge nor the imagined solutions of medical genetics are as unproblematic and straightforward as might be thought. To assist our understanding of the moral aspects of living with severe familial disease, the ethnographic analysis is aligned with bioethical reflections that place the concrete concerns of those personally involved centre stage in the development of theoretical stances that aim to assist reflections in practice. Copyright © 2010 Elsevier Ltd. All rights reserved.

  7. Development, Theoretical Framework, and Outcome Evaluation from Implementation of a Parent and Teacher-Delivered Adolescent Intervention on Adolescent Vaccination

    PubMed Central

    Gargano, Lisa M.; Herbert, Natasha L.; Painter, Julia E.; Sales, Jessica M.; Vogt, Tara M.; Morfaw, Christopher; Jones, LaDawna M.; Murray, Dennis; DiClemente, Ralph J.; Hughes, James M.

    2017-01-01

    The Advisory Committee on Immunization Practices recommended immunization schedule for adolescents includes three vaccines (Tdap, HPV, and MCV4) and annual influenza vaccination. Given the increasing number of recommended vaccines for adolescents and health and economic costs associated with non-vaccination, it is imperative that effective strategies for increasing vaccination rates among adolescents be developed. This article describes the development, theoretical framework, and initial first-year evaluation of an intervention designed to promote vaccine acceptance among a middle- and high-school based sample of adolescents and their parents in eastern Georgia. Adolescents, parents, and teachers were active participants in the development of the intervention. The intervention, which consisted of a brochure for parents and a teacher-delivered curriculum for adolescents, was guided by constructs from the Health Belief Model and Theory of Reasoned Action. Evaluation results indicated that our intervention development methods were successful in creating a brochure that met cultural relevance and literacy needs of parents. We also demonstrated an increase in student knowledge of and attitudes toward vaccines. To our knowledge, this study is the first to extensively engage middle- and high-school students, parents, and teachers in the design and implementation of key theory-based educational components of a school-based, teacher-delivered adolescent vaccination intervention. PMID:24440920

  8. The potential of ecological theory for building an integrated framework to develop the public health contribution of health visiting.

    PubMed

    Bryans, Alison; Cornish, Flora; McIntosh, Jean

    2009-11-01

    In line with recent UK and Scottish policy imperatives, there is increasing pressure for the health visiting service to assume an enhanced role in improving public health. Although health visiting has so far maintained its unique position as a primarily preventive service within the UK health service, its distinctive contribution now appears under threat. The continuing absence of a comprehensive and integrated conceptual basis for practice has a negative impact on the profession's ability to respond to current challenges. Establishing an integrative framework to conceptualise health visiting practice would enable more sensitive, focused and appropriate research, education and evaluation in relation to practice. Work in this area could thus usefully contribute to the future development of the service at a difficult time. Our paper aims to make such a contribution. In support of our conceptual aims, we draw on a study of health visiting practice undertaken within a large conurbation in central Scotland. The study used a mixed method, collaborative approach involving 12 audio-recorded and observed health visitor (HV)-client interactions, semi-structured interviews with the 12 HVs and 12 clients, examination of related documentation and workshops with the HV participants. We critically consider prevalent models of health visiting practice and describe the more integrative conceptual approach provided by Bronfenbrenner's ecological, 'person-in-context' framework. The paper subsequently explores relationships between this framework and understandings of need demonstrated by health visitors who participated in our study. Current policy emphasises the need to focus on public health and social inclusion in order to improve health. However, if this policy is to be translated into practice, we must develop a more adequate understanding of how practitioners work effectively with families and individuals in a sensitive and context-specific manner. Bronfenbrenner's framework appears to offer a promising means of building on the current strengths of the health visiting service to further develop a 'person-in-context' approach to health improvement that is mindful of and responsive to multiple, inter-related influences on health. We therefore recommend further research to directly test the utility of this framework.

  9. Transition through co-optation: Harnessing carbon democracy for clean energy

    NASA Astrophysics Data System (ADS)

    Meng, Kathryn-Louise

    This dissertation explores barriers to a clean energy transition in the United States. Clean energy is demonstrably viable, yet the pace of clean energy adoption in the U.S. is slow, particularly given the immediate threat of global climate change. The purpose of this dissertation is to examine the factors inhibiting a domestic energy transition and to propose pragmatic approaches to catalyzing a transition. The first article examines the current political-economic and socio-technical energy landscape in the U.S. Fossil fuels are central to the functioning of the American economy. Given this centrality, constellations of power have been constructed around the reliable and affordable access of fossil fuels. The fossil fuel energy regime is comprised of: political-economic networks with vested interests in continued fossil fuel reliance, and fixed infrastructure that is minimally compatible with distributed generation. A transition to clean energy threatens the profitability of fossil fuel regime actors. Harnessing structural critiques from political ecology and process and function-oriented socio-technical systems frameworks, I present a multi-level approach to identifying pragmatic means to catalyzing an energy transition. High-level solutions confront the existing structure, mid-level solutions harness synergy with the existing structure, and low-level solutions lie outside of the energy system or foster the TIS. This is exemplified using a case study of solar development in Massachusetts. Article two presents a case study of the clean energy technological innovation system (TIS) in Massachusetts. I examine the actors and institutions that support cleantech development. Further, I scrutinize the actors and institutions that help sustain the TIS support system. The concept of a catalyst is presented; a catalyst is an actor that serves to propel TIS functions. Catalysts are critical to facilitating anchoring. Strategic corporate partners are identified as powerful catalysts that can help infuse capital into the TIS, propel TIS functions, and facilitate anchoring to the socio-technical regime and landscape. In the final article I argue that the environmental narrative that traditionally frames the need for clean energy is ineffective. Environmental narratives are antagonistic towards powerful actors and institutions discussed in the first article. Such antagonism can impede the development of clean energy incentives, decelerating a transition to clean energy. The need for clean energy can be reframed according to a security discourse. I demonstrate the compatibility between clean energy development and national security imperatives and argue that security imperatives are more likely to receive legislative and financial support than environmental imperatives. Ultimately I argue that geographers can find utility in the very structures, institutions, and actors that they critique. Capitalist imperatives of profit and growth can be harnessed so as to appeal to strategic corporate partners. The military, its budget, industrial complex, and research and development resources can in fact be beneficial to developing clean energy domestically.

  10. The Online Learning Imperative: A Solution to Three Looming Crises in Education

    ERIC Educational Resources Information Center

    Wise, Bob

    2010-01-01

    Currently, K-12 education in the United States is dealing with three major challenges: (1) global skill demands versus educational attainment; (2) the funding cliff; and (3) a looming teacher shortage. Independently, these factors present significant challenges. In combination, they create a national imperative for swift action to create a more…

  11. Feminist Imperative(s) in Music and Education: Philosophy, Theory, or What Matters Most

    ERIC Educational Resources Information Center

    Gould, Elizabeth

    2011-01-01

    A historically feminized profession, education in North America remains remarkably unaffected by feminism, with the notable exception of pedagogy and its impact on curriculum. The purpose of this paper is to describe characteristics of feminism that render it particularly useful and appropriate for developing potentialities in education and music…

  12. Humanism as Moral Imperative: Comments on the Role of Knowing in the Helping Encounter

    ERIC Educational Resources Information Center

    Hansen, James T.

    2006-01-01

    Counseling orientations are redescribed in terms of the relative importance they place on knowing. This epistemological redescription results in a reconsideration of the role of humanism. Specifically, rather than a treatment orientation, the author argues that humanism should be considered a moral imperative. Implications of this conclusion for…

  13. The Education of the Categorical Imperative

    ERIC Educational Resources Information Center

    Johnston, James Scott

    2006-01-01

    In this article, I examine anew the moral philosophy of Immanuel Kant and its contributions to educational theory. I make four claims. First, that Kant should be read as having the Categorical Imperative develop out of subjective maxims. Second, that moral self-perfection is the aim of moral education. Third, that moral self-perfection develops by…

  14. Towards a Certified Lightweight Array Bound Checker for Java Bytecode

    NASA Technical Reports Server (NTRS)

    Pichardie, David

    2009-01-01

    Dynamic array bound checks are crucial to the security of a Java Virtual Machine. These dynamic checks are, however, expensive, and several static analysis techniques have been proposed to eliminate explicit bounds checks. Such analyses require advanced numerical and symbolic manipulations that (1) penalize bytecode loading or dynamic compilation, and (2) complicate the trusted computing base. Following the Foundational Proof Carrying Code methodology, our goal is to provide a lightweight bytecode verifier for eliminating array bound checks that is both efficient and trustable. In this work, we define a generic relational program analysis for an imperative, stack-oriented bytecode language with procedures, arrays and global variables, and instantiate it with a relational abstract domain of polyhedra. The analysis features automatic inference of loop invariants and method pre-/post-conditions, and efficient checking of analysis results by a simple checker. Invariants, which can be large, can be specialized for proving a safety policy using an automatic pruning technique which reduces their size. The result of the analysis can be checked efficiently by annotating the program with parts of the invariant together with certificates of polyhedral inclusions. The resulting checker is sufficiently simple to be entirely certified within the Coq proof assistant for a simple fragment of the Java bytecode language. During the talk, we will also report on our ongoing effort to scale this approach to the full sequential JVM.
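    The record above eliminates bound checks by inferring numerical invariants. As a much-simplified sketch (interval reasoning on a single counting loop, not the paper's polyhedral domain; all names are invented), the analysis proves once that every access is in bounds:

```python
# Minimal sketch of static bound-check elimination: infer an interval
# invariant for a loop counter and verify once that every access a[i]
# is in bounds, instead of checking on every iteration.

def analyze_counting_loop(init, bound, array_len):
    """For the loop `for (i = init; i < bound; i++) use a[i];`, return the
    interval of i at the access and whether a[i] is provably safe."""
    if init >= bound:
        return None, True          # body never runs: trivially safe
    lo, hi = init, bound - 1       # invariant: init <= i <= bound - 1 in the body
    safe = (lo >= 0) and (hi < array_len)
    return (lo, hi), safe

# The analysis proves i in [0, 9], so all accesses into an array of
# length 10 are safe and the per-iteration dynamic check can be removed.
interval, safe = analyze_counting_loop(init=0, bound=10, array_len=10)
assert interval == (0, 9) and safe

# Here the upper bound exceeds the array length: the check must stay.
_, safe2 = analyze_counting_loop(init=0, bound=11, array_len=10)
assert not safe2
```

    A certified checker, as in the record, would not re-run the inference; it would only verify inclusions between the annotated invariants, which is far cheaper.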

  15. A framework for teaching medical students and residents about practice-based learning and improvement, synthesized from a literature review.

    PubMed

    Ogrinc, Greg; Headrick, Linda A; Mutha, Sunita; Coleman, Mary T; O'Donnell, Joseph; Miles, Paul V

    2003-07-01

    To create a framework for teaching the knowledge and skills of practice-based learning and improvement to medical students and residents based on proven, effective strategies. The authors conducted a Medline search of English-language articles published between 1996 and May 2001, using the term "quality improvement" (QI), and cross-matched it with "medical education" and "health professions education." A thematic-synthesis method of review was used to compile the information from the articles. Based on the literature review, an expert panel recommended educational objectives for practice-based learning and improvement. Twenty-seven articles met the inclusion criteria. The majority of studies were conducted in academic medical centers and medical schools and 40% addressed experiential learning of QI. More than 75% were qualitative case reports capturing educational outcomes, and 7% included an experimental study design. The expert panel integrated data from the literature review with the Dreyfus model of professional skill acquisition, the Institute for Healthcare Improvement's (IHI) knowledge domains for improving health care, and the ACGME competencies and generated a framework of core educational objectives about teaching practice-based learning and improvement to medical students and residents. Teaching the knowledge and skills of practice-based learning and improvement to medical students and residents is a necessary and important foundation for improving patient care. The authors present a framework of learning objectives-informed by the literature and synthesized by the expert panel-to assist educational leaders when integrating these objectives into a curriculum. This framework serves as a blueprint to bridge the gap between current knowledge and future practice needs.

  16. Ada (Trade Name) Compiler Validation Summary Report: Harris Corporation Harris Ada Compiler, Version 1.3 Harris HCX-7.

    DTIC Science & Technology

    1987-06-03

    Ada Compiler Validation Summary Report: Harris Corporation Harris Ada Compiler, Version 1.3, Harris HCX-7. Host and target: Harris HCX-7. Completion of on-site testing: 3 June 1987.

  17. Large-area mapping of biodiversity

    USGS Publications Warehouse

    Scott, J.M.; Jennings, M.D.

    1998-01-01

    The age of discovery, description, and classification of biodiversity is entering a new phase. In responding to the conservation imperative, we can now supplement the essential work of systematics with spatially explicit information on species and assemblages of species. This is possible because of recent conceptual, technical, and organizational progress in generating synoptic views of the earth's surface and a great deal of its biological content, at multiple scales of thematic as well as geographic resolution. The development of extensive spatial data on species distributions and vegetation types provides us with a framework for: (a) assessing what we know and where we know it at meso-scales, and (b) stratifying the biological universe so that higher-resolution surveys can be more efficiently implemented, covering, for example, geographic adequacy of specimen collections, population abundance, reproductive success, and genetic dynamics. The land areas involved are very large, and the questions, such as resolution, scale, classification, and accuracy, are complex. In this paper, we provide examples from the United States Gap Analysis Program on the advantages and limitations of mapping the occurrence of terrestrial vertebrate species and dominant land-cover types over large areas as joint ventures and in multi-organizational partnerships, and how these cooperative efforts can be designed to implement results from data development and analyses as on-the-ground actions. Clearly, new frameworks for thinking about biogeographic information as well as organizational cooperation are needed if we are to have any hope of documenting the full range of species occurrences and ecological processes in ways meaningful to their management. The Gap Analysis experience provides one model for achieving these new frameworks.
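    The core gap-analysis overlay described in this record (intersecting mapped species distributions with land stewardship) can be sketched on a shared grid; the grids below are invented illustration data, not Gap Analysis Program outputs:

```python
# Illustrative sketch of the gap-analysis overlay: intersect a predicted
# species distribution with protected-area stewardship on a shared grid
# and report the fraction of the range under protection.

def protected_fraction(species_range, protected):
    """Both inputs are same-shaped 0/1 grids (lists of lists)."""
    range_cells = prot_cells = 0
    for row_r, row_p in zip(species_range, protected):
        for r, p in zip(row_r, row_p):
            range_cells += r
            prot_cells += r and p
    return prot_cells / range_cells if range_cells else 0.0

species = [[1, 1, 0],     # 1 = predicted species occurrence in the cell
           [1, 1, 0],
           [0, 1, 1]]
reserves = [[0, 1, 1],    # 1 = cell under protected stewardship
            [0, 0, 0],
            [0, 1, 0]]

frac = protected_fraction(species, reserves)
assert abs(frac - 2 / 6) < 1e-9   # 2 of the 6 occupied cells are protected
```

    Species whose protected fraction falls below a chosen threshold are the "gaps" that the program flags for conservation attention.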

  18. Addressing governance challenges in the provision of animal health services: A review of the literature and empirical application of transaction cost theory.

    PubMed

    Ilukor, John; Birner, Regina; Nielsen, Thea

    2015-11-01

    Providing adequate animal health services to smallholder farmers in developing countries has remained a challenge, in spite of various reform efforts during the past decades. Past reforms focused on market failures in deciding what the public sector, the private sector, and the "third sector" (the community-based sector) should do with regard to providing animal health services. However, such frameworks have paid limited attention to the governance challenges inherent in the provision of animal health services. This paper presents a framework for analyzing institutional arrangements for providing animal health services that focuses not only on market failures but also on governance challenges, such as elite capture and absenteeism of staff. As an analytical basis, Williamson's discriminating alignment hypothesis is applied to assess the cost-effectiveness of different institutional arrangements for animal health services in view of both market failures and governance challenges. This framework is used to generate testable hypotheses on the appropriateness of different institutional arrangements for providing animal health services, depending on context-specific circumstances. Data from Uganda and Kenya on clinical veterinary services are used to provide an empirical test of these hypotheses and to demonstrate the application of Williamson's transaction cost theory to veterinary service delivery. The paper concludes that strong public sector involvement, especially in building and strengthening a synergistic relation-based referral arrangement between paraprofessionals and veterinarians, is imperative for improving animal health service delivery in developing countries. Copyright © 2015 Elsevier B.V. All rights reserved.

  19. From Data Processing to Mental Organs: An Interdisciplinary Path to Cognitive Neuroscience

    PubMed Central

    Patharkar, Manoj

    2011-01-01

    The human brain is a highly evolved coordinating mechanism in the species Homo sapiens. Only in the last 100 years has extensive knowledge of the intricate structure and complex functioning of the human brain been acquired, though a lot is yet to be known. However, from the beginning of civilisation, people have been conscious of a ‘mind’, which has been considered the origin of all scientific and cultural development. Philosophers have discussed at length the various attributes of consciousness. At the same time, most philosophical or scientific frameworks have directly or indirectly implied mind-body duality. It is now imperative that we develop an integrated approach to understand the interconnection between mind and consciousness on one hand and brain on the other. This paper begins with the proposition that the structure of the brain is analogous, at least to a certain extent, to that of a computer system. Of course, it is much more sophisticated and complex. The second proposition is that the Chomskyean concept of ‘mental organs’ is a good working hypothesis that tries to characterise this complexity in terms of an innate cognitive framework. By following this dual approach, brain as a data processing system and brain as a superstructure of intricately linked mental organs, we can move toward a better understanding of ‘mind’ within the framework of empirical science. The one ‘mental organ’ studied extensively in Chomskyean terms is the ‘language faculty’, which is unique in its relation to brain, mind and consciousness. PMID:21694973

  20. A Needs-led Framework for Understanding the Impact of Caring for a Family Member With Dementia

    PubMed Central

    Pini, Simon; Ingleson, Emma; Megson, Molly; Wright, Penny; Oyebode, Jan R

    2018-01-01

    Abstract Background and Objectives Approximately half the care for people with dementia is provided by families. It is therefore imperative that research informs ways of maintaining such care. In this study, we propose that a needs-led approach can provide a useful, novel means of conceptualizing the impact of caring on the lives of family carers. Our aim was to develop and present a needs-led framework for understanding how providing care impacts on carers’ fulfilment of needs. Design and Methods In this qualitative study, we conducted 42 semistructured interviews with a purposively diverse sample of family carers to generate nuanced contextualized accounts of how caring impacted on carers’ lives. Our inductive thematic analysis focused upon asking: “What need is being impacted here?” in order to generate a needs-led framework for understanding. Results Nine themes were widely endorsed. Each completed the sentence: “Being a carer impacts on fulfilling my need to/for….”: Freedom; feel close to my relative; feel in control of my life; be my own person; protect my relative; share/express my thoughts and feelings; take care of myself; feel connected to the people around me; get things done. Discussion and Implications These needs echo those from other research areas, with relational needs emerging as particularly central. The needs-led approach offers a perspective that is able to capture both stresses and positive aspects of caregiving. We recommend that clinical interviewing using Socratic questioning to discover human needs that are being impacted by caring would provide a valuable starting point for care planning. PMID:29562360

  1. Dissociating the influence of response selection and task anticipation on corticospinal suppression during response preparation.

    PubMed

    Duque, Julie; Labruna, Ludovica; Cazares, Christian; Ivry, Richard B

    2014-12-01

    Motor behavior requires selecting between potential actions. The role of inhibition in response selection has frequently been examined in tasks in which participants are engaged in some advance preparation prior to the presentation of an imperative signal. Under such conditions, inhibition could be related to processes associated with response selection, or to more general inhibitory processes that are engaged in high states of anticipation. In Experiment 1, we manipulated the degree of anticipatory preparation. Participants performed a choice reaction time task that required choosing between a movement of the left or right index finger, and used transcranial magnetic stimulation (TMS) to elicit motor evoked potentials (MEPs) in the left hand agonist. In high anticipation blocks, a non-informative cue (e.g., fixation marker) preceded the imperative; in low anticipation blocks, there was no cue and participants were required to divide their attention between two tasks to further reduce anticipation. MEPs were substantially reduced before the imperative signal in high anticipation blocks. In contrast, in low anticipation blocks, MEPs remained unchanged before the imperative signal but showed a marked suppression right after the onset of the imperative. This effect occurred regardless of whether the imperative had signalled a left or right hand response. After this initial inhibition, left MEPs increased when the left hand was selected and remained suppressed when the right hand was selected. We obtained similar results in Experiment 2 except that the persistent left MEP suppression when the left hand was not selected was attenuated when the alternative response involved a non-homologous effector (right foot). These results indicate that, even in the absence of an anticipatory period, inhibitory mechanisms are engaged during response selection, possibly to prevent the occurrence of premature and inappropriate responses during a competitive selection process. 
Copyright © 2014 Elsevier Ltd. All rights reserved.

  2. [Comparison of the compilation features of Science of Meridians and Acupoints among different editions].

    PubMed

    Chen, Xiaojun

    The compilation features of Jingluo Shuxue Xue (Science of Meridians and Acupoints) across different editions were summarized and analyzed. Jingluo Xue (Science of Meridians) and Shuxue Xue (Science of Acupoints), published by Shanghai Scientific and Technical Publishers in 1984, were the pioneering textbooks for bachelor-degree education in the acupuncture discipline, although the 1996 editions drew considerable controversy. These two books were combined into one, titled Science of Meridians and Acupoints, 2013 edition, published by China Press of Traditional Chinese Medicine; concise and coherent in content, it is regarded as a milestone in the history of textbook compilation. This book was re-edited in 2007 without major changes in content. The 2009 edition, published by Shanghai Scientific and Technical Publishers, was substantially revised on the basis of the earlier editions, but unfortunately it had little impact in China. The 2012 edition published by China Press of Traditional Chinese Medicine integrated the achievements of the previous editions and added innovations, characterized by preciseness and conciseness. By contrast, the 2012 edition published by People's Medical Publishing House was accomplished through simple modification of the 2003 and 2007 editions, without great innovation. Regarding the ongoing publication of the textbooks under "the 13th five-year plan", it is proposed that the new edition should maintain the general framework of "the 12th five-year plan" edition, with a few questions revised appropriately. Additionally, "less words, more illustration" should be the basic principle for the revision of the new edition.

  3. A global compilation of coral sea-level benchmarks: Implications and new challenges

    NASA Astrophysics Data System (ADS)

    Medina-Elizalde, Martín

    2013-01-01

    I present a quality-controlled compilation of sea-level data from U-Th dated corals, encompassing 30 studies of 13 locations around the world. The compilation contains relative sea level (RSL) data from each location based on both conventional and open-system U-Th ages. I have applied a commonly used age quality-control criterion based on the initial 234U/238U activity ratios of corals in order to select reliable ages and to reconstruct sea-level histories for the last 150,000 yr. This analysis reveals scatter of RSL estimates among coeval coral benchmarks both within individual locations and between locations, particularly during Marine Isotope Stage (MIS) 5a and the glacial inception following the last interglacial. The character of the data scatter during these time intervals implies that uncertainties still exist regarding tectonics, glacio-isostasy, U-series dating, and/or coral position. To elucidate robust underlying patterns, with confidence limits, I performed a Monte Carlo-style statistical analysis of the compiled coral data considering appropriate age and sea-level uncertainties. By its nature, such an analysis tends to smooth or obscure millennial-scale (and finer) details that may be important in individual datasets, and to favour the major underlying patterns that are supported by all datasets. This statistical analysis thus serves to illustrate major trends that are statistically robust ('what we know'), trends that are suggested but still supported by few data ('what we might know, subject to the addition of more supporting data and improved corrections'), and patterns/data that are clear outliers ('unlikely to be realistic given the rest of the global data and possibly needing further adjustments').
Prior to the last glacial maximum and with the possible exception of the 130-120 ka period, available coral data generally have insufficient temporal resolution and unexplained scatter, which hinders identification of a well-defined pattern with usefully narrow confidence limits. This analysis thus provides a framework that objectively identifies critical targets for new data collection, improved corrections, and integration of coral data with independent, stratigraphically continuous methods of sea-level reconstruction.
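    A Monte Carlo treatment of this kind can be sketched as follows. The benchmark values, uncertainty magnitudes, and percentile choices below are illustrative assumptions, not the study's actual dataset or code:

```python
import random

def monte_carlo_rsl(records, n_iter=2000, seed=42):
    """Perturb each coral benchmark within its stated age and RSL
    uncertainties (Gaussian), returning an ensemble of realizations."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n_iter):
        realization = [(rng.gauss(age, age_err), rng.gauss(rsl, rsl_err))
                       for age, age_err, rsl, rsl_err in records]
        samples.append(realization)
    return samples

def envelope(samples, t_lo, t_hi):
    """Median and 95% band of RSL for realizations in [t_lo, t_hi) ka."""
    vals = sorted(rsl for real in samples
                  for age, rsl in real if t_lo <= age < t_hi)
    if not vals:
        return None
    return (vals[int(0.025 * len(vals))],
            vals[len(vals) // 2],
            vals[int(0.975 * len(vals))])

# Hypothetical benchmarks: (age ka, age error ka, RSL m, RSL error m)
records = [(125.0, 1.0, 2.0, 1.5), (124.0, 1.2, 4.0, 2.0), (82.0, 1.5, -20.0, 3.0)]
samples = monte_carlo_rsl(records)
print(envelope(samples, 120.0, 130.0))
```

    The envelope naturally widens where benchmarks are sparse or their uncertainties are large, which is how such an analysis distinguishes robust trends from poorly constrained ones.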

  4. Ecoregions of California

    USGS Publications Warehouse

    Griffith, Glenn E.; Omernik, James M.; Smith, David W.; Cook, Terry D.; Tallyn, Ed; Moseley, Kendra; Johnson, Colleen B.

    2016-02-23

    Ecoregions denote areas of general similarity in ecosystems and in the type, quality, and quantity of environmental resources. They are designed to serve as a spatial framework for the research, assessment, management, and monitoring of ecosystems and ecosystem components. By recognizing the spatial differences in the capacities and potentials of ecosystems, ecoregions stratify the environment by its probable response to disturbance (Bryce and others, 1999). These general purpose regions are critical for structuring and implementing ecosystem management strategies across Federal agencies, State agencies, and nongovernment organizations that are responsible for different types of resources in the same geographical areas (Omernik and others, 2000).The approach used to compile this map is based on the premise that ecological regions are hierarchical and can be identified through the analysis of the spatial patterns and the composition of biotic and abiotic phenomena that affect or reflect differences in ecosystem quality and integrity (Wiken, 1986; Omernik, 1987, 1995). These phenomena include geology, physiography, vegetation, climate, soils, land use, wildlife, and hydrology. The relative importance of each characteristic varies from one ecological region to another regardless of the hierarchical level. A Roman numeral hierarchical scheme has been adopted for different levels of ecological regions. Level I is the coarsest level, dividing North America into 15 ecological regions. Level II divides the continent into 50 regions (Commission for Environmental Cooperation Working Group, 1997, map revised 2006). At level III, the continental United States contains 105 ecoregions and the conterminous United States has 85 ecoregions (U.S. Environmental Protection Agency, 2013). Level IV, depicted here for California, is a further refinement of level III ecoregions. 
    Explanations of the methods used to define these ecoregions are given in Omernik (1995), Omernik and others (2000), and Omernik and Griffith (2014). California has great ecological and biological diversity. The State contains offshore islands and coastal lowlands, large alluvial valleys, forested mountain ranges, deserts, and various aquatic habitats. There are 13 level III ecoregions and 177 level IV ecoregions in California and most continue into ecologically similar parts of adjacent States of the United States or Mexico (Bryce and others, 2003; Thorson and others, 2003; Griffith and others, 2014). The California ecoregion map was compiled at a scale of 1:250,000. It revises and subdivides an earlier national ecoregion map that was originally compiled at a smaller scale (Omernik, 1987; U.S. Environmental Protection Agency, 2013). This poster is the result of a collaborative project primarily between U.S. Environmental Protection Agency (USEPA) Region IX, USEPA National Health and Environmental Effects Research Laboratory (Corvallis, Oregon), California Department of Fish and Wildlife (DFW), U.S. Department of Agriculture (USDA)–Natural Resources Conservation Service (NRCS), U.S. Department of the Interior–Geological Survey (USGS), and other State of California agencies and universities. The project is associated with interagency efforts to develop a common framework of ecological regions (McMahon and others, 2001). Reaching that objective requires recognition of the differences in the conceptual approaches and mapping methodologies applied to develop the most common ecoregion-type frameworks, including those developed by the USDA–Forest Service (Bailey and others, 1994; Miles and Goudy, 1997; Cleland and others, 2007), the USEPA (Omernik 1987, 1995), and the NRCS (U.S. Department of Agriculture–Soil Conservation Service, 1981; U.S. Department of Agriculture–Natural Resources Conservation Service, 2006).
As each of these frameworks is further refined, their differences are becoming less discernible. Regional collaborative projects such as this one in California, where some agreement has been reached among multiple resource-management agencies, are a step toward attaining consensus and consistency in ecoregion frameworks for the entire nation.

  5. Intercultural Education as an Imperative of Social Development

    ERIC Educational Resources Information Center

    Baça, Ferit

    2015-01-01

    Implementing intercultural education is an imperative for social coexistence among different groups of people. In these circumstances, school is the most important place and factor for pupils and students, as future citizens, to acquire their first knowledge of society, life and coexistence in a given country. On the other hand,…

  6. Generic Skills for Graduate Accountants: The Bigger Picture, a Social and Economic Imperative in the New Knowledge Economy

    ERIC Educational Resources Information Center

    Bunney, Diane; Sharplin, Elaine; Howitt, Christine

    2015-01-01

    The case for integrating generic skills in university accounting programmes is well documented in the literature, but the implementation of strategies designed to teach generic skills in the context of accounting courses has posed ongoing challenges for academics and course administrators. The imperative for generic skills in accounting programmes…

  7. Learning Organization Models and Their Application to the U.S. Army

    DTIC Science & Technology

    2016-06-01

    The models discussed include David Garvin’s building blocks of a learning organization, Michael Marquardt’s systems-linked learning organization (Marquardt, 1996), and Karen Watkins’ and Victoria Marsick’s learning organization action imperatives (Marsick and Watkins, 1999). While different, these models agree on several components, including reduced bureaucracy and hierarchy.

  8. The Counterproliferation Imperative: Meeting Tomorrow’s Challenges

    DTIC Science & Technology

    2001-11-01

    Report (November 2001) discussing counterproliferation capabilities, including biodefense milestones: a multivalent (western equine encephalitis/eastern equine encephalitis) vaccine, a multiagent vaccine delivery system, the Portable Common Diagnostic System, a licensed new plague vaccine, a licensed new Venezuelan Equine Encephalomyelitis (VEE) vaccine, and a licensed multivalent equine encephalitis vaccine.

  9. Educational-Methodical Projects for Students' Intellectual Competences Formation: The Imperative Goal of the Educational Process of the University

    ERIC Educational Resources Information Center

    Kutuev, Ruslan A.; Kudyasheva, Albina N.; Buldakova, Natalya V.; Aleksandrova, Natalia S.; Vasilenko, Alexandra S.

    2016-01-01

    The research urgency is caused by the tendencies of the modern information society which produces and consumes intelligence, knowledge and competences as the main educational product of labor market. These trends fundamentally alter the methodological basis of the educational process of the University, subjecting it to imperative goals: the…

  10. New Capabilities for Cyber Charter School Leadership: An Emerging Imperative for Integrating Educational Technology and Educational Leadership Knowledge

    ERIC Educational Resources Information Center

    Kowch, Eugene

    2009-01-01

    Cyber charter schools (CCS) and cyber schools may soon become the most "disruptive innovation" in the education system (Christensen, Horn & Johnson, 2008) so the author urges educational technologists to take up the imperative to develop new administration knowledge among the students along with educational technology skills to support future…

  11. Discourse, the Moral Imperative and Faraday's Candle

    ERIC Educational Resources Information Center

    Melville, Wayne

    2013-01-01

    This commentary considers two lines of inquiry into the work of Ideland and Malmberg: the role of discourse in shaping teachers' responses to Roberts' (2011) Visions of Science and the moral imperatives that will accompany any shifts between Vision I and II. Vision I of science has accreted to itself great power and prestige, both of which shape…

  12. Responding to a Relevance Imperative in School Science and Mathematics: Humanising the Curriculum through Story

    ERIC Educational Resources Information Center

    Darby-Hobbs, Linda

    2013-01-01

    There has been a recent push to reframe curriculum and pedagogy in ways that make school more meaningful and relevant to students' lives and perceived needs. This "relevance imperative" is evident in contemporary rhetoric surrounding quality education, and particularly in relation to the junior secondary years where student disengagement with…

  13. Imperatives of Information and Communication Technology (ICT) for Second Language Learners and Teachers

    ERIC Educational Resources Information Center

    Akinwamide, Timothy Kolade

    2012-01-01

    The introduction of information and communication technology (ICT) to education creates new learning paradigms. We live in a world that technology has reduced to a global village, and breakthroughs in technology are underpinning pedagogical submissions. It may therefore become imperative to rethink how to ameliorate the…

  14. Water in the Great Basin region; Idaho, Nevada, Utah, and Wyoming

    USGS Publications Warehouse

    Price, Don; Eakin, Thomas E.

    1974-01-01

    The Great Basin Region is defined to include the drainage of the Great Basin physiographic section (Fenneman, 1931) in Idaho, Nevada, Utah, and Wyoming. In October 1966, the President’s Water Resources Council requested that a comprehensive framework study be made in the Great Basin Region under the leadership of the Pacific Southwest Interagency Committee. The study, which included evaluation of the water resources of the region and guidelines for future study and development, was completed June 30, 1971. Results of the study received limited distribution. The purpose of this atlas is to make available to the public the hydrologic data (including a general appraisal) that were compiled for the comprehensive framework study. Most of the work was done by a water-resources work group consisting of members from several Federal and State agencies under the chairmanship of Thomas E. Eakin of the U.S. Geological Survey. This atlas contains some data not included in the framework study. The data presented herein are reconnaissance in nature and should be used with discretion. The maps are highly generalized and are intended only to illustrate the regional distribution of the supply and general chemical quality of the water. Sources of more detailed information on the hydrology of specific parts of the Great Basin region are listed in the selected references.

  15. In-Memory Graph Databases for Web-Scale Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Castellana, Vito G.; Morari, Alessandro; Weaver, Jesse R.

    RDF databases have emerged as one of the most relevant ways of organizing, integrating, and managing exponentially growing, often heterogeneous, and not rigidly structured data for a variety of scientific and commercial fields. In this paper we discuss the solutions integrated in GEMS (Graph database Engine for Multithreaded Systems), a software framework for implementing RDF databases on commodity, distributed-memory high-performance clusters. Unlike the majority of current RDF databases, GEMS has been designed from the ground up to primarily employ graph-based methods. This is reflected in all the layers of its stack. The GEMS framework is composed of: a SPARQL-to-C++ compiler, a library of data structures and related methods to access and modify them, and a custom runtime providing lightweight software multithreading, network message aggregation, and a partitioned global address space. We provide an overview of the framework, detailing its components and how they have been closely designed and customized to address issues of graph methods applied to large-scale datasets on clusters. We discuss in detail the principles that enable automatic translation of queries (expressed in SPARQL, the query language of choice for RDF databases) to graph methods, and identify differences with respect to other RDF databases.
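    The central idea, lowering a SPARQL basic graph pattern into graph explorations, can be illustrated with a toy sketch. This is written in Python rather than the framework's actual C++ output, and the graph data, function names, and query are invented for illustration:

```python
# Toy RDF graph as adjacency: subject -> predicate -> set of objects.
graph = {
    "alice": {"knows": {"bob"}, "worksAt": {"acme"}},
    "bob": {"knows": {"carol"}, "worksAt": {"acme"}},
}

def match_pattern(graph, patterns):
    """Evaluate a conjunctive triple pattern (variables start with '?')
    as nested graph explorations, the way a SPARQL-to-code compiler
    might lower a basic graph pattern into loops over adjacency lists."""
    def extend(binding, remaining):
        if not remaining:
            yield dict(binding)
            return
        s, p, o = remaining[0]
        for subj, edges in graph.items():
            if s.startswith("?"):
                if s in binding and binding[s] != subj:
                    continue
                b1 = {**binding, s: subj}
            elif s != subj:
                continue
            else:
                b1 = binding
            for obj in edges.get(p, ()):
                if o.startswith("?"):
                    if o in b1 and b1[o] != obj:
                        continue
                    b2 = {**b1, o: obj}
                elif o != obj:
                    continue
                else:
                    b2 = b1
                yield from extend(b2, remaining[1:])
    yield from extend({}, patterns)

# ?x knows ?y, and both work at acme.
q = [("?x", "knows", "?y"), ("?x", "worksAt", "acme"), ("?y", "worksAt", "acme")]
print(list(match_pattern(graph, q)))  # → [{'?x': 'alice', '?y': 'bob'}]
```

    A real system would additionally reorder the patterns for selectivity and partition the adjacency structure across cluster nodes, but the loop-nest shape of the lowered query is the same.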

  16. An assessment of patient navigator activities in breast cancer patient navigation programs using a nine-principle framework.

    PubMed

    Gunn, Christine M; Clark, Jack A; Battaglia, Tracy A; Freund, Karen M; Parker, Victoria A

    2014-10-01

    To determine how closely a published model of navigation reflects the practice of navigation in breast cancer patient navigation programs. Observational field notes describing patient navigator activities collected from 10 purposefully sampled, foundation-funded breast cancer navigation programs in 2008-2009. An exploratory study evaluated a model framework for patient navigation published by Harold Freeman by using an a priori coding scheme based on model domains. Field notes were compiled and coded. Inductive codes were added during analysis to characterize activities not included in the original model. Programs were consistent with individual-level principles representing tasks focused on individual patients. There was variation with respect to program-level principles that related to program organization and structure. Program characteristics such as the use of volunteer or clinical navigators were identified as contributors to patterns of model concordance. This research provides a framework for defining the navigator role as focused on eliminating barriers through the provision of individual-level interventions. The diversity observed at the program level in these programs was a reflection of implementation according to target population. Further guidance may be required to assist patient navigation programs to define and tailor goals and measurement to community needs. © Health Research and Educational Trust.

  17. An Assessment of Patient Navigator Activities in Breast Cancer Patient Navigation Programs Using a Nine-Principle Framework

    PubMed Central

    Gunn, Christine M; Clark, Jack A; Battaglia, Tracy A; Freund, Karen M; Parker, Victoria A

    2014-01-01

    Objective To determine how closely a published model of navigation reflects the practice of navigation in breast cancer patient navigation programs. Data Source Observational field notes describing patient navigator activities collected from 10 purposefully sampled, foundation-funded breast cancer navigation programs in 2008–2009. Study Design An exploratory study evaluated a model framework for patient navigation published by Harold Freeman by using an a priori coding scheme based on model domains. Data Collection Field notes were compiled and coded. Inductive codes were added during analysis to characterize activities not included in the original model. Principal Findings Programs were consistent with individual-level principles representing tasks focused on individual patients. There was variation with respect to program-level principles that related to program organization and structure. Program characteristics such as the use of volunteer or clinical navigators were identified as contributors to patterns of model concordance. Conclusions This research provides a framework for defining the navigator role as focused on eliminating barriers through the provision of individual-level interventions. The diversity observed at the program level in these programs was a reflection of implementation according to target population. Further guidance may be required to assist patient navigation programs to define and tailor goals and measurement to community needs. PMID:24820445

  18. Evaluation in undergraduate medical education: Conceptualizing and validating a novel questionnaire for assessing the quality of bedside teaching.

    PubMed

    Dreiling, Katharina; Montano, Diego; Poinstingl, Herbert; Müller, Tjark; Schiekirka-Schwake, Sarah; Anders, Sven; von Steinbüchel, Nicole; Raupach, Tobias

    2017-08-01

    Evaluation is an integral part of curriculum development in medical education. Given the peculiarities of bedside teaching, specific evaluation tools for this instructional format are needed. Development of these tools should be informed by appropriate frameworks. The purpose of this study was to develop a specific evaluation tool for bedside teaching based on the Stanford Faculty Development Program's clinical teaching framework. Based on a literature review yielding 47 evaluation items, an 18-item questionnaire was compiled and subsequently completed by undergraduate medical students at two German universities. Reliability and validity were assessed in an exploratory full information item factor analysis (study one) and a confirmatory factor analysis as well as a measurement invariance analysis (study two). The exploratory analysis involving 824 students revealed a three-factor structure. Reliability estimates of the subscales were satisfactory (α = 0.71-0.84). The model yielded satisfactory fit indices in the confirmatory factor analysis involving 1043 students. The new questionnaire is short and yet based on a widely-used framework for clinical teaching. The analyses presented here indicate good reliability and validity of the instrument. Future research needs to investigate whether feedback generated from this tool helps to improve teaching quality and student learning outcome.
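    The subscale reliability figures quoted above are Cronbach's α values, which can be computed directly from item responses. A minimal sketch with made-up Likert data (the responses below are invented, not the study's):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a list of item score columns
    (each column = one item's scores across all respondents)."""
    k = len(items)          # number of items
    n = len(items[0])       # number of respondents

    def var(xs):            # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_var_sum = sum(var(col) for col in items)
    totals = [sum(col[i] for col in items) for i in range(n)]
    return (k / (k - 1)) * (1 - item_var_sum / var(totals))

# Hypothetical responses for a 3-item subscale, 5 respondents (1-5 scale).
items = [
    [4, 5, 3, 4, 2],
    [4, 4, 3, 5, 2],
    [5, 5, 2, 4, 3],
]
alpha = cronbach_alpha(items)
print(round(alpha, 2))
```

    Values in the 0.7-0.9 range, as reported for the questionnaire's subscales, are conventionally read as acceptable-to-good internal consistency.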

  19. Optimal allocation of leaf epidermal area for gas exchange.

    PubMed

    de Boer, Hugo J; Price, Charles A; Wagner-Cremer, Friederike; Dekker, Stefan C; Franks, Peter J; Veneklaas, Erik J

    2016-06-01

    A long-standing research focus in phytology has been to understand how plants allocate leaf epidermal space to stomata in order to achieve an economic balance between the plant's carbon needs and water use. Here, we present a quantitative theoretical framework to predict allometric relationships between morphological stomatal traits in relation to leaf gas exchange and the required allocation of epidermal area to stomata. Our theoretical framework was derived from first principles of diffusion and geometry based on the hypothesis that selection for higher anatomical maximum stomatal conductance (gsmax) involves a trade-off to minimize the fraction of the epidermis that is allocated to stomata. Predicted allometric relationships between stomatal traits were tested with a comprehensive compilation of published and unpublished data on 1057 species from all major clades. In support of our theoretical framework, stomatal traits of this phylogenetically diverse sample reflect spatially optimal allometry that minimizes investment in the allocation of epidermal area when plants evolve towards higher gsmax. Our results specifically highlight that the stomatal morphology of angiosperms evolved along spatially optimal allometric relationships. We propose that the resulting wide range of viable stomatal trait combinations equips angiosperms with developmental and evolutionary flexibility in leaf gas exchange unrivalled by gymnosperms and pteridophytes. © 2016 The Authors New Phytologist © 2016 New Phytologist Trust.
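    Anatomical maximum stomatal conductance is conventionally computed from stomatal density, pore area, and pore depth via the diffusion-based formulation commonly attributed to Franks and Beerling. The sketch below uses that formulation; the physical constants and the sample trait values are illustrative assumptions, not data from this paper:

```python
import math

# Illustrative constants (approximate values near 25 °C):
D_WV = 2.49e-5   # diffusivity of water vapour in air, m^2 s^-1
V_AIR = 2.24e-2  # molar volume of air, m^3 mol^-1

def gs_max(density_m2, amax_m2, pore_depth_m):
    """Anatomical maximum stomatal conductance (mol m^-2 s^-1):
    stomatal density times per-pore diffusive conductance, with an
    end-correction term added to the pore depth."""
    end_correction = (math.pi / 2) * math.sqrt(amax_m2 / math.pi)
    return (D_WV / V_AIR) * density_m2 * amax_m2 / (pore_depth_m + end_correction)

# Hypothetical leaf: 200 stomata per mm^2, 50 um^2 max pore area, 10 um pore depth.
g = gs_max(200e6, 50e-12, 10e-6)
print(round(g, 2), "mol m^-2 s^-1")
```

    The trade-off the paper describes is visible in this expression: a given gsmax can be reached with many small stomata or few large ones, and the two choices consume different fractions of epidermal area.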

  20. Testing-Based Compiler Validation for Synchronous Languages

    NASA Technical Reports Server (NTRS)

    Garoche, Pierre-Loic; Howar, Falk; Kahsai, Temesghen; Thirioux, Xavier

    2014-01-01

    In this paper we present a novel lightweight approach to validate compilers for synchronous languages. Instead of verifying a compiler for all input programs or providing a fixed suite of regression tests, we extend the compiler to generate a test-suite with high behavioral coverage and geared towards discovery of faults for every compiled artifact. We have implemented and evaluated our approach using a compiler from Lustre to C.
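The idea of generating, per compiled artifact, a test suite geared toward fault discovery can be illustrated very loosely by differential testing: run the compiled artifact and a reference semantics on the same input streams and compare the output traces. The tiny synchronous counter node, its reference semantics, and the "compiled" stand-in below are hypothetical illustrations, not the authors' Lustre-to-C tool.

```python
import random

# Hypothetical stand-in for a tiny synchronous (Lustre-like) node:
#   node count(reset: bool) returns (n: int)
#   n = if reset then 0 else pre(n) + 1, with pre(n) initialized to -1

def reference_step(state, reset):
    """Reference semantics: the behavior the compiler must preserve."""
    n = 0 if reset else state + 1
    return n, n  # (new state, output)

def compiled_step(state, reset):
    """Stand-in for the compiler's generated code (here in Python)."""
    if reset:
        return 0, 0
    return state + 1, state + 1

def differential_test(steps=1000, seed=0):
    """Drive both versions with the same random input stream."""
    rng = random.Random(seed)
    ref_state = comp_state = -1  # pre(n) before the first cycle
    for _ in range(steps):
        reset = rng.random() < 0.1
        ref_state, ref_out = reference_step(ref_state, reset)
        comp_state, comp_out = compiled_step(comp_state, reset)
        if ref_out != comp_out:
            return False  # fault discovered in the compiled artifact
    return True
```

A coverage-directed generator, as in the paper, would choose the input streams to exercise distinct behaviors of the compiled artifact rather than sampling them at random.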

  1. HAL/S-FC compiler system functional specification

    NASA Technical Reports Server (NTRS)

    1974-01-01

Compiler organization is discussed, including overall compiler structure, internal data transfer, compiler development, and code optimization. The user, system, and SDL interfaces are described, along with compiler system requirements. The run-time software support package, as well as the restrictions and dependencies of the HAL/S-FC system, are also considered.

  2. Flow and Dispersion in Urban Areas

    NASA Astrophysics Data System (ADS)

    Britter, R. E.

    Increasing urbanization and concern about sustainability and quality of life issues have produced considerable interest in flow and dispersion in urban areas. We address this subject at four scales: regional, city, neighborhood, and street. The flow is one over and through a complex array of structures. Most of the local fluid mechanical processes are understood; how these combine and what is the most appropriate framework to study and quantify the result is less clear. Extensive and structured experimental databases have been compiled recently in several laboratories. A number of major field experiments in urban areas have been completed very recently and more are planned. These have aided understanding as well as model development and evaluation.

  3. Tectonic summaries of magnitude 7 and greater earthquakes from 2000 to 2015

    USGS Publications Warehouse

    Hayes, Gavin P.; Meyers, Emma K.; Dewey, James W.; Briggs, Richard W.; Earle, Paul S.; Benz, Harley M.; Smoczyk, Gregory M.; Flamme, Hanna E.; Barnhart, William D.; Gold, Ryan D.; Furlong, Kevin P.

    2017-01-11

    This paper describes the tectonic summaries for all magnitude 7 and larger earthquakes in the period 2000–2015, as produced by the U.S. Geological Survey National Earthquake Information Center during their routine response operations to global earthquakes. The goal of such summaries is to provide important event-specific information to the public rapidly and concisely, such that recent earthquakes can be understood within a global and regional seismotectonic framework. We compile these summaries here to provide a long-term archive for this information, and so that the variability in tectonic setting and earthquake history from region to region, and sometimes within a given region, can be more clearly understood.

  4. HAL/S-FC compiler system specifications

    NASA Technical Reports Server (NTRS)

    1976-01-01

    This document specifies the informational interfaces within the HAL/S-FC compiler, and between the compiler and the external environment. This Compiler System Specification is for the HAL/S-FC compiler and its associated run time facilities which implement the full HAL/S language. The HAL/S-FC compiler is designed to operate stand-alone on any compatible IBM 360/370 computer and within the Software Development Laboratory (SDL) at NASA/JSC, Houston, Texas.

  5. Compiling quantum circuits to realistic hardware architectures using temporal planners

    NASA Astrophysics Data System (ADS)

    Venturelli, Davide; Do, Minh; Rieffel, Eleanor; Frank, Jeremy

    2018-04-01

To run quantum algorithms on emerging gate-model quantum hardware, quantum circuits must be compiled to take into account constraints on the hardware. For near-term hardware, with only limited means to mitigate decoherence, it is critical to minimize the duration of the circuit. We investigate the application of temporal planners to the problem of compiling quantum circuits to newly emerging quantum hardware. While our approach is general, we focus on compiling to superconducting hardware architectures with nearest neighbor constraints. Our initial experiments focus on compiling Quantum Alternating Operator Ansatz (QAOA) circuits, whose high number of commuting gates allows great flexibility in the order in which the gates can be applied. That freedom makes it more challenging to find optimal compilations, but also means there is a greater potential win from more optimized compilation than for less flexible circuits. We map this quantum circuit compilation problem to a temporal planning problem and generate a test suite of compilation problems for QAOA circuits of various sizes targeting a realistic hardware architecture. We report compilation results from several state-of-the-art temporal planners on this test set. This early empirical evaluation demonstrates that temporal planning is a viable approach to quantum circuit compilation.
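Independently of the temporal planners the authors evaluate, the core constraint can be sketched: on hardware where two-qubit gates require adjacent qubits, the compiler must interleave SWAP gates that move logical qubits into adjacency, and the objective is to minimize total duration. The greedy router below, for a linear qubit array with unit gate costs, is a hypothetical illustration of the problem, not the paper's planning encoding.

```python
def route_linear(gates, n_qubits):
    """Greedily route 2-qubit gates on a linear array q0-q1-...-q(n-1).

    gates: list of (a, b) logical-qubit pairs, applied in order.
    Returns (schedule, cost): the gate list with SWAPs inserted, and a
    cost that counts every gate (CZ or SWAP) as one time unit.
    """
    # pos[logical] = physical site; start with the identity placement.
    pos = list(range(n_qubits))
    schedule, cost = [], 0
    for a, b in gates:
        # Walk logical qubit a toward b one site at a time.
        while abs(pos[a] - pos[b]) > 1:
            step = 1 if pos[a] < pos[b] else -1
            target_site = pos[a] + step
            # Find which logical qubit occupies the target site and swap.
            c = pos.index(target_site)
            schedule.append(("SWAP", a, c))
            pos[a], pos[c] = pos[c], pos[a]
            cost += 1
        schedule.append(("CZ", a, b))
        cost += 1
    return schedule, cost
```

For example, `route_linear([(0, 2)], 3)` inserts one SWAP before the CZ for a total cost of 2. A temporal planner improves on such greedy routing by exploiting gate commutativity and parallel execution to shorten the overall schedule.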

  6. Five Strategic Imperatives for Interdisciplinary Study in Mass Communications/Media Studies in the U.S. and U.K.

    ERIC Educational Resources Information Center

    Petrausch, Robert J.

    2005-01-01

    Interdisciplinary study can allow students to share ideas with scholars in allied fields and broaden their knowledge of global issues. Mass communication/media studies programs in the U.S. and U.K. can serve as models to lead students into successful learning through interdisciplinary study. This paper outlines five strategic imperatives for the…

  7. The Imperative Educational Network: Parents, Teachers, and Concerned Individuals. Volume 4. Proceedings of the Imperative Educational Network Conference (Athens, Georgia, 1992).

    ERIC Educational Resources Information Center

    Tomlinson, Louise M., Ed.

    This conference sought to provide a forum for the exchange of ideas on how parents, teachers, and other concerned individuals can contribute to strengthening the educational support system, and sought to generate practical information on strategies to improve achievement levels of youth. A conference program prospectus by Louise M. Tomlinson…

  8. New Mandates and Imperatives in the Revised "ACA Code of Ethics"

    ERIC Educational Resources Information Center

    Kaplan, David M.; Kocet, Michael M.; Cottone, R. Rocco; Glosoff, Harriet L.; Miranti, Judith G.; Moll, E. Christine; Bloom, John W.; Bringaze, Tammy B.; Herlihy, Barbara; Lee, Courtland C.; Tarvydas, Vilia M.

    2009-01-01

    The first major revision of the "ACA Code of Ethics" in a decade occurred in late 2005, with the updated edition containing important new mandates and imperatives. This article provides interviews with members of the Ethics Revision Task Force that flesh out seminal changes in the revised "ACA Code of Ethics" in the areas of confidentiality,…

  9. Dialogues of Teacher Education: A Social Justice Imperative for Teacher Preparation and Practice

    ERIC Educational Resources Information Center

    Jenlink, Patrick M.

    2010-01-01

    This paper argues the need for critical discourse that at once illuminates the nature of injustices that plague society and the need to examine the political and ideological as well as pedagogical nature of social justice as an imperative for teacher education and practice. Given the reality of injustices in society, there can be little question…

  10. The Imperative of Basic Tax Education for Citizens

    ERIC Educational Resources Information Center

    Nwanna, Gladson; Richards, Darlington

    2010-01-01

    The role and impact of taxes in the lives of Americans makes basic tax education an imperative for all Americans. Not only will that knowledge be valuable to the taxpayer, it will also be valuable to the Government that imposes a variety of taxes. Specifically, it is our position that the lack of basic understanding of taxes is unwarranted, long…

  11. The Criticality of Norms to the Functional Imperatives of the Social Action System of College and University Work

    ERIC Educational Resources Information Center

    Braxton, John M.

    2010-01-01

    In this article, I assert that the work of colleges and universities forms a social action system. I array the critical positions represented in this issue according to the four functional imperatives of social action systems: adaptation, goal attainment, integration, and pattern maintenance. I discuss the role of normative structures for these…

  12. The Imperative Educational Network: Parents, Teachers, and Concerned Individuals. Volume 2. Proceedings of the Imperative Educational Network Conference (Athens, Georgia, 1990).

    ERIC Educational Resources Information Center

    Tomlinson, Louise M., Ed.

    This conference was designed to provide a forum for the exchange of ideas on how parents, teachers, and other concerned individuals can contribute to strengthening the educational support system, and to generate practical information on strategies to improve achievement levels of youth. The first article, titled "Conference Program…

  13. Paleoclimate and paleoelevation in the western US Cordillera, 80 Ma to Present

    NASA Astrophysics Data System (ADS)

    Snell, K. E.; Thompson, J. M.; Foreman, B. Z.; Wernicke, B. P.; Chamberlain, C. P.; Eiler, J. M.; Koch, P. L.

    2011-12-01

    Disentangling local to regional paleoclimatic signals from paleoelevation changes in the terrestrial sedimentary record is challenging, and can be done with confidence only by compiling spatially and temporally distributed datasets (preferably drawing on diverse proxies). Spatial coverage is particularly important for paleoelevation reconstruction because climate at low elevation sites must be known to identify higher paleoelevation sites and to quantify their altitude. The abundance of previous paleoclimatic and paleoelevation studies from the western United States can provide some of the necessary temporal and spatial framework for discriminating signals of climate change from elevation changes. Here, we present a compilation of previously published and new paleotemperature data from the western United States from the Late Cretaceous - Present derived from leaf physiognomy MAT estimates and carbonate clumped-isotope temperature estimates. After coarsely binning the data into high paleoelevation (>2 km) and lower paleoelevation (<2 km) sites (according to original interpretations made by the authors of previous studies), we compare the general temporal patterns of temperature change from western North America with those implied by the marine stable isotope record. Within this framework, we begin to evaluate sites of uncertain paleoelevation that cannot be compared with contemporaneous, adjacent low elevation sites. In this compilation, both low and high elevation land temperatures are warmer than today during the Late Cretaceous, reach an apex during the early-middle Eocene and then cool to the Present (sharply from the late Miocene to Pleistocene). The observed pattern matches reasonably well with the coarse temporal pattern of climate change based on the marine oxygen isotope record. 
Paleobotanical data reflect mean annual temperature (MAT), whereas the clumped isotope data from paleosol and lacustrine carbonates appear to be biased toward summer temperatures. Throughout the Late Mesozoic and Cenozoic, both MAT and summer paleotemperature estimates are higher than modern MAT and summer temperature, but the relatively consistent difference between these records implies a seasonal range in temperature that was similar to modern. Summer temperatures from low paleoelevation sites during the Late Cretaceous to the Early Eocene are relatively warm (30 - 40 degrees C), though these values may include a few degrees of radiant solar heating of the surface. Interestingly, Early Eocene-aged carbonate samples from southwest Montana are cooler on average than other carbonate samples of roughly the same age, but are similar in temperature to samples thought to be at high elevation during the Late Cretaceous. Thus, these samples may reflect high elevation summer temperatures, rather than low elevation temperatures, demonstrating the utility of this combined spatial and temporal approach to evaluating terrestrial paleoenvironmental records.

  14. CAAT Altex workshop paper entitled "Towards Good Read ...

    EPA Pesticide Factsheets

Grouping of substances and utilizing read-across within those groups represents an important data gap filling technique for chemical safety assessments. Categories/analogue groups are typically developed based on structural similarity and, increasingly often, also on mechanistic similarity. While read-across can play a key role in complying with legislation such as the European REACH regulation, the lack of consensus regarding the extent and type of evidence necessary to support it often hampers its successful application and acceptance by regulatory authorities. Despite a potentially broad user community, expertise is still concentrated across a handful of organizations and individuals. In order to facilitate the effective use of read-across, this document aims to summarize the state of the art, distil insights from a review of ECHA's published decisions on the relative successes and pitfalls of read-across under REACH, and compile the relevant activities and guidance documents. Special emphasis is given to the available existing tools and approaches, the consideration and expression of uncertainty, the use of biological support data, and the potential impact of the ECHA Read-Across Assessment Framework (RAAF), which was published in 2015. This paper summarizes these activities and research needs for read-across for regulatory purposes, compiled as a result of a CAAT initiative led by Thomas Hartung.

  15. Executing CLIPS expert systems in a distributed environment

    NASA Technical Reports Server (NTRS)

    Taylor, James; Myers, Leonard

    1990-01-01

This paper describes a framework for running cooperating agents in a distributed environment to support the Intelligent Computer Aided Design System (ICADS), a project in progress at the CAD Research Unit of the Design Institute at the California Polytechnic State University. Currently, the system aids an architectural designer in creating a floor plan that satisfies some general architectural constraints and project-specific requirements. At the core of ICADS is the Blackboard Control System. Connected to the blackboard are any number of domain experts called Intelligent Design Tools (IDT). The Blackboard Control System monitors the evolving design as it is being drawn and helps resolve conflicts among the domain experts. The user serves as a partner in this system by manipulating the floor plan in the CAD system and validating recommendations made by the domain experts. The primary components of the Blackboard Control System are two expert systems executed by a modified CLIPS shell: the Message Handler and the Conflict Resolver. The Conflict Resolver synthesizes the suggestions made by the domain experts, which can be either CLIPS expert systems or compiled C programs. In DEMO1, the current ICADS prototype, the CLIPS domain expert systems are Acoustics, Lighting, Structural, and Thermal; the compiled C domain experts are the CAD system and the User Interface.

  16. Risk and resilience in the shale gas context: a nexus perspective

    NASA Astrophysics Data System (ADS)

    Rosales, T. Y.; Notte, C. A.; Allen, D. M.; Kirste, D. M.

    2014-12-01

The accelerated exploration for and development of unconventional gas plays around the world has raised public concern about potential risks to human health and the environment. In this study, a risk assessment framework specific to shale gas development is proposed. The framework aims to quantify and/or qualify both risk and resilience within a water-energy nexus context, using a comprehensive approach that considers environment, health and policy. The risk assessment framework is intended to be flexible so that it can be used in different regions, but will be tested in North East British Columbia, Canada, where shale gas development is rapidly expanding. The main components of risk include hazards, susceptibility and potential consequences, which will be evaluated in space and time using ArcGIS software. The hazards are associated with all phases of shale gas development and include water, air, and soil contamination; water use (surface water and groundwater); and land use disturbance. Their assessment will take into account where they may occur and their frequency, duration and magnitude. Hazard-specific susceptibility maps will be generated based on the physical characteristics of the environment (e.g. soil, geology, hydrology, topography) as well as water source information (e.g. well locations), community footprints, etc. When combined with an evaluation of potential consequences, the resulting set of spatial risk maps can then be used for water resource management, land use planning, and industry permitting. Resilience, which buffers risk, here considers the existing regulatory framework and whether or not existing regulations can mitigate risk by reducing the hazard potential or consequences. The study considers how regulations may fully, partially, or inadequately mitigate the consequences of a given hazard.
If development is to continue at its current pace in North East BC, it is imperative that decision-makers recognize the changing risk and resilience profiles and respond with appropriate policy. A critical component of the study comprises a gap analysis of current regulation and a possible path forward.

  17. A predictive framework and review of the ecological impacts of exotic plant invasions on reptiles and amphibians.

    PubMed

    Martin, Leigh J; Murray, Brad R

    2011-05-01

    The invasive spread of exotic plants in native vegetation can pose serious threats to native faunal assemblages. This is of particular concern for reptiles and amphibians because they form a significant component of the world's vertebrate fauna, play a pivotal role in ecosystem functioning and are often neglected in biodiversity research. A framework to predict how exotic plant invasion will affect reptile and amphibian assemblages is imperative for conservation, management and the identification of research priorities. Here, we present a new predictive framework that integrates three mechanistic models. These models are based on exotic plant invasion altering: (1) habitat structure; (2) herbivory and predator-prey interactions; (3) the reproductive success of reptile and amphibian species and assemblages. We present a series of testable predictions from these models that arise from the interplay over time among three exotic plant traits (growth form, area of coverage, taxonomic distinctiveness) and six traits of reptiles and amphibians (body size, lifespan, home range size, habitat specialisation, diet, reproductive strategy). A literature review provided robust empirical evidence of exotic plant impacts on reptiles and amphibians from each of the three model mechanisms. Evidence relating to the role of body size and diet was less clear-cut, indicating the need for further research. The literature provided limited empirical support for many of the other model predictions. This was not, however, because findings contradicted our model predictions but because research in this area is sparse. In particular, the small number of studies specifically examining the effects of exotic plants on amphibians highlights the pressing need for quantitative research in this area. There is enormous scope for detailed empirical investigation of interactions between exotic plants and reptile and amphibian species and assemblages. 
The framework presented here and further testing of predictions will provide a basis for informing and prioritising environmental management and exotic plant control efforts. © 2010 The Authors. Biological Reviews © 2010 Cambridge Philosophical Society.

  18. State Public Health Enabling Authorities: Results of a Fundamental Activities Assessment Examining Core and Essential Services

    PubMed Central

    Hoss, Aila; Menon, Akshara; Corso, Liza

    2016-01-01

Context: Public health enabling authorities establish the legal foundation for financing, organizing, and delivering public health services. State laws vary in terms of the content, depth, and breadth of these fundamental public health activities. Given this variance, the Institute of Medicine has identified state public health laws as an area that requires further examination. To respond to this call for further examination, the Centers for Disease Control and Prevention’s Public Health Law Program conducted a fundamental activities legal assessment on state public health laws. Objective: The goal of the legal assessment was to examine state laws referencing frameworks representing public health department fundamental activities (ie, core and essential services) in an effort to identify, catalog, and describe enabling authorities of state governmental public health systems. Design: In 2013, Public Health Law Program staff compiled a list of state statutes and regulations referencing different commonly-recognized public health frameworks of fundamental activities. The legal assessment included state fundamental activities laws available on WestlawNext as of July 2013. The results related to the 10 essential public health services and the 3 core public health functions were confirmed and updated in June 2016. Results: Eighteen states reference commonly-recognized frameworks of fundamental activities in their laws. Thirteen states have listed the 10 essential public health services in their laws. Eight of these states have also referenced the 3 core public health functions in their laws. Five states reference only the core public health functions. Conclusions: Several states reference fundamental activities in their state laws, particularly through use of the essential services framework. Further work is needed to capture the public health laws and practices of states that may be performing fundamental activities but without reference to a common framework.
PMID:27682724

  19. Nature-based supportive care opportunities: a conceptual framework.

    PubMed

    Blaschke, Sarah; O'Callaghan, Clare C; Schofield, Penelope

    2018-03-22

    Given preliminary evidence for positive health outcomes related to contact with nature for cancer populations, research is warranted to ascertain possible strategies for incorporating nature-based care opportunities into oncology contexts as additional strategies for addressing multidimensional aspects of cancer patients' health and recovery needs. The objective of this study was to consolidate existing research related to nature-based supportive care opportunities and generate a conceptual framework for discerning relevant applications in the supportive care setting. Drawing on research investigating nature-based engagement in oncology contexts, a two-step analytic process was used to construct a conceptual framework for guiding nature-based supportive care design and future research. Concept analysis methodology generated new representations of understanding by extracting and synthesising salient concepts. Newly formulated concepts were transposed to findings from related research about patient-reported and healthcare expert-developed recommendations for nature-based supportive care in oncology. Five theoretical concepts (themes) were formulated describing patients' reasons for engaging with nature and the underlying needs these interactions address. These included: connecting with what is genuinely valued, distancing from the cancer experience, meaning-making and reframing the cancer experience, finding comfort and safety, and vital nurturance. Eight shared patient and expert recommendations were compiled, which address the identified needs through nature-based initiatives. Eleven additional patient-reported recommendations attend to beneficial and adverse experiential qualities of patients' nature-based engagement and complete the framework. The framework outlines salient findings about helpful nature-based supportive care opportunities for ready access by healthcare practitioners, designers, researchers and patients themselves. 
© Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  20. Development, implementation and critique of a bioethics framework for pharmaceutical sponsors of human biomedical research.

    PubMed

    Van Campen, Luann E; Therasse, Donald G; Klopfenstein, Mitchell; Levine, Robert J

    2015-11-01

Pharmaceutical human biomedical research is a multi-dimensional endeavor that requires collaboration among many parties, including those who sponsor, conduct, participate in, or stand to benefit from the research. Human subjects' protections have been promulgated to ensure that the benefits of such research are accomplished with respect for and minimal risk to individual research participants, and with an overall sense of fairness. Although these protections are foundational to clinical research, most ethics guidance primarily highlights the responsibilities of investigators and ethics review boards. Currently, there is no published resource that comprehensively addresses the bioethical responsibilities of industry sponsors, including their responsibilities to parties who are not research participants but are nevertheless key stakeholders in the endeavor. To fill this void, in 2010 Eli Lilly and Company instituted a Bioethics Framework for Human Biomedical Research. This paper describes how the framework was developed and implemented and provides a critique based on four years of experience. A companion article provides the actual document used by Eli Lilly and Company to guide ethical decisions regarding all phases of human clinical trials. While many of the concepts presented in this framework are not novel, compiling them in a manner that articulates the ethical responsibilities of a sponsor is novel. By utilizing this type of bioethics framework, we have been able to develop bioethics positions on various topics, provide research ethics consultations, and integrate bioethics into the daily operations of our human biomedical research. We hope that by sharing these companion papers we will stimulate discussion within and outside the biopharmaceutical industry for the benefit of the multiple parties involved in pharmaceutical human biomedical research.

  1. State Public Health Enabling Authorities: Results of a Fundamental Activities Assessment Examining Core and Essential Services.

    PubMed

    Hoss, Aila; Menon, Akshara; Corso, Liza

    2016-01-01

    Public health enabling authorities establish the legal foundation for financing, organizing, and delivering public health services. State laws vary in terms of the content, depth, and breadth of these fundamental public health activities. Given this variance, the Institute of Medicine has identified state public health laws as an area that requires further examination. To respond to this call for further examination, the Centers for Disease Control and Prevention's Public Health Law Program conducted a fundamental activities legal assessment on state public health laws. The goal of the legal assessment was to examine state laws referencing frameworks representing public health department fundamental activities (ie, core and essential services) in an effort to identify, catalog, and describe enabling authorities of state governmental public health systems. In 2013, Public Health Law Program staff compiled a list of state statutes and regulations referencing different commonly-recognized public health frameworks of fundamental activities. The legal assessment included state fundamental activities laws available on WestlawNext as of July 2013. The results related to the 10 essential public health services and the 3 core public health functions were confirmed and updated in June 2016. Eighteen states reference commonly-recognized frameworks of fundamental activities in their laws. Thirteen states have listed the 10 essential public health services in their laws. Eight of these states have also referenced the 3 core public health functions in their laws. Five states reference only the core public health functions. Several states reference fundamental activities in their state laws, particularly through use of the essential services framework. Further work is needed to capture the public health laws and practices of states that may be performing fundamental activities but without reference to a common framework.

  2. Integrating Health Behavior Theory and Design Elements in Serious Games

    PubMed Central

    Fleming, Theresa; Lucassen, Mathijs FG; Bridgman, Heather; Stasiak, Karolina; Shepherd, Matthew; Orpin, Peter

    2015-01-01

Background: Internet interventions for improving health and well-being have the potential to reach many people and fill gaps in service provision. Serious gaming interfaces provide opportunities to optimize user adherence and impact. Health interventions based in theory and evidence and tailored to psychological constructs have been found to be more effective in promoting behavior change. Defining the design elements which engage users and help them to meet their goals can contribute to better informed serious games. Objective: To elucidate design elements important in SPARX, a serious game for adolescents with depression, from a user-centered perspective. Methods: We proposed a model based on an established theory of health behavior change and practical features of serious game design to organize ideas and rationale. We analyzed data from 5 studies comprising a total of 22 focus groups and 66 semistructured interviews conducted with youth and families in New Zealand and Australia who had viewed or used SPARX. User perceptions of the game were applied to this framework. Results: A coherent framework was established using the three constructs of self-determination theory (SDT), autonomy, competence, and relatedness, to organize user perceptions and design elements within four areas important in design: computer game, accessibility, working alliance, and learning in immersion. User perceptions mapped well to the framework, which may assist developers in understanding the context of user needs. By mapping these elements against the constructs of SDT, we were able to propose a sound theoretical base for the model. Conclusions: This study’s method allowed for the articulation of design elements in a serious game from a user-centered perspective within a coherent overarching framework.
The framework can be used to deliberately incorporate serious game design elements that support a user’s sense of autonomy, competence, and relatedness, key constructs which have been found to mediate motivation at all stages of the change process. The resulting model introduces promising avenues for future exploration. Involving users in program design remains an imperative if serious games are to be fit for purpose. PMID:26543916

  3. The Imperative Educational Network: Parents, Teachers, and Concerned Individuals. Volume 3. Proceedings of the Imperative Educational Network Conference (Athens, Georgia, 1991).

    ERIC Educational Resources Information Center

    Tomlinson, Louise M., Ed.

    This conference sought to provide a forum for the exchange of ideas on how parents, teachers, and other concerned individuals can contribute to strengthening the educational support system, and to generate practical information on strategies to improve achievement levels of youth. A conference program prospectus by Louise M. Tomlinson offers a…

  4. Where and How Do "We" Enter: (Re)Imagining and Bridging Culturally Relevant Civic Engagements of Teacher Educators, Teachers, and Immigrant Youth

    ERIC Educational Resources Information Center

    Knight-Diop, Michelle

    2011-01-01

    The author takes up the invitation to engage in the dialogue on the imperatives for civic engagement in teacher education at the intersections of youth, immigration, and globalization in urban contexts--especially when given that many of the youth in K-12 schools are immigrants or children of immigrants. The first imperative considers the…

  5. An Elusive Policy Imperative: Data and Methodological Challenges When Using Growth in Student Achievement to Evaluate Teacher Education Programs' "Value-Added"

    ERIC Educational Resources Information Center

    Amrein-Beardsley, Audrey; Lawton, Kerry; Ronan, Katherine

    2017-01-01

    In this study researchers examined the effectiveness of one of the largest teacher education programs, located within one of the largest research-intensive universities in the US. They did this using a value-added model, as per current federal educational policy imperatives, to assess the measurable effects of teacher education programs on their teacher…

  6. State of the South: North Carolina's Economic Imperative: Building an Infrastructure of Opportunity. A Report for the John M. Belk Endowment

    ERIC Educational Resources Information Center

    MDC, Inc., 2016

    2016-01-01

    In partnership with the John M. Belk Endowment, MDC has researched and written the report, "North Carolina's Economic Imperative: Building an Infrastructure of Opportunity," part of our work with the endowment to study, address, and propose solutions for the low mobility rate of too many North Carolinians. The report examines patterns of…

  7. Just Learning: The Imperative to Transform Juvenile Justice Systems into Effective Educational Systems. A Study of Juvenile Justice Schools in the South and the Nation

    ERIC Educational Resources Information Center

    Suitts, Steve; Dunn, Katherine; Sabree, Nasheed

    2014-01-01

    With awareness growing that schools are disciplining and suspending minority students at alarming rates, the report provides powerful evidence that young people placed in the juvenile justice system-predominately minority males incarcerated for minor offenses-are receiving a substandard education. The report, "Just Learning: The Imperative to…

  8. Teacher Education in Post-Apartheid South Africa: Navigating a Way through Competing State and Global Imperatives for Change

    ERIC Educational Resources Information Center

    Schafer, Marc; Wilmot, Di

    2012-01-01

    This article focuses on teacher education in post-apartheid South Africa. It argues that the restructuring and reorganization of teacher education is at the nexus of the axes of tension created by national and global imperatives for change. Along with the dismantling of apartheid and the transition to a free and democratic state in 1994 came the…

  9. Hate Won, but Love Will Have the Final Word: Critical Pedagogy, Liberation Theology, and the Moral Imperative of Resistance

    ERIC Educational Resources Information Center

    Kirylo, James D

    2017-01-01

    In the context of the recent presidential election in the United States, this article examines the place of critical pedagogy and liberation theology and its positionality in impacting the moral imperative of resisting a climate of hate and intolerance. Particularly drawing from the work of Peter McLaren, Gustavo Gutiérrez, Paulo Freire and…

  10. Socially Inclusive Development: The Foundations for Decent Societies in East and Southern Africa.

    PubMed

    Abbott, Pamela; Wallace, Claire; Sapsford, Roger

    2017-01-01

    This article is concerned with how social processes and social provision are conceptualised and measured in societies in order to offer guidance on how to improve developmental progress. Significant advances have been made in developing multidimensional measures of development, but they provide little guidance to governments on how to build sustainable societies. We argue for the need to develop a theoretically informed social and policy framework that permits the foundations for building decent societies to be put in place by governments. In our view the recently developed Decent Society Model provides such a framework. Our example is the assessment of government provision, by function, within fourteen countries of East and Southern Africa. The context is the current debates about socially inclusive development, but we argue that it is necessary to range more widely, as social processes of different kinds are multiply interrelated. Social inclusion is recognised by governments as well as international agencies, including the World Bank and the United Nations, as not only an ethical imperative but smart economics; socially inclusive societies are more stable and have greater potential for economic growth. Societies that can develop sustainably need not only to be inclusive, however, but to provide economic security for all, to be socially cohesive and to empower citizens so that as individuals and communities they can take control over their own lives.

  11. Stressed and overworked? A cross-sectional study of the working situation of urban and rural general practitioners in Austria in the framework of the QUALICOPC project

    PubMed Central

    Hoffmann, Kathryn; Wojczewski, Silvia; George, Aaron; Schäfer, Willemijn L. A.; Maier, Manfred

    2015-01-01

    Aim To assess the workload of general practitioners (GPs) in Austria, with a focus on identifying the differences between GPs working in urban and rural areas. Methods Within the framework of the Quality and Costs of Primary Care in Europe (QUALICOPC) study, data were collected from a stratified sample of GPs using a standardized questionnaire between November 2011 and May 2012. Data analysis included descriptive statistics and regression analysis. Results The analysis included data from 173 GPs. GPs in rural areas reported an average of 49.3 working hours per week, plus 23.7 on-call duties per 3 months and 26.2 out-of-office care services per week. Compared to GPs working in urban areas, even in the fully adjusted regression model, rural GPs had significantly more working hours (B 7.00; P = 0.002) and on-call duties (B 18.91; P < 0.001). 65.8% of all GPs perceived their level of stress as high and 84.6% felt they were required to do unnecessary administrative work. Conclusion Our findings show a high workload among Austrian GPs, particularly those working in rural areas. Since physicians show diminishing interest in working as GPs, there is an imperative to improve this situation. PMID:26321030

  12. Development, theoretical framework, and evaluation of a parent and teacher-delivered intervention on adolescent vaccination.

    PubMed

    Gargano, Lisa M; Herbert, Natasha L; Painter, Julia E; Sales, Jessica M; Vogt, Tara M; Morfaw, Christopher; Jones, LaDawna M; Murray, Dennis; DiClemente, Ralph J; Hughes, James M

    2014-07-01

    The Advisory Committee on Immunization Practices recommended immunization schedule for adolescents includes three vaccines (tetanus, diphtheria, and acellular pertussis [Tdap]; human papillomavirus [HPV] vaccine; and meningococcal conjugate vaccine [MCV4]) and an annual influenza vaccination. Given the increasing number of recommended vaccines for adolescents and health and economic costs associated with nonvaccination, it is imperative that effective strategies for increasing vaccination rates among adolescents are developed. This article describes the development, theoretical framework, and initial first-year evaluation of an intervention designed to promote vaccine acceptance among a middle and high school-based sample of adolescents and their parents in eastern Georgia. Adolescents, parents, and teachers were active participants in the development of the intervention. The intervention, which consisted of a brochure for parents and a teacher-delivered curriculum for adolescents, was guided by constructs from the health belief model and theory of reasoned action. Evaluation results indicated that our intervention development methods were successful in creating a brochure that met cultural relevance and the literacy needs of parents. We also demonstrated an increase in student knowledge of and positive attitudes toward vaccines. To our knowledge, this study is the first to extensively engage middle and high school students, parents, and teachers in the design and implementation of key theory-based educational components of a school-based, teacher-delivered adolescent vaccination intervention. © 2014 Society for Public Health Education.

  13. No risk, no gain: invest in women and girls by funding advocacy, organizing, litigation and work to shift culture.

    PubMed

    McGovern, Theresa

    2013-11-01

    The new development framework aspires to merge long-term hopes for environmental, political and financial sustainability with international poverty eradication goals. Central to this agenda is the promotion and protection of the human rights of women and girls. Yet national mechanisms, donors and international development agencies often do not fully tackle these issues or confront the accompanying politically sensitive, complex issues intermingling religion, socioeconomic status, social, cultural and family life. The increasing reliance on private investment may further weaken a women's rights approach. The proposed framework described in the High-Level Panel of Eminent Persons Report could further systematize this problem, even though it improves on the MDGs by expanding targets related to women. Success will require support for a potent mix of advocacy, movement building and a complex set of ground-based strategies that shift cultural practices, laws and policies that harm women and girls. Funding for advocacy and interventions that hold firm on human rights is imperative, but given the conflicting loyalties of governments and public-private partnerships, reliance on either sector may be risky. An analysis of the status of women's rights work, infrastructure and donor support in Bangladesh and South Africa shows the need for vigilance and long-term investment in effective work. Copyright © 2013 Reproductive Health Matters. Published by Elsevier Ltd. All rights reserved.

  14. Competing values in healthcare: balancing the (un)balanced scorecard.

    PubMed

    Wicks, Angela M; St Clair, Lynda

    2007-01-01

    Facing a complex environment driven by two decades of dramatic change, healthcare organizations are adopting new strategic frameworks such as the Balanced Scorecard (BSC) to evaluate performance (Kaplan and Norton 1992). The BSC was not originally developed as a performance management tool, however. Rather, it was designed as a tool to communicate strategy and, as such, provides little guidance when actual outcomes fall short of desired outcomes. In addition, although the BSC is an improvement over exclusively financial measures, it has three conceptual limitations that are especially problematic for evaluating healthcare organizations: (1) it underemphasizes the employee perspective, (2) it is founded on a control-based management philosophy, and (3) it emphasizes making trade-offs. To address these limitations, we propose using the Competing Values Framework (CVF), a theoretically grounded, comprehensive approach to understanding and improving organizational and managerial performance by focusing on four action imperatives: competing, controlling, collaborating, and creating. The CVF pays particular attention to the employee perspective, is consistent with a commitment-based management philosophy, and emphasizes transcending apparent paradoxes to identify win-win solutions. Rather than focusing on customer satisfaction or employee satisfaction, the CVF looks for ways to satisfy customers and employees while still addressing financial constraints and growth opportunities. The CVF also can be used to assess both the culture of the organization and the competencies of individual managers, thereby providing a clear link between strategy and implementation.

  15. The impact of culture and religion on truth telling at the end of life.

    PubMed

    de Pentheny O'Kelly, Clarissa; Urch, Catherine; Brown, Edwina A

    2011-12-01

    Truth telling, a cardinal rule in Western medicine, is not a globally shared moral stance. Honest disclosure of terminal prognosis and diagnosis are regarded as imperative in preparing for the end of life. Yet in many cultures, truth concealment is common practice. In collectivist Asian and Muslim cultures, illness is a shared family affair. Consequently, decision making is family centred and beneficence and non-malfeasance play a dominant role in their ethical model, in contrast to patient autonomy in Western cultures. The 'four principles' are prevalent throughout Eastern and Western cultures, however, the weight with which they are considered and their understanding differ. The belief that a grave diagnosis or prognosis will extinguish hope in patients leads families to protect ill members from the truth. This denial of the truth, however, is linked with not losing faith in a cure. Thus, aggressive futile treatment can be expected. The challenge is to provide a health care service that is equable for all individuals in a given country. The British National Health Service provides care to all cultures but is bound by the legal principles and framework of the UK and aims for equity of provision by working within the UK ethical framework with legal and ethical norms being explained to all patients and relatives. This requires truth telling about prognosis and efficacy of potential treatments so that unrealistic expectations are not raised.

  16. Nimble Compiler Environment for Agile Hardware. Volume 1

    DTIC Science & Technology

    2001-10-01

    Appendix G of this report is "Xima - The Nimble Datapath Compiler" (p. 172). Further appendices cover the approach of the Nimble Compiler (Task 3), the Xima datapath compiler (Task 4), and a domain generator tutorial for the Nimble Compiler project (Task 5). A loop example labels basic blocks A-G inside the loop; there are four distinct paths inside the loop (without counting the…

  17. Compiler-Assisted Multiple Instruction Rollback Recovery Using a Read Buffer. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Alewine, Neal Jon

    1993-01-01

    Multiple instruction rollback (MIR) is a technique to provide rapid recovery from transient processor failures, and has been implemented in hardware by researchers and in mainframe computers. Hardware-based MIR designs eliminate rollback data hazards by providing data redundancy implemented in hardware. Compiler-based MIR designs were also developed which remove rollback data hazards directly with data flow manipulations, thus eliminating the need for most data redundancy hardware. Compiler-assisted techniques to achieve multiple instruction rollback recovery are addressed. It is observed that some data hazards resulting from instruction rollback can be resolved more efficiently by providing hardware redundancy while others are resolved more efficiently with compiler transformations. A compiler-assisted multiple instruction rollback scheme is developed which combines hardware-implemented data redundancy with compiler-driven hazard removal transformations. Experimental performance evaluations were conducted which indicate improved efficiency over previous hardware-based and compiler-based schemes. Various enhancements to the compiler transformations and to the data redundancy hardware developed for the compiler-assisted MIR scheme are described and evaluated. The final topic deals with the application of compiler-assisted MIR techniques to aid in exception repair and branch repair in a speculative execution architecture.
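The compiler-driven hazard removal described in this abstract can be illustrated with a toy renaming pass. This is a hypothetical sketch, not the thesis's actual algorithm: it assumes instructions are (destination, sources) register tuples, and treats any redefinition of a register within the rollback distance N of an earlier definition as a hazard to be renamed away, so the old value remains available for restoration on rollback.

```python
def remove_rollback_hazards(instructions, n, num_regs):
    """instructions: list of (dest, [srcs]) register tuples.
    Returns a renamed list in which no register is redefined within
    n instructions of a prior definition (the rollback hazard)."""
    last_def = {}        # register -> index of its most recent definition
    rename = {}          # current mapping: original register -> live register
    next_reg = num_regs  # fresh registers are numbered past the original set
    out = []
    for i, (dest, srcs) in enumerate(instructions):
        srcs = [rename.get(s, s) for s in srcs]  # read currently-live names
        d = rename.get(dest, dest)
        if d in last_def and i - last_def[d] <= n:
            d = next_reg                          # hazard: rename this definition
            next_reg += 1
        rename[dest] = d
        last_def[d] = i
        out.append((d, srcs))
    return out
```

For example, a redefinition of register 1 one instruction after its previous definition (within a rollback distance of 2) is diverted to a fresh register, and later reads follow the rename.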

  18. Ada Compiler Validation Summary Report: Harris Corporation, Harris Ada Compiler, Version 4.0, Harris HCX-9 (Host) and (Target), 880603W1.09059

    DTIC Science & Technology

    1988-06-06

    Ada Compiler Validation Summary Report for the Harris Corporation Harris Ada Compiler, Version 4.0, Harris HCX-9 (host) and (target). Report period covered: 6 June 1988 to 6 June 1988; certificate number 880603W1.09059; report number 88-03-02-HAR; Wright-Patterson AFB.

  19. Constructing Indicators for Measuring Provincial Sustainable Development Index in Vietnam

    NASA Astrophysics Data System (ADS)

    Truong, Van Canh; Lisowski, Andrzej

    2018-03-01

    Sustainable development is the zeitgeist of our age: a kind of development along whose trajectory humanity can create stable and developed socio-economic foundations and conserve the environment, and which is therefore able to continue for a long time. Using indicators is one of the best ways to monitor and measure progress toward sustainable development. In this paper we propose a way to create indicators for measuring a provincial sustainable development index in Vietnam. We first made a framework of elements for the economic, social, and environmental components and compiled a list of indicators from 20 national and international agencies in the world. We then applied the SMART framework (Specific, Measurable, Achievable, Relevant, and Time-related) to choose indicators relevant for Vietnam and mapped them back to the elements. This yielded 39 relevant indicators: 12 for the economic, 17 for the social, and 10 for the environmental component. Finally, we established a way to determine the worst and best value for each indicator from available data for countries in the world.
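The worst/best-value step described above is commonly implemented as HDI-style min-max rescaling: each indicator is normalized to [0, 1] against its worst and best values, then averaged within and across components. A minimal sketch, using made-up bounds and values rather than the paper's actual data:

```python
def normalize(value, worst, best):
    """Min-max rescale to [0, 1]; for indicators where lower is
    better, simply swap worst and best."""
    return (value - worst) / (best - worst)

def component_index(values, bounds):
    """values: {indicator: value}; bounds: {indicator: (worst, best)}.
    Returns the unweighted mean of the normalized indicator scores."""
    scores = [normalize(values[k], *bounds[k]) for k in values]
    return sum(scores) / len(scores)

# Hypothetical province with two economic indicators (illustrative numbers)
bounds = {"gdp_per_capita": (500, 50000), "employment_rate": (40, 100)}
values = {"gdp_per_capita": 10400, "employment_rate": 85}
econ = component_index(values, bounds)  # mean of 0.2 and 0.75 -> 0.475
```

An overall provincial index would then combine the economic, social, and environmental component scores in the same way.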

  20. The Fortune of the Commons: Participatory Evaluation of Small-Scale Fisheries in the Brazilian Amazon

    NASA Astrophysics Data System (ADS)

    Oviedo, Antonio F. P.; Bursztyn, Marcel

    2016-05-01

    This paper applies a participatory approach in evaluating small-scale fisheries, focusing on the Arapaima gigas fishery in the Brazilian Amazon. The evaluation uses the social-ecological system (SES) framework, adopted to explain the conditions needed for sustainability and user cooperation in natural resources management, as a more suitable alternative to the `blueprint' or `panaceas' approaches, based only on property rights or governmental intervention. However, managers and users often do not have the necessary information compiled and available for a specific SES while some actions need to be taken immediately. Thus, consensus and negotiation among stakeholders about SES variables may be useful to evaluate system performance and indicate actions to promote sustainability. In the case study, using a consensus-building model, we found that the arapaima SES leads to sustainability and is far from being a case of `tragedy of the commons.' More investments in suitable monitoring and enforcement for adaptive management are recommended. Adopting an SES framework based on stakeholders' prospects may be useful until complete interdisciplinary studies become available, so as to seek sustainability in the long term.

  1. Estonian greenhouse gas emissions inventory report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Punning, J.M.; Ilomets, M.; Karindi, A.

    1996-07-01

    It is widely accepted that the increase of greenhouse gas concentrations in the atmosphere due to human activities would result in warming of the Earth's surface. To examine this effect and to better understand how the GHG increase in the atmosphere might change the climate in the future, how ecosystems and societies in different regions of the world should adapt to these changes, and what policymakers must do to mitigate that effect, a worldwide project within the Framework Convention on Climate Change was initiated by the United Nations. Estonia is one of more than 150 countries which signed the Framework Convention on Climate Change at the United Nations Conference on Environment and Development held in Rio de Janeiro in June 1992. In 1994 a new project, the Estonian Country Study, was initiated within the US Country Studies Program. The project will help to compile the GHG inventory for Estonia, identify contemporary trends, investigate the impact of climate change on the Estonian ecosystems and economy, and formulate national strategies for Estonia addressing global climate change.

  2. What girls won't do for love: human immunodeficiency virus/sexually transmitted infections risk among young African-American women driven by a relationship imperative.

    PubMed

    Raiford, Jerris L; Seth, Puja; DiClemente, Ralph J

    2013-05-01

    Rates of Human immunodeficiency virus (HIV) and other sexually transmitted infections (STIs) continue to increase among African-American youth. Adolescents who have a stronger identity in relation to others (relational identity) rather than to themselves (self-identity) may view intimate relationships as imperative to a positive self-concept, which may lead to risky sexual behavior and abuse. Therefore, the present study assessed the associations among a relationship imperative and HIV/STI-related risk factors and behaviors. Participants were 715 African-American adolescent females, aged 15 to 21 years. They completed measures that assessed how important a relationship was to them and HIV-related risk factors and behaviors. Participants also provided vaginal swab specimens for STI testing. Multivariate logistic regression analyses, controlling for covariates, were conducted. Females who endorsed a relationship imperative (29%), compared to those who did not, were more likely to report: unprotected sex, less power in their relationships, perceived inability to refuse sex, anal sex, sex while their partner was high on alcohol/drugs, and partner abuse. Furthermore, participants with less power, recent partner abuse, and a perceived ability to refuse sex were more likely to test STI positive. These results indicate that if African-American adolescent females believe a relationship is imperative, they are more likely to engage in riskier sexual behaviors. Additionally, less perceived power and partner abuse increases their risk for STIs. HIV/STI prevention programs should target males and females and address healthy relationships, sense of self-worth, self-esteem and the gender power imbalance that may persist in the community along with HIV/STI risk. Published by Elsevier Inc.

  3. Designing Successful Next-Generation Instruments to Detect the Epoch of Reionization

    NASA Astrophysics Data System (ADS)

    Thyagarajan, Nithyanandan; Hydrogen Epoch of Reionization Array (HERA) team, Murchison Widefield Array (MWA) team

    2018-01-01

    The Epoch of Reionization (EoR) signifies a period of intense evolution of the Inter-Galactic Medium (IGM) in the early Universe caused by the first generations of stars and galaxies, wherein they turned the neutral IGM to be completely ionized by redshift ≥ 6. This important epoch is poorly explored to date. Measurement of redshifted 21 cm line from neutral Hydrogen during the EoR is promising to provide the most direct constraints of this epoch. Ongoing experiments to detect redshifted 21 cm power spectrum during reionization, including the Murchison Widefield Array (MWA), Precision Array for Probing the Epoch of Reionization (PAPER), and the Low Frequency Array (LOFAR), appear to be severely affected by bright foregrounds and unaccounted instrumental systematics. For example, the spectral structure introduced by wide-field effects, aperture shapes and angular power patterns of the antennas, electrical and geometrical reflections in the antennas and electrical paths, and antenna position errors can be major limiting factors. These mimic the 21 cm signal and severely degrade the instrument performance. It is imperative for the next-generation of experiments to eliminate these systematics at their source via robust instrument design. I will discuss a generic framework to set cosmologically motivated antenna performance specifications and design strategies using the Precision Radio Interferometry Simulator (PRISim) -- a high-precision tool that I have developed for simulations of foregrounds and the instrument transfer function intended primarily for 21 cm EoR studies, but also broadly applicable to interferometer-based intensity mapping experiments. The Hydrogen Epoch of Reionization Array (HERA), designed in-part based on this framework, is expected to detect the 21 cm signal with high significance. I will present this framework and the simulations, and their potential for designing upcoming radio instruments such as HERA and the Square Kilometre Array (SKA).

  4. Interprofessional Education: A Summary of Reports and Barriers to Recommendations.

    PubMed

    Meleis, Afaf I

    2016-01-01

    Effective, quality care to achieve the newly developed sustainable development goals requires the development of collaborative teams and is predicated on implementing transformative interprofessional education and on team members who are equally empowered. This is a report on The Lancet commission on transformative education for health professionals and the National Academy of Medicine's dialogues on developing and implementing innovations to enhance collaborations and to facilitate the effectiveness of healthcare teams. Using postcolonial feminist theory for critical analysis and integrations of findings from both reports, as well as for identification of barriers to achieving equity in team functioning. The global Lancet commission and the National Academy of Medicine/Institute of Medicine forum developed frameworks that could be used to educate the next generation of professionals based on identifying the local needs of communities within a global context. Recommendations included breaking down silos that exists between schools and using an equity and justice framework in developing educational programs; utilizing contemporary innovations in teaching that correspond with innovations in healthcare systems; and insuring investments in time, energy, and resources in interprofessional education. However, without addressing the silos created through professional identities and power differentials, goals of interprofessional education and collaborative practice may not be achieved. While a great deal has been written about interprofessional education, it is imperative for faculty in the different professional schools and for members of healthcare teams to engage in dialogues that address the fundamental and most obstinate barriers to forming equitable teams, which is the consistent narrative of medical privilege and centrism. 
The dialogues about medical privilege and physician centrism in education and health care could drive the development of programmatic approaches to enhancing interprofessional education and teamwork based on justice and equity frameworks. © 2015 Sigma Theta Tau International.

  5. Development of an OSSE Framework for a Global Atmospheric Data Assimilation System

    NASA Technical Reports Server (NTRS)

    Gelaro, Ronald; Errico, Ronald M.; Prive, N.

    2012-01-01

    Observing system simulation experiments (OSSEs) are powerful tools for estimating the usefulness of various configurations of envisioned observing systems and data assimilation techniques. Their utility stems from their being conducted in an entirely simulated context, utilizing simulated observations having simulated errors and drawn from a simulation of the earth's environment. Observations are generated by applying physically based algorithms to the simulated state, such as performed during data assimilation or using other appropriate algorithms. Adding realistic instrument plus representativeness errors, including their biases and correlations, can be critical for obtaining realistic assessments of the impact of a proposed observing system or analysis technique. If estimates of the expected accuracy of proposed observations are realistic, then the OSSE can be also used to learn how best to utilize the new information, accelerating its transition to operations once the real data are available. As with any inferences from simulations, however, it is first imperative that some baseline OSSEs are performed and well validated against corresponding results obtained with a real observing system. This talk provides an overview of, and highlights critical issues related to, the development of an OSSE framework for the tropospheric weather prediction component of the NASA GEOS-5 global atmospheric data assimilation system. The framework includes all existing observations having significant impact on short-term forecast skill. Its validity has been carefully assessed using a range of metrics that can be evaluated in both the OSSE and real contexts, including adjoint-based estimates of observation impact. A preliminary application to the Aeolus Doppler wind lidar mission, scheduled for launch by the European Space Agency in 2014, has also been investigated.

  6. Drug regulators and ethics: which GCP issues are also ethical issues?

    PubMed

    Bernabe, Rosemarie D L C; van Thiel, Ghislaine J M W; Breekveldt, Nancy S; van Delden, Johannes J M

    2016-02-01

    Within the European Union (EU), good clinical practice (GCP) provides an ethical mandate to regulators; however, it is unclear what the content of that mandate is. By looking at the correspondence between GCP and ethical imperatives, we identify that the mandate is within the following: principles; benefit-risk ratio; scientific validity; results publication; informed consent; respect for participants; and special populations. There are also cases when regulations were ethical but were not pairable to an imperative, and when the former were stricter than the latter. Hence, we suggest closer cooperation between ethics committees and regulators to ensure that future versions of ethics guidelines cover the ethically relevant regulations that were not directly pairable to any imperative, and cooperation between GCP legislative bodies and ethics guideline-making bodies to resolve the discordant areas. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. Nonparametric Simulation of Signal Transduction Networks with Semi-Synchronized Update

    PubMed Central

    Nassiri, Isar; Masoudi-Nejad, Ali; Jalili, Mahdi; Moeini, Ali

    2012-01-01

    Simulating signal transduction in cellular signaling networks provides predictions of network dynamics by quantifying the changes in concentration and activity-level of the individual proteins. Since numerical values of kinetic parameters might be difficult to obtain, it is imperative to develop non-parametric approaches that combine the connectivity of a network with the response of individual proteins to signals which travel through the network. The activity levels of signaling proteins computed through existing non-parametric modeling tools do not show significant correlations with the observed values in experimental results. In this work we developed a non-parametric computational framework to describe the profile of the evolving process and the time course of the proportion of active form of molecules in the signal transduction networks. The model is also capable of incorporating perturbations. The model was validated on four signaling networks showing that it can effectively uncover the activity levels and trends of response during signal transduction process. PMID:22737250
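A toy version of such a non-parametric, semi-synchronized simulation can be sketched as follows. The specific update rule and the random-subset scheduling are illustrative assumptions, not the authors' published model: each node holds the proportion of its molecules in active form, in [0, 1], and each step a random subset of nodes recomputes its level from its signed inputs.

```python
import random

def step(activity, edges, frac=0.5, rng=random):
    """One semi-synchronized update.
    activity: {node: active proportion in [0, 1]};
    edges: list of (src, dst, sign), sign = +1 (activation) or -1 (inhibition);
    frac: probability that each target node updates this step."""
    updated = dict(activity)
    targets = {d for _, d, _ in edges}
    chosen = {node for node in targets if rng.random() < frac}
    for node in chosen:
        inputs = [(s, sign) for s, d, sign in edges if d == node]
        # net activating drive from upstream nodes, clipped into [0, 1]
        drive = sum(sign * activity[s] for s, sign in inputs) / len(inputs)
        updated[node] = min(1.0, max(0.0, drive))
    return updated
```

With frac=1.0 the update is fully synchronous; intermediate values interleave updates across time steps, a crude stand-in for the semi-synchronized scheme the abstract describes.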

  8. Conceptual basis for an integrated system for the management of a protected area. Examples from its application in a mediterranean area.

    PubMed

    Cornejo, E; Fungairiño, S G; Barandica, J M; Serrano, J M; Zorrilla, J M; Gómez, T; Zapata, F J; Acosta, F J

    2016-01-15

    Improving the efficiency of management in protected areas is imperative in a generalized context of limited conservation budgets. However, this is overlooked due to flaws in problem definition, general disregard for cost information, and a lack of suitable tools for measuring costs and management quality. This study describes an innovative methodological framework, implemented in the web application SIGEIN, focused on maximizing the quality of management against its costs, establishing an explicit justification for any decision. The tool integrates, with this aim, a procedure for prioritizing management objects according to a conservation value, modified by a functional criterion; a project management module; and a module for management of continuous assessment. This appraisal associates the relevance of the conservation targets, the efficacy of the methods employed, both resource and personnel investments, and the resulting costs. Preliminary results of a prototypical SIGEIN application on the Site of Community Importance Chafarinas Islands are included. Copyright © 2015 Elsevier Ltd. All rights reserved.

  9. Interventions to Improve Care for Patients with Limited Health Literacy

    PubMed Central

    Sudore, Rebecca L.; Schillinger, Dean

    2009-01-01

    Objective To propose a framework and describe best practices for improving care for patients with limited health literacy (LHL). Methods Review of the literature. Results Approximately half of the U.S. adult population has LHL. Because LHL is associated with poor health outcomes and contributes to health disparities, the adoption of evidence-based best practices is imperative. Feasible interventions at the clinician-patient level (eg, patient-centered communication, clear communication techniques, teach-to-goal methods, and reinforcement), at the system-patient level (eg, clear health education materials, visual aids, clear medication labeling, self-management support programs, and shame-free clinical environments), and at the community-patient level (eg, adult education referrals, lay health educators, and harnessing the mass media) can improve health outcomes for patients with LHL. Conclusion Because LHL is prevalent, and because the recommended communication strategies can benefit patients of all literacy levels, clinicians, health system planners, and health policy leaders should promote the uptake of these strategies into routine care. PMID:20046798

  10. Surface faceting and elemental diffusion behaviour at atomic scale for alloy nanoparticles during in situ annealing

    PubMed Central

    Chi, Miaofang; Wang, Chao; Lei, Yinkai; Wang, Guofeng; Li, Dongguo; More, Karren L.; Lupini, Andrew; Allard, Lawrence F.; Markovic, Nenad M.; Stamenkovic, Vojislav R.

    2015-01-01

    The catalytic performance of nanoparticles is primarily determined by the precise nature of the surface and near-surface atomic configurations, which can be tailored by post-synthesis annealing effectively and straightforwardly. Understanding the complete dynamic response of surface structure and chemistry to thermal treatments at the atomic scale is imperative for the rational design of catalyst nanoparticles. Here, by tracking the same individual Pt3Co nanoparticles during in situ annealing in a scanning transmission electron microscope, we directly discern five distinct stages of surface elemental rearrangements in Pt3Co nanoparticles at the atomic scale: initial random (alloy) elemental distribution; surface platinum-skin-layer formation; nucleation of structurally ordered domains; ordered framework development and, finally, initiation of amorphization. Furthermore, a comprehensive interplay among phase evolution, surface faceting and elemental inter-diffusion is revealed, and supported by atomistic simulations. This work may pave the way towards designing catalysts through post-synthesis annealing for optimized catalytic performance. PMID:26576477

  11. Surface faceting and elemental diffusion behaviour at atomic scale for alloy nanoparticles during in situ annealing

    DOE PAGES

    Chi, Miaofang; Wang, Chao; Lei, Yinkai; ...

    2015-11-18

    The catalytic performance of nanoparticles is primarily determined by the precise nature of the surface and near-surface atomic configurations, which can be tailored by post-synthesis annealing effectively and straightforwardly. Understanding the complete dynamic response of surface structure and chemistry to thermal treatments at the atomic scale is imperative for the rational design of catalyst nanoparticles. Here, by tracking the same individual Pt3Co nanoparticles during in situ annealing in a scanning transmission electron microscope, we directly discern five distinct stages of surface elemental rearrangements in Pt3Co nanoparticles at the atomic scale: initial random (alloy) elemental distribution; surface platinum-skin-layer formation; nucleation of structurally ordered domains; ordered framework development and, finally, initiation of amorphization. Furthermore, a comprehensive interplay among phase evolution, surface faceting and elemental inter-diffusion is revealed, and supported by atomistic simulations. In conclusion, this work may pave the way towards designing catalysts through post-synthesis annealing for optimized catalytic performance.

  12. Mathematical models, rational choice, and the search for Cold War culture.

    PubMed

    Erickson, Paul

    2010-06-01

    A key feature of the social, behavioral, and biological sciences after World War II has been the widespread adoption of new mathematical techniques drawn from cybernetics, information theory, and theories of rational choice. Historians of science have typically sought to explain this adoption either by reference to military patronage, or to a characteristic Cold War culture or discursive framework strongly shaped by the concerns of national security. This essay explores several episodes in the history of game theory--a mathematical theory of rational choice--that demonstrate the limits of such explanations. Military funding was indeed critical to game theory's early development in the 1940s. However, the theory's subsequent spread across disciplines ranging from political science to evolutionary biology was the result of a diverse collection of debates about the nature of "rationality" and "choice" that marked the Cold War era. These debates are not easily reduced to the national security imperatives that have been the focus of much historiography to date.

  13. Consistent design schematics for biological systems: standardization of representation in biological engineering

    PubMed Central

    Matsuoka, Yukiko; Ghosh, Samik; Kitano, Hiroaki

    2009-01-01

    The discovery by design paradigm driving research in synthetic biology entails the engineering of de novo biological constructs with well-characterized input–output behaviours and interfaces. The construction of biological circuits requires iterative phases of design, simulation and assembly, leading to the fabrication of a biological device. In order to represent engineered models in a consistent visual format and further simulating them in silico, standardization of representation and model formalism is imperative. In this article, we review different efforts for standardization, particularly standards for graphical visualization and simulation/annotation schemata adopted in systems biology. We identify the importance of integrating the different standardization efforts and provide insights into potential avenues for developing a common framework for model visualization, simulation and sharing across various tools. We envision that such a synergistic approach would lead to the development of global, standardized schemata in biology, empowering deeper understanding of molecular mechanisms as well as engineering of novel biological systems. PMID:19493898

  14. A Framework to Expand and Advance Probabilistic Risk Assessment to Support Small Modular Reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Curtis Smith; David Schwieder; Robert Nourgaliev

    2012-09-01

    During the early development of nuclear power plants, researchers and engineers focused on many aspects of plant operation, two of which were getting the newly-found technology to work and minimizing the likelihood of perceived accidents through redundancy and diversity. As time, and our experience, has progressed, the realization of plant operational risk/reliability has entered into the design, operation, and regulation of these plants. But, to date, we have only dabbled at the surface of risk and reliability technologies. For the next generation of small modular reactors (SMRs), it is imperative that these technologies evolve into an accepted, encompassing, validated, and integral part of the plant in order to reduce costs and to demonstrate safe operation. Further, while it is presumed that safety margins are substantial for proposed SMR designs, the depiction and demonstration of these margins needs to be better understood in order to optimize the licensing process.

  15. Mapping polaronic states and lithiation gradients in individual V2O5 nanowires

    PubMed Central

    De Jesus, Luis R.; Horrocks, Gregory A.; Liang, Yufeng; Parija, Abhishek; Jaye, Cherno; Wangoh, Linda; Wang, Jian; Fischer, Daniel A.; Piper, Louis F. J.; Prendergast, David; Banerjee, Sarbajit

    2016-01-01

    The rapid insertion and extraction of Li ions from a cathode material is imperative for the functioning of a Li-ion battery. In many cathode materials such as LiCoO2, lithiation proceeds through solid-solution formation, whereas in other materials such as LiFePO4 lithiation/delithiation is accompanied by a phase transition between Li-rich and Li-poor phases. We demonstrate using scanning transmission X-ray microscopy (STXM) that in individual nanowires of layered V2O5, lithiation gradients observed on Li-ion intercalation arise from electron localization and local structural polarization. Electrons localized on the V2O5 framework couple to local structural distortions, giving rise to small polarons that serve as a bottleneck for further Li-ion insertion. The stabilization of this polaron impedes equilibration of charge density across the nanowire and gives rise to distinctive domains. The enhancement in charge/discharge rates for this material on nanostructuring can be attributed to circumventing challenges with charge transport from polaron formation. PMID:27349567

  16. The Afro-Cardiac Study: Cardiovascular Disease Risk and Acculturation in West African Immigrants in the United States: Rationale and Study Design.

    PubMed

    Commodore-Mensah, Yvonne; Sampah, Maame; Berko, Charles; Cudjoe, Joycelyn; Abu-Bonsrah, Nancy; Obisesan, Olawunmi; Agyemang, Charles; Adeyemo, Adebowale; Himmelfarb, Cheryl Dennison

    2016-12-01

    Cardiovascular disease (CVD) remains the leading cause of death in the United States (US). African-descent populations bear a disproportionate burden of CVD risk factors. With the increase in the number of West African immigrants (WAIs) to the US over the past decades, it is imperative to specifically study this new and substantial subset of the African-descent population and how acculturation impacts their CVD risk. The Afro-Cardiac Study is a community-based cross-sectional study of adult WAIs in the Baltimore-Washington metropolis. Guided by the PRECEDE-PROCEED model, we used a modification of the World Health Organization STEPS survey to collect data on demographics, socioeconomic status, and migration-related factors and behaviors. We obtained physical, biochemical, and acculturation measurements, as well as a socio-demographic and health history. Our study provides critical data on the CVD risk of WAIs. The framework used is valuable for future epidemiological studies addressing CVD risk and acculturation among immigrants.

  17. Space Science in the Twenty-First Century: Imperatives for the Decades 1995 to 2015. Mission to Planet Earth

    NASA Technical Reports Server (NTRS)

    1988-01-01

    A unified program is outlined for studying the Earth, from its deep interior to its fluid envelopes. A system is proposed for measuring devices involving both space-based and in-situ observations that can accommodate simultaneously a large range of scientific needs. The scientific objectives served by this integrated infrastructure are cast into a framework of four grand themes. In summary these are: to determine the composition, structure, dynamics, and evolution of the Earth's crust and deeper interior; to establish and understand the structure, dynamics, and chemistry of the oceans, atmosphere, and cryosphere, and their interaction with the solid Earth; to characterize the history and dynamics of living organisms and their interaction with the environment; and to monitor and understand the interaction of human activities with the natural environment. A focus on these grand themes will help to understand the origin and fate of the planet, and to place it in the context of the solar system.

  18. Law's Dilemma: Validating Complementary and Alternative Medicine and the Clash of Evidential Paradigms

    PubMed Central

    Iyioha, Ireh

    2011-01-01

    This paper examines the (in)compatibility between the diagnostic and therapeutic theories of complementary and alternative medicine (CAM) and a science-based regulatory framework. Specifically, the paper investigates the nexus between statutory legitimacy and scientific validation of health systems, with an examination of its impact on the development of complementary and alternative therapies. The paper evaluates competing theories for validating CAM, ranging from the RCT methodology to anthropological perspectives, and contends that, while the RCT method might be beneficial in the regulation of many CAM therapies, dogmatic adherence to this paradigm as the exclusive method for legitimizing CAM will be adverse to the independent development of many CAM therapies whose philosophies and mechanisms of action are not scientifically interpretable. Drawing on history and research evidence to support this argument, the paper argues for a regulatory model that is accommodative of different evidential paradigms in support of a pluralistic healthcare system that balances the imperative of quality assurance with the need to ensure access. PMID:20953428

  19. Ice Stream Slowdown Will Drive Long-Term Thinning of the Ross Ice Shelf, With or Without Ocean Warming

    NASA Astrophysics Data System (ADS)

    Campbell, Adam J.; Hulbe, Christina L.; Lee, Choon-Ki

    2018-01-01

    As time series observations of Antarctic change proliferate, it is imperative that mathematical frameworks through which they are understood keep pace. Here we present a new method of interpreting remotely sensed change using spatial statistics and apply it to the specific case of thickness change on the Ross Ice Shelf. First, a numerical model of ice shelf flow is used together with empirical orthogonal function analysis to generate characteristic patterns of response to specific forcings. Because they are continuous and scalable in space and time, the patterns allow short duration observations to be placed in a longer time series context. Second, focusing only on changes that are statistically significant, the synthetic response surfaces are used to extract magnitude and timing of past events from the observational data. Slowdown of Kamb and Whillans Ice Streams is clearly detectable in remotely sensed thickness change. Moreover, those past events will continue to drive thinning into the future.
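    The pairing of a forward model with empirical orthogonal function (EOF) analysis rests on a standard matrix decomposition. A minimal sketch of EOF extraction via the SVD, using a synthetic space-time anomaly field as a stand-in for the ice-shelf model output (the data here are illustrative, not from the study):

```python
import numpy as np

def eof_analysis(field, n_modes=2):
    """EOF analysis via singular value decomposition.
    field: (n_times, n_points) matrix of observations.
    Returns spatial patterns (EOFs), principal-component time series,
    and the fraction of variance explained by each retained mode."""
    anomalies = field - field.mean(axis=0)            # remove the time mean
    u, s, vt = np.linalg.svd(anomalies, full_matrices=False)
    explained = s**2 / np.sum(s**2)
    eofs = vt[:n_modes]                               # characteristic spatial patterns
    pcs = u[:, :n_modes] * s[:n_modes]                # their amplitude through time
    return eofs, pcs, explained[:n_modes]

# synthetic stand-in for model output: one dominant standing pattern plus noise
rng = np.random.default_rng(0)
t = np.linspace(0, 4 * np.pi, 100)
x = np.linspace(0, 1, 50)
field = np.outer(np.sin(t), np.exp(-5 * (x - 0.5) ** 2))
field += 0.01 * rng.standard_normal(field.shape)
eofs, pcs, var = eof_analysis(field)
```

    Because the leading EOFs are continuous, scalable patterns, projecting a short observational record onto them is what allows it to be placed in a longer time-series context, as the abstract describes.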

  20. Ethical adoption: A new imperative in the development of technology for dementia.

    PubMed

    Robillard, Julie M; Cleland, Ian; Hoey, Jesse; Nugent, Chris

    2018-06-19

    Technology interventions are showing promise to assist persons with dementia and their carers. However, low adoption rates for these technologies and ethical considerations have impeded the realization of their full potential. Building on recent evidence and an iterative framework development process, we propose the concept of "ethical adoption": the deep integration of ethical principles into the design, development, deployment, and usage of technology. Ethical adoption is founded on five pillars, supported by empirical evidence: (1) inclusive participatory design; (2) emotional alignment; (3) adoption modelling; (4) ethical standards assessment; and (5) education and training. To close the gap between adoption research, ethics and practice, we propose a set of 18 practical recommendations based on these ethical adoption pillars. Through the implementation of these recommendations, researchers and technology developers alike will benefit from evidence-informed guidance to ensure their solution is adopted in a way that maximizes the benefits to people with dementia and their carers while minimizing possible harm. Copyright © 2018 the Alzheimer's Association. Published by Elsevier Inc. All rights reserved.

  1. Experimental research control software system

    NASA Astrophysics Data System (ADS)

    Cohn, I. A.; Kovalenko, A. G.; Vystavkin, A. N.

    2014-05-01

    A software system intended for the automation of small-scale research has been developed. The software allows one to control equipment and to acquire and process data by means of simple scripts. The main purpose of the development is to make experiment automation easier, significantly reducing the effort required to automate an experimental setup. In particular, minimal programming skills are required, and supervisors can review the scripts without difficulty. Interactions between scripts and equipment are managed automatically, allowing multiple scripts to run simultaneously. Unlike well-known commercial data acquisition software systems, control is performed through an imperative scripting language. This approach eases the implementation of complex control and data acquisition algorithms. A modular interface library handles interaction with external interfaces. While the most widely used interfaces are already implemented, a simple framework has been developed for fast implementation of new software and hardware interfaces. Although the software is in continuous development, with new features being implemented, it is already used in our laboratory for the automation of helium-3 cryostat control and data acquisition. The software is open source and distributed under the GNU Public License.
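    The appeal of imperative scripting for this kind of system is that a measurement reads as a plain sequential loop. A minimal sketch of such an acquisition script, with a simulated thermometer standing in for real hardware (the class and method names are illustrative assumptions, not the actual library API):

```python
import time

class Instrument:
    """Hypothetical driver base class: systems of this kind typically hide
    GPIB/serial details behind a uniform read/write interface."""
    def __init__(self, name):
        self.name = name
    def read(self):
        raise NotImplementedError

class SimulatedThermometer(Instrument):
    """Stand-in for a cryostat thermometer so the sketch is runnable."""
    def __init__(self):
        super().__init__("thermo")
        self._temperature = 4.2
    def read(self):
        self._temperature *= 0.99       # pretend the cryostat is cooling
        return self._temperature

def acquisition_script(sensor, n_points, interval_s=0.0):
    """An imperative measurement script: loop, read, store. The point of
    the approach is that experimenters write exactly this kind of plain
    sequential code instead of configuring a GUI."""
    samples = []
    for _ in range(n_points):
        samples.append(sensor.read())
        time.sleep(interval_s)          # pacing between readings
    return samples

data = acquisition_script(SimulatedThermometer(), 5)
```

    In the real system, several such scripts could run concurrently, with the runtime arbitrating access to the shared instruments.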

  2. Moral alchemy: How love changes norms.

    PubMed

    Magid, Rachel W; Schulz, Laura E

    2017-10-01

    We discuss a process by which non-moral concerns (that is concerns agreed to be non-moral within a particular cultural context) can take on moral content. We refer to this phenomenon as moral alchemy and suggest that it arises because moral obligations of care entail recursively valuing loved ones' values, thus allowing propositions with no moral weight in themselves to become morally charged. Within this framework, we predict that when people believe a loved one cares about a behavior more than they do themselves, the moral imperative to care about the loved one's interests will raise the value of that behavior, such that people will be more likely to infer that third parties will see the behavior as wrong (Experiment 1) and the behavior itself as more morally important (Experiment 2) than when the same behaviors are considered outside the context of a caring relationship. The current study confirmed these predictions. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  3. Pharma Opportunities and Risks Multiply as Regulatory Reform Remakes APAC: Expanded Accelerated Pathways Challenge Developer Value Story, Evidence Collection, and Market Access Strategies.

    PubMed

    Grignolo, Alberto; Mingping, Zhang

    2018-01-01

    Sweeping reforms in the largest markets of the Asia-Pacific region are transforming the regulatory and commercial landscape for foreign pharmaceutical companies. Japan, South Korea, and China are leading the charge, establishing mechanisms and infrastructure that both reflect and help drive international regulatory convergence and accelerate delivery of needed, innovative products to patients. In this rapidly evolving regulatory and commercial environment, drug developers can benefit from reforms and proliferating accelerated pathway (AP) frameworks, but only with regulatory and evidence-generation strategies tailored to the region. Otherwise, they will confront significant pricing and reimbursement headwinds. Although APAC economies are at different stages of development, they share a common imperative: to balance pharmaceutical innovation with affordability. Despite the complexity of meeting these sometimes conflicting demands, companies that focus on demonstrating and delivering value for money, and that price new treatments reasonably and sustainably, can succeed both for their shareholders and the region's patient population.

  4. Skills development and structural change: Possibilities for and limitations of redressing structural racial inequalities in South Africa

    NASA Astrophysics Data System (ADS)

    Groener, Zelda

    2013-12-01

    Improving structural racial equality for historically-disadvantaged Black South Africans, including low-skilled and unemployed adults and youths, is a pertinent challenge for the South African government during the ongoing transition from apartheid capitalism to post-apartheid capitalism. Within the framework of the National Skills Development Strategy (NSDS), the introduction of "learnerships" and "learning programmes", which include structured learning programmes, learnerships, apprenticeships and skills programmes, has had some impact. But emerging theoretical perspectives assert that apartheid structural racial inequalities persist and that structural reform is imperative. Opposing positions translate into two perspectives on social transition: either capitalism can be de-racialised, or capitalism in South Africa should be dismantled in order to de-racialise it. After a review of relevant literature and governmental documents, the author identifies five structural and pedagogical barriers as likely causes for low completion rates of skills development courses and concludes that structural reform needs more favourable political and economic conditions in order to be successful.

  5. Algal biorefinery-based industry: an approach to address fuel and food insecurity for a carbon-smart world.

    PubMed

    Subhadra, Bobban

    2011-01-15

    Food and fuel production are intricately interconnected. In a carbon-smart society, it is imperative to produce both food and fuel sustainably. Integration of the emerging biorefinery concept with other industries can bring many environmental deliverables while mitigating several sustainability-related issues with respect to greenhouse gas emissions, fossil fuel usage, land use change for fuel production and future food insufficiency. A new biorefinery-based integrated industrial ecology encompasses the different value chain of products, coproducts, and services from the biorefinery industries. This paper discusses a framework to integrate the algal biofuel-based biorefinery, a booming biofuel sector, with other industries such as livestock, lignocellulosic and aquaculture. Using the USA as an example, this paper also illustrates the benefits associated with sustainable production of fuel and food. Policy and regulatory initiatives for synergistic development of the algal biofuel sector with other industries can bring many sustainable solutions for the future existence of mankind. Copyright © 2010 Society of Chemical Industry.

  6. Climate change and food security in East Asia.

    PubMed

    Su, Yi-Yuan; Weng, Yi-Hao; Chiu, Ya-Wen

    2009-01-01

    Climate change poses serious food security risks for East Asian countries. The United Nations Framework Convention on Climate Change (UNFCCC) has recognized that climate change will impact agriculture and that all nations should prepare adaptations to the impacts on food security. This article reviews the context of adaptation rules and current policy development in the East Asian region. The UNFCCC and Kyoto Protocol have established specific rules for countries to develop national or regional adaptation policies and measurements. The current development of the ASEAN Strategic Plan on food security is inspiring, but the commitments to implementation by its members remain an issue of concern. We suggest that the UNFCCC enhance co-operation with the Food and Agriculture Organization (FAO) and other international organizations to further develop methodologies and technologies for all parties. Our findings suggest that agriculture is one of the most vulnerable sectors in terms of risks associated with climate change and that distinct programmatic initiatives are necessary. It is imperative to promote co-operation among multilateral organizations, including the UNFCCC, FAO, World Health Organization, and others.

  7. Sand dredging and environmental efficiency of artisanal fishermen in Lagos state, Nigeria.

    PubMed

    Sowunmi, Fatai A; Hogarh, Jonathan N; Agbola, Peter O; Atewamba, Calvin

    2016-03-01

    Environmentally detrimental input (water turbidity) and conventional production inputs were considered within the framework of stochastic frontier analysis to estimate technical and environmental efficiencies of fishermen in sand dredging and non-dredging areas. Environmental efficiency was low among fishermen in the sand dredging areas. Educational status and experience in fishing and sand dredging were the factors influencing environmental efficiency in the sand dredging areas. The average quantity of fish caught per labour-hour was higher among fishermen in the non-dredging areas. Fishermen in the fishing community around the dredging areas travelled long distances in order to reduce the negative effect of sand dredging on their fishing activity. The study affirmed large household size among fishermen. The need to regulate the activities of sand dredgers by restricting licenses for sand dredging to non-fishing communities, as well as intensifying family planning campaigns in fishing communities to reduce the negative effect of large household size on fishing, is imperative for the sustainability of artisanal fishing.

  8. 14 CFR 1203.302 - Combination, interrelation or compilation.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... INFORMATION SECURITY PROGRAM Classification Principles and Considerations § 1203.302 Combination.... Compilations of unclassified information are considered unclassified unless some additional significant factor is added in the process of compilation. For example: (a) The way unclassified information is compiled...

  9. 14 CFR 1203.302 - Combination, interrelation or compilation.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... INFORMATION SECURITY PROGRAM Classification Principles and Considerations § 1203.302 Combination.... Compilations of unclassified information are considered unclassified unless some additional significant factor is added in the process of compilation. For example: (a) The way unclassified information is compiled...

  10. Looking Up: Conditions for Insurgent Airpower in Unconventional Warfare

    DTIC Science & Technology

    2017-12-01

    development of insurgent air capabilities, it does not expound on the idea. This study examines the conditions needed to build an insurgent air ... of UW, insurgencies, and air operations, the study forms theorized conditions and employment imperatives for insurgent air. It then tests these ... theorized conditions and imperatives against two historic case studies, Hmong pilots in Laos and the Tamil Air Tigers in Sri Lanka. This study concludes

  11. The Ethical Imperative of Reason: How Anti-Intellectualism, Denialism, and Apathy Threaten National Security

    DTIC Science & Technology

    2016-03-01

    Naval Postgraduate School, Monterey, California. Master's thesis, March 2016. Approved for public release; distribution is unlimited. The thesis examines anti-intellectualism in the policy process and demonstrates that, in the intricate and dynamic matters of our nation's security, there is an ethical imperative of reason.

  12. Just Learning: The Imperative to Transform Juvenile Justice Systems into Effective Educational Systems. A Study of Juvenile Justice Schools in the South and the Nation. Special Summary

    ERIC Educational Resources Information Center

    Southern Education Foundation, 2014

    2014-01-01

    This brief summarizes the findings of the larger study, "Just Learning: The Imperative to Transform Juvenile Justice Systems into Effective Educational Systems. A Study of Juvenile Justice Schools in the South and the Nation." With awareness growing that schools are disciplining and suspending minority students at alarming rates, the…

  13. "Don't Bother with That": The Use of Negative Imperative Directives for Defusing Student Conflict in a Special Support Classroom

    ERIC Educational Resources Information Center

    Svahn, Johanna

    2017-01-01

    This article examines episodes of potential student conflict in a Swedish special support classroom in which teachers deploy a particular type of directive in the form of a negative imperative: "bry dig inte" (Eng. "don't mind …; don't bother …"). The analyses of three such extended episodes, by use of a conversation analytic…

  14. An assessment of the impact of the pet trade on five CITES-Appendix II case studies - Boa constrictor imperator

    USGS Publications Warehouse

    Montgomery, Chad E.; Boback, Scott M.; Reed, Robert N.; Frazier, Julius A.

    2015-01-01

    Boa constrictor is a wide ranging snake species that is common in the pet trade and is currently listed in CITES Appendix II. Hog Island boas, or Cayos Cochinos boas, are a dwarf, insular race of Boa constrictor imperator endemic to the Cayos Cochinos Archipelago, Honduras. Cayos Cochinos boas are prized in the international pet trade for their light pink dorsal coloration, as well as for being much smaller and more docile than mainland boas (Porras, 1999; Russo, 2007). The boa population in the Cayos Cochinos was heavily exploited for the pet trade from 1979 to 1993, and researchers reported finding no boas on the islands during a five day herpetological survey trip in the early 1990s (Wilson and CruzDiaz, 1993), leading to the speculation that the population had been extirpated (e.g., Russo, 2007). The Cayos Cochinos Archipelago Natural Marine Monument has been managed by the Honduran Coral Reef Foundation since 1994 and prohibits removal of boas from the area. Poaching for the pet trade continues today, although at a lower level. Due to the endemic nature of this island morph of B. c. imperator it is imperative that we understand the dynamics of the populations and the ongoing threats that could negatively impact their long-term survival.

  15. Distributed memory compiler design for sparse problems

    NASA Technical Reports Server (NTRS)

    Wu, Janet; Saltz, Joel; Berryman, Harry; Hiranandani, Seema

    1991-01-01

    A compiler and runtime support mechanism is described and demonstrated. The methods presented are capable of solving a wide range of sparse and unstructured problems in scientific computing. The compiler takes as input a FORTRAN 77 program enhanced with specifications for distributing data, and the compiler outputs a message passing program that runs on a distributed memory computer. The runtime support for this compiler is a library of primitives designed to efficiently support irregular patterns of distributed array accesses and irregular distributed array partitions. A variety of Intel iPSC/860 performance results obtained through the use of this compiler are presented.
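    The runtime primitives described, which support irregular patterns of distributed array accesses, are in the spirit of inspector/executor schemes: a preprocessing pass analyzes the indices an irregular loop will touch so that off-processor data can be fetched in batched messages before the loop runs. The following single-process sketch illustrates that pattern only; the function names and the simulated fetch are illustrative, not the library's actual interface:

```python
def inspector(index_list, owner_of):
    """Inspector phase: given the irregular global indices a loop will
    access, group them by owning processor so communication can be
    batched into one exchange per partner instead of one per element."""
    requests = {}
    for i in index_list:
        requests.setdefault(owner_of(i), []).append(i)
    return requests

def executor(requests, fetch, index_list):
    """Executor phase: perform the batched gathers once, then run the
    computation loop entirely out of a local buffer."""
    gathered = {}
    for owner, idxs in requests.items():
        gathered.update(fetch(owner, idxs))   # one batched exchange per owner
    return [gathered[i] for i in index_list]

# toy setup: a global array block-distributed over two "processors"
global_array = [10, 11, 12, 13, 14, 15, 16, 17]
owner_of = lambda i: i // 4                   # indices 0-3 on proc 0, 4-7 on proc 1
fetch = lambda proc, idxs: {i: global_array[i] for i in idxs}  # simulated remote gather
indices = [5, 1, 7, 1]                        # irregular access pattern
req = inspector(indices, owner_of)
vals = executor(req, fetch, indices)
```

    In a real message-passing setting the inspector's schedule would be reused across loop iterations whose access pattern does not change, which is where most of the performance benefit comes from.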

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Long, Jeffrey R.

    The design and characterization of new materials for hydrogen storage is an important area of research, as the ability to store hydrogen at lower pressures and higher temperatures than currently feasible would lower operating costs for small hydrogen fuel cell vehicles. In particular, metal-organic frameworks (MOFs) represent promising materials for use in storing hydrogen in this capacity. MOFs are highly porous, three-dimensional crystalline solids that are formed via linkages between metal ions (e.g., iron, nickel, and zinc) and organic molecules. MOFs can store hydrogen via strong adsorptive interactions between the gas molecules and the pores of the framework, providing a high surface area for gas adsorption and thus the opportunity to store hydrogen at significantly lower pressures than with current technologies. By lowering the energy required for hydrogen storage, these materials hold promise in rendering hydrogen a more viable fuel for motor vehicles, which is a highly desirable outcome given the clean nature of hydrogen fuel cells (water is the only byproduct of combustion) and the current state of global climate change resulting from the combustion of fossil fuels. The work presented in this report is the result of collaborative efforts between researchers at Lawrence Berkeley National Lab (LBNL), the National Institute of Standards and Technology (NIST), and General Motors Corporation (GM) to discover novel MOFs promising for H2 storage and characterize their properties. Described herein are several new framework systems with improved gravimetric and volumetric capacity to strongly bind H2 at temperatures relevant for vehicle storage. These materials were rigorously characterized using neutron diffraction, to determine the precise binding locations of hydrogen within the frameworks, and high-pressure H2 adsorption measurements, to provide a comprehensive picture of H2 adsorption at all relevant pressures. A rigorous understanding of experimental findings was further achieved via first-principles electronic structure calculations, which also supported synthetic efforts through predictions of additional novel frameworks with promising properties for vehicular H2 storage. The results of the computational efforts also helped to elucidate the fundamental principles governing the interaction of H2 with the frameworks, and in particular with exposed metal sites in the pores of these materials. Significant accomplishments from this project include the discovery of a metal-organic framework with a high H2 binding enthalpy and volumetric capacity at 25 °C and 100 bar, which surpasses the metrics of any other known metal-organic framework. Additionally, this material was designed to be extremely cost effective compared to most comparable adsorbents, which is imperative for eventual real-world applications. Progress toward synthesizing new frameworks containing multiple open coordination sites is also discussed, and appears to be the most promising future direction for hydrogen storage in these porous materials.

  17. Development and promotion of Malaysian Dietary Guidelines.

    PubMed

    Tee, E-Siong

    2011-01-01

    Development and promotion of dietary guidelines is one of the key activities outlined in the National Plan of Action for Nutrition of Malaysia for the prevention of nutrition-related disorders. The first official Malaysian Dietary Guidelines (MDG) were published in 1999 and were thoroughly reviewed before the revised edition was launched on 25 March 2010. The new MDG 2010 is a compilation of science-based nutrition and physical activity recommendations. These guidelines form the basis of consistent and scientifically sound nutrition messages for the public. There are 14 key messages and 55 recommendations, covering the whole range of food and nutrition issues, from the importance of consuming a variety of foods to guidance on specific food groups, messages to encourage physical activity, consuming safe food and beverages, and making effective use of nutrition information on food labels. The MDG also has an updated food pyramid. Various efforts have been made to ensure that the revised MDG is disseminated to all stakeholders. The Ministry of Health has organised a series of workshops for nutritionists, other health care professionals, and the food industry. In collaboration with other professional bodies and the private sector, the Nutrition Society of Malaysia has been promoting the dissemination and usage of the MDG to the public through a variety of formats and channels. These include the publication of a series of leaflets, educational press articles, and educational booklets, as well as educational activities for children. It is imperative to monitor the usage of these dietary messages and to evaluate their impact.

  18. A survey of disease connections for CD4+ T cell master genes and their directly linked genes.

    PubMed

    Li, Wentian; Espinal-Enríquez, Jesús; Simpfendorfer, Kim R; Hernández-Lemus, Enrique

    2015-12-01

    Genome-wide association studies and other genetic analyses have identified a large number of genes and variants implicating a variety of disease etiological mechanisms. It is imperative for the study of human diseases to put these genetic findings into a coherent functional context. Here we use systems biology tools to examine disease connections of five master genes for CD4+ T cell subtypes (TBX21, GATA3, RORC, BCL6, and FOXP3). We compiled a list of genes functionally interacting (by protein-protein interaction, or by acting in the same pathway) with the master genes, then we surveyed the disease connections, either by experimental evidence or by genetic association. Embryonic lethal genes (also known as essential genes) are over-represented among master genes and their interacting genes (55% versus 40% in other genes). Transcription factors are significantly enriched among genes interacting with the master genes (63% versus 10% in other genes). Predicted haploinsufficiency is a feature of most of these genes. Disease-connected genes are enriched in this list of genes: 42% of these genes have a disease connection according to Online Mendelian Inheritance in Man (OMIM) (versus 23% in other genes), and 74% are associated with some disease or phenotype in a Genome Wide Association Study (GWAS) (versus 43% in other genes). Notably, not all of the diseases connected to the genes surveyed were immune related, which may indicate pleiotropic functions of the master regulator genes and associated genes. Copyright © 2015 Elsevier Ltd. All rights reserved.

  19. Organic food: nutritious food or food for thought? A review of the evidence.

    PubMed

    Magkos, Faidon; Arvaniti, Fotini; Zampelas, Antonis

    2003-09-01

    Apparently, one of the primary reasons for purchasing organic food is the perception that it is more nutritious than conventional food. Given the increasing interest in organic food products, it is imperative to review the existing literature concerning the nutritional value of the produce, and to determine to what extent consumer expectations are met. There are only a few well-controlled studies that are capable of making a valid comparison and, therefore, compilation of the results is difficult and generalisation of the conclusions should be made with caution. In spite of these limitations, however, some differences can be identified. Although there is little evidence that organic and conventional foods differ with respect to the concentrations of the various micronutrients (vitamins, minerals and trace elements), there seems to be a slight trend towards higher ascorbic acid content in organically grown leafy vegetables and potatoes. There is also a trend towards lower protein concentration but of higher quality in some organic vegetables and cereal crops. With respect to the rest of the nutrients and the other food groups, existing evidence is inadequate to allow for valid conclusions. Finally, animal feeding experiments indicate that animal health and reproductive performance are slightly improved when the animals are organically fed. A similar finding has not yet been identified in humans. Several important directions can be highlighted for future research; it seems, however, that despite any differences, a well-balanced diet can equally improve health regardless of its organic or conventional origin.

  20. MCdevelop - a universal framework for Stochastic Simulations

    NASA Astrophysics Data System (ADS)

    Slawinska, M.; Jadach, S.

    2011-03-01

    We present MCdevelop, a universal computer framework for developing and exploiting the wide class of Stochastic Simulations (SS) software. This powerful universal SS software development tool has been derived from a series of scientific projects for precision calculations in high energy physics (HEP), which feature a wide range of functionality in the SS software needed for advanced precision Quantum Field Theory calculations for the past LEP experiments and for the ongoing LHC experiments at CERN, Geneva. MCdevelop is a "spin-off" product of HEP to be exploited in other areas, while it will still serve to develop new SS software for HEP experiments. Typically SS involve independent generation of large sets of random "events", often requiring considerable CPU power. Since SS jobs usually do not share memory, they are easy to parallelize. Efficient development, testing, and parallel running of SS software require a convenient framework for developing the software source code, deploying and monitoring batch jobs, and merging and analysing results from multiple parallel jobs, even before the production runs are terminated. Throughout the years of development of stochastic simulations for HEP, a sophisticated framework featuring all the above-mentioned functionality has been implemented. MCdevelop represents its latest version, written mostly in C++ (GNU compiler gcc). It uses Autotools to build binaries (optionally managed within the KDevelop 3.5.3 Integrated Development Environment (IDE)). It uses the open-source ROOT package for histogramming, graphics and the mechanism of persistency for the C++ objects. MCdevelop helps to run multiple parallel jobs on any computer cluster with an NQS-type batch system.
    Program summary
    Program title: MCdevelop
    Catalogue identifier: AEHW_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEHW_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 48 136
    No. of bytes in distributed program, including test data, etc.: 355 698
    Distribution format: tar.gz
    Programming language: ANSI C++
    Computer: Any computer system or cluster with a C++ compiler and a UNIX-like operating system.
    Operating system: Most UNIX systems, Linux. The application programs were thoroughly tested under Ubuntu 7.04, 8.04 and CERN Scientific Linux 5.
    Has the code been vectorised or parallelised?: Tools (scripts) for optional parallelisation on a PC farm are included.
    RAM: 500 bytes
    Classification: 11.3
    External routines: ROOT package version 5.0 or higher (http://root.cern.ch/drupal/).
    Nature of problem: Developing any type of stochastic simulation program for high energy physics and other areas.
    Solution method: Object Oriented programming in C++ with added persistency mechanism, batch scripts for running on PC farms and Autotools.
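
    Because SS jobs share no memory, the workflow this kind of framework automates boils down to: run many independent jobs with distinct random seeds, then merge their partial results. The Python sketch below is a single-machine stand-in for that workflow (MCdevelop itself is C++ with batch scripts and ROOT histograms); the pi-estimation payload and the result-record format are illustrative assumptions.

```python
import random

def mc_job(seed, n_events):
    """One independent stochastic-simulation job: estimate pi by hit counting."""
    rng = random.Random(seed)          # per-job seed makes jobs independent
    hits = sum(1 for _ in range(n_events)
               if rng.random() ** 2 + rng.random() ** 2 < 1.0)
    return {"events": n_events, "hits": hits}

def merge(results):
    """Merge partial results from parallel jobs, as the framework does for batch output."""
    events = sum(r["events"] for r in results)
    hits = sum(r["hits"] for r in results)
    return {"events": events, "hits": hits, "pi": 4.0 * hits / events}
```

    Because the merge is a plain sum over job records, it can be run repeatedly on whatever jobs have finished so far, which is how results can be inspected before the production runs terminate.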

  1. Compiler-assisted multiple instruction rollback recovery using a read buffer

    NASA Technical Reports Server (NTRS)

    Alewine, N. J.; Chen, S.-K.; Fuchs, W. K.; Hwu, W.-M.

    1993-01-01

    Multiple instruction rollback (MIR) is a technique that has been implemented in mainframe computers to provide rapid recovery from transient processor failures. Hardware-based MIR designs eliminate rollback data hazards by providing data redundancy implemented in hardware. Compiler-based MIR designs have also been developed which remove rollback data hazards directly with data-flow transformations. This paper focuses on compiler-assisted techniques to achieve multiple instruction rollback recovery. We observe that some data hazards resulting from instruction rollback can be resolved efficiently by providing an operand read buffer while others are resolved more efficiently with compiler transformations. A compiler-assisted multiple instruction rollback scheme is developed which combines hardware-implemented data redundancy with compiler-driven hazard removal transformations. Experimental performance evaluations indicate improved efficiency over previous hardware-based and compiler-based schemes.
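
    One class of rollback data hazards is an anti-dependency inside the rollback window: a register is read and then overwritten fewer than N instructions later, so re-execution after a rollback would see the clobbered value. The toy Python sketch below detects such hazards and removes them by renaming the offending write, in the spirit of the compiler-driven transformations described above; the instruction encoding, window size, and fresh-register naming are invented for illustration.

```python
def find_rollback_hazards(code, n):
    """Find anti-dependency hazards: register written at j but read within the
    previous n instructions. Instructions are toy (dest, [srcs]) tuples."""
    hazards = []
    for j, (dest, srcs) in enumerate(code):
        for i in range(max(0, j - n), j):
            if dest in code[i][1]:
                hazards.append((i, j, dest))
                break
    return hazards

def rename_hazards(code, n):
    """Remove each hazard by giving the hazardous write a fresh register and
    redirecting later reads, until the original register is redefined."""
    code = [(d, list(s)) for d, s in code]
    fresh = 0
    for i, j, reg in find_rollback_hazards(code, n):
        new = f"t{fresh}"          # fresh name; a real compiler checks collisions
        fresh += 1
        code[j] = (new, code[j][1])
        for k in range(j + 1, len(code)):
            d, s = code[k]
            code[k] = (d, [new if r == reg else r for r in s])
            if d == reg:
                break              # reg redefined; later uses see the new definition
    return code
```

    In the paper's scheme, hazards like these are split between the hardware read buffer and compiler renaming; the sketch shows only the renaming half.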

  2. Context-sensitive trace inlining for Java.

    PubMed

    Häubl, Christian; Wimmer, Christian; Mössenböck, Hanspeter

    2013-12-01

    Method inlining is one of the most important optimizations in method-based just-in-time (JIT) compilers. It widens the compilation scope and therefore allows optimizing multiple methods as a whole, which increases the performance. However, if method inlining is used too frequently, the compilation time increases and too much machine code is generated. This has negative effects on the performance. Trace-based JIT compilers only compile frequently executed paths, so-called traces, instead of whole methods. This may result in faster compilation, less generated machine code, and better optimized machine code. In previous work, we implemented a trace recording infrastructure and a trace-based compiler for Java by modifying the Java HotSpot VM. Based on this work, we evaluate the effect of trace inlining on the performance and the amount of generated machine code. Trace inlining has several major advantages when compared to method inlining. First, trace inlining is more selective than method inlining, because only frequently executed paths are inlined. Second, the recorded traces may capture information about virtual calls, which simplifies inlining. A third advantage is that trace information is context sensitive so that different method parts can be inlined depending on the specific call site. These advantages allow more aggressive inlining while the amount of generated machine code is still reasonable. We evaluate several inlining heuristics on the benchmark suites DaCapo 9.12 Bach, SPECjbb2005, and SPECjvm2008 and show that our trace-based compiler achieves an up to 51% higher peak performance than the method-based Java HotSpot client compiler. Furthermore, we show that the large compilation scope of our trace-based compiler has a positive effect on other compiler optimizations such as constant folding or null check elimination.

  3. Protocol: developing a conceptual framework of patient mediated knowledge translation, systematic review using a realist approach.

    PubMed

    Gagliardi, Anna R; Légaré, France; Brouwers, Melissa C; Webster, Fiona; Wiljer, David; Badley, Elizabeth; Straus, Sharon

    2011-03-22

    Patient involvement in healthcare represents the means by which to achieve a healthcare system that is responsive to patient needs and values. Characterization and evaluation of strategies for involving patients in their healthcare may benefit from a knowledge translation (KT) approach. The purpose of this knowledge synthesis is to develop a conceptual framework for patient-mediated KT interventions. A preliminary conceptual framework for patient-mediated KT interventions was compiled to describe intended purpose, recipients, delivery context, intervention, and outcomes. A realist review will be conducted in consultation with stakeholders from the arthritis and cancer fields to explore how these interventions work, for whom, and in what contexts. To identify patient-mediated KT interventions in these fields, we will search MEDLINE, the Cochrane Library, and EMBASE from 1995 to 2010; scan references of all eligible studies; and examine five years of tables of contents for journals likely to publish quantitative or qualitative studies that focus on developing, implementing, or evaluating patient-mediated KT interventions. Screening and data collection will be performed independently by two individuals. The conceptual framework of patient-mediated KT options and outcomes could be used by healthcare providers, managers, educationalists, patient advocates, and policy makers to guide program planning, service delivery, and quality improvement and by us and other researchers to evaluate existing interventions or develop new interventions. By raising awareness of options for involving patients in improving their own care, outcomes based on using a KT approach may lead to greater patient-centred care delivery and improved healthcare outcomes.

  4. Regionally Adaptable Ground Motion Prediction Equation (GMPE) from Empirical Models of Fourier and Duration of Ground Motion

    NASA Astrophysics Data System (ADS)

    Bora, Sanjay; Scherbaum, Frank; Kuehn, Nicolas; Stafford, Peter; Edwards, Benjamin

    2016-04-01

    The current practice of deriving empirical ground motion prediction equations (GMPEs) involves using ground motions recorded at multiple sites. However, in applications such as site-specific hazard analysis (e.g., for critical facilities), ground motions obtained from GMPEs need to be adjusted/corrected to the particular site or site condition under investigation. This study presents a complete framework for developing a response spectral GMPE, within which the issue of adjustment of ground motions is addressed in a manner consistent with the linear system framework. The present approach is a two-step process in which the first step consists of deriving two separate empirical models, one for Fourier amplitude spectra (FAS) and the other for a random vibration theory (RVT) optimized duration (Drvto) of ground motion. In the second step the two models are combined within the RVT framework to obtain full response spectral amplitudes. Additionally, the framework involves a stochastic-model-based extrapolation of individual Fourier spectra to extend the usable frequency limit of the empirically derived FAS model. The stochastic model parameters were determined by inverting the Fourier spectral data using an approach similar to the one described in Edwards and Faeh (2013). Comparison of median predicted response spectra from the present approach with those from other regional GMPEs indicates that the present approach can also be used as a stand-alone model. The dataset used for the presented analysis is a subset of the recently compiled database RESORCE-2012, covering Europe, the Middle East and the Mediterranean region.

  5. Protocol: developing a conceptual framework of patient mediated knowledge translation, systematic review using a realist approach

    PubMed Central

    2011-01-01

    Background Patient involvement in healthcare represents the means by which to achieve a healthcare system that is responsive to patient needs and values. Characterization and evaluation of strategies for involving patients in their healthcare may benefit from a knowledge translation (KT) approach. The purpose of this knowledge synthesis is to develop a conceptual framework for patient-mediated KT interventions. Methods A preliminary conceptual framework for patient-mediated KT interventions was compiled to describe intended purpose, recipients, delivery context, intervention, and outcomes. A realist review will be conducted in consultation with stakeholders from the arthritis and cancer fields to explore how these interventions work, for whom, and in what contexts. To identify patient-mediated KT interventions in these fields, we will search MEDLINE, the Cochrane Library, and EMBASE from 1995 to 2010; scan references of all eligible studies; and examine five years of tables of contents for journals likely to publish quantitative or qualitative studies that focus on developing, implementing, or evaluating patient-mediated KT interventions. Screening and data collection will be performed independently by two individuals. Conclusions The conceptual framework of patient-mediated KT options and outcomes could be used by healthcare providers, managers, educationalists, patient advocates, and policy makers to guide program planning, service delivery, and quality improvement and by us and other researchers to evaluate existing interventions or develop new interventions. By raising awareness of options for involving patients in improving their own care, outcomes based on using a KT approach may lead to greater patient-centred care delivery and improved healthcare outcomes. PMID:21426573

  6. Array distribution in data-parallel programs

    NASA Technical Reports Server (NTRS)

    Chatterjee, Siddhartha; Gilbert, John R.; Schreiber, Robert; Sheffler, Thomas J.

    1994-01-01

    We consider distribution at compile time of the array data in a distributed-memory implementation of a data-parallel program written in a language like Fortran 90. We allow dynamic redistribution of data and define a heuristic algorithmic framework that chooses distribution parameters to minimize an estimate of program completion time. We represent the program as an alignment-distribution graph. We propose a divide-and-conquer algorithm for distribution that initially assigns a common distribution to each node of the graph and successively refines this assignment, taking computation, realignment, and redistribution costs into account. We explain how to estimate the effect of distribution on computation cost and how to choose a candidate set of distributions. We present the results of an implementation of our algorithms on several test problems.
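
    The core decision can be illustrated with a toy cost model: each program phase prefers some distribution of an array, and changing distributions between phases incurs a redistribution cost, so the compiler minimizes the estimated total. The sketch below uses exhaustive search over a hypothetical three-phase program rather than the paper's alignment-distribution graph and divide-and-conquer refinement; the phase names and costs are made up.

```python
from itertools import product

PHASES = ["fft", "transpose", "solve"]   # hypothetical program phases
DISTS = ["block", "cyclic"]              # candidate array distributions
COMP = {("fft", "block"): 8, ("fft", "cyclic"): 5,       # per-phase compute cost
        ("transpose", "block"): 3, ("transpose", "cyclic"): 6,
        ("solve", "block"): 4, ("solve", "cyclic"): 9}
REDIST = 4                               # cost of redistributing between phases

def completion_time(assign):
    """Estimated completion time: compute costs plus one redistribution per change."""
    t = sum(COMP[(p, d)] for p, d in zip(PHASES, assign))
    t += sum(REDIST for a, b in zip(assign, assign[1:]) if a != b)
    return t

def best_assignment():
    """Pick the distribution per phase that minimizes the estimated completion time."""
    return min(product(DISTS, repeat=len(PHASES)), key=completion_time)
```

    With these invented numbers, staying in a block distribution throughout wins even though "fft" alone would prefer cyclic, because two redistributions cost more than they save; that trade-off is exactly what the paper's cost-driven algorithm captures at scale.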

  7. The GED4GEM project: development of a Global Exposure Database for the Global Earthquake Model initiative

    USGS Publications Warehouse

    Gamba, P.; Cavalca, D.; Jaiswal, K.S.; Huyck, C.; Crowley, H.

    2012-01-01

    In order to quantify earthquake risk of any selected region or a country of the world within the Global Earthquake Model (GEM) framework (www.globalquakemodel.org/), a systematic compilation of building inventory and population exposure is indispensable. Through the consortium of leading institutions and by engaging the domain-experts from multiple countries, the GED4GEM project has been working towards the development of a first comprehensive publicly available Global Exposure Database (GED). This geospatial exposure database will eventually facilitate global earthquake risk and loss estimation through GEM’s OpenQuake platform. This paper provides an overview of the GED concepts, aims, datasets, and inference methodology, as well as the current implementation scheme, status and way forward.

  8. Code Verification Capabilities and Assessments in Support of ASC V&V Level 2 Milestone #6035

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doebling, Scott William; Budzien, Joanne Louise; Ferguson, Jim Michael

    This document provides a summary of the code verification activities supporting the FY17 Level 2 V&V milestone entitled “Deliver a Capability for V&V Assessments of Code Implementations of Physics Models and Numerical Algorithms in Support of Future Predictive Capability Framework Pegposts.” The physics validation activities supporting this milestone are documented separately. The objectives of this portion of the milestone are: 1) Develop software tools to support code verification analysis; 2) Document standard definitions of code verification test problems; and 3) Perform code verification assessments (focusing on error behavior of algorithms). This report and a set of additional standalone documents serve as the compilation of results demonstrating accomplishment of these objectives.

  9. A novel data-mining approach leveraging social media to monitor consumer opinion of sitagliptin.

    PubMed

    Akay, Altug; Dragomir, Andrei; Erlandsson, Björn-Erik

    2015-01-01

    A novel data mining method was developed to gauge the experience of the drug sitagliptin (trade name Januvia) by patients with diabetes mellitus type 2. To this end, we devised a two-step analysis framework. Initial exploratory analysis using self-organizing maps was performed to determine structures based on user opinions among the forum posts. The results were a compilation of user clusters and their correlated (positive or negative) opinions of the drug. Subsequent modeling using network analysis methods was used to determine influential users among the forum members. These findings can open new avenues of research into rapid data collection, feedback, and analysis that can enable improved outcomes and solutions for public health and provide important feedback for the manufacturer.
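
    A self-organizing map (SOM) projects high-dimensional vectors (here, hypothetical per-post feature vectors) onto a small grid of units so that similar posts land on nearby units, which is what makes the cluster structure visible. The sketch below is a minimal 1-D SOM written from scratch purely for illustration; a real study would use a dedicated library, a larger grid, and far richer text features than these two made-up dimensions.

```python
import numpy as np

def train_som(data, grid=4, epochs=200, lr=0.5, seed=0):
    """Train a minimal 1-D self-organizing map on rows of `data`."""
    rng = np.random.default_rng(seed)
    w = rng.random((grid, data.shape[1]))            # random initial unit weights
    for t in range(epochs):
        for x in data:
            bmu = np.argmin(((w - x) ** 2).sum(axis=1))   # best-matching unit
            for j in range(grid):
                h = np.exp(-((j - bmu) ** 2) / 2.0)       # neighborhood kernel
                w[j] += lr * (1 - t / epochs) * h * (x - w[j])
    return w

def assign(data, w):
    """Map each data row to the index of its best-matching unit."""
    return [int(np.argmin(((w - x) ** 2).sum(axis=1))) for x in data]
```

    After training, posts that express similar opinions fall on the same or neighboring units, and those groupings can then be fed into the second, network-analysis step.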

  10. Discovery and problem solving: Triangulation as a weak heuristic

    NASA Technical Reports Server (NTRS)

    Rochowiak, Daniel

    1987-01-01

    Recently the artificial intelligence community has turned its attention to the process of discovery and found that the history of science is a fertile source for what Darden has called compiled hindsight. Such hindsight generates weak heuristics for discovery that do not guarantee that discoveries will be made but do have proven worth in leading to discoveries. Triangulation is one such heuristic that is grounded in historical hindsight. This heuristic is explored within the general framework of the BACON, GLAUBER, STAHL, DALTON, and SUTTON programs. In triangulation different bases of information are compared in an effort to identify gaps between the bases. Thus, assuming that the bases of information are relevantly related, the gaps that are identified should be good locations for discovery and robust analysis.
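
    The gap-finding step of triangulation can be pictured as comparing two bases of information and flagging what one base records that the other lacks, plus outright conflicts between them. A minimal sketch, with made-up observation dictionaries standing in for the bases:

```python
def triangulate(base_a, base_b):
    """Compare two bases of information; gaps and conflicts mark candidate
    locations for discovery, assuming the bases are relevantly related."""
    keys_a, keys_b = set(base_a), set(base_b)
    return {
        "missing_from_b": sorted(keys_a - keys_b),   # known to A, absent from B
        "missing_from_a": sorted(keys_b - keys_a),   # known to B, absent from A
        "conflicts": sorted(k for k in keys_a & keys_b
                            if base_a[k] != base_b[k]),
    }
```

    Like the heuristic itself, the sketch offers no guarantee that a flagged gap holds a discovery; it only narrows where to look.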

  11. Statistical Analysis of Protein Ensembles

    NASA Astrophysics Data System (ADS)

    Máté, Gabriell; Heermann, Dieter

    2014-04-01

    As 3D protein-configuration data piles up, there is an ever-increasing need for well-defined, mathematically rigorous analysis approaches, especially since the vast majority of the currently available methods rely heavily on heuristics. We propose an analysis framework which stems from topology, the field of mathematics which studies properties preserved under continuous deformations. First, we calculate a barcode representation of the molecules employing computational topology algorithms. Bars in this barcode represent different topological features. Molecules are compared through their barcodes by statistically determining the difference in the set of their topological features. As a proof-of-principle application, we analyze a dataset compiled of ensembles of different proteins, obtained from the Ensemble Protein Database. We demonstrate that our approach correctly detects the different protein groupings.

  12. Research and Practice of the News Map Compilation Service

    NASA Astrophysics Data System (ADS)

    Zhao, T.; Liu, W.; Ma, W.

    2018-04-01

    Based on the needs of the news media for maps, this paper researches the news map compilation service. It conducts demand research on the service of compiling news maps, designs and compiles a public-authority base map suitable for media publication, and constructs a news base map material library. It studies the compilation of domestic and international news maps with timeliness, strong pertinence, and cross-regional characteristics; constructs a hot-news thematic gallery and news map customization services; conducts research on types of news maps; establishes closer liaison and cooperation methods with news media; and guides news media to use correct maps. Through the practice of the news map compilation service, this paper presents two cases of news map preparation services used by different media, compares and analyses the cases, summarizes the research situation of the news map compilation service, and puts forward outstanding problems and development suggestions for the service.

  13. Ecological Feasibility Studies in Restoration Decision Making

    NASA Astrophysics Data System (ADS)

    Hopfensperger, Kristine N.; Engelhardt, Katharina A. M.; Seagle, Steven W.

    2007-06-01

    The restoration of degraded systems is essential for maintaining the provision of valuable ecosystem services, including the maintenance of aesthetic values. However, restoration projects often fail to reach desired goals for a variety of ecologic, financial, and social reasons. Feasibility studies that evaluate whether a restoration effort should even be attempted can enhance restoration success by highlighting potential pitfalls and gaps in knowledge before the design phase of a restoration. Feasibility studies also can bring stakeholders together before a restoration project is designed to discuss potential disagreements. For these reasons, a feasibility study was conducted to evaluate the efficacy of restoring a tidal freshwater marsh in the Potomac River near Alexandria, Virginia. The study focused on science rather than engineering questions, and thus differed in approach from other feasibility studies that are mostly engineering driven. The authors report the framework they used to conduct a feasibility study to inform other potential restoration projects with similar goals. The seven steps of the framework encompass (1) initiation of a feasibility study, (2) compilation of existing data, (3) collection of current site information, (4) examination of case studies, (5) synthesis of information in a handbook, (6) meeting with selected stakeholders, and (7) evaluation of meeting outcomes. By conducting a feasibility study using the seven-step framework, the authors set the stage for conducting future compliance studies and enhancing the chance of a successful restoration.

  14. A decision-making framework for total ownership cost management of complex systems: A Delphi study

    NASA Astrophysics Data System (ADS)

    King, Russel J.

    This qualitative study, using a modified Delphi method, was conducted to develop a decision-making framework for the total ownership cost management of complex systems in the aerospace industry. The primary focus of total ownership cost is to look beyond the purchase price when evaluating complex system life cycle alternatives. A thorough literature review and the opinions of a group of qualified experts resulted in a compilation of total ownership cost best practices, cost drivers, key performance factors, applicable assessment methods, practitioner credentials and potential barriers to effective implementation. The expert panel provided responses to the study questions using a 5-point Likert-type scale. Data were analyzed and provided to the panel members for review and discussion with the intent to achieve group consensus. As a result of the study, the experts agreed that a total ownership cost analysis should (a) be as simple as possible using historical data; (b) establish cost targets, metrics, and penalties early in the program; (c) monitor the targets throughout the product lifecycle and revise them as applicable historical data becomes available; and (d) directly link total ownership cost elements with other success factors during program development. The resultant study framework provides the business leader with incentives and methods to develop and implement strategies for controlling and reducing total ownership cost over the entire product life cycle when balancing cost, schedule, and performance decisions.

  15. Methodological Framework for World Health Organization Estimates of the Global Burden of Foodborne Disease

    PubMed Central

    Devleesschauwer, Brecht; Haagsma, Juanita A.; Angulo, Frederick J.; Bellinger, David C.; Cole, Dana; Döpfer, Dörte; Fazil, Aamir; Fèvre, Eric M.; Gibb, Herman J.; Hald, Tine; Kirk, Martyn D.; Lake, Robin J.; Maertens de Noordhout, Charline; Mathers, Colin D.; McDonald, Scott A.; Pires, Sara M.; Speybroeck, Niko; Thomas, M. Kate; Torgerson, Paul R.; Wu, Felicia; Havelaar, Arie H.; Praet, Nicolas

    2015-01-01

    Background The Foodborne Disease Burden Epidemiology Reference Group (FERG) was established in 2007 by the World Health Organization to estimate the global burden of foodborne diseases (FBDs). This paper describes the methodological framework developed by FERG's Computational Task Force to transform epidemiological information into FBD burden estimates. Methods and Findings The global and regional burden of 31 FBDs was quantified, along with limited estimates for 5 other FBDs, using Disability-Adjusted Life Years in a hazard- and incidence-based approach. To accomplish this task, the following workflow was defined: outline of disease models and collection of epidemiological data; design and completion of a database template; development of an imputation model; identification of disability weights; probabilistic burden assessment; and estimating the proportion of the disease burden by each hazard that is attributable to exposure by food (i.e., source attribution). All computations were performed in R and the different functions were compiled in the R package 'FERG'. Traceability and transparency were ensured by sharing results and methods in an interactive way with all FERG members throughout the process. Conclusions We developed a comprehensive framework for estimating the global burden of FBDs, in which methodological simplicity and transparency were key elements. All the tools developed have been made available and can be translated into a user-friendly national toolkit for studying and monitoring food safety at the local level. PMID:26633883
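    The hazard- and incidence-based DALY computation outlined above can be sketched in a few lines. This is a minimal illustration only, not the actual R package 'FERG'; the function name and all input values are hypothetical, and the real framework additionally performs imputation and probabilistic (uncertainty) assessment:

    ```python
    def daly(incidence, duration_years, disability_weight,
             deaths, life_expectancy_remaining, food_attribution):
        """Point estimate of the foodborne DALY burden for one hazard.

        DALY = YLD + YLL, scaled by the fraction of cases attributable
        to food exposure (source attribution).
        """
        yld = incidence * duration_years * disability_weight   # years lived with disability
        yll = deaths * life_expectancy_remaining               # years of life lost
        return food_attribution * (yld + yll)

    # Illustrative (made-up) inputs for a single hazard:
    burden = daly(incidence=120_000, duration_years=0.02, disability_weight=0.15,
                  deaths=40, life_expectancy_remaining=35.0, food_attribution=0.6)
    ```

    In the full framework each of these inputs would itself be a distribution sampled in a Monte Carlo loop, yielding uncertainty intervals rather than a single number.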

  16. SCIRehab Project Series: The Psychology Taxonomy

    PubMed Central

    Wilson, Catherine; Huston, Toby; Koval, Jill; Gordon, Samuel A; Schwebel, Andrea; Gassaway, Julie

    2009-01-01

    Context: The integration of psychologists as members of the rehabilitation team has occurred in conjunction with the evolution and adoption of interdisciplinary teams as the standard of care in spinal cord injury (SCI) rehabilitation. Although the value of psychological services during rehabilitation is endorsed widely, specific interventions and their association with patient outcomes have not been examined adequately. Objective: To address this shortcoming, psychologists from 6 SCI centers collaborated to develop a psychology intervention taxonomy and documentation framework. Methods: Utilizing an interactive process, the lead psychologists from 6 centers compiled an inclusive list of patient characteristics assessed and interventions delivered in routine psychological practice at the participating rehabilitation facilities. These were systematically grouped, defined, and compared. Results: The resulting taxonomy became the basis of a documentation framework utilized by psychologists for the study. The psychology taxonomy includes 4 major clinical categories (assessment, psychotherapeutic interventions, psychoeducational interventions, and consultation) with 5 to 10 specific activities in each category. Conclusions: Examination of psychological interventions and their potential association with positive outcomes for persons who sustain SCI requires the development of a taxonomy. Results of these efforts illustrate similarities and differences in psychological practice among SCI centers and offer the opportunity to blend research and clinical practice in an innovative approach to evidence-based practice improvement. The established taxonomy provides a basic framework for future studies on the effect of psychological interventions. PMID:19810633

  17. A Comparison of Automatic Parallelization Tools/Compilers on the SGI Origin 2000 Using the NAS Benchmarks

    NASA Technical Reports Server (NTRS)

    Saini, Subhash; Frumkin, Michael; Hribar, Michelle; Jin, Hao-Qiang; Waheed, Abdul; Yan, Jerry

    1998-01-01

    Porting applications to new high performance parallel and distributed computing platforms is a challenging task. Since writing parallel code by hand is extremely time consuming and costly, porting would ideally be automated using parallelization tools and compilers. In this paper, we compare the performance of the hand-written NAS Parallel Benchmarks against three parallel versions generated with the help of tools and compilers: 1) CAPTools, an interactive computer-aided parallelization tool that generates message-passing code; 2) the Portland Group's HPF compiler; and 3) compiler directives with the native FORTRAN77 compiler on the SGI Origin2000.
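    The cost the paper measures is the manual effort of restructuring a serial loop for parallel execution. A rough stdlib sketch of that serial/hand-parallelized split (in Python rather than the paper's FORTRAN77; `kernel` is a hypothetical stand-in for a benchmark loop body):

    ```python
    from concurrent.futures import ThreadPoolExecutor

    def kernel(i):
        # Stand-in for one iteration of a compute-heavy benchmark loop.
        return i * i

    def run_serial(n):
        return sum(kernel(i) for i in range(n))

    def run_parallel(n, workers=4):
        # Hand-parallelized version: the programmer partitions the work and
        # manages the worker pool explicitly -- the labor-intensive step that
        # tools such as CAPTools or an HPF compiler aim to automate.
        with ThreadPoolExecutor(max_workers=workers) as pool:
            return sum(pool.map(kernel, range(n)))
    ```

    Directive-based approaches (option 3 in the paper) keep the serial loop and annotate it for the compiler, which is why they were attractive on shared-memory machines like the Origin2000.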

  18. Simulation in Training--The Current Imperative.

    DTIC Science & Technology

    1980-05-16

    Carlisle Barracks, PA 17013. AUTHOR: Richard P. Diehl, LTC, INF. TITLE: Simulation in Training--The Current Imperative. FORMAT: Individual Study Project. DATE: 16 May 1980. ...growth in components, spare parts, fuel and lubricants, as well as limited space in which to train, the Army must adopt a strategy of field training...

  19. Is complete seizure control imperative?

    PubMed

    Andermann, Frederick

    2002-01-01

    Is complete control imperative? The answer depends on whether complete control is indeed possible, on the possibility of achieving modifications of lifestyle, and on the type of epilepsy, with particular reference to the presence of progressive dysfunction. This may be seen in patients with temporal lobe or other forms of focal epilepsy, in the epileptic encephalopathies such as West and Lennox-Gastaut syndromes, and even in some patients with idiopathic generalized epilepsy. Progressive memory changes and global cognitive problems are examples. Progressive language deterioration, secondary epileptogenesis and phenomena analogous to kindling are also important issues. How long treatment should be continued depends on many factors, not least the preference of the patient and of the family. Weighing the benefits of complete control against the side effects and risks of medication or surgery is crucial. There are obvious benefits to complete control; it is imperative if these benefits are greater than the cost.

  20. 36 CFR 902.57 - Investigatory files compiled for law enforcement purposes.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 36 Parks, Forests, and Public Property 3 2011-07-01 2011-07-01 false Investigatory files compiled... Records § 902.57 Investigatory files compiled for law enforcement purposes. (a) Files compiled by the...) Constitute an unwarranted invasion of personal privacy; (4) Disclose the identity of a confidential source...
