Pas, Elise T; Bradshaw, Catherine P
2012-10-01
Although there is an established literature supporting the efficacy of a variety of prevention programs, there has been less empirical work on the translation of such research to everyday practice or when scaled up state-wide. There is a considerable need for more research on factors that enhance implementation of programs and optimize outcomes, particularly in school settings. The current paper examines how the implementation fidelity of an increasingly popular and widely disseminated prevention model called School-wide Positive Behavioral Interventions and Supports (SW-PBIS) relates to student outcomes within the context of a state-wide scale-up effort. Data come from a scale-up effort of SW-PBIS in Maryland; the sample included 421 elementary and middle schools trained in SW-PBIS. SW-PBIS fidelity, as measured by one of three fidelity measures, was found to be associated with higher math achievement, higher reading achievement, and lower truancy. School contextual factors were related to implementation levels and outcomes. Implications for scale-up efforts of behavioral and mental health interventions and measurement considerations are discussed.
ERIC Educational Resources Information Center
Jowers, Keri L.; Bradshaw, Catherine P.; Gately, Sherry
2007-01-01
Public schools are under increased pressure to implement evidence-based substance abuse prevention programs. A number of model programs have been identified, but little research has examined the effectiveness of these programs when "brought to scale" or implemented district-wide. The current paper summarizes the application of the Adelman and…
Horner, Robert H; Sugai, George
2015-05-01
School-wide Positive Behavioral Interventions and Supports (PBIS) is an example of applied behavior analysis implemented at a scale of social importance. In this paper, PBIS is defined and the contributions of behavior analysis in shaping both the content and implementation of PBIS are reviewed. Specific lessons learned from implementation of PBIS over the past 20 years are summarized.
Examining the Association Between Implementation and Outcomes
Pas, Elise T.; Bradshaw, Catherine P.
2012-01-01
Although there is an established literature supporting the efficacy of a variety of prevention programs, there has been less empirical work on the translation of such research to everyday practice or when scaled up state-wide. There is a considerable need for more research on factors that enhance implementation of programs and optimize outcomes, particularly in school settings. The current paper examines how the implementation fidelity of an increasingly popular and widely disseminated prevention model called School-wide Positive Behavioral Interventions and Supports (SW-PBIS) relates to student outcomes within the context of a state-wide scale-up effort. Data come from a scale-up effort of SW-PBIS in Maryland; the sample included 421 elementary and middle schools trained in SW-PBIS. SW-PBIS fidelity, as measured by one of three fidelity measures, was found to be associated with higher math achievement, higher reading achievement, and lower truancy. School contextual factors were related to implementation levels and outcomes. Implications for scale-up efforts of behavioral and mental health interventions and measurement considerations are discussed. PMID:22836758
Large-Scale Implementation of Check-In, Check-Out: A Descriptive Study
ERIC Educational Resources Information Center
Hawken, Leanne S.; Bundock, Kaitlin; Barrett, Courtenay A.; Eber, Lucille; Breen, Kimberli; Phillips, Danielle
2015-01-01
Check-In, Check-Out (CICO) is one of the most widely implemented Tier 2 behavior interventions in a school-wide system of Positive Behavior Interventions and Supports (PBIS). Much literature has documented implementation of CICO across individual schools or districts. The Illinois PBIS Network, currently known as the Midwest PBIS Network, has…
Aarons, Gregory A; Fettes, Danielle L; Hurlburt, Michael S; Palinkas, Lawrence A; Gunderson, Lara; Willging, Cathleen E; Chaffin, Mark J
2014-01-01
Implementation and scale-up of evidence-based practices (EBPs) is often portrayed as involving multiple stakeholders collaborating harmoniously in the service of a shared vision. In practice, however, collaboration is a more complex process that may involve shared and competing interests and agendas, and negotiation. The present study examined the scale-up of an EBP across an entire service system using the Interagency Collaborative Team approach. Participants were key stakeholders in a large-scale county-wide implementation of an EBP to reduce child neglect, SafeCare. Semistructured interviews and/or focus groups were conducted with 54 individuals representing diverse constituents in the service system, followed by an iterative approach to coding and analysis of transcripts. The study was conceptualized using the Exploration, Preparation, Implementation, and Sustainment framework. Although community stakeholders eventually coalesced around implementation of SafeCare, several challenges affected the implementation process. These challenges included differing organizational cultures, strategies, and approaches to collaboration; competing priorities across levels of leadership; power struggles; and role ambiguity. Each of the factors identified influenced how stakeholders approached the EBP implementation process. System-wide scale-up of EBPs involves multiple stakeholders operating in a nexus of differing agendas, priorities, leadership styles, and negotiation strategies. The term collaboration may oversimplify the multifaceted nature of the scale-up process. Implementation efforts should openly acknowledge and consider this nexus when individual stakeholders and organizations enter into EBP implementation through collaborative processes.
Aarons, Gregory A.; Fettes, Danielle; Hurlburt, Michael; Palinkas, Lawrence; Gunderson, Lara; Willging, Cathleen; Chaffin, Mark
2014-01-01
Objective Implementation and scale-up of evidence-based practices (EBPs) is often portrayed as involving multiple stakeholders collaborating harmoniously in the service of a shared vision. In practice, however, collaboration is a more complex process that may involve shared and competing interests and agendas, and negotiation. The present study examined the scale-up of an EBP across an entire service system using the Interagency Collaborative Team (ICT) approach. Methods Participants were key stakeholders in a large-scale county-wide implementation of an EBP to reduce child neglect, SafeCare®. Semi-structured interviews and/or focus groups were conducted with 54 individuals representing diverse constituents in the service system, followed by an iterative approach to coding and analysis of transcripts. The study was conceptualized using the Exploration, Preparation, Implementation, and Sustainment (EPIS) framework. Results Although community stakeholders eventually coalesced around implementation of SafeCare, several challenges affected the implementation process. These challenges included differing organizational cultures, strategies, and approaches to collaboration; competing priorities across levels of leadership; power struggles; and role ambiguity. Each of the factors identified influenced how stakeholders approached the EBP implementation process. Conclusions System-wide scale-up of EBPs involves multiple stakeholders operating in a nexus of differing agendas, priorities, leadership styles, and negotiation strategies. The term collaboration may oversimplify the multifaceted nature of the scale-up process. Implementation efforts should openly acknowledge and consider this nexus when individual stakeholders and organizations enter into EBP implementation through collaborative processes. PMID:24611580
ERIC Educational Resources Information Center
Horner, Robert H.; Kincaid, Donald; Sugai, George; Lewis, Timothy; Eber, Lucille; Barrett, Susan; Dickey, Celeste Rossetto; Richter, Mary; Sullivan, Erin; Boezio, Cyndi; Algozzine, Bob; Reynolds, Heather; Johnson, Nanci
2014-01-01
Scaling of evidence-based practices in education has received extensive discussion but little empirical evaluation. We present here a descriptive summary of the experience from seven states with a history of implementing and scaling School-Wide Positive Behavioral Interventions and Supports (SWPBIS) over the past decade. Each state has been…
Klingner, Jill; Moscovice, Ira; Casey, Michelle; McEllistrem Evenson, Alex
2015-01-01
Previously published findings based on field tests indicated that emergency department patient transfer communication measures are feasible and worthwhile to implement in rural hospitals. This study aims to expand those findings by focusing on the wide-scale implementation of these measures in the 79 Critical Access Hospitals (CAHs) in Minnesota from 2011 to 2013. Information was obtained from interviews with key informants involved in implementing the emergency department patient transfer communication measures in Minnesota as part of required statewide quality reporting. The first set of interviews targeted state-level organizations regarding their experiences working with providers. A second set of interviews targeted quality and administrative staff from CAHs regarding their experiences implementing measures. Implementing the measures in Minnesota CAHs proved to be successful in a number of respects, but informants also faced new challenges. Our recommendations, addressed to those seeking to successfully implement these measures in other states, take these challenges into account. Field-testing new quality measure implementations with volunteers may not be indicative of a full-scale implementation that requires facilities to participate. The implementation team's composition, communication efforts, prior relationships with facilities and providers, and experience with data collection and abstraction tools are critical factors in successfully implementing required reporting of quality measures on a wide scale. © 2014 National Rural Health Association.
Sybil: efficient constraint-based modelling in R.
Gelius-Dietrich, Gabriel; Desouki, Abdelmoneim Amer; Fritzemeier, Claus Jonathan; Lercher, Martin J
2013-11-13
Constraint-based analyses of metabolic networks are widely used to simulate the properties of genome-scale metabolic networks. Publicly available implementations tend to be slow, impeding large-scale analyses such as the genome-wide computation of pairwise gene knock-outs, or the automated search for model improvements. Furthermore, available implementations cannot easily be extended or adapted by users. Here, we present sybil, an open source software library for constraint-based analyses in R; R is a free, platform-independent environment for statistical computing and graphics that is widely used in bioinformatics. Among other functions, sybil currently provides efficient methods for flux-balance analysis (FBA), MOMA, and ROOM that are about ten times faster than previous implementations when calculating the effect of whole-genome single gene deletions in silico on a complete E. coli metabolic model. Due to the object-oriented architecture of sybil, users can easily build analysis pipelines in R or even implement their own constraint-based algorithms. Based on its highly efficient communication with different mathematical optimisation programs, sybil facilitates the exploration of high-dimensional optimisation problems on small time scales. Sybil and all its dependencies are open source. Sybil and its documentation are available for download from the comprehensive R archive network (CRAN).
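At its core, the FBA that sybil accelerates is a linear program: maximize an objective flux c·v subject to steady-state mass balance Sv = 0 and per-reaction flux bounds. As an illustration of that computation only (this is not sybil's R API; the toy network, bounds, and reaction names are invented), a minimal Python sketch using scipy.optimize.linprog:

```python
# Minimal flux-balance analysis (FBA) sketch: the linear program at the core
# of tools like sybil. This is NOT sybil's R API; the toy network, bounds,
# and reaction names are invented for illustration.
import numpy as np
from scipy.optimize import linprog

# Toy network with one metabolite A and three reactions:
#   R1: -> A (uptake),  R2: A -> biomass (objective),  R3: A -> (leak)
S = np.array([[1.0, -1.0, -1.0]])        # stoichiometric matrix (rows = metabolites)
c = np.array([0.0, 1.0, 0.0])            # objective: maximize flux through R2
bounds = [(0, 10), (0, None), (0, 1)]    # per-reaction flux bounds

# linprog minimizes, so negate c; S v = 0 enforces the steady-state constraint.
res = linprog(-c, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=bounds, method="highs")
print("optimal biomass flux:", res.x[1])  # 10.0: limited by the uptake bound
```

In this framing, an in silico single-gene deletion simply clamps the bounds of the affected reaction(s) to zero and re-solves the LP, which is why solver speed dominates the genome-wide knockout scans the abstract describes.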
NASA Astrophysics Data System (ADS)
Andini, S.; Fitriana, L.; Budiyono
2018-04-01
This research aims to describe the development process and the resulting product: flipbook-based learning material for geometry, specifically quadrilaterals. The study is Research and Development (R&D) following the Budiyono model: conducting preliminary research, planning and developing a theoretical and prototype product, and determining product quality (validity, practicality, and effectiveness). Expert assessment of the theoretical product averaged 4.54, and expert validation of the prototype product averaged 4.62. Practicality was established by implementing the flipbook prototype in each meeting of the try-outs, as rated through learning observation: the average score was 4.10 in the limited-scale try-out and rose to 4.50 in the wide-scale try-out. Effectiveness was established from pretest and posttest results in the limited-scale and wide-scale try-outs: the average pretest score in the wide-scale try-out showed a significant increase of 25.2, and the average posttest score increased by 8.16 from the limited-scale to the wide-scale try-out. The product quality results indicate that flipbook media can be used for geometry learning in elementary schools implementing Curriculum 2013.
ERIC Educational Resources Information Center
Cavanaugh, Brian; Swan, Meaghan
2015-01-01
School-wide Positive Behavioral Interventions and Supports (SWPBIS) is a widely used framework for supporting student social and academic behavior. Implementation science indicates that one effective way to implement and scale-up practices, such as SWPBIS, is through coaching; thus, there is a need for efficient, cost-effective methods to develop…
Examining the Implementation of Technology-Based Blended Algebra I Curriculum at Scale
ERIC Educational Resources Information Center
Karam, Rita; Pane, John F.; Griffin, Beth Ann; Robyn, Abby; Phillips, Andrea; Daugherty, Lindsay
2017-01-01
Studies on blended education pay little attention to implementation, thus limiting the understanding of how such programs contribute to student math learning. This article examines the implementation of a widely used blended algebra curriculum and the relationship between implementation and student outcomes. The study was conducted in 74 middle…
The School Implementation Scale: Measuring Implementation in Response to Intervention Models
ERIC Educational Resources Information Center
Erickson, Amy Gaumer; Noonan, Pattie M.; Jenson, Ronda
2012-01-01
Models of response to intervention (RTI) have been widely developed and implemented and have expanded to include integrated academic/behavior RTI models. Until recently, evaluation of model effectiveness has focused primarily on student-level data, but additional measures of treatment integrity within these multi-tiered models are emerging to…
ERIC Educational Resources Information Center
Greaney, Mary; Hardwick, Cary K.; Mezgebu, Solomon; Lindsay, Ana C.; Roover, Michelle L.; Peterson, Karen E.
2007-01-01
Background: University-community partnerships can support schools in implementing evidence-based responses to youth obesity trends. An inter-organizational partnership was established to implement and evaluate the Healthy Choices Collaborative Intervention (HCCI). HCCI combines an interdisciplinary curriculum, before/after school activities, and…
NASA Astrophysics Data System (ADS)
Langston, Abigail L.; Tucker, Gregory E.
2018-01-01
Understanding how a bedrock river erodes its banks laterally is a frontier in geomorphology. Theories for the vertical incision of bedrock channels are widely implemented in the current generation of landscape evolution models. However, in general existing models do not seek to implement the lateral migration of bedrock channel walls. This is problematic, as modeling geomorphic processes such as terrace formation and hillslope-channel coupling depends on the accurate simulation of valley widening. We have developed and implemented a theory for the lateral migration of bedrock channel walls in a catchment-scale landscape evolution model. Two model formulations are presented, one representing the slow process of widening a bedrock canyon and the other representing undercutting, slumping, and rapid downstream sediment transport that occurs in softer bedrock. Model experiments were run with a range of values for bedrock erodibility and tendency towards transport- or detachment-limited behavior and varying magnitudes of sediment flux and water discharge in order to determine the role that each plays in the development of wide bedrock valleys. The results show that this simple, physics-based theory for the lateral erosion of bedrock channels produces bedrock valleys that are many times wider than the grid discretization scale. This theory for the lateral erosion of bedrock channel walls and the numerical implementation of the theory in a catchment-scale landscape evolution model is a significant first step towards understanding the factors that control the rates and spatial extent of wide bedrock valleys.
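The abstract gives no equations, but the detachment-limited core that such landscape evolution models extend is the stream-power law E = K A^m S^n, with lateral bank erosion added as a separate term. The sketch below is a schematic of the two coupled processes only, not the Langston & Tucker formulation; the constants and the bank-lowering proxy for lateral retreat are invented:

```python
# Schematic sketch of coupled vertical incision and lateral bank erosion in a
# valley cross-section. This is NOT the Langston & Tucker formulation -- only
# an illustration of the two processes it couples. K_v, K_l, and the
# bank-lowering proxy for lateral retreat are invented for the example.
import numpy as np

z = np.array([10.0, 10.0, 5.0, 10.0, 10.0])   # elevations across the valley (m)
channel = 2                                    # index of the channel cell
K_v, K_l = 1e-5, 2e-7                          # vertical / lateral erodibilities
A, S, dt = 1e6, 0.02, 100.0                    # drainage area (m^2), slope, step (yr)

for _ in range(50):
    z[channel] -= K_v * np.sqrt(A) * S * dt    # stream power with m = 0.5, n = 1
    for bank in (channel - 1, channel + 1):    # banks worn back toward channel level
        relief = z[bank] - z[channel]
        if relief > 0:
            z[bank] -= K_l * np.sqrt(A) * relief * dt

print(np.round(z, 2))   # the channel deepens while its banks are lowered
```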
Implementing Effective Educational Practices at Scales of Social Importance.
Horner, Robert H; Sugai, George; Fixsen, Dean L
2017-03-01
Implementing evidence-based practices is becoming both a goal and standard across medicine, psychology, and education. Initial successes, however, are now leading to questions about how successful demonstrations may be expanded to scales of social importance. In this paper, we review lessons learned about scaling up evidence-based practices gleaned from our experience implementing school-wide positive behavioral interventions and supports (PBIS) across more than 23,000 schools in the USA. We draw heavily from the work of Flay et al. (Prev Sci 6:151-175, 2005. doi: 10.1007/s11121-005-5553-y ) related to defining evidence-based practices, the significant contributions from the emerging "implementation science" movement (Fixsen et al. in Implementation research: a synthesis of the literature, University of South Florida, Louis de la Parte Florida Mental Health Institute, The National Implementation Research Network (FMHI Publication #231), Tampa 2005), and guidance we have received from teachers, family members, students, and administrators who have adopted PBIS.
The Pennsylvania Positive Behavior Support Network: Describing Our Scale-Up
ERIC Educational Resources Information Center
Runge, Timothy J.; Longwill, Douglas A.; Staszkiewicz, Mark J.; Palmiero, James; Lawson, Tina M.
2016-01-01
Pennsylvania began scaling up high-fidelity implementation of School-Wide Positive Behavioral Interventions and Supports (SWPBIS) in 2006-2007 due to converging regulatory, legal, ethical, and practical influences. The Pennsylvania Community of Practice on School-Based Behavioral Health adopted Algozzine et al.'s (2010) blueprint to describe and…
ERIC Educational Resources Information Center
Soneral, Paula A. G.; Wyse, Sara A.
2017-01-01
Student-centered learning environments with upside-down pedagogies (SCALE-UP) are widely implemented at institutions across the country, and learning gains from these classrooms have been well documented. This study investigates the specific design feature(s) of the SCALE-UP classroom most conducive to teaching and learning. Using pilot survey…
NASA Astrophysics Data System (ADS)
Li, Shuangcai; Duffy, Christopher J.
2011-03-01
Our ability to predict complex environmental fluid flow and transport hinges on accurate and efficient simulations of multiple physical phenomena operating simultaneously over a wide range of spatial and temporal scales, including overbank floods, coastal storm surge events, drying and wetting bed conditions, and simultaneous bed form evolution. This research implements a fully coupled strategy for solving shallow water hydrodynamics, sediment transport, and morphological bed evolution in rivers and floodplains (PIHM_Hydro) and applies the model to field and laboratory experiments that cover a wide range of spatial and temporal scales. The model uses a standard upwind finite volume method and Roe's approximate Riemann solver for unstructured grids. A multidimensional linear reconstruction and slope limiter are implemented, achieving second-order spatial accuracy. Model efficiency and stability are treated using an explicit-implicit method for temporal discretization with operator splitting. Laboratory- and field-scale experiments were compiled where coupled processes across a range of scales were observed and where higher-order spatial and temporal accuracy might be needed for accurate and efficient solutions. These experiments demonstrate the ability of the fully coupled strategy in capturing dynamics of field-scale flood waves and small-scale drying-wetting processes.
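A full Roe solver on unstructured grids is too long to reproduce here, so the sketch below shows the same first-order finite-volume structure on the 1D shallow-water equations, with the simpler Rusanov (local Lax-Friedrichs) flux standing in for Roe's solver; the dam-break setup, grid, and CFL number are invented:

```python
# First-order finite-volume solver for the 1D shallow-water equations, using
# the Rusanov flux in place of the Roe solver described in the paper.
# A minimal sketch; the dam-break setup and all parameters are invented.
import numpy as np

g = 9.81
N, L, cfl = 200, 10.0, 0.45
dx = L / N
h = np.where(np.linspace(0, L, N) < L / 2, 2.0, 1.0)  # dam-break initial depths
hu = np.zeros(N)                                       # momentum h*u

def flux(h, hu):
    u = hu / h
    return np.array([hu, hu * u + 0.5 * g * h * h])

t, t_end = 0.0, 0.5
while t < t_end:
    u = hu / h
    smax = np.max(np.abs(u) + np.sqrt(g * h))          # fastest wave speed
    dt = min(cfl * dx / smax, t_end - t)
    FL, FR = flux(h[:-1], hu[:-1]), flux(h[1:], hu[1:])
    a = np.maximum(np.abs(u[:-1]) + np.sqrt(g * h[:-1]),
                   np.abs(u[1:]) + np.sqrt(g * h[1:]))
    # Rusanov numerical flux at each interior cell interface
    F = 0.5 * (FL + FR) - 0.5 * a * np.array([h[1:] - h[:-1], hu[1:] - hu[:-1]])
    h[1:-1]  -= dt / dx * (F[0, 1:] - F[0, :-1])       # conservative update
    hu[1:-1] -= dt / dx * (F[1, 1:] - F[1, :-1])
    t += dt

print("depth range after dam break:", h.min(), h.max())
```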
E-Learning in a Large Organization: A Study of the Critical Role of Information Sharing
ERIC Educational Resources Information Center
Netteland, Grete; Wasson, Barbara; Morch, Anders I
2007-01-01
Purpose: The purpose of this paper is to provide new insights into the implementation of large-scale learning projects; thereby better understanding the difficulties, frustrations, and obstacles encountered when implementing enterprise-wide e-learning as a tool for training and organization transformation in a complex organization.…
Handling PBIS with Care: Scaling up to School-Wide Implementation
ERIC Educational Resources Information Center
Cressey, James M.; Whitcomb, Sara A.; McGilvray-Rivet, Susan J.; Morrison, Rebecca J.; Shander-Reynolds, Katherine J.
2015-01-01
This case study describes the leadership of a school counselor in implementing positive behavioral interventions and supports (PBIS) in a low-income, diverse elementary school with a modest level of external supports. After initiating a grade-level pilot program, the school counselor partnered with university-based consultants to expand the PBIS…
Implementation Blueprint and Self-Assessment: Positive Behavioral Interventions and Supports
ERIC Educational Resources Information Center
Technical Assistance Center on Positive Behavioral Interventions and Supports, 2010
2010-01-01
A "blueprint" is a guide designed to improve large-scale implementations of a specific systems or organizational approach, like School-Wide Positive Behavior Support (SWPBS). This blueprint is intended to make the conceptual theory, organizational models, and practices of SWPBS more accessible for those involved in enhancing how schools,…
"Scaling Up" Educational Change: Some Musings on Misrecognition and Doxic Challenges
ERIC Educational Resources Information Center
Thomson, Pat
2014-01-01
Educational policy-makers around the world are strongly committed to the notion of "scaling up". This can mean anything from encouraging more teachers to take up a pedagogical innovation, all the way through to system-wide efforts to implement "what works" across all schools. In this paper, I use Bourdieu's notions of…
ERIC Educational Resources Information Center
Smolkowski, Keith; Strycker, Lisa; Ward, Bryce
2016-01-01
This study evaluated the scale-up of a Safe & Civil Schools "Foundations: Establishing Positive Discipline Policies" positive behavioral interventions and supports initiative through 4 years of "real-world" implementation in a large urban school district. The study extends results from a previous randomized controlled trial…
Improving Real World Performance of Vision Aided Navigation in a Flight Environment
2016-09-15
Excerpt from the table of contents: Introduction; 4.2 Wide Area Search Extent; 4.3 Large-Scale Image Navigation Histogram Filter (4.3.1 Location Model; 4.3.2 Measurement Model; 4.3.3 Histogram Filter; Iteration of Histogram Filter); 4.4 Implementation and Flight Test Campaign (4.4.1 Software Implementation).
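The contents list a histogram filter for large-scale image-aided navigation. The report's own location and measurement models are not reproduced here, but the generic discrete Bayes (histogram) filter cycle it refers to looks like this sketch, in which the grid, motion kernel, and likelihoods are all invented:

```python
# Generic histogram (discrete Bayes) filter over a 1D grid of candidate
# positions: predict with a motion model, update with a measurement
# likelihood. Illustrative only -- the grid, motion kernel, and image-match
# likelihood below are invented, not the report's models.
import numpy as np

belief = np.full(100, 1 / 100)                 # uniform prior over 100 cells

def predict(belief, kernel=(0.1, 0.8, 0.1)):
    # Location model: blur the belief with a motion-uncertainty kernel.
    return np.convolve(belief, kernel, mode="same")

def update(belief, likelihood):
    # Measurement model: reweight by p(z | cell), then renormalize.
    posterior = belief * likelihood
    return posterior / posterior.sum()

# One filter iteration with a made-up likelihood peaked at cell 42
likelihood = np.exp(-0.5 * ((np.arange(100) - 42) / 3.0) ** 2)
belief = update(predict(belief), likelihood)
print("MAP position estimate:", int(belief.argmax()))
```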
ERIC Educational Resources Information Center
Rollins, Howard; And Others
The results of a 3-year project that developed a practical program for the wide-scale implementation of behavior modification in urban schools are presented in this paper. The major outcomes of the project were (a) a practical, cost-effective behavior modification program that reduces discipline problems, increases student motivation, and…
ERIC Educational Resources Information Center
Bryson, Susan E.; Koegel, Lynn K.; Koegel, Robert L.; Openden, Daniel; Smith, Isabel M.; Nefdt, Nicolette
2007-01-01
This paper describes a collaborative effort aimed at province-wide dissemination and implementation of pivotal response treatment (PRT) for young children with autism spectrum disorder (ASD) in Nova Scotia, Canada. Three critical components of the associated training model are described: (1) direct training of treatment teams (parents, one-to-one…
ERIC Educational Resources Information Center
Knowles, Valerie; Kaljee, Linda; Deveaux, Lynette; Lunn, Sonja; Rolle, Glenda; Stanton, Bonita
2012-01-01
A wide range of behavioral prevention interventions have been demonstrated through longitudinal, randomized controlled trials to reduce sexual risk behaviors. Many of these interventions have been made available at little cost for implementation on a public health scale. However, efforts to utilize such programs typically have been met with a…
Neuromorphic Silicon Neuron Circuits
Indiveri, Giacomo; Linares-Barranco, Bernabé; Hamilton, Tara Julia; van Schaik, André; Etienne-Cummings, Ralph; Delbruck, Tobi; Liu, Shih-Chii; Dudek, Piotr; Häfliger, Philipp; Renaud, Sylvie; Schemmel, Johannes; Cauwenberghs, Gert; Arthur, John; Hynna, Kai; Folowosele, Fopefolu; Saighi, Sylvain; Serrano-Gotarredona, Teresa; Wijekoon, Jayawan; Wang, Yingxue; Boahen, Kwabena
2011-01-01
Hardware implementations of spiking neurons can be extremely useful for a large variety of applications, ranging from high-speed modeling of large-scale neural systems to real-time behaving systems, to bidirectional brain–machine interfaces. The specific circuit solutions used to implement silicon neurons depend on the application requirements. In this paper we describe the most common building blocks and techniques used to implement these circuits, and present an overview of a wide range of neuromorphic silicon neurons, which implement different computational models, ranging from biophysically realistic and conductance-based Hodgkin–Huxley models to bi-dimensional generalized adaptive integrate and fire models. We compare the different design methodologies used for each silicon neuron design described, and demonstrate their features with experimental results, measured from a wide range of fabricated VLSI chips. PMID:21747754
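The simplest model in the family the paper surveys is the leaky integrate-and-fire neuron that many of these circuits emulate in analog VLSI. A software reference sketch in Python, with all parameters invented for illustration:

```python
# Leaky integrate-and-fire neuron: the simplest of the computational models
# the surveyed silicon neurons implement. Software reference sketch only;
# all parameters are illustrative.
import numpy as np

dt, tau, v_rest, v_thresh, v_reset, R = 1e-4, 20e-3, -70e-3, -50e-3, -65e-3, 1e8
v, spikes = v_rest, []
I = 0.25e-9 * np.ones(int(0.5 / dt))          # 0.25 nA step current for 500 ms

for step, i_in in enumerate(I):
    v += dt / tau * (v_rest - v + R * i_in)   # membrane leak plus input drive
    if v >= v_thresh:                          # threshold crossing -> spike
        spikes.append(step * dt)
        v = v_reset                            # reset, as in the silicon circuits

print(f"{len(spikes)} spikes, first at {spikes[0]*1000:.1f} ms" if spikes else "no spikes")
```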
Evaluation of Hydrogel Technologies for the Decontamination ...
This current research effort was developed to evaluate intermediate-level (between bench-scale and large-scale or wide-area implementation) decontamination procedures, materials, technologies, and techniques used to remove radioactive material from different surfaces. In the event of a radiological incident, application of this technology would primarily be intended for decontamination of high-value buildings, important infrastructure, and landmarks.
An introduction to web scale discovery systems.
Hoy, Matthew B
2012-01-01
This article explores the basic principles of web-scale discovery systems and how they are being implemented in libraries. "Web scale discovery" refers to a class of products that index a vast number of resources in a wide variety of formats and allow users to search for content in the physical collection, print and electronic journal collections, and other resources from a single search box. Search results are displayed in a manner similar to Internet searches, in a relevance ranked list with links to online content. The advantages and disadvantages of these systems are discussed, and a list of popular discovery products is provided. A list of library websites with discovery systems currently implemented is also provided.
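Conceptually, a web-scale discovery layer keeps one central index over heterogeneous collections and returns a single relevance-ranked list. A toy Python sketch of that architecture (records and the scoring rule are invented):

```python
# Toy sketch of the web-scale discovery idea: one index over records from
# many collections, one search box, one relevance-ranked result list.
# The records and scoring rule are invented for illustration.
records = [
    {"title": "Print atlas of anatomy", "source": "physical collection"},
    {"title": "Anatomy of discovery systems", "source": "e-journal"},
    {"title": "Cataloging basics", "source": "print journal"},
]

def search(query):
    terms = query.lower().split()
    scored = []
    for rec in records:
        score = sum(rec["title"].lower().count(t) for t in terms)
        if score:
            scored.append((score, rec))
    # One relevance-ranked list across all source collections
    return [rec for score, rec in sorted(scored, key=lambda s: -s[0])]

for rec in search("anatomy"):
    print(rec["source"], "->", rec["title"])
```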
Three Collaborative Models for Scaling Up Evidence-Based Practices
Roberts, Rosemarie; Jones, Helen; Marsenich, Lynne; Sosna, Todd; Price, Joseph M.
2015-01-01
The current paper describes three models of research-practice collaboration to scale-up evidence-based practices (EBP): (1) the Rolling Cohort model in England, (2) the Cascading Dissemination model in San Diego County, and (3) the Community Development Team model in 53 California and Ohio counties. Multidimensional Treatment Foster Care (MTFC) and KEEP are the focal evidence-based practices that are designed to improve outcomes for children and families in the child welfare, juvenile justice, and mental health systems. The three scale-up models each originated from collaboration between community partners and researchers with the shared goal of wide-spread implementation and sustainability of MTFC/KEEP. The three models were implemented in a variety of contexts; Rolling Cohort was implemented nationally, Cascading Dissemination was implemented within one county, and Community Development Team was targeted at the state level. The current paper presents an overview of the development of each model, the policy frameworks in which they are embedded, system challenges encountered during scale-up, and lessons learned. Common elements of successful scale-up efforts, barriers to success, factors relating to enduring practice relationships, and future research directions are discussed. PMID:21484449
Boland, Melinde R S; Kruis, Annemarije L; Huygens, Simone A; Tsiachristas, Apostolos; Assendelft, Willem J J; Gussekloo, Jacobijn; Blom, Coert M G; Chavannes, Niels H; Rutten-van Mölken, Maureen P M H
2015-12-17
This study aims to (1) examine the variation in implementation of a 2-year chronic obstructive pulmonary disease (COPD) management programme called RECODE, (2) analyse the facilitators and barriers to implementation and (3) investigate the influence of this variation on health outcomes. Implementation variation among the 20 primary-care teams was measured directly using a self-developed scale and indirectly through the level of care integration as measured with the Patient Assessment of Chronic Illness Care (PACIC) and the Assessment of Chronic Illness Care (ACIC). Interviews were held to obtain detailed information regarding the facilitators and barriers to implementation. Multilevel models were used to investigate the association between variation in implementation and change in outcomes. The teams implemented, on average, eight of the 19 interventions, and the specific package of interventions varied widely. Important barriers and facilitators of implementation were (in)sufficient motivation of healthcare provider and patient, the high starting level of COPD care, the small size of the COPD population per team, the mild COPD population, practicalities of the information and communication technology (ICT) system, and hurdles in reimbursement. Level of implementation as measured with our own scale and the ACIC was not associated with health outcomes. A higher level of implementation measured with the PACIC was positively associated with improved self-management capabilities, but this association was not found for other outcomes. There was a wide variety in the implementation of RECODE, associated with barriers at individual, social, organisational and societal level. There was little association between extent of implementation and health outcomes.
Boland, Melinde R S; Kruis, Annemarije L; Huygens, Simone A; Tsiachristas, Apostolos; Assendelft, Willem J J; Gussekloo, Jacobijn; Blom, Coert M G; Chavannes, Niels H; Rutten-van Mölken, Maureen P M H
2015-01-01
This study aims to (1) examine the variation in implementation of a 2-year chronic obstructive pulmonary disease (COPD) management programme called RECODE, (2) analyse the facilitators and barriers to implementation and (3) investigate the influence of this variation on health outcomes. Implementation variation among the 20 primary-care teams was measured directly using a self-developed scale and indirectly through the level of care integration as measured with the Patient Assessment of Chronic Illness Care (PACIC) and the Assessment of Chronic Illness Care (ACIC). Interviews were held to obtain detailed information regarding the facilitators and barriers to implementation. Multilevel models were used to investigate the association between variation in implementation and change in outcomes. The teams implemented, on average, eight of the 19 interventions, and the specific package of interventions varied widely. Important barriers and facilitators of implementation were (in)sufficient motivation of healthcare provider and patient, the high starting level of COPD care, the small size of the COPD population per team, the mild COPD population, practicalities of the information and communication technology (ICT) system, and hurdles in reimbursement. Level of implementation as measured with our own scale and the ACIC was not associated with health outcomes. A higher level of implementation measured with the PACIC was positively associated with improved self-management capabilities, but this association was not found for other outcomes. There was a wide variety in the implementation of RECODE, associated with barriers at individual, social, organisational and societal level. There was little association between extent of implementation and health outcomes. PMID:26677770
Evaluating large-scale health programmes at a district level in resource-limited countries.
Svoronos, Theodore; Mate, Kedar S
2011-11-01
Recent experience in evaluating large-scale global health programmes has highlighted the need to consider contextual differences between sites implementing the same intervention. Traditional randomized controlled trials are ill-suited for this purpose, as they are designed to identify whether an intervention works, not how, when and why it works. In this paper we review several evaluation designs that attempt to account for contextual factors that contribute to intervention effectiveness. Using these designs as a base, we propose a set of principles that may help to capture information on context. Finally, we propose a tool called a driver diagram, traditionally used in implementation work, that would allow evaluators to systematically monitor changing dynamics in project implementation and identify contextual variation across sites. We describe an implementation-related example from South Africa to underline the strengths of the tool. If used across multiple sites and multiple projects, the resulting driver diagrams could be pooled together to form a generalized theory for how, when and why a widely used intervention works. Mechanisms similar to the driver diagram are urgently needed to complement existing evaluations of large-scale implementation efforts.
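A driver diagram links an aim to primary drivers, secondary drivers, and change ideas, and tracking which drivers are active at each site is what exposes contextual variation. A hypothetical sketch of that structure as data (the aim, drivers, and site annotations are invented):

```python
# Hypothetical sketch of a driver diagram as a data structure, so the same
# aim/driver tree can be compared across implementation sites. The aim,
# drivers, and site annotations are invented examples.
driver_diagram = {
    "aim": "Reduce facility-level mortality",
    "primary_drivers": {
        "Reliable triage": {
            "secondary_drivers": ["Staff trained", "Triage forms stocked"],
            "active_at_sites": {"Site A", "Site B"},
        },
        "Timely referral": {
            "secondary_drivers": ["Transport available", "Referral protocol"],
            "active_at_sites": {"Site A"},
        },
    },
}

# Contextual variation: which drivers are active at one site but not another?
a_only = {d for d, v in driver_diagram["primary_drivers"].items()
          if "Site A" in v["active_at_sites"] and "Site B" not in v["active_at_sites"]}
print("drivers active at Site A only:", a_only)
```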
Pathways for scaling up public health interventions.
Indig, Devon; Lee, Karen; Grunseit, Anne; Milat, Andrew; Bauman, Adrian
2017-08-01
To achieve population-wide health improvement, public health interventions found effective in selected samples need to be 'scaled up' and implemented more widely. The pathways through which interventions are scaled up are not well characterised. The aim of this paper is to identify examples of public health interventions which have been scaled up and to develop a conceptual framework which quantifies and describes this process. A multi-stage international literature search was undertaken to identify examples of public health interventions in high income countries that have been scaled up or implemented at scale. Initial abstract review identified articles which met all the criteria of being a: 1) public health intervention; 2) chronic disease prevention focus; 3) program delivered at a wide geographical scale (state, national or international). Interventions were reviewed and coded into a conceptual framework pathway to document their scaling up process. For each program, an in-depth review of the identified articles was undertaken along with a broad internet based search to determine the outcomes of the dissemination process. A conceptual framework of scaling up pathways was developed that involved four stages (development, efficacy testing, real world trial and dissemination) to which the 40 programs were mapped. The search identified 40 public health interventions that showed evidence of being scaled up. Four pathways were identified to capture the different scaling up trajectories taken which included: 'Type I - Comprehensive' (55%) which passed through all four stages, 'Type II - Efficacy omitters' (5%) which did not conduct efficacy testing, 'Type III - Trial omitters' (25%) which did not conduct a real world trial, and 'Type IV - At scale dissemination' (15%) which skipped both efficacy testing and a real world trial. This is the first study to classify and quantify the potential pathways through which public health interventions in high income countries are scaled up to reach the broader population. Mapping these pathways not only demonstrates the different trajectories that occur in scaling up public health interventions, but also allows the variation across scaling up pathways to be classified. The policy and practice determinants leading to each pathway remain for future study, especially to identify the conditions under which efficacy and replication stages are missing.
ERIC Educational Resources Information Center
Rutaisire, John; Gahima, Charles
2009-01-01
Purpose: The purpose of this paper is to assess the relationship between policy development and research evidence with specific reference to the Rwandan Teacher Development and Management Policy introduced in 2005. It aims to highlight the complexity of implementing large-scale system wide change in the specific context of a small African nation…
Automated Detection of Craters in Martian Satellite Imagery Using Convolutional Neural Networks
NASA Astrophysics Data System (ADS)
Norman, C. J.; Paxman, J.; Benedix, G. K.; Tan, T.; Bland, P. A.; Towner, M.
2018-04-01
Crater counting is used to determine the surface age of planets. We propose improvements to Martian crater detection algorithms by implementing an end-to-end detection approach with the possibility of scaling the algorithm planet-wide.
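The abstract does not specify the network, so purely as an illustration: one common building block of such pipelines is a small convolutional classifier that labels fixed-size imagery tiles as crater or background. A hypothetical PyTorch sketch (architecture, tile size, and class count invented):

```python
# Generic tile-classifier CNN for crater detection, sketched in PyTorch.
# The architecture, tile size, and class count are invented for illustration;
# the paper's actual end-to-end detector is not specified in the abstract.
import torch
import torch.nn as nn

class CraterTileNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, 2)  # crater vs. background

    def forward(self, x):                 # x: (batch, 1, 64, 64) grayscale tiles
        x = self.features(x)
        return self.classifier(x.flatten(1))

tiles = torch.randn(8, 1, 64, 64)         # a batch of 64x64 imagery tiles
logits = CraterTileNet()(tiles)
print(logits.shape)                        # torch.Size([8, 2])
```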
Scaling Agile Methods for Department of Defense Programs
2016-12-01
…concepts that drive the design of scaling frameworks, the contextual drivers that shape implementation, and widely known frameworks available today… Barlow probably governs some of the design choices you make. Barlow's formula helps us understand the relationship between the outside diameter of a… encouraged to cross-train engineering staff and move away from a team structure where people focus on only one specialty, such as design…
Fourier Plane Image Combination by Feathering
NASA Astrophysics Data System (ADS)
Cotton, W. D.
2017-09-01
Astronomical objects frequently exhibit structure over a wide range of scales whereas many telescopes, especially interferometer arrays, only sample a limited range of spatial scales. To properly image these objects, images from a set of instruments covering the range of scales may be needed. These images then must be combined in a manner to recover all spatial scales. This paper describes the feathering technique for image combination in the Fourier transform plane. Implementations in several packages are discussed and example combinations of single dish and interferometric observations of both simulated and celestial radio emission are given.
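Feathering blends the two transforms so the single-dish image supplies the short spacings and the interferometer image the long ones. A minimal numpy sketch of that combination (the Gaussian weight below is an assumed stand-in for the single-dish beam response; real implementations also match flux scales and beam areas):

```python
# Minimal feathering sketch: combine a low-resolution (single-dish) and a
# high-resolution (interferometer) image in the Fourier plane with a
# Gaussian weighting function. The taper scale is an assumed stand-in for
# the single-dish beam; real packages also match flux scales and beams.
import numpy as np

def feather(img_lo, img_hi, uv_taper_scale):
    F_lo = np.fft.fftshift(np.fft.fft2(img_lo))
    F_hi = np.fft.fftshift(np.fft.fft2(img_hi))
    ny, nx = img_lo.shape
    v, u = np.meshgrid(np.arange(ny) - ny // 2, np.arange(nx) - nx // 2,
                       indexing="ij")
    r2 = u**2 + v**2
    w = np.exp(-r2 / (2 * uv_taper_scale**2))   # weight -> 1 at short spacings
    F = w * F_lo + (1 - w) * F_hi               # low freqs from the single dish
    return np.fft.ifft2(np.fft.ifftshift(F)).real

combined = feather(np.ones((128, 128)), np.random.rand(128, 128), 8.0)
print(combined.shape)
```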
DOE Office of Scientific and Technical Information (OSTI.GOV)
Porter, Russell G.; Winther, Eric C.; Fox, Lyle G.
2004-01-01
This report presents results for year twelve in a basin-wide program to harvest northern pikeminnow (Ptychocheilus oregonensis). This program was started in an effort to reduce predation by northern pikeminnow on juvenile salmonids during their emigration from natal streams to the ocean. Earlier work in the Columbia River Basin suggested predation by northern pikeminnow on juvenile salmonids might account for most of the 10-20% mortality juvenile salmonids experience in each of eight Columbia River and Snake River reservoirs. Modeling simulations based on work in John Day Reservoir from 1982 through 1988 indicated that, if predator-size northern pikeminnow were exploited at a 10-20% rate, the resulting restructuring of their population could reduce their predation on juvenile salmonids by 50%. To test this hypothesis, we implemented a sport-reward angling fishery and a commercial longline fishery in the John Day Pool in 1990. We also conducted an angling fishery in areas inaccessible to the public at four dams on the mainstem Columbia River and at Ice Harbor Dam on the Snake River. Based on the success of these limited efforts, we implemented three test fisheries on a system-wide scale in 1991--a tribal longline fishery above Bonneville Dam, a sport-reward fishery, and a dam-angling fishery. Low catch of target fish and high cost of implementation resulted in discontinuation of the tribal longline fishery. However, the sport-reward and dam-angling fisheries were continued in 1992 and 1993. In 1992, we investigated the feasibility of implementing a commercial longline fishery in the Columbia River below Bonneville Dam and found that implementation of this fishery was also infeasible. Estimates of combined annual exploitation rates resulting from the sport-reward and dam-angling fisheries remained at the low end of our target range of 10-20%. This suggested the need for additional effective harvest techniques. During 1991 and 1992, we developed and tested a modified (small-sized) Merwin trapnet. We found this floating trapnet to be very effective in catching northern pikeminnow at specific sites. Consequently, in 1993 we examined a system-wide fishery using floating trapnets, but found this fishery to be ineffective at harvesting large numbers of northern pikeminnow on a system-wide scale.
Dean, Shannon M; Gilmore-Bykovskyi, Andrea; Buchanan, Joel; Ehlenfeldt, Brad; Kind, Amy JH
2016-01-01
Background The hospital discharge summary is the primary method used to communicate a patient's plan of care to the next provider(s). Despite the existence of regulations and guidelines outlining the optimal content for the discharge summary and its importance in facilitating an effective transition to post-hospital care, incomplete discharge summaries remain a common problem that may contribute to poor post-hospital outcomes. Electronic health records (EHRs) are regularly used as a platform upon which standardization of content and format can be implemented. Objective We describe here the design and hospital-wide implementation of a standardized discharge summary using an EHR. Methods We employed the evidence-based Replicating Effective Programs implementation strategy to guide the development and implementation during this large-scale project. Results Within 18 months, 90% of all hospital discharge summaries were written using the standardized format. Hospital providers found the template helpful and easy to use, and recipient providers perceived an improvement in the quality of discharge summaries compared to those sent from our hospital previously. Conclusions Discharge summaries can be standardized and implemented hospital-wide with both author and recipient provider satisfaction, especially if evidence-based implementation strategies are employed. The use of EHR tools to guide clinicians in writing comprehensive discharge summaries holds promise in improving the existing deficits in communication at transitions of care. PMID:28334559
Bond, Gary R; Drake, Robert E; Rapp, Charles A; McHugo, Gregory J; Xie, Haiyi
2009-09-01
Fidelity scales have been widely used to assess program adherence to the principles of an evidence-based practice, but they do not measure important aspects of quality of care. Pragmatic scales measuring clinical quality of services are needed to complement fidelity scales measuring structural aspects of program implementation. As part of the instrumentation developed for the National Implementing Evidence-Based Practices Project, we piloted a new instrument with two 5-item quality scales, Individualization (a client-level quality scale) and Quality Improvement (an organizational-level quality scale). Pairs of independent fidelity assessors conducted fidelity reviews in 49 sites in 8 states at baseline and at four subsequent 6-month intervals over a 2-year follow-up period. The assessors followed a standardized protocol to administer these quality scales during daylong site visits; during these same visits they assessed programs on fidelity to the evidence-based practice that the site was seeking to implement. Assessors achieved acceptable interrater reliability for both Individualization and Quality Improvement. Principal components factor analysis confirmed the 2-scale structure. The two scales were modestly correlated with each other and with the evidence-based practice fidelity scales. Over the first year, Individualization and Quality Improvement improved, but showed little or no improvement during the last year of follow-up. The two newly developed scales showed adequate psychometric properties in this preliminary study, but further research is needed to assess their validity and utility in routine clinical practice.
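The abstract reports acceptable interrater reliability for paired assessors without naming the statistic; one common choice for ratings like these is Cohen's kappa, sketched below on invented 5-point ratings:

```python
# Cohen's kappa for two assessors rating the same sites -- one common way to
# quantify the interrater reliability the abstract reports. The ratings
# below are invented; the study's actual statistic is not named.
from collections import Counter

def cohens_kappa(rater1, rater2):
    n = len(rater1)
    observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    c1, c2 = Counter(rater1), Counter(rater2)
    expected = sum(c1[k] * c2[k] for k in c1) / n**2   # chance agreement
    return (observed - expected) / (1 - expected)

site_scores_a = [5, 4, 4, 3, 5, 2, 4, 4]   # assessor 1, Individualization items
site_scores_b = [5, 4, 3, 3, 5, 2, 4, 5]   # assessor 2
print(round(cohens_kappa(site_scores_a, site_scores_b), 2))
```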
White, Michelle C.; Baxter, Linden S.; Close, Kristin L.; Ravelojaona, Vaonandianina A.; Rakotoarison, Hasiniaina N.; Bruno, Emily; Herbert, Alison; Andean, Vanessa; Callahan, James; Andriamanjato, Hery H.; Shrime, Mark G.
2018-01-01
Background The 2009 World Health Organisation (WHO) surgical safety checklist significantly reduces surgical mortality and morbidity (up to 47%). Yet in 2016, only 25% of East African anesthetists regularly use the checklist. Nationwide implementation of the checklist is reported in high-income countries, but in low- and middle-income countries (LMICs) reports of successful implementations are sparse, limited to single institutions and require intensive support. Since checklist use leads to the biggest improvements in outcomes in LMICs, methods of wide-scale implementation are needed. We hypothesized that, using a three-day course, successful wide-scale implementation of the checklist could be achieved, as measured by at least 50% compliance with six basic safety processes at three to four months. We also aimed to determine predictors for checklist utilization. Materials and methods Using a blended educational implementation strategy based on prior pilot studies we designed a three-day dynamic educational course to facilitate widespread implementation of the WHO checklist. The course utilized lectures, film, small group breakouts, participant feedback and simulation to teach the knowledge, skills and behavior changes needed to implement the checklist. In collaboration with the Ministry of Health and local hospital leadership, the course was delivered to 427 multi-disciplinary staff at 21 hospitals located in 19 of 22 regions of Madagascar between September 2015 and March 2016. We evaluated implementation at three to four months using questionnaires (with a 5-point Likert scale) and focus groups. Multivariate linear regression was used to test predictors of checklist utilization. Results At three to four months, 65% of respondents reported always using the checklist, with another 13% using it in part. Participants' years in practice, hospital size, or surgical volume did not predict checklist use. Checklist use was associated with counting instruments (p < 0.05), but not with verifying patient identity, difficult intubation risk, risk of blood loss, prophylactic antibiotic administration, or counting needles and sponges. Conclusion Use of a multi-disciplinary three-day course for checklist implementation resulted in 78% of participants using the checklist at three months, and an increase in counting surgical instruments. Successful checklist implementation was not predicted by participant length of medical service, hospital size or surgical volume. If reproducible in other countries, widespread implementation in LMICs becomes a realistic possibility. PMID:29401465
White, Michelle C; Baxter, Linden S; Close, Kristin L; Ravelojaona, Vaonandianina A; Rakotoarison, Hasiniaina N; Bruno, Emily; Herbert, Alison; Andean, Vanessa; Callahan, James; Andriamanjato, Hery H; Shrime, Mark G
2018-01-01
The 2009 World Health Organisation (WHO) surgical safety checklist significantly reduces surgical mortality and morbidity (up to 47%). Yet in 2016, only 25% of East African anesthetists regularly use the checklist. Nationwide implementation of the checklist is reported in high-income countries, but in low- and middle-income countries (LMICs) reports of successful implementations are sparse, limited to single institutions and require intensive support. Since checklist use leads to the biggest improvements in outcomes in LMICs, methods of wide-scale implementation are needed. We hypothesized that, using a three-day course, successful wide-scale implementation of the checklist could be achieved, as measured by at least 50% compliance with six basic safety processes at three to four months. We also aimed to determine predictors for checklist utilization. Using a blended educational implementation strategy based on prior pilot studies we designed a three-day dynamic educational course to facilitate widespread implementation of the WHO checklist. The course utilized lectures, film, small group breakouts, participant feedback and simulation to teach the knowledge, skills and behavior changes needed to implement the checklist. In collaboration with the Ministry of Health and local hospital leadership, the course was delivered to 427 multi-disciplinary staff at 21 hospitals located in 19 of 22 regions of Madagascar between September 2015 and March 2016. We evaluated implementation at three to four months using questionnaires (with a 5-point Likert scale) and focus groups. Multivariate linear regression was used to test predictors of checklist utilization. At three to four months, 65% of respondents reported always using the checklist, with another 13% using it in part. Participants' years in practice, hospital size, or surgical volume did not predict checklist use. Checklist use was associated with counting instruments (p < 0.05), but not with verifying patient identity, difficult intubation risk, risk of blood loss, prophylactic antibiotic administration, or counting needles and sponges. Use of a multi-disciplinary three-day course for checklist implementation resulted in 78% of participants using the checklist at three months, and an increase in counting surgical instruments. Successful checklist implementation was not predicted by participant length of medical service, hospital size or surgical volume. If reproducible in other countries, widespread implementation in LMICs becomes a realistic possibility.
Djurfeldt, Mikael
2012-07-01
The connection-set algebra (CSA) is a novel and general formalism for the description of connectivity in neuronal network models, from small-scale to large-scale structure. The algebra provides operators to form more complex sets of connections from simpler ones and also provides parameterization of such sets. CSA is expressive enough to describe a wide range of connection patterns, including multiple types of random and/or geometrically dependent connectivity, and can serve as a concise notation for network structure in scientific writing. CSA implementations allow for scalable and efficient representation of connectivity in parallel neuronal network simulators and could even allow for avoiding explicit representation of connections in computer memory. The expressiveness of CSA makes prototyping of network structure easy. A C++ version of the algebra has been implemented and used in a large-scale neuronal network simulation (Djurfeldt et al., IBM J Res Dev 52(1/2):31-42, 2008b) and an implementation in Python has been publicly released.
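CSA builds complex connectivity by combining elementary connection sets with set operators and masks. The released csa package has its own operator syntax; the sketch below only imitates the compositional idea with plain Python sets of (source, target) pairs, so all names here are illustrative:

```python
# Imitation of the connection-set algebra idea using plain Python sets of
# (source, target) pairs. This mimics the compositional style of CSA but is
# NOT the API of the actual csa package.
import random

N = 5
all_to_all = {(i, j) for i in range(N) for j in range(N)}   # full mask
one_to_one = {(i, i) for i in range(N)}                      # delta mask

def random_set(p, universe, seed=0):
    # Parameterized random connection set: keep each pair with probability p.
    rng = random.Random(seed)
    return {c for c in sorted(universe) if rng.random() < p}

# Compose simpler sets into more complex ones with set algebra:
recurrent = random_set(0.3, all_to_all) - one_to_one   # random, no self-connections
print(sorted(recurrent))
```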
On the application of the Germano identity to subgrid-scale modeling
NASA Technical Reports Server (NTRS)
Ronchi, C.; Ypma, M.; Canuto, V. M.
1992-01-01
An identity proposed by Germano (1992) has been widely applied to several turbulent flows to compute the Smagorinsky coefficient dynamically rather than tuning it by hand. The assumptions under which the method has been used are discussed, and some conceptual difficulties in its current implementation are examined.
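The abstract states no equations; for reference, the identity and Lilly's least-squares determination of the dynamic coefficient are commonly written as follows (bar denotes the grid filter, hat the test filter, and α the filter-width ratio; standard notation, not drawn from this paper):

```latex
% Germano identity and dynamic Smagorinsky coefficient (standard form,
% supplied here for reference; the abstract itself states no equations).
\mathcal{L}_{ij} = T_{ij} - \widehat{\tau}_{ij}
                 = \widehat{\bar{u}_i \bar{u}_j} - \hat{\bar{u}}_i \hat{\bar{u}}_j ,
\qquad
C = \frac{\langle \mathcal{L}_{ij} M_{ij} \rangle}{\langle M_{ij} M_{ij} \rangle},
\qquad
M_{ij} = 2\Delta^2 \left( \widehat{|\bar{S}|\,\bar{S}_{ij}}
        - \alpha^2 |\hat{\bar{S}}|\,\hat{\bar{S}}_{ij} \right),
\quad \alpha = \hat{\Delta}/\Delta .
```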
Kane, Michael D; Springer, John A; Sprague, Jon E
2008-07-01
The rationale and overall system-wide behavior of a clinical genotyping information system (both DNA analysis and data management) require a near-term, scalable approach, which is emerging in the focused implementation of pharmacogenomics and drug safety assurance. The challenges to implementing a successful clinical genotyping system are described, as is how the benefits of a focused, near-term system for drug safety assessment and assurance overcome the logistical and operational challenges that perpetually hinder the development of a societal-scale clinical genotyping system. This rationale is based on the premise that a focused application domain for clinical genotyping, specifically drug safety assurance, provides a transition paradigm for both professionals and consumers of healthcare, thereby facilitating the movement of genotyping from bench to bedside and paving the way for the adoption of prognostic and diagnostic applications in clinical genomics.
Colloquium on Large Scale Improvement: Implications for AISI
ERIC Educational Resources Information Center
McEwen, Nelly, Ed.
2008-01-01
The Alberta Initiative for School Improvement (AISI) is a province-wide partnership program whose goal is to improve student learning and performance by fostering initiatives that reflect the unique needs and circumstances of each school authority. It is currently ending its third cycle and ninth year of implementation. "The Colloquium on…
Scale Up in Education: Volume 2: Issues in Practice
ERIC Educational Resources Information Center
Schneider, Barbara, Ed.; McDonald, Sarah-Kathryn, Ed.
2006-01-01
This book explores the challenges of implementing and assessing educational interventions in varied classroom contexts. Included are reflections on the challenges of designing studies for improving the instructional core of schools, guidelines for establishing evidence of interventions' impacts across a wide range of settings, and an assessment of…
Students' Attitudes toward Gene Technology: Deconstructing a Construct
ERIC Educational Resources Information Center
Gardner, Grant E.; Troelstrup, Angelique
2015-01-01
Emergent technologies are commonly characterized as involving cutting-edge developments while lacking wide-scale public implementation. Although currently prevalent in many applications, gene technology is often considered emergent in that the science changes so rapidly. Science educators at all levels of formal education are faced with a unique…
Interagency Collaborative Team Model for Capacity Building to Scale-Up Evidence-Based Practice
Hurlburt, Michael; Aarons, Gregory A; Fettes, Danielle; Willging, Cathleen; Gunderson, Lara; Chaffin, Mark J
2015-01-01
Background System-wide scale-up of evidence-based practice (EBP) is a complex process. Yet, few strategic approaches exist to support EBP implementation and sustainment across a service system. Building on the Exploration, Preparation, Implementation, and Sustainment (EPIS) implementation framework, we developed and are testing the Interagency Collaborative Team (ICT) process model to implement an evidence-based child neglect intervention (i.e., SafeCare®) within a large children's service system. The ICT model emphasizes the role of local agency collaborations in creating structural supports for successful implementation. Methods We describe the ICT model and present preliminary qualitative results from use of the implementation model in one large-scale EBP implementation. Qualitative interviews were conducted to assess challenges in building system, organization, and home visitor collaboration and capacity to implement the EBP. Data collection and analysis centered on EBP implementation issues, as well as the experiences of home visitors under the ICT model. Results Six notable issues relating to implementation process emerged from participant interviews, including: (a) initial commitment and collaboration among stakeholders, (b) leadership, (c) communication, (d) practice fit with local context, (e) ongoing negotiation and problem solving, and (f) early successes. These issues highlight strengths and areas for development in the ICT model. Conclusions Use of the ICT model led to sustained and widespread use of SafeCare in one large county. Although some aspects of the implementation model may benefit from enhancement, qualitative findings suggest that the ICT process generates strong structural supports for implementation and creates conditions in which tensions between EBP structure and local contextual variations can be resolved in ways that support the expansion and maintenance of an EBP while preserving potential for public health benefit. PMID:27512239
Chiang, Michael F; Starren, Justin B
2002-01-01
The successful implementation of clinical information systems is difficult. In examining the reasons and potential solutions for this problem, the medical informatics community may benefit from the lessons of a rich body of software engineering and management literature about the failure of software projects. Based on previous studies, we present a conceptual framework for understanding the risk factors associated with large-scale projects. However, the vast majority of existing literature is based on large, enterprise-wide systems, and it is unclear whether those results may be scaled down and applied to smaller projects such as departmental medical information systems. To examine this issue, we discuss the case study of a delayed electronic medical record implementation project in a small specialty practice at Columbia-Presbyterian Medical Center. While the factors contributing to the delay of this small project share some attributes with those found in larger organizations, there are important differences. The significance of these differences for groups implementing small medical information systems is discussed.
Close, Kristin L; Baxter, Linden S; Ravelojaona, Vaonandianina A; Rakotoarison, Hasiniaina N; Bruno, Emily; Herbert, Alison; Andean, Vanessa; Callahan, James; Andriamanjato, Hery H
2017-01-01
The WHO Surgical Safety Checklist was launched in 2009, and appropriate use reduces mortality, surgical site infections and complications after surgery by up to 50%. Implementation across low-income and middle-income countries has been slow; published evidence is restricted to reports from a few single institutions, and significant challenges to successful implementation have been identified and presented. The Mercy Ships Medical Capacity Building team developed a multidisciplinary 3-day Surgical Safety Checklist training programme designed for rapid wide-scale implementation in all regional referral hospitals in Madagascar. Particular attention was given to addressing previously reported challenges to implementation. We taught 427 participants in 21 hospitals; at 3–4 months postcourse, we collected surveys from 183 participants in 20 hospitals and conducted one focus group per hospital. We used a concurrent embedded approach in this mixed-methods design to evaluate participants’ experiences and behavioural change as a result of the training programme. Quantitative and qualitative data were analysed using descriptive statistics and inductive thematic analysis, respectively. This analysis paper describes our field experiences and aims to report participants’ responses to the training course, identify further challenges to implementation and describe the lessons learnt. Recommendations are given for stakeholders seeking widespread rapid scale up of quality improvement initiatives to promote surgical safety worldwide. PMID:29225958
1998-06-01
quality management can have on the intermediate level of maintenance. Power quality management is a preventative process that focuses on identifying and correcting problems that cause bad power. Using cost-benefit analysis, we compare the effects of implementing a power quality management program at AIMD Lemoore and AIMD Fallon. The implementation of power quality management can result in wide-scale logistical support changes with regard to the life cycle costs of maintaining the DoD’s current inventory
Clay-Williams, Robyn; Nosrati, Hadis; Cunningham, Frances C; Hillman, Kenneth; Braithwaite, Jeffrey
2014-09-03
While health care services are beginning to implement system-wide patient safety interventions, evidence on the efficacy of these interventions is sparse. We know that uptake can be variable, but we do not know the factors that affect uptake or how the interventions establish change and, in particular, whether they influence patient outcomes. We conducted a systematic review to identify how organisational and cultural factors mediate or are mediated by hospital-wide interventions, and to assess the effects of those factors on patient outcomes. A systematic review was conducted and reported in accordance with Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. Database searches were conducted using MEDLINE from 1946, CINAHL from 1991, EMBASE from 1947, Web of Science from 1934, PsycINFO from 1967, and Global Health from 1910 to September 2012. The Lancet, JAMA, BMJ, BMJ Quality and Safety, The New England Journal of Medicine and Implementation Science were also hand searched for relevant studies published over the last 5 years. Eligible studies were required to focus on organisational determinants of hospital- and system-wide interventions, and to provide patient outcome data before and after implementation of the intervention. Empirical, peer-reviewed studies reporting randomised and non-randomised controlled trials, observational, and controlled before and after studies were included in the review. Six studies met the inclusion criteria. Improved outcomes were observed for studies where outcomes were measured at least two years after the intervention. Associations between organisational factors, intervention success and patient outcomes were undetermined: organisational culture and patient outcomes were rarely measured together, and measures for culture and outcome were not standardised. Common findings show the difficulty of introducing large-scale interventions, and that effective leadership and clinical champions, adequate financial and educational resources, and dedicated promotional activities appear to be common factors in successful system-wide change. The protocol has been registered in the international prospective register of systematic reviews, PROSPERO (Registration No. CRD42103003050).
An Illustrative Guide to the Minerva Framework
NASA Astrophysics Data System (ADS)
Flom, Erik; Leonard, Patrick; Hoeffel, Udo; Kwak, Sehyun; Pavone, Andrea; Svensson, Jakob; Krychowiak, Maciej; Wendelstein 7-X Team Collaboration
2017-10-01
Modern physics experiments require tracking and modelling data and their associated uncertainties on a large scale, as well as the combined implementation of multiple independent data streams for sophisticated modelling and analysis. The Minerva Framework offers a centralized, user-friendly method of large-scale physics modelling and scientific inference. Currently used by teams at multiple large-scale fusion experiments including the Joint European Torus (JET) and Wendelstein 7-X (W7-X), the Minerva framework provides a forward-model friendly architecture for developing and implementing models for large-scale experiments. One aspect of the framework involves so-called data sources, which are nodes in the graphical model. These nodes are supplied with engineering and physics parameters. When end-user level code calls a node, it is checked network-wide against its dependent nodes for changes since its last implementation and returns version-specific data. Here, a filterscope data node is used as an illustrative example of the Minerva Framework's data management structure and its further application to Bayesian modelling of complex systems. This work has been carried out within the framework of the EUROfusion Consortium and has received funding from the Euratom research and training programme 2014-2018 under Grant Agreement No. 633053.
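The versioned data-source idea in this abstract lends itself to a compact illustration. The Python sketch below is purely illustrative (Minerva is a separate framework with its own API; all class and node names here are hypothetical): a node caches its value and recomputes only when a dependency's version has changed since the last evaluation.

```python
# Illustrative sketch of version-aware dependency caching (hypothetical
# names; not the Minerva API): a node recomputes its value only when a
# dependency's version has changed since the last evaluation.
class Node:
    def __init__(self, name, compute, deps=()):
        self.name, self.compute, self.deps = name, compute, list(deps)
        self.version = 0      # bumped whenever this node's value changes
        self._cache = None
        self._seen = None     # dependency versions at last evaluation

    def invalidate(self):
        """Mark this node dirty, e.g. after new engineering parameters."""
        self._cache = None
        self.version += 1

    def value(self):
        dep_vals = [d.value() for d in self.deps]      # refresh deps first
        dep_versions = tuple(d.version for d in self.deps)
        if self._cache is None or dep_versions != self._seen:
            self._cache = self.compute(*dep_vals)
            self._seen = dep_versions
            self.version += 1
        return self._cache

# Toy graph: a calibration node feeding a filterscope-like signal node.
calibration = Node("calibration", lambda: 2.0)
raw = Node("raw", lambda: [1.0, 2.0, 3.0])
signal = Node("signal", lambda c, r: [c * x for x in r],
              deps=[calibration, raw])
print(signal.value())   # computed: [2.0, 4.0, 6.0]
print(signal.value())   # served from cache; dependencies unchanged
```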
An Implicit Solver on A Parallel Block-Structured Adaptive Mesh Grid for FLASH
NASA Astrophysics Data System (ADS)
Lee, D.; Gopal, S.; Mohapatra, P.
2012-07-01
We introduce a fully implicit solver for FLASH based on a Jacobian-Free Newton-Krylov (JFNK) approach with an appropriate preconditioner. The main goal of developing this JFNK-type implicit solver is to provide efficient high-order numerical algorithms and methodology for simulating stiff systems of differential equations on large-scale parallel computer architectures. A large number of natural problems in nonlinear physics involve a wide range of spatial and time scales of interest. A system that encompasses such a wide magnitude of scales is described as "stiff." A stiff system can arise in many different fields of physics, including fluid dynamics/aerodynamics, laboratory/space plasma physics, low Mach number flows, reactive flows, radiation hydrodynamics, and geophysical flows. One of the big challenges in solving such a stiff system using current-day computational resources lies in resolving time and length scales varying by several orders of magnitude. We introduce FLASH's preliminary implementation of a time-accurate JFNK-based implicit solver in the framework of FLASH's unsplit hydro solver.
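Since the abstract describes JFNK only at a high level, here is a minimal, hedged sketch of the same idea using SciPy's generic newton_krylov solver (not FLASH's implementation): the Jacobian is never assembled, and the Krylov method needs only Jacobian-vector products obtained from finite differences of the residual.

```python
# Minimal Jacobian-Free Newton-Krylov solve with scipy.optimize.newton_krylov
# (a textbook illustration of the JFNK idea, not FLASH's solver).
import numpy as np
from scipy.optimize import newton_krylov

def residual(u):
    # Steady 1D reaction-diffusion u'' = u**3 with u(0) = 1, u(1) = 0,
    # discretized on an interior grid of n points.
    n = u.size
    h = 1.0 / (n + 1)
    left = np.concatenate(([1.0], u[:-1]))    # boundary u(0) = 1
    right = np.concatenate((u[1:], [0.0]))    # boundary u(1) = 0
    return (left - 2.0 * u + right) / h**2 - u**3

u0 = np.linspace(1.0, 0.0, 50)                # initial guess
sol = newton_krylov(residual, u0, method="lgmres", f_tol=1e-10)
print(sol[:3], np.abs(residual(sol)).max())
```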
Weaving Culture and Core Content into FLEX Programs
ERIC Educational Resources Information Center
Schultz, Kennedy M.
2012-01-01
While immersion programs provide some of the greatest benefits to children learning a new language, many school systems have yet to dedicate the financial and personnel resources necessary to plan and implement such programs on a wide scale. In areas where immersion or formal FLES programming does not exist in the schools, often opportunities to…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Herk, A.; Beggs, T.
This report outlines the steps a developer can take when creating and implementing high performance standards such as the U.S. Department of Energy’s (DOE’s) Zero Energy Ready Home (ZERH) standards on a community-wide scale. The report also describes the specific examples of how this process is underway in the Stapleton community in Denver, Colorado, by the developer Forest City.
100 Years of Attempts to Transform Physics Education
ERIC Educational Resources Information Center
Otero, Valerie K.; Meltzer, David E.
2016-01-01
As far back as the late 1800s, U.S. physics teachers expressed many of the same ideas about physics education reform that are advocated today. However, several popular reform efforts eventually failed to have wide impact, despite strong and enthusiastic support within the physics education community. Broad-scale implementation of improved…
Croyden, Debbie L; Vidgen, Helen A; Esdaile, Emma; Hernandez, Emely; Magarey, Anthea; Moores, Carly J; Daniels, Lynne
2018-03-13
PEACH™QLD translated the PEACH™ Program, designed to manage overweight/obesity in primary school-aged children, from efficacious RCT and small scale community trial to a larger state-wide program. This paper describes the lessons learnt when upscaling to universal health coverage. The 6-month, family-focussed program was delivered in Queensland, Australia from 2013 to 2016. Its implementation was planned by researchers who developed the program and conducted the RCT, and experienced project managers and practitioners across the health continuum. The intervention targeted parents as the agents of change and was delivered via parent-only group sessions. Concurrently, children attended fun, non-competitive activity sessions. Sessions were delivered by facilitators who received standardised training and were employed by a range of service providers. Participants were referred by health professionals or self-referred in response to extensive promotion and marketing. A pilot phase and a quality improvement framework were planned to respond to emerging challenges. Implementation challenges included engagement of the health system; participant recruitment; and engagement. A total of 1513 children (1216 families) enrolled, with 1122 children (919 families) in the face-to-face program (105 groups in 50 unique venues) and 391 children (297 families) in PEACH™ Online. Self-referral generated 68% of enrolments. Unexpected, concurrent, and far-reaching public health system changes contributed to poor program uptake by the sector (only 56 [53%] groups delivered by publicly-funded health organisations) requiring substantial modification of the original implementation plan. Process evaluation during the pilot phase and an ongoing quality improvement framework informed program adaptations that included changing from fortnightly to weekly sessions aligned with school terms, revision of parent materials, modification of eligibility criteria to include healthy weight children and provision of services privately. Comparisons between pilot versus state-wide waves showed comparable prevalence of families not attending any sessions (25% vs 28%) but improved number of sessions attended (median = 5 vs 7) and completion rates (43% vs 56%). Translating programs developed in the research context to enable implementation at scale is complex and presents substantial challenges. Planning must ensure there is flexibility to accommodate and proactively manage the system changes that are inevitable over time. ACTRN12617000315314. This trial was registered retrospectively on 28 February, 2017.
Putnam, Robert F; Kincaid, Donald
2015-05-01
Horner and Sugai (2015) recently wrote a manuscript providing an overview of school-wide positive behavioral interventions and supports (PBIS) and why it is an example of applied behavior analysis at the scale of social importance. This paper will describe why school-wide PBIS is important to behavior analysts, how it helps promote applied behavior analysis in schools and other organizations, and how behavior analysts can use this framework to assist them in the promotion and implementation of applied behavior analysis at both the school and organizational level, as well as the classroom and individual level.
MWASTools: an R/bioconductor package for metabolome-wide association studies.
Rodriguez-Martinez, Andrea; Posma, Joram M; Ayala, Rafael; Neves, Ana L; Anwar, Maryam; Petretto, Enrico; Emanueli, Costanza; Gauguier, Dominique; Nicholson, Jeremy K; Dumas, Marc-Emmanuel
2018-03-01
MWASTools is an R package designed to provide an integrated pipeline to analyse metabonomic data in large-scale epidemiological studies. Key functionalities of our package include: quality control analysis; metabolome-wide association analysis using various models (partial correlations, generalized linear models); visualization of statistical outcomes; metabolite assignment using statistical total correlation spectroscopy (STOCSY); and biological interpretation of metabolome-wide association studies results. The MWASTools R package is implemented in R (version >= 3.4) and is available from Bioconductor: https://bioconductor.org/packages/MWASTools/. m.dumas@imperial.ac.uk. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.
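MWASTools itself is an R/Bioconductor package, so the sketch below is only a language-neutral analogue of the central MWAS step it automates, written in Python under stated assumptions (simulated data, simple correlation tests): test each metabolite against a phenotype, then control the false-discovery rate across the metabolome.

```python
# Hedged Python analogue of a metabolome-wide association scan
# (not the MWASTools API; data are simulated).
import numpy as np
from scipy import stats
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(0)
n, p = 200, 500
metabolites = rng.normal(size=(n, p))                      # feature matrix
phenotype = 0.5 * metabolites[:, 0] + rng.normal(size=n)   # true signal at 0

# One association test per metabolite, then Benjamini-Hochberg FDR control.
pvals = np.array([stats.pearsonr(metabolites[:, j], phenotype)[1]
                  for j in range(p)])
reject, qvals, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
print("metabolites passing FDR:", np.flatnonzero(reject))
```

A real pipeline would substitute generalized linear models with covariate adjustment for the plain correlation test, as the abstract describes.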
Interval-level measurement with visual analogue scales in Internet-based research: VAS Generator.
Reips, Ulf-Dietrich; Funke, Frederik
2008-08-01
The present article describes VAS Generator (www.vasgenerator.net), a free Web service for creating a wide range of visual analogue scales that can be used as measurement devices in Web surveys and Web experimentation, as well as for local computerized assessment. A step-by-step example for creating and implementing a visual analogue scale with visual feedback is given. VAS Generator and the scales it generates work independently of platforms and use the underlying languages HTML and JavaScript. Results from a validation study with 355 participants are reported and show that the scales generated with VAS Generator approximate an interval-scale level. In light of previous research on visual analogue versus categorical (e.g., radio button) scales in Internet-based research, we conclude that categorical scales only reach ordinal-scale level, and thus visual analogue scales are to be preferred whenever possible.
The Challenges of Evaluating Large-Scale, Multi-Partner Programmes: The Case of NIHR CLAHRCs
ERIC Educational Resources Information Center
Martin, Graham P.; Ward, Vicky; Hendy, Jane; Rowley, Emma; Nancarrow, Susan; Heaton, Janet; Britten, Nicky; Fielden, Sandra; Ariss, Steven
2011-01-01
The limited extent to which research evidence is utilised in healthcare and other public services is widely acknowledged. The United Kingdom government has attempted to address this gap by funding nine Collaborations for Leadership in Applied Health Research and Care (CLAHRCs). CLAHRCs aim to carry out health research, implement research findings…
Improving Student Retention through Evidence Based Proactive Systems at the Open University (UK)
ERIC Educational Resources Information Center
Gibbs, Graham; Regan, Peter; Simpson, Ormond
2007-01-01
The Open University has been undertaking an extended initiative to improve student retention through enhanced support for at-risk students. This initiative has evolved through a series of stages from ad hoc small scale local interventions relying largely on tutors and student self-referral, to an institution-wide pro-active system implemented by…
The Roles of Artificial Intelligence in Education: Current Progress and Future Prospects
ERIC Educational Resources Information Center
McArthur, David; Lewis, Matthew; Bishary, Miriam
2005-01-01
This report begins by summarizing current applications of ideas from artificial intelligence (AI) to education. It then uses that summary to project various future applications of AI--and advanced technology in general--to education, as well as highlighting problems that will confront the wide-scale implementation of these technologies in the…
Boyer, Dana; Ramaswami, Anu
2017-10-17
This paper develops a methodology for individual cities to use to analyze the in- and trans-boundary water, greenhouse gas (GHG), and land impacts of city-scale food system actions. Applied to Delhi, India, the analysis demonstrates that city-scale action can rival typical food policy interventions that occur at larger scales, although no single city-scale action can rival in all three environmental impacts. In particular, improved food-waste management within the city (7% system-wide GHG reduction) matches the GHG impact of preconsumer trans-boundary food waste reduction. The systems approach is particularly useful in illustrating key trade-offs and co-benefits. For instance, multiple diet shifts that can reduce GHG emissions have trade-offs that increase water and land impacts. Vertical farming technology (VFT) with current applications for fruits and vegetables can provide modest system-wide water (4%) and land reductions (3%), although implementation within the city itself may raise questions of constraints in water-stressed cities, with such a shift in Delhi increasing community-wide direct water use by 16%. Improving the nutrition status for the bottom 50% of the population to the median diet is accompanied by proportionally smaller increases of water, GHG, and land impacts (4%, 9%, and 8%, system-wide): increases that can be offset through simultaneous city-scale actions, e.g., improved food-waste management and VFT.
Particle dispersion in homogeneous turbulence using the one-dimensional turbulence model
Sun, Guangyuan; Lignell, David O.; Hewson, John C.; ...
2014-10-09
Lagrangian particle dispersion is studied using the one-dimensional turbulence (ODT) model in homogeneous decaying turbulence configurations. The ODT model has been widely and successfully applied to a number of reacting and nonreacting flow configurations, but only limited application has been made to multiphase flows. We present a version of the particle implementation and interaction with the stochastic and instantaneous ODT eddy events. The model is characterized by comparison to experimental data of particle dispersion for a range of intrinsic particle time scales and body forces. Particle dispersion, velocity, and integral time scale results are presented. Moreover, the particle implementation introduces a single model parameter β_p, and sensitivity to this parameter and behavior of the model are discussed. Good agreement is found with experimental data and the ODT model is able to capture the particle inertial and trajectory crossing effects. Our results serve as a validation case of the multiphase implementations of ODT for extensions to other flow configurations.
Newmark local time stepping on high-performance computing architectures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rietmann, Max, E-mail: max.rietmann@erdw.ethz.ch; Institute of Geophysics, ETH Zurich; Grote, Marcus, E-mail: marcus.grote@unibas.ch
In multi-scale complex media, finite element meshes often require areas of local refinement, creating small elements that can dramatically reduce the global time-step for wave-propagation problems due to the CFL condition. Local time stepping (LTS) algorithms allow an explicit time-stepping scheme to adapt the time-step to the element size, allowing near-optimal time-steps everywhere in the mesh. We develop an efficient multilevel LTS-Newmark scheme and implement it in a widely used continuous finite element seismic wave-propagation package. In particular, we extend the standard LTS formulation with adaptations to continuous finite element methods that can be implemented very efficiently with very strong element-size contrasts (more than 100x). Capable of running on large CPU and GPU clusters, we present both synthetic validation examples and large scale, realistic application examples to demonstrate the performance and applicability of the method and implementation on thousands of CPU cores and hundreds of GPUs.
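To make the multilevel idea concrete, here is a small hedged sketch (toy mesh, not the authors' scheme): each element's CFL-stable step is computed from its size, and elements are binned into levels that substep the global time-step by powers of two.

```python
# Hedged toy sketch of multilevel LTS bookkeeping: level-k elements take
# 2**k substeps of the global step, keeping each near its own CFL limit.
import numpy as np

h = np.array([1.0, 1.0, 0.5, 0.1, 0.01, 1.0])   # element sizes (toy mesh)
c, cfl = 1.0, 0.5                               # wave speed, CFL number
dt_elem = cfl * h / c                           # per-element stable step
levels = np.ceil(np.log2(dt_elem.max() / dt_elem)).astype(int)
print(levels)   # [0 0 1 4 7 0]: small elements substep, large ones do not
```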
Howard, N; Mounier-Jack, S; Gallagher, K E; Kabakama, S; Griffiths, U K; Feletto, M; LaMontagne, D S; Burchett, H E D; Watson-Jones, D
2016-09-01
Demonstration projects or pilots of new public health interventions aim to build learning and capacity to inform country-wide implementation. Authors examined the value of HPV vaccination demonstration projects and initial national programmes in low-income and lower-middle-income countries, including potential drawbacks and how value for national scale-up might be increased. Data from a systematic review and key informant interviews, analyzed thematically, included 55 demonstration projects and 8 national programmes implemented between 2007-2015 (89 years' experience). Initial demonstration projects quickly provided consistent lessons. Value would increase if projects were designed to inform sustainable national scale-up. Well-designed projects can test multiple delivery strategies, implementation for challenging areas and populations, and integration with national systems. Introduction of vaccines or other health interventions, particularly those involving new target groups or delivery strategies, needs flexible funding approaches to address specific questions of scalability and sustainability, including learning lessons through phased national expansion.
A Supervisor-Targeted Implementation Approach to Promote System Change: The R3 Model.
Saldana, Lisa; Chamberlain, Patricia; Chapman, Jason
2016-11-01
Opportunities to evaluate strategies to create system-wide change in the child welfare system (CWS) and the resulting public health impact are rare. Leveraging a real-world, system-initiated effort to infuse the use of evidence-based principles throughout a CWS workforce, a pilot of the R3 model and supervisor-targeted implementation approach is described. The development of R3 and its associated fidelity monitoring was a collaboration between the CWS and model developers. Outcomes demonstrate implementation feasibility, strong fidelity scale measurement properties, improved supervisor fidelity over time, and the acceptability and perception of positive change by agency leadership. The value of system-initiated collaborations is discussed.
NASA Astrophysics Data System (ADS)
Byun, Hye Suk; El-Naggar, Mohamed Y.; Kalia, Rajiv K.; Nakano, Aiichiro; Vashishta, Priya
2017-10-01
Kinetic Monte Carlo (KMC) simulations are used to study long-time dynamics of a wide variety of systems. Unfortunately, the conventional KMC algorithm is not scalable to larger systems, since its time scale is inversely proportional to the simulated system size. A promising approach to resolving this issue is the synchronous parallel KMC (SPKMC) algorithm, which makes the time scale size-independent. This paper introduces a formal derivation of the SPKMC algorithm based on local transition-state and time-dependent Hartree approximations, as well as its scalable parallel implementation based on a dual linked-list cell method. The resulting algorithm has achieved a weak-scaling parallel efficiency of 0.935 on 1024 Intel Xeon processors for simulating biological electron transfer dynamics in a 4.2 billion-heme system, as well as decent strong-scaling parallel efficiency. The parallel code has been used to simulate a lattice of cytochrome complexes on a bacterial-membrane nanowire, and it is broadly applicable to other problems such as computational synthesis of new materials.
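The abstract's key scaling point, that the KMC time-step shrinks as the system grows, is easy to demonstrate. Below is a minimal serial KMC loop in Python; it is a generic textbook sketch, not the SPKMC algorithm or the authors' parallel code.

```python
# Minimal serial KMC loop: each event advances the clock by an exponential
# with mean 1/(total rate), so simulated time per event shrinks with size.
import numpy as np

rng = np.random.default_rng(1)

def kmc_time(n_sites, rate_per_site=1.0, n_events=20000):
    rates = np.full(n_sites, rate_per_site)
    t = 0.0
    for _ in range(n_events):
        total = rates.sum()
        i = rng.choice(n_sites, p=rates / total)   # select next event
        t += rng.exponential(1.0 / total)          # advance the clock
        # A real simulation would now apply event i and update the
        # affected local rates; the toy rates here stay constant.
    return t

print(kmc_time(1000))   # ~20 time units
print(kmc_time(2000))   # ~10: double the size, half the simulated time
```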
ERIC Educational Resources Information Center
Hudson, Alan; Cameron, Christine; Matthews, Jan
2008-01-01
Background: While there have been several evaluations of programs to help parents manage difficult behaviour of their child with an intellectual disability, little research has focused on the evaluation of such programs when delivered to large populations. Method: The benchmarks recommended by Wiese, Stancliffe, and Hemsley (2005) were used to…
ERIC Educational Resources Information Center
Sotiriou, Sofoklis; Bybee, Rodger W.; Bogner, Franz X.
2017-01-01
The fundamental pioneering ideas about student-centered, inquiry-based learning initiatives differ in Europe and the US. The latter had initiated various top-down schemes that have led to well-defined standards, while in Europe, with its some 50 independent educational systems, a wide variety of approaches has evolved. In this present…
ERIC Educational Resources Information Center
Miller, Faith G.; Patwa, Shamim S.; Chafouleas, Sandra M.
2014-01-01
An increased emphasis on collecting and using data in schools has occurred, in part, because of the implementation of multi-tiered systems of support (MTSS). Commonly referred to as response to intervention in the academic domain and school-wide positive behavioral interventions and supports in the behavioral domain, these initiatives have a…
ERIC Educational Resources Information Center
Clarke, Nathan R.; Casey, John Patrick; Brown, Earlene D.; Oneyma, Ezenwa; Donaghy, Kelley J.
2006-01-01
A synthesis is developed to make biodiesel from vegetable oils such as soybean, sunflower, and corn oil, as an exercise in the laboratory. Viscosity measurements were used to gain an understanding of an intermolecular property of the biodiesel that has limited the implementation of biodiesel on a wide-scale basis, solidification at low…
BioC implementations in Go, Perl, Python and Ruby
Liu, Wanli; Islamaj Doğan, Rezarta; Kwon, Dongseop; Marques, Hernani; Rinaldi, Fabio; Wilbur, W. John; Comeau, Donald C.
2014-01-01
As part of a community-wide effort for evaluating text mining and information extraction systems applied to the biomedical domain, BioC is focused on the goal of interoperability, currently a major barrier to wide-scale adoption of text mining tools. BioC is a simple XML format, specified by DTD, for exchanging data for biomedical natural language processing. With initial implementations in C++ and Java, BioC provides libraries of code for reading and writing BioC text documents and annotations. We extend BioC to Perl, Python, Go and Ruby. We used SWIG to extend the C++ implementation for Perl and one Python implementation. A second Python implementation and the Ruby implementation use native data structures and libraries. BioC is also implemented in the Google language Go. BioC modules are functional in all of these languages, which can facilitate text mining tasks. BioC implementations are freely available through the BioC site: http://bioc.sourceforge.net. Database URL: http://bioc.sourceforge.net/ PMID:24961236
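Since BioC is an XML interchange format, a short example helps. The sketch below reads a BioC-style document using only Python's standard library; the collection/document/passage nesting follows the published DTD, but treat the minimal inline example as illustrative rather than a complete schema.

```python
# Reading a BioC-style document with the Python standard library only.
import xml.etree.ElementTree as ET

bioc_xml = """
<collection>
  <source>example</source>
  <document>
    <id>12345</id>
    <passage>
      <offset>0</offset>
      <text>BioC is a simple XML format for exchanging BioNLP data.</text>
    </passage>
  </document>
</collection>
"""

root = ET.fromstring(bioc_xml)
for doc in root.iter("document"):
    for passage in doc.iter("passage"):
        print(doc.findtext("id"),
              passage.findtext("offset"),
              passage.findtext("text"))
```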
DistributedFBA.jl: High-level, high-performance flux balance analysis in Julia
Heirendt, Laurent; Thiele, Ines; Fleming, Ronan M. T.
2017-01-16
Flux balance analysis and its variants are widely used methods for predicting steady-state reaction rates in biochemical reaction networks. The exploration of high dimensional networks with such methods is currently hampered by software performance limitations. DistributedFBA.jl is a high-level, high-performance, open-source implementation of flux balance analysis in Julia. It is tailored to solve multiple flux balance analyses on a subset or all the reactions of large and huge-scale networks, on any number of threads or nodes.
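Each flux balance analysis that DistributedFBA.jl distributes is a single linear program: maximize an objective flux subject to steady-state mass balance S v = 0 and flux bounds. A hedged toy example in Python/SciPy follows (not the Julia API; the network is invented).

```python
# One flux balance analysis as a linear program on an invented toy network.
import numpy as np
from scipy.optimize import linprog

# Reactions: v0 uptake->A, v1 A->B, v2 B->out (objective), v3 A->C, v4 C->out
S = np.array([
    [1, -1,  0, -1,  0],   # metabolite A
    [0,  1, -1,  0,  0],   # metabolite B
    [0,  0,  0,  1, -1],   # metabolite C
])
c = np.array([0, 0, -1, 0, 0])           # maximize v2 -> minimize -v2
bounds = [(0, 10), (0, 10), (0, 10), (0, 1), (0, 10)]

res = linprog(c, A_eq=S, b_eq=np.zeros(3), bounds=bounds)
print("optimal objective flux v2 =", res.x[2])   # 10.0 for this toy model
```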
Durkin, Gregory J
2010-01-01
A wide variety of evaluation formats are available for new graduate nurses, but most of them are single-point evaluation tools that do not provide a clear picture of progress for orientee or educator. This article describes the development of a Web-based evaluation tool that combines learning taxonomies with the Synergy model into a rating scale based on independent performance. The evaluation tool and process provides open 24/7 access to evaluation documentation for members of the orientation team, demystifying the process and clarifying expectations. The implementation of the tool has proven to be transformative in the perceptions of evaluation and performance expectations of new graduates. This tool has been successful at monitoring progress, altering education, and opening dialogue about performance for over 125 new graduate nurses since inception.
The fastclime Package for Linear Programming and Large-Scale Precision Matrix Estimation in R.
Pang, Haotian; Liu, Han; Vanderbei, Robert
2014-02-01
We develop an R package fastclime for solving a family of regularized linear programming (LP) problems. Our package efficiently implements the parametric simplex algorithm, which provides a scalable and sophisticated tool for solving large-scale linear programs. As an illustrative example, one use of our LP solver is to implement an important sparse precision matrix estimation method called CLIME (Constrained L1 Minimization Estimator). Compared with existing packages for this problem such as clime and flare, our package has three advantages: (1) it efficiently calculates the full piecewise-linear regularization path; (2) it provides an accurate dual certificate as stopping criterion; (3) it is completely coded in C and is highly portable. This package is designed to be useful to statisticians and machine learning researchers for solving a wide range of problems.
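CLIME's column-wise problem is itself a small LP, which is why an LP solver is the natural engine. The following Python sketch (not the fastclime R package, and using SciPy's generic solver rather than a parametric simplex) recovers one column of the precision matrix: minimize ||b||_1 subject to ||Sigma_hat @ b - e_j||_inf <= lam.

```python
# Hedged sketch of the LP at the heart of CLIME, solved with SciPy.
import numpy as np
from scipy.optimize import linprog

def clime_column(Sigma, j, lam):
    p = Sigma.shape[0]
    e = np.zeros(p); e[j] = 1.0
    # Split b = u - w with u, w >= 0 to obtain a standard-form LP.
    c = np.ones(2 * p)
    A_ub = np.block([[Sigma, -Sigma], [-Sigma, Sigma]])
    b_ub = np.concatenate([e + lam, lam - e])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (2 * p))
    return res.x[:p] - res.x[p:]

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
Sigma_hat = np.cov(X, rowvar=False)
print(np.round(clime_column(Sigma_hat, 0, lam=0.1), 3))
```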
Putnam, Robert F; Knoster, Tim
2016-03-01
In the previous issue of Behavior Analysis in Practice (May 2015), a special section of the journal was devoted to positive behavior intervention and support (PBIS). Horner and Sugai (2015) published a manuscript providing an overview of school-wide PBIS describing how PBIS is an example of applied behavior analysis at a scale of social importance. A number of manuscripts providing commentary on the Horner and Sugai manuscript were also published in this special section of the journal. This paper will review this PBIS manuscript along with the associated commentaries published in the May 2015 special section.
Means, Arianna Rubin; Ajjampur, Sitara S R; Bailey, Robin; Galactionova, Katya; Gwayi-Chore, Marie-Claire; Halliday, Katherine; Ibikounle, Moudachirou; Juvekar, Sanjay; Kalua, Khumbo; Kang, Gagandeep; Lele, Pallavi; Luty, Adrian J F; Pullan, Rachel; Sarkar, Rajiv; Schär, Fabian; Tediosi, Fabrizio; Weiner, Bryan J; Yard, Elodie; Walson, Judd
2018-01-01
Hybrid trials that include both clinical and implementation science outcomes are increasingly relevant for public health researchers that aim to rapidly translate study findings into evidence-based practice. The DeWorm3 Project is a series of hybrid trials testing the feasibility of interrupting the transmission of soil transmitted helminths (STH), while conducting implementation science research that contextualizes clinical research findings and provides guidance on opportunities to optimize delivery of STH interventions. The purpose of DeWorm3 implementation science studies is to ensure rapid and efficient translation of evidence into practice. DeWorm3 will use stakeholder mapping to identify individuals who influence or are influenced by school-based or community-wide mass drug administration (MDA) for STH and to evaluate network dynamics that may affect study outcomes and future policy development. Individual interviews and focus groups will generate the qualitative data needed to identify factors that shape, contextualize, and explain DeWorm3 trial outputs and outcomes. Structural readiness surveys will be used to evaluate the factors that drive health system readiness to implement novel interventions, such as community-wide MDA for STH, in order to target change management activities and identify opportunities for sustaining or scaling the intervention. Process mapping will be used to understand what aspects of the intervention are adaptable across heterogeneous implementation settings and to identify contextually-relevant modifiable bottlenecks that may be addressed to improve the intervention delivery process and to achieve intervention outputs. Lastly, intervention costs and incremental cost-effectiveness will be evaluated to compare the efficiency of community-wide MDA to standard-of-care targeted MDA both over the duration of the trial and over a longer elimination time horizon.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Porter, Russell G.; Winther, Eric C.; Fox, Lyle G.
2003-03-01
This report presents results for year eleven in a basin-wide program to harvest northern pikeminnow (Ptychocheilus oregonensis). This program was started in an effort to reduce predation by northern pikeminnow on juvenile salmonids during their emigration from natal streams to the ocean. Earlier work in the Columbia River Basin suggested predation by northern pikeminnow on juvenile salmonids might account for most of the 10-20% mortality juvenile salmonids experience in each of eight Columbia River and Snake River reservoirs. Modeling simulations based on work in John Day Reservoir from 1982 through 1988 indicated that, if predator-size northern pikeminnow were exploited at a 10-20% rate, the resulting restructuring of their population could reduce their predation on juvenile salmonids by 50%. To test this hypothesis, we implemented a sport-reward angling fishery and a commercial longline fishery in the John Day Pool in 1990. We also conducted an angling fishery in areas inaccessible to the public at four dams on the mainstem Columbia River and at Ice Harbor Dam on the Snake River. Based on the success of these limited efforts, we implemented three test fisheries on a system-wide scale in 1991--a tribal longline fishery above Bonneville Dam, a sport-reward fishery, and a dam-angling fishery. Low catch of target fish and high cost of implementation resulted in discontinuation of the tribal longline fishery. However, the sport-reward and dam-angling fisheries were continued in 1992 and 1993. In 1992, we investigated the feasibility of implementing a commercial longline fishery in the Columbia River below Bonneville Dam and found that implementation of this fishery was also infeasible.
NASA Astrophysics Data System (ADS)
Higashino, Satoru; Kobayashi, Shoei; Yamagami, Tamotsu
2007-06-01
High data transfer rates have been demanded of data storage devices along with increasing storage capacity. In order to increase the transfer rate, high-speed data processing techniques in read-channel devices are required. Generally, parallel architecture is utilized for high-speed digital processing. We have developed a new architecture of Interpolated Timing Recovery (ITR) to achieve a high-speed data transfer rate and wide capture range in read-channel devices for information storage channels. It facilitates parallel implementation on large-scale-integration (LSI) devices.
Li, Jianan; Prodinger, Birgit; Reinhardt, Jan D; Stucki, Gerold
2016-06-13
In 2011 the Chinese leadership in rehabilitation, in collaboration with the International Classification of Functioning, Disability and Health (ICF) Research Branch, embarked on an effort towards the system-wide implementation of the ICF in the healthcare system in China. We report here on the lessons learned from the pilot phase of testing the ICF Generic Set, a parsimonious set of 7 ICF categories, which have been shown to best describe functioning across the general population and people with various health conditions, for use in routine clinical practice in China. The paper discusses whether classification and measurement are compatible, what number of ICF categories should be included in data collection in routine practice, and the usefulness of a functioning profile and functioning score in clinical practice and health research planning. In addition, the paper reflects on the use of ICF qualifiers in a rating scale and the particularities of certain ICF categories contained in the ICF Generic Set when used as items in the context of Chinese rehabilitation and healthcare. Finally, the steps required to enhance the utility of system-wide implementation of the ICF in rehabilitation and healthcare services are set out.
Measures of Agreement Between Many Raters for Ordinal Classifications
Nelson, Kerrie P.; Edwards, Don
2015-01-01
Screening and diagnostic procedures often require a physician's subjective interpretation of a patient's test result using an ordered categorical scale to define the patient's disease severity. Due to wide variability observed between physicians’ ratings, many large-scale studies have been conducted to quantify agreement between multiple experts’ ordinal classifications in common diagnostic procedures such as mammography. However, very few statistical approaches are available to assess agreement in these large-scale settings. Existing summary measures of agreement rely on extensions of Cohen's kappa [1-5]. These are prone to prevalence and marginal distribution issues, become increasingly complex for more than three experts or are not easily implemented. Here we propose a model-based approach to assess agreement in large-scale studies based upon a framework of ordinal generalized linear mixed models. A summary measure of agreement is proposed for multiple experts assessing the same sample of patients’ test results according to an ordered categorical scale. This measure avoids some of the key flaws associated with Cohen's kappa and its extensions. Simulation studies are conducted to demonstrate the validity of the approach with comparison to commonly used agreement measures. The proposed methods are easily implemented using the software package R and are applied to two large-scale cancer agreement studies. PMID:26095449
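For contrast with the model-based measure proposed above, the baseline it improves on is easy to state in code. This Python sketch implements plain two-rater Cohen's kappa on a toy ordinal dataset; the prevalence sensitivity visible in its chance-agreement term is one of the flaws the abstract cites.

```python
# Two-rater Cohen's kappa in NumPy on a toy 4-level ordinal scale. The
# chance term depends on rater marginals, which drives prevalence issues.
import numpy as np

def cohens_kappa(r1, r2, n_categories):
    table = np.zeros((n_categories, n_categories))
    for a, b in zip(r1, r2):
        table[a, b] += 1
    table /= table.sum()
    p_obs = np.trace(table)                           # observed agreement
    p_chance = table.sum(axis=1) @ table.sum(axis=0)  # marginal product
    return (p_obs - p_chance) / (1.0 - p_chance)

r1 = [0, 1, 2, 3, 1, 1, 2, 0, 3, 2]   # rater 1 severity ratings
r2 = [0, 1, 2, 2, 1, 0, 2, 0, 3, 3]   # rater 2 severity ratings
print(round(cohens_kappa(r1, r2, 4), 3))   # 0.6
```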
Financing a large-scale picture archival and communication system.
Goldszal, Alberto F; Bleshman, Michael H; Bryan, R Nick
2004-01-01
An attempt to finance a large-scale multi-hospital picture archival and communication system (PACS) solely based on cost savings from current film operations is reported. A modified Request for Proposal described the technical requirements, PACS architecture, and performance targets. The Request for Proposal was complemented by a set of desired financial goals-the main one being the ability to use film savings to pay for the implementation and operation of the PACS. Financing of the enterprise-wide PACS was completed through an operating lease agreement including all PACS equipment, implementation, service, and support for an 8-year term, much like a complete outsourcing. Equipment refreshes, both hardware and software, are included. Our agreement also linked the management of the digital imaging operation (PACS) and the traditional film printing, shifting the operational risks of continued printing and costs related to implementation delays to the PACS vendor. An additional optimization step provided the elimination of the negative film budget variances in the beginning of the project when PACS costs tend to be higher than film and film-related expenses. An enterprise-wide PACS has been adopted to achieve clinical workflow improvements and cost savings. PACS financing was solely based on film savings, which included the entire digital solution (PACS) and any residual film printing. These goals were achieved with simultaneous elimination of any over-budget scenarios providing a non-negative cash flow in each year of an 8-year term.
SQC: secure quality control for meta-analysis of genome-wide association studies.
Huang, Zhicong; Lin, Huang; Fellay, Jacques; Kutalik, Zoltán; Hubaux, Jean-Pierre
2017-08-01
Due to the limited power of small-scale genome-wide association studies (GWAS), researchers tend to collaborate and establish a larger consortium in order to perform large-scale GWAS. Genome-wide association meta-analysis (GWAMA) is a statistical tool that aims to synthesize results from multiple independent studies to increase the statistical power and reduce false-positive findings of GWAS. However, it has been demonstrated that the aggregate data of individual studies are subject to inference attacks, hence privacy concerns arise when researchers share study data in GWAMA. In this article, we propose a secure quality control (SQC) protocol, which enables checking the quality of data in a privacy-preserving way without revealing sensitive information to a potential adversary. SQC employs state-of-the-art cryptographic and statistical techniques for privacy protection. We implement the solution in a meta-analysis pipeline with real data to demonstrate the efficiency and scalability on commodity machines. The distributed execution of SQC on a cluster of 128 cores for one million genetic variants takes less than one hour, which is a modest cost considering the 10-month time span usually observed for the completion of the QC procedure that includes timing of logistics. SQC is implemented in Java and is publicly available at https://github.com/acs6610987/secureqc. jean-pierre.hubaux@epfl.ch. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
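SQC itself is a Java tool for the privacy-preserving QC step, but the downstream GWAMA computation it protects is standard. Here is a hedged sketch of fixed-effect inverse-variance meta-analysis for a single variant, with invented per-study numbers:

```python
# Fixed-effect inverse-variance meta-analysis for one genetic variant.
import numpy as np
from scipy import stats

beta = np.array([0.12, 0.08, 0.15])   # per-study effect estimates (invented)
se = np.array([0.05, 0.04, 0.07])     # per-study standard errors (invented)

w = 1.0 / se**2                       # inverse-variance weights
beta_meta = np.sum(w * beta) / np.sum(w)
se_meta = np.sqrt(1.0 / np.sum(w))
z = beta_meta / se_meta
p = 2.0 * stats.norm.sf(abs(z))
print(f"beta={beta_meta:.4f}  se={se_meta:.4f}  p={p:.2e}")
```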
Ethics of Implementing Electronic Health Records in Developing Countries: Points to Consider
Were, Martin C.; Meslin, Eric M.
2011-01-01
Electronic Health Record systems (EHRs) are increasingly being used in many developing countries, several of which have moved beyond isolated pilot projects to active large-scale implementation as part of their national health strategies. Despite growing enthusiasm for adopting EHRs in resource poor settings, almost no attention has been paid to the ethical issues that might arise. In this article we argue that these ethical issues should be addressed now if EHRs are to be appropriately implemented in these settings. We take a systematic approach guided by a widely accepted ethical framework currently in use for developing countries to first describe the ethical issues, and then propose a set of ‘Points to Consider’ to guide further thinking and decision-making. PMID:22195214
Kenow, Kevin P.; Gretchen Benjamin,; Tim Schlagenhaft,; Ruth Nissen,; Mary Stefanski,; Gary Wege,; Scott A. Jutila,; Newton, Teresa J.
2016-01-01
The Upper Mississippi River (UMR) has been developed and subsequently managed for commercial navigation by the U.S. Army Corps of Engineers (USACE). The navigation pools created by a series of lock and dams initially provided a complex of aquatic habitats that supported a variety of fish and wildlife. However, biological productivity declined as the pools aged. The River Resources Forum, an advisory body to the St. Paul District of the USACE, established a multiagency Water Level Management Task Force (WLMTF) to evaluate the potential of water level management to improve ecological function and restore the distribution and abundance of fish and wildlife habitat. The WLMTF identified several water level management options and concluded that summer growing season drawdowns at the pool scale offered the greatest potential to provide habitat benefits over a large area. Here we summarize the process followed to plan and implement pool-wide drawdowns on the UMR, including involvement of stakeholders in decision making, addressing requirements to modify reservoir operating plans, development and evaluation of drawdown alternatives, pool selection, establishment of a monitoring plan, interagency coordination, and a public information campaign. Three pool-wide drawdowns were implemented within the St. Paul District and deemed successful in providing ecological benefits without adversely affecting commercial navigation and recreational use of the pools. Insights are provided based on more than 17 years of experience in planning and implementing drawdowns on the UMR.
Multi-scale Slip Inversion Based on Simultaneous Spatial and Temporal Domain Wavelet Transform
NASA Astrophysics Data System (ADS)
Liu, W.; Yao, H.; Yang, H. Y.
2017-12-01
Finite fault inversion is a widely used method to study earthquake rupture processes. Some previous studies have proposed different methods to implement finite fault inversion, including time-domain, frequency-domain, and wavelet-domain methods. Many previous studies have found that different frequency bands show different characteristics of the seismic rupture (e.g., Wang and Mori, 2011; Yao et al., 2011, 2013; Uchide et al., 2013; Yin et al., 2017). Generally, lower frequency waveforms correspond to larger-scale rupture characteristics while higher frequency data are representative of smaller-scale ones. Therefore, multi-scale analysis can help us understand the earthquake rupture process thoroughly from larger scale to smaller scale. By the use of wavelet transform, the wavelet-domain methods can analyze both the time and frequency information of signals in different scales. Traditional wavelet-domain methods (e.g., Ji et al., 2002) implement finite fault inversion with both lower and higher frequency signals together to recover larger-scale and smaller-scale characteristics of the rupture process simultaneously. Here we propose an alternative strategy with a two-step procedure, i.e., first constraining the larger-scale characteristics with lower frequency signals, and then resolving the smaller-scale ones with higher frequency signals. We have designed synthetic tests to verify our strategy and compare it with the traditional one. We have also applied our strategy to study the 2015 Gorkha Nepal earthquake using tele-seismic waveforms. Both the traditional method and our two-step strategy analyze the data only in different temporal scales (i.e., different frequency bands), while the spatial distribution of model parameters also shows multi-scale characteristics. A more sophisticated strategy is to transfer the slip model into different spatial scales, and then analyze the smooth slip distribution (larger scales) with lower frequency data first and the more detailed slip distribution (smaller scales) with higher frequency data subsequently. We are now implementing the slip inversion using both spatial and temporal domain wavelets. This multi-scale analysis can help us better understand frequency-dependent rupture characteristics of large earthquakes.
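The two-step strategy (constrain large scales from low-frequency data first, then resolve small scales from the higher-frequency residual) can be illustrated outside seismology. The Python sketch below is only a toy frequency-band separation on a synthetic signal, not a slip inversion; the cutoffs and signal are arbitrary.

```python
# Toy two-step frequency-band separation with numpy FFT filtering.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 512, endpoint=False)
fs = 512.0                                   # sampling frequency
truth = np.sin(2*np.pi*1.5*t) + 0.3*np.sin(2*np.pi*30.0*t)
data = truth + 0.05 * rng.normal(size=t.size)

def lowpass(x, f_cut):
    X = np.fft.rfft(x)
    f = np.fft.rfftfreq(x.size, d=1.0/fs)
    X[f > f_cut] = 0.0
    return np.fft.irfft(X, n=x.size)

coarse = lowpass(data, 5.0)                  # step 1: large-scale part
fine = lowpass(data - coarse, 60.0)          # step 2: smaller-scale part
print("unmodelled residual rms:", np.std(data - coarse - fine))
```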
Rani, Manju; Nusrat, Sharmin; Hawken, Laura H
2012-10-16
Segmented service delivery with consequent inefficiencies in health systems was one of the main concerns raised during scaling up of disease-specific programs in the last two decades. The organized response to NCDs is in its infancy in most LMICs, with little evidence on how the response is evolving in terms of institutional arrangements and policy development processes. Drawing on qualitative review of policy and program documents from five LMICs and data from global key-informant surveys conducted in 2004 and 2010, we examine the current status of governance of the response to NCDs at the national level along three dimensions: institutional arrangements for stewardship and program management and implementation; policies/plans; and multisectoral coordination and partnerships. Several positive trends were noted in the organization and governance of the response to NCDs: a shift from specific NCD-based programs to integrated NCD programs, increasing inclusion of NCDs in sector-wide health plans, and establishment of high-level multisectoral coordination mechanisms. Several areas of concern were identified. The evolving NCD-specific institutional structures are being treated as 'program management and implementation' entities rather than as lead 'technical advisory' bodies, with unclear division of roles and responsibilities between NCD-specific and sector-wide structures. NCD-specific and sector-wide plans are poorly aligned and lack prioritization, costing, and appropriate targets. Finally, the effectiveness of existing multisectoral coordination mechanisms remains questionable. The 'technical functions' and 'implementation and management functions' should be clearly separated between NCD-specific units and sector-wide institutional structures to avoid duplicative, segmented service delivery systems. Institutional capacity building efforts for NCDs should target both NCD-specific units (for building technical and analytical capacity) and sector-wide organizational units (for building program management and implementation capacity) in the MOH. The sector-wide health plans should reflect NCDs in proportion to their public health importance. NCD-specific plans should be developed in close consultation with sector-wide health and non-health stakeholders. These plans should expand on the directions provided by sector-wide health plans, specifying strategically prioritized, fully costed activities and realistic quantifiable targets for NCD control linked with the sector-wide expenditure framework. Multisectoral coordination mechanisms need to be strengthened with optimal decision-making powers, resource commitment, and monitoring of their outputs.
Efficient data management in a large-scale epidemiology research project.
Meyer, Jens; Ostrzinski, Stefan; Fredrich, Daniel; Havemann, Christoph; Krafczyk, Janina; Hoffmann, Wolfgang
2012-09-01
This article describes the concept of a "Central Data Management" (CDM) and its implementation within the large-scale population-based medical research project "Personalized Medicine". The CDM can be summarized as a conjunction of data capturing, data integration, data storage, data refinement, and data transfer. A wide spectrum of reliable "Extract Transform Load" (ETL) software for automatic integration of data as well as "electronic Case Report Forms" (eCRFs) was developed, in order to integrate decentralized and heterogeneously captured data. Due to the high sensitivity of the captured data, high system resource availability, data privacy, data security and quality assurance are of utmost importance. A complex data model was developed and implemented using an Oracle database in high availability cluster mode in order to integrate different types of participant-related data. Intelligent data capturing and storage mechanisms are improving the quality of data. Data privacy is ensured by a multi-layered role/right system for access control and de-identification of identifying data. A well defined backup process prevents data loss. Over the period of one and a half years, the CDM has captured a wide variety of data in the magnitude of approximately 5 terabytes without experiencing any critical incidents of system breakdown or loss of data. The aim of this article is to demonstrate one possible way of establishing a Central Data Management in large-scale medical and epidemiological studies. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
Synthetic biology: advancing the design of diverse genetic systems
Wang, Yen-Hsiang; Wei, Kathy Y.; Smolke, Christina D.
2013-01-01
A main objective of synthetic biology is to make the process of designing genetically-encoded biological systems more systematic, predictable, robust, scalable, and efficient. The examples of genetic systems in the field vary widely in terms of operating hosts, compositional approaches, and network complexity, ranging from a simple genetic switch to search-and-destroy systems. While significant advances in synthesis capabilities support the potential for the implementation of pathway- and genome-scale programs, several design challenges currently restrict the scale of systems that can be reasonably designed and implemented. Synthetic biology offers much promise in developing systems to address challenges faced in manufacturing, the environment and sustainability, and health and medicine, but the realization of this potential is currently limited by the diversity of available parts and effective design frameworks. As researchers make progress in bridging this design gap, advances in the field hint at ever more diverse applications for biological systems. PMID:23413816
Accessible methods for the dynamic time-scale decomposition of biochemical systems.
Surovtsova, Irina; Simus, Natalia; Lorenz, Thomas; König, Artjom; Sahle, Sven; Kummer, Ursula
2009-11-01
The growing complexity of biochemical models calls for means to rationally dissect the networks into meaningful and rather independent subnetworks. Such an approach should ensure an understanding of the system without resorting to heuristics. Important for the success of such an approach are its accessibility and the clarity of the presentation of the results. In order to achieve this goal, we developed a method which is a modification of the classical approach of time-scale separation. This modified method as well as the more classical approach have been implemented for time-dependent application within the widely used software COPASI. The implementation includes different possibilities for the representation of the results, including 3D-visualization. The methods are included in COPASI, which is free for academic use and available at www.copasi.org. irina.surovtsova@bioquant.uni-heidelberg.de Supplementary data are available at Bioinformatics online.
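The classical time-scale separation this method builds on can be shown in a few lines: eigenvalues of the system Jacobian split the dynamics into fast and slow modes. A hedged toy example in Python (a made-up linear two-species system, not COPASI's algorithm):

```python
# Time-scale separation in miniature: Jacobian eigenvalues split the
# dynamics into a fast relaxation mode and a conserved (zero) mode.
import numpy as np

k_fast, k_slow = 1000.0, 1.0
# dx0/dt = -k_fast*x0 + k_slow*x1 ;  dx1/dt = k_fast*x0 - k_slow*x1
J = np.array([[-k_fast,  k_slow],
              [ k_fast, -k_slow]])

eigvals = np.linalg.eigvals(J)
print(eigvals)                        # ~[-1001, 0]
fast = eigvals[np.abs(eigvals) > 1e-9]
print("fast time scale:", 1.0 / np.abs(fast[0]), "s")  # ~1 ms
```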
A Locality-Based Threading Algorithm for the Configuration-Interaction Method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shan, Hongzhang; Williams, Samuel; Johnson, Calvin
2017-07-03
The Configuration Interaction (CI) method has been widely used to solve the non-relativistic many-body Schrödinger equation. One great challenge to implementing it efficiently on manycore architectures is its immense memory and data movement requirements. To address this issue, within each node, we exploit a hybrid MPI+OpenMP programming model in lieu of the traditional flat MPI programming model. In this paper, we develop optimizations that partition the workloads among OpenMP threads based on data locality, which is essential in ensuring that applications with complex data access patterns scale well on manycore architectures. The new algorithm scales to 256 threads on the 64-core Intel Knights Landing (KNL) manycore processor and 24 threads on dual-socket Ivy Bridge (Xeon) nodes. Compared with the original implementation, the performance has been improved by up to 7× on the Knights Landing processor and 3× on the dual-socket Ivy Bridge node.
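The locality principle in this abstract can be illustrated with a small sketch: give each thread a contiguous block of the data so its memory traffic stays local. This shows only the general idea in Python's threading terms, not the paper's MPI+OpenMP CI kernel.

    from concurrent.futures import ThreadPoolExecutor

    import numpy as np

    def row_block_sum(matrix: np.ndarray, n_threads: int = 4) -> float:
        """Each thread reduces one contiguous block of rows (locality-friendly)."""
        bounds = np.linspace(0, matrix.shape[0], n_threads + 1, dtype=int)
        with ThreadPoolExecutor(max_workers=n_threads) as pool:
            partials = pool.map(
                lambda i: float(matrix[bounds[i]:bounds[i + 1]].sum()),
                range(n_threads),
            )
        return sum(partials)

    print(row_block_sum(np.ones((1024, 1024))))  # 1048576.0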
Trietsch, Jasper; van Steenkiste, Ben; Hobma, Sjoerd; Frericks, Arnoud; Grol, Richard; Metsemakers, Job; van der Weijden, Trudy
2014-12-01
A quality improvement strategy consisting of comparative feedback and peer review embedded in available local quality improvement collaboratives proved to be effective in changing the test-ordering behaviour of general practitioners. However, implementing this strategy was problematic. We aimed for large-scale implementation of an adapted strategy covering both test ordering and prescribing performance. Because we failed to achieve large-scale implementation, the aim of this study was to describe and analyse the challenges of the transferring process. In a qualitative study 19 regional health officers, pharmacists, laboratory specialists and general practitioners were interviewed within 6 months after the transfer period. The interviews were audiotaped, transcribed and independently coded by two of the authors. The codes were matched to the dimensions of the normalization process theory. The general idea of the strategy was widely supported, but generating the feedback was more complex than expected and the need for external support after transfer of the strategy remained high because participants did not assume responsibility for the work and the distribution of resources that came with it. Evidence on effectiveness, a national infrastructure for these collaboratives and a general positive attitude were not sufficient for normalization. Thinking about managing large databases, responsibility for tasks and distribution of resources should start as early as possible when planning complex quality improvement strategies. Merely exploring the barriers and facilitators experienced in a preceding trial is not sufficient. Although multifaceted implementation strategies to change professional behaviour are attractive, their inherent complexity is also a pitfall for large-scale implementation. © 2014 John Wiley & Sons, Ltd.
Shuman, Clayton J; Liu, Xuefeng; Aebersold, Michelle L; Tschannen, Dana; Banaszak-Holl, Jane; Titler, Marita G
2018-04-25
Nurse managers have a pivotal role in fostering unit climates supportive of implementing evidence-based practices (EBPs) in care delivery. EBP leadership behaviors and competencies of nurse managers, and their impact on practice climates, are widely overlooked in implementation science. The purpose of this study was to examine the contributions of nurse manager EBP leadership behaviors and nurse manager EBP competencies in explaining unit climates for EBP implementation in adult medical-surgical units. A multi-site, multi-unit cross-sectional research design was used to recruit a sample of 24 nurse managers and 553 randomly selected staff nurses from 24 adult medical-surgical units in 7 acute care hospitals in the Northeast and Midwestern USA. Staff nurse perceptions of nurse manager EBP leadership behaviors and unit climates for EBP implementation were measured using the Implementation Leadership Scale and Implementation Climate Scale, respectively. EBP competencies of nurse managers were measured using the Nurse Manager EBP Competency Scale. Participants were emailed a link to an electronic questionnaire and asked to respond within 1 month. The contributions of nurse manager EBP leadership behaviors and competencies in explaining unit climates for EBP implementation were estimated using mixed-effects models controlling for nurse education and years of experience on the current unit, and accounting for variability across hospitals and units. The significance level was set at α < .05. Two hundred sixty-four staff nurses and 22 nurse managers were included in the final sample, representing 22 units in 7 hospitals. Nurse manager EBP leadership behaviors (p < .001) and EBP competency (p = .008) explained 52.4% of the marginal variance in unit climate for EBP implementation. Leadership behaviors uniquely explained 45.2% of the variance. The variance accounted for by the random intercepts for hospitals and units (p < .001) and by years of nursing experience on the current unit (p < .05) was significant, but level of nursing education was not. Nurse managers are significantly related to unit climates for EBP implementation primarily through their leadership behaviors. Future implementation studies should consider the leadership of nurse managers in creating climates supportive of EBP implementation.
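For readers unfamiliar with the modeling approach, here is a minimal sketch of a random-intercept mixed-effects model of the kind described, written with statsmodels. The column names and data file are hypothetical, and the study's full specification (units nested within hospitals) would need an additional variance component.

    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("ebp_survey.csv")  # hypothetical data file

    # Unit climate regressed on leadership and competency scores, with a random
    # intercept per unit to absorb between-unit variability.
    model = smf.mixedlm(
        "implementation_climate ~ leadership_score + competency_score + years_on_unit",
        data=df,
        groups=df["unit_id"],
    )
    print(model.fit().summary())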
Multi Length Scale Finite Element Design Framework for Advanced Woven Fabrics
NASA Astrophysics Data System (ADS)
Erol, Galip Ozan
Woven fabrics are integral parts of many engineering applications, spanning from personal protective garments to surgical scaffolds. They provide a wide range of opportunities in designing advanced structures because of their high tenacity, flexibility, high strength-to-weight ratios, and versatility. These advantages result from their inherent multi-scale nature, where filaments are bundled together to create yarns while the yarns are arranged into different weave architectures. Their highly versatile nature opens up the potential for a wide range of mechanical properties, which can be adjusted based on the application. While woven fabrics are viable options for the design of various engineering systems, it is important to understand the underlying deformation mechanisms and the associated highly nonlinear mechanical response. However, the multiscale nature of woven fabrics, and the relationships between these scales, make the design process a challenging task. The objective of this work is to develop a multiscale numerical design framework using experimentally validated mesoscopic and macroscopic length-scale approaches, identifying important deformation mechanisms and recognizing the nonlinear mechanical response of woven fabrics. This framework is exercised by developing mesoscopic length-scale constitutive models to investigate plain weave fabric response under a wide range of loading conditions. A hyperelastic, transversely isotropic yarn material model with transverse material nonlinearity is developed for woven yarns (commonly used in personal protection garments). The material properties/parameters are determined through an inverse method in which unit cell finite element simulations are coupled with experiments. The developed yarn material model is validated by simulating full-scale uniaxial tensile, bias extension, and indentation experiments, and comparing to the experimentally observed mechanical response and deformation mechanisms. Moreover, mesoscopic unit cell finite elements are coupled with a design-of-experiments method to systematically identify the yarn material properties important for the macroscale response of various weave architectures. To demonstrate the macroscopic length-scale approach, two new material models for woven fabrics were developed. The Planar Material Model (PMM) utilizes two important deformation mechanisms in woven fabrics: (1) yarn elongation, and (2) relative yarn rotation due to shear loads. The yarns' uniaxial tensile response is modeled with a nonlinear spring using constitutive relations, while a nonlinear rotational spring is implemented to define the fabric's shear stiffness. The second material model, the Sawtooth Material Model (SMM), adopts a sawtooth geometry while recognizing the biaxial nature of woven fabrics by implementing the interactions between the yarns. Material properties/parameters required by both PMM and SMM can be determined directly from standard experiments. Both macroscopic material models are implemented within an explicit finite element code and validated by comparing to experiments. The developed macroscopic material models are then compared under various loading conditions to determine their accuracy. Finally, the numerical models developed at the mesoscopic and macroscopic length scales are linked, demonstrating the new systematic design framework involving linked mesoscopic and macroscopic length-scale modeling approaches. The approach is demonstrated with both the Planar and Sawtooth Material Models, and the simulation results are verified by comparing the results obtained from the meso- and macro-scale models.
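A minimal sketch of the two PMM ingredients named above: a nonlinear axial spring for yarn tension and a nonlinear rotational spring for shear. The polynomial forms and coefficients are illustrative placeholders, not the dissertation's calibrated constitutive relations.

    import numpy as np

    def yarn_tension(strain: float, k1: float = 50.0, k2: float = 400.0) -> float:
        """Stiffening nonlinear spring for yarn elongation."""
        return k1 * strain + k2 * strain**3

    def shear_moment(angle_rad: float, c1: float = 0.1, c2: float = 5.0) -> float:
        """Nonlinear rotational spring resisting relative yarn rotation."""
        return c1 * angle_rad + c2 * angle_rad**3

    print(yarn_tension(0.02), shear_moment(np.deg2rad(20.0)))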
Initial Implementation Indicators From a Statewide Rollout of SafeCare Within a Child Welfare System
Whitaker, Daniel J.; Ryan, Kerry A.; Wild, Robert C.; Self-Brown, Shannon; Lutzker, John R.; Shanley, Jenelle R.; Edwards, Anna M.; McFry, Erin A.; Moseley, Colby N.; Hodges, Amanda E.
2013-01-01
There is a strong movement toward implementation of evidence-based practices (EBP) in child welfare systems. The SafeCare parenting model is one of few parent-training models that addresses child neglect, the most common form of maltreatment. Here, the authors describe initial findings from a statewide effort to implement the EBP, SafeCare®, into a state child welfare system. A total of 50 agencies participated in training, with 295 individuals entering training to implement SafeCare. Analyses were conducted to describe the trainee sample, describe initial training and implementation indicators, and to examine correlates of initial training performance and implementation indicators. The quality of SafeCare uptake during training and implementation was high with trainees performing very well on training quizzes and role-plays, and demonstrating high fidelity when implementing SafeCare in the field (performing over 90% of expected behaviors). However, the quantity of implementation was generally low, with relatively few providers (only about 25%) implementing the model following workshop training. There were no significant predictors of training or implementation performance, once corrections for multiple comparisons were applied. The Discussion focuses on challenges to large-scale system-wide implementation of EBP. PMID:22146860
ePlant and the 3D data display initiative: integrative systems biology on the world wide web.
Fucile, Geoffrey; Di Biase, David; Nahal, Hardeep; La, Garon; Khodabandeh, Shokoufeh; Chen, Yani; Easley, Kante; Christendat, Dinesh; Kelley, Lawrence; Provart, Nicholas J
2011-01-10
Visualization tools for biological data are often limited in their ability to interactively integrate data at multiple scales. These computational tools are also typically limited by two-dimensional displays and programmatic implementations that require separate configurations for each of the user's computing devices and recompilation for functional expansion. Towards overcoming these limitations we have developed "ePlant" (http://bar.utoronto.ca/eplant) - a suite of open-source world wide web-based tools for the visualization of large-scale data sets from the model organism Arabidopsis thaliana. These tools display data spanning multiple biological scales on interactive three-dimensional models. Currently, ePlant consists of the following modules: a sequence conservation explorer that includes homology relationships and single nucleotide polymorphism data, a protein structure model explorer, a molecular interaction network explorer, a gene product subcellular localization explorer, and a gene expression pattern explorer. The ePlant's protein structure explorer module represents experimentally determined and theoretical structures covering >70% of the Arabidopsis proteome. The ePlant framework is accessed entirely through a web browser, and is therefore platform-independent. It can be applied to any model organism. To facilitate the development of three-dimensional displays of biological data on the world wide web we have established the "3D Data Display Initiative" (http://3ddi.org).
Ceramics Derived from Organo-Metallic Precursors
1991-10-01
spraying, and roller-coating may also be used to good effect. The films deposited by any of these techniques are ready to be fired immediately...films. The wet chemical route offers great potential for highly cost-effective processing; and the critical issue for its wide-scale implementation is...glasses and subsequently to crystallize single-phase HTSC materials. The fourth composition, 4223, was made in order to test the effect of Bi on glass
BioC implementations in Go, Perl, Python and Ruby.
Liu, Wanli; Islamaj Doğan, Rezarta; Kwon, Dongseop; Marques, Hernani; Rinaldi, Fabio; Wilbur, W John; Comeau, Donald C
2014-01-01
As part of a community-wide effort for evaluating text mining and information extraction systems applied to the biomedical domain, BioC is focused on the goal of interoperability, currently a major barrier to wide-scale adoption of text mining tools. BioC is a simple XML format, specified by DTD, for exchanging data for biomedical natural language processing. With initial implementations in C++ and Java, BioC provides libraries of code for reading and writing BioC text documents and annotations. We extend BioC to Perl, Python, Go and Ruby. We used SWIG to extend the C++ implementation to Perl and to one Python implementation. A second Python implementation and the Ruby implementation use native data structures and libraries. BioC is also implemented in the Google language Go. BioC modules are functional in all of these languages, which can facilitate text mining tasks. BioC implementations are freely available through the BioC site: http://bioc.sourceforge.net. Database URL: http://bioc.sourceforge.net/ Published by Oxford University Press 2014. This work is written by US Government employees and is in the public domain in the US.
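Because BioC documents follow a fixed DTD (collection/document/passage), even a few lines of standard-library code can read them. The sketch below deliberately avoids the official BioC modules, and the file name is hypothetical.

    import xml.etree.ElementTree as ET

    def passages(path):
        """Yield (document id, passage offset, passage text) from a BioC XML file."""
        root = ET.parse(path).getroot()  # <collection>
        for doc in root.iter("document"):
            doc_id = doc.findtext("id")
            for passage in doc.iter("passage"):
                yield doc_id, passage.findtext("offset"), passage.findtext("text")

    for doc_id, offset, text in passages("corpus.bioc.xml"):
        print(doc_id, offset, (text or "")[:60])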
VariantSpark: population scale clustering of genotype information.
O'Brien, Aidan R; Saunders, Neil F W; Guo, Yi; Buske, Fabian A; Scott, Rodney J; Bauer, Denis C
2015-12-10
Genomic information is increasingly used in medical practice, giving rise to the need for efficient analysis methodology able to cope with thousands of individuals and millions of variants. The widely used Hadoop MapReduce architecture and associated machine learning library, Mahout, provide the means for tackling computationally challenging tasks. However, many genomic analyses do not fit the Map-Reduce paradigm. We therefore utilise the recently developed SPARK engine, along with its associated machine learning library, MLlib, which offers more flexibility in the parallelisation of population-scale bioinformatics tasks. The resulting tool, VARIANTSPARK, provides an interface from MLlib to the standard variant format (VCF), offers seamless genome-wide sampling of variants, and provides a pipeline for visualising results. To demonstrate the capabilities of VARIANTSPARK, we clustered more than 3,000 individuals with 80 million variants each to determine the population structure in the dataset. VARIANTSPARK is 80% faster than the SPARK-based genome clustering approach ADAM, than the comparable implementation using Hadoop/Mahout, and than ADMIXTURE, a commonly used tool for determining individual ancestries. It is over 90% faster than traditional implementations using R and Python. The benefits of speed, resource consumption and scalability enable VARIANTSPARK to open up the usage of advanced, efficient machine learning algorithms to genomic data.
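The core idea, representing each individual's genotypes as a numeric vector and clustering with Spark MLlib, can be sketched in PySpark as below. VARIANTSPARK itself handles VCF parsing and genome-wide sampling; the toy vectors here stand in for parsed alt-allele counts.

    from pyspark.ml.clustering import KMeans
    from pyspark.ml.linalg import Vectors
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("genotype-clustering").getOrCreate()

    # Toy stand-ins for per-individual genotype vectors (0/1/2 alt-allele counts).
    rows = [
        ("sample1", Vectors.dense([0, 1, 2, 0])),
        ("sample2", Vectors.dense([0, 1, 2, 1])),
        ("sample3", Vectors.dense([2, 0, 0, 2])),
    ]
    df = spark.createDataFrame(rows, ["sample", "features"])

    model = KMeans(k=2, seed=42).fit(df)          # cluster into putative populations
    model.transform(df).select("sample", "prediction").show()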
NASA Astrophysics Data System (ADS)
Lucas, Iris; Cotsaftis, Michel; Bertelle, Cyrille
2017-12-01
Multiagent systems (MAS) provide a useful tool for exploring the complex dynamics and behavior of financial markets, and the MAS approach is now widely implemented and documented in the empirical literature. This paper introduces the implementation of an innovative multi-scale mathematical model for a computational agent-based financial market. The paper develops a method to quantify the degree of self-organization which emerges in the system and shows that the capacity for self-organization is maximized when the agent behaviors are heterogeneous. Numerical results are presented and analyzed, showing how the global market behavior emerges from specific individual behavior interactions.
Why is the VLT Very Efficient?
NASA Astrophysics Data System (ADS)
Comerón, F.
2009-09-01
The operations model of the ESO Very Large Telescope (VLT) relies heavily on a full-scale implementation of Service Mode observing. In this contribution we review the main features of ESO's approach to Service Mode at the VLT, outline the advantages offered by this mode, and discuss the challenges faced when implementing it, given the wide diversity of instrumentation and instrument modes currently available at the VLT and the VLT Interferometer (VLTI). We give special emphasis to the part of this challenge directly derived from the evolution of atmospheric conditions, which drive the short-term scheduling of the different scientific programmes competing for the available time.
BlueSNP: R package for highly scalable genome-wide association studies using Hadoop clusters.
Huang, Hailiang; Tata, Sandeep; Prill, Robert J
2013-01-01
Computational workloads for genome-wide association studies (GWAS) are growing in scale and complexity, outpacing the capabilities of single-threaded software designed for personal computers. The BlueSNP R package implements GWAS statistical tests in the R programming language and executes the calculations across computer clusters configured with Apache Hadoop, a de facto standard framework for distributed data processing using the MapReduce formalism. BlueSNP makes computationally intensive analyses, such as estimating empirical p-values via data permutation and searching for expression quantitative trait loci over thousands of genes, feasible for large genotype-phenotype datasets. http://github.com/ibm-bioinformatics/bluesnp
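The permutation approach mentioned above is simple to state: shuffle phenotypes, recompute the test statistic, and count exceedances. A minimal serial sketch follows (BlueSNP's contribution is distributing exactly this workload over Hadoop); the data here are simulated.

    import numpy as np

    rng = np.random.default_rng(0)
    genotype = rng.integers(0, 3, size=500)            # toy SNP coded 0/1/2
    phenotype = 0.2 * genotype + rng.normal(size=500)  # simulated phenotype

    def assoc_stat(g, y):
        return abs(np.corrcoef(g, y)[0, 1])

    observed = assoc_stat(genotype, phenotype)
    perms = np.array(
        [assoc_stat(genotype, rng.permutation(phenotype)) for _ in range(10_000)]
    )
    # Add-one correction keeps the empirical p-value away from zero.
    p_empirical = (1 + (perms >= observed).sum()) / (1 + len(perms))
    print(p_empirical)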
HMC algorithm with multiple time scale integration and mass preconditioning
NASA Astrophysics Data System (ADS)
Urbach, C.; Jansen, K.; Shindler, A.; Wenger, U.
2006-01-01
We present a variant of the HMC algorithm with mass preconditioning (Hasenbusch acceleration) and multiple time scale integration. We have tested this variant for standard Wilson fermions at β=5.6 and at pion masses ranging from 380 to 680 MeV. We show that in this situation its performance is comparable to the recently proposed HMC variant with domain decomposition as preconditioner. We give an update of the "Berlin Wall" figure, comparing the performance of our variant of the HMC algorithm to other published performance data. Advantages of the HMC algorithm with mass preconditioning and multiple time scale integration are that it is straightforward to implement and can be used in combination with a wide variety of lattice Dirac operators.
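The multiple time scale idea can be sketched generically: integrate the cheap, fast force on a fine inner step nested inside outer steps of the expensive, slow force (a Sexton-Weingarten-style scheme). The toy below assumes unit mass and two harmonic force terms; it stands apart from the paper's lattice-QCD setting.

    def nested_leapfrog(q, p, grad_slow, grad_fast, dt, n_outer, n_inner):
        """Two-level leapfrog: slow force on the outer scale, fast force inside."""
        for _ in range(n_outer):
            p -= 0.5 * dt * grad_slow(q)      # half kick, slow force
            h = dt / n_inner
            for _ in range(n_inner):          # inner leapfrog for the fast force
                p -= 0.5 * h * grad_fast(q)
                q += h * p                    # drift (unit mass assumed)
                p -= 0.5 * h * grad_fast(q)
            p -= 0.5 * dt * grad_slow(q)      # half kick, slow force
        return q, p

    q, p = nested_leapfrog(1.0, 0.0,
                           grad_slow=lambda q: 0.1 * q,
                           grad_fast=lambda q: 4.0 * q,
                           dt=0.1, n_outer=10, n_inner=5)
    print(q, p)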
Nano/micro-scale magnetophoretic devices for biomedical applications
NASA Astrophysics Data System (ADS)
Lim, Byeonghwa; Vavassori, Paolo; Sooryakumar, R.; Kim, CheolGi
2017-01-01
In recent years there have been tremendous advances in the versatility of magnetic shuttle technology using nano/micro-scale magnets for digital magnetophoresis. While the technology has been used for a wide variety of single-cell manipulation tasks such as selection, capture, transport, encapsulation, transfection, or lysing of magnetically labeled and unlabeled cells, it has also expanded to include parallel actuation and study of multiple bio-entities. The use of nano/micro-patterned magnetic structures that enable remote control of the applied forces has greatly facilitated integration of the technology with microfluidics, thereby fostering applications in the biomedical arena. The basic design and fabrication of various scaled magnets for remote manipulation of individual and multiple beads/cells, and their associated energies and forces that underlie the broad functionalities of this approach, are presented. One of the most useful features enabled by such advanced integrated engineering is the capacity to remotely tune the magnetic field gradient and energy landscape, permitting such multipurpose shuttles to be implemented within lab-on-chip platforms for a wide range of applications at the intersection of cellular biology and biotechnology.
Scaling participation in payments for ecosystem services programs
Donlan, C. Josh; Boyle, Kevin J.; Xu, Weibin; Gelcich, Stefan
2018-01-01
Payments for ecosystem services programs have become common tools but most have failed to achieve wide-ranging conservation outcomes. The capacity for scale and impact increases when PES programs are designed through the lens of the potential participants, yet this has received little attention in research or practice. Our work with small-scale marine fisheries integrates the social science of PES programs and provides a framework for designing programs that focus a priori on scaling. In addition to payments, desirable non-monetary program attributes and ecological feedbacks attract a wider range of potential participants into PES programs, including those who have more negative attitudes and lower trust. Designing programs that draw individuals into participating in PES programs is likely the most strategic path to reaching scale. Research should engage in new models of participatory research to understand these dynamics and to design programs that explicitly integrate a broad range of needs, values, and modes of implementation. PMID:29522554
Cushman, Robert M; Jones, Sonja B
2002-03-01
Increasing atmospheric concentrations of greenhouse gases are widely expected to cause global warming and other climatic changes. It is important to establish priorities for reducing greenhouse-gas emissions, so that resources can be allocated efficiently and effectively. This is a global problem, and it is possible, on a global scale, to identify those activities whose emissions have the greatest potential for enhancing the greenhouse effect. However, perspectives from smaller scales must be appreciated, because it is on scales down to the local level that response measures will be implemented. This paper analyzes the relative importance of emissions from the many individual sources, on scales ranging from global to national to subnational. Individual country perspectives and proposed policy measures and those of subnational political entities exhibit some commonalities but differ among themselves and from a global-scale perspective in detail.
Garney, Whitney R; Szucs, Leigh E; Primm, Kristin; King Hahn, Laura; Garcia, Kristen M; Martin, Emily; McLeroy, Kenneth
2018-05-01
In 2014, the Centers for Disease Control and Prevention funded the American Heart Association to implement policy, systems, and environment-focused strategies targeting access to healthy food and beverages, physical activity, and smoke-free environments. To understand factors affecting implementation and variations in success across sites, evaluators conducted a multiple case study. Based on past literature, community sites were categorized as capacity-building or implementation-ready, for comparison. A sample of six communities were selected using a systematic selection tool. Through site visits, evaluators conducted interviews with program staff and community partners and assessed action plans. Evaluators identified important implications for nationally coordinated community-based prevention programming. Differences in implementation varied by the communities' readiness, with the most notable differences in how they planned activities and defined success. Existing partner relationships (or lack thereof) played a significant role, regardless of the American Heart Association's existing presence within the communities, in the progression of initiatives and the differences observed among phases. Last, goals in capacity-building sites were tied to organizational goals while goals in implementation-ready sites were more incremental with increased community influence and buy-in. Using national organizations as a mechanism to carry out large-scale community-based prevention work is a viable option that provides coordinated, wide-scale implementation without sacrificing a community's priorities or input. In funding future initiatives, the presence of relationships and the time needed to cultivate such relationships should be accounted for in the planning and implementation processes, as well as both local and national expectations.
Wide-aperture aspherical lens for high-resolution terahertz imaging
NASA Astrophysics Data System (ADS)
Chernomyrdin, Nikita V.; Frolov, Maxim E.; Lebedev, Sergey P.; Reshetov, Igor V.; Spektor, Igor E.; Tolstoguzov, Viktor L.; Karasik, Valeriy E.; Khorokhorov, Alexei M.; Koshelev, Kirill I.; Schadko, Aleksander O.; Yurchenko, Stanislav O.; Zaytsev, Kirill I.
2017-01-01
In this paper, we introduce a wide-aperture aspherical lens for high-resolution terahertz (THz) imaging. The lens was designed and analyzed using numerical methods of geometrical optics and electrodynamics. It was made of high-density polyethylene by shaping on a computer-controlled lathe and characterized using a continuous-wave THz imaging setup based on a backward-wave oscillator and a Golay detector. The concept of image contrast was used to estimate image quality. According to the experimental data, the lens resolves two points spaced at a distance of 0.95λ with a contrast of 15%. To demonstrate this high resolution, the wide-aperture lens was employed to study a printed electronic circuit board containing sub-wavelength-scale elements. The observed results justify the high efficiency of the proposed lens design.
Efficient analysis of large-scale genome-wide data with two R packages: bigstatsr and bigsnpr.
Privé, Florian; Aschard, Hugues; Ziyatdinov, Andrey; Blum, Michael G B
2017-03-30
Genome-wide datasets produced for association studies have dramatically increased in size over the past few years, with modern datasets commonly including millions of variants measured in tens of thousands of individuals. This increase in data size is a major challenge severely slowing down genomic analyses, leading to some software becoming obsolete and researchers having limited access to diverse analysis tools. Here we present two R packages, bigstatsr and bigsnpr, allowing the analysis of large-scale genomic data to be performed within R. To address large data size, the packages use memory-mapping for accessing data matrices stored on disk instead of in RAM. To perform data pre-processing and data analysis, the packages integrate most of the tools that are commonly used, either through transparent system calls to existing software, or through updated or improved implementations of existing methods. In particular, the packages implement fast and accurate computations of principal component analysis and association studies, functions to remove SNPs in linkage disequilibrium, and algorithms to learn polygenic risk scores on millions of SNPs. We illustrate applications of the two R packages by analyzing a case-control genomic dataset for celiac disease, performing an association study and computing polygenic risk scores. Finally, we demonstrate the scalability of the R packages by analyzing a simulated genome-wide dataset including 500,000 individuals and 1 million markers on a single desktop computer. https://privefl.github.io/bigstatsr/ & https://privefl.github.io/bigsnpr/. florian.prive@univ-grenoble-alpes.fr & michael.blum@univ-grenoble-alpes.fr. Supplementary materials are available at Bioinformatics online.
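The memory-mapping strategy has a direct analogue in Python via numpy.memmap, which may help readers picture how the R packages' file-backed matrices behave; the shapes and file name below are illustrative only.

    import numpy as np

    n_ind, n_snp = 1_000, 10_000
    # File-backed genotype matrix: lives on disk, mapped into the address space.
    G = np.memmap("genotypes.bin", dtype=np.int8, mode="w+", shape=(n_ind, n_snp))
    G[:] = np.random.default_rng(0).integers(0, 3, size=(n_ind, n_snp), dtype=np.int8)

    # Analyses stream through column blocks, never holding the full matrix in RAM.
    freqs = np.empty(n_snp)
    for start in range(0, n_snp, 2_000):
        block = G[:, start:start + 2_000]
        freqs[start:start + block.shape[1]] = block.mean(axis=0) / 2.0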
ERIC Educational Resources Information Center
Bikowsky, Bella A.
2013-01-01
Concerns have arisen in the field of education regarding the social-emotional needs of students accompanied by an outcry for more effective discipline procedures in an effort to support not only the academic learning but also the social and emotional learning (SEL) of students. Initially, this study intended to examine the psychometric properties…
An Overview of Research and Evaluation Designs for Dissemination and Implementation
Brown, C. Hendricks; Curran, Geoffrey; Palinkas, Lawrence A.; Aarons, Gregory A.; Wells, Kenneth B.; Jones, Loretta; Collins, Linda M.; Duan, Naihua; Mittman, Brian S.; Wallace, Andrea; Tabak, Rachel G.; Ducharme, Lori; Chambers, David; Neta, Gila; Wiley, Tisha; Landsverk, John; Cheung, Ken; Cruden, Gracelyn
2016-01-01
Background The wide variety of dissemination and implementation designs now being used to evaluate and improve health systems and outcomes warrants review of the scope, features, and limitations of these designs. Methods This paper is one product of a design workgroup formed in 2013 by the National Institutes of Health to address dissemination and implementation research, and whose members represented diverse methodologic backgrounds, content focus areas, and health sectors. These experts integrated their collective knowledge on dissemination and implementation designs with searches of published evaluations strategies. Results This paper emphasizes randomized and non-randomized designs for the traditional translational research continuum or pipeline, which builds on existing efficacy and effectiveness trials to examine how one or more evidence-based clinical/prevention interventions are adopted, scaled up, and sustained in community or service delivery systems. We also mention other designs, including hybrid designs that combine effectiveness and implementation research, quality improvement designs for local knowledge, and designs that use simulation modeling. PMID:28384085
NASA Astrophysics Data System (ADS)
Pourmokhtarian, A.; Becknell, J. M.; Hall, J.; Desai, A. R.; Boring, L. R.; Duffy, P.; Staudhammer, C. L.; Starr, G.; Dietze, M.
2014-12-01
A wide array of human-induced factors can alter the structure and function of forests, including climate change, disturbance, and management. While there have been numerous studies of climate change impacts on forests, the interactions of management with changing climate and natural disturbance are poorly studied. Forecasts of the range of plausible responses of forests to climate change and management are needed for informed decision-making on new management approaches under changing climate, as well as on adaptation strategies for coming decades. Terrestrial biosphere models (TBMs) provide an excellent opportunity to investigate and assess simultaneous responses of terrestrial ecosystems to climatic perturbations and management across multiple spatio-temporal scales, but they currently do not represent the wide array of management activities known to impact carbon, water, surface energy fluxes, and biodiversity. The Ecosystem Demography model 2 (ED2) incorporates non-linear impacts of fine-scale (~10⁻¹ km) heterogeneity in ecosystem structure, both horizontally and vertically, at the plant level. It is therefore an ideal candidate for incorporating different forest management practices and testing various hypotheses under changing climate and across various spatial scales. The management practices that we implemented were: clear-cut, conversion, planting, partial harvest, low-intensity fire, restoration, salvage, and herbicide. The results were validated against observed data across eight sites in the U.S. Southeast (Duke Forest, Joseph Jones Ecological Research Center, North Carolina Loblolly Pine, and Ordway-Swisher Biological Station) and Pacific Northwest (Metolius Research Natural Area, H.J. Andrews Experimental Forest, Wind River Field Station, and Mount Rainier National Park). These sites differ in climate, vegetation, soil, and historical land disturbance, as well as in management approaches. Results showed that different management practices could be implemented successfully and realistically in the ED2 model at the site level. Moreover, sensitivity analyses determined the most important processes at different spatial scales, as well as those that could be ignored while minimizing overall error.
Chen, Weiliang; De Schutter, Erik
2017-01-01
Stochastic, spatial reaction-diffusion simulations have been widely used in systems biology and computational neuroscience. However, the increasing scale and complexity of models and morphologies have exceeded the capacity of any serial implementation. This led to the development of parallel solutions that benefit from the boost in performance of modern supercomputers. In this paper, we describe an MPI-based, parallel operator-splitting implementation for stochastic spatial reaction-diffusion simulations with irregular tetrahedral meshes. The performance of our implementation is first examined and analyzed with simulations of a simple model. We then demonstrate its application to real-world research by simulating the reaction-diffusion components of a published calcium burst model in both Purkinje neuron sub-branch and full dendrite morphologies. Simulation results indicate that our implementation is capable of achieving super-linear speedup for balanced loading simulations with reasonable molecule density and mesh quality. In the best scenario, a parallel simulation with 2,000 processes runs more than 3,600 times faster than its serial SSA counterpart, and achieves more than 20-fold speedup relative to parallel simulation with 100 processes. In a more realistic scenario with dynamic calcium influx and data recording, the parallel simulation with 1,000 processes and no load balancing is still 500 times faster than the conventional serial SSA simulation. PMID:28239346
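Stripped of stochasticity, meshes, and MPI, the operator-splitting idea itself fits in a few lines: within each time step, the diffusion and reaction operators are advanced separately. The deterministic 1-D toy below illustrates only the splitting, not the paper's parallel SSA solver.

    import numpy as np

    nx, dx, dt = 100, 1.0, 0.1
    D, k = 1.0, 0.05                      # diffusion coefficient, decay rate
    u = np.zeros(nx)
    u[nx // 2] = 100.0                    # initial pulse of molecules

    for _ in range(1_000):
        lap = np.roll(u, 1) - 2 * u + np.roll(u, -1)
        u = u + dt * D * lap / dx**2      # step 1: diffusion operator
        u = u - dt * k * u                # step 2: first-order decay reaction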
Energy Feedback at the City-Wide Scale: A Comparison to Building-Scale Studies
NASA Astrophysics Data System (ADS)
Carter, Richard Allan
Climate change is a growing concern throughout the world. In the United States, leadership has so far failed to establish targeted reductions and agreement on mitigation strategies. Despite this, many large cities are taking on the challenge of measuring their emissions, establishing targeted reductions, and defining strategies for mitigation in the form of Climate Action Plans. Reporting of greenhouse gas (GHG) emissions by these cities is usually based on a one-time, annual calculation. Many studies have been conducted on the impact of providing energy use data, or feedback, to households and, in some cases, to institutional or commercial businesses. In most of those studies, the act of providing feedback resulted in a reduction of energy use, ranging from 2% to 15%, depending upon the features of the feedback. Many of these studies included only electricity use; studies in which all energy use was reported represent GHG emissions more accurately. GHG emissions and energy use are not equivalent, since emissions depend on the fuel source; this paper focuses on reducing energy use. This research documents the characteristics of the feedback provided in those studies in order to determine which are most effective and should be considered for application at the community-wide scale. Eleven studies, comprising five primary and six secondary research papers, were reviewed and analyzed for the features of their feedback. Trends were established and evaluated with respect to their effectiveness and potential for use at the community-wide scale. This paper concludes that additional research is required to determine whether energy feedback at the city scale could result in savings similar to those observed at the household scale. Such research could take advantage of the features assessed here, implementing those best able to scale up. Further research is also needed to determine whether combining city-wide feedback with feedback for individual energy users within the city, both residential and commercial, has an even greater impact on reducing energy use and lowering GHG emissions.
Synthetic analog computation in living cells.
Daniel, Ramiz; Rubens, Jacob R; Sarpeshkar, Rahul; Lu, Timothy K
2013-05-30
A central goal of synthetic biology is to achieve multi-signal integration and processing in living cells for diagnostic, therapeutic and biotechnology applications. Digital logic has been used to build small-scale circuits, but other frameworks may be needed for efficient computation in the resource-limited environments of cells. Here we demonstrate that synthetic analog gene circuits can be engineered to execute sophisticated computational functions in living cells using just three transcription factors. Such synthetic analog gene circuits exploit feedback to implement logarithmically linear sensing, addition, ratiometric and power-law computations. The circuits exhibit Weber's law behaviour as in natural biological systems, operate over a wide dynamic range of up to four orders of magnitude and can be designed to have tunable transfer functions. Our circuits can be composed to implement higher-order functions that are well described by both intricate biochemical models and simple mathematical functions. By exploiting analog building-block functions that are already naturally present in cells, this approach efficiently implements arithmetic operations and complex functions in the logarithmic domain. Such circuits may lead to new applications for synthetic biology and biotechnology that require complex computations with limited parts, need wide-dynamic-range biosensing or would benefit from the fine control of gene expression.
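The mathematical convenience these circuits exploit can be shown in a few lines: once signals are represented logarithmically, multiplication, ratios, and power laws become sums, differences, and scalings. This numerical note is merely an aid to reading the abstract, not a model of the gene circuits themselves.

    import numpy as np

    u, v = 3.0, 4.0
    log_u, log_v = np.log(u), np.log(v)

    product = np.exp(log_u + log_v)      # u * v
    ratio = np.exp(log_u - log_v)        # u / v (ratiometric computation)
    power_law = np.exp(1.5 * log_u)      # u ** 1.5
    print(product, ratio, power_law)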
PAB3D: Its History in the Use of Turbulence Models in the Simulation of Jet and Nozzle Flows
NASA Technical Reports Server (NTRS)
Abdol-Hamid, Khaled S.; Pao, S. Paul; Hunter, Craig A.; Deere, Karen A.; Massey, Steven J.; Elmiligui, Alaa
2006-01-01
This paper reviews PAB3D's history in the implementation of turbulence models for simulating jet and nozzle flows. We describe different turbulence models used in the simulation of subsonic and supersonic jet and nozzle flows. The time-averaged simulations use modified linear or nonlinear two-equation models to account for supersonic flow as well as high-temperature mixing. Two multiscale-type turbulence models are used for unsteady flow simulations. These models require modifications to the Reynolds-Averaged Navier-Stokes (RANS) equations. The first scheme is a hybrid RANS/LES model utilizing the two-equation (k-epsilon) model with a RANS/LES transition function dependent on grid spacing and the computed turbulence length scale. The second scheme is a modified version of the partially averaged Navier-Stokes (PANS) formulation. All of these models are implemented in the three-dimensional Navier-Stokes code PAB3D. This paper discusses computational methods, code implementation, computed results for a wide range of nozzle configurations at various operating conditions, and comparisons with available experimental data. Very good agreement is shown between the numerical solutions and available experimental data over a wide range of operating conditions.
Kazadi Mbamba, Christian; Flores-Alsina, Xavier; John Batstone, Damien; Tait, Stephan
2016-09-01
The focus of modelling in wastewater treatment is shifting from the single-unit to the plant-wide scale. Plant-wide modelling approaches provide opportunities to study the dynamics and interactions of different transformations in water and sludge streams. Towards developing more general and robust simulation tools applicable to a broad range of wastewater engineering problems, this paper evaluates a plant-wide model built with sub-models from the Benchmark Simulation Model No. 2-P (BSM2-P) with an improved/expanded physico-chemical framework (PCF). The PCF includes a simple and validated equilibrium approach describing ion speciation and ion pairing with kinetic multiple-minerals precipitation. Model performance is evaluated against data sets from a full-scale wastewater treatment plant, assessing the capability to describe water and sludge lines across the treatment process under steady-state operation. With default rate kinetic and stoichiometric parameters, a good general agreement is observed between the full-scale datasets and the simulated results under steady-state conditions. Simulation results show differences between measured and modelled phosphorus as little as 4-15% (relative) throughout the entire plant. Dynamic influent profiles were generated using a calibrated influent generator and were used to study the effect of long-term influent dynamics on plant performance. Model-based analysis shows that minerals precipitation strongly influences composition in the anaerobic digesters, but also impacts nutrient loading across the entire plant. A forecasted implementation of nutrient recovery by struvite crystallization (model scenario only) reduced the phosphorus content in the treatment plant influent (via centrate recycling) considerably and thus decreased phosphorus in the treated outflow by up to 43%. Overall, the evaluated plant-wide model is able to jointly describe the physico-chemical and biological processes, and is advocated for future use as a tool for design, performance evaluation and optimization of whole wastewater treatment plants. Copyright © 2016 Elsevier Ltd. All rights reserved.
Implementation of legal abortion in Nepal: a model for rapid scale-up of high-quality care
2012-01-01
Unsafe abortion's significant contribution to maternal mortality and morbidity was a critical factor leading to liberalization of Nepal's restrictive abortion law in 2002. Careful, comprehensive planning among a range of multisectoral stakeholders, led by Nepal's Ministry of Health and Population, enabled the country subsequently to introduce and scale up safe abortion services in a remarkably short timeframe. This paper examines factors that contributed to rapid, successful implementation of legal abortion in this mountainous republic, including deliberate attention to the key areas of policy, health system capacity, equipment and supplies, and information dissemination. Important elements of this successful model of scaling up safe legal abortion include: the pre-existence of postabortion care services, through which health-care providers were already familiar with the main clinical technique for safe abortion; government leadership in coordinating complementary contributions from a wide range of public- and private-sector actors; reliance on public-health evidence in formulating policies governing abortion provision, which led to the embrace of medical abortion and authorization of midlevel providers as key strategies for decentralizing care; and integration of abortion care into existing Safe Motherhood and the broader health system. While challenges remain in ensuring that all Nepali women can readily exercise their legal right to early pregnancy termination, the national safe abortion program has already yielded strong positive results. Nepal's experience making high-quality abortion care widely accessible in a short period of time offers important lessons for other countries seeking to reduce maternal mortality and morbidity from unsafe abortion and to achieve Millennium Development Goals. PMID:22475782
DOE Office of Scientific and Technical Information (OSTI.GOV)
Porter, Russell G.; Glaser, Bryce G.; Amren, Jennifer
2003-03-01
This report presents results for year ten in a basin-wide program to harvest northern pikeminnow (Ptychocheilus oregonensis). This program was started in an effort to reduce predation by northern pikeminnow on juvenile salmonids during their emigration from natal streams to the ocean. Earlier work in the Columbia River Basin suggested predation by northern pikeminnow on juvenile salmonids might account for most of the 10-20% mortality juvenile salmonids experience in each of eight Columbia River and Snake River reservoirs. Modeling simulations based on work in John Day Reservoir from 1982 through 1988 indicated that, if predator-size northern pikeminnow were exploited at a 10-20% rate, the resulting restructuring of their population could reduce their predation on juvenile salmonids by 50%. To test this hypothesis, we implemented a sport-reward angling fishery and a commercial longline fishery in the John Day Pool in 1990. We also conducted an angling fishery in areas inaccessible to the public at four dams on the mainstem Columbia River and at Ice Harbor Dam on the Snake River. Based on the success of these limited efforts, we implemented three test fisheries on a system-wide scale in 1991: a tribal longline fishery above Bonneville Dam, a sport-reward fishery, and a dam-angling fishery. Low catch of target fish and high cost of implementation resulted in discontinuation of the tribal longline fishery. However, the sport-reward and dam-angling fisheries were continued in 1992 and 1993. In 1992, we investigated the feasibility of implementing a commercial longline fishery in the Columbia River below Bonneville Dam and found that implementation of this fishery was also infeasible. Estimates of combined annual exploitation rates resulting from the sport-reward and dam-angling fisheries remained at the low end of our target range of 10-20%. This suggested the need for additional effective harvest techniques. During 1991 and 1992, we developed and tested a modified (small-sized) Merwin trapnet. We found this floating trapnet to be very effective in catching northern pikeminnow at specific sites. Consequently, in 1993 we examined a system-wide fishery using floating trapnets, but found this fishery to be ineffective at harvesting large numbers of northern pikeminnow on a system-wide scale. In 1994, we investigated the use of trapnets and gillnets at specific locations where concentrations of northern pikeminnow were known or suspected to occur during the spring season (i.e., March through early June). In addition, we initiated a concerted effort to increase public participation in the sport-reward fishery through a series of promotional and incentive activities. In 1995, 1996, and 1997, promotional activities and incentives were further improved based on the favorable response in 1994. Results of these efforts are subjects of this annual report under Section I, Implementation. Evaluation of the success of test fisheries in achieving our target goal of a 10-20% annual exploitation rate on northern pikeminnow is presented in Section II of this report. Overall program success in terms of altering the size and age composition of the northern pikeminnow population and in terms of potential reductions in loss of juvenile salmonids to northern pikeminnow predation is also discussed under Section II.
Wade, Victoria A; Taylor, Alan D; Kidd, Michael R; Carati, Colin
2016-05-16
This study was a component of the Flinders Telehealth in the Home project, which tested adding home telehealth to existing rehabilitation, palliative care and geriatric outreach services. Because telehealth projects are known to be difficult to transition into ongoing services, a qualitative study was conducted to produce a preferred implementation approach for sustainable and large-scale operations, and a process model that offers practical advice for achieving this goal. Initially, semi-structured interviews were conducted with senior clinicians, health service managers and policy makers, and a thematic analysis of the interview transcripts was undertaken to identify the range of options for ongoing operations, plus the factors affecting sustainability. Subsequently, the interviewees and other decision makers attended a deliberative forum in which participants were asked to select a preferred model for future implementation. Finally, all data from the study were synthesised by the researchers to produce a process model. Nineteen interviews with senior clinicians, managers, and service development staff were conducted, finding strong support for home telehealth but a wide diversity of views on governance, models of clinical care, technical infrastructure operations, and data management. The deliberative forum worked through these options and recommended a collaborative consortium approach for large-scale implementation. The process model proposes that the key factor for large-scale implementation is leadership support, which is enabled by 1) showing solutions to the problems of service demand, budgetary pressure and the relationship between hospital and primary care, 2) demonstrating how home telehealth aligns with health service policies, and 3) achieving clinician acceptance through providing evidence of benefit and developing new models of clinical care. Two key actions to enable change were marketing telehealth to patients, clinicians and policy-makers, and building a community of practice. The implementation of home telehealth services is still at an early stage. Change agents and a community of practice can contribute by marketing telehealth, demonstrating policy alignment and providing potential solutions for difficult health services problems. This should assist health leaders to move from trials to large-scale services.
Mitigation Strategies To Protect Food Against Intentional Adulteration. Final rule.
2016-05-27
The Food and Drug Administration (FDA or we) is issuing this final rule to require domestic and foreign food facilities that are required to register under the Federal Food, Drug, and Cosmetic Act (the FD&C Act) to address hazards that may be introduced with the intention to cause wide scale public health harm. These food facilities are required to conduct a vulnerability assessment to identify significant vulnerabilities and actionable process steps and implement mitigation strategies to significantly minimize or prevent significant vulnerabilities identified at actionable process steps in a food operation. FDA is issuing these requirements as part of our implementation of the FDA Food Safety Modernization Act (FSMA).
Ouyang, Jianshu; Chen, Bo; Huang, Dahai
2018-01-01
Concretes with engineered thermal expansion coefficients, capable of avoiding failure or irreversible destruction of structures or devices, are important for civil engineering applications such as dams, bridges, and buildings. In natural materials, thermal expansion usually cannot be easily regulated, and an extremely low thermal expansion coefficient (TEC) is still uncommon. Here we propose a novel cementitious composite, doped with ZrW2O8, showing a wide range of tunable thermal expansion coefficients, from 8.65 × 10−6 °C−1 to 2.48 × 10−6 °C−1. Macro-scale experiments were performed to quantify the evolution of the thermal expansion coefficient and of compressive and flexural strength over a wide range of temperatures. Scanning Electron Microscope (SEM) imaging was conducted to quantify the specimens' microstructural characteristics, including pore ratio and size. It is shown that the TEC of the proposed composites depends on the proportion of ZrW2O8 and the ambient curing temperature. The macro-scale experimental results agree well with the microstructural observations. The TEC and strength gradually decrease as ZrW2O8 increases from 0% to 20%, and subsequently fluctuate up to 60%. The findings reported here provide a new route to designing cementitious composites with tunable thermal expansion for a wide range of engineering applications. PMID:29735957
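As a first-order intuition for the tunability reported above, a rule-of-mixtures estimate interpolates between the positive TEC of the cement matrix and the negative TEC of the ZrW2O8 dopant. The numerical values below are illustrative assumptions, and the simple volume average does not reproduce the fluctuation the study observes at high dosages.

    def composite_tec(tec_matrix: float, tec_dopant: float, frac: float) -> float:
        """Volume-weighted (rule-of-mixtures) estimate of a composite TEC."""
        return (1.0 - frac) * tec_matrix + frac * tec_dopant

    # Assumed values: ~10e-6 /degC for a plain cementitious matrix,
    # ~ -8.7e-6 /degC for ZrW2O8 (a known negative-thermal-expansion ceramic).
    for frac in (0.0, 0.2, 0.4, 0.6):
        print(f"{frac:.0%} dopant -> TEC ~ {composite_tec(10e-6, -8.7e-6, frac):.2e} /degC")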
Steps to overcome the North-South divide in research relevant to climate change policy and practice
NASA Astrophysics Data System (ADS)
Blicharska, Malgorzata; Smithers, Richard J.; Kuchler, Magdalena; Agrawal, Ganesh K.; Gutiérrez, José M.; Hassanali, Ahmed; Huq, Saleemul; Koller, Silvia H.; Marjit, Sugata; Mshinda, Hassan M.; Masjuki, Hj Hassan; Solomons, Noel W.; Staden, Johannes Van; Mikusiński, Grzegorz
2017-01-01
A global North-South divide in research, and its negative consequences, has been highlighted in various scientific disciplines. Northern domination of science relevant to climate change policy and practice, and limited research led by Southern researchers in Southern countries, may hinder further development and implementation of global climate change agreements and nationally appropriate actions. Despite efforts to address the North-South divide, progress has been slow. In this Perspective, we illustrate the extent of the divide, review underlying issues and analyse their consequences for climate change policy development and implementation. We propose a set of practical steps in both Northern and Southern countries that a wide range of actors should take at global, regional and national scales to span the North-South divide, with examples of some actions already being implemented.
Lu, Cheng-Hsuan; da Silva, Arlindo; Wang, Jun; Moorthi, Shrinivas; Chin, Mian; Colarco, Peter; Tang, Youhua; Bhattacharjee, Partha S.; Chen, Shen-Po; Chuang, Hui-Ya; Juang, Hann-Ming Henry; McQueen, Jeffery; Iredell, Mark
2018-01-01
The NOAA National Centers for Environmental Prediction (NCEP) implemented NEMS GFS Aerosol Component (NGAC) for global dust forecasting in collaboration with NASA Goddard Space Flight Center (GSFC). NGAC Version 1.0 has been providing 5 day dust forecasts at 1°×1° resolution on a global scale, once per day at 00:00 Coordinated Universal Time (UTC), since September 2012. This is the first global system capable of interactive atmosphere aerosol forecasting at NCEP. The implementation of NGAC V1.0 reflects an effective and efficient transitioning of NASA research advances to NCEP operations, paving the way for NCEP to provide global aerosol products serving a wide range of stakeholders as well as to allow the effects of aerosols on weather forecasts and climate prediction to be considered. PMID:29652411
Powell, Byron J; Mandell, David S; Hadley, Trevor R; Rubin, Ronnie M; Evans, Arthur C; Hurford, Matthew O; Beidas, Rinad S
2017-05-12
Examining the role of modifiable barriers and facilitators is a necessary step toward developing effective implementation strategies. This study examines whether both general (organizational culture, organizational climate, and transformational leadership) and strategic (implementation climate and implementation leadership) organizational-level factors predict therapist-level determinants of implementation (knowledge of and attitudes toward evidence-based practices). Within the context of a system-wide effort to increase the use of evidence-based practices (EBPs) and recovery-oriented care, we conducted an observational, cross-sectional study of 19 child-serving agencies in the City of Philadelphia, including 23 sites, 130 therapists, 36 supervisors, and 22 executive administrators. Organizational variables included characteristics such as EBP initiative participation, program size, and proportion of independent contractor therapists; general factors such as organizational culture and climate (Organizational Social Context Measurement System) and transformational leadership (Multifactor Leadership Questionnaire); and strategic factors such as implementation climate (Implementation Climate Scale) and implementation leadership (Implementation Leadership Scale). Therapist-level variables included demographics, attitudes toward EBPs (Evidence-Based Practice Attitudes Scale), and knowledge of EBPs (Knowledge of Evidence-Based Services Questionnaire). We used linear mixed-effects regression models to estimate the associations between the predictor (organizational characteristics, general and strategic factors) and dependent (knowledge of and attitudes toward EBPs) variables. Several variables were associated with therapists' knowledge of EBPs. Clinicians in organizations with more proficient cultures or higher levels of transformational leadership (idealized influence) had greater knowledge of EBPs; conversely, clinicians in organizations with more resistant cultures, more functional organizational climates, and implementation climates characterized by higher levels of financial reward for EBPs had less knowledge of EBPs. A number of organizational factors were associated with the therapists' attitudes toward EBPs. For example, more engaged organizational cultures, implementation climates characterized by higher levels of educational support, and more proactive implementation leadership were all associated with more positive attitudes toward EBPs. This study provides evidence for the importance of both general and strategic organizational determinants as predictors of knowledge of and attitudes toward EBPs. The findings highlight the need for longitudinal and mixed-methods studies that examine the influence of organizational factors on implementation.
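As a sketch of the mixed-effects analysis described above, the snippet below fits a random-intercept model with therapists nested in agencies; the data file and column names ("knowledge", "proficient_culture", "implementation_climate", "agency") are hypothetical stand-ins, not the study's variables:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical therapist-level data: one row per therapist, with an
# "agency" column identifying the organization each therapist works in.
df = pd.read_csv("therapist_survey.csv")

# A random intercept per agency captures therapists nested in organizations,
# mirroring the linear mixed-effects models described in the abstract.
model = smf.mixedlm(
    "knowledge ~ proficient_culture + implementation_climate",
    data=df,
    groups=df["agency"],
)
print(model.fit().summary())
```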
Huang, Yueng-Hsiang; Lee, Jin; Chen, Zhuo; Perry, MacKenna; Cheung, Janelle H; Wang, Mo
2017-06-01
Zohar and Luria's (2005) safety climate (SC) scale, measuring organization- and group-level SC each with 16 items, is widely used in research and practice. To improve the utility of the SC scale, we shortened the original full-length SC scales. Item response theory (IRT) analysis was conducted using a sample of 29,179 frontline workers from various industries. Based on graded response models, we shortened the original scales in two ways: (1) selecting items with above-average discriminating ability (i.e., offering more than 6.25% of the original total scale information), resulting in 8-item organization-level and 11-item group-level SC scales; and (2) selecting the most informative items that together retain at least 30% of the original scale information, resulting in 4-item organization-level and 4-item group-level SC scales. All four shortened scales had acceptable reliability (≥0.89) and high correlations (≥0.95) with the original scale scores. The shortened scales will be valuable for academic research and practical survey implementation in improving occupational safety. Copyright © 2017 The Author(s). Published by Elsevier Ltd. All rights reserved.
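Both shortening rules reduce to simple selections once each item's share of the total scale information is known. A toy sketch under hypothetical information shares for a 16-item scale:

```python
# Illustrative item-selection rules from the two strategies described above;
# the per-item information shares are randomly generated, not the study's data.
import numpy as np

rng = np.random.default_rng(0)
info = rng.dirichlet(np.ones(16))  # each item's fraction of total information

# Strategy 1: keep items with above-average discriminating ability,
# i.e. more than 1/16 = 6.25% of the total scale information.
keep_above_average = np.flatnonzero(info > 1.0 / 16)

# Strategy 2: keep the fewest most-informative items that together
# retain at least 30% of the original scale information.
order = np.argsort(info)[::-1]
cumulative = np.cumsum(info[order])
keep_short = order[: int(np.searchsorted(cumulative, 0.30) + 1)]

print("above-average items:", keep_above_average)
print("shortest 30% set:", keep_short)
```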
A distributed parallel storage architecture and its potential application within EOSDIS
NASA Technical Reports Server (NTRS)
Johnston, William E.; Tierney, Brian; Feuquay, Jay; Butzer, Tony
1994-01-01
We describe the architecture, implementation, and use of a scalable, high-performance, distributed-parallel data storage system developed in the ARPA-funded MAGIC gigabit testbed. A collection of wide-area distributed disk servers operate in parallel to provide logical block-level access to large data sets. Operated primarily as a network-based cache, the architecture supports cooperation among independently owned resources to provide fast, large-scale, on-demand storage to support data handling, simulation, and computation.
Technical Description of Urban Microscale Modeling System: Component 1 of CRTI Project 02-0093RD
2007-03-01
0093RD which involved (1) development and implementation of a computational fluid dynamics model for the simulation of urban flow in an arbitrary...resource will serve as a nation-wide general problem-solving tool for first-responders involved with CBR incidents in the urban environment and...predictions with experimental data obtained from a comprehensive full-scale urban field experiment conducted in Oklahoma City, Oklahoma in July 2003 (Joint
Molecular Robots Obeying Asimov's Three Laws of Robotics.
Kaminka, Gal A; Spokoini-Stern, Rachel; Amir, Yaniv; Agmon, Noa; Bachelet, Ido
2017-01-01
Asimov's three laws of robotics, which were shaped in the literary work of Isaac Asimov (1920-1992) and others, define a crucial code of behavior that fictional autonomous robots must obey as a condition for their integration into human society. While general implementation of these laws in robots is widely considered impractical, limited-scope versions have been demonstrated and have proven useful in spurring scientific debate on aspects of safety and autonomy in robots and intelligent systems. In this work, we use Asimov's laws to examine these notions in molecular robots fabricated from DNA origami. We successfully programmed these robots to obey, by means of interactions between individual robots in a large population, an appropriately scoped variant of Asimov's laws, and even emulated the key scenario from Asimov's story "Runaround," in which a fictional robot gets into trouble despite adhering to the laws. Our findings show that abstract, complex notions can be encoded and implemented at the molecular scale when we understand robots on this scale on the basis of their interactions.
A 14 × 14 μm² footprint polarization-encoded quantum controlled-NOT gate based on hybrid waveguide
Wang, S. M.; Cheng, Q. Q.; Gong, Y. X.; Xu, P.; Sun, C.; Li, L.; Li, T.; Zhu, S. N.
2016-01-01
Photonic quantum information processing systems have been widely used in communication, metrology, and lithography. The recent emphasis on miniaturized photonic platforms is motivated by the urgent need to realize large-scale information processing and computing. Although integrated quantum logic gates and quantum algorithms based on path encoding have been successfully demonstrated, the technology for handling the other commonly used encoding, polarization qubits, has yet to be fully developed. Here, we show the implementation of a polarization-dependent beam-splitter in a hybrid waveguide system. With precise design, the polarization-encoded controlled-NOT gate can be implemented using only a single such polarization-dependent beam-splitter, reducing the overall device footprint to 14 × 14 μm². The experimental demonstration of this highly integrated controlled-NOT gate sets the stage for developing large-scale quantum information processing systems. Our hybrid design also establishes new capabilities in controlling the polarization modes in integrated photonic circuits. PMID:27142992
Scaling of Guide-Field Magnetic Reconnection using Anisotropic Fluid Closure
NASA Astrophysics Data System (ADS)
Ohia, O.; Egedal, J.; Lukin, V. S.; Daughton, W.; Le, A.
2012-10-01
Collisionless magnetic reconnection, a process linked to solar flares, coronal mass ejections, and magnetic substorms, has been widely studied through fluid models and fully kinetic simulations. While fluid models often reproduce the fast reconnection rate of fully kinetic simulations, significant differences are observed in the structure of the reconnection regions [1]. However, guide-field fluid simulations implementing new equations of state that accurately account for the anisotropic electron pressure [2] reproduce the detailed reconnection region observed in kinetic simulations [3]. Implementing this two-fluid simulation using the HiFi framework [4], we study the force balance of the electron layers in guide-field reconnection and derive scaling laws for their characteristics. [1] Daughton W et al., Phys. Plasmas 13, 072101 (2006). [2] Le A et al., Phys. Rev. Lett. 102, 085001 (2009). [3] Ohia O et al., Phys. Rev. Lett., in press (2012). [4] Lukin VS, Linton MG, Nonlinear Proc. Geoph. 18, 871 (2011).
Brown, C Hendricks; Mohr, David C; Gallo, Carlos G; Mader, Christopher; Palinkas, Lawrence; Wingood, Gina; Prado, Guillermo; Kellam, Sheppard G; Pantin, Hilda; Poduska, Jeanne; Gibbons, Robert; McManus, John; Ogihara, Mitsunori; Valente, Thomas; Wulczyn, Fred; Czaja, Sara; Sutcliffe, Geoff; Villamar, Juan; Jacobs, Christopher
2013-06-01
African Americans and Hispanics in the United States have much higher rates of HIV than non-minorities. There is now strong evidence that a range of behavioral interventions are efficacious in reducing sexual risk behavior in these populations. Although a handful of these programs are just beginning to be disseminated widely, we still have not implemented effective programs to a level that would reduce the population incidence of HIV for minorities. We proposed that innovative approaches involving computational technologies be explored for their use in both developing new interventions and in supporting wide-scale implementation of effective behavioral interventions. Mobile technologies have a place in both of these activities. First, mobile technologies can be used to sense context and adapt to the unique preferences and needs of individuals at times when intervention to reduce risk would be most impactful. Second, mobile technologies can be used to improve the delivery of interventions by facilitators and their agencies. Systems science methods including social network analysis, agent-based models, computational linguistics, intelligent data analysis, and systems and software engineering all have strategic roles that can bring about advances in HIV prevention in minority communities. Using an existing mobile technology for depression and 3 effective HIV prevention programs, we illustrated how 8 areas in the intervention/implementation process can use innovative computational approaches to advance intervention adoption, fidelity, and sustainability.
Fast Laplace solver approach to pore-scale permeability
NASA Astrophysics Data System (ADS)
Arns, C. H.; Adler, P. M.
2018-02-01
We introduce a powerful and easily implemented method to calculate the permeability of porous media at the pore scale, using an approximation based on the Poiseuille equation to calculate permeability to fluid flow with a Laplace solver. The method consists of calculating the Euclidean distance map of the fluid phase to assign local conductivities and lends itself naturally to the treatment of multiscale problems. We compare with analytical solutions as well as experimental measurements and lattice Boltzmann calculations of permeability for Fontainebleau sandstone. The solver is significantly more stable than the lattice Boltzmann approach, uses less memory, and is significantly faster. Permeabilities are in excellent agreement over a wide range of porosities.
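A minimal sketch of that pipeline on a toy 2D geometry, assuming the usual SciPy distance transform and a deliberately simplified Jacobi pressure solve (the paper's actual discretization and boundary handling are not reproduced here):

```python
# Toy distance-map/Laplace permeability sketch; the geometry, conductivity
# law, and solver are illustrative simplifications, not the authors' code.
import numpy as np
from scipy.ndimage import distance_transform_edt

pore = np.zeros((64, 64), dtype=bool)   # solid everywhere...
pore[20:28, :] = True                   # ...except one open channel
d = distance_transform_edt(pore)        # distance of fluid voxels to solid
k_local = d**2                          # Poiseuille-like conductivity ~ d^2

# Simplified Jacobi iteration for a Laplace pressure field, driving flow
# from left (p = 1) to right (p = 0); a faithful solver would weight the
# stencil by the local conductivities.
p = np.tile(np.linspace(1.0, 0.0, 64), (64, 1))
for _ in range(5000):
    p[:, 0], p[:, -1] = 1.0, 0.0
    p[1:-1, 1:-1] = 0.25 * (p[:-2, 1:-1] + p[2:, 1:-1] +
                            p[1:-1, :-2] + p[1:-1, 2:])

# Darcy-style estimate: conductivity-weighted pressure-gradient flux.
flux = k_local[:, 1:-1] * (p[:, :-2] - p[:, 2:]) / 2.0
print("relative permeability estimate:", flux.mean())
```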
Density Functional O(N) Calculations
NASA Astrophysics Data System (ADS)
Ordejón, Pablo
1998-03-01
We have developed a scheme for performing Density Functional Theory calculations with O(N) scaling (P. Ordejón, E. Artacho and J. M. Soler, Phys. Rev. B 53, 10441 (1996)). The method uses arbitrarily flexible and complete Atomic Orbital (AO) basis sets. This gives a wide range of choice, from extremely fast calculations with minimal basis sets to highly accurate calculations with complete sets. The size-efficiency of AO bases, together with the O(N) scaling of the algorithm, allows the application of the method to systems with many hundreds of atoms on single-processor workstations. I will present the SIESTA code (D. Sanchez-Portal, P. Ordejón, E. Artacho and J. M. Soler, Int. J. Quantum Chem. 65, 453 (1997)), in which the method is implemented, with several LDA, LSD and GGA functionals available, and using norm-conserving, non-local pseudopotentials (in the Kleinman-Bylander form) to eliminate the core electrons. The calculation of static properties such as energies, forces, pressure, stress and magnetic moments, as well as molecular dynamics (MD) simulation capabilities (including variable cell shape, constant temperature and constant pressure MD), is fully implemented. I will also show examples of the accuracy of the method, and applications to large-scale materials and biomolecular systems.
Building a Unit-Level Mentored Program to Sustain a Culture of Inquiry for Evidence-Based Practice.
Breckenridge-Sproat, Sara T; Throop, Meryia D; Raju, Dheeraj; Murphy, Deborah A; Loan, Lori A; Patrician, Patricia A
2015-01-01
This study tested the effectiveness of a dynamic educational and mentoring program, facilitated by unit-level mentors, to introduce, promote, and sustain an evidence-based practice (EBP) culture among nurses in a military healthcare setting. The need to identify gaps in practice, apply principles of EBP, and advance scientific applications in the pursuit of quality nursing care is as important to military healthcare as it is in the civilian sector. The Advancing Research through Close Collaboration Model guided the intervention and study. Three instruments were used: the Organizational Readiness for System-wide Integration of Evidence-Based Practice, EBP Beliefs, and EBP Implementation scales. The study took place in 3 military hospitals simultaneously undergoing facility and staff integration. Data were collected from staff nurses in the inpatient nursing units before and after a facilitated education and mentoring intervention. Three hundred sixty nurses (38%) completed baseline, and 325 (31%) completed follow-up surveys. Scores improved on all 3 measures following implementation of the program; however, the differences were statistically significant only for the Organizational Readiness for System-wide Integration of Evidence-Based Practice scale (70.96 vs 77.63, t = -3.95, P < .01). In the paired individual pretest/posttest subsample (n = 56), scores improved significantly on all 3 instruments. Despite typically high turnover rates of military personnel and restructuring of 3 facilities during the study period, the readiness for, beliefs about, and implementation of EBP improved. This study suggests that a commitment to an EBP culture may diffuse among individuals in an organization, even while experiencing significant change. It also demonstrates that a unit-level mentored EBP program is sustainable despite changes in organizational structure and workforce composition.
Hättenschwiler, Nicole; Sterchi, Yanik; Mendes, Marcia; Schwaninger, Adrian
2018-10-01
Bomb attacks on civil aviation make detecting improvised explosive devices and explosive material in passenger baggage a major concern. In the last few years, explosive detection systems for cabin baggage screening (EDSCB) have become available. Although used by a number of airports, most countries have not yet implemented these systems on a wide scale. We investigated the benefits of EDSCB with two different levels of automation currently being discussed by regulators and airport operators: automation as a diagnostic aid with an on-screen alarm resolution by the airport security officer (screener), or EDSCB with an automated decision by the machine. The two experiments reported here tested and compared both scenarios and a condition without automation as baseline. Participants were screeners at two international airports who differed in both years of work experience and familiarity with automation aids. Results showed that experienced screeners were good at detecting improvised explosive devices even without EDSCB; EDSCB increased only their detection of bare explosives. In contrast, screeners with less experience (tenure < 1 year) benefitted substantially from EDSCB in detecting both improvised explosive devices and bare explosives. A comparison of all three conditions showed that automated decision provided better human-machine detection performance than on-screen alarm resolution and no automation. This came at the cost of slightly higher false alarm rates on the human-machine system level, which would still be acceptable from an operational point of view. Results indicate that a wide-scale implementation of EDSCB would increase the detection of explosives in passenger bags, and that automated decision, rather than automation as a diagnostic aid with on-screen alarm resolution, should be considered. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.
A fuzzy set preference model for market share analysis
NASA Technical Reports Server (NTRS)
Turksen, I. B.; Willson, Ian A.
1992-01-01
Consumer preference models are widely used in new product design, marketing management, pricing, and market segmentation. The success of new products depends on accurate market share prediction and design decisions based on consumer preferences. The vague linguistic nature of consumer preferences and product attributes, combined with the substantial differences between individuals, creates a formidable challenge to marketing models. The most widely used methodology is conjoint analysis. Conjoint models, as currently implemented, represent linguistic preferences as ratio or interval-scaled numbers, use only numeric product attributes, and require aggregation of individuals for estimation purposes. It is not surprising that these models are costly to implement, are inflexible, and have a predictive validity that is not substantially better than chance. This affects the accuracy of market share estimates. A fuzzy set preference model can easily represent linguistic variables either in consumer preferences or product attributes with minimal measurement requirements (ordinal scales), while still estimating overall preferences suitable for market share prediction. This approach results in flexible individual-level conjoint models which can provide more accurate market share estimates from a smaller number of more meaningful consumer ratings. Fuzzy sets can be incorporated within existing preference model structures, such as a linear combination, using the techniques developed for conjoint analysis and market share estimation. The purpose of this article is to develop and fully test a fuzzy set preference model which can represent linguistic variables in individual-level models implemented in parallel with existing conjoint models. The potential improvements in market share prediction and predictive validity can substantially improve management decisions about what to make (product design), for whom to make it (market segmentation), and how much to make (market share prediction).
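To make the contrast with ratio-scaled conjoint inputs concrete, here is a toy sketch of fuzzy membership functions combined linearly into a preference score; the attributes, breakpoints, and weights are invented for illustration and are not from the article:

```python
# Toy fuzzy-set preference sketch: linguistic attributes become membership
# functions, combined linearly as in a conjoint-style model.

def triangular(x, a, b, c):
    """Triangular membership: 0 outside [a, c], peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def ramp(x, a, b):
    """Increasing membership: 0 below a, 1 above b."""
    return min(1.0, max(0.0, (x - a) / (b - a)))

# Hypothetical product attributes: "affordable" price, "long-lasting" battery.
affordable = lambda price: triangular(price, 0.0, 200.0, 600.0)
long_lasting = lambda hours: ramp(hours, 4.0, 12.0)

def preference(price, hours, weights=(0.6, 0.4)):
    # Linear combination of fuzzy memberships, mirroring the linear
    # conjoint structure mentioned in the abstract.
    return weights[0] * affordable(price) + weights[1] * long_lasting(hours)

print(f"preference score: {preference(250.0, 10.0):.2f}")
```

Because the inputs are only ordinal memberships, a model of this shape can be estimated per individual without forcing linguistic ratings onto an interval scale.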
Scientific and Technological Foundations for Scaling Production of Nanostructured Metals
NASA Astrophysics Data System (ADS)
Lowe, Terry C.; Davis, Casey F.; Rovira, Peter M.; Hayne, Mathew L.; Campbell, Gordon S.; Grzenia, Joel E.; Stock, Paige J.; Meagher, Rilee C.; Rack, Henry J.
2017-05-01
Severe Plastic Deformation (SPD) has been explored in a wide range of metals and alloys. However, there are only a few industrial scale implementations of SPD for commercial alloys. To demonstrate and evolve technology for producing ultrafine grain metals by SPD, a Nanostructured Metals Manufacturing Testbed (NMMT) has been established in Golden, Colorado. Machines for research scale and pilot scale Equal Channel Angular Pressing-Conform (ECAP-C) technology have been configured in the NMMT to systematically evaluate and evolve SPD processing and advance the foundational science and technology for manufacturing. We highlight the scientific and technological areas that are critical for scale up of continuous SPD of aluminum, copper, magnesium, titanium, and iron-based alloys. Key areas that we will address in this presentation include the need for comprehensive analysis of starting microstructures, data on operating deformation mechanisms, high pressure thermodynamics and phase transformation kinetics, tribological behaviors, temperature dependence of lubricant properties, adaptation of tolerances and shear intensity to match viscoplastic behaviors, real-time process monitoring, and mechanics of billet/tooling interactions.
Leadbeater, Bonnie J; Dishion, Tom; Sandler, Irwin; Bradshaw, Catherine P; Dodge, Kenneth; Gottfredson, Denise; Graham, Phillip W; Lindstrom Johnson, Sarah; Maldonado-Molina, Mildred M; Mauricio, Anne M; Smith, Emilie Phillips
2018-06-23
Prevention science researchers and practitioners are increasingly engaged in a wide range of activities and roles to promote evidence-based prevention practices in the community. Ethical concerns invariably arise in these activities and roles that may not be explicitly addressed by university or professional guidelines for ethical conduct. In 2015, the Society for Prevention Research (SPR) Board of Directors commissioned Irwin Sandler and Tom Dishion to organize a series of roundtables and establish a task force to identify salient ethical issues encountered by prevention scientists and community-based practitioners as they collaborate to implement evidence-based prevention practices. This article documents the process and findings of the SPR Ethics Task Force and aims to inform continued efforts to articulate ethical practice. Specifically, the SPR membership and task force identified prevention activities that commonly stemmed from implementation and scale-up efforts. This article presents examples that illustrate typical ethical dilemmas. We present principles and concepts that can be used to frame the discussion of ethical concerns that may be encountered in implementation and scale-up efforts. We summarize value statements that stemmed from our discussion. We also conclude that the field of prevention science in general would benefit from standards and guidelines to promote ethical behavior and social justice in the process of implementing evidence-based prevention practices in community settings. It is our hope that this article serves as an educational resource for students, investigators, and Human Subjects Review Board members regarding some of the complexity of issues of fairness, equality, diversity, and personal rights for implementation of preventive interventions.
Ahmad, Asif; Teater, Phyllis; Bentley, Thomas D.; Kuehn, Lynn; Kumar, Rajee R.; Thomas, Andrew; Mekhjian, Hagop S.
2002-01-01
The benefits of computerized physician order entry have been widely recognized, although few institutions have successfully installed these systems. Obstacles to successful implementation are organizational as well as technical. In the spring of 2000, following a 4-year period of planning and customization, a 9-month pilot project, and a 14-month hiatus for year 2000, the Ohio State University Health System extensively implemented physician order entry across inpatient units. Implementation for specialty and community services is targeted for completion in 2002. On implemented units, all orders are processed through the system, with 80 percent being entered by physicians and the rest by nursing or other licensed care providers. The system is deployable across diverse clinical environments, focused on physicians as the primary users, and accepted by clinicians. These are the three criteria by which the authors measured the success of their implementation. They believe that the availability of specialty-specific order sets, the engagement of physician leadership, and a large-scale system implementation were key strategic factors that enabled physician-users to accept a physician order entry system despite significant changes in workflow. PMID:11751800
2012-01-01
Background Segmented service delivery with consequent inefficiencies in health systems was one of the main concerns raised during scaling up of disease-specific programs in the last two decades. The organized response to NCD is in its infancy in most LMICs, with little evidence on how the response is evolving in terms of institutional arrangements and policy development processes. Methods Drawing on qualitative review of policy and program documents from five LMICs and data from global key-informant surveys conducted in 2004 and 2010, we examine the current status of governance of the response to NCDs at national level along three dimensions: institutional arrangements for stewardship and program management and implementation; policies/plans; and multisectoral coordination and partnerships. Results Several positive trends were noted in the organization and governance of the response to NCDs: a shift from specific NCD-based programs to integrated NCD programs, increasing inclusion of NCDs in sector-wide health plans, and establishment of high-level multisectoral coordination mechanisms. Several areas of concern were identified. The evolving NCD-specific institutional structures are being treated as ‘program management and implementation’ entities rather than as lead ‘technical advisory’ bodies, with unclear division of roles and responsibilities between NCD-specific and sector-wide structures. NCD-specific and sector-wide plans are poorly aligned and lack prioritization, costing, and appropriate targets. Finally, the effectiveness of existing multisectoral coordination mechanisms remains questionable. Conclusions The ‘technical functions’ and ‘implementation and management functions’ should be clearly separated between NCD-specific units and sector-wide institutional structures to avoid duplicative segmented service delivery systems. Institutional capacity building efforts for NCDs should target both NCD-specific units (for building technical and analytical capacity) and sector-wide organizational units (for building program management and implementation capacity) in the MOH. The sector-wide health plans should reflect NCDs in proportion to their public health importance. NCD-specific plans should be developed in close consultation with sector-wide health and non-health stakeholders. These plans should expand on the directions provided by sector-wide health plans, specifying strategically prioritized, fully costed activities and realistic quantifiable targets for NCD control linked with a sector-wide expenditure framework. Multisectoral coordination mechanisms need to be strengthened with optimal decision-making powers and resource commitment and monitoring of their outputs. PMID:23067232
Experimental research control software system
NASA Astrophysics Data System (ADS)
Cohn, I. A.; Kovalenko, A. G.; Vystavkin, A. N.
2014-05-01
A software system intended for the automation of small-scale research has been developed. The software allows one to control equipment and to acquire and process data by means of simple scripts. The main purpose of the development is to make experiment automation easier, significantly reducing the effort needed to automate an experimental setup. In particular, minimal programming skills are required, and scripts are easy for supervisors to review. Interactions between scripts and equipment are managed automatically, allowing multiple scripts to run simultaneously. Unlike well-known commercial data acquisition software systems, control is performed by an imperative scripting language. This approach eases the implementation of complex control and data acquisition algorithms. A modular interface library handles interaction with external interfaces. While the most widely used interfaces are already implemented, a simple framework allows fast implementation of new software and hardware interfaces. While the software is in continuous development, with new features being implemented, it is already used in our laboratory for automation of helium-3 cryostat control and data acquisition. The software is open source and distributed under the GNU General Public License.
Mansoori, Bahar; Erhard, Karen K; Sunshine, Jeffrey L
2012-02-01
The availability of the Picture Archiving and Communication System (PACS) has revolutionized the practice of radiology in the past two decades and has been shown to increase productivity in radiology and medicine. PACS implementation and integration may bring along numerous unexpected issues, particularly in a large-scale enterprise. To achieve a successful PACS implementation, identifying the critical success and failure factors is essential. This article provides an overview of the process of implementing and integrating PACS in a comprehensive health system comprising an academic core hospital and numerous community hospitals. Important issues are addressed, touching all stages from planning to operation and training. The impact of an enterprise-wide radiology information system and PACS at the academic medical center (four specialty hospitals), in six additional community hospitals, and in all associated outpatient clinics, as well as the implications for the productivity and efficiency of the entire enterprise, are presented. Copyright © 2012 AUR. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Yang, Lin; Zhang, Feng; Wang, Cai-Zhuang; Ho, Kai-Ming; Travesset, Alex
2018-04-01
We present an implementation of EAM and FS interatomic potentials, which are widely used in simulating metallic systems, in HOOMD-blue, a software package designed to perform classical molecular dynamics simulations using GPU acceleration. We first discuss the details of our implementation and then report extensive benchmark tests. We demonstrate that single-precision floating point operations efficiently implemented on GPUs can produce sufficient accuracy when compared against double-precision codes, as demonstrated in calculations of the glass-transition temperature of Cu64.5Zr35.5 and the pair correlation function g(r) of liquid Ni3Al. Our code scales well with the size of the simulated system on NVIDIA Tesla M40 and P100 GPUs. Compared with another popular software package, LAMMPS, running on 32 cores of AMD Opteron 6220 processors, the GPU/CPU performance ratio can reach as high as 4.6. The source code can be accessed through the HOOMD-blue web page for free by any interested user.
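For readers unfamiliar with the functional form being ported to GPUs, here is a toy evaluation of the EAM energy, E_i = F(Σ_j ρ(r_ij)) + ½ Σ_j φ(r_ij); the pair functions are placeholders, and this is a conceptual sketch, not the HOOMD-blue API:

```python
# Conceptual EAM energy evaluation for a tiny cluster (toy functions,
# no cutoff, open boundaries) -- not the HOOMD-blue implementation.
import numpy as np

def rho(r):              # electron-density contribution from a neighbor
    return np.exp(-r)

def phi(r):              # pairwise repulsion
    return np.exp(-2.0 * r) / r

def embed(density):      # embedding energy F(rho)
    return -np.sqrt(density)

def eam_energy(positions):
    """Total EAM energy: embedding term plus half the pair sum per atom."""
    n = len(positions)
    total = 0.0
    for i in range(n):
        dists = [np.linalg.norm(positions[i] - positions[j])
                 for j in range(n) if j != i]
        total += embed(sum(rho(r) for r in dists))
        total += 0.5 * sum(phi(r) for r in dists)
    return total

atoms = np.array([[0.0, 0, 0], [2.5, 0, 0], [0, 2.5, 0], [0, 0, 2.5]])
print("toy EAM energy:", eam_energy(atoms))
```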
Hine, N D M; Haynes, P D; Mostofi, A A; Payne, M C
2010-09-21
We present calculations of formation energies of defects in an ionic solid (Al2O3) extrapolated to the dilute limit, corresponding to a simulation cell of infinite size. The large-scale calculations required for this extrapolation are enabled by developments in the approach to parallel sparse matrix algebra operations, which are central to linear-scaling density-functional theory calculations. The computational cost of manipulating sparse matrices, whose sizes are determined by the large number of basis functions present, is greatly improved with this new approach. We present details of the sparse algebra scheme implemented in the ONETEP code using hierarchical sparsity patterns, and demonstrate its use in calculations on a wide range of systems, involving thousands of atoms on hundreds to thousands of parallel processes.
Miranda, Leah S; Datta, Santanu; Melzer, Anne C; Wiener, Renda Soylemez; Davis, James M; Tong, Betty C; Golden, Sara E; Slatore, Christopher G
2017-10-01
Screening for lung cancer using low-dose computed tomography has been demonstrated to reduce lung cancer-related mortality and is being widely implemented. Further research in this area is needed to assess the impact of screening on patient-centered outcomes. Here, we describe the design and rationale for a new study entitled Lung Cancer Screening Implementation: Evaluation of Patient-Centered Care. The protocol is composed of an interconnected series of studies evaluating patients and clinicians who are engaged in lung cancer screening in real-world settings. The primary goal of this study is to evaluate communication processes that are being used in routine care and to identify best practices that can be readily scaled up for implementation in multiple settings. We hypothesize that higher overall quality of patient-clinician communication processes will be associated with lower levels of distress and decisional conflict as patients decide whether or not to participate in lung cancer screening. This work is a critical step toward identifying modifiable mechanisms that are associated with high quality of care for the millions of patients who will consider lung cancer screening. Given the enormous potential benefits and burdens of lung cancer screening on patients, clinicians, and the healthcare system, it is important to identify and then scale up quality communication practices that positively influence patient-centered care.
NASA Technical Reports Server (NTRS)
Singh, M.
2011-01-01
During the last decades, a number of fiber-reinforced ceramic composites have been developed and tested for various aerospace and ground-based applications. However, a number of challenges remain that slow the wide-scale implementation of these materials. In addition to continuous fiber-reinforced composites, other innovative materials have been developed, including fibrous monoliths and sintered fiber-bonded ceramics. The sintered silicon carbide fiber-bonded ceramics have been fabricated by the hot pressing and sintering of silicon carbide fibers. However, for this system a reliable property database, as well as various issues related to thermomechanical performance, integration, and fabrication of large and complex-shaped components, has yet to be addressed. In this presentation, thermomechanical properties of sintered silicon carbide fiber-bonded ceramics (as-fabricated and joined) will be presented. In addition, the critical need for manufacturing and integration technologies for the successful implementation of these materials will be discussed.
Bardosh, Kevin Louis
2016-01-01
Efforts to control neglected tropical diseases have increasingly focused on questions of implementation. But how should we conceptualize the implementation process? Drawing on ethnographic fieldwork between 2010 and 2012, in this article I explore efforts by a small-scale public-private partnership to use private veterinarians to sustainably control zoonotic sleeping sickness in Uganda. With a fundamental tension between business incentives and vector control, I show how divergences in knowledge, power, values, and social norms shaped project implementation and community responses. Reflecting more widely on the relationships between project plans and local realities, I argue that these encounters reveal the heuristic value in approaching global health interventions as evolving 'social experiments.' This metaphor reveals the uncertainty inherent to dominant narratives and models, the role of available expertise in defining the limits of action, and the need for continuous adaption to synchronize with emergent social and institutional topographies.
Acceptance of lean redesigns in primary care: A contextual analysis.
Hung, Dorothy; Gray, Caroline; Martinez, Meghan; Schmittdiel, Julie; Harrison, Michael I
Lean is a leading change strategy used in health care to achieve short-term efficiency and quality improvement while promising longer-term system transformation. Most research examines Lean intervention to address isolated problems, rather than to achieve broader systemic changes to care delivery. Moreover, no studies examine contextual influences on system-wide Lean implementation efforts in primary care. The aim of this study was to identify contextual factors most critical to implementing and scaling Lean redesigns across all primary care clinics in a large, ambulatory care delivery system. Over 100 interviews and focus groups were conducted with frontline physicians, clinical staff, and operational leaders. Data analysis was guided by a modified Consolidated Framework for Implementation Research (CFIR), a popular implementation science framework. On the basis of expert recommendations, the modified framework targets factors influencing the implementation of process redesigns. This modified framework, the CFIR-PR, informed our identification of contextual factors that most impacted Lean acceptance among frontline physicians and staff. Several domains identified by the CFIR-PR were critical to acceptance of Lean redesigns. Regarding the implementation process, acceptance was influenced by time and intensity of exposure to changes, "top-down" versus "bottom-up" implementation styles, and degrees of employee engagement in developing new workflows. Important factors in the inner setting were the clinic's culture and style of leadership, along with availability of information about Lean's effectiveness. Last, implementation efforts were impacted by individual and team characteristics regarding changed work roles and related issues of professional identity, authority, and autonomy. This study underscores the need for change leaders to consider the contextual factors that surround efforts to implement Lean in primary care. As Lean redesigns are scaled across a system, special attention is warranted with respect to the implementation approach, internal clinic setting, and implications for professional roles and identities of physicians and staff.
Ehrhart, Mark G; Aarons, Gregory A; Farahnak, Lauren R
2014-10-23
Although the importance of the organizational environment for implementing evidence-based practices (EBP) has been widely recognized, there are limited options for measuring implementation climate in public sector health settings. The goal of this research was to develop and test a measure of EBP implementation climate that would both capture a broad range of issues important for effective EBP implementation and be of practical use to researchers and managers seeking to understand and improve the implementation of EBPs. Participants were 630 clinicians working in 128 work groups in 32 US-based mental health agencies. Items to measure climate for EBP implementation were developed based on past literature on implementation climate and other strategic climates and in consultation with experts on the implementation of EBPs in mental health settings. The sample was randomly split at the work group level of analysis; half of the sample was used for exploratory factor analysis (EFA), and the other half was used for confirmatory factor analysis (CFA). The entire sample was utilized for additional analyses assessing the reliability, support for level of aggregation, and construct-based evidence of validity. The EFA resulted in a final factor structure of six dimensions for the Implementation Climate Scale (ICS): 1) focus on EBP, 2) educational support for EBP, 3) recognition for EBP, 4) rewards for EBP, 5) selection for EBP, and 6) selection for openness. This structure was supported in the other half of the sample using CFA. Additional analyses supported the reliability and construct-based evidence of validity for the ICS, as well as the aggregation of the measure to the work group level. The ICS is a very brief (18 item) and pragmatic measure of a strategic climate for EBP implementation. It captures six dimensions of the organizational context that indicate to employees the extent to which their organization prioritizes and values the successful implementation of EBPs. The ICS can be used by researchers to better understand the role of the organizational context on implementation outcomes and by organizations to evaluate their current climate as they consider how to improve the likelihood of implementation success.
A subthreshold aVLSI implementation of the Izhikevich simple neuron model.
Rangan, Venkat; Ghosh, Abhishek; Aparin, Vladimir; Cauwenberghs, Gert
2010-01-01
We present a circuit architecture for compact analog VLSI implementation of the Izhikevich neuron model, which efficiently describes a wide variety of neuron spiking and bursting dynamics using two state variables and four adjustable parameters. Log-domain circuit design utilizing MOS transistors in subthreshold results in high energy efficiency, with less than 1pJ of energy consumed per spike. We also discuss the effects of parameter variations on the dynamics of the equations, and present simulation results that replicate several types of neural dynamics. The low power operation and compact analog VLSI realization make the architecture suitable for human-machine interface applications in neural prostheses and implantable bioelectronics, as well as large-scale neural emulation tools for computational neuroscience.
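For reference, the two-state-variable model the circuit realizes can be sketched in a few lines; the parameter set below is the standard "regular spiking" choice from Izhikevich's published model, and the input current and time step are arbitrary:

```python
# Euler-integration sketch of the Izhikevich simple model:
#   v' = 0.04 v^2 + 5 v + 140 - u + I,   u' = a (b v - u),
# with reset v -> c, u -> u + d whenever v >= 30 mV.

a, b, c, d = 0.02, 0.2, -65.0, 8.0   # "regular spiking" parameters
v, u = -65.0, b * -65.0              # initial membrane and recovery values
dt, I = 0.5, 10.0                    # ms time step, constant input current

spikes = []
for step in range(2000):             # 2000 * 0.5 ms = 1 s of simulated time
    v += dt * (0.04 * v**2 + 5.0 * v + 140.0 - u + I)
    u += dt * a * (b * v - u)
    if v >= 30.0:                    # spike: record time and reset
        spikes.append(step * dt)
        v, u = c, u + d

print(f"{len(spikes)} spikes in 1 s of simulated time")
```

Varying only (a, b, c, d) reproduces bursting, chattering, and other firing patterns, which is why four adjustable parameters suffice for the circuit.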
NASA Technical Reports Server (NTRS)
Mannino, Antonio
2015-01-01
NASA's GEOstationary Coastal and Air Pollution Events (GEO-CAPE) mission concept, recommended by the U.S. National Research Council (2007), focuses on measurements of atmospheric trace gases and aerosols and of aquatic coastal ecology and biogeochemistry from geostationary orbit (35,786 km altitude). GEO-CAPE is currently in pre-formulation (pre-Phase A) with no established launch date. NASA continues to support science and engineering studies to reduce mission risk. Instrument design lab (IDL) studies were commissioned in 2014 to design and cost two implementations of geostationary ocean color instruments, (1) a Wide-Angle Spectrometer (WAS) and (2) a Filter Radiometer (FR), and (3) to conduct a cost-scaling study comparing the costs of implementing different science performance requirements.
Drug checking: a potential solution to the opioid overdose epidemic?
Bardwell, Geoff; Kerr, Thomas
2018-05-25
North America is experiencing an overdose epidemic driven in part by the proliferation of illicitly-manufactured fentanyl and related analogues. In response, communities are scaling up novel overdose prevention interventions. Included are drug checking technologies. Drug checking technologies aim to identify the contents of illicit drugs. These technologies vary considerably in terms of cost, accuracy, and usability, and while efforts are now underway to implement drug checking programs for people who inject drugs, there remains a lack of rigorous evaluation of their impacts. Given the ongoing overdose crisis and the urgent need for effective responses, research on drug checking should be prioritized. However, while such research should be supported, it should be completed before these technologies are widely implemented.
Wiese, J; König, R
2009-01-01
Biogas plants are gaining importance worldwide due to several advantages. However, in terms of equipment, most existing biogas plants are low-tech: from the point of view of instrumentation, control and automation (ICA), most plants are black-box systems. Consequently, practice shows that many biogas plants are operated sub-optimally and/or in critical (load) ranges. To solve these problems, some new biogas plants have been equipped with modern machinery and ICA equipment. In this paper, the authors show details and discuss operational results of a modern agricultural biogas plant and the resulting opportunities for the implementation of plant-wide automation.
Légaré, France; Moumjid-Ferdjaoui, Nora; Drolet, Renée; Stacey, Dawn; Härter, Martin; Bastian, Hilda; Beaulieu, Marie-Dominique; Borduas, Francine; Charles, Cathy; Coulter, Angela; Desroches, Sophie; Friedrich, Gwendolyn; Gafni, Amiram; Graham, Ian D.; Labrecque, Michel; LeBlanc, Annie; Légaré, Jean; Politi, Mary; Sargeant, Joan; Thomson, Richard
2014-01-01
Shared decision making is now making inroads in health care professionals’ continuing education curriculum, but there is no consensus on what core competencies are required by clinicians for effectively involving patients in health-related decisions. Ready-made programs for training clinicians in shared decision making are in high demand, but existing programs vary widely in their theoretical foundations, length, and content. An international, interdisciplinary group of 25 individuals met in 2012 to discuss theoretical approaches to making health-related decisions, compare notes on existing programs, take stock of stakeholders’ concerns, and deliberate on core competencies. This article summarizes the results of those discussions. Some participants believed that existing models already provide a sufficient conceptual basis for developing and implementing shared decision making competency-based training programs on a wide scale. Others argued that this would be premature as there is still no consensus on the definition of shared decision making or sufficient evidence to recommend specific competencies for implementing shared decision making. However, all participants agreed that there were 2 broad types of competencies that clinicians need for implementing shared decision making: relational competencies and risk communication competencies. Further multidisciplinary research could broaden and deepen our understanding of core competencies for shared decision making training. PMID:24347105
NASA Astrophysics Data System (ADS)
Noh, S.; Tachikawa, Y.; Shiiba, M.; Kim, S.
2011-12-01
Applications of sequential data assimilation methods have been increasing in hydrology to reduce uncertainty in model predictions. In a distributed hydrologic model, there are many types of state variables, and each variable interacts with the others on different time scales. However, a framework to deal with the delayed response that originates from the different time scales of hydrologic processes has not been thoroughly addressed in hydrologic data assimilation. In this study, we propose a lagged filtering scheme to consider the lagged response of internal states in a distributed hydrologic model using two filtering schemes: particle filtering (PF) and ensemble Kalman filtering (EnKF). The EnKF is a widely used sub-optimal filter that is computationally efficient with a limited number of ensemble members but still relies on a Gaussian approximation. The PF can be an alternative, in which the propagation of all uncertainties is carried out by a suitable selection of randomly generated particles without any assumptions about the nature of the distributions involved. For the PF, an advanced particle regularization scheme is implemented to preserve the diversity of the particle system; for the EnKF, the ensemble square root filter (EnSRF) is implemented. Each filtering method is parallelized and implemented on a high-performance computing system. A distributed hydrologic model, the water and energy transfer processes (WEP) model, is applied to the Katsura River catchment, Japan, to demonstrate the applicability of the proposed approaches. Forecast results from the PF and EnKF are compared and analyzed in terms of prediction accuracy and probabilistic adequacy. Discussions focus on the prospects and limitations of each data assimilation method.
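As a reference point for the EnKF side of the comparison, a bare-bones stochastic (perturbed-observation) analysis step looks like this; the state dimension, observation operator, and error covariances are hypothetical, and the regularized PF and EnSRF variants used in the study are not reproduced:

```python
# Minimal stochastic EnKF analysis step with perturbed observations.
import numpy as np

rng = np.random.default_rng(42)
n_state, n_obs, n_ens = 10, 3, 50

X = rng.normal(size=(n_state, n_ens))          # forecast ensemble (columns)
H = np.zeros((n_obs, n_state))                 # observation operator:
H[0, 0] = H[1, 4] = H[2, 9] = 1.0              # observe three state components
R = 0.1 * np.eye(n_obs)                        # observation-error covariance
y = rng.normal(size=n_obs)                     # observation vector

A = X - X.mean(axis=1, keepdims=True)          # ensemble anomalies
P = A @ A.T / (n_ens - 1)                      # sample forecast covariance
K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)   # Kalman gain

# Update each member against its own perturbed observation.
Y = y[:, None] + rng.multivariate_normal(np.zeros(n_obs), R, n_ens).T
Xa = X + K @ (Y - H @ X)
print("analysis mean:", Xa.mean(axis=1))
```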
Path integral Monte Carlo ground state approach: formalism, implementation, and applications
NASA Astrophysics Data System (ADS)
Yan, Yangqian; Blume, D.
2017-11-01
Monte Carlo techniques have played an important role in understanding strongly correlated systems across many areas of physics, covering a wide range of energy and length scales. Among the many Monte Carlo methods applicable to quantum mechanical systems, the path integral Monte Carlo approach with its variants has been employed widely. Since semi-classical or classical approaches will not be discussed in this review, path integral based approaches can for our purposes be divided into two categories: approaches applicable to quantum mechanical systems at zero temperature and approaches applicable to quantum mechanical systems at finite temperature. While these two approaches are related to each other, the underlying formulation and aspects of the algorithm differ. This paper reviews the path integral Monte Carlo ground state (PIGS) approach, which solves the time-independent Schrödinger equation. Specifically, the PIGS approach allows for the determination of expectation values with respect to eigen states of the few- or many-body Schrödinger equation provided the system Hamiltonian is known. The theoretical framework behind the PIGS algorithm, implementation details, and sample applications for fermionic systems are presented.
Examining Barriers to Sustained Implementation of School-Wide Prevention Practices
ERIC Educational Resources Information Center
Turri, Mary G.; Mercer, Sterett H.; McIntosh, Kent; Nese, Rhonda N. T.; Strickland-Cohen, M. Kathleen; Hoselton, Robert
2016-01-01
The purpose of this study was to determine if an experimental 5-item measure of barriers to implementing and sustaining school-wide prevention practices, the "Assessment of Barriers to Implementation and Sustainability in Schools" (ABISS), would relate to objective measures of school-wide positive behavioral interventions and supports…
Effects of School-Wide Positive Behavior Support on Teacher Self-Efficacy
ERIC Educational Resources Information Center
Kelm, Joanna L.; McIntosh, Kent
2012-01-01
This study examined the relationships between implementation of a school-wide approach to behavior, School-wide Positive Behavior Support (SWPBS), and teacher self-efficacy. Twenty-two teachers from schools implementing SWPBS and 40 teachers from schools not implementing SWPBS completed a questionnaire measuring aspects of self-efficacy.…
Evaluation of a hospital-wide PACS: costs and benefits of the Hammersmith PACS installation
NASA Astrophysics Data System (ADS)
Bryan, Stirling; Keen, Justin; Buxton, Martin J.; Weatherburn, Gwyneth C.
1992-07-01
The unusual nature of sites chosen for hospital-wide PACS implementations and the very small number of proposed implementations make evaluation a complex task. The UK Department of Health is funding both the evaluation and implementation of a hospital-wide PACS. The Brunel University evaluation of the Hammersmith Hospital PACS has two main components: an economic evaluation of the costs and benefits of hospital-wide PACS installations and an exercise in monitoring the implementation process. This paper concentrates on the economic component.
65nm RadSafe™ Technology for RC64 and Advanced SOCs
NASA Astrophysics Data System (ADS)
Liran, Tuvia; Ginosar, Ran; Lange, Fredy; Mandler, Alberto; Aviely, Peleg; Meirov, Henri; Goldberg, Michael; Meister, Zeev; Oliel, Mickey
2015-09-01
The trend of microelectronics scaling provides certain advantages for space components, as well as some challenges. It enables implementing highly integrated, high-performance ASICs, reducing power, area and weight. Scaling also improves the immunity to TID and SEL in most cases, but increases the soft error rate significantly. Ramon Chips adopted the 65nm technology for implementing RC64 [1,2], a 64-core DSP for space applications, and for making other future products. The 65nm process node is widely used, very mature, and supported by a wide range of IP providers. Thus the need for full-custom design of cores and IPs is minimized, and radiation hardening is achievable by mitigating the radiation effects on the available IPs and developing proprietary IPs only to complement the available ones. The RadSafe_65TM technology includes hardened standard cell and I/O libraries, methods for mitigation of radiation effects in COTS IP cores (SRAM, PLL, SERDES, DDR2/3 interface), and unique cores for monitoring radiation effects and junction temperature. We developed RADIC6, a technology development vehicle, for verification of all hard cores and of the methodologies and design flow required for RC64. RADIC6 includes the test structures for characterizing the IP cores for immunity to all radiation effects. This paper describes the main elements and IP cores of RadSafe_65TM, as well as the contents of the RADIC6 test chip.
NASA Astrophysics Data System (ADS)
Schertzer, D. J. M.; Versini, P. A.; Tchiguirinskaia, I.
2017-12-01
Urban areas face an expected increase in the intensity and frequency of extreme weather events due to climate change. Combined with unsustainable urbanization, this should exacerbate environmental consequences related to the water cycle, such as stormwater management issues, intensification of the urban heat island, and biodiversity degradation. Blue Green Solutions (BGS), such as green roofs, vegetated swales or urban ponds, appear to be particularly efficient at reducing the potential impact of new and existing urban developments with respect to these issues. Against this background, the French ANR EVNATURB project aims to develop a platform to assess the ecosystem services provided by BGS in relation to the issues mentioned above. By bringing together a multi-disciplinary consortium coupling monitoring, modelling and prospective analysis, it tackles several scientific issues currently limiting the wide implementation of BGS. Based on high-resolution monitored sites and modelling tools, the space-time variability of the related physical processes will be studied over a wide range of scales (from the material to the district scale), as well as local social-environmental stakes and constraints, to better account for the complexity of the urban environment. The EVNATURB platform developed during the project is intended for every stakeholder involved in urban development projects (planners, architects, engineering and environmental certification companies…) and will help them implement BGS and evaluate which ones are most appropriate for a particular project depending on its environmental objectives and constraints, particularly for obtaining environmental certification.
NASA Astrophysics Data System (ADS)
Rainville, L.; Farrar, J. T.; Shcherbina, A.; Centurioni, L. R.
2017-12-01
The Salinity Processes in the Upper-ocean Regional Study (SPURS) is a program aimed at understanding the patterns and variability of sea surface salinity. Following the first SPURS program in an evaporation-dominated region (2012-2013), the SPURS-2 program targeted the wide range of spatial and temporal scales associated with processes controlling salinity in the rain-dominated Eastern Pacific Fresh Pool. Autonomous instruments were delivered in August and September 2016 using research vessels and conducted observations over one complete annual cycle. The SPURS-2 field program used coordinated observations from many different autonomous platforms, and a mix of Lagrangian and Eulerian approaches. Here we discuss the motivation, implementation, and early results of SPURS-2.
Coates, Peter S.; Prochazka, Brian G.; Ricca, Mark A.; Wann, Gregory T.; Aldridge, Cameron L.; Hanser, Steven E.; Doherty, Kevin E.; O'Donnell, Michael S.; Edmunds, David R.; Espinosa, Shawn P.
2017-08-10
Population ecologists have long recognized the importance of ecological scale in understanding processes that guide observed demographic patterns for wildlife species. However, directly incorporating spatial and temporal scale into monitoring strategies that detect whether trajectories are driven by local or regional factors is challenging and rarely implemented. Identifying the appropriate scale is critical to the development of management actions that can attenuate or reverse population declines. We describe a novel example of a monitoring framework for estimating annual rates of population change for greater sage-grouse (Centrocercus urophasianus) within a hierarchical and spatially nested structure. Specifically, we conducted Bayesian analyses on a 17-year dataset (2000–2016) of lek counts in Nevada and northeastern California to estimate annual rates of population change, and compared trends across nested spatial scales. We identified leks and larger scale populations in immediate need of management, based on the occurrence of two criteria: (1) crossing of a destabilizing threshold designed to identify significant rates of population decline at a particular nested scale; and (2) crossing of decoupling thresholds designed to identify rates of population decline at smaller scales that decouple from rates of population change at a larger spatial scale. This approach establishes how declines affected by local disturbances can be separated from those operating at larger scales (for example, broad-scale wildfire and region-wide drought). Given the threshold output from our analysis, this adaptive management framework can be implemented readily and annually to facilitate responsive and effective actions for sage-grouse populations in the Great Basin. The rules of the framework can also be modified to identify populations responding positively to management action or demonstrating strong resilience to disturbance. Similar hierarchical approaches might be beneficial for other species occupying landscapes with heterogeneous disturbance and climatic regimes.
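The two flagging criteria combine into a simple decision rule. The sketch below illustrates that logic with hypothetical threshold values and function names; in the study itself, the rates of change are Bayesian posterior estimates from lek-count data, not raw numbers.

```python
# Illustrative two-criterion flagging rule; thresholds and names are
# hypothetical, not those used in the sage-grouse monitoring framework.

def flag_population(local_rate, regional_rate,
                    destabilizing=-0.10, decoupling=-0.05):
    """Flag a lek or population for management attention.

    local_rate, regional_rate: estimated annual rates of population
    change (e.g., -0.12 means a 12% annual decline) at the nested
    scale and at the next-larger spatial scale, respectively.
    """
    crossed_destabilizing = local_rate < destabilizing
    # Decoupling: the smaller scale declines substantially faster
    # than the larger scale it is nested within.
    crossed_decoupling = (local_rate - regional_rate) < decoupling
    return crossed_destabilizing and crossed_decoupling

print(flag_population(-0.15, -0.02))  # True: local decline decoupled from region
print(flag_population(-0.15, -0.14))  # False: decline mirrors the regional trend
```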
Scaling the PuNDIT project for wide area deployments
NASA Astrophysics Data System (ADS)
McKee, Shawn; Batista, Jorge; Carcassi, Gabriele; Dovrolis, Constantine; Lee, Danny
2017-10-01
In today's world of distributed scientific collaborations, there are many challenges to providing reliable inter-domain network infrastructure. Network operators use a combination of active monitoring and trouble tickets to detect problems, but these are often ineffective at identifying issues that impact wide-area network users. Additionally, these approaches do not scale to wide-area inter-domain networks due to the unavailability of data from all the domains along typical network paths. The Pythia Network Diagnostic InfrasTructure (PuNDIT) project aims to create a scalable infrastructure for automating the detection and localization of problems across these networks. The project goal is to gather and analyze metrics from existing perfSONAR monitoring infrastructures to identify the signatures of possible problems, locate affected network links, and report them to the user in an intuitive fashion. Simply put, PuNDIT seeks to convert complex network metrics into easily understood diagnoses in an automated manner. We present our progress to date in developing, testing and deploying the PuNDIT system, describe the current implementation architecture, and demonstrate some of the user interfaces it will support. We close by discussing the remaining challenges and next steps, and where we see the project going in the future.
DiClemente, Carlo C; Crouch, Taylor Berens; Norwood, Amber E Q; Delahanty, Janine; Welsh, Christopher
2015-03-01
Screening, brief intervention, and referral to treatment (SBIRT) has become an empirically supported and widely implemented approach in primary and specialty care for addressing substance misuse. Accordingly, training of providers in SBIRT has increased exponentially in recent years. However, the quality and fidelity of training programs and subsequent interventions are largely unknown because of the lack of SBIRT-specific evaluation tools. The purpose of this study was to create a coding scale to assess quality and fidelity of SBIRT interactions addressing alcohol, tobacco, illicit drugs, and prescription medication misuse. The scale was developed to evaluate performance in an SBIRT residency training program. Scale development was based on training protocol and competencies with consultation from Motivational Interviewing coding experts. Trained medical residents practiced SBIRT with standardized patients during 10- to 15-min videotaped interactions. This study included 25 tapes from the Family Medicine program coded by 3 unique coder pairs with varying levels of coding experience. Interrater reliability was assessed for overall scale components and individual items via intraclass correlation coefficients. Coder pair-specific reliability was also assessed. Interrater reliability was excellent overall for the scale components (>.85) and nearly all items. Reliability was higher for more experienced coders, though still adequate for the trained coder pair. Descriptive data demonstrated a broad range of adherence and skills. Subscale correlations supported concurrent and discriminant validity. Data provide evidence that the MD3 SBIRT Coding Scale is a psychometrically reliable coding system for evaluating SBIRT interactions and can be used to evaluate implementation skills for fidelity, training, assessment, and research. Recommendations for refinement and further testing of the measure are discussed. (PsycINFO Database Record (c) 2015 APA, all rights reserved).
Tuncbag, Nurcan; Gursoy, Attila; Nussinov, Ruth; Keskin, Ozlem
2011-08-11
Prediction of protein-protein interactions at the structural level on the proteome scale is important because it allows prediction of protein function, helps drug discovery and takes steps toward genome-wide structural systems biology. We provide a protocol (termed PRISM, protein interactions by structural matching) for large-scale prediction of protein-protein interactions and assembly of protein complex structures. The method consists of two components: rigid-body structural comparisons of target proteins to known template protein-protein interfaces and flexible refinement using a docking energy function. The PRISM rationale follows our observation that globally different protein structures can interact via similar architectural motifs. PRISM predicts binding residues by using structural similarity and evolutionary conservation of putative binding residue 'hot spots'. Ultimately, PRISM could help to construct cellular pathways and functional, proteome-scale annotation. PRISM is implemented in Python and runs in a UNIX environment. The program accepts Protein Data Bank-formatted protein structures and is available at http://prism.ccbb.ku.edu.tr/prism_protocol/.
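The protocol's two-stage rationale lends itself to a compact outline. The sketch below is illustrative only (toy similarity scoring, made-up protein names and surfaces); the published protocol performs rigid-body structural alignment of target surfaces against template interfaces, scores conserved hot-spot residues, and then refines candidate complexes with a docking energy function.

```python
# Conceptual sketch of the two-stage PRISM rationale: (1) match target
# surfaces to both sides of known template interfaces, (2) refine and
# rank candidates energetically (stage 2 not shown). All data are toy.

def structural_similarity(target_surface, template_side):
    """Placeholder score in [0, 1]; PRISM uses rigid-body structural
    alignment plus conservation of hot-spot residues instead."""
    shared = set(target_surface) & set(template_side)
    return len(shared) / max(len(template_side), 1)

def predict_interactions(targets, template_interfaces, threshold=0.5):
    candidates = []
    for a in targets:
        for b in targets:
            if a is b:
                continue
            for left, right in template_interfaces:
                # Both partners must match complementary sides of the
                # same known interface to form a putative interaction.
                if (structural_similarity(a["surface"], left) >= threshold and
                        structural_similarity(b["surface"], right) >= threshold):
                    candidates.append((a["name"], b["name"]))
    return sorted(set(candidates))

targets = [{"name": "P1", "surface": ["A10", "R12", "W15"]},
           {"name": "P2", "surface": ["D33", "K35", "Y36"]}]
templates = [(["A10", "W15"], ["D33", "Y36"])]
print(predict_interactions(targets, templates))  # [('P1', 'P2')]
```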
Autonomous smart sensor network for full-scale structural health monitoring
NASA Astrophysics Data System (ADS)
Rice, Jennifer A.; Mechitov, Kirill A.; Spencer, B. F., Jr.; Agha, Gul A.
2010-04-01
The demands of aging infrastructure require effective methods for structural monitoring and maintenance. Wireless smart sensor networks offer the ability to enhance structural health monitoring (SHM) practices through the utilization of onboard computation to achieve distributed data management. Such an approach is scalable to the large number of sensor nodes required for high-fidelity modal analysis and damage detection. While smart sensor technology is not new, the number of full-scale SHM applications has been limited. This slow progress is due, in part, to the complex network management issues that arise when moving from a laboratory setting to a full-scale monitoring implementation. This paper presents flexible network management software that enables continuous and autonomous operation of wireless smart sensor networks for full-scale SHM applications. The software components combine sleep/wake cycling for enhanced power management with threshold detection for triggering network-wide tasks, such as synchronized sensing or decentralized modal analysis, during periods of critical structural response.
Hoagwood, Kimberly Eaton; Olin, S. Serene; Horwitz, Sarah; McKay, Mary; Cleek, Andrew; Gleacher, Alissa; Lewandowski, Eric; Nadeem, Erum; Acri, Mary; Chor, Ka Ho Brian; Kuppinger, Anne; Burton, Geraldine; Weiss, Dara; Frank, Samantha; Finnerty, Molly; Bradbury, Donna M.; Woodlock, Kristin M.; Hogan, Michael
2014-01-01
Dissemination of innovations is widely considered the sine qua non for system improvement. At least two dozen states are rolling out evidence-based mental health practices targeted at children and families, using trainings, consultations, webinars, and learning collaboratives to improve quality and outcomes. In New York State (NYS) a group of researchers, policy-makers, providers and family support specialists have worked in partnership since 2002 to redesign and evaluate the children's mental health system. Five system strategies driven by empirically-based practices and organized within a state-supported infrastructure have been used in the child and family service system with over 2,000 providers: (a) business practices; (b) use of health information technologies in quality improvement; (c) specific clinical interventions targeted at common childhood disorders; (d) parent activation; and (e) quality indicator development. The NYS system has provided a laboratory for naturalistic experiments. We describe these initiatives, key findings and challenges, lessons learned for scaling, and implications for creating evidence-based implementation policies in state systems. PMID:24460518
Parallel group independent component analysis for massive fMRI data sets.
Chen, Shaojie; Huang, Lei; Qiu, Huitong; Nebel, Mary Beth; Mostofsky, Stewart H; Pekar, James J; Lindquist, Martin A; Eloyan, Ani; Caffo, Brian S
2017-01-01
Independent component analysis (ICA) is widely used in the field of functional neuroimaging to decompose data into spatio-temporal patterns of co-activation. In particular, ICA has found wide usage in the analysis of resting state fMRI (rs-fMRI) data. Recently, a number of large-scale data sets have become publicly available that consist of rs-fMRI scans from thousands of subjects. As a result, efficient ICA algorithms that scale well to the increased number of subjects are required. To address this problem, we propose a two-stage likelihood-based algorithm for performing group ICA, which we denote Parallel Group Independent Component Analysis (PGICA). By utilizing the sequential nature of the algorithm and parallel computing techniques, we are able to efficiently analyze data sets from large numbers of subjects. We illustrate the efficacy of PGICA, which has been implemented in R and is freely available through the Comprehensive R Archive Network, through simulation studies and application to rs-fMRI data from two large multi-subject data sets, consisting of 301 and 779 subjects respectively.
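The paper's PGICA is a two-stage likelihood-based algorithm implemented in R; the sketch below is only a schematic of the general pattern it exploits, namely that subject-level decompositions are independent and can run in parallel before a group-level aggregation. scikit-learn's FastICA stands in for the likelihood-based stages, so this is an assumption-laden stand-in, not the authors' method.

```python
# Schematic of parallel per-subject decomposition followed by a group
# stage, using FastICA as a stand-in for the likelihood-based stages.
import numpy as np
from multiprocessing import Pool
from sklearn.decomposition import FastICA

def subject_stage(data):
    # data: (time, voxels) array for one subject
    ica = FastICA(n_components=5, random_state=0, max_iter=500)
    ica.fit(data)
    return ica.components_          # subject-level spatial maps

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    subjects = [rng.standard_normal((100, 200)) for _ in range(8)]
    with Pool(processes=4) as pool:
        maps = pool.map(subject_stage, subjects)   # subjects in parallel
    # Group stage (simplified): stack subject maps, decompose again.
    group = FastICA(n_components=5, random_state=0, max_iter=500)
    group_maps = group.fit_transform(np.vstack(maps).T).T
    print(group_maps.shape)         # (5, 200) group-level components
```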
Implementation of a piezoelectric energy harvester in railway health monitoring
NASA Astrophysics Data System (ADS)
Li, Jingcheng; Jang, Shinae; Tang, Jiong
2014-03-01
With the development of wireless sensor technology, wireless sensor networks have shown great potential for railway health monitoring. However, how to supply continuous power to the wireless sensor nodes is one of the critical issues in long-term full-scale deployment of wireless smart sensors. Several energy harvesting methods are available, including solar, vibration and wind; among them, vibration-based energy harvesting using piezoelectric materials has shown potential for converting ambient vibration energy to electric energy in railway health monitoring, even for underground subway systems. However, the piezoelectric energy harvester has two major problems: it generates only a small amount of energy, and its narrow-band natural frequency must exactly match the excitation frequency. To overcome these problems, a wide-band piezoelectric energy harvester, which can generate more power across a range of frequencies, has been designed and validated experimentally. It was then applied in a full-scale field test using an actual railway train. The power generation of the wide-band piezoelectric array was compared to that of a narrow-band, resonance-based piezoelectric energy harvester.
Implementation of highly parallel and large scale GW calculations within the OpenAtom software
NASA Astrophysics Data System (ADS)
Ismail-Beigi, Sohrab
The need to describe electronic excitations with better accuracy than provided by band structures produced by Density Functional Theory (DFT) has been a long-term enterprise for the computational condensed matter and materials theory communities. In some cases, appropriate theoretical frameworks have existed for some time but have been difficult to apply widely due to computational cost. For example, the GW approximation incorporates a great deal of important non-local and dynamical electronic interaction effects but has been too computationally expensive for routine use in large materials simulations. OpenAtom is an open source massively parallel ab initio density functional software package based on plane waves and pseudopotentials (http://charm.cs.uiuc.edu/OpenAtom/) that takes advantage of the Charm++ parallel framework. At present, it is developed via a three-way collaboration, funded by an NSF SI2-SSI grant (ACI-1339804), between Yale (Ismail-Beigi), IBM T. J. Watson (Glenn Martyna) and the University of Illinois at Urbana-Champaign (Laxmikant Kale). We will describe the project and our current approach towards implementing large scale GW calculations with OpenAtom. Potential applications of large scale parallel GW software for problems involving electronic excitations in semiconductor and/or metal oxide systems will also be pointed out.
Dispersion interactions in Density Functional Theory
NASA Astrophysics Data System (ADS)
Andrinopoulos, Lampros; Hine, Nicholas; Mostofi, Arash
2012-02-01
Semilocal functionals in Density Functional Theory (DFT) achieve high accuracy simulating a wide range of systems, but miss the effect of dispersion (vdW) interactions, important in weakly bound systems. We study two different methods to include vdW in DFT: First, we investigate a recent approach [1] to evaluate the vdW contribution to the total energy using maximally-localized Wannier functions. Using a set of simple dimers, we show that it has a number of shortcomings that hamper its predictive power; we then develop and implement a series of improvements [2] and obtain binding energies and equilibrium geometries in closer agreement to quantum-chemical coupled-cluster calculations. Second, we implement the vdW-DF functional [3], using Soler's method [4], within ONETEP [5], a linear-scaling DFT code, and apply it to a range of systems. This method within a linear-scaling DFT code allows the simulation of weakly bound systems of larger scale, such as organic/inorganic interfaces, biological systems and implicit solvation models. [1] P. Silvestrelli, JPC A 113, 5224 (2009). [2] L. Andrinopoulos et al, JCP 135, 154105 (2011). [3] M. Dion et al, PRL 92, 246401 (2004). [4] G. Román-Pérez, J.M. Soler, PRL 103, 096102 (2009). [5] C. Skylaris et al, JCP 122, 084119 (2005).
System-wide lean implementation in health care: A multiple case study.
Centauri, Federica; Mazzocato, Pamela; Villa, Stefano; Marsilio, Marta
2018-05-01
Background: Lean practices have been widely used by health care organizations to meet efficiency, performance and quality improvement needs. The lean health care literature shows that effective implementation of lean requires a holistic, system-wide approach. However, there is still limited evidence on what drives effective system-wide lean implementation in health care. The existing literature suggests that a deeper understanding of how lean interventions interact with the organizational context is necessary to identify the critical variables needed to successfully sustain system-wide lean strategies. Purpose and methodology: A multiple case study of three Italian hospitals is conducted with the aim of exploring the organizational conditions that are relevant for effective system-wide lean implementation. A conceptual framework, built on socio-technical system schemas, is used to guide data collection and analysis. The analysis points out the importance of supporting lean implementation with an integrated and coordinated strategy involving the social, technical, and external components of the overall hospital system.
Lundin, Johan; Dumont, Guy
2017-01-01
Current advances within medical technology show great potential from a global health perspective. Inexpensive, effective solutions to common problems within diagnostics, medical procedures and access to medical information are emerging within almost all fields of medicine. The innovations can benefit health care both in resource-limited and in resource-rich settings. However, there is a big gap between the proof-of-concept stage and implementation. This article will give examples of promising solutions, with special focus on mobile image- and sensor-based diagnostics. We also discuss how technology and frugal innovations could be made sustainable and widely available. Finally, a list of critical factors for success is presented, based on both our own experiences and the literature. PMID:28838308
A universal preconditioner for simulating condensed phase materials.
Packwood, David; Kermode, James; Mones, Letif; Bernstein, Noam; Woolley, John; Gould, Nicholas; Ortner, Christoph; Csányi, Gábor
2016-04-28
We introduce a universal sparse preconditioner that accelerates geometry optimisation and saddle point search tasks that are common in the atomic scale simulation of materials. Our preconditioner is based on the neighbourhood structure and we demonstrate the gain in computational efficiency in a wide range of materials that include metals, insulators, and molecular solids. The simple structure of the preconditioner means that the gains can be realised in practice not only when using expensive electronic structure models but also for fast empirical potentials. Even for relatively small systems of a few hundred atoms, we observe speedups of a factor of two or more, and the gain grows with system size. An open source Python implementation within the Atomic Simulation Environment is available, offering interfaces to a wide range of atomistic codes.
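The open-source implementation mentioned above ships with ASE's optimizer module. A minimal usage sketch follows; it assumes a recent ASE build in which the preconditioned optimizers live under ase.optimize.precon, and it uses the bundled EMT empirical potential so the example is self-contained.

```python
# Usage sketch of a preconditioned geometry optimisation in ASE;
# assumes ASE with the ase.optimize.precon module available.
from ase.build import bulk
from ase.calculators.emt import EMT
from ase.optimize.precon import PreconLBFGS, Exp

atoms = bulk("Cu", cubic=True).repeat((3, 3, 3))
atoms.rattle(0.05, seed=1)          # perturb so there is something to relax
atoms.calc = EMT()                  # fast empirical potential

# Exp builds the sparse preconditioner from the neighbourhood
# structure; A controls how quickly coefficients decay with distance.
opt = PreconLBFGS(atoms, precon=Exp(A=3.0))
opt.run(fmax=1e-3)
```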
Kumar, Rajiv B; Goren, Nira D; Stark, David E; Wall, Dennis P; Longhurst, Christopher A
2016-01-01
The diabetes healthcare provider plays a key role in interpreting blood glucose trends, but few institutions have successfully integrated patient home glucose data in the electronic health record (EHR). Published implementations to date have required custom interfaces, which limit wide-scale replication. We piloted automated integration of continuous glucose monitor data in the EHR using widely available consumer technology for 10 pediatric patients with insulin-dependent diabetes. Establishment of a passive data communication bridge via a patient’s/parent’s smartphone enabled automated integration and analytics of patient device data within the EHR between scheduled clinic visits. It is feasible to utilize available consumer technology to assess and triage home diabetes device data within the EHR, and to engage patients/parents and improve healthcare provider workflow. PMID:27018263
A universal preconditioner for simulating condensed phase materials
NASA Astrophysics Data System (ADS)
Packwood, David; Kermode, James; Mones, Letif; Bernstein, Noam; Woolley, John; Gould, Nicholas; Ortner, Christoph; Csányi, Gábor
2016-04-01
We introduce a universal sparse preconditioner that accelerates geometry optimisation and saddle point search tasks that are common in the atomic scale simulation of materials. Our preconditioner is based on the neighbourhood structure and we demonstrate the gain in computational efficiency in a wide range of materials that include metals, insulators, and molecular solids. The simple structure of the preconditioner means that the gains can be realised in practice not only when using expensive electronic structure models but also for fast empirical potentials. Even for relatively small systems of a few hundred atoms, we observe speedups of a factor of two or more, and the gain grows with system size. An open source Python implementation within the Atomic Simulation Environment is available, offering interfaces to a wide range of atomistic codes.
Cáceres, Carlos F; Borquez, Annick; Klausner, Jeffrey D; Baggaley, Rachel; Beyrer, Chris
2016-01-01
In this article, we present recent evidence from studies focused on the implementation, effectiveness and cost-effectiveness of pre-exposure prophylaxis (PrEP) for HIV infection; discuss PrEP scale-up to date, including the observed levels of access and policy development; and elaborate on key emerging policy and research issues to consider for further scale-up, with a special focus on lower-middle income countries. The 2015 WHO Early Release Guidelines for HIV Treatment and Prevention reflect both scientific evidence and new policy perspectives. Those guidelines present a timely challenge to health systems for the scaling up of not only treatment for every person living with HIV infection but also the offer of PrEP to those at substantial risk. Delivery and uptake of both universal antiretroviral therapy (ART) and PrEP will require nation-wide commitment and could reinvigorate health systems to develop more comprehensive "combination prevention" programmes and support wider testing linked to both treatments and other prevention options for populations at highest risk who are currently not accessing services. Various gaps in current health systems will need to be addressed to achieve strategic scale-up of PrEP, including developing prioritization strategies, strengthening drug regulations, determining cost and funding sources, training health providers, supporting user adherence and creating demand. The initial steps in the scale-up of PrEP globally suggest feasibility, acceptability and likely impact. However, to prevent setbacks in less well-resourced settings, countries will need to anticipate and address challenges such as operational and health systems barriers, drug cost and regulatory policies, health providers' openness to prescribing PrEP to populations at substantial risk, demand and legal and human rights issues. Emerging problems will require creative solutions and will continue to illustrate the complexity of PrEP implementation.
Cáceres, Carlos F; Borquez, Annick; Klausner, Jeffrey D; Baggaley, Rachel; Beyrer, Chris
2016-01-01
Background In this article, we present recent evidence from studies focused on the implementation, effectiveness and cost-effectiveness of pre-exposure prophylaxis (PrEP) for HIV infection; discuss PrEP scale-up to date, including the observed levels of access and policy development; and elaborate on key emerging policy and research issues to consider for further scale-up, with a special focus on lower-middle income countries. Discussion The 2015 WHO Early Release Guidelines for HIV Treatment and Prevention reflect both scientific evidence and new policy perspectives. Those guidelines present a timely challenge to health systems for the scaling up of not only treatment for every person living with HIV infection but also the offer of PrEP to those at substantial risk. Delivery and uptake of both universal antiretroviral therapy (ART) and PrEP will require nation-wide commitment and could reinvigorate health systems to develop more comprehensive “combination prevention” programmes and support wider testing linked to both treatments and other prevention options for populations at highest risk who are currently not accessing services. Various gaps in current health systems will need to be addressed to achieve strategic scale-up of PrEP, including developing prioritization strategies, strengthening drug regulations, determining cost and funding sources, training health providers, supporting user adherence and creating demand. Conclusions The initial steps in the scale-up of PrEP globally suggest feasibility, acceptability and likely impact. However, to prevent setbacks in less well-resourced settings, countries will need to anticipate and address challenges such as operational and health systems barriers, drug cost and regulatory policies, health providers’ openness to prescribing PrEP to populations at substantial risk, demand and legal and human rights issues. Emerging problems will require creative solutions and will continue to illustrate the complexity of PrEP implementation. PMID:27760685
Cook, Diane J.; Crandall, Aaron S.; Thomas, Brian L.; Krishnan, Narayanan C.
2013-01-01
While the potential benefits of smart home technology are widely recognized, a lightweight design is needed for the benefits to be realized at a large scale. We introduce the CASAS “smart home in a box”, a lightweight smart home design that is easy to install and provides smart home capabilities out of the box with no customization or training. We discuss types of data analysis that have been performed by the CASAS group and can be pursued in the future by using this approach to designing and implementing smart home technologies. PMID:24415794
Cook, Diane J; Crandall, Aaron S; Thomas, Brian L; Krishnan, Narayanan C
2013-07-01
While the potential benefits of smart home technology are widely recognized, a lightweight design is needed for the benefits to be realized at a large scale. We introduce the CASAS "smart home in a box", a lightweight smart home design that is easy to install and provides smart home capabilities out of the box with no customization or training. We discuss types of data analysis that have been performed by the CASAS group and can be pursued in the future by using this approach to designing and implementing smart home technologies.
Note: Design of FPGA based system identification module with application to atomic force microscopy
NASA Astrophysics Data System (ADS)
Ghosal, Sayan; Pradhan, Sourav; Salapaka, Murti
2018-05-01
The science of system identification is widely utilized in modeling input-output relationships of diverse systems. In this article, we report a field programmable gate array (FPGA) based implementation of a real-time system identification algorithm which employs forgetting factors and bias compensation techniques. The FPGA module is employed to estimate the mechanical properties of material surfaces at the nano-scale with an atomic force microscope (AFM). The FPGA module is user friendly and can be interfaced with commercially available AFMs. Extensive simulation and experimental results validate the design.
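The abstract names the algorithm family (recursive identification with forgetting factors plus bias compensation) without detail. The following is a minimal numpy sketch of plain recursive least squares with a forgetting factor, the core of such a module; the bias-compensation step and all FPGA specifics are omitted, and the simulated first-order model is purely illustrative.

```python
# Recursive least squares with a forgetting factor (generic sketch;
# the paper's bias-compensation technique is not reproduced here).
import numpy as np

def rls_update(theta, P, phi, y, lam=0.98):
    """One RLS step: theta = parameter estimate, P = covariance,
    phi = regressor vector, y = new measurement, lam = forgetting
    factor (lam < 1 discounts old data, enabling tracking)."""
    phi = phi.reshape(-1, 1)
    k = P @ phi / (lam + phi.T @ P @ phi)          # gain vector
    theta = theta + (k * (y - phi.T @ theta)).ravel()
    P = (P - k @ phi.T @ P) / lam
    return theta, P

# Identify y[n] = a*y[n-1] + b*u[n-1] from simulated noisy data.
rng = np.random.default_rng(0)
a_true, b_true = 0.8, 0.5
u = rng.standard_normal(500)
y = np.zeros(500)
for n in range(1, 500):
    y[n] = a_true * y[n-1] + b_true * u[n-1] + 0.01 * rng.standard_normal()

theta, P = np.zeros(2), 1e3 * np.eye(2)
for n in range(1, 500):
    theta, P = rls_update(theta, P, np.array([y[n-1], u[n-1]]), y[n])
print(theta)   # approaches [0.8, 0.5]
```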
Multi-scale Modeling in Clinical Oncology: Opportunities and Barriers to Success.
Yankeelov, Thomas E; An, Gary; Saut, Oliver; Luebeck, E Georg; Popel, Aleksander S; Ribba, Benjamin; Vicini, Paolo; Zhou, Xiaobo; Weis, Jared A; Ye, Kaiming; Genin, Guy M
2016-09-01
Hierarchical processes spanning several orders of magnitude of both space and time underlie nearly all cancers. Multi-scale statistical, mathematical, and computational modeling methods are central to designing, implementing and assessing treatment strategies that account for these hierarchies. The basic science underlying these modeling efforts is maturing into a new discipline that is close to influencing and facilitating clinical successes. The purpose of this review is to capture the state-of-the-art as well as the key barriers to success for multi-scale modeling in clinical oncology. We begin with a summary of the long-envisioned promise of multi-scale modeling in clinical oncology, including the synthesis of disparate data types into models that reveal underlying mechanisms and allow for experimental testing of hypotheses. We then evaluate the mathematical techniques employed most widely and present several examples illustrating their application as well as the current gap between pre-clinical and clinical applications. We conclude with a discussion of what we view to be the key challenges and opportunities for multi-scale modeling in clinical oncology.
Multi-scale Modeling in Clinical Oncology: Opportunities and Barriers to Success
Yankeelov, Thomas E.; An, Gary; Saut, Oliver; Luebeck, E. Georg; Popel, Aleksander S.; Ribba, Benjamin; Vicini, Paolo; Zhou, Xiaobo; Weis, Jared A.; Ye, Kaiming; Genin, Guy M.
2016-01-01
Hierarchical processes spanning several orders of magnitude of both space and time underlie nearly all cancers. Multi-scale statistical, mathematical, and computational modeling methods are central to designing, implementing and assessing treatment strategies that account for these hierarchies. The basic science underlying these modeling efforts is maturing into a new discipline that is close to influencing and facilitating clinical successes. The purpose of this review is to capture the state-of-the-art as well as the key barriers to success for multi-scale modeling in clinical oncology. We begin with a summary of the long-envisioned promise of multi-scale modeling in clinical oncology, including the synthesis of disparate data types into models that reveal underlying mechanisms and allow for experimental testing of hypotheses. We then evaluate the mathematical techniques employed most widely and present several examples illustrating their application as well as the current gap between pre-clinical and clinical applications. We conclude with a discussion of what we view to be the key challenges and opportunities for multi-scale modeling in clinical oncology. PMID:27384942
Yang, Lin; Zhang, Feng; Wang, Cai-Zhuang; ...
2018-01-12
We present an implementation of EAM and FS interatomic potentials, which are widely used in simulating metallic systems, in HOOMD-blue, a software designed to perform classical molecular dynamics simulations using GPU accelerations. We first discuss the details of our implementation and then report extensive benchmark tests. We demonstrate that single-precision floating point operations efficiently implemented on GPUs can produce sufficient accuracy when compared against double-precision codes, as demonstrated in test simulations of calculations of the glass-transition temperature of Cu64.5Zr35.5, and pair correlation function of liquid Ni3Al. Our code scales well with the size of the simulated system on NVIDIA Tesla M40 and P100 GPUs. Compared with another popular software LAMMPS running on 32 cores of AMD Opteron 6220 processors, the GPU/CPU performance ratio can reach as high as 4.6. The source code can be accessed through the HOOMD-blue web page for free by any interested user.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Lin; Zhang, Feng; Wang, Cai-Zhuang
We present an implementation of EAM and FS interatomic potentials, which are widely used in simulating metallic systems, in HOOMD-blue, a software designed to perform classical molecular dynamics simulations using GPU accelerations. We first discuss the details of our implementation and then report extensive benchmark tests. We demonstrate that single-precision floating point operations efficiently implemented on GPUs can produce sufficient accuracy when compared against double-precision codes, as demonstrated in test simulations of calculations of the glass-transition temperature of Cu64.5Zr35.5, and pair correlation function of liquid Ni3Al. Our code scales well with the size of the simulated system on NVIDIA Tesla M40 and P100 GPUs. Compared with another popular software LAMMPS running on 32 cores of AMD Opteron 6220 processors, the GPU/CPU performance ratio can reach as high as 4.6. The source code can be accessed through the HOOMD-blue web page for free by any interested user.
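For readers unfamiliar with the potential forms being ported, the sketch below evaluates the EAM/FS energy, E_i = F(rho_i) + (1/2) sum_j phi(r_ij), in plain Python with toy functional forms. It is not the HOOMD-blue implementation; real potentials use tabulated F, rho and phi functions fit to specific metals.

```python
# Plain-Python illustration of the EAM/FS energy form, with toy
# functional forms standing in for tabulated potentials.
import numpy as np

def eam_energy(positions, cutoff=3.0):
    """Total energy: embedding term F(rho_i) plus pairwise phi(r_ij)."""
    n = len(positions)
    rho = np.zeros(n)               # host electron density at each atom
    pair = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            r = np.linalg.norm(positions[i] - positions[j])
            if r < cutoff:
                rho_ij = np.exp(-r)         # toy density contribution
                rho[i] += rho_ij
                rho[j] += rho_ij
                pair += (1.0 / r) ** 6      # toy repulsive pair term
    embed = -np.sqrt(rho).sum()             # F(rho) = -sqrt(rho), FS-style
    return embed + pair

pos = np.array([[0.0, 0.0, 0.0], [2.0, 0.0, 0.0], [0.0, 2.0, 0.0]])
print(eam_energy(pos))
```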
Bin recycling strategy for improving the histogram precision on GPU
NASA Astrophysics Data System (ADS)
Cárdenas-Montes, Miguel; Rodríguez-Vázquez, Juan José; Vega-Rodríguez, Miguel A.
2016-07-01
Histograms are an easily comprehensible way to present data and analyses. In the current scientific context, with access to large volumes of data, the processing time for building histograms has dramatically increased. For this reason, parallel construction is necessary to alleviate the impact of the processing time on analysis activities. In this scenario, GPU computing is becoming widely used for reducing the processing time of histogram construction to affordable levels. In addition to processing time, implementations are also stressed with respect to bin-count accuracy. Accuracy issues arising from the particularities of the implementations are not usually taken into consideration when building histograms with very large data sets. In this work, a bin recycling strategy to create an accuracy-aware implementation for building histograms on GPU is presented. In order to evaluate the approach, this strategy was applied to the computation of the three-point angular correlation function, a relevant function in Cosmology for the study of the Large Scale Structure of the Universe. As a consequence of this study, a high-accuracy implementation for histogram construction on GPU is proposed.
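The accuracy concern is easy to reproduce. The snippet below is not the paper's bin-recycling strategy; it only demonstrates the underlying pitfall that motivates it: a single-precision bin counter silently stops incrementing once it reaches 2^24, whereas an integer accumulator stays exact.

```python
# Demonstration of the bin-count accuracy pitfall in single precision.
import numpy as np

bin_f32 = np.float32(2**24)         # a very hot histogram bin
for _ in range(10):
    bin_f32 += np.float32(1.0)      # 2**24 + 1 rounds back to 2**24
print(int(bin_f32))                 # 16777216, not 16777226

bin_i64 = np.int64(2**24)           # integer accumulation stays exact
bin_i64 += 10
print(int(bin_i64))                 # 16777226
```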
GPU-accelerated Tersoff potentials for massively parallel Molecular Dynamics simulations
NASA Astrophysics Data System (ADS)
Nguyen, Trung Dac
2017-03-01
The Tersoff potential is one of the empirical many-body potentials that has been widely used in simulation studies at atomic scales. Unlike pair-wise potentials, the Tersoff potential involves three-body terms, which require much more arithmetic operations and data dependency. In this contribution, we have implemented the GPU-accelerated version of several variants of the Tersoff potential for LAMMPS, an open-source massively parallel Molecular Dynamics code. Compared to the existing MPI implementation in LAMMPS, the GPU implementation exhibits a better scalability and offers a speedup of 2.2X when run on 1000 compute nodes on the Titan supercomputer. On a single node, the speedup ranges from 2.0 to 8.0 times, depending on the number of atoms per GPU and hardware configurations. The most notable features of our GPU-accelerated version include its design for MPI/accelerator heterogeneous parallelism, its compatibility with other functionalities in LAMMPS, its ability to give deterministic results and to support both NVIDIA CUDA- and OpenCL-enabled accelerators. Our implementation is now part of the GPU package in LAMMPS and accessible for public use.
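Since the implementation is distributed as part of the LAMMPS GPU package, enabling it amounts to standard LAMMPS commands. The sketch below drives LAMMPS through its Python wrapper; it assumes a LAMMPS build with the GPU package and Python module installed, and the data and potential file names (data.si, Si.tersoff) are placeholders.

```python
# Hedged usage sketch: selecting the GPU-accelerated Tersoff style via
# the standard package/suffix mechanism through the LAMMPS Python API.
from lammps import lammps

lmp = lammps()
for cmd in [
    "package gpu 1",            # request 1 GPU per node
    "suffix gpu",               # transparently select */gpu styles
    "newton off",               # GPU pair styles typically need this
    "units metal",
    "boundary p p p",
    "read_data data.si",        # placeholder silicon data file
    "pair_style tersoff",       # becomes tersoff/gpu via the suffix
    "pair_coeff * * Si.tersoff Si",
    "timestep 0.001",
    "run 100",
]:
    lmp.command(cmd)
```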
Improving the Efficiency of Free Energy Calculations in the Amber Molecular Dynamics Package.
Kaus, Joseph W; Pierce, Levi T; Walker, Ross C; McCammon, J Andrew
2013-09-10
Alchemical transformations are widely used methods to calculate free energies. Amber has traditionally included support for alchemical transformations as part of the sander molecular dynamics (MD) engine. Here we describe the implementation of a more efficient approach to alchemical transformations in the Amber MD package. Specifically, we have implemented this new approach within the more computationally efficient and scalable pmemd MD engine that is included with the Amber MD package. The majority of the gain in efficiency comes from the improved design of the calculation, which includes better parallel scaling and a reduction in the calculation of redundant terms. This new implementation is able to reproduce results from equivalent simulations run with the existing functionality, but at 2.5 times greater computational efficiency. This new implementation is also able to run softcore simulations at the λ end states, making direct calculation of free energies more accurate compared to the extrapolation required in the existing implementation. The updated alchemical transformation functionality will be included in the next major release of Amber (scheduled for release in Q1 2014) and will be available at http://ambermd.org, under the Amber license.
Improving the Efficiency of Free Energy Calculations in the Amber Molecular Dynamics Package
Pierce, Levi T.; Walker, Ross C.; McCammon, J. Andrew
2013-01-01
Alchemical transformations are widely used methods to calculate free energies. Amber has traditionally included support for alchemical transformations as part of the sander molecular dynamics (MD) engine. Here we describe the implementation of a more efficient approach to alchemical transformations in the Amber MD package. Specifically, we have implemented this new approach within the more computationally efficient and scalable pmemd MD engine that is included with the Amber MD package. The majority of the gain in efficiency comes from the improved design of the calculation, which includes better parallel scaling and a reduction in the calculation of redundant terms. This new implementation is able to reproduce results from equivalent simulations run with the existing functionality, but at 2.5 times greater computational efficiency. This new implementation is also able to run softcore simulations at the λ end states, making direct calculation of free energies more accurate compared to the extrapolation required in the existing implementation. The updated alchemical transformation functionality will be included in the next major release of Amber (scheduled for release in Q1 2014) and will be available at http://ambermd.org, under the Amber license. PMID:24185531
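As a reminder of what such alchemical simulations feed into, the snippet below shows generic thermodynamic-integration post-processing: integrating averaged dU/dλ values over λ windows to obtain a free energy difference. The numbers are made up and this is not Amber's own analysis code; running true end-state (softcore) simulations, as the new pmemd implementation allows, avoids extrapolating the end-point averages.

```python
# Generic thermodynamic-integration post-processing sketch with
# fabricated <dU/dlambda> values for illustration only.
import numpy as np

lams = np.array([0.0, 0.25, 0.5, 0.75, 1.0])    # lambda windows
dudl = np.array([12.4, 8.1, 5.3, 2.2, -0.7])    # <dU/dlambda>, kcal/mol

dG = np.trapz(dudl, lams)           # trapezoidal quadrature over lambda
print(f"dG = {dG:.2f} kcal/mol")
```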
Monitoring and validation of decentralised water and wastewater systems for increased uptake.
Sharma, A K; Cook, S; Chong, M N
2013-01-01
Decentralised water and wastewater systems are being implemented to meet growing demand for municipal services, either in combination with centralised systems or as standalone systems. In Australia, there has been increased investment in decentralised water and wastewater systems in response to the capacity constraints of existing centralised systems, an extended period of below-average rainfall, uncertainty in traditional water sources due to potential climate change impacts, and the need to reduce the environmental impact of urban development. The implementation of decentralised water systems as a mainstream practice at different development scales is impeded by knowledge gaps concerning their actual performance in a range of development types and settings. As the widespread uptake of these approaches in modern cities is relatively new compared to centralised approaches, there is limited information available on their planning, design, implementation, reliability and robustness. This paper presents a number of case studies where monitoring studies are under way to validate the performance of decentralised water and wastewater systems. The results from these case studies show the yield and reliability of these decentralised systems, as well as the associated energy demand and ecological footprint. The outputs from these case studies, and other monitoring studies, are important in improving decentralised system design guidelines and developing industry-wide management norms for the operation and maintenance of decentralised systems.
Turning Access into a web-enabled secure information system for clinical trials.
Chen, Dongquan; Chen, Wei-Bang; Soong, Mayhue; Soong, Seng-Jaw; Orthner, Helmuth F
2009-08-01
Organizations that have limited resources need to conduct clinical studies in a cost-effective, but secure way. Clinical data residing in various individual databases need to be easily accessed and secured. Although widely available, digital certification, encryption, and secure web servers have not been implemented as widely, partly due to a lack of understanding of needs and concerns over issues such as cost and difficulty of implementation. The objective of this study was to test the possibility of centralizing various databases and to demonstrate ways of offering an alternative to a large-scale, comprehensive, and costly commercial product, especially for simple phase I and II trials, with reasonable convenience and security. We report a working procedure to transform a standalone Access database into a secure web-based information system. For data collection and reporting purposes, we centralized several individual databases and developed and tested a web-based secure server using self-issued digital certificates. The system lacks audit trails, and the cost of development and maintenance may hinder its wide application. The clinical trial databases scattered in various departments of an institution could be centralized into a web-enabled secure information system. Limitations such as the lack of a calendar and audit trail can be partially addressed with additional programming. The centralized web system may provide an alternative to a comprehensive clinical trial management system.
Fragment assignment in the cloud with eXpress-D
2013-01-01
Background Probabilistic assignment of ambiguously mapped fragments produced by high-throughput sequencing experiments has been demonstrated to greatly improve accuracy in the analysis of RNA-Seq and ChIP-Seq, and is an essential step in many other sequence census experiments. A maximum likelihood method using the expectation-maximization (EM) algorithm for optimization is commonly used to solve this problem. However, batch EM-based approaches do not scale well with the size of sequencing datasets, which have been increasing dramatically over the past few years. Thus, current approaches to fragment assignment rely on heuristics or approximations for tractability. Results We present an implementation of a distributed EM solution to the fragment assignment problem using Spark, a data analytics framework that can scale by leveraging compute clusters within datacenters ("the cloud"). We demonstrate that our implementation easily scales to billions of sequenced fragments, while providing the exact maximum likelihood assignment of ambiguous fragments. The accuracy of the method is shown to be an improvement over the most widely used tools available and it can be run in a constant amount of time when cluster resources are scaled linearly with the amount of input data. Conclusions The cloud offers one solution for the difficulties faced in the analysis of massive high-throughput sequencing data, which continue to grow rapidly. Researchers in bioinformatics must follow developments in distributed systems, such as new frameworks like Spark, for ways to port existing methods to the cloud and help them scale to the datasets of the future. Our software, eXpress-D, is freely available at: http://github.com/adarob/express-d. PMID:24314033
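The fragment assignment problem itself has a compact EM formulation, which is what eXpress-D distributes over Spark. The serial toy sketch below shows the E-step/M-step structure on a small fragment-to-transcript compatibility matrix; it ignores transcript lengths, alignment scores, and all of the distribution logic.

```python
# Toy EM iteration for probabilistic fragment assignment (serial;
# illustrative of the model, not of eXpress-D's Spark implementation).
import numpy as np

# compat[f, t] = 1 if fragment f aligns to transcript t (toy data).
compat = np.array([[1, 1, 0],
                   [1, 0, 1],
                   [0, 1, 1],
                   [1, 1, 1]], dtype=float)

abund = np.full(3, 1.0 / 3)                 # initial transcript abundances
for _ in range(50):
    # E-step: responsibility of each transcript for each fragment.
    w = compat * abund
    w /= w.sum(axis=1, keepdims=True)
    # M-step: abundances proportional to expected assigned fragments.
    abund = w.sum(axis=0) / w.shape[0]
print(abund.round(3))
```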
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chiang, Nai-Yuan; Zavala, Victor M.
We present a filter line-search algorithm that does not require inertia information of the linear system. This feature enables the use of a wide range of linear algebra strategies and libraries, which is essential to tackle large-scale problems on modern computing architectures. The proposed approach performs curvature tests along the search step to detect negative curvature and to trigger convexification. We prove that the approach is globally convergent and we implement the approach within a parallel interior-point framework to solve large-scale and highly nonlinear problems. Our numerical tests demonstrate that the inertia-free approach is as efficient as inertia detection via symmetric indefinite factorizations. We also demonstrate that the inertia-free approach can lead to reductions in solution time because it reduces the amount of convexification needed.
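The curvature test at the heart of the inertia-free strategy can be stated in a few lines. The sketch below is a schematic of the idea under assumed threshold values, not the paper's algorithm: if the directional curvature along the candidate step falls below a small multiple of the step's norm, a diagonal regularization is added and the step would be recomputed.

```python
# Schematic curvature test with illustrative thresholds: convexify by
# adding a multiple of the identity when curvature along d is too small,
# avoiding any inertia computation on the linear system.
import numpy as np

def convexify_if_needed(H, d, theta=1e-8, delta=1e-4):
    """Return a (possibly regularized) Hessian for the step d."""
    curvature = d @ H @ d
    if curvature < theta * (d @ d):         # curvature test failed
        H = H + delta * np.eye(H.shape[0])  # convexify, then re-solve
    return H

H = np.array([[1.0, 0.0], [0.0, -2.0]])     # indefinite Hessian
d = np.array([0.1, 1.0])                    # step with negative curvature
print(convexify_if_needed(H, d))
```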
Spatial strategies for managing visitor impacts in National Parks
Leung, Y.-F.; Marion, J.L.
1999-01-01
Resource and social impacts caused by recreationists and tourists have become a management concern in national parks and equivalent protected areas. The need to contain visitor impacts within acceptable limits has prompted park and protected area managers to implement a wide variety of strategies and actions, many of which are spatial in nature. This paper classifies and illustrates the basic spatial strategies for managing visitor impacts in parks and protected areas. A typology of four spatial strategies was proposed based on the recreation and park management literature. Spatial segregation is a common strategy for shielding sensitive resources from visitor impacts or for separating potentially conflicting types of use. Two forms of spatial segregation are zoning and closure. A spatial containment strategy is intended to minimize the aggregate extent of visitor impacts by confining use to limited designated or established locations. In contrast, a spatial dispersal strategy seeks to spread visitor use, reducing the frequency of use to levels that avoid or minimize permanent resource impacts or visitor crowding and conflict. Finally, a spatial configuration strategy minimizes impacting visitor behavior through the judicious spatial arrangement of facilities. These four spatial strategies can be implemented separately or in combination at varying spatial scales within a single park. A survey of national park managers provides an empirical example of the diversity of implemented spatial strategies in managing visitor impacts. Spatial segregation is frequently applied in the form of camping restrictions or closures to protect sensitive natural or cultural resources and to separate incompatible visitor activities. Spatial containment is the most widely applied strategy for minimizing the areal extent of resource impacts. Spatial dispersal is commonly applied to reduce visitor crowding or conflicts in popular destination areas but is less frequently applied or effective in minimizing resource impacts. Spatial configuration was only minimally evaluated, as it was not included in the survey. The proposed typology of spatial strategies offers a useful means of organizing and understanding the wide variety of management strategies and actions applied in managing visitor impacts in parks and protected areas. Examples from U.S. national parks demonstrate the diversity of these basic strategies and their flexibility in implementation at various spatial scales. Documentation of these examples helps illustrate their application and inform managers of the multitude of options. Further analysis from the spatial perspective is needed to extend the applicability of this typology to other recreational activities and management issues.
Parallel Computing Strategies for Irregular Algorithms
NASA Technical Reports Server (NTRS)
Biswas, Rupak; Oliker, Leonid; Shan, Hongzhang; Biegel, Bryan (Technical Monitor)
2002-01-01
Parallel computing promises several orders of magnitude increase in our ability to solve realistic computationally-intensive problems, but relies on their efficient mapping and execution on large-scale multiprocessor architectures. Unfortunately, many important applications are irregular and dynamic in nature, making their effective parallel implementation a daunting task. Moreover, with the proliferation of parallel architectures and programming paradigms, the typical scientist is faced with a plethora of questions that must be answered in order to obtain an acceptable parallel implementation of the solution algorithm. In this paper, we consider three representative irregular applications: unstructured remeshing, sparse matrix computations, and N-body problems, and parallelize them using various popular programming paradigms on a wide spectrum of computer platforms ranging from state-of-the-art supercomputers to PC clusters. We present the underlying problems, the solution algorithms, and the parallel implementation strategies. Smart load-balancing, partitioning, and ordering techniques are used to enhance parallel performance. Overall results demonstrate the complexity of efficiently parallelizing irregular algorithms.
MPI implementation of PHOENICS: A general purpose computational fluid dynamics code
NASA Astrophysics Data System (ADS)
Simunovic, S.; Zacharia, T.; Baltas, N.; Spalding, D. B.
1995-03-01
PHOENICS is a suite of computational analysis programs that are used for simulation of fluid flow, heat transfer, and dynamical reaction processes. The parallel version of the solver EARTH for the Computational Fluid Dynamics (CFD) program PHOENICS has been implemented using the Message Passing Interface (MPI) standard. Implementation of the MPI version of PHOENICS makes this computational tool portable to a wide range of parallel machines and enables the use of high performance computing for large scale computational simulations. MPI libraries are available on several parallel architectures making the program usable across different architectures as well as on heterogeneous computer networks. The Intel Paragon NX and MPI versions of the program have been developed and tested on massively parallel supercomputers Intel Paragon XP/S 5, XP/S 35, and Kendall Square Research, and on the multiprocessor SGI Onyx computer at Oak Ridge National Laboratory. The preliminary testing results of the developed program have shown scalable performance for reasonably sized computational domains.
MPI implementation of PHOENICS: A general purpose computational fluid dynamics code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simunovic, S.; Zacharia, T.; Baltas, N.
1995-04-01
PHOENICS is a suite of computational analysis programs that are used for simulation of fluid flow, heat transfer, and dynamical reaction processes. The parallel version of the solver EARTH for the Computational Fluid Dynamics (CFD) program PHOENICS has been implemented using the Message Passing Interface (MPI) standard. Implementation of the MPI version of PHOENICS makes this computational tool portable to a wide range of parallel machines and enables the use of high performance computing for large scale computational simulations. MPI libraries are available on several parallel architectures making the program usable across different architectures as well as on heterogeneous computer networks. The Intel Paragon NX and MPI versions of the program have been developed and tested on massively parallel supercomputers Intel Paragon XP/S 5, XP/S 35, and Kendall Square Research, and on the multiprocessor SGI Onyx computer at Oak Ridge National Laboratory. The preliminary testing results of the developed program have shown scalable performance for reasonably sized computational domains.
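The portability argument rests on MPI's standard message-passing primitives. The sketch below shows, via the mpi4py bindings rather than PHOENICS's own code, the halo-exchange pattern a domain-decomposed CFD solver performs each iteration; the 1-D strip decomposition is deliberately simplified.

```python
# Minimal mpi4py halo exchange between neighbouring 1-D subdomains.
# Run with, e.g.: mpiexec -n 4 python halo.py
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

# Each rank owns a strip of cells plus one ghost cell on each side.
local = np.full(10, float(rank))
left = rank - 1 if rank > 0 else MPI.PROC_NULL
right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

# Exchange boundary values with both neighbours (Sendrecv avoids
# deadlock; PROC_NULL makes edge ranks no-ops).
recv_l = np.full(1, np.nan)
recv_r = np.full(1, np.nan)
comm.Sendrecv(sendbuf=local[1:2], dest=left, recvbuf=recv_r, source=right)
comm.Sendrecv(sendbuf=local[-2:-1], dest=right, recvbuf=recv_l, source=left)
print(f"rank {rank}: left ghost {recv_l[0]}, right ghost {recv_r[0]}")
```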
e-Justice Implementation at a National Scale: The Ugandan Case
NASA Astrophysics Data System (ADS)
Kitoogo, Fredrick Edward; Bitwayiki, Constantine
The use of information and communications technologies has been identified as one of the means suitable for supplementing the various reforms aimed at improving the performance of the justice sector. The Government of Uganda has made strides in the implementation of e-Government to effectively utilize information and communications technologies in governance. The justice players are organized in a justice, law and order sector which is based on the Sector Wide Approach, whose basic principle is that communication, cooperation and coordination between institutions can greatly add value to service delivery within a sector. Although a subset of e-Government, e-Justice aims at improving service delivery and collaboration between all justice players through the use of ICTs and needs to be spearheaded at a sector level. This work proposes ways of harnessing the existing opportunities and methods to implement e-Justice in Uganda, culminating in a generic framework that can be applied in similar countries.
Optimizing the Performance of Reactive Molecular Dynamics Simulations for Multi-core Architectures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aktulga, Hasan Metin; Coffman, Paul; Shan, Tzu-Ray
2015-12-01
Hybrid parallelism allows high performance computing applications to better leverage the increasing on-node parallelism of modern supercomputers. In this paper, we present a hybrid parallel implementation of the widely used LAMMPS/ReaxC package, where the construction of bonded and nonbonded lists and evaluation of complex ReaxFF interactions are implemented efficiently using OpenMP parallelism. Additionally, the performance of the QEq charge equilibration scheme is examined and a dual-solver is implemented. We present the performance of the resulting ReaxC-OMP package on Mira, a state-of-the-art multi-core IBM BlueGene/Q supercomputer. For system sizes ranging from 32 thousand to 16.6 million particles, speedups in the range of 1.5-4.5x are observed using the new ReaxC-OMP software. Sustained performance improvements have been observed for up to 262,144 cores (1,048,576 processes) of Mira with a weak scaling efficiency of 91.5% in larger simulations containing 16.6 million particles.
Teen Pregnancy Prevention: Implementation of a Multicomponent, Community-Wide Approach.
Mueller, Trisha; Tevendale, Heather D; Fuller, Taleria R; House, L Duane; Romero, Lisa M; Brittain, Anna; Varanasi, Bala
2017-03-01
This article provides an overview and description of implementation activities of the multicomponent, community-wide initiatives of the Teenage Pregnancy Prevention Program initiated in 2010 by the Office of Adolescent Health and the Centers for Disease Control and Prevention. The community-wide initiatives applied the Interactive Systems Framework for dissemination and implementation through training and technical assistance on the key elements of the initiative: implementation of evidence-based teen pregnancy prevention (TPP) interventions; enhancing quality of and access to youth-friendly reproductive health services; educating stakeholders about TPP; working with youth in communities most at risk of teen pregnancy; and mobilizing the community to garner support. Of nearly 12,000 hours of training and technical assistance provided, the majority was for selecting, implementing, and evaluating an evidence-based TPP program. Real-world implementation of a community-wide approach to TPP takes time and effort. This report describes implementation within each of the components and shares lessons learned during planning and implementation phases of the initiative. Copyright © 2016 Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.
Technology achievements and projections for communication satellites of the future
NASA Technical Reports Server (NTRS)
Bagwell, J. W.
1986-01-01
Multibeam systems of the future, using monolithic microwave integrated circuits to provide phase control and power gain, are contrasted with discrete microwave power amplifiers from 10 to 75 W and their associated waveguide feeds, phase shifters and power splitters. Challenging new enabling technology areas include advanced electrooptical control and signal feeds. Large-scale MMICs will be used, incorporating on-chip control interfaces, latching, and phase and amplitude control, with power levels of a few watts each. Beam-forming algorithms for 80 to 90 deg. wide-angle scanning and precise beam forming under wide-ranging environments will be required. Satellite systems using these dynamically reconfigured multibeam antenna systems will demand greater degrees of beam interconnectivity. Multiband and multiservice users will be interconnected through the same space platform. Monolithic switching arrays operating over a wide range of RF and IF frequencies are contrasted with current IF switch technology implemented discretely. Size, weight, and performance improvements by an order of magnitude are projected.
NASA Astrophysics Data System (ADS)
Zhang, Yang; Liu, Wei; Li, Xiaodong; Yang, Fan; Gao, Peng; Jia, Zhenyuan
2015-10-01
Large-scale triangulation scanning measurement systems are widely used to measure the three-dimensional profile of large-scale components and parts. The accuracy and speed of laser stripe center extraction are essential for guaranteeing the accuracy and efficiency of the measuring system. However, in the process of large-scale measurement, multiple factors can cause deviation of the laser stripe center, including the spatial light intensity distribution, material reflectivity characteristics, and spatial transmission characteristics. A center extraction method is proposed that improves the accuracy of laser stripe center extraction based on image evaluation of Gaussian-fitting structural similarity and analysis of the multiple source factors. First, based on the features of the gray-level distribution of the laser stripe, a Gaussian-fitting structural-similarity score is computed to provide a threshold value for center compensation. Then, using the relationships between the gray-level distribution of the laser stripe and the multiple source factors, a compensation method for center extraction is presented. Finally, measurement experiments on a large-scale aviation composite component are carried out. The experimental results for this specific implementation verify the feasibility of the proposed center extraction method and the improved accuracy for large-scale triangulation scanning measurements.
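The Gaussian-fitting step this method builds on can be sketched in a few lines (a minimal illustration of sub-pixel center extraction, with an R²-style score standing in for the structural-similarity threshold; the synthetic profile and function names are assumptions, not the authors' compensation algorithm):

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, a, mu, sigma, b):
    return a * np.exp(-((x - mu) ** 2) / (2 * sigma**2)) + b

def stripe_center(profile):
    """Fit a Gaussian to one cross-section of the laser stripe; return the
    sub-pixel center (mu) and a goodness-of-fit score usable as a threshold."""
    x = np.arange(profile.size, dtype=float)
    p0 = [profile.max() - profile.min(), float(np.argmax(profile)), 2.0, profile.min()]
    popt, _ = curve_fit(gaussian, x, profile, p0=p0, maxfev=2000)
    fit = gaussian(x, *popt)
    score = 1.0 - np.sum((profile - fit) ** 2) / np.sum((profile - profile.mean()) ** 2)
    return popt[1], score

# Synthetic cross-section with the true center at 12.3 pixels plus noise.
x = np.arange(25, dtype=float)
rng = np.random.default_rng(0)
center, score = stripe_center(gaussian(x, 200.0, 12.3, 2.5, 10.0) + rng.normal(0.0, 1.0, x.size))
```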
Framework and implementation of a continuous network-wide health monitoring system for roadways
NASA Astrophysics Data System (ADS)
Wang, Ming; Birken, Ralf; Shahini Shamsabadi, Salar
2014-03-01
According to the 2013 ASCE report card, America's infrastructure scores only a D+. There are more than four million miles of roads (grade D) in the U.S. requiring a broad range of maintenance activities. The nation faces a monumental problem of infrastructure management in the scheduling and implementation of maintenance and repair operations, and in the prioritization of expenditures within budgetary constraints. The efficient and effective performance of these operations, however, is crucial to ensuring roadway safety, preventing catastrophic failures, and promoting economic growth. There is a critical need for technology that can cost-effectively monitor the condition of a network-wide road system and provide accurate, up-to-date information for maintenance activity prioritization. The Versatile Onboard Traffic Embedded Roaming Sensors (VOTERS) project provides a framework and the sensing capability to complement periodic localized inspections with continuous network-wide health monitoring. Research focused on the development of a cost-effective, lightweight package of multi-modal sensor systems compatible with this framework. An innovative software infrastructure is created that collects, processes, and evaluates these large time-lapse multi-modal data streams. A GIS-based control center manages multiple inspection vehicles and the data for further analysis, visualization, and decision making. VOTERS' technology can monitor road conditions at both the surface and sub-surface levels while the vehicle is navigating through daily traffic going about its normal business, thereby allowing for network-wide frequent assessment of roadways. This deterioration process monitoring at unprecedented time and spatial scales provides unique experimental data that can be used to improve life-cycle cost analysis models.
NASA Astrophysics Data System (ADS)
Versini, Pierre-Antoine; Tchiguirinskaia, Ioulia; Schertzer, Daniel
2016-04-01
Concentrating buildings and socio-economic activities, urban areas are particularly vulnerable to hydrological risks. Climate change may intensify existing issues concerning stormwater management (due to impervious surfaces) and water supply (due to population growth). In this context, water use efficiency and best water management practices are key issues in an urban environment that is already stressed. Blue and green infrastructures are nature-based solutions that combine water (blue) and vegetation (green) systems to deliver multifunctional solutions and multiple benefits: increased amenity, urban heat island mitigation, biodiversity, reduced energy requirements, and more. They are particularly effective at reducing the potential impact of new and existing developments with respect to stormwater and/or water supply issues. The Multi-Hydro distributed rainfall-runoff model is a well-adapted tool to manage the impacts of such infrastructures at the urban basin scale. It is a numerical platform that makes several models interact, each of them representing a specific portion of the water cycle in an urban environment: surface runoff and infiltration depending on a land use classification, sub-surface processes, and sewer network drainage. Multi-Hydro is still being developed at the Ecole des Ponts (open access from https://hmco.enpc.fr/Tools-Training/Tools/Multi-Hydro.php) to take into account the wide complexity of urban environments. The latest advancements have made possible the representation of several blue and green infrastructures (green roof, basin, swale). Applied to a new urban development project located in the Paris region, Multi-Hydro has been used to simulate the impact of implementing blue and green infrastructures. The study focused in particular on their ability to satisfy the regulation rules established by local stormwater managers for connecting a parcel to the sewer network. The results show that a combination of several blue and green infrastructures, if widely implemented, could represent an efficient tool to meet regulation rules at the parcel scale.
Advancing translational research with the Semantic Web.
Ruttenberg, Alan; Clark, Tim; Bug, William; Samwald, Matthias; Bodenreider, Olivier; Chen, Helen; Doherty, Donald; Forsberg, Kerstin; Gao, Yong; Kashyap, Vipul; Kinoshita, June; Luciano, Joanne; Marshall, M Scott; Ogbuji, Chimezie; Rees, Jonathan; Stephens, Susie; Wong, Gwendolyn T; Wu, Elizabeth; Zaccagnini, Davide; Hongsermeier, Tonya; Neumann, Eric; Herman, Ivan; Cheung, Kei-Hoi
2007-05-09
A fundamental goal of the U.S. National Institutes of Health (NIH) "Roadmap" is to strengthen Translational Research, defined as the movement of discoveries in basic research to application at the clinical level. A significant barrier to translational research is the lack of uniformly structured data across related biomedical domains. The Semantic Web is an extension of the current Web that enables navigation and meaningful use of digital resources by automatic processes. It is based on common formats that support aggregation and integration of data drawn from diverse sources. A variety of technologies have been built on this foundation that, together, support identifying, representing, and reasoning across a wide range of biomedical data. The Semantic Web Health Care and Life Sciences Interest Group (HCLSIG), set up within the framework of the World Wide Web Consortium, was launched to explore the application of these technologies in a variety of areas. Subgroups focus on making biomedical data available in RDF, working with biomedical ontologies, prototyping clinical decision support systems, working on drug safety and efficacy communication, and supporting disease researchers navigating and annotating the large amount of potentially relevant literature. We present a scenario that shows the value of the information environment the Semantic Web can support for aiding neuroscience researchers. We then report on several projects by members of the HCLSIG, in the process illustrating the range of Semantic Web technologies that have applications in areas of biomedicine. Semantic Web technologies present both promise and challenges. Current tools and standards are already adequate to implement components of the bench-to-bedside vision. On the other hand, these technologies are young. Gaps in standards and implementations still exist and adoption is limited by typical problems with early technology, such as the need for a critical mass of practitioners and installed base, and growing pains as the technology is scaled up. Still, the potential of interoperable knowledge sources for biomedicine, at the scale of the World Wide Web, merits continued work.
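To make the RDF aggregation idea concrete, here is a minimal sketch using the open-source Python rdflib library; the namespace, triples, and query are hypothetical stand-ins for the curated ontologies the HCLSIG subgroups actually work with:

```python
from rdflib import Graph, Namespace

EX = Namespace("http://example.org/neuro#")  # hypothetical vocabulary

g = Graph()  # in practice, g.parse(...) would merge data from diverse sources
g.add((EX.pyramidal_neuron, EX.locatedIn, EX.CA1))
g.add((EX.geneX, EX.expressedIn, EX.pyramidal_neuron))

# The payoff the abstract describes: one SPARQL query spanning merged sources.
q = """
PREFIX ex: <http://example.org/neuro#>
SELECT ?gene WHERE {
    ?gene ex:expressedIn ?cell .
    ?cell ex:locatedIn ex:CA1 .
}
"""
for row in g.query(q):
    print(row.gene)  # -> http://example.org/neuro#geneX
```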
Advancing translational research with the Semantic Web
Ruttenberg, Alan; Clark, Tim; Bug, William; Samwald, Matthias; Bodenreider, Olivier; Chen, Helen; Doherty, Donald; Forsberg, Kerstin; Gao, Yong; Kashyap, Vipul; Kinoshita, June; Luciano, Joanne; Marshall, M Scott; Ogbuji, Chimezie; Rees, Jonathan; Stephens, Susie; Wong, Gwendolyn T; Wu, Elizabeth; Zaccagnini, Davide; Hongsermeier, Tonya; Neumann, Eric; Herman, Ivan; Cheung, Kei-Hoi
2007-01-01
Background A fundamental goal of the U.S. National Institutes of Health (NIH) "Roadmap" is to strengthen Translational Research, defined as the movement of discoveries in basic research to application at the clinical level. A significant barrier to translational research is the lack of uniformly structured data across related biomedical domains. The Semantic Web is an extension of the current Web that enables navigation and meaningful use of digital resources by automatic processes. It is based on common formats that support aggregation and integration of data drawn from diverse sources. A variety of technologies have been built on this foundation that, together, support identifying, representing, and reasoning across a wide range of biomedical data. The Semantic Web Health Care and Life Sciences Interest Group (HCLSIG), set up within the framework of the World Wide Web Consortium, was launched to explore the application of these technologies in a variety of areas. Subgroups focus on making biomedical data available in RDF, working with biomedical ontologies, prototyping clinical decision support systems, working on drug safety and efficacy communication, and supporting disease researchers navigating and annotating the large amount of potentially relevant literature. Results We present a scenario that shows the value of the information environment the Semantic Web can support for aiding neuroscience researchers. We then report on several projects by members of the HCLSIG, in the process illustrating the range of Semantic Web technologies that have applications in areas of biomedicine. Conclusion Semantic Web technologies present both promise and challenges. Current tools and standards are already adequate to implement components of the bench-to-bedside vision. On the other hand, these technologies are young. Gaps in standards and implementations still exist and adoption is limited by typical problems with early technology, such as the need for a critical mass of practitioners and installed base, and growing pains as the technology is scaled up. Still, the potential of interoperable knowledge sources for biomedicine, at the scale of the World Wide Web, merits continued work. PMID:17493285
Muilenburg, Jessica L; Laschober, Tanja C; Eby, Lillian T
2015-09-01
Adolescence is a prime developmental stage for early tobacco cessation (TC) intervention. This study examined substance use disorder counselors' reports of the availability and implementation of TC services (behavioral treatments and pharmacotherapies) in their treatment programs and the relationship between their tobacco-related knowledge and implementation of TC services. Survey data were collected in 2012 from 63 counselors working in 22 adolescent-only treatment programs. Measures included 15 TC behavioral treatments, nine TC pharmacotherapies, and three tobacco-related knowledge scales (morbidity/mortality, modalities and effectiveness, pharmacology). First, nine of the 15 behavioral treatments are reported as available by more than half of counselors; four of the 15 behavioral treatments are used by counselors with more than half of adolescents. Of the nine pharmacotherapies, availability of the nicotine patch is reported by almost 40%, bupropion by nearly 30%, and clonidine by about 21% of counselors. Pharmacotherapies are used by counselors with very few adolescents. Second, counselors' tobacco-related knowledge varies based on the knowledge scale examined. Third, we find a significant positive relationship only between counselors' implementation of TC behavioral treatments and their knowledge of TC modalities and effectiveness. Findings suggest that more behavioral treatments should be made available in substance use disorder treatment programs, considering that they are the main treatment recommendation for adolescents. Counselors should be encouraged to routinely use a wide range of available behavioral treatments. Finally, counselors should be encouraged to expand their knowledge of TC modalities and effectiveness because of the relationship with behavioral treatments implementation. Copyright © 2015 Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.
Satellite power system (SPS) financial/management scenarios
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1978-10-01
The problems of financing and managing a large-scale, lengthy SPS program reduce to the key questions of ownership and control. Ownership (that is, the sources of capital) may be governmental, corporate, or individual; control may be exercised by a government agency, a government-sanctioned monopoly, or a competitive corporation. Since the R and D phase and the commercial implementation phase of an SPS program are qualitatively very different with respect to length of time before return-on-investment, we have considered two general categories of SPS organizations: (1) organizations capable of carrying out a complete SPS program, from R and D through commercialization; (2) organizations capable of carrying out commercial implementation only. Six organizational models for carrying out the complete SPS program have been examined in some detail: 1) existing government agencies (DOE, NASA, etc.); 2) a new government agency, patterned after TVA; 3) a taxpayer stock corporation, a new concept; 4) a trust fund supported by energy taxes, patterned after the financing of the Interstate Highway System; 5) a federal agency financed by bonds, patterned after the Federal National Mortgage Association; and 6) the staging company, a new concept, already in the early stages of implementation as a private venture. Four additional organizational forms have been considered for commercial implementation of SPS: 7) a government-chartered monopoly, patterned after the Communications Satellite Corporation; 8) the consortium model, already widely used for large-scale projects; 9) the corporate socialism model, patterned after such developments as the transcontinental railroad; and 10) the universal capitalism model, a concept partially implemented in the 1976 legislation creating Employee Stock Ownership Plans. A number of qualitative criteria for comparative assessment of these alternatives have been developed.
Hochgesang, Mindy; Zamudio-Haas, Sophia; Moran, Lissa; Nhampossa, Leopoldo; Packel, Laura; Leslie, Hannah; Richards, Janise; Shade, Starley B
2017-01-01
The rapid scale-up of HIV care and treatment in resource-limited countries requires concurrent, rapid development of health information systems to support quality service delivery. Mozambique, a country with an 11.5% prevalence of HIV, has developed nation-wide patient monitoring systems (PMS) with standardized reporting tools, utilized by all HIV treatment providers in paper or electronic form. Evaluation of the initial implementation of PMS can inform and strengthen future development as the country moves towards a harmonized, sustainable health information system. This assessment was conducted in order to 1) characterize data collection and reporting processes and PMS resources available and 2) provide evidence-based recommendations for harmonization and sustainability of PMS. This baseline assessment of PMS was conducted with eight non-governmental organizations that supported the Ministry of Health to provide 90% of HIV care and treatment in Mozambique. The study team conducted structured and semi-structured surveys at 18 health facilities located in all 11 provinces. Seventy-nine staff were interviewed. Deductive a priori analytic categories guided analysis. Health facilities have implemented paper and electronic monitoring systems with varying success. Where in use, robust electronic PMS facilitate facility-level reporting of required indicators; improve ability to identify patients lost to follow-up; and support facility and patient management. Challenges to implementation of monitoring systems include a lack of national guidelines and norms for patient level HIS, variable system implementation and functionality, and limited human and infrastructure resources to maximize system functionality and information use. This initial assessment supports the need for national guidelines to harmonize, expand, and strengthen HIV-related health information systems. Recommendations may benefit other countries with similar epidemiologic and resource-constrained environments seeking to improve PMS implementation. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Thermal nanostructure: An order parameter multiscale ensemble approach
NASA Astrophysics Data System (ADS)
Cheluvaraja, S.; Ortoleva, P.
2010-02-01
Deductive all-atom multiscale techniques imply that many nanosystems can be understood in terms of the slow dynamics of order parameters that coevolve with the quasiequilibrium probability density for rapidly fluctuating atomic configurations. The result of this multiscale analysis is a set of stochastic equations for the order parameters whose dynamics is driven by thermal-average forces. We present an efficient algorithm for sampling atomistic configurations in viruses and other supramillion atom nanosystems. This algorithm allows for sampling of a wide range of configurations without creating an excess of high-energy, improbable ones. It is implemented and used to calculate thermal-average forces. These forces are then used to search the free-energy landscape of a nanosystem for deep minima. The methodology is applied to thermal structures of Cowpea chlorotic mottle virus capsid. The method has wide applicability to other nanosystems whose properties are described by the CHARMM or other interatomic force field. Our implementation, denoted SIMNANOWORLD™, achieves calibration-free nanosystem modeling. Essential atomic-scale detail is preserved via a quasiequilibrium probability density while overall character is provided via predicted values of order parameters. Applications from virology to the computer-aided design of nanocapsules for delivery of therapeutic agents and of vaccines for nonenveloped viruses are envisioned.
Erskine, Jonathan; Hunter, David J; Small, Adrian; Hicks, Chris; McGovern, Tom; Lugsden, Ed; Whitty, Paula; Steen, Nick; Eccles, Martin Paul
2013-02-01
The research project 'An Evaluation of Transformational Change in NHS North East' examines the progress and success of National Health Service (NHS) organisations in north east England in implementing and embedding the North East Transformation System (NETS), a region-wide programme to improve healthcare quality and safety, and to reduce waste, using a combination of Vision, Compact, and Lean-based Method. This paper concentrates on findings concerning the role of leadership in enabling transformational change, based on semi-structured interviews with a mix of senior NHS managers and quality improvement staff in 14 study sites. Most interviewees felt that implementing the NETS requires committed, stable leadership, attention to team-building across disciplines and leadership development at many levels. We conclude that without senior leader commitment to continuous improvement over a long time scale and serious efforts to distribute leadership tasks to all levels, healthcare organisations are less likely to achieve positive changes in managerial-clinical relations, sustainable improvements to organisational culture and, ultimately, the region-wide step change in quality, safety and efficiency that the NETS was designed to deliver. © The Author(s) 2013 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
Predictors of Sustained Implementation of School-Wide Positive Behavioral Interventions and Supports
ERIC Educational Resources Information Center
McIntosh, Kent; Mercer, Sterett H.; Nese, Rhonda N. T.; Strickland-Cohen, M. Kathleen; Hoselton, Robert
2016-01-01
In this analysis of extant data from 3,011 schools implementing school-wide positive behavioral interventions and supports (SWPBIS) across multiple years, we assessed the predictive power of various school characteristics and speed of initial implementation on sustained fidelity of implementation of SWPBIS at 1, 3, and 5 years. In addition, we…
Ocean Observatories Initiative (OOI): Status of Design, Capabilities, and Implementation
NASA Astrophysics Data System (ADS)
Brasseur, L. H.; Banahan, S.; Cowles, T.
2009-05-01
The National Science Foundation's (NSF) Ocean Observatories Initiative (OOI) will implement the construction and operation of an interactive, integrated ocean observing network. This research-driven, multi-scale network will provide the broad ocean science community with access to advanced technology to enable studies of fundamental ocean processes. The OOI will afford observations at coastal, regional, and global scales on timeframes of milliseconds to decades in support of investigations into climate variability, ocean ecosystems, biogeochemical processes, coastal ocean dynamics, circulation and mixing dynamics, fluid-rock interactions, and the sub-seafloor biosphere. The elements of the OOI include arrays of fixed and re-locatable moorings, autonomous underwater vehicles, and cabled seafloor nodes. All assets combined, the OOI network will provide data from over 45 distinct types of sensors, comprising over 800 total sensors distributed in the Pacific and Atlantic oceans. These core sensors for the OOI were determined through a formal process of science requirements development. This core sensor array will be integrated through a system-wide cyberinfrastructure allowing for remote control of instruments, adaptive sampling, and near-real time access to data. Implementation of the network will stimulate new avenues of research and the development of new infrastructure, instrumentation, and sensor technologies. The OOI is funded by the NSF and managed by the Consortium for Ocean Leadership which focuses on the science, technology, education, and outreach for an emerging network of ocean observing systems.
Cruvinel, Erica; Richter, Kimber P; Bastos, Ronaldo Rocha; Ronzani, Telmo Mota
2013-02-11
Numerous studies have demonstrated that positive organizational climates contribute to better work performance. Screening and brief intervention (SBI) for alcohol, tobacco, and other drug use has the potential to reach a broad population of hazardous drug users but has not yet been widely adopted in Brazil's health care system. We surveyed 149 primary health care professionals in 30 clinics in Brazil who were trained to conduct SBI among their patients. We prospectively measured how often they delivered SBI to evaluate the association between organizational climate and adoption/performance of SBI. Organizational climate was measured by the 2009 Organizational Climate Scale for Health Organizations, a scale validated in Brazil that assesses leadership, professional development, team spirit, relationship with the community, safety, strategy, and remuneration. Performance of SBI was measured prospectively by weekly assessments during the three months following training. We also assessed self-reported SBI and self-efficacy for performing SBI at three months post-training. We used inferential statistics to depict and test for the significance of associations. Teams with better organizational climates implemented SBI more frequently. Organizational climate factors most closely associated with SBI implementation included professional development and relationship with the community. The dimensions of leadership and remuneration were also significantly associated with SBI. Organizational climate may influence implementation of SBI and ultimately may affect the ability of organizations to identify and address drug use.
Schürer, S; Schellberg, D; Schmidt, J; Kallinowski, F; Mehrabi, A; Herfarth, Ch; Büchler, M W; Kadmon, M
2006-04-01
The medical faculty of Heidelberg University implemented a new problem-based clinical curriculum (Heidelberg Curriculum Medicinale, or Heicumed) in 2001. The present study analyses the evaluation data of two student cohorts prior to the introduction of Heicumed. Its aim was to specify problems of the traditional training and to draw conclusions for the implementation of a new curriculum. The evaluation instrument was the Heidelberg Inventory for the Evaluation of Teaching (HILVE-I). The data were analysed by calculating differences in the means between defined groups, with the 13 primary scales of the HILVE-I instrument as dependent variables. Teaching method and subject had no systematic influence on evaluation results. Thus, the didactic lecture in orthopedic surgery achieved better results than small-group tutorials, while the data on vascular and general surgery showed the opposite. Major factors for success were continuity and didactic training of lecturers and tutors. This is convincingly reflected by the results of the lecture course "Differential diagnosis in general surgery". The good evaluation data on small-group tutorials resulted largely from the "participation" and "discussion" scales, which represent interactivity in learning. The results of the present study suggest the importance of two major pedagogic ideas: continuity and didactic training of lecturers and tutors. These principles were widely implemented in Heicumed and have contributed to the success of the new curriculum.
SKIRT: Hybrid parallelization of radiative transfer simulations
NASA Astrophysics Data System (ADS)
Verstocken, S.; Van De Putte, D.; Camps, P.; Baes, M.
2017-07-01
We describe the design, implementation and performance of the new hybrid parallelization scheme in our Monte Carlo radiative transfer code SKIRT, which has been used extensively for modelling the continuum radiation of dusty astrophysical systems including late-type galaxies and dusty tori. The hybrid scheme combines distributed memory parallelization, using the standard Message Passing Interface (MPI) to communicate between processes, and shared memory parallelization, providing multiple execution threads within each process to avoid duplication of data structures. The synchronization between multiple threads is accomplished through atomic operations without high-level locking (also called lock-free programming). This improves the scaling behaviour of the code and substantially simplifies the implementation of the hybrid scheme. The result is an extremely flexible solution that adjusts to the number of available nodes, processors and memory, and consequently performs well on a wide variety of computing architectures.
The NatCarb geoportal: Linking distributed data from the Carbon Sequestration Regional Partnerships
Carr, T.R.; Rich, P.M.; Bartley, J.D.
2007-01-01
The Department of Energy (DOE) Carbon Sequestration Regional Partnerships are generating the data for a "carbon atlas" of key geospatial data (carbon sources, potential sinks, etc.) required for rapid implementation of carbon sequestration on a broad scale. The NATional CARBon Sequestration Database and Geographic Information System (NatCarb) provides Web-based, nation-wide data access. Distributed computing solutions link partnerships and other publicly accessible repositories of geological, geophysical, natural resource, infrastructure, and environmental data. Data are maintained and enhanced locally, but assembled and accessed through a single geoportal. NatCarb, as a first attempt at a national carbon cyberinfrastructure (NCCI), assembles the data required to address technical and policy challenges of carbon capture and storage. We present a path forward to design and implement a comprehensive and successful NCCI. © 2007 The Haworth Press, Inc. All rights reserved.
Gaussian Accelerated Molecular Dynamics: Theory, Implementation, and Applications
Miao, Yinglong; McCammon, J. Andrew
2018-01-01
A novel Gaussian Accelerated Molecular Dynamics (GaMD) method has been developed for simultaneous unconstrained enhanced sampling and free energy calculation of biomolecules. Without the need to set predefined reaction coordinates, GaMD enables unconstrained enhanced sampling of the biomolecules. Furthermore, by constructing a boost potential that follows a Gaussian distribution, accurate reweighting of GaMD simulations is achieved via cumulant expansion to the second order. The free energy profiles obtained from GaMD simulations allow us to identify distinct low energy states of the biomolecules and characterize biomolecular structural dynamics quantitatively. In this chapter, we present the theory of GaMD, its implementation in the widely used molecular dynamics software packages (AMBER and NAMD), and applications to the alanine dipeptide biomolecular model system, protein folding, biomolecular large-scale conformational transitions and biomolecular recognition. PMID:29720925
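For reference, the central construction is compact enough to state here (reproduced, with hedging, from the GaMD literature; k is the harmonic force constant and E the threshold energy):

```latex
\Delta V(\vec{r}) =
\begin{cases}
  \tfrac{1}{2}\,k\,\bigl(E - V(\vec{r})\bigr)^{2}, & V(\vec{r}) < E,\\
  0, & V(\vec{r}) \ge E .
\end{cases}
```

Because the boost ΔV follows a near-Gaussian distribution, the exponential reweighting factor is recovered accurately from a cumulant expansion truncated at second order:

```latex
\ln\bigl\langle e^{\beta \Delta V}\bigr\rangle \approx \beta\,C_{1} + \tfrac{\beta^{2}}{2}\,C_{2},
\qquad
C_{1} = \langle \Delta V \rangle, \quad
C_{2} = \langle \Delta V^{2} \rangle - \langle \Delta V \rangle^{2}.
```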
A universal preconditioner for simulating condensed phase materials
DOE Office of Scientific and Technical Information (OSTI.GOV)
Packwood, David; Ortner, Christoph, E-mail: c.ortner@warwick.ac.uk; Kermode, James, E-mail: j.r.kermode@warwick.ac.uk
2016-04-28
We introduce a universal sparse preconditioner that accelerates geometry optimisation and saddle point search tasks that are common in the atomic scale simulation of materials. Our preconditioner is based on the neighbourhood structure and we demonstrate the gain in computational efficiency in a wide range of materials that include metals, insulators, and molecular solids. The simple structure of the preconditioner means that the gains can be realised in practice not only when using expensive electronic structure models but also for fast empirical potentials. Even for relatively small systems of a few hundred atoms, we observe speedups of a factor of two or more, and the gain grows with system size. An open source Python implementation within the Atomic Simulation Environment is available, offering interfaces to a wide range of atomistic codes.
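A hedged sketch of the idea (the published preconditioner weights off-diagonal entries by an exponential of interatomic distance; the uniform weights, cutoff, and stabilisation constant below are simplifying assumptions):

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.spatial import cKDTree

def neighbourhood_preconditioner(positions, r_cut=3.0, mu=1.0, c_stab=0.1):
    """Sparse, graph-Laplacian-like preconditioner built from the neighbour
    structure: P = mu * L + c_stab * I. Solving P s = -g rescales descent
    steps by local connectivity, damping stiff short-wavelength modes."""
    n = len(positions)
    rows, cols, vals = [], [], []
    diag = np.full(n, c_stab)
    for i, j in cKDTree(positions).query_pairs(r_cut):
        rows += [i, j]
        cols += [j, i]
        vals += [-mu, -mu]
        diag[i] += mu
        diag[j] += mu
    rows.extend(range(n)); cols.extend(range(n)); vals.extend(diag)
    return csr_matrix((vals, (rows, cols)), shape=(n, n))

# Usage sketch: precondition a raw gradient g with a sparse solve.
# from scipy.sparse.linalg import spsolve
# step = spsolve(neighbourhood_preconditioner(positions), -g)
```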
Lee, Cameron M.; Engelbrecht, Christoph J.; Soper, Timothy D.; Helmchen, Fritjof; Seibel, Eric J.
2011-01-01
In modern endoscopy, wide field of view and full color are considered necessary for navigating inside the body, inspecting tissue for disease and guiding interventions such as biopsy or surgery. Current flexible endoscope technologies suffer from reduced resolution when device diameter shrinks. Endoscopic procedures today using coherent fiber bundle technology, on the scale of 1 mm, are performed with such poor image quality that the clinician’s vision meets the criteria for legal blindness. Here, we review a new and versatile scanning fiber imaging technology and describe its implementation for ultrathin and flexible endoscopy. This scanning fiber endoscope (SFE) or catheterscope enables high quality, laser-based, video imaging for ultrathin clinical applications while also providing new options for in vivo biological research of subsurface tissue and high resolution fluorescence imaging. PMID:20336702
NASA Astrophysics Data System (ADS)
Harvey, J. W.; Packman, A. I.
2010-12-01
Surface water and groundwater flow interact with the channel geomorphology and sediments in ways that determine how material is transported, stored, and transformed in stream corridors. Solute and sediment transport affect important ecological processes such as carbon and nutrient dynamics and stream metabolism, processes that are fundamental to stream health and function. Many individual mechanisms of transport and storage of solute and sediment have been studied, including surface water exchange between the main channel and side pools, hyporheic flow through shallow and deep subsurface flow paths, and sediment transport during both baseflow and floods. A significant challenge arises from non-linear and scale-dependent transport resulting from natural, fractal fluvial topography and associated broad, multi-scale hydrologic interactions. Connections between processes and linkages across scales are not well understood, imposing significant limitations on system predictability. The whole-stream tracer experimental approach is popular because of the spatial averaging of heterogeneous processes; however the tracer results, implemented alone and analyzed using typical models, cannot usually predict transport beyond the very specific conditions of the experiment. Furthermore, the results of whole stream tracer experiments tend to be biased due to unavoidable limitations associated with sampling frequency, measurement sensitivity, and experiment duration. We recommend that whole-stream tracer additions be augmented with hydraulic and topographic measurements and also with additional tracer measurements made directly in storage zones. We present examples of measurements that encompass interactions across spatial and temporal scales and models that are transferable to a wide range of flow and geomorphic conditions. These results show how the competitive effects between the different forces driving hyporheic flow, operating at different spatial scales, creates a situation where hyporheic fluxes cannot be accurately estimated without considering multi-scale effects. Our modeling captures the dominance of small-scale features such as bedforms that drive the majority of hyporheic flow, but it also captures how hyporheic flow is substantially modified by relatively small changes in streamflow or groundwater flow. The additional field measurements add sensitivity and power to whole stream tracer additions by improving resolution of the relative importance of storage at different scales (e.g. bar-scale versus bedform-scale). This information is critical in identifying hot spots where important biogeochemical reactions occur. In summary, interpreting multi-scale interactions in streams requires models that are physically based and that incorporate non-linear process dynamics. Such models can take advantage of increasingly comprehensive field data to integrate transport processes across spatially variable flow and geomorphic conditions. The most useful field and modeling approaches will be those that are simple enough to be easily implemented by users from various disciplines but comprehensive enough to produce meaningful predictions for a wide range of flow and geomorphic scenarios. This capability is needed to support improved strategies for protecting stream ecological health in the face of accelerating land use and climate change.
Dudley, Dawn M.; Chin, Emily N.; Bimber, Benjamin N.; Sanabani, Sabri S.; Tarosso, Leandro F.; Costa, Priscilla R.; Sauer, Mariana M.; Kallas, Esper G.; O.’Connor, David H.
2012-01-01
Background Great efforts have been made to increase accessibility of HIV antiretroviral therapy (ART) in low and middle-income countries. The threat of wide-scale emergence of drug resistance could severely hamper ART scale-up efforts. Population-based surveillance of transmitted HIV drug resistance ensures the use of appropriate first-line regimens to maximize efficacy of ART programs where drug options are limited. However, traditional HIV genotyping is extremely expensive, providing a cost barrier to wide-scale and frequent HIV drug resistance surveillance. Methods/Results We have developed a low-cost laboratory-scale next-generation sequencing-based genotyping method to monitor drug resistance. We designed primers specifically to amplify protease and reverse transcriptase from Brazilian HIV subtypes and developed a multiplexing scheme using multiplex identifier tags to minimize cost while providing more robust data than traditional genotyping techniques. Using this approach, we characterized drug resistance from plasma in 81 HIV infected individuals collected in São Paulo, Brazil. We describe the complexities of analyzing next-generation sequencing data and present a simplified open-source workflow to analyze drug resistance data. From this data, we identified drug resistance mutations in 20% of treatment naïve individuals in our cohort, which is similar to frequencies identified using traditional genotyping in Brazilian patient samples. Conclusion The developed ultra-wide sequencing approach described here allows multiplexing of at least 48 patient samples per sequencing run, 4 times more than the current genotyping method. This method is also 4-fold more sensitive (5% minimal detection frequency vs. 20%) at a cost 3–5× less than the traditional Sanger-based genotyping method. Lastly, by using a benchtop next-generation sequencer (Roche/454 GS Junior), this approach can be more easily implemented in low-resource settings. This data provides proof-of-concept that next-generation HIV drug resistance genotyping is a feasible and low-cost alternative to current genotyping methods and may be particularly beneficial for in-country surveillance of transmitted drug resistance. PMID:22574170
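As a hedged illustration of the multiplexing scheme (the tag sequences, tag length, and helper names are hypothetical, not the study's primers or MIDs; the 5% floor matches the minimal detection frequency cited above):

```python
from collections import Counter, defaultdict

# Hypothetical 10-bp multiplex identifier (MID) tags mapped to patients.
MID_TAGS = {"ACGAGTGCGT": "patient_01", "ACGCTCGACA": "patient_02"}

def demultiplex(reads):
    """Assign each read to a patient by its leading MID tag; reads with
    unrecognised tags are simply dropped in this sketch."""
    by_patient = defaultdict(list)
    for read in reads:
        patient = MID_TAGS.get(read[:10])
        if patient is not None:
            by_patient[patient].append(read[10:])
    return by_patient

def variant_frequencies(observed_codons, min_freq=0.05):
    """Report codon variants at or above the 5% minimal detection floor."""
    counts = Counter(observed_codons)
    total = sum(counts.values())
    return {c: n / total for c, n in counts.items() if n / total >= min_freq}

reads = ["ACGAGTGCGT" + "ATGAAA", "ACGCTCGACA" + "ATGGAA"]
print(demultiplex(reads))
print(variant_frequencies(["ATG", "ATG", "GTG"]))  # both pass the 5% floor
```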
Energy-Efficient Wide Datapath Integer Arithmetic Logic Units Using Superconductor Logic
NASA Astrophysics Data System (ADS)
Ayala, Christopher Lawrence
Complementary Metal-Oxide-Semiconductor (CMOS) technology is currently the most widely used integrated circuit technology. As CMOS approaches the physical limitations of scaling, it is unclear whether or not it can provide long-term support for niche areas such as high-performance computing and telecommunication infrastructure, particularly with the emergence of cloud computing. Alternatively, superconductor technologies based on Josephson junction (JJ) switching elements, such as Rapid Single Flux Quantum (RSFQ) logic and especially its new variant, Energy-Efficient Rapid Single Flux Quantum (ERSFQ) logic, have the capability to provide an ultra-high-speed, low power platform for digital systems. The objective of this research is to design and evaluate energy-efficient, high-speed 32-bit integer Arithmetic Logic Units (ALUs) implemented using RSFQ and ERSFQ logic as the first steps towards achieving practical Very-Large-Scale-Integration (VLSI) complexity in digital superconductor electronics. First, a tunable VHDL superconductor cell library is created to provide a mechanism to conduct design exploration and evaluation of superconductor digital circuits from the perspectives of functionality, complexity, performance, and energy-efficiency. Second, hybrid wave-pipelining techniques developed earlier for wide datapath RSFQ designs have been used for efficient arithmetic and logic circuit implementations. To develop the core foundation of the ALU, the ripple-carry adder and the Kogge-Stone parallel prefix carry look-ahead adder are studied as representative candidates on opposite ends of the design spectrum. By combining the high-performance features of the Kogge-Stone structure and the low complexity of the ripple-carry adder, a 32-bit asynchronous wave-pipelined hybrid sparse-tree ALU has been designed and evaluated using the VHDL cell library tuned to HYPRES' gate-level characteristics. The designs and techniques from this research have been implemented using RSFQ logic and prototype chips have been fabricated. As a joint work with HYPRES, a 20 GHz 8-bit Kogge-Stone ALU consisting of 7,950 JJs total has been fabricated using a 1.5 μm 4.5 kA/cm² process and fully demonstrated. An 8-bit sparse-tree ALU (8,832 JJs total) and a 16-bit sparse-tree adder (12,785 JJs total) have also been fabricated using a 1.0 μm 10 kA/cm² process and demonstrated under collaboration with Yokohama National University and Nagoya University (Japan).
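The parallel-prefix structure at the heart of the Kogge-Stone design is easy to state in software (a bit-level sketch of the algorithm only, not of the JJ-level or wave-pipelined implementation):

```python
def kogge_stone_add(a: int, b: int, width: int = 8) -> int:
    """Add two unsigned integers using Kogge-Stone parallel-prefix carries:
    generate/propagate pairs are combined in about log2(width) stages, each
    of which could run fully in parallel in hardware."""
    g = [((a >> i) & 1) & ((b >> i) & 1) for i in range(width)]  # generate
    p = [((a >> i) & 1) ^ ((b >> i) & 1) for i in range(width)]  # propagate
    d = 1
    while d < width:
        # Each stage doubles the span of the group generate/propagate terms.
        g = [g[i] | (p[i] & g[i - d]) if i >= d else g[i] for i in range(width)]
        p = [p[i] & p[i - d] if i >= d else p[i] for i in range(width)]
        d *= 2
    s = 0
    for i in range(width):
        carry_in = g[i - 1] if i > 0 else 0  # prefix generate of the lower bits
        s |= ((((a >> i) & 1) ^ ((b >> i) & 1)) ^ carry_in) << i
    return s

assert kogge_stone_add(173, 99) == (173 + 99) % 256
```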
2013-01-01
Background In the United States, as in many other parts of the world, the prevalence of overweight/obesity is at epidemic proportions in the adult population and even higher among Veterans. To address the high prevalence of overweight/obesity among Veterans, the MOVE!® weight management program was disseminated nationally to Veteran Affairs (VA) medical centers. The objective of this paper is two-fold: to describe factors that explain the wide variation in implementation of MOVE!; and to illustrate, step-by-step, how to apply a theory-based framework using qualitative data. Methods Five VA facilities were selected to maximize variation in implementation effectiveness and geographic location. Twenty-four key stakeholders were interviewed about their experiences in implementing MOVE!. The Consolidated Framework for Implementation Research (CFIR) was used to guide collection and analysis of qualitative data. Constructs that most strongly influence implementation effectiveness were identified through a cross-case comparison of ratings. Results Of the 31 CFIR constructs assessed, ten constructs strongly distinguished between facilities with low versus high program implementation effectiveness. The majority (six) were related to the inner setting: networks and communications; tension for change; relative priority; goals and feedback; learning climate; and leadership engagement. One construct each, from intervention characteristics (relative advantage) and outer setting (patient needs and resources), plus two from process (executing and reflecting) also strongly distinguished between high and low implementation. Two additional constructs weakly distinguished, 16 were mixed, three constructs had insufficient data to assess, and one was not applicable. Detailed descriptions of how each distinguishing construct manifested in study facilities and a table of recommendations is provided. Conclusions This paper presents an approach for using the CFIR to code and rate qualitative data in a way that will facilitate comparisons across studies. An online Wiki resource (http://www.wiki.cfirwiki.net) is available, in addition to the information presented here, that contains much of the published information about the CFIR and its constructs and sub-constructs. We hope that the described approach and open access to the CFIR will generate wide use and encourage dialogue and continued refinement of both the framework and approaches for applying it. PMID:23663819
Damschroder, Laura J; Lowery, Julie C
2013-05-10
In the United States, as in many other parts of the world, the prevalence of overweight/obesity is at epidemic proportions in the adult population and even higher among Veterans. To address the high prevalence of overweight/obesity among Veterans, the MOVE!(®) weight management program was disseminated nationally to Veteran Affairs (VA) medical centers. The objective of this paper is two-fold: to describe factors that explain the wide variation in implementation of MOVE!; and to illustrate, step-by-step, how to apply a theory-based framework using qualitative data. Five VA facilities were selected to maximize variation in implementation effectiveness and geographic location. Twenty-four key stakeholders were interviewed about their experiences in implementing MOVE!. The Consolidated Framework for Implementation Research (CFIR) was used to guide collection and analysis of qualitative data. Constructs that most strongly influence implementation effectiveness were identified through a cross-case comparison of ratings. Of the 31 CFIR constructs assessed, ten constructs strongly distinguished between facilities with low versus high program implementation effectiveness. The majority (six) were related to the inner setting: networks and communications; tension for change; relative priority; goals and feedback; learning climate; and leadership engagement. One construct each, from intervention characteristics (relative advantage) and outer setting (patient needs and resources), plus two from process (executing and reflecting) also strongly distinguished between high and low implementation. Two additional constructs weakly distinguished, 16 were mixed, three constructs had insufficient data to assess, and one was not applicable. Detailed descriptions of how each distinguishing construct manifested in study facilities and a table of recommendations is provided. This paper presents an approach for using the CFIR to code and rate qualitative data in a way that will facilitate comparisons across studies. An online Wiki resource (http://www.wiki.cfirwiki.net) is available, in addition to the information presented here, that contains much of the published information about the CFIR and its constructs and sub-constructs. We hope that the described approach and open access to the CFIR will generate wide use and encourage dialogue and continued refinement of both the framework and approaches for applying it.
Robinson, Claire H; Annis, Ann M; Forman, Jane; Krein, Sarah L; Yankey, Nicholas; Duffy, Sonia A; Taylor, Beth; Sales, Anne E
2016-08-01
To assess implementation of the Veterans Health Administration staffing methodology directive. In 2010 the Veterans Health Administration promulgated a staffing methodology directive for inpatient nursing units to address staffing and budget forecasting. A qualitative multi-case evaluation approach assessed staffing methodology implementation. Semi-structured telephone interviews were conducted from March - June 2014 with Nurse Executives and their teams at 21 facilities. Interviews focused on the budgeting process, implementation experiences, use of data, leadership support, and training. An implementation score was created for each facility using a 4-point rating scale. The scores were used to select three facilities (low, medium and high implementation) for more detailed case studies. After analysing interview summaries, the evaluation team developed a four domain scoring structure: (1) integration of staffing methodology into budget development; (2) implementation of the Directive elements; (3) engagement of leadership and staff; and (4) use of data to support the staffing methodology process. The high implementation facility had leadership understanding and endorsement of staffing methodology, confidence in and ability to work with data, and integration of staffing methodology results into the budgeting process. The low implementation facility reported poor leadership engagement and little understanding of data sources and interpretation. Implementation varies widely across facilities. Implementing staffing methodology in facilities with complex and changing staffing needs requires substantial commitment at all organizational levels especially for facilities that have traditionally relied on historical levels to budget for staffing. Published 2016. This article is a U.S. Government work and is in the public domain in the USA.
Waltz, Thomas J; Powell, Byron J; Matthieu, Monica M; Damschroder, Laura J; Chinman, Matthew J; Smith, Jeffrey L; Proctor, Enola K; Kirchner, JoAnn E
2015-08-07
Poor terminological consistency for core concepts in implementation science has been widely noted as an obstacle to effective meta-analyses. This inconsistency is also a barrier for those seeking guidance from the research literature when developing and planning implementation initiatives. The Expert Recommendations for Implementing Change (ERIC) study aims to address one area of terminological inconsistency: discrete implementation strategies involving one process or action used to support a practice change. The present report is on the second stage of the ERIC project that focuses on providing initial validation of the compilation of 73 implementation strategies that were identified in the first phase. Purposive sampling was used to recruit a panel of experts in implementation science and clinical practice (N = 35). These key stakeholders used concept mapping sorting and rating activities to place the 73 implementation strategies into similar groups and to rate each strategy's relative importance and feasibility. Multidimensional scaling analysis provided a quantitative representation of the relationships among the strategies, all but one of which were found to be conceptually distinct from the others. Hierarchical cluster analysis supported organizing the 73 strategies into 9 categories. The ratings data reflect those strategies identified as the most important and feasible. This study provides initial validation of the implementation strategies within the ERIC compilation as being conceptually distinct. The categorization and strategy ratings of importance and feasibility may facilitate the search for, and selection of, strategies that are best suited for implementation efforts in a particular setting.
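A hedged sketch of the concept-mapping analytics (multidimensional scaling followed by hierarchical clustering); the co-sort matrix construction, parameters, and placeholder data are assumptions, not the ERIC team's exact pipeline:

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from sklearn.manifold import MDS

def concept_map(cosort, n_clusters=9):
    """cosort[i, j] counts how many panelists sorted strategies i and j
    into the same pile; rare co-sorting becomes a large distance."""
    dist = cosort.max() - cosort.astype(float)
    np.fill_diagonal(dist, 0.0)
    coords = MDS(n_components=2, dissimilarity="precomputed",
                 random_state=0).fit_transform(dist)
    labels = fcluster(linkage(coords, method="ward"), n_clusters,
                      criterion="maxclust")
    return coords, labels

# Placeholder data: 73 strategies sorted by a 35-member panel.
rng = np.random.default_rng(0)
m = rng.integers(0, 36, size=(73, 73))
coords, labels = concept_map((m + m.T) // 2)
```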
Rainfall recharge estimation on a nation-wide scale using satellite information in New Zealand
NASA Astrophysics Data System (ADS)
Westerhoff, Rogier; White, Paul; Moore, Catherine
2015-04-01
Models of rainfall recharge to groundwater are challenged by the need to combine uncertain estimates of rainfall, evapotranspiration, terrain slope, and unsaturated zone parameters (e.g., soil drainage and hydraulic conductivity of the subsurface). Therefore, rainfall recharge is easiest to estimate on a local scale in well-drained plains, where it is known that rainfall directly recharges groundwater. In New Zealand, this simplified approach works in the policy framework of regional councils, who manage water allocation at the aquifer and sub-catchment scales. However, a consistent overview of rainfall recharge is difficult to obtain at catchment and national scale: in addition to data uncertainties, data formats are inconsistent between catchments; the density of ground observations, where these exist, differs across regions; each region typically uses different local models for estimating recharge components; and different methods and ground observations are used for calibration and validation of these models. The research described in this paper therefore presents a nation-wide approach to estimate rainfall recharge in New Zealand. The method used is a soil water balance approach, with input data from national rainfall and soil and geology databases. Satellite data (i.e., evapotranspiration, soil moisture, and terrain) aid in the improved calculation of rainfall recharge, especially in data-sparse areas. A first version of the model has been implemented on a 1 km x 1 km and monthly scale between 2000 and 2013. A further version will include a quantification of recharge estimate uncertainty: with both "top down" input error propagation methods and catchment-wide "bottom up" assessments of integrated uncertainty being adopted. Using one nation-wide methodology opens up new possibilities: it can, for example, help in more consistent estimation of water budgets, groundwater fluxes, or other hydrological parameters. Since recharge is estimated for the entire land surface, and not only the known aquifers, the model also identifies other zones that could potentially recharge aquifers, including large areas (e.g., mountains) that are currently regarded as impervious. The resulting rainfall recharge data have also been downscaled in a 200 m x 200 m calculation of a national monthly water table. This will lead to better estimation of hydraulic conductivity, which holds considerable potential for further research in unconfined aquifers in New Zealand.
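A minimal single-cell sketch of the soil-water-balance idea (the bucket formulation, parameter names, and numbers are illustrative assumptions; the national model also folds in slope, soil drainage, and subsurface hydraulic conductivity):

```python
def monthly_recharge(rain, pet, soil_capacity, store=0.0):
    """One grid cell: recharge is the drainage that spills past a soil
    store of fixed capacity after evapotranspiration is satisfied."""
    out = []
    for p, e in zip(rain, pet):
        store += p - min(e, store + p)  # actual ET capped by available water
        out.append(max(0.0, store - soil_capacity))
        store = min(store, soil_capacity)
    return out

# Hypothetical mm/month inputs for one 1 km x 1 km cell.
print(monthly_recharge([120, 95, 60], [40, 55, 90], soil_capacity=80.0))
# -> [0.0, 40.0, 0.0]: recharge occurs only in the month the bucket overflows
```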
Use of hydrologic and hydrodynamic modeling for ecosystem restoration
Obeysekera, J.; Kuebler, L.; Ahmed, S.; Chang, M.-L.; Engel, V.; Langevin, C.; Swain, E.; Wan, Y.
2011-01-01
Planning and implementation of unprecedented projects for restoring the greater Everglades ecosystem are underway, and the hydrologic and hydrodynamic modeling of restoration alternatives has become essential for the success of restoration efforts. In view of the complex nature of the South Florida water resources system, regional-scale (system-wide) hydrologic models have been developed and used extensively for the development of the Comprehensive Everglades Restoration Plan. In addition, numerous subregional-scale hydrologic and hydrodynamic models have been developed and are being used for evaluating project-scale water management plans associated with urban, agricultural, and inland coastal ecosystems. The authors provide a comprehensive summary of models of all scales, as well as the next generation models under development to meet the future needs of ecosystem restoration efforts in South Florida. The multiagency efforts to develop and apply models have allowed the agencies to understand the complex hydrologic interactions, quantify appropriate performance measures, and use new technologies in simulation algorithms, software development, and GIS/database techniques to meet the future modeling needs of the ecosystem restoration programs. Copyright © 2011 Taylor & Francis Group, LLC.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-31
...; Comment Request; Evaluation of a District Wide Implementation of a Professional Learning Community... Professional Learning Community Initiative. OMB Control Number: 1850--NEW. Type of Review: A new information... need for systematic information about district-wide implementation of professional learning communities...
An improved architecture for video rate image transformations
NASA Technical Reports Server (NTRS)
Fisher, Timothy E.; Juday, Richard D.
1989-01-01
Geometric image transformations are of interest to pattern recognition algorithms for their use in simplifying some aspects of the pattern recognition process. Examples include reducing sensitivity to the rotation, scale, and perspective of the object being recognized. The NASA Programmable Remapper can perform a wide variety of geometric transforms at full video rate. An architecture is proposed that extends its abilities and alleviates many of the first version's shortcomings. The need for the improvements is discussed in the context of the initial Programmable Remapper and the benefits and limitations it has delivered. The implementation and capabilities of the proposed architecture are discussed.
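One classic transform of the kind the abstract alludes to is the log-polar remap, under which rotation and scaling of the input become simple shifts. The sketch below (nearest-neighbour lookup, hypothetical grid sizes) illustrates the coordinate math only, not the Remapper's hardware pipeline:

```python
import numpy as np

def log_polar_remap(image, n_rho=64, n_theta=64):
    """Resample a grayscale image onto a log-polar grid centred on the
    image centre; each output row is one radius, each column one angle."""
    h, w = image.shape
    cy, cx = h / 2.0, w / 2.0
    rho = np.exp(np.linspace(0.0, np.log(min(cy, cx)), n_rho))
    theta = np.linspace(0.0, 2.0 * np.pi, n_theta, endpoint=False)
    ys = (cy + rho[:, None] * np.sin(theta)).astype(int).clip(0, h - 1)
    xs = (cx + rho[:, None] * np.cos(theta)).astype(int).clip(0, w - 1)
    return image[ys, xs]  # one lookup per output pixel, as a remap table would do

out = log_polar_remap(np.random.rand(128, 128))
```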
Skvortsov, Valeriy; Ivannikov, Alexander; Tikunov, Dimitri; Stepanenko, Valeriy; Borysheva, Natalie; Orlenko, Sergey; Nalapko, Mikhail; Hoshi, Masaharu
2006-02-01
General aspects of applying the method of retrospective dose estimation by electron paramagnetic resonance spectroscopy of human tooth enamel (EPR dosimetry) to the population residing in the vicinity of the Semipalatinsk nuclear test site are analyzed and summarized. The analysis is based on the results obtained during 20 years of investigations conducted in the Medical Radiological Research Center regarding the development and practical application of this method for wide-scale dosimetric investigation of populations exposed to radiation after the Chernobyl accident and other radiation accidents.
NASA Technical Reports Server (NTRS)
Li, Yong; Moorthi, S.; Bates, J. Ray; Suarez, Max J.
1994-01-01
High order horizontal diffusion of the form K∇^{2m} is widely used in spectral models as a means of preventing energy accumulation at the shortest resolved scales. In the spectral context, an implicit formulation of such diffusion is trivial to implement. The present note describes an efficient method of implementing implicit high order diffusion in global finite difference models. The method expresses the high order diffusion equation as a sequence of equations involving ∇^2. The solution is obtained by combining fast Fourier transforms in longitude with a finite difference solver for the second order ordinary differential equation in latitude. The implicit diffusion routine is suitable for use in any finite difference global model that uses a regular latitude/longitude grid. The absence of a restriction on the timestep makes it particularly suitable for use in semi-Lagrangian models. The scale selectivity of the high order diffusion gives it an advantage over the uncentering method that has been used to control computational noise in two-time-level semi-Lagrangian models.
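Sketched in equations (consistent with the abstract up to sign conventions; the a_j notation is ours, not the note's): a backward-Euler step of the hyperdiffusion equation ∂q/∂t = (−1)^{m+1} K ∇^{2m} q gives

```latex
\bigl(1 + (-1)^{m}\,\Delta t\,K\,\nabla^{2m}\bigr)\,q^{\,n+1} = q^{\,n},
\qquad
1 + (-1)^{m}\,\Delta t\,K\,\nabla^{2m}
  = \prod_{j=1}^{m}\bigl(1 - a_{j}\,\nabla^{2}\bigr),
```

where the a_j are the complex roots of the degree-m polynomial in ∇^2. Each factor (1 − a_j ∇^2) q = r is then a Helmholtz-type problem: a fast Fourier transform in longitude decouples the zonal wavenumbers, leaving for each one a second-order two-point boundary-value problem in latitude that a tridiagonal finite-difference solver handles.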
Kinetic Approaches to Shear-Driven Magnetic Reconnection for Multi-Scale Modeling of CME Initiation
NASA Astrophysics Data System (ADS)
Black, C.; Antiochos, S. K.; DeVore, C.; Germaschewski, K.; Karpen, J. T.
2013-12-01
In the standard model for coronal mass ejections (CME) and/or solar flares, the free energy for the event resides in the strongly sheared magnetic field of a filament channel. The pre-eruption force balance, consisting of an upward force due to the magnetic pressure of the sheared field balanced by a downward tension due to the overlying un-sheared field, is widely believed to be disrupted by magnetic reconnection. Therefore, understanding the initiation of solar explosive phenomena requires a true multi-scale model of reconnection onset driven by the buildup of magnetic shear. While the application of magnetic-field shear is a trivial matter in MHD simulations, it is a significant challenge in a PIC code. The driver must be implemented in a self-consistent manner and with boundary conditions that avoid the generation of waves that destroy the applied shear. In this work, we describe drivers for 2.5D, aperiodic, PIC systems and discuss the implementation of driver-consistent boundary conditions that allow a net electric current to flow through the walls. Preliminary tests of these boundaries with an MHD equilibrium are shown. This work was supported, in part, by the NASA Living With a Star TR&T Program.
Discrete event performance prediction of speculatively parallel temperature-accelerated dynamics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zamora, Richard James; Voter, Arthur F.; Perez, Danny
2016-12-01
Due to its unrivaled ability to predict the dynamical evolution of interacting atoms, molecular dynamics (MD) is a widely used computational method in theoretical chemistry, physics, biology, and engineering. Despite its success, MD is only capable of modeling time scales within several orders of magnitude of thermal vibrations, leaving out many important phenomena that occur at slower rates. The Temperature Accelerated Dynamics (TAD) method overcomes this limitation by thermally accelerating the state-to-state evolution captured by MD. Due to the algorithmically complex nature of the serial TAD procedure, implementations have yet to improve performance by parallelizing the concurrent exploration of multiple states. Here we utilize a discrete event-based application simulator to introduce and explore a new Speculatively Parallel TAD (SpecTAD) method. We investigate the SpecTAD algorithm, without a full-scale implementation, by constructing an application simulator proxy (SpecTADSim). Finally, following this method, we discover that a nontrivial relationship exists between the optimal SpecTAD parameter set and the number of CPU cores available at run-time. Furthermore, we find that a majority of the available SpecTAD boost can be achieved within an existing TAD application using relatively simple algorithm modifications.
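The flavor of such a discrete-event performance model can be conveyed with a toy Monte Carlo sketch. The model below, in which each speculative exploration is correct with a fixed probability and a round advances the state chain by the run of consecutive correct speculations, is an illustrative assumption of this note, not SpecTADSim itself.

```python
import random

def speculative_boost(n_cores, p_correct, n_events=100_000, seed=1):
    """Toy discrete-event model of speculative state exploration.

    n_cores workers each speculatively explore the next state in a
    chain; a speculation is 'correct' with probability p_correct.
    One wall-clock round advances the chain by 1 (the committed state)
    plus every consecutive correct speculation, so the boost over
    serial execution is the mean number of states per round.
    """
    rng = random.Random(seed)
    advanced, rounds = 0, 0
    while advanced < n_events:
        gain = 1
        while gain < n_cores and rng.random() < p_correct:
            gain += 1
        advanced += gain
        rounds += 1
    return advanced / rounds

# The boost saturates as cores grow, and the saturation point depends
# on the speculation accuracy, echoing the paper's observation that the
# optimal parameter set depends nontrivially on the available cores.
for cores in (2, 4, 8, 16):
    print(cores, round(speculative_boost(cores, 0.8), 2))
```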
Gravity and Heater Size Effects on Pool Boiling Heat Transfer
NASA Technical Reports Server (NTRS)
Kim, Jungho; Raj, Rishi
2014-01-01
The current work is based on observations of boiling heat transfer over a continuous range of gravity levels between 0g and 1.8g and varying heater sizes, with a fluorinert (FC-72/n-perfluorohexane) as the test liquid. Variable-gravity pool boiling heat transfer measurements over a wide range of gravity levels were made during parabolic flight campaigns as well as onboard the International Space Station. For large heaters and/or higher gravity conditions, buoyancy dominated boiling and the heat transfer results were heater size independent. The power law coefficient for gravity in the heat transfer equation was found to be a function of wall temperature under these conditions. Under low gravity conditions and/or for smaller heaters, surface tension forces dominated and the heat transfer results were heater size dependent. A pool boiling regime map differentiating buoyancy- and surface-tension-dominated regimes was developed, along with a unified framework that allows for scaling of pool boiling over a wide range of gravity levels and heater sizes. The scaling laws developed in this study are expected to allow performance quantification of phase-change-based technologies under variable gravity environments, eventually leading to their implementation in space-based applications.
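A rough way to express such a regime map is to compare the heater size with the capillary length Lc = sqrt(sigma / (g * (rho_l − rho_v))). The fluid properties and the regime threshold in the sketch below are illustrative assumptions, not the paper's fitted values.

```python
import math

def capillary_length(sigma, rho_l, rho_v, g):
    """Capillary length Lc = sqrt(sigma / (g * (rho_l - rho_v)))."""
    return math.sqrt(sigma / (g * (rho_l - rho_v)))

def boiling_regime(heater_size, sigma, rho_l, rho_v, g, threshold=2.0):
    """Classify the regime by comparing heater size to Lc.

    The threshold of ~2 capillary lengths is an illustrative
    assumption; the paper derives the actual regime boundary.
    """
    ratio = heater_size / capillary_length(sigma, rho_l, rho_v, g)
    return "buoyancy-dominated" if ratio > threshold else "surface-tension-dominated"

# Rough FC-72 properties near saturation (illustrative values only).
sigma, rho_l, rho_v = 0.008, 1600.0, 13.0   # N/m, kg/m^3, kg/m^3
for g in (0.01 * 9.81, 9.81, 1.8 * 9.81):    # low-g, 1g, hyper-g
    print(round(g, 3), boiling_regime(0.007, sigma, rho_l, rho_v, g))
```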
Reconnecting fragmented sturgeon populations in North American rivers
Jager, Henriette; Parsley, Michael J.; Cech, Joseph J. Jr.; McLaughlin, R.L.; Forsythe, Patrick S.; Elliott, Robert S.
2016-01-01
The majority of large North American rivers are fragmented by dams that interrupt migrations of wide-ranging fishes like sturgeons. Reconnecting habitat is viewed as an important means of protecting sturgeon species in U.S. rivers because these species have lost between 5% and 60% of their historical ranges. Unfortunately, facilities designed to pass other fishes have rarely worked well for sturgeons. The most successful passage facilities were sized appropriately for sturgeons and accommodated bottom-oriented species. For upstream passage, facilities with large entrances, full-depth guidance systems, large lifts, or wide fishways without obstructions or tight turns worked well. However, facilitating upstream migration is only half the battle. Broader recovery for linked sturgeon populations requires safe “round-trip” passage involving multiple dams. The most successful downstream passage facilities included nature-like fishways, large canal bypasses, and bottom-draw sluice gates. We outline an adaptive approach to implementing passage that begins with temporary programs and structures and monitors success both at the scale of individual fish at individual dams and the scale of metapopulations in a river basin. The challenge will be to learn from past efforts and reconnect North American sturgeon populations in a way that promotes range expansion and facilitates population recovery.
Integrated fringe projection 3D scanning system for large-scale metrology based on laser tracker
NASA Astrophysics Data System (ADS)
Du, Hui; Chen, Xiaobo; Zhou, Dan; Guo, Gen; Xi, Juntong
2017-10-01
Large-scale components are widespread in the advanced manufacturing industry, and 3D profilometry plays a pivotal role in their quality control. This paper proposes a flexible, robust large-scale 3D scanning system that integrates a robot with a binocular structured-light scanner and a laser tracker. The measurement principle and system construction of the integrated system are introduced, and a mathematical model is established for the global data fusion. Subsequently, a flexible and robust method and mechanism are introduced for the establishment of the end coordinate system. Based on this method, a virtual robot model is constructed for hand-eye calibration, and the transformation matrix between the end coordinate system and the world coordinate system is solved. A validation experiment was carried out to verify the proposed algorithms. First, the hand-eye transformation matrix was solved; then a car body rear was measured 16 times to verify the global data fusion algorithm, and the 3D shape of the rear was reconstructed successfully.
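The global data fusion step reduces to chaining homogeneous transforms from the scanner frame through the robot end frame into the tracker's world frame. The matrix and function names below are assumptions of this note, not the paper's notation.

```python
import numpy as np

def to_world(T_world_end, T_end_cam, points_cam):
    """Map scanner (camera) points into the tracker's world frame.

    T_world_end : 4x4 pose of the robot end coordinate system,
                  as measured by the laser tracker.
    T_end_cam   : 4x4 hand-eye transform from the calibration step.
    points_cam  : (N, 3) points from the structured-light scanner.
    Returns (N, 3) points expressed in the world coordinate system.
    """
    pts = np.hstack([points_cam, np.ones((len(points_cam), 1))])  # homogeneous
    return (T_world_end @ T_end_cam @ pts.T).T[:, :3]
```

Scans taken from different robot poses fuse into one point cloud simply by applying this mapping with each pose's measured T_world_end.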
NASA Astrophysics Data System (ADS)
Hartmann, Alfred; Redfield, Steve
1989-04-01
This paper discusses the design of large-scale (1000 × 1000) optical crossbar switching networks for use in parallel processing supercomputers. Alternative design sketches for an optical crossbar switching network are presented using free-space optical transmission with either a beam spreading/masking model or a beam steering model for internodal communications. The performances of alternative multiple-access channel communications protocols (unslotted and slotted ALOHA, and carrier sense multiple access, CSMA) are compared with the performance of the classic arbitrated bus crossbar of conventional electronic parallel computing. These comparisons indicate an almost inverse relationship between ease of implementation and speed of operation. Practical issues of optical system design are addressed, and an optically addressed, composite spatial light modulator design is presented for fabrication at arbitrarily large scale. The wide range of switch architecture, communications protocol, optical systems design, device fabrication, and system performance problems presented by these design sketches poses a serious challenge to practical exploitation of highly parallel optical interconnects in advanced computer designs.
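The protocol comparison rests on textbook throughput models; the short sketch below contrasts them under the usual Poisson-arrival assumptions (CSMA is omitted because its throughput also depends on the normalized propagation delay).

```python
import numpy as np

G = np.linspace(0.01, 4.0, 400)          # offered load (frames per slot)
S_pure    = G * np.exp(-2 * G)           # unslotted ALOHA, peak 1/(2e) ~ 0.18
S_slotted = G * np.exp(-G)               # slotted ALOHA, peak 1/e ~ 0.37
S_bus     = np.minimum(G, 1.0)           # ideal arbitrated bus (no collisions)

for name, S in [("pure", S_pure), ("slotted", S_slotted), ("bus", S_bus)]:
    print(f"{name:8s} peak throughput = {S.max():.3f}")
```

The collision-free bus saturates at full utilization but needs arbitration hardware, while the ALOHA variants are trivial to implement and pay for it in peak throughput, which is the inverse relationship the paper notes.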
NASA Astrophysics Data System (ADS)
Kempf, A.; Chatwin-Davies, A.; Martin, R. T. W.
2013-02-01
While a natural ultraviolet cutoff, presumably at the Planck length, is widely assumed to exist in nature, it is nontrivial to implement a minimum length scale covariantly. This is because the presence of a fixed minimum length needs to be reconciled with the ability of Lorentz transformations to contract lengths. In this paper, we implement a fully covariant Planck scale cutoff by cutting off the spectrum of the d'Alembertian. In this scenario, consistent with Lorentz contractions, wavelengths that are arbitrarily smaller than the Planck length continue to exist. However, the dynamics of modes of wavelengths that are significantly smaller than the Planck length possess a very small bandwidth. This has the effect of freezing the dynamics of such modes. While both wavelengths and bandwidths are frame dependent, Lorentz contraction and time dilation conspire to make the freezing of modes of trans-Planckian wavelengths covariant. In particular, we show that this ultraviolet cutoff can be implemented covariantly also in curved spacetimes. We focus on Friedmann Robertson Walker spacetimes and their much-discussed trans-Planckian question: The physical wavelength of each comoving mode was smaller than the Planck scale at sufficiently early times. What was the mode's dynamics then? Here, we show that in the presence of the covariant UV cutoff, the dynamical bandwidth of a comoving mode is essentially zero up until its physical wavelength starts exceeding the Planck length. In particular, we show that under general assumptions, the number of dynamical degrees of freedom of each comoving mode all the way up to some arbitrary finite time is actually finite. Our results also open the way to calculating the impact of this natural UV cutoff on inflationary predictions for the cosmic microwave background.
Soft Actuators for Small-Scale Robotics.
Hines, Lindsey; Petersen, Kirstin; Lum, Guo Zhan; Sitti, Metin
2017-04-01
This review comprises a detailed survey of ongoing methodologies for soft actuators, highlighting approaches suitable for nanometer- to centimeter-scale robotic applications. Soft robots present a special design challenge in that their actuation and sensing mechanisms are often highly integrated with the robot body and overall functionality. When less than a centimeter, they belong to an even more special subcategory of robots or devices, in that they often lack on-board power, sensing, computation, and control. Soft, active materials are particularly well suited for this task, responding to a wide range of stimuli, with a number of impressive examples demonstrating large deformations, high motion complexities, and varied multifunctionality. Recent research includes both the development of new materials and composites, as well as novel implementations leveraging the unique properties of soft materials. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
2009-01-01
Background Insertional mutagenesis is an effective method for functional genomic studies in various organisms. It can rapidly generate easily tractable mutations. A large-scale insertional mutagenesis with the piggyBac (PB) transposon is currently being performed in mice at the Institute of Developmental Biology and Molecular Medicine (IDM), Fudan University in Shanghai, China. This project is carried out via collaborations among multiple groups overseeing interconnected experimental steps and generates a large volume of experimental data continuously. The project therefore calls for an efficient database system for recording, management, statistical analysis, and information exchange. Results This paper presents a database application called MP-PBmice (insertional mutation mapping system of the PB Mutagenesis Information Center), which was developed to serve the on-going large-scale PB insertional mutagenesis project. A lightweight enterprise-level development framework, Struts-Spring-Hibernate, is used to ensure constructive and flexible support for the application. The MP-PBmice database system has three major features: strict access control, efficient workflow control, and good expandability. It supports the collaboration among different groups that enter data and exchange information on a daily basis, and it is capable of providing real-time progress reports for the whole project. MP-PBmice can be easily adapted for other large-scale insertional mutation mapping projects, and the source code of this software is freely available at http://www.idmshanghai.cn/PBmice. Conclusion MP-PBmice is a web-based application for large-scale insertional mutation mapping onto the mouse genome, implemented with the widely used Struts-Spring-Hibernate framework. This system is already in use by the on-going genome-wide PB insertional mutation mapping project at IDM, Fudan University. PMID:19958505
Notenbaert, An; Pfeifer, Catherine; Silvestri, Silvia; Herrero, Mario
2017-02-01
As a result of population growth, urbanization and climate change, agricultural systems around the world face enormous pressure on the use of resources. There is a pressing need for wide-scale innovation leading to development that improves the livelihoods and food security of the world's population while at the same time addressing climate change adaptation and mitigation. A variety of promising climate-smart interventions have been identified. However, what remains is the prioritization of interventions for investment and broad dissemination. The suitability and adoption of interventions depend on a variety of bio-physical and socio-economic factors. Also, their impacts, when adopted and out-scaled, are likely to be highly heterogeneous. This heterogeneity expresses itself not only spatially and temporally but also in terms of the stakeholders affected: some might win and some might lose. A mechanism that can facilitate a systematic, holistic assessment of the likely spread and consequential impact of potential interventions is one way of improving the selection and targeting of such options. In this paper we provide climate-smart agriculture (CSA) planners and implementers at all levels with a generic framework for evaluating and prioritising potential interventions. This entails an iterative process of mapping out recommendation domains, assessing adoption potential and estimating impacts. Through examples related to livestock production in sub-Saharan Africa, we demonstrate each of the steps and how they are interlinked. The framework is applicable in many different forms, scales and settings, and it has a wide applicability beyond the examples presented; we hope to stimulate readers to integrate the concepts in the planning process for climate-smart agriculture, which invariably involves multi-stakeholder, multi-scale and multi-objective decision-making.
Implementing meta-analysis from genome-wide association studies for pork quality traits
USDA-ARS?s Scientific Manuscript database
Pork quality plays an important role in the meat processing industry, thus different methodologies have been implemented to elucidate the genetic architecture of traits affecting meat quality. One of the most common and widely used approaches is to perform genome-wide association (GWA) studies. Howe...
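The core of a GWA meta-analysis is typically an inverse-variance-weighted combination of per-study effect estimates. Since the abstract above is truncated, the sketch below shows the standard fixed-effect calculation rather than the manuscript's specific method; the numbers are illustrative.

```python
import math

def fixed_effect_meta(betas, ses):
    """Inverse-variance-weighted (fixed-effect) meta-analysis of
    per-study SNP effect estimates: each study is weighted by the
    reciprocal of its squared standard error. Returns the pooled
    effect, its standard error, and the z statistic."""
    weights = [1.0 / se ** 2 for se in ses]
    beta = sum(w * b for w, b in zip(weights, betas)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return beta, se, beta / se

# Three hypothetical studies of the same SNP effect on a quality trait.
print(fixed_effect_meta([0.12, 0.08, 0.15], [0.05, 0.04, 0.07]))
```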
Légaré, France; Moumjid-Ferdjaoui, Nora; Drolet, Renée; Stacey, Dawn; Härter, Martin; Bastian, Hilda; Beaulieu, Marie-Dominique; Borduas, Francine; Charles, Cathy; Coulter, Angela; Desroches, Sophie; Friedrich, Gwendolyn; Gafni, Amiram; Graham, Ian D; Labrecque, Michel; LeBlanc, Annie; Légaré, Jean; Politi, Mary; Sargeant, Joan; Thomson, Richard
2013-01-01
Shared decision making is now making inroads in health care professionals' continuing education curricula, but there is no consensus on what core competencies are required by clinicians for effectively involving patients in health-related decisions. Ready-made programs for training clinicians in shared decision making are in high demand, but existing programs vary widely in their theoretical foundations, length, and content. An international, interdisciplinary group of 25 individuals met in 2012 to discuss theoretical approaches to making health-related decisions, compare notes on existing programs, take stock of stakeholders' concerns, and deliberate on core competencies. This article summarizes the results of those discussions. Some participants believed that existing models already provide a sufficient conceptual basis for developing and implementing shared decision making competency-based training programs on a wide scale. Others argued that this would be premature, as there is still no consensus on the definition of shared decision making or sufficient evidence to recommend specific competencies for implementing shared decision making. However, all participants agreed that there were two broad types of competencies that clinicians need for implementing shared decision making: relational competencies and risk communication competencies. Further multidisciplinary research could broaden and deepen our understanding of core competencies for shared decision making training. Copyright © 2013 The Alliance for Continuing Education in the Health Professions, the Society for Academic Continuing Medical Education, and the Council on CME, Association for Hospital Medical Education.
ISO 50001 and SEP Faster and Cheaper - Exploring the Enterprise-Wide Approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Jingjing; Rao, Prakash; Therkelsen, Peter
ISO 50001 and other management systems (e.g., ISO 9001 and ISO 14001) allow for implementation and certification at the enterprise level. The "Central Office" concept, which allows a small group of employees to manage and facilitate the organization's energy management system (EnMS) at the enterprise level, was introduced within the ISO 50003 standard to provide guidance to ISO 50001 certification bodies. Four industrial companies have partnered with the United States Department of Energy to pilot the enterprise-wide ISO 50001/SEP concept under the Better Buildings Superior Energy Performance (SEP) Enterprise-wide Accelerator. Each organization developed a Central Office to host their EnMS while implementing ISO 50001/SEP at multiple physically separated sites. The four corporate partners tailored their Central Office implementation model to meet their own specific circumstances and needs. This paper reviews the commonalities, differences, and benefits of each of these enterprise-wide implementation models, including organizational structures, Central Office staff responsibilities, and key strategies. The cost savings and benefits of using the enterprise-wide approach were assessed, including the cost per site compared with that of a conventional, single-site ISO 50001/SEP implementation approach. This paper also discusses the drivers for the cost reductions realized through these enterprise-wide approaches. The four partner companies worked with 30 total sites. On average, these 30 sites improved energy performance by 5% annually over their SEP achievement periods, saved more than $600,000 annually in energy costs, and reduced implementation cost for ISO 50001 and SEP by $19,000 and 0.8 Full Time Equivalent × years (FTE-yr) of staff time per site. The results can inform other organizations seeking to implement enterprise-wide ISO 50001/SEP, as well as energy efficiency organizations seeking to promote wider adoption of ISO 50001 implementation.
Distributed Storage Healthcare — The Basis of a Planet-Wide Public Health Care Network
Kakouros, Nikolaos
2013-01-01
Background: As health providers move towards higher levels of information technology (IT) integration, they become increasingly dependent on the availability of the electronic health record (EHR). Current solutions of individually managed storage by each healthcare provider focus on efforts to ensure data security, availability and redundancy. Such models, however, scale poorly to a future of a planet-wide public health-care network (PWPHN). Our aim was to review the research literature on distributed storage systems and propose methods that may aid the implementation of a PWPHN. Methods: A systematic review was carried out of the research dealing with distributed storage systems and EHR. A literature search was conducted on five electronic databases: PubMed/MEDLINE, CINAHL, EMBASE, Web of Science (ISI) and Google Scholar, and then expanded to include non-authoritative sources. Results: The English National Health Service Spine represents the most established country-wide PHN but is limited in deployment and remains underused. Other distributed EHR attempts identified in the literature are more limited in scope. We discuss the currently available distributed file storage solutions and propose a schema of how one of these technologies can be used to deploy a distributed storage of EHR with benefits in terms of enhanced fault tolerance and global availability within the PWPHN. We conclude that a PWPHN distributed health care record storage system is technically feasible over current Internet infrastructure. Nonetheless, the socioeconomic viability of PWPHN implementations remains to be determined. PMID:23459171
NASA Astrophysics Data System (ADS)
Yang, Shuangming; Wei, Xile; Deng, Bin; Liu, Chen; Li, Huiyan; Wang, Jiang
2018-03-01
Balance between the biological plausibility of dynamical activities and computational efficiency is one of the challenging problems in computational neuroscience and neural system engineering. This paper proposes a set of efficient methods for the hardware realization of conductance-based neuron models with the relevant dynamics, aiming to reproduce biological behaviors with a low-cost implementation on a digital programmable platform; the methods can be applied to a wide range of conductance-based neuron models. Modified GP (globus pallidus) neuron models for efficient hardware implementation are presented to reproduce reliable pallidal dynamics, which decode the information of the basal ganglia and regulate movement-disorder-related voluntary activities. Implementation results on a field-programmable gate array (FPGA) demonstrate that the proposed techniques and models reduce the resource cost significantly and reproduce the biological dynamics accurately. Besides, biological behaviors under weak network coupling are explored on the proposed platform, and a theoretical analysis is made to investigate the biological characteristics of the structured pallidal oscillator and network. The implementation techniques provide an essential step towards large-scale neural networks for exploring dynamical mechanisms in real time. Furthermore, the proposed methodology makes the FPGA-based system a powerful platform for the investigation of neurodegenerative diseases and the real-time control of bio-inspired neuro-robotics.
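For orientation, a conductance-based neuron reduces to coupled membrane and gating ODEs. The sketch below integrates the classic Hodgkin-Huxley equations with generic squid-axon parameters, standing in for the paper's modified GP model, which is not reproduced here.

```python
import numpy as np

def hh_step(V, m, h, n, I, dt):
    """One forward-Euler step of the classic Hodgkin-Huxley equations
    (membrane potential V in mV, time in ms, I in uA/cm^2)."""
    C, gNa, gK, gL = 1.0, 120.0, 36.0, 0.3        # uF/cm^2, mS/cm^2
    ENa, EK, EL = 50.0, -77.0, -54.4              # reversal potentials, mV
    am = 0.1 * (V + 40) / (1 - np.exp(-(V + 40) / 10))
    bm = 4.0 * np.exp(-(V + 65) / 18)
    ah = 0.07 * np.exp(-(V + 65) / 20)
    bh = 1.0 / (1 + np.exp(-(V + 35) / 10))
    an = 0.01 * (V + 55) / (1 - np.exp(-(V + 55) / 10))
    bn = 0.125 * np.exp(-(V + 65) / 80)
    INa = gNa * m**3 * h * (V - ENa)              # sodium current
    IK = gK * n**4 * (V - EK)                     # potassium current
    IL = gL * (V - EL)                            # leak current
    V += dt * (I - INa - IK - IL) / C
    m += dt * (am * (1 - m) - bm * m)
    h += dt * (ah * (1 - h) - bh * h)
    n += dt * (an * (1 - n) - bn * n)
    return V, m, h, n

# 50 ms at dt = 0.01 ms with constant drive; fixed-point arithmetic and
# piecewise-linear approximations of the exponentials above are typical
# FPGA cost-saving steps of the kind the paper targets.
V, m, h, n = -65.0, 0.05, 0.6, 0.32
trace = []
for _ in range(5000):
    V, m, h, n = hh_step(V, m, h, n, I=10.0, dt=0.01)
    trace.append(V)
```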
Small-scale, self-propagating combustion realized with on-chip porous silicon.
Piekiel, Nicholas W; Morris, Christopher J
2015-05-13
For small-scale energy applications, energetic materials represent a high energy density source that, in certain cases, can be accessed with a very small amount of energy input. Recent advances in microprocessing techniques allow for the implementation of a porous silicon energetic material onto a crystalline silicon wafer at the microscale; however, combustion at small length scales remains to be fully investigated, particularly with regard to the limitations imposed by increased relative heat loss. The present study explores the critical dimensions of an on-chip porous silicon energetic material (porous silicon + sodium perchlorate (NaClO4)) required to propagate combustion. We etched ∼97 μm wide and ∼45 μm deep porous silicon channels that burned at a steady rate of 4.6 m/s, remaining steady across 90° changes in direction. In an effort to minimize the potential on-chip footprint for energetic porous silicon, we also explored the minimum spacing between porous silicon channels. We demonstrated independent burning of porous silicon channels at a spacing of <40 μm. Using this spacing, it was possible to have a flame path length of >0.5 m on a chip surface area of 1.65 cm(2). Smaller porous silicon channels of ∼28 μm wide and ∼14 μm deep were also utilized. These samples propagated combustion, but at times, did so unsteadily. This result may suggest that we are approaching a critical length scale for self-propagating combustion in a porous silicon energetic material.
Georgeu, Daniella; Colvin, Christopher J; Lewin, Simon; Fairall, Lara; Bachmann, Max O; Uebel, Kerry; Zwarenstein, Merrick; Draper, Beverly; Bateman, Eric D
2012-07-16
Task-shifting is promoted widely as a mechanism for expanding antiretroviral treatment (ART) access. However, the evidence for nurse-initiated and managed ART (NIMART) in Africa is limited, and little is known about the key barriers and enablers to implementing NIMART programmes on a large scale. The STRETCH (Streamlining Tasks and Roles to Expand Treatment and Care for HIV) programme was a complex educational and organisational intervention implemented in the Free State Province of South Africa to enable nurses providing primary HIV/AIDS care to expand their roles and include aspects of care and treatment usually provided by physicians. STRETCH used a phased implementation approach and ART treatment guidelines tailored specifically to nurses. The effects of STRETCH on pre-ART mortality, ART provision, and the quality of HIV/ART care were evaluated through a randomised controlled trial. This study was conducted alongside the trial to develop a contextualised understanding of factors affecting the implementation of the programme. This study was a qualitative process evaluation using in-depth interviews and focus group discussions with patients, health workers, health managers, and other key informants as well as observation in clinics. Research questions focused on perceptions of STRETCH, changes in health provider roles, attitudes and patient relationships, and impact of the implementation context on trial outcomes. Data were analysed collaboratively by the research team using thematic analysis. NIMART appears to be highly acceptable among nurses, patients, and physicians. Managers and nurses expressed confidence in their ability to deliver ART successfully. This confidence developed slowly and unevenly, through a phased and well-supported approach that guided nurses through training, re-prescription, and initiation. The research also shows that NIMART changes the working and referral relationships between health staff, demands significant training and support, and faces workload and capacity constraints, and logistical and infrastructural challenges. Large-scale NIMART appears to be feasible and acceptable in the primary level public sector health services in South Africa. Successful implementation requires a comprehensive approach with: an incremental and well supported approach to implementation; clinical guidelines tailored to nurses; and significant health services reorganisation to accommodate the knock-on effects of shifts in practice.
Mägi, Reedik; Suleimanov, Yury V; Clarke, Geraldine M; Kaakinen, Marika; Fischer, Krista; Prokopenko, Inga; Morris, Andrew P
2017-01-11
Genome-wide association studies (GWAS) of single nucleotide polymorphisms (SNPs) have been successful in identifying loci contributing genetic effects to a wide range of complex human diseases and quantitative traits. The traditional approach to GWAS analysis is to consider each phenotype separately, despite the fact that many diseases and quantitative traits are correlated with each other, and often measured in the same sample of individuals. Multivariate analyses of correlated phenotypes have been demonstrated, by simulation, to increase power to detect association with SNPs, and thus may enable improved detection of novel loci contributing to diseases and quantitative traits. We have developed the SCOPA software to enable GWAS analysis of multiple correlated phenotypes. The software implements "reverse regression" methodology, which treats the genotype of an individual at a SNP as the outcome and the phenotypes as predictors in a general linear model. SCOPA can be applied to quantitative traits and categorical phenotypes, and can accommodate imputed genotypes under a dosage model. The accompanying META-SCOPA software enables meta-analysis of association summary statistics from SCOPA across GWAS. Application of SCOPA to two GWAS of high- and low-density lipoprotein cholesterol, triglycerides and body mass index, and subsequent meta-analysis with META-SCOPA, highlighted stronger association signals than univariate phenotype analysis at established lipid and obesity loci. The META-SCOPA meta-analysis also revealed a novel signal of association at genome-wide significance for triglycerides mapping to GPC5 (lead SNP rs71427535, p = 1.1 × 10^-8), which has not been reported in previous large-scale GWAS of lipid traits. The SCOPA and META-SCOPA software enable discovery and dissection of multiple phenotype association signals through implementation of a powerful reverse regression approach.
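The reverse-regression idea can be sketched in a few lines: treat the SNP dosage as the outcome and jointly test the phenotype predictors. This is an illustrative ordinary-least-squares version, not the SCOPA implementation.

```python
import numpy as np
from scipy import stats

def reverse_regression(dosage, phenotypes):
    """Reverse regression in the spirit of SCOPA: regress SNP dosage
    (0-2, possibly imputed) on a matrix of correlated phenotypes and
    F-test the joint fit against an intercept-only model.

    dosage     : (n,) array of genotype dosages at one SNP
    phenotypes : (n, k) array of phenotype values
    Returns (F statistic, p-value).
    """
    n, k = phenotypes.shape
    X = np.column_stack([np.ones(n), phenotypes])       # add intercept
    beta, *_ = np.linalg.lstsq(X, dosage, rcond=None)
    rss_full = np.sum((dosage - X @ beta) ** 2)
    rss_null = np.sum((dosage - dosage.mean()) ** 2)
    F = ((rss_null - rss_full) / k) / (rss_full / (n - k - 1))
    return F, stats.f.sf(F, k, n - k - 1)
```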
Evaluating a Social and Emotional Learning Curriculum, "Strong Kids", Implemented School-Wide
ERIC Educational Resources Information Center
Kramer, Thomas J.
2013-01-01
The goal of this study was to explore whether "Strong Kids" could result in improved social and emotional competence when implemented as a school-wide universal intervention. No prior studies have examined this question. This study also evaluated whether teachers could implement "Strong Kids" as it was designed and whether they…
ERIC Educational Resources Information Center
Gay, Ronald Lynn
2016-01-01
This study examined factors related to the implementation of a School Wide Positive Behavioral Intervention and Support (SWPBIS) program at a large middle school in the United States. Parent Teacher Student Association volunteers at the school reported that teacher fidelity to implementation of SWPBIS activities was inconsistent, threatening the…
ERIC Educational Resources Information Center
Pinkelman, Sarah E.; McIntosh, Kent; Rasplica, Caitlin K.; Berg, Tricia; Strickland-Cohen, M. Kathleen
2015-01-01
The purpose of this study was to identify the most important perceived enablers and barriers regarding sustainability of school-wide positive behavioral interventions and supports. School personnel representing 860 schools implementing or preparing to implement school-wide positive behavioral interventions and supports completed an open-ended…
ERIC Educational Resources Information Center
Broskey, Matthew
2017-01-01
This study focused on understanding teachers' personal and professional experiences that influence the fidelity of implementation of a school-wide positive behavior support (SWPBS) program within their classrooms. Research has focused on the implementation fidelity of school-wide positive support programs, academic impact on students, teacher…
ERIC Educational Resources Information Center
McIntosh, Kent; Predy, Larissa K.; Upreti, Gita; Hume, Amanda E.; Turri, Mary G.; Mathews, Susanna
2014-01-01
The purpose of this study was to assess the perceived importance of specific contextual variables for initial implementation and sustainability of School-Wide Positive Behavior Support (SWPBS). A large, national sample of 257 school team members completed the "School-Wide Universal Behavior Sustainability Index: School Teams", a…
Mathews, Juanita; Levin, Michael
2018-04-20
Breakthroughs in biomedicine and synthetic bioengineering require predictive, rational control over anatomical structure and function. Recent successes in manipulating cellular and molecular hardware have not been matched by progress in understanding the patterning software implemented during embryogenesis and regeneration. A fundamental capability gap is driving desired changes in growth and form to address birth defects and traumatic injury. Here we review new tools, results, and conceptual advances in an exciting emerging field: endogenous non-neural bioelectric signaling, which enables cellular collectives to make global decisions and implement large-scale pattern homeostasis. Spatially distributed electric circuits regulate gene expression, organ morphogenesis, and body-wide axial patterning. Developmental bioelectricity facilitates the interface to organ-level modular control points that direct patterning in vivo. Cracking the bioelectric code will enable transformative progress in bioengineering and regenerative medicine. Copyright © 2018 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Sushko, Gennady B.; Solov'yov, Ilia A.; Verkhovtsev, Alexey V.; Volkov, Sergey N.; Solov'yov, Andrey V.
2016-01-01
The concept of molecular mechanics force field has been widely accepted nowadays for studying various processes in biomolecular systems. In this paper, we suggest a modification for the standard CHARMM force field that permits simulations of systems with dynamically changing molecular topologies. The implementation of the modified force field was carried out in the popular program MBN Explorer, and, to support the development, we provide several illustrative case studies where dynamical topology is necessary. In particular, it is shown that the modified molecular mechanics force field can be applied for studying processes where rupture of chemical bonds plays an essential role, e.g., in irradiation- or collision-induced damage, and also in transformation and fragmentation processes involving biomolecular systems. Contribution to the Topical Issue "COST Action Nano-IBCT: Nano-scale Processes Behind Ion-Beam Cancer Therapy", edited by Andrey V. Solov'yov, Nigel Mason, Gustavo Garcia and Eugene Surdutovich.
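The essence of a dynamically changing topology can be illustrated with a toy bonded term that is removed once a rupture threshold is crossed; the functional forms, parameters, and rebinding rules in the actual MBN Explorer force field are more elaborate than this sketch.

```python
def bond_energy(r, r0=1.0, k=300.0, r_rupture=1.5, broken=False):
    """Harmonic bond that leaves the topology once stretched past a
    rupture threshold -- a toy version of the dynamical-topology idea.

    Returns (energy, broken); once broken, the bonded term vanishes
    and, in this simplified sketch, never re-forms.
    """
    if broken or r > r_rupture:
        return 0.0, True
    return 0.5 * k * (r - r0) ** 2, False

# Stretch past rupture, then relax: the bond stays broken afterwards.
state = False
for r in (1.0, 1.2, 1.6, 1.1):
    e, state = bond_energy(r, broken=state)
    print(r, round(e, 2), state)
```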
Preventive chemotherapy in human helminthiasis: theoretical and operational aspects
Chitsulo, L.; Engels, D.; Savioli, L.
2017-01-01
Preventive chemotherapy (PC), the large-scale distribution of anthelminthic drugs to population groups at risk, is the core intervention recommended by the WHO for reducing morbidity and transmission of the four main helminth infections, namely lymphatic filariasis, onchocerciasis, schistosomiasis and soil-transmitted helminthiasis. The strategy is widely implemented worldwide but its general theoretical foundations have not been described so far in a comprehensive and cohesive manner. Starting from the information available on the biological and epidemiological characteristics of helminth infections, as well as from the experience generated by disease control and elimination interventions across the world, we extrapolate the fundamentals and synthesise the principles that regulate PC and justify its implementation as a sound and essential public health intervention. The outline of the theoretical aspects of PC contributes to a thorough understanding of the different facets of this strategy and helps comprehend opportunities and limits of control and elimination interventions directed against helminth infections. PMID:22040463
Muñoz, C; Young, H; Antileo, C; Bornhardt, C
2009-01-01
This paper presents a sliding mode controller (SMC) for dissolved oxygen (DO) in an integrated nitrogen removal process carried out in a suspended biomass sequencing batch reactor (SBR). The SMC performance was compared against an auto-tuning PI controller with parameters adjusted at the beginning of the batch cycle. A method for cancelling the slow DO sensor dynamics was implemented by using a first-order model of the sensor. Tests in a lab-scale reactor showed that the SMC offers a better disturbance rejection capability than the auto-tuning PI controller, while providing reasonable performance over a wide range of operating conditions. Thus, SMC becomes an effective, robust, nonlinear tool for DO control in this process; it is also simple from a computational point of view, allowing its implementation in devices such as industrial programmable logic controllers (PLCs).
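A minimal discrete-time sketch of the approach follows, with illustrative gains, an assumed sensor time constant, and an assumed nominal actuation bias (the paper's tuning is plant-specific and is not reproduced here).

```python
import numpy as np

def smc_do_controller(do_ref, ys, ys_prev, dt, tau_s=30.0,
                      k_switch=0.2, u_min=0.0, u_max=1.0):
    """One sampling step of a sliding-mode DO controller that inverts
    a first-order sensor model to cancel the slow sensor lag.

    do_ref       : DO setpoint (mg/L)
    ys, ys_prev  : current and previous sensor readings (mg/L)
    tau_s        : assumed sensor time constant (s)
    Returns the normalized air-flow command in [u_min, u_max].
    """
    dys = (ys - ys_prev) / dt
    do_est = ys + tau_s * dys              # invert first-order sensor model
    s = do_ref - do_est                    # sliding surface: tracking error
    u = 0.5 + k_switch * np.tanh(s / 0.1)  # smoothed sign() limits chattering
    return float(np.clip(u, u_min, u_max))
```

The tanh term is a standard boundary-layer smoothing of the discontinuous sign() switching law; the 0.5 bias is a hypothetical nominal aeration level.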
Fales, B Scott; Levine, Benjamin G
2015-10-13
Methods based on a full configuration interaction (FCI) expansion in an active space of orbitals are widely used for modeling chemical phenomena such as bond breaking, multiply excited states, and conical intersections in small-to-medium-sized molecules, but these phenomena occur in systems of all sizes. To scale such calculations up to the nanoscale, we have developed an implementation of FCI in which electron repulsion integral transformation and several of the more expensive steps in σ vector formation are performed on graphical processing unit (GPU) hardware. When applied to a 1.7 × 1.4 × 1.4 nm silicon nanoparticle (Si72H64) described with the polarized, all-electron 6-31G** basis set, our implementation can solve for the ground state of the 16-active-electron/16-active-orbital CASCI Hamiltonian (more than 100,000,000 configurations) in 39 min on a single NVidia K40 GPU.
Kumar, Rajiv B; Goren, Nira D; Stark, David E; Wall, Dennis P; Longhurst, Christopher A
2016-05-01
The diabetes healthcare provider plays a key role in interpreting blood glucose trends, but few institutions have successfully integrated patient home glucose data in the electronic health record (EHR). Published implementations to date have required custom interfaces, which limit wide-scale replication. We piloted automated integration of continuous glucose monitor data in the EHR using widely available consumer technology for 10 pediatric patients with insulin-dependent diabetes. Establishment of a passive data communication bridge via a patient's/parent's smartphone enabled automated integration and analytics of patient device data within the EHR between scheduled clinic visits. It is feasible to utilize available consumer technology to assess and triage home diabetes device data within the EHR, and to engage patients/parents and improve healthcare provider workflow. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association.
A Wide-Band High-Gain Compact SIS Receiver Utilizing a 300-μW SiGe IF LNA
NASA Astrophysics Data System (ADS)
Montazeri, Shirin; Grimes, Paul K.; Tong, Cheuk-Yu Edward; Bardin, Joseph C.
2017-06-01
Low-power low-noise amplifiers integrated with superconductor-insulator-superconductor (SIS) mixers are required to enable the implementation of large-scale focal plane arrays. In this work, a 220-GHz SIS mixer has been integrated with a high-gain, broad-band, low-power IF amplifier into a compact receiver module. The low noise amplifier (LNA) was specifically designed to match the SIS output impedance and contributes less than 7 K to the system noise temperature over the 4-8 GHz IF frequency range. A receiver noise temperature of 30-45 K was measured for a local oscillator frequency of 220 GHz over an IF spanning 4-8 GHz. The LNA power dissipation was only 300 μW. To the best of the authors' knowledge, this is the lowest power consumption reported for a high-gain wide-band LNA directly integrated with an SIS mixer.
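The benefit of placing a high-gain, low-noise IF amplifier right at the mixer output follows from the Friis cascade formula; the stage numbers below are illustrative assumptions, not the measured values from the paper.

```python
def cascaded_noise_temperature(stages):
    """Friis formula: T_sys = T1 + T2/G1 + T3/(G1*G2) + ...

    stages: list of (noise_temperature_K, linear_gain) tuples, in
    signal-chain order.
    """
    t_sys, gain = 0.0, 1.0
    for t, g in stages:
        t_sys += t / gain
        gain *= g
    return t_sys

# Illustrative numbers only: a mixer with conversion loss, the
# integrated IF LNA, then warm back-end electronics. High gain early
# in the chain makes the warm stages' contribution negligible.
mixer = (20.0, 10 ** (-3 / 10))   # ~3 dB conversion loss
lna   = (5.0, 10 ** (35 / 10))    # high-gain, low-noise IF amplifier
room  = (300.0, 10 ** (30 / 10))  # warm back-end
print(round(cascaded_noise_temperature([mixer, lna, room]), 1), "K")
```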
Pound, Joe Mathews; Miller, John Allen; George, John E; Fish, Durland
2009-08-01
The Northeast Area-wide Tick Control Project (NEATCP) was funded by the United States Department of Agriculture (USDA) as a large-scale cooperative demonstration project of the USDA-Agricultural Research Service (ARS)-patented 4-Poster tick control technology (Pound et al. 1994) involving the USDA-ARS and a consortium of universities, state agencies, and a consulting firm at research locations in the five states of Connecticut (CT), Maryland (MD), New Jersey (NJ), New York (NY), and Rhode Island (RI). The stated objective of the project was "A community-based field trial of ARS-patented tick control technology designed to reduce the risk of Lyme disease in northeastern states." Here we relate the rationale and history of the technology, a chronological listing of events leading to implementation of the project, the original protocol for selecting treatment, and control sites, and protocols for deployment of treatments, sampling, assays, data analyses, and estimates of efficacy.
Verloo, Henk; Desmedt, Mario; Morin, Diane
2017-09-01
To evaluate two psychometric properties of the French versions of the Evidence-Based Practice Beliefs and Evidence-Based Practice Implementation scales, namely their internal consistency and construct validity. The Evidence-Based Practice Beliefs and Evidence-Based Practice Implementation scales developed by Melnyk et al. are recognised as valid, reliable instruments in English. However, no psychometric validation of their French versions existed. Secondary analysis of a cross-sectional survey. Source data came from a cross-sectional descriptive study sample of 382 nurses and other allied healthcare providers. Cronbach's alpha was used to evaluate internal consistency, and principal axis factor analysis with varimax rotation was computed to determine construct validity. The French Evidence-Based Practice Beliefs and Evidence-Based Practice Implementation scales showed excellent reliability, with Cronbach's alphas close to the scores established for Melnyk et al.'s original versions. Principal axis factor analysis showed medium-to-high factor loading scores without obtaining collinearity. Principal axis factor analysis with varimax rotation of the 16-item Evidence-Based Practice Beliefs scale resulted in a four-factor loading structure. Principal axis factor analysis with varimax rotation of the 17-item Evidence-Based Practice Implementation scale revealed a two-factor loading structure. Further research should attempt to understand why the French Evidence-Based Practice Implementation scale showed a two-factor loading structure while Melnyk et al.'s original has only one. The French versions of the Evidence-Based Practice Beliefs and Evidence-Based Practice Implementation scales can both be considered valid and reliable instruments for measuring Evidence-Based Practice beliefs and implementation. The results suggest that the French Evidence-Based Practice Beliefs and Evidence-Based Practice Implementation scales are valid and reliable and can therefore be used to evaluate the effectiveness of organisational strategies aimed at increasing professionals' confidence in Evidence-Based Practice, supporting its use and implementation. © 2017 John Wiley & Sons Ltd.
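Internal consistency in such validations is typically quantified with Cronbach's alpha; a compact sketch of the standard formula (illustrative, independent of the study's data):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score)).
    Values near 1 indicate that the items measure a common construct."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)          # per-item sample variance
    total_var = items.sum(axis=1).var(ddof=1)      # variance of summed scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)
```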
Evaluation of low-cost commercial-off-the-shelf autopilot systems for SUAS operations
NASA Astrophysics Data System (ADS)
Brown, Calvin Thomas
With the increase in unmanned aircraft system (UAS) operations, there is a need for a structured process to evaluate different commercially available systems, particularly autopilots. The Remotely Operated Aircraft Management, Interpretation, and Navigation from Ground (ROAMING) scale was developed to meet this need. This scale is a modification of the widely accepted Handling Qualities Rating scale developed by George Cooper and Robert Harper Jr. The Cooper-Harper scale allows pilots to rate a vehicle's performance in completing some task. Similarly, the ROAMING scale allows UAS operators to evaluate the management and observability of UAS in completing some task. The standardized evaluative process consists of cost, size, weight, and power (SWAP) analysis, ease of implementation through a procedural description of setup, a ROAMING scale rating, a slightly modified NASA TLX rating, and a comparison of manual operation to autonomous operation of the task. This standard for evaluation of autopilots and their software will lead to better understanding of the workload placed on UAS operators and indicate where improvements to design and operational procedures can be made. An assortment of low-cost commercial-off-the-shelf (COTS) autopilots was selected for use in the development of the evaluation, and results of these tests demonstrate the commonalities and differences in these systems.
Martini, Roberto; Barthelat, Francois
2016-10-13
Protective systems that are simultaneously hard to puncture and compliant in flexion are desirable but difficult to achieve, because hard materials are usually stiff. However, this conflicting design requirement can be overcome by combining plates of a hard material with a softer substrate, a strategy widely found in natural armors such as fish scales or osteoderms. Man-made segmented armors have a long history, but their systematic implementation in modern protective systems is still hampered by a limited understanding of the underlying mechanics, by the lack of design and optimization guidelines, and by challenges in cost-efficient manufacturing. This study addresses these limitations with a flexible bioinspired armor based on overlapping ceramic scales. The fabrication combines laser engraving and a stretch-and-release method which allows fine tuning of the size and overlap of the scales and which is suitable for large-scale fabrication. Compared to a continuous layer of uniform ceramic, our fish-scale-like armor is not only more flexible, but also more resistant to puncture and more damage tolerant. The proposed armor is also about ten times more puncture resistant than soft elastomers, making it a very attractive alternative to traditional protective equipment.
Understanding the importance of an energy crisis
NASA Astrophysics Data System (ADS)
Mechtenberg, Abigail Reid
Human development and energy in general, and electrical energy specifically, co-exist seamlessly in high-HDI countries, where reliability and availability are greater than 99%. In numerous low-HDI countries, there is 2-50% electric grid availability, with reliability at or below 50% due to load shedding and faults. In Africa, solar, wind, biomass and hydroelectric energy production are cited to meet growing demand and increase reliability and availability; however, the capital costs are greater than the ability to pay for wide-scale implementation. Since the 1970s, the United States has continued to argue over the new sustainable energy infrastructure solution(s), with the result that no new infrastructure has been built for wide-scale implementation. Together, the world is facing the daunting task of averting an energy crisis in developed countries and facing energy crises in developing countries. This thesis explores the importance of energy crises: past, current, and future. The first part argues that the United States is not on a pathway to prevent an energy crisis, based on an analysis of 1986 and 2004 niche and status-quo manufacturing of light-duty vehicles. The second part answers the question of what an energy crisis looks like by exploring and investigating current electrical energy crises in Fort Portal, Uganda. This part used both anthropological and physics-education empowerment research to co-design and build for various energy crisis situations in hospitals, schools, and businesses, all from locally available materials and expertise. Finally, looking into the future of the US light-duty vehicle, I design a new hybrid vehicle powertrain (called transition mode hybrid). This third part describes my new patent as a way to avert an energy crisis in the light-duty transportation sector.
75 FR 42633 - Business Continuity and Disaster Recovery
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-22
... event of a wide-scale disruption affecting such entities' trading or clearing operations. These proposed... objective, in the event of a wide-scale disruption. The proposed amendments also revise application guidance... overall resilience of the U.S. financial system in the event of a wide-scale disruption, and is the...
Adapting Certified Safe Farm to North Carolina Agriculture: An Implementation Study.
Storm, Julia F; LePrevost, Catherine E; Tutor-Marcom, Robin; Cope, W Gregory
2016-01-01
Certified Safe Farm (CSF) is a multimodal safety and health program developed and assessed through multiple controlled intervention studies in Iowa. Although developed with the intent to be broadly applicable to agriculture, CSF has not been widely implemented outside the midwestern United States. This article describes the CSF implementation process in North Carolina (NC), as piloted on a large-scale in three agriculturally diverse and productive counties of NC, and reports its effectiveness using the Reach Effectiveness Adoption Implementation Maintenance (RE-AIM) framework. Implementation involved (1) capacity building through safety and health training, (2) adaptation of components of Iowa's CSF model to NC agriculture, (3) marketing and recruitment, and (4) formative evaluation, including an online survey and focus group discussion. From 2009 to 2012, 113 farms participated in at least one component of the CSF intervention, representing a NC farm participation rate of 3.1% in the study area. A major adaptation of NC implementation was the utilization of NC Cooperative Extension as the local driver of implementation in contrast to local AgriSafe clinics in Iowa. The most innovative adaptation to CSF components was the development of a defined economic incentive in the form of a cost-share program. The RE-AIM framework was found to be useful and relevant to the field of agricultural health and safety translational research. This study provides effectiveness measures and implementation alternatives useful for those considering implementing CSF. It informs current efforts to move CSF from research to practice through the National Sustainable Model CSF Program initiative.
Stucki, Gerold; Zampolini, Mauro; Juocevicius, Alvydas; Negrini, Stefano; Christodoulou, Nicolas
2017-04-01
Since its launch in 2001, relevant international, regional and national PRM bodies have aimed to implement the International Classification of Functioning, Disability and Health (ICF) in Physical and Rehabilitation Medicine (PRM), thereby contributing to the development of suitable practical tools. These tools are available for implementing the ICF in day-to-day clinical practice, for standardized reporting of functioning outcomes in quality management and research, and for guiding evidence-informed policy. Educational efforts have reinforced PRM physicians' and other rehabilitation professionals' ICF knowledge, and numerous implementation projects have explored how the ICF is applied in clinical practice, research and policy. Largely lacking, though, is the system-wide implementation of the ICF in day-to-day practice across all rehabilitation services of national health systems. In Europe, system-wide implementation of the ICF requires interaction between practice, science and governance. Considering its mandate, the UEMS PRM Section and Board have decided to lead a European effort towards system-wide ICF implementation in PRM, rehabilitation and health care at large, in interaction with governments, non-governmental actors and the private sector, and aligned with ISPRM's collaboration plan with WHO. In this paper we present the current PRM internal and external policy agenda towards system-wide ICF implementation and the corresponding implementation action plan, while highlighting priority action steps: promotion of ICF-based standardized reporting in national quality management and assurance programs, development of unambiguous rehabilitation service descriptions using the International Classification System for Service Organization in Health-related Rehabilitation, development of Clinical Assessment Schedules, qualitative linkage and quantitative mapping of data to the ICF, and the cultural adaptation of the ICF Clinical Data Collection Tool in European languages.
Merks, Piotr; Swieczkowski, Damian; Byliniak, Michal; Drozd, Mariola; Krupa, Katarzyna; Jaguszewski, Milosz; Brindley, David A; Naughton, Bernard D
2018-01-01
By February 2019, the Polish pharmaceutical industry, community and hospital pharmacies, wholesalers and parallel traders must all comply with the EU-wide Falsified Medicines Directive (FMD) legislation (2011/62/EU), to ensure that no medicinal product is dispensed to a patient without proper tracking and authentication. Here we describe how Poland is complying with the new EU regulations, the actions that have been taken to incorporate the FMD into Polish Pharmaceutical Law and whether or not these actions are sufficient. We found that Poland is only partially compliant with the FMD and further actions need to be undertaken to fully meet the Delegated Act (DA) requirements. Moreover, there is lack of awareness in Poland about the prevalence of falsified medication and the time scale required for implementation of the DA. Based on our findings, we suggest that a public awareness campaign should be started to raise awareness of the increased number of falsified medicines in the legal supply chain and that drug authorisation systems are implemented by Polish pharmacies to support the FMD. PMID:29445453
Resource Efficient Hardware Architecture for Fast Computation of Running Max/Min Filters
Torres-Huitzil, Cesar
2013-01-01
Running max/min filters on rectangular kernels are widely used in many digital signal and image processing applications. Filtering with a k × k kernel requires k² − 1 comparisons per sample in a direct implementation; thus, the cost scales steeply with the kernel size k. Faster computation can be achieved by kernel decomposition and by using constant-time one-dimensional algorithms on custom hardware. This paper presents a hardware architecture for real-time computation of running max/min filters based on the van Herk/Gil-Werman (HGW) algorithm. The proposed architecture uses fewer computation and memory resources than previously reported architectures when targeted to Field Programmable Gate Array (FPGA) devices. Implementation results show that the architecture can compute max/min filters on 1024 × 1024 images with kernels up to 255 × 255 in around 8.4 milliseconds (120 frames per second) at a clock frequency of 250 MHz. The implementation is highly scalable in the kernel size, with a good performance/area tradeoff suitable for embedded applications. The applicability of the architecture is shown for local adaptive image thresholding. PMID:24288456
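For readers unfamiliar with the algorithm named above: the van Herk/Gil-Werman trick reduces the per-sample cost of a 1D running max to roughly three comparisons, independent of the window size, by combining per-block prefix and suffix maxima; the 2D filter then follows from row/column kernel decomposition. The following minimal Python sketch illustrates the classical 1D algorithm, not the paper's hardware architecture.

```python
def hgw_running_max(f, k):
    # van Herk/Gil-Werman running max over all windows f[i:i+k].
    # Two auxiliary scans (per-block prefix max g and suffix max h)
    # give a comparison count per sample that is constant in k.
    n = len(f)
    g = [0] * n  # prefix max within each length-k block
    h = [0] * n  # suffix max within each length-k block
    for i in range(n):
        g[i] = f[i] if i % k == 0 else max(g[i - 1], f[i])
    for i in range(n - 1, -1, -1):
        last = (i % k == k - 1) or (i == n - 1)
        h[i] = f[i] if last else max(h[i + 1], f[i])
    # window [i, i+k-1] covers the tail of one block and the head of the next
    return [max(h[i], g[i + k - 1]) for i in range(n - k + 1)]

print(hgw_running_max([3, 1, 4, 1, 5, 9, 2, 6], 3))  # [4, 4, 5, 9, 9, 9]
```

Replacing max with min gives the erosion counterpart; the paper's hardware design additionally restructures these scans for streaming pixels and on-chip memory reuse.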
Hi-Corrector: a fast, scalable and memory-efficient package for normalizing large-scale Hi-C data.
Li, Wenyuan; Gong, Ke; Li, Qingjiao; Alber, Frank; Zhou, Xianghong Jasmine
2015-03-15
Genome-wide proximity ligation assays, e.g. Hi-C and its variant TCC, have recently become important tools to study spatial genome organization. Removing biases from chromatin contact matrices generated by such techniques is a critical preprocessing step of subsequent analyses. The continuing decline of sequencing costs has led to an ever-improving resolution of Hi-C data, resulting in very large matrices of chromatin contacts. Such large matrices, however, pose a great challenge to the memory usage and speed of normalization. Therefore, there is an urgent need for fast and memory-efficient methods for normalization of Hi-C data. We developed Hi-Corrector, an easy-to-use, open source implementation of the Hi-C data normalization algorithm. Its salient features are (i) scalability: the software is capable of normalizing Hi-C data of any size in reasonable times; (ii) memory efficiency: the sequential version can run on any single computer with very limited memory, no matter how little; (iii) fast speed: the parallel version can run very fast on multiple computing nodes with limited local memory. The sequential version is implemented in ANSI C and can be easily compiled on any system; the parallel version is implemented in ANSI C with the MPI library (a standardized and portable parallel environment designed for solving large-scale scientific problems). The package is freely available at http://zhoulab.usc.edu/Hi-Corrector/. © The Author 2014. Published by Oxford University Press.
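The abstract does not restate the normalization algorithm itself. As a rough sketch of the kind of computation involved, the NumPy fragment below performs ICE-style iterative row/column balancing of a symmetric contact matrix; this is an illustration of iterative correction in general, under the assumption that Hi-Corrector implements a scheme of this family, not a transcription of its parallel implementation.

```python
import numpy as np

def iterative_correction(contacts, n_iter=50):
    # Repeatedly divide a symmetric contact matrix by its row biases
    # until row sums are approximately uniform (ICE-style balancing).
    m = np.asarray(contacts, dtype=float).copy()
    bias = np.ones(m.shape[0])
    for _ in range(n_iter):
        s = m.sum(axis=1)
        s /= s[s != 0].mean()   # scale so the mean nonzero bias is 1
        s[s == 0] = 1.0         # leave empty rows untouched
        m /= np.outer(s, s)     # symmetric update of rows and columns
        bias *= s
    return m, bias              # balanced matrix and per-bin biases
```

The memory argument in the abstract is visible here: the working set is the full n × n matrix, which is why a scalable implementation presumably has to process the matrix in chunks and, in the parallel version, distribute them across MPI ranks.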
NASA Astrophysics Data System (ADS)
Rerucha, Simon; Sarbort, Martin; Hola, Miroslava; Cizek, Martin; Hucl, Vaclav; Cip, Ondrej; Lazar, Josef
2016-12-01
Homodyne detection with only a single detector represents a promising approach in interferometric applications, since it enables a significant reduction of optical system complexity while preserving the fundamental resolution and dynamic range of single-frequency laser interferometers. We present the design, implementation and analysis of algorithmic methods for computational processing of the single-detector interference signal based on parallel pipelined processing suitable for real-time implementation on a programmable hardware platform (e.g. an FPGA, Field Programmable Gate Array, or an SoC, System on Chip). The algorithmic methods incorporate (a) the scaling, filtering, demodulation and mixing of the single-detector (sine) signal necessary for reconstruction of the second (cosine) quadrature signal, followed by a conic-section projection in the Cartesian plane, as well as (b) the phase unwrapping together with the goniometric and linear transformations needed for scale linearization and periodic error correction. The digital computing scheme was designed for bandwidths up to tens of megahertz, which would allow measuring displacements at velocities of around half a metre per second. The algorithmic methods were tested in real-time operation with a PC-based reference implementation that took advantage of pipelined processing by balancing the computational load among multiple processor cores. The results indicate that the algorithmic methods are suitable for a wide range of applications [3] and that they bring fringe-counting interferometry closer to industrial applications thanks to its optical setup simplicity and robustness, computational stability, scalability and cost-effectiveness.
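As an offline illustration of the two processing stages (quadrature reconstruction, then phase unwrapping and scaling), the sketch below uses a Hilbert transform to synthesize the missing cosine quadrature. Note that the paper's real-time pipeline reconstructs it by demodulation and mixing instead, and the double-pass 4π phase-to-displacement factor here is an assumption about the optical setup.

```python
import numpy as np
from scipy.signal import hilbert

def displacement_from_single_detector(u, wavelength=633e-9):
    # Offline stand-in for the two stages described in the abstract:
    # (a) recover the cosine quadrature (here via the analytic signal),
    # (b) unwrap the interferometric phase and convert to displacement.
    u = u - np.mean(u)                        # remove the DC offset
    analytic = hilbert(u)                     # u + j*H{u}: both quadratures
    phase = np.unwrap(np.angle(analytic))
    return phase * wavelength / (4 * np.pi)   # assumes a double-pass setup
```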
ERIC Educational Resources Information Center
Schelling, Amy L.; Harris, Monica L.
2016-01-01
Implementation of school-wide positive behavioral interventions and supports (SWPBIS) in K-12 schools is well documented in the literature. However, far less documentation can be found in the literature related to its implementation with students with significant intellectual and other developmental disabilities being served in either typical or…
ERIC Educational Resources Information Center
Fox, Lise; Veguilla, Myrna; Perez Binder, Denise
2014-01-01
The Technical Assistance Center on Social Emotional Intervention for Young Children (TACSEI) Roadmap on "Data Decision-Making and Program-Wide Implementation of the Pyramid Model" provides programs with guidance on how to collect and use data to ensure the implementation of the Pyramid Model with fidelity and decision-making that…
ERIC Educational Resources Information Center
Weinberger, Yehudith
2018-01-01
The study described here analyses a faculty-wide change designed to foster the communication proficiencies of students in a large teacher education college, gathering data from various sources over three years of the new agenda's implementation. Qualitative and quantitative data analysis revealed that implementation was progressing on two distinct…
ERIC Educational Resources Information Center
Bethune, Keri S.
2017-01-01
Fidelity of implementation of School-Wide Positive Behavioral Interventions and Supports (SWPBIS) procedures within schools is critical to the success of the program. Coaching has been suggested as one approach to helping ensure accuracy of implementation of SWPBIS plans. This study used a multiple baseline across participants design to examine…
ERIC Educational Resources Information Center
Rieffannacht, Kimberlie Beth
2016-01-01
The purpose of this transcendental phenomenological study was to describe lived experience during School Wide Positive Behavior Support (SWPBS) implementation for School Wide Positive Behavior coaches in Pennsylvania public schools. Participants, identified as co-researchers throughout this study, included 11 SWPBS coaches selected from seven…
A web access script language to support clinical application development.
O'Kane, K C; McColligan, E E
1998-02-01
This paper describes the development of a script language to support the implementation of decentralized clinical information applications on the World Wide Web (Web). The goal of this work is to facilitate construction of low-overhead, fully functional clinical information systems that can be accessed anywhere by low-cost Web browsers to search, retrieve and analyze stored patient data. The Web provides a model of network access to databases on a global scale. Although it was originally conceived as a means to exchange scientific documents, Web browsers and servers currently support access to a wide variety of audio, video, graphical and text-based data for a rapidly growing community. Access to these services is via inexpensive client software browsers that connect to servers by means of the open architecture of the Internet. In this paper, the design and implementation of a script language that supports the development of low-cost, Web-based, distributed clinical information systems for both Internet and intranet use is presented. The language is based on the Mumps language and, consequently, supports many legacy applications with few modifications. Several enhancements, however, have been made to support modern programming practices and the Web interface. The interpreter for the language also supports standalone program execution on Unix, MS-Windows, OS/2 and other operating systems.
Samwald, Matthias; Xu, Hong; Blagec, Kathrin; Empey, Philip E; Malone, Daniel C; Ahmed, Seid Mussa; Ryan, Patrick; Hofer, Sebastian; Boyce, Richard D
2016-01-01
Pre-emptive pharmacogenomic (PGx) testing of a panel of genes may be easier to implement and more cost-effective than reactive pharmacogenomic testing if a sufficient number of medications are covered by a single test and future medication exposure can be anticipated. We analysed the incidence of exposure of individual patients in the United States to multiple drugs for which pharmacogenomic guidelines are available (PGx drugs) within a selected four-year period (2009-2012) in order to identify and quantify the incidence of pharmacotherapy in a nation-wide patient population that could be impacted by pre-emptive PGx testing based on currently available clinical guidelines. In total, 73 024 095 patient records from private insurance, Medicare Supplemental and Medicaid were included. Patients enrolled in Medicare Supplemental aged ≥ 65 or Medicaid aged 40-64 had the highest incidence of PGx drug use, with approximately half of the patients receiving at least one PGx drug during the four-year period and one fourth to one third of patients receiving two or more PGx drugs. These data suggest that exposure to multiple PGx drugs is common and that it may be beneficial to implement wide-scale pre-emptive genomic testing. Future work should therefore concentrate on investigating the cost-effectiveness of multiplexed pre-emptive testing strategies.
Using web technology and Java mobile software agents to manage outside referrals.
Murphy, S. N.; Ng, T.; Sittig, D. F.; Barnett, G. O.
1998-01-01
A prototype, web-based referral application was created with the objective of providing outside primary care providers (PCPs) the means to refer patients to the Massachusetts General Hospital and the Brigham and Women's Hospital. The application was designed to achieve the two primary objectives of providing the consultant with enough data to make decisions even at the initial visit, and providing the PCP with a prompt response from the consultant. The system uses a web browser/server to initiate the referral and Java mobile software agents to support the workflow of the referral. This combination provides a light client implementation that can run on a wide variety of hardware and software platforms found in the office of the PCP. The implementation can guarantee a high degree of security for the computer of the PCP. Agents can be adapted to support the wide variety of data types that may be used in referral transactions, including reports with complex presentation needs and scanned (faxed) images. Agents can be delivered to the PCP as running applications that can perform ongoing queries and alerts at the office of the PCP. Finally, the agent architecture is designed to scale in a natural and seamless manner for unforeseen future needs. PMID:9929190
The Jade File System. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Rao, Herman Chung-Hwa
1991-01-01
File systems have long been the most important and most widely used form of shared permanent storage. File systems in traditional time-sharing systems, such as Unix, support a coherent sharing model for multiple users. Distributed file systems implement this sharing model in local area networks. However, most distributed file systems fail to scale from local area networks to an internet. Four characteristics of scalability were recognized: size, wide area, autonomy, and heterogeneity. Owing to size and wide area, techniques such as broadcasting, central control, and central resources, which are widely adopted by local area network file systems, are not adequate for an internet file system. An internet file system must also support the notion of autonomy because an internet is made up of a collection of independent organizations. Finally, heterogeneity is the nature of an internet file system, not only because of its size, but also because of the autonomy of the organizations in an internet. The Jade File System, which provides a uniform way to name and access files in the internet environment, is presented. Jade is a logical system that integrates a heterogeneous collection of existing file systems, where heterogeneous means that the underlying file systems support different file access protocols. Because of autonomy, Jade is designed under the restriction that the underlying file systems may not be modified. In order to avoid the complexity of maintaining an internet-wide, global name space, Jade permits each user to define a private name space. In Jade's design, we pay careful attention to avoiding unnecessary network messages between clients and file servers in order to achieve acceptable performance. Jade's name space supports two novel features: (1) it allows multiple file systems to be mounted under one directory; and (2) it permits one logical name space to mount other logical name spaces. A prototype of Jade was implemented to examine and validate its design. The prototype consists of interfaces to the Unix File System, the Sun Network File System, and the File Transfer Protocol.
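The two name-space features are easy to picture in code. The toy Python sketch below models a Jade-style logical name space as a mount table in which a mounted backend may itself be another name space and several backends may share one directory; the DictFS adapter, its get protocol, and the longest-prefix resolution rule are illustrative assumptions, not details taken from the thesis.

```python
class DictFS:
    # Trivial stand-in for an underlying file-system adapter.
    def __init__(self, files):
        self.files = files

    def get(self, path):
        return self.files.get(path)


class NameSpace:
    # A private, logical name space: a mount table from path prefixes
    # to backends, where a backend is an adapter or another NameSpace.
    def __init__(self):
        self.mounts = {}  # prefix -> list of backends (searched in order)

    def mount(self, prefix, backend):
        self.mounts.setdefault(prefix, []).append(backend)

    def resolve(self, path):
        # Longest-prefix match, then delegate to each backend in turn.
        for prefix in sorted(self.mounts, key=len, reverse=True):
            if path.startswith(prefix):
                rest = path[len(prefix):]
                for b in self.mounts[prefix]:
                    hit = b.resolve(rest) if isinstance(b, NameSpace) else b.get(rest)
                    if hit is not None:
                        return hit
        return None


root, docs = NameSpace(), NameSpace()
docs.mount("/papers", DictFS({"/jade.ps": "thesis"}))
root.mount("/docs", docs)                     # a name space mounting a name space
root.mount("/bin", DictFS({"/ls": "bits"}))   # two file systems mounted
root.mount("/bin", DictFS({"/cat": "bits"}))  # under one directory
print(root.resolve("/docs/papers/jade.ps"))   # -> thesis
print(root.resolve("/bin/cat"))               # -> bits
```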
Large-scale Linear Optimization through Machine Learning: From Theory to Practical System Design and Implementation (AFRL-AFOSR-JP-TR-2016-0073)
2016-08-10
...performances on various machine learning tasks and it naturally lends itself to fast parallel implementations. Despite this, very little work has been...
Implementation of the Agitated Behavior Scale in the Electronic Health Record.
Wilson, Helen John; Dasgupta, Kritis; Michael, Kathleen
The purpose of the study was to implement the Agitated Behavior Scale through an electronic health record and to evaluate the usability of the scale in a brain injury unit at a rehabilitation hospital. A quality improvement project was conducted in the brain injury unit at a large rehabilitation hospital with registered nurses as participants using convenience sampling. The project consisted of three phases and included education, implementation of the scale in the electronic health record, and administration of the survey questionnaire, which utilized the System Usability Scale. The Agitated Behavior Scale was found to be usable, and there was 92.2% compliance with the use of the electronic Agitated Behavior Scale. The Agitated Behavior Scale was effectively implemented in the electronic health record and was found to be usable in the assessment of agitation. Utilization of the scale through the electronic health record on a daily basis will allow for early identification of agitation in patients with traumatic brain injury and enable prompt interventions to manage agitation.
NASA Astrophysics Data System (ADS)
Voldoire, Aurore; Decharme, Bertrand; Pianezze, Joris; Lebeaupin Brossier, Cindy; Sevault, Florence; Seyfried, Léo; Garnier, Valérie; Bielli, Soline; Valcke, Sophie; Alias, Antoinette; Accensi, Mickael; Ardhuin, Fabrice; Bouin, Marie-Noëlle; Ducrocq, Véronique; Faroux, Stéphanie; Giordani, Hervé; Léger, Fabien; Marsaleix, Patrick; Rainaud, Romain; Redelsperger, Jean-Luc; Richard, Evelyne; Riette, Sébastien
2017-11-01
This study presents the principles of the new coupling interface based on the SURFEX multi-surface model and the OASIS3-MCT coupler. As SURFEX can be plugged into several atmospheric models, it can be used in a wide range of applications, from global and regional coupled climate systems to high-resolution numerical weather prediction systems or very fine-scale models dedicated to process studies. The objective of this development is to build and share a common structure for the atmosphere-surface coupling of all these applications, involving on the one hand atmospheric models and on the other hand ocean, ice, hydrology, and wave models. The numerical and physical principles of SURFEX interface between the different component models are described, and the different coupled systems in which the SURFEX OASIS3-MCT-based coupling interface is already implemented are presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fang, Aiman; Laguna, Ignacio; Sato, Kento
Future high-performance computing systems may face frequent failures with their rapid increase in scale and complexity. Resilience to faults has become a major challenge for large-scale applications running on supercomputers, which demands fault tolerance support for prevalent MPI applications. Among failure scenarios, process failures are one of the most severe issues as they usually lead to termination of applications. However, the widely used MPI implementations do not provide mechanisms for fault tolerance. We propose FTA-MPI (Fault Tolerance Assistant MPI), a programming model that provides support for failure detection, failure notification and recovery. Specifically, FTA-MPI exploits a try/catch model that enables failure localization and transparent recovery of process failures in MPI applications. We demonstrate FTA-MPI with synthetic applications and a molecular dynamics code CoMD, and show that FTA-MPI provides high programmability for users and enables convenient and flexible recovery of process failures.
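The abstract names a try/catch model but does not show its API. The mpi4py sketch below illustrates the general pattern (localize a failure to a try block, then roll back to a checkpoint and continue rather than terminate); it is not FTA-MPI's actual interface, and genuine recovery from process failures additionally requires a fault-tolerant MPI runtime such as ULFM, since stock MPI implementations abort the whole job.

```python
from mpi4py import MPI
import pickle

comm = MPI.COMM_WORLD
CKPT = f"ckpt_{comm.rank}.pkl"  # hypothetical per-rank checkpoint file

def save(step, state):
    with open(CKPT, "wb") as f:
        pickle.dump((step, state), f)

def load():
    with open(CKPT, "rb") as f:
        return pickle.load(f)

step, state = 0, 0.0
save(step, state)
while step < 100:
    try:
        # Stand-in compute/communication phase; on a fault-aware
        # runtime a peer's failure surfaces here as an MPI error.
        state = comm.allreduce(state + 1.0, op=MPI.SUM) / comm.size
        step += 1
        if step % 10 == 0:
            save(step, state)
    except MPI.Exception:
        # Failure localized to this try block: restore the last
        # checkpoint and re-execute instead of aborting the job.
        step, state = load()
```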
Understanding the cell-to-module efficiency gap in Cu(In,Ga)(S,Se)2 photovoltaics scale-up
NASA Astrophysics Data System (ADS)
Bermudez, Veronica; Perez-Rodriguez, Alejandro
2018-06-01
Cu(In,Ga)(S,Se)2 (CIGS) solar cells show record efficiencies comparable to those of crystalline Si-based technologies. Their industrial module production costs are also comparable to those of Si photovoltaics in spite of their much lower production volume. However, the competitiveness of CIGS is compromised by the difference in performance between cell and module scales, known as the cell-to-module efficiency gap, which is significantly higher than in competing industrial photovoltaic technologies. In this Review, we quantify the main cell-to-module efficiency loss mechanisms and discuss the various strategies explored in academia and industry to reduce the efficiency gap: new transparent conductive oxides, hybrid modularization approaches and the use of wide-bandgap solar absorbers in the 1.4-1.5 eV range. To implement these strategies, research gaps relating to various device layers need to be filled.
MOBE-ChIP: Probing Cell Type-Specific Binding Through Large-Scale Chromatin Immunoprecipitation.
Wang, Shenqi; Lau, On Sun
2018-01-01
In multicellular organisms, the initiation and maintenance of specific cell types often require the activity of cell type-specific transcriptional regulators. Understanding their roles in gene regulation is crucial, but probing their DNA targets in vivo, especially in a genome-wide manner, remains a technical challenge given their limited expression. To improve the sensitivity of chromatin immunoprecipitation (ChIP) for detecting cell type-specific signals, we have developed the Maximized Objects for Better Enrichment (MOBE)-ChIP, where ChIP is performed at a substantially larger experimental scale and under low background conditions. Here, we describe the procedure in the study of transcription factors in the model plant Arabidopsis. However, with some modifications, the technique can also be implemented in other systems. Besides cell type-specific studies, MOBE-ChIP can also be used as a general strategy to improve ChIP signals.
Machine Learning Toolkit for Extreme Scale
DOE Office of Scientific and Technical Information (OSTI.GOV)
2014-03-31
Support Vector Machines (SVM) is a popular machine learning technique, which has been applied to a wide range of domains such as science, finance, and social networks for supervised learning. MaTEx undertakes the challenge of designing a scalable parallel SVM training algorithm for large scale systems, which includes commodity multi-core machines, tightly connected supercomputers and cloud computing systems. Several techniques are proposed for improved speed and memory space usage including adaptive and aggressive elimination of samples for faster convergence, and sparse format representation of data samples. Several heuristics for earliest possible to lazy elimination of non-contributing samples are considered in MaTEx. In many cases, where an early sample elimination might result in a false positive, low overhead mechanisms for reconstruction of key data structures are proposed. The proposed algorithm and heuristics are implemented and evaluated on various publicly available datasets.
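The sample-elimination idea in the abstract has a small-scale analogue in mainstream libraries: libsvm's shrinking heuristic temporarily removes samples that are unlikely to end up as support vectors. The scikit-learn snippet below shows only that analogue, on synthetic data, not MaTEx's distributed algorithm.

```python
from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)

# shrinking=True enables libsvm's heuristic elimination of samples
# from the active set during optimization, the same basic idea
# (skip non-contributing samples) that MaTEx scales up in parallel.
clf = SVC(kernel="rbf", shrinking=True).fit(X, y)
print(len(clf.support_))  # samples that remained as support vectors
```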
Mabry, C D
2001-03-01
Vascular surgeons have had to contend with rising costs while their reimbursements have undergone steady reductions. The use of newer accounting techniques can help vascular surgeons better manage their practices, plan for future expansion, and control costs. This article reviews traditional accounting methods, together with activity-based costing (ABC) principles that have been used in the past for practice expense analysis. The main focus is on a new technique, resource-based costing (RBC), which uses the widely available Resource-Based Relative Value Scale (RBRVS) as its basis. The RBC technique promises easier implementation as well as more flexibility in determining true costs of performing various procedures, as opposed to more traditional accounting methods. It is hoped that RBC will assist vascular surgeons in coping with decreasing reimbursement. Copyright 2001 by W.B. Saunders Company
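The arithmetic behind RBC is simple enough to show in a few lines: divide total practice cost by total RVUs billed to get a cost rate, then multiply by a procedure's RVU weight. All numbers below are hypothetical illustrations, not actual RBRVS values or practice data.

```python
# Resource-based costing: allocate practice expense in proportion
# to RBRVS relative value units (all figures are made up).
total_practice_cost = 900_000.0  # assumed annual practice expense, $
total_rvus_billed = 25_000.0     # assumed annual RVUs across all work

cost_per_rvu = total_practice_cost / total_rvus_billed  # $36.00 per RVU

procedure_rvus = 35.0  # illustrative RVU weight for one procedure
print(f"estimated procedure cost: ${procedure_rvus * cost_per_rvu:,.2f}")
# -> estimated procedure cost: $1,260.00
```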
Multi-scale Multi-mechanism Design of Tough Hydrogels: Building Dissipation into Stretchy Networks
Zhao, Xuanhe
2014-01-01
As swollen polymer networks in water, hydrogels are usually brittle. However, hydrogels with high toughness play critical roles in many plant and animal tissues as well as in diverse engineering applications. Here we review the intrinsic mechanisms of a wide variety of tough hydrogels developed over the past few decades. We show that tough hydrogels generally possess mechanisms to dissipate substantial mechanical energy but still maintain high elasticity under deformation. The integrations and interactions of different mechanisms for dissipating energy and maintaining elasticity are essential to the design of tough hydrogels. A matrix that combines various mechanisms is constructed for the first time to guide the design of next-generation tough hydrogels. We further highlight that a particularly promising strategy for the design is to implement multiple mechanisms across multiple length scales into nano-, micro-, meso-, and macro-structures of hydrogels. PMID:24834901
Anderson, Cynthia M; Borgmeier, Chris
2010-01-01
To meet the complex social, behavioral, and academic needs of all students, schools benefit from having available multiple evidence-based interventions of varying intensity. School-wide positive behavior support provides a framework within which a continuum of evidence-based interventions can be implemented in a school. This framework includes three levels or tiers of intervention: Tier I (primary or universal), Tier II (secondary or targeted), and Tier III (tertiary or individualized) supports. In this paper we review the logic behind school-wide positive behavior support and then focus on Tier II interventions, as this level of support has received the least attention in the literature. We delineate the key features of Tier II interventions as implemented within school-wide positive behavior support, provide guidelines for matching Tier II interventions to school and student needs, and describe how schools plan for implementation and maintenance of selected interventions.
On the functional design of the DTU10 MW wind turbine scale model of LIFES50+ project
NASA Astrophysics Data System (ADS)
Bayati, I.; Belloli, M.; Bernini, L.; Fiore, E.; Giberti, H.; Zasso, A.
2016-09-01
This paper illustrates the mechatronic design of the wind tunnel scale model of the DTU 10MW reference wind turbine, for the LIFES50+ H2020 European project. The model was designed with the final goal of controlling the angle of attack of each blade by means of miniaturized servomotors, for implementing advanced individual pitch control (IPC) laws on a Floating Offshore Wind Turbine (FOWT) 1/75 scale model. Many design constraints had to be respected, among others: the rotor-nacelle overall mass imposed by aero-elastic scaling; the limited space in the nacelle, which must house three miniaturized servomotors and the main-shaft one, with their own inverters/controllers, plus the slip rings for electrical rotary contacts; the highest possible stiffness of the nacelle support and of the blade-rotor connections, to ensure a proper kinematic constraint given the first flapwise blade natural frequency; and the performance required of the servomotors to guarantee the wide frequency band implied by the frequency scale factors. The design and technical solutions are herein presented and discussed, along with an overview of the building and verification process. We also discuss the goals achieved and the constraints respected for the rigid wind turbine scale model (LIFES50+ deliverable D.3.1), as well as the further possible improvements for the IPC aero-elastic scale model, which was being finalized at the time of writing.
Soneral, Paula A. G.; Wyse, Sara A.
2017-01-01
Student-centered learning environments with upside-down pedagogies (SCALE-UP) are widely implemented at institutions across the country, and learning gains from these classrooms have been well documented. This study investigates the specific design feature(s) of the SCALE-UP classroom most conducive to teaching and learning. Using pilot survey data from instructors and students to prioritize the most salient SCALE-UP classroom features, we created a low-tech “Mock-up” version of this classroom and tested the impact of these features on student learning, attitudes, and satisfaction using a quasi-experimental setup. The same instructor taught two sections of an introductory biology course in the SCALE-UP and Mock-up rooms. Although students in both sections were equivalent in terms of gender, grade point average, incoming ACT, and drop/fail/withdraw rate, the Mock-up classroom enrolled significantly more freshmen. Controlling for class standing, multiple regression modeling revealed no significant differences in exam, in-class, preclass, and Introduction to Molecular and Cellular Biology Concept Inventory scores between the SCALE-UP and Mock-up classrooms. Thematic analysis of student comments highlighted that collaboration and whiteboards enhanced the learning experience, but technology was not important. Student satisfaction and attitudes were comparable. These results suggest that the benefits of a SCALE-UP experience can be achieved at lower cost without technology features. PMID:28213582
Subgrid Scale Modeling in Solar Convection Simulations using the ASH Code
NASA Technical Reports Server (NTRS)
Young, Y.-N.; Miesch, M.; Mansour, N. N.
2003-01-01
The turbulent solar convection zone has remained one of the most challenging and important subjects in physics. Understanding the complex dynamics in the solar convection zone is crucial for gaining insight into the solar dynamo problem. Many solar observatories have generated revealing data with great detail of large-scale motions in the solar convection zone. For example, a strong differential rotation is observed: the angular rotation is faster at the equator than near the poles, not only near the solar surface but also deep in the convection zone. On the other hand, due to the wide range of dynamical scales of turbulence in the solar convection zone, both theory and simulation have had limited success. Thus, cutting-edge solar models and numerical simulations of the solar convection zone have focused more narrowly on a few key features, such as the time-averaged differential rotation. For example, Brun & Toomre (2002) report computational findings of differential rotation in an anelastic model for solar convection. A critical shortcoming of this model is that the viscous dissipation is based on an application of mixing length theory to stellar dynamics with some ad hoc parameter tuning. The goal of our work is to implement the subgrid scale model developed at CTR into the solar simulation code and examine how the differential rotation is affected as a result. Specifically, we implement a Smagorinsky-Lilly subgrid scale model into the ASH (anelastic spherical harmonic) code developed over the years by various authors. This paper is organized as follows. In §2 we briefly formulate the anelastic system that describes the solar convection. In §3 we formulate the Smagorinsky-Lilly subgrid scale model for unstably stratified convection. We then present some preliminary results in §4, where we also provide some conclusions and future directions.
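For reference, the closure being implemented has a compact closed form: the eddy viscosity is nu_t = (C_s * Delta)^2 * |S| with |S| = sqrt(2 S_ij S_ij). The NumPy sketch below evaluates it on a uniform 2D grid; it is a schematic of the formula only, not of the anelastic spherical-harmonic implementation in ASH, and the value of the Smagorinsky constant is an assumption.

```python
import numpy as np

def smagorinsky_viscosity(u, v, dx, cs=0.17):
    # nu_t = (cs*dx)**2 * |S|, |S| = sqrt(2*S_ij*S_ij), on a uniform
    # 2D grid with axis 0 = x and axis 1 = y (illustration only).
    dudx, dudy = np.gradient(u, dx, dx)
    dvdx, dvdy = np.gradient(v, dx, dx)
    s11, s22 = dudx, dvdy
    s12 = 0.5 * (dudy + dvdx)  # symmetric strain-rate component
    s_mag = np.sqrt(2.0 * (s11**2 + s22**2 + 2.0 * s12**2))
    return (cs * dx) ** 2 * s_mag
```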
A Very High Order, Adaptable MESA Implementation for Aeroacoustic Computations
NASA Technical Reports Server (NTRS)
Dyson, Rodger W.; Goodrich, John W.
2000-01-01
Since computational efficiency and wave resolution scale with accuracy, the ideal would be infinitely high accuracy for problems with widely varying wavelength scales. Currently, many of the computational aeroacoustics methods are limited to 4th-order accurate Runge-Kutta methods in time, which limits their resolution and efficiency. However, a new procedure for implementing the Modified Expansion Solution Approximation (MESA) schemes, based upon Hermitian divided differences, is presented which extends the effective accuracy of the MESA schemes to 57th order in space and time when using 128-bit floating point precision. This new approach has the advantages of reducing round-off error, being easy to program, and being more computationally efficient than previous approaches. Its accuracy is limited only by the floating point hardware. The advantages of this new approach are demonstrated by solving the linearized Euler equations in an open bi-periodic domain. A 500th-order MESA scheme can now be created in seconds, making these schemes ideally suited for the next generation of high-performance 256-bit (double quadruple) or higher precision computers. This ease of creation makes it possible to adapt the algorithm to the mesh in time instead of its converse: this is ideal for resolving varying wavelength scales which occur in noise generation simulations. And finally, the sources of round-off error which affect the very high-order methods are examined, and remedies are provided that effectively increase the accuracy of the MESA schemes while using current computer technology.
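For context on the primitive named above: MESA builds on Hermitian divided differences, which extend the ordinary Newton divided-difference table by consuming derivative data at repeated nodes. The sketch below computes only the standard (non-Hermitian) table, as a reminder of the building block rather than the MESA construction itself.

```python
def divided_differences(xs, fs):
    # In-place Newton divided-difference table; returns the
    # coefficients f[x0], f[x0,x1], ..., f[x0..x_{n-1}].
    coef = list(fs)
    n = len(xs)
    for j in range(1, n):
        for i in range(n - 1, j - 1, -1):
            coef[i] = (coef[i] - coef[i - 1]) / (xs[i] - xs[i - j])
    return coef

# Interpolating (0,1), (1,3), (2,9), i.e. 2x**2 + 1:
print(divided_differences([0.0, 1.0, 2.0], [1.0, 3.0, 9.0]))  # [1.0, 2.0, 2.0]
```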
Zheng, Guangyong; Xu, Yaochen; Zhang, Xiujun; Liu, Zhi-Ping; Wang, Zhuo; Chen, Luonan; Zhu, Xin-Guang
2016-12-23
A gene regulatory network (GRN) represents interactions of genes inside a cell or tissue, in which vertices and edges stand for genes and their regulatory interactions respectively. Reconstruction of gene regulatory networks, in particular genome-scale networks, is essential for comparative exploration of different species and mechanistic investigation of biological processes. Currently, most network inference methods are computationally intensive: they are usually effective for small-scale tasks (e.g., networks with a few hundred genes) but struggle to construct GRNs at genome scale. Here, we present a software package for gene regulatory network reconstruction at a genomic level, in which gene interactions are measured by conditional mutual information within a parallel computing framework (hence the package name, CMIP). The package is a greatly improved implementation of our previous PCA-CMI algorithm. In CMIP, we provide not only an automatic threshold determination method but also an effective parallel computing framework for network inference. Performance tests on benchmark datasets show that the accuracy of CMIP is comparable to most current network inference methods. Moreover, running tests on synthetic datasets demonstrate that CMIP can handle large datasets, especially genome-wide datasets, within an acceptable time period. In addition, successful application on a real genomic dataset confirms the practical applicability of the package. This new software package provides a powerful tool for genomic network reconstruction to the biological community. The software can be accessed at http://www.picb.ac.cn/CMIP/ .
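To make the score concrete, the sketch below estimates the conditional mutual information CMI(X;Y|Z) from co-occurrence counts of discretized expression values. This plain discrete estimator is only illustrative: the PCA-CMI line of work that CMIP improves on typically evaluates CMI under a Gaussian assumption from covariance matrices.

```python
import numpy as np
from collections import Counter

def conditional_mutual_information(x, y, z):
    # CMI(X;Y|Z) = sum over (x,y,z) of p(x,y,z) * log( p(z)p(x,y,z) /
    # (p(x,z)p(y,z)) ), estimated from discrete co-occurrence counts.
    n = len(x)
    nxyz = Counter(zip(x, y, z))
    nxz, nyz, nz = Counter(zip(x, z)), Counter(zip(y, z)), Counter(z)
    cmi = 0.0
    for (xi, yi, zi), c in nxyz.items():
        cmi += (c / n) * np.log(nz[zi] * c / (nxz[xi, zi] * nyz[yi, zi]))
    return cmi

# Toy check: x is a deterministic function of z, so CMI(x;y|z) = 0.
z = [0, 1] * 500
print(conditional_mutual_information(z, z[::-1], z))  # 0.0
```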
Financial Incentives to Enable Clean Energy Deployment: Policy Overview and Good Practices
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cox, Sadie
Financial incentives have been widely implemented by governments around the world to support scaled-up deployment of renewable energy and energy efficiency technologies and practices. As of 2015, at least 48 countries have adopted financial incentives to support renewable energy and energy efficiency deployment. Broader clean energy strategies and plans provide a crucial foundation for financial incentives that often complement regulatory policies such as renewable energy targets, standards, and other mandates. This policy brief provides a primer on key financial incentive design elements, lessons from different country experiences, and curated support resources for more detailed and country-specific financial incentive design information.
A real-time KLT implementation for radio-SETI applications
NASA Astrophysics Data System (ADS)
Melis, Andrea; Concu, Raimondo; Pari, Pierpaolo; Maccone, Claudio; Montebugnoli, Stelio; Possenti, Andrea; Valente, Giuseppe; Antonietti, Nicoló; Perrodin, Delphine; Migoni, Carlo; Murgia, Matteo; Trois, Alessio; Barbaro, Massimo; Bocchinu, Alessandro; Casu, Silvia; Lunesu, Maria Ilaria; Monari, Jader; Navarrini, Alessandro; Pisanu, Tonino; Schilliró, Francesco; Vacca, Valentina
2016-07-01
SETI, the Search for ExtraTerrestrial Intelligence, is the search for radio signals emitted by alien civilizations living in the Galaxy. Narrow-band FFT-based approaches have been preferred in SETI, since their computation time only grows like N*ln N, where N is the number of time samples. On the contrary, a wide-band approach based on the Karhunen-Loève Transform (KLT) algorithm would be preferable, but it would scale like N*N. In this paper, we describe a hardware-software infrastructure based on FPGA boards and GPU-based PCs that circumvents this computation-time problem, allowing for a real-time KLT.
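To illustrate why the KLT is attractive but expensive: unlike the FFT's fixed sinusoidal basis, the KLT derives its basis from the data's own autocorrelation via an eigendecomposition, the step whose cost the paper's FPGA/GPU infrastructure absorbs. The NumPy sketch below is a small offline illustration, with frame length and component count chosen arbitrarily.

```python
import numpy as np
from scipy.linalg import toeplitz

def klt_coefficients(x, order=64, n_components=8):
    # Estimate the autocorrelation, eigendecompose the Toeplitz
    # autocorrelation matrix (the expensive step), and project
    # successive signal frames onto the dominant eigenvectors.
    x = x - x.mean()
    acf = np.correlate(x, x, mode="full")[len(x) - 1:len(x) - 1 + order] / len(x)
    w, V = np.linalg.eigh(toeplitz(acf))   # eigenvalues in ascending order
    basis = V[:, ::-1][:, :n_components]   # dominant KLT basis vectors
    frames = x[:len(x) // order * order].reshape(-1, order)
    return frames @ basis                  # KLT coefficients per frame
```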
Single-Molecule Bioelectronics
Rosenstein, Jacob K.; Lemay, Serge G.; Shepard, Kenneth L.
2014-01-01
Experimental techniques which interface single biomolecules directly with microelectronic systems are increasingly being used in a wide range of powerful applications, from fundamental studies of biomolecules to ultra-sensitive assays. Here we review several technologies which can perform electronic measurements of single molecules in solution: ion channels, nanopore sensors, carbon nanotube field-effect transistors, electron tunneling gaps, and redox cycling. We discuss the shared features among these techniques that enable them to resolve individual molecules, and discuss their limitations. Recordings from each of these methods all rely on similar electronic instrumentation, and we discuss the relevant circuit implementations and potential for scaling these single-molecule bioelectronic interfaces to high-throughput arrayed sensing platforms. PMID:25529538
Feng, Dai; Baumgartner, Richard; Svetnik, Vladimir
2018-04-05
The concordance correlation coefficient (CCC) is a widely used scaled index in the study of agreement. In this article, we propose estimating the CCC by a unified Bayesian framework that can (1) accommodate symmetric or asymmetric and light- or heavy-tailed data; (2) select the model from several candidates; and (3) address other issues frequently encountered in practice, such as confounding covariates and missing data. The performance of the proposal was studied and demonstrated using simulated as well as real-life biomarker data from a clinical study of an insomnia drug. The implementation of the proposal is accessible through a package in the Comprehensive R Archive Network.
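For orientation, the classical (non-Bayesian) sample estimator that the CCC reduces to is a one-liner; the paper's contribution is a Bayesian treatment of this quantity that tolerates asymmetric and heavy-tailed data, which the sketch below does not attempt.

```python
import numpy as np

def ccc(x, y):
    # Lin's concordance correlation coefficient:
    # 2*cov(x,y) / (var(x) + var(y) + (mean(x) - mean(y))**2)
    x, y = np.asarray(x, float), np.asarray(y, float)
    sxy = np.mean((x - x.mean()) * (y - y.mean()))
    return 2 * sxy / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)

print(ccc([1.0, 2.0, 3.0, 4.0], [1.1, 2.0, 3.2, 3.9]))  # close to 1
```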
Active mixing of complex fluids at the microscale
Ober, Thomas J.; Foresti, Daniele; Lewis, Jennifer A.
2015-01-01
Mixing of complex fluids at low Reynolds number is fundamental for a broad range of applications, including materials assembly, microfluidics, and biomedical devices. Of these materials, yield stress fluids (and gels) pose the most significant challenges, especially when they must be mixed in low volumes over short timescales. New scaling relationships between mixer dimensions and operating conditions are derived and experimentally verified to create a framework for designing active microfluidic mixers that can efficiently homogenize a wide range of complex fluids. Active mixing printheads are then designed and implemented for multimaterial 3D printing of viscoelastic inks with programmable control of local composition. PMID:26396254
García-Cortés, M; Lucena, M I; Pachkoria, K; Borraz, Y; Hidalgo, R; Andrade, R J
2008-05-01
Causality assessment in hepatotoxicity is challenging. The current standard liver-specific Council for International Organizations of Medical Sciences/Roussel Uclaf Causality Assessment Method scale is complex and difficult to implement in daily practice. The Naranjo Adverse Drug Reactions Probability Scale is a simple and widely used nonspecific scale, which has not been specifically evaluated in drug-induced liver injury. Our aim was to compare the Naranjo method with the standard liver-specific Council for International Organizations of Medical Sciences/Roussel Uclaf Causality Assessment Method scale, evaluating the accuracy and reproducibility of the Naranjo Adverse Drug Reactions Probability Scale in the diagnosis of hepatotoxicity. Two hundred and twenty-five cases of suspected hepatotoxicity submitted to a national registry were evaluated by two independent observers and assessed for between-observer and between-scale differences using percentages of agreement and the weighted kappa (kappa(w)) test. A total of 249 ratings were generated. Between-observer agreement was 45% with a kappa(w) value of 0.17 for the Naranjo Adverse Drug Reactions Probability Scale, while there was higher agreement when using the Council for International Organizations of Medical Sciences/Roussel Uclaf Causality Assessment Method scale (72%, kappa(w): 0.71). Concordance between the two scales was 24% (kappa(w): 0.15). The Naranjo Adverse Drug Reactions Probability Scale had low sensitivity (54%) and poor negative predictive value (29%) and showed a limited capability to distinguish between adjacent categories of probability. The Naranjo scale lacks validity and reproducibility in the attribution of causality in hepatotoxicity.
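For readers who want to reproduce the agreement statistic, a linearly weighted kappa on ordered causality categories can be computed directly with scikit-learn; the ratings below are invented for illustration, and the paper's exact weighting scheme is not specified in the abstract.

```python
from sklearn.metrics import cohen_kappa_score

# Two raters' causality categories for the same cases, coded as
# ordered integers (e.g. 0=unlikely ... 3=highly probable); toy data.
rater1 = [3, 2, 2, 1, 0, 3, 1, 2]
rater2 = [2, 2, 1, 1, 0, 3, 0, 2]
print(cohen_kappa_score(rater1, rater2, weights="linear"))
```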
ERIC Educational Resources Information Center
Goodman-Scott, Emily
2014-01-01
School-Wide Positive Behavioral Interventions and Supports (PBIS) are school-wide, data-driven frameworks for promoting safe schools and student learning. This article explains PBIS and provides practical examples of PBIS implementation by describing a school counselor-run PBIS framework in one elementary school, as part of a larger, district-wide…
A Bayesian method for assessing multiscale species-habitat relationships
Stuber, Erica F.; Gruber, Lutz F.; Fontaine, Joseph J.
2017-01-01
Context: Scientists face several theoretical and methodological challenges in appropriately describing fundamental wildlife-habitat relationships in models. The spatial scales of habitat relationships are often unknown, and are expected to follow a multi-scale hierarchy. Typical frequentist or information theoretic approaches often suffer under collinearity in multi-scale studies, fail to converge when models are complex, or represent an intractable computational burden when candidate model sets are large. Objectives: Our objective was to implement an automated, Bayesian method for inference on the spatial scales of habitat variables that best predict animal abundance. Methods: We introduce Bayesian latent indicator scale selection (BLISS), a Bayesian method to select spatial scales of predictors using latent scale indicator variables that are estimated with reversible-jump Markov chain Monte Carlo sampling. BLISS does not suffer from collinearity, and substantially reduces computation time of studies. We present a simulation study to validate our method and apply our method to a case study of land cover predictors for ring-necked pheasant (Phasianus colchicus) abundance in Nebraska, USA. Results: Our method returns accurate descriptions of the explanatory power of multiple spatial scales, and unbiased and precise parameter estimates under commonly encountered data limitations including spatial scale autocorrelation, effect size, and sample size. BLISS outperforms commonly used model selection methods including stepwise and AIC, and reduces runtime by 90%. Conclusions: Given the pervasiveness of scale-dependency in ecology, and the implications of mismatches between the scales of analyses and ecological processes, identifying the spatial scales over which species are integrating habitat information is an important step in understanding species-habitat relationships. BLISS is a widely applicable method for identifying important spatial scales, propagating scale uncertainty, and testing hypotheses of scaling relationships.
A comparison of different category scales for estimating disease severity
USDA-ARS?s Scientific Manuscript database
Plant pathologists most often obtain quantitative information on disease severity using visual assessments. Category scales are widely used for assessing disease severity, including for screening germplasm. The most widely used category scale is the Horsfall-Barratt (H-B) scale, but reports show tha...
Haberer, Jessica E; Sabin, Lora; Amico, K Rivet; Orrell, Catherine; Galárraga, Omar; Tsai, Alexander C; Vreeman, Rachel C; Wilson, Ira; Sam-Agudu, Nadia A; Blaschke, Terrence F; Vrijens, Bernard; Mellins, Claude A; Remien, Robert H; Weiser, Sheri D; Lowenthal, Elizabeth; Stirratt, Michael J; Sow, Papa Salif; Thomas, Bruce; Ford, Nathan; Mills, Edward; Lester, Richard; Nachega, Jean B; Bwana, Bosco Mwebesa; Ssewamala, Fred; Mbuagbaw, Lawrence; Munderi, Paula; Geng, Elvin; Bangsberg, David R
2017-01-01
Introduction: Successful population-level antiretroviral therapy (ART) adherence will be necessary to realize both the clinical and prevention benefits of antiretroviral scale-up and, ultimately, the end of AIDS. Although many people living with HIV are adhering well, others struggle and most are likely to experience challenges in adherence that may threaten virologic suppression at some point during lifelong therapy. Despite the importance of ART adherence, supportive interventions have generally not been implemented at scale. The objective of this review is to summarize the recommendations of clinical, research, and public health experts for scalable ART adherence interventions in resource-limited settings. Methods: In July 2015, the Bill and Melinda Gates Foundation convened a meeting to discuss the most promising ART adherence interventions for use at scale in resource-limited settings. This article summarizes that discussion with recent updates. It is not a systematic review, but rather provides practical considerations for programme implementation based on evidence from individual studies, systematic reviews, meta-analyses, and the World Health Organization Consolidated Guidelines for HIV, which include evidence from randomized controlled trials in low- and middle-income countries. Interventions are categorized broadly as education and counselling; information and communication technology-enhanced solutions; healthcare delivery restructuring; and economic incentives and social protection interventions. Each category is discussed, including descriptions of interventions, current evidence for effectiveness, and what appears promising for the near future. Approaches to intervention implementation and impact assessment are then described. Results and discussion: The evidence base is promising for currently available, effective, and scalable ART adherence interventions for resource-limited settings. Numerous interventions build on existing health care infrastructure and leverage available resources. Those most widely studied and implemented to date involve peer counselling, adherence clubs, and short message service (SMS). Many additional interventions could have an important impact on ART adherence with further development, including standardized counselling through multi-media technology, electronic dose monitoring, decentralized and differentiated models of care, and livelihood interventions. Optimal targeting and tailoring of interventions will require improved adherence measurement. Conclusions: The opportunity exists today to address and resolve many of the challenges to effective ART adherence, so that they do not limit the potential of ART to help bring about the end of AIDS.
Yoon, Dhongik S; Jo, HangJin; Corradini, Michael L
2017-04-01
Condensation of steam vapor is an important mode of energy removal from the reactor containment. The presence of noncondensable gas complicates the process and makes it difficult to model. MELCOR, one of the more widely used system codes for containment analyses, uses the heat and mass transfer analogy to model condensation heat transfer. To investigate previously reported nodalization-dependence in the natural convection flow regime, the MELCOR condensation model as well as other models are studied. The nodalization-dependence issue is resolved by using the physical length from the actual geometry, rather than the node size of each control volume, as the characteristic length scale for MELCOR containment analyses. At the transition to the turbulent natural convection regime, the McAdams correlation for convective heat transfer produces a better prediction compared to the original MELCOR model. The McAdams correlation is implemented in MELCOR and the prediction is validated against a set of experiments on a scaled AP600 containment. MELCOR with our implemented model produces improved predictions. For steam molar fractions in the gas mixture greater than about 0.58, the predictions are within the uncertainty margin of the measurements. The simulation results still underestimate the heat transfer from the gas-steam mixture, implying that conservative predictions are provided.
Multi-scale Multi-mechanism Toughening of Hydrogels
NASA Astrophysics Data System (ADS)
Zhao, Xuanhe
Hydrogels are widely used as scaffolds for tissue engineering, vehicles for drug delivery, actuators for optics and fluidics, and model extracellular matrices for biological studies. The scope of hydrogel applications, however, is often severely limited by their mechanical properties. Inspired by the mechanics and hierarchical structures of tough biological tissues, we propose that a general principle for the design of tough hydrogels is to implement two mechanisms for dissipating mechanical energy and maintaining high elasticity in hydrogels. A particularly promising strategy for the design is to integrate multiple pairs of mechanisms across multiple length scales into a hydrogel. We develop a multiscale theoretical framework to quantitatively guide the design of tough hydrogels. On the network level, we have developed micro-physical models to characterize the evolution of polymer networks under deformation. On the continuum level, we have implemented constitutive laws formulated from the network-level models into a coupled cohesive-zone and Mullins-effect model to quantitatively predict crack propagation and fracture toughness of hydrogels. Guided by the design principle and quantitative model, we will demonstrate a set of new hydrogels, based on diverse types of polymers, yet can achieve extremely high toughness superior to their natural counterparts such as cartilages. The work was supported by NSF(No. CMMI- 1253495) and ONR (No. N00014-14-1-0528).
NASA Astrophysics Data System (ADS)
Kim, Jonghoon; Cho, B. H.
2014-08-01
This paper introduces an innovative approach to analyzing the electrochemical characteristics and diagnosing the state of health (SOH) of a Li-ion cell based on the discrete wavelet transform (DWT). In this approach, the DWT is applied as a powerful tool in the analysis of the discharging/charging voltage signal (DCVS), with its non-stationary and transient phenomena, for a Li-ion cell. Specifically, DWT-based multi-resolution analysis (MRA) is used for extracting information on the electrochemical characteristics in the time and frequency domains simultaneously. By using MRA with wavelet decomposition, information on the electrochemical characteristics of a Li-ion cell can be extracted from the DCVS over a wide frequency range. Wavelet decomposition is implemented with the order 3 Daubechies wavelet (db3) and scale 5 selected as the best wavelet function and the optimal decomposition scale. In particular, the present approach takes these investigations one step further by showing the low- and high-frequency components (approximation component An and detail component Dn, respectively) extracted from various Li-ion cells with different electrochemical characteristics caused by aging effects.
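The (db3, scale 5) decomposition named in the abstract maps directly onto standard wavelet tooling. The PyWavelets sketch below runs that multi-resolution decomposition on a synthetic voltage trace; the signal itself is fabricated, since the paper's DCVS data are not reproduced here.

```python
import numpy as np
import pywt

# Five-level MRA of a discharging/charging voltage signal with the
# order-3 Daubechies wavelet, mirroring the (db3, scale 5) choice.
t = np.linspace(0.0, 1.0, 4096)
dcvs = 3.7 - 0.5 * t + 0.01 * np.random.randn(t.size)  # toy voltage trace
coeffs = pywt.wavedec(dcvs, "db3", level=5)  # [A5, D5, D4, D3, D2, D1]
a5, details = coeffs[0], coeffs[1:]
print([c.size for c in coeffs])  # coefficient counts per sub-band
```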
Hanselman, Paul; Rozek, Christopher S.; Grigg, Jeffrey; Borman, Geoffrey D.
2016-01-01
Brief, targeted self-affirmation writing exercises have recently been offered as a way to reduce racial achievement gaps, but evidence about their effects in educational settings is mixed, leaving ambiguity about the likely benefits of these strategies if implemented broadly. A key limitation in interpreting these mixed results is that they come from studies conducted by different research teams with different procedures in different settings; it is therefore impossible to isolate whether different effects are the result of theorized heterogeneity, unidentified moderators, or idiosyncratic features of the different studies. We addressed this limitation by conducting a well-powered replication of self-affirmation in a setting where a previous large-scale field experiment demonstrated significant positive impacts, using the same procedures. We found no evidence of effects in this replication study and estimates were precise enough to reject benefits larger than an effect size of 0.10. These null effects were significantly different from persistent benefits in the prior study in the same setting, and extensive testing revealed that currently theorized moderators of self-affirmation effects could not explain the difference. These results highlight the potential fragility of self-affirmation in educational settings when implemented widely and the need for new theory, measures, and evidence about the necessary conditions for self-affirmation success. PMID:28450753
Emmert, Martin; Meszmer, Nina; Sander, Uwe
2016-09-19
Physician-rating websites have become a popular tool to create more transparency about the quality of health care providers. So far, it remains unknown whether online-based rating websites have the potential to contribute to a better standard of care. Our goal was to examine which health care providers use online rating websites and for what purposes, and whether health care providers use online patient ratings to improve patient care. We conducted an online-based cross-sectional study by surveying 2360 physicians and other health care providers (September 2015). In addition to descriptive statistics, we performed multilevel logistic regression models to ascertain the effects of providers' demographics as well as report card-related variables on the likelihood that providers implement measures to improve patient care. Overall, more than half of the responding providers surveyed (54.66%, 1290/2360) used online ratings to derive measures to improve patient care (implemented measures: mean 3.06, SD 2.29). Ophthalmologists (68%, 40/59) and gynecologists (65.4%, 123/188) were most likely to implement any measures. The most widely implemented quality measures were related to communication with patients (28.77%, 679/2360), the appointment scheduling process (23.60%, 557/2360), and office workflow (21.23%, 501/2360). Scaled-survey results had a greater impact on deriving measures than narrative comments. Multilevel logistic regression models revealed medical specialty, the frequency of report card use, and the appraisal of the trustworthiness of scaled-survey ratings to be significantly associated predictors for implementing measures to improve patient care because of online ratings. Our results suggest that online ratings displayed on physician-rating websites have an impact on patient care. Despite the limitations of our study and unintended consequences of physician-rating websites, they still may have the potential to improve patient care.
Student experiences across multiple flipped courses in a single curriculum.
Khanova, Julia; Roth, Mary T; Rodgers, Jo Ellen; McLaughlin, Jacqueline E
2015-10-01
The flipped classroom approach has garnered significant attention in health professions education, which has resulted in calls for curriculum-wide implementations of the model. However, research to support the development of evidence-based guidelines for large-scale flipped classroom implementations is lacking. This study was designed to examine how students experience the flipped classroom model of learning in multiple courses within a single curriculum, as well as to identify specific elements of flipped learning that students perceive as beneficial or challenging. A qualitative analysis of students' comments (n = 6010) from mid-course and end-of-course evaluations of 10 flipped courses (in 2012-2014) was conducted. Common and recurring themes were identified through systematic iterative coding and sorting using the constant comparison method. Multiple coders, agreement through consensus and member checking were utilised to ensure the trustworthiness of findings. Several themes emerged from the analysis: (i) the perceived advantages of flipped learning coupled with concerns about implementation; (ii) the benefits of pre-class learning and factors that negatively affect these benefits, such as quality and quantity of learning materials, as well as overall increase in workload, especially in the context of multiple concurrent flipped courses; (iii) the role of the instructor in the flipped learning environment, particularly in engaging students in active learning and ensuring instructional alignment, and (iv) the need for assessments that emphasise the application of knowledge and critical thinking skills. Analysis of data from 10 flipped courses provided insight into common patterns of student learning experiences specific to the flipped learning model within a single curriculum. The study points to the challenges associated with scaling the implementation of the flipped classroom across multiple courses. Several core elements critical to the effective design and implementation of the flipped classroom model are identified.
Middleton, Richard S.; Levine, Jonathan S.; Bielicki, Jeffrey M.; ...
2015-04-27
CO2 capture, utilization, and storage (CCUS) technology has yet to be widely deployed at a commercial scale despite multiple high-profile demonstration projects. We suggest that developing a large-scale, visible, and financially viable CCUS network could potentially overcome many barriers to deployment and jumpstart commercial-scale CCUS. To date, substantial effort has focused on technology development to reduce the costs of CO2 capture from coal-fired power plants. Here, we propose that near-term investment could focus on implementing CO2 capture on facilities that produce high-value chemicals/products. These facilities can absorb the expected impact of the marginal increase in the cost of production on the price of their product, due to the addition of CO2 capture, better than coal-fired power plants can. A financially viable demonstration of a large-scale CCUS network requires offsetting the costs of CO2 capture by using the CO2 as an input to the production of market-viable products. As a result, we demonstrate this alternative development path with the example of an integrated CCUS system where CO2 is captured from ethylene producers and used for enhanced oil recovery in the U.S. Gulf Coast region.
Land Recycling: from Science to Practice - A Sustainable Development of Urban Areas
NASA Astrophysics Data System (ADS)
Romanowicz, A.
2015-12-01
Member States (MS) of the European Union have experienced significant urban sprawl in the last three decades. The urban sprawl was driven mainly by internal (MS or EU) or external migration, but also by EU policies (including funds and projects) and by changes in lifestyle (e.g. moving away from cities; second homes). This presentation will aim at showing a number of EU-wide analyses on: aging population, depopulation of some of the EU regions, agricultural production, and scenario projections thereof. Various EU-funded projects and programs have analyzed how future cities and future EU land use could develop. A number of those solutions were further investigated with case studies/small-scale implementations. However, in recent years the 2012 EU road map to resource efficiency and the UN Sustainable Development Goals have called, respectively, for 'no net land take by 2050' and land neutrality. Thus, the process of implementing innovative solutions for land use has started, and some of the cities and regions are well ahead in moving towards a XXI century society. In order to streamline/share knowledge and steer EU-wide discussion on this, the European Commission in its road map to resource efficiency announced a Communication on land as a resource. This presentation will attempt to synthesize the current discussion on the topic of 'land as a resource' and include examples of implemented innovative solutions for aging population, land recycling for urban developments and green spaces within the current EU policy context. Finally, some appreciation of the adopted UN Sustainable Development Goals regarding land and soil from the EU perspective will be given.
NASA Astrophysics Data System (ADS)
Bahtiar; Rahayu, Y. S.; Wasis
2018-01-01
This research aims to produce the P3E learning model to improve students' critical thinking skills. The developed model is named P3E, consisting of four stages: organization, inquiry, presentation, and evaluation. This development research follows the development stages of Kemp. The design of the wide-scale try-out used a pretest-posttest group design. The wide-scale try-out was conducted in grade X of the 2016/2017 academic year. The analysis of the results of this development research includes three aspects, namely the validity, practicality, and effectiveness of the developed model. The research results showed: (1) the P3E learning model was valid according to experts, with an average value of 3.7; (2) implementation of the syntax of the developed learning model reached 98.09% and 94.39% in the two schools, based on the assessment of the observers, showing that the developed model is practical to implement; (3) the developed model is effective for improving students' critical thinking skills, although the n-gain of the students' critical thinking skills was 0.54, a moderate category. Based on these results, it can be concluded that the developed P3E learning model is suitable for improving students' critical thinking skills.
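For reference, the normalized gain (n-gain) reported above is conventionally computed as g = (post - pre) / (max - pre). The sketch below illustrates the arithmetic with hypothetical pretest/posttest scores, not the study's raw data.

```python
# Normalized gain (n-gain) as commonly defined (Hake):
# g = (post - pre) / (max_score - pre)
# A minimal sketch; the score values below are hypothetical, not from the study.

def n_gain(pre: float, post: float, max_score: float = 100.0) -> float:
    """Return the normalized gain for one student or a class average."""
    return (post - pre) / (max_score - pre)

# Example: a class averaging 40 on the pretest and 72.4 on the posttest
g = n_gain(pre=40.0, post=72.4)
print(f"n-gain = {g:.2f}")  # 0.54, "moderate" on the usual 0.3-0.7 band
```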
Strong influence of regional species pools on continent-wide structuring of local communities.
Lessard, Jean-Philippe; Borregaard, Michael K; Fordyce, James A; Rahbek, Carsten; Weiser, Michael D; Dunn, Robert R; Sanders, Nathan J
2012-01-22
There is a long tradition in ecology of evaluating the relative contribution of the regional species pool and local interactions on the structure of local communities. Similarly, a growing number of studies assess the phylogenetic structure of communities, relative to that in the regional species pool, to examine the interplay between broad-scale evolutionary and fine-scale ecological processes. Finally, a renewed interest in the influence of species source pools on communities has shown that the definition of the source pool influences interpretations of patterns of community structure. We use a continent-wide dataset of local ant communities and implement ecologically explicit source pool definitions to examine the relative importance of regional species pools and local interactions for shaping community structure. Then we assess which factors underlie systematic variation in the structure of communities along climatic gradients. We find that the average phylogenetic relatedness of species in ant communities decreases from tropical to temperate regions, but the strength of this relationship depends on the level of ecological realism in the definition of source pools. We conclude that the evolution of climatic niches influences the phylogenetic structure of regional source pools and that the influence of regional source pools on local community structure is strong.
Reconnecting fragmented sturgeon populations in North American rivers
Jager, Yetta; Forsythe, Patrick S.; McLaughlin, Robert L.; ...
2016-02-24
The majority of large North American rivers are fragmented by dams that interrupt migrations of wide-ranging fishes like sturgeons. Reconnecting habitat is viewed as an important means of protecting sturgeon species in U.S. rivers because these species have lost between 5% and 60% of their historical ranges. Unfortunately, facilities designed to pass other fishes have rarely worked well for sturgeons. The most successful passage facilities were sized appropriately for sturgeons and accommodated bottom-oriented species. For upstream passage, facilities with large entrances, full-depth guidance systems, large lifts, or wide fishways without obstructions or tight turns worked well. However, facilitating upstream migration is only half the battle. Broader recovery for linked sturgeon populations requires safe round-trip passage involving multiple dams. The most successful downstream passage facilities included nature-like fishways, large canal bypasses, and bottom-draw sluice gates. We outline an adaptive approach to implementing passage that begins with temporary programs and structures and monitors success both at the scale of individual fish at individual dams and the scale of metapopulations in a river basin. The challenge will be to learn from past efforts and reconnect North American sturgeon populations in a way that promotes range expansion and facilitates population recovery.
Prescribed fire as a means of reducing forest carbon emissions in the western United States.
Wiedinmyer, Christine; Hurteau, Matthew D
2010-03-15
Carbon sequestration by forested ecosystems offers a potential climate change mitigation benefit. However, wildfire has the potential to reverse this benefit. In the western United States, climate change and land management practices have led to increases in wildfire intensity and size. One potential means of reducing carbon emissions from wildfire is the use of prescribed burning, which consumes less biomass and therefore releases less carbon to the atmosphere. This study uses a regional fire emissions model to estimate the potential reduction in fire emissions when prescribed burning is applied in dry, temperate forested systems of the western U.S. Daily carbon dioxide (CO2) fire emissions for 2001-2008 were calculated for the western U.S. for two cases: a default wildfire case and one in which prescribed burning was applied. Wide-scale prescribed fire application can reduce CO2 fire emissions in the western U.S. by 18-25%, and by as much as 60% in specific forest systems. Although this work does not address important considerations such as the feasibility of implementing wide-scale prescribed fire management or the cumulative emissions from repeated prescribed burning, it does provide constraints on potential carbon emission reductions when prescribed burning is used.
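The study's regional fire emissions model is not reproduced here, but first-order fire emissions estimates of this kind are commonly built on the Seiler and Crutzen relation, E = A × B × CC × EF (area burned × fuel load × combustion completeness × emission factor). The sketch below illustrates that relation with hypothetical numbers; it is not the model used in the paper.

```python
# Illustrative first-order fire emissions estimate (Seiler & Crutzen form):
# E = A * B * CC * EF
# All numbers below are hypothetical placeholders, not values from the study.

def fire_co2_emissions(area_ha, fuel_load_t_ha, combustion_completeness, ef_co2_g_kg):
    """Return CO2 emissions in tonnes for one burn."""
    biomass_burned_t = area_ha * fuel_load_t_ha * combustion_completeness
    return biomass_burned_t * 1000.0 * ef_co2_g_kg / 1e6  # kg biomass * g/kg -> t CO2

wildfire = fire_co2_emissions(1000, 40, 0.50, 1600)    # intense wildfire
prescribed = fire_co2_emissions(1000, 40, 0.25, 1600)  # lighter fuel consumption
print(f"reduction: {100 * (1 - prescribed / wildfire):.0f}%")  # 50% in this toy case
```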
NASA Astrophysics Data System (ADS)
Prodanovic, M.; Esteva, M.; Hanlon, M.; Nanda, G.; Agarwal, P.
2015-12-01
Recent advances in imaging have provided a wealth of 3D datasets that reveal pore space microstructure (nm to cm length scale) and allow investigation of nonlinear flow and mechanical phenomena from first principles using numerical approaches. This framework has popularly been called "digital rock physics". Researchers, however, have trouble storing and sharing the datasets both due to their size and the lack of standardized image types and associated metadata for volumetric datasets. This impedes scientific cross-validation of the numerical approaches that characterize large scale porous media properties, as well as development of multiscale approaches required for correct upscaling. A single research group typically specializes in an imaging modality and/or related modeling on a single length scale, and lack of data-sharing infrastructure makes it difficult to integrate different length scales. We developed a sustainable, open and easy-to-use repository called the Digital Rocks Portal, which (1) organizes images and related experimental measurements of different porous materials, and (2) improves access to them for a wider community of geoscience or engineering researchers not necessarily trained in computer science or data analysis. Once widely accepted, the repository will jumpstart productivity and enable scientific inquiry and engineering decisions founded on a data-driven basis. This is the first repository of its kind. We show initial results on incorporating essential software tools and pipelines that make it easier for researchers to store and reuse data, and for educators to quickly visualize and illustrate concepts to a wide audience. For data sustainability and continuous access, the portal is implemented within the reliable, 24/7 maintained High Performance Computing Infrastructure supported by the Texas Advanced Computing Center (TACC) at the University of Texas at Austin. Long-term storage is provided through the University of Texas System Research Cyber-infrastructure initiative.
NASA Astrophysics Data System (ADS)
Malone, A.
2017-12-01
Quantifying mass balance sensitivity to climate change is essential for forecasting glacier evolution and deciphering climate signals embedded in archives of past glacier changes. Ideally, these quantifications result from decades of field measurement, remote sensing, and a hierarchy of modeling approaches, but in data-sparse regions, such as the Himalayas and tropical Andes, regional-scale modeling rooted in first principles provides a first-order picture. Previous regional-scale modeling studies have applied a surface energy and mass balance approach in order to quantify equilibrium line altitude sensitivity to climate change. In this study, an expanded regional-scale surface energy and mass balance model is implemented to quantify glacier-wide mass balance sensitivity to climate change for tropical Andean glaciers. Data from the Randolph Glacier Inventory are incorporated, and additional physical processes are included, such as a dynamic albedo and cloud-dependent atmospheric emissivity. The model output agrees well with the limited mass balance records for tropical Andean glaciers. The dominant climate variables driving interannual mass balance variability differ depending on the climate setting. For wet tropical glaciers (annual precipitation >0.75 m y-1), temperature is the dominant climate variable. Different hypotheses for the processes linking wet tropical glacier mass balance variability to temperature are evaluated. The results support the hypothesis that glacier-wide mass balance on wet tropical glaciers is largely dominated by processes at the lowest elevation, where temperature plays a leading role in energy exchanges. This research also highlights the transient nature of wet tropical glaciers - the vast majority of tropical glaciers and a vital regional water resource - in an anthropogenic warming world.
ERIC Educational Resources Information Center
Christofferson, Remi Dabney; Callahan, Kathe
2015-01-01
This research explores the implementation of a school-wide intervention program that was designed to foster and instill intrinsic values based on an external reward system. The Positive Behavior Support in Schools (PBSIS) is an intervention intended to improve the climate of schools using system-wide positive behavioral interventions to discourage…
A decentralized software bus based on IP multicasting
NASA Technical Reports Server (NTRS)
Callahan, John R.; Montgomery, Todd
1995-01-01
We describe a decentralized, reconfigurable implementation of a conference management system based on the low-level Internet Protocol (IP) multicasting protocol. IP multicasting allows low-cost, world-wide, two-way transmission of data between large numbers of conferencing participants through the Multicasting Backbone (MBone). Each conference is structured as a software bus -- a messaging system that provides a run-time interconnection model that acts as a separate agent (i.e., the bus) for routing, queuing, and delivering messages between distributed programs. Unlike the client-server interconnection model, the software bus model provides a level of indirection that enhances the flexibility and reconfigurability of a distributed system. Current software bus implementations like POLYLITH, however, rely on a centralized bus process and point-to-point protocols (i.e., TCP/IP) to route, queue, and deliver messages. We implement a software bus called the MULTIBUS that relies on a separate process only for routing and uses a reliable IP multicasting protocol for delivery of messages. The use of multicasting means that interconnections are independent of IP machine addresses. This approach allows reconfiguration of bus participants during system execution without notifying other participants of new IP addresses. The use of IP multicasting also permits an economy of scale in the number of participants. We describe the MULTIBUS protocol elements and show how our implementation performs better than centralized bus implementations.
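As an illustration of the underlying transport mechanism (not of the MULTIBUS protocol itself, which layers reliable delivery and routing on top of raw multicast), a minimal IP multicast sender and receiver in Python might look as follows; the group address and port are arbitrary examples from the administratively scoped multicast range.

```python
# Minimal IP multicast send/receive sketch (not the MULTIBUS protocol itself).
import socket
import struct

GROUP, PORT = "239.1.2.3", 5007  # arbitrary example group and port

def make_receiver() -> socket.socket:
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", PORT))
    # Join the multicast group on all interfaces: participants need only the
    # group address, never each other's IP addresses.
    mreq = struct.pack("4sl", socket.inet_aton(GROUP), socket.INADDR_ANY)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    return sock

def send(message: bytes, ttl: int = 1) -> None:
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, ttl)
    sock.sendto(message, (GROUP, PORT))

if __name__ == "__main__":
    rx = make_receiver()
    send(b"bus message")
    data, addr = rx.recvfrom(1024)
    print(data, "from", addr)
```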
Guillory, Charleta; Gong, Alice; Livingston, Judith; Creel, Liza; Ocampo, Elena; McKee-Garrett, Tiffany
2017-07-01
Objective Critical congenital heart disease (CCHD) is a leading cause of death in infants. Newborn screening (NBS) by pulse oximetry allows early identification of CCHD in asymptomatic newborns. To improve readiness of hospital neonatal birthing facilities for mandatory screening in Texas, an educational and quality improvement (QI) project was piloted to identify an implementation strategy for CCHD NBS in a range of birthing hospitals. Study Design Thirteen Texas hospitals implemented standardized CCHD screening by pulse oximetry. An educational program was devised and a tool kit was created to facilitate education and implementation. Newborn nursery nurses' knowledge was assessed using a pre- and posttest instrument. Results The nurses' knowledge assessment improved from 71 to 92.5% ( p < 0.0001). Of 11,322 asymptomatic newborns screened after 24 hours of age, 11 had a positive screen, with 1 confirmed case of CCHD. Pulse oximetry CCHD NBS had sensitivity of 100%, specificity of 99.91%, false-positive rate of 0.088%, positive predictive value of 9.09%, and negative predictive value of 100%. Conclusion Our educational program, including a tool kit, QI processes, and standardized pulse oximetry CCHD NBS, is applicable for a range of hospital birthing facilities and may facilitate wide-scale implementation, thereby improving newborn health.
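The reported screening statistics can be reproduced from the counts given in the abstract (11,322 newborns screened, 11 positive screens, 1 confirmed case, and, as reported, no missed cases among screen negatives):

```python
# Recomputing the reported screening statistics from the abstract's counts.
tp, fp = 1, 10            # 11 positive screens, 1 confirmed CCHD
fn = 0                    # sensitivity reported as 100%
tn = 11322 - tp - fp - fn

sensitivity = tp / (tp + fn)      # 1.0000 -> 100%
specificity = tn / (tn + fp)      # 0.9991 -> 99.91%
false_pos_rate = fp / (fp + tn)   # 0.00088 -> 0.088%
ppv = tp / (tp + fp)              # 0.0909 -> 9.09%
npv = tn / (tn + fn)              # 1.0000 -> 100%
print(f"{sensitivity:.4f} {specificity:.4f} {false_pos_rate:.5f} {ppv:.4f} {npv:.4f}")
```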
Nanoantenna couplers for metal-insulator-metal waveguide interconnects
NASA Astrophysics Data System (ADS)
Onbasli, M. Cengiz; Okyay, Ali K.
2010-08-01
State-of-the-art copper interconnects suffer from increasing spatial power dissipation due to chip downscaling and RC delays reducing operation bandwidth. Wide bandwidth, minimized Ohmic loss, deep sub-wavelength confinement and high integration density are key features that make metal-insulator-metal waveguides (MIM) utilizing plasmonic modes attractive for applications in on-chip optical signal processing. Size mismatch between two fundamental components (micron-size fibers and a few hundred nanometers wide waveguides) demands compact coupling methods for implementation of large scale on-chip optoelectronic device integration. Existing solutions use waveguide tapering, which requires more than 4λ-long taper distances. We demonstrate that nanoantennas can be integrated with MIM for enhancing coupling into MIM plasmonic modes. Two-dimensional finite-difference time-domain simulations of antenna-waveguide structures for TE and TM incident plane waves ranging from λ = 1300 to 1600 nm were performed. The same MIM (100-nm-wide Ag/100-nm-wide SiO2/100-nm-wide Ag) was used for each case, while antenna dimensions were systematically varied. For nanoantennas disconnected from the MIM, the field is strongly confined inside the MIM-antenna gap region due to Fabry-Perot resonances, and a major fraction of the incident energy is not transferred into plasmonic modes. When the nanoantennas are connected to the MIM, stronger coupling is observed and the E-field intensity at the outer end of the core is enhanced more than 70 times.
Subgrid spatial variability of soil hydraulic functions for hydrological modelling
NASA Astrophysics Data System (ADS)
Kreye, Phillip; Meon, Günter
2016-07-01
State-of-the-art hydrological applications require a process-based, spatially distributed hydrological model. The model is required to reproduce runoff characteristics well. Beyond that, the model should be able to describe the processes at a subcatchment scale in a physically credible way. The objective of this study is to present a robust procedure to generate various sets of parameterisations of soil hydraulic functions for the description of soil heterogeneity on a subgrid scale. Relations between Rosetta-generated values of saturated hydraulic conductivity (Ks) and van Genuchten's parameters of soil hydraulic functions were statistically analysed. A universal function that is valid for the complete bandwidth of Ks values could not be found. After concentrating on natural texture classes, strong correlations were identified for all parameters. The obtained regression results were used to parameterise sets of hydraulic functions for each soil class. The methodology presented in this study is applicable on a wide range of spatial scales and does not need input data from field studies. The developments were implemented into a hydrological modelling system.
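A minimal sketch of the per-texture-class regression idea, using synthetic placeholder data rather than the Rosetta-generated values from the study:

```python
# Sketch: relate log10(Ks) to a van Genuchten parameter (here alpha)
# separately for each soil texture class. Data below are synthetic
# placeholders, not the study's values.
import numpy as np

rng = np.random.default_rng(0)
classes = {"sand": (2.0, 0.12), "silt_loam": (0.5, 0.02)}  # (log10 Ks mean, alpha scale)

for name, (ks_mu, a_scale) in classes.items():
    log_ks = rng.normal(ks_mu, 0.3, size=200)
    alpha = a_scale * (1.0 + 0.4 * (log_ks - ks_mu)) + rng.normal(0, 0.005, 200)
    slope, intercept = np.polyfit(log_ks, alpha, deg=1)   # per-class regression
    r = np.corrcoef(log_ks, alpha)[0, 1]
    print(f"{name}: alpha ~ {slope:.4f}*log10(Ks) + {intercept:.4f}, r = {r:.2f}")
```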
Secure transport and adaptation of MC-EZBC video utilizing H.264-based transport protocols
Hellwagner, Hermann; Hofbauer, Heinz; Kuschnig, Robert; Stütz, Thomas; Uhl, Andreas
2012-01-01
Universal Multimedia Access (UMA) calls for solutions where content is created once and subsequently adapted to given requirements. With regard to UMA and scalability, which is required often due to a wide variety of end clients, the best suited codecs are wavelet based (like the MC-EZBC) due to their inherent high number of scaling options. However, most transport technologies for delivering videos to end clients are targeted toward the H.264/AVC standard or, if scalability is required, the H.264/SVC. In this paper we will introduce a mapping of the MC-EZBC bitstream to existing H.264/SVC based streaming and scaling protocols. This enables the use of highly scalable wavelet based codecs on the one hand and the utilization of already existing network technologies without accruing high implementation costs on the other hand. Furthermore, we will evaluate different scaling options in order to choose the best option for given requirements. Additionally, we will evaluate different encryption options based on transport and bitstream encryption for use cases where digital rights management is required. PMID:26869746
Maternal and neonatal implementation for equitable systems. A study design paper.
Ekirapa-Kiracho, Elizabeth; Tetui, Moses; Bua, John; Muhumuza Kananura, Rornald; Waiswa, Peter; Makumbi, Fred; Atuyambe, Lynn; Ajeani, Judith; George, Asha; Mutebi, Aloysuis; Kakaire, Ayub; Namazzi, Gertrude; Paina, Ligia; Namusoke Kiwanuka, Suzanne
2017-08-01
Evidence on effective ways of improving maternal and neonatal health outcomes is widely available. The challenge that most low-income countries grapple with is implementation at scale and sustainability. The study aimed at improving access to quality maternal and neonatal health services in a sustainable manner by using a participatory action research approach. The study consisted of a quasi-experimental design, with a participatory action research approach to implementation in three rural districts (Pallisa, Kibuku and Kamuli) in Eastern Uganda. The intervention had two main components: community empowerment for comprehensive birth preparedness, and health provider and management capacity-building. We collected data using both quantitative and qualitative methods using household and facility-level structured surveys, record reviews, key informant interviews and focus group discussions. We purposively selected the participants for the qualitative data collection, while for the surveys we interviewed all eligible participants in the sampled households and health facilities. Descriptive statistics were used to describe the data, while difference-in-differences analysis was used to measure the effect of the intervention. Qualitative data were analysed using thematic analysis. This study was implemented to generate evidence on how to increase access to quality maternal and newborn health services in a sustainable manner using a multisectoral participatory approach.
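The difference-in-differences estimate described above is conventionally obtained as the interaction term in an OLS regression; a minimal sketch with simulated placeholder data (not the study's survey data):

```python
# Difference-in-differences as an OLS interaction, the standard regression
# form of the analysis described above. The data frame is a hypothetical
# placeholder for household-survey outcomes.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 400
df = pd.DataFrame({
    "treated": rng.integers(0, 2, n),   # intervention vs. comparison district
    "post": rng.integers(0, 2, n),      # before vs. after implementation
})
# Simulate an outcome with a true intervention effect of 0.15
df["outcome"] = (0.4 + 0.05 * df.treated + 0.03 * df.post
                 + 0.15 * df.treated * df.post + rng.normal(0, 0.1, n))

model = smf.ols("outcome ~ treated * post", data=df).fit()
print(model.params["treated:post"])  # the difference-in-differences estimate
```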
Digimarc Discover on Google Glass
NASA Astrophysics Data System (ADS)
Rogers, Eliot; Rodriguez, Tony; Lord, John; Alattar, Adnan
2015-03-01
This paper reports on the implementation of the Digimarc® Discover platform on Google Glass, enabling the reading of a watermark embedded in a printed material or audio. The embedded watermark typically contains a unique code that identifies the containing media or object and a synchronization signal that allows the watermark to be read robustly. The Digimarc Discover smartphone application can read the watermark from a small portion of printed image presented at any orientation or reasonable distance. Likewise, Discover can read the recently introduced Digimarc Barcode to identify and manage consumer packaged goods in the retail channel. The Digimarc Barcode has several advantages over the traditional barcode and is expected to save the retail industry millions of dollars when deployed at scale. Discover can also read an audio watermark from ambient audio captured using a microphone. The Digimarc Discover platform has been widely deployed on the iPad, iPhone and many Android-based devices, but it has not yet been implemented on a head-worn wearable device, such as Google Glass. Implementing Discover on Google Glass is a challenging task due to the current hardware and software limitations of the device. This paper identifies the challenges encountered in porting Discover to the Google Glass and reports on the solutions created to deliver a prototype implementation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Porter, Russell
2009-09-10
This report presents results for year seventeen in the basin-wide Experimental Northern Pikeminnow Management Program to harvest northern pikeminnow (Ptychocheilus oregonensis) in the Columbia and Snake Rivers. This program was started in an effort to reduce predation by northern pikeminnow on juvenile salmonids during their emigration from natal streams to the ocean. Earlier work in the Columbia River Basin suggested predation by northern pikeminnow on juvenile salmonids might account for most of the 10-20% mortality juvenile salmonids experience in each of eight Columbia River and Snake River reservoirs. Modeling simulations based on work in John Day Reservoir from 1982 through 1988 indicated that, if predator-size northern pikeminnow were exploited at a 10-20% rate, the resulting restructuring of their population could reduce their predation on juvenile salmonids by 50%. To test this hypothesis, we implemented a sport-reward angling fishery and a commercial longline fishery in the John Day Pool in 1990. We also conducted an angling fishery in areas inaccessible to the public at four dams on the mainstem Columbia River and at Ice Harbor Dam on the Snake River. Based on the success of these limited efforts, we implemented three test fisheries on a system-wide scale in 1991 - a tribal longline fishery above Bonneville Dam, a sport-reward fishery, and a dam-angling fishery. Low catch of target fish and high cost of implementation resulted in discontinuation of the tribal longline fishery. However, the sport-reward and dam-angling fisheries were continued in 1992 and 1993. In 1992, we investigated the feasibility of implementing a commercial longline fishery in the Columbia River below Bonneville Dam and found that implementation of this fishery was also infeasible. Estimates of combined annual exploitation rates resulting from the sport-reward and dam-angling fisheries remained at the low end of our target range of 10-20%. This suggested the need for additional effective harvest techniques. During 1991 and 1992, we developed and tested a modified (small-sized) Merwin trapnet. We found this floating trapnet to be very effective in catching northern pikeminnow at specific sites. Consequently, in 1993 we examined a system-wide fishery using floating trapnets, but found this fishery to be ineffective at harvesting large numbers of northern pikeminnow on a system-wide scale. In 1994, we investigated the use of trap nets and gillnets at specific locations where concentrations of northern pikeminnow were known or suspected to occur during the spring season (i.e., March through early June). In addition, we initiated a concerted effort to increase public participation in the sport-reward fishery through a series of promotional and incentive activities. In 1995, 1996, and 1997, promotional activities and incentives were further improved based on the favorable response in 1994. Results of these efforts are subjects of this annual report. Evaluation of the success of test fisheries in achieving our target goal of a 10-20% annual exploitation rate on northern pikeminnow is presented in Report C of this report. Overall program success in terms of altering the size and age composition of the northern pikeminnow population and in terms of potential reductions in loss of juvenile salmonids to northern pikeminnow predation is also discussed in Report C.
Program cooperators include the Pacific States Marine Fisheries Commission (PSMFC), Oregon Department of Fish and Wildlife (ODFW), Washington Department of Fish and Wildlife (WDFW), and the U.S. Department of Agriculture (USDA), Animal Damage Unit, as a contractor to test dam angling. The PSMFC was responsible for coordination and administration of the program; PSMFC subcontracted various tasks and activities to ODFW and WDFW based on the expertise each brought to the tasks involved in implementing the program, and subcontracted dam angling to the USDA.
NASA Astrophysics Data System (ADS)
Tai, Y.; Watanabe, T.; Nagata, K.
2018-03-01
A mixing volume model (MVM) originally proposed for molecular diffusion in incompressible flows is extended as a model for molecular diffusion and thermal conduction in compressible turbulence. The model, established for implementation in Lagrangian simulations, is based on the interactions among spatially distributed notional particles within a finite volume. The MVM is tested with the direct numerical simulation of compressible planar jets with the jet Mach number ranging from 0.6 to 2.6. The MVM well predicts molecular diffusion and thermal conduction for a wide range of the size of mixing volume and the number of mixing particles. In the transitional region of the jet, where the scalar field exhibits a sharp jump at the edge of the shear layer, a smaller mixing volume is required for an accurate prediction of mean effects of molecular diffusion. The mixing time scale in the model is defined as the time scale of diffusive effects at a length scale of the mixing volume. The mixing time scale is well correlated for passive scalar and temperature. Probability density functions of the mixing time scale are similar for molecular diffusion and thermal conduction when the mixing volume is larger than a dissipative scale because the mixing time scale at small scales is easily affected by different distributions of intermittent small-scale structures between passive scalar and temperature. The MVM with an assumption of equal mixing time scales for molecular diffusion and thermal conduction is useful in the modeling of the thermal conduction when the modeling of the dissipation rate of temperature fluctuations is difficult.
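A much-simplified sketch of the particle-interaction idea: scalars carried by notional particles within one mixing volume relax toward the in-volume mean over a mixing time scale. The actual MVM formulation differs in detail; this is illustrative only.

```python
# Simplified sketch of a particle mixing step in the spirit of a mixing
# volume model: scalars on notional particles inside one mixing volume
# relax toward the in-volume mean over a mixing time scale tau_m.
# (The actual MVM differs in detail; this is an illustration.)
import numpy as np

def mix_step(phi: np.ndarray, tau_m: float, dt: float) -> np.ndarray:
    """Advance scalar values of particles in one mixing volume by dt."""
    mean = phi.mean()
    return phi + (dt / tau_m) * (mean - phi)  # mean-preserving relaxation

rng = np.random.default_rng(2)
phi = rng.uniform(0.0, 1.0, size=8)       # scalars on 8 notional particles
for _ in range(50):
    phi = mix_step(phi, tau_m=0.1, dt=0.005)
print(phi.std())  # variance decays while the mean is conserved
```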
Rašić, Gordana; Filipović, Igor; Weeks, Andrew R; Hoffmann, Ary A
2014-04-11
Genetic markers are widely used to understand the biology and population dynamics of disease vectors, but often markers are limited in the resolution they provide. In particular, the delineation of population structure, fine scale movement and patterns of relatedness are often obscured unless numerous markers are available. To address this issue in the major arbovirus vector, the yellow fever mosquito (Aedes aegypti), we used double digest Restriction-site Associated DNA (ddRAD) sequencing for the discovery of genome-wide single nucleotide polymorphisms (SNPs). We aimed to characterize the new SNP set and to test the resolution against previously described microsatellite markers in detecting broad and fine-scale genetic patterns in Ae. aegypti. We developed bioinformatics tools that support the customization of restriction enzyme-based protocols for SNP discovery. We showed that our approach for RAD library construction achieves unbiased genome representation that reflects true evolutionary processes. In Ae. aegypti samples from three continents we identified more than 18,000 putative SNPs. They were widely distributed across the three Ae. aegypti chromosomes, with 47.9% found in intergenic regions and 17.8% in exons of over 2,300 genes. Patterns of their imputed effects in ORFs and UTRs were consistent with those found in a recent transcriptome study. We demonstrated that individual mosquitoes from Indonesia, Australia, Vietnam and Brazil can be assigned with a very high degree of confidence to their region of origin using a large SNP panel. We also showed that familial relatedness of samples from a 0.4 km2 area could be confidently established with a subset of SNPs. Using a cost-effective customized RAD sequencing approach supported by our bioinformatics tools, we characterized over 18,000 SNPs in field samples of the dengue fever mosquito Ae. aegypti. The variants were annotated and positioned onto the three Ae. aegypti chromosomes. The new SNP set provided much greater resolution in detecting population structure and estimating fine-scale relatedness than a set of polymorphic microsatellites. RAD-based markers demonstrate great potential to advance our understanding of mosquito population processes, critical for implementing new control measures against this major disease vector.
Predicting Abandonment of School-Wide Behavior Support Interventions
ERIC Educational Resources Information Center
Nese, Rhonda N. T.; McIntosh, Kent; Nese, Joseph F. T.; Ghemraoui, Adam; Bloom, Jerry; Johnson, Nanci W.; Phillips, Danielle; Richter, Mary F.; Hoselton, Robert
2016-01-01
This study examines predictors of abandonment of evidence-based practices through descriptive analyses of extant state-level training data, fidelity of implementation data, and nationally reported school demographic data across 915 schools in 3 states implementing school-wide positive behavioral interventions and supports (SWPBIS). Schools…
Global and local waveform simulations using the VERCE platform
NASA Astrophysics Data System (ADS)
Garth, Thomas; Saleh, Rafiq; Spinuso, Alessandro; Gemund, Andre; Casarotti, Emanuele; Magnoni, Federica; Krischner, Lion; Igel, Heiner; Schlichtweg, Horst; Frank, Anton; Michelini, Alberto; Vilotte, Jean-Pierre; Rietbrock, Andreas
2017-04-01
In recent years the potential to increase resolution of seismic imaging by full waveform inversion has been demonstrated on a range of scales from basin to continental scales. These techniques rely on harnessing the computational power of large supercomputers, and running large parallel codes to simulate the seismic wave field in a three-dimensional geological setting. The VERCE platform is designed to make these full waveform techniques accessible to a far wider spectrum of the seismological community. The platform supports the two widely used spectral element simulation programs SPECFEM3D Cartesian and SPECFEM3D globe, allowing users to run a wide range of simulations. In the SPECFEM3D Cartesian implementation the user can run waveform simulations on a range of pre-loaded meshes and velocity models for specific areas, or upload their own velocity model and mesh. In the new SPECFEM3D globe implementation, the user will be able to select from a number of continent scale model regions, or perform waveform simulations for the whole earth. Earthquake focal mechanisms can be downloaded within the platform, for example from the GCMT catalogue, or users can upload their own focal mechanism catalogue through the platform. The simulations can be run on a range of European supercomputers in the PRACE network. Once a job has been submitted and run through the platform, the simulated waveforms can be manipulated or downloaded for further analysis. The misfit between the simulated and recorded waveforms can then be calculated within the platform through three interoperable workflows: raw-data access (FDSN) and caching, pre-processing, and misfit calculation; the last workflow makes use of the Pyflex analysis software. In addition, the VERCE platform can be used to produce animations of waveform propagation through the velocity model, and synthetic shakemaps. All these data products are made discoverable and re-usable thanks to the VERCE data and metadata management layer. We demonstrate the functionality of the VERCE platform with two use cases, one using the pre-loaded velocity model and mesh for the Maule area of Chile using the SPECFEM3D Cartesian workflow, and one showing the output of a global simulation using the SPECFEM3D globe workflow. It is envisioned that this tool will allow a much greater range of seismologists to access these full waveform inversion tools, and aid full waveform tomographic and source inversion, synthetic shakemap production and other full waveform applications, in a wide range of tectonic settings.
2012-01-01
Background Task-shifting is promoted widely as a mechanism for expanding antiretroviral treatment (ART) access. However, the evidence for nurse-initiated and managed ART (NIMART) in Africa is limited, and little is known about the key barriers and enablers to implementing NIMART programmes on a large scale. The STRETCH (Streamlining Tasks and Roles to Expand Treatment and Care for HIV) programme was a complex educational and organisational intervention implemented in the Free State Province of South Africa to enable nurses providing primary HIV/AIDS care to expand their roles and include aspects of care and treatment usually provided by physicians. STRETCH used a phased implementation approach and ART treatment guidelines tailored specifically to nurses. The effects of STRETCH on pre-ART mortality, ART provision, and the quality of HIV/ART care were evaluated through a randomised controlled trial. This study was conducted alongside the trial to develop a contextualised understanding of factors affecting the implementation of the programme. Methods This study was a qualitative process evaluation using in-depth interviews and focus group discussions with patients, health workers, health managers, and other key informants as well as observation in clinics. Research questions focused on perceptions of STRETCH, changes in health provider roles, attitudes and patient relationships, and impact of the implementation context on trial outcomes. Data were analysed collaboratively by the research team using thematic analysis. Results NIMART appears to be highly acceptable among nurses, patients, and physicians. Managers and nurses expressed confidence in their ability to deliver ART successfully. This confidence developed slowly and unevenly, through a phased and well-supported approach that guided nurses through training, re-prescription, and initiation. The research also shows that NIMART changes the working and referral relationships between health staff, demands significant training and support, and faces workload and capacity constraints, and logistical and infrastructural challenges. Conclusions Large-scale NIMART appears to be feasible and acceptable in the primary level public sector health services in South Africa. Successful implementation requires a comprehensive approach with: an incremental and well supported approach to implementation; clinical guidelines tailored to nurses; and significant health services reorganisation to accommodate the knock-on effects of shifts in practice. Trial registration ISRCTN46836853 PMID:22800379
PyHLA: tests for the association between HLA alleles and diseases.
Fan, Yanhui; Song, You-Qiang
2017-02-06
Recently, several tools have been designed for human leukocyte antigen (HLA) typing using single nucleotide polymorphism (SNP) array and next-generation sequencing (NGS) data. These tools provide high-throughput and cost-effective approaches for identifying HLA types. Therefore, tools for downstream association analysis are highly desirable. Although several tools have been designed for multi-allelic marker association analysis, they were designed only for microsatellite markers and do not scale well with increasing data volumes, or they were designed for large-scale data but provided a limited number of tests. We have developed a Python package called PyHLA, which implements several methods for HLA association analysis, to fill the gap. PyHLA is a tailor-made, easy to use, and flexible tool designed specifically for the association analysis of the HLA types imputed from genome-wide genotyping and NGS data. PyHLA provides functions for association analysis, zygosity tests, and interaction tests between HLA alleles and diseases. Monte Carlo permutation and several methods for multiple testing corrections have also been implemented. PyHLA provides a convenient and powerful tool for HLA analysis. Existing methods have been integrated and desired methods have been added in PyHLA. Furthermore, PyHLA is applicable to small and large sample sizes and can finish the analysis in a timely manner on a personal computer with different platforms. PyHLA is implemented in Python. PyHLA is a free, open source software distributed under the GPLv2 license. The source code, tutorial, and examples are available at https://github.com/felixfan/PyHLA.
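PyHLA's exact command-line interface is not reproduced here; the sketch below illustrates the kind of test it implements, a carrier-count association for a single HLA allele with a Monte Carlo permutation p-value, on synthetic data:

```python
# Illustration of an allele-disease association test with a Monte Carlo
# permutation p-value (the style of test PyHLA provides). Counts synthetic.
import numpy as np
from scipy.stats import chi2_contingency

rng = np.random.default_rng(3)
status = np.array([1] * 200 + [0] * 300)            # 200 cases, 300 controls
carrier = rng.random(500) < np.where(status == 1, 0.30, 0.20)  # allele carriage

def chi2_stat(status, carrier):
    table = [[np.sum((status == s) & (carrier == c)) for c in (0, 1)] for s in (0, 1)]
    return chi2_contingency(table)[0]

obs = chi2_stat(status, carrier)
perms = np.array([chi2_stat(rng.permutation(status), carrier) for _ in range(2000)])
p_perm = (1 + np.sum(perms >= obs)) / (1 + len(perms))
print(f"chi2 = {obs:.2f}, permutation p = {p_perm:.4f}")
```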
PARALLEL HOP: A SCALABLE HALO FINDER FOR MASSIVE COSMOLOGICAL DATA SETS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Skory, Stephen; Turk, Matthew J.; Norman, Michael L.
2010-11-15
Modern N-body cosmological simulations contain billions (10⁹) of dark matter particles. These simulations require hundreds to thousands of gigabytes of memory and employ hundreds to tens of thousands of processing cores on many compute nodes. In order to study the distribution of dark matter in a cosmological simulation, the dark matter halos must be identified using a halo finder, which establishes the halo membership of every particle in the simulation. The resources required for halo finding are similar to the requirements for the simulation itself. In particular, simulations have become too extensive to use commonly employed halo finders, such that the computational requirements to identify halos must now be spread across multiple nodes and cores. Here, we present a scalable-parallel halo finding method called Parallel HOP for large-scale cosmological simulation data. Based on the halo finder HOP, it utilizes message passing interface and domain decomposition to distribute the halo finding workload across multiple compute nodes, enabling analysis of much larger data sets than is possible with the strictly serial or previous parallel implementations of HOP. We provide a reference implementation of this method as a part of the toolkit yt, an analysis toolkit for adaptive mesh refinement data that includes complementary analysis modules. Additionally, we discuss a suite of benchmarks that demonstrate that this method scales well up to several hundred tasks and data sets in excess of 2000³ particles. The Parallel HOP method and our implementation can be readily applied to any kind of N-body simulation data and is therefore widely applicable.
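A toy illustration of the domain-decomposition idea (not the Parallel HOP implementation): particles are assigned to padded spatial subdomains so that halos straddling a boundary are seen whole by at least one task. Positions, splits, and padding below are synthetic placeholders.

```python
# Toy version of the domain decomposition behind parallel halo finding:
# each task handles one subdomain plus a padding region around it.
import numpy as np

rng = np.random.default_rng(4)
pos = rng.random((100_000, 3))            # particle positions in a unit box
n_split, pad = 2, 0.05                    # 2 slabs per axis, 5% padding

def subdomain_particles(pos, ix, iy, iz):
    """Return indices of particles in subdomain (ix, iy, iz) plus its pads."""
    mask = np.ones(len(pos), dtype=bool)
    for axis, i in zip(range(3), (ix, iy, iz)):
        lo, hi = i / n_split - pad, (i + 1) / n_split + pad
        mask &= (pos[:, axis] >= lo) & (pos[:, axis] < hi)
    return np.nonzero(mask)[0]

ids = subdomain_particles(pos, 0, 0, 0)
print(len(ids), "of", len(pos), "particles handled by task (0,0,0)")
```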
Lake Nutrient Responses to Integrated Conservation Practices in an Agricultural Watershed.
Lizotte, Richard E; Yasarer, Lindsey M W; Locke, Martin A; Bingner, Ronald L; Knight, Scott S
2017-03-01
Watershed-scale management efforts to reduce nutrient loads and improve the conservation of lakes in agricultural watersheds require effective integration of a variety of agricultural conservation best management practices (BMPs). This paper documents watershed-scale assessments of the influence of multiple integrated BMPs on oxbow lake nutrient concentrations in a 625-ha watershed of intensive row-crop agricultural activity during a 14-yr monitoring period (1996-2009). A suite of BMPs within fields and at field edges throughout the watershed and enrollment of 87 ha into the Conservation Reserve Program (CRP) were implemented from 1995 to 2006. Total phosphorus (TP), soluble reactive phosphorus (SRP), ammonium, and nitrate were measured approximately biweekly from 1996 to 2009, and total nitrogen (TN) was measured from 2001 to 2009. Decreases in several lake nutrient concentrations occurred after BMP implementation. Reductions in TP lake concentrations were associated with vegetative buffers and rainfall. No consistent patterns of changes in TN or SRP lake concentrations were observed. Reductions in ammonium lake concentrations were associated with conservation tillage and CRP. Reductions in nitrate lake concentrations were associated with vegetative buffers. Watershed simulations conducted with the AnnAGNPS (Annualized Agricultural Non-Point Source) model with and without BMPs also show a clear reduction in TN and TP loads to the lake after the implementation of BMPs. These results provide direct evidence of how watershed-wide BMPs assist in reducing nutrient loading in aquatic ecosystems and promote a more viable and sustainable lake ecosystem.
Wangsness, David J.
1997-01-01
In the 1980s it was determined that existing ambient and compliance-monitoring data could not satisfactorily evaluate the results of hundreds of billions of dollars spent for water-pollution abatement in the United States. At the request of the US Congress, a new programme, the National Water-Quality Assessment, was designed and implemented by a government agency, the US Geological Survey (USGS). The Assessment has reported status and trends in surface- and ground-water quality at national, regional, and local scales since 1991. The legislative basis for US monitoring and data-sharing policies is identified, as are the successive phases of the design and implementation of the USGS Assessment. Application to the Danube Basin is suggested. Much of the water-quality monitoring conducted in the United States is designed to comply with Federal and State laws mandated primarily by the Clean Water Act of 1987 and the Safe Drinking Water Act of 1986. Monitoring programs generally focus on rivers upstream and downstream of point-source discharges and at water-supply intakes. Few data are available for aquifer systems, and chemical analyses are often limited to those constituents required by law. In most cases, the majority of the available chemical and streamflow data have provided the information necessary to meet the objectives of the compliance-monitoring programs, but do not necessarily provide the information required for basin-wide assessments of water quality at the local, regional, or national scale.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peterson, J. R.; Peng, E.; Ahmad, Z.
2015-05-15
We present a comprehensive methodology for the simulation of astronomical images from optical survey telescopes. We use a photon Monte Carlo approach to construct images by sampling photons from models of astronomical source populations, and then simulating those photons through the system as they interact with the atmosphere, telescope, and camera. We demonstrate that all physical effects for optical light that determine the shapes, locations, and brightnesses of individual stars and galaxies can be accurately represented in this formalism. By using large scale grid computing, modern processors, and an efficient implementation that can produce 400,000 photons s⁻¹, we demonstrate that even very large optical surveys can now be simulated. We demonstrate that we are able to (1) construct kilometer scale phase screens necessary for wide-field telescopes, (2) reproduce atmospheric point-spread function moments using a fast novel hybrid geometric/Fourier technique for non-diffraction limited telescopes, (3) accurately reproduce the expected spot diagrams for complex aspheric optical designs, and (4) recover system effective area predicted from analytic photometry integrals. This new code, the Photon Simulator (PhoSim), is publicly available. We have implemented the Large Synoptic Survey Telescope design, and it can be extended to other telescopes. We expect that because of the comprehensive physics implemented in PhoSim, it will be used by the community to plan future observations, interpret detailed existing observations, and quantify systematics related to various astronomical measurements. Future development and validation by comparisons with real data will continue to improve the fidelity and usability of the code.
Swan, D; Hannigan, A; Higgins, S; McDonnell, R; Meagher, D; Cullen, W
2017-02-01
In Ireland, as in many other healthcare systems, mental health service provision is being reconfigured with a move toward more care in the community, and particularly primary care. Recording and surveillance systems for mental health information and activities in primary care are needed for service planning and quality improvement. We describe the development and initial implementation of a software tool ('mental health finder') within a widely used primary care electronic medical record system (EMR) in Ireland to enable large-scale data collection on the epidemiology and management of mental health and substance use problems among patients attending general practice. In collaboration with the Irish Primary Care Research Network (IPCRN), we developed the 'Mental Health Finder' as a software plug-in to a commonly used primary care EMR system to facilitate data collection on mental health diagnoses and pharmacological treatments among patients. The finder searches for and identifies patients based on diagnostic coding and/or prescribed medicines. It was initially implemented among a convenience sample of six GP practices. Prevalence of mental health and substance use problems across the six practices, as identified by the finder, was 9.4% (range 6.9-12.7%). 61.9% of identified patients were female; 25.8% were private patients. One-third (33.4%) of identified patients were prescribed more than one class of psychotropic medication. Of the patients identified by the finder, 89.9% were identifiable via prescribing data, 23.7% via diagnostic coding. The finder is a feasible and promising methodology for large-scale data collection on mental health problems in primary care.
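The finder's core selection logic, identifying patients by diagnostic coding and/or prescribed medicines, can be sketched as below; the column names and code lists are hypothetical placeholders, not the actual EMR schema.

```python
# Sketch of the finder's selection logic in pandas: flag a patient if a
# mental-health diagnostic code OR a psychotropic prescription is present.
# Column names and code lists are hypothetical, not the real EMR schema.
import pandas as pd

records = pd.DataFrame({
    "patient_id": [1, 2, 3, 4],
    "icd_code": ["F32", None, "J45", None],          # F-chapter = mental health
    "drug_class": [None, "SSRI", None, "statin"],
})
PSYCHOTROPIC = {"SSRI", "SNRI", "benzodiazepine", "antipsychotic"}

hit = (records.icd_code.str.startswith("F", na=False)
       | records.drug_class.isin(PSYCHOTROPIC))
found = records.loc[hit, "patient_id"].unique()
print(found, f"prevalence = {len(found) / records.patient_id.nunique():.1%}")
```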
LASSIE: simulating large-scale models of biochemical systems on GPUs.
Tangherloni, Andrea; Nobile, Marco S; Besozzi, Daniela; Mauri, Giancarlo; Cazzaniga, Paolo
2017-05-10
Mathematical modeling and in silico analysis are widely acknowledged as complementary tools to biological laboratory methods, to achieve a thorough understanding of emergent behaviors of cellular processes in both physiological and perturbed conditions. However, the simulation of large-scale models, consisting of hundreds or thousands of reactions and molecular species, can rapidly overtake the capabilities of Central Processing Units (CPUs). The purpose of this work is to exploit alternative high-performance computing solutions, such as Graphics Processing Units (GPUs), to allow the investigation of these models at reduced computational costs. LASSIE is a "black-box" GPU-accelerated deterministic simulator, specifically designed for large-scale models and not requiring any expertise in mathematical modeling, simulation algorithms or GPU programming. Given a reaction-based model of a cellular process, LASSIE automatically generates the corresponding system of Ordinary Differential Equations (ODEs), assuming mass-action kinetics. The numerical solution of the ODEs is obtained by automatically switching between the Runge-Kutta-Fehlberg method in the absence of stiffness, and the Backward Differentiation Formulae of first order in presence of stiffness. The computational performance of LASSIE is assessed using a set of randomly generated synthetic reaction-based models of increasing size, ranging from 64 to 8192 reactions and species, and compared to a CPU implementation of the LSODA numerical integration algorithm. LASSIE adopts a novel fine-grained parallelization strategy to distribute on the GPU cores all the calculations required to solve the system of ODEs. By virtue of this implementation, LASSIE achieves up to 92× speed-up with respect to LSODA, therefore reducing the running time from approximately 1 month down to 8 h to simulate models consisting of, for instance, four thousand reactions and species. Notably, thanks to its smaller memory footprint, LASSIE is able to perform fast simulations of even larger models, whereby the tested CPU implementation of LSODA failed to reach termination. LASSIE is therefore expected to make an important breakthrough in Systems Biology applications, for the execution of faster and in-depth computational analyses of large-scale models of complex biological systems.
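A CPU-bound sketch of what LASSIE automates: deriving mass-action ODEs from a reaction list and integrating them. SciPy's LSODA, like LASSIE, switches between non-stiff and stiff integration as needed. The toy three-species model is illustrative, not from the paper.

```python
# Derive mass-action ODEs from a reaction list and integrate with LSODA.
import numpy as np
from scipy.integrate import solve_ivp

# Reactions as (reactant indices, product indices, rate constant); a toy
# 3-species cascade, not a model from the paper.
reactions = [((0,), (1,), 1.0),      # A -> B
             ((1, 1), (2,), 0.5),    # B + B -> C
             ((2,), (0,), 0.1)]      # C -> A

def rhs(t, x):
    dx = np.zeros_like(x)
    for reactants, products, k in reactions:
        rate = k * np.prod([x[i] for i in reactants])  # mass-action kinetics
        for i in reactants:
            dx[i] -= rate
        for i in products:
            dx[i] += rate
    return dx

sol = solve_ivp(rhs, (0.0, 50.0), y0=[10.0, 0.0, 0.0], method="LSODA")
print(sol.y[:, -1])  # species concentrations at t = 50
```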
Data processing, multi-omic pathway mapping, and metabolite activity analysis using XCMS Online
Forsberg, Erica M; Huan, Tao; Rinehart, Duane; Benton, H Paul; Warth, Benedikt; Hilmers, Brian; Siuzdak, Gary
2018-01-01
Systems biology is the study of complex living organisms, and as such, analysis on a systems-wide scale involves the collection of information-dense data sets that are representative of an entire phenotype. To uncover dynamic biological mechanisms, bioinformatics tools have become essential to facilitating data interpretation in large-scale analyses. Global metabolomics is one such method for performing systems biology, as metabolites represent the downstream functional products of ongoing biological processes. We have developed XCMS Online, a platform that enables online metabolomics data processing and interpretation. A systems biology workflow recently implemented within XCMS Online enables rapid metabolic pathway mapping using raw metabolomics data for investigating dysregulated metabolic processes. In addition, this platform supports integration of multi-omic (such as genomic and proteomic) data to garner further systems-wide mechanistic insight. Here, we provide an in-depth procedure showing how to effectively navigate and use the systems biology workflow within XCMS Online without a priori knowledge of the platform, including uploading liquid chromatography (LC)-mass spectrometry (MS) data from metabolite-extracted biological samples, defining the job parameters to identify features, correcting for retention time deviations, conducting statistical analysis of features between sample classes and performing predictive metabolic pathway analysis. Additional multi-omics data can be uploaded and overlaid with previously identified pathways to enhance systems-wide analysis of the observed dysregulations. We also describe unique visualization tools to assist in elucidation of statistically significant dysregulated metabolic pathways. Parameter input takes 5–10 min, depending on user experience; data processing typically takes 1–3 h, and data analysis takes ~30 min. PMID:29494574
fastBMA: scalable network inference and transitive reduction.
Hung, Ling-Hong; Shi, Kaiyuan; Wu, Migao; Young, William Chad; Raftery, Adrian E; Yeung, Ka Yee
2017-10-01
Inferring genetic networks from genome-wide expression data is extremely demanding computationally. We have developed fastBMA, a distributed, parallel, and scalable implementation of Bayesian model averaging (BMA) for this purpose. fastBMA also includes a computationally efficient module for eliminating redundant indirect edges in the network by mapping the transitive reduction to an easily solved shortest-path problem. We evaluated the performance of fastBMA on synthetic data and experimental genome-wide time series yeast and human datasets. When using a single CPU core, fastBMA is up to 100 times faster than the next fastest method, LASSO, with increased accuracy. It is a memory-efficient, parallel, and distributed application that scales to human genome-wide expression data. A 10 000-gene regulation network can be obtained in a matter of hours using a 32-core cloud cluster (2 nodes of 16 cores). fastBMA is a significant improvement over its predecessor ScanBMA. It is more accurate and orders of magnitude faster than other fast network inference methods such as the one based on LASSO. The improved scalability allows it to calculate networks from genome scale data in a reasonable time frame. The transitive reduction method can improve accuracy in denser networks. fastBMA is available as code (M.I.T. license) from GitHub (https://github.com/lhhunghimself/fastBMA), as part of the updated networkBMA Bioconductor package (https://www.bioconductor.org/packages/release/bioc/html/networkBMA.html) and as ready-to-deploy Docker images (https://hub.docker.com/r/biodepot/fastbma/).
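The transitive-reduction idea, dropping a direct edge when an indirect path explains it at least as well, can be sketched with a shortest-path test; this illustrates the mapping, not fastBMA's exact criterion:

```python
# Shortest-path pruning of redundant indirect edges: a direct edge u->v is
# dropped when an indirect route is no longer than the edge itself under a
# distance-like weight. A sketch of the idea, not fastBMA's exact criterion.
import networkx as nx

g = nx.DiGraph()
g.add_weighted_edges_from([("a", "b", 1.0), ("b", "c", 1.0), ("a", "c", 2.5)])

def prune_redundant(g):
    for u, v, w in list(g.edges(data="weight")):
        g.remove_edge(u, v)
        try:
            indirect = nx.shortest_path_length(g, u, v, weight="weight")
        except nx.NetworkXNoPath:
            indirect = float("inf")
        if indirect > w:        # no indirect path explains the edge: keep it
            g.add_edge(u, v, weight=w)
    return g

print(prune_redundant(g).edges())  # a->c removed: a->b->c (2.0) covers it
```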
Monitoring and evaluating the quality of cancer care in Japan using administrative claims data.
Iwamoto, Momoko; Nakamura, Fumiaki; Higashi, Takahiro
2016-01-01
The importance of measuring the quality of cancer care has been well recognized in many developed countries, but has never been successfully implemented on a national level in Japan. We sought to establish a wide-scale quality monitoring and evaluation program for cancer by measuring 13 process-of-care quality indicators (QI) using a registry-linked claims database. We measured two QI on pre-treatment testing, nine on adherence to clinical guidelines on therapeutic treatments, and two on supportive care, for breast, prostate, colorectal, stomach, lung, liver and cervical cancer patients who were diagnosed in 2011 from 178 hospitals. We further assessed the reasons for non-adherence for patients who did not receive the indicated care in 26 hospitals. QI for pretreatment testing were high in most hospitals (above 80%), but scores on adjuvant radiation and chemoradiation therapies were low (20-37%), except for breast cancer (74%). QI for adjuvant chemotherapy and supportive care were more widely distributed across hospitals (45-68%). Further analysis in 26 hospitals showed that most of the patients who did not receive adjuvant chemotherapy had clinically valid reasons for not receiving the specified care (above 70%), but the majority of the patients did not have sufficient reasons for not receiving adjuvant radiotherapy (52-69%) and supportive care (above 80%). We present here the first wide-scale quality measurement initiative of cancer patients in Japan. Patients without clinically valid reasons for non-adherence should be examined further in future to improve care. © 2015 The Authors. Cancer Science published by Wiley Publishing Asia Pty Ltd on behalf of Japanese Cancer Association.
Supporting Parent Engagement in Programme-Wide Behavioural Intervention Implementation
ERIC Educational Resources Information Center
Cummings, Katrina P.
2017-01-01
Positive behaviour intervention and support (PBIS) models are evolving as an effective means to promote social and emotional competence among young children and address challenging behaviours. This study was designed to gain insights into parental involvement in programme-wide implementation of the "Pyramid" model. Interviews were…
Predicting Abandonment of School-Wide Positive Behavioral Interventions and Supports
ERIC Educational Resources Information Center
Nese, Rhonda; McIntosh, Kent; Nese, Joseph; Hoselton, Robert; Bloom, Jerry; Johnson, Nanci; Richter, Mary; Phillips, Danielle; Ghemraoui, Adam
2016-01-01
This study examines predictors of abandonment of evidence-based practices through descriptive analyses of extant state-level training data, fidelity of implementation data, and nationally reported school demographic data across 915 schools in three states implementing school-wide positive behavioral interventions and supports (SWPBIS). Schools…
Scaling of economic benefits from green roof implementation in Washington, DC.
Niu, Hao; Clark, Corrie; Zhou, Jiti; Adriaens, Peter
2010-06-01
Green roof technology is recognized for mitigating stormwater runoff and energy consumption. Methods to overcome the cost gap between green roofs and conventional roofs were recently quantified by incorporating air quality benefits. This study investigates the impact of scaling on these benefits at the city-wide scale using Washington, DC as a test bed because of the proposed targets in the 20-20-20 vision (20 million ft(2) by 2020) articulated by Casey Trees, a nonprofit organization. Building-specific stormwater benefits were analyzed assuming two proposed policy scenarios for stormwater fees ranging from 35 to 50% reduction for green roof implementation. Heat flux calculations were used to estimate building-specific energy savings for commercial buildings. To assess benefits at the city scale, stormwater infrastructure savings were based on operational savings and size reduction due to reduced stormwater volume generation. Scaled energy infrastructure benefits were calculated using two size reductions methods for air conditioners. Avoided carbon dioxide, nitrogen oxide (NO(x)), and sulfur dioxide emissions were based on reductions in electricity and natural gas consumption. Lastly, experimental and fugacity-based estimates were used to quantify the NO(x) uptake by green roofs, which was translated to health benefits using U.S. Environmental Protection Agency models. The results of the net present value (NPV) analysis showed that stormwater infrastructure benefits totaled $1.04 million (M), while fee-based stormwater benefits were $0.22-0.32 M/y. Energy savings were $0.87 M/y, while air conditioner resizing benefits were estimated at $0.02 to $0.04 M/y and avoided emissions benefits (based on current emission trading values) were $0.09 M-0.41 M/y. Over the lifetime of the green roof (40 years), the NPV is about 30-40% less than that of conventional roofs (not including green roof maintenance costs). These considerable benefits, in concert with current and emerging policy frameworks, may facilitate future adoption of this technology.
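The 40-year NPV comparison above reduces to straightforward discounting of the annual benefit streams. A minimal Python sketch, with an assumed discount rate (the abstract does not state the one used) and mid-range benefit figures taken from the ranges reported above:

    def npv(cash_flows, rate=0.04):
        # Discount yearly cash flows (year 1..n) to present value.
        return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

    # $M/y, midpoints of the ranges quoted in the abstract; the rate is illustrative.
    annual_benefit = 0.27 + 0.87 + 0.03 + 0.25   # stormwater fees + energy + resizing + emissions
    print(f"NPV of 40 y of benefits: ${npv([annual_benefit] * 40):.2f} M")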
Multi-scale Visualization of Molecular Architecture Using Real-Time Ambient Occlusion in Sculptor.
Wahle, Manuel; Wriggers, Willy
2015-10-01
The modeling of large biomolecular assemblies relies on an efficient rendering of their hierarchical architecture across a wide range of spatial levels of detail. We describe a paradigm shift currently under way in computer graphics towards the use of more realistic global illumination models, and we apply the so-called ambient occlusion approach to our open-source multi-scale modeling program, Sculptor. While there are many other higher quality global illumination approaches going all the way up to full GPU-accelerated ray tracing, they do not provide size-specificity of the features they shade. Ambient occlusion is an aspect of global lighting that offers great visual benefits and powerful user customization. By estimating how other molecular shape features affect the reception of light at some surface point, it effectively simulates indirect shadowing. This effect occurs between molecular surfaces that are close to each other, or in pockets such as protein or ligand binding sites. By adding ambient occlusion, large macromolecular systems look much more natural, and the perception of characteristic surface features is strongly enhanced. In this work, we present a real-time implementation of screen space ambient occlusion that delivers realistic cues about tunable spatial scale characteristics of macromolecular architecture. Heretofore, the visualization of large biomolecular systems, comprising e.g. hundreds of thousands of atoms or Mega-Dalton size electron microscopy maps, did not take into account the length scales of interest or the spatial resolution of the data. Our approach has been uniquely customized with shading that is tuned for pockets and cavities of a user-defined size, making it useful for visualizing molecular features at multiple scales of interest. This is a feature that none of the conventional ambient occlusion approaches provide. Actual Sculptor screen shots illustrate how our implementation supports the size-dependent rendering of molecular surface features.
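A toy screen-space version of the idea can be sketched in a few lines of Python: for each pixel of a depth buffer, estimate occlusion as the fraction of nearby samples that lie in front of it, with the sampling radius playing the role of Sculptor's user-tunable feature size. This illustrates the general SSAO principle only, not Sculptor's shader.

    import numpy as np

    def ssao(depth, radius=4, n_samples=16, bias=0.02, seed=0):
        # depth: 2-D buffer, smaller = closer to the camera.
        rng = np.random.default_rng(seed)
        occ = np.zeros_like(depth)
        for dy, dx in rng.integers(-radius, radius + 1, size=(n_samples, 2)):
            shifted = np.roll(np.roll(depth, dy, axis=0), dx, axis=1)
            occ += (shifted < depth - bias)       # neighbour in front -> occludes
        return 1.0 - occ / n_samples              # 1 = fully lit, 0 = fully dark

    depth = np.ones((64, 64))
    depth[24:40, 24:40] = 0.5                     # a raised block on a flat floor
    print(ssao(depth)[30, 20:45].round(2))        # floor darkens at the block's base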
Cornejo, Pablo K; Zhang, Qiong; Mihelcic, James R
2016-07-05
Energy and resource consumptions required to treat and transport wastewater have led to efforts to improve the environmental sustainability of wastewater treatment plants (WWTPs). Resource recovery can reduce the environmental impact of these systems; however, limited research has considered how the scale of implementation impacts the sustainability of WWTPs integrated with resource recovery. Accordingly, this research uses life cycle assessment (LCA) to evaluate how the scale of implementation impacts the environmental sustainability of wastewater treatment integrated with water reuse, energy recovery, and nutrient recycling. Three systems were selected: a septic tank with aerobic treatment at the household scale, an advanced water reclamation facility at the community scale, and an advanced water reclamation facility at the city scale. Three sustainability indicators were considered: embodied energy, carbon footprint, and eutrophication potential. This study determined that as with economies of scale, there are benefits to centralization of WWTPs with resource recovery in terms of embodied energy and carbon footprint; however, the community scale was shown to have the lowest eutrophication potential. Additionally, technology selection, nutrient control practices, system layout, and topographical conditions may have a larger impact on environmental sustainability than the implementation scale in some cases.
Uneke, Chigozie Jesse; Sombie, Issiaka; Keita, Namoudou; Lokossou, Virgil; Johnson, Ermel; Ongolo-Zogo, Pierre
2016-01-01
The introduction of implementation science into maternal, newborn and child health (MNCH) research has facilitated better methods to improve uptake of research findings into practices. With increase in implementation research related to MNCH world-wide, stronger scientific evidence are now available and have improved MNCH policies in many countries including Nigeria. The purpose of this study was to review MNCH implementation studies undertaken in Nigeria in order to understand the extent the evidence generated informed better policy. This study was a systematic review. A MEDLINE Entrez PubMed search was performed in August 2015 and implementation studies that investigated MNCH in Nigeria from 1966 to 2015 in relation to health policy were sought. Search key words included Nigeria, health policy, maternal, newborn, and child health. Only policy relevant studies that were implementation or intervention research which generated evidence to improve MNCH in Nigeria were eligible and were selected. A total of 18 relevant studies that fulfilled the study inclusion criteria were identified out of 471 studies found. These studies generated high quality policy relevance evidence relating to task shifting, breastfeeding practices, maternal nutrition, childhood immunization, kangaroo mother care (KMC), prevention of maternal to child transmission of HIV, etc. These indicated significant improvements in maternal health outcomes in localities and health facilities where the studies were undertaken. There is a dire need for more implementation research related to MNCH in low income settings because the priority for improved MNCH outcome is not so much the development of new technologies but solving implementation issues, such as how to scale up and evaluate interventions within complex health systems.
Threats to sandy beach ecosystems: A review
NASA Astrophysics Data System (ADS)
Defeo, Omar; McLachlan, Anton; Schoeman, David S.; Schlacher, Thomas A.; Dugan, Jenifer; Jones, Alan; Lastra, Mariano; Scapini, Felicita
2009-01-01
We provide a brief synopsis of the unique physical and ecological attributes of sandy beach ecosystems and review the main anthropogenic pressures acting on the world's single largest type of open shoreline. Threats to beaches arise from a range of stressors which span a spectrum of impact scales from localised effects (e.g. trampling) to a truly global reach (e.g. sea-level rise). These pressures act at multiple temporal and spatial scales, translating into ecological impacts that are manifested across several dimensions in time and space so that today almost every beach on every coastline is threatened by human activities. Press disturbances (whatever the impact source involved) are becoming increasingly common, operating on time scales of years to decades. However, long-term data sets that describe either the natural dynamics of beach systems or the human impacts on beaches are scarce and fragmentary. A top priority is to implement long-term field experiments and monitoring programmes that quantify the dynamics of key ecological attributes on sandy beaches. Because of the inertia associated with global climate change and human population growth, no realistic management scenario will alleviate these threats in the short term. The immediate priority is to avoid further development of coastal areas likely to be directly impacted by retreating shorelines. There is also scope for improvement in experimental design to better distinguish natural variability from anthropogenic impacts. Sea-level rise and other effects of global warming are expected to intensify other anthropogenic pressures, and could cause unprecedented ecological impacts. The definition of the relevant scales of analysis, which will vary according to the magnitude of the impact and the organisational level under analysis, and the recognition of a physical-biological coupling at different scales, should be included in approaches to quantify impacts. Zoning strategies and marine reserves, which have not been widely implemented in sandy beaches, could be a key tool for biodiversity conservation and should also facilitate spillover effects into adjacent beach habitats. Setback and zoning strategies need to be enforced through legislation, and all relevant stakeholders should be included in the design, implementation and institutionalisation of these initiatives. New perspectives for rational management of sandy beaches require paradigm shifts, by including not only basic ecosystem principles, but also incentives for effective governance and sharing of management roles between government and local stakeholders.
Turbulence closure for mixing length theories
NASA Astrophysics Data System (ADS)
Jermyn, Adam S.; Lesaffre, Pierre; Tout, Christopher A.; Chitre, Shashikumar M.
2018-05-01
We present an approach to turbulence closure based on mixing length theory with three-dimensional fluctuations against a two-dimensional background. This model is intended to be rapidly computable for implementation in stellar evolution software and to capture a wide range of relevant phenomena with just a single free parameter, namely the mixing length. We incorporate magnetic, rotational, baroclinic, and buoyancy effects exactly within the formalism of linear growth theories with non-linear decay. We treat differential rotation effects perturbatively in the corotating frame using a novel controlled approximation, which matches the time evolution of the reference frame to arbitrary order. We then implement this model in an efficient open source code and discuss the resulting turbulent stresses and transport coefficients. We demonstrate that this model exhibits convective, baroclinic, and shear instabilities as well as the magnetorotational instability. It also exhibits non-linear saturation behaviour, and we use this to extract the asymptotic scaling of various transport coefficients in physically interesting limits.
A System-Level Approach to Overweight and Obesity in the Veterans Health Administration.
Raffa, Susan D; Maciejewski, Matthew L; Zimmerman, Lindsey E; Damschroder, Laura J; Estabrooks, Paul A; Ackermann, Ronald T; Tsai, Adam G; Histon, Trina; Goldstein, Michael G
2017-04-01
Healthcare systems are challenged by steady increases in the number of patients who are overweight and obese. Large-scale, evidence-based behavioral approaches for addressing overweight and obesity have been successfully implemented in systems such as the Veterans Health Administration (VHA). These population-based interventions target reduction in risk for obesity-associated conditions through lifestyle change and weight loss, and are associated with modest weight loss. Despite the fact that VHA has increased the overall reach of these behavioral interventions, the number of high-risk overweight and obese patients continues to rise. Recommendations for weight loss medications and bariatric surgery are included in clinical practice guidelines for the management of overweight and obesity, but these interventions are underutilized. During a recent state of the art conference on weight management held by VHA, subject matter experts identified challenges and gaps, as well as potential solutions and overarching policy recommendations, for implementing an integrated system-wide approach for improving population-based weight management.
Preventive chemotherapy in human helminthiasis: theoretical and operational aspects.
Gabrielli, A-F; Montresor, A; Chitsulo, L; Engels, D; Savioli, L
2011-12-01
Preventive chemotherapy (PC), the large-scale distribution of anthelminthic drugs to population groups at risk, is the core intervention recommended by the WHO for reducing morbidity and transmission of the four main helminth infections, namely lymphatic filariasis, onchocerciasis, schistosomiasis and soil-transmitted helminthiasis. The strategy is widely implemented worldwide but its general theoretical foundations have not been described so far in a comprehensive and cohesive manner. Starting from the information available on the biological and epidemiological characteristics of helminth infections, as well as from the experience generated by disease control and elimination interventions across the world, we extrapolate the fundamentals and synthesise the principles that regulate PC and justify its implementation as a sound and essential public health intervention. The outline of the theoretical aspects of PC contributes to a thorough understanding of the different facets of this strategy and helps comprehend opportunities and limits of control and elimination interventions directed against helminth infections. Copyright © 2011 Royal Society of Tropical Medicine and Hygiene. Published by Elsevier Ltd. All rights reserved.
Mental health interventions in schools in low-income and middle-income countries.
Fazel, Mina; Patel, Vikram; Thomas, Saji; Tol, Wietse
2014-10-01
Increasing enrolment rates could place schools in a crucial position to support mental health in low-income and middle-income countries. In this Review, we provide evidence for mental health interventions in schools in accordance with a public mental health approach spanning promotion, prevention, and treatment. We identified a systematic review for mental health promotion, and identified further prevention and treatment studies. Present evidence supports schools as places for promotion of positive aspects of mental health using a whole-school approach. Knowledge of effectiveness of prevention and treatment interventions is more widely available for conflict-affected children and adolescents. More evidence is needed to identify the many elements likely to be associated with effective prevention and treatment for children exposed to a range of adversity and types of mental disorders. Dissemination and implementation science is crucial to establish how proven effective interventions could be scaled up and implemented in schools. Copyright © 2014 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Xie, Yiwei; Geng, Zihan; Zhuang, Leimeng; Burla, Maurizio; Taddei, Caterina; Hoekman, Marcel; Leinse, Arne; Roeloffzen, Chris G. H.; Boller, Klaus-J.; Lowery, Arthur J.
2017-12-01
Integrated optical signal processors have been identified as a powerful engine for optical processing of microwave signals. They enable wideband and stable signal processing operations on miniaturized chips with ultimate control precision. As a promising application, such processors enable photonic implementations of reconfigurable radio frequency (RF) filters with wide design flexibility, large bandwidth, and high-frequency selectivity. This is a key technology for photonic-assisted RF front ends that opens a path to overcoming the bandwidth limitation of current digital electronics. Here, the recent progress of integrated optical signal processors for implementing such RF filters is reviewed. We highlight the use of a low-loss, high-index-contrast stoichiometric silicon nitride waveguide, which promises to serve as a practical material platform for realizing high-performance optical signal processors and points toward photonic RF filters with digital signal processing (DSP)-level flexibility, hundreds-of-GHz bandwidth, MHz-band frequency selectivity, and full system integration on a chip scale.
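The filters such processors synthesize are, at their core, optical delay-line (FIR) structures. A small numpy sketch of the periodic frequency response of an N-tap delay line; the tap weights and the 10 ps unit delay are assumed values for illustration only:

    import numpy as np

    def fir_response(taps, delay_s, freqs_hz):
        # H(f) = sum_k c_k * exp(-j * 2*pi * f * k * T) for an N-tap delay line.
        k = np.arange(len(taps))[:, None]
        return (np.asarray(taps)[:, None] * np.exp(-2j * np.pi * k * delay_s * freqs_hz)).sum(0)

    f = np.linspace(0, 40e9, 5)                       # RF frequencies up to 40 GHz
    H = fir_response([0.25, 0.5, 0.25], 10e-12, f)    # free spectral range = 1/T = 100 GHz
    print(np.abs(H).round(3))                         # periodic passband response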
Stress Corrosion Cracking of Ferritic Materials for Fossil Power Generation Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pawel, Steven J; Siefert, John A.
2014-01-01
Creep strength enhanced ferritic (CSEF) steels Grades 23, 24, 91, and 92 have been widely implemented in the fossil-fired industry for over two decades. The stress corrosion cracking (SCC) behavior of these materials with respect to mainstay Cr-Mo steels (such as Grades 11, 12 and 22) has not been properly assessed, particularly in consideration of recent reported issues of SCC in CSEF steels. This report details the results of Jones test exposures of a wide range of materials (Grades 11, 22, 23, 24, and 92), material conditions (as-received, improper heat treatments, normalized, weldments) and environments (salt fog; tube cleaning environments including degreasing, scale removal, and passivation; and high temperature water) to compare the susceptibility to cracking of these steels. In the as-received (normalized and tempered) condition, none of these materials are susceptible to SCC in the environments examined. However, in the hardened condition, certain combinations of environment and alloy reveal substantial SCC susceptibility.
Li, Haitao; Boling, C Sam; Mason, Andrew J
2016-08-01
Airborne pollutants are a leading cause of illness and mortality globally. Electrochemical gas sensors show great promise for personal air quality monitoring to address this worldwide health crisis. However, implementing miniaturized arrays of such sensors demands high performance instrumentation circuits that simultaneously meet challenging power, area, sensitivity, noise and dynamic range goals. This paper presents a new multi-channel CMOS amperometric ADC featuring pixel-level architecture for gas sensor arrays. The circuit combines digital modulation of input currents and an incremental Σ∆ ADC to achieve wide dynamic range and high sensitivity with very high power efficiency and compact size. Fabricated in 0.5 [Formula: see text] CMOS, the circuit was measured to have 164 dB cross-scale dynamic range, 100 fA sensitivity while consuming only 241 [Formula: see text] and 0.157 [Formula: see text] active area per channel. Electrochemical experiments with liquid and gas targets demonstrate the circuit's real-time response to a wide range of analyte concentrations.
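The measurement principle, an incremental sigma-delta conversion of a small input current against a switched reference, can be illustrated with a first-order behavioral model in Python. The currents and the single-loop structure are illustrative assumptions; the fabricated circuit combines this with digital modulation of the input currents.

    def incremental_sigma_delta(i_in, i_ref, n_cycles=4096):
        # Integrate (input - switched reference) and count comparator 'ones';
        # resolution grows with n_cycles, trading speed for dynamic range.
        integ, ones = 0.0, 0
        for _ in range(n_cycles):
            bit = integ >= 0.0                    # 1-bit quantizer decision
            ones += bit
            integ += i_in - (i_ref if bit else 0.0)
        return ones / n_cycles * i_ref            # digital estimate of i_in

    print(incremental_sigma_delta(37e-12, 100e-12))   # ~3.7e-11 A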
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lewis, M.; Grimshaw, A.
1996-12-31
The Legion project at the University of Virginia is an architecture for designing and building system services that provide the illusion of a single virtual machine to users, a virtual machine that provides secure shared object and shared name spaces, application-adjustable fault-tolerance, improved response time, and greater throughput. Legion targets wide area assemblies of workstations, supercomputers, and parallel supercomputers. Legion tackles problems not solved by existing workstation-based parallel processing tools; the system will enable fault-tolerance, wide area parallel processing, inter-operability, heterogeneity, a single global name space, protection, security, efficient scheduling, and comprehensive resource management. This paper describes the core Legion object model, which specifies the composition and functionality of Legion's core objects: those objects that cooperate to create, locate, manage, and remove objects in the Legion system. The object model facilitates a flexible extensible implementation, provides a single global name space, grants site autonomy to participating organizations, and scales to millions of sites and trillions of objects.
Format and basic geometry of a perspective display of air traffic for the cockpit
NASA Technical Reports Server (NTRS)
Mcgreevy, Michael Wallace; Ellis, Stephen R.
1991-01-01
The design and implementation of a perspective display of air traffic for the cockpit is discussed. Parameters of the perspective are variable and interactive so that the appearance of the projected image can be widely varied. This approach allows exploration of perspective parameters and their interactions. The display was initially used to study the causes of horizontal maneuver biases found in experiments involving a plan view air traffic display format. Experiments to determine the effect of perspective geometry on spatial judgements have evolved from the display program. Several scaling techniques and other adjustments to the perspective are used to tailor the geometry for effective presentation of 3-D traffic situations.
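The underlying geometry is an ordinary pinhole projection whose parameters (eye point, field of view) are exposed to the user. A minimal sketch, ignoring aspect ratio and view rotation, with hypothetical aircraft positions:

    import numpy as np

    def perspective_project(points, eye, fov_deg=60.0, screen=(640, 480)):
        # Camera at `eye` looking down -z; widening fov_deg flattens the image.
        p = np.asarray(points, float) - np.asarray(eye, float)
        f = 1.0 / np.tan(np.radians(fov_deg) / 2.0)     # focal length from field of view
        x = f * p[:, 0] / -p[:, 2]                      # perspective divide
        y = f * p[:, 1] / -p[:, 2]
        w, h = screen
        return np.c_[(x + 1) * w / 2, (1 - y) * h / 2]  # NDC -> pixel coordinates

    traffic = [(0.0, 1.0, -10.0), (2.0, 0.5, -20.0)]    # illustrative positions, km
    print(perspective_project(traffic, eye=(0.0, 0.0, 0.0)))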
NASA Astrophysics Data System (ADS)
Neradilová, Hana; Fedorko, Gabriel
2016-12-01
Automated logistic systems are becoming more widely used within enterprise logistics processes. Their main advantage is that they allow increasing the efficiency and reliability of logistics processes. In terms of evaluating their effectiveness, it is necessary to take into account the economic aspect of the entire process. However, many users ignore and underestimate this area, which is a mistake. One reason the economic aspect is overlooked is that obtaining information for such an analysis is not easy. The aim of this paper is to show how computer simulation methods can supply the data needed for a full-scale economic analysis.
Dynamics of the cosmological relaxation after reheating
NASA Astrophysics Data System (ADS)
Choi, Kiwoon; Kim, Hyungjin; Sekiguchi, Toyokazu
2017-04-01
We examine if the cosmological relaxation mechanism, which was proposed recently as a new solution to the hierarchy problem, can be compatible with high reheating temperature well above the weak scale. As the barrier potential disappears at high temperature, the relaxion rolls down further after the reheating, which may ruin the successful implementation of the relaxation mechanism. It is noted that if the relaxion is coupled to a dark gauge boson, the new frictional force arising from dark gauge boson production can efficiently slow down the relaxion motion, which allows the relaxion to be stabilized after the electroweak phase transition for a wide range of model parameters, while satisfying the known observational constraints.
Experimental study of a quantum random-number generator based on two independent lasers
NASA Astrophysics Data System (ADS)
Sun, Shi-Hai; Xu, Feihu
2017-12-01
A quantum random-number generator (QRNG) can produce true randomness by utilizing the inherent probabilistic nature of quantum mechanics. Recently, the spontaneous-emission quantum phase noise of the laser has been widely deployed for quantum random-number generation, due to its high rate, its low cost, and the feasibility of chip-scale integration. Here, we perform a comprehensive experimental study of a phase-noise-based QRNG with two independent lasers, each of which operates in either continuous-wave (CW) or pulsed mode. We implement the QRNG by operating the two lasers in three configurations, namely, CW + CW, CW + pulsed, and pulsed + pulsed, and demonstrate their trade-offs, strengths, and weaknesses.
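A toy model of the phase-noise principle: beating two fields whose relative phase is uniformly randomized by spontaneous emission gives an interference intensity I = (1 + cos φ)/2, and comparing samples against the median yields unbiased bits. Real devices digitize photodetector voltages and apply randomness extraction; this sketch only illustrates the statistics, with a pseudorandom generator standing in for the quantum phase diffusion.

    import numpy as np

    def qrng_bits(n, seed=1):
        rng = np.random.default_rng(seed)          # stand-in for quantum phase noise
        phi = rng.uniform(0.0, 2.0 * np.pi, n)     # uniformly random relative phase
        intensity = 0.5 * (1.0 + np.cos(phi))      # two-field interference signal
        return (intensity > np.median(intensity)).astype(int)

    print(qrng_bits(100000).mean())                # ~0.5, i.e. unbiased bits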
GeoMod 2014 - Modelling in geoscience
NASA Astrophysics Data System (ADS)
Leever, Karen; Oncken, Onno
2016-08-01
GeoMod is a biennial conference to review and discuss latest developments in analogue and numerical modelling of lithospheric and mantle deformation. GeoMod2014 took place at the GFZ German Research Centre for Geosciences in Potsdam, Germany. Its focus was on rheology and deformation at a wide range of temporal and spatial scales: from earthquakes to long-term deformation, from micro-structures to orogens and subduction systems. It also addressed volcanotectonics and the interaction between tectonics and surface processes (Elger et al., 2014). The conference was followed by a 2-day short course on "Constitutive Laws: from Observation to Implementation in Models" and a 1-day hands-on tutorial on the ASPECT numerical modelling software.
Acceleration of short and long DNA read mapping without loss of accuracy using suffix array.
Tárraga, Joaquín; Arnau, Vicente; Martínez, Héctor; Moreno, Raul; Cazorla, Diego; Salavert-Torres, José; Blanquer-Espert, Ignacio; Dopazo, Joaquín; Medina, Ignacio
2014-12-01
HPG Aligner applies suffix arrays for DNA read mapping. This implementation produces a highly sensitive and extremely fast mapping of DNA reads that scales up almost linearly with read length. The approach presented here is faster (over 20× for long reads) and more sensitive (over 98% in a wide range of read lengths) than the current state-of-the-art mappers. HPG Aligner is not only an optimal alternative for current sequencers but also the only solution available to cope with longer reads and growing throughputs produced by forthcoming sequencing technologies. https://github.com/opencb/hpg-aligner. © The Author 2014. Published by Oxford University Press.
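The core of suffix-array mapping is a binary search over lexicographically sorted suffixes of the reference. A toy Python sketch (quadratic construction, exact matching only; HPG Aligner's construction and inexact search are far more sophisticated):

    from bisect import bisect_left, bisect_right

    def suffix_array(text):
        # Indices of all suffixes, lexicographically sorted (toy O(n^2 log n)).
        return sorted(range(len(text)), key=lambda i: text[i:])

    def map_read(text, sa, read):
        # All exact occurrences of `read`, via binary search on suffix prefixes.
        keys = [text[i:i + len(read)] for i in sa]
        return sorted(sa[bisect_left(keys, read):bisect_right(keys, read)])

    genome = "ACGTACGTGACG$"
    sa = suffix_array(genome)
    print(map_read(genome, sa, "ACG"))   # -> [0, 4, 9]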
Big data processing in the cloud - Challenges and platforms
NASA Astrophysics Data System (ADS)
Zhelev, Svetoslav; Rozeva, Anna
2017-12-01
Choosing the appropriate architecture and technologies for a big data project is a difficult task, which requires extensive knowledge in both the problem domain and in the big data landscape. The paper analyzes the main big data architectures and the most widely implemented technologies used for processing and persisting big data. Clouds provide for dynamic resource scaling, which makes them a natural fit for big data applications. Basic cloud computing service models are presented. Two architectures for processing big data are discussed, the Lambda and Kappa architectures. Technologies for big data persistence are presented and analyzed. Stream processing, the most important and most difficult aspect to manage, is outlined. The paper highlights the main advantages of the cloud and potential problems.
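The defining move of the Lambda architecture mentioned above is answering queries by merging a periodically recomputed batch view with an incremental real-time view. A minimal sketch with hypothetical view stores:

    def query(key, batch_view, speed_view):
        # Batch layer: complete but stale; speed layer: updates since last batch run.
        return batch_view.get(key, 0) + speed_view.get(key, 0)

    batch = {"clicks:2017-12-01": 10_000}   # recomputed from the master dataset
    speed = {"clicks:2017-12-01": 42}       # streamed increments since that run
    print(query("clicks:2017-12-01", batch, speed))   # -> 10042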
An Analysis of Implementation Strategies in a School-Wide Vocabulary Intervention
ERIC Educational Resources Information Center
Roskos, Katheen A.; Moe, Jennifer Randazzo; Rosemary, Catherine
2017-01-01
From an improvement research perspective, this study explores strategies used to implement a school-wide vocabulary intervention into language arts instruction at an urban elementary school. Academic language time, an innovative change in the instructional delivery system, allots time and structure for deliberate teaching of cross-disciplinary…
PinAPL-Py: A comprehensive web-application for the analysis of CRISPR/Cas9 screens.
Spahn, Philipp N; Bath, Tyler; Weiss, Ryan J; Kim, Jihoon; Esko, Jeffrey D; Lewis, Nathan E; Harismendy, Olivier
2017-11-20
Large-scale genetic screens using CRISPR/Cas9 technology have emerged as a major tool for functional genomics. As its popularity has increased, experimental biologists frequently acquire large sequencing datasets for which they often do not have an easy analysis option. While a few bioinformatic tools have been developed for this purpose, their utility is still hindered either due to limited functionality or the requirement of bioinformatic expertise. To make sequencing data analysis of CRISPR/Cas9 screens more accessible to a wide range of scientists, we developed a Platform-independent Analysis of Pooled Screens using Python (PinAPL-Py), which is operated as an intuitive web-service. PinAPL-Py implements state-of-the-art tools and statistical models, assembled in a comprehensive workflow covering sequence quality control, automated sgRNA sequence extraction, alignment, sgRNA enrichment/depletion analysis and gene ranking. The workflow is set up to use a variety of popular sgRNA libraries as well as custom libraries that can be easily uploaded. Various analysis options are offered, suitable to analyze a large variety of CRISPR/Cas9 screening experiments. Analysis output includes ranked lists of sgRNAs and genes, and publication-ready plots. PinAPL-Py helps to advance genome-wide screening efforts by combining comprehensive functionality with user-friendly implementation. PinAPL-Py is freely accessible at http://pinapl-py.ucsd.edu with instructions and test datasets.
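The statistical heart of such enrichment analysis is count normalization followed by a per-sgRNA fold change. A simplified sketch (PinAPL-Py's actual models add variance estimation, p-values and gene-level ranking); the counts below are invented:

    import math

    def cpm(counts):
        # Normalize raw read counts to counts per million.
        total = sum(counts.values())
        return {g: 1e6 * c / total for g, c in counts.items()}

    def log2_fold_change(treatment, control, pseudo=1.0):
        # Per-sgRNA enrichment/depletion between conditions.
        t, c = cpm(treatment), cpm(control)
        return {g: math.log2((t[g] + pseudo) / (c[g] + pseudo)) for g in treatment}

    ctrl  = {"GeneA_sg1": 500, "GeneA_sg2": 450, "GeneB_sg1": 480}
    treat = {"GeneA_sg1": 2100, "GeneA_sg2": 1800, "GeneB_sg1": 300}
    for sg, lfc in sorted(log2_fold_change(treat, ctrl).items(), key=lambda kv: -kv[1]):
        print(sg, round(lfc, 2))        # GeneA sgRNAs enriched, GeneB depleted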
Real-time bacterial microcolony counting using on-chip microscopy
NASA Astrophysics Data System (ADS)
Jung, Jae Hee; Lee, Jung Eun
2016-02-01
Observing microbial colonies is the standard method for determining the microbe titer and investigating the behaviors of microbes. Here, we report an automated, real-time bacterial microcolony-counting system implemented on a wide field-of-view (FOV), on-chip microscopy platform, termed ePetri. Using sub-pixel sweeping microscopy (SPSM) with a super-resolution algorithm, this system offers the ability to dynamically track individual bacterial microcolonies over a wide FOV of 5.7 mm × 4.3 mm without requiring a moving stage or lens. As a demonstration, we obtained high-resolution time-series images of S. epidermidis at 20-min intervals. We implemented an image-processing algorithm to analyze the spatiotemporal distribution of microcolonies, the development of which could be observed from a single bacterial cell. Test bacterial colonies with a minimum diameter of 20 μm could be enumerated within 6 h. We showed that our approach not only provides results that are comparable to conventional colony-counting assays but also can be used to monitor the dynamics of colony formation and growth. This microcolony-counting system using on-chip microscopy represents a new platform that substantially reduces the detection time for bacterial colony counting. It uses chip-scale image acquisition and is a simple and compact solution for the automation of colony-counting assays and microbe behavior analysis with applications in antibacterial drug discovery.
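The per-frame counting step reduces to a threshold / label / size-filter chain. A simplified scipy sketch (the published system additionally performs super-resolution reconstruction and tracking across frames):

    import numpy as np
    from scipy import ndimage

    def count_microcolonies(image, threshold=0.5, min_diameter_px=8):
        # Segment bright objects, label connected components, keep large ones.
        mask = image > threshold
        labels, n = ndimage.label(mask)
        sizes = ndimage.sum(mask, labels, range(1, n + 1))
        min_area = np.pi * (min_diameter_px / 2.0) ** 2
        return int(np.sum(sizes >= min_area))

    yy, xx = np.mgrid[:100, :100]
    frame = np.zeros((100, 100))
    frame[(yy - 30) ** 2 + (xx - 30) ** 2 < 81] = 1.0   # 18-px colony: counted
    frame[(yy - 70) ** 2 + (xx - 70) ** 2 < 4] = 1.0    # small debris: rejected
    print(count_microcolonies(frame))                    # -> 1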
Automated Deployment of Advanced Controls and Analytics in Buildings
NASA Astrophysics Data System (ADS)
Pritoni, Marco
Buildings use 40% of primary energy in the US. Recent studies show that developing energy analytics and enhancing control strategies can significantly improve their energy performance. However, the deployment of advanced control software applications has been mostly limited to academic studies. Larger-scale implementations are prevented by the significant engineering time and customization required, due to significant differences among buildings. This study demonstrates how physics-inspired data-driven models can be used to develop portable analytics and control applications for buildings. Specifically, I demonstrate application of these models in all phases of the deployment of advanced controls and analytics in buildings: in the first phase, "Site Preparation and Interface with Legacy Systems" I used models to discover or map relationships among building components, automatically gathering metadata (information about data points) necessary to run the applications. During the second phase: "Application Deployment and Commissioning", models automatically learn system parameters, used for advanced controls and analytics. In the third phase: "Continuous Monitoring and Verification" I utilized models to automatically measure the energy performance of a building that has implemented advanced control strategies. In the conclusions, I discuss future challenges and suggest potential strategies for these innovative control systems to be widely deployed in the market. This dissertation provides useful new tools in terms of procedures, algorithms, and models to facilitate the automation of deployment of advanced controls and analytics and accelerate their wide adoption in buildings.
Palomar, Esther; Chen, Xiaohong; Liu, Zhiming; Maharjan, Sabita; Bowen, Jonathan
2016-10-28
Smart city systems embrace major challenges associated with climate change, energy efficiency, mobility and future services by embedding the virtual space into a complex cyber-physical system. Those systems are constantly evolving and scaling up, involving a wide range of integration among users, devices, utilities, public services and also policies. Modelling such complex dynamic systems' architectures has always been essential for the development and application of techniques/tools to support design and deployment of integration of new components, as well as for the analysis, verification, simulation and testing to ensure trustworthiness. This article reports on the definition and implementation of a scalable component-based architecture that supports a cooperative energy demand response (DR) system coordinating energy usage between neighbouring households. The proposed architecture, called refinement of Cyber-Physical Component Systems (rCPCS), which extends the refinement calculus for component and object system (rCOS) modelling method, is implemented using Eclipse Extensible Coordination Tools (ECT), i.e., the Reo coordination language. With the rCPCS implementation in Reo, we specify the communication, synchronisation and co-operation amongst the heterogeneous components of the system, assuring, by design, scalability, interoperability, and correctness of component cooperation.
Implementation of an Evidence-Based Exercise Program for Older Adults in South Florida
Page, Timothy; Vieira, Edgar; Seff, Laura
2016-01-01
Introduction. This study aimed to examine how well an evidence-based physical activity program could be translated for wide scale dissemination and adoption to increase physical activity among community-dwelling older adults. Methods. Between October 2009 and December 2012, reach, fidelity, dosage, ease of implementation, and barriers to translation of EnhanceFitness (EF) were assessed. To assess effectiveness, a pretest-posttest design was used to measure increases in functional fitness (chair stands, arm curls, and the up-and-go test). Results. Fourteen community-based agencies offered 126 EF classes in 83 different locations and reached 4,490 older adults. Most participants were female (72%). Thirty-eight percent of participants did not complete the initial 16-week EF program. The 25% who received the recommended dose experienced an increase in upper and lower body strength and mobility. Further, participants reported high satisfaction with the program. Conclusion. EF was successfully implemented in a variety of settings throughout South Florida and reached a large number of older adults. However, challenges were encountered in ensuring that those who participated received a program dose that would lead to beneficial gains in functional fitness. PMID:27800182
A distributed-memory approximation algorithm for maximum weight perfect bipartite matching
DOE Office of Scientific and Technical Information (OSTI.GOV)
Azad, Ariful; Buluc, Aydin; Li, Xiaoye S.
We design and implement an efficient parallel approximation algorithm for the problem of maximum weight perfect matching in bipartite graphs, i.e. the problem of finding a set of non-adjacent edges that covers all vertices and has maximum weight. This problem differs from the maximum weight matching problem, for which scalable approximation algorithms are known. It is primarily motivated by finding good pivots in scalable sparse direct solvers before factorization, where sequential implementations of maximum weight perfect matching algorithms, such as those available in MC64, are widely used due to the lack of scalable alternatives. To overcome this limitation, we propose a fully parallel distributed memory algorithm that first generates a perfect matching and then searches for weight-augmenting cycles of length four in parallel and iteratively augments the matching with a vertex-disjoint set of such cycles. For most practical problems the weights of the perfect matchings generated by our algorithm are very close to the optimum. An efficient implementation of the algorithm scales up to 256 nodes (17,408 cores) on a Cray XC40 supercomputer and can solve instances that are too large to be handled by a single node using the sequential algorithm.
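The length-four augmentation step has a compact sequential analogue: for two matched edges (u1, v1) and (u2, v2), swap partners whenever that raises the total weight. The sketch below is this sequential kernel only; the paper's contribution is performing the search in parallel over a distributed graph.

    def augment_with_4cycles(weights, match):
        # weights[u][v]: edge weight; match[u] = v is a perfect matching.
        improved = True
        while improved:
            improved = False
            rows = list(match)
            for i, u1 in enumerate(rows):
                for u2 in rows[i + 1:]:
                    v1, v2 = match[u1], match[u2]
                    old = weights[u1][v1] + weights[u2][v2]
                    new = (weights[u1].get(v2, float("-inf"))
                           + weights[u2].get(v1, float("-inf")))
                    if new > old:                  # weight-augmenting 4-cycle found
                        match[u1], match[u2] = v2, v1
                        improved = True
        return match

    w = {"a": {"x": 1, "y": 9}, "b": {"x": 8, "y": 2}}
    print(augment_with_4cycles(w, {"a": "x", "b": "y"}))   # -> {'a': 'y', 'b': 'x'}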
NeXOS, developing and evaluating a new generation of insitu ocean observation systems.
NASA Astrophysics Data System (ADS)
Delory, Eric; del Rio, Joaquin; Golmen, Lars; Roar Hareide, Nils; Pearlman, Jay; Rolin, Jean-Francois; Waldmann, Christoph; Zielinski, Oliver
2017-04-01
Ocean biological, chemical or physical processes occur over widely varying scales in space and time: from micro- to kilometer scales, from less than seconds to centuries. While space systems supply important data and information, in situ data are necessary for comprehensive modeling and forecasting of ocean dynamics. Yet collection of in situ observations on these scales is inherently challenging and remains difficult and costly in time and resources. This paper addresses the innovations and significant developments for a new generation of in situ sensors in the FP7 European Union project "Next generation, Cost-effective, Compact, Multifunctional Web Enabled Ocean Sensor Systems Empowering Marine, Maritime and Fisheries Management", or "NeXOS" for short. Optical and acoustic sensors are the focus of NeXOS, but NeXOS moves beyond just sensors, as systems that simultaneously address multiple objectives and applications are becoming increasingly important. Thus NeXOS takes a perspective of both sensors and sensor systems, with significant advantages over existing observing capabilities via the implementation of innovations such as multiplatform integration, greater reliability through better antifouling management, and greater sensor and data interoperability through use of OGC standards. This presentation will address the sensor system development and field-testing of the new NeXOS sensor systems. This is being done on multiple platforms including profiling floats, gliders, ships, buoys and subsea stations. The implementation of a data system based on SWE and PUCK furthers interoperability across measurements and platforms. This presentation will review the sensor system capabilities, the status of field tests and recommendations for long-term ocean monitoring.
An interdisciplinary swat ecohydrological model to define catchment-scale hydrologic partitioning
NASA Astrophysics Data System (ADS)
Shope, C. L.; Maharjan, G. R.; Tenhunen, J.; Seo, B.; Kim, K.; Riley, J.; Arnhold, S.; Koellner, T.; Ok, Y. S.; Peiffer, S.; Kim, B.; Park, J.-H.; Huwe, B.
2013-06-01
Land use and climate change have long been implicated in modifying ecosystem services, such as water quality and water yield, biodiversity, and agricultural production. To account for future effects on ecosystem services, the integration of physical, biological, economic, and social data over several scales must be implemented to assess the effects on natural resource availability and use. Our objective is to assess the capability of the SWAT model to capture short-duration, event-driven monsoonal rainfall-runoff processes in complex mountainous terrain. To accomplish this, we developed a unique quality-control gap-filling algorithm for interpolation of high-frequency meteorological data. We used a novel multi-location, multi-optimization calibration technique to improve estimations of catchment-wide hydrologic partitioning. We calibrated the interdisciplinary model to a combination of statistical, hydrologic, and plant growth metrics. In addition, we used multiple locations of different drainage area, aspect, elevation, and geologic substrata distributed throughout the catchment. Results indicate scale-dependent sensitivity of hydrologic partitioning and substantial influence of engineered features. While our model accurately reproduced observed discharge variability, the addition of hydrologic and plant growth objective functions identified the importance of culverts in catchment-wide flow distribution. The results of this study provide a valuable resource to describe landscape controls and their implications for discharge, sediment transport, and nutrient loading. This study also shows the challenges of applying the SWAT model to complex terrain and extreme environments. By incorporating anthropogenic features into modeling scenarios, we can greatly enhance our understanding of the hydroecological impacts on ecosystem services.
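The quality-control gap-filling idea can be illustrated in a few lines of Python: flag out-of-range values as missing, interpolate short gaps, and leave long gaps for station-based infilling. The thresholds and the linear interpolant below are assumptions for illustration; the paper's actual algorithm is not specified in the abstract.

    import numpy as np

    def qc_gap_fill(series, valid_range, max_gap=6):
        x = np.asarray(series, float)
        lo, hi = valid_range
        x[(x < lo) | (x > hi)] = np.nan              # QC: physically implausible values
        good = ~np.isnan(x)
        idx = np.arange(len(x))
        filled = x.copy()
        filled[~good] = np.interp(idx[~good], idx[good], x[good])
        gap_id = np.cumsum(good)                     # constant across a contiguous gap
        for g in np.unique(gap_id[~good]):
            run = (~good) & (gap_id == g)
            if run.sum() > max_gap:                  # too long to interpolate safely
                filled[run] = np.nan
        return filled

    rain = [0.0, 1.2, -99.0, 2.4, 3.0, np.nan, np.nan, 3.6]
    print(qc_gap_fill(rain, valid_range=(0.0, 150.0), max_gap=2))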
Rodríguez-Zárate, Clara J; Sandoval-Castillo, Jonathan; van Sebille, Erik; Keane, Robert G; Rocha-Olivares, Axayácatl; Urteaga, Jose; Beheregaray, Luciano B
2018-05-16
Spatial and temporal scales at which processes modulate genetic diversity over the landscape are usually overlooked, impacting the design of conservation management practices for widely distributed species. We examine processes shaping population divergence in highly mobile species by re-assessing the case of panmixia in the iconic olive ridley turtle from the eastern Pacific. We implemented a biophysical model of connectivity and a seascape genetic analysis based on nuclear DNA variation of 634 samples collected from 27 nesting areas. Two genetically distinct populations largely isolated during reproductive migrations and mating were detected, each composed of multiple nesting sites linked by high connectivity. This pattern was strongly associated with a steep environmental gradient and also influenced by ocean currents. These findings relate to meso-scale features of a dynamic oceanographic interface in the eastern tropical Pacific (ETP) region, a scenario that possibly provides different cost-benefit solutions and selective pressures for sea turtles during both the mating and migration periods. We reject panmixia and propose a new paradigm for olive ridley turtles where reproductive isolation due to assortative mating is linked to its environment. Our study demonstrates the relevance of integrative approaches for assessing the role of environmental gradients and oceanographic currents as drivers of genetic differentiation in widely distributed marine species. This is relevant for the conservation management of species of highly mobile behaviour, and assists the planning and development of large-scale conservation strategies for the threatened olive ridley turtles in the ETP. © 2018 The Author(s).
Predicting suicide with the SAD PERSONS scale.
Katz, Cara; Randall, Jason R; Sareen, Jitender; Chateau, Dan; Walld, Randy; Leslie, William D; Wang, JianLi; Bolton, James M
2017-09-01
Suicide is a major public health issue, and a priority requirement is accurately identifying high-risk individuals. The SAD PERSONS suicide risk assessment scale is widely implemented in clinical settings despite limited supporting evidence. This article aims to determine the ability of the SAD PERSONS scale (SPS) to predict future suicide in the emergency department. Five thousand four hundred sixty-two consecutive adults were seen by psychiatry consultation teams in two tertiary emergency departments with linkage to population-based administrative data to determine suicide deaths within 6 months, 1, and 5 years. Seventy-seven (1.4%) individuals died by suicide during the study period. When predicting suicide at 12 months, medium- and high-risk scores on SPS had a sensitivity of 49% and a specificity of 60%; the positive and negative predictive values were 0.9% and 99%, respectively. Half of the suicides at both 6- and 12-month intervals were classified as low risk by SPS at index visit. The area under the curve at 12 months for the Modified SPS was 0.59 (95% confidence interval [CI] range 0.51-0.67). High-risk scores (compared to low risk) were significantly associated with death by suicide over the 5-year study period using the SPS (hazard ratio 2.49; 95% CI 1.34-4.61) and modified version (hazard ratio 2.29; 95% CI 1.24-2.29). Although the scale is widely used in educational and clinical settings, these findings do not support the use of the SPS and Modified SPS to predict suicide in adults seen by psychiatric services in the emergency department. © 2017 Wiley Periodicals, Inc.
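The reported screening statistics all derive from a 2x2 table of scale classification against outcome. A short sketch; the cell counts are invented to roughly reproduce the 12-month sensitivity and specificity quoted above, not taken from the paper:

    def screening_metrics(tp, fp, fn, tn):
        return {
            "sensitivity": tp / (tp + fn),   # suicides flagged medium/high risk
            "specificity": tn / (tn + fp),   # non-suicides flagged low risk
            "ppv": tp / (tp + fp),           # flagged patients who died by suicide
            "npv": tn / (tn + fn),           # low-risk patients who did not
        }

    print(screening_metrics(tp=38, fp=2154, fn=39, tn=3231))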
Fast Particle Methods for Multiscale Phenomena Simulations
NASA Technical Reports Server (NTRS)
Koumoutsakos, P.; Wray, A.; Shariff, K.; Pohorille, Andrew
2000-01-01
We are developing particle methods oriented at improving computational modeling capabilities of multiscale physical phenomena in: (i) high Reynolds number unsteady vortical flows, (ii) particle laden and interfacial flows, (iii) molecular dynamics studies of nanoscale droplets and studies of the structure, functions, and evolution of the earliest living cell. The unifying computational approach involves particle methods implemented in parallel computer architectures. The inherent adaptivity, robustness and efficiency of particle methods make them a multidisciplinary computational tool capable of bridging the gap between micro-scale and continuum flow simulations. Using efficient tree data structures, multipole expansion algorithms, and improved particle-grid interpolation, particle methods allow for simulations using millions of computational elements, making possible the resolution of a wide range of length and time scales of these important physical phenomena. The current challenges in these simulations are: [i] the proper formulation of particle methods at the molecular and continuum level for the discretization of the governing equations; [ii] the resolution of the wide range of time and length scales governing the phenomena under investigation; [iii] the minimization of numerical artifacts that may interfere with the physics of the systems under consideration; [iv] the parallelization of processes such as tree traversal and grid-particle interpolations. We are conducting simulations using vortex methods, molecular dynamics and smooth particle hydrodynamics, exploiting their unifying concepts such as: the solution of the N-body problem in parallel computers, highly accurate particle-particle and grid-particle interpolations, parallel FFTs and the formulation of processes such as diffusion in the context of particle methods. This approach enables us to transcend among seemingly unrelated areas of research.
Putilov, Arcady A
2017-01-01
Differences between the so-called larks and owls representing the opposite poles of morningness-eveningness dimension are widely known. However, scientific consensus has not yet been reached on the methodology for ranking and typing people along other dimensions of individual variation in their sleep-wake pattern. This review focused on the history and state-of-the-art of the methodology for self-assessment of individual differences in more than one trait or adaptability of the human sleep-wake cycle. The differences between this and other methodologies for the self-assessment of trait- and state-like variation in the perceived characteristics of daily rhythms were discussed and the critical issues that remained to be addressed in future studies were highlighted. These issues include a) a failure to develop a unidimensional scale for scoring chronotypological differences, b) the inconclusive results of the long-lasting search for objective markers of chronotype, c) a disagreement on both number and content of scales required for multidimensional self-assessment of chronobiological differences, d) a lack of evidence for the reliability and/or external validity of most of the proposed scales and e) an insufficient development of conceptualizations, models and model-based quantitative simulations linking the differences between people in their sleep-wake pattern with the differences in the basic parameters of underlying chronoregulatory processes. It seems that, in the nearest future, the wide implementation of portable actigraphic and somnographic devices might lead to the development of objective methodologies for multidimensional assessment and classification of sleep-wake traits and adaptabilities.
2003-12-01
operations run the full gamut from large-scale, theater-wide combat, as witnessed in Operation Iraqi Freedom, to small-scale operations against terrorists, to operations…
Scale-free characteristics of random networks: the topology of the world-wide web
NASA Astrophysics Data System (ADS)
Barabási, Albert-László; Albert, Réka; Jeong, Hawoong
2000-06-01
The world-wide web forms a large directed graph, whose vertices are documents and edges are links pointing from one document to another. Here we demonstrate that despite its apparent random character, the topology of this graph has a number of universal scale-free characteristics. We introduce a model that leads to a scale-free network, capturing in a minimal fashion the self-organization processes governing the world-wide web.
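The model the authors introduce rests on growth with preferential attachment; the sketch below is a hedged, generic rendering of that mechanism (each new document links to m existing ones with probability proportional to current degree), not their exact formulation.

    import random

    def preferential_attachment_graph(n, m=2):
        # Grow a graph in which each new node links to m existing nodes
        # chosen with probability proportional to their degree.
        edges = []
        repeated = list(range(m))   # node i appears once per unit of degree
        for new in range(m, n):
            chosen = set()
            while len(chosen) < m:              # degree-biased sampling
                chosen.add(random.choice(repeated))
            for target in chosen:
                edges.append((new, target))
                repeated.extend((new, target))  # update degree weights
        return edges

Graphs grown this way develop the heavy-tailed (power-law) degree distribution that defines a scale-free network.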
Going to Scale: Experiences Implementing a School-Based Trauma Intervention
ERIC Educational Resources Information Center
Nadeem, Erum; Jaycox, Lisa H.; Kataoka, Sheryl H.; Langley, Audra K.; Stein, Bradley D.
2011-01-01
This article describes implementation experiences "scaling up" the Cognitive Behavioral Intervention for Trauma in Schools (CBITS)--an intervention developed using a community partnered research framework. Case studies from two sites that have successfully implemented CBITS are used to examine macro- and school-level implementation…
Implementation of Goal Attainment Scaling in Community Intellectual Disability Services
ERIC Educational Resources Information Center
Chapman, Melanie; Burton, Mark; Hunt, Victoria; Reeves, David
2006-01-01
The authors describe the evaluation of the implementation of an outcome measurement system (Goal Attainment Scaling-GAS) within the context of an interdisciplinary and interagency intellectual disability services setting. The GAS database allowed analysis of follow-up goals and indicated the extent of implementation, while a rater study evaluated…
Architectural Optimization of Digital Libraries
NASA Technical Reports Server (NTRS)
Biser, Aileen O.
1998-01-01
This work investigates performance and scaling issues relevant to large scale distributed digital libraries. Presently, performance and scaling studies focus on specific implementations of production or prototype digital libraries. Although useful information is gained to aid these designers and other researchers with insights to performance and scaling issues, the broader issues relevant to very large scale distributed libraries are not addressed. Specifically, no current studies look at the extreme or worst case possibilities in digital library implementations. A survey of digital library research issues is presented. Scaling and performance issues are mentioned frequently in the digital library literature but are generally not the focus of much of the current research. In this thesis a model for a Generic Distributed Digital Library (GDDL) and nine cases of typical user activities are defined. This model is used to facilitate some basic analysis of scaling issues. Specifically, the calculation of Internet traffic generated for different configurations of the study parameters and an estimate of the future bandwidth needed for a large scale distributed digital library implementation. This analysis demonstrates the potential impact a future distributed digital library implementation would have on the Internet traffic load and raises questions concerning the architecture decisions being made for future distributed digital library designs.
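As a hedged illustration of the kind of traffic calculation the thesis describes, the toy estimate below multiplies user activity by payload sizes; every value is a made-up placeholder, not a parameter from the GDDL model.

    # Hypothetical parameters for illustration only (not from the thesis).
    users_per_day = 50_000
    searches_per_user = 4
    result_payload_mb = 0.25        # metadata page returned per search
    retrievals_per_user = 1
    document_mb = 3.0               # average digitized document size

    daily_mb = users_per_day * (searches_per_user * result_payload_mb
                                + retrievals_per_user * document_mb)
    avg_mbps = daily_mb * 8 / 86_400    # averaged over a day
    print(f"{daily_mb / 1e3:.1f} GB/day, {avg_mbps:.1f} Mbit/s average")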
NASA Astrophysics Data System (ADS)
Jorba, O.; Pérez, C.; Baldasano, J. M.
2009-04-01
Chemical processes in air quality modelling systems are usually treated independently of the meteorological models. This approach is computationally attractive, since off-line chemical transport simulations require only a single meteorological dataset to produce many chemical simulations. However, this separation of chemistry and meteorology produces a loss of important information about atmospheric processes and does not allow for feedbacks between chemistry and meteorology. To take such processes into account, current models are evolving toward an online coupling of chemistry and meteorology to produce consistent chemical weather predictions. The Earth Sciences Department of the Barcelona Supercomputing Center (BSC) develops the NMMB/BSC-DUST (Pérez et al., 2008), an online dust model within the global-regional NCEP/NMMB numerical weather prediction model (Janjic and Black, 2007) under development at the National Centers for Environmental Prediction (NCEP). The current implementation is based on the well-established regional dust model and forecast system DREAM (Nickovic et al., 2001). The most relevant characteristics of NMMB/BSC-DUST are its online coupling of the dust scheme with the meteorological driver, the wide range of applications from meso to global scales, and the inclusion of dust radiative effects allowing feedbacks between aerosols and meteorology. To complement this development, BSC is also working on the implementation of a fully coupled online chemical mechanism within NMMB/BSC-DUST. The final objective is to develop a full chemical weather prediction system able to resolve gas-aerosol-meteorology interactions from global to local scales. In this contribution we present the design of the chemistry coupling and the current progress of its implementation. Following the NCEP/NMMB approach, the chemistry part will be coupled through the Earth System Modeling Framework (ESMF) as a pluggable component. The chemical mechanism and chemistry solver are based on the Kinetic PreProcessor (KPP; Sandu and Sander, 2006) package, with the main purpose of maintaining wide flexibility when configuring the model. This approach will allow the use of a simple general chemical mechanism for global applications or a more complex mechanism for regional to local applications at higher resolution. REFERENCES: Janjic, Z.I., and Black, T.L., 2007. An ESMF unified model for a broad range of spatial and temporal scales. Geophysical Research Abstracts, 9, 05025. Nickovic, S., Papadopoulos, A., Kakaliagou, O., and Kallos, G., 2001. Model for prediction of desert dust cycle in the atmosphere. J. Geophys. Res., 106, 18113-18129. Pérez, C., Haustein, K., Janjic, Z.I., Jorba, O., Baldasano, J.M., Black, T.L., and Nickovic, S., 2008. An online dust model within the meso to global NMMB: current progress and plans. AGU Fall Meeting, San Francisco, A41K-03, 2008. Sandu, A., and Sander, R., 2006. Technical note: Simulating chemical systems in Fortran90 and Matlab with the Kinetic PreProcessor KPP-2.1. Atmos. Chem. Phys., 6, 187-195.
Implementing the undergraduate mini-CEX: a tailored approach at Southampton University.
Hill, Faith; Kendall, Kathleen; Galbraith, Kevin; Crossley, Jim
2009-04-01
The mini-clinical evaluation exercise (mini-CEX) is widely used in the UK to assess clinical competence, but there is little evidence regarding its implementation in the undergraduate setting. This study aimed to estimate the validity and reliability of the undergraduate mini-CEX and discuss the challenges involved in its implementation. A total of 3499 mini-CEX forms were completed. Validity was assessed by estimating associations between mini-CEX score and a number of external variables, examining the internal structure of the instrument, checking competency domain response rates and profiles against expectations, and by qualitative evaluation of stakeholder interviews. Reliability was evaluated by the overall reliability coefficient (R), estimation of the standard error of measurement (SEM), and from stakeholders' perceptions. Variance component analysis examined the contribution of relevant factors to students' scores. Validity was threatened by various confounding variables, including examiner status, case complexity, attachment specialty, patient gender, and case focus. Factor analysis suggested that competency domains reflect a single latent variable. Maximum reliability can be achieved by aggregating scores over 15 encounters (R = 0.73; 95% confidence interval [CI] +/- 0.28 based on a 6-point assessment scale). Examiner stringency contributed 29% of score variation and student attachment aptitude 13%. Stakeholder interviews revealed staff development needs but the majority perceived the mini-CEX as more reliable and valid than the previous long case. The mini-CEX has good overall utility for assessing aspects of the clinical encounter in an undergraduate setting. Strengths include fidelity, wide sampling, perceived validity, and formative observation and feedback. Reliability is limited by variable examiner stringency, and validity by confounding variables, but these should be viewed within the context of overall assessment strategies.
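A hedged sketch of the aggregation logic behind the reported reliability: with a variance component for true student ability and a residual error component, the reliability of a mean over n encounters follows the standard form R_n = V_student / (V_student + V_error / n). The component values below are placeholders chosen only so that R(15) lands near the 0.73 reported above; they are not the study's estimates.

    def reliability(var_student, var_error, n):
        # Reliability of a mean score over n encounters (Spearman-Brown logic).
        return var_student / (var_student + var_error / n)

    # Placeholder variance components (NOT the paper's values).
    for n in (1, 5, 10, 15):
        print(n, round(reliability(0.15, 0.85, n), 2))   # 15 -> ~0.73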
Aldosari, Bakheet
2014-05-01
Outside a small number of OECD countries, little information exists regarding the rates, levels, and determinants of hospital electronic health record (EHR) system adoption. This study examines EHR system adoption in Riyadh, Saudi Arabia. Respondents from 22 hospitals were surveyed regarding the implementation, maintenance, and improvement phases of EHR system adoption. Thirty-seven items were graded on a three-point scale of preparedness/completion. Measured determinants included hospital size, level of care, ownership, and EHR system development team composition. Eleven of the hospitals had implemented fully functioning EHR systems, eight had systems in progress, and three had not adopted a system. Sixteen different systems were being used across the 19 adopting hospitals. Differential adoption levels were positively related to hospital size and negatively to the level of care (secondary versus tertiary). Hospital ownership (nonprofit versus private) and development team composition showed mixed effects depending on the particular adoption phase being considered. Adoption rates compare favourably with those reported from other countries and other districts in Saudi Arabia, but wide variations exist among hospitals in the levels of adoption of individual items. General weaknesses in the implementation phase concern the legacy of paper data systems, including document scanning and data conversion; in the maintenance phase concern updating/maintaining software; and in the improvement phase concern the communication and exchange of health information. This study is the first to investigate the level and determinants of EHR system adoption for public, other nonprofit, and private hospitals in Saudi Arabia. Wide interhospital variations in adoption bear implications for policy-making and funding intervention. Identified areas of weakness require action to increase the degree of adoption and usefulness of EHR systems. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Moran, J. E.
2011-12-01
The wide range of abilities in the student population at California State University East Bay, with a significant fraction of students under-prepared and requiring mathematics remediation, is a challenge to including mathematical concepts and exercises in our introductory geoscience courses. Student expectations that a geoscience course will not include quantitative work may result in math-phobics choosing the course and resisting quantitative work when presented with it. Introductory courses that are required for Geology and Environmental Science majors are also designated as General Education, which gives rise to a student group with a wide range of abilities and expectations. This presentation will focus on implementation of a series of online math tutorials for students in introductory geoscience courses called 'The Math You Need' (TMYN; http://serc.carleton.edu/mathyouneed/index.html). The program is implemented in a Physical Geology course, in which 2/3 of the students are typically non-majors. The Physical Geology course has a three hour lab each week and the lab exercises and lab manual offer several opportunities for application of TMYN. Many of the lab exercises include graphing, profiling, working with map scales, converting units, or using equations to calculate some parameter or solve for an unknown. Six TMYN modules covering topics using density calculations as applied to mineral properties and isostasy, graphing as applied to rock properties, earthquake location, and radiometric dating, and calculation of rates as applied to plate movement, stream discharge, and groundwater flow, are assigned as pre-labs to be completed before lab classes. TMYN skills are reinforced during lectures and lab exercises, as close in time as possible to students' exposure via TMYN. Pre- and post-tests give a measure of the effectiveness of TMYN in improving students' quantitative literacy.
Mortality Trends After a Voluntary Checklist-based Surgical Safety Collaborative.
Haynes, Alex B; Edmondson, Lizabeth; Lipsitz, Stuart R; Molina, George; Neville, Bridget A; Singer, Sara J; Moonan, Aunyika T; Childers, Ashley Kay; Foster, Richard; Gibbons, Lorri R; Gawande, Atul A; Berry, William R
2017-12-01
To determine whether completion of a voluntary, checklist-based surgical quality improvement program is associated with reduced 30-day postoperative mortality. Despite evidence of efficacy of team-based surgical safety checklists in improving perioperative outcomes in research trials, effective methods of population-based implementation have been lacking. The Safe Surgery 2015 South Carolina program was designed to foster state-wide engagement of hospitals in a voluntary, collaborative implementation of a checklist program. We compared postoperative mortality rates after inpatient surgery in South Carolina utilizing state-wide all-payer discharge claims from 2008 to 2013, linked with state vital statistics, stratifying hospitals on the basis of completion of the checklist program. Changes in risk-adjusted 30-day mortality were compared between hospitals, using propensity score-adjusted difference-in-differences analysis. Fourteen hospitals completed the program by December 2013. Before program launch, there was no difference in mortality trends between the completion cohort and all others (P = 0.33), but postoperative mortality diverged thereafter (P = 0.021). Risk-adjusted 30-day mortality among completers was 3.38% in 2010 and 2.84% in 2013 (P < 0.00001), whereas mortality among other hospitals (n = 44) was 3.50% in 2010 and 3.71% in 2013 (P = 0.3281), reflecting a 22% difference between the groups on difference-in-differences analysis (P = 0.0021). Despite similar pre-existing rates and trends of postoperative mortality, hospitals in South Carolina completing a voluntary checklist-based surgical quality improvement program had a reduction in deaths after inpatient surgery over the first 3 years of the collaborative compared with other hospitals in the state. This may indicate that effective large-scale implementation of a team-based surgical safety checklist is feasible.
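The headline 22% figure comes from the authors' propensity score-adjusted model; the unadjusted arithmetic below merely illustrates the direction and rough size of the difference-in-differences contrast using the rates quoted above.

    # Risk-adjusted 30-day mortality rates quoted in the abstract (percent).
    completers_2010, completers_2013 = 3.38, 2.84
    others_2010, others_2013 = 3.50, 3.71

    did = (completers_2013 - completers_2010) - (others_2013 - others_2010)
    print(f"absolute difference-in-differences: {did:.2f} points")   # -0.75

    relative = (completers_2013 / completers_2010) / (others_2013 / others_2010)
    print(f"relative change: {1 - relative:.0%}")   # ~21%, near the reported 22%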
Barriers Associated with Implementing a Campus-Wide Smoke-Free Policy
ERIC Educational Resources Information Center
Harbison, Philip Adam; Whitman, Marilyn V.
2008-01-01
Purpose: The purpose of this study is to review the barriers associated with implementing a campus-wide smoke-free policy as perceived by the American Cancer Society's Colleges against Cancer (CAC) Program chapter representatives. Design/methodology/approach: Four focus group sessions were conducted at the annual CAC National Leadership Summit in…
Code of Federal Regulations, 2011 CFR
2011-01-01
.../construction/operation of energy system prototypes. C13. Import/export natural gas, minor new construction... Marketing Administration system-wide vegetation management program. C6. Implementation of a Power Marketing Administration system-wide erosion control program. C7. Establishment and implementation of contracts, policies...
ERIC Educational Resources Information Center
Tyre, Ashli D.; Feuerborn, Laura L.; Woods, Leslie
2018-01-01
Understanding staff concerns about a systemic change effort allows leadership teams to better anticipate and address staff needs for professional development and support. In this study, staff concerns in nine schools planning for or implementing School-Wide Positive Behavior Interventions and Supports (SWPBIS) were explored using the…
ERIC Educational Resources Information Center
Mercer, Sterett H.; McIntosh, Kent; Hoselton, Robert
2017-01-01
Several reliable and valid fidelity surveys are commonly used to assess Tier 1 implementation in School-Wide Positive Behavioral Interventions and Supports (SWPBIS); however, differences across surveys complicate consequential decisions regarding school implementation status when multiple measures are compared. To address this concern, the current…
ERIC Educational Resources Information Center
Pinkelman, Sarah; McIntosh, Kent; Raspica, Caitlin; Berg, Tricia; Strickland-Cohen, M. Kathleen
2015-01-01
The purpose of this study was to identify the most important perceived enablers and barriers regarding sustainability of School-wide Positive Behavioral Interventions and Supports (SWPBIS). School personnel representing 860 schools implementing or preparing to implement SWPBIS completed an open-ended survey of factors regarding its sustainability.…
Krintz, Chandra
2013-01-01
AppScale is an open source distributed software system that implements a cloud platform as a service (PaaS). AppScale makes cloud applications easy to deploy and scale over disparate cloud fabrics, implementing a set of APIs and architecture that also makes apps portable across the services they employ. AppScale is API-compatible with Google App Engine (GAE) and thus executes GAE applications on-premise or over other cloud infrastructures, without modification. PMID:23828721
Scharf, Deborah M; Eberhart, Nicole K; Schmidt, Nicole; Vaughan, Christine A; Dutta, Trina; Pincus, Harold Alan; Burnam, M Audrey
2013-07-01
This article describes the characteristics and early implementation experiences of community behavioral health agencies that received Primary and Behavioral Health Care Integration (PBHCI) grants from the Substance Abuse and Mental Health Services Administration to integrate primary care into programs for adults with serious mental illness. Data were collected from 56 programs, across 26 states, that received PBHCI grants in 2009 (N=13) or 2010 (N=43). The authors systematically extracted quantitative and qualitative information about program characteristics from grantee proposals and semistructured telephone interviews with core program staff. Quarterly reports submitted by grantees were coded to identify barriers to implementing integrated care. Grantees shared core features required by the grant but varied widely in terms of characteristics of the organization, such as size and location, and in the way services were integrated, such as through partnerships with a primary care agency. Barriers to program implementation at start-up included difficulty recruiting and retaining qualified staff and issues related to data collection and use of electronic health records, licensing and approvals, and physical space. By the end of the first year, some problems, such as space issues, were largely resolved, but other issues, including problems with staffing and data collection, remained. New challenges, such as patient recruitment, had emerged. Early implementation experiences of PBHCI grantees may inform other programs that seek to integrate primary care into behavioral health settings as part of new, large-scale government initiatives, such as specialty mental health homes.
Trace: a high-throughput tomographic reconstruction engine for large-scale datasets
Bicer, Tekin; Gursoy, Doga; Andrade, Vincent De; ...
2017-01-28
Here, synchrotron light source and detector technologies enable scientists to perform advanced experiments. These scientific instruments and experiments produce data at such scale and complexity that large-scale computation is required to unleash their full power. One of the widely used data acquisition techniques at light sources is computed tomography, which can generate tens of GB/s depending on the x-ray range. A large-scale tomographic dataset, such as a mouse brain, may require hours of computation time with a medium-sized workstation. In this paper, we present Trace, a data-intensive computing middleware we developed for the implementation and parallelization of iterative tomographic reconstruction algorithms. Trace provides fine-grained reconstruction of tomography datasets using both (thread-level) shared memory and (process-level) distributed memory parallelization. Trace utilizes a special data structure called a replicated reconstruction object to maximize application performance. We also present the optimizations we have done on the replicated reconstruction objects and evaluate them using a shale and a mouse brain sinogram. Our experimental evaluations show that the applied optimizations and parallelization techniques can provide a 158x speedup (using 32 compute nodes) over a single-core configuration, which decreases the reconstruction time of a sinogram (with 4501 projections and 22400 detector resolution) from 12.5 hours to less than 5 minutes per iteration.
The 2-MEV model: Constancy of adolescent environmental values within an 8-year time frame
NASA Astrophysics Data System (ADS)
Bogner, F. X.; Johnson, B.; Buxner, S.; Felix, L.
2015-08-01
The 2-MEV model is a widely used tool to monitor children's environmental perception by scoring individual values. Although the scale's validity has been confirmed repeatedly and independently, and the scale is in use in more than two dozen languages all over the world, its longitudinal properties still need clarification. The purpose of the present study therefore was to validate the 2-MEV scale on a large database of 10,676 children collected over an eight-year period. Cohorts from three different US states contributed to the sample by responding to a paper-and-pencil questionnaire within their pre-test initiatives in the context of field center programs. Since we used only the pre-program 2-MEV scale results (that is, before participation in education programs), the data were clearly unspoiled by any follow-up interventions. The purpose of the analysis was fourfold: first, to test and confirm the hypothesized factor structure for the large data set and for the subsample of each of the three states; second, to analyze the scoring pattern across the eight-year time range for both preservation and utilitarian preferences; third, to investigate any age effects in the extracted factors; and finally, to extract suitable recommendations for educational implementation efforts.
Liu, Xiaonan; Ding, Wentao; Jiang, Huifeng
2017-07-19
Plant natural products (PNPs) are widely used as pharmaceuticals, nutraceuticals, seasonings, pigments, etc., with a huge commercial value on the global market. However, most of these PNPs are still being extracted from plants. A resource-conserving and environment-friendly synthesis route for PNPs that utilizes microbial cell factories has attracted increasing attention since the 1940s. However, at present only a handful of PNPs are being produced by microbial cell factories at an industrial scale, and there are still many challenges in their large-scale application. One of the challenges is that most biosynthetic pathways of PNPs are still unknown, which largely limits the number of candidate PNPs for heterologous microbial production. Another challenge is that the metabolic fluxes toward the target products in microbial hosts are often hindered by poor precursor supply, low catalytic activity of enzymes, and obstructed product transport. Consequently, despite intensive studies on the metabolic engineering of microbial hosts, the fermentation costs of most heterologously produced PNPs are still too high for industrial-scale production. In this paper, we review several aspects of PNP production in microbial cell factories, including important design principles and recent progress in pathway mining and metabolic engineering. In addition, implemented cases of industrial-scale production of PNPs in microbial cell factories are also highlighted.
Converting positive and negative symptom scores between PANSS and SAPS/SANS.
van Erp, Theo G M; Preda, Adrian; Nguyen, Dana; Faziola, Lawrence; Turner, Jessica; Bustillo, Juan; Belger, Aysenil; Lim, Kelvin O; McEwen, Sarah; Voyvodic, James; Mathalon, Daniel H; Ford, Judith; Potkin, Steven G; Fbirn
2014-01-01
The Scale for the Assessment of Positive Symptoms (SAPS), the Scale for the Assessment of Negative Symptoms (SANS), and the Positive and Negative Syndrome Scale for Schizophrenia (PANSS) are the most widely used schizophrenia symptom rating scales, but despite their co-existence for 25 years no easily usable between-scale conversion mechanism exists. The aim of this study was to provide equations for between-scale symptom rating conversions. Two-hundred-and-five schizophrenia patients [mean age±SD=39.5±11.6, 156 males] were assessed with the SANS, SAPS, and PANSS. Pearson's correlations between symptom scores from each of the scales were computed. Linear regression analyses, on data from 176 randomly selected patients, were performed to derive equations for converting ratings between the scales. Intraclass correlations, on data from the remaining 29 patients, not part of the regression analyses, were performed to determine rating conversion accuracy. Between-scale positive and negative symptom ratings were highly correlated. Intraclass correlations between the original positive and negative symptom ratings and those obtained via conversion of alternative ratings using the conversion equations were moderate to high (ICCs=0.65 to 0.91). Regression-based equations may be useful for conversion between schizophrenia symptom severity as measured by the SANS/SAPS and PANSS, though additional validation is warranted. This study's conversion equations, implemented at http://converteasy.org, may aid in the comparison of medication efficacy studies, in meta- and mega-analyses examining symptoms as moderator variables, and in retrospective combination of symptom data in multi-center data sharing projects that need to pool symptom rating data when such data are obtained using different scales. Copyright © 2013 Elsevier B.V. All rights reserved.
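A hedged sketch of the regression-based conversion described above: fit a linear map from one scale's score to the other's on a training subsample, then apply it to new ratings. The data and coefficients below are synthetic placeholders; the study's validated equations live at http://converteasy.org.

    import numpy as np

    rng = np.random.default_rng(0)
    # Synthetic stand-ins for paired ratings (NOT study data).
    saps_total = rng.uniform(0, 60, size=176)
    panss_pos = 7 + 0.35 * saps_total + rng.normal(0, 2, size=176)

    # Ordinary least-squares fit: highest-degree coefficient first.
    slope, intercept = np.polyfit(saps_total, panss_pos, deg=1)

    def saps_to_panss_pos(saps):
        # Convert a SAPS total to an estimated PANSS positive score.
        return intercept + slope * saps

    print(round(saps_to_panss_pos(30.0), 1))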
Using a Virtual Experiment to Analyze Infiltration Process from Point to Grid-cell Size Scale
NASA Astrophysics Data System (ADS)
Barrios, M. I.
2013-12-01
The hydrological science requires the emergence of a consistent theoretical corpus driving the relationships between dominant physical processes at different spatial and temporal scales. However, the strong spatial heterogeneities and non-linearities of these processes make the development of multiscale conceptualizations difficult. Therefore, understanding scaling is a key issue for advancing this science. This work is focused on the use of virtual experiments to address the scaling of vertical infiltration from a physically based model at the point scale to a simplified, physically meaningful modeling approach at the grid-cell scale. Numerical simulations have the advantage of dealing with a wide range of boundary and initial conditions, in contrast to field experimentation. The aim of the work was to show the utility of numerical simulations in discovering relationships between the hydrological parameters at both scales, and to use this synthetic experience as a medium to teach the complex nature of this hydrological process. The Green-Ampt model was used to represent vertical infiltration at the point scale, and a conceptual storage model was employed to simulate the infiltration process at the grid-cell scale. Lognormal and beta probability distribution functions were assumed to represent the heterogeneity of soil hydraulic parameters at the point scale. The linkages between point-scale parameters and grid-cell-scale parameters were established by inverse simulations based on the mass balance equation and the averaging of the flow at the point scale. Results have shown numerical stability issues for particular conditions, have revealed the complex nature of the non-linear relationships between the models' parameters at both scales, and indicate that the parameterization of point-scale processes at the coarser scale is governed by the amplification of non-linear effects. The findings of these simulations have been used by the students to identify potential research questions on scale issues. Moreover, the implementation of this virtual lab improved the students' ability to understand the rationale of these processes and how to transfer the mathematical models to computational representations.
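The point-scale model named above is Green-Ampt; a minimal sketch of its implicit cumulative-infiltration solution follows, assuming ponded conditions. Parameter values are illustrative assumptions only.

    import math

    def green_ampt_cumulative(K, psi, dtheta, t, tol=1e-8):
        # Cumulative infiltration F(t) from the implicit Green-Ampt relation
        #   F - psi*dtheta*ln(1 + F/(psi*dtheta)) = K*t,
        # solved by fixed-point iteration (converges since the map is a
        # contraction for F > 0).
        pd = psi * dtheta
        F = max(K * t, 1e-9)                  # initial guess
        while True:
            F_new = K * t + pd * math.log(1 + F / pd)
            if abs(F_new - F) < tol:
                return F_new
            F = F_new

    # Illustrative loam-like parameters (assumed): K in cm/h, psi in cm.
    print(round(green_ampt_cumulative(K=1.0, psi=11.0, dtheta=0.3, t=2.0), 3))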
InterProScan 5: genome-scale protein function classification
Jones, Philip; Binns, David; Chang, Hsin-Yu; Fraser, Matthew; Li, Weizhong; McAnulla, Craig; McWilliam, Hamish; Maslen, John; Mitchell, Alex; Nuka, Gift; Pesseat, Sebastien; Quinn, Antony F.; Sangrador-Vegas, Amaia; Scheremetjew, Maxim; Yong, Siew-Yit; Lopez, Rodrigo; Hunter, Sarah
2014-01-01
Motivation: Robust large-scale sequence analysis is a major challenge in modern genomic science, where biologists are frequently trying to characterize many millions of sequences. Here, we describe a new Java-based architecture for the widely used protein function prediction software package InterProScan. Developments include improvements and additions to the outputs of the software and the complete reimplementation of the software framework, resulting in a flexible and stable system that is able to use both multiprocessor machines and/or conventional clusters to achieve scalable distributed data analysis. InterProScan is freely available for download from the EMBL-EBI FTP site and the open source code is hosted at Google Code. Availability and implementation: InterProScan is distributed via FTP at ftp://ftp.ebi.ac.uk/pub/software/unix/iprscan/5/ and the source code is available from http://code.google.com/p/interproscan/. Contact: http://www.ebi.ac.uk/support or interhelp@ebi.ac.uk or mitchell@ebi.ac.uk PMID:24451626
A unified spectral parameterization for wave breaking: From the deep ocean to the surf zone
NASA Astrophysics Data System (ADS)
Filipot, J.-F.; Ardhuin, F.
2012-11-01
A new wave-breaking dissipation parameterization designed for phase-averaged spectral wave models is presented. It combines basic physical quantities of wave breaking, namely the breaking probability and the dissipation rate per unit area. The energy lost by waves is first explicitly calculated in physical space before being distributed over the relevant spectral components. The transition from deep to shallow water is made possible by using a dissipation rate per unit area of breaking waves that varies with the wave height, wavelength, and water depth. This parameterization is implemented in the WAVEWATCH III modeling framework, which is applied to a wide range of conditions and scales, from the global ocean to the beach scale. Wave height, peak and mean periods, and spectral data are validated using in situ and remote sensing data. Model errors are comparable to those of other specialized deep or shallow water parameterizations. This work shows that it is possible to have a seamless parameterization from the deep ocean to the surf zone.
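In symbols, the combination described above can be written schematically as below; this is a hedged rendering of the structure, not the parameterization's exact closed form:

    \epsilon(\mathbf{x}) = P_b\,\epsilon_1(H, L, d), \qquad
    S_{\mathrm{ds}}(f,\theta) = -\,\epsilon(\mathbf{x})\,W(f,\theta), \qquad
    \iint W(f,\theta)\,\mathrm{d}f\,\mathrm{d}\theta = 1,

where P_b is the breaking probability, \epsilon_1 the dissipation rate per unit area of breaking waves (varying with wave height H, wavelength L, and water depth d), and W an assumed normalized weight that redistributes the physical-space loss over the spectral components.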
Deshommes, Elise; Laroche, Laurent; Deveau, Dominique; Nour, Shokoufeh; Prévost, Michèle
2017-09-05
Thirty-three households were monitored in a full-scale water distribution system, to investigate the impact of recent (<2 yr) or old partial lead service line replacements (PLSLRs). Total and particulate lead concentrations were measured using repeat sampling over a period of 1-20 months. Point-of-entry filters were installed to capture sporadic release of particulate lead from the lead service lines (LSLs). Mean concentrations increased immediately after PLSLRs and erratic particulate lead spikes were observed over the 18 month post-PLSLR monitoring period. The mass of lead released during this time frame indicates the occurrence of galvanic corrosion and scale destabilization. System-wide, lead concentrations were however lower in households with PLSLRs as compared to those with no replacement, especially for old PLSLRs. Nonetheless, 61% of PLSLR samples still exceeded 10 μg/L, reflecting the importance of implementing full LSL replacement and efficient risk communication. Acute concentrations measured immediately after PLSLRs demonstrate the need for appropriate flushing procedures to prevent lead poisoning.
Implementation of a School-wide Clinical Intervention Documentation System
Stevenson, T. Lynn; Fox, Brent I.; Andrus, Miranda; Carroll, Dana
2011-01-01
Objective. To evaluate the effectiveness and impact of a customized Web-based software program implemented in 2006 for school-wide documentation of clinical interventions by pharmacy practice faculty members, pharmacy residents, and student pharmacists. Methods. The implementation process, directed by a committee of faculty members and school administrators, included preparation and refinement of the software, user training, development of forms and reports, and integration of the documentation process within the curriculum. Results. Use of the documentation tool consistently increased from May 2007 to December 2010. Over 187,000 interventions were documented with over $6.2 million in associated cost avoidance. Conclusions. Successful implementation of a school-wide documentation tool required considerable time from the oversight committee and a comprehensive training program for all users, with ongoing monitoring of data collection practices. Data collected proved to be useful to show the impact of faculty members, residents, and student pharmacists at affiliated training sites. PMID:21829264
Aaron, Grant J; Friesen, Valerie M; Jungjohann, Svenja; Garrett, Greg S; Myatt, Mark
2017-01-01
Background: Large-scale food fortification (LSFF) of commonly consumed food vehicles is widely implemented in low- and middle-income countries. Many programs have monitoring information gaps and most countries fail to assess program coverage. Objective: The aim of this work was to present LSFF coverage survey findings (overall and in vulnerable populations) from 18 programs (7 wheat flour, 4 maize flour, and 7 edible oil programs) conducted in 8 countries between 2013 and 2015. Methods: A Fortification Assessment Coverage Toolkit (FACT) was developed to standardize the assessments. Three indicators were used to assess the relations between coverage and vulnerability: 1) poverty, 2) poor dietary diversity, and 3) rural residence. Three measures of coverage were assessed: 1) consumption of the vehicle, 2) consumption of a fortifiable vehicle, and 3) consumption of a fortified vehicle. Individual program performance was assessed based on the following: 1) achieving overall coverage ≥50%, 2) achieving coverage of ≥75% in ≥1 vulnerable group, and 3) achieving equity in coverage for ≥1 vulnerable group. Results: Coverage varied widely by food vehicle and country. Only 2 of the 18 LSFF programs assessed met all 3 program performance criteria. The 2 main program bottlenecks were a poor choice of vehicle and failure to fortify a fortifiable vehicle (i.e., absence of fortification). Conclusions: The results highlight the importance of sound program design and routine monitoring and evaluation. There is strong evidence of the impact and cost-effectiveness of LSFF; however, impact can only be achieved when the necessary activities and processes during program design and implementation are followed. The FACT approach fills an important gap in the availability of standardized tools. The LSFF programs assessed here need to be re-evaluated to determine whether to further invest in the programs, whether other vehicles are appropriate, and whether other approaches are needed. PMID:28404836
NASA Astrophysics Data System (ADS)
Gómez-García, Pablo Aurelio; Arranz, Alicia; Fresno, Manuel; Desco, Manuel; Mahmood, Umar; Vaquero, Juan José; Ripoll, Jorge
2015-06-01
Endoscopy is frequently used in the diagnosis of several gastro-intestinal pathologies such as Crohn's disease, ulcerative colitis, or colorectal cancer. It has great potential as a non-invasive screening technique capable of detecting suspicious alterations in the intestinal mucosa, such as inflammatory processes. However, these early lesions usually cannot be detected with conventional endoscopes, due to the lack of cellular detail and the absence of specific markers. Because of this lack of specificity, the development of new endoscopy technologies able to show microscopic changes in the mucosa structure is necessary. Here we present a confocal endomicroscope which, in combination with a wide-field fluorescence endoscope, offers fast and specific macroscopic information through the use of activatable probes, together with a detailed analysis at the cellular level of possibly altered tissue areas. This multi-modal and multi-scale imaging module, compatible with commercial endoscopes, combines near-infrared fluorescence (NIRF) measurements (enabling specific imaging of markers of disease and prognosis) and confocal endomicroscopy making use of a fiber bundle, providing cellular-level resolution. The system will be used in animal models exhibiting gastro-intestinal diseases in order to analyze the use of potential diagnostic markers in colorectal cancer. In this work, we present the set-up design in detail and the software implementation used to obtain simultaneous RGB/NIRF measurements and short confocal scanning times.
Yoshioka, Kota; Nakamura, Jiro; Pérez, Byron; Tercero, Doribel; Pérez, Lenin; Tabaru, Yuichiro
2015-12-01
Chagas disease is one of the most serious health problems in Latin America. Because the disease is transmitted mainly by triatomine vectors, a three-phase vector control strategy was used to reduce its vector-borne transmission. In Nicaragua, we implemented an indoor insecticide spraying program in five northern departments to reduce house infestation by Triatoma dimidiata. The spraying program was performed in two rounds. After each round, we conducted entomological evaluation to compare the vector infestation level before and after spraying. A total of 66,200 and 44,683 houses were sprayed in the first and second spraying rounds, respectively. The entomological evaluation showed that the proportion of houses infested by T. dimidiata was reduced from 17.0% to 3.0% after the first spraying, which was statistically significant (P < 0.0001). However, the second spraying round did not demonstrate clear effectiveness. Space-time analysis revealed that reinfestation of T. dimidiata is more likely to occur in clusters where the pre-spray infestation level is high. Here we discuss how large-scale insecticide spraying is neither effective nor affordable when T. dimidiata is widely distributed at low infestation levels. Further challenges involve research on T. dimidiata reinfestation, diversification of vector control strategies, and implementation of sustainable vector surveillance. © The American Society of Tropical Medicine and Hygiene.
Carbon Dioxide Collection and Purification System for Mars
NASA Technical Reports Server (NTRS)
Clark, D. Larry; Trevathan, Joseph R.
2001-01-01
One of the most abundant resources available on Mars is the atmosphere. The primary constituent, carbon dioxide, can be used to produce a wide variety of consumables including propellants and breathing air. The residual gases can be used for additional pressurization tasks, including supplementing the oxygen partial pressure in human habitats. A system is presented that supplies pure, high-pressure carbon dioxide and a separate stream of residual gases ready for further processing. This power-efficient method freezes the carbon dioxide directly from the atmosphere using a pulse-tube cryocooler. The resulting CO2 mass is later thawed in a closed pressure vessel, resulting in a compact source of liquefied gas at the vapor pressure of the bulk fluid. Results from a demonstration system are presented along with analysis and system scaling factors for implementation at larger scales. Trace gases in the Martian atmosphere challenge the system designer for all carbon dioxide acquisition concepts. The approximately five percent of other gases build up as local concentrations of CO2 are removed, resulting in diminished performance of the collection process. The presented system takes advantage of this fact and draws the concentrated residual gases away as a useful byproduct. The presented system represents an excellent volume and mass solution for collecting and compressing this valuable Martian resource. Recent advances in pulse-tube cryocooler technology have enabled this concept to be realized in a reliable, low power implementation.
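For a rough sense of the energy budget of freezing CO2 directly from the atmosphere, the toy calculation below assumes a CO2 heat of sublimation of roughly 570 kJ/kg and a notional cryocooler efficiency; every figure is an illustrative assumption, not a measurement from the demonstration system.

    # Assumed numbers for illustration only.
    SUBLIMATION_KJ_PER_KG = 570.0   # approximate CO2 heat of sublimation
    COLLECTION_KG_PER_DAY = 1.0     # hypothetical acquisition rate
    COOLER_EFFICIENCY = 0.05        # assumed heat-lift-to-input-power ratio

    heat_lift_w = COLLECTION_KG_PER_DAY * SUBLIMATION_KJ_PER_KG * 1e3 / 86_400
    input_power_w = heat_lift_w / COOLER_EFFICIENCY
    print(f"heat lift ~{heat_lift_w:.1f} W, cryocooler input ~{input_power_w:.0f} W")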
Cao, Jianfang; Chen, Lichao; Wang, Min; Tian, Yun
2018-01-01
The Canny operator is widely used to detect edges in images. However, as the size of the image dataset increases, the edge detection performance of the Canny operator decreases and its runtime becomes excessive. To improve the runtime and edge detection performance of the Canny operator, in this paper we propose a parallel design and implementation for an Otsu-optimized Canny operator using a MapReduce parallel programming model that runs on the Hadoop platform. The Otsu algorithm is used to optimize the Canny operator's dual threshold and improve the edge detection performance, while the MapReduce parallel programming model facilitates parallel processing for the Canny operator to solve the processing speed and communication cost problems that occur when the Canny edge detection algorithm is applied to big data. For the experiments, we constructed datasets of different scales from the Pascal VOC2012 image database. The proposed parallel Otsu-Canny edge detection algorithm performs better than other traditional edge detection algorithms. The parallel approach reduced the running time by approximately 67.2% on a Hadoop cluster architecture consisting of 5 nodes with a dataset of 60,000 images. Overall, the approach speeds up processing by approximately 3.4 times when handling large-scale datasets, demonstrating the clear superiority of the method. The proposed algorithm thus delivers both better edge detection performance and improved time performance.
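A hedged, single-machine sketch of the core idea (Otsu's global threshold driving Canny's dual threshold), using OpenCV; the MapReduce distribution across Hadoop nodes described above is omitted, and the 0.5 low-to-high ratio is an assumption, not the paper's tuning.

    import cv2

    def otsu_canny(path):
        # Derive Canny's dual threshold from Otsu's global threshold.
        gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        high, _ = cv2.threshold(gray, 0, 255,
                                cv2.THRESH_BINARY + cv2.THRESH_OTSU)
        low = 0.5 * high                     # assumed low:high ratio
        return cv2.Canny(gray, low, high)

    edges = otsu_canny("sample.jpg")         # hypothetical input file

In the parallel setting, each map task would apply this per-image routine and the framework would handle distribution and collection of results.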
How To Identify Plasmons from the Optical Response of Nanostructures
2017-01-01
A promising trend in plasmonics involves shrinking the size of plasmon-supporting structures down to a few nanometers, thus enabling control over light–matter interaction at extreme-subwavelength scales. In this limit, quantum mechanical effects, such as nonlocal screening and size quantization, strongly affect the plasmonic response, rendering it substantially different from classical predictions. For very small clusters and molecules, collective plasmonic modes are hard to distinguish from other excitations such as single-electron transitions. Using rigorous quantum mechanical computational techniques for a wide variety of physical systems, we describe how an optical resonance of a nanostructure can be classified as either plasmonic or nonplasmonic. More precisely, we define a universal metric for such classification, the generalized plasmonicity index (GPI), which can be straightforwardly implemented in any computational electronic-structure method or classical electromagnetic approach to discriminate plasmons from single-particle excitations and photonic modes. Using the GPI, we investigate the plasmonicity of optical resonances in a wide range of systems including: the emergence of plasmonic behavior in small jellium spheres as the size and the number of electrons increase; atomic-scale metallic clusters as a function of the number of atoms; and nanostructured graphene as a function of size and doping down to the molecular plasmons in polycyclic aromatic hydrocarbons. Our study provides a rigorous foundation for the further development of ultrasmall nanostructures based on molecular plasmonics. PMID:28651057
Towards the Next Generation Air Quality Modeling System ...
The Community Multiscale Air Quality (CMAQ) model of the U.S. Environmental Protection Agency is one of the most widely used air quality models worldwide; it is employed for both research and regulatory applications at major universities and government agencies to improve understanding of the formation and transport of air pollutants. It is noted, however, that air quality issues and climate change assessments need to be addressed globally, recognizing the linkages and interactions between meteorology and atmospheric chemistry across a wide range of scales. Therefore, an effort is currently underway to develop the next generation air quality modeling system (NGAQM), which will be based on a global integrated meteorology and chemistry system. The Model for Prediction Across Scales-Atmosphere (MPAS-A), a global fully compressible non-hydrostatic model with seamlessly refined centroidal Voronoi grids, has been chosen as the meteorological driver of this modeling system. The initial step of adapting MPAS-A for the NGAQM was to implement and test the physics parameterizations and options that are preferred for retrospective air quality simulations (see the work presented by R. Gilliam, R. Bullock, and J. Herwehe at this workshop). The next step, presented herein, would be to link the chemistry from CMAQ to MPAS-A to build a prototype for the NGAQM. Furthermore, the techniques to harmonize transport processes between CMAQ and MPAS-A, and methodologies to connect the chemistry…
Del Fiol, Guilherme; Huser, Vojtech; Strasberg, Howard R; Maviglia, Saverio M; Curtis, Clayton; Cimino, James J
2012-01-01
To support clinical decision-making, computerized information retrieval tools known as "infobuttons" deliver contextually relevant knowledge resources into clinical information systems. The Health Level Seven International (HL7) Context-Aware Knowledge Retrieval (Infobutton) Standard specifies a standard mechanism to enable infobuttons on a large scale. Objective: To examine the experience of organizations in the course of implementing the HL7 Infobutton Standard. Method: Cross-sectional online survey and in-depth phone interviews. Results: A total of 17 organizations participated in the study. Analysis of the in-depth interviews revealed 20 recurrent themes. Implementers underscored the benefits, simplicity, and flexibility of the HL7 Infobutton Standard. Yet, participants voiced the need for easier access to standard specifications and improved guidance for beginners. Implementers predicted that the Infobutton Standard will be widely or at least fairly well adopted in the next five years, but uptake will depend largely on adoption among electronic health record (EHR) vendors. To accelerate EHR adoption of the Infobutton Standard, implementers recommended that HL7-compliant infobutton capabilities be included in the United States Meaningful Use Certification Criteria for EHR systems. Limitations: Opinions and predictions should be interpreted with caution, since all the participant organizations had successfully implemented the Standard and over half of the organizations were actively engaged in the development of the Standard. Conclusion: Overall, implementers reported a very positive experience with the HL7 Infobutton Standard. Despite indications of increasing uptake, measures should be taken to stimulate adoption of the Infobutton Standard among EHR vendors. Widespread adoption of the Infobutton Standard has the potential to bring contextually relevant clinical decision support content into the healthcare provider workflow. PMID:22226933
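As a rough illustration of what a context-aware knowledge request looks like under the URL-based flavor of the standard, the sketch below assembles a query string. The endpoint is fictional, and the parameter names are recalled from the URL-based implementation guide, so treat them as assumptions rather than authoritative.

    from urllib.parse import urlencode

    BASE = "https://example.org/infobutton/search"   # hypothetical endpoint

    params = {
        # Coded concept of interest (e.g., a SNOMED CT problem code).
        "mainSearchCriteria.v.c": "44054006",
        "mainSearchCriteria.v.cs": "2.16.840.1.113883.6.96",
        "mainSearchCriteria.v.dn": "Diabetes mellitus type 2",
        "taskContext.c.c": "PROBLISTREV",    # task: problem-list review
        "informationRecipient": "PROV",      # provider-facing content
    }

    print(f"{BASE}?{urlencode(params)}")

The EHR fills these context parameters from the active patient record, and the knowledge resource returns content matched to that context.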
McIntosh, Jennifer; Alonso, Albert; MacLure, Katie; Stewart, Derek; Kempen, Thomas; Mair, Alpana; Castel-Branco, Margarida; Codina, Carles; Fernandez-Llimos, Fernando; Fleming, Glenda; Gennimata, Dimitra; Gillespie, Ulrika; Harrison, Cathy; Illario, Maddalena; Junius-Walker, Ulrike; Kampolis, Christos F; Kardas, Przemyslaw; Lewek, Pawel; Malva, João; Menditto, Enrica; Scullin, Claire; Wiese, Birgitt
2018-01-01
Multimorbidity and its associated polypharmacy contribute to an increase in adverse drug events, hospitalizations, and healthcare spending. This study aimed to address: what exists regarding polypharmacy management in the European Union (EU); why programs were, or were not, developed; and, how identified initiatives were developed, implemented, and sustained. Change management principles (Kotter) and normalization process theory (NPT) informed data collection and analysis. Nine case studies were conducted in eight EU countries: Germany (Lower Saxony), Greece, Italy (Campania), Poland, Portugal, Spain (Catalonia), Sweden (Uppsala), and the United Kingdom (Northern Ireland and Scotland). The workflow included a review of country/region specific polypharmacy policies, key informant interviews with stakeholders involved in policy development and implementation and, focus groups of clinicians and managers. Data were analyzed using thematic analysis of individual cases and framework analysis across cases. Polypharmacy initiatives were identified in five regions (Catalonia, Lower Saxony, Northern Ireland, Scotland, and Uppsala) and included all care settings. There was agreement, even in cases without initiatives, that polypharmacy is a significant issue to address. Common themes regarding the development and implementation of polypharmacy management initiatives were: locally adapted solutions, organizational culture supporting innovation and teamwork, adequate workforce training, multidisciplinary teams, changes in workflow, redefinition of roles and responsibilities of professionals, policies and legislation supporting the initiative, and data management and information and communication systems to assist development and implementation. Depending on the setting, these were considered either facilitators or barriers to implementation. Within the studied EU countries, polypharmacy management was not widely addressed. These results highlight the importance of change management and theory-based implementation strategies, and provide examples of polypharmacy management initiatives that can assist managers and policymakers in developing new programs or scaling up existing ones, particularly in places currently lacking such initiatives.
ERIC Educational Resources Information Center
Bakah, Marie Afua Baah; Voogt, Joke M.; Pieters, Jules M.
2012-01-01
Polytechnic staff perspectives are sought on the sustainability and large-scale implementation of design teams (DT), as a means for collaborative curriculum design and teacher professional development in Ghana's polytechnics, months after implementation. Data indicates that teachers still collaborate in DTs for curriculum design and professional…
Implementing Small Scale ICT Projects in Developing Countries--How Challenging Is It?
ERIC Educational Resources Information Center
Karunaratne, Thashmee; Peiris, Colombage; Hansson, Henrik
2018-01-01
This paper summarises experiences of efforts made by twenty individuals when implementing small-scale ICT development projects in their organizations located in seven developing countries. The main focus of these projects was the use of ICT in educational settings. Challenges encountered and the contributing factors for implementation success of…
Scales of Natural Flood Management
NASA Astrophysics Data System (ADS)
Nicholson, Alex; Quinn, Paul; Owen, Gareth; Hetherington, David; Piedra Lara, Miguel; O'Donnell, Greg
2016-04-01
The scientific field of Natural Flood Management (NFM) is receiving much attention and is now widely seen as a valid solution for sustainably managing flood risk whilst offering significant multiple benefits. However, few examples exist looking at NFM on a large scale (>10 km2). Well-implemented NFM has the effect of restoring more natural catchment hydrological and sedimentological processes, which in turn can have significant flood risk and WFD benefits for catchment waterbodies. These catchment-scale improvements in turn allow more 'natural' processes to be returned to rivers and streams, creating a more resilient system. Although certain NFM interventions may appear distant and disconnected from main-stem waterbodies, they will undoubtedly be contributing to WFD at the catchment waterbody scale. This paper offers examples of NFM and explains how they can be maximised through practical design across many scales (from the individual feature up to the whole catchment). It presents new tools that assist in the selection of measures and their locations, that quantify the flooding benefit at the local catchment scale, and a Flood Impact Model that can best reflect the impacts of local changes further downstream. The tools will be discussed in the context of our most recent experiences on NFM projects, including river catchments in the north east of England and in Scotland. This work has encouraged a more integrated approach to flood management planning that can use both traditional and novel NFM strategies in an effective and convincing way.
Enabling and challenging factors in institutional reform: The case of SCALE-UP
NASA Astrophysics Data System (ADS)
Foote, Kathleen; Knaub, Alexis; Henderson, Charles; Dancy, Melissa; Beichner, Robert J.
2016-06-01
While many innovative teaching strategies exist, integration into undergraduate science teaching has been frustratingly slow. This study aims to understand the low uptake of research-based instructional innovations by studying 21 successful implementations of the Student-Centered Active Learning Environment with Upside-down Pedagogies (SCALE-UP) instructional reform. SCALE-UP significantly restructures the classroom environment and pedagogy to promote highly active and interactive instruction. Although originally designed for university introductory physics courses, SCALE-UP has spread to many other disciplines at hundreds of departments around the world. This study reports findings from in-depth, open-ended interviews with 21 key contact people involved with successful secondary implementations of SCALE-UP throughout the United States. We defined successful implementations as those that restructured their pedagogy and classroom and sustained and/or spread the change. Interviews were coded to identify the most common enabling and challenging factors during reform implementation and compared to the theoretical framework of Kotter's 8-step Change Model. The most common enabling influences that emerged are documenting and leveraging evidence of local success, administrative support, interaction with outside SCALE-UP user(s), and funding. Many challenges are linked to the lack of these enabling factors, including difficulty finding funding, space, and administrative and/or faculty support for reform. Our focus on successful secondary implementations meant that most interviewees were able to overcome challenges. Presentation of results is illuminated with case studies, quotes, and examples that can help secondary implementers with SCALE-UP reform efforts specifically. We also discuss the implications for policy makers, researchers, and the higher education community concerned with initiating structural change.
NURD: an implementation of a new method to estimate isoform expression from non-uniform RNA-seq data
2013-01-01
Background RNA-Seq technology has been used widely in transcriptome study, and one of the most important applications is to estimate the expression level of genes and their alternative splicing isoforms. Several algorithms have been published to estimate expression based on different models. Recently, Wu et al. published a method that can accurately estimate isoform-level expression by considering position-related sequencing biases using nonparametric models. The method has advantages in handling different read distributions, but there had been no efficient program implementing this algorithm. Results We developed an efficient implementation of the algorithm in the program NURD. It uses a binary interval search algorithm. The program can correct both the global tendency of sequencing bias in the data and local sequencing bias specific to each gene. The correction makes the isoform expression estimation more reliable under various read distributions. The implementation is computationally efficient in both memory cost and running time, and can be readily scaled up for huge datasets. Conclusion NURD is an efficient and reliable tool for estimating isoform expression levels. Given the read mapping result and gene annotation file, NURD will output the expression estimation result. The package is freely available for academic use at http://bioinfo.au.tsinghua.edu.cn/software/NURD/. PMID:23837734
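The abstract describes NURD's two-level bias correction only at a high level. As a loose illustration of the underlying idea (reweighting reads by a globally estimated positional bias curve), here is a minimal Python sketch; the function names and the binning scheme are hypothetical, and this is not NURD's actual algorithm.

```python
import numpy as np

def global_bias_curve(read_matrix):
    """Estimate a global positional bias curve across genes.

    read_matrix: (genes x bins) read counts, with each transcript's
    positions rescaled onto a fixed number of relative-position bins.
    An unbiased protocol would yield a flat curve of 1.0 everywhere.
    """
    density = read_matrix / read_matrix.sum(axis=1, keepdims=True)
    curve = density.mean(axis=0)
    return curve / curve.mean()

def corrected_expression(read_counts, bias_curve):
    """Downweight positions that are globally over-represented."""
    weights = 1.0 / np.clip(bias_curve, 1e-6, None)
    return float((read_counts * weights).sum())

# Toy example with a 3' bias: later bins are systematically over-sampled.
rng = np.random.default_rng(0)
bias = np.linspace(0.5, 1.5, 20)
reads = rng.poisson(100 * bias, size=(50, 20))
print(corrected_expression(reads[0], global_bias_curve(reads)))
```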
ERIC Educational Resources Information Center
Chapman, William E., Jr.
2009-01-01
The reputed benefits of using technology in schools have been the topic of many research studies. When the World Wide Workshop Foundation implemented their Globaloria program…
ERIC Educational Resources Information Center
O'Shaughnessy, Tam E.; Lane, Kathleen L.; Gresham, Frank M.; Beebe-Frankenberger, Margaret E.
2003-01-01
This article describes a school-wide system of early identification and intervention for children recognized as being at risk for learning and behavior difficulties. Suggested guidelines for implementing such a program include: evaluating existing theory, knowledge, and practice; providing ongoing professional development; creating a school-wide…
ERIC Educational Resources Information Center
Betters-Bubon, Jennifer; Donohue, Peg
2016-01-01
The implementation of school-wide positive behavioral interventions and supports (SWPBIS) has been shown to reduce behavioral incidents and lead to more positive school climates. Despite the growing popularity in schools, there lacks clear understanding of the school counselor role in this approach. We present the perspectives of an elementary…
ERIC Educational Resources Information Center
Lange, Aurelie M. C.; van der Rijken, Rachel E. A.; Busschbach, Jan J. V.; Delsing, Marc J. M. H.; Scholte, Ron H. J.
2017-01-01
Objective: Therapist adherence is a quality indicator in routine clinical care when evaluating the success of the implementation of an intervention. The current study investigated whether therapist adherence mediates the association between therapist, team, and country-wide experience (i.e. number of years since implementation in the country) on…
ERIC Educational Resources Information Center
Adams, Mark Thomas
2013-01-01
This qualitative study investigated the nature of the relationship between principal leadership and school culture within a school-wide implementation of Professional Crisis Management (PCM). PCM is a comprehensive and fully integrated system designed to manage crisis situations effectively, safely, and with dignity. While designed primarily to…
Lyon, Aaron R; Cook, Clayton R; Brown, Eric C; Locke, Jill; Davis, Chayna; Ehrhart, Mark; Aarons, Gregory A
2018-01-08
A substantial literature has established the role of the inner organizational setting on the implementation of evidence-based practices in community contexts, but very little of this research has been extended to the education sector, one of the most common settings for the delivery of mental and behavioral health services to children and adolescents. The current study examined the factor structure, psychometric properties, and interrelations of an adapted set of pragmatic organizational instruments measuring key aspects of the organizational implementation context in schools: (1) strategic implementation leadership, (2) strategic implementation climate, and (3) implementation citizenship behavior. The Implementation Leadership Scale (ILS), Implementation Climate Scale (ICS), and Implementation Citizenship Behavior Scale (ICBS) were adapted by a research team that included the original scale authors and experts in the implementation of evidence-based practices in schools. These instruments were then administered to a geographically representative sample (n = 196) of school-based mental/behavioral health consultants to assess the reliability and structural validity via a series of confirmatory factor analyses. Overall, the original factor structures for the ILS, ICS, and ICBS were confirmed in the current sample. The one exception was poor functioning of the Rewards subscale of the ICS, which was removed in the final ICS model. Correlations among the revised measures, evaluated as part of an overarching model of the organizational implementation context, indicated both unique and shared variance. The current analyses suggest strong applicability of the revised instruments to implementation of evidence-based mental and behavioral practices in the education sector. The one poorly functioning subscale (Rewards on the ICS) was attributed to typical educational policies that do not allow for individual financial incentives to personnel. Potential directions for future expansion, revision, and application of the instruments in schools are discussed.
Aarons, Gregory A; Ehrhart, Mark G; Farahnak, Lauren R
2014-04-14
In healthcare and allied healthcare settings, leadership that supports effective implementation of evidence-based practices (EBPs) is a critical concern. However, there are no empirically validated measures to assess implementation leadership. This paper describes the development, factor structure, and initial reliability and convergent and discriminant validity of a very brief measure of implementation leadership: the Implementation Leadership Scale (ILS). Participants were 459 mental health clinicians working in 93 different outpatient mental health programs in Southern California, USA. Initial item development was supported as part of two United States National Institutes of Health (NIH) studies focused on developing implementation leadership training and implementation measure development. Clinician work group/team-level data were randomly assigned to be utilized for an exploratory factor analysis (n = 229; k = 46 teams) or for a confirmatory factor analysis (n = 230; k = 47 teams). The confirmatory factor analysis controlled for the multilevel, nested data structure. Reliability and validity analyses were then conducted with the full sample. The exploratory factor analysis resulted in a 12-item scale with four subscales representing proactive leadership, knowledgeable leadership, supportive leadership, and perseverant leadership. Confirmatory factor analysis supported an a priori higher order factor structure with subscales contributing to a single higher order implementation leadership factor. The scale demonstrated excellent internal consistency reliability as well as convergent and discriminant validity. The ILS is a brief and efficient measure of unit level leadership for EBP implementation. The availability of the ILS will allow researchers to assess strategic leadership for implementation in order to advance understanding of leadership as a predictor of organizational context for implementation. The ILS also holds promise as a tool for leader and organizational development to improve EBP implementation.
2014-01-01
Background In healthcare and allied healthcare settings, leadership that supports effective implementation of evidence-based practices (EBPs) is a critical concern. However, there are no empirically validated measures to assess implementation leadership. This paper describes the development, factor structure, and initial reliability and convergent and discriminant validity of a very brief measure of implementation leadership: the Implementation Leadership Scale (ILS). Methods Participants were 459 mental health clinicians working in 93 different outpatient mental health programs in Southern California, USA. Initial item development was supported as part of two United States National Institutes of Health (NIH) studies focused on developing implementation leadership training and implementation measure development. Clinician work group/team-level data were randomly assigned to be utilized for an exploratory factor analysis (n = 229; k = 46 teams) or for a confirmatory factor analysis (n = 230; k = 47 teams). The confirmatory factor analysis controlled for the multilevel, nested data structure. Reliability and validity analyses were then conducted with the full sample. Results The exploratory factor analysis resulted in a 12-item scale with four subscales representing proactive leadership, knowledgeable leadership, supportive leadership, and perseverant leadership. Confirmatory factor analysis supported an a priori higher order factor structure with subscales contributing to a single higher order implementation leadership factor. The scale demonstrated excellent internal consistency reliability as well as convergent and discriminant validity. Conclusions The ILS is a brief and efficient measure of unit level leadership for EBP implementation. The availability of the ILS will allow researchers to assess strategic leadership for implementation in order to advance understanding of leadership as a predictor of organizational context for implementation. The ILS also holds promise as a tool for leader and organizational development to improve EBP implementation. PMID:24731295
Setting Learning Analytics in Context: Overcoming the Barriers to Large-Scale Adoption
ERIC Educational Resources Information Center
Ferguson, Rebecca; Macfadyen, Leah P.; Clow, Doug; Tynan, Belinda; Alexander, Shirley; Dawson, Shane
2014-01-01
A core goal for most learning analytic projects is to move from small-scale research towards broader institutional implementation, but this introduces a new set of challenges because institutions are stable systems, resistant to change. To avoid failure and maximize success, implementation of learning analytics at scale requires explicit and…
Spin-lattice relaxation of individual solid-state spins
NASA Astrophysics Data System (ADS)
Norambuena, A.; Muñoz, E.; Dinani, H. T.; Jarmola, A.; Maletinsky, P.; Budker, D.; Maze, J. R.
2018-03-01
Understanding the effect of vibrations on the relaxation process of individual spins is crucial for implementing nanosystems for quantum information and quantum metrology applications. In this work, we present a theoretical microscopic model to describe the spin-lattice relaxation of individual electronic spins associated with negatively charged nitrogen-vacancy centers in diamond, although our results can be extended to other spin-boson systems. Starting from a general spin-lattice interaction Hamiltonian, we provide a detailed description and solution of the quantum master equation of an electronic spin-one system coupled to a phononic bath in thermal equilibrium. Special attention is given to the dynamics of one-phonon processes below 1 K, where our results agree with recent experimental findings and analytically describe the temperature and magnetic-field scaling. At higher temperatures, linear and second-order terms in the interaction Hamiltonian are considered and the temperature scaling is discussed for acoustic and quasilocalized phonons when appropriate. Our results, in addition to confirming a T5 temperature dependence of the longitudinal relaxation rate at higher temperatures, in agreement with experimental observations, provide a theoretical background for modeling the spin-lattice relaxation over a wide range of temperatures where different temperature scalings might be expected.
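For orientation, the temperature scalings discussed above are often summarized by a phenomenological fit of the kind used in earlier NV-center T1 experiments (e.g., by Jarmola and co-workers); this is a hedged summary for the reader, with sample- and field-dependent coefficients, not the paper's microscopic result:

```latex
\frac{1}{T_1(T)} \;=\; A_1(B) \;+\; \frac{A_2}{e^{\Delta / k_B T} - 1} \;+\; A_3\, T^{5}
```

Here A_1(B) collects temperature-independent contributions such as cross-relaxation, the second term describes one-phonon processes with an effective phonon energy Δ, and the A_3 T^5 term is the two-phonon Raman contribution that dominates at higher temperatures.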
Meckel, T. A.; Trevisan, L.; Krishnamurthy, P. G.
2017-08-23
Small-scale (mm to m) sedimentary structures (e.g. ripple lamination, cross-bedding) have received a great deal of attention in sedimentary geology. The influence of depositional heterogeneity on subsurface fluid flow is now widely recognized, but incorporating these features in physically-rational bedform models at various scales remains problematic. The current investigation expands the capability of an existing set of open-source codes, allowing generation of high-resolution 3D bedform architecture models. The implemented modifications enable the generation of 3D digital models consisting of laminae and matrix (binary field) with characteristic depositional architecture. The binary model is then populated with petrophysical properties using a textural approach for additional analysis such as statistical characterization, property upscaling, and single and multiphase fluid flow simulation. One example binary model with corresponding threshold capillary pressure field and the scripts used to generate them are provided, but the approach can be used to generate dozens of previously documented common facies models and a variety of property assignments. An application using the example model is presented simulating buoyant fluid (CO2) migration and resulting saturation distribution.
Reblin, Maija; Clayton, Margaret F; John, Kevin K; Ellington, Lee
2016-07-01
In this article, we present strategies for collecting and coding a large longitudinal communication data set collected across multiple sites, consisting of more than 2000 hours of digital audio recordings from approximately 300 families. We describe our methods within the context of implementing a large-scale study of communication during cancer home hospice nurse visits, but this procedure could be adapted to communication data sets across a wide variety of settings. This research is the first study designed to capture home hospice nurse-caregiver communication, a highly understudied location and type of communication event. We present a detailed example protocol encompassing data collection in the home environment, large-scale, multisite secure data management, the development of theoretically-based communication coding, and strategies for preventing coder drift and ensuring reliability of analyses. Although each of these challenges has the potential to undermine the utility of the data, reliability between coders is often the only issue consistently reported and addressed in the literature. Overall, our approach demonstrates rigor and provides a "how-to" example for managing large, digitally recorded data sets from collection through analysis. These strategies can inform other large-scale health communication research.
Wind Turbines as Landscape Impediments to the Migratory Connectivity of Bats
Cryan, Paul M.
2011-01-01
Unprecedented numbers of migratory bats are found dead beneath industrial-scale wind turbines during late summer and autumn in both North America and Europe. Prior to the wide-scale deployment of wind turbines, fatal collisions of migratory bats with anthropogenic structures were rarely reported and likely occurred very infrequently. There are no other well-documented threats to populations of migratory tree bats that cause mortality of similar magnitude to that observed at wind turbines. Just three migratory species comprise the vast majority of bat kills at turbines in North America and there are indications that turbines may actually attract migrating individuals toward their blades. Although fatality of certain migratory species is consistent in occurrence across large geographic regions, fatality rates differ across sites for reasons mostly unknown. Cumulative fatality for turbines in North America might already range into the hundreds of thousands of bats per year. Research into the causes of bat fatalities at wind turbines can ascertain the scale of the problem and help identify solutions. None of the migratory bats known to be most affected by wind turbines are protected by conservation laws, nor is there a legal mandate driving research into the problem or implementation of potential solutions.
Liu, Siyang; Huang, Shujia; Rao, Junhua; Ye, Weijian; Krogh, Anders; Wang, Jun
2015-01-01
Comprehensive recognition of genomic variation in one individual is important for understanding disease and developing personalized medication and treatment. Many tools based on DNA re-sequencing exist for identification of single nucleotide polymorphisms, small insertions and deletions (indels) as well as large deletions. However, these approaches consistently display a substantial bias against the recovery of complex structural variants and novel sequence in individual genomes and do not provide interpretation information such as the annotation of ancestral state and formation mechanism. We present a novel approach implemented in a single software package, AsmVar, to discover, genotype and characterize different forms of structural variation and novel sequence from population-scale de novo genome assemblies up to nucleotide resolution. Application of AsmVar to several human de novo genome assemblies captures a wide spectrum of structural variants and novel sequences present in the human population with high sensitivity and specificity. Our method provides a direct solution for investigating structural variants and novel sequences from de novo genome assemblies, facilitating the construction of population-scale pan-genomes. Our study also highlights the usefulness of the de novo assembly strategy for definition of genome structure.
Gradient descent for robust kernel-based regression
NASA Astrophysics Data System (ADS)
Guo, Zheng-Chu; Hu, Ting; Shi, Lei
2018-06-01
In this paper, we study the gradient descent algorithm generated by a robust loss function over a reproducing kernel Hilbert space (RKHS). The loss function is defined by a windowing function G and a scale parameter σ, which can include a wide range of commonly used robust losses for regression. There is still a gap between the theoretical analysis and the optimization process of loss-based empirical risk minimization: the estimator needs to be globally optimal in the theoretical analysis, while the optimization method cannot ensure the global optimality of its solutions. In this paper, we aim to fill this gap by developing a novel theoretical analysis of the performance of estimators generated by the gradient descent algorithm. We demonstrate that with an appropriately chosen scale parameter σ, the gradient update with early stopping rules can approximate the regression function. Our error analysis leads to convergence in the standard L2 norm and the strong RKHS norm, both of which are optimal in the mini-max sense. We show that the scale parameter σ plays an important role in providing robustness as well as fast convergence. Numerical experiments on synthetic examples and a real data set also support our theoretical results.
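As a concrete instance of the class of algorithms analyzed above, the following Python sketch runs functional gradient descent in an RKHS with a Gaussian kernel and a Welsch-type robust loss, one possible choice of windowing function G with scale parameter sigma; early stopping is the implicit regularizer. A minimal sketch under those assumptions, not the authors' code.

```python
import numpy as np

def gaussian_kernel(X, Y, gamma=1.0):
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def robust_kernel_gd(X, y, sigma=1.0, step=0.1, n_iter=50, gamma=1.0):
    """Functional gradient descent on a Welsch-type robust loss in an RKHS.

    The estimator is f(x) = sum_i alpha_i k(x_i, x); a small n_iter
    (early stopping) acts as implicit regularization.
    """
    K = gaussian_kernel(X, X, gamma)
    alpha = np.zeros(len(y))
    for _ in range(n_iter):
        r = K @ alpha - y                      # residuals
        # Welsch loss: sigma^2/2 * (1 - exp(-r^2/sigma^2)); its
        # derivative r*exp(-r^2/sigma^2) downweights large residuals.
        grad_r = r * np.exp(-(r ** 2) / sigma ** 2)
        alpha -= step * grad_r / len(y)        # RKHS gradient step
    return alpha

rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(100, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=100)
y[::10] += 5.0                                 # gross outliers
alpha = robust_kernel_gd(X, y, sigma=1.0)
```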
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meckel, T. A.; Trevisan, L.; Krishnamurthy, P. G.
Small-scale (mm to m) sedimentary structures (e.g. ripple lamination, cross-bedding) have received a great deal of attention in sedimentary geology. The influence of depositional heterogeneity on subsurface fluid flow is now widely recognized, but incorporating these features in physically-rational bedform models at various scales remains problematic. The current investigation expands the capability of an existing set of open-source codes, allowing generation of high-resolution 3D bedform architecture models. The implemented modifications enable the generation of 3D digital models consisting of laminae and matrix (binary field) with characteristic depositional architecture. The binary model is then populated with petrophysical properties using a textural approach for additional analysis such as statistical characterization, property upscaling, and single and multiphase fluid flow simulation. One example binary model with corresponding threshold capillary pressure field and the scripts used to generate them are provided, but the approach can be used to generate dozens of previously documented common facies models and a variety of property assignments. An application using the example model is presented simulating buoyant fluid (CO2) migration and resulting saturation distribution.
Reblin, Maija; Clayton, Margaret F; John, Kevin K; Ellington, Lee
2015-01-01
In this paper, we present strategies for collecting and coding a large longitudinal communication dataset collected across multiple sites, consisting of over 2000 hours of digital audio recordings from approximately 300 families. We describe our methods within the context of implementing a large-scale study of communication during cancer home hospice nurse visits, but this procedure could be adapted to communication datasets across a wide variety of settings. This research is the first study designed to capture home hospice nurse-caregiver communication, a highly understudied location and type of communication event. We present a detailed example protocol encompassing data collection in the home environment, large-scale, multi-site secure data management, the development of theoretically-based communication coding, and strategies for preventing coder drift and ensuring reliability of analyses. Although each of these challenges has the potential to undermine the utility of the data, reliability between coders is often the only issue consistently reported and addressed in the literature. Overall, our approach demonstrates rigor and provides a “how-to” example for managing large, digitally-recorded data sets from collection through analysis. These strategies can inform other large-scale health communication research. PMID:26580414
2014-01-01
Background Decisions to scale up population health interventions from small projects to wider state or national implementation are fundamental to maximising population-wide health improvements. The objectives of this study were to examine: i) how decisions to scale up interventions are currently made in practice; ii) the role that evidence plays in informing decisions to scale up interventions; and iii) the role policy makers, practitioners, and researchers play in this process. Methods Interviews with an expert panel of senior Australian and international public health policy-makers (n = 7), practitioners (n = 7), and researchers (n = 7) were conducted in May 2013, with a participation rate of 84%. Results Scaling-up decisions were generally made through iterative processes and led by policy makers and/or practitioners, but ultimately approved by political leaders and/or senior executives of funding agencies. Research evidence formed a component of the overall set of information used in decision-making, but its contribution was limited by the paucity of relevant intervention effectiveness research and data on costs and cost effectiveness. Policy makers, practitioners/service managers, and researchers had different, but complementary, roles to play in the process of scaling up interventions. Conclusions This analysis articulates the processes of how decisions to scale up interventions are made, the roles of evidence, and the contribution of different professional groups. More intervention research that includes data on the effectiveness, reach, and costs of operating at scale and key service delivery issues (including acceptability and fit of interventions and delivery models) should be sought, as this has the potential to substantially advance the relevance and ultimately the usability of research evidence for scaling up population health action. PMID:24735455
An Integrated Ransac and Graph Based Mismatch Elimination Approach for Wide-Baseline Image Matching
NASA Astrophysics Data System (ADS)
Hasheminasab, M.; Ebadi, H.; Sedaghat, A.
2015-12-01
In this paper we propose an integrated approach to increase the precision of feature point matching. Many different algorithms have been developed to optimize short-baseline image matching, while wide-baseline image matching remains difficult to handle because of illumination differences and viewpoint changes. Fortunately, recent developments in the automatic extraction of local invariant features make wide-baseline image matching possible. Matching algorithms based on the local feature similarity principle use feature descriptors to establish correspondences between feature point sets. To date, the most remarkable descriptor is the scale-invariant feature transform (SIFT) descriptor, which is invariant to image rotation and scale, and remains robust across a substantial range of affine distortion, presence of noise, and changes in illumination. The epipolar constraint based on RANSAC (random sample consensus) is a conventional model for mismatch elimination, particularly in computer vision. Because only the distance from the epipolar line is considered, a few false matches remain in matching results selected by epipolar geometry and RANSAC. Aguilar et al. proposed the Graph Transformation Matching (GTM) algorithm to remove outliers, which has difficulty when mismatched points are surrounded by the same local neighbor structure. To overcome these limitations, this study presents a new three-step matching scheme in which the SIFT algorithm is used to obtain initial corresponding point sets. In the second step, RANSAC is applied to reduce the outliers. Finally, to remove the remaining mismatches, GTM based on the adjacent K-NN graph is implemented. Four different close-range image datasets with changes in viewpoint are utilized to evaluate the performance of the proposed method, and the experimental results indicate its robustness and capability.
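The first two steps of this scheme can be prototyped with standard tools; the sketch below uses OpenCV (assuming opencv-python 4.4 or later for SIFT_create) for SIFT matching with a Lowe ratio test and RANSAC-based epipolar filtering, and leaves the GTM stage as a comment since it is not available in OpenCV. This is a sketch of the pipeline shape, not the authors' implementation.

```python
import cv2
import numpy as np

def match_pair(path1, path2):
    img1 = cv2.imread(path1, cv2.IMREAD_GRAYSCALE)
    img2 = cv2.imread(path2, cv2.IMREAD_GRAYSCALE)

    # Step 1: SIFT keypoints and descriptors (scale/rotation invariant).
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(img1, None)
    kp2, des2 = sift.detectAndCompute(img2, None)

    # Lowe ratio test on 2-NN matches gives initial correspondences.
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    raw = matcher.knnMatch(des1, des2, k=2)
    good = [m for m, n in raw if m.distance < 0.75 * n.distance]

    src = np.float32([kp1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)

    # Step 2: RANSAC on the epipolar geometry rejects gross mismatches.
    F, inlier_mask = cv2.findFundamentalMat(src, dst, cv2.FM_RANSAC, 3.0, 0.99)
    inliers = [m for m, keep in zip(good, inlier_mask.ravel()) if keep]

    # Step 3 (not shown): build K-NN graphs over the surviving points and
    # iteratively drop matches whose local neighbor structure disagrees (GTM).
    return inliers
```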
RGAugury: a pipeline for genome-wide prediction of resistance gene analogs (RGAs) in plants.
Li, Pingchuan; Quan, Xiande; Jia, Gaofeng; Xiao, Jin; Cloutier, Sylvie; You, Frank M
2016-11-02
Resistance gene analogs (RGAs), such as NBS-encoding proteins, receptor-like protein kinases (RLKs) and receptor-like proteins (RLPs), are potential R-genes that contain specific conserved domains and motifs. Thus, RGAs can be predicted based on their conserved structural features using bioinformatics tools. Computer programs have been developed for the identification of individual domains and motifs from the protein sequences of RGAs but none offer a systematic assessment of the different types of RGAs. A user-friendly and efficient pipeline is needed for large-scale genome-wide RGA predictions of the growing number of sequenced plant genomes. An integrative pipeline, named RGAugury, was developed to automate RGA prediction. The pipeline first identifies RGA-related protein domains and motifs, namely nucleotide binding site (NB-ARC), leucine rich repeat (LRR), transmembrane (TM), serine/threonine and tyrosine kinase (STTK), lysin motif (LysM), coiled-coil (CC) and Toll/Interleukin-1 receptor (TIR). RGA candidates are identified and classified into four major families based on the presence of combinations of these RGA domains and motifs: NBS-encoding, TM-CC, and membrane associated RLP and RLK. All time-consuming analyses of the pipeline are paralleled to improve performance. The pipeline was evaluated using the well-annotated Arabidopsis genome. A total of 98.5, 85.2, and 100 % of the reported NBS-encoding genes, membrane associated RLPs and RLKs were validated, respectively. The pipeline was also successfully applied to predict RGAs for 50 sequenced plant genomes. A user-friendly web interface was implemented to ease command line operations, facilitate visualization and simplify result management for multiple datasets. RGAugury is an efficiently integrative bioinformatics tool for large scale genome-wide identification of RGAs. It is freely available at Bitbucket: https://bitbucket.org/yaanlpc/rgaugury .
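The family-assignment step described above is essentially rule-based set logic over detected domains. A toy re-implementation of that classification logic in Python might look like the following; the domain labels and rules are simplified from the four families named in the abstract, and this is not RGAugury's actual code.

```python
def classify_rga(domains):
    """Assign an RGA family from a set of detected domain/motif labels.

    Simplified rules paraphrased from the four major families named in
    the abstract: NBS-encoding, TM-CC, RLP, and RLK.
    """
    d = set(domains)
    if "NB-ARC" in d:
        prefix = "TIR-" if "TIR" in d else ("CC-" if "CC" in d else "")
        suffix = "-LRR" if "LRR" in d else ""
        return f"NBS ({prefix}NBS{suffix})"
    if "TM" in d and "STTK" in d:          # membrane-anchored kinase
        return "RLK"
    if "TM" in d and "LRR" in d:           # membrane-anchored, no kinase
        return "RLP"
    if "TM" in d and "CC" in d:
        return "TM-CC"
    return "unclassified"

print(classify_rga({"CC", "NB-ARC", "LRR"}))   # -> NBS (CC-NBS-LRR)
print(classify_rga({"TM", "LRR", "STTK"}))     # -> RLK
```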
2014-01-01
Background Mindfulness-based cognitive therapy (MBCT) is a cost-effective psychosocial prevention programme that helps people with recurrent depression stay well in the long term. It was singled out in the 2009 National Institute for Health and Clinical Excellence (NICE) Depression Guideline as a key priority for implementation. Despite good evidence and guideline recommendations, its roll-out and accessibility across the UK appears to be limited and inequitably distributed. The study aims to describe the current state of MBCT accessibility and implementation across the UK, develop an explanatory framework of what is hindering and facilitating its progress in different areas, and develop an Implementation Plan and related resources to promote better and more equitable availability and use of MBCT within the UK National Health Service. Methods/Design This project is a two-phase qualitative, exploratory and explanatory research study, using an interview survey and in-depth case studies theoretically underpinned by the Promoting Action on Implementation in Health Services (PARIHS) framework. Interviews will be conducted with stakeholders involved in commissioning, managing and implementing MBCT services in each of the four UK countries, and will include areas where MBCT services are being implemented successfully and where implementation is not working well. In-depth case studies will be undertaken on a range of MBCT services to develop a detailed understanding of the barriers and facilitators to implementation. Guided by the study’s conceptual framework, data will be synthesized across Phase 1 and Phase 2 to develop a fit for purpose implementation plan. Discussion Promoting the uptake of evidence-based treatments into routine practice and understanding what influences these processes has the potential to support the adoption and spread of nationally recommended interventions like MBCT. This study could inform a larger scale implementation trial and feed into future implementation of MBCT with other long-term conditions and associated co-morbidities. It could also inform the implementation of interventions that are acceptable and effective, but are not widely accessible or implemented. PMID:24884603
LES Modeling of Lateral Dispersion in the Ocean on Scales of 10 m to 10 km
2015-10-20
The goal is to develop parameterizations of lateral dispersion in the ocean on scales of 0.1-10 km that can be implemented in larger-scale ocean models. These parameterizations will incorporate the effects of local… Presentations of this work are archived online (www.fields.utoronto.ca/video-archive/static/2013/06/166-1766/mergedvideo.ogv) and were given at the Nonlinear Effects in Internal Waves Conference held at Cornell University.
Rosas, Scott R; Behar, Lenore B; Hydaker, William M
2016-01-01
Establishing a system of care requires communities to identify ways to successfully implement strategies and support positive outcomes for children and their families. Such community transformation is complex, and communities vary in terms of their readiness for implementing sustainable community interventions. Assessing community readiness and guiding implementation, specifically for the funded communities implementing a system of care, requires a well-designed tool with sound psychometric properties. This scale development study used the results of a previously published concept mapping study to create, administer, and assess the psychometric characteristics of the System of Care Readiness and Implementation Measurement Scale (SOC-RIMS). The results indicate that the SOC-RIMS possesses excellent internal consistency characteristics, measures clearly discernible dimensions of community readiness, and demonstrates that the target constructs exist within a broad network of content. The SOC-RIMS can be a useful part of a comprehensive assessment in communities where system of care practices, principles, and philosophies are implemented and evaluated.
Status of States' Progress in Implementing Part H of IDEA: Report #3.
ERIC Educational Resources Information Center
Harbin, Gloria L.; And Others
This report focuses on progress in the implementation of Part H of the Individuals with Disabilities Education Act (IDEA) through a comparison of states' status on three yearly administrations of the State Progress Scale. The scale was designed to monitor implementation of the required 14 components in the stages of policy development, policy…
Wide band design on the scaled absorbing material filled with flaky CIPs
NASA Astrophysics Data System (ADS)
Xu, Yonggang; Yuan, Liming; Gao, Wei; Wang, Xiaobing; Liang, Zichang; Liao, Yi
2018-02-01
The scaled target measurement is an important method for obtaining target characteristics. Radar absorbing materials are widely used on low-detectability targets; because of their frequency dispersion characteristics, designing and manufacturing scaled radar absorbing materials for a scaled target is very difficult. This paper proposes a wide-band design method for scaled absorbing materials consisting of a thin absorption coating with added carbonyl iron particles. According to the theoretical radar cross section (RCS) of the plate, the reflection loss, determined by the permittivity and permeability, was chosen as the main design factor. The parameters of the scaled absorbing materials were then designed using effective medium theory, and the scaled absorbing material was constructed. Finally, the full-size coating plate and scaled coating plates (under three different scale factors) were simulated; the RCSs of the coating plates were numerically calculated and measured at 4 GHz and a scale factor of 2. The results showed that the compensated RCS of the scaled coating plate was close to that of the full-size coating plate, that is, the mean deviation was less than 0.5 dB, and the design method for the scaled material was very effective.
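The design factor named above, reflection loss, follows from standard transmission-line theory for a metal-backed single-layer coating, which is presumably the quantity being matched between full-size and scaled materials. The sketch below evaluates that textbook formula; the material values are illustrative, not the paper's measured CIP data.

```python
import numpy as np

c = 299_792_458.0                     # speed of light, m/s

def reflection_loss_db(f, eps_r, mu_r, d):
    """Metal-backed single-layer absorber, normal incidence.

    Z_in = sqrt(mu_r/eps_r) * tanh(j * 2*pi*f*d/c * sqrt(mu_r*eps_r))
    (input impedance normalized to free space), and
    RL = 20*log10(|(Z_in - 1)/(Z_in + 1)|) in dB.
    """
    z = np.sqrt(mu_r / eps_r) * np.tanh(1j * 2 * np.pi * f * d / c
                                        * np.sqrt(mu_r * eps_r))
    return 20 * np.log10(np.abs((z - 1) / (z + 1)))

# Illustrative CIP-like complex parameters at 4 GHz, 1 mm coating.
print(reflection_loss_db(4e9, 12 - 3j, 2.5 - 1.2j, 1e-3))
```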
ASIC implementation of recursive scaled discrete cosine transform algorithm
NASA Astrophysics Data System (ADS)
On, Bill N.; Narasimhan, Sam; Huang, Victor K.
1994-05-01
A program to implement the Recursive Scaled Discrete Cosine Transform (DCT) algorithm as proposed by H. S. Hou has been undertaken at the Institute of Microelectronics. Implementation of the design was done using a top-down design methodology with VHDL (VHSIC Hardware Description Language) for chip modeling. When the VHDL simulation had been satisfactorily completed, the design was synthesized into gates using a synthesis tool. The architecture of the design consists of two processing units together with a memory module for data storage and transpose. Each processing unit is composed of four pipelined stages, which allow the internal clock to run at one-eighth (1/8) the speed of the pixel clock. Each stage operates on eight pixels in parallel. As the data flows through each stage, various adders and multipliers transform them into the desired coefficients. The Scaled Inverse DCT (IDCT) was implemented in a similar fashion, with the adders and multipliers rearranged to perform the inverse DCT algorithm. The chip has been verified using Field Programmable Gate Array devices, and the design is operational. The combination of fewer required multiplications and a pipelined architecture gives Hou's Recursive Scaled DCT good potential for achieving high performance at low cost in a Very Large Scale Integration (VLSI) implementation.
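"Scaled" in this context means that per-coefficient constants are factored out of the transform so they can be absorbed into a later quantization stage, which is where the multiplier savings come from. The NumPy sketch below shows that separation for an 8-point DCT-II; it illustrates the generic scaled-DCT idea, not Hou's recursive factorization.

```python
import numpy as np

N = 8
n, k = np.meshgrid(np.arange(N), np.arange(N))
C = np.cos(np.pi * (2 * n + 1) * k / (2 * N))   # unnormalized DCT-II rows

# Orthonormal DCT = diag(s) @ C, with per-coefficient scale factors s_k.
s = np.full(N, np.sqrt(2.0 / N))
s[0] = np.sqrt(1.0 / N)

x = np.random.default_rng(2).normal(size=N)
full_dct = (np.diag(s) @ C) @ x        # standard DCT-II
scaled_dct = C @ x                     # "scaled" DCT: constants left out

# In a codec, s_k never needs a separate multiply: it is folded into
# the quantization table, saving one multiplication per coefficient.
q = np.ones(N) * 16.0                  # toy quantization step sizes
quantized = np.round(scaled_dct * (s / q))
assert np.allclose(full_dct, s * scaled_dct)
```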
Use of SUSA in Uncertainty and Sensitivity Analysis for INL VHTR Coupled Codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gerhard Strydom
2010-06-01
The need for a defendable and systematic Uncertainty and Sensitivity approach that conforms to the Code Scaling, Applicability, and Uncertainty (CSAU) process, and that could be used for a wide variety of software codes, was defined in 2008. The GRS (Gesellschaft für Anlagen- und Reaktorsicherheit) company of Germany has developed one type of CSAU approach that is particularly well suited for legacy coupled core analysis codes, and a trial version of their commercial software product SUSA (Software for Uncertainty and Sensitivity Analyses) was acquired on May 12, 2010. This interim milestone report provides an overview of the current status of the implementation and testing of SUSA at the INL VHTR Project Office.
QUEST+: A general multidimensional Bayesian adaptive psychometric method.
Watson, Andrew B
2017-03-01
QUEST+ is a Bayesian adaptive psychometric testing method that allows an arbitrary number of stimulus dimensions, psychometric function parameters, and trial outcomes. It is a generalization and extension of the original QUEST procedure and incorporates many subsequent developments in the area of parametric adaptive testing. With a single procedure, it is possible to implement a wide variety of experimental designs, including conventional threshold measurement; measurement of psychometric function parameters, such as slope and lapse; estimation of the contrast sensitivity function; measurement of increment threshold functions; measurement of noise-masking functions; Thurstone scale estimation using pair comparisons; and categorical ratings on linear and circular stimulus dimensions. QUEST+ provides a general method to accelerate data collection in many areas of cognitive and perceptual science.
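To make the procedure concrete, here is a stripped-down, one-dimensional relative of QUEST+ in Python: a grid posterior over a single threshold parameter, a fixed-slope logistic psychometric function, and stimulus selection by minimum expected posterior entropy. QUEST+ generalizes exactly this loop to arbitrary stimulus and parameter dimensions; the particular function shapes and grids below are illustrative assumptions.

```python
import numpy as np

thresholds = np.linspace(-2, 2, 81)      # parameter grid
stimuli = np.linspace(-2, 2, 41)         # candidate stimulus levels
posterior = np.ones_like(thresholds) / len(thresholds)

def p_correct(stim, thr, slope=3.0, guess=0.5, lapse=0.02):
    core = 1.0 / (1.0 + np.exp(-slope * (stim - thr)))
    return guess + (1 - guess - lapse) * core

def entropy(p):
    p = p[p > 0]
    return -(p * np.log(p)).sum()

def next_stimulus(posterior):
    """Pick the stimulus minimizing expected posterior entropy."""
    best, best_h = None, np.inf
    for s in stimuli:
        pc = p_correct(s, thresholds)
        p_yes = (posterior * pc).sum()          # predictive P(correct)
        post_yes = posterior * pc
        post_yes /= post_yes.sum()
        post_no = posterior * (1 - pc)
        post_no /= post_no.sum()
        h = p_yes * entropy(post_yes) + (1 - p_yes) * entropy(post_no)
        if h < best_h:
            best, best_h = s, h
    return best

rng = np.random.default_rng(3)
true_thr = 0.4
for _ in range(40):                       # simulated trials
    s = next_stimulus(posterior)
    outcome = rng.random() < p_correct(s, true_thr)
    like = p_correct(s, thresholds) if outcome else 1 - p_correct(s, thresholds)
    posterior *= like
    posterior /= posterior.sum()

print("estimated threshold:", thresholds[np.argmax(posterior)])
```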
Ultra-wideband three-dimensional optoacoustic tomography.
Gateau, Jérôme; Chekkoury, Andrei; Ntziachristos, Vasilis
2013-11-15
Broadband optoacoustic waves generated by biological tissues excited with nanosecond laser pulses carry information corresponding to a wide range of geometrical scales. Typically, the frequency content present in the signals generated during optoacoustic imaging is much larger compared to the frequency band captured by common ultrasonic detectors, the latter typically acting as bandpass filters. To image optical absorption within structures ranging from entire organs to microvasculature in three dimensions, we implemented optoacoustic tomography with two ultrasound linear arrays featuring a center frequency of 6 and 24 MHz, respectively. In the present work, we show that complementary information on anatomical features could be retrieved and provide a better understanding on the localization of structures in the general anatomy by analyzing multi-bandwidth datasets acquired on a freshly excised kidney.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hazen, Terry C.; Tabak, Henry H.
2007-03-15
Bioremediation of metals and radionuclides has had many field tests, demonstrations, and full-scale implementations in recent years. Field research in this area has occurred for many different metals and radionuclides using a wide array of strategies. These strategies can be generally characterized in six major categories: biotransformation, bioaccumulation/biosorption, biodegradation of chelators, volatilization, treatment trains, and natural attenuation. For all field applications there are a number of critical biogeochemical issues that must be addressed for a successful field application. Monitoring and characterization parameters that are enabling to bioremediation of metals and radionuclides are presented here. For each of the strategies a case study is presented to demonstrate a field application that uses this strategy.
The Ensembl REST API: Ensembl Data for Any Language
Yates, Andrew; Beal, Kathryn; Keenan, Stephen; McLaren, William; Pignatelli, Miguel; Ritchie, Graham R. S.; Ruffier, Magali; Taylor, Kieron; Vullo, Alessandro; Flicek, Paul
2015-01-01
Motivation: We present a Web service to access Ensembl data using Representational State Transfer (REST). The Ensembl REST server enables the easy retrieval of a wide range of Ensembl data by most programming languages, using standard formats such as JSON and FASTA while minimizing client work. We also introduce bindings to the popular Ensembl Variant Effect Predictor tool permitting large-scale programmatic variant analysis independent of any specific programming language. Availability and implementation: The Ensembl REST API can be accessed at http://rest.ensembl.org and source code is freely available under an Apache 2.0 license from http://github.com/Ensembl/ensembl-rest. Contact: ayates@ebi.ac.uk or flicek@ebi.ac.uk Supplementary information: Supplementary data are available at Bioinformatics online. PMID:25236461
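A minimal usage example: the lookup endpoint returns gene metadata as JSON for a symbol. The endpoint path below follows the public documentation at rest.ensembl.org at the time of writing and may evolve, so treat this as a sketch rather than a stable contract.

```python
import requests

server = "https://rest.ensembl.org"
endpoint = "/lookup/symbol/homo_sapiens/BRCA2"

# Ensembl REST returns JSON when asked for it via the Content-Type header.
resp = requests.get(server + endpoint,
                    headers={"Content-Type": "application/json"},
                    timeout=30)
resp.raise_for_status()
gene = resp.json()
print(gene["id"], gene["seq_region_name"], gene["start"], gene["end"])
```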
Stochastic Analysis of Reaction–Diffusion Processes
Hu, Jifeng; Kang, Hye-Won
2013-01-01
Reaction and diffusion processes are used to model chemical and biological processes over a wide range of spatial and temporal scales. Several routes to the diffusion process at various levels of description in time and space are discussed and the master equation for spatially discretized systems involving reaction and diffusion is developed. We discuss an estimator for the appropriate compartment size for simulating reaction–diffusion systems and introduce a measure of fluctuations in a discretized system. We then describe a new computational algorithm for implementing a modified Gillespie method for compartmental systems in which reactions are aggregated into equivalence classes and computational cells are searched via an optimized tree structure. Finally, we discuss several examples that illustrate the issues that have to be addressed in general systems. PMID:23719732
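To illustrate the compartmental picture described above: in the spatially discretized master equation, diffusion becomes a set of first-order "hop" reactions between adjacent compartments with rate D/h^2 per molecule, and the whole system can then be driven by an ordinary Gillespie loop. A minimal sketch for pure diffusion of one species on a 1-D line follows (no reactions, and a linear event search rather than the optimized tree structure the paper describes).

```python
import numpy as np

rng = np.random.default_rng(4)
n_comp, D, h = 20, 1.0, 0.1
hop = D / h**2                       # per-molecule hop rate to a neighbor
x = np.zeros(n_comp, dtype=int)
x[0] = 100                           # all molecules start in compartment 0
t, t_end = 0.0, 1.0

while t < t_end:
    # Propensities: each molecule may hop left or right (reflecting ends).
    right = hop * x.astype(float); right[-1] = 0.0
    left = hop * x.astype(float); left[0] = 0.0
    a = np.concatenate([right, left])
    a0 = a.sum()
    if a0 == 0:
        break
    t += rng.exponential(1.0 / a0)              # Gillespie time step
    j = rng.choice(len(a), p=a / a0)            # which event fires
    if j < n_comp:                              # hop right: j -> j+1
        x[j] -= 1; x[j + 1] += 1
    else:                                       # hop left: i -> i-1
        i = j - n_comp
        x[i] -= 1; x[i - 1] += 1

print(x)                                        # spread of molecules at t_end
```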
Woillard, Jean-Baptiste; Chouchana, Laurent; Picard, Nicolas; Loriot, Marie-Anne
2017-04-01
Therapeutic drug monitoring is already widely used for immunosuppressive drugs due to their narrow therapeutic index. This article summarizes evidence reported in the literature regarding the pharmacogenetics of (i) immunosuppressive drugs used in transplantation and (ii) azathioprine used in chronic inflammatory bowel disease. The conditions of use of currently available major pharmacogenetic tests are detailed, and recommendations are provided based on a scale established by the RNPGx, which scores tests as "essential", "advisable", or "potentially useful". Other applications for which the level of evidence is still debated are also discussed. Copyright © 2017 Société française de pharmacologie et de thérapeutique. Published by Elsevier Masson SAS. All rights reserved.
Lee, Ching-Pei; Lin, Chih-Jen
2014-04-01
Linear rankSVM is one of the widely used methods for learning to rank. Although its performance may be inferior to nonlinear methods such as kernel rankSVM and gradient boosting decision trees, linear rankSVM is useful for quickly producing a baseline model. Furthermore, following its recent development for classification, linear rankSVM may give competitive performance for large and sparse data. A great deal of work has studied linear rankSVM, with a focus on computational efficiency when the number of preference pairs is large. In this letter, we systematically study existing works, discuss their advantages and disadvantages, and propose an efficient algorithm. We discuss different implementation issues and extensions with detailed experiments. Finally, we develop a robust linear rankSVM tool for public use.
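For context, the optimization problem these works study can be stated compactly. With a set of preference pairs P, where pair (i, j) means instance i should rank above instance j, L2-regularized linear rankSVM solves (shown here with a squared hinge loss; an L1-hinge variant drops the square, and the paper's exact formulation may differ in details):

```latex
\min_{w}\;\; \frac{1}{2}\, w^{\top} w \;+\; C \sum_{(i,j) \in P} \max\bigl(0,\; 1 - w^{\top} (x_i - x_j)\bigr)^{2}
```

The computational difficulty the abstract refers to is that |P| can grow quadratically with the number of instances per query, so efficient methods avoid materializing the pairs explicitly.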
Swimming like algae: biomimetic soft artificial cilia
Sareh, Sina; Rossiter, Jonathan; Conn, Andrew; Drescher, Knut; Goldstein, Raymond E.
2013-01-01
Cilia are used effectively in a wide variety of biological systems, from fluid transport to thrust generation. Here, we present the design and implementation of artificial cilia, based on a biomimetic planar actuator using soft-smart materials. This actuator is modelled on the cilia movement of the alga Volvox, and represents the cilium as a piecewise constant-curvature robotic actuator that enables the subsequent direct translation of natural articulation into a multi-segment ionic polymer metal composite actuator. It is demonstrated how the combination of an optimal segmentation pattern and biologically derived per-segment driving signals reproduces natural ciliary motion. The amenability of the artificial cilia to scaling is also demonstrated through comparison of the Reynolds number achieved with that of natural cilia. PMID:23097503
Regulatory RNA-assisted genome engineering in microorganisms.
Si, Tong; HamediRad, Mohammad; Zhao, Huimin
2015-12-01
Regulatory RNAs are increasingly recognized and utilized as key modulators of gene expression in diverse organisms. Thanks to their modular and programmable nature, trans-acting regulatory RNAs are especially attractive in genome-scale applications. Here we discuss recent examples of microbial genome engineering implementing various trans-acting RNA platforms, including sRNA, RNAi, asRNA and CRISPR-Cas. In particular, we focus on how the scalable and multiplex nature of trans-acting RNAs has been used to tackle the challenges in creating genome-wide and combinatorial diversity for functional genomics and metabolic engineering applications. Advances in computational design and context-dependent regulation are also discussed for their contribution to improving the fine-tuning capabilities of trans-acting RNAs. Copyright © 2015 Elsevier Ltd. All rights reserved.
2015-09-30
Large Scale Density Estimation of Blue and Fin Whales: Utilizing Sparse Array Data to Develop and Implement a New Method for Estimating Blue and Fin Whale Density. Len Thomas & Danielle Harris, Centre… The goal is to develop and implement a new method for estimating blue and fin whale density that is effective over large spatial scales and is designed to cope…
Spicer, Neil; Bhattacharya, Dipankar; Dimka, Ritgak; Fanta, Feleke; Mangham-Jefferies, Lindsay; Schellenberg, Joanna; Tamire-Woldemariam, Addis; Walt, Gill; Wickremasinghe, Deepthi
2014-11-01
Donors and other development partners commonly introduce innovative practices and technologies to improve health in low and middle income countries. Yet many innovations that are effective in improving health and survival are slow to be translated into policy and implemented at scale. Understanding the factors influencing scale-up is important. We conducted a qualitative study involving 150 semi-structured interviews with government, development partners, civil society organisations and externally funded implementers, professional associations and academic institutions in 2012/13 to explore scale-up of innovative interventions targeting mothers and newborns in Ethiopia, the Indian state of Uttar Pradesh and the six states of northeast Nigeria, which are settings with high burdens of maternal and neonatal mortality. Interviews were analysed using a common analytic framework developed for cross-country comparison and themes were coded using Nvivo. We found that programme implementers across the three settings require multiple steps to catalyse scale-up. Advocating for government to adopt and finance health innovations requires: designing scalable innovations; embedding scale-up in programme design and allocating time and resources; building implementer capacity to catalyse scale-up; adopting effective approaches to advocacy; presenting strong evidence to support government decision making; involving government in programme design; invoking policy champions and networks; strengthening harmonisation among external programmes; aligning innovations with health systems and priorities. Other steps include: supporting government to develop policies and programmes and strengthening health systems and staff; promoting community uptake by involving media, community leaders, mobilisation teams and role models. We conclude that scale-up has no magic bullet solution: implementers must embrace multiple activities, and require substantial support from donors and governments in doing so. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.
Design, Implementation and Validation of a Europe-Wide Pedagogical Framework for E-Learning
ERIC Educational Resources Information Center
Granic, Andrina; Mifsud, Charles; Cukusic, Maja
2009-01-01
Within the context of a Europe-wide project UNITE, a number of European partners set out to design, implement and validate a pedagogical framework (PF) for e- and m-Learning in secondary schools. The process of formulating and testing the PF was an evolutionary one that reflected the experiences and skills of the various European partners and…
ERIC Educational Resources Information Center
Ramin, John E.
2011-01-01
The purpose of this study was to explore the effectiveness of implementing Life Space Crisis Intervention as a school-wide strategy for reducing school violence. Life Space Crisis Intervention (LSCI) is a strength-based verbal interaction strategy (Long, Fecser, & Wood, 2001). LSCI utilizes naturally occurring crisis situations as teachable…
ERIC Educational Resources Information Center
Rutledge, Stacey A.; Brown, Stephanie; Petrova, Kitchka
2017-01-01
Scaling in educational settings has tended to focus on replication of external programs with less focus on the nature of adaptation. In this article, we explore the scaling of Personalization for Academic and Social-emotional Learning (PASL), a systemic high school reform effort that was intentionally identified, developed, and implemented with…
ERIC Educational Resources Information Center
Rutledge, Stacey; Brown, Stephanie; Petrova, Kitchka
2017-01-01
Scaling in educational settings has tended to focus on replication of external programs with less focus on the nature of adaptation. In this article, we explore the scaling of Personalization for Academic and Social-emotional Learning (PASL), a systemic high school reform effort that was intentionally identified, developed, and implemented with…
GPU Acceleration of DSP for Communication Receivers.
Gunther, Jake; Gunther, Hyrum; Moon, Todd
2017-09-01
Graphics processing unit (GPU) implementations of signal processing algorithms can outperform CPU-based implementations. This paper describes the GPU implementation of several algorithms encountered in a wide range of high-data rate communication receivers including filters, multirate filters, numerically controlled oscillators, and multi-stage digital down converters. These structures are tested by processing the 20 MHz wide FM radio band (88-108 MHz). Two receiver structures are explored: a single channel receiver and a filter bank channelizer. Both run in real time on NVIDIA GeForce GTX 1080 graphics card.
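The building blocks listed above (numerically controlled oscillators, filters, decimation) compose into a digital down-converter; the NumPy/SciPy sketch below shows that signal flow on synthetic data. It is a CPU illustration of the structure only, since the paper's contribution is mapping these same array operations onto GPU kernels; the station offset and filter parameters here are made up.

```python
import numpy as np
from scipy.signal import firwin, lfilter

fs = 20e6                         # sample rate of the wideband capture
f_station = 3.1e6                 # offset of one FM station in the band
decim = 20

# Synthetic wideband input: one tone standing in for a station, plus noise.
n = np.arange(int(fs * 0.01))
x = np.cos(2 * np.pi * f_station / fs * n) + 0.01 * np.random.randn(len(n))

# NCO: a complex exponential shifts the chosen station to baseband.
nco = np.exp(-2j * np.pi * f_station / fs * n)
baseband = x * nco

# Low-pass filter then decimate: a single-stage digital down converter.
taps = firwin(numtaps=128, cutoff=100e3, fs=fs)
narrow = lfilter(taps, 1.0, baseband)[::decim]
print(narrow.shape, fs / decim, "Hz output rate")
```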
The Nature Index: a general framework for synthesizing knowledge on the state of biodiversity.
Certain, Grégoire; Skarpaas, Olav; Bjerke, Jarle-Werner; Framstad, Erik; Lindholm, Markus; Nilsen, Jan-Erik; Norderhaug, Ann; Oug, Eivind; Pedersen, Hans-Christian; Schartau, Ann-Kristin; van der Meeren, Gro I; Aslaksen, Iulie; Engen, Steinar; Garnåsjordet, Per-Arild; Kvaløy, Pål; Lillegård, Magnar; Yoccoz, Nigel G; Nybø, Signe
2011-04-22
The magnitude and urgency of the biodiversity crisis is widely recognized within scientific and political organizations. However, a lack of integrated measures for biodiversity has greatly constrained the national and international response to the biodiversity crisis. Thus, integrated biodiversity indexes will greatly facilitate information transfer from science toward other areas of human society. The Nature Index framework samples scientific information on biodiversity from a variety of sources, synthesizes this information, and then transmits it in a simplified form to environmental managers, policymakers, and the public. The Nature Index optimizes information use by incorporating expert judgment, monitoring-based estimates, and model-based estimates. The index relies on a network of scientific experts, each of whom is responsible for one or more biodiversity indicators. The resulting set of indicators is supposed to represent the best available knowledge on the state of biodiversity and ecosystems in any given area. The value of each indicator is scaled relative to a reference state, i.e., a predicted value assessed by each expert for a hypothetical undisturbed or sustainably managed ecosystem. Scaled indicator values can be aggregated or disaggregated over different axes representing spatiotemporal dimensions or thematic groups. A range of scaling models can be applied to allow for different ways of interpreting the reference states, e.g., optimal situations or minimum sustainable levels. Statistical testing for differences in space or time can be implemented using Monte-Carlo simulations. This study presents the Nature Index framework and details its implementation in Norway. The results suggest that the framework is a functional, efficient, and pragmatic approach for gathering and synthesizing scientific knowledge on the state of biodiversity in any marine or terrestrial ecosystem and has general applicability worldwide.
Efficient spiking neural network model of pattern motion selectivity in visual cortex.
Beyeler, Michael; Richert, Micah; Dutt, Nikil D; Krichmar, Jeffrey L
2014-07-01
Simulating large-scale models of biological motion perception is challenging, due to the required memory to store the network structure and the computational power needed to quickly solve the neuronal dynamics. A low-cost yet high-performance approach to simulating large-scale neural network models in real-time is to leverage the parallel processing capability of graphics processing units (GPUs). Based on this approach, we present a two-stage model of visual area MT that we believe to be the first large-scale spiking network to demonstrate pattern direction selectivity. In this model, component-direction-selective (CDS) cells in MT linearly combine inputs from V1 cells that have spatiotemporal receptive fields according to the motion energy model of Simoncelli and Heeger. Pattern-direction-selective (PDS) cells in MT are constructed by pooling over MT CDS cells with a wide range of preferred directions. Responses of our model neurons are comparable to electrophysiological results for grating and plaid stimuli as well as speed tuning. The behavioral response of the network in a motion discrimination task is in agreement with psychophysical data. Moreover, our implementation outperforms a previous implementation of the motion energy model by orders of magnitude in terms of computational speed and memory usage. The full network, which comprises 153,216 neurons and approximately 40 million synapses, processes 20 frames per second of a 40 × 40 input video in real-time using a single off-the-shelf GPU. To promote the use of this algorithm among neuroscientists and computer vision researchers, the source code for the simulator, the network, and analysis scripts are publicly available.
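One specific step in the model, pooling component-direction-selective (CDS) responses into a pattern-direction-selective (PDS) cell, can be summarized in a few lines. The pooling kernel below (von Mises excitation plus a broad subtractive inhibition) is a common simplification of such models, not necessarily the paper's exact weights.

```python
import numpy as np

directions = np.deg2rad(np.arange(0, 360, 15))    # CDS preferred directions

def pds_response(cds_rates, preferred, kappa=2.0, inhibition=0.3):
    """Pool component-direction-selective rates into one PDS response."""
    w = np.exp(kappa * np.cos(directions - preferred))  # von Mises weights
    w /= w.sum()
    pooled = (w * cds_rates).sum() - inhibition * cds_rates.mean()
    return max(pooled, 0.0)                             # rectification

cds = np.random.default_rng(5).random(len(directions))  # toy CDS rates
print(pds_response(cds, preferred=np.deg2rad(90)))
```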
The Nature Index: A General Framework for Synthesizing Knowledge on the State of Biodiversity
Certain, Grégoire; Skarpaas, Olav; Bjerke, Jarle-Werner; Framstad, Erik; Lindholm, Markus; Nilsen, Jan-Erik; Norderhaug, Ann; Oug, Eivind; Pedersen, Hans-Christian; Schartau, Ann-Kristin; van der Meeren, Gro I.; Aslaksen, Iulie; Engen, Steinar; Garnåsjordet, Per-Arild; Kvaløy, Pål; Lillegård, Magnar; Yoccoz, Nigel G.; Nybø, Signe
2011-01-01
The magnitude and urgency of the biodiversity crisis are widely recognized within scientific and political organizations. However, a lack of integrated measures for biodiversity has greatly constrained the national and international response to the crisis; integrated biodiversity indexes would greatly facilitate the transfer of information from science to other areas of human society. The Nature Index framework samples scientific information on biodiversity from a variety of sources, synthesizes this information, and then transmits it in a simplified form to environmental managers, policymakers, and the public. The Nature Index optimizes information use by incorporating expert judgment, monitoring-based estimates, and model-based estimates. The index relies on a network of scientific experts, each of whom is responsible for one or more biodiversity indicators. The resulting set of indicators is intended to represent the best available knowledge on the state of biodiversity and ecosystems in any given area. The value of each indicator is scaled relative to a reference state, i.e., a predicted value assessed by each expert for a hypothetical undisturbed or sustainably managed ecosystem. Scaled indicator values can be aggregated or disaggregated over different axes representing spatiotemporal dimensions or thematic groups. A range of scaling models can be applied to allow for different interpretations of the reference states, e.g., as optimal situations or as minimum sustainable levels. Statistical testing for differences in space or time can be implemented using Monte Carlo simulations. This study presents the Nature Index framework and details its implementation in Norway. The results suggest that the framework is a functional, efficient, and pragmatic approach for gathering and synthesizing scientific knowledge on the state of biodiversity in any marine or terrestrial ecosystem and has general applicability worldwide. PMID:21526118
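The scaling, aggregation, and testing steps described in this abstract can be made concrete. The sketch below is a simplified reading of the framework, not the published implementation: each indicator is scaled against its expert-assessed reference state and truncated at 1, the index is a weighted mean of scaled values, and a Monte Carlo resample of indicator uncertainty tests for a change between two assessment years. All parameter names, weights, and uncertainty assumptions (normal errors) are illustrative.

```python
import numpy as np

rng = np.random.default_rng(42)

def scale_indicator(observed, reference):
    """Scale an indicator against its reference state; values at or
    above the reference (hypothetical undisturbed or sustainably
    managed state) are truncated at 1."""
    return min(observed / reference, 1.0)

def nature_index(observed, reference, weights):
    """Weighted mean of scaled indicator values (weights might
    represent area or thematic importance)."""
    scaled = [scale_indicator(o, r) for o, r in zip(observed, reference)]
    return np.average(scaled, weights=weights)

def monte_carlo_diff(obs_t1, obs_t2, se_t1, se_t2, reference, weights, n=10000):
    """Monte Carlo test for a change in the index between two years,
    resampling each indicator from its (assumed normal) uncertainty."""
    diffs = np.empty(n)
    for i in range(n):
        d1 = rng.normal(obs_t1, se_t1)
        d2 = rng.normal(obs_t2, se_t2)
        diffs[i] = (nature_index(d2, reference, weights)
                    - nature_index(d1, reference, weights))
    # Two-sided "p-value": how often the simulated difference crosses zero.
    return diffs.mean(), 2 * min((diffs > 0).mean(), (diffs < 0).mean())

# Illustrative usage with three indicators.
ref = np.array([100.0, 50.0, 10.0])
w = np.array([0.5, 0.3, 0.2])
print(monte_carlo_diff(np.array([80.0, 45.0, 6.0]),
                       np.array([70.0, 44.0, 5.0]),
                       ref * 0.05, ref * 0.05, ref, w))
```

Truncating at the reference value is one of several possible scaling models; under a "minimum sustainable level" interpretation the scaling function would differ.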
An i2b2-based, generalizable, open source, self-scaling chronic disease registry
Natter, Marc D; Quan, Justin; Ortiz, David M; Bousvaros, Athos; Ilowite, Norman T; Inman, Christi J; Marsolo, Keith; McMurry, Andrew J; Sandborg, Christy I; Schanberg, Laura E; Wallace, Carol A; Warren, Robert W; Weber, Griffin M; Mandl, Kenneth D
2013-01-01
Objective: Registries are a well-established mechanism for obtaining high quality, disease-specific data, but are often highly project-specific in their design, implementation, and policies for data use. In contrast to the conventional model of centralized data contribution, warehousing, and control, we design a self-scaling registry technology for collaborative data sharing, based upon the widely adopted Integrating Biology & the Bedside (i2b2) data warehousing framework and the Shared Health Research Information Network (SHRINE) peer-to-peer networking software. Materials and methods: Focusing our design around creation of a scalable solution for collaboration within multi-site disease registries, we leverage the i2b2 and SHRINE open source software to create a modular, ontology-based, federated infrastructure that provides research investigators full ownership and access to their contributed data while supporting permissioned yet robust data sharing. We accomplish these objectives via web services supporting peer-group overlays, group-aware data aggregation, and administrative functions. Results: The 56-site Childhood Arthritis & Rheumatology Research Alliance (CARRA) Registry and 3-site Harvard Inflammatory Bowel Diseases Longitudinal Data Repository now utilize i2b2 self-scaling registry technology (i2b2-SSR). This platform, extensible to federation of multiple projects within and between research networks, encompasses >6000 subjects at sites throughout the USA. Discussion: We utilize the i2b2-SSR platform to minimize technical barriers to collaboration while enabling fine-grained control over data sharing. Conclusions: The implementation of i2b2-SSR for the multi-site, multi-stakeholder CARRA Registry has established a digital infrastructure for community-driven research data sharing in pediatric rheumatology in the USA. We envision i2b2-SSR as a scalable, reusable solution facilitating interdisciplinary research across diseases. PMID:22733975
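The federated, group-aware aggregation pattern described in this abstract can be illustrated at a high level. The sketch below is emphatically not the i2b2/SHRINE API; it is a hypothetical Python illustration of the design idea, in which each site keeps row-level data local, answers a query only with aggregate counts, and only for peer groups it participates in, while a coordinator sums the per-site aggregates. All class, group, and diagnosis names are invented for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class Site:
    """A registry site holding its own data; only aggregates leave the site."""
    name: str
    peer_groups: set
    # Patient-level records stay local; here reduced to {diagnosis: count}.
    counts: dict = field(default_factory=dict)

    def query(self, diagnosis, peer_group):
        """Answer only if this site participates in the requesting peer group."""
        if peer_group not in self.peer_groups:
            return None                      # permission denied, nothing shared
        return self.counts.get(diagnosis, 0)

def federated_count(sites, diagnosis, peer_group):
    """Coordinator: sum per-site aggregates without touching row-level data."""
    answers = (s.query(diagnosis, peer_group) for s in sites)
    return sum(a for a in answers if a is not None)

# Illustrative usage: two sites in a shared peer group, one outside it.
sites = [
    Site("site_a", {"carra"}, {"JIA": 120}),
    Site("site_b", {"carra", "ibd"}, {"JIA": 45}),
    Site("site_c", {"ibd"}, {"JIA": 9}),      # not in 'carra', excluded
]
print(federated_count(sites, "JIA", "carra"))  # -> 165
```

The design choice being illustrated is that data ownership and permissioning live at the site, not at a central warehouse, which is what allows such a registry network to scale by adding sites rather than renegotiating a central data-use agreement.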
NASA Astrophysics Data System (ADS)
Peltola, Olli; Raivonen, Maarit; Li, Xuefei; Vesala, Timo
2018-02-01
Emission via bubbling, i.e. ebullition, is one of the main methane (CH4) emission pathways from wetlands to the atmosphere. Direct measurement of gas bubble formation, growth, and release in the peat-water matrix is challenging; as a consequence, these processes are poorly understood and only coarsely represented in current wetland CH4 emission models. In this study we aimed to evaluate three ebullition modelling approaches and their effect on model performance. This was achieved by implementing the three approaches in one process-based CH4 emission model. Each approach is based on a threshold: on CH4 pore water concentration (ECT), on pressure (EPT), or on free-phase gas volume (EBG). The model was run using 4 years of data from a boreal sedge fen, and the results were compared with eddy covariance measurements of CH4 fluxes.
Modelled annual CH4 emissions were largely unaffected by the choice of ebullition modelling approach; however, the temporal variability in CH4 emissions differed by an order of magnitude between the approaches. The ebullition modelling approach therefore drives the temporal variability in modelled CH4 emissions and significantly affects, for instance, high-frequency (daily scale) model comparison and calibration against measurements. The approach based on the most recent knowledge of the ebullition process (the volume threshold, EBG) agreed best with the measured fluxes (R2 = 0.63) and hence produced the most reasonable results, although there was a scale mismatch between the measurements (ecosystem scale with heterogeneous ebullition locations) and the model results (a single, horizontally homogeneous peat column). This approach should be favoured over the two other, more widely used ebullition modelling approaches, and researchers are encouraged to implement it in their CH4 emission models.
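The volume-threshold (EBG-style) idea can be sketched for a single peat layer. The code below is an illustrative simplification, not the authors' model: dissolved CH4 in excess of a solubility limit partitions into a bubble pool, and once the bubble volume fraction of the pore space exceeds a threshold, the whole pool is vented as an ebullition burst. All parameter values (solubility, porosity, threshold, production rate) are assumptions chosen only to make the episodic behaviour visible.

```python
def ebullition_step(ch4_dissolved, bubble_volume,
                    solubility=0.8,      # mol m-3, illustrative solubility limit
                    molar_volume=0.0245, # m3 mol-1, ideal gas near 25 degC
                    pore_space=0.9,      # m3 m-3, assumed peat porosity
                    v_threshold=0.02):   # bubble volume fraction triggering release
    """One time step of a volume-threshold (EBG-style) ebullition scheme
    for a single, horizontally homogeneous peat layer (1 m3 basis)."""
    # 1. CH4 in excess of solubility moves from solution into bubbles.
    excess = max(ch4_dissolved - solubility, 0.0)     # mol
    ch4_dissolved -= excess
    bubble_volume += excess * molar_volume            # m3

    # 2. Vent the entire bubble pool once its volume exceeds the
    #    threshold fraction of the pore space.
    flux = 0.0
    if bubble_volume > v_threshold * pore_space:
        flux = bubble_volume / molar_volume           # mol per time step
        bubble_volume = 0.0

    return ch4_dissolved, bubble_volume, flux

# Illustrative usage: steady CH4 production slowly inflates the bubble pool,
# producing episodic bursts rather than a continuous release.
dissolved, bubbles = 0.5, 0.0
for step in range(10):
    dissolved += 0.3                                  # production, mol per step
    dissolved, bubbles, flux = ebullition_step(dissolved, bubbles)
    print(f"step {step}: bubble volume {bubbles:.4f} m3, flux {flux:.2f} mol")
```

This accumulate-then-burst behaviour is exactly why the choice of trigger (concentration, pressure, or volume) changes the modelled temporal variability so strongly while leaving annual totals, which are set by production, nearly unchanged.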
Can AIDS prevention move to sufficient scale?
Slutkin, G
1993-05-01
Much has been learned about which AIDS prevention interventions are effective and what an AIDS prevention program should look like. It is also clear that important program issues must be worked out at the country level if effective interventions are to be delivered. In most countries, however, programs with successful interventions and approaches have yet to be implemented on a sufficiently large scale. While some national programs are beginning to use proven interventions and are moving toward implementing full-scale national AIDS programs, most AIDS prevention programs do not incorporate condom marketing, do not use mass media and advertising in a well-programmed way, do not have peer projects reaching the populations most at risk, and do not have systems in place to diagnose and treat persons with sexually transmitted diseases (STDs). Far more planning and resources for AIDS prevention are needed from national and international public and private sectors. International efforts by the World Health Organization (WHO), UNICEF, UNDP, UNESCO, UNFPA, and the World Bank have increased markedly over the past few years. Bilaterally, the US, Sweden, the United Kingdom, Canada, the Netherlands, Norway, Denmark, Japan, Germany, France, and other countries are contributing to WHO/GPA and to direct bilateral AIDS prevention activities. USAID is the largest single contributor to WHO/GPA and also runs the largest bilateral program, with its $168 million AIDSCAP project funded over 5 years. AIDSCAP integrates condom distribution and marketing, STD prevention and control, and behavioral change and communication strategies through person-to-person and mass media approaches, together with strong evaluation components. AIDSCAP can help demonstrate that programs can be developed on a country-wide level by showing how behavior can be changed across a broad geographical area.
An i2b2-based, generalizable, open source, self-scaling chronic disease registry.
Natter, Marc D; Quan, Justin; Ortiz, David M; Bousvaros, Athos; Ilowite, Norman T; Inman, Christi J; Marsolo, Keith; McMurry, Andrew J; Sandborg, Christy I; Schanberg, Laura E; Wallace, Carol A; Warren, Robert W; Weber, Griffin M; Mandl, Kenneth D
2013-01-01
Registries are a well-established mechanism for obtaining high quality, disease-specific data, but are often highly project-specific in their design, implementation, and policies for data use. In contrast to the conventional model of centralized data contribution, warehousing, and control, we design a self-scaling registry technology for collaborative data sharing, based upon the widely adopted Integrating Biology & the Bedside (i2b2) data warehousing framework and the Shared Health Research Information Network (SHRINE) peer-to-peer networking software. Focusing our design around creation of a scalable solution for collaboration within multi-site disease registries, we leverage the i2b2 and SHRINE open source software to create a modular, ontology-based, federated infrastructure that provides research investigators full ownership and access to their contributed data while supporting permissioned yet robust data sharing. We accomplish these objectives via web services supporting peer-group overlays, group-aware data aggregation, and administrative functions. The 56-site Childhood Arthritis & Rheumatology Research Alliance (CARRA) Registry and 3-site Harvard Inflammatory Bowel Diseases Longitudinal Data Repository now utilize i2b2 self-scaling registry technology (i2b2-SSR). This platform, extensible to federation of multiple projects within and between research networks, encompasses >6000 subjects at sites throughout the USA. We utilize the i2b2-SSR platform to minimize technical barriers to collaboration while enabling fine-grained control over data sharing. The implementation of i2b2-SSR for the multi-site, multi-stakeholder CARRA Registry has established a digital infrastructure for community-driven research data sharing in pediatric rheumatology in the USA. We envision i2b2-SSR as a scalable, reusable solution facilitating interdisciplinary research across diseases.