CytoSPADE: high-performance analysis and visualization of high-dimensional cytometry data
Linderman, Michael D.; Simonds, Erin F.; Qiu, Peng; Bruggner, Robert V.; Sheode, Ketaki; Meng, Teresa H.; Plevritis, Sylvia K.; Nolan, Garry P.
2012-01-01
Motivation: Recent advances in flow cytometry enable simultaneous single-cell measurement of 30+ surface and intracellular proteins. CytoSPADE is a high-performance implementation of an interface for the Spanning-tree Progression Analysis of Density-normalized Events algorithm for tree-based analysis and visualization of this high-dimensional cytometry data. Availability: Source code and binaries are freely available at http://cytospade.org and via Bioconductor version 2.10 onwards for Linux, OSX and Windows. CytoSPADE is implemented in R, C++ and Java. Contact: michael.linderman@mssm.edu Supplementary Information: Additional documentation available at http://cytospade.org. PMID:22782546
High School CPR/AED Training in Washington State.
Salvatierra, Gail G; Palazzo, Steven J; Emery, Allison
2017-05-01
Describe the rates of CPR/AED training in high schools in the state of Washington after passage of legislation mandating CPR/AED training. A web-based survey was sent to administrators at 660 public and private high schools in the state of Washington. The survey was completed by 148 schools (22%); 64% reported providing CPR training and 54% provided AED training. Reported barriers to implementation included instructor availability, cost, and a lack of equipment. Descriptive statistics were used to describe the sample characteristics and implementation rates. Mandates without resources and support do not ensure implementation of CPR/AED training in high schools. Full public health benefits of a CPR mandate will not be realized until barriers to implementation are identified and eliminated through use of available, accessible public health resources. © 2016 Wiley Periodicals, Inc.
Sports-Related Emergency Preparedness in Oregon High Schools
Johnson, Samuel T.; Norcross, Marc F.; Bovbjerg, Viktor E.; Hoffman, Mark A.; Chang, Eunwook; Koester, Michael C.
2017-01-01
Background: Best practice recommendations for sports-related emergency preparation include implementation of venue-specific emergency action plans (EAPs), access to early defibrillation, and first responders—specifically coaches—trained in cardiopulmonary resuscitation and automated external defibrillator (AED) use. The objective was to determine whether high schools had implemented these 3 recommendations and whether schools with a certified athletic trainer (AT) were more likely to have done so. Hypothesis: Schools with an AT were more likely to have implemented the recommendations. Study Design: Cross-sectional study. Level of Evidence: Level 4. Methods: All Oregon School Activities Association member school athletic directors were invited to complete a survey on sports-related emergency preparedness and AT availability at their school. Chi-square and Fisher exact tests were used to analyze the associations between emergency preparedness and AT availability. Results: In total, 108 respondents (37% response rate) completed the survey. Exactly half reported having an AT available. Only 11% (95% CI, 6%-19%) of the schools had implemented all 3 recommendations, 29% (95% CI, 21%-39%) had implemented 2, 32% (95% CI, 24%-42%) had implemented 1, and 27% (95% CI, 19%-36%) had not implemented any of the recommendations. AT availability was associated with implementation of the recommendations (χ2 = 10.3, P = 0.02), and the proportion of schools with ATs increased with the number of recommendations implemented (χ2 = 9.3, P < 0.01). Schools with an AT were more likely to implement venue-specific EAPs (52% vs 24%, P < 0.01) and have an AED available for early defibrillation (69% vs 44%, P = 0.02) but not more likely to require coach training (33% vs 28%, P = 0.68). Conclusions: Despite best practice recommendations, most schools were inadequately prepared for sports-related emergencies. 
Schools with an AT were more likely to implement some, but not all, of the recommendations. Policy changes may be needed to improve implementation. Clinical Relevance: Most Oregon high schools need to do more to prepare for sports-related emergencies. The results provide evidence for sports medicine professionals and administrators to inform policy changes that ensure the safety of athletes. PMID:28129072
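The association tests reported in this abstract are Pearson chi-square tests on contingency tables. A minimal sketch of the 2x2 case follows; the counts below are hypothetical illustrations, not the study's raw data (rows: AT available / not available; columns: EAP implemented / not implemented).

```python
# Hypothetical 2x2 contingency table (illustrative counts only):
# rows = AT available / not available,
# columns = venue-specific EAP implemented / not implemented.
def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 table of observed counts."""
    (a, b), (c, d) = table
    total = a + b + c + d
    row_sums = (a + b, c + d)
    col_sums = (a + c, b + d)
    chi2 = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            expected = row_sums[i] * col_sums[j] / total
            chi2 += (obs - expected) ** 2 / expected
    return chi2

table = [[30, 24], [13, 41]]
print(round(chi_square_2x2(table), 2))  # Pearson chi-square statistic
```

In practice the statistic would be compared against the chi-square distribution with 1 degree of freedom to obtain the P values quoted in the abstract.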
Implementing Cardiopulmonary Resuscitation Training Programs in High Schools: Iowa's Experience.
Hoyme, Derek B; Atkins, Dianne L
2017-02-01
To understand perceived barriers to providing cardiopulmonary resuscitation (CPR) education, implementation processes, and practices in high schools. Iowa has required CPR as a graduation requirement since 2011 as an unfunded mandate. A cross-sectional study was performed through multiple choice surveys sent to Iowa high schools to collect data about school demographics, details of CPR programs, cost, logistics, and barriers to implementation, as well as automated external defibrillator training and availability. Eighty-four schools responded (26%), with the most frequently reported school size of 100-500 students and faculty size of 25-50. When the law took effect, 51% of schools had training programs already in place; at the time of the study, 96% had successfully implemented CPR training. Perceived barriers to implementation were staffing, time commitment, equipment availability, and cost. The average estimated startup cost was <$1000 US, and the yearly maintenance cost was <$500 with funds typically allocated from existing school resources. The facilitator was a school official or volunteer for 81% of schools. Average estimated training time commitment per student was <2 hours. Automated external defibrillators are available in 98% of schools, and 61% include automated external defibrillator training in their curriculum. Despite perceived barriers, school CPR training programs can be implemented with reasonable resource and time allocations. Copyright © 2016 Elsevier Inc. All rights reserved.
Nielsen, Katie R; Becerra, Rosario; Mallma, Gabriela; Tantaleán da Fieno, José
2018-01-01
Acute lower respiratory infections are the leading cause of death outside the neonatal period for children less than 5 years of age. Widespread availability of invasive and non-invasive mechanical ventilation in resource-rich settings has reduced mortality rates; however, these technologies are not always available in many low- and middle-income countries due to the high cost and trained personnel required to implement and sustain their use. High flow nasal cannula (HFNC) is a form of non-invasive respiratory support with growing evidence for use in pediatric respiratory failure. Its simple interface makes utilization in resource-limited settings appealing, although widespread implementation in these settings lags behind resource-rich settings. Implementation science is an emerging field dedicated to closing the know-do gap by incorporating evidence-based interventions into routine care, and its principles have guided the scaling up of many global health interventions. In 2016, we introduced HFNC use for respiratory failure in a pediatric intensive care unit in Lima, Peru using implementation science methodology. Here, we review our experience in the context of the principles of implementation science to serve as a guide for others considering HFNC implementation in resource-limited settings.
Morgan, Martin; Anders, Simon; Lawrence, Michael; Aboyoun, Patrick; Pagès, Hervé; Gentleman, Robert
2009-01-01
Summary: ShortRead is a package for input, quality assessment, manipulation and output of high-throughput sequencing data. ShortRead is provided in the R and Bioconductor environments, allowing ready access to additional facilities for advanced statistical analysis, data transformation, visualization and integration with diverse genomic resources. Availability and Implementation: This package is implemented in R and available at the Bioconductor web site; the package contains a ‘vignette’ outlining typical work flows. Contact: mtmorgan@fhcrc.org PMID:19654119
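The quality assessment that ShortRead automates rests on Phred-scaled base qualities encoded in FASTQ records. A minimal Python sketch of the underlying decoding (illustrative only; ShortRead itself is an R/Bioconductor package with its own API):

```python
# Decode Phred+33 quality strings from a FASTQ record and summarise
# per-read mean quality (the standard Sanger/Illumina 1.8+ encoding).
def phred33_scores(quality_string):
    """Convert a FASTQ quality string to a list of Phred scores."""
    return [ord(ch) - 33 for ch in quality_string]

def mean_quality(quality_string):
    """Mean Phred score of a read's quality string."""
    scores = phred33_scores(quality_string)
    return sum(scores) / len(scores)

print(phred33_scores("II5!"))  # [40, 40, 20, 0]
print(mean_quality("IIII"))    # 40.0
```

A Phred score of 40 corresponds to an estimated base-call error probability of 1 in 10,000.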
[Family Health Program implementation in municipalities in Mato Grosso State, Brazil].
Canesqui, Ana Maria; Spinelli, Maria Angélica do Santos
2008-04-01
This article analyzes some key aspects of the implementation of the Family Health Program (FHP): results; conditions and institutional mechanisms; flow and regularity of funding; organizational structures; and human resources availability and training. The study was conducted in seven municipalities (counties) in the State of Mato Grosso, Brazil, and used secondary data as well as primary data from interviews with different stakeholders. The research design was evaluative, using a quantitative/qualitative analysis. The results showed: varying stages in the implementation process, different FHP models, and adaptation of organizational structures; high level of human resources availability, except for nurse assistants; availability of financial resources, with some difficulties in their flow; and other institutional factors that hinder or facilitate the micro-implementation process in the municipalities.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-16
...' health, safety, employment, mobility, and education; and 3. Neighborhood: Transform distressed, high..., high quality public schools and education programs, high quality early learning programs and services..., communities must develop and implement a comprehensive neighborhood revitalization strategy, or Transformation...
Code of Federal Regulations, 2012 CFR
2012-10-01
... Guiding Principles for Federal Leadership in High-Performance and Sustainable Buildings (available at http... shall implement high-performance sustainable building design, construction, renovation, repair...
Code of Federal Regulations, 2013 CFR
2013-10-01
... Guiding Principles for Federal Leadership in High-Performance and Sustainable Buildings (available at http... shall implement high-performance sustainable building design, construction, renovation, repair...
Diroma, Maria Angela; Santorsola, Mariangela; Guttà, Cristiano; Gasparre, Giuseppe; Picardi, Ernesto; Pesole, Graziano; Attimonelli, Marcella
2014-01-01
Motivation: The increasing availability of mitochondria-targeted and off-target sequencing data in whole-exome and whole-genome sequencing studies (WXS and WGS) has raised the demand for effective pipelines to accurately measure heteroplasmy and to easily recognize the most functionally important mitochondrial variants among a huge number of candidates. For this purpose, we developed MToolBox, a highly automated pipeline to reconstruct and analyze human mitochondrial DNA from high-throughput sequencing data. Results: MToolBox implements an effective computational strategy for mitochondrial genome assembly and haplogroup assignment, also including a prioritization analysis of detected variants. MToolBox provides a Variant Call Format file featuring, for the first time, allele-specific heteroplasmy and annotation files with prioritized variants. MToolBox was tested on simulated samples and applied on 1000 Genomes WXS datasets. Availability and implementation: MToolBox package is available at https://sourceforge.net/projects/mtoolbox/. Contact: marcella.attimonelli@uniba.it Supplementary information: Supplementary data are available at Bioinformatics online. PMID:25028726
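The heteroplasmy the abstract refers to is conventionally estimated as the fraction of aligned reads carrying the alternate allele at a site. A simplified sketch of that calculation (illustrative only, not MToolBox's actual pipeline; the 0.01/0.99 thresholds are arbitrary examples):

```python
# Estimate heteroplasmy at a mitochondrial site as the alternate-allele
# read fraction (a simplified sketch, not MToolBox's implementation).
def heteroplasmy_fraction(ref_reads, alt_reads):
    """Fraction of reads supporting the alternate allele at one site."""
    total = ref_reads + alt_reads
    if total == 0:
        raise ValueError("no coverage at this site")
    return alt_reads / total

def is_heteroplasmic(ref_reads, alt_reads, low=0.01, high=0.99):
    """Flag a site as heteroplasmic when the alternate-allele fraction
    lies strictly between the homoplasmic extremes (thresholds are
    arbitrary illustrative values)."""
    f = heteroplasmy_fraction(ref_reads, alt_reads)
    return low <= f <= high

print(heteroplasmy_fraction(80, 20))  # 0.2
```

Real pipelines additionally account for sequencing error, strand bias and nuclear mitochondrial pseudogenes (NUMTs) before calling a site heteroplasmic.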
Code of Federal Regulations, 2014 CFR
2014-10-01
... Principles for Federal Leadership in High-Performance and Sustainable Buildings (available at http://www.wbdg... implement high-performance sustainable building design, construction, renovation, repair, commissioning...
Selection and Implementation of a Simulated Electronic Medical Record (EMR) in a Nursing Skills Lab
ERIC Educational Resources Information Center
Curry, David G.
2011-01-01
SUNY Plattsburgh has a baccalaureate nursing program that has been active in integrating technology in nursing education for many years. Recently, the faculty implemented human simulation (Laerdal's SimMan) in the Nursing Skills Lab (NSL) to provide some uniform clinical experiences (high frequency or high risk scenarios) not always available in…
A distributed infrastructure for publishing VO services: an implementation
NASA Astrophysics Data System (ADS)
Cepparo, Francesco; Scagnetto, Ivan; Molinaro, Marco; Smareglia, Riccardo
2016-07-01
This contribution describes both the design and the implementation details of a new solution for publishing VO services, highlighting its maintainable, distributed, modular and scalable architecture. Indeed, the new publisher is multithreaded and multiprocess. Multiple instances of the modules can run on different machines to ensure high performance and high availability, and this will be true both for the interface modules of the services and the back end data access ones. The system uses message passing to let its components communicate through an AMQP message broker that can itself be distributed to provide better scalability and availability.
Preemptive clinical pharmacogenetics implementation: current programs in five US medical centers.
Dunnenberger, Henry M; Crews, Kristine R; Hoffman, James M; Caudle, Kelly E; Broeckel, Ulrich; Howard, Scott C; Hunkler, Robert J; Klein, Teri E; Evans, William E; Relling, Mary V
2015-01-01
Although the field of pharmacogenetics has existed for decades, practitioners have been slow to implement pharmacogenetic testing in clinical care. Numerous publications describe the barriers to clinical implementation of pharmacogenetics. Recently, several freely available resources have been developed to help address these barriers. In this review, we discuss current programs that use preemptive genotyping to optimize the pharmacotherapy of patients. Array-based preemptive testing includes a large number of relevant pharmacogenes that impact multiple high-risk drugs. Using a preemptive approach allows genotyping results to be available prior to any prescribing decision so that genomic variation may be considered as an inherent patient characteristic in the planning of therapy. This review describes the common elements among programs that have implemented preemptive genotyping and highlights key processes for implementation, including clinical decision support.
Sybil--efficient constraint-based modelling in R.
Gelius-Dietrich, Gabriel; Desouki, Abdelmoneim Amer; Fritzemeier, Claus Jonathan; Lercher, Martin J
2013-11-13
Constraint-based analyses of metabolic networks are widely used to simulate the properties of genome-scale metabolic networks. Publicly available implementations tend to be slow, impeding large scale analyses such as the genome-wide computation of pairwise gene knock-outs, or the automated search for model improvements. Furthermore, available implementations cannot easily be extended or adapted by users. Here, we present sybil, an open source software library for constraint-based analyses in R; R is a free, platform-independent environment for statistical computing and graphics that is widely used in bioinformatics. Among other functions, sybil currently provides efficient methods for flux-balance analysis (FBA), MOMA, and ROOM that are about ten times faster than previous implementations when calculating the effect of whole-genome single gene deletions in silico on a complete E. coli metabolic model. Due to the object-oriented architecture of sybil, users can easily build analysis pipelines in R or even implement their own constraint-based algorithms. Based on its highly efficient communication with different mathematical optimisation programs, sybil facilitates the exploration of high-dimensional optimisation problems on small time scales. Sybil and all its dependencies are open source. Sybil and its documentation are available for download from the comprehensive R archive network (CRAN).
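The constraint-based analyses sybil performs (FBA and its variants) optimize steady-state reaction fluxes subject to stoichiometric and capacity constraints. The following is a deliberately simplified Python sketch, not sybil's R API: the toy network is a substrate uptake feeding two parallel conversion reactions, a topology simple enough that the linear-programming optimum reduces to a bottleneck calculation, which lets us illustrate the in-silico single and pairwise deletions mentioned above.

```python
# Toy flux-balance sketch (illustrative only, not sybil's API).
# Substrate uptake (capacity 10) feeds two parallel reactions with
# capacities 4 and 3; at steady state the optimal "biomass" flux is
# min(uptake capacity, sum of surviving parallel capacities).
UPTAKE_CAP = 10.0
PARALLEL_CAPS = {"rxn_a": 4.0, "rxn_b": 3.0}

def max_biomass_flux(knocked_out=()):
    """Optimal steady-state biomass flux after deleting the given reactions."""
    surviving = sum(cap for name, cap in PARALLEL_CAPS.items()
                    if name not in knocked_out)
    return min(UPTAKE_CAP, surviving)

# Single and pairwise in-silico deletions, as in a knock-out screen.
wild_type = max_biomass_flux()
singles = {r: max_biomass_flux((r,)) for r in PARALLEL_CAPS}
double = max_biomass_flux(("rxn_a", "rxn_b"))
print(wild_type, singles, double)  # 7.0 {'rxn_a': 3.0, 'rxn_b': 4.0} 0.0
```

For genome-scale networks the optimum is not analytic, which is why tools like sybil delegate the optimization to dedicated LP solvers; the efficiency of that solver communication is exactly the performance gain the abstract reports.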
Web-based network analysis and visualization using CellMaps
Salavert, Francisco; García-Alonso, Luz; Sánchez, Rubén; Alonso, Roberto; Bleda, Marta; Medina, Ignacio; Dopazo, Joaquín
2016-01-01
Summary: CellMaps is an HTML5 open-source web tool that allows displaying, editing, exploring and analyzing biological networks as well as integrating metadata into them. Computations and analyses are remotely executed in high-end servers, and all the functionalities are available through RESTful web services. CellMaps can easily be integrated in any web page by using an available JavaScript API. Availability and Implementation: The application is available at: http://cellmaps.babelomics.org/ and the code can be found in: https://github.com/opencb/cell-maps. The client is implemented in JavaScript and the server in C and Java. Contact: jdopazo@cipf.es Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27296979
Häberle, Johannes; Huemer, Martina
2015-01-01
Implementation of guidelines and assessment of their adaptation is not an extensively investigated process in the field of rare diseases. However, whether targeted recipients are reached and willing and able to follow the recommendations has significant impact on the efficacy of guidelines. In 2012, a guideline for the management of urea cycle disorders (UCDs) has been published. We evaluate the efficacy of implementation, adaptation, and use of the UCD guidelines by applying different strategies. (i) Download statistics from online sources were recorded. (ii) Facilities relevant for the implementation of the guidelines were assessed in pediatric units in Germany and Austria. (iii) The guidelines were evaluated by targeted recipients using the AGREE instrument. (iv) A regional networking-based implementation process was evaluated. (i) Download statistics revealed high access with an increase in downloads over time. (ii) In 18% of hospitals ammonia testing was not available 24/7, and emergency drugs were often not available. (iii) Recipient criticism expressed in the AGREE instrument focused on incomplete inclusion of patients' perspectives. (iv) The implementation process improved the availability of ammonia measurements and access to emergency medication, patient care processes, and cooperation between nonspecialists and specialists. Interest in the UCD guidelines is high and sustained, but more precise targeting of the guidelines is advisable. Surprisingly, many hospitals do not possess all facilities necessary to apply the guidelines. Regional network and awareness campaigns result in the improvement of both facilities and knowledge.
A comparison of high-speed links, their commercial support and ongoing R&D activities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gonzalez, H.L.; Barsotti, E.; Zimmermann, S.
Technological advances and a demanding market have forced the development of higher bandwidth communication standards for networks, data links and busses. Most of these emerging standards are gathering enough momentum that their widespread availability and lower prices are anticipated. The hardware and software that support the physical media for most of these links is currently available, allowing the user community to implement fairly high-bandwidth data links and networks with commercial components. Also, switches needed to support these networks are available or being developed. The commercial support of high-bandwidth data links, networks and switching fabrics provides a powerful base for the implementation of high-bandwidth data acquisition systems. A large data acquisition system like the one for the Solenoidal Detector Collaboration (SDC) at the SSC can benefit from links and networks that support an integrated systems engineering approach, for initialization, downloading, diagnostics, monitoring, hardware integration and event data readout. The issue that our current work addresses is the possibility of having a channel/network that satisfies the requirements of an integrated data acquisition system. In this paper we present a brief description of high-speed communication links and protocols that we consider of interest for high energy physics: the High Performance Parallel Interface (HIPPI), Serial HIPPI, Fibre Channel (FC) and the Scalable Coherent Interface (SCI). In addition, the initial work required to implement an SDC-like data acquisition system is described.
Gittelsohn, Joel; Suratkar, Sonali; Song, Hee-Jung; Sacher, Suzanne; Rajan, Radha; Rasooly, Irit R.; Bednarek, Erin; Sharma, Sangita; Anliker, Jean A.
2011-01-01
Reduced access to affordable healthy foods is linked to higher rates of chronic diseases in low-income urban settings. The authors conduct a feasibility study of an environmental intervention (Baltimore Healthy Stores) in seven corner stores owned by Korean Americans and two supermarkets in low-income East Baltimore. The goal is to increase the availability of healthy food options and to promote them at the point of purchase. The process evaluation is conducted largely by external evaluators. Participating stores stock promoted foods, and print materials are displayed with moderate to high fidelity. Interactive consumer taste tests are implemented with high reach and dose. Materials developed specifically for Korean American corner store owners are implemented with moderate to high fidelity and dose. Results indicate that small food store–based intervention programs are feasible to implement and are a viable means of increasing healthy food availability and a good location for point-of-purchase promotions in low-income urban settings. PMID:19144859
Domitrovich, Celene E.; Bradshaw, Catherine P.; Poduska, Jeanne M.; Hoagwood, Kimberly; Buckley, Jacquelyn A.; Olin, Serene; Romanelli, Lisa Hunter; Leaf, Philip J.; Greenberg, Mark T.; Ialongo, Nicholas S.
2011-01-01
Increased availability of research-supported, school-based prevention programs, coupled with the growing national policy emphasis on use of evidence-based practices, has contributed to a shift in research priorities from efficacy to implementation and dissemination. A critical issue in moving research to practice is ensuring high-quality implementation of both the intervention model and the support system for sustaining it. The paper describes a three-level framework for considering the implementation quality of school-based interventions. Future directions for research on implementation are discussed. PMID:27182282
The Alcohol Environment Protocol: A new tool for alcohol policy.
Casswell, Sally; Morojele, Neo; Williams, Petal Petersen; Chaiyasong, Surasak; Gordon, Ross; Gray-Philip, Gaile; Viet Cuong, Pham; MacKintosh, Anne-Marie; Halliday, Sharon; Railton, Renee; Randerson, Steve; Parry, Charles D H
2018-01-04
To report data on the implementation of alcohol policies regarding availability, marketing and drink driving, along with ratings of enforcement, from two small high-income countries, three high-middle-income countries and one low-middle-income country. This study uses the Alcohol Environment Protocol, an International Alcohol Control study research tool, which documents the alcohol policy environment by standardised collection of data from administrative sources, observational studies and interviews with key informants to allow for cross-country comparison and change over time. All countries showed adoption to varying extents of key effective policy approaches outlined in the World Health Organization Global Strategy to Reduce the Harmful Use of Alcohol (2010). High-income countries were more likely to allocate resources to enforcement. However, where enforcement and implementation were high, policy on availability was fairly liberal. Key informants judged alcohol to be very available in both high- and middle-income countries, reflecting liberal policy in the former and less implementation and enforcement and informal (unlicensed) sale of alcohol in the latter. Marketing was largely unrestricted in all countries and while drink-driving legislation was in place, it was less well enforced in middle-income countries. In countries with fewer resources, alcohol policies are less effective because of lack of implementation and enforcement and, in the case of marketing, lack of regulation. This has implications for the increase in consumption taking place as a result of the expanding distribution and marketing of commercial alcohol and consequent increases in alcohol-related harm. © 2018 The Authors Drug and Alcohol Review published by John Wiley & Sons Australia, Ltd on behalf of Australasian Professional Society on Alcohol and other Drugs.
A Practical, Hardware Friendly MMSE Detector for MIMO-OFDM-Based Systems
NASA Astrophysics Data System (ADS)
Kim, Hun Seok; Zhu, Weijun; Bhatia, Jatin; Mohammed, Karim; Shah, Anish; Daneshrad, Babak
2008-12-01
Design and implementation of a highly optimized MIMO (multiple-input multiple-output) detector requires co-optimization of the algorithm with the underlying hardware architecture. Special attention must be paid to application requirements such as throughput, latency, and resource constraints. In this work, we focus on a highly optimized, matrix-inversion-free MMSE (minimum mean square error) MIMO detector implementation. The work has resulted in a real-time field-programmable gate array (FPGA)-based implementation on a Xilinx Virtex-2 6000 using only 9003 logic slices, 66 multipliers, and 24 Block RAMs (less than 33% of the overall resources of this part). The design delivers over 420 Mbps sustained throughput with a small 2.77-microsecond latency. The designed linear MMSE MIMO detector is capable of complying with the proposed IEEE 802.11n standard.
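The linear MMSE detector referenced above computes, for the model y = Hx + n, the estimate x_hat = (H^H H + sigma^2 I)^{-1} H^H y. A minimal real-valued Python sketch of that formula follows (illustrative only: practical MIMO systems use complex-valued channels, and the paper's contribution is precisely an FPGA architecture that avoids the explicit inverse used here):

```python
# Real-valued 2x2 MMSE detection sketch: x_hat = (H^T H + s2*I)^-1 H^T y.
def transpose(M):
    return [list(row) for row in zip(*M)]

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def inv2(M):
    """Explicit inverse of a 2x2 matrix (for illustration only)."""
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def mmse_detect(H, y, sigma2):
    """MMSE estimate of x from y = H x + n with noise variance sigma2."""
    Ht = transpose(H)
    G = matmul(Ht, H)                       # H^T H
    A = [[G[0][0] + sigma2, G[0][1]],       # H^T H + sigma2 * I
         [G[1][0], G[1][1] + sigma2]]
    W = matmul(inv2(A), Ht)                 # MMSE filter
    return [sum(w * v for w, v in zip(row, y)) for row in W]

print(mmse_detect([[2, 0], [0, 4]], [2, 8], 0.0))  # [1.0, 2.0]
```

With sigma2 = 0 the filter reduces to the zero-forcing (least-squares) solution; a positive sigma2 shrinks the estimate, trading bias for noise suppression.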
nuMap: A Web Platform for Accurate Prediction of Nucleosome Positioning
Alharbi, Bader A.; Alshammari, Thamir H.; Felton, Nathan L.; Zhurkin, Victor B.; Cui, Feng
2014-01-01
Nucleosome positioning is critical for gene expression and of major biological interest. The high cost of experimentally mapping nucleosomal arrangement signifies the need for computational approaches to predict nucleosome positions at high resolution. Here, we present a web-based application to fulfill this need by implementing two models, YR and W/S schemes, for the translational and rotational positioning of nucleosomes, respectively. Our methods are based on sequence-dependent anisotropic bending that dictates how DNA is wrapped around a histone octamer. This application allows users to specify a number of options such as schemes and parameters for threading calculation and provides multiple layout formats. The nuMap is implemented in Java/Perl/MySQL and is freely available for public use at http://numap.rit.edu. The user manual, implementation notes, description of the methodology and examples are available at the site. PMID:25220945
nuMap: a web platform for accurate prediction of nucleosome positioning.
Alharbi, Bader A; Alshammari, Thamir H; Felton, Nathan L; Zhurkin, Victor B; Cui, Feng
2014-10-01
Nucleosome positioning is critical for gene expression and of major biological interest. The high cost of experimentally mapping nucleosomal arrangement signifies the need for computational approaches to predict nucleosome positions at high resolution. Here, we present a web-based application to fulfill this need by implementing two models, YR and W/S schemes, for the translational and rotational positioning of nucleosomes, respectively. Our methods are based on sequence-dependent anisotropic bending that dictates how DNA is wrapped around a histone octamer. This application allows users to specify a number of options such as schemes and parameters for threading calculation and provides multiple layout formats. nuMap is implemented in Java/Perl/MySQL and is freely available for public use at http://numap.rit.edu. The user manual, implementation notes, description of the methodology and examples are available at the site. Copyright © 2014 The Authors. Production and hosting by Elsevier Ltd. All rights reserved.
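The rotational-positioning idea behind the W/S scheme is that bendable dinucleotides recur with the ~10-bp helical period. The toy score below illustrates only that phasing intuition (counting WW dinucleotides per helical phase over a 147-bp window); it is an assumption-laden simplification, not the published nuMap model.

```python
# Toy W/S-style rotational positioning score: WW dinucleotides (W = A or T)
# that face the histone surface tend to recur every ~10 bp, so a window
# scores highly when its WW dinucleotides cluster at one helical phase.
# This is an illustration of the phasing idea only, not nuMap's scheme.

PERIOD = 10      # approximate DNA helical repeat, bp
NUC_LEN = 147    # length of nucleosomal DNA, bp

def ww_phase_counts(seq):
    """Count WW dinucleotides falling in each of the 10 helical phases."""
    counts = [0] * PERIOD
    for i in range(len(seq) - 1):
        if seq[i] in "AT" and seq[i + 1] in "AT":
            counts[i % PERIOD] += 1
    return counts

def rotational_score(window):
    """Excess of the best phase over the mean phase occupancy."""
    counts = ww_phase_counts(window)
    return max(counts) - sum(counts) / PERIOD

# A perfectly phased sequence (AA every 10 bp on a G background) versus a
# sequence with no WW dinucleotides at all:
phased = ("AA" + "G" * 8) * (NUC_LEN // 10) + "G" * (NUC_LEN % 10)
flat = "AG" * (NUC_LEN // 2) + "A"
s_phased = rotational_score(phased)
s_flat = rotational_score(flat)
```

The phased sequence concentrates all 14 of its WW dinucleotides at a single phase and scores far above the flat one.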
NetProt: Complex-based Feature Selection.
Goh, Wilson Wen Bin; Wong, Limsoon
2017-08-04
Protein complex-based feature selection (PCBFS) provides unparalleled reproducibility with high phenotypic relevance on proteomics data. Currently, there are five PCBFS paradigms, but not all representative methods have been implemented or made readily available. To allow general users to take advantage of these methods, we developed the R package NetProt, which provides implementations of representative feature-selection methods. NetProt also provides methods for generating simulated differential data and generating pseudocomplexes for complex-based performance benchmarking. The NetProt open source R package is available for download from https://github.com/gohwils/NetProt/releases/, and online documentation is available at http://rpubs.com/gohwils/204259.
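The core move shared by PCBFS paradigms is replacing unstable single-protein features with complex-level features. The sketch below shows one simple flavour of that aggregation (mean absolute per-protein score over a complex's measured members); it is a generic illustration with hypothetical inputs, not a specific NetProt method.

```python
# Minimal sketch of complex-based feature aggregation: rank protein
# complexes by the mean |differential score| (e.g. a t-statistic) of their
# members that were actually measured.  Generic illustration only; NetProt
# implements several published PCBFS paradigms, none reproduced here.

def score_complexes(protein_scores, complexes, min_size=2):
    """Return complexes ranked by mean |score| of their measured members."""
    ranked = []
    for name, members in complexes.items():
        measured = [abs(protein_scores[p]) for p in members
                    if p in protein_scores]
        if len(measured) >= min_size:       # skip under-covered complexes
            ranked.append((sum(measured) / len(measured), name))
    ranked.sort(reverse=True)
    return [(name, round(mean, 3)) for mean, name in ranked]

# Hypothetical per-protein scores and complex memberships:
scores = {"P1": 3.2, "P2": 2.8, "P3": 0.4, "P4": -0.1, "P5": 2.5}
complexes = {"CplxA": ["P1", "P2", "P5"],   # coherently perturbed
             "CplxB": ["P3", "P4"],          # flat
             "CplxC": ["P1", "P9"]}          # only one measured member
top = score_complexes(scores, complexes)
```

The coherently perturbed complex ranks first, while the complex with a single measured member is dropped rather than scored on thin evidence.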
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-19
... combination of combustion and post-combustion controls. EPA approached the five factor analysis using a top... from fuel-bound nitrogen and high temperature combustion; (2) post- combustion add-on control to reduce... is a combination of a post- combustion add-on control, i.e., selective catalytic reduction (SCR), and...
NASA Astrophysics Data System (ADS)
Franchetti, Franz; Sandryhaila, Aliaksei; Johnson, Jeremy R.
2014-06-01
In this paper we introduce High Assurance SPIRAL to solve the last-mile problem for the synthesis of high assurance implementations of controllers for vehicular systems that are executed in today's and future embedded and high-performance embedded system processors. High Assurance SPIRAL is a scalable methodology to translate a high-level specification of a high assurance controller into a highly resource-efficient, platform-adapted, verified control software implementation for a given platform in a language like C or C++. High Assurance SPIRAL proves that the implementation is equivalent to the specification written in the control engineer's domain language. Our approach scales to problems involving floating-point calculations and provides highly optimized synthesized code. It is possible to estimate the available headroom to enable assurance/performance trade-offs under real-time constraints, and to synthesize multiple implementation variants to make attacks harder. At the core of High Assurance SPIRAL is the Hybrid Control Operator Language (HCOL), which leverages advanced mathematical constructs expressing the controller specification to provide high-quality translation capabilities. Combined with a verified/certified compiler, High Assurance SPIRAL provides a complete solution to the efficient synthesis of verifiable high assurance controllers. We demonstrate High Assurance SPIRAL's capability by co-synthesizing proofs and implementations for attack detection and sensor spoofing algorithms and deploy the code as ROS nodes on the Landshark unmanned ground vehicle and on a Synthetic Car in a real-time simulator.
High-Performance I/O: HDF5 for Lattice QCD
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kurth, Thorsten; Pochinsky, Andrew; Sarje, Abhinav
2015-01-01
Practitioners of lattice QCD/QFT have been some of the primary pioneer users of the state-of-the-art high-performance-computing systems, and contribute towards the stress tests of such new machines as soon as they become available. As with all aspects of high-performance-computing, I/O is becoming an increasingly specialized component of these systems. In order to take advantage of the latest available high-performance I/O infrastructure, to ensure reliability and backwards compatibility of data files, and to help unify the data structures used in lattice codes, we have incorporated parallel HDF5 I/O into the SciDAC supported USQCD software stack. Here we present the design and implementation of this I/O framework. Our HDF5 implementation outperforms optimized QIO at the 10-20% level and leaves room for further improvement by utilizing appropriate dataset chunking.
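Part of the headroom the authors mention comes from choosing sensible HDF5 chunk shapes. The standalone sketch below (no HDF5 dependency; the real code uses parallel HDF5 through the USQCD stack, and the byte budget here is an arbitrary assumption) shows one simple way to pick a chunk shape for a 4D lattice field: halve the largest dimension until a chunk fits a target size.

```python
# Sketch: choose an HDF5-style chunk shape for a 4D lattice dataset by
# repeatedly halving the largest chunk dimension until one chunk fits a
# target byte budget.  Illustrative heuristic only, not the USQCD code.

def choose_chunk(shape, itemsize, target_bytes=1 << 20):
    """Halve the largest chunk dimension until the chunk fits the budget."""
    def nbytes(dims):
        n = itemsize
        for d in dims:
            n *= d
        return n

    chunk = list(shape)
    while nbytes(chunk) > target_bytes and max(chunk) > 1:
        i = chunk.index(max(chunk))
        chunk[i] = (chunk[i] + 1) // 2
    return tuple(chunk)

# A 64 x 32^3 lattice of double-precision complex numbers (16 bytes/site):
chunk = choose_chunk((64, 32, 32, 32), itemsize=16)
```

For this example the 32 MiB dataset ends up divided into 1 MiB chunks of 16⁴ sites, which would then be passed as the `chunks` argument when creating the dataset.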
High-Performance I/O: HDF5 for Lattice QCD
Kurth, Thorsten; Pochinsky, Andrew; Sarje, Abhinav; ...
2017-05-09
Practitioners of lattice QCD/QFT have been some of the primary pioneer users of the state-of-the-art high-performance-computing systems, and contribute towards the stress tests of such new machines as soon as they become available. As with all aspects of high-performance-computing, I/O is becoming an increasingly specialized component of these systems. In order to take advantage of the latest available high-performance I/O infrastructure, to ensure reliability and backwards compatibility of data files, and to help unify the data structures used in lattice codes, we have incorporated parallel HDF5 I/O into the SciDAC supported USQCD software stack. Here we present the design and implementation of this I/O framework. Our HDF5 implementation outperforms optimized QIO at the 10-20% level and leaves room for further improvement by utilizing appropriate dataset chunking.
On safari to Random Jungle: a fast implementation of Random Forests for high-dimensional data
Schwarz, Daniel F.; König, Inke R.; Ziegler, Andreas
2010-01-01
Motivation: Genome-wide association (GWA) studies have proven to be a successful approach for helping unravel the genetic basis of complex genetic diseases. However, the identified associations are not well suited for disease prediction, and only a modest portion of the heritability can be explained for most diseases, such as Type 2 diabetes or Crohn's disease. This may partly be due to the low power of standard statistical approaches to detect gene–gene and gene–environment interactions when small marginal effects are present. A promising alternative is Random Forests, which have already been successfully applied in candidate gene analyses. Important single nucleotide polymorphisms are detected by permutation importance measures. Until now, the application to GWA data has been highly cumbersome with existing implementations because of the high computational burden. Results: Here, we present the new freely available software package Random Jungle (RJ), which facilitates the rapid analysis of GWA data. The program yields valid results and computes up to 159 times faster than the fastest alternative implementation, while still maintaining all options of other programs. Specifically, it offers the various permutation importance measures available. It also includes new options such as the backward elimination method. We illustrate the application of RJ to a GWA of Crohn's disease. The most important single nucleotide polymorphisms (SNPs) validate recent findings in the literature and reveal potential interactions. Availability: The RJ software package is freely available at http://www.randomjungle.org Contact: inke.koenig@imbs.uni-luebeck.de; ziegler@imbs.uni-luebeck.de Supplementary information: Supplementary data are available at Bioinformatics online. PMID:20505004
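The permutation importance idea is simple to state: permute one feature's column, re-score the model, and report the accuracy drop. The sketch below shows that logic on a toy "model" (a function that reads only feature 0); it is the generic Random Forest measure in miniature, not Random Jungle's optimized implementation.

```python
# Sketch of permutation importance: the accuracy lost when one feature's
# values are shuffled across samples.  A feature the model ignores loses
# nothing; an informative feature loses a lot.  Toy illustration only.

import random

def accuracy(predict, X, y):
    return sum(predict(row) == yi for row, yi in zip(X, y)) / len(y)

def permutation_importance(predict, X, y, feature, seed=0):
    rng = random.Random(seed)
    base = accuracy(predict, X, y)
    col = [row[feature] for row in X]
    rng.shuffle(col)                     # break the feature-label link
    X_perm = [row[:feature] + [v] + row[feature + 1:]
              for row, v in zip(X, col)]
    return base - accuracy(predict, X_perm, y)

# Toy data: the label equals feature 0; feature 1 is pure noise.
rng = random.Random(1)
X = [[rng.randint(0, 1), rng.randint(0, 1)] for _ in range(200)]
y = [row[0] for row in X]

def predict(row):
    return row[0]                        # a "model" that uses feature 0 only

imp0 = permutation_importance(predict, X, y, 0)
imp1 = permutation_importance(predict, X, y, 1)
```

Shuffling the informative feature costs accuracy, while shuffling the noise feature costs exactly nothing, which is the contrast the importance measure exploits.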
Formal semantic specifications as implementation blueprints for real-time programming languages
NASA Technical Reports Server (NTRS)
Feyock, S.
1981-01-01
Formal definitions of language and system semantics provide highly desirable checks on the correctness of implementations of programming languages and their runtime support systems. If these definitions can give concrete guidance to the implementor, major increases in implementation accuracy and decreases in implementation effort can be achieved. It is shown that, of the wide variety of available methods, the Hgraph (hypergraph) definitional technique (Pratt, 1975) is best suited to serve as such an implementation blueprint. A discussion and example of the Hgraph technique are presented, as well as an overview of the growing body of implementation experience with real-time languages based on Hgraph semantic definitions.
ERIC Educational Resources Information Center
Bandy, Tawana; Burkhauser, Mary; Metz, Allison J. R.
2009-01-01
Although many program managers look to data to inform decision-making and manage their programs, high-quality program data may not always be available. Yet such data are necessary for effective program implementation. The use of high-quality data facilitates program management, reduces reliance on anecdotal information, and ensures that data are…
Regular Soda Policies, School Availability, and High School Student Consumption
Terry-McElrath, Yvonne M.; Chriqui, Jamie F.; O’Malley, Patrick M.; Chaloupka, Frank J.; Johnston, Lloyd D.
2014-01-01
Background Beginning in the 2014–2015 school year, all U.S. schools participating in federally reimbursable meal programs are required to implement new nutrition standards for items sold in competitive venues. Multilevel mediation modeling examining direct, mediated, and indirect pathways between policy, availability, and student consumption might provide insight into possible outcomes of implementing aspects of the new standards. Purpose To employ multilevel mediation modeling using state- and school district–level policies mandating school soda bans, school soda availability, and student soda consumption. Methods The 2010–2012 Monitoring the Future surveys obtained nationally representative data on high school student soda consumption; school administrators provided school soda availability data. State laws and district policies were compiled and coded. Analyses conducted in 2014 controlled for state-, school-, and student-level characteristics. Results State–district–school models found that state bans were associated with significantly lower school soda availability (c, p<0.05) but district bans showed no significant associations. No significant direct, mediated, or indirect associations between state policy and student consumption were observed for the overall sample. Among African American high school students, state policy was associated directly with significantly lower school soda availability (a, p<0.01), and—indirectly through lower school availability—with significantly lower soda consumption (a*b, p<0.05). Conclusions These analyses indicate state policy focused on regular soda strongly affected school soda availability, and worked through changes in school availability to decrease soda consumption among African American students, but not the overall population. PMID:25576493
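The a, b and a*b paths reported above follow the product-of-coefficients logic of mediation analysis. The sketch below shows that arithmetic in a deliberately simplified single-level form (binary policy indicator X, mediator M for availability, outcome Y for consumption, with made-up numbers); the paper's actual models are multilevel with covariates.

```python
# Product-of-coefficients mediation in miniature: the a path is the policy
# effect on the mediator; the b path is the pooled within-policy-group
# slope of the outcome on the mediator; a*b estimates the indirect effect.
# Single-level simplification of the paper's multilevel models.

def mean(v):
    return sum(v) / len(v)

def indirect_effect(X, M, Y):
    g0 = [i for i, x in enumerate(X) if x == 0]
    g1 = [i for i, x in enumerate(X) if x == 1]
    a = mean([M[i] for i in g1]) - mean([M[i] for i in g0])
    # Pooled within-group covariance over variance gives the b path
    # (slope of Y on M, holding the policy group constant).
    cov = var = 0.0
    for g in (g0, g1):
        m_bar = mean([M[i] for i in g])
        y_bar = mean([Y[i] for i in g])
        cov += sum((M[i] - m_bar) * (Y[i] - y_bar) for i in g)
        var += sum((M[i] - m_bar) ** 2 for i in g)
    b = cov / var
    return a, b, a * b

# Hypothetical numbers: a ban (X=1) lowers availability by 1 unit, and
# each availability unit adds 3 units of consumption.
X = [0, 0, 1, 1]
M = [2, 4, 1, 3]
Y = [6, 12, 3, 9]
a, b, ab = indirect_effect(X, M, Y)
```

Here a = -1 and b = 3, so the policy's indirect effect through availability is a*b = -3: the ban lowers consumption only via the mediator, matching the pathway structure the paper tests.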
Terry-McElrath, Yvonne M; Hood, Nancy E; Colabianchi, Natalie; O'Malley, Patrick M; Johnston, Lloyd D
2014-07-01
The 2013-2014 school year involved preparation for implementing the new US Department of Agriculture (USDA) competitive foods nutrition standards. An awareness of associations between commercial supplier involvement, food vending practices, and food vending item availability may assist schools in preparing for the new standards. Analyses used 2007-2012 questionnaire data from administrators of 814 middle and 801 high schools in the nationally representative Youth, Education, and Society study to examine prevalence of profit from and commercial involvement with vending machine food sales, and associations between such measures and food availability. Profits for the school district were associated with decreased low-nutrient, energy-dense (LNED) food availability and increased fruit/vegetable availability. Profits for the school and use of company suppliers were associated with increased LNED availability; company suppliers also were associated with decreased fruit/vegetable availability. Supplier "say" in vending food selection was associated with increased LNED availability and decreased fruit/vegetable availability. Results support (1) increased district involvement with school vending policies and practices, and (2) limited supplier "say" as to what items are made available in student-accessed vending machines. Schools and districts should pay close attention to which food items replace vending machine LNED foods following implementation of the new nutrition standards. © 2014, American School Health Association.
van Kampen, Sanne C.; Oskam, Linda; Tuijn, Coosje J.; Klatser, Paul R.
2012-01-01
Background Successful integration of new diagnostics in national tuberculosis (TB) control programs, also called ‘retooling’, is highly dependent on operational aspects related to test availability, accessibility and affordability. This survey aimed to find out whether recommendations to use new diagnostics lead to successful retooling in high TB endemic countries, using immunochromatographic tests (ICTs) for TB culture speciation as a case study. ICTs are recommended to accurately confirm the presence of bacteria of the Mycobacterium tuberculosis complex in liquid culture isolates. Methods and Findings Questionnaires were sent to national TB reference laboratories (NRLs) in 42 high TB endemic countries to address their access to information on ICT implementation, logistics related to availability, accessibility and affordability of ICTs, and testing algorithms. Results from 16 responding countries indicated that half of the NRLs were aware of the contents of WHO guidance documents on liquid culture and ICT implementation, as well as their eligibility for a negotiated pricing agreement for ICT procurement. No major issues with availability and accessibility of ICTs were raised. When asked about testing algorithms, ICTs were not used as stand-alone or first test for TB culture identification as recommended by WHO. Conclusions The low response rate was a limitation of this survey and together with NRLs managers' unawareness of global guidance, suggests a lack of effective communication between partners of the global laboratory network and NRLs. TB tests could become more affordable to high TB endemic countries, if the possibility to negotiate lower prices for commercial products is communicated to them more successfully. NRLs need additional guidance to identify where available technologies can be most usefully implemented and in what order, taking into account long-term laboratory strategies. PMID:22937050
The International Space Station human life sciences experiment implementation process
NASA Technical Reports Server (NTRS)
Miller, L. J.; Haven, C. P.; McCollum, S. G.; Lee, A. M.; Kamman, M. R.; Baumann, D. K.; Anderson, M. E.; Buderer, M. C.
2001-01-01
The selection, definition, and development phases of a Life Sciences flight research experiment have been consistent throughout the past decade. The implementation process, however, has changed significantly within the past two years. This change is driven primarily by the shift from highly integrated, dedicated research missions on platforms with well-defined processes to self-contained experiments with stand-alone operations on platforms which are being concurrently designed. For experiments manifested on the International Space Station (ISS) and/or on short-duration missions, the more modular, streamlined, and independent the individual experiment is, the more likely it is to be successfully implemented before the ISS assembly is completed. During the assembly phase of the ISS, science operations are lower in priority than the construction of the station. After the station has been completed, it is expected that more resources will be available to perform research. The complexity of implementing investigations increases with the logistics needed to perform the experiment. Examples of logistics issues include: hardware unique to the experiment; large up and down mass and volume needs; access to crew and hardware during the ascent or descent phases; maintenance of hardware and supplies with a limited shelf life; baseline data collection schedules with lengthy sessions or sessions close to the launch or landing; onboard stowage availability, particularly cold stowage; and extensive training where highly proficient skills must be maintained. As the ISS processes become better defined, experiment implementation will meet new challenges due to distributed management, on-orbit resource sharing, and adjustments to crew availability pre- and post-increment. © 2001 Elsevier Science Ltd. All rights reserved.
Framework for Flexible Security in Group Communications
NASA Technical Reports Server (NTRS)
McDaniel, Patrick; Prakash, Atul
2006-01-01
The Antigone software system defines a framework for the flexible definition and implementation of security policies in group communication systems. Antigone does not dictate the available security policies, but provides high-level mechanisms for implementing them. A central element of the Antigone architecture is a suite of such mechanisms comprising micro-protocols that provide the basic services needed by secure groups.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jacobsen, M. K., E-mail: mjacobsen@lanl.gov; Velisavljevic, N.
2015-11-15
Recent technical developments using the large-volume Paris-Edinburgh press platform have enabled x-ray synchrotron studies at high pressure and temperature conditions. However, its application to some materials of interest, such as high-hazard materials that require special handling due to safety issues, reactivity, or other challenges, has not been feasible without the introduction of special containment systems to eliminate the hazards. Introducing such a containment system is challenging because it must provide full safety containment for operation in the variety of environments available while not hindering any of the experimental probes that are available for inert sample measurement. In this work, we report on the development and implementation of a full safety enclosure for a Paris-Edinburgh type press. During the initial development and subsequent application stages of this work, experiments were performed on both cerium dioxide (CeO2) and uranium (U). This device allows for full implementation of all currently available experimental probes involving the Paris-Edinburgh press at the High Pressure Collaborative Access Team sector of the Advanced Photon Source.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-11
... Federal Implementation Plan for Implementing Best Available Retrofit Technology for Four Corners Power... Implementation Plan (FIP) to implement the Best Available Retrofit Technology (BART) requirement of the Regional... given the uncertainties in the electrical market in Arizona, EPA is proposing to extend the date by...
Implementation of a Smeared Crack Band Model in a Micromechanics Framework
NASA Technical Reports Server (NTRS)
Pineda, Evan J.; Bednarcyk, Brett A.; Waas, Anthony M.; Arnold, Steven M.
2012-01-01
The smeared crack band theory is implemented within the generalized method of cells and high-fidelity generalized method of cells micromechanics models to capture progressive failure within the constituents of a composite material while retaining objectivity with respect to the size of the discretization elements used in the model. A repeating unit cell containing 13 randomly arranged fibers is modeled and subjected to a combination of transverse tension/compression and transverse shear loading. The implementation is verified against experimental data (where available) and against an equivalent finite element model utilizing the same implementation of the crack band theory. To evaluate the performance of the crack band theory within a repeating unit cell that is more amenable to a multiscale implementation, a single fiber is modeled with the generalized method of cells and high-fidelity generalized method of cells using a relatively coarse subcell mesh, subjected to the same loading scenarios as the multiple-fiber repeating unit cell. The generalized method of cells and high-fidelity generalized method of cells models are validated against a very refined finite element model.
Low-cost and high-speed optical mark reader based on an intelligent line camera
NASA Astrophysics Data System (ADS)
Hussmann, Stephan; Chan, Leona; Fung, Celine; Albrecht, Martin
2003-08-01
Optical Mark Recognition (OMR) is thoroughly reliable and highly efficient provided that high standards are maintained at both the planning and implementation stages. It is necessary to ensure that OMR forms are designed with due attention to data integrity checks, that the best use is made of features built into the OMR reader, and that data integrity is checked and the data validated before they are processed. This paper describes the design and implementation of an OMR prototype system for marking multiple-choice tests automatically. Parameter testing was carried out before the platform and the multiple-choice answer sheet were designed. Position recognition and position verification methods have been developed and implemented in an intelligent line scan camera. The position recognition process is implemented in a Field Programmable Gate Array (FPGA), whereas the verification process is implemented in a micro-controller. The verified results are then sent to the Graphical User Interface (GUI) for answer checking and statistical analysis. At the end of the paper the proposed OMR system is compared with commercially available systems on the market.
imDEV: a graphical user interface to R multivariate analysis tools in Microsoft Excel
Grapov, Dmitry; Newman, John W.
2012-01-01
Summary: Interactive modules for Data Exploration and Visualization (imDEV) is a Microsoft Excel spreadsheet-embedded application providing an integrated environment for the analysis of omics data through a user-friendly interface. Individual modules enable interactive and dynamic analyses of large data sets by interfacing R's multivariate statistics and highly customizable visualizations with the spreadsheet environment, aiding robust inferences and generating information-rich data visualizations. This tool provides access to multiple comparisons with false discovery correction, hierarchical clustering, principal and independent component analyses, partial least squares regression and discriminant analysis, through an intuitive interface for creating high-quality two- and three-dimensional visualizations including scatter plot matrices, distribution plots, dendrograms, heat maps, biplots, trellis biplots and correlation networks. Availability and implementation: Freely available for download at http://sourceforge.net/projects/imdev/. Implemented in R and VBA and supported by Microsoft Excel (2003, 2007 and 2010). Contact: John.Newman@ars.usda.gov Supplementary Information: Installation instructions, tutorials and users manual are available at http://sourceforge.net/projects/imdev/. PMID:22815358
Hart, Reece K.; Rico, Rudolph; Hare, Emily; Garcia, John; Westbrook, Jody; Fusaro, Vincent A.
2015-01-01
Summary: Biological sequence variants are commonly represented in scientific literature, clinical reports and databases of variation using the mutation nomenclature guidelines endorsed by the Human Genome Variation Society (HGVS). Despite the widespread use of the standard, no comprehensive, freely available programming libraries exist. Here we report an open-source and easy-to-use Python library that facilitates the parsing, manipulation, formatting and validation of variants according to the HGVS specification. The current implementation focuses on the subset of the HGVS recommendations that precisely describe sequence-level variation relevant to the application of high-throughput sequencing to clinical diagnostics. Availability and implementation: The package is released under the Apache 2.0 open-source license. Source code, documentation and issue tracking are available at http://bitbucket.org/hgvs/hgvs/. Python packages are available at PyPI (https://pypi.python.org/pypi/hgvs). Contact: reecehart@gmail.com Supplementary information: Supplementary data are available at Bioinformatics online. PMID:25273102
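To show the kind of structure such a parser extracts, here is a deliberately tiny regex for one HGVS form only (genomic substitutions like `NC_000017.10:g.7577121G>A`). This is a toy covering a sliver of the specification; the hgvs library parses far more variant types and validates against reference sequences.

```python
# Toy parser for HGVS genomic substitutions (accession:g.posREF>ALT).
# Illustrates the structure only; not the hgvs package's grammar.

import re

HGVS_SUB = re.compile(
    r"^(?P<ac>[A-Z]+_\d+\.\d+)"          # versioned accession
    r":(?P<type>[gcmnrp])\."             # coordinate type
    r"(?P<pos>\d+)"                      # 1-based position
    r"(?P<ref>[ACGT])>(?P<alt>[ACGT])$"  # substituted bases
)

def parse_sub(s):
    m = HGVS_SUB.match(s)
    if not m:
        raise ValueError(f"not a supported HGVS substitution: {s}")
    d = m.groupdict()
    d["pos"] = int(d["pos"])
    return d

v = parse_sub("NC_000017.10:g.7577121G>A")
```

A full implementation would also need insertions, deletions, duplications, intronic offsets and protein-level syntax, which is exactly the complexity that motivates a shared library.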
An Analysis of School-to-Work Implementation in Selected Charter Schools. Research Report.
ERIC Educational Resources Information Center
Goodman, Gregory
Three charter schools in southern Arizona--Pimeria Alta High School, Vail Charter High School, and VISION High School--were profiled to ascertain the role of school-to-work (STW) in charter schools. The profiles focused on the following: students' and parents' characteristics and reasons for selecting a charter school; and available facilities,…
High-throughput genotyping of hop (Humulus lupulus L.) utilising diversity arrays technology (DArT)
USDA-ARS?s Scientific Manuscript database
Implementation of molecular methods in hop breeding is dependent on the availability of sizeable numbers of polymorphic markers and a comprehensive understanding of genetic variation. Diversity Arrays Technology (DArT) is a high-throughput cost-effective method for the discovery of large numbers of...
Real-Time Reed-Solomon Decoder
NASA Technical Reports Server (NTRS)
Maki, Gary K.; Cameron, Kelly B.; Owsley, Patrick A.
1994-01-01
Generic Reed-Solomon decoder fast enough to correct errors in real time in practical applications, designed to be implemented in fewer and smaller very-large-scale integrated (VLSI) circuit chips. Configured to operate in pipelined manner. One outstanding aspect of decoder design is that Euclid multiplier and divider modules contain Galois-field multipliers configured as combinational-logic cells, operating at speeds greater than those of older multipliers. Cellular configuration is highly regular and requires little interconnection area, making it ideal for implementation in extraordinarily dense VLSI circuitry. Flight-electronics single-chip version of this technology has been implemented and is available.
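The Galois-field product those combinational cells compute can be sketched in software as a shift-and-conditionally-reduce loop. The polynomial below (x⁸+x⁴+x³+x²+1, 0x11D, common in Reed-Solomon codecs) is an assumption; the abstract does not name the decoder's field polynomial.

```python
# GF(2^8) multiplication: carryless shift-and-add, reducing modulo the
# field polynomial whenever a degree-8 term appears.  The hardware cells
# do this combinationally; the polynomial 0x11D is an assumed example.

def gf256_mul(a, b, poly=0x11D):
    """Multiply two GF(2^8) elements (0..255) modulo poly."""
    result = 0
    while b:
        if b & 1:
            result ^= a        # add (XOR) the current shifted multiplicand
        a <<= 1
        if a & 0x100:          # degree-8 term appeared: reduce
            a ^= poly
        b >>= 1
    return result

# x * x^7 wraps past degree 8 and reduces to x^4 + x^3 + x^2 + 1 = 0x1D:
p = gf256_mul(0x02, 0x80)
```

Because every nonzero element of GF(2⁸) satisfies v²⁵⁵ = 1, repeatedly multiplying by any nonzero constant returns to the starting element after 255 steps, a handy sanity check on the reduction logic.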
HiTC: exploration of high-throughput ‘C’ experiments
Servant, Nicolas; Lajoie, Bryan R.; Nora, Elphège P.; Giorgetti, Luca; Chen, Chong-Jian; Heard, Edith; Dekker, Job; Barillot, Emmanuel
2012-01-01
Summary: The R/Bioconductor package HiTC facilitates the exploration of high-throughput 3C-based data. It allows users to import and export ‘C’ data, to transform, normalize, annotate and visualize interaction maps. The package operates within the Bioconductor framework and thus offers new opportunities for future development in this field. Availability and implementation: The R package HiTC is available from the Bioconductor website. A detailed vignette provides additional documentation and help for using the package. Contact: nicolas.servant@curie.fr Supplementary information: Supplementary data are available at Bioinformatics online. PMID:22923296
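Among the operations listed above, normalization of an interaction map is easy to illustrate. The sketch below shows the simplest flavour, iterative row/column balancing in the spirit of matrix-balancing methods for 'C' data; it is a toy loop with a made-up contact map, not HiTC's code, which offers several published normalization methods.

```python
# Toy interaction-map normalization: alternately divide rows and columns
# by their means so sequencing-coverage biases even out (Sinkhorn-style
# balancing).  Illustration only, not a HiTC method.

def balance(matrix, rounds=200):
    """Iteratively normalize row and column means to 1."""
    m = [row[:] for row in matrix]
    n = len(m)
    for _ in range(rounds):
        for i in range(n):                             # rows
            s = sum(m[i]) / n
            if s:
                m[i] = [v / s for v in m[i]]
        for j in range(n):                             # columns
            s = sum(m[i][j] for i in range(n)) / n
            if s:
                for i in range(n):
                    m[i][j] /= s
    return m

# A symmetric toy contact map with uneven coverage:
raw = [[4.0, 2.0, 1.0],
       [2.0, 8.0, 2.0],
       [1.0, 2.0, 2.0]]
norm = balance(raw)
row_sums = [round(sum(row), 6) for row in norm]
```

After balancing, every row (and column) carries the same total signal, so remaining contrasts reflect interaction structure rather than coverage.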
Varsi, Cecilie; Ekstedt, Mirjam; Gammon, Deede
2015-01-01
Background Although there is growing evidence of the positive effects of Internet-based patient-provider communication (IPPC) services for both patients and health care providers, their implementation into clinical practice continues to be a challenge. Objective The 3 aims of this study were to (1) identify and compare barriers and facilitators influencing the implementation of an IPPC service in 5 hospital units using the Consolidated Framework for Implementation Research (CFIR), (2) assess the ability of the different constructs of CFIR to distinguish between high and low implementation success, and (3) compare our findings with those from other studies that used the CFIR to discriminate between high and low implementation success. Methods This study was based on individual interviews with 10 nurses, 6 physicians, and 1 nutritionist who had used the IPPC to answer messages from patients. Results Of the 36 CFIR constructs, 28 were addressed in the interviews, of which 12 distinguished between high and low implementation units. Most of the distinguishing constructs were related to the inner setting domain of CFIR, indicating that institutional factors were particularly important for successful implementation. Health care providers’ beliefs in the intervention as useful for themselves and their patients as well as the implementation process itself were also important. A comparison of constructs across ours and 2 other studies that also used the CFIR to discriminate between high and low implementation success showed that 24 CFIR constructs distinguished between high and low implementation units in at least 1 study; 11 constructs distinguished in 2 studies. However, only 2 constructs (patient need and resources and available resources) distinguished consistently between high and low implementation units in all 3 studies. Conclusions The CFIR is a helpful framework for illuminating barriers and facilitators influencing IPPC implementation. 
However, CFIR's strength of being broad and comprehensive also limits its usefulness as an implementation framework, because it does not discriminate between the relative importance of its many constructs for implementation success. This is the first study to identify which CFIR constructs are the most promising for distinguishing between high and low implementation success across settings and interventions. Findings from this study can contribute to the refinement of CFIR toward a more succinct and parsimonious framework for planning and evaluating the implementation of clinical interventions. Trial Registration: ClinicalTrials.gov NCT00971139; http://clinicaltrial.gov/ct2/show/NCT00971139 (Archived by WebCite at http://www.webcitation.org/6cWeqN1uY) PMID:26582138
The integration of technology into the middle and high school science curriculum
NASA Astrophysics Data System (ADS)
Corbin, Jan Frederic
This study sought to determine the level of technology implementation in the middle and high school science curriculum by beginning teachers. Research was conducted in two phases. The first phase was a survey that provided demographic data and determined the Level of Technology Implementation, Personal Computer Use, and Current Instructional Practice. Dr. Christopher Moersch developed the survey instrument, the Level of Technology Implementation (LoTi). The data provided insight into what technology teachers use, barriers associated with technology integration, teacher training and development, and technical support. Follow-up interviews were conducted to gather additional qualitative data. Analysis of the data found that beginning teachers have not received enough technology training to integrate technology seamlessly into the science curriculum. Conclusions cite the need for more technology courses during preservice education, more time during the day for beginning teachers to learn to use the technology available at their schools, consolidation of inservice staff development offerings, and more readily available technical support staff. Recommendations were made to expand the study group to all science teachers, assess the technology capacity of all schools, and conduct a needs assessment of inservice staff development.
Jiang, Minghuan; Zhou, Zhongliang; Wu, Lina; Shen, Qian; Lv, Bing; Wang, Xiao; Yang, Shimin; Fang, Yu
2015-02-01
In 2009, China implemented the National Essential Medicines System (NEMS) to improve access to high-quality, low-cost essential medicines. To measure the prices, availability and affordability of medicines in China following the implementation of the NEMS. 120 public hospitals and 120 private pharmacies in ten cities in Shaanxi Province, Western China. The standardized methodology developed by the World Health Organization and Health Action International was used to collect data on the prices and availability of 49 medicines. Median price ratio; availability as a percentage; cost of a course of treatment in days' wages of the lowest-paid government workers. In the public hospitals, originator brands (OBs) were procured at 8.89 times the international reference price, more than seven times higher than the lowest-priced generics (LPGs). Patients paid 11.83 and 1.69 times the international reference prices for OBs and generics, respectively. A similar result was observed in the private pharmacies. The mean availabilities of OBs and LPGs were 7.1% and 20.0% in the public hospitals, and 12.6% and 29.2% in the private pharmacies. Treatment with OBs is therefore largely unaffordable, but the affordability of the LPGs is generally good. High prices and low availability of the surveyed medicines were observed. The affordability of generics, but not OBs, is reasonable. Effective measures should be taken to reduce medicine prices and improve availability and affordability in Shaanxi Province.
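The two indicators reported above, the median price ratio and affordability in days' wages, follow the WHO/Health Action International methodology and are simple to compute. A minimal sketch, with entirely hypothetical prices and wages (not survey data from the study):

```python
# Sketch of the WHO/HAI price indicators used in the abstract above.
# All numbers are hypothetical illustrations, not survey results.
from statistics import median

def median_price_ratio(local_unit_prices, intl_reference_price):
    """MPR = median local unit price / international reference unit price."""
    return median(local_unit_prices) / intl_reference_price

def affordability_in_days_wages(unit_price, units_per_course, daily_wage):
    """Days the lowest-paid government worker must work to buy one course."""
    return (unit_price * units_per_course) / daily_wage

# Hypothetical generic: four outlet prices, 7-day course of 21 units
mpr = median_price_ratio([0.34, 0.40, 0.38, 0.45], intl_reference_price=0.20)
days = affordability_in_days_wages(unit_price=0.40, units_per_course=21, daily_wage=8.0)
print(round(mpr, 2), round(days, 2))  # 1.95 1.05
```

An MPR of 1.0 means the medicine sells at exactly the international reference price; a course costing less than one day's wage is conventionally considered affordable.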
2008-02-01
combined thermal effect and initial current field. The model is implemented using an ABAQUS user element subroutine and verified against the experimental... Finite Element Formulation: the proposed model is implemented with ABAQUS, a general-purpose finite element program, using the thermal-displacement analysis option. ABAQUS and other commercially available finite element codes do not have the capability to solve the general electromigration problem directly.
Implementation of Virtualization Oriented Architecture: A Healthcare Industry Case Study
NASA Astrophysics Data System (ADS)
Rao, G. Subrahmanya Vrk; Parthasarathi, Jinka; Karthik, Sundararaman; Rao, Gvn Appa; Ganesan, Suresh
This paper presents a Virtualization Oriented Architecture (VOA) and an implementation of VOA for Hridaya, a telemedicine initiative. A Hadoop compute cloud was established at our labs, and jobs requiring massive computing capability, such as ECG signal analysis, were submitted to it; the study is presented in this paper. VOA takes advantage of inexpensive community PCs and provides added advantages such as fault tolerance, scalability, performance, and high availability.
Task Shifting in Dermatology: A Call to Action.
Brown, Danielle N; Langan, Sinéad M; Freeman, Esther E
2017-11-01
Can task shifting be used to improve the delivery of dermatologic care in resource-poor settings worldwide? Task shifting is a means of redistributing available resources, whereby highly trained individuals train an available workforce to provide necessary care in low-resource settings. Limited evidence exists for task shifting in dermatology; however, studies from psychiatry demonstrate its efficacy. In the field of dermatology there is a need for high-quality evidence including randomized clinical trials to validate the implementation of task shifting in low-resource settings globally.
Byrska-Bishop, Marta; Wallace, John; Frase, Alexander T; Ritchie, Marylyn D
2018-01-01
Abstract Motivation BioBin is an automated bioinformatics tool for the multi-level biological binning of sequence variants. Herein, we present a significant update to BioBin which expands the software to facilitate a comprehensive rare variant analysis and incorporates novel features and analysis enhancements. Results In BioBin 2.3, we extend our software tool by implementing statistical association testing, updating the binning algorithm, as well as incorporating novel analysis features providing for a robust, highly customizable, and unified rare variant analysis tool. Availability and implementation The BioBin software package is open source and freely available to users at http://www.ritchielab.com/software/biobin-download Contact mdritchie@geisinger.edu Supplementary information Supplementary data are available at Bioinformatics online. PMID:28968757
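The "binning" idea in the abstract above, collapsing rare variants into region-level units such as genes so they can be tested jointly, can be illustrated in a few lines. This is an illustration of the general concept only, with made-up coordinates, not BioBin's actual algorithm:

```python
# Illustrative sketch of variant binning: rare variants are assigned to
# region-level bins (e.g. genes), and each sample gets a per-bin burden
# count usable by downstream association tests. Regions/positions are made up.
GENES = {"GENE_A": (100, 200), "GENE_B": (300, 450)}  # gene -> (start, end)

def bin_variants(variant_positions):
    """Assign each variant position to the gene bin containing it."""
    bins = {g: [] for g in GENES}
    for pos in variant_positions:
        for gene, (start, end) in GENES.items():
            if start <= pos <= end:
                bins[gene].append(pos)
    return bins

def burden(bins, genotypes):
    """Per-bin burden for one sample: sum of alternate-allele counts."""
    return {g: sum(genotypes.get(p, 0) for p in positions)
            for g, positions in bins.items()}

bins = bin_variants([120, 150, 410])
print(burden(bins, {120: 1, 150: 2, 410: 1}))  # {'GENE_A': 3, 'GENE_B': 1}
```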
A Multi-Level Parallelization Concept for High-Fidelity Multi-Block Solvers
NASA Technical Reports Server (NTRS)
Hatay, Ferhat F.; Jespersen, Dennis C.; Guruswamy, Guru P.; Rizk, Yehia M.; Byun, Chansup; Gee, Ken; VanDalsem, William R. (Technical Monitor)
1997-01-01
The integration of high-fidelity Computational Fluid Dynamics (CFD) analysis tools into the industrial design process benefits greatly from robust implementations that are transportable across a wide range of computer architectures. In the present work, a hybrid domain-decomposition and parallelization concept was developed and implemented into the widely used NASA multi-block CFD packages ENSAERO and OVERFLOW. The new parallel solver concept, PENS (Parallel Euler Navier-Stokes Solver), employs both fine and coarse granularity in data partitioning, as well as data coalescing, to obtain the desired load-balance characteristics on the available computer platforms. This multi-level parallelism implementation introduces no changes to the numerical results, hence the original fidelity of the packages is identically preserved. The present implementation uses the Message Passing Interface (MPI) library for interprocessor message passing and memory access. By choosing an appropriate combination of the available partitioning and coalescing capabilities at execution time, the PENS solver becomes adaptable to different computer architectures, from shared-memory to distributed-memory platforms with varying degrees of parallelism. The PENS implementation on the IBM SP2 distributed-memory environment at the NASA Ames Research Center obtains 85 percent scalable parallel performance using fine-grain partitioning of single-block CFD domains on up to 128 wide computational nodes. Multi-block CFD simulations of complete aircraft achieve 75 percent of perfectly load-balanced execution using data coalescing and the two levels of parallelism. The SGI PowerChallenge, SGI Origin 2000, and a cluster of workstations are the other platforms on which the robustness of the implementation is tested.
The performance behavior on these platforms with a variety of realistic problems will be reported as this ongoing study progresses.
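Figures such as the "85 percent scalable parallel performance" quoted above are conventionally parallel efficiency, speedup divided by processor count. A minimal sketch with hypothetical timings (not measurements from the paper):

```python
# Parallel speedup and efficiency, the metrics behind scalability figures
# like "85 percent on 128 nodes". Timings below are hypothetical.
def speedup(t_serial, t_parallel):
    """How many times faster the parallel run is than the serial run."""
    return t_serial / t_parallel

def efficiency(t_serial, t_parallel, n_procs):
    """Speedup per processor: 1.0 would be perfect scaling."""
    return speedup(t_serial, t_parallel) / n_procs

# Hypothetical: 1088 s serially, 10 s on 128 nodes
s = speedup(1088.0, 10.0)           # 108.8x
e = efficiency(1088.0, 10.0, 128)   # 0.85, i.e. 85 percent
print(s, e)
```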
Implementation and use of a highly available and innovative IaaS solution: the Cloud Area Padovana
NASA Astrophysics Data System (ADS)
Aiftimiei, C.; Andreetto, P.; Bertocco, S.; Biasotto, M.; Dal Pra, S.; Costa, F.; Crescente, A.; Dorigo, A.; Fantinel, S.; Fanzago, F.; Frizziero, E.; Gulmini, M.; Michelotto, M.; Sgaravatto, M.; Traldi, S.; Venaruzzo, M.; Verlato, M.; Zangrando, L.
2015-12-01
While in the business world the cloud paradigm is typically implemented by purchasing resources and services from third-party providers (e.g. Amazon), in the scientific environment there is usually a need for on-premises IaaS infrastructures which allow efficient usage of the hardware distributed among (and owned by) different scientific administrative domains. In addition, the requirement of open source adoption has led to the choice of products like OpenStack by many organizations. We describe a use case of the Italian National Institute for Nuclear Physics (INFN) which resulted in the implementation of a unique cloud service, called ’Cloud Area Padovana’, which encompasses resources spread over two different sites: the INFN Legnaro National Laboratories and the INFN Padova division. We describe how this IaaS has been implemented, which technologies have been adopted, and how services have been configured in high-availability (HA) mode. We also discuss how identity and authorization management were implemented, adopting a widely accepted standard architecture based on SAML2 and OpenID: by leveraging the versatility of those standards, integration with authentication federations such as IDEM was implemented. We also discuss some other innovative developments, such as a pluggable scheduler, implemented as an extension of the native OpenStack scheduler, which allows the allocation of resources according to a fair-share model and which provides a persistent queuing mechanism for handling user requests that cannot be immediately served. The tools, technologies, and procedures used to install, configure, monitor, and operate this cloud service are also discussed. Finally, we present some examples that show how this IaaS infrastructure is being used.
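The fair-share scheduling with a queue for requests that cannot be served immediately, described above, can be sketched as a priority queue where each project's priority degrades as it consumes resources relative to its share. This is an illustration of the general idea only; the class and policy below are hypothetical, not the actual OpenStack scheduler extension:

```python
# Minimal sketch of a fair-share request queue of the kind described above.
# Illustration only: names and the exact policy are assumptions, not the
# INFN/OpenStack implementation.
import heapq
from collections import defaultdict

class FairShareQueue:
    def __init__(self, shares):
        self.shares = shares             # project -> fair share, e.g. {"cms": 0.6}
        self.usage = defaultdict(float)  # project -> accumulated resource usage
        self._heap = []
        self._seq = 0                    # tie-breaker keeps FIFO within a priority

    def submit(self, project, request):
        # Lower value = served sooner: usage relative to the project's share.
        prio = self.usage[project] / self.shares[project]
        heapq.heappush(self._heap, (prio, self._seq, project, request))
        self._seq += 1

    def pop(self, cost=1.0):
        """Serve the highest-priority request and charge its project."""
        prio, _, project, request = heapq.heappop(self._heap)
        self.usage[project] += cost
        return project, request

q = FairShareQueue({"alice": 0.75, "bob": 0.25})
q.submit("alice", "a1")
q.submit("alice", "a2")
q.submit("bob", "b1")
print(q.pop())  # ('alice', 'a1') -- all priorities start at 0, FIFO tie-break
```

As "alice" accumulates usage, later submissions from "bob" (with a smaller share but no usage) move ahead in the queue, which is the fair-share behavior.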
Jacobsen, M. K.; Velisavljevic, N.
2015-11-20
Recent technical developments using the large-volume Paris-Edinburgh press platform have enabled x-ray synchrotron studies at high pressure and temperature conditions. However, its application to some materials of interest, such as high-hazard materials that require special handling due to safety issues, reactivity, or other challenges, has not been feasible without the introduction of special containment systems to eliminate the hazards. Introducing a containment system is challenging because it must provide full safety containment for operation in the variety of environments available while not hindering any of the experimental probes that are available for inert sample measurement. In this work, we report on the development and implementation of a full safety enclosure for a Paris-Edinburgh type press. During the initial development and subsequent application stage of work, experiments were performed on both cerium dioxide (CeO2) and uranium (U). As a result, this device allows for full use of all currently available experimental probes involving the Paris-Edinburgh press at the High Pressure Collaborative Access Team sector of the Advanced Photon Source.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ning, S. A.; Hayman, G.; Damiani, R.
Blade element momentum methods, though conceptually simple, are highly useful for analyzing wind turbine aerodynamics and are widely used in many design and analysis applications. A new version of AeroDyn is being developed to take advantage of new robust solution methodologies, conform to a new modularization framework for the National Renewable Energy Laboratory's FAST, utilize advanced skewed-wake analysis methods, fix limitations of previous implementations, and enable modeling of highly flexible and nonstraight blades. This paper reviews blade element momentum theory and several of the options available for analyzing skewed inflow. AeroDyn implementation details are described for the benefit of users and developers. These new options are compared to solutions from the previous version of AeroDyn and to experimental data. Finally, recommendations are given on how one might select from the various available solution approaches.
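The core of classical blade element momentum theory reviewed above is a fixed-point iteration per blade annulus: guess the axial and tangential induction factors, compute the inflow angle, look up airfoil coefficients, and update the induction factors from the momentum balance. A heavily simplified sketch under stated assumptions (thin-airfoil lift Cl = 2πα, zero drag, no tip-loss or skewed-wake corrections, all numbers illustrative), not AeroDyn's implementation:

```python
# Minimal BEM fixed-point iteration for one blade annulus, under simplifying
# assumptions (Cl = 2*pi*alpha, Cd = 0, no tip loss). Illustration only.
from math import atan2, sin, cos, pi

def bem_annulus(tsr_local, sigma, twist, n_iter=200):
    """Solve for axial (a) and tangential (ap) induction at one annulus.
    tsr_local: local speed ratio omega*r/V; sigma: local solidity;
    twist: blade twist plus pitch [rad]."""
    a, ap = 0.0, 0.0
    for _ in range(n_iter):
        phi = atan2(1.0 - a, (1.0 + ap) * tsr_local)  # inflow angle
        alpha = phi - twist                            # angle of attack
        cl, cd = 2.0 * pi * alpha, 0.0                 # assumed airfoil model
        cn = cl * cos(phi) + cd * sin(phi)             # normal force coeff.
        ct = cl * sin(phi) - cd * cos(phi)             # tangential force coeff.
        a_new = 1.0 / (4.0 * sin(phi) ** 2 / (sigma * cn) + 1.0)
        ap_new = 1.0 / (4.0 * sin(phi) * cos(phi) / (sigma * ct) - 1.0)
        a = 0.5 * a + 0.5 * a_new      # under-relaxation for stability
        ap = 0.5 * ap + 0.5 * ap_new
    return a, ap

a, ap = bem_annulus(tsr_local=6.0, sigma=0.05, twist=0.05)
print(round(a, 3), round(ap, 4))
```

The "robust solution methodologies" mentioned in the abstract address exactly the weakness visible here: naive fixed-point iteration can fail to converge near the turbulent-wake state, which newer formulations solve as a root-finding problem in the inflow angle instead.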
NGSANE: a lightweight production informatics framework for high-throughput data analysis.
Buske, Fabian A; French, Hugh J; Smith, Martin A; Clark, Susan J; Bauer, Denis C
2014-05-15
The initial steps in the analysis of next-generation sequencing data can be automated by way of software 'pipelines'. However, individual components depreciate rapidly because of evolving technology and analysis methods, often rendering entire versions of production informatics pipelines obsolete. Constructing pipelines from Linux bash commands enables the use of hot-swappable modular components, as opposed to the more rigid program-call wrapping by higher-level languages implemented in comparable published pipelining systems. Here we present Next Generation Sequencing ANalysis for Enterprises (NGSANE), a Linux-based, high-performance-computing-enabled framework that minimizes the overhead of setting up and processing new projects, yet maintains full flexibility of custom scripting when processing raw sequence data. NGSANE is implemented in bash and publicly available under the BSD (3-Clause) licence via GitHub at https://github.com/BauerLab/ngsane. Contact: Denis.Bauer@csiro.au Supplementary data are available at Bioinformatics online.
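The "hot-swappable modular components" idea above is that the pipeline driver knows only stage names, so any stage can be replaced without touching the rest. NGSANE realizes this with bash scripts per module; the sketch below illustrates the same design in Python with made-up stage names and toy stand-in stages:

```python
# Sketch of a hot-swappable pipeline: the driver resolves stages by name,
# so a module can be replaced without changing the driver. Stage names and
# bodies are illustrative stand-ins, not NGSANE modules.
def trim_reads(reads):
    return [r[:50] for r in reads]            # stand-in for a trimming tool

def align_reads(reads):
    return [(r, "chr1:%d" % i) for i, r in enumerate(reads)]  # stand-in aligner

MODULES = {"trim": trim_reads, "align": align_reads}

def run_pipeline(stage_names, data):
    for name in stage_names:
        data = MODULES[name](data)            # swap a stage by editing MODULES
    return data

print(run_pipeline(["trim", "align"], ["ACGT" * 20]))
```

Swapping in a different aligner means registering a new callable under "align"; the driver and all other stages are untouched, which is what keeps individual component churn from obsoleting the whole pipeline.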
Scalable Unix commands for parallel processors : a high-performance implementation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ong, E.; Lusk, E.; Gropp, W.
2001-06-22
We describe a family of MPI applications we call the Parallel Unix Commands. These commands are natural parallel versions of common Unix user commands such as ls, ps, and find, together with a few similar commands particular to the parallel environment. We describe the design and implementation of these programs and present some performance results on a 256-node Linux cluster. The Parallel Unix Commands are open source and freely available.
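The parallel Unix commands described above follow a fan-out/gather pattern: each node runs the local portion of the command and the results are collected centrally. The paper's tools use MPI across cluster nodes; the sketch below uses a thread pool over local directories as a self-contained single-machine stand-in for that pattern:

```python
# Fan-out/gather sketch of a parallel `ls`. The actual Parallel Unix Commands
# use MPI across cluster nodes; a thread pool over local paths stands in here.
import os
from concurrent.futures import ThreadPoolExecutor

def list_dir(path):
    """Worker: the per-node portion of a parallel `ls`."""
    try:
        return path, sorted(os.listdir(path))
    except OSError as err:
        return path, ["error: %s" % err]

def parallel_ls(paths, n_workers=4):
    """Fan the command out over all paths and gather the results."""
    with ThreadPoolExecutor(n_workers) as pool:
        return dict(pool.map(list_dir, paths))

results = parallel_ls(["/tmp", "/nonexistent"])
for path, entries in sorted(results.items()):
    print(path, len(entries))
```

The same structure generalizes to `ps` or `find`: only the worker body changes, while the fan-out and gather logic stays fixed.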
Experimental magic state distillation for fault-tolerant quantum computing.
Souza, Alexandre M; Zhang, Jingfu; Ryan, Colm A; Laflamme, Raymond
2011-01-25
Any physical quantum device for quantum information processing (QIP) is subject to errors in implementation. In order to be reliable and efficient, quantum computers will need error-correcting or error-avoiding methods. Fault-tolerance achieved through quantum error correction will be an integral part of quantum computers. Of the many methods that have been discovered to implement it, a highly successful approach has been to use transversal gates and specific initial states. A critical element for its implementation is the availability of high-fidelity initial states, such as |0〉 and the 'magic state'. Here, we report an experiment, performed in a nuclear magnetic resonance (NMR) quantum processor, showing sufficient quantum control to improve the fidelity of imperfect initial magic states by distilling five of them into one with higher fidelity.
Exploring information technology adoption by family physicians: survey instrument validation.
Dixon, D. R.; Stewart, M.
2000-01-01
As the information needs of family physicians become more complex, there will be a greater need to successfully implement the technologies needed to manage that information. The ability to stratify primary care physicians can enable the implementation process to be more efficient. This research tested a new instrument on 101 family physicians, and was able to stratify physicians into high, intermediate, and low information technology (IT) usage groups. It is expected that this stratification would allow managers of IT implementation to target specific adoption strategies for each group. The instrument is available from ddixon@julian.uwo.ca. PMID:11079870
Mao, Hongliang
2017-01-01
Abstract Motivation: Short Interspersed Nuclear Elements (SINEs) are transposable elements (TEs) that amplify through a copy-and-paste mode via RNA intermediates. The computational identification of new SINEs is challenging because of their weak structural signals and rapid sequence diversification. Results: Here we report SINE_Scan, a highly efficient program to predict SINE elements in genomic DNA sequences. SINE_Scan integrates hallmarks of SINE transposition, copy number, and structural signals to identify a SINE element. SINE_Scan outperforms the previously published de novo SINE discovery program. It shows high sensitivity and specificity in 19 plant and animal genome assemblies, whose sizes vary from 120 Mb to 3.5 Gb. It identifies numerous new families and substantially increases the estimated abundance of SINEs in these genomes. Availability and Implementation: The code of SINE_Scan is freely available at http://github.com/maohlzj/SINE_Scan, implemented in PERL and supported on Linux. Contact: wangh8@fudan.edu.cn Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28062442
NASA Technical Reports Server (NTRS)
Pineda, Evan J.; Bednarcyk, Brett A.; Waas, Anthony M.; Arnold, Steven M.
2012-01-01
The smeared crack band theory is implemented within the generalized method of cells and high-fidelity generalized method of cells micromechanics models to capture progressive failure within the constituents of a composite material while retaining objectivity with respect to the size of the discretization elements used in the model. A repeating unit cell containing 13 randomly arranged fibers is modeled and subjected to a combination of transverse tension/compression and transverse shear loading. The implementation is verified against experimental data (where available) and against an equivalent finite element model utilizing the same implementation of the crack band theory. To evaluate the performance of the crack band theory within a repeating unit cell that is more amenable to a multiscale implementation, a single fiber is modeled with the generalized method of cells and high-fidelity generalized method of cells using a relatively coarse subcell mesh, subjected to the same loading scenarios as the multiple-fiber repeating unit cell. The generalized method of cells and high-fidelity generalized method of cells models are validated against a very refined finite element model.
Johnston, Kylie N; Young, Mary; Grimmer-Somers, Karen A; Antic, Ral; Frith, Peter A
2011-01-01
Background Clinical guidelines for management of patients with chronic obstructive pulmonary disease (COPD) include recommendations based on high levels of evidence, but gaps exist in their implementation. The aim of this study was to examine the perspectives of medical practitioners regarding implementation of six high-evidence recommendations for the management of people with COPD. Methods Semi-structured interviews were conducted with medical practitioners involved with care of COPD patients in hospital and general practice. Interviews sought medical practitioners’ experience regarding implementation of smoking cessation, influenza vaccination, pulmonary rehabilitation, guideline-based medications, long-term oxygen therapy for hypoxemia, and plans and advice for future exacerbations. Interviews were audiotaped, transcribed verbatim and analyzed using content analysis. Results Nine hospital-based medical practitioners and seven general practitioners participated. Four major categories were identified which impacted on implementation of the target recommendations in the care of patients with COPD: (1) role clarity of the medical practitioner; (2) persuasive communication with the patient; (3) complexity of behavioral change required; (4) awareness and support available at multiple levels. For some recommendations, strength in all four categories provided significant enablers supporting implementation. However, with regard to pulmonary rehabilitation and plans and advice for future exacerbations, all identified categories presented barriers to implementation. Conclusion This study of medical practitioner perspectives has indicated areas where significant barriers to the implementation of key evidence-based recommendations in COPD management persist. Developing strategies to target the identified categories provides an opportunity to achieve greater implementation of those high-evidence recommendations in the care of people with COPD. PMID:22259242
High dimensional biological data retrieval optimization with NoSQL technology.
Wang, Shicai; Pandis, Ioannis; Wu, Chao; He, Sijin; Johnson, David; Emam, Ibrahim; Guitton, Florian; Guo, Yike
2014-01-01
High-throughput transcriptomic data generated by microarray experiments is the most abundant and frequently stored kind of data currently used in translational medicine studies. Although microarray data is supported in data warehouses such as tranSMART, queries against relational databases for hundreds of different patients' gene expression records are slow due to poor performance. Non-relational data models, such as the key-value model implemented in NoSQL databases, hold promise to be more performant solutions. Our motivation is to improve the performance of the tranSMART data warehouse with a view to supporting Next Generation Sequencing data. In this paper we introduce a new data model better suited for high-dimensional data storage and querying, optimized for database scalability and performance. We have designed a key-value pair data model to support faster queries over large-scale microarray data and implemented the model using HBase, an implementation of Google's BigTable storage system. An experimental performance comparison was carried out against the traditional relational data model implemented in both MySQL Cluster and MongoDB, using a large publicly available transcriptomic data set taken from NCBI GEO concerning Multiple Myeloma. Our new key-value data model implemented on HBase exhibits an average 5.24-fold increase in high-dimensional biological data query performance compared to the relational model implemented on MySQL Cluster, and an average 6.47-fold increase in query performance over MongoDB. The performance evaluation found that the new key-value data model, in particular its implementation in HBase, outperforms the relational model currently implemented in tranSMART. We propose that NoSQL technology holds great promise for large-scale data management, in particular for high-dimensional biological data such as that demonstrated in the performance evaluation described in this paper.
We aim to use this new data model as a basis for migrating tranSMART's implementation to a more scalable solution for Big Data.
PMID:25435347
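The key-value design described above works because BigTable-style stores such as HBase keep rows sorted by key, so a composite row key turns "all expression values for one patient" into a cheap prefix scan instead of a relational join. A minimal sketch of that idea, using a sorted in-memory store as a stand-in for HBase; the key layout and values are illustrative assumptions, not the paper's exact schema:

```python
# Illustration of the key-value layout: a composite key
# "<dataset>:<patient>:<probe>" makes per-patient retrieval an ordered
# prefix scan. A sorted in-memory store stands in for HBase here.
from bisect import bisect_left

class SortedKV:
    def __init__(self):
        self._keys, self._vals = [], []

    def put(self, key, value):
        i = bisect_left(self._keys, key)
        if i < len(self._keys) and self._keys[i] == key:
            self._vals[i] = value          # overwrite existing key
        else:
            self._keys.insert(i, key)
            self._vals.insert(i, value)

    def prefix_scan(self, prefix):
        """Yield all (key, value) pairs whose key starts with prefix, in order."""
        i = bisect_left(self._keys, prefix)
        while i < len(self._keys) and self._keys[i].startswith(prefix):
            yield self._keys[i], self._vals[i]
            i += 1

store = SortedKV()
store.put("GSE0001:patient42:TP53", 7.81)   # made-up expression values
store.put("GSE0001:patient42:BRCA1", 5.12)
store.put("GSE0001:patient43:TP53", 6.02)

# One ordered scan fetches every probe value for patient42:
print(list(store.prefix_scan("GSE0001:patient42:")))
```

Because keys sort lexicographically, all rows for one patient are physically adjacent, which is the property the HBase implementation exploits for its query speedups.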
CUDA compatible GPU cards as efficient hardware accelerators for Smith-Waterman sequence alignment
Manavski, Svetlin A; Valle, Giorgio
2008-01-01
Background Searching for similarities in protein and DNA databases has become a routine procedure in Molecular Biology. The Smith-Waterman algorithm has been available for more than 25 years. It is based on a dynamic programming approach that explores all the possible alignments between two sequences; as a result it returns the optimal local alignment. Unfortunately, the computational cost is very high, requiring a number of operations proportional to the product of the length of two sequences. Furthermore, the exponential growth of protein and DNA databases makes the Smith-Waterman algorithm unrealistic for searching similarities in large sets of sequences. For these reasons heuristic approaches such as those implemented in FASTA and BLAST tend to be preferred, allowing faster execution times at the cost of reduced sensitivity. The main motivation of our work is to exploit the huge computational power of commonly available graphic cards, to develop high performance solutions for sequence alignment. Results In this paper we present what we believe is the fastest solution of the exact Smith-Waterman algorithm running on commodity hardware. It is implemented in the recently released CUDA programming environment by NVidia. CUDA allows direct access to the hardware primitives of the last-generation Graphics Processing Units (GPU) G80. Speeds of more than 3.5 GCUPS (Giga Cell Updates Per Second) are achieved on a workstation running two GeForce 8800 GTX. Exhaustive tests have been done to compare our implementation to SSEARCH and BLAST, running on a 3 GHz Intel Pentium IV processor. Our solution was also compared to a recently published GPU implementation and to a Single Instruction Multiple Data (SIMD) solution. These tests show that our implementation performs from 2 to 30 times faster than any other previous attempt available on commodity hardware. 
Conclusions The results show that graphic cards are now sufficiently advanced to be used as efficient hardware accelerators for sequence alignment. Their performance is better than any alternative available on commodity hardware platforms. The solution presented in this paper allows large scale alignments to be performed at low cost, using the exact Smith-Waterman algorithm instead of the largely adopted heuristic approaches. PMID:18387198
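The dynamic program the GPU kernels above accelerate can be stated in a few lines of (slow, CPU-bound) Python. A reference sketch of the Smith-Waterman local-alignment score with a linear gap penalty; the match/mismatch/gap values are illustrative choices, not the paper's scoring scheme:

```python
# Reference Smith-Waterman local alignment score: the cell-update recurrence
# that GPU implementations parallelize. Scoring parameters are illustrative.
def smith_waterman_score(a, b, match=2, mismatch=-1, gap=-2):
    prev = [0] * (len(b) + 1)   # previous DP row; row 0 is all zeros
    best = 0
    for ca in a:
        curr = [0]              # column 0 is always zero (local alignment)
        for j, cb in enumerate(b, start=1):
            diag = prev[j - 1] + (match if ca == cb else mismatch)
            # Local alignment: a cell never goes below zero
            score = max(0, diag, prev[j] + gap, curr[j - 1] + gap)
            curr.append(score)
            best = max(best, score)
        prev = curr
    return best

print(smith_waterman_score("ACACACTA", "AGCACACA"))
```

Each cell depends on its left, upper, and upper-left neighbors, so cells along an anti-diagonal are independent; that wavefront independence is what the CUDA implementation exploits to reach GCUPS-scale throughput.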
NASA Astrophysics Data System (ADS)
Rauch, T.; Deetjen, J. L.
2003-01-01
State-of-the-art NLTE model atmosphere codes have arrived at a high level of "numerical" sophistication and are an adequate tool to analyze the available high-quality spectra from the infrared to the X-ray wavelength range. Current computational capacities allow calculations that include all elements from hydrogen up to the iron group, and the lack of reliable atomic data has become a crucial problem for further progress. We summarize briefly the available sources of atomic data and how they are implemented in the Tübingen Model Atmosphere Package (TMAP).
Petersen, Katelin E; Prows, Cynthia A; Martin, Lisa J; Maglo, Koffi N
2014-01-01
The success of personalized medicine depends on factors influencing the availability and implementation of its new tools to individualize clinical care. However, little is known about physicians' views of the availability of personalized medicine across racial/ethnic groups and the relationship between perceived availability and clinical implementation. This study examines physicians' perceptions of key elements/tools and potential barriers to personalized medicine in connection with their perceptions of the availability of the latter across subpopulations. Study subjects consisted of physicians recruited from Cincinnati Children's Hospital Medical Center and UC Health. An electronic survey conducted from September 2012 to November 2012 recruited 104 physicians. Wilcoxon rank sum analysis compared groups. Physicians were divided about whether personalized medicine contributes to health equality, as 37.4% of them believe that personalized medicine is currently available only for some subpopulations. They also rated the importance of racial/ethnic background almost as high as the importance of genetic information in the delivery of personalized medicine. Actual elements of personalized medicine rated highest include family history, drug-drug interaction alerts in medical records, and biomarker measurements to guide therapy. Costs of gene-based therapies and genetic testing were rated the most significant barriers. The ratings of several elements and barriers were associated with perceived availability of personalized medicine across subpopulations. While physicians hold differing views about the availability and implementation of personalized medicine, they likewise establish complex relationships between race/ethnicity and personalized medicine that may carry serious implications for its clinical success. © 2014 S. Karger AG, Basel.
Begnaud, Abbie; Hall, Thomas; Allen, Tadashi
2016-01-01
Screening for lung cancer with low-dose CT has evolved rapidly in recent years since the National Lung Screening Trial (NLST) results. Subsequent professional and governmental organization guidelines have shaped policy and reimbursement for the service. Increasingly available guidance describes eligible patients and components necessary for a high-quality lung cancer screening program; however, practical instruction and implementation experience is not widely reported. We launched a lung cancer screening program in the face of reimbursement and guideline uncertainties at a large academic health center. We report our experience with implementation, including challenges and proposed solutions. Initially, we saw less referrals than expected for screening, and many patients referred for screening did not clearly meet eligibility guidelines. We educated primary care providers and implemented system tools to encourage referral of eligible patients. Moreover, in response to the Centers for Medicare & Medicaid Services (CMS) final coverage determination, we report our programmatic adaptation to meet these requirements. In addition to the components common to all quality programs, individual health delivery systems will face unique barriers related to patient population, available resources, and referral patterns.
System for selecting relevant information for decision support.
Kalina, Jan; Seidl, Libor; Zvára, Karel; Grünfeldová, Hana; Slovák, Dalibor; Zvárová, Jana
2013-01-01
We implemented a prototype of a decision support system called SIR, which takes the form of a web-based classification service for diagnostic decision support. The system is able to select the most relevant variables and to learn a classification rule that is guaranteed to be suitable even for high-dimensional measurements. The classification system can be useful for clinicians in primary care, supporting their decision-making tasks with relevant information extracted from any available clinical study. The implemented prototype was tested on a sample of patients in a cardiological study and performs information extraction from a high-dimensional set containing both clinical and gene expression data.
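The abstract does not detail SIR's selection or classification algorithms. As a rough sketch of the general pattern it describes (a filter-style relevance ranking followed by a classification rule learned on the reduced variable set), here is a minimal numpy illustration; the scoring function, the value of k and the nearest-centroid rule are assumptions, not the published method:

```python
import numpy as np

def select_relevant(X, y, k=5):
    """Rank variables by a standardized mean-difference score and keep the
    top k (a generic filter step, not the specific selection rule of SIR)."""
    m1, m0 = X[y == 1].mean(axis=0), X[y == 0].mean(axis=0)
    s = X.std(axis=0) + 1e-9  # guard against zero-variance columns
    return np.argsort(np.abs(m1 - m0) / s)[::-1][:k]

def nearest_centroid_rule(X, y, idx):
    """Learn a simple classification rule on the selected variables only."""
    c1 = X[y == 1][:, idx].mean(axis=0)
    c0 = X[y == 0][:, idx].mean(axis=0)
    def classify(x):
        x = np.asarray(x)[idx]
        return int(np.linalg.norm(x - c1) < np.linalg.norm(x - c0))
    return classify
```

The point of the two-stage design is that the rule is fitted on k variables rather than all p, which keeps it usable when p far exceeds the number of patients.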
High responsivity CMOS imager pixel implemented in SOI technology
NASA Technical Reports Server (NTRS)
Zheng, X.; Wrigley, C.; Yang, G.; Pain, B.
2000-01-01
Availability of mature sub-micron CMOS technology and the advent of the new low-noise active pixel sensor (APS) concept enabled the development of low-power, miniature, single-chip CMOS digital imagers in the 1990s.
NASA Technical Reports Server (NTRS)
Saleeb, A. F.; Chang, T. Y. P.; Wilt, T.; Iskovitz, I.
1989-01-01
The research work performed during the past year on finite element implementation and computational techniques pertaining to high temperature composites is outlined. In the present research, two main issues are addressed: efficient geometric modeling of composite structures and expedient numerical integration techniques for constitutive rate equations. For the first issue, mixed finite elements for modeling laminated plates and shells were examined in terms of numerical accuracy, locking properties and computational efficiency. Element applications include (currently available) linearly elastic analysis and future extension to material nonlinearity for damage predictions and large deformations. On the material level, various methods for integrating nonlinear constitutive rate equations for finite element implementation were studied, including explicit, implicit and automatic subincrementing schemes. In all cases, examples are included to illustrate the numerical characteristics of the methods considered.
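The simplest of the integration schemes mentioned above can be sketched in a few lines. The following shows a forward-Euler step with optional subincrementing on a scalar relaxation equation; the rate law is an assumed toy example, not one of the composite constitutive models studied:

```python
def integrate_rate(sigma0, rate, dt, nsub=1):
    """Forward-Euler integration of a scalar constitutive rate equation
    d(sigma)/dt = rate(sigma), with the time step optionally split into
    nsub subincrements (the simplest form of a subincrementing scheme)."""
    sigma, h = sigma0, dt / float(nsub)
    for _ in range(nsub):
        sigma += h * rate(sigma)
    return sigma
```

For stress relaxation d(sigma)/dt = -sigma/tau with tau = dt, a single explicit step drives the stress straight to zero, while 100 subincrements recover the exact exponential to within about half a percent; this accuracy-versus-cost trade-off is what motivates automatic subincrementing.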
Availability of drinking water in US public school cafeterias.
Hood, Nancy E; Turner, Lindsey; Colabianchi, Natalie; Chaloupka, Frank J; Johnston, Lloyd D
2014-09-01
This study examined the availability of free drinking water during lunchtime in US public schools, as required by federal legislation beginning in the 2011-2012 school year. Data were collected by mail-back surveys in nationally representative samples of US public elementary, middle, and high schools from 2009-2010 to 2011-2012. Overall, 86.4%, 87.4%, and 89.4% of students attended elementary, middle, and high schools, respectively, that met the drinking water requirement. Most students attended schools with existing cafeteria drinking fountains and about one fourth attended schools with water dispensers. In middle and high schools, respondents were asked to indicate whether drinking fountains were clean, and whether they were aware of any water-quality problems at the school. The vast majority of middle and high school students (92.6% and 90.4%, respectively) attended schools where the respondent perceived drinking fountains to be clean or very clean. Approximately one in four middle and high school students attended a school where the survey respondent indicated that there were water-quality issues affecting drinking fountains. Although most schools have implemented the requirement to provide free drinking water at lunchtime, additional work is needed to promote implementation at all schools. School nutrition staff at the district and school levels can play an important role in ensuring that schools implement the drinking water requirement, as well as promote education and behavior-change strategies to increase student consumption of water at school. Copyright © 2014 Academy of Nutrition and Dietetics. Published by Elsevier Inc. All rights reserved.
Orlando, Lori A.; Sperber, Nina R.; Voils, Corrine; Nichols, Marshall; Myers, Rachel A.; Wu, R. Ryanne; Rakhra-Burris, Tejinder; Levy, Kenneth D.; Levy, Mia; Pollin, Toni I.; Guan, Yue; Horowitz, Carol R.; Ramos, Michelle; Kimmel, Stephen E.; McDonough, Caitrin W.; Madden, Ebony B.; Damschroder, Laura J.
2017-01-01
Purpose Implementation research provides a structure for evaluating the clinical integration of genomic medicine interventions. This paper describes the Implementing GeNomics In PracTicE (IGNITE) Network’s efforts to promote: 1) a broader understanding of genomic medicine implementation research; and 2) the sharing of knowledge generated in the network. Methods To facilitate this goal the IGNITE Network Common Measures Working Group (CMG) members adopted the Consolidated Framework for Implementation Research (CFIR) to guide their approach to: identifying constructs and measures relevant to evaluating genomic medicine as a whole, standardizing data collection across projects, and combining data in a centralized resource for cross network analyses. Results CMG identified ten high-priority CFIR constructs as important for genomic medicine. Of those, eight did not have standardized measurement instruments. Therefore, we developed four survey tools to address this gap. In addition, we identified seven high-priority constructs related to patients, families, and communities that did not map to CFIR constructs. Both sets of constructs were combined to create a draft genomic medicine implementation model. Conclusion We developed processes to identify constructs deemed valuable for genomic medicine implementation and codified them in a model. These resources are freely available to facilitate knowledge generation and sharing across the field. PMID:28914267
Orlando, Lori A; Sperber, Nina R; Voils, Corrine; Nichols, Marshall; Myers, Rachel A; Wu, R Ryanne; Rakhra-Burris, Tejinder; Levy, Kenneth D; Levy, Mia; Pollin, Toni I; Guan, Yue; Horowitz, Carol R; Ramos, Michelle; Kimmel, Stephen E; McDonough, Caitrin W; Madden, Ebony B; Damschroder, Laura J
2018-06-01
Purpose Implementation research provides a structure for evaluating the clinical integration of genomic medicine interventions. This paper describes the Implementing Genomics in Practice (IGNITE) Network's efforts to promote (i) a broader understanding of genomic medicine implementation research and (ii) the sharing of knowledge generated in the network. Methods To facilitate this goal, the IGNITE Network Common Measures Working Group (CMG) members adopted the Consolidated Framework for Implementation Research (CFIR) to guide its approach to identifying constructs and measures relevant to evaluating genomic medicine as a whole, standardizing data collection across projects, and combining data in a centralized resource for cross-network analyses. Results CMG identified 10 high-priority CFIR constructs as important for genomic medicine. Of those, eight did not have standardized measurement instruments. Therefore, we developed four survey tools to address this gap. In addition, we identified seven high-priority constructs related to patients, families, and communities that did not map to CFIR constructs. Both sets of constructs were combined to create a draft genomic medicine implementation model. Conclusion We developed processes to identify constructs deemed valuable for genomic medicine implementation and codified them in a model. These resources are freely available to facilitate knowledge generation and sharing across the field.
Xi, Xiaoyu; Li, Weixia; Li, Jun; Zhu, Xuan; Fu, Cong; Wei, Xu; Chu, Shuzhen
2015-08-27
Field surveys conducted in China before the implementation of the essential medicine policy showed that Chinese individuals had limited access to essential medicines. This paper aims to evaluate the availability, prices and affordability of essential medicines in Jiangsu Province, China after the implementation of the policy in 2009. A cross-sectional survey was conducted in Jiangsu in 2013 using the World Health Organization/Health Action International (WHO/HAI) methodology. Data on the availability and prices of 50 essential medicines were collected from the public and private healthcare sectors. The mean availabilities of innovator brands (IBs) and lowest priced generics (LPGs) were 11.5% and 100% in primary healthcare facilities, 36.8% and 32.6% in the secondary and tertiary sectors, and 18.7% and 42.9% in the private sector, respectively. The median price ratios (MPRs) were 1.26 to 2.05 for generics and 3.76 to 27.22 for innovator brands. Treating ten common diseases with LPGs was generally affordable, whereas treatment with IBs was less affordable. The high availability of LPGs at primary healthcare facilities reflects the success of the essential medicine policy, while the low availability at the secondary and tertiary levels and in private pharmacies reflects a failure to implement the policy at these levels. The health policy should be fully developed and enforced at the secondary and tertiary levels and in the private sector to ensure equitable access to health services.
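The WHO/HAI metrics used in this survey are simple ratios: the median price ratio divides the median local unit price by an international reference unit price, and affordability is expressed as days of wages needed to buy a course of treatment. A minimal sketch with invented numbers (not survey data):

```python
def median_price_ratio(local_unit_prices, intl_reference_price):
    """WHO/HAI median price ratio: median local unit price divided by the
    international reference unit price for the same medicine."""
    p = sorted(local_unit_prices)
    n = len(p)
    median = p[n // 2] if n % 2 else (p[n // 2 - 1] + p[n // 2]) / 2.0
    return median / intl_reference_price

def affordability_in_days_wages(course_cost, daily_wage):
    """WHO/HAI affordability: number of days the lowest-paid government
    worker must work to pay for a standard course of treatment."""
    return course_cost / daily_wage
```

An MPR of 1 means the local price matches the international reference; the survey's IB values of 3.76 to 27.22 therefore indicate prices several to many times the reference.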
3.2 million stillbirths: epidemiology and overview of the evidence review
Lawn, Joy E; Yakoob, Mohammad Yawar; Haws, Rachel A; Soomro, Tanya; Darmstadt, Gary L; Bhutta, Zulfiqar A
2009-01-01
More than 3.2 million stillbirths occur globally each year, yet stillbirths are largely invisible in global data tracking, policy dialogue and programme implementation. This mismatch of burden to action is due to a number of factors that keep stillbirths hidden, notably a lack of data and a lack of consensus on priority interventions, but also to social taboos that reduce the visibility of stillbirths and the associated family mourning. Whilst there are estimates of the numbers of stillbirths, to date there has been no systematic global analysis of the causes of stillbirths. The multiple classifications systems in use are often complex and are primarily focused on high-income countries. We review available data and propose a programmatic classification that is feasible and comparable across settings. We undertook a comprehensive global review of available information on stillbirths in order to 1) identify studies that evaluated risk factors and interventions to reduce stillbirths, 2) evaluate the level of evidence for interventions, 3) place the available evidence for interventions in a health systems context to guide programme implementation, and 4) elucidate key implementation, monitoring, and research gaps. This first paper in the series outlines issues in stillbirth data availability and quality, the global epidemiology of stillbirths, and describes the methodology and framework used for the review of interventions and strategies. PMID:19426465
Li, Ben; Sun, Zhaonan; He, Qing; Zhu, Yu; Qin, Zhaohui S.
2016-01-01
Motivation: Modern high-throughput biotechnologies such as microarray are capable of producing a massive amount of information for each sample. However, in a typical high-throughput experiment, only a limited number of samples is assayed, giving rise to the classical ‘large p, small n’ problem. On the other hand, rapid propagation of these high-throughput technologies has resulted in a substantial collection of data, often carried out on the same platform and using the same protocol. It is highly desirable to utilize the existing data when performing analysis and inference on a new dataset. Results: Utilizing existing data can be carried out in a straightforward fashion under the Bayesian framework, in which the repository of historical data can be exploited to build informative priors and used in new data analysis. In this work, using microarray data, we investigate the feasibility and effectiveness of deriving informative priors from historical data and using them in the problem of detecting differentially expressed genes. Through simulation and real data analysis, we show that the proposed strategy significantly outperforms existing methods, including the popular and state-of-the-art Bayesian hierarchical model-based approaches. Our work illustrates the feasibility and benefits of exploiting the increasingly available genomics big data in statistical inference and presents a promising practical strategy for dealing with the ‘large p, small n’ problem. Availability and implementation: Our method is implemented in the R package IPBT, which is freely available from https://github.com/benliemory/IPBT. Contact: yuzhu@purdue.edu; zhaohui.qin@emory.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26519502
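The core idea (using a repository of historical data to build informative priors) can be illustrated by the common empirical-Bayes move of shrinking gene-wise variance estimates from a small new study toward what history suggests. This sketch is a generic illustration under an assumed prior weight; it is not the IPBT model itself:

```python
import numpy as np

def shrink_variances(new_vars, hist_vars, n_new, d0=4.0):
    """Pull gene-wise variance estimates from a small new study toward the
    average variance observed in historical data. The prior degrees of
    freedom d0 (how much the historical prior counts relative to the new
    data) is an assumed illustrative value."""
    prior_var = float(np.mean(hist_vars))   # informative prior from history
    d = n_new - 1                           # residual d.f. in the new study
    return (d0 * prior_var + d * np.asarray(new_vars, dtype=float)) / (d0 + d)
```

With few samples, raw per-gene variances are noisy; the shrunken estimates fall between the new-study values and the historical prior, stabilizing downstream differential-expression statistics.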
NASA Technical Reports Server (NTRS)
Roth, Don J.; Hendricks, J. Lynne; Whalen, Mike F.; Bodis, James R.; Martin, Katherine
1996-01-01
This article describes the commercial implementation of ultrasonic velocity imaging methods developed and refined at NASA Lewis Research Center on the Sonix c-scan inspection system. Two velocity imaging methods were implemented: thickness-based and non-thickness-based reflector plate methods. The article demonstrates capabilities of the commercial implementation and gives the detailed operating procedures required for Sonix customers to achieve optimum velocity imaging results. This commercial implementation of velocity imaging provides a 100x speed increase in scanning and processing over the lab-based methods developed at LeRC. The significance of this cooperative effort is that the aerospace and other materials development-intensive industries which use extensive ultrasonic inspection for process control and failure analysis will now have an alternative, highly accurate imaging method commercially available.
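The reflector-plate measurement details are not given in the abstract, but the thickness-based method rests on the basic pulse-echo relation between thickness, time of flight and velocity. A minimal sketch (SI units assumed):

```python
def velocity_thickness_based(thickness_m, time_of_flight_s):
    """Thickness-based velocity estimate: in pulse-echo mode the wave
    crosses the sample twice (down and back), so v = 2 * d / time of
    flight. Applied per scan point, this yields a velocity image."""
    return 2.0 * thickness_m / time_of_flight_s
```

For example, a 5 mm sample with a 2 microsecond round-trip echo delay gives a velocity of 5000 m/s.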
FPGA implementation of digital down converter using CORDIC algorithm
NASA Astrophysics Data System (ADS)
Agarwal, Ashok; Lakshmi, Boppana
2013-01-01
In radio receivers, digital down converters (DDCs) are used to translate the signal from the intermediate frequency level to baseband. A DDC also decimates the oversampled signal to a lower sample rate, eliminating the need for a high-end digital signal processor. In this paper we implement an architecture for a DDC employing the CORDIC algorithm, which down-converts a 70 MHz (3G) IF signal to a 200 kHz baseband GSM signal with an SFDR greater than 100 dB. The implemented architecture reduces hardware resource requirements by 15 percent compared with other architectures available in the literature, owing to the elimination of explicit multipliers and a quadrature phase shifter for mixing.
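CORDIC removes the explicit multipliers and quadrature phase shifter because rotation-mode CORDIC generates sine and cosine (and hence the quadrature mixing products) using only shifts, adds and a small arctangent table. The sketch below is a floating-point reference model of rotation-mode CORDIC; a hardware implementation would use fixed-point shift-add arithmetic, and this is not the paper's specific architecture:

```python
import math

def cordic_rotate(angle, iterations=24):
    """Rotation-mode CORDIC: rotate the vector (1, 0) by `angle` radians
    (|angle| <= pi/2) through a sequence of fixed micro-rotations,
    returning approximately (cos(angle), sin(angle))."""
    atans = [math.atan(2.0 ** -i) for i in range(iterations)]
    # Constant CORDIC gain, compensated once at the end
    K = 1.0
    for i in range(iterations):
        K /= math.sqrt(1.0 + 2.0 ** (-2 * i))
    x, y, z = 1.0, 0.0, angle
    for i in range(iterations):
        d = 1.0 if z >= 0 else -1.0          # steer residual angle to zero
        x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
        z -= d * atans[i]
    return x * K, y * K
```

Each iteration adds roughly one bit of precision, so 24 iterations suffice for well beyond the 100 dB SFDR regime discussed in the paper.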
Dual initiation strip charge apparatus and methods for making and implementing the same
Jakaboski, Juan-Carlos [Albuquerque, NM; Todd,; Steven, N [Rio Rancho, NM; Polisar, Stephen [Albuquerque, NM; Hughs, Chance [Tijeras, NM
2011-03-22
A Dual Initiation Strip Charge (DISC) apparatus is initiated by a single initiation source and detonates a strip of explosive charge at two separate contacts. The reflections of explosively induced stresses meet to create a fracture, breaching a target along a generally single fracture contour and producing generally fragment-free scattering with no spallation. Methods for making and implementing a DISC apparatus provide numerous advantages over previous methods of creating explosive charges: steps for rapid prototyping; efficient steps and designs for metering consistent, repeatable, and controlled amounts of high explosive; and the use of readily available materials.
ABACAS: algorithm-based automatic contiguation of assembled sequences
Assefa, Samuel; Keane, Thomas M.; Otto, Thomas D.; Newbold, Chris; Berriman, Matthew
2009-01-01
Summary: Due to the availability of new sequencing technologies, we are now increasingly interested in sequencing closely related strains of existing finished genomes. Recently, a number of de novo and mapping-based assemblers have been developed to produce high-quality draft genomes from new sequencing technology reads. New tools are necessary to take contigs from a draft assembly through to a fully contiguated genome sequence. ABACAS is intended as a tool to rapidly contiguate (align, order, orientate), visualize and design primers to close gaps on shotgun-assembled contigs based on a reference sequence. The input to ABACAS is a set of contigs, which are aligned to the reference genome, ordered and orientated, and visualized in the ACT comparative browser; optimal primer sequences are generated automatically. Availability and Implementation: ABACAS is implemented in Perl and is freely available for download from http://abacas.sourceforge.net Contact: sa4@sanger.ac.uk PMID:19497936
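The core contiguation step (order contigs by their reference coordinate, flip minus-strand contigs, join the pieces with gap characters) can be sketched as follows. This is a toy illustration of the idea, not ABACAS itself, which additionally designs gap-closing primers and drives the ACT browser:

```python
def contiguate(contigs, alignments, gap=100):
    """Order and orient contigs against a reference, given alignment
    tuples (name, ref_start, strand), and join them with runs of 'N'.
    `contigs` maps contig name to its sequence."""
    comp = str.maketrans("ACGT", "TGCA")
    parts = []
    for name, _start, strand in sorted(alignments, key=lambda a: a[1]):
        seq = contigs[name]
        if strand == "-":
            seq = seq.translate(comp)[::-1]  # reverse complement
        parts.append(seq)
    return ("N" * gap).join(parts)
```

The N-runs mark the gaps for which a tool like ABACAS would then propose closure primers.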
Rocca-Serra, Philippe; Brandizi, Marco; Maguire, Eamonn; Sklyar, Nataliya; Taylor, Chris; Begley, Kimberly; Field, Dawn; Harris, Stephen; Hide, Winston; Hofmann, Oliver; Neumann, Steffen; Sterk, Peter; Tong, Weida; Sansone, Susanna-Assunta
2010-01-01
Summary: The first open source software suite for experimentalists and curators that (i) assists in the annotation and local management of experimental metadata from high-throughput studies employing one or a combination of omics and other technologies; (ii) empowers users to uptake community-defined checklists and ontologies; and (iii) facilitates submission to international public repositories. Availability and Implementation: Software, documentation, case studies and implementations at http://www.isa-tools.org Contact: isatools@googlegroups.com PMID:20679334
Brunette, Mary F; Asher, Dianne; Whitley, Rob; Lutz, Wilma J; Wieder, Barbara L; Jones, Amanda M; McHugo, Gregory J
2008-09-01
Approximately half of the people who have serious mental illnesses experience a co-occurring substance use disorder at some point in their lifetime. Integrated dual disorders treatment, a program to treat persons with co-occurring disorders, improves outcomes but is not widely available in public mental health settings. This report describes the extent to which this intervention was implemented by 11 community mental health centers participating in a large study of practice implementation. Facilitators and barriers to implementation are described. Trained implementation monitors conducted regular site visits over two years. During visits, monitors interviewed key informants, conducted ethnographic observations of implementation efforts, and assessed fidelity to the practice model. These data were coded and used as a basis for detailed site reports summarizing implementation processes. The authors reviewed the reports and distilled the three top facilitators and barriers for each site. The most prominent cross-site facilitators and barriers were identified. Two sites reached high fidelity, six sites reached moderate fidelity, and three sites remained at low fidelity over the two years. Prominent facilitators and barriers to implementation with moderate to high fidelity were administrative leadership, consultation and training, supervisor mastery and supervision, chronic staff turnover, and finances. Common facilitators and barriers to implementation of integrated dual disorders treatment emerged across sites. The results confirmed the importance of the use of the consultant-trainer in the model of implementation, as well as the need for intensive activities at multiple levels to facilitate implementation. Further research on service implementation is needed, including but not limited to clarifying strategies to overcome barriers.
High-Frequency Wireless Communications System: 2.45-GHz Front-End Circuit and System Integration
ERIC Educational Resources Information Center
Chen, M.-H.; Huang, M.-C.; Ting, Y.-C.; Chen, H.-H.; Li, T.-L.
2010-01-01
In this article, a course on high-frequency wireless communications systems is presented. With the 145-MHz baseband subsystem available from a prerequisite course, the present course emphasizes the design and implementation of the 2.45-GHz front-end subsystem as well as system integration issues. In this curriculum, the 2.45-GHz front-end…
Providing Internet Access to High-Resolution Lunar Images
NASA Technical Reports Server (NTRS)
Plesea, Lucian
2008-01-01
The OnMoon server is a computer program that provides Internet access to high-resolution Lunar images, maps, and elevation data, all suitable for use in geographical information system (GIS) software for generating images, maps, and computational models of the Moon. The OnMoon server implements the Open Geospatial Consortium (OGC) Web Map Service (WMS) server protocol and supports Moon-specific extensions. Unlike other Internet map servers that provide Lunar data using an Earth coordinate system, the OnMoon server supports encoding of data in Moon-specific coordinate systems. The OnMoon server offers access to most of the available high-resolution Lunar image and elevation data. This server can generate image and map files in the tagged image file format (TIFF) or the Joint Photographic Experts Group (JPEG), 8- or 16-bit Portable Network Graphics (PNG), or Keyhole Markup Language (KML) format. Image control is provided by use of the OGC Style Layer Descriptor (SLD) protocol. Full-precision spectral arithmetic processing is also available, by use of a custom SLD extension. This server can dynamically add shaded relief based on the Lunar elevation to any image layer. This server also implements tiled WMS protocol and super-overlay KML for high-performance client application programs.
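Client programs talk to such a server through standard OGC WMS requests. The sketch below builds a WMS 1.1.1 GetMap URL; the endpoint and layer name are placeholders, not documented OnMoon identifiers:

```python
from urllib.parse import urlencode

def wms_getmap_url(base, layer, bbox, size=(512, 512),
                   srs="EPSG:4326", fmt="image/jpeg"):
    """Build a standard OGC WMS 1.1.1 GetMap request URL. `bbox` is
    (minx, miny, maxx, maxy) in the coordinates of `srs`."""
    params = {
        "SERVICE": "WMS", "VERSION": "1.1.1", "REQUEST": "GetMap",
        "LAYERS": layer, "SRS": srs,
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": size[0], "HEIGHT": size[1], "FORMAT": fmt,
    }
    return base + "?" + urlencode(params)
```

A Moon-specific server like OnMoon accepts the same request shape but interprets the SRS against lunar rather than terrestrial coordinate systems.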
Providing Internet Access to High-Resolution Mars Images
NASA Technical Reports Server (NTRS)
Plesea, Lucian
2008-01-01
The OnMars server is a computer program that provides Internet access to high-resolution Mars images, maps, and elevation data, all suitable for use in geographical information system (GIS) software for generating images, maps, and computational models of Mars. The OnMars server is an implementation of the Open Geospatial Consortium (OGC) Web Map Service (WMS) server. Unlike other Mars Internet map servers that provide Martian data using an Earth coordinate system, the OnMars WMS server supports encoding of data in Mars-specific coordinate systems. The OnMars server offers access to most of the available high-resolution Martian image and elevation data, including an 8-meter-per-pixel uncontrolled mosaic of most of the Mars Global Surveyor (MGS) Mars Observer Camera Narrow Angle (MOCNA) image collection, which is not available elsewhere. This server can generate image and map files in the tagged image file format (TIFF), Joint Photographic Experts Group (JPEG), 8- or 16-bit Portable Network Graphics (PNG), or Keyhole Markup Language (KML) format. Image control is provided by use of the OGC Style Layer Descriptor (SLD) protocol. The OnMars server also implements tiled WMS protocol and super-overlay KML for high-performance client application programs.
Code of Federal Regulations, 2010 CFR
2010-07-01
... consideration and use of the best available science to inform project decisionmaking that implements a land... developed considering the best available science in accordance with § 219.35(a). Projects implementing land... official must consider the best available science in implementing and, if appropriate, amending the plan...
Manzanera, R; Plana, M; Moya, D; Ortner, J; Mira, J J
2016-01-01
To describe the level of implementation of quality and safety good practice elements in a mutual society health centre, a cross-sectional study was conducted to assess the level of implementation of good practices using a questionnaire. Some quality dimensions were also assessed (scale 0 to 10) by a set of 87 quality coordinators of health centres and a random sample of 54 healthcare professionals working in small centres. Seventy quality coordinators and 27 professionals replied (response rates 80% and 50%, respectively). There were no differences in the assessment of quality attributes between the two groups. They identified as areas for improvement: use of practice guidelines (7.6/10), scientific and technical skills (7.5/10), and patient satisfaction (7.7/10). Availability of and accessibility to clinical reports, informed consent, availability of hydro-alcoholic solution, and recording of allergies were considered of high importance to implement, with training and research, improvements in equipment and technology plans, adherence to clinical practice guidelines and the preparation of risk maps being of less importance. The good practices related to equipment and resources have a higher likelihood of being implemented, while those related to quality and safety attitudes face more barriers to implementation. The mutual behaves similarly to other healthcare institutions. Copyright © 2015 SECA. Published by Elsevier Espana. All rights reserved.
Spatiotemporal matrix image formation for programmable ultrasound scanners
NASA Astrophysics Data System (ADS)
Berthon, Beatrice; Morichau-Beauchant, Pierre; Porée, Jonathan; Garofalakis, Anikitos; Tavitian, Bertrand; Tanter, Mickael; Provost, Jean
2018-02-01
As programmable ultrasound scanners become more common in research laboratories, it is increasingly important to develop robust software-based image formation algorithms that can be obtained in a straightforward fashion for different types of probes and sequences with a small risk of error during implementation. In this work, we argue that as the computational power keeps increasing, it is becoming practical to directly implement an approximation to the matrix operator linking reflector point targets to the corresponding radiofrequency signals via thoroughly validated and widely available simulations software. Once such a spatiotemporal forward-problem matrix is constructed, standard and thus highly optimized inversion procedures can be leveraged to achieve very high quality images in real time. Specifically, we show that spatiotemporal matrix image formation produces images of similar or enhanced quality when compared against standard delay-and-sum approaches in phantoms and in vivo, and show that this approach can be used to form images even when using non-conventional probe designs for which adapted image formation algorithms are not readily available.
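The paper's central point is that once a forward matrix A linking point reflectors to RF samples is available, image formation reduces to a standard, highly optimized inversion. Below is a heavily simplified 1-D sketch (single element, Gaussian pulse model, invented geometry); the authors instead build A with validated ultrasound simulation software:

```python
import numpy as np

def forward_matrix(pixel_depths, times, c=1540.0, sigma=5e-7):
    """Toy 1-D forward operator A: column j is the pulse-echo RF signal a
    unit reflector at depth pixel_depths[j] would produce, modeled as a
    Gaussian pulse centered on its round-trip delay."""
    t = np.asarray(times)[:, None]
    delays = 2.0 * np.asarray(pixel_depths)[None, :] / c  # round-trip time
    return np.exp(-((t - delays) ** 2) / (2.0 * sigma ** 2))

def form_image(A, rf, lam=1e-3):
    """Image formation as regularized least-squares inversion of A."""
    AtA = A.T @ A + lam * np.eye(A.shape[1])
    return np.linalg.solve(AtA, A.T @ rf)
```

Because A is fixed for a given probe and sequence, the normal equations (or a factorization of A) can be precomputed once, which is what makes real-time inversion practical.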
Efficient implementation of the many-body Reactive Bond Order (REBO) potential on GPU
NASA Astrophysics Data System (ADS)
Trędak, Przemysław; Rudnicki, Witold R.; Majewski, Jacek A.
2016-09-01
The second-generation Reactive Bond Order (REBO) empirical potential is commonly used to accurately model a wide range of hydrocarbon materials. It is also extensible to other atom types and interactions. The REBO potential assumes a complex multi-body interaction model that is difficult to represent efficiently in the SIMD or SIMT programming model. Hence, despite its importance, no efficient GPGPU implementation had been developed for this potential. Here we present a detailed description of a highly efficient GPGPU implementation of a molecular dynamics algorithm using the REBO potential. The presented algorithm takes advantage of rarely used properties of the SIMT architecture of a modern GPU to solve the difficult synchronization issues that arise in computations of a multi-body potential. Techniques developed for this problem may also be applied to other problems. The performance of the proposed algorithm is assessed using a range of model systems and compared to the highly optimized CPU implementation (both single-core and OpenMP) available in the LAMMPS package. These experiments show up to a 6x improvement in force computation time using a single processor of the NVIDIA Tesla K80 compared to a high-end 16-core Intel Xeon processor.
Annotare—a tool for annotating high-throughput biomedical investigations and resulting data
Shankar, Ravi; Parkinson, Helen; Burdett, Tony; Hastings, Emma; Liu, Junmin; Miller, Michael; Srinivasa, Rashmi; White, Joseph; Brazma, Alvis; Sherlock, Gavin; Stoeckert, Christian J.; Ball, Catherine A.
2010-01-01
Summary: Computational methods in molecular biology will increasingly depend on standards-based annotations that describe biological experiments in an unambiguous manner. Annotare is a software tool that enables biologists to easily annotate their high-throughput experiments, biomaterials and data in a standards-compliant way that facilitates meaningful search and analysis. Availability and Implementation: Annotare is available from http://code.google.com/p/annotare/ under the terms of the open-source MIT License (http://www.opensource.org/licenses/mit-license.php). It has been tested on both Mac and Windows. Contact: rshankar@stanford.edu PMID:20733062
DOE Office of Scientific and Technical Information (OSTI.GOV)
Galitsky, Christina; Martin, Nathan; Worrell, Ernst
2003-09-01
Annually, breweries in the United States spend over $200 million on energy. Energy consumption is equal to 38 percent of the production costs of beer, making energy efficiency improvement an important way to reduce costs, especially in times of high energy price volatility. After a summary of the beer making process and energy use, we examine energy efficiency opportunities available for breweries. We provide specific primary energy savings for each energy efficiency measure based on case studies that have implemented the measures, as well as references to technical literature. If available, we have also listed typical payback periods. Our findings suggest that given available technology, there are still opportunities to reduce energy consumption cost-effectively in the brewing industry. Brewers value highly the quality, taste and drinkability of their beer. Brewing companies have and are expected to continue to spend capital on cost-effective energy conservation measures that meet these quality, taste and drinkability requirements. For individual plants, further research on the economics of the measures, as well as their applicability to different brewing practices, is needed to assess implementation of selected technologies.
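The payback periods mentioned above follow from simple arithmetic; the sketch below uses invented figures, not numbers from the report:

```python
def simple_payback_years(capital_cost, annual_savings):
    """Simple payback period for an efficiency measure: years for the
    annual energy cost savings to repay the capital outlay (ignores
    discounting and energy price changes)."""
    return capital_cost / annual_savings
```

For example, a hypothetical $50,000 heat-recovery retrofit saving $20,000 per year pays back in 2.5 years.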
A network-based training environment: a medical image processing paradigm.
Costaridou, L; Panayiotakis, G; Sakellaropoulos, P; Cavouras, D; Dimopoulos, J
1998-01-01
The capability of interactive multimedia and Internet technologies is investigated with respect to the implementation of a distance learning environment. The system is built according to a client-server architecture, based on the Internet infrastructure, and composed of server nodes conceptually modelled as WWW sites. Sites are implemented by customization of available components. The environment integrates network-delivered interactive multimedia courses, network-based tutoring, SIG support, information databases of professional interest, as well as course and tutoring management. This capability has been demonstrated by means of an implemented system, validated with digital image processing content, specifically image enhancement. Image enhancement methods are theoretically described and applied to mammograms. Emphasis is given to the interactive presentation of the effects of algorithm parameters on images. End-user access depends on available bandwidth, so high-speed access can be achieved via LAN or local ISDN connections. Network-based training offers new means of improved access to and sharing of learning resources and expertise, and is a promising supplement to training.
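A typical enhancement operation of the kind such a course would present interactively, with its parameters exposed to the learner, is a percentile-based contrast stretch. This is a generic example, not necessarily one of the algorithms taught in the course:

```python
import numpy as np

def contrast_stretch(img, low_pct=2.0, high_pct=98.0):
    """Percentile-based linear contrast stretch: map the [low_pct,
    high_pct] intensity range onto [0, 1], clipping the tails. The two
    percentile cutoffs are the tunable algorithm parameters."""
    lo, hi = np.percentile(img, [low_pct, high_pct])
    out = (np.asarray(img, dtype=float) - lo) / max(float(hi - lo), 1e-9)
    return np.clip(out, 0.0, 1.0)
```

Varying the cutoffs and redisplaying the result is exactly the kind of parameter-effect demonstration the abstract emphasizes for mammogram enhancement.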
Barth, R P; Ash, J R; Hacking, S
1986-01-01
Child abuse prevention programs rely on varied strategies to identify, screen, obtain referrals of, and engage high-risk parents. The available literature on community-based child abuse prevention projects is neither conclusive about project outcomes nor sufficiently descriptive about implementation. From the literature, experience and interviews with staff from more than 20 programs, barriers to implementation are identifiable. Barriers arise during identifying and screening of at-risk families, referral, continued collaboration with referrers, and engaging clients in services. The paper describes a diverse set of strategies for surmounting these barriers. Staff characteristics and concrete services partially predict the success of program implementation, as does the program's relationship to other agencies. Child abuse prevention programs assume independent, interdependent, and dependent relationships with other agencies and referrers. Interdependent programs appear to have the best chance of obtaining referrals and retaining clients who match their program's intent.
Accurate and Robust Unitary Transformations of a High-Dimensional Quantum System
NASA Astrophysics Data System (ADS)
Anderson, B. E.; Sosa-Martinez, H.; Riofrío, C. A.; Deutsch, Ivan H.; Jessen, Poul S.
2015-06-01
Unitary transformations are the most general input-output maps available in closed quantum systems. Good control protocols have been developed for qubits, but questions remain about the use of optimal control theory to design unitary maps in high-dimensional Hilbert spaces, and about the feasibility of their robust implementation in the laboratory. Here we design and implement unitary maps in a 16-dimensional Hilbert space associated with the 6S1/2 ground state of 133Cs, achieving fidelities >0.98 with built-in robustness to static and dynamic perturbations. Our work has relevance for quantum information processing and provides a template for similar advances on other physical platforms.
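The fidelity between a target unitary U and an implemented map V is commonly quantified as F = |Tr(U†V)|/d for a d-dimensional system. As an illustration only (the paper's benchmarking is more involved), here is a minimal pure-Python sketch for d = 2; the 16-dimensional case of the paper works the same way:

```python
# Illustrative sketch: fidelity F = |Tr(U^dagger V)| / d between a target
# unitary U and an implemented unitary V, for matrices given as nested lists.
# d = 2 here for brevity; the paper's system has d = 16.

def dagger(m):
    """Conjugate transpose of a square complex matrix."""
    n = len(m)
    return [[m[j][i].conjugate() for j in range(n)] for i in range(n)]

def matmul(a, b):
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def fidelity(u, v):
    """|Tr(U^dagger V)| / d; equals 1 iff V matches U up to a global phase."""
    n = len(u)
    prod = matmul(dagger(u), v)
    tr = sum(prod[i][i] for i in range(n))
    return abs(tr) / n

# Identity against itself: perfect overlap.
I = [[1 + 0j, 0j], [0j, 1 + 0j]]
assert abs(fidelity(I, I) - 1.0) < 1e-12
```

This global-phase-insensitive overlap is one standard figure of merit; experimental work typically reports a closely related process fidelity estimated from measurement data.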
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-09
... COUNCIL ON ENVIRONMENTAL QUALITY Instructions for Implementing Climate Change Adaptation Planning... Availability of Climate Change Adaptation Planning Implementing Instructions. SUMMARY: The Chair of the Council... for Implementing Climate Change Adaptation Planning are now available at: http://www.whitehouse.gov...
Bohus, M; Schmahl, C; Herpertz, S C; Lieb, K; Berger, M; Roepke, S; Heinz, A; Gallinat, J; Lyssenko, L
2016-07-01
Borderline personality disorder (BPD) is a severe mental disorder that places a heavy burden on the psychiatric healthcare system. Well-tested, disorder-specific treatment concepts are now available in Germany, including for inpatient treatment. These show substantial and long-lasting improvements in psychopathology as well as in post-treatment social participation; however, prerequisites for the implementation of these evidence-based inpatient psychotherapy programs are well-trained treatment teams and appropriate financing of the resource expenditure. The aim was to formulate a normative definition of the treatment duration and intensity needed for guideline-conformant, empirically proven and effective inpatient treatment of borderline personality disorder, as well as the derived personnel requirements, in comparison to the resources currently available within the framework of the Psychiatry Personnel Act (Psych-PV). The resource requirements were established on the basis of evaluated hospital ward models, the recommendations of the S2 guidelines and the criteria of specialist societies, and were compared with the personnel stipulations of the Psych-PV. The results for a normatively established treatment program showed a pronounced deficit in the financing of the evaluated resource requirements, even when the stipulations laid down in the Psych-PV were implemented at 100%. Disorder-specific inpatient treatment programs for borderline personality disorder have been scientifically proven to be highly effective; however, resource analyses show that the personnel requirements necessary for effective implementation of these programs are much higher than those allocated under Psych-PV funding. The current underfunding leads to inadequate treatment outcomes with high readmission rates and, as a result, high direct and indirect costs of illness.
45 CFR 162.920 - Availability of implementation specifications and operating rules.
Code of Federal Regulations, 2011 CFR
2011-10-01
... publications is consistent with the policies of other publishers of standards. The transaction implementation... 45 Public Welfare 1 2011-10-01 2011-10-01 false Availability of implementation specifications and...
Implementation of Integrated System Fault Management Capability
NASA Technical Reports Server (NTRS)
Figueroa, Fernando; Schmalzel, John; Morris, Jon; Smith, Harvey; Turowski, Mark
2008-01-01
Fault management to support rocket engine test missions with highly reliable and accurate measurements, while improving availability and lifecycle costs. Core elements: an architecture, taxonomy, and ontology (ATO) for DIaK management; intelligent sensor processes; intelligent element processes; intelligent controllers; intelligent subsystem processes; intelligent system processes; intelligent component processes.
An Air Force Guide for Effective Meeting Management
2011-05-01
The inspection program to ensure its sustainment has faced increasing workload requirements due to structural issues related to heavy use and aging... aircraft availability, the High Velocity Maintenance (HVM) concept is being implemented to replace the current PDM process for heavy maintenance
Marušić, Srećko; Knežević, Aleksandar; Bačić Vrca, Vesna; Marinović, Ivana; Bačić, Julija; Obreli Neto, Paulo Roque; Amidžić Klarić, Daniela; Diklić, Dijaneta
2017-12-01
The aim of this study was to evaluate the implementation of the 9th edition of the American College of Chest Physicians (ACCP9) guidelines for prevention of venous thromboembolism in nonsurgical patients in clinical practice in one university and one general Croatian hospital. A retrospective study was conducted at Zadar General Hospital from Zadar and Dubrava University Hospital from Zagreb. Medical charts of all patients admitted to Medical Departments in two periods, before and after implementation of the ACCP9 guidelines, were analyzed. The ACCP9 guidelines were made available to all physicians through the hospital electronic information system immediately after the publication. The Hospital Drug Committees promoted implementation of the guidelines during their periodical clinical visits. Overall, 850 patients were included in the study in two periods. There was no statistically significant difference in the number of high-risk patients receiving thromboprophylaxis after the guidelines implementation in either hospital. In both periods, a significantly higher number of high-risk patients received thromboprophylaxis in Dubrava University Hospital in comparison with Zadar General Hospital (31.7% vs. 3.8% and 40.3% vs. 7.3%, respectively; p<0.001). This study revealed insufficient implementation of evidence-based thromboprophylaxis guidelines in clinical practice in two Croatian hospitals.
PRACTICAL BARRIERS TO IMPLEMENTATION OF THYROID CANCER GUIDELINES IN THE ASIA-PACIFIC REGION.
Yang, Samantha Peiling; Ying, Lee Suat; Saw, Stephanie; Tuttle, R Michael; Venkataraman, Kavita; Su-Ynn, Chia
2015-11-01
Numerous published guidelines have described the optimal management of thyroid cancer. However, these rely on the clinical availability of diagnostic and therapeutic modalities. We hypothesized that the availability of medical resources and economic circumstances vary in Asia-Pacific countries, making it difficult to implement guideline recommendations into clinical practice. We surveyed participants at the 2009 and 2013 Congresses of the Association of Southeast Asian Nations Federation of Endocrine Societies by distributing questionnaires to attendees at registration. Responses were obtained from 268 respondents in 2009 and 163 respondents in 2013. Similar to the high prevalence of low-risk thyroid cancer observed in the Surveillance, Epidemiology, and End Results database, across the Asia-Pacific countries surveyed in 2009 and 2013, 50 to 100% of the respondents from the Philippines, Malaysia, Singapore, China, Taiwan, Thailand, Hong Kong, Korea, and Sri Lanka reported that more than 50% of the patients had low-risk thyroid cancer on follow-up. Importantly, there was much variation with regard to the perceived availability of investigation and treatment modalities. We found a wide variation in clinicians' perception of availability of diagnostic and therapeutic modalities in the face of a rise in thyroid cancer incidence and thyroid cancer management guidelines that emphasized their importance. The lack of availability of management tools and treatments will prove to be a major barrier to the implementation of thyroid cancer management guidelines in Southeast Asia, and likely in other parts of the world as well.
Subnanosecond time-to-digital converter implemented in a Kintex-7 FPGA
NASA Astrophysics Data System (ADS)
Sano, Y.; Horii, Y.; Ikeno, M.; Sasaki, O.; Tomoto, M.; Uchida, T.
2017-12-01
Time-to-digital converters (TDCs) are used in various fields, including high-energy physics. One advantage of implementing TDCs in field-programmable gate arrays (FPGAs) is the flexibility to modify the logic, which is useful for coping with changes in experimental conditions. Recent FPGAs make it possible to implement TDCs with a time resolution of less than 10 ps. On the other hand, various drift chambers require a time resolution of O(0.1) ns, and a simple, easy-to-implement TDC is useful for robust operation. Here, an eight-channel TDC with a variable bin size down to 0.28 ns is implemented in a Xilinx Kintex-7 FPGA and tested. The TDC is based on a multisampling scheme with quad-phase clocks synchronised with an external reference clock. Calibration of the bin size is unnecessary if a stable reference clock is available, which is common in high-energy physics experiments. Depending on the channel, the standard deviation of the differential nonlinearity for a 0.28 ns bin size is 0.13-0.31. The performance has a negligible dependence on temperature. The power consumption and the potential to extend the number of channels are also discussed.
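As an illustration of the multisampling idea (not the paper's firmware), the sketch below decodes which of four clock phases first samples the input high, localizing a rising edge to a quarter of the sampling clock period. The 1.12 ns clock period is an assumption chosen so that the fine bin comes out at the paper's 0.28 ns:

```python
# Toy model of a quad-phase multisampling TDC: four clock phases
# (0/90/180/270 degrees) sample the input within one coarse period, so a
# rising edge is localized to a quarter-period fine bin.
# CLOCK_PERIOD_NS is an assumed value, picked to give 0.28 ns bins.

CLOCK_PERIOD_NS = 1.12
FINE_BIN_NS = CLOCK_PERIOD_NS / 4       # 0.28 ns

def decode_fine_bin(samples):
    """samples: the four levels captured by the successive clock phases.
    Returns the index of the first phase that sees the signal high,
    i.e. the fine time bin of the rising edge, or None if no edge."""
    for bin_index, level in enumerate(samples):
        if level:
            return bin_index
    return None

def timestamp_ns(coarse_count, samples):
    """Combine a coarse period counter with the decoded fine bin."""
    fine = decode_fine_bin(samples)
    if fine is None:
        return None
    return coarse_count * CLOCK_PERIOD_NS + fine * FINE_BIN_NS
```

Because the fine bin is tied directly to the reference clock phases, its size needs no calibration as long as the reference clock is stable, which matches the behaviour described in the abstract.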
A High-Availability, Distributed Hardware Control System Using Java
NASA Technical Reports Server (NTRS)
Niessner, Albert F.
2011-01-01
Two independent coronagraph experiments that require 24/7 availability, with different optical layouts and different motion control requirements, are commanded and controlled by the same Java software system executing on many geographically scattered computer systems interconnected via TCP/IP. High availability of a distributed system requires a robust communication messaging system among the computers, making the mix of TCP/IP (a robust transport) and XML (a robust message format) a natural choice. XML also adds configuration flexibility. Java then adds object-oriented paradigms, exception handling, heavily tested libraries, and many third-party tools for implementation robustness. The result is a software system that provides users 24/7 access to two diverse experiments, with XML files defining the differences.
Validation environment for AIPS/ALS: Implementation and results
NASA Technical Reports Server (NTRS)
Segall, Zary; Siewiorek, Daniel; Caplan, Eddie; Chung, Alan; Czeck, Edward; Vrsalovic, Dalibor
1990-01-01
This paper presents the work performed in porting the Fault Injection-based Automated Testing (FIAT) and Programming and Instrumentation Environments (PIE) validation tools to the Advanced Information Processing System (AIPS) in the context of the Ada Language System (ALS) application, as well as an initial fault-free validation of the available AIPS system. The PIE components implemented on AIPS provide the monitoring mechanisms required for validation. These mechanisms represent a substantial portion of the FIAT system and are required for the implementation of the FIAT environment on AIPS. Using these components, an initial fault-free validation of the AIPS system was performed. The implementation of the FIAT/PIE system, configured for fault-free validation of the AIPS fault-tolerant computer system, is described. The PIE components were modified to support the Ada language, a special-purpose AIPS/Ada runtime monitoring and data collection mechanism was implemented, and a number of initial Ada programs running on the PIE/AIPS system were written. The instrumentation of the Ada programs was accomplished automatically inside the PIE programming environment. PIE's online graphical views show vividly and accurately the performance characteristics of Ada programs, the AIPS kernel, and the application's interaction with the AIPS kernel. The data collection mechanisms were written in a high-level language, Ada, and provide a high degree of flexibility for implementation under various system conditions.
Jourdin, Ludovic; Freguia, Stefano; Flexer, Victoria; Keller, Jurg
2016-02-16
The enhancement of microbial electrosynthesis (MES) of acetate from CO2 to performance levels that could potentially support practical implementation of the technology must go through the optimization of key design and operating conditions. We report that higher proton availability drastically increases the acetate production rate, with pH 5.2 found to be optimal; this pH will also likely suppress methanogenic activity without inhibitor addition. An applied cathode potential as low as -1.1 V versus SHE still achieved 99% electron recovery in the form of acetate at a current density of around -200 A m(-2). These current densities led to an exceptional acetate production rate of up to 1330 g m(-2) day(-1) at pH 6.7. Highly open macroporous reticulated vitreous carbon electrodes with macropore sizes of about 0.6 mm in diameter were found to be optimal for achieving a good balance between the total surface area available for biofilm formation and effective mass transfer between the bulk liquid and the electrode and biofilm surface. Furthermore, we also successfully demonstrated the use of a synthetic biogas mixture as the carbon dioxide source, yielding MES performance similarly high to that with pure CO2. This would allow the process to be used effectively both for biogas quality improvement and for conversion of the available CO2 to acetate.
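Assuming the usual 8-electron reduction of CO2 to acetate (2 CO2 + 8 H+ + 8 e- -> CH3COOH + 2 H2O), the reported figures can be sanity-checked with a short Faradaic calculation; the sketch below computes the maximum acetate production rate at 100% electron recovery for a given current density:

```python
# Back-of-the-envelope check of the reported MES numbers, assuming the
# standard 8-electron stoichiometry 2 CO2 + 8 H+ + 8 e- -> CH3COOH + 2 H2O.

FARADAY = 96485.0        # C per mol of electrons
M_ACETATE = 60.05        # g/mol, acetic acid

def max_acetate_rate(current_density_A_m2):
    """Maximum acetate production rate (g m^-2 day^-1) at 100% electron
    recovery, for the magnitude of the cathodic current density."""
    charge_per_day = current_density_A_m2 * 86400.0   # C m^-2 day^-1
    mol_electrons = charge_per_day / FARADAY
    return (mol_electrons / 8.0) * M_ACETATE

rate = max_acetate_rate(200.0)   # roughly 1.3e3 g m^-2 day^-1
```

At 200 A m(-2) this ceiling sits just above the reported 1330 g m(-2) day(-1), consistent with the ~99% electron recovery quoted in the abstract.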
NASA Astrophysics Data System (ADS)
Zakharova, Alexandra A.; Kolegova, Olga A.; Nekrasova, Maria E.
2016-04-01
The paper deals with issues in the management of programs for engineering innovative products. The existing project management tools were analyzed. The aim is to develop a decision support system that takes into account the features of program management for high-tech products: research intensity, a high level of technical risk, unpredictable results due to the impact of various external factors, and the involvement of several implementing agencies. The need for involving experts and using intelligent techniques for information processing is demonstrated. A conceptual model of a common information space to support communication between members of the collaboration on high-tech programs has been developed. The structure and objectives of the information analysis system “Geokhod” were formulated with the purpose of implementing the conceptual model of a common information space in the program “Development and production of new class mining equipment - Geokhod”.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-26
... Proposed United States Regional Climate Reference Network (USRCRN) AGENCY: National Weather Service (NWS..., is proposing to implement, operate, and manage a USRCRN. With other climate monitoring efforts..., high-quality climate data for use in climate-monitoring activities and for placing current climate...
MPRAnator: a web-based tool for the design of massively parallel reporter assay experiments
Georgakopoulos-Soares, Ilias; Jain, Naman; Gray, Jesse M; Hemberg, Martin
2017-01-01
Motivation: With the rapid advances in DNA synthesis and sequencing technologies and the continuing decline in the associated costs, high-throughput experiments can be performed to investigate the regulatory role of thousands of oligonucleotide sequences simultaneously. Nevertheless, designing high-throughput reporter assay experiments such as massively parallel reporter assays (MPRAs) and similar methods remains challenging. Results: We introduce MPRAnator, a set of tools that facilitate rapid design of MPRA experiments. With MPRA Motif design, a set of variables provides fine control of how motifs are placed into sequences, thereby allowing the investigation of the rules that govern transcription factor (TF) occupancy. MPRA single-nucleotide polymorphism design can be used to systematically examine the functional effects of single or combinations of single-nucleotide polymorphisms at regulatory sequences. Finally, the Transmutation tool allows for the design of negative controls by permitting scrambling, reversing, complementing or introducing multiple random mutations in the input sequences or motifs. Availability and implementation: MPRAnator tool set is implemented in Python, Perl and Javascript and is freely available at www.genomegeek.com and www.sanger.ac.uk/science/tools/mpranator. The source code is available on www.github.com/hemberg-lab/MPRAnator/ under the MIT license. The REST API allows programmatic access to MPRAnator using simple URLs. Contact: igs@sanger.ac.uk or mh26@sanger.ac.uk Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27605100
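The motif-placement logic that such a design tool automates can be sketched as follows; the function names and parameters here are illustrative, not MPRAnator's actual API:

```python
# Hedged sketch of MPRA motif design: placing a transcription factor motif
# into a background sequence at controlled offsets, producing one designed
# oligo per placement. Helper names are hypothetical, not MPRAnator's.

def place_motif(background, motif, offset):
    """Overwrite the background sequence with the motif at a given offset."""
    if offset < 0 or offset + len(motif) > len(background):
        raise ValueError("motif does not fit at this offset")
    return background[:offset] + motif + background[offset + len(motif):]

def tile_motif(background, motif, step):
    """One designed sequence per allowed offset, stepping by `step` bases,
    so TF occupancy can be probed as a function of motif position."""
    last = len(background) - len(motif)
    return [place_motif(background, motif, off)
            for off in range(0, last + 1, step)]

designs = tile_motif("A" * 20, "GATA", 8)   # placements at offsets 0, 8, 16
```

Negative-control generation (scrambling, reversing, complementing) in the Transmutation tool is conceptually similar string manipulation applied to the motif or the whole input sequence.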
Ma, Jian; Casey, Cameron P.; Zheng, Xueyun; Ibrahim, Yehia M.; Wilkins, Christopher S.; Renslow, Ryan S.; Thomas, Dennis G.; Payne, Samuel H.; Monroe, Matthew E.; Smith, Richard D.; Teeguarden, Justin G.; Baker, Erin S.; Metz, Thomas O.
2017-01-01
Abstract Motivation: Drift tube ion mobility spectrometry coupled with mass spectrometry (DTIMS-MS) is increasingly implemented in high throughput omics workflows, and new informatics approaches are necessary for processing the associated data. To automatically extract arrival times for molecules measured by DTIMS at multiple electric fields and compute their associated collisional cross sections (CCS), we created the PNNL Ion Mobility Cross Section Extractor (PIXiE). The primary application presented for this algorithm is the extraction of data that can then be used to create a reference library of experimental CCS values for use in high throughput omics analyses. Results: We demonstrate the utility of this approach by automatically extracting arrival times and calculating the associated CCSs for a set of endogenous metabolites and xenobiotics. The PIXiE-generated CCS values were within error of those calculated using commercially available instrument vendor software. Availability and implementation: PIXiE is an open-source tool, freely available on Github. The documentation, source code of the software, and a GUI can be found at https://github.com/PNNL-Comp-Mass-Spec/PIXiE and the source code of the backend workflow library used by PIXiE can be found at https://github.com/PNNL-Comp-Mass-Spec/IMS-Informed-Library. Contact: erin.baker@pnnl.gov or thomas.metz@pnnl.gov Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28505286
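In stepped-field DTIMS, arrival time is linear in the reciprocal drift voltage (t = t0 + (L^2/K)(1/V)), so fitting arrival times measured across several fields yields the mobility K, which the Mason-Schamp equation then converts to a CCS. The sketch below covers only the fitting step, with an assumed drift length; it is not PIXiE's implementation:

```python
# Illustrative sketch of the stepped-field step of CCS determination:
# fit measured arrival times against 1/V to recover the mobility K from
# t = t0 + (L^2 / K) * (1/V). DRIFT_LENGTH_CM is an assumed instrument
# parameter, not a value from the paper.

def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

DRIFT_LENGTH_CM = 90.0

def mobility_from_scan(voltages, arrival_times_s):
    """Fit arrival time vs 1/V; slope = L^2 / K gives the mobility K."""
    inv_v = [1.0 / v for v in voltages]
    slope, t0 = linear_fit(inv_v, arrival_times_s)
    K = DRIFT_LENGTH_CM ** 2 / slope     # cm^2 V^-1 s^-1
    return K, t0
```

The recovered mobility (reduced to standard conditions) feeds the Mason-Schamp equation, which also requires the ion's charge, the buffer gas mass, and the temperature, to produce the CCS value stored in the reference library.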
Yu, Yi-Kuo; Capra, John A.; Stojmirović, Aleksandar; Landsman, David; Altschul, Stephen F.
2015-01-01
Motivation: DNA and protein patterns are usefully represented by sequence logos. However, the methods for logo generation in common use lack a proper statistical basis, and are non-optimal for recognizing functionally relevant alignment columns. Results: We redefine the information at a logo position as a per-observation multiple alignment log-odds score. Such scores are positive or negative, depending on whether a column’s observations are better explained as arising from relatedness or chance. Within this framework, we propose distinct normalized maximum likelihood and Bayesian measures of column information. We illustrate these measures on High Mobility Group B (HMGB) box proteins and a dataset of enzyme alignments. Particularly in the context of protein alignments, our measures improve the discrimination of biologically relevant positions. Availability and implementation: Our new measures are implemented in an open-source Web-based logo generation program, which is available at http://www.ncbi.nlm.nih.gov/CBBresearch/Yu/logoddslogo/index.html. A stand-alone version of the program is also available from this site. Contact: altschul@ncbi.nlm.nih.gov Supplementary information: Supplementary data are available at Bioinformatics online. PMID:25294922
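One simple flavor of a per-observation column score in this log-odds spirit is the column's relative entropy against background letter frequencies; the paper's normalized maximum likelihood and Bayesian measures are more refined, but the sketch below conveys the underlying idea:

```python
# Simplified column score in the log-odds spirit: the average per-observation
# log-ratio between a column's letter frequencies and the background
# (i.e. the column's relative entropy, in bits). This is only a toy stand-in
# for the paper's NML and Bayesian measures.
import math

def column_score(column, background):
    """column: string of aligned residues at one alignment position.
    background: dict mapping letters to background frequencies."""
    n = len(column)
    counts = {}
    for letter in column:
        counts[letter] = counts.get(letter, 0) + 1
    score = 0.0
    for letter, c in counts.items():
        f = c / n
        score += f * math.log2(f / background[letter])
    return score   # bits per observation

bg = {"A": 0.25, "C": 0.25, "G": 0.25, "T": 0.25}
```

Note this toy score is non-negative, whereas the paper's measures can go negative for columns better explained by chance than by relatedness; that sign behaviour is exactly what the more careful statistical treatment buys.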
High-Speed, Low-Cost Workstation for Computation-Intensive Statistics. Phase 1
1990-06-20
routine implementation and performance. The two compiled versions given in the table were coded in an attempt to obtain an optimized compiled version...level statistics and linear algebra routines (BSAS and BLAS) that have been prototyped in this study. For each routine, both the C code (Turbo C)...High-performance and low-cost
ERIC Educational Resources Information Center
Chait, Robin; Hardcastle, Daphne; Kotzin, Stacy; LaPointe, Michelle; Miller, Meredith; Rimdzius, Tracy; Sanchez, Susan; Scott, Elois; Stullich, Stephanie; Thompson-Hoffman, Susan
This report provides a comprehensive summary of the most recent data available from the National Assessment of Title I on the implementation of the Title I program and the academic performance of children in high poverty schools. Seven sections focus on: (1) "Policy Context for Title I" (provisions of the current Title I law and new…
Pradhan, Nousheen Akber; Rizvi, Narjis; Sami, Neelofar; Gul, Xaher
2013-07-05
Integrated management of childhood illnesses (IMCI) strategy has been proven to improve health outcomes in children under 5 years of age. Pakistan, despite being in the late implementation phase of the strategy, continues to report high under-five mortality due to pneumonia, diarrhea, measles, and malnutrition - the main targets of the strategy. The study determines the factors influencing IMCI implementation at public-sector primary health care (PHC) facilities in Matiari district, Sindh, Pakistan. An exploratory qualitative study with an embedded quantitative strand was conducted. The qualitative part included 16 in-depth interviews (IDIs) with stakeholders which included planners and policy makers at a provincial level (n=5), implementers and managers at a district level (n=3), and IMCI-trained physicians posted at PHC facilities (n=8). Quantitative part included PHC facility survey (n=16) utilizing WHO health facility assessment tool to assess availability of IMCI essential drugs, supplies, and equipments. Qualitative content analysis was used to interpret the textual information, whereas descriptive frequencies were calculated for health facility survey data. The major factors reported to enhance IMCI implementation were knowledge and perception about the strategy and need for separate clinic for children aged under 5 years as potential support factors. The latter can facilitate in strategy implementation through allocated workforce and required equipments and supplies. Constraint factors mainly included lack of clear understanding of the strategy, poor planning for IMCI implementation, ambiguity in defined roles and responsibilities among stakeholders, and insufficient essential supplies and drugs at PHC centers. The latter was further substantiated through health facilities' survey findings, which indicated that none of the facilities had 100% stock of essential supplies and drugs. 
Only one out of all 16 surveyed facilities had 75% of the total supplies, while 4 out of 16 facilities had 56% of the required IMCI drug stock. The mean availability of supplies ranged from 36.6 to 66%, while the mean availability of drugs ranged from 45.8 to 56.7%. Our findings indicate that the Matiari district has sound implementation potential; however, bottlenecks at health care facility and at health care management level have badly constrained the implementation process. An interdependency exists among the constraining factors, such as lack of sound planning resulting in unclear understanding of the strategy; leading to ambiguous roles and responsibilities among stakeholders which manifest as inadequate availability of supplies and drugs at PHC facilities. Addressing these barriers is likely to have a cumulative effect on facilitating IMCI implementation. On the basis of these findings, we recommend that the provincial Ministry of Health (MoH) and provincial Maternal Neonatal and Child Health (MNCH) program jointly assess the situation and streamline IMCI implementation in the district through sound planning, training, supervision, and logistic support.
Looking Forward: The Promise of Widespread Implementation of Parent Training Programs
Forgatch, Marion S.; Patterson, Gerald R.; Gewirtz, Abigail H.
2013-01-01
Over the past quarter century a body of parent training programs has been developed and validated as effective in reducing child behavior problems, but few of these have made their way into routine practice. This article describes the long and winding road of implementation as applied to children's mental health. Adopting Rogers' (1995) diffusion framework and Fixsen and colleagues' implementation framework (Fixsen, Naoom, Blase, Friedman, & Wallace, 2005), we review more than a decade of research on the implementation of Parent Management Training – Oregon Model (PMTO®). Data from US and international PMTO implementations are used to illustrate the payoffs and the challenges of making empirically supported interventions routine practice in the community. Technological advances that break down barriers to communication across distances, the availability of efficacious programs suitable for implementation, and the urgent need for high quality mental health care provide strong rationales for prioritizing attention to implementation. Over the next quarter of a century, the challenge is to reduce the prevalence of children's psychopathology by creating science-based delivery systems to reach families in need, everywhere. PMID:24443650
High speed micromachining with high power UV laser
NASA Astrophysics Data System (ADS)
Patel, Rajesh S.; Bovatsek, James M.
2013-03-01
Increasing demand for creating fine features with high accuracy in the manufacturing of electronic mobile devices has fueled growth for lasers in manufacturing. High-power, high-repetition-rate ultraviolet (UV) lasers provide an opportunity to implement a cost-effective, high-quality, high-throughput micromachining process in a 24/7 manufacturing environment. The energy available per pulse and the pulse repetition frequency (PRF) of diode-pumped solid-state (DPSS) nanosecond UV lasers have increased steadily over the years. Efficient use of the available energy from a laser is important to generate accurate fine features at high speed with high quality. To achieve maximum material removal and minimal thermal damage in any laser micromachining application, use of the optimal process parameters, including energy density or fluence (J/cm2), pulse width, and repetition rate, is important. In this study we present the new high-power, high-PRF Quasar® 355-40 laser from Spectra-Physics with TimeShift™ technology for unique software-adjustable pulse width, pulse splitting, and pulse shaping capabilities. The benefits of these features for micromachining include improved throughput and quality. A specific example and results of silicon scribing are described to demonstrate the processing benefits of the Quasar's available power, PRF, and TimeShift technology.
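The fluence bookkeeping behind this kind of process optimization is straightforward: pulse energy is average power divided by PRF, and fluence is pulse energy over the focused spot area. A short sketch with illustrative numbers (the 200 kHz operating point and the 20 um spot size are assumptions, not values from the paper):

```python
# Fluence arithmetic for pulsed-laser micromachining:
#   pulse energy (J) = average power (W) / PRF (Hz)
#   fluence (J/cm^2) = pulse energy / focused spot area
# The 40 W / 200 kHz / 20 um numbers below are illustrative only.
import math

def pulse_energy_J(avg_power_W, prf_Hz):
    return avg_power_W / prf_Hz

def fluence_J_cm2(avg_power_W, prf_Hz, spot_diameter_um):
    radius_cm = (spot_diameter_um * 1e-4) / 2.0
    area_cm2 = math.pi * radius_cm ** 2
    return pulse_energy_J(avg_power_W, prf_Hz) / area_cm2

f = fluence_J_cm2(40.0, 200e3, 20.0)   # on the order of tens of J/cm^2
```

The trade-off the abstract alludes to is visible here: raising the PRF at fixed average power lowers the per-pulse fluence, so throughput and ablation threshold must be balanced together.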
Localizer: fast, accurate, open-source, and modular software package for superresolution microscopy
Duwé, Sam; Neely, Robert K.; Zhang, Jin
2012-01-01
Abstract. We present Localizer, a freely available and open source software package that implements the computational data processing inherent to several types of superresolution fluorescence imaging, such as localization (PALM/STORM/GSDIM) and fluctuation imaging (SOFI/pcSOFI). Localizer delivers high accuracy and performance and comes with a fully featured and easy-to-use graphical user interface but is also designed to be integrated in higher-level analysis environments. Due to its modular design, Localizer can be readily extended with new algorithms as they become available, while maintaining the same interface and performance. We provide front-ends for running Localizer from Igor Pro, Matlab, or as a stand-alone program. We show that Localizer performs favorably when compared with two existing superresolution packages, and to our knowledge is the only freely available implementation of SOFI/pcSOFI microscopy. By dramatically improving the analysis performance and ensuring the easy addition of current and future enhancements, Localizer strongly improves the usability of superresolution imaging in a variety of biomedical studies. PMID:23208219
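The core estimation step in localization imaging can be illustrated with an intensity-weighted centroid of a single emitter's pixelated spot; packages such as Localizer fit a Gaussian PSF instead, which is more accurate, so this is only a toy sketch of the idea:

```python
# Toy localization step: estimate an emitter's sub-pixel position from its
# pixelated spot with an intensity-weighted centroid. Real superresolution
# packages fit a Gaussian PSF model, which outperforms the plain centroid.

def centroid(image):
    """image: 2D list of pixel intensities. Returns (row, col) centroid."""
    total = sum(sum(row) for row in image)
    r = sum(i * sum(row) for i, row in enumerate(image)) / total
    c = sum(j * v for row in image for j, v in enumerate(row)) / total
    return r, c

spot = [[0, 1, 0],
        [1, 4, 1],
        [0, 1, 0]]
# A symmetric spot centers on the middle pixel.
```

Repeating this estimate over thousands of sparse emitter activations, and then rendering the accumulated positions, is what turns a diffraction-limited movie into a superresolved image.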
Families of FPGA-Based Accelerators for Approximate String Matching
Van Court, Tom; Herbordt, Martin C.
2011-01-01
Dynamic programming for approximate string matching is a large family of different algorithms, which vary significantly in purpose, complexity, and hardware utilization. Many implementations have reported impressive speed-ups, but have typically been point solutions – highly specialized and addressing only one or a few of the many possible options. The problem to be solved is creating a hardware description that implements a broad range of behavioral options without losing efficiency due to feature bloat. We report a set of three component types that address different parts of the approximate string matching problem. This allows each application to choose the feature set required, then make maximum use of the FPGA fabric according to that application’s specific resource requirements. Multiple, interchangeable implementations are available for each component type. We show that these methods allow the efficient generation of a large, if not complete, family of accelerators for this application. This flexibility was obtained while retaining high performance: We have evaluated a sample against serial reference codes and found speed-ups of from 150× to 400× over a high-end PC. PMID:21603598
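The recurrence these accelerators parallelize is, in its simplest serial form, the edit-distance dynamic program; an FPGA maps the table's anti-diagonals onto a systolic array of processing elements, but the recurrence itself is unchanged:

```python
# The classic edit-distance dynamic program, the simplest member of the
# approximate string matching family. Hardware accelerators compute the
# cells of each anti-diagonal in parallel; the recurrence is identical.

def edit_distance(a, b):
    prev = list(range(len(b) + 1))          # row for the empty prefix of a
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                  # deletion
                            curr[j - 1] + 1,              # insertion
                            prev[j - 1] + (ca != cb)))    # match/substitute
        prev = curr
    return prev[-1]

assert edit_distance("kitten", "sitting") == 3
```

Variants in the family (local alignment, affine gaps, banded search) change the scoring terms of this recurrence, which is why a component-based hardware description, as the paper proposes, can cover many of them without a full redesign.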
Pas, Elise T.; Loh, Deanna; Debnam, Katrina J.; Bradshaw, Catherine P.
2016-01-01
Although evidence-based practices for students’ social, emotional, and behavioral health are readily available, their adoption and quality implementation in schools are of increasing concern. Teachers are vital to implementation; yet, there is limited research on teachers’ openness to adopting new practices, which may be essential to successful program adoption and implementation. The current study explored how perceptions of principal support, teacher affiliation, teacher efficacy, and burnout relate to teachers’ openness to new practices. Data came from 2,133 teachers across 51 high schools. Structural equation modeling assessed how organizational climate (i.e., principal support and teacher affiliation) related to teachers’ openness directly and indirectly via teacher resources (i.e., efficacy and burnout). Teachers with more favorable perceptions of both principal support and teacher affiliation reported greater efficacy, and, in turn, more openness; however, burnout was not significantly associated with openness. Post hoc analyses indicated that among teachers with high levels of burnout, only principal support related to greater efficacy, and in turn, higher openness. Implications for promoting teachers’ openness to new program adoption are discussed. PMID:28533823
DOE Office of Scientific and Technical Information (OSTI.GOV)
Trędak, Przemysław, E-mail: przemyslaw.tredak@fuw.edu.pl; Rudnicki, Witold R.; Interdisciplinary Centre for Mathematical and Computational Modelling, University of Warsaw, ul. Pawińskiego 5a, 02-106 Warsaw
The second-generation Reactive Empirical Bond Order (REBO) potential is commonly used to accurately model a wide range of hydrocarbon materials. It is also extensible to other atom types and interactions. The REBO potential assumes a complex multi-body interaction model that is difficult to represent efficiently in the SIMD or SIMT programming model. Hence, despite its importance, no efficient GPGPU implementation has been developed for this potential. Here we present a detailed description of a highly efficient GPGPU implementation of a molecular dynamics algorithm using the REBO potential. The presented algorithm takes advantage of rarely used properties of the SIMT architecture of a modern GPU to solve difficult synchronization issues that arise in computations of multi-body potentials. Techniques developed for this problem may also be used to achieve efficient solutions of different problems. The performance of the proposed algorithm is assessed using a range of model systems. It is compared to the highly optimized CPU implementation (both single-core and OpenMP) available in the LAMMPS package. These experiments show up to 6x improvement in forces computation time using a single processor of the NVIDIA Tesla K80 compared to a high-end 16-core Intel Xeon processor.
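One way to picture the synchronization issue mentioned above: a naive "scatter" force loop has thread i writing into neighbor j's accumulator, racing with other threads. A common SIMT-friendly remedy, sketched below on a 1-D toy pair potential (not the paper's algorithm and not the REBO functional form), is a "gather" loop in which each atom sums contributions from its own neighbor list, so no two threads ever write the same location:

```python
def pair_force(r: float) -> float:
    # Toy Lennard-Jones-style force magnitude; stands in for a real potential.
    return 24.0 * (2.0 / r**13 - 1.0 / r**7)

def forces_gather(xs, cutoff=3.0):
    """Gather pattern: particle i accumulates only into f[i] (race-free)."""
    n = len(xs)
    f = [0.0] * n
    for i in range(n):            # on a GPU, one thread per i
        for j in range(n):
            if i == j:
                continue
            d = xs[i] - xs[j]
            if abs(d) < cutoff:
                f[i] += pair_force(abs(d)) * (1.0 if d > 0 else -1.0)
    return f

f = forces_gather([0.0, 1.1, 2.5])
assert abs(sum(f)) < 1e-9  # Newton's third law: net force ~ 0
```

The gather pattern evaluates each pair force twice, giving up the Newton's-third-law saving a scatter loop would get; engineering around that trade-off is exactly where GPU implementations of multi-body potentials get difficult.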
Schröder, Jan; Hsu, Arthur; Boyle, Samantha E.; Macintyre, Geoff; Cmero, Marek; Tothill, Richard W.; Johnstone, Ricky W.; Shackleton, Mark; Papenfuss, Anthony T.
2014-01-01
Motivation: Methods for detecting somatic genome rearrangements in tumours using next-generation sequencing are vital in cancer genomics. Available algorithms use one or more sources of evidence, such as read depth, paired-end reads or split reads to predict structural variants. However, the problem remains challenging due to the significant computational burden and high false-positive or false-negative rates. Results: In this article, we present Socrates (SOft Clip re-alignment To idEntify Structural variants), a highly efficient and effective method for detecting genomic rearrangements in tumours that uses only split-read data. Socrates has single-nucleotide resolution, identifies micro-homologies and untemplated sequence at break points, has high sensitivity and high specificity and takes advantage of parallelism for efficient use of resources. We demonstrate using simulated and real data that Socrates performs well compared with a number of existing structural variant detection tools. Availability and implementation: Socrates is released as open source and available from http://bioinf.wehi.edu.au/socrates. Contact: papenfuss@wehi.edu.au Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24389656
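Split-read evidence of the kind Socrates uses starts from soft-clipped alignments. As a rough illustration (standard SAM/BAM CIGAR conventions, not code from Socrates itself), soft clips are the 'S' operations at either end of a CIGAR string, and the clipped sequence is what gets re-aligned to locate the other side of a breakpoint:

```python
import re

# CIGAR operations per the SAM specification: M, I, D, N, S, H, P, =, X.
CIGAR_OP = re.compile(r"(\d+)([MIDNSHP=X])")

def soft_clips(cigar: str):
    """Return (leading, trailing) soft-clip lengths of an alignment."""
    ops = CIGAR_OP.findall(cigar)
    lead = int(ops[0][0]) if ops and ops[0][1] == "S" else 0
    trail = int(ops[-1][0]) if ops and ops[-1][1] == "S" else 0
    return lead, trail

assert soft_clips("35S65M") == (35, 0)   # clipped at the read's start
assert soft_clips("60M40S") == (0, 40)   # clipped at the read's end
assert soft_clips("100M") == (0, 0)      # fully aligned, no split evidence
```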
Pre-implementation guidelines for infectious disease point-of-care testing in medical institutions.
van der Eijk, Annemiek A; Tintu, Andrei N; Hays, John P
2017-01-01
Infectious disease point-of-care test (ID-POCT) devices are becoming widely available, and in this respect, international quality standards and guidelines are available for consultation once ID-POCT has been implemented into medical institutions. However, specific guidelines for consultation during the initial pre-implementation decision-making process are currently lacking. Further, there exist pre-implementation issues specific to ID-POCT. Here we present pre-implementation guidelines for consultation when considering the implementation of ID-POCT in medical institutions.
Reilly, Kathryn L; Reeves, Penny; Deeming, Simon; Yoong, Sze Lin; Wolfenden, Luke; Nathan, Nicole; Wiggers, John
2018-03-20
No evaluations of the cost or cost-effectiveness of interventions to increase school implementation of food availability policies have been reported. Government and non-government agency decisions regarding the extent of investment required to enhance school implementation of such policies are unsupported by such evidence. This study sought to i) determine the cost and cost-effectiveness of three interventions in improving school implementation of an Australian government healthy canteen policy; and ii) determine the relative cost-effectiveness of the interventions in improving school implementation of such a policy. An analysis of the cost and cost-effectiveness of three implementation interventions of varying support intensity, relative to usual implementation support, conducted during 2013-2015 was undertaken. Secondly, an indirect comparison of the trials was undertaken to determine the most cost-effective of the three strategies. The economic analysis was based on the cost of delivering the interventions by health service delivery staff to increase the proportion of schools 'adherent' with the policy. The total costs per school were $166,971, $70,926 and $75,682 for the high, medium and low intensity interventions respectively. Compared to usual support, the cost-effectiveness ratios for each of the three interventions were: A$2982 (high intensity), A$2627 (medium intensity) and A$4730 (low intensity) per percent increase in the proportion of schools reporting 'adherence'. Indirect comparison between the 'high' and 'medium intensity' interventions showed no statistically significant difference in cost-effectiveness. The results indicate that while the cost profiles of the interventions varied substantially, the cost-effectiveness did not. This result is valuable to policy makers seeking cost-effective solutions that can be delivered within budget.
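The reported ratios are straightforward to interpret: intervention cost divided by the percentage-point increase in adherent schools. The effect size below is hypothetical, back-calculated purely for illustration; the abstract reports costs and ratios, not the underlying increases:

```python
def ce_ratio(cost: float, pct_point_increase: float) -> float:
    """Cost-effectiveness ratio: dollars per percentage-point gain."""
    return cost / pct_point_increase

# With the reported high-intensity cost and a hypothetical ~56-point
# increase in adherence, the ratio lands near the reported A$2982:
assert round(ce_ratio(166_971, 56.0)) == 2982
```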
Early experiences in developing and managing the neuroscience gateway.
Sivagnanam, Subhashini; Majumdar, Amit; Yoshimoto, Kenneth; Astakhov, Vadim; Bandrowski, Anita; Martone, MaryAnn; Carnevale, Nicholas T
2015-02-01
The last few decades have seen the emergence of computational neuroscience as a mature field in which researchers are interested in modeling complex and large neuronal systems and require access to high performance computing machines and associated cyber infrastructure to manage computational workflow and data. The neuronal simulation tools used in this research field are also implemented for parallel computers and are suitable for high performance computing machines. But using these tools on complex high performance computing machines remains a challenge because of issues with acquiring computer time on machines located at national supercomputer centers, dealing with the complex user interfaces of these machines, and managing data storage and retrieval. The Neuroscience Gateway is being developed to alleviate and/or hide these barriers to entry for computational neuroscientists. It hides or eliminates, from the point of view of the users, all the administrative and technical barriers and makes parallel neuronal simulation tools easily available and accessible on complex high performance computing machines. It handles the running of jobs and data management and retrieval. This paper shares the early experiences in bringing up this gateway and describes the software architecture it is based on, how it is implemented, and how users can use it for computational neuroscience research using high performance computing at the back end. We also look at parallel scaling of some publicly available neuronal models and analyze the recent usage data of the neuroscience gateway.
Early experiences in developing and managing the neuroscience gateway
Sivagnanam, Subhashini; Majumdar, Amit; Yoshimoto, Kenneth; Astakhov, Vadim; Bandrowski, Anita; Martone, MaryAnn; Carnevale, Nicholas. T.
2015-01-01
The last few decades have seen the emergence of computational neuroscience as a mature field in which researchers are interested in modeling complex and large neuronal systems and require access to high performance computing machines and associated cyber infrastructure to manage computational workflow and data. The neuronal simulation tools used in this research field are also implemented for parallel computers and are suitable for high performance computing machines. But using these tools on complex high performance computing machines remains a challenge because of issues with acquiring computer time on machines located at national supercomputer centers, dealing with the complex user interfaces of these machines, and managing data storage and retrieval. The Neuroscience Gateway is being developed to alleviate and/or hide these barriers to entry for computational neuroscientists. It hides or eliminates, from the point of view of the users, all the administrative and technical barriers and makes parallel neuronal simulation tools easily available and accessible on complex high performance computing machines. It handles the running of jobs and data management and retrieval. This paper shares the early experiences in bringing up this gateway and describes the software architecture it is based on, how it is implemented, and how users can use it for computational neuroscience research using high performance computing at the back end. We also look at parallel scaling of some publicly available neuronal models and analyze the recent usage data of the neuroscience gateway. PMID:26523124
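The gateway's role described above (hiding submission, queueing, and retrieval behind one interface) can be caricatured in a few lines. Everything here is hypothetical: a sketch of the idea, not the Neuroscience Gateway's actual API:

```python
class GatewayJob:
    """Toy stand-in for a science-gateway job: the user never touches
    the scheduler, allocation, or file staging on the HPC back end."""

    def __init__(self, tool: str, model_file: str, cores: int):
        self.tool, self.model_file, self.cores = tool, model_file, cores
        self.state = "SUBMITTED"

    def poll(self) -> str:
        # A real gateway would query the back-end batch scheduler here.
        self.state = "DONE"
        return self.state

    def fetch_results(self) -> str:
        if self.state != "DONE":
            raise RuntimeError("job still running")
        return f"results for {self.model_file}"

job = GatewayJob("NEURON", "net.hoc", cores=128)  # hypothetical job spec
job.poll()
assert job.fetch_results() == "results for net.hoc"
```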
One GigaSample Per Second Data Acquisition using Available Gate Array Technology
NASA Technical Reports Server (NTRS)
Wagner, K.W.
1999-01-01
A new National Aeronautics and Space Administration instrument forced demanding requirements upon its altimeter digitizer system. Eight-bit data would be generated at a rate of one billion samples per second. NASA had never before attempted to capture such high-speed data in the radiation, low-power, no-convective-cooling, limited-board-area environment of space. This presentation describes how the gate array technology available at the time of the design was used to implement this one gigasample per second data acquisition system.
Benefits and Threats to Using Social Media for Presenting and Implementing Evidence.
Cook, Chad E; O'Connell, Neil E; Hall, Toby; George, Steven Z; Jull, Gwendolen; Wright, Alexis A; Girbés, Enrique Lluch; Lewis, Jeremy; Hancock, Mark
2018-01-01
As a potential high-yield tool for disseminating information that can reach many people, social media is transforming how clinicians, the public, and policy makers are educated and find new knowledge associated with research-related information. Social media is available to all who access the internet, reducing selected barriers to acquiring original source documents such as journal articles or books and potentially improving implementation: the process of formulating a conclusion and moving on that decision. The use of social media for evidence dissemination/implementation of research has both benefits and threats. It is the aim of this Viewpoint to provide a balanced view of each. J Orthop Sports Phys Ther 2018;48(1):3-7. doi:10.2519/jospt.2018.0601.
Local Alignment Tool Based on Hadoop Framework and GPU Architecture
Hung, Che-Lun; Hua, Guan-Jie
2014-01-01
With the rapid growth of next generation sequencing technologies, such as Slex, more and more data have been discovered and published. To analyze such huge data sets, computational performance is an important issue. Recently, many tools, such as SOAP, have been implemented on Hadoop and GPU parallel computing architectures. BLASTP is an important tool, implemented on GPU architectures, for biologists to compare protein sequences. To deal with big biology data, it is hard to rely on a single GPU. Therefore, we implement a distributed BLASTP by combining Hadoop and multiple GPUs. The experimental results show that the proposed method improves on the performance of BLASTP on a single GPU, and also achieves high availability and fault tolerance. PMID:24955362
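The combination described above (Hadoop for distribution, GPUs for per-node speed) rests on an embarrassingly parallel map step. Below is a minimal sketch of that step, using a simple round-robin partition rather than Hadoop's actual input-splitting machinery:

```python
def partition(queries, n_workers):
    """Round-robin assignment of query sequences to workers/GPUs."""
    shards = [[] for _ in range(n_workers)]
    for i, q in enumerate(queries):
        shards[i % n_workers].append(q)
    return shards

queries = ["MKV", "GAVL", "PRT", "WYF", "AACD"]  # toy protein queries
shards = partition(queries, 2)
assert shards == [["MKV", "PRT", "AACD"], ["GAVL", "WYF"]]
assert sorted(sum(shards, [])) == sorted(queries)  # no query lost
```

Because each shard is scored independently, a failed worker's shard can simply be reassigned, which is the basis of the fault tolerance the abstract mentions.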
Globus Nexus: A Platform-as-a-Service Provider of Research Identity, Profile, and Group Management.
Chard, Kyle; Lidman, Mattias; McCollam, Brendan; Bryan, Josh; Ananthakrishnan, Rachana; Tuecke, Steven; Foster, Ian
2016-03-01
Globus Nexus is a professionally hosted Platform-as-a-Service that provides identity, profile and group management functionality for the research community. Many collaborative e-Science applications need to manage large numbers of user identities, profiles, and groups. However, developing and maintaining such capabilities is often challenging given the complexity of modern security protocols and requirements for scalable, robust, and highly available implementations. By outsourcing this functionality to Globus Nexus, developers can leverage best-practice implementations without incurring development and operations overhead. Users benefit from enhanced capabilities such as identity federation, flexible profile management, and user-oriented group management. In this paper we present Globus Nexus, describe its capabilities and architecture, summarize how several e-Science applications leverage these capabilities, and present results that characterize its scalability, reliability, and availability.
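At its simplest, the group-management capability being outsourced is a reliable mapping from identities to groups. A toy sketch of the semantics only (not the Globus Nexus API, which is a hosted REST service):

```python
groups: dict[str, set[str]] = {}

def add_member(group: str, identity: str) -> None:
    """Record that an identity belongs to a group."""
    groups.setdefault(group, set()).add(identity)

def is_member(group: str, identity: str) -> bool:
    """Answer a membership query."""
    return identity in groups.get(group, set())

add_member("collab-42", "alice@uni.example")   # hypothetical names
assert is_member("collab-42", "alice@uni.example")
assert not is_member("collab-42", "bob@uni.example")
```

The hard part, and the argument for outsourcing, is everything around these few lines: federated identities, scalable and highly available storage, and correctly implemented security protocols.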
Globus Nexus: A Platform-as-a-Service Provider of Research Identity, Profile, and Group Management
Lidman, Mattias; McCollam, Brendan; Bryan, Josh; Ananthakrishnan, Rachana; Tuecke, Steven; Foster, Ian
2015-01-01
Globus Nexus is a professionally hosted Platform-as-a-Service that provides identity, profile and group management functionality for the research community. Many collaborative e-Science applications need to manage large numbers of user identities, profiles, and groups. However, developing and maintaining such capabilities is often challenging given the complexity of modern security protocols and requirements for scalable, robust, and highly available implementations. By outsourcing this functionality to Globus Nexus, developers can leverage best-practice implementations without incurring development and operations overhead. Users benefit from enhanced capabilities such as identity federation, flexible profile management, and user-oriented group management. In this paper we present Globus Nexus, describe its capabilities and architecture, summarize how several e-Science applications leverage these capabilities, and present results that characterize its scalability, reliability, and availability. PMID:26688598
Impact of Nutrition Standards on Competitive Food Quality in Massachusetts Middle and High Schools
Cohen, Juliana F. W.; Hoffman, Jessica A.; Rosenfeld, Lindsay; Chaffee, Ruth; Smith, Lauren; Rimm, Eric B.
2016-01-01
Objectives. To examine changes in competitive foods (items sold in à la carte lines, vending machines, and school stores that “compete” with school meals) in Massachusetts middle and high schools before and after implementation of a statewide nutrition law in 2012. Methods. We photographed n = 10 782 competitive foods and beverages in 36 Massachusetts school districts and 7 control state districts to determine availability and compliance with the law at baseline (2012), 1 year (2013), and 2 years (2014) after the policy (overall enrollment: 71 202 students). We examined availability and compliance trends over time. Results. By 2014, 60% of competitive foods and 79% of competitive beverages were compliant. Multilevel models showed an absolute 46.2% increase for foods (95% confidence interval = 36.2, 56.3) and 46.8% increase for beverages (95% confidence interval = 39.2, 54.4) in schools’ alignment with updated standards from 2012 to 2014. Conclusions. The law’s implementation resulted in major improvements in the availability and nutritional quality of competitive foods and beverages, but schools did not reach 100% compliance. This law closely mirrors US Department of Agriculture Smart Snacks in School standards, suggesting that complying with strict nutrition standards is feasible, and schools may experience challenges and improvements over time. PMID:27077344
Improving finite element results in modeling heart valve mechanics.
Earl, Emily; Mohammadi, Hadi
2018-06-01
Finite element analysis is a well-established computational tool which can be used for the analysis of soft tissue mechanics. Due to the structural complexity of the leaflet tissue of the heart valve, the currently available finite element models do not adequately represent the leaflet tissue. A method of addressing this issue is to implement computationally expensive finite element models, characterized by precise constitutive models including high-order and high-density mesh techniques. In this study, we introduce a novel numerical technique that enhances the results obtained from coarse mesh finite element models to provide accuracy comparable to that of fine mesh finite element models while maintaining a relatively low computational cost. Introduced in this study is a method by which the computational expense required to solve linear and nonlinear constitutive models, commonly used in heart valve mechanics simulations, is reduced while continuing to account for large and infinitesimal deformations. This continuum model is developed based on the least square algorithm procedure coupled with the finite difference method adhering to the assumption that the components of the strain tensor are available at all nodes of the finite element mesh model. The suggested numerical technique is easy to implement, practically efficient, and requires less computational time compared to currently available commercial finite element packages such as ANSYS and/or ABAQUS.
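As a generic illustration of the least-squares ingredient mentioned above (not the paper's actual continuum scheme, which couples least squares with finite differences over the strain tensor): fitting a low-order function to coarse-mesh nodal values lets one evaluate a smoothed field anywhere.

```python
def fit_line(xs, ys):
    """Least-squares line y = m*x + b through sampled nodal values."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    m = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    return m, my - m * mx

# On truly linear nodal data the fit is exact:
m, b = fit_line([0.0, 1.0, 2.0, 3.0], [1.0, 3.0, 5.0, 7.0])
assert abs(m - 2.0) < 1e-12 and abs(b - 1.0) < 1e-12
```

In practice, higher-order fits over local patches of nodes would be used, but the normal-equations structure is the same.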
Impact of Nutrition Standards on Competitive Food Quality in Massachusetts Middle and High Schools.
Gorski, Mary T; Cohen, Juliana F W; Hoffman, Jessica A; Rosenfeld, Lindsay; Chaffee, Ruth; Smith, Lauren; Rimm, Eric B
2016-06-01
To examine changes in competitive foods (items sold in à la carte lines, vending machines, and school stores that "compete" with school meals) in Massachusetts middle and high schools before and after implementation of a statewide nutrition law in 2012. We photographed n = 10 782 competitive foods and beverages in 36 Massachusetts school districts and 7 control state districts to determine availability and compliance with the law at baseline (2012), 1 year (2013), and 2 years (2014) after the policy (overall enrollment: 71 202 students). We examined availability and compliance trends over time. By 2014, 60% of competitive foods and 79% of competitive beverages were compliant. Multilevel models showed an absolute 46.2% increase for foods (95% confidence interval = 36.2, 56.3) and 46.8% increase for beverages (95% confidence interval = 39.2, 54.4) in schools' alignment with updated standards from 2012 to 2014. The law's implementation resulted in major improvements in the availability and nutritional quality of competitive foods and beverages, but schools did not reach 100% compliance. This law closely mirrors US Department of Agriculture Smart Snacks in School standards, suggesting that complying with strict nutrition standards is feasible, and schools may experience challenges and improvements over time.
RAID Unbound: Storage Fault Tolerance in a Distributed Environment
NASA Technical Reports Server (NTRS)
Ritchie, Brian
1996-01-01
Mirroring, data replication, backup, and more recently, redundant arrays of independent disks (RAID) are all technologies used to protect and ensure access to critical company data. A new set of problems has arisen as data becomes more and more geographically distributed. Each of the technologies listed above provides important benefits; but each has failed to adapt fully to the realities of distributed computing. The key to data high availability and protection is to take the technologies' strengths and 'virtualize' them across a distributed network. RAID and mirroring offer high data availability, while data replication and backup provide strong data protection. If we take these concepts at a very granular level (defining user, record, block, file, or directory types) and then liberate them from the physical subsystems with which they have traditionally been associated, we have the opportunity to create highly scalable network-wide storage fault tolerance. The network becomes the virtual storage space in which the traditional concepts of data high availability and protection are implemented without their corresponding physical constraints.
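A minimal sketch of the "virtualized" replication idea (a hypothetical layout policy, not any particular product's design): place k replicas of each block on k distinct nodes, so losing any single node leaves every block readable elsewhere.

```python
def place_replicas(blocks, nodes, k=2):
    """Map each block to k distinct nodes, round-robin with offset."""
    placement = {}
    for i, blk in enumerate(blocks):
        placement[blk] = [nodes[(i + r) % len(nodes)] for r in range(k)]
    return placement

p = place_replicas(["b0", "b1", "b2"], ["n0", "n1", "n2"], k=2)
assert p["b0"] == ["n0", "n1"]
# Each block's replicas land on distinct nodes:
assert all(len(set(reps)) == 2 for reps in p.values())
```

Real systems layer rack-awareness, re-replication after failures, and consistency protocols on top of this placement step.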
Trinczek, B.; Köpcke, F.; Leusch, T.; Majeed, R.W.; Schreiweis, B.; Wenk, J.; Bergh, B.; Ohmann, C.; Röhrig, R.; Prokosch, H.U.; Dugas, M.
2014-01-01
Summary Objective (1) To define features and data items of a Patient Recruitment System (PRS); (2) to design a generic software architecture of such a system covering the requirements; (3) to identify implementation options available within different Hospital Information System (HIS) environments; (4) to implement five PRS following the architecture and utilizing the implementation options as proof of concept. Methods Existing PRS were reviewed and interviews with users and developers conducted. All reported PRS features were collected and prioritized according to their published success and user’s request. Common feature sets were combined into software modules of a generic software architecture. Data items to process and transfer were identified for each of the modules. Each site collected implementation options available within their respective HIS environment for each module, provided a prototypical implementation based on available implementation possibilities and supported the patient recruitment of a clinical trial as a proof of concept. Results 24 commonly reported and requested features of a PRS were identified, 13 of them prioritized as being mandatory. A UML version 2 based software architecture containing 5 software modules covering these features was developed. 13 data item groups processed by the modules, thus required to be available electronically, have been identified. Several implementation options could be identified for each module, most of them being available at multiple sites. Utilizing available tools, a PRS could be implemented in each of the five participating German university hospitals. Conclusion A set of required features and data items of a PRS has been described for the first time. The software architecture covers all features in a clear, well-defined way. 
The variety of implementation options and the prototypes show that it is possible to implement the given architecture in different HIS environments, thus enabling more sites to successfully support patient recruitment in clinical trials. PMID:24734138
Long, Michael W; Henderson, Kathryn E; Schwartz, Marlene B
2010-10-01
This article seeks to inform state and local school food policies by evaluating the impact of Connecticut's Healthy Food Certification (HFC), a program which provides monetary incentives to school districts that choose to implement state nutrition standards for all foods sold to students outside reimbursable school meals. Food service directors from all school districts participating in the National School Lunch Program (NSLP) (N = 151) in Connecticut were surveyed about the availability of competitive foods before and after the 2006-2007 implementation of HFC. Food categories were coded as healthy or unhealthy based on whether they met the Connecticut Nutrition Standards. Data on NSLP participation were provided by the State Department of Education. Changes in NSLP participation and availability of unhealthy competitive foods in elementary, middle, and high schools were compared pre- and post-HFC across districts participating (n = 74) versus not participating (n = 77) in HFC. On average, all districts in Connecticut reduced the availability of unhealthy competitive foods, with a significantly greater reduction among HFC districts. Average NSLP participation also increased across the state. Participating in HFC was associated with significantly greater NSLP participation for paid meals in middle school; however, implementing HFC did not increase overall NSLP participation beyond the statewide upward trend. The 2006-2007 school year was marked by a significant decrease in unhealthy competitive foods and an increase in NSLP participation across the state. Participation in Connecticut's voluntary HFC further reduced the availability of unhealthy competitive foods in local school districts, and had either a positive or neutral effect on NSLP participation. © 2010, American School Health Association.
Identification of Social Anxiety in Schools: The Utility of a Two-Step Screening Process
ERIC Educational Resources Information Center
Sweeney, Corinne; Masia Warner, Carrie; Brice, Chad; Stewart, Catherine; Ryan, Julie; Loeb, Katharine L.; McGrath, Robert E.
2015-01-01
Social anxiety disorder (SAD) is highly prevalent yet largely undetected and untreated in adolescents despite the availability of effective treatments. Implementing interventions in schools enhances recognition and access to treatment for SAD. However, without reliable means to accurately identify youth in need of services, school-based…
ERIC Educational Resources Information Center
Novak, Gordon S., Jr.
GLISP is a LISP-based language which provides high-level language features not found in ordinary LISP. The GLISP language is implemented by means of a compiler which accepts GLISP as input and produces ordinary LISP as output. This output can be further compiled to machine code by the LISP compiler. GLISP is available for several LISP dialects,…
Code of Federal Regulations, 2013 CFR
2013-10-01
... Agreement Continuously improve safety and seek high levels of safety, particularly by developing and... and regions, thereby providing greater safety protection with available government resources. II... a working party of experts recommends a harmonized or new global technical regulation, it sends a...
South Carolina Case Study: Building a Student-Level Longitudinal Data System
ERIC Educational Resources Information Center
Kugle, Cherry; Smith, Nancy
2007-01-01
The Data Quality Campaign is a national, collaborative effort to encourage and support state policymakers to improve the collection, availability and use of high-quality education data and to implement state longitudinal data systems to improve student achievement. The campaign aims to provide tools and resources that will assist state development…
ERIC Educational Resources Information Center
Qian, Manman; Chukharev-Hudilainen, Evgeny; Levis, John
2018-01-01
Many types of L2 phonological perception are often difficult to acquire without instruction. These difficulties with perception may also be related to intelligibility in production. Instruction on perception contrasts is more likely to be successful with the use of phonetically variable input made available through computer-assisted pronunciation…
Evidence-Based Special Education in the Context of Scarce Evidence-Based Practices
ERIC Educational Resources Information Center
TEACHING Exceptional Children, 2014
2014-01-01
Evidence-based practices (EBPs) are supported as generally effective for populations of learners by bodies of high-quality and experimental research and, when aligned with stakeholder values and practical needs, should be prioritized for implementation. However, evidence-based practices are not currently available for all learner types in all…
DOT National Transportation Integrated Search
2014-03-01
Similar to an ill patient, road safety issues can : also be diagnosed, if the right tools are available. : Statistics on roadway incidents can locate areas : that have a high rate of incidents and require : a solution, such as better signage, lightin...
Louisiana Case Study: Building a Student-Level Longitudinal Data System
ERIC Educational Resources Information Center
Kugle, Cherry; Smith, Nancy
2008-01-01
The Data Quality Campaign is a national, collaborative effort to encourage and support state policymakers to improve the collection, availability and use of high-quality education data and to implement state longitudinal data systems to improve student achievement. The campaign aims to provide tools and resources that will assist state development…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-22
... Office of Citizen Exchanges, Cultural Programs Division, in the implementation of short-term, high... influencers, artists, arts managers, and foreign youth with special interest in the arts. The Cultural.... citizens to conferences or conference-type seminars overseas; nor is funding available for bringing foreign...
[New paradigms and challenges in cervical cancer prevention and control in Latin America].
Almonte, Maribel; Murillo, Raúl; Sánchez, Gloria Inés; Jerónimo, José; Salmerón, Jorge; Ferreccio, Catterina; Lazcano-Ponce, Eduardo; Herrero, Rolando
2010-01-01
Cervical cancer continues to be a significant health problem in Latin America. The use of conventional cytology to detect precancerous cervical lesions has had almost no major impact on reducing cervical cancer incidence and mortality rates, which are still high in the region. The availability of new screening tools to detect precancerous lesions provides great opportunities for cervical cancer prevention in the region, as do highly efficacious HPV vaccines able to prevent nearly all lesions associated with HPV-16 and -18 when applied before viral exposure. This paper summarizes the scientific evidence and regional experiences related to: i) the use of HPV testing and visual inspection after the application of acetic acid (VIA) in primary screening and ii) the implementation of adolescent HPV vaccination programs. Finally, we outline a number of recommendations for different resource settings. The feasibility of implementing successful and sustainable national cervical cancer prevention programs in Latin American countries will depend on health priorities and on the availability of infrastructure and health personnel, as determined by rigorous local situational analysis.
Gálvez, Sergio; Ferusic, Adis; Esteban, Francisco J; Hernández, Pilar; Caballero, Juan A; Dorado, Gabriel
2016-10-01
The Smith-Waterman algorithm offers great sensitivity when used for biological sequence-database searches, but at the expense of high computing-power requirements. To overcome this problem, implementations in the literature exploit the different hardware architectures available in a standard PC, such as the GPU, CPU, and coprocessors. We introduce an application that splits the original database-search problem into smaller parts, resolves each of them by executing the most efficient implementations of the Smith-Waterman algorithm on the different hardware architectures, and finally unifies the generated results. Using non-overlapping hardware allows simultaneous execution, and up to a 2.58-fold performance gain when compared with any other single algorithm for searching sequence databases. Even the performance of the popular BLAST heuristic is exceeded in 78% of the tests. The application has been tested with standard hardware: an Intel i7-4820K CPU, Intel Xeon Phi 31S1P coprocessors, and nVidia GeForce GTX 960 graphics cards. An important increase in performance has been obtained in a wide range of situations, effectively exploiting the available hardware.
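The record above concerns distributing Smith-Waterman searches across heterogeneous hardware; the scoring kernel being distributed is the classic local-alignment recurrence. A minimal pure-Python sketch of that kernel (illustrative only, with arbitrary example scoring parameters, not the paper's optimized multi-device implementation):

```python
def smith_waterman(a, b, match=2, mismatch=-1, gap=-2):
    """Return the best local-alignment score between sequences a and b."""
    rows, cols = len(a) + 1, len(b) + 1
    H = [[0] * cols for _ in range(rows)]  # dynamic-programming matrix
    best = 0
    for i in range(1, rows):
        for j in range(1, cols):
            diag = H[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            # local alignment: scores are clamped at zero
            H[i][j] = max(0, diag, H[i - 1][j] + gap, H[i][j - 1] + gap)
            best = max(best, H[i][j])
    return best
```

For two identical length-4 sequences this yields 8 (four matches at +2 each); production tools replace this O(len(a)·len(b)) loop with vectorized or accelerator-resident variants.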
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ditzler, Gregory; Morrison, J. Calvin; Lan, Yemin
Background: Some of the current software tools for comparative metagenomics provide ecologists with the ability to investigate and explore bacterial communities using α– & β–diversity. Feature subset selection – a sub-field of machine learning – can also provide a unique insight into the differences between metagenomic or 16S phenotypes. In particular, feature subset selection methods can obtain the operational taxonomic units (OTUs), or functional features, that have a high level of influence on the condition being studied. For example, in a previous study we have used information-theoretic feature selection to understand the differences between protein family abundances that best discriminate between age groups in the human gut microbiome. Results: We have developed a new Python command line tool, which is compatible with the widely adopted BIOM format, for microbial ecologists that implements information-theoretic subset selection methods for biological data formats. We demonstrate the software tool's capabilities on publicly available datasets. Conclusions: We have made the software implementation of Fizzy available to the public under the GNU GPL license. The standalone implementation can be found at http://github.com/EESI/Fizzy.
Fizzy: feature subset selection for metagenomics.
Ditzler, Gregory; Morrison, J Calvin; Lan, Yemin; Rosen, Gail L
2015-11-04
Some of the current software tools for comparative metagenomics provide ecologists with the ability to investigate and explore bacterial communities using α- & β-diversity. Feature subset selection--a sub-field of machine learning--can also provide a unique insight into the differences between metagenomic or 16S phenotypes. In particular, feature subset selection methods can obtain the operational taxonomic units (OTUs), or functional features, that have a high level of influence on the condition being studied. For example, in a previous study we have used information-theoretic feature selection to understand the differences between protein family abundances that best discriminate between age groups in the human gut microbiome. We have developed a new Python command line tool, which is compatible with the widely adopted BIOM format, for microbial ecologists that implements information-theoretic subset selection methods for biological data formats. We demonstrate the software tool's capabilities on publicly available datasets. We have made the software implementation of Fizzy available to the public under the GNU GPL license. The standalone implementation can be found at http://github.com/EESI/Fizzy.
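The core idea behind information-theoretic subset selection of the kind Fizzy implements can be sketched in a few lines: rank discretized features (e.g. binned OTU abundances) by their mutual information with the phenotype label. The function names below are illustrative, not Fizzy's actual API:

```python
from collections import Counter
from math import log2

def mutual_information(xs, ys):
    """I(X;Y) in bits for two paired sequences of discrete values."""
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    return sum((c / n) * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

def top_k_features(columns, labels, k):
    """Return indices of the k features most informative about the labels."""
    scores = [(mutual_information(col, labels), i)
              for i, col in enumerate(columns)]
    return [i for _, i in sorted(scores, reverse=True)[:k]]
```

A feature perfectly correlated with a balanced binary label scores 1 bit, an independent one 0 bits; real tools add redundancy-aware criteria (e.g. mRMR or JMI) on top of this univariate ranking.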
Olmos, Jorge A; Piskorz, María Marta; Vela, Marcelo F
2016-06-01
GERD is a highly prevalent disease in our country. It has a deep impact on patients' quality of life and represents extremely high health-care costs. A correct understanding of its pathophysiology is crucial for the rational use of diagnostic methods and the implementation of appropriate treatment adjusted to each individual case. In this review we evaluate this disorder based on the best available evidence, focusing on pathophysiological mechanisms, epidemiology, modern diagnostic methods and current management standards.
Parallel algorithms for large-scale biological sequence alignment on Xeon-Phi based clusters.
Lan, Haidong; Chan, Yuandong; Xu, Kai; Schmidt, Bertil; Peng, Shaoliang; Liu, Weiguo
2016-07-19
Computing alignments between two or more sequences is a common operation frequently performed in computational molecular biology. The continuing growth of biological sequence databases establishes the need for their efficient parallel implementation on modern accelerators. This paper presents new approaches to high performance biological sequence database scanning with the Smith-Waterman algorithm and the first stage of progressive multiple sequence alignment based on the ClustalW heuristic on a Xeon Phi-based compute cluster. Our approach uses a three-level parallelization scheme to take full advantage of the compute power available on this type of architecture; i.e. cluster-level data parallelism, thread-level coarse-grained parallelism, and vector-level fine-grained parallelism. Furthermore, we re-organize the sequence datasets and use Xeon Phi shuffle operations to improve I/O efficiency. Evaluations show that our method achieves a peak overall performance up to 220 GCUPS for scanning real protein sequence databanks on a single node consisting of two Intel E5-2620 CPUs and two Intel Xeon Phi 7110P cards. It also exhibits good scalability in terms of sequence length and size, and number of compute nodes for both database scanning and multiple sequence alignment. Furthermore, the achieved performance is highly competitive in comparison to optimized Xeon Phi and GPU implementations. Our implementation is available at https://github.com/turbo0628/LSDBS-mpi.
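The coarse-grained level of the three-level scheme above amounts to partitioning the database and scanning chunks concurrently, then reducing to the best hit. A hedged stand-alone sketch (thread-based for brevity, with a toy shared-k-mer score standing in for Smith-Waterman; all names are illustrative):

```python
from concurrent.futures import ThreadPoolExecutor

def score(query, target, k=3):
    """Toy similarity: number of distinct k-mers shared by query and target."""
    kmers = lambda s: {s[i:i + k] for i in range(len(s) - k + 1)}
    return len(kmers(query) & kmers(target))

def scan(query, database, workers=4):
    """Return (best_score, best_sequence) over a chunked, concurrent scan."""
    def scan_chunk(chunk):
        return max(((score(query, t), t) for t in chunk), default=(-1, ""))
    # stride-partition the database into one chunk per worker
    chunks = [database[i::workers] for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return max(pool.map(scan_chunk, chunks))
```

The real system applies the same partition-scan-reduce shape across MPI ranks and Xeon Phi cards rather than Python threads.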
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-04
... Promulgation of Implementation Plans; New York Reasonably Available Control Technology and Reasonably Available... 25, 2009, the EPA proposed to disapprove portions of a proposed revision to the New York State.... Specifically, EPA proposed to disapprove New York's reasonably available control measure analysis and New York...
Space Debris Detection on the HPDP, a Coarse-Grained Reconfigurable Array Architecture for Space
NASA Astrophysics Data System (ADS)
Suarez, Diego Andres; Bretz, Daniel; Helfers, Tim; Weidendorfer, Josef; Utzmann, Jens
2016-08-01
Stream processing, widely used in communications and digital signal processing applications, requires high-throughput data processing that is achieved in most cases using Application-Specific Integrated Circuit (ASIC) designs. Lack of programmability is an issue especially in space applications, which use on-board components with long life-cycles requiring application updates. To this end, the High Performance Data Processor (HPDP) architecture integrates an array of coarse-grained reconfigurable elements to provide both flexible and efficient computational power suitable for stream-based data processing applications in space. In this work the capabilities of the HPDP architecture are demonstrated with the implementation of a real-time image processing algorithm for space debris detection in a space-based space surveillance system. The implementation challenges and alternatives are described, making trade-offs to improve performance at the expense of a negligible degradation of detection accuracy. The proposed implementation uses over 99% of the available computational resources. Performance estimations based on simulations show that the HPDP can amply match the application requirements.
NASA Astrophysics Data System (ADS)
Bruschetta, M.; Maran, F.; Beghi, A.
2017-06-01
The use of dynamic driving simulators is constantly increasing in the automotive community, with applications ranging from vehicle development to rehab and driver training. The effectiveness of such devices is related to their capability to reproduce driving sensations well, hence it is crucial that the motion control strategies generate both realistic and feasible inputs to the platform. Such strategies are called motion cueing algorithms (MCAs). In recent years several MCAs based on model predictive control (MPC) techniques have been proposed. The main drawback associated with the use of MPC is its computational burden, which may limit its application to high-performance dynamic simulators. In this paper, a fast, real-time implementation of an MPC-based MCA for a 9-DOF, high-performance platform is proposed. The effectiveness of the approach in managing the available working area is illustrated by presenting experimental results from an implementation on a real device with a 200 Hz control frequency.
Leavy, Breiffni; Kwak, Lydia; Hagströmer, Maria; Franzén, Erika
2017-02-07
If people with progressive neurological diseases are to avail of evidence-based rehabilitation, programs found effective in randomized controlled trials (RCTs) must firstly be adapted and tested in clinical effectiveness studies as a means of strengthening their evidence base. This paper describes the protocol for an effectiveness-implementation trial that will assess the clinical effectiveness of a highly challenging balance training program (the HiBalance program) for people with mild-moderate Parkinson's disease (PD) while simultaneously collecting data concerning the way in which the program is implemented. The HiBalance program is systematically designed to target balance impairments in PD and has been shown effective at improving balance control and gait in a previous RCT. Study aims are to i) determine the effectiveness of the adapted HiBalance program on performance and self-rated outcomes such as balance control, gait and physical activity level ii) conduct a process evaluation of program implementation at the various clinics iii) determine barriers and facilitators to program implementation in these settings. This effectiveness-implementation type 1 hybrid study will use a non-randomized controlled design with consecutive inclusion of people with PD at multiple clinical sites. A mixed method approach will be used to collect clinical effectiveness data and process evaluation data which is both quantitative and qualitative in nature. The consolidated framework for implementation research (CFIR) will be used to guide the planning and collection of data concerning implementation barriers and facilitators. The HiBalance program will be provided by physical therapists as a part of standard rehabilitation care at the clinical sites, while the evaluation of the implementation process will be performed by the research group and funded by research grants.
An effectiveness-implementation study design benefits patients by speeding up the process of translating findings from research settings to routine health care. Findings from this study will also be highly relevant for those working with neurological rehabilitation when faced with decisions concerning the translation of training programs from efficacy studies to everyday clinical practice. ClinicalTrials.gov march 2016, NCT02727478 .
Exploiting Multiple Levels of Parallelism in Sparse Matrix-Matrix Multiplication
Azad, Ariful; Ballard, Grey; Buluc, Aydin; ...
2016-11-08
Sparse matrix-matrix multiplication (or SpGEMM) is a key primitive for many high-performance graph algorithms as well as for some linear solvers, such as algebraic multigrid. The scaling of existing parallel implementations of SpGEMM is heavily bound by communication. Even though 3D (or 2.5D) algorithms have been proposed and theoretically analyzed in the flat MPI model on Erdős-Rényi matrices, those algorithms had not been implemented in practice and their complexities had not been analyzed for the general case. In this work, we present the first implementation of the 3D SpGEMM formulation that exploits multiple (intranode and internode) levels of parallelism, achieving significant speedups over the state-of-the-art publicly available codes at all levels of concurrency. We extensively evaluate our implementation and identify bottlenecks that should be subject to further research.
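The sequential kernel that the 3D formulation parallelizes is row-by-row sparse accumulation. A minimal sketch with matrices stored as {row: {col: value}} dicts (an illustrative format, not the paper's distributed data structure):

```python
def spgemm(A, B):
    """Multiply sparse matrices A and B stored as dict-of-dicts."""
    C = {}
    for i, row in A.items():
        acc = {}  # dense-in-spirit accumulator for row i of C
        for k, a_ik in row.items():
            # only rows of B matching A's nonzero columns contribute
            for j, b_kj in B.get(k, {}).items():
                acc[j] = acc.get(j, 0) + a_ik * b_kj
        # keep only non-zero entries, as a sparse format would
        C[i] = {j: v for j, v in acc.items() if v != 0}
    return C
```

Work is proportional to the number of nonzero multiplications, not to matrix dimensions; the communication cost the paper targets arises when A's rows and B's rows live on different nodes.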
Design and implementation of a non-linear symphonic soundtrack of a video game
NASA Astrophysics Data System (ADS)
Sporka, Adam J.; Valta, Jan
2017-10-01
The music in contemporary video games is often interactive: playback is based on transitions between pieces of available music material, and these transitions happen in response to the evolving gameplay. This paradigm is referred to as adaptive music. Our challenge was to design, create, and implement the soundtrack of the upcoming video game Kingdom Come: Deliverance. Our soundtrack is a collection of compositions with symphonic orchestration. Our design intention was to implement the adaptive music in a way that respected the nature of an orchestral film score. We created our own adaptive music middleware, called Sequence Music Engine, implementing high-level music logic as well as the low-level playback infrastructure. Our system can handle hours of video game music, helps maintain the relevance of the music throughout the video game, and minimises the repetitiveness of the individual pieces.
Angelow, Aniela; Schmidt, Matthias; Weitmann, Kerstin; Schwedler, Susanne; Vogt, Hannes; Havemann, Christoph; Hoffmann, Wolfgang
2008-07-01
In our report we describe the concept, strategies and implementation of a central biosample and data management (CSDM) system in the three-centre clinical study of the Transregional Collaborative Research Centre "Inflammatory Cardiomyopathy - Molecular Pathogenesis and Therapy" SFB/TR 19, Germany. Following the requirements of high system-resource availability, data security, privacy protection and quality assurance, a web-based CSDM was developed based on Java 2 Enterprise Edition using an Oracle database. An efficient and reliable sample documentation system using bar-code labelling, a partitioning storage algorithm and online documentation software was implemented. An online electronic case report form is used to acquire patient-related data. Strict rules for access to the online applications and secure connections are used to account for privacy protection and data security. Challenges for the implementation of the CSDM arose at the project, technical and organisational levels as well as at the staff level.
Osteoarthritis guidelines: Barriers to implementation and solutions.
Ferreira de Meneses, Sarah; Rannou, Francois; Hunter, David J
2016-06-01
Osteoarthritis (OA) is a leading cause of disability worldwide. Clinical practice guidelines (CPGs) have been developed to facilitate improved OA management, and scientific communities worldwide have proposed CPGs for OA treatment. Despite the number of highly prominent guidelines available and their remarkable consistency, their uptake has been suboptimal, possibly because of the multitude of barriers related to the implementation of CPGs. For example, different guidelines show contradictions, some lack evidence, and they lack a hierarchy or tools to facilitate their translation and application. Also, the guidelines do not acknowledge the effect of comorbidities on the choice of treatments. Finally, poor integration of multidisciplinary services within and across healthcare settings is a major barrier to the effective implementation of management guidelines. Here we describe the main problems related to the OA guidelines and some solutions, so as to offer guidance on the elaboration of future CPGs and their implementation in primary care. Copyright © 2016 Elsevier Masson SAS. All rights reserved.
Web-based network analysis and visualization using CellMaps.
Salavert, Francisco; García-Alonso, Luz; Sánchez, Rubén; Alonso, Roberto; Bleda, Marta; Medina, Ignacio; Dopazo, Joaquín
2016-10-01
CellMaps is an HTML5 open-source web tool that allows displaying, editing, exploring and analyzing biological networks as well as integrating metadata into them. Computations and analyses are remotely executed on high-end servers, and all the functionalities are available through RESTful web services. CellMaps can easily be integrated in any web page by using the available JavaScript API. The application is available at http://cellmaps.babelomics.org/ and the code can be found at https://github.com/opencb/cell-maps. The client is implemented in JavaScript and the server in C and Java. Contact: jdopazo@cipf.es Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
NASA Astrophysics Data System (ADS)
Pela, F.; Tsugawa, R. K.; Andreoli, L. J.
2004-12-01
The National Polar-orbiting Operational Environmental Satellite System (NPOESS), a tri-agency program, supports missions of the Department of Commerce (DOC)/National Oceanic and Atmospheric Administration (NOAA), the Department of Defense (DoD), and the National Aeronautics and Space Administration (NASA). NPOESS provides a critical, timely, reliable, and high quality space-based sensing capability to acquire and process global and regional environmental imagery and specialized meteorological, climatic, terrestrial, oceanographic, solar-geophysical, and other data products. These products are delivered to national weather and environmental facilities operated by NOAA and DoD, to NASA, and to environmental remote sensing science community users to support civil and military functions. These data are also provided in real time to field terminals deployed worldwide. The NPOESS architecture is built on a foundation of affordability, and the three pillars of data quality, latency, and availability. Affordability refers to an over-arching awareness of cost to provide the best value to the government for implementing a converged system; some dimensions of cost include the cost for system development and implementation, the balance between development costs and operation and maintenance costs, and the fiscal year expenditure plans that meet schedule commitments. Data quality is characterized in terms of the attributes associated with Environmental Data Records (EDRs), and the products that are delivered to the four US Operational Centrals and field users. These EDRs are generated by the system using raw data from the space-borne sensors and spacecraft, in conjunction with science algorithms and calibration factors. Data latency refers to the time period between the detection of energy by a space-borne sensor and the delivery of a corresponding EDR. The system was designed to minimize data latency, and hence provide users with timely data. Availability refers to both data availability and system operational availability.
Data availability is ensured by the way data is stored and routed throughout the system, on the spacecraft and on the ground, so that it can be retrieved and resent if the first transmittal is not successful. Operational availability is a measure of how well around-the-clock operations are supported, through the careful deployment of hot spares and fault tolerance of the system. Both types of availability are very high for the NPOESS architecture. Overall, the NPOESS architecture successfully delivers to the government a best-value solution featuring high data quality, low data latency, and high data/system availability.
[GNU Pattern: open source pattern hunter for biological sequences based on SPLASH algorithm].
Xu, Ying; Li, Yi-xue; Kong, Xiang-yin
2005-06-01
To construct a high-performance open-source software engine based on the IBM SPLASH algorithm for later research on pattern discovery. Gpat, based on the SPLASH algorithm, was developed using open-source software. The GNU Pattern (Gpat) software efficiently implements the core part of the SPLASH algorithm, and the full source code of Gpat is available for other researchers to modify under the GNU license. Gpat is a successful implementation of the SPLASH algorithm and can be used as a basic framework for later research on pattern recognition in biological sequences.
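The simplest form of the pattern hunting that SPLASH-style tools perform is enumerating fixed-length substrings and keeping those supported by enough input sequences. A hedged sketch (real SPLASH patterns additionally allow wildcards and flexible gaps, which this minimal version does not):

```python
from collections import defaultdict

def frequent_patterns(seqs, length, min_support):
    """Map pattern -> number of sequences containing it, filtered by support."""
    support = defaultdict(set)
    for idx, s in enumerate(seqs):
        for i in range(len(s) - length + 1):
            # record which sequences each substring occurs in
            support[s[i:i + length]].add(idx)
    return {p: len(ids) for p, ids in support.items() if len(ids) >= min_support}
```

Counting supporting sequences (a set of indices) rather than raw occurrences is what makes the support threshold meaningful when a motif repeats within one sequence.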
Steady-state capabilities for hydroturbines with OpenFOAM
NASA Astrophysics Data System (ADS)
Page, M.; Beaudoin, M.; Giroux, A. M.
2010-08-01
The availability of a high quality Open Source CFD simulation platform like OpenFOAM offers new R&D opportunities by providing direct access to models and solver implementation details. Efforts have been made by Hydro-Québec to adapt OpenFOAM to hydroturbines for the development of steady-state capabilities. The paper describes the developments that have been made to implement new turbomachinery related capabilities: Multiple Frame of Reference solver, domain coupling interfaces (GGI, cyclicGGI and mixing plane) and specialized boundary conditions. Practical use of the new turbomachinery capabilities are demonstrated for the analysis of a 195-MW Francis hydroturbine.
Helaers, Raphaël; Milinkovitch, Michel C
2010-07-15
The development, in the last decade, of stochastic heuristics implemented in robust application software has made large phylogeny inference a key step in most comparative studies involving molecular sequences. Still, the choice of a phylogeny inference software is often dictated by a combination of parameters not related to the raw performance of the implemented algorithm(s) but rather by practical issues such as ergonomics and/or the availability of specific functionalities. Here, we present MetaPIGA v2.0, a robust implementation of several stochastic heuristics for large phylogeny inference (under maximum likelihood), including a Simulated Annealing algorithm, a classical Genetic Algorithm, and the Metapopulation Genetic Algorithm (metaGA) together with complex substitution models, discrete Gamma rate heterogeneity, and the possibility to partition data. MetaPIGA v2.0 also implements the Likelihood Ratio Test, the Akaike Information Criterion, and the Bayesian Information Criterion for automated selection of substitution models that best fit the data. Heuristics and substitution models are highly customizable through manual batch files and command line processing. However, MetaPIGA v2.0 also offers an extensive graphical user interface for parameter setting, generating and running batch files, following run progress, and manipulating result trees. MetaPIGA v2.0 uses standard formats for data sets and trees, is platform independent, runs on 32- and 64-bit systems, and takes advantage of multiprocessor and multicore computers. The metaGA resolves the major problem inherent to classical Genetic Algorithms by maintaining high inter-population variation even under strong intra-population selection. Implementation of the metaGA together with additional stochastic heuristics into a single software will allow rigorous optimization of each heuristic as well as a meaningful comparison of performances among these algorithms.
MetaPIGA v2.0 gives access both to high customization for the phylogeneticist, as well as to an ergonomic interface and functionalities assisting the non-specialist for sound inference of large phylogenetic trees using nucleotide sequences. MetaPIGA v2.0 and its extensive user-manual are freely available to academics at http://www.metapiga.org.
PMID:20633263
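Simulated annealing, one of the stochastic heuristics MetaPIGA implements, has a compact generic skeleton. The sketch below minimizes a toy numeric function purely for illustration; in phylogeny inference the state would be a tree topology and the score its (negated) likelihood:

```python
import math
import random

def anneal(state, score, neighbor, t0=1.0, cooling=0.995, steps=5000, seed=0):
    """Generic simulated annealing: minimize score starting from state."""
    rng = random.Random(seed)
    best = current = state
    t = t0
    for _ in range(steps):
        cand = neighbor(current, rng)
        delta = score(cand) - score(current)
        # always accept improvements; accept worsenings with Boltzmann probability
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            current = cand
        if score(current) < score(best):
            best = current
        t *= cooling  # geometric cooling schedule
    return best

# toy usage: minimize (x - 3)^2 over the reals with small random perturbations
result = anneal(0.0, lambda x: (x - 3) ** 2,
                lambda x, rng: x + rng.uniform(-0.5, 0.5))
```

The high-temperature phase permits uphill moves that escape local optima, which is precisely the property that makes such heuristics competitive with genetic algorithms on rugged tree-space landscapes.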
NASA Astrophysics Data System (ADS)
Simonis, I.; Alameh, N.; Percivall, G.
2012-04-01
The GEOSS Architecture Implementation Pilots (AIP) develop and pilot new process and infrastructure components for the GEOSS Common Infrastructure (GCI) and the broader GEOSS architecture through an evolutionary development process consisting of a set of phases. Each phase addresses a set of Societal Benefit Areas (SBA) and geoinformatic topics. The first three phases consisted of architecture refinements based on interactions with users; component interoperability testing; and SBA-driven demonstrations. The fourth phase (AIP-4) documented here focused on fostering interoperability arrangements and common practices for GEOSS by facilitating access to priority earth observation data sources and by developing and testing specific clients and mediation components to enable such access. Additionally, AIP-4 supported the development of a thesaurus for earth observation parameters and tutorials to guide data providers to make their data available through GEOSS. The results of AIP-4 are documented in two engineering reports and captured in a series of videos posted online. Led by the Open Geospatial Consortium (OGC), AIP-4 built on contributions from over 60 organizations. This wide portfolio helped test interoperability arrangements in a highly heterogeneous environment. AIP-4 participants cooperated closely to test available data sets, access services, and client applications in multiple workflows and setups. Eventually, AIP-4 improved the accessibility of GEOSS datasets identified as supporting Critical Earth Observation Priorities by the GEO User Interface Committee (UIC), and increased the use of the data through promoting availability of new data services, clients, and applications. During AIP-4, a number of key earth observation data sources have been made available online at standard service interfaces, discovered using brokered search approaches, and processed and visualized in generalized client applications.
AIP-4 demonstrated the level of interoperability that can be achieved using currently available standards and corresponding products and implementations. The AIP-4 integration testing process proved that the integration of heterogeneous data resources available via interoperability arrangements such as WMS, WFS, WCS and WPS indeed works. However, the integration often required various levels of customizations on the client side to accommodate for variations in the service implementations. Those variations seem to be based on both malfunctioning service implementations as well as varying interpretations of or inconsistencies in existing standards. Other interoperability issues identified revolve around missing metadata or using unrecognized identifiers in the description of GEOSS resources. Once such issues are resolved, continuous compliance testing is necessary to ensure minimizing variability of implementations. Once data providers can choose from a set of enhanced implementations for offering their data using consistent interoperability arrangements, the barrier to client and decision support implementation developers will be lowered, leading to true leveraging of earth observation data through GEOSS. AIP-4 results, lessons learnt from previous AIPs 1-3 and close coordination with the Infrastructure Implementation Board (IIB), the successor of the Architecture and Data Committee (ADC), form the basis in the current preparation phase for the next Architecture Implementation Pilot, AIP-5. The Call For Participation will be launched in February and the pilot will be conducted from May to November 2012. The current planning foresees a scenario-oriented approach, with possible scenarios coming from the domains of disaster management, health (including air quality and waterborne diseases), water resource observations, energy, biodiversity and climate change, and agriculture.
Lewis, Cara C; Fischer, Sarah; Weiner, Bryan J; Stanick, Cameo; Kim, Mimi; Martinez, Ruben G
2015-11-04
High-quality measurement is critical to advancing knowledge in any field. New fields, such as implementation science, are often beset with measurement gaps and poor-quality instruments, a weakness that can be more easily addressed in light of systematic review findings. Although several reviews of quantitative instruments used in implementation science have been published, no studies have focused on instruments that measure implementation outcomes. Proctor and colleagues established a core set of implementation outcomes: acceptability, adoption, appropriateness, cost, feasibility, fidelity, penetration, and sustainability (Adm Policy Ment Health Ment Health Serv Res 36:24-34, 2009). The Society for Implementation Research Collaboration (SIRC) Instrument Review Project employed an enhanced systematic review methodology, described in full elsewhere (Implement Sci 2: 2015), to identify quantitative instruments of implementation outcomes relevant to mental or behavioral health settings. To increase the feasibility of the review, and consistent with the scope of SIRC, only instruments applicable to mental or behavioral health were included. The review, synthesis, and evaluation included the following: (1) a search protocol for the literature review of constructs; (2) the literature review of instruments using Web of Science and PsycINFO; and (3) data extraction and instrument quality ratings to inform knowledge synthesis. Our evidence-based assessment rating criteria quantified fundamental psychometric properties as well as a crude measure of usability. Two independent raters applied the criteria to each instrument to generate a quality profile. We identified 104 instruments across eight constructs: nearly half (n = 50) assessed acceptability and 19 assessed adoption, while every other implementation outcome had fewer than 10 instruments.
Only one instrument demonstrated at least minimal evidence for psychometric strength on all six of the evidence-based assessment criteria. The majority of instruments had no information regarding responsiveness or predictive validity. Implementation outcomes instrumentation is underdeveloped with respect to both the sheer number of available instruments and the psychometric quality of existing instruments. Until psychometric strength is established, the field will struggle to identify which implementation strategies work best, for which organizations, and under what conditions.
Palinkas, Lawrence A; Campbell, Mark; Saldana, Lisa
2018-01-01
Background: This study examined influences on the decisions of administrators of youth-serving organizations to initiate and proceed with implementation of an evidence-based practice (EBP). Methods: Semi-structured interviews, developed using the Stages of Implementation Completion (SIC) as a framework, were conducted with 19 agency chief executive officers and program directors of 15 organizations serving children and adolescents. Results: Agency leaders' self-assessments of implementation feasibility and desirability prior to implementation (Pre-implementation) were influenced by intervention affordability, feasibility, requirements, validity, reliability, relevance, cost savings, positive outcomes, and adequacy of information; availability of funding, support from sources external to the agency, and adequacy of technical assistance; and staff availability and attitudes toward innovation in general and EBPs in particular, organizational capacity, fit between the EBP and agency mission and capacity, prior experience with implementation, experience with seeking evidence, and developing consensus. Assessments during the Implementation phase included intervention flexibility and requirements; availability of funding, adequacy of training and technical assistance, and getting sufficient and appropriate referrals; and staffing and implementing with fidelity. Assessments during the Sustainment phase included intervention costs and benefits; availability of funding, support from sources outside of the agency, and need for the EBP; and the fit between the EBP and the agency mission. Discussion: The results point to opportunities for using agency leader models to develop strategies to facilitate implementation of evidence-based and innovative practices for children and adolescents. The SIC provides a standardized framework for guiding agency leader self-assessments of implementation.
Making extreme computations possible with virtual machines
NASA Astrophysics Data System (ADS)
Reuter, J.; Chokoufe Nejad, B.; Ohl, T.
2016-10-01
State-of-the-art algorithms generate scattering amplitudes for high-energy physics at leading order for high-multiplicity processes as compiled code (in Fortran, C or C++). For complicated processes the size of these libraries can become tremendous (many GiB). We show that amplitudes can be translated to byte-code instructions, which can even reduce the size by an order of magnitude. The byte-code is interpreted by a virtual machine with runtimes comparable to compiled code and better scaling with additional legs. We study the properties of this algorithm, as an extension of the Optimizing Matrix Element Generator (O'Mega). The byte-code matrix elements are available as alternative input for the event generator WHIZARD. The byte-code interpreter can be implemented very compactly, which will help with a future implementation on massively parallel GPUs.
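The core idea, interpreting a compact instruction stream rather than compiling a huge expression to native code, can be pictured with a tiny stack-based virtual machine. The opcodes and encoding below are illustrative stand-ins, not O'Mega's actual instruction set:

```python
# Minimal sketch of a stack-based bytecode interpreter: an arithmetic
# expression is flattened into postfix bytecode once, then evaluated
# repeatedly for different inputs without any compiled code.

PUSH_CONST, PUSH_VAR, ADD, MUL = range(4)  # hypothetical opcode set

def run(bytecode, variables):
    """Interpret postfix bytecode against a dict of variable values."""
    stack = []
    for op, arg in bytecode:
        if op == PUSH_CONST:
            stack.append(arg)
        elif op == PUSH_VAR:
            stack.append(variables[arg])
        elif op == ADD:
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == MUL:
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
    return stack.pop()

# (x + 2) * y  ->  postfix: x 2 + y *
program = [(PUSH_VAR, "x"), (PUSH_CONST, 2.0), (ADD, None),
           (PUSH_VAR, "y"), (MUL, None)]
print(run(program, {"x": 3.0, "y": 4.0}))  # 20.0
```

The program is plain data (a few bytes per instruction) instead of machine code, which is the source of the size reduction the abstract reports.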
Verloo, Henk; Desmedt, Mario; Morin, Diane
2017-02-01
Evidence-based practice (EBP) is upheld as a means for patients to receive the most efficient care in a given context. Despite the available evidence and positive beliefs about it, implementing EBP as standard daily practice still faces many obstacles. This study investigated the beliefs about and implementation of EBP among nurses and allied healthcare providers (AHPs) in 9 acute care hospitals in the canton of Valais, Switzerland. A cross-sectional descriptive survey was conducted. The target population was composed of 1899 nurses and 126 AHPs. Beliefs about and implementation of EBP were measured using the EBP-Beliefs and EBP-Implementation scales of Melnyk et al. The initial sample consisted of 491 participants (overall response rate 24.2%): 421 nurses (22.4% response rate) and 78 AHPs (61.9% response rate). The final sample, composed only of those who declared previous exposure to EBP, included 391 participants (329 nurses and 62 AHPs). Overall, participants had positive attitudes towards EBP and were willing to increase their knowledge to guide practice. However, they acknowledged poor implementation of EBP in daily practice. A significantly higher level of EBP implementation was reported by those formally trained in it (P = 0.006) and by those occupying more senior professional functions (P = 0.004). EBP-Beliefs scores predicted 13% of the variance in EBP-Implementation scores (R² = 0.13). EBP is poorly implemented despite positive beliefs about it. Continuing education and support on EBP would help ensure that patients receive the best available care based on high-quality evidence, patient needs, clinical expertise, and a fair distribution of healthcare resources. This study's results will be used to guide institutional strategy to increase the use of EBP in daily practice. © 2016 John Wiley & Sons, Ltd.
CellAnimation: an open source MATLAB framework for microscopy assays.
Georgescu, Walter; Wikswo, John P; Quaranta, Vito
2012-01-01
Advances in microscopy technology have led to the creation of high-throughput microscopes that are capable of generating several hundred gigabytes of images in a few days. Analyzing such a wealth of data manually is nearly impossible and requires an automated approach. There are at present a number of open-source and commercial software packages that allow the user to apply algorithms of different degrees of sophistication to the images and extract desired metrics. However, the types of metrics that can be extracted are severely limited by the specific image processing algorithms that the application implements, and by the expertise of the user. In most commercial software, code unavailability prevents implementation by the end user of newly developed algorithms better suited for a particular type of imaging assay. While it is possible to implement new algorithms in open-source software, rewiring an image processing application requires a high degree of expertise. To obviate these limitations, we have developed an open-source high-throughput application that allows implementation of different biological assays such as cell tracking or ancestry recording, through the use of small, relatively simple image processing modules connected into sophisticated imaging pipelines. By connecting modules, non-expert users can apply the particular combination of well-established and novel algorithms developed by us and others that are best suited for each individual assay type. In addition, our data exploration and visualization modules make it easy to discover or select specific cell phenotypes from a heterogeneous population. CellAnimation is distributed under the Creative Commons Attribution-NonCommercial 3.0 Unported license (http://creativecommons.org/licenses/by-nc/3.0/). CellAnimation source code and documentation may be downloaded from www.vanderbilt.edu/viibre/software/documents/CellAnimation.zip.
Sample data are available at www.vanderbilt.edu/viibre/software/documents/movies.zip. Contact: walter.georgescu@vanderbilt.edu. Supplementary data are available at Bioinformatics online.
Akber Pradhan, Nousheen; Rizvi, Narjis; Sami, Neelofar; Gul, Xaher
2013-01-01
Background Integrated management of childhood illnesses (IMCI) strategy has been proven to improve health outcomes in children under 5 years of age. Pakistan, despite being in the late implementation phase of the strategy, continues to report high under-five mortality due to pneumonia, diarrhea, measles, and malnutrition – the main targets of the strategy. Objective The study determines the factors influencing IMCI implementation at public-sector primary health care (PHC) facilities in Matiari district, Sindh, Pakistan. Design An exploratory qualitative study with an embedded quantitative strand was conducted. The qualitative part included 16 in-depth interviews (IDIs) with stakeholders, including planners and policy makers at a provincial level (n=5), implementers and managers at a district level (n=3), and IMCI-trained physicians posted at PHC facilities (n=8). The quantitative part included a PHC facility survey (n=16) utilizing the WHO health facility assessment tool to assess availability of IMCI essential drugs, supplies, and equipment. Qualitative content analysis was used to interpret the textual information, whereas descriptive frequencies were calculated for health facility survey data. Results The major factors reported to enhance IMCI implementation were knowledge and perception of the strategy and the need for a separate clinic for children under 5 years of age. The latter can facilitate strategy implementation through an allocated workforce and the required equipment and supplies. Constraint factors mainly included lack of clear understanding of the strategy, poor planning for IMCI implementation, ambiguity in defined roles and responsibilities among stakeholders, and insufficient essential supplies and drugs at PHC centers. The latter was further substantiated through health facilities’ survey findings, which indicated that none of the facilities had 100% stock of essential supplies and drugs.
Only one of the 16 surveyed facilities had 75% of the total supplies, while 4 of 16 facilities had 56% of the required IMCI drug stock. The mean availability of supplies ranged from 36.6% to 66%, while the mean availability of drugs ranged from 45.8% to 56.7%. Conclusion Our findings indicate that the Matiari district has sound implementation potential; however, bottlenecks at the health care facility and health care management levels have severely constrained the implementation process. The constraining factors are interdependent: a lack of sound planning results in an unclear understanding of the strategy, which leads to ambiguous roles and responsibilities among stakeholders and manifests as inadequate availability of supplies and drugs at PHC facilities. Addressing these barriers is likely to have a cumulative effect on facilitating IMCI implementation. On the basis of these findings, we recommend that the provincial Ministry of Health (MoH) and the provincial Maternal Neonatal and Child Health (MNCH) program jointly assess the situation and streamline IMCI implementation in the district through sound planning, training, supervision, and logistic support. PMID:23830574
An end-to-end workflow for engineering of biological networks from high-level specifications.
Beal, Jacob; Weiss, Ron; Densmore, Douglas; Adler, Aaron; Appleton, Evan; Babb, Jonathan; Bhatia, Swapnil; Davidsohn, Noah; Haddock, Traci; Loyall, Joseph; Schantz, Richard; Vasilev, Viktor; Yaman, Fusun
2012-08-17
We present a workflow for the design and production of biological networks from high-level program specifications. The workflow is based on a sequence of intermediate models that incrementally translate high-level specifications into DNA samples that implement them. We identify algorithms for translating between adjacent models and implement them as a set of software tools, organized into a four-stage toolchain: Specification, Compilation, Part Assignment, and Assembly. The specification stage begins with a Boolean logic computation specified in the Proto programming language. The compilation stage uses a library of network motifs and cellular platforms, also specified in Proto, to transform the program into an optimized Abstract Genetic Regulatory Network (AGRN) that implements the programmed behavior. The part assignment stage assigns DNA parts to the AGRN, drawing the parts from a database for the target cellular platform, to create a DNA sequence implementing the AGRN. Finally, the assembly stage computes an optimized assembly plan to create the DNA sequence from available part samples, yielding a protocol for producing a sample of engineered plasmids with robotics assistance. Our workflow is the first to automate the production of biological networks from a high-level program specification. Furthermore, the workflow's modular design allows the same program to be realized on different cellular platforms simply by swapping workflow configurations. We validated our workflow by specifying a small-molecule sensor-reporter program and verifying the resulting plasmids in both HEK 293 mammalian cells and in E. coli bacterial cells.
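The four-stage toolchain reads naturally as a pipeline of model-to-model translations. A schematic sketch of that structure, with stage names taken from the paper but stand-in data structures (the real tools exchange Proto programs, AGRNs, and DNA sequences, not these strings):

```python
# Each stage consumes the previous stage's model and emits the next one;
# the concrete representations here are placeholders for illustration only.
def specify(program_text):      return {"logic": program_text}
def compile_network(spec):      return {"agrn": f"network({spec['logic']})"}
def assign_parts(agrn):         return {"dna": f"parts<{agrn['agrn']}>"}
def plan_assembly(design):      return f"protocol[{design['dna']}]"

stages = [specify, compile_network, assign_parts, plan_assembly]
artifact = "sensor AND reporter"
for stage in stages:
    artifact = stage(artifact)
print(artifact)  # protocol[parts<network(sensor AND reporter)>]
```

The modularity the authors highlight corresponds to swapping one stage's implementation (e.g. a different part database for another cellular platform) while the interfaces between stages stay fixed.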
Highly selective rhodium catalyzed domino C-H activation/cyclizations.
Tran, Duc N; Cramer, Nicolai
2011-01-01
The direct functionalization of carbon-hydrogen bonds is an emerging tool to establish more sustainable and efficient synthetic methods. We present its implementation in a cascade reaction that provides a rapid assembly of functionalized indanylamines from simple and readily available starting materials. Careful choice of the ancillary ligand, an electron-rich bidentate phosphine, enables highly diastereoselective rhodium(I)-catalyzed intramolecular allylations of unsubstituted ketimines induced by a directed C-H bond activation and allene carbometalation sequence.
INTERIM ANALYSIS OF THE CONTRIBUTION OF HIGH-LEVEL EVIDENCE FOR DENGUE VECTOR CONTROL.
Horstick, Olaf; Ranzinger, Silvia Runge
2015-01-01
This interim analysis reviews the available systematic literature for dengue vector control on three levels: 1) single and combined vector control methods, with existing work on peridomestic space spraying and on Bacillus thuringiensis israelensis, and further work soon to be available on the use of Temephos, copepods and larvivorous fish; 2) vector control for a specific purpose, such as outbreak control; and 3) the strategic level, for example decentralization versus centralization, with a systematic review on vector control organization. Clear best-practice guidelines for the methodology of entomological studies are needed, and dengue transmission data need to be included as an outcome measure. The following recommendations emerge: although vector control can be effective, implementation remains an issue; single interventions are probably not useful; combinations of interventions have mixed results; careful implementation of vector control measures may be most important; outbreak interventions are often applied with questionable effectiveness.
ImgLib2--generic image processing in Java.
Pietzsch, Tobias; Preibisch, Stephan; Tomancák, Pavel; Saalfeld, Stephan
2012-11-15
ImgLib2 is an open-source Java library for n-dimensional data representation and manipulation with a focus on image processing. It aims at minimizing code duplication by cleanly separating pixel-algebra, data access and data representation in memory. Algorithms can be implemented for classes of pixel types and generic access patterns, by which they become independent of the specific dimensionality, pixel type and data representation. ImgLib2 illustrates that an elegant high-level programming interface can be achieved without sacrificing performance. It provides efficient implementations of common data types, storage layouts and algorithms. It is the data model underlying ImageJ2, the KNIME Image Processing toolbox and an increasing number of Fiji plugins. ImgLib2 is licensed under BSD. Documentation and source code are available at http://imglib2.net and in a public repository at https://github.com/imagej/imglib. Supplementary data are available at Bioinformatics online. Contact: saalfeld@mpi-cbg.de
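The genericity described above, one algorithm serving many pixel types and storage layouts, can be loosely illustrated in Python (ImgLib2 itself is Java and its real API differs): the routine below assumes only that the container is indexable and that its samples support arithmetic, so the same code serves a packed integer buffer and a boxed float list.

```python
# One algorithm, two very different storage layouts: the routine never
# commits to a pixel type or a memory representation.
import array

def scale_in_place(img, factor):
    """Works for any indexable storage whose samples support arithmetic."""
    for i in range(len(img)):
        img[i] = img[i] * factor

ints = array.array('i', [1, 2, 3])   # packed native integer storage
floats = [0.5, 1.5]                  # boxed Python float storage
scale_in_place(ints, 2)
scale_in_place(floats, 2)
print(list(ints), floats)  # [2, 4, 6] [1.0, 3.0]
```

ImgLib2 achieves the same decoupling statically, via Java generics and accessor interfaces, which is how it keeps the abstraction without a runtime performance penalty.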
The Impact of Induction/Mentoring on Job Satisfaction and Retention of Novice Teachers
ERIC Educational Resources Information Center
Williams, Ingrid Rene'e
2012-01-01
Teachers are essential to the success of all students. In an effort to reduce high teacher turnover, states and/or school systems are implementing induction/mentoring programs as a mechanism for supporting teachers in their early years. The issue is not the availability of teachers; higher education is producing more than enough qualified…
Little People, Big Helpers: Implementing Elementary Peer Programs Is Possible and Powerful
ERIC Educational Resources Information Center
Townsend, Ashley C.
2013-01-01
In the last few years, peer programs have grown in popularity around the country, supported by a growing body of research and the rewards of seeing teenagers' lives changed. Elementary peer programs, with their different set of typical issues, schedule demands, and personnel availability, are certainly different from their middle- and high-school…
Low-Cost High-Speed Techniques for Real-Time Simulation of Power Electronic Systems
2007-06-01
first implemented on the RT-Lab using Simulink S-functions. An effort was then initiated to code at least part of the simulation on the available FPGA. It...time simulation, and the use of simulation packages such as Matlab and Spice. The primary purpose of these calculations was to confirm that the
ERIC Educational Resources Information Center
Woo, Stephanie M.; Hepner, Kimberly A.; Gilbert, Elizabeth A.; Osilla, Karen Chan; Hunter, Sarah B.; Munoz, Ricardo F.; Watkins, Katherine E.
2013-01-01
One barrier to widespread public access to empirically supported treatments (ESTs) is the limited availability and high cost of professionals trained to deliver them. Our earlier work from 2 clinical trials demonstrated that front-line addiction counselors could be trained to deliver a manualized, group-based cognitive behavioral therapy (GCBT)…
NASA Technical Reports Server (NTRS)
Dutra, Jayne E.; Smith, Lisa
2006-01-01
The goal of this plan is to briefly describe new technologies available to us in the arenas of information discovery and discuss the strategic value they have for the NASA enterprise with some considerations and suggestions for near term implementations using the NASA Engineering Network (NEN) as a delivery venue.
A Hot-Wire Method Based Thermal Conductivity Measurement Apparatus for Teaching Purposes
ERIC Educational Resources Information Center
Alvarado, S.; Marin, E.; Juarez, A. G.; Calderon, A.; Ivanov, R.
2012-01-01
The implementation of an automated system based on the hot-wire technique is described for the measurement of the thermal conductivity of liquids using equipment easily available in modern physics laboratories at high schools and universities (basically a precision current source and a voltage meter, a data acquisition card, a personal computer…
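The hot-wire technique such an apparatus implements rests on a standard relation: after an initial transient, the wire's temperature rise grows linearly in ln(t) with slope q/(4πk), where q is the heat input per unit wire length and k the thermal conductivity. A sketch of the data reduction, with illustrative numbers rather than values from the apparatus described in the paper:

```python
# Transient hot-wire data reduction:
#   dT(t) = (q / (4*pi*k)) * ln(t) + C
# so k follows from the least-squares slope of dT versus ln(t).
import math

q = 1.2          # W/m, heat input per unit length of wire (illustrative)
k_true = 0.60    # W/(m K), roughly water at room temperature
times = [1.0, 2.0, 4.0, 8.0, 16.0]                               # s
dT = [q / (4 * math.pi * k_true) * math.log(t) + 0.3 for t in times]

# Least-squares slope of dT against ln(t)
x = [math.log(t) for t in times]
n = len(x)
xbar, ybar = sum(x) / n, sum(dT) / n
slope = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, dT)) \
        / sum((xi - xbar) ** 2 for xi in x)

k_est = q / (4 * math.pi * slope)   # recovered conductivity
print(round(k_est, 3))  # 0.6
```

In the described setup, the current source supplies q, the voltmeter and acquisition card record the temperature rise (via the wire's resistance), and the computer performs exactly this fit.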
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-30
... the extension dependent on weather conditions and availability of large medium and giant BFT to the... vessel. NMFS may increase or decrease the actual allowed daily retention limit of large medium and giant... NMFS implements an increase to the Harpoon category daily incidental retention limit of large medium...
ERIC Educational Resources Information Center
Wheatley, Rikki K.; West, Richard P.; Charlton, Cade T.; Sanders, Richard B.; Smith, Tim G.; Taylor, Matthew J.
2009-01-01
Schools are often in need of low-cost, high-impact strategies to improve student behavior in school common areas. While many behavior management programs exist, there are few resources available to guide the implementation of these programs and ensure they are grounded in evidence-based strategies. Therefore, the current study had two primary…
ERIC Educational Resources Information Center
Karahalis, John
2011-01-01
This researcher addresses whether transition planning with classified special education students is being implemented in accordance with the Kohler model for successful transition planning. Using archival data available from two high schools, one with a specialized on campus program and the latter as an excluded site specializing in classified…
Effectiveness of three post-fire rehabilitation treatments in the Colorado Front Range
J. W. Wagenbrenner; L. H. MacDonald; D. Rough
2006-01-01
Post-fire rehabilitation treatments are commonly implemented after high-severity wildfires, but few data are available about the efficacy of these treatments. This study assessed post-fire erosion rates and the effectiveness of seeding, straw mulching, and contour felling in reducing erosion after a June 2000 wildfire northwest of Loveland, Colorado. Site...
NASA Astrophysics Data System (ADS)
Yan, Hui; Wang, K. G.; Jones, Jim E.
2016-06-01
A parallel algorithm for large-scale three-dimensional phase-field simulations of phase coarsening is developed and implemented on high-performance architectures. From the large-scale simulations, a new kinetics of phase coarsening in the region of ultrahigh volume fraction is found. The parallel implementation is capable of harnessing the greater computing power available from high-performance architectures. The parallelized code enables an increase in three-dimensional simulation system size up to a 512³ grid cube. Through the parallelized code, practical runtimes can be achieved for three-dimensional large-scale simulations, and the statistical significance of the results from these high-resolution parallel simulations is greatly improved over that obtainable from serial simulations. A detailed performance analysis of speed-up and scalability is presented, showing good scalability that improves with increasing problem size. In addition, a model for prediction of runtime is developed, which shows good agreement with actual run times from numerical tests.
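The paper's fitted runtime model is not reproduced here, but a hypothetical model of the same flavor, compute time proportional to the local grid volume per rank plus communication proportional to the halo surface each rank exchanges, illustrates why scalability improves with problem size. The function and coefficients below are assumptions for illustration only:

```python
# Toy runtime model for a 3-D domain-decomposed stencil code:
# interior work scales as n^3 / p, halo exchange as the subdomain surface.
def predicted_runtime(n, p, a=1e-7, b=1e-6):
    """n: grid points per cube side; p: ranks (assumed a perfect cube)."""
    per_rank = n ** 3 / p                 # interior updates per rank
    side = n / round(p ** (1 / 3))        # local subdomain edge length
    halo = 6 * side ** 2                  # six faces exchanged per step
    return a * per_rank + b * halo

t1 = predicted_runtime(512, 1)
t64 = predicted_runtime(512, 64)
print(round(t1 / t64, 1))   # parallel speed-up on 64 ranks
```

Because the volume term shrinks as 1/p while the surface term shrinks only as p^(-2/3), communication eats a growing fraction of the time at fixed n; enlarging the grid restores the volume-to-surface ratio, which is the weak-scaling behavior the abstract reports.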
Fidelity of Implementation of Research Experience for Teachers in the Classroom
NASA Astrophysics Data System (ADS)
Sen, Tapati
In this study, the Arizona State University Mathematics and Science Teaching Fellows 2010 program was analyzed qualitatively from start to finish to determine the impact of the research experience on teachers in the classroom. The sample for the study was the 2010 cohort of eight high school science teachers. Erickson’s (1986) interpretive, participant observational fieldwork method was used to report data by means of detailed descriptions of the research experience and classroom implementation. Data was collected from teacher documents, interviews, and observations. The findings revealed various factors that were responsible for an ineffective implementation of the research experience in the classroom such as research experience, curriculum support, availability of resources, and school curriculum. Implications and recommendations for future programs are discussed in the study.
Supporting decision-making processes for evidence-based mental health promotion.
Jané-Llopis, Eva; Katschnig, Heinz; McDaid, David; Wahlbeck, Kristian
2011-12-01
The use of evidence is critical in guiding decision-making, but evidence from effect studies will be only one of a number of factors that will need to be taken into account in the decision-making processes. Equally important for policymakers will be the use of different types of evidence including implementation essentials and other decision-making principles such as social justice, political, ethical, equity issues, reflecting public attitudes and the level of resources available, rather than be based on health outcomes alone. This paper, aimed to support decision-makers, highlights the importance of commissioning high-quality evaluations, the key aspects to assess levels of evidence, the importance of supporting evidence-based implementation and what to look out for before, during and after implementation of mental health promotion and mental disorder prevention programmes.
Technical considerations for implementation of x-ray CT polymer gel dosimetry.
Hilts, M; Jirasek, A; Duzenli, C
2005-04-21
Gel dosimetry is the most promising 3D dosimetry technique in current radiation therapy practice. X-ray CT has been shown to be a feasible method of reading out polymer gel dosimeters and, with the high accessibility of CT scanners to cancer hospitals, presents an exciting possibility for clinical implementation of gel dosimetry. In this study we report on technical considerations for implementation of x-ray CT polymer gel dosimetry. Specifically phantom design, CT imaging methods, imaging time requirements and gel dose response are investigated. Where possible, recommendations are made for optimizing parameters to enhance system performance. The dose resolution achievable with an optimized system is calculated given voxel size and imaging time constraints. Results are compared with MRI and optical CT polymer gel dosimetry results available in the literature.
Embedded Implementation of VHR Satellite Image Segmentation
Li, Chao; Balla-Arabé, Souleymane; Ginhac, Dominique; Yang, Fan
2016-01-01
Processing and analysis of Very High Resolution (VHR) satellite images provide a mass of crucial information, which can be used for urban planning, security issues or environmental monitoring. However, they are computationally expensive and, thus, time consuming, while some of the applications, such as natural disaster monitoring and prevention, require high-efficiency performance. Fortunately, parallel computing techniques and embedded systems have made great progress in recent years, and a series of massively parallel image processing devices, such as digital signal processors or Field Programmable Gate Arrays (FPGAs), have become available to engineers at a very convenient price, demonstrating significant advantages in terms of running cost, embeddability, power consumption, flexibility, etc. In this work, we designed a texture region segmentation method for very high resolution satellite images by using the level set algorithm and multi-kernel theory in a high-abstraction C environment, and realized its register-transfer-level implementation with the help of a newly proposed high-level-synthesis-based design flow. The evaluation experiments demonstrate that the proposed design can produce high-quality image segmentation with a significant running-cost advantage. PMID:27240370
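The level-set machinery underlying such a segmenter can be sketched in one dimension (a toy illustration; the paper's multi-kernel texture method and FPGA realization are far richer): the contour is stored implicitly as the zero level of a function phi, which is advected under a speed field.

```python
# Toy 1-D level-set evolution: phi_new = phi - dt * F * |grad(phi)|.
# The region "inside" the contour is wherever phi is negative.
def evolve(phi, F, dt, steps):
    for _ in range(steps):
        grad = [abs(phi[min(i + 1, len(phi) - 1)] - phi[max(i - 1, 0)]) / 2
                for i in range(len(phi))]
        phi = [p - dt * f * g for p, f, g in zip(phi, F, grad)]
    return phi

# Signed distance to an interface at x = 3 on a 1-D grid of 8 cells.
phi = [x - 3.0 for x in range(8)]
F = [1.0] * 8                   # uniform outward speed: interface moves right
phi = evolve(phi, F, dt=0.5, steps=4)
inside = [x for x, p in enumerate(phi) if p < 0]
print(inside)  # [0, 1, 2, 3, 4]
```

The per-cell update touches only immediate neighbors, which is precisely the locality that makes level-set iterations attractive for massively parallel hardware such as FPGAs.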
Hoffman, Jessica A.; Rosenfeld, Lindsay; Schmidt, Nicole; Cohen, Juliana F. W.; Gorski, Mary; Chaffee, Ruth; Smith, Lauren; Rimm, Eric B.
2015-01-01
Background During 2012, Massachusetts adopted comprehensive school competitive food and beverage standards that closely align with Institute of Medicine recommendations and Smart Snacks in School national standards. Objective We examined the extent to which a sample of Massachusetts middle schools and high schools sold foods and beverages that were compliant with the state competitive food and beverage standards after the first year of implementation, and complied with four additional aspects of the regulations. Design Observational cohort study with data collected before implementation (Spring 2012) and 1 year after implementation (Spring 2013). Participants/setting School districts (N=37) with at least one middle school and one high school participated. Main outcome measures Percent of competitive foods and beverages that were compliant with Massachusetts standards and compliance with four additional aspects of the regulations. Data were collected via school site visits and a foodservice director questionnaire. Statistical analyses performed Multilevel models were used to examine change in food and beverage compliance over time. Results More products were available in high schools than middle schools at both time points. The number of competitive beverages and several categories of competitive food products sold in the sample of Massachusetts schools decreased following the implementation of the standards. Multilevel models demonstrated a 47-percentage-point increase in food and 46-percentage-point increase in beverage compliance in Massachusetts schools from 2012 to 2013. Overall, total compliance was higher for beverages than foods. Conclusions This study of a group of Massachusetts schools demonstrated the feasibility of schools making substantial changes in response to requirements for healthier competitive foods, even in the first year of implementation. PMID:26210085
Hoffman, Jessica A; Rosenfeld, Lindsay; Schmidt, Nicole; Cohen, Juliana F W; Gorski, Mary; Chaffee, Ruth; Smith, Lauren; Rimm, Eric B
2015-08-01
During 2012, Massachusetts adopted comprehensive school competitive food and beverage standards that closely align with Institute of Medicine recommendations and Smart Snacks in School national standards. We examined the extent to which a sample of Massachusetts middle schools and high schools sold foods and beverages that were compliant with the state competitive food and beverage standards after the first year of implementation, and complied with four additional aspects of the regulations. Observational cohort study with data collected before implementation (Spring 2012) and 1 year after implementation (Spring 2013). School districts (N=37) with at least one middle school and one high school participated. Percent of competitive foods and beverages that were compliant with Massachusetts standards and compliance with four additional aspects of the regulations. Data were collected via school site visits and a foodservice director questionnaire. Multilevel models were used to examine change in food and beverage compliance over time. More products were available in high schools than middle schools at both time points. The number of competitive beverages and several categories of competitive food products sold in the sample of Massachusetts schools decreased following the implementation of the standards. Multilevel models demonstrated a 47-percentage-point increase in food and 46-percentage-point increase in beverage compliance in Massachusetts schools from 2012 to 2013. Overall, total compliance was higher for beverages than foods. This study of a group of Massachusetts schools demonstrated the feasibility of schools making substantial changes in response to requirements for healthier competitive foods, even in the first year of implementation. Copyright © 2015 Academy of Nutrition and Dietetics. Published by Elsevier Inc. All rights reserved.
High-Performance Java Codes for Computational Fluid Dynamics
NASA Technical Reports Server (NTRS)
Riley, Christopher; Chatterjee, Siddhartha; Biswas, Rupak; Biegel, Bryan (Technical Monitor)
2001-01-01
The computational science community is reluctant to write large-scale computationally intensive applications in Java due to concerns over Java's poor performance, despite the claimed software engineering advantages of its object-oriented features. Naive Java implementations of numerical algorithms can perform poorly compared to corresponding Fortran or C implementations. To achieve high performance, Java applications must be designed with good performance as a primary goal. This paper presents the object-oriented design and implementation of two real-world applications from the field of Computational Fluid Dynamics (CFD): a finite-volume fluid flow solver (LAURA, from NASA Langley Research Center), and an unstructured mesh adaptation algorithm (2D_TAG, from NASA Ames Research Center). This work builds on our previous experience with the design of high-performance numerical libraries in Java. We examine the performance of the applications using the currently available Java infrastructure and show that the Java version of the flow solver LAURA performs almost within a factor of 2 of the original procedural version. Our Java version of the mesh adaptation algorithm 2D_TAG performs within a factor of 1.5 of its original procedural version on certain platforms. Our results demonstrate that object-oriented software design principles are not necessarily inimical to high performance.
Update on bacterial meningitis: epidemiology, trials and genetic association studies.
Kasanmoentalib, E Soemirien; Brouwer, Matthijs C; van de Beek, Diederik
2013-06-01
Bacterial meningitis is a life-threatening disease that continues to inflict a heavy toll. We reviewed recent advances in vaccination, randomized studies on treatment, and genetic association studies in bacterial meningitis. The incidence of bacterial meningitis has decreased after implementation of vaccines, and further implementation of existing conjugate vaccines particularly in low-income countries is expected to reduce the global disease burden. Several randomized studies have been performed recently in this field. Clinical studies showed that short duration (5 days) of antibiotic treatment is as effective as longer duration treatment in low-income countries, and that dexamethasone decreases death and neurological sequelae in high-income countries. Ongoing trials will further define the role of paracetamol, glycerol and hypothermia in bacterial meningitis. Genetic association studies identified pathophysiological mechanisms that could be counteracted in experimental meningitis, providing promising leads for future treatments. Conjugate vaccines have reduced the burden of bacterial meningitis in high-income countries, but implementation of available vaccines in low-income countries is necessary to reduce disease burden worldwide. Adjunctive dexamethasone therapy has beneficial effects in patients with bacterial meningitis but only in high-income countries. Genetic association studies may reveal targets for new treatment strategies.
PuReMD-GPU: A reactive molecular dynamics simulation package for GPUs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kylasa, S.B., E-mail: skylasa@purdue.edu; Aktulga, H.M., E-mail: hmaktulga@lbl.gov; Grama, A.Y., E-mail: ayg@cs.purdue.edu
2014-09-01
We present an efficient and highly accurate GP-GPU implementation of our community code, PuReMD, for reactive molecular dynamics simulations using the ReaxFF force field. PuReMD and its incorporation into LAMMPS (Reax/C) are used by a large number of research groups worldwide for simulating diverse systems ranging from biomembranes to explosives (RDX) at atomistic level of detail. The sub-femtosecond time-steps associated with ReaxFF strongly motivate significant improvements to per-timestep simulation time through effective use of GPUs. This paper presents, in detail, the design and implementation of PuReMD-GPU, which enables ReaxFF simulations on GPUs, as well as various performance optimization techniques we developed to obtain high performance on state-of-the-art hardware. Comprehensive experiments on model systems (bulk water and amorphous silica) are presented to quantify the performance improvements achieved by PuReMD-GPU and to verify its accuracy. In particular, our experiments show up to 16× improvement in runtime compared to our highly optimized CPU-only single-core ReaxFF implementation. PuReMD-GPU is a unique production code, and is currently available on request from the authors.
Implementation of the high-order schemes QUICK and LECUSSO in the COMMIX-1C Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sakai, K.; Sun, J.G.; Sha, W.T.
Multidimensional analysis computer programs based on the finite volume method, such as COMMIX-1C, have been commonly used to simulate thermal-hydraulic phenomena in engineering systems such as nuclear reactors. In COMMIX-1C, the first-order schemes with respect to both space and time are used. In many situations such as flow recirculations and stratifications with steep gradient of velocity and temperature fields, however, high-order difference schemes are necessary for an accurate prediction of the fields. For these reasons, two second-order finite difference numerical schemes, QUICK (Quadratic Upstream Interpolation for Convective Kinematics) and LECUSSO (Local Exact Consistent Upwind Scheme of Second Order), have been implemented in the COMMIX-1C computer code. The formulations were derived for general three-dimensional flows with nonuniform grid sizes. Numerical oscillation analyses for QUICK and LECUSSO were performed. To damp the unphysical oscillations which occur in calculations with high-order schemes at high mesh Reynolds numbers, a new FRAM (Filtering Remedy and Methodology) scheme was developed and implemented. To be consistent with the high-order schemes, the pressure equation and the boundary conditions for all the conservation equations were also modified to be of second order. The new capabilities in the code are listed. Test calculations were performed to validate the implementation of the high-order schemes. They include the test of the one-dimensional nonlinear Burgers equation, two-dimensional scalar transport in two impinging streams, von Karman vortex shedding, shear driven cavity flow, Couette flow, and circular pipe flow. The calculated results were compared with available data; the agreement is good.
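The QUICK scheme named above interpolates a convected face value from a quadratic fit through two upstream nodes and one downstream node. As a minimal illustration (not the COMMIX-1C implementation, which handles three dimensions and nonuniform grids), the uniform-grid, positive-flow form of the stencil can be sketched as:

```python
def quick_face_values(phi):
    """QUICK face values phi[i+1/2] on a uniform 1-D grid with flow in
    the +x direction: a quadratic fit through the far-upstream (i-1),
    upstream (i) and downstream (i+1) nodes gives
    phi_f = 6/8*phi[i] + 3/8*phi[i+1] - 1/8*phi[i-1]."""
    return [6/8 * phi[i] + 3/8 * phi[i + 1] - 1/8 * phi[i - 1]
            for i in range(1, len(phi) - 1)]
```

Because the fit is quadratic, the stencil is exact for linear and quadratic fields; this second-order accuracy is what the first-order upwind scheme it replaces lacks, at the cost of the unphysical oscillations the FRAM filter is designed to damp.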
Product reformulation and nutritional improvements after new competitive food standards in schools.
Jahn, Jaquelyn L; Cohen, Juliana Fw; Gorski-Findling, Mary T; Hoffman, Jessica A; Rosenfeld, Lindsay; Chaffee, Ruth; Smith, Lauren; Rimm, Eric B
2018-04-01
In 2012, Massachusetts enacted school competitive food and beverage standards similar to national Smart Snacks. These standards aim to improve the nutritional quality of competitive snacks. It was previously demonstrated that a majority of foods and beverages were compliant with the standards, but it was unknown whether food manufacturers reformulated products in response to the standards. The present study assessed whether products were reformulated after standards were implemented; the availability of reformulated products outside schools; and whether compliance with the standards improved the nutrient composition of competitive snacks. An observational cohort study documenting all competitive snacks sold before (2012) and after (2013 and 2014) the standards were implemented. The sample included thirty-six school districts with both a middle and high school. After 2012, energy, saturated fat, Na and sugar decreased and fibre increased among all competitive foods. By 2013, 8 % of foods were reformulated, as were an additional 9 % by 2014. Nearly 15 % of reformulated foods were look-alike products that could not be purchased at supermarkets. Energy and Na in beverages decreased after 2012, in part facilitated by smaller package sizes. Massachusetts' law was effective in improving the nutritional content of snacks and product reformulation helped schools adhere to the law. This suggests fully implementing Smart Snacks standards may similarly improve the foods available in schools nationally. However, only some healthier reformulated foods were available outside schools.
Mirtschin, Joanne G; Forbes, Sara F; Cato, Louise E; Heikura, Ida A; Strobel, Nicki; Hall, Rebecca; Burke, Louise M
2018-02-12
We describe the implementation of a 3-week dietary intervention in elite race walkers at the Australian Institute of Sport, with a focus on the resources and strategies needed to accomplish a complex study of this scale. Interventions involved: traditional guidelines of high carbohydrate (CHO) availability for all training sessions (HCHO); a periodized CHO diet (PCHO), which integrated sessions with low CHO and high CHO availability within the same total CHO intake; and a ketogenic low-CHO high-fat diet (LCHF). Seven-day menus and recipes were constructed for a communal eating setting to meet nutritional goals as well as individualized food preferences and special needs. Menus also included nutrition support pre-, during and post-exercise. Daily monitoring, via observation and food checklists, showed that energy and macronutrient targets were achieved: diets were matched for energy (~14.8 MJ/d) and protein (~2.1 g/kg/d), and achieved the desired differences for fat and CHO (HCHO and PCHO: CHO = 8.5 g/kg/d, 60% of energy, fat = 20% of energy; LCHF: CHO = 0.5 g/kg/d, fat = 78% of energy). There were no differences in micronutrient intakes or density between the HCHO and PCHO diets; however, the micronutrient density of LCHF was significantly lower. Daily food costs per athlete were similar for each diet (~AUD $27 ± 10). Successful implementation and monitoring of dietary interventions in sports nutrition research of the scale of the present study require meticulous planning and the expertise of chefs and sports dietitians. Different approaches to sports nutrition support raise practical challenges around cost, micronutrient density, accommodation of special needs and sustainability.
Tobacco control laws in Pakistan and their implementation: A pilot study in Karachi.
Khan, Javaid Ahmed; Amir Humza Sohail, Abdul Malik; Arif Maan, Muhammad Arslan
2016-07-01
In order to limit the high prevalence of tobacco use in Pakistan, various tobacco control laws have been implemented. The objective of this study is to serve as a pilot study assessing the implementation of these laws in the largest city of Pakistan, Karachi. A cross-sectional study was conducted in Karachi. The implementation of tobacco control laws in 'smoke-free' places, the adherence of tobacco companies to these laws, the regulation of cigarette sale, and the awareness and views of the general public regarding tobacco control laws were assessed via direct observation during visits and through self-administered questionnaires. The implementation of tobacco control laws in 'smoke-free' public places was found to be poor. Out of 37 brands, only 23 (62%) displayed pictorial warnings on their packs; 3 (8%) of the brands were available in two different kinds of packs, both with and without pictorial warnings. Cigarette sale to minors was taking place at 80 (85%) of the visited cigarette outlets, and 50 (53%) of the outlets displayed cigarette advertisements in the form of posters. Of the persons questioned, 46 (40%) were aware of the existence of the ban on smoking in public places, and 126 (90%) of these were in favour of it. The implementation of tobacco control law in Pakistan is poor. Non-adherence to the law in public places was alarmingly high. The study also demonstrates the poor compliance of tobacco companies with the tobacco control laws. The sale of cigarettes is almost unregulated.
Fast and Adaptive Lossless Onboard Hyperspectral Data Compression System
NASA Technical Reports Server (NTRS)
Aranki, Nazeeh I.; Keymeulen, Didier; Kimesh, Matthew A.
2012-01-01
Modern hyperspectral imaging systems are able to acquire far more data than can be downlinked from a spacecraft. Onboard data compression helps to alleviate this problem, but requires a system capable of power efficiency and high throughput. Software solutions have limited throughput performance and are power-hungry. Dedicated hardware solutions can provide both high throughput and power efficiency, while taking the load off the main processor. Thus a hardware compression system was developed. The implementation uses a field-programmable gate array (FPGA). The implementation is based on the fast lossless (FL) compression algorithm reported in Fast Lossless Compression of Multispectral-Image Data (NPO-42517), NASA Tech Briefs, Vol. 30, No. 8 (August 2006), page 26, which achieves excellent compression performance and has low complexity. This algorithm performs predictive compression using an adaptive filtering method, and uses adaptive Golomb coding. The implementation also packetizes the coded data. The FL algorithm is well suited for implementation in hardware. In the FPGA implementation, one sample is compressed every clock cycle, which makes for a fast and practical real-time solution for space applications. Benefits of this implementation are: 1) The underlying algorithm achieves a combination of low complexity and compression effectiveness that exceeds that of techniques currently in use. 2) The algorithm requires no training data or other specific information about the nature of the spectral bands for a fixed instrument dynamic range. 3) Hardware acceleration provides a throughput improvement of 10 to 100 times vs. the software implementation. A prototype of the compressor is available in software, but it runs at a speed that does not meet spacecraft requirements. The hardware implementation targets the Xilinx Virtex IV FPGAs, and makes the use of this compressor practical for Earth satellites as well as beyond-Earth missions with hyperspectral instruments.
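The predict-then-code pipeline described above can be illustrated with a much-simplified sketch: a previous-sample predictor whose residuals are mapped to non-negative integers and Rice-coded (the power-of-two special case of Golomb coding). The real FL algorithm uses adaptive filtering for prediction and adapts the coding parameter as it runs; the fixed parameter k and the function names here are illustrative only.

```python
def zigzag(e):
    # Map a signed prediction residual to a non-negative integer
    # (0, -1, 1, -2, 2, ... -> 0, 1, 2, 3, 4, ...).
    return 2 * e if e >= 0 else -2 * e - 1

def rice_encode(m, k):
    # Rice code: unary-coded quotient (q ones and a zero),
    # followed by the k low-order remainder bits.
    q = m >> k
    bits = "1" * q + "0"
    if k:
        bits += format(m & ((1 << k) - 1), f"0{k}b")
    return bits

def compress(samples, k=2):
    # Previous-sample predictor; a real coder would transmit the
    # first sample raw and adapt k to the local residual statistics.
    out = []
    prev = samples[0]
    for s in samples[1:]:
        out.append(rice_encode(zigzag(s - prev), k))
        prev = s
    return "".join(out)
```

Because each output codeword depends only on one residual and a small amount of state, this style of coder maps naturally onto the one-sample-per-clock-cycle FPGA pipeline the abstract describes.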
NASA Astrophysics Data System (ADS)
Kalumba, Mulenga; Nyirenda, Edwin
2017-12-01
The Government of the Republic of Zambia (GRZ) will install a new hydropower station, Kafue Gorge Lower, downstream of the existing Kafue Gorge Station (KGS) and plans to start operating the Itezhi-Tezhi (ITT) hydropower facility in the Kafue Basin. The Basin has significant biodiversity hot spots such as the Luangwa National Park and the Kafue Flats. It is described as a Man-and-Biosphere reserve, and the National Park is a designated World Heritage Site hosting a variety of wildlife species. All these natural reserves demand special protection, and environmental flow requirements (e-flows) have been identified as necessary to preserve these ecosystems. Implementation of e-flows is therefore a priority as Zambia considers installing more hydropower facilities. However, before allocating e-flows, it is necessary to first assess the river flow available for allocation at existing hydropower stations in the Kafue Basin. The river flow availability in the basin was checked by assessing the variability in low and high flows, since the timing, frequency and duration of extreme droughts and floods (caused by low and high flows) are all important hydrological characteristics of a flow regime that affect e-flows. A 41-year monthly river flow time series (1973-2014) was used to extract independent low and high flows using the Water Engineering Time Series Processing Tool (WETSPRO). The low and high flows were used to construct cumulative frequency distribution curves that were compared and analysed to show their variation over a long period. A water balance of each hydropower station was used to check the river flow allocation by comparing the calculated water balance outflow (river flow) with the observed river flow, the hydropower demand and the consumptive water rights downstream of each hydropower station.
In drought periods about 50-100 m3/s of river flow is available or discharged at both the ITT and KGS stations, whereas in extreme flood events about 1300-1500 m3/s of river flow is available. There is river flow available in the wet and dry seasons for e-flow allocation at ITT. On average, 25 m3/s per month is allocated for e-flows at ITT for downstream purposes. On the other hand, it may be impossible to implement e-flows at KGS with the limited available outflow (river flow). The available river flow from ITT plays a vital role in satisfying the current hydropower generating capacity at KGS. Therefore, the operations of KGS depend heavily on the available outflow (river flow) from ITT.
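The water-balance check described above reduces to a simple budget: the river flow available downstream of a station is what remains of the inflow after storage change and abstractions. The sketch below is an illustrative simplification of the study's method; the function names, and the inclusion of an explicit evaporation term, are assumptions.

```python
def available_outflow(inflow, storage_change, consumptive_use, evaporation=0.0):
    """Monthly reservoir water balance (all terms as mean flows, m^3/s):
    outflow = inflow - change in storage - consumptive water rights - evaporation."""
    return inflow - storage_change - consumptive_use - evaporation

def eflow_feasible(outflow, eflow_target):
    """An environmental flow allocation is feasible only when the
    available outflow meets or exceeds the target."""
    return outflow >= eflow_target
```

With the abstract's figures, even a drought-period outflow of ~50 m3/s covers the 25 m3/s e-flow allocation at ITT, which is consistent with the conclusion that e-flows are feasible at ITT but constrained at KGS.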
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-14
... Promulgation of Air Quality Implementation Plans; Maryland; Reasonably Available Control Technology for the 1997 8-Hour Ozone National Ambient Air Quality Standard AGENCY: Environmental Protection Agency (EPA... available control technology (RACT) for the 1997 8-hour ozone national ambient air quality standard (NAAQS...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-23
... ENVIRONMENTAL PROTECTION AGENCY 40 CFR Part 52 [EPA-R02-OAR-2009-0462, FRL-9178-5] Approval and Promulgation of Implementation Plans; New York Reasonably Available Control Technology and Reasonably Available Control Measures AGENCY: Environmental Protection Agency (EPA). ACTION: Final rule. SUMMARY: EPA is...
Cyber-T web server: differential analysis of high-throughput data.
Kayala, Matthew A; Baldi, Pierre
2012-07-01
The Bayesian regularization method for high-throughput differential analysis, described in Baldi and Long (A Bayesian framework for the analysis of microarray expression data: regularized t-test and statistical inferences of gene changes. Bioinformatics 2001; 17: 509-519) and implemented in the Cyber-T web server, is one of the most widely validated approaches. Cyber-T implements a t-test using a Bayesian framework to compute a regularized variance of the measurements associated with each probe under each condition. This regularized estimate is derived by flexibly combining the empirical measurements with a prior, or background, derived from pooling measurements associated with probes in the same neighborhood. This approach flexibly addresses problems associated with low replication levels and technology biases, not only for DNA microarrays, but also for other technologies, such as protein arrays, quantitative mass spectrometry and next-generation sequencing (RNA-seq). Here we present an update to the Cyber-T web server, incorporating several useful new additions and improvements. Several preprocessing and data normalization options, including logarithmic and variance stabilizing normalization (VSN) transforms, are included. To augment two-sample t-tests, a one-way analysis of variance is implemented. Several methods for multiple tests correction, including standard frequentist methods and a probabilistic mixture model treatment, are available. Diagnostic plots allow visual assessment of the results. The web server provides comprehensive documentation and example data sets. The Cyber-T web server, with R source code and data sets, is publicly available at http://cybert.ics.uci.edu/.
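The regularized variance at the heart of Cyber-T can be sketched as a weighted combination of a background variance (pooled from probes in the same neighborhood) and the probe's own empirical variance, in the spirit of the Baldi and Long (2001) formula. This is a simplified reading of that formula, not the server's code, and the parameter names are assumptions:

```python
import statistics

def regularized_variance(values, prior_var, prior_df):
    """Bayesian regularized variance: the background (prior) variance,
    weighted by a pseudo-count of prior_df observations, is combined
    with the empirical variance of the n replicate measurements."""
    n = len(values)
    s2 = statistics.variance(values)
    return (prior_df * prior_var + (n - 1) * s2) / (prior_df + n - 2)
```

With few replicates (small n) the prior dominates, stabilizing the denominator of the t-statistic; with many replicates the estimate approaches the ordinary sample variance. This is exactly the low-replication problem the abstract says the method addresses.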
Multidimensional optimal droop control for wind resources in DC microgrids
NASA Astrophysics Data System (ADS)
Bunker, Kaitlyn J.
Two important and upcoming technologies, microgrids and electricity generation from wind resources, are increasingly being combined. Various control strategies can be implemented, and droop control provides a simple option without requiring communication between microgrid components. Eliminating the single source of potential failure around the communication system is especially important in remote, islanded microgrids, which are considered in this work. However, traditional droop control does not allow the microgrid to utilize much of the power available from the wind. This dissertation presents a novel droop control strategy, which implements a droop surface in higher dimension than the traditional strategy. The droop control relationship then depends on two variables: the dc microgrid bus voltage, and the wind speed at the current time. An approach for optimizing this droop control surface in order to meet a given objective, for example utilizing all of the power available from a wind resource, is proposed and demonstrated. Various cases are used to test the proposed optimal high dimension droop control method, and demonstrate its function. First, the use of linear multidimensional droop control without optimization is demonstrated through simulation. Next, an optimal high dimension droop control surface is implemented with a simple dc microgrid containing two sources and one load. Various cases for changing load and wind speed are investigated using simulation and hardware-in-the-loop techniques. Optimal multidimensional droop control is demonstrated with a wind resource in a full dc microgrid example, containing an energy storage device as well as multiple sources and loads. Finally, the optimal high dimension droop control method is applied with a solar resource, and using a load model developed for a military patrol base application. The operation of the proposed control is again investigated using simulation and hardware-in-the-loop techniques.
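The higher-dimension droop relationship described above maps two inputs, the dc bus voltage and the current wind speed, to a power command. As a hedged sketch of the lookup step only (the dissertation obtains the surface itself by optimization; the tabulated-grid representation, function name, and values here are hypothetical), a droop surface can be evaluated by bilinear interpolation:

```python
import bisect

def droop_lookup(surface, voltages, wind_speeds, v, w):
    """Bilinear interpolation of a droop surface P(v, w) tabulated on a
    rectangular grid: `voltages` indexes the rows of `surface`,
    `wind_speeds` the columns.  Queries are clamped to the grid edges."""
    i = min(max(bisect.bisect_right(voltages, v) - 1, 0), len(voltages) - 2)
    j = min(max(bisect.bisect_right(wind_speeds, w) - 1, 0), len(wind_speeds) - 2)
    tv = (v - voltages[i]) / (voltages[i + 1] - voltages[i])
    tw = (w - wind_speeds[j]) / (wind_speeds[j + 1] - wind_speeds[j])
    return ((1 - tv) * (1 - tw) * surface[i][j]
            + tv * (1 - tw) * surface[i + 1][j]
            + (1 - tv) * tw * surface[i][j + 1]
            + tv * tw * surface[i + 1][j + 1])
```

A lookup like this preserves the key property of droop control: each source computes its own setpoint from locally measurable quantities, with no communication between microgrid components.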
HDL Based FPGA Interface Library for Data Acquisition and Multipurpose Real Time Algorithms
NASA Astrophysics Data System (ADS)
Fernandes, Ana M.; Pereira, R. C.; Sousa, J.; Batista, A. J. N.; Combo, A.; Carvalho, B. B.; Correia, C. M. B. A.; Varandas, C. A. F.
2011-08-01
The inherent parallelism of the logic resources, the flexibility in its configuration and the performance at high processing frequencies make the field programmable gate array (FPGA) the most suitable device to be used both for real-time algorithm processing and data transfer in instrumentation modules. Moreover, the reconfigurability of these FPGA-based modules enables exploiting different applications on the same module. When using a reconfigurable module for various applications, the availability of a common interface library for easier implementation of the algorithms on the FPGA leads to more efficient development. The FPGA configuration is usually specified in a hardware description language (HDL) or other higher-level descriptive language. The critical paths, such as the management of internal hardware clocks, require deep knowledge of the module's behavior and shall be implemented in HDL to optimize the timing constraints. The common interface library should include these critical paths, freeing the application designer from hardware complexity and allowing the use of any of the available high-level abstraction languages for the algorithm implementation. With this purpose, a modular Verilog code was developed for the Virtex 4 FPGA of the in-house Transient Recorder and Processor (TRP) hardware module, based on the Advanced Telecommunications Computing Architecture (ATCA), with eight channels sampling at up to 400 MSamples/s (MSPS). The TRP was designed to perform real-time Pulse Height Analysis (PHA), Pulse Shape Discrimination (PSD) and Pile-Up Rejection (PUR) algorithms at a high count rate (a few Mevent/s). A brief description of this modular code is presented and examples of its use as an interface with end-user algorithms, including a PHA with PUR, are described.
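As a toy illustration of what a pulse-height-analysis algorithm with pile-up rejection does (in software rather than FPGA logic; the threshold-and-dead-time scheme sketched here is a common textbook approach, not necessarily the TRP module's actual algorithm):

```python
def pulse_height_analysis(samples, threshold, dead_time):
    """Toy PHA with pile-up rejection: find local maxima above a
    threshold, then discard as pile-up any peaks whose neighbors fall
    within dead_time samples (neither member of a close pair is kept,
    since overlapping pulses corrupt both measured heights)."""
    peaks = [i for i in range(1, len(samples) - 1)
             if samples[i] > threshold
             and samples[i] >= samples[i - 1] and samples[i] > samples[i + 1]]
    accepted = []
    for idx, p in enumerate(peaks):
        near_prev = idx > 0 and p - peaks[idx - 1] < dead_time
        near_next = idx + 1 < len(peaks) and peaks[idx + 1] - p < dead_time
        if not (near_prev or near_next):
            accepted.append((p, samples[p]))
    return accepted
```

At the Mevent/s rates the abstract quotes, each of these steps would be a pipelined hardware block on the FPGA rather than a sequential loop.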
A 7.4 ps FPGA-Based TDC with a 1024-Unit Measurement Matrix
Zhang, Min; Wang, Hai; Liu, Yan
2017-01-01
In this paper, a high-resolution time-to-digital converter (TDC) based on a field programmable gate array (FPGA) device is proposed and tested. During the implementation, a new architecture of TDC is proposed which consists of a measurement matrix with 1024 units. The utilization of routing resources as the delay elements distinguishes the proposed design from other existing designs, which contributes most to the device insensitivity to variations of temperature and voltage. Experimental results suggest that the measurement resolution is 7.4 ps, and the INL (integral nonlinearity) and DNL (differential nonlinearity) are 11.6 ps and 5.5 ps, which indicates that the proposed TDC offers high performance among the available TDCs. Benefitting from the FPGA platform, the proposed TDC has superiorities in easy implementation, low cost, and short development time. PMID:28420121
A 7.4 ps FPGA-Based TDC with a 1024-Unit Measurement Matrix.
Zhang, Min; Wang, Hai; Liu, Yan
2017-04-14
In this paper, a high-resolution time-to-digital converter (TDC) based on a field programmable gate array (FPGA) device is proposed and tested. During the implementation, a new architecture of TDC is proposed which consists of a measurement matrix with 1024 units. The utilization of routing resources as the delay elements distinguishes the proposed design from other existing designs, which contributes most to the device insensitivity to variations of temperature and voltage. Experimental results suggest that the measurement resolution is 7.4 ps, and the INL (integral nonlinearity) and DNL (differential nonlinearity) are 11.6 ps and 5.5 ps, which indicates that the proposed TDC offers high performance among the available TDCs. Benefitting from the FPGA platform, the proposed TDC has superiorities in easy implementation, low cost, and short development time.
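The DNL and INL figures quoted above are conventionally obtained by code-density calibration: hits uniformly distributed in time are accumulated, and each bin's hit count is converted into a measured bin width. A minimal sketch of that post-processing step (function and parameter names are illustrative; this is not the authors' test setup):

```python
def dnl_inl(hit_counts, clock_period_ps):
    """Code-density calibration for a delay-line TDC: with uniformly
    distributed hits, each bin's measured width is proportional to its
    hit count.  DNL is the per-bin deviation from the ideal LSB;
    INL is the running sum of the DNL."""
    total = sum(hit_counts)
    n = len(hit_counts)
    lsb = clock_period_ps / n                       # ideal bin width
    widths = [c / total * clock_period_ps for c in hit_counts]
    dnl = [w - lsb for w in widths]
    inl, acc = [], 0.0
    for d in dnl:
        acc += d
        inl.append(acc)
    return dnl, inl
```

Because the delay elements in the proposed design are routing resources rather than carry chains, the bin widths, and hence the DNL/INL extracted this way, stay comparatively stable under temperature and voltage variation.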
PAB3D: Its History in the Use of Turbulence Models in the Simulation of Jet and Nozzle Flows
NASA Technical Reports Server (NTRS)
Abdol-Hamid, Khaled S.; Pao, S. Paul; Hunter, Craig A.; Deere, Karen A.; Massey, Steven J.; Elmiligui, Alaa
2006-01-01
This is a review paper on PAB3D's history in the implementation of turbulence models for simulating jet and nozzle flows. We describe different turbulence models used in the simulation of subsonic and supersonic jet and nozzle flows. The time-averaged simulations use modified linear or nonlinear two-equation models to account for supersonic flow as well as high temperature mixing. Two multiscale-type turbulence models are used for unsteady flow simulations. These models require modifications to the Reynolds Averaged Navier-Stokes (RANS) equations. The first scheme is a hybrid RANS/LES model utilizing the two-equation (k-epsilon) model with a RANS/LES transition function, dependent on grid spacing and the computed turbulence length scale. The second scheme is a modified version of the partially averaged Navier-Stokes (PANS) formulation. All of these models are implemented in the three-dimensional Navier-Stokes code PAB3D. This paper discusses computational methods, code implementation, computed results for a wide range of nozzle configurations at various operating conditions, and comparisons with available experimental data. Very good agreement is shown between the numerical solutions and available experimental data over a wide range of operating conditions.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-04
...; Federal Implementation Plan for Best Available Retrofit Technology Determination; Extension of Public... Best Available Retrofit Technology (BART) requirements for sulfur dioxide (SO 2 ) at one source in...
ERIC Educational Resources Information Center
Silverman, Michael J.
2009-01-01
Because of the relatively poor treatment available, the high financial costs of hospitalization, multiple and complex issues of persons with severe mental illnesses, and advancements in pharmacotherapy, psychiatric patients are often only hospitalized for a few days before they are discharged. Thus, brief psychosocial interventions for persons who…
Deakyne, S J; Bajaj, L; Hoffman, J; Alessandrini, E; Ballard, D W; Norris, R; Tzimenatos, L; Swietlik, M; Tham, E; Grundmeier, R W; Kuppermann, N; Dayan, P S
2015-01-01
Overuse of cranial computed tomography scans in children with blunt head trauma unnecessarily exposes them to radiation. The Pediatric Emergency Care Applied Research Network (PECARN) blunt head trauma prediction rules identify children who do not require a computed tomography scan. Electronic health record (EHR) based clinical decision support (CDS) may effectively implement these rules but must only be provided for appropriate patients in order to minimize excessive alerts. To develop, implement and evaluate site-specific groupings of chief complaints (CC) that accurately identify children with head trauma, in order to activate data collection in an EHR. As part of a 13 site clinical trial comparing cranial computed tomography use before and after implementation of CDS, four PECARN sites centrally developed and locally implemented CC groupings to trigger a clinical trial alert (CTA) to facilitate the completion of an emergency department head trauma data collection template. We tested and chose CC groupings to attain high sensitivity while maintaining at least moderate specificity. Due to variability in CCs available, identical groupings across sites were not possible. We noted substantial variability in the sensitivity and specificity of seemingly similar CC groupings between sites. The implemented CC groupings had sensitivities greater than 90% with specificities between 75-89%. During the trial, formal testing and provider feedback led to tailoring of the CC groupings at some sites. CC groupings can be successfully developed and implemented across multiple sites to accurately identify patients who should have a CTA triggered to facilitate EHR data collection. However, CC groupings will necessarily vary in order to attain high sensitivity and moderate-to-high specificity. 
In future trials, the balance between sensitivity and specificity should be considered based on the nature of the clinical condition, including prevalence and morbidity, in addition to the goals of the intervention being considered.
The Newick utilities: high-throughput phylogenetic tree processing in the Unix shell
Junier, Thomas; Zdobnov, Evgeny M.
2010-01-01
Summary: We present a suite of Unix shell programs for processing any number of phylogenetic trees of any size. They perform frequently-used tree operations without requiring user interaction. They also allow tree drawing as scalable vector graphics (SVG), suitable for high-quality presentations and further editing, and as ASCII graphics for command-line inspection. As an example we include an implementation of bootscanning, a procedure for finding recombination breakpoints in viral genomes. Availability: C source code, Python bindings and executables for various platforms are available from http://cegg.unige.ch/newick_utils. The distribution includes a manual and example data. The package is distributed under the BSD License. Contact: thomas.junier@unige.ch PMID:20472542
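The Newick utilities themselves are C programs; as a language-neutral illustration of the format they process, the following minimal Python parser (an assumption-free toy, not the package's code) turns a Newick string without branch lengths into nested lists and enumerates its leaves.

```python
# Minimal, illustrative Newick parser: "(A,(B,(C,D)));" -> nested lists.
# Branch lengths and internal-node labels are omitted for brevity.

def parse_newick(s):
    """Parse a Newick string (no branch lengths) into nested lists."""
    pos = 0
    def node():
        nonlocal pos
        if s[pos] == "(":
            pos += 1  # consume "("
            children = [node()]
            while s[pos] == ",":
                pos += 1
                children.append(node())
            pos += 1  # consume ")"
            return children
        # leaf label: read until a structural character
        start = pos
        while s[pos] not in ",();":
            pos += 1
        return s[start:pos]
    return node()

def leaves(tree):
    """Yield leaf labels in left-to-right order."""
    if isinstance(tree, str):
        yield tree
    else:
        for child in tree:
            yield from leaves(child)

tree = parse_newick("(A,(B,(C,D)));")  # ["A", ["B", ["C", "D"]]]
```

Operations like the suite's leaf listing or subtree extraction are then simple traversals of this structure.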
fluff: exploratory analysis and visualization of high-throughput sequencing data
Georgiou, Georgios
2016-01-01
Summary. In this article we describe fluff, a software package that allows for simple exploration, clustering and visualization of high-throughput sequencing data mapped to a reference genome. The package contains three command-line tools to generate publication-quality figures in an uncomplicated manner using sensible defaults. Genome-wide data can be aggregated, clustered and visualized in a heatmap, according to different clustering methods. This includes a predefined setting to identify dynamic clusters between different conditions or developmental stages. Alternatively, clustered data can be visualized in a bandplot. Finally, fluff includes a tool to generate genomic profiles. As command-line tools, the fluff programs can easily be integrated into standard analysis pipelines. The installation is straightforward and documentation is available at http://fluff.readthedocs.org. Availability. fluff is implemented in Python and runs on Linux. The source code is freely available for download at https://github.com/simonvh/fluff. PMID:27547532
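The "dynamic cluster" setting can be pictured as grouping regions by the condition in which their signal peaks, after filtering out regions that are flat across conditions. The sketch below conveys that idea only; fluff's actual clustering methods and thresholds differ, and all names and numbers here are made up.

```python
# Conceptual sketch of dynamic clustering across conditions (NOT fluff's
# implementation): keep regions whose signal varies enough between
# conditions, and group them by the condition where they peak.

def dynamic_clusters(signal, min_fold=2.0):
    """signal: dict region -> list of per-condition read counts.
    Returns {peak-condition index: [regions]}, keeping only regions whose
    max/min fold change reaches min_fold."""
    clusters = {}
    for region, counts in signal.items():
        lo = min(counts) or 1  # floor to avoid division by zero (arbitrary)
        if max(counts) / lo < min_fold:
            continue  # roughly flat across conditions: not dynamic
        peak = counts.index(max(counts))
        clusters.setdefault(peak, []).append(region)
    return clusters

signal = {
    "enhancer1": [10, 50, 12],   # peaks in condition 1
    "promoterA": [40, 38, 41],   # flat -> filtered out
    "enhancer2": [90, 10, 15],   # peaks in condition 0
}
clusters = dynamic_clusters(signal)
```

Each resulting cluster corresponds to one row block of the kind of heatmap fluff renders.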
Volpe, S. L.; Hall, W. J.; Steckler, A.; Schneider, M.; Thompson, D.; Mobley, C.; Pham, T.; El ghormli, L.
2013-01-01
The process evaluation of HEALTHY, a large multi-center trial to decrease type 2 diabetes mellitus in middle school children, monitored the implementation of the intervention to ascertain the extent to which components were delivered and received as intended. The purpose of this article is to report the process evaluation findings concerning the extent to which the HEALTHY nutrition intervention was implemented during the HEALTHY trial. Overall, the observed fidelity of implementing nutrition strategies improved from baseline to the end of the study. By the last semester, all but two nutrition process evaluation goals were met. The most challenging goal to implement was serving high fiber foods, including grain-based foods and legumes. The easiest goals to implement were lowering the fat content of foods offered and offering healthier beverages. The most challenging barriers experienced by research dietitians and food service staff were costs, availability of foods and student acceptance. Forming strong relationships between the research dietitians and food service staff was identified as a key strategy to meet HEALTHY nutrition goals. PMID:24107856
Implementation of real-time digital signal processing systems
NASA Technical Reports Server (NTRS)
Narasimha, M.; Peterson, A.; Narayan, S.
1978-01-01
Special-purpose hardware implementation of DFT computers and digital filters is considered in light of newly introduced algorithms and IC devices. Recent work by Winograd on high-speed convolution techniques for computing short-length DFTs has motivated the development of algorithms that are more efficient than the FFT for evaluating the transform of longer sequences. Among these, prime-factor algorithms appear suitable for special-purpose hardware implementations. Architectural considerations in designing DFT computers based on these algorithms are discussed. With the availability of monolithic multiplier-accumulators, a direct implementation of IIR and FIR filters, using random-access memories in place of shift registers, appears attractive. The memory-addressing scheme involved in such implementations is discussed. A simple counter set-up to address the data memory in the realization of FIR filters is also described. The combination of a set of simple filters (a weighting network) and a DFT computer is shown to realize a bank of uniform bandpass filters. The usefulness of this concept in arriving at a modular design for a million-channel spectrum analyzer based on microprocessors is discussed.
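The counter-addressed data memory described above replaces a shift register: instead of moving every sample each cycle, a write pointer advances modulo the buffer length and tap addresses are computed relative to it. A behavioral sketch (software model of the hardware idea, with an arbitrary 3-tap example):

```python
# Behavioral model of the direct FIR realization: samples live in a RAM
# addressed by a modulo counter, with one multiply-accumulate per tap.

def fir_filter(samples, taps):
    """Filter `samples` with coefficients `taps` using a circular buffer."""
    n = len(taps)
    ram = [0.0] * n        # data RAM standing in for the shift register
    write = 0              # counter that addresses the RAM
    out = []
    for x in samples:
        ram[write] = x
        acc = 0.0
        for k in range(n):
            # address of the k-th most recent sample, modulo the RAM size
            acc += taps[k] * ram[(write - k) % n]
        out.append(acc)
        write = (write + 1) % n
    return out

# Trivial 3-tap example with exactly representable coefficients
y = fir_filter([3.0, 6.0, 9.0, 9.0], [0.5, 0.25, 0.25])
```

Only the counter changes per sample, which is what makes the RAM-based structure cheaper than shifting data through registers.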
Multi-Threaded Algorithms for GPGPU in the ATLAS High Level Trigger
NASA Astrophysics Data System (ADS)
Conde Muíño, P.; ATLAS Collaboration
2017-10-01
General purpose Graphics Processor Units (GPGPU) are being evaluated for possible future inclusion in an upgraded ATLAS High Level Trigger farm. We have developed a demonstrator including GPGPU implementations of Inner Detector and Muon tracking and Calorimeter clustering within the ATLAS software framework. ATLAS is a general purpose particle physics experiment located at the LHC collider at CERN. The ATLAS Trigger system consists of two levels, with Level-1 implemented in hardware and the High Level Trigger implemented in software running on a farm of commodity CPUs. The High Level Trigger reduces the trigger rate from the 100 kHz Level-1 acceptance rate to 1.5 kHz for recording, requiring an average per-event processing time of ∼250 ms for this task. The selection in the High Level Trigger is based on reconstructing tracks in the Inner Detector and Muon Spectrometer and clusters of energy deposited in the Calorimeter. Performing this reconstruction within the available farm resources presents a significant challenge that will grow with future LHC upgrades. During the LHC data-taking period starting in 2021, luminosity will reach up to three times the original design value; it will increase further, to 7.5 times the design value, in 2026 following LHC and ATLAS upgrades. Corresponding improvements in the speed of the reconstruction code will be needed to provide the required trigger selection power within affordable computing resources. Key factors determining the potential benefit of including GPGPUs in the HLT processor farm are: the relative speed of the CPU and GPGPU algorithm implementations; the relative execution times of the GPGPU algorithms and the serial code remaining on the CPU; the number of GPGPUs required; and the relative financial cost of the selected GPGPUs. We give a brief overview of the algorithms implemented and present new measurements that compare the performance of various configurations exploiting GPGPU cards.
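The trade-off between GPGPU kernel speed and the serial code remaining on the CPU is essentially Amdahl's law. The back-of-envelope sketch below uses made-up numbers, not ATLAS measurements, to show how the overall per-event gain is bounded by the non-offloaded fraction.

```python
# Amdahl's-law estimate of the event-rate gain from GPGPU offload.
# The fractions and speedups here are illustrative, not measured values.

def offload_speedup(offload_fraction, kernel_speedup):
    """Overall speedup when `offload_fraction` of per-event CPU time moves
    to a GPGPU running that part `kernel_speedup` times faster."""
    serial = 1.0 - offload_fraction
    return 1.0 / (serial + offload_fraction / kernel_speedup)

# e.g. 60% of the ~250 ms per-event budget offloaded at a 10x kernel speedup
s = offload_speedup(0.6, 10.0)   # ~2.2x overall, despite the 10x kernel
```

Even an infinitely fast kernel would cap this example at 1/0.4 = 2.5x, which is why the relative execution time of the remaining serial code is listed as a key factor.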
Ducharme, Lori J.; Chandler, Redonna K.; Harris, Alex H. S.
2015-01-01
The National Institute on Alcohol Abuse and Alcoholism (NIAAA), National Institute on Drug Abuse (NIDA), and Veterans Health Administration (VHA) share an interest in promoting high quality, rigorous health services research to improve the availability and utilization of evidence-based treatment for substance use disorders (SUD). Recent and continuing changes in the healthcare policy and funding environments prioritize the integration of evidence-based substance abuse treatments into primary care and general medical settings. This area is a prime candidate for implementation research. Recent and ongoing implementation projects funded by these agencies are reviewed. Research in five areas is highlighted: screening and brief intervention for risky drinking; screening and brief intervention for tobacco use; uptake of FDA-approved addiction pharmacotherapies; safe opioid prescribing; and disease management. Gaps in the portfolios, and priorities for future research, are described. PMID:26233697
FOAM: the modular adaptive optics framework
NASA Astrophysics Data System (ADS)
van Werkhoven, T. I. M.; Homs, L.; Sliepen, G.; Rodenhuis, M.; Keller, C. U.
2012-07-01
Control software for adaptive optics systems is mostly custom built and very specific in nature. We have developed FOAM, a modular adaptive optics framework for controlling and simulating adaptive optics systems in various environments. Portability is provided both for different control hardware and for different adaptive optics setups. To achieve this, FOAM is written in C++ and runs on standard CPUs. Furthermore, we use standard Unix libraries and compilation procedures and have implemented a hardware abstraction layer in FOAM. We have successfully implemented FOAM in the lab on the adaptive optics system of ExPo - a high-contrast imaging polarimeter developed at our institute - and will test it on-sky in late June 2012. We also plan to implement FOAM on adaptive optics systems for microscopy and solar adaptive optics. FOAM is available under the GNU GPL license and is free to be used by anyone.
NASA Technical Reports Server (NTRS)
Korzennik, Sylvain
1997-01-01
Under the direction of Dr. Rhodes, and the technical supervision of Dr. Korzennik, the data assimilation of high-spatial-resolution solar dopplergrams was carried out throughout the program on the Intel Delta Touchstone supercomputer. With the help of a research assistant, partially supported by this grant, and under the supervision of Dr. Korzennik, code development was carried out at SAO using various available resources. To ensure cross-platform portability, PVM was selected as the message-passing library. A parallel implementation of power spectra computation for helioseismology data reduction using PVM was successfully completed. It was successfully ported to SMP architectures (i.e., SUN) and to some MPP architectures (i.e., the CM5). Due to limitations of the PVM implementation on the Cray T3D, the port to that architecture was not completed at the time.
A versatile embedded boundary adaptive mesh method for compressible flow in complex geometry
NASA Astrophysics Data System (ADS)
Al-Marouf, M.; Samtaney, R.
2017-05-01
We present an embedded ghost-fluid method for numerical solutions of the compressible Navier-Stokes (CNS) equations in arbitrary complex domains. A PDE multidimensional extrapolation approach is used to reconstruct the solution in the ghost-fluid regions and to impose boundary conditions on the fluid-solid interface, coupled with a multi-dimensional algebraic interpolation for freshly cleared cells. The CNS equations are numerically solved by a second-order multidimensional upwind method. Block-structured adaptive mesh refinement, implemented with the Chombo framework, is utilized to reduce the computational cost while keeping a high-resolution mesh around the embedded boundary and in regions of high solution gradients. The versatility of the method is demonstrated via several numerical examples, in both static and moving geometry, ranging from low-Mach-number nearly incompressible flows to supersonic flows. Our simulation results are extensively verified against other numerical results and validated against available experimental results where applicable. The significance and advantages of our implementation, which revolve around balancing solution accuracy against implementation difficulty, are briefly discussed as well.
ASPeak: an abundance sensitive peak detection algorithm for RIP-Seq.
Kucukural, Alper; Özadam, Hakan; Singh, Guramrit; Moore, Melissa J; Cenik, Can
2013-10-01
Unlike DNA, RNA abundances can vary over several orders of magnitude. Thus, identification of RNA-protein binding sites from high-throughput sequencing data presents unique challenges. Although peak identification in ChIP-Seq data has been extensively explored, there are few bioinformatics tools tailored for peak calling on analogous datasets for RNA-binding proteins. Here we describe ASPeak (abundance sensitive peak detection algorithm), an implementation of an algorithm that we previously applied to detect peaks in exon junction complex RNA immunoprecipitation in tandem experiments. Our peak detection algorithm yields stringent and robust target sets enabling sensitive motif finding and downstream functional analyses. ASPeak is implemented in Perl as a complete pipeline that takes bedGraph files as input. ASPeak implementation is freely available at https://sourceforge.net/projects/as-peak under the GNU General Public License. ASPeak can be run on a personal computer, yet is designed to be easily parallelizable. ASPeak can also run on high performance computing clusters providing efficient speedup. The documentation and user manual can be obtained from http://master.dl.sourceforge.net/project/as-peak/manual.pdf.
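The key idea, judging each window's RIP-Seq count against an expectation scaled by that transcript's RNA abundance, can be sketched very simply. ASPeak's actual statistical model is more sophisticated (it is Perl, and uses abundance-dependent significance thresholds); the fold-enrichment rule, rate parameter, and data below are illustrative assumptions.

```python
# Simplified illustration of abundance-sensitive peak calling: highly
# expressed transcripts need proportionally more RIP-Seq reads to count
# as a peak, so expression alone doesn't create spurious binding sites.

def call_peaks(rip_counts, abundance, global_rate, min_fold=3.0):
    """rip_counts: per-window RIP-Seq read counts.
    abundance: per-window background abundance (e.g. RNA-seq coverage).
    global_rate: expected RIP reads per unit abundance, genome-wide."""
    peaks = []
    for i, (obs, ab) in enumerate(zip(rip_counts, abundance)):
        expected = max(global_rate * ab, 1.0)  # arbitrary floor for empty windows
        if obs / expected >= min_fold:
            peaks.append(i)
    return peaks

rip = [30, 300, 30]
rna = [10, 1000, 10]     # window 1 sits on a highly expressed transcript
peaks = call_peaks(rip, rna, global_rate=1.0)  # window 1 is NOT called
```

An abundance-blind caller would rank window 1 highest; the abundance-sensitive rule correctly rejects it.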
Small Microprocessor for ASIC or FPGA Implementation
NASA Technical Reports Server (NTRS)
Kleyner, Igor; Katz, Richard; Blair-Smith, Hugh
2011-01-01
A small microprocessor, suitable for use in applications in which high reliability is required, was designed to be implemented in either an application-specific integrated circuit (ASIC) or a field-programmable gate array (FPGA). The design is based on a commercial microprocessor architecture, making it possible to use available software development tools and thereby to implement the microprocessor at relatively low cost. The design features enhancements, including trapping during execution of illegal instructions. The internal structure of the design yields relatively high performance, with a significant decrease, relative to other microprocessors that perform the same functions, in the number of microcycles needed to execute macroinstructions. The problem meant to be solved in designing this microprocessor was to provide a modest level of computational capability in a general-purpose processor while adding as little as possible to the power demand, size, and weight of a system into which the microprocessor would be incorporated. As designed, this microprocessor consumes very little power and occupies only a small portion of a typical modern ASIC or FPGA. The microprocessor operates at a rate of about 4 million instructions per second with a clock frequency of 20 MHz.
Implementing Montessori Methods for Dementia: A Scoping Review.
Hitzig, Sander L; Sheppard, Christine L
2017-10-01
A scoping review was conducted to develop an understanding of Montessori-based programming (MBP) approaches used in dementia care and to identify optimal ways to implement these programs across various settings. Six peer-reviewed databases were searched for relevant abstracts by 2 independent reviewers. Included articles and book chapters were those available in English and published by the end of January 2016. Twenty-three articles and 2 book chapters met the inclusion criteria. Four approaches to implementing MBP were identified: (a) staff assisted (n = 14); (b) intergenerational (n = 5); (c) resident assisted (n = 4); and (d) volunteer or family assisted (n = 2). There is a high degree of variability in how MBP was delivered, and no clearly established "best practices" or standardized protocol emerged across approaches except for resident-assisted MBP. The findings from this scoping review provide an initial road map of suggestions for implementing MBP across dementia care settings. Irrespective of implementation approach, there are several pragmatic and logistical issues that need to be taken into account for optimal implementation. © The Author 2017. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Salloum, Alison; Crawford, Erika A; Lewin, Adam B; Storch, Eric A
2015-01-01
Computer-assisted cognitive behavioral therapy (CCBT) programs for childhood anxiety are being developed, although research about factors that contribute to implementation of CCBT in community mental health centers (CMHC) is limited. The purpose of this mixed-methods study was to explore consumers' and providers' perceptions of utilizing a CCBT for childhood anxiety in CMHC in an effort to identify factors that may impact implementation of CCBT in CMHC. Focus groups and interviews were held with 7 parents, 6 children, 3 therapists, 3 project coordinators and 3 administrators who had participated in CCBT for childhood anxiety. Surveys of treatment satisfaction and treatment barriers were administered to consumers. Results suggest that both consumers and providers were highly receptive to participation in and implementation of CCBT in CMHC. Implementation themes included positive receptiveness, factors related to therapists, treatment components, applicability of treatment, treatment content, initial implementation challenges, resources, dedicated staff, support, outreach, opportunities with the CMHC, payment, and treatment availability. As studies continue to demonstrate the effectiveness of CCBT for childhood anxiety, research needs to continue to examine factors that contribute to the successful implementation of such treatments in CMHC.
ERIC Educational Resources Information Center
Sheppard, Julie Trammell
2013-01-01
The purpose of this qualitative case study is to examine the perceptions of teachers and curriculum specialists over the effectiveness of professional development and available resources of the Common Core State Standards (CCSS) implementation process in Arkansas. Arkansas divided the implementation process into three stages: Phase I implemented…
Stevens, Gregory J; Warfel, Joel W; Aden, James K; Blackwell, Scott D
2018-02-13
Endotracheal intubation is a medical procedure that is often indicated in both the perioperative and critical care environments. Cuffed endotracheal tubes (ETT) allow for safer and more efficient delivery of positive pressure ventilation, as well as create a barrier to reduce the risk of micro-aspiration and anesthetic pollution in the operating room environment. Over-inflation of the endotracheal cuff can lead to serious and harmful sequelae. This study aimed to assess whether departmental education paired with ready access to a manometer to assess cuff pressure would result in an improvement in the proportion of ETT cuff pressures in the goal range. A quality improvement study was conducted at the San Antonio Military Medical Center (SAMMC), a Department of Defense hospital in San Antonio, TX. The initiative was divided into three key periods: pre-implementation, implementation, and post-implementation. During the pre-implementation period, ETT cuff pressures were obtained to assess the baseline state of ETT cuff pressures for patients in the operating room; the proportion of in-range (20-30 cmH2O) pressures was calculated. During the implementation phase, operating rooms were equipped with manometers and anesthesia departmental education was performed regarding the use of the manometers. Three months later, post-implementation cuff pressures were measured in the OR, and the proportion of in-range pressures was again calculated. The pre-implementation data showed an average cuff pressure of 48.92 cmH2O and a median of 38.5 cmH2O. Of the 100 pre-implementation pressures, 20 were in the goal range. Post-implementation data had an average cuff pressure of 41.96 cmH2O and a median of 30 cmH2O. A chi-squared test of pressures in the safe range from the pre-implementation versus post-implementation values yielded a highly significant p-value of 0.0003.
The data from this study clearly demonstrated a statistically significant improvement in the proportion of in-range cuff pressures following the quality improvement initiative. This study supports the use of department-wide education and the availability of manometers in each OR to yield safer cuff pressures for intubated patients. This study did not aim to determine an optimal ETT cuff pressure, but utilized data already available to determine a safe cuff pressure. Further research needs to be performed to assess whether routine monitoring of cuff pressure results in improved patient outcomes. Published by Oxford University Press on behalf of the Association of Military Surgeons of the United States 2018. This work is written by (a) US Government employee(s) and is in the public domain in the US.
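The study's comparison is a 2x2 chi-squared test on in-range versus out-of-range counts. The pre-implementation counts (20 of 100 in range) come from the abstract; the post-implementation counts in this sketch are hypothetical, since the abstract reports only the p-value of 0.0003.

```python
# 2x2 chi-squared statistic (no continuity correction) for in-range vs.
# out-of-range cuff pressures, pre vs. post intervention.

def chi2_2x2(a, b, c, d):
    """Chi-squared statistic for the table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

pre_in, pre_out = 20, 80       # from the abstract
post_in, post_out = 44, 56     # hypothetical post-implementation counts
stat = chi2_2x2(pre_in, pre_out, post_in, post_out)
# At 1 degree of freedom, stat > 10.83 corresponds to p < 0.001
```

With these illustrative counts the statistic is about 13.2, consistent in magnitude with the reported p-value's order.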
RHydro - Hydrological models and tools to represent and analyze hydrological data in R
NASA Astrophysics Data System (ADS)
Reusser, Dominik; Buytaert, Wouter
2010-05-01
In hydrology, basic equations and procedures keep being implemented from scratch by scientists, with the potential for errors and inefficiency. The use of libraries can overcome these problems. Other scientific disciplines such as mathematics and physics have benefited significantly from such an approach, with freely available implementations of many routines. As an example, hydrological libraries could contain: major representations of hydrological processes such as infiltration, sub-surface runoff and routing algorithms; scaling functions, for instance to combine remote-sensing precipitation fields with rain gauge data; data consistency checks; and performance measures. Here we present a beginning for such a library, implemented in the high-level programming language R. Currently, TOPMODEL, data import routines for WaSiM-ETH, as well as basic visualization and evaluation tools are implemented. The design is such that a definition of import scripts for additional models is sufficient to gain access to the full set of evaluation and visualization tools.
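As an example of the "performance measures" such a library would provide, the Nash-Sutcliffe efficiency (NSE) is a standard hydrological goodness-of-fit metric. This is a generic sketch in Python, not RHydro's own code (which is written in R).

```python
# Nash-Sutcliffe efficiency: a standard performance measure comparing a
# simulated discharge series against observations.
# NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)

def nash_sutcliffe(observed, simulated):
    """1.0 is a perfect fit; 0.0 means no better than the observed mean;
    negative values mean worse than the mean."""
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    var = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / var

obs = [1.0, 2.0, 3.0, 4.0]
nse_perfect = nash_sutcliffe(obs, obs)          # 1.0: perfect fit
nse_mean = nash_sutcliffe(obs, [2.5] * 4)       # 0.0: predicting the mean
```

Having one shared, tested implementation of such measures is precisely the library argument the abstract makes.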
Configurable e-commerce-oriented distributed seckill system with high availability
NASA Astrophysics Data System (ADS)
Zhu, Liye
2018-04-01
The rapid development of e-commerce prompted the birth of the seckill activity. Seckill activities greatly stimulate public shopping desire because of their significant attraction to customers. In a seckill activity, a limited number of products are sold at varying degrees of discount, which is highly tempting for customers. The discounted products are usually sold out in seconds, which poses a huge challenge for e-commerce systems. A seckill system with high concurrency and high availability therefore has very practical significance. This research, in cooperation with Huijin Department Store, designs and implements a seckill system for an e-commerce platform. The seckill system supports highly concurrent network conditions and remains highly available in unexpected situations. In addition, due to the short life cycle of a seckill activity, the system can be flexibly configured and scaled, meaning that it is able to add or remove system resources on demand. Finally, functional and performance tests of the whole system were carried out. The test results show that the system meets the functional and performance requirements of suppliers, administrators and users.
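The core correctness problem in any seckill system is never overselling the limited stock under concurrent requests. The sketch below shows the invariant with an in-process lock; production systems typically use an atomic decrement in Redis or a database row lock instead, and all names here are illustrative rather than from the paper.

```python
# Oversell protection under concurrency: 500 simulated buyers, 100 units.
# A lock makes check-and-decrement atomic, so exactly 100 purchases succeed.

import threading

class SeckillInventory:
    def __init__(self, stock):
        self.stock = stock
        self._lock = threading.Lock()

    def try_buy(self):
        """Atomically claim one unit; returns False once stock is exhausted."""
        with self._lock:
            if self.stock > 0:
                self.stock -= 1
                return True
            return False

inv = SeckillInventory(stock=100)
results = []

def worker():
    results.append(inv.try_buy())

threads = [threading.Thread(target=worker) for _ in range(500)]
for t in threads:
    t.start()
for t in threads:
    t.join()
sold = sum(results)   # exactly 100 despite 500 concurrent buyers
```

Without the lock, two requests could both observe `stock > 0` and both decrement, which is exactly the overselling failure a flash sale must rule out.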
Moore, Justin B.; Carson, Russell L.; Webster, Collin A.; Singletary, Camelia R.; Castelli, Darla M.; Pate, Russell R.; Beets, Michael W.; Beighle, Aaron
2018-01-01
Comprehensive school physical activity programs (CSPAPs) have been endorsed as a promising strategy to increase youth physical activity (PA) in school settings. A CSPAP is a five-component approach, which includes opportunities before, during, and after school for PA. Extensive resources are available to public health practitioners and school officials regarding what should be implemented, but little guidance and few resources are available regarding how to effectively implement a CSPAP. Implementation science provides a number of conceptual frameworks that can guide implementation of a CSPAP, but few published studies have applied an implementation science framework to a CSPAP. Therefore, we developed Be a Champion! (BAC), which represents a synthesis of implementation science strategies, modified for application to CSPAP implementation in schools while allowing for local tailoring of the approach. This article describes BAC while providing examples from the implementation of a CSPAP in three rural elementary schools. PMID:29354631
cisTEM, user-friendly software for single-particle image processing.
Grant, Timothy; Rohou, Alexis; Grigorieff, Nikolaus
2018-03-07
We have developed new open-source software called cisTEM (computational imaging system for transmission electron microscopy) for the processing of data for high-resolution electron cryo-microscopy and single-particle averaging. cisTEM features a graphical user interface that is used to submit jobs, monitor their progress, and display results. It implements a full processing pipeline including movie processing, image defocus determination, automatic particle picking, 2D classification, ab-initio 3D map generation from random parameters, 3D classification, and high-resolution refinement and reconstruction. Some of these steps implement newly-developed algorithms; others were adapted from previously published algorithms. The software is optimized to enable processing of typical datasets (2000 micrographs, 200k-300k particles) on a high-end, CPU-based workstation in half a day or less, comparable to GPU-accelerated processing. Jobs can also be scheduled on large computer clusters using flexible run profiles that can be adapted for most computing environments. cisTEM is available for download from cistem.org. © 2018, Grant et al.
NASA and Industry Benefits of ACTS High Speed Network Interoperability Experiments
NASA Technical Reports Server (NTRS)
Zernic, M. J.; Beering, D. R.; Brooks, D. E.
2000-01-01
This paper provides synopses of the design, implementation, and results of key high-data-rate communications experiments utilizing the technologies of NASA's Advanced Communications Technology Satellite (ACTS). Specifically, the network protocol and interoperability performance aspects will be highlighted. The objectives of these key experiments will be discussed in the context of NASA missions as well as the broader communications industry. Discussion of the experiment implementation will highlight the technical aspects of hybrid network connectivity, a variety of high-speed interoperability architectures, a variety of network node platforms, protocol layers, internet-based applications, and new work focused on distinguishing between link errors and congestion. In addition, this paper describes the impact of leveraging government-industry partnerships to achieve technical progress and forge synergistic relationships. These relationships will be the key to success as NASA seeks to combine commercially available technology with its own internal technology developments to realize more robust and cost-effective communications for space operations.
Paranal maintenance and CMMS experience
NASA Astrophysics Data System (ADS)
Montano, Nelson
2004-10-01
During the last four years of operations, low technical downtime has been one of the distinguishing records of the Paranal Observatory. From the beginning of the Very Large Telescope (VLT) project, the European Southern Observatory (ESO) has considered the implementation of a proper maintenance strategy a fundamental point in ensuring low technical downtime and preserving the Observatory's assets. The implementation of the maintenance strategy was based on the following aspects: a strong maintenance sense during the design stage, with the Line Replaceable Unit (LRU) concept and the standardization and modularity of the Observatory equipment; the creation of a dedicated maintenance team; and the implementation of a Computerized Maintenance Management System (CMMS). After four operational years, the results of these aspects have exceeded expectations; the Observatory has been operating with high availability under a sustainable strategy. The strengths of the maintenance strategy are based on modern maintenance concepts applied by regular production companies, where every minute of downtime involves high cost. The operation of the current Paranal maintenance system is based mainly on proactive activities, such as regular inspections, preventive maintenance (PM) and predictive maintenance (PdM) plans. Nevertheless, it has been necessary to implement a strong plan for corrective maintenance (CM) as well. The spare parts strategy has also been an important point linked to the maintenance strategy and the CMMS implementation; at present, almost 4,000 items related to the Observatory spare parts are loaded into the CMMS database. Currently, we are studying the implementation of a Reliability Centered Maintenance (RCM) project in one of our critical systems. This document presents the current status of the Paranal maintenance strategy and the motivations behind it.
Googling DNA sequences on the World Wide Web.
Hajibabaei, Mehrdad; Singer, Gregory A C
2009-11-10
New web-based technologies provide an excellent opportunity for sharing and accessing information and for using the web as a platform for interaction and collaboration. Although several specialized tools are available for analyzing DNA sequence information, conventional web-based tools have not been utilized for bioinformatics applications. We have developed a novel algorithm, and implemented it for searching species-specific genomic sequences (DNA barcodes), using popular web-based methods such as Google. We developed an alignment-independent, character-based algorithm that divides a sequence library (DNA barcodes) and the query sequence into words. The actual search is conducted by conventional search tools such as the freely available Google Desktop Search. We implemented our algorithm in two exemplar packages, with pre- and post-processing software providing customized input and output services, respectively. Our analysis of all publicly available DNA barcode sequences shows high accuracy as well as rapid results. Our method makes use of conventional web-based technologies for specialized genetic data. It provides a robust and efficient solution for sequence search on the web. The integration of our search method for large-scale sequence libraries such as DNA barcodes provides an excellent web-based tool for accessing this information and linking it to other available categories of information on the web.
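The word-splitting idea is that a barcode decomposed into fixed-length words becomes plain text that an off-the-shelf search engine can index and match without alignment. The word length, scoring rule, and sequences below are illustrative assumptions, not the paper's exact parameters.

```python
# Sketch of word-based, alignment-free sequence matching: split sequences
# into fixed-length words, then score a query by the fraction of its words
# found in a library entry (a text search engine would do the lookup).

def to_words(seq, word_len=8):
    """Split a sequence into non-overlapping fixed-length words."""
    return [seq[i:i + word_len]
            for i in range(0, len(seq) - word_len + 1, word_len)]

def word_match_score(query, barcode, word_len=8):
    """Fraction of the query's words found anywhere in the barcode."""
    barcode_words = set(to_words(barcode, word_len))
    query_words = to_words(query, word_len)
    hits = sum(w in barcode_words for w in query_words)
    return hits / len(query_words) if query_words else 0.0

barcode = "ACGTACGTTTTTGGGGCCCCAAAA"
score_same = word_match_score(barcode, barcode)              # identical -> 1.0
score_diff = word_match_score("ACGTACGTAAAACCCCGGGGTTTT", barcode)  # 1 of 3 words
```

Because matching reduces to exact word lookups, any text indexer (the paper used Google Desktop Search) can serve as the search backend.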
The implementation of sea ice model on a regional high-resolution scale
NASA Astrophysics Data System (ADS)
Prasad, Siva; Zakharov, Igor; Bobby, Pradeep; McGuire, Peter
2015-09-01
The availability of high-resolution atmospheric/ocean forecast models, satellite data and access to high-performance computing clusters have provided capability to build high-resolution models for regional ice condition simulation. The paper describes the implementation of the Los Alamos sea ice model (CICE) on a regional scale at high resolution. The advantage of the model is its ability to include oceanographic parameters (e.g., currents) to provide accurate results. The sea ice simulation was performed over Baffin Bay and the Labrador Sea to retrieve important parameters such as ice concentration, thickness, ridging, and drift. Two different forcing models, one with low resolution and another with a high resolution, were used for the estimation of sensitivity of model results. Sea ice behavior over 7 years was simulated to analyze ice formation, melting, and conditions in the region. Validation was based on comparing model results with remote sensing data. The simulated ice concentration correlated well with Advanced Microwave Scanning Radiometer for EOS (AMSR-E) and Ocean and Sea Ice Satellite Application Facility (OSI-SAF) data. Visual comparison of ice thickness trends estimated from the Soil Moisture and Ocean Salinity satellite (SMOS) agreed with the simulation for year 2010-2011.
Building a highly available and intrusion tolerant Database Security and Protection System (DSPS).
Cai, Liang; Yang, Xiao-Hu; Dong, Jin-Xiang
2003-01-01
The Database Security and Protection System (DSPS) is a security platform for defending against malicious attacks on DBMSs, for which both security and performance are critical. The authors propose a key management scheme that combines a server-group structure, to improve availability, with the key-distribution structure required by proactive security. This paper details the implementation of proactive security in DSPS. A thorough performance analysis shows that the performance gap between the replicated mechanism and the proactive mechanism shrinks as the number of concurrent connections grows, and that proactive security is practical and useful for large, critical applications.
Code of Federal Regulations, 2010 CFR
2010-01-01
... demonstrate that an alternative source(s) of repayment will be available in order for further processing of... Note,” has been revised so that the language will no longer be inserted as an addendum, but the... natural condition, such as drought, and (b) without action by the producer that destroys a natural wetland...
Code of Federal Regulations, 2014 CFR
2014-01-01
... demonstrate that an alternative source(s) of repayment will be available in order for further processing of... Note,” has been revised so that the language will no longer be inserted as an addendum, but the... natural condition, such as drought, and (b) without action by the producer that destroys a natural wetland...
Code of Federal Regulations, 2012 CFR
2012-01-01
... demonstrate that an alternative source(s) of repayment will be available in order for further processing of... Note,” has been revised so that the language will no longer be inserted as an addendum, but the... natural condition, such as drought, and (b) without action by the producer that destroys a natural wetland...
Code of Federal Regulations, 2011 CFR
2011-01-01
... demonstrate that an alternative source(s) of repayment will be available in order for further processing of... Note,” has been revised so that the language will no longer be inserted as an addendum, but the... natural condition, such as drought, and (b) without action by the producer that destroys a natural wetland...
Code of Federal Regulations, 2013 CFR
2013-01-01
... demonstrate that an alternative source(s) of repayment will be available in order for further processing of... Note,” has been revised so that the language will no longer be inserted as an addendum, but the... natural condition, such as drought, and (b) without action by the producer that destroys a natural wetland...
High-Speed Optical Wide-Area Data-Communication Network
NASA Technical Reports Server (NTRS)
Monacos, Steve P.
1994-01-01
Proposed fiber-optic wide-area network (WAN) for digital communication balances input and output flows of data with its internal capacity by routing traffic via dynamically interconnected routing planes. Data transmitted optically through network by wavelength-division multiplexing in synchronous or asynchronous packets. WAN implemented with currently available technology. Network is multiple-ring cyclic shuffle exchange network ensuring traffic reaches its destination with minimum number of hops.
ISA implementation and uncertainty: a literature review and expert elicitation study.
van der Pas, J W G M; Marchau, V A W J; Walker, W E; van Wee, G P; Vlassenroot, S H
2012-09-01
Each day, an average of over 116 people die from traffic accidents in the European Union. One out of three fatalities is estimated to be the result of speeding. The current state of technology makes it possible to make speeding more difficult, or even impossible, by placing intelligent speed limiters (so called ISA devices) in vehicles. Although the ISA technology has been available for some years now, and reducing the number of road traffic fatalities and injuries has been high on the European political agenda, implementation still seems to be far away. Experts indicate that there are still too many uncertainties surrounding ISA implementation, and dealing with these uncertainties is essential for implementing ISA. In this paper, a systematic and representative inventory of the uncertainties is made based upon the literature. Furthermore, experts in the field of ISA were surveyed and asked which uncertainties are barriers for ISA implementation, and how uncertain these uncertainties are. We found that the long-term effects and the effects of large-scale implementation of ISA are still uncertain and are the most important barriers for the implementation of the most effective types of ISA. One way to deal with these uncertainties would be to start implementation on a small scale and gradually expand the penetration, in order to learn how ISA influences the transport system over time. Copyright © 2010 Elsevier Ltd. All rights reserved.
Damschroder, Laura J; Lowery, Julie C
2013-05-10
In the United States, as in many other parts of the world, the prevalence of overweight/obesity is at epidemic proportions in the adult population and even higher among Veterans. To address the high prevalence of overweight/obesity among Veterans, the MOVE!(®) weight management program was disseminated nationally to Veteran Affairs (VA) medical centers. The objective of this paper is two-fold: to describe factors that explain the wide variation in implementation of MOVE!; and to illustrate, step-by-step, how to apply a theory-based framework using qualitative data. Five VA facilities were selected to maximize variation in implementation effectiveness and geographic location. Twenty-four key stakeholders were interviewed about their experiences in implementing MOVE!. The Consolidated Framework for Implementation Research (CFIR) was used to guide collection and analysis of qualitative data. Constructs that most strongly influence implementation effectiveness were identified through a cross-case comparison of ratings. Of the 31 CFIR constructs assessed, ten constructs strongly distinguished between facilities with low versus high program implementation effectiveness. The majority (six) were related to the inner setting: networks and communications; tension for change; relative priority; goals and feedback; learning climate; and leadership engagement. One construct each, from intervention characteristics (relative advantage) and outer setting (patient needs and resources), plus two from process (executing and reflecting) also strongly distinguished between high and low implementation. Two additional constructs weakly distinguished, 16 were mixed, three constructs had insufficient data to assess, and one was not applicable. Detailed descriptions of how each distinguishing construct manifested in study facilities and a table of recommendations is provided. 
This paper presents an approach for using the CFIR to code and rate qualitative data in a way that will facilitate comparisons across studies. An online Wiki resource (http://www.wiki.cfirwiki.net) is available, in addition to the information presented here, that contains much of the published information about the CFIR and its constructs and sub-constructs. We hope that the described approach and open access to the CFIR will generate wide use and encourage dialogue and continued refinement of both the framework and approaches for applying it.
Efficient hash tables for network applications.
Zink, Thomas; Waldvogel, Marcel
2015-01-01
Hashing has yet to be widely accepted as a component of hard real-time systems and hardware implementations, owing to lingering concerns about the unpredictable space and time requirements that collisions introduce. While in theory perfect hashing can provide an optimal mapping, in practice finding a perfect hash function is too expensive, especially in the context of high-speed applications. The introduction of hashing with multiple choices, d-left hashing and probabilistic table summaries has caused a shift towards deterministic DRAM access, but large amounts of rare and expensive high-speed SRAM must be traded for that predictability, which is infeasible for many applications. In this paper we show that previous proposals suffer from the false precondition of full generality. Our approach exploits four individual degrees of freedom available in many practical applications, especially hardware and high-speed lookups. This reduces the on-chip memory requirement by up to an order of magnitude and guarantees constant lookup and update time at the cost of only minute amounts of additional hardware. Our design makes efficient hash table implementations cheaper, more predictable, and more practical.
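The "hashing with multiple choices" technique the abstract builds on can be sketched in a few lines. This is a generic 2-choice ("2-left") table, not the authors' design; the bucket count, hash salting, and tie-breaking rule are illustrative assumptions.

```python
# Minimal sketch of hashing with d choices (here d = 2): each key hashes to
# one bucket in each of two sub-tables and is placed in the less loaded one,
# which sharply reduces the worst-case bucket size compared with a single
# hash function. Parameters are illustrative, not from the paper.

import hashlib

D = 2            # number of sub-tables ("choices")
BUCKETS = 16     # buckets per sub-table

def bucket(key: str, table: int) -> int:
    # Salt the hash with the sub-table index so the two choices differ.
    digest = hashlib.sha256(f"{table}:{key}".encode()).digest()
    return digest[0] % BUCKETS

class TwoLeftTable:
    def __init__(self):
        self.tables = [[[] for _ in range(BUCKETS)] for _ in range(D)]

    def insert(self, key: str) -> None:
        # Probe one bucket per sub-table; break ties to the left sub-table.
        choices = [(len(self.tables[t][bucket(key, t)]), t) for t in range(D)]
        _, t = min(choices)
        self.tables[t][bucket(key, t)].append(key)

    def lookup(self, key: str) -> bool:
        # Only d buckets ever need to be probed, so lookup cost is bounded.
        return any(key in self.tables[t][bucket(key, t)] for t in range(D))

    def max_load(self) -> int:
        return max(len(b) for t in self.tables for b in t)
```

The bounded number of probes per lookup is what makes such schemes attractive for hardware: each probe maps to a fixed, predictable memory access.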
Gimbel, Sarah; Rustagi, Alison S; Robinson, Julia; Kouyate, Seydou; Coutinho, Joana; Nduati, Ruth; Pfeiffer, James; Gloyd, Stephen; Sherr, Kenneth; Granato, S Adam; Kone, Ahoua; Cruz, Emilia; Manuel, Joao Luis; Zucule, Justina; Napua, Manuel; Mbatia, Grace; Wariua, Grace; Maina, Martin
2016-08-01
Despite large investments to prevent mother-to-child-transmission (PMTCT), pediatric HIV elimination goals are not on track in many countries. The Systems Analysis and Improvement Approach (SAIA) study was a cluster randomized trial to test whether a package of systems engineering tools could strengthen PMTCT programs. We sought to (1) define core and adaptable components of the SAIA intervention, and (2) explain the heterogeneity in SAIA's success between facilities. The Consolidated Framework for Implementation Research (CFIR) guided all data collection efforts. CFIR constructs were assessed in focus group discussions and interviews with study and facility staff in 6 health facilities (1 high-performing and 1 low-performing site per country, identified by study staff) in December 2014 at the end of the intervention period. SAIA staff identified the intervention's core and adaptable components at an end-of-study meeting in August 2015. Two independent analysts used CFIR constructs to code transcripts before reaching consensus. Flow mapping and continuous quality improvement were the core to the SAIA in all settings, whereas the PMTCT cascade analysis tool was the core in high HIV prevalence settings. Five CFIR constructs distinguished strongly between high and low performers: 2 in inner setting (networks and communication, available resources) and 3 in process (external change agents, executing, reflecting and evaluating). The CFIR is a valuable tool to categorize elements of an intervention as core versus adaptable, and to understand heterogeneity in study implementation. Future intervention studies should apply evidence-based implementation science frameworks, like the CFIR, to provide salient data to expand implementation to other settings.
NASA Astrophysics Data System (ADS)
Kim, Jeongnim; Baczewski, Andrew D.; Beaudet, Todd D.; Benali, Anouar; Chandler Bennett, M.; Berrill, Mark A.; Blunt, Nick S.; Josué Landinez Borda, Edgar; Casula, Michele; Ceperley, David M.; Chiesa, Simone; Clark, Bryan K.; Clay, Raymond C., III; Delaney, Kris T.; Dewing, Mark; Esler, Kenneth P.; Hao, Hongxia; Heinonen, Olle; Kent, Paul R. C.; Krogel, Jaron T.; Kylänpää, Ilkka; Li, Ying Wai; Lopez, M. Graham; Luo, Ye; Malone, Fionn D.; Martin, Richard M.; Mathuriya, Amrita; McMinis, Jeremy; Melton, Cody A.; Mitas, Lubos; Morales, Miguel A.; Neuscamman, Eric; Parker, William D.; Pineda Flores, Sergio D.; Romero, Nichols A.; Rubenstein, Brenda M.; Shea, Jacqueline A. R.; Shin, Hyeondeok; Shulenburger, Luke; Tillack, Andreas F.; Townsend, Joshua P.; Tubman, Norm M.; Van Der Goetz, Brett; Vincent, Jordan E.; ChangMo Yang, D.; Yang, Yubo; Zhang, Shuai; Zhao, Luning
2018-05-01
QMCPACK is an open source quantum Monte Carlo package for ab initio electronic structure calculations. It supports calculations of metallic and insulating solids, molecules, atoms, and some model Hamiltonians. Implemented real space quantum Monte Carlo algorithms include variational, diffusion, and reptation Monte Carlo. QMCPACK uses Slater–Jastrow type trial wavefunctions in conjunction with a sophisticated optimizer capable of optimizing tens of thousands of parameters. The orbital space auxiliary-field quantum Monte Carlo method is also implemented, enabling cross validation between different highly accurate methods. The code is specifically optimized for calculations with large numbers of electrons on the latest high performance computing architectures, including multicore central processing unit and graphical processing unit systems. We detail the program’s capabilities, outline its structure, and give examples of its use in current research calculations. The package is available at http://qmcpack.org.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-22
... advance and available for prompt implementation once triggered. Section 110(k)(5) of the CAA provides that... Environmental protection, Air pollution control, Iowa, Particulate matter, State Implementation Plan. Dated...
ChronQC: a quality control monitoring system for clinical next generation sequencing.
Tawari, Nilesh R; Seow, Justine Jia Wen; Perumal, Dharuman; Ow, Jack L; Ang, Shimin; Devasia, Arun George; Ng, Pauline C
2018-05-15
ChronQC is a quality control (QC) tracking system for clinical implementation of next-generation sequencing (NGS). ChronQC generates time series plots for various QC metrics to allow comparison of current runs to historical runs. ChronQC has multiple features for tracking QC data including Westgard rules for clinical validity, laboratory-defined thresholds and historical observations within a specified time period. Users can record their notes and corrective actions directly onto the plots for long-term recordkeeping. ChronQC facilitates regular monitoring of clinical NGS to enable adherence to high quality clinical standards. ChronQC is freely available on GitHub (https://github.com/nilesh-tawari/ChronQC), Docker (https://hub.docker.com/r/nileshtawari/chronqc/) and the Python Package Index. ChronQC is implemented in Python and runs on all common operating systems (Windows, Linux and Mac OS X). tawari.nilesh@gmail.com or pauline.c.ng@gmail.com. Supplementary data are available at Bioinformatics online.
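The Westgard-rule monitoring mentioned in this abstract can be illustrated with one of the classic rules. This is not ChronQC's code; the rule shown (1_3s: flag a run whose metric falls more than 3 SD from the historical mean) and the metric values are illustrative assumptions.

```python
# Illustrative check of one Westgard rule (1_3s) against a time series of
# per-run QC values, in the spirit of the monitoring described above.
# Historical values and thresholds are invented for the example.

from statistics import mean, stdev

def westgard_1_3s(history: list[float], current: float) -> bool:
    """Return True if `current` violates the 1_3s rule vs. `history`."""
    mu, sigma = mean(history), stdev(history)
    return abs(current - mu) > 3 * sigma
```

A monitoring system would evaluate each new run against a rolling window of historical runs and annotate violations on the time-series plot.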
BigWig and BigBed: enabling browsing of large distributed datasets.
Kent, W J; Zweig, A S; Barber, G; Hinrichs, A S; Karolchik, D
2010-09-01
BigWig and BigBed files are compressed binary indexed files containing data at several resolutions that allow the high-performance display of next-generation sequencing experiment results in the UCSC Genome Browser. The visualization is implemented using a multi-layered software approach that takes advantage of specific capabilities of web-based protocols and Linux and UNIX operating systems files, R trees and various indexing and compression tricks. As a result, only the data needed to support the current browser view is transmitted rather than the entire file, enabling fast remote access to large distributed data sets. Binaries for the BigWig and BigBed creation and parsing utilities may be downloaded at http://hgdownload.cse.ucsc.edu/admin/exe/linux.x86_64/. Source code for the creation and visualization software is freely available for non-commercial use at http://hgdownload.cse.ucsc.edu/admin/jksrc.zip, implemented in C and supported on Linux. The UCSC Genome Browser is available at http://genome.ucsc.edu.
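The multi-resolution idea behind such files can be sketched simply. This is not the actual BigWig format or its R-tree index; the bin size and the (mean, min, max) summary triple are illustrative assumptions about how pre-computed zoom levels work.

```python
# Sketch of multi-resolution summaries: per-base values are pre-collapsed
# into fixed-size bins so that a browser zoomed out over a large region can
# fetch only the matching summary level instead of the full signal.
# Bin size and summary fields are illustrative, not the real file layout.

def summarize(values: list[float], bin_size: int) -> list[tuple]:
    """Collapse per-base values into one (mean, min, max) tuple per bin."""
    out = []
    for start in range(0, len(values), bin_size):
        chunk = values[start:start + bin_size]
        out.append((sum(chunk) / len(chunk), min(chunk), max(chunk)))
    return out
```

Building such summaries at several bin sizes (e.g. powers of a fixed factor) lets the server answer any zoom level by transmitting a roughly constant amount of data.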
Energy efficiency opportunities in the brewery industry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Worrell, Ernst; Galitsky, Christina; Martin, Nathan
2002-06-28
Breweries in the United States spend over $200 million annually on energy. Energy accounts for 3-8% of the production costs of beer, making energy efficiency improvement an important way to reduce costs, especially in times of high energy price volatility. After a summary of the beer-making process and its energy use, we examine the energy efficiency opportunities available to breweries. We provide specific primary energy savings for each efficiency measure, based on case studies of plants that have implemented the measures, along with references to the technical literature and, where available, typical payback periods. Our findings suggest that there may still be opportunities for breweries to reduce energy consumption cost-effectively. Major brewing companies have spent, and will continue to spend, capital on cost-effective measures that do not affect the quality of the beer. Further research on the economics of the measures, and on their applicability to different brewing practices, is needed to assess implementation of selected technologies at individual breweries.
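The payback-period figure used to rank such measures is simple arithmetic: capital cost divided by annual cost savings. A minimal sketch, with hypothetical numbers not taken from the study:

```python
# Simple (undiscounted) payback period for an efficiency measure:
# years to recover the capital cost from the annual energy cost savings.
# The example figures below are hypothetical.

def simple_payback_years(capital_cost: float, annual_savings: float) -> float:
    if annual_savings <= 0:
        raise ValueError("annual savings must be positive")
    return capital_cost / annual_savings
```

For example, a hypothetical $50,000 heat-recovery retrofit saving $20,000 per year would pay back in 2.5 years; measures are typically considered cost-effective below some threshold payback chosen by the plant.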
LFQC: a lossless compression algorithm for FASTQ files
Nicolae, Marius; Pathak, Sudipta; Rajasekaran, Sanguthevar
2015-01-01
Motivation: Next Generation Sequencing (NGS) technologies have revolutionized genomic research by reducing the cost of whole genome sequencing. One of the biggest challenges posed by modern sequencing technology is economic storage of NGS data. Storing raw data is infeasible because of its enormous size and high redundancy. In this article, we address the problem of storage and transmission of large FASTQ files using innovative compression techniques. Results: We introduce a new lossless non-reference based FASTQ compression algorithm named Lossless FASTQ Compressor. We have compared our algorithm with other state of the art big data compression algorithms namely gzip, bzip2, fastqz (Bonfield and Mahoney, 2013), fqzcomp (Bonfield and Mahoney, 2013), Quip (Jones et al., 2012), DSRC2 (Roguski and Deorowicz, 2014). This comparison reveals that our algorithm achieves better compression ratios on LS454 and SOLiD datasets. Availability and implementation: The implementations are freely available for non-commercial purposes. They can be downloaded from http://engr.uconn.edu/rajasek/lfqc-v1.1.zip. Contact: rajasek@engr.uconn.edu PMID:26093148
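A common idea behind lossless FASTQ compressors of this kind is stream separation: identifiers, bases, and quality strings have very different statistics, so each is compressed on its own. The sketch below is illustrative only; it uses generic zlib in place of the specialized per-stream coders of the published tools.

```python
# Sketch of stream separation for FASTQ compression: split each 4-line
# record into identifier, sequence and quality streams, then compress each
# stream separately. zlib stands in for specialized per-stream coders.

import zlib

def split_streams(fastq_text: str):
    ids, seqs, quals = [], [], []
    lines = fastq_text.strip().splitlines()
    for i in range(0, len(lines), 4):   # FASTQ records are 4 lines each
        ids.append(lines[i])            # @identifier line
        seqs.append(lines[i + 1])       # base calls
        quals.append(lines[i + 3])      # quality string (line i+2 is '+')
    return ids, seqs, quals

def compress_streams(fastq_text: str) -> list[bytes]:
    return [zlib.compress("\n".join(s).encode())
            for s in split_streams(fastq_text)]
```

Because each stream is internally self-similar (e.g. near-identical identifier prefixes, a 4-letter base alphabet), per-stream models typically achieve better ratios than compressing the interleaved file.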
Scribl: an HTML5 Canvas-based graphics library for visualizing genomic data over the web
Miller, Chase A.; Anthony, Jon; Meyer, Michelle M.; Marth, Gabor
2013-01-01
Motivation: High-throughput biological research requires simultaneous visualization as well as analysis of genomic data, e.g. read alignments, variant calls and genomic annotations. Traditionally, such integrative analysis required desktop applications operating on locally stored data. Many current terabyte-size datasets generated by large public consortia projects, however, are already only feasibly stored at specialist genome analysis centers. As even small laboratories can afford very large datasets, local storage and analysis are becoming increasingly limiting, and it is likely that most such datasets will soon be stored remotely, e.g. in the cloud. These developments will require web-based tools that enable users to access, analyze and view vast remotely stored data with a level of sophistication and interactivity that approximates desktop applications. As rapidly dropping cost enables researchers to collect data intended to answer questions in very specialized contexts, developers must also provide software libraries that empower users to implement customized data analyses and data views for their particular application. Such specialized, yet lightweight, applications would empower scientists to better answer specific biological questions than possible with general-purpose genome browsers currently available. Results: Using recent advances in core web technologies (HTML5), we developed Scribl, a flexible genomic visualization library specifically targeting coordinate-based data such as genomic features, DNA sequence and genetic variants. Scribl simplifies the development of sophisticated web-based graphical tools that approach the dynamism and interactivity of desktop applications. Availability and implementation: Software is freely available online at http://chmille4.github.com/Scribl/ and is implemented in JavaScript with all modern browsers supported. 
Contact: gabor.marth@bc.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:23172864
Implementing Pay-for-Performance in the Neonatal Intensive Care Unit
Profit, Jochen; Zupancic, John A. F.; Gould, Jeffrey B.; Petersen, Laura A.
2011-01-01
Pay-for-performance initiatives in medicine are proliferating rapidly. Neonatal intensive care is a likely target for these efforts because of the high cost, available databases, and relative strength of evidence for at least some measures of quality. Pay-for-performance may improve patient care but requires valid measurements of quality to ensure that financial incentives truly support superior performance. Given the existing uncertainty with respect to both the effectiveness of pay-for-performance and the state of quality measurement science, experimentation with pay-for-performance initiatives should proceed with caution and in controlled settings. In this article, we describe approaches to measuring quality and implementing pay-for-performance in the NICU setting. PMID:17473099
NASA Astrophysics Data System (ADS)
Maćkowiak-Pawłowska, Maja; Przybyła, Piotr
2018-05-01
Incomplete particle identification limits the experimentally available phase-space region for identified-particle analyses. This problem affects ongoing fluctuation and correlation studies, including the search for the critical point of strongly interacting matter performed at the SPS and RHIC accelerators. In this paper we provide a procedure for obtaining nth-order moments of the multiplicity distribution using the identity method, generalising previously published solutions for n=2 and n=3. Moreover, we present an open-source software implementation of this computation, called Idhim, that allows one to recover the true moments of identified-particle multiplicity distributions from the measured ones, provided the response function of the detector is known.
Using Compilers to Enhance Cryptographic Product Development
NASA Astrophysics Data System (ADS)
Bangerter, E.; Barbosa, M.; Bernstein, D.; Damgård, I.; Page, D.; Pagter, J. I.; Sadeghi, A.-R.; Sovio, S.
Developing high-quality software is hard in the general case, and it is significantly more challenging in the case of cryptographic software. A high degree of new skill and understanding must be learnt and applied without error to avoid vulnerability and inefficiency. This is often beyond the financial, manpower or intellectual resources available. In this paper we present the motivation for the European-funded CACE (Computer Aided Cryptography Engineering) project. The main objective of CACE is to provide engineers (with limited or no expertise in cryptography) with a toolbox that allows them to generate robust and efficient implementations of cryptographic primitives. We also present some preliminary results already obtained in the early stages of this project, and discuss the relevance of the project as perceived by stakeholders in the mobile device arena.
Pteros 2.0: Evolution of the fast parallel molecular analysis library for C++ and python.
Yesylevskyy, Semen O
2015-07-15
Pteros is a high-performance open-source library for molecular modeling and analysis of molecular dynamics trajectories. Starting with version 2.0, Pteros is available for the C++ and Python programming languages with very similar interfaces. This makes it suitable both for writing complex reusable programs in C++ and for simple interactive scripts in Python. The new version improves the facilities for asynchronous trajectory reading and parallel execution of analysis tasks by introducing analysis plugins, which can be written in either C++ or Python in a completely uniform way. The high level of abstraction provided by analysis plugins greatly simplifies prototyping and implementation of complex analysis algorithms. Pteros is available for free under the Artistic License from http://sourceforge.net/projects/pteros/. © 2015 Wiley Periodicals, Inc.
Promotion of mental health and prevention of mental disorders: priorities for implementation.
Barry, M M; Clarke, A M; Petersen, I
2015-09-28
There is compelling evidence from high-quality studies that mental health promotion and primary prevention interventions can reduce the risk of mental disorders, enhance protective factors for good mental and physical health, and lead to lasting positive effects on a range of social and economic outcomes. This paper reviews the available evidence in order to guide the implementation of mental health promotion and prevention interventions in the Eastern Mediterranean Region. The paper identifies a number of priority areas that can generate clear health and social gains in the population and be implemented and sustained at a reasonable cost. The interventions cover population groups across the lifespan from infancy to adulthood and include actions delivered across different settings and delivery platforms. "Best practices" were identified as interventions for which there is evidence not only of their effectiveness but also of their feasibility within resource constraints. The implications of the findings for capacity development are considered.
An Exploratory Study of OEE Implementation in Indian Manufacturing Companies
NASA Astrophysics Data System (ADS)
Kumar, J.; Soni, V. K.
2015-04-01
Globally, the implementation of Overall Equipment Effectiveness (OEE) has proven highly effective in improving availability, performance rate and quality rate while reducing unscheduled breakdowns and equipment-related wastage. This paper investigates the present status and future scope of OEE metrics in Indian manufacturing companies through an extensive survey. In this survey, the opinions of production and maintenance managers were analyzed statistically to explore the relationship between factors, perspectives on OEE and the potential use of OEE metrics. Although the sample was diverse in terms of product, process type, size and geographic location of the companies, all are driven to implement improvement techniques such as OEE metrics to improve performance. The findings reveal that OEE metrics have huge potential and scope to improve performance. Responses indicate that Indian companies are aware of OEE but are not utilizing its full potential.
PR-PR: Cross-Platform Laboratory Automation System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Linshiz, G; Stawski, N; Goyal, G
To enable protocol standardization, sharing, and efficient implementation across laboratory automation platforms, we have further developed the PR-PR open-source high-level biology-friendly robot programming language as a cross-platform laboratory automation system. Beyond liquid-handling robotics, PR-PR now supports microfluidic and microscopy platforms, as well as protocol translation into human languages, such as English. While the same set of basic PR-PR commands and features are available for each supported platform, the underlying optimization and translation modules vary from platform to platform. Here, we describe these further developments to PR-PR, and demonstrate the experimental implementation and validation of PR-PR protocols for combinatorial modified Golden Gate DNA assembly across liquid-handling robotic, microfluidic, and manual platforms. To further test PR-PR cross-platform performance, we then implement and assess PR-PR protocols for Kunkel DNA mutagenesis and hierarchical Gibson DNA assembly for microfluidic and manual platforms.
Study of V/STOL aircraft implementation. Volume 1: Summary
NASA Technical Reports Server (NTRS)
Portenier, W. J.; Webb, H. M.
1973-01-01
A high density short haul air market which by 1980 is large enough to support the introduction of an independent short haul air transportation system is discussed. This system will complement the existing air transportation system and will provide relief of noise and congestion problems at conventional airports. The study has found that new aircraft, exploiting V/STOL and quiet engine technology, can be available for implementing these new services, and they can operate from existing reliever and general aviation airports. The study has also found that the major funding requirements for implementing new short haul services could be borne by private capital, and that the government funding requirement would be minimal and/or recovered through the airline ticket tax. In addition, a suitable new short haul aircraft would have a market potential for $3.5 billion in foreign sales. The long lead times needed for aircraft and engine technology development will require timely actions by federal agencies.
Ducharme, Lori J; Chandler, Redonna K; Harris, Alex H S
2016-01-01
The National Institute on Alcohol Abuse and Alcoholism (NIAAA), National Institute on Drug Abuse (NIDA), and Veterans Health Administration (VHA) share an interest in promoting high quality, rigorous health services research to improve the availability and utilization of evidence-based treatment for substance use disorders (SUD). Recent and continuing changes in the healthcare policy and funding environments prioritize the integration of evidence-based substance abuse treatments into primary care and general medical settings. This area is a prime candidate for implementation research. Recent and ongoing implementation projects funded by these agencies are reviewed. Research in five areas is highlighted: screening and brief intervention for risky drinking; screening and brief intervention for tobacco use; uptake of FDA-approved addiction pharmacotherapies; safe opioid prescribing; and disease management. Gaps in the portfolios, and priorities for future research, are described. Published by Elsevier Inc.
Nonas, Cathy; Silver, Lynn D; Kettel Khan, Laura; Leviton, Laura
2014-10-16
Childhood obesity is associated with health risks in childhood, and it increases the risk of adult obesity, which is associated with many chronic diseases. Therefore, implementing policies that may prevent obesity at young ages is important. In 2007, the New York City Department of Health and Mental Hygiene implemented new regulations for early childhood centers to increase physical activity, limit screen time, and provide healthful beverage offerings (ie, restrict sugar-sweetened beverages for all children, restrict whole milk for those older than 2 years, restrict juice to beverages that are 100% juice and limit serving of juice to only 6 ounces per day, and make water available and accessible at all times). This article explains why these amendments to the Health Code were created, how information about these changes was disseminated, and what training programs were used to help ensure implementation, particularly in high-need neighborhoods.
Goodarzi, Zahra; Hanson, Heather M; Jette, Nathalie; Patten, Scott; Pringsheim, Tamara; Holroyd-Leduc, Jayna
2018-06-01
Our primary objective was to understand the barriers and facilitators associated with the implementation of high-quality clinical practice guidelines (CPGs) for depression and anxiety in patients with dementia or Parkinson's disease (PD). We conducted focus groups or interviews with participants experiencing dementia or PD, their caregivers, and physicians in Calgary, Alberta, and applied the theoretical domains framework and behaviour change wheel to guide data collection and perform a framework analysis. Thirty-three physicians and seven PD patients/caregivers participated. We report barriers and facilitators to the implementation of guideline recommendations for diagnosis, management, and the use of the guidelines. An overarching theme was the lack of evidence for depression or anxiety disorders in dementia or PD, which was more prominent for anxiety than for depression. Patients noted difficulties with communicating symptoms and accessing services. Although guidelines are available, physicians have difficulty implementing certain recommendations due primarily to a lack of evidence regarding efficacy.
PR-PR: cross-platform laboratory automation system.
Linshiz, Gregory; Stawski, Nina; Goyal, Garima; Bi, Changhao; Poust, Sean; Sharma, Monica; Mutalik, Vivek; Keasling, Jay D; Hillson, Nathan J
2014-08-15
To enable protocol standardization, sharing, and efficient implementation across laboratory automation platforms, we have further developed the PR-PR open-source high-level biology-friendly robot programming language as a cross-platform laboratory automation system. Beyond liquid-handling robotics, PR-PR now supports microfluidic and microscopy platforms, as well as protocol translation into human languages, such as English. While the same set of basic PR-PR commands and features are available for each supported platform, the underlying optimization and translation modules vary from platform to platform. Here, we describe these further developments to PR-PR, and demonstrate the experimental implementation and validation of PR-PR protocols for combinatorial modified Golden Gate DNA assembly across liquid-handling robotic, microfluidic, and manual platforms. To further test PR-PR cross-platform performance, we then implement and assess PR-PR protocols for Kunkel DNA mutagenesis and hierarchical Gibson DNA assembly for microfluidic and manual platforms.
Challenges and opportunities for meningococcal vaccination in the developing world.
Shaker, Rouba; Fayad, Danielle; Dbaibo, Ghassan
2018-05-04
Meningococcal disease continues to be a life threatening infection with high morbidity and mortality even in appropriately treated patients. Meningococcal vaccination plays a major role in the control of the disease; however, implementing vaccination remains problematic in the developing world. The objective of this review is to identify the challenges facing the use of meningococcal vaccines in the developing world in order to discuss the opportunities and available solutions to improve immunization in these countries. Inadequate epidemiologic information necessary to implement vaccination and financial challenges predominate. Multiple measures are needed to achieve the successful implementation of meningococcal conjugate vaccination programs that protect against circulating serogroups in developing countries including enhanced surveillance systems, financial support and aid through grants, product development partnerships that are the end result of effective collaboration and communication between different interdependent stakeholders to develop affordable vaccines, and demonstration of the cost-effectiveness of new meningococcal vaccines.
Frequency addressable beams for land mobile communications
NASA Technical Reports Server (NTRS)
Thompson, J. D.; Dubellay, G. G.
1988-01-01
Satellites used for mobile communications need to serve large numbers of small, low-cost terminals. The most important parameters affecting the capacity of such systems are the satellite's equivalent isotropically radiated power (EIRP), gain-to-noise-temperature ratio (G/T), and available bandwidth. Satellites using frequency-addressed beams provide high EIRP and G/T with high-gain antenna beams that also permit frequency reuse over the composite coverage area. Frequency addressing is easy to implement, is compatible with low-cost terminals, and offers higher capacity than alternative approaches.
NASA Astrophysics Data System (ADS)
Ferraris, M.; Risso, P.; Squarcia, S.
We present a system for the organization, selection and transmission of radiotherapy data and images. We describe the choice of a standard healthcare record, based on stereotactic and/or conformational radiotherapy, the implementation of the healthcare file in a distributed database using the World Wide Web platform for data presentation and transmission, and its availability over the network. The chosen solution is a good example of technology transfer between high-energy physics and medicine and opens interesting new possibilities in this field.
Dunham, Jason B.; Gallo, Kirsten; Shively, Dan; Allen, Chris; Goehring, Brad
2011-01-01
Translocations to recover native fishes have resulted in mixed success. One reason for the failure of these actions is inadequate assessments of their feasibility prior to implementation. Here, we provide a framework developed to assess the feasibility of one type of translocation-reintroduction. The framework was founded on two simple components of feasibility: the potential for recipient habitats to support a reintroduction and the potential of available donor populations to support a reintroduction. Within each component, we developed a series of key questions. The final assessment was based on a scoring system that incorporated consideration of uncertainty in available information. The result was a simple yet transparent system for assessing reintroduction feasibility that can be rapidly applied in practice. We applied this assessment framework to the potential reintroduction of threatened bull trout Salvelinus confluentus into the Clackamas River, Oregon. In this case, the assessment suggested that the degree of feasibility for reintroduction was high based on the potential of recipient habitats and available donor populations. The assessment did not provide a comprehensive treatment of all possible factors that would drive an actual decision to implement a reintroduction,
Taking on breast cancer in East Africa: global challenges in breast cancer.
Kantelhardt, Eva Johanna; Cubasch, Herbert; Hanson, Claudia
2015-02-01
To provide an update on breast cancer epidemiology, early detection, and therapy in Africa. Breast cancer has long been a neglected topic in Africa. Owing to the increased activity of population-based cancer registries, cancer incidence rates are becoming available. Data from 26 African countries for 2012 suggest that in the majority of countries, breast cancer has become the leading cancer among the female population. Yet data from hospital-based registries show that patients often present late. Efforts are being made to implement early detection programs; however, there are open questions about how best to organize screening activities and referral, and how to assure pathology services. Adjuvant treatment is still limited to a small number of centers; neoadjuvant treatment is underutilized. New data have become available from different countries reporting high proportions of estrogen receptor-positive tumors, which would possibly justify the administration of tamoxifen when receptor status is unknown. Breast cancer is an increasing health problem in low-resource countries. More information on incidence, clinical presentation, outcome, and tumor biology in Africa has become available. Further evidence is needed on strategies to improve awareness, appropriate treatment options, and implementation of palliative care. http://links.lww.com/COOG/A17
Real-time demonstration hardware for enhanced DPCM video compression algorithm
NASA Technical Reports Server (NTRS)
Bizon, Thomas P.; Whyte, Wayne A., Jr.; Marcopoli, Vincent R.
1992-01-01
The lack of available wideband digital links as well as the complexity of implementation of bandwidth-efficient digital video CODECs (encoder/decoder) has worked to keep the cost of digital television transmission too high to compete with analog methods. Terrestrial and satellite video service providers, however, are now recognizing the potential gains that digital video compression offers and are proposing to incorporate compression systems to increase the number of available program channels. NASA is similarly recognizing the benefits of and trend toward digital video compression techniques for transmission of high quality video from space and therefore, has developed a digital television bandwidth compression algorithm to process standard National Television Systems Committee (NTSC) composite color television signals. The algorithm is based on differential pulse code modulation (DPCM), but additionally utilizes a non-adaptive predictor, non-uniform quantizer and multilevel Huffman coder to reduce the data rate substantially below that achievable with straight DPCM. The non-adaptive predictor and multilevel Huffman coder combine to set this technique apart from other DPCM encoding algorithms. All processing is done on an intra-field basis to prevent motion degradation and minimize hardware complexity. Computer simulations have shown the algorithm will produce broadcast quality reconstructed video at an average transmission rate of 1.8 bits/pixel. Hardware implementation of the DPCM circuit, non-adaptive predictor and non-uniform quantizer has been completed, providing real-time demonstration of the image quality at full video rates. Video sampling/reconstruction circuits have also been constructed to accomplish the analog video processing necessary for the real-time demonstration. Performance results for the completed hardware compare favorably with simulation results.
Hardware implementation of the multilevel Huffman encoder/decoder is currently under development along with implementation of a buffer control algorithm to accommodate the variable data rate output of the multilevel Huffman encoder. A video CODEC of this type could be used to compress NTSC color television signals where high quality reconstruction is desirable (e.g., Space Station video transmission, transmission direct-to-the-home via direct broadcast satellite systems or cable television distribution to system headends and direct-to-the-home).
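The abstract outlines the DPCM pipeline of predictor, quantizer and entropy coder. As a minimal sketch of the core predict-quantize-reconstruct loop only (the step size and uniform quantizer here are illustrative assumptions; the paper's non-adaptive predictor, non-uniform quantizer and multilevel Huffman coder are not reproduced):

```python
# Minimal intra-line DPCM sketch: predict each sample from the previously
# reconstructed one, quantize the prediction error, and reconstruct.
# A coarse uniform quantizer with an assumed step size stands in for the
# paper's non-uniform quantizer; no entropy coding is performed.

STEP = 8  # quantizer step size (illustrative assumption)

def dpcm_encode(samples):
    codes, prev = [], 0
    for s in samples:
        err = s - prev                             # prediction error
        q = round(err / STEP)                      # quantize the error
        codes.append(q)
        prev = max(0, min(255, prev + q * STEP))   # track decoder's reconstruction
    return codes

def dpcm_decode(codes):
    out, prev = [], 0
    for q in codes:
        prev = max(0, min(255, prev + q * STEP))   # accumulate dequantized errors
        out.append(prev)
    return out

pixels = [10, 12, 15, 100, 101, 99, 40]
rec = dpcm_decode(dpcm_encode(pixels))
# Reconstruction error is bounded by half the quantizer step size.
assert all(abs(a - b) <= STEP // 2 for a, b in zip(pixels, rec))
```

Because the encoder tracks the decoder's reconstruction rather than the raw previous sample, quantization errors do not accumulate along the line.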
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-24
... Haze State Implementation Plan; Federal Implementation Plan for Regional Haze AGENCY: Environmental... (SIP) revision submitted by the State of Wyoming on January 12, 2011, that addresses regional haze...; Regional Haze State Implementation Plan; Federal Implementation Plan for Regional Haze; Proposed Rule (77...
Web-based interactive 2D/3D medical image processing and visualization software.
Mahmoudi, Seyyed Ehsan; Akhondi-Asl, Alireza; Rahmani, Roohollah; Faghih-Roohi, Shahrooz; Taimouri, Vahid; Sabouri, Ahmad; Soltanian-Zadeh, Hamid
2010-05-01
There are many medical image processing software tools available for research and diagnosis purposes. However, most of these tools are available only as local applications. This limits the accessibility of the software to a specific machine, and thus the data and processing power of that application are not available to other workstations. Further, there are operating system and processing power limitations which prevent such applications from running on every type of workstation. By developing web-based tools, it is possible for users to access the medical image processing functionalities wherever the internet is available. In this paper, we introduce a pure web-based, interactive, extendable, 2D and 3D medical image processing and visualization application that requires no client installation. Our software uses a four-layered design consisting of an algorithm layer, web-user-interface layer, server communication layer, and wrapper layer. To compete with extendibility of the current local medical image processing software, each layer is highly independent of other layers. A wide range of medical image preprocessing, registration, and segmentation methods are implemented using open source libraries. Desktop-like user interaction is provided by using AJAX technology in the web-user-interface. For the visualization functionality of the software, the VRML standard is used to provide 3D features over the web. Integration of these technologies has allowed implementation of our purely web-based software with high functionality without requiring powerful computational resources in the client side. The user-interface is designed such that the users can select appropriate parameters for practical research and clinical studies. Copyright (c) 2009 Elsevier Ireland Ltd. All rights reserved.
Panagiotoulias, I; Botsou, F; Kaberi, H; Karageorgis, A P; Scoullos, M
2017-10-31
In order to document the impact of Best Available Techniques (BAT) and implementation of regulation on the improvement of the coastal marine environment state, we examined the case of a representative steel mill located at the Gulf of Elefsis (Greece). The evaluation of metal pollution was based on the analysis of major and trace elements, organic carbon, magnetic properties, and sediment accumulation rates, in sediment cores obtained from the vicinity of the plant. The analytical data are discussed in relation to steel production, changes of production routes, and adoption of BAT introduced in order to fulfill EU and national legislation. The results show that the input of pollutants to sediments and the degree of contamination were reduced by approximately 40-70% in the decade 2003-2015 in comparison to the periods of high discharges (1963-2002), whereas the toxicity risks from "high-to-extremely high" were reduced to "medium-to-high."
Sun, Ryan; Bouchard, Matthew B.; Hillman, Elizabeth M. C.
2010-01-01
Camera-based in-vivo optical imaging can provide detailed images of living tissue that reveal structure, function, and disease. High-speed, high resolution imaging can reveal dynamic events such as changes in blood flow and responses to stimulation. Despite these benefits, commercially available scientific cameras rarely include software that is suitable for in-vivo imaging applications, making this highly versatile form of optical imaging challenging and time-consuming to implement. To address this issue, we have developed a novel, open-source software package to control high-speed, multispectral optical imaging systems. The software integrates a number of modular functions through a custom graphical user interface (GUI) and provides extensive control over a wide range of inexpensive IEEE 1394 Firewire cameras. Multispectral illumination can be incorporated through the use of off-the-shelf light emitting diodes which the software synchronizes to image acquisition via a programmed microcontroller, allowing arbitrary high-speed illumination sequences. The complete software suite is available for free download. Here we describe the software’s framework and provide details to guide users with development of this and similar software. PMID:21258475
Implementing an electronic hand hygiene monitoring system: Lessons learned from community hospitals.
Edmisten, Catherine; Hall, Charles; Kernizan, Lorna; Korwek, Kimberly; Preston, Aaron; Rhoades, Evan; Shah, Shalin; Spight, Lori; Stradi, Silvia; Wellman, Sonia; Zygadlo, Scott
2017-08-01
Measuring and providing feedback about hand hygiene (HH) compliance is a complicated process. Electronic HH monitoring systems have been proposed as a possible solution; however, there is little information available about how to successfully implement and maintain these systems for maximum benefit in community hospitals. An electronic HH monitoring system was implemented in 3 community hospitals by teams at each facility with support from the system vendor. Compliance rates were measured by the electronic monitoring system. The implementation challenges, solutions, and drivers of success were monitored within each facility. The electronic HH monitoring systems tracked on average more than 220,000 compliant HH events per facility per month, with an average monthly compliance rate >85%. The sharing of best practices between facilities was valuable in addressing challenges encountered during implementation and maintaining a high rate of use. Drivers of success included a collaborative environment, leadership commitment, using data to drive improvement, consistent and constant messaging, staff empowerment, and patient involvement. Realizing the full benefit of investments in electronic HH monitoring systems requires careful consideration of implementation strategies, planning for ongoing support and maintenance, and presenting data in a meaningful way to empower and inspire staff. Copyright © 2017 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-07
... Promulgation of Air Quality Implementation Plans: North Carolina; Control Techniques Guidelines and Reasonably Available Control Technology AGENCY: Environmental Protection Agency (EPA). ACTION: Proposed rule. SUMMARY... Carolina's commitment associated with the conditional approval of its reasonably available control...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-04
... Emission Inventory, Contingency Measures, Reasonably Available Control Measures, and Transportation... Implementation Plan (SIP) to meet the 2002 base year emissions inventory, the reasonable further progress (RFP) plan, RFP contingency measure, and reasonably available control measure (RACM) requirements of the...
Implementing clinical guidelines for chronic obstructive pulmonary disease: barriers and solutions
Overington, Jeff D.; Huang, Yao C.; Abramson, Michael J.; Brown, Juliet L.; Goddard, John R.; Bowman, Rayleen V.; Fong, Kwun M.
2014-01-01
Chronic obstructive pulmonary disease (COPD) is a complex chronic lung disease characterised by progressive fixed airflow limitation and acute exacerbations that frequently require hospitalisation. Evidence-based clinical guidelines for the diagnosis and management of COPD are now widely available. However, the uptake of these COPD guidelines in clinical practice is highly variable, as is the case for many other chronic disease guidelines. Studies have identified many barriers to implementation of COPD and other guidelines, including factors such as lack of familiarity with guidelines amongst clinicians and inadequate implementation programs. Several methods for enhancing adherence to clinical practice guidelines have been evaluated, including distribution methods, professional education sessions, electronic health records (EHR), point of care reminders and computer decision support systems (CDSS). Results of these studies are mixed to date, and the most effective ways to implement clinical practice guidelines remain unclear. Given the significant resources dedicated to evidence-based medicine, effective dissemination and implementation of best practice at the patient level is an important final step in the process of guideline development. Future efforts should focus on identifying optimal methods for translating the evidence into everyday clinical practice to ensure that patients receive the best care. PMID:25478199
Arora, Prerna G; Connors, Elizabeth H; George, Melissa W; Lyon, Aaron R; Wolk, Courtney B; Weist, Mark D
2016-12-01
Evidence-based assessment (EBA) is a critically important aspect of delivering high-quality, school-based mental health care for youth. However, research in this area is limited and additional applied research on how best to support the implementation of EBA in school mental health (SMH) is needed. Accordingly, this manuscript seeks to facilitate the advancement of research on EBA in SMH by reviewing relevant literature on EBA implementation in schools and providing recommendations for key research priorities. Given the limited number of published studies available, findings from child and adolescent mental health and implementation science research are also included to inform a robust and comprehensive research agenda on this topic. Based on this literature review, five priorities for research on EBA in SMH are outlined: (1) effective identification of assessment targets, (2) appropriate selection of assessment measures, (3) investigation of organizational readiness for EBA, (4) study of implementation support for EBA, and (5) promotion of EBA data integration and use. Each priority area includes recommended directions for future research. A comprehensive and robust research agenda is warranted to build the science and practice of implementing EBA in SMH. Specific directions for this agenda are offered.
PhenStat | Informatics Technology for Cancer Research (ITCR)
PhenStat is a freely available R package that provides a variety of statistical methods for the identification of phenotypic associations in model organisms, developed for the International Mouse Phenotyping Consortium (IMPC; www.mousephenotype.org). The methods have been developed for high-throughput phenotyping pipelines implemented across various experimental designs, with an emphasis on managing temporal variation, and are being adapted for analysis of PDX mouse strains.
NASA Technical Reports Server (NTRS)
Follett, William W.; Rajagopal, Raj
2001-01-01
The focus of the AA MDO team is to reduce product development cost through the capture and automation of best design and analysis practices and through increasing the availability of low-cost, high-fidelity analysis. Implementation of robust designs reduces costs associated with the Test-Fail-Fix cycle. RD is currently focusing on several technologies to improve the design process, including optimization and robust design, expert and rule-based systems, and collaborative technologies.
Automatic Data Processing Equipment (ADPE) acquisition plan for the medical sciences
NASA Technical Reports Server (NTRS)
1979-01-01
An effective mechanism for meeting the SLSD/MSD data handling/processing requirements for Shuttle is discussed. The ability to meet these requirements depends upon the availability of a general purpose high speed digital computer system. This system is expected to implement those data base management and processing functions required across all SLSD/MSD programs during training, laboratory operations/analysis, simulations, mission operations, and post mission analysis/reporting.
CAGE IIIA Distributed Simulation Design Methodology
2014-05-01
VHF: Very High Frequency. VLC: VideoLAN Codec, an open-source cross-platform multimedia player and framework. VM: Virtual Machine. VOIP: Voice Over...Implementing Defence Experimentation (GUIDEx). The key challenges for this methodology are with understanding how to design it, define the...operation and to be available in the other nation's simulations. The challenge for the CAGE campaign of experiments is to continue to build upon this
Telecommunications Services Required by Distributed and Interconnected Office Centers.
1980-07-20
systems and communications management systems which are on the market. It is expected that these systems and the capabilities they offer will be available… saw the possibilities of marketing the service, but was delayed in its implementation because the high capacity communication network to support the… Jersey 07666. [18] Washburn, C., Unfolding Electronic Mail Market Leads to Integrated Info Systems, Communications News, November 1979, page 56. [19]
NASA Astrophysics Data System (ADS)
Suyanto, Slamet
2017-08-01
The Indonesian government introduced a new curriculum in 2013, the Curriculum of 2013 (C13). Its implementation has become controversial because the majority of schools were set back to the previous curriculum, KTSP (School-Based Curriculum). Were the schools not ready to implement the Curriculum of 2013? This survey research was conducted to provide evidence on school readiness for implementing the new curriculum and to identify problems with its implementation. The sample comprised 33 junior high schools from seven regencies in Indonesia; the respondents were 33 school principals and vice principals for curriculum affairs, 200 teachers, and 200 students. Data were collected using questionnaires, interviews, and observation checklists during monitoring and evaluation programs facilitated by the Indonesian Directorate of Junior High School Development Management. The results indicate that (1) 9 schools (27.27%) were ready, 17 schools (51.52%) were less ready, and 7 schools (21.21%) were not ready to implement the new curriculum; (2) school readiness was limited by poor availability of books (only 23% of schools had complete student books), the number of trained teachers (only 33% of teachers received training), ICT access (only 17% of schools had good ICT access for all students), and teachers' understanding of the learning and assessment process (only 37% of teachers understood the new curriculum well). Teachers had difficulties in (1) developing a lesson plan (16%), (2) using the scientific approach (31.5%), and (3) implementing authentic assessment (43.5%). Most students (78.5%) said that learning with the new curriculum is more difficult than before. Therefore, specific training on implementing the new curriculum is still needed.
Kalman-variant estimators for state of charge in lithium-sulfur batteries
NASA Astrophysics Data System (ADS)
Propp, Karsten; Auger, Daniel J.; Fotouhi, Abbas; Longo, Stefano; Knap, Vaclav
2017-03-01
Lithium-sulfur batteries are now commercially available, offering high specific energy density, low production costs and high safety. However, there is no commercially-available battery management system for them, and there are no published methods for determining state of charge in situ. This paper describes a study to address this gap. The properties and behaviours of lithium-sulfur are briefly introduced, and the applicability of 'standard' lithium-ion state-of-charge estimation methods is explored. Open-circuit voltage methods and 'Coulomb counting' are found to have a poor fit for lithium-sulfur, and model-based methods, particularly recursive Bayesian filters, are identified as showing strong promise. Three recursive Bayesian filters are implemented: an extended Kalman filter (EKF), an unscented Kalman filter (UKF) and a particle filter (PF). These estimators are tested through practical experimentation, considering both a pulse-discharge test and a test based on the New European Driving Cycle (NEDC). Experimentation is carried out at a constant temperature, mirroring the environment expected in the authors' target automotive application. It is shown that the estimators, which are based on a relatively simple equivalent-circuit-network model, can deliver useful results. Of the three estimators implemented, the unscented Kalman filter gives the most robust and accurate performance, with an acceptable computational effort.
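As a concrete illustration of the model-based approach the abstract describes, the following is a minimal scalar extended Kalman filter for state of charge built on a deliberately simple equivalent-circuit model. The OCV curve, series resistance, capacity, and noise settings are illustrative placeholders, not lithium-sulfur parameters from the study.

```python
# Scalar EKF sketch for state-of-charge (SoC) estimation, assuming:
#   state:       soc_k+1 = soc_k - current*dt/capacity   (Coulomb dynamics)
#   measurement: v_k     = ocv(soc_k) - r0*current       (terminal voltage)
# All numbers below are hypothetical, not lithium-sulfur data.

def ocv(soc):
    """Hypothetical open-circuit voltage curve (V), linear in SoC."""
    return 2.1 + 0.3 * soc

def ekf_soc(voltages, current, dt=1.0, capacity=3600.0, r0=0.05,
            soc0=0.5, p0=0.1, q=1e-7, r=1e-4):
    """Run a scalar EKF over terminal-voltage samples at constant current."""
    soc, p = soc0, p0
    estimates = []
    h = 0.3                                  # d(ocv)/d(soc) for the linear curve
    for v in voltages:
        soc -= current * dt / capacity       # predict (state Jacobian F = 1)
        p += q
        y = v - (ocv(soc) - r0 * current)    # innovation
        k = p * h / (h * p * h + r)          # Kalman gain
        soc += k * y                         # update
        p *= (1.0 - k * h)
        estimates.append(soc)
    return estimates
```

Fed noiseless voltages from a cell that truly starts at SoC 0.8, the filter converges from a wrong initial guess of 0.5 within a few samples.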
Benedikt, Clemens; Kelly, Sherrie L; Wilson, David; Wilson, David P
2016-12-01
Estimated global new HIV infections among people who inject drugs (PWID) remained stable over the 2010-2015 period and the target of a 50% reduction over this period was missed. To achieve the 2020 UNAIDS target of reducing adult HIV infections by 75% compared to 2010, accelerated action in scaling up HIV programs for PWID is required. In a context of diminishing external support to HIV programs in countries where most HIV-affected PWID live, it is essential that available resources are allocated and used as efficiently as possible. Allocative and implementation efficiency analysis methods were applied. Optima, a dynamic, population-based HIV model with an integrated program and economic analysis framework was applied in eight countries in Eastern Europe and Central Asia (EECA). Mathematical analyses established optimized allocations of resources. An implementation efficiency analysis focused on examining technical efficiency, unit costs, and heterogeneity of service delivery models and practices. Findings from the latest reported data revealed that countries allocated between 4% (Bulgaria) and 40% (Georgia) of total HIV resources to programs targeting PWID - with a median of 13% for the eight countries. When distributing the same amount of HIV funding optimally, between 9% and 25% of available HIV resources would be allocated to PWID programs with a median allocation of 16% and, in addition, antiretroviral therapy would be scaled up including for PWID. As a result of optimized allocations, new HIV infections are projected to decline by 3-28% and AIDS-related deaths by 7-53% in the eight countries. Implementation efficiencies identified involve potential reductions in drug procurement costs, service delivery models, and practices and scale of service delivery influencing cost and outcome. A high level of implementation efficiency was associated with high volumes of PWID clients accessing a drug harm reduction facility. 
A combination of optimized allocation of resources, improved implementation efficiency and increased investment of non-HIV resources is required to enhance coverage and improve outcomes of programs for PWID. Increasing efficiency of HIV programs for PWID is a key step towards avoiding implicit rationing and ensuring transparent allocation of resources where and how they would have the largest impact on the health of PWID, and thereby ensuring that funding spent on PWID becomes a global best buy in public health. Copyright © 2016. Published by Elsevier B.V.
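The allocative-efficiency idea above, redistributing a fixed budget toward the programs with the greatest marginal impact, can be sketched as a toy greedy optimizer. The impact curves, program names, and step size here are invented for illustration; Optima itself uses a dynamic, population-based HIV transmission model, not this.

```python
# Toy greedy budget allocation under diminishing returns, assuming each
# program's impact is scale * sqrt(spend). Purely illustrative.
from math import sqrt

def allocate(budget, programs, step=1.0):
    """programs: {name: scale}. Fund the highest-marginal-impact program
    in `step`-sized increments until the budget is exhausted."""
    alloc = {p: 0.0 for p in programs}

    def marginal(p):
        x = alloc[p]
        return programs[p] * (sqrt(x + step) - sqrt(x))

    spent = 0.0
    while spent + step <= budget:
        best = max(programs, key=marginal)
        alloc[best] += step
        spent += step
    return alloc
```

With concave per-program returns, this greedy rule approaches the allocation where marginal impacts are equalized across programs, the same equalization principle behind mathematical allocative-efficiency analyses.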
Process Evaluation of Making HEPA Policy Practice: A Group Randomized Trial.
Weaver, Robert G; Moore, Justin B; Huberty, Jennifer; Freedman, Darcy; Turner-McGrievy, Brie; Beighle, Aaron; Ward, Diane; Pate, Russell; Saunders, Ruth; Brazendale, Keith; Chandler, Jessica; Ajja, Rahma; Kyryliuk, Becky; Beets, Michael W
2016-09-01
This study examines the link between implementation of Strategies to Enhance Practice (STEPs) and outcomes. Twenty after-school programs (ASPs) participated in an intervention to increase children's accumulation of 30 minutes/day of moderate to vigorous physical activity (MVPA) and the quality of snacks served during program time. Outcomes were measured via accelerometer (MVPA) and direct observation (snacks). STEPs implementation data were collected via document review and direct observation. Based on implementation data, ASPs were divided into high and low implementers. Differences between high and low implementers' change in the percentage of boys accumulating 30 minutes/day of MVPA were observed; there was no difference between high and low implementers for girls. The number of days on which fruits and/or vegetables and water were served increased in both the high- and low-implementation groups, while desserts and sugar-sweetened beverages decreased. Effect sizes (ES) for the difference in changes between the high and low groups ranged from low (ES = 0.16) to high (ES = 0.97). Higher levels of implementation led to increased MVPA for boys, whereas girls' MVPA benefited from the intervention regardless of implementation level. ESs of the difference between high and low implementers indicate that increased implementation of STEPs increases the number of days healthier snacks are served. Programs in the high-implementation group implemented a variety of STEPs strategies, suggesting local adoption/adaptation is key to implementation. © 2016 Society for Public Health Education.
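The effect sizes quoted above are standardized mean differences; a minimal sketch of Cohen's d with a pooled standard deviation follows. The abstract does not specify this exact estimator, and the sample data in the test are invented.

```python
# Cohen's d for two independent samples, using the pooled sample SD.
# Illustrative only; the study's exact ES computation is not given.
from math import sqrt
from statistics import mean, variance

def cohens_d(x, y):
    """Standardized mean difference between samples x and y."""
    nx, ny = len(x), len(y)
    pooled_sd = sqrt(((nx - 1) * variance(x) + (ny - 1) * variance(y))
                     / (nx + ny - 2))
    return (mean(x) - mean(y)) / pooled_sd
```

By the usual convention, d around 0.2 is a small effect and around 0.8 a large one, matching the paper's "low (ES = 0.16) to high (ES = 0.97)" range.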
Cradock, Angie L; Kenney, Erica L; McHugh, Anne; Conley, Lisa; Mozaffarian, Rebecca S; Reiner, Jennifer F; Gortmaker, Steven L
2015-09-10
Intake of sugar-sweetened beverages (SSBs) is associated with negative health effects. Access to healthy beverages may be promoted by policies such as the Healthy Beverage Executive Order (HBEO) established by former Boston mayor Thomas M. Menino, which directed city departments to eliminate the sale of SSBs on city property. Implementation consisted of "traffic-light signage" and educational materials at point of purchase. This study evaluates the impact of the HBEO on changes in beverage availability. Researchers collected data on price, brand, and size of beverages for sale in spring 2011 (899 beverage slots) and for sale in spring 2013, two years after HBEO implementation (836 beverage slots) at access points (n = 31) at city agency locations in Boston. Nutrient data, including calories and sugar content, from manufacturer websites were used to determine HBEO beverage traffic-light classification category. We used paired t tests to examine change in average calories and sugar content of beverages and the proportion of beverages by traffic-light classification at access points before and after HBEO implementation. Average beverage sugar grams and calories at access points decreased (sugar, -13.1 g; calories, -48.6 kcal; p<.001) following the implementation of the HBEO. The average proportion of high-sugar ("red") beverages available per access point declined (-27.8%, p<.001). Beverage prices did not change over time. City agencies were significantly more likely to sell only low-sugar beverages after the HBEO was implemented (OR = 4.88; 95% CI, 1.49-16.0). Policies such as the HBEO can promote community-wide changes that make healthier beverage options more accessible on city-owned properties.
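The before/after comparisons above rest on paired t tests over per-access-point measures. A minimal standard-library sketch of the statistic (the samples in the test are invented, not the study's data):

```python
# Paired t test statistic: t = mean(d) / (sd(d)/sqrt(n)), where d are the
# per-unit after-minus-before differences. Illustrative sketch only.
from math import sqrt
from statistics import mean, stdev

def paired_t(before, after):
    """Return (t statistic, degrees of freedom) for paired samples."""
    diffs = [a - b for b, a in zip(before, after)]
    n = len(diffs)
    t = mean(diffs) / (stdev(diffs) / sqrt(n))
    return t, n - 1
```

The resulting t and df would then be compared against the t distribution to obtain a p value, as in the study's sugar and calorie comparisons.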
Kenney, Erica L.; McHugh, Anne; Conley, Lisa; Mozaffarian, Rebecca S.; Reiner, Jennifer F.; Gortmaker, Steven L.
2015-01-01
Introduction Intake of sugar-sweetened beverages (SSBs) is associated with negative health effects. Access to healthy beverages may be promoted by policies such as the Healthy Beverage Executive Order (HBEO) established by former Boston mayor Thomas M. Menino, which directed city departments to eliminate the sale of SSBs on city property. Implementation consisted of “traffic-light signage” and educational materials at point of purchase. This study evaluates the impact of the HBEO on changes in beverage availability. Methods Researchers collected data on price, brand, and size of beverages for sale in spring 2011 (899 beverage slots) and for sale in spring 2013, two years after HBEO implementation (836 beverage slots) at access points (n = 31) at city agency locations in Boston. Nutrient data, including calories and sugar content, from manufacturer websites were used to determine HBEO beverage traffic-light classification category. We used paired t tests to examine change in average calories and sugar content of beverages and the proportion of beverages by traffic-light classification at access points before and after HBEO implementation. Results Average beverage sugar grams and calories at access points decreased (sugar, −13.1 g; calories, −48.6 kcal; p<.001) following the implementation of the HBEO. The average proportion of high-sugar (“red”) beverages available per access point declined (−27.8%, p<.001). Beverage prices did not change over time. City agencies were significantly more likely to sell only low-sugar beverages after the HBEO was implemented (OR = 4.88; 95% CI, 1.49–16.0). Discussion Policies such as the HBEO can promote community-wide changes that make healthier beverage options more accessible on city-owned properties. PMID:26355828
2016-06-23
The Food and Drug Administration (FDA) is announcing the availability of its FDA Adverse Event Reporting System (FAERS) Regional Implementation Specifications for the International Conference on Harmonisation (ICH) E2B(R3) Specification. FDA is making this technical specifications document available to assist interested parties in electronically submitting individual case safety reports (ICSRs) (and ICSR attachments) to the Center for Drug Evaluation and Research (CDER) and the Center for Biologics Evaluation and Research (CBER). This document, entitled "FDA Regional Implementation Specifications for ICH E2B(R3) Implementation: Postmarket Submission of Individual Case Safety Reports (ICSRs) for Drugs and Biologics, Excluding Vaccines" supplements the "E2B(R3) Electronic Transmission of Individual Case Safety Reports (ICSRs) Implementation Guide--Data Elements and Message Specification" final guidance for industry and describes FDA's technical approach for receiving ICSRs, for incorporating regionally controlled terminology, and for adding region-specific data elements when reporting to FAERS.
Genomics Virtual Laboratory: A Practical Bioinformatics Workbench for the Cloud
Afgan, Enis; Sloggett, Clare; Goonasekera, Nuwan; Makunin, Igor; Benson, Derek; Crowe, Mark; Gladman, Simon; Kowsar, Yousef; Pheasant, Michael; Horst, Ron; Lonie, Andrew
2015-01-01
Background Analyzing high throughput genomics data is a complex and compute intensive task, generally requiring numerous software tools and large reference data sets, tied together in successive stages of data transformation and visualisation. A computational platform enabling best practice genomics analysis ideally meets a number of requirements, including: a wide range of analysis and visualisation tools, closely linked to large user and reference data sets; workflow platform(s) enabling accessible, reproducible, portable analyses, through a flexible set of interfaces; highly available, scalable computational resources; and flexibility and versatility in the use of these resources to meet demands and expertise of a variety of users. Access to an appropriate computational platform can be a significant barrier to researchers, as establishing such a platform requires a large upfront investment in hardware, experience, and expertise. Results We designed and implemented the Genomics Virtual Laboratory (GVL) as a middleware layer of machine images, cloud management tools, and online services that enable researchers to build arbitrarily sized compute clusters on demand, pre-populated with fully configured bioinformatics tools, reference datasets and workflow and visualisation options. The platform is flexible in that users can conduct analyses through web-based (Galaxy, RStudio, IPython Notebook) or command-line interfaces, and add/remove compute nodes and data resources as required. Best-practice tutorials and protocols provide a path from introductory training to practice. The GVL is available on the OpenStack-based Australian Research Cloud (http://nectar.org.au) and the Amazon Web Services cloud. The principles, implementation and build process are designed to be cloud-agnostic. Conclusions This paper provides a blueprint for the design and implementation of a cloud-based Genomics Virtual Laboratory. 
We discuss scope, design considerations and technical and logistical constraints, and explore the value added to the research community through the suite of services and resources provided by our implementation. PMID:26501966
Genomics Virtual Laboratory: A Practical Bioinformatics Workbench for the Cloud.
Afgan, Enis; Sloggett, Clare; Goonasekera, Nuwan; Makunin, Igor; Benson, Derek; Crowe, Mark; Gladman, Simon; Kowsar, Yousef; Pheasant, Michael; Horst, Ron; Lonie, Andrew
2015-01-01
Analyzing high throughput genomics data is a complex and compute intensive task, generally requiring numerous software tools and large reference data sets, tied together in successive stages of data transformation and visualisation. A computational platform enabling best practice genomics analysis ideally meets a number of requirements, including: a wide range of analysis and visualisation tools, closely linked to large user and reference data sets; workflow platform(s) enabling accessible, reproducible, portable analyses, through a flexible set of interfaces; highly available, scalable computational resources; and flexibility and versatility in the use of these resources to meet demands and expertise of a variety of users. Access to an appropriate computational platform can be a significant barrier to researchers, as establishing such a platform requires a large upfront investment in hardware, experience, and expertise. We designed and implemented the Genomics Virtual Laboratory (GVL) as a middleware layer of machine images, cloud management tools, and online services that enable researchers to build arbitrarily sized compute clusters on demand, pre-populated with fully configured bioinformatics tools, reference datasets and workflow and visualisation options. The platform is flexible in that users can conduct analyses through web-based (Galaxy, RStudio, IPython Notebook) or command-line interfaces, and add/remove compute nodes and data resources as required. Best-practice tutorials and protocols provide a path from introductory training to practice. The GVL is available on the OpenStack-based Australian Research Cloud (http://nectar.org.au) and the Amazon Web Services cloud. The principles, implementation and build process are designed to be cloud-agnostic. This paper provides a blueprint for the design and implementation of a cloud-based Genomics Virtual Laboratory. 
We discuss scope, design considerations and technical and logistical constraints, and explore the value added to the research community through the suite of services and resources provided by our implementation.
Implementation and Testing of Low Cost Uav Platform for Orthophoto Imaging
NASA Astrophysics Data System (ADS)
Brucas, D.; Suziedelyte-Visockiene, J.; Ragauskas, U.; Berteska, E.; Rudinskas, D.
2013-08-01
Implementation of unmanned aerial vehicles (UAVs) for civilian applications is rapidly increasing. Technologies that were expensive and available only for military use have recently spread to the civilian market, and a vast number of low-cost, open-source components and systems are available for use on UAVs. Using low-cost hobby and open-source components considerably decreases the price of a UAV, though in some cases it compromises reliability. At the Space Science and Technology Institute (SSTI), in collaboration with Vilnius Gediminas Technical University (VGTU), research has been performed on constructing and deploying small UAVs composed of low-cost open-source components and in-house developments. The most obvious and simple application of such UAVs is orthophoto imaging, with data download and processing after the flight. The construction and deployment of the UAVs, flight experience, data processing and data use are covered in the paper and presentation.
Helfrich, Christian D; Sylling, Philip W; Gale, Randall C; Mohr, David C; Stockdale, Susan E; Joos, Sandra; Brown, Elizabeth J; Grembowski, David; Asch, Steven M; Fihn, Stephan D; Nelson, Karin M; Meredith, Lisa S
2016-02-24
The patient-centered medical home (PCMH) is a team-based, comprehensive model of primary care. When effectively implemented, PCMH is associated with higher patient satisfaction, lower staff burnout, and lower hospitalization for ambulatory care-sensitive conditions. However, less is known about what factors contribute to (or hinder) PCMH implementation. We explored the associations of specific facilitators and barriers reported by primary care employees with a previously validated, clinic-level measure of PCMH implementation, the Patient Aligned Care Team Implementation Progress Index (Pi(2)). We used a 2012 survey of primary care employees in the Veterans Health Administration to perform cross-sectional, respondent-level multinomial regressions. The dependent variable was the Pi(2) categorized as high implementation (top decile, 54 clinics, 235 respondents), medium implementation (middle eight deciles, 547 clinics, 4537 respondents), and low implementation (lowest decile, 42 clinics, 297 respondents) among primary care clinics. The independent variables were ordinal survey items rating 19 barriers to patient-centered care and 10 facilitators of PCMH implementation. For facilitators, we explored clinic Pi(2) score decile both as a function of respondent-reported availability of facilitators and of rating of facilitator helpfulness. The availability of five facilitators was associated with higher odds of a respondent's clinic's Pi(2) scores being in the highest versus lowest decile: teamlet huddles (OR = 3.91), measurement tools (OR = 3.47), regular team meetings (OR = 2.88), information systems (OR = 2.42), and disease registries (OR = 2.01). The helpfulness of four facilitators was associated with higher odds of a respondent's clinic's Pi(2) scores being in the highest versus lowest decile. 
Six barriers were associated with significantly higher odds of a respondent's clinic's Pi(2) scores being in the lowest versus highest decile, with the strongest associations for the difficulty recruiting and retaining providers (OR = 2.37) and non-provider clinicians (OR = 2.17). Results for medium versus low Pi(2) score clinics were similar, with fewer, smaller significant associations, all in the expected direction. A number of specific barriers and facilitators were associated with PCMH implementation, notably recruitment and retention of clinicians, team huddles, and local education. These findings can guide future research, and may help healthcare policy makers and leaders decide where to focus attention and limited resources.
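The associations above are odds ratios from multinomial regression. As a simpler, hypothetical illustration of how an odds ratio and its confidence interval are computed, here is the 2×2-table version with a Woolf interval; the counts in the test are made up, and this is not the paper's regression model.

```python
# Odds ratio and Woolf (log-scale) 95% CI for a 2x2 table:
#   exposed,   outcome+: a    exposed,   outcome-: b
#   unexposed, outcome+: c    unexposed, outcome-: d
# Illustrative sketch, not the study's multinomial regression.
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Return (OR, lower, upper) for the table above."""
    odds_ratio = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    lower = exp(log(odds_ratio) - z * se)
    upper = exp(log(odds_ratio) + z * se)
    return odds_ratio, lower, upper
```

An interval that excludes 1 (as with the paper's OR = 3.91 for teamlet huddles) indicates a statistically significant association.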
Gehring, Nicole D; McGrath, Patrick; Wozney, Lori; Soleimani, Amir; Bennett, Kathryn; Hartling, Lisa; Huguet, Anna; Dyson, Michele P; Newton, Amanda S
2017-06-21
Researchers, healthcare planners, and policymakers convey a sense of urgency in using eMental healthcare technologies to improve pediatric mental healthcare availability and access. Yet, different stakeholders may focus on different aspects of implementation. We conducted a systematic review to identify implementation foci in research studies and government/organizational documents for eMental healthcare technologies for pediatric mental healthcare. A search of eleven electronic databases and grey literature was conducted. We included research studies and documents from organization and government websites if the focus included eMental healthcare technology for children/adolescents (0-18 years), and implementation was studied and reported (research studies) or goals/recommendations regarding implementation were made (documents). We assessed study quality using the Mixed Methods Appraisal Tool and document quality using the Appraisal of Guidelines for Research & Evaluation II. Implementation information was grouped according to Proctor and colleagues' implementation outcomes-acceptability, adoption, appropriateness, cost, feasibility, fidelity, penetration, and sustainability-and grouped separately for studies and documents. Twenty research studies and nine government/organizational documents met eligibility criteria. These articles represented implementation of eMental healthcare technologies in the USA (14 studies), United Kingdom (2 documents, 3 studies), Canada (2 documents, 1 study), Australia (4 documents, 1 study), New Zealand (1 study), and the Netherlands (1 document). The quality of research studies was excellent (n = 11), good (n = 6), and poor (n = 1). These eMental health studies focused on the acceptability (70%, n = 14) and appropriateness (50%, n = 10) of eMental healthcare technologies to users and mental healthcare professionals. The quality of government and organizational documents was high (n = 2), medium (n = 6), and low (n = 1). 
These documents focused on cost (100%, n = 9), penetration (89%, n = 8), feasibility (78%, n = 7), and sustainability (67%, n = 6) of implementing eMental healthcare technology. To date, research studies have largely focused on acceptability and appropriateness, while government/organizational documents state goals and recommendations regarding costs, feasibility, and sustainability of eMental healthcare technologies. These differences suggest that the research evidence available for pediatric eMental healthcare technologies does not reflect the focus of governments and organizations. Partnerships between researchers, healthcare planners, and policymakers may help to align implementation research with policy development, decision-making, and funding foci.
NASA Astrophysics Data System (ADS)
Steiger, Damian S.; Haener, Thomas; Troyer, Matthias
Quantum computers promise to transform our notions of computation by offering a completely new paradigm. A high level quantum programming language and optimizing compilers are essential components to achieve scalable quantum computation. In order to address this, we introduce the ProjectQ software framework - an open source effort to support both theorists and experimentalists by providing intuitive tools to implement and run quantum algorithms. Here, we present our ProjectQ quantum compiler, which compiles a quantum algorithm from our high-level Python-embedded language down to low-level quantum gates available on the target system. We demonstrate how this compiler can be used to control actual hardware and to run high-performance simulations.
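The compiler described above rewrites high-level operations down to the gates available on the target system. The following toy sketch illustrates that core idea with recursive rule-based rewriting; the rule table, gate names, and NATIVE set are our own simplifications for illustration, not ProjectQ's actual API or decomposition rules.

```python
# Toy gate-set rewriting: non-native gates are recursively replaced by
# equivalent sequences until only native gates remain. The decompositions
# use standard identities (H as Rz-Rx-Rz up to global phase; CZ via
# H-CNOT-H), but the representation is illustrative only.

NATIVE = {"Rx", "Rz", "CNOT"}   # hypothetical target gate set

RULES = {
    "H":  [("Rz", 1.5708), ("Rx", 1.5708), ("Rz", 1.5708)],
    "CZ": [("H", None), ("CNOT", None), ("H", None)],
}

def compile_circuit(circuit):
    """Rewrite (gate, param) pairs until every gate is native."""
    out = []
    for gate, param in circuit:
        if gate in NATIVE:
            out.append((gate, param))
        else:
            out.extend(compile_circuit(RULES[gate]))
    return out
```

Compiling a single CZ yields seven native gates: two H-decompositions of three rotations each around the native CNOT.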
NASA Astrophysics Data System (ADS)
Farid, M.; Iswoyo, H.; Ridwan, I.; Nasaruddin; Dermawan, R.
2018-05-01
Use of certified seed is necessary in rice farming systems to ensure high production and productivity. In South Sulawesi, one of the main rice-producing centers of Indonesia, increases in regional harvest area and land productivity were not followed by a significant increase in rice productivity. The seed policy implemented by the government covers not only the technology applied in seed production but also quality assurance, availability and the distribution system. Despite these efforts, use of certified seed by farmers in the field is still limited; therefore a survey was conducted to study the use and availability of certified rice seed in Bone regency. Farmer groups in 9 districts were sampled to obtain data concerning the use and availability of rice seed, including type, planting season, cropping pattern and availability at planting. Results show that use of certified seed in Bone regency was relatively high, but the level of seed supply was low or delayed at planting time. To overcome the supply problem, most farmers used their own seed from the previous harvest. Some suggestions to resolve this situation are discussed.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-19
... Promulgation of Implementation Plans; Georgia; Control Techniques Guidelines and Reasonably Available Control...), related to reasonably available control technology (RACT) requirements. This correcting amendment corrects... October 21, 2009, SIP submittal for certain source categories for which EPA has issued control technique...
Boni, Enrico; Bassi, Luca; Dallai, Alessandro; Guidi, Francesco; Meacci, Valentino; Ramalli, Alessandro; Ricci, Stefano; Tortoli, Piero
2016-10-01
Open scanners offer increasing support to ultrasound researchers involved in the experimental testing of novel methods. Each system presents specific performance in terms of number of channels, flexibility, processing power, data storage capability, and overall dimensions. This paper reports the design criteria and hardware/software implementation details of a new 256-channel ultrasound advanced open platform. This system is organized in a modular architecture, including multiple front-end boards, interconnected by a high-speed (80 Gb/s) ring, capable of finely controlling all transmit (TX) and receive (RX) signals. High flexibility and processing power (equivalent to 2500 GFLOP) are guaranteed by the possibility of individually programming multiple digital signal processors and field programmable gate arrays. Eighty GB of on-board memory are available for the storage of prebeamforming, postbeamforming, and baseband data. The use of latest-generation devices made it possible to integrate all the needed electronics in a small volume (34 cm × 30 cm × 26 cm). The system implements a multiline beamformer that produces images of 96 lines by 2048 depths at a frame rate of 720 Hz (expandable to 3000 Hz). The multiline beamforming capability is also exploited to implement a real-time vector Doppler scheme in which a single TX and two independent RX apertures are simultaneously used to maintain the analysis over a full pulse repetition frequency range.
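The beamformer's core operation, delay-and-sum, can be sketched in a few lines: each channel's RF samples are delayed by the geometric round-trip time to a focal point and summed. The geometry, sampling rate, and sound speed below are generic illustrative values, not the platform's actual parameters, and real implementations add delay interpolation and apodization.

```python
# Delay-and-sum beamforming for one focal point. Assumes transmit from the
# array origin and receive on each element; illustrative sketch only.
from math import sqrt

def delay_and_sum(rf, elem_x, focus, c=1540.0, fs=50e6):
    """rf[ch][n]: RF samples per channel; elem_x: element x-positions (m);
    focus: (x, z) focal point (m). Returns one beamformed sample."""
    xf, zf = focus
    total = 0.0
    for ch, x in enumerate(elem_x):
        # two-way path length: origin -> focus -> element ch
        d = zf + sqrt((x - xf) ** 2 + zf ** 2)
        n = int(round(d / c * fs))          # round-trip delay in samples
        if n < len(rf[ch]):
            total += rf[ch][n]
    return total
```

Echoes from the focal point add coherently across channels, while off-focus echoes are suppressed; a multiline beamformer simply evaluates many such focal lines from one transmit event.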
FUX-Sim: Implementation of a fast universal simulation/reconstruction framework for X-ray systems.
Abella, Monica; Serrano, Estefania; Garcia-Blas, Javier; García, Ines; de Molina, Claudia; Carretero, Jesus; Desco, Manuel
2017-01-01
The availability of digital X-ray detectors, together with advances in reconstruction algorithms, creates an opportunity for bringing 3D capabilities to conventional radiology systems. The downside is that reconstruction algorithms for non-standard acquisition protocols are generally based on iterative approaches that involve a high computational burden. The development of new flexible X-ray systems could benefit from computer simulations, which may enable performance to be checked before expensive real systems are implemented. The development of simulation/reconstruction algorithms in this context poses three main difficulties. First, the algorithms deal with large data volumes and are computationally expensive, thus leading to the need for hardware and software optimizations. Second, these optimizations are limited by the high flexibility required to explore new scanning geometries, including fully configurable positioning of source and detector elements. And third, the evolution of the various hardware setups increases the effort required for maintaining and adapting the implementations to current and future programming models. Previous works lack support for completely flexible geometries and/or compatibility with multiple programming models and platforms. In this paper, we present FUX-Sim, a novel X-ray simulation/reconstruction framework that was designed to be flexible and fast. Optimized implementation for different families of GPUs (CUDA and OpenCL) and multi-core CPUs was achieved thanks to a modularized approach based on a layered architecture and parallel implementation of the algorithms for both architectures. A detailed performance evaluation demonstrates that for different system configurations and hardware platforms, FUX-Sim maximizes performance with the CUDA programming model (5 times faster than other state-of-the-art implementations). 
Furthermore, the CPU and OpenCL programming models allow FUX-Sim to be executed over a wide range of hardware platforms.
Genecentric: a package to uncover graph-theoretic structure in high-throughput epistasis data.
Gallant, Andrew; Leiserson, Mark D M; Kachalov, Maxim; Cowen, Lenore J; Hescott, Benjamin J
2013-01-18
New technology has resulted in high-throughput screens for pairwise genetic interactions in yeast and other model organisms. For each pair in a collection of non-essential genes, an epistasis score is obtained, representing how much sicker (or healthier) the double-knockout organism will be compared to what would be expected from the sickness of the component single knockouts. Recent algorithmic work has identified graph-theoretic patterns in this data that can indicate functional modules, and even sets of genes that may occur in compensatory pathways, such as a BPM-type schema first introduced by Kelley and Ideker. However, to date, algorithms for finding such patterns in the data have been implemented only internally, with no software made publicly available. Genecentric is a new package that implements a parallelized version of the Leiserson et al. algorithm (J Comput Biol 18:1399-1409, 2011) for generating generalized BPMs from high-throughput genetic interaction data. Given a matrix of weighted epistasis values for a set of double knock-outs, Genecentric returns a list of generalized BPMs that may represent compensatory pathways. Genecentric also has an extension, GenecentricGO, to query FuncAssociate (Bioinformatics 25:3043-3044, 2009) to retrieve GO enrichment statistics on generated BPMs. Python is the only dependency, and our web site provides working examples and documentation. We find that Genecentric can be used to find coherent functional and perhaps compensatory gene sets from high-throughput genetic interaction data. Genecentric is made freely available for download under the GPLv2 from http://bcb.cs.tufts.edu/genecentric.
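To make the between-pathway-module (BPM) notion concrete: a BPM pairs two gene sets whose between-set interactions are strongly aggravating (negative epistasis) while within-set interactions are not. The toy sketch below brute-forces all bipartitions of a tiny gene set under a simplified score; it is not the parallelized Leiserson et al. algorithm that Genecentric implements, and the scoring rule is our own simplification.

```python
# Toy BPM search over a symmetric epistasis matrix E (dict of dicts).
# Score favors splits with negative between-set and non-negative
# within-set epistasis. Exhaustive search, feasible only for a few genes.
from itertools import combinations

def bpm_score(E, genes, set_a):
    """Within-set epistasis minus between-set epistasis for one split."""
    set_b = [g for g in genes if g not in set_a]
    between = sum(E[a][b] for a in set_a for b in set_b)
    within = sum(E[x][y] for s in (set_a, set_b)
                 for x, y in combinations(sorted(s), 2))
    return within - between   # high when between-set epistasis is negative

def best_bpm(E, genes):
    """Exhaustively pick the bipartition with the highest BPM score."""
    candidates = (frozenset(c) for r in range(1, len(genes))
                  for c in combinations(genes, r))
    best = max(candidates, key=lambda s: bpm_score(E, genes, s))
    return sorted(best), sorted(set(genes) - best)
```

On a 4-gene matrix where genes {0, 1} and {2, 3} interact positively within sets and negatively across them, the search recovers exactly that split.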
Genecentric: a package to uncover graph-theoretic structure in high-throughput epistasis data
2013-01-01
Background New technology has resulted in high-throughput screens for pairwise genetic interactions in yeast and other model organisms. For each pair in a collection of non-essential genes, an epistasis score is obtained, representing how much sicker (or healthier) the double-knockout organism will be compared to what would be expected from the sickness of the component single knockouts. Recent algorithmic work has identified graph-theoretic patterns in this data that can indicate functional modules, and even sets of genes that may occur in compensatory pathways, such as a BPM-type schema first introduced by Kelley and Ideker. However, to date, any algorithms for finding such patterns in the data were implemented internally, with no software being made publically available. Results Genecentric is a new package that implements a parallelized version of the Leiserson et al. algorithm (J Comput Biol 18:1399-1409, 2011) for generating generalized BPMs from high-throughput genetic interaction data. Given a matrix of weighted epistasis values for a set of double knock-outs, Genecentric returns a list of generalized BPMs that may represent compensatory pathways. Genecentric also has an extension, GenecentricGO, to query FuncAssociate (Bioinformatics 25:3043-3044, 2009) to retrieve GO enrichment statistics on generated BPMs. Python is the only dependency, and our web site provides working examples and documentation. Conclusion We find that Genecentric can be used to find coherent functional and perhaps compensatory gene sets from high throughput genetic interaction data. Genecentric is made freely available for download under the GPLv2 from http://bcb.cs.tufts.edu/genecentric. PMID:23331614
Watts, Allison W; Mâsse, Louise C; Naylor, Patti-Jean
2014-04-14
High rates of childhood obesity have generated interest among policy makers to improve the school food environment and increase students' levels of physical activity. The purpose of this study was to examine school-level changes associated with implementation of the Food and Beverage Sales in Schools (FBSS) and Daily Physical Activity (DPA) guidelines in British Columbia, Canada. Elementary and middle/high school principals completed a survey on the school food and physical activity environment in 2007-08 (N=513) and 2011-12 (N=490). Hierarchical mixed effects regression was used to examine changes in: 1) availability of food and beverages; 2) minutes per day of Physical Education (PE); 3) delivery method of PE; and 4) school community support. Models controlled for school enrollment and community type, education and income. After policy implementation was expected, more elementary schools provided access to fruits and vegetables and less to 100% fruit juice. Fewer middle/high schools provided access to sugar-sweetened beverages, French fries, baked goods, salty snacks and chocolate/candy. Schools were more likely to meet 150 min/week of PE for grade 6 students, and offer more minutes of PE per week for grade 8 and 10 students including changes to PE delivery method. School community support for nutrition and physical activity policies increased over time. Positive changes to the school food environment occurred after schools were expected to implement the FBSS and DPA guidelines. Reported changes to the school environment are encouraging and provide support for guidelines and policies that focus on increasing healthy eating and physical activity in schools.
Buettner, Florian; Moignard, Victoria; Göttgens, Berthold; Theis, Fabian J.
2014-01-01
Motivation: High-throughput single-cell quantitative real-time polymerase chain reaction (qPCR) is a promising technique allowing for new insights in complex cellular processes. However, the PCR reaction can be detected only up to a certain detection limit, whereas failed reactions could be due to low or absent expression, and the true expression level is unknown. Because this censoring can occur for high proportions of the data, it is one of the main challenges when dealing with single-cell qPCR data. Principal component analysis (PCA) is an important tool for visualizing the structure of high-dimensional data as well as for identifying subpopulations of cells. However, to date it is not clear how to perform a PCA of censored data. We present a probabilistic approach that accounts for the censoring and evaluate it for two typical datasets containing single-cell qPCR data. Results: We use the Gaussian process latent variable model framework to account for censoring by introducing an appropriate noise model and allowing a different kernel for each dimension. We evaluate this new approach for two typical qPCR datasets (of mouse embryonic stem cells and blood stem/progenitor cells, respectively) by performing linear and non-linear probabilistic PCA. Taking the censoring into account results in a 2D representation of the data, which better reflects its known structure: in both datasets, our new approach results in a better separation of known cell types and is able to reveal subpopulations in one dataset that could not be resolved using standard PCA. Availability and implementation: The implementation was based on the existing Gaussian process latent variable model toolbox (https://github.com/SheffieldML/GPmat); extensions for noise models and kernels accounting for censoring are available at http://icb.helmholtz-muenchen.de/censgplvm. Contact: fbuettner.phys@gmail.com Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24618470
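The censoring idea can be illustrated with a Tobit-style likelihood: readings recorded at the detection limit contribute a Gaussian tail probability (we only know the true value lies beyond the limit) rather than a density. This is the same principle the paper builds into the GP-LVM noise model; the per-dimension sketch below is an illustration, not their implementation.

```python
import math

def censored_loglik(y, mu, sigma, limit):
    """Log-likelihood of qPCR readings under a Gaussian with censoring.

    Readings at or beyond `limit` are treated as censored and contribute
    the tail probability P(Y >= limit); observed readings contribute the
    usual Gaussian log-density. Toy sketch of censoring-aware inference.
    """
    norm_cdf = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    ll = 0.0
    for v in y:
        if v >= limit:  # censored: reaction did not amplify past the limit
            tail = max(1.0 - norm_cdf((limit - mu) / sigma), 1e-300)
            ll += math.log(tail)
        else:           # observed: usual Gaussian density
            z = (v - mu) / sigma
            ll += -0.5 * z * z - math.log(sigma * math.sqrt(2.0 * math.pi))
    return ll
```

Maximising such a likelihood over latent coordinates, instead of treating censored values as exact, is what lets the probabilistic PCA separate cell types more cleanly.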
MaxBin 2.0: an automated binning algorithm to recover genomes from multiple metagenomic datasets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Yu-Wei; Simmons, Blake A.; Singer, Steven W.
The recovery of genomes from metagenomic datasets is a critical step to defining the functional roles of the underlying uncultivated populations. We previously developed MaxBin, an automated binning approach for high-throughput recovery of microbial genomes from metagenomes. Here, we present an expanded binning algorithm, MaxBin 2.0, which recovers genomes from co-assembly of a collection of metagenomic datasets. Tests on simulated datasets revealed that MaxBin 2.0 is highly accurate in recovering individual genomes, and the application of MaxBin 2.0 to several metagenomes from environmental samples demonstrated that it could achieve two complementary goals: recovering more bacterial genomes compared to binning a single sample as well as comparing the microbial community composition between different sampling environments. Availability and implementation: MaxBin 2.0 is freely available at http://sourceforge.net/projects/maxbin/ under BSD license. Supplementary information: Supplementary data are available at Bioinformatics online.
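The co-assembly signal MaxBin 2.0 exploits is that contigs from the same genome have correlated coverage across samples. As a cartoon of that idea only, the sketch below assigns contigs to bins by nearest coverage profile; the real algorithm additionally uses tetranucleotide frequencies and an expectation-maximization procedure.

```python
import math

def bin_contigs(coverage, seeds):
    """Assign contigs to genome bins by nearest coverage profile.

    `coverage` maps contig name -> per-sample coverage vector;
    `seeds` maps bin name -> representative coverage vector.
    Nearest-seed Euclidean assignment (illustrative only).
    """
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return {name: min(seeds, key=lambda s: dist(prof, seeds[s]))
            for name, prof in coverage.items()}
```

With more samples in the co-assembly, the coverage vectors become more discriminative, which is why binning multiple datasets recovers more genomes than binning one.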
Memory-efficient RNA energy landscape exploration
Mann, Martin; Kucharík, Marcel; Flamm, Christoph; Wolfinger, Michael T.
2014-01-01
Motivation: Energy landscapes provide a valuable means for studying the folding dynamics of short RNA molecules in detail by modeling all possible structures and their transitions. Higher abstraction levels based on a macro-state decomposition of the landscape enable the study of larger systems; however, they are still restricted by huge memory requirements of exact approaches. Results: We present a highly parallelizable local enumeration scheme that enables the computation of exact macro-state transition models with highly reduced memory requirements. The approach is evaluated on RNA secondary structure landscapes using a gradient basin definition for macro-states. Furthermore, we demonstrate the need for exact transition models by comparing two barrier-based approaches, and perform a detailed investigation of gradient basins in RNA energy landscapes. Availability and implementation: Source code is part of the C++ Energy Landscape Library available at http://www.bioinf.uni-freiburg.de/Software/. Contact: mmann@informatik.uni-freiburg.de Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24833804
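A gradient-basin macro-state assignment can be sketched on a toy discrete landscape: each state is walked downhill by steepest descent until it reaches a local minimum, which labels its basin. This mirrors the gradient basin definition used for RNA secondary-structure landscapes; it is a minimal sketch, not the C++ Energy Landscape Library.

```python
def gradient_basin(energies, neighbors, state):
    """Walk `state` to the local minimum defining its gradient basin.

    `energies` maps state -> energy; `neighbors(s)` yields adjacent states.
    Ties are broken by smallest state index so the basin decomposition is
    deterministic.
    """
    cur = state
    while True:
        lower = [s for s in neighbors(cur) if energies[s] < energies[cur]]
        if not lower:
            return cur  # local minimum = basin representative
        cur = min(lower, key=lambda s: (energies[s], s))
```

Enumerating the neighbors of one basin at a time, rather than holding the full state space, is the kind of local scheme that keeps memory requirements low.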
Twenty-five-year atraumatic restorative treatment (ART) approach: a comprehensive overview.
Frencken, Jo E; Leal, Soraya Coelho; Navarro, Maria Fidela
2012-10-01
The atraumatic restorative treatment (ART) approach was born 25 years ago in Tanzania. It has evolved into an essential caries management concept for improving quality and access to oral care globally. Meta-analyses and systematic reviews have indicated that ART sealants using high-viscosity glass ionomers are as effective as resin fissure sealants in preventing carious lesion development. ART using high-viscosity glass ionomer can safely be used to restore single-surface cavities both in primary and in permanent posterior teeth, but its quality in restoring multiple surfaces in primary posterior teeth cavities needs to be improved. Insufficient information is available regarding the quality of ART restorations in multiple surfaces in permanent anterior and posterior teeth. There appears to be no difference in the survival of single-surface high-viscosity glass-ionomer ART restorations and amalgam restorations. The use of ART results in smaller cavities and in high acceptance of preventive and restorative care by children. Because local anaesthesia is seldom needed and only hand instruments are used, ART is considered to be a promising approach for treating children suffering from early childhood caries. ART has been implemented in the public oral health services of a number of countries, and clearly, proper implementation requires the availability of sufficient stocks of good high-viscosity glass ionomers and sets of ART instruments right from the start. Textbooks including chapters on ART are available, and the concept is being included in graduate courses at dental schools in a number of countries. Recent development and testing of e-learning modules for distance learning has increasingly facilitated the distribution of ART information amongst professionals, thus enabling more people to benefit from ART. However, this development and further research require adequate funding, which is not always easily obtainable.
The next major challenge is the continuation of care to the frail elderly, in which ART may play a part. ART, as part of the Basic Package of Oral Care, is an important cornerstone for the development of global oral health and alleviating inequality in oral care.
Implementation of a high precision multi-measurement time-to-digital convertor on a Kintex-7 FPGA
NASA Astrophysics Data System (ADS)
Kuang, Jie; Wang, Yonggang; Cao, Qiang; Liu, Chong
2018-05-01
Time-to-digital convertors (TDCs) based on field-programmable gate arrays (FPGAs) are becoming increasingly popular. Multi-measurement is an effective method to improve TDC precision beyond the cell delay limitation. However, implementing multi-measurement TDCs on FPGAs manufactured with 28 nm and more advanced processes faces new challenges. Benefiting from the ones-counter encoding scheme, which was developed in our previous work, we implement a ring oscillator multi-measurement TDC on a Xilinx Kintex-7 FPGA. Using two TDC channels to measure time intervals in the range 0-30 ns, the average RMS precision can be improved to 5.76 ps, while the logic resource usage remains the same as for the one-measurement TDC and the dead time is only 22 ns. The investigation demonstrates that multi-measurement methods remain viable on current mainstream FPGAs. Furthermore, the new implementation offers a better trade-off among time precision, resource usage and dead time than previous designs.
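The two ingredients named in the abstract can be sketched in software: a ones counter decodes a delay-line thermometer code simply by counting set bits (which also tolerates "bubbles" in the code), and averaging several independent measurements of the same interval improves precision beyond the single-cell delay. This is a behavioural illustration, not the FPGA logic.

```python
def ones_counter(thermo_bits):
    """Decode a delay-line thermometer code by counting ones.

    Counting ones instead of locating the 1->0 transition makes the
    decoder tolerant of bubbles such as [1, 1, 0, 1, 0].
    """
    return sum(thermo_bits)

def multi_measure(codes):
    """Average several measurements of the same interval.

    Each element of `codes` is one thermometer-code snapshot from the
    ring-oscillator delay line; the mean gives sub-cell resolution
    (illustrative of the multi-measurement idea).
    """
    return sum(ones_counter(c) for c in codes) / len(codes)
```

Averaging N measurements reduces the quantisation-limited RMS error roughly by a factor of sqrt(N), which is the motivation for the multi-measurement architecture.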
School wellness policies and foods and beverages available in schools.
Hood, Nancy E; Colabianchi, Natalie; Terry-McElrath, Yvonne M; O'Malley, Patrick M; Johnston, Lloyd D
2013-08-01
Since 2006-2007, education agencies (e.g., school districts) participating in U.S. federal meal programs have been required to have wellness policies. To date, this is the only federal policy that addresses foods and beverages sold outside of school meals (in competitive venues). To examine the extent to which federally required components of school wellness policies are associated with availability of foods and beverages in competitive venues. Questionnaire data were collected in 2007-2008 through 2010-2011 school years from 892 middle and 1019 high schools in nationally representative samples. School administrators reported the extent to which schools had required wellness policy components (goals, nutrition guidelines, implementation plan/person responsible, stakeholder involvement) and healthier and less-healthy foods and beverages available in competitive venues. Analyses were conducted in 2012. About one third of students (31.8%) were in schools with all four wellness policy components. Predominantly white schools had higher wellness policy scores than other schools. After controlling for school characteristics, higher wellness policy scores were associated with higher availability of low-fat and whole-grain foods and lower availability of regular-fat/sugared foods in middle and high schools. In middle schools, higher scores also were associated with lower availability of 2%/whole milk. High schools with higher scores also had lower sugar-sweetened beverage availability and higher availability of 1%/nonfat milk, fruits/vegetables, and salad bars. Because they are associated with lower availability of less-healthy and higher availability of healthier foods and beverages in competitive venues, federally required components of school wellness policies should be encouraged in all schools. Copyright © 2013 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.
A Hybrid Task Graph Scheduler for High Performance Image Processing Workflows.
Blattner, Timothy; Keyrouz, Walid; Bhattacharyya, Shuvra S; Halem, Milton; Brady, Mary
2017-12-01
Designing applications for scalability is key to improving their performance in hybrid and cluster computing. Scheduling code to utilize parallelism is difficult, particularly when dealing with data dependencies, memory management, data motion, and processor occupancy. The Hybrid Task Graph Scheduler (HTGS) is an abstract execution model, framework, and API that increases programmer productivity when implementing hybrid workflows for multi-core and multi-GPU systems. HTGS manages dependencies between tasks, represents CPU and GPU memories independently, overlaps computations with disk I/O and memory transfers, keeps multiple GPUs occupied, and uses all available compute resources. Through these abstractions, data motion and memory are explicit; this makes data locality decisions more accessible. To demonstrate the HTGS application program interface (API), we present implementations of two example algorithms: (1) a matrix multiplication that shows how easily task graphs can be used; and (2) a hybrid implementation of microscopy image stitching that reduces code size by ≈ 43% compared to a manually coded hybrid workflow implementation and showcases the minimal overhead of task graphs in HTGS. Both of the HTGS-based implementations show good performance. In image stitching the HTGS implementation achieves similar performance to the hybrid workflow implementation. Matrix multiplication with HTGS achieves 1.3× and 1.8× speedup over the multi-threaded OpenBLAS library for 16k × 16k and 32k × 32k size matrices, respectively.
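The core pattern a task-graph scheduler generalises is a pipeline of stages connected by queues, so that stages run concurrently and computation overlaps with I/O and transfers. A minimal threaded sketch of that pattern (not the HTGS API itself, which adds GPU memory management and dependency tracking):

```python
import queue
import threading

def run_pipeline(items, stages):
    """Run `items` through a chain of stage functions, one thread per stage.

    Each stage consumes from its input queue and produces to its output
    queue, so all stages execute concurrently while order is preserved.
    """
    qs = [queue.Queue() for _ in range(len(stages) + 1)]
    done = object()  # sentinel propagated down the pipeline

    def worker(fn, q_in, q_out):
        while True:
            x = q_in.get()
            if x is done:
                q_out.put(done)
                return
            q_out.put(fn(x))

    threads = [threading.Thread(target=worker, args=(fn, qs[i], qs[i + 1]))
               for i, fn in enumerate(stages)]
    for t in threads:
        t.start()
    for item in items:
        qs[0].put(item)
    qs[0].put(done)

    results = []
    while True:
        r = qs[-1].get()
        if r is done:
            break
        results.append(r)
    for t in threads:
        t.join()
    return results
```

In a hybrid workflow, one stage would wrap a GPU kernel launch and another disk I/O, so the queues are what keep the GPUs occupied while data is read and transferred.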
Vending and School Store Snack and Beverage Trends: Minnesota Secondary Schools, 2002–2010
Kubik, Martha Y.; Davey, Cynthia; Nanney, Marilyn S.; MacLehose, Richard F.; Nelson, Toben F.; Coombes, Brandon
2013-01-01
Background The Child Nutrition and WIC Reauthorization Act of 2004 (hereafter called the 2004 Reauthorization Act) was federal legislation that required school districts participating in the federally funded school meal program to develop and implement policies addressing nutrition guidelines for all foods and beverages available on school campuses by the onset of the 2006/2007 school year. Purpose Vending machine and school store (VMSS) availability and low-nutrient, energy-dense snacks and beverages in VMSS were assessed in a statewide sample of Minnesota secondary schools before and after the 2004 Reauthorization Act was implemented in 2006/2007. Methods The CDC School Health Profiles principal survey was collected from a representative sample of middle (n=170) and high (n=392) schools biennially from 2002 to 2010. Trends were estimated using general linear models with a logit link and linear spline modeling. Analyses were conducted in 2012. Results Among high schools, VMSS (p=0.001) and sugar-sweetened beverages (p=0.004), high-fat salty snacks (p=0.001), and candy (p=0.001) in VMSS decreased from 2002 to 2008. In 2008, a change in slope direction from negative to positive occurred for all food practices and an increase in VMSS (p=0.014) and sugar-sweetened beverages (p=0.033) was seen. Among middle schools, VMSS (p=0.027), sugar-sweetened beverages (p=0.001), high-fat salty snacks (p=0.001), and candy (p=0.029) decreased from 2002 to 2010. Conclusions This study supports a link between policy and sustainable decreases in some food practices but not others and a differential effect that favors middle schools over high schools. Policy-setting is a dynamic process requiring ongoing surveillance to identify shifting trends. PMID:23683975
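The linear-spline trend analysis used above (a change in slope at a knot year such as 2008) can be sketched as an ordinary least-squares fit with a hinge basis. This toy version returns the slopes before and after the knot; it is not the authors' mixed-effects, logit-link analysis.

```python
import numpy as np

def spline_trend(years, values, knot):
    """Fit y = b0 + b1*(x - knot) + b2*max(x - knot, 0) by least squares.

    Returns (slope before knot, slope after knot); a sign change between
    the two is the kind of trend reversal reported for 2008.
    """
    x = np.asarray(years, dtype=float)
    y = np.asarray(values, dtype=float)
    X = np.column_stack([np.ones_like(x),
                         x - knot,
                         np.maximum(x - knot, 0.0)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1], beta[1] + beta[2]
```

Fitting such a model to biennial prevalence estimates is one simple way to test whether a policy-era decline later reverses direction.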
Bain, Luchuo Engelbert; Kongnyuy, Eugene Justine
2018-05-24
The abortion law in Cameroon is highly restrictive. The law permits induced abortions only when the woman's life is at risk, to preserve her physical and mental health, and on grounds of rape or incest. Unsafe abortions remain rampant, though prosecutions are rarely reported even when these abortions are proven to have been carried out illegally. Available public health interventions are cheap and feasible (Misoprostol and Manual Vacuum Aspiration in post-abortion care, modern contraception, post-abortion counseling), and must be implemented to reduce unacceptably high maternal mortality rates in the country, which still stand at as high as 596/100,000. Changes in the legal status of abortions might take a long time to come by. Nevertheless, advocacy efforts must be reinforced to render the law more liberal to permit women to seek safe abortion services. The frequency of abortions, generally clandestine, in this restrictive legal atmosphere has adverse economic, health and social justice implications. We argue that a non-optimal or restrictive legal atmosphere is not an acceptable excuse to justify these high maternal deaths resulting from unsafe abortions, especially in Cameroon where unsafe abortions remain rampant. Implementing currently available, cheap and effective evidence-based practice guidelines is possible in the country. Expansion and use of Manual Vacuum Aspiration kits in health care facilities, post-abortion misoprostol and carefully considering the content of post-abortion counseling packages deserve keen attention. More large-scale qualitative and quantitative studies nationwide to identify and act on context-specific barriers to contraception use and abortion-related stigma are urgently needed.
Creating a High-Frequency Electronic Database in the PICU: The Perpetual Patient.
Brossier, David; El Taani, Redha; Sauthier, Michael; Roumeliotis, Nadia; Emeriaud, Guillaume; Jouvet, Philippe
2018-04-01
Our objective was to construct a prospective high-quality and high-frequency database combining patient therapeutics and clinical variables in real time, automatically fed by the information system and network architecture available through fully electronic charting in our PICU. The purpose of this article is to describe the data acquisition process from bedside to the research electronic database. Descriptive report and analysis of a prospective database. A 24-bed PICU, medical ICU, surgical ICU, and cardiac ICU in a tertiary care free-standing maternal child health center in Canada. All patients less than 18 years old were included at admission to the PICU. None. Between May 21, 2015, and December 31, 2016, 1,386 consecutive PICU stays from 1,194 patients were recorded in the database. Data were prospectively collected from admission to discharge, every 5 seconds from monitors and every 30 seconds from mechanical ventilators and infusion pumps. These data were linked to the patient's electronic medical record. The database total volume was 241 GB. The patients' median age was 2.0 years (interquartile range, 0.0-9.0). Data were available for all mechanically ventilated patients (n = 511; recorded duration, 77,678 hr), and respiratory failure was the most frequent reason for admission (n = 360). The complete pharmacologic profile was synched to the database for all PICU stays. Following this implementation, a validation phase is in progress and several research projects are ongoing using this high-fidelity database. Using the existing bedside information system and network architecture of our PICU, we implemented an ongoing high-fidelity prospectively collected electronic database, preventing the continuous loss of scientific information. This offers the opportunity to develop research on, for example, clinical decision support systems and computational models of cardiorespiratory physiology.
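Joining streams sampled at different rates (5-second monitor data versus 30-second ventilator and pump data) requires aligning them on common time bins. A minimal sketch of that alignment step, with made-up data, is shown below; the actual PICU pipeline feeding the Oracle-backed research database is of course far more involved.

```python
def resample(stream, bin_s=30):
    """Aggregate a (t_seconds, value) stream into bin_s-second mean bins.

    E.g. 5 s monitor samples averaged into 30 s bins can then be joined
    row-for-row with 30 s ventilator records (illustrative sketch).
    """
    bins = {}
    for t, v in stream:
        bins.setdefault(int(t // bin_s), []).append(v)
    return {b * bin_s: sum(vs) / len(vs) for b, vs in sorted(bins.items())}
```

Binning by integer division of the timestamp keeps the operation streaming-friendly: each incoming sample touches only its own bin.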
Vending and school store snack and beverage trends: Minnesota secondary schools, 2002-2010.
Kubik, Martha Y; Davey, Cynthia; Nanney, Marilyn S; MacLehose, Richard F; Nelson, Toben F; Coombes, Brandon
2013-06-01
The Child Nutrition and WIC Reauthorization Act of 2004 (hereafter called the 2004 Reauthorization Act) was federal legislation that required school districts participating in the federally funded school meal program to develop and implement policies addressing nutrition guidelines for all foods and beverages available on school campuses by the onset of the 2006/2007 school year. Vending machine and school store (VMSS) availability and low-nutrient, energy-dense snacks and beverages in VMSS were assessed in a statewide sample of Minnesota secondary schools before and after the 2004 Reauthorization Act was implemented in 2006/2007. The CDC School Health Profiles principal survey was collected from a representative sample of middle (n=170) and high (n=392) schools biennially from 2002 to 2010. Trends were estimated using general linear models with a logit link and linear spline modeling. Analyses were conducted in 2012. Among high schools, VMSS (p=0.001) and sugar-sweetened beverages (p=0.004), high-fat salty snacks (p=0.001), and candy (p=0.001) in VMSS decreased from 2002 to 2008. In 2008, a change in slope direction from negative to positive occurred for all food practices and an increase in VMSS (p=0.014) and sugar-sweetened beverages (p=0.033) was seen. Among middle schools, VMSS (p=0.027), sugar-sweetened beverages (p=0.001), high-fat salty snacks (p=0.001), and candy (p=0.029) decreased from 2002 to 2010. This study supports a link between policy and sustainable decreases in some food practices but not others and a differential effect that favors middle schools over high schools. Policy-setting is a dynamic process requiring ongoing surveillance to identify shifting trends. Copyright © 2013 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-13
... Promulgation of Air Quality Implementation Plans; Maryland; Reasonably Available Control Technology for the... control technology (RACT) for oxides of nitrogen (NO X ) and volatile organic compounds (VOCs) for the... business information (CBI) or other information whose disclosure is restricted by statute. Certain other...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-26
... Promulgation of Air Quality Implementation Plans; Missouri; Reasonably Available Control Technology (RACT) for the 8-Hour Ozone National Ambient Air Quality Standard (NAAQS) AGENCY: Environmental Protection Agency... revision is to amend Missouri's regulation for the Control of Volatile Organic Compounds (VOC) and meet the...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-23
... Promulgation of Air Quality Implementation Plans; Missouri; Reasonably Available Control Technology (RACT) for the 8-Hour Ozone National Ambient Air Quality Standard (NAAQS) AGENCY: Environmental Protection Agency... these revisions because they enhance the Missouri SIP by improving VOC emission controls in Missouri...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-06
... evaluating control strategies. This better understanding allows for more strategic approaches in which public..., we lay out our proposed approaches, but here are a few examples: 1. Federal control measures: States...), reasonably available control technology (RACT), reasonably available control measures (RACM), new source...
Project #OA&E-FY18-0190, May 2, 2018. The OIG plans to begin preliminary research to evaluate the impact of EPA’s lack of notice of availability of required training materials on Agricultural Worker Protection Standard implementation.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-14
... Commercial Availability Provision of the Dominican Republic-Central America-United States Free Trade...-Central America-United States Free Trade Agreement Implementation Act (``CAFTA-DR Implementation Act... FURTHER INFORMATION CONTACT: Maria Dybczak, Office of Textiles and Apparel, U.S. Department of Commerce...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-25
... Commercial Availability Provision of the Dominican Republic-Central America-United States Free Trade... FURTHER INFORMATION CONTACT: Maria Dybczak, Office of Textiles and Apparel, U.S. Department of Commerce...-United States Free Trade Agreement Implementation Act (``CAFTA-DR Implementation Act''), Public Law 109...
Efficient data management in a large-scale epidemiology research project.
Meyer, Jens; Ostrzinski, Stefan; Fredrich, Daniel; Havemann, Christoph; Krafczyk, Janina; Hoffmann, Wolfgang
2012-09-01
This article describes the concept of a "Central Data Management" (CDM) and its implementation within the large-scale population-based medical research project "Personalized Medicine". The CDM can be summarized as a conjunction of data capturing, data integration, data storage, data refinement, and data transfer. A wide spectrum of reliable "Extract Transform Load" (ETL) software for automatic integration of data as well as "electronic Case Report Forms" (eCRFs) was developed, in order to integrate decentralized and heterogeneously captured data. Due to the high sensitivity of the captured data, high system resource availability, data privacy, data security and quality assurance are of utmost importance. A complex data model was developed and implemented using an Oracle database in high availability cluster mode in order to integrate different types of participant-related data. Intelligent data capturing and storage mechanisms improve the quality of data. Data privacy is ensured by a multi-layered role/right system for access control and de-identification of identifying data. A well-defined backup process prevents data loss. Over a period of one and a half years, the CDM has captured a wide variety of data amounting to approximately 5 terabytes without experiencing any critical incidents of system breakdown or loss of data. The aim of this article is to demonstrate one possible way of establishing a Central Data Management in large-scale medical and epidemiological studies. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
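One standard building block of the de-identification step mentioned above is replacing identifying fields with stable, non-reversible pseudonyms, e.g. via a keyed hash: the same person always maps to the same token, but the token cannot be inverted without the key. The field names below are invented for the example; the CDM's multi-layered role/right system is considerably more elaborate.

```python
import hashlib
import hmac

def pseudonymise(record, secret, id_fields=("name", "insurance_id")):
    """Replace identifying fields with HMAC-SHA256 pseudonyms.

    `secret` is a key held only by the trusted party; with the same key,
    the same value always yields the same 16-hex-digit token, so records
    remain linkable without exposing identities (illustrative sketch).
    """
    out = dict(record)
    for field in id_fields:
        if field in out:
            digest = hmac.new(secret, str(out[field]).encode(),
                              hashlib.sha256).hexdigest()
            out[field] = digest[:16]
    return out
```

Using HMAC rather than a plain hash prevents dictionary attacks on fields with small value spaces, which is why keyed pseudonymisation is preferred in practice.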
Yamaki, Kiyoshi; Lowry, Brienne Davis; Buscaj, Emilie; Zisko, Leigh; Rimmer, James H
2015-05-01
The aim of this study was to assess the availability of public health surveillance data on obesity among American children with disabilities in state-based surveillance programs. We reviewed annual cross-sectional datasets in state-level surveillance programs for high school students, implemented 2001-2011, for the inclusion of weight and height and disability screening questions. When datasets included a disability screen, its content and consistency of use across years were examined. We identified 54 surveillance programs with 261 annual datasets containing obesity data. Twelve surveillance programs in 11 states included a disability screening question that could be used to extract obesity data for high school students with disabilities, leaving the other 39 states with no state-level obesity data for students with disabilities. A total of 43 annual datasets, 16.5 % of the available datasets, could be used to estimate the obesity status of students with disabilities. The frequency of use of disability questions varied across states, and the content of the questions often changed across years and within a state. We concluded that state surveillance programs rarely contained questions that could be used to identify high school students with disabilities. This limits the availability of data that can be used to monitor obesity and related health statuses among this population in the majority of states.
The Revolving Fund Pharmacy Model: backing up the Ministry of Health supply chain in western Kenya.
Manji, Imran; Manyara, Simon M; Jakait, Beatrice; Ogallo, William; Hagedorn, Isabel C; Lukas, Stephanie; Kosgei, Eunice J; Pastakia, Sonak D
2016-10-01
A pressing challenge in low and middle-income countries (LMIC) is inadequate access to essential medicines, especially for chronic diseases. The Revolving Fund Pharmacy (RFP) model is an initiative to provide high-quality medications consistently to patients, using revenues generated from the sale of medications to sustainably resupply medications. This article describes the utilization of RFPs developed by the Academic Model Providing Access to Healthcare (AMPATH) with the aim of stimulating the implementation of similar models elsewhere to ensure sustainable access to quality and affordable medications in similar LMIC settings. The service evaluation of three pilot RFPs started between April 2011 and January 2012 in select government facilities is described. The evaluation assessed cross-sectional availability of essential medicines before and after implementation of the RFPs, number of patient encounters and the impact of community awareness activities. Availability of essential medicines in the three pilot RFPs increased from 40%, 36% and <10% to 90%, 94% and 91% respectively. After the first year of operation, the pilot RFPs had a total of 33 714 patient encounters. As of February 2014, almost 3 years after starting up the first RFP, the RFPs had a total of 115 991 patient encounters. In the Eldoret RFP, community awareness activities led to a 51% increase in sales. With proper oversight and stakeholder involvement, this model is a potential solution to improve availability of essential medicines in LMICs. These pilots exemplify the feasibility of implementing and scaling up this model in other locations. © 2016 Royal Pharmaceutical Society.
Adaptive Suppression of Noise in Voice Communications
NASA Technical Reports Server (NTRS)
Kozel, David; DeVault, James A.; Birr, Richard B.
2003-01-01
A subsystem for the adaptive suppression of noise in a voice communication system effects a high level of reduction of noise that enters the system through microphones. The subsystem includes a digital signal processor (DSP) plus circuitry that implements voice-recognition and spectral- manipulation techniques. The development of the adaptive noise-suppression subsystem was prompted by the following considerations: During processing of the space shuttle at Kennedy Space Center, voice communications among test team members have been significantly impaired in several instances because some test participants have had to communicate from locations with high ambient noise levels. Ear protection for the personnel involved is commercially available and is used in such situations. However, commercially available noise-canceling microphones do not provide sufficient reduction of noise that enters through microphones and thus becomes transmitted on outbound communication links.
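A classic member of the spectral-manipulation family mentioned above is spectral subtraction: estimate the noise magnitude spectrum (e.g. during silence), subtract it from each frame's magnitude spectrum, keep the phase, and floor the result to limit "musical noise". The sketch below is a textbook illustration, not NASA's DSP implementation.

```python
import numpy as np

def spectral_subtract(frame, noise_mag, floor=0.05):
    """Suppress additive noise in one audio frame by spectral subtraction.

    `noise_mag` is a previously estimated noise magnitude spectrum with
    len(frame)//2 + 1 bins; `floor` caps attenuation to reduce artifacts.
    """
    spec = np.fft.rfft(frame)
    mag = np.abs(spec)
    clean = np.maximum(mag - noise_mag, floor * mag)  # spectral floor
    return np.fft.irfft(clean * np.exp(1j * np.angle(spec)), n=len(frame))
```

In an adaptive system the noise estimate is updated whenever voice activity detection indicates the talker is silent, which is where the voice-recognition component comes in.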
NASA Astrophysics Data System (ADS)
Saerbeck, T.; Klose, F.; Le Brun, A. P.; Füzi, J.; Brule, A.; Nelson, A.; Holt, S. A.; James, M.
2012-08-01
This review presents the implementation and full characterization of the polarization equipment of the time-of-flight neutron reflectometer PLATYPUS at the Australian Nuclear Science and Technology Organisation (ANSTO). The functionality and efficiency of individual components are evaluated and found to maintain a high neutron beam polarization with a maximum of 99.3% through polarizing Fe/Si supermirrors. Neutron spin-flippers with efficiencies of 99.7% give full control over the incident and scattered neutron spin direction over the whole wavelength spectrum available in the instrument. The first scientific experiments illustrate data correction mechanisms for finite polarizations and reveal an extraordinarily high reproducibility for measuring magnetic thin film samples. The setup is now fully commissioned and available for users through the neutron beam proposal system of the Bragg Institute at ANSTO.
45 CFR 162.920 - Availability of implementation specifications.
Code of Federal Regulations, 2010 CFR
2010-10-01
... implementation specifications and the Technical Reports Type 3 described in subparts I through S of this part... implementation specifications, which include the Technical Reports Type 3 described in this section, for... part 51. The implementation specifications and Technical Reports Type 3 described in this section are...
Van Hoecke, Sofie; Steurbaut, Kristof; Taveirne, Kristof; De Turck, Filip; Dhoedt, Bart
2010-01-01
We designed a broker platform for e-homecare services using web service technology. The broker allows efficient data communication and guarantees quality requirements such as security, availability and cost-efficiency by dynamic selection of services, minimizing user interactions and simplifying authentication through a single user sign-on. A prototype was implemented, with several e-homecare services (alarm, telemonitoring, audio diary and video-chat). It was evaluated by patients with diabetes and multiple sclerosis. The patients found that the start-up time and overhead imposed by the platform was satisfactory. Having all e-homecare services integrated into a single application, which required only one login, resulted in a high quality of experience for the patients.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Woohyun; Katipamula, Srinivas; Lutes, Robert G.
This report describes how the intelligent load control (ILC) algorithm can be implemented to achieve peak demand reduction while minimizing impacts on occupant comfort. The algorithm was designed to minimize additional sensor and configuration requirements so as to enable a scalable, cost-effective implementation for both large and small-/medium-sized commercial buildings. The ILC algorithm uses an analytic hierarchy process (AHP) to dynamically prioritize the available curtailable loads based on both quantitative criteria (deviation of zone conditions from set point) and qualitative rules (types of zone). Although the ILC algorithm described in this report was highly tailored to work with rooftop units, it can be generalized for application to other building loads such as variable-air-volume (VAV) boxes and lighting systems.
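The report does not publish its AHP weighting details, so the following is a minimal illustrative sketch of the prioritization step only: pairwise-comparison judgments between curtailable loads are collected into a reciprocal matrix, and the priority weights are taken as its normalized principal eigenvector, computed here by power iteration. The function name `ahp_priorities` and the example matrix are hypothetical.

```python
import numpy as np

def ahp_priorities(pairwise, iters=100):
    """Return AHP priority weights: the normalized principal eigenvector
    of a reciprocal pairwise-comparison matrix (power iteration)."""
    A = np.asarray(pairwise, dtype=float)
    w = np.ones(A.shape[0]) / A.shape[0]
    for _ in range(iters):
        w = A @ w          # one power-iteration step
        w /= w.sum()       # renormalize so the weights sum to 1
    return w

# Hypothetical example: rank three curtailable loads.
# M[i][j] > 1 means load i is preferred for curtailment over load j.
M = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
weights = ahp_priorities(M)
```

Loads would then be curtailed in descending order of weight until the demand target is met.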
Stephens, Susie M; Chen, Jake Y; Davidson, Marcel G; Thomas, Shiby; Trute, Barry M
2005-01-01
As database management systems expand their array of analytical functionality, they become powerful research engines for biomedical data analysis and drug discovery. Databases can hold most of the data types commonly required in life sciences and consequently can be used as flexible platforms for the implementation of knowledgebases. Performing data analysis in the database simplifies data management by minimizing the movement of data from disks to memory, allowing pre-filtering and post-processing of datasets, and enabling data to remain in a secure, highly available environment. This article describes the Oracle Database 10g implementation of BLAST and Regular Expression Searches and provides case studies of their usage in bioinformatics. http://www.oracle.com/technology/software/index.html.
Cosmanescu, Alin; Miller, Benjamin; Magno, Terence; Ahmed, Assad; Kremenic, Ian
2006-01-01
A portable, multi-purpose Bio-instrumentation Amplifier and Data AcQuisition device (BADAQ) capable of measuring and transmitting EMG and EKG signals wirelessly via Bluetooth is designed and implemented. Common topologies for instrumentation amplifiers and filters are used and realized with commercially available, low-voltage, high precision operational amplifiers. An 8-bit PIC microcontroller performs 10-bit analog-to-digital conversion of the amplified and filtered signals and controls a Bluetooth transceiver capable of wirelessly transmitting the data to any Bluetooth enabled device. Electrical isolation between patient/subject, circuitry, and ancillary equipment is achieved by optocoupling components. The design focuses on simplicity, portability, and affordability.
Accelerating epistasis analysis in human genetics with consumer graphics hardware.
Sinnott-Armstrong, Nicholas A; Greene, Casey S; Cancare, Fabio; Moore, Jason H
2009-07-24
Human geneticists are now capable of measuring more than one million DNA sequence variations from across the human genome. The new challenge is to develop computationally feasible methods capable of analyzing these data for associations with common human disease, particularly in the context of epistasis. Epistasis describes the situation where multiple genes interact in a complex non-linear manner to determine an individual's disease risk and is thought to be ubiquitous for common diseases. Multifactor Dimensionality Reduction (MDR) is an algorithm capable of detecting epistasis. An exhaustive analysis with MDR is often computationally expensive, particularly for high order interactions. This challenge has previously been met with parallel computation and expensive hardware. The option we examine here exploits commodity hardware designed for computer graphics. In modern computers Graphics Processing Units (GPUs) have more memory bandwidth and computational capability than Central Processing Units (CPUs) and are well suited to this problem. Advances in the video game industry have led to an economy of scale creating a situation where these powerful components are readily available at very low cost. Here we implement and evaluate the performance of the MDR algorithm on GPUs. Of primary interest are the time required for an epistasis analysis and the price to performance ratio of available solutions. We found that using MDR on GPUs consistently increased performance per machine over both a feature rich Java software package and a C++ cluster implementation. The performance of a GPU workstation running a GPU implementation reduces computation time by a factor of 160 compared to an 8-core workstation running the Java implementation on CPUs. This GPU workstation performs similarly to 150 cores running an optimized C++ implementation on a Beowulf cluster. 
Furthermore, this GPU system provides extremely cost-effective performance while leaving the CPU available for other tasks. The GPU workstation containing three GPUs costs $2000, while obtaining similar performance on a Beowulf cluster requires 150 CPU cores which, including the added infrastructure and support cost of the cluster system, cost approximately $82,500. Graphics-hardware-based computing provides a cost-effective means to perform genetic analysis of epistasis using MDR on large datasets without the infrastructure of a computing cluster.
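As a rough illustration of what an exhaustive two-locus MDR-style scan computes (a simplified CPU sketch, not the authors' GPU implementation): each genotype combination of a SNP pair is labeled high- or low-risk by comparing its case/control ratio to the overall ratio, and the pair whose labeling best classifies the samples wins. `mdr_pair_accuracy` and the synthetic XOR-style data below are illustrative.

```python
import itertools
import numpy as np

def mdr_pair_accuracy(genotypes, status):
    """Simplified two-locus MDR-style exhaustive scan.

    genotypes: (n_samples, n_snps) array with values 0/1/2
    status:    (n_samples,) array, 1 = case, 0 = control
    Returns (best_snp_pair, classification_accuracy).
    """
    n, m = genotypes.shape
    overall_ratio = status.sum() / max(1, n - status.sum())
    best_pair, best_acc = None, 0.0
    for i, j in itertools.combinations(range(m), 2):
        cell = genotypes[:, i] * 3 + genotypes[:, j]  # up to 9 genotype combos
        correct = 0
        for c in range(9):
            mask = cell == c
            cases = status[mask].sum()
            controls = mask.sum() - cases
            # Label the cell high-risk if its case/control ratio
            # exceeds the overall ratio; count correct predictions.
            high_risk = cases > controls * overall_ratio
            correct += cases if high_risk else controls
        acc = correct / n
        if acc > best_acc:
            best_pair, best_acc = (i, j), acc
    return best_pair, best_acc

# Synthetic data: status is the XOR of SNPs 0 and 1 (a purely
# epistatic effect); SNP 2 is uninformative.
X = np.array([[0, 0, 0], [0, 1, 0], [1, 0, 0], [1, 1, 0]] * 5)
y = np.array([0, 1, 1, 0] * 5)
best_pair, best_acc = mdr_pair_accuracy(X, y)
```

The GPU version parallelizes the pair loop across thousands of threads; the per-pair work is identical.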
75 FR 30483 - Atlantic Highly Migratory Species; Atlantic Shark Management Measures; Amendment 3
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-01
...NMFS publishes this final rule implementing the Final Amendment 3 to the Consolidated Atlantic Highly Migratory Species (HMS) Fishery Management Plan (FMP). As it developed Amendment 3, NMFS examined a full range of management alternatives available to rebuild blacknose sharks and end overfishing of blacknose and shortfin mako sharks, consistent with recent stock assessments, the Magnuson-Stevens Fishery Conservation and Management Act (Magnuson-Stevens Act), and other applicable law, and evaluated options for managing smooth dogfish as a highly migratory species under the HMS FMP. This final rule implements the final conservation and management measures in Amendment 3 for blacknose sharks, shortfin mako sharks, and smooth dogfish. In order to reduce confusion with spiny dogfish regulations, this final rule places both smooth dogfish and Florida smoothhound into the ``smoothhound shark complex.'' This final rule also announces the opening date and 2010 annual quotas for small coastal sharks (SCS). These changes could affect all fishermen, commercial and recreational, who fish for sharks in the Atlantic Ocean, the Gulf of Mexico, and the Caribbean Sea.
RS-Forest: A Rapid Density Estimator for Streaming Anomaly Detection
Wu, Ke; Zhang, Kun; Fan, Wei; Edwards, Andrea; Yu, Philip S.
2015-01-01
Anomaly detection in streaming data is of high interest in numerous application domains. In this paper, we propose a novel one-class semi-supervised algorithm to detect anomalies in streaming data. Underlying the algorithm is a fast and accurate density estimator implemented by multiple fully randomized space trees (RS-Trees), named RS-Forest. The piecewise constant density estimate of each RS-tree is defined on the tree node into which an instance falls. Each incoming instance in a data stream is scored by the density estimates averaged over all trees in the forest. Two strategies, statistical attribute range estimation of high probability guarantee and dual node profiles for rapid model update, are seamlessly integrated into RS-Forest to systematically address the ever-evolving nature of data streams. We derive the theoretical upper bound for the proposed algorithm and analyze its asymptotic properties via bias-variance decomposition. Empirical comparisons to the state-of-the-art methods on multiple benchmark datasets demonstrate that the proposed method features high detection rate, fast response, and insensitivity to most of the parameter settings. Algorithm implementations and datasets are available upon request. PMID:25685112
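The core idea, piecewise-constant density estimates from fully randomized space trees averaged over a forest, can be sketched as follows. This is an illustrative simplification, not the authors' implementation: the real RS-Forest adds streaming model updates via dual node profiles and statistical attribute-range estimation, both omitted here.

```python
import random

def build_rs_tree(lo, hi, depth, rng):
    """Recursively split the box [lo, hi] on a random attribute at a
    random cut point down to a fixed depth (a fully randomized tree)."""
    if depth == 0:
        return {"count": 0, "lo": list(lo), "hi": list(hi)}
    d = rng.randrange(len(lo))
    cut = rng.uniform(lo[d], hi[d])
    left_hi, right_lo = list(hi), list(lo)
    left_hi[d], right_lo[d] = cut, cut
    return {"dim": d, "cut": cut,
            "left": build_rs_tree(lo, left_hi, depth - 1, rng),
            "right": build_rs_tree(right_lo, hi, depth - 1, rng)}

def insert(tree, x):
    """Route an instance to its leaf and increment the leaf count."""
    while "dim" in tree:
        tree = tree["left"] if x[tree["dim"]] < tree["cut"] else tree["right"]
    tree["count"] += 1

def leaf_density(tree, x, n):
    """Piecewise-constant estimate: leaf count / (n * leaf volume)."""
    while "dim" in tree:
        tree = tree["left"] if x[tree["dim"]] < tree["cut"] else tree["right"]
    vol = 1.0
    for a, b in zip(tree["lo"], tree["hi"]):
        vol *= (b - a)
    return tree["count"] / (n * vol)

def forest_score(trees, x, n):
    """Average the per-tree density estimates; low scores flag anomalies."""
    return sum(leaf_density(t, x, n) for t in trees) / len(trees)

# Illustrative usage: a tight cluster in the unit square.
rng = random.Random(7)
trees = [build_rs_tree([0.0, 0.0], [1.0, 1.0], 5, rng) for _ in range(25)]
pts = [[0.45 + 0.001 * i, 0.55 - 0.001 * i] for i in range(100)]
for t in trees:
    for p in pts:
        insert(t, p)
score_cluster = forest_score(trees, [0.45, 0.55], len(pts))
score_far = forest_score(trees, [0.02, 0.98], len(pts))
```

Points in dense regions land in leaves with high counts and small volumes, so anomalies are the instances with the lowest averaged scores.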
Implementation of weather stations at Ghanaian high schools
NASA Astrophysics Data System (ADS)
Pieron, M.
2012-04-01
The Trans-African Hydro-Meteorological Observatory (www.tahmo.org) is an initiative that aims to develop a dense weather observation network in Sub-Saharan Africa. The ambition is to have 20,000 low-cost innovative weather stations in place by 2015. More locally collected weather data are required to provide stakeholders who depend on the weather, such as farmers and fishermen, with accurate forecasts. As a first proof of concept, showing that sensors can be built at costs lower than commercially available instruments, a disdrometer was developed. In parallel with the design of the measurement instruments, a high school curriculum covering environmental sciences is being developed. To find out which requirements the TAHMO weather station and accompanying educational materials should meet for optimal use at Junior High Schools, research was done at Ghanaian schools. Useful insights regarding the future African context of the weather station and requirements for an implementation strategy were obtained during workshops with teachers and students, visits to WMO observatories, and case studies regarding use of educational materials. The poster presents the conclusions of this research, which is part of the broader TAHMO framework.
An integrated CMOS high voltage supply for lab-on-a-chip systems.
Behnam, M; Kaigala, G V; Khorasani, M; Marshall, P; Backhouse, C J; Elliott, D G
2008-09-01
Electrophoresis is a mainstay of lab-on-a-chip (LOC) implementations of molecular biology procedures and is the basis of many medical diagnostics. High voltage (HV) power supplies are necessary in electrophoresis instruments and are a significant part of the overall system cost. This cost of instrumentation is a significant impediment to making LOC technologies more widely available. We believe one approach to overcoming this problem is to use microelectronic technology (complementary metal-oxide semiconductor, CMOS) to generate and control the HV. We present a CMOS-based chip (3 mm x 2.9 mm) that generates high voltages (hundreds of volts), switches HV outputs, and is powered by a 5 V input supply (total power of 28 mW) while being controlled using a standard computer serial interface. Microchip electrophoresis with laser induced fluorescence (LIF) detection is implemented using this HV CMOS chip. With the other advancements made in the LOC community (e.g. micro-fluidic and optical devices), these CMOS chips may ultimately enable 'true' LOC solutions where essentially all the microfluidics, photonics and electronics are on a single chip.
Solution processed integrated pixel element for an imaging device
NASA Astrophysics Data System (ADS)
Swathi, K.; Narayan, K. S.
2016-09-01
We demonstrate the implementation of a solid-state circuit/structure comprising a high-performing polymer field effect transistor (PFET), which utilizes an oxide layer in conjunction with a self-assembled monolayer (SAM) as the dielectric, and a bulk-heterostructure-based organic photodiode as a CMOS-like pixel element for an imaging sensor. Practical usage of functional organic photon detectors requires on-chip components for image capture and signal transfer, as in the CMOS/CCD architecture, rather than simple photodiode arrays, in order to increase the speed and sensitivity of the sensor. The availability of high-performing PFETs with low operating voltage and photodiodes with high sensitivity provides the necessary prerequisite to implement a CMOS-type image sensing device structure based on organic electronic devices. Solution processing routes in organic electronics offer relatively facile procedures to integrate these components, combined with unique features of large area, form factor and multiple optical attributes. We utilize the inherent property of a binary mixture in a blend to phase-separate vertically and create a graded junction for effective photocurrent response. The implemented design enables photocharge generation along with on-chip charge-to-voltage conversion with performance parameters comparable to traditional counterparts. Charge integration analysis for the passive pixel element using 2D TCAD simulations is also presented to evaluate the different processes that take place in the monolithic structure.
The Effects of Alarm Display, Processing, and Availability on Crew Performance
2000-11-01
[Excerpt garbled in extraction. Recoverable scenario fragments: instrumentation line leakage; small LOCA; steam generator tube rupture; small feedwater leakage inside containment; cycling of main steam; pressure heater banks cycling between on and off due to primary pressure controller failure.]
STARL -- a Program to Correct CCD Image Defects
NASA Astrophysics Data System (ADS)
Narbutis, D.; Vanagas, R.; Vansevičius, V.
We present a program tool, STARL, designed for automatic detection and correction of various defects in CCD images. It uses a genetic algorithm for deblending and restoring overlapping saturated stars in crowded stellar fields. Using Subaru Telescope Suprime-Cam images, we demonstrate that the program can be implemented in wide-field survey data processing pipelines for the production of high quality color mosaics. The source code and examples are available at the STARL website.
A computational genomics pipeline for prokaryotic sequencing projects
Kislyuk, Andrey O.; Katz, Lee S.; Agrawal, Sonia; Hagen, Matthew S.; Conley, Andrew B.; Jayaraman, Pushkala; Nelakuditi, Viswateja; Humphrey, Jay C.; Sammons, Scott A.; Govil, Dhwani; Mair, Raydel D.; Tatti, Kathleen M.; Tondella, Maria L.; Harcourt, Brian H.; Mayer, Leonard W.; Jordan, I. King
2010-01-01
Motivation: New sequencing technologies have accelerated research on prokaryotic genomes and have made genome sequencing operations outside major genome sequencing centers routine. However, no off-the-shelf solution exists for the combined assembly, gene prediction, genome annotation and data presentation necessary to interpret sequencing data. The resulting requirement to invest significant resources into custom informatics support for genome sequencing projects remains a major impediment to the accessibility of high-throughput sequence data. Results: We present a self-contained, automated high-throughput open source genome sequencing and computational genomics pipeline suitable for prokaryotic sequencing projects. The pipeline has been used at the Georgia Institute of Technology and the Centers for Disease Control and Prevention for the analysis of Neisseria meningitidis and Bordetella bronchiseptica genomes. The pipeline is capable of enhanced or manually assisted reference-based assembly using multiple assemblers and modes; gene predictor combining; and functional annotation of genes and gene products. Because every component of the pipeline is executed on a local machine with no need to access resources over the Internet, the pipeline is suitable for projects of a sensitive nature. Annotation of virulence-related features makes the pipeline particularly useful for projects working with pathogenic prokaryotes. Availability and implementation: The pipeline is licensed under the open-source GNU General Public License and available at the Georgia Tech Neisseria Base (http://nbase.biology.gatech.edu/). The pipeline is implemented with a combination of Perl, Bourne Shell and MySQL and is compatible with Linux and other Unix systems. Contact: king.jordan@biology.gatech.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:20519285
NASA Astrophysics Data System (ADS)
Felfelani, F.; Pokhrel, Y. N.
2017-12-01
In this study, we use in-situ observations and satellite data of soil moisture and groundwater to improve irrigation and groundwater parameterizations in version 4.5 of the Community Land Model (CLM). The irrigation application trigger, which is based on the soil moisture deficit mechanism, is enhanced by integrating soil moisture observations and the data from the Soil Moisture Active Passive (SMAP) mission, which has been available since 2015. Further, we incorporate different irrigation application mechanisms based on schemes used in various other land surface models (LSMs) and carry out a sensitivity analysis using point simulations at two different irrigated sites in Mead, Nebraska, where data from the AmeriFlux observational network are available. We then conduct regional simulations over the entire High Plains region and evaluate model results with the available irrigation water use data at the county scale. Finally, we present results of groundwater simulations by implementing a simple pumping scheme based on our previous studies. Results from the implementation of the current irrigation parameterizations used in various LSMs show relatively large differences in the vertical soil moisture profile (e.g., 0.2 mm3/mm3) at the point scale, which are mostly reduced when averaged over relatively large regions (e.g., 0.04 mm3/mm3 in the High Plains region). It is found that the original irrigation module in CLM 4.5 tends to overestimate the soil moisture content compared to both point observations and SMAP, and the results from the improved scheme linked with the groundwater pumping scheme show better agreement with the observations.
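A soil-moisture-deficit irrigation trigger of the kind the abstract describes can be sketched as follows; the function name, the trigger fraction, and all threshold values are illustrative assumptions, not CLM 4.5 code. Irrigation starts when volumetric soil moisture falls below a trigger level between wilting point and target, and the applied depth refills the root zone to the target.

```python
def irrigation_demand(theta, theta_target, theta_wilt,
                      root_depth_mm, trigger_frac=0.5):
    """Soil-moisture-deficit trigger (illustrative sketch).

    theta         current volumetric soil moisture (m3/m3)
    theta_target  target moisture to refill to (m3/m3)
    theta_wilt    wilting-point moisture (m3/m3)
    root_depth_mm root-zone depth in mm
    Returns the irrigation depth to apply, in mm of water.
    """
    # Trigger level sits trigger_frac of the way from wilting point to target.
    trigger = theta_wilt + trigger_frac * (theta_target - theta_wilt)
    if theta >= trigger:
        return 0.0  # soil is wet enough; no irrigation this step
    # Apply enough water to bring the root zone back to the target.
    return (theta_target - theta) * root_depth_mm

# Illustrative call: dry soil (0.15) against a 0.30 target over a 500 mm root zone.
demand_mm = irrigation_demand(0.15, 0.30, 0.10, 500.0)
```

In a land surface model this check would run each timestep per irrigated column, with `theta` taken from (or nudged toward) SMAP and in-situ observations.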
Towards a geospatial wikipedia
NASA Astrophysics Data System (ADS)
Fritz, S.; McCallum, I.; Schill, C.; Perger, C.; Kraxner, F.; Obersteiner, M.
2009-04-01
Based on the Google Earth (http://earth.google.com) platform we have developed a geospatial Wikipedia (geo-wiki.org). The tool allows everybody in the world to contribute to spatial validation and is made available to the internet community interested in that task. We illustrate how this tool can be used for different applications. In our first application we combine uncertainty hotspot information from three global land cover datasets (GLC, MODIS, GlobCover). With an ever increasing amount of high resolution images available on Google Earth, it is becoming increasingly possible to distinguish land cover features with a high degree of accuracy. We first direct the land cover validation community to certain hotspots of land cover uncertainty and then ask them to fill in a small popup menu on the type of land cover, possibly with a picture at that location from the different cardinal points, as well as the date and what type of validation was chosen (Google Earth imagery/Panoramio, or whether the person has ground truth data). We have implemented the tool via a land cover validation community on Facebook, based on a snowball system that allows the tracking of individuals and the possibility to ignore users who misuse the system. In a second application we illustrate how the tool could possibly be used for mapping malaria occurrence and small water bodies as well as overall malaria risk. For this application we have implemented polygon and attribute functions using Google Maps along with Virtual Earth via OpenLayers. The third application deals with illegal logging and how an alert system for illegal logging detection within a certain land tenure system could be implemented. Here we show how the tool can be used to document illegal logging via a YouTube video.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dennis, R.A.
1997-05-01
The availability of reliable, low-cost electricity is a cornerstone for the United States' ability to compete in the world market. The Department of Energy (DOE) projects the total consumption of electricity in the US to rise from 2.7 trillion kilowatt-hours in 1990 to 3.5 trillion in 2010. Although energy sources are diversifying, fossil fuel still produces 90 percent of the nation's energy. Coal is our most abundant fossil fuel resource and the source of 56 percent of our electricity. It has been the fuel of choice because of its availability and low cost. A new generation of high-efficiency power systems has made it possible to continue the use of coal while still protecting the environment. Such power systems greatly reduce the pollutants associated with coal-fired plants built before the 1970s. To realize this high efficiency and superior environmental performance, advanced coal-based power systems will require gas stream cleanup under high-temperature and high-pressure (HTHP) process conditions. Presented in this paper are the HTHP particulate capture requirements for the Integrated Gasification Combined Cycle (IGCC) and Pressurized Fluidized-Bed Combustion (PFBC) power systems, the HTHP particulate cleanup systems being implemented in the PFBC and IGCC Clean Coal Technology (CCT) Projects, and the currently available particulate capture performance results.
Evolution of the Ultrasonic Inspection of Heavy Rotor Forgings Over the Last Decades
NASA Astrophysics Data System (ADS)
Zimmer, A.; Vrana, J.; Meiser, J.; Maximini, W.; Blaes, N.
2010-02-01
All types of heavy forgings that are used in the energy machine industry, rotor shafts as well as discs, retaining rings or tie bolts, are subject to extensive nondestructive inspections before they are delivered to the customer. Because the parts are available in simple shapes, these forgings are very well suited for full volumetric inspections using ultrasound. In the beginning, these inspections were carried out manually, using straight beam probes and analogue equipment. Higher requirements in reliability, efficiency, safety and power output of the machines have led to higher requirements for the ultrasonic inspection in the form of more scanning directions, higher sensitivity demands and improved documentation means. This, and the increasing use of high alloy materials for ever growing parts, increases the need for more and more sophisticated methods for testing the forgings. Angle scans and sizing technologies like DGS have been implemented, and for more than 15 years now, mechanized and automated inspections have gained importance since they allow better documentation as well as easier evaluation of the recorded data using different views (B-, C- or D-scans), projections or tomography views. The latest major development has been the availability of phased array probes to increase the flexibility of the inspection systems. Many results of the ongoing research in ultrasonics have not been implemented yet. Today's availability of fast computers and large, fast data storage allows saving RF inspection data and applying sophisticated signal processing methods. For example, linear diffraction tomography methods like SAFT offer tools for 3D reconstruction of inspection data, simplifying sizing and locating of defects as well as improving signal-to-noise ratios. While such methods are already applied in medical ultrasonics, they are still to be implemented in the steel industry.
This paper describes the development of the ultrasonic inspection of heavy forgings from the beginning up to today using the example of Saarschmiede GmbH, explains the difficulties in implementing changes, and gives an outlook on the current progression.
A high-speed on-chip pseudo-random binary sequence generator for multi-tone phase calibration
NASA Astrophysics Data System (ADS)
Gommé, Liesbeth; Vandersteen, Gerd; Rolain, Yves
2011-07-01
An on-chip reference generator is conceived by adopting the technique of decimating a pseudo-random binary sequence (PRBS) signal into parallel sequences. This is of great benefit when high-speed generation of PRBS and PRBS-derived signals is the objective. The design is implemented in standard CMOS logic available in commercial libraries, which provides the logic functions for the generator. The design allows the user to select the periodicity of the PRBS and the PRBS-derived signals. The characterization of the on-chip generator benchmarks its performance and reveals promising specifications.
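As an illustration of PRBS generation and the decimation-into-parallel-sequences technique the abstract refers to (a behavioral sketch, not the CMOS design): a Fibonacci linear-feedback shift register produces the serial sequence, and taking every k-th bit at k offsets yields k parallel decimated lanes. Tap positions (7, 6) give a maximal-length PRBS7 sequence of period 127.

```python
def prbs(taps, state, n):
    """Generate n bits of a PRBS from a Fibonacci LFSR.

    taps:  feedback tap positions (1-indexed stage numbers),
           e.g. (7, 6) for a maximal-length 7-stage register
    state: nonzero initial register contents, one 0/1 int per stage
    """
    reg = list(state)
    out = []
    for _ in range(n):
        fb = 0
        for t in taps:
            fb ^= reg[t - 1]     # XOR the tapped stages
        out.append(reg[-1])      # output the last stage
        reg = [fb] + reg[:-1]    # shift, feeding back at the front
    return out

def decimate(bits, k):
    """Split one serial bit stream into k parallel decimated lanes,
    the technique the on-chip generator uses for high-speed output."""
    return [bits[i::k] for i in range(k)]

# Two full periods of PRBS7, then split into four parallel lanes.
bits = prbs((7, 6), [1] * 7, 254)
lanes = decimate(bits, 4)
```

Interleaving the lanes bit by bit reconstructs the serial stream, so each lane can run at a quarter of the serial clock rate.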
Zero Gyro Kalman Filtering in the presence of a Reaction Wheel Failure
NASA Technical Reports Server (NTRS)
Hur-Diaz, Sun; Wirzburger, John; Smith, Dan; Myslinski, Mike
2007-01-01
Typical implementation of Kalman filters for spacecraft attitude estimation involves the use of gyros for three-axis rate measurements. When there are less than three axes of information available, the accuracy of the Kalman filter depends highly on the accuracy of the dynamics model. This is particularly significant during the transient period when a reaction wheel with a high momentum fails, is taken off-line, and spins down. This paper looks at how a reaction wheel failure can affect the zero-gyro Kalman filter performance for the Hubble Space Telescope and what steps are taken to minimize its impact.
Formative research and stakeholder participation in intervention development.
Vastine, Amy; Gittelsohn, Joel; Ethelbah, Becky; Anliker, Jean; Caballero, Benjamin
2005-01-01
To present a model for using formative research and stakeholder participation to develop a community-based dietary intervention targeting American Indians. Formative research included interviews, assessment of food-purchasing frequency and preparation methods, and dietary recalls. Stakeholders contributed to intervention development through formative research, a program planning workshop, group feedback, and implementation training. Foods high in fat and sugar are commonly consumed. Barriers to healthy eating include low availability, perceived high cost, and poor flavor. Stakeholder participation contributed to the development of a culturally appropriate intervention. This approach resulted in project acceptance, stakeholder collaboration, and a culturally appropriate program.
32-channel single photon counting module for ultrasensitive detection of DNA sequences
NASA Astrophysics Data System (ADS)
Gudkov, Georgiy; Dhulla, Vinit; Borodin, Anatoly; Gavrilov, Dmitri; Stepukhovich, Andrey; Tsupryk, Andrey; Gorbovitski, Boris; Gorfinkel, Vera
2006-10-01
We continue our work on the design and implementation of multi-channel single photon detection systems for highly sensitive detection of ultra-weak fluorescence signals in high-performance, multi-lane DNA sequencing instruments. A fiberized, 32-channel single photon detection (SPD) module based on the single photon avalanche diode (SPAD) model C30902S-DTC from Perkin Elmer Optoelectronics (PKI) has been designed and implemented. The unavailability of high-performance, large-area SPAD arrays and our desire to design high-performance photon counting systems drive us to use individual diodes. Slight modifications in our quenching circuit have doubled the linear range of our system from 1 MHz to 2 MHz, which is the upper limit for these devices, and the maximum saturation count rate has increased to 14 MHz. The detector module comprises a single-board computer PC-104 that enables data visualization, recording, processing, and transfer. Very low dark counts (300-1000 counts/s), robust, efficient and simple data collection and processing, ease of connectivity to any other application with similar requirements, and performance comparable to the best commercially available single photon counting module (SPCM from PKI) are some of the features of this system.
Haslinger-Baumann, Elisabeth; Lang, Gert; Müller, Gerhard
2014-01-01
In nursing practice, research results have to undergo a systematic process of transformation. Currently in Austria, there is no empirical data available concerning the actual implementation of research results. An English validated questionnaire was translated into German and tested for validity and reliability. A survey of 178 registered nurses (n = 178) was conducted in a multicenter, quantitative, cross-sectional study in Austria in 2011. Cronbach's alpha values (.82-.92) were calculated for 4 variables ("use," "attitude," "availability," "support") after the reduction of 7 irrelevant items. Exploratory factor analysis was calculated with Kaiser-Meyer-Olkin (KMO) ranging from .78 to .92; the total variance ranged from 46% to 56%. A validated German questionnaire concerning the implementation of research results is now available for the nursing practice.
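The reliability figures above are Cronbach's alpha values. As a minimal illustration of the statistic (a generic sketch using population variances, not the study's actual analysis code):

```python
# Cronbach's alpha: internal-consistency reliability of a k-item scale,
# alpha = k/(k-1) * (1 - sum of item variances / variance of total scores).
def cronbach_alpha(items):
    """items: list of equal-length score lists, one per questionnaire item."""
    k = len(items)
    n = len(items[0])
    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(item) for item in items) / var(totals))

# three perfectly consistent items give alpha = 1
print(cronbach_alpha([[1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4]]))
```

Values in the study's reported range (.82-.92) indicate high, though not perfect, internal consistency.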
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-05
... Pollution Affecting Visibility and Best Available Retrofit Technology Determination AGENCY: Environmental... technology (BART) for NO X for this source. This action is being taken under section 110 and part C of the.... Visibility Protection 3. Best Available Retrofit Technology 4. The Western Regional Air Partnership and...
Gas Turbine Engine Production Implementation Study : Volume 1. Executive Summary.
DOT National Transportation Integrated Search
1973-07-01
The report presents an assessment of available information pertaining to implementing mass production of gas turbine powered automobiles. The status of the technology and implementation schedule visibility reported herein is that existing at the time...
Gas Turbine Engine Production Implementation Study : Volume 2. Technical Discussion.
DOT National Transportation Integrated Search
1973-07-01
This report presents a summarization and assessment of available information pertaining to the potential for implementing mass production of gas turbine engine-powered automobiles. The main topic covered is the schedule requirement for that implement...
El-Mallakh, Peggy; Howard, Patricia B; Rayens, Mary Kay; Roque, Autumn P; Adkins, Sarah
2013-11-01
Organizational support is essential for successful implementation of evidence-based practice (EBP) in clinical settings. This 3-year study used a mixed qualitative and quantitative design to implement a medication management EBP in the treatment of schizophrenia in six community mental health clinics in a south-central state of the United States. Findings from organizational fidelity assessments indicate that support for EBP implementation was moderate. Organizational support was highest for prescriber access to relevant patient information at each medication visit, scheduling flexibility for patients' urgent problems, and availability of medication guidelines. Organizational support was lowest for medication availability and identification of treatment refractory patients. Findings suggest that leadership is essential to support successful implementation. Nurse educators can incorporate implementation research and leadership training into graduate nursing programs to facilitate successful EBP implementation in practice settings. Copyright 2013, SLACK Incorporated.
HitWalker2: visual analytics for precision medicine and beyond.
Bottomly, Daniel; McWeeney, Shannon K; Wilmot, Beth
2016-04-15
The lack of visualization frameworks to guide interpretation and facilitate discovery is a potential bottleneck for precision medicine, systems genetics and other studies. To address this we have developed an interactive, reproducible, web-based prioritization approach that builds on our earlier work. HitWalker2 is highly flexible and can utilize many data types and prioritization methods based upon available data and desired questions, allowing it to be utilized in a diverse range of studies such as cancer, infectious disease and psychiatric disorders. Source code is freely available at https://github.com/biodev/HitWalker2 and implemented using Python/Django, Neo4j and Javascript (D3.js and jQuery). We support major open source browsers (e.g. Firefox and Chromium/Chrome). wilmotb@ohsu.edu Supplementary data are available at Bioinformatics online. Additional information/instructions are available at https://github.com/biodev/HitWalker2/wiki. © The Author 2015. Published by Oxford University Press.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stephen, Jamie; Sokhansanj, Shahabaddine; Bi, X.T.
2009-11-01
Biorefineries or other biomass-dependent facilities require a predictable, dependable feedstock supplied over many years to justify capital investments. Determining inter-year variability in biomass availability is essential to quantifying the feedstock supply risk. Using a geographic information system (GIS) and historic crop yield data, average production was estimated for 10 sites in the Peace River region of Alberta, Canada. Four potential high-yielding sites were investigated for variability over a 20-year time frame (1980-2000). The range of availability was large, from double the average in maximum years to nothing in minimum years. Biomass availability is a function of grain yield, the biomass-to-grain ratio, the cropping frequency, and the residue retention rate needed to ensure future crop productivity. Storage strategies must be implemented and alternate feedstock sources identified to supply biomass processing facilities in low-yield years.
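The availability relation named in the abstract can be sketched as a product of its stated factors. One plausible reading, with invented numbers (the study's actual model and coefficients are not given here):

```python
# Illustrative only: every number below is hypothetical, not from the study.
# Available residue = grain yield x biomass-to-grain ratio x cropping frequency
#                     x the fraction of residue NOT retained for soil productivity.
def available_residue(grain_yield_t_ha, biomass_to_grain, cropping_frequency,
                      retention_rate):
    return (grain_yield_t_ha * biomass_to_grain * cropping_frequency
            * (1.0 - retention_rate))

# e.g. 2.5 t/ha grain, 1.1 straw:grain, cropped 2 years in 3, 30% residue retained
print(available_residue(2.5, 1.1, 2 / 3, 0.30))  # about 1.28 t/ha/yr
```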
Samant, Sanjiv S; Xia, Junyi; Muyan-Ozcelik, Pinar; Owens, John D
2008-08-01
Readily available temporal, or time-series volumetric (4D), imaging has become an indispensable component of treatment planning and adaptive radiotherapy (ART) at many radiotherapy centers. Deformable image registration (DIR) is also used in other areas of medical imaging, including motion-corrected image reconstruction. Due to long computation times, clinical applications of DIR in radiation therapy and elsewhere have been limited and consequently relegated to offline analysis. With recent advances in hardware and software, graphics processing unit (GPU) based computing is an emerging technology for general-purpose computation, including DIR, and is suitable for highly parallelized computing. However, traditional general-purpose computation on the GPU is limited by the constraints of the available programming platforms. In addition, compared to the CPU, the GPU currently has reduced dedicated processor memory, which can limit the useful working data set for parallelized processing. We present an implementation of the demons algorithm using the NVIDIA 8800 GTX GPU and the CUDA programming language. The GPU performance is compared with single-threaded and multithreaded CPU implementations on an Intel dual-core 2.4 GHz CPU using the C programming language. CUDA provides a C-like programming interface and allows direct access to the highly parallel compute units in the GPU. Comparisons were carried out for volumetric clinical lung images acquired using 4DCT. Computation times for 100 iterations in the range of 1.8-13.5 s were observed for the GPU, with image sizes ranging from 2.0 x 10(6) to 14.2 x 10(6) pixels. The GPU registration was 55-61 times faster than the CPU for the single-threaded implementation, and 34-39 times faster for the multithreaded implementation. For CPU-based computing, the computational time generally has a linear dependence on image size for medical imaging data.
Computational efficiency is characterized in terms of time per megapixel per iteration (TPMI), with units of seconds per megapixel per iteration (spmi). For the demons algorithm, our CPU implementation yielded largely invariant values of TPMI. The mean TPMIs were 0.527 spmi and 0.335 spmi for the single-threaded and multithreaded cases, respectively, with <2% variation over the considered image data range. For GPU computing, we achieved TPMI = 0.00916 spmi with 3.7% variation, indicating optimized memory handling under CUDA. The paradigm of GPU-based real-time DIR opens up a host of clinical applications for medical imaging.
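The TPMI metric is simple arithmetic, which makes the reported figures easy to check against the raw timings:

```python
# Time per megapixel per iteration (spmi): run time normalized by problem size,
# so registration implementations can be compared across image sizes.
def tpmi(seconds, pixels, iterations):
    return seconds / (pixels / 1e6) / iterations

# Figures from the abstract: 100 iterations, 1.8 s for a 2.0-megapixel volume,
# consistent with the reported GPU mean of ~0.00916 spmi.
print(tpmi(1.8, 2.0e6, 100))
```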
Scollo, Michelle; Bayly, Megan; Wakefield, Melanie
2015-03-01
We aimed to assess change in the availability of illicit tobacco in small mixed business retail outlets following the December 2012 introduction of plain packaging in Australia. 303 small retail outlets were visited in June and September 2012 (baseline months), and in December 2012 and February, April and July 2013. Fieldworkers requested a particular low-cost brand of cigarettes and then pressed the retailer for an 'even cheaper' brand. The cheapest pack of cigarettes offered was purchased and later examined to assess any divergence from prescribed Australian packaging regulations. The price paid was compared with tax liability and recommended retail price for the particular brand and pack size. In a sub-set of 179 stores, fieldworkers then asked the retailer about availability of unbranded (chop-chop) tobacco. Thirteen (2.2%) of 598 packs purchased pre-plain packaging were either non-compliant with Australian health warnings and/or suspiciously priced. Four packs (1.3%) of 297 met either or both criteria in the December implementation month, and five (0.6%) of 878 did so in the three collection months following implementation. Chop-chop was offered upon enquiry on 0.6% (n=2) of 338 occasions prior to implementation, 0.6% (n=1) of 170 occasions in the December 2012 implementation month, and 0.6% (n=3) of 514 occasions post-implementation. The likelihood of a 'positive' response (either an offer to sell or information about where unbranded tobacco may be purchased) did not differ across pre-implementation, during-implementation and post-implementation waves. Overall, packs judged likely to be illicit were sold in response to requests for cheapest available packs on fewer than one percent of occasions. Offers to sell unbranded tobacco were rare. No change in availability of illicit tobacco was observed following implementation of plain packaging. Published by the BMJ Publishing Group Limited. 
For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
SNPversity: a web-based tool for visualizing diversity
Schott, David A; Vinnakota, Abhinav G; Portwood, John L; Andorf, Carson M
2018-01-01
Abstract Many stand-alone desktop software suites exist to visualize single nucleotide polymorphism (SNP) diversity, but web-based software that can be easily implemented and used for biological databases is absent. SNPversity was created to answer this need by building an open-source visualization tool that can be implemented on a Unix-like machine and served through a web browser, accessible worldwide. SNPversity consists of an HDF5 database back-end for SNPs, a data exchange layer powered by TASSEL libraries that represents data in JSON format, and an interface layer using PHP to visualize SNP information. SNPversity displays data in real time through a web browser in grids that are color-coded according to a given SNP’s allelic status and mutational state. SNPversity is currently available at MaizeGDB, the maize community’s database, and will soon be available at GrainGenes, the clade-oriented database for Triticeae and Avena species, including wheat, barley, rye, and oat. The code and documentation are available on GitHub and are free to the public. We expect that the tool will be highly useful for other biological databases with a similar need to display SNP diversity through their web interfaces. Database URL: https://www.maizegdb.org/snpversity PMID:29688387
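The three-layer design (HDF5 storage, JSON exchange, PHP display) can be illustrated at the exchange layer. A hypothetical sketch; the field names and color-coding rules below are invented, not SNPversity's actual schema:

```python
import json

# Hypothetical sketch of a JSON exchange layer like the one described above:
# each cell of the display grid carries the genotype call plus a status the
# front end can color-code. Field names are invented, not SNPversity's schema.
def snp_grid(reference_alleles, calls_by_line):
    grid = []
    for line, calls in calls_by_line.items():
        cells = []
        for ref, call in zip(reference_alleles, calls):
            if call == "N":
                status = "missing"
            elif call == ref:
                status = "reference"
            else:
                status = "variant"
            cells.append({"call": call, "status": status})
        grid.append({"line": line, "cells": cells})
    return json.dumps(grid)

print(snp_grid(["A", "G"], {"B73": ["A", "T"]}))
```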
Identification of hierarchical chromatin domains
Weinreb, Caleb; Raphael, Benjamin J.
2016-01-01
Motivation: The three-dimensional structure of the genome is an important regulator of many cellular processes including differentiation and gene regulation. Recently, technologies such as Hi-C that combine proximity ligation with high-throughput sequencing have revealed domains of self-interacting chromatin, called topologically associating domains (TADs), in many organisms. Current methods for identifying TADs using Hi-C data assume that TADs are non-overlapping, despite evidence for a nested structure in which TADs and sub-TADs form a complex hierarchy. Results: We introduce a model for decomposition of contact frequencies into a hierarchy of nested TADs. This model is based on empirical distributions of contact frequencies within TADs, where positions that are far apart have a greater enrichment of contacts than positions that are close together. We find that the increase in contact enrichment with distance is stronger for the inner TAD than for the outer TAD in a TAD/sub-TAD pair. Using this model, we develop the TADtree algorithm for detecting hierarchies of nested TADs. TADtree compares favorably with previous methods, finding TADs with a greater enrichment of chromatin marks such as CTCF at their boundaries. Availability and implementation: A python implementation of TADtree is available at http://compbio.cs.brown.edu/software/ Contact: braphael@cs.brown.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26315910
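A basic ingredient of contact-frequency models like the one above is the expected contact frequency at each genomic distance, against which within-TAD enrichment is measured. A generic sketch of that distance-decay baseline (not the TADtree algorithm itself):

```python
# Generic sketch of the distance-decay baseline used by TAD callers: the mean
# contact frequency at each genomic distance d, the "expected" value against
# which within-TAD enrichment is measured. Not the TADtree algorithm itself.
def mean_contact_by_distance(matrix):
    n = len(matrix)
    means = []
    for d in range(1, n):
        diagonal = [matrix[i][i + d] for i in range(n - d)]
        means.append(sum(diagonal) / len(diagonal))
    return means

print(mean_contact_by_distance([[0, 2, 1],
                                [2, 0, 4],
                                [1, 4, 0]]))  # [3.0, 1.0]
```

TADtree builds on this kind of baseline by additionally modeling how the contact enrichment grows with distance inside nested TAD/sub-TAD pairs.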
Iwelunmor, Juliet; Blackstone, Sarah; Veira, Dorice; Nwaozuru, Ucheoma; Airhihenbuwa, Collins; Munodawafa, Davison; Kalipeni, Ezekiel; Jutal, Antar; Shelley, Donna; Ogedegebe, Gbenga
2016-03-23
Sub-Saharan Africa (SSA) is facing a double burden of disease, with a rising prevalence of non-communicable diseases (NCDs) while the burden of communicable diseases (CDs) remains high. Despite these challenges, there remains a significant need to understand how, or under what conditions, health interventions implemented in sub-Saharan Africa are sustained. The purpose of this study was to conduct a systematic review of the empirical literature to explore how health interventions implemented in SSA are sustained. We searched MEDLINE, Biological Abstracts, CINAHL, Embase, PsycInfo, SCIELO, Web of Science, and Google Scholar for available research investigating the sustainability of health interventions implemented in sub-Saharan Africa. We also used narrative synthesis to examine factors, whether positive or negative, that may influence the sustainability of health interventions in the region. The search identified 1819 citations; following removal of duplicates and application of our inclusion/exclusion criteria, 41 papers were eligible for inclusion in the review. Twenty-six countries were represented in this review, with Kenya and Nigeria having the most representation among available studies examining sustainability. Study dates ranged from 1996 to 2015. Notably, the largest share of these studies (30%) were published in 2014. The most common framework utilized was the sustainability framework, which was discussed in four of the studies. Nineteen of the 41 studies (46%) reported sustainability outcomes focused on communicable diseases, with HIV and AIDS represented in the majority of these, followed by malaria. Only 21 of the 41 studies had clear definitions of sustainability. 
Community ownership and mobilization were recognized by many of the reviewed studies as crucial facilitators of intervention sustainability, both early on and after intervention implementation, while social and ecological conditions as well as societal upheavals were barriers that influenced the sustainment of interventions in sub-Saharan Africa. Sustaining health interventions implemented in sub-Saharan Africa is imperative given the double burden of diseases, health care worker shortages, weak health systems, and limited resources. We propose a conceptual framework that draws attention to sustainability as a core component of the overall life cycle of interventions implemented in the region.
NASA Technical Reports Server (NTRS)
Ikpe, Stanley A.; Lauenstein, Jean-Marie; Carr, Gregory A.; Hunter, Don; Ludwig, Lawrence L.; Wood, William; Del Castillo, Linda Y.; Fitzpatrick, Fred; Chen, Yuan
2016-01-01
Silicon-carbide device technology has generated much interest in recent years. With superior thermal performance, power ratings, and potential switching frequencies over its silicon counterpart, silicon carbide offers a greater possibility for high-power switching applications in extreme environments. In particular, the maturing process technology of silicon-carbide metal-oxide-semiconductor field-effect transistors (MOSFETs) has produced a plethora of commercially available, power-dense, low on-state resistance devices capable of switching at high frequencies. A novel hard-switched power processing unit (PPU) is implemented utilizing silicon-carbide power devices. Accelerated life data are captured and assessed in conjunction with a damage accumulation model of gate oxide and drain-source junction lifetime to evaluate potential system performance in high-temperature environments.
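The damage-accumulation idea referenced above can be sketched with a Miner's-rule-style model, in which the fractional lifetime consumed at each stress condition is summed; the lifetime figures below are invented for illustration and are not from the study.

```python
# Miner's-rule-style damage accumulation: fractional damage accrued at each
# stress condition adds up, and failure is predicted when the sum reaches 1.
# All numbers are hypothetical, for illustration only.
def accumulated_damage(intervals):
    """intervals: list of (hours_spent, rated_lifetime_hours_at_that_condition)."""
    return sum(t / life for t, life in intervals)

# e.g. 100 h at a condition rated for 1000 h, then 50 h at one rated for 200 h
print(accumulated_damage([(100, 1000), (50, 200)]))  # 0.35 -> 35% of life consumed
```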
Single chemical entity legal highs: assessing the risk for long term harm.
McNabb, Carolyn B; Russell, Bruce R; Caprioli, Daniele; Nutt, David J; Gibbons, Simon; Dalley, Jeffrey W
2012-12-01
A recent and dramatic increase in the emergence of novel psychoactive substances ('legal highs') has left many governments unable to provide a timely response to an increasing number of potentially harmful drugs now available to the public. In response to this rapid increase in lawful drug use, the UK government intends to implement temporary class drug orders, whereby substances with a potential for misuse and harm can be regulated for a 12 month period. During this period an investigation of the potential for harms induced by these drugs will take place. However, the short time-frame in which information must be gathered, and the paucity of data available on novel psychoactive substances, means that robust pharmacological and toxicological analyses may be replaced by extrapolating data from illegal drugs with similar chemical structures. This review explores the potential pharmacology and toxicology of past and present 'legal highs' and discusses the risks of failing to carry out in-depth scientific research on individual substances.
GAC: Gene Associations with Clinical, a web based application.
Zhang, Xinyan; Rupji, Manali; Kowalski, Jeanne
2017-01-01
We present GAC, a Shiny R-based tool for interactive visualization of clinical associations based on high-dimensional data. The tool provides a web-based suite to perform supervised principal component analysis (SuperPC), an approach that uses high-dimensional data, such as gene expression, combined with clinical data to infer clinical associations. We extended the approach to address binary outcomes, in addition to continuous and time-to-event data, in our package, thereby increasing the use and flexibility of SuperPC. Additionally, the tool provides an interactive visualization for summarizing results based on a forest plot for both binary and time-to-event data. In summary, the GAC suite of tools provides a one-stop shop for conducting statistical analyses to identify and visualize the association between a clinical outcome of interest and high-dimensional data types, such as genomic data. Our GAC package has been implemented in R and is available via http://shinygispa.winship.emory.edu/GAC/. The developmental repository is available at https://github.com/manalirupji/GAC.
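The SuperPC idea (Bair and Tibshirani) that GAC wraps can be sketched in two steps: screen features by univariate association with the outcome, then summarize the survivors with their first principal component. A pure-Python illustration for a continuous outcome; in the real method the threshold is chosen by cross-validation:

```python
# Pure-Python sketch of the supervised principal components idea behind SuperPC
# (Bair and Tibshirani): keep features whose absolute correlation with the
# outcome clears a threshold, then score samples by the first principal
# component of the retained features. Illustrative, not GAC's actual code.
def corr(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den if den else 0.0

def super_pc_score(X, y, threshold=0.5, iters=200):
    features = [col for col in zip(*X) if abs(corr(col, y)) >= threshold]
    centered = [[v - sum(col) / len(col) for v in col] for col in features]
    k, n = len(centered), len(y)
    # leading principal direction via power iteration on C C^T,
    # where the rows of C are the centered, retained features
    w = [1.0] * k
    for _ in range(iters):
        proj = [sum(centered[j][i] * w[j] for j in range(k)) for i in range(n)]
        w = [sum(centered[j][i] * proj[i] for i in range(n)) for j in range(k)]
        norm = sum(v * v for v in w) ** 0.5
        w = [v / norm for v in w]
    # per-sample score: projection of each sample onto the leading direction
    return [sum(centered[j][i] * w[j] for j in range(k)) for i in range(n)]
```

With two informative features tracking the outcome and one uncorrelated noise feature, the noise feature is screened out and the score reduces to a scaled, centered copy of the informative signal.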
Suicide burden and prevention in Nepal: The need for a national strategy.
Marahatta, Kedar; Samuel, Reuben; Sharma, Pawan; Dixit, Lonim; Shrestha, Bhola Ram
2017-04-01
Suicide is a major cause of deaths worldwide and is a key public health concern in Nepal. Although routine national data are not collected in Nepal, the available evidence suggests that suicide rates are relatively high, notably for women. In addition, civil conflict and the 2015 earthquake have had significant contributory effects. A range of factors both facilitate suicide attempts and hinder those affected from seeking help, such as the ready availability of toxic pesticides and the widespread, although erroneous, belief that suicide is illegal. Various interventions have been undertaken at different levels in prevention and rehabilitation but a specific long-term national strategy for suicide prevention is lacking. Hence, to address this significant public health problem, a multisectoral platform of stakeholders needs to be established under government leadership, to design and implement innovative and country-contextualized policies and programmes. A bottom-up approach, with active and participatory community engagement from the start of the policy- and strategy-formulation stage, through to the design and implementation of interventions, could potentially build grass-roots public ownership, reduce stigma and ensure a scaleable and sustainable response.
NASA Astrophysics Data System (ADS)
D'Alessandro, Valerio; Binci, Lorenzo; Montelpare, Sergio; Ricci, Renato
2018-01-01
Open-source CFD codes provide suitable environments for implementing and testing low-dissipative algorithms typically used to simulate turbulence. In this research work we developed CFD solvers for incompressible flows based on high-order explicit and diagonally implicit Runge-Kutta (RK) schemes for time integration. In particular, an iterated PISO-like procedure based on Rhie-Chow correction was used to handle pressure-velocity coupling within each implicit RK stage. For the explicit approach, a projected scheme was used to avoid the "checker-board" effect. The above-mentioned approaches were also extended to flow problems involving heat transfer. It is worth noting that the numerical technology available in the OpenFOAM library was used for space discretization. In this work, we additionally explore the reliability and effectiveness of the proposed implementations by computing several unsteady flow benchmarks; we also show that the numerical diffusion due to the time integration approach is completely canceled using the solution techniques proposed here.
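Of the time integrators mentioned, the explicit Runge-Kutta family is the easiest to illustrate in isolation. The classical fourth-order scheme on a scalar ODE (the solvers above couple such stages to a PISO-like pressure correction, which this sketch omits):

```python
import math

# Classical fourth-order explicit Runge-Kutta step: one member of the family of
# high-order explicit schemes mentioned above, shown here on a scalar ODE.
def rk4_step(f, t, y, h):
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

# dy/dt = -y with y(0) = 1 has the exact solution y(1) = exp(-1)
y, t, h = 1.0, 0.0, 0.01
for _ in range(100):
    y = rk4_step(lambda t, y: -y, t, y, h)
    t += h
print(abs(y - math.exp(-1.0)))  # global error around 1e-10 at this step size
```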
Kim, Jeongnim; Baczewski, Andrew T.; Beaudet, Todd D.; ...
2018-04-19
QMCPACK is an open source quantum Monte Carlo package for ab-initio electronic structure calculations. It supports calculations of metallic and insulating solids, molecules, atoms, and some model Hamiltonians. Implemented real space quantum Monte Carlo algorithms include variational, diffusion, and reptation Monte Carlo. QMCPACK uses Slater-Jastrow type trial wave functions in conjunction with a sophisticated optimizer capable of optimizing tens of thousands of parameters. The orbital space auxiliary field quantum Monte Carlo method is also implemented, enabling cross validation between different highly accurate methods. The code is specifically optimized for calculations with large numbers of electrons on the latest high performance computing architectures, including multicore central processing unit (CPU) and graphical processing unit (GPU) systems. We detail the program’s capabilities, outline its structure, and give examples of its use in current research calculations. The package is available at http://www.qmcpack.org.
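Among the algorithms listed, variational Monte Carlo is the simplest to sketch. A toy 1D harmonic-oscillator VMC with trial wave function psi = exp(-a x^2), far removed from QMCPACK's Slater-Jastrow machinery and its optimizer:

```python
import math
import random

# Toy variational Monte Carlo for the 1D harmonic oscillator: Metropolis
# sampling of |psi|^2 for the trial function psi = exp(-a x^2), averaging the
# local energy E_L = a + x^2 (1/2 - 2 a^2). Illustrative only.
def vmc_energy(a, steps=20000, step_size=1.0, seed=1):
    rng = random.Random(seed)
    x, e_sum = 0.0, 0.0
    for _ in range(steps):
        trial = x + rng.uniform(-step_size, step_size)
        # Metropolis acceptance with probability min(1, |psi(trial)/psi(x)|^2)
        if rng.random() < math.exp(-2 * a * (trial * trial - x * x)):
            x = trial
        e_sum += a + x * x * (0.5 - 2 * a * a)
    return e_sum / steps

# a = 1/2 is the exact ground state, so every sample's local energy is 1/2
print(vmc_energy(0.5))
```

By the variational principle, any other value of a yields an estimated energy above 1/2, which is what a VMC optimizer exploits when tuning trial-function parameters.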
NASA Technical Reports Server (NTRS)
Eloranta, E. W.; Piironen, P. K.
1996-01-01
Quantitative lidar measurements of aerosol scattering are hampered by the need for calibrations and the problem of correcting observed backscatter profiles for the effects of attenuation. The University of Wisconsin High Spectral Resolution Lidar (HSRL) addresses these problems by separating molecular scattering contributions from the aerosol scattering; the molecular scattering is then used as a calibration target that is available at each point in the observed profiles. While the HSRL approach has intrinsic advantages over competing techniques, realization of these advantages requires implementation of a technically demanding system which is potentially very sensitive to changes in temperature and mechanical alignments. This paper describes a new implementation of the HSRL in an instrumented van which allows measurements during field experiments. The HSRL was modified to measure depolarization. In addition, both the signal amplitude and depolarization variations with receiver field of view are simultaneously measured. This allows for discrimination of ice clouds from water clouds and observation of multiple scattering contributions to the lidar return.
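The calibration idea, that the molecular and total returns share the same attenuation and range factors so their ratio cancels both, can be shown with a toy profile (all numbers invented):

```python
import math

# Toy illustration of the HSRL idea above: attenuation and range factors are
# common to both channels, so their ratio cancels them, leaving the calibrated
# total-to-molecular scattering ratio 1 + beta_aer/beta_mol. Numbers invented.
beta_mol = 1.0                     # molecular backscatter (arbitrary units)
beta_aer = [0.0, 0.5, 2.0, 0.2]   # aerosol backscatter profile
ranges = [1.0, 2.0, 3.0, 4.0]
transmission = [math.exp(-0.3 * r) for r in ranges]   # two-way attenuation, toy

mol_channel = [beta_mol * t / r**2 for t, r in zip(transmission, ranges)]
tot_channel = [(beta_mol + ba) * t / r**2
               for ba, t, r in zip(beta_aer, transmission, ranges)]

ratio = [tc / mc for tc, mc in zip(tot_channel, mol_channel)]
print(ratio)  # recovers 1 + beta_aer/beta_mol regardless of the attenuation
```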
Unsafe abortion - the current global scenario.
Faúndes, Anibal
2010-08-01
Unsafe abortion is prevalent in many developing countries, mostly in sub-Saharan Africa, Latin America and South and Southeast Asia, where abortion laws are more restrictive, the unmet need for contraception high and the status of women in society low. The main interventions for reducing the prevalence of unsafe abortion are known: better and more widely available family planning services, comprehensive sex education, improved access to safe abortion and high-quality post-abortion care, including contraceptive counselling and on-site services. Although these proposals have been included in statements and recommendations drawn up at several international conferences and adopted by the vast majority of nations, they have either been inadequately implemented or not implemented at all in the countries in which the need is greatest. A well-coordinated effort by both national and international organisations and agencies is required to put these recommendations into practice; however, the most important factor determining the success of such efforts is the commitment of governments towards preventing unsafe abortion and reducing its prevalence and consequences. 2010 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
2014-12-04
The software serves two purposes. The first purpose of the software is to prototype the Sandia High Performance Computing Power Application Programming Interface Specification effort. The specification can be found at http://powerapi.sandia.gov . Prototypes of the specification were developed in parallel with the development of the specification. Release of the prototype will be instructive to anyone who intends to implement the specification. More specifically, our vendor collaborators will benefit from the availability of the prototype. The second is in direct support of the PowerInsight power measurement device, which was co-developed with Penguin Computing. The software provides a cluster-wide measurement capability enabled by the PowerInsight device. The software can be used by anyone who purchases a PowerInsight device. The software will allow the user to easily collect power and energy information of a node that is instrumented with PowerInsight. The software can also be used as an example prototype implementation of the High Performance Computing Power Application Programming Interface Specification.
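A device like PowerInsight yields sampled node power; a typical post-processing step is integrating those samples into energy. A hypothetical sketch (the Power API specification defines its own interfaces; this is not it):

```python
# Hypothetical sketch of turning sampled node power readings (as a PowerInsight-
# style device provides) into energy via trapezoidal integration over time.
# The function name and data format are illustrative, not from the Power API.
def energy_joules(samples):
    """samples: list of (time_seconds, power_watts), sorted by time."""
    total = 0.0
    for (t0, p0), (t1, p1) in zip(samples, samples[1:]):
        total += (p0 + p1) / 2.0 * (t1 - t0)
    return total

print(energy_joules([(0.0, 100.0), (1.0, 120.0), (2.0, 110.0)]))  # 225.0 J
```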
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Jeongnim; Baczewski, Andrew T.; Beaudet, Todd D.
QMCPACK is an open source quantum Monte Carlo package for ab-initio electronic structure calculations. It supports calculations of metallic and insulating solids, molecules, atoms, and some model Hamiltonians. Implemented real space quantum Monte Carlo algorithms include variational, diffusion, and reptation Monte Carlo. QMCPACK uses Slater-Jastrow type trial wave functions in conjunction with a sophisticated optimizer capable of optimizing tens of thousands of parameters. The orbital space auxiliary field quantum Monte Carlo method is also implemented, enabling cross validation between different highly accurate methods. The code is specifically optimized for calculations with large numbers of electrons on the latest high performance computing architectures, including multicore central processing unit (CPU) and graphical processing unit (GPU) systems. We detail the program’s capabilities, outline its structure, and give examples of its use in current research calculations. The package is available at http://www.qmcpack.org.
The coordination of allocation: Logistics of kidney organ allocation to highly sensitized patients.
Lunz, John; Hinsdale, Lisa; King, Casey; Pastush, Robin; Buenvenida, Magnolia; Harmon, Michael
2017-01-01
Since its implementation, the new UNOS OPTN kidney allocation system (KAS) has drastically expanded the pool of available kidneys for candidates who may previously have waited extended periods for an organ offer. This is particularly true for highly sensitized patients. The changes to the KAS have had ramifications throughout the transplant process, including for organ procurement organizations (OPO) and local transplant hospital call centers. Here, we will examine the impact of the new KAS on the organ donation process and highlight the necessary interactions between the OPO and transplant centers to best match donor kidneys and highly sensitized recipients. Copyright © 2016 American Society for Histocompatibility and Immunogenetics. Published by Elsevier Inc. All rights reserved.
Smartphone based monitoring system for long-term sleep assessment.
Domingues, Alexandre
2015-01-01
The diagnosis of sleep disorders, which are highly prevalent in Western countries, typically involves sophisticated procedures and equipment that are highly intrusive to the patient. The high processing capabilities and storage capacity of current portable devices, together with a wide range of available sensors, many with wireless capabilities, create new opportunities and change the paradigms of sleep studies. In this work, a smartphone-based sleep monitoring system is presented along with the details of its hardware, software, and algorithm implementation. The aim of this system is to provide a way for subjects with no pre-diagnosed sleep disorders to monitor their sleep habits and to aid in the initial screening of abnormal sleep patterns.
CBESW: sequence alignment on the Playstation 3.
Wirawan, Adrianto; Kwoh, Chee Keong; Hieu, Nim Tri; Schmidt, Bertil
2008-09-17
The exponential growth of available biological data has caused bioinformatics to be rapidly moving towards a data-intensive, computational science. As a result, the computational power needed by bioinformatics applications is growing exponentially as well. The recent emergence of accelerator technologies has made it possible to achieve an excellent improvement in execution time for many bioinformatics applications, compared to current general-purpose platforms. In this paper, we demonstrate how the PlayStation 3, powered by the Cell Broadband Engine, can be used as a computational platform to accelerate the Smith-Waterman algorithm. For large datasets, our implementation on the PlayStation 3 provides a significant improvement in running time compared to other implementations such as SSEARCH, Striped Smith-Waterman and CUDA. Our implementation achieves a peak performance of up to 3,646 MCUPS. The results from our experiments demonstrate that the PlayStation 3 console can be used as an efficient low cost computational platform for high performance sequence alignment applications.
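For reference, the serial baseline being accelerated: textbook Smith-Waterman local alignment with a linear gap penalty (the scoring values here are illustrative defaults, not those used in the paper):

```python
# Textbook Smith-Waterman local alignment score with a linear gap penalty:
# the serial baseline that accelerator ports like the one above speed up.
# Scoring values are illustrative defaults, not those used in the paper.
def smith_waterman(a, b, match=2, mismatch=-1, gap=-1):
    rows, cols = len(a) + 1, len(b) + 1
    H = [[0] * cols for _ in range(rows)]
    best = 0
    for i in range(1, rows):
        for j in range(1, cols):
            s = match if a[i - 1] == b[j - 1] else mismatch
            H[i][j] = max(0,
                          H[i - 1][j - 1] + s,   # align/substitute
                          H[i - 1][j] + gap,     # gap in b
                          H[i][j - 1] + gap)     # gap in a
            best = max(best, H[i][j])
    return best

print(smith_waterman("GATTACA", "TTAC"))  # 8: an exact 4-letter local match
```

The quadratic dependency structure of the matrix H is what anti-diagonal and SIMD parallelizations on the Cell and on GPUs exploit.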
CBESW: Sequence Alignment on the Playstation 3
Wirawan, Adrianto; Kwoh, Chee Keong; Hieu, Nim Tri; Schmidt, Bertil
2008-01-01
Background The exponential growth of available biological data has caused bioinformatics to be rapidly moving towards a data-intensive, computational science. As a result, the computational power needed by bioinformatics applications is growing exponentially as well. The recent emergence of accelerator technologies has made it possible to achieve an excellent improvement in execution time for many bioinformatics applications, compared to current general-purpose platforms. In this paper, we demonstrate how the PlayStation® 3, powered by the Cell Broadband Engine, can be used as a computational platform to accelerate the Smith-Waterman algorithm. Results For large datasets, our implementation on the PlayStation® 3 provides a significant improvement in running time compared to other implementations such as SSEARCH, Striped Smith-Waterman and CUDA. Our implementation achieves a peak performance of up to 3,646 MCUPS. Conclusion The results from our experiments demonstrate that the PlayStation® 3 console can be used as an efficient low cost computational platform for high performance sequence alignment applications. PMID:18798993
MPI implementation of PHOENICS: A general purpose computational fluid dynamics code
NASA Astrophysics Data System (ADS)
Simunovic, S.; Zacharia, T.; Baltas, N.; Spalding, D. B.
1995-03-01
PHOENICS is a suite of computational analysis programs that are used for simulation of fluid flow, heat transfer, and dynamical reaction processes. The parallel version of the solver EARTH for the Computational Fluid Dynamics (CFD) program PHOENICS has been implemented using Message Passing Interface (MPI) standard. Implementation of MPI version of PHOENICS makes this computational tool portable to a wide range of parallel machines and enables the use of high performance computing for large scale computational simulations. MPI libraries are available on several parallel architectures making the program usable across different architectures as well as on heterogeneous computer networks. The Intel Paragon NX and MPI versions of the program have been developed and tested on massively parallel supercomputers Intel Paragon XP/S 5, XP/S 35, and Kendall Square Research, and on the multiprocessor SGI Onyx computer at Oak Ridge National Laboratory. The preliminary testing results of the developed program have shown scalable performance for reasonably sized computational domains.
MPI implementation of PHOENICS: A general purpose computational fluid dynamics code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simunovic, S.; Zacharia, T.; Baltas, N.
1995-04-01
PHOENICS is a suite of computational analysis programs that are used for simulation of fluid flow, heat transfer, and dynamical reaction processes. The parallel version of the solver EARTH for the Computational Fluid Dynamics (CFD) program PHOENICS has been implemented using Message Passing Interface (MPI) standard. Implementation of MPI version of PHOENICS makes this computational tool portable to a wide range of parallel machines and enables the use of high performance computing for large scale computational simulations. MPI libraries are available on several parallel architectures making the program usable across different architectures as well as on heterogeneous computer networks. Themore » Intel Paragon NX and MPI versions of the program have been developed and tested on massively parallel supercomputers Intel Paragon XP/S 5, XP/S 35, and Kendall Square Research, and on the multiprocessor SGI Onyx computer at Oak Ridge National Laboratory. The preliminary testing results of the developed program have shown scalable performance for reasonably sized computational domains.« less
A Real-Time Earthquake Precursor Detection Technique Using TEC from a GPS Network
NASA Astrophysics Data System (ADS)
Alp Akyol, Ali; Arikan, Feza; Arikan, Orhan
2016-07-01
Anomalies have been observed in the ionospheric electron density distribution prior to strong earthquakes. However, most reported results come from retrospective analysis after the earthquakes, which makes their implementation in practice highly problematic. Recently, a novel earthquake precursor detection technique based on spatio-temporal analysis of Total Electron Content (TEC) data obtained from the Turkish National Permanent GPS Network (TNPGN) was developed by the IONOLAB group (www.ionolab.org). In the present study, the detection technique is implemented in a causal setup over the available data set in a test phase, which enables real-time implementation. The performance of the developed earthquake prediction technique is evaluated using 10-fold cross validation over the data obtained in 2011. Among the 23 earthquakes with magnitudes higher than 5, the developed technique detects precursors of 14 earthquakes while producing 8 false alarms. This study is supported by TUBITAK 115E915 and joint TUBITAK 114E092 and AS CR 14/001 projects.
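The reported detection performance can be summarized with standard precision/recall arithmetic; the counts below are taken directly from the abstract (14 of 23 precursors detected, 8 false alarms).

```python
def precision_recall(tp, fp, fn):
    """Precision = TP/(TP+FP); recall (detection rate) = TP/(TP+FN)."""
    return tp / (tp + fp), tp / (tp + fn)

# 23 earthquakes with M > 5: 14 detected (TP), 9 missed (FN), 8 false alarms (FP)
precision, recall = precision_recall(tp=14, fp=8, fn=23 - 14)
```

So the technique detects roughly 61% of precursors while about 36% of its alarms are false, the usual trade-off evaluated in such cross-validation studies.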
[Reducing the burden of disease caused by alcohol use in Peru: evidence- based approaches].
Fiestas, Fabián
2012-03-01
Alcohol use is one of the most important risk factors for illness and early death in Peru. Measures aimed at decreasing or controlling the great impact caused by alcohol in the Peruvian society are urgently needed. This article identifies and promotes the implementation of public health measures supported by sound scientific evidence of effectiveness or, in some cases, cost-effectiveness. The 10 evidence-based public health measures identified and described here represent a set of measures with high probability of success if implemented, as they are supported by scientific evidence. We recommend that governments, at the national or local levels, apply these measures not individually, but in combination, arranging them into a plan or roadmap, where the framework in which they will be applied must be established according to each context. Considering the available resources, some of these measures could be implemented in the short and medium term while the others can be set in the long-term.
High-speed line-scan camera with digital time delay integration
NASA Astrophysics Data System (ADS)
Bodenstorfer, Ernst; Fürtler, Johannes; Brodersen, Jörg; Mayer, Konrad J.; Eckel, Christian; Gravogl, Klaus; Nachtnebel, Herbert
2007-02-01
In high-speed image acquisition and processing systems, the speed of operation is often limited by the amount of available light, due to short exposure times. Therefore, high-speed applications often use line-scan cameras, based on charge-coupled device (CCD) sensors with time delay integration (TDI). Synchronous shift and accumulation of photoelectric charges on the CCD chip - according to the objects' movement - result in a longer effective exposure time without introducing additional motion blur. This paper presents a high-speed color line-scan camera based on a commercial complementary metal oxide semiconductor (CMOS) area image sensor with a Bayer filter matrix and a field programmable gate array (FPGA). The camera implements a digital equivalent to the TDI effect exploited with CCD cameras. The proposed design benefits from the high frame rates of CMOS sensors and from the possibility of arbitrarily addressing the rows of the sensor's pixel array. For the digital TDI, only a small number of rows is read out from the area sensor; these are then shifted and accumulated according to the movement of the inspected objects. This paper gives a detailed description of the digital TDI algorithm implemented on the FPGA. Relevant aspects for the practical application are discussed and key features of the camera are listed.
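The digital TDI principle - shift partial row sums in step with object motion and accumulate - can be sketched in software. This is a simplified model of the idea, not the paper's FPGA design; the synthetic scene and R = 3 stages are illustrative assumptions.

```python
def digital_tdi(frames):
    """Digital time delay integration: each frame contributes R sensor rows;
    partial sums shift one stage per frame, so every output line accumulates
    R exposures of the same object line as it crosses the sensor."""
    R, width = len(frames[0]), len(frames[0][0])
    stages = [[0] * width for _ in range(R)]   # shift register of partial sums
    out = []
    for frame in frames:
        out.append(stages.pop())               # line that completed R exposures
        stages.insert(0, [0] * width)          # a new object line enters
        for i in range(R):
            stages[i] = [a + b for a, b in zip(stages[i], frame[i])]
    return out[R:]                             # drop pipeline-fill outputs

# Synthetic scene: object lines with brightness 1..4 drift one row per frame
obj, R = [1, 2, 3, 4], 3
frames = [[[obj[t - i]] if 0 <= t - i < len(obj) else [0] for i in range(R)]
          for t in range(len(obj) + R)]
lines = digital_tdi(frames)   # each output line is summed over R = 3 exposures
```

Each output line is the sum of R exposures of the same moving object line, which is the longer effective exposure time the abstract describes.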
Segers, Laurent; Van Bavegem, David; De Winne, Sam; Braeken, An; Touhafi, Abdellah; Steenhaut, Kris
2015-01-01
This paper describes a new approach and implementation methodology for indoor ranging based on the time difference of arrival using code division multiple access with ultrasound signals. A novel implementation based on a field programmable gate array using finite impulse response filters and an optimized correlation demodulator implementation for ultrasound orthogonal signals is developed. Orthogonal codes are modulated onto ultrasound signals using frequency shift keying with carrier frequencies of 24.5 kHz and 26 kHz. This implementation enhances the possibilities for real-time, embedded and low-power tracking of several simultaneous transmitters. Due to the high degree of parallelism offered by field programmable gate arrays, up to four transmitters can be tracked simultaneously. The implementation requires at most 30% of the available logic gates of a Spartan-6 XC6SLX45 device and is evaluated on accuracy and precision through several ranging topologies. In the first topology, the distance between one transmitter and one receiver is evaluated. Afterwards, ranging analyses are applied between two simultaneous transmitters and one receiver. Ultimately, the position of the receiver against four transmitters using trilateration is also demonstrated. Results show enhanced distance measurements with distances ranging from a few centimeters up to 17 m, while keeping a centimeter-level accuracy. PMID:26263986
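The correlation-demodulator idea at the core of this system - the arrival time of a known orthogonal code is found as the peak of its cross-correlation with the received signal - can be sketched as follows. The code and offset below are invented for illustration; a real receiver would correlate against the FSK-modulated ultrasound waveform.

```python
def estimate_delay(received, code):
    """Return the sample offset at which the known code best matches the
    received signal (argmax of the sliding cross-correlation)."""
    best_lag, best_corr = 0, float("-inf")
    for lag in range(len(received) - len(code) + 1):
        corr = sum(c * received[lag + k] for k, c in enumerate(code))
        if corr > best_corr:
            best_lag, best_corr = lag, corr
    return best_lag

# Illustrative: a +/-1 pseudo-random code embedded at sample offset 37
code = [1, -1, 1, 1, -1, -1, 1, -1, 1, 1, 1, -1, -1, -1, 1]
rx = [0.0] * 37 + [float(c) for c in code] + [0.0] * 20
lag = estimate_delay(rx, code)   # range = lag / sample_rate * speed_of_sound
```

With mutually orthogonal codes, one such correlator per transmitter lets several transmitters be tracked simultaneously, which is what the FPGA's parallel FIR/correlator banks provide.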
Drury, Peta; McInnes, Elizabeth; Hardy, Jennifer; Dale, Simeon; Middleton, Sandy
2016-04-01
The uptake of evidence into practice may be impeded or facilitated by individual and organizational factors within the local context. This study investigated the views of Nurse Managers of New South Wales, Australia, stroke units (n = 19) on: leadership ability (measured by the Leadership Practices Inventory), organizational learning (measured by the Organizational Learning Survey), attitudes and beliefs towards evidence-based practice (EBP), and readiness for change. Overall, Nurse Managers reported high-level leadership skills and a culture of learning. Nurse Managers' attitudes towards EBP were positive, although nursing colleagues' attitudes were perceived as less positive. Nurse Managers agreed that implementing evidence in practice places additional demands on staff; and almost half (n = 9, 47%) reported that resources were not available for evidence implementation. The findings indicate that key persons responsible for evidence implementation are not allocated sufficient time to coordinate and implement guidelines into practice. The findings suggest that barriers to evidence uptake, including insufficient resources and time constraints, identified by Nurse Managers in this study are not likely to be unique to stroke units. Furthermore, Nurse Managers may be unable to address these organizational barriers (i.e. lack of resources) and thus provide all the components necessary to implement EBP. © 2015 John Wiley & Sons Australia, Ltd.
Muilenburg, Jessica L; Laschober, Tanja C; Eby, Lillian T
2015-09-01
Adolescence is a prime developmental stage for early tobacco cessation (TC) intervention. This study examined substance use disorder counselors' reports of the availability and implementation of TC services (behavioral treatments and pharmacotherapies) in their treatment programs and the relationship between their tobacco-related knowledge and implementation of TC services. Survey data were collected in 2012 from 63 counselors working in 22 adolescent-only treatment programs. Measures included 15 TC behavioral treatments, nine TC pharmacotherapies, and three tobacco-related knowledge scales (morbidity/mortality, modalities and effectiveness, pharmacology). First, nine of the 15 behavioral treatments are reported as being available by more than half of counselors; four of the 15 behavioral treatments are used by counselors with more than half of adolescents. Of the nine pharmacotherapies, availability of the nicotine patch is reported by almost 40%, buproprion by nearly 30%, and clonidine by about 21% of counselors. Pharmacotherapies are used by counselors with very few adolescents. Second, counselors' tobacco-related knowledge varies based on the knowledge scale examined. Third, we only find a significant positive relationship between counselors' implementation of TC behavioral treatments and TC modalities and effectiveness knowledge. Findings suggest that more behavioral treatments should be made available in substance use disorder treatment programs considering that they are the main treatment recommendation for adolescents. Counselors should be encouraged to routinely use a wide range of available behavioral treatments. Finally, counselors should be encouraged to expand their knowledge of TC modalities and effectiveness because of the relationship with behavioral treatments implementation. Copyright © 2015 Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-02
... Promulgation of Implementation Plans; Texas; Reasonably Available Control Technology for the 1997 8-Hour Ozone... (SIP) for the Houston/Galveston/ Brazoria (HGB) 1997 8-Hour ozone nonattainment Area (Area). The HGB... Room between the hours of 8:30am and 4:30 p.m. weekdays except for legal holidays. Contact the person...
ERIC Educational Resources Information Center
Terry-McElrath, Yvonne M.; Hood, Nancy E.; Colabianchi, Natalie; O'Malley, Patrick M.; Johnston, Lloyd D.
2014-01-01
Background: The 2013-2014 school year involved preparation for implementing the new US Department of Agriculture (USDA) competitive foods nutrition standards. An awareness of associations between commercial supplier involvement, food vending practices, and food vending item availability may assist schools in preparing for the new standards.…
Cost Assessment of Implementation of Immune Tolerance Induction in Iran.
Cheraghali, Abdol Majid; Eshghi, Peyman
2012-05-01
A number of hemophilia A patients who receive clotting factors may develop antibodies (inhibitors) against clotting factors. The immune tolerance induction (ITI) method has proved to be a very cost-effective alternative to bypassing agents. Iran's national health authority is interested in implementing the ITI method for the management of hemophilia patients with inhibitors. The objective of this study was to calculate the breakeven point between costs attributed to the ITI method and the use of bypassing agents for the management of high-responder hemophilia patients with inhibitors. This study compared the costs of implementing ITI for the management of Iranian hemophilia patients with high-titer, high-responding inhibitors against the costs of bypassing therapy, from the perspective of the national health system. The main objective was to find the breakeven point for the ITI method in comparison with the use of bypassing medicine, recombinant factor VIIa (Novoseven). Based on the sensitivity analysis performed, the breakeven point mainly depends on costs of factor VIII, Novoseven, and the success rate of the ITI intervention. According to this analysis, the breakeven point of ITI and Novoseven methods varies between 16 and 34 months posttreatment. The optimized point is about 17 months posttreatment. Iran's national health system spends more than 24 million euros for providing bypassing agents to about 124 hemophilia patients with inhibitors. Because of limited resources available in Iran's health sector, this is a huge burden. Results of this study show that the implementation of the ITI method for the management of Iranian hemophilia patients with inhibitors is a cost-saving method and Iran's health system will recover all the expenditure related to the implementation of ITI in less than 2 years and will make a considerable saving along with providing standard care for these patients.
Copyright © 2012 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
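The breakeven calculation itself is simple arithmetic on two cumulative cost streams: ITI front-loads a large factor VIII expenditure, while bypassing therapy has a higher recurring cost. The euro figures below are invented for illustration only (chosen so the breakeven lands near the ~17 months the study reports); they are not the study's data.

```python
def breakeven_month(iti_upfront, iti_monthly, bypass_monthly, horizon=60):
    """First month at which cumulative ITI cost drops below cumulative
    bypassing-agent cost, or None if it never does within the horizon."""
    for m in range(1, horizon + 1):
        if iti_upfront + iti_monthly * m < bypass_monthly * m:
            return m
    return None

# Hypothetical per-patient figures in euros (illustrative assumptions)
m = breakeven_month(iti_upfront=250_000, iti_monthly=2_000, bypass_monthly=17_000)
```

The sensitivity of the breakeven point to factor VIII cost, Novoseven cost and ITI success rate noted in the abstract corresponds to varying the three arguments above.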
2014-01-01
Background High rates of childhood obesity have generated interest among policy makers to improve the school food environment and increase students’ levels of physical activity. The purpose of this study was to examine school-level changes associated with implementation of the Food and Beverage Sales in Schools (FBSS) and Daily Physical Activity (DPA) guidelines in British Columbia, Canada. Methods Elementary and middle/high school principals completed a survey on the school food and physical activity environment in 2007–08 (N = 513) and 2011–12 (N = 490). Hierarchical mixed effects regression was used to examine changes in: 1) availability of food and beverages; 2) minutes per day of Physical Education (PE); 3) delivery method of PE; and 4) school community support. Models controlled for school enrollment and community type, education and income. Results After policy implementation was expected, more elementary schools provided access to fruits and vegetables and fewer to 100% fruit juice. Fewer middle/high schools provided access to sugar-sweetened beverages, French fries, baked goods, salty snacks and chocolate/candy. Schools were more likely to meet 150 min/week of PE for grade 6 students, and offer more minutes of PE per week for grade 8 and 10 students including changes to PE delivery method. School community support for nutrition and physical activity policies increased over time. Conclusion Positive changes to the school food environment occurred after schools were expected to implement the FBSS and DPA guidelines. Reported changes to the school environment are encouraging and provide support for guidelines and policies that focus on increasing healthy eating and physical activity in schools. PMID:24731514
Hart, Reece K; Rico, Rudolph; Hare, Emily; Garcia, John; Westbrook, Jody; Fusaro, Vincent A
2015-01-15
Biological sequence variants are commonly represented in scientific literature, clinical reports and databases of variation using the mutation nomenclature guidelines endorsed by the Human Genome Variation Society (HGVS). Despite the widespread use of the standard, no freely available and comprehensive programming libraries are available. Here we report an open-source and easy-to-use Python library that facilitates the parsing, manipulation, formatting and validation of variants according to the HGVS specification. The current implementation focuses on the subset of the HGVS recommendations that precisely describe sequence-level variation relevant to the application of high-throughput sequencing to clinical diagnostics. The package is released under the Apache 2.0 open-source license. Source code, documentation and issue tracking are available at http://bitbucket.org/hgvs/hgvs/. Python packages are available at PyPI (https://pypi.python.org/pypi/hgvs). Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.
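The library itself is at the URLs above; the fragment below is an independent, simplified sketch of what parsing one narrow class of HGVS strings involves (coding-DNA substitutions such as NM_000059.3:c.1234A>G), not the package's API. Real HGVS covers far more (indels, duplications, intronic offsets, uncertainty), which is why a dedicated library is needed.

```python
import re

# Simplified pattern for coding-DNA substitutions only (an assumption for
# illustration; the full HGVS grammar is much larger).
SUB = re.compile(
    r"^(?P<accession>[A-Z]+_\d+(?:\.\d+)?)"   # RefSeq accession with version
    r":c\.(?P<pos>\d+)"                        # coding-sequence position
    r"(?P<ref>[ACGT])>(?P<alt>[ACGT])$"        # reference and alternate base
)

def parse_c_substitution(hgvs_string):
    """Parse a simple c. substitution into its components."""
    m = SUB.match(hgvs_string)
    if m is None:
        raise ValueError(f"not a simple c. substitution: {hgvs_string!r}")
    d = m.groupdict()
    d["pos"] = int(d["pos"])
    return d

v = parse_c_substitution("NM_000059.3:c.1234A>G")
```

Validation - checking that the stated reference base actually occurs at that position in the accession's sequence - is the part that requires external sequence data and that the hgvs package handles.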
How to keep the Grid full and working with ATLAS production and physics jobs
NASA Astrophysics Data System (ADS)
Pacheco Pagés, A.; Barreiro Megino, F. H.; Cameron, D.; Fassi, F.; Filipcic, A.; Di Girolamo, A.; González de la Hoz, S.; Glushkov, I.; Maeno, T.; Walker, R.; Yang, W.; ATLAS Collaboration
2017-10-01
The ATLAS production system provides the infrastructure to process millions of events collected during the LHC Run 1 and the first two years of Run 2 using grid, clouds and high performance computing. We address in this contribution the strategies and improvements that have been implemented in the production system for optimal performance and to achieve the highest efficiency of available resources from an operational perspective. We focus on the recent developments.
Seat-belt message and the law?
Sengupta, S K; Patil, N G; Law, G
1989-09-01
This paper attempts to draw together available information on the use of seat belts, one of the most important safety devices for a person in a car. Considering the high rate of mortality and morbidity due to road traffic accidents in Papua New Guinea the authors strongly feel that seat-belt usage should be made compulsory. When one looks at the history of the implementation of such a successful countermeasure in other countries it seems that legislation is the only answer.
Broadcast control of air traffic
NASA Technical Reports Server (NTRS)
Litchford, G. B.
1972-01-01
Applications of wide range broadcast procedures to improve air traffic control and make more airspace available are discussed. A combination of the Omega navigation system and the very high frequency omnirange (VOR) is recommended as a means for accomplishing improved air traffic control. The benefits to be derived by commercial and general aviation are described. The air/ground communications aspects of the improved air traffic control system are explained. Research and development programs for implementing the broadcast concept are recommended.
The Galileo Spacecraft: A Telecommunications Legacy for Future Space Flight
NASA Technical Reports Server (NTRS)
Deutsch, Leslie J.
1997-01-01
The Galileo mission to Jupiter has implemented a wide range of telecommunication improvements in response to the loss of its high gain antenna. While necessity dictated the use of these new techniques for Galileo, now that they have been proven in flight, they are available for use on future deep space missions. This telecommunications legacy of Galileo will aid in our ability to conduct a meaningful exploration of the solar system, and beyond, at a reasonable cost.
Ultra-Wideband Time-Difference-of-Arrival High Resolution 3D Proximity Tracking System
NASA Technical Reports Server (NTRS)
Ni, Jianjun; Arndt, Dickey; Ngo, Phong; Phan, Chau; Dekome, Kent; Dusl, John
2010-01-01
This paper describes a research and development effort for a prototype ultra-wideband (UWB) tracking system that is currently under development at NASA Johnson Space Center (JSC). The system is being studied for use in tracking of lunar/Mars rovers and astronauts during early exploration missions when satellite navigation systems are not available. UWB impulse radio (UWB-IR) technology is exploited in the design and implementation of the prototype location and tracking system. A three-dimensional (3D) proximity tracking prototype design using commercially available UWB products is proposed to implement the Time-Difference-Of-Arrival (TDOA) tracking methodology in this research effort. The TDOA tracking algorithm is utilized for location estimation in the prototype system, not only to exploit the precise time resolution possible with UWB signals, but also to eliminate the need for synchronization between the transmitter and the receiver. Simulations show that the TDOA algorithm can achieve the fine tracking resolution with low noise TDOA estimates for close-in tracking. Field tests demonstrated that this prototype UWB TDOA High Resolution 3D Proximity Tracking System is feasible for providing positioning-awareness information in a 3D space to a robotic control system. This 3D tracking system is developed for a robotic control system in a facility called "Moonyard" at Honeywell Defense & System in Arizona under a Space Act Agreement.
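The TDOA principle - each measured time difference constrains the receiver to a hyperbola, and the hyperbolas' intersection gives the position without transmitter/receiver synchronization - can be illustrated with a brute-force 2D sketch. The grid search below stands in for the least-squares solvers a real tracker would use; the beacon geometry and sound-speed-style units are illustrative assumptions, not the prototype's design.

```python
import math

def tdoa_locate(beacons, tdoas, c=343.0, step=0.05, extent=10.0):
    """Grid-search the 2D position whose predicted TDOAs (relative to
    beacon 0) best match the measured ones, in least-squares error."""
    def predicted(x, y):
        d0 = math.hypot(x - beacons[0][0], y - beacons[0][1])
        return [(math.hypot(x - bx, y - by) - d0) / c for bx, by in beacons[1:]]
    best, best_err = None, float("inf")
    n = round(extent / step) + 1
    for i in range(n):
        for j in range(n):
            x, y = i * step, j * step
            err = sum((p - t) ** 2 for p, t in zip(predicted(x, y), tdoas))
            if err < best_err:
                best, best_err = (x, y), err
    return best

# Four beacons at the corners of a 10 m square, receiver at (3.0, 4.0)
beacons = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0), (10.0, 10.0)]
true = (3.0, 4.0)
d0 = math.hypot(true[0], true[1])
tdoas = [(math.hypot(true[0] - bx, true[1] - by) - d0) / 343.0
         for bx, by in beacons[1:]]
pos = tdoa_locate(beacons, tdoas)
```

Note that only time differences enter the residual, which is exactly why the receiver clock need not be synchronized to the transmitters.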
Optimizability of OGC Standards Implementations - a Case Study
NASA Astrophysics Data System (ADS)
Misev, D.; Baumann, P.
2012-04-01
Why do we shop at Amazon? Because they have a unique offering that is nowhere else available? Certainly not. Rather, Amazon offers (i) simple, yet effective search; (ii) very simple payment; (iii) extremely rapid delivery. This is how scientific services will be distinguished in future: not for their data holding (there will be manifold choice), but for their service quality. We are facing the transition from data stewardship to service stewardship. One of the OGC standards which particularly enables flexible retrieval is the Web Coverage Processing Service (WCPS). It defines a high-level query language on large, multi-dimensional raster data, such as 1D timeseries, 2D EO imagery, 3D x/y/t image time series and x/y/z geophysical data, 4D x/y/z/t climate and ocean data. We have implemented WCPS based on an Array Database Management System, rasdaman, which is available in open source. In this demonstration, we study WCPS queries on 2D, 3D, and 4D data sets. Particular emphasis is placed on the computational load queries generate in such on-demand processing and filtering. We look at different techniques and their impact on performance, such as adaptive storage partitioning, query rewriting, and just-in-time compilation. Results show that there is significant potential for effective server-side optimization once a query language is sufficiently high-level and declarative.
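A WCPS request of the kind studied here might look as follows. This is a sketch in the spirit of the standard's `for`/`return encode` query shape; the coverage name, axis labels and time format are invented for illustration and should not be read as a verbatim conformant query.

```
for c in ( Temperature4D )
return avg( c[ Lat(40:50), Long(-10:0), ansi("2010-01" : "2010-12") ] )
```

Because the query declares *what* to compute rather than *how*, the server is free to apply the optimizations discussed above (partition-aware evaluation, query rewriting, just-in-time compilation) before touching any array data.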
High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis.
Simonyan, Vahan; Mazumder, Raja
2014-09-30
The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis.
NASA Astrophysics Data System (ADS)
Bostrom, G.; Atkinson, D.; Rice, A.
2015-04-01
Cavity ringdown spectroscopy (CRDS) uses the exponential decay constant of light exiting a high-finesse resonance cavity to determine analyte concentration, typically via absorption. We present a high-throughput data acquisition system that determines the decay constant in near real time using the discrete Fourier transform algorithm on a field programmable gate array (FPGA). A commercially available, high-speed, high-resolution, analog-to-digital converter evaluation board system is used as the platform for the system, after minor hardware and software modifications. The system outputs decay constants at maximum rate of 4.4 kHz using an 8192-point fast Fourier transform by processing the intensity decay signal between ringdown events. We present the details of the system, including the modifications required to adapt the evaluation board to accurately process the exponential waveform. We also demonstrate the performance of the system, both stand-alone and incorporated into our existing CRDS system. Details of FPGA, microcontroller, and circuitry modifications are provided in the Appendix and computer code is available upon request from the authors.
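The core numerical idea - recovering the decay rate from a single DFT bin - can be sketched offline. For x(t) = A·exp(-kt) the continuous transform is A/(k + iω), so k ≈ ω₁·Re(X₁)/(-Im(X₁)) for the first bin. This is a software sketch of the principle with illustrative sampling parameters, not the paper's FPGA pipeline (which computes the FFT in hardware between ringdown events).

```python
import cmath, math

def decay_rate_from_dft(samples, dt):
    """Estimate the decay rate k of A*exp(-k t) from the first DFT bin:
    X(omega) ~ A/(k + i*omega), hence k ~ omega1 * Re(X1) / (-Im(X1))."""
    N = len(samples)
    omega1 = 2 * math.pi / (N * dt)
    X1 = sum(x * cmath.exp(-2j * math.pi * n / N) for n, x in enumerate(samples))
    return omega1 * X1.real / (-X1.imag)

# Synthetic ringdown: 100 us decay time sampled at 10 MHz for 8192 points
k_true, dt = 1.0 / 100e-6, 1e-7
ring = [math.exp(-k_true * n * dt) for n in range(8192)]
k_est = decay_rate_from_dft(ring, dt)
```

Because only the ratio of the real and imaginary parts of one bin is needed, the method is insensitive to the overall amplitude A, which is convenient when light levels vary shot to shot.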
High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis
Simonyan, Vahan; Mazumder, Raja
2014-01-01
The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis. PMID:25271953
Computational statistics using the Bayesian Inference Engine
NASA Astrophysics Data System (ADS)
Weinberg, Martin D.
2013-09-01
This paper introduces the Bayesian Inference Engine (BIE), a general parallel, optimized software package for parameter inference and model selection. This package is motivated by the analysis needs of modern astronomical surveys and the need to organize and reuse expensive derived data. The BIE is the first platform for computational statistics designed explicitly to enable Bayesian update and model comparison for astronomical problems. Bayesian update is based on the representation of high-dimensional posterior distributions using metric-ball-tree based kernel density estimation. Among its algorithmic offerings, the BIE emphasizes hybrid tempered Markov chain Monte Carlo schemes that robustly sample multimodal posterior distributions in high-dimensional parameter spaces. Moreover, the BIE implements a full persistence or serialization system that stores the full byte-level image of the running inference and previously characterized posterior distributions for later use. Two new algorithms to compute the marginal likelihood from the posterior distribution, developed for and implemented in the BIE, enable model comparison for complex models and data sets. Finally, the BIE was designed to be a collaborative platform for applying Bayesian methodology to astronomy. It includes an extensible object-oriented and easily extended framework that implements every aspect of the Bayesian inference. By providing a variety of statistical algorithms for all phases of the inference problem, a scientist may explore a variety of approaches with a single model and data implementation. Additional technical details and download details are available from http://www.astro.umass.edu/bie. The BIE is distributed under the GNU General Public License.
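The BIE's tempered MCMC machinery builds on the basic Metropolis-Hastings step, which a minimal sketch can illustrate. This is the textbook random-walk algorithm, not BIE code; the Gaussian target and proposal width are illustrative assumptions.

```python
import math, random

def metropolis_hastings(log_post, x0, n_steps, step=1.0, seed=1):
    """Random-walk Metropolis sampler: propose x' = x + N(0, step) and
    accept with probability min(1, post(x')/post(x))."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    chain = []
    for _ in range(n_steps):
        xp = x + rng.gauss(0.0, step)
        lpp = log_post(xp)
        if lpp >= lp or rng.random() < math.exp(lpp - lp):
            x, lp = xp, lpp
        chain.append(x)
    return chain

# Illustrative target: a standard normal posterior, log p(x) = -x^2/2 + const
chain = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0, n_steps=20000)
mean = sum(chain) / len(chain)
var = sum((c - mean) ** 2 for c in chain) / len(chain)
```

Tempered variants of the kind the BIE emphasizes run several such chains at flattened versions of the posterior and swap states between them, which is what lets them cross between well-separated modes.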
Moye-Holz, Daniela; van Dijk, Jitse P; Reijneveld, Sijmen A; Hogerzeil, Hans V
2017-08-01
The World Health Organization recommends establishing and implementing a national pharmaceutical policy (NPP) to guarantee effective and equitable access to medicines. Mexico has implemented several policy approaches to regulate the pharmaceutical sector, but it has no formal NPP. This article describes the approach that the Mexican government has taken to improve availability and affordability of essential medicines. Descriptive policy analysis of public pharmaceutical policy proposals and health action plans on the basis of publicly available data and health progress reports, with a focus on availability and affordability of medicines. The government has implemented pooled procurement, price negotiations, and an information platform in the public sector to improve affordability and availability. The government mainly reports on the savings that these strategies have generated in the public expenditure but their full impact on availability and affordability has not been assessed. To increase availability and affordability of medicines in the public sector, the Mexican government has resorted on isolated strategies. In addition to efficient procurement, price negotiations and price information, other policy components and pricing interventions are needed. All these strategies should be included in a comprehensive NPP.
snpAD: An ancient DNA genotype caller.
Prüfer, Kay
2018-06-21
The study of ancient genomes can elucidate the evolutionary past. However, analyses are complicated by base-modifications in ancient DNA molecules that result in errors in DNA sequences. These errors are particularly common near the ends of sequences and pose a challenge for genotype calling. I describe an iterative method that estimates genotype frequencies and errors along sequences to allow for accurate genotype calling from ancient sequences. The implementation of this method, called snpAD, performs well on high-coverage ancient data, as shown by simulations and by subsampling the data of a high-coverage Neandertal genome. Although estimates for low-coverage genomes are less accurate, I am able to derive approximate estimates of heterozygosity from several low-coverage Neandertals. These estimates show that low heterozygosity, compared to modern humans, was common among Neandertals. The C++ code of snpAD is freely available at http://bioinf.eva.mpg.de/snpAD/. Supplementary data are available at Bioinformatics online.
64 x 64 thresholding photodetector array for optical pattern recognition
NASA Astrophysics Data System (ADS)
Langenbacher, Harry; Chao, Tien-Hsin; Shaw, Timothy; Yu, Jeffrey W.
1993-10-01
A high-performance 32 X 32 peak detector array is introduced. This detector consists of a 32 X 32 array of thresholding photo-transistor cells, manufactured with a standard MOSIS digital 2-micron CMOS process. A built-in thresholding function that is able to perform 1024 thresholding operations in parallel strongly distinguishes this chip from available CCD detectors. This high-speed detector offers response times of 1 to 10 milliseconds, much faster than commercially available CCD detectors operating at a TV frame rate. The parallel multiple-peak thresholding detection capability makes it particularly suitable for optical correlators and optoelectronically implemented neural networks. The principle of operation, circuit design, and performance characteristics are described. An experimental demonstration of correlation peak detection is also provided. Recently, we have also designed and built an advanced version, a 64 X 64 thresholding photodetector array chip. Experimental investigation of using this chip for pattern recognition is ongoing.
A global × global test for testing associations between two large sets of variables.
Chaturvedi, Nimisha; de Menezes, Renée X; Goeman, Jelle J
2017-01-01
In high-dimensional omics studies where multiple molecular profiles are obtained for each set of patients, there is often interest in identifying complex multivariate associations, for example, copy number regulated expression levels in a certain pathway or in a genomic region. To detect such associations, we present a novel approach to test for association between two sets of variables. Our approach generalizes the global test, which tests for association between a group of covariates and a single univariate response, to allow a high-dimensional multivariate response. We apply the method to several simulated datasets as well as two publicly available datasets, where we compare the performance of the multivariate global test (G2) with the univariate global test. The method is implemented in R and will be available as part of the globaltest package in R. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
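The flavor of such a test can be sketched generically. The code below is a hedged illustration only: it uses a sum of squared cross-covariances with a permutation null, not the exact G2 statistic or the score-test machinery of the globaltest package, to ask whether two variable sets measured on the same samples are associated:

```python
# Permutation test for association between two variable sets X (n x p)
# and Y (n x q) measured on the same n samples. Illustrative, not G2.
import numpy as np

def cross_assoc_test(X, Y, n_perm=999, seed=0):
    rng = np.random.default_rng(seed)
    Xc = X - X.mean(0)
    Yc = Y - Y.mean(0)
    def stat(Yp):
        C = Xc.T @ Yp / len(Xc)     # p x q cross-covariance matrix
        return (C ** 2).sum()       # overall association strength
    obs = stat(Yc)
    # Permuting sample labels of Y breaks any X-Y association
    null = [stat(Yc[rng.permutation(len(Yc))]) for _ in range(n_perm)]
    p = (1 + sum(s >= obs for s in null)) / (n_perm + 1)
    return obs, p
```

Permutation keeps the within-set correlation structure of X and of Y intact while destroying only the between-set association, which is the quantity under test.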
GeoSTAR - A Synthetic Aperture Microwave Sounder for Geostationary Missions
NASA Technical Reports Server (NTRS)
Lambrigtsen, Bjorn; Wilson, William; Tanner, Alan; Kangaslahti, Pekka
2004-01-01
The Geostationary Synthetic Thinned Aperture Radiometer (GeoSTAR) is a new microwave atmospheric sounder under development. It will bring capabilities similar to those now available on low-earth orbiting environmental satellites to geostationary orbit, where such capabilities have not been available. GeoSTAR will synthesize the multi-meter aperture needed to achieve the required spatial resolution, which will overcome the obstacle that has prevented a GEO microwave sounder from being implemented until now. The synthetic aperture approach has until recently not been feasible, due to the high power needed to operate the on-board high-speed massively parallel processing system required for 2D-synthesis, as well as a number of system and calibration obstacles. The development effort under way at JPL, with important contributions from the Goddard Space Flight Center and the University of Michigan, is intended to demonstrate the measurement concept and retire much of the technology risk.
Cognitive ergonomics of operational tools
NASA Astrophysics Data System (ADS)
Lüdeke, A.
2012-10-01
Control systems have become increasingly more powerful over the past decades. The availability of high data throughput and sophisticated graphical interactions has opened a variety of new possibilities. But has this helped to provide intuitive, easy to use applications to simplify the operation of modern large scale accelerator facilities? We will discuss what makes an application useful to operation and what is necessary to make a tool easy to use. We will show that even the implementation of a small number of simple application design rules can help to create ergonomic operational tools. The author is convinced that such tools do indeed help to achieve higher beam availability and better beam performance at accelerator facilities.
HTML5 PivotViewer: high-throughput visualization and querying of image data on the web
Taylor, Stephen; Noble, Roger
2014-01-01
Motivation: Visualization and analysis of large numbers of biological images have created a bottleneck in research. We present HTML5 PivotViewer, a novel, open source, platform-independent viewer making use of the latest web technologies that allows seamless access to images and associated metadata for each image. This provides a powerful method to allow end users to mine their data. Availability and implementation: Documentation, examples and links to the software are available from http://www.cbrg.ox.ac.uk/data/pivotviewer/. The software is licensed under GPLv2. Contact: stephen.taylor@imm.ox.ac.uk and roger@coritsu.com PMID:24849578
Sustainability via Active Garden Education (SAGE): results from two feasibility pilot studies.
Lee, Rebecca E; Parker, Nathan H; Soltero, Erica G; Ledoux, Tracey A; Mama, Scherezade K; McNeill, Lorna
2017-03-10
Low physical activity (PA) and fruit and vegetable (F&V) consumption in early childhood are continued public health challenges. This manuscript describes outcomes from two pilot studies for Sustainability via Active Garden Education (SAGE), a program designed to increase PA and F&V consumption among 3 to 5 year old children. SAGE was developed using community-based participatory research (CBPR) and delivered to children (N = 89) in early care and education centers (ECEC, N = 6) in two US cities. Children participated in 12 one-hour sessions that included songs, games, and interactive learning activities involving garden maintenance and taste tests. We evaluated reach, efficacy, adoption, implementation, and potential for maintenance of SAGE following the RE-AIM framework. Reach was evaluated by comparing demographic characteristics among SAGE participants and residents of target geographic areas. Efficacy was evaluated with accelerometer-measured PA, F&V consumption, and eating in the absence of hunger among children, parenting practices regarding PA, and home availability of F&V. Adoption was evaluated by the number of ECEC that participated relative to the number of ECEC that were recruited. Implementation was evaluated by completion rates of planned SAGE lessons and activities, and potential for maintenance was evaluated with a parent satisfaction survey. SAGE reached ECEC in neighborhoods representing a wide range of socioeconomic status, with participants' sociodemographic characteristics representing those of the intervention areas. Children significantly increased PA during SAGE lessons compared to usual lessons, but they also consumed more calories in the absence of hunger in post- vs. pre-intervention tests (both p < .05). Parent reports did not suggest changes in F&V consumption, parenting PA practices, or home F&V availability, possibly due to low parent engagement. ECEC had moderate-to-high implementation of SAGE lessons and curriculum. 
Potential for maintenance was strong, with parents rating SAGE favorably and reporting increases in knowledge about PA and nutrition guidelines for young children. SAGE successfully translated national PA guidelines to practice for young children but was less successful with nutrition guidelines. High adoption and implementation and favorable parent reports suggest high potential for program sustainability. Further work to engage parents and families of young children in ECEC-based PA and nutrition programming is needed.
Technological trends in health care: electronic health record.
Abraham, Sam
2010-01-01
The most relevant technological trend affecting health care organizations and physician services is the electronic health record (EHR). Billions of dollars from the federal government stimulus bill are available for investment toward EHR. Based on the government directives, it is evident EHR has to be a high-priority technological intervention in health care organizations. Addressed in the following pages are the effects of the EHR trend on financial and human resources; analysis of advantages and disadvantages of EHR; action steps involved in implementing EHR, and a timeline for implementation. Medical facilities that do not meet the timetable for using EHR will likely experience reduction of Medicare payments. This article also identifies the strengths, weaknesses, opportunities, and threats of the EHR and steps to be taken by hospitals and physician medical groups to receive stimulus payment.
ATK-ForceField: a new generation molecular dynamics software package
NASA Astrophysics Data System (ADS)
Schneider, Julian; Hamaekers, Jan; Chill, Samuel T.; Smidstrup, Søren; Bulin, Johannes; Thesen, Ralph; Blom, Anders; Stokbro, Kurt
2017-12-01
ATK-ForceField is a software package for atomistic simulations using classical interatomic potentials. It is implemented as a part of the Atomistix ToolKit (ATK), which is a Python programming environment that makes it easy to create and analyze both standard and highly customized simulations. This paper will focus on the atomic interaction potentials, molecular dynamics, and geometry optimization features of the software; however, many more advanced modeling features are available. The implementation details of these algorithms and their computational performance will be shown. We present three illustrative examples of the types of calculations that are possible with ATK-ForceField: modeling thermal transport properties in a silicon germanium crystal, vapor deposition of selenium molecules on a selenium surface, and a simulation of creep in a copper polycrystal.
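As a generic illustration of the molecular dynamics machinery such a package provides (this sketch is not ATK's Python API; the names and the bare Lennard-Jones potential are illustrative), a minimal velocity-Verlet integrator looks like:

```python
# Minimal velocity-Verlet MD with a Lennard-Jones pair potential.
# No cutoff or periodic boundaries; reduced units throughout.
import numpy as np

def lj_forces(pos, eps=1.0, sigma=1.0):
    """Pairwise Lennard-Jones forces and total potential energy."""
    n = len(pos)
    forces = np.zeros_like(pos)
    energy = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            d = pos[i] - pos[j]
            r2 = d @ d
            inv6 = (sigma**2 / r2) ** 3
            energy += 4 * eps * (inv6**2 - inv6)
            # F = -dU/dr along the pair vector
            fmag = 24 * eps * (2 * inv6**2 - inv6) / r2
            forces[i] += fmag * d
            forces[j] -= fmag * d
    return forces, energy

def velocity_verlet(pos, vel, mass, dt, steps):
    """Symplectic integration: half-kick, drift, recompute forces, half-kick."""
    f, _ = lj_forces(pos)
    for _ in range(steps):
        vel += 0.5 * dt * f / mass
        pos += dt * vel
        f, _ = lj_forces(pos)
        vel += 0.5 * dt * f / mass
    return pos, vel
```

A standard sanity check for such an integrator is energy conservation: releasing two atoms from rest and stepping with a small time step should keep kinetic plus potential energy essentially constant.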
Stephens, Susie M.; Chen, Jake Y.; Davidson, Marcel G.; Thomas, Shiby; Trute, Barry M.
2005-01-01
As database management systems expand their array of analytical functionality, they become powerful research engines for biomedical data analysis and drug discovery. Databases can hold most of the data types commonly required in life sciences and consequently can be used as flexible platforms for the implementation of knowledgebases. Performing data analysis in the database simplifies data management by minimizing the movement of data from disks to memory, allowing pre-filtering and post-processing of datasets, and enabling data to remain in a secure, highly available environment. This article describes the Oracle Database 10g implementation of BLAST and Regular Expression Searches and provides case studies of their usage in bioinformatics. http://www.oracle.com/technology/software/index.html PMID:15608287
A security architecture for health information networks.
Kailar, Rajashekar; Muralidhar, Vinod
2007-10-11
Health information network security needs to balance exacting security controls with practicality, and ease of implementation in today's healthcare enterprise. Recent work on 'nationwide health information network' architectures has sought to share highly confidential data over insecure networks such as the Internet. Using basic patterns of health network data flow and trust models to support secure communication between network nodes, we abstract network security requirements to a core set to enable secure inter-network data sharing. We propose a minimum set of security controls that can be implemented without needing major new technologies, but yet realize network security and privacy goals of confidentiality, integrity and availability. This framework combines a set of technology mechanisms with environmental controls, and is shown to be sufficient to counter commonly encountered network security threats adequately.
A New Design for Airway Management Training with Mixed Reality and High Fidelity Modeling.
Shen, Yunhe; Hananel, David; Zhao, Zichen; Burke, Daniel; Ballas, Crist; Norfleet, Jack; Reihsen, Troy; Sweet, Robert
2016-01-01
Restoring airway function is a vital task in many medical scenarios. Although various simulation tools have been available for learning such skills, recent research indicated that fidelity in simulating airway management deserves further improvement. In this study, we designed and implemented a new prototype for practicing relevant tasks including laryngoscopy, intubation, and cricothyrotomy. A large number of anatomical details and landmarks were meticulously selected and reconstructed from medical scans, then 3D-printed or molded into the airway intervention model. This training model was augmented by virtually and physically presented interactive modules, which are interoperable with motion tracking and sensor data feedback. Implementation results showed that this design is a feasible approach to developing higher-fidelity airway models that can be integrated with mixed reality interfaces.
One-way quantum computing in superconducting circuits
NASA Astrophysics Data System (ADS)
Albarrán-Arriagada, F.; Alvarado Barrios, G.; Sanz, M.; Romero, G.; Lamata, L.; Retamal, J. C.; Solano, E.
2018-03-01
We propose a method for the implementation of one-way quantum computing in superconducting circuits. Measurement-based quantum computing is a universal quantum computation paradigm in which an initial cluster state provides the quantum resource, while the iteration of sequential measurements and local rotations encodes the quantum algorithm. Up to now, technical constraints have limited a scalable approach to this quantum computing alternative. The initial cluster state can be generated with available controlled-phase gates, while the quantum algorithm makes use of high-fidelity readout and coherent feedforward. With current technology, we estimate that quantum algorithms with above 20 qubits may be implemented in the path toward quantum supremacy. Moreover, we propose an alternative initial state with properties of maximal persistence and maximal connectedness, reducing the required resources of one-way quantum computing protocols.
Tuffaha, Haitham W; Roberts, Shelley; Chaboyer, Wendy; Gordon, Louisa G; Scuffham, Paul A
2015-04-01
Pressure ulcers are a major cause of mortality, morbidity, and increased healthcare cost. Nutritional support may reduce the incidence of pressure ulcers in hospitalised patients who are at risk of pressure ulcer and malnutrition. To evaluate the cost-effectiveness of nutritional support in preventing pressure ulcers in high-risk hospitalised patients, and to assess the value of further research to inform the decision to implement this intervention using value of information analysis (VOI). The analysis was from the perspective of Queensland Health, Australia using a decision model with evidence derived from a systematic review and meta-analysis. Resources were valued using 2014 prices and the time horizon of the analysis was one year. Monte Carlo simulation was used to estimate net monetary benefits (NB) and to calculate VOI measures. Compared with standard hospital diet, nutritional support was cost saving at AU$425 per patient, and more effective with an average 0.005 quality-adjusted life years (QALY) gained. At a willingness-to-pay of AU$50,000 per QALY, the incremental NB was AU$675 per patient, with a probability of 87 % that nutritional support is cost-effective. The expected value of perfect information was AU$5 million and the expected value of perfect parameter information was highest for the relative risk of developing a pressure ulcer at AU$2.5 million. For a future trial investigating the relative effectiveness of the interventions, the expected net benefit of research would be maximised at AU$100,000 with 1,200 patients in each arm if nutritional support was perfectly implemented. The opportunity cost of withholding the decision to implement the intervention until the results of the future study are available would be AU$14 million. Nutritional support is cost-effective in preventing pressure ulcers in high-risk hospitalised patients compared with standard diet. 
Future research to reduce decision uncertainty is worthwhile; however, given the opportunity losses associated with delaying the implementation, "implement and research" is the approach recommended for this intervention.
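The net-benefit arithmetic reported in the abstract can be reproduced directly, and the probabilistic part of the analysis sketched. The distributions below are illustrative assumptions for the sketch, not the study's fitted inputs:

```python
# Incremental net monetary benefit (INMB) and a probabilistic sensitivity
# analysis in the style of the study. Input distributions are illustrative.
import numpy as np

WTP = 50_000  # willingness to pay per QALY (AU$)

def inmb(delta_qaly, delta_cost):
    """Value of QALYs gained minus the extra cost of the intervention."""
    return WTP * delta_qaly - delta_cost

# Point estimates from the abstract: 0.005 QALY gained, AU$425 saved
# (a negative incremental cost), giving INMB = AU$675 per patient.
assert abs(inmb(0.005, -425) - 675) < 1e-9

# Monte Carlo simulation: sample uncertain inputs and report the
# probability that the intervention is cost-effective (INMB > 0).
rng = np.random.default_rng(0)
d_qaly = rng.normal(0.005, 0.004, 10_000)
d_cost = rng.normal(-425, 400, 10_000)
prob_ce = (inmb(d_qaly, d_cost) > 0).mean()
```

With these assumed spreads the simulated probability of cost-effectiveness lands in the same region as the 87% the study reports; value-of-information measures are then computed from the same simulated net-benefit draws.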
Implementation and test of a coastal forecasting system for wind waves in the Mediterranean Sea
NASA Astrophysics Data System (ADS)
Inghilesi, R.; Catini, F.; Orasi, A.; Corsini, S.
2010-09-01
A coastal forecasting system has been implemented in order to provide coverage of the whole Mediterranean Sea as well as of several enclosed coastal areas. The problem is to achieve a good definition of the small scale coastal processes which affect the propagation of waves toward the shores while retaining the possibility of selecting any of the possible coastal areas in the whole Mediterranean Sea. The system is built on a very high resolution parallel implementation of the WAM and SWAN models, one-way chain-nested in key areas. The system will shortly be part of the ISPRA SIMM forecasting system which has been operative since 2001. The SIMM system makes available the high resolution wind fields (0.1 × 0.1 deg) used in the coastal system. The coastal system is being tested on several Italian coastal areas (Ligurian Sea, Lower Tyrrhenian Sea, Sicily Channel, Lower Adriatic Sea) in order to optimise the numerics of the coastal processes and to verify the results in shallow waters and complex bathymetries. The results of the comparison between hindcast and buoy data in very shallow (14 m depth) and deep (150 m depth) water will be shown for several episodes in the upper Tyrrhenian Sea.
Accuracy of Genomic Prediction in a Commercial Perennial Ryegrass Breeding Program.
Fè, Dario; Ashraf, Bilal H; Pedersen, Morten G; Janss, Luc; Byrne, Stephen; Roulund, Niels; Lenk, Ingo; Didion, Thomas; Asp, Torben; Jensen, Christian S; Jensen, Just
2016-11-01
The implementation of genomic selection (GS) in plant breeding, so far, has been mainly evaluated in crops farmed as homogeneous varieties, and the results have been generally positive. Fewer results are available for species, such as forage grasses, that are grown as heterogeneous families (developed from multiparent crosses) in which the control of the genetic variation is far more complex. Here we test the potential for implementing GS in the breeding of perennial ryegrass (Lolium perenne L.) using empirical data from a commercial forage breeding program. Biparental F and multiparental synthetic (SYN) families of diploid perennial ryegrass were genotyped using genotyping-by-sequencing, and phenotypes for five different traits were analyzed. Genotypes were expressed as family allele frequencies, and phenotypes were recorded as family means. Different models for genomic prediction were compared by using practically relevant cross-validation strategies. All traits showed a highly significant level of genetic variance, which could be traced using the genotyping assay. While there was significant genotype × environment (G × E) interaction for some traits, accuracies were high among F families and between biparental F and multiparental SYN families. We have demonstrated that the implementation of GS in grass breeding is now possible and presents an opportunity to make significant gains for various traits. Copyright © 2016 Crop Science Society of America.
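The prediction step in such a program can be sketched with plain ridge regression on family-mean data, a generic stand-in rather than the specific models compared in the study; the names and simulated dimensions are illustrative:

```python
# Ridge-regression sketch of genomic prediction: rows are families,
# columns are marker allele frequencies, y holds family mean phenotypes.
import numpy as np

def ridge_predict(G_train, y_train, G_test, lam=1.0):
    """Fit marker effects with an L2 penalty, then predict new families."""
    mu_G = G_train.mean(0)
    Gc = G_train - mu_G                 # center markers
    yc = y_train - y_train.mean()       # center phenotypes
    p = Gc.shape[1]
    beta = np.linalg.solve(Gc.T @ Gc + lam * np.eye(p), Gc.T @ yc)
    return y_train.mean() + (G_test - mu_G) @ beta
```

Prediction accuracy is then assessed, as in the study's cross-validation, by correlating predicted and observed phenotypes of families held out of training.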
Implementation of a modular software system for multiphysical processes in porous media
NASA Astrophysics Data System (ADS)
Naumov, Dmitri; Watanabe, Norihiro; Bilke, Lars; Fischer, Thomas; Lehmann, Christoph; Rink, Karsten; Walther, Marc; Wang, Wenqing; Kolditz, Olaf
2016-04-01
Subsurface georeservoirs are a candidate technology for large scale energy storage required as part of the transition to renewable energy sources. The increased use of the subsurface results in competing interests and possible impacts on protected entities. To optimize and plan the use of the subsurface in large scale scenario analyses, powerful numerical frameworks are required that aid process understanding and can capture the coupled thermal (T), hydraulic (H), mechanical (M), and chemical (C) processes with high computational efficiency. Because of the multitude of different couplings between the basic T, H, M, and C processes, and the necessity to implement new numerical schemes, the development focus has moved to the software's modularity. The decreased coupling between the components results in two major advantages: easier addition of specialized processes and improvement of the code's testability and therefore its quality. The idea of modularization is implemented on several levels, in addition to library-based separation of the previous code version, by using generalized algorithms available in the Standard Template Library and the Boost library, relying on efficient implementations of linear algebra solvers, using concepts when designing new types, and localizing frequently accessed data structures. This procedure shows certain benefits for a flexible high-performance framework applied to the analysis of multipurpose georeservoirs.
Implementation and control of a 3 degree-of-freedom, force-reflecting manual controller
NASA Astrophysics Data System (ADS)
Kim, Whee-Kuk; Bevill, Pat; Tesar, Delbert
1991-02-01
Most available manual controllers which are used in bilateral or force-reflecting teleoperator systems can be characterized by their bulky size, heavy weight, high cost, low magnitude of reflecting force, lack of smoothness, insufficient transparency, and simplified architectures. A compact, smooth, lightweight, portable, universal manual controller could provide a markedly improved level of transparency and be able to drive a broad spectrum of slave manipulators. This implies that a single stand-off position could be used for a diverse population of remote systems and that a standard environment for training of operators would result in reduced costs and higher reliability. In the implementation presented in this paper, a parallel 3 degree-of-freedom (DOF) spherical structure (for compactness and reduced weight) is combined with high gear-ratio reducers using a force control algorithm to produce a "power steering" effect for enhanced smoothness and transparency. The force control algorithm has the further benefit of minimizing the effect of the system friction and non-linear inertia forces. The fundamental analytical descriptions for the spherical force-reflecting manual controller, such as the forward position analysis, reflecting-force transformation, and applied force control algorithm, are presented. Also presented are a brief description of the system integration, its actual implementation, and preliminary test results.
Spacewire on Earth orbiting scatterometers
NASA Technical Reports Server (NTRS)
Bachmann, Alex; Lang, Minh; Lux, James; Steffke, Richard
2002-01-01
The need for a high-speed, reliable, and easy-to-implement communication link has led to the development of a space-flight-oriented version of IEEE 1355 called SpaceWire. SpaceWire is based on high-speed (200 Mbps) serial point-to-point links using Low Voltage Differential Signaling (LVDS). SpaceWire has provisions for routing messages between a large network of processors, using wormhole routing for low overhead and latency. Additionally, space-qualified hybrids are available that provide the link layer to the user's bus. A test bed of multiple digital signal processor breadboards, demonstrating the ability to meet signal processing requirements for an orbiting scatterometer, has been implemented using three Astrium MCM-DSPs; each breadboard consists of a Multi Chip Module (MCM) that combines a space-qualified Digital Signal Processor and peripherals, including IEEE-1355 links. With the addition of appropriate physical layer interfaces and software on the DSP, the SpaceWire link is used to communicate between processors on the test bed, e.g. sending timing references, commands, status, and science data among the processors. Results are presented on development issues surrounding the use of SpaceWire in this environment, from physical layer implementation (cables, connectors, LVDS drivers) to diagnostic tools, driver firmware, and development methodology. The tools, methods, hardware and software challenges, and preliminary performance are investigated and discussed.
Kawamoto, Kensaku; Martin, Cary J; Williams, Kip; Tu, Ming-Chieh; Park, Charlton G; Hunter, Cheri; Staes, Catherine J; Bray, Bruce E; Deshmukh, Vikrant G; Holbrook, Reid A; Morris, Scott J; Fedderson, Matthew B; Sletta, Amy; Turnbull, James; Mulvihill, Sean J; Crabtree, Gordon L; Entwistle, David E; McKenna, Quinn L; Strong, Michael B; Pendleton, Robert C; Lee, Vivian S
2015-01-01
Objective: To develop expeditiously a pragmatic, modular, and extensible software framework for understanding and improving healthcare value (costs relative to outcomes). Materials and methods: In 2012, a multidisciplinary team was assembled by the leadership of the University of Utah Health Sciences Center and charged with rapidly developing a pragmatic and actionable analytics framework for understanding and enhancing healthcare value. Based on an analysis of relevant prior work, a value analytics framework known as Value Driven Outcomes (VDO) was developed using an agile methodology. Evaluation consisted of measurement against project objectives, including implementation timeliness, system performance, completeness, accuracy, extensibility, adoption, satisfaction, and the ability to support value improvement. Results: A modular, extensible framework was developed to allocate clinical care costs to individual patient encounters. For example, labor costs in a hospital unit are allocated to patients based on the hours they spent in the unit; actual medication acquisition costs are allocated to patients based on utilization; and radiology costs are allocated based on the minutes required for study performance. Relevant process and outcome measures are also available. A visualization layer facilitates the identification of value improvement opportunities, such as high-volume, high-cost case types with high variability in costs across providers. Initial implementation was completed within 6 months, and all project objectives were fulfilled. The framework has been improved iteratively and is now a foundational tool for delivering high-value care. Conclusions: The framework described can be expeditiously implemented to provide a pragmatic, modular, and extensible approach to understanding and improving healthcare value. PMID:25324556
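The allocation rule described in the Results can be sketched in a few lines. This is a hedged illustration of the idea, not VDO's implementation; the function name and figures are made up:

```python
# Spread a cost pool over patient encounters in proportion to a
# utilization driver (hours in a unit, minutes of scanner time, units used).
def allocate(pool_cost, utilization):
    """utilization: dict mapping encounter id -> driver amount."""
    total = sum(utilization.values())
    return {enc: pool_cost * amt / total for enc, amt in utilization.items()}

# A hospital unit's labor cost pool split by patient-hours in the unit:
labor = allocate(120_000, {"enc1": 48, "enc2": 24, "enc3": 8})
```

The same proportional rule applies with different drivers per cost pool (hours for labor, minutes for radiology), and the allocations sum back to the pool by construction.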
Schmutz, Sven; Sonderegger, Andreas; Sauer, Juergen
2016-06-01
We examined the consequences of implementing Web accessibility guidelines for nondisabled users. Although there are Web accessibility guidelines for people with disabilities available, they are rarely used in practice, partly due to the fact that practitioners believe that such guidelines provide no benefits, or even have negative consequences, for nondisabled people, who represent the main user group of Web sites. Despite these concerns, there is a lack of empirical research on the effects of current Web accessibility guidelines on nondisabled users. Sixty-one nondisabled participants used one of three Web sites differing in levels of accessibility (high, low, and very low). Accessibility levels were determined by following established Web accessibility guidelines (WCAG 2.0). A broad methodological approach was used, including performance measures (e.g., task completion time) and user ratings (e.g., perceived usability). A high level of Web accessibility led to better performance (i.e., task completion time and task completion rate) than low or very low accessibility. Likewise, high Web accessibility improved user ratings (i.e., perceived usability, aesthetics, workload, and trustworthiness) compared to low or very low Web accessibility. There was no difference between the very low and low Web accessibility conditions for any of the outcome measures. Contrary to some concerns in the literature and among practitioners, high conformance with Web accessibility guidelines may provide benefits to users without disabilities. The findings may encourage more practitioners to implement WCAG 2.0 for the benefit of users with disabilities and nondisabled users. © 2016, Human Factors and Ergonomics Society.
NASA Astrophysics Data System (ADS)
Rey, David M.
Energy and water are connected through the water-use cycle (e.g. obtaining, transporting, and treating water) and thermoelectric energy generation, which converts heat to electricity via steam-driven turbines. As the United States implements more renewable energy technologies, quantifying the relationships between energy, water, and land-surface impacts of these implementations will provide policy makers the strengths and weaknesses of different renewable energy options. In this study, a MODFLOW model of the Indian Wells Valley (IWV), in California, was developed to capture the water, energy, and land-surface impacts of potential proposed 1) solar, 2) wind, and 3) biofuel implementations. The model was calibrated to pre-existing groundwater head data from 1985 to present to develop a baseline model before running two-year predictive scenarios for photovoltaic (PV), concentrating solar power (CSP), wind, and biofuel implementations. Additionally, the baseline model was perturbed by decreasing mountain front recharge values by 5%, 10%, and 15%, simulating potential future system perturbations under a changing climate. These potential future conditions were used to re-run each implementation scenario. Implementation scenarios were developed based on population, typical energy use per person, existing land-use and land-cover type within the IWV, and previously published values for water use, surface-area use, and energy-generation potential for each renewable fuel type. The results indicate that the quantity of water needed, localized drawdown from pumping water to meet implementation demands, and generation efficiency are strongly controlled by the fuel type, as well as the energy generating technology and thermoelectric technologies implemented. 
Specifically, PV and wind-turbine (WT) implementations required less than 1% of the estimated annual aquifer recharge, while technologies such as biofuels and CSP, which rely on thermoelectric generation, ranged from 3% to 20%. As modeled groundwater elevations declined in the IWV, the net generation (i.e., energy produced minus energy used) of each renewable energy implementation decreased due to a higher energy cost for pumping groundwater. The loss in efficiency was minimal for PV and wind solutions, with maximum changes in the drawdown being less than 10 m; however, for CSP and biofuel implementations drawdowns over 50 m were observed at the pumping well, resulting in electrical generation efficiency losses between 4% and 50% over a two-year period. It was concluded that PV would be the best balance between water and land-use for the IWV, or other groundwater dependent Basin and Range settings. In areas with limited water resources but abundant available land for implementation, WT solutions would have the smallest hydrologic impact. The impact of renewable scenarios was highly variable across and within differing fuel types, with the potential for larger negative impacts under a changing climate in areas with no perennial surface water.
Nepveux, Kevin; Sherlock, Jon-Paul; Futran, Mauricio; Thien, Michael; Krumme, Markus
2015-03-01
Continuous manufacturing (CM) is a process technology that has been used with benefit in the chemical industry for many years for large-scale mass production of chemicals in single-purpose plants. Recent interest has arisen in expanding CM into the low-volume, high-value pharmaceutical business, with its unique requirements regarding readiness for human use and the quality, supply chain, and liability constraints of this business context. Using a fairly abstract set of definitions, this paper derives the technical consequences of CM in different scenarios along the development-launch-supply axis in different business models and how they compare to batch processes. The impact of CM on development functions is discussed, several operational models suitable for originators and other business models are presented, and specific aspects of CM are deduced from its technical characteristics. Organizational structures of current operations can typically support CM implementations with just minor refinements if the CM technology is limited to single steps or small sequences (bin-to-bin approach) and if the appropriate technical skill set is available. In such cases, a small, dedicated group focused on CM is recommended. The manufacturing strategy, centralized versus decentralized in light of CM processes, is discussed, as is the potential impact of significantly shortened supply lead times on the organization that runs these processes. The ultimate CM implementation may be seen by some as a totally integrated, monolithic plant that unifies chemical and pharmaceutical operations. The organization supporting this approach will have to reflect this change in scope and responsibility. The other extreme, admittedly futuristic at this point, would be a highly decentralized approach with multiple smaller hubs; this would require a new and different organizational structure.
This processing approach would open up new opportunities for products that, because of stability constraints or individualization to patients, do not allow centralized manufacturing approaches at all. Again, the entire enterprise needs to be restructured accordingly. The situation of CM in an outsourced operation business model is discussed. Next steps for the industry are recommended. In summary, opportunistic implementation of isolated steps in existing portfolios can be implemented with minimal organizational changes; the availability of the appropriate skills is the determining factor. The implementation of more substantial sequences requires business processes that consider the portfolio, not just single products. Exploration and implementation of complete process chains with consequences for quality decisions do require appropriate organizational support. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association.
Van de Velde, Stijn; Roshanov, Pavel; Kortteisto, Tiina; Kunnamo, Ilkka; Aertgeerts, Bert; Vandvik, Per Olav; Flottorp, Signe
2016-03-05
A computerised clinical decision support system (CCDSS) is a technology that uses patient-specific data to provide relevant medical knowledge at the point of care. It is considered to be an important quality improvement intervention, and the implementation of CCDSS is growing substantially. However, the significant investments do not consistently result in value for money due to content, context, system and implementation issues. The Guideline Implementation with Decision Support (GUIDES) project aims to improve the impact of CCDSS through optimised implementation based on high-quality evidence-based recommendations. To achieve this, we will develop tools that address the factors that determine successful CCDSS implementation. We will develop the GUIDES tools in four steps, using the methods and results of the Tailored Implementation for Chronic Diseases (TICD) project as a starting point: (1) a review of research evidence and frameworks on the determinants of implementing recommendations using CCDSS; (2) a synthesis of a comprehensive framework for the identified determinants; (3) the development of tools for use of the framework and (4) pilot testing the utility of the tools through the development of a tailored CCDSS intervention in Norway, Belgium and Finland. We selected the conservative management of knee osteoarthritis as a prototype condition for the pilot. During the process, the authors will collaborate with an international expert group to provide input and feedback on the tools. This project will provide guidance and tools on methods of identifying implementation determinants and selecting strategies to implement evidence-based recommendations through CCDSS. We will make the GUIDES tools available to CCDSS developers, implementers, researchers, funders, clinicians, managers, educators, and policymakers internationally. The tools and recommendations will be generic, which makes them scalable to a large spectrum of conditions. 
Ultimately, the better implementation of CCDSS may lead to better-informed decisions and improved care and patient outcomes for a wide range of conditions. PROSPERO, CRD42016033738.
Implementation of the Arsenic Biosand Filter in Nepal
NASA Astrophysics Data System (ADS)
Murcott, S.; Ngai, T.; Shrestha, R.; Pokharel, K.; Walewijk, S.
2004-05-01
A low-cost, household-scale drinking water filter, the Arsenic Biosand Filter (ABF), appropriate for rural Nepal, was developed by researchers at Massachusetts Institute of Technology and two local partners (ENPHO and RWSSSP) to simultaneously remove arsenic and pathogens from tubewell water. The project implementation site is the Terai region of southern Nepal, where about 90% of people receive water from tubewells and where about 25+% and 40+% of tubewells are contaminated with arsenic (naturally-occurring) and coliforms (from human and animal sources) respectively, causing severe health consequences such as cancers and gastrointestinal illnesses. Despite growing recognition of the immediacy of the arsenic crisis in this region, many previous arsenic technology projects have failed. This is because many of the available technologies have serious drawbacks, including complex production methods, high maintenance, high costs, insufficient filtration rate, and/or reliance on materials unavailable in remote villages. In addition, most technologies treat arsenic and pathogens independently, resulting in complicated treatment operations. Implementation deficiencies including ineffective technology transfer, confusing NGO responsibilities, organizational non-sustainability, lack of user education and contribution, and inadequate long-term maintenance and monitoring capacity are other major problems. The ABF design is optimized based on the socio-economic conditions of rural Terai and is constructed using locally available labor and materials. It was the only arsenic remediation technology to win the prestigious World Bank Development Marketplace Competition in 2003. Funding from this prize will provide start-up capital to pilot a technology transfer network. 
In 2004, the team established an in-country technology dissemination and implementation center and is building local capacity in arsenic-affected villages towards long-term, self-reliant, user-participatory safe water provision, involving training of local women, entrepreneurs, trainers, teachers, and local authorities. A laboratory study and a three-month pilot study conducted in Nepal from September 2002 to January 2003 found that the ABF removed arsenic (range = 87 to 96%, mean = 93%), total coliform (range = 0 to 99%, mean = 58%), E. coli (range = 0 to >99%, mean = 64%), and iron (range = >90 to >97%, mean = >93%). This presentation will report on the results of the 2004 ABF implementation program in 25 villages in Nepal, targeting an overall population of 10,000 people, and will discuss the ABF technology in the context of other similar low-cost, household-scale approaches to remediation of arsenic-contaminated groundwater.
Marshall, Christy L.; Petersen, Nancy J.; Naik, Aanand D.; Velde, Nancy Vander; Artinyan, Avo; Albo, Daniel; Berger, David H.
2014-01-01
Abstract Background: Tumor board (TB) conferences facilitate multidisciplinary cancer care and are associated with overall improved outcomes. Because of shortages of the oncology workforce and limited access to TB conferences, multidisciplinary care is not available at every institution. This pilot study assessed the feasibility and acceptance of using telemedicine to implement a virtual TB (VTB) program within a regional healthcare network. Materials and Methods: The VTB program was implemented through videoconference technology and electronic medical records between the Houston (TX) Veterans Affairs Medical Center (VAMC) (referral center) and the New Orleans (LA) VAMC (referring center). Feasibility was assessed as the proportion of completed VTB encounters, rate of technological failures/mishaps, and presentation duration. Validated surveys for confidence and satisfaction were administered to 36 TB participants to assess acceptance (1–5 point Likert scale). Secondary outcomes included preliminary data on VTB utilization and its effectiveness in providing access to quality cancer care within the region. Results: Ninety TB case presentations occurred during the study period, of which 14 (15%) were VTB cases. Although one VTB encounter had a technical mishap during presentation, all scheduled encounters were completed (100% completion rate). Case presentations took longer for VTB than for regular TB cases (p=0.0004). However, VTB was highly accepted with mean scores for satisfaction and confidence of 4.6. Utilization rate of VTB was 75%, and its effectiveness was equivalent to that observed for non-VTB cases. Conclusions: Implementation of VTB is feasible and highly accepted by its participants. Future studies should focus on widespread implementation and validating the effectiveness of this model. PMID:24845366
Yoo, Sooyoung; Kim, Seok; Kim, Taegi; Kim, Jon Soo; Baek, Rong-Min; Suh, Chang Suk; Chung, Chin Youb; Hwang, Hee
2012-12-01
The cloud computing-based virtual desktop infrastructure (VDI) allows access to computing environments with no limitations in terms of time or place, such that it can permit the rapid establishment of a mobile hospital environment. The objective of this study was to investigate the empirical issues to be considered when establishing a virtual mobile environment using VDI technology in a hospital setting and to examine the utility of the technology with an Apple iPad during a physician's rounds as a case study. Empirical implementation issues were derived from a 910-bed tertiary national university hospital that recently launched a VDI system. During the physicians' rounds, we surveyed patient satisfaction levels with the VDI-based mobile consultation service with the iPad and the relationship between these levels of satisfaction and hospital revisits, hospital recommendations, and the hospital brand image. Thirty-five inpatients (including their next-of-kin) and seven physicians participated in the survey. Implementation issues pertaining to the VDI system arose with regard to the high-availability system architecture, wireless network infrastructure, and screen resolution of the system. Other issues were related to privacy and security, mobile device management, and user education. When the system was used in rounds, patients and their next-of-kin expressed high satisfaction levels, and a positive relationship was noted as regards patients' decisions to revisit the hospital and whether the use of the VDI system improved the brand image of the hospital. Mobile hospital environments have the potential to benefit both physicians and patients. The issues related to the implementation of a VDI system discussed here should be examined in advance for its successful adoption and implementation.
Yoo, Sooyoung; Kim, Seok; Kim, Taegi; Kim, Jon Soo; Baek, Rong-Min; Suh, Chang Suk; Chung, Chin Youb
2012-01-01
Objectives The cloud computing-based virtual desktop infrastructure (VDI) allows access to computing environments with no limitations in terms of time or place, such that it can permit the rapid establishment of a mobile hospital environment. The objective of this study was to investigate the empirical issues to be considered when establishing a virtual mobile environment using VDI technology in a hospital setting and to examine the utility of the technology with an Apple iPad during a physician's rounds as a case study. Methods Empirical implementation issues were derived from a 910-bed tertiary national university hospital that recently launched a VDI system. During the physicians' rounds, we surveyed patient satisfaction levels with the VDI-based mobile consultation service with the iPad and the relationship between these levels of satisfaction and hospital revisits, hospital recommendations, and the hospital brand image. Thirty-five inpatients (including their next-of-kin) and seven physicians participated in the survey. Results Implementation issues pertaining to the VDI system arose with regard to the high-availability system architecture, wireless network infrastructure, and screen resolution of the system. Other issues were related to privacy and security, mobile device management, and user education. When the system was used in rounds, patients and their next-of-kin expressed high satisfaction levels, and a positive relationship was noted as regards patients' decisions to revisit the hospital and whether the use of the VDI system improved the brand image of the hospital. Conclusions Mobile hospital environments have the potential to benefit both physicians and patients. The issues related to the implementation of a VDI system discussed here should be examined in advance for its successful adoption and implementation. PMID:23346476
Spears-Lanoix, Erica C; McKyer, E Lisako J; Evans, Alexandra; McIntosh, William Alex; Ory, Marcia; Whittlesey, Lisa; Kirk, Alice; Hoelscher, Deanna M; Warren, Judith L
2015-12-01
The TEXAS! GROW! EAT! GO! (TGEG) randomized controlled trial is a 5-year study to measure the impact of a nutrition and gardening intervention and/or physical activity (PA) intervention on the weight status of third-grade students. This article describes the results of the pilot study to test the feasibility of the two interventions and the measures to be used in the main trial. The pilot study was conducted in one school with third-grade students and their parents or guardians. The Junior Master Gardener (JMG) and Walk Across Texas (WAT) interventions were implemented over a 5-month period in three third-grade classrooms during spring 2012. The respective interventions focused on improving healthy eating and PA behaviors of children and their families. Baseline and immediate post-test data were collected from students and parents/guardians to measure four child, four parent, and four parent-child interaction behaviors. Process data regarding implementation were also collected from teachers and school administration. Forty-four students and 34 parents or guardians provided both pre- and post-test data. Paired-sample t-tests showed statistically significant changes in student knowledge, vegetable preferences, vegetable consumption, and home food availability (all p < 0.05). At baseline, participants' weight status categories included 57% obese, 10% overweight, and 31% normal weight. Postintervention, weight status categories included 39% obese, 16% overweight, and 45% normal weight. Data collected from teachers indicated high levels of implementation fidelity. Implementation of both interventions occurred at a very high fidelity level, which led to positive changes in BMI status and several dietary and PA behaviors. Although the pilot study indicated feasibility of the two interventions for school implementation, results guided revisions to the TGEG program and its survey instruments.
PG4KDS: A Model for the Clinical Implementation of Pre-emptive Pharmacogenetics
Hoffman, James M.; Haidar, Cyrine E.; Wilkinson, Mark R.; Crews, Kristine R.; Baker, Donald K.; Kornegay, Nancy M.; Yang, Wenjian; Pui, Ching-Hon; Reiss, Ulrike M.; Gaur, Aditya H.; Howard, Scott C.; Evans, William E.; Broeckel, Ulrich; Relling, Mary V.
2014-01-01
Pharmacogenetics is frequently cited as an area for initial focus of the clinical implementation of genomics. Through the PG4KDS protocol, St. Jude Children’s Research Hospital pre-emptively genotypes patients for 230 genes using the Affymetrix Drug Metabolizing Enzymes and Transporters (DMET) Plus array supplemented with a CYP2D6 copy number assay. The PG4KDS protocol provides a rational, stepwise process for implementing gene/drug pairs, organizing data, and obtaining consent from patients and families. Through August 2013, 1559 patients have been enrolled, and 4 gene tests have been released into the electronic health record (EHR) for clinical implementation: TPMT, CYP2D6, SLCO1B1, and CYP2C19. These genes are coupled to 12 high-risk drugs. Of the 1016 patients with genotype test results available, 78% of them had at least one high-risk (i.e., actionable) genotype result placed in their EHR. Each diplotype result released to the EHR is coupled with an interpretive consult that is created in a concise, standardized format. To support gene-based prescribing at the point of care, 55 interruptive clinical decision support (CDS) alerts were developed. Patients are informed of their genotyping result and its relevance to their medication use through a letter. Key elements necessary for our successful implementation have included strong institutional support, a knowledgeable clinical laboratory, a process to manage any incidental findings, a strategy to educate clinicians and patients, a process to return results, and extensive use of informatics, especially CDS. Our approach to pre-emptive clinical pharmacogenetics has proven feasible, clinically useful, and scalable. PMID:24619595
PG4KDS: a model for the clinical implementation of pre-emptive pharmacogenetics.
Hoffman, James M; Haidar, Cyrine E; Wilkinson, Mark R; Crews, Kristine R; Baker, Donald K; Kornegay, Nancy M; Yang, Wenjian; Pui, Ching-Hon; Reiss, Ulrike M; Gaur, Aditya H; Howard, Scott C; Evans, William E; Broeckel, Ulrich; Relling, Mary V
2014-03-01
Pharmacogenetics is frequently cited as an area for initial focus of the clinical implementation of genomics. Through the PG4KDS protocol, St. Jude Children's Research Hospital pre-emptively genotypes patients for 230 genes using the Affymetrix Drug Metabolizing Enzymes and Transporters (DMET) Plus array supplemented with a CYP2D6 copy number assay. The PG4KDS protocol provides a rational, stepwise process for implementing gene/drug pairs, organizing data, and obtaining consent from patients and families. Through August 2013, 1,559 patients have been enrolled, and four gene tests have been released into the electronic health record (EHR) for clinical implementation: TPMT, CYP2D6, SLCO1B1, and CYP2C19. These genes are coupled to 12 high-risk drugs. Of the 1,016 patients with genotype test results available, 78% of them had at least one high-risk (i.e., actionable) genotype result placed in their EHR. Each diplotype result released to the EHR is coupled with an interpretive consult that is created in a concise, standardized format. To support gene-based prescribing at the point of care, 55 interruptive clinical decision support (CDS) alerts were developed. Patients are informed of their genotyping result and its relevance to their medication use through a letter. Key elements necessary for our successful implementation have included strong institutional support, a knowledgeable clinical laboratory, a process to manage any incidental findings, a strategy to educate clinicians and patients, a process to return results, and extensive use of informatics, especially CDS. Our approach to pre-emptive clinical pharmacogenetics has proven feasible, clinically useful, and scalable. © 2014 Wiley Periodicals, Inc.
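The genotype-to-alert pathway described above (diplotype result, interpretive phenotype, then an interruptive CDS alert for actionable genotypes) can be sketched as a simple lookup. The diplotype-to-phenotype table and alert wording below are illustrative assumptions covering a few well-known TPMT diplotypes; they are not the PG4KDS protocol's actual rules or alert text:

```python
# Hypothetical sketch of diplotype-driven CDS alerting. The table covers
# only a few illustrative TPMT diplotypes; real implementations use
# curated, much larger translation tables.
TPMT_PHENOTYPE = {
    ("*1", "*1"): "normal metabolizer",
    ("*1", "*3A"): "intermediate metabolizer",
    ("*3A", "*3A"): "poor metabolizer",
}
ACTIONABLE = {"intermediate metabolizer", "poor metabolizer"}

def cds_alert(diplotype, drug):
    """Return an interruptive alert message for actionable genotypes, else None."""
    phenotype = TPMT_PHENOTYPE.get(diplotype, "indeterminate")
    if phenotype in ACTIONABLE:
        return f"ALERT: TPMT {phenotype} - review {drug} dosing"
    return None

alert = cds_alert(("*1", "*3A"), "mercaptopurine")
```

The key design point mirrored here is that the alert fires only on actionable phenotypes, so routine results do not interrupt prescribing.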
Nasir, Muhammad; Attique Khan, Muhammad; Sharif, Muhammad; Lali, Ikram Ullah; Saba, Tanzila; Iqbal, Tassawar
2018-02-21
Melanoma is the deadliest type of skin cancer, with the highest mortality rate. However, removal at an early stage implies a high survival rate; it therefore demands early diagnosis. Conventional diagnosis methods are costly and cumbersome due to the involvement of experienced experts as well as the requirement for a highly equipped environment. Recent advancements in computerized solutions for these diagnoses are highly promising, with improved accuracy and efficiency. In this article, we propose a method for the classification of melanoma and benign skin lesions. Our approach integrates preprocessing, lesion segmentation, feature extraction, feature selection, and classification. Preprocessing is executed in the context of hair removal by DullRazor, whereas lesion texture and color information are utilized to enhance the lesion contrast. In lesion segmentation, a hybrid technique has been implemented and results are fused using the additive law of probability. A serial-based method is applied subsequently that extracts and fuses traits such as color, texture, and HOG (shape). The fused features are selected afterwards by implementing a novel Boltzmann entropy method. Finally, the selected features are classified by a support vector machine. The proposed method is evaluated on the publicly available PH2 data set. Our approach provided promising results of sensitivity 97.7%, specificity 96.7%, accuracy 97.5%, and F-score 97.5%, which are significantly better than the results of existing methods on the same data set. The proposed method detects and classifies melanoma significantly better than existing methods. © 2018 Wiley Periodicals, Inc.
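The serial (concatenation-based) feature fusion and entropy-driven selection steps can be illustrated with a generic sketch. The histogram-entropy ranking below is a stand-in assumption, not the paper's Boltzmann entropy method, and the color/texture arrays are random placeholders rather than real lesion features:

```python
import numpy as np

def entropy_scores(X, bins=10):
    """Shannon entropy of each feature's histogram: a generic stand-in
    for the entropy-based feature selection described above."""
    scores = []
    for j in range(X.shape[1]):
        hist, _ = np.histogram(X[:, j], bins=bins)
        p = hist / hist.sum()
        p = p[p > 0]                       # drop empty bins before log
        scores.append(float(-(p * np.log2(p)).sum()))
    return np.array(scores)

rng = np.random.default_rng(0)
color = rng.random((50, 3))                # placeholder color features
texture = rng.random((50, 4))              # placeholder texture/HOG features
fused = np.hstack([color, texture])        # serial fusion = column concatenation
scores = entropy_scores(fused)
top3 = np.argsort(scores)[::-1][:3]        # keep the 3 highest-entropy features
```

The selected feature subset would then be passed to a classifier (a support vector machine in the paper's pipeline).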
Poddar, Raju; Cortés, Dennis E.; Werner, John S.; Mannis, Mark J.
2013-01-01
Abstract. A high-speed (100 kHz A-scan rate) complex conjugate resolved 1 μm swept source optical coherence tomography (SS-OCT) system using coherence revival of the light source is suitable for dense three-dimensional (3-D) imaging of the anterior segment. The short acquisition time helps to minimize the influence of motion artifacts. The extended depth range of the SS-OCT system allows topographic analysis of clinically relevant images of the entire depth of the anterior segment of the eye. Patients with the type 1 Boston Keratoprosthesis (KPro) require evaluation of the full anterior segment depth. Current commercially available OCT systems are not suitable for this application due to limited acquisition speed, resolution, and axial imaging range. Moreover, most commonly used research grade and some clinical OCT systems implement a commercially available SS (Axsun) that offers only 3.7 mm imaging range (in air) in its standard configuration. We describe implementation of a common swept laser with built-in k-clock to allow phase stable imaging in both low range and high range, 3.7 and 11.5 mm in air, respectively, without the need to build an external MZI k-clock. As a result, 3-D morphology of the KPro position with respect to the surrounding tissue could be investigated in vivo both at high resolution and with large depth range to achieve noninvasive and precise evaluation of success of the surgical procedure. PMID:23912759
Salvo, Deborah; Reis, Rodrigo S; Sarmiento, Olga L; Pratt, Michael
2014-12-01
There is evidence linking the built environment (BE) with physical activity (PA), but few studies have been conducted in Latin America (LA). State-of-the-art methods and protocols have been designed in and applied in high-income countries (HIC). In this paper, we identify key challenges and potential solutions to conducting high-quality PA and BE research in LA. The experience of implementing the IPEN data collection protocol (IPEN: International Physical Activity Environment Network) in Curitiba, Brazil; Bogotá, Colombia; and Cuernavaca, Mexico (2010-2011); is described to identify challenges for conducting PA and BE research in LA. Five challenges were identified: lack of academic capacity (implemented solutions (IS): building a strong international collaborative network); limited data availability, access and quality (IS: partnering with influential local institutions, and crafting creative solutions to use the best-available data); socio-political, socio-cultural and socio-economic context (IS: in-person recruitment and data collection, alternative incentives); safety (IS: strict rules for data collection procedures, and specific measures to increase trust); and appropriateness of instruments and measures (IS: survey adaptation, use of standardized additional survey components, and employing a context-based approach to understanding the relationship between PA and the BE). Advantages of conducting PA and BE research in LA were also identified. Conducting high-quality PA and BE research in LA is challenging but feasible. Networks of institutions and researchers from both HIC and LMIC play a key role. The lessons learned from the IPEN LA study may be applicable to other LMIC. Copyright © 2014 Elsevier Inc. All rights reserved.
Salvo, Deborah; Reis, Rodrigo S.; Sarmiento, Olga L.; Pratt, Michael
2014-01-01
Objective There is evidence linking the built environment (BE) with physical activity (PA), but few studies have been conducted in Latin America (LA). State-of-the-art methods and protocols have been designed in and applied in high-income countries (HIC). In this paper we identify key challenges and potential solutions to conducting high quality PA and BE research in LA. Methods The experience of implementing the IPEN data collection protocol (IPEN: International Physical Activity Environment Network) in Curitiba, Brazil; Bogotá, Colombia; and Cuernavaca, Mexico (2010-2011); is described to identify challenges for conducting PA and BE research in LA. Results Five challenges were identified: Lack of academic capacity (implemented solutions (IS): building a strong international collaborative network); limited data availability, access and quality (IS: partnering with influential local institutions, and crafting creative solutions to use the best-available data); socio-political, socio-cultural and socio-economic context (IS: in-person recruitment and data collection, alternative incentives); safety (IS: strict rules for data collection procedures, and specific measures to increase trust); appropriateness of instruments and measures (IS: survey adaptation, use of standardized additional survey components, and employing a context-based approach to understanding the relationship between PA and the BE). Advantages of conducting PA and BE research in LA were also identified. Conclusions Conducting high quality PA and BE research in LA is challenging but feasible. Networks of institutions and researchers from both HIC and LMIC play a key role. The lessons learnt from the IPEN LA study may be applicable to other LMIC. PMID:25456800
Multicenter Safety and Immunogenicity Trial of an Attenuated Measles Vaccine for NHP
Yee, JoAnn L; McChesney, Michael B; Christe, Kari L
2015-01-01
Measles is a highly contagious viral disease in NHP. The infection can range from asymptomatic to rapidly fatal, resulting in significant morbidity and mortality in captive populations. In addition to appropriate quarantine practices, restricted access, the immunization of all personnel in contact with NHP, and the wearing of protective clothing including face masks, measles immunization further reduces the infection risk. Commercially available measles vaccines are effective for use in NHP, but interruptions in their availability have prevented the implementation of ongoing, consistent vaccination programs. This need for a readily available vaccine led us to perform a broad, multicenter safety and immunogenicity study of another candidate vaccine, MVac (Serum Institute of India), a monovalent measles vaccine derived from live Edmonston–Zagreb strain virus that had been attenuated after 22 passages on human diploid cells. PMID:26473350
Analysis and Prediction of Weather Impacted Ground Stop Operations
NASA Technical Reports Server (NTRS)
Wang, Yao Xun
2014-01-01
When air traffic demand is expected to exceed an airport's available capacity for a short period of time, Ground Stop (GS) operations are implemented by Federal Aviation Administration (FAA) Traffic Flow Management (TFM). A GS requires departing aircraft meeting specific criteria to remain on the ground to achieve reduced demand at the constrained destination airport until the end of the GS. This paper provides a high-level overview of the statistical distributions as well as causal factors for GSs at major airports in the United States. The GS's character, the weather impact on GSs, GS variations with delays, and the interaction between GSs and Ground Delay Programs (GDPs) at Newark Liberty International Airport (EWR) are investigated. Machine learning methods are used to generate classification models that map historical airport weather forecasts, scheduled traffic, and other airport conditions to implemented GS/GDP operations, and the models are evaluated using cross-validation. This modeling approach produced promising results, as it yielded an 85% overall classification accuracy to distinguish implemented GS days from normal days without GS and GDP operations and a 71% accuracy to differentiate GS and GDP implemented days from GDP-only days.
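The classify-then-cross-validate workflow described above can be sketched on toy data. The features, day-labeling rule, and nearest-centroid classifier below are assumptions for illustration only; the paper's actual features, models, and airport data are not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200
# Hypothetical per-day features: [forecast visibility, scheduled demand]
X = rng.random((n, 2))
# Toy labeling rule: call it a GS day when demand is high and visibility low
y = ((X[:, 1] > 0.6) & (X[:, 0] < 0.5)).astype(int)

def nearest_centroid_cv(X, y, folds=5):
    """Mean accuracy of a nearest-centroid classifier under k-fold CV."""
    idx = np.arange(len(y))
    accs = []
    for f in range(folds):
        test = idx % folds == f
        train = ~test
        # one centroid per class, fit on the training fold only
        centroids = np.array(
            [X[train][y[train] == c].mean(axis=0) for c in (0, 1)]
        )
        d = np.linalg.norm(X[test][:, None, :] - centroids[None, :, :], axis=2)
        accs.append(float((d.argmin(axis=1) == y[test]).mean()))
    return float(np.mean(accs))

mean_acc = nearest_centroid_cv(X, y)
```

Holding out each fold in turn, as here, is what guards the reported 85%/71% accuracies against overfitting to the training days.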
Considerations in change management related to technology.
Luo, John S; Hilty, Donald M; Worley, Linda L; Yager, Joel
2006-01-01
The authors describe the complexity of social processes for implementing technological change. Once a new technology is available, information about its availability and benefits must be made available to the community of users, with opportunities to try the innovations and find them worthwhile, despite organizational resistances. The authors reviewed the literature from psychiatry, psychology, sociology, business, and technology to distill common denominators for success and failure related to implementing technology. Beneficial technological innovations that are simple to use and obviously save everyone time and effort are easy to inaugurate. However, innovations that primarily serve management rather than subordinates or front-line utilizers may fail, despite considerable institutional effort. This article reviews and outlines several of the more prominent theoretical models governing successful institutional change. Successful implementation of difficult technological changes requires visionary leadership that has carefully considered the benefits, consulted with influence leaders at all organizational levels to spot unintended consequences and sources of resistance, and developed a detailed plan and continuous quality assurance process to foster implementation over time.
ERIC Educational Resources Information Center
Eastman-Mueller, Heather P.; Gomez-Scott, Jessica R.; Jung, Ae-Kyung; Oswalt, Sara B.; Hagglund, Kristofer
2016-01-01
The U.S. Centers for Disease Control and Prevention advocate access to condoms as a critical sexual health prevention strategy. The purpose of this article is to discuss the implementation and evaluation of a condom availability program using dispensing machines in residence halls at a Midwestern U.S. university. Undergraduate students (N = 337)…
NMF-mGPU: non-negative matrix factorization on multi-GPU systems.
Mejía-Roa, Edgardo; Tabas-Madrid, Daniel; Setoain, Javier; García, Carlos; Tirado, Francisco; Pascual-Montano, Alberto
2015-02-13
In the last few years, the non-negative matrix factorization (NMF) technique has gained great interest among the bioinformatics community, since it is able to extract interpretable parts from high-dimensional datasets. However, the computing time required to process large data matrices may become impractical, even for a parallel application running on a multiprocessor cluster. In this paper, we present NMF-mGPU, an efficient and easy-to-use implementation of the NMF algorithm that takes advantage of the high computing performance delivered by graphics processing units (GPUs). Driven by the ever-growing demands of the video-games industry, graphics cards usually provided in PCs and laptops have evolved from simple graphics-drawing platforms into high-performance programmable systems that can be used as coprocessors for linear-algebra operations. However, these devices may have a limited amount of on-board memory, which is not considered by other NMF implementations on GPU. NMF-mGPU is based on CUDA (Compute Unified Device Architecture), NVIDIA's framework for GPU computing. On devices with low memory available, large input matrices are blockwise transferred from the system's main memory to the GPU's memory and processed accordingly. In addition, NMF-mGPU has been explicitly optimized for the different CUDA architectures. Finally, platforms with multiple GPUs can be synchronized through MPI (Message Passing Interface). In a four-GPU system, this implementation is about 120 times faster than a single conventional processor, and more than four times faster than a single GPU device (i.e., a super-linear speedup). Applications of GPUs in bioinformatics are getting more and more attention due to their outstanding performance when compared to traditional processors. In addition, their relatively low price represents a highly cost-effective alternative to conventional clusters.
In life sciences, this results in an excellent opportunity to facilitate the daily work of bioinformaticians who are trying to extract biological meaning out of hundreds of gigabytes of experimental information. NMF-mGPU can be used "out of the box" by researchers with little or no expertise in GPU programming on a variety of platforms, such as PCs, laptops, or high-end GPU clusters. NMF-mGPU is freely available at https://github.com/bioinfo-cnb/bionmf-gpu .
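The core computation NMF-mGPU offloads to the GPU is the standard multiplicative-update NMF iteration. A minimal CPU sketch in NumPy (illustrative sizes and iteration count; not the CUDA/blockwise implementation) shows the update rules being accelerated:

```python
import numpy as np

def nmf(V, k, iters=500, eps=1e-9, seed=0):
    """Lee-Seung multiplicative updates minimizing ||V - W @ H||_F."""
    rng = np.random.default_rng(seed)
    W = rng.random((V.shape[0], k)) + eps
    H = rng.random((k, V.shape[1])) + eps
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + eps)   # update H; stays non-negative
        W *= (V @ H.T) / (W @ H @ H.T + eps)   # update W; stays non-negative
    return W, H

# Factor an exactly rank-4 non-negative matrix and check the reconstruction.
rng = np.random.default_rng(1)
V = rng.random((20, 4)) @ rng.random((4, 30))
W, H = nmf(V, k=4)
err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
```

Each update multiplies the current factor elementwise by a ratio of dense matrix products, which preserves non-negativity; those dense products are exactly the operations that map well onto GPU hardware.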
Callaghan, Russell C.; Liu, Lon‐Mu
2015-01-01
Background and Aims: Research shows that essential/precursor chemical controls have had substantial impacts on US methamphetamine and heroin availability. This study examines whether US federal essential chemical regulations have impacted US cocaine seizure amount, price and purity, indicators of cocaine availability. Design: Autoregressive integrated moving average (ARIMA)-intervention time-series analysis was used to assess the impacts of four US regulations targeting cocaine manufacturing chemicals: potassium permanganate/selected solvents, implemented October 1989; sulfuric acid/hydrochloric acid, implemented October 1992; methyl isobutyl ketone, implemented May 1995; and sodium permanganate, implemented December 2006. Of these chemicals, potassium permanganate and sodium permanganate are the most critical to cocaine production. Setting: Conterminous United States (January 1987-April 2011). Measurements: Monthly time-series: purity-adjusted cocaine seizure amount (in gross weight seizures < 6000 grams), purity-adjusted price (all available seizures), and purity (all available seizures). Data source: System to Retrieve Information from Drug Evidence. Findings: The 1989 potassium permanganate/solvents regulation was associated with a seizure amount decrease (change in series level) of 28% (P < 0.05), a 36% increase in price (P < 0.05) and a 4% decrease in purity (P < 0.05). Availability recovered in 1-2 years. The 2006 sodium permanganate regulation was associated with a 22% seizure amount decrease (P < 0.05), 100% price increase (P < 0.05) and 35% purity decrease (P < 0.05). Following the 2006 regulation, essentially no recovery occurred to April 2011. The other two chemical regulations were associated with statistically significant but lesser declines in indicated availability. Conclusions: In the United States, essential chemical controls from 1989 to 2006 were associated with pronounced downturns in cocaine availability. PMID:25559418
Cunningham, James K; Callaghan, Russell C; Liu, Lon-Mu
2015-05-01
Research shows that essential/precursor chemical controls have had substantial impacts on US methamphetamine and heroin availability. This study examines whether US federal essential chemical regulations have impacted US cocaine seizure amount, price and purity, indicators of cocaine availability. Autoregressive integrated moving average (ARIMA)-intervention time-series analysis was used to assess the impacts of four US regulations targeting cocaine manufacturing chemicals: potassium permanganate/selected solvents, implemented October 1989; sulfuric acid/hydrochloric acid, implemented October 1992; methyl isobutyl ketone, implemented May 1995; and sodium permanganate, implemented December 2006. Of these chemicals, potassium permanganate and sodium permanganate are the most critical to cocaine production. Conterminous United States (January 1987-April 2011). Monthly time-series: purity-adjusted cocaine seizure amount (in gross weight seizures < 6000 grams), purity-adjusted price (all available seizures), and purity (all available seizures). System to Retrieve Information from Drug Evidence. The 1989 potassium permanganate/solvents regulation was associated with a seizure amount decrease (change in series level) of 28% (P < 0.05), a 36% increase in price (P < 0.05) and a 4% decrease in purity (P < 0.05). Availability recovered in 1-2 years. The 2006 sodium permanganate regulation was associated with a 22% seizure amount decrease (P < 0.05), 100% price increase (P < 0.05) and 35% purity decrease (P < 0.05). Following the 2006 regulation, essentially no recovery occurred to April 2011. The other two chemical regulations were associated with statistically significant but lesser declines in indicated availability. In the United States, essential chemical controls from 1989 to 2006 were associated with pronounced downturns in cocaine availability. © 2015 The Authors. Addiction published by John Wiley & Sons Ltd on behalf of Society for the Study of Addiction.
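The ARIMA-intervention design estimates how a series' level shifts when a regulation takes effect. A deliberately simplified sketch (ordinary least squares with a step dummy on synthetic data, not the full ARIMA noise model used in the study) illustrates the intervention term:

```python
import numpy as np

def level_shift(y, t0):
    """OLS estimate of the level change at time t0 (intercept + trend + step)."""
    t = np.arange(len(y))
    step = (t >= t0).astype(float)
    X = np.column_stack([np.ones(len(y)), t, step])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[2]  # coefficient on the step dummy

# Synthetic monthly series with a 28-unit drop at month 60, loosely echoing
# the estimated seizure-amount decrease after the 1989 regulation.
rng = np.random.default_rng(0)
y = 100.0 + 0.1 * np.arange(120) + rng.normal(0.0, 0.5, 120)
y[60:] -= 28.0
shift = level_shift(y, 60)
```

A full intervention analysis would model the autocorrelated noise (the ARIMA part) and possibly a gradual-decay intervention shape; the step dummy above captures only the abrupt level change.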
TaggerOne: joint named entity recognition and normalization with semi-Markov Models
Leaman, Robert; Lu, Zhiyong
2016-01-01
Motivation: Text mining is increasingly used to manage the accelerating pace of the biomedical literature. Many text mining applications depend on accurate named entity recognition (NER) and normalization (grounding). While high-performing machine learning methods trainable for many entity types exist for NER, normalization methods are usually specialized to a single entity type. NER and normalization systems are also typically used in a serial pipeline, causing cascading errors and limiting the ability of the NER system to directly exploit the lexical information provided by the normalization. Methods: We propose the first machine learning model for joint NER and normalization during both training and prediction. The model is trainable for arbitrary entity types and consists of a semi-Markov structured linear classifier, with a rich feature approach for NER and supervised semantic indexing for normalization. We also introduce TaggerOne, a Java implementation of our model as a general toolkit for joint NER and normalization. TaggerOne is not specific to any entity type, requiring only annotated training data and a corresponding lexicon, and has been optimized for high throughput. Results: We validated TaggerOne with multiple gold-standard corpora containing both mention- and concept-level annotations. Benchmarking results show that TaggerOne achieves high performance on diseases (NCBI Disease corpus, NER f-score: 0.829, normalization f-score: 0.807) and chemicals (BioCreative 5 CDR corpus, NER f-score: 0.914, normalization f-score: 0.895). These results compare favorably to the previous state of the art, notwithstanding the greater flexibility of the model. We conclude that jointly modeling NER and normalization greatly improves performance.
Availability and Implementation: The TaggerOne source code and an online demonstration are available at: http://www.ncbi.nlm.nih.gov/bionlp/taggerone Contact: zhiyong.lu@nih.gov Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27283952
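The semi-Markov structure means the model scores multi-token segments directly, so segment-level lexicon and normalization evidence can steer recognition. A toy Viterbi-style dynamic program (with a hypothetical lexicon and made-up scores standing in for TaggerOne's learned features and supervised semantic indexing) makes the idea concrete:

```python
# Toy segment-level lexicon: phrase -> (entity type, score). The entries and
# scores are invented for illustration only.
LEXICON = {("breast", "cancer"): ("Disease", 3.0),
           ("cancer",): ("Disease", 1.0),
           ("aspirin",): ("Chemical", 2.5)}

def segment(tokens, max_len=3):
    """Semi-Markov Viterbi: choose the best labeled segmentation, scoring
    whole candidate segments rather than per-token tags."""
    n = len(tokens)
    best = [(-1e9, None)] * (n + 1)   # (score, backpointer)
    best[0] = (0.0, None)
    for i in range(1, n + 1):
        for l in range(1, min(max_len, i) + 1):
            seg = tuple(tokens[i - l:i])
            label, score = LEXICON.get(seg, ("O", 0.0))
            if label == "O" and l > 1:
                continue              # non-entity segments are single tokens
            cand = best[i - l][0] + score
            if cand > best[i][0]:
                best[i] = (cand, (i - l, seg, label))
    out, i = [], n
    while i > 0:                      # recover the best segmentation
        j, seg, label = best[i][1]
        out.append((seg, label))
        i = j
    return out[::-1]

segs = segment("patients with breast cancer took aspirin".split())
```

Because "breast cancer" is scored as one segment, the two-token lexicon match outscores tagging "cancer" alone, which is the benefit of joint segment-level modeling over a serial token-tagging pipeline.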
Maisonneuve, Jenny J; Semrau, Katherine E A; Maji, Pinki; Pratap Singh, Vinay; Miller, Kate A; Solsky, Ian; Dixit, Neeraj; Sharma, Jigyasa; Lagoo, Janaka; Panariello, Natalie; Neal, Brandon; Kalita, Tapan; Kara, Nabihah; Kumar, Vishwajeet; Hirschhorn, Lisa R
2018-04-30
Evaluate the impact of a World Health Organization Safe Childbirth Checklist coaching-based intervention (BetterBirth Program) on availability and procurement of essential childbirth-related supplies. Matched pair, cluster-randomized controlled trial. Uttar Pradesh, India. 120 government-sector health facilities (60 intervention, 60 control). Supply-availability surveys were conducted quarterly in all sites. Coaches collected supply procurement sources from intervention sites. Coaching targeting implementation of the Checklist with data feedback and action planning. Mean supply availability by study arm; change in procurement sources for intervention sites. At baseline, 6 and 12 months, the intervention sites had a mean of 20.9 (95% confidence interval (CI): 20.2-21.5); 22.4 (95% CI: 21.8-22.9) and 22.1 (95% CI: 21.4-22.8) items, respectively. Control sites had 20.8 (95% CI: 20.3-21.3); 20.9 (95% CI: 20.3-21.5) and 21.7 (95% CI: 20.8-22.6) items at the same time-points. There was a small but statistically significant higher availability in intervention sites at 6 months (difference-in-difference (DID) = 1.43, P < 0.001), which was not seen by 12 months (DID = 0.37, P = 0.53). Greater difference between intervention and control sites starting in the bottom quartile of supply availability was seen at 6 months (DID = 4.0, P = 0.0002), with no significant difference by 12 months (DID = 1.5, P = 0.154). No change was seen in procurement sources, with ~5% procured by patients and some rates as high as 29% (oxytocin). Implementation of the BetterBirth Program, incorporating supply availability, resulted in modest improvements with catch-up by control facilities by 12 months. Supply-chain coaching may be most beneficial in sites starting with lower supply availability. Efforts are needed to reduce reliance on patient-funding for some critical medications. ClinicalTrials.gov #NCT02148952; Universal Trial Number: U1111-1131-5647.
NASA Astrophysics Data System (ADS)
Shrivastava, Sajal; Sohn, Il-Yung; Son, Young-Min; Lee, Won-Il; Lee, Nae-Eung
2015-11-01
Although real-time label-free fluorescent aptasensors based on nanomaterials are increasingly recognized as a useful strategy for the detection of target biomolecules with high fidelity, the lack of an imaging-based quantitative measurement platform limits their implementation with biological samples. Here we introduce an ensemble strategy for a real-time label-free fluorescent graphene (Gr) aptasensor platform. This platform employs aptamer length-dependent tunability, thus enabling the reagentless quantitative detection of biomolecules through computational processing coupled with real-time fluorescence imaging data. We demonstrate that this strategy effectively delivers dose-dependent quantitative readouts of adenosine triphosphate (ATP) concentration on chemical vapor deposited (CVD) Gr and reduced graphene oxide (rGO) surfaces, thereby providing cytotoxicity assessment. Compared with conventional fluorescence spectrometry methods, our highly efficient, universally applicable, and rational approach will facilitate broader implementation of imaging-based biosensing platforms for the quantitative evaluation of a range of target molecules. Electronic supplementary information (ESI) available. See DOI: 10.1039/c5nr05839b
Keefe, Matthew J; Loda, Justin B; Elhabashy, Ahmad E; Woodall, William H
2017-06-01
The traditional implementation of the risk-adjusted Bernoulli cumulative sum (CUSUM) chart for monitoring surgical outcome quality requires waiting a pre-specified period of time after surgery before incorporating patient outcome information. We propose a simple but powerful implementation of the risk-adjusted Bernoulli CUSUM chart that incorporates outcome information as soon as it is available, rather than waiting a pre-specified period of time after surgery. A simulation study is presented that compares the performance of the traditional implementation of the risk-adjusted Bernoulli CUSUM chart to our improved implementation. We show that incorporating patient outcome information as soon as it is available leads to quicker detection of process deterioration. Deterioration of surgical performance could be detected much sooner using our proposed implementation, which could lead to the earlier identification of problems. © The Author 2017. Published by Oxford University Press in association with the International Society for Quality in Health Care. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com
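The underlying monitoring statistic is the risk-adjusted Bernoulli CUSUM, which accumulates a log-likelihood-ratio weight per patient and resets at zero. A compact sketch follows (the odds ratio to detect and the control limit are illustrative; the weights take the usual Steiner-style form, and outcomes are fed in as soon as each becomes available, in the spirit of the improved implementation):

```python
import math

def risk_adjusted_cusum(risks, outcomes, odds_ratio=2.0, h=4.0):
    """Risk-adjusted Bernoulli CUSUM with likelihood-ratio weights.

    risks:    predicted adverse-outcome probability per patient
    outcomes: observed outcomes (1 = adverse), supplied in the order each
              becomes available rather than after a fixed waiting period.
    Returns the CUSUM path and the index of the first signal (or None).
    """
    s, path = 0.0, []
    for p, y in zip(risks, outcomes):
        denom = 1.0 - p + odds_ratio * p            # likelihood normalizer
        w = math.log(odds_ratio / denom) if y else math.log(1.0 / denom)
        s = max(0.0, s + w)                         # one-sided chart, reset at 0
        path.append(s)
    signal = next((i for i, v in enumerate(path) if v > h), None)
    return path, signal

# In-control stretch (30 good outcomes) followed by deterioration (20 adverse),
# all with a predicted risk of 0.1.
path, signal = risk_adjusted_cusum([0.1] * 50, [0] * 30 + [1] * 20)
```

During the in-control stretch the weights are negative, so the statistic stays at zero; once adverse outcomes accumulate, the chart climbs and crosses the limit.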
Bruyeron, Olivier; Denizeau, Mirrdyn; Berger, Jacques; Trèche, Serge
2010-06-01
Sustainable approaches to improving infant and young child feeding are needed. The Nutridev program worked in Vietnam, Madagascar, and Burkina Faso to test different strategies to improve complementary feeding using fortified products sold to families. To review the experiences of programs producing and marketing fortified complementary foods and to report on the feasibility of local production and marketing of fortified complementary foods to increase usage of high-quality foods among children of low-income families in a self-sustaining manner. Project documents, surveys of mothers, and production and sales reports were reviewed. Nutridev experience in Vietnam, Madagascar, and Burkina Faso demonstrates that it is possible to produce affordable, high-quality complementary foods and supplements locally in developing countries. Strategies to make products readily available to the targeted population and to convince this population to consume them yielded mixed results, varying greatly based on the strategy utilized and the context in which it was implemented. In several contexts, the optimal approach appears to be strengthening the existing food distribution network to sell complementary foods and supplements, with the implementation of a temporary promotion and nutrition education network in partnership with local authorities (e.g., health services) to increase awareness among families about the fortified complementary food product and optimal feeding practices. In urban areas, where the density of the population is high, design and implementation of specific networks very close to consumers seems to be a good way to combine economic sustainability and good consumption levels.
gr-MRI: A software package for magnetic resonance imaging using software defined radios
NASA Astrophysics Data System (ADS)
Hasselwander, Christopher J.; Cao, Zhipeng; Grissom, William A.
2016-09-01
The goal of this work is to develop software that enables the rapid implementation of custom MRI spectrometers using commercially-available software defined radios (SDRs). The developed gr-MRI software package comprises a set of Python scripts, flowgraphs, and signal generation and recording blocks for GNU Radio, an open-source SDR software package that is widely used in communications research. gr-MRI implements basic event sequencing functionality, and tools for system calibrations, multi-radio synchronization, and MR signal processing and image reconstruction. It includes four pulse sequences: a single-pulse sequence to record free induction signals, a gradient-recalled echo imaging sequence, a spin echo imaging sequence, and an inversion recovery spin echo imaging sequence. The sequences were used to perform phantom imaging scans with a 0.5 Tesla tabletop MRI scanner and two commercially-available SDRs. One SDR was used for RF excitation and reception, and the other for gradient pulse generation. The total SDR hardware cost was approximately $2000. The frequency of radio desynchronization events and the frequency with which the software recovered from those events were also measured, and the SDR's ability to generate frequency-swept RF waveforms was validated and compared to the scanner's commercial spectrometer. The spin echo images geometrically matched those acquired using the commercial spectrometer, with no unexpected distortions. Desynchronization events were more likely to occur at the very beginning of an imaging scan, but were nearly eliminated if the user invoked the sequence for a short period before beginning data recording. The SDR produced a 500 kHz bandwidth frequency-swept pulse with high fidelity, while the commercial spectrometer produced a waveform with large frequency spike errors. In conclusion, the developed gr-MRI software can be used to develop high-fidelity, low-cost custom MRI spectrometers using commercially-available SDRs.
Khin, Hnin Su Su; Aung, Tin; Aung, Moe; Thi, Aung; Boxshall, Matt; White, Chris
2016-08-18
In 2012, alarmingly high rates of oral artemisinin monotherapy availability and use were detected in Eastern Myanmar, threatening efforts to halt the spread of artemisinin resistance in the Greater Mekong Subregion (GMS), and globally. The aim of this paper is to exemplify how the use of supply side evidence generated through the ACTwatch project shaped the artemisinin monotherapy replacement malaria (AMTR) project's design and interventions to rapidly displace oral artemisinin monotherapy with subsidized, quality-assured ACT in the private sector. The AMTR project was implemented as part of the Myanmar artemisinin resistance containment (MARC) framework in Eastern Myanmar. Guided by outlet survey and supply chain evidence, the project implemented a high-level subsidy, including negotiations with a main anti-malarial distributor, with the aim of squeezing oral artemisinin monotherapy out of the market through price competition and increased availability of quality-assured artemisinin-based combinations. This was complemented with a plethora of demand-creation activities targeting anti-malarial providers and consumers. Priority outlet types responsible for the distribution of oral artemisinin monotherapy were identified by the outlet survey, and this evidence was used to target the AMTR project's supporting interventions. The widespread availability and use of oral artemisinin monotherapy in Myanmar has been a serious threat to malaria control and elimination in the country and across the region. Practical anti-malarial market evidence was rapidly generated and used to inform private sector approaches to address these threats. The program design approach outlined in this paper is illustrative of the type of evidence generation and use that will be required to ensure effective containment of artemisinin drug resistance and progress toward regional and global malaria elimination goals.
NASA Astrophysics Data System (ADS)
Parker, Jay; Donnellan, Andrea; Glasscoe, Margaret; Fox, Geoffrey; Wang, Jun; Pierce, Marlon; Ma, Yu
2015-08-01
High-resolution maps of earth surface deformation are available in public archives for scientific interpretation, but are primarily available as bulky downloads on the internet. The NASA uninhabited aerial vehicle synthetic aperture radar (UAVSAR) archive of airborne radar interferograms delivers very high resolution images (approximately seven-meter pixels), making remote handling of the files that much more pressing. Data exploration requiring data selection and exploratory analysis has been tedious. QuakeSim has implemented an archive of UAVSAR data in a web service and browser system based on GeoServer (http://geoserver.org). This supports a variety of services that supply consistent maps, raster image data and geographic information systems (GIS) objects, including standard earthquake faults. Browsing the database is supported by initially displaying GIS-referenced thumbnail images of the radar displacement maps. Access is also provided to image metadata and links for full file downloads. One of the most widely used features is the QuakeSim line-of-sight profile tool, which calculates the radar-observed displacement (from an unwrapped interferogram product) along a line specified through a web browser. Displacement values along a profile are updated to a plot on the screen as the user interactively redefines the endpoints of the line and the sampling density. The profile and also a plot of the ground height are available as CSV (text) files for further examination, without any need to download the full radar file. Additional tools allow the user to select a polygon overlapping the radar displacement image, specify a downsampling rate and extract a modest sized grid of observations for display or for inversion, for example with the QuakeSim simplex inversion tool, which estimates a consistent fault geometry and slip model.
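The line-of-sight profile tool essentially resamples the unwrapped-interferogram raster along a user-drawn line. A small NumPy sketch of that operation (bilinear sampling on a synthetic displacement ramp; not QuakeSim code) follows:

```python
import numpy as np

def los_profile(raster, p0, p1, n=100):
    """Sample a displacement raster along the line p0 -> p1 ((row, col) pairs)
    using bilinear interpolation, as the profile tool does conceptually."""
    rows = np.linspace(p0[0], p1[0], n)
    cols = np.linspace(p0[1], p1[1], n)
    r0 = np.clip(np.floor(rows).astype(int), 0, raster.shape[0] - 2)
    c0 = np.clip(np.floor(cols).astype(int), 0, raster.shape[1] - 2)
    fr, fc = rows - r0, cols - c0
    return ((1 - fr) * (1 - fc) * raster[r0, c0]
            + (1 - fr) * fc * raster[r0, c0 + 1]
            + fr * (1 - fc) * raster[r0 + 1, c0]
            + fr * fc * raster[r0 + 1, c0 + 1])

# Synthetic interferogram: displacement is a smooth ramp, so any profile
# through it should itself be linear.
y, x = np.mgrid[0:200, 0:300]
disp = 0.01 * x + 0.005 * y
profile = los_profile(disp, (10, 10), (150, 250), n=50)
```

Because bilinear interpolation reproduces a linear field exactly, the extracted profile is a straight line here; on real interferograms the same resampling exposes fault-related offsets along the chosen transect.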
Alvarado-Rojas, Catalina; Le Van Quyen, Michel; Valderrama, Mario
2016-01-01
High Frequency Oscillations (HFOs) in the brain have been associated with different physiological and pathological processes. In epilepsy, HFOs might reflect a mechanism of epileptic phenomena, serving as a biomarker of epileptogenesis and epileptogenicity. Despite the valuable information provided by HFOs, their correct identification is a challenging task. A comprehensive application, RIPPLELAB, was developed to facilitate the analysis of HFOs. RIPPLELAB provides a wide range of tools for manual and automatic HFO detection and visual validation, all of them accessible from an intuitive graphical user interface. Four methods for automated detection, as well as several options for visualization and validation of detected events, were implemented and integrated in the application. Analysis of multiple files and channels is possible, and new options can be added by users. All features and capabilities implemented in RIPPLELAB for automatic detection were tested through the analysis of simulated signals and intracranial EEG recordings from epileptic patients (n = 16; 3,471 analyzed hours). Visual validation was also tested, and detected events were classified into different categories. Unlike other available software packages for EEG analysis, RIPPLELAB uniquely provides the appropriate graphical and algorithmic environment for HFO detection (visual and automatic) and validation, in such a way that the power of elaborate detection methods is made available to a wide range of users (experts and non-experts) through the use of this application. We believe that this open-source tool will facilitate and promote collaboration between clinical and research centers working in the HFO field. The tool is available under public license and is accessible through a dedicated web site. PMID:27341033
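Automated HFO detectors typically band-pass the recording into the ripple band and flag intervals where a short-window energy estimate exceeds a statistical threshold. A toy NumPy version (FFT band-pass plus moving RMS; far simpler than the four detectors bundled with RIPPLELAB) illustrates the pattern:

```python
import numpy as np

def detect_hfo(x, fs, band=(80.0, 250.0), win_ms=10, k=5.0):
    """Flag samples whose ripple-band moving RMS exceeds mean + k*SD."""
    X = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(len(x), 1.0 / fs)
    X[(freqs < band[0]) | (freqs > band[1])] = 0.0     # brick-wall band-pass
    y = np.fft.irfft(X, len(x))
    w = max(1, int(fs * win_ms / 1000))
    rms = np.sqrt(np.convolve(y ** 2, np.ones(w) / w, mode="same"))
    return rms > rms.mean() + k * rms.std()

# Synthetic iEEG: white background noise with one 140 Hz ripple burst at 2.0 s.
fs = 2000
t = np.arange(4 * fs) / fs
x = np.random.default_rng(0).normal(0.0, 1.0, t.size)
burst = (t > 2.0) & (t < 2.05)
x[burst] += 10.0 * np.sin(2 * np.pi * 140.0 * t[burst])
mask = detect_hfo(x, fs)
```

Real detectors (e.g. those implemented in RIPPLELAB) add per-channel baseline estimation, duration and oscillation-count criteria, and artifact rejection; this sketch only conveys the filter-then-threshold skeleton.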
Barrier infrared detector research at the Jet Propulsion Laboratory
NASA Astrophysics Data System (ADS)
Ting, David Z.; Keo, Sam A.; Liu, John K.; Mumolo, Jason M.; Khoshakhlagh, Arezou; Soibel, Alexander; Nguyen, Jean; Höglund, Linda; Rafol, B., Sir; Hill, Cory J.; Gunapala, Sarath D.
2012-10-01
The barrier infrared detector device architecture offers the advantage of reduced dark current resulting from suppressed Shockley-Read-Hall (SRH) recombination and surface leakage. The versatility of the antimonide material system, with the availability of three different types of band offsets for flexibility in device design, provides the ideal setting for implementing barrier infrared detectors. We describe the progress made at the NASA Jet Propulsion Laboratory in recent years in barrier infrared detector research that resulted in high-performance quantum structure infrared detectors, including the type-II superlattice complementary barrier infrared detector (CBIRD) and the high operating temperature quantum dot barrier infrared detector (HOT QD-BIRD).
A high fidelity real-time simulation of a small turboshaft engine
NASA Technical Reports Server (NTRS)
Ballin, Mark G.
1988-01-01
A high-fidelity component-type model and real-time digital simulation of the General Electric T700-GE-700 turboshaft engine were developed for use with current generation real-time blade-element rotor helicopter simulations. A control system model based on the specification fuel control system used in the UH-60A Black Hawk helicopter is also presented. The modeling assumptions and real-time digital implementation methods particular to the simulation of small turboshaft engines are described. The validity of the simulation is demonstrated by comparison with analysis-oriented simulations developed by the manufacturer, available test data, and flight-test time histories.
A temperature controller board for the ARC controller
NASA Astrophysics Data System (ADS)
Tulloch, Simon
2016-07-01
A high-performance temperature controller board has been produced for the ARC Generation-3 CCD controller. It contains two 9 W temperature servo loops and four temperature input channels and is fully programmable via the ARC API and OWL data acquisition program. PI-loop control is implemented in an on-board microcontroller. Both diode and RTD sensors can be used. Control and telemetry data are sent via the ARC backplane, although a USB-2 interface is also available. Further functionality includes hardware timers and high-current drivers for external shutters and calibration LEDs, an LCD display, a parallel I/O port, a pressure sensor interface and an uncommitted analogue telemetry input.
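A discrete PI servo loop of the kind such a board's firmware runs can be sketched in a few lines (the gains, the toy thermal plant, and the use of the 9 W figure as an output clamp are all illustrative, not taken from the actual board):

```python
class PI:
    """Discrete PI temperature servo loop with output clamping."""
    def __init__(self, kp, ki, dt, out_max=9.0):   # 9 W heater limit (illustrative)
        self.kp, self.ki, self.dt, self.out_max = kp, ki, dt, out_max
        self.integral = 0.0

    def step(self, setpoint, measured):
        err = setpoint - measured
        self.integral += err * self.dt
        out = self.kp * err + self.ki * self.integral
        return min(max(out, 0.0), self.out_max)    # heater can only heat

# Toy first-order plant: the stage warms with heater power and leaks heat
# toward a 140 K cold base; servo the temperature to 160 K.
pi, temp = PI(kp=0.2, ki=0.05, dt=0.1), 140.0
for _ in range(2000):
    power = pi.step(160.0, temp)
    temp += 0.1 * (0.5 * power - 0.05 * (temp - 140.0))
```

The integral term is what drives the steady-state error to zero: at equilibrium the proportional term vanishes and the accumulated integral alone supplies the holding power.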
ProjectQ: Compiling quantum programs for various backends
NASA Astrophysics Data System (ADS)
Haener, Thomas; Steiger, Damian S.; Troyer, Matthias
In order to control quantum computers beyond the current generation, a high-level quantum programming language and optimizing compilers will be essential. Therefore, we have developed ProjectQ, an open source software framework to facilitate implementing and running quantum algorithms both in software and on actual quantum hardware. Here, we introduce the backends available in ProjectQ. This includes a high-performance simulator and emulator to test and debug quantum algorithms, tools for resource estimation, and interfaces to several small-scale quantum devices. We demonstrate the workings of the framework and show how easily it can be further extended to control upcoming quantum hardware.
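What a simulator backend computes can be illustrated with a bare-bones statevector calculation in NumPy (this is not the ProjectQ API, just the linear algebra a simulator performs when a program prepares a Bell state):

```python
import numpy as np

H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)   # Hadamard gate
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)            # control = first qubit

state = np.zeros(4)
state[0] = 1.0                          # |00>
state = np.kron(H, np.eye(2)) @ state   # Hadamard on the first qubit
state = CNOT @ state                    # entangle -> (|00> + |11>)/sqrt(2)
probs = np.abs(state) ** 2              # measurement probabilities
```

A full simulator backend does the same state evolution without materializing the exponentially large gate matrices, which is where the high-performance engineering lies.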
Silicon saw-tooth refractive lens for high-energy x-rays made using a diamond saw.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Said, A. H.; Shastri, S. D.; X-Ray Science Division
2010-01-01
Silicon is a material well suited for refractive lenses operating at high X-ray energies (>50 keV), particularly if implemented in a single-crystal form to minimize small-angle scattering. A single-crystal silicon saw-tooth refractive lens, fabricated by a dicing process using a thin diamond wheel, was tested with 115 keV X-rays, giving an ideal 17 μm line focus width in a long focal length, 2:1 ratio demagnification geometry, with a source-to-focus distance of 58.5 m. The fabrication is simple, using resources typically available at any synchrotron facility's optics shop.
Algorithms for optimized maximum entropy and diagnostic tools for analytic continuation.
Bergeron, Dominic; Tremblay, A-M S
2016-08-01
Analytic continuation of numerical data obtained in imaginary time or frequency has become an essential part of many branches of quantum computational physics. It is, however, an ill-conditioned procedure and thus a hard numerical problem. The maximum-entropy approach, based on Bayesian inference, is the most widely used method to tackle that problem. Although the approach is well established and among the most reliable and efficient ones, useful developments of the method and of its implementation are still possible. In addition, while a few free software implementations are available, a well-documented, optimized, general purpose, and user-friendly software dedicated to that specific task is still lacking. Here we analyze all aspects of the implementation that are critical for accuracy and speed and present a highly optimized approach to maximum entropy. Original algorithmic and conceptual contributions include (1) numerical approximations that yield a computational complexity that is almost independent of temperature and spectrum shape (including sharp Drude peaks in broad background, for example) while ensuring quantitative accuracy of the result whenever precision of the data is sufficient, (2) a robust method of choosing the entropy weight α that follows from a simple consistency condition of the approach and the observation that information- and noise-fitting regimes can be identified clearly from the behavior of χ² with respect to α, and (3) several diagnostics to assess the reliability of the result. Benchmarks with test spectral functions of different complexity and an example with an actual physical simulation are presented. Our implementation, which covers most typical cases for fermions, bosons, and response functions, is available as an open source, user-friendly software.
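The ill-conditioning that makes analytic continuation hard is easy to exhibit: discretizing a Laplace-transform-like kernel yields a matrix whose condition number is astronomically large, so naive inversion amplifies data noise without bound and a regularized method such as maximum entropy becomes necessary (toy discretization, illustrative grids):

```python
import numpy as np

# G(tau) = integral of K(tau, w) * A(w) dw with K(tau, w) = exp(-tau * w);
# recovering the spectrum A from noisy G means inverting this kernel.
tau = np.linspace(0.0, 10.0, 40)    # imaginary-time grid (illustrative)
w = np.linspace(0.0, 10.0, 40)      # real-frequency grid (illustrative)
K = np.exp(-np.outer(tau, w)) * (w[1] - w[0])
cond = np.linalg.cond(K)            # enormous: tiny data noise -> huge A noise
```

The singular values of such kernels decay nearly exponentially, which is exactly why the entropy weight α in the maximum-entropy approach must balance fitting the data against fitting its noise.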
Algorithms for optimized maximum entropy and diagnostic tools for analytic continuation
NASA Astrophysics Data System (ADS)
Bergeron, Dominic; Tremblay, A.-M. S.
2016-08-01
Analytic continuation of numerical data obtained in imaginary time or frequency has become an essential part of many branches of quantum computational physics. It is, however, an ill-conditioned procedure and thus a hard numerical problem. The maximum-entropy approach, based on Bayesian inference, is the most widely used method to tackle that problem. Although the approach is well established and among the most reliable and efficient ones, useful developments of the method and of its implementation are still possible. In addition, while a few free software implementations are available, a well-documented, optimized, general purpose, and user-friendly software dedicated to that specific task is still lacking. Here we analyze all aspects of the implementation that are critical for accuracy and speed and present a highly optimized approach to maximum entropy. Original algorithmic and conceptual contributions include (1) numerical approximations that yield a computational complexity that is almost independent of temperature and spectrum shape (including sharp Drude peaks in broad background, for example) while ensuring quantitative accuracy of the result whenever precision of the data is sufficient, (2) a robust method of choosing the entropy weight α that follows from a simple consistency condition of the approach and the observation that information- and noise-fitting regimes can be identified clearly from the behavior of χ² with respect to α, and (3) several diagnostics to assess the reliability of the result. Benchmarks with test spectral functions of different complexity and an example with an actual physical simulation are presented. Our implementation, which covers most typical cases for fermions, bosons, and response functions, is available as an open source, user-friendly software.
Liu, Hueiming; Lindley, Richard; Alim, Mohammed; Felix, Cynthia; Gandhi, Dorcas B C; Verma, Shweta J; Tugnawat, Deepak Kumar; Syrigapu, Anuradha; Ramamurthy, Ramaprabhu Krishnappa; Pandian, Jeyaraj D; Walker, Marion; Forster, Anne; Anderson, Craig S; Langhorne, Peter; Murthy, Gudlavalleti Venkata Satyanarayana; Shamanna, Bindiganavale Ramaswamy; Hackett, Maree L; Maulik, Pallab K; Harvey, Lisa A; Jan, Stephen
2016-01-01
Introduction We are undertaking a randomised controlled trial (fAmily led rehabiliTaTion aftEr stroke in INDia, ATTEND) evaluating training a family carer to enable maximal rehabilitation of patients with stroke-related disability, as a potentially affordable, culturally acceptable and effective intervention for use in India. A process evaluation is needed to understand how and why this complex intervention may be effective, and to capture important barriers and facilitators to its implementation. We describe the protocol for our process evaluation to encourage the development of process evaluation methodology and transparency in reporting. Methods and analysis The realist and RE-AIM (Reach, Effectiveness, Adoption, Implementation and Maintenance) frameworks informed the design. Mixed methods include semistructured interviews with health providers, patients and their carers; analysis of quantitative process data describing fidelity and dose of the intervention; observations of trial set-up and implementation; and analysis of cost data from the perspective of patients and their families, and of programme budgets. These qualitative and quantitative data will be analysed iteratively prior to knowing the quantitative outcomes of the trial, and then triangulated with the results from the primary outcome evaluation. Ethics and dissemination The process evaluation has received ethical approval for all sites in India. In low-income and middle-income countries, the available human capital can form an approach to reducing the evidence-practice gap, compared with the high-cost alternatives available in established market economies. This process evaluation will provide insights into how such a programme can be implemented in practice and brought to scale. Through local stakeholder engagement and dissemination of findings globally, we hope to build on patient-centred, cost-effective and sustainable models of stroke rehabilitation. Trial registration number CTRI/2013/04/003557.
PMID:27633636
Standardized Solution for Management Controller for MTCA.4
NASA Astrophysics Data System (ADS)
Makowski, D.; Fenner, M.; Ludwig, F.; Mavrič, U.; Mielczarek, A.; Napieralski, A.; Perek, P.; Schlarb, H.
2015-06-01
The Micro Telecommunications Computing Architecture (MTCA) standard is a modern platform that is gaining popularity in High Energy Physics (HEP) experiments. The standard provides extensive management, monitoring and diagnostics functionality. Hardware control and monitoring are based on the Intelligent Platform Management Interface (IPMI), which was initially developed for supervising the operation of complex computer systems. The original IPMI specification was extended to support the functions required by the MTCA specification. A Module Management Controller (MMC) is required on each Advanced Mezzanine Card (AMC) installed in an MTCA chassis, and Rear Transition Modules (RTMs) have to be equipped with RTM Management Controllers (RMCs), as required by the MTCA.4 subsidiary specification. The commercially available implementations of MMC and RMC are expensive and do not provide the complete functionality required by specific HEP applications. Therefore, many research centres and commercial companies work on their own implementations of AMC and RTM controllers. The available implementations suffer from the lack of a common approach and from interoperability problems. Since both Lodz University of Technology (TUL) and Deutsches Elektronen-Synchrotron (DESY) have long-term experience in developing ATCA and MTCA hardware, the authors decided to develop a unified management-controller solution fully compliant with the AMC and MTCA.4 standards. The MMC v1.00 solution is dedicated to the management of AMC and RTM modules. It is based on Atmel ATxmega MCUs and can be fully customized by the user or used as a drop-in module without any modifications. The paper discusses the functionality of the MMC v1.00 solution. The implementation was verified with evaluation kits developed for AMC and RTM cards.
Least reliable bits coding (LRBC) for high data rate satellite communications
NASA Technical Reports Server (NTRS)
Vanderaar, Mark; Budinger, James; Wagner, Paul
1992-01-01
LRBC, a bandwidth-efficient multilevel/multistage block-coded modulation technique, is analyzed. LRBC uses simple multilevel component codes that provide increased error protection on increasingly unreliable modulated bits in order to maintain an overall high code rate that increases spectral efficiency. Soft-decision multistage decoding is used to make decisions on unprotected bits through corrections made on more protected bits. Analytical expressions and tight performance bounds are used to show that LRBC can achieve increased spectral efficiency while maintaining power efficiency equivalent to or better than that of BPSK. The relative simplicity of Galois field algebra compared with the Viterbi algorithm, and the availability of high-speed commercial VLSI for block codes, indicate that LRBC using block codes is a desirable method for high data rate implementations.
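The core LRBC idea, protecting the less reliable modulated bits with a stronger component code while leaving the more reliable bits uncoded to keep the overall rate high, can be illustrated with a toy Python sketch. This is not the authors' implementation: the rate-1/3 repetition code is a hypothetical stand-in for the block component codes analyzed in the paper, and two binary symmetric channels with different crossover probabilities model the unequal bit reliabilities.

```python
import random

def repetition_encode(bits, n=3):
    # Rate-1/n repetition code: stand-in for a stronger block component code.
    return [b for b in bits for _ in range(n)]

def repetition_decode(bits, n=3):
    # Majority vote over each group of n received bits.
    return [1 if sum(bits[i:i + n]) > n // 2 else 0
            for i in range(0, len(bits), n)]

def bsc(bits, p, rng):
    # Binary symmetric channel: flip each bit with probability p.
    return [b ^ (rng.random() < p) for b in bits]

def lrbc_demo(msg_bits, p_reliable, p_unreliable, rng):
    # Split the message across two modulation "levels": the reliable level
    # carries uncoded bits; the unreliable level gets the repetition code.
    half = len(msg_bits) // 2
    level_hi, level_lo = msg_bits[:half], msg_bits[half:]
    rx_hi = bsc(level_hi, p_reliable, rng)
    rx_lo = repetition_decode(bsc(repetition_encode(level_lo), p_unreliable, rng))
    errors = sum(a != b for a, b in zip(level_hi + level_lo, rx_hi + rx_lo))
    return errors / len(msg_bits)
```

Because only the unreliable level pays the redundancy cost, the overall code rate stays higher than if every bit were protected equally, which is the spectral-efficiency argument made in the abstract.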
NASA Technical Reports Server (NTRS)
Shafer, Jaclyn; Watson, Leela R.
2015-01-01
NASA's Launch Services Program, Ground Systems Development and Operations, Space Launch System and other programs at Kennedy Space Center (KSC) and Cape Canaveral Air Force Station (CCAFS) use the daily and weekly weather forecasts issued by the 45th Weather Squadron (45 WS) as decision tools for their day-to-day and launch operations on the Eastern Range (ER). Examples include determining if they need to limit activities such as vehicle transport to the launch pad, protect people, structures or exposed launch vehicles given a threat of severe weather, or reschedule other critical operations. The 45 WS uses numerical weather prediction models as a guide for these weather forecasts, particularly the Air Force Weather Agency (AFWA) 1.67 km Weather Research and Forecasting (WRF) model. Considering the 45 WS forecasters' and Launch Weather Officers' (LWO) extensive use of the AFWA model, the 45 WS proposed a task at the September 2013 Applied Meteorology Unit (AMU) Tasking Meeting requesting the AMU verify this model. Due to the lack of archived model data available from AFWA, verification is not yet possible. Instead, the AMU proposed to implement and verify the performance of an ER version of the high-resolution WRF Environmental Modeling System (EMS) model configured by the AMU (Watson 2013) in real time. Implementing a real-time version of the ER WRF-EMS would generate a larger database of model output than in the previous AMU task for determining model performance, and would allow the AMU more control over, and access to, the model output archive. The tasking group agreed to this proposal; therefore the AMU implemented the WRF-EMS model on the second of two NASA AMU modeling clusters. The AMU also calculated verification statistics to determine model performance compared to observational data. 
Finally, the AMU made the model output available on the AMU Advanced Weather Interactive Processing System II (AWIPS II) servers, which allows the 45 WS and AMU staff to customize the model output display on the AMU and Range Weather Operations (RWO) AWIPS II client computers and conduct real-time subjective analyses.
Writing in Chemistry: An Effective Learning Tool
NASA Astrophysics Data System (ADS)
Kovac, Jeffrey; Sherwood, Donna W.
1999-10-01
Writing is both a powerful learning tool and an important professional skill for chemists. We have developed a systematic approach to the integration of writing into the chemistry curriculum, which is described in detail in Writing Across the Chemistry Curriculum: A Faculty Handbook, available from the authors in a preliminary edition. The approach has been tested in high-enrollment sections of general chemistry at the University of Tennessee, Knoxville, with considerable success. This paper describes both the general approach and the specific implementation in the classroom.
Point-to-Point Multicast Communications Protocol
NASA Technical Reports Server (NTRS)
Byrd, Gregory T.; Nakano, Russell; Delagi, Bruce A.
1987-01-01
This paper describes a protocol to support point-to-point interprocessor communications with multicast. Dynamic, cut-through routing with local flow control is used to provide a high-throughput, low-latency communications path between processors. In addition, multicast transmissions are available, in which copies of a packet are sent to multiple destinations using common resources as much as possible. Special packet terminators and selective buffering are introduced to avoid deadlock during multicasts. A simulated implementation of the protocol is also described.
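The resource sharing described above, where copies of a multicast packet reuse links common to several destinations, can be sketched with a small link-counting example. This is only an illustration of the idea, not the protocol itself, which operates with dynamic cut-through routing and hardware flow control.

```python
def multicast_paths(routes):
    """Compare link traversals for separate unicasts vs. a multicast
    that duplicates the packet only where paths diverge.

    `routes` is a list of hop sequences, one per destination,
    all starting at the same source node."""
    # Separate unicasts: every path pays for all of its links.
    unicast = sum(len(path) - 1 for path in routes)
    # Multicast: each directed link is traversed once, shared prefixes
    # are reused and copies are made only at branch points.
    shared = set()
    for path in routes:
        for link in zip(path, path[1:]):
            shared.add(link)
    return unicast, len(shared)
```

For example, reaching destinations C and D from source A via a common intermediate node B costs four link traversals as two unicasts, but only three when the A-to-B hop is shared.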
Necessity of creating digital tools to ensure efficiency of technical means
NASA Astrophysics Data System (ADS)
Rakov, V. I.; Zakharova, O. V.
2018-05-01
The authors assess problems in the functioning of technical objects. The article notes that the increasing complexity of automation systems may increase the required redundant resource in proportion to the number of components and relationships in the system, and may demand constant change of that resource, which can make implementation of traditional redundancy structures (Standby System, Fault Tolerance, High Availability) unnecessarily costly. It proposes the idea of creating digital tools to ensure the efficiency of technical facilities.
Spectrum Savings from High Performance Recording and Playback Onboard the Test Article
2013-02-20
…execute within a Windows 7 environment, and data is recorded on SSDs. The underlying database is implemented using MySQL. … MySQL database. This is effectively the time at which the recorded data are available for retransmission. CPU and memory utilization were collected… 17.7%, MySQL avg. 3.9%, EQDR total avg. 21.6% (Table 1, CPU utilization with 260 Mbits/sec load). The difference between the total system CPU (27.8…
(abstract) The Galileo Spacecraft: A Telecommunications Legacy for Future Space Flight
NASA Technical Reports Server (NTRS)
Deutsch, Leslie J.
1997-01-01
The Galileo mission to Jupiter has implemented a wide range of telecommunication improvements in response to the loss of its high gain antenna. While necessity dictated the use of these new techniques for Galileo, now that they have been proven in flight, they are available for use on future deep space missions. This telecommunications legacy of Galileo will aid in our ability to conduct a meaningful exploration of the solar system, and beyond, at a reasonable cost.
Harvey, Gill; Llewellyn, Sue; Maniatopoulos, Greg; Boyd, Alan; Procter, Rob
2018-05-10
Accelerating the implementation of new technology in healthcare is typically complex and multi-faceted. One strategy is to charge a national agency with the responsibility for facilitating implementation. This study examines the role of such an agency in the English National Health Service. In particular, it compares two different facilitation strategies employed by the agency to support the implementation of insulin pump therapy. The research involved an empirical case study of four healthcare organisations receiving different levels of facilitation from the national agency: two received active hands-on facilitation; one was the intended recipient of a more passive, web-based facilitation strategy; the other implemented the technology without any external facilitation. The primary method of data collection was semi-structured qualitative interviews with key individuals involved in implementation. The integrated-PARIHS framework was applied as a conceptual lens to analyse the data. The two sites that received active facilitation from an Implementation Manager in the national agency made positive progress in implementing the technology. In both sites there was a high level of initial receptiveness to implementation. This was similar to a site that had successfully introduced insulin pump therapy without facilitation support from the national agency. By contrast, a site that did not have direct contact with the national agency made little progress with implementation, despite the availability of a web-based implementation resource. Clinicians expressed differences of opinion around the value and effectiveness of the technology and contextual barriers related to funding for implementation persisted. The national agency's intended roll out strategy using passive web-based facilitation appeared to have little impact. 
When favourable conditions exist, in terms of agreement around the value of the technology, clinician receptiveness and motivation to change, active facilitation via an external agency can help to structure the implementation process and address contextual barriers. Passive facilitation using web-based implementation resources appears less effective. Moving from initial implementation to wider scale-up presents challenges and is an issue that warrants further attention.
Career Education's Missing Link: Support Personnel
ERIC Educational Resources Information Center
Panther, Edward E.
1975-01-01
This article describes the need for career education support personnel in the planning and implementation of career education programs. In Project CHOICE (Comprehensive Humanistic Oriented Implementation of Career Education), the career specialist was available as a full-time resource person and proved essential to program implementation at the…
Fienen, Michael N.; D'Oria, Marco; Doherty, John E.; Hunt, Randall J.
2013-01-01
The application bgaPEST is a highly parameterized inversion software package implementing the Bayesian Geostatistical Approach in a framework compatible with the parameter estimation suite PEST. Highly parameterized inversion refers to cases in which parameters are distributed in space or time and are correlated with one another. The Bayesian aspect of bgaPEST is related to Bayesian probability theory, in which prior information about parameters is formally revised on the basis of the calibration dataset used for the inversion. Conceptually, this approach formalizes the conditionality of estimated parameters on the specific data and model available. The geostatistical component of the method refers to the way in which prior information about the parameters is used. A geostatistical autocorrelation function is used to enforce structure on the parameters to avoid overfitting and unrealistic results. The Bayesian Geostatistical Approach is designed to provide the smoothest solution that is consistent with the data. Optionally, users can specify a level of fit or estimate a balance between fit and model complexity informed by the data. Groundwater and surface-water applications are used as examples in this text, but the possible uses of bgaPEST extend to any distributed parameter applications.
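For a linear forward model, the posterior mean of the Bayesian Geostatistical Approach has a closed form, and the role of the autocorrelation prior can be shown in a few lines. The sketch below is a simplified illustration rather than bgaPEST code: a zero-order (constant) prior mean and an exponential covariance are assumed, and the forward operator H, noise covariance R and grid are hypothetical.

```python
import numpy as np

def exponential_cov(x, variance, corr_len):
    # Geostatistical prior covariance with exponential autocorrelation:
    # nearby parameters are strongly correlated, which enforces smooth,
    # spatially structured fields and discourages overfitting.
    d = np.abs(x[:, None] - x[None, :])
    return variance * np.exp(-d / corr_len)

def bga_estimate(H, y, Q, R, beta=0.0):
    # Posterior mean for a linear model y = H s + noise with prior
    # s ~ N(beta, Q):  s_hat = beta + Q H^T (H Q H^T + R)^(-1) (y - H beta)
    prior_mean = np.full(Q.shape[0], beta)
    S = H @ Q @ H.T + R                      # data-space covariance
    return prior_mean + Q @ H.T @ np.linalg.solve(S, y - H @ prior_mean)
```

With near-perfect observations the estimate reproduces the data; inflating the noise covariance R pulls the field back toward the smooth prior, which is the fit-versus-complexity balance the abstract describes.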
[Maintenance of a research tissue bank. (Infra)structural and quality aspects].
Schmitt, S; Kynast, K; Schirmacher, P; Herpel, E
2015-11-01
The availability of high-quality human tissue samples and access to the associated histopathological and clinical data are essential for biomedical research. It is therefore necessary to establish quality-assured tissue biobanks that provide high-quality tissue samples for research purposes. This entails quality concerns referring not only to the biomaterial specimen itself but to all procedures related to biobanking, including the implementation of structural components, e.g. ethical and legal guidelines, quality management documentation, as well as data and project management and information technology (IT) administration. Moreover, an integral aspect of tissue biobanks is the quality-assured evaluation of every tissue specimen that is stored and used for research projects, to guarantee high-quality biomaterial.
Orr, G; Roth, M
2012-08-01
A low-voltage (mV) electronically triggered spot welding system for fabricating fine thermocouples and thin sheets used in high-temperature characterization of materials' properties is suggested. The system is based on the capacitance-discharge method with a timed trigger for obtaining reliable and consistent welds. In contrast to existing techniques that employ high-voltage DC supplies for charging the capacitor, or supplies with positive and negative rails, this method uses a simple, standard dual power supply available in most physics laboratories or obtainable at low cost. In addition, an efficient and simple method of fabricating non-sticking electrodes that do not contaminate the weld area is suggested and implemented.
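The weld energy delivered by a capacitance-discharge system follows directly from the standard capacitor equations. The helpers below are a generic illustration of those relations; the component values in the example are hypothetical and are not taken from the paper.

```python
def discharge_energy(capacitance_f, voltage_v):
    # Energy stored in the capacitor bank: E = C * V^2 / 2, in joules.
    return 0.5 * capacitance_f * voltage_v ** 2

def rc_time_constant(resistance_ohm, capacitance_f):
    # RC time constant of the discharge loop, in seconds; the discharge
    # current decays to ~37% of its initial value after one time constant.
    return resistance_ohm * capacitance_f
```

For example, a (hypothetical) 1 F bank charged to 10 V stores 50 J, and with a 1 milliohm loop resistance the discharge time constant is 1 ms, which is the quantity a timed trigger must be matched against.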
Ernstsen, Christina L; Login, Frédéric H; Jensen, Helene H; Nørregaard, Rikke; Møller-Jensen, Jakob; Nejsum, Lene N
2017-10-01
Quantification of intracellular bacterial colonies is useful in strategies directed against bacterial attachment, subsequent cellular invasion and intracellular proliferation. An automated, high-throughput microscopy method was established to quantify the number and size of intracellular bacterial colonies in infected host cells (Detection and quantification of intracellular bacterial colonies by automated, high-throughput microscopy, Ernstsen et al., 2017 [1]). The infected cells were imaged with a 10× objective, and the number of intracellular bacterial colonies, their size distribution and the number of cell nuclei were automatically quantified using a spot-detection tool. The spot-detection output was exported to Excel, where data analysis was performed. In this article, micrographs and spot-detection data are made available to facilitate implementation of the method.
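The spot-detection step, thresholding the image and counting connected bright regions, can be approximated with a short scipy sketch. The original analysis used a dedicated spot-detection tool rather than this code, and the threshold and minimum-area parameters here are hypothetical placeholders that would need tuning to real micrographs.

```python
import numpy as np
from scipy import ndimage

def count_colonies(image, threshold, min_area=1):
    """Count bright connected regions ("colonies") in a 2-D image.

    Returns the number of regions at least `min_area` pixels large,
    together with the list of their pixel areas."""
    mask = image > threshold                 # binarize the channel
    labels, n = ndimage.label(mask)          # connected-component labeling
    areas = ndimage.sum(mask, labels, index=range(1, n + 1))
    keep = [a for a in areas if a >= min_area]
    return len(keep), keep
```

The same labeling pass yields a size distribution for free, mirroring the abstract's per-colony size output; applying it to a nuclear stain channel would likewise give the cell-nuclei count used for normalization.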
Serra, M; Pereiro, I; Yamada, A; Viovy, J-L; Descroix, S; Ferraro, D
2017-02-14
The sealing of microfluidic devices remains a complex and time-consuming process requiring specific equipment and protocols: a universal method is thus highly desirable. We propose here the use of a commercially available sealing tape as a robust, versatile, reversible solution, compatible with cell and molecular biology protocols, and requiring only the application of manually achievable pressures. The performance of the seal was tested with regard to the most commonly used chip materials. For most materials, the bonding resisted 5 bars at room temperature and 1 bar at 95 °C. This method should find numerous uses, ranging from fast prototyping in the laboratory to implementation in low-technology environments or industrial production.