Sample records for up-data program complexes

  1. lsjk—a C++ library for arbitrary-precision numeric evaluation of the generalized log-sine functions

    NASA Astrophysics Data System (ADS)

    Kalmykov, M. Yu.; Sheplyakov, A.

    2005-10-01

    Generalized log-sine functions Ls_j^(k)(θ) appear in the higher-order ɛ-expansion of different Feynman diagrams. We present an algorithm for the numerical evaluation of these functions for real arguments. This algorithm is implemented as a C++ library with arbitrary-precision arithmetic for integer 0⩽k⩽9 and j⩾2. Some new relations and representations of the generalized log-sine functions are given. Program summary: Title of program: lsjk. Catalogue number: ADVS. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADVS. Program obtained from: CPC Program Library, Queen's University of Belfast, N. Ireland. Licensing terms: GNU General Public License. Computers: all. Operating systems: POSIX. Programming language: C++. Memory required to execute: depends on the complexity of the problem; at least 32 MB RAM recommended. No. of lines in distributed program, including testing data, etc.: 41 975. No. of bytes in distributed program, including testing data, etc.: 309 156. Distribution format: tar.gz. Other programs called: the CLN library for arbitrary-precision arithmetic, version 1.1.5 or greater. External files needed: none. Nature of the physical problem: numerical evaluation of the generalized log-sine functions for real argument in the region 0<θ<π; these functions appear in Feynman integrals. Method of solution: series representation for real argument in the region 0<θ<π. Restriction on the complexity of the problem: limited up to Ls_j^(9)(θ), where j is an arbitrary integer; thus, all functions up to weight 12 in the region 0<θ<π can be evaluated. The algorithm can be extended to higher values of k (k>9) without modification. Typical running time: depends on the complexity of the problem; see text below.
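    As a rough illustration of what the library computes, here is a minimal numerical sketch in Python (mpmath). It evaluates the defining integral Ls_j^(k)(θ) = -∫_0^θ x^k ln^(j-1-k)|2 sin(x/2)| dx by direct quadrature rather than the series method lsjk itself uses, and sanity-checks Ls_2(θ) against the Clausen function Cl_2(θ).

    ```python
    # Minimal sketch: generalized log-sine by quadrature (not lsjk's series method).
    from mpmath import mp, quad, log, sin, fabs, clsin, pi

    mp.dps = 30  # working precision; lsjk itself relies on CLN arbitrary precision

    def ls(j, k, theta):
        """Ls_j^(k)(theta) = -integral_0^theta x^k * log^(j-1-k)|2 sin(x/2)| dx."""
        f = lambda x: x**k * log(fabs(2 * sin(x / 2)))**(j - 1 - k)
        return -quad(f, [0, theta])

    # sanity check: Ls_2(theta) coincides with the Clausen function Cl_2(theta)
    print(ls(2, 0, pi / 3))   # ~1.0149416...
    print(clsin(2, pi / 3))   # same value from mpmath's built-in Clausen function
    ```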

  2. Ripple Effect Mapping: A "Radiant" Way to Capture Program Impacts

    ERIC Educational Resources Information Center

    Kollock, Debra Hansen; Flage, Lynette; Chazdon, Scott; Paine, Nathan; Higgins, Lorie

    2012-01-01

    Learn more about a promising follow-up, participatory group process designed to document the results of Extension educational efforts within complex, real-life settings. The method, known as Ripple Effect Mapping, uses elements of Appreciative Inquiry, mind mapping, and qualitative data analysis to engage program participants and other community…

  3. Evaluating complex health financing interventions: using mixed methods to inform further implementation of a novel PBI intervention in rural Malawi.

    PubMed

    McMahon, Shannon A; Brenner, Stephan; Lohmann, Julia; Makwero, Christopher; Torbica, Aleksandra; Mathanga, Don P; Muula, Adamson S; De Allegri, Manuela

    2016-08-19

    Gaps remain in understanding how performance-based incentive (PBI) programs affect quality of care and service quantity, whether programs are cost-effective and how programs could be tailored to meet client and provider needs while remaining operationally viable. In 2014, Malawi's Ministry of Health launched the Service Delivery Integration-PBI (SSDI-PBI) program. The program is unique in that no portion of the performance bonuses is paid to individual health workers, and it shifts responsibility for infrastructure and equipment procurement from facility staff to implementing partners. This protocol outlines an approach that analyzes processes and outcomes, considers expected and unexpected consequences of the program and frames the program's outputs relative to its costs. Findings from this evaluation will inform the intended future scale-up of PBI in Malawi. This study employs a prospective controlled before-and-after triangulation design to assess effects of the PBI program by analyzing quantitative and qualitative data from intervention and control facilities. Guided by a theoretical framework, the evaluation consists of four main components: service provision, health worker motivation, implementation processes and costing. Quality and access outcomes are assessed along four dimensions: (1) structural elements (related to equipment, drugs, staff); (2) process elements (providers' compliance with standards); (3) outputs (service utilization); (4) experiential elements (experiences of service delivery). The costing component includes costs related to start-up, ongoing management, and the cost of incentives themselves. The cost analysis considers costs incurred by the Ministry of Health, funders, and the implementing agency. The evaluation relies on primary data (including interviews and surveys) and secondary data (including costing and health management information system data). Through the lens of a PBI program, we illustrate how complex interventions can be evaluated not only via primary, mixed-methods data collection but also through a wealth of secondary data from program implementers (including monitoring, evaluation and financial data) and the health system (including service utilization and service readiness data). We also highlight the importance of crafting a theory and using theory to inform the nature of data collected. Finally, we highlight the need to be responsive to stakeholders in order to enhance a study's relevance.

  4. The SAS-3 delayed command system

    NASA Technical Reports Server (NTRS)

    Hoffman, E. J.

    1975-01-01

    To meet the requirements arising from the increased complexity of the power, attitude control and telemetry systems, a fully redundant high-performance control section with delayed command capability was designed for the Small Astronomy Satellite-3 (SAS-3). The relay command system of SAS-3 is characterized by 56 bistate relay commands, with capability for handling up to 64 commands in future versions. The 'short' data command service of SAS-1 and SAS-2, consisting of shifting 24-bit words to two users, was expanded to five users and augmented with a 'long load' data command service (up to 4080 bits) used to program the telemetry system and the delayed command subsystem. The inclusion of a delayed command service allows a program of up to 30 relay or short data commands to be loaded for execution at designated times. The design and system operation of the SAS-3 command section are analyzed, with special attention given to the delayed command subsystem.

  5. LABORATORY PROCESS CONTROLLER USING NATURAL LANGUAGE COMMANDS FROM A PERSONAL COMPUTER

    NASA Technical Reports Server (NTRS)

    Will, H.

    1994-01-01

    The complex environment of the typical research laboratory requires flexible process control. This program provides natural language process control from an IBM PC or compatible machine. Process control schedules sometimes require frequent changes, even several times per day. These changes may include adding, deleting, and rearranging steps in a process. This program sets up a process control system that can either run without an operator or be run by workers with limited programming skills. The software system includes three programs. Two of the programs, written in FORTRAN77, record data and control research processes. The third program, written in Pascal, generates the FORTRAN subroutines used by the other two programs to identify the user commands with the user-written device drivers. The software system also includes an input data set which allows the user to define the user commands which are to be executed by the computer. To set the system up, the operator writes device-driver routines for all of the controlled devices. Once set up, this system requires only an input file containing natural language command lines which tell the system what to do and when to do it. Custom commands for operating and taking data from external research equipment can then run at any time of the day or night without the operator in attendance. This process control system requires a personal computer operating under MS-DOS with suitable hardware interfaces to all controlled devices. The program requires a FORTRAN77 compiler and user-written device drivers. This program was developed in 1989 and has a memory requirement of about 62 Kbytes.
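    Because the abstract describes the architecture without showing it, here is a hedged sketch of the table-driven idea in Python: natural-language command lines dispatched to user-written device drivers. All names and the command grammar are invented for illustration; the actual system generates FORTRAN77 glue code from a Pascal program.

    ```python
    # Hypothetical sketch: dispatch natural-language command lines to drivers.
    drivers = {
        'open valve': lambda arg: print(f'valve {arg} opened'),    # stand-ins for
        'read gauge': lambda arg: print(f'gauge {arg}: 3.7 psi'),  # device drivers
    }

    def execute(line):
        """Match the longest known command phrase at the start of the line."""
        for phrase in sorted(drivers, key=len, reverse=True):
            if line.lower().startswith(phrase):
                return drivers[phrase](line[len(phrase):].strip())
        raise ValueError(f'unknown command: {line!r}')

    for command in ['OPEN VALVE 2', 'READ GAUGE A']:   # lines from an input file
        execute(command)
    ```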

  6. A complexity-scalable software-based MPEG-2 video encoder.

    PubMed

    Chen, Guo-bin; Lu, Xin-ning; Wang, Xing-guo; Liu, Ji-lin

    2004-05-01

    With the development of general-purpose processors (GPPs) and video signal processing algorithms, it is possible to implement a software-based real-time video encoder on a GPP; low cost and easy upgrades are attracting developers to move video encoding from specialized hardware to more flexible software. In this paper, the encoding structure is first set up to support complexity scalability; then high-performance algorithms are applied to the key time-consuming modules in the coding process; finally, at the programming level, processor characteristics are exploited to improve data access efficiency and processing parallelism. Other programming methods, such as lookup tables, are adopted to reduce the computational complexity. Simulation results showed that these ideas could not only improve the global performance of video coding but also provide great flexibility in complexity regulation.
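    The lookup-table remark is the kind of optimization that is easy to show concretely. Below is a hedged sketch (illustrative ranges and quantizer, not the paper's actual tables): a per-sample quantize-and-clip computation is replaced by a single precomputed table indexed by the input value.

    ```python
    # Illustrative lookup table: precompute quantize+clip for every possible input.
    import numpy as np

    QP = 16                                   # quantizer step (illustrative)
    xs = np.arange(-256, 256)                 # all possible 9-bit inputs
    table = np.clip(np.sign(xs) * ((np.abs(xs) + QP // 2) // QP),
                    -128, 127).astype(np.int8)

    def quantize(block):
        """block: integer array with values in [-256, 255]."""
        return table[block + 256]             # one indexed load per sample

    print(quantize(np.array([-200, -17, 0, 17, 200])))   # -> [-13 -1 0 1 13]
    ```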

  7. Experiences in running a complex electronic data capture system using mobile phones in a large-scale population trial in southern Nepal.

    PubMed

    Style, Sarah; Beard, B James; Harris-Fry, Helen; Sengupta, Aman; Jha, Sonali; Shrestha, Bhim P; Rai, Anjana; Paudel, Vikas; Thondoo, Meelan; Pulkki-Brannstrom, Anni-Maria; Skordis-Worrall, Jolene; Manandhar, Dharma S; Costello, Anthony; Saville, Naomi M

    2017-01-01

    The increasing availability and capabilities of mobile phones make them a feasible means of data collection. Electronic Data Capture (EDC) systems have been used widely for public health monitoring and surveillance activities, but documentation of their use in complicated research studies requiring multiple systems is limited. This paper shares our experiences of designing and implementing a complex multi-component EDC system for a community-based four-armed cluster-randomised controlled trial in the rural plains of Nepal, to help other researchers planning to use EDC for complex studies in low-income settings. We designed and implemented three interrelated mobile phone data collection systems to enrol and follow up pregnant women (trial participants), and to support the implementation of trial interventions (women's groups, food and cash transfers). 720 field staff used basic phones to send simple coded text messages, 539 women's group facilitators used Android smartphones with Open Data Kit Collect, and 112 Interviewers, Coordinators and Supervisors used smartphones with CommCare. Barcoded photo ID cards encoded with participant information were generated for each enrolled woman. Automated systems were developed to download, recode and merge data for nearly real-time access by researchers. The systems were successfully rolled out and used by 1371 staff. A total of 25,089 pregnant women were enrolled, and 17,839 follow-up forms completed. Women's group facilitators recorded 5717 women's groups and the distribution of 14,647 food and 13,482 cash transfers. Using EDC sped up data collection and processing, although time needed for programming and set-up delayed the study inception. EDC using three interlinked mobile data management systems (FrontlineSMS, ODK and CommCare) was a feasible and effective method of data capture in a complex large-scale trial in the plains of Nepal. Despite challenges including prolonged set-up times, the systems met multiple data collection needs for users with varying levels of literacy and experience.

  8. Experiences in running a complex electronic data capture system using mobile phones in a large-scale population trial in southern Nepal

    PubMed Central

    Style, Sarah; Beard, B. James; Harris-Fry, Helen; Sengupta, Aman; Jha, Sonali; Shrestha, Bhim P.; Rai, Anjana; Paudel, Vikas; Thondoo, Meelan; Pulkki-Brannstrom, Anni-Maria; Skordis-Worrall, Jolene; Manandhar, Dharma S.; Costello, Anthony; Saville, Naomi M.

    2017-01-01

    The increasing availability and capabilities of mobile phones make them a feasible means of data collection. Electronic Data Capture (EDC) systems have been used widely for public health monitoring and surveillance activities, but documentation of their use in complicated research studies requiring multiple systems is limited. This paper shares our experiences of designing and implementing a complex multi-component EDC system for a community-based four-armed cluster-randomised controlled trial in the rural plains of Nepal, to help other researchers planning to use EDC for complex studies in low-income settings. We designed and implemented three interrelated mobile phone data collection systems to enrol and follow up pregnant women (trial participants), and to support the implementation of trial interventions (women’s groups, food and cash transfers). 720 field staff used basic phones to send simple coded text messages, 539 women’s group facilitators used Android smartphones with Open Data Kit Collect, and 112 Interviewers, Coordinators and Supervisors used smartphones with CommCare. Barcoded photo ID cards encoded with participant information were generated for each enrolled woman. Automated systems were developed to download, recode and merge data for nearly real-time access by researchers. The systems were successfully rolled out and used by 1371 staff. A total of 25,089 pregnant women were enrolled, and 17,839 follow-up forms completed. Women’s group facilitators recorded 5717 women’s groups and the distribution of 14,647 food and 13,482 cash transfers. Using EDC sped up data collection and processing, although time needed for programming and set-up delayed the study inception. EDC using three interlinked mobile data management systems (FrontlineSMS, ODK and CommCare) was a feasible and effective method of data capture in a complex large-scale trial in the plains of Nepal. Despite challenges including prolonged set-up times, the systems met multiple data collection needs for users with varying levels of literacy and experience. PMID:28613121

  9. Feasibility of a Team Approach to Complex Congenital Heart Defect Neurodevelopmental Follow-Up: Early Experience of a Combined Cardiology/Neonatal Intensive Care Unit Follow-Up Program.

    PubMed

    Chorna, Olena; Baldwin, H Scott; Neumaier, Jamie; Gogliotti, Shirley; Powers, Deborah; Mouvery, Amanda; Bichell, David; Maitre, Nathalie L

    2016-07-01

    Infants with complex congenital heart disease are at high risk for poor neurodevelopmental outcomes. However, implementation of dedicated congenital heart disease follow-up programs presents important infrastructure, personnel, and resource challenges. We present the development, implementation, and retrospective review of 1- and 2-year outcomes of a Complex Congenital Heart Defect Neurodevelopmental Follow-Up program. This program was a synergistic approach among the Pediatric Cardiology, Cardiothoracic Surgery, Pediatric Intensive Care, and Neonatal Intensive Care Unit Follow-Up teams, providing feasible and responsible utilization of existing infrastructure and personnel to develop and implement a program dedicated to children with congenital heart disease. Trained developmental testers administered the Ages and Stages Questionnaire-3 over the phone to the parents of all referred children at least once between 6 and 12 months' corrected age. At 18 months' corrected age, all children were scheduled in the Neonatal Intensive Care Unit Follow-Up Clinic for a visit with standardized neurological exams, Bayley III, multidisciplinary therapy evaluations and continued follow-up. Of the 132 patients identified in the Cardiothoracic Surgery database and at discharge from the hospital, a total of 106 infants were reviewed. A genetic syndrome was identified in 23.4% of the population. Neuroimaging abnormalities were identified in 21.7% of the cohort, with 12.8% having visibly severe insults. As a result, 23 (26.7%) received first-time referrals for early intervention services, and 16 (13.8%) received referrals for new services in addition to their existing ones. We concluded that utilization of existing resources in collaboration with established programs can ensure targeted neurodevelopmental follow-up for all children with complex congenital heart disease. © 2016 American Heart Association, Inc.

  10. OCCULT-ORSER complete conversational user-language translator

    NASA Technical Reports Server (NTRS)

    Ramapriyan, H. K.; Young, K.

    1981-01-01

    Translator program (OCCULT) assists non-computer-oriented users in setting up and submitting jobs for the complex ORSER system. ORSER is a collection of image processing programs for analyzing remotely sensed data. OCCULT is designed for those who would like to use ORSER but cannot justify acquiring and maintaining the necessary proficiency in Remote Job Entry Language, Job Control Language, and control-card formats. OCCULT is written in FORTRAN IV and OS Assembler for interactive execution.

  11. Systems approach to monitoring and evaluation guides scale up of the Standard Days Method of family planning in Rwanda.

    PubMed

    Igras, Susan; Sinai, Irit; Mukabatsinda, Marie; Ngabo, Fidele; Jennings, Victoria; Lundgren, Rebecka

    2014-05-01

    There is no guarantee that a successful pilot program introducing a reproductive health innovation can also be expanded successfully to the national or regional level, because the scaling-up process is complex and multilayered. This article describes how a successful pilot program to integrate the Standard Days Method (SDM) of family planning into existing Ministry of Health services was scaled up nationally in Rwanda. Much of the success of the scale-up effort was due to systematic use of monitoring and evaluation (M&E) data from several sources to make midcourse corrections. Four lessons learned illustrate this crucially important approach. First, ongoing M&E data showed that provider training protocols and client materials that worked in the pilot phase did not work at scale; therefore, we simplified these materials to support integration into the national program. Second, triangulation of ongoing monitoring data with national health facility and population-based surveys revealed serious problems in supply chain mechanisms that affected SDM (and the accompanying CycleBeads client tool) availability and use; new procedures for ordering supplies and monitoring stockouts were instituted at the facility level. Third, supervision reports and special studies revealed that providers were imposing unnecessary medical barriers to SDM use; refresher training and revised supervision protocols improved provider practices. Finally, informal environmental scans, stakeholder interviews, and key events timelines identified shifting political and health policy environments that influenced scale-up outcomes; ongoing advocacy efforts are addressing these issues. The SDM scale-up experience in Rwanda confirms the importance of monitoring and evaluating programmatic efforts continuously, using a variety of data sources, to improve program outcomes.

  12. Systems approach to monitoring and evaluation guides scale up of the Standard Days Method of family planning in Rwanda

    PubMed Central

    Igras, Susan; Sinai, Irit; Mukabatsinda, Marie; Ngabo, Fidele; Jennings, Victoria; Lundgren, Rebecka

    2014-01-01

    There is no guarantee that a successful pilot program introducing a reproductive health innovation can also be expanded successfully to the national or regional level, because the scaling-up process is complex and multilayered. This article describes how a successful pilot program to integrate the Standard Days Method (SDM) of family planning into existing Ministry of Health services was scaled up nationally in Rwanda. Much of the success of the scale-up effort was due to systematic use of monitoring and evaluation (M&E) data from several sources to make midcourse corrections. Four lessons learned illustrate this crucially important approach. First, ongoing M&E data showed that provider training protocols and client materials that worked in the pilot phase did not work at scale; therefore, we simplified these materials to support integration into the national program. Second, triangulation of ongoing monitoring data with national health facility and population-based surveys revealed serious problems in supply chain mechanisms that affected SDM (and the accompanying CycleBeads client tool) availability and use; new procedures for ordering supplies and monitoring stockouts were instituted at the facility level. Third, supervision reports and special studies revealed that providers were imposing unnecessary medical barriers to SDM use; refresher training and revised supervision protocols improved provider practices. Finally, informal environmental scans, stakeholder interviews, and key events timelines identified shifting political and health policy environments that influenced scale-up outcomes; ongoing advocacy efforts are addressing these issues. The SDM scale-up experience in Rwanda confirms the importance of monitoring and evaluating programmatic efforts continuously, using a variety of data sources, to improve program outcomes. PMID:25276581

  13. WATEQF; a FORTRAN IV version of WATEQ: a computer program for calculating chemical equilibrium of natural waters

    USGS Publications Warehouse

    Plummer, Niel; Jones, Blair F.; Truesdell, Alfred Hemingway

    1976-01-01

    WATEQF is a FORTRAN IV computer program that models the thermodynamic speciation of inorganic ions and complex species in solution for a given water analysis. The original version (WATEQ) was written in 1973 by A. H. Truesdell and B. F. Jones in Programming Language/One (PL/1). With but a few exceptions, the thermochemical data, speciation, coefficients, and general calculation procedure of WATEQF are identical to the PL/1 version. This report notes the differences between WATEQF and WATEQ, demonstrates how to set up the input data to execute WATEQF, provides a test case for comparison, and makes available a listing of WATEQF. (Woodard-USGS)
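    To make the idea of thermodynamic speciation concrete, here is a minimal Python sketch in the spirit of WATEQ/WATEQF (a toy carbonate system with standard 25 °C constants, not the program's full database or its activity corrections): total dissolved carbonate is distributed among species from the mass-action laws and the mass balance.

    ```python
    # Toy speciation: distribute total carbonate CT among species at a given pH.
    K1, K2 = 10**-6.35, 10**-10.33      # carbonic acid dissociation constants, 25 C

    def carbonate_speciation(CT, pH):
        H = 10**-pH
        denom = H * H + K1 * H + K1 * K2          # from mass action + mass balance
        a0, a1, a2 = H * H / denom, K1 * H / denom, K1 * K2 / denom
        return {'H2CO3*': CT * a0, 'HCO3-': CT * a1, 'CO3--': CT * a2}

    print(carbonate_speciation(CT=2e-3, pH=7.5))  # concentrations in mol/L
    ```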

  14. 2.0 Introduction to the Delaware River Basin pilot study

    Treesearch

    Peter S. Murdoch; Jennifer C. Jenkins; Richard A. Birdsey

    2008-01-01

    The past 20 years of environmental research have shown that the environment is not made up of discrete components acting independently, but rather it is a mosaic of complex relationships among air, land, water, living resources, and human activities. The data collection and analytical capabilities of current ecosystem assessment and monitoring programs are insufficient...

  15. An integrated tool for loop calculations: AITALC

    NASA Astrophysics Data System (ADS)

    Lorca, Alejandro; Riemann, Tord

    2006-01-01

    AITALC, a new tool for automating loop calculations in high energy physics, is described. The package creates Fortran code for two-fermion scattering processes automatically, starting from the generation and analysis of the Feynman graphs. We describe the modules of the tool and the intercommunication between them, and illustrate its use with three examples. Program summary: Title of the program: AITALC version 1.2.1 (9 August 2005). Catalogue identifier: ADWO. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADWO. Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland. Computer: PC i386. Operating system: GNU/Linux, tested on different distributions (SuSE 8.2 to 9.3, Red Hat 7.2, Debian 3.0, Ubuntu 5.04); also on Solaris. Programming language used: GNU Make, DIANA, FORM, FORTRAN77. Additional programs/libraries used: DIANA 2.35 (QGRAF 2.0), FORM 3.1, LoopTools 2.1 (FF). Memory required to execute with typical data: up to about 10 MB. No. of processors used: 1. No. of lines in distributed program, including test data, etc.: 40 926. No. of bytes in distributed program, including test data, etc.: 371 424. Distribution format: tar gzip file. High-speed storage required: from 1.5 to 30 MB, depending on modules present and unfolding of examples. Nature of the physical problem: calculation of differential cross sections for e+e- annihilation in one-loop approximation. Method of solution: generation and perturbative analysis of Feynman diagrams, with later evaluation of matrix elements and form factors. Restriction on the complexity of the problem: the limit of application is, for the moment, 2→2 particle reactions in the electroweak Standard Model. Typical running time: a few minutes, depending strongly on the complexity of the process and the Fortran compiler.

  16. Nuclear Weapons: Comprehensive Test Ban Treaty

    DTIC Science & Technology

    2006-07-10

    continued...) The complex could contain explosions of up to 500 pounds of explosive and associated plutonium. Another SCE, “Unicorn,” is to be conducted...scheduled for FY2006, as noted below. SCEs try to determine if radioactive decay of aged plutonium would degrade weapon performance. Several SCEs...Richardson called SCEs “a key part of our scientific program to provide new tools and data that assess age-related complications and maintain the reliability

  17. Impediments to Increasing Diversity in Post-Secondary Education

    ERIC Educational Resources Information Center

    Johnson, Carol Siri

    2007-01-01

    Due to the increasing complexity in the financial aid process and the movement of available financial aid up the economic scale, poor people and minorities have less access to college, including engineering programs. Some impediments are lack of access to knowledge about college, increasing complexity and up-front costs in the application process…

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wahanani, Nursinta Adi, E-mail: sintaadi@batan.go.id; Natsir, Khairina, E-mail: sintaadi@batan.go.id; Hartini, Entin, E-mail: sintaadi@batan.go.id

    Data processing software packages such as VSOP and MCNPX have been scientifically proven and are complete. The results of VSOP and MCNPX are huge and complex text files. In the analysis process, users need additional tools such as Microsoft Excel to present informative results. This research developed user-interface software for the output of VSOP and MCNPX. The VSOP program output is used to support neutronic analysis, and the MCNPX program output is used to support burn-up analysis. Software development used iterative development methods, which allow for revision and addition of features according to user needs. Processing time with this software is 500 times faster than with conventional methods using Microsoft Excel. Python is used as the programming language because Python is available for all major operating systems: Windows, Linux/Unix, OS/2, Mac, and Amiga, among others. Values that support neutronic analysis are k-eff, burn-up, and the masses of Pu-239 and Pu-241. Burn-up analysis uses the mass inventory values of the actinides (thorium, plutonium, neptunium and uranium). Values are visualized graphically to support analysis.
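    In the same spirit as the work described, and only as a sketch (the file name, format, and regular expression below are hypothetical; the real VSOP/MCNPX output layouts differ), a few lines of Python can harvest values such as k-eff versus burn-up from a large text output and plot them:

    ```python
    # Hypothetical sketch: scrape (burn-up, k-eff) pairs from a big text output.
    import re
    import matplotlib.pyplot as plt

    pairs = []
    with open('vsop_output.txt') as f:                 # hypothetical file name
        for line in f:
            m = re.search(r'BURNUP\s*=\s*([\d.Ee+-]+).*K-EFF\s*=\s*([\d.Ee+-]+)',
                          line)                        # illustrative pattern only
            if m:
                pairs.append((float(m.group(1)), float(m.group(2))))

    burnup, keff = zip(*pairs)
    plt.plot(burnup, keff, marker='o')
    plt.xlabel('burn-up (MWd/kg)')
    plt.ylabel('k-eff')
    plt.show()
    ```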

  19. The control and data acquisition structure for the GAMMA-400 space gamma-telescope

    NASA Astrophysics Data System (ADS)

    Arkhangelskiy, Andrey

    2016-07-01

    The GAMMA-400 space project is intended for precision investigation of cosmic gamma-emission in the energy band from the keV region up to several TeV, of electron and positron fluxes from ~1 GeV up to ~10 TeV, and of high-energy cosmic-ray nuclei fluxes. A description of the control and data acquisition structure for the gamma-telescope involved in the GAMMA-400 space project is given. The technical capabilities of all specialized equipment providing the functioning of the scientific instrumentation and satellite support systems are unified in a single structure. Control of the scientific instruments is maintained using one-time pulse radio commands and program commands transmitted via the onboard control system and the scientific data acquisition system. Up to 100 GByte of data per day can be transferred to the ground segment of the project. The correctness of the proposed and implemented structure, engineering solutions and electronic component base selection has been verified experimentally with the scientific complex prototype in laboratory conditions.

  20. The structure of control and data transfer management system for the GAMMA-400 scientific complex

    NASA Astrophysics Data System (ADS)

    Arkhangelskiy, A. I.; Bobkov, S. G.; Serdin, O. V.; Gorbunov, M. S.; Topchiev, N. P.

    2016-02-01

    A description of the control and data transfer management system for the scientific instrumentation involved in the GAMMA-400 space project is given. The technical capabilities of all specialized equipment to provide the functioning of the scientific instrumentation and satellite support systems are unified in a single structure. Control of the scientific instruments is maintained using one-time pulse radio commands, as well as program commands in the form of 16-bit code words, which are transmitted via the onboard control system and the scientific data acquisition system. Up to 100 GByte of data per day can be transferred to the ground segment of the project. The correctness of the proposed and implemented structure, engineering solutions and electronic component base selection has been verified by experimental testing of the prototype of the GAMMA-400 scientific complex in laboratory conditions.

  1. A tertiary care-primary care partnership model for medically complex and fragile children and youth with special health care needs.

    PubMed

    Gordon, John B; Colby, Holly H; Bartelt, Tera; Jablonski, Debra; Krauthoefer, Mary L; Havens, Peter

    2007-10-01

    To evaluate the impact of a tertiary care center special needs program that partners with families and primary care physicians to ensure seamless inpatient and outpatient care and assist in providing medical homes. Up to 3 years of preenrollment and postenrollment data were compared for patients in the special needs program from July 1, 2002, through June 30, 2005. A tertiary care center pediatric hospital and medical school serving urban and rural patients. A total of 227 of 230 medically complex and fragile children and youth with special needs who had a wide range of chronic disorders and were enrolled in the special needs program. Care coordination provided by a special needs program pediatric nurse case manager with or without a special needs program physician. Preenrollment and postenrollment tertiary care center resource utilization, charges, and payments. A statistically significant decrease was found in the number of hospitalizations, number of hospital days, and tertiary care center charges and payments, and an increase was found in the use of outpatient services. Aggregate data revealed a decrease in hospital days from 7926 to 3831, an increase in clinic visits from 3150 to 5420, and a decrease in tertiary care center payments of $10.7 million. The special needs program budget for fiscal year 2005 had a deficit of $400,000. This tertiary care-primary care partnership model improved health care and reduced costs with relatively modest institutional support.

  2. Human body segmentation via data-driven graph cut.

    PubMed

    Li, Shifeng; Lu, Huchuan; Shao, Xingqing

    2014-11-01

    Human body segmentation is a challenging and important problem in computer vision. Existing methods usually entail a time-consuming training phase for prior-knowledge learning, with complex shape matching for body segmentation. In this paper, we propose a data-driven method that integrates top-down body pose information and bottom-up low-level visual cues for segmenting humans in static images within the graph cut framework. The key idea of our approach is first to exploit human kinematics to search for body part candidates via dynamic programming, providing high-level evidence. Body-part classifiers then supply bottom-up cues of the human body distribution as low-level evidence. All the evidence collected from the top-down and bottom-up procedures is integrated in a graph cut framework for human body segmentation. Qualitative and quantitative experimental results demonstrate the merits of the proposed method in segmenting human bodies with arbitrary poses from cluttered backgrounds.
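    The dynamic-programming search over body-part candidates can be sketched generically. Assuming precomputed candidate scores (the unary and pairwise terms below are random stand-ins, not the paper's classifiers), a Viterbi-style pass over a chain of parts picks the best candidate per part:

    ```python
    # Viterbi-style DP over a chain of body parts (scores assumed precomputed).
    import numpy as np

    def best_chain(unary, pairwise):
        """unary[t]: score of each candidate for part t (higher is better);
        pairwise[t][i, j]: compatibility of candidate i (part t) with j (part t+1)."""
        dp, back = unary[0].copy(), []
        for t in range(1, len(unary)):
            scores = dp[:, None] + pairwise[t - 1] + unary[t][None, :]
            back.append(np.argmax(scores, axis=0))  # best predecessor per candidate
            dp = np.max(scores, axis=0)
        path = [int(np.argmax(dp))]
        for bp in reversed(back):                   # trace the best path backwards
            path.append(int(bp[path[-1]]))
        return path[::-1]                           # one candidate index per part

    rng = np.random.default_rng(1)
    unary = [rng.random(5) for _ in range(4)]       # 4 parts, 5 candidates each
    pairwise = [rng.random((5, 5)) for _ in range(3)]
    print(best_chain(unary, pairwise))
    ```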

  3. Error Prevention Aid

    NASA Technical Reports Server (NTRS)

    1987-01-01

    In a complex computer environment there is ample opportunity for error: a mistake by a programmer, or a software-induced undesirable side effect. In insurance, errors can cost a company heavily, so protection against inadvertent change is a must for the efficient firm. The data processing center at Transport Life Insurance Company has taken a step to guard against accidental changes by adopting a software package called EQNINT (Equations Interpreter Program). EQNINT cross-checks the basic formulas in a program against the formulas that make up the major production system. EQNINT assures that formulas are coded correctly and helps catch errors before they affect customer service or profitability.
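    The cross-checking idea generalizes well beyond EQNINT (which works on the source code of insurance systems). As a hedged sketch of the idea only, a symbolic-algebra check in Python can flag a coded formula that disagrees with its reference version:

    ```python
    # Sketch of the cross-check idea, not EQNINT itself: compare formulas symbolically.
    import sympy as sp

    x, y = sp.symbols('x y')
    coded     = sp.sympify('(x + y)**2')             # formula as coded in a program
    reference = sp.sympify('x**2 + 2*x*y + y**2')    # production-system formula
    print(sp.simplify(coded - reference) == 0)       # True means the formulas agree
    ```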

  4. Accounting for variations in ART program sustainability outcomes in health facilities in Uganda: a comparative case study analysis.

    PubMed

    Zakumumpa, Henry; Bennett, Sara; Ssengooba, Freddie

    2016-10-18

    Uganda implemented a national ART scale-up program at public and private health facilities between 2004 and 2009. Little is known about how and why some health facilities have sustained ART programs and why others have not sustained these interventions. The objective of the study was to identify facilitators and barriers to the long-term sustainability of ART programs at six health facilities in Uganda which received donor support to commence ART between 2004 and 2009. A case-study approach was adopted. Six health facilities were purposively selected for in-depth study from a national sample of 195 health facilities across Uganda which participated in an earlier study phase. The six health facilities were placed in three categories of sustainability: High Sustainers (2), Low Sustainers (2) and Non-Sustainers (2). Semi-structured interviews with ART clinic managers (N = 18) were conducted. Questionnaire data were analyzed (N = 12). Document review augmented respondent data. Based on the data generated, across-case comparative analyses were performed. Data were collected between February and June 2015. Several distinguishing features were found between the ART program characteristics of High Sustainers and those of Low and Non-Sustainers. High Sustainers had larger ART programs with higher staffing and patient volumes, a broader 'menu' of ART services and more stable program leadership compared to the other cases. High Sustainers associated sustained ART programs with multiple funding streams, robust ART program evaluation systems and having internal and external program champions. Low and Non-Sustainers reported similar barriers of shortage and attrition of ART-proficient staff, low capacity for ART program reporting, irregular and insufficient supply of ARV drugs and a lack of alignment between ART scale-up and their for-profit orientation in three of the cases. We found that ART program sustainability was embedded in a complex system involving dynamic interactions between internal drivers (program champion, staffing strength, M&E systems, goal clarity) and external drivers (donors, ARV supply chain, patient demand). ART program sustainability contexts were distinguished by the size of the health facility and ownership type. The study's implications for health systems strengthening in resource-limited countries are discussed.

  5. Programs for transferring data between a relational data base and a finite element structural analysis program

    NASA Technical Reports Server (NTRS)

    Johnson, S. C.

    1982-01-01

    An interface system for passing data between a relational information management (RIM) data base complex and the Engineering Analysis Language (EAL), a finite element structural analysis program, is documented. The interface system, implemented on a CDC Cyber computer, is composed of two FORTRAN programs called RIM2EAL and EAL2RIM. RIM2EAL reads model definition data from RIM and creates a file of EAL commands to define the model. EAL2RIM reads model definition and EAL-generated analysis data from EAL's data library and stores these data directly in a RIM data base. These two interface programs and the format for the RIM data complex are described.

  6. Closha: bioinformatics workflow system for the analysis of massive sequencing data.

    PubMed

    Ko, GunHwan; Kim, Pan-Gyu; Yoon, Jongcheol; Han, Gukhee; Park, Seong-Jin; Song, Wangho; Lee, Byungwook

    2018-02-19

    While next-generation sequencing (NGS) costs have fallen in recent years, the cost and complexity of computation remain substantial obstacles to the use of NGS in bio-medical care and genomic research. The rapidly increasing amounts of data available from the new high-throughput methods have made data processing infeasible without automated pipelines. The integration of data and analytic resources into workflow systems provides a solution to the problem by simplifying the task of data analysis. To address this challenge, we developed a cloud-based workflow management system, Closha, to provide fast and cost-effective analysis of massive genomic data. We implemented complex workflows making optimal use of high-performance computing clusters. Closha allows users to create multi-step analyses using drag and drop functionality and to modify the parameters of pipeline tools. Users can also import the Galaxy pipelines into Closha. Closha is a hybrid system that enables users to use both analysis programs providing traditional tools and MapReduce-based big data analysis programs simultaneously in a single pipeline. Thus, the execution of analytics algorithms can be parallelized, speeding up the whole process. We also developed a high-speed data transmission solution, KoDS, to transmit a large amount of data at a fast rate. KoDS has a file transfer speed of up to 10 times that of normal FTP and HTTP. The computer hardware for Closha is 660 CPU cores and 800 TB of disk storage, enabling 500 jobs to run at the same time. Closha is a scalable, cost-effective, and publicly available web service for large-scale genomic data analysis. Closha supports the reliable and highly scalable execution of sequencing analysis workflows in a fully automated manner. Closha provides a user-friendly interface to all genomic scientists to try to derive accurate results from NGS platform data. The Closha cloud server is freely available for use from http://closha.kobic.re.kr/ .

  7. OpenCluster: A Flexible Distributed Computing Framework for Astronomical Data Processing

    NASA Astrophysics Data System (ADS)

    Wei, Shoulin; Wang, Feng; Deng, Hui; Liu, Cuiyin; Dai, Wei; Liang, Bo; Mei, Ying; Shi, Congming; Liu, Yingbo; Wu, Jingping

    2017-02-01

    The volume of data generated by modern astronomical telescopes is extremely large and rapidly growing. However, current high-performance data processing architectures/frameworks are not well suited for astronomers because of their limitations and programming difficulties. In this paper, we therefore present OpenCluster, an open-source distributed computing framework to support rapidly developing high-performance processing pipelines of astronomical big data. We first detail the OpenCluster design principles and implementations and present the APIs facilitated by the framework. We then demonstrate a case in which OpenCluster is used to resolve complex data processing problems for developing a pipeline for the Mingantu Ultrawide Spectral Radioheliograph. Finally, we present our OpenCluster performance evaluation. Overall, OpenCluster provides not only high fault tolerance and simple programming interfaces, but also a flexible means of scaling up the number of interacting entities. OpenCluster thereby provides an easily integrated distributed computing framework for quickly developing a high-performance data processing system of astronomical telescopes and for significantly reducing software development expenses.

  8. The Paperless Solution

    NASA Technical Reports Server (NTRS)

    2001-01-01

    REI Systems, Inc. developed a software solution that uses the Internet to eliminate the paperwork typically required to document and manage complex business processes. The data management solution, called Electronic Handbooks (EHBs), is presently used for the entire SBIR program process at NASA. The EHB-based system is ideal for programs and projects whose users are geographically distributed and are involved in complex management processes and procedures. EHBs provide flexible access control and increased communications while maintaining security for systems of all sizes. Through Internet Protocol-based access, user authentication and user-based access restrictions, role-based access control, and encryption/decryption, EHBs provide the level of security required for confidential data transfer. EHBs contain electronic forms and menus, which can be used in real time to execute the described processes. EHBs use standard word processors that generate ASCII HTML code to set up electronic forms that are viewed within a web browser. EHBs require no end-user software distribution, significantly reducing operating costs. Each interactive handbook simulates a hard-copy version containing chapters with descriptions of participants' roles in the online process.

  9. Time-temperature-stress capabilities of composite materials for advanced supersonic technology application, phase 1

    NASA Technical Reports Server (NTRS)

    Kerr, J. R.; Haskins, J. F.

    1980-01-01

    Implementation of metal and resin matrix composites into supersonic vehicle usage is contingent upon accelerating the demonstration of service capacity and design technology. Because of the added material complexity and lack of extensive service data, laboratory replication of the flight service will provide the most rapid method of documenting the airworthiness of advanced composite systems. A program in progress to determine the time temperature stress capabilities of several high temperature composite materials includes thermal aging, environmental aging, fatigue, creep, fracture, and tensile tests as well as real time flight simulation exposure. The program has two parts. The first includes all the material property determinations and aging and simulation exposures up through 10,000 hours. The second continues these tests up to 50,000 cumulative hours. Results are presented of the 10,000 hour phase, which has now been completed.

  10. A Locality-Based Threading Algorithm for the Configuration-Interaction Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shan, Hongzhang; Williams, Samuel; Johnson, Calvin

    The Configuration Interaction (CI) method has been widely used to solve the non-relativistic many-body Schrodinger equation. One great challenge to implementing it efficiently on manycore architectures is its immense memory and data movement requirements. To address this issue, within each node, we exploit a hybrid MPI+OpenMP programming model in lieu of the traditional flat MPI programming model. In this paper, we develop optimizations that partition the workloads among OpenMP threads based on data locality, which is essential in ensuring applications with complex data access patterns scale well on manycore architectures. The new algorithm scales to 256 threads on the 64-core Intel Knights Landing (KNL) manycore processor and 24 threads on dual-socket Ivy Bridge (Xeon) nodes. Compared with the original implementation, the performance has been improved by up to 7× on the Knights Landing processor and 3× on the dual-socket Ivy Bridge node.

  11. A Locality-Based Threading Algorithm for the Configuration-Interaction Method

    DOE PAGES

    Shan, Hongzhang; Williams, Samuel; Johnson, Calvin; ...

    2017-07-03

    The Configuration Interaction (CI) method has been widely used to solve the non-relativistic many-body Schrodinger equation. One great challenge to implementing it efficiently on manycore architectures is its immense memory and data movement requirements. To address this issue, within each node, we exploit a hybrid MPI+OpenMP programming model in lieu of the traditional flat MPI programming model. In this paper, we develop optimizations that partition the workloads among OpenMP threads based on data locality, which is essential in ensuring applications with complex data access patterns scale well on manycore architectures. The new algorithm scales to 256 threads on the 64-core Intel Knights Landing (KNL) manycore processor and 24 threads on dual-socket Ivy Bridge (Xeon) nodes. Compared with the original implementation, the performance has been improved by up to 7× on the Knights Landing processor and 3× on the dual-socket Ivy Bridge node.
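    OpenMP is not reachable from a short illustration here, so the sketch below (Python threads standing in for OpenMP threads; all names invented) shows only the partitioning idea: each worker owns one contiguous block of the vector, so its reads and writes stay in a compact memory region rather than being scattered round-robin.

    ```python
    # Locality-based partitioning sketch: one contiguous block per thread.
    import numpy as np
    from concurrent.futures import ThreadPoolExecutor

    def apply_block(diag, vec, out, lo, hi):
        out[lo:hi] = diag[lo:hi] * vec[lo:hi]      # each thread touches one block

    def matvec(diag, vec, n_threads=4):
        out = np.empty_like(vec)
        bounds = np.linspace(0, len(vec), n_threads + 1, dtype=int)
        with ThreadPoolExecutor(n_threads) as pool:
            futures = [pool.submit(apply_block, diag, vec, out, lo, hi)
                       for lo, hi in zip(bounds[:-1], bounds[1:])]
            for f in futures:
                f.result()                         # propagate any worker errors
        return out

    n = 1_000_000
    diag, vec = np.ones(n), np.arange(n, dtype=float)
    print(matvec(diag, vec)[:5])
    ```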

  12. New Modular Ultrasonic Signal Processing Building Blocks for Real-Time Data Acquisition and Post Processing

    NASA Astrophysics Data System (ADS)

    Weber, Walter H.; Mair, H. Douglas; Jansen, Dion

    2003-03-01

    A suite of basic signal processors has been developed. These basic building blocks can be cascaded together to form more complex processors without the need for programming. The data structures between each of the processors are handled automatically. This allows a processor built for one purpose to be applied to any type of data such as images, waveform arrays and single values. The processors are part of Winspect Data Acquisition software. The new processors are fast enough to work on A-scan signals live while scanning. Their primary use is to extract features, reduce noise or to calculate material properties. The cascaded processors work equally well on live A-scan displays, live gated data or as a post-processing engine on saved data. Researchers are able to call their own MATLAB or C-code from anywhere within the processor structure. A built-in formula node processor that uses a simple algebraic editor may make external user programs unnecessary. This paper also discusses the problems associated with ad hoc software development and how graphical programming languages can tie up researchers writing software rather than designing experiments.
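    The cascading idea is easy to picture as code. A hedged sketch follows (class names invented; the real processors are configured graphically in Winspect, not programmed): each building block transforms an array, and a pipeline chains blocks without any per-combination programming.

    ```python
    # Sketch of cascadable signal-processing building blocks.
    import numpy as np

    class Rectify:
        def __call__(self, data):
            return np.abs(data)                    # full-wave rectification

    class MovingAverage:
        def __init__(self, n):
            self.kernel = np.ones(n) / n
        def __call__(self, data):
            return np.convolve(data, self.kernel, mode='same')

    class Pipeline:
        def __init__(self, *stages):
            self.stages = stages                   # data hand-off is automatic
        def __call__(self, data):
            for stage in self.stages:
                data = stage(data)
            return data

    envelope = Pipeline(Rectify(), MovingAverage(15))   # cascade two blocks
    ascan = np.sin(np.linspace(0, 60, 2000)) * np.exp(-np.linspace(0, 3, 2000))
    print(envelope(ascan).max())
    ```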

  13. Extension of Generalized Fluid System Simulation Program's Fluid Property Database

    NASA Technical Reports Server (NTRS)

    Patel, Kishan

    2011-01-01

    This internship focused on the development of additional capabilities for the Generalized Fluid System Simulation Program (GFSSP). GFSSP is a thermo-fluid code used to evaluate system performance by a finite volume-based network analysis method. The program was developed primarily to analyze the complex internal flow of propulsion systems and is capable of solving many problems related to thermodynamics and fluid mechanics. GFSSP is integrated with thermodynamic programs that provide fluid properties for sub-cooled, superheated, and saturation states. For fluids that are not included in the thermodynamic property program, look-up property tables can be provided. The look-up property tables of the current release version can only handle sub-cooled and superheated states. The primary purpose of the internship was to extend the look-up tables to handle saturated states. This involves a) generation of a property table using REFPROP, a thermodynamic property program that is widely used, and b) modifications of the Fortran source code to read in an additional property table containing saturation data for both saturated liquid and saturated vapor states. Also, a method was implemented to calculate the thermodynamic properties of user-fluids within the saturation region, given values of pressure and enthalpy. These additions required new code to be written, and older code had to be adjusted to accommodate the new capabilities. Ultimately, the changes will lead to the incorporation of this new capability in future versions of GFSSP. This paper describes the development and validation of the new capability.
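    The saturation-region extension lends itself to a small worked sketch. Assuming a REFPROP-style saturation table (the two-row water table below uses standard steam-table values and is purely illustrative, with simple linear interpolation in pressure), the thermodynamic quality follows from pressure and enthalpy, and two-phase properties are mixed from the saturated-liquid and saturated-vapor columns:

    ```python
    # Sketch: two-phase state from (P, h) using a tiny saturation table for water.
    import numpy as np

    P_tab  = np.array([101.325, 200.0])       # kPa
    hf_tab = np.array([419.0, 504.7])         # kJ/kg, saturated liquid enthalpy
    hg_tab = np.array([2676.1, 2706.6])       # kJ/kg, saturated vapor enthalpy
    vf_tab = np.array([0.001043, 0.001061])   # m^3/kg, saturated liquid volume
    vg_tab = np.array([1.673, 0.8857])        # m^3/kg, saturated vapor volume

    def two_phase_state(P, h):
        hf, hg = np.interp(P, P_tab, hf_tab), np.interp(P, P_tab, hg_tab)
        x = (h - hf) / (hg - hf)              # thermodynamic quality
        if not 0.0 <= x <= 1.0:
            raise ValueError('(P, h) is not inside the saturation dome')
        vf, vg = np.interp(P, P_tab, vf_tab), np.interp(P, P_tab, vg_tab)
        return x, vf + x * (vg - vf)          # quality, specific volume

    print(two_phase_state(P=150.0, h=1500.0))
    ```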

  14. Influence of maternal depression on household food insecurity for low-income families.

    PubMed

    Garg, Arvin; Toy, Sarah; Tripodis, Yorghos; Cook, John; Cordella, Nick

    2015-01-01

    To examine whether maternal depression predicts future household food insecurity for low-income families. This was a secondary data analysis using data from the Early Childhood Longitudinal Study, Birth Cohort (ECLS-B). The study cohort consisted of 2917 low-income mothers, defined as <185% federal poverty level, who were food secure at baseline. Maternal data collected when children were 9 and 24 months of age were used. Data at 9 months were considered baseline, and data at 24 months were considered follow-up. Baseline maternal depressive symptoms were measured by a 12-item abbreviated version of the Center for Epidemiologic Studies Depression Scale. Household food insecurity at follow-up was measured by the US Department of Agriculture Household Food Security Scale. At baseline, 16% of mothers were depressed (raw score >9). Most mothers were white, unemployed, and born in the United States. The majority received Special Supplemental Nutrition Program for Women, Infants, and Children (WIC) (86%); 39% received Supplemental Nutrition Assistance Program (SNAP). At follow-up, 11.8% of mothers reported household food insecurity. In multivariable analysis, maternal depression at baseline was significantly associated with food insecurity at follow-up (adjusted odds ratio 1.50; 95% confidence interval 1.06-2.12). Our results suggest that maternal depression is an independent risk factor for household food insecurity in low-income families with young children. Multidisciplinary interventions embedded within and outside the pediatric medical home should be developed to identify depressed mothers and link them to community-based mental health and food resources. Further longitudinal and interventional studies are needed to understand and address the complex relationship between poverty, maternal depression, social safety nets, and food insecurity. Copyright © 2015 Academic Pediatric Association. Published by Elsevier Inc. All rights reserved.

  15. Ursgal, Universal Python Module Combining Common Bottom-Up Proteomics Tools for Large-Scale Analysis.

    PubMed

    Kremer, Lukas P M; Leufken, Johannes; Oyunchimeg, Purevdulam; Schulze, Stefan; Fufezan, Christian

    2016-03-04

    Proteomics data integration has become a broad field, with a variety of programs offering innovative algorithms to analyze increasing amounts of data. Unfortunately, this software diversity leads to many problems as soon as the data is analyzed using more than one algorithm for the same task. Although it was shown that the combination of multiple peptide identification algorithms yields more robust results, unified approaches have emerged only recently; workflows that, for example, aim to optimize search parameters or employ cascaded-style searches can only be made accessible if data analysis becomes not only unified but also, most importantly, scriptable. Here we introduce Ursgal, a Python interface to many commonly used bottom-up proteomics tools and to additional auxiliary programs. Complex workflows can thus be composed in the Python scripting language using a few lines of code. Ursgal is easily extensible, and we have made several database search engines (X!Tandem, OMSSA, MS-GF+, Myrimatch, MS Amanda), statistical postprocessing algorithms (qvality, Percolator), and one algorithm that combines statistically postprocessed outputs from multiple search engines ("combined FDR") accessible as an interface in Python. Furthermore, we have implemented a new algorithm ("combined PEP") that combines multiple search engines, employing elements of "combined FDR", PeptideShaker, and Bayes' theorem.
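    As a hedged sketch of the Bayes' theorem ingredient (under a naive independence assumption across engines; Ursgal's actual "combined PEP" algorithm is more elaborate and is documented with the package), posterior error probabilities from several engines for the same peptide-spectrum match can be combined as follows:

    ```python
    # Naive-Bayes combination of per-engine posterior error probabilities (PEPs).
    import numpy as np

    def combined_pep(peps):
        peps = np.clip(np.asarray(peps, dtype=float), 1e-12, 1 - 1e-12)
        all_wrong = np.prod(peps)          # every engine made an error
        all_right = np.prod(1.0 - peps)    # every engine identified correctly
        return all_wrong / (all_wrong + all_right)

    print(combined_pep([0.10, 0.05, 0.20]))  # three engines agreeing on one PSM
    ```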

  16. 77 FR 61789 - Agency Information Collection Activities; Submission for OMB Review; Comment Request; Workforce...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-11

    ... Programs Gold Standard Evaluation Follow-Up Surveys, Veterans Study, and Cost Data,'' to the Office of... Standard Evaluation Follow-Up Surveys, Veterans Study, and Cost Data ACTION: Notice. SUMMARY: The... Worker Programs Gold Standard Evaluation Follow-Up Surveys, Veterans Study, and Cost Data. OMB ICR...

  17. CET89 - CHEMICAL EQUILIBRIUM WITH TRANSPORT PROPERTIES, 1989

    NASA Technical Reports Server (NTRS)

    Mcbride, B.

    1994-01-01

    Scientists and engineers need chemical equilibrium composition data to calculate the theoretical thermodynamic properties of a chemical system. This information is essential in the design and analysis of equipment such as compressors, turbines, nozzles, engines, shock tubes, heat exchangers, and chemical processing equipment. The substantial amount of numerical computation required to obtain equilibrium compositions and transport properties for complex chemical systems led scientists at NASA's Lewis Research Center to develop CET89, a program designed to calculate the thermodynamic and transport properties of these systems. CET89 is a general program which will calculate chemical equilibrium compositions and mixture properties for any chemical system with available thermodynamic data. Generally, mixtures may include condensed and gaseous products. CET89 performs the following operations: it 1) obtains chemical equilibrium compositions for assigned thermodynamic states, 2) calculates dilute-gas transport properties of complex chemical mixtures, 3) obtains Chapman-Jouguet detonation properties for gaseous species, 4) calculates incident and reflected shock properties in terms of assigned velocities, and 5) calculates theoretical rocket performance for both equilibrium and frozen compositions during expansion. The rocket performance function allows the option of assuming either a finite area or an infinite area combustor. CET89 accommodates problems involving up to 24 reactants, 20 elements, and 600 products (400 of which may be condensed). The program includes a library of thermodynamic and transport properties in the form of least squares coefficients for possible reaction products. It includes thermodynamic data for over 1300 gaseous and condensed species and transport data for 151 gases. The subroutines UTHERM and UTRAN convert thermodynamic and transport data to unformatted form for faster processing. The program conforms to the FORTRAN 77 standard, except for some input in NAMELIST format. It requires about 423 KB memory, and is designed to be used on mainframe, workstation, and mini computers. Due to its memory requirements, this program does not readily lend itself to implementation on MS-DOS based machines.
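    A toy version of the core computation can be sketched as constrained Gibbs-energy minimization. The three-species H/O system below uses invented dimensionless standard chemical potentials (real work needs CET89's thermodynamic library) and ideal-gas mixing at 1 bar:

    ```python
    # Toy Gibbs minimization: equilibrium of H2/O2/H2O under element balance.
    import numpy as np
    from scipy.optimize import minimize

    species = ['H2', 'O2', 'H2O']
    mu0_RT = np.array([-15.0, -18.0, -50.0])   # g0/RT, illustrative values only
    A = np.array([[2, 0, 2],                   # H atoms in each species
                  [0, 2, 1]])                  # O atoms in each species
    b = A @ np.array([1.0, 0.5, 0.0])          # elements fed in: 1 H2 + 0.5 O2

    def gibbs(n):
        n = np.maximum(n, 1e-12)               # keep logarithms finite
        return float(n @ (mu0_RT + np.log(n / n.sum())))   # ideal gas, P = 1 bar

    res = minimize(gibbs, x0=np.array([0.4, 0.2, 0.4]),
                   bounds=[(1e-12, None)] * 3,
                   constraints={'type': 'eq', 'fun': lambda n: A @ n - b})
    print(dict(zip(species, res.x.round(4))))  # nearly all H2O, as expected
    ```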

  18. A code for analysis of the fine structure in near-rigid weakly-bonded open-shell complexes that consist of a diatomic radical in a ³Σ state and a closed-shell molecule

    NASA Astrophysics Data System (ADS)

    Fawzy, Wafaa M.

    2010-10-01

    A FORTRAN code is developed for simulating and fitting the fine structure of a planar weakly-bonded open-shell complex that consists of a diatomic radical in a ³Σ electronic state and a diatomic or a polyatomic closed-shell molecule. The program sets up the proper total Hamiltonian matrix for a given J value and takes into account electron-spin-electron-spin and electron-spin-rotation interactions, and the quartic and sextic centrifugal distortion terms within the complex. Also, the R-dependence of the electron-spin-electron-spin and electron-spin-rotation couplings is considered. The code does not take account of effects of large-amplitude internal rotation of the diatomic radical within the complex. It is assumed that the complex has a well-defined equilibrium geometry, so that effects of large-amplitude motion are negligible. Therefore, the computer code is suitable for a near-rigid rotor. Numerical diagonalization of the matrix provides the eigenvalues and the eigenfunctions that are necessary for calculating energy levels, frequencies, relative intensities of infrared or microwave transitions, and expectation values of the quantum numbers within the complex. Goodness of all the quantum numbers, with the exception of J and parity, depends on the relative sizes of the products of the rotational constants and quantum numbers (i.e. BJ, CJ, and AK), the electron-spin-electron-spin and electron-spin-rotation couplings, and the geometry of the complex. Therefore, expectation values of the quantum numbers are calculated in the eigenfunction basis of the complex. The computational time for the least-squares fits has been significantly reduced by using the Hellmann-Feynman theorem for calculating the derivatives. The computer code is useful for analysis of high-resolution infrared and microwave spectra of a planar near-rigid weakly-bonded open-shell complex that contains a diatomic fragment in a ³Σ electronic state and a closed-shell molecule. The computer program was successfully applied to analysis and fitting of the observed high-resolution infrared spectra of the O2–HF/O2–DF and O2–N2O complexes. A test input file for simulating and fitting the high-resolution infrared spectrum of the O2–DF complex is provided. Program summary: Program title: TSIG_COMP. Catalogue identifier: AEGM_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEGM_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 10 030. No. of bytes in distributed program, including test data, etc.: 51 663. Distribution format: tar.gz. Programming language: Fortran 90, free format. Computer: SGI Origin 3400, workstations and PCs. Operating system: Linux, UNIX and Windows (see Restrictions below). RAM: case dependent. Classification: 16.2. Nature of problem: TSIG_COMP calculates frequencies, relative intensities, and expectation values of the various quantum numbers and parities of bound states involved in allowed ro-vibrational transitions in semi-rigid planar weakly-bonded open-shell complexes. The complexes of interest contain a free radical in a ³Σ state and a closed-shell partner, where the electron-spin-electron-spin interaction, electron-spin-rotation interaction, and centrifugal forces significantly modify the spectral patterns.
To date, ab initio methods are incapable of taking these effects into account to provide accurate predictions for the ro-vibrational energy levels of the complexes of interest. In the TSIG_COMP program, the problem is solved by using the proper effective Hamiltonian and molecular basis set. Solution method: The program uses a Hamiltonian operator that takes into account vibration, end-over-end rotation, electron-spin-electron-spin and electron-spin-rotation interactions, as well as the various centrifugal distortion terms. The Hamiltonian operator and the molecular basis set are used to set up the Hamiltonian matrix in the inertial axis system of the complex of interest. Diagonalization of the Hamiltonian matrix provides the eigenvalues and the eigenfunctions for the bound ro-vibrational states. These eigenvalues and eigenfunctions are used to calculate frequencies and relative intensities of the allowed infrared or microwave transitions, as well as expectation values of all the quantum numbers and parities of states involved in the transitions. The program employs least-squares fitting of the observed frequencies to the calculated frequencies to provide the molecular parameters that determine the geometry of the complex of interest. Restrictions: The number of parameters and transitions included in the fits is limited to 80 parameters and 200 transitions. These numbers can, however, be increased by adjusting the dimensions of the arrays (not recommended). Running the program under MS Windows is recommended for simulations of any number of transitions and for fitting a relatively small number of parameters and transitions (maximum 15 parameters and 82 transitions); for fitting a larger number of parameters, a run-time error may occur. Because spectra of weakly bonded complexes are recorded at low temperatures, in most cases the fits can be performed under MS Windows. Running time: Problem-dependent. The provided test input for Linux fits 82 transitions and 21 parameters; the actual run time is 62 minutes. The provided test input file for MS Windows fits 82 transitions and 15 parameters; the actual run time is 5 minutes.
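
    The derivative trick mentioned above is worth spelling out. When each fitted parameter enters the effective Hamiltonian linearly, the Hellmann-Feynman theorem gives dE_n/dp_k = <psi_n| dH/dp_k |psi_n>, so the full least-squares Jacobian follows from a single diagonalization per iteration rather than one re-diagonalization per parameter. A minimal NumPy sketch of this idea (not the distributed Fortran code; the linear parameterization and matrix sizes are assumptions for illustration):

    ```python
    import numpy as np

    def energies_and_jacobian(h_terms, params):
        """Eigenvalues and Hellmann-Feynman derivatives for a Hamiltonian
        assumed (for illustration) to be linear in its fit parameters:
        H(p) = sum_k p_k * H_k, with each H_k real symmetric."""
        H = sum(p * Hk for p, Hk in zip(params, h_terms))
        E, V = np.linalg.eigh(H)            # one diagonalization per iteration
        # dE_n/dp_k = <psi_n| H_k |psi_n>; the columns of V are eigenvectors
        jac = np.column_stack([np.einsum('in,ij,jn->n', V, Hk, V)
                               for Hk in h_terms])
        return E, jac

    # Toy usage with two random symmetric "interaction" matrices
    rng = np.random.default_rng(0)
    A = rng.standard_normal((6, 6)); A = (A + A.T) / 2
    B = rng.standard_normal((6, 6)); B = (B + B.T) / 2
    E, jac = energies_and_jacobian([A, B], [1.0, 0.05])
    ```

    Transition-frequency derivatives are then differences of rows of this Jacobian, which is what a Levenberg-Marquardt style fit of observed to calculated frequencies consumes.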

  19. Using a theory-driven approach to develop and evaluate a complex mental health intervention: the Friendship Bench project in Zimbabwe.

    PubMed

    Chibanda, Dixon; Verhey, Ruth; Munetsi, Epiphany; Cowan, Frances M; Lund, Crick

    2016-01-01

    There is a paucity of data on how to deliver complex interventions that seek to reduce the treatment gap for mental disorders, particularly in sub-Saharan Africa. Well-documented protocols that clearly describe the development and scale-up of programs and interventions are needed if such interventions are to be replicated elsewhere. This article describes the use of a theory of change (ToC) model to develop a brief psychological intervention for common mental disorders, and its evaluation through a cluster randomized controlled trial in Zimbabwe. A total of eight ToC workshops were held with a range of stakeholders over a 6-month period, with a focus on four key components of the program: formative work, piloting, evaluation and scale-up. A ToC map was developed as part of the process, with defined causal pathways leading to the desired impact. Interventions, indicators, assumptions and rationale for each point along the causal pathway were considered. Political buy-in from stakeholders, together with key resources including human, facility/infrastructure, communication and supervision resources, was identified as a critical need using the ToC approach. Ten key interventions with specific indicators, assumptions and rationale formed part of the final ToC map, which graphically illustrated the causal pathway leading to the development of a psychological intervention and the successful implementation of a cluster randomized controlled trial. ToC workshops can enhance stakeholder engagement through an iterative process leading to a shared vision, which can improve outcomes of complex mental health interventions, particularly where scaling up of the intervention is desired.

  20. Kranc: a Mathematica package to generate numerical codes for tensorial evolution equations

    NASA Astrophysics Data System (ADS)

    Husa, Sascha; Hinder, Ian; Lechner, Christiane

    2006-06-01

    We present a suite of Mathematica-based computer-algebra packages, termed "Kranc", which comprise a toolbox to convert certain (tensorial) systems of partial differential evolution equations to parallelized C or Fortran code for solving initial boundary value problems. Kranc can be used as a "rapid prototyping" system for physicists or mathematicians handling very complicated systems of partial differential equations, but through integration into the Cactus computational toolkit we can also produce efficient parallelized production codes. Our work is motivated by the field of numerical relativity, where Kranc is used as a research tool by the authors. In this paper we describe the design and implementation of both the Mathematica packages and the resulting code, we discuss some example applications, and we provide results on the performance of an example numerical code for the Einstein equations. Program summaryTitle of program: Kranc Catalogue identifier: ADXS_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/ADXS_v1_0 Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland Distribution format: tar.gz Computer for which the program is designed and others on which it has been tested: General computers which run Mathematica (for code generation) and Cactus (for numerical simulations), tested under Linux Programming language used: Mathematica, C, Fortran 90 Memory required to execute with typical data: This depends on the number of variables and the grid size; the included ADM example requires 4308 KB Has the code been vectorized or parallelized: The code is parallelized based on the Cactus framework. Number of bytes in distributed program, including test data, etc.: 1 578 142 Number of lines in distributed program, including test data, etc.: 11 711 Nature of physical problem: Solution of partial differential equations in three space dimensions, which are formulated as an initial value problem. In particular, the program is geared towards handling very complex tensorial equations as they appear, e.g., in numerical relativity. The worked out examples comprise the Klein-Gordon equations, the Maxwell equations, and the ADM formulation of the Einstein equations. Method of solution: The method of numerical solution is finite differencing and method of lines time integration; the numerical code is generated through a high level Mathematica interface. Restrictions on the complexity of the program: Typical numerical relativity applications will contain up to several dozen evolution variables and thousands of source terms; Cactus applications have shown scaling up to several thousand processors and grid sizes exceeding 500³. Typical running time: This depends on the number of variables and the grid size: the included ADM example takes approximately 100 seconds on a 1600 MHz Intel Pentium M processor. Unusual features of the program: Based on Mathematica and Cactus
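
    To make the generated solvers concrete, here is a hand-written method-of-lines sketch (illustrative Python, not Kranc output): the 1-D Klein-Gordon equation, one of the worked examples named above, discretized with centered finite differences in space and integrated in time with classical RK4.

    ```python
    import numpy as np

    # 1-D Klein-Gordon equation u_tt = u_xx - m^2 u in first-order form
    # (u, v = u_t) on a periodic grid; all parameters are illustrative.
    N, L, m = 400, 10.0, 1.0
    dx = L / N
    x = np.linspace(0, L, N, endpoint=False)

    def rhs(state):
        u, v = state
        u_xx = (np.roll(u, -1) - 2 * u + np.roll(u, 1)) / dx**2  # centered FD
        return np.array([v, u_xx - m**2 * u])

    def rk4_step(state, dt):
        k1 = rhs(state)
        k2 = rhs(state + 0.5 * dt * k1)
        k3 = rhs(state + 0.5 * dt * k2)
        k4 = rhs(state + dt * k3)
        return state + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

    state = np.array([np.exp(-(x - L / 2)**2), np.zeros(N)])  # Gaussian at rest
    dt = 0.25 * dx                                            # CFL-limited step
    for _ in range(1000):
        state = rk4_step(state, dt)
    ```

    Kranc's value is that it writes the analogue of `rhs` for you: tensorial equations entered in Mathematica become the much longer C or Fortran finite-difference kernels, wired into Cactus for parallelism.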

  1. SWOT Analysis of Total Sanitation Campaign in Yavatmal District of Maharashtra

    PubMed Central

    Pardeshi, Geeta; Shirke, Avinash; Jagtap, Minal

    2008-01-01

    Aims: To study the strengths, weaknesses, opportunities, and threats of the Total Sanitation Campaign (TSC) in the Yavatmal district of Maharashtra. Methodology: Data were collected in December 2006 through interviews with stakeholders, house-to-house surveys, focus group discussions, and transect walks. Information in each category was finalized in a meeting after brainstorming and discussion with the TSC cell members. Results: The strengths of the campaign were innovations in Information, Education and Communication, motivation through incentives, competitive spirit, active participation and partnerships, involvement of women, and universal coverage. The main weaknesses of the program were the absence of Rural Sanitary Marts/Production Centers, poor maintenance of Women Sanitary Complexes, lack of facilities for monitoring/follow-up, and the temporary focus of the campaign approach. There is an opportunity to tap additional resources, learn from other experiences, and institute back-up agencies to support and guide the community in the post-TSC phase. Possible threats to the program are a change in administration and local leadership, and a loss of the priority and interest needed to sustain momentum while scaling up the interventions. PMID:19876501

  2. SWOT Analysis of Total Sanitation Campaign in Yavatmal District of Maharashtra.

    PubMed

    Pardeshi, Geeta; Shirke, Avinash; Jagtap, Minal

    2008-10-01

    To study the strengths, weaknesses, opportunities, and threats of the Total Sanitation Campaign (TSC) in the Yavatmal district of Maharashtra. Data were collected in December 2006 through interviews with stakeholders, house-to-house surveys, focus group discussions, and transect walks. Information in each category was finalized in a meeting after brainstorming and discussion with the TSC cell members. The strengths of the campaign were innovations in Information, Education and Communication, motivation through incentives, competitive spirit, active participation and partnerships, involvement of women, and universal coverage. The main weaknesses of the program were the absence of Rural Sanitary Marts/Production Centers, poor maintenance of Women Sanitary Complexes, lack of facilities for monitoring/follow-up, and the temporary focus of the campaign approach. There is an opportunity to tap additional resources, learn from other experiences, and institute back-up agencies to support and guide the community in the post-TSC phase. Possible threats to the program are a change in administration and local leadership, and a loss of the priority and interest needed to sustain momentum while scaling up the interventions.

  3. Science information systems: Visualization

    NASA Technical Reports Server (NTRS)

    Wall, Ray J.

    1991-01-01

    Future programs in earth science, planetary science, and astrophysics will involve complex instruments that produce data at unprecedented rates and volumes. Current methods for data display, exploration, and discovery are inadequate. Visualization technology offers a means for the user to comprehend, explore, and examine complex data sets. The goal of this program is to increase the effectiveness and efficiency of scientists in extracting scientific information from large volumes of instrument data.

  4. Qualitative Analysis for Maintenance Process Assessment

    NASA Technical Reports Server (NTRS)

    Briand, Lionel; Kim, Yong-Mi; Melo, Walcelio; Seaman, Carolyn; Basili, Victor

    1996-01-01

    In order to improve software maintenance processes, we first need to be able to characterize and assess them. These tasks must be performed in depth and with objectivity since the problems are complex. One approach is to set up a measurement-based software process improvement program specifically aimed at maintenance. However, establishing a measurement program requires that one understands the problems to be addressed by the measurement program and is able to characterize the maintenance environment and processes in order to collect suitable and cost-effective data. Also, enacting such a program and getting usable data sets takes time. A short term substitute is therefore needed. We propose in this paper a characterization process aimed specifically at maintenance and based on a general qualitative analysis methodology. This process is rigorously defined in order to be repeatable and usable by people who are not acquainted with such analysis procedures. A basic feature of our approach is that actual implemented software changes are analyzed in order to understand the flaws in the maintenance process. Guidelines are provided and a case study is shown that demonstrates the usefulness of the approach.

  5. Testing of Environmental Satellite Bus-Instrument Interfaces Using Engineering Models

    NASA Technical Reports Server (NTRS)

    Gagnier, Don; Hayner, Rick; Roza, Michael; Nosek, Thomas; Razzaghi, Andrea

    2004-01-01

    This paper discusses the formulation and execution of a laboratory test of the electrical interfaces between multiple atmospheric science instruments and the spacecraft bus that carries them. The testing, performed in 2002, used engineering models of the instruments that will be flown on the Aura spacecraft and of the Aura spacecraft bus electronics. Aura is one of NASA's Earth Observing System (EOS) Program missions managed by the Goddard Space Flight Center. The test was designed to evaluate the complex interfaces in the spacecraft and instrument command and data handling (C&DH) subsystems prior to integration of the complete flight instruments on the spacecraft. A problem discovered during (and not before) the flight hardware integration phase can cause significant cost and schedule impacts. The testing successfully surfaced problems and led to their resolution before the full-up integration phase, saving significant cost and schedule time. This approach could be used on future environmental satellite programs involving multiple, complex scientific instruments being integrated onto a bus.

  6. Current state of high-risk infant follow-up care in the United States: results of a national survey of academic follow-up programs.

    PubMed

    Kuppala, V S; Tabangin, M; Haberman, B; Steichen, J; Yolton, K

    2012-04-01

    High-risk infant follow-up programs have the potential to act as multipurpose clinics by providing continuity of clinical care, educating health care trainees, and facilitating outcomes research. Currently there are no nationally representative data on high-risk infant follow-up practices in the United States. The objective of this study was to collect information about the composition of high-risk infant follow-up programs associated with academic centers in the United States, with respect to their structure, function, funding sources and developmental assessment practices, and to identify the barriers to the establishment of such programs. Staff neonatologists, follow-up program directors and division directors of 170 Neonatal Intensive Care Units (NICUs) associated with pediatric residency programs were invited to participate in an anonymous online survey from October 2009 to January 2010. The overall response rate was 84%. Ninety-three percent of the respondents have a follow-up program associated with their NICU. Birth weight, gestational age and critical illness in the NICU were the major criteria for follow-up care. Management of nutrition and neurodevelopmental assessment were the most common services provided. Over 70% have health care trainees in the clinic. About 75% of the respondents have neurodevelopmental outcome data available. Most of the respondents reported multiple funding sources. Lack of personnel and funding were the most common reasons for not having a follow-up program. High-risk infant follow-up programs associated with academic centers in the United States are functioning as multidisciplinary programs providing clinical care and trainee education and facilitating outcomes research.

  7. Beyond Adoption: A New Framework for Theorizing and Evaluating Nonadoption, Abandonment, and Challenges to the Scale-Up, Spread, and Sustainability of Health and Care Technologies

    PubMed Central

    Greenhalgh, Trisha; Wherton, Joseph; Papoutsi, Chrysanthi; Lynch, Jennifer; Hughes, Gemma; A'Court, Christine; Hinder, Susan; Fahy, Nick; Procter, Rob; Shaw, Sara

    2017-01-01

    Background Many promising technological innovations in health and social care are characterized by nonadoption or abandonment by individuals or by failed attempts to scale up locally, spread distantly, or sustain the innovation long term at the organization or system level. Objective Our objective was to produce an evidence-based, theory-informed, and pragmatic framework to help predict and evaluate the success of a technology-supported health or social care program. Methods The study had 2 parallel components: (1) secondary research (hermeneutic systematic review) to identify key domains, and (2) empirical case studies of technology implementation to explore, test, and refine these domains. We studied 6 technology-supported programs—video outpatient consultations, global positioning system tracking for cognitive impairment, pendant alarm services, remote biomarker monitoring for heart failure, care organizing software, and integrated case management via data sharing—using longitudinal ethnography and action research for up to 3 years across more than 20 organizations. Data were collected at micro level (individual technology users), meso level (organizational processes and systems), and macro level (national policy and wider context). Analysis and synthesis was aided by sociotechnically informed theories of individual, organizational, and system change. The draft framework was shared with colleagues who were introducing or evaluating other technology-supported health or care programs and refined in response to feedback. Results The literature review identified 28 previous technology implementation frameworks, of which 14 had taken a dynamic systems approach (including 2 integrative reviews of previous work). Our empirical dataset consisted of over 400 hours of ethnographic observation, 165 semistructured interviews, and 200 documents. The final nonadoption, abandonment, scale-up, spread, and sustainability (NASSS) framework included questions in 7 domains: the condition or illness, the technology, the value proposition, the adopter system (comprising professional staff, patient, and lay caregivers), the organization(s), the wider (institutional and societal) context, and the interaction and mutual adaptation between all these domains over time. Our empirical case studies raised a variety of challenges across all 7 domains, each classified as simple (straightforward, predictable, few components), complicated (multiple interacting components or issues), or complex (dynamic, unpredictable, not easily disaggregated into constituent components). Programs characterized by complicatedness proved difficult but not impossible to implement. Those characterized by complexity in multiple NASSS domains rarely, if ever, became mainstreamed. The framework showed promise when applied (both prospectively and retrospectively) to other programs. Conclusions Subject to further empirical testing, NASSS could be applied across a range of technological innovations in health and social care. It has several potential uses: (1) to inform the design of a new technology; (2) to identify technological solutions that (perhaps despite policy or industry enthusiasm) have a limited chance of achieving large-scale, sustained adoption; (3) to plan the implementation, scale-up, or rollout of a technology program; and (4) to explain and learn from program failures. PMID:29092808

  8. Selection and visualisation of outcome measures for complex post-acute acquired brain injury rehabilitation interventions

    PubMed Central

    Ford, Catherine Elaine Longworth; Malley, Donna; Bateman, Andrew; Clare, Isabel C.H.; Wagner, Adam P.; Gracey, Fergus

    2016-01-01

    Background Outcome measurement challenges rehabilitation services to select tools that promote stakeholder engagement in measuring complex interventions. Objectives To examine the suitability of outcome measures for complex post-acute acquired brain injury (ABI) rehabilitation interventions, report outcomes of a holistic, neuropsychological ABI rehabilitation program, and propose a simple way of visualizing complex outcomes. Methods Patient/carer-reported outcome measures (PROMs), experience measures (PREMs) and staff-rated measures were collected for consecutive admissions over 1 year to an 18-week holistic, neuropsychological rehabilitation programme at baseline, 18 weeks, and 3- and 6-month follow-up. Results Engagement with outcome measurement was poorest for carers, and at follow-up for all stakeholders. Dependence, abilities, adjustment, unmet needs, symptomatology including executive dysfunction, and self-reassurance showed improvements at 18 weeks. Adjustment, social participation, perceived health, symptomatology including dysexecutive difficulties, and anxiety were worse at baseline for those who did not complete rehabilitation than for those who did. A radar plot facilitated outcome visualization. Conclusions Engagement with outcome measurement was best when time and support were provided. Supplementing patient-rated measures with staff-rated and attendance measures may explain missing data and help quantify healthcare needs. The MPAI-4, EBIQ and DEX-R appeared to be suitable measures for evaluating outcomes and distinguishing those completing and not completing neuropsychological rehabilitation. PMID:27341362

  9. Can a Home-based Cardiac Physical Activity Program Improve the Physical Function Quality of Life in Children with Fontan Circulation?

    PubMed

    Jacobsen, Roni M; Ginde, Salil; Mussatto, Kathleen; Neubauer, Jennifer; Earing, Michael; Danduran, Michael

    2016-01-01

    Patients after the Fontan operation for complex congenital heart disease (CHD) have decreased exercise capacity and report reduced health-related quality of life (HRQOL). Studies suggest hospital-based cardiac physical activity programs can improve HRQOL and exercise capacity in patients with CHD; however, these programs have variable adherence rates. The impact of a home-based cardiac physical activity program in Fontan survivors is unclear. This pilot study evaluated the safety, feasibility, and benefits of an innovative home-based physical activity program on HRQOL in Fontan patients. A total of 14 children aged 8-12 years with Fontan circulation were enrolled in a 12-week moderate/high-intensity home-based cardiac physical activity program, which included a home exercise routine and 3 formalized in-person exercise sessions at 0, 6, and 12 weeks. Subjects and parents completed validated questionnaires to assess HRQOL. The Shuttle Test Run was used to measure exercise capacity. A Fitbit Flex Activity Monitor was used to assess adherence to the home activity program. Of the 14 patients, 57% were male and 36% had a dominant left ventricle. Overall, 93% completed the program. There were no adverse events. Parents reported significant improvement in their child's overall HRQOL (P < .01), physical function (P < .01), school function (P = .01), and psychosocial function (P < .01). Patients reported no improvement in HRQOL. Exercise capacity, measured by total shuttles and exercise time in the Shuttle Test Run and calculated VO2 max, improved progressively from baseline to the 6- and 12-week follow-up sessions. Monthly Fitbit data suggested adherence to the program. This 12-week home-based cardiac physical activity program is safe and feasible in preteen Fontan patients. Parent proxy-reported HRQOL and objective measures of exercise capacity significantly improved. A 6-month follow-up session is scheduled to assess sustainability. A larger study is needed to determine the applicability and reproducibility of these findings in other age groups and forms of complex CHD. © 2016 Wiley Periodicals, Inc.

  10. Research: Detailed and Selective Follow-up of Students for Improvement of Programs/Program Components in Business & Office Education and Marketing & Distributive Education. Final Report.

    ERIC Educational Resources Information Center

    Scott, Gary D.; Chapman, Alberta

    The Kentucky student follow-up system was studied to identify the current status of follow-up activities in business and office education and marketing and distributive education; to identify the impact of follow-up data on these programs; to identify program components for which detailed follow-up can provide information to assist in program…

  11. A complex of algorithms and programs for solving inverse problems of artificial Earth satellite dynamics using parallel computations (Russian Title: Программно-математическое обеспечение для решения обратных задач динамики ИСЗ с использованием параллельных вычислений)

    NASA Astrophysics Data System (ADS)

    Chuvashov, I. N.

    2011-07-01

    In this paper, a complex of algorithms and programs for solving inverse problems of artificial Earth satellite dynamics is described. The complex is intended for satellite orbit improvement, calculation of motion-model parameters, and related tasks. The program complex was developed for the "Skiff Cyberia" cluster. Results of numerical experiments obtained by using the new complex together with the program "Numerical model of the system artificial satellites motion" are also presented.

  12. Community Health Workers in Low- and Middle-Income Countries: What Do We Know About Scaling Up and Sustainability?

    PubMed Central

    Minhas, Dilpreet; Pérez-Escamilla, Rafael; Taylor, Lauren; Curry, Leslie; Bradley, Elizabeth H.

    2013-01-01

    Objectives. We sought to provide a systematic review of the determinants of success in scaling up and sustaining community health worker (CHW) programs in low- and middle-income countries (LMICs). Methods. We searched 11 electronic databases for academic literature published through December 2010 (n = 603 articles). Two independent reviewers applied exclusion criteria to identify articles that provided empirical evidence about the scale-up or sustainability of CHW programs in LMICs, then extracted data from each article by using a standardized form. We analyzed the resulting data for determinants and themes through iterated categorization. Results. The final sample of articles (n = 19) presents data on CHW programs in 16 countries. We identified 23 enabling factors and 15 barriers to scale-up and sustainability, which were grouped into 3 thematic categories: program design and management, community fit, and integration with the broader environment. Conclusions. Scaling up and sustaining CHW programs in LMICs requires effective program design and management, including adequate training, supervision, motivation, and funding; acceptability of the program to the communities served; and securing support for the program from political leaders and other health care providers. PMID:23678926

  13. Multimorbidity: A Review of the Complexity of Mental Health Issues in Bariatric Surgery Candidates Informed by Canadian Data.

    PubMed

    Taylor, Valerie H; Hensel, Jennifer

    2017-08-01

    Multimorbidity involving obesity and mental illness is significant. As a consequence, mental illness is overrepresented in patients seeking bariatric surgery. This review addresses that overlap, with a focus on Canadian data. The healthcare system in Canada is unique, but issues related to the prevalence of mental illness in patients seeking bariatric surgery are similar to those in studies conducted in other countries. Although data on suicide are lacking, Canadian data have shown similar rates of self-harm behaviours and linkages between psychopathology and weight regain after surgery. Geographic issues that make it difficult for individuals to attend regular follow-up appointments have encouraged the use of e-health tools to engage patients and ensure access to follow-up care, which may provide unique opportunities going forward. Additional work is needed to inform best practices in the Canadian system, but in keeping with other data, the consistent message from Canada is that appropriate evaluation and aftercare are essential components of a well-informed bariatric program. Copyright © 2017 Diabetes Canada. Published by Elsevier Inc. All rights reserved.

  14. Treatment outcomes for human African Trypanosomiasis in the Democratic Republic of the Congo: analysis of routine program data from the world's largest sleeping sickness control program.

    PubMed

    Hasker, E; Mpanya, A; Makabuza, J; Mbo, F; Lumbala, C; Kumpel, J; Claeys, Y; Kande, V; Ravinetto, R; Menten, J; Lutumba, P; Boelaert, M

    2012-09-01

    To enable the human African trypanosomiasis (HAT) control program of the Democratic Republic of the Congo to generate data on treatment outcomes, an electronic database was developed. The database was piloted in two provinces, Bandundu and Kasai Oriental. In this study, we analysed routine data from the two provinces for the period 2006-2008. Data were extracted from case declaration cards and monthly reports available at national and provincial HAT coordination units and entered into the database. Data were retrieved for 15 086 of the 15 741 cases reported in the two provinces for the period (96%). Compliance with post-treatment follow-up was very poor in both provinces; only 25% had undergone at least one post-treatment follow-up examination, and <1% had undergone the required four follow-up examinations. Relapse rates among those presenting for follow-up were high in Kasai (18%) but low in Bandundu (0.3%). The high relapse rates in Kasai and poor compliance with post-treatment follow-up in both provinces are important problems that the HAT control program urgently needs to address. Moreover, in analogy to tuberculosis control programs, HAT control programs need to adopt a recording and reporting routine that includes reporting on treatment outcomes. © 2012 Blackwell Publishing Ltd.

  15. Beyond Adoption: A New Framework for Theorizing and Evaluating Nonadoption, Abandonment, and Challenges to the Scale-Up, Spread, and Sustainability of Health and Care Technologies.

    PubMed

    Greenhalgh, Trisha; Wherton, Joseph; Papoutsi, Chrysanthi; Lynch, Jennifer; Hughes, Gemma; A'Court, Christine; Hinder, Susan; Fahy, Nick; Procter, Rob; Shaw, Sara

    2017-11-01

    Many promising technological innovations in health and social care are characterized by nonadoption or abandonment by individuals or by failed attempts to scale up locally, spread distantly, or sustain the innovation long term at the organization or system level. Our objective was to produce an evidence-based, theory-informed, and pragmatic framework to help predict and evaluate the success of a technology-supported health or social care program. The study had 2 parallel components: (1) secondary research (hermeneutic systematic review) to identify key domains, and (2) empirical case studies of technology implementation to explore, test, and refine these domains. We studied 6 technology-supported programs-video outpatient consultations, global positioning system tracking for cognitive impairment, pendant alarm services, remote biomarker monitoring for heart failure, care organizing software, and integrated case management via data sharing-using longitudinal ethnography and action research for up to 3 years across more than 20 organizations. Data were collected at micro level (individual technology users), meso level (organizational processes and systems), and macro level (national policy and wider context). Analysis and synthesis was aided by sociotechnically informed theories of individual, organizational, and system change. The draft framework was shared with colleagues who were introducing or evaluating other technology-supported health or care programs and refined in response to feedback. The literature review identified 28 previous technology implementation frameworks, of which 14 had taken a dynamic systems approach (including 2 integrative reviews of previous work). Our empirical dataset consisted of over 400 hours of ethnographic observation, 165 semistructured interviews, and 200 documents. The final nonadoption, abandonment, scale-up, spread, and sustainability (NASSS) framework included questions in 7 domains: the condition or illness, the technology, the value proposition, the adopter system (comprising professional staff, patient, and lay caregivers), the organization(s), the wider (institutional and societal) context, and the interaction and mutual adaptation between all these domains over time. Our empirical case studies raised a variety of challenges across all 7 domains, each classified as simple (straightforward, predictable, few components), complicated (multiple interacting components or issues), or complex (dynamic, unpredictable, not easily disaggregated into constituent components). Programs characterized by complicatedness proved difficult but not impossible to implement. Those characterized by complexity in multiple NASSS domains rarely, if ever, became mainstreamed. The framework showed promise when applied (both prospectively and retrospectively) to other programs. Subject to further empirical testing, NASSS could be applied across a range of technological innovations in health and social care. It has several potential uses: (1) to inform the design of a new technology; (2) to identify technological solutions that (perhaps despite policy or industry enthusiasm) have a limited chance of achieving large-scale, sustained adoption; (3) to plan the implementation, scale-up, or rollout of a technology program; and (4) to explain and learn from program failures. ©Trisha Greenhalgh, Joseph Wherton, Chrysanthi Papoutsi, Jennifer Lynch, Gemma Hughes, Christine A'Court, Susan Hinder, Nick Fahy, Rob Procter, Sara Shaw. 
Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 01.11.2017.

  16. Computer assisted thermal-vacuum testing

    NASA Technical Reports Server (NTRS)

    Petrie, W.; Mikk, G.

    1977-01-01

    In testing complex systems and components under dynamic thermal-vacuum environments, it is desirable to optimize the environment control sequence in order to reduce test duration and cost. This paper describes an approach in which a computer is utilized as part of the test control operation. Real-time test data are made available to the computer through time-sharing terminals at appropriate time intervals. A mathematical model of the test article and environmental control equipment then operates on the real-time data to yield current thermal status, temperature analysis, trend prediction, and recommended thermal control setting changes needed to arrive at the required thermal condition. The data acquisition interface and the time-sharing hook-up to an IBM-370 computer are described, along with a typical control program and data demonstrating its use.
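
    The control-loop computation is easy to sketch. The hypothetical Python fragment below fits a linear trend to recent telemetry and extrapolates the time remaining to a target temperature; the actual system ran a full thermal math model of the test article and shroud on the IBM-370, so this shows only the shape of the idea.

    ```python
    import numpy as np

    def predict_time_to_target(times_min, temps_c, target_c):
        """Least-squares linear trend through recent telemetry, extrapolated
        to a target temperature (a hypothetical stand-in for the paper's
        thermal math model)."""
        slope, intercept = np.polyfit(times_min, temps_c, 1)  # deg C / minute
        if abs(slope) < 1e-9:
            return None                         # no measurable trend
        return (target_c - intercept) / slope   # minutes from t = 0

    times = np.array([0.0, 5.0, 10.0, 15.0])    # minutes since setpoint change
    temps = np.array([20.0, 24.5, 28.6, 32.3])  # chamber telemetry, deg C
    eta = predict_time_to_target(times, temps, 50.0)
    print(f"predicted time to reach 50 C: {eta:.1f} min")
    # If eta exceeds the test plan's allowance, the operator would be advised
    # to change the environmental control settings, as the paper describes.
    ```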

  17. Air Traffic Complexity Measurement Environment (ACME): Software User's Guide

    NASA Technical Reports Server (NTRS)

    1996-01-01

    A user's guide for the Air Traffic Complexity Measurement Environment (ACME) software is presented. The ACME consists of two major components, a complexity analysis tool and user interface. The Complexity Analysis Tool (CAT) analyzes complexity off-line, producing data files which may be examined interactively via the Complexity Data Analysis Tool (CDAT). The Complexity Analysis Tool is composed of three independently executing processes that communicate via PVM (Parallel Virtual Machine) and Unix sockets. The Runtime Data Management and Control process (RUNDMC) extracts flight plan and track information from a SAR input file, and sends the information to GARP (Generate Aircraft Routes Process) and CAT (Complexity Analysis Task). GARP in turn generates aircraft trajectories, which are utilized by CAT to calculate sector complexity. CAT writes flight plan, track and complexity data to an output file, which can be examined interactively. The Complexity Data Analysis Tool (CDAT) provides an interactive graphic environment for examining the complexity data produced by the Complexity Analysis Tool (CAT). CDAT can also play back track data extracted from System Analysis Recording (SAR) tapes. The CDAT user interface consists of a primary window, a controls window, and miscellaneous pop-ups. Aircraft track and position data is displayed in the main viewing area of the primary window. The controls window contains miscellaneous control and display items. Complexity data is displayed in pop-up windows. CDAT plays back sector complexity and aircraft track and position data as a function of time. Controls are provided to start and stop playback, adjust the playback rate, and reposition the display to a specified time.
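
    To make the three-process dataflow concrete, here is a toy Python sketch of the RUNDMC -> GARP -> CAT pipeline. The real system exchanges data over PVM and Unix sockets and performs genuine trajectory generation and complexity analysis; queues and stub computations stand in for all of that here.

    ```python
    from multiprocessing import Process, Queue

    def rundmc(sar_records, to_garp):
        for rec in sar_records:            # extract flight plan/track info
            to_garp.put(rec)
        to_garp.put(None)                  # end-of-stream marker

    def garp(to_garp, to_cat):
        while (rec := to_garp.get()) is not None:
            to_cat.put({"id": rec["id"], "trajectory": [0, 1, 2]})  # stub
        to_cat.put(None)

    def cat(to_cat, results):
        while (traj := to_cat.get()) is not None:
            results.put((traj["id"], len(traj["trajectory"])))  # stub metric
        results.put(None)

    if __name__ == "__main__":
        q1, q2, out = Queue(), Queue(), Queue()
        sar = [{"id": "AAL123"}, {"id": "UAL456"}]
        procs = [Process(target=rundmc, args=(sar, q1)),
                 Process(target=garp, args=(q1, q2)),
                 Process(target=cat, args=(q2, out))]
        for p in procs:
            p.start()
        while (row := out.get()) is not None:
            print("sector complexity record:", row)
        for p in procs:
            p.join()
    ```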

  18. A new order-theoretic characterisation of the polytime computable functions

    PubMed Central

    Avanzini, Martin; Eguchi, Naohi; Moser, Georg

    2015-01-01

    We propose a new order-theoretic characterisation of the class of polytime computable functions. To this end we define the small polynomial path order (sPOP* for short). This termination order entails a new syntactic method to analyse the innermost runtime complexity of term rewrite systems fully automatically: for any rewrite system compatible with sPOP* that employs recursion up to depth d, the (innermost) runtime complexity is polynomially bounded of degree d. This bound is tight. Thus we obtain a direct correspondence between a syntactic (and easily verifiable) condition on a program and the asymptotic worst-case complexity of the program. PMID:26412933
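
    The depth/degree correspondence can be seen on a toy example. Written as Python functions over unary-style numbers rather than as a term rewrite system (an illustrative stand-in, not the paper's formalism), addition recurses at depth 1 and runs in linear time, while multiplication defined through addition recurses at depth 2 and runs in quadratic time, matching the degree-d bound stated above.

    ```python
    def add(x, y):
        """Structural recursion of depth 1 -> linear runtime in x."""
        return y if x == 0 else 1 + add(x - 1, y)

    def mult(x, y):
        """Recursion of depth 2 (calls the depth-1 `add` at every step)
        -> quadratic runtime, i.e. polynomial of degree 2."""
        return 0 if x == 0 else add(y, mult(x - 1, y))
    ```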

  19. Minimum-complexity helicopter simulation math model

    NASA Technical Reports Server (NTRS)

    Heffley, Robert K.; Mnich, Marc A.

    1988-01-01

    An example of a minimal-complexity helicopter simulation math model is presented. Motivating factors are the computational delays, cost, and inflexibility of the very sophisticated math models now in common use. A helicopter model form is given which addresses each of these factors and provides better engineering understanding of the specific handling-qualities features which are apparent to the simulator pilot. The technical approach begins with specification of the features which are to be modeled, followed by a build-up of individual vehicle components and definition of equations. Model matching and estimation procedures are given which enable the modeling of specific helicopters from basic data sources such as flight manuals. Checkout procedures are given which provide for total model validation. A number of possible model extensions and refinements are discussed. Math model computer programs are defined and listed.

  20. Characterizing and Mitigating Work Time Inflation in Task Parallel Programs

    DOE PAGES

    Olivier, Stephen L.; de Supinski, Bronis R.; Schulz, Martin; ...

    2013-01-01

    Task parallelism raises the level of abstraction in shared memory parallel programming to simplify the development of complex applications. However, task parallel applications can exhibit poor performance due to thread idleness, scheduling overheads, and work time inflation – additional time spent by threads in a multithreaded computation beyond the time required to perform the same work in a sequential computation. We identify the contributions of each factor to lost efficiency in various task parallel OpenMP applications and diagnose the causes of work time inflation in those applications. Increased data access latency can cause significant work time inflation in NUMA systems. Our locality framework for task parallel OpenMP programs mitigates this cause of work time inflation. Our extensions to the Qthreads library demonstrate that locality-aware scheduling can improve performance up to 3X compared to the Intel OpenMP task scheduler.

  1. Scalable and portable visualization of large atomistic datasets

    NASA Astrophysics Data System (ADS)

    Sharma, Ashish; Kalia, Rajiv K.; Nakano, Aiichiro; Vashishta, Priya

    2004-10-01

    A scalable and portable code named Atomsviewer has been developed to interactively visualize a large atomistic dataset consisting of up to a billion atoms. The code uses a hierarchical view frustum-culling algorithm based on the octree data structure to efficiently remove atoms outside of the user's field-of-view. Probabilistic and depth-based occlusion-culling algorithms then select atoms, which have a high probability of being visible. Finally a multiresolution algorithm is used to render the selected subset of visible atoms at varying levels of detail. Atomsviewer is written in C++ and OpenGL, and it has been tested on a number of architectures including Windows, Macintosh, and SGI. Atomsviewer has been used to visualize tens of millions of atoms on a standard desktop computer and, in its parallel version, up to a billion atoms. Program summaryTitle of program: Atomsviewer Catalogue identifier: ADUM Program summary URL:http://cpc.cs.qub.ac.uk/summaries/ADUM Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland Computer for which the program is designed and others on which it has been tested: 2.4 GHz Pentium 4/Xeon processor, professional graphics card; Apple G4 (867 MHz)/G5, professional graphics card Operating systems under which the program has been tested: Windows 2000/XP, Mac OS 10.2/10.3, SGI IRIX 6.5 Programming languages used: C++, C and OpenGL Memory required to execute with typical data: 1 gigabyte of RAM High speed storage required: 60 gigabytes No. of lines in the distributed program including test data, etc.: 550 241 No. of bytes in the distributed program including test data, etc.: 6 258 245 Number of bits in a word: Arbitrary Number of processors used: 1 Has the code been vectorized or parallelized: No Distribution format: tar gzip file Nature of physical problem: Scientific visualization of atomic systems Method of solution: Rendering of atoms using computer graphic techniques, culling algorithms for data minimization, and levels-of-detail for minimal rendering Restrictions on the complexity of the problem: None Typical running time: The program is interactive in its execution Unusual features of the program: None References: The conceptual foundation and subsequent implementation of the algorithms are found in [A. Sharma, A. Nakano, R.K. Kalia, P. Vashishta, S. Kodiyalam, P. Miller, W. Zhao, X.L. Liu, T.J. Campbell, A. Haas, Presence—Teleoperators and Virtual Environments 12 (1) (2003)].
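
    The heart of the hierarchical culling stage fits in a few lines. In the sketch below, the octree node layout (each node carrying the atoms of its whole subtree) is a hypothetical simplification, and an axis-aligned view box stands in for the real six-plane frustum test, but the accept/reject/recurse logic is the essence of the algorithm described above.

    ```python
    def box_relation(nmin, nmax, vmin, vmax):
        """Classify a node's bounding box against the viewing volume."""
        if any(n_hi < v_lo or n_lo > v_hi
               for n_lo, n_hi, v_lo, v_hi in zip(nmin, nmax, vmin, vmax)):
            return "outside"
        if all(v_lo <= n_lo and n_hi <= v_hi
               for n_lo, n_hi, v_lo, v_hi in zip(nmin, nmax, vmin, vmax)):
            return "inside"
        return "partial"

    def visible_atoms(node, vmin, vmax):
        rel = box_relation(node["min"], node["max"], vmin, vmax)
        if rel == "outside":
            return []                       # cull the whole subtree at once
        if rel == "inside" or not node["children"]:
            return node["atoms"]            # accept; finer per-atom tests
                                            # (occlusion, LOD) happen later
        return [a for child in node["children"]
                for a in visible_atoms(child, vmin, vmax)]

    root = {"min": (0, 0, 0), "max": (8, 8, 8), "children": [],
            "atoms": [(1.0, 2.0, 3.0), (7.5, 7.5, 7.5)]}
    print(visible_atoms(root, (0, 0, 0), (4, 4, 4)))  # "partial" leaf kept
    ```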

  2. Building macromolecular assemblies by information-driven docking: introducing the HADDOCK multibody docking server.

    PubMed

    Karaca, Ezgi; Melquiond, Adrien S J; de Vries, Sjoerd J; Kastritis, Panagiotis L; Bonvin, Alexandre M J J

    2010-08-01

    Over recent years, large-scale proteomics studies have generated a wealth of information on biomolecular complexes. Adding the structural dimension to the resulting interactomes represents a major challenge that classical structural experimental methods alone will have difficulty confronting. To meet this challenge, complementary modeling techniques such as docking are needed. Among the current docking methods, HADDOCK (High Ambiguity-Driven DOCKing) distinguishes itself from others by the use of experimental and/or bioinformatics data to drive the modeling process, and it has shown strong performance in the Critical Assessment of PRedicted Interactions (CAPRI), a blind experiment for the prediction of interactions. Although most docking programs are limited to binary complexes, HADDOCK can deal with multiple molecules (up to six), a capability that will be required to build large macromolecular assemblies. We present here a novel web interface of HADDOCK that allows the user to dock up to six biomolecules simultaneously. This interface allows the inclusion of a large variety of experimental and/or bioinformatics data and supports several types of cyclic and dihedral symmetries in the docking of multibody assemblies. The server was tested on a benchmark of six cases, containing five symmetric homo-oligomeric protein complexes and one symmetric protein-DNA complex. Our results reveal that, in the presence of bioinformatics and/or experimental data, HADDOCK shows excellent performance: in all cases, HADDOCK was able to generate good to high quality solutions and ranked them at the top, demonstrating its ability to model symmetric multicomponent assemblies. Docking methods can thus play an important role in adding the structural dimension to interactomes. However, although current docking methodologies have been successful for a vast range of cases, considering the variety and complexity of macromolecular assemblies, the inclusion of some kind of experimental information (e.g. from mass spectrometry, nuclear magnetic resonance, cryoelectron microscopy, etc.) will remain highly desirable to obtain reliable results.

  3. Issues to consider in the derivation of water quality benchmarks for the protection of aquatic life.

    PubMed

    Schneider, Uwe

    2014-01-01

    While water quality benchmarks for the protection of aquatic life have been in use in some jurisdictions for several decades (the USA, Canada, and several European countries), more and more countries are now setting up their own national water quality benchmark development programs. In doing so, they either adopt an existing method from another jurisdiction, update an existing approach, or develop their own new derivation method. Each approach has its own advantages and disadvantages, and many issues have to be addressed when setting up a water quality benchmark development program or when deriving a water quality benchmark. Each of these tasks requires special expertise. They may seem simple, but they are complex in their details. The intention of this paper is to provide some guidance for this process of water quality benchmark development at the program level, in the development of the derivation methodology, and in the actual benchmark derivation step, as well as to point out some issues (notably the inclusion of adapted populations and cryptic species, and points to consider in the use of the species sensitivity distribution approach) and future opportunities (an international data repository and international collaboration in water quality benchmark development).
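
    As one concrete example of the derivation step discussed above, several jurisdictions fit a species sensitivity distribution (SSD) to single-species toxicity endpoints and take a low percentile, commonly the HC5 (the concentration hazardous to 5% of species), as the basis for a benchmark. A minimal sketch, with a log-normal SSD and made-up endpoint values (both are illustrative assumptions, not a recommended protocol):

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical chronic toxicity endpoints for eight species (ug/L)
    endpoints = np.array([12.0, 30.0, 55.0, 80.0, 150.0, 210.0, 400.0, 900.0])

    log_vals = np.log10(endpoints)                  # fit a log-normal SSD
    mu, sigma = log_vals.mean(), log_vals.std(ddof=1)
    hc5 = 10 ** stats.norm.ppf(0.05, loc=mu, scale=sigma)
    print(f"HC5 ~ {hc5:.1f} ug/L")                  # candidate benchmark basis
    ```

    Real programs differ exactly in the details the paper points to: which endpoints qualify, which distribution is fitted, how adapted populations and cryptic species are handled, and what assessment factor is applied on top of the HC5.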

  4. A Methodology and a Web Platform for the Collaborative Development of Context-Aware Systems

    PubMed Central

    Martín, David; López-de-Ipiña, Diego; Alzua-Sorzabal, Aurkene; Lamsfus, Carlos; Torres-Manzanera, Emilio

    2013-01-01

    Personalization of information and services is essential for an optimal user experience. Systems have to be able to acquire data about the user's context, process them in order to identify the user's situation and, finally, adapt the functionality of the system to that situation; however, the development of context-aware systems is complex. Data coming from distributed and heterogeneous sources have to be acquired, processed and managed. Several programming frameworks have been proposed in order to simplify the development of context-aware systems. These frameworks offer high-level application programming interfaces aimed at programmers, which complicates the involvement of domain experts in the development life-cycle. The participation of users who do not have programming skills but are experts in the application domain can speed up and improve the development process of these kinds of systems. Apart from that, there is a lack of methodologies to guide the development process. This article presents, as its main contributions, the implementation and evaluation of a web platform and a methodology for the collaborative development of context-aware systems by programmers and domain experts. PMID:23666131

  5. Convicted Driving-While-Impaired Offenders’ Views on Effectiveness of Sanctions and Treatment

    PubMed Central

    Lapham, Sandra; England-Kennedy, Elizabeth

    2011-01-01

    In this article we analyze qualitative data from a multiple methods, longitudinal study drawn from 15-year follow-up interviews with a subsample of 82 individuals arrested for driving while intoxicated in a southwestern state (1989–1995). We explore reactions to the arrest and court-mandated sanctions, including legal punishments, mandated interventions, and/or participation in programs aimed at reducing recidivism. Key findings include experiencing certain negative emotional reactions to the arrest, reactions to being jailed, experiencing other court-related sanctions as deterring driving while intoxicated behavior, and generally negative opinions regarding court-mandated interventions. We discuss interviewees’ complex perspectives on treatment and program participation and their effects on lessening recidivism, and we offer suggestions for reducing recidivism based on our findings. PMID:21490294

  6. A Regional Serials Program under National Serials Data Program Auspices: Discussion Paper Prepared for Ad Hoc Serials Discussion Group

    ERIC Educational Resources Information Center

    Grosch, Audrey N.

    1973-01-01

    A regionally organized program for serials bibliography is proposed because of the large volume of complex data needing control and the many purposes to which the data can be put in support of regional or local needs. (2 references) (Author)

  7. Predicting complete loss to follow-up after a health-education program: number of absences and face-to-face contact with a researcher.

    PubMed

    Park, M J; Yamazaki, Yoshihiko; Yonekura, Yuki; Yukawa, Keiko; Ishikawa, Hirono; Kiuchi, Takahiro; Green, Joseph

    2011-10-27

    Research on health-education programs requires longitudinal data. Loss to follow-up can lead to imprecision and bias, and complete loss to follow-up is particularly damaging. If that loss is predictable, then efforts to prevent it can be focused on those program participants who are at the highest risk. We identified predictors of complete loss to follow-up in a longitudinal cohort study. Data were collected over 1 year in a study of adults with chronic illnesses who were in a program to learn self-management skills. Following baseline measurements, the program had one group-discussion session each week for six weeks. Follow-up questionnaires were sent 3, 6, and 12 months after the baseline measurement. A person was classified as completely lost to follow-up if none of those three follow-up questionnaires had been returned by two months after the last one was sent. We tested two hypotheses: that complete loss to follow-up was directly associated with the number of absences from the program sessions, and that it was less common among people who had had face-to-face contact with one of the researchers. We also tested predictors of data loss identified previously and examined associations with specific diagnoses. Using the unpaired t-test, the U test, Fisher's exact test, and logistic regression, we identified good predictors of complete loss to follow-up. The prevalence of complete loss to follow-up was 12.2% (50/409). Complete loss to follow-up was directly related to the number of absences (odds ratio; 95% confidence interval: 1.78; 1.49-2.12), and it was inversely related to age (0.97; 0.95-0.99). Complete loss to follow-up was less common among people who had met one of the researchers (0.51; 0.28-0.95) and among those with connective tissue disease (0.29; 0.09-0.98). For the multivariate logistic model the area under the ROC curve was 0.77. Complete loss to follow-up after this health-education program can be predicted to some extent from data that are easy to collect (age, number of absences, and diagnosis). Also, face-to-face contact with a researcher deserves further study as a way of increasing participation in follow-up, and health-education programs should include it.
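
    A model of the kind reported here is straightforward to reproduce in outline. The sketch below generates synthetic data whose coefficients echo the reported odds ratios (1.78 per absence, 0.97 per year of age; the data, coefficients, and sample are fabricated for illustration), fits a logistic regression, and computes the area under the ROC curve.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)
    n = 409                                         # same order as the study
    age = rng.normal(55, 12, n)
    absences = rng.poisson(1.5, n)

    # Synthetic outcome: log-odds rise with absences and fall with age
    logit = -2.5 + np.log(1.78) * absences + np.log(0.97) * (age - 55)
    lost = rng.binomial(1, 1 / (1 + np.exp(-logit)))

    X = np.column_stack([age, absences])
    model = LogisticRegression().fit(X, lost)
    auc = roc_auc_score(lost, model.predict_proba(X)[:, 1])
    print(f"AUC = {auc:.2f}")
    print("odds ratios:", np.exp(model.coef_[0]))   # per year, per absence
    ```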

  8. From Utility to Exploration: Teaching with Data to Develop Complexity Thinking

    NASA Astrophysics Data System (ADS)

    Lutz, T. M.

    2016-12-01

    Scientific, social, and economic advances are possible because we impose simplicity and predictability on natural and social systems that are inherently complex and uncertain. But the work of Edgar Morin, Gregory Bateson, and others suggests that a failure to integrate the simple and the complex in our thinking (worldview) is a root cause of humanity's unsustainable existence. This diagnosis is challenging for scientists because we make the world visible through data: complex earth systems reduced to numbers. What we do with those numbers mirrors our approach to the world. Geoscience students gain much of their experience working with data from courses in statistics, physics, and chemistry as well as courses in their major. They learn to solve problems within a scientific context, and are led to see data analysis as a set of tools needed to make predictions and decisions (e.g., probabilities, regression equations). They learn that there are right ways of doing things and correct answers to be found. We do need such skills, but they reflect a simple and reductionist view. For example, the objective of a regression model may be to reduce a large number of data to a much smaller number of parameters to gain utility in prediction. However, this is the "wrong direction" from which to approach complexity. The mission of Geometrics, a combined undergraduate and graduate course (ESS 321/521) at West Chester University, is to seek ways to meaningfully reveal complexity (within the limitations of the data) and to understand data differently. The aim is to create multiple, possibly divergent, views of data sets to create a sense of richness and depth. This presentation will give examples of heuristic models, exploratory methods (e.g., moving average and kernel modeling; ensemble simulation) and visualizations (data slicing, conditioning, and rotation). The Excel programs used in the course are constructed to develop a sense of playfulness and freedom in the students' approach to data, and they open up an often neglected side of scientific methods: abductive reasoning and the formation of hypotheses that recognize complexity.
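
    The course's Excel tools are not reproduced here, but the exploratory move described, viewing one data set through several smoothers rather than a single "best" model, can be sketched in a few lines of Python (a hypothetical stand-in for the classroom materials):

    ```python
    import numpy as np

    def kernel_smooth(x, y, bandwidth):
        """Nadaraya-Watson Gaussian kernel estimate at each x. Varying the
        bandwidth yields multiple, possibly divergent, views of the data."""
        diffs = x[:, None] - x[None, :]
        weights = np.exp(-0.5 * (diffs / bandwidth) ** 2)
        return weights @ y / weights.sum(axis=1)

    x = np.linspace(0, 10, 200)
    y = np.sin(x) + np.random.default_rng(1).normal(0, 0.3, x.size)
    views = {bw: kernel_smooth(x, y, bw) for bw in (0.1, 0.5, 2.0)}
    # Three smoothed curves from the same data: the narrow bandwidth keeps
    # complexity visible, the wide one imposes simplicity on it.
    ```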

  9. You've Come a Long Way, Baby, but...

    ERIC Educational Resources Information Center

    Fayen, Emily Gallup

    An online library system is an example of a complex computer system in that it supports a variety of users, both patrons and staff, and is made up of many intricate programs with complex relationships among them. Certain features are essential to a user friendly system: (1) users cannot get lost in the system; (2) users cannot enter illegal…

  10. The engineering design integration (EDIN) system. [digital computer program complex

    NASA Technical Reports Server (NTRS)

    Glatt, C. R.; Hirsch, G. N.; Alford, G. E.; Colquitt, W. N.; Reiners, S. J.

    1974-01-01

    A digital computer program complex for the evaluation of aerospace vehicle preliminary designs is described. The system consists of a Univac 1100 series computer and peripherals using the Exec 8 operating system, a set of demand access terminals of the alphanumeric and graphics types, and a library of independent computer programs. Modification of the partial run streams, data base maintenance and construction, and control of program sequencing are provided by a data manipulation program called the DLG processor. The executive control of library program execution is performed by the Univac Exec 8 operating system through a user established run stream. A combination of demand and batch operations is employed in the evaluation of preliminary designs. Applications accomplished with the EDIN system are described.

  11. The SPARK Programs: A Public Health Model of Physical Education Research and Dissemination

    ERIC Educational Resources Information Center

    McKenzie, Thomas L.; Sallis, James F.; Rosengard, Paul; Ballard, Kymm

    2016-01-01

    SPARK [Sports, Play, and Active Recreation for Kids], in its current form, is a brand that represents a collection of exemplary, research-based, physical education and physical activity programs that emphasize a highly active curriculum, on-site staff development, and follow-up support. Given its complexity (e.g., multiple school levels, inclusion…

  12. ReOpt[trademark] V2.0 user guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, M K; Bryant, J L

    1992-10-01

    Cleaning up the large number of contaminated waste sites at Department of Energy (DOE) facilities in the US presents a large and complex problem. Each waste site poses a singular set of circumstances (different contaminants, environmental concerns, and regulations) that affect selection of an appropriate response. Pacific Northwest Laboratory (PNL) developed ReOpt to provide information about the remedial action technologies that are currently available. It is an easy-to-use personal computer program and database that contains data about these remedial technologies and auxiliary data about contaminants and regulations. ReOpt will enable engineers and planners involved in environmental restoration efforts to quickly identify potentially applicable environmental restoration technologies and access the corresponding information required to select cleanup activities for DOE sites.

  13. Cervical cancer screening in the National Breast and Cervical Cancer Early Detection Program (NBCCEDP) in four US-Affiliated Pacific Islands between 2007 and 2015.

    PubMed

    Senkomago, Virginia; Royalty, Janet; Miller, Jacqueline W; Buenconsejo-Lum, Lee E; Benard, Vicki B; Saraiya, Mona

    2017-10-01

    Cervical cancer incidence in the US-Affiliated Pacific Islands (USAPIs) is double that of the US mainland. American Samoa, the Commonwealth of the Northern Mariana Islands (CNMI), Guam, and the Republic of Palau receive funding from the Centers for Disease Control and Prevention (CDC) National Breast and Cervical Cancer Early Detection Program (NBCCEDP) to provide cervical cancer screening to low-income, uninsured, or underinsured women. The USAPI grantees report data on screening and follow-up activities to the CDC. We examined cervical cancer screening and follow-up data from the NBCCEDP programs in the four USAPIs from 2007 to 2015. We summarized screening done by Papanicolaou (Pap) and oncogenic human papillomavirus (HPV) tests, follow-up and diagnostic tests provided, and histology results observed. A total of 22,249 Pap tests were conducted in 14,206 women in the four USAPI programs from 2007-2015. The overall percentage of abnormal Pap results (low-grade squamous intraepithelial lesions or worse) was 2.4% for first program screens and 1.8% for subsequent program screens. Histology results showed a high proportion of cervical intraepithelial neoplasia grade 2 or worse (57%) among women with precancers and cancers. Roughly one-third (32%) of Pap test results warranting follow-up had no data recorded on diagnostic tests or follow-up done. This is the first report of cervical cancer screening and outcomes of women served in the USAPI through the NBCCEDP, with similar results for abnormal Pap tests, but a higher proportion of precancers and cancers, when compared to national NBCCEDP data. The USAPI face significant challenges in implementing cervical cancer screening, particularly in providing and recording data on diagnostic tests and follow-up. The screening programs in the USAPI should further examine specific barriers to follow-up of women with abnormal Pap results and possible solutions to address them. Published by Elsevier Ltd.

  14. High-Performance Signal Detection for Adverse Drug Events using MapReduce Paradigm.

    PubMed

    Fan, Kai; Sun, Xingzhi; Tao, Ying; Xu, Linhao; Wang, Chen; Mao, Xianling; Peng, Bo; Pan, Yue

    2010-11-13

    Post-marketing pharmacovigilance is important for public health, as many Adverse Drug Events (ADEs) are unknown when drugs are approved for marketing. However, due to the large number of reported drugs and drug combinations, detecting ADE signals by mining these reports is becoming a challenging task in terms of computational complexity. Recently, a parallel programming model, MapReduce, was introduced by Google to support large-scale data-intensive applications. In this study, we proposed a MapReduce-based algorithm for a common ADE detection approach, the Proportional Reporting Ratio (PRR), and tested it in mining spontaneous ADE reports from the FDA. The purpose is to investigate the possibility of using the MapReduce principle to speed up biomedical data mining tasks, using this pharmacovigilance case as one specific example. The results demonstrated that the MapReduce programming model could improve the performance of a common signal detection algorithm for pharmacovigilance in a distributed computation environment at approximately linear speedup rates.
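
    A minimal, single-machine sketch of the PRR computation whose map and reduce phases the paper distributes across a cluster; the report structure and counts below are hypothetical, not FDA data, and the actual distributed implementation is not reproduced here.

```python
from collections import Counter
from itertools import product

# Hypothetical spontaneous reports: each lists drugs taken and adverse
# events observed (illustrative data).
reports = [
    {"drugs": {"drugA", "drugB"}, "events": {"nausea"}},
    {"drugs": {"drugA"}, "events": {"rash", "nausea"}},
    {"drugs": {"drugB"}, "events": {"rash"}},
    {"drugs": {"drugC"}, "events": {"nausea"}},
]

# "Map" phase: emit (drug, event) pair counts plus per-drug and per-event
# marginals; in a real MapReduce job each mapper would emit these keys.
pair_counts, drug_counts, event_counts = Counter(), Counter(), Counter()
for r in reports:
    for d, e in product(r["drugs"], r["events"]):
        pair_counts[(d, e)] += 1
    drug_counts.update(r["drugs"])
    event_counts.update(r["events"])

# "Reduce" phase: assemble each pair's 2x2 table and compute
# PRR = [a / (a + b)] / [c / (c + d)].
n = len(reports)
for (drug, event), a in sorted(pair_counts.items()):
    b = drug_counts[drug] - a    # drug present, event absent
    c = event_counts[event] - a  # event present, drug absent
    d = n - a - b - c            # neither present
    if c > 0:
        prr = (a / (a + b)) / (c / (c + d))
        print(f"{drug} -> {event}: PRR = {prr:.2f}")
```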

  15. Using EXPLORE[R] and PLAN[R] Data to Evaluate GEAR UP Programs

    ERIC Educational Resources Information Center

    ACT, Inc., 2007

    2007-01-01

    The Gaining Early Awareness and Readiness for Undergraduate Programs (GEAR UP) initiative is designed to provide assistance to low-income students. The program provides discretionary grants for the purpose of increasing the readiness of low-income students to attend and succeed in postsecondary education. The grants are up to six years in length and provide…

  16. Cellulosic-Derived Biofuels Program in Kentucky - Part 2

    DTIC Science & Technology

    2014-04-30

    and lignin, are complex raw materials. Selection of robust strains of algae that are able to convert C6 (glucose) and C5 carbohydrates from… Recovered table-of-contents entries: Task B2.03 - Development of Metalloporphyrin-Ionic Liquid Complexes for Degradation of Biomass; Task B2.04 - Biomass Conversion Process Scale-Up; Task B3 - Carbohydrate to Oil Conversion Process Development.

  17. Time-temperature-stress capabilities of composite materials for advanced supersonic technology application

    NASA Technical Reports Server (NTRS)

    Kerr, James R.; Haskins, James F.

    1987-01-01

    Advanced composites will play a key role in the development of the technology for the design and fabrication of future supersonic vehicles. However, incorporating the material into vehicle usage is contingent on accelerating the demonstration of service capacity and design technology. Because of the added material complexity and lack of extensive data, laboratory replication of the flight service will provide the most rapid method to document the airworthiness of advanced composite systems. Consequently, a laboratory program was conducted to determine the time-temperature-stress capabilities of several high temperature composites. Tests included were thermal aging, environmental aging, fatigue, creep, fracture, tensile, and real-time flight simulation exposure. The program had two phases. The first included all the material property determinations and aging and simulation exposures up through 10,000 hours. The second continued these tests up to 50,000 cumulative hours. This report presents the results of the Phase 1 baseline and 10,000-hr aging and flight simulation studies, the Phase 2 50,000-hr aging studies, and the Phase 2 flight simulation tests, some of which extended to almost 40,000 hours.

  18. Data systems and computer science: Software Engineering Program

    NASA Technical Reports Server (NTRS)

    Zygielbaum, Arthur I.

    1991-01-01

    An external review of the Integrated Technology Plan for the Civil Space Program is presented. This review is specifically concerned with the Software Engineering Program. The goals of the Software Engineering Program are as follows: (1) improve NASA's ability to manage development, operation, and maintenance of complex software systems; (2) decrease NASA's cost and risk in engineering complex software systems; and (3) provide technology to assure safety and reliability of software in mission critical applications.

  19. A high performance data parallel tensor contraction framework: Application to coupled electro-mechanics

    NASA Astrophysics Data System (ADS)

    Poya, Roman; Gil, Antonio J.; Ortigosa, Rogelio

    2017-07-01

    The paper presents aspects of implementation of a new high performance tensor contraction framework for the numerical analysis of coupled and multi-physics problems on streaming architectures. In addition to explicit SIMD instructions and smart expression templates, the framework introduces domain specific constructs for the tensor cross product and its associated algebra recently rediscovered by Bonet et al. (2015, 2016) in the context of solid mechanics. The two key ingredients of the presented expression template engine are as follows. First, the capability to mathematically transform complex chains of operations into simpler equivalent expressions, while potentially avoiding routes with higher levels of computational complexity; and, second, the capability to perform a compile-time depth-first or breadth-first search to find the optimal contraction indices of a large tensor network in order to minimise the number of floating point operations. For optimisations of tensor contraction such as loop transformation, loop fusion and data locality optimisations, the framework relies heavily on compile time technologies rather than source-to-source translation or JIT techniques. Every aspect of the framework is examined through relevant performance benchmarks, including the impact of data parallelism on the performance of isomorphic and nonisomorphic tensor products, the FLOP and memory I/O optimality in the evaluation of tensor networks, the compilation cost and memory footprint of the framework and the performance of tensor cross product kernels. The framework is then applied to finite element analysis of coupled electro-mechanical problems to assess the speed-ups achieved in kernel-based numerical integration of complex electroelastic energy functionals. In this context, domain-aware expression templates combined with SIMD instructions are shown to provide a significant speed-up over classical low-level style programming techniques.
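
    The framework itself is built on C++ expression templates; as a loose analogy in Python, NumPy's einsum_path performs the same kind of search for a FLOP-minimising contraction order over a tensor network (dimensions here are illustrative).

```python
import numpy as np

# Small three-tensor chain; dimensions are illustrative.
A = np.random.rand(8, 64)
B = np.random.rand(64, 64)
C = np.random.rand(64, 8)

# einsum_path searches for a contraction order minimising floating point
# operations, analogous to the framework's compile-time graph search.
path, report = np.einsum_path("ij,jk,kl->il", A, B, C, optimize="optimal")
print(report)  # compares naive vs optimised FLOP counts

# Evaluate the network with the chosen contraction order.
result = np.einsum("ij,jk,kl->il", A, B, C, optimize=path)
print(result.shape)  # (8, 8)
```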

  20. Monitoring and Acquisition Real-time System (MARS)

    NASA Technical Reports Server (NTRS)

    Holland, Corbin

    2013-01-01

    MARS is a graphical user interface (GUI) written in MATLAB and Java, allowing the user to configure and control the Scalable Parallel Architecture for Real-Time Acquisition and Analysis (SPARTAA) data acquisition system. SPARTAA not only acquires data, but also allows for complex algorithms to be applied to the acquired data in real time. The MARS client allows the user to set up and configure all settings regarding the data channels attached to the system, as well as have complete control over starting and stopping data acquisition. It provides a unique "Test" programming environment, allowing the user to create tests consisting of a series of alarms, each of which contains any number of data channels. Each alarm is configured with a particular algorithm, determining the type of processing that will be applied on each data channel and tested against a defined threshold. Tests can be uploaded to SPARTAA, thereby teaching it how to process the data. The uniqueness of MARS is in its capability to be adaptable easily to many test configurations. MARS sends and receives protocols via TCP/IP, which allows for quick integration into almost any test environment. The use of MATLAB and Java as the programming languages allows for developers to integrate the software across multiple operating platforms.
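
    A hypothetical sketch of the test/alarm hierarchy described above: a test holds alarms, and each alarm applies one processing algorithm to its channels and trips against a threshold. Names and structure are illustrative, not the actual MARS client API.

```python
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class Alarm:
    name: str
    channels: list       # channel identifiers this alarm watches
    algorithm: callable  # processing applied to each channel's samples
    threshold: float

    def check(self, samples: dict) -> bool:
        # Trip if any processed channel value exceeds the threshold.
        return any(self.algorithm(samples[ch]) > self.threshold
                   for ch in self.channels)

@dataclass
class Test:
    alarms: list = field(default_factory=list)

    def run(self, samples: dict) -> list:
        return [a.name for a in self.alarms if a.check(samples)]

test = Test([Alarm("vibration", ["accel_x"], mean, threshold=2.5)])
print(test.run({"accel_x": [1.0, 3.2, 4.1]}))  # ['vibration']
```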

  1. LACIE performance predictor final operational capability program description, volume 1

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The program EPHEMS computes the orbital parameters for up to two vehicles orbiting the earth for up to 549 days. The data represents a continuous swath about the earth, producing tables which can be used to determine when and if certain land segments will be covered. The program GRID processes NASA's climatology tape to obtain the weather indices along with associated latitudes and longitudes. The program LUMP takes substrata historical data and sample segment ID, crop window, crop window error and statistical data, checks for valid input parameters and generates the segment ID file, crop window file and the substrata historical file. Finally, the System Error Executive (SEE) Program checks YES error and truth data, CAMS error data, and signature extension data for validity and missing elements. A message is printed for each error found.

  2. Archive data base and handling system for the Orbiter flying qualities experiment program

    NASA Technical Reports Server (NTRS)

    Myers, T. T.; Dimarco, R.; Magdaleno, R. E.; Aponso, B. L.

    1986-01-01

    The OFQ archives data base and handling system assembled as part of the Orbiter Flying Qualities (OFQ) research of the Orbiter Experiments Program (OEX) are described. The purpose of the OFQ archives is to preserve and document shuttle flight data relevant to vehicle dynamics, flight control, and flying qualities in a form that permits maximum use by qualified users. In their complete form, the OFQ archives contain descriptive text (general information about the flight, signal descriptions and units) as well as numerical time history data. Since the shuttle program is so complex, the official data base contains thousands of signals, and very complex entries are required to obtain data. The OFQ archives are intended to provide flight-phase-oriented data subsets with relevant signals which are easily identified for flying qualities research.

  3. Mercer County Community College Remedial Program Assessment, August 1988. Two Year Follow-Up of the Fall 1986 Cohort.

    ERIC Educational Resources Information Center

    Porter, Al

    This assessment of New Jersey's Mercer County Community College's (MCCC's) remedial program provides a program overview, results of a two-year follow-up of fall 1986 remedial students, and comparative data from previous years. The program overview examines policies and procedures concerning placement criteria, exit standards, program acceptance,…

  4. Meeting the complex needs of urban youth and their families through the 4Rs 2Ss Family Strengthening Program: The “real world” meets evidence-informed care

    PubMed Central

    Small, Latoya; Jackson, Jerrold; Gopalan, Geetha; McKay, Mary McKernan

    2014-01-01

    Youth living in poverty face compounding familial and environmental challenges in utilizing effective community mental health services. They have ongoing stressors that increase their dropout rate in mental health service use. Difficulties also exist in staying engaged in services when they are involved with the child welfare system. This study examines the 4Rs 2Ss Family Strengthening Program, developed across four broad conceptual categories related to parenting skills and family processes that form a multiple family group service delivery approach. A total of 321 families were enrolled in this randomized intervention study, assigned to either the 4Rs 2Ss Family Strengthening Program or standard care services. Caregivers and their children randomly assigned to the experimental condition received a 16-week multiple family group intervention through their respective outpatient community mental health clinic. Data were collected at baseline, midtest (8 weeks), posttest (16 weeks), and 6-month follow-up. Major findings include high engagement in the 4Rs 2Ss Family Strengthening Program, compared to standard services. Although child welfare status is not related to attendance, family stress and parental depression are related to participant engagement in this multiple family group intervention. Involvement in the 4Rs 2Ss Family Strengthening Program resulted in improved effects for child behaviors. Lastly, no evidence of moderation effects of family stress, child welfare involvement, or parental needs was found. The 4Rs 2Ss Family Strengthening Program appeared able to engage families with more complex “real world” needs. PMID:26523115

  5. Meeting the complex needs of urban youth and their families through the 4Rs 2Ss Family Strengthening Program: The "real world" meets evidence-informed care.

    PubMed

    Small, Latoya; Jackson, Jerrold; Gopalan, Geetha; McKay, Mary McKernan

    2015-07-01

    Youth living in poverty face compounding familial and environmental challenges in utilizing effective community mental health services. They have ongoing stressors that increase their dropout rate in mental health service use. Difficulties also exist in staying engaged in services when they are involved with the child welfare system. This study examines the 4Rs 2Ss Family Strengthening Program, developed across four broad conceptual categories related to parenting skills and family processes that form a multiple family group service delivery approach. A total of 321 families were enrolled in this randomized intervention study, assigned to either the 4Rs 2Ss Family Strengthening Program or standard care services. Caregivers and their children randomly assigned to the experimental condition received a 16-week multiple family group intervention through their respective outpatient community mental health clinic. Data were collected at baseline, midtest (8 weeks), posttest (16 weeks), and 6-month follow-up. Major findings include high engagement in the 4Rs 2Ss Family Strengthening Program, compared to standard services. Although child welfare status is not related to attendance, family stress and parental depression are related to participant engagement in this multiple family group intervention. Involvement in the 4Rs 2Ss Family Strengthening Program resulted in improved effects for child behaviors. Lastly, no evidence of moderation effects of family stress, child welfare involvement, or parental needs was found. The 4Rs 2Ss Family Strengthening Program appeared able to engage families with more complex "real world" needs.

  6. Multiple transitions in sick leave, disability benefits, and return to work. - A 4-year follow-up of patients participating in a work-related rehabilitation program.

    PubMed

    Oyeflaten, Irene; Lie, Stein Atle; Ihlebæk, Camilla M; Eriksen, Hege R

    2012-09-06

    Return to work (RTW) after long-term sick leave can be a long-lasting process where the individual may shift between work and receiving different social security benefits, as well as between part-time and full-time work. This is a challenge in the assessment of RTW outcomes after rehabilitation interventions. The aim of this study was to analyse the probability of RTW, and the probabilities of transitions between different benefits during a 4-year follow-up, after participating in a work-related rehabilitation program. The sample consisted of 584 patients (66% females), mean age 44 years (sd = 9.3). Mean duration on various types of sick leave benefits at entry to the rehabilitation program was 9.3 months (sd = 3.4). The patients had mental (47%), musculoskeletal (46%), or other diagnoses (7%). Official national register data over a 4-year follow-up period were analysed. Extended statistical tools for multistate models were used to calculate transition probabilities between the following eight states: working, partial sick leave, full-time sick leave, medical rehabilitation, vocational rehabilitation, and disability pension (partial, permanent, and time-limited). During the follow-up there was an increased probability of working, a decreased probability of being on sick leave, and an increased probability of being on disability pension. The probability of RTW was not related to the work and benefit status at departure from the rehabilitation clinic. The patients had an average of 3.7 (range 0-18) transitions between work and the different benefits. The process of RTW or of receiving disability pension was complex, and may take several years, with multiple transitions between work and different benefits. Access to reliable register data and the use of a multistate RTW model make it possible to describe the developmental nature and the different levels of the recovery and disability process.
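
    To make the multistate idea concrete, the sketch below propagates a toy monthly transition matrix among three collapsed states; the probabilities are illustrative, not the study's register-based estimates (which used eight states and more elaborate statistical tools).

```python
import numpy as np

# Toy row-stochastic matrix of monthly transition probabilities between
# three simplified states; values are illustrative only.
states = ["work", "sick leave", "disability pension"]
P = np.array([
    [0.90, 0.08, 0.02],   # from work
    [0.15, 0.75, 0.10],   # from sick leave
    [0.02, 0.03, 0.95],   # from disability pension
])

start = np.array([0.0, 1.0, 0.0])  # everyone starts on sick leave
for months in (12, 24, 48):
    # State distribution after repeated monthly transitions.
    dist = start @ np.linalg.matrix_power(P, months)
    print(months, dict(zip(states, dist.round(3))))
```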

  7. Multiple transitions in sick leave, disability benefits, and return to work. - A 4-year follow-up of patients participating in a work-related rehabilitation program

    PubMed Central

    2012-01-01

    Background Return to work (RTW) after long-term sick leave can be a long-lasting process where the individual may shift between work and receiving different social security benefits, as well as between part-time and full-time work. This is a challenge in the assessment of RTW outcomes after rehabilitation interventions. The aim of this study was to analyse the probability of RTW, and the probabilities of transitions between different benefits during a 4-year follow-up, after participating in a work-related rehabilitation program. Methods The sample consisted of 584 patients (66% females), mean age 44 years (sd = 9.3). Mean duration on various types of sick leave benefits at entry to the rehabilitation program was 9.3 months (sd = 3.4). The patients had mental (47%), musculoskeletal (46%), or other diagnoses (7%). Official national register data over a 4-year follow-up period were analysed. Extended statistical tools for multistate models were used to calculate transition probabilities between the following eight states: working, partial sick leave, full-time sick leave, medical rehabilitation, vocational rehabilitation, and disability pension (partial, permanent, and time-limited). Results During the follow-up there was an increased probability of working, a decreased probability of being on sick leave, and an increased probability of being on disability pension. The probability of RTW was not related to the work and benefit status at departure from the rehabilitation clinic. The patients had an average of 3.7 (range 0–18) transitions between work and the different benefits. Conclusions The process of RTW or of receiving disability pension was complex, and may take several years, with multiple transitions between work and different benefits. Access to reliable register data and the use of a multistate RTW model make it possible to describe the developmental nature and the different levels of the recovery and disability process. PMID:22954254

  8. Genetic parameters for oocyte number and embryo production within a bovine ovum pick-up-in vitro production embryo-production program.

    PubMed

    Merton, J S; Ask, B; Onkundi, D C; Mullaart, E; Colenbrander, B; Nielen, M

    2009-10-15

    Genetic factors influencing the outcome of bovine ovum pick-up-in vitro production (OPU-IVP) and its relation to female fertility were investigated. For the first time, genetic parameters were estimated for the number of cumulus-oocyte complexes (Ncoc), quality of cumulus-oocyte complexes (Qcoc), number and proportion of cleaved embryos at Day 4 (Ncleav(D4), Pcleav(D4)), and number and proportion of total and transferable embryos at Day 7 of culture (Nemb(D7), Pemb(D7) and NTemb(D7), PTemb(D7), respectively). Data were recorded by CRV (formerly Holland Genetics) from the OPU-IVP program from January 1995 to March 2006. Data were collected from 1508 Holstein female donors, both cows and pregnant virgin heifers, with a total of 18,702 OPU sessions. Data were analyzed with repeated-measure sire models with a permanent environment effect using ASREML (Holstein Friesian). Estimates of heritability were 0.25 for Ncoc, 0.09 for Qcoc, 0.19 for Ncleav(D4), 0.21 for Nemb(D7), 0.16 for NTemb(D7), 0.07 for Pcleav(D4), 0.12 for Pemb(D7), and 0.10 for PTemb(D7). Genetic correlation between Ncoc and Qcoc was close to zero, whereas genetic correlations between Ncoc and the number of embryos were positive and moderate to high for Nemb(D7) (0.47), NTemb(D7) (0.52), and Ncleav(D4) (0.85). Genetic correlations between Ncoc and percentages of embryos (Pcleav(D4), Pemb(D7), and PTemb(D7)) were all close to zero. Phenotypic correlations were in line with genetic correlations. Genetic and phenotypic correlations between Qcoc and all other traits were not significant, except for the phenotypic correlations between Qcoc and number of embryos, which were negative and low to moderate for Nemb(D7) (-0.20), NTemb(D7) (-0.24), and Ncleav(D4) (-0.43). Results suggest that cumulus-oocyte complex (COC) quality, based on cumulus investment, is independent of the total number of COCs collected via OPU and that, in general, a higher number of COCs will lead to a higher number of embryos produced. The correlation between the estimated breeding values for Ncoc and PTemb(D7) of sires in this study and the sires' breeding index for female fertility based on the Dutch cattle population was close to zero. This study revealed OPU-IVP traits (Nemb(D7), NTemb(D7), and Ncoc) that could be of potential value for selection. Introduction of such traits in breeding programs would enhance the number of offspring from superior donors as well as improve the cost efficiency of OPU-IVP programs.
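
    For readers unfamiliar with sire models: a sire transmits half of his additive genetic value, so the sire variance component estimates one quarter of the additive genetic variance, giving h2 = 4Vs/(Vs + Vpe + Ve). A minimal sketch, with made-up variance components chosen only to reproduce the reported h2 of 0.25 for Ncoc:

```python
# Illustrative variance components (sire, permanent environment,
# residual); these are not the study's estimates.
V_sire, V_pe, V_resid = 1.0, 3.0, 12.0

# The sire variance is one quarter of the additive genetic variance,
# hence the factor of 4 in the heritability estimate.
h2 = 4.0 * V_sire / (V_sire + V_pe + V_resid)
print(h2)  # 0.25
```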

  9. Impact of Pharmacists’ Participation in a Pharmacotherapy Follow-Up Program

    PubMed Central

    Dualde, Elena; Santonja, Francisco J.; Faus, Maria J.

    2012-01-01

    Objective. To evaluate the impact of a continuing pharmacy education (CPE) course on Spanish community pharmacists’ participation in a pharmacotherapy follow-up program. Design. Participation in a CPE course offered 4 times over a 4-year period via satellite teleconferencing was monitored and the data analyzed to determine the course’s impact on community pharmacists’ participation in a pharmacotherapy follow-up program. Assessment. Community pharmacists’ participation in the pharmaceutical care CPE course had a slightly positive impact on their participation in the pharmacotherapy follow-up program. In the best profiles, there was a probability of 7.3% that participants would participate in the pharmacotherapy follow-up program. Conclusions. Completion of pharmaceutical care CPE courses did not have a significant impact on pharmacists’ participation in a pharmacotherapy follow-up program. PMID:22438606

  10. DORMAN computer program (study 2.5). Volume 2: User's guide and programmer's guide. [development of data bank for computerized information storage of NASA programs

    NASA Technical Reports Server (NTRS)

    Wray, S. T., Jr.

    1973-01-01

    The DORMAN program was developed to create and modify a data bank containing data decks which serve as input to the DORCA computer program. Via a remote terminal, a user can access the bank, extract any data deck, modify that deck, output the modified deck to be input to the DORCA program, and save the modified deck in the data bank. This computer program assists in the utilization of the DORCA program. The program is dimensionless and operates almost entirely in integer mode. The program was developed on the CDC 6400/7600 complex for implementation on a UNIVAC 1108 computer.

  11. Simulating New Drop Test Vehicles and Test Techniques for the Orion CEV Parachute Assembly System

    NASA Technical Reports Server (NTRS)

    Morris, Aaron L.; Fraire, Usbaldo, Jr.; Bledsoe, Kristin J.; Ray, Eric; Moore, Jim W.; Olson, Leah M.

    2011-01-01

    The Crew Exploration Vehicle Parachute Assembly System (CPAS) project is engaged in a multi-year design and test campaign to qualify a parachute recovery system for human use on the Orion spacecraft. Test and simulation techniques have evolved concurrently to keep up with the demands of a challenging and complex system. The primary simulations used for preflight predictions and post-test data reconstructions are the Decelerator System Simulation (DSS), the Decelerator System Simulation Application (DSSA), and the Drop Test Vehicle Simulation (DTV-SIM). The goal of this paper is to provide a roadmap for future programs on the test-technique challenges and obstacles involved in executing a large-scale, multi-year parachute test program. A focus on flight simulation modeling and its correlation with the test techniques executed to obtain parachute performance parameters is presented.

  12. Teaching Beginners to Program: Some Cognitive Considerations.

    ERIC Educational Resources Information Center

    Rogers, Jean B.

    Learning to program involves developing an understanding of two hierarchies of concepts. One hierarchy consists of data and extends from very literal data (which represents only itself) to very abstract data incorporating variable values in complex interrelationships. The other hierarchy consists of the operations performed on the data and extends…

  13. Computer program for determining mass properties of a rigid structure

    NASA Technical Reports Server (NTRS)

    Hull, R. A.; Gilbert, J. L.; Klich, P. J.

    1978-01-01

    A computer program was developed for the rapid computation of the mass properties of complex structural systems. The program uses rigid body analyses and permits differences in structural material throughout the total system. It is based on the premise that complex systems can be adequately described by a combination of basic elemental shapes. Simple geometric data describing size and location of each element and the respective material density or weight of each element were the only required input data. From this minimum input, the program yields system weight, center of gravity, moments of inertia and products of inertia with respect to mutually perpendicular axes through the system center of gravity. The program also yields mass properties of the individual shapes relative to component axes.
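
    A minimal sketch of the composite-shape bookkeeping the abstract describes: summed mass, a mass-weighted center of gravity, and a parallel-axis transfer of one moment of inertia. The elemental shapes and numbers are illustrative.

```python
import numpy as np

# Each element: mass m (kg), CG position r (m), and moment of inertia
# about its own CG about the z-axis, Izz_cg (kg*m^2). Values illustrative.
elements = [
    {"m": 12.0, "r": np.array([0.0, 0.0, 0.0]), "Izz_cg": 0.5},
    {"m": 3.0,  "r": np.array([1.2, 0.4, 0.0]), "Izz_cg": 0.1},
]

M = sum(e["m"] for e in elements)
cg = sum(e["m"] * e["r"] for e in elements) / M  # mass-weighted CG

# Parallel-axis theorem: Izz about the system CG is the sum of each
# element's own Izz_cg plus m*d^2, d being the in-plane offset from the
# system CG.
Izz = sum(e["Izz_cg"] + e["m"] * np.sum((e["r"][:2] - cg[:2]) ** 2)
          for e in elements)
print(f"mass={M:.1f} kg, cg={cg.round(3)}, Izz={Izz:.3f} kg*m^2")
```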

  14. MASH Suite Pro: A Comprehensive Software Tool for Top-Down Proteomics*

    PubMed Central

    Cai, Wenxuan; Guner, Huseyin; Gregorich, Zachery R.; Chen, Albert J.; Ayaz-Guner, Serife; Peng, Ying; Valeja, Santosh G.; Liu, Xiaowen; Ge, Ying

    2016-01-01

    Top-down mass spectrometry (MS)-based proteomics is arguably a disruptive technology for the comprehensive analysis of all proteoforms arising from genetic variation, alternative splicing, and posttranslational modifications (PTMs). However, the complexity of top-down high-resolution mass spectra presents a significant challenge for data analysis. In contrast to the well-developed software packages available for data analysis in bottom-up proteomics, the data analysis tools in top-down proteomics remain underdeveloped. Moreover, despite recent efforts to develop algorithms and tools for the deconvolution of top-down high-resolution mass spectra and the identification of proteins from complex mixtures, a multifunctional software platform, which allows for the identification, quantitation, and characterization of proteoforms with visual validation, is still lacking. Herein, we have developed MASH Suite Pro, a comprehensive software tool for top-down proteomics with multifaceted functionality. MASH Suite Pro is capable of processing high-resolution MS and tandem MS (MS/MS) data using two deconvolution algorithms to optimize protein identification results. In addition, MASH Suite Pro allows for the characterization of PTMs and sequence variations, as well as the relative quantitation of multiple proteoforms in different experimental conditions. The program also provides visualization components for validation and correction of the computational outputs. Furthermore, MASH Suite Pro facilitates data reporting and presentation via direct output of the graphics. Thus, MASH Suite Pro significantly simplifies and speeds up the interpretation of high-resolution top-down proteomics data by integrating tools for protein identification, quantitation, characterization, and visual validation into a customizable and user-friendly interface. We envision that MASH Suite Pro will play an integral role in advancing the burgeoning field of top-down proteomics. PMID:26598644

  15. A web interface for easy flexible protein-protein docking with ATTRACT.

    PubMed

    de Vries, Sjoerd J; Schindler, Christina E M; Chauvot de Beauchêne, Isaure; Zacharias, Martin

    2015-02-03

    Protein-protein docking programs can give valuable insights into the structure of protein complexes in the absence of an experimental complex structure. Web interfaces can facilitate the use of docking programs by structural biologists. Here, we present an easy web interface for protein-protein docking with the ATTRACT program. While aimed at nonexpert users, the web interface still covers a considerable range of docking applications. The web interface supports systematic rigid-body protein docking with the ATTRACT coarse-grained force field, as well as various kinds of protein flexibility. The execution of a docking protocol takes up to a few hours on a standard desktop computer. Copyright © 2015 Biophysical Society. Published by Elsevier Inc. All rights reserved.

  16. From OO to FPGA :

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kou, Stephen; Palsberg, Jens; Brooks, Jeffrey

    Consumer electronics today such as cell phones often have one or more low-power FPGAs to assist with energy-intensive operations in order to reduce overall energy consumption and increase battery life. However, current techniques for programming FPGAs require people to be specially trained to do so. Ideally, software engineers could more readily take advantage of the benefits FPGAs offer by being able to program them using their existing skills, a common one being object-oriented programming. However, traditional techniques for compiling object-oriented languages are at odds with today's FPGA tools, which support neither pointers nor complex data structures. Open until now is the problem of compiling an object-oriented language to an FPGA in a way that harnesses this potential for huge energy savings. In this paper, we present a new compilation technique that feeds into an existing FPGA tool chain and produces FPGAs with up to almost an order of magnitude in energy savings compared to a low-power microprocessor while still retaining comparable performance and area usage.

  17. Evaluation design for a complex intervention program targeting loneliness in non-institutionalized elderly Dutch people.

    PubMed

    de Vlaming, Rianne; Haveman-Nies, Annemien; Van't Veer, Pieter; de Groot, Lisette Cpgm

    2010-09-13

    The aim of this paper is to provide the rationale for an evaluation design for a complex intervention program targeting loneliness among non-institutionalized elderly people in a Dutch community. Complex public health interventions characteristically use the combined approach of intervening on the individual and on the environmental level. It is assumed that the components of a complex intervention interact with and reinforce each other. Furthermore, implementation is highly context-specific and its impact is influenced by external factors. Although the entire community is exposed to the intervention components, each individual is exposed to different components with a different intensity. A logic model of change is used to develop the evaluation design. The model describes what outcomes may logically be expected at different points in time at the individual level. In order to address the complexity of a real-life setting, the evaluation design of the loneliness intervention comprises two types of evaluation studies. The first uses a quasi-experimental pre-test post-test design to evaluate the effectiveness of the overall intervention. A control community comparable to the intervention community was selected, with baseline measurements in 2008 and follow-up measurements scheduled for 2010. This study focuses on changes in the prevalence of loneliness and in the determinants of loneliness within individuals in the general elderly population. Complementarily, the second study is designed to evaluate the individual intervention components and focuses on delivery, reach, acceptance, and short-term outcomes. Different means of project records and surveys among participants are used to collect these data. Combining these two evaluation strategies has the potential to assess the effectiveness of the overall complex intervention and the contribution of the individual intervention components thereto.

  18. Evaluation design for a complex intervention program targeting loneliness in non-institutionalized elderly Dutch people

    PubMed Central

    2010-01-01

    Background The aim of this paper is to provide the rationale for an evaluation design for a complex intervention program targeting loneliness among non-institutionalized elderly people in a Dutch community. Complex public health interventions characteristically use the combined approach of intervening on the individual and on the environmental level. It is assumed that the components of a complex intervention interact with and reinforce each other. Furthermore, implementation is highly context-specific and its impact is influenced by external factors. Although the entire community is exposed to the intervention components, each individual is exposed to different components with a different intensity. Methods/Design A logic model of change is used to develop the evaluation design. The model describes what outcomes may logically be expected at different points in time at the individual level. In order to address the complexity of a real-life setting, the evaluation design of the loneliness intervention comprises two types of evaluation studies. The first uses a quasi-experimental pre-test post-test design to evaluate the effectiveness of the overall intervention. A control community comparable to the intervention community was selected, with baseline measurements in 2008 and follow-up measurements scheduled for 2010. This study focuses on changes in the prevalence of loneliness and in the determinants of loneliness within individuals in the general elderly population. Complementarily, the second study is designed to evaluate the individual intervention components and focuses on delivery, reach, acceptance, and short-term outcomes. Different means of project records and surveys among participants are used to collect these data. Discussion Combining these two evaluation strategies has the potential to assess the effectiveness of the overall complex intervention and the contribution of the individual intervention components thereto. PMID:20836840

  19. Investing in innovation: trade-offs in the costs and cost-efficiency of school feeding using community-based kitchens in Bangladesh.

    PubMed

    Gelli, Aulo; Suwa, Yuko

    2014-09-01

    School feeding programs have been a key response to the recent food and economic crises and function to some degree in nearly every country in the world. However, school feeding programs are complex and exhibit different, context-specific models or configurations. The objective was to examine the trade-offs, including the costs and cost-efficiency, of an innovative cluster kitchen implementation model in Bangladesh using a standardized framework. A supply chain framework based on international standards was used to provide benchmarks for meaningful comparisons across models. Implementation processes specific to the program in Bangladesh were mapped against this reference to provide a basis for standardized performance measures. Qualitative and quantitative data on key metrics were collected retrospectively using semistructured questionnaires following an ingredients approach, including both financial and economic costs. Costs were standardized to a 200-feeding-day year and 700 kcal daily. The cluster kitchen model had similarities with the semidecentralized and outsourced models in the literature, the main differences involving implementation scale, scale of purchasing volumes, and frequency of purchasing. Two important features stand out in terms of implementation: the nutritional quality of meals and the level of community involvement. The standardized full cost per child per year was US$110. Despite the nutritious content of the meals, the overall cost-efficiency in cost per nutrient output was lower than the benchmark for centralized programs, due mainly to support and start-up costs. Cluster kitchens provide an example of an innovative implementation model, combining an emphasis on quality meal delivery with strong community engagement. However, the standardized costs per child were above the average benchmarks for both low- and middle-income countries. In contrast to the existing benchmark data from mature, centralized models, the main cost drivers of the program were associated with support and start-up activities. Further research is required to better understand changes in cost drivers as programs mature.

  20. Strategic information for hospital service planning: a linked data study to inform an urban Aboriginal Health Liaison Officer program in Western Australia.

    PubMed

    Katzenellenbogen, Judith M; Miller, Laura J; Somerford, Peter; McEvoy, Suzanne; Bessarab, Dawn

    2015-09-01

    The aim of the present study was to provide descriptive planning data for a hospital-based Aboriginal Health Liaison Officer (AHLO) program, specifically quantifying episodes of care and outcomes within 28 days after discharge. A follow-up study of Aboriginal in-patient hospital episodes was undertaken using person-based linked administrative data from four South Metropolitan hospitals in Perth, Western Australia (2006-11). Outcomes included 28-day deaths, emergency department (ED) presentations and in-patient re-admissions. There were 8041 eligible index admissions among 5113 individuals, with episode volumes increasing by 31% over the study period. Among patients 25 years and older, the highest ranking comorbidities included injury (47%), drug and alcohol disorders (41%), heart disease (40%), infection (40%), mental illness (31%) and diabetes (31%). Most events (96%) ended in a regular discharge. Within 28 days, 24% of events resulted in ED presentations and 20% resulted in hospital readmissions. Emergency readmissions (13%) were twice as likely as booked re-admissions (7%). Stratified analyses showed poorer outcomes for older people, and for emergency and tertiary hospital admissions. Future planning must address the greater service volumes anticipated. The high prevalence of comorbidities requires intensive case management to address case complexity. These data will inform the refinement of the AHLO program to improve in-patient experiences and outcomes.

  1. Testing of Environmental Satellite Bus-Instrument Interfaces Using Engineering Models

    NASA Technical Reports Server (NTRS)

    Gagnier, Donald; Hayner, Rick; Nosek, Thomas; Roza, Michael; Hendershot, James E.; Razzaghi, Andrea I.

    2004-01-01

    This paper discusses the formulation and execution of a laboratory test of the electrical interfaces between multiple atmospheric scientific instruments and the spacecraft bus that carries them. The testing, performed in 2002, used engineering models of the instruments and the Aura spacecraft bus electronics. Aura is one of NASA's Earth Observing System missions. The test was designed to evaluate the complex interfaces in the command and data handling subsystems prior to integration of the complete flight instruments on the spacecraft. A problem discovered during the flight integration phase of the observatory can cause significant cost and schedule impacts. The tests successfully revealed problems and led to their resolution before the full-up integration phase, saving significant cost and schedule. This approach could be beneficial for future environmental satellite programs involving the integration of multiple, complex scientific instruments onto a spacecraft bus.

  2. MOLNs: A cloud platform for interactive, reproducible, and scalable spatial stochastic computational experiments in systems biology using PyURDME.

    PubMed

    Drawert, Brian; Trogdon, Michael; Toor, Salman; Petzold, Linda; Hellander, Andreas

    2016-01-01

    Computational experiments using spatial stochastic simulations have led to important new biological insights, but they require specialized tools and a complex software stack, as well as large and scalable compute and data analysis resources due to the large computational cost associated with Monte Carlo computational workflows. The complexity of setting up and managing a large-scale distributed computation environment to support productive and reproducible modeling can be prohibitive for practitioners in systems biology. This results in a barrier to the adoption of spatial stochastic simulation tools, effectively limiting the type of biological questions addressed by quantitative modeling. In this paper, we present PyURDME, a new, user-friendly spatial modeling and simulation package, and MOLNs, a cloud computing appliance for distributed simulation of stochastic reaction-diffusion models. MOLNs is based on IPython and provides an interactive programming platform for development of sharable and reproducible distributed parallel computational experiments.
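
    This is not PyURDME's API, but for flavor, the following is a minimal well-mixed Gillespie simulation of a single decay reaction, the kind of Monte Carlo kernel that spatial stochastic frameworks generalize to reaction-diffusion on meshes.

```python
import math
import random

def ssa_decay(n0: int, k: float, t_end: float, seed: int = 1):
    """Gillespie SSA for a single first-order decay reaction A -> 0."""
    rng = random.Random(seed)
    t, n, trace = 0.0, n0, [(0.0, n0)]
    while t < t_end and n > 0:
        a = k * n                               # total propensity
        t += -math.log(1.0 - rng.random()) / a  # exponential waiting time
        n -= 1                                  # fire the only reaction
        trace.append((t, n))
    return trace

for t, n in ssa_decay(n0=100, k=0.5, t_end=2.0)[:5]:
    print(f"t={t:.3f}  A={n}")
```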

  3. Introduction to the magnet and vacuum systems of an electron storage ring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weng, W.T.

    An accelerator or storage ring complex is a concerted interplay of various functional systems. For the convenience of discussion we can divide it into the following systems: injector, magnet, RF, vacuum, instrumentation, and control. In addition, conventional building construction and radiation safety considerations are also needed, and finally the beam lines, detectors, and data acquisition and analysis set-ups for research programs. Dr. L. Teng has given a comprehensive review of the whole complex and the operation of such a facility. I concentrate on the description of the magnet and vacuum systems. Only the general function of each system and the basic design concepts will be introduced; no detailed engineering practice will be given, as that is best done after a machine design is produced. For further understanding and reference, a bibliography is provided at the end of the paper.

  4. C++, object-oriented programming, and astronomical data models

    NASA Technical Reports Server (NTRS)

    Farris, A.

    1992-01-01

    Contemporary astronomy is characterized by increasingly complex instruments and observational techniques, higher data collection rates, and large data archives, placing severe stress on software analysis systems. The object-oriented paradigm represents a significant new approach to software design and implementation that holds great promise for dealing with this increased complexity. The basic concepts of this approach will be characterized in contrast to more traditional procedure-oriented approaches. The fundamental features of object-oriented programming will be discussed from a C++ programming language perspective, using examples familiar to astronomers. This discussion will focus on objects, classes and their relevance to the data type system; the principle of information hiding; and the use of inheritance to implement generalization/specialization relationships. Drawing on the object-oriented approach, features of a new database model to support astronomical data analysis will be presented.
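
    The paper's examples are in C++; the same two ideas, information hiding behind a small public interface and inheritance for generalization/specialization, can be sketched briefly in Python with illustrative class names:

```python
class Observation:
    def __init__(self, target: str, exposure_s: float):
        self._target = target        # hidden state: leading underscore
        self._exposure_s = exposure_s

    @property
    def target(self) -> str:         # read-only public view
        return self._target

    def describe(self) -> str:
        return f"{self._target} ({self._exposure_s:.0f} s)"

class Spectrum(Observation):         # specialization of Observation
    def __init__(self, target, exposure_s, wavelengths_nm):
        super().__init__(target, exposure_s)
        self._wavelengths_nm = wavelengths_nm

    def describe(self) -> str:       # refined behaviour, same interface
        return f"{super().describe()}, {len(self._wavelengths_nm)} spectral bins"

print(Spectrum("M31", 300, [400, 500, 600]).describe())
```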

  5. Chemistry Notes.

    ERIC Educational Resources Information Center

    School Science Review, 1982

    1982-01-01

    Presents laboratory procedures, classroom materials/activities, and demonstrations, including: vapor pressure of liquid mixtures and Raoult's law; preparation/analysis of transition metal complexes of ethylammonium chloride; atomic structure display using a ZX81 (includes complete program listing); "pop-up" models of molecules and ions;…

  6. Northeast regional and state trends in anuran occupancy from calling survey data (2001-2011) from the North American Amphibian Monitoring Program

    USGS Publications Warehouse

    Weir, Linda A.; Royle, Andy; Gazenski, Kimberly D.; Villena Carpio, Oswaldo

    2014-01-01

    We present the first regional trends in anuran occupancy from North American Amphibian Monitoring Program (NAAMP) data from 11 northeastern states, using 11 years of data. NAAMP is a long-term monitoring program in which observers collect data at assigned random roadside routes using a calling survey technique. We assessed occupancy trends for 17 species. Eight species had statistically significant regional trends; of these, seven were negative (Anaxyrus fowleri, Acris crepitans, Pseudacris brachyphona, Pseudacris feriarum-kalmi complex, Lithobates palustris, Lithobates pipiens, and Lithobates sphenocephalus) and one was positive (Hyla versicolor-chrysoscelis complex). We also assessed state-level trends for 101 species/state combinations; of these, 29 showed a significant decline and nine showed a significant increase in occupancy.

  7. MMA-EoS: A Computational Framework for Mineralogical Thermodynamics

    NASA Astrophysics Data System (ADS)

    Chust, T. C.; Steinle-Neumann, G.; Dolejš, D.; Schuberth, B. S. A.; Bunge, H.-P.

    2017-12-01

    We present a newly developed software framework, MMA-EoS, that evaluates phase equilibria and thermodynamic properties of multicomponent systems by Gibbs energy minimization, with application to mantle petrology. The code is versatile in terms of the equation-of-state and mixing properties and allows for the computation of properties of single phases, solution phases, and multiphase aggregates. Currently, the open program distribution contains equation-of-state formulations widely used, that is, Caloric-Murnaghan, Caloric-Modified-Tait, and Birch-Murnaghan-Mie-Grüneisen-Debye models, with published databases included. Through its modular design and easily scripted database, MMA-EoS can readily be extended with new formulations of equations-of-state and changes or extensions to thermodynamic data sets. We demonstrate the application of the program by reproducing and comparing physical properties of mantle phases and assemblages with previously published work and experimental data, successively increasing complexity, up to computing phase equilibria of six-component compositions. Chemically complex systems allow us to trace the budget of minor chemical components in order to explore whether they lead to the formation of new phases or extend stability fields of existing ones. Self-consistently computed thermophysical properties for a homogeneous mantle and a mechanical mixture of slab lithologies show no discernible differences that require a heterogeneous mantle structure as has been suggested previously. Such examples illustrate how thermodynamics of mantle mineralogy can advance the study of Earth's interior.
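
    As one example of the equation-of-state formulations mentioned, here is the standard third-order Birch-Murnaghan isothermal EOS; the formula is textbook material, but the MgO-like parameter values are illustrative round numbers, not MMA-EoS database entries.

```python
def birch_murnaghan(V, V0, K0, K0p):
    """Third-order Birch-Murnaghan pressure P(V), in the units of K0."""
    x = (V0 / V) ** (2.0 / 3.0)  # finite-strain compression measure
    return (1.5 * K0 * (x**3.5 - x**2.5)
            * (1.0 + 0.75 * (K0p - 4.0) * (x - 1.0)))

# Pressure at 10% compression for illustrative MgO-like parameters
# (V0 normalised to 1, K0 = 160 GPa, K0' = 4):
print(birch_murnaghan(V=0.9, V0=1.0, K0=160.0, K0p=4.0))  # ~ 20.8 GPa
```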

  8. The effect of a complex training program on skating abilities in ice hockey players.

    PubMed

    Lee, Changyoung; Lee, Sookyung; Yoo, Jaehyun

    2014-04-01

    [Purpose] Little data exist on systematic training programs to improve skating abilities in ice hockey players. The purpose of this study was to evaluate the effectiveness of a complex training program on skating abilities in ice hockey players. [Methods] Ten male ice hockey players (training group) who engaged in 12 weeks of complex training and skating training and ten male players (control group) who only participated in 12 weeks of skating training completed on-ice skating tests, including a 5 times 18 meters shuttle run, a t-test, a rink dash 5 times, and a line drill, before, during, and after the training. [Results] Significant group-by-time interactions were found in all skating ability tests. [Conclusion] The 12-week complex training program intervention improved the skating abilities of the ice hockey players.

  9. A computer program to trace seismic ray distribution in complex two-dimensional geological models

    USGS Publications Warehouse

    Yacoub, Nazieh K.; Scott, James H.

    1970-01-01

    A computer program has been developed to trace seismic rays and their amplitudes and energies through complex two-dimensional geological models, for which boundaries between elastic units are defined by a series of digitized X-, Y-coordinate values. Input data for the program include problem identification, control parameters, model coordinates and elastic parameters for the elastic units. The program evaluates the partitioning of ray amplitude and energy at elastic boundaries, and computes the total travel time, total travel distance and other parameters for rays arriving at the earth's surface. Instructions are given for punching program control cards and data cards, and for arranging input card decks. An example of printer output for a simple problem is presented. The program is written in FORTRAN IV language. The listing of the program is shown in the Appendix, with an example output from a CDC-6600 computer.
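
    The partitioning of a ray at an elastic boundary starts from Snell's law; a minimal sketch with illustrative layer velocities (the program additionally tracks amplitude and energy partitioning, not shown here):

```python
import math

def refract(theta1_deg: float, v1: float, v2: float):
    """Snell's law: transmission angle in degrees, or None past critical."""
    s = math.sin(math.radians(theta1_deg)) * v2 / v1
    return math.degrees(math.asin(s)) if abs(s) <= 1.0 else None

# Ray hitting a faster layer (2.0 -> 3.5 km/s) at 30 degrees incidence:
print(refract(30.0, v1=2.0, v2=3.5))  # ~ 61.0 degrees
print(refract(40.0, v1=2.0, v2=3.5))  # None: beyond the critical angle
```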

  10. EAGLEView: A surface and grid generation program and its data management

    NASA Technical Reports Server (NTRS)

    Remotigue, M. G.; Hart, E. T.; Stokes, M. L.

    1992-01-01

    An old and proven grid generation code, the EAGLE grid generation package, is given an added dimension of a graphical interface and a real-time data base manager. The Numerical Aerodynamic Simulation (NAS) Panel Library is used for the graphical user interface. Through the panels, EAGLEView constructs the EAGLE script command and sends it to EAGLE to be processed. After the object is created, the script is saved in a mini-buffer which can be edited and/or saved and reinterpreted. The graphical objects are set up in a linked list and can be selected or queried by pointing and clicking the mouse. The added graphical enhancement to the EAGLE system emphasizes the unique capability to construct field points around complex geometry and visualize the construction every step of the way.

  11. Wildland fire management policy: Learning from the past and present and responding to future challenges

    Treesearch

    Tom Zimmerman

    2009-01-01

    Wildland fire is one of the most important vegetation- shaping factors that land managers deal with. It is our highest risk, most complex, and potentially highest consequence program. Wildland fire management policy is the most important element in defining the direction, scope, and focus of the program. What is policy? If we look it up in Merriam-Webster's...

  12. Economic analysis of measles elimination program in the Republic of Korea, 2001: a cost benefit analysis study.

    PubMed

    Bae, Geun-Ryang; Choe, Young June; Go, Un Yeong; Kim, Yong-Ik; Lee, Jong-Koo

    2013-05-31

    In this study, we modeled the cost-benefit analysis for three different measles vaccination strategies based upon three different measles-containing vaccines in Korea, 2001. We employed an economic analysis model using vaccination coverage data and population-based measles surveillance data, along with available estimates of the costs for the different strategies. In addition, we included an analysis of the benefit of reducing complications from mumps and rubella. We evaluated four different strategies: strategy 1, keep-up program with a second dose of measles-mumps-rubella (MMR) vaccine at 4-6 years without a catch-up campaign; strategy 2, additional catch-up campaign with measles (M) vaccine; strategy 3, catch-up campaign with measles-rubella (MR) vaccine; and strategy 4, catch-up campaign with MMR vaccine. The cost of vaccination included costs for vaccines, vaccination practices, and other administrative expenses. The direct benefit was estimated using data from the National Health Insurance Company, a government-operated system that reimburses all medical costs spent on designated illnesses in Korea. With the routine one-dose MMR vaccination program, we estimated a baseline of 178,560 measles cases over 20 years; when the catch-up campaign with M, MR, or MMR vaccines was conducted, we estimated the measles cases would decrease to 5936 cases. Among all strategies, the two-dose MMR keep-up program with MR catch-up campaign showed the highest benefit-cost ratio of 1.27, with a net benefit of 51.6 billion KRW. Across different vaccination strategies, our findings suggest that an MR catch-up campaign in conjunction with a two-dose MMR keep-up program was the most appropriate option in terms of economic costs and public health effects associated with the measles elimination strategy in Korea. Copyright © 2013 Elsevier Ltd. All rights reserved.
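
    As a quick check on the headline figures, the identities B/C = 1.27 and B - C = 51.6 billion KRW pin down the implied totals, assuming both refer to the same cost and benefit streams:

```python
# Back-solve the implied totals from the reported benefit-cost ratio and
# net benefit: C = net / (BCR - 1), B = BCR * C.
bcr, net = 1.27, 51.6
cost = net / (bcr - 1.0)
benefit = bcr * cost
print(f"implied cost ~ {cost:.1f} bn KRW, benefit ~ {benefit:.1f} bn KRW")
# implied cost ~ 191.1 bn KRW, benefit ~ 242.7 bn KRW
```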

  13. New Exploration of Kerguelen Plateau Margins

    NASA Astrophysics Data System (ADS)

    Vially, R.; Roest, W. R.; Loubrieu, B.; Courreges, E.; Lecomte, J.; Patriat, M.; Pierre, D.; Schaming, M.; Schmitz, J.

    2008-12-01

    France ratified the United Nations Convention on the Law of the Sea in 1996, and has since undertaken an ambitious program of bathymetric and seismic data acquisition (EXTRAPLAC Program) to support claims for the extension of the legal continental shelf, in accordance with Article 76 of this convention. For this purpose, three oceanographic surveys took place on board the R/V Marion Dufresne II on the Kerguelen Plateau, in the Southern Indian Ocean: MD137-Kergueplac1 (February 2004), MD150-Kergueplac2 (October 2005) and MD165-Kergueplac3 (January 2008), operated by the French Polar Institute. Thus, more than 20 000 km of multibeam bathymetric, magnetic and gravimetric profiles, and almost 6 000 km of seismic profiles were acquired during a total of 62 days of survey in the study area. Ifremer's "rapid seismic" system was used, comprising 4 guns and a 24-trace digital streamer, operated at speeds up to 10 knots. In addition to its use for the Extraplac Program, the data set issued from these surveys gives the opportunity to improve our knowledge of the structure of the Kerguelen Plateau and more particularly of its complex margins. In this poster, we will show the high-resolution bathymetry (200 m) data set, which allows us to characterise the irregular morphology of the sea floor on the northern Kerguelen Plateau, marked by ridges and volcano chains, radial to the plateau, that intersect the oceanic basin on the NE edge of the Kerguelen Plateau. We will also show magnetic and gravity data, which help us to understand the emplacement of the oceanic plateau and support kinematic reconstructions. The seismic profiles show that the acoustic basement of the plateau is not much tectonised and displays a very smooth texture, clearly distinguishing it from typical oceanic basement. Both along the edge of the plateau and in the abyssal plain, sediments have variable thicknesses. The sediments on the margin of the plateau are up to 1200 meters thick and display irregular crisscross patterns, suggesting the presence of important bottom currents.

  14. Paying pharmacists for patient care

    PubMed Central

    Houle, Sherilyn K. D.; Grindrod, Kelly A.; Chatterley, Trish; Tsuyuki, Ross T.

    2014-01-01

    Background: Expansion of scope of practice and diminishing revenues from dispensing are requiring pharmacists to increasingly adopt clinical care services into their practices. Pharmacists must be able to receive payment in order for provision of clinical care to be sustainable. The objective of this study is to update a previous systematic review by identifying remunerated pharmacist clinical care programs worldwide and reporting on uptake and patient care outcomes observed as a result. Methods: Literature searches were performed in several databases, including MEDLINE, Embase and International Pharmaceutical Abstracts, for papers referencing remuneration, pharmacy and cognitive services. Searches of the grey literature and Internet were also conducted. Papers and programs were identified up to December 2012 and were included if they were not reported in our previous review. One author performed data abstraction, which was independently reviewed by a second author. All results are presented descriptively. Results: Sixty new remunerated programs were identified across Canada, the United States, Europe, Australia and New Zealand, ranging in complexity from emergency contraception counseling to minor ailments schemes and comprehensive medication management. In North America, the average fee provided for a medication review is $68.86 (all figures are given in Canadian dollars), with $23.37 offered for a follow-up visit and $15.16 for prescription adaptations. Time-dependent fees were reimbursed at $93.60 per hour on average. Few programs evaluated uptake and outcomes of these services but, when available, indicated slow uptake but improved chronic disease markers and cost savings. Discussion: Remuneration for pharmacists’ clinical care services is highly variable, with few programs reporting program outcomes. Programs and pharmacists are encouraged to examine the time required to perform these activities and the outcomes achieved to ensure that fees are adequate to sustain these patient care activities. PMID:25360148

  15. Parent-child development center follow-up project: child behavior problem results.

    PubMed

    Johnson, Dale L

    2006-07-01

    The long-term effectiveness of the Parent-Child Development Centers (PCDCs) as programs to prevent behavior problems in children was examined with follow-up data collected 6-13 years after program completion. Data were collected for 581 children who had been in the programs with their mothers (Ns: Birmingham, 151; New Orleans, 186; Houston, 244). Mothers and teachers were interviewed. There were few significant differences between program and control groups. Only the early cohorts of the Houston program showed significant differences between groups on the Child Behavior Checklist (CBCL). EDITORS' STRATEGIC IMPLICATIONS: This is a rare example of long-term longitudinal evaluation of a cross-site prevention program with a large sample size. Practitioners and program designers will be interested in the author's descriptions of cohort and site implementation differences. The absence of major effects at follow-up (despite significant short-term effects) in this well-designed study must caution us against thinking of early prevention programs as inoculations.

  16. Current status of neonatal follow-up in Canada

    PubMed Central

    Synnes, Anne R; Lefebvre, Francine; Cake, Heather A

    2006-01-01

    Follow-up programs in Canada collect audit and outcome research data, and provide clinical and preventive health care to extremely premature survivors and other new survivors of neonatal intensive care. Results of a 2001 to 2002 survey of Canadian follow-up programs showed a tremendous variation in the patient populations seen, the timing of visits and the evaluations performed. A description of the new Quebec consortium of follow-up programs is provided and possible future directions are discussed. PMID:19030287

  17. Process evaluation of TXT2BFiT: a multi-component mHealth randomised controlled trial to prevent weight gain in young adults.

    PubMed

    Partridge, Stephanie R; Allman-Farinelli, Margaret; McGeechan, Kevin; Balestracci, Kate; Wong, Annette T Y; Hebden, Lana; Harris, Mark F; Bauman, Adrian; Phongsavan, Philayrath

    2016-01-19

    TXT2BFiT was one of the first innovative mHealth programs designed for young adults (18-35 years) with demonstrated efficacy in weight management. However, research is lacking to understand intervention effectiveness, especially in complex, multi-component mHealth programs. This paper investigates participant perceptions of and engagement with the components of the TXT2BFiT program to understand program effects. Process evaluation data were collected continuously for the study duration. The TXT2BFiT program was a multi-component lifestyle program delivered intensively for 3 months followed by a 6-month maintenance phase. Program components included personalised coaching calls, text messages, emails, smartphone apps and website access. Process evaluation measures included frequency of use of components and the number of components used (online survey data); dose delivered and engagement with program components (researcher logs and web platform reports); frequency, timing and difficulties experienced with program components (online survey data); and overall perceptions of program components (online survey data and semi-structured telephone interviews). Qualitative data analysis was performed using NVivo10. Over 80% of participants completed post-intervention surveys (3 months: intervention n = 110, control n = 104) and follow-up surveys (9 months: intervention n = 96, control n = 104). Thirty intervention participants completed semi-structured telephone interviews. Participants reported high use of coaching calls, text messages and emails and no issues with content delivery from these components, which they described as helping them to achieve their goals. Website and app use and engagement were low for the duration of the program. Participants would prefer incorporation of the self-monitoring apps and website resources into one smartphone application that can be individualised by entry of their personal data. Our process evaluation has allowed a comprehensive understanding of use of and preference for different program components. The high value placed on the coaching calls is consistent with a desire for personalisation of the mHealth program and even further tailoring of text messages and emails. The findings of this study will be used to revise TXT2BFiT for future users. The trial is registered with the Australian New Zealand Clinical Trials Registry (ACTRN12612000924853).

  18. VESPA: Developing the Planetary Science Virtual Observatory in H2020

    NASA Astrophysics Data System (ADS)

    Erard, S.; Cecconi, B.; Le Sidaner, P.; Capria, M. T.; Rossi, A. P.; Schmitt, B.; Andre, N.; Vandaele, A. C.; Scherf, M.; Hueso, R.; Maattanen, A. E.; Thuillot, W.; Achilleos, N.; Marmo, C.; Santolik, O.; Benson, K.

    2015-12-01

    In the framework of the Europlanet-RI program, a prototype Virtual Observatory dedicated to Planetary Science has been set up. Most of the activity was dedicated to the definition of standards to handle data in this field. The aim was to facilitate searches in big archives as well as in sparse databases, to make on-line data access and visualization possible, and to allow small data providers to make their data available in an interoperable environment with minimum effort. This system makes intensive use of studies and developments led in Astronomy (IVOA), Solar Science (HELIO), and space archive services (IPDA). A general standard has been devised to handle the specific complexity of Planetary Science, e.g. in terms of measurement types and coordinate frames [1]. A procedure has been identified to install small data services, and several hands-on sessions have already been organized. A specific client (VESPA) has been developed at VO-Paris (http://vespa.obspm.fr), using a resolver for target names. Selected data can be sent to VO visualization tools such as TOPCAT or Aladin through the SAMP protocol. The Europlanet H2020 program, started in September 2015, will provide support to new data services in Europe (30 to 50 expected) and focus on the improvement of the infrastructure. Future steps will include the development of a connection between the VO world and GIS tools, and integration of heliophysics, planetary plasma and reference spectroscopic data. The Europlanet H2020 project is funded by the European Commission under the H2020 Program, grant 654208. [1] Erard et al., Astron. & Comp., 2014.
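
    As an illustration of the SAMP step, the short Python sketch below broadcasts a "load this VOTable" message to whatever VO tools are listening (e.g., TOPCAT or Aladin) via astropy's SAMP client. The file URL and table name are placeholders, and a SAMP hub must already be running (TOPCAT starts one automatically); this is generic SAMP usage, not VESPA's own code.

      from astropy.samp import SAMPIntegratedClient

      client = SAMPIntegratedClient()
      client.connect()  # attaches to the running SAMP hub

      # Ask all connected clients to load a (placeholder) local VOTable.
      client.notify_all({
          "samp.mtype": "table.load.votable",
          "samp.params": {
              "url": "file:///tmp/example_table.xml",  # placeholder path
              "name": "example_table",
          },
      })
      client.disconnect()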

  19. [Progress in synthetic biology of "973 Funding Program" in China].

    PubMed

    Chen, Guoqiang; Wang, Ying

    2015-06-01

    This paper reviews progress made in China since 2011 in the area of synthetic biology supported by the State Basic Research 973 Program. By the end of 2014, nine synthetic biology projects had been initiated under the 973 Funding Program, with an emphasis on microbial manufacturing. Combined with the very recent launch of one project on mammalian cell synthetic biology and another on plant synthetic biology, Chinese synthetic biology research reflects a focus on manufacturing while not giving up efforts on the synthetic biology of complex systems.

  20. Pyrame 3, an online framework for Calice SiW-Ecal

    NASA Astrophysics Data System (ADS)

    Magniette, F.; Irles, A.

    2018-03-01

    Pyrame 3 is the new version of the Pyrame framework [1], with emphasis on online data treatment and the scripting of complex tasks. A new mechanism has been implemented to allow any module to treat and publish data in real time; those data are made available to any requesting module. A circular buffer mechanism relaxes the real-time constraint, serving slower programs through generic subsampling. In addition, a programming facility called the event-loop is provided in C/C++ to ease the development of monitoring programs. On the SiW-Ecal prototype, the acquisition chain launches a set of online decoders that make available the raw data plus some basic reconstruction data (true coordinates, true time, data quality tags, etc.). With the event-loop, it is now straightforward to implement new online monitoring programs. The scripting mechanism has also been enhanced to give scripts complete control of the detector. This way, we are able to script and monitor complex behaviours like position or energy scanning, calibrations or data-driven reconfigurations.
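
    The circular-buffer mechanism can be illustrated in a few lines: the acquisition side always writes and never blocks, while a slower monitor takes subsampled snapshots at its own pace. This Python sketch shows the idea only; it is not Pyrame's actual API.

      from collections import deque

      class CircularBuffer:
          def __init__(self, capacity: int):
              # Oldest items are silently dropped once capacity is reached,
              # so a fast producer is never held up by slow consumers.
              self.buf = deque(maxlen=capacity)

          def publish(self, item):
              self.buf.append(item)  # non-blocking for the acquisition side

          def snapshot(self, every_nth: int = 1):
              # Slow consumers subsample the recent history generically.
              return list(self.buf)[::every_nth]

      ring = CircularBuffer(capacity=1000)
      for event in range(10_000):
          ring.publish(event)
      monitor_view = ring.snapshot(every_nth=10)  # 100 recent items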

  1. Inventory and perspectives of chronic disease management programs in Switzerland: an exploratory survey.

    PubMed

    Peytremann-Bridevaux, Isabelle; Burnand, Bernard

    2009-10-07

    To describe chronic disease management programs active in Switzerland in 2007, using an exploratory survey. We searched the internet (Swiss official websites and Swiss web-pages, using Google), a medical electronic database (Medline), reference lists of pertinent articles, and contacted key informants. Programs met our operational definition of chronic disease management if their interventions targeted a chronic disease, included a multidisciplinary team (≥2 healthcare professionals), lasted at least six months, and had already been implemented and were active in December 2007. We developed an extraction grid and collected data pertaining to eight domains (patient population, intervention recipient, intervention content, delivery personnel, method of communication, intensity and complexity, environment, clinical outcomes). We identified seven programs fulfilling our operational definition of chronic disease management. Programs targeted patients with diabetes, hypertension, heart failure, obesity, psychosis and breast cancer. Interventions were multifaceted; all included education and half considered planned follow-ups. The recipients of the interventions were patients, and healthcare professionals involved were physicians, nurses, social workers, psychologists and case managers of various backgrounds. In Switzerland, a country with universal healthcare insurance coverage and little incentive to develop new healthcare strategies, chronic disease management programs are scarce. For future developments, appropriate evaluations of existing programs, involvement of all healthcare stakeholders, strong leadership and political will are, at least, desirable.

  2. Hierarchical Aligned Cluster Analysis for Temporal Clustering of Human Motion.

    PubMed

    Zhou, Feng; De la Torre, Fernando; Hodgins, Jessica K

    2013-03-01

    Temporal segmentation of human motion into plausible motion primitives is central to understanding and building computational models of human motion. Several issues contribute to the challenge of discovering motion primitives: the exponential nature of all possible movement combinations, the variability in the temporal scale of human actions, and the complexity of representing articulated motion. We pose the problem of learning motion primitives as one of temporal clustering, and derive an unsupervised hierarchical bottom-up framework called hierarchical aligned cluster analysis (HACA). HACA finds a partition of a given multidimensional time series into m disjoint segments such that each segment belongs to one of k clusters. HACA combines kernel k-means with the generalized dynamic time alignment kernel to cluster time series data. Moreover, it provides a natural framework to find a low-dimensional embedding for time series. HACA is efficiently optimized with a coordinate descent strategy and dynamic programming. Experimental results on motion capture and video data demonstrate the effectiveness of HACA for segmenting complex motions and as a visualization tool. We also compare the performance of HACA to state-of-the-art algorithms for temporal clustering on data of a honey bee dance. The HACA code is available online.
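
    The dynamic-programming ingredient can be illustrated with plain dynamic time warping, a simpler relative of the generalized dynamic time alignment kernel that HACA combines with kernel k-means; the sketch below is textbook DTW, not the HACA code.

      import numpy as np

      def dtw_distance(x: np.ndarray, y: np.ndarray) -> float:
          # Classic O(n*m) dynamic program: D[i, j] is the cheapest
          # alignment cost of x[:i] against y[:j].
          n, m = len(x), len(y)
          D = np.full((n + 1, m + 1), np.inf)
          D[0, 0] = 0.0
          for i in range(1, n + 1):
              for j in range(1, m + 1):
                  cost = abs(x[i - 1] - y[j - 1])
                  D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
          return float(D[n, m])

      a = np.sin(np.linspace(0, 2 * np.pi, 50))
      b = np.sin(np.linspace(0, 2 * np.pi, 70))  # same motion, different tempo
      print(dtw_distance(a, b))  # small despite the unequal lengths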

  3. Determinants of pediatric cataract program outcomes and follow-up in a large series in Mexico.

    PubMed

    Congdon, Nathan G; Ruiz, Sergio; Suzuki, Maki; Herrera, Veronica

    2007-10-01

    To report determinants of outcomes and follow-up in a large Mexican pediatric cataract project. Hospital Luis Sanchez Bulnes, Mexico City, Mexico. Data were collected prospectively from a pediatric cataract surgery program at the Hospital Luis Sanchez Bulnes, implemented by Helen Keller International. Preoperative data included age, sex, baseline visual acuity, type of cataract, laterality, and presence of conditions such as amblyopia. Surgical data included vitrectomy, capsulotomy, complications, and use of intraocular lenses (IOLs). Postoperative data included final visual acuity, refraction, number of follow-up visits, and program support for follow-up. Of 574 eyes of 415 children (mean age 7.1 ± 4.7 [SD] years), IOLs were placed in 416 (87%). At least 1 follow-up was attended by 408 patients (98.3%) (mean total follow-up 3.5 ± 1.8 months); 40% of eyes achieved a final visual acuity of 6/18 or better. Children living farther from the hospital had fewer postoperative visits (P = .04), while children receiving program support had more visits (P = .001). Factors predictive of better acuity included receiving an IOL during surgery (P = .04) and provision of postoperative spectacles (P = .001). Predictive of worse acuity were amblyopia (P = .003), postoperative complications (P = .0001), unilateral surgery (P = .0075), and female sex (P = .045). The results underscore the importance of surgical training in reducing complications, early intervention before amblyopia (observed in 40% of patients) can develop, and vigorous treatment if amblyopia is present. The positive impact of program support on follow-up is encouraging, although direct financial support may pose a problem for sustainability. More work is needed to understand reasons for worse outcomes in girls.

  4. ROOT — A C++ framework for petabyte data storage, statistical analysis and visualization

    NASA Astrophysics Data System (ADS)

    Antcheva, I.; Ballintijn, M.; Bellenot, B.; Biskup, M.; Brun, R.; Buncic, N.; Canal, Ph.; Casadei, D.; Couet, O.; Fine, V.; Franco, L.; Ganis, G.; Gheata, A.; Maline, D. Gonzalez; Goto, M.; Iwaszkiewicz, J.; Kreshuk, A.; Segura, D. Marcos; Maunder, R.; Moneta, L.; Naumann, A.; Offermann, E.; Onuchin, V.; Panacek, S.; Rademakers, F.; Russo, P.; Tadel, M.

    2009-12-01

    ROOT is an object-oriented C++ framework conceived in the high-energy physics (HEP) community, designed for storing and analyzing petabytes of data in an efficient way. Any instance of a C++ class can be stored into a ROOT file in a machine-independent compressed binary format. In ROOT the TTree object container is optimized for statistical data analysis over very large data sets by using vertical data storage techniques. These containers can span a large number of files on local disks, the web, or a number of different shared file systems. In order to analyze this data, the user can choose from a wide set of mathematical and statistical functions, including linear algebra classes, numerical algorithms such as integration and minimization, and various methods for performing regression analysis (fitting). In particular, the RooFit package allows the user to perform complex data modeling and fitting, while the RooStats library provides abstractions and implementations for advanced statistical tools. Multivariate classification methods based on machine learning techniques are available via the TMVA package. Central to these analysis tools are the histogram classes, which provide binning of one- and multi-dimensional data. Results can be saved in high-quality graphical formats like PostScript and PDF or in bitmap formats like JPG or GIF. The result can also be stored into ROOT macros that allow a full recreation and rework of the graphics. Users typically create their analysis macros step by step, making use of the interactive C++ interpreter CINT, while running over small data samples. Once the development is finished, they can run these macros at full compiled speed over large data sets, using on-the-fly compilation, or by creating a stand-alone batch program. Finally, if processing farms are available, the user can reduce the execution time of intrinsically parallel tasks — e.g. data mining in HEP — by using PROOF, which will take care of optimally distributing the work over the available resources in a transparent way. Program summary Program title: ROOT Catalogue identifier: AEFA_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEFA_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: LGPL No. of lines in distributed program, including test data, etc.: 3 044 581 No. of bytes in distributed program, including test data, etc.: 36 325 133 Distribution format: tar.gz Programming language: C++ Computer: Intel i386, Intel x86-64, Motorola PPC, Sun Sparc, HP PA-RISC Operating system: GNU/Linux, Windows XP/Vista, Mac OS X, FreeBSD, OpenBSD, Solaris, HP-UX, AIX Has the code been vectorized or parallelized?: Yes RAM: >55 Mbytes Classification: 4, 9, 11.9, 14 Nature of problem: Storage, analysis and visualization of scientific data Solution method: Object store, wide range of analysis algorithms and visualization methods Additional comments: For an up-to-date author list see: http://root.cern.ch/drupal/content/root-development-team and http://root.cern.ch/drupal/content/former-root-developers Running time: Depending on the data size and complexity of analysis algorithms References: http://root.cern.ch.
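
    A typical interactive workflow of the kind described (fill a histogram, fit it with a built-in model, write the result to a machine-independent ROOT file) looks like this through ROOT's Python bindings; the histogram, seed and file names are arbitrary.

      import ROOT  # PyROOT bindings shipped with ROOT

      h = ROOT.TH1F("h", "toy data;x;entries", 100, -5, 5)
      rng = ROOT.TRandom3(42)
      for _ in range(10_000):
          h.Fill(rng.Gaus(0.0, 1.0))  # Gaussian toy measurements

      h.Fit("gaus")  # fit with ROOT's built-in Gaussian model
      f = h.GetFunction("gaus")
      print("fitted mean:", f.GetParameter(1), "sigma:", f.GetParameter(2))

      out = ROOT.TFile("toy.root", "RECREATE")  # compressed binary format
      h.Write()
      out.Close()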

  5. A non-linear regression analysis program for describing electrophysiological data with multiple functions using Microsoft Excel.

    PubMed

    Brown, Angus M

    2006-04-01

    The objective of this study was to demonstrate a method for fitting complex electrophysiological data with multiple functions using the SOLVER add-in of the ubiquitous spreadsheet Microsoft Excel. SOLVER minimizes the sum of the squared differences between the data to be fit and the function(s) describing the data using an iterative generalized reduced gradient method. While it is a straightforward procedure to fit data with linear functions, and we have previously demonstrated a method of non-linear regression analysis of experimental data based upon a single function, it is more complex to fit data with multiple functions, usually requiring specialized, expensive computer software. In this paper we describe an easily understood program for fitting experimentally acquired data, in this case the stimulus-evoked compound action potential from the mouse optic nerve, with multiple Gaussian functions. The program is flexible and can be applied to describe data with a wide variety of user-input functions.
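
    The same kind of multi-function fit is easy to reproduce outside a spreadsheet. The sketch below fits a sum of two Gaussians to synthetic data by iterative least squares, with SciPy's curve_fit standing in for SOLVER's generalized reduced gradient method; like SOLVER, it needs reasonable starting guesses.

      import numpy as np
      from scipy.optimize import curve_fit

      def two_gaussians(t, a1, mu1, s1, a2, mu2, s2):
          # Sum of two Gaussian components, the multi-function model.
          return (a1 * np.exp(-((t - mu1) / s1) ** 2)
                  + a2 * np.exp(-((t - mu2) / s2) ** 2))

      t = np.linspace(0, 10, 200)
      rng = np.random.default_rng(0)
      y = two_gaussians(t, 1.0, 3.0, 0.6, 0.5, 6.0, 1.0) \
          + rng.normal(0, 0.02, t.size)  # synthetic noisy recording

      p0 = [1, 3, 1, 0.5, 6, 1]  # starting guesses for the 6 parameters
      popt, _ = curve_fit(two_gaussians, t, y, p0=p0)
      print(popt)  # recovered amplitudes, centres and widths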

  6. High-performance technology for indexing of high volumes of Earth remote sensing data

    NASA Astrophysics Data System (ADS)

    Strotov, Valery V.; Taganov, Alexander I.; Kolesenkov, Aleksandr N.; Kostrov, Boris V.

    2017-10-01

    This paper suggests a technology for the search, indexing, cataloging and distribution of aerospace images on the basis of a geo-information approach with cluster and spectral analysis, and describes the information and algorithmic support for the system. The functional scheme of the system and the structure of the geographical database have been developed on the basis of geographical online-portal technology. Taking into account the heterogeneity of information obtained from various sources, it is reasonable to apply a geo-information platform that supports analyzing the spatial location of objects and territories and executing complex processing of the information. The geo-information platform is based on a cartographic foundation with a uniform coordinate system, the geographical database, and a set of algorithms and program modules for various tasks. A mechanism is suggested by which individual users and companies can add images, taken with professional and amateur devices and processed by various software tools, to the system's archive. Combined use of visual and instrumental approaches significantly expands the application area of Earth remote sensing data. Development and implementation of new algorithms based on the combined use of new methods for processing large volumes of structured and unstructured data will increase the periodicity and rate of data updating. The paper shows that application of the original algorithms for search, indexing and cataloging of aerospace images provides easy access to information spread across hundreds of suppliers and can increase the access rate to aerospace images by up to 5 times compared with current analogues.

  7. Modeling Complex Equilibria in ITC Experiments: Thermodynamic Parameters Estimation for a Three Binding Site Model

    PubMed Central

    Le, Vu H.; Buscaglia, Robert; Chaires, Jonathan B.; Lewis, Edwin A.

    2013-01-01

    Isothermal Titration Calorimetry, ITC, is a powerful technique that can be used to estimate a complete set of thermodynamic parameters (e.g. Keq (or ΔG), ΔH, ΔS, and n) for a ligand binding interaction described by a thermodynamic model. Thermodynamic models are constructed by combination of equilibrium constant, mass balance, and charge balance equations for the system under study. Commercial ITC instruments are supplied with software that includes a number of simple interaction models, for example one binding site, two binding sites, sequential sites, and n-independent binding sites. More complex models for example, three or more binding sites, one site with multiple binding mechanisms, linked equilibria, or equilibria involving macromolecular conformational selection through ligand binding need to be developed on a case by case basis by the ITC user. In this paper we provide an algorithm (and a link to our MATLAB program) for the non-linear regression analysis of a multiple binding site model with up to four overlapping binding equilibria. Error analysis demonstrates that fitting ITC data for multiple parameters (e.g. up to nine parameters in the three binding site model) yields thermodynamic parameters with acceptable accuracy. PMID:23262283
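
    For intuition, the simplest member of this model family, a single 1:1 site, already shows how an equilibrium constant and a mass balance combine before any regression is attempted. The parameter values below are hypothetical, and the paper's three-site model adds further coupled equilibria following the same pattern.

      import numpy as np

      def complex_conc(M_tot, L_tot, Kd):
          # Physical root of [ML]^2 - (Mt + Lt + Kd)[ML] + Mt*Lt = 0,
          # i.e. the 1:1 mass balance (Mt - [ML])(Lt - [ML]) = Kd*[ML].
          b = M_tot + L_tot + Kd
          return (b - np.sqrt(b * b - 4.0 * M_tot * L_tot)) / 2.0

      # Hypothetical cell contents: heat per injection tracks the
      # increment of complex concentration (dH in J/mol, V in litres).
      Mt, Kd, dH, V = 10e-6, 1e-6, -40e3, 1.4e-3
      L_tot = np.linspace(0, 30e-6, 26)  # cumulative titrant in the cell
      ML = complex_conc(Mt, L_tot, Kd)
      heats = dH * V * np.diff(ML)  # one simulated heat per injection
      print(heats[:3])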

  8. A Generalized National Planning Approach for Admission Capacity in Higher Education: A Nonlinear Integer Goal Programming Model with a Novel Differential Evolution Algorithm

    PubMed Central

    El-Qulity, Said Ali; Mohamed, Ali Wagdy

    2016-01-01

    This paper proposes a nonlinear integer goal programming model (NIGPM) for solving the general problem of admission capacity planning in a country as a whole. The work aims to satisfy most of the key objectives of a country related to the enrollment problem for higher education. The general outlines of the system are developed, along with the solution methodology for application to the time horizon in a given plan. Up-to-date data for Saudi Arabia are used as a case study, and a novel evolutionary algorithm based on a modified differential evolution (DE) algorithm is used to solve the complexity of the NIGPM generated for different goal priorities. The experimental results presented in this paper show the effectiveness of this approach in solving the admission capacity planning problem for higher education in terms of final solution quality and robustness. PMID:26819583

  9. A Generalized National Planning Approach for Admission Capacity in Higher Education: A Nonlinear Integer Goal Programming Model with a Novel Differential Evolution Algorithm.

    PubMed

    El-Qulity, Said Ali; Mohamed, Ali Wagdy

    2016-01-01

    This paper proposes a nonlinear integer goal programming model (NIGPM) for solving the general problem of admission capacity planning in a country as a whole. The work aims to satisfy most of the key objectives of a country related to the enrollment problem for higher education. The general outlines of the system are developed, along with the solution methodology for application to the time horizon in a given plan. Up-to-date data for Saudi Arabia are used as a case study, and a novel evolutionary algorithm based on a modified differential evolution (DE) algorithm is used to solve the complexity of the NIGPM generated for different goal priorities. The experimental results presented in this paper show the effectiveness of this approach in solving the admission capacity planning problem for higher education in terms of final solution quality and robustness.
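
    A toy version of the optimization step can be written with SciPy's standard differential evolution; the goals, weights and capacity figure below are invented for illustration, and the authors' modified DE and full goal-programming formulation are not reproduced.

      import numpy as np
      from scipy.optimize import differential_evolution

      goals = np.array([50_000, 30_000, 20_000])  # hypothetical enrollment goals
      weights = np.array([3.0, 2.0, 1.0])         # goal priorities
      capacity = 95_000                           # hypothetical total capacity

      def objective(x):
          deviation = weights @ np.abs(x - goals)   # weighted goal deviations
          overflow = max(0.0, x.sum() - capacity)   # soft capacity constraint
          return deviation + 10.0 * overflow

      bounds = [(0, 60_000)] * 3
      res = differential_evolution(objective, bounds, seed=1, tol=1e-8)
      print(np.round(res.x).astype(int), res.fun)  # near-integer capacities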

  10. Automatic Data Distribution for CFD Applications on Structured Grids

    NASA Technical Reports Server (NTRS)

    Frumkin, Michael; Yan, Jerry

    2000-01-01

    Data distribution is an important step in the implementation of any parallel algorithm. The data distribution determines data traffic and utilization of the interconnection network, and affects the overall code efficiency. In recent years a number of data distribution methods have been developed and used in real programs for improving data traffic. We use some of these methods for translating data dependence and affinity relations into data distribution directives. We describe an automatic data alignment and placement tool (ADAPT) which implements these methods and show its results for some CFD codes (NPB and ARC3D). The algorithms for program analysis and derivation of data distribution implemented in ADAPT are efficient three-pass algorithms. Most have linear complexity, with the exception of some graph algorithms having complexity O(n^4) in the worst case.

  11. Automatic Data Distribution for CFD Applications on Structured Grids

    NASA Technical Reports Server (NTRS)

    Frumkin, Michael; Yan, Jerry

    1999-01-01

    Data distribution is an important step in the implementation of any parallel algorithm. The data distribution determines data traffic and utilization of the interconnection network, and affects the overall code efficiency. In recent years a number of data distribution methods have been developed and used in real programs for improving data traffic. We use some of these methods for translating data dependence and affinity relations into data distribution directives. We describe an automatic data alignment and placement tool (ADAPT) which implements these methods and show its results for some CFD codes (NPB and ARC3D). The algorithms for program analysis and derivation of data distribution implemented in ADAPT are efficient three-pass algorithms. Most have linear complexity, with the exception of some graph algorithms having complexity O(n^4) in the worst case.

  12. Avoid the Pitfalls: Benefits of Formal Part C Data System Governance. Revised

    ERIC Educational Resources Information Center

    Mauzy, Denise; Bull, Bruce; Gould, Tate

    2016-01-01

    Since the initial authorizing legislation for Part C of the Individuals with Disabilities Education Act (IDEA) in 1986, the scope and complexity of data collected by Part C programs have significantly increased. Formal governance establishes responsibility for Part C data and enables program staff to improve the effectiveness of data processes and…

  13. A MPEG-4 encoder based on TMS320C6416

    NASA Astrophysics Data System (ADS)

    Li, Gui-ju; Liu, Wei-ning

    2013-08-01

    Engineering applications and products need real-time video encoding on a DSP, but the high computational complexity and large volume of data require a system with high data throughput. In this paper, a real-time MPEG-4 video encoder is designed on the TMS320C6416 platform. The core is the TMS320C6416T DSP, while an FPGA chip organizes and manages the video data and controls the flow of input and output data; the encoded stream is output over the synchronous serial port. The system has a clock frequency of 1 GHz and a processing capacity of up to 8000 MIPS when running at full speed. Because an MPEG-4 video encoder ported directly to the DSP platform has low coding efficiency, the program structure, data structures and algorithms must be improved in combination with the characteristics of the TMS320C6416T. First, the image storage architecture is designed by balancing computation cost, storage space and EDMA read time: several buffers are opened in memory, each caching 16 lines of the video data to be encoded, the reconstructed image and the reference image including the search range. By using the variable alignment mode of the DSP, the definitions of structure variables are modified, and the look-up tables occupying larger space are replaced with directly computed arrays to save memory. After this restructuring, the program code, all variables, the buffers and the interpolated image including the search range can be placed in on-chip memory. Then, the time-consuming processing modules and frequently called functions are rewritten in TMS320C6416T parallel assembly language, which increases the running speed. In addition, the motion estimation algorithm is improved by using a cross-hexagon search algorithm, which increases the search speed markedly. Finally, the execution time, signal-to-noise ratio and compression ratio for a real-time image acquisition sequence are given. The experimental results show that the encoder designed in this paper accomplishes real-time encoding of 768 × 576, 25 frames-per-second grayscale video at a bit rate of 1.5 Mbit/s.
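
    At the heart of any such motion-estimation module is a block-matching cost, almost always the sum of absolute differences (SAD); fast patterns such as the cross-hexagon search reduce how many candidate offsets are evaluated, not the cost itself. A minimal exhaustive-search baseline for comparison:

      import numpy as np

      def sad(a: np.ndarray, b: np.ndarray) -> int:
          # Sum of absolute differences between two equal-sized blocks.
          return int(np.abs(a.astype(np.int32) - b.astype(np.int32)).sum())

      def full_search(cur, ref, bx, by, n=16, r=7):
          # Exhaustive search over a +/-r window; fast searches such as
          # cross-hexagon visit far fewer of these candidate offsets.
          block = cur[by:by + n, bx:bx + n]
          best_cost, best_mv = None, (0, 0)
          for dy in range(-r, r + 1):
              for dx in range(-r, r + 1):
                  y, x = by + dy, bx + dx
                  if 0 <= y and y + n <= ref.shape[0] and 0 <= x and x + n <= ref.shape[1]:
                      c = sad(block, ref[y:y + n, x:x + n])
                      if best_cost is None or c < best_cost:
                          best_cost, best_mv = c, (dx, dy)
          return best_mv, best_cost

      gen = np.random.default_rng(0)
      ref = gen.integers(0, 256, (64, 64), dtype=np.uint8)
      cur = np.roll(ref, (2, 3), axis=(0, 1))  # content moves down 2, right 3
      # The matching block in the reference lies up/left of the current one:
      print(full_search(cur, ref, 16, 16))  # ((-3, -2), 0)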

  14. The Pain Associates' International Network Initiative: a novel practical approach to the challenge of chronic pain management in Europe.

    PubMed

    Morlion, Bart; Walch, Heribert; Yihune, Gabriel; Vielvoye-Kerkmeer, Ans; de Jong, Zuzana; Castro-Lopes, José; Stanton-Hicks, Michael

    2008-01-01

    Chronic pain is a debilitating condition with a multidimensional impact on the lives of patients, their families and communities. The public health burden of chronic pain is gathering recognition as a major healthcare problem in its own right and deserves closer attention. The challenge in treating chronic pain is to provide effective clinical management of a complex, multifaceted set of conditions that require a coordinated strategy of care. Epidemiological data and patient surveys have highlighted the areas of pain management that might be improved. These include a need for better understanding and documentation of the symptoms of chronic pain, standardized levels of care, improved communication among clinical personnel and with patients, and an updated education program for clinicians. For these reasons, new strategies aimed at improving the standards of pain management are needed. The Pain Associates' International Network (P.A.I.N.) Initiative was set up to devise practical methods for improving the quality of pain management for patients. These strategies have recently been put into practice through a number of activities: P.A.I.N. Workshops are meetings of international pain management professionals dedicated to discussing current management strategies and producing consensus recommendations for improving standards of care; P.A.I.N. Quality is a unique software program designed to help treating clinicians to document patient data and derive effective treatment plans; P.A.I.N. Online provides a web site forum for discussion of pain management topics; and P.A.I.N. Management is a clinician education program providing up-to-date training in pain management.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Russell, J.S.

    Several factors in the development of the East Wilmington oil field by THUMS Long Beach Co. are described. These include: critical path scheduling, complex stratigraphy, reservoir engineering, drilling program, production methods, pressure maintenance, crude oil processing, automation, transportation facilities, service lines, and electrical facilities. The complexity and closely scheduled operational events interwoven in the THUMS project demand a method for carefully planning the sequence of jobs to be done, beginning with island construction up through routine production and to the LACT system. These demanding requirements necessitated the use of a critical path scheduling program, and it was decided to use the program evaluation technique. This technique is used to assign responsibilities for individual assignments, to set time assignments, and to keep the overall program on schedule. The stratigraphy of East Wilmington complicates all engineering functions associated with recovery methods and reservoir evaluation. At least 5 major faults are anticipated.

  16. Distributed computing for macromolecular crystallography

    PubMed Central

    Krissinel, Evgeny; Uski, Ville; Lebedev, Andrey; Ballard, Charles

    2018-01-01

    Modern crystallographic computing is characterized by the growing role of automated structure-solution pipelines, which represent complex expert systems utilizing a number of program components, decision makers and databases. They also require considerable computational resources and regular database maintenance, which is increasingly more difficult to provide at the level of individual desktop-based CCP4 setups. On the other hand, there is a significant growth in data processed in the field, which brings up the issue of centralized facilities for keeping both the data collected and structure-solution projects. The paradigm of distributed computing and data management offers a convenient approach to tackling these problems, which has become more attractive in recent years owing to the popularity of mobile devices such as tablets and ultra-portable laptops. In this article, an overview is given of developments by CCP4 aimed at bringing distributed crystallographic computations to a wide crystallographic community. PMID:29533240

  17. Distributed computing for macromolecular crystallography.

    PubMed

    Krissinel, Evgeny; Uski, Ville; Lebedev, Andrey; Winn, Martyn; Ballard, Charles

    2018-02-01

    Modern crystallographic computing is characterized by the growing role of automated structure-solution pipelines, which represent complex expert systems utilizing a number of program components, decision makers and databases. They also require considerable computational resources and regular database maintenance, which is increasingly more difficult to provide at the level of individual desktop-based CCP4 setups. On the other hand, there is a significant growth in data processed in the field, which brings up the issue of centralized facilities for keeping both the data collected and structure-solution projects. The paradigm of distributed computing and data management offers a convenient approach to tackling these problems, which has become more attractive in recent years owing to the popularity of mobile devices such as tablets and ultra-portable laptops. In this article, an overview is given of developments by CCP4 aimed at bringing distributed crystallographic computations to a wide crystallographic community.

  18. Schools as social complex adaptive systems: a new way to understand the challenges of introducing the health promoting schools concept.

    PubMed

    Keshavarz, Nastaran; Nutbeam, Don; Rowling, Louise; Khavarpour, Freidoon

    2010-05-01

    Achieving system-wide implementation of health promotion programs in schools and sustaining both the program and its health related benefits have proved challenging. This paper reports on a qualitative study examining the implementation of health promoting schools programs in primary schools in Sydney, Australia. It draws upon insights from systems science to examine the relevance and usefulness of the concept of "complex adaptive systems" as a framework to better understand ways in which health promoting school interventions could be introduced and sustained. The primary data for the study were collected by semi-structured interviews with 26 school principals and teachers. Additional information was extracted from publicly available school management plans and annual reports. We examined the data from these sources to determine whether schools exhibit characteristics of complex adaptive systems. The results confirmed that schools do exhibit most, but not all, of the characteristics of social complex adaptive systems, and exhibit significant differences from artificial and natural systems. Understanding schools as social complex adaptive systems may help to explain some of the challenges of introducing and sustaining change in schools. These insights may, in turn, lead us to adopt more sophisticated approaches to the diffusion of new programs in school systems that account for the diverse, complex and context specific nature of individual school systems. Copyright (c) 2010 Elsevier Ltd. All rights reserved.

  19. The Phenomenon of Business Start-Ups.

    ERIC Educational Resources Information Center

    Melis, Africa

    1990-01-01

    A study of four European countries (France, United Kingdom, Italy, and Spain) was conducted to gather data on the business start-up process and its impact on the generation of jobs, small business start-up support programs, training and counseling programs, and characteristics of successful business starters. (The original aim of the study was to…

  20. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--STANDARD OPERATING PROCEDURE FOR CLEANING: FOLLOW UP QUESTIONNAIRE (UA-D-22.0)

    EPA Science Inventory

    The purpose of this SOP is to define the steps involved in cleaning the electronic data generated from data entry of the Follow Up Questionnaire. It applies to electronic data corresponding to the Follow Up Questionnaire that was scanned and verified by the data staff during the...

  1. The Effect of a Complex Training Program on Skating Abilities in Ice Hockey Players

    PubMed Central

    Lee, Changyoung; Lee, Sookyung; Yoo, Jaehyun

    2014-01-01

    [Purpose] Little data exist on systematic training programs to improve skating abilities in ice hockey players. The purpose of this study was to evaluate the effectiveness of a complex training program on skating abilities in ice hockey players. [Methods] Ten male ice hockey players (training group) who engaged in 12 weeks of complex training and skating training and ten male players (control group) who only participated in 12 weeks of skating training completed on-ice skating tests, including a 5 × 18 m shuttle, t-test, 5-times rink dash, and line drill, before, during, and after the training. [Results] Significant group-by-time interactions were found in all skating ability tests. [Conclusion] The 12-week complex training program intervention improved the skating abilities of the ice hockey players. PMID:24764628

  2. 41 CFR 101-5.104-3 - Data requirements for feasibility studies.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... feasibility studies. 101-5.104-3 Section 101-5.104-3 Public Contracts and Property Management Federal Property... FEDERAL BUILDINGS AND COMPLEXES 5.1-General § 101-5.104-3 Data requirements for feasibility studies. (a) The data requirements for feasibility studies may vary from program to program, but shall be standard...

  3. 41 CFR 101-5.104-3 - Data requirements for feasibility studies.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... feasibility studies. 101-5.104-3 Section 101-5.104-3 Public Contracts and Property Management Federal Property... FEDERAL BUILDINGS AND COMPLEXES 5.1-General § 101-5.104-3 Data requirements for feasibility studies. (a) The data requirements for feasibility studies may vary from program to program, but shall be standard...

  4. 41 CFR 101-5.104-3 - Data requirements for feasibility studies.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... feasibility studies. 101-5.104-3 Section 101-5.104-3 Public Contracts and Property Management Federal Property... FEDERAL BUILDINGS AND COMPLEXES 5.1-General § 101-5.104-3 Data requirements for feasibility studies. (a) The data requirements for feasibility studies may vary from program to program, but shall be standard...

  5. 41 CFR 101-5.104-3 - Data requirements for feasibility studies.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... feasibility studies. 101-5.104-3 Section 101-5.104-3 Public Contracts and Property Management Federal Property... FEDERAL BUILDINGS AND COMPLEXES 5.1-General § 101-5.104-3 Data requirements for feasibility studies. (a) The data requirements for feasibility studies may vary from program to program, but shall be standard...

  6. 41 CFR 101-5.104-3 - Data requirements for feasibility studies.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... feasibility studies. 101-5.104-3 Section 101-5.104-3 Public Contracts and Property Management Federal Property... FEDERAL BUILDINGS AND COMPLEXES 5.1-General § 101-5.104-3 Data requirements for feasibility studies. (a) The data requirements for feasibility studies may vary from program to program, but shall be standard...

  7. Reduze - Feynman integral reduction in C++

    NASA Astrophysics Data System (ADS)

    Studerus, C.

    2010-07-01

    Reduze is a computer program for reducing Feynman integrals to master integrals employing a Laporta algorithm. The program is written in C++ and uses classes provided by the GiNaC library to perform the simplifications of the algebraic prefactors in the system of equations. Reduze offers the possibility to run reductions in parallel. Program summary Program title: Reduze Catalogue identifier: AEGE_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEGE_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: yes No. of lines in distributed program, including test data, etc.: 55 433 No. of bytes in distributed program, including test data, etc.: 554 866 Distribution format: tar.gz Programming language: C++ Computer: All Operating system: Unix/Linux Number of processors used: Problem dependent; more than one is possible, but not arbitrarily many. RAM: Depends on the complexity of the system. Classification: 4.4, 5 External routines: CLN (http://www.ginac.de/CLN/), GiNaC (http://www.ginac.de/) Nature of problem: Solving large systems of linear equations with Feynman integrals as unknowns and rational polynomials as prefactors. Solution method: Using a Gauss/Laporta algorithm to solve the system of equations. Restrictions: Limitations depend on the complexity of the system (number of equations, number of kinematic invariants). Running time: Depends on the complexity of the system.
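
    The algorithmic core, Gauss elimination with exact rational arithmetic, can be shown on a toy system. In a real Laporta reduction the unknowns are Feynman integrals and the prefactors are rational functions of kinematic invariants rather than plain numbers, and the systems are vastly larger; this sketch only illustrates the elimination step.

      from fractions import Fraction

      def gauss_solve(aug):
          # aug: n rows of n coefficients plus a right-hand side, solved
          # exactly by Gauss-Jordan elimination over rationals.
          rows = [[Fraction(v) for v in r] for r in aug]
          n = len(rows)
          for i in range(n):
              p = next(k for k in range(i, n) if rows[k][i] != 0)  # pivot row
              rows[i], rows[p] = rows[p], rows[i]
              piv = rows[i][i]
              rows[i] = [v / piv for v in rows[i]]
              for j in range(n):
                  if j != i and rows[j][i] != 0:
                      f = rows[j][i]
                      rows[j] = [a - f * b for a, b in zip(rows[j], rows[i])]
          return [r[-1] for r in rows]

      # Toy system standing in for IBP identities: I1 + 2*I2 = 3, 3*I1 - I2 = 2.
      print(gauss_solve([[1, 2, 3], [3, -1, 2]]))  # [Fraction(1, 1), Fraction(1, 1)]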

  8. Developing a robotic pancreas program: the Dutch experience

    PubMed Central

    Nota, Carolijn L.; Zwart, Maurice J.; Fong, Yuman; Hagendoorn, Jeroen; Hogg, Melissa E.; Koerkamp, Bas Groot; Besselink, Marc G.

    2017-01-01

    Robot-assisted surgery has been developed to overcome the limitations of conventional laparoscopy, aiming to further optimize minimally invasive surgery. Although robotics has already been widely adopted in urology, gynecology, and several gastro-intestinal procedures, like colorectal surgery, pancreatic surgery lags behind. Due to the complex nature of the procedure, surgeons have probably been hesitant to apply minimally invasive techniques in pancreatic surgery. Nevertheless, over the past few years pancreatic surgery has been catching up, and an increasing number of procedures are being performed laparoscopically and robotically, despite it being a highly complex procedure with high morbidity and mortality rates. Given the complex and extensive nature of the procedure, the start of a robotic pancreas program should be properly prepared and should comply with several conditions within high-volume centers. Robotic training plays a significant role in the preparation. In this review we discuss the different aspects of preparation when working towards the start of a robotic pancreas program, against the background of our nationwide experience in the Netherlands. PMID:29078666

  9. Developing a robotic pancreas program: the Dutch experience.

    PubMed

    Nota, Carolijn L; Zwart, Maurice J; Fong, Yuman; Hagendoorn, Jeroen; Hogg, Melissa E; Koerkamp, Bas Groot; Besselink, Marc G; Molenaar, I Quintus

    2017-01-01

    Robot-assisted surgery has been developed to overcome the limitations of conventional laparoscopy, aiming to further optimize minimally invasive surgery. Although robotics has already been widely adopted in urology, gynecology, and several gastro-intestinal procedures, like colorectal surgery, pancreatic surgery lags behind. Due to the complex nature of the procedure, surgeons have probably been hesitant to apply minimally invasive techniques in pancreatic surgery. Nevertheless, over the past few years pancreatic surgery has been catching up, and an increasing number of procedures are being performed laparoscopically and robotically, despite it being a highly complex procedure with high morbidity and mortality rates. Given the complex and extensive nature of the procedure, the start of a robotic pancreas program should be properly prepared and should comply with several conditions within high-volume centers. Robotic training plays a significant role in the preparation. In this review we discuss the different aspects of preparation when working towards the start of a robotic pancreas program, against the background of our nationwide experience in the Netherlands.

  10. Lessons Learned From Early Implementation of Option B+: The Elizabeth Glaser Pediatric AIDS Foundation Experience in 11 African Countries

    PubMed Central

    Mattingly, Meghan; Giphart, Anja; van de Ven, Roland; Chouraya, Caspian; Walakira, Moses; Boon, Alexandre; Mikusova, Silvia; Simonds, R. J.

    2014-01-01

    Background: “Option B+” is a World Health Organization-recommended approach to prevent mother-to-child HIV transmission whereby all HIV-positive pregnant and lactating women initiate lifelong antiretroviral therapy (ART). This review of early Option B+ implementation experience is intended to inform Ministries of Health and others involved in implementing Option B+. Methods: This implementation science study analyzed data from 11 African countries supported by the Elizabeth Glaser Pediatric AIDS Foundation (EGPAF) to describe early experience implementing Option B+. Data are from 4 sources: (1) national guidelines for prevention of mother-to-child HIV transmission and Option B+ implementation plans, (2) aggregated service delivery data between January 2013 and March 2014 from EGPAF-supported sites, (3) field visits to Option B+ implementation sites, and (4) relevant EGPAF research, quality improvement, and evaluation studies. Results: Rapid adoption of Option B+ led to large increases in percentage of HIV-positive pregnant women accessing ART in antenatal care. By the end of 2013, most programs reached at least 50% of HIV-positive women in antenatal care with ART, even in countries using a phased approach to implementation. Scaling up Option B+ through integrating ART in maternal and child health settings has required expansion of the workforce, and task shifting to allow nurse-led ART initiation has created staffing pressure on lower-level cadres for counseling and community follow-up. Complex data collection needs may be impairing data quality. Discussion: Early experiences with Option B+ implementation demonstrate promise. Continued program evaluation is needed, as is specific attention to counseling and support around initiation of lifetime ART in the context of pregnancy and lactation. PMID:25436817

  11. Lessons learned from early implementation of option B+: the Elizabeth Glaser Pediatric AIDS Foundation experience in 11 African countries.

    PubMed

    Kieffer, Mary Pat; Mattingly, Meghan; Giphart, Anja; van de Ven, Roland; Chouraya, Caspian; Walakira, Moses; Boon, Alexandre; Mikusova, Silvia; Simonds, R J

    2014-12-01

    "Option B+" is a World Health Organization-recommended approach to prevent mother-to-child HIV transmission whereby all HIV-positive pregnant and lactating women initiate lifelong antiretroviral therapy (ART). This review of early Option B+ implementation experience is intended to inform Ministries of Health and others involved in implementing Option B+. This implementation science study analyzed data from 11 African countries supported by the Elizabeth Glaser Pediatric AIDS Foundation (EGPAF) to describe early experience implementing Option B+. Data are from 4 sources: (1) national guidelines for prevention of mother-to-child HIV transmission and Option B+ implementation plans, (2) aggregated service delivery data between January 2013 and March 2014 from EGPAF-supported sites, (3) field visits to Option B+ implementation sites, and (4) relevant EGPAF research, quality improvement, and evaluation studies. Rapid adoption of Option B+ led to large increases in percentage of HIV-positive pregnant women accessing ART in antenatal care. By the end of 2013, most programs reached at least 50% of HIV-positive women in antenatal care with ART, even in countries using a phased approach to implementation. Scaling up Option B+ through integrating ART in maternal and child health settings has required expansion of the workforce, and task shifting to allow nurse-led ART initiation has created staffing pressure on lower-level cadres for counseling and community follow-up. Complex data collection needs may be impairing data quality. Early experiences with Option B+ implementation demonstrate promise. Continued program evaluation is needed, as is specific attention to counseling and support around initiation of lifetime ART in the context of pregnancy and lactation.

  12. TH-E-209-01: Fluoroscopic Dose Monitoring and Patient Follow-Up Program at Massachusetts General Hospital

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, B.

    2016-06-15

    Radiation dose monitoring solutions have opened up new opportunities for medical physicists to be more involved in modern clinical radiology practices. In particular, with the help of comprehensive radiation dose data, data-driven protocol management and informed case follow up are now feasible. Significant challenges remain, however, and the problems faced by medical physicists are highly heterogeneous. Imaging systems from multiple vendors and a wide range of vintages co-exist in the same department and employ data communication protocols that are not fully standardized or implemented, making harmonization complex. Many different solutions for radiation dose monitoring have been implemented by imaging facilities over the past few years. Such systems are based on commercial software, home-grown IT solutions, manual PACS data dumping, etc., and diverse pathways can be used to bring the data to impact clinical practice. The speakers will share their experiences with creating or tailoring radiation dose monitoring/management systems and procedures over the past few years, which vary significantly in design and scope. Topics to cover: (1) fluoroscopic dose monitoring and high radiation event handling from a large academic hospital; (2) dose monitoring and protocol optimization in pediatric radiology; and (3) development of a home-grown IT solution and dose data analysis framework. Learning Objectives: Describe the scope and range of radiation dose monitoring and protocol management in a modern radiology practice. Review examples of data available from a variety of systems and how they are managed and conveyed. Reflect on the role of the physicist in radiation dose awareness.

  13. TXM-Wizard: a program for advanced data collection and evaluation in full-field transmission X-ray microscopy

    PubMed Central

    Liu, Yijin; Meirer, Florian; Williams, Phillip A.; Wang, Junyue; Andrews, Joy C.; Pianetta, Piero

    2012-01-01

    Transmission X-ray microscopy (TXM) has been well recognized as a powerful tool for non-destructive investigation of the three-dimensional inner structure of a sample with spatial resolution down to a few tens of nanometers, especially when combined with synchrotron radiation sources. Recent developments of this technique have presented a need for new tools for both system control and data analysis. Here a software package developed in MATLAB for script command generation and analysis of TXM data is presented. The first toolkit, the script generator, allows automating complex experimental tasks which involve up to several thousand motor movements. The second package was designed to accomplish computationally intense tasks such as data processing of mosaic and mosaic tomography datasets; dual-energy contrast imaging, where data are recorded above and below a specific X-ray absorption edge; and TXM X-ray absorption near-edge structure imaging datasets. Furthermore, analytical and iterative tomography reconstruction algorithms were implemented. The compiled software package is freely available. PMID:22338691

  14. Generalized Symbolic Execution for Model Checking and Testing

    NASA Technical Reports Server (NTRS)

    Khurshid, Sarfraz; Pasareanu, Corina; Visser, Willem; Kofmeyer, David (Technical Monitor)

    2003-01-01

    Modern software systems, which often are concurrent and manipulate complex data structures, must be extremely reliable. We present a novel framework based on symbolic execution, for automated checking of such systems. We provide a two-fold generalization of traditional symbolic execution based approaches: one, we define a program instrumentation, which enables standard model checkers to perform symbolic execution; two, we give a novel symbolic execution algorithm that handles dynamically allocated structures (e.g., lists and trees), method preconditions (e.g., acyclicity of lists), data (e.g., integers and strings) and concurrency. The program instrumentation enables a model checker to automatically explore program heap configurations (using a systematic treatment of aliasing) and manipulate logical formulae on program data values (using a decision procedure). We illustrate two applications of our framework: checking correctness of multi-threaded programs that take inputs from unbounded domains with complex structure and generation of non-isomorphic test inputs that satisfy a testing criterion. Our implementation for Java uses the Java PathFinder model checker.
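
    The essence of the approach (explore each feasible path under symbolic inputs, then hand the accumulated path condition to a decision procedure) can be sketched by hand for a toy program. The sketch below enumerates the paths of a two-branch function manually and uses the Z3 solver (z3-solver on PyPI); the paper's framework instead instruments programs so that a model checker performs this exploration automatically.

      from z3 import Int, Solver, sat  # pip install z3-solver

      # Toy program under test:
      #     def f(x):
      #         if x > 10:
      #             if 2 * x == 24:
      #                 raise Bug
      # Each root-to-leaf path yields a path condition; the decision
      # procedure finds concrete inputs that exercise the path.
      x = Int("x")
      paths = {
          "x <= 10": [x <= 10],
          "x > 10 and 2x != 24": [x > 10, 2 * x != 24],
          "x > 10 and 2x == 24 (bug)": [x > 10, 2 * x == 24],
      }
      for name, conds in paths.items():
          s = Solver()
          s.add(*conds)
          if s.check() == sat:
              print(f"{name}: reachable, e.g. x = {s.model()[x]}")
          else:
              print(f"{name}: infeasible")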

  15. [Implementation of a proactive integrated primary care program for frail older people: from science to evidence-based practice].

    PubMed

    Bleijenberg, Nienke; de Jonge, Artine; Brand, Morris P; O'Flynn, Caitriona; Schuurmans, Marieke J; de Wit, Niek J

    2016-12-01

    Multimorbidity, functional impairment and frailty among community-dwelling older people are causing increasing complexity in primary care. A proactive integrated primary care approach is therefore essential. Between October 2014 and October 2015, an evidence-based proactive care program for frail older people was implemented in the region Noord-West Veluwe en Zeewolde, the Netherlands. This study evaluated the feasibility of the implementation, with a strong focus on the collaboration between the medical and social domains. Using a mixed-methods design, we evaluated several process indicators. Data were obtained from routine electronic medical records within primary care, questionnaires, and interviews with older adults. The questionnaires, sent to health care professionals at baseline and at six months' follow-up, provided information regarding expectations of and experiences with the program. Stakeholders from various domains were asked to fill in the questionnaire at baseline and at twelve months' follow-up. Interviews were conducted to explore the experiences of older adults with the program. Regional work groups were set up in each municipality to enhance interdisciplinary and domain-transcending collaboration. The proactive primary care program was implemented in 42 general practices that provided care to 7904 older adults aged 75 years or older. A total of 101 health care professionals and 44 stakeholders filled in the questionnaires. The need for better structure and interdisciplinary cooperation seemed widespread among the participants. The implementation resulted in a significant positive change in the demand for a better regional healthcare framework (34%, p ≤ .001) among health care professionals, and the need for transparency regarding the possibilities for referral also improved (27%, p = .009). Half of the participants reported that regional collaboration had improved after the implementation. Health care professionals and stakeholders reported greater attention to and awareness of frail older adults in their area than before the implementation. Older people and their caregivers were positive about the proactive approach. The nurses reported that the screening questionnaire was too lengthy and therefore time-consuming. The implementation of the proactive primary care approach in daily practice was feasible. A strong interdisciplinary collaboration was realized. The program was easily adapted to the local context.

  16. CASS—CFEL-ASG software suite

    NASA Astrophysics Data System (ADS)

    Foucar, Lutz; Barty, Anton; Coppola, Nicola; Hartmann, Robert; Holl, Peter; Hoppe, Uwe; Kassemeyer, Stephan; Kimmel, Nils; Küpper, Jochen; Scholz, Mirko; Techert, Simone; White, Thomas A.; Strüder, Lothar; Ullrich, Joachim

    2012-10-01

    The Max Planck Advanced Study Group (ASG) at the Center for Free Electron Laser Science (CFEL) has created the CFEL-ASG Software Suite CASS to view, process and analyse multi-parameter experimental data acquired at Free Electron Lasers (FELs) using the CFEL-ASG Multi Purpose (CAMP) instrument (Strüder et al. (2010) [6]). The software is based on a modular design so that it can be adjusted to accommodate the needs of the various experiments that are conducted with the CAMP instrument. In fact, this allows the use of the software in all experiments where multiple detectors are involved. One of the key aspects of CASS is that it can be used either 'on-line', using a live data stream from the free-electron laser facility's data acquisition system to guide the experiment, or 'off-line', on data acquired from a previous experiment which has been saved to file. Program summary Program title: CASS Catalogue identifier: AEMP_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEMP_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU General Public Licence, version 3 No. of lines in distributed program, including test data, etc.: 167073 No. of bytes in distributed program, including test data, etc.: 1065056 Distribution format: tar.gz Programming language: C++. Computer: Intel x86-64. Operating system: GNU/Linux (for information about restrictions see outlook). RAM: >8 GB Classification: 2.3, 3, 15, 16.4. External routines: Qt-Framework [1], SOAP [2], (optional HDF5 [3], VIGRA [4], ROOT [5], QWT [6]) Nature of problem: Analysis and visualisation of scientific data acquired at Free-Electron Lasers Solution method: Generalise data access and storage so that a variety of small programming pieces can be linked to form a complex analysis chain. Unusual features: Complex analysis chains can be built without recompiling the program Additional comments: Updated extensive documentation of CASS is available at [7]. Running time: Depends on the data size and the complexity of the analysis algorithms. References: [1] http://qt.nokia.com [2] http://www.cs.fsu.edu/~engelen/soap.html [3] http://www.hdfgroup.org/HDF5/ [4] http://hci.iwr.uni-heidelberg.de/vigra/ [5] http://root.cern.ch [6] http://qwt.sourceforge.net/ [7] http://www.mpi-hd.mpg.de/personalhomes/gitasg/cass
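    The modular analysis-chain design can be mimicked in miniature. The sketch below is a Python analogue of the idea only (CASS itself is C++ and configured through text files): small named processors are linked by their declared inputs into a chain that can be rearranged without recompiling anything. The class and names are invented for illustration.

    ```python
    import numpy as np

    class Processor:
        """One small analysis step, linked to others by input names."""
        def __init__(self, name, inputs, func):
            self.name, self.inputs, self.func = name, inputs, func

    def run_chain(chain, event):
        results = dict(event)  # raw detector data for one FEL shot
        for p in chain:        # processors listed in dependency order
            results[p.name] = p.func(*(results[k] for k in p.inputs))
        return results

    chain = [
        Processor('corrected', ['frame', 'dark'], lambda f, d: f - d),
        Processor('integral', ['corrected'], np.sum),
    ]
    out = run_chain(chain, {'frame': np.ones((4, 4)),
                            'dark': np.zeros((4, 4))})
    print(out['integral'])  # 16.0
    ```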

  17. PyXRD v0.6.7: a free and open-source program to quantify disordered phyllosilicates using multi-specimen X-ray diffraction profile fitting

    NASA Astrophysics Data System (ADS)

    Dumon, M.; Van Ranst, E.

    2016-01-01

    This paper presents a free and open-source program called PyXRD (short for Python X-ray diffraction) to improve the quantification of complex, poly-phasic mixed-layer phyllosilicate assemblages. The validity of the program was checked by comparing its output with Sybilla v2.2.2, which shares the same mathematical formalism. The novelty of this program is the ab initio incorporation of the multi-specimen method, making it possible to share phases and (a selection of) their parameters across multiple specimens. PyXRD thus allows multiple specimens to be modelled side by side, and this approach speeds up the manual refinement process significantly. To check the hypothesis that this multi-specimen set-up - as it effectively reduces the number of parameters and increases the number of observations - can also improve automatic parameter refinements, we calculated X-ray diffraction patterns for four theoretical mineral assemblages. These patterns were then used as input for one refinement employing the multi-specimen set-up and one employing single-pattern set-ups. For all of the assemblages, PyXRD was able to reproduce or approximate the input parameters with the multi-specimen approach. Diverging solutions occurred only in single-pattern set-ups that do not contain enough information to discern all minerals present (e.g. patterns of heated samples). Assuming a correct qualitative interpretation was made and a single pattern exists in which all phases are sufficiently discernible, the obtained results indicate that a good quantification can often be obtained with just that pattern. However, these results from theoretical experiments cannot automatically be extrapolated to all real-life experiments. In any case, PyXRD has proven to be useful when X-ray diffraction patterns are modelled for complex mineral assemblages containing mixed-layer phyllosilicates with a multi-specimen approach.
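    The statistical benefit of the multi-specimen set-up (fewer free parameters, more observations) can be shown with a toy refinement. The sketch below fits two synthetic patterns at once with scipy, sharing one set of peak parameters between them; the single-Gaussian "phase" is a placeholder, not PyXRD's matrix formalism, and all names are invented.

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    two_theta = np.linspace(2, 40, 400)

    def pattern(params, scale):
        pos, width, amp = params
        return scale * amp * np.exp(-0.5 * ((two_theta - pos) / width) ** 2)

    # Two "specimens" (e.g. air-dried and glycolated) share pos/width/amp;
    # only a known per-specimen scale factor differs.
    true_params = np.array([12.0, 0.8, 100.0])
    scales = [1.0, 0.6]
    observed = [pattern(true_params, s) for s in scales]

    def residuals(p):
        # Concatenating residuals from all specimens yields one refinement
        # with shared parameters: fewer unknowns, more observations.
        return np.concatenate([pattern(p, s) - o
                               for s, o in zip(scales, observed)])

    fit = least_squares(residuals, x0=[10.0, 1.0, 80.0])
    print(fit.x)  # recovers the shared parameters from both patterns at once
    ```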

  18. An enhanced cluster analysis program with bootstrap significance testing for ecological community analysis

    USGS Publications Warehouse

    McKenna, J.E.

    2003-01-01

    The biosphere is filled with complex living patterns, and important questions about biodiversity and community and ecosystem ecology concern the structure and function of the multispecies systems responsible for those patterns. Cluster analysis identifies discrete groups within multivariate data and is an effective method of coping with these complexities, but it often suffers from subjective identification of groups. The bootstrap testing method greatly improves objective significance determination for cluster analysis. The BOOTCLUS program makes cluster analysis that reliably identifies real patterns within a data set more accessible and easier to use than previously available programs. A variety of analysis options and rapid re-analysis provide a means to quickly evaluate several aspects of a data set. Interpretation is influenced by sampling design and the a priori designation of samples into replicate groups, and ultimately relies on the researcher's knowledge of the organisms and their environment. However, the BOOTCLUS program provides reliable, objectively determined groupings of multivariate data.
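    The resampling logic behind significance testing of cluster groups can be sketched briefly. The fragment below uses a permutation null for a within-group compactness statistic; this captures the general idea but is a simplification, not BOOTCLUS's actual procedure, and the statistic and data are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def mean_within_distance(data, labels):
        """Average pairwise distance inside each candidate group."""
        total, count = 0.0, 0
        for g in np.unique(labels):
            pts = data[labels == g]
            diff = pts[:, None, :] - pts[None, :, :]
            total += np.sqrt((diff ** 2).sum(-1)).sum()
            count += len(pts) ** 2
        return total / count

    # Two well-separated synthetic groups of 20 samples each.
    data = np.vstack([rng.normal(0, 1, (20, 2)), rng.normal(4, 1, (20, 2))])
    labels = np.repeat([0, 1], 20)

    observed = mean_within_distance(data, labels)
    null = [mean_within_distance(data, rng.permutation(labels))
            for _ in range(999)]
    p = (1 + sum(n <= observed for n in null)) / 1000.0
    print(f'within-group compactness p = {p:.3f}')  # small p: groups are real
    ```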

  19. KSC-2012-6387

    NASA Image and Video Library

    2012-12-04

    CAPE CANAVERAL, Fla. – At the Kennedy Space Center Visitor Complex in Florida, sixth-grade students view a mock-up of a robotic device that could one day be sent to a distant planet. Between Nov. 26 and Dec. 7, 2012, about 5,300 sixth-graders in Brevard County, Florida, were bused to Kennedy's Visitor Complex for Brevard Space Week, an educational program designed to encourage interest in science, technology, engineering and mathematics (STEM) careers. Photo credit: NASA/Tim Jacobs

  20. Monte Carlo simulation of X-ray imaging and spectroscopy experiments using quadric geometry and variance reduction techniques

    NASA Astrophysics Data System (ADS)

    Golosio, Bruno; Schoonjans, Tom; Brunetti, Antonio; Oliva, Piernicola; Masala, Giovanni Luca

    2014-03-01

    The simulation of X-ray imaging experiments is often performed using deterministic codes, which can be relatively fast and easy to use. However, such codes are generally not suitable for the simulation of even slightly more complex experimental conditions, involving, for instance, first-order or higher-order scattering, X-ray fluorescence emissions, or more complex geometries, particularly for experiments that combine spatial resolution with spectral information. In such cases, simulations are often performed using codes based on the Monte Carlo method. In a simple Monte Carlo approach, the interaction position of an X-ray photon and the state of the photon after an interaction are obtained simply according to the theoretical probability distributions. This approach may be quite inefficient because the final channels of interest may include only a limited region of space or photons produced by a rare interaction, e.g., fluorescent emission from elements with very low concentrations. In the field of X-ray fluorescence spectroscopy, this problem has been solved by combining the Monte Carlo method with variance reduction techniques, which can reduce the computation time by several orders of magnitude. In this work, we present a C++ code for the general simulation of X-ray imaging and spectroscopy experiments, based on the application of the Monte Carlo method in combination with variance reduction techniques, with a description of sample geometry based on quadric surfaces. We describe the benefits of the object-oriented approach in terms of code maintenance, the flexibility of the program for the simulation of different experimental conditions and the possibility of easily adding new modules. Sample applications in the fields of X-ray imaging and X-ray spectroscopy are discussed. Catalogue identifier: AERO_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AERO_v1_0.html Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland Licensing provisions: GNU General Public License version 3 No. of lines in distributed program, including test data, etc.: 83617 No. of bytes in distributed program, including test data, etc.: 1038160 Distribution format: tar.gz Programming language: C++. Computer: Tested on several PCs and on Mac. Operating system: Linux, Mac OS X, Windows (native and cygwin). RAM: It is dependent on the input data but usually between 1 and 10 MB. Classification: 2.5, 21.1. External routines: XrayLib (https://github.com/tschoonj/xraylib/wiki) Nature of problem: Simulation of a wide range of X-ray imaging and spectroscopy experiments using different types of sources and detectors. Solution method: XRMC is a versatile program that is useful for the simulation of a wide range of X-ray imaging and spectroscopy experiments. It enables the simulation of monochromatic and polychromatic X-ray sources, with unpolarised or partially/completely polarised radiation. Single-element detectors as well as two-dimensional pixel detectors can be used in the simulations, with several acquisition options. In the current version of the program, the sample is modelled by combining convex three-dimensional objects demarcated by quadric surfaces, such as planes, ellipsoids and cylinders. The Monte Carlo approach makes XRMC able to accurately simulate X-ray photon transport and interactions with matter up to any order of interaction. 
The differential cross-sections and all other quantities related to the interaction processes (photoelectric absorption, fluorescence emission, elastic and inelastic scattering) are computed using the xraylib software library, which is currently the most complete and up-to-date software library for X-ray parameters. The use of variance reduction techniques makes XRMC able to reduce the simulation time by several orders of magnitude compared to other general-purpose Monte Carlo simulation programs. Running time: It is dependent on the complexity of the simulation. For the examples distributed with the code, it ranges from less than 1 s to a few minutes.
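    One widely used variance reduction trick of the kind the abstract alludes to is forced detection: rather than waiting for a rare fluorescence photon to happen to reach the detector, every interaction contributes analytically with a statistical weight. The sketch below illustrates that single idea, with toy constants and geometry that are not XRMC's.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    def forced_detection(n_photons, mu_total, fluo_yield,
                         det_solid_angle, mu_exit, exit_path):
        """Weighted (forced-detection) estimate of the fluorescence signal."""
        weight_sum = 0.0
        for _ in range(n_photons):
            depth = rng.exponential(1.0 / mu_total)  # sampled interaction depth
            # Analytic weight: probability that a fluorescence photon is
            # emitted into the detector solid angle and survives the exit path.
            w = (fluo_yield * det_solid_angle / (4 * np.pi)
                 * np.exp(-mu_exit * exit_path(depth)))
            weight_sum += w
        return weight_sum / n_photons

    signal = forced_detection(10_000, mu_total=5.0, fluo_yield=0.3,
                              det_solid_angle=0.01, mu_exit=4.0,
                              exit_path=lambda d: d / np.cos(0.5))
    print(signal)  # every history contributes, not ~1 in 10**4 of them
    ```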

  1. Homeostasis and Complexity as Integrating Tools in Gerontological Education.

    ERIC Educational Resources Information Center

    Richardson, Daniel; McCulloch, B. Jan; Rowles, Graham D.

    2001-01-01

    A gerontology doctoral program used the concepts of homeostasis and complexity to present biomedical and psychosocial issues. Data from 14 students showed that homeostasis was more useful for biomedical than psychosocial issues. Complexity helped in understanding interactions between the two. (SK)

  2. From the ground up: building a minimally invasive aortic valve surgery program

    PubMed Central

    Lamelas, Joseph

    2015-01-01

    Minimally invasive aortic valve replacement (MIAVR) is associated with numerous advantages including improved patient satisfaction, cosmesis, decreased transfusion requirements, and cost-effectiveness. Despite these advantages, little information exists on how to build a MIAVR program from the ground up. The steps to build a MIAVR program include compiling a multi-disciplinary team composed of surgeons, cardiologists, anesthesiologists, perfusionists, operating room (OR) technicians, and nurses. Once assembled, this team can then approach hospital administrators to present a cost-benefit analysis of MIAVR, emphasizing the importance of reduced resource utilization in the long-term to offset the initial financial investment that will be required. With hospital approval, training can commence to provide surgeons and other staff with the necessary knowledge and skills in MIAVR procedures and outcomes. Marketing and advertising of the program through the use of social media, educational conferences, grand rounds, and printed media will attract the initial patients. A dedicated website for the program can function as a “virtual lobby” for patients wanting to learn more. Initially, conservative selection criteria of cases that qualify for MIAVR will set the program up for success by avoiding complex co-morbidities and surgical techniques. During the learning curve phase of the program, patient safety should be a priority. PMID:25870815

  3. From the ground up: building a minimally invasive aortic valve surgery program.

    PubMed

    Nguyen, Tom C; Lamelas, Joseph

    2015-03-01

    Minimally invasive aortic valve replacement (MIAVR) is associated with numerous advantages including improved patient satisfaction, cosmesis, decreased transfusion requirements, and cost-effectiveness. Despite these advantages, little information exists on how to build a MIAVR program from the ground up. The steps to build a MIAVR program include compiling a multi-disciplinary team composed of surgeons, cardiologists, anesthesiologists, perfusionists, operating room (OR) technicians, and nurses. Once assembled, this team can then approach hospital administrators to present a cost-benefit analysis of MIAVR, emphasizing the importance of reduced resource utilization in the long-term to offset the initial financial investment that will be required. With hospital approval, training can commence to provide surgeons and other staff with the necessary knowledge and skills in MIAVR procedures and outcomes. Marketing and advertising of the program through the use of social media, educational conferences, grand rounds, and printed media will attract the initial patients. A dedicated website for the program can function as a "virtual lobby" for patients wanting to learn more. Initially, conservative selection criteria of cases that qualify for MIAVR will set the program up for success by avoiding complex co-morbidities and surgical techniques. During the learning curve phase of the program, patient safety should be a priority.

  4. Ligand-protein docking using a quantum stochastic tunneling optimization method.

    PubMed

    Mancera, Ricardo L; Källblad, Per; Todorov, Nikolay P

    2004-04-30

    A novel hybrid optimization method called quantum stochastic tunneling has recently been introduced. Here, we report its implementation within a new docking program called EasyDock and a validation with the CCDC/Astex data set of ligand-protein complexes, using the PLP score to represent the ligand-protein potential energy surface and ScreenScore to score the ligand-protein binding energies. When taking the top energy-ranked ligand binding pose, we were able to predict the correct crystallographic ligand binding mode in up to 75% of the cases. By using this novel optimization method, run times for typical docking simulations are significantly shortened. Copyright 2004 Wiley Periodicals, Inc. J Comput Chem 25: 858-864, 2004
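    The stochastic tunneling idea itself is compact: the energy surface is transformed as f_STUN = 1 - exp(-γ(f - f_best)), which flattens barriers above the best minimum found so far and lets the search tunnel between wells. The sketch below applies that transformation to a toy one-dimensional objective with a Metropolis acceptance rule; EasyDock's actual hybrid method is more elaborate, and all parameters here are illustrative.

    ```python
    import math, random

    random.seed(3)

    def f(x):  # rugged toy objective with several local minima
        return math.sin(3 * x) + 0.1 * (x - 2.0) ** 2

    def stun_minimize(x, gamma=2.0, beta=5.0, steps=20_000):
        best_x, best_f = x, f(x)
        for _ in range(steps):
            x_new = x + random.uniform(-0.5, 0.5)
            f_new = f(x_new)
            if f_new < best_f:
                best_x, best_f = x_new, f_new
            # Metropolis acceptance on the transformed (flattened) surface:
            # f_stun = 1 - exp(-gamma * (f - f_best))
            stun_old = 1.0 - math.exp(-gamma * (f(x) - best_f))
            stun_new = 1.0 - math.exp(-gamma * (f_new - best_f))
            if (stun_new <= stun_old
                    or random.random() < math.exp(-beta * (stun_new - stun_old))):
                x = x_new
        return best_x, best_f

    print(stun_minimize(x=8.0))  # tunnels toward the global well near x ~ 1.6
    ```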

  5. Causes of catastrophic failure in complex systems

    NASA Astrophysics Data System (ADS)

    Thomas, David A.

    2010-08-01

    Root causes of mission critical failures and major cost and schedule overruns in complex systems and programs are studied through the post-mortem analyses compiled for several examples, including the Hubble Space Telescope, the Challenger and Columbia Shuttle accidents, and the Three Mile Island nuclear power plant accident. The roles of organizational complexity, cognitive biases in decision making, the display of quantitative data, and cost and schedule pressure are all considered. Recommendations for mitigating the risk of similar failures in future programs are also provided.

  6. An ontology-based approach to patient follow-up assessment for continuous and personalized chronic disease management.

    PubMed

    Zhang, Yi-Fan; Gou, Ling; Zhou, Tian-Shu; Lin, De-Nan; Zheng, Jing; Li, Ye; Li, Jing-Song

    2017-08-01

    Chronic diseases are complex and persistent clinical conditions that require close collaboration between patients and health care providers in the implementation of long-term and integrated care programs. However, current solutions focus partially on intensive interventions at hospitals rather than on continuous and personalized chronic disease management. This study aims to fill this gap by providing computerized clinical decision support during follow-up assessments of chronically ill patients at home. We proposed an ontology-based framework to integrate patient data, medical domain knowledge, and patient assessment criteria for chronic disease patient follow-up assessments. A clinical decision support system was developed to implement this framework for automatic selection and adaptation of standard assessment protocols to suit patients' personal conditions. We evaluated our method in a case study of type 2 diabetic patient follow-up assessments. The proposed framework was instantiated using real data from 115,477 follow-up assessment records of 36,162 type 2 diabetic patients. Standard evaluation criteria were automatically selected and adapted to the particularities of each patient. Assessment results were generated as a general typing of the patient's overall condition and detailed scoring for each criterion, providing important indicators to the case manager about possible inappropriate judgments, in addition to raising patient awareness of their disease control outcomes. Using historical data as the gold standard, our system achieved an accuracy of 99.93% and a completeness of 95.00%. This study contributes to improving the accessibility, efficiency and quality of current patient follow-up services. It also provides a generic approach to knowledge sharing and reuse for patient-centered chronic disease management. Copyright © 2017 Elsevier Inc. All rights reserved.
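    The adaptation step described above (selecting standard criteria and adjusting them to the patient's recorded conditions) can be sketched in miniature. The criteria, fields and thresholds below are invented for illustration and are not taken from the paper's ontology.

    ```python
    # Hypothetical standard protocol: criterion -> target and weight.
    STANDARD_CRITERIA = {
        'hba1c': {'target': 7.0, 'weight': 3},
        'systolic_bp': {'target': 140, 'weight': 2},
    }

    def adapt_criteria(patient):
        """Copy the standard protocol, then tailor it to the patient."""
        criteria = {k: dict(v) for k, v in STANDARD_CRITERIA.items()}
        if patient.get('age', 0) >= 75:              # relax glycaemic target
            criteria['hba1c']['target'] = 8.0
        if 'ckd' in patient.get('conditions', ()):   # tighter BP control
            criteria['systolic_bp']['target'] = 130
        return criteria

    def score(patient, criteria):
        """Weighted fraction of adapted criteria the patient meets."""
        total = sum(c['weight'] for c in criteria.values())
        met = sum(c['weight'] for name, c in criteria.items()
                  if patient['measures'][name] <= c['target'])
        return met / total

    p = {'age': 78, 'conditions': ['ckd'],
         'measures': {'hba1c': 7.6, 'systolic_bp': 128}}
    print(score(p, adapt_criteria(p)))  # 1.0 against the adapted targets
    ```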

  7. Capturing, Codifying and Scoring Complex Data for Innovative, Computer-Based Items.

    ERIC Educational Resources Information Center

    Luecht, Richard M.

    The Microsoft Certification Program (MCP) includes many new computer-based item types, based on complex cases involving the Windows 2000® operating system. This Innovative Item Technology (IIT) has presented challenges beyond traditional psychometric considerations, such as capturing and storing the relevant response data from…

  8. Student Follow-Up Using Automated Record Linkage Techniques: Lessons from Florida's Education and Training Placement Information Program (FETPIP).

    ERIC Educational Resources Information Center

    Pfeiffer, Jay J.

    Florida's Education and Training Placement Information Program (FETPIP) is a statewide system linking the administrative databases of certain state and federal agencies to collect follow-up data on former students or program participants. The databases that are collected include those of the Florida Department of Corrections; Florida Department of…

  9. Investigating the adiabatic beam grouping at the NICA accelerator complex

    NASA Astrophysics Data System (ADS)

    Brovko, O. I.; Butenko, A. V.; Grebentsov, A. Yu.; Eliseev, A. V.; Meshkov, I. N.; Svetov, A. L.; Sidorin, A. O.; Slepnev, V. M.

    2016-12-01

    The NICA complex comprises the Booster and Nuclotron synchrotrons for accelerating particle beams to the required energy and the Collider machine, in which particle collisions are investigated. The experimental heavy-ion program deals with ions up to Au79+. The light-ion program deals with polarized deuterons and protons. Grouping of a coasting ion beam is required in many parts of the complex. Beam grouping may effectively increase the longitudinal emittance and particle losses. To avoid these negative effects, various regimes of adiabatic grouping have been simulated, and dedicated experiments with a deuteron beam have been conducted at the Nuclotron machine. As a result, we are able to construct and optimize the beam-grouping equipment, which provides a capture efficiency near 100% while either retaining or varying the harmonic multiplicity of the HF system.

  10. The Significance of Ongoing Teacher Support in Earth Science Education Programs: Evidence from the GLOBE Program

    NASA Astrophysics Data System (ADS)

    Penuel, B.; Korbak, C.; Shear, L.

    2003-12-01

    The GLOBE program provides a rich context for examining issues concerning the implementation of inquiry-oriented, scientist-driven educational programs, because the program has both a history of collecting evaluation data on implementation and mechanisms for capturing program activity as it occurs. In this paper, researchers from SRI International's evaluation team explore the different roles that regional partners play in preparing and supporting teachers to implement the GLOBE Program, an international inquiry-based Earth science education initiative that has trained over 14,000 teachers worldwide. GLOBE program evaluation results show the program can be effective in increasing students' inquiry skills, but that the program is also hard for teachers to implement (Means et al., 2001; Penuel et al., 2002). An analysis of GLOBE's regional partner organizations, which are tasked with preparing teachers to implement its data collection and reporting protocols with students, shows that some partners are more successful than others. This paper reports findings from a quantitative analysis of the relationship between data reporting and partner support activities and from case studies of two such regional partners, focused on analyzing what makes them successful. The first analysis examined associations between partner training and support activities and data reporting. For this analysis, we used data from the GLOBE Student Data Archive matched with survey data collected from a large sample of GLOBE teachers as part of SRI's Year 5 evaluation of GLOBE. Our analyses point to the central importance of mentoring and material support to teachers. We found that incentives, mentoring, and other on-site support to teachers have a statistically significant association with higher data reporting levels. We also found that at present, teachers access these supports less often than they access listservs and e-mail communication with teachers after GLOBE training. As a follow-up to this study, SRI researchers used the data on student data reporting activity from different partners to identify candidate sites for case studies, where we might investigate more closely the nature of follow-up activities provided by successful partners. We selected two regional partners that had evidence of high percentages of trained teachers reporting data and that also offered follow-up to teachers. Case study researchers conducted observations within 2-3 active GLOBE schools supported by each regional partner organization and interviewed teachers, principals, and partner staff. On the basis of our observation data and transcripts from interviews, we compiled profiles of schools' implementation and analyzed the core activities of each regional partner. Researchers found that keys to promoting successful implementation in one partnership were: close alignment with state mathematics and science initiatives; mentors who helped teachers by modeling inquiry in GLOBE and by assisting with equipment set-up and curriculum planning; and allowing room for schools to adopt diverse goals for GLOBE. In the second partnership, keys to success included a strategic approach to developing funding for the program; a focus on the integration of culturally-relevant knowledge into teacher preparation; follow-up support for teachers; and use of GLOBE as an opportunity to investigate local evidence of climate change.
Both partner organizations were challenged by funding limitations that prevented them from providing as much follow-up support as they believe is necessary.

  11. Telling the story of tree species’ range shifts in a complex landscape

    Treesearch

    Sharon M. Stanton; Vicente J. Monleon; Heather E. Lintz; Joel Thompson

    2015-01-01

    The Forest Inventory and Analysis Program is the unrivaled source for long-term, spatially balanced, publicly available data. FIA will continue to be a provider of data, but the program is growing and adapting, including a shift in how we communicate information and knowledge derived from those data. Online applications, interactive mapping, and infographics provide...

  12. Open Data, Jupyter Notebooks and Geospatial Data Standards Combined - Opening up large volumes of marine and climate data to other communities

    NASA Astrophysics Data System (ADS)

    Clements, O.; Siemen, S.; Wagemann, J.

    2017-12-01

    The EU-funded EarthServer-2 project aims to offer on-demand access to large volumes of environmental data (Earth Observation, Marine, Climate and Planetary data) via the Web Coverage Service (WCS) interface standard defined by the Open Geospatial Consortium. Providing access to data via OGC web services (e.g. WCS and WMS) has the potential to open up services to a wider audience, especially to users outside the respective communities. WCS 2.0 in particular, with its processing extension, the Web Coverage Processing Service (WCPS), is highly beneficial for making large volumes accessible to non-expert communities. Users do not have to deal with custom community data formats, such as GRIB for the meteorological community, but can directly access the data in a format they are more familiar with, such as NetCDF, JSON or CSV. Data requests can further be integrated directly into custom processing routines, and users are no longer required to download gigabytes of data. WCS supports trim (reduction of data extent) and slice (reduction of data dimension) operations on multi-dimensional data, providing users very flexible on-demand access to the data. WCPS allows the user to craft queries to run on the data using a text-based query language similar to SQL. These queries can be very powerful, e.g. condensing a three-dimensional data cube into its two-dimensional mean; however, the more intensive the processing, the more complex the query becomes. As part of the EarthServer-2 project, we developed a Python library that helps users generate complex WCPS queries from Python, a programming language they are more familiar with. The interactive presentation aims to give practical examples of how users can benefit from two specific WCS services from the Marine and Climate communities. Use cases from the two communities will show different approaches to taking advantage of a Web Coverage (Processing) Service. The entire content is available as Jupyter Notebooks, as they prove to be a highly beneficial tool for generating reproducible workflows for environmental data analysis.
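    A query helper of the kind described can be very small. The sketch below composes a WCPS query string in Python and shows how it would be sent to a WCS endpoint; the coverage name and endpoint URL are placeholders, and the real EarthServer-2 library's API may differ.

    ```python
    def wcps_mean_over_time(coverage, lat, lon, t0, t1):
        """Build a WCPS query averaging a coverage over a time interval."""
        return (f'for c in ({coverage}) return encode('
                f'avg(c[Lat({lat}), Long({lon}), ansi("{t0}":"{t1}")]), "csv")')

    query = wcps_mean_over_time('AvgLandTemp', 53.08, 8.80,
                                '2014-01', '2014-12')
    print(query)
    # import requests
    # resp = requests.post('https://example.org/rasdaman/ows',  # placeholder
    #                      data={'query': query})
    ```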

  13. Acquisition Program Lead Systems Integration/Lead Capabilities Integration Decision Support Methodology and Tool

    DTIC Science & Technology

    2015-04-30

    from the MIT Sloan School that provide a relative complexity score for functions (Product and Context Complexity). The PMA assesses the complexity...

  14. LONGITUDINAL FOLLOW-UP OF LATE-ONSET ALZHEIMER DISEASE FAMILIES

    PubMed Central

    Carney, R.M.; Slifer, M.A.; Lin, P.I.; Gaskell, P. C.; Scott, W. K.; Potocky, C.F.; Hulette, C. M.; Welsh-Bohmer, K. A.; Schmechel, D. E.; Vance, J.M.; Pericak-Vance, M. A.

    2009-01-01

    Historically, data for genetic studies are collected at one time point. However, for diseases with late onset or with complex phenotypes, such as Alzheimer disease (AD), restricting diagnosis to a single ascertainment contact may not be sufficient. Affection status may change over time, and some initial diagnoses may be inconclusive. Follow-up provides the opportunity to resolve these complications. However, to date, previous studies have not formally demonstrated that longitudinally re-contacting families is practical or productive. To update data initially collected for linkage analysis of late-onset Alzheimer disease (LOAD), we successfully re-contacted 63 of 81 (78%) multiplex families (two to 17 years after ascertainment). Clinical status changed for 73 of the 230 (32%) non-affected participants. Additionally, expanded family history identified 20 additional affected individuals to supplement the data set. Furthermore, fostering ongoing relationships with participating families helped recruit 101 affected participants into an autopsy and tissue donation program. Despite similar presentations, discordance between clinical diagnosis and neuropathologic diagnosis was observed in 28% of those with tissue diagnoses. Most of the families were successfully re-contacted, and significant refinement and supplementation of the data was achieved. We concluded that serial contact with longitudinal evaluation of families has significant implications for genetic analyses. PMID:18361431

  15. STING Millennium: a web-based suite of programs for comprehensive and simultaneous analysis of protein structure and sequence

    PubMed Central

    Neshich, Goran; Togawa, Roberto C.; Mancini, Adauto L.; Kuser, Paula R.; Yamagishi, Michel E. B.; Pappas, Georgios; Torres, Wellington V.; Campos, Tharsis Fonseca e; Ferreira, Leonardo L.; Luna, Fabio M.; Oliveira, Adilton G.; Miura, Ronald T.; Inoue, Marcus K.; Horita, Luiz G.; de Souza, Dimas F.; Dominiquini, Fabiana; Álvaro, Alexandre; Lima, Cleber S.; Ogawa, Fabio O.; Gomes, Gabriel B.; Palandrani, Juliana F.; dos Santos, Gabriela F.; de Freitas, Esther M.; Mattiuz, Amanda R.; Costa, Ivan C.; de Almeida, Celso L.; Souza, Savio; Baudet, Christian; Higa, Roberto H.

    2003-01-01

    STING Millennium Suite (SMS) is a new web-based suite of programs and databases providing visualization and complex analysis of molecular sequence and structure for the data deposited at the Protein Data Bank (PDB). SMS operates with a collection of both publicly available data (PDB, HSSP, Prosite) and its own data (contacts, interface contacts, surface accessibility). Biologists find SMS useful because it provides a variety of algorithms and validated data, wrapped up in a user-friendly web interface. Using SMS it is now possible to analyze sequence-to-structure relationships, the quality of the structure, the nature and volume of atomic contacts of intra- and inter-chain type, the relative conservation of amino acids at specific sequence positions based on multiple sequence alignment, and indications of folding essential residues (FER) based on the relationship of residue conservation to the intra-chain contacts and Cα–Cα and Cβ–Cβ distance geometry. Specific emphasis in SMS is given to interface forming residues (IFR), amino acids that define the interactive portion of the protein surfaces. SMS may simultaneously display and analyze previously superimposed structures. PDB updates trigger SMS updates in a synchronized fashion. SMS is freely accessible for public data at http://www.cbi.cnptia.embrapa.br, http://mirrors.rcsb.org/SMS and http://trantor.bioc.columbia.edu/SMS. PMID:12824333

  16. Factors Associated with Attrition in Weight Loss Programs

    ERIC Educational Resources Information Center

    Grave, Riccardo Dalle; Suppini, Alessandro; Calugi, Simona; Marchesini, Giulio

    2006-01-01

    Attrition in weight loss programs is a complex process, influenced by patients' pretreatment characteristics and treatment variables, but available data are contradictory. Only a few variables have been confirmed by more than one study as relevant risk factors, but recently new data of clinical utility emerged from "real world" large observational…

  17. The application of dynamic programming in production planning

    NASA Astrophysics Data System (ADS)

    Wu, Run

    2017-05-01

    Nowadays, with the popularity of computers, various industries and fields widely apply computer information technology, which creates huge demand for a variety of application software. In order to develop software that meets various needs at the most economical cost and with the best quality, programmers must design efficient algorithms. A superior algorithm not only solves the problem at hand, but also maximizes the benefits while generating the smallest overhead. As one of the common algorithm design techniques, dynamic programming is used to solve problems that exhibit optimal substructure. When solving problems with a large number of sub-problems that require repetitive calculation, an ordinary recursive method consumes exponential time, while dynamic programming can reduce the time complexity of the algorithm to the polynomial level; dynamic programming is therefore very efficient compared to other approaches, reducing computational complexity and enriching the computational results. In this paper, we expound the concept, basic elements, properties, core idea, solution steps and difficulties of dynamic programming, and establish a dynamic programming model of the production planning problem.
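    A production planning model of the kind the paper establishes can be stated as a small dynamic program: choose how much to produce each period so demand is met at minimum production-plus-holding cost, with the remaining horizon's cost memoized per (period, inventory) state. The sketch below uses toy costs and capacities; it illustrates the memoization argument behind the polynomial running time, not the paper's exact model.

    ```python
    from functools import lru_cache

    demand = [2, 3, 2]       # units required per period
    CAP, MAX_INV = 4, 4      # production capacity, warehouse limit
    SETUP, UNIT, HOLD = 3.0, 1.0, 0.5

    @lru_cache(maxsize=None)
    def best(t, inv):
        """Minimum cost from period t onward with inv units in stock."""
        if t == len(demand):
            return 0.0
        options = []
        for produce in range(CAP + 1):
            left = inv + produce - demand[t]
            if 0 <= left <= MAX_INV:
                cost = ((SETUP if produce else 0.0)
                        + UNIT * produce + HOLD * left)
                options.append(cost + best(t + 1, left))
        return min(options)

    print(best(0, 0))  # optimal total cost over the planning horizon
    ```

    Because each (period, inventory) state is solved once and reused, the running time is proportional to the number of states times the number of decisions, rather than exponential in the horizon length.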

  18. The first ten years of Swift supernovae

    NASA Astrophysics Data System (ADS)

    Brown, Peter J.; Roming, Peter W. A.; Milne, Peter A.

    2015-09-01

    The Swift Gamma Ray Burst Explorer has proven to be an incredible platform for studying the multiwavelength properties of supernova explosions. In its first ten years, Swift has observed over three hundred supernovae. The ultraviolet observations reveal a complex diversity of behavior across supernova types and classes. Even amongst the standard-candle type Ia supernovae, ultraviolet observations reveal distinct groups. When the UVOT data are combined with higher-redshift optical data, the relative populations of these groups appear to change with redshift. Among core-collapse supernovae, Swift discovered the shock breakout of two supernovae, and the Swift data show a diversity in the cooling phase of the shock breakout of supernovae discovered from the ground and promptly followed up with Swift. Swift observations have resulted in a rich dataset of UV and X-ray data for comparison with high-redshift supernova observations and theoretical models. Swift's supernova program has the potential to dramatically improve our understanding of stellar life and death as well as the history of our universe.

  19. Case Management for Patients with Complex Multimorbidity: Development and Validation of a Coordinated Intervention between Primary and Hospital Care

    PubMed Central

    Giménez-Campos, María Soledad; Villar-López, Julia; Faubel-Cava, Raquel; Donat-Castelló, Lucas; Valdivieso-Martínez, Bernardo; Soriano-Melchor, Elisa; Bahamontes-Mulió, Amparo; García-Gómez, Juan M.

    2017-01-01

    In the past few years, healthcare systems have been facing a growing demand related to the high prevalence of chronic diseases. Case management programs have emerged as an integrated care approach for the management of chronic disease. Nevertheless, there is little scientific evidence on the impact of using a case management program for patients with complex multimorbidity regarding hospital resource utilisation. We evaluated an integrated case management intervention set up by community-based care at outpatient clinics with nurse case managers from a telemedicine unit. The hypothesis to be tested was whether improved continuity of care resulting from the integration of community-based and hospital services reduced the use of hospital resources amongst patients with complex multimorbidity. A retrospective cohort study was performed using a sample of 714 adult patients admitted to the program between January 2012 and January 2015. We found a significant decrease in the number of emergency room visits, unplanned hospitalizations, and length of stay, and an expected increase in the home care hospital-based episodes. These results support the hypothesis that case management interventions can reduce the use of unplanned hospital admissions when applied to patients with complex multimorbidity. PMID:28970745

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Bingye; Sintov, Nicole

    To achieve energy savings, emerging energy management technologies and programs require customer adoption. Although a variety of models can be used to explain the adoption of energy management technologies and programs, they overlook the seemingly unconventional element of level of affiliation with nature. In fact, connectedness to nature has been identified as an important driver of many pro-environmental behaviors, but its role in pro-environmental technology adoption is not well understood. Can affiliation with nature help to bridge the apparent gap, and the complex chain of events, between sustainable technology adoption and protecting natural resources? Based on survey data from 856 southern California residents, this study investigated the influence of connectedness to nature and other factors on intentions to adopt five energy management technologies and programs: using three platforms to monitor home energy use (website, mobile phone application, in-home display); signing up for a time-of-use pricing plan; and participating in demand response events. Regression results showed that nature connectedness was the strongest predictor of all outcomes, such that higher nature connectedness predicted greater likelihood of technology and program adoption. In conclusion, these findings suggest that connectedness to nature may facilitate "bridging the logic gap" between sustainable innovation adoption and environmental protection.

  1. Genetic programming approach to evaluate complexity of texture images

    NASA Astrophysics Data System (ADS)

    Ciocca, Gianluigi; Corchs, Silvia; Gasparini, Francesca

    2016-11-01

    We adopt genetic programming (GP) to define a measure that can predict the complexity perception of texture images. We perform psychophysical experiments on three different datasets to collect data on perceived complexity. The subjective data are used for training, validation, and testing of the proposed measure. These data are also used to evaluate several possible candidate measures of texture complexity related to both low-level and high-level image features. We select four of them (namely roughness, number of regions, chroma variance, and memorability) to be combined in a GP framework. This approach allows a nonlinear combination of the measures and could give hints on how the related image features interact in complexity perception. The proposed complexity measure M exhibits Pearson correlation coefficients of 0.890 on the training set, 0.728 on the validation set, and 0.724 on the test set. M outperforms each of the single measures considered. From the statistical analysis of different GP candidate solutions, we found that the roughness measure evaluated on the gray-level image is the most dominant one, followed by memorability, the number of regions, and finally chroma variance.
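    The fitness evaluation at the heart of such a GP search is easy to sketch: candidate expressions combine the four selected measures nonlinearly and are scored by their Pearson correlation with the subjective complexity ratings. The data and the hand-written candidate expressions below are synthetic stand-ins for evolved expression trees, not the paper's evolved measure M.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    n = 50  # number of texture images
    features = {
        'roughness': rng.uniform(0, 1, n),
        'n_regions': rng.uniform(0, 1, n),
        'chroma_var': rng.uniform(0, 1, n),
        'memorability': rng.uniform(0, 1, n),
    }
    # Fake subjective ratings, standing in for psychophysical data.
    subjective = (0.6 * features['roughness']
                  + 0.3 * features['memorability']
                  + rng.normal(0, 0.05, n))

    candidates = {  # stand-ins for GP-evolved expression trees
        'linear': lambda f: f['roughness'] + f['memorability'],
        'product': lambda f: f['roughness'] * (1 + f['n_regions']),
        'mixed': lambda f: (f['roughness']
                            + np.sqrt(f['memorability']) * f['chroma_var']),
    }

    for name, expr in candidates.items():
        m = expr(features)
        fitness = np.corrcoef(m, subjective)[0, 1]  # Pearson r as fitness
        print(f'{name}: r = {fitness:.3f}')
    ```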

  2. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--STANDARD OPERATING PROCEDURE FOR CODING: FOLLOW UP QUESTIONNAIRE (UA-D-11.0)

    EPA Science Inventory

    The purpose of this SOP is to define the coding strategy for the Follow Up Questionnaire. This questionnaire was developed for use in the Arizona NHEXAS project and the Border study. Household and individual follow-up data were combined in a single Follow-up Questionnaire data f...

  3. SPES-2, an experimental program to support the AP600 development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tarantini, M.; Medich, C.

    1995-09-01

    In support of the development of the AP600 reactor, ENEA, ENEL, ANSALDO and Westinghouse have signed a research agreement. In the framework of this agreement, a complex Full Height Full Pressure (FHFP) integral system testing program has been planned on the SPES-2 facility. The main purpose of this paper is to report the status of the test program and to describe the hot pre-operational tests performed and the complete test matrix, giving all the necessary references to the work already published. Two identical Small Break LOCA transients, performed with the Pressurizer to Core Make-up Tank (PRZ-CMT) balance line (Test S00203) and without the PRZ-CMT balance line (Test S00303), are then compared to show how the SPES-2 facility can contribute to confirming the new AP600 reactor design choices and can give useful indications to designers. Although the detailed analysis of the test data has not been completed, some considerations on the analytical tools utilized and on the capability of SPES-2 to simulate the reference plant are then drawn.

  4. Feasibility of and barriers to continuity of care in US general surgery residencies with an 80-hour duty week.

    PubMed

    Morrissey, Shawna; Dumire, Russell; Bost, James; Gregory, James S

    2011-03-01

    The current level of continuity of care for following a single patient through preoperative evaluation, surgery, and postoperative care is unknown. A survey of residents was performed, asking for their best guess regarding the number of patients seen for 6 common and 4 uncommon surgeries, and asking them to rank barriers to continuity of care. The length of time needed to achieve single-patient continuity of care in 5 patients was derived, and odds ratios were created for the barriers. A total of 274 residents (56 programs) completed surveys. The derived residency length was 7 years for common surgeries and 9 years for complex surgeries. The 30-hour work restrictions, inability to attend clinic, and floor/ward duties were the main barriers to continuity of care. These data were unaffected by the type of program, the presence of a night float system, or residency year. Achieving the level of continuity of care described in this article will require a radical change in the length or structure of general surgery residency programs. Copyright © 2011 Elsevier Inc. All rights reserved.

  5. KISS for STRAP: user extensions for a protein alignment editor.

    PubMed

    Gille, Christoph; Lorenzen, Stephan; Michalsky, Elke; Frömmel, Cornelius

    2003-12-12

    The Structural Alignment Program STRAP is a comfortable, comprehensive editor and analysis tool for protein alignments. A wide range of functions related to protein sequences and protein structures are accessible through an intuitive graphical interface. Recent features include mapping of mutations and polymorphisms onto structures and production of high-quality figures for publication. Here we address the general problem of multi-purpose program packages keeping up with the rapid development of bioinformatical methods and the demand for specific program functions. STRAP was remade with a novel design which aims at Keeping Interfaces in STRAP Simple (KISS). KISS renders STRAP extensible by bio-scientists as well as by bio-informaticians. Scientists with basic computer skills are capable of implementing statistical methods or embedding existing bioinformatical tools in STRAP themselves. For bio-informaticians, STRAP may serve as an environment for rapid prototyping and testing of complex algorithms such as automatic alignment algorithms or phylogenetic methods. Further, STRAP can be applied as an interactive web applet to present data related to a particular protein family and as a teaching tool. JAVA-1.4 or higher. http://www.charite.de/bioinf/strap/

  6. Long-term Effects of the Family Bereavement Program (FBP) on Spousally-Bereaved Parents: Grief, Mental Health, Alcohol Problems, and Coping Efficacy

    PubMed Central

    Sandler, Irwin; Tein, Jenn-Yun; Cham, Heining; Wolchik, Sharlene; Ayers, Tim

    2016-01-01

    Objective: To report the findings from a six-year follow-up of a randomized trial of the Family Bereavement Program (FBP) on outcomes for spousally-bereaved parents. Method: Spousally-bereaved parents (N=131) participated in the trial, in which they were randomly assigned to receive the FBP (N=72) or literature control (LC, N=59). Parents were assessed at four time points: pre-test, post-test, 11 months, and six-year follow-up. Parents reported on mental health problems, grief and parenting at all four time points. At the six-year follow-up, parents reported on additional measures of persistent complex bereavement disorder, alcohol abuse problems, and coping efficacy. Results: Bereaved parents in the FBP, as compared with those in the LC, had lower levels of symptoms of depression, general psychiatric distress, prolonged grief and alcohol problems, and higher coping efficacy at the six-year follow-up. Multiple characteristics of the parent (e.g., gender, age, baseline mental health problems) and of the spousal death (e.g., cause of death) were tested as moderators of program effects on each outcome. Latent-growth modeling found that the effects of the FBP on depression, psychiatric distress and grief occurred immediately following program participation and were maintained over six years. Mediation analysis found that improvement in positive parenting partially mediated program effects to reduce depression and psychiatric distress, but had an indirect effect on higher levels of grief at the six-year follow-up. Mediation analysis also found that improved parenting at the six-year follow-up was partially mediated by program effects to reduce depression, and that program effects to increase coping efficacy at the six-year follow-up were partially mediated through reduced depression and grief and improved parenting. Conclusions: The FBP reduced mental health problems, prolonged grief and alcohol abuse and increased the coping efficacy of spousally-bereaved parents six years later. Mediation pathways for program effects differed across outcomes at the six-year follow-up. PMID:27427807

  7. Enabling long-term oceanographic research: Changing data practices, information management strategies and informatics

    NASA Astrophysics Data System (ADS)

    Baker, Karen S.; Chandler, Cynthia L.

    2008-09-01

    Interdisciplinary global ocean science requires new ways of thinking about data and data management. With new data policies and growing technological capabilities, datasets of increasing variety and complexity are being made available digitally and data management is coming to be recognized as an integral part of scientific research. To meet the changing expectations of scientists collecting data and of data reuse by others, collaborative strategies involving diverse teams of information professionals are developing. These changes are stimulating the growth of information infrastructures that support multi-scale sampling, data repositories, and data integration. Two examples of oceanographic projects incorporating data management in partnership with science programs are discussed: the Palmer Station Long-Term Ecological Research program (Palmer LTER) and the United States Joint Global Ocean Flux Study (US JGOFS). Lessons learned from a decade of data management within these communities provide an experience base from which to develop information management strategies—short-term and long-term. Ocean Informatics provides one example of a conceptual framework for managing the complexities inherent to sharing oceanographic data. Elements are introduced that address the economies-of-scale and the complexities-of-scale pertinent to a broader vision of information management and scientific research.

  8. In situ ozone data for evaluation of the laser absorption spectrometer ozone remote sensor: 1979 southeastern Virginia urban plume study summer field program

    NASA Technical Reports Server (NTRS)

    Gregory, G. L.; Mcdougal, D. S.; Mathis, J. J., Jr.

    1980-01-01

    Ozone data from the 1979 Southeastern Virginia Urban Plume Study (SEV-UPS) field program are presented. The SEV-UPS was conducted for evaluation of an ozone remote sensor, the Laser Absorption Spectrometer. During the measurement program, remote-sensor evaluation covered two areas: (1) determination of the remote sensor's accuracy, repeatability, and operational characteristics, and (2) demonstration of the application of remotely sensed ozone data in air-quality studies. Data from six experiments designed to provide in situ ozone data for evaluation of the sensor in area (1) are presented. The experiments consisted of overflights of a test area with the remote-sensor aircraft while in situ measurements with a second aircraft and selected surface stations provided correlative ozone data within the viewing area of the remote sensor.

  9. A national survey of residents in combined Internal Medicine and Dermatology residency programs: educational experience and future plans.

    PubMed

    Mostaghimi, Arash; Wanat, Karolyn; Crotty, Bradley H; Rosenbach, Misha

    2015-10-16

    In response to a perceived erosion of medical dermatology, combined internal medicine and dermatology (med/derm) programs have been developed that aim to train dermatologists who take care of medically complex patients. Despite the investment in these programs, there are currently no data regarding the potential impact of these trainees on the dermatology workforce. To determine the experiences, motivations, and future plans of residents in combined med/derm residency programs, we surveyed residents at all United States institutions with both categorical and combined training programs in the spring of 2012. Respondents used visual analog scales to rate clinical interests, self-assessed competency, career plans, and challenges. The primary study outcomes were comfort in taking care of patients with complex disease, future practice plans, and experience during residency. Twenty-eight of 31 med/derm residents (87.5%) and 28 of 91 (31%) categorical residents responded (overall response rate 46%). No significant differences were seen in self-assessed dermatology competency or in comfort in performing inpatient consultations, performing cosmetic procedures, or prescribing systemic agents. A trend toward less comfort in general dermatology was seen among med/derm residents. Med/derm residents were more likely to indicate career preferences for performing inpatient consultation and taking care of medically complex patients. Categorical residents rated their programs and experiences more highly. Med/derm residents have stronger interests in serving medically complex patients. Categorical residents are more likely to have a positive experience during residency. Future work will be needed to ascertain career choices among graduates once data are available.

  10. Using a commercial CAD system for simultaneous input to theoretical aerodynamic programs and wind-tunnel model construction

    NASA Technical Reports Server (NTRS)

    Enomoto, F.; Keller, P.

    1984-01-01

    The Computer Aided Design (CAD) system's common geometry database was used to generate input for theoretical programs and numerically controlled (NC) tool paths for wind tunnel part fabrication. This eliminates the duplication of work in generating separate geometry databases for each type of analysis. Another advantage is that it reduces the uncertainty due to geometric differences when comparing theoretical aerodynamic data with wind tunnel data. The system was adapted to aerodynamic research by developing programs written in Design Analysis Language (DAL). These programs reduced the amount of time required to construct complex geometries and to generate input for theoretical programs. Certain shortcomings of the Design, Drafting, and Manufacturing (DDM) software limited the effectiveness of these programs and some of the Calma NC software. The complexity of aircraft configurations suggests that more types of surface and curve geometry should be added to the system. Some of these shortcomings may be eliminated as improved versions of DDM are made available.

  11. Development of a practice-based research program.

    PubMed

    Hawk, C; Long, C R; Boulanger, K

    1998-01-01

    To establish an infrastructure to collect accurate data from ambulatory settings. The program was developed through an iterative model governed by a process of formative evaluation. The three iterations were a needs assessment, a feasibility study and a pilot project. Necessary program components were identified as infrastructure, practitioner-researcher partnership, centralized data management and standardized quality assurance measures. Volunteer chiropractors and their staff collected data on patients in their practices in ambulatory settings in the U.S. and Canada. Evaluative measures were counts of participants, patients and completed forms. Standardized, validated and reliable measures collected by patient self-report were used to assess treatment outcomes. These included the SF-36 or SF-12 Health Survey, the Pain Disability Index, and the Global Well-Being Scale. For characteristics for which appropriate standardized instruments were not available, questionnaires were designed and pilot-tested before use. Information was gathered on practice and patient characteristics and treatment outcomes, but for this report, only those data concerning process evaluation are reported. Through the three program iterations, 65 DCs collected data on 1360 patients, 663 of whom were new patients. Follow-up data recorded by doctors were obtained for more than 70% of patients; a maximum of 50% of patient-completed follow-up forms were collected in the three iterations. This program is capable of providing data for descriptive epidemiology of ambulatory patients and, with continued effort to maximize follow-up, may have utility in providing insight into utilization patterns and patient outcomes.

  12. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--STANDARD OPERATING PROCEDURE FOR CLEANING: FOOD DIARY FOLLOW UP (UA-D-21.0)

    EPA Science Inventory

    The purpose of this SOP is to define the steps involved in cleaning the electronic data generated from data entry of the Food Diary Follow Up Questionnaire. It applies to electronic data corresponding to the Food Diary Follow Up Questionnaire that was scanned and verified by the...

  13. How smart should pacemakers be?

    PubMed

    Saoudi, N; Appl, U; Anselme, F; Voglimacci, M; Cribier, A

    1999-03-11

    The concept of the "smart" pacemaker has been continuously changing during 40 years of progress in technology. When we talk today about smart pacemakers, it means optimal treatment, diagnosis, and follow-up for patients fitting the current indications for pacemakers. So what is smart today becomes accepted as "state of the art" tomorrow. Originally, implantable pacemakers were developed to save lives from prolonged episodes of bradycardia and/or complete heart block. Now, in addition, they improve quality of life via numerous different functions acting under specific conditions, thanks to the introduction of microprocessors. The devices have become smaller, with the miniaturization of the electrical components, without compromising longevity. Nevertheless, there are still some unmatched objectives for these devices, for example, the optimization of cardiac output and the management of atrial arrhythmias in dual-chamber devices. Furthermore, indications continue to evolve, which in turn require new, additional functions. These functions are often very complex, necessitating computerized programming to simplify application. In addition, the follow-up of these devices is time-consuming, as appropriate system performance has to be regularly monitored. A great many of these functions could be automatically performed and documented, thus enabling physicians and paramedical staff to avoid losing time with routine control procedures. In addition, modern pacemakers offer extensive diagnostic functions to help diagnose patient symptoms and pacemaker system problems. Different types of data are available, and their presentation differs from one company to the other. This huge amount of data can only be managed with automatic diagnostic functions. Thus, the smart pacemaker of the near future should offer high flexibility to permit easy programming of available therapies and follow-up, and extensive, easily comprehensible diagnostic functions.

  14. PLANNING FOR INSTRUCTIONAL TELEVISION.

    ERIC Educational Resources Information Center

    WIENS, JACOB H.

    EXPERIENCES OF THREE JUNIOR COLLEGES WERE THE BASIS FOR THIS SUMMARY OF (1) PROCEDURES FOR ACQUIRING A TELEVISION CHANNEL, (2) METHODS OF SETTING UP AND FINANCING A STATION, AND (3) PROGRAMS SUITABLE FOR A DISTRICT OWNED STATION. THE COMPLEXITIES OF ACQUIRING A CHANNEL ASSIGNMENT ARE DESCRIBED IN DETAIL, INCLUDING THE FUNCTIONS OF LAWYERS AND…

  15. The Diversity Project: An Ethnography of Social Justice Experiential Education Programming

    ERIC Educational Resources Information Center

    Vernon, Franklin

    2016-01-01

    Whilst adventure-based experiential education traditions have long-standing claims of progressive, democratic learning potential, little research has examined practice from within democratic theories of participation and learning. Focusing on a complex network making up a disturbing interaction in an outdoor education programme, I posit forms of…

  16. Muscle Fiber Types and Training.

    ERIC Educational Resources Information Center

    Karp, Jason R.

    2001-01-01

    The specific types of fibers that make up individual muscles greatly influence how people will adapt to their training programs. This paper explains the complexities of skeletal muscles, focusing on types of muscle fibers (slow-twitch and fast-twitch), recruitment of muscle fibers to perform a motor task, and determining fiber type. Implications…

  17. Project based, Collaborative, Algorithmic Robotics for High School Students: Programming Self Driving Race Cars at MIT

    DTIC Science & Technology

    2017-02-19

    …software systems: the students design and build robotics software towards real-world applications, without being distracted by hardware issues; (ii) it… high school students require the students to focus on building and integrating the hardware that makes up the robot, at the expense of designing and… robotics programs focus on the mechanics; as a result, they do not have room for students to design and implement relatively complex software systems, as…

  18. A Wideband Fast Multipole Method for the two-dimensional complex Helmholtz equation

    NASA Astrophysics Data System (ADS)

    Cho, Min Hyung; Cai, Wei

    2010-12-01

    A Wideband Fast Multipole Method (FMM) for the 2D Helmholtz equation is presented. It can evaluate the interactions between N particles governed by the fundamental solution of 2D complex Helmholtz equation in a fast manner for a wide range of complex wave number k, which was not easy with the original FMM due to the instability of the diagonalized conversion operator. This paper includes the description of theoretical backgrounds, the FMM algorithm, software structures, and some test runs. Program summaryProgram title: 2D-WFMM Catalogue identifier: AEHI_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEHI_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 4636 No. of bytes in distributed program, including test data, etc.: 82 582 Distribution format: tar.gz Programming language: C Computer: Any Operating system: Any operating system with gcc version 4.2 or newer Has the code been vectorized or parallelized?: Multi-core processors with shared memory RAM: Depending on the number of particles N and the wave number k Classification: 4.8, 4.12 External routines: OpenMP ( http://openmp.org/wp/) Nature of problem: Evaluate interaction between N particles governed by the fundamental solution of 2D Helmholtz equation with complex k. Solution method: Multilevel Fast Multipole Algorithm in a hierarchical quad-tree structure with cutoff level which combines low frequency method and high frequency method. Running time: Depending on the number of particles N, wave number k, and number of cores in CPU. CPU time increases as N log N.
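
    As a point of reference for what the FMM accelerates: the free-space Green's function of the 2D Helmholtz equation is G(r) = (i/4) H_0^(1)(kr), and evaluating all pairwise interactions directly costs O(N^2). The minimal Python sketch below (an illustration added here, not part of the 2D-WFMM package, which is written in C) computes that direct sum and would serve as a brute-force check of an FMM result; SciPy's Hankel function accepts the complex wave numbers the paper emphasizes.

        import numpy as np
        from scipy.special import hankel1

        def direct_helmholtz_sum(points, charges, k):
            """O(N^2) direct sum u_i = sum_{j != i} q_j * (i/4) * H0^(1)(k |x_i - x_j|)."""
            n = len(points)
            u = np.zeros(n, dtype=complex)
            for i in range(n):
                r = np.linalg.norm(points - points[i], axis=1)  # distances to all sources
                r[i] = 1.0                                      # placeholder; self-term removed below
                g = 0.25j * hankel1(0, k * r)                   # 2D Helmholtz Green's function
                g[i] = 0.0                                      # exclude self-interaction
                u[i] = np.dot(g, charges)
            return u

        rng = np.random.default_rng(0)
        pts = rng.random((200, 2))                      # 200 random particles in the unit square
        q = rng.random(200) + 0j
        u = direct_helmholtz_sum(pts, q, k=5.0 + 0.1j)  # complex wave number, as in the paper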

  19. Overview of national bird population monitoring programs and databases

    Treesearch

    Gregory S. Butcher; Bruce Peterjohn; C. John Ralph

    1993-01-01

    A number of programs have been set up to monitor populations of nongame migratory birds. We review these programs and their purposes and provide information on obtaining data or results from these programs. In addition, we review recommendations for improving these programs.

  20. CMM Data Analysis Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Due to the increase in the use of Coordinate Measuring Machines (CMMs) to measure fine details and complex geometries in manufacturing, many programs have been made to compile and analyze the data. These programs typically require extensive setup to determine the expected results in order not only to track the pass/fail of a dimension, but also to use statistical process control (SPC). These extra steps and setup times have been addressed through the CMM Data Analysis Tool, which only requires the output of the CMM to provide both pass/fail analysis on all parts run to the same inspection program and graphs that help visualize where the part measures within the allowed tolerances. This provides feedback not only to the customer for approval of a part during development, but also to machining process engineers to identify when any dimension is drifting toward an out-of-tolerance condition during production. This program can handle hundreds of parts with complex dimensions and will provide an analysis within minutes.
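
    The abstract does not specify the tool's input format, so the sketch below is only a hypothetical illustration of the core check such a tool performs: given a nominal value and tolerance band per dimension, flag pass/fail and report how much of the band each measurement consumes, the quantity one would watch for SPC-style drift. The field names and values are invented for the example.

        # Hypothetical CMM output rows: (dimension name, nominal, +/- tolerance, measured)
        rows = [
            ("hole_diameter", 10.000, 0.050, 10.032),
            ("slot_width",     4.500, 0.020,  4.523),
        ]

        for name, nominal, tol, measured in rows:
            deviation = measured - nominal
            used = abs(deviation) / tol          # fraction of the tolerance band consumed
            status = "PASS" if used <= 1.0 else "FAIL"
            print(f"{name}: {status}  deviation={deviation:+.3f}  {used:.0%} of tolerance")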

  1. Composite Development and Applications for RLV Tankage

    NASA Technical Reports Server (NTRS)

    Wright, Richard J.; Achary, David C.; McBain, Michael C.

    2003-01-01

    The development of polymer composite cryogenic tanks is a critical step in creating the next generation of launch vehicles. Future launch vehicles need to minimize the gross liftoff weight (GLOW), which is possible due to the 28%-41% reduction in weight that composite materials can provide over current aluminum technology. Composite cryogenic tanks, feedlines, and unpressurized structures are key enabling technologies for performance and cost enhancements for Reusable Launch Vehicles (RLVs). The technology development of composite tanks has provided direct and applicable data for feedlines, unpressurized structures, material compatibility, and cryogenic fluid containment for highly loaded complex structures and interfaces. All three types of structure have similar material systems, processing parameters, scaling issues, analysis methodologies, NDE development, damage tolerance, and repair scenarios. Composite cryogenic tankage is the most complex of the three areas and provides the largest breakthrough in technology. A building-block approach has been employed to bring this family of difficult technologies to maturity. This approach has built up composite materials, processes, design, analysis and test methods technology through a series of composite test programs, beginning with the NASP program, to meet aggressive performance goals for reusable launch vehicles. In this paper, the development and application of advanced composites for RLV use is described.

  2. - XSUMMER- Transcendental functions and symbolic summation in FORM

    NASA Astrophysics Data System (ADS)

    Moch, S.; Uwer, P.

    2006-05-01

    Harmonic sums and their generalizations are extremely useful in the evaluation of higher-order perturbative corrections in quantum field theory. Of particular interest have been the so-called nested sums, where the harmonic sums and their generalizations appear as building blocks, originating for example, from the expansion of generalized hypergeometric functions around integer values of the parameters. In this paper we discuss the implementation of several algorithms to solve these sums by algebraic means, using the computer algebra system FORM. Program summaryTitle of program:XSUMMER Catalogue identifier:ADXQ_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/ADXQ_v1_0 Program obtainable from:CPC Program Library, Queen's University of Belfast, N. Ireland License:GNU Public License and FORM License Computers:all Operating system:all Program language:FORM Memory required to execute:Depending on the complexity of the problem, recommended at least 64 MB RAM No. of lines in distributed program, including test data, etc.:9854 No. of bytes in distributed program, including test data, etc.:126 551 Distribution format:tar.gz Other programs called:none External files needed:none Nature of the physical problem:Systematic expansion of higher transcendental functions in a small parameter. The expansions arise in the calculation of loop integrals in perturbative quantum field theory. Method of solution:Algebraic manipulations of nested sums. Restrictions on complexity of the problem:Usually limited only by the available disk space. Typical running time:Dependent on the complexity of the problem.
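
    For orientation, the nested sums in question generalize the harmonic sums S_a(n) = sum_{k=1..n} 1/k^a by nesting: S_{a,b}(n) = sum_{k=1..n} S_b(k)/k^a, and so on to deeper weights. XSUMMER manipulates such objects symbolically in FORM; the short Python sketch below merely evaluates them numerically (positive indices only), which is a convenient cross-check on symbolic results.

        from fractions import Fraction

        def S(n, *indices):
            """Nested harmonic sum S_{a1,a2,...}(n) for positive integer indices, exact arithmetic."""
            if not indices:
                return Fraction(1)
            a, rest = indices[0], indices[1:]
            return sum(Fraction(1, k**a) * S(k, *rest) for k in range(1, n + 1))

        print(S(10, 1))      # ordinary harmonic number H_10 = 7381/2520
        print(S(10, 2, 1))   # nested sum S_{2,1}(10), of the kind arising in epsilon expansions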

  3. MOLNs: A CLOUD PLATFORM FOR INTERACTIVE, REPRODUCIBLE, AND SCALABLE SPATIAL STOCHASTIC COMPUTATIONAL EXPERIMENTS IN SYSTEMS BIOLOGY USING PyURDME

    PubMed Central

    Drawert, Brian; Trogdon, Michael; Toor, Salman; Petzold, Linda; Hellander, Andreas

    2017-01-01

    Computational experiments using spatial stochastic simulations have led to important new biological insights, but they require specialized tools and a complex software stack, as well as large and scalable compute and data analysis resources due to the large computational cost associated with Monte Carlo computational workflows. The complexity of setting up and managing a large-scale distributed computation environment to support productive and reproducible modeling can be prohibitive for practitioners in systems biology. This results in a barrier to the adoption of spatial stochastic simulation tools, effectively limiting the type of biological questions addressed by quantitative modeling. In this paper, we present PyURDME, a new, user-friendly spatial modeling and simulation package, and MOLNs, a cloud computing appliance for distributed simulation of stochastic reaction-diffusion models. MOLNs is based on IPython and provides an interactive programming platform for development of sharable and reproducible distributed parallel computational experiments. PMID:28190948
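
    The computational workload behind such platforms is Monte Carlo simulation of stochastic kinetics; PyURDME handles the spatial (reaction-diffusion) case. As a minimal, non-spatial stand-in for that kernel, the sketch below implements Gillespie's stochastic simulation algorithm for a single degradation reaction A -> 0; it is illustrative only and does not use the PyURDME API.

        import random

        def gillespie_decay(a0, k, t_end):
            """Gillespie SSA for A -> 0 with rate constant k; returns (times, copy numbers)."""
            t, a = 0.0, a0
            times, counts = [t], [a]
            while a > 0 and t < t_end:
                propensity = k * a
                t += random.expovariate(propensity)   # exponential waiting time to next event
                a -= 1                                # one degradation event fires
                times.append(t)
                counts.append(a)
            return times, counts

        times, counts = gillespie_decay(a0=100, k=0.1, t_end=50.0)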

  4. Use of a New Portable Instrumented Impactor on the NASA Composite Crew Module Damage Tolerance Program

    NASA Technical Reports Server (NTRS)

    Jackson, Wade C.; Polis, Daniel L.

    2014-01-01

    Damage tolerance performance is critical to composite structures because surface impacts at relatively low energies may result in a significant strength loss. For certification, damage tolerance criteria require aerospace vehicles to meet design loads while containing damage at critical locations. Data from standard small coupon testing are difficult to apply to larger more complex structures. Due to the complexity of predicting both the impact damage and the residual properties, damage tolerance is demonstrated primarily by testing. A portable, spring-propelled, impact device was developed which allows the impact damage response to be investigated on large specimens, full-scale components, or entire vehicles. During impact, both the force history and projectile velocity are captured. The device was successfully used to demonstrate the damage tolerance performance of the NASA Composite Crew Module. The impactor was used to impact 18 different design features at impact energies up to 35 J. Detailed examples of these results are presented, showing impact force histories, damage inspection results, and response to loading.

  5. Prediction of Combustion Gas Deposit Compositions

    NASA Technical Reports Server (NTRS)

    Kohl, F. J.; Mcbride, B. J.; Zeleznik, F. J.; Gordon, S.

    1985-01-01

    A demonstrated procedure is used to accurately predict the chemical compositions of complicated deposit mixtures. NASA Lewis Research Center's Computer Program for Calculation of Complex Chemical Equilibrium Compositions (CEC) is used in conjunction with the Computer Program for Calculation of Ideal Gas Thermodynamic Data (PAC) and the resulting Thermodynamic Data Base (THDATA) to predict deposit compositions from metal- or mineral-seeded combustion processes.

  6. Computer simulation of space station computer steered high gain antenna

    NASA Technical Reports Server (NTRS)

    Beach, S. W.

    1973-01-01

    The mathematical modeling and programming of a complete simulation program for a space station computer-steered high gain antenna are described. The program provides for reading input data cards, numerically integrating up to 50 first order differential equations, and monitoring up to 48 variables on printed output and on plots. The program system consists of a high gain antenna, an antenna gimbal control system, an on board computer, and the environment in which all are to operate.
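
    A modern equivalent of that simulation loop — integrate a system of first-order ODEs and monitor selected variables at fixed output times — fits in a few lines of Python with SciPy. The two-state system below is a placeholder, not the antenna gimbal model.

        import numpy as np
        from scipy.integrate import solve_ivp

        def rhs(t, y):
            """Placeholder pair of coupled first-order ODEs (the program handled up to 50)."""
            theta, omega = y
            return [omega, -0.5 * omega - 2.0 * theta]   # damped second-order dynamics in first-order form

        sol = solve_ivp(rhs, t_span=(0.0, 10.0), y0=[1.0, 0.0],
                        t_eval=np.linspace(0.0, 10.0, 101))   # sample monitored variables at 101 times

        for t, theta in zip(sol.t[:3], sol.y[0][:3]):         # printed-output monitoring, as the program did
            print(f"t={t:4.1f}  theta={theta:+.4f}")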

  7. Alignment and political will: upscaling an Australian respectful relationships program.

    PubMed

    Joyce, Andrew; Green, Celia; Kearney, Sarah; Leung, Loksee; Ollis, Debbie

    2018-05-29

    Many small-scale efficacious programs and interventions need to be 'scaled up' in order to reach a larger population. Although it has been argued that interventions deemed suitable for upscaling need to have demonstrated effectiveness, be able to be implemented cost-effectively and be accepted by intended recipients, these factors alone are insufficient in explaining which programs are adopted more broadly. Upscaling research often identifies political will as a key factor in explaining whether programs are supported and scaled up, but this research lacks any depth into how political will is formed and has not applied policy theories to understanding the upscaling process. This article uses a political science lens to examine the key factors in the upscaling process of a Respectful Relationships in Schools Program. Focus groups and interviews were conducted with project staff, managers and community organizations involved in the program. The results reveal how a key focusing event, related to a high-profile personal tragedy, propelled family violence into the national spotlight. At the same time, the organization leading the respectful relationships program leveraged its networks to position the program within the education department, which enabled the government to quickly respond to the issue. The study highlights that political will is not a stand-alone factor as depicted by upscaling models, but rather the end point of a complex process that involves many elements, including the establishment of networks and aligned programs that can capitalize when opportunities arise.

  8. The Development and Validation of a Teacher Preparation Program: Follow-Up Survey

    ERIC Educational Resources Information Center

    Schulte, Laura E.

    2008-01-01

    Students in my applied advanced statistics course for educational administration doctoral students developed a follow-up survey for teacher preparation programs, using the following scale development processes: adopting a framework; developing items; providing evidence of content validity; conducting a pilot test; and analyzing data. The students…

  9. Turbulence spectra in the noise source regions of the flow around complex surfaces

    NASA Technical Reports Server (NTRS)

    Olsen, W. A.; Boldman, D. R.

    1983-01-01

    The complex turbulent flow around three complex surfaces was measured in detail with a hot wire. The measured data include extensive spatial surveys of the mean velocity and turbulence intensity and measurements of the turbulence spectra and scale length at many locations. This report completes the publication of the turbulence data by summarizing the turbulence spectra measured within the noise source regions of the flow. The results suggest some useful simplifications in modeling the very complex turbulent flow around complex surfaces for aeroacoustic predictive models. The turbulence spectra also show that noise data from scale models of moderate size can be accurately scaled up to full size.

  10. DEKFIS user's guide: Discrete Extended Kalman Filter/Smoother program for aircraft and rotorcraft data consistency

    NASA Technical Reports Server (NTRS)

    1979-01-01

    The computer program DEKFIS (discrete extended Kalman filter/smoother), formulated for aircraft and helicopter state estimation and data consistency, is described. DEKFIS is set up to pre-process raw test data by removing biases, correcting scale factor errors and providing consistency with the aircraft inertial kinematic equations. The program implements an extended Kalman filter/smoother using the Friedland-Duffy formulation.
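
    To make the data-consistency idea concrete: treating an unknown constant sensor bias as a state and estimating it recursively is the simplest instance of what a Kalman filter does in this setting. The scalar sketch below is an illustration under invented noise values only; DEKFIS itself implements a discrete extended filter/smoother in the Friedland-Duffy formulation, which this does not reproduce.

        import numpy as np

        rng = np.random.default_rng(1)
        true_bias = 0.8
        z = true_bias + 0.2 * rng.standard_normal(200)   # noisy measurements of a constant bias

        x, P, R = 0.0, 1.0, 0.2**2    # state estimate, its variance, measurement noise variance
        for zk in z:
            # constant-state model: the predict step is the identity (no process noise assumed)
            K = P / (P + R)           # Kalman gain
            x = x + K * (zk - x)      # measurement update
            P = (1.0 - K) * P

        print(f"estimated bias = {x:.3f} (true {true_bias})")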

  11. An interactive graphics program to retrieve, display, compare, manipulate, curve fit, difference and cross plot wind tunnel data

    NASA Technical Reports Server (NTRS)

    Elliott, R. D.; Werner, N. M.; Baker, W. M.

    1975-01-01

    The Aerodynamic Data Analysis and Integration System (ADAIS) is described: a highly interactive computer graphics program capable of manipulating large quantities of data such that addressable elements of a data base can be called up for graphic display, compared, curve fit, stored, retrieved, differenced, etc. The general nature of the system is evidenced by the fact that it has already seen limited use with data bases consisting of thermodynamic, basic loads, and flight dynamics data. Productivity five times that of conventional manual methods of wind tunnel data analysis is routinely achieved with ADAIS. In wind tunnel data analysis, data from one or more runs of a particular test may be called up and displayed along with data from one or more runs of a different test. Curves may be faired through the data points by any of four methods, including cubic spline and least squares polynomial fit up to seventh order.
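
    Two of the four fairing methods named, cubic spline and least-squares polynomial fit up to seventh order, are easy to reproduce today. The snippet below does both on placeholder wind-tunnel-style data; it uses NumPy/SciPy stand-ins and is not the original ADAIS code.

        import numpy as np
        from scipy.interpolate import CubicSpline

        alpha = np.array([-4.0, -2.0, 0.0, 2.0, 4.0, 6.0, 8.0])        # angle of attack, deg (placeholder)
        cl    = np.array([-0.21, -0.02, 0.18, 0.38, 0.57, 0.74, 0.86])  # lift coefficient (placeholder)

        coeffs = np.polyfit(alpha, cl, deg=3)     # least-squares polynomial; ADAIS allowed up to deg=7
        poly = np.poly1d(coeffs)

        spline = CubicSpline(alpha, cl)           # cubic spline faired through every point

        a = 3.0
        print(f"poly fit CL({a}) = {poly(a):.3f}, spline CL({a}) = {spline(a):.3f}")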

  12. Real time animation of space plasma phenomena

    NASA Technical Reports Server (NTRS)

    Jordan, K. F.; Greenstadt, E. W.

    1987-01-01

    In pursuit of real-time animation of computer-simulated space plasma phenomena, the simulation code was rewritten for the Massively Parallel Processor (MPP). The program creates a dynamic representation of the global bow shock which is based on actual spacecraft data and designed for three-dimensional graphic output. This output consists of time-slice sequences which make up the frames of the animation. With the MPP, 16384, 512, or 4 frames can be calculated simultaneously, depending upon which characteristic is being computed. The run time was greatly reduced, which promotes the rapid sequencing of images and makes real-time animation a foreseeable goal. The addition of more complex phenomenology in the constructed computer images is now possible, and work proceeds to generate these images.

  13. PCSYS: The optimal design integration system picture drawing system with hidden line algorithm capability for aerospace vehicle configurations

    NASA Technical Reports Server (NTRS)

    Hague, D. S.; Vanderburg, J. D.

    1977-01-01

    A vehicle geometric definition based upon quadrilateral surface elements is used to produce realistic pictures of an aerospace vehicle. The PCSYS programs can be used to visually check geometric data input, monitor geometric perturbations, and visualize the complex spatial inter-relationships between the internal and external vehicle components. PCSYS has two major component programs. The first program, IMAGE, draws a complex aerospace vehicle pictorial representation based on either an approximate but rapid hidden line algorithm or without any hidden line algorithm. The second program, HIDDEN, draws a vehicle representation using an accurate but time-consuming hidden line algorithm.
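
    The abstract does not say how IMAGE's rapid approximate algorithm works, so the sketch below shows one classic fast approximation of this general kind: back-face culling of quadrilateral elements, where a panel is dropped whenever its outward normal faces away from the viewer. It is offered purely to illustrate the speed/accuracy trade-off, not as PCSYS's actual method.

        import numpy as np

        def visible_quads(quads, view_dir):
            """Keep quads whose outward normal has a component toward the viewer.

            quads: (n, 4, 3) array of quadrilateral corner points, consistently wound.
            view_dir: unit vector pointing from the scene toward the viewer.
            """
            edge1 = quads[:, 1] - quads[:, 0]
            edge2 = quads[:, 3] - quads[:, 0]
            normals = np.cross(edge1, edge2)       # outward normals, by winding convention
            facing = normals @ view_dir > 0.0      # back-facing panels are culled
            return quads[facing]

        quad = np.array([[[0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 0]]], dtype=float)
        print(len(visible_quads(quad, np.array([0.0, 0.0, 1.0]))))   # 1: this panel faces the viewer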

  14. Use of hospital-related health care among Health Links enrollees in the Central Ontario health region: a propensity-matched difference-in-differences study

    PubMed Central

    Mondor, Luke; Walker, Kevin; Bai, Yu Qing; Wodchis, Walter P.

    2017-01-01

    Background: Health Links are a new model of providing care coordination for high-cost, high-needs patients in Ontario. We evaluated use of hospital-related health care services among Health Links patients in the Central Local Health Integration Network (LHIN) of Ontario in the year before versus after program enrolment and compared rates of use with those among similar patients with complex needs not enrolled in the program (comparator group). Methods: We identified all patients who received a Health Links coordinated care plan before Jan. 1, 2015, using linked registry and health administrative data. We used propensity scores to match (1:1) enrollees (registry) with comparator patients (administrative data). Using a difference-in-differences approach with generalized estimating equations, we evaluated 5 measures of Health Link performance: rates of hospital admission, emergency department visits, days in acute care, 30-day readmissions and 7-day postdischarge primary care follow-up. Results: Of the 344 enrollees in the registry, we matched 313 [91.0%] to comparator patients. All measured sociodemographic, comorbidity and health care use characteristics were balanced between the 2 groups (all standardized differences < 0.10). For enrollees, the rate of days in acute care per person-year increased by 35% (incidence rate ratio 1.35 [confidence interval 1.11-1.65]) after versus before the index date, but differences were nonsignificant for all other measures. Difference-in-differences analyses revealed greater reductions in hospital admissions, emergency department visits and acute care days after the index date in the comparator group than among enrollees. Interpretation: Initial implementation of the Health Link program in the Central LHIN did not reduce selected indicators of Health Link performance among enrollees. As the Health Link program evolves and standardization is implemented, future research may reveal effects from the initiative in other outcomes or with longer follow-up. PMID:29025737
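
    In skeletal form, the difference-in-differences estimate is the coefficient on the enrolment-by-period interaction in a two-group, two-period regression. The sketch below is deliberately simplified to OLS on made-up counts; the study itself used propensity matching and generalized estimating equations on administrative data, which this does not reproduce.

        import pandas as pd
        import statsmodels.formula.api as smf

        # Made-up person-period records: enrolled (1) vs matched comparator (0), before/after index date
        df = pd.DataFrame({
            "enrolled":   [1, 1, 1, 1, 0, 0, 0, 0] * 25,
            "post":       [0, 1, 0, 1, 0, 1, 0, 1] * 25,
            "acute_days": [4, 6, 3, 5, 4, 3, 5, 4] * 25,
        })

        # The DiD effect is the coefficient on the enrolled:post interaction
        model = smf.ols("acute_days ~ enrolled * post", data=df).fit()
        print(model.params["enrolled:post"])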

  15. Multichannel Networked Phasemeter Readout and Analysis

    NASA Technical Reports Server (NTRS)

    Edmonds, Karina

    2008-01-01

    Netmeter software reads a data stream from up to 250 networked phasemeters, synchronizes the data, saves the reduced data to disk (after applying a low-pass filter), and provides a Web server interface for remote control. Unlike older phasemeter software that requires a special real-time operating system, this program can run on any general-purpose computer, needing only about five percent of the CPU (central processing unit) to process 20 channels. It adds built-in data logging and network-based GUIs (graphical user interfaces) that are implemented in Scalable Vector Graphics (SVG). Netmeter runs on Linux and Windows. It displays the instantaneous displacements measured by several phasemeters at a user-selectable rate, up to 1 kHz. The program monitors the measure and reference channel frequencies. For ease of use, status levels in Netmeter are color coded: green for normal operation, yellow for network errors, and red for optical misalignment problems. Netmeter includes user-selectable filters up to 4 k samples and user-selectable averaging windows (applied after filtering). Before filtering, the program saves raw data to disk using a burst-write technique.

  16. Complex sample survey estimation in static state-space

    Treesearch

    Raymond L. Czaplewski

    2010-01-01

    Increased use of remotely sensed data is a key strategy adopted by the Forest Inventory and Analysis Program. However, multiple sensor technologies require complex sampling units and sampling designs. The Recursive Restriction Estimator (RRE) accommodates this complexity. It is a design-consistent Empirical Best Linear Unbiased Prediction for the state-vector, which...

  17. Bioactive Natural Products Prioritization Using Massive Multi-informational Molecular Networks.

    PubMed

    Olivon, Florent; Allard, Pierre-Marie; Koval, Alexey; Righi, Davide; Genta-Jouve, Gregory; Neyts, Johan; Apel, Cécile; Pannecouque, Christophe; Nothias, Louis-Félix; Cachet, Xavier; Marcourt, Laurence; Roussi, Fanny; Katanaev, Vladimir L; Touboul, David; Wolfender, Jean-Luc; Litaudon, Marc

    2017-10-20

    Natural products represent an inexhaustible source of novel therapeutic agents. Their complex and constrained three-dimensional structures endow these molecules with exceptional biological properties, thereby giving them a major role in drug discovery programs. However, the search for new bioactive metabolites is hampered by the chemical complexity of the biological matrices in which they are found. The purification of single constituents from such matrices requires so much work that it should ideally be performed only on molecules of high potential value (i.e., chemical novelty and biological activity). Recent bioinformatics approaches based on mass spectrometry metabolite profiling methods are beginning to address the complex task of compound identification within complex mixtures. In parallel to these developments, however, methods providing information on the bioactivity potential of natural products prior to their isolation are still lacking, and they are of key interest for targeting the isolation of valuable natural products only. In the present investigation, we propose an integrated analysis strategy for bioactive natural products prioritization. Our approach uses massive molecular networks embedding various informational layers (bioactivity and taxonomical data) to highlight potentially bioactive scaffolds within the chemical diversity of crude extract collections. We exemplify this workflow by targeting the isolation of predicted active and nonactive metabolites from two botanical sources (Bocquillonia nervosa and Neoguillauminia cleopatra) against two biological targets (the Wnt signaling pathway and chikungunya virus replication). Finally, the detection and isolation of a daphnane diterpene orthoester and four 12-deoxyphorbols that inhibit the Wnt signaling pathway and exhibit potent antiviral activity against CHIKV are detailed. Combined with efficient metabolite annotation tools, this bioactive natural products prioritization pipeline proves to be efficient. Implementation of this approach in drug discovery programs based on natural extract screening should speed up and rationalize the isolation of bioactive natural products.
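
    Conceptually, the prioritization step amounts to overlaying per-feature bioactivity scores on a spectral-similarity network and inspecting the clusters in which activity concentrates. The toy networkx sketch below, with invented node names, scores and cutoff, shows that overlay mechanically; it is not the authors' pipeline.

        import networkx as nx

        # Toy molecular network: nodes are MS/MS features, edges join spectrally similar features
        G = nx.Graph()
        features = {"feat_A": 0.92, "feat_B": 0.88, "feat_C": 0.11, "feat_D": 0.05}  # bioactivity scores
        for node, score in features.items():
            G.add_node(node, bioactivity=score)
        G.add_edges_from([("feat_A", "feat_B"), ("feat_C", "feat_D"), ("feat_B", "feat_C")])

        # Flag candidate clusters: connected features whose scores exceed a chosen cutoff
        active = [n for n, d in G.nodes(data=True) if d["bioactivity"] > 0.5]
        print(G.subgraph(active).edges())   # edges among putatively active features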

  18. Using Curriculum-Based Measurements for Program Evaluation: Expanding Roles for School Psychologists

    ERIC Educational Resources Information Center

    Tusing, Mary E.; Breikjern, Nicholle A.

    2017-01-01

    Educators increasingly need to evaluate schoolwide reform efforts; however, complex program evaluations often are not feasible in schools. Through a case example, we provide a heuristic for program evaluation that is easily replicated in schools. Criterion-referenced interpretations of schoolwide screening data were used to evaluate outcomes…

  19. Negotiating a "Scary Gap": Doctoral Candidates, "Writing Up" Qualitative Data and the Contemporary Supervisory Relationship

    ERIC Educational Resources Information Center

    Humphrey, Robin; Simpson, Bob

    2013-01-01

    The complex task of "writing up" qualitative data provides difficulties and challenges for both doctoral candidates and their supervisors, which can often result in detrimental effects on the supervisory relationship. These effects can be heightened by the pressures currently felt by supervisors, not least to ensure their supervisees…

  20. A Case for Data and Service Fusions

    NASA Astrophysics Data System (ADS)

    Huang, T.; Boening, C.; Quach, N. T.; Gill, K.; Zlotnicki, V.; Moore, B.; Tsontos, V. M.

    2015-12-01

    In this distributed, data-intensive era, developing any solution that requires multi-disciplinary data and services requires careful review of interfaces with data and service providers. Information is stored in many different locations, and data services are distributed across the Internet. In the design and development of mash-ups of heterogeneous data systems, the challenge is not entirely technological; it is our ability to document the external interface specifications and to create a coherent environment for our users. While it is impressive to present a complex web of data, the true measure of our success is in the quality of the data we are serving, the throughput of our creation, and the user experience. This presentation describes two currently funded NASA projects that require integration of heterogeneous data and services residing in different locations. The NASA Sea Level Change Portal is designed as a "one-stop" source for current sea level change information. Behind this portal is an architecture that integrates data and services from various sources, including PI-generated products, satellite products from the DAACs, metadata from the ESDIS Common Metadata Repository (CMR) and other sources, and services residing in data centers, universities, and ESDIS. The recently funded Distributed Oceanographic Matchup Service (DOMS) project falls under the NASA Advanced Information Systems Technology (AIST) program. DOMS will integrate satellite products managed by the NASA Physical Oceanography Distributed Active Archive Center (PO.DAAC) with three different in-situ projects located in different parts of the U.S. These projects are good examples of delivering content-rich solutions through mash-ups of heterogeneous data and systems.
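
    A "matchup" in the DOMS sense pairs a satellite observation with in-situ observations falling within chosen space and time tolerances. The toy sketch below shows the core predicate; the field names, tolerances and values are invented, and no DOMS API is used.

        from datetime import datetime, timedelta
        import math

        def close(sat, insitu, km=50.0, hours=6.0):
            """True if an in-situ record is within the space/time tolerance of a satellite point."""
            dlat = sat["lat"] - insitu["lat"]
            dlon = (sat["lon"] - insitu["lon"]) * math.cos(math.radians(sat["lat"]))
            dist_km = 111.0 * math.hypot(dlat, dlon)    # small-angle great-circle approximation
            dt = abs(sat["time"] - insitu["time"])
            return dist_km <= km and dt <= timedelta(hours=hours)

        sat = {"lat": 32.0, "lon": -117.0, "time": datetime(2015, 7, 1, 12)}
        buoy = {"lat": 32.2, "lon": -117.1, "time": datetime(2015, 7, 1, 9)}
        print(close(sat, buoy))   # True: within 50 km and 6 h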

  1. Design, baseline characteristics, and early findings of the MPS VI (mucopolysaccharidosis VI) Clinical Surveillance Program (CSP).

    PubMed

    Hendriksz, Christian J; Giugliani, Roberto; Harmatz, Paul; Lampe, Christina; Martins, Ana Maria; Pastores, Gregory M; Steiner, Robert D; Leão Teles, Elisa; Valayannopoulos, Vassili

    2013-03-01

    To outline the design, baseline data, and 5-year follow-up data of patients with mucopolysaccharidosis (MPS) VI enrolled in the Clinical Surveillance Program (CSP), a voluntary, multinational, observational program. The MPS VI CSP was opened in 2005 to collect, for at least 15 years, observational data from standard clinical and laboratory assessments of patients with MPS VI. Baseline and follow-up data are documented by participating physicians in electronic case report forms. Between September 2005 and March 2010 the CSP enrolled 132 patients, including 123 who received enzyme replacement therapy (ERT) with galsulfase. Median age at enrolment was 13 years (range 1-59). Mean baseline data showed impaired growth, hepatosplenomegaly, and reduced endurance and pulmonary function. The most common findings were heart valve disease (90%), reduced visual acuity (79%), impaired hearing (59%), and hepatosplenomegaly (54%). Follow-up data up to 5 years in patients with pre- and post-ERT measurements showed a decrease in urinary glycosaminoglycans and increases in height and weight in patients <16 years and suggested reductions in liver and spleen size and improvements in endurance and pulmonary function after ERT was started. Vision, hearing, and cardiac function were unchanged. Safety data were in line with previous reports. The CSP represents the largest cross-sectional study of MPS VI to date. This first report provides information on the design and implementation of the program and population statistics for several clinical variables in patients with MPS VI. Data collected over 5 years suggest that ERT provides clinical benefit and is well-tolerated with no new safety concerns.

  2. BigDataScript: a scripting language for data pipelines.

    PubMed

    Cingolani, Pablo; Sladek, Rob; Blanchette, Mathieu

    2015-01-01

    The analysis of large biological datasets often requires complex processing pipelines that run for a long time on large computational infrastructures. We designed and implemented a simple script-like programming language with a clean and minimalist syntax to develop and manage pipeline execution and provide robustness to various types of software and hardware failures as well as portability. We introduce the BigDataScript (BDS) programming language for data processing pipelines, which improves abstraction from hardware resources and assists with robustness. Hardware abstraction allows BDS pipelines to run without modification on a wide range of computer architectures, from a small laptop to multi-core servers, server farms, clusters and clouds. BDS achieves robustness by incorporating the concepts of absolute serialization and lazy processing, thus allowing pipelines to recover from errors. By abstracting pipeline concepts at programming language level, BDS simplifies implementation, execution and management of complex bioinformatics pipelines, resulting in reduced development and debugging cycles as well as cleaner code. BigDataScript is available under open-source license at http://pcingola.github.io/BigDataScript. © The Author 2014. Published by Oxford University Press.
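
    The "lazy processing" idea — a re-run skips any task whose outputs already exist, so execution resumes after a failure — can be illustrated without BDS syntax. The Python sketch below captures only that recovery behaviour as a loose analogy; it is not BigDataScript code, and BDS itself tracks dependencies and serializes state more thoroughly than an existence check. File names are placeholders.

        import os
        import subprocess

        def task(cmd, output):
            """Run cmd only if its output file is missing, so a re-run resumes where it failed."""
            if os.path.exists(output):
                print(f"skip (up to date): {output}")
                return
            subprocess.run(cmd, shell=True, check=True)   # check=True stops the pipeline on failure

        task("sort reads.txt > reads.sorted.txt", "reads.sorted.txt")
        task("uniq -c reads.sorted.txt > counts.txt", "counts.txt")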

  3. BigDataScript: a scripting language for data pipelines

    PubMed Central

    Cingolani, Pablo; Sladek, Rob; Blanchette, Mathieu

    2015-01-01

    Motivation: The analysis of large biological datasets often requires complex processing pipelines that run for a long time on large computational infrastructures. We designed and implemented a simple script-like programming language with a clean and minimalist syntax to develop and manage pipeline execution and provide robustness to various types of software and hardware failures as well as portability. Results: We introduce the BigDataScript (BDS) programming language for data processing pipelines, which improves abstraction from hardware resources and assists with robustness. Hardware abstraction allows BDS pipelines to run without modification on a wide range of computer architectures, from a small laptop to multi-core servers, server farms, clusters and clouds. BDS achieves robustness by incorporating the concepts of absolute serialization and lazy processing, thus allowing pipelines to recover from errors. By abstracting pipeline concepts at programming language level, BDS simplifies implementation, execution and management of complex bioinformatics pipelines, resulting in reduced development and debugging cycles as well as cleaner code. Availability and implementation: BigDataScript is available under open-source license at http://pcingola.github.io/BigDataScript. Contact: pablo.e.cingolani@gmail.com PMID:25189778

  4. Measures of follow-up in early hearing detection and intervention programs: a need for standardization.

    PubMed

    Mason, Craig A; Gaffney, Marcus; Green, Denise R; Grosse, Scott D

    2008-06-01

    To demonstrate the need for standardized data definitions and reporting for early hearing detection and intervention (EHDI) programs collecting information on newborn hearing screening and follow-up, and types of information best collected in a standardized manner. A hypothetical birth cohort was used to show the potential effects of nonstandardized definitions and data classifications on rates of hearing screening, audiologic follow-up, and hearing loss. The true screening rate in this cohort was 92.4%. The calculated rate was between 90.0% and 96.5%, depending on the measure used. Among children documented as screened and referred for follow-up, 61.0% received this testing. Only 49.0% were documented to have been tested. Despite a true prevalence of 3.7 per 1,000 births, only 1.5 per 1,000 children were documented with a hearing loss. Ensuring that children receive recommended follow-up is challenging. Without complete reporting by audiologists to EHDI programs, accurate calculation of performance measures is impossible. Lack of documentation can lead to the overstatement of "loss to follow-up." Also, standardization of measures is essential for programs to evaluate how many children receive recommended services and assess progress toward national goals. A new survey has been implemented to collect more detailed and standardized information about recommended services.

  5. Accuracy of genomic selection in European maize elite breeding populations.

    PubMed

    Zhao, Yusheng; Gowda, Manje; Liu, Wenxin; Würschum, Tobias; Maurer, Hans P; Longin, Friedrich H; Ranc, Nicolas; Reif, Jochen C

    2012-03-01

    Genomic selection is a promising breeding strategy for rapid improvement of complex traits. The objective of our study was to investigate the prediction accuracy of genomic breeding values through cross validation. The study was based on experimental data of six segregating populations from a half-diallel mating design with 788 testcross progenies from an elite maize breeding program. The plants were intensively phenotyped in multi-location field trials and fingerprinted with 960 SNP markers. We used random regression best linear unbiased prediction in combination with fivefold cross validation. The prediction accuracy across populations was higher for grain moisture (0.90) than for grain yield (0.58). The accuracy of genomic selection realized for grain yield corresponds to the precision of phenotyping at unreplicated field trials in 3-4 locations. As for maize up to three generations are feasible per year, selection gain per unit time is high and, consequently, genomic selection holds great promise for maize breeding programs.
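
    The cross-validated accuracy reported here is the correlation between predicted and observed testcross phenotypes in held-out folds. A minimal analogue — ridge regression on SNP genotypes, which belongs to the same model family as random regression BLUP, with fivefold cross validation — looks like this in Python; the data are synthetic, generated only to make the script runnable.

        import numpy as np
        from sklearn.linear_model import Ridge
        from sklearn.model_selection import KFold

        rng = np.random.default_rng(7)
        X = rng.integers(0, 3, size=(788, 960)).astype(float)  # 788 lines x 960 SNPs, coded 0/1/2
        beta = rng.normal(0, 0.05, 960)
        y = X @ beta + rng.normal(0, 1.0, 788)                 # synthetic testcross phenotype

        accs = []
        for train, test in KFold(n_splits=5, shuffle=True, random_state=0).split(X):
            model = Ridge(alpha=100.0).fit(X[train], y[train])
            accs.append(np.corrcoef(model.predict(X[test]), y[test])[0, 1])

        print(f"mean prediction accuracy: {np.mean(accs):.2f}")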

  6. Data Mining and Complex Problems: Case Study in Composite Materials

    NASA Technical Reports Server (NTRS)

    Rabelo, Luis; Marin, Mario

    2009-01-01

    Data mining is defined as the discovery of useful, possibly unexpected, patterns and relationships in data using statistical and non-statistical techniques in order to develop schemes for decision and policy making. Data mining can be used to discover the sources and causes of problems in complex systems. In addition, data mining can support simulation strategies by finding the different constants and parameters to be used in the development of simulation models. This paper introduces a framework for data mining and its application to complex problems. To further explain some of the concepts outlined in this paper, the potential application to the NASA Shuttle Reinforced Carbon-Carbon structures and genetic programming is used as an illustration.

  7. Internet-based computer technology on radiotherapy.

    PubMed

    Chow, James C L

    2017-01-01

    Recent rapid development of Internet-based computer technologies has made possible many novel applications in radiation dose delivery. However, the speed of translating these new technologies into radiotherapy can hardly keep pace, owing to the complex commissioning process and quality assurance protocols. Implementing novel Internet-based technology in radiotherapy requires corresponding design of the application's algorithms and infrastructure, the setting up of related clinical policies, purchase and development of software and hardware, computer programming and debugging, and national-to-international collaboration. Although such implementation processes are time-consuming, some recent computer advancements in radiation dose delivery are still noticeable. In this review, we present the background and concepts of some recent Internet-based computer technologies, such as cloud computing, big data processing and machine learning, followed by their potential applications in radiotherapy, such as treatment planning and dose delivery. We also discuss the current progress of these applications and their impact on radiotherapy, and we explore and evaluate the expected benefits and challenges of implementation.

  8. Small Rad51 and Dmc1 Complexes Often Co-occupy Both Ends of a Meiotic DNA Double Strand Break

    PubMed Central

    Brown, M. Scott; Grubb, Jennifer; Zhang, Annie; Rust, Michael J.; Bishop, Douglas K.

    2015-01-01

    The eukaryotic RecA-like proteins Rad51 and Dmc1 cooperate during meiosis to promote recombination between homologous chromosomes by repairing programmed DNA double strand breaks (DSBs). Previous studies showed that Rad51 and Dmc1 form partially overlapping co-foci. Here we show that these Rad51-Dmc1 co-foci are often arranged in pairs separated by distances of up to 400 nm. Paired co-foci remain prevalent when DSBs are dramatically reduced or when strand exchange or synapsis is blocked. Super-resolution dSTORM microscopy reveals that individual foci observed by conventional light microscopy are often composed of two or more substructures. The data support a model in which the two tracts of ssDNA formed by a single DSB separate from one another by distances of up to 400 nm, with both tracts often bound by one or more short (about 100 nt) Rad51 filaments and also by one or more short Dmc1 filaments. PMID:26719980

  9. The Canonical Robot Command Language (CRCL).

    PubMed

    Proctor, Frederick M; Balakirsky, Stephen B; Kootbally, Zeid; Kramer, Thomas R; Schlenoff, Craig I; Shackleford, William P

    2016-01-01

    Industrial robots can perform motion with sub-millimeter repeatability when programmed using the teach-and-playback method. While effective, this method requires significant up-front time, tying up the robot and a person during the teaching phase. Off-line programming can be used to generate robot programs, but the accuracy of this method is poor unless supplemented with good calibration to remove systematic errors, feed-forward models to anticipate robot response to loads, and sensing to compensate for unmodeled errors. These increase the complexity and up-front cost of the system, but the payback in the reduction of recurring teach programming time can be worth the effort. This payback especially benefits small-batch, short-turnaround applications typical of small-to-medium enterprises, who need the agility afforded by off-line application development to be competitive against low-cost manual labor. To fully benefit from this agile application tasking model, a common representation of tasks should be used that is understood by all of the resources required for the job: robots, tooling, sensors, and people. This paper describes an information model, the Canonical Robot Command Language (CRCL), which provides a high-level description of robot tasks and associated control and status information.
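
    To give a flavour of what a "high-level description of robot tasks" means in practice, the sketch below models two commands as plain Python data objects. The class names and fields are invented for illustration; actual CRCL messages are defined by the standard's own schema, which this does not reproduce.

        from dataclasses import dataclass

        @dataclass
        class MoveTo:
            """Hypothetical high-level motion command: a pose goal, not joint-level detail."""
            x: float
            y: float
            z: float
            straight: bool = True   # straight-line move vs any planner-chosen path

        @dataclass
        class SetGripper:
            """Hypothetical tooling command, expressed independently of the robot vendor."""
            fraction_open: float    # 0.0 = closed, 1.0 = fully open

        program = [MoveTo(0.40, 0.10, 0.25), SetGripper(0.0), MoveTo(0.40, 0.10, 0.45)]
        for cmd in program:
            print(cmd)              # a vendor adapter would translate each command for its controller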

  10. The Canonical Robot Command Language (CRCL)

    PubMed Central

    Proctor, Frederick M.; Balakirsky, Stephen B.; Kootbally, Zeid; Kramer, Thomas R.; Schlenoff, Craig I.; Shackleford, William P.

    2017-01-01

    Industrial robots can perform motion with sub-millimeter repeatability when programmed using the teach-and-playback method. While effective, this method requires significant up-front time, tying up the robot and a person during the teaching phase. Off-line programming can be used to generate robot programs, but the accuracy of this method is poor unless supplemented with good calibration to remove systematic errors, feed-forward models to anticipate robot response to loads, and sensing to compensate for unmodeled errors. These increase the complexity and up-front cost of the system, but the payback in the reduction of recurring teach programming time can be worth the effort. This payback especially benefits small-batch, short-turnaround applications typical of small-to-medium enterprises, who need the agility afforded by off-line application development to be competitive against low-cost manual labor. To fully benefit from this agile application tasking model, a common representation of tasks should be used that is understood by all of the resources required for the job: robots, tooling, sensors, and people. This paper describes an information model, the Canonical Robot Command Language (CRCL), which provides a high-level description of robot tasks and associated control and status information. PMID:28529393

  11. Morning Star Cycle Two: Follow-up Study.

    ERIC Educational Resources Information Center

    Sloan, L. V.

    Semi-structured telephone interviews were used to gather follow-up data on students who completed the 1977-1979 Morning Star cycle two program, a community-based Native teacher education program at the Blue Quills Native Education Centre leading to a Bachelor of Education degree from the University of Alberta. Of the 24 students who completed…

  12. Using Block-local Atomicity to Detect Stale-value Concurrency Errors

    NASA Technical Reports Server (NTRS)

    Artho, Cyrille; Havelund, Klaus; Biere, Armin

    2004-01-01

    Data races do not cover all kinds of concurrency errors. This paper presents a data-flow-based technique to find stale-value errors, which are not found by low-level and high-level data race algorithms. Stale values denote copies of shared data where the copy is no longer synchronized. The algorithm to detect such values works as a consistency check that does not require any assumptions or annotations of the program. It has been implemented as a static analysis in JNuke. The analysis is sound and requires only a single execution trace if implemented as a run-time checking algorithm. Being based on an analysis of Java bytecode, it encompasses the full program semantics, including arbitrarily complex expressions. Related techniques are more complex and more prone to over-reporting.
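
    A concrete instance of the error class helps. The code below is free of data races in the low-level sense, since every shared access is under the lock, yet the increment uses a stale copy of the counter, so updates can be lost. This is the pattern such an analysis flags; the example is ours, written in Python rather than Java bytecode.

        import threading

        counter = 0
        lock = threading.Lock()

        def buggy_increment():
            global counter
            with lock:
                local = counter      # copy of shared data taken under the lock...
            # ...but the lock is released here, so `local` can go stale
            with lock:
                counter = local + 1  # may overwrite another thread's update (lost update)

        threads = [threading.Thread(target=buggy_increment) for _ in range(100)]
        for t in threads: t.start()
        for t in threads: t.join()
        print(counter)  # can be < 100: no data race, but a stale-value error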

  13. The Vector, Signal, and Image Processing Library (VSIPL): an Open Standard for Astronomical Data Processing

    NASA Astrophysics Data System (ADS)

    Kepner, J. V.; Janka, R. S.; Lebak, J.; Richards, M. A.

    1999-12-01

    The Vector/Signal/Image Processing Library (VSIPL) is a DARPA-initiated effort made up of industry, government and academic representatives who have defined an industry-standard API for vector, signal, and image processing primitives for real-time signal processing on high performance systems. VSIPL supports a wide range of data types (int, float, complex, ...) and layouts (vectors, matrices and tensors) and is ideal for astronomical data processing. The VSIPL API is intended to serve as an open, vendor-neutral, industry-standard interface. The object-based VSIPL API abstracts the memory architecture of the underlying machine by using the concept of memory blocks and views. Early experiments with VSIPL code conversions have been carried out by the High Performance Computing Program team at UCSD. Commercially, several major vendors of signal processors are actively developing implementations. VSIPL has also been explicitly required as part of a recent Rome Labs teraflop procurement. This poster presents the VSIPL API, its functionality and the status of various implementations.
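
    The block/view abstraction has a familiar analogue: NumPy separates an underlying memory buffer from the strided views defined over it, much as VSIPL separates memory blocks from vector and matrix views. The snippet below uses that analogy (it is NumPy, not the VSIPL C API) to show one block exposed through two views.

        import numpy as np

        block = np.arange(12, dtype=np.float32)   # one contiguous memory block

        vec = block[::2]                          # strided vector view into the block
        mat = block.reshape(3, 4)                 # matrix view over the same memory

        mat[0, 0] = 99.0                          # writing through one view...
        print(vec[0])                             # ...is visible through the other: 99.0
        print(mat.base is block)                  # True: views share the block's storage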

  14. Patterns in Illinois educational school data

    NASA Astrophysics Data System (ADS)

    Stevens, Cacey S.; Marder, Michael; Nagel, Sidney R.

    2015-06-01

    We examine Illinois educational data from standardized exams and analyze primary factors affecting the achievement of public school students. We focus on the simplest possible models: representation of data through visualizations and regressions on single variables. Exam scores are shown to depend on school type, location, and poverty concentration. For most schools in Illinois, student test scores decline linearly with poverty concentration. However, Chicago must be treated separately. Selective schools in Chicago, as well as some traditional and charter schools, deviate from this pattern based on poverty. For any poverty level, Chicago schools perform better than those in the rest of Illinois. Selective programs for gifted students show high performance at each grade level, most notably at the high school level, when compared to other Illinois school types. The case of Chicago charter schools is more complex. Up to 2008, Chicago charter and neighborhood schools had similar performance scores. In the last few years, charter students' scores overtook those of students in traditional schools as the number of charter school locations increased.
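
    The simplest model named here, a single-variable regression, is one line of NumPy: fit score against poverty concentration and read off the slope. The numbers below are invented placeholders, not Illinois values.

        import numpy as np

        poverty = np.array([10, 25, 40, 55, 70, 85.0])   # percent low-income (placeholder)
        score   = np.array([82, 74, 69, 61, 54, 47.0])   # percent meeting standards (placeholder)

        slope, intercept = np.polyfit(poverty, score, deg=1)
        print(f"score = {intercept:.1f} + ({slope:.2f}) * poverty")  # linear decline with poverty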

  15. Characterization of Representative Materials in Support of Safe, Long Term Storage of Surplus Plutonium in DOE-STD-3013 Containers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Narlesky, Joshua E.; Stroud, Mary Ann; Smith, Paul Herrick

    2013-02-15

    The Surveillance and Monitoring Program is a joint Los Alamos National Laboratory/Savannah River Site effort funded by the Department of Energy-Environmental Management to provide the technical basis for the safe, long-term storage (up to 50 years) of over 6 metric tons of plutonium stored in over 5,000 DOE-STD-3013 containers at various facilities around the DOE complex. The majority of this material is plutonium that is surplus to the nuclear weapons program, and much of it is destined for conversion to mixed oxide fuel for use in US nuclear power plants. The form of the plutonium ranges from relatively pure metal and oxide to very impure oxide. The performance of the 3013 containers has been shown to depend on moisture content and on the levels, types and chemical forms of the impurities. The oxide materials that present the greatest challenge to the storage container are those that contain chloride salts. Other common impurities include oxides and other compounds of calcium, magnesium, iron, and nickel. Over the past 15 years the program has collected a large body of experimental data on 54 samples of plutonium, with 53 chosen to represent the broader population of materials in storage. This paper summarizes the characterization data, moisture analysis, particle size, surface area, density, wattage, actinide composition, trace element impurity analysis, and shelf life surveillance data and includes origin and process history information. Limited characterization data on fourteen nonrepresentative samples is also presented.

  16. VORTAB - A data-tablet method of developing input data for the VORLAX program

    NASA Technical Reports Server (NTRS)

    Denn, F. M.

    1979-01-01

    A method of developing an input data file for use in the aerodynamic analysis of a complete airplane with the VORLAX computer program is described. The hardware consists of an interactive graphics terminal equipped with a graphics tablet. Software includes graphics routines from the Tektronix PLOT 10 package as well as the VORTAB program described here. The user determines the size and location of each of the major panels for the aircraft before using the program. Data are entered both from the terminal keyboard and from the graphics tablet. The size of the resulting data file depends on the complexity of the model and can vary from ten to several hundred card images. After the data are entered, two programs, READB and PLOTB, are executed which plot the configuration, allowing visual inspection of the model.

  17. Algebraic Functions, Computer Programming, and the Challenge of Transfer

    ERIC Educational Resources Information Center

    Schanzer, Emmanuel Tanenbaum

    2015-01-01

    Students' struggles with algebra are well documented. Prior to the introduction of functions, mathematics is typically focused on applying a set of arithmetic operations to compute an answer. The introduction of functions, however, marks the point at which mathematics begins to focus on building up abstractions as a way to solve complex problems.…

  18. Crocodile Chemistry. [CD-ROM].

    ERIC Educational Resources Information Center

    1999

    This high school chemistry resource is an on-screen chemistry lab. In the program, students can experiment with a huge range of chemicals, choosing the form, quantity and concentrations. Dangerous or difficult experiments can be investigated safely and easily. A vast range of equipment can be set up, and complex simulations can be put together and…

  19. Molecular identification and genetic diversity analysis of Chinese sugarcane (Saccharum spp. hybrids) varieties using SSR markers

    USDA-ARS?s Scientific Manuscript database

    Sugarcane (Saccharum spp. hybrids) is an important sugar and renewable bioenergy crop. However, its complex aneupolyploidy genome and vegetative mode of propagation often cause difficulty in selection and some variety identity issues in a breeding program. Therefore, the present study was set up to ...

  20. KRISSY: user's guide to modeling three-dimensional wind flow in complex terrain

    Treesearch

    Michael A. Fosberg; Michael L. Sestak

    1986-01-01

    KRISSY is a computer model for generating three-dimensional wind flows in complex terrain from data that were not, or perhaps cannot be, collected. The model is written in FORTRAN IV. This guide describes data requirements, modeling, and output from an applications viewpoint rather than that of programming or theoretical modeling. KRISSY is designed to minimize...

  1. Numerical solution of the Navier-Stokes equations by discontinuous Galerkin method

    NASA Astrophysics Data System (ADS)

    Krasnov, M. M.; Kuchugov, P. A.; E Ladonkina, M.; E Lutsky, A.; Tishkin, V. F.

    2017-02-01

    Detailed unstructured grids and numerical methods of high accuracy are frequently used in the numerical simulation of gasdynamic flows in regions with complex geometry. The Galerkin method with discontinuous basis functions, or Discontinuous Galerkin Method (DGM), works well for such problems. This approach offers a number of advantages inherent to both finite-element and finite-difference approximations. Moreover, the present paper shows that DGM schemes can be viewed as an extension of the Godunov method to piecewise-polynomial functions. As is known, DGM involves significant computational complexity, and this raises the question of how to make the most effective use of all the available computational capacity. In order to speed up the calculations, an operator programming method has been applied in creating the computational module. This approach makes possible the compact encoding of mathematical formulas and facilitates the porting of programs to parallel architectures, such as NVidia CUDA and Intel Xeon Phi. With the DGM-based software package, numerical simulations of supersonic flow past solid bodies have been carried out. The numerical results are in good agreement with the experimental ones.

  2. The AIS-5000 parallel processor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schmitt, L.A.; Wilson, S.S.

    1988-05-01

    The AIS-5000 is a commercially available massively parallel processor which has been designed to operate in an industrial environment. It has fine-grained parallelism with up to 1024 processing elements arranged in a single-instruction multiple-data (SIMD) architecture. The processing elements are arranged in a one-dimensional chain that, for computer vision applications, can be as wide as the image itself. This architecture has superior cost/performance characteristics compared with two-dimensional mesh-connected systems. The design of the processing elements and their interconnections, as well as the software used to program the system, allows a wide variety of algorithms and applications to be implemented. In this paper, the overall architecture of the system is described. Various components of the system are discussed, including details of the processing elements, data I/O pathways and parallel memory organization. A virtual two-dimensional model for programming image-based algorithms for the system is presented. This model is supported by the AIS-5000 hardware and software and allows the system to be treated as a full-image-size, two-dimensional, mesh-connected parallel processor. Performance benchmarks are given for certain simple and complex functions.
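
    The virtual two-dimensional model can be pictured as follows: one PE per image column, with east/west neighbours reached by shifts along the chain and north/south neighbours held in each PE's local memory. The sketch below simulates that mapping in NumPy; it illustrates the concept only and is not AIS-5000 code:

    ```python
    # A 1-D chain of SIMD processing elements emulating a 2-D mesh:
    # east/west neighbours via shifts along the chain, north/south
    # neighbours via each PE's own local memory. Conceptual sketch only.
    import numpy as np

    image = np.arange(36, dtype=float).reshape(6, 6)  # rows x columns

    # each PE stores one column of the image in its local memory
    pe_memory = [image[:, j].copy() for j in range(image.shape[1])]

    def neighbour_average(pe_memory):
        cols = np.stack(pe_memory, axis=1)
        west = np.roll(cols, 1, axis=1)    # shift along the PE chain
        east = np.roll(cols, -1, axis=1)
        north = np.roll(cols, 1, axis=0)   # local-memory access within a PE
        south = np.roll(cols, -1, axis=0)
        return (west + east + north + south) / 4.0

    smoothed = neighbour_average(pe_memory)
    ```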

  3. Follow-up of permanent hearing impairment in childhood.

    PubMed

    Della Volpe, A; De Lucia, A; Pastore, V; Bracci Laudiero, L; Buonissimo, I; Ricci, G

    2016-02-01

    Programmes for the early identification of childhood hearing impairment allow the appropriate hearing aid fitting and rehabilitation process to start quickly; nevertheless, a large number of patients do not join the treatment program. The goal of this article is to present the results of a strategic review of the strengths, weaknesses, opportunities and threats connected with the audiologic/prosthetic/language follow-up process of children with bilateral permanent hearing impairment. Because it involves small children, the follow-up requires specialised professionals of a multidisciplinary team and complex, prolonged, multi-faceted management. Within the framework of the Italian Ministry of Health project CCM 2013 "Preventing Communication Disorders: a Regional Program for Early Identification, Intervention and Care of Hearing Impaired Children", the purpose of this analysis was to propose recommendations that can harmonise criteria for outcome evaluation and provide guidance on the most appropriate assessment methods to be used in the follow-up course of children with permanent hearing impairment. © Copyright by Società Italiana di Otorinolaringologia e Chirurgia Cervico-Facciale.

  4. IP-Based Video Modem Extender Requirements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pierson, L G; Boorman, T M; Howe, R E

    2003-12-16

    Visualization is one of the keys to understanding large complex data sets such as those generated by the large computing resources purchased and developed by the Advanced Simulation and Computing program (aka ASCI). In order to be convenient to researchers, visualization data must be distributed to offices and large complex visualization theaters. Currently, local distribution of the visual data is accomplished by distance-limited modems and RGB switches that simply do not scale to hundreds of users across local, metropolitan, and WAN distances without incurring large costs in fiber plant installation and maintenance. Wide-area application over the DOE Complex is infeasible using these limited-distance RGB extenders. On the other hand, Internet Protocol (IP) over Ethernet is a scalable, well-proven technology that can distribute large volumes of data over these distances. Visual data has been distributed at lower resolutions over IP in industrial applications. This document describes the requirements of the ASCI program in visual signal distribution for the purpose of identifying industrial partners willing to develop products to meet ASCI's needs.

  5. Data Management Challenges in a National Scientific Program of 55 Diverse Research Projects

    NASA Astrophysics Data System (ADS)

    De Bruin, T.

    2016-12-01

    In 2007-2015, the Dutch funding agency NWO funded the National Ocean and Coastal Research Program (in Dutch: ZKO). This program focused on `the scientific analysis of five societal challenges related to a sustainable use of the sea and coastal zones'. These five challenges were safety, economic yield, nature, spatial planning & development, and water quality. The ZKO program was `set up to strengthen the cohesion and collaboration within Dutch marine research'. From the start of the program, data management was addressed, to allow data to be shared amongst the diverse research projects. The ZKO program was divided into four themes (or regions): Carrying Capacity (Wadden Sea), Oceans, North Sea, and Transnational Wadden Sea Research. The `Carrying Capacity' theme was subdivided into three `research lines': policy-relevant research, monitoring, and hypothesis-driven research. Fifty-six projects were funded, ranging from studies on the governance of the Wadden Sea to expeditions studying trace elements in the Atlantic Ocean. One of the first projects to be funded was the data management project. Its objectives were to allow data exchange between projects, to archive all relevant data from all ZKO projects, and to make the data and publications publicly available, following the ZKO Data Policy. This project was carried out by the NIOZ Data Management Group. It turned out that the research projects had hardly any interest in sharing data between projects and had good (?) arguments not to share data at all until the end of the projects. A data portal was built to host and make available all ZKO data and publications. When it came to submitting the data to this portal, most projects obliged willingly, though occasionally found it difficult to find time to do so. However, some projects refused to submit data to an open data portal, despite the rules set up by the funding agency and agreed by all. The take-home message of this presentation is that data sharing is a cultural and psychological issue, not a technical one. The presentation will explain how the data portal was set up and is embedded in national and international data access infrastructures. The focus of the presentation will be on the roles of research funders, researchers and their institutions, politics and society in achieving truly open data, using the ZKO program as a real-life example.

  6. Cost and schedule analytical techniques development

    NASA Technical Reports Server (NTRS)

    1994-01-01

    This contract provided technical services and products to the Marshall Space Flight Center's Engineering Cost Office (PP03) and the Program Plans and Requirements Office (PP02) for the period 3 Aug. 1991 - 30 Nov. 1994. The accomplishments summarized cover the REDSTAR data base, the NASCOM hard-copy data base, the NASCOM automated data base, the NASCOM cost model, complexity generators, program planning, schedules, NASA computer connectivity, other analytical techniques, and special project support.

  7. Construction of Hierarchical Models for Fluid Dynamics in Earth and Planetary Sciences : DCMODEL project

    NASA Astrophysics Data System (ADS)

    Takahashi, Y. O.; Takehiro, S.; Sugiyama, K.; Odaka, M.; Ishiwatari, M.; Sasaki, Y.; Nishizawa, S.; Ishioka, K.; Nakajima, K.; Hayashi, Y.

    2012-12-01

    Toward an understanding of the fluid motions of planetary atmospheres and planetary interiors through multiple numerical experiments with multiple models, we are now proceeding with the ``dcmodel project'', in which a series of hierarchical numerical models of varying complexity is developed and maintained. In the ``dcmodel project'', the numerical models are developed with attention to the following points: 1) a common ``style'' of program code, assuring readability of the software; 2) open-source release of the model codes to the public; 3) scalability of the models, assuring execution on various scales of computational resources; 4) an emphasis on documentation, with a method for writing reference manuals. The lineup of the models and utility programs of the project is as follows: Gtool5, ISPACK/SPML, SPMODEL, Deepconv, Dcpam, and Rdoc-f95. Features of each component are briefly described below. Gtool5 (Ishiwatari et al., 2012) is a Fortran90 library which provides data input/output interfaces and various utilities commonly used in the models of the dcmodel project. The self-descriptive data format netCDF is adopted as the IO format of Gtool5. The interfaces of the gtool5 library reduce the number of operation steps for data IO in the program code of the models compared with the interfaces of the raw netCDF library. Further, with the gtool5 library, procedures for data IO and the addition of metadata for post-processing can be easily implemented in the program codes in a consolidated form independent of the size and complexity of the models. ``ISPACK'' is a spectral transformation library and ``SPML (SPMODEL library)'' (Takehiro et al., 2006) is its wrapper library. The most prominent feature of SPML is a series of array-handling functions with systematic function-naming rules, which enables us to write code in a form easily deduced from the mathematical expressions of the governing equations. ``SPMODEL'' (Takehiro et al., 2006) is a collection of sample programs using ``SPML''. These sample programs provide a base kit for simple numerical experiments in geophysical fluid dynamics. For example, SPMODEL includes a 1-dimensional KdV equation model; 2-dimensional barotropic, shallow-water, and Boussinesq models; and 3-dimensional MHD dynamo models in rotating spherical shells. These models are written in the common style in harmony with the SPML functions. ``Deepconv'' (Sugiyama et al., 2010) and ``Dcpam'' are, respectively, a cloud-resolving model and a general circulation model intended for application to planetary atmospheres. ``Deepconv'' includes several physical processes appropriate for simulations of the Jupiter and Mars atmospheres, while ``Dcpam'' includes those for Earth, Mars, and Venus-like atmospheres. ``Rdoc-f95'' is an automatic generator of reference manuals for Fortran90/95 programs, an extension of the ruby documentation toolkit ``rdoc''. It analyzes the dependencies among modules, functions, and subroutines in multiple program source codes, and it can also list the namelist variables in the programs.
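
    As a rough Python analogue of what the gtool5 interfaces do for the Fortran models (wrapping raw netCDF calls so that writing a variable plus its metadata is one consolidated call), consider the following sketch; the variable names and attributes are illustrative only:

    ```python
    # Consolidated IO helper: one call writes a field and its metadata
    # to a self-descriptive netCDF file, hiding the raw library steps.
    from netCDF4 import Dataset
    import numpy as np

    def write_field(path, name, data, units, long_name):
        """Write one 1-D field with metadata to a netCDF file."""
        with Dataset(path, "w") as ds:
            ds.createDimension("x", len(data))
            var = ds.createVariable(name, "f8", ("x",))
            var.units = units
            var.long_name = long_name
            var[:] = data

    write_field("out.nc", "temp", np.linspace(250.0, 300.0, 16),
                units="K", long_name="air temperature")
    ```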

  8. DISTILLER: a data integration framework to reveal condition dependency of complex regulons in Escherichia coli.

    PubMed

    Lemmens, Karen; De Bie, Tijl; Dhollander, Thomas; De Keersmaecker, Sigrid C; Thijs, Inge M; Schoofs, Geert; De Weerdt, Ami; De Moor, Bart; Vanderleyden, Jos; Collado-Vides, Julio; Engelen, Kristof; Marchal, Kathleen

    2009-01-01

    We present DISTILLER, a data integration framework for the inference of transcriptional module networks. Experimental validation of predicted targets for the well-studied fumarate nitrate reductase regulator showed the effectiveness of our approach in Escherichia coli. In addition, the condition dependency and modularity of the inferred transcriptional network were studied. Surprisingly, the level of regulatory complexity seemed lower than would be expected from RegulonDB, indicating that complex regulatory programs tend to decrease the degree of modularity.

  9. Follow-up of permanent hearing impairment in childhood.

    PubMed Central

    DE LUCIA, A.; PASTORE, V.; BRACCI LAUDIERO, L.; BUONISSIMO, I.; RICCI, G.

    2016-01-01

    SUMMARY Programmes for the early identification of childhood hearing impairment allow the appropriate hearing aid fitting and rehabilitation process to start quickly; nevertheless, a large number of patients do not join the treatment program. The goal of this article is to present the results of a strategic review of the strengths, weaknesses, opportunities and threats connected with the audiologic/prosthetic/language follow-up process of children with bilateral permanent hearing impairment. Because it involves small children, the follow-up requires specialised professionals of a multidisciplinary team and complex, prolonged, multi-faceted management. Within the framework of the Italian Ministry of Health project CCM 2013 "Preventing Communication Disorders: a Regional Program for Early Identification, Intervention and Care of Hearing Impaired Children", the purpose of this analysis was to propose recommendations that can harmonise criteria for outcome evaluation and provide guidance on the most appropriate assessment methods to be used in the follow-up course of children with permanent hearing impairment. PMID:27054392

  10. Digital Systems Validation Handbook. Volume 2. Chapter 18. Avionic Data Bus Integration Technology

    DTIC Science & Technology

    1993-11-01

    interaction between a digital data bus and an avionic system. Very Large Scale Integration (VLSI) ICs and multiversion software, which make up digital...1984, the Sperry Corporation developed a fault tolerant system which employed multiversion programming, voting, and monitoring for error detection and...formulate all the significant behavior of a system. MULTIVERSION PROGRAMMING. N-version programming. N-VERSION PROGRAMMING. The independent coding of a

  11. AN OPTIMIZED 64X64 POINT TWO-DIMENSIONAL FAST FOURIER TRANSFORM

    NASA Technical Reports Server (NTRS)

    Miko, J.

    1994-01-01

    Scientists at Goddard have developed an efficient and powerful program, An Optimized 64x64 Point Two-Dimensional Fast Fourier Transform, which combines real- and complex-valued one-dimensional Fast Fourier Transforms (FFTs) to compute a two-dimensional FFT and its power spectrum coefficients. These coefficients can be used in many applications, including spectrum analysis, convolution, digital filtering, image processing, and data compression. The program's efficiency results from its technique of expanding all arithmetic operations within one 64-point FFT; its high processing rate results from its operation on a high-speed digital signal processor. For non-real-time analysis, the program requires as input an ASCII data file of 64x64 (4096) real-valued data points. As output, this analysis produces an ASCII data file of 64x64 power spectrum coefficients. To generate these coefficients, the program employs a row-column decomposition technique. First, it performs a radix-4 one-dimensional FFT on each row of input, producing complex-valued results. Then, it performs a one-dimensional FFT on each column of these results to produce complex-valued two-dimensional FFT results. Finally, the program sums the squares of the real and imaginary values to generate the power spectrum coefficients. The program requires a Banshee accelerator board with 128K bytes of memory from Atlanta Signal Processors (404/892-7265) installed in an IBM PC/AT compatible computer (DOS ver. 3.0 or higher) with at least one 16-bit expansion slot. For real-time operation, an ASPI daughter board is also needed. The real-time configuration reads 16-bit integer input data directly into the accelerator board, operating on 64x64 point frames of data. The program's memory management also allows accumulation of the coefficient results. The real-time processing rate to calculate and accumulate the 64x64 power spectrum output coefficients is less than 17.0 msec. Documentation is included in the price of the program. Source code is written in C, 8086 Assembly, and Texas Instruments TMS320C30 Assembly languages. This program is available on a 5.25 inch 360K MS-DOS format diskette. IBM and IBM PC are registered trademarks of International Business Machines. MS-DOS is a registered trademark of Microsoft Corporation.
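
    The row-column decomposition is easy to express in NumPy; the sketch below mirrors the non-real-time path described above (file names are placeholders, and the original runs on a DSP rather than in NumPy):

    ```python
    # Row-column decomposition of a 64x64 power spectrum: 1-D FFTs over
    # rows, then over columns, then sum of squares of real and imaginary
    # parts -- the same three steps the abstract describes.
    import numpy as np

    data = np.loadtxt("input.txt").reshape(64, 64)   # 4096 real samples

    rows = np.fft.fft(data, axis=1)       # 1-D FFT of every row
    cols = np.fft.fft(rows, axis=0)       # 1-D FFT of every column
    power = cols.real**2 + cols.imag**2   # power spectrum coefficients

    np.savetxt("power.txt", power)
    ```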

  12. Aviation and programmatic analyses; Volume 1, Task 1: Aviation data base development and application. [for NASA OAST programs

    NASA Technical Reports Server (NTRS)

    1977-01-01

    A method was developed for using the NASA aviation data base and computer programs in conjunction with the GE management analysis and projection service to perform simple and complex economic analyses for planning, forecasting, and evaluating OAST programs. Capabilities of the system are discussed along with procedures for making basic data tabulations, updates, and entries. The system is applied in an agricultural aviation study in order to assess its practical utility in the OAST working environment.

  13. A Novel Approach for Modeling Chemical Reaction in Generalized Fluid System Simulation Program

    NASA Technical Reports Server (NTRS)

    Sozen, Mehmet; Majumdar, Alok

    2002-01-01

    The Generalized Fluid System Simulation Program (GFSSP) is a computer code developed at NASA Marshall Space Flight Center for analyzing steady-state and transient flow rates, pressures, temperatures, and concentrations in a complex flow network. The code, which performs system-level simulation, can handle compressible and incompressible flows as well as phase change and mixture thermodynamics. The thermodynamic and thermophysical property programs GASP, WASP and GASPAK provide the necessary data for fluids such as helium, methane, neon, nitrogen, carbon monoxide, oxygen, argon, carbon dioxide, fluorine, hydrogen, water, parahydrogen, isobutane, butane, deuterium, ethane, ethylene, hydrogen sulfide, krypton, propane, xenon, several refrigerants, nitrogen trifluoride and ammonia. The program, which was developed out of the need for an easy-to-use system-level simulation tool for complex flow networks, has been used for the following purposes, to name a few: Space Shuttle Main Engine (SSME) High Pressure Oxidizer Turbopump Secondary Flow Circuits, Axial Thrust Balance of the Fastrac Engine Turbopump, Pressurized Propellant Feed System for the Propulsion Test Article at Stennis Space Center, X-34 Main Propulsion System, X-33 Reaction Control System and Thermal Protection System, and International Space Station Environmental Control and Life Support System design. There has been an increasing demand for implementing a combustion simulation capability in GFSSP in order to extend its system-level simulation of a liquid rocket propulsion system from the propellant tanks up to the thruster nozzle, for spacecraft as well as launch vehicles. The present work was undertaken to address this need. The chemical equilibrium equations derived from the second law of thermodynamics and the energy conservation equation derived from the first law of thermodynamics are solved simultaneously by a Newton-Raphson method. The numerical scheme was implemented as a User Subroutine in GFSSP.
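
    The combustion module's numerical core is a Newton-Raphson iteration on a coupled nonlinear system. The sketch below shows that scheme on a generic residual vector F(x) = 0 with a finite-difference Jacobian; the toy residuals stand in for the equilibrium and energy equations and are not GFSSP's actual formulation:

    ```python
    # Generic Newton-Raphson solver for a coupled nonlinear system,
    # illustrating the numerical scheme named in the abstract.
    import numpy as np

    def newton_raphson(F, x, tol=1e-10, max_iter=50):
        for _ in range(max_iter):
            f = F(x)
            if np.linalg.norm(f) < tol:
                break
            J = np.empty((len(x), len(x)))
            h = 1e-7
            for j in range(len(x)):          # finite-difference Jacobian
                xp = x.copy()
                xp[j] += h
                J[:, j] = (F(xp) - f) / h
            x = x - np.linalg.solve(J, f)    # Newton step
        return x

    # toy coupled system standing in for equilibrium + energy balance
    F = lambda x: np.array([x[0]**2 + x[1] - 3.0, x[0] + x[1]**2 - 5.0])
    print(newton_raphson(F, np.array([1.0, 1.0])))   # converges to [1, 2]
    ```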

  14. Developing the Planetary Science Virtual Observatory

    NASA Astrophysics Data System (ADS)

    Erard, Stéphane; Cecconi, Baptiste; Le Sidaner, Pierre; Henry, Florence; Chauvin, Cyril; Berthier, Jérôme; André, Nicolas; Génot, Vincent; Schmitt, Bernard; Capria, Teresa; Chanteur, Gérard

    2015-08-01

    In the framework of the Europlanet-RI program, a prototype Virtual Observatory dedicated to Planetary Science has been set up. Most of the activity was dedicated to the definition of standards to handle data in this field. The aim was to facilitate searches in big archives as well as sparse databases, to make on-line data access and visualization possible, and to allow small data providers to make their data available in an interoperable environment with minimum effort. This system makes intensive use of studies and developments led in Astronomy (IVOA), Solar Science (HELIO), and space archive services (IPDA). The current architecture connects existing data services with IVOA or IPDA protocols whenever relevant. However, a more general standard has been devised to handle the specific complexity of Planetary Science, e.g. in terms of measurement types and coordinate frames. This protocol, named EPN-TAP, is based on TAP and includes precise requirements to describe the contents of a data service (Erard et al., Astron. & Comp., 2014). A light framework (DaCHS/GAVO) and a procedure have been identified to install small data services, and several hands-on sessions have already been organized. The data services are declared in standard IVOA registries. Support for new data services in Europe will be provided during the proposed Europlanet H2020 program, with a focus on planetary mission support (Rosetta, Cassini…). A specific client (VESPA) has been developed at VO-Paris (http://vespa.obspm.fr). It is able to use all the mandatory parameters in EPN-TAP, plus extra parameters from individual services. A resolver for target names is also available. Selected data can be sent to VO visualization tools such as TOPCAT or Aladin through the SAMP protocol. Future steps will include the development of a connection between the VO world and GIS tools, and the integration of heliophysics, planetary plasma and reference spectroscopic data. The EuroPlaNet-RI project was funded by the European Commission under the 7th Framework Program, grant 228319 "Capacities Specific Programme".
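
    A minimal sketch of querying an EPN-TAP service with the pyvo client library is shown below. The service URL is an assumed example, and while granule_uid, target_name, and dataproduct_type follow EPN-TAP naming conventions, live services should be located through the VESPA portal:

    ```python
    # Querying an EPN-TAP service via TAP/ADQL with pyvo. The endpoint
    # below is an assumed example, not a guaranteed live service.
    import pyvo

    service = pyvo.dal.TAPService("http://vo.obspm.fr/tap")
    query = """
    SELECT TOP 10 granule_uid, target_name, dataproduct_type
    FROM epn_core
    WHERE target_name = 'Mars'
    """
    for row in service.search(query):
        print(row["granule_uid"], row["dataproduct_type"])
    ```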

  15. Analysis of Academic and Non-Academic Outcomes from a Bottom-up Comprehensive School Reform in the Absence of Student Level Data through Simulation Methods: A Mixed Methods Case Study

    ERIC Educational Resources Information Center

    Sondergeld, Toni A.

    2009-01-01

    This dissertation examines the efficacy of a bottom-up comprehensive school reform (CSR) program by evaluating its impact on student achievement, attendance, and behavior outcomes through an explanatory mixed methods design. The CSR program (Gear Up) was implemented in an urban junior high school over the course of seven years allowing for…

  16. Program logic: a framework for health program design and evaluation - the Pap nurse in general practice program.

    PubMed

    Hallinan, Christine M

    2010-01-01

    In this paper, program logic is used to 'map out' the planning, development and evaluation of the Pap nurse in general practice program in the Australian general practice arena. The incorporation of program logic into the evaluative process supports a greater appreciation of the theoretical assumptions and external influences that underpin general practice Pap nurse activity. The creation of a program logic model is a conscious strategy that results in an explicit understanding of the challenges ahead, the resources available and the time frames for outcomes. Program logic also enables a recognition that all players in the general practice arena need to be acknowledged by policy makers, bureaucrats and program designers when addressing, through policy, issues relating to equity and accessibility of health initiatives. Logic modelling allows decision makers to consider the complexities of causal associations when developing health care proposals and programs. It enables the Pap nurse in general practice program to be represented diagrammatically by linking outcomes (short, medium and long term) with both the program activities and program assumptions. The research methodology used in the evaluation of the Pap nurse in general practice program includes a descriptive study design and the incorporation of program logic, with a retrospective analysis of Australian data from 2001 to 2009. For the purposes of gaining both empirical and contextual data for this paper, a data set analysis and a literature review were performed. The application of program logic as an evaluative tool for analysis of the Pap nurse incentive program facilitates a greater understanding of the complex triggers of general practice activity, and allows this greater understanding to be incorporated into policy to facilitate Pap nurse activity, increase general practice cervical smears and ultimately decrease the burden of disease.

  17. The Dryden Flight Research Center at Edwards Air Force Base is NASA's premier center for atmospheric flight research to validate high-risk aerospace technology.

    NASA Image and Video Library

    2001-07-25

    Since the 1940s the Dryden Flight Research Center, Edwards, California, has developed a unique and highly specialized capability for conducting flight research programs. The organization, made up of pilots, scientists, engineers, technicians, and mechanics, has been and will continue to be a leader in the field of advanced aeronautics. Located on the northwest "shore" of Rogers Dry Lake, the complex was built around the original administrative-hangar building constructed in 1954. Since then many additional support and operational facilities have been built, including a number of unique test facilities such as the Thermalstructures Research Facility, Flow Visualization Facility, and the Integrated Test Facility. One of the most prominent structures is the space shuttle program's Mate-Demate Device and hangar in Area A to the north of the main complex. On the lakebed surface is a Compass Rose that gives pilots an instant compass heading. The Dryden complex originated at Edwards Air Force Base in support of the X-1 supersonic flight program. As other high-speed aircraft entered research programs, the facility became permanent and grew from a staff of five engineers in 1947 to a population in 2006 of nearly 1100 full-time government and contractor employees.

  18. NASA's Dryden Flight Research Center is situated immediately adjacent to the compass rose on the bed of Rogers Dry Lake at Edwards Air Force Base, Calif.

    NASA Image and Video Library

    2001-07-25

    Since the 1940s the Dryden Flight Research Center, Edwards, California, has developed a unique and highly specialized capability for conducting flight research programs. The organization, made up of pilots, scientists, engineers, technicians, and mechanics, has been and will continue to be a leader in the field of advanced aeronautics. Located on the northwest "shore" of Rogers Dry Lake, the complex was built around the original administrative-hangar building constructed in 1954. Since then many additional support and operational facilities have been built, including a number of unique test facilities such as the Thermalstructures Research Facility, Flow Visualization Facility, and the Integrated Test Facility. One of the most prominent structures is the space shuttle program's Mate-Demate Device and hangar in Area A to the north of the main complex. On the lakebed surface is a Compass Rose that gives pilots an instant compass heading. The Dryden complex originated at Edwards Air Force Base in support of the X-1 supersonic flight program. As other high-speed aircraft entered research programs, the facility became permanent and grew from a staff of five engineers in 1947 to a population in 2006 of nearly 1100 full-time government and contractor employees.

  19. Programming Pluralism: Using Learning Analytics to Detect Patterns in the Learning of Computer Programming

    ERIC Educational Resources Information Center

    Blikstein, Paulo; Worsley, Marcelo; Piech, Chris; Sahami, Mehran; Cooper, Steven; Koller, Daphne

    2014-01-01

    New high-frequency, automated data collection and analysis algorithms could offer new insights into complex learning processes, especially for tasks in which students have opportunities to generate unique open-ended artifacts such as computer programs. These approaches should be particularly useful because the need for scalable project-based and…

  20. Environmental projects. Volume 1: Polychlorinated biphenyl (PCB) abatement program

    NASA Technical Reports Server (NTRS)

    Kushner, L.

    1987-01-01

    Six large parabolic dish antennas are located at the Goldstone Deep Space Communications Complex north of Barstow, California. Some of the ancillary electrical equipment of these Deep Space Stations, particularly transformers and power capacitors, was filled with stable, fire-retardant dielectric fluids containing substances called polychlorinated biphenyls (PCBs). Because the Environmental Protection Agency has determined that PCBs are environmental pollutants toxic to humans, all NASA centers have been asked to participate in a PCB-abatement program. Under the supervision of JPL's Office of Telecommunications and Data Acquisition, a two-year PCB-abatement program has eliminated PCBs from the Goldstone Complex.

  1. The Design, Development and Testing of a Multi-process Real-time Software System

    DTIC Science & Technology

    2007-03-01

    programming large systems stems from the complexity of dealing with many different details at one time. A sound engineering approach is to break...controls and 3) is portable to other OS platforms such as Microsoft Windows. Next, to reduce the complexity of the programming tasks, the system...processes depending on how often the process has to check to see if common data was modified. A good method for one process to quickly notify another

  2. Fast computation of close-coupling exchange integrals using polynomials in a tree representation

    NASA Astrophysics Data System (ADS)

    Wallerberger, Markus; Igenbergs, Katharina; Schweinzer, Josef; Aumayr, Friedrich

    2011-03-01

    The semi-classical atomic-orbital close-coupling method is a well-known approach for the calculation of cross sections in ion-atom collisions. It relies strongly on the fast and stable computation of exchange integrals. We present an upgrade to earlier implementations of the Fourier-transform method. For this purpose, we implement an extensive library for the symbolic storage of polynomials, relying on sophisticated tree structures to allow fast manipulation and numerically stable evaluation. Using this library, we considerably speed up the creation and computation of exchange integrals. This enables us to compute cross sections for more complex collision systems. Program summary: Program title: TXINT Catalogue identifier: AEHS_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEHS_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 12 332 No. of bytes in distributed program, including test data, etc.: 157 086 Distribution format: tar.gz Programming language: Fortran 95 Computer: All with a Fortran 95 compiler Operating system: All with a Fortran 95 compiler RAM: Depends heavily on input, usually less than 100 MiB Classification: 16.10 Nature of problem: Analytical calculation of one- and two-center exchange matrix elements for the close-coupling method in the impact parameter model. Solution method: Similar to the code of Hansen and Dubois [1], we use the Fourier-transform method suggested by Shakeshaft [2] to compute the integrals. However, we greatly speed up the calculation using a library for symbolic manipulation of polynomials. Restrictions: We restrict ourselves to a defined collision system in the impact parameter model. Unusual features: A library for symbolic manipulation of polynomials, where polynomials are stored in a space-saving left-child right-sibling binary tree. This provides stable numerical evaluation and fast mutation while maintaining full compatibility with the original code. Additional comments: This program makes heavy use of the features introduced by the Fortran 90 standard, most prominently pointers, derived types and allocatable structures, and a small portion of Fortran 95. Only newer compilers support these features. The following compilers support all features needed by the program: GNU Fortran Compiler "gfortran" from version 4.3.0; GNU Fortran 95 Compiler "g95" from version 4.2.0; Intel Fortran Compiler "ifort" from version 11.0.
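
    The left-child right-sibling idea can be sketched briefly: every node keeps only two links yet may have arbitrarily many children. The Python sketch below stores and evaluates a small multivariate polynomial this way; the encoding (one tree level per variable, coefficients on leaves) is illustrative, not the Fortran library's actual layout:

    ```python
    # Left-child right-sibling (LCRS) tree for polynomial terms: two
    # links per node, arbitrarily many children. Illustrative encoding.
    class Node:
        def __init__(self, exponent, coeff=None):
            self.exponent = exponent   # power of this level's variable
            self.coeff = coeff         # set on leaves only
            self.child = None          # left child: factor one level down
            self.sibling = None        # right sibling: next term here

    def evaluate(node, variables, depth=0):
        """Evaluate the sub-polynomial rooted at `node`."""
        total = 0.0
        while node is not None:
            term = variables[depth] ** node.exponent
            if node.child is not None:
                term *= evaluate(node.child, variables, depth + 1)
            else:
                term *= node.coeff
            total += term
            node = node.sibling
        return total

    # p(x, y) = 3*x^2*y + 2*y^3  ->  two siblings at the x-level
    p = Node(2); p.child = Node(1, coeff=3.0)
    p.sibling = Node(0); p.sibling.child = Node(3, coeff=2.0)
    print(evaluate(p, (2.0, 1.5)))   # 3*4*1.5 + 2*3.375 = 24.75
    ```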

  3. Improved neutron activation prediction code system development

    NASA Technical Reports Server (NTRS)

    Saqui, R. M.

    1971-01-01

    Two integrated neutron activation prediction code systems have been developed by modifying and integrating existing computer programs to perform the computations necessary to determine neutron-induced activation gamma ray doses and dose rates in complex geometries. Each of the two systems comprises three computational modules. The first module computes the spatial and energy distribution of the neutron flux from an input source and prepares input data for the second program, which performs the reaction rate, decay chain, and activation gamma source calculations. A third module then accepts input prepared by the second program to compute the cumulative gamma doses and/or dose rates at specified detector locations in complex, three-dimensional geometries.

  4. Graph Structured Program Evolution: Evolution of Loop Structures

    NASA Astrophysics Data System (ADS)

    Shirakawa, Shinichi; Nagao, Tomoharu

    Recently, numerous automatic programming techniques have been developed and applied in various fields. A typical example is genetic programming (GP), and various extensions and representations of GP have been proposed thus far. Complex programs and hand-written programs, however, may contain several loops and handle multiple data types. In this chapter, we propose a new method called Graph Structured Program Evolution (GRAPE). The representation of GRAPE is a graph structure; therefore, it can represent branches and loops using this structure. Each program is constructed as an arbitrary directed graph of nodes and a data set. The GRAPE program handles multiple data types using a data set for each type, and the genotype of GRAPE takes the form of a linear string of integers. We apply GRAPE to three test problems, factorial, exponentiation, and list sorting, and demonstrate that the optimum solution in each problem is obtained by the GRAPE system.
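
    A minimal sketch of the representation idea follows: because the program is an arbitrary directed graph, a cycle expresses a loop with no explicit loop construct. The tiny instruction set and the hand-built factorial graph below are illustrative; GRAPE itself evolves such graphs from the integer genotype:

    ```python
    # Graph-structured program: nodes form a directed graph, and a cycle
    # in the graph expresses a loop. Hand-built factorial example.

    # each node: (operation, successor node ids)
    # data registers: r[0] = n (input/counter), r[1] = accumulator
    graph = {
        0: ("mul_acc", [1]),       # r[1] *= r[0]
        1: ("dec",     [2]),       # r[0] -= 1
        2: ("if_pos",  [0, 3]),    # r[0] > 0 ? back to node 0 : node 3
        3: ("halt",    []),
    }

    def run(graph, n):
        r = [n, 1]
        node = 0
        while True:
            op, nxt = graph[node]
            if op == "mul_acc":
                r[1] *= r[0]; node = nxt[0]
            elif op == "dec":
                r[0] -= 1; node = nxt[0]
            elif op == "if_pos":
                node = nxt[0] if r[0] > 0 else nxt[1]
            else:                  # halt
                return r[1]

    print(run(graph, 5))   # 120
    ```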

  5. The effect of the support program on the resilience of female family caregivers of stroke patients: Randomized controlled trial.

    PubMed

    İnci, Fadime Hatice; Temel, Ayla Bayik

    2016-11-01

    The purpose of the study was to determine the effect of a support program on the resilience of female family caregivers of stroke patients. This is a randomized controlled trial. The sample consisted of 70 female family caregivers (34 in the experimental group, 36 in the control group). Data were collected three times (pretest, posttest, and follow-up test) using a demographic data form and the Family Index of Regenerativity and Adaptation-General. A significant difference was determined between the experimental and control groups' follow-up test scores for relative and friend support, social support and family coping-coherence. A significant difference was determined between the experimental group's mean pretest, posttest and follow-up test scores in terms of family strain, relative and friend support, social support, family coping-coherence, family hardiness and family distress. These results suggest that the support program contributes to the improvement of the components of resilience of family caregivers of stroke patients. Copyright © 2016 Elsevier Inc. All rights reserved.

  6. Emetic and Electric Shock Alcohol Aversion Therapy: Six- and Twelve-Month Follow-Up.

    ERIC Educational Resources Information Center

    Cannon, Dale S.; Baker, Timothy B.

    1981-01-01

    Follow-up data are presented at 6 and 12 months for male alcoholics (N=20) who received either a multifaceted inpatient alcoholism treatment program alone (controls) or emetic or shock aversion therapy in addition to that program. Both emetic and control subjects compiled more days of abstinence than shock subjects. (Author)

  7. The impact of patient support programs on adherence, clinical, humanistic, and economic patient outcomes: a targeted systematic review

    PubMed Central

    Ganguli, Arijit; Clewell, Jerry; Shillington, Alicia C

    2016-01-01

    Background Patient support programs (PSPs), including medication management and counseling, have the potential to improve care in chronic disease states with complex therapies. Little is known about these programs' effects on clinical, adherence, humanistic, and cost outcomes. Purpose To conduct a targeted review describing medical conditions in which PSPs have been implemented; support delivery components (eg, face-to-face, phone, mail, and internet); and outcomes associated with implementation. Data sources MEDLINE – 10 years through March 2015 with supplemental handsearching of reference lists. Study selection English-language trials and observational studies of PSPs providing, at minimum, counseling for medication management, measurement of ≥1 clinical outcome, and a 3-month follow-up period during which outcomes were measured. Data extraction Program characteristics and related clinical, adherence, humanistic, and cost outcomes were abstracted. Study quality and the overall strength of evidence were reviewed using standard criteria. Data synthesis Of 2,239 citations, 64 studies met inclusion criteria. All targeted chronic disease processes and the majority (48 [75%]) of programs offered in-clinic, face-to-face support. All but 9 (14.1%) were overseen by allied health care professionals (eg, nurses, pharmacists, paraprofessionals). Forty-one (64.1%) reported at least one significantly positive clinical outcome. The most frequent clinical outcome impacted was adherence, where 27 of 41 (66%) reported a positive outcome. Of 42 studies measuring humanistic outcomes (eg, quality of life, functional status), 27 (64%) reported significantly positive outcomes. Only 15 (23.4%) programs reported cost or utilization-related outcomes, and, of these, 12 reported positive impacts. Conclusion The preponderance of evidence suggests a positive impact of PSPs on adherence, clinical and humanistic outcomes. Although less often measured, health care utilization and costs are also reduced following PSP implementation. Further research is needed to better quantify which support programs, delivery methods, and components offer the greatest value for any particular medical condition. PMID:27175071

  8. Analog Input Data Acquisition Software

    NASA Technical Reports Server (NTRS)

    Arens, Ellen

    2009-01-01

    DAQ Master Software allows users to easily set up a system to monitor up to five analog input channels and save the data after acquisition. This program was written in LabVIEW 8.0, and requires the LabVIEW runtime engine 8.0 to run the executable.

  9. An Authentic Research Experience in an Astronomy Education Professional Development Program: An Analysis of 8 Years of Data on the NASA/IPAC Teacher Archive Research Program (NITARP)

    NASA Astrophysics Data System (ADS)

    Rebull, Luisa; Roberts, Tracy; Laurence, Wendi; Fitzgerald, Michael; French, Debbie; Gorjian, Varoujan; Squires, Gordon

    2018-01-01

    The NASA/IPAC Teacher Archive Research Program (NITARP) partners small groups of educators with a research astronomer for a year-long authentic research project. This program aligns well with the characteristics of high-quality professional development (PD) programs and has worked with a total of 103 educators since 2005. In this poster, we explore surveys obtained from 74 different educators, at up to four waypoints during the course of 13 months, incorporating data from the class of 2010 through the class of 2017. The reasons educators participate are mapped onto a continuum ranging from more inward-focused to more outward-focused; NITARP has had more outward-focused educators than inward-focused, though there is a bias against the extremes on either end of the continuum. This insight into teacher motivations has implications for how the educators are supported during the NITARP year. Three-quarters of the educators self-report some or major changes in their understanding of the nature of science. The program provides educators with experience collaborating with astronomers and other educators, and forges a strong link to the astronomical research community; the NITARP community of practice encourages and reinforces these linkages. During the experience, educators get comfortable with learning complex new concepts, with ~40% noting in their surveys that their approach to learning has changed. Educators are provided opportunities for professional growth; at least 12% have changed career paths substantially in part due to the program, and 11% report that the experience was “life changing.” At least 60% are including richer, more authentic science activities in their classrooms. This work illuminates what benefits the program brings to its participants, and serves as a model for similar PD programs in other STEM subjects.

  10. Software complex for geophysical data visualization

    NASA Astrophysics Data System (ADS)

    Kryukov, Ilya A.; Tyugin, Dmitry Y.; Kurkin, Andrey A.; Kurkina, Oxana E.

    2013-04-01

    The effectiveness of current research in geophysics is largely determined by the degree of implementation of data processing and visualization procedures using modern information technology. Realistic and informative visualization of the results of three-dimensional modeling of geophysical processes contributes significantly to the naturalness of physical modeling and a detailed view of the phenomena. The main difficulty is interpreting the results of the calculations: it is necessary to be able to observe the various parameters of the three-dimensional models, build sections on different planes to evaluate certain characteristics, and make rapid assessments. Programs for the interpretation and visualization of simulations are used all over the world, for example software systems such as ParaView, Golden Software Surfer, Voxler, Flow Vision and others. However, it is not always possible to solve a visualization problem with a single software package. Preprocessing, data transfer between packages, and setting up a uniform visualization style can turn into long and routine work. In addition, special display modes for specific data are sometimes required, and existing products tend to offer common features that are not always fully applicable to certain special cases. Rendering of dynamic data may require scripting languages, which does not relieve the user from writing code. Therefore, the task was to develop a new and original software complex for the visualization of simulation results. The primary features developed are briefly listed here. The software complex is a graphical application with a convenient and simple user interface that displays the results of the simulation. The complex can also interactively manipulate the image, resize the image without loss of quality, apply a two-dimensional or three-dimensional regular grid, set the coordinate axes with data labels, and perform slices of the data. A distinctive feature of geophysical data is its size. Detailed maps used in the simulations are large, so rendering in real time can be a difficult task even for powerful modern computers. Therefore, the performance of the software complex is an important aspect of this work. The complex is based on a recent graphics API, Microsoft DirectX 11, which reduces overhead and harnesses the power of modern hardware. Each geophysical calculation is an adjustment of the mathematical model for a particular case, so the architecture of the visualization complex is designed for scalability and the ability to customize visualization objects, for better visibility and comfort. In the present study, the software complex 'GeoVisual' was developed. One of the main features of this research is the use of bleeding-edge computer graphics techniques in scientific visualization. The research was supported by The Ministry of Education and Science of the Russian Federation, project 14.B37.21.0642.
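
    The "slice of data" feature can be illustrated in a few lines: extract a 2-D section from a 3-D scalar field and display it with a regular grid and labelled axes. Matplotlib stands in for the complex's DirectX 11 renderer, and the synthetic field is a placeholder:

    ```python
    # Slicing a 3-D scalar field on a coordinate plane and displaying
    # the section with a regular grid and labelled axes.
    import numpy as np
    import matplotlib.pyplot as plt

    # synthetic 3-D geophysical field on a 64^3 grid (placeholder data)
    z, y, x = np.mgrid[0:64, 0:64, 0:64]
    field = np.sin(x / 8.0) * np.cos(y / 8.0) * np.exp(-z / 32.0)

    section = field[:, :, 32]          # slice on the plane x = 32
    plt.imshow(section, origin="lower", extent=(0, 64, 0, 64))
    plt.colorbar(label="field value")
    plt.xlabel("y"); plt.ylabel("z"); plt.title("Section at x = 32")
    plt.grid(True)
    plt.show()
    ```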

  11. Bridging the gap between sustainable technology adoption and protecting natural resources: Predicting intentions to adopt energy management technologies in California

    DOE PAGES

    Chen, Bingye; Sintov, Nicole

    2016-10-24

    To achieve energy savings, emerging energy management technologies and programs require customer adoption. Although a variety of models can be used to explain the adoption of energy management technologies and programs, they overlook the seemingly unconventional element of level of affiliation with nature. Connectedness to nature has been identified as an important driver of many pro-environmental behaviors, but its role in pro-environmental technology adoption is not well understood. Can affiliation with nature help to bridge the apparent gap (and complex chain of events) between sustainable technology adoption and protecting natural resources? Based on survey data from 856 southern California residents, this study investigated the influence of connectedness to nature and other factors on intentions to adopt five energy management technologies and programs: using three platforms to monitor home energy use (website, mobile phone application, in-home display); signing up for a time-of-use pricing plan; and participating in demand response events. Regression results showed that nature connectedness was the strongest predictor of all outcomes, such that higher nature connectedness predicted greater likelihood of technology and program adoption. In conclusion, these findings suggest that connectedness to nature may facilitate "bridging the logic gap" between sustainable innovation adoption and environmental protection.

  12. Implementation of the first worldwide quality assurance program for cystic fibrosis multiple mutation detection in population-based screening.

    PubMed

    Earley, Marie C; Laxova, Anita; Farrell, Philip M; Driscoll-Dunn, Rena; Cordovado, Suzanne; Mogayzel, Peter J; Konstan, Michael W; Hannon, W Harry

    2011-07-15

    CDC's Newborn Screening Quality Assurance Program collaborated with several U.S. Cystic Fibrosis Care Centers to collect specimens for the development of a molecular CFTR proficiency testing program using dried-blood spots for newborn screening laboratories. Adult and adolescent patients or carriers donated whole blood that was aliquoted onto filter paper cards. Five blind-coded specimens were sent to participating newborn screening laboratories quarterly. Proficiency testing results were evaluated based on presumptive clinical assessment. Individual evaluations and summary reports were sent to each participating laboratory, and technical consultations were offered if incorrect assessments were reported. The current CDC repository contains specimens with 39 different CFTR mutations. Up to 45 laboratories have participated in the program. Three years of data showed that correct assessments were reported 97.7% of the time overall when both mutations could be determined. Incorrect assessments that could have led to a missed case occurred 0.9% of the time, and no information was reported 1.1% of the time due to sample failure. Results show that laboratories using molecular assays to detect CFTR mutations are performing satisfactorily. The programmatic results presented demonstrate the importance and complexity of providing proficiency testing for DNA-based assays. Published by Elsevier B.V.

  13. Resolution of singularities for multi-loop integrals

    NASA Astrophysics Data System (ADS)

    Bogner, Christian; Weinzierl, Stefan

    2008-04-01

    We report on a program for the numerical evaluation of divergent multi-loop integrals. The program is based on iterated sector decomposition. We improve the original algorithm of Binoth and Heinrich such that the program is guaranteed to terminate. The program can be used to compute numerically the Laurent expansion of divergent multi-loop integrals regulated by dimensional regularisation. The symbolic and the numerical steps of the algorithm are combined into one program. Program summary: Program title: sector_decomposition Catalogue identifier: AEAG_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEAG_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 47 506 No. of bytes in distributed program, including test data, etc.: 328 485 Distribution format: tar.gz Programming language: C++ Computer: all Operating system: Unix RAM: Depending on the complexity of the problem Classification: 4.4 External routines: GiNaC, available from http://www.ginac.de, GNU scientific library, available from http://www.gnu.org/software/gsl Nature of problem: Computation of divergent multi-loop integrals. Solution method: Sector decomposition. Restrictions: Only limited by the available memory and CPU time. Running time: Depending on the complexity of the problem.
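
    A textbook-style illustration of the method (not an example from the program's test suite) shows how sector decomposition factorizes an overlapping singularity:

    ```latex
    % Consider the dimensionally regulated integral
    \[
      I(\epsilon) = \int_0^1 dx \int_0^1 dy \, \frac{x^{-\epsilon}}{(x+y)^{2}},
    \]
    % which has an overlapping singularity at x = y = 0. Splitting the
    % unit square into the sectors x > y and y > x, and remapping each
    % sector to the unit square (y = x t and x = y t, respectively),
    % factorizes the singularity into monomials:
    \[
      I(\epsilon)
        = \int_0^1 dx \, x^{-1-\epsilon} \int_0^1 dt \, \frac{1}{(1+t)^{2}}
        + \int_0^1 dy \, y^{-1-\epsilon} \int_0^1 dt \, \frac{t^{-\epsilon}}{(1+t)^{2}}.
    \]
    % Now \int_0^1 x^{-1-\epsilon}\,dx = -1/\epsilon exposes the pole
    % explicitly, the remaining t-integrals are finite, and the Laurent
    % expansion in \epsilon can be computed numerically.
    ```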

  14. Transition from paediatric to adult care of adolescent patients with congenital heart disease: a pathway to optimal care.

    PubMed

    Strijbosch, A M M; Zwart, R; Blom, N A; Bouma, B J; Groenink, M; Boekholdt, S M; de Winter, R; Mulder, B J M; Backx, A P

    2016-11-01

    Adolescents with congenital heart disease transition from a paediatric to an adult setting. This is associated with loss to follow-up and suboptimal care. Increasing numbers of patients justify a special program. In this study we evaluated the cooperative program between the paediatric and adult cardiology departments in a tertiary referral centre. In this retrospective study, patients with congenital heart disease with at least one appointment scheduled at the transition program between January 2010 and January 2015 were included. They were seen by a paediatric cardiologist at the age of 15 years in the paediatric department and from age 18 to 25 in the adult department. Demographic and medical data were collected from the electronic patient files. A total of 193 patients (105 males, 88 females) were identified. Sex distribution was almost equal. Most patients were 18-21 years of age. The largest group, 128 patients (67 %), lived within 50 kilometres of our hospital. Paediatric cardiologists referred 157 (81 %) of the patients. General practitioners and cardiologists from outside our centre were important referrers for patients lost to follow-up, together accounting for 9 %. A total of 34 (18 %) patients missed an appointment without notification. Repeat offenders, 16 of 34 patients, formed a significant minority within this group. A total of 114 (59 %) patients were attending school, 46 (24 %) were employed, and 33 (17 %) patients were inactive. Activities are in line with capabilities. A nurse practitioner was involved with the 7 % with complex and psychosocial problems. Moderately severe congenital heart defects formed the largest patient category of 102 (53 %) patients. In 3 % of patients the diagnosis had to be revised or was significantly incomplete. In 30 (16 %) patients, cardiac diagnosis was part of a syndrome. Of the 193 patients, 177 (92 %) were in NYHA class I, with 12 (6 %) and 4 (2 %) patients falling into classes II and III, respectively. A viable transition program can be built by collaboration between paediatric and adult cardiology departments with the same treating physician taking care of patients between 15 and 25 years of age. General practitioners are important in returning lost-to-follow-up patients to specialised care. Nurse practitioners are essential in the care for patients with complex congenital heart disease.

  15. Strategies for more effective monitoring and evaluation systems in HIV programmatic scale-up in resource-limited settings: Implications for health systems strengthening.

    PubMed

    Nash, Denis; Elul, Batya; Rabkin, Miriam; Tun, May; Saito, Suzue; Becker, Mark; Nuwagaba-Biribonwoha, Harriet

    2009-11-01

    Program monitoring and evaluation (M&E) has the potential to be a cornerstone of health systems strengthening and of evidence-informed implementation and scale-up of HIV-related services in resource-limited settings. We discuss common challenges to M&E systems used in the rapid scale-up of HIV services as well as innovations that may have relevance to systems used to monitor, evaluate, and inform health systems strengthening. These include (1) Web-based applications with decentralized data entry and real-time access to summary reporting; (2) timely feedback of information to site and district staff; (3) site-level integration of traditionally siloed program area indicators; (4) longitudinal tracking of program and site characteristics; (5) geographic information systems; and (6) use of routinely collected aggregate data for epidemiologic analysis and operations research. Although conventionally used in the context of vertical programs, these approaches can form a foundation on which data relevant to other health services and systems can be layered, including prevention services, primary care, maternal-child health, and chronic disease management. Guiding principles for sustainable national M&E systems include country-led development and ownership, support for national programs and policies, interoperability, and employment of an open-source approach to software development.

  16. Statistical complex fatigue data for SAE 4340 steel and its use in design by reliability

    NASA Technical Reports Server (NTRS)

    Kececioglu, D.; Smith, J. L.

    1970-01-01

    A brief description of the complex fatigue machines used in the test program is presented. The data generated from these machines are given and discussed. Two methods of obtaining strength distributions from the data are also discussed. Then follows a discussion of the construction of statistical fatigue diagrams and their use in designing by reliability. Finally, some of the problems encountered in the test equipment and a corrective modification are presented.

  17. Object-oriented fault tree evaluation program for quantitative analyses

    NASA Technical Reports Server (NTRS)

    Patterson-Hine, F. A.; Koen, B. V.

    1988-01-01

    Object-oriented programming can be combined with fault tree techniques to give a significantly improved environment for evaluating the safety and reliability of large complex systems for space missions. Deep knowledge about system components and interactions, available from reliability studies and other sources, can be described using objects that make up a knowledge base. This knowledge base can be interrogated throughout the design process, during system testing, and during operation, and can be easily modified to reflect design changes in order to maintain a consistent information source. An object-oriented environment for reliability assessment has been developed on a Texas Instruments (TI) Explorer LISP workstation. The program, which directly evaluates system fault trees, utilizes the object-oriented extension to LISP called Flavors that is available on the Explorer. The object representation of a fault tree facilitates the storage and retrieval of information associated with each event in the tree, including tree structural information and intermediate results obtained during the tree reduction process. Reliability data associated with each basic event are stored in the fault tree objects. The object-oriented environment on the Explorer also includes a graphical tree editor which was modified to display and edit the fault trees.
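
    The object-oriented evaluation idea can be sketched as follows: each event object stores its own reliability data and structural links, and the tree is evaluated by asking the top event for its probability. The original used Flavors on the Explorer; this plain-Python sketch assumes independent basic events:

    ```python
    # Fault-tree events as objects holding their own reliability data
    # and structure; evaluation recurses from the top event down.
    class BasicEvent:
        def __init__(self, name, probability):
            self.name = name
            self.probability = probability

        def prob(self):
            return self.probability

    class Gate:
        def __init__(self, name, kind, children):
            self.name = name
            self.kind = kind          # "AND" or "OR"
            self.children = children  # sub-events stored with the object

        def prob(self):
            ps = [c.prob() for c in self.children]
            out = 1.0
            if self.kind == "AND":    # all children must fail
                for q in ps:
                    out *= q
                return out
            for q in ps:              # OR: 1 - prod(1 - q), independence
                out *= (1.0 - q)
            return 1.0 - out

    top = Gate("system loss", "OR", [
        Gate("redundant pair", "AND",
             [BasicEvent("pump A", 0.01), BasicEvent("pump B", 0.01)]),
        BasicEvent("controller", 0.001),
    ])
    print(top.prob())   # 1 - (1 - 1e-4)(1 - 1e-3) ~ 0.0011
    ```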

  18. Spherical roller bearing analysis. SKF computer program SPHERBEAN. Volume 3: Program correlation with full scale hardware tests

    NASA Technical Reports Server (NTRS)

    Kleckner, R. J.; Rosenlieb, J. W.; Dyba, G.

    1980-01-01

    The results of a series of full-scale hardware tests comparing predictions of the SPHERBEAN computer program with measured data are presented. The SPHERBEAN program predicts the thermomechanical performance characteristics of high-speed lubricated double-row spherical roller bearings. The degree of correlation between performance predicted by SPHERBEAN and measured data is demonstrated. Experimental and calculated performance data are compared over a range of speeds up to 19,400 rpm (0.8 MDN) under pure radial, pure axial, and combined loads.

  19. Follow-up client satisfaction in a supported education program.

    PubMed

    Mowbray, C T; Bybee, D; Collins, M E

    2001-01-01

    Satisfaction data have recently returned to popularity as an outcome measure in managed behavioral healthcare systems. However, there are few examples of management uses of such data. We collected data 12 months after participants had completed a supported education program, concerning their retrospective satisfaction and the barriers, needs, and personal difficulties currently experienced in their attempts to pursue post-secondary education or training. Data on follow-up supportive contacts were also obtained. Results supported participants' continuing satisfaction and identified particular information items endorsed as most helpful. However, the data indicated that personal difficulties presented obstacles to many, and that a majority of participants had current needs for financial aid, tutoring, job placements, support groups, and transportation. Following completion of the supported education program, many participants had continuing contacts in support of their educational plans. The amount of contact was generally low, however. In the future, supported education programs need to build in mechanisms to ensure students receive ongoing support for education, since this support was found to positively and significantly affect individuals' enrolling in college or training.

  20. Slide Set: Reproducible image analysis and batch processing with ImageJ.

    PubMed

    Nanes, Benjamin A

    2015-11-01

    Most imaging studies in the biological sciences rely on analyses that are relatively simple. However, manual repetition of analysis tasks across multiple regions in many images can complicate even the simplest analysis, making record keeping difficult, increasing the potential for error, and limiting reproducibility. While fully automated solutions are necessary for very large data sets, they are sometimes impractical for the small- and medium-sized data sets common in biology. Here we present the Slide Set plugin for ImageJ, which provides a framework for reproducible image analysis and batch processing. Slide Set organizes data into tables, associating image files with regions of interest and other relevant information. Analysis commands are automatically repeated over each image in the data set, and multiple commands can be chained together for more complex analysis tasks. All analysis parameters are saved, ensuring transparency and reproducibility. Slide Set includes a variety of built-in analysis commands and can be easily extended to automate other ImageJ plugins, reducing the manual repetition of image analysis without the set-up effort or programming expertise required for a fully automated solution.
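
    To make the batch pattern concrete, here is a minimal Python sketch of the idea (Slide Set itself is an ImageJ/Java plugin; the file names and the measure() function below are hypothetical placeholders): a data table drives repeated analysis, and the parameters and results are written back out for reproducibility:

```python
import csv

def measure(image_path, roi):
    """Placeholder analysis command; a real one would load the pixels."""
    x, y, w, h = roi
    return {"image": image_path, "roi_area": w * h}

# The data table: each row associates an image file with a region of
# interest, mirroring how Slide Set organizes its inputs.
rows = [
    {"image": "img_001.tif", "roi": (10, 10, 64, 64)},   # hypothetical files
    {"image": "img_002.tif", "roi": (32, 40, 64, 64)},
]

# The same command is repeated automatically over every row...
results = [measure(r["image"], r["roi"]) for r in rows]

# ...and the results are saved, keeping the analysis transparent.
with open("results.csv", "w", newline="") as fh:
    writer = csv.DictWriter(fh, fieldnames=["image", "roi_area"])
    writer.writeheader()
    writer.writerows(results)
```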

  1. How to successfully implement a robotic pediatric surgery program: lessons learned after 96 procedures.

    PubMed

    de Lambert, Guénolée; Fourcade, Laurent; Centi, Joachim; Fredon, Fabien; Braik, Karim; Szwarc, Caroline; Longis, Bernard; Lardy, Hubert

    2013-06-01

    Both our teams were the first to implement pediatric robotic surgery in France. The aim of this study was to define the key points we brought to light so other pediatric teams that want to set up a robotic surgery program will benefit. We reviewed the medical records of all children who underwent robotic surgery between Nov 2007 and June 2011 in both departments, including patient data, installation and changes, operative time, hospital stay, intraoperative complications, and postoperative outcome. The department's internal organization, the organization within the hospital complex, and cost were evaluated. A total of 96 procedures were evaluated. There were 38 girls and 56 boys with average age at surgery of 7.6 years (range, 0.7-18 years) and average weight of 26 kg (range, 6-77 kg). Thirty-six patients had general surgery, 57 patients urologic surgery, and 1 thoracic surgery. Overall average operative time was 189 min (range, 70-550 min), and average hospital stay was 6.4 days (range, 2-24 days). Three procedures were converted. Median follow-up was 18 months (range, 0.5-43 months). Each robotic surgical procedure had an extra cost of 1934 compared to conventional open surgery. Our experience was similar to the findings described in the literature for feasibility, safety, and patient outcomes; we had an overall operative success rate of 97%. Three main actors are concerned in the implementation of a robotic pediatric surgery program: surgeons and anesthetists, nurses, and the administration. The surgeon is at the starting point with motivation for minimally invasive surgery without laparoscopic constraints. We found that it was possible to implement a long-lasting robotic surgery program with comparable quality of care.

  2. A system for programming experiments and for recording and analyzing data automatically

    PubMed Central

    Herrick, Robert M.; Denelsbeck, John S.

    1963-01-01

    A system designed for use in complex operant conditioning experiments is described. Some of its key features are: (a) plugboards that permit the experimenter to change either from one program to another or from one analysis to another in less than a minute, (b) time-sharing of permanently-wired, electronic logic components, (c) recordings suitable for automatic analyses. Included are flow diagrams of the system and sample logic diagrams for programming experiments and for analyzing data. PMID:14055967

  3. Creating Meaningful Change in Education: A Cascading Logic Model. Scaling-Up Brief. Number 6

    ERIC Educational Resources Information Center

    Blase, Karen; Fixsen, Dean; Jackson, Kathleen Ryan

    2015-01-01

    Creating meaningful change in a state's education system from the capitol to the classroom is complex and challenging work. Over the past several decades, considerable research, policy, and funding have focused on the use of evidence-based programs (EBP) in schools. However, these practices only are effective when fully and effectively implemented…

  4. Climbing up the Leaderboard: An Empirical Study of Applying Gamification Techniques to a Computer Programming Class

    ERIC Educational Resources Information Center

    Fotaris, Panagiotis; Mastoras, Theodoros; Leinfellner, Richard; Rosunally, Yasmine

    2016-01-01

    Conventional taught learning practices often experience difficulties in keeping students motivated and engaged. Video games, however, are very successful at sustaining high levels of motivation and engagement through a set of tasks for hours without apparent loss of focus. In addition, gamers solve complex problems within a gaming environment…

  5. Swarmathon 2017

    NASA Image and Video Library

    2017-04-20

    This close-up shows Swarmie robots that were programmed with computer code by college and university students. During the Swarmathon competition at the Kennedy Space Center Visitor Complex, the small robots looked for "resources" in the form of cubes with AprilTags, similar to barcodes. Similar robots could help find resources when astronauts explore distant locations, such as the moon or Mars.

  6. Population-Level Scale-Up of Cervical Cancer Prevention Services in a Low-Resource Setting: Development, Implementation, and Evaluation of the Cervical Cancer Prevention Program in Zambia

    PubMed Central

    Parham, Groesbeck P.; Mwanahamuntu, Mulindi H.; Kapambwe, Sharon; Muwonge, Richard; Bateman, Allen C.; Blevins, Meridith; Chibwesha, Carla J.; Pfaendler, Krista S.; Mudenda, Victor; Shibemba, Aaron L.; Chisele, Samson; Mkumba, Gracilia; Vwalika, Bellington; Hicks, Michael L.; Vermund, Sten H.; Chi, Benjamin H.; Stringer, Jeffrey S. A.; Sankaranarayanan, Rengaswamy; Sahasrabuddhe, Vikrant V.

    2015-01-01

    Background: Very few efforts have been undertaken to scale-up low-cost approaches to cervical cancer prevention in low-resource countries. Methods: In a public sector cervical cancer prevention program in Zambia, nurses provided visual-inspection with acetic acid (VIA) and cryotherapy in clinics co-housed with HIV/AIDS programs, and referred women with complex lesions for histopathologic evaluation. Low-cost technological adaptations were deployed for improving VIA detection, facilitating expert physician opinion, and ensuring quality assurance. Key process and outcome indicators were derived by analyzing electronic medical records to evaluate program expansion efforts. Findings: Between 2006 and 2013, screening services were expanded from 2 to 12 clinics in Lusaka, the most-populous province in Zambia, through which 102,942 women were screened. The majority (71.7%) were in the target age-range of 25-49 years; 28% were HIV-positive. Out of 101,867 with evaluable data, 20,419 (20%) were VIA positive, of whom 11,508 (56.4%) were treated with cryotherapy, and 8,911 (43.6%) were referred for histopathologic evaluation. Most women (87%, 86,301 of 98,961 evaluable) received same-day services (including 5% undergoing same-visit cryotherapy and 82% screening VIA-negative). The proportion of women with cervical intraepithelial neoplasia grade 2 and worse (CIN2+) among those referred for histopathologic evaluation was 44.1% (1,735/3,938 with histopathology results). Detection rates for CIN2+ and invasive cervical cancer were 17 and 7 per 1,000 women screened, respectively. Women with HIV were more likely to screen positive, to be referred for histopathologic evaluation, and to have cervical precancer and cancer than HIV-negative women. Interpretation: We creatively disrupted the 'no screening' status quo prevailing in Zambia and addressed the heavy burden of cervical disease among previously unscreened women by establishing and scaling-up public-sector screening and treatment services at a population level. Key determinants for successful expansion included leveraging HIV/AIDS program investments, and context-specific information technology applications for quality assurance and filling human resource gaps. PMID:25885821

  7. Population-level scale-up of cervical cancer prevention services in a low-resource setting: development, implementation, and evaluation of the cervical cancer prevention program in Zambia.

    PubMed

    Parham, Groesbeck P; Mwanahamuntu, Mulindi H; Kapambwe, Sharon; Muwonge, Richard; Bateman, Allen C; Blevins, Meridith; Chibwesha, Carla J; Pfaendler, Krista S; Mudenda, Victor; Shibemba, Aaron L; Chisele, Samson; Mkumba, Gracilia; Vwalika, Bellington; Hicks, Michael L; Vermund, Sten H; Chi, Benjamin H; Stringer, Jeffrey S A; Sankaranarayanan, Rengaswamy; Sahasrabuddhe, Vikrant V

    2015-01-01

    Very few efforts have been undertaken to scale-up low-cost approaches to cervical cancer prevention in low-resource countries. In a public sector cervical cancer prevention program in Zambia, nurses provided visual-inspection with acetic acid (VIA) and cryotherapy in clinics co-housed with HIV/AIDS programs, and referred women with complex lesions for histopathologic evaluation. Low-cost technological adaptations were deployed for improving VIA detection, facilitating expert physician opinion, and ensuring quality assurance. Key process and outcome indicators were derived by analyzing electronic medical records to evaluate program expansion efforts. Between 2006-2013, screening services were expanded from 2 to 12 clinics in Lusaka, the most-populous province in Zambia, through which 102,942 women were screened. The majority (71.7%) were in the target age-range of 25-49 years; 28% were HIV-positive. Out of 101,867 with evaluable data, 20,419 (20%) were VIA positive, of whom 11,508 (56.4%) were treated with cryotherapy, and 8,911 (43.6%) were referred for histopathologic evaluation. Most women (87%, 86,301 of 98,961 evaluable) received same-day services (including 5% undergoing same-visit cryotherapy and 82% screening VIA-negative). The proportion of women with cervical intraepithelial neoplasia grade 2 and worse (CIN2+) among those referred for histopathologic evaluation was 44.1% (1,735/3,938 with histopathology results). Detection rates for CIN2+ and invasive cervical cancer were 17 and 7 per 1,000 women screened, respectively. Women with HIV were more likely to screen positive, to be referred for histopathologic evaluation, and to have cervical precancer and cancer than HIV-negative women. We creatively disrupted the 'no screening' status quo prevailing in Zambia and addressed the heavy burden of cervical disease among previously unscreened women by establishing and scaling-up public-sector screening and treatment services at a population level. Key determinants for successful expansion included leveraging HIV/AIDS program investments, and context-specific information technology applications for quality assurance and filling human resource gaps.

  8. Decision Support System for Determining Scholarship Selection using an Analytical Hierarchy Process

    NASA Astrophysics Data System (ADS)

    Puspitasari, T. D.; Sari, E. O.; Destarianto, P.; Riskiawan, H. Y.

    2018-01-01

    A decision support system is a computer application that analyzes data and presents it so that users can make decisions more easily. Selecting scholarship recipients, studied here at a senior high school in East Java, was not straightforward: an application was needed to improve the accuracy of targeting prospective beneficiaries among poor students and to speed up the screening process. This research builds a system using the Analytic Hierarchy Process (AHP), a method that decomposes a complex, unstructured problem into its component groups, organizes those groups into a hierarchy, assigns numerical values to human judgments of relative importance, and synthesizes the judgments to determine which elements have the highest priority. The accuracy of the system in this research is 90%.
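
    The AHP computation itself is compact. A hedged numpy sketch with illustrative pairwise judgments (not the paper's data) derives the priority weights from a comparison matrix and checks Saaty's consistency ratio:

```python
import numpy as np

# Pairwise-comparison matrix for three criteria (illustrative judgments):
# entry (i, j) says how much more important criterion i is than j.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 3.0],
              [1/5, 1/3, 1.0]])

# Priority vector: principal eigenvector of A, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency index CI = (lambda_max - n) / (n - 1), divided by Saaty's
# random index (RI = 0.58 for n = 3) to give the consistency ratio.
n = A.shape[0]
CI = (eigvals[k].real - n) / (n - 1)
CR = CI / 0.58
print("weights:", np.round(w, 3), "CR:", round(CR, 3))  # CR < 0.1 is acceptable
```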

  9. Mentally stimulating activities at work during midlife and dementia risk after age 75: follow-up study from the Kungsholmen Project.

    PubMed

    Karp, Anita; Andel, Ross; Parker, Marti G; Wang, Hui-Xin; Winblad, Bengt; Fratiglioni, Laura

    2009-03-01

    Previous research has suggested that mental stimulation in different life periods may protect against dementia or delay disease onset. This study aimed to explore the association between work complexity factors at midlife and dementia risk in late life under the hypothesis that high work complexity may modulate the increased dementia risk due to low education. Population-based follow-up study. Urban. A cohort of 931 nondemented subjects, aged 75+ years from the Kungsholmen Project, Stockholm, examined twice over 6 years. Incident dementia cases were identified using Diagnostic and Statistical Manual of Mental Disorders, 3rd-Edition Revised criteria. Primary occupations were assigned into categories according to the Nordic Occupational Classification and matched to the 1970 U.S. Census to score the level of work complexity with data, people, and things by using a preformed matrix. Lower dementia risk was associated with complexity of work with both data (age and gender adjusted relative risk [aRR]: 0.85, 95% confidence interval [CI]: 0.75-0.95) and with people (aRR: 0.88, 95% CI: 0.80-0.97). Adjusting for education led to similar results, although no longer statistically significant. Further, the highest degrees of complexity of work with data that involves analyzing, coordinating, and synthesizing data were associated with lower dementia risk even among lower educated subjects (relative risk: 0.52, 95% CI: 0.29-0.95). No gender differences were detected. This study suggests that work complexity with data and people is related to lower risk of dementia and that the highest levels of work complexity may modulate the higher dementia risk due to low education.

  10. A Study of College Students' Construct of Parameter Passing Implications for Instruction.

    ERIC Educational Resources Information Center

    Madison, Sandra Kay

    Parameter passing is the mechanism by which various program modules share information in a complex program; this paper reports a study of novice programmers' understanding of the parameter construct. The bulk of the data was collected from interviews with eight college students enrolled in a state university introductory computer programming course.…
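
    For readers outside the study, the construct in question can be shown in a few lines. In Python, for instance, an argument is an object reference passed by value, so mutating a passed object is visible to the caller while rebinding the parameter name is not:

```python
def mutate(items):
    items.append(99)      # mutates the caller's object

def rebind(items):
    items = [99]          # rebinds the local name only

data = [1, 2, 3]
mutate(data)
print(data)               # [1, 2, 3, 99]  -- the shared object was changed
rebind(data)
print(data)               # [1, 2, 3, 99]  -- the caller is unaffected
```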

  11. Moving bed reactor setup to study complex gas-solid reactions.

    PubMed

    Gupta, Puneet; Velazquez-Vargas, Luis G; Valentine, Charles; Fan, Liang-Shih

    2007-08-01

    A moving bed scale reactor setup for studying complex gas-solid reactions has been designed in order to obtain kinetic data for scale-up purposes. In this bench scale reactor setup, gas and solid reactants can be contacted in a cocurrent and countercurrent manner at high temperatures. Gas and solid sampling can be performed through the reactor bed with their composition profiles determined at steady state. The reactor setup can be used to evaluate and corroborate model parameters accounting for intrinsic reaction rates in both simple and complex gas-solid reaction systems. The moving bed design allows experimentation over a variety of gas and solid compositions in a single experiment unlike differential bed reactors where the gas composition is usually fixed. The data obtained from the reactor can also be used for direct scale-up of designs for moving bed reactors.

  12. Habitat Complexity Metrics to Guide Restoration of Large Rivers

    NASA Astrophysics Data System (ADS)

    Jacobson, R. B.; McElroy, B. J.; Elliott, C.; DeLonay, A.

    2011-12-01

    Restoration strategies on large, channelized rivers typically strive to recover lost habitat complexity, based on the assumption that complexity and biophysical capacity are directly related. Although definition of links between complexity and biotic responses can be tenuous, complexity metrics have appeal because of their potential utility in quantifying habitat quality, defining reference conditions and design criteria, and measuring restoration progress. Hydroacoustic instruments provide many ways to measure complexity on large rivers, yet substantive questions remain about variables and scale of complexity that are meaningful to biota, and how complexity can be measured and monitored cost effectively. We explore these issues on the Missouri River, using the example of channel re-engineering projects that are intended to aid in recovery of the pallid sturgeon, an endangered benthic fish. We are refining understanding of what habitat complexity means for adult fish by combining hydroacoustic habitat assessments with acoustic telemetry to map locations during reproductive migrations and spawning. These data indicate that migrating sturgeon select points with relatively low velocity but adjacent to areas of high velocity (that is, with high velocity gradients); the integration of points defines pathways which minimize energy expenditures during upstream migrations of 10's to 100's of km. Complexity metrics that efficiently quantify migration potential at the reach scale are therefore directly relevant to channel restoration strategies. We are also exploring complexity as it relates to larval sturgeon dispersal. Larvae may drift for as many as 17 days (100's of km at mean velocities) before using up their yolk sac, after which they "settle" into habitats where they initiate feeding. An assumption underlying channel re-engineering is that additional channel complexity, specifically increased shallow, slow water, is necessary for early feeding and refugia. Development of complexity metrics is complicated by the fact that characteristics of channel morphology may increase complexity scores without necessarily increasing biophysical capacity for target species. For example, a cross section that samples depths and velocities across the thalweg (navigation channel) and into lentic habitat may score high on most measures of hydraulic or geomorphic complexity, but does not necessarily provide habitats beneficial to native species. Complexity measures need to be bounded by best estimates of native species requirements. In the absence of specific information, creation of habitat complexity for the sake of complexity may lead to unintended consequences, for example, lentic habitats that increase a complexity score but support invasive species. An additional practical constraint on complexity measures is the need to develop metrics that can be deployed cost-effectively in an operational monitoring program. Design of a monitoring program requires informed choices of measurement variables, definition of reference sites, and design of sampling effort to capture spatial and temporal variability.
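
    One way such a metric could be operationalized is sketched below, assuming a gridded velocity field; the synthetic data and thresholds are illustrative placeholders, not values from the Missouri River study. The score is the share of low-velocity cells that sit in high velocity-gradient neighborhoods, the kind of point the telemetry suggests migrating sturgeon select:

```python
import numpy as np

# Hypothetical depth-averaged velocity magnitudes on a 5 m grid (m/s).
rng = np.random.default_rng(0)
v = np.clip(rng.normal(1.0, 0.4, size=(200, 400)), 0.0, None)

dv_dy, dv_dx = np.gradient(v, 5.0)       # finite differences, 5 m spacing
grad_mag = np.hypot(dv_dx, dv_dy)        # |grad v| in 1/s

low_v = v < 0.5                                       # candidate resting cells
high_grad = grad_mag > np.percentile(grad_mag, 90)    # strong shear nearby
metric = np.mean(low_v & high_grad)      # reach-scale "migration potential"
print(f"low-velocity/high-gradient fraction: {metric:.4f}")
```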

  13. Enhancing the educational achievement of at-risk youth.

    PubMed

    Schinke, S P; Cole, K C; Poulin, S R

    2000-03-01

    This study examined a non-school program aimed at enhancing the educational performance of economically disadvantaged early adolescents who live in public housing. The educational enhancement program included discussions with adults, writing activities, leisure reading, homework, helping others, and games using cognitive skills. A three-arm research design juxtaposed program youth who received educational enhancements with comparison youth in affiliated facilities who did not receive the program and with control youth in other community programs without educational enhancements. Follow-up data collected from youths 2 1/2 years after baseline revealed uniformly positive outcomes for program youth on measures of reading, verbal skills, writing, and tutoring. Teacher reports at final follow-up favored program and comparison youth over controls on measures of reading, writing, games, overall school performance, and interest in class material. School grades were higher for program youth than for comparison and control youth for reading, spelling, history, science, and social studies. Overall grade averages were higher for program youth versus comparisons and controls, as was school attendance. Study data lend empirical support to the provision of educational enhancements in non-school settings for at-risk youths.

  14. Identification of human microRNA targets from isolated argonaute protein complexes.

    PubMed

    Beitzinger, Michaela; Peters, Lasse; Zhu, Jia Yun; Kremmer, Elisabeth; Meister, Gunter

    2007-06-01

    MicroRNAs (miRNAs) constitute a class of small non-coding RNAs that regulate gene expression on the level of translation and/or mRNA stability. Mammalian miRNAs associate with members of the Argonaute (Ago) protein family and bind to partially complementary sequences in the 3' untranslated region (UTR) of specific target mRNAs. Computer algorithms based on factors such as free binding energy or sequence conservation have been used to predict miRNA target mRNAs. Based on such predictions, up to one third of all mammalian mRNAs seem to be under miRNA regulation. However, due to the low degree of complementarity between the miRNA and its target, such computer programs are often imprecise and therefore not very reliable. Here we report the first biochemical approach to identifying miRNA targets from human cells. Using highly specific monoclonal antibodies against members of the Ago protein family, we co-immunoprecipitate Ago-bound mRNAs and identify them by cloning. Interestingly, most of the identified targets are also predicted by different computer programs. Moreover, we randomly analyzed six different target candidates and were able to experimentally validate five as miRNA targets. Our data clearly indicate that miRNA targets can be experimentally identified from Ago complexes and therefore provide a new tool to directly analyze miRNA function.

  15. Managing Complexity - Developing the Node Control Software For The International Space Station

    NASA Technical Reports Server (NTRS)

    Wood, Donald B.

    2000-01-01

    On December 4th, 1998 at 3:36 AM STS-88 (the space shuttle Endeavour) was launched with the "Node 1 Unity Module" in its payload bay. After working on the Space Station program for a very long time, that launch was one of the most beautiful sights I had ever seen! As the Shuttle proceeded to rendezvous with the Russian American module known as Zarya, I returned to Houston quickly to start monitoring the activation of the software I had spent the last 3 years working on. The FGB module (also known as "Zarya") was grappled by the shuttle robotic arm, and connected to the Unity module. Crewmembers then hooked up the power and data connections between Zarya and Unity. On December 7th, 1998 at 9:49 PM CST the Node Control Software was activated. On December 15th, 1998, the Node-1/Zarya "cornerstone" of the International Space Station was left on-orbit. The Node Control Software (NCS) is the first software flown by NASA for the International Space Station (ISS). The ISS Program is considered the most complex international engineering effort ever undertaken. At last count some 18 countries are active partners in this global venture. NCS has performed all of its intended functions on orbit, over 200 miles above us. I'll be describing how we built the NCS software.

  16. Neonatal Information System Using an Interactive Microcomputer Data Base Management Program

    PubMed Central

    Engelke, Stephen C.; Paulette, Ed W.; Kopelman, Arthur E.

    1981-01-01

    A low cost, interactive microcomputer data base management system is presented which is being used in a neonatal follow-up program at the East Carolina University School of Medicine. The features and flexibility of the system could be applied to a variety of medical care settings.

  17. A comprehensive defect data bank for no. 2 common oak lumber

    Treesearch

    Edwin L. Lucas; Leathern R.R. Catron

    1973-01-01

    Computer simulation of rough mill cut-up operations allows low-cost evaluation of furniture rough mill cut-up procedures. The defect data bank serves as input to such simulation programs. The data bank contains a detailed accounting of defect data taken from 637 No. 2 Common oak boards. Included is a description of each defect (location, size, and type), as well as the...

  18. Ad Hoc modeling, expert problem solving, and R&T program evaluation

    NASA Technical Reports Server (NTRS)

    Silverman, B. G.; Liebowitz, J.; Moustakis, V. S.

    1983-01-01

    A simplified cost and time (SCAT) analysis program utilizing personal-computer technology is presented and demonstrated in the case of the NASA-Goddard end-to-end data system. The difficulties encountered in implementing complex program-selection and evaluation models in the research and technology field are outlined. The prototype SCAT system described here is designed to allow user-friendly ad hoc modeling in real time and at low cost. A worksheet constructed on the computer screen displays the critical parameters and shows how each is affected when one is altered experimentally. In the NASA case, satellite data-output and control requirements, ground-facility data-handling capabilities, and project priorities are intricately interrelated. Scenario studies of the effects of spacecraft phaseout or new spacecraft on throughput and delay parameters are shown. The use of a network of personal computers for higher-level coordination of decision-making processes is suggested, as a complement or alternative to complex large-scale modeling.

  19. Smoking cessation and its predictors: results from a community-based pharmacy tobacco cessation program in New Mexico.

    PubMed

    Khan, Nasreen; Anderson, Joe R; Du, Juan; Tinker, Dale; Bachyrycz, Amy M; Namdar, Rocsanna

    2012-09-01

    The New Mexico Pharmaceutical Care Foundation received funding through the Tobacco Use Prevention and Control Program (TUPAC) to provide support for pharmacist-delivered tobacco cessation services. The goal of the program was to increase the availability of tobacco cessation services to residents of New Mexico. Program outcomes are presented using data from the first 2 fiscal years. The objectives were to assess tobacco quit rates among smokers who participated in the community pharmacist-based program and to identify the predictors of quitting at the end of the 6-month program. Pharmacists, who had received Rx for Change training, provided tobacco cessation services. Patients were scheduled for an initial visit and then were seen at regularly scheduled follow-up visits at 1 month, 3 months, and 6 months from the initial visit. Data collected at the initial visit included demographics, smoking history, and readiness for quitting. Smoking status was collected at each of the follow-up visits. Data were analyzed using SAS (SAS Institute) and STATA (StataCorp LP) statistical software. Tobacco quit rates were calculated at 1, 3, and 6 months. Multivariate regression analysis was performed to assess predictors of quitting. Standard errors were adjusted for repeated observations. Data were available for 346 participants. The average quit rate at the end of 6 months was 25%. Significant predictors of quitting were high baseline confidence in quitting, smoking one's first cigarette at least 30 minutes after waking, being on a first cessation attempt, and nonwhite race. A smoking cessation program delivered through trained community pharmacists with prescriptive authority is an effective approach to reducing smoking. Further research should be conducted to compare the effectiveness of pharmacists with that of other providers of tobacco cessation services.

  20. DISTILLER: a data integration framework to reveal condition dependency of complex regulons in Escherichia coli

    PubMed Central

    Lemmens, Karen; De Bie, Tijl; Dhollander, Thomas; De Keersmaecker, Sigrid C; Thijs, Inge M; Schoofs, Geert; De Weerdt, Ami; De Moor, Bart; Vanderleyden, Jos; Collado-Vides, Julio; Engelen, Kristof; Marchal, Kathleen

    2009-01-01

    We present DISTILLER, a data integration framework for the inference of transcriptional module networks. Experimental validation of predicted targets for the well-studied fumarate nitrate reductase regulator showed the effectiveness of our approach in Escherichia coli. In addition, the condition dependency and modularity of the inferred transcriptional network was studied. Surprisingly, the level of regulatory complexity seemed lower than that which would be expected from RegulonDB, indicating that complex regulatory programs tend to decrease the degree of modularity. PMID:19265557

  1. Analytics that Inform the University: Using Data You Already Have

    ERIC Educational Resources Information Center

    Dziuban, Charles; Moskal, Patsy; Cavanagh, Thomas; Watts, Andre

    2012-01-01

    The authors describe the University of Central Florida's top-down/bottom-up action analytics approach to using data to inform decision-making. The top-down approach utilizes information about programs, modalities, and college implementation of Web initiatives. The bottom-up approach continuously monitors…

  2. Computer Programs for Library Operations; Results of a Survey Conducted Between Fall 1971 and Spring 1972.

    ERIC Educational Resources Information Center

    Liberman, Eva; And Others

    Many library operations involving large data banks lend themselves readily to computer operation. In setting up library computer programs, in changing or expanding programs, cost in programming and time delays could be substantially reduced if the programmers had access to library computer programs being used by other libraries, providing similar…

  3. Polyglot Programming in Applications Used for Genetic Data Analysis

    PubMed Central

    Nowak, Robert M.

    2014-01-01

    Applications used for the analysis of genetic data process large volumes of data with complex algorithms. High performance, flexibility, and a user interface with a web browser are required by these solutions, which can be achieved by using multiple programming languages. In this study, I developed a freely available framework for building software to analyze genetic data, which uses C++, Python, JavaScript, and several libraries. This system was used to build a number of genetic data processing applications and it reduced the time and costs of development. PMID:25197633

  4. Polyglot programming in applications used for genetic data analysis.

    PubMed

    Nowak, Robert M

    2014-01-01

    Applications used for the analysis of genetic data process large volumes of data with complex algorithms. High performance, flexibility, and a user interface with a web browser are required by these solutions, which can be achieved by using multiple programming languages. In this study, I developed a freely available framework for building software to analyze genetic data, which uses C++, Python, JavaScript, and several libraries. This system was used to build a number of genetic data processing applications and it reduced the time and costs of development.
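
    The author's framework is not a public API we can assume, but the polyglot principle it rests on is easy to demonstrate: an interpreted language as glue, compiled code for the numeric kernel. The sketch below has Python call the C math library through ctypes (the library lookup is platform-dependent and typically fails on Windows):

```python
import ctypes
import ctypes.util

# Locate and load the C runtime math library (Linux/macOS).
libm = ctypes.CDLL(ctypes.util.find_library("m"))
libm.erf.argtypes = [ctypes.c_double]   # declare the C signature for safety
libm.erf.restype = ctypes.c_double

# Python supplies the interface; the compiled C code does the arithmetic.
print(libm.erf(1.0))    # ~0.8427, computed in C, printed by Python
```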

  5. Improving Risk Management and Resiliency: A Plan for a Proactive National Policy on Insurance Practices in FEMA’s Public Assistance Program

    DTIC Science & Technology

    2013-12-01

    Insurance is a complex industry, which is a large component of the U.S. economy. This report examines the role of insurance in FEMA's Public Assistance program, comparing declaration criteria across disasters and drawing on sources including the Canadian Disaster Database and www.fema.gov. The guidance provided in 44 CFR has not kept up with the industry.

  6. Development and Pilot Testing of a Standardized Training Program for a Patient-Mentoring Intervention to Increase Adherence to Outpatient HIV Care

    PubMed Central

    Mignogna, Joseph; Stanley, Melinda A.; Davila, Jessica; Wear, Jackie; Amico, K. Rivet; Giordano, Thomas P.

    2012-01-01

    Although peer interventionists have been successful in medication treatment-adherence interventions, their role in complex behavior-change approaches to promote entry and reentry into HIV care requires further investigation. The current study sought to describe and test the feasibility of a standardized peer-mentor training program used for MAPPS (Mentor Approach for Promoting Patient Self-Care), a study designed to increase engagement and attendance at HIV outpatient visits among high-risk HIV inpatients using HIV-positive peer interventionists to deliver a comprehensive behavioral change intervention. Development of MAPPS and its corresponding training program included collaborations with mentors from a standing outpatient mentor program. The final training program included (1) a half-day workshop; (2) practice role-plays; and (3) formal, standardized patient role-plays, using trained actors with "real-time" video observation (and ratings from trainers). Mentor training occurred over a 6-week period and required demonstration of adherence and skill, as rated by MAPPS trainers. Although time intensive, ultimate certification of mentors suggested the program was both feasible and effective. Survey data indicated mentors thought highly of the training program, while objective rating data from trainers indicated mentors were able to understand and display standards associated with intervention fidelity. Data from the MAPPS training program provide preliminary evidence that peer mentors can be trained to levels necessary to ensure intervention fidelity, even within moderately complex behavioral-change interventions. Although additional research is needed due to limitations of the current study (e.g., limited generalizability due to sample size and limited breadth of clinical training opportunities), data from the current trial suggest that training programs such as MAPPS appear both feasible and effective. PMID:22248331

  7. A mobile phone-based, community health worker program for referral, follow-up, and service outreach in rural Zambia: outcomes and overview.

    PubMed

    Schuttner, Linnaea; Sindano, Ntazana; Theis, Mathew; Zue, Cory; Joseph, Jessica; Chilengi, Roma; Chi, Benjamin H; Stringer, Jeffrey S A; Chintu, Namwinga

    2014-08-01

    Mobile health (m-health) utilizes widespread access to mobile phone technologies to expand health services. Community health workers (CHWs) provide first-level contact with health facilities; combining CHW efforts with m-health may be an avenue for improving primary care services. As part of a primary care improvement project, a pilot CHW program was developed using a mobile phone-based application for outreach, referral, and follow-up between the clinic and community in rural Zambia. The program was implemented at six primary care sites. Computers were installed at clinics for data entry, and data were transmitted to central servers. In the field, using a mobile phone to send data and receive follow-up requests, CHWs conducted household health surveillance visits, referred individuals to clinic, and followed up clinic patients. From January to April 2011, 24 CHWs surveyed 6,197 households with 33,304 inhabitants. Of 15,539 clinic visits, 1,173 (8%) had a follow-up visit indicated and transmitted via a mobile phone to designated CHWs. CHWs performed one or more follow-ups on 74% (n=871) of active requests and obtained outcomes on 63% (n=741). From all community visits combined, CHWs referred 840 individuals to a clinic. CHWs completed all planned aspects of surveillance and outreach, demonstrating feasibility. Components of this pilot project may aid clinical care in rural settings and have potential for epidemiologic and health system applications. Thus, m-health has the potential to improve service outreach, guide activities, and facilitate data collection in Zambia.

  8. STEPS: Modeling and Simulating Complex Reaction-Diffusion Systems with Python

    PubMed Central

    Wils, Stefan; Schutter, Erik De

    2008-01-01

    We describe how the use of the Python language improved the user interface of the program STEPS. STEPS is a simulation platform for modeling and stochastic simulation of coupled reaction-diffusion systems with complex 3-dimensional boundary conditions. Setting up such models is a complicated process that consists of many phases. Initial versions of STEPS relied on a static input format that did not cleanly separate these phases, limiting modelers in how they could control the simulation and becoming increasingly complex as new features and new simulation algorithms were added. We solved all of these problems by tightly integrating STEPS with Python, using SWIG to expose our existing simulation code. PMID:19623245
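
    As a hedged, much-simplified illustration of the stochastic half of such a platform (not the STEPS API), Gillespie's direct method for a single reversible reaction in one well-mixed volume fits in a short Python function; the rate constants and molecule counts are invented:

```python
import random

def gillespie(a, b, c, kf, kr, t_end):
    """Stochastic simulation of A + B <-> C by Gillespie's direct method."""
    t = 0.0
    while t < t_end:
        rate_f = kf * a * b              # propensity of A + B -> C
        rate_r = kr * c                  # propensity of C -> A + B
        total = rate_f + rate_r
        if total == 0:
            break
        t += random.expovariate(total)           # time to the next event
        if random.random() < rate_f / total:     # choose which reaction fires
            a, b, c = a - 1, b - 1, c + 1
        else:
            a, b, c = a + 1, b + 1, c - 1
    return a, b, c

random.seed(1)
print(gillespie(a=100, b=80, c=0, kf=0.001, kr=0.05, t_end=10.0))
```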

  9. Design and implementation of population-based specialty care programs.

    PubMed

    Botts, Sheila R; Gee, Michael T; Chang, Christopher C; Young, Iris; Saito, Logan; Lyman, Alfred E

    2017-09-15

    The development, implementation, and scaling of 3 population-based specialty care programs in a large integrated healthcare system are reviewed, and the role of clinical pharmacy services in ensuring safe, effective, and affordable care is highlighted. The Kaiser Permanente (KP) integrated healthcare delivery model allows for rapid development and expansion of innovative population management programs involving pharmacy services. Clinical pharmacists have assumed integral roles in improving the safety and effectiveness of high-complexity, high-cost care for specialty populations. These roles require an appropriate practice scope and are supported by an advanced electronic health record with disease registries and electronic surveillance tools for care-gap identification. The 3 specialty population programs described were implemented to address variation or unrecognized gaps in care for at-risk specialty populations. The Home Phototherapy Program has leveraged internal partnerships with clinical pharmacists to improve access to cost-effective nonpharmacologic interventions for psoriasis and other skin disorders. The Multiple Sclerosis Care Program has incorporated clinical pharmacists into neurology care in order to apply clinical guidelines in a systematic manner. The KP SureNet program has used clinical pharmacists and data analytics to identify opportunities to prevent drug-related adverse outcomes and ensure timely follow-up. Specialty care programs improve quality, cost outcomes, and the patient experience by appropriating resources to provide systematic and targeted care to high-risk patients. KP leverages an integration of people, processes, and technology to develop and scale population-based specialty care.

  10. BIREFRINGENT FILTER MODEL

    NASA Technical Reports Server (NTRS)

    Cross, P. L.

    1994-01-01

    Birefringent filters are often used as line-narrowing components in solid state lasers. The Birefringent Filter Model program generates a stand-alone model of a birefringent filter for use in designing and analyzing a birefringent filter. It was originally developed to aid in the design of solid state lasers to be used on aircraft or spacecraft to perform remote sensing of the atmosphere. The model is general enough to allow the user to address problems such as temperature stability requirements, manufacturing tolerances, and alignment tolerances. The input parameters for the program are divided into 7 groups: 1) general parameters which refer to all elements of the filter; 2) wavelength related parameters; 3) filter, coating and orientation parameters; 4) input ray parameters; 5) output device specifications; 6) component related parameters; and 7) transmission profile parameters. The program can analyze a birefringent filter with up to 12 different components, and can calculate the transmission and summary parameters for multiple passes as well as a single pass through the filter. The Jones matrix, which is calculated from the input parameters of Groups 1 through 4, is used to calculate the transmission. Output files containing the calculated transmission or the calculated Jones matrix as a function of wavelength can be created. These output files can then be used as inputs for user-written programs, for example, programs that plot the transmission or that calculate the eigen-transmittances and the corresponding eigen-polarizations of the Jones matrix. The Birefringent Filter Model is written in Microsoft FORTRAN 2.0. The program format is interactive. It was developed on an IBM PC XT equipped with an 8087 math coprocessor, and has a central memory requirement of approximately 154K. Since Microsoft FORTRAN 2.0 does not support complex arithmetic, matrix routines for addition, subtraction, and multiplication of complex, double precision variables are included. The Birefringent Filter Model was written in 1987.
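
    The core of such a model is small. A numpy sketch of the Jones-matrix calculation, assuming one birefringent plate at 45 degrees between parallel horizontal polarizers (illustrative plate parameters, not values from the program), reproduces the classic single-stage transmission cos^2(pi * dn * d / lambda):

```python
import numpy as np

def retarder(delta, theta):
    """Jones matrix of a linear retarder: retardance delta, fast axis theta."""
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s], [s, c]])                      # axis rotation
    D = np.array([[np.exp(-1j * delta / 2), 0],
                  [0, np.exp(1j * delta / 2)]])          # phase retardance
    return R @ D @ R.T                                   # R.T = R(-theta)

P = np.array([[1, 0], [0, 0]])        # horizontal linear polarizer

dn_d = 50e-6                          # birefringence * thickness (m)
for lam in (1.000e-6, 1.005e-6, 1.010e-6):   # wavelengths (m)
    delta = 2 * np.pi * dn_d / lam           # retardance at this wavelength
    E_out = P @ retarder(delta, np.pi / 4) @ P @ np.array([1.0, 0.0])
    T = np.abs(E_out[0]) ** 2
    # The matrix result should match the closed form cos^2(delta / 2).
    print(f"lambda = {lam*1e9:7.1f} nm  T = {T:.4f}  "
          f"cos^2 = {np.cos(delta / 2)**2:.4f}")
```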

  11. The FORTRAN static source code analyzer program (SAP) user's guide, revision 1

    NASA Technical Reports Server (NTRS)

    Decker, W.; Taylor, W.; Eslinger, S.

    1982-01-01

    The FORTRAN Static Source Code Analyzer Program (SAP) User's Guide (Revision 1) is presented. SAP is a software tool designed to assist Software Engineering Laboratory (SEL) personnel in conducting studies of FORTRAN programs. SAP scans FORTRAN source code and produces reports that present statistics and measures of statements and structures that make up a module. This document is a revision of the previous SAP user's guide, Computer Sciences Corporation document CSC/TM-78/6045. SAP Revision 1 is the result of program modifications to provide several new reports, additional complexity analysis, and recognition of all statements described in the FORTRAN 77 standard. This document provides instructions for operating SAP and contains information useful in interpreting SAP output.

  12. Computer Program for Calculation of Complex Chemical Equilibrium Compositions and Applications. II. Users Manual and Program Description

    NASA Technical Reports Server (NTRS)

    McBride, Bonnie J.; Gordon, Sanford

    1996-01-01

    This users manual is the second part of a two-part report describing the NASA Lewis CEA (Chemical Equilibrium with Applications) program. The program obtains chemical equilibrium compositions of complex mixtures with applications to several types of problems. The topics presented in this manual are: (1) details for preparing input data sets; (2) a description of output tables for various types of problems; (3) the overall modular organization of the program with information on how to make modifications; (4) a description of the function of each subroutine; (5) error messages and their significance; and (6) a number of examples that illustrate various types of problems handled by CEA and that cover many of the options available in both input and output. Seven appendixes give information on the thermodynamic and thermal transport data used in CEA; some information on common variables used in or generated by the equilibrium module; and output tables for 14 example problems. The CEA program was written in ANSI standard FORTRAN 77. CEA should work on any system with sufficient storage. There are about 6300 lines in the source code, which uses about 225 kilobytes of memory. The compiled program takes about 975 kilobytes.
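
    CEA itself minimizes free energy over many species; a toy instance of the same idea (not CEA's algorithm or data formats) finds the room-temperature equilibrium of N2O4 <-> 2 NO2 at 1 bar by minimizing the total Gibbs energy over the extent of reaction, using textbook formation energies as illustrative inputs:

```python
import numpy as np
from scipy.optimize import minimize_scalar

R, T = 8.314, 298.15                  # J/(mol K), K
g_n2o4, g_no2 = 97.9e3, 51.3e3        # standard Gibbs of formation (J/mol)

def total_gibbs(xi):
    """G of the mixture starting from 1 mol N2O4, extent of reaction xi."""
    n = np.array([1.0 - xi, 2.0 * xi])        # moles of N2O4, NO2
    g0 = np.array([g_n2o4, g_no2])
    x = n / n.sum()                           # mole fractions (P = 1 bar)
    return float(np.sum(n * (g0 + R * T * np.log(x))))

res = minimize_scalar(total_gibbs, bounds=(1e-9, 1 - 1e-9), method="bounded")
xi = res.x
# Expect xi ~ 0.19, NO2 mole fraction ~ 0.32, consistent with Kp ~ 0.15.
print(f"extent = {xi:.3f}, mole fraction NO2 = {2 * xi / (1 + xi):.3f}")
```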

  13. G.A.M.E.: GPU-accelerated mixture elucidator.

    PubMed

    Schurz, Alioune; Su, Bo-Han; Tu, Yi-Shu; Lu, Tony Tsung-Yu; Lin, Olivia A; Tseng, Yufeng J

    2017-09-15

    GPU acceleration is useful in solving complex chemical information problems. Identifying unknown structures from the mass spectra of natural product mixtures has been a desirable yet unresolved issue in metabolomics. However, this elucidation process has been hampered by complex experimental data and the inability of instruments to completely separate different compounds. Fortunately, with current high-resolution mass spectrometry, one feasible strategy is to define this problem as extending a scaffold database with sidechains of different probabilities to match the high-resolution mass obtained from a high-resolution mass spectrum. By introducing a dynamic programming (DP) algorithm, it is possible to solve this NP-complete problem in pseudo-polynomial time. However, the running time of the DP algorithm grows by orders of magnitude as the number of mass decimal digits increases, thus limiting the boost in structural prediction capabilities. By harnessing the heavily parallel architecture of modern GPUs, we designed a "compute unified device architecture" (CUDA)-based GPU-accelerated mixture elucidator (G.A.M.E.) that considerably improves the performance of the DP, allowing up to five decimal digits for input mass data. As exemplified by four testing datasets with verified constitutions from natural products, G.A.M.E. allows for efficient and automatic structural elucidation of unknown mixtures for practical procedures.
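
    The DP at the heart of this strategy is a pseudo-polynomial subset-sum over integerized masses. The sketch below (invented masses, not G.A.M.E.'s code or data) shows why the table, and hence the running time, grows by an order of magnitude for each extra decimal digit retained:

```python
SCALE = 10 ** 3                       # keep 3 decimal digits of mass

def reachable(target, sidechains):
    """Unbounded subset-sum over integerized masses (coin-change style DP)."""
    t = round(target * SCALE)         # table size scales with SCALE
    dp = [False] * (t + 1)
    dp[0] = True
    for m in (round(s * SCALE) for s in sidechains):
        for total in range(m, t + 1):         # unbounded: sidechains may recur
            if dp[total - m]:
                dp[total] = True
    return dp[t]

observed, scaffold = 353.138, 250.052           # hypothetical masses (Da)
sidechains = [15.023, 31.042, 57.021, 71.037]   # hypothetical sidechain masses
print(reachable(observed - scaffold, sidechains))   # True
```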

  14. Estimating School Efficiency: A Comparison of Methods Using Simulated Data.

    ERIC Educational Resources Information Center

    Bifulco, Robert; Bretschneider, Stuart

    2001-01-01

    Uses simulated data to assess the adequacy of two econometric and linear-programming techniques (data-envelopment analysis and corrected ordinary least squares) for measuring performance-based school reform. In complex data sets (simulated to contain measurement error and endogeneity), these methods are inadequate efficiency measures. (Contains 40…
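
    Of the two techniques, data envelopment analysis reduces to one linear program per decision-making unit. A hedged sketch with invented single-input, single-output school data (an input-oriented CCR model solved with scipy's linprog):

```python
import numpy as np
from scipy.optimize import linprog

X = np.array([[5.0], [8.0], [6.0]])     # inputs: one per school (invented)
Y = np.array([[60.0], [90.0], [50.0]])  # outputs: one per school (invented)

def dea_efficiency(k):
    """Input-oriented CCR efficiency of unit k (variables: theta, lambdas)."""
    n = X.shape[0]
    c = np.r_[1.0, np.zeros(n)]                        # minimize theta
    # sum_j lambda_j x_j <= theta * x_k   (peer inputs within scaled inputs)
    A_in = np.hstack([-X[k].reshape(-1, 1), X.T])
    # sum_j lambda_j y_j >= y_k           (peer outputs cover unit's outputs)
    A_out = np.hstack([np.zeros((Y.shape[1], 1)), -Y.T])
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(X.shape[1]), -Y[k]],
                  bounds=[(0, None)] * (n + 1))
    return res.fun

for k in range(3):
    print(f"school {k}: efficiency = {dea_efficiency(k):.3f}")
# school 0 defines the frontier (1.000); schools 1 and 2 score below 1.
```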

  15. Review of 1953-2003 ORAU Follow-Up Studies on Science Education Programs: Impacts on Participants' Education and Careers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oak Ridge Associated Universities

    2006-06-01

    Through sponsorship of science education programs for undergraduates and graduates, such as research participation programs and fellowships, the Department of Energy (DOE) encouraged the development of adequate numbers of qualified science and engineering (S&E) personnel to meet its current and future research and development (R&D) needs. This retrospective study summarizes impacts of selected programs on these participants. The summary data are from follow-up studies conducted from 1953 through 2003 by Oak Ridge Associated Universities and its predecessor, the Oak Ridge Institute for Nuclear Studies (ORINS).

  16. Some programming techniques for increasing program versatility and efficiency on CDC equipment

    NASA Technical Reports Server (NTRS)

    Tiffany, S. H.; Newsom, J. R.

    1978-01-01

    Five programming techniques used to decrease core storage requirements and increase program versatility and efficiency are explained. The techniques are: (1) dynamic storage allocation, (2) automatic core-sizing and core-resizing, (3) matrix partitioning, (4) free field alphanumeric reads, and (5) incorporation of a data complex. The advantages of these techniques and the basic methods for employing them are explained and illustrated. Several actual program applications which utilize these techniques are described as examples.
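
    Matrix partitioning, the third technique, is the easiest to show out of context: a large product is formed partition by partition, so only a few blocks need be resident in core at once. A small numpy sketch (sizes chosen for readability, not realism):

```python
import numpy as np

def blocked_matmul(A, B, bs):
    """C = A @ B computed over bs x bs partitions of the operands."""
    n, m = A.shape[0], B.shape[1]
    C = np.zeros((n, m))
    for i in range(0, n, bs):
        for j in range(0, m, bs):
            for k in range(0, A.shape[1], bs):
                # Only three partitions are touched at each step.
                C[i:i+bs, j:j+bs] += A[i:i+bs, k:k+bs] @ B[k:k+bs, j:j+bs]
    return C

rng = np.random.default_rng(42)
A, B = rng.random((6, 4)), rng.random((4, 8))
assert np.allclose(blocked_matmul(A, B, bs=2), A @ B)   # matches direct product
```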

  17. detectIR: a novel program for detecting perfect and imperfect inverted repeats using complex numbers and vector calculation.

    PubMed

    Ye, Congting; Ji, Guoli; Li, Lei; Liang, Chun

    2014-01-01

    Inverted repeats are present in abundance in both prokaryotic and eukaryotic genomes and can form DNA secondary structures: hairpins and cruciforms, which are involved in many important biological processes. Bioinformatics tools for efficient and accurate detection of inverted repeats are desirable, because existing tools are often less accurate and time-consuming, and sometimes incapable of dealing with genome-scale input data. Here, we present a MATLAB-based program called detectIR for the perfect and imperfect inverted repeat detection that utilizes complex numbers and vector calculation and allows genome-scale data inputs. A novel algorithm is adopted in detectIR to convert the conventional sequence string comparison in inverted repeat detection into vector calculation of complex numbers, allowing non-complementary pairs (mismatches) in the pairing stem and a non-palindromic spacer (loop or gaps) in the middle of inverted repeats. Compared with existing popular tools, our program performs with significantly higher accuracy and efficiency. Using genome sequence data from HIV-1, Arabidopsis thaliana, Homo sapiens and Zea mays for comparison, detectIR can find lots of inverted repeats missed by existing tools whose outputs often contain many invalid cases. detectIR is open source and its source code is freely available at: https://sourceforge.net/projects/detectir.
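
    The complex-number trick can be reconstructed in a few lines (our own sketch, not the authors' MATLAB source): choose an encoding in which a base and its complement sum to zero, so a perfect inverted repeat annihilates when a window is added to its own reverse, and mismatches show up as nonzero entries:

```python
import numpy as np

# A base and its complement are negatives: A/T on the real axis, C/G imaginary.
CODE = {"A": 1 + 0j, "T": -1 + 0j, "C": 0 + 1j, "G": 0 - 1j}

def mismatches(window):
    """Mismatch count when `window` is read as the stems of an inverted repeat."""
    v = np.array([CODE[b] for b in window])
    return int(np.count_nonzero(v + v[::-1]))   # 0 => perfect inverted repeat

print(mismatches("GAATTC"))   # 0: EcoRI site, a perfect palindrome
print(mismatches("GAATAC"))   # 2: one mispaired position, counted at both ends
```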

  18. NIH Teams with Public Libraries for ‘All of Us’ Research Program | NIH MedlinePlus the Magazine

    MedlinePlus

    ... Teams with Public Libraries for ‘All of Us’ Research Program NIH is coming to a library near ... has teamed up with NIH’s All of Us Research Program to gather health data from across the ...

  19. Dynamics behind the scale up of evidence-based obesity prevention: protocol for a multi-site case study of an electronic implementation monitoring system in health promotion practice.

    PubMed

    Conte, Kathleen P; Groen, Sisse; Loblay, Victoria; Green, Amanda; Milat, Andrew; Persson, Lina; Innes-Hughes, Christine; Mitchell, Jo; Thackway, Sarah; Williams, Mandy; Hawe, Penelope

    2017-12-06

    The effectiveness of many interventions to promote health and prevent disease has been well established. The imperative has therefore shifted from amassing evidence about efficacy to scale-up to maximise population-level health gains. Electronic implementation monitoring, or 'e-monitoring', systems have been designed to assist and track the delivery of preventive policies and programs. However, there is little evidence on whether e-monitoring systems improve the dissemination, adoption, and ongoing delivery of evidence-based preventive programs. Also, given considerable difficulties with e-monitoring systems in the clinical sector, scholars have called for a more sophisticated re-examination of e-monitoring's role in enhancing implementation. In the state of New South Wales (NSW), Australia, the Population Health Information Management System (PHIMS) was created to support the dissemination of obesity prevention programs to 6000 childcare centres and elementary schools across all 15 local health districts. We have established a three-way university-policymaker-practice research partnership to investigate the impact of PHIMS on practice, how PHIMS is used, and how achievement of key performance indicators of program adoption may be associated with local contextual factors. Our methods encompass ethnographic observation, key informant interviews and participatory workshops for data interpretation at a state and local level. We use an on-line social network analysis of the collaborative relationships across local health district health promotion teams to explore the relationship between PHIMS use and the organisational structure of practice. Insights will be sensitised by institutional theory, practice theory and complex adaptive system thinking, among other theories which make sense of socio-technical action. Our working hypothesis is that the science of getting evidence-based programs into practice rests on an in-depth understanding of the role they play in the on-going system of local relationships and multiple accountabilities. Data will be synthesised to produce a typology to characterise local context, PHIMS use and key performance indicator achievement (of program implementation) across the 15 local health districts. Results could be used to continuously align e-monitoring technologies within quality improvement processes to ensure that such technologies enhance practice and innovation. A partnership approach to knowledge production increases the likelihood that findings will be put into practice.

  20. More Time or Better Tools? A Large-Scale Retrospective Comparison of Pedagogical Approaches to Teach Programming

    ERIC Educational Resources Information Center

    Silva-Maceda, Gabriela; Arjona-Villicaña, P. David; Castillo-Barrera, F. Edgar

    2016-01-01

    Learning to program is a complex task, and the impact of different pedagogical approaches to teach this skill has been hard to measure. This study examined the performance data of seven cohorts of students (N = 1168) learning programming under three different pedagogical approaches. These pedagogical approaches varied either in the length of the…

  1. Feasibility of "Standardized Clinician" Methodology for Patient Training on Hospital-to-Home Transitions.

    PubMed

    Wehbe-Janek, Hania; Hochhalter, Angela K; Castilla, Theresa; Jo, Chanhee

    2015-02-01

    Patient engagement in health care is increasingly recognized as essential for promoting the health of individuals and populations. This study pilot tested the standardized clinician (SC) methodology, a novel adaptation of standardized patient methodology, for teaching patient engagement skills for the complex health care situation of transitioning from a hospital back to home. Sixty-seven participants at heightened risk for hospitalization were randomly assigned to either simulation exposure-only or full-intervention group. Both groups participated in simulation scenarios with "standardized clinicians" around tasks related to hospital discharge and follow-up. The full-intervention group was also debriefed after scenario sets and learned about tools for actively participating in hospital-to-home transitions. Measures included changes in observed behaviors at baseline and follow-up and an overall program evaluation. The full-intervention group showed increases in observed tool possession (P = 0.014) and expression of their preferences and values (P = 0.043). The simulation exposure-only group showed improvement in worksheet scores (P = 0.002) and fewer engagement skills (P = 0.021). Both groups showed a decrease in telling an SC about their hospital admission (P < 0.05). Open-ended comments from the program evaluation were largely positive. Both groups benefited from exposure to the SC intervention. Program evaluation data suggest that simulation training is feasible and may provide a useful methodology for teaching patient skills for active engagement in health care. Future studies are warranted to determine if this methodology can be used to assess overall patient engagement and whether new patient learning transfers to health care encounters.

  2. [Development of a distance education program in the public health system in Chile, 2004-2009].

    PubMed

    Carabantes C, Jorge; Guerra U, Manuel; Guillou, Michèle

    2010-09-01

    This paper reports the gradual development and results achieved in the distance education program set up in the Public Health System in Chile in 2004. To date, more than 22,000 students from 29 different health divisions have been trained. This strategy was designed to provide more flexibility and diversity to the training programs of the Health System within the framework of a deep and complex organizational change promoted by Health Reform. The main results show that the integration of organizational, teaching, logistic and budgetary aspects has turned out to be a key element in its success, validating the relevance of the provided solutions. The access to training by means of e-learning or blended learning (electronic education that includes traditional and distance learning activities) allowed employees to choose more independently what, where and when to study. This fact accounts for the high demand for this program. Through this initiative, the National Health System introduced a wider scope of responses to training needs, which will mean a better adaptation to the challenges associated with health care.

  3. pdb-care (PDB carbohydrate residue check): a program to support annotation of complex carbohydrate structures in PDB files.

    PubMed

    Lütteke, Thomas; von der Lieth, Claus-W

    2004-06-04

    Carbohydrates are involved in a variety of fundamental biological processes and pathological situations. They therefore have a large pharmaceutical and diagnostic potential. Knowledge of the 3D structure of glycans is a prerequisite for a complete understanding of their biological functions. The largest source of biomolecular 3D structures is the Protein Data Bank. However, about 30% of all 1663 PDB entries (version September 2003) containing carbohydrates contain errors in the glycan description. Unfortunately, no software is currently available which aligns the 3D information with the reported assignments. It is the aim of this work to fill this gap. The pdb-care program http://www.glycosciences.de/tools/pdb-care/ is able to identify and assign carbohydrate structures using only atom types and their 3D atom coordinates given in PDB files. Using a translation table in which systematic names and the respective PDB residue codes are listed, both assignments are compared and inconsistencies are reported. Additionally, the reliability of reported and calculated connectivities for molecules listed within the HETATM records is checked and unusual values are reported. Frequent use of pdb-care will help to improve the quality of carbohydrate data contained in the PDB. Automatic assignment of carbohydrate structures contained in PDB entries will enable the cross-linking of glycobiology resources with genomic and proteomic data collections.
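
    The kind of consistency check pdb-care performs can be illustrated with a minimal Python sketch (this is not the pdb-care code itself): it reads residue names from the fixed-column HETATM records of a PDB file and compares them against a small translation table. The table entries and input file name below are illustrative stand-ins.

        # Minimal sketch of a pdb-care-style consistency check (illustrative only).
        # The translation table is a hypothetical stand-in for the program's real one.
        TRANSLATION_TABLE = {
            "NAG": "2-acetamido-2-deoxy-b-D-glucopyranose",
            "MAN": "a-D-mannopyranose",
            "GAL": "b-D-galactopyranose",
        }

        def check_pdb_carbohydrates(path):
            """Report HETATM residue codes not found in the translation table."""
            seen = set()
            with open(path) as handle:
                for line in handle:
                    if line.startswith("HETATM"):
                        # PDB fixed columns: the residue name occupies columns 18-20.
                        seen.add(line[17:20].strip())
            for code in sorted(seen):
                name = TRANSLATION_TABLE.get(code)
                print(f"{code}: {name if name else 'UNKNOWN -- check annotation'}")

        check_pdb_carbohydrates("1abc.pdb")  # hypothetical input file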

  4. APT - NASA ENHANCED VERSION OF AUTOMATICALLY PROGRAMMED TOOL SOFTWARE - STAND-ALONE VERSION

    NASA Technical Reports Server (NTRS)

    Premo, D. A.

    1994-01-01

    The APT code is one of the most widely used software tools for complex numerically controlled (N/C) machining. APT is an acronym for Automatically Programmed Tools and is used to denote both a language and the computer software that processes that language. Development of the APT language and software system was begun over twenty years ago as a U. S. government sponsored industry and university research effort. APT is a "problem oriented" language that was developed for the explicit purpose of aiding the programming of N/C machine tools. Machine-tool instructions and geometry definitions are written in the APT language to constitute a "part program." The APT part program is processed by the APT software to produce a cutter location (CL) file. This CL file may then be processed by user supplied post processors to convert the CL data into a form suitable for a particular N/C machine tool. This June 1989 offering of the APT system represents an adaptation, with enhancements, of the public domain version of APT IV/SSX8 to the DEC VAX-11/780 for use by the Engineering Services Division of the NASA Goddard Space Flight Center. Enhancements include the super pocket feature which allows concave and convex polygon shapes of up to 40 points, including shapes that overlap, that leave islands of material within the pocket, and that have one or more arcs as part of the pocket boundary. Recent modifications to APT include a rework of the POCKET subroutine and correction of an error that prevented the use within a macro of a macro variable cutter move statement combined with macro variable double check surfaces. Former modifications included the expansion of array and buffer sizes to accommodate larger part programs, and the insertion of a few user friendly error messages. The APT system software on the DEC VAX-11/780 is organized into two separate programs: the load complex and the APT processor. The load complex handles the table initiation phase and is usually only run when changes to the APT processor capabilities are made. This phase initializes character recognition and syntax tables for the APT processor by creating FORTRAN block data programs. The APT processor consists of four components: the translator, the execution complex, the subroutine library, and the CL editor. The translator examines each APT statement in the part program for recognizable structure and generates a new statement, or series of statements, in an intermediate language. The execution complex processes all of the definition, motion, and related statements to generate cutter location coordinates. The subroutine library contains routines defining the algorithms required to process the sequenced list of intermediate language commands generated by the translator. The CL editor re-processes the cutter location coordinates according to user supplied commands to generate a final CL file. A sample post processor is also included which translates a CL file into a form for use with a Wales Strippit Fabramatic Model 30/30 sheet metal punch. The user should be able to readily develop post processors for other N/C machine tools. The APT language is a statement oriented, sequence dependent language. With the exception of such programming techniques as looping and macros, statements in an APT program are executed in a strict first-to-last sequence.
In order to provide programming capability for the broadest possible range of parts and of machine tools, APT input (and output) is generalized, as represented by 3-dimensional geometry and tools, and arbitrarily uniform, as represented by the moving tool concept and output data in absolute coordinates. A command procedure allows the user to select the desired part program, ask for a graphics file of cutter motions in IGES format, and submit the procedure as a batch job, if desired. The APT system software is written in FORTRAN 77 for batch and interactive execution and has been implemented on a DEC VAX series computer under VMS 4.4. The enhancements for this version of APT were last updated in June, 1989. The NASA adaptation, with enhancements, of the public domain version of the APT IV/SSX8 software to the DEC VAX-11/780 is available by license for a period of ten (10) years to approved licensees. The licensed program product delivered includes the APT IV/SSX8 system source code, object code, executable images, and command procedures and one set of supporting documentation. Additional copies of the supporting documentation may be purchased at any time at the price indicated below.
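
    As a rough illustration of the part-program-to-CL-file flow described above (not the actual APT translator, which handles geometry definitions, macros, and much more), the Python sketch below parses a few motion statements of the GOTO/x,y,z form and emits cutter location points in absolute coordinates. The statement set and output format are simplified assumptions.

        # Toy translator: APT-like GOTO/x,y,z motion statements -> CL points.
        def translate_part_program(lines):
            cl_file = []
            for stmt in lines:
                stmt = stmt.strip()
                if stmt.startswith("GOTO/"):
                    x, y, z = (float(v) for v in stmt[len("GOTO/"):].split(","))
                    cl_file.append((x, y, z))  # absolute coordinates, as in APT output
            return cl_file

        part_program = ["GOTO/0.0,0.0,1.0", "GOTO/2.5,0.0,1.0", "GOTO/2.5,2.5,1.0"]
        for point in translate_part_program(part_program):
            print("GOTO", point)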

  5. Performance Assessment of Refractory Concrete Used on the Space Shuttle's Launch Pad

    NASA Technical Reports Server (NTRS)

    Trejo, David; Calle, Luz Marina; Halman, Ceki

    2005-01-01

    The John F. Kennedy Space Center (KSC) maintains several facilities for launching space vehicles. During recent launches it has been observed that the refractory concrete materials that protect the steel-framed flame duct are breaking away from this base structure and are being projected at high velocities. There is significant concern that these projected pieces could strike the launch complex or space vehicle during the launch, jeopardizing the safety of the mission. A qualification program is in place to evaluate the performance of different refractory concretes, and data from these tests have been used to assess their performance. However, there is significant variation in the test results, possibly making the existing qualification test program unreliable. This paper evaluates data from past qualification tests, identifies potential key performance indicators for the launch complex, and recommends a new qualification test program that can be used to better qualify refractory concrete.

  6. A Multiobjective Interval Programming Model for Wind-Hydrothermal Power System Dispatching Using 2-Step Optimization Algorithm

    PubMed Central

    Jihong, Qu

    2014-01-01

    Wind-hydrothermal power system dispatching has received intensive attention in recent years because it can help develop reasonable plans to schedule power generation efficiently. However, future data such as wind power output and power load cannot be predicted accurately, and the multiobjective scheduling model is nonlinear, so achieving an accurate solution to such a complex problem is very difficult. This paper presents an interval programming model with a 2-step optimization algorithm to solve the multiobjective dispatching problem. Initially, we represented the future data as interval numbers and simplified the objective function to a linear programming problem, searching for feasible preliminary solutions to construct the Pareto set. Then the simulated annealing method was used to search for the optimal solution of the initial model. Thorough experimental results suggest that the proposed method performed reasonably well in terms of both operating efficiency and precision. PMID:24895663

  7. A multiobjective interval programming model for wind-hydrothermal power system dispatching using 2-step optimization algorithm.

    PubMed

    Ren, Kun; Jihong, Qu

    2014-01-01

    Wind-hydrothermal power system dispatching has received intensive attention in recent years because it can help develop reasonable plans to schedule power generation efficiently. However, future data such as wind power output and power load cannot be predicted accurately, and the multiobjective scheduling model is nonlinear, so achieving an accurate solution to such a complex problem is very difficult. This paper presents an interval programming model with a 2-step optimization algorithm to solve the multiobjective dispatching problem. Initially, we represented the future data as interval numbers and simplified the objective function to a linear programming problem, searching for feasible preliminary solutions to construct the Pareto set. Then the simulated annealing method was used to search for the optimal solution of the initial model. Thorough experimental results suggest that the proposed method performed reasonably well in terms of both operating efficiency and precision.
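
    The 2-step scheme described in the two records above can be sketched generically in Python: step 1 solves a linear relaxation built from interval midpoints to obtain a feasible starting point, and step 2 refines it with simulated annealing against a nonlinear objective. The toy two-unit dispatch problem, cost figures, and quadratic penalty below are assumptions, not the paper's model.

        import numpy as np
        from scipy.optimize import linprog

        # Step 1: linear program on interval midpoints (toy 2-unit dispatch).
        demand = (90.0, 110.0)                # interval load forecast; use the midpoint
        cost = np.array([1.0, 1.3])           # linear generation costs (assumed)
        bounds = [(0.0, 80.0), (0.0, 80.0)]   # unit output limits (assumed)
        lp = linprog(cost, A_eq=[[1.0, 1.0]], b_eq=[sum(demand) / 2], bounds=bounds)
        x0 = lp.x                             # feasible preliminary solution

        # Step 2: simulated annealing against a nonlinear objective (assumed form);
        # a penalty keeps candidates close to the demand balance.
        def objective(x):
            balance = abs(x.sum() - sum(demand) / 2)
            return cost @ x + 0.01 * (x ** 2).sum() + 100.0 * balance

        rng = np.random.default_rng(0)
        x, fx = x0.copy(), objective(x0)
        for k in range(5000):
            temp = max(1.0 - k / 5000, 1e-9)              # cooling schedule
            cand = np.clip(x + rng.normal(scale=0.5, size=2), 0.0, 80.0)
            fc = objective(cand)
            if fc < fx or rng.random() < np.exp(-(fc - fx) / temp):
                x, fx = cand, fc                          # accept move
        print(x, fx)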

  8. "Talkin' about a revolution": How electronic health records can facilitate the scale-up of HIV care and treatment and catalyze primary care in resource-constrained settings.

    PubMed

    Braitstein, Paula; Einterz, Robert M; Sidle, John E; Kimaiyo, Sylvester; Tierney, William

    2009-11-01

    Health care for patients with HIV infection in developing countries has increased substantially in response to major international funding. Scaling up treatment programs requires timely data on the type, quantity, and quality of care being provided. Increasingly, such programs are turning to electronic health records (EHRs) to provide these data. We describe how a medical school in the United States and another in Kenya collaborated to develop and implement an EHR in a large HIV/AIDS care program in western Kenya. These data were used to manage patients, providers, and the program itself as it grew to encompass 18 sites serving more than 90,000 patients. Lessons learned have been applicable beyond HIV/AIDS to include primary care, chronic disease management, and community-based health screening and disease prevention programs. EHRs will be key to providing the highest possible quality of care for the funds developing countries can commit to health care. Public, private, and academic partnerships can facilitate the development and implementation of EHRs in resource-constrained settings.

  9. Translation Analysis on Civil Engineering Text Produced by Machine Translator

    NASA Astrophysics Data System (ADS)

    Sutopo, Anam

    2018-02-01

    Translation is essential in communication when people do not share a language. Translation is usually carried out by a person responsible for translating the material, but it can also be done by machine. This is called machine translation, reflected in programs developed by programmers; one of them is Transtool. Many people have used Transtool to help them solve problems related to translation activities. This paper discusses how important the Transtool program is, how effective it is, and what function it serves in practice. The study applies qualitative research. The sources of data were documents and informants, and the techniques for collecting data were documentation and in-depth interviewing. The collected data were analyzed using interactive analysis. The results of the study show that, first, the Transtool program is helpful for people translating civil engineering texts and functions as an aid or helper; second, the Transtool software works effectively enough; and third, the translations produced by Transtool are good for short, simple sentences but not readable, understandable, or accurate for long sentences (compound, complex, and compound-complex), though the results are informative. The translated material must be edited by a professional translator.

  10. Testing Mediators Hypothesized to Account for the Effects of a Dissonance-Based Eating Disorder Prevention Program over Longer Term Follow-Up

    ERIC Educational Resources Information Center

    Stice, Eric; Marti, C. Nathan; Rohde, Paul; Shaw, Heather

    2011-01-01

    Objective: Test the hypothesis that reductions in thin-ideal internalization and body dissatisfaction mediate the effects of a dissonance-based eating disorder prevention program on reductions in eating disorder symptoms over 1-year follow-up. Method: Data were drawn from a randomized effectiveness trial in which 306 female high school students…

  11. Age 26 Cost-Benefit Analysis of the Child-Parent Center Early Education Program

    ERIC Educational Resources Information Center

    Reynolds, Arthur J.; Temple, Judy A.; White, Barry A. B.; Ou, Suh-Ruu; Robertson, Dylan L.

    2011-01-01

    Using data collected up to age 26 in the Chicago Longitudinal Study, this cost-benefit analysis of the Child-Parent Centers (CPC) is the first for a sustained publicly funded early intervention. The program provides services for low-income families beginning at age 3 in 20 school sites. Kindergarten and school-age services are provided up to age 9…

  12. The Impact of Ensemble Kalman Filter Assimilation of Near-Surface Observations on the Predictability of Atmospheric Conditions over Complex Terrain: Results from Recent MATERHORN Field Program

    NASA Astrophysics Data System (ADS)

    Pu, Z.; Zhang, H.

    2013-12-01

    Near-surface atmospheric observations are the main conventional observations for weather forecasts. However, in modern numerical weather prediction, the use of surface observations, especially those data over complex terrain, remains a unique challenge. There are fundamental difficulties in assimilating surface observations with three-dimensional variational data assimilation (3DVAR). In our earlier study (Pu et al. 2013) [1], a series of observing system simulation experiments was performed with the ensemble Kalman filter (EnKF) and compared with 3DVAR with respect to the ability to assimilate surface observations. Using the advanced research version of the Weather Research and Forecasting (WRF) model, results demonstrate that the EnKF can overcome some fundamental limitations that 3DVAR has in assimilating surface observations over complex terrain. Specifically, through its flow-dependent background error term, the EnKF produces more realistic analysis increments over complex terrain in general, and it clearly performs better than 3DVAR because it is more capable of handling surface data in the presence of terrain misrepresentation. With this presentation, we further examine the impact of EnKF data assimilation on the predictability of atmospheric conditions over complex terrain with the WRF model and the observations obtained from the most recent field experiments of the Mountain Terrain Atmospheric Modeling and Observations (MATERHORN) Program. The MATERHORN program provides comprehensive observations over mountainous regions, allowing the opportunity to study the predictability of atmospheric conditions over complex terrain in great detail. Specifically, during fall 2012 and spring 2013, comprehensive observations were collected of soil states, surface energy budgets, near-surface atmospheric conditions, and profiling measurements from multiple platforms (e.g., balloon, lidar, radiosondes, etc.) over Dugway Proving Ground (DPG), Utah. With the near-surface observations and sounding data obtained during the MATERHORN fall 2012 field experiment, a month-long cycled EnKF analysis and forecast was produced with the WRF model and an advanced EnKF data assimilation system. Results are compared with the WRF near-real-time forecasting during the same month and a set of analyses with 3DVAR data assimilation. The overall evaluation suggests useful insights into the impacts of different data assimilation methods, surface and soil states, and terrain representation on the predictability of atmospheric conditions over mountainous terrain. Details will be presented. References: [1] Pu, Z., H. Zhang, and J. A. Anderson, "Ensemble Kalman filter assimilation of near-surface observations over complex terrain: Comparison with 3DVAR for short-range forecasts." Tellus A, vol. 65, 19620, 2013. http://dx.doi.org/10.3402/tellusa.v65i0.19620.
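
    For readers unfamiliar with how the flow-dependent background error enters the update, here is a minimal perturbed-observation EnKF analysis step in Python. The state size, observation operator, and error values are illustrative assumptions, not the WRF/MATERHORN configuration.

        import numpy as np

        def enkf_update(ensemble, h, y, r, rng):
            """Perturbed-observation EnKF analysis step.
            ensemble: (n_state, n_members); h: (n_obs, n_state);
            y: (n_obs,) observation vector; r: (n_obs, n_obs) obs-error covariance."""
            n = ensemble.shape[1]
            anomalies = ensemble - ensemble.mean(axis=1, keepdims=True)
            ha = h @ anomalies
            # Flow-dependent covariances estimated from the ensemble itself:
            pf_ht = anomalies @ ha.T / (n - 1)
            s = ha @ ha.T / (n - 1) + r
            gain = pf_ht @ np.linalg.inv(s)
            # Perturb the observation once per member (stochastic EnKF):
            perturbed = y[:, None] + rng.multivariate_normal(
                np.zeros(len(y)), r, size=n).T
            return ensemble + gain @ (perturbed - h @ ensemble)

        rng = np.random.default_rng(1)
        members = rng.normal(size=(10, 40))              # 10 state variables, 40 members
        h = np.zeros((2, 10)); h[0, 0] = h[1, 5] = 1.0   # observe two state variables
        analysis = enkf_update(members, h, np.array([0.3, -0.1]), 0.04 * np.eye(2), rng)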

  13. Emotion Locomotion: Promoting the Emotional Health of Elementary School Children by Recognizing Emotions

    ERIC Educational Resources Information Center

    McLachlan, Debra A.; Burgos, Teresa; Honeycutt, Holly K.; Linam, Eve H.; Moneymaker, Laura D.; Rathke, Meghan K.

    2009-01-01

    Emotion recognition is a critical life skill children need for mental health promotion to meet the complexities and challenges of growing up in the world today. Five nursing students and their instructor designed "Emotion Locomotion," a program for children ages 6-8 during a public health nursing practicum for an inner-city parochial school.…

  14. Mentat: An object-oriented macro data flow system

    NASA Technical Reports Server (NTRS)

    Grimshaw, Andrew S.; Liu, Jane W. S.

    1988-01-01

    Mentat, an object-oriented macro data flow system designed to facilitate parallelism in distributed systems, is presented. The macro data flow model is a model of computation similar to the data flow model with two principal differences: the computational complexity of the actors is much greater than in traditional data flow systems, and there are persistent actors that maintain state information between executions. Mentat is a system that combines the object-oriented programming paradigm and the macro data flow model of computation. Mentat programs use a dynamic structure called a future list to represent the future of computations.
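
    Mentat's "future list," values that stand in for results of computations still in flight, is analogous to futures in modern languages. The Python sketch below illustrates the concept only; it is not Mentat's actual interface.

        from concurrent.futures import ProcessPoolExecutor

        def heavy_actor(x):
            # Stands in for a coarse-grained macro data flow actor.
            return x * x

        if __name__ == "__main__":
            with ProcessPoolExecutor() as pool:
                # Each submit returns a future; the list as a whole plays the
                # role of Mentat's future list.
                future_list = [pool.submit(heavy_actor, i) for i in range(8)]
                # Execution proceeds until a result is actually demanded:
                results = [f.result() for f in future_list]
                print(results)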

  15. Heuristic Implementation of Dynamic Programming for Matrix Permutation Problems in Combinatorial Data Analysis

    ERIC Educational Resources Information Center

    Brusco, Michael J.; Kohn, Hans-Friedrich; Stahl, Stephanie

    2008-01-01

    Dynamic programming methods for matrix permutation problems in combinatorial data analysis can produce globally-optimal solutions for matrices up to size 30x30, but are computationally infeasible for larger matrices because of enormous computer memory requirements. Branch-and-bound methods also guarantee globally-optimal solutions, but computation…
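
    The memory limitation mentioned in the abstract comes from storing one value per subset of objects. A minimal Python sketch of such a subset dynamic program (here maximizing a simple dominance index over row orderings, an assumed objective) makes the O(2^n) storage explicit:

        import numpy as np

        def optimal_permutation(d):
            """Maximize the sum of d[i][j] over pairs where i precedes j.
            Stores one entry per subset, hence O(2^n) memory: the reason
            matrices much beyond n = 30 become infeasible."""
            n = len(d)
            f = {0: (0.0, None)}                     # mask -> (best value, last object)
            for mask in range(1, 1 << n):
                best = (float("-inf"), None)
                for j in range(n):
                    if mask & (1 << j):
                        prev = mask ^ (1 << j)
                        gain = sum(d[i][j] for i in range(n) if prev & (1 << i))
                        val = f[prev][0] + gain
                        if val > best[0]:
                            best = (val, j)
                f[mask] = best
            order, mask = [], (1 << n) - 1           # reconstruct the ordering
            while mask:
                j = f[mask][1]
                order.append(j)
                mask ^= 1 << j
            return f[(1 << n) - 1][0], order[::-1]

        d = np.random.default_rng(0).random((8, 8))  # toy dissimilarity matrix
        value, order = optimal_permutation(d)
        print(value, order)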

  16. The Search for Extraterrestrial Intelligence

    NASA Technical Reports Server (NTRS)

    Tucher, A.

    1985-01-01

    The development of NASA's SETI project and strategies for searching radio signals are reviewed. A computer program was written in FORTRAN to set up data from observations taken at Jodrell Bank. These data are to be used with a larger program to find the average radio signal strength at each of the approximately 63,000 channels.

  17. Statistical techniques applied to aerial radiometric surveys (STAARS): principal components analysis user's manual. [NURE program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koch, C.D.; Pirkle, F.L.; Schmidt, J.S.

    1981-01-01

    A Principal Components Analysis (PCA) program has been written to aid in the interpretation of multivariate aerial radiometric data collected by the US Department of Energy (DOE) under the National Uranium Resource Evaluation (NURE) program. The variations exhibited by these data have been reduced and classified into a number of linear combinations by using the PCA program. The PCA program then generates histograms and outlier maps of the individual variates. Black and white plots can be made on a Calcomp plotter by the application of follow-up programs. All programs referred to in this guide were written for a DEC-10. From this analysis a geologist may begin to interpret the data structure. Insight into geological processes underlying the data may be obtained.
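
    The core reduction step such a PCA program performs, projecting correlated variates onto orthogonal linear combinations ordered by variance, looks roughly like the following sketch (the original was a DEC-10 program; this is a modern re-expression with made-up data):

        import numpy as np

        def principal_components(x):
            """Rows are observations, columns are radiometric variates."""
            z = (x - x.mean(axis=0)) / x.std(axis=0, ddof=1)      # standardize
            eigvals, eigvecs = np.linalg.eigh(np.cov(z, rowvar=False))
            order = np.argsort(eigvals)[::-1]                     # largest variance first
            return z @ eigvecs[:, order], eigvals[order]

        data = np.random.default_rng(0).normal(size=(500, 6))     # synthetic survey data
        scores, variances = principal_components(data)
        print(variances / variances.sum())    # fraction of variance per component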

  18. Array data extractor (ADE): a LabVIEW program to extract and merge gene array data.

    PubMed

    Kurtenbach, Stefan; Kurtenbach, Sarah; Zoidl, Georg

    2013-12-01

    Large data sets from gene expression array studies are publicly available offering information highly valuable for research across many disciplines ranging from fundamental to clinical research. Highly advanced bioinformatics tools have been made available to researchers, but a demand for user-friendly software allowing researchers to quickly extract expression information for multiple genes from multiple studies persists. Here, we present a user-friendly LabVIEW program to automatically extract gene expression data for a list of genes from multiple normalized microarray datasets. Functionality was tested for 288 class A G protein-coupled receptors (GPCRs) and expression data from 12 studies comparing normal and diseased human hearts. Results confirmed known regulation of a beta 1 adrenergic receptor and further indicate novel research targets. Although existing software allows for complex data analyses, the LabVIEW based program presented here, "Array Data Extractor (ADE)", provides users with a tool to retrieve meaningful information from multiple normalized gene expression datasets in a fast and easy way. Further, the graphical programming language used in LabVIEW allows applying changes to the program without the need of advanced programming knowledge.
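
    The extract-and-merge task ADE automates can be approximated in a few lines of Python with pandas. This is an analogous sketch, not the LabVIEW program; the file names, gene symbols, and column labels are hypothetical.

        import pandas as pd

        genes = ["ADRB1", "ADRB2", "GPR35"]          # genes of interest (examples)
        studies = ["study1.csv", "study2.csv"]       # normalized expression tables

        # Each hypothetical table has a 'gene' column plus per-sample columns.
        merged = None
        for path in studies:
            table = pd.read_csv(path)
            subset = table[table["gene"].isin(genes)]
            # Outer merge keeps genes present in only some studies.
            merged = subset if merged is None else merged.merge(
                subset, on="gene", how="outer")

        merged.to_csv("extracted_expression.csv", index=False)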

  19. Generating a 2D Representation of a Complex Data Structure

    NASA Technical Reports Server (NTRS)

    James, Mark

    2006-01-01

    A computer program, designed to assist in the development and debugging of other software, generates a two-dimensional (2D) representation of a possibly complex n-dimensional (where n is an integer >2) data structure or abstract rank-n object in that other software. The nature of the 2D representation is such that it can be displayed on a non-graphical output device and distributed by non-graphical means.

  20. Building University Capacity to Visualize Solutions to Complex Problems in the Arctic

    NASA Astrophysics Data System (ADS)

    Broderson, D.; Veazey, P.; Raymond, V. L.; Kowalski, K.; Prakash, A.; Signor, B.

    2016-12-01

    Rapidly changing environments are creating complex problems across the globe, which are particularly magnified in the Arctic. These worldwide challenges can best be addressed through diverse and interdisciplinary research teams. It is incumbent on such teams to promote co-production of knowledge and data-driven decision-making by identifying effective methods to communicate their findings and to engage with the public. Decision Theater North (DTN) is a new semi-immersive visualization system that provides a space for teams to collaborate and develop solutions to complex problems, relying on diverse sets of skills and knowledge. It provides a venue to synthesize the talents of scientists, who gather information (data); modelers, who create models of complex systems; artists, who develop visualizations; communicators, who connect and bridge populations; and policymakers, who can use the visualizations to develop sustainable solutions to pressing problems. The mission of Decision Theater North is to provide a cutting-edge visual environment to facilitate dialogue and decision-making by stakeholders including government, industry, communities and academia. We achieve this mission by adopting a multi-faceted approach reflected in the theater's design, technology, networking capabilities, user support, community relationship building, and strategic partnerships. DTN is a joint project of Alaska's National Science Foundation Experimental Program to Stimulate Competitive Research (NSF EPSCoR) and the University of Alaska Fairbanks (UAF), which have brought the facility up to full operational status and are now expanding its development space to support larger team science efforts. Based in Fairbanks, Alaska, DTN is uniquely poised to address changes taking place in the Arctic and subarctic, and is connected with a larger network of decision theaters that includes the Arizona State University Decision Theater Network and the McCain Institute in Washington, DC.

  1. Common occupational classification system - revision 3

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stahlman, E.J.; Lewis, R.E.

    1996-05-01

    Workforce planning has become an increasing concern within the DOE community as the Office of Environmental Restoration and Waste Management (ER/WM or EM) seeks to consolidate and refocus its activities and the Office of Defense Programs (DP) closes production sites. Attempts to manage the growth and skills mix of the EM workforce while retaining the critical skills of the DP workforce have been difficult due to the lack of a consistent set of occupational titles and definitions across the complex. Two reasons for this difficulty may be cited. First, classification systems commonly used in industry often fail to cover in sufficient depth the unique demands of DOE's nuclear energy and research community. Second, the government practice of contracting the operation of government facilities to the private sector has introduced numerous contractor-specific classification schemes to the DOE complex. As a result, sites/contractors report their workforce needs using unique classification systems. It becomes difficult, therefore, to roll these data up to the national level necessary to support strategic planning and analysis. The Common Occupational Classification System (COCS) is designed to overcome these workforce planning barriers. The COCS is based on earlier workforce planning activities and the input of technical, workforce planning, and human resource managers from across the DOE complex. It provides a set of mutually-exclusive occupation titles and definitions that cover the broad range of activities present in the DOE complex. The COCS is not a required record-keeping or data management guide. Neither is it intended to replace contractor/DOE-specific classification systems. Instead, the system provides a consistent, high-level, functional structure of occupations to which contractors can crosswalk (map) their job titles.

  2. The NLstart2run study: health effects of a running promotion program in novice runners, design of a prospective cohort study

    PubMed Central

    2013-01-01

    Background: Running is associated with desirable lifestyle changes. Therefore several initiatives have been undertaken to promote running. Exact data on the health effects as a result of participating in a short-term running promotion program, however, is scarce. One important reason for dropout from a running program is a running-related injury (RRI). The incidence of RRIs is high, especially in novice runners. Several studies examined potential risk factors for RRIs; however, due to the often underpowered studies it is not possible to reveal the complex mechanism leading to an RRI yet. The primary objectives are to determine short- and long-term health effects of a nationwide “Start to Run” program and to identify determinants for RRIs in novice runners. Secondary objectives include examining reasons and determinants for dropout, medical consumption and economical consequences of RRIs as a result of a running promotion program. Methods/design: The NLstart2run study is a multi-center prospective cohort study with a follow-up at 6, 12, 24 and 52 weeks. All participants that sign up for the Start to Run program in 2013, which is offered by the Dutch Athletics Federation, will be asked to participate in the study. During the running program a digital running log will be completed by the participants every week to administer exposure and running related pain. After the running program the log will be completed every second week. An RRI is defined as any musculoskeletal ailment of the lower extremity or back that the participant attributed to running and hampers running ability for at least one week. Discussion: The NLstart2run study will provide insight into the short- and long-term health effects as a result of a short-term running promotion program. Reasons and determinants for dropout from a running promotion program will be examined as well. The study will result in several leads for future RRI prevention and as a result minimize dropout due to injury. This information may increase the effectiveness of future running promotion programs and will thereby contribute positively to public health. Trial registration: The Netherlands National Trial Register NTR3676. The NTR is part of the WHO Primary Registries. PMID:23890182

  3. The NLstart2run study: health effects of a running promotion program in novice runners, design of a prospective cohort study.

    PubMed

    Kluitenberg, Bas; van Middelkoop, Marienke; Diercks, Ron L; Hartgens, Fred; Verhagen, Evert; Smits, Dirk-Wouter; Buist, Ida; van der Worp, Henk

    2013-07-26

    Running is associated with desirable lifestyle changes. Therefore several initiatives have been undertaken to promote running. Exact data on the health effects as a result of participating in a short-term running promotion program, however, is scarce. One important reason for dropout from a running program is a running-related injury (RRI). The incidence of RRIs is high, especially in novice runners. Several studies examined potential risk factors for RRIs; however, due to the often underpowered studies it is not possible to reveal the complex mechanism leading to an RRI yet. The primary objectives are to determine short- and long-term health effects of a nationwide "Start to Run" program and to identify determinants for RRIs in novice runners. Secondary objectives include examining reasons and determinants for dropout, medical consumption and economical consequences of RRIs as a result of a running promotion program. The NLstart2run study is a multi-center prospective cohort study with a follow-up at 6, 12, 24 and 52 weeks. All participants that sign up for the Start to Run program in 2013, which is offered by the Dutch Athletics Federation, will be asked to participate in the study. During the running program a digital running log will be completed by the participants every week to administer exposure and running related pain. After the running program the log will be completed every second week. An RRI is defined as any musculoskeletal ailment of the lower extremity or back that the participant attributed to running and hampers running ability for at least one week. The NLstart2run study will provide insight into the short- and long-term health effects as a result of a short-term running promotion program. Reasons and determinants for dropout from a running promotion program will be examined as well. The study will result in several leads for future RRI prevention and as a result minimize dropout due to injury. This information may increase the effectiveness of future running promotion programs and will thereby contribute positively to public health. The Netherlands National Trial Register NTR3676. The NTR is part of the WHO Primary Registries.

  4. Computer program for thin-wire structures in a homogeneous conducting medium

    NASA Technical Reports Server (NTRS)

    Richmond, J. H.

    1974-01-01

    A computer program is presented for thin-wire antennas and scatterers in a homogeneous conducting medium. The analysis is performed in the real or complex frequency domain. The program handles insulated and bare wires with finite conductivity and lumped loads. The output data include the current distribution, impedance, radiation efficiency, gain, absorption cross section, scattering cross section, echo area and the polarization scattering matrix. The program uses sinusoidal bases and Galerkin's method.

  5. KSC-07pd1522

    NASA Image and Video Library

    2007-06-16

    KENNEDY SPACE CENTER, FLA. -- The destruction of the 209-foot-tall mobile service tower at Space Launch Complex 36 on Cape Canaveral Air Force Station kicks up a wall of dust. The tower is one of two that were identified for demolition. The old towers are being toppled as part of the ongoing project to demolish the historic site to prevent corrosion from becoming a safety concern. A majority of the steel will be recycled and the rest will be taken to the landfill at CCAFS. Complex 36 was the birthplace of NASA's planetary launch program. It was built for the Atlas/Centaur development program and was operated under NASA's sponsorship until the late 1980s. Complex 36 hosted many historic missions over the years, including the Surveyor spacecraft that landed on the moon and the Mariner missions, which orbited Mars and included one sent to Mercury. Two of the most historic launches were the Pioneer 10 and 11 space probes that were launched to Jupiter and are now outside of the solar system in interstellar space. Also, the historic Pioneer Venus spacecraft included an orbiter and a set of probes that were dispatched to the surface. While Launch Complex 36 is gone, the Atlas/Centaur rocket continues to be launched as the Atlas V from Complex 41. Photo credit: NASA/Charisse Nahser

  6. Best Practices in Identifying Students for Gifted and Talented Education Programs

    ERIC Educational Resources Information Center

    Worrell, Frank C.; Erwin, Jesse O.

    2011-01-01

    As school psychologists move from dichotomous categorizations of students as "gifted" or "nongifted" toward a more comprehensive approach to identification, their task becomes increasingly complex. In the present article, the authors outline practices at the planning, programming, and data collection stages of the identification process in hopes…

  7. Commercial Crew Astronauts Visit Launch Complex 39A

    NASA Image and Video Library

    2018-03-27

    Commercial Crew Program astronauts, from the left, Suni Williams, Eric Boe, Bob Behnken and Doug Hurley take in the view from the top of Launch Complex 39A at Kennedy Space Center. The astronauts toured the pad for an up-close look at modifications that are in work for the SpaceX Crew Dragon flight tests. Tower modifications included removal of the space shuttle-era rotating service structure. Future integration of the crew access arm will allow for safe crew entry for launch and exit from the spacecraft in the unlikely event a pad abort is required.

  8. Commercial Crew Astronauts Visit Launch Complex 39A

    NASA Image and Video Library

    2018-03-27

    Commercial Crew Program astronauts, from the left, Doug Hurley, Eric Boe, Bob Behnken and Suni Williams pose just outside Launch Complex 39A at NASA's Kennedy Space Center in Florida. The astronauts toured the pad for an up-close look at modifications that are in work for the SpaceX Crew Dragon flight tests. The tower modifications included removal of the space shuttle-era rotating service structure. Future integration of the crew access arm will allow for safe crew entry for launch and exit from the spacecraft in the unlikely event a pad abort is required.

  9. Monte Carlo simulation of parameter confidence intervals for non-linear regression analysis of biological data using Microsoft Excel.

    PubMed

    Lambert, Ronald J W; Mytilinaios, Ioannis; Maitland, Luke; Brown, Angus M

    2012-08-01

    This study describes a method to obtain parameter confidence intervals from the fitting of non-linear functions to experimental data, using the SOLVER and Analysis ToolPak Add-In of the Microsoft Excel spreadsheet. Previously we have shown that Excel can fit complex multiple functions to biological data, obtaining values equivalent to those returned by more specialized statistical or mathematical software. However, a disadvantage of using the Excel method was the inability to return confidence intervals for the computed parameters or the correlations between them. Using a simple Monte-Carlo procedure within the Excel spreadsheet (without recourse to programming), SOLVER can provide parameter estimates (up to 200 at a time) for multiple 'virtual' data sets, from which the required confidence intervals and correlation coefficients can be obtained. The general utility of the method is exemplified by applying it to the analysis of the growth of Listeria monocytogenes, the growth inhibition of Pseudomonas aeruginosa by chlorhexidine and the further analysis of the electrophysiological data from the compound action potential of the rodent optic nerve. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
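
    The spreadsheet procedure the authors describe mirrors a standard parametric Monte Carlo: fit once, generate "virtual" data sets from the fitted curve plus residual-scale noise, refit each one, and read confidence intervals off the percentiles of the refit parameters. A minimal Python sketch of that idea, using a toy exponential model and synthetic data rather than the paper's biological data sets:

        import numpy as np
        from scipy.optimize import curve_fit

        def model(t, a, k):
            return a * np.exp(-k * t)

        rng = np.random.default_rng(0)
        t = np.linspace(0, 5, 40)
        y = model(t, 2.0, 0.7) + rng.normal(scale=0.05, size=t.size)  # synthetic data

        popt, _ = curve_fit(model, t, y, p0=(1.0, 1.0))
        sigma = np.std(y - model(t, *popt), ddof=2)                   # residual scale

        # Refit many 'virtual' data sets drawn around the fitted curve
        # (the paper's procedure uses up to 200 at a time).
        draws = np.array([
            curve_fit(model, t,
                      model(t, *popt) + rng.normal(scale=sigma, size=t.size),
                      p0=popt)[0]
            for _ in range(200)
        ])
        lo, hi = np.percentile(draws, [2.5, 97.5], axis=0)
        print(f"a in [{lo[0]:.3f}, {hi[0]:.3f}], k in [{lo[1]:.3f}, {hi[1]:.3f}]")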

  10. The Amazing Histogram.

    ERIC Educational Resources Information Center

    Vandermeulen, H.; DeWreede, R. E.

    1983-01-01

    Presents a histogram drawing program which sorts real numbers in up to 30 categories. Entered data are sorted and saved in a text file which is then used to generate the histogram. Complete Applesoft program listings are included. (JN)

  11. Measurement and prediction of propeller flow field on the PTA aircraft at speeds of up to Mach 0.85. [Propfan Test Assessment

    NASA Technical Reports Server (NTRS)

    Aljabri, Abdullah S.

    1988-01-01

    High speed subsonic transports powered by advanced propellers provide significant fuel savings compared to turbofan powered transports. Unfortunately, however, propfans must operate in aircraft-induced nonuniform flow fields which can lead to high blade cyclic stresses, vibration and noise. To optimize the design and installation of these advanced propellers, therefore, detailed knowledge of the complex flow field is required. As part of the NASA Propfan Test Assessment (PTA) program, a 1/9 scale semispan model of the Gulfstream II propfan test-bed aircraft was tested in the NASA-Lewis 8 x 6 supersonic wind tunnel to obtain propeller flow field data. Detailed radial and azimuthal surveys were made to obtain the total pressure in the flow and the three components of velocity. Data was acquired for Mach numbers ranging from 0.6 to 0.85. Analytical predictions were also made using a subsonic panel method, QUADPAN. Comparison of wind-tunnel measurements and analytical predictions show good agreement throughout the Mach range.

  12. Using Carbon Emissions Data to "Heat Up" Descriptive Statistics

    ERIC Educational Resources Information Center

    Brooks, Robert

    2012-01-01

    This article illustrates using carbon emissions data in an introductory statistics assignment. The carbon emissions data has desirable characteristics including: choice of measure; skewness; and outliers. These complexities allow research and public policy debate to be introduced. (Contains 4 figures and 2 tables.)

  13. Getting Back to Living: Further Evidence for the Efficacy of an Interdisciplinary Pediatric Pain Treatment Program.

    PubMed

    Bruce, Barbara K; Ale, Chelsea M; Harrison, Tracy E; Bee, Susan; Luedtke, Connie; Geske, Jennifer; Weiss, Karen E

    2017-06-01

    This study examined key functional outcomes following a 3-week interdisciplinary pediatric pain rehabilitation program for adolescents with chronic pain. Maintenance of gains was evaluated at 3-month follow-up. Participants included 171 adolescents (12 to 18 y of age) with chronic pain who completed a hospital-based outpatient pediatric pain rehabilitation program. Participants completed measures of functional disability, depressive symptoms, pain catastrophizing, opioid use, school attendance, and pain severity at admission, discharge, and at 3-month follow-up. Similar to other interdisciplinary pediatric pain rehabilitation program outcome studies, significant improvements were observed at the end of the program. These improvements appeared to be maintained or further improved at 3-month follow-up. Nearly 14% of the patients were taking daily opioid medication at admission to the program. All adolescents were completely tapered off of these medications at the end of the 3-week program and remained abstinent at 3-month follow-up. This study adds to the available data supporting interdisciplinary pediatric pain rehabilitation as effective in improving functioning and psychological distress even when discontinuing opioids. Implications for future research and limitations of the study are discussed.

  14. Implementation of an evidence-based intervention to improve the wellbeing of people with dementia and their carers: study protocol for 'Care of People with dementia in their Environments (COPE)' in the Australian context.

    PubMed

    Clemson, Lindy; Laver, Kate; Jeon, Yun-Hee; Comans, Tracy A; Scanlan, Justin; Rahja, Miia; Culph, Jennifer; Low, Lee-Fay; Day, Sally; Cations, Monica; Crotty, Maria; Kurrle, Susan; Piersol, Catherine; Gitlin, Laura N

    2018-05-09

    There are effective non-pharmacological treatment programs that reduce functional disability and changed behaviours in people with dementia. However, these programs (such as the Care of People with dementia in their Environments (COPE) program) are not widely available. The primary aim of this study is to determine the strategies and processes that enable the COPE program to be implemented into existing dementia care services in Australia. This study uses a mixed methods approach to test an implementation strategy. The COPE intervention (up to ten consultations with an occupational therapist and up to two consultations with a nurse) will be implemented using a number of strategies including planning (such as developing and building relationships with dementia care community service providers), educating (training nurses and occupational therapists in how to apply the intervention), restructuring (organisations establishing referral systems; therapist commitment to provide COPE to five clients following training) and quality management (coaching, support, reminders and fidelity checks). Qualitative and quantitative data will contribute to understanding how COPE is adopted and implemented. Feasibility, fidelity, acceptability, uptake and service delivery contexts will be explored and a cost/benefit evaluation conducted. Client outcomes of activity engagement and caregiver wellbeing will be assessed in a pragmatic pre-post evaluation. While interventions that promote independence and wellbeing are effective and highly valued by people with dementia and their carers, access to such programs is limited. Barriers to translation that have been previously identified are addressed in this study, including limited training opportunities and a lack of confidence in clinicians working with complex symptoms of dementia. A strength of the study is that it involves implementation within different types of existing services, such as government and private providers, so the study will provide useful guidance for future rollout. Trial registered 16 February 2017; ACTRN12617000238370.

  15. A radiation-hardened, computer for satellite applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gaona, J.I. Jr.

    1996-08-01

    This paper describes high reliability radiation hardened computers built by Sandia for application aboard DOE satellite programs requiring 32 bit processing. The computers highlight a radiation hardened (10 kGy(Si)) R3000 executing up to 10 million reduced instruction set computer (RISC) instructions per second (MIPS), a dual purpose module control bus used for real-time fault and power management which allows for extended mission operation on as little as 1.2 watts, and a local area network capable of 480 Mbits/s. The central processing unit (CPU) is the NASA Goddard R3000 nicknamed the "Mongoose" or "Mongoose 1". The Sandia Satellite Computer (SSC) uses Rational's Ada compiler, debugger, operating system kernel, and enhanced floating point emulation library targeted at the Mongoose. The SSC gives Sandia the capability of processing complex types of spacecraft attitude determination and control algorithms and of modifying programmed control laws via ground command. In general, SSC offers end users the ability to process data onboard the spacecraft that would normally have been sent to the ground, which allows reconsideration of traditional space-ground partitioning options.

  16. A randomized waitlist-controlled trial of culturally sensitive relationship education for male same-sex couples.

    PubMed

    Whitton, Sarah W; Weitbrecht, Eliza M; Kuryluk, Amanda D; Hutsell, David W

    2016-09-01

    Relationship education, effective in improving relationship quality among different-sex couples, represents a promising and nonstigmatizing approach to promoting the health and stability of same-sex couples. A new culturally sensitive relationship education program was developed specifically for male same-sex couples, which includes adaptations of evidence-based strategies to build core relationship skills (e.g., communication skills training) and newly developed content to address unique challenges faced by this group (e.g., discrimination; low social support). A small randomized waitlist-control trial (N = 20 couples) was conducted to evaluate the program. To assess program efficacy, dyadic longitudinal data (collected at pre- and postprogram and 3-month follow-up) were analyzed using multilevel models that accounted for nonindependence in data from indistinguishable dyads. Results indicated significant program effects in comparison to waitlist controls on couple constructive and destructive communication, perceived stress, and relationship satisfaction. Gains in each of these areas were maintained at 3-month follow-up. Although there was no evidence of within-person program effects on social support, satisfaction, or relationship instability immediately postprogram, all 3 showed within-person improvements by follow-up. Ratings of program satisfaction were high. In summary, study findings support the feasibility, acceptability, and initial efficacy of the program and highlight the potential value of culturally sensitive adaptations of relationship education for same-sex couples. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  17. Using the High-Level Based Program Interface to Facilitate the Large Scale Scientific Computing

    PubMed Central

    Shang, Yizi; Shang, Ling; Gao, Chuanchang; Lu, Guiming; Ye, Yuntao; Jia, Dongdong

    2014-01-01

    This paper is to make further research on facilitating the large-scale scientific computing on the grid and the desktop grid platform. The related issues include the programming method, the overhead of the high-level program interface based middleware, and the data anticipate migration. The block based Gauss Jordan algorithm as a real example of large-scale scientific computing is used to evaluate those issues presented above. The results show that the high-level based program interface makes the complex scientific applications on large-scale scientific platform easier, though a little overhead is unavoidable. Also, the data anticipation migration mechanism can improve the efficiency of the platform which needs to process big data based scientific applications. PMID:24574931
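
    The block-based Gauss-Jordan benchmark the authors use reduces to repeated pivot-and-eliminate sweeps. A compact, unblocked Python version conveys the computation; the blocking and grid distribution that the paper's middleware handles are left out.

        import numpy as np

        def gauss_jordan_inverse(a):
            """Invert a via Gauss-Jordan elimination with partial pivoting."""
            n = a.shape[0]
            aug = np.hstack([a.astype(float), np.eye(n)])   # augment with identity
            for col in range(n):
                pivot = col + np.argmax(np.abs(aug[col:, col]))
                aug[[col, pivot]] = aug[[pivot, col]]       # row swap for stability
                aug[col] /= aug[col, col]                   # normalize pivot row
                for row in range(n):
                    if row != col:
                        aug[row] -= aug[row, col] * aug[col]  # eliminate column
            return aug[:, n:]

        a = np.random.default_rng(0).random((5, 5)) + 5 * np.eye(5)
        print(np.allclose(gauss_jordan_inverse(a) @ a, np.eye(5)))   # True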

  18. Creating Access Points to Instrument-Based Atmospheric Data: Perspectives from the ARM Metadata Manager

    NASA Astrophysics Data System (ADS)

    Troyan, D.

    2016-12-01

    The Atmospheric Radiation Measurement (ARM) program has been collecting data from instruments in diverse climate regions for nearly twenty-five years. These data are made available to all interested parties at no cost via specially designed tools found on the ARM website (www.arm.gov). Metadata is created and applied to the various datastreams to facilitate information retrieval using the ARM website, the ARM Data Discovery Tool, and data quality reporting tools. Over the last year, the Metadata Manager - a relatively new position within the ARM program - created two documents that summarize the state of ARM metadata processes: ARM Metadata Workflow, and ARM Metadata Standards. These documents serve as guides to the creation and management of ARM metadata. With many of ARM's data functions spread around the Department of Energy national laboratory complex and with many of the original architects of the metadata structure no longer working for ARM, there is increased importance on using these documents to resolve issues from data flow bottlenecks and inaccurate metadata to improving data discovery and organizing web pages. This presentation will provide some examples from the workflow and standards documents. The examples will illustrate the complexity of the ARM metadata processes and the efficiency by which the metadata team works towards achieving the goal of providing access to data collected under the auspices of the ARM program.

  19. Fast Geostatistical Inversion using Randomized Matrix Decompositions and Sketchings for Heterogeneous Aquifer Characterization

    NASA Astrophysics Data System (ADS)

    O'Malley, D.; Le, E. B.; Vesselinov, V. V.

    2015-12-01

    We present a fast, scalable, and highly-implementable stochastic inverse method for characterization of aquifer heterogeneity. The method utilizes recent advances in randomized matrix algebra and exploits the structure of the Quasi-Linear Geostatistical Approach (QLGA), without requiring a structured grid like Fast-Fourier Transform (FFT) methods. The QLGA framework is a more stable version of Gauss-Newton iterates for a large number of unknown model parameters, but provides unbiased estimates. The methods are matrix-free and do not require derivatives or adjoints, and are thus ideal for complex models and black-box implementation. We also incorporate randomized least-square solvers and data-reduction methods, which speed up computation and simulate missing data points. The new inverse methodology is coded in Julia and implemented in the MADS computational framework (http://mads.lanl.gov). Julia is an advanced high-level scientific programing language that allows for efficient memory management and utilization of high-performance computational resources. Inversion results based on series of synthetic problems with steady-state and transient calibration data are presented.
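
    The "randomized matrix algebra" ingredient typically means range-finding with a random test matrix followed by a small dense factorization. A generic Halko-style randomized SVD sketch in Python (not the MADS/Julia implementation) follows:

        import numpy as np

        def randomized_svd(a, rank, oversample=10):
            """Randomized low-rank SVD via random range-finding."""
            omega = np.random.randn(a.shape[1], rank + oversample)  # test matrix
            y = a @ omega                                  # sample the range of A
            q, _ = np.linalg.qr(y)                         # orthonormal range basis
            b = q.T @ a                                    # small projected matrix
            u_b, s, vt = np.linalg.svd(b, full_matrices=False)
            return (q @ u_b)[:, :rank], s[:rank], vt[:rank]

        # Quick check on a synthetic low-rank-plus-noise matrix:
        rng = np.random.default_rng(0)
        a = rng.normal(size=(500, 20)) @ rng.normal(size=(20, 400))
        u, s, vt = randomized_svd(a, rank=20)
        print(np.allclose(u * s @ vt, a, atol=1e-8))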

  20. Experiences with the Twitter Health Surveillance (THS) System

    PubMed Central

    Rodríguez-Martínez, Manuel

    2018-01-01

    Social media has become an important platform to gauge public opinion on topics related to our daily lives. In practice, processing these posts requires big data analytics tools since the volume of data and the speed of production overwhelm single-server solutions. Building an application to capture and analyze posts from social media can be a challenge simply because it requires combining a set of complex software tools that are often tricky to configure, tune, and maintain. In many instances, the application ends up being an assorted collection of Java/Scala programs or Python scripts that developers cobble together to generate the data products they need. In this paper, we present the Twitter Health Surveillance (THS) application framework. THS is designed as a platform to allow end-users to monitor a stream of tweets, and process the stream with a combination of built-in functionality and their own user-defined functions. We discuss the architecture of THS, and describe its implementation atop the Apache Hadoop Ecosystem. We also present several lessons learned while developing our current prototype. PMID:29607412

  1. Experiences with the Twitter Health Surveillance (THS) System.

    PubMed

    Rodríguez-Martínez, Manuel

    2017-06-01

    Social media has become an important platform to gauge public opinion on topics related to our daily lives. In practice, processing these posts requires big data analytics tools since the volume of data and the speed of production overwhelm single-server solutions. Building an application to capture and analyze posts from social media can be a challenge simply because it requires combining a set of complex software tools that are often tricky to configure, tune, and maintain. In many instances, the application ends up being an assorted collection of Java/Scala programs or Python scripts that developers cobble together to generate the data products they need. In this paper, we present the Twitter Health Surveillance (THS) application framework. THS is designed as a platform to allow end-users to monitor a stream of tweets, and process the stream with a combination of built-in functionality and their own user-defined functions. We discuss the architecture of THS, and describe its implementation atop the Apache Hadoop Ecosystem. We also present several lessons learned while developing our current prototype.
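
    The built-in-plus-user-defined-function pattern the THS records describe can be sketched generically: a processor applies a chain of functions to each incoming post, where any step may drop it. The Python sketch below is only an analogy for the idea; THS itself runs on the Hadoop ecosystem, and its actual API is not shown here. The function names and fields are hypothetical.

        # Generic stream-processing sketch: built-in steps plus user-defined functions.
        def keep_english(tweet):                     # built-in style filter
            return tweet if tweet.get("lang") == "en" else None

        def tag_flu_mentions(tweet):                 # a user-defined function (hypothetical)
            tweet["flu"] = "flu" in tweet["text"].lower()
            return tweet

        def process(stream, steps):
            for tweet in stream:
                for step in steps:
                    tweet = step(tweet)
                    if tweet is None:                # a step dropped this tweet
                        break
                else:
                    yield tweet

        sample = [{"lang": "en", "text": "I think I have the flu"},
                  {"lang": "es", "text": "hola"}]
        for out in process(sample, [keep_english, tag_flu_mentions]):
            print(out)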

  2. Complex Adaptive Systems Based Data Integration: Theory and Applications

    ERIC Educational Resources Information Center

    Rohn, Eliahu

    2008-01-01

    Data Definition Languages (DDLs) have been created and used to represent data in programming languages and in database dictionaries. This representation includes descriptions in the form of data fields and relations in the form of a hierarchy, with the common exception of relational databases where relations are flat. Network computing created an…

  3. Evaluation Strategies in Financial Education: Evaluation with Imperfect Instruments

    ERIC Educational Resources Information Center

    Robinson, Lauren; Dudensing, Rebekka; Granovsky, Nancy L.

    2016-01-01

    Program evaluation often suffers due to time constraints, imperfect instruments, incomplete data, and the need to report standardized metrics. This article about the evaluation process for the Wi$eUp financial education program showcases the difficulties inherent in evaluation and suggests best practices for assessing program effectiveness. We…

  4. The Cementitious Barriers Partnership Experimental Programs and Software Advancing DOE’s Waste Disposal/Tank Closure Efforts – 15436

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burns, Heather; Flach, Greg; Smith, Frank

    2015-01-27

    The U.S. Department of Energy Environmental Management (DOE-EM) Office of Tank Waste Management-sponsored Cementitious Barriers Partnership (CBP) is chartered with providing the technical basis for implementing cement-based waste forms and radioactive waste containment structures for long-term disposal. DOE needs in this area include the following to support progress in final treatment and disposal of legacy waste and closure of High-Level Waste (HLW) tanks in the DOE complex: long-term performance predictions, flow sheet development and flow sheet enhancements, and conceptual designs for new disposal facilities. The DOE-EM Cementitious Barriers Partnership is producing software and experimental programs resulting in new methods and data needed for end-users involved with environmental cleanup and waste disposal. Both the modeling tools and the experimental data have already benefited the DOE sites in the areas of performance assessments by increasing confidence backed up with modeling support, leaching methods, and transport properties developed for actual DOE materials. In 2014, the CBP Partnership released Version 2.0 of the CBP Software Toolbox, which provides concrete degradation models for 1) sulfate attack, 2) carbonation, and 3) chloride initiated rebar corrosion, and includes constituent leaching. These models are applicable and can be used by both DOE and the Nuclear Regulatory Commission (NRC) for service life and long-term performance evaluations and predictions of nuclear and radioactive waste containment structures across the DOE complex, including future SRS Saltstone and HLW tank performance assessments and special analyses, Hanford site HLW tank closure projects and other projects in which cementitious barriers are required, the Advanced Simulation Capability for Environmental Management (ASCEM) project which requires source terms from cementitious containment structures as input to their flow simulations, regulatory reviews of DOE performance assessments, and Nuclear Regulatory Commission reviews of commercial nuclear power plant (NPP) structures which are part of the overall US Energy Security program to extend the service life of NPPs. In addition, the CBP experimental programs have had a significant impact on the DOE complex by providing specific data unique to DOE sodium salt wastes at Hanford and SRS which are not readily available in the literature. Two recent experimental programs on cementitious phase characterization and on technetium (Tc) mobility have provided significant conclusions as follows: recent mineralogy characterization discussed in this paper illustrates that sodium salt waste form matrices are somewhat similar to but not the same as those found in blended cement matrices, which to date have been used in long-term thermodynamic modeling and contaminant sequestration as a first approximation. Utilizing the CBP-generated data in long-term performance predictions provides for a more defensible technical basis in performance evaluations. In addition, recent experimental studies related to technetium mobility indicate that conventional leaching protocols may not be conservative for direct disposal of Tc-containing waste forms in vadose zone environments. These results have the potential to influence the current Hanford supplemental waste treatment flow sheet and disposal conceptual design.

  5. Students' Explanations in Complex Learning of Disciplinary Programming

    ERIC Educational Resources Information Center

    Vieira, Camilo

    2016-01-01

    Computational Science and Engineering (CSE) has been denominated as the third pillar of science and as a set of important skills to solve the problems of a global society. Along with the theoretical and the experimental approaches, computation offers a third alternative to solve complex problems that require processing large amounts of data, or…

  6. Applications of genetic programming in cancer research.

    PubMed

    Worzel, William P; Yu, Jianjun; Almal, Arpit A; Chinnaiyan, Arul M

    2009-02-01

    The theory of Darwinian evolution is a fundamental keystone of modern biology. Late in the last century, computer scientists began adapting its principles, in particular natural selection, to complex computational challenges, leading to the emergence of evolutionary algorithms. The conceptual model of selective pressure and recombination in evolutionary algorithms allows scientists to efficiently search high-dimensional spaces for solutions to complex problems. In the last decade, genetic programming has been developed and extensively applied to the analysis of molecular data to classify cancer subtypes and to characterize the mechanisms of cancer pathogenesis and development. This article reviews current successes using genetic programming and discusses its potential impact on cancer research and treatment in the near future.

  7. The NASA Electronic Parts and Packaging (NEPP) Program: Insertion of New Electronics Technologies

    NASA Technical Reports Server (NTRS)

    LaBel, Kenneth A.; Sampson, Michael J.

    2007-01-01

    This viewgraph presentation gives an overview of the NASA Electronic Parts and Packaging (NEPP) Program's new electronics technology trends. The topics include: 1) The Changing World of Radiation Testing of Memories; 2) Even Application-Specific Tests are Costly!; 3) Hypothetical New Technology Part Qualification Cost; 4) Where we are; 5) Approaching FPGAs as More Than a "Part" for Reliability; 6) FPGAs Beget Novel Radiation Test Setups; 7) Understanding the Complex Radiation Data; 8) Tracking Packaging Complexity and Reliability for FPGAs; 9) Devices Supporting the FPGA Need to be Considered; 10) Summary of the New Electronic Technologies and Insertion into Flight Programs Workshop; and 11) Highlights of Panel Notes and Comments.

  8. Visualizing astronomy data using VRML

    NASA Astrophysics Data System (ADS)

    Beeson, Brett; Lancaster, Michael; Barnes, David G.; Bourke, Paul D.; Rixon, Guy T.

    2004-09-01

    Visualisation is a powerful tool for understanding the large data sets typical of astronomical surveys and can reveal unsuspected relationships and anomalous regions of parameter space which may be difficult to find programmatically. Visualisation is a classic information technology for optimising scientific return. We are developing a number of generic on-line visualisation tools as a component of the Australian Virtual Observatory project. The tools will be deployed within the framework of the International Virtual Observatory Alliance (IVOA), and follow agreed-upon standards to make them accessible by other programs and people. We and our IVOA partners plan to utilise new information technologies (such as grid computing and web services) to advance the scientific return of existing and future instrumentation. Here we present a new tool - VOlume - which visualises point data. Visualisation of astronomical data normally requires the local installation of complex software, the downloading of potentially large datasets, and very often time-consuming and tedious data format conversions. VOlume enables the astronomer to visualise data using just a web browser and plug-in. This is achieved using IVOA standards which allow us to pass data between Web Services, Java Servlet Technology and Common Gateway Interface programs. Data from a catalogue server can be streamed in eXtensible Markup Language format to a servlet which produces Virtual Reality Modeling Language output. The user selects elements of the catalogue to map to geometry and then visualises the result in a browser plug-in such as Cortona or FreeWRL. Other than requiring an input VOTable format file, VOlume is very general. While its major use will likely be to display and explore astronomical source catalogues, it can easily render other important parameter fields such as the sky and redshift coverage of proposed surveys or the sampling of the visibility plane by a rotation-synthesis interferometer.
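
    The XML-to-VRML step described here is simple to illustrate. The sketch below is a minimal stand-in, not VOlume's actual code: the <point> schema is a hypothetical simplification of a VOTable, and the output is a bare VRML97 PointSet that a plug-in such as FreeWRL can render.

    ```python
    # Minimal sketch of the XML-to-VRML conversion step described above.
    # The <point x= y= z=/> schema is a hypothetical stand-in for VOTable fields.
    import xml.etree.ElementTree as ET

    CATALOGUE = """<catalogue>
      <point x="0.1" y="0.2" z="0.3"/>
      <point x="0.4" y="0.5" z="0.6"/>
    </catalogue>"""

    def catalogue_to_vrml(xml_text: str) -> str:
        root = ET.fromstring(xml_text)
        coords = ", ".join(
            f'{p.get("x")} {p.get("y")} {p.get("z")}' for p in root.iter("point")
        )
        # A VRML97 PointSet node that a browser plug-in can render directly.
        return (
            "#VRML V2.0 utf8\n"
            "Shape { geometry PointSet { coord Coordinate { point [ "
            + coords + " ] } } }\n"
        )

    print(catalogue_to_vrml(CATALOGUE))
    ```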

  9. WALLY 1 ...A large, principal components regression program with varimax rotation of the factor weight matrix

    Treesearch

    James R. Wallis

    1965-01-01

    Written in Fortran IV and MAP, this computer program can handle up to 120 variables, and retain 40 principal components. It can perform simultaneous regression of up to 40 criterion variables upon the varimax rotated factor weight matrix. The columns and rows of all output matrices are labeled by six-character alphanumeric names. Data input can be from punch cards or...
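
    The pipeline WALLY 1 implemented (principal components, varimax rotation of the loadings, then regression of criterion variables on the rotated factors) restates compactly in modern terms. The sketch below is a NumPy reconstruction under that reading of the abstract, not the original Fortran IV/MAP code.

    ```python
    # NumPy sketch of principal-components regression with varimax rotation,
    # the pipeline WALLY 1 implemented (a reconstruction, not the original code).
    import numpy as np

    def varimax(loadings, gamma=1.0, max_iter=100, tol=1e-6):
        """Kaiser's varimax rotation of a p x k loading matrix."""
        p, k = loadings.shape
        rotation = np.eye(k)
        var = 0.0
        for _ in range(max_iter):
            rotated = loadings @ rotation
            u, s, vt = np.linalg.svd(
                loadings.T @ (rotated**3 - (gamma / p) * rotated
                              @ np.diag((rotated**2).sum(axis=0)))
            )
            rotation = u @ vt
            new_var = s.sum()
            if new_var < var * (1 + tol):
                break
            var = new_var
        return loadings @ rotation

    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 12))      # predictors (WALLY allowed up to 120)
    y = X[:, 0] + 0.5 * X[:, 1] + rng.standard_normal(200)

    Xc = X - X.mean(axis=0)
    _, svals, vt = np.linalg.svd(Xc, full_matrices=False)
    k = 4                                   # retained components (up to 40 in WALLY)
    loadings = vt[:k].T * svals[:k]         # p x k component loadings
    rotated = varimax(loadings)
    scores = Xc @ rotated @ np.linalg.pinv(rotated.T @ rotated)  # factor scores
    beta, *_ = np.linalg.lstsq(scores, y - y.mean(), rcond=None)
    print("regression weights on rotated factors:", np.round(beta, 3))
    ```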

  10. Nurse Family Partnership: Comparing Costs per Family in Randomized Trials Versus Scale-Up.

    PubMed

    Miller, Ted R; Hendrie, Delia

    2015-12-01

    The literature that addresses cost differences between randomized trials and full-scale replications is quite sparse. This paper examines how costs differed among three randomized trials and six statewide scale-ups of Nurse Family Partnership (NFP) intensive home visitation to low-income first-time mothers. A literature review provided data on pertinent trials. At our request, six well-established programs reported their total expenditures. We adjusted the costs to national prices based on mean hourly wages for registered nurses and then inflated them to 2010 dollars. A centralized data system provided utilization data. Replications had fewer home visits per family than trials (25 vs. 31, p = .05), lower costs per client ($8,860 vs. $12,398, p = .01), and lower costs per visit ($354 vs. $400, p = .30). Sample size limited the significance of these differences. In this type of labor-intensive program, costs probably were lower in scale-up than in randomized trials. Key cost drivers were attrition and the stable caseload size possible in an ongoing program. Our estimates reveal a wide variation in cost per visit across six state programs, which suggests that those planning replications should not expect a simple rule to guide cost estimations for scale-ups. Nevertheless, NFP replications probably achieved some economies of scale.
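
    The reported figures are internally consistent, as a two-line check shows (numbers taken directly from the abstract above):

    ```python
    # Check that cost per client divided by visits per family reproduces the
    # reported cost per visit (all figures quoted from the abstract above).
    for label, cost_per_client, visits in [("trials", 12_398, 31),
                                           ("replications", 8_860, 25)]:
        print(f"{label}: ${cost_per_client / visits:,.0f} per visit")
    # trials: $400 per visit; replications: $354 per visit
    ```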

  11. CROPPER: a metagene creator resource for cross-platform and cross-species compendium studies.

    PubMed

    Paananen, Jussi; Storvik, Markus; Wong, Garry

    2006-09-22

    Current genomic research methods provide researchers with enormous amounts of data. Combining data from different high-throughput research technologies commonly available in biological databases can lead to novel findings and increase research efficiency. However, combining data from different heterogeneous sources is often a very arduous task. These sources can be different microarray technology platforms, genomic databases, or experiments performed on various species. Our aim was to develop a software program that could facilitate the combining of data from heterogeneous sources, and thus allow researchers to perform genomic cross-platform/cross-species studies and to use existing experimental data for compendium studies. We have developed a web-based software resource called CROPPER, which uses the latest genomic information concerning different data identifiers and orthologous genes from the Ensembl database. CROPPER can be used to combine genomic data from different heterogeneous sources, allowing researchers to perform cross-platform/cross-species compendium studies without the need for complex computational tools or the requirement of setting up one's own in-house database. We also present an example of a simple cross-platform/cross-species compendium study based on publicly available Parkinson's disease data derived from different sources. CROPPER is a user-friendly and freely available web-based software resource that can be successfully used for cross-species/cross-platform compendium studies.

  12. FASTER 3: A generalized-geometry Monte Carlo computer program for the transport of neutrons and gamma rays. Volume 2: Users manual

    NASA Technical Reports Server (NTRS)

    Jordan, T. M.

    1970-01-01

    A description of the FASTER-III program for Monte Carlo calculation of photon and neutron transport in complex geometries is presented. Major revisions include the capability of calculating minimum weight shield configurations for primary and secondary radiation and optimal importance sampling parameters. The program description includes a users manual describing the preparation of input data cards, the printout from a sample problem including the data card images, definitions of Fortran variables, the program logic, and the control cards required to run on the IBM 7094, IBM 360, UNIVAC 1108 and CDC 6600 computers.
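
    The sampling idea behind such codes fits in a few lines: draw exponentially distributed free paths and tally slab penetration. This toy omits everything that makes FASTER-III useful (generalized geometry, secondary radiation, importance sampling), but it shows the estimator.

    ```python
    # Toy Monte Carlo transport: fraction of particles penetrating a slab of
    # thickness t when the mean free path is mfp. Illustrates the sampling
    # idea only; the real program is far more general.
    import math
    import random

    def transmission(thickness, mfp, n=100_000, seed=1):
        rng = random.Random(seed)
        hits = sum(rng.expovariate(1.0 / mfp) > thickness for _ in range(n))
        return hits / n

    t, mfp = 2.0, 1.0
    print("Monte Carlo:", transmission(t, mfp))   # ~0.135
    print("Analytic:  ", math.exp(-t / mfp))      # exp(-2) ~ 0.135
    ```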

  13. Genetic aspect of Alzheimer disease: Results of complex segregation analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sadonvick, A.D.; Lee, I.M.L.; Bailey-Wilson, J.E.

    1994-09-01

    The study was designed to evaluate the possibility that a single major locus will explain the segregation of Alzheimer disease (AD). The data were from the population-based AD Genetic Database and consisted of 402 consecutive, unrelated probands, diagnosed with either 'probable' or 'autopsy confirmed' AD, and their 2,245 first-degree relatives. In this analysis, a relative was considered affected with AD only when there were sufficient medical/autopsy data to support a diagnosis of AD as the most likely cause of the dementia. Transmission probability models allowing for a genotype-dependent and logistically distributed age of onset were used. The program REGTL in the S.A.G.E. computer program package was used for a complex segregation analysis. The models included correction for single ascertainment. Regressive familial effects were not estimated. The data were analyzed to test the single major locus (SML), random transmission, and no transmission (environmental) hypotheses. The results of the complex segregation analysis showed that (1) the SML model was the best fit, and (2) the non-genetic models could be rejected.

  14. Lapin Data Interchange Among Database, Analysis and Display Programs Using XML-Based Text Files

    NASA Technical Reports Server (NTRS)

    2005-01-01

    The purpose of grant NCC3-966 was to investigate and evaluate the interchange of application-specific data among multiple programs, each carrying out part of an analysis and design task. Previously this was done by creating a custom program to read data produced by one application and write it to a file whose format is specific to the second application needing all or part of that data. In this investigation, the data of interest are described using the XML markup language, which allows the data to be stored as a text string. Software to transform a task's output data into an XML string, and software to read an XML string and extract all or a portion of the data needed by another application, are used to link two independent applications as part of an overall design effort. This approach was initially used with a standard analysis program, Lapin, along with standard applications (a spreadsheet program, a relational database program, and a conventional dialog and display program) to demonstrate the successful sharing of data among independent programs. Most of the effort beyond that demonstration has been concentrated on the inclusion of more complex display programs. Specifically, a custom-written windowing program organized around dialogs to control the interactions has been combined with an independent CAD program (Open Cascade) that supports sophisticated display of CAD elements such as lines, spline curves, and surfaces, and with turbine-blade data produced by an independent blade design program (UD0300).
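
    The interchange pattern described, serializing one program's output to an XML string so the next program can extract just the fields it needs, looks roughly like the sketch below; the tag names are hypothetical, not the grant software's actual schema.

    ```python
    # Sketch of the XML-string interchange pattern described above.
    # Tag names are hypothetical; the grant's actual schema is not reproduced.
    import xml.etree.ElementTree as ET

    def to_xml(record):
        root = ET.Element("blade")
        for key, value in record.items():
            ET.SubElement(root, key).text = str(value)
        return ET.tostring(root, encoding="unicode")

    def read_fields(xml_text, wanted):
        root = ET.fromstring(xml_text)
        return {tag: root.findtext(tag) for tag in wanted}

    xml_string = to_xml({"chord": 0.12, "twist": 31.5, "span": 0.9})
    # A downstream display program extracts only what it needs:
    print(read_fields(xml_string, ["chord", "twist"]))
    ```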

  15. Type-Based Access Control in Data-Centric Systems

    NASA Astrophysics Data System (ADS)

    Caires, Luís; Pérez, Jorge A.; Seco, João Costa; Vieira, Hugo Torres; Ferrão, Lúcio

    Data-centric multi-user systems, such as web applications, require flexible yet fine-grained data security mechanisms. Such mechanisms are usually enforced by a specially crafted security layer, which adds extra complexity and often leads to error prone coding, easily causing severe security breaches. In this paper, we introduce a programming language approach for enforcing access control policies to data in data-centric programs by static typing. Our development is based on the general concept of refinement type, but extended so as to address realistic and challenging scenarios of permission-based data security, in which policies dynamically depend on the database state, and flexible combinations of column- and row-level protection of data are necessary. We state and prove soundness and safety of our type system, stating that well-typed programs never break the declared data access control policies.

  16. Tsunami Hazard in the Algerian Coastline

    NASA Astrophysics Data System (ADS)

    Amir, L. A.

    2008-05-01

    The Algerian coastline is located at the border between the African and Eurasian tectonic plates, which converge at approximately 4 to 7 mm/yr. The Alps and the Tellian Atlas result from this convergence. Historical and present-day data show the occurrence of earthquakes with magnitude up to 7 on the Richter scale in the northern part of the country. Cities have been destroyed and victims have numbered in the thousands. Recently, small seismic waves generated by a destructive earthquake (epicenter: 36.90N, 3.71E; Mw=6.8; Algeria, 2003, NEIC) were recorded on the French and Spanish coasts. This event again raised the issue of tsunami hazard in the western Mediterranean region. For the Algerian case, the assessment of seismic and tsunami hazard is of great interest because of the fast urban development of cities like Algiers. This study aims to provide scientific arguments to help in the elaboration of the Mediterranean tsunami alert program. This is a genuinely complex issue because (1) the western part of the sea is narrow, (2) constructions on the Algerian coastline do not respect safety standards, and (3) the seismic hazard is high. The present work is based on a numerical modeling approach. First, a database was created to gather information on seismology, tectonics, recorded or observed abnormal sea-level variations, and submarine and coastal topography for the western Mediterranean margin. This database helped in proposing a series of scenarios that could trigger a tsunami in the Mediterranean Sea. Seismic moment, rake, and focal depth are the major parameters that constrain the input seismic data. Undersea earthquake modeling and the resulting seabed deformations are then computed with a program adapted from the rngchn code, based on Okada's analytic equations. The last task consisted of calculating the initial water-surface displacement and simulating the triggered tsunami. Generation and propagation of the induced waves were estimated with another program adapted from the swan code, which solves the hydrodynamic shallow-water equations. The results obtained will first be presented; then, based on seismic wave travel times and run-up heights, discussion will focus on the tsunami alert program for cities marked by fast urban development.
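
    For a sense of the time scales such an alert program faces, tsunami propagation in the shallow-water approximation travels at c = sqrt(g*h). The sketch below uses an assumed basin depth and an assumed Algiers-to-Balearics distance purely for illustration; neither number comes from the study.

    ```python
    # Back-of-envelope tsunami travel time from the shallow-water wave speed
    # c = sqrt(g * h). Depth and distance are illustrative assumptions only.
    import math

    g = 9.81          # m/s^2
    depth = 2700.0    # m, assumed mean basin depth
    distance = 300e3  # m, assumed Algiers-to-Balearics scale

    speed = math.sqrt(g * depth)              # ~163 m/s (~585 km/h)
    print(f"speed  ~ {speed:.0f} m/s")
    print(f"travel ~ {distance / speed / 60:.0f} minutes")
    ```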

  17. Learning from Follow Up Surveys of Graduates: The Austin Teacher Program and the Benchmark Project. A Discussion Paper.

    ERIC Educational Resources Information Center

    Baker, Thomas E.

    This paper describes Austin College's (Texas) participation in the Benchmark Project, a collaborative followup study of teacher education graduates and their principals, focusing on the second round of data collection. The Benchmark Project was a collaboration of 11 teacher preparation programs that gathered and analyzed data comparing graduates…

  18. Strengthening Teachers' Abilities to Implement a Vision Health Program in Taiwanese Schools

    ERIC Educational Resources Information Center

    Chang, L. C.; Liao, L. L.; Chen, M. I.; Niu, Y. Z.; Hsieh, P. L.

    2017-01-01

    We designed a school-based, nationwide program called the "New Era in Eye Health" to strengthen teacher training and to examine whether the existence of a government vision care policy influenced teachers' vision care knowledge and students' behavior. Baseline data and 3-month follow-up data were compared. A random sample of teachers (n…

  19. Quality of Care for Patients with Type 2 Diabetes Mellitus in the Netherlands and the United States: A Comparison of Two Quality Improvement Programs

    PubMed Central

    Valk, Gerlof D; Renders, Carry M; Kriegsman, Didi MW; Newton, Katherine M; Twisk, Jos WR; van Eijk, Jacques ThM; van der Wal, Gerrit; Wagner, Edward H

    2004-01-01

    Objective: To assess differences in diabetes care and patient outcomes by comparing two multifaceted quality improvement programs in two different countries, and to increase knowledge of effective elements of such programs. Study Setting: Primary care in the ExtraMural Clinic (EMC) of the Department of General Practice of the Vrije Universiteit in Amsterdam, the Netherlands, and the Group Health Cooperative (GHC), a group-model health maintenance organization (HMO) in western Washington State in the United States. Data were collected from 1992 to 1997. Study Design: In this observational study two diabetes cohorts in which a quality improvement program was implemented were compared. Both programs included a medical record system, clinical practice guidelines, physician educational meetings, audit, and feedback. Only the Dutch program (EMC) included guidelines on the structure of diabetes care and a recall system. Only the GHC program included educational outreach visits, formation of multidisciplinary teams, and patient self-management support. Data Collection: Included were 379 EMC patients and 2,119 GHC patients with type 2 diabetes mellitus. Main process outcomes were the annual number of diabetes visits and the number of HbA1c and blood lipid measurements. Main patient outcomes were HbA1c and blood lipid levels. Multilevel analysis was used to adjust for dependency between repeated observations within one patient and for clustering of patients within general practices. Principal Findings: At the EMC, process outcomes and glycemic control improved more than at GHC; however, GHC had better baseline measures. There were no differences between programs in blood lipid control. During follow-up, intensification of pharmacotherapy was noted at both sites. Differences noted between programs were in line with differences in diabetes guidelines. Conclusions: Following implementation of guidelines and organizational improvement efforts, change occurred primarily in the process outcomes rather than in the patient outcomes. Although much effort was put into improving process and patient outcomes, both complex programs still showed only moderate effects. PMID:15230924

  20. Parental experiences of a developmentally focused care program for infants and children during prolonged hospitalization.

    PubMed

    So, Stephanie; Rogers, Alaine; Patterson, Catherine; Drew, Wendy; Maxwell, Julia; Darch, Jane; Hoyle, Carolyn; Patterson, Sarah; Pollock-BarZiv, Stacey

    2014-06-01

    This study investigates parental experiences and perceptions of the care received during their child's prolonged hospitalization. It relates this care to the Beanstalk Program (BP), a developmentally focused care program provided to these families within an acute care hospital setting. A total of 20 parents (of children hospitalized for between 1 and 15 months) completed the Measures of Processes of Care (MPOC-20) with additional questions regarding the BP. Scores rate the extent of the health-care provider's behaviour as perceived by the family, ranging from 'to a great extent' (7) to 'never' (1). Parents rated Respectful and Supportive Care (6.33) highest, while Providing General Information (5.65) was rated lowest. Eleven parents participated in a follow-up, qualitative, semi-structured interview. Interview data generated key themes: (a) parents strive for positive and normal experiences for their child within the hospital environment; (b) parents value the focus on child development in the midst of their child's complex medical care; and (c) appropriate developmentally focused education helps parents shift from feeling overwhelmed with a medically ill child to feelings of confidence and empowerment to care for their child and transition home. These results emphasize the importance of enhancing child development for hospitalized infants and young children through programs such as the BP. © The Author(s) 2013.

  1. Protocol for the process evaluation of a complex intervention designed to increase the use of research in health policy and program organisations (the SPIRIT study).

    PubMed

    Haynes, Abby; Brennan, Sue; Carter, Stacy; O'Connor, Denise; Schneider, Carmen Huckel; Turner, Tari; Gallego, Gisselle

    2014-09-27

    Process evaluation is vital for understanding how interventions function in different settings, including if and why they have different effects or do not work at all. This is particularly important in trials of complex interventions in 'real world' organisational settings where causality is difficult to determine. Complexity presents challenges for process evaluation, and process evaluations that tackle complexity are rarely reported. This paper presents the detailed protocol for a process evaluation embedded in a randomised trial of a complex intervention known as SPIRIT (Supporting Policy In health with Research: an Intervention Trial). SPIRIT aims to build capacity for using research in health policy and program agencies. We describe the flexible and pragmatic methods used for capturing, managing and analysing data across three domains: (a) the intervention as it was implemented; (b) how people participated in and responded to the intervention; and (c) the contextual characteristics that mediated this relationship and may influence outcomes. Qualitative and quantitative data collection methods include purposively sampled semi-structured interviews at two time points, direct observation and coding of intervention activities, and participant feedback forms. We provide examples of the data collection and data management tools developed. This protocol provides a worked example of how to embed process evaluation in the design and evaluation of a complex intervention trial. It tackles complexity in the intervention and its implementation settings. To our knowledge, it is the only detailed example of the methods for a process evaluation of an intervention conducted as part of a randomised trial in policy organisations. We identify strengths and weaknesses, and discuss how the methods are functioning during early implementation. Using 'insider' consultation to develop methods is enabling us to optimise data collection while minimising discomfort and burden for participants. Embedding the process evaluation within the trial design is facilitating access to data, but may impair participants' willingness to talk openly in interviews. While it is challenging to evaluate the process of conducting a randomised trial of a complex intervention, our experience so far suggests that it is feasible and can add considerably to the knowledge generated.

  2. Visualization of International Solar-Terrestrial Physics Program (ISTP) data

    NASA Technical Reports Server (NTRS)

    Kessel, Ramona L.; Candey, Robert M.; Hsieh, Syau-Yun W.; Kayser, Susan

    1995-01-01

    The International Solar-Terrestrial Physics Program (ISTP) is a multispacecraft, multinational program whose objective is to promote further understanding of the Earth's complex plasma environment. Extensive data sharing and data analysis will be needed to ensure the success of the overall ISTP program. For this reason, there has been a special emphasis on data standards throughout ISTP. One of the key tools will be the common data format (CDF), developed, maintained, and evolved at the National Space Science Data Center (NSSDC), with the set of ISTP implementation guidelines specially designed for space physics data sets by the Space Physics Data Facility (associated with the NSSDC). The ISTP guidelines were developed to facilitate searching, plotting, merging, and subsetting of data sets. We focus here on the plotting application. A prototype software package was developed to plot key parameter (KP) data from the ISTP program at the Science Planning and Operations Facility (SPOF). The ISTP Key Parameter Visualization Tool is based on the Interactive Data Language (IDL) and is keyed to the ISTP guidelines, reading data stored in CDF. With the combination of CDF, the ISTP guidelines, and the visualization software, we can look forward to easier and more effective data sharing and use among ISTP scientists.

  3. Brainlab: A Python Toolkit to Aid in the Design, Simulation, and Analysis of Spiking Neural Networks with the NeoCortical Simulator.

    PubMed

    Drewes, Rich; Zou, Quan; Goodman, Philip H

    2009-01-01

    Neuroscience modeling experiments often involve multiple complex neural network and cell model variants, complex input stimuli and input protocols, followed by complex data analysis. Coordinating all this complexity becomes a central difficulty for the experimenter. The Python programming language, along with its extensive library packages, has emerged as a leading "glue" tool for managing all sorts of complex programmatic tasks. This paper describes a toolkit called Brainlab, written in Python, that leverages Python's strengths for the task of managing the general complexity of neuroscience modeling experiments. Brainlab was also designed to overcome the major difficulties of working with the NCS (NeoCortical Simulator) environment in particular. Brainlab is an integrated model-building, experimentation, and data analysis environment for the powerful parallel spiking neural network simulator system NCS.

  4. Brainlab: A Python Toolkit to Aid in the Design, Simulation, and Analysis of Spiking Neural Networks with the NeoCortical Simulator

    PubMed Central

    Drewes, Rich; Zou, Quan; Goodman, Philip H.

    2008-01-01

    Neuroscience modeling experiments often involve multiple complex neural network and cell model variants, complex input stimuli and input protocols, followed by complex data analysis. Coordinating all this complexity becomes a central difficulty for the experimenter. The Python programming language, along with its extensive library packages, has emerged as a leading “glue” tool for managing all sorts of complex programmatic tasks. This paper describes a toolkit called Brainlab, written in Python, that leverages Python's strengths for the task of managing the general complexity of neuroscience modeling experiments. Brainlab was also designed to overcome the major difficulties of working with the NCS (NeoCortical Simulator) environment in particular. Brainlab is an integrated model-building, experimentation, and data analysis environment for the powerful parallel spiking neural network simulator system NCS. PMID:19506707

  5. Development of a Night Vision Goggle Heads-Up Display for Paratrooper Guidance

    DTIC Science & Technology

    2008-06-01

    ...and GPS data [MIC07], requiring altitude, position, velocity, acceleration, and angular rates for navigation or control. An internal GPS receiver...

    ...Language. There are several programming languages that provide the operating capabilities for this program. Languages like JAVA and C# provide an...

    ...acceleration, and angular rates. Figure 3.6 illustrates the MIDG hardware's input and output data. The sensor actually generates the INS data, which is...

  6. New challenges for Life Sciences flight project management

    NASA Technical Reports Server (NTRS)

    Huntoon, C. L.

    1999-01-01

    Scientists have conducted studies involving human spaceflight crews for over three decades. These studies have progressed from simple observations before and after each flight to sophisticated experiments during flights of several weeks up to several months. The findings from these experiments are available in the scientific literature. Management of these flight experiments has grown into a system fashioned from the Apollo Program style, focusing on budgeting, scheduling and allocation of human and material resources. While these areas remain important to the future, the International Space Station (ISS) requires that the Life Sciences spaceflight experiments expand the existing project management methodology. The use of telescience with state-of-the-art information technology and the multi-national crews and investigators challenges the former management processes. Actually conducting experiments on board the ISS will be an enormous undertaking and International Agreements and Working Groups will be essential in giving guidance to the flight project management. Teams forged in this matrix environment must be competent to make decisions and qualified to work with the array of engineers, scientists, and the spaceflight crews. In order to undertake this complex task, data systems not previously used for these purposes must be adapted so that the investigators and the project management personnel can all share in important information as soon as it is available. The utilization of telescience and distributed experiment operations will allow the investigator to remain involved in their experiment as well as to understand the numerous issues faced by other elements of the program. The complexity in formation and management of project teams will be a new kind of challenge for international science programs. Meeting that challenge is essential to assure success of the International Space Station as a laboratory in space.

  7. New challenges for Life Sciences flight project management.

    PubMed

    Huntoon, C L

    1999-01-01

    Scientists have conducted studies involving human spaceflight crews for over three decades. These studies have progressed from simple observations before and after each flight to sophisticated experiments during flights of several weeks up to several months. The findings from these experiments are available in the scientific literature. Management of these flight experiments has grown into a system fashioned from the Apollo Program style, focusing on budgeting, scheduling and allocation of human and material resources. While these areas remain important to the future, the International Space Station (ISS) requires that the Life Sciences spaceflight experiments expand the existing project management methodology. The use of telescience with state-of-the-art information technology and the multi-national crews and investigators challenges the former management processes. Actually conducting experiments on board the ISS will be an enormous undertaking and International Agreements and Working Groups will be essential in giving guidance to the flight project management. Teams forged in this matrix environment must be competent to make decisions and qualified to work with the array of engineers, scientists, and the spaceflight crews. In order to undertake this complex task, data systems not previously used for these purposes must be adapted so that the investigators and the project management personnel can all share in important information as soon as it is available. The utilization of telescience and distributed experiment operations will allow the investigator to remain involved in their experiment as well as to understand the numerous issues faced by other elements of the program. The complexity in formation and management of project teams will be a new kind of challenge for international science programs. Meeting that challenge is essential to assure success of the International Space Station as a laboratory in space.

  8. New challenges for life sciences flight project management

    NASA Astrophysics Data System (ADS)

    Huntoon, Carolyn L.

    1999-09-01

    Scientists have conducted studies involving human spaceflight crews for over three decades. These studies have progressed from simple observations before and after each flight to sophisticated experiments during flights of several weeks up to several months. The findings from these experiments are available in the scientific literature. Management of these flight experiments has grown into a system fashioned from the Apollo Program style, focusing on budgeting, scheduling and allocation of human and material resources. While these areas remain important to the future, the International Space Station (ISS) requires that the Life Sciences spaceflight experiments expand the existing project management methodology. The use of telescience with state-of-the-art information technology and the multi-national crews and investigators challenges the former management processes. Actually conducting experiments on board the ISS will be an enormous undertaking and International Agreements and Working Groups will be essential in giving guidance to the flight project management. Teams forged in this matrix environment must be competent to make decisions and qualified to work with the array of engineers, scientists, and the spaceflight crews. In order to undertake this complex task, data systems not previously used for these purposes must be adapted so that the investigators and the project management personnel can all share in important information as soon as it is available. The utilization of telescience and distributed experiment operations will allow the investigator to remain involved in their experiment as well as to understand the numerous issues faced by other elements of the program. The complexity in formation and management of project teams will be a new kind of challenge for international science programs. Meeting that challenge is essential to assure success of the International Space Station as a laboratory in space.

  9. Beef cattle growing and backgrounding programs.

    PubMed

    Peel, Derrell S

    2003-07-01

    The stocker industry is one of many diverse production and marketing activities that make up the United States beef industry. The stocker industry is probably the least understood industry sector and yet it plays a vital role in helping the industry exploit its competitive advantage of using forage resources and providing an economical means of adjusting the timing and volume of cattle and meat in a complex market environment.

  10. Generalize aerodynamic coefficient table storage, checkout and interpolation for aircraft simulation

    NASA Technical Reports Server (NTRS)

    Neuman, F.; Warner, N.

    1973-01-01

    The set of programs described has been used for rapidly introducing, checking out, and very efficiently using aerodynamic tables in complex aircraft simulations on the IBM 360. The preprocessor program reads in tables with different names and dimensions and stores them on disc storage according to the specified dimensions. The tables are read in from IBM cards in a format convenient for reducing the data from the original graphs. During table processing, new auxiliary tables are generated which are required for table cataloging and for efficient interpolation. In addition, DIMENSION statements for the tables as well as READ statements are punched so that they may be used in other programs for readout of the data from disc without chance of programming errors. A quick data-checking graphical output for all tables is provided in a separate program.
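
    The in-simulation step this tooling served, looking up and interpolating multidimensional coefficient tables, is a one-liner today. A sketch with NumPy/SciPy (assumed available; the table values are illustrative, not from the report):

    ```python
    # Sketch of multidimensional aerodynamic table interpolation, the operation
    # the described preprocessor set up for efficient in-simulation use.
    import numpy as np
    from scipy.interpolate import RegularGridInterpolator

    alpha = np.array([0.0, 5.0, 10.0, 15.0])   # angle of attack, deg
    mach = np.array([0.2, 0.5, 0.8])           # Mach number
    # Lift-coefficient table CL[alpha, mach]; illustrative values only.
    cl_table = np.array([[0.00, 0.00, 0.00],
                         [0.40, 0.42, 0.47],
                         [0.80, 0.84, 0.93],
                         [1.10, 1.15, 1.25]])

    cl = RegularGridInterpolator((alpha, mach), cl_table)
    print(cl([[7.5, 0.65]]))   # linear interpolation between table entries
    ```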

  11. Effectiveness of preventive home visits in reducing the risk of falls in old age: a randomized controlled trial

    PubMed Central

    Luck, Tobias; Motzek, Tom; Luppa, Melanie; Matschinger, Herbert; Fleischer, Steffen; Sesselmann, Yves; Roling, Gudrun; Beutner, Katrin; König, Hans-Helmut; Behrens, Johann; Riedel-Heller, Steffi G

    2013-01-01

    Background Falls in older people are a major public health issue, but the underlying causes are complex. We sought to evaluate the effectiveness of preventive home visits as a multifactorial, individualized strategy to reduce falls in community-dwelling older people. Methods Data were derived from a prospective randomized controlled trial with follow-up examination after 18 months. Two hundred and thirty participants (≥80 years of age) with functional impairment were randomized to intervention and control groups. The intervention group received up to three preventive home visits including risk assessment, home counseling intervention, and a booster session. The control group received no preventive home visits. Structured interviews at baseline and follow-up provided information concerning falls in both study groups. Random-effects Poisson regression evaluated the effect of preventive home visits on the number of falls controlling for covariates. Results Random-effects Poisson regression showed a significant increase in the number of falls between baseline and follow-up in the control group (incidence rate ratio 1.96) and a significant decrease in the intervention group (incidence rate ratio 0.63) controlling for age, sex, family status, level of care, and impairment in activities of daily living. Conclusion Our results indicate that a preventive home visiting program can be effective in reducing falls in community-dwelling older people. PMID:23788832

  12. Bio-inspired self-shaping ceramics

    PubMed Central

    Bargardi, Fabio L.; Le Ferrand, Hortense; Libanori, Rafael; Studart, André R.

    2016-01-01

    Shaping ceramics into complex and intricate geometries using cost-effective processes is desirable in many applications but still remains an open challenge. Inspired by plant seed dispersal units that self-fold on differential swelling, we demonstrate that self-shaping can be implemented in ceramics by programming the material's microstructure to undergo local anisotropic shrinkage during heat treatment. Such microstructural design is achieved by magnetically aligning functionalized ceramic platelets in a liquid ceramic suspension, subsequently consolidated through an established enzyme-catalysed reaction. By fabricating alumina compacts exhibiting bio-inspired bilayer architectures, we achieve deliberate control over shape change during the sintering step. Bending, twisting or combinations of these two basic movements can be successfully programmed to obtain a myriad of complex shapes. The simplicity and the universality of such a bottom-up shaping method makes it attractive for applications that would benefit from low-waste ceramic fabrication, temperature-resistant interlocking structures or unusual geometries not accessible using conventional top–down manufacturing. PMID:28008930

  13. Bio-inspired self-shaping ceramics

    NASA Astrophysics Data System (ADS)

    Bargardi, Fabio L.; Le Ferrand, Hortense; Libanori, Rafael; Studart, André R.

    2016-12-01

    Shaping ceramics into complex and intricate geometries using cost-effective processes is desirable in many applications but still remains an open challenge. Inspired by plant seed dispersal units that self-fold on differential swelling, we demonstrate that self-shaping can be implemented in ceramics by programming the material's microstructure to undergo local anisotropic shrinkage during heat treatment. Such microstructural design is achieved by magnetically aligning functionalized ceramic platelets in a liquid ceramic suspension, subsequently consolidated through an established enzyme-catalysed reaction. By fabricating alumina compacts exhibiting bio-inspired bilayer architectures, we achieve deliberate control over shape change during the sintering step. Bending, twisting or combinations of these two basic movements can be successfully programmed to obtain a myriad of complex shapes. The simplicity and the universality of such a bottom-up shaping method makes it attractive for applications that would benefit from low-waste ceramic fabrication, temperature-resistant interlocking structures or unusual geometries not accessible using conventional top-down manufacturing.

  14. A Health Belief Model-Social Learning Theory approach to adolescents' fertility control: findings from a controlled field trial.

    PubMed

    Eisen, M; Zellman, G L; McAlister, A L

    1992-01-01

    We evaluated an 8- to 12-hour Health Belief Model-Social Learning Theory (HBM-SLT)-based sex education program against several community- and school-based interventions in a controlled field experiment. Data on sexual and contraceptive behavior were collected from 1,444 adolescents unselected for gender, race/ethnicity, or virginity status in a pretest-posttest design. Over 60% completed the one-year follow-up. Multivariate analyses were conducted separately for each preintervention virginity status by gender grouping. The results revealed differential program impacts. First, for preintervention virgins, there were no gender or intervention differences in abstinence maintenance over the follow-up year. Second, female preintervention Comparison program virgins used effective contraceptive methods more consistently than those who attended the HBM-SLT program (p < 0.01); among males, the intervention programs were equally effective. Third, both interventions significantly increased contraceptive efficiency for teenagers who were sexually active before attending the programs. For males, the HBM-SLT program led to significantly greater follow-up contraceptive efficiency than the Comparison program with preintervention contraceptive efficiency controlled (p < 0.05); for females, the programs produced equivalent improvement. Implications for program planning and evaluation are discussed.

  15. Use of a Business Approach to Improve Disease Surveillance Data Management Systems and Information Technology Process in Florida's Bureau of STD Prevention and Control.

    PubMed

    Shiver, Stacy A; Schmitt, Karla; Cooksey, Adrian

    2009-01-01

    The business of sexually transmitted disease (STD) prevention and control demands technology capable of supporting a wide array of program activities, from the processing of laboratory test results to the complex and confidential process of contact investigation. The need for a tool that enables public health officials to successfully manage the complex operations encountered in an STD prevention and control program, and the need to operate in an increasingly resource-poor environment, led the Florida Bureau of STD to develop the Patient Reporting Investigation Surveillance Manager. Its unique approach, technical architecture, and sociotechnical philosophy have made this business application successful in real-time monitoring of disease burden for local communities, identification of emerging outbreaks, monitoring and assurance of appropriate treatments, improving access to laboratory data, and improving the quality of data for epidemiologic analysis. Additionally, the effort attempted to create and release a product that promoted the Centers for Disease Control and Prevention's ideas for integration of programs and processes.

  16. An Investigation of Unified Memory Access Performance in CUDA

    PubMed Central

    Landaverde, Raphael; Zhang, Tiansheng; Coskun, Ayse K.; Herbordt, Martin

    2015-01-01

    Managing memory between the CPU and GPU is a major challenge in GPU computing. A programming model, Unified Memory Access (UMA), has been recently introduced by Nvidia to simplify the complexities of memory management while claiming good overall performance. In this paper, we investigate this programming model and evaluate its performance and programming model simplifications based on our experimental results. We find that beyond on-demand data transfers to the CPU, the GPU is also able to request subsets of data it requires on demand. This feature allows UMA to outperform full data transfer methods for certain parallel applications and small data sizes. We also find, however, that for the majority of applications and memory access patterns, the performance overheads associated with UMA are significant, while the simplifications to the programming model restrict flexibility for adding future optimizations. PMID:26594668
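
    A managed-memory round trip can even be sketched from Python. The code below assumes a CUDA-capable GPU and that the installed Numba exposes cuda.managed_array (environment assumptions, and a stand-in for the cudaMallocManaged calls the paper benchmarks); the point it illustrates is that host and device touch the same buffer with no explicit copy.

    ```python
    # Sketch of Unified Memory Access from Python via Numba (assumes a CUDA GPU
    # and that numba.cuda.managed_array is available in the installed Numba).
    # Host and device touch the same allocation; no explicit memcpy is issued.
    import numpy as np
    from numba import cuda

    @cuda.jit
    def scale(buf, factor):
        i = cuda.grid(1)
        if i < buf.size:
            buf[i] *= factor

    buf = cuda.managed_array(1024, dtype=np.float32)
    buf[:] = 1.0                                 # host write, no copy call
    scale[(buf.size + 255) // 256, 256](buf, 2.0)
    cuda.synchronize()                           # required before host reads
    print(buf[:4])                               # [2. 2. 2. 2.]
    ```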

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cochran, John Russell

    The Al Tuwaitha nuclear complex near Baghdad contains a number of facilities from Saddam Hussein's nuclear weapons program. Past military operations, lack of upkeep, and looting have created an enormous radioactive waste problem at the Al Tuwaitha complex, which contains various uncharacterized radioactive wastes, yellow cake, sealed radioactive sources, and contaminated metals that must be constantly guarded. Iraq has never had a radioactive waste disposal facility, and the lack of one means that ever increasing quantities of radioactive material must be held in guarded storage. The Iraq Nuclear Facility Dismantlement and Disposal Program (the NDs Program) has been initiated by the U.S. Department of State (DOS) to assist the Government of Iraq (GOI) in eliminating the threats from poorly controlled radioactive materials, while building human capacities so that the GOI can manage other environmental cleanups in their country. The DOS is funding the IAEA to provide technical assistance via Technical Cooperation projects. Program coordination will be provided by the DOS, consistent with GOI policies, and Sandia National Laboratories will be responsible for coordination of participants and waste management support. Texas Tech University will continue to provide in-country assistance, including radioactive waste characterization and the stand-up of the Iraq Nuclear Services Company. The GOI owns the problems in Iraq and will be responsible for implementation of the NDs Program.

  18. The Impact of Criminal Justice Involvement and Housing Outcomes Among Homeless Persons with Co-occurring Disorders.

    PubMed

    Mitchell, Jessica N; Clark, Colleen; Guenther, Christina C

    2017-11-01

    The relationship between criminal justice involvement and housing among homeless persons with co-occurring disorders was examined. Program participants assisted in moving to stable housing were interviewed at baseline, six months, and discharge. Those who remained homeless at follow-up and discharge had significantly more time in jail in the past month than those who were housed. However, criminal justice involvement was not significantly related to housing status at the six month follow-up or discharge. Findings suggest that housing people with complex behavioral health issues reduces the likelihood of further criminal justice involvement.

  19. KENNEDY SPACE CENTER, FLA. - Center Director and former astronaut Roy D. Bridges, Jr., (holding scissors) cuts the ribbon at a ceremony officially opening the U.S. Astronaut Hall of Fame as part of the Kennedy Space Center Visitor Complex. Invited guests and dignitaries look on, such as former astronauts Edgar D. Mitchell on Bridges' left and James Lovell (hand up) and Buzz Aldrin on his right. The ceremony was held in conjunction with the induction of four Space Shuttle astronauts into the Hall of Fame including Daniel Brandenstein, Robert "Hoot" Gibson, Story Musgrave, and Sally Ride. Conceived by six of the Mercury Program astronauts, the U.S. Astronaut Hall of Fame opened in 1990 to provide a place where space travelers could be remembered for their participation and accomplishments in the U.S. space program. The four new inductees join 48 previously honored astronauts from the ranks of the Gemini, Apollo, Skylab, Apollo-Soyuz, and Space Shuttle programs.

    NASA Image and Video Library

    2003-06-20

    KENNEDY SPACE CENTER, FLA. - Center Director and former astronaut Roy D. Bridges, Jr., (holding scissors) cuts the ribbon at a ceremony officially opening the U.S. Astronaut Hall of Fame as part of the Kennedy Space Center Visitor Complex. Invited guests and dignitaries look on, such as former astronauts Edgar D. Mitchell on Bridges' left and James Lovell (hand up) and Buzz Aldrin on his right. The ceremony was held in conjunction with the induction of four Space Shuttle astronauts into the Hall of Fame including Daniel Brandenstein, Robert "Hoot" Gibson, Story Musgrave, and Sally Ride. Conceived by six of the Mercury Program astronauts, the U.S. Astronaut Hall of Fame opened in 1990 to provide a place where space travelers could be remembered for their participation and accomplishments in the U.S. space program. The four new inductees join 48 previously honored astronauts from the ranks of the Gemini, Apollo, Skylab, Apollo-Soyuz, and Space Shuttle programs.

  20. DnaSAM: Software to perform neutrality testing for large datasets with complex null models.

    PubMed

    Eckert, Andrew J; Liechty, John D; Tearse, Brandon R; Pande, Barnaly; Neale, David B

    2010-05-01

    Patterns of DNA sequence polymorphisms can be used to understand the processes of demography and adaptation within natural populations. High-throughput generation of DNA sequence data has historically been the bottleneck with respect to data processing and experimental inference; advances in marker technologies have largely solved this problem. Currently, the limiting step is computational, with most molecular population genetic software allowing only gene-by-gene analysis through a graphical user interface. An easy-to-use analysis program that allows both high-throughput processing of multiple sequence alignments and the flexibility to simulate data under complex demographic scenarios is currently lacking. We introduce a new program, named DnaSAM, which allows high-throughput estimation of DNA sequence diversity and neutrality statistics from experimental data, along with the ability to test those statistics via Monte Carlo coalescent simulations. These simulations are conducted using the ms program, which is able to incorporate several genetic parameters (e.g. recombination) and demographic scenarios (e.g. population bottlenecks). The output is a set of diversity and neutrality statistics with associated probability values under a user-specified null model, stored in an easily manipulated text file. © 2009 Blackwell Publishing Ltd.
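
    The testing step DnaSAM automates is conceptually simple: compare each observed statistic with its coalescent-simulated null distribution and report an empirical p-value. A minimal sketch, with random numbers standing in for ms output:

    ```python
    # Minimal Monte Carlo neutrality test: an observed statistic is compared
    # against its simulated null distribution. Values here are made up; DnaSAM
    # obtains them from ms coalescent simulations.
    import numpy as np

    rng = np.random.default_rng(42)
    simulated_tajimas_d = rng.normal(0.0, 1.0, 10_000)  # stand-in for ms output
    observed = -1.8

    # Two-tailed empirical p-value.
    p = np.mean(np.abs(simulated_tajimas_d) >= abs(observed))
    print(f"empirical p = {p:.4f}")
    ```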

  1. Engaging Underrepresented High School Students in Data Driven Storytelling: An Examination of Learning Experiences and Outcomes for a Cohort of Rising Seniors Enrolled in the Gaining Early Awareness and Readiness for Undergraduate Program (GEAR UP)

    ERIC Educational Resources Information Center

    Dierker, Lisa; Ward, Nadia; Alexander, Jalen; Donate, Emmanuel

    2017-01-01

    Background: Upward trends in data-oriented careers threaten to further increase the underrepresentation of both females and individuals from racial minority groups in programs focused on data analysis and applied statistics. To begin to develop the necessary skills for a data-oriented career, project-based learning seems the most promising given…

  2. LOFT data acquisition and visual display system (DAVDS) presentation program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bullock, M.G.; Miyasaki, F.S.

    1976-03-01

    The Data Acquisition and Visual Display System (DAVDS) at the Loss-of-Fluid Test Facility (LOFT) has 742-channel data recording capability, of which 576 channels are recorded digitally. The purpose of this computer program is to graphically present the data acquired and/or processed by the LOFT DAVDS. The program takes specially created plot data buffers of up to 1024 words and generates time history plots on the system electrostatic printer-plotter. The data can be extracted from two system input devices: magnetic disk or digital magnetic tape. Versatility has been designed into the program by providing the user three methods of scaling plots: automatic, control record, and manual. Time required to produce a plot on the system electrostatic printer-plotter varies from 30 to 90 seconds depending on the options selected. The basic computer and program details are described.

  3. Model Checker for Java Programs

    NASA Technical Reports Server (NTRS)

    Visser, Willem

    2007-01-01

    Java Pathfinder (JPF) is a verification and testing environment for Java that integrates model checking, program analysis, and testing. JPF consists of a custom-made Java Virtual Machine (JVM) that interprets bytecode, combined with a search interface to allow the complete behavior of a Java program to be analyzed, including interleavings of concurrent programs. JPF is implemented in Java, and its architecture is highly modular to support rapid prototyping of new features. JPF is an explicit-state model checker, because it enumerates all visited states and, therefore, suffers from the state-explosion problem inherent in analyzing large programs. It is suited to analyzing programs less than 10kLOC, but has been successfully applied to finding errors in concurrent programs up to 100kLOC. When an error is found, a trace from the initial state to the error is produced to guide the debugging. JPF works at the bytecode level, meaning that all of Java can be model-checked. By default, the software checks for all runtime errors (uncaught exceptions), assertion violations (it supports Java's assert), and deadlocks. JPF uses garbage collection and symmetry reductions of the heap during model checking to reduce state-explosion, as well as dynamic partial order reductions to lower the number of interleavings analyzed. JPF is capable of symbolic execution of Java programs, including symbolic execution of complex data such as linked lists and trees. JPF is extensible as it allows for the creation of listeners that can subscribe to events during searches. The creation of dedicated code to be executed in place of regular classes is supported and allows users to easily handle native calls and to improve the efficiency of the analysis.
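
    At its core, explicit-state model checking is graph search over the state space with a deadlock or error check at each state. The toy sketch below uses a hand-written transition relation in place of JPF's instrumented JVM; the guards are contrived so the search actually finds and reports a deadlock trace.

    ```python
    # Toy explicit-state model checker: breadth-first search over a state graph,
    # reporting deadlocks (non-goal states with no successors). The transition
    # relation is hand-written; JPF derives it from real Java bytecode.
    from collections import deque

    def successors(state):
        a, b = state
        moves = []
        if a < 2 and a == b:        # thread A advances only when even with B
            moves.append((a + 1, b))
        if b < 2 and b + 2 == a:    # buggy guard: B waits for A to be two ahead
            moves.append((a, b + 1))
        return moves

    def check(initial, goal):
        seen, frontier = {initial}, deque([(initial, [initial])])
        while frontier:
            state, trace = frontier.popleft()
            succs = successors(state)
            if not succs and state != goal:
                return f"deadlock at {state}, trace: {trace}"
            for s in succs:
                if s not in seen:
                    seen.add(s)
                    frontier.append((s, trace + [s]))
        return "no deadlock found"

    print(check((0, 0), (2, 2)))   # deadlock at (1, 0), trace: [(0, 0), (1, 0)]
    ```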

  4. On Solid Legal Ground: Bringing Information Literacy to Undergraduate-Level Law Courses

    ERIC Educational Resources Information Center

    Ryesky, Kenneth H.

    2007-01-01

    The complexities of the Internet and other electronic data technologies have greatly heightened the information literacy needs of students in all subjects. Law courses are common components of many undergraduate programs and other settings external to a law degree program. The field of law has many information literacy aspects which are…

  5. Modifications of highway air pollution models for complex geometries, volume II : wind tunnel test program.

    DOT National Transportation Integrated Search

    2002-09-01

    This is volume II of a two-volume report of a study to increase the scope and clarity of air pollution models for depressed highway and street canyon sites. It presents the atmospheric wind tunnel program conducted to increase the data base and i...

  6. Enabling Long-Term Oceanographic Research: Changing Data Practices, Information Management Strategies and Informatics

    NASA Astrophysics Data System (ADS)

    Baker, K. S.; Chandler, C. L.

    2008-12-01

    Data management and informatics research are in a state of change in terms of data practices, information strategies, and roles. New ways of thinking about data and data management can facilitate interdisciplinary global ocean science. To meet contemporary expectations for local data use and reuse by a variety of audiences, collaborative strategies involving diverse teams of information professionals are developing. Such changes are fostering the growth of information infrastructures that support multi-scale sampling, data integration, and nascent networks of data repositories. In this retrospective, two examples of oceanographic projects incorporating data management in partnership with long-term science programs are reviewed: the Palmer Station Long-Term Ecological Research program (Palmer LTER) and the United States Joint Global Ocean Flux Study (US JGOFS). Lessons learned - short-term and long-term - from a decade of data management within these two communities will be presented. A conceptual framework called Ocean Informatics provides one example for managing the complexities inherent to sharing oceanographic data. Elements are discussed that address the economies-of-scale as well as the complexities-of-scale pertinent to a broad vision of information management and scientific research.

  7. Massive processing of pyro-chromatogram mass spectra (py-GCMS) of soil samples using the PARAFAC2 algorithm

    NASA Astrophysics Data System (ADS)

    Cécillon, Lauric; Quénéa, Katell; Anquetil, Christelle; Barré, Pierre

    2015-04-01

    Because soil is highly heterogeneous at all scales (from the soil core to the globe), several measurements are often needed to obtain a meaningful value for a measured soil property, so large numbers of measurements may be required whatever the scale of the study. Moreover, several soil investigation techniques produce large and complex datasets; pyrolysis-gas chromatography-mass spectrometry (Py-GC-MS), for example, produces complex 3-way data. In this context, straightforward methods designed to speed up data treatment are needed to deal with large datasets. Py-GC-MS is a powerful and frequently used tool to characterize soil organic matter (SOM). However, treating the results of a Py-GC-MS analysis of a soil sample is time consuming (number of peaks, co-elution, etc.), and treating large sets of Py-GC-MS results is laborious. Moreover, peak position shifts and baseline drifts between analyses make automated treatment of GC-MS data difficult. These problems can be addressed using the Parallel Factor Analysis 2 algorithm (PARAFAC2; Kiers et al., 1999; Bro et al., 1999). This algorithm has been applied frequently to chromatographic data but never to analyses of SOM. We developed a Matlab routine, based on existing Matlab packages, dedicated to the simultaneous treatment of dozens of pyro-chromatogram mass spectra. We applied this routine to 40 soil samples. The benefits and expected improvements of our method will be discussed in our poster. References Kiers et al. (1999) PARAFAC2 - Part I. A direct fitting algorithm for the PARAFAC2 model. Journal of Chemometrics, 13: 275-294. Bro et al. (1999) PARAFAC2 - Part II. Modeling chromatographic data with retention time shifts. Journal of Chemometrics, 13: 295-309.

  8. Data Structures for Extreme Scale Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kahan, Simon

    As computing problems of national importance grow, the government meets the increased demand by funding the development of ever larger systems. The overarching goal of the work supported in part by this grant is to increase the efficiency of programming and performing computations on these large computing systems. In past work, we demonstrated that some computations once thought to require expensive hardware designs and/or complex, special-purpose programming may be executed efficiently on low-cost commodity cluster computing systems using a general-purpose “latency-tolerant” programming framework. One important application developed from the ideas underlying this framework is graph database technology supporting social network pattern matching, used by US intelligence agencies to more quickly identify potential terrorist threats. This database application has been spun out of the Pacific Northwest National Laboratory, a Department of Energy laboratory, into a commercial start-up, Trovares Inc. We explore an alternative application of the same underlying ideas to a well-studied challenge arising in engineering: solving unstructured sparse linear equations. Solving these equations is key to predicting the behavior of large electronic circuits before they are fabricated. Predicting that behavior ahead of fabrication means that designs can be optimized and errors corrected ahead of the expense of manufacture.
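
    The sparse-solver challenge has a concrete core that is easy to sketch. Below is a minimal compressed sparse row (CSR) matrix-vector product - the kernel at the heart of iterative solvers for unstructured sparse equations - whose irregular, data-dependent memory accesses are precisely what makes latency tolerance valuable. The class name and the example matrix are hypothetical.

    ```java
    // Minimal CSR sparse matrix-vector product. The column indices are
    // irregular, so x[colIdx[k]] is a latency-bound, cache-unfriendly access:
    // the motivation for latency-tolerant programming frameworks.
    public class CsrMatVec {
        final int n;          // number of rows
        final int[] rowPtr;   // where each row's nonzeros start in colIdx/values
        final int[] colIdx;   // column index of each stored nonzero
        final double[] values;

        CsrMatVec(int n, int[] rowPtr, int[] colIdx, double[] values) {
            this.n = n; this.rowPtr = rowPtr; this.colIdx = colIdx; this.values = values;
        }

        double[] multiply(double[] x) {
            double[] y = new double[n];
            for (int i = 0; i < n; i++) {
                double sum = 0.0;
                for (int k = rowPtr[i]; k < rowPtr[i + 1]; k++)
                    sum += values[k] * x[colIdx[k]]; // irregular access pattern
                y[i] = sum;
            }
            return y;
        }

        public static void main(String[] args) {
            // 2x2 example: [[4, 0], [2, 3]] times [1, 2] -> [4, 8]
            CsrMatVec m = new CsrMatVec(2, new int[]{0, 1, 3},
                    new int[]{0, 0, 1}, new double[]{4, 2, 3});
            for (double v : m.multiply(new double[]{1, 2})) System.out.println(v);
        }
    }
    ```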

  9. Engineering the object-relation database model in O-Raid

    NASA Technical Reports Server (NTRS)

    Dewan, Prasun; Vikram, Ashish; Bhargava, Bharat

    1989-01-01

    Raid is a distributed database system based on the relational model. O-Raid is an extension of the Raid system that supports complex data objects. The design of O-Raid is evolutionary and retains all features of relational database systems together with those of a general-purpose object-oriented programming language. O-Raid has several novel properties. Objects, classes, and inheritance are supported together with a predicate-based relational query language. O-Raid objects are compatible with C++ objects and may be read and manipulated by a C++ program without any 'impedance mismatch'. Relations and columns within relations may themselves be treated as objects with associated variables and methods. Relations may contain heterogeneous objects, that is, objects of more than one class in a certain column, which can individually evolve by being reclassified. Special facilities are provided to reduce the data search in a relation containing complex objects.

  10. Programming Language Software For Graphics Applications

    NASA Technical Reports Server (NTRS)

    Beckman, Brian C.

    1993-01-01

    New approach reduces repetitive development of features common to different applications. High-level programming language and interactive environment with access to graphical hardware and software created by adding graphical commands and other constructs to standardized, general-purpose programming language, "Scheme". Designed for use in developing other software incorporating interactive computer-graphics capabilities into application programs. Provides alternative to programming entire applications in C or FORTRAN, specifically ameliorating design and implementation of complex control and data structures typifying applications with interactive graphics. Enables experimental programming and rapid development of prototype software, and yields high-level programs serving as executable versions of software-design documentation.

  11. Smartfiles: An OO approach to data file interoperability

    NASA Technical Reports Server (NTRS)

    Haines, Matthew; Mehrotra, Piyush; Vanrosendale, John

    1995-01-01

    Data files for scientific and engineering codes typically consist of a series of raw data values whose descriptions are buried in the programs that interact with these files. In this situation, making even minor changes to the file structure, or sharing files between programs (interoperability), can only be done after careful examination of the data file and the I/O statements of the programs interacting with it. In short, scientific data files lack self-description, and other self-describing data techniques are not always appropriate or useful for them. By applying an object-oriented methodology to data files, we can add the intelligence required to improve data interoperability and provide an elegant mechanism for supporting complex, evolving, or multidisciplinary applications, while still supporting legacy codes. As a result, scientists and engineers should be able to share datasets with far greater ease, simplifying multidisciplinary applications and greatly facilitating remote collaboration between scientists.
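
    A minimal sketch of the self-description idea follows; the file format, class name, and field names here are hypothetical and much simpler than Smartfiles. The writer embeds a schema line in the file, and the reader recovers the field names from the file itself instead of hard-coding them.

    ```java
    import java.io.*;
    import java.util.*;

    // Toy self-describing data file: a header names each field, so a reading
    // program can interpret the raw values without the writer's source code.
    public class SmartFileDemo {
        public static void main(String[] args) throws IOException {
            File f = new File("sample.dat");

            // Write: schema header first, then the raw values.
            try (PrintWriter out = new PrintWriter(new FileWriter(f))) {
                out.println("# fields: time_s pressure_pa temperature_k");
                out.println("0.0 101325.0 288.15");
                out.println("1.0 101290.5 288.10");
            }

            // Read: recover the schema from the header instead of hard-coding it.
            try (BufferedReader in = new BufferedReader(new FileReader(f))) {
                String[] fields = in.readLine().replace("# fields: ", "").split(" ");
                String line;
                while ((line = in.readLine()) != null) {
                    String[] vals = line.split(" ");
                    Map<String, Double> record = new LinkedHashMap<>();
                    for (int i = 0; i < fields.length; i++)
                        record.put(fields[i], Double.parseDouble(vals[i]));
                    System.out.println(record);
                }
            }
        }
    }
    ```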

  12. [Effects of a Hospital Based Follow-Up Program for Mothers with Very Low Birth Weight Infants].

    PubMed

    Kim, Min Hee; Ji, Eun Sun

    2016-02-01

    This paper reports the results of a hospital-based follow-up program on parenting stress, parenting efficacy, and coping for mothers with very low birth weight (VLBW) infants. The follow-up program consisted of home visits by an expert group and a self-help program for 1 year. A non-equivalent control group pre-post quasi-experimental design was used. Participants were 70 mothers of low birth weight infants, assigned to one of two groups: an experimental group (n=28), which received the family support program, and a control group (n=27), which received the usual discharge education. Data were analyzed using the χ²-test, t-test, and ANCOVA with IBM SPSS Statistics 20.0. Mothers' parenting stress (F=5.66, p=.004) was significantly decreased in the experimental group. There were also significant increases in parenting efficacy (F=13.05, p<.001) and coping (F=8.91, p=.002) in the experimental group. The study findings suggest that a follow-up program for mothers with VLBW infants is an effective intervention to decrease mothers' parenting stress and to enhance parenting efficacy and coping.

  13. Program LRCDM2: Improved aerodynamic prediction program for supersonic canard-tail missiles with axisymmetric bodies

    NASA Technical Reports Server (NTRS)

    Dillenius, Marnix F. E.

    1985-01-01

    Program LRCDM2 was developed for supersonic missiles with axisymmetric bodies and up to two finned sections. Predicted are pressure distributions and loads acting on a complete configuration including effects of body separated flow vorticity and fin-edge vortices. The computer program is based on supersonic panelling and line singularity methods coupled with vortex tracking theory. Effects of afterbody shed vorticity on the afterbody and tail-fin pressure distributions can be optionally treated by companion program BDYSHD. Preliminary versions of combined shock expansion/linear theory and Newtonian/linear theory have been implemented as optional pressure calculation methods to extend the Mach number and angle-of-attack ranges of applicability into the nonlinear supersonic flow regime. Comparisons between program results and experimental data are given for a triform tail-finned configuration and for a canard controlled configuration with a long afterbody for Mach numbers up to 2.5. Initial tests of the nonlinear/linear theory approaches show good agreement for pressures acting on a rectangular wing and a delta wing with attached shocks for Mach numbers up to 4.6 and angles of attack up to 20 degrees.

  14. Multi-functional bis(alkynyl)gold(iii) N^C complexes with distinct mechanochromic luminescence and electroluminescence properties (DOI: 10.1039/c7sc02410j)

    PubMed Central

    Wong, Ben Yiu-Wing; Wong, Hok-Lai; Wong, Yi-Chun; Au, Vonika Ka-Man

    2017-01-01

    A new class of donor–acceptor type luminescent bis(alkynyl)gold(iii) N^C complexes has been synthesized and characterized. These gold(iii) complexes not only exhibit high photoluminescence quantum yields of up to 0.81, but also interesting mechanochromic luminescence behaviors that are reversible. Upon grinding, a dramatic luminescence color change from green to red can be observed in solid samples of the gold(iii) complexes, and the mechanochromic luminescence can be readily tuned via a judicious selection of substituents on the pyridine ring. In addition, solution-processable OLEDs based on this class of complexes with EQE values of up to 4.0% have been realized, representing the first demonstration of bis(alkynyl)gold(iii) N^C complexes as emissive materials in solution-processable OLEDs. PMID:29147519

  15. Human Services Occupations in the Two-Year College: A Handbook.

    ERIC Educational Resources Information Center

    Kiffer, Theodore E.; Burns, Martha A.

    This handbook is intended as a guide for community college administrators in setting up human services programs. (Human services programs refer here to training programs for paraprofessionals involved in helping people.) Data were gathered from 176 two-year colleges regarding the human services curricula offered in 1970-71. In Part I, the survey…

  16. Proximate Effects of a Child Sexual Abuse Prevention Program in Elementary School Children.

    ERIC Educational Resources Information Center

    Hebert, Martine; Lavoie, Francine; Piche, Christiane; Poitras, Michele

    2001-01-01

    The effects of the child sexual abuse prevention program ESPACE were evaluated with 133 Canadian children (grades 1-3). Children participating in the prevention program showed greater preventive knowledge and skills relative to children not participating. Follow-up data showed knowledge gains were maintained while the preventive skill gains may…

  17. Analysis of structural dynamic data from Skylab. Volume 1: Technical discussion

    NASA Technical Reports Server (NTRS)

    Demchak, L.; Harcrow, H.

    1976-01-01

    A compendium of Skylab structural dynamics analytical and test programs is presented. These programs are assessed to identify lessons learned from the structural dynamic prediction effort and to provide guidelines for future analysts and program managers of complex spacecraft systems. It is a synopsis of the structural dynamic effort performed under the Skylab Integration contract and specifically covers the development, utilization, and correlation of Skylab Dynamic Orbital Models.

  18. Root Cause Analyses of Nunn-McCurdy Breaches. Volume 2: Excalibur Artillery Projectile and the Navy Enterprise Resource Planning Program, with an Approach to Analyzing Program Complexity and Risk

    DTIC Science & Technology

    2012-01-01

    The mismatch, it was feared, would wreck the processes that the ERP was trying to improve; customers did not have the choice of putting the ERP...program features ahead of attempting a deep dive into the data looking for problems. An initial conceptual framework would allow a decision-maker to

  19. Resource utilization in children with tuberous sclerosis complex and associated seizures: a retrospective chart review study.

    PubMed

    Lennert, Barb; Farrelly, Eileen; Sacco, Patricia; Pira, Geraldine; Frost, Michael

    2013-04-01

    Seizures are a hallmark manifestation of tuberous sclerosis complex, yet data characterizing resource utilization are lacking. This retrospective chart review was performed to assess the economic burden of tuberous sclerosis complex with neurologic manifestations. Demographic and resource utilization data were collected for 95 patients for up to 5 years after tuberous sclerosis complex diagnosis. Mean age at diagnosis was 3.1 years, with complex partial and infantile spasms as the most common seizure types. In the first 5 years post-diagnosis, 83.2% required hospitalization, 30.5% underwent surgery, and the majority of patients (90.5%) underwent ≥3 testing procedures. In 79 patients with a full 5 years of data, hospitalizations, intensive care unit stays, diagnostic testing, and rehabilitation services decreased over the 5-year period. Resource utilization is cost-intensive in children with tuberous sclerosis complex and associated seizures during the first few years following diagnosis. Improving seizure control and reducing health care costs in this population remain unmet needs.

  20. Assessing Complex Academic Performance at the Group Level.

    ERIC Educational Resources Information Center

    Scarloss, Beth

    This study was a secondary analysis of data collected by staff of the Program for Complex Instruction (PCI). The purpose of the larger study was to investigate the effect on learning gains of having students know the content and performance standards on which they will be judged as well as the effect of using evaluation criteria. This study looks…

  1. International Semiotics: Item Difficulty and the Complexity of Science Item Illustrations in the PISA-2009 International Test Comparison

    ERIC Educational Resources Information Center

    Solano-Flores, Guillermo; Wang, Chao; Shade, Chelsey

    2016-01-01

    We examined multimodality (the representation of information in multiple semiotic modes) in the context of international test comparisons. Using Program of International Student Assessment (PISA)-2009 data, we examined the correlation of the difficulty of science items and the complexity of their illustrations. We observed statistically…

  2. L2 Grammatical Gender in a Complex Morphological System: The Case of German

    ERIC Educational Resources Information Center

    Spinner, Patti; Juffs, Alan

    2008-01-01

    In order to determine the nature of naturalistic learners' difficulty with grammatical gender in a complex morphological system, the longitudinal production data of early naturalistic L1-Italian and L1-Turkish learners acquiring German are examined in light of current theories of gender within Chomsky's (1995) Minimalist Program. After…

  3. Dynamic Processes of Speech Development by Seven Adult Learners of Japanese in a Domestic Immersion Context

    ERIC Educational Resources Information Center

    Fukuda, Makiko

    2014-01-01

    The present study revealed the dynamic process of speech development in a domestic immersion program by seven adult beginning learners of Japanese. The speech data were analyzed with fluency, accuracy, and complexity measurements at group, interindividual, and intraindividual levels. The results revealed the complex nature of language development…

  4. Real-time data reduction capabilities at the Langley 7 by 10 foot high speed tunnel

    NASA Technical Reports Server (NTRS)

    Fox, C. H., Jr.

    1980-01-01

    The 7 by 10 foot high speed tunnel performs a wide range of tests employing a variety of model installation methods. To support the reduction of static data from this facility, a generalized wind tunnel data reduction program had been developed for use on the Langley central computer complex. The capabilities of a version of this generalized program adapted for real time use on a dedicated on-site computer are discussed. The input specifications, instructions for the console operator, and full descriptions of the algorithms are included.

  5. Return-to-Work Within a Complex and Dynamic Organizational Work Disability System.

    PubMed

    Jetha, Arif; Pransky, Glenn; Fish, Jon; Hettinger, Lawrence J

    2016-09-01

    Background Return-to-work (RTW) within a complex organizational system can be associated with suboptimal outcomes. Purpose To apply a sociotechnical systems perspective to investigate complexity in RTW, and to utilize system dynamics modeling (SDM) to examine how feedback relationships between individual, psychosocial, and organizational factors make up the work disability system and influence RTW. Methods SDMs were developed within two companies. Thirty stakeholders, including senior managers, frontline supervisors, and workers, participated in model-building sessions. Participants were asked questions that elicited information about the structure of the work disability system, and their answers were translated into feedback loops. To parameterize the model, participants were asked to estimate the shape and magnitude of the relationships between key model components. Data from the published literature were also accessed to supplement participant estimates. Data were entered into a model created in the software program Vensim. Simulations were conducted to examine how financial-incentive and light-duty work disability-related policies, utilized by the participating companies, influenced RTW likelihood and preparedness. Results The SDMs were multidimensional, including individual attitudinal characteristics, health factors, and organizational components. Among the causal pathways uncovered, psychosocial components including workplace social support, supervisor and co-worker pressure, and supervisor-frontline worker communication impacted RTW likelihood and preparedness. Interestingly, SDM simulations showed that work disability-related policies in both companies had a diminishing or opposing impact on RTW preparedness and likelihood. Conclusion SDM provides a novel systems view of RTW. Policy and psychosocial component relationships within the system have important implications for RTW and may contribute to unanticipated outcomes.
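
    To illustrate what simulating such a model involves, here is a minimal sketch of a two-stock system dynamics model integrated forward in time. The stocks, flows, and parameter values are hypothetical placeholders, not the Vensim models built in this study.

    ```java
    // Toy system dynamics simulation: stocks are integrated forward from
    // feedback-driven flows using simple Euler steps. All structure and
    // numbers here are invented for illustration only.
    public class RtwSdmSketch {
        public static void main(String[] args) {
            double offWork = 100.0;    // stock: workers off work (hypothetical)
            double backAtWork = 0.0;   // stock: workers returned to work
            double supportLevel = 0.6; // constant: workplace social support, 0..1
            double dt = 0.25;          // time step (weeks)

            for (double t = 0; t <= 52; t += dt) {
                // flow: return rate rises with workplace support
                double returnRate = 0.05 * supportLevel * offWork;
                // flow: relapse feeds back from the returned-to-work stock
                double relapseRate = 0.01 * backAtWork;
                offWork    += dt * (relapseRate - returnRate);
                backAtWork += dt * (returnRate - relapseRate);
            }
            System.out.printf("off work after 1 year: %.1f%n", offWork);
        }
    }
    ```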

  6. Impact of a Web-based worksite health promotion program on absenteeism.

    PubMed

    Niessen, Maurice A J; Kraaijenhagen, Roderik A; Dijkgraaf, Marcel G W; Van Pelt, Danielle; Van Kalken, Coen K; Peek, Niels

    2012-04-01

    To evaluate the effect of participation in a comprehensive, Web-based worksite health promotion program on absenteeism. The study population consisted of Dutch workers employed at a large financial services company. Linear regression was used to assess the impact of program attendance on the difference between baseline and follow-up absenteeism rates, controlling for gender, age, job level, years of employment, and noncompletion of the program. Data from 20,797 individuals were analyzed; 3,826 individuals enrolled in the program during the study period. A 20.3% reduction in absenteeism was shown among program attendees compared with nonparticipants during a median follow-up period of 23.3 months. Participating in the worksite health promotion program led to an immediate reduction in absenteeism. Improved psychological well-being, increased exercise, and weight reduction are possible pathways toward this reduction.

  7. Unique barriers and needs in weight management for obese women with fibromyalgia.

    PubMed

    Craft, Jennifer M; Ridgeway, Jennifer L; Vickers, Kristin S; Hathaway, Julie C; Vincent, Ann; Oh, Terry H

    2015-01-01

    The aim of this study was to identify barriers, needs, and preferences of weight management intervention for women with fibromyalgia (FM). Obesity occurs at higher rates in women with fibromyalgia than in the population at large, and no study to date has taken a qualitative approach to better understand how these women view weight management in relation to their disease and vice versa. We designed a qualitative interview study with women patients with FM and obesity. Women (N = 15) were recruited on the basis of their participation in a fibromyalgia treatment program (FTP) within the prior year. The women approached for the study met the following inclusion criteria: confirmed diagnosis of FM, age between 30 and 60 years (M = 51 ± 6.27), and body mass index (BMI) ≥ 30 (M = 37.88 ± 4.87). Patients completed questionnaire data prior to their participation in focus groups (N = 3), including weight loss history, physical activity data, the Revised Fibromyalgia Impact Questionnaire (FIQR), and the Patient Health Questionnaire 9-item (PHQ-9). Three focus group interviews were conducted to collect qualitative data. Consistent themes were revealed within and between groups. Patients expressed the complex relationships between FM symptoms, daily responsibilities, and weight management. Weight was viewed as an emotionally laden topic requiring compassionate delivery of programming from an empathetic leader who is knowledgeable about fibromyalgia. Patients view themselves as complex and different, requiring a specifically tailored weight management program for women with FM. Women with FM identify unique barriers to weight management, including the complex interrelationships between symptoms of FM and health behaviors such as diet and exercise. They prefer a weight management program for women with FM that consists of an in-person, group-based approach with a leader, but are open to a tailored conventional weight management program. Feasibility may be one of the biggest barriers to such a program, both from an institutional and an individual perspective. Copyright © 2015 Elsevier Inc. All rights reserved.

  8. Follow-up: who does it and how do they do it?

    PubMed

    Chamberlain, J M; Carraccio, C L

    1994-12-01

    Follow-up appointments and phone contact after discharge are important components of the emergency department (ED) encounter. We surveyed ED directors at hospitals with accredited pediatric residency programs to determine mechanisms for follow-up 1) to chart progression of illness (POI), 2) for positive laboratory or x-ray results, and 3) for specific conditions such as child abuse, burns, and complex wounds. One hundred thirty-five of 207 program directors responded (65%). To follow POI, 54% of EDs use the ED itself, and 59% send patients to community physicians. Of those that use community physicians, 24% do not notify the physician to expect a follow-up visit, and 27% do not send a copy of the ED chart to the physician's office. To follow POI, 20% of EDs have no formal mechanism for telephone follow-up, and 16% keep no record of phone contact. For follow-up of positive laboratory tests or x-rays, results are better; only 4% and 5%, respectively, do not keep records of phone contact. Eleven percent of EDs have no mechanism for follow-up of child abuse. Mechanisms for follow-up of children seen in the ED are variable. We have identified deficiencies in the following areas: 1) lack of communication with the physician providing follow-up, 2) lack of documentation regarding subsequent patient contacts for POI and positive test results, and 3) lack of resources to follow victims of child abuse. These deficiencies have potential implications for optimal patient outcome.

  9. MEMOPS: data modelling and automatic code generation.

    PubMed

    Fogh, Rasmus H; Boucher, Wayne; Ionides, John M C; Vranken, Wim F; Stevens, Tim J; Laue, Ernest D

    2010-03-25

    In recent years the amount of biological data has exploded to the point where much useful information can only be extracted by complex computational analyses. Such analyses are greatly facilitated by metadata standards, both in terms of the ability to compare data originating from different sources and in terms of exchanging data in standard forms, e.g. when running processes on a distributed computing infrastructure. However, standards thrive on stability, whereas science tends to move constantly, with new methods being developed and old ones modified. Therefore maintaining both metadata standards, and all the code that is required to make them useful, is a non-trivial problem. Memops is a framework that uses an abstract definition of the metadata (described in UML) to generate internal data structures and subroutine libraries for data access (application programming interfaces - APIs - currently in Python, C and Java) and data storage (in XML files or databases). For an individual project, these libraries obviate the need to write code for input parsing, validity checking or output. Memops also ensures that the code is always internally consistent, massively reducing the need for code reorganisation. Across a scientific domain a Memops-supported data model makes it easier to support complex standards that can capture all the data produced in a scientific area, share them among all programs in a complex software pipeline, and carry them forward to deposition in an archive. The principles behind the Memops generation code are presented, along with example applications in Nuclear Magnetic Resonance (NMR) spectroscopy and structural biology.
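
    The generation step can be pictured with a toy example. The sketch below is hypothetical and far simpler than the UML-driven Memops machinery (the class and field names are invented): from an abstract description of fields it emits the source of a data-access class whose setters enforce validity, illustrating why hand-written parsing and checking code becomes unnecessary.

    ```java
    import java.util.*;

    // Toy code generator: turns a list of abstract field descriptions into
    // the source text of a data-access class with built-in validity checks.
    public class TinyCodegen {
        record Field(String name, String type, boolean required) {}

        static String generate(String className, List<Field> fields) {
            StringBuilder src = new StringBuilder("public class " + className + " {\n");
            for (Field f : fields)
                src.append("    private ").append(f.type()).append(' ')
                   .append(f.name()).append(";\n");
            for (Field f : fields) {
                String cap = Character.toUpperCase(f.name().charAt(0)) + f.name().substring(1);
                src.append("    public void set").append(cap)
                   .append('(').append(f.type()).append(" v) {\n");
                if (f.required())  // generated setter rejects missing values
                    src.append("        if (v == null) throw new IllegalArgumentException(\"")
                       .append(f.name()).append(" is required\");\n");
                src.append("        this.").append(f.name()).append(" = v;\n    }\n");
            }
            return src.append("}\n").toString();
        }

        public static void main(String[] args) {
            System.out.print(generate("NmrSpectrum", List.of(
                new Field("name", "String", true),
                new Field("numDim", "Integer", false))));
        }
    }
    ```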

  10. Data Processing Center of Radioastron Project: 3 years of operation.

    NASA Astrophysics Data System (ADS)

    Shatskaya, Marina

    The ASC Data Processing Center (DPC) of the Radioastron project is a fail-safe, centralized complex of interconnected software and hardware components, along with organizational procedures. The tasks facing the scientific data processing center are the organization of service information exchange, the collection and storage of all scientific data, and science-oriented data processing. The DPC takes part in information exchange with two tracking stations in Pushchino (Russia) and Green Bank (USA), about 30 ground telescopes, the ballistic center, tracking headquarters, and the session scheduling center. Enormous flows of information go to the Astro Space Center, and to handle these data volumes we have developed specialized network infrastructure, Internet channels, and storage. The computer complex has been designed at the Astro Space Center (ASC) of the Lebedev Physical Institute and includes: 800 TB of on-line storage; a 2000 TB hard drive archive; a backup system on magnetic tapes (2000 TB); 24 TB of redundant storage at Pushchino Radio Astronomy Observatory; Web and FTP servers; and DPC management and data transmission networks. The structure and functions of the ASC Data Processing Center are fully adequate to the data processing requirements of the Radioastron mission, as successfully confirmed during the fringe search, the Early Science Program, and the first year of the Key Science Program.

  11. How Do Mode and Timing of Follow-up Surveys Affect Evaluation Success?

    ERIC Educational Resources Information Center

    Koundinya, Vikram; Klink, Jenna; Deming, Philip; Meyers, Andrew; Erb, Kevin

    2016-01-01

    This article presents the analysis of evaluation methods used in a well-designed and comprehensive evaluation effort of a significant Extension program. The evaluation data collection methods were analyzed by questionnaire mode and timing of follow-up surveys. Response rates from the short- and long-term follow-ups and different questionnaire…

  12. PolyCheck: Dynamic Verification of Iteration Space Transformations on Affine Programs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bao, Wenlei; Krishnamoorthy, Sriram; Pouchet, Louis-noel

    2016-01-11

    High-level compiler transformations, especially loop transformations, are widely recognized as critical optimizations for restructuring programs to improve data locality and expose parallelism. Guaranteeing the correctness of program transformations is essential, and to date three main approaches have been developed: proving the equivalence of affine programs, matching the execution traces of programs, and checking bit-by-bit equivalence of the programs' outputs. Each technique suffers from limitations in the kinds of transformations supported, in space complexity, or in sensitivity to the testing dataset. In this paper, we take a novel approach addressing all three limitations: an automatic bug checker that verifies any iteration-reordering transformation on affine programs, including non-affine transformations, with space consumption proportional to the original program data and robustness to arbitrary datasets of a given size. We achieve this by exploiting the structure of affine program control- and data-flow to generate, at compile time, lightweight checker code to be executed within the transformed program. Experimental results assess the correctness and effectiveness of our method and its increased coverage over previous approaches.
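
    For contrast with PolyCheck's compile-time instrumentation, the sketch below illustrates only the simplest of the baseline approaches mentioned above: running the original and transformed loop nests on the same dataset and comparing outputs. The kernel and the interchange transformation are hypothetical examples, not taken from the paper.

    ```java
    import java.util.Arrays;

    // Output-comparison check of a loop transformation on an affine kernel.
    public class LoopXformCheck {
        static final int N = 64;

        // Original affine loop nest: row-major traversal.
        static void original(double[][] a) {
            for (int i = 1; i < N; i++)
                for (int j = 1; j < N; j++)
                    a[i][j] = a[i - 1][j] + a[i][j - 1];
        }

        // Transformed nest: loops interchanged. The interchange is legal here
        // because both dependences, (1,0) and (0,1), stay lexicographically
        // positive -- exactly the property a checker must confirm.
        static void transformed(double[][] a) {
            for (int j = 1; j < N; j++)
                for (int i = 1; i < N; i++)
                    a[i][j] = a[i - 1][j] + a[i][j - 1];
        }

        public static void main(String[] args) {
            double[][] a = new double[N][N], b = new double[N][N];
            for (int k = 0; k < N; k++) { a[0][k] = a[k][0] = 1.0; b[0][k] = b[k][0] = 1.0; }
            original(a);
            transformed(b);
            System.out.println(Arrays.deepEquals(a, b)
                ? "transformation verified on this dataset"
                : "bug: outputs differ");
        }
    }
    ```

    Note the limitation the paper targets: this style of check is only as strong as the dataset it runs on, whereas PolyCheck's generated checkers are robust to arbitrary datasets of a given size.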

  13. Adult Illiterates and Adult Literacy Programs: A Summary of Descriptive Data.

    ERIC Educational Resources Information Center

    McGrail, Janet

    A portrait of illiterates and literacy programs in the United States in the 1980s is derived from this summary of the most up-to-date, valid information that could be obtained from a literature review. The first section on adult illiterates identifies data sources, numbers of illiterates, and characteristics of the five main groups (the elderly,…

  14. Representation and Use of Temporal Information in ONCOCIN

    PubMed Central

    Kahn, Michael G.; Ferguson, Jay C.; Shortliffe, Edward H.; Fagan, Lawrence M.

    1985-01-01

    The past medical history of a patient is a complex collection of events, yet an understanding of these past events is critical for effective medical diagnostic and therapeutic decisions. Although computers can store vast quantities of patient data, diagnostic and therapeutic computer programs have had difficulty in accessing and analyzing the collections of patient information that are clinically pertinent to a specific decision facing a particular patient at a given moment in the course of disease. Without some model of the patient's past, the computer cannot fully interpret the meaning of the available patient data. We present some of the difficulties that were encountered in ONCOCIN, a cancer chemotherapy planning program. This program must be able to reason about the patient's past treatment history in order to generate a therapy plan that is responsive to the problems he or she may have encountered in the past. A design is presented that supports a more intuitive approach to capturing and analyzing important temporal relationships in a patient's computer record. In order to represent the time course of a patient, we have implemented a structure called the temporal network and a temporal syntax for data storage and retrieval. Using this system, ONCOCIN is able to quickly obtain data that are patient-specific and context-sensitive. Adding the temporal network to the ONCOCIN system has markedly improved the program's handling of complex temporal issues.
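
    A toy version of the retrieval problem makes the idea tangible. The sketch below (hypothetical classes, far simpler than ONCOCIN's temporal network) stores time-stamped observations and answers a context-sensitive query such as "the most recent white-cell count before a given treatment date".

    ```java
    import java.time.LocalDate;
    import java.util.*;

    // Time-stamped clinical observations with context-sensitive retrieval.
    public class TemporalRecordSketch {
        record Observation(LocalDate date, String kind, double value) {}

        // Latest observation of a given kind strictly before a cutoff date.
        static Optional<Observation> latestBefore(List<Observation> history,
                                                  String kind, LocalDate cutoff) {
            return history.stream()
                .filter(o -> o.kind().equals(kind) && o.date().isBefore(cutoff))
                .max(Comparator.comparing(Observation::date));
        }

        public static void main(String[] args) {
            List<Observation> history = List.of(
                new Observation(LocalDate.of(1985, 3, 1), "wbc", 4.2),
                new Observation(LocalDate.of(1985, 4, 12), "wbc", 3.1),
                new Observation(LocalDate.of(1985, 5, 2), "wbc", 3.8));
            latestBefore(history, "wbc", LocalDate.of(1985, 5, 1))
                .ifPresent(o -> System.out.println("latest wbc before cycle: " + o));
        }
    }
    ```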

  15. TERRA: Building New Communities for Advanced Biofuels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cornelius, Joe; Mockler, Todd; Tuinstra, Mitch

    ARPA-E’s Transportation Energy Resources from Renewable Agriculture (TERRA) program is bringing together top experts from different disciplines – agriculture, robotics and data analytics – to rethink the production of advanced biofuel crops. ARPA-E Program Director Dr. Joe Cornelius discusses the TERRA program and explains how ARPA-E’s model enables multidisciplinary collaboration among diverse communities. The video focuses on two TERRA projects—Donald Danforth Center and Purdue University—that are developing and integrating cutting-edge remote sensing platforms, complex data analytics tools and plant breeding technologies to tackle the challenge of sustainably increasing biofuel stocks.

  16. Data compression of discrete sequence: A tree based approach using dynamic programming

    NASA Technical Reports Server (NTRS)

    Shivaram, Gurusrasad; Seetharaman, Guna; Rao, T. R. N.

    1994-01-01

    A dynamic programming based approach for data compression of a 1D sequence is presented. Compression of an input sequence of size N to a smaller size k is achieved by dividing the input sequence into k subsequences and replacing each subsequence by its average value. The partitioning of the input sequence is carried out so as to minimize the mean squared error in the reconstructed sequence. The complexity of finding the partition that yields the optimal compressed sequence is reduced by using the dynamic programming approach, which is presented.
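
    A minimal sketch of the method as described follows (the input values are invented for illustration): prefix sums give each candidate segment's squared error in O(1), and the dynamic program chooses the k-way partition minimizing total error in O(k*N^2) time.

    ```java
    // Compress N values to k segment means with minimum total squared error.
    public class DpSegmentCompress {
        public static void main(String[] args) {
            double[] x = {1.0, 1.2, 0.9, 5.0, 5.1, 4.8, 9.0, 9.2};
            int k = 3, n = x.length;

            // Prefix sums of x and x^2 give any segment's squared error in O(1):
            // err(i, j) = sum(x^2) - (sum x)^2 / (j - i) over x[i..j-1].
            double[] s = new double[n + 1], s2 = new double[n + 1];
            for (int i = 0; i < n; i++) {
                s[i + 1] = s[i] + x[i];
                s2[i + 1] = s2[i] + x[i] * x[i];
            }

            double[][] dp = new double[k + 1][n + 1]; // dp[seg][j]: best error for x[0..j-1]
            int[][] cut = new int[k + 1][n + 1];      // where the last segment starts
            for (int j = 1; j <= n; j++) dp[1][j] = err(s, s2, 0, j);
            for (int seg = 2; seg <= k; seg++)
                for (int j = seg; j <= n; j++) {
                    dp[seg][j] = Double.MAX_VALUE;
                    for (int i = seg - 1; i < j; i++) {
                        double c = dp[seg - 1][i] + err(s, s2, i, j);
                        if (c < dp[seg][j]) { dp[seg][j] = c; cut[seg][j] = i; }
                    }
                }

            // Walk the cuts back to recover boundaries; each segment's mean is
            // the compressed representation.
            int[] b = new int[k + 1];
            b[k] = n;
            for (int seg = k; seg >= 2; seg--) b[seg - 1] = cut[seg][b[seg]];
            for (int seg = 1; seg <= k; seg++)
                System.out.printf("x[%d..%d] -> mean %.3f%n", b[seg - 1], b[seg] - 1,
                    (s[b[seg]] - s[b[seg - 1]]) / (b[seg] - b[seg - 1]));
            System.out.printf("total squared error: %.4f%n", dp[k][n]);
        }

        static double err(double[] s, double[] s2, int i, int j) {
            double sum = s[j] - s[i];
            return (s2[j] - s2[i]) - sum * sum / (j - i);
        }
    }
    ```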

  17. Array data extractor (ADE): a LabVIEW program to extract and merge gene array data

    PubMed Central

    2013-01-01

    Background Large data sets from gene expression array studies are publicly available, offering information highly valuable for research across many disciplines, ranging from fundamental to clinical research. Highly advanced bioinformatics tools have been made available to researchers, but a demand persists for user-friendly software allowing researchers to quickly extract expression information for multiple genes from multiple studies. Findings Here, we present a user-friendly LabVIEW program to automatically extract gene expression data for a list of genes from multiple normalized microarray datasets. Functionality was tested for 288 class A G protein-coupled receptors (GPCRs) and expression data from 12 studies comparing normal and diseased human hearts. Results confirmed known regulation of a beta 1 adrenergic receptor and further indicated novel research targets. Conclusions Although existing software allows for complex data analyses, the LabVIEW-based program presented here, “Array Data Extractor (ADE)”, provides users with a tool to retrieve meaningful information from multiple normalized gene expression datasets in a fast and easy way. Further, the graphical programming language used in LabVIEW allows applying changes to the program without the need for advanced programming knowledge. PMID:24289243

  18. Online evaluation programs: benefits and limitations.

    PubMed

    Burhansstipanov, Linda; Clark, Richard E; Watanabe-Galloway, Shinobu; Petereit, Daniel G; Eschiti, Valerie; Krebs, Linda U; Pingatore, Noel L

    2012-04-01

    Patient navigation programs are increasing throughout the USA, yet some evaluation measures are too vague to determine what navigation does and how it functions. Through collaborative efforts, an online evaluation program was developed. The goal of this evaluation program is to make data entry accurate, simple, and efficient. This comprehensive program includes major components on staff, mentoring, committees, partnerships, grants/studies, products, dissemination, patient navigation, and reports. Pull-down menus, radio buttons, and check boxes are incorporated whenever possible. Although the program has limitations, the benefits of having access to current, up-to-date program data 24/7 are worth overcoming the challenges. Of major benefit is the ability of the staff to tailor summary reports to provide anonymous feedback in a timely manner to community partners and participants. The tailored data are useful for the partners to generate summaries for inclusion in new grant applications.

  19. KSC-2012-3033a

    NASA Image and Video Library

    2012-05-23

    CAPE CANAVERAL, Fla. – At the NASA Railroad Yard at NASA’s Kennedy Space Center in Florida, preparations are under way for the departure of a train made up of tank cars. The railroad’s track runs past Kennedy’s 525-foot-tall Vehicle Assembly Building in the background. The train is headed for the Florida East Coast Railway interchange in Titusville, Fla., where the train’s helium tank cars, a liquid oxygen tank car, and a liquid hydrogen dewar or tank car will be transferred for delivery to the SpaceX engine test complex outside McGregor, Texas. The railroad cars were needed in support of the Space Shuttle Program but currently are not in use by NASA following the completion of the program in 2011. Originally, the tankers belonged to the U.S. Bureau of Mines. At the peak of the shuttle program, there were approximately 30 cars in the fleet. About half the cars were returned to the bureau as launch activity diminished. Five tank cars are being loaned to SpaceX and repurposed to support their engine tests in Texas. Eight cars previously were shipped to California on loan to support the SpaceX Falcon 9 rocket launches from Space Launch Complex-4 on Vandenberg Air Force Base. SpaceX already has three helium tank cars previously used for the shuttle program at Space Launch Complex-40 on Cape Canaveral Air Force Station in Florida. For more information, visit http://www.nasa.gov/spacex. Photo credit: NASA/Jim Grossmann

  20. KSC-2012-3032a

    NASA Image and Video Library

    2012-05-23

    CAPE CANAVERAL, Fla. – At the NASA Railroad Yard at NASA’s Kennedy Space Center in Florida, preparations are under way for the departure of a train made up of tank cars. The train will pass by Kennedy’s 525-foot-tall Vehicle Assembly Building in the background. The train is headed for the Florida East Coast Railway interchange in Titusville, Fla., where the train’s helium tank cars, a liquid oxygen tank car, and a liquid hydrogen dewar or tank car will be transferred for delivery to the SpaceX engine test complex outside McGregor, Texas. The railroad cars were needed in support of the Space Shuttle Program but currently are not in use by NASA following the completion of the program in 2011. Originally, the tankers belonged to the U.S. Bureau of Mines. At the peak of the shuttle program, there were approximately 30 cars in the fleet. About half the cars were returned to the bureau as launch activity diminished. Five tank cars are being loaned to SpaceX and repurposed to support their engine tests in Texas. Eight cars previously were shipped to California on loan to support the SpaceX Falcon 9 rocket launches from Space Launch Complex-4 on Vandenberg Air Force Base. SpaceX already has three helium tank cars previously used for the shuttle program at Space Launch Complex-40 on Cape Canaveral Air Force Station in Florida. For more information, visit http://www.nasa.gov/spacex. Photo credit: NASA/Jim Grossmann

  1. 75 FR 63194 - Notice of Proposed Information Collection for Public Comment on the Follow-Up Survey and Data...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-14

    ... Information Collection for Public Comment on the Follow-Up Survey and Data Collection Guide for the Evaluation of the Rapid Re-Housing for Homeless Families Demonstration Program AGENCY: Office of Policy... (OMB) for review, as required by the Paperwork Reduction Act of 1995, Public Law 104-13 (44 U.S.C. 3506...

  2. Making the Numbers Add up: A Guide for Using Data in College Access and Success Programs. Results and Reflections. An Evaluation Report

    ERIC Educational Resources Information Center

    Lumina Foundation for Education, 2009

    2009-01-01

    The purpose of this guide is to help readers clarify their roles in the college access and success system and to identify how they might use data to create change for students. This guide shows how data can strengthen current programs and support broader changes that ease the path to college for students. This guide illuminates how a long-term…

  3. A computer program designed to produce tables from alphanumeric data

    USGS Publications Warehouse

    Ridgley, Jennie L.; Schnabel, Robert Wayne

    1978-01-01

    This program is designed to produce tables from alphanumeric data. Each line of data that appears in the table is entered into a data file as a single line of data. Where necessary, a predetermined delimiter is added to break the data into column data. The program can process the following types of data: (1) title, (2) headnote, (3) footnote, (4) two levels of column headers, (5) solid lines, (6) blank lines, (7) most types of numeric data, and (8) all types of alphanumeric data. In addition, the program can produce a series of continuation tables from large data sets. Fitting of all data to the final table format is performed by the program, although provisions have been made for user modification of the final format. The width of the table is adjustable but may not exceed 158 characters per line. The program is useful in that it permits alteration of the original data or table format without having to physically retype all or portions of the table. Final results may be obtained quickly using interactive terminals, and execution of the program requires only minimal knowledge of computer usage. Tables produced may be of publishable quality, especially when reduced. Complete user documentation and a program listing are included. NOTE: Although this program has been subjected to many tests, a warranty on accuracy or proper functioning is neither implied nor expressed.
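
    The core mechanism - splitting delimited alphanumeric lines into columns and fitting them to a uniform layout - can be sketched compactly. The example below is a hypothetical modern re-creation of that one step, not the USGS program itself; the delimiter and sample rows are invented.

    ```java
    import java.util.*;

    // Break delimited lines into columns and pad each column to a uniform
    // width so the rows align as a table.
    public class ColumnTable {
        public static void main(String[] args) {
            String[] lines = {
                "Sample|Depth m|Lithology",
                "A-1|12.5|sandstone",
                "A-2|40|shale, silty"
            };
            List<String[]> rows = new ArrayList<>();
            int[] width = null;
            for (String line : lines) {
                String[] cells = line.split("\\|");   // '|' is the chosen delimiter
                if (width == null) width = new int[cells.length];
                for (int i = 0; i < cells.length; i++)
                    width[i] = Math.max(width[i], cells[i].length());
                rows.add(cells);
            }
            for (String[] cells : rows) {
                StringBuilder out = new StringBuilder();
                for (int i = 0; i < cells.length; i++)
                    out.append(String.format("%-" + (width[i] + 2) + "s", cells[i]));
                System.out.println(out.toString().stripTrailing());
            }
        }
    }
    ```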

  4. Lapin Data Interchange Among Database, Analysis and Display Programs Using XML-Based Text Files

    NASA Technical Reports Server (NTRS)

    2004-01-01

    The purpose was to investigate and evaluate the interchange of application-specific data among multiple programs, each carrying out part of an analysis and design task. This has previously been done by creating a custom program to read data produced by one application and then write that data to a file whose format is specific to the second application needing all or part of that data. In this investigation, the data of interest are described using the XML markup language, which allows the data to be stored as a text string. Software to transform a task's output data into an XML string, and software to read an XML string and extract all or a portion of the data needed by another application, are used to link two independent applications together as part of an overall design effort. This approach was initially used with a standard analysis program, Lapin, along with standard applications - a spreadsheet program, a relational database program, and a conventional dialog and display program - to demonstrate the successful sharing of data among independent programs (see Engineering Analysis Using a Web-Based Protocol by J.D. Schoeffler and R.W. Claus, NASA TM-2002-211981, October 2002). Most of the effort beyond that demonstration has been concentrated on the inclusion of more complex display programs. Specifically, a custom-written windowing program organized around dialogs to control the interactions has been combined with an independent CAD program (Open Cascade) that supports sophisticated display of CAD elements such as lines, spline curves, and surfaces, and with turbine-blade data produced by an independent blade design program (UD0300).
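
    A minimal sketch of the interchange pattern described above, using the standard Java DOM parser; the element names and values are hypothetical stand-ins for blade-design data, not the actual Lapin or UD0300 formats.

    ```java
    import javax.xml.parsers.DocumentBuilderFactory;
    import org.w3c.dom.Document;
    import org.w3c.dom.Element;
    import java.io.ByteArrayInputStream;
    import java.nio.charset.StandardCharsets;

    public class XmlInterchange {
        public static void main(String[] args) throws Exception {
            // Producer side: one application emits its results as an XML string.
            String xml = "<bladeDesign>"
                       + "<chord units=\"m\">0.08</chord>"
                       + "<twist units=\"deg\">31.5</twist>"
                       + "</bladeDesign>";

            // Consumer side: another application parses the string and extracts
            // only the portion of the data it needs.
            Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                    .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
            Element chord = (Element) doc.getElementsByTagName("chord").item(0);
            System.out.println("chord = " + chord.getTextContent()
                    + " " + chord.getAttribute("units"));
        }
    }
    ```

    Because the data travel as self-describing text, neither application needs to know the other's internal file format; each reads only the elements it understands.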

  5. Scale-up of national antiretroviral therapy programs: progress and challenges in the Asia Pacific region.

    PubMed

    Srikantiah, Padmini; Ghidinelli, Massimo; Bachani, Damodar; Chasombat, Sanchai; Daoni, Esorom; Mustikawati, Dyah E; Nhan, Do T; Pathak, Laxmi R; San, Khin O; Vun, Mean C; Zhang, Fujie; Lo, Ying-Ru; Narain, Jai P

    2010-09-01

    There has been tremendous scale-up of antiretroviral therapy (ART) services in the Asia Pacific region, which is home to an estimated 4.7 million persons living with HIV/AIDS. We examined treatment scale-up, ART program practices, and clinical outcome data in the nine low-and-middle-income countries that share over 95% of the HIV burden in the region. Standardized indicators for ART scale-up and treatment outcomes were examined for Cambodia, China, India, Indonesia, Myanmar, Nepal, Papua New Guinea, Thailand, and Vietnam using data submitted by each country to the WHO/The Joint United Nations Programme on HIV/AIDS (UNAIDS)/UNICEF joint framework tool for monitoring the health sector response to HIV/AIDS. Data on ART program practices were abstracted from National HIV Treatment Guidelines for each country. At the end of 2009, over 700,000 HIV-infected persons were receiving ART in the nine focus countries. Treatment coverage varies widely in the region, ranging from 16 to 93%. All nine countries employ a public health approach to ART services and provide a standardized first-line nonnucleoside reverse transcriptase inhibitor-based regimen. Among patients initiated on first-line ART in these countries, 65-88% remain alive and on treatment 12 months later. Over 50% of mortality occurs in the first 6 months of therapy, and losses to follow-up range from 8 to 16% at 2 years. Impressive ART scale-up efforts in the region have resulted in significant improvements in survival among persons receiving therapy. Continued funding support and political commitment will be essential for further expansion of public sector ART services to those in need. To improve treatment outcomes, national programs should focus on earlier identification of persons requiring ART, decentralization of ART services, and the development of stronger healthcare systems to support the provision of a continuum of HIV care.

  6. Automated a complex computer aided design concept generated using macros programming

    NASA Astrophysics Data System (ADS)

    Rizal Ramly, Mohammad; Asrokin, Azharrudin; Abd Rahman, Safura; Zulkifly, Nurul Ain Md

    2013-12-01

    Changing a complex computer-aided design profile, such as car or aircraft surfaces, has always been difficult and challenging. The capabilities of CAD software such as AutoCAD and CATIA show that a simple CAD configuration can be modified easily, but this is not the case for complex design configurations. Design changes help users test and explore various configurations of a design concept before a model is produced. The purpose of this study is to look into macro programming as a parametric method for commercial aircraft design. Macro programming is a method in which the design is configured by recording a script of commands, editing the data values, and adding new command lines to create the elements of a parametric design. The steps and procedure for creating a macro program are discussed, along with some difficulties encountered during the process of creation and the advantages of its use. Generally, the advantages of macro programming as a parametric design method are that it allows flexibility for design exploration, increases the usability of the design solution, allows some elements to be properly contained by the model while restricting others, and provides real-time feedback on changes.

  7. The Caltech Concurrent Computation Program - Project description

    NASA Technical Reports Server (NTRS)

    Fox, G.; Otto, S.; Lyzenga, G.; Rogstad, D.

    1985-01-01

    The Caltech Concurrent Computation Program, which studies basic issues in computational science, is described. The research builds on initial work in which novel concurrent hardware, the necessary systems software to use it, and twenty significant scientific implementations running on the initial 32-, 64-, and 128-node hypercube machines were constructed. A major goal of the program is to extend this work into new disciplines and more complex algorithms, including general packages that decompose arbitrary problems in major application areas. New high-performance concurrent processors with up to 1024 nodes, over a gigabyte of memory, and multigigaflop performance are being constructed. The implementations cover a wide range of problems in areas such as high-energy physics and astrophysics, condensed matter, chemical reactions, plasma physics, applied mathematics, geophysics, simulation, CAD for VLSI, graphics, and image processing. The products of the research program include the concurrent algorithms, hardware, systems software, and complete program implementations.

  8. Material fatigue data obtained by card-programmed hydraulic loading system

    NASA Technical Reports Server (NTRS)

    Davis, W. T.

    1967-01-01

    Fatigue tests using load distributions from actual loading histories encountered in flight are programmed on punched electronic accounting machine cards. With this hydraulic loading system, airframe designers can apply up to 55 load levels to a test specimen.

  9. [The analysis of complex interventions in public health: the case of the prevention of sexually transmitted infections and blood-borne infections in Montreal].

    PubMed

    Bilodeau, Angèle; Beauchemin, Jean; Bourque, Denis; Galarneau, Marilène

    2013-02-11

    Based on a theory of intervention as a complex action system, this study analyzes collaboration among partners in Montréal's sexually transmitted and blood-borne infections (STBBI) prevention program to identify the main operational problems and possible scenarios for change to achieve better outcomes. A descriptive study was conducted using three data sources - public policies and programs, system management documents, and interviews with three types of partners. The results were validated with stakeholders. Five main operational problems affecting the capacity of the system to provide expected services were identified, as well as the strategies partners use to address them. Two scenarios for system change to increase its effectiveness in achieving program goals are discussed.

  10. Scientific Programming Using Java: A Remote Sensing Example

    NASA Technical Reports Server (NTRS)

    Prados, Don; Mohamed, Mohamed A.; Johnson, Michael; Cao, Changyong; Gasser, Jerry

    1999-01-01

    This paper presents the results of a project to port remote sensing code from the C programming language to Java. The advantages and disadvantages of using Java versus C as a scientific programming language in remote sensing applications are discussed. Remote sensing applications deal with voluminous data that require effective memory management, such as buffering operations, when processed. Some of these applications also implement complex, performance-intensive computational algorithms such as Fast Fourier Transform analysis. Factors considered include performance, precision, complexity, rapidity of development, ease of code reuse, ease of maintenance, memory management, and platform independence. Performance results are also presented for radiometric calibration code that uses Java for the graphical user interface and C for the domain model.
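
    As a concrete example of the kind of kernel whose Java-versus-C performance matters here, below is a minimal radiometric calibration sketch. The gain and offset values are hypothetical; the per-band linear model L = gain * DN + offset is the standard form of such calibrations.

    ```java
    // Convert raw sensor counts (digital numbers, DN) to physical radiance
    // with a per-band linear transform -- a tight loop over voluminous data
    // where memory management and raw throughput dominate.
    public class RadiometricCalibration {
        public static void main(String[] args) {
            int[] rawCounts = {112, 96, 240, 17};   // raw digital numbers
            double gain = 0.055158;                  // hypothetical band gain
            double offset = 1.2378;                  // hypothetical band offset

            double[] radiance = new double[rawCounts.length];
            for (int i = 0; i < rawCounts.length; i++)
                radiance[i] = gain * rawCounts[i] + offset;  // L = gain*DN + offset

            for (double v : radiance) System.out.printf("%.3f%n", v);
        }
    }
    ```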

  11. Systems Engineering Design Via Experimental Operation Research: Complex Organizational Metric for Programmatic Risk Environments (COMPRE)

    NASA Technical Reports Server (NTRS)

    Mog, Robert A.

    1999-01-01

    Unique and innovative graph theory, neural network, organizational modeling, and genetic algorithms are applied to the design and evolution of programmatic and organizational architectures. Graph theory representations of programs and organizations increase modeling capabilities and flexibility, while illuminating preferable programmatic/organizational design features. Treating programs and organizations as neural networks results in better system synthesis, and more robust data modeling. Organizational modeling using covariance structures enhances the determination of organizational risk factors. Genetic algorithms improve programmatic evolution characteristics, while shedding light on rulebase requirements for achieving specified technological readiness levels, given budget and schedule resources. This program of research improves the robustness and verifiability of systems synthesis tools, including the Complex Organizational Metric for Programmatic Risk Environments (COMPRE).

  12. Graduating into Start-up: Exploring the Transition

    ERIC Educational Resources Information Center

    Nabi, Ghulam; Holden, Rick; Walmsley, Andreas

    2009-01-01

    The main purpose of the exploratory research discussed in this paper was to generate insights into the complexity of the career-making processes involved in the transition from being a student to starting up a business. Using story-telling interviews, data were collected from fifteen graduates based in the Yorkshire region of the UK. Qualitative…

  13. From which soil metal fractions Fe, Mn, Zn and Cu are taken up by olive trees (Olea europaea L., cv. 'Chondrolia Chalkidikis') in organic groves?

    PubMed

    Chatzistathis, T; Papaioannou, A; Gasparatos, D; Molassiotis, A

    2017-12-01

    Organic farming has been proposed as an alternative agricultural system to help solve environmental problems, such as the sustainable management of soil micronutrients, without inputs of chemical fertilizers. The purposes of this study were: i) to assess Fe, Mn, Zn and Cu bioavailability through the determination of sequentially extracted chemical forms (fractions) and their correlation with foliar micronutrient concentrations in mature organic olive (cv. 'Chondrolia Chalkidikis') groves; ii) to determine the soil depth and the available forms (fractions) by which the 4 metals are taken up by olive trees. DTPA-extractable (from the soil layers 0-20, 20-40 and 40-60 cm) and foliar micronutrient concentrations were determined in two organic olive groves. Using the Tessier fractionation, five fractions were found for all the metals: exchangeable, bound to carbonates (acid-soluble), bound to Fe-Mn oxides (reducible), organic (oxidizable), and residual. Our results indicated that Fe was taken up by the olive trees as an organic complex, mainly from the soil layer 40-60 cm. Manganese was taken up from the exchangeable fraction (0-20 cm); zinc was taken up as an organic complex from the layers 0-20 and 40-60 cm, as well as in the exchangeable form from the upper 20 cm. Copper was taken up from the soil layers 0-20 and 40-60 cm as a soluble organic complex, and as an exchangeable ion from the upper 20 cm. Our data reveal the crucial role of organic matter in sustaining metal (Fe, Zn and Cu) uptake - as soluble complexes - by olive trees in mature organic groves grown on calcareous soils. These data should also constitute a thorough insight and useful tool towards successful nutrient and organic C management in organic olive groves, since no serious nutritional deficiencies were found. Copyright © 2017 Elsevier Ltd. All rights reserved.

  14. Multiple Intravenous Infusions Phase 2b: Laboratory Study

    PubMed Central

    Pinkney, Sonia; Fan, Mark; Chan, Katherine; Koczmara, Christine; Colvin, Christopher; Sasangohar, Farzan; Masino, Caterina; Easty, Anthony; Trbovich, Patricia

    2014-01-01

    Background Administering multiple intravenous (IV) infusions to a single patient via infusion pump occurs routinely in health care, but there has been little empirical research examining the risks associated with this practice or ways to mitigate those risks. Objectives To identify the risks associated with multiple IV infusions and assess the impact of interventions on nurses' ability to safely administer them. Data Sources and Review Methods Forty nurses completed infusion-related tasks in a simulated adult intensive care unit, with and without interventions (i.e., repeated-measures design). Results Errors were observed in completing common tasks associated with the administration of multiple IV infusions (all values from baseline, which was current practice), including: setting up and programming multiple primary continuous IV infusions (e.g., 11.7% programming errors); identifying IV infusions (e.g., 7.7% line-tracing errors); managing dead volume (e.g., 96.0% flush rate errors following IV syringe dose administration); setting up a secondary intermittent IV infusion (e.g., 11.3% secondary clamp errors); and administering an IV pump bolus (e.g., 11.5% programming errors). Of 10 interventions tested, 6 (1 practice, 3 technology, and 2 educational) significantly decreased or even eliminated errors compared to baseline. Limitations The simulation of an adult intensive care unit at 1 hospital limited the ability to generalize results. The study results were representative of nurses who received training in the interventions but had little experience using them. The longitudinal effects of the interventions were not studied. Conclusions Administering and managing multiple IV infusions is a complex and risk-prone activity. However, when a patient requires multiple IV infusions, targeted interventions can reduce identified risks. A combination of standardized practice, technology improvements, and targeted education is required. PMID:26316919

  15. Trans-National Scale-Up of Services in Global Health

    PubMed Central

    Shahin, Ilan; Sohal, Raman; Ginther, John; Hayden, Leigh; MacDonald, John A.; Mossman, Kathryn; Parikh, Himanshu; McGahan, Anita; Mitchell, Will; Bhattacharyya, Onil

    2014-01-01

    Background Scaling up innovative healthcare programs offers a means to improve access, quality, and health equity across multiple health areas. Despite large numbers of promising projects, little is known about successful efforts to scale up. This study examines trans-national scale, whereby a program operates in two or more countries. Trans-national scale is a distinct measure that reflects opportunities to replicate healthcare programs in multiple countries, thereby providing services to broader populations. Methods Based on the Center for Health Market Innovations (CHMI) database of nearly 1,200 health programs, the study contrasts 116 programs that have achieved trans-national scale with 1,068 single-country programs. Data were collected on the programs' health focus, service activity, legal status, and funding sources, as well as the programs' locations (rural v. urban emphasis) and founding year; differences are reported with statistical significance. Findings This analysis examines 116 programs that have achieved trans-national scale (TNS) across multiple disease areas and activity types. Compared to 1,068 single-country programs, we find that trans-nationally scaled programs are more donor-reliant; more likely to focus on targeted health needs such as HIV/AIDS, TB, malaria, or family planning rather than provide more comprehensive general care; and more likely to engage in activities that support healthcare services rather than provide direct clinical care. Conclusion This work, based on a large data set of health programs, reports on trans-national scale with comparison to single-country programs. The work is a step towards understanding when programs are able to replicate their services as they attempt to expand health services for the poor across countries and health areas. A subset of these programs should be the subject of case studies to understand factors that affect the scaling process, particularly seeking to identify mechanisms that lead to improved health outcomes. PMID:25375328
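
    The paper reports group differences with statistical significance; for categorical attributes such as donor reliance, a standard test is a chi-square on the 2x2 contingency table. A minimal sketch with invented counts (the CHMI database values are not reproduced):

      # Chi-square contrast between trans-nationally scaled and
      # single-country programs on one categorical attribute.
      # Counts are illustrative placeholders only.
      from scipy.stats import chi2_contingency

      #        donor-reliant  not donor-reliant
      table = [[80, 36],      # 116 trans-national programs
               [520, 548]]    # 1,068 single-country programs

      chi2, p, dof, expected = chi2_contingency(table)
      print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.4f}")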

  16. Aspect-Oriented Programming

    NASA Technical Reports Server (NTRS)

    Elrad, Tzilla (Editor); Filman, Robert E. (Editor); Bader, Atef (Editor)

    2001-01-01

    Computer science has experienced an evolution in programming languages and systems from the crude assembly and machine codes of the earliest computers through concepts such as formula translation, procedural programming, structured programming, functional programming, logic programming, and programming with abstract data types. Each of these steps in programming technology has advanced our ability to achieve clear separation of concerns at the source code level. Currently, the dominant programming paradigm is object-oriented programming - the idea that one builds a software system by decomposing a problem into objects and then writing the code of those objects. Such objects abstract together behavior and data into a single conceptual and physical entity. Object-orientation is reflected in the entire spectrum of current software development methodologies and tools - we have OO methodologies, analysis and design tools, and OO programming languages. Writing complex applications such as graphical user interfaces, operating systems, and distributed applications while maintaining comprehensible source code has been made possible with OOP. Success at developing simpler systems leads to aspirations for greater complexity. Object orientation is a clever idea, but has certain limitations. We are now seeing that many requirements do not decompose neatly into behavior centered on a single locus. Object technology has difficulty localizing concerns involving global constraints and pandemic behaviors, appropriately segregating concerns, and applying domain-specific knowledge. Post-object programming (POP) mechanisms that look to increase the expressiveness of the OO paradigm are a fertile arena for current research. Examples of POP technologies include domain-specific languages, generative programming, generic programming, constraint languages, reflection and metaprogramming, feature-oriented development, views/viewpoints, and asynchronous message brokering. (Czarnecki and Eisenecker's book includes a good survey of many of these technologies.)
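
    The canonical motivating example for AOP is a cross-cutting concern such as logging or tracing. The sketch below factors that concern out of the classes it cuts across using a plain Python decorator; it illustrates the idea of separating a pandemic behavior from the core objects, not any particular AOP language:

      # A tracing "aspect" woven around methods via a decorator, so the
      # cross-cutting concern lives in one place instead of every class.
      import functools
      import logging

      logging.basicConfig(level=logging.INFO)

      def traced(fn):
          @functools.wraps(fn)
          def wrapper(*args, **kwargs):
              logging.info("enter %s", fn.__qualname__)
              try:
                  return fn(*args, **kwargs)
              finally:
                  logging.info("exit %s", fn.__qualname__)
          return wrapper

      class Account:
          def __init__(self, balance):
              self.balance = balance

          @traced                      # the aspect is applied declaratively
          def withdraw(self, amount):
              self.balance -= amount
              return self.balance

      Account(100).withdraw(30)        # logs enter/exit around the call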

  17. Satellite Data Processing System (SDPS) users manual V1.0

    NASA Technical Reports Server (NTRS)

    Caruso, Michael; Dunn, Chris

    1989-01-01

    SDPS is a menu-driven interactive program designed to facilitate the display and output of image and line-based data sets common to telemetry, modeling and remote sensing. This program can be used to display up to four separate raster images and overlay line-based data such as coastlines, ship tracks and velocity vectors. The program uses multiple windows to communicate information with the user. At any given time, the program may have up to four image display windows as well as auxiliary windows containing information about each image displayed. SDPS is not a commercial program; it does not contain complete type checking or error diagnostics, so improper input may cause the program to crash. Known anomalies are mentioned in the appropriate sections as notes or cautions. SDPS was designed to be used on Sun Microsystems workstations running SunView1 (Sun Visual/Integrated Environment for Workstations). It was primarily designed to be used on workstations equipped with color monitors, but most of the line-based functions and several of the raster-based functions can be used with monochrome monitors. The program currently runs on Sun 3 series workstations running Sun OS 4.0 and should port easily to Sun 4 and Sun 386 series workstations with SunView1. Users should also be familiar with UNIX, Sun workstations and the SunView window system.

  18. Efficient and flexible memory architecture to alleviate data and context bandwidth bottlenecks of coarse-grained reconfigurable arrays

    NASA Astrophysics Data System (ADS)

    Yang, Chen; Liu, LeiBo; Yin, ShouYi; Wei, ShaoJun

    2014-12-01

    The computational capability of a coarse-grained reconfigurable array (CGRA) can be significantly restrained by data and context memory bandwidth bottlenecks. Traditionally, two methods have been used to resolve this problem. One method loads the context into the CGRA at run time. This method occupies very small on-chip memory but induces very large latency, which leads to low computational efficiency. The other method adopts a multi-context structure. This method loads the context into the on-chip context memory at the boot phase. Broadcasting the pointer of a set of contexts changes the hardware configuration on a cycle-by-cycle basis. The size of the context memory induces a large area overhead in multi-context structures, which results in major restrictions on application complexity. This paper proposes a Predictable Context Cache (PCC) architecture to address the above context issues by buffering the context inside the CGRA. In this architecture, context is dynamically transferred into the CGRA. Utilizing a PCC significantly reduces the on-chip context memory, and the complexity of the applications running on the CGRA is no longer restricted by its size. For the data bandwidth issue, data preloading is the most frequently used approach to hide input data latency and speed up data transmission. Rather than fundamentally reducing the amount of input data, it processes data transfer and computation in parallel. However, data preloading cannot work efficiently because data transmission becomes the critical path as the reconfigurable array scales up. This paper also presents a Hierarchical Data Memory (HDM) architecture as a solution to this efficiency problem. In this architecture, high internal bandwidth is provided to buffer both reused input data and intermediate data. HDM relieves the external memory of the data transfer burden, so performance is significantly improved. As a result of using PCC and HDM, experiments running mainstream video decoding programs achieved performance improvements of 13.57%-19.48% with a reasonable memory size; H.264 high-profile video decoding at 1080p@35.7fps can be achieved on the PCC and HDM architecture at a 200 MHz working frequency. Further, the size of the on-chip context memory no longer restricts complex applications, which execute efficiently on the PCC and HDM architecture.

  19. ALCF Data Science Program: Productive Data-centric Supercomputing

    NASA Astrophysics Data System (ADS)

    Romero, Nichols; Vishwanath, Venkatram

    The ALCF Data Science Program (ADSP) is targeted at big data science problems that require leadership computing resources. The goal of the program is to explore and improve a variety of computational methods that will enable data-driven discoveries across all scientific disciplines. The projects will focus on data science techniques covering a wide area of discovery including but not limited to uncertainty quantification, statistics, machine learning, deep learning, databases, pattern recognition, image processing, graph analytics, data mining, real-time data analysis, and complex and interactive workflows. Project teams will be among the first to access Theta, ALCF's forthcoming 8.5-petaflops Intel/Cray system. The program will transition to the 200 petaflop/s Aurora supercomputing system when it becomes available. In 2016, four projects were selected to kick off the ADSP. The selected projects span experimental and computational sciences and range from modeling the brain to discovering new materials for solar-powered windows to simulating collision events at the Large Hadron Collider (LHC). The program will have a regular call for proposals, with the next call expected in Spring 2017 (http://www.alcf.anl.gov/alcf-data-science-program). This research used resources of the ALCF, which is a DOE Office of Science User Facility supported under Contract DE-AC02-06CH11357.

  20. Modelling of the tunnelling effect in granulated metallic nanostructures

    NASA Astrophysics Data System (ADS)

    Istratov, A. V.; Kucherik, A. O.

    2018-01-01

    Obtaining thin films today is unthinkable without the use of mathematical modeling, numerical methods, and complex programs. The practical importance of these calculations is that they can be used to investigate the conductivity of nano-sized granular structures, which expands the diagnostic capabilities of thin films, opens up new perspectives in the creation of new devices based on thin-film technology, and allows their properties to be predicted.

  1. Prenatal attitudes and parity predict selection into a U.S. child health program: a short report.

    PubMed

    Martin-Anderson, Sarah

    2013-10-01

    Public policies are a determinant of child health disparities; sound evaluation of these programs is essential for good governance. It is impossible in most countries to randomize assignment into child health programs that directly offer benefits. In the absence of this, researchers face the threat of selection bias - the idea that there are innate, immeasurable differences between those who take up treatment and those who don't. In the field of program evaluation we are most concerned with the differences between the eligible people who take up a program and the eligible people who choose not to enroll. Using a case study of a large U.S. nutrition program, this report illustrates how the perceived benefits of participation may affect the decision to take up a program. In turn, this highlights sources of potential selection bias. Using data from a longitudinal study of mothers and infants conducted between May and December of 2005, I show that prenatal attitudes and beliefs toward breastfeeding determine enrollment in a U.S. nutrition program that offers free infant formula. I also find that the significance of the selection bias differs by parity. Analysis reveals that maternal attitudinal responses are more highly predictive of future behavior than standard demographic variables. In sum, this paper makes a case for rigorously understanding the factors that determine take-up of a program and how those factors can modify the results of a program evaluation. Copyright © 2013 Elsevier Ltd. All rights reserved.

  2. Learning directed acyclic graphs from large-scale genomics data.

    PubMed

    Nikolay, Fabio; Pesavento, Marius; Kritikos, George; Typas, Nassos

    2017-09-20

    In this paper, we consider the problem of learning the genetic interaction map, i.e., the topology of a directed acyclic graph (DAG) of genetic interactions, from noisy double-knockout (DK) data. Based on a set of well-established biological interaction models, we detect and classify the interactions between genes. We propose a novel linear integer optimization program called the Genetic-Interactions-Detector (GENIE) to identify the complex biological dependencies among genes and to compute the DAG topology that best matches the DK measurements. Furthermore, we extend the GENIE program by incorporating genetic interaction profile (GI-profile) data to further enhance the detection performance. In addition, we propose a sequential scalability technique for large sets of genes under study, in order to provide statistically significant results for real measurement data. Finally, we show via numerical simulations that the GENIE program and the GI-profile data extended GENIE (GI-GENIE) program clearly outperform conventional techniques, and we present real-data results for our proposed sequential scalability technique.
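
    GENIE itself is a linear integer program; as a much-simplified illustration of the underlying search problem, the sketch below enumerates every DAG on a toy three-gene set and keeps the topology that best matches noisy pairwise "upstream" scores. The gene names, evidence numbers, and scoring rule are all hypothetical stand-ins, not the GENIE formulation:

      # Brute-force search for the DAG best matching noisy pairwise
      # "gene i acts upstream of gene j" scores derived from DK data.
      from itertools import product

      genes = ["gA", "gB", "gC"]            # hypothetical gene names
      n = len(genes)
      edges = [(i, j) for i in range(n) for j in range(n) if i != j]

      # Noisy evidence for each ordered pair (made-up numbers in [0, 1])
      evidence = {(0, 1): 0.9, (1, 2): 0.8, (0, 2): 0.7,
                  (1, 0): 0.1, (2, 1): 0.2, (2, 0): 0.3}

      def is_acyclic(adj):
          state = [0] * n                   # 0=unseen, 1=on stack, 2=done
          def dfs(u):
              state[u] = 1
              for v in range(n):
                  if adj[u][v]:
                      if state[v] == 1 or (state[v] == 0 and not dfs(v)):
                          return False
              state[u] = 2
              return True
          return all(dfs(u) for u in range(n) if state[u] == 0)

      def score(adj):
          # Reward supported edges, penalize contradicted ones
          return sum(evidence[e] - 0.5 for e in edges if adj[e[0]][e[1]])

      best = None
      for bits in product([0, 1], repeat=len(edges)):
          adj = [[0] * n for _ in range(n)]
          for b, (i, j) in zip(bits, edges):
              adj[i][j] = b
          if is_acyclic(adj) and (best is None or score(adj) > score(best)):
              best = adj

      print([(genes[i], genes[j]) for (i, j) in edges if best[i][j]])

    An integer-programming solver replaces this exhaustive loop as soon as the gene set grows, which is exactly the scalability problem the paper's sequential technique addresses.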

  3. A Synthetic Biology Framework for Programming Eukaryotic Transcription Functions

    PubMed Central

    Khalil, Ahmad S.; Lu, Timothy K.; Bashor, Caleb J.; Ramirez, Cherie L.; Pyenson, Nora C.; Joung, J. Keith; Collins, James J.

    2013-01-01

    SUMMARY Eukaryotic transcription factors (TFs) perform complex and combinatorial functions within transcriptional networks. Here, we present a synthetic framework for systematically constructing eukaryotic transcription functions using artificial zinc fingers, modular DNA-binding domains found within many eukaryotic TFs. Utilizing this platform, we construct a library of orthogonal synthetic transcription factors (sTFs) and use these to wire synthetic transcriptional circuits in yeast. We engineer complex functions, such as tunable output strength and transcriptional cooperativity, by rationally adjusting a decomposed set of key component properties, e.g., DNA specificity, affinity, promoter design, protein-protein interactions. We show that subtle perturbations to these properties can transform an individual sTF between distinct roles (activator, cooperative factor, inhibitory factor) within a transcriptional complex, thus drastically altering the signal processing behavior of multi-input systems. This platform provides new genetic components for synthetic biology and enables bottom-up approaches to understanding the design principles of eukaryotic transcriptional complexes and networks. PMID:22863014

  4. Sorting on STAR. [CDC computer algorithm timing comparison

    NASA Technical Reports Server (NTRS)

    Stone, H. S.

    1978-01-01

    Timing comparisons are given for three sorting algorithms written for the CDC STAR computer. One algorithm is Hoare's (1962) Quicksort, which is the fastest or nearly the fastest sorting algorithm for most computers. A second algorithm is a vector version of Quicksort that takes advantage of the STAR's vector operations. The third algorithm is an adaptation of Batcher's (1968) sorting algorithm, which makes especially good use of vector operations but has a complexity of N(log N)² as compared with N log N for the Quicksort algorithms. In spite of its worse complexity, Batcher's sorting algorithm is competitive with the serial version of Quicksort for vectors up to the largest that can be treated by STAR. Vector Quicksort outperforms the other two algorithms and is generally preferred. These results indicate that unusual instruction sets can introduce biases in program execution time that counter results predicted by worst-case asymptotic complexity analysis.
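
    For readers unfamiliar with Batcher's method, a compact sketch of his odd-even merge sort follows, in Python rather than STAR vector code. Its fixed, data-independent pattern of compare-exchange operations is what maps so well onto vector hardware, at the cost of the N(log N)² comparisons noted above; this simple version assumes the input length is a power of two:

      # Batcher's odd-even merge sort as a data-independent network of
      # compare-exchange operations (input length must be a power of two).
      def compare_exchange(a, i, j):
          if a[i] > a[j]:
              a[i], a[j] = a[j], a[i]

      def oddeven_merge(a, lo, n, r):
          # Merge the two sorted halves of a[lo:lo+n] using stride r
          m = r * 2
          if m < n:
              oddeven_merge(a, lo, n, m)        # even subsequence
              oddeven_merge(a, lo + r, n, m)    # odd subsequence
              for i in range(lo + r, lo + n - r, m):
                  compare_exchange(a, i, i + r)
          else:
              compare_exchange(a, lo, lo + r)

      def oddeven_merge_sort(a, lo=0, n=None):
          if n is None:
              n = len(a)
          if n > 1:
              m = n // 2
              oddeven_merge_sort(a, lo, m)
              oddeven_merge_sort(a, lo + m, m)
              oddeven_merge(a, lo, n, 1)

      data = [7, 3, 6, 1, 8, 2, 5, 4]
      oddeven_merge_sort(data)
      print(data)   # [1, 2, 3, 4, 5, 6, 7, 8]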

  5. Integration of Nutrient and Activity Analysis Software into a Worksite Weight Management Program.

    ERIC Educational Resources Information Center

    Dennison, Darwin; And Others

    1990-01-01

    A weight management program utilized the participant's own data for the participant to (1) understand energy balance; (2) compare his/her diet with U.S. dietary codes; (3) know which food selections were high in calories, fat, and cholesterol, and low in complex carbohydrates and fiber; and (4) understand weight management. (JD)

  6. Freeing Space for NASA: Incorporating a Lossless Compression Algorithm into NASA's FOSS System

    NASA Technical Reports Server (NTRS)

    Fiechtner, Kaitlyn; Parker, Allen

    2011-01-01

    NASA's Fiber Optic Strain Sensing (FOSS) system can gather and store up to 1,536,000 bytes (1.46 megabytes) per second. Since the FOSS system typically acquires hours - or even days - of data, the system can gather hundreds of gigabytes of data for a given test event. To store such large quantities of data more effectively, NASA is modifying a Lempel-Ziv-Oberhumer (LZO) lossless data compression program to compress data as it is being acquired in real time. After proving that the algorithm is capable of compressing the data from the FOSS system, the LZO program will be modified and incorporated into the FOSS system. Implementing an LZO compression algorithm will instantly free up memory space without compromising any data obtained. With the availability of memory space, the FOSS system can be used more efficiently on test specimens, such as Unmanned Aerial Vehicles (UAVs) that can be in flight for days. By integrating the compression algorithm, the FOSS system can continue gathering data, even on longer flights.
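
    The broad pattern - compress each acquisition block as it streams in, so buffer memory is freed without losing data - fits in a few lines. Python's standard library has no LZO binding, so zlib stands in for the LZO codec here; the chunk size and the synthetic "sensor" records are invented:

      # Compress sensor data in fixed-size chunks as it is "acquired",
      # mimicking real-time lossless compression in an acquisition loop.
      # zlib stands in for LZO, which needs a third-party Python binding.
      import zlib

      CHUNK = 64 * 1024                      # bytes per acquisition block

      def acquire_chunks(n_chunks):
          # Placeholder for the FOSS data stream: repetitive strain records
          for i in range(n_chunks):
              yield (b"strain,%06d;" % i) * (CHUNK // 14)

      raw = compressed = 0
      for chunk in acquire_chunks(100):
          blob = zlib.compress(chunk, 1)     # level 1: fast, still lossless
          raw += len(chunk)
          compressed += len(blob)
          # 'blob' would be written to storage here instead of the raw chunk

      print(f"compression ratio: {raw / compressed:.1f}x")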

  7. Hydrogen-oxygen driven Zero Emissions bus drives around KSC Visitor Complex

    NASA Technical Reports Server (NTRS)

    1999-01-01

    The Zero Emissions (ZE) transit bus passes a mock-up orbiter named Explorer on a trek through the KSC Visitor Complex. Provided by dbb fuel cell engines inc. of Vancouver, Canada, the ZE bus was brought to KSC as part of the Center's Alternative Fuel Initiatives Program. The bus uses a Proton Exchange Membrane fuel cell in which hydrogen and oxygen, from atmospheric air, react to produce electricity that powers an electric motor drive system. The by-product 'exhaust' from the fuel cell is water vapor, thus zero harmful emissions. A typical diesel-powered bus emits more than a ton of harmful pollutants from its exhaust every year. The ZE bus is being used on tour routes at the KSC Visitor Complex for two days to introduce the public to the concept.

  8. Virtual reality simulator training of laparoscopic cholecystectomies - a systematic review.

    PubMed

    Ikonen, T S; Antikainen, T; Silvennoinen, M; Isojärvi, J; Mäkinen, E; Scheinin, T M

    2012-01-01

    Simulators are widely used in occupations where practice in authentic environments would involve high human or economic risks. Surgical procedures can be simulated by increasingly complex and expensive techniques. This review gives an update on computer-based virtual reality (VR) simulators in training for laparoscopic cholecystectomies. From leading databases (Medline, Cochrane, Embase), randomised or controlled trials and the latest systematic reviews were systematically searched and reviewed. Twelve randomised trials involving simulators were identified and analysed, as well as four controlled studies. Furthermore, seven studies comparing black boxes and simulators were included. The results indicated any kind of simulator training (black box, VR) to be beneficial at the novice level. After VR training, novice surgeons seemed to be able to perform their first live cholecystectomies with fewer errors, and in one trial the positive effect remained during the first ten cholecystectomies. No clinical follow-up data were found. Optimal learning requires skills training to be conducted as part of a systematic training program. No data on the cost-benefit of simulators were found; the price of a VR simulator begins at EUR 60 000. Theoretical background to learning and limited research data support the use of simulators in the early phases of surgical training. The cost of buying and using simulators is justified if the risk of injuries and complications to patients can be reduced. Developing surgical skills requires repeated training. In order to achieve optimal learning, a validated training program is needed.

  9. COSP for Windows: Strategies for Rapid Analyses of Cyclic Oxidation Behavior

    NASA Technical Reports Server (NTRS)

    Smialek, James L.; Auping, Judith V.

    2002-01-01

    COSP is a publicly available computer program that models the cyclic oxidation weight gain and spallation process. Inputs to the model include the selection of an oxidation growth law and a spalling geometry, plus oxide phase, growth rate, spall constant, and cycle duration parameters. Output includes weight change, the amounts of retained and spalled oxide, the total oxygen and metal consumed, and the terminal rates of weight loss and metal consumption. The present version is Windows-based and can accordingly be operated conveniently while other applications remain open for importing experimental weight change data, storing model output data, or plotting model curves. Point-and-click operating features include multiple drop-down menus for input parameters, data importing, and quick, on-screen plots showing one selection of the six output parameters for up to 10 models. A run summary text lists various characteristic parameters that are helpful in describing cyclic behavior, such as the maximum weight change, the number of cycles to reach the maximum weight gain or zero weight change, the ratio of these, and the final rate of weight loss. The program includes save and print options as well as a help file. Families of model curves readily show the sensitivity to various input parameters. The cyclic behaviors of nickel aluminide (NiAl) and a complex superalloy are shown to be properly fitted by model curves. However, caution is always advised regarding the uniqueness claimed for any specific set of input parameters.
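
    The model's core loop is small enough to sketch: parabolic scale growth during each hot cycle alternates with spallation of a fraction of the retained oxide on cool-down. The constants below are arbitrary illustrations, and the real COSP offers several growth laws and spall geometries:

      # Toy cyclic-oxidation loop in the spirit of COSP: parabolic growth
      # each hot cycle, then a fixed fraction of the oxide spalls.
      # kp, q, dt and STOICH are arbitrary, not fitted values.
      import math

      kp = 0.5       # parabolic rate constant, (mg/cm^2)^2 per hour
      q = 0.02       # fraction of retained oxide spalled per cycle
      dt = 1.0       # hot time per cycle, hours
      STOICH = 0.28  # weight fraction of oxygen in the oxide

      retained = 0.0    # oxide currently on the specimen, mg/cm^2
      spalled = 0.0     # cumulative spalled oxide, mg/cm^2

      for cycle in range(1, 501):
          grown = math.sqrt(retained**2 + kp * dt)  # parabolic growth law
          loss = q * grown                          # spall on cool-down
          retained = grown - loss
          spalled += loss
          # specimen weight change = oxygen picked up minus oxide lost
          weight_change = STOICH * (retained + spalled) - spalled
          if cycle % 100 == 0:
              print(f"cycle {cycle:4d}: dW = {weight_change:+.3f} mg/cm^2")

    The characteristic rise-then-fall of the weight-change curve, including the cycle counts to peak weight and to zero crossing, drops out of exactly this kind of loop.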

  10. AN ATTEMPT TO FIND AN A PRIORI MEASURE OF STEP SIZE. COMPARATIVE STUDIES OF PRINCIPLES FOR PROGRAMMING MATHEMATICS IN AUTOMATED INSTRUCTION, TECHNICAL REPORT NO. 13.

    ERIC Educational Resources Information Center

    ROSEN, ELLEN F.; STOLUROW, LAWRENCE M.

    In order to find a good predictor of empirical difficulty, an operational definition of step size, ten programmer-judges rated change in complexity in two versions of a mathematics program, and these ratings were then compared with measures of empirical difficulty obtained from student response data. The two versions, a 54-frame booklet and a 35…

  11. Youth-Initiated HIV Risk and Substance Use Prevention Program.

    ERIC Educational Resources Information Center

    Goggin, K.; Metcalf, K.; Wise, D.; Kennedy, S.; Murray, T.; Burgess, D.; Reese-Smith, J.; Terhune, N.; Broadus, K.; Downes, A.; Buckendahl, H.

    This study evaluates the first year of a novel HIV and substance use prevention program for inner city youth (Offering New Youth eXperiences--ONYX). Baseline and follow-up measures of knowledge, attitudes, and risk behaviors were administered seven months apart to 441 youth participating in the ONYX program. Youth (n=71) who provided data at both…

  12. Curriculum Development and Evaluation: Research and Development Program on Preschool Disadvantaged Children. Final Report. (Volume II of III Volumes).

    ERIC Educational Resources Information Center

    Bereiter, Carl; And Others

    Seven studies were undertaken to further extend the development and testing of an academically-oriented preschool program for disadvantaged children. The studies investigated (1) Curricula Development and Testing in Bereiter-Engelmann Program, (2) Dual Kindergarten, (3) Follow-Up Data on the Achievement of Disadvantaged Children Who Participated…

  13. Impactful Student Learning Outcomes of One-to-One Student Laptop Programs in Low Socioeconomic Schools

    ERIC Educational Resources Information Center

    Harris, Matthew Joseph

    2010-01-01

    At present, a majority of one-to-one student laptop programs exist in schools that serve affluent communities, which denies low socioeconomic students the learning benefits of ubiquitous access to technology. Using a "Studying Up-Studying Down" paradigm, this multi-site case study collected mixed method data from program participants at five…

  14. Software Reviews: Programs Worth a Second Look.

    ERIC Educational Resources Information Center

    Classroom Computer Learning, 1989

    1989-01-01

    Reviews three software programs: (1) "Microsoft Works 2.0": word processing, data processing, and telecommunications, grades 7 and up; (2) "AppleWorks GS": word processor, database, spreadsheet, graphics, and telecommunications, grades 3-12, Apple IIGS; (3) "Choices, Choices: On the Playground, Taking Responsibility":…

  15. Survey of Airport Access Analysis Techniques - Models, Data and a Research Program

    DOT National Transportation Integrated Search

    1972-06-01

    The report points up the differences and similarities between airport access travel and general urban trip making. Models and surveys developed for, or applicable, to airport access planning are reviewed. A research program is proposed which would ge...

  16. A practical tool for monitoring the performance of measuring systems in a laboratory network: report of an ACB Working Group.

    PubMed

    Ayling, Pete; Hill, Robert; Jassam, Nuthar; Kallner, Anders; Khatami, Zahra

    2017-11-01

    Background The introduction of robotics and high-capacity analysers has led, as a logical consequence, to a consolidation of laboratories into larger units. This requires new structures and quality systems to ensure that laboratories deliver consistent and comparable results. Methods A spreadsheet program was designed to accommodate results from up to 12 different instruments/laboratories and present IQC data, i.e. Levey-Jennings and Youden plots and comprehensive numerical tables of the performance of each item. Input of data was made possible by a 'data loader' by which IQC data from the individual instruments could be transferred to the spreadsheet program online. Results A set of real data from laboratories is used to populate the data loader and the networking software program. Examples are presented from the analysis of variance components and the Levey-Jennings and Youden plots. Conclusions This report presents a software package that allows the simultaneous management and detailed monitoring of the performance of up to 12 different instruments/laboratories in a fully interactive mode. The system allows a quality manager of networked laboratories to have a continuously updated overview of the performance. This software package has been made available at the ACB website.
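
    A Levey-Jennings chart is simply the IQC results plotted in run order against the mean and ±1, 2, 3 SD control limits. A minimal matplotlib sketch with fabricated QC values (this only illustrates the plot, not the ACB spreadsheet itself):

      # Minimal Levey-Jennings plot: QC results in run order with mean
      # and +/-1, 2, 3 SD control limits. Values are fabricated.
      import matplotlib.pyplot as plt
      import numpy as np

      qc = np.array([5.1, 5.0, 5.3, 4.9, 5.2, 5.6, 5.0, 4.8, 5.1, 5.4])
      mean, sd = qc.mean(), qc.std(ddof=1)

      plt.plot(qc, "o-", label="QC result")
      for k in (1, 2, 3):
          for sign in (+1, -1):
              plt.axhline(mean + sign * k * sd, ls="--", lw=0.7, color="gray")
      plt.axhline(mean, color="black", label="mean")
      plt.xlabel("Run number")
      plt.ylabel("Measured value")
      plt.title("Levey-Jennings chart, one instrument")
      plt.legend()
      plt.show()

    A Youden plot is the companion view: results for two QC levels (or two instruments) scattered against each other, which separates systematic from random error.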

  17. Software for Preprocessing Data from Rocket-Engine Tests

    NASA Technical Reports Server (NTRS)

    Cheng, Chiu-Fu

    2004-01-01

    Three computer programs have been written to preprocess digitized outputs of sensors during rocket-engine tests at Stennis Space Center (SSC). The programs apply exclusively to the SSC E test-stand complex and utilize the SSC file format. The programs are the following: Engineering Units Generator (EUGEN) converts sensor-output-measurement data to engineering units. The inputs to EUGEN are raw binary test-data files, which include the voltage data, a list identifying the data channels, and time codes. EUGEN effects conversion by use of a file that contains calibration coefficients for each channel. QUICKLOOK enables immediate viewing of a few selected channels of data, in contradistinction to viewing only after post-test processing (which can take 30 minutes to several hours depending on the number of channels and other test parameters) of data from all channels. QUICKLOOK converts the selected data into a form in which they can be plotted in engineering units by use of Winplot (a free graphing program written by Rick Paris). EUPLOT provides a quick means for looking at data files generated by EUGEN without the necessity of relying on the PV-WAVE based plotting software.
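
    Conceptually, the EUGEN step reduces to evaluating a per-channel calibration polynomial over the raw voltages. A minimal sketch with hypothetical channel names and coefficients (the SSC file format and real calibrations are not reproduced):

      # Convert raw sensor voltages to engineering units by applying each
      # channel's calibration polynomial, as EUGEN does conceptually.
      import numpy as np

      # Coefficients in numpy.polyval order: highest power first
      calibration = {
          "PT-101": [0.0, 250.0, -3.2],  # pressure: 250*v - 3.2
          "TC-014": [1.5, 48.0, 2.0],    # temperature: 1.5*v^2 + 48*v + 2
      }

      raw = {"PT-101": np.array([0.10, 0.55, 1.02]),
             "TC-014": np.array([0.20, 0.31, 0.45])}

      engineering = {ch: np.polyval(calibration[ch], volts)
                     for ch, volts in raw.items()}
      print(engineering["PT-101"])   # engineering units instead of volts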

  18. When Remedial Means What It Says: How Teachers Use Data to Reform Instructional Interventions

    ERIC Educational Resources Information Center

    Tedford, Jennifer

    2009-01-01

    As technology becomes increasingly integrated into K-12 education, the use of data is growing in volume and complexity, resulting in a paradox of information overload for educators. While administrators and teachers have access to more data than ever before, they are only just beginning to understand the impact of data on program improvement. In a…

  19. Phenotypic correlations between ovum pick-up in vitro production traits and pregnancy rates in Zebu cows.

    PubMed

    Vega, W H O; Quirino, C R; Serapião, R V; Oliveira, C S; Pacheco, A

    2015-07-03

    The growth of the Gyr breed in Brazil in terms of genetic gain for milk, along with market conditions, has led to the use of ovum pick-up in vitro production (OPU-IVP) as a leading biotechnology for the multiplication of genetic material. The aim of this study was to investigate phenotypic correlations between OPU-IVP-linked characteristics and pregnancy rates registered in an embryo transfer program using Gyr cows as oocyte donors. Data collected from 211 OPU sessions and 298 embryo transfers during the years 2012 and 2013 were analyzed statistically. Estimates of simple Pearson correlations were calculated for NVcoc and PVcoc (number and proportion of viable cumulus-oocyte complexes, respectively); NcleavD4 and PcleavD4 (number and proportion of cleaved embryos on day 4 of culture, respectively); NTembD7 and PTembD7 (number and proportion of transferable embryos on day 7 of culture, respectively); NPrD30 and PPrD30 (number and proportion of pregnancies 30 days after transfer, respectively); and NPrD60 and PPrD60 (number and proportion of pregnancies 60 days after transfer, respectively). Moderate to moderately high correlations were found for all numerical characteristics, suggesting these as the most suitable parameters for selection of oocyte donors in Gyr programs. NVcoc is proposed as a selection trait due to positive correlations with percentage traits and pregnancy rates 30 and 60 days after transfer.

  20. Automated, per pixel Cloud Detection from High-Resolution VNIR Data

    NASA Technical Reports Server (NTRS)

    Varlyguin, Dmitry L.

    2007-01-01

    CASA is a fully automated software program for the per-pixel detection of clouds and cloud shadows from medium- (e.g., Landsat, SPOT, AWiFS) and high- (e.g., IKONOS, QuickBird, OrbView) resolution imagery without the use of thermal data. CASA is an object-based feature extraction program which utilizes a complex combination of spectral, spatial, and contextual information available in the imagery and the hierarchical self-learning logic for accurate detection of clouds and their shadows.

  1. Design and implementation of a telecommunication interface for the TAATM/TCV real-time experiment

    NASA Technical Reports Server (NTRS)

    Nolan, J. D.

    1981-01-01

    The traffic situation display experiment of the terminal configured vehicle (TCV) research program requires a bidirectional data communications tie line between the CYBER 175 computer complex and the display-generation facility. The tie line is used in a real-time environment on the CYBER 175 computer by the terminal area air traffic model (TAATM) simulation program. Aircraft position data are processed by TAATM, with the resultant output sent to the facility for the generation of air traffic situation displays, which are transmitted to a research aircraft.

  2. From Student Follow-Up Responses to a Statewide Supply/Demand Analysis of Educational Programs.

    ERIC Educational Resources Information Center

    Hall, Toni

    The Texas Student Follow-up Information System (Tex-SIS) for comprehensive postsecondary follow-up and the supply/demand analysis work of the Texas 1202 Commission, Office of Postsecondary Education Planning, together may provide a valuable prototype for other states and perhaps even for a national system of data collection and analysis. Tex-SIS…

  3. Strategies for concurrent processing of complex algorithms in data driven architectures

    NASA Technical Reports Server (NTRS)

    Stoughton, John W.; Mielke, Roland R.

    1988-01-01

    The purpose is to document research to develop strategies for concurrent processing of complex algorithms in data driven architectures. The problem domain consists of decision-free algorithms having large-grained, computationally complex primitive operations. Such algorithms are often found in signal processing and control applications. The anticipated multiprocessor environment is a data flow architecture containing between two and twenty computing elements. Each computing element is a processor having local program memory, and which communicates with a common global data memory. A new graph theoretic model called ATAMM, which establishes rules for relating a decomposed algorithm to its execution in a data flow architecture, is presented. The ATAMM model is used to determine strategies to achieve optimum time performance and to develop a system diagnostic software tool. In addition, preliminary work on a new multiprocessor operating system based on the ATAMM specifications is described.
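
    As a rough illustration of the execution rules such a model governs (this is not ATAMM itself), the sketch below fires the nodes of a decomposed algorithm graph as their inputs become available, across a small fixed pool of computing elements; the graph and operation times are invented:

      # Greedy simulation of a decision-free algorithm graph on a few
      # computing elements: a node fires when all of its inputs are ready.
      import heapq

      duration = {"A": 3, "B": 2, "C": 4, "D": 1}   # primitive op times
      deps = {"A": [], "B": [], "C": ["A", "B"], "D": ["C"]}
      N_PROC = 2                                    # computing elements

      done_at, running, clock = {}, [], 0.0
      ready = [n for n, d in deps.items() if not d]
      while ready or running:
          while ready and len(running) < N_PROC:    # claim free processors
              node = ready.pop(0)
              heapq.heappush(running, (clock + duration[node], node))
          clock, node = heapq.heappop(running)      # next completion event
          done_at[node] = clock
          busy = {x for _, x in running}
          for m, d in deps.items():                 # newly enabled nodes
              if m not in done_at and m not in ready and m not in busy \
                 and all(x in done_at for x in d):
                  ready.append(m)

      print(done_at)   # completion times; the makespan is the max value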

  4. On safari to Random Jungle: a fast implementation of Random Forests for high-dimensional data

    PubMed Central

    Schwarz, Daniel F.; König, Inke R.; Ziegler, Andreas

    2010-01-01

    Motivation: Genome-wide association (GWA) studies have proven to be a successful approach for helping unravel the genetic basis of complex genetic diseases. However, the identified associations are not well suited for disease prediction, and only a modest portion of the heritability can be explained for most diseases, such as Type 2 diabetes or Crohn's disease. This may partly be due to the low power of standard statistical approaches to detect gene–gene and gene–environment interactions when small marginal effects are present. A promising alternative is Random Forests, which have already been successfully applied in candidate gene analyses. Important single nucleotide polymorphisms are detected by permutation importance measures. Until now, the application to GWA data has been highly cumbersome with existing implementations because of the high computational burden. Results: Here, we present the new freely available software package Random Jungle (RJ), which facilitates the rapid analysis of GWA data. The program yields valid results and computes up to 159 times faster than the fastest alternative implementation, while still maintaining all options of other programs. Specifically, it offers the different permutation importance measures available. It includes new options such as the backward elimination method. We illustrate the application of RJ to a GWA of Crohn's disease. The most important single nucleotide polymorphisms (SNPs) validate recent findings in the literature and reveal potential interactions. Availability: The RJ software package is freely available at http://www.randomjungle.org Contact: inke.koenig@imbs.uni-luebeck.de; ziegler@imbs.uni-luebeck.de Supplementary information: Supplementary data are available at Bioinformatics online. PMID:20505004
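
    The workflow RJ accelerated - fit a forest, rank SNPs by permutation importance - can be illustrated with present-day scikit-learn on a synthetic genotype matrix (this is not the RJ code path, and the data are simulated):

      # Random Forest with permutation importance on a synthetic
      # 0/1/2 genotype matrix; illustrates the workflow only.
      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.inspection import permutation_importance

      rng = np.random.default_rng(0)
      X = rng.integers(0, 3, size=(500, 50)).astype(float)  # 50 "SNPs"
      # Phenotype driven by SNPs 3 and 7 plus noise
      y = ((X[:, 3] + X[:, 7] + rng.normal(0, 1, 500)) > 2).astype(int)

      forest = RandomForestClassifier(n_estimators=200, random_state=0)
      forest.fit(X, y)

      imp = permutation_importance(forest, X, y, n_repeats=10,
                                   random_state=0)
      top = np.argsort(imp.importances_mean)[::-1][:5]
      print("top SNP indices:", top)   # should surface 3 and 7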

  5. What is the strength of evidence for heart failure disease-management programs?

    PubMed

    Clark, Alexander M; Savard, Lori A; Thompson, David R

    2009-07-28

    Heart failure (HF) disease-management programs are increasingly common. However, some large and recent trials of programs have not reported positive findings. There have also been parallel recent advances in reporting standards and theory around complex nonpharmacological interventions. These developments compel reconsideration in this Viewpoint of how research into HF-management programs should be evaluated, the quality, specificity, and usefulness of this evidence, and the recommendations for future research. Addressing the main determinants of intervention effectiveness by using the PICO (Patient, Intervention, Comparison, and Outcome) approach and the recent CONSORT (Consolidated Standards of Reporting Trials) statement on nonpharmacological trials, we will argue that in both current trials and meta-analyses, interventions and comparisons are not sufficiently well described; that complex programs have been excessively oversimplified; and that potentially salient differences in programs, populations, and settings are not incorporated into analyses. In preference to more general meta-analyses of programs, adequate descriptions are first needed of populations, interventions, comparisons, and outcomes in past and future trials. This could be achieved via a systematic survey of study authors based on the CONSORT statement. These more detailed data on studies should be incorporated into future meta-analyses of comparable trials and used with other techniques such as patient-based outcomes data and meta-regression. Although trials and meta-analyses continue to have potential to generate useful evidence, a more specific evidence base is needed to support the development of effective programs for different populations and settings.

  6. An Embedded Reconfigurable Logic Module

    NASA Technical Reports Server (NTRS)

    Tucker, Jerry H.; Klenke, Robert H.; Shams, Qamar A. (Technical Monitor)

    2002-01-01

    A Miniature Embedded Reconfigurable Computer and Logic (MERCAL) module has been developed and verified. MERCAL was designed to be a general-purpose, universal module that can provide significant hardware and software resources to meet the requirements of many of today's complex embedded applications. This is accomplished in the MERCAL module by combining a sub-credit-card-size PC in a DIMM form factor with a XILINX Spartan II FPGA. The PC has the ability to download program files to the FPGA to configure it for different hardware functions and to transfer data to and from the FPGA via the PC's ISA bus during run time. The MERCAL module combines, in a compact package, the computational power of a 133 MHz PC with up to 150,000 gate equivalents of digital logic that can be reconfigured by software. The general architecture and functionality of the MERCAL hardware and system software are described.

  7. Development of monitoring and diagnostic methods for robots used in remediation of waste sites. 1997 annual progress report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tecza, J.

    1998-06-01

    Safe and efficient clean up of hazardous and radioactive waste sites throughout the DOE complex will require extensive use of robots. This research effort focuses on developing Monitoring and Diagnostic (M and D) methods for robots that will provide early detection, isolation, and tracking of impending faults before they result in serious failure. The utility and effectiveness of applying M and D methods to hydraulic robots has never been proven. The present research program is utilizing seeded faults in a laboratory test rig that is representative of an existing hydraulically-powered remediation robot. This report summarizes activity conducted in the first 9 months of the project. The research team has analyzed the Rosie Mobile Worksystem as a representative hydraulic robot, developed a test rig for implanted fault testing, developed a test plan and agenda, and established methods for acquiring and analyzing the test data.

  8. High-throughput mouse genotyping using robotics automation.

    PubMed

    Linask, Kaari L; Lo, Cecilia W

    2005-02-01

    The use of mouse models is rapidly expanding in biomedical research. This has dictated the need for the rapid genotyping of mutant mouse colonies for more efficient utilization of animal holding space. We have established a high-throughput protocol for mouse genotyping using two robotics workstations: a liquid-handling robot to assemble PCR and a microfluidics electrophoresis robot for PCR product analysis. This dual-robotics setup incurs lower start-up costs than a fully automated system while still minimizing human intervention. Essential to this automation scheme is the construction of a database containing customized scripts for programming the robotics workstations. Using these scripts and the robotics systems, multiple combinations of genotyping reactions can be assembled simultaneously, allowing even complex genotyping data to be generated rapidly with consistency and accuracy. A detailed protocol, database, scripts, and additional background information are available at http://dir.nhlbi.nih.gov/labs/ldb-chd/autogene/.

  9. Humidity Data for 9975 Shipping Packages with Softwood Fiberboard

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Daugherty, W. L.

    The 9975 surveillance program is developing a technical basis to support extending the storage period of 9975 packages in K-Area Complex beyond the currently approved 15 years. A key element of this effort is developing a better understanding of degradation of the fiberboard assembly under storage conditions. This degradation is influenced greatly by the moisture content of the fiberboard, which is not well characterized on an individual package basis. Direct measurements of humidity and fiberboard moisture content have been made on two test packages with softwood fiberboard and varying internal heat levels from 0 up to 19 W. Comparable measurements with cane fiberboard have been reported previously. With an internal heat load, a temperature gradient in the fiberboard assembly leads to varying relative humidity in the air around the fiberboard. However, the absolute humidity tends to remain approximately constant throughout the package, especially at lower heat loads.

  10. A programmable optimization environment using the GAMESS-US and MERLIN/MCL packages. Applications on intermolecular interaction energies

    NASA Astrophysics Data System (ADS)

    Kalatzis, Fanis G.; Papageorgiou, Dimitrios G.; Demetropoulos, Ioannis N.

    2006-09-01

    The Merlin/MCL optimization environment and the GAMESS-US package were combined so as to offer an extended and efficient quantum chemistry optimization system, capable of implementing complex optimization strategies for generic molecular modeling problems. A communication and data exchange interface was established between the two packages exploiting all Merlin features such as multiple optimizers, box constraints, user extensions and a high level programming language. An important feature of the interface is its ability to perform dimer computations by eliminating the basis set superposition error using the counterpoise (CP) method of Boys and Bernardi. Furthermore it offers CP-corrected geometry optimizations using analytic derivatives. The unified optimization environment was applied to construct portions of the intermolecular potential energy surface of the weakly bound H-bonded complex C6H6-H2O by utilizing the high level Merlin Control Language. The H-bonded dimer HF-H2O was also studied by CP-corrected geometry optimization. The ab initio electronic structure energies were calculated using the 6-31G** basis set at the Restricted Hartree-Fock and second-order Møller-Plesset levels, while all geometry optimizations were carried out using a quasi-Newton algorithm provided by Merlin. Program summary Title of program: MERGAM Catalogue identifier: ADYB_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADYB_v1_0 Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland Computer for which the program is designed and others on which it has been tested: The program is designed for machines running the UNIX operating system. It has been tested on the following architectures: IA32 (Linux with gcc/g77 v.3.2.3), AMD64 (Linux with the Portland group compilers v.6.0), SUN64 (SunOS 5.8 with the Sun Workshop compilers v.5.2) and SGI64 (IRIX 6.5 with the MIPSpro compilers v.7.4) Installations: University of Ioannina, Greece Operating systems or monitors under which the program has been tested: UNIX Programming language used: ANSI C, ANSI Fortran-77 No. of lines in distributed program, including test data, etc.: 11 282 No. of bytes in distributed program, including test data, etc.: 49 458 Distribution format: tar.gz Memory required to execute with typical data: Memory requirements mainly depend on the selection of a GAMESS-US basis set and the number of atoms No. of bits in a word: 32 No. of processors used: 1 Has the code been vectorized or parallelized?: no Nature of physical problem: Multidimensional geometry optimization is of great importance in any ab initio calculation since it usually is one of the most CPU-intensive tasks, especially on large molecular systems. For example, the geometric and energetic description of van der Waals and weakly bound H-bonded complexes requires the construction of related important portions of the multidimensional intermolecular potential energy surface (IPES). So the various held views about the nature of these bonds can be quantitatively tested. Method of solution: The Merlin/MCL optimization environment was interconnected with the GAMESS-US package to facilitate geometry optimization in quantum chemistry problems. The important portions of the IPES require the capability to program optimization strategies. The Merlin/MCL environment was used for the implementation of such strategies.
In this work, a CP-corrected geometry optimization was performed on the HF-H2O complex and an MCL program was developed to study portions of the potential energy surface of the C6H6-H2O complex. Restrictions on the complexity of the problem: The Merlin optimization environment and the GAMESS-US package must be installed. The MERGAM interface requires GAMESS-US input files that have been constructed in Cartesian coordinates. This restriction arises from a design-time requirement not to allow reorientation of atomic coordinates; this rule always holds true when applying the COORD = UNIQUE keyword in a GAMESS-US input file. Typical running time: It depends on the size of the molecular system, the size of the basis set and the method of electron correlation. Execution of the test run took approximately 5 min on a 2.8 GHz Intel Pentium CPU.

  11. Shuttle user analysis (study 2.2): Volume 3. Business Risk And Value of Operations in space (BRAVO). Part 4: Computer programs and data look-up

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Computer program listings as well as graphical and tabulated data needed by the analyst to perform a BRAVO analysis were examined. A graphical aid that can be used to determine the earth coverage of satellites in synchronous equatorial orbits was described. A listing for the satellite synthesis computer program, as well as a sample printout for the DSCS-II satellite program and a listing of the symbols used in the program, was included. The APL language listing for the payload cost-estimating computer program was given. This language is compatible with many of the time-sharing remote-terminal computers used in the United States. Data on the Intelsat communications network were studied. Costs for telecommunications systems leasing, line-of-sight microwave relay communications systems, submarine telephone cables, and terrestrial power generation systems were also described.

  12. Data-Based Interval Throwing Programs for Collegiate Softball Players

    PubMed Central

    Axe, Michael J.; Windley, Thomas C.; Snyder-Mackler, Lynn

    2002-01-01

    Objective: To construct interval throwing programs followed by a simulated game for collegiate softball players at all positions. The programs are intended to be used as functional progressions within a comprehensive rehabilitation program for an injured athlete or to augment off-season conditioning workouts. Design and Setting: We collected data over a single season of National Collegiate Athletic Association softball at the University of Delaware and Goldey Beacom College. We observed 220 half-innings of play and 2785 pitches during data collection. Subjects: The subjects were collegiate-level softball players at all positions of play. Measurements: We recorded the number of pitches for pitchers. For catchers, we recorded the number of sprints to back up a play, time in the squat stance, throws back to the pitcher, and the perceived effort and distance of all other throws. We also collected the perceived effort and distance of all throws for infielders and outfielders. Results: Pitchers threw an average of 89.61 pitches per game; catchers were in the squat stance 14.13 minutes per game; infielders threw the ball between 4.28 times per game and 6.30 times per game; and outfielders threw distances of up to 175 feet. Conclusions: We devised the interval throwing programs from the data collected, field dimensions, the types of injuries found to occur in softball, and a general understanding of tissue healing. We designed programs that allow a safe and efficient progressive return to sport. PMID:12937435

  13. Impact of the Primary Care Exception on Family Medicine Resident Coding.

    PubMed

    Cawse-Lucas, Jeanne; Evans, David V; Ruiz, David R; Allcut, Elizabeth A; Andrilla, C Holly A; Thompson, Matthew; Norris, Thomas E

    2016-03-01

    The Medicare Primary Care Exception (PCE) allows residents to see and bill for less-complex patients independently in the primary care setting, requiring attending physicians only to see patients for higher-level visits and complete physical exams in order to bill for them as such. Primary care residencies apply the PCE in various ways. We investigated the impact of the PCE on resident coding practices. Family medicine residency directors in a five-state region completed a survey regarding interpretation and application of the PCE, including the number of established patient evaluation and management codes entered by residents and attending faculty at their institution. The percentage of high-level codes was compared between residencies using chi-square tests. We analyzed coding data for 125,016 visits from 337 residents and 172 faculty physicians in 15 of 18 eligible family medicine residencies. Among programs applying the PCE criteria to all patients, residents billed 86.7% low-mid complexity and 13.3% high-complexity visits. In programs that only applied the PCE to Medicare patients, residents billed 74.9% low-mid complexity visits and 25.2% high-complexity visits. Attending physicians coded more high-complexity visits at both types of programs. The estimated revenue loss over the 1,650 RRC-required outpatient visits was $2,558.66 per resident and $57,569.85 per year for the average residency in our sample. Residents at family medicine programs that apply the PCE to all patients bill significantly fewer high-complexity visits. This finding leads to compliance and regulatory concerns and suggests significant revenue loss. Further study is required to determine whether this discrepancy also reflects inaccuracy in coding.

  14. Using value-based analysis to influence outcomes in complex surgical systems.

    PubMed

    Kirkpatrick, John R; Marks, Stanley; Slane, Michele; Kim, Donald; Cohen, Lance; Cortelli, Michael; Plate, Juan; Perryman, Richard; Zapas, John

    2015-04-01

    Value-based analysis (VBA) is a management strategy used to determine changes in value (quality/cost) when a usual practice (UP) is replaced by a best practice (BP). Previously validated in clinical initiatives, its usefulness in complex systems is unknown. To answer this question, we used VBA to correct deficiencies in cardiac surgery at Memorial Healthcare System. Cardiac surgery is a complex surgical system that lends itself to VBA because outcomes metrics provided by the Society of Thoracic Surgeons provide an estimate of quality; cost is available from Centers for Medicare and Medicaid Services and other contemporary sources; the UP can be determined; and the best practice can be established. Analysis of the UP at Memorial Healthcare System revealed considerable deficiencies in selection of patients for surgery; the surgery itself, including choice of procedure and outcomes; after care; follow-up; and control of expenditures. To correct these deficiencies, each UP was replaced with a BP. Changes included replacement of most of the cardiac surgeons; conversion to an employed physician model; restructuring of a heart surgery unit; recruitment of cardiac anesthesiologists; introduction of an interactive educational program; eliminating unsafe practices; and reducing cost. There was a significant (p < 0.01) reduction in readmissions, complications, and mortality between 2009 and 2013. Memorial Healthcare System was only 1 of 17 (1.7%) database participants (n = 1,009) to achieve a Society of Thoracic Surgeons 3-star rating in all 3 measured categories. Despite substantial improvements in quality, the cost per case and the length of stay declined. These changes created a savings opportunity of $14 million, with actual savings of $10.4 million. These findings suggest that VBA can be a powerful tool to enhance value (quality/cost) in a complex surgical system. Copyright © 2015 American College of Surgeons. Published by Elsevier Inc. All rights reserved.

  15. The Steroid Hormone 20-Hydroxyecdysone Up-regulates Ste-20 Family Serine/Threonine Kinase Hippo to Induce Programmed Cell Death*

    PubMed Central

    Dong, Du-Juan; Jing, Yu-Pu; Liu, Wen; Wang, Jin-Xing; Zhao, Xiao-Fan

    2015-01-01

    The steroid hormone 20-hydroxyecdysone (20E) and the serine/threonine Ste20-like kinase Hippo signal promote programmed cell death (PCD) during development, although the interaction between them remains unclear. Here, we present evidence that 20E up-regulates Hippo to induce PCD during the metamorphic development of insects. We found that Hippo is involved in 20E-induced metamorphosis via promoting the phosphorylation and cytoplasmic retention of Yorkie (Yki), causing suppressed expression of the inhibitor of apoptosis (IAP), thereby releasing its inhibitory effect on caspase. Furthermore, we show that 20E induced the expression of Hippo at the transcriptional level through the ecdysone receptor (EcR), ultraspiracle protein (USP), and hormone receptor 3 (HR3). We also found that Hippo suppresses the binding of Yki complex to the HR3 promoter. In summary, 20E up-regulates the transcription of Hippo via EcRB1, USP1, and HR3 to induce PCD, and Hippo has negative feedback effects on HR3 expression. These two signaling pathways coordinate PCD during insect metamorphosis. PMID:26272745

  16. Simulation of ultra-high energy photon propagation in the geomagnetic field

    NASA Astrophysics Data System (ADS)

    Homola, P.; Góra, D.; Heck, D.; Klages, H.; Pękala, J.; Risse, M.; Wilczyńska, B.; Wilczyński, H.

    2005-12-01

    The identification of primary photons or specifying stringent limits on the photon flux is of major importance for understanding the origin of ultra-high energy (UHE) cosmic rays. UHE photons can initiate particle cascades in the geomagnetic field, which leads to significant changes in the subsequent atmospheric shower development. We present a Monte Carlo program allowing detailed studies of conversion and cascading of UHE photons in the geomagnetic field. The program named PRESHOWER can be used both as an independent tool or together with a shower simulation code. With the stand-alone version of the code it is possible to investigate various properties of the particle cascade induced by UHE photons interacting in the Earth's magnetic field before entering the Earth's atmosphere. Combining this program with an extensive air shower simulation code such as CORSIKA offers the possibility of investigating signatures of photon-initiated showers. In particular, features can be studied that help to discern such showers from the ones induced by hadrons. As an illustration, calculations for the conditions of the southern part of the Pierre Auger Observatory are presented. Catalogue identifier: ADWG Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADWG Program obtainable: CPC Program Library, Queen's University of Belfast, N. Ireland Computer on which the program has been thoroughly tested: Intel-Pentium based PC Operating system: Linux, DEC-Unix Programming language used: C, FORTRAN 77 Memory required to execute with typical data: <100 kB No. of bits in a word: 32 Has the code been vectorized?: no Number of lines in distributed program, including test data, etc.: 2567 Number of bytes in distributed program, including test data, etc.: 25 690 Distribution format: tar.gz Other procedures used in PRESHOWER: IGRF [N.A. Tsyganenko, National Space Science Data Center, NASA GSFC, Greenbelt, MD 20771, USA, http://nssdc.gsfc.nasa.gov/space/model/magnetos/data-based/geopack.html], bessik, ran2 [Numerical Recipes, http://www.nr.com]. Nature of the physical problem: Simulation of a cascade of particles initiated by a UHE photon passing through the geomagnetic field above the Earth's atmosphere. Method of solution: The primary photon is tracked until its conversion into an e+e- pair or until it reaches the upper atmosphere. If conversion occurred, each individual particle in the resultant preshower is checked for either bremsstrahlung radiation (electrons) or secondary gamma conversion (photons). The procedure ends at the top of the atmosphere and the shower particle data are saved. Restrictions on the complexity of the problem: Gamma conversion into particles other than an electron-positron pair has not been taken into account. Typical running time: 100 preshower events with primary energy 10^19 eV require a 800 MHz CPU time of about 50 min; with 10^20 eV the simulation time for 100 events grows up to 500 min.

  17. Feasibility study of current pulse induced 2-bit/4-state multilevel programming in phase-change memory

    NASA Astrophysics Data System (ADS)

    Liu, Yan; Fan, Xi; Chen, Houpeng; Wang, Yueqing; Liu, Bo; Song, Zhitang; Feng, Songlin

    2017-08-01

    Multilevel data storage for phase-change memory (PCM) has attracted growing attention in the memory market as a way to implement high-capacity memory systems and reduce cost-per-bit. In this work, we present a universal programming method based on a stair-case SET current pulse for PCM cells, which exploits an optimized programming scheme to achieve 2-bit/4-state resistance levels with equal logarithmic intervals. The SET stair-case waveform can be optimized by real-time TCAD simulation to realize multilevel data storage efficiently in an arbitrary phase-change material. Experimental results from a 1 k-bit PCM test chip validate the proposed multilevel programming scheme, which improves storage density and the robustness of the resistance levels, is energy efficient, and avoids additional process complexity. A sketch of the scheme follows.
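
    To make the "equal logarithmic interval" idea concrete, the sketch below computes four resistance targets equally spaced in log(R) across an assumed resistance window and applies a bounded program-and-verify staircase of SET pulses. Everything here is hypothetical: the resistance window, pulse amplitudes, tolerance, and the toy cell model stand in for the test-chip interface, which the abstract does not specify.

    ```cpp
    // Illustrative 2-bit/4-state program-and-verify with a stair-case SET pulse.
    #include <algorithm>
    #include <cmath>
    #include <cstdio>

    // Toy cell model (assumption): each SET pulse lowers resistance by a factor
    // that grows with pulse amplitude, floored at the fully SET resistance.
    static double cell_r = 1.0e6;                       // start from RESET state
    double read_resistance() { return cell_r; }
    void apply_set_pulse(double amp_uA) {
        cell_r = std::max(1.0e4, cell_r * std::exp(-amp_uA / 400.0));
    }

    int main() {
        const double r_min = 1.0e4, r_max = 1.0e6;      // assumed resistance window
        // Four levels with equal logarithmic spacing across the window:
        // target[s] = r_min * (r_max/r_min)^(s/3), s = 0..3
        double target[4];
        for (int s = 0; s < 4; ++s)
            target[s] = r_min * std::pow(r_max / r_min, s / 3.0);

        const int state = 2;                            // desired 2-bit state
        double amp_uA = 50.0;                           // staircase start amplitude
        const double step_uA = 10.0;                    // staircase increment
        const double tol_decades = 0.1;                 // verify tolerance in log10(R)

        for (int pulse = 1; pulse <= 20; ++pulse) {     // bounded program-and-verify
            apply_set_pulse(amp_uA);
            double r = read_resistance();
            if (std::fabs(std::log10(r / target[state])) < tol_decades) {
                std::printf("state %d reached: R = %.3g ohm after %d pulses\n",
                            state, r, pulse);
                return 0;
            }
            amp_uA += step_uA;                          // next staircase step
        }
        std::printf("verify failed: RESET the cell and retry\n");
        return 1;
    }
    ```

    Verifying in log10(R) rather than in ohms is what keeps the four levels separated by the same read-margin ratio, which is the practical benefit of equal-log spacing.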

  18. Comprehensive Adolescent Health Programs That Include Sexual and Reproductive Health Services: A Systematic Review

    PubMed Central

    Parekh, Jenita; Tunçalp, Özge; Turke, Shani; Blum, Robert William

    2014-01-01

    We systematically reviewed peer-reviewed and gray literature on comprehensive adolescent health (CAH) programs (1998–2013), including sexual and reproductive health services. We screened 36 119 records and extracted articles using predefined criteria. We synthesized data into descriptive characteristics and assessed quality by evidence level. We extracted data on 46 programs, of which 19 were defined as comprehensive. Ten met all inclusion criteria. Most were US based; others were implemented in Egypt, Ethiopia, and Mexico. Three programs displayed rigorous evidence; 5 had strong and 2 had modest evidence. Those with rigorous or strong evidence directly or indirectly influenced adolescent sexual and reproductive health. The long-term impact of many CAH programs cannot be proven because of insufficient evaluations. Evaluation approaches that take into account the complex operating conditions of many programs are needed to better understand mechanisms behind program effects. PMID:25320876

  19. About the Exposure Factors Program

    EPA Pesticide Factsheets

    The development of the latest version of the Exposure Factors Handbook (EFH): 2011 Edition (EPA/600/R-09/052F) has reinforced the need for a more comprehensive program that addresses issues related to exposure factors. Since the first version of the EFH was released in 1997, up-to-date and accurate data on the exposure factors used in assessing exposure to contaminants in the environment have been a high priority for exposure assessors throughout the U.S. The completion of the 2011 edition of the Exposure Factors Handbook is only the first step in fulfilling this need. Many data needs have been identified, and follow-up research is underway to address some of the data gaps. This web page is intended to provide a

  20. Characterization of representative materials in support of safe, long term storage of surplus plutonium in DOE-STD-3013 containers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Paul H; Narlesky, Joshua E; Worl, Laura A

    2010-01-01

    The Surveillance and Monitoring Program (SMP) is a joint LANL/SRS effort funded by DOE/EM to provide the technical basis for the safe, long-term storage (up to 50 years) of over 6 metric tons of plutonium stored in over 5000 DOE-STD-3013 containers at various facilities around the DOE complex. The majority of this material is plutonium that is surplus to the nuclear weapons program, and much of it is destined for conversion to mixed oxide fuel for use in US nuclear power plants. The form of the plutonium ranges from relatively pure metal and oxide to very impure oxide. The performance of the 3013 containers has been shown to depend on moisture content and on the levels, types, and chemical forms of the impurities. The oxide materials that present the greatest challenge to the storage container are those that contain chloride salts. The chlorides (NaCl, KCl, CaCl₂, and MgCl₂) range from less than half of the impurities present to nearly all the impurities. Other common impurities include oxides and other compounds of calcium, magnesium, iron, and nickel. Over the past 15 years the program has collected a large body of experimental data on over 60 samples of plutonium chosen to represent the broader population of materials in storage. This paper will summarize the characterization data, including the origin and process history, particle size, surface area, density, calorimetry, chemical analysis, moisture analysis, prompt gamma, gas generation, and corrosion behavior.
