Sample records for large scale program

  1. Designing for Scale: Reflections on Rolling Out Reading Improvement in Kenya and Liberia.

    PubMed

    Gove, Amber; Korda Poole, Medina; Piper, Benjamin

    2017-03-01

    Since 2008, the Ministries of Education in Liberia and Kenya have undertaken transitions from small-scale pilot programs to improve reading outcomes among primary learners to the large-scale implementation of reading interventions. The effects of the pilots on learning outcomes were significant, but questions remained regarding whether such large gains could be sustained at scale. In this article, the authors dissect the Liberian and Kenyan experiences with implementing large-scale reading programs, documenting the critical components and conditions of the program designs that affected the likelihood of successfully transitioning from pilot to scale. They also review the design, deployment, and effectiveness of each pilot program and the scale, design, duration, enabling conditions, and initial effectiveness results of the scaled programs in each country. The implications of these results for the design of both pilot and large-scale reading programs are discussed in light of the experiences of both the Liberian and Kenyan programs. © 2017 Wiley Periodicals, Inc.

  2. A Functional Model for Management of Large Scale Assessments.

    ERIC Educational Resources Information Center

    Banta, Trudy W.; And Others

    This functional model for managing large-scale program evaluations was developed and validated in connection with the assessment of Tennessee's Nutrition Education and Training Program. Management of such a large-scale assessment requires the development of a structure for the organization; distribution and recovery of large quantities of…

  3. Critical Issues in Large-Scale Assessment: A Resource Guide.

    ERIC Educational Resources Information Center

    Redfield, Doris

    The purpose of this document is to provide practical guidance and support for the design, development, and implementation of large-scale assessment systems that are grounded in research and best practice. Information is included about existing large-scale testing efforts, including national testing programs, state testing programs, and…

  4. Using the High-Level Based Program Interface to Facilitate the Large Scale Scientific Computing

    PubMed Central

    Shang, Yizi; Shang, Ling; Gao, Chuanchang; Lu, Guiming; Ye, Yuntao; Jia, Dongdong

    2014-01-01

    This paper extends research on facilitating large-scale scientific computing on grid and desktop grid platforms. The issues addressed include the programming method, the overhead of middleware based on a high-level program interface, and anticipatory data migration. The block-based Gauss-Jordan algorithm is used as a real example of large-scale scientific computing to evaluate these issues. The results show that the high-level program interface makes it easier to build complex scientific applications on large-scale platforms, though a small overhead is unavoidable. The anticipatory data-migration mechanism can also improve the efficiency of platforms that must process data-intensive scientific applications. PMID:24574931
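
    The block-based Gauss-Jordan algorithm mentioned above partitions the matrix into sub-blocks so that each block update is an independent matrix product that can be farmed out to a grid node. A minimal serial sketch of the idea (the matrix, block size, and lack of pivoting are illustrative assumptions, not details from the paper):

```python
import numpy as np

def block_gauss_jordan_inverse(A, b):
    """Invert A by Gauss-Jordan elimination on b x b blocks.

    Each block update below is an independent matrix product, which is
    what makes the method attractive for distribution across nodes.
    Assumes every pivot block is invertible (no block pivoting).
    """
    n = A.shape[0]
    assert n % b == 0, "block size must divide the matrix order"
    M = np.hstack([A.astype(float), np.eye(n)])   # augmented [A | I]
    for k in range(0, n, b):
        piv = np.linalg.inv(M[k:k+b, k:k+b])      # invert the pivot block
        M[k:k+b, :] = piv @ M[k:k+b, :]           # normalize the pivot block row
        for i in range(0, n, b):                  # eliminate the other block rows
            if i != k:
                M[i:i+b, :] -= M[i:i+b, k:k+b] @ M[k:k+b, :]
    return M[:, n:]                               # right half is now A^{-1}
```

    In a grid setting, the inner elimination loop is the parallel region: each block row update depends only on the pivot block row, so the updates can run concurrently on different nodes.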

  5. Optical/IR from ground

    NASA Technical Reports Server (NTRS)

    Strom, Stephen; Sargent, Wallace L. W.; Wolff, Sidney; Ahearn, Michael F.; Angel, J. Roger; Beckwith, Steven V. W.; Carney, Bruce W.; Conti, Peter S.; Edwards, Suzan; Grasdalen, Gary

    1991-01-01

    Optical/infrared (O/IR) astronomy in the 1990's is reviewed. The following subject areas are included: research environment; science opportunities; technical development of the 1980's and opportunities for the 1990's; and ground-based O/IR astronomy outside the U.S. Recommendations are presented for: (1) large scale programs (Priority 1: a coordinated program for large O/IR telescopes); (2) medium scale programs (Priority 1: a coordinated program for high angular resolution; Priority 2: a new generation of 4-m class telescopes); (3) small scale programs (Priority 1: near-IR and optical all-sky surveys; Priority 2: a National Astrometric Facility); and (4) infrastructure issues (develop, purchase, and distribute optical CCDs and infrared arrays; a program to support large optics technology; a new generation of large filled aperture telescopes; a program to archive and disseminate astronomical databases; and a program for training new instrumentalists).

  6. CHARACTERIZATION OF SMALL ESTUARIES AS A COMPONENT OF A REGIONAL-SCALE MONITORING PROGRAM

    EPA Science Inventory

    Large-scale environmental monitoring programs, such as EPA's Environmental Monitoring and Assessment Program (EMAP), by nature focus on estimating the ecological condition of large geographic areas. Generally missing is the ability to provide estimates of condition of individual ...

  7. Small-scale monitoring - can it be integrated with large-scale programs?

    Treesearch

    C. M. Downes; J. Bart; B. T. Collins; B. Craig; B. Dale; E. H. Dunn; C. M. Francis; S. Woodley; P. Zorn

    2005-01-01

    There are dozens of programs and methodologies for monitoring and inventory of bird populations, differing in geographic scope, species focus, field methods and purpose. However, most of the emphasis has been placed on large-scale monitoring programs. People interested in assessing bird numbers and long-term trends in small geographic areas such as a local birding area...

  8. Performance of lap splices in large-scale column specimens affected by ASR and/or DEF.

    DOT National Transportation Integrated Search

    2012-06-01

    This research project conducted a large experimental program consisting of the design, construction, curing, deterioration, and structural load testing of 16 large-scale column specimens with a critical lap splice region, and then compared ...

  9. Aquatic Plant Control Research Program. Large-Scale Operations Management Test of Use of the White Amur for Control of Problem Aquatic Plants. Report 5. Synthesis Report.

    DTIC Science & Technology

    1984-06-01

    [Scanned document; abstract not machine-readable. Technical Report A-78-2, Large-Scale Operations Management Test of Use of the White Amur for Control of Problem Aquatic Plants, Report 5: Synthesis Report, Army Engineer Waterways Experiment Station, Vicksburg, MS.]

  10. 5 years of experience with a large-scale mentoring program for medical students.

    PubMed

    Pinilla, Severin; Pander, Tanja; von der Borch, Philip; Fischer, Martin R; Dimitriadis, Konstantinos

    2015-01-01

    In this paper we present our 5-year experience with a large-scale mentoring program for undergraduate medical students at the Ludwig Maximilians-Universität Munich (LMU). We implemented a two-tiered program with a peer-mentoring concept for preclinical students and a 1:1-mentoring concept for clinical students aided by a fully automated online-based matching algorithm. Approximately 20-30% of each student cohort participates in our voluntary mentoring program. Defining ideal program evaluation strategies, recruiting mentors from beyond the academic environment and accounting for the mentoring network reality remain challenging. We conclude that a two-tiered program is well accepted by students and faculty. In addition, the online-based matching seems to be effective for large-scale mentoring programs.

  11. Problems and Solutions in Evaluating Child Outcomes of Large-Scale Educational Programs.

    ERIC Educational Resources Information Center

    Abrams, Allan S.; And Others

    1979-01-01

    Evaluation of large-scale programs is problematical because of inherent bias in assignment of treatment and control groups, resulting in serious regression artifacts even with the use of analysis of covariance designs. Nonuniformity of program implementation across sites and classrooms is also a problem. (Author/GSK)

  12. Large-scale linear programs in planning and prediction.

    DOT National Transportation Integrated Search

    2017-06-01

    Large-scale linear programs are at the core of many traffic-related optimization problems in both planning and prediction. Moreover, many of these involve significant uncertainty, and hence are modeled using either chance constraints, or robust optim...
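
    As a minimal sketch of the kind of model involved, consider a toy route-assignment LP: minimize total travel cost subject to a demand constraint and per-route capacities. All numbers here are illustrative assumptions, not from the report; chance or robust constraints would, in effect, tighten the capacity bounds or replace them with cone constraints.

```python
from scipy.optimize import linprog

# Travel cost per unit of flow on each of two routes (illustrative).
c = [2.0, 3.0]
# Demand constraint: total flow across the two routes must equal 10 units.
A_eq, b_eq = [[1.0, 1.0]], [10.0]
# Per-route capacity bounds.
bounds = [(0, 6), (0, 8)]

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
# The cheapest feasible assignment fills the cheaper route to capacity.
print(res.x, res.fun)
```

    Real planning and prediction models differ only in scale: the same structure with millions of variables is what motivates the specialized large-scale solvers the report refers to.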

  13. A new resource for developing and strengthening large-scale community health worker programs.

    PubMed

    Perry, Henry; Crigler, Lauren; Lewin, Simon; Glenton, Claire; LeBan, Karen; Hodgins, Steve

    2017-01-12

    Large-scale community health worker programs are now growing in importance around the world in response to the resurgence of interest and growing evidence of the importance of community-based primary health care for improving the health of populations in resource-constrained, high-mortality settings. These programs, because of their scale and operational challenges, merit special consideration by the global health community, national policy-makers, and program implementers. A new online resource is now available to assist in that effort: Developing and Strengthening Community Health Worker Programs at Scale: A Reference Guide and Case Studies for Program Managers and Policymakers (http://www.mchip.net/CHWReferenceGuide). This CHW Reference Guide is the product of 27 different collaborators who, collectively, have a formidable breadth and depth of experience and knowledge about CHW programming around the world. It provides a thoughtful discussion about the many operational issues that large-scale CHW programs need to address as they undergo the process of development, expansion or strengthening. Detailed case studies of 12 national CHW programs are included in the Appendix, the most current and complete set of case studies currently available. Future articles in this journal will highlight many of the themes in the CHW Reference Guide and provide an update of recent advances and experiences. These articles will serve, we hope, to (1) increase awareness about the CHW Reference Guide and its usefulness and (2) connect a broader audience to the critical importance of strengthening large-scale CHW programs for the health benefits that they can bring to underserved populations around the world.

  14. Estimating the Effectiveness of Special Education Using Large-Scale Assessment Data

    ERIC Educational Resources Information Center

    Ewing, Katherine Anne

    2009-01-01

    The inclusion of students with disabilities in large scale assessment and accountability programs has provided new opportunities to examine the impact of special education services on student achievement. Hanushek, Kain, and Rivkin (1998, 2002) evaluated the effectiveness of special education programs by examining students' gains on a large-scale…

  15. Iraq: Recent Developments in Reconstruction Assistance

    DTIC Science & Technology

    2005-05-12

    Summary: Large-scale reconstruction assistance programs are being undertaken by the United States following the war with Iraq. On June 28, 2004, the entity implementing assistance programs, the Coalition Provisional Authority (CPA), dissolved, and sovereignty was returned to Iraq. [Search-result excerpt; full abstract not available.]

  16. Iraq: Recent Developments in Reconstruction Assistance

    DTIC Science & Technology

    2005-03-23

    Summary: Large-scale reconstruction assistance programs are being undertaken by the United States in Iraq following the war. On June 28, 2004, the entity implementing assistance programs, the Coalition Provisional Authority, dissolved. This report describes recent developments in this assistance. [Search-result excerpt; full abstract not available.]

  17. Large-Scale 1:1 Computing Initiatives: An Open Access Database

    ERIC Educational Resources Information Center

    Richardson, Jayson W.; McLeod, Scott; Flora, Kevin; Sauers, Nick J.; Kannan, Sathiamoorthy; Sincar, Mehmet

    2013-01-01

    This article details the spread and scope of large-scale 1:1 computing initiatives around the world. What follows is a review of the existing literature around 1:1 programs followed by a description of the large-scale 1:1 database. Main findings include: 1) the XO and the Classmate PC dominate large-scale 1:1 initiatives; 2) if professional…

  18. Parallel and serial computing tools for testing single-locus and epistatic SNP effects of quantitative traits in genome-wide association studies

    PubMed Central

    Ma, Li; Runesha, H Birali; Dvorkin, Daniel; Garbe, John R; Da, Yang

    2008-01-01

    Background Genome-wide association studies (GWAS) using single nucleotide polymorphism (SNP) markers provide opportunities to detect epistatic SNPs associated with quantitative traits and to detect the exact mode of an epistasis effect. Computational difficulty is the main bottleneck for epistasis testing in large-scale GWAS. Results The EPISNPmpi and EPISNP computer programs were developed for testing single-locus and epistatic SNP effects on quantitative traits in GWAS, including tests of three single-locus effects for each SNP (SNP genotypic effect, additive and dominance effects) and five epistasis effects for each pair of SNPs (two-locus interaction, additive × additive, additive × dominance, dominance × additive, and dominance × dominance) based on the extended Kempthorne model. EPISNPmpi is the parallel computing program for epistasis testing in large-scale GWAS and achieved excellent scalability for large-scale analysis and portability for various parallel computing platforms. EPISNP is the serial computing program based on the EPISNPmpi code for epistasis testing in small-scale GWAS using commonly available operating systems and computer hardware. Three serial computing utility programs were developed for graphical viewing of test results and epistasis networks, and for estimating CPU time and disk space requirements. Conclusion The EPISNPmpi parallel computing program provides an effective computing tool for epistasis testing in large-scale GWAS, and the EPISNP serial computing programs are convenient tools for epistasis analysis in small-scale GWAS using commonly available computer hardware. PMID:18644146
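
    The paper defines its tests via the extended Kempthorne model; as a rough illustration of the underlying idea only (this simplified coding is an assumption, not the EPISNP implementation), a two-locus interaction test can be sketched as a nested-model F test: compare the fit of a full model with one mean per genotype pair against an additive main-effects model.

```python
import numpy as np

def interaction_F(geno1, geno2, y):
    """F statistic for a two-locus interaction: full cell-mean model
    (one mean per genotype pair) vs. a main-effects-only model.
    Genotypes are coded 0/1/2; y is the quantitative trait."""
    def dummies(g):
        return np.column_stack([(g == v).astype(float) for v in (0, 1, 2)])
    D1, D2 = dummies(geno1), dummies(geno2)
    # Additive model: intercept plus main-effect dummies for each locus.
    X_add = np.column_stack([np.ones(len(y)), D1[:, 1:], D2[:, 1:]])
    # Full model: one indicator per two-locus genotype cell (9 cells).
    cells = geno1 * 3 + geno2
    X_full = np.column_stack([(cells == c).astype(float) for c in range(9)])
    def sse(X):
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        r = y - X @ beta
        return r @ r
    sse_add, sse_full = sse(X_add), sse(X_full)
    df1 = 4                   # interaction degrees of freedom (2 x 2)
    df2 = len(y) - 9          # residual df under the full model
    return ((sse_add - sse_full) / df1) / (sse_full / df2)

# Simulated demo: a trait with a built-in epistatic (interaction) effect.
rng = np.random.default_rng(1)
g1, g2 = rng.integers(0, 3, 300), rng.integers(0, 3, 300)
y = g1 + g2 + 2.0 * (g1 == g2) + rng.standard_normal(300) * 0.5
F = interaction_F(g1, g2, y)
```

    EPISNPmpi's contribution is not this statistic per se but distributing billions of such pairwise tests across MPI ranks; the per-pair computation stays small and independent, which is why the problem parallelizes well.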

  19. Aquatic Plant Control Research Program. Large-Scale Operations Management Test of Use of the White Amur for Control of Problem Aquatic Plants. Reports 2 and 3. First and Second Year Poststocking Results. Volume 5. The Herpetofauna of Lake Conway, Florida: Community Analysis.

    DTIC Science & Technology

    1983-07-01

    [Scanned document; abstract not machine-readable. Technical Report A-78-2, Large-Scale Operations Management Test of Use of the White Amur for Control of Problem Aquatic Plants, Reports 2 and 3, Army Engineer Waterways Experiment Station, Vicksburg, MS.]

  20. Assuring Quality in Large-Scale Online Course Development

    ERIC Educational Resources Information Center

    Parscal, Tina; Riemer, Deborah

    2010-01-01

    Student demand for online education requires colleges and universities to rapidly expand the number of courses and programs offered online while maintaining high quality. This paper outlines two universities respective processes to assure quality in large-scale online programs that integrate instructional design, eBook custom publishing, Quality…

  1. Large-scale translocation strategies for reintroducing red-cockaded woodpeckers

    Treesearch

    Daniel Saenz; Kristen A. Baum; Richard N. Conner; D. Craig Rudolph; Ralph Costa

    2002-01-01

    Translocation of wild birds is a potential conservation strategy for the endangered red-cockaded woodpecker (Picoides borealis). We developed and tested 8 large-scale translocation strategy models for a regional red-cockaded woodpecker reintroduction program. The purpose of the reintroduction program is to increase the number of red-cockaded...

  2. Space transportation booster engine thrust chamber technology, large scale injector

    NASA Technical Reports Server (NTRS)

    Schneider, J. A.

    1993-01-01

    The objective of the Large Scale Injector (LSI) program was to deliver a 21 inch diameter, 600,000 lbf thrust class injector to NASA/MSFC for hot fire testing. The hot fire test program would demonstrate the feasibility and integrity of the full scale injector, including combustion stability, chamber wall compatibility (thermal management), and injector performance. The 21 inch diameter injector was delivered in September of 1991.

  3. Large-scale Advanced Prop-fan (LAP) technology assessment report

    NASA Technical Reports Server (NTRS)

    Degeorge, C. L.

    1988-01-01

    The technologically significant findings and accomplishments of the Large Scale Advanced Prop-Fan (LAP) program in the areas of aerodynamics, aeroelasticity, acoustics, and materials and fabrication are described. The extent to which the program goals related to these disciplines were achieved is discussed, and recommendations for additional research are presented. The LAP program consisted of the design, manufacture and testing of a near full-scale Prop-Fan, or advanced turboprop, capable of operating efficiently at speeds up to Mach 0.8. An aeroelastically scaled model of the LAP was also designed and fabricated. The goal of the program was to acquire data on Prop-Fan performance that would indicate the technology readiness of Prop-Fans for practical applications in commercial and military aviation.

  4. Architectural Visualization of C/C++ Source Code for Program Comprehension

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Panas, T; Epperly, T W; Quinlan, D

    2006-09-01

    Structural and behavioral visualization of large-scale legacy systems to aid program comprehension is still a major challenge. The challenge is even greater when applications are implemented in flexible and expressive languages such as C and C++. In this paper, we consider visualization of static and dynamic aspects of large-scale scientific C/C++ applications. For our investigation, we reuse and integrate specialized analysis and visualization tools. Furthermore, we present a novel layout algorithm that permits a compressive architectural view of a large-scale software system. Our layout is unique in that it allows traditional program visualizations, i.e., graph structures, to be seen in relation to the application's file structure.

  5. Detecting and Correcting Scale Drift in Test Equating: An Illustration from a Large Scale Testing Program

    ERIC Educational Resources Information Center

    Puhan, Gautam

    2009-01-01

    The purpose of this study is to determine the extent of scale drift on a test that employs cut scores. It was essential to examine scale drift for this testing program because new forms in this testing program are often put on scale through a series of intermediate equatings (known as equating chains). This process may cause equating error to…

  6. A large-scale initiative to disseminate an evidence-based drug abuse prevention program in Italy: Lessons learned for practitioners and researchers.

    PubMed

    Velasco, Veronica; Griffin, Kenneth W; Antichi, Mariella; Celata, Corrado

    2015-10-01

    Across developed countries, experimentation with alcohol, tobacco, and other drugs often begins in the early adolescent years. Several evidence-based programs have been developed to prevent adolescent substance use. Many of the most rigorously tested and empirically supported prevention programs were initially developed and tested in the United States. Increasingly, these interventions are being adopted for use in Europe and throughout the world. This paper reports on a large-scale comprehensive initiative designed to select, adapt, implement, and sustain an evidence-based drug abuse prevention program in Italy. As part of a large-scale regionally funded collaboration in the Lombardy region of Italy, we report on processes through which a team of stakeholders selected, translated and culturally adapted, planned, implemented and evaluated the Life Skills Training (LST) school-based drug abuse prevention program, an evidence-based intervention developed in the United States. We discuss several challenges and lessons learned and implications for prevention practitioners and researchers attempting to undertake similar international dissemination projects. We review several published conceptual models designed to promote the replication and widespread dissemination of effective programs, and discuss their strengths and limitations in the context of planning and implementing a complex, large-scale real-world dissemination effort. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. DIALOG: An executive computer program for linking independent programs

    NASA Technical Reports Server (NTRS)

    Glatt, C. R.; Hague, D. S.; Watson, D. A.

    1973-01-01

    A very large scale computer programming procedure called the DIALOG executive system was developed for the CDC 6000 series computers. The executive computer program, DIALOG, controls the sequence of execution and data management function for a library of independent computer programs. Communication of common information is accomplished by DIALOG through a dynamically constructed and maintained data base of common information. Each computer program maintains its individual identity and is unaware of its contribution to the large scale program. This feature makes any computer program a candidate for use with the DIALOG executive system. The installation and uses of the DIALOG executive system are described.
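
    The executive pattern described here (independent programs that communicate only through a shared common data base) is easy to sketch in miniature. The real DIALOG sequenced whole CDC 6000 programs; in this conceptual analogue (all names and steps are illustrative assumptions) each "program" is a callable that reads from and writes to a dict standing in for the dynamically maintained data base.

```python
# Each 'program' is unaware of the others; all communication goes
# through the shared data base, as in the DIALOG design.

def preprocess(db):
    db["grid"] = [x / 10 for x in range(11)]        # build a mesh

def solve(db):
    db["solution"] = [x * x for x in db["grid"]]    # compute on the mesh

def report(db):
    db["summary"] = max(db["solution"])             # summarize results

def executive(sequence, db=None):
    """Run independent programs in order, managing the common data base."""
    db = {} if db is None else db
    for program in sequence:
        program(db)
    return db

result = executive([preprocess, solve, report])
print(result["summary"])  # → 1.0
```

    Because each program touches only the data base, any program is a candidate for inclusion in the sequence, which mirrors the paper's claim that DIALOG makes any computer program usable with the executive.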

  8. State of the Art Methodology for the Design and Analysis of Future Large Scale Evaluations: A Selective Examination.

    ERIC Educational Resources Information Center

    Burstein, Leigh

    Two specific methods of analysis in large-scale evaluations are considered: structural equation modeling and selection modeling/analysis of non-equivalent control group designs. Their utility in large-scale educational program evaluation is discussed. The examination of these methodological developments indicates how people (evaluators,…

  9. Large-scale budget applications of mathematical programming in the Forest Service

    Treesearch

    Malcolm Kirby

    1978-01-01

    Mathematical programming applications in the Forest Service, U.S. Department of Agriculture, are growing. They are being used for widely varying problems: budgeting, land use planning, timber transport, road maintenance, and timber harvest planning. Large-scale applications are being made in budgeting. The model that is described can be used by developing economies....

  10. Status of JUPITER Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Inoue, T.; Shirakata, K.; Kinjo, K.

    To obtain the data necessary for evaluating the nuclear design method of a large-scale fast breeder reactor, criticality tests with a large-scale homogeneous reactor were conducted as part of a joint research program by Japan and the U.S. Analyses of the tests are underway in both countries. The purpose of this paper is to describe the status of this project.

  11. Exercises in Evaluation of a Large-Scale Educational Program.

    ERIC Educational Resources Information Center

    Glass, Gene V.

    This workbook is designed to serve as training experience for educational evaluators at the preservice (graduate school) or inservice stages. The book comprises a series of exercises in the planning, execution, and reporting of the evaluation of a large-scale educational program in this case Title I of the Elementary and Secondary Education Act of…

  12. Community health worker programs in India: a rights-based review.

    PubMed

    Bhatia, Kavita

    2014-09-01

    This article presents a historical review of national community health worker (CHW) programs in India using a gender- and rights-based lens. The aim is to derive relevant policy implications to stem attrition and enable sustenance of large-scale CHW programs. For the literature review, relevant government policies, minutes of meetings, reports, newspaper articles and statistics were accessed through official websites and a hand search was conducted for studies on the rights-based aspects of large-scale CHW programs. The analysis shows that the CHWs in three successive Indian national CHW programs have consistently asked for reforms in their service conditions, including increased remuneration. Despite an evolution in stakeholder perspectives regarding the rights of CHWs, service reforms are slow. Performance-based payments do not provide the financial security expected by CHWs as demonstrated in the recent Accredited Social Health Activist (ASHA) program. In most countries, CHWs, who are largely women, have never been integrated into the established, salaried team of health system workers. The two hallmark characteristics of CHWs, namely, their volunteer status and the flexibility of their tasks and timings, impede their rights. The consequences of initiating or neglecting standardization should be considered by all countries with large-scale CHW programs like the ASHA program. © Royal Society for Public Health 2014.

  13. Supporting large scale applications on networks of workstations

    NASA Technical Reports Server (NTRS)

    Cooper, Robert; Birman, Kenneth P.

    1989-01-01

    Distributed applications on networks of workstations are an increasingly common way to satisfy computing needs. However, existing mechanisms for distributed programming exhibit poor performance and reliability as application size increases. Extension of the ISIS distributed programming system to support large scale distributed applications by providing hierarchical process groups is discussed. Incorporation of hierarchy in the program structure and exploitation of this to limit the communication and storage required in any one component of the distributed system is examined.

  14. Trans-National Scale-Up of Services in Global Health

    PubMed Central

    Shahin, Ilan; Sohal, Raman; Ginther, John; Hayden, Leigh; MacDonald, John A.; Mossman, Kathryn; Parikh, Himanshu; McGahan, Anita; Mitchell, Will; Bhattacharyya, Onil

    2014-01-01

    Background Scaling up innovative healthcare programs offers a means to improve access, quality, and health equity across multiple health areas. Despite large numbers of promising projects, little is known about successful efforts to scale up. This study examines trans-national scale, whereby a program operates in two or more countries. Trans-national scale is a distinct measure that reflects opportunities to replicate healthcare programs in multiple countries, thereby providing services to broader populations. Methods Based on the Center for Health Market Innovations (CHMI) database of nearly 1200 health programs, the study contrasts 116 programs that have achieved trans-national scale with 1,068 single-country programs. Data were collected on the programs' health focus, service activity, legal status, and funding sources, as well as the programs' locations (rural v. urban emphasis), and founding year; differences are reported with statistical significance. Findings This analysis examines 116 programs that have achieved trans-national scale (TNS) across multiple disease areas and activity types. Compared to 1,068 single-country programs, we find that trans-nationally scaled programs are more donor-reliant; more likely to focus on targeted health needs such as HIV/AIDS, TB, malaria, or family planning rather than provide more comprehensive general care; and more likely to engage in activities that support healthcare services rather than provide direct clinical care. Conclusion This work, based on a large data set of health programs, reports on trans-national scale with comparison to single-country programs. The work is a step towards understanding when programs are able to replicate their services as they attempt to expand health services for the poor across countries and health areas. A subset of these programs should be the subject of case studies to understand factors that affect the scaling process, particularly seeking to identify mechanisms that lead to improved health outcomes. PMID:25375328

  15. Framing Innovation: Does an Instructional Vision Help Superintendents Gain Acceptance for a Large-Scale Technology Initiative?

    ERIC Educational Resources Information Center

    Flanagan, Gina E.

    2014-01-01

    There is limited research that outlines how a superintendent's instructional vision can help to gain acceptance of a large-scale technology initiative. This study explored how superintendents gain acceptance for a large-scale technology initiative (specifically a 1:1 device program) through various leadership actions. The role of the instructional…

  16. Large-Scale Multiobjective Static Test Generation for Web-Based Testing with Integer Programming

    ERIC Educational Resources Information Center

    Nguyen, M. L.; Hui, Siu Cheung; Fong, A. C. M.

    2013-01-01

    Web-based testing has become a ubiquitous self-assessment method for online learning. One useful feature that is missing from today's web-based testing systems is the reliable capability to fulfill different assessment requirements of students based on a large-scale question data set. A promising approach for supporting large-scale web-based…
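
    A common formulation of static test generation of this kind selects a subset of items maximizing total information subject to length and content constraints. A brute-force sketch of that objective and its constraints (the item bank, constraint values, and topics are illustrative assumptions; a real large-scale system would hand the same model to an integer-programming solver rather than enumerate):

```python
from itertools import combinations

# Illustrative item bank: (topic, difficulty, information value).
bank = [
    ("algebra",  0.3, 1.2), ("algebra",  0.6, 1.5), ("algebra",  0.8, 0.9),
    ("geometry", 0.4, 1.1), ("geometry", 0.7, 1.4), ("geometry", 0.9, 1.0),
]

TEST_LEN = 4          # each generated test has exactly 4 items
MIN_PER_TOPIC = 2     # content constraint: at least 2 items per topic

def best_test(bank):
    """Maximize total information subject to length and content constraints."""
    best, best_info = None, -1.0
    for combo in combinations(range(len(bank)), TEST_LEN):
        topics = [bank[i][0] for i in combo]
        if any(topics.count(t) < MIN_PER_TOPIC for t in ("algebra", "geometry")):
            continue                          # violates the content constraint
        info = sum(bank[i][2] for i in combo)
        if info > best_info:
            best, best_info = combo, info
    return best, best_info

combo, info = best_test(bank)
```

    Enumeration is exponential in the bank size, which is exactly why the paper turns to integer programming: the same 0/1 selection variables and linear constraints scale to banks with thousands of items.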

  17. Aquatic Plant Control Research Program. Large-Scale Operations Management Test (LSOMT) of Insects and Pathogens for Control of Waterhyacinth in Louisiana. Volume 1. Results for 1979-1981.

    DTIC Science & Technology

    1985-01-01

    [Scanned document; abstract not machine-readable. Large-Scale Operations Management Test (LSOMT) of Insects and Pathogens for Control of Waterhyacinth in Louisiana, Volume 1: Results for 1979-1981, Army Engineer Waterways Experiment Station, Vicksburg, MS, with the University of Tennessee-Chattanooga.]

  18. Large-scale structural optimization

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, J.

    1983-01-01

    Problems encountered by aerospace designers in attempting to optimize whole aircraft are discussed, along with possible solutions. Large scale optimization, as opposed to component-by-component optimization, is hindered by computational costs, software inflexibility, concentration on a single, rather than trade-off, design methodology and the incompatibility of large-scale optimization with single program, single computer methods. The software problem can be approached by placing the full analysis outside of the optimization loop. Full analysis is then performed only periodically. Problem-dependent software can be removed from the generic code using a systems programming technique, and then embody the definitions of design variables, objective function and design constraints. Trade-off algorithms can be used at the design points to obtain quantitative answers. Finally, decomposing the large-scale problem into independent subproblems allows systematic optimization of the problems by an organization of people and machines.
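
    The decomposition idea in the final sentence can be illustrated on a toy coupled problem (the quadratic objective and update rules here are illustrative assumptions, not from the paper): each subproblem optimizes its own variable with the others held fixed, and the system-level loop coordinates the subproblem solutions.

```python
def optimize_decomposed(iters=50):
    """Minimize f(x, y) = (x-1)^2 + (y-2)^2 + (x-y)^2 by decomposition.

    Each 'subproblem' holds the other variable fixed and solves its own
    quadratic exactly, mimicking component-level optimizers coordinated
    at the system level. The coupling term (x-y)^2 is what forces the
    subproblems to iterate rather than solve independently.
    """
    x = y = 0.0
    for _ in range(iters):
        x = (1.0 + y) / 2.0    # x-subproblem: min over x of (x-1)^2 + (x-y)^2
        y = (2.0 + x) / 2.0    # y-subproblem: min over y of (y-2)^2 + (x-y)^2
    return x, y

x, y = optimize_decomposed()
print(x, y)   # converges to the joint minimizer (4/3, 5/3)
```

    For a convex coupling like this, the alternating sweeps contract toward the joint optimum; the aircraft-design analogue replaces each one-line update with a full component optimization.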

  19. Issues in Estimating Program Effects and Studying Implementation in Large-Scale Educational Experiments: The Case of a Connected Classroom Technology Program

    ERIC Educational Resources Information Center

    Shin, Hye Sook

    2009-01-01

    Using data from a nationwide, large-scale experimental study of the effects of a connected classroom technology on student learning in algebra (Owens et al., 2004), this dissertation focuses on challenges that can arise in estimating treatment effects in educational field experiments when samples are highly heterogeneous in terms of various…

  20. Lichen elemental content bioindicators for air quality in upper Midwest, USA: A model for large-scale monitoring

    Treesearch

    Susan Will-Wolf; Sarah Jovan; Michael C. Amacher

    2017-01-01

    Our development of lichen elemental bioindicators for a United States of America (USA) national monitoring program is a useful model for other large-scale programs. Concentrations of 20 elements were measured, validated, and analyzed for 203 samples of five common lichen species. Collections were made by trained non-specialists near 75 permanent plots and an expert...

  1. Large-Scale Coronal Heating from the Solar Magnetic Network

    NASA Technical Reports Server (NTRS)

    Falconer, David A.; Moore, Ronald L.; Porter, Jason G.; Hathaway, David H.

    1999-01-01

    In Fe XII images from SOHO/EIT, the quiet solar corona shows structure on scales ranging from sub-supergranular (i.e., bright points and coronal network) to multi-supergranular. In Falconer et al. 1998 (ApJ, 501, 386) we suppressed the large-scale background and found that the network-scale features are predominantly rooted in the magnetic network lanes at the boundaries of the supergranules. The emission of the coronal network and bright points contributes only about 5% of the entire quiet solar coronal Fe XII emission. Here we investigate the large-scale corona, the supergranular and larger-scale structure that we had previously treated as a background, and that emits 95% of the total Fe XII emission. We compare the dim and bright halves of the large-scale corona and find that the bright half is 1.5 times brighter than the dim half, has an order of magnitude greater area of bright point coverage, has three times brighter coronal network, and has about 1.5 times more magnetic flux than the dim half. These results suggest that the brightness of the large-scale corona is more closely related to the large-scale total magnetic flux than to bright point activity. We conclude that in the quiet sun: (1) magnetic flux is modulated (concentrated/diluted) on size scales larger than supergranules; (2) the large-scale enhanced magnetic flux gives an enhanced, more active, magnetic network and an increased incidence of network bright point formation; (3) the heating of the large-scale corona is dominated by more widespread, but weaker, network activity than that which heats the bright points. This work was funded by the Solar Physics Branch of NASA's Office of Space Science through the SR&T Program and the SEC Guest Investigator Program.

  2. Equating in Small-Scale Language Testing Programs

    ERIC Educational Resources Information Center

    LaFlair, Geoffrey T.; Isbell, Daniel; May, L. D. Nicolas; Gutierrez Arvizu, Maria Nelly; Jamieson, Joan

    2017-01-01

    Language programs need multiple test forms for secure administrations and effective placement decisions, but can they have confidence that scores on alternate test forms have the same meaning? In large-scale testing programs, various equating methods are available to ensure the comparability of forms. The choice of equating method is informed by…

  3. CONSORT to community: translation of an RCT to a large-scale community intervention and learnings from evaluation of the upscaled program.

    PubMed

    Moores, Carly Jane; Miller, Jacqueline; Perry, Rebecca Anne; Chan, Lily Lai Hang; Daniels, Lynne Allison; Vidgen, Helen Anna; Magarey, Anthea Margaret

    2017-11-29

    Translation encompasses the continuum from clinical efficacy to widespread adoption within the healthcare service and, ultimately, routine clinical practice. The Parenting, Eating and Activity for Child Health (PEACH™) program has previously demonstrated clinical effectiveness in the management of child obesity and has recently been implemented as a large-scale community intervention in Queensland, Australia. This paper aims to describe the translation of the evaluation framework from a randomised controlled trial (RCT) to a large-scale community intervention (PEACH™ QLD). Tensions between the RCT paradigm and implementation research will be discussed, along with lived evaluation challenges, responses to overcome these, and key learnings for future evaluation conducted at scale. The translation of evaluation from the PEACH™ RCT to the large-scale community intervention PEACH™ QLD is described. While the CONSORT Statement was used to report findings from two previous RCTs, the RE-AIM framework was more suitable for the evaluation of upscaled delivery of the PEACH™ program. Evaluation of PEACH™ QLD was undertaken during the project delivery period from 2013 to 2016. Experiential learnings from conducting the evaluation of PEACH™ QLD under the described evaluation framework are presented to inform the future evaluation of upscaled programs. Evaluation changes in response to real-time changes in the delivery of the PEACH™ QLD Project were necessary at stages during the project term. Key evaluation challenges encountered included the collection of complete evaluation data from a diverse and geographically dispersed workforce and the systematic collection of process evaluation data in real time to support program changes during the project. Evaluation of large-scale community interventions in the real world is challenging and divergent from RCTs, which are rigorously evaluated within a more tightly controlled clinical research setting. Constructs explored in an RCT are inadequate for describing the enablers and barriers of upscaled community program implementation. Methods for data collection, analysis and reporting also require consideration. We present a number of experiential reflections and suggestions for the successful evaluation of future upscaled community programs, which are scarcely reported in the literature. PEACH™ QLD was retrospectively registered with the Australian New Zealand Clinical Trials Registry on 28 February 2017 (ACTRN12617000315314).

  4. Performance of lap splices in large-scale column specimens affected by ASR and/or DEF-extension phase.

    DOT National Transportation Integrated Search

    2015-03-01

    A large experimental program, consisting of the design, construction, curing, exposure, and structural load testing of 16 large-scale column specimens with a critical lap splice region that were influenced by varying stages of alkali-silica react...

  5. DIALOG: An executive computer program for linking independent programs

    NASA Technical Reports Server (NTRS)

    Glatt, C. R.; Hague, D. S.; Watson, D. A.

    1973-01-01

    A very large scale computer programming procedure called the DIALOG Executive System has been developed for Univac 1100 series computers. The executive computer program, DIALOG, controls the sequence of execution and data management functions for a library of independent computer programs. Communication of common information is accomplished by DIALOG through a dynamically constructed and maintained data base of common information. The unique feature of the DIALOG Executive System is the manner in which computer programs are linked: each program maintains its individual identity and as such is unaware of its contribution to the large-scale program. This feature makes any computer program a candidate for use with the DIALOG Executive System. The installation and use of the DIALOG Executive System at Johnson Space Center are described.
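
    The executive pattern described here can be sketched in a few lines of modern Python. This is an illustrative reconstruction of the idea only, not the actual DIALOG code (which ran on Univac 1100 hardware); all program and variable names are hypothetical.

```python
# A minimal sketch of an executive that links independent programs through a
# dynamically maintained common data base: each "program" reads and writes
# the shared dictionary and never knows about the others.

def geometry(db):               # independent program 1
    db["area"] = db["span"] * db["chord"]

def aerodynamics(db):           # independent program 2, reads program 1's output
    db["lift"] = 0.5 * db["q"] * db["area"] * db["cl"]

class Executive:
    """Controls sequence of execution and data management for a program library."""
    def __init__(self, library):
        self.library = library
        self.database = {}      # common information, built dynamically

    def run(self, sequence, inputs):
        self.database.update(inputs)
        for name in sequence:
            self.library[name](self.database)
        return self.database

exe = Executive({"geometry": geometry, "aerodynamics": aerodynamics})
result = exe.run(["geometry", "aerodynamics"],
                 {"span": 10.0, "chord": 2.0, "q": 100.0, "cl": 0.5})
# result["area"] == 20.0, result["lift"] == 500.0
```

    Because each function touches only the shared data base, any program with that calling convention is a candidate for the library, which mirrors the "any computer program is a candidate" property claimed for DIALOG.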

  6. A statistically rigorous sampling design to integrate avian monitoring and management within Bird Conservation Regions.

    PubMed

    Pavlacky, David C; Lukacs, Paul M; Blakesley, Jennifer A; Skorkowsky, Robert C; Klute, David S; Hahn, Beth A; Dreitz, Victoria J; George, T Luke; Hanni, David J

    2017-01-01

    Monitoring is an essential component of wildlife management and conservation. However, the usefulness of monitoring data is often undermined by the lack of 1) coordination across organizations and regions, 2) meaningful management and conservation objectives, and 3) rigorous sampling designs. Although many improvements to avian monitoring have been discussed, the recommendations have been slow to emerge in large-scale programs. We introduce the Integrated Monitoring in Bird Conservation Regions (IMBCR) program designed to overcome the above limitations. Our objectives are to outline the development of a statistically defensible sampling design to increase the value of large-scale monitoring data and provide example applications to demonstrate the ability of the design to meet multiple conservation and management objectives. We outline the sampling process for the IMBCR program with a focus on the Badlands and Prairies Bird Conservation Region (BCR 17). We provide two examples for the Brewer's sparrow (Spizella breweri) in BCR 17 demonstrating the ability of the design to 1) determine hierarchical population responses to landscape change and 2) estimate hierarchical habitat relationships to predict the response of the Brewer's sparrow to conservation efforts at multiple spatial scales. The collaboration across organizations and regions provided economy of scale by leveraging a common data platform over large spatial scales to promote the efficient use of monitoring resources. We designed the IMBCR program to address the information needs and core conservation and management objectives of the participating partner organizations. Although it has been argued that probabilistic sampling designs are not practical for large-scale monitoring, the IMBCR program provides a precedent for implementing a statistically defensible sampling design from local to bioregional scales. 
We demonstrate that integrating conservation and management objectives with rigorous statistical design and analyses ensures reliable knowledge about bird populations that is relevant and integral to bird conservation at multiple scales.

  7. A statistically rigorous sampling design to integrate avian monitoring and management within Bird Conservation Regions

    PubMed Central

    Hahn, Beth A.; Dreitz, Victoria J.; George, T. Luke

    2017-01-01

    Monitoring is an essential component of wildlife management and conservation. However, the usefulness of monitoring data is often undermined by the lack of 1) coordination across organizations and regions, 2) meaningful management and conservation objectives, and 3) rigorous sampling designs. Although many improvements to avian monitoring have been discussed, the recommendations have been slow to emerge in large-scale programs. We introduce the Integrated Monitoring in Bird Conservation Regions (IMBCR) program designed to overcome the above limitations. Our objectives are to outline the development of a statistically defensible sampling design to increase the value of large-scale monitoring data and provide example applications to demonstrate the ability of the design to meet multiple conservation and management objectives. We outline the sampling process for the IMBCR program with a focus on the Badlands and Prairies Bird Conservation Region (BCR 17). We provide two examples for the Brewer’s sparrow (Spizella breweri) in BCR 17 demonstrating the ability of the design to 1) determine hierarchical population responses to landscape change and 2) estimate hierarchical habitat relationships to predict the response of the Brewer’s sparrow to conservation efforts at multiple spatial scales. The collaboration across organizations and regions provided economy of scale by leveraging a common data platform over large spatial scales to promote the efficient use of monitoring resources. We designed the IMBCR program to address the information needs and core conservation and management objectives of the participating partner organizations. Although it has been argued that probabilistic sampling designs are not practical for large-scale monitoring, the IMBCR program provides a precedent for implementing a statistically defensible sampling design from local to bioregional scales. 
We demonstrate that integrating conservation and management objectives with rigorous statistical design and analyses ensures reliable knowledge about bird populations that is relevant and integral to bird conservation at multiple scales. PMID:29065128

  8. Small-scale test program to develop a more efficient swivel nozzle thrust deflector for V/STOL lift/cruise engines

    NASA Technical Reports Server (NTRS)

    Schlundt, D. W.

    1976-01-01

    The installed performance degradation of a swivel nozzle thrust deflector system, observed at increased vectoring angles during a large-scale test program, was investigated and improved. Small-scale models were used to generate performance data for analyzing selected swivel nozzle configurations. A single-swivel nozzle design model with five different nozzle configurations and a twin-swivel nozzle design model, scaled to 0.15 of the size of the large-scale test hardware, were statically tested at low exhaust pressure ratios of 1.4, 1.3, 1.2, and 1.1 and vectored at four nozzle positions from 0 deg (cruise) through 90 deg (vertical, used for the VTOL mode).

  9. Stream Flow Prediction by Remote Sensing and Genetic Programming

    NASA Technical Reports Server (NTRS)

    Chang, Ni-Bin

    2009-01-01

    A genetic programming (GP)-based nonlinear modeling structure relates soil moisture to synthetic-aperture-radar (SAR) images to produce representative soil moisture estimates at the watershed scale. Surface soil moisture measurement is difficult to obtain over a large area due to the variety of soil permeability values and soil textures. Point measurements can be used over a small area, but it is impossible to acquire such information effectively in large-scale watersheds. This model exhibits the capacity to assimilate SAR images and relevant geoenvironmental parameters to estimate soil moisture.
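
    As a rough illustration of genetic programming in this spirit, the toy sketch below evolves expression trees to fit a synthetic backscatter-to-moisture relation. The data, operator set, and parameters are invented for the example and are unrelated to the paper's actual SAR model.

```python
import random

# Toy genetic programming: evolve expression trees mapping a synthetic radar
# backscatter value x to soil moisture. Everything here is illustrative.

random.seed(42)

OPS = {"+": lambda a, b: a + b, "-": lambda a, b: a - b,
       "*": lambda a, b: a * b}

def rand_tree(depth=3):
    if depth == 0 or random.random() < 0.3:
        return random.choice(["x", round(random.uniform(-1, 1), 2)])
    return (random.choice(list(OPS)), rand_tree(depth - 1), rand_tree(depth - 1))

def evaluate(tree, x):
    if tree == "x":
        return x
    if isinstance(tree, (int, float)):
        return tree
    op, left, right = tree
    return OPS[op](evaluate(left, x), evaluate(right, x))

# synthetic "ground truth": moisture = 0.4 * backscatter + 0.1
data = [(x / 10.0, 0.4 * x / 10.0 + 0.1) for x in range(10)]

def fitness(tree):
    return sum((evaluate(tree, x) - y) ** 2 for x, y in data)

def mutate(tree):
    # crude mutation: half the time, replace with a fresh random tree
    return rand_tree() if random.random() < 0.5 else tree

pop = [rand_tree() for _ in range(60)]
for _ in range(40):
    pop.sort(key=fitness)                     # selection with elitism
    pop = pop[:20] + [mutate(random.choice(pop[:20])) for _ in range(40)]
best = min(pop, key=fitness)
```

    Elitism (keeping the 20 fittest trees each generation) guarantees the best fitness never worsens; a real GP system would add crossover and subtree mutation rather than whole-tree replacement.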

  10. Concurrent Programming Using Actors: Exploiting Large-Scale Parallelism,

    DTIC Science & Technology

    1985-10-07

    [Scanned-report front matter; OCR residue.] Massachusetts Institute of Technology, Artificial Intelligence Laboratory, 545 Technology Square, Cambridge; G. Agha et al.

  11. Using LISREL to Evaluate Measurement Models and Scale Reliability.

    ERIC Educational Resources Information Center

    Fleishman, John; Benson, Jeri

    1987-01-01

    The LISREL program was used to examine measurement model assumptions and to assess the reliability of the Coopersmith Self-Esteem Inventory for Children, Form B. Data on 722 third- through sixth-graders from over 70 schools in a large urban school district were used. The LISREL program assessed (1) the nature of the basic measurement model for the scale, (2) scale invariance across…

  12. Public attitudes toward programs of large-scale technological changes: Some reflections and policy prescriptions, appendix E

    NASA Technical Reports Server (NTRS)

    Shostak, A. B.

    1973-01-01

    The question of how ready the public is for the implementation of large-scale programs of technological change is considered. Four vital aspects of the issue are discussed: (1) the ways in which the public misperceives the change process, (2) the ways in which recent history impacts public attitudes, (3) the ways in which the public divides among itself, and (4) the fundamentals of public attitudes toward change. It is concluded that nothing is so critical in the 1970s to securing public approval for large-scale planned change projects as is the approval of the public by change-agents.

  13. Technical instrumentation R&D for ILD SiW ECAL large scale device

    NASA Astrophysics Data System (ADS)

    Balagura, V.

    2018-03-01

    Calorimeters with silicon detectors have many unique features and are proposed for several world-leading experiments. We describe the R&D program for a large-scale detector element with up to 12 000 readout channels for the International Large Detector (ILD) at the future e+e− ILC collider. The program is focused on the readout front-end electronics embedded inside the calorimeter. The first part, with 2 000 channels and two small silicon sensors, has already been constructed; the full prototype is planned for the beginning of 2018.

  14. SCALE(ing)-UP Teaching: A Case Study of Student Motivation in an Undergraduate Course

    ERIC Educational Resources Information Center

    Chittum, Jessica R.; McConnell, Kathryne Drezek; Sible, Jill

    2017-01-01

    Teaching large classes is increasingly common; thus, demand for effective large-class pedagogy is rising. One method, titled "SCALE-UP" (Student-Centered Active Learning Environment for Undergraduate Programs), is intended for large classes and involves collaborative, active learning in a technology-rich and student-centered environment.…

  15. Using Agent Base Models to Optimize Large Scale Network for Large System Inventories

    NASA Technical Reports Server (NTRS)

    Shameldin, Ramez Ahmed; Bowling, Shannon R.

    2010-01-01

    The aim of this paper is to use agent-based models (ABM) to optimize large-scale network handling capabilities for large system inventories and to implement strategies for reducing capital expenses. The models used in this paper rely on computational algorithms and procedures implemented in MATLAB to simulate agent-based models, run on computing clusters that provide the high-performance parallel computation needed to execute the program. In both cases, a model is defined as a compilation of a set of structures and processes assumed to underlie the behavior of a network system.
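
    The agent-based modeling style described here can be sketched as follows, in Python rather than MATLAB and with an invented inventory-network toy (all rules and parameters are hypothetical): each agent follows simple local rules, and system-level behavior emerges from stepping all agents over many iterations, a loop that in practice would be distributed over a cluster.

```python
import random

# Illustrative agent-based model: each node in an inventory network serves
# random local demand and reorders when stock runs low. System-level totals
# emerge from the local rules; no agent sees the whole network.

random.seed(7)

class Node:
    def __init__(self, stock=50, reorder_point=20, order_size=40):
        self.stock = stock
        self.reorder_point = reorder_point
        self.order_size = order_size
        self.stockouts = 0

    def step(self):
        demand = random.randint(0, 30)
        if demand > self.stock:
            self.stockouts += 1
            self.stock = 0
        else:
            self.stock -= demand
        if self.stock <= self.reorder_point:   # local reorder rule
            self.stock += self.order_size

network = [Node() for _ in range(100)]
for _ in range(365):                           # one simulated year
    for node in network:                       # this loop parallelizes naturally
        node.step()

total_stock = sum(n.stock for n in network)
total_stockouts = sum(n.stockouts for n in network)
```

    Tuning `reorder_point` and `order_size` trades capital tied up in `total_stock` against the service failures counted in `total_stockouts`, which is the kind of capital-expense optimization the abstract describes.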

  16. Time Discounting and Credit Market Access in a Large-Scale Cash Transfer Programme.

    PubMed

    Handa, Sudhanshu; Martorano, Bruno; Halpern, Carolyn; Pettifor, Audrey; Thirumurthy, Harsha

    2016-06-01

    Time discounting is thought to influence decision-making in almost every sphere of life, including personal finances, diet, exercise, and sexual behavior. In this article we provide evidence on whether a national poverty alleviation program in Kenya can affect inter-temporal decisions. We administered a preferences module as part of a large-scale impact evaluation of the Kenyan Government's Cash Transfer for Orphans and Vulnerable Children. Four years into the program, we find that individuals in the treatment group are only marginally more likely to wait for future money, due in part to the erosion of the value of the transfer by inflation. However, among the poorest households, for whom the value of the transfer is still relatively large, we find significant program effects on the propensity to wait. We also find strong program effects among those who have access to credit markets, though the program itself does not improve access to credit.
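
    The trade-off the study measures, whether to wait for a larger future payment when inflation erodes its real value, can be made concrete with a small numeric sketch. The discount and inflation rates below are illustrative, not estimates from the Kenyan program.

```python
# Hedged numeric sketch of an inter-temporal choice under exponential
# discounting and inflation. All numbers are illustrative.

def prefers_to_wait(now, later, delay_months, monthly_discount, monthly_inflation):
    # Real (inflation-adjusted) value of the delayed amount...
    real_later = later / (1 + monthly_inflation) ** delay_months
    # ...discounted back to the present by the individual's impatience.
    present_value = real_later / (1 + monthly_discount) ** delay_months
    return present_value > now

# With low inflation, waiting one month for 1100 vs 1000 now can pay off:
low_inflation = prefers_to_wait(1000, 1100, 1, 0.05, 0.01)   # True
# Higher inflation erodes the premium and flips the choice:
high_inflation = prefers_to_wait(1000, 1100, 1, 0.05, 0.06)  # False
```

    The second call shows the mechanism the abstract points to: holding impatience fixed, inflation alone can make waiting unattractive.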

  17. Multidimensional model to assess the readiness of Saudi Arabia to implement evidence based child maltreatment prevention programs at a large scale.

    PubMed

    Almuneef, Maha A; Qayad, Mohamed; Noor, Ismail K; Al-Eissa, Majid A; Albuhairan, Fadia S; Inam, Sarah; Mikton, Christopher

    2014-03-01

    There has been increased awareness of child maltreatment in Saudi Arabia recently. This study assessed the readiness of Saudi Arabia to implement large-scale, evidence-based child maltreatment prevention programs. Key informants, who were key decision makers and senior managers in the field of child maltreatment, were invited to participate in the study. A multidimensional tool, developed by WHO and collaborators from several middle- and low-income countries, was used to assess 10 dimensions of readiness. A group of experts also gave an objective assessment of the 10 dimensions, and the key informants' and experts' scores were compared. On a scale of 100, the key informants gave a readiness score of 43% for Saudi Arabia to implement large-scale, evidence-based child maltreatment prevention programs, and the experts gave an overall readiness score of 40%. Both the key informants and experts agreed that four of the dimensions (attitudes toward child maltreatment prevention, institutional links and resources, material resources, and human and technical resources) had low readiness scores (<5) and that three dimensions (knowledge of child maltreatment prevention, scientific data on child maltreatment prevention, and will to address the child maltreatment problem) had high readiness scores (≥5). There was significant disagreement between key informants and experts on the remaining three dimensions. Overall, Saudi Arabia has moderate/fair readiness to implement large-scale child maltreatment prevention programs. Capacity building, strengthening of material resources, and improving institutional links, collaborations, and attitudes toward the child maltreatment problem are required to improve the country's readiness to implement such programs. Copyright © 2013 Elsevier Ltd. All rights reserved.

  18. Practical experience from the Office of Adolescent Health's large scale implementation of an evidence-based Teen Pregnancy Prevention Program.

    PubMed

    Margolis, Amy Lynn; Roper, Allison Yvonne

    2014-03-01

    After 3 years of experience overseeing the implementation and evaluation of evidence-based teen pregnancy prevention programs in a diversity of populations and settings across the country, the Office of Adolescent Health (OAH) has learned numerous lessons through practical application and new experiences. These lessons and experiences are applicable to those working to implement evidence-based programs on a large scale. The lessons described in this paper focus on what it means for a program to be implementation ready, the role of the program developer in replicating evidence-based programs, the importance of a planning period to ensure quality implementation, the need to define and measure fidelity, and the conditions necessary to support rigorous grantee-level evaluation. Published by Elsevier Inc.

  19. Converting Data to Knowledge: One District's Experience Using Large-Scale Proficiency Assessment

    ERIC Educational Resources Information Center

    Davin, Kristin J.; Rempert, Tania A.; Hammerand, Amy A.

    2014-01-01

    The present study reports data from a large-scale foreign language proficiency assessment to explore trends across a large urban school district. These data were used in conjunction with data from teacher and student questionnaires to make recommendations for foreign language programs across the district. This evaluation process resulted in…

  20. A novel computational approach towards the certification of large-scale boson sampling

    NASA Astrophysics Data System (ADS)

    Huh, Joonsuk

    Recent proposals of boson sampling and the corresponding experiments point toward a possible disproof of the extended Church-Turing thesis. Furthermore, the application of boson sampling to molecular computation has been suggested theoretically. Until now, however, only small-scale experiments with a few photons have been performed successfully. Boson sampling experiments with 20-30 photons are expected to reveal the computational superiority of the quantum device. A novel theoretical proposal for large-scale boson sampling using microwave photons is highly promising due to its deterministic photon sources and scalability. A certification protocol for large-scale boson sampling experiments is therefore needed to complete the picture. We propose, in this presentation, a computational protocol toward the certification of large-scale boson sampling. The correlations of paired photon modes and the time-dependent characteristic functional, with its Fourier components, can show the fingerprint of large-scale boson sampling. This work was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education, Science and Technology (NRF-2015R1A6A3A04059773), the ICT R&D program of MSIP/IITP [2015-019, Fundamental Research Toward Secure Quantum Communication], and a Mueunjae Institute for Chemistry (MIC) postdoctoral fellowship.

  1. Enterprise tools to promote interoperability: MonitoringResources.org supports design and documentation of large-scale, long-term monitoring programs

    NASA Astrophysics Data System (ADS)

    Weltzin, J. F.; Scully, R. A.; Bayer, J.

    2016-12-01

    Individual natural resource monitoring programs have evolved in response to different organizational mandates, jurisdictional needs, issues, and questions. We are establishing a collaborative forum for large-scale, long-term monitoring programs to identify opportunities where collaboration could yield efficiency in monitoring design, implementation, analyses, and data sharing. We anticipate these monitoring programs will have similar requirements (e.g., survey design, standardization of protocols and methods, information management and delivery) that could be met by enterprise tools to promote sustainability, efficiency, and interoperability of information across geopolitical boundaries and organizational cultures. MonitoringResources.org, a project of the Pacific Northwest Aquatic Monitoring Partnership, provides an online suite of enterprise tools focused on aquatic systems in the Pacific Northwest region of the United States. We will leverage and expand this existing capacity to support continental-scale monitoring of both aquatic and terrestrial systems. The current stakeholder group is focused on programs led by bureaus within the Department of the Interior, but the tools will be readily and freely available to a broad variety of other stakeholders. Here, we report the results of two initial stakeholder workshops focused on (1) establishing a collaborative forum of large-scale monitoring programs, (2) identifying and prioritizing shared needs, (3) evaluating existing enterprise resources, (4) defining priorities for development of enhanced capacity for MonitoringResources.org, and (5) identifying a small number of pilot projects that can be used to define and test development requirements for specific monitoring programs.

  2. Scaling up Education Reform

    ERIC Educational Resources Information Center

    Gaffney, Jon D. H.; Richards, Evan; Kustusch, Mary Bridget; Ding, Lin; Beichner, Robert J.

    2008-01-01

    The SCALE-UP (Student-Centered Activities for Large Enrollment Undergraduate Programs) project was developed to implement reforms designed for small classes in large physics classes. Over 50 schools across the country, ranging from Wake Technical Community College to the Massachusetts Institute of Technology (MIT), have adopted it for classes of…

  3. An Optimization Code for Nonlinear Transient Problems of a Large Scale Multidisciplinary Mathematical Model

    NASA Astrophysics Data System (ADS)

    Takasaki, Koichi

    This paper presents a program for multidisciplinary optimization and identification problems in nonlinear models of large aerospace vehicle structures. The program constructs the global matrix of the dynamic system in the time direction by the p-version finite element method (pFEM), and the basic matrix for each pFEM node in the time direction is described by a sparse matrix, similar to the static finite element problem. The algorithm used by the program does not require the Hessian matrix of the objective function and so has low memory requirements. It also has a relatively low computational cost and is suited to parallel computation. The program was integrated as a solver module of the multidisciplinary analysis system CUMuLOUS (Computational Utility for Multidisciplinary Large scale Optimization of Undense System), which is under development by the Aerospace Research and Development Directorate (ARD) of the Japan Aerospace Exploration Agency (JAXA).
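
    The memory argument for avoiding the Hessian can be made concrete: a gradient-only method stores O(n) values per iteration, whereas an explicit Hessian needs an n-by-n matrix. The sketch below is a plain gradient-descent toy on an invented quadratic, not JAXA's actual solver; it only illustrates a Hessian-free update.

```python
# Hessian-free toy: for n = 1000 variables, the update below keeps two length-n
# lists in memory instead of a 1000 x 1000 Hessian matrix.

def grad_descent(grad, x, lr=0.1, steps=200):
    for _ in range(steps):
        g = grad(x)                               # O(n) storage
        x = [xi - lr * gi for xi, gi in zip(x, g)]
    return x

# minimize f(x) = sum_i (x_i - i)^2; the gradient component is 2 * (x_i - i)
n = 1000
target = list(range(n))
gradient = lambda x: [2.0 * (xi - ti) for xi, ti in zip(x, target)]
x_min = grad_descent(gradient, [0.0] * n)         # converges to target
```

    On this well-conditioned quadratic, plain gradient descent converges geometrically; real large-scale codes use more sophisticated Hessian-free methods (e.g. conjugate gradient or quasi-Newton with limited memory) for the same storage reason.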

  4. Botswana water and surface energy balance research program. Part 2: Large scale moisture and passive microwaves

    NASA Technical Reports Server (NTRS)

    Vandegriend, A. A.; Owe, M.; Chang, A. T. C.

    1992-01-01

    The Botswana water and surface energy balance research program was developed to study and evaluate the integrated use of multispectral satellite remote sensing for monitoring the hydrological status of the Earth's surface. The research program consisted of two major, mutually related components: a surface energy balance modeling component, built around an extensive field campaign, and a passive microwave research component, which consisted of a retrospective study of large-scale moisture conditions and Nimbus scanning multichannel microwave radiometer signatures. The integrated approach of both components is explained in general, and the activities performed within the passive microwave research component are summarized. The microwave theory is discussed, taking into account soil dielectric constant, emissivity, soil roughness effects, vegetation effects, optical depth, single-scattering albedo, and wavelength effects. The study site is described. The soil moisture data and its processing are considered. The relation between observed large-scale soil moisture and normalized brightness temperatures is discussed. Vegetation characteristics and inverse modeling of soil emissivity are considered.

  5. Development and Applications of a Modular Parallel Process for Large Scale Fluid/Structures Problems

    NASA Technical Reports Server (NTRS)

    Guruswamy, Guru P.; Kwak, Dochan (Technical Monitor)

    2002-01-01

    A modular process that can efficiently solve large scale multidisciplinary problems using massively parallel supercomputers is presented. The process integrates disciplines with diverse physical characteristics by retaining the efficiency of individual disciplines. Computational domain independence of individual disciplines is maintained using a meta programming approach. The process integrates disciplines without affecting the combined performance. Results are demonstrated for large scale aerospace problems on several supercomputers. The super scalability and portability of the approach is demonstrated on several parallel computers.

  6. Development and Applications of a Modular Parallel Process for Large Scale Fluid/Structures Problems

    NASA Technical Reports Server (NTRS)

    Guruswamy, Guru P.; Byun, Chansup; Kwak, Dochan (Technical Monitor)

    2001-01-01

    A modular process that can efficiently solve large scale multidisciplinary problems using massively parallel supercomputers is presented. The process integrates disciplines with diverse physical characteristics by retaining the efficiency of individual disciplines. Computational domain independence of individual disciplines is maintained using a meta programming approach. The process integrates disciplines without affecting the combined performance. Results are demonstrated for large scale aerospace problems on several supercomputers. The super scalability and portability of the approach is demonstrated on several parallel computers.

  7. Optimizing Implementation of Obesity Prevention Programs: A Qualitative Investigation Within a Large-Scale Randomized Controlled Trial.

    PubMed

    Kozica, Samantha L; Teede, Helena J; Harrison, Cheryce L; Klein, Ruth; Lombard, Catherine B

    2016-01-01

    The prevalence of obesity in rural and remote areas is elevated in comparison to urban populations, highlighting the need for interventions targeting obesity prevention in these settings. Implementing evidence-based obesity prevention programs is challenging. This study aimed to investigate factors influencing the implementation of obesity prevention programs, including adoption, program delivery, community uptake, and continuation, specifically within rural settings. Nested within a large-scale randomized controlled trial, a qualitative exploratory approach was adopted, with purposive sampling used to recruit stakeholders from 41 small rural towns in Australia. In-depth semistructured interviews were conducted with clinical health professionals, health service managers, and local government employees. Open coding was completed independently by 2 investigators and thematic analysis undertaken. The interviews revealed that obesity prevention programs were valued by the rural workforce. Program implementation is influenced by interrelated factors in two domains: (1) contextual factors and (2) organizational capacity. Key recommendations for managing the challenges of implementing evidence-based programs focused on reducing program delivery costs, aided by the provision of a suite of implementation and evaluation resources. To inform the scale-up of future prevention programs, stakeholders highlighted the need to build local rural capacity by developing supportive university partnerships, generating local program ownership, and promoting active feedback to all program partners. We demonstrate that the rural workforce places a high value on obesity prevention programs. Our results inform the future scale-up of obesity prevention programs, providing an improved understanding of strategies to optimize implementation of evidence-based prevention programs. © 2015 National Rural Health Association.

  8. Time Discounting and Credit Market Access in a Large-Scale Cash Transfer Programme

    PubMed Central

    Handa, Sudhanshu; Martorano, Bruno; Halpern, Carolyn; Pettifor, Audrey; Thirumurthy, Harsha

    2017-01-01

    Time discounting is thought to influence decision-making in almost every sphere of life, including personal finances, diet, exercise, and sexual behavior. In this article we provide evidence on whether a national poverty alleviation program in Kenya can affect inter-temporal decisions. We administered a preferences module as part of a large-scale impact evaluation of the Kenyan Government's Cash Transfer for Orphans and Vulnerable Children. Four years into the program, we find that individuals in the treatment group are only marginally more likely to wait for future money, due in part to the erosion of the value of the transfer by inflation. However, among the poorest households, for whom the value of the transfer is still relatively large, we find significant program effects on the propensity to wait. We also find strong program effects among those who have access to credit markets, though the program itself does not improve access to credit. PMID:28260842
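The time-preference measurement described above can be illustrated with a toy exponential-discounting calculation. The function and numbers below are illustrative assumptions, not the study's actual preferences module.

```python
def present_value(amount, periods, rate):
    """Value today of `amount` received after `periods`, under
    exponential discounting at per-period `rate`."""
    return amount / (1 + rate) ** periods

# A respondent indifferent between 1000 now and 1500 one period from now
# reveals a per-period discount rate of 50%: 1500 / (1 + 0.5) = 1000.
indifference_now = present_value(1500, 1, 0.5)

# A more patient respondent (rate 10%) values the delayed payment higher,
# and so is more likely to wait.
patient_value = present_value(1500, 1, 0.1)
```

Survey modules of this kind typically vary the delayed amount until the respondent switches from taking the money now to waiting, which brackets the implied discount rate.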

  9. STE thrust chamber technology: Main injector technology program and nozzle Advanced Development Program (ADP)

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The purpose of the STME Main Injector Program was to enhance the technology base for the large-scale main injector-combustor system of oxygen-hydrogen booster engines in the areas of combustion efficiency, chamber heating rates, and combustion stability. The initial task of the Main Injector Program, focused on analysis and theoretical predictions using existing models, was complemented by the design, fabrication, and test at MSFC of a subscale calorimetric, 40,000-pound thrust class, axisymmetric thrust chamber operating at approximately 2,250 psi and a 7:1 expansion ratio. Test results were used to further define combustion stability bounds, combustion efficiency, and heating rates using a large injector scale similar to the Pratt & Whitney (P&W) STME main injector design configuration, including the tangential entry swirl coaxial injection elements. The subscale combustion data were used to verify and refine analytical modeling simulation and extend the database range to guide the design of the large-scale system main injector. The subscale injector design incorporated fuel and oxidizer flow area control features which could be varied; this allowed testing of several design points so that the STME conditions could be bracketed. The subscale injector design also incorporated high-reliability and low-cost fabrication techniques such as a one-piece electrical discharge machined (EDMed) interpropellant plate. Both subscale and large-scale injectors incorporated outer row injector elements with scarfed tip features to allow evaluation of reduced heating rates to the combustion chamber.

  10. MONITORING COASTAL RESOURCES AT MULTIPLE SPATIAL AND TEMPORAL SCALES: LESSONS FROM EMAP 2001 EMAP SYMPOSIUM, APRIL 24-27, PENSACOLA BEACH, FL

    EPA Science Inventory

    In 1990, EMAP's Coastal Monitoring Program conducted its first regional sampling program in the Virginian Province. This first effort focused only at large spatial scales (regional) with some stratification to examine estuarine types. In the ensuing decade, EMAP-Coastal has condu...

  11. Scaling and Sustaining Effective Early Childhood Programs through School-Family-University Collaboration

    ERIC Educational Resources Information Center

    Reynolds, Arthur J.; Hayakawa, Momoko; Ou, Suh-Ruu; Mondi, Christina F.; Englund, Michelle M.; Candee, Allyson J.; Smerillo, Nicole E.

    2017-01-01

    We describe the development, implementation, and evaluation of a comprehensive preschool to third grade prevention program for the goals of sustaining services at a large scale. The Midwest Child-Parent Center (CPC) Expansion is a multilevel collaborative school reform model designed to improve school achievement and parental involvement from ages…

  12. US National Large-scale City Orthoimage Standard Initiative

    USGS Publications Warehouse

    Zhou, G.; Song, C.; Benjamin, S.; Schickler, W.

    2003-01-01

    The early procedures and algorithms for National digital orthophoto generation in National Digital Orthophoto Program (NDOP) were based on earlier USGS mapping operations, such as field control, aerotriangulation (derived in the early 1920's), the quarter-quadrangle-centered (3.75 minutes of longitude and latitude in geographic extent), 1:40,000 aerial photographs, and 2.5 D digital elevation models. However, large-scale city orthophotos using early procedures have disclosed many shortcomings, e.g., ghost image, occlusion, shadow. Thus, to provide the technical base (algorithms, procedure) and experience needed for city large-scale digital orthophoto creation is essential for the near future national large-scale digital orthophoto deployment and the revision of the Standards for National Large-scale City Digital Orthophoto in National Digital Orthophoto Program (NDOP). This paper will report our initial research results as follows: (1) High-precision 3D city DSM generation through LIDAR data processing, (2) Spatial objects/features extraction through surface material information and high-accuracy 3D DSM data, (3) 3D city model development, (4) Algorithm development for generation of DTM-based orthophoto, and DBM-based orthophoto, (5) True orthophoto generation by merging DBM-based orthophoto and DTM-based orthophoto, and (6) Automatic mosaic by optimizing and combining imagery from many perspectives.

  13. Large-Scale Operations Management Test of Use of the White Amur for Control of Problem Aquatic Plants. Report 2. First Year Poststocking Results. Volume IV. Nitrogen and Phosphorus Dynamics of the Lake Conway Ecosystem: Loading Budgets and a Dynamic Hydrologic Phosphorus Model.

    DTIC Science & Technology

    1982-08-01

    This report presents nitrogen and phosphorus loading budgets and a dynamic hydrologic phosphorus model for the Lake Conway ecosystem, and is part of the Large-Scale Operations Management Test (LSOMT) of the Aquatic Plant Control Research Program (APCRP) at the WES. The report should be cited as follows: Blancher, E. C., II, and Fellows, C. R. 1982. "Large-Scale Operations Management Test of Use of the White Amur for Control...

  14. Leveraging Resources to Address Transportation Needs: Transportation Pooled Fund Program

    DOT National Transportation Integrated Search

    2004-05-28

    This brochure describes the Transportation Pooled Fund (TPF) Program. The objectives of the TPF Program are to leverage resources, avoid duplication of effort, undertake large-scale projects, obtain greater input on project definition, achieve broade...

  15. Assessment of the technology required to develop photovoltaic power system for large scale national energy applications

    NASA Technical Reports Server (NTRS)

    Lutwack, R.

    1974-01-01

    A technical assessment of a program to develop photovoltaic power system technology for large-scale national energy applications was made by analyzing and judging the alternative candidate photovoltaic systems and development tasks. A program plan was constructed based on achieving the 10-year objective of establishing the practicability of large-scale terrestrial power installations using photovoltaic conversion arrays costing less than $0.50/peak W. Guidelines for the tasks of a 5-year program were derived from a set of 5-year objectives deduced from the 10-year objective. This report indicates the need for an early emphasis on the development of the single-crystal Si photovoltaic system for commercial utilization; a production goal of 5 × 10^8 peak W/year of $0.50 cells was projected for the year 1985. The development of other photovoltaic conversion systems was assigned to longer-range development roles. The status of the technology developments and the applicability of solar arrays in particular power installations, ranging from houses to central power plants, were scheduled to be verified in a series of demonstration projects. The budget recommended for the first 5-year phase of the program is $268.5M.

  16. Architecture and Programming Models for High Performance Intensive Computation

    DTIC Science & Technology

    2016-06-29

    Publication fragments recovered from this record include: "Applications Systems and Large-Scale-Big-Data & Large-Scale-Big-Computing (DDDAS-LS)," ICCS 2015, June 2015, Reykjavík, Iceland; Bo YT, Wang P, Guo ZL, ...; and "The Mahali project," Communications Magazine, vol. 52, pp. 111-133, Aug 2014. Distribution A: approved for public release.

  17. Child Demographics Associated with Outcomes in a Community-Based Pivotal Response Training Program

    ERIC Educational Resources Information Center

    Baker-Ericzen, Mary J.; Stahmer, Aubyn C.; Burns, Amelia

    2007-01-01

    Although knowledge about the efficacy of treatments such as pivotal response training (PRT) for children with autism is increasing, studies of large-scale effectiveness for and transportability to diverse community populations are needed. The current study provides a large-scale preliminary assessment of (a) the effectiveness of a community-based…

  18. Measuring Family Outcomes Early Intervention: Findings from a Large-Scale Assessment

    ERIC Educational Resources Information Center

    Raspa, Melissa; Bailey, Donald B., Jr.; Olmsted, Murrey G.; Nelson, Robin; Robinson, Nyle; Simpson, Mary Ellen; Guillen, Chelsea; Houts, Renate

    2010-01-01

    This article reports data from a large-scale assessment using the Family Outcomes Survey with families participating in early intervention. The study was designed to determine how families describe themselves with regard to outcomes achieved, the extent to which outcomes are interrelated, and the extent to which child, family, and program factors…

  19. Dynamics of the McDonnell Douglas Large Scale Dynamic Rig and Dynamic Calibration of the Rotor Balance

    DOT National Transportation Integrated Search

    1994-10-01

    A shake test was performed on the Large Scale Dynamic Rig in the 40- by 80-Foot Wind Tunnel in support of the McDonnell Douglas Advanced Rotor Technology (MDART) Test Program. The shake test identifies the hub modes and the dynamic calibration matrix...

  20. Assessing Large-Scale Public Job Creation. R&D Monograph 67.

    ERIC Educational Resources Information Center

    Employment and Training Administration (DOL), Washington, DC.

    To assess the feasibility of large-scale, countercyclical public job creation, a study was initiated. Job creation program activities were examined in terms of how many activities could be undertaken; what would be their costs; and what would be their characteristics (labor-intensity, skill-mix, and political acceptability) that might contribute…

  1. The large scale microelectronics Computer-Aided Design and Test (CADAT) system

    NASA Technical Reports Server (NTRS)

    Gould, J. M.

    1978-01-01

    The CADAT system consists of a number of computer programs written in FORTRAN that provide the capability to simulate, lay out, analyze, and create the artwork for large scale microelectronics. The function of each software component of the system is described with references to specific documentation for each software component.

  2. Prediction of monthly rainfall on homogeneous monsoon regions of India based on large scale circulation patterns using Genetic Programming

    NASA Astrophysics Data System (ADS)

    Kashid, Satishkumar S.; Maity, Rajib

    2012-08-01

    Prediction of Indian Summer Monsoon Rainfall (ISMR) is of vital importance for the Indian economy, and it has remained a great challenge for hydro-meteorologists because of the inherent complexity of the climate system. Large-scale atmospheric circulation patterns from the tropical Pacific Ocean (ENSO) and the tropical Indian Ocean (EQUINOO) are established influences on the Indian Summer Monsoon Rainfall. The information in these two large-scale circulation patterns, expressed through their indices, is used to model the complex relationship between ISMR and the ENSO and EQUINOO indices. However, extracting a usable signal from such large-scale indices for modeling such complex systems is significantly difficult. Rainfall predictions were made for 'All India' as one unit, as well as for the five 'homogeneous monsoon regions of India' defined by the Indian Institute of Tropical Meteorology. Genetic Programming (GP), an artificial intelligence technique, was employed for this problem and is found to capture the complex relationship between monthly ISMR and the ENSO and EQUINOO indices. The findings indicate that GP-derived monthly rainfall forecasting models that use large-scale atmospheric circulation information successfully predict All India Summer Monsoon Rainfall with a correlation coefficient as high as 0.866, which is attractive for such a complex system. A separate analysis, performed at the end of May using only the March, April, and May values of the ENSO and EQUINOO indices, predicted All India Summer Monsoon Rainfall with a correlation coefficient of 0.70, with somewhat lower values for the individual homogeneous monsoon regions.
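As a concrete, heavily simplified illustration of the GP approach described above, the sketch below evolves symbolic expression trees over two synthetic predictor indices standing in for ENSO and EQUINOO. The data, operator set, mutation-only evolution, and all parameters are illustrative assumptions, not the paper's actual setup.

```python
import random

random.seed(0)

# Synthetic monthly data: two predictor indices standing in for the ENSO and
# EQUINOO indices, and a rainfall response generated from them.
N = 60
enso = [random.uniform(-2, 2) for _ in range(N)]
equinoo = [random.uniform(-2, 2) for _ in range(N)]
rain = [50 + 8 * e - 5 * q + random.gauss(0, 2) for e, q in zip(enso, equinoo)]

OPS = {'+': lambda a, b: a + b,
       '-': lambda a, b: a - b,
       '*': lambda a, b: a * b}

def rand_tree(depth=3):
    """Random expression tree: ('op', left, right), a variable name, or a constant."""
    if depth == 0 or random.random() < 0.3:
        return random.choice(['enso', 'equinoo', round(random.uniform(-10, 10), 2)])
    return (random.choice(list(OPS)), rand_tree(depth - 1), rand_tree(depth - 1))

def evaluate(tree, e, q):
    if tree == 'enso':
        return e
    if tree == 'equinoo':
        return q
    if isinstance(tree, tuple):
        return OPS[tree[0]](evaluate(tree[1], e, q), evaluate(tree[2], e, q))
    return tree  # numeric constant

def fitness(tree):
    """Negative mean squared error over the training months (higher is better)."""
    err = 0.0
    for e, q, r in zip(enso, equinoo, rain):
        d = evaluate(tree, e, q) - r
        dd = d * d
        err += dd if dd < 1e18 else 1e18  # cap wild trees instead of overflowing
    return -err / N

def mutate(tree):
    """Replace a randomly chosen subtree with a fresh random one."""
    if not isinstance(tree, tuple) or random.random() < 0.3:
        return rand_tree(2)
    op, left, right = tree
    if random.random() < 0.5:
        return (op, mutate(left), right)
    return (op, left, mutate(right))

# Evolve: keep an elite, refill the population with mutated elites.
pop = [rand_tree() for _ in range(60)]
for generation in range(30):
    pop.sort(key=fitness, reverse=True)
    elite = pop[:20]
    pop = elite + [mutate(random.choice(elite)) for _ in range(40)]
best = max(pop, key=fitness)
```

The essential GP ingredients are all here in miniature: a tree representation of candidate formulas, a data-driven fitness function, and selection plus random variation; a full implementation would add crossover, bloat control, and out-of-sample validation.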

  3. Management of large-scale technology

    NASA Technical Reports Server (NTRS)

    Levine, A.

    1985-01-01

    Two major themes are addressed in this assessment of the management of large-scale NASA programs: (1) how a high-technology agency managed a decade marked by a rapid expansion of funds and manpower in the first half and an almost equally rapid contraction in the second; and (2) how NASA combined central planning and control with decentralized project execution.

  4. Urban forest health monitoring: large-scale assessments in the United States

    Treesearch

    Anne Buckelew Cumming; Daniel B. Twardus; David J. Nowak

    2008-01-01

    The U.S. Department of Agriculture, Forest Service (USFS), together with state partners, developed methods to monitor urban forest structure, function, and health at a large statewide scale. Pilot studies have been established in five states using protocols based on USFS Forest Inventory and Analysis and Forest Health Monitoring program data collection standards....

  5. Achievement in Large-Scale National Numeracy Assessment: An Ecological Study of Motivation and Student, Home, and School Predictors

    ERIC Educational Resources Information Center

    Martin, Andrew J.; Lazendic, Goran

    2018-01-01

    With the rise of large-scale academic assessment programs around the world, there is a need to better understand the factors predicting students' achievement in these assessment exercises. This investigation into national numeracy assessment drew on ecological and transactional conceptualizing involving student, student/home, and school factors.…

  6. Applying the 15 Public Health Emergency Preparedness Capabilities to Support Large-Scale Tuberculosis Investigations in Complex Congregate Settings

    PubMed Central

    Toren, Katelynne Gardner; Elsenboss, Carina; Narita, Masahiro

    2017-01-01

    Public Health—Seattle and King County, a metropolitan health department in western Washington, experiences rates of tuberculosis (TB) that are 1.6 times higher than state and national averages. The department’s TB Control Program uses public health emergency management tools and capabilities sustained with Centers for Disease Control and Prevention grant funding to manage large-scale complex case investigations. We have described 3 contact investigations in large congregate settings that the TB Control Program conducted in 2015 and 2016. The program managed the investigations using public health emergency management tools, with support from the Preparedness Program. The 3 investigations encompassed medical evaluation of more than 1600 people, involved more than 100 workers, identified nearly 30 individuals with latent TB infection, and prevented an estimated 3 cases of active disease. These incidents exemplify how investments in public health emergency preparedness can enhance health outcomes in traditional areas of public health. PMID:28892445

  7. Large-Scale Coronal Heating from "Cool" Activity in the Solar Magnetic Network

    NASA Technical Reports Server (NTRS)

    Falconer, D. A.; Moore, R. L.; Porter, J. G.; Hathaway, D. H.

    1999-01-01

    In Fe XII images from SOHO/EIT, the quiet solar corona shows structure on scales ranging from sub-supergranular (i.e., bright points and coronal network) to multi-supergranular (large-scale corona). In Falconer et al 1998 (Ap.J., 501, 386) we suppressed the large-scale background and found that the network-scale features are predominantly rooted in the magnetic network lanes at the boundaries of the supergranules. Taken together, the coronal network emission and bright point emission are only about 5% of the entire quiet solar coronal Fe XII emission. Here we investigate the relationship between the large-scale corona and the network as seen in three different EIT filters (He II, Fe IX-X, and Fe XII). Using the median-brightness contour, we divide the large-scale Fe XII corona into dim and bright halves, and find that the bright-half/dim half brightness ratio is about 1.5. We also find that the bright half relative to the dim half has 10 times greater total bright point Fe XII emission, 3 times greater Fe XII network emission, 2 times greater Fe IX-X network emission, 1.3 times greater He II network emission, and has 1.5 times more magnetic flux. Also, the cooler network (He II) radiates an order of magnitude more energy than the hotter coronal network (Fe IX-X, and Fe XII). From these results we infer that: 1) The heating of the network and the heating of the large-scale corona each increase roughly linearly with the underlying magnetic flux. 2) The production of network coronal bright points and heating of the coronal network each increase nonlinearly with the magnetic flux. 3) The heating of the large-scale corona is driven by widespread cooler network activity rather than by the exceptional network activity that produces the network coronal bright points and the coronal network. 4) The large-scale corona is heated by a nonthermal process since the driver of its heating is cooler than it is. 
This work was funded by the Solar Physics Branch of NASA's office of Space Science through the SR&T Program and the SEC Guest Investigator Program.

  8. What Does It Take to Scale Up Innovations? An Examination of Teach for America, the Harlem Children's Zone and the Knowledge is Power Program

    ERIC Educational Resources Information Center

    Levin, Ben

    2013-01-01

    This brief discusses the problem of scaling innovations in education in the United States so that they can serve very large numbers of students. It begins with a general discussion of the issues involved, develops a set of five criteria for assessing challenges of scaling, and then uses three programs widely discussed in the U.S. as examples of…

  9. Solving large-scale fixed cost integer linear programming models for grid-based location problems with heuristic techniques

    NASA Astrophysics Data System (ADS)

    Noor-E-Alam, Md.; Doucette, John

    2015-08-01

    Grid-based location problems (GBLPs) arise in business, engineering, resource exploitation, and even the medical sciences. To solve these decision problems, an integer linear programming (ILP) model is designed and developed to provide optimal solutions for GBLPs under fixed-cost criteria. Preliminary results show that the ILP model is efficient in solving small- to moderate-sized problems. However, the ILP model becomes intractable for large-scale instances. Therefore, a decomposition heuristic is proposed to solve these large-scale GBLPs, which significantly reduces solution runtimes. To benchmark the proposed heuristic, results are compared with the exact solution via ILP. The experimental results show that the proposed method significantly outperforms the exact method in runtime with minimal (and in most cases, no) loss of optimality.
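The exact-versus-decomposition trade-off described above can be made concrete with a toy sketch. This is not the paper's ILP formulation or heuristic: brute-force enumeration stands in for the exact ILP solve, the coverage rule and costs are invented, and the decomposition simply partitions the grid into tiles and solves each tile independently.

```python
from itertools import combinations

def covers(fac, dem, r=1):
    """A facility covers a demand cell within Chebyshev distance r (assumed rule)."""
    return max(abs(fac[0] - dem[0]), abs(fac[1] - dem[1])) <= r

def exact_cover(demands, candidates, cost, r=1):
    """Brute-force minimum-fixed-cost cover; stands in for the exact ILP solve
    and is tractable only for tiny instances."""
    best_cost, best_set = float('inf'), ()
    for k in range(1, len(candidates) + 1):
        for subset in combinations(candidates, k):
            if all(any(covers(f, d, r) for f in subset) for d in demands):
                c = sum(cost[f] for f in subset)
                if c < best_cost:
                    best_cost, best_set = c, subset
    return best_cost, best_set

def decomposed_cover(demands, candidates, cost, r=1, block=2):
    """Decomposition heuristic: partition the grid into block x block tiles,
    solve each tile exactly, and ignore cross-tile coverage, trading
    optimality for runtime."""
    tile = lambda p: (p[0] // block, p[1] // block)
    tiles = {}
    for d in demands:
        tiles.setdefault(tile(d), []).append(d)
    total, chosen = 0, []
    for key, dems in tiles.items():
        cands = [c for c in candidates if tile(c) == key]
        c, s = exact_cover(dems, cands, cost, r)
        total += c
        chosen.extend(s)
    return total, chosen

# Tiny demo: cover every cell of a 4x4 grid from 8 candidate sites.
demands = [(i, j) for i in range(4) for j in range(4)]
candidates = [(0, 0), (1, 1), (0, 3), (1, 2), (2, 1), (3, 0), (2, 2), (3, 3)]
cost = {c: 1 + (c[0] + 2 * c[1]) % 3 for c in candidates}

exact_c, exact_s = exact_cover(demands, candidates, cost)
heur_c, heur_s = decomposed_cover(demands, candidates, cost)
```

On this instance both approaches happen to reach the same cost, but the heuristic solves four 2-candidate subproblems instead of enumerating all 255 subsets; in general it remains feasible while possibly sacrificing optimality, which mirrors the benchmark comparison in the abstract.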

  10. Selecting, adapting, and sustaining programs in health care systems

    PubMed Central

    Zullig, Leah L; Bosworth, Hayden B

    2015-01-01

    Practitioners and researchers often design behavioral programs that are effective for a specific population or problem. Despite their success in a controlled setting, relatively few programs are scaled up and implemented in health care systems. Planning for scale-up is a critical, yet often overlooked, element in the process of program design. Equally as important is understanding how to select a program that has already been developed, and adapt and implement the program to meet specific organizational goals. This adaptation and implementation requires attention to organizational goals, available resources, and program cost. We assert that translational behavioral medicine necessitates expanding successful programs beyond a stand-alone research study. This paper describes key factors to consider when selecting, adapting, and sustaining programs for scale-up in large health care systems and applies the Knowledge to Action (KTA) Framework to a case study, illustrating knowledge creation and an action cycle of implementation and evaluation activities. PMID:25931825

  11. E-Mentoring for Social Equity: Review of Research to Inform Program Development

    ERIC Educational Resources Information Center

    Single, Peg Boyle; Single, Richard M.

    2005-01-01

    The advent of user-friendly email programs and web browsers created possibilities for widespread use of e-mentoring programs. In this review of the research, we presented the history of e-mentoring programs and defined e-mentoring and structured e-mentoring programs, focusing on large-scale e-mentoring programs that addressed issues of social…

  12. Coaching as Part of a Pilot Quality Rating Scale Initiative: Challenges to--and Supports for--the Change-Making Process

    ERIC Educational Resources Information Center

    Ackerman, Debra J.

    2008-01-01

    Several nonprofit agencies in a large Midwestern city provide assistance to early care and education programs participating in a pilot Quality Rating Scale (QRS) initiative by pairing them with itinerant consultants, who are known as coaches. Despite this assistance, not all programs improve their QRS score. Furthermore, while pilot stakeholders…

  13. Implementing Projects in Calculus on a Large Scale at the University of South Florida

    ERIC Educational Resources Information Center

    Fox, Gordon A.; Campbell, Scott; Grinshpan, Arcadii; Xu, Xiaoying; Holcomb, John; Bénéteau, Catherine; Lewis, Jennifer E.; Ramachandran, Kandethody

    2017-01-01

    This paper describes the development of a program of project-based learning in Calculus courses at a large urban research university. In this program, students developed research projects in consultation with a faculty advisor in their major, and supervised by their calculus instructors. Students wrote up their projects in a prescribed format…

  14. Three-Year Evaluation of a Large Scale Early Grade French Immersion Program: The Ottawa Study

    ERIC Educational Resources Information Center

    Barik, Henri; Swain, Marrill

    1975-01-01

    The school performance of pupils in grades K-2 of the French immersion program in operation in Ottawa public schools is evaluated in comparison with that of pupils in the regular English program. (Author/RM)

  15. Towards Large-area Field-scale Operational Evapotranspiration for Water Use Mapping

    NASA Astrophysics Data System (ADS)

    Senay, G. B.; Friedrichs, M.; Morton, C.; Huntington, J. L.; Verdin, J.

    2017-12-01

    Field-scale evapotranspiration (ET) estimates are needed for improving surface water and groundwater use and water budget studies. Ideally, field-scale ET estimates would extend to regional and national levels and cover long time periods. Because of the large data storage and computational requirements associated with processing field-scale satellite imagery such as Landsat, numerous challenges remain in developing operational ET estimates over large areas for detailed water use and availability studies. However, the combination of new science, data availability, and cloud computing technology is enabling unprecedented capabilities for ET mapping. To demonstrate this capability, we used Google's Earth Engine cloud computing platform to create nationwide annual ET estimates from 30-meter resolution Landsat imagery (~16,000 images) and gridded weather data using the Operational Simplified Surface Energy Balance (SSEBop) model, in support of the National Water Census, a USGS research program designed to build decision support capacity for water management agencies and other natural resource managers. By leveraging Google's Earth Engine Application Programming Interface (API) and developing software in a collaborative, open-platform environment, we rapidly advance from research toward applications for large-area field-scale ET mapping. Cloud computing of the Landsat image archive, combined with other satellite, climate, and weather data, is creating never-imagined opportunities for assessing ET model behavior and uncertainty, and ultimately provides the ability for more robust operational monitoring and assessment of water use at field scales.
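The SSEBop model mentioned above rests on a simple per-pixel relation: an ET fraction interpolated between a cold (well-watered) and a hot (dry) reference temperature, scaled by reference ET. The sketch below illustrates that core relation only; the temperatures, the clamping range, and the `k` scaling factor are illustrative assumptions, not USGS code.

```python
def et_fraction(ts, t_cold, dt):
    """SSEBop-style ET fraction from land surface temperature `ts` (K):
    1.0 at the cold/wet reference `t_cold`, 0.0 at the hot/dry reference
    `t_cold + dt`; clamped to a plausible range."""
    etf = (t_cold + dt - ts) / dt
    return max(0.0, min(1.05, etf))

def actual_et(ts, t_cold, dt, eto, k=1.0):
    """Actual ET (mm/day) = ET fraction x k x reference ET."""
    return et_fraction(ts, t_cold, dt) * k * eto

# A pixel halfway between the cold (295 K) and hot (305 K) references,
# on a day with 6 mm/day reference ET, evapotranspires about 3 mm/day.
eta = actual_et(ts=300.0, t_cold=295.0, dt=10.0, eto=6.0)
```

Because each pixel is evaluated independently from a handful of raster inputs, the computation parallelizes trivially, which is what makes the Earth Engine scale-out over ~16,000 Landsat scenes practical.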

  16. Potential utilization of the NASA/George C. Marshall Space Flight Center in earthquake engineering research

    NASA Technical Reports Server (NTRS)

    Scholl, R. E. (Editor)

    1979-01-01

    Earthquake engineering research capabilities of the National Aeronautics and Space Administration (NASA) facilities at George C. Marshall Space Flight Center (MSFC), Alabama, were evaluated. The results indicate that the NASA/MSFC facilities and supporting capabilities offer unique opportunities for conducting earthquake engineering research. Specific features that are particularly attractive for large scale static and dynamic testing of natural and man-made structures include the following: large physical dimensions of buildings and test bays; high loading capacity; wide range and large number of test equipment and instrumentation devices; multichannel data acquisition and processing systems; technical expertise for conducting large-scale static and dynamic testing; sophisticated techniques for systems dynamics analysis, simulation, and control; and capability for managing large-size and technologically complex programs. Potential uses of the facilities for near and long term test programs to supplement current earthquake research activities are suggested.

  17. ECOLOGICAL RESEARCH IN THE LARGE-SCALE BIOSPHERE–ATMOSPHERE EXPERIMENT IN AMAZONIA: EARLY RESULTS.

    Treesearch

    M. Keller; A. Alencar; G. P. Asner; B. Braswell; M. Bustamente; E. Davidson; T. Feldpausch; E. Fern ndes; M. Goulden; P. Kabat; B. Kruijt; F. Luizao; S. Miller; D. Markewitz; A. D. Nobre; C. A. Nobre; N. Priante Filho; H. Rocha; P. Silva Dias; C von Randow; G. L. Vourlitis

    2004-01-01

    The Large-scale Biosphere–Atmosphere Experiment in Amazonia (LBA) is a multinational, interdisciplinary research program led by Brazil. Ecological studies in LBA focus on how tropical forest conversion, regrowth, and selective logging influence carbon storage, nutrient dynamics, trace gas fluxes, and the prospect for sustainable land use in the Amazon region. Early...

  18. Measurement repeatability of a large-scale inventory of forest fuels

    Treesearch

    J.A. Westfall; C.W. Woodall

    2007-01-01

    An efficient and accurate inventory of forest fuels at large scales is critical for assessment of forest fire hazards across landscapes. The Forest Inventory and Analysis (FIA) program of the USDA Forest Service conducts a national inventory of fuels along with blind remeasurement of a portion of inventory plots to monitor and improve data quality. The goal of this...

  19. Secondary Analysis and Large-Scale Assessments. Monograph in the Faculty of Education Research Seminar and Workshop Series.

    ERIC Educational Resources Information Center

    Tobin, Kenneth; Fraser, Barry J.

    Large scale assessments of educational progress can be useful tools to judge the effectiveness of educational programs and assessments. This document contains papers presented at the research seminar on this topic held at the Western Australian Institute of Technology in November, 1984. It is the fifth in a series of publications of papers…

  20. Large-Scale Academic Achievement Testing of Deaf and Hard-of-Hearing Students: Past, Present, and Future

    ERIC Educational Resources Information Center

    Qi, Sen; Mitchell, Ross E.

    2012-01-01

    The first large-scale, nationwide academic achievement testing program using Stanford Achievement Test (Stanford) for deaf and hard-of-hearing children in the United States started in 1969. Over the past three decades, the Stanford has served as a benchmark in the field of deaf education for assessing student academic achievement. However, the…

  1. Effectiveness of Large-Scale Community-Based Intensive Behavioral Intervention: A Waitlist Comparison Study Exploring Outcomes and Predictors

    ERIC Educational Resources Information Center

    Flanagan, Helen E.; Perry, Adrienne; Freeman, Nancy L.

    2012-01-01

    File review data were used to explore the impact of a large-scale publicly funded Intensive Behavioral Intervention (IBI) program for young children with autism. Outcomes were compared for 61 children who received IBI and 61 individually matched children from a waitlist comparison group. In addition, predictors of better cognitive outcomes were…

  2. Static analysis techniques for semiautomatic synthesis of message passing software skeletons

    DOE PAGES

    Sottile, Matthew; Dagit, Jason; Zhang, Deli; ...

    2015-06-29

    The design of high-performance computing architectures demands performance analysis of large-scale parallel applications to derive various parameters concerning hardware design and software development. The process of performance analysis and benchmarking an application can be done in several ways with varying degrees of fidelity. One of the most cost-effective ways is to do a coarse-grained study of large-scale parallel applications through the use of program skeletons. The concept of a “program skeleton” that we discuss in this article is an abstracted program that is derived from a larger program where source code that is determined to be irrelevant is removed for the purposes of the skeleton. In this work, we develop a semiautomatic approach for extracting program skeletons based on compiler program analysis. Finally, we demonstrate correctness of our skeleton extraction process by comparing details from communication traces, as well as show the performance speedup of using skeletons by running simulations in the SST/macro simulator.
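    The skeleton idea described in this record can be illustrated with a toy sketch. The trace format and event names below are invented for illustration and are not the authors' tool or representation: a trace of timed events is reduced by keeping communication operations verbatim and collapsing each run of computation into a single synthetic delay.

```python
# Toy "program skeleton" extraction: keep communication events, collapse
# computation into synthetic delays. The (operation, cost) trace format
# is a hypothetical stand-in, not the representation used by the authors.

COMM_OPS = {"send", "recv", "barrier"}

def extract_skeleton(trace):
    """Collapse compute events into delays, preserving communication order."""
    skeleton = []
    pending = 0  # accumulated cost of elided computation
    for op, cost in trace:
        if op in COMM_OPS:
            if pending:
                skeleton.append(("delay", pending))
                pending = 0
            skeleton.append((op, cost))
        else:
            pending += cost
    if pending:
        skeleton.append(("delay", pending))
    return skeleton

trace = [("compute", 3), ("send", 1), ("compute", 2), ("compute", 2), ("recv", 1)]
print(extract_skeleton(trace))
# [('delay', 3), ('send', 1), ('delay', 4), ('recv', 1)]
```

A simulator can then replay the skeleton, charging simulated time for each delay while modeling the communication operations in full, which is the coarse-grained fidelity trade-off the abstract describes.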

  3. Managing Large-Scale Online Graduate Programs

    ERIC Educational Resources Information Center

    Singleton, Jacques; Bowser, Audrey; Hux, Annette; Neal, Gwendolyn

    2013-01-01

    As with most states, Arkansas is experiencing substantial growth in the delivery of academic programs and courses by distance learning provided by institutions of higher education. At Arkansas State University, faculty have responded to the needs of students and developed a completely online certification and master's program in Educational…

  4. A Universal Model for Solar Eruptions

    NASA Astrophysics Data System (ADS)

    Wyper, Peter; Antiochos, Spiro K.; DeVore, C. Richard

    2017-08-01

    We present a universal model for solar eruptions that spans coronal mass ejections (CMEs) at one end of the scale and coronal jets at the other. The model is a natural extension of the Magnetic Breakout model for large-scale fast CMEs. Using high-resolution adaptive-mesh MHD simulations conducted with the ARMS code, we show that so-called blowout or mini-filament coronal jets can be explained as one realisation of the breakout process. We also demonstrate the robustness of this “breakout-jet” model by studying three realisations in simulations with different ambient field inclinations. We conclude that magnetic breakout supports both large-scale fast CMEs and small-scale coronal jets, and by inference eruptions at scales in between. Thus, magnetic breakout provides a unified model for solar eruptions. P.F.W. was supported in this work by an award of a RAS Fellowship and an appointment to the NASA Postdoctoral Program. C.R.D. and S.K.A. were supported by NASA’s LWS TR&T and H-SR programs.

  5. Effects of Interim Assessments on Student Achievement: Evidence from a Large-Scale Experiment

    ERIC Educational Resources Information Center

    Konstantopoulos, Spyros; Miller, Shazia R.; van der Ploeg, Arie; Li, Wei

    2016-01-01

    We use data from a large-scale, school-level randomized experiment conducted in 2010-2011 in public schools in Indiana. Our sample includes more than 30,000 students in 70 schools. We examine the impact of two interim assessment programs (i.e., mCLASS in Grades K-2 and Acuity in Grades 3-8) on mathematics and reading achievement. Two-level models…

  6. Development of lichen response indexes using a regional gradient modeling approach for large-scale monitoring of forests

    Treesearch

    Susan Will-Wolf; Peter Neitlich

    2010-01-01

    Development of a regional lichen gradient model from community data is a powerful tool to derive lichen indexes of response to environmental factors for large-scale and long-term monitoring of forest ecosystems. The Forest Inventory and Analysis (FIA) Program of the U.S. Department of Agriculture Forest Service includes lichens in its national inventory of forests of...

  7. IKONOS imagery for the Large Scale Biosphere–Atmosphere Experiment in Amazonia (LBA).

    Treesearch

    George Hurtt; Xiangming Xiao; Michael Keller; Michael Palace; Gregory P. Asner; Rob Braswell; Eduardo S. Brondízio; Manoel Cardoso; Claudio J.R. Carvalho; Matthew G. Fearon; Liane Guild; Steve Hagen; Scott Hetrick; Berrien Moore III; Carlos Nobre; Jane M. Read; Tatiana Sá; Annette Schloss; George Vourlitis; Albertus J. Wickel

    2003-01-01

    The LBA-ECO program is one of several international research components under the Brazilian-led Large Scale Biosphere–Atmosphere Experiment in Amazonia (LBA). The field-oriented research activities of this study are organized along transects and include a set of primary field sites, where the major objective is to study land-use change and ecosystem dynamics, and a...

  8. In Search of the Rainbow: Pathways to Quality in Large-Scale Programmes for Young Disadvantaged Children. Early Childhood Development: Practice and Reflections Number 10.

    ERIC Educational Resources Information Center

    Woodhead, Martin

    Those involved in early childhood development must recognize that many of their most cherished beliefs about what is best for children are cultural constructions. This book focuses on quality in large-scale programs for disadvantaged young children in a variety of cultural settings. Chapter 1, "Changing Childhoods," discusses issues…

  9. Avionic Data Bus Integration Technology

    DTIC Science & Technology

    1991-12-01

    address the hardware-software interaction between a digital data bus and an avionic system. Very Large Scale Integration (VLSI) ICs and multiversion ...the SCP. In 1984, the Sperry Corporation developed a fault tolerant system which employed multiversion programming, voting, and monitoring for error... MULTIVERSION PROGRAMMING. N-version programming. 226 N-VERSION PROGRAMMING. The independent coding of a number, N, of redundant computer programs that

  10. Overview and current status of DOE/UPVG's TEAM-UP Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hester, S.

    1995-11-01

    An overview is given of the Utility Photovoltaic Group (UPVG). Its mission is to accelerate the use of small-scale and large-scale applications of photovoltaics for the benefit of electric utilities and their customers.

  11. MRMPROBS: a data assessment and metabolite identification tool for large-scale multiple reaction monitoring based widely targeted metabolomics.

    PubMed

    Tsugawa, Hiroshi; Arita, Masanori; Kanazawa, Mitsuhiro; Ogiwara, Atsushi; Bamba, Takeshi; Fukusaki, Eiichiro

    2013-05-21

    We developed a new software program, MRMPROBS, for widely targeted metabolomics using the large-scale multiple reaction monitoring (MRM) mode. This strategy has become increasingly popular for the simultaneous analysis of up to several hundred metabolites with high sensitivity, selectivity, and quantitative capability. However, the traditional method of assessing measured metabolomics data without probabilistic criteria is not only time-consuming but often subjective and ad hoc. Our program overcomes these problems by detecting and identifying metabolites automatically, by separating isomeric metabolites, and by removing background noise using a probabilistic score defined as the odds ratio from an optimized multivariate logistic regression model. Our software program also provides a user-friendly graphical interface to curate and organize data matrices and to apply principal component analyses and statistical tests. For a demonstration, we conducted a widely targeted metabolome analysis (152 metabolites) of propagating Saccharomyces cerevisiae measured at 15 time points by gas and liquid chromatography coupled to triple quadrupole mass spectrometry. MRMPROBS is a useful and practical tool for the assessment of large-scale MRM data, applicable to any instrument or experimental condition.
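    The probabilistic score the abstract mentions can be sketched generically. This is a plain logistic-regression sketch with invented feature names and weights, not the model shipped in MRMPROBS: the regression maps peak-quality features to a probability, and the odds p / (1 - p) serve as the match score.

```python
# Generic odds-ratio scoring sketch: a multivariate logistic regression turns
# peak-quality features into a probability; the odds ratio is the score.
# Feature names, weights, and bias below are invented for illustration.
import math

def odds_score(features, weights, bias):
    """Odds ratio from a logistic model; higher means a more credible peak."""
    z = bias + sum(w * x for w, x in zip(weights, features))
    p = 1.0 / (1.0 + math.exp(-z))
    return p / (1.0 - p)

# Hypothetical features: retention-time error (min), peak-shape correlation.
w, b = [-8.0, 6.0], -2.0
good_peak = odds_score([0.05, 0.95], w, b)  # small RT error, good shape
noise = odds_score([0.60, 0.30], w, b)      # large RT error, poor shape
print(good_peak > noise)  # True
```

Thresholding such a score gives an automatic accept/reject decision for each monitored transition, replacing the manual curation the abstract criticizes.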

  12. Visual analysis of inter-process communication for large-scale parallel computing.

    PubMed

    Muelder, Chris; Gygi, Francois; Ma, Kwan-Liu

    2009-01-01

    In serial computation, program profiling is often helpful for optimizing key sections of code. When moving to parallel computation, not only does the code execution need to be considered but also the communication between processes, which can induce delays that are detrimental to performance. As the number of processes increases, so does the impact of the communication delays on performance. For large-scale parallel applications, it is critical to understand how the communication impacts performance in order to make the code more efficient. There are several tools available for visualizing program execution and communications on parallel systems. These tools generally provide either views that statistically summarize the entire program execution or process-centric views. However, process-centric visualizations do not scale well as the number of processes gets very large. In particular, the most common representation of parallel processes is a Gantt chart with a row for each process. As the number of processes increases, these charts become difficult to work with and can even exceed screen resolution. We propose a new visualization approach that affords more scalability and then demonstrate it on systems running with up to 16,384 processes.
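    One direction a scalable view can take, sketched here as a hypothetical illustration rather than the visualization this record proposes, is to aggregate processes into groups so the display cost grows with the number of groups rather than the number of processes:

```python
# Sketch of binning point-to-point traffic into a group-level volume matrix.
# The (sender, receiver, bytes) message format is an assumption for
# illustration; a real trace would come from an MPI profiling tool.

def group_comm_matrix(messages, n_procs, n_groups):
    """Bin messages into an n_groups x n_groups byte-volume matrix."""
    size = n_procs // n_groups  # processes per group (assumes even split)
    matrix = [[0] * n_groups for _ in range(n_groups)]
    for src, dst, nbytes in messages:
        matrix[src // size][dst // size] += nbytes
    return matrix

messages = [(0, 5, 100), (1, 4, 50), (6, 2, 200), (7, 7, 10)]
print(group_comm_matrix(messages, n_procs=8, n_groups=2))
# [[0, 150], [200, 10]]
```

A fixed-size matrix like this can be rendered as a heat map regardless of whether the run used 8 or 16,384 processes, which is the scalability property the abstract argues per-process Gantt charts lack.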

  13. Iraq: Reconstruction Assistance

    DTIC Science & Technology

    2007-06-25

    Iraq: Reconstruction Assistance Summary A large-scale assistance program has been undertaken by the United States in Iraq since mid-2003. To date...28, 2004, the entity implementing assistance programs, the Coalition Provisional Authority (CPA), dissolved, and sovereignty was returned to Iraq. U.N...

  14. The Comprehensive Project for Deprived Communities in Israel.

    ERIC Educational Resources Information Center

    Goldstein, Joseph

    This large-scale educational program involved 30 settlements and neighborhoods that had been defined as suffering from deprivation and included a variety of reinforcement and enrichment programs. Information for a case study of the program was collected through interviews. Findings indicated that the guiding principles of the program…

  15. Enhanced Graphics for Extended Scale Range

    NASA Technical Reports Server (NTRS)

    Hanson, Andrew J.; Chi-Wing Fu, Philip

    2012-01-01

    Enhanced Graphics for Extended Scale Range is a computer program for rendering fly-through views of scene models that include visible objects differing in size by large orders of magnitude. An example would be a scene showing a person in a park at night with the moon, stars, and galaxies in the background sky. Prior graphical computer programs exhibit arithmetic and other anomalies when rendering scenes containing objects that differ enormously in scale and distance from the viewer. The present program dynamically repartitions distance scales of objects in a scene during rendering to eliminate almost all such anomalies in a way compatible with implementation in other software and in hardware accelerators. By assigning depth ranges corresponding to rendering precision requirements, either automatically or under program control, this program spaces out object scales to match the precision requirements of the rendering arithmetic. This action includes an intelligent partition of the depth buffer ranges to avoid known anomalies from this source. The program is written in C++, using the OpenGL, GLUT, and GLUI standard libraries and NVIDIA GeForce vertex shader extensions. The program has been shown to work on several computers running UNIX and Windows operating systems.
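    The depth-repartitioning idea can be sketched as follows. The band count and ratio bound are illustrative assumptions, not the program's actual algorithm: a huge [near, far] interval is split into consecutive bands of bounded far/near ratio, so each band can be rendered in its own well-conditioned depth pass.

```python
# Sketch of logarithmic depth-range partitioning: split [near, far] into
# bands whose far/near ratio never exceeds max_ratio, so depth-buffer
# precision stays bounded within each band. Parameter values are illustrative.
import math

def depth_bands(near, far, max_ratio):
    """Return consecutive (near, far) bands, each with far/near <= max_ratio."""
    n_bands = math.ceil(math.log(far / near) / math.log(max_ratio))
    ratio = (far / near) ** (1.0 / n_bands)  # equal ratio per band
    bounds = [near * ratio**i for i in range(n_bands + 1)]
    return list(zip(bounds[:-1], bounds[1:]))

# A scene spanning 1 m to 1e13 m with at most a 1e4:1 ratio per band:
bands = depth_bands(1.0, 1e13, 1e4)
print(len(bands))  # 4
```

Rendering then proceeds far band to near band, clearing the depth buffer between passes, so objects metres away and objects light-seconds away never compete for the same depth precision.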

  16. HAPEX-Sahel: A large-scale study of land-atmosphere interactions in the semi-arid tropics

    NASA Technical Reports Server (NTRS)

    Gutorbe, J-P.; Lebel, T.; Tinga, A.; Bessemoulin, P.; Brouwer, J.; Dolman, A.J.; Engman, E. T.; Gash, J. H. C.; Hoepffner, M.; Kabat, P.

    1994-01-01

    The Hydrologic Atmospheric Pilot EXperiment in the Sahel (HAPEX-Sahel) was carried out in Niger, West Africa, during 1991-1992, with an intensive observation period (IOP) in August-October 1992. It aims at improving the parameterization of land surface-atmosphere interactions at the Global Circulation Model (GCM) gridbox scale. The experiment combines remote sensing and ground-based measurements with hydrological and meteorological modeling to develop aggregation techniques for use in large-scale estimates of the hydrological and meteorological behavior of large areas in the Sahel. The experimental strategy consisted of a period of intensive measurements during the transition from the rainy to the dry season, backed up by a series of long-term measurements in a 1 by 1 deg square in Niger. Three 'supersites' were instrumented with a variety of hydrological and (micro)meteorological equipment to provide detailed information on the surface energy exchange at the local scale. Boundary layer and aircraft measurements were used to provide information at scales of 100-500 sq km. All relevant remote sensing images were obtained for this period. This program of measurements is now being analyzed, and an extensive modelling program is under way to aggregate the information at all scales up to the GCM grid box scale. The experimental strategy and some preliminary results of the IOP are described.

  17. Faculty Navigating Institutional Waters: Suggestions for Bottom-Up Design of Online Programs

    ERIC Educational Resources Information Center

    Ferdig, Richard E.; Dawson, Kara

    2006-01-01

    Many faculty make the mistake of trying to start with an online degree. Administration, administrative policies and even other faculty are not necessarily ready for completely online programs. Large-scale programs are risky in the eyes of administration. Putting a program online will often involve decisions at multiple levels, months for business…

  18. Teaching Practices and Language Use in Two-Way Dual Language Immersion Programs in a Large Public School District

    ERIC Educational Resources Information Center

    Li, Jennifer; Steele, Jennifer; Slater, Robert; Bacon, Michael; Miller, Trey

    2016-01-01

    Many educators and policy makers look to two-way dual language immersion as one of the most promising options to close achievement gaps for English learners. However, the programs' effectiveness depends on the quality of their implementation. This article reports on a large-scale study of the implementation of dual language immersion across a…

  19. Quarter Scale RLV Multi-Lobe LH2 Tank Test Program

    NASA Technical Reports Server (NTRS)

    Blum, Celia; Puissegur, Dennis; Tidwell, Zeb; Webber, Carol

    1998-01-01

    Thirty cryogenic pressure cycles have been completed on the Lockheed Martin Michoud Space Systems quarter scale RLV composite multi-lobe liquid hydrogen propellant tank assembly, completing the initial phases of testing and demonstrating technologies key to the success of large scale composite cryogenic tankage for X33, RLV, and other future launch vehicles.

  20. The Personal Selling Ethics Scale: Revisions and Expansions for Teaching Sales Ethics

    ERIC Educational Resources Information Center

    Donoho, Casey; Heinze, Timothy

    2011-01-01

    The field of sales draws a large number of marketing graduates. Sales curricula used within today's marketing programs should include rigorous discussions of sales ethics. The Personal Selling Ethics Scale (PSE) provides an analytical tool for assessing and discussing students' ethical sales sensitivities. However, since the scale fails to address…

  1. Large-Scale Advanced Prop-Fan (LAP)

    NASA Technical Reports Server (NTRS)

    Degeorge, C. L.

    1988-01-01

    In recent years, considerable attention has been directed toward improving aircraft fuel efficiency. Analytical studies and research with wind tunnel models have demonstrated that the high inherent efficiency of low-speed turboprop propulsion systems may now be extended to the Mach 0.8 flight regime of today's commercial airliners. This can be accomplished with a propeller employing a large number of thin, highly swept blades. The term Prop-Fan has been coined to describe such a propulsion system. In 1983 the NASA Lewis Research Center contracted with Hamilton Standard to design, build, and test a near full-scale Prop-Fan, designated the Large Scale Advanced Prop-Fan (LAP). This report provides a detailed description of the LAP program. The assumptions and analytical procedures used in the design of Prop-Fan system components are discussed in detail. The manufacturing techniques used in the fabrication of the Prop-Fan are presented. Each of the tests run during the course of the program is also discussed, and the major conclusions derived from them are stated.

  2. Improved Blood Pressure Control Associated With a Large-Scale Hypertension Program

    PubMed Central

    Jaffe, Marc G.; Lee, Grace A.; Young, Joseph D.; Sidney, Stephen; Go, Alan S.

    2014-01-01

    Importance Hypertension control for large populations remains a major challenge. Objective To describe a large-scale hypertension program in northern California and to compare rates of hypertension control of the program to statewide and national estimates. Design, Setting, and Patients The Kaiser Permanente Northern California (KPNC) hypertension program included a multifaceted approach to blood pressure control. Patients identified with hypertension within an integrated health care delivery system in northern California from 2001–2009 were included. The comparison group included insured patients in California between 2006–2009 who were included in the Healthcare Effectiveness Data and Information Set (HEDIS) commercial measurement by California health insurance plans participating in the National Committee for Quality Assurance (NCQA) quality measure reporting process. A secondary comparison group was the reported national mean NCQA HEDIS commercial rate of hypertension control from 2001–2009 from health plans that participated in the NCQA HEDIS quality measure reporting process. Main Outcome Measure Hypertension control as defined by NCQA HEDIS. Results The KPNC hypertension registry established in 2001 included 349,937 patients and grew to 652,763 by 2009. The NCQA HEDIS commercial measurement for hypertension control increased from 44% to 80% during the study period. In contrast, the national mean NCQA HEDIS commercial measurement increased modestly from 55.4% to 64.1%. California mean NCQA HEDIS commercial rates of hypertension control were similar to those reported nationally from 2006–2009 (63.4% to 69.4%). Conclusion and Relevance Among adults diagnosed with hypertension, implementation of a large-scale hypertension program was associated with a significant increase in hypertension control compared with state and national control rates. PMID:23989679

  3. 38 CFR 77.2 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... appropriate individuals; and (3) Coordination, Paralympic classification of athletes, athlete assessment... grant under this part. International Paralympic Committee (IPC) means the global governing body of the Paralympic movement. Large-scale adaptive sports program means (1) An adaptive sports program of a National...

  4. A Large Scale, High Resolution Agent-Based Insurgency Model

    DTIC Science & Technology

    2013-09-30

    CUDA) is NVIDIA Corporation’s software development model for General Purpose Programming on Graphics Processing Units (GPGPU) ( NVIDIA Corporation ...Conference. Argonne National Laboratory, Argonne, IL, October, 2005. NVIDIA Corporation . NVIDIA CUDA Programming Guide 2.0 [Online]. NVIDIA Corporation

  5. Lessons Learned from the Everglades Collaborative Adaptive Management Program

    EPA Science Inventory

    Recent technical papers explore whether adaptive management (AM) is useful for environmental management and restoration efforts and discuss the many challenges to overcome for successful implementation, especially for large-scale restoration programs (McLain and Lee 1996; Levine ...

  6. Strategies for Energy Efficient Resource Management of Hybrid Programming Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Dong; Supinski, Bronis de; Schulz, Martin

    2013-01-01

    Many scientific applications are programmed using hybrid programming models that use both message-passing and shared-memory, due to the increasing prevalence of large-scale systems with multicore, multisocket nodes. Previous work has shown that energy efficiency can be improved using software-controlled execution schemes that consider both the programming model and the power-aware execution capabilities of the system. However, such approaches have focused on identifying optimal resource utilization for one programming model, either shared-memory or message-passing, in isolation. The potential solution space, thus the challenge, increases substantially when optimizing hybrid models since the possible resource configurations increase exponentially. Nonetheless, with the accelerating adoption of hybrid programming models, we increasingly need improved energy efficiency in hybrid parallel applications on large-scale systems. In this work, we present new software-controlled execution schemes that consider the effects of dynamic concurrency throttling (DCT) and dynamic voltage and frequency scaling (DVFS) in the context of hybrid programming models. Specifically, we present predictive models and novel algorithms based on statistical analysis that anticipate application power and time requirements under different concurrency and frequency configurations. We apply our models and methods to the NPB MZ benchmarks and selected applications from the ASC Sequoia codes. Overall, we achieve substantial energy savings (8.74% on average and up to 13.8%) with some performance gain (up to 7.5%) or negligible performance loss.
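    The trade-off that motivates DVFS control can be illustrated with a deliberately simplified analytic model. The record describes statistical prediction models; the closed-form runtime and power formulas below are generic textbook approximations with invented parameter values, not the authors' models.

```python
# Simplified DVFS sketch: runtime has a frequency-sensitive (CPU-bound) part
# and a frequency-insensitive (memory-bound) part; dynamic power scales
# roughly as f^3 (f * V^2 with V proportional to f). Values are illustrative.

def predict_energy(f, t_base, cpu_frac, p_dyn_max, p_static):
    """Predicted energy (J) at normalized frequency f in (0, 1]."""
    time = t_base * (cpu_frac / f + (1.0 - cpu_frac))  # only CPU part slows down
    power = p_static + p_dyn_max * f**3                # static + dynamic power
    return time * power

def best_frequency(freqs, **model):
    """Pick the available frequency with the lowest predicted energy."""
    return min(freqs, key=lambda f: predict_energy(f, **model))

freqs = [0.5, 0.625, 0.75, 0.875, 1.0]
model = dict(t_base=100.0, cpu_frac=0.8, p_dyn_max=60.0, p_static=40.0)
print(best_frequency(freqs, **model))  # 0.625
```

Note that the optimum is interior: a mostly memory-bound phase favors the lowest frequency, and as the CPU-bound fraction grows the optimum shifts upward, which is why per-phase prediction of time and power pays off.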

  7. Report on phase 1 of the Microprocessor Seminar. [and associated large scale integration

    NASA Technical Reports Server (NTRS)

    1977-01-01

    Proceedings of a seminar on microprocessors and associated large scale integrated (LSI) circuits are presented. The potential for commonality of device requirements, candidate processes and mechanisms for qualifying candidate LSI technologies for high reliability applications, and specifications for testing and testability were among the topics discussed. Various programs and tentative plans of the participating organizations in the development of high reliability LSI circuits are given.

  8. The dynamics and evolution of clusters of galaxies

    NASA Technical Reports Server (NTRS)

    Geller, Margaret; Huchra, John P.

    1987-01-01

    Research was undertaken to produce a coherent picture of the formation and evolution of large-scale structures in the universe. The program is divided into projects which examine four areas: the relationship between individual galaxies and their environment; the structure and evolution of individual rich clusters of galaxies; the nature of superclusters; and the large-scale distribution of individual galaxies. A brief review of results in each area is provided.

  9. NASA: Assessments of Selected Large-Scale Projects

    DTIC Science & Technology

    2011-03-01

    REPORT DATE MAR 2011 2. REPORT TYPE 3. DATES COVERED 00-00-2011 to 00-00-2011 4. TITLE AND SUBTITLE Assessments Of Selected Large-Scale Projects...Volatile EvolutioN MEP Mars Exploration Program MIB Mishap Investigation Board MMRTG Multi Mission Radioisotope Thermoelectric Generator MMS Magnetospheric...probes designed to explore the Martian surface, to satellites equipped with advanced sensors to study the earth , to telescopes intended to explore the

  10. Studies of land-cover, land-use, and biophysical properties of vegetation in the Large Scale Biosphere Atmosphere experiment in Amazonia.

    Treesearch

    Dar A. Roberts; Michael Keller; Joao Vianei Soares

    2003-01-01

    We summarize early research on land-cover, land-use, and biophysical properties of vegetation from the Large Scale Biosphere Atmosphere (LBA) experiment in Amazônia. LBA is an international research program developed to evaluate regional function and to determine how land-use and climate modify biological, chemical and physical processes there. Remote sensing has...

  11. The TeraShake Computational Platform for Large-Scale Earthquake Simulations

    NASA Astrophysics Data System (ADS)

    Cui, Yifeng; Olsen, Kim; Chourasia, Amit; Moore, Reagan; Maechling, Philip; Jordan, Thomas

    Geoscientific and computer science researchers with the Southern California Earthquake Center (SCEC) are conducting a large-scale, physics-based, computationally demanding earthquake system science research program with the goal of developing predictive models of earthquake processes. The computational demands of this program continue to increase rapidly as these researchers seek to perform physics-based numerical simulations of earthquake processes at ever-larger scales. To meet the needs of this research program, a multiple-institution team coordinated by SCEC has integrated several scientific codes into a numerical modeling-based research tool we call the TeraShake computational platform (TSCP). A central component in the TSCP is a highly scalable earthquake wave propagation simulation program called the TeraShake anelastic wave propagation (TS-AWP) code. In this chapter, we describe how we extended an existing, stand-alone, well-validated, finite-difference, anelastic wave propagation modeling code into the highly scalable and widely used TS-AWP and then integrated this code into the TeraShake computational platform that provides end-to-end (initialization to analysis) research capabilities. We also describe the techniques used to enhance the TS-AWP parallel performance on TeraGrid supercomputers, as well as the TeraShake simulation phases, including input preparation, run time, data archive management, and visualization. As a result of our efforts to improve its parallel efficiency, the TS-AWP has now shown highly efficient strong scaling on over 40K processors on IBM’s BlueGene/L Watson computer. In addition, the TSCP has developed into a computational system that is useful to many members of the SCEC community for performing large-scale earthquake simulations.

  12. An Evaluation of the Cost Effectiveness of Alternative Compensatory Reading Programs, Volume IV: Cost Analysis of Summer Programs. Final Report.

    ERIC Educational Resources Information Center

    Al-Salam, Nabeel; Flynn, Donald L.

    This report describes the results of a study of the cost and cost effectiveness of 27 summer reading programs, carried through as part of a large-scale evaluation of compensatory reading programs. Three other reports describe cost and cost-effectiveness studies of programs during the regular school year. On an instructional-hour basis, the total…

  13. The International Symposium on Applied Military Psychology (20th) Held on 25-29 June 1984 in Brussels, Belgium.

    DTIC Science & Technology

    1984-12-07

    and organization of psychological services, adjustment to military life and stress, organizational diagnosis and intervention, evaluation of new programs, and new emphases in large-scale research programs for the future.

  14. Can data from disparate long-term fish monitoring programs be used to increase our understanding of regional and continental trends in large river assemblages?

    USGS Publications Warehouse

    Counihan, Timothy D.; Waite, Ian R.; Casper, Andrew F.; Ward, David L.; Sauer, Jennifer S.; Irwin, Elise R.; Chapman, Colin G.; Ickes, Brian; Paukert, Craig P.; Kosovich, John J.; Bayer, Jennifer M.

    2018-01-01

    Understanding trends in the diverse resources provided by large rivers will help balance tradeoffs among stakeholders and inform strategies to mitigate the effects of landscape scale stressors such as climate change and invasive species. Absent a cohesive coordinated effort to assess trends in important large river resources, a logical starting point is to assess our ability to draw inferences from existing efforts. In this paper, we use a common analytical framework to analyze data from five disparate fish monitoring programs to better understand the nature of spatial and temporal trends in large river fish assemblages. We evaluated data from programs that monitor fishes in the Colorado, Columbia, Illinois, Mississippi, and Tallapoosa rivers using non-metric dimensional scaling ordinations and associated tests to evaluate trends in fish assemblage structure and native fish biodiversity. Our results indicate that fish assemblages exhibited significant spatial and temporal trends in all five of the rivers. We also document native species diversity trends that were variable within and between rivers and generally more evident in rivers with higher species richness and programs of longer duration. We discuss shared and basin-specific landscape level stressors. Having a basic understanding of the nature and extent of trends in fish assemblages is a necessary first step towards understanding factors affecting biodiversity and fisheries in large rivers.

  15. Can data from disparate long-term fish monitoring programs be used to increase our understanding of regional and continental trends in large river assemblages?

    PubMed Central

    Waite, Ian R.; Casper, Andrew F.; Ward, David L.; Sauer, Jennifer S.; Irwin, Elise R.; Chapman, Colin G.; Ickes, Brian S.; Paukert, Craig P.; Kosovich, John J.; Bayer, Jennifer M.

    2018-01-01

    Understanding trends in the diverse resources provided by large rivers will help balance tradeoffs among stakeholders and inform strategies to mitigate the effects of landscape scale stressors such as climate change and invasive species. Absent a cohesive coordinated effort to assess trends in important large river resources, a logical starting point is to assess our ability to draw inferences from existing efforts. In this paper, we use a common analytical framework to analyze data from five disparate fish monitoring programs to better understand the nature of spatial and temporal trends in large river fish assemblages. We evaluated data from programs that monitor fishes in the Colorado, Columbia, Illinois, Mississippi, and Tallapoosa rivers using non-metric dimensional scaling ordinations and associated tests to evaluate trends in fish assemblage structure and native fish biodiversity. Our results indicate that fish assemblages exhibited significant spatial and temporal trends in all five of the rivers. We also document native species diversity trends that were variable within and between rivers and generally more evident in rivers with higher species richness and programs of longer duration. We discuss shared and basin-specific landscape level stressors. Having a basic understanding of the nature and extent of trends in fish assemblages is a necessary first step towards understanding factors affecting biodiversity and fisheries in large rivers. PMID:29364953

  16. Teaching Real Science with a Microcomputer.

    ERIC Educational Resources Information Center

    Naiman, Adeline

    1983-01-01

    Discusses various ways science can be taught using microcomputers, including simulations/games which allow large-scale or historic experiments to be replicated on a manageable scale in a brief time. Examples of several computer programs are also presented, including "Experiments in Human Physiology," "Health Awareness…

  17. The Reach up Early Childhood Parenting Program: Origins, Content, and Implementation

    ERIC Educational Resources Information Center

    Walker, Susan P.; Chang, Susan M.; Smith, Joanne A.; Baker-Henningham, Helen

    2018-01-01

    Nurturing care in early childhood requires responsive interactions and opportunities to learn; however, there are few large-scale programs in low- and middle-income countries that support parents' ability to provide responsive care and activities that help children learn. The Reach Up training program was developed to increase capacity of…

  18. Laying a Solid Foundation: Strategies for Effective Program Replication

    ERIC Educational Resources Information Center

    Summerville, Geri

    2009-01-01

    The replication of proven social programs is a cost-effective and efficient way to achieve large-scale, positive social change. Yet there has been little guidance available about how to approach program replication and limited development of systems--at local, state or federal levels--to support replication efforts. "Laying a Solid Foundation:…

  19. Behavioral Effects Within and Between Individual and Group Reinforcement Procedures.

    ERIC Educational Resources Information Center

    Reese, Sandra C.; And Others

    This paper briefly outlines the outcomes of a large-scale behavioral program, Preparation through Responsive Educational Programs (PREP), involving students with academic and social deficits from a 1350-student junior high school. Overall program effectiveness was assessed by outcome criteria of total school grades, grades in non-PREP classes,…

  20. Evaluating Federal Social Programs: Finding out What Works and What Does Not

    ERIC Educational Resources Information Center

    Muhlhausen, David B.

    2012-01-01

    Federal social programs are rarely evaluated to determine whether they are actually accomplishing their intended purposes. As part of its obligation to spend taxpayers' dollars wisely, Congress should mandate that experimental evaluations of every federal social program be conducted. The evaluations should be large-scale, multisite studies to…

  1. Building the Case for Large Scale Behavioral Education Adoptions

    ERIC Educational Resources Information Center

    Layng, Zachary R.; Layng, T. V. Joe

    2012-01-01

    Behaviorally-designed educational programs are often based on a research tradition that is not widely understood by potential users of the programs. Though the data may be sound and the prediction of outcomes for individual learners quite good, those advocating adoption of behaviorally-designed educational programs may need to do more in order to…

  2. Afforestation may have little effect on hydrological cycle over the Three-North region of China

    NASA Astrophysics Data System (ADS)

    Meng, S.; Xie, X.

    2017-12-01

    Afforestation or reforestation is generally effective for improving environmental conditions, and it may have a substantial impact on the hydrological cycle by increasing rainfall interception and transpiration. To combat desertification and control dust storms, China has implemented several large-scale afforestation programs since the 1980s, including the world's most ambitious afforestation program, the Three-North Forest Shelterbelt (TNFS) program in the arid and semiarid land areas. This afforestation plan covers about 4 million km2 (> 42%) of the land area of China. Although the TNFS program has eased environmental problems in the region to some degree, the consequences of large-scale afforestation for the hydrological cycle are still controversial. To identify the impact of afforestation on the hydrological cycle at the regional scale, we employed a large-scale hydrological model, the Variable Infiltration Capacity (VIC) model, and satellite remote sensing data sets, i.e., leaf area index (LAI) from the Moderate Resolution Imaging Spectroradiometer (MODIS) and Global LAnd Surface Satellite (GLASS). The VIC model was forced with long-term dynamic LAI and gridded atmospheric data. We focused on the period 2000-2015, when fewer afforestation activities were implemented and the vegetation in the Three-North region was in a steady growth stage. The results show that, despite spatial heterogeneity, growing-season LAI exhibits a slight increase across the Three-North region, reflecting vegetation growth due to the afforestation program. Evapotranspiration (ET) increased at a rate of 3.93 mm/yr over the whole region from 2000 to 2015. The spatial pattern of ET is consistent with the changes in LAI and precipitation, but this does not mean vegetation growth contributed equally. Based on factor-distinguishing simulations, we found that precipitation change has a more significant influence on the hydrological cycle than vegetation growth. Therefore, the afforestation practices are influential at the small-catchment scale, but at the regional scale they may have little effect on the hydrological cycle. For sustainable water resource management, we should pay special attention to climate change rather than to the afforestation efforts.
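    A change rate like the reported 3.93 mm/yr for ET is simply the slope of a least-squares linear trend fitted to the annual series. A minimal sketch with an invented series (not the study's data):

```python
import numpy as np

# Synthetic regional-mean annual ET series (mm), for illustration only:
# a linear trend of ~3.93 mm/yr plus random year-to-year noise.
years = np.arange(2000, 2016)
rng = np.random.default_rng(42)
et = 520.0 + 3.93 * (years - 2000) + rng.normal(0.0, 5.0, years.size)

# Least-squares linear fit; the slope is the trend rate in mm/yr.
slope, intercept = np.polyfit(years, et, deg=1)
print(f"ET trend: {slope:.2f} mm/yr")
```

    The fitted slope recovers the built-in rate to within the noise, which is how a per-year change rate is typically reported from a 16-year series.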

  3. High Quantum Efficiency OLED Lighting Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shiang, Joseph

    The overall goal of the program was to apply improvements in light outcoupling technology to a practical large-area plastic luminaire, and thus enable the product vision of an extremely thin form factor, high-efficiency, large-area light source. The target substrate was plastic, and the baseline device was operating at 35 LPW at the start of the program. The program targeted a >2x improvement in LPW efficacy, and the overall amount of light to be delivered was relatively high, 900 lumens. Despite the extremely difficult challenges associated with scaling up a wet solution process on plastic substrates, the program was able to make substantial progress. A small-molecule wet solution process was successfully implemented on plastic substrates with almost no loss in efficiency in transitioning from laboratory-scale glass to large-area plastic substrates. By transitioning to a small-molecule based process, the LPW entitlement increased from 35 LPW to 60 LPW. A further 10% improvement in outcoupling efficiency was demonstrated via the use of a highly reflecting cathode, which reduced absorptive loss in the OLED device. The calculated potential improvement in some cases is even larger, ~30%, and thus there is considerable room for optimism in improving the net light coupling efficacy, provided absorptive loss mechanisms are eliminated. Further improvements are possible if scattering schemes such as the silver nanowire based hard coat structure are fully developed. The wet coating processes were successfully scaled to large-area plastic substrates and resulted in the construction of a 900-lumen luminaire device.

  4. Transcriptional analysis of product-concentration driven changes in cellular programs of recombinant Clostridium acetobutylicum strains.

    PubMed

    Tummala, Seshu B; Junne, Stefan G; Paredes, Carlos J; Papoutsakis, Eleftherios T

    2003-12-30

    Antisense RNA (asRNA) downregulation alters protein expression without changing the regulation of gene expression. Downregulation of primary metabolic enzymes, possibly combined with overexpression of other metabolic enzymes, may result in profound changes in product formation, and this may alter the large-scale transcriptional program of the cells. DNA-array based large-scale transcriptional analysis has the potential to elucidate factors that control cellular fluxes even in the absence of proteome data. These themes are explored in the study of the large-scale transcriptional programs and the in vivo primary-metabolism fluxes of several related recombinant C. acetobutylicum strains: C. acetobutylicum ATCC 824(pSOS95del) (plasmid control; produces high levels of butanol and acetone), 824(pCTFB1AS) (expresses antisense RNA against CoA transferase (ctfb1-asRNA); produces very low levels of butanol and acetone), and 824(pAADB1) (expresses ctfb1-asRNA and the alcohol-aldehyde dehydrogenase gene (aad); produces high alcohol and low acetone levels). DNA-array based transcriptional analysis revealed that the large changes in product concentrations (and notably butanol concentration) due to ctfb1-asRNA expression, alone and in combination with aad overexpression, resulted in dramatic changes of the cellular transcriptome. Cluster analysis and gene expression patterns of established and putative operons involved in stress response, motility, sporulation, and fatty-acid biosynthesis indicate that these simple genetic changes dramatically alter the cellular programs of C. acetobutylicum. Comparison of gene expression and flux analysis data may point to possible flux-controlling steps and suggest unknown regulatory mechanisms. Copyright 2003 Wiley Periodicals, Inc.

  5. Controlling Guessing Bias in the Dichotomous Rasch Model Applied to a Large-Scale, Vertically Scaled Testing Program

    ERIC Educational Resources Information Center

    Andrich, David; Marais, Ida; Humphry, Stephen Mark

    2016-01-01

    Recent research has shown how the statistical bias in Rasch model difficulty estimates induced by guessing in multiple-choice items can be eliminated. Using vertical scaling of a high-profile national reading test, it is shown that the dominant effect of removing such bias is a nonlinear change in the unit of scale across the continuum. The…
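    For reference, the dichotomous Rasch model gives the probability of a correct response as P = exp(θ − b) / (1 + exp(θ − b)) for person ability θ and item difficulty b. The sketch below uses hypothetical values, and the guessing floor c is a 3PL-style illustration of the bias mechanism, not the authors' correction method:

```python
import math

def rasch_p(theta: float, b: float) -> float:
    """Dichotomous Rasch model: probability of a correct response
    for person ability theta and item difficulty b (in logits)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# When ability equals difficulty, the success probability is 0.5.
print(rasch_p(0.0, 0.0))  # 0.5

def with_guessing(theta: float, b: float, c: float = 0.25) -> float:
    """Hypothetical guessing floor c (e.g., 1-in-4 multiple choice):
    low-ability examinees succeed more often than the Rasch model
    predicts, which makes items look easier than they are."""
    return c + (1.0 - c) * rasch_p(theta, b)
```

    Because guessing inflates success rates at the low end of the continuum, difficulty estimates from a pure Rasch fit are biased, and removing that bias changes the unit of scale nonlinearly along the continuum, as the abstract describes.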

  6. Building an Effective and Affordable K-12 Geoscience Outreach Program from the Ground Up: A Simple Model for Universities

    ERIC Educational Resources Information Center

    Dahl, Robyn Mieko; Droser, Mary L.

    2016-01-01

    University earth science departments seeking to establish meaningful geoscience outreach programs often pursue large-scale, grant-funded programs. Although this type of outreach is highly successful, it is also extremely costly, and grant funding can be difficult to secure. Here, we present the Geoscience Education Outreach Program (GEOP), a…

  7. Becoming a Leader along the Way: Embedding Leadership Training into a Large-Scale Peer-Learning Program in the STEM Disciplines

    ERIC Educational Resources Information Center

    Micari, Marina; Gould, Amy Knife; Lainez, Louie

    2010-01-01

    Although many college students enter leadership programs with the express goal of developing leadership skills, some specialized leadership programs draw students who seek to gain expertise in a disciplinary area, with leadership development as a secondary goal. In the latter case, program developers face the challenge of generating enthusiasm…

  8. NASA/FAA general aviation crash dynamics program - An update

    NASA Technical Reports Server (NTRS)

    Hayduk, R. J.; Thomson, R. G.; Carden, H. D.

    1979-01-01

    Work in progress in the NASA/FAA General Aviation Crash Dynamics Program for the development of technology for increased crash-worthiness and occupant survivability of general aviation aircraft is presented. Full-scale crash testing facilities and procedures are outlined, and a chronological summary of full-scale tests conducted and planned is presented. The Plastic and Large Deflection Analysis of Nonlinear Structures and Modified Seat Occupant Model for Light Aircraft computer programs which form part of the effort to predict nonlinear geometric and material behavior of sheet-stringer aircraft structures subjected to large deformations are described, and excellent agreement between simulations and experiments is noted. The development of structural concepts to attenuate the load transmitted to the passenger through the seats and subfloor structure is discussed, and an apparatus built to test emergency locator transmitters in a realistic environment is presented.

  9. The fastclime Package for Linear Programming and Large-Scale Precision Matrix Estimation in R.

    PubMed

    Pang, Haotian; Liu, Han; Vanderbei, Robert

    2014-02-01

    We develop an R package, fastclime, for solving a family of regularized linear programming (LP) problems. Our package efficiently implements the parametric simplex algorithm, which provides a scalable and sophisticated tool for solving large-scale linear programs. As an illustrative example, one use of our LP solver is to implement an important sparse precision matrix estimation method called CLIME (Constrained L1-Minimization Estimator). Compared with existing packages for this problem such as clime and flare, our package has three advantages: (1) it efficiently calculates the full piecewise-linear regularization path; (2) it provides an accurate dual certificate as a stopping criterion; (3) it is completely coded in C and is highly portable. This package is designed to be useful to statisticians and machine learning researchers for solving a wide range of problems.
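    fastclime itself is an R package, but the class of problems it targets can be illustrated with SciPy's generic LP solver (not the parametric simplex method the package implements; the numbers below are invented). CLIME's constrained L1 minimization reduces to LPs of this same general shape:

```python
import numpy as np
from scipy.optimize import linprog

# Tiny LP in standard inequality form:
#   minimize c @ x  subject to  A_ub @ x <= b_ub,  x >= 0
c = np.array([-1.0, -2.0])            # i.e., maximize x1 + 2*x2
A_ub = np.array([[1.0, 1.0],
                 [1.0, 3.0]])
b_ub = np.array([4.0, 6.0])

res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(0, None), (0, None)], method="highs")
print(res.x)  # optimal vertex (x1 = 3, x2 = 1)
```

    A parametric simplex solver, as in fastclime, additionally traces the whole solution path as a regularization parameter varies, rather than solving one LP at a time.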

  10. Effect Of A Large-Scale Social Franchising And Telemedicine Program On Childhood Diarrhea And Pneumonia Outcomes In India.

    PubMed

    Mohanan, Manoj; Babiarz, Kimberly S; Goldhaber-Fiebert, Jeremy D; Miller, Grant; Vera-Hernández, Marcos

    2016-10-01

    Despite the rapid growth of social franchising, there is little evidence on its population impact in the health sector. Similar in many ways to private-sector commercial franchising, social franchising can be found in sectors with a social objective, such as health care. This article evaluates the World Health Partners (WHP) Sky program, a large-scale social franchising and telemedicine program in Bihar, India. We studied appropriate treatment for childhood diarrhea and pneumonia and associated health care outcomes. We used multivariate difference-in-differences models to analyze data on 67,950 children ages five and under in 2011 and 2014. We found that the WHP-Sky program did not improve rates of appropriate treatment or disease prevalence. Both provider participation and service use among target populations were low. Our results do not imply that social franchising cannot succeed; instead, they underscore the importance of understanding factors that explain variation in the performance of social franchises. Our findings also highlight, for donors and governments in particular, the importance of conducting rigorous impact evaluations of new and potentially innovative health care delivery programs before investing in scaling them up. Published by Project HOPE—The People-to-People Health Foundation, Inc.
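    The difference-in-differences logic behind the evaluation can be shown with group means; the numbers below are invented for illustration, not the study's estimates:

```python
import numpy as np

# Mean outcome (e.g., share of children appropriately treated)
# as [pre, post] for each group; values are made up.
treated = np.array([0.30, 0.34])   # WHP-Sky program areas
control = np.array([0.31, 0.36])   # comparison areas

# DiD estimate: change in treated minus change in control.
# A value near zero is consistent with "no program effect".
did = (treated[1] - treated[0]) - (control[1] - control[0])
print(round(did, 3))  # -0.01
```

    The multivariate models in the study follow the same contrast while adjusting for covariates; the key identifying assumption is that the two groups would have followed parallel trends absent the program.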

  11. Poverty-alleviation program participation and salivary cortisol in very low-income children.

    PubMed

    Fernald, Lia C H; Gunnar, Megan R

    2009-06-01

    Correlational studies have shown associations between social class and salivary cortisol suggestive of a causal link between childhood poverty and activity of the stress-sensitive hypothalamic-pituitary-adrenocortical (HPA) system. Using a quasi-experimental design, we evaluated the associations between a family's participation in a large-scale conditional cash transfer program in Mexico (Oportunidades, formerly Progresa) during the child's early years of life and children's salivary cortisol (baseline and responsivity). We also examined whether maternal depressive symptoms moderated the effect of program participation. Low-income households (income <20th percentile nationally) from rural Mexico were enrolled in a large-scale poverty-alleviation program between 1998 and 1999. A comparison group of households from demographically similar communities was recruited in 2003. Following 3.5 years of participation in the Oportunidades program, three saliva samples were obtained from children aged 2-6 years from intervention and comparison households (n=1197). Maternal depressive symptoms were assessed using the Center for Epidemiologic Studies-Depression Scale (CES-D). Children who had been in the Oportunidades program had lower salivary cortisol levels than those who had not participated, controlling for a wide range of individual-, household- and community-level variables. Reactivity patterns of salivary cortisol did not differ between intervention and comparison children. Maternal depression moderated the association between Oportunidades program participation and baseline salivary cortisol in children. Specifically, there was a large and significant Oportunidades program effect of lowering cortisol in children of mothers with high depressive symptoms, but not in children of mothers with low depressive symptomatology. These findings provide the strongest evidence to date that the economic circumstances of a family can influence a child's developing stress system and provide a mechanism through which poverty early in life could alter life-course risk for physical and mental health disorders.

  12. The place of algae in agriculture: policies for algal biomass production.

    PubMed

    Trentacoste, Emily M; Martinez, Alice M; Zenk, Tim

    2015-03-01

    Algae have been used for food and nutraceuticals for thousands of years, and the large-scale cultivation of algae, or algaculture, has existed for over half a century. More recently algae have been identified and developed as renewable fuel sources, and the cultivation of algal biomass for various products is transitioning to commercial-scale systems. It is crucial during this period that institutional frameworks (i.e., policies) support and promote development and commercialization and anticipate and stimulate the evolution of the algal biomass industry as a source of renewable fuels, high value protein and carbohydrates and low-cost drugs. Large-scale cultivation of algae merges the fundamental aspects of traditional agricultural farming and aquaculture. Despite this overlap, algaculture has not yet been afforded a position within agriculture or the benefits associated with it. Various federal and state agricultural support and assistance programs are currently appropriated for crops, but their extension to algal biomass is uncertain. These programs are essential for nascent industries to encourage investment, build infrastructure, disseminate technical experience and information, and create markets. This review describes the potential agricultural policies and programs that could support algal biomass cultivation, and the barriers to the expansion of these programs to algae.

  13. Presenting an Approach for Conducting Knowledge Architecture within Large-Scale Organizations

    PubMed Central

    Varaee, Touraj; Habibi, Jafar; Mohaghar, Ali

    2015-01-01

    Knowledge architecture (KA) establishes the basic groundwork for the successful implementation of a short-term or long-term knowledge management (KM) program. An example of KA is the design of a prototype before a new vehicle is manufactured. Due to a transformation to large-scale organizations, the traditional architecture of organizations is undergoing fundamental changes. This paper explores the main strengths and weaknesses in the field of KA within large-scale organizations and provides a suitable methodology and supervising framework to overcome specific limitations. This objective was achieved by applying and updating the concepts from the Zachman information architectural framework and the information architectural methodology of enterprise architecture planning (EAP). The proposed solution may be beneficial for architects in knowledge-related areas to successfully accomplish KM within large-scale organizations. The research method is descriptive; its validity is confirmed by performing a case study and polling the opinions of KA experts. PMID:25993414

  14. Presenting an Approach for Conducting Knowledge Architecture within Large-Scale Organizations.

    PubMed

    Varaee, Touraj; Habibi, Jafar; Mohaghar, Ali

    2015-01-01

    Knowledge architecture (KA) establishes the basic groundwork for the successful implementation of a short-term or long-term knowledge management (KM) program. An example of KA is the design of a prototype before a new vehicle is manufactured. Due to a transformation to large-scale organizations, the traditional architecture of organizations is undergoing fundamental changes. This paper explores the main strengths and weaknesses in the field of KA within large-scale organizations and provides a suitable methodology and supervising framework to overcome specific limitations. This objective was achieved by applying and updating the concepts from the Zachman information architectural framework and the information architectural methodology of enterprise architecture planning (EAP). The proposed solution may be beneficial for architects in knowledge-related areas to successfully accomplish KM within large-scale organizations. The research method is descriptive; its validity is confirmed by performing a case study and polling the opinions of KA experts.

  15. The epistemic culture in an online citizen science project: Programs, antiprograms and epistemic subjects.

    PubMed

    Kasperowski, Dick; Hillman, Thomas

    2018-05-01

    In the past decade, some areas of science have begun turning to masses of online volunteers through open calls for generating and classifying very large sets of data. The purpose of this study is to investigate the epistemic culture of a large-scale online citizen science project, the Galaxy Zoo, that turns to volunteers for the classification of images of galaxies. For this task, we chose to apply the concepts of programs and antiprograms to examine the 'essential tensions' that arise in relation to the mobilizing values of a citizen science project and the epistemic subjects and cultures that are enacted by its volunteers. Our premise is that these tensions reveal central features of the epistemic subjects and distributed cognition of epistemic cultures in these large-scale citizen science projects.

  16. Centrifuge impact cratering experiments: Scaling laws for non-porous targets

    NASA Technical Reports Server (NTRS)

    Schmidt, Robert M.

    1987-01-01

    This research is a continuation of an ongoing program whose objective is to perform experiments and to develop scaling relationships for large body impacts onto planetary surfaces. The development of the centrifuge technique has been pioneered by the present investigator and is used to provide experimental data for actual target materials of interest. With both powder and gas guns mounted on a rotor arm, it is possible to match various dimensionless similarity parameters, which have been shown to govern the behavior of large scale impacts. Current work is directed toward the determination of scaling estimates for nonporous targets. The results are presented in summary form.

  17. THE NATIONAL COASTAL ASSESSMENT: EPA TECHNOLOGY TRANSFER TO STRATEGIC PARTNERS

    EPA Science Inventory

    The National Coastal Assessment (NCA) is a large-scale, comprehensive environmental monitoring program designed to characterize the ecological condition of the Nation's coastal resources (estuaries and near shore waters). A key to the success of the program is the development of ...

  18. UPDATE ON THE MARINA STUDY ON LAKE TEXOMA

    EPA Science Inventory

    The National Risk Management Research Laboratory (NRMRL) has instituted a program for Risk Management Research for Ecosystem Restoration in Watersheds. As part of this program a large scale project was initiated on Lake Texoma and the surrounding watershed to evaluate the assimi...

  19. The Challenge: Overcoming the Pitfalls.

    ERIC Educational Resources Information Center

    Lozier, G. Gregory; Teeter, Deborah J.

    1993-01-01

    Some organizations are having difficulty with the Total Quality Management (TQM) approach. Problems appear to come from reliance on prepackaged TQM programs, large-scale, diffuse implementation, mass training programs, measurement paralysis, overemphasis on tools, process selection, outmoded reward structures, and simplistic views of change and…

  20. Impact of Large Scale Energy Efficiency Programs On Consumer Tariffs and Utility Finances in India

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abhyankar, Nikit; Phadke, Amol

    2011-01-20

    Large-scale EE programs would modestly increase tariffs but reduce consumers' electricity bills significantly. However, the primary benefit of EE programs is a significant reduction in power shortages, which might make these programs politically acceptable even if tariffs increase. To increase political support, utilities could pursue programs that would result in minimal tariff increases. This can be achieved in four ways: (a) focus only on low-cost programs (such as replacing electric water heaters with gas water heaters); (b) sell power conserved through the EE program to the market at a price higher than the cost of peak power purchase; (c) focus on programs where a partial utility subsidy of incremental capital cost might work; and (d) increase the number of participating consumers by offering a basket of EE programs to fit all consumer subcategories and tariff tiers. Large-scale EE programs can result in consistently negative cash flows and significantly erode the utility's overall profitability. In case the utility is facing shortages, the cash flow is very sensitive to the marginal tariff of the unmet demand. This will have an important bearing on the choice of EE programs in Indian states where low-paying rural and agricultural consumers form the majority of the unmet demand. These findings clearly call for a flexible, sustainable solution to the cash-flow management issue. One option is to include a mechanism like FAC in the utility incentive mechanism. Another sustainable solution might be to build the net program cost and revenue loss into the utility's revenue requirement, and thus into consumer tariffs, up front. However, the latter approach requires institutionalization of EE as a resource. The utility incentive mechanisms would be able to address the utility's disincentive of forgone long-run return but would have a minor impact on consumer benefits. Fundamentally, providing incentives for EE programs to make them comparable to supply-side investments is a way of moving the electricity sector toward a model focused on providing energy services rather than providing electricity.

  1. Condition Monitoring of Large-Scale Facilities

    NASA Technical Reports Server (NTRS)

    Hall, David L.

    1999-01-01

    This document provides a summary of the research conducted for the NASA Ames Research Center under grant NAG2-1182 (Condition-Based Monitoring of Large-Scale Facilities). The information includes copies of view graphs presented at NASA Ames in the final Workshop (held during December of 1998), as well as a copy of a technical report provided to the COTR (Dr. Anne Patterson-Hine) subsequent to the workshop. The material describes the experimental design, collection of data, and analysis results associated with monitoring the health of large-scale facilities. In addition to this material, a copy of the Pennsylvania State University Applied Research Laboratory data fusion visual programming tool kit was also provided to NASA Ames researchers.

  2. Large-Scale Biomonitoring of Remote and Threatened Ecosystems via High-Throughput Sequencing

    PubMed Central

    Gibson, Joel F.; Shokralla, Shadi; Curry, Colin; Baird, Donald J.; Monk, Wendy A.; King, Ian; Hajibabaei, Mehrdad

    2015-01-01

    Biodiversity metrics are critical for assessment and monitoring of ecosystems threatened by anthropogenic stressors. Existing sorting and identification methods are too expensive and labour-intensive to be scaled up to meet management needs. Alternately, a high-throughput DNA sequencing approach could be used to determine biodiversity metrics from bulk environmental samples collected as part of a large-scale biomonitoring program. Here we show that both morphological and DNA sequence-based analyses are suitable for recovery of individual taxonomic richness, estimation of proportional abundance, and calculation of biodiversity metrics using a set of 24 benthic samples collected in the Peace-Athabasca Delta region of Canada. The high-throughput sequencing approach was able to recover all metrics with a higher degree of taxonomic resolution than morphological analysis. The reduced cost and increased capacity of DNA sequence-based approaches will finally allow environmental monitoring programs to operate at the geographical and temporal scale required by industrial and regulatory end-users. PMID:26488407
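    The biodiversity metrics mentioned above (taxonomic richness, proportional abundance, diversity indices) are straightforward to compute once taxon counts are recovered, whether by morphological sorting or sequence identification. A sketch with invented counts, using the standard Shannon index:

```python
import math

# Taxon counts for one hypothetical benthic sample (illustration only;
# family names are examples, not the study's data).
counts = {"Chironomidae": 120, "Baetidae": 45,
          "Hydropsychidae": 30, "Elmidae": 5}

richness = len(counts)                 # number of taxa observed
total = sum(counts.values())

# Shannon diversity H' = -sum(p_i * ln p_i) over proportional
# abundances p_i; higher values mean more, and more even, taxa.
shannon = -sum((n / total) * math.log(n / total)
               for n in counts.values())

print(richness)             # 4
print(round(shannon, 3))    # 1.019
```

    Because sequence-based identification typically resolves more taxa per sample than morphology, richness and H' computed this way tend to be estimated at a finer taxonomic resolution, which is the advantage the abstract reports.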

  3. State Demolition Information

    EPA Pesticide Factsheets

    Contact information and guidance for each state's and selected territories' environmental agencies and programs relevant to large-scale residential demolition, including asbestos, lead, and open burning.

  4. Creating a gold medal Olympic and Paralympics health care team: a satisfaction survey of the mobile medical unit/polyclinic team training for the Vancouver 2010 winter games

    PubMed Central

    2013-01-01

    Background: The mobile medical unit/polyclinic (MMU/PC) was an essential part of the medical services supporting ill or injured members of the Olympic or Paralympic family during the 2010 Olympic and Paralympic Winter Games. The objective of this study was to survey the satisfaction of the clinical staff who completed the training programs prior to deployment to the MMU. Methods: Medical personnel who participated in at least one of the four training programs, including (1) weekend sessions; (2) web-based modules; (3) just-in-time training; and (4) daily simulation exercises, were invited to participate in a web-based survey and comment on their level of satisfaction with the training program. Results: A total of 64 (out of 94 who were invited) physicians, nurses and respiratory therapists completed the survey. All participants reported favorably that the MMU/PC training positively impacted their knowledge, skills and team functions while deployed at the MMU/PC during the 2010 Olympic Games. However, components of the training program were valued differently depending on clinical job title, years of experience, and prior experience in large-scale events. Respondents with little or no experience working in large-scale events (45%) rated daily simulations as the most valuable component of the training program for strengthening competencies and knowledge in clinical skills for working in large-scale events. Conclusion: The multi-phase MMU/PC training was found to be beneficial for preparing the medical team for the 2010 Winter Games. In particular, this survey demonstrates the effectiveness of simulation training programs on teamwork competencies in ad hoc groups. PMID:24225074

  5. The assessment of the readiness of five countries to implement child maltreatment prevention programs on a large scale.

    PubMed

    Mikton, Christopher; Power, Mick; Raleva, Marija; Makoae, Mokhantso; Al Eissa, Majid; Cheah, Irene; Cardia, Nancy; Choo, Claire; Almuneef, Maha

    2013-12-01

    This study aimed to systematically assess the readiness of five countries - Brazil, the Former Yugoslav Republic of Macedonia, Malaysia, Saudi Arabia, and South Africa - to implement evidence-based child maltreatment prevention programs on a large scale. To this end, it applied a recently developed method called Readiness Assessment for the Prevention of Child Maltreatment, based on two parallel 100-item instruments. The first measures the knowledge, attitudes, and beliefs concerning child maltreatment prevention of key informants; the second, completed by child maltreatment prevention experts using all available data in the country, produces a more objective assessment of readiness. The instruments cover all of the main aspects of readiness including, for instance, availability of scientific data on the problem, legislation and policies, will to address the problem, and material resources. Key informant scores ranged from 31.2 (Brazil) to 45.8/100 (the Former Yugoslav Republic of Macedonia), and expert scores from 35.2 (Brazil) to 56/100 (Malaysia). Major gaps identified in almost all countries included a lack of professionals with the skills, knowledge, and expertise to implement evidence-based child maltreatment programs and of institutions to train them; inadequate funding, infrastructure, and equipment; extreme rarity of outcome evaluations of prevention programs; and lack of national prevalence surveys of child maltreatment. In sum, the five countries are in a low to moderate state of readiness to implement evidence-based child maltreatment prevention programs on a large scale. Such an assessment of readiness - the first of its kind - allows gaps to be identified and then addressed to increase the likelihood of program success. Copyright © 2013 Elsevier Ltd. All rights reserved.

  6. Large-scale academic achievement testing of deaf and hard-of-hearing students: past, present, and future.

    PubMed

    Qi, Sen; Mitchell, Ross E

    2012-01-01

The first large-scale, nationwide academic achievement testing program using the Stanford Achievement Test (Stanford) for deaf and hard-of-hearing children in the United States started in 1969. Over the past three decades, the Stanford has served as a benchmark in the field of deaf education for assessing student academic achievement. However, the validity and reliability of using the Stanford for this special student population still require extensive scrutiny. Recent shifts in the educational policy environment, which require that schools enable all children to achieve proficiency through accountability testing, warrant a close examination of the adequacy and relevance of the current large-scale testing of deaf and hard-of-hearing students. This study has three objectives: (a) it will summarize the historical data over the last three decades to indicate trends in academic achievement for this special population, (b) it will analyze the current federal laws and regulations related to educational testing and special education, thereby identifying gaps between policy and practice in the field, especially identifying the limitations of current testing programs in assessing what deaf and hard-of-hearing students know, and (c) it will offer some insights and suggestions for future testing programs for deaf and hard-of-hearing students.

  7. Beyond Widgets -- Systems Incentive Programs for Utilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Regnier, Cindy; Mathew, Paul; Robinson, Alastair

Utility incentive programs remain one of the most significant means of deploying commercialized, but underutilized building technologies to scale. However, these programs have been largely limited to component-based products (e.g., lamps, RTUs). While some utilities do provide ‘custom’ incentive programs with whole building and system level technical assistance, these programs require deeper levels of analysis, resulting in higher program costs. This results in custom programs being restricted to utilities with greater resources and typically applied mainly to large or energy-intensive facilities, leaving much of the market without cost-effective access and incentives for these solutions. In addition, with increasingly stringent energy codes, cost-effective component-based solutions that achieve significant savings are dwindling. Building systems (e.g., integrated façade, HVAC and/or lighting solutions) can deliver higher savings that translate into large sector-wide savings if deployed at the scale of these programs. However, systems application poses a number of challenges – baseline energy use must be defined and measured; the metrics for energy and performance must be defined and tested against; in addition, system savings must be validated under well understood conditions. This paper presents a sample of findings of a project to develop validated utility incentive program packages for three specific integrated building systems, in collaboration with Xcel Energy (CO, MN), ComEd, and a consortium of California Public Owned Utilities (CA POUs) (Northern California Power Agency (NCPA) and the Southern California Public Power Authority (SCPPA)). Furthermore, these program packages consist of system specifications, system performance, M&V protocols, streamlined assessment methods, market assessment and implementation guidance.

  8. Support Vector Machines Trained with Evolutionary Algorithms Employing Kernel Adatron for Large Scale Classification of Protein Structures.

    PubMed

    Arana-Daniel, Nancy; Gallegos, Alberto A; López-Franco, Carlos; Alanís, Alma Y; Morales, Jacob; López-Franco, Adriana

    2016-01-01

With the increasing power of computers, the amount of data that can be processed in small periods of time has grown exponentially, as has the importance of classifying large-scale data efficiently. Support vector machines have shown good results classifying large amounts of high-dimensional data, such as data generated by protein structure prediction, spam recognition, medical diagnosis, optical character recognition, and text classification. Most state-of-the-art approaches for large-scale learning use traditional optimization methods, such as quadratic programming or gradient descent, which makes the use of evolutionary algorithms for training support vector machines an area to be explored. The present paper proposes a simple-to-implement approach based on evolutionary algorithms and Kernel-Adatron for solving large-scale classification problems, focusing on protein structure prediction. The functional properties of proteins depend upon their three-dimensional structures. Knowing the structures of proteins is crucial for biology and can lead to improvements in areas such as medicine, agriculture and biofuels.
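    The Kernel-Adatron update rule at the core of the proposed approach is compact enough to sketch. The following minimal Python illustration uses the classic iterative update with an RBF kernel; the toy data, kernel width, and learning rate are invented for the example, and the paper's contribution is to evolve the dual coefficients with evolutionary algorithms rather than this plain gradient-style update:

```python
import math

def rbf(a, b, gamma=0.5):
    """Gaussian (RBF) kernel between two points."""
    return math.exp(-gamma * sum((x - y) ** 2 for x, y in zip(a, b)))

def kernel_adatron(X, y, eta=0.5, epochs=100):
    """Classic Kernel-Adatron: push each example's margin toward 1 by
    adjusting its nonnegative dual coefficient alpha_i."""
    n = len(X)
    K = [[rbf(X[i], X[j]) for j in range(n)] for i in range(n)]
    alpha = [1.0] * n
    for _ in range(epochs):
        for i in range(n):
            z = sum(alpha[j] * y[j] * K[i][j] for j in range(n))
            alpha[i] = max(0.0, alpha[i] + eta * (1.0 - y[i] * z))
    return alpha

def predict(X, y, alpha, x):
    """Sign of the kernel expansion gives the predicted class."""
    s = sum(alpha[j] * y[j] * rbf(X[j], x) for j in range(len(X)))
    return 1 if s >= 0 else -1

X = [(0, 0), (0, 1), (3, 3), (3, 4)]  # toy points, two well-separated groups
y = [-1, -1, 1, 1]
alpha = kernel_adatron(X, y)
print([predict(X, y, alpha, x) for x in X])  # → [-1, -1, 1, 1]
```

    An evolutionary variant would keep the same kernel expansion but search the alpha vector with mutation and selection instead of the iterative update shown here.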

  9. Pain Neurophysiology Education and Therapeutic Exercise for Patients With Chronic Low Back Pain: A Single-Blind Randomized Controlled Trial.

    PubMed

    Bodes Pardo, Gema; Lluch Girbés, Enrique; Roussel, Nathalie A; Gallego Izquierdo, Tomás; Jiménez Penick, Virginia; Pecos Martín, Daniel

    2018-02-01

To assess the effect of a pain neurophysiology education (PNE) program plus therapeutic exercise (TE) for patients with chronic low back pain (CLBP). Single-blind randomized controlled trial. Private clinic and university. Patients with CLBP for ≥6 months (N=56). Participants were randomized to receive either a TE program consisting of motor control, stretching, and aerobic exercises (n=28) or the same TE program in addition to a PNE program (n=28), conducted in two 30- to 50-minute sessions in groups of 4 to 6 participants. The primary outcome was pain intensity rated on the numerical pain rating scale, which was completed immediately after treatment and at 1- and 3-month follow-up. Secondary outcome measures were pressure pain threshold, finger-to-floor distance, Roland-Morris Disability Questionnaire, Pain Catastrophizing Scale, Tampa Scale for Kinesiophobia, and Patient Global Impression of Change. At 3-month follow-up, a large change in pain intensity (numerical pain rating scale: -2.2; -2.93 to -1.28; P<.001; d=1.37) was observed for the PNE plus TE group, and a moderate effect size was observed for the secondary outcome measures. Combining PNE with TE resulted in significantly better results for participants with CLBP, with a large effect size, compared with TE alone. Copyright © 2017 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
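    The reported effect size d = 1.37 is a standardized mean difference. A minimal sketch of the usual pooled-standard-deviation computation follows; the group means and SDs below are hypothetical placeholders, not the trial's data:

```python
import math

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    """Standardized mean difference with a pooled standard deviation."""
    pooled_var = ((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2) / (n1 + n2 - 2)
    return (mean1 - mean2) / math.sqrt(pooled_var)

# Hypothetical pain-score changes (0-10 scale), n = 28 per arm as in the trial:
d = cohens_d(-3.0, 1.6, 28, -0.8, 1.6, 28)
print(d)  # |d| > 0.8 counts as a large effect by Cohen's conventions
```

    The sign is negative here because the first (hypothetical) group shows the larger pain reduction; only the magnitude is compared against Cohen's thresholds.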

  10. Status of DSMT research program

    NASA Technical Reports Server (NTRS)

    Mcgowan, Paul E.; Javeed, Mehzad; Edighoffer, Harold H.

    1991-01-01

The status of the Dynamic Scale Model Technology (DSMT) research program is presented. DSMT is developing scale model technology for large space structures as part of the Control Structure Interaction (CSI) program at NASA Langley Research Center (LaRC). Under DSMT a hybrid-scale structural dynamics model of Space Station Freedom was developed. Space Station Freedom was selected as the focus structure for DSMT since the station represents the first opportunity to obtain flight data on a complex, three-dimensional space structure. Included is an overview of DSMT including the development of the space station scale model and the resulting hardware. Scaling technology was developed for this model to achieve a ground test article which existing test facilities can accommodate while employing realistically scaled hardware. The model was designed and fabricated by the Lockheed Missiles and Space Co., and was assembled at LaRC for dynamic testing. Also, results from ground tests and analyses of the various model components are presented along with plans for future subassembly and mated model tests. Finally, utilization of the scale model for enhancing analysis verification of the full-scale space station is also considered.

  11. Description of a Compensatory College Education Program for the Disadvantaged and Its Associated Research and Evaluation Program.

    ERIC Educational Resources Information Center

    Spuck, Dennis W.; And Others

    This paper reports on a large-scale project of research and evaluation of a program for disadvantaged minority group students conducted by the Center for Educational Opportunity at the Claremont Colleges. The Program of Special Directed Studies for Transition to College (PSDS), a five-year experimental project, is aimed at providing a four-year,…

  12. The PR2D (Place, Route in 2-Dimensions) automatic layout computer program handbook

    NASA Technical Reports Server (NTRS)

    Edge, T. M.

    1978-01-01

Place, Route in 2-Dimensions is a standard cell automatic layout computer program for generating large scale integrated/metal oxide semiconductor arrays. The program was utilized successfully for a number of years in both government and private sectors but until now was undocumented. The compilation, loading, and execution of the program on a Sigma V CP-V operating system are described.

  13. The Productive Ward Program™: A Two-Year Implementation Impact Review Using a Longitudinal Multilevel Study.

    PubMed

    Van Bogaert, Peter; Van Heusden, Danny; Verspuy, Martijn; Wouters, Kristien; Slootmans, Stijn; Van der Straeten, Johnny; Van Aken, Paul; White, Mark

    2017-03-01

Aim To investigate the impact of the quality improvement program "Productive Ward - Releasing Time to Care™" using nurses' and midwives' reports of practice environment, burnout, quality of care, job outcomes, as well as workload, decision latitude, social capital, and engagement. Background Despite the requirement for health systems to improve quality and the proliferation of quality improvement programs designed for healthcare, the empirical evidence supporting large-scale quality improvement programs impacting patient satisfaction, staff engagement, and quality care remains sparse. Method A longitudinal study was performed in a large 600-bed acute care university hospital at two measurement intervals for nurse practice environment, burnout, and quality of care and job outcomes and three measurement intervals for workload, decision latitude, social capital, and engagement between June 2011 and November 2014. Results Positive results were identified in practice environment, decision latitude, and social capital. Less favorable results were identified in relation to perceived workload, emotional exhaustion, and vigor. Moreover, measures of quality of care and job satisfaction were reported less favorably. Conclusion This study highlights the need to further understand how to implement large-scale quality improvement programs so that they integrate with daily practices and promote "quality improvement" as "business as usual."

  14. Timber markets and fuel treatments in the western US

    Treesearch

    Karen L. Abt; Jeffrey P. Prestemon

    2006-01-01

    We developed a model of interrelated timber markets in the U.S. West to assess the impacts of large-scale fuel reduction programs on these markets, and concomitant effects of the market on the fuel reduction programs. The linear programming spatial equilibrium model allows interstate and international trade with western Canada and the rest of the world, while...

  15. LANDSAT activities in the Republic of Zaire

    NASA Technical Reports Server (NTRS)

    Ilunga, S.

    1975-01-01

    An overview of the LANDSAT data utilization program of the Republic of Zaire is presented. The program emphasizes topics of economic significance to the national development program of Zaire: (1) agricultural land use capability analysis, including evaluation of the effects of large-scale burnings; (2) mineral resources evaluation; and (3) production of mapping materials for poorly covered regions.

  16. Pre-Layoff Intervention: A Response to Unemployment. Second Edition.

    ERIC Educational Resources Information Center

    Stone, Judson; And Others

    Based on a program provided by a consortium of mental health centers in the Detroit, Michigan, area, this manual is intended to assist in the development and delivery of programs that allay or prevent the devastating human impact of plant shutdowns and large-scale layoffs. The guide focuses on delivery of programs that promote more effective use…

  17. Psychosocial Profiles of Delinquent and Nondelinquent Participants in a Sports Program.

    ERIC Educational Resources Information Center

    Yiannakis, Andrew

    This study attempted to find reasons for the large proportion of dropouts in the federal government's National Summer Youth Sports Program. Selected scales of the Jesness Inventory were administered (value orientation, alienation, denial, and occupational aspiration) at the beginning of the program to 66 11-year-old boys enrolled in a 1971 program…

  18. A Comparison of Large-Scale Reforestation Techniques Commonly Used on Abandoned Fields in the Lower Mississippi Alluvial Valley

    Treesearch

    Callie Jo Schweitzer; John A. Stanturf

    1999-01-01

    Reforesting abandoned land in the lower Mississippi alluvial valley has attracted heightened attention. Currently, federal cost share programs, such as the Wetland Reserve Program and the Conservation Reserve Program, are enticing landowners to consider reforesting lands that are marginally productive for agriculture. This study examined four reforestation techniques...

  19. Fixing Teacher Professional Development

    ERIC Educational Resources Information Center

    Hill, Heather C.

    2009-01-01

    The professional development "system" for teachers is, by all accounts, broken. Despite evidence that specific programs can improve teacher knowledge and practice and student outcomes, these programs seldom reach real teachers on a large scale. Typically, reformers address such perceptions of failure by discovering and celebrating new formats and…

  20. Cryogenic Tank Technology Program (CTTP)

    NASA Technical Reports Server (NTRS)

    Vaughn, T. P.

    2001-01-01

The objectives of the Cryogenic Tank Technology Program were to: (1) determine the feasibility and cost effectiveness of near net shape hardware; (2) demonstrate near net shape processes by fabricating large-scale, flight-quality hardware; and (3) advance the state of current weld processing technologies for aluminum-lithium alloys.

  1. Large-scale production of megakaryocytes from human pluripotent stem cells by chemically defined forward programming

    PubMed Central

    Moreau, Thomas; Evans, Amanda L.; Vasquez, Louella; Tijssen, Marloes R.; Yan, Ying; Trotter, Matthew W.; Howard, Daniel; Colzani, Maria; Arumugam, Meera; Wu, Wing Han; Dalby, Amanda; Lampela, Riina; Bouet, Guenaelle; Hobbs, Catherine M.; Pask, Dean C.; Payne, Holly; Ponomaryov, Tatyana; Brill, Alexander; Soranzo, Nicole; Ouwehand, Willem H.; Pedersen, Roger A.; Ghevaert, Cedric

    2016-01-01

The production of megakaryocytes (MKs)—the precursors of blood platelets—from human pluripotent stem cells (hPSCs) offers exciting clinical opportunities for transfusion medicine. Here we describe an original approach for the large-scale generation of MKs in chemically defined conditions using a forward programming strategy relying on the concurrent exogenous expression of three transcription factors: GATA1, FLI1 and TAL1. The forward programmed MKs proliferate and differentiate in culture for several months with MK purity over 90% reaching up to 2 × 10^5 mature MKs per input hPSC. Functional platelets are generated throughout the culture allowing the prospective collection of several transfusion units from as few as 1 million starting hPSCs. The high cell purity and yield achieved by MK forward programming, combined with efficient cryopreservation and good manufacturing practice (GMP)-compatible culture, make this approach eminently suitable to both in vitro production of platelets for transfusion and basic research in MK and platelet biology. PMID:27052461

  2. Large-scale production of megakaryocytes from human pluripotent stem cells by chemically defined forward programming.

    PubMed

    Moreau, Thomas; Evans, Amanda L; Vasquez, Louella; Tijssen, Marloes R; Yan, Ying; Trotter, Matthew W; Howard, Daniel; Colzani, Maria; Arumugam, Meera; Wu, Wing Han; Dalby, Amanda; Lampela, Riina; Bouet, Guenaelle; Hobbs, Catherine M; Pask, Dean C; Payne, Holly; Ponomaryov, Tatyana; Brill, Alexander; Soranzo, Nicole; Ouwehand, Willem H; Pedersen, Roger A; Ghevaert, Cedric

    2016-04-07

The production of megakaryocytes (MKs)--the precursors of blood platelets--from human pluripotent stem cells (hPSCs) offers exciting clinical opportunities for transfusion medicine. Here we describe an original approach for the large-scale generation of MKs in chemically defined conditions using a forward programming strategy relying on the concurrent exogenous expression of three transcription factors: GATA1, FLI1 and TAL1. The forward programmed MKs proliferate and differentiate in culture for several months with MK purity over 90% reaching up to 2 × 10^5 mature MKs per input hPSC. Functional platelets are generated throughout the culture allowing the prospective collection of several transfusion units from as few as 1 million starting hPSCs. The high cell purity and yield achieved by MK forward programming, combined with efficient cryopreservation and good manufacturing practice (GMP)-compatible culture, make this approach eminently suitable to both in vitro production of platelets for transfusion and basic research in MK and platelet biology.

  3. The promise and limitations of cash transfer programs for HIV prevention.

    PubMed

    Fieno, John; Leclerc-Madlala, Suzanne

    2014-01-01

    As the search for more effective HIV prevention strategies continues, increased attention is being paid to the potential role of cash transfers in prevention programming in sub-Saharan Africa. To date, studies testing the impact of both conditional and unconditional cash transfers on HIV-related behaviours and outcomes in sub-Saharan Africa have been relatively small-scale and their potential feasibility, costs and benefits at scale, among other things, remain largely unexplored. This article examines elements of a successful cash transfer program from Latin America and discusses challenges inherent in scaling-up such programs. The authors attempt a cost simulation of a cash transfer program for HIV prevention in South Africa comparing its cost and relative effectiveness--in number of HIV infections averted--against other prevention interventions. If a cash transfer program were to be taken to scale, the intervention would not have a substantial effect on decreasing the force of the epidemic in middle- and low-income countries. The integration of cash transfer programs into other sectors and linking them to a broader objective such as girls' educational attainment may be one way of addressing doubts raised by the authors regarding their value for HIV prevention.
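    The cost-effectiveness comparison described in this abstract reduces to a simple ratio of program cost to infections averted. A sketch with purely hypothetical figures (the paper's actual cost inputs are not reproduced here):

```python
def cost_per_infection_averted(total_cost, infections_averted):
    """Cost-effectiveness expressed as cost per HIV infection averted."""
    return total_cost / infections_averted

# Purely hypothetical program costs (USD) and infections averted:
interventions = {
    "cash transfer program": (50_000_000, 2_000),
    "comparison prevention intervention": (10_000_000, 4_000),
}
for name, (cost, averted) in interventions.items():
    ratio = cost_per_infection_averted(cost, averted)
    print(f"{name}: ${ratio:,.0f} per infection averted")
```

    Ranking interventions by this ratio is the comparison the authors attempt, though a full analysis would also discount costs over time and bound the estimates with sensitivity analysis.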

  4. NASA advanced turboprop research and concept validation program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whitlow, J.B. Jr.; Sievers, G.K.

    1988-01-01

    NASA has determined by experimental and analytical effort that use of advanced turboprop propulsion instead of the conventional turbofans in the older narrow-body airline fleet could reduce fuel consumption for this type of aircraft by up to 50 percent. In cooperation with industry, NASA has defined and implemented an Advanced Turboprop (ATP) program to develop and validate the technology required for these new high-speed, multibladed, thin, swept propeller concepts. This paper presents an overview of the analysis, model-scale test, and large-scale flight test elements of the program together with preliminary test results, as available.

  5. "Fan-Tip-Drive" High-Power-Density, Permanent Magnet Electric Motor and Test Rig Designed for a Nonpolluting Aircraft Propulsion Program

    NASA Technical Reports Server (NTRS)

    Brown, Gerald V.; Kascak, Albert F.

    2004-01-01

    A scaled blade-tip-drive test rig was designed at the NASA Glenn Research Center. The rig is a scaled version of a direct-current brushless motor that would be located in the shroud of a thrust fan. This geometry is very attractive since the allowable speed of the armature is approximately the speed of the blade tips (Mach 1 or 1100 ft/s). The magnetic pressure generated in the motor acts over a large area and, thus, produces a large force or torque. This large force multiplied by the large velocity results in a high-power-density motor.

  6. Cache Coherence Protocols for Large-Scale Multiprocessors

    DTIC Science & Technology

    1990-09-01

…and is compared with the other protocols for large-scale machines. In later analysis, this coherence method is designated by the acronym OCPD… Table 4.2 ("Transaction Types and Costs") lists the costs of transaction types such as private read misses and private write misses… Figure 4-2 shows the processor utilizations of the Weather program, with special code in the dynamic post-mortem scheduler.

  7. Large-Scale Cooperative Task Distribution on Peer-to-Peer Networks

    DTIC Science & Technology

    2012-01-01

…of agents, and each agent attempts to form a coalition with its most profitable partner. The second algorithm builds upon the Shapley formula [37]… clusters at the second layer. These Category Layer clusters each represent a single resource, and agents join one or more clusters based on their…

  8. Cytology of DNA Replication Reveals Dynamic Plasticity of Large-Scale Chromatin Fibers.

    PubMed

    Deng, Xiang; Zhironkina, Oxana A; Cherepanynets, Varvara D; Strelkova, Olga S; Kireev, Igor I; Belmont, Andrew S

    2016-09-26

    In higher eukaryotic interphase nuclei, the 100- to >1,000-fold linear compaction of chromatin is difficult to reconcile with its function as a template for transcription, replication, and repair. It is challenging to imagine how DNA and RNA polymerases with their associated molecular machinery would move along the DNA template without transient decondensation of observed large-scale chromatin "chromonema" fibers [1]. Transcription or "replication factory" models [2], in which polymerases remain fixed while DNA is reeled through, are similarly difficult to conceptualize without transient decondensation of these chromonema fibers. Here, we show how a dynamic plasticity of chromatin folding within large-scale chromatin fibers allows DNA replication to take place without significant changes in the global large-scale chromatin compaction or shape of these large-scale chromatin fibers. Time-lapse imaging of lac-operator-tagged chromosome regions shows no major change in the overall compaction of these chromosome regions during their DNA replication. Improved pulse-chase labeling of endogenous interphase chromosomes yields a model in which the global compaction and shape of large-Mbp chromatin domains remains largely invariant during DNA replication, with DNA within these domains undergoing significant movements and redistribution as they move into and then out of adjacent replication foci. In contrast to hierarchical folding models, this dynamic plasticity of large-scale chromatin organization explains how localized changes in DNA topology allow DNA replication to take place without an accompanying global unfolding of large-scale chromatin fibers while suggesting a possible mechanism for maintaining epigenetic programming of large-scale chromatin domains throughout DNA replication. Copyright © 2016 Elsevier Ltd. All rights reserved.

  9. An outdoor test facility for the large-scale production of microalgae

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, D.A.; Weissman, J.; Goebel, R.

The goal of the US Department of Energy/Solar Energy Research Institute's Aquatic Species Program is to develop the technology base to produce liquid fuels from microalgae. This technology is being initially developed for the desert Southwest. As part of this program an outdoor test facility has been designed and constructed in Roswell, New Mexico. The site has a large existing infrastructure, a suitable climate, and abundant saline groundwater. This facility will be used to evaluate productivity of microalgae strains and conduct large-scale experiments to increase biomass productivity while decreasing production costs. Six 3-m² fiberglass raceways were constructed. Several microalgae strains were screened for growth, one of which had a short-term productivity rate of greater than 50 g dry wt m⁻² d⁻¹. Two large-scale, 0.1-ha raceways have also been built. These are being used to evaluate the performance trade-offs between low-cost earthen liners and higher cost plastic liners. A series of hydraulic measurements is also being carried out to evaluate future improved pond designs. Future plans include a 0.5-ha pond, which will be built in approximately 2 years to test a scaled-up system. This unique facility will be available to other researchers and industry for studies on microalgae productivity. 6 refs., 9 figs., 1 tab.
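    Scaling the reported short-term productivity to pond level is straightforward arithmetic; a small sketch using the figures from this abstract (50 g dry wt m⁻² d⁻¹ in a 0.1-ha, i.e. 1,000 m², raceway):

```python
def daily_yield_kg(productivity_g_m2_d, area_m2):
    """Convert areal productivity (g dry weight per m^2 per day) to kg/day."""
    return productivity_g_m2_d * area_m2 / 1000.0

# Figures from the abstract: 50 g dry wt per m^2 per day, 0.1-ha (1,000 m^2) pond.
print(daily_yield_kg(50, 1000))  # → 50.0 (kg dry biomass per day)
```

    Such back-of-the-envelope yields are what the larger raceways are meant to test against sustained, rather than short-term, productivity rates.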

  10. Effects of Large-Scale Releases on the Genetic Structure of Red Sea Bream (Pagrus major, Temminck et Schlegel) Populations in Japan.

    PubMed

    Blanco Gonzalez, Enrique; Aritaki, Masato; Knutsen, Halvor; Taniguchi, Nobuhiko

    2015-01-01

Large-scale hatchery releases are carried out for many marine fish species worldwide; nevertheless, the long-term effects of this practice on the genetic structure of natural populations remain unclear. The lack of knowledge is especially evident when independent stock enhancement programs are conducted simultaneously on the same species at different geographical locations, as occurs with red sea bream (Pagrus major, Temminck et Schlegel) in Japan. In this study, we examined the putative effects of intensive offspring releases on the genetic structure of red sea bream populations along the Japanese archipelago by genotyping 848 fish at fifteen microsatellite loci. Our results suggest weak but consistent patterns of genetic divergence (F_ST = 0.002, p < 0.001). Red sea bream in Japan appeared spatially structured with several patches of distinct allelic composition, which corresponded to areas receiving an important influx of fish of hatchery origin, either released intentionally or from unintentional escapees from aquaculture operations. In addition to impacts upon local populations inhabiting semi-enclosed embayments, large-scale releases (either intentional or from unintentional escapes) also appeared to have perturbed genetic structure in open areas. Hence, results of the present study suggest that independent large-scale marine stock enhancement programs conducted simultaneously on one species at different geographical locations may compromise native genetic structure and lead to patchy patterns in population genetic structure.
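    The F_ST statistic used here to quantify divergence compares within-population to total expected heterozygosity. A minimal single-locus sketch of a Nei-style estimator with equal subpopulation weights (the allele frequencies below are invented, not the study's microsatellite data):

```python
def expected_het(p):
    """Expected heterozygosity at a biallelic locus with allele frequency p."""
    return 2.0 * p * (1.0 - p)

def fst(subpop_freqs):
    """Nei-style F_ST = (H_T - H_S) / H_T for one biallelic locus,
    weighting subpopulations equally."""
    hs = sum(expected_het(p) for p in subpop_freqs) / len(subpop_freqs)
    p_bar = sum(subpop_freqs) / len(subpop_freqs)
    ht = expected_het(p_bar)
    return (ht - hs) / ht

# Invented allele frequencies for three local populations:
print(round(fst([0.48, 0.50, 0.52]), 4))  # → 0.0011, i.e. weak divergence
```

    Values of this order illustrate how small F_ST can be while still being statistically detectable with many loci and large samples, as in the study's F_ST = 0.002.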

  11. Workplan for Catalyzing Collaboration with Amazonian Universities in the Large Scale Biosphere-Atmosphere Experiment in Amazonia (LBA)

    NASA Technical Reports Server (NTRS)

    Brown, I. Foster; Moreira, Adriana

    1997-01-01

Success of the Large-Scale Biosphere-Atmospheric Experiment in Amazonia (LBA) program depends on several critical factors, the most important being the effective participation of Amazonian researchers and institutions. Without host-country counterparts, particularly in Amazonia, many important studies cannot be undertaken due either to lack of qualified persons or to legal constraints. No less important, the acceptance of the LBA program in Amazonia is also dependent on what LBA can do for improving the scientific expertise in Amazonia. Gaining the active investment of Amazonian scientists in a comprehensive research program is not a trivial task. Potential collaborators are few, particularly where much of the research was originally to be focused - the southern arc of Brazilian Amazonia. The mid-term goals of the LBA Committee on Training and Education are to increase the number of collaborators and to demonstrate that LBA will be of benefit to the region.

  12. An interactive web-based system using cloud for large-scale visual analytics

    NASA Astrophysics Data System (ADS)

    Kaseb, Ahmed S.; Berry, Everett; Rozolis, Erik; McNulty, Kyle; Bontrager, Seth; Koh, Youngsol; Lu, Yung-Hsiang; Delp, Edward J.

    2015-03-01

The number of network cameras has grown rapidly in recent years. Thousands of public network cameras provide a tremendous amount of visual information about the environment. There is a need to analyze this valuable information for a better understanding of the world around us. This paper presents an interactive web-based system that enables users to execute image analysis and computer vision techniques on a large scale to analyze the data from more than 65,000 cameras worldwide. This paper focuses on how to use both the system's website and Application Programming Interface (API). Given a computer program that analyzes a single frame, the user needs to make only slight changes to the existing program and choose the cameras to analyze. The system handles the heterogeneity of the geographically distributed cameras, e.g., different brands and resolutions. The system allocates and manages Amazon EC2 and Windows Azure cloud resources to meet the analysis requirements.
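    The "slight changes to an existing program" workflow can be pictured as wrapping a single-frame analysis function in a driver that runs it across many cameras. The sketch below is purely illustrative: the function and data-structure names are hypothetical and are not the system's actual API:

```python
def analyze_frame(frame):
    """User-supplied single-frame analysis; here, mean pixel brightness."""
    pixels = [p for row in frame for p in row]
    return sum(pixels) / len(pixels)

def run_on_cameras(frames_by_camera, frame_fn):
    """Stand-in for the system's role: apply one per-frame program across
    many cameras and gather results keyed by camera id."""
    return {cam: frame_fn(frame) for cam, frame in frames_by_camera.items()}

# Two tiny fake 2x2 grayscale "frames" standing in for live camera data:
frames = {"cam_A": [[0, 50], [100, 150]], "cam_B": [[200, 200], [200, 200]]}
print(run_on_cameras(frames, analyze_frame))  # → {'cam_A': 75.0, 'cam_B': 200.0}
```

    In the real system the driver side (frame retrieval, camera selection, cloud allocation) is handled by the platform, which is what leaves only the per-frame function to the user.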

  13. Syringe Exchange, Injecting and Intranasal Drug Use

    PubMed Central

    Arasteh, Kamyar; McKnight, Courtney; Ringer, Martin; Friedman, Samuel R.

    2016-01-01

Objective To assess trends in injecting and non-injecting drug use after implementation of large-scale syringe exchange in New York City. The belief that implementation of syringe exchange will lead to increased drug injecting has been a persistent argument against syringe exchange. Methods Administrative data on route of administration for primary drug of abuse among patients entering the Beth Israel methadone maintenance program from 1995 – 2007. Approximately 2000 patients enter the program each year. Results During and after the period of large-scale implementation of syringe exchange, the numbers of methadone program entrants reporting injecting drug use decreased while the numbers of entrants reporting intranasal drug use increased (p < .001). Conclusion While assessing possible effects of syringe exchange on trends in injecting drug use is inherently difficult, this may be the strongest data collected to date showing a lack of increase in drug injecting following implementation of syringe exchange. PMID:19891668

  14. A large-scale computer facility for computational aerodynamics

    NASA Technical Reports Server (NTRS)

    Bailey, F. R.; Ballhaus, W. F., Jr.

    1985-01-01

    As a result of advances related to the combination of computer system technology and numerical modeling, computational aerodynamics has emerged as an essential element in aerospace vehicle design methodology. NASA has, therefore, initiated the Numerical Aerodynamic Simulation (NAS) Program with the objective to provide a basis for further advances in the modeling of aerodynamic flowfields. The Program is concerned with the development of a leading-edge, large-scale computer facility. This facility is to be made available to Government agencies, industry, and universities as a necessary element in ensuring continuing leadership in computational aerodynamics and related disciplines. Attention is given to the requirements for computational aerodynamics, the principal specific goals of the NAS Program, the high-speed processor subsystem, the workstation subsystem, the support processing subsystem, the graphics subsystem, the mass storage subsystem, the long-haul communication subsystem, the high-speed data-network subsystem, and software.

  15. Development of Experimental Icing Simulation Capability for Full-Scale Swept Wings: Hybrid Design Process, Years 1 and 2

    NASA Technical Reports Server (NTRS)

    Fujiwara, Gustavo; Bragg, Mike; Triphahn, Chris; Wiberg, Brock; Woodard, Brian; Loth, Eric; Malone, Adam; Paul, Bernard; Pitera, David; Wilcox, Pete

    2017-01-01

    This report presents the key results from the first two years of a program to develop experimental icing simulation capabilities for full-scale swept wings. This investigation was undertaken as part of a larger collaborative research effort on ice accretion and aerodynamics for large-scale swept wings. Ice accretion and the resulting aerodynamic effects on large-scale swept wings present a significant airplane design and certification challenge to airframe manufacturers, certification authorities, and research organizations alike. While the effect of ice accretion on straight wings has been studied in detail for many years, the available data on swept-wing icing are much more limited, especially at larger scales.

  16. SiGN: large-scale gene network estimation environment for high performance computing.

    PubMed

    Tamada, Yoshinori; Shimamura, Teppei; Yamaguchi, Rui; Imoto, Seiya; Nagasaki, Masao; Miyano, Satoru

    2011-01-01

    Our research group is currently developing software for estimating large-scale gene networks from gene expression data. The software, called SiGN, is specifically designed for the Japanese flagship supercomputer "K computer", which is planned to achieve 10 petaflops in 2012, and for other high performance computing environments, including the Human Genome Center (HGC) supercomputer system. SiGN is a collection of gene network estimation software with three different sub-programs: SiGN-BN, SiGN-SSM and SiGN-L1. Across these three programs, five different models are available: static and dynamic nonparametric Bayesian networks, state space models, graphical Gaussian models, and vector autoregressive models. All these models require a huge amount of computational resources for estimating large-scale gene networks, and the software is therefore designed to exploit the speed of 10 petaflops. The software will be available freely for "K computer" and HGC supercomputer system users. The estimated networks can be viewed and analyzed with Cell Illustrator Online and SBiP (Systems Biology integrative Pipeline). The software project web site is available at http://sign.hgc.jp/.
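
    Of the five models listed, the vector autoregressive model is the simplest to sketch. Below is a one-gene AR(1) fit by least squares; this is an illustration of the modelling idea only, and the `fit_ar1` helper and toy expression values are invented, not SiGN's implementation (which fits a coefficient matrix over thousands of genes, with matrix entries interpreted as candidate regulatory edges).

    ```python
    # Toy autoregressive fit: x[t+1] ~ a * x[t], with a solved by least squares.
    # A vector autoregressive network model generalizes this scalar coefficient
    # to a matrix whose entries are candidate regulatory edges between genes.
    def fit_ar1(series):
        num = sum(series[t] * series[t + 1] for t in range(len(series) - 1))
        den = sum(series[t] ** 2 for t in range(len(series) - 1))
        return num / den

    expr = [1.0, 0.5, 0.25, 0.125]   # toy expression time course (halving each step)
    print(fit_ar1(expr))  # -> 0.5
    ```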

  17. Experiences with hypercube operating system instrumentation

    NASA Technical Reports Server (NTRS)

    Reed, Daniel A.; Rudolph, David C.

    1989-01-01

    The difficulties in conceptualizing the interactions among a large number of processors make it difficult both to identify the sources of inefficiencies and to determine how a parallel program could be made more efficient. This paper describes an instrumentation system that can trace the execution of distributed memory parallel programs by recording the occurrence of parallel program events. The resulting event traces can be used to compile summary statistics that provide a global view of program performance. In addition, visualization tools permit the graphic display of event traces. Visual presentation of performance data is particularly useful, indeed, necessary for large-scale parallel computers; the enormous volume of performance data mandates visual display.
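
    The trace-then-summarize scheme can be illustrated with a minimal sketch. The `EventTracer` class, event names, and trace format here are invented for illustration and are not the paper's actual hypercube instrumentation, which recorded events inside the operating system itself.

    ```python
    import time
    from collections import Counter

    class EventTracer:
        """Records timestamped parallel-program events for post-mortem analysis."""
        def __init__(self):
            self.trace = []  # list of (timestamp, node_id, event_name)

        def log(self, node_id, event_name):
            self.trace.append((time.perf_counter(), node_id, event_name))

        def summary(self):
            """Compile per-event counts -- a 'global view' summary statistic."""
            return Counter(name for _, _, name in self.trace)

    tracer = EventTracer()
    for node in range(4):              # simulate events from 4 processors
        tracer.log(node, "msg_send")
        tracer.log(node, "msg_recv")
    tracer.log(0, "barrier_wait")

    print(tracer.summary())  # e.g. Counter({'msg_send': 4, 'msg_recv': 4, 'barrier_wait': 1})
    ```

    The full trace retains per-event timestamps, so the same data can also drive the graphic displays of program behavior mentioned above.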

  18. Lessons from SMD experience with approaches to the evaluation of fare changes

    DOT National Transportation Integrated Search

    1980-01-01

    Over the past several years UMTA's Service and Methods Demonstration Program (SMD) has undertaken a large number of studies of the effects of fare changes, both increases and decreases. Some of these studies have been large-scale efforts directed at ...

  19. Program Correctness, Verification and Testing for Exascale (Corvette)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sen, Koushik; Iancu, Costin; Demmel, James W

    The goal of this project is to provide tools to assess the correctness of parallel programs written using hybrid parallelism. There is a dire lack of both theoretical and engineering know-how in the area of finding bugs in hybrid or large-scale parallel programs, which our research aims to change. In the project we have demonstrated novel approaches in several areas: 1. Low-overhead, automated, and precise detection of concurrency bugs at scale. 2. Using low-overhead bug detection tools to guide speculative program transformations for performance. 3. Techniques to reduce the concurrency required to reproduce a bug using partial program restart/replay. 4. Techniques to provide reproducible execution of floating point programs. 5. Techniques for tuning the floating point precision used in codes.

  20. Reclamation with trees in Illinois

    Treesearch

    Brad Evilsizer

    1980-01-01

    Through private initiative, Illinois citizens historically have invented and conducted large-scale tree planting programs, starting with hedgerow fences and farmstead windbreaks and continuing with surface mine reclamation and farm woodlands. With invaluable help from public and private scientific personnel, the old and new programs hold promise of enlargement and...

  1. 5 CFR 9301.7 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... and that operates solely for the purpose of conducting scientific research the results of which are... employees who perform the work and costs of conducting large-scale computer searches. (c) Duplicate means to... education, that operates a program or programs of scholarly research. (e) Fee category means one of the...

  2. 5 CFR 9301.7 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... and that operates solely for the purpose of conducting scientific research the results of which are... employees who perform the work and costs of conducting large-scale computer searches. (c) Duplicate means to... education, that operates a program or programs of scholarly research. (e) Fee category means one of the...

  3. IMPROVING ESTUARINE EVALUATION THROUGH OUTREACH AND TECHNOLOGY TRANSFER TO STATES, TRIBES AND OTHER PARTNERS: EPA'S NATIONAL COASTAL ASSESSMENT

    EPA Science Inventory

    The Environmental Protection Agency's (EPA's) National Coastal Assessment (NCA) is a large-scale, comprehensive environmental monitoring program designed to characterize the ecological condition of the Nation's coastal resources. A key to this successful program is the developmen...

  4. Shifting Expectations: Bringing STEM to Scale through Expanded Learning Systems

    ERIC Educational Resources Information Center

    Donner, Jessica; Wang, Yvonne

    2013-01-01

    Expanded learning opportunities, such as afterschool and summer programs, are particularly well positioned to help address the science, technology, engineering, and mathematics (STEM) education crisis. A large percentage of youth participating in afterschool programs are members of groups traditionally underrepresented in STEM fields. Additionally,…

  5. The Vital Program: Transforming ICT Professional Development

    ERIC Educational Resources Information Center

    Bradshaw, Pete; Twining, Peter; Walsh, Christopher S.

    2012-01-01

    Developing a model for effective large-scale continuous professional development (CPD) for teachers remains a significant obstacle for many governments worldwide. This article describes the development and evolution of Vital--a CPD program designed to enhance the teaching of information communication technology in state-funded primary and…

  6. Crowd-Sourcing with K-12 citizen scientists: The Continuing Evolution of the GLOBE Program

    NASA Astrophysics Data System (ADS)

    Murphy, T.; Wegner, K.; Andersen, T. J.

    2016-12-01

    Twenty years ago, the Internet was still in its infancy, citizen science was a relatively unknown term, and the idea of a global citizen science database was unheard of. Then the Global Learning and Observations to Benefit the Environment (GLOBE) Program was proposed, and this all changed. GLOBE was one of the first K-12 citizen science programs on a global scale. An initial large-scale ramp-up of the program was followed by the establishment of a network of partners in countries and within the U.S. Now in the 21st century, the program has over 50 protocols in atmosphere, biosphere, hydrosphere and pedosphere, almost 140 million measurements in the database, a visualization system, collaborations with NASA satellite mission scientists (GPM, SMAP) and other scientists, as well as research projects by GLOBE students. As technology changed over the past two decades, it was integrated into the program's outreach efforts to existing and new members, with the result that the program now has a strong social media presence. In 2016, a new app was launched which opened up GLOBE and data entry to citizen scientists of all ages. The app is aimed at fresh audiences beyond the traditional GLOBE K-12 community; targeted groups include scouting organizations, museums, 4H, science learning centers, and retirement communities, with the goal of broadening participation in the program and increasing the amount of data available to students and scientists. Through the 20 years of GLOBE, lessons have been learned about changing the management of this type of large-scale program, the use of technology to enhance and improve the experience for members, and increasing community involvement in the program.

  7. A Framework for Evaluating Implementation of Community College Workforce Education Partnerships and Programs

    ERIC Educational Resources Information Center

    Yarnall, Louise; Tennant, Elizabeth; Stites, Regie

    2016-01-01

    Greater investments in community college workforce education are fostering large-scale partnerships between employers and educators. However, the evaluation work in this area has focused on outcome and productivity metrics, rather than addressing measures of implementation quality, which is critical to scaling any innovation. To deepen…

  8. The HI Content of Galaxies as a Function of Local Density and Large-Scale Environment

    NASA Astrophysics Data System (ADS)

    Thoreen, Henry; Cantwell, Kelly; Maloney, Erin; Cane, Thomas; Brough Morris, Theodore; Flory, Oscar; Raskin, Mark; Crone-Odekon, Mary; ALFALFA Team

    2017-01-01

    We examine the HI content of galaxies as a function of environment, based on a catalogue of 41527 galaxies that are part of the 70% complete Arecibo Legacy Fast ALFA (ALFALFA) survey. We use nearest-neighbor methods to characterize local environment, and a modified version of the algorithm developed for the Galaxy and Mass Assembly (GAMA) survey to classify large-scale environment as group, filament, tendril, or void. We compare the HI content in these environments using statistics that include both HI detections and the upper limits on detections from ALFALFA. The large size of the sample allows us to statistically compare the HI content in different environments for early-type as well as late-type galaxies. This work is supported by NSF grants AST-1211005 and AST-1637339, the Skidmore Faculty-Student Summer Research program, and the Schupf Scholars program.
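
    Nearest-neighbor characterization of local environment can be sketched in a few lines. The `kth_neighbor_distance` helper and the toy 2-D coordinates below are invented for illustration; the survey analysis works with 3-D redshift-space positions and magnitude limits, which this sketch omits.

    ```python
    import math

    def kth_neighbor_distance(galaxies, target, k=3):
        """Distance to the k-th nearest neighbor: small values indicate a dense
        local environment, large values a sparse one."""
        dists = sorted(math.dist(target, g) for g in galaxies if g != target)
        return dists[k - 1]

    # Four clustered points plus one outlier: the target sits in the dense clump.
    field = [(0, 0), (0, 1), (1, 0), (1, 1), (10, 10)]
    print(kth_neighbor_distance(field, (0, 0), k=3))  # -> sqrt(2) = 1.414...
    ```

    Inverting such a distance (or counting neighbors within it) yields a local density estimate that can then be compared against the group/filament/tendril/void classification of large-scale structure.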

  9. Towards agile large-scale predictive modelling in drug discovery with flow-based programming design principles.

    PubMed

    Lampa, Samuel; Alvarsson, Jonathan; Spjuth, Ola

    2016-01-01

    Predictive modelling in drug discovery is challenging to automate as it often contains multiple analysis steps and might involve cross-validation and parameter tuning that create complex dependencies between tasks. With large-scale data or when using computationally demanding modelling methods, e-infrastructures such as high-performance or cloud computing are required, adding to the existing challenges of fault-tolerant automation. Workflow management systems can aid in many of these challenges, but the currently available systems are lacking in the functionality needed to enable agile and flexible predictive modelling. We here present an approach inspired by elements of the flow-based programming paradigm, implemented as an extension of the Luigi system which we name SciLuigi. We also discuss the experiences from using the approach when modelling a large set of biochemical interactions using a shared computer cluster.Graphical abstract.
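
    The flow-based idea of independent processes connected into a dataflow network can be sketched with plain generators. This is a minimal illustration of the paradigm only, not SciLuigi's actual API (SciLuigi defines Luigi tasks with explicitly named in/out ports); the compound strings and stage names are invented.

    ```python
    # Each stage is a "process" that consumes an upstream stream and yields
    # downstream; the workflow is defined by how the streams are wired together,
    # not by a monolithic call sequence.

    def read_compounds():                      # source process
        for smiles in ["CCO", "CCN", "c1ccccc1"]:
            yield smiles

    def featurize(stream):                     # transform process
        for s in stream:
            yield {"smiles": s, "length": len(s)}

    def train_filter(stream, min_len=3):       # sink-side filter process
        return [rec for rec in stream if rec["length"] >= min_len]

    # Wiring the network: connections define the pipeline, and each stage can be
    # swapped independently -- the flexibility argued for in the paper.
    model_input = train_filter(featurize(read_compounds()))
    print(model_input)
    ```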

  10. Conservation of northern bobwhite on private lands in Georgia, USA under uncertainty about landscape-level habitat effects

    USGS Publications Warehouse

    Howell, J.E.; Moore, C.T.; Conroy, M.J.; Hamrick, R.G.; Cooper, R.J.; Thackston, R.E.; Carroll, J.P.

    2009-01-01

    Large-scale habitat enhancement programs for birds are becoming more widespread; however, most lack monitoring to resolve uncertainties and enhance program impact over time. Georgia's Bobwhite Quail Initiative (BQI) is a competitive, proposal-based system that provides incentives to landowners to establish habitat for northern bobwhites (Colinus virginianus). Using data from monitoring conducted in the program's first years (1999–2001), we developed alternative hierarchical models to predict bobwhite abundance in response to program habitat modifications on local and regional scales. Effects of habitat and habitat management on bobwhite population response varied among geographical scales, but high measurement variability rendered the specific nature of these scaled effects equivocal. Under some models, BQI had positive impact at both local and farm scales (1 and 9 km2), particularly when practice acres were clustered, whereas other credible models indicated that bird response did not depend on spatial arrangement of practices. Thus, uncertainty about landscape-level effects of management presents a challenge to program managers who must decide which proposals to accept. We demonstrate that optimal selection decisions can be made despite this uncertainty and that uncertainty can be reduced over time, with consequent improvement in management efficacy. However, such an adaptive approach to BQI program implementation would require the reestablishment of monitoring of bobwhite abundance, an effort for which funding was discontinued in 2002. For landscape-level conservation programs generally, our approach demonstrates the value in assessing multiple scales of impact of habitat modification programs, and it reveals the utility of addressing management uncertainty through multiple decision models and system monitoring.

  11. Workshop Report on Additive Manufacturing for Large-Scale Metal Components - Development and Deployment of Metal Big-Area-Additive-Manufacturing (Large-Scale Metals AM) System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Babu, Sudarsanam Suresh; Love, Lonnie J.; Peter, William H.

    Additive manufacturing (AM) is considered an emerging technology that is expected to transform the way industry can make low-volume, high-value complex structures. This disruptive technology promises to replace legacy manufacturing methods for the fabrication of existing components, in addition to bringing new innovation for new components with increased functional and mechanical properties. This report outlines the outcome of a workshop on large-scale metal additive manufacturing held at Oak Ridge National Laboratory (ORNL) on March 11, 2016. The charter for the workshop was outlined by the Department of Energy (DOE) Advanced Manufacturing Office program manager. The status and impact of the Big Area Additive Manufacturing (BAAM) system for polymer matrix composites was presented as the background motivation for the workshop. Following this, the extension of the underlying technology to low-cost metals was proposed with the following goals: (i) high deposition rates (approaching 100 lbs/h); (ii) low cost (<$10/lb) for steel, iron, aluminum, and nickel, as well as higher-cost titanium; (iii) large components (major axis greater than 6 ft); and (iv) compliance with property requirements. The above concept was discussed in depth by representatives from different industrial sectors including welding, metal fabrication machinery, energy, construction, aerospace, and heavy manufacturing. In addition, DOE's newly launched High Performance Computing for Manufacturing (HPC4MFG) program was reviewed. This program will apply thermo-mechanical models to elucidate deeper understanding of the interactions between design, process, and materials during additive manufacturing. Following these presentations, all the attendees took part in a brainstorming session where everyone identified the top 10 challenges in large-scale metal AM from their own perspective. The feedback was analyzed and grouped into different categories: (i) CAD-to-part software, (ii) selection of energy source, (iii) systems development, (iv) material feedstock, (v) process planning, (vi) residual stress and distortion, (vii) post-processing, (viii) qualification of parts, (ix) supply chain, and (x) business case. Furthermore, an open innovation network methodology was proposed to accelerate the development and deployment of new large-scale metal additive manufacturing technology, with the goal of creating a new generation of high-deposition-rate equipment, affordable feedstocks, and large metallic components to enhance America's economic competitiveness.

  12. Large-scale semidefinite programming for many-electron quantum mechanics.

    PubMed

    Mazziotti, David A

    2011-02-25

    The energy of a many-electron quantum system can be approximated by a constrained optimization of the two-electron reduced density matrix (2-RDM) that is solvable in polynomial time by semidefinite programming (SDP). Here we develop an SDP method for computing strongly correlated 2-RDMs that is 10-20 times faster than previous methods [D. A. Mazziotti, Phys. Rev. Lett. 93, 213001 (2004)]. We illustrate with (i) the dissociation of N2 and (ii) the metal-to-insulator transition of H50. For H50 the SDP problem has 9.4×10^6 variables. This advance also expands the feasibility of large-scale applications in quantum information, control, statistics, and economics. © 2011 American Physical Society

  13. Large-Scale Semidefinite Programming for Many-Electron Quantum Mechanics

    NASA Astrophysics Data System (ADS)

    Mazziotti, David A.

    2011-02-01

    The energy of a many-electron quantum system can be approximated by a constrained optimization of the two-electron reduced density matrix (2-RDM) that is solvable in polynomial time by semidefinite programming (SDP). Here we develop an SDP method for computing strongly correlated 2-RDMs that is 10-20 times faster than previous methods [D. A. Mazziotti, Phys. Rev. Lett. 93, 213001 (2004)]. We illustrate with (i) the dissociation of N2 and (ii) the metal-to-insulator transition of H50. For H50 the SDP problem has 9.4×10^6 variables. This advance also expands the feasibility of large-scale applications in quantum information, control, statistics, and economics.
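
    Schematically, the variational 2-RDM calculation described in both records has the standard SDP form (shown here up to normalization conventions; the 2-Q and 2-G matrices denote the usual linear mappings of the 2-RDM whose positivity enforces approximate N-representability):

    ```latex
    \min_{{}^{2}D}\; E = \operatorname{Tr}\!\left({}^{2}K\,{}^{2}D\right)
    \quad\text{subject to}\quad
    {}^{2}D \succeq 0,\qquad {}^{2}Q \succeq 0,\qquad {}^{2}G \succeq 0,
    ```

    where ${}^{2}K$ is the two-electron reduced Hamiltonian. The linear objective together with positive-semidefinite constraints is exactly what makes the problem a semidefinite program solvable in polynomial time.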

  14. Concepts for on-board satellite image registration. Volume 3: Impact of VLSI/VHSIC on satellite on-board signal processing

    NASA Technical Reports Server (NTRS)

    Aanstoos, J. V.; Snyder, W. E.

    1981-01-01

    Anticipated major advances in integrated circuit technology in the near future are described, as well as their impact on satellite onboard signal processing systems. Dramatic improvements in chip density, speed, power consumption, and system reliability are expected from very large scale integration (VLSI). These improvements will enable more intelligence to be placed on remote sensing platforms in space, meeting the goals of NASA's information adaptive system concept, a major component of the NASA End-to-End Data System program. A forecast of VLSI technological advances is presented, including a description of the Defense Department's very high speed integrated circuit program, a seven-year research and development effort.

  15. Heavy hydrocarbon main injector technology

    NASA Technical Reports Server (NTRS)

    Fisher, S. C.; Arbit, H. A.

    1988-01-01

    One of the key components of the Advanced Launch System (ALS) is a large liquid-rocket booster engine. To keep the overall vehicle size and cost down, this engine will probably use liquid oxygen (LOX) and a heavy hydrocarbon, such as RP-1, as propellants and operate at relatively high chamber pressures to increase overall performance. A technology program (Heavy Hydrocarbon Main Injector Technology) is underway. The main objective of this effort is to develop a logic plan and supporting experimental database to reduce the risk of developing a large-scale (approximately 750,000-lb-thrust), high-performance main injector system. The overall approach and program plan, from initial analyses to large-scale, two-dimensional combustor design and test, and the current status of the program are discussed. Progress includes performance and stability analyses, cold flow tests of an injector model, design and fabrication of subscale injectors and calorimeter combustors for performance, heat transfer, and dynamic stability tests, and preparation of hot-fire test plans. Related current high-pressure LOX/RP-1 injector technology efforts are also briefly discussed.

  16. Avoiding and tolerating latency in large-scale next-generation shared-memory multiprocessors

    NASA Technical Reports Server (NTRS)

    Probst, David K.

    1993-01-01

    A scalable solution to the memory-latency problem is necessary to prevent the large latencies of synchronization and memory operations inherent in large-scale shared-memory multiprocessors from reducing high performance. We distinguish latency avoidance and latency tolerance. Latency is avoided when data is brought to nearby locales for future reference. Latency is tolerated when references are overlapped with other computation. Latency-avoiding locales include: processor registers, data caches used temporally, and nearby memory modules. Tolerating communication latency requires parallelism, allowing the overlap of communication and computation. Latency-tolerating techniques include: vector pipelining, data caches used spatially, prefetching in various forms, and multithreading in various forms. Relaxing the consistency model permits increased use of avoidance and tolerance techniques. Each model is a mapping from the program text to sets of partial orders on program operations; it is a convention about which temporal precedences among program operations are necessary. Information about temporal locality and parallelism constrains the use of avoidance and tolerance techniques. Suitable architectural primitives and compiler technology are required to exploit the increased freedom to reorder and overlap operations in relaxed models.
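
    Latency tolerance by overlapping communication with computation can be sketched as a one-slot software prefetch, the multithreading form mentioned above. The `fetch`/`compute` functions and the simulated delay are invented stand-ins for remote-memory access; a real system would overlap at the hardware or runtime level.

    ```python
    # While block i is being computed on, block i+1 is already in flight:
    # the fetch latency is hidden behind useful work.
    import time
    from concurrent.futures import ThreadPoolExecutor

    def fetch(i):
        time.sleep(0.05)   # simulated remote-memory latency
        return i * i

    def compute(x):
        return x + 1

    with ThreadPoolExecutor(max_workers=1) as pool:
        future = pool.submit(fetch, 0)                # prefetch the first block
        results = []
        for i in range(1, 5):
            nxt = pool.submit(fetch, i)               # overlap the next fetch...
            results.append(compute(future.result()))  # ...with this computation
            future = nxt
        results.append(compute(future.result()))

    print(results)  # -> [1, 2, 5, 10, 17]
    ```

    With the prefetch in place each iteration waits only for whatever latency was not hidden by the computation, rather than the full fetch time.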

  17. Non-linear scale interactions in a forced turbulent boundary layer

    NASA Astrophysics Data System (ADS)

    Duvvuri, Subrahmanyam; McKeon, Beverley

    2015-11-01

    A strong phase-organizing influence exerted by a single synthetic large-scale spatio-temporal mode on directly-coupled (through triadic interactions) small scales in a turbulent boundary layer forced by a spatially-impulsive dynamic wall-roughness patch was previously demonstrated by the authors (J. Fluid Mech. 2015, vol. 767, R4). The experimental set-up was later enhanced to allow for simultaneous forcing of multiple scales in the flow. Results and analysis are presented from a new set of novel experiments where two distinct large scales are forced in the flow by a dynamic wall-roughness patch. The internal non-linear forcing of two other scales with triadic consistency to the artificially forced large scales, corresponding to sum and difference in wavenumbers, is dominated by the latter. This allows for a forcing-response (input-output) type analysis of the two triadic scales, and naturally lends itself to a resolvent operator based model (e.g. McKeon & Sharma, J. Fluid Mech. 2010, vol. 658, pp. 336-382) of the governing Navier-Stokes equations. The support of AFOSR (grant #FA 9550-12-1-0469, program manager D. Smith) is gratefully acknowledged.
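
    The appearance of sum and difference wavenumbers follows from the quadratic nonlinearity of the Navier-Stokes equations; for two forced modes the product term gives, schematically,

    ```latex
    \sin(k_1 x - \omega_1 t)\,\sin(k_2 x - \omega_2 t)
    = \tfrac{1}{2}\cos\!\left[(k_1 - k_2)x - (\omega_1 - \omega_2)t\right]
    - \tfrac{1}{2}\cos\!\left[(k_1 + k_2)x - (\omega_1 + \omega_2)t\right],
    ```

    so energy injected at $k_1$ and $k_2$ is redistributed to the triadically consistent scales $k_1 \pm k_2$ examined in the experiment.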

  18. Large-scale hydropower system optimization using dynamic programming and object-oriented programming: the case of the Northeast China Power Grid.

    PubMed

    Li, Ji-Qing; Zhang, Yu-Shan; Ji, Chang-Ming; Wang, Ai-Jing; Lund, Jay R

    2013-01-01

    This paper examines long-term optimal operation using dynamic programming for a large hydropower system of 10 reservoirs in Northeast China. Besides considering flow and hydraulic head, the optimization explicitly includes time-varying electricity market prices to maximize benefit. Two techniques are used to reduce the 'curse of dimensionality' of dynamic programming with many reservoirs. Discrete differential dynamic programming (DDDP) reduces the search space and computer memory needed. Object-oriented programming (OOP) and the ability to dynamically allocate and release memory in C++ greatly reduce the cumulative memory demand of solving multi-dimensional dynamic programming models. The case study shows that the model can mitigate the 'curse of dimensionality' and achieve satisfactory results.
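
    The core recursion behind such models is a backward dynamic program over discrete storage states, sketched below for a single toy reservoir. The inflow and price numbers are invented; the real model adds hydraulic head, ten coupled reservoirs, and DDDP's corridor-restricted search around a trial trajectory.

    ```python
    # Backward DP: choose releases over T stages to maximize price-weighted
    # release, with storage bounded by s_max. value[s] holds the best
    # benefit-to-go from storage level s at the current stage boundary.
    def optimize(inflows, prices, s_max=4, s0=2):
        T = len(inflows)
        value = [0.0] * (s_max + 1)          # terminal value: empty-handed
        for t in reversed(range(T)):
            new = [float("-inf")] * (s_max + 1)
            for s in range(s_max + 1):
                for release in range(s + inflows[t] + 1):
                    s_next = s + inflows[t] - release
                    if s_next > s_max:
                        continue             # spill not modeled in this sketch
                    benefit = prices[t] * release + value[s_next]
                    new[s] = max(new[s], benefit)
            value = new
        return value[s0]

    # High price now, low price next stage: the DP releases early and stores
    # the cheap-period inflow for the final stage.
    print(optimize(inflows=[1, 2, 1], prices=[3.0, 1.0, 2.0]))  # -> 15.0
    ```

    The state space here is one-dimensional; with ten reservoirs it becomes a ten-dimensional grid, which is precisely the 'curse of dimensionality' that DDDP and careful memory management attack.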

  19. Digital Systems Validation Handbook. Volume 2. Chapter 18. Avionic Data Bus Integration Technology

    DTIC Science & Technology

    1993-11-01

    interaction between a digital data bus and an avionic system. Very Large Scale Integration (VLSI) ICs and multiversion software, which make up digital...1984, the Sperry Corporation developed a fault tolerant system which employed multiversion programming, voting, and monitoring for error detection and...formulate all the significant behavior of a system. MULTIVERSION PROGRAMMING. N-version programming. N-VERSION PROGRAMMING. The independent coding of a

  20. Large Composite Structures Processing Technologies for Reusable Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Clinton, R. G., Jr.; Vickers, J. H.; McMahon, W. M.; Hulcher, A. B.; Johnston, N. J.; Cano, R. J.; Belvin, H. L.; McIver, K.; Franklin, W.; Sidwell, D.

    2001-01-01

    Significant efforts have been devoted to establishing the technology foundation to enable the progression to large-scale composite structures fabrication. We are not capable today of fabricating many of the composite structures envisioned for the second generation reusable launch vehicle (RLV). Conventional 'aerospace' manufacturing and processing methodologies (fiber placement, autoclave, tooling) will require substantial investment and lead time to scale up. Out-of-autoclave process techniques will require aggressive efforts to mature the selected technologies and to scale up. Focused composite processing technology development and demonstration programs utilizing the building block approach are required to enable envisioned second generation RLV large composite structures applications. Government/industry partnerships have demonstrated success in this area and represent the best combination of skills and capabilities to achieve this goal.

  1. More Time or Better Tools? A Large-Scale Retrospective Comparison of Pedagogical Approaches to Teach Programming

    ERIC Educational Resources Information Center

    Silva-Maceda, Gabriela; Arjona-Villicaña, P. David; Castillo-Barrera, F. Edgar

    2016-01-01

    Learning to program is a complex task, and the impact of different pedagogical approaches to teach this skill has been hard to measure. This study examined the performance data of seven cohorts of students (N = 1168) learning programming under three different pedagogical approaches. These pedagogical approaches varied either in the length of the…

  2. The Wide-Scale Implementation of a Support Program for Parents of Children with an Intellectual Disability and Difficult Behaviour

    ERIC Educational Resources Information Center

    Hudson, Alan; Cameron, Christine; Matthews, Jan

    2008-01-01

    Background: While there have been several evaluations of programs to help parents manage difficult behaviour of their child with an intellectual disability, little research has focused on the evaluation of such programs when delivered to large populations. Method: The benchmarks recommended by Wiese, Stancliffe, and Hemsley (2005) were used to…

  3. The Management Aspect of the e-Portfolio as an Assessment Tool: Sample of Anadolu University

    ERIC Educational Resources Information Center

    Ozgur, Aydin Ziya; Kaya, Secil

    2011-01-01

    This article intends to introduce an e-portfolio system to help mentors assess the teacher candidates' performances and products in a large-scale open and distance learning teacher training program. The Pre-School Teacher Training Program (PSTTP) of Anadolu University is a completely distance program that helps around 12,000 students get the…

  4. Environmental Education Organizations and Programs in Texas: Identifying Patterns through a Database and Survey Approach for Establishing Frameworks for Assessment and Progress

    ERIC Educational Resources Information Center

    Lloyd-Strovas, Jenny D.; Arsuffi, Thomas L.

    2016-01-01

    We examined the diversity of environmental education (EE) in Texas, USA, by developing a framework to assess EE organizations and programs at a large scale: the Environmental Education Database of Organizations and Programs (EEDOP). This framework consisted of the following characteristics: organization/visitor demographics, pedagogy/curriculum,…

  5. Annual Review of Research Under the Joint Service Electronics Program.

    DTIC Science & Technology

    1979-10-01

    Contents: Quadratic Optimization Problems; Nonlinear Control; Nonlinear Fault Analysis; Qualitative Analysis of Large Scale Systems; Multidimensional System Theory; Optical Noise; and Pattern Recognition.

  6. The 2-MEV model: Constancy of adolescent environmental values within an 8-year time frame

    NASA Astrophysics Data System (ADS)

    Bogner, F. X.; Johnson, B.; Buxner, S.; Felix, L.

    2015-08-01

    The 2-MEV model is a widely used tool to monitor children's environmental perception by scoring individual values. Although the scale's validity has been confirmed repeatedly and independently, and the scale is in use in more than two dozen language versions all over the world, its longitudinal properties still need clarification. The purpose of the present study was therefore to validate the 2-MEV scale against a large data set of 10,676 children collected over an eight-year period. Cohorts from three different US states contributed to the sample by responding to a paper-and-pencil questionnaire as part of pre-test initiatives in the context of field center programs. Since we used only the pre-program 2-MEV scale results (that is, scores obtained before participation in the education programs), the data were unspoiled by any follow-up interventions. The purpose of the analysis was fourfold: first, to test and confirm the hypothesized factor structure for the large data set and for the subsample of each of the three states; second, to analyze the scoring pattern across the eight-year time range for both preservation and utilitarian preferences; third, to investigate any age effects in the extracted factors; and finally, to extract suitable recommendations for educational implementation efforts.

  7. Deployment Pulmonary Health

    DTIC Science & Technology

    2015-02-11

    A similar risk-based approach may be appropriate for deploying military personnel. e) If DoD were to consider implementing a large-scale pre...quality of existing spirometry programs prior to considering a larger-scale pre-deployment effort. Identifying an accelerated decrease in spirometry...baseline spirometry on a wider scale. e) Conduct pre-deployment baseline spirometry if there is a significant risk of exposure to a pulmonary hazard based

  8. Literacy Assessment in the Service of Literacy Policy.

    ERIC Educational Resources Information Center

    Venezky, Richard L.

    Literacy policy has often developed independently of other social and employment programs. As a consequence, literacy tends to become an end unto itself, and assessment is directed more toward academic, archival ends than toward policy evaluation. Many justifications given for large-scale literacy programs are not based upon empirical data.…

  9. Finite element meshing of ANSYS (trademark) solid models

    NASA Technical Reports Server (NTRS)

    Kelley, F. S.

    1987-01-01

ANSYS, a large-scale, general-purpose finite element computer program developed and marketed by Swanson Analysis Systems, Inc., is discussed. ANSYS was perhaps the first commercially available program to offer truly interactive finite element model generation. Its solid modeling capability is briefly discussed and illustrated.

  10. Comparing Public, Private, and Market Schools: The International Evidence

    ERIC Educational Resources Information Center

    Coulson, Andrew J.

    2009-01-01

    Would large-scale, free-market reforms improve educational outcomes for American children? This question cannot be reliably answered by looking exclusively at domestic evidence, much less by looking exclusively at existing "school choice" programs. Though many such programs have been implemented around the United States, none has created…

  11. Disaggregated Effects of Device on Score Comparability

    ERIC Educational Resources Information Center

    Davis, Laurie; Morrison, Kristin; Kong, Xiaojing; McBride, Yuanyuan

    2017-01-01

    The use of tablets for large-scale testing programs has transitioned from concept to reality for many state testing programs. This study extended previous research on score comparability between tablets and computers with high school students to compare score distributions across devices for reading, math, and science and to evaluate device…

  12. Assessing Students in the Margin: Challenges, Strategies, and Techniques

    ERIC Educational Resources Information Center

    Russell, Michael; Kavanaugh, Maureen

    2011-01-01

    The importance of student assessment, particularly for summative purposes, has increased greatly over the past thirty years. At the same time, emphasis on including all students in assessment programs has also increased. Assessment programs, whether they are large-scale, district-based, or teacher developed, have traditionally attempted to assess…

  13. An Introduction to the Safe Schools/Healthy Students Initiative

    ERIC Educational Resources Information Center

    Modzeleski, William; Mathews-Younes, Anne; Arroyo, Carmen G.; Mannix, Danyelle; Wells, Michael E.; Hill, Gary; Yu, Ping; Murray, Stephen

    2012-01-01

    The Safe Schools/Healthy Students (SS/HS) Initiative offers a unique opportunity to conduct large-scale, multisite, multilevel program evaluation in the context of a federal environment that places many requirements and constraints on how the grants are conducted and managed. Federal programs stress performance-based outcomes, valid and reliable…

  14. Large Scale Quality Engineering in Distance Learning Programs

    ERIC Educational Resources Information Center

    Herron, Rita I.; Holsombach-Ebner, Cinda; Shomate, Alice K.; Szathmary, Kimberly J.

    2012-01-01

    Embry-Riddle Aeronautical University--Worldwide serves more than 36,000 online students across the globe, many of whom are military and other non-traditional students, offering 34 undergraduate, graduate, and professional education/workforce certificate programs, presented both online and via blended delivery modes. The centralized model of online…

  15. Development of the Large-Scale Forcing Data to Support MC3E Cloud Modeling Studies

    NASA Astrophysics Data System (ADS)

    Xie, S.; Zhang, Y.

    2011-12-01

The large-scale forcing fields (e.g., vertical velocity and advective tendencies) are required to run single-column and cloud-resolving models (SCMs/CRMs), the two key modeling frameworks widely used to link field data to climate model development. In this study, we use an advanced objective analysis approach to derive the required forcing data from the soundings collected by the Midlatitude Continental Convective Clouds Experiment (MC3E) in support of its cloud modeling studies. MC3E is the latest major field campaign, conducted from 22 April to 6 June 2011 in south-central Oklahoma through a joint effort between the DOE ARM program and the NASA Global Precipitation Measurement program. One of its primary goals is to provide a comprehensive dataset that can be used to describe the large-scale environment of convective cloud systems and to evaluate model cumulus parameterizations. The objective analysis used in this study is the constrained variational analysis method. A unique feature of this approach is the use of domain-averaged surface and top-of-the-atmosphere (TOA) observations (e.g., precipitation and radiative and turbulent fluxes) as constraints to adjust atmospheric state variables from soundings by the smallest possible amount to conserve column-integrated mass, moisture, and static energy, so that the final analysis data are dynamically and thermodynamically consistent. To address potential uncertainties in the surface observations, an ensemble forcing dataset will be developed. Multi-scale forcing will also be created for simulating convective systems of various scales. At the meeting, we will provide more details about the forcing development and present some preliminary analysis of the characteristics of the large-scale forcing structures for several selected convective systems observed during MC3E.

  16. Applications of Parallel Process HiMAP for Large Scale Multidisciplinary Problems

    NASA Technical Reports Server (NTRS)

    Guruswamy, Guru P.; Potsdam, Mark; Rodriguez, David; Kwak, Dochay (Technical Monitor)

    2000-01-01

HiMAP is a three-level parallel middleware that can be interfaced to a large-scale global design environment for code-independent, multidisciplinary analysis using high-fidelity equations. Aerospace technology needs are changing rapidly, and computational tools compatible with the requirements of national programs such as space transportation are needed. Conventional computational tools are inadequate for modern aerospace design needs. Advanced, modular computational tools are needed, such as those that incorporate the technology of massively parallel processors (MPPs).

  17. Interferometric Mapping of Perseus Outflows with MASSES

    NASA Astrophysics Data System (ADS)

    Stephens, Ian; Dunham, Michael; Myers, Philip C.; MASSES Team

    2017-01-01

    The MASSES (Mass Assembly of Stellar Systems and their Evolution with the SMA) survey, a Submillimeter Array (SMA) large-scale program, is mapping molecular lines and continuum emission about the 75 known Class 0/I sources in the Perseus Molecular Cloud. In this talk, I present some of the key results of this project, with a focus on the CO(2-1) maps of the molecular outflows. In particular, I investigate how protostars inherit their rotation axes from large-scale magnetic fields and filamentary structure.

  18. Strategies to facilitate implementation and sustainability of large system transformations: a case study of a national program for improving quality of care for elderly people.

    PubMed

    Nyström, Monica Elisabeth; Strehlenert, Helena; Hansson, Johan; Hasson, Henna

    2014-09-18

Large-scale change initiatives stimulating change in several organizational systems in the health and social care sector are challenging both to lead and to evaluate. There is a lack of systematic research that can enrich our understanding of strategies to facilitate large system transformations in this sector. The purpose of this study was to examine the characteristics of core activities and strategies to facilitate implementation and change of a national program aimed at improving life for the most ill elderly people in Sweden. The program outcomes were also addressed to assess the impact of these strategies. A longitudinal case study design with multiple data collection methods was applied. Archival data (n = 795), interviews with key stakeholders (n = 11) and non-participant observations (n = 23) were analysed using content analysis. Outcome data were obtained from national quality registries. This study presents an approach for implementing a large national change program that is characterized by initial flexibility and dynamism regarding content and facilitation strategies and a growing complexity over time requiring more structure and coordination. The description of activities and strategies shows that the program management team engaged a variety of stakeholders and actor groups and accordingly used a palette of different strategies. The main strategies used to influence change in the target organisations were regional improvement coaches, regional strategic management teams, national quality registries, financial incentives and annually revised agreements. Interactive learning sessions, intense communication, monitoring and measurement, and active involvement of different experts and stakeholders, including elderly people, complemented these strategies. Program outcomes showed steady progress in most of the five target areas, less so for the target of achieving coordinated care. There is no blueprint for approaching the challenging task of leading large-scale change programs in complex contexts, but our conclusion is that more attention must be given to the multidimensional strategies that program management needs to consider. This multidimensionality comprises different strategies depending on types of actors, system levels, contextual factors, program progress over time, program content, types of learning and change processes, and the conditions for sustainability.

  19. A state-based national network for effective wildlife conservation

    USGS Publications Warehouse

    Meretsky, Vicky J.; Maguire, Lynn A.; Davis, Frank W.; Stoms, David M.; Scott, J. Michael; Figg, Dennis; Goble, Dale D.; Griffith, Brad; Henke, Scott E.; Vaughn, Jacqueline; Yaffee, Steven L.

    2012-01-01

State wildlife conservation programs provide a strong foundation for biodiversity conservation in the United States, building on state wildlife action plans. However, states may miss the species at greatest risk at rangewide scales, and threats such as novel diseases and climate change increasingly act at regional and national levels. Regional collaborations among states and their partners have had impressive successes, and several federal programs now incorporate state priorities. However, regional collaborations are uneven across the country, and no national counterpart exists to support efforts at that scale. A national conservation-support program could fill this gap and could work across the conservation community to identify large-scale conservation needs and support efforts to meet them. By providing important information-sharing and capacity-building services, such a program would advance collaborative conservation among the states and their partners, thus increasing both the effectiveness and the efficiency of conservation in the United States.

  20. Investigating the dependence of SCM simulated precipitation and clouds on the spatial scale of large-scale forcing at SGP

    DOE PAGES

    Tang, Shuaiqi; Zhang, Minghua; Xie, Shaocheng

    2017-08-05

Large-scale forcing data, such as vertical velocity and advective tendencies, are required to drive single-column models (SCMs), cloud-resolving models, and large-eddy simulations. Previous studies suggest that some errors in these model simulations could be attributed to the lack of spatial variability in the specified domain-mean large-scale forcing. This study investigates the spatial variability of the forcing and explores its impact on SCM simulated precipitation and clouds. Gridded large-scale forcing data from the March 2000 Cloud Intensive Operational Period at the Atmospheric Radiation Measurement program's Southern Great Plains site are used for analysis and to drive the single-column version of the Community Atmosphere Model Version 5 (SCAM5). When the gridded forcing data show large spatial variability, such as during a frontal passage, SCAM5 driven by the domain-mean forcing is not able to capture convective systems that are only partly located in the domain or that occupy only part of the domain. This problem is largely reduced by using the gridded forcing data, which allow running SCAM5 in each subcolumn and then averaging the results within the domain, because the subcolumns have a better chance to capture the timing of the frontal propagation and the small-scale systems. Other potential uses of the gridded forcing data, such as understanding and testing scale-aware parameterizations, are also discussed.

  1. Exploiting multi-scale parallelism for large scale numerical modelling of laser wakefield accelerators

    NASA Astrophysics Data System (ADS)

    Fonseca, R. A.; Vieira, J.; Fiuza, F.; Davidson, A.; Tsung, F. S.; Mori, W. B.; Silva, L. O.

    2013-12-01

A new generation of laser wakefield accelerators (LWFA), supported by the extreme accelerating fields generated in the interaction of PW-class lasers and underdense targets, promises the production of high quality electron beams in short distances for multiple applications. Achieving this goal will rely heavily on numerical modelling to further understand the underlying physics and identify optimal regimes, but large scale modelling of these scenarios is computationally heavy and requires the efficient use of state-of-the-art petascale supercomputing systems. We discuss the main difficulties involved in running these simulations and the new developments implemented in the OSIRIS framework to address these issues, ranging from multi-dimensional dynamic load balancing and hybrid distributed/shared memory parallelism to the vectorization of the PIC algorithm. We present the results of the OASCR Joule Metric program on the issue of large scale modelling of LWFA, demonstrating speedups of over one order of magnitude on the same hardware. Finally, scalability to over ~10^6 cores and sustained performance over ~2 PFlop/s is demonstrated, opening the way for large scale modelling of LWFA scenarios.

  2. Supporting Source Code Comprehension during Software Evolution and Maintenance

    ERIC Educational Resources Information Center

    Alhindawi, Nouh

    2013-01-01

    This dissertation addresses the problems of program comprehension to support the evolution of large-scale software systems. The research concerns how software engineers locate features and concepts along with categorizing changes within very large bodies of source code along with their versioned histories. More specifically, advanced Information…

  3. Applications of Magnetic Suspension Technology to Large Scale Facilities: Progress, Problems and Promises

    NASA Technical Reports Server (NTRS)

    Britcher, Colin P.

    1997-01-01

This paper will briefly review previous work in wind tunnel Magnetic Suspension and Balance Systems (MSBS) and will examine the handful of systems around the world currently known to be in operational condition or undergoing recommissioning. Technical developments emerging from research programs at NASA and elsewhere will be reviewed briefly, where there is potential impact on large-scale MSBSs. The likely aerodynamic applications for large MSBSs will be addressed, since these applications should properly drive system designs. A recently proposed application to ultra-high Reynolds number testing will then be addressed in some detail. Finally, some opinions on the technical feasibility and usefulness of a large MSBS will be given.

  4. Lessons from a pilot program to induce stove replacements in Chile: design, implementation and evaluation

    NASA Astrophysics Data System (ADS)

    Gómez, Walter; Chávez, Carlos; Salgado, Hugo; Vásquez, Felipe

    2017-11-01

    We present the design, implementation, and evaluation of a subsidy program to introduce cleaner and more efficient household wood combustion technologies. The program was conducted in the city of Temuco, one of the most polluted cities in southern Chile, as a pilot study to design a new national stove replacement initiative for pollution control. In this city, around 90% of the total emissions of suspended particulate matter is caused by households burning wood. We created a simulated market in which households could choose among different combustion technologies with an assigned subsidy. The subsidy was a relevant factor in the decision to participate, and the inability to secure credit was a significant constraint for the participation of low-income households. Due to several practical difficulties and challenges associated with the implementation of large-scale programs that encourage technological innovation at the household level, it is strongly advisable to start with a small-scale pilot that can provide useful insights into the final design of a fuller, larger-scale program.

  5. The Effectiveness of Private School Franchises in Chile's National Voucher Program

    ERIC Educational Resources Information Center

    Elacqua, Gregory; Contreras, Dante; Salazar, Felipe; Santos, Humberto

    2011-01-01

    There is persistent debate over the role of scale of operations in education. Some argue that school franchises offer educational services more effectively than small independent schools. Skeptics counter that large centralized operations create hard-to-manage bureaucracies and foster diseconomies of scale and that small schools are more effective…

  6. Relative Costs of Various Types of Assessments.

    ERIC Educational Resources Information Center

    Wheeler, Patricia H.

    Issues of the relative costs of multiple choice tests and alternative types of assessment are explored. Before alternative assessments in large-scale or small-scale programs are used, attention must be given to cost considerations and the resources required to develop and implement the assessment. Major categories of cost to be considered are…

  7. Fuel savings potential of the NASA Advanced Turboprop Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whitlow, J.B. Jr.; Sievers, G.K.

    1984-01-01

The NASA Advanced Turboprop (ATP) Program is directed at developing new technology for highly loaded, multibladed propellers for use at Mach 0.65 to 0.85 and at altitudes compatible with the air transport system requirements. Advanced turboprop engines offer the potential of 15 to 30 percent savings in aircraft block fuel relative to advanced turbofan engines (50 to 60 percent savings over today's turbofan fleet). The concept, propulsive efficiency gains, block fuel savings and other benefits, and the program objectives through a systems approach are described. Current program status and major accomplishments in both single-rotation and counter-rotation propeller technology are addressed. The overall program, from scale-model wind tunnel tests to large-scale flight tests on testbed aircraft, is discussed.

  8. THE MARK I BUSINESS SYSTEM SIMULATION MODEL

    DTIC Science & Technology

    of a large-scale business simulation model as a vehicle for doing research in management controls. The major results of the program were the...development of the Mark I business simulation model and the Simulation Package (SIMPAC). SIMPAC is a method and set of programs facilitating the construction...of large simulation models. The object of this document is to describe the Mark I Corporation model, state why parts of the business were modeled as they were, and indicate the research applications of the model. (Author)

  9. The Concert system - Compiler and runtime technology for efficient concurrent object-oriented programming

    NASA Technical Reports Server (NTRS)

    Chien, Andrew A.; Karamcheti, Vijay; Plevyak, John; Sahrawat, Deepak

    1993-01-01

    Concurrent object-oriented languages, particularly fine-grained approaches, reduce the difficulty of large scale concurrent programming by providing modularity through encapsulation while exposing large degrees of concurrency. Despite these programmability advantages, such languages have historically suffered from poor efficiency. This paper describes the Concert project whose goal is to develop portable, efficient implementations of fine-grained concurrent object-oriented languages. Our approach incorporates aggressive program analysis and program transformation with careful information management at every stage from the compiler to the runtime system. The paper discusses the basic elements of the Concert approach along with a description of the potential payoffs. Initial performance results and specific plans for system development are also detailed.

  10. Eigensolver for a Sparse, Large Hermitian Matrix

    NASA Technical Reports Server (NTRS)

    Tisdale, E. Robert; Oyafuso, Fabiano; Klimeck, Gerhard; Brown, R. Chris

    2003-01-01

    A parallel-processing computer program finds a few eigenvalues in a sparse Hermitian matrix that contains as many as 100 million diagonal elements. This program finds the eigenvalues faster, using less memory, than do other, comparable eigensolver programs. This program implements a Lanczos algorithm in the American National Standards Institute/ International Organization for Standardization (ANSI/ISO) C computing language, using the Message Passing Interface (MPI) standard to complement an eigensolver in PARPACK. [PARPACK (Parallel Arnoldi Package) is an extension, to parallel-processing computer architectures, of ARPACK (Arnoldi Package), which is a collection of Fortran 77 subroutines that solve large-scale eigenvalue problems.] The eigensolver runs on Beowulf clusters of computers at the Jet Propulsion Laboratory (JPL).
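    The Lanczos approach described above is also available in serial form through SciPy, whose `eigsh` routine wraps ARPACK, the package that PARPACK parallelizes. A hedged sketch (not the JPL program), using a small, analytically checkable tridiagonal matrix as a stand-in for the 100-million-element problem:

```python
# Finding a few eigenvalues of a sparse Hermitian matrix with a
# Lanczos-type solver. The 1-D discrete Laplacian below is a small
# illustrative stand-in; sizes and parameters are assumptions.
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import eigsh

n = 1000
A = diags([-1.0, 2.0, -1.0], offsets=[-1, 0, 1], shape=(n, n), format="csc")

# Shift-invert about sigma=0 targets the smallest eigenvalues, the usual
# mode for extracting a few extremal eigenvalues of a large sparse matrix.
vals = eigsh(A, k=4, sigma=0, which="LM", return_eigenvectors=False)
print(np.sort(vals))
```

    For this matrix the eigenvalues are known in closed form (2 - 2cos(k*pi/(n+1))), which makes the sketch easy to verify.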

  11. Using MHD Models for Context for Multispacecraft Missions

    NASA Astrophysics Data System (ADS)

    Reiff, P. H.; Sazykin, S. Y.; Webster, J.; Daou, A.; Welling, D. T.; Giles, B. L.; Pollock, C.

    2016-12-01

    The use of global MHD models such as BATS-R-US to provide context to data from widely spaced multispacecraft mission platforms is gaining in popularity and in effectiveness. Examples are shown, primarily from the Magnetospheric Multiscale Mission (MMS) program compared to BATS-R-US. We present several examples of large-scale magnetospheric configuration changes such as tail dipolarization events and reconfigurations after a sector boundary crossing which are made much more easily understood by placing the spacecraft in the model fields. In general, the models can reproduce the large-scale changes observed by the various spacecraft but sometimes miss small-scale or rapid time changes.

  12. Iraq: Recent Developments in Reconstruction Assistance

    DTIC Science & Technology

    2005-01-27

Developments in Reconstruction Assistance Summary Large-scale reconstruction assistance programs are being undertaken by the United States following the war...in grant aid and as much as $13.3 billion in possible loans. On June 28, 2004, the entity implementing assistance programs, the Coalition Provisional...programs are being undertaken by the United States in Iraq. This report describes recent developments in this assistance effort. The report will be updated

  13. Iraq: Recent Developments in Reconstruction Assistance

    DTIC Science & Technology

    2004-12-20

Developments in Reconstruction Assistance Summary Large-scale reconstruction assistance programs are being undertaken by the United States following the war...in grant aid and as much as $13.3 billion in possible loans. On June 28, 2004, the entity implementing assistance programs, the Coalition...programs are being undertaken by the United States in Iraq. This report describes recent developments in this assistance effort. The report will be

  14. The USNO Astrometry Department

    Science.gov Websites

New techniques and methods, such as large-scale CCD measuring devices and speckle and radio interferometry, are being developed. Results of the observational programs are published in the Naval Observatory Publications and in refereed journals.

  15. GenomeDiagram: a python package for the visualization of large-scale genomic data.

    PubMed

    Pritchard, Leighton; White, Jennifer A; Birch, Paul R J; Toth, Ian K

    2006-03-01

    We present GenomeDiagram, a flexible, open-source Python module for the visualization of large-scale genomic, comparative genomic and other data with reference to a single chromosome or other biological sequence. GenomeDiagram may be used to generate publication-quality vector graphics, rastered images and in-line streamed graphics for webpages. The package integrates with datatypes from the BioPython project, and is available for Windows, Linux and Mac OS X systems. GenomeDiagram is freely available as source code (under GNU Public License) at http://bioinf.scri.ac.uk/lp/programs.html, and requires Python 2.3 or higher, and recent versions of the ReportLab and BioPython packages. A user manual, example code and images are available at http://bioinf.scri.ac.uk/lp/programs.html.

  16. Evaluation of the clinical implementation of a large-scale online e-learning program on venous blood specimen collection guideline practices.

    PubMed

    Willman, Britta; Grankvist, Kjell; Bölenius, Karin

    2018-05-11

When performed erroneously, the venous blood specimen collection (VBSC) practice steps of patient identification, test request management and test tube labeling carry a high risk of jeopardizing patient safety. VBSC educational programs with the intention to minimize risk of harm to patients are therefore needed. In this study, we evaluate the efficiency of a large-scale online e-learning program on personnel's adherence to VBSC practices and their experience of the e-learning program. An interprofessional team transformed an implemented traditional VBSC education program into an online e-learning program developed to stimulate reflection, with focus on the high-risk practice steps. We used questionnaires to evaluate the effect of the e-learning program on personnel's self-reported adherence to VBSC practices, compared to questionnaire surveys before and after introduction of the traditional education program. We used content analysis to evaluate the participants' free-text experience of the VBSC e-learning program. Adherence to the VBSC guideline high-risk practice steps generally increased following the implementation of a traditional educational program followed by an e-learning program. However, we found a negative trend over the years regarding participation rates and the practice of always sending/signing the request form following the introduction of an electronic request system. The participants were in general content with the VBSC e-learning program. Properly designed e-learning programs on VBSC practices supersede traditional educational programs in usefulness and functionality. Inclusion of questionnaires in the e-learning program is necessary for follow-up of participants' VBSC practices and educational program efficiency.

  17. Large Scale GW Calculations on the Cori System

    NASA Astrophysics Data System (ADS)

    Deslippe, Jack; Del Ben, Mauro; da Jornada, Felipe; Canning, Andrew; Louie, Steven

    The NERSC Cori system, powered by 9000+ Intel Xeon-Phi processors, represents one of the largest HPC systems for open-science in the United States and the world. We discuss the optimization of the GW methodology for this system, including both node level and system-scale optimizations. We highlight multiple large scale (thousands of atoms) case studies and discuss both absolute application performance and comparison to calculations on more traditional HPC architectures. We find that the GW method is particularly well suited for many-core architectures due to the ability to exploit a large amount of parallelism across many layers of the system. This work was supported by the U.S. Department of Energy, Office of Science, Basic Energy Sciences, Materials Sciences and Engineering Division, as part of the Computational Materials Sciences Program.

  18. Development of fine-resolution analyses and expanded large-scale forcing properties. Part II: Scale-awareness and application to single-column model experiments

    DOE PAGES

    Feng, Sha; Vogelmann, Andrew M.; Li, Zhijin; ...

    2015-01-20

Fine-resolution three-dimensional fields have been produced using the Community Gridpoint Statistical Interpolation (GSI) data assimilation system for the U.S. Department of Energy's Atmospheric Radiation Measurement Program (ARM) Southern Great Plains region. The GSI system is implemented in a multi-scale data assimilation framework using the Weather Research and Forecasting model at a cloud-resolving resolution of 2 km. From the fine-resolution three-dimensional fields, large-scale forcing is derived explicitly at grid-scale resolution; a subgrid-scale dynamic component is derived separately, representing subgrid-scale horizontal dynamic processes. Analyses show that the subgrid-scale dynamic component is often a major component of the large-scale forcing for grid scales larger than 200 km. The single-column model (SCM) of the Community Atmosphere Model version 5 (CAM5) is used to examine the impact of the grid-scale and subgrid-scale dynamic components on simulated precipitation and cloud fields associated with a mesoscale convective system. It is found that grid-scale size impacts simulated precipitation, resulting in an overestimation for grid scales of about 200 km but an underestimation for smaller grids. The subgrid-scale dynamic component has an appreciable impact on the simulations, suggesting that grid-scale and subgrid-scale dynamic components should be considered in the interpretation of SCM simulations.

  19. Measurement characteristics of the levels of institutionalization scales: examining reliability and validity.

    PubMed

    Barab, S A; Redman, B K; Froman, R D

    1998-01-01

    The Level of Institutionalization (LoIn) scales were developed to assess the extent to which a health promotion program has become integrated into a health care organization. The instrument was designed specifically to measure the amount of routinization and niche saturation of four subsystems (production, maintenance, supportive, and managerial) believed to make up an organization. In this study, the LoIn scales were completed for diabetes programs in 102 general hospitals and 30 home health agencies in Maryland and Pennsylvania. Reliability estimates across the four subsystems for routines (alpha = .61) and for niche saturation (alpha = .44) were substandard. Average correlation among the four subsystems for routines was .67, and among the four subsystems for niche saturation was .38, indicating moderate to large amounts of shared variance among subsystems and challenging claims of discriminant validity. Given these large correlations and a poor fit when testing the eight-factor model, higher-order confirmatory factor analyses were carried out. Results supported the existence of two second-order factors. When collapsed into two factors, the reliabilities were adequate (routines alpha = .90; niche saturation alpha = .80). Criterion-related validity also was found between length of program existence and the routine factor.
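    The alpha coefficients reported above are Cronbach's alpha, which can be computed directly from a respondents-by-items score matrix. A minimal sketch with simulated data (not the LoIn study's):

```python
# Cronbach's alpha from a 2-D score matrix (rows = respondents,
# columns = items). The simulated data below are illustrative only.
import numpy as np

def cronbach_alpha(scores):
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                         # number of items
    item_vars = scores.var(axis=0, ddof=1)      # per-item variances
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of the sum score
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

rng = np.random.default_rng(0)
latent = rng.normal(size=200)
# Four items all loading on one latent trait -> high internal consistency
items = np.column_stack([latent + 0.5 * rng.normal(size=200) for _ in range(4)])
print(round(cronbach_alpha(items), 2))
```

    Items that share little common variance, as in the four-subsystem scoring criticized above, drive the bracketed variance ratio toward one and alpha correspondingly down.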

  20. Literacy in the Southern Sudan: A Case Study of Variables Affecting Literacy Programs.

    ERIC Educational Resources Information Center

    Cowan, J. Ronayne

    1983-01-01

    Describes the Local Languages Literacy Project in the Southern Sudan; delineates the most important educational, socioeconomic, and linguistic variables affecting the success of large-scale literacy programs in Africa; and questions the widely held assumption that indigenous language literacy is essential to subsequent literacy in the prestige…

  1. Assessing Student Achievement in Large-Scale Educational Programs Using Hierarchical Propensity Scores

    ERIC Educational Resources Information Center

    Vaughan, Angela L.; Lalonde, Trent L.; Jenkins-Guarnieri, Michael A.

    2014-01-01

    Many researchers assessing the efficacy of educational programs face challenges due to issues with non-randomization and the likelihood of dependence between nested subjects. The purpose of the study was to demonstrate a rigorous research methodology using a hierarchical propensity score matching method that can be utilized in contexts where…

  2. Examining the Types, Features, and Use of Instructional Materials in Afterschool Science

    ERIC Educational Resources Information Center

    D'Angelo, Cynthia M.; Harris, Christopher J.; Lundh, Patrik; House, Ann; Leones, Tiffany; Llorente, Carlin

    2017-01-01

    Afterschool programs have garnered much attention as promising environments for learning where children can engage in rich science activities. Yet, little is known about the kinds of instructional materials used in typical, large-scale afterschool programs that implement science with diverse populations of children. In this study, we investigated…

  3. Program Development: Identification and Formulation of Desirable Educational Goals.

    ERIC Educational Resources Information Center

    Goodlad, John I.

    In this speech, the author suggests that the success of public schools depends heavily on commitment to and large-scale agreement on educational goals. He examines the difficulty in creating rational programs to carry out specific behavioral goals and the more remote ends usually stated for educational systems. The author then discusses the…

  4. Using the ACRL Framework to Develop a Student-Centered Model for Program-Level Assessment

    ERIC Educational Resources Information Center

    Gammons, Rachel Wilder; Inge, Lindsay Taylor

    2017-01-01

    Information literacy instruction presents a difficult balance between quantity and quality, particularly for large-scale general education courses. This paper discusses the overhaul of the freshman composition instruction program at the University of Maryland Libraries, focusing on the transition from survey assessments to a student-centered and…

  5. Perspectives on the Integration of Technology and Assessment

    ERIC Educational Resources Information Center

    Pellegrino, James W.; Quellmalz, Edys S.

    2011-01-01

    This paper considers uses of technology in educational assessment from the perspective of innovation and support for teaching and learning. It examines assessment cases drawn from contexts that include large-scale testing programs as well as classroom-based programs, and attempts that have been made to harness the power of technology to provide…

  6. Study Gives Edge to 2 Math Programs

    ERIC Educational Resources Information Center

    Viadero, Debra

    2009-01-01

    This article reports that two programs for teaching mathematics in the early grades--Math Expressions and Saxon Math--emerged as winners in early findings released last week from a large-scale federal experiment that pits four popular, and philosophically distinct, math curricula against one another. But the results don't promise to end the…

  7. Delivering digital health and well-being at scale: lessons learned during the implementation of the dallas program in the United Kingdom.

    PubMed

    Devlin, Alison M; McGee-Lennon, Marilyn; O'Donnell, Catherine A; Bouamrane, Matt-Mouley; Agbakoba, Ruth; O'Connor, Siobhan; Grieve, Eleanor; Finch, Tracy; Wyke, Sally; Watson, Nicholas; Browne, Susan; Mair, Frances S

    2016-01-01

    To identify implementation lessons from the United Kingdom Delivering Assisted Living Lifestyles at Scale (dallas) program-a large-scale, national technology program that aims to deliver a broad range of digital services and products to the public to promote health and well-being. Prospective, longitudinal qualitative research study investigating implementation processes. Qualitative data collected included semi-structured e-Health Implementation Toolkit-led interviews at baseline/mid-point (n = 38), quarterly evaluation, quarterly technical and barrier and solutions reports, observational logs, quarterly evaluation alignment interviews with project leads, observational data collected during meetings, and ethnographic data from dallas events (n > 200 distinct pieces of qualitative data). Data analysis was guided by Normalization Process Theory, a sociological theory that aids conceptualization of implementation issues in complex healthcare settings. Five key challenges were identified: 1) The challenge of establishing and maintaining large heterogeneous, multi-agency partnerships to deliver new models of healthcare; 2) The need for resilience in the face of barriers and set-backs including the backdrop of continually changing external environments; 3) The inherent tension between embracing innovative co-design and achieving delivery at pace and at scale; 4) The effects of branding and marketing issues in consumer healthcare settings; and 5) The challenge of interoperability and information governance, when commercial proprietary models are dominant. The magnitude and ambition of the dallas program provides a unique opportunity to investigate the macro-level implementation challenges faced when designing and delivering digital health and wellness services at scale. Flexibility, adaptability, and resilience are key implementation facilitators when shifting to new digitally enabled models of care. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association.

  8. Full-scale flammability test data for validation of aircraft fire mathematical models

    NASA Technical Reports Server (NTRS)

    Kuminecz, J. F.; Bricker, R. W.

    1982-01-01

    Twenty-five large scale aircraft flammability tests were conducted in a Boeing 737 fuselage at the NASA Johnson Space Center (JSC). The objective of this test program was to provide a data base on the propagation of large scale aircraft fires to support the validation of aircraft fire mathematical models. Variables in the test program included cabin volume, amount of fuel, fuel pan area, fire location, airflow rate, and cabin materials. A number of tests were conducted with jet A-1 fuel only, while others were conducted with various Boeing 747 type cabin materials. These included urethane foam seats, passenger service units, stowage bins, and wall and ceiling panels. Two tests were also included using special urethane foam and polyimide foam seats. Tests were conducted with each cabin material individually, with various combinations of these materials, and finally, with all materials in the cabin. The data include information obtained from approximately 160 locations inside the fuselage.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Junghyun; Gangwon, Jo; Jaehoon, Jung

    Applications written solely in OpenCL or CUDA cannot execute on a cluster as a whole. Most previous approaches that extend these programming models to clusters are based on a common idea: designating a centralized host node and coordinating the other nodes with the host for computation. However, the centralized host node is a serious performance bottleneck when the number of nodes is large. In this paper, we propose a scalable and distributed OpenCL framework called SnuCL-D for large-scale clusters. SnuCL-D's remote device virtualization provides an OpenCL application with the illusion that all compute devices in a cluster are confined in a single node. To reduce the amount of control-message and data communication between nodes, SnuCL-D replicates the OpenCL host program execution and data in each node. We also propose a new OpenCL host API function and a queueing optimization technique that significantly reduce the overhead incurred by the previous centralized approaches. To show the effectiveness of SnuCL-D, we evaluate SnuCL-D with a microbenchmark and eleven benchmark applications on a large-scale CPU cluster and a medium-scale GPU cluster.

  10. Quantifying expert consensus against the existence of a secret, large-scale atmospheric spraying program

    NASA Astrophysics Data System (ADS)

    Shearer, Christine; West, Mick; Caldeira, Ken; Davis, Steven J.

    2016-08-01

    Nearly 17% of people in an international survey said they believed the existence of a secret large-scale atmospheric spraying program (SLAP) to be true or partly true. SLAP is commonly referred to as ‘chemtrails’ or ‘covert geoengineering’, and has led to a number of websites purporting to show evidence of widespread chemical spraying linked to negative impacts on human health and the environment. To address these claims, we surveyed two groups of experts—atmospheric chemists with expertise in condensation trails and geochemists working on atmospheric deposition of dust and pollution—to scientifically evaluate for the first time the claims of SLAP theorists. Results show that 76 of the 77 scientists (98.7%) who took part in this study said they had not encountered evidence of a SLAP, and that the data cited as evidence could be explained through other factors, including well-understood physics and chemistry associated with aircraft contrails and atmospheric aerosols. Our goal is not to sway those already convinced that there is a secret, large-scale spraying program—who often reject counter-evidence as further proof of their theories—but rather to establish a source of objective science that can inform public discourse.

  11. What Works? Common Practices in High Functioning Afterschool Programs across the Nation in Math, Reading, Science, Arts, Technology, and Homework--A Study by the National Partnership. The Afterschool Program Assessment Guide. CRESST Report 768

    ERIC Educational Resources Information Center

    Huang, Denise; Cho, Jamie; Mostafavi, Sima; Nam, Hannah H.; Oh, Christine; Harven, Aletha; Leon, Seth

    2010-01-01

    In an effort to identify and incorporate exemplary practices into existing and future afterschool programs, the U.S. Department of Education commissioned a large-scale evaluation of the 21st Century Community Learning Center (CCLC) program. The purpose of this evaluation project was to develop resources and professional development that addresses…

  12. US EPA - ToxCast and the Tox21 program: perspectives

    EPA Science Inventory

    ToxCast is a large-scale project being conducted by the U.S. EPA to screen ~2000 chemicals against a large battery of in vitro high-throughput screening (HTS) assays. ToxCast is complemented by the Tox21 project being jointly carried out by the U.S. NIH Chemical Genomics Center (...

  13. The Soldier Fitness Tracker: Global Delivery of Comprehensive Soldier Fitness

    ERIC Educational Resources Information Center

    Fravell, Mike; Nasser, Katherine; Cornum, Rhonda

    2011-01-01

    Carefully implemented technology strategies are vital to the success of large-scale initiatives such as the U.S. Army's Comprehensive Soldier Fitness (CSF) program. Achieving the U.S. Army's vision for CSF required a robust information technology platform that was scaled to millions of users and that leveraged the Internet to enable global reach.…

  14. Comparing Validity Evidence of Two ECERS-R Scoring Systems

    ERIC Educational Resources Information Center

    Zeng, Songtian

    2017-01-01

    Over 30 states have adopted the Early Childhood Environmental Rating Scale-Revised (ECERS-R) as a component of their program quality assessment systems, but the use of ECERS-R on such a large scale has raised important questions about implementation. One of the most pressing question centers upon decisions users must make between two scoring…

  15. Making automated computer program documentation a feature of total system design

    NASA Technical Reports Server (NTRS)

    Wolf, A. W.

    1970-01-01

    It is pointed out that in large-scale computer software systems, program documents are too often fraught with errors, out of date, poorly written, and sometimes nonexistent in whole or in part. The means are described by which many of these typical system documentation problems were overcome in a large and dynamic software project. A systems approach was employed which encompassed such items as: (1) configuration management; (2) standards and conventions; (3) collection of program information into central data banks; (4) interaction among executive, compiler, central data banks, and configuration management; and (5) automatic documentation. A complete description of the overall system is given.

  16. Solving Fuzzy Optimization Problem Using Hybrid LS-SA Method

    NASA Astrophysics Data System (ADS)

    Vasant, Pandian

    2011-06-01

    Fuzzy optimization has been one of the most prominent topics within the broad area of computational intelligence. It is especially relevant in the field of fuzzy non-linear programming, and its applications and practical realizations can be seen in many real-world problems. In this paper a large-scale non-linear fuzzy programming problem has been solved by the hybrid optimization techniques of Line Search (LS), Simulated Annealing (SA) and Pattern Search (PS). An industrial production-planning problem with a cubic objective function, 8 decision variables and 29 constraints has been solved successfully using the LS-SA-PS hybrid optimization technique. The computational results for the objective function with respect to the vagueness factor and the level of satisfaction are provided in the form of 2D and 3D plots. The outcome is very promising and strongly suggests that the hybrid LS-SA-PS algorithm is efficient and productive in solving large-scale non-linear fuzzy programming problems.
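    The simulated-annealing stage of such a hybrid can be sketched minimally as below. The one-dimensional quadratic test objective, cooling schedule, and parameters are illustrative placeholders only, not the 8-variable, 29-constraint production-planning model from the paper.

    ```python
    import math
    import random

    def simulated_annealing(f, x0, step=0.5, t0=1.0, cooling=0.95, iters=2000, seed=42):
        """Minimize f over one variable with a basic SA loop (illustrative)."""
        rng = random.Random(seed)
        x, fx = x0, f(x0)
        best, fbest = x, fx
        t = t0
        for _ in range(iters):
            cand = x + rng.uniform(-step, step)
            fc = f(cand)
            # Always accept downhill moves; accept uphill moves with
            # Boltzmann probability exp(-delta / t), which shrinks as t cools.
            if fc < fx or rng.random() < math.exp((fx - fc) / max(t, 1e-12)):
                x, fx = cand, fc
                if fx < fbest:
                    best, fbest = x, fx
            t *= cooling
        return best, fbest

    # Stand-in objective with a known minimum at x = 2.
    obj = lambda x: (x - 2.0) ** 2
    x_best, f_best = simulated_annealing(obj, x0=5.0)
    print(x_best, f_best)
    ```

    In a full LS-SA-PS hybrid, SA's role is global exploration; the line-search and pattern-search stages then refine the candidate it returns.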

  17. Stochastic Dynamic Mixed-Integer Programming (SD-MIP)

    DTIC Science & Technology

    2015-05-05

    stochastic linear programming (SLP) problems. By using a combination of ideas from cutting plane theory of deterministic MIP (especially disjunctive...developed to date. b) As part of this project, we have also developed tools for very large scale Stochastic Linear Programming (SLP). There are...several reasons for this. First, SLP models continue to challenge many of the fastest computers to date, and many applications within the DoD (e.g

  18. Algorithms for Large-Scale Astronomical Problems

    DTIC Science & Technology

    2013-08-01

    implemented as a succession of Hadoop MapReduce jobs and sequential programs written in Java. The sampling and splitting stages are implemented as...one MapReduce job, the partitioning and clustering phases make up another job. The merging stage is implemented as a stand-alone Java program. The...Merging. The merging stage is implemented as a sequential Java program that reads the files with the shell information, which were generated by

  19. Increasing condom use and declining STI prevalence in high-risk MSM and TGs: evaluation of a large-scale prevention program in Tamil Nadu, India.

    PubMed

    Subramanian, Thilakavathi; Ramakrishnan, Lakshmi; Aridoss, Santhakumar; Goswami, Prabuddhagopal; Kanguswami, Boopathi; Shajan, Mathew; Adhikary, Rajat; Purushothaman, Girish Kumar Chethrapilly; Ramamoorthy, Senthil Kumar; Chinnaswamy, Eswaramurthy; Veeramani, Ilaya Bharathy; Paranjape, Ramesh Shivram

    2013-09-17

    This paper presents an evaluation of Avahan, a large-scale HIV prevention program that was implemented using peer-mediated strategies, condom distribution and sexually transmitted infection (STI) clinical services among high-risk men who have sex with men (HR-MSM) and male-to-female transgender persons (TGs) in the high-prevalence state of Tamil Nadu in southern India. Two rounds of large-scale cross-sectional bio-behavioural surveys among HR-MSM and TGs and routine program monitoring data were used to assess changes in program coverage, condom use and prevalence of STIs (including HIV) and their association with program exposure. The Avahan program for HR-MSM and TGs in Tamil Nadu was significantly scaled up, and contacts by peer educators reached 77 percent of the estimated denominator by the end of the program's fourth year. Exposure to the program increased between the two rounds of surveys for both HR-MSM (from 66 percent to 90 percent; AOR = 4.6; p < 0.001) and TGs (from 74.5 percent to 83 percent; AOR = 1.82; p < 0.06). There was an increase in consistent condom use by HR-MSM with their regular male partners (from 33 percent to 46 percent; AOR = 1.9; p < 0.01). Last-time condom use with paying male partners (up from 81 percent to 94 percent; AOR = 3.6; p < 0.001) also showed an increase. Among TGs, the increase in condom use with casual male partners (18 percent to 52 percent; AOR = 1.8; p < 0.27) was not significant, and last-time condom use declined significantly with paying male partners (93 percent to 80 percent; AOR = 0.32; p < 0.015). Syphilis declined significantly among both HR-MSM (14.3 percent to 6.8 percent; AOR = 0.37; p < 0.001) and TGs (16.6 percent to 4.2 percent; AOR = 0.34; p < 0.012), while the change in HIV prevalence was not significant for HR-MSM (9.7 percent to 10.9 percent) or TGs (12 percent to 9.8 percent).
For both groups, change in condom use with commercial and non-commercial partners was found to be strongly linked with exposure to the Avahan program. The Avahan program for HR-MSM and TGs in Tamil Nadu achieved high coverage, resulting in improved condom use by HR-MSM with their regular and commercial male partners. Declining STI prevalence and stable HIV prevalence reflect the positive effects of the prevention strategy. Outcomes from the program logic model indicate the effectiveness of the program for HR-MSM and TGs in Tamil Nadu.

  20. Correlation of finite-element structural dynamic analysis with measured free vibration characteristics for a full-scale helicopter fuselage

    NASA Technical Reports Server (NTRS)

    Kenigsberg, I. J.; Dean, M. W.; Malatino, R.

    1974-01-01

    The correlation achieved with each program provides the material for a discussion of modeling techniques developed for general application to finite-element dynamic analyses of helicopter airframes. Included are the selection of static and dynamic degrees of freedom, cockpit structural modeling, and the extent of flexible-frame modeling in the transmission support region and in the vicinity of large cut-outs. The sensitivity of the predicted results to these modeling assumptions is discussed. Both the Sikorsky Finite-Element Airframe Vibration analysis Program (FRAN/Vibration Analysis) and the NASA Structural Analysis Program (NASTRAN) have been correlated with data taken in full-scale vibration tests of a modified CH-53A helicopter.

  1. DupTree: a program for large-scale phylogenetic analyses using gene tree parsimony.

    PubMed

    Wehe, André; Bansal, Mukul S; Burleigh, J Gordon; Eulenstein, Oliver

    2008-07-01

    DupTree is a new software program for inferring rooted species trees from collections of gene trees using the gene tree parsimony approach. The program implements a novel algorithm that significantly improves upon the run time of standard search heuristics for gene tree parsimony, and enables the first truly genome-scale phylogenetic analyses. In addition, DupTree allows users to examine alternate rootings and to weight the reconciliation costs for gene trees. DupTree is an open source project written in C++. DupTree for Mac OS X, Windows, and Linux along with a sample dataset and an on-line manual are available at http://genome.cs.iastate.edu/CBL/DupTree

  2. The application of two-step linear temperature program to thermal analysis for monitoring the lipid induction of Nostoc sp. KNUA003 in large scale cultivation.

    PubMed

    Kang, Bongmun; Yoon, Ho-Sung

    2015-02-01

    Recently, microalgae have been considered a renewable energy source for fuel production because their cultivation is nonseasonal and may take place on nonarable land. Despite these advantages, microalgal oil production is significantly affected by environmental factors. Furthermore, large variability remains an important problem in the measurement of algal productivity and compositional analysis, especially of total lipid content. Thus, there is considerable interest in the accurate determination of total lipid content during the biotechnological process. For these reasons, various high-throughput technologies have been suggested for accurate measurement of the total lipids contained in microorganisms, especially oleaginous microalgae. In addition, more advanced technologies have been employed to quantify the total lipids of microalgae without pretreatment. However, these methods are difficult to apply to wet-form microalgae obtained from large-scale production. In the present study, thermal analysis with a two-step linear temperature program was applied to measure the heat evolved in the temperature range from 310 to 351 °C for Nostoc sp. KNUA003 obtained from large-scale cultivation. We then examined the relationship between the heat evolved over 310-351 °C (HE) and the total lipid content of wet Nostoc cells cultivated in a raceway. A linear relationship was determined between the HE value and the total lipid content of Nostoc sp. KNUA003; in particular, the linear relationship between the HE value and the total lipid content of the tested microorganism was 98%. Based on this relationship, the total lipid content converted from the heat evolved of wet Nostoc sp. KNUA003 could be used for monitoring lipid induction in large-scale cultivation. Copyright © 2014 Elsevier Inc. All rights reserved.
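    The calibration idea above, fitting a line between the heat evolved (HE) and measured total lipid content and then using it to predict lipids for new wet-biomass samples, can be sketched as follows. All numbers are synthetic placeholders, not data from the study.

    ```python
    import numpy as np

    # Hypothetical calibration points: HE (arbitrary units) vs. total lipid (% dry weight).
    he = np.array([12.0, 15.5, 18.2, 21.0, 24.8, 28.1])
    lipid = np.array([8.1, 10.4, 12.0, 14.2, 16.9, 18.8])

    # Ordinary least-squares line and its coefficient of determination.
    slope, intercept = np.polyfit(he, lipid, 1)
    r = np.corrcoef(he, lipid)[0, 1]

    def predict(h):
        """Convert a measured HE value to an estimated lipid content."""
        return slope * h + intercept

    print(round(r ** 2, 3))      # goodness of fit
    print(round(predict(20.0), 1))  # estimated lipid % for HE = 20
    ```

    A strong coefficient of determination, like the 98% relationship reported in the abstract, is what justifies using HE alone as a proxy for lipid content during large-scale cultivation.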

  3. Two Large-Scale Professional Development Programs for Mathematics Teachers and Their Impact on Student Achievement

    ERIC Educational Resources Information Center

    Lindvall, Jannika

    2017-01-01

    This article reports on two professional development programs for mathematics teachers and their effects on student achievement. The projects' design and their implementation within a larger municipality in Sweden, working together with over 90 teachers and 5000 students in elementary school, are described by using a set of core critical features…

  4. Item Response Theory at Subject- and Group-Level. Research Report 90-1.

    ERIC Educational Resources Information Center

    Tobi, Hilde

    This paper reviews the literature about item response models for the subject level and aggregated level (group level). Group-level item response models (IRMs) are used in the United States in large-scale assessment programs such as the National Assessment of Educational Progress and the California Assessment Program. In the Netherlands, these…

  5. Implementing a Peer Mentoring Model in the Clemson Eportfolio Program

    ERIC Educational Resources Information Center

    Ring, Gail L.

    2015-01-01

    Since the implementation of the ePortfolio Program in 2006, Clemson University has incorporated peer review for the formative feedback process. One of the challenges with this large-scale implementation has been ensuring that all work is reviewed and constructive feedback is provided in a timely manner. In this article, I discuss the strategies…

  6. Community Partnership to Address Snack Quality and Cost in After-School Programs

    ERIC Educational Resources Information Center

    Beets, Michael W.; Tilley, Falon; Turner-McGrievy, Gabrielle; Weaver, Robert G.; Jones, Sonya

    2014-01-01

    Background: Policies call on after-school programs (ASPs) to serve more nutritious snacks. A major barrier for improving snack quality is cost. This study describes the impact on snack quality and expenditures from a community partnership between ASPs and local grocery stores. Methods: Four large-scale ASPs (serving ~500 children, aged 6-12?years,…

  7. The Educational Predicament Confronting Taiwan's Gifted Programs: An Evaluation of Current Practices and Future Challenges

    ERIC Educational Resources Information Center

    Kao, Chen-yao

    2012-01-01

    This study examines the current problems affecting Taiwan's gifted education through a large-scale gifted program evaluation. Fifty-one gifted classes at 15 elementary schools and 62 gifted classes at 18 junior high schools were evaluated. The primary activities included in this biennial evaluation were document review, observation of…

  8. ICT, Literacy and Teacher Change: The Effectiveness of ICT Options in Kenya

    ERIC Educational Resources Information Center

    Piper, Benjamin

    2014-01-01

    There is a dearth of literature that use research design for causal inference that estimate the effect of information and communications technology (ICT) programs on literacy outcomes in early primary, particularly in Sub-Saharan Africa. There are several programs that have used ICT at a large scale, including Los Angeles, Peru, Nicaragua, Rwanda…

  9. Learning and Teaching Technology in English Teacher Education: Findings from a National Study

    ERIC Educational Resources Information Center

    Pasternak, Donna L.; Hallman, Heidi L.; Caughlan, Samantha; Renzi, Laura; Rush, Leslie S.; Meineke, Hannah

    2016-01-01

    This paper reports on one aspect of a large-scale nationwide study that surveyed English teacher educators about English teacher preparation programs throughout the United States. One aspect of the study focused on how technology is integrated within the context of English teacher education programs, asking the question, "As an area of…

  10. Evaluating Educational Programs. ETS R&D Scientific and Policy Contributions Series. ETS SPC-11-01. ETS Research Report No. RR-11-15

    ERIC Educational Resources Information Center

    Ball, Samuel

    2011-01-01

    Since its founding in 1947, ETS has conducted a significant and wide-ranging research program that has focused on, among other things, psychometric and statistical methodology; educational evaluation; performance assessment and scoring; large-scale assessment and evaluation; cognitive, developmental, personality, and social psychology; and…

  11. Thinking Big: A Framework for States on Scaling Up Community College Innovation

    ERIC Educational Resources Information Center

    Asera, Rose; McDonnell, Rachel Pleasants; Soricone, Lisa; Anderson, Nate; Endel, Barbara

    2013-01-01

    It is a truism of American social policy that our nation has great success generating innovative programs that improve outcomes for participants--but that we are far less effective at moving from small, "boutique" programs into broadly applied solutions that improve the prospects of large numbers of individuals. This is certainly true in…

  12. Aftershocks of Chile's Earthquake for an Ongoing, Large-Scale Experimental Evaluation

    ERIC Educational Resources Information Center

    Moreno, Lorenzo; Trevino, Ernesto; Yoshikawa, Hirokazu; Mendive, Susana; Reyes, Joaquin; Godoy, Felipe; Del Rio, Francisca; Snow, Catherine; Leyva, Diana; Barata, Clara; Arbour, MaryCatherine; Rolla, Andrea

    2011-01-01

    Evaluation designs for social programs are developed assuming minimal or no disruption from external shocks, such as natural disasters. This is because extremely rare shocks may not make it worthwhile to account for them in the design. Among extreme shocks is the 2010 Chile earthquake. Un Buen Comienzo (UBC), an ongoing early childhood program in…

  13. A De-Novo Genome Analysis Pipeline (DeNoGAP) for large-scale comparative prokaryotic genomics studies.

    PubMed

    Thakur, Shalabh; Guttman, David S

    2016-06-30

    Comparative analysis of whole genome sequence data from closely related prokaryotic species or strains is becoming an increasingly important and accessible approach for addressing both fundamental and applied biological questions. While there are a number of excellent tools developed for performing this task, most scale poorly when faced with hundreds of genome sequences, and many require extensive manual curation. We have developed a de-novo genome analysis pipeline (DeNoGAP) for the automated, iterative and high-throughput analysis of data from comparative genomics projects involving hundreds of whole genome sequences. The pipeline is designed to perform reference-assisted and de novo gene prediction, homolog protein family assignment, ortholog prediction, functional annotation, and pan-genome analysis using a range of proven tools and databases. While most existing methods scale quadratically with the number of genomes, since they rely on pairwise comparisons among predicted protein sequences, DeNoGAP scales linearly, since homology assignment is based on iteratively refined hidden Markov models. This iterative clustering strategy enables DeNoGAP to handle a very large number of genomes using minimal computational resources. Moreover, the modular structure of the pipeline permits easy updates as new analysis programs become available. DeNoGAP integrates bioinformatics tools and databases for comparative analysis of a large number of genomes. The pipeline offers tools and algorithms for annotation and analysis of completed and draft genome sequences. The pipeline was developed using Perl, BioPerl and SQLite on Ubuntu Linux version 12.04 LTS. Currently, the software package includes a script for automated installation of the necessary external programs on Ubuntu Linux; however, the pipeline should also be compatible with other Linux and Unix systems once the necessary external programs are installed.
DeNoGAP is freely available at https://sourceforge.net/projects/denogap/ .
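    The quadratic-versus-linear scaling argument above can be made concrete with schematic comparison counts: all-vs-all pairwise homology search grows with the square of the number of genomes, while assigning each genome against a fixed set of iteratively refined family models grows linearly. The counts below are schematic, not DeNoGAP's actual workload.

    ```python
    def pairwise_comparisons(n):
        """All-vs-all strategy: every genome compared against every other."""
        return n * (n - 1) // 2

    def model_assignments(n, families=1):
        """Model-based strategy: each genome scanned once against family models."""
        return n * families

    for n in (10, 100, 1000):
        print(n, pairwise_comparisons(n), model_assignments(n))
    ```

    At 1,000 genomes the pairwise strategy requires 499,500 comparisons versus 1,000 model scans, which is why iterative clustering keeps the computational cost manageable.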

  14. Housing first on a large scale: Fidelity strengths and challenges in the VA's HUD-VASH program.

    PubMed

    Kertesz, Stefan G; Austin, Erika L; Holmes, Sally K; DeRussy, Aerin J; Van Deusen Lukas, Carol; Pollio, David E

    2017-05-01

    Housing First (HF) combines permanent supportive housing and supportive services for homeless individuals and removes traditional treatment-related preconditions for housing entry. There has been little research describing strengths and shortfalls of HF implementation outside of research demonstration projects. The U.S. Department of Veterans Affairs (VA) has transitioned to an HF approach in a supportive housing program serving over 85,000 persons. This offers a naturalistic window to study fidelity when HF is adopted on a large scale. We operationalized HF into 20 criteria grouped into 5 domains. We assessed 8 VA medical centers twice (1 year apart), scoring each criterion using a scale ranging from 1 (low fidelity) to 4 (high fidelity). There were 2 HF domains (no preconditions and rapidly offering permanent housing) for which high fidelity was readily attained. There was uneven progress in prioritizing the most vulnerable clients for housing support. Two HF domains (sufficient supportive services and a modern recovery philosophy) had considerably lower fidelity. Interviews suggested that operational issues such as shortfalls in staffing and training likely hindered performance in these 2 domains. In this ambitious national HF program, the largest to date, we found substantial fidelity in focusing on permanent housing and removal of preconditions to housing entry. Areas of concern included the adequacy of supportive services and adequacy in deployment of a modern recovery philosophy. Under real-world conditions, large-scale implementation of HF is likely to require significant additional investment in client service supports to assure that results are concordant with those found in research studies. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  15. Space Weather Research at the National Science Foundation

    NASA Astrophysics Data System (ADS)

    Moretto, T.

    2015-12-01

    There is growing recognition that the space environment can have substantial, deleterious impacts on society. Consequently, research enabling specification and forecasting of hazardous space effects has become of great importance and urgency. This research requires studying the entire Sun-Earth system to understand the coupling of regions all the way from the source of disturbances in the solar atmosphere to the Earth's upper atmosphere. The traditional, region-based structure of research programs in solar and space physics is ill suited to fully support the change in research directions that the problem of space weather dictates. On the observational side, dense, distributed networks of observations are required to capture the full large-scale dynamics of the space environment. However, the cost of implementing these is typically prohibitive, especially for measurements in space. Thus, by necessity, the implementation of such new capabilities needs to build on creative and unconventional solutions. A particularly powerful idea is the utilization of new developments in data engineering and informatics research (big data). These new technologies make it possible to build systems that can collect and process huge amounts of noisy and inaccurate data and extract useful information from them. The shift in emphasis towards system-level science for geospace also necessitates the development of large-scale and multi-scale models. The development of large-scale models capable of capturing the global dynamics of the Earth's space environment requires investment in research team efforts that go beyond what can typically be funded under traditional grants programs. This calls for effective interdisciplinary collaboration and efficient leveraging of resources both nationally and internationally. This presentation will provide an overview of current and planned initiatives, programs, and activities at the National Science Foundation pertaining to space weather research.

  16. Liquid Oxygen Propellant Densification Production and Performance Test Results With a Large-Scale Flight-Weight Propellant Tank for the X-33 RLV

    NASA Technical Reports Server (NTRS)

    Tomsik, Thomas M.; Meyer, Michael L.

    2010-01-01

    This paper describes in detail a test program initiated at the Glenn Research Center (GRC) involving the cryogenic densification of liquid oxygen (LO2). A large-scale LO2 propellant densification system, rated at 200 gpm and sized for the X-33 LO2 propellant tank, was designed, fabricated, and tested at the GRC. Objectives of the test program included validation of LO2 production unit hardware and characterization of densifier performance at design and transient conditions. First, performance data are presented for an initial series of LO2 densifier screening and check-out tests using densified liquid nitrogen. The second series of tests presents performance data collected during LO2 densifier test operations with liquid oxygen as the densified product fluid. An overview of LO2 X-33 tanking operations and load tests with the 20,000 gallon Structural Test Article (STA) is given. Tank loading and the thermal stratification that occurs inside a flight-weight launch vehicle propellant tank were investigated. These operations involved a closed-loop recirculation process of LO2 flow through the densifier and then back into the STA. Finally, in excess of 200,000 gallons of densified LO2 at 120 °R was produced with the propellant densification unit during the demonstration program, an achievement never before accomplished in large-scale cryogenic testing.

  17. The future of emissions trading in light of the acid rain experience

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McLean, B.J.; Rico, R.

    1995-12-31

    The idea of emissions trading was developed more than two decades ago by environmental economists eager to provide new ideas for improving the efficiency of environmental protection. However, early emissions trading efforts were built on the historical "command and control" infrastructure which has dominated U.S. environmental protection until today. The "command and control" model initially had advantages of a very pragmatic character: it assured large pollution reductions at a time when large, cheap reductions were available and necessary, and it did not require a sophisticated government infrastructure. Within the last five years, large-scale emissions trading programs have been successfully designed and started that are fundamentally different from the earlier efforts, creating a new paradigm for environmental control just when our understanding of environmental problems is changing as well. The purpose of this paper is to focus on the largest national-scale program--the Acid Rain Program--and from that experience, forecast where emissions trading programs may be headed based on our understanding of the factors currently influencing environmental management. The first section of this paper briefly reviews the history of emissions trading programs, followed by a summary of the features of the Acid Rain Program, highlighting those features that distinguish it from previous efforts. The last section addresses the opportunities for emissions trading (and its probable future directions).

  18. Large-Scale Hybrid Motor Testing. Chapter 10

    NASA Technical Reports Server (NTRS)

    Story, George

    2006-01-01

    Hybrid rocket motors can be successfully demonstrated at a small scale virtually anywhere. There have been many suitcase sized portable test stands assembled for demonstration of hybrids. They show the safety of hybrid rockets to the audiences. These small show motors and small laboratory scale motors can give comparative burn rate data for development of different fuel/oxidizer combinations, however questions that are always asked when hybrids are mentioned for large scale applications are - how do they scale and has it been shown in a large motor? To answer those questions, large scale motor testing is required to verify the hybrid motor at its true size. The necessity to conduct large-scale hybrid rocket motor tests to validate the burn rate from the small motors to application size has been documented in several place^'^^.^. Comparison of small scale hybrid data to that of larger scale data indicates that the fuel burn rate goes down with increasing port size, even with the same oxidizer flux. This trend holds for conventional hybrid motors with forward oxidizer injection and HTPB based fuels. While the reason this is occurring would make a great paper or study or thesis, it is not thoroughly understood at this time. Potential causes include the fact that since hybrid combustion is boundary layer driven, the larger port sizes reduce the interaction (radiation, mixing and heat transfer) from the core region of the port. This chapter focuses on some of the large, prototype sized testing of hybrid motors. The largest motors tested have been AMROC s 250K-lbf thrust motor at Edwards Air Force Base and the Hybrid Propulsion Demonstration Program s 250K-lbf thrust motor at Stennis Space Center. Numerous smaller tests were performed to support the burn rate, stability and scaling concepts that went into the development of those large motors.

  19. Developing Renewable Energy Projects Larger Than 10 MWs at Federal Facilities (Book)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    2013-03-01

    To accomplish Federal goals for renewable energy, sustainability, and energy security, large-scale renewable energy projects must be developed and constructed on Federal sites at a significant scale with significant private investment. The U.S. Department of Energy's Federal Energy Management Program (FEMP) helps Federal agencies meet these goals and assists agency personnel in navigating the complexities of developing such projects and attracting the necessary private capital to complete them. This guide is intended to provide a general resource that will begin to develop the Federal employee's awareness and understanding of the project developer's operating environment and the private sector's awareness and understanding of the Federal environment. Because the vast majority of the investment required to meet the goals for large-scale renewable energy projects will come from the private sector, this guide has been organized to match Federal processes with typical phases of commercial project development. The main purpose of this guide is to provide a project development framework to allow the Federal Government, private developers, and investors to work in a coordinated fashion on large-scale renewable energy projects. The framework includes key elements that describe a successful, financially attractive large-scale renewable energy project.

  20. Evaluation of programs to improve complementary feeding in infants and young children.

    PubMed

    Frongillo, Edward A

    2017-10-01

    Evaluation of complementary feeding programs is needed to enhance knowledge on what works, to document responsible use of resources, and for advocacy. Evaluation is done during program conceptualization and design, implementation, and determination of effectiveness. This paper explains the role of evaluation in the advancement of complementary feeding programs, presenting concepts and methods and illustrating them through examples. Planning and investments for evaluations should occur from the beginning of the project life cycle. Essential to evaluation is articulation of a program theory on how change would occur and what program actions are required for change. Analysis of program impact pathways makes explicit the dynamic connections in the program theory and accounts for contextual factors that could influence program effectiveness. Evaluating implementation functioning is done through addressing questions about needs, coverage, provision, and utilization using information obtained from process evaluation, operations research, and monitoring. Evaluating effectiveness is done through assessing impact, efficiency, coverage, process, and causality. Plausibility designs ask whether the program seemed to have an effect above and beyond external influences, often using a nonrandomized control group and baseline and end line measures. Probability designs ask whether there was an effect using a randomized control group. Evaluations may not be able to use randomization, particularly for programs implemented at a large scale. Plausibility designs, innovative designs, or innovative combinations of designs sometimes are best able to provide useful information. Further work is needed to develop practical designs for evaluation of large-scale country programs on complementary feeding. © 2017 John Wiley & Sons Ltd.

  1. FALCON reactor-pumped laser description and program overview

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    1989-12-01

    The FALCON (Fission Activated Laser CONcept) reactor-pumped laser program at Sandia National Laboratories is examining the feasibility of high-power systems pumped directly by the energy from a nuclear reactor. In this concept we use the highly energetic fission fragments from neutron induced fission to excite a large volume laser medium. This technology has the potential to scale to extremely large optical power outputs in a primarily self-powered device. A laser system of this type could also be relatively compact and capable of long run times without refueling.

  2. Scaling and Sustaining Effective Early Childhood Programs Through School-Family-University Collaboration.

    PubMed

    Reynolds, Arthur J; Hayakawa, Momoko; Ou, Suh-Ruu; Mondi, Christina F; Englund, Michelle M; Candee, Allyson J; Smerillo, Nicole E

    2017-09-01

    We describe the development, implementation, and evaluation of a comprehensive preschool to third grade prevention program for the goals of sustaining services at a large scale. The Midwest Child-Parent Center (CPC) Expansion is a multilevel collaborative school reform model designed to improve school achievement and parental involvement from ages 3 to 9. By increasing the dosage, coordination, and comprehensiveness of services, the program is expected to enhance the transition to school and promote more enduring effects on well-being in multiple domains. We review and evaluate evidence from two longitudinal studies (Midwest CPC, 2012 to present; Chicago Longitudinal Study, 1983 to present) and four implementation examples of how the guiding principles of shared ownership, committed resources, and progress monitoring for improvement can promote effectiveness. The implementation system of partners and further expansion using "Pay for Success" financing shows the feasibility of scaling the program while continuing to improve effectiveness. © 2017 The Authors. Child Development published by Wiley Periodicals, Inc. on behalf of Society for Research in Child Development.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tang, Shuaiqi; Zhang, Minghua; Xie, Shaocheng

    Large-scale forcing data, such as vertical velocity and advective tendencies, are required to drive single-column models (SCMs), cloud-resolving models, and large-eddy simulations. Previous studies suggest that some errors of these model simulations could be attributed to the lack of spatial variability in the specified domain-mean large-scale forcing. This study investigates the spatial variability of the forcing and explores its impact on SCM-simulated precipitation and clouds. A gridded large-scale forcing data set from the March 2000 Cloud Intensive Operational Period at the Atmospheric Radiation Measurement program's Southern Great Plains site is used for analysis and to drive the single-column version of the Community Atmospheric Model Version 5 (SCAM5). When the gridded forcing data show large spatial variability, such as during a frontal passage, SCAM5 with the domain-mean forcing is not able to capture the convective systems that are partly located in the domain or that only occupy part of the domain. This problem has been largely reduced by using the gridded forcing data, which allows running SCAM5 in each subcolumn and then averaging the results within the domain. This is because the subcolumns have a better chance of capturing the timing of the frontal propagation and the small-scale systems. As a result, other potential uses of the gridded forcing data, such as understanding and testing scale-aware parameterizations, are also discussed.
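
    The core point above — that averaging many subcolumns, each driven by its local forcing, is not the same as driving one column with the domain-mean forcing — follows from the nonlinearity of precipitation processes. A toy illustration with a threshold-type rain response (all numbers invented):

```python
# Toy illustration of why domain-mean forcing can miss precipitation:
# with a nonlinear (thresholded) rain response, the mean of the
# per-subcolumn responses differs from the response to the mean forcing.
# The forcing values and threshold are invented for illustration.

def rain(forcing, threshold=10.0):
    """Precipitation responds only above a critical forcing value."""
    return max(0.0, forcing - threshold)

# Frontal passage: strong forcing in only part of the domain.
subcolumn_forcing = [2.0, 3.0, 25.0, 4.0]

mean_forcing = sum(subcolumn_forcing) / len(subcolumn_forcing)
rain_from_mean = rain(mean_forcing)  # domain-mean forcing: no rain at all
rain_from_subcolumns = sum(rain(f) for f in subcolumn_forcing) / len(subcolumn_forcing)

print(rain_from_mean, rain_from_subcolumns)
```

The domain-mean forcing (8.5) sits below the threshold, so the single column produces no rain, while averaging the subcolumn responses recovers the precipitation generated where the front actually is.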

  4. The global climate of December 1992-February 1993. Part 2: Large-scale variability across the tropical western Pacific during TOGA COARE

    NASA Technical Reports Server (NTRS)

    Gutzler, D. S.; Kiladis, G. N.; Meehl, G. A.; Weickmann, K. M.; Wheeler, M.

    1994-01-01

    Recently, scientists from more than a dozen countries carried out the field phase of a project called the Coupled-Atmosphere Response Experiment (COARE), devoted to describing the ocean-atmosphere system of the western Pacific near-equatorial warm pool. The project was conceived, organized, and funded under the auspices of the International Tropical Ocean Global Atmosphere (TOGA) Program. Although COARE consisted of several field phases, including a year-long atmospheric enhanced monitoring period (1 July 1992 -- 30 June 1993), the heart of COARE was its four-month Intensive Observation Period (IOP) extending from 1 Nov. 1992 through 28 Feb. 1993. An overview of large-scale variability during COARE is presented. The weather and climate observed in the IOP is placed into context with regard to large-scale, low-frequency fluctuations of the ocean-atmosphere system. Aspects of tropical variability beginning in Aug. 1992 and extending through Mar. 1993, with some sounding data for Apr. 1993 are considered. Variability over the large-scale sounding array (LSA) and the intensive flux array (IFA) is emphasized.

  5. A comprehensive study on urban true orthorectification

    USGS Publications Warehouse

    Zhou, G.; Chen, W.; Kelmelis, J.A.; Zhang, Dongxiao

    2005-01-01

    To provide some advanced technical bases (algorithms and procedures) and experience needed for national large-scale digital orthophoto generation and revision of the Standards for National Large-Scale City Digital Orthophoto in the National Digital Orthophoto Program (NDOP), this paper presents a comprehensive study on theories, algorithms, and methods of large-scale urban orthoimage generation. The procedures of orthorectification for digital terrain model (DTM)-based and digital building model (DBM)-based orthoimage generation, and their merging for true orthoimage generation, are discussed in detail. A method of compensating for building occlusions using photogrammetric geometry is developed. The data structure needed to model urban buildings for accurately generating urban orthoimages is presented. Shadow detection and removal, the optimization of seamlines for automatic mosaicking, and the radiometric balancing of neighboring images are discussed. Street visibility analysis, including the relationship between flight height, building height, street width, and the relative location of the street to the imaging center, is presented for complete true orthoimage generation. The experimental results demonstrate that our method can effectively and correctly orthorectify the displacements caused by terrain and buildings in large-scale urban aerial images. © 2005 IEEE.
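
    The geometry underlying building displacement and occlusion in such imagery is the standard photogrammetric relief-displacement relation d = r·h/H, where r is the radial image distance from the principal point, h the object height, and H the flying height above ground. A minimal numeric sketch (the paper's own compensation method is more elaborate; all values here are invented):

```python
# Standard relief-displacement relation from photogrammetry: an object
# of height h at radial image distance r, photographed from flying
# height H above ground, is displaced radially by d = r * h / H.
# The numeric values below are invented for illustration.

def relief_displacement(r, h, H):
    """Radial displacement on the image, in the same units as r."""
    return r * h / H

# A 30 m building imaged 80 mm from the principal point at 1500 m
# flying height is displaced 1.6 mm on the image:
d = relief_displacement(r=80.0, h=30.0, H=1500.0)
print(f"{d:.2f} mm")
```

The same relation explains the street-visibility analysis above: taller buildings, lower flight heights, or streets farther from the imaging center all enlarge the displaced (and hence occluded) area.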

  6. Determination of the number of ψ(3686) events at BESIII

    NASA Astrophysics Data System (ADS)

    Ablikim, M.; Achasov, M. N.; Ai, X. C.; Ambrose, D. J.; Amoroso, A.; An, F. F.; An, Q.; Bai, J. Z.; Baldini Ferroli, R.; Ban, Y.; Bennett, J. V.; Bertani, M.; Bian, J. M.; Boger, E.; Bondarenko, O.; Boyko, I.; Briere, R. A.; Cai, H.; Cai, X.; Cakir, O.; Calcaterra, A.; Cao, G. F.; Cetin, S. A.; Chang, J. F.; Chelkov, G.; Chen, G.; Chen, H. S.; Chen, J. C.; Chen, M. L.; Chen, S. J.; Chen, X.; Chen, X. R.; Chen, Y. B.; Chu, X. K.; Chu, Y. P.; Cronin-Hennessy, D.; Dai, H. L.; Dai, J. P.; Dedovich, D.; Deng, Z. Y.; Denig, A.; Denysenko, I.; Destefanis, M.; Ding, Y.; Dong, C.; Dong, J.; Dong, L. Y.; Dong, M. Y.; Du, S. X.; Fan, J. Z.; Fang, J.; Fang, S. S.; Fang, Y.; Fava, L.; Feldbauer, F.; Feng, C. Q.; Fu, C. D.; Gao, Q.; Gao, Y.; Goetzen, K.; Gong, W. X.; Gradl, W.; Greco, M.; Gu, M. H.; Gu, Y. T.; Guan, Y. H.; Guo, A. Q.; Guo, Y. P.; Han, Y. L.; Harris, F. A.; He, K. L.; He, M.; Held, T.; Heng, Y. K.; Hou, Z. L.; Hu, H. M.; Hu, T.; Huang, G. S.; Huang, J. S.; Huang, L.; Huang, X. T.; Hussain, T.; Ji, Q.; Ji, Q. P.; Ji, X. B.; Ji, X. L.; Jiang, L. L.; Jiang, X. S.; Jiao, J. B.; Jiao, Z.; Jin, D. P.; Jin, S.; Johansson, T.; Kalantar-Nayestanaki, N.; Kang, X. L.; Kang, X. S.; Kavatsyuk, M.; Kloss, B.; Kopf, B.; Kornicer, M.; Kupsc, A.; Kühn, W.; Lai, W.; Lange, J. S.; Lara, M.; Larin, P.; Li, C. H.; Li, Cheng; Li, D. M.; Li, F.; Li, G.; Li, H. B.; Li, J. C.; Li, Kang; Li, Ke; Li, Lei; Li, P. R.; Li, Q. J.; Li, W. D.; Li, W. G.; Li, X. L.; Li, X. N.; Li, X. Q.; Li, X. R.; Li, Z. B.; Liang, H.; Liang, Y. F.; Liang, Y. T.; Liao, G. R.; Lin, D. X.; Liu, B. J.; Liu, C. X.; Liu, F. H.; Liu, Fang.; Liu, Feng.; Liu, H. B.; Liu, H. M.; Liu, Huihui.; Liu, J.; Liu, J. P.; Liu, K.; Liu, K. Y.; Liu, Q.; Liu, S. B.; Liu, X.; Liu, Y. B.; Liu, Z. A.; Liu, Zhiqiang.; Liu, Zhiqing.; Loehner, H.; Lou, X. C.; Lu, H. J.; Lu, H. L.; Lu, J. G.; Lu, Y.; Lu, Y. P.; Luo, C. L.; Luo, M. X.; Luo, T.; Luo, X. L.; Lv, M.; Lyu, X. R.; Ma, F. C.; Ma, H. L.; Ma, Q. M.; Ma, S.; Ma, T.; Ma, X. 
Y.; Maas, F. E.; Maggiora, M.; Mao, Y. J.; Mao, Z. P.; Messchendorp, J. G.; Min, J.; Min, T. J.; Mitchell, R. E.; Mo, X. H.; Mo, Y. J.; Morales Morales, C.; Moriya, K.; Muchnoi, N. Yu.; Muramatsu, H.; Nefedov, Y.; Nikolaev, I. B.; Ning, Z.; Nisar, S.; Niu, S. L.; Niu, X. Y.; Olsen, S. L.; Ouyang, Q.; Pacetti, S.; Pelizaeus, M.; Peng, H. P.; Peters, K.; Ping, J. L.; Ping, R. G.; Poling, R.; Qi, M.; Qian, S.; Qiao, C. F.; Qin, X. S.; Qin, Z. H.; Qiu, J. F.; Rashid, K. H.; Redmer, C. F.; Ripka, M.; Rong, G.; Sarantsev, A.; Schoenning, K.; Shan, W.; Shao, M.; Shen, C. P.; Shen, X. Y.; Sheng, H. Y.; Shepherd, M. R.; Song, W. M.; Song, X. Y.; Sosio, S.; Spataro, S.; Sun, G. X.; Sun, J. F.; Sun, S. S.; Sun, Y. J.; Sun, Y. Z.; Sun, Z. J.; Tang, C. J.; Tang, X.; Tapan, I.; Thorndike, E. H.; Toth, D.; Uman, I.; Varner, G. S.; Wang, B.; Wang, D.; Wang, D. Y.; Wang, K.; Wang, L. L.; Wang, L. S.; Wang, M.; Wang, P.; Wang, P. L.; Wang, Q. J.; Wang, W.; Wang, X. F.; Wang(Yadi, Y. D.; Wang, Y. F.; Wang, Y. Q.; Wang, Z.; Wang, Z. G.; Wang, Z. Y.; Wei, D. H.; Weidenkaff, P.; Wen, S. P.; Wiedner, U.; Wolke, M.; Wu, L. H.; Wu, Z.; Xia, L. G.; Xia, Y.; Xiao, D.; Xiao, Z. J.; Xie, Y. G.; Xiu, Q. L.; Xu, G. F.; Xu, L.; Xu, Q. J.; Xu, Q. N.; Xu, X. P.; Yan, W. B.; Yan, Y. H.; Yang, H. X.; Yang, Y.; Yang, Y. X.; Ye, H.; Ye, M.; Ye, M. H.; Yu, B. X.; Yu, C. X.; Yu, J. S.; Yuan, C. Z.; Yuan, Y.; Zafar, A. A.; Zeng, Y.; Zhang, B. X.; Zhang, B. Y.; Zhang, C. C.; Zhang, D. H.; Zhang, H. H.; Zhang, H. Y.; Zhang, J. J.; Zhang, J. Q.; Zhang, J. W.; Zhang, J. Y.; Zhang, J. Z.; Zhang, L.; Zhang, R.; Zhang, S. H.; Zhang, X. J.; Zhang, X. Y.; Zhang, Y. H.; Zhang, Yao.; Zhang, Z. H.; Zhang, Z. P.; Zhang, Z. Y.; Zhao, G.; Zhao, J. W.; Zhao, J. Z.; Zhao, Lei; Zhao, Ling.; Zhao, M. G.; Zhao, Q.; Zhao, Q. W.; Zhao, S. J.; Zhao, T. C.; Zhao, Y. B.; Zhao, Z. G.; Zhemchugov, A.; Zheng, B.; Zheng, J. P.; Zheng, Y. H.; Zhong, B.; Zhou, L.; Zhou, X.; Zhou, X. K.; Zhou, X. R.; Zhou, X. Y.; Zhu, K.; Zhu, K. 
J.; Zhu, X. L.; Zhu, Y. C.; Zhu, Y. S.; Zhu, Z. A.; Zhuang, J.; Zou, B. S.; Zou, J. H.; BESIII Collaboration

    2018-02-01

    The numbers of ψ(3686) events accumulated by the BESIII detector for the data taken during 2009 and 2012 are determined to be (107.0 ± 0.8) × 10^6 and (341.1 ± 2.1) × 10^6, respectively, by counting inclusive hadronic events, where the uncertainties are systematic and the statistical uncertainties are negligible. The number of events for the sample taken in 2009 is consistent with that of the previous measurement. The total number of ψ(3686) events for the two data-taking periods is (448.1 ± 2.9) × 10^6. Supported by the Ministry of Science and Technology of China (2009CB825200), National Natural Science Foundation of China (NSFC) (11235011, 11322544, 11335008, 11425524, 11475207), the Chinese Academy of Sciences (CAS) Large-Scale Scientific Facility Program, the Collaborative Innovation Center for Particles and Interactions (CICPI), Joint Large-Scale Scientific Facility Funds of the NSFC and CAS (11179014), Joint Large-Scale Scientific Facility Funds of the NSFC and CAS (11179007, U1232201, U1532257, U1532258), Joint Funds of the National Natural Science Foundation of China (11079008), CAS (KJCX2-YW-N29, KJCX2-YW-N45), 100 Talents Program of CAS, National 1000 Talents Program of China, German Research Foundation DFG (Collaborative Research Center CRC 1044), Istituto Nazionale di Fisica Nucleare, Italy, Koninklijke Nederlandse Akademie van Wetenschappen (KNAW) (530-4CDP03), Ministry of Development of Turkey (DPT2006K-120470), National Natural Science Foundation of China (11205082), The Swedish Research Council, U. S. Department of Energy (DE-FG02-05ER41374, DE-SC-0010118, DE-SC-0010504), U.S. National Science Foundation, University of Groningen (RuG) and the Helmholtzzentrum fuer Schwerionenforschung GmbH (GSI), Darmstadt, WCU Program of National Research Foundation of Korea (R32-2008-000-10155-0).
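
    The quoted totals are internally consistent: the 2009 and 2012 samples sum to the combined event count, and the combined uncertainty equals the linear sum of the two systematic uncertainties (0.8 + 2.1 = 2.9), consistent with treating them as correlated. A quick arithmetic check:

```python
# Consistency check of the ψ(3686) event counts quoted above
# (all values in units of 10^6 events).
n_2009, err_2009 = 107.0, 0.8
n_2012, err_2012 = 341.1, 2.1

n_total = n_2009 + n_2012        # matches the quoted 448.1
err_total = err_2009 + err_2012  # linear sum of systematics: 2.9
print(n_total, err_total)
```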

  7. Analogue scale modelling of extensional tectonic processes using a large state-of-the-art centrifuge

    NASA Astrophysics Data System (ADS)

    Park, Heon-Joon; Lee, Changyeol

    2017-04-01

    Analogue scale modelling of extensional tectonic processes such as rifting and basin opening has been conducted numerous times. Among the controlling factors, gravitational acceleration (g) on the scale models was treated as a constant (Earth's gravity) in most analogue model studies, and only a few studies considered larger gravitational acceleration by using a centrifuge (an apparatus that generates large centrifugal force by rotating the model at high speed). Although analogue models using a centrifuge allow large scale-down and accelerated deformation driven by density differences, such as salt diapirs, the possible model size is mostly limited to about 10 cm. A state-of-the-art centrifuge installed at the KOCED Geotechnical Centrifuge Testing Center, Korea Advanced Institute of Science and Technology (KAIST), allows scale models with surface areas up to 70 by 70 cm under a maximum capacity of 240 g-tons. Using this centrifuge, we will conduct analogue scale modelling of extensional tectonic processes such as the opening of a back-arc basin. Acknowledgement: This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education (grant number 2014R1A6A3A04056405).
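
    The appeal of the centrifuge here is the standard geotechnical scaling law: at N times Earth's gravity, a model of length L represents a prototype of length N·L, so stress conditions are reproduced at greatly reduced size. A hedged numeric sketch (the g-level below is invented, not the facility's operating value):

```python
# Standard centrifuge scaling law: at N g, a model of length L_m
# represents a prototype of length L_p = N * L_m. The g-level used
# below is illustrative only.

def prototype_length(model_length_m, n_g):
    """Prototype length (m) represented by a model spun at n_g gravities."""
    return model_length_m * n_g

# A 0.7 m wide model (the facility's maximum footprint) spun at an
# assumed 100 g would represent a 70 m wide prototype structure.
L_p = prototype_length(0.7, 100)
print(L_p)
```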

  8. Experimental design for estimating unknown groundwater pumping using genetic algorithm and reduced order model

    NASA Astrophysics Data System (ADS)

    Ushijima, Timothy T.; Yeh, William W.-G.

    2013-10-01

    An optimal experimental design algorithm is developed to select locations for a network of observation wells that provide maximum information about unknown groundwater pumping in a confined, anisotropic aquifer. The design uses a maximal information criterion that chooses, among competing designs, the design that maximizes the sum of squared sensitivities while conforming to specified design constraints. The formulated optimization problem is non-convex and contains integer variables necessitating a combinatorial search. Given a realistic large-scale model, the size of the combinatorial search required can make the problem difficult, if not impossible, to solve using traditional mathematical programming techniques. Genetic algorithms (GAs) can be used to perform the global search; however, because a GA requires a large number of calls to a groundwater model, the formulated optimization problem still may be infeasible to solve. As a result, proper orthogonal decomposition (POD) is applied to the groundwater model to reduce its dimensionality. Then, the information matrix in the full model space can be searched without solving the full model. Results from a small-scale test case show identical optimal solutions among the GA, integer programming, and exhaustive search methods. This demonstrates the GA's ability to determine the optimal solution. In addition, the results show that a GA with POD model reduction is several orders of magnitude faster in finding the optimal solution than a GA using the full model. The proposed experimental design algorithm is applied to a realistic, two-dimensional, large-scale groundwater problem. The GA converged to a solution for this large-scale problem.
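
    The design criterion above — choose, among feasible designs, the set of observation wells maximizing the sum of squared sensitivities — can be illustrated at toy scale by exhaustive search, the role the GA takes over when enumeration becomes infeasible. The sensitivity values and the minimum-spacing constraint below are invented for illustration:

```python
# Toy maximal-information experimental design: choose k observation
# wells maximizing the sum of squared sensitivities subject to a
# minimum-spacing design constraint. At this size exhaustive search
# suffices; the paper's GA (with POD model reduction) replaces it when
# the combinatorial space is too large. All values here are invented.
from itertools import combinations

positions = [0, 1, 2, 3, 4, 5]          # candidate well locations (1-D)
sq_sens   = [4.0, 9.0, 1.0, 8.0, 2.0, 7.0]  # squared-sensitivity sum per well

def best_design(k, min_spacing=2):
    """Exhaustively search feasible k-well designs for the best one."""
    feasible = (
        c for c in combinations(range(len(positions)), k)
        if all(positions[b] - positions[a] >= min_spacing
               for a, b in zip(c, c[1:]))
    )
    return max(feasible, key=lambda c: sum(sq_sens[i] for i in c))

design = best_design(k=3)
print(design, sum(sq_sens[i] for i in design))
```

The spacing constraint is what makes the problem genuinely combinatorial: without it, the criterion would reduce to picking the k individually most sensitive wells.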

  9. Assessing estimation techniques for missing plot observations in the U.S. forest inventory

    Treesearch

    Grant M. Domke; Christopher W. Woodall; Ronald E. McRoberts; James E. Smith; Mark A. Hatfield

    2012-01-01

    The U.S. Forest Service, Forest Inventory and Analysis Program made a transition from state-by-state periodic forest inventories--with reporting standards largely tailored to regional requirements--to a nationally consistent, annual inventory tailored to large-scale strategic requirements. Lack of measurements on all forest land during the periodic inventory, along...

  10. Testing a New Generation: Implementing Clickers as an Extension Data Collection Tool

    ERIC Educational Resources Information Center

    Parmer, Sondra M.; Parmer, Greg; Struempler, Barb

    2012-01-01

    Using clickers to gauge student understanding in large classrooms is well documented. Less well known is the effectiveness of using clickers with youth for test taking in large-scale Extension programs. This article describes the benefits and challenges of collecting evaluation data using clickers with a third-grade population participating in a…

  11. Cultivating an Ethic of Environmental Sustainability: Integrating Insights from Aristotelian Virtue Ethics and Pragmatist Cognitive Development Theory

    ERIC Educational Resources Information Center

    York, Travis; Becker, Christian

    2012-01-01

    Despite increased attention for environmental sustainability programming, large-scale adoption of pro-environmental behaviors has been slow and largely short-term. This article analyzes the crucial role of ethics in this respect. The authors utilize an interdisciplinary approach drawing on virtue ethics and cognitive development theory to…

  12. Designing a Large-Scale Multilevel Improvement Initiative: The Improving Performance in Practice Program

    ERIC Educational Resources Information Center

    Margolis, Peter A.; DeWalt, Darren A.; Simon, Janet E.; Horowitz, Sheldon; Scoville, Richard; Kahn, Norman; Perelman, Robert; Bagley, Bruce; Miles, Paul

    2010-01-01

    Improving Performance in Practice (IPIP) is a large system intervention designed to align efforts and motivate the creation of a tiered system of improvement at the national, state, practice, and patient levels, assisting primary-care physicians and their practice teams to assess and measurably improve the quality of care for chronic illness and…

  13. A Review of Challenges in Developing a National Program for Gifted Children in India's Diverse Context

    ERIC Educational Resources Information Center

    Kurup, Anitha; Maithreyi, R.

    2012-01-01

    Large-scale sequential research developments for identification and measurement of giftedness have received ample attention in the West, whereas India's response to this has largely been lukewarm. The wide variation in parents' abilities to provide enriched environments to nurture their children's potential makes it imperative for India to develop…

  14. Proceedings of the DICE THROW Symposium 21-23 June 1977. Volume 1

    DTIC Science & Technology

    1977-07-01

    different scaled ANFO events to insure yield scalability. Phase 1 of the program consisted of a series of one-pound events to examine cratering and...characterization of a 500-ton-equivalent event. A large number of agencies were involved in different facets of the development program. Probably most...charge geometry observed in the 1000-pound series, supported the observations from the Phase 1 program. Differences were observed in the fireball

  15. Linux OS Jitter Measurements at Large Node Counts using a BlueGene/L

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, Terry R; Tauferner, Mr. Andrew; Inglett, Mr. Todd

    2010-01-01

    We present experimental results for a coordinated scheduling implementation of the Linux operating system. Results were collected on an IBM Blue Gene/L machine at scales up to 16K nodes. Our results indicate coordinated scheduling was able to provide a dramatic improvement in scaling performance for two applications characterized as bulk synchronous parallel programs.
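
    The scaling penalty that coordinated scheduling addresses is easy to quantify: in a bulk synchronous program every superstep waits for the slowest node, so if each node independently suffers a jitter event with probability p per superstep, some node delays the barrier with probability 1 − (1 − p)^n, which approaches 1 at large node counts. A quick computation (the per-node probability p is invented):

```python
# Why uncoordinated OS jitter hurts bulk synchronous programs at scale:
# a superstep's barrier waits for the slowest node, so the probability
# that at least one of n nodes is delayed is 1 - (1 - p)**n.
# The per-node, per-superstep jitter probability p is invented.

def prob_superstep_delayed(p, n):
    """Probability that at least one of n nodes hits a jitter event."""
    return 1.0 - (1.0 - p) ** n

for n in (16, 1024, 16384):
    print(n, prob_superstep_delayed(1e-3, n))
```

At 16 nodes the delay probability is under 2%, but at 16K nodes it is essentially 1, which is why aligning the noise across nodes (coordinated scheduling) restores scaling.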

  16. Programming Self-Assembly of DNA Origami Honeycomb Two-Dimensional Lattices and Plasmonic Metamaterials.

    PubMed

    Wang, Pengfei; Gaitanaros, Stavros; Lee, Seungwoo; Bathe, Mark; Shih, William M; Ke, Yonggang

    2016-06-22

    Scaffolded DNA origami has proven to be a versatile method for generating functional nanostructures with prescribed sub-100 nm shapes. Programming DNA-origami tiles to form large-scale 2D lattices that span hundreds of nanometers to the micrometer scale could provide an enabling platform for diverse applications ranging from metamaterials to surface-based biophysical assays. Toward this end, here we design a family of hexagonal DNA-origami tiles using computer-aided design and demonstrate successful self-assembly of micrometer-scale 2D honeycomb lattices and tubes by controlling their geometric and mechanical properties including their interconnecting strands. Our results offer insight into programmed self-assembly of low-defect supra-molecular DNA-origami 2D lattices and tubes. In addition, we demonstrate that these DNA-origami hexagon tiles and honeycomb lattices are versatile platforms for assembling optical metamaterials via programmable spatial arrangement of gold nanoparticles (AuNPs) into cluster and superlattice geometries.

  17. Pyrotechnic hazards classification and evaluation program. Run-up reaction testing in pyrotechnic dust suspensions

    NASA Technical Reports Server (NTRS)

    1971-01-01

    A preliminary investigation of the parameters included in run-up dust reactions is presented. Two types of tests were conducted: (1) ignition criteria of large bulk pyrotechnic dusts, and (2) optimal run-up conditions of large bulk pyrotechnic dusts. These tests were used to evaluate the order of magnitude and gross scale requirements needed to induce run-up reactions in pyrotechnic dusts and to simulate at reduced scale an accident that occurred in a manufacturing installation. Test results showed that propagation of pyrotechnic dust clouds resulted in a fireball of relatively long duration and large size. In addition, a plane wave front was observed to travel down the length of the gallery.

  18. Solid amine development program

    NASA Technical Reports Server (NTRS)

    Lovell, J. S.

    1973-01-01

    A regenerable solid amine material to perform the functions of humidity control and CO2 removal for a space shuttle type vehicle is reported. Both small-scale and large-scale testing have shown this material to be competitive, especially for the longer shuttle missions. However, it had been observed that the material off-gases ammonia under certain conditions. This presents two concerns: first, that the ammonia would contaminate the cabin atmosphere; and second, that the material is degrading with time. An extensive test program has shown HS-C to produce only trace quantities of atmospheric contaminants, and under normal extremes, to have no practical life limitation.

  19. The Cancer Genome Atlas (TCGA): The next stage - TCGA

    Cancer.gov

    The Cancer Genome Atlas (TCGA), the NIH research program that has helped set the standards for characterizing the genomic underpinnings of dozens of cancers on a large scale, is moving to its next phase.

  20. Scaling NASA Applications to 1024 CPUs on Origin 3K

    NASA Technical Reports Server (NTRS)

    Taft, Jim

    2002-01-01

    The long and highly successful joint SGI-NASA research effort in ever larger SSI systems was to a large degree the result of the successful development of the MLP scalable parallel programming paradigm developed at ARC: 1) MLP scaling in real production codes justified ever larger systems at NAS; 2) MLP scaling on the 256p Origin 2000 gave SGI impetus to productize 256p; 3) MLP scaling on 512p gave SGI the confidence to build the 1024p O3K; and 4) the history of MLP success resulted in the IBM Star Cluster based MLP effort.

  1. Propeller aircraft interior noise model utilization study and validation

    NASA Technical Reports Server (NTRS)

    Pope, L. D.

    1984-01-01

    Utilization and validation of a computer program designed for aircraft interior noise prediction is considered. The program, entitled PAIN (an acronym for Propeller Aircraft Interior Noise), permits (in theory) predictions of sound levels inside propeller driven aircraft arising from sidewall transmission. The objective of the work reported was to determine the practicality of making predictions for various airplanes and the extent of the program's capabilities. The ultimate purpose was to discern the quality of predictions for tonal levels inside an aircraft occurring at the propeller blade passage frequency and its harmonics. The effort involved three tasks: (1) program validation through comparisons of predictions with scale-model test results; (2) development of utilization schemes for large (full scale) fuselages; and (3) validation through comparisons of predictions with measurements taken in flight tests on a turboprop aircraft. Findings should enable future users of the program to efficiently undertake and correctly interpret predictions.

  2. Programming in a proposed 9X distributed Ada

    NASA Technical Reports Server (NTRS)

    Waldrop, Raymond S.; Volz, Richard A.; Goldsack, Stephen J.

    1990-01-01

    The proposed Ada 9X constructs for distribution were studied. The goal was to select suitable test cases to help in the evaluation of the proposed constructs. The examples were to be considered according to the following requirements: real-time operation; fault tolerance at several different levels; demonstration of both distributed and massively parallel operation; reflection of realistic NASA programs; illustration of the issues of configuration, compilation, linking, and loading; indication of the consequences of using the proposed revisions for large-scale programs; and coverage of the spectrum of communication patterns such as predictable, bursty, small, and large messages. The first month was spent identifying possible examples and judging their suitability for the project.

  3. Commercial use of remote sensing in agriculture: a case study

    NASA Astrophysics Data System (ADS)

    Gnauck, Gary E.

    1999-12-01

    Over 25 years of research have clearly shown that an analysis of remote sensing imagery can provide information on agricultural crops. Most of this research has been funded by and directed toward the needs of government agencies. Commercial use of agricultural remote sensing has been limited to very small-scale operations supplying remote sensing services to a few selected customers. Datron/Transco Inc. undertook an internally funded remote sensing program directed toward the California cash crop industry (strawberries, lettuce, tomatoes, other fresh vegetables and cotton). The objectives of this program were twofold: (1) to assess the need and readiness of agricultural land managers to adopt remote sensing as a management tool, and (2) to determine what technical barriers exist to large-scale implementation of this technology on a commercial basis. The program was divided into three phases: Planning, Engineering Test and Evaluation, and Commercial Operations. Findings: Remote sensing technology can deliver high-resolution multispectral imagery with rapid turnaround that can provide information on crop stress, insects, disease and various soil parameters. The limiting factors to the use of remote sensing in agriculture are a lack of familiarity among land managers, difficulty in translating 'information' into increased revenue or reduced cost for the land manager, and the large economies of scale needed to make the venture commercially viable.

  4. The Chandra Deep Wide-Field Survey: Completing the new generation of Chandra extragalactic surveys

    NASA Astrophysics Data System (ADS)

    Hickox, Ryan

    2016-09-01

    Chandra X-ray surveys have revolutionized our view of the growth of black holes across cosmic time. Recently, fundamental questions have emerged about the connection of AGN to their host large scale structures that clearly demand a wide, deep survey over a large area, comparable to the recent extensive Chandra surveys in smaller fields. We propose the Chandra Deep Wide-Field Survey (CDWFS) covering the central 6 sq. deg in the Bootes field, totaling 1.025 Ms (building on 550 ks from the HRC GTO program). CDWFS will efficiently probe a large cosmic volume, allowing us to carry out accurate new investigations of the connections between black holes and their large-scale structures, and will complete the next generation surveys that comprise a key part of Chandra's legacy.

  5. Documenting the Early Literacy and Numeracy Practices of Home Tutors in Distance and Isolated Education in Australia

    ERIC Educational Resources Information Center

    Lee, Libby; Wilks, Anne

    2007-01-01

    This paper reports aspects of a large-scale project conducted in rural and remote regions of Australia. The study was designed to assess teaching and learning practices in early childhood programs with a particular focus on literacy, numeracy and the use of information and communication technologies. Programs had been specifically designed for use…

  6. Hybrid Nested Partitions and Math Programming Framework for Large-scale Combinatorial Optimization

    DTIC Science & Technology

    2010-03-31

    optimization problems: 1) exact algorithms and 2) metaheuristic algorithms. This project will integrate concepts from these two technologies to develop...optimal solutions within an acceptable amount of computation time, and 2) metaheuristic algorithms such as genetic algorithms, tabu search, and the...integer programming decomposition approaches, such as Dantzig-Wolfe decomposition and Lagrangian relaxation, and metaheuristics such as the Nested

  7. Facilitating Long-Term Recovery from Natural Disasters: Psychosocial Programming for Tsunami-Affected Schools of Sri Lanka

    ERIC Educational Resources Information Center

    Nastasi, Bonnie K.; Jayasena, Asoka; Summerville, Meredith; Borja, Amanda P.

    2011-01-01

    This article reports the findings of a school-based intervention project conducted in the Southern Province of Sri Lanka 15 to 18 months after the December 2004 Tsunami. The work responds to the need for culturally relevant programming to address long-term psychosocial recovery of children and adolescents affected by large scale disasters. Program…

  8. From Okra to Oak: Reforestation of Abandoned Agricultural Fields in the Lower Mississippi Alluvial Valley

    Treesearch

    Callie Jo Schweitzer; John A. Stanturf

    1997-01-01

    There has been a tremendous upsurge in interest in reforestation of bottomland hardwoods. In the lower Mississippi alluvial valley, reforestation projects are occurring on a large scale on abandoned agricultural fields, often in conjunction with state or federal cost-share programs. This paper describes some of the cost-share programs used to establish bottomland...

  9. An effectiveness monitoring program for the northwest forest plan: new approaches to common monitoring problems

    Treesearch

    Craig Palmer; Barry Mulder; Barry Noon

    2000-01-01

    The Northwest Forest Plan is a large-scale ecosystem management plan for federal lands in the Pacific Northwest of the United States. An effectiveness monitoring program has been developed to determine the extent to which the goals and objectives of this Plan are being achieved. Priority resources identified for ecological monitoring include late-successional and old-...

  10. Interactive Profiler: An Intuitive, Web-Based Statistical Application in Visualizing Educational and Marketing Databases

    ERIC Educational Resources Information Center

    Ip, Edward H.; Leung, Phillip; Johnson, Joseph

    2004-01-01

    We describe the design and implementation of a web-based statistical program--the Interactive Profiler (IP). The prototypical program, developed in Java, was motivated by the need for the general public to query against data collected from the National Assessment of Educational Progress (NAEP), a large-scale US survey of the academic state of…

  11. Promoting Bio-Ethanol in the United States by Incorporating Lessons from Brazil's National Alcohol Program

    ERIC Educational Resources Information Center

    Du, Yangbo

    2007-01-01

    Current U.S. energy policy supports increasing the use of bio-ethanol as a gasoline substitute, which Brazil first produced on a large scale in response to the 1970s energy crises. Brazil's National Alcohol Program stood out among its contemporaries regarding its success at displacing a third of Brazil's gasoline requirements, primarily due to…

  12. Large-scale comparison of reforestation techniques commonly used in the lower Mississippi alluvial valley: first year results

    Treesearch

    Callie J. Schweitzer; John A. Stanturf; James P. Shepard; Timothy M. Wilkins; C. Jeffery Portwood; Lamar C., Jr. Dorris

    1997-01-01

    In the Lower Mississippi Alluvial Valley (LMAV), restoring bottomland hardwood forests has attracted heightened interest. The impetus involves not only environmental and aesthetic benefits, but also sound economics. Financial incentives to restore forested wetlands in the LMAV can come from federal cost share programs such as the Conservation Reserve Program and the...

  13. Extrinsic Motivation for Large-Scale Assessments: A Case Study of a Student Achievement Program at One Urban High School

    ERIC Educational Resources Information Center

    Emmett, Joshua; McGee, Dean

    2013-01-01

    The purpose of this case study was to discover the critical attributes of a student achievement program, known as "Think Gold," implemented at one urban comprehensive high school as part of the improvement process. Student achievement on state assessments improved during the period under study. The study draws upon perspectives on…

  14. Collaborative restoration effects on forest structure in ponderosa pine-dominated forests of Colorado

    Treesearch

    Jeffery B. Cannon; Kevin J. Barrett; Benjamin M. Gannon; Robert N. Addington; Mike A. Battaglia; Paula J. Fornwalt; Gregory H. Aplet; Antony S. Cheng; Jeffrey L. Underhill; Jennifer S. Briggs; Peter M. Brown

    2018-01-01

    In response to large, severe wildfires in historically fire-adapted forests in the western US, policy initiatives, such as the USDA Forest Service’s Collaborative Forest Landscape Restoration Program (CFLRP), seek to increase the pace and scale of ecological restoration. One required component of this program is collaborative adaptive management, in which monitoring...

  15. Scaling properties of the Arctic sea ice Deformation from Buoy Dispersion Analysis

    NASA Astrophysics Data System (ADS)

    Weiss, J.; Rampal, P.; Marsan, D.; Lindsay, R.; Stern, H.

    2007-12-01

    A temporal and spatial scaling analysis of Arctic sea ice deformation is performed over time scales from 3 hours to 3 months and over spatial scales from 300 m to 300 km. The deformation is derived from the dispersion of pairs of drifting buoys, using the IABP (International Arctic Buoy Program) buoy data sets. This study characterizes the deformation of a very large solid plate - the Arctic sea ice cover - stressed by heterogeneous forcing terms like winds and ocean currents. It shows that the sea ice deformation rate depends on the scale of observation following specific space and time scaling laws. These scaling properties share similarities with those observed for turbulent fluids, especially the ocean and the atmosphere. However, in our case, the time scaling exponent depends on the spatial scale, and the spatial exponent on the temporal scale, which implies a time/space coupling. An analysis of the exponent values shows that Arctic sea ice deformation is very heterogeneous and intermittent at all scales, i.e. it cannot be considered as viscous-like, even at very large temporal and/or spatial scales. Instead, it suggests deformation accommodated by multi-scale fracturing/faulting processes.
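
    The scaling laws described in the abstract take a power-law form, with the mean deformation rate varying as the spatial scale L raised to a negative exponent. As a rough illustration of how such a spatial exponent can be estimated by log-log regression (synthetic rates standing in for the buoy-pair dispersion data; the exponent value below is made up, not the study's result):

```python
import math

def fit_scaling_exponent(scales_km, rates):
    """Least-squares slope of log(rate) vs log(scale);
    for rate ~ L**(-beta) the fitted slope is -beta."""
    xs = [math.log(s) for s in scales_km]
    ys = [math.log(r) for r in rates]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return -slope

# Synthetic deformation rates following rate = 0.5 * L**(-0.2)
# (purely illustrative exponent), over the 300 m - 300 km range.
scales = [0.3, 3.0, 30.0, 300.0]  # km
rates = [0.5 * L ** -0.2 for L in scales]
print(round(fit_scaling_exponent(scales, rates), 6))  # -> 0.2
```

    In the actual analysis the exponent itself varies with the temporal scale (and vice versa), which is the time/space coupling the abstract highlights; a single fit like this applies only at one fixed time scale.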

  16. Uncertainties of Large-Scale Forcing Caused by Surface Turbulence Flux Measurements and the Impacts on Cloud Simulations at the ARM SGP Site

    NASA Astrophysics Data System (ADS)

    Tang, S.; Xie, S.; Tang, Q.; Zhang, Y.

    2017-12-01

    Two types of instruments, the eddy correlation flux measurement system (ECOR) and the energy balance Bowen ratio system (EBBR), are used at the Atmospheric Radiation Measurement (ARM) program Southern Great Plains (SGP) site to measure surface latent and sensible fluxes. ECOR and EBBR typically sample different land surface types, and the domain-mean surface fluxes derived from ECOR and EBBR are not always consistent. The uncertainties of the surface fluxes will have impacts on the derived large-scale forcing data and further affect the simulations of single-column models (SCM), cloud-resolving models (CRM) and large-eddy simulation models (LES), especially for the shallow-cumulus clouds which are mainly driven by surface forcing. This study aims to quantify the uncertainties of the large-scale forcing caused by surface turbulence flux measurements and investigate the impacts on cloud simulations using long-term observations from the ARM SGP site.

  17. Large-Scale Structure of Subauroral Polarization Streams During the Main Phase of a Severe Geomagnetic Storm

    NASA Astrophysics Data System (ADS)

    He, Fei; Zhang, Xiao-Xin; Wang, Wenbin; Liu, Libo; Ren, Zhi-Peng; Yue, Xinan; Hu, Lianhuan; Wan, Weixing; Wang, Hui

    2018-04-01

    In this study, we present multisatellite observations of the large-scale structures of subauroral polarization streams (SAPS) during the main phase of a severe geomagnetic storm that occurred on 31 March 2001. Observations by the Defense Meteorological Satellite Program F12 to F15 satellites indicate that the SAPS were first generated around the dusk sector at the beginning of the main phase. The SAPS channel then expanded toward the midnight sector and moved to lower latitudes as the main phase progressed. The peak velocity, latitudinal width, latitudinal alignment, and longitudinal span of the SAPS channel were highly dynamic during the storm main phase. The large westward velocities of the SAPS were located in the region of low electron densities, associated with low ionospheric conductivity. The large-scale structures of the SAPS also corresponded closely to those of the region-2 field-aligned currents, which were mainly determined by the azimuthal pressure gradient of the ring current.

  18. Fast Crystallization of the Phase Change Compound GeTe by Large-Scale Molecular Dynamics Simulations.

    PubMed

    Sosso, Gabriele C; Miceli, Giacomo; Caravati, Sebastiano; Giberti, Federico; Behler, Jörg; Bernasconi, Marco

    2013-12-19

    Phase change materials are of great interest as active layers in rewritable optical disks and novel electronic nonvolatile memories. These applications rest on a fast and reversible transformation between the amorphous and crystalline phases upon heating, taking place on the nanosecond time scale. In this work, we investigate the microscopic origin of the fast crystallization process by means of large-scale molecular dynamics simulations of the phase change compound GeTe. To this end, we use an interatomic potential generated from a Neural Network fitting of a large database of ab initio energies. We demonstrate that in the temperature range of the programming protocols of the electronic memories (500-700 K), nucleation of the crystal in the supercooled liquid is not rate-limiting. In this temperature range, the growth of supercritical nuclei is very fast because of a large atomic mobility, which is, in turn, the consequence of the high fragility of the supercooled liquid and the associated breakdown of the Stokes-Einstein relation between viscosity and diffusivity.
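
    For reference, the Stokes-Einstein relation invoked in the abstract predicts the diffusivity D = kB * T / (6 * pi * eta * r) from the viscosity eta and an effective particle radius r; its breakdown in the fragile supercooled liquid means the measured atomic mobility is much higher than this estimate. A hedged sketch with illustrative values (the viscosity and radius below are placeholders, not GeTe data):

```python
import math

KB = 1.380649e-23  # Boltzmann constant, J/K

def stokes_einstein_D(T, eta, radius):
    """Diffusivity predicted by the Stokes-Einstein relation (m^2/s)."""
    return KB * T / (6 * math.pi * eta * radius)

# Illustrative inputs: T = 600 K (within the 500-700 K programming
# window cited above); viscosity and atomic radius are hypothetical.
D = stokes_einstein_D(600.0, 1e-2, 1.5e-10)
print(D)
```

    A breakdown of the relation would show up as a measured diffusivity well above the value this formula returns for the same temperature and viscosity.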

  19. False-Positive Tuberculin Skin Test Results Among Low-Risk Healthcare Workers Following Implementation of Fifty-Dose Vials of Purified Protein Derivative.

    PubMed

    Collins, Jeffrey M; Hunter, Mary; Gordon, Wanda; Kempker, Russell R; Blumberg, Henry M; Ray, Susan M

    2018-06-01

    Following large declines in tuberculosis transmission in the United States, large-scale screening programs targeting low-risk healthcare workers are increasingly a source of false-positive results. We report a large cluster of presumed false-positive tuberculin skin test results in healthcare workers following a change to 50-dose vials of Tubersol tuberculin. Infect Control Hosp Epidemiol 2018;39:750-752.

  20. A sense of life: computational and experimental investigations with models of biochemical and evolutionary processes.

    PubMed

    Mishra, Bud; Daruwala, Raoul-Sam; Zhou, Yi; Ugel, Nadia; Policriti, Alberto; Antoniotti, Marco; Paxia, Salvatore; Rejali, Marc; Rudra, Archisman; Cherepinsky, Vera; Silver, Naomi; Casey, William; Piazza, Carla; Simeoni, Marta; Barbano, Paolo; Spivak, Marina; Feng, Jiawu; Gill, Ofer; Venkatesh, Mysore; Cheng, Fang; Sun, Bing; Ioniata, Iuliana; Anantharaman, Thomas; Hubbard, E Jane Albert; Pnueli, Amir; Harel, David; Chandru, Vijay; Hariharan, Ramesh; Wigler, Michael; Park, Frank; Lin, Shih-Chieh; Lazebnik, Yuri; Winkler, Franz; Cantor, Charles R; Carbone, Alessandra; Gromov, Mikhael

    2003-01-01

    We collaborate in a research program aimed at creating a rigorous framework, experimental infrastructure, and computational environment for understanding, experimenting with, manipulating, and modifying a diverse set of fundamental biological processes at multiple scales and spatio-temporal modes. The novelty of our research is based on an approach that (i) requires coevolution of experimental science and theoretical techniques and (ii) exploits a certain universality in biology guided by a parsimonious model of evolutionary mechanisms operating at the genomic level and manifesting at the proteomic, transcriptomic, phylogenic, and other higher levels. Our current program in "systems biology" endeavors to marry large-scale biological experiments with the tools to ponder and reason about large, complex, and subtle natural systems. To achieve this ambitious goal, ideas and concepts are combined from many different fields: biological experimentation, applied mathematical modeling, computational reasoning schemes, and large-scale numerical and symbolic simulations. From a biological viewpoint, the basic issues are many: (i) understanding common and shared structural motifs among biological processes; (ii) modeling biological noise due to interactions among a small number of key molecules or loss of synchrony; (iii) explaining the robustness of these systems in spite of such noise; and (iv) cataloging multistatic behavior and adaptation exhibited by many biological processes.

  1. Molecular diagnosis of Plasmodium ovale by photo-induced electron transfer fluorogenic primers: PET-PCR

    PubMed Central

    Akerele, David; Ljolje, Dragan; Talundzic, Eldin; Udhayakumar, Venkatachalam

    2017-01-01

    Accurate diagnosis of malaria infections continues to be challenging and elusive, especially in the detection of submicroscopic infections. Developing new malaria diagnostic tools that are sensitive enough to detect low-level infections, user friendly, cost effective and capable of performing large-scale diagnosis remains critical. We have designed novel self-quenching photo-induced electron transfer (PET) fluorogenic primers for the detection of P. ovale by real-time PCR. In our study, a total of 173 clinical samples, consisting of different malaria species, were utilized to test this novel PET-PCR primer. The sensitivity and specificity were calculated using nested-PCR as the reference test. The novel primer set demonstrated a sensitivity of 97.5% and a specificity of 99.2% (95% CI 85.2–99.8% and 95.2–99.9%, respectively). Furthermore, the limit of detection for P. ovale was found to be 1 parasite/μl. The PET-PCR assay is a new molecular diagnostic tool with comparable performance to other commonly used PCR methods. It is relatively easy to perform, and amenable to large-scale malaria surveillance studies and malaria control and elimination programs. Further field validation of this novel primer will be helpful to ascertain its utility for large-scale malaria screening programs. PMID:28640824

  2. Molecular diagnosis of Plasmodium ovale by photo-induced electron transfer fluorogenic primers: PET-PCR.

    PubMed

    Akerele, David; Ljolje, Dragan; Talundzic, Eldin; Udhayakumar, Venkatachalam; Lucchi, Naomi W

    2017-01-01

    Accurate diagnosis of malaria infections continues to be challenging and elusive, especially in the detection of submicroscopic infections. Developing new malaria diagnostic tools that are sensitive enough to detect low-level infections, user friendly, cost effective and capable of performing large-scale diagnosis remains critical. We have designed novel self-quenching photo-induced electron transfer (PET) fluorogenic primers for the detection of P. ovale by real-time PCR. In our study, a total of 173 clinical samples, consisting of different malaria species, were utilized to test this novel PET-PCR primer. The sensitivity and specificity were calculated using nested-PCR as the reference test. The novel primer set demonstrated a sensitivity of 97.5% and a specificity of 99.2% (95% CI 85.2-99.8% and 95.2-99.9%, respectively). Furthermore, the limit of detection for P. ovale was found to be 1 parasite/μl. The PET-PCR assay is a new molecular diagnostic tool with comparable performance to other commonly used PCR methods. It is relatively easy to perform, and amenable to large-scale malaria surveillance studies and malaria control and elimination programs. Further field validation of this novel primer will be helpful to ascertain its utility for large-scale malaria screening programs.
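
    The sensitivity and specificity quoted in both records are binomial proportions computed against the nested-PCR reference. As an illustration of how such a proportion and a 95% confidence interval can be computed (the Wilson score interval is used here for simplicity; the abstract does not state which interval method the authors applied, and the counts below are a hypothetical split consistent with the reported 97.5% sensitivity):

```python
import math

def wilson_ci(successes, n, z=1.96):
    """Wilson score 95% confidence interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z * z / n
    center = (p + z * z / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return center - half, center + half

# Hypothetical counts: 39 of 40 reference-positive samples detected
# (the abstract reports only percentages, not the raw split).
sens = 39 / 40  # 0.975, matching the reported 97.5% sensitivity
lo, hi = wilson_ci(39, 40)
print(round(sens, 3), round(lo, 3), round(hi, 3))
```

    Note that the interval depends on the raw counts, not just the percentage, which is why the published 85.2-99.8% bounds cannot be reproduced from the percentages alone.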

  3. Large-Scale Demonstration of Liquid Hydrogen Storage with Zero Boiloff for In-Space Applications

    NASA Technical Reports Server (NTRS)

    Hastings, L. J.; Bryant, C. B.; Flachbart, R. H.; Holt, K. A.; Johnson, E.; Hedayat, A.; Hipp, B.; Plachta, D. W.

    2010-01-01

    Cryocooler and passive insulation technology advances have substantially improved prospects for zero-boiloff cryogenic storage. Therefore, a cooperative effort by NASA's Ames Research Center, Glenn Research Center, and Marshall Space Flight Center (MSFC) was implemented to develop zero-boiloff concepts for in-space cryogenic storage. Described herein is one program element - a large-scale, zero-boiloff demonstration using the MSFC multipurpose hydrogen test bed (MHTB). A commercial cryocooler was interfaced with an existing MHTB spray bar mixer and insulation system in a manner that enabled a balance between incoming and extracted thermal energy.

  4. Establishment of a National Wind Energy Center at University of Houston

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Su Su

    The DOE-supported project objectives are to establish a national wind energy center (NWEC) at the University of Houston and conduct research to address critical science and engineering issues for the development of future large MW-scale wind energy production systems, especially offshore wind turbines. The goals of the project are to: (1) establish a sound scientific/technical knowledge base of solutions to critical science and engineering issues for developing future MW-scale large wind energy production systems; (2) develop a state-of-the-art wind rotor blade research facility at the University of Houston; and (3) through multi-disciplinary research, introduce technology innovations in advanced wind-turbine materials, processing/manufacturing technology, design and simulation, and testing and reliability assessment methods related to future wind turbine systems for cost-effective production of offshore wind energy. To achieve the goals of the project, the following technical tasks were planned and executed during the period from April 15, 2010 to October 31, 2014 at the University of Houston: (1) basic research on large offshore wind turbine systems; (2) applied research on innovative wind turbine rotors for large offshore wind energy systems; (3) integration of offshore wind-turbine design, advanced materials, and manufacturing technologies; (4) integrity and reliability of large offshore wind turbine blades and scaled model testing; (5) education and training of graduate and undergraduate students and post-doctoral researchers; and (6) development of a national offshore wind turbine blade research facility. The research program addresses both basic science and engineering of current and future large wind turbine systems, especially offshore wind turbines, for MW-scale power generation.
The results of the research advance current understanding of many important scientific issues and provide technical information for developing future large wind turbines with advanced design, composite materials, integrated manufacturing, and structural reliability and integrity. The educational program has trained many graduate and undergraduate students and post-doctoral researchers in the critical science and engineering of wind energy production systems through graduate-level courses, research, and participation in various projects within the center's large multi-disciplinary research. These students and researchers are now employed by the wind industry, national labs, and universities to support the US and international wind energy industry. The national offshore wind turbine blade research facility developed in the project has been used to support the technical and training tasks planned in the program, and it is a national asset available for use by domestic and international researchers in the wind energy arena.

  5. Conference on Fire Resistant Materials (FIREMEN): A compilation of presentations and papers

    NASA Technical Reports Server (NTRS)

    Kourtides, D. A. (Editor)

    1978-01-01

    The proceedings of the NASA Fire Resistant Materials Engineering (FIREMEN) Program conference held at Ames Research Center on April 13-14, 1978 are reported. The purpose of the conference was to discuss the results of NASA research in the field of aircraft fire safety and fire resistant materials. The program components include the following: (1) large-scale testing; (2) fire toxicology; (3) polymeric materials; and (4) a bibliography related to and/or generated by the program.

  6. BEHAVIOR ANALYSTS IN THE WAR ON POVERTY: A REVIEW OF THE USE OF FINANCIAL INCENTIVES TO PROMOTE EDUCATION AND EMPLOYMENT

    PubMed Central

    Holtyn, August F.; Jarvis, Brantley P.; Silverman, Kenneth

    2017-01-01

    Poverty is a pervasive risk factor underlying poor health. Many interventions that have sought to reduce health disparities associated with poverty have focused on improving health-related behaviors of low-income adults. Poverty itself could be targeted to improve health, but this approach would require programs that can consistently move poor individuals out of poverty. Governments and other organizations in the United States have tested a diverse range of antipoverty programs, generally on a large scale and in conjunction with welfare reform initiatives. This paper reviews antipoverty programs that used financial incentives to promote education and employment among welfare recipients and other low-income adults. The incentive-based, antipoverty programs had small or no effects on the target behaviors; they were implemented on large scales from the outset, without systematic development and evaluation of their components; and they did not apply principles of operant conditioning that have been shown to determine the effectiveness of incentive or reinforcement interventions. By applying basic principles of operant conditioning, behavior analysts could help address poverty and improve health through development of effective antipoverty programs. This paper describes a potential framework for a behavior-analytic antipoverty program, with the goal of illustrating that behavior analysts could be uniquely suited to make substantial contributions to the war on poverty. PMID:28078664

  7. Behavior analysts in the war on poverty: A review of the use of financial incentives to promote education and employment.

    PubMed

    Holtyn, August F; Jarvis, Brantley P; Silverman, Kenneth

    2017-01-01

    Poverty is a pervasive risk factor underlying poor health. Many interventions that have sought to reduce health disparities associated with poverty have focused on improving health-related behaviors of low-income adults. Poverty itself could be targeted to improve health, but this approach would require programs that can consistently move poor individuals out of poverty. Governments and other organizations in the United States have tested a diverse range of antipoverty programs, generally on a large scale and in conjunction with welfare reform initiatives. This paper reviews antipoverty programs that used financial incentives to promote education and employment among welfare recipients and other low-income adults. The incentive-based, antipoverty programs had small or no effects on the target behaviors; they were implemented on large scales from the outset, without systematic development and evaluation of their components; and they did not apply principles of operant conditioning that have been shown to determine the effectiveness of incentive or reinforcement interventions. By applying basic principles of operant conditioning, behavior analysts could help address poverty and improve health through development of effective antipoverty programs. This paper describes a potential framework for a behavior-analytic antipoverty program, with the goal of illustrating that behavior analysts could be uniquely suited to make substantial contributions to the war on poverty. © 2017 Society for the Experimental Analysis of Behavior.

  8. The EMCC / DARPA Massively Parallel Electromagnetic Scattering Project

    NASA Technical Reports Server (NTRS)

    Woo, Alex C.; Hill, Kueichien C.

    1996-01-01

    The Electromagnetic Code Consortium (EMCC) was sponsored by the Advanced Research Projects Agency (ARPA) to demonstrate the effectiveness of massively parallel computing in large-scale radar signature predictions. The EMCC/ARPA project consisted of three parts.

  9. Racking Response of Reinforced Concrete Cut and Cover Tunnel

    DOT National Transportation Integrated Search

    2016-01-01

    Currently, the knowledge base and quantitative data sets concerning cut and cover tunnel seismic response are scarce. In this report, a large-scale experimental program is conducted to assess: i) stiffness, capacity, and potential seismically-induced...

  10. Interprofessional Education and Practice Guide No. 7: Development, implementation, and evaluation of a large-scale required interprofessional education foundational programme.

    PubMed

    Shrader, Sarah; Hodgkins, Renee; Laverentz, Delois; Zaudke, Jana; Waxman, Michael; Johnston, Kristy; Jernigan, Stephen

    2016-09-01

    Health profession educators and administrators are interested in how to develop an effective and sustainable interprofessional education (IPE) programme. We describe the approach used at the University of Kansas Medical Centre, Kansas City, United States. This approach is a foundational programme with multiple large-scale, half-day events each year. The programme is threaded with common curricular components that build in complexity over time and assures that each learner is exposed to IPE. In this guide, lessons learned and general principles related to the development of IPE programming are discussed. Important areas that educators should consider include curriculum development, engaging leadership, overcoming scheduling barriers, providing faculty development, piloting the programming, planning for logistical coordination, intentionally pairing IP facilitators, anticipating IP conflict, setting clear expectations for learners, publicising the programme, debriefing with faculty, planning for programme evaluation, and developing a scholarship and dissemination plan.

  11. High-speed inlet research program and supporting analysis

    NASA Technical Reports Server (NTRS)

    Coltrin, Robert E.

    1990-01-01

    The technology challenges faced by the high speed inlet designer are discussed by describing the considerations that went into the design of the Mach 5 research inlet. It is shown that the emerging three dimensional viscous computational fluid dynamics (CFD) flow codes, together with small scale experiments, can be used to guide larger scale full inlet systems research. Then, in turn, the results of the large scale research, if properly instrumented, can be used to validate or at least to calibrate the CFD codes.

  12. Linear Scaling Density Functional Calculations with Gaussian Orbitals

    NASA Technical Reports Server (NTRS)

    Scuseria, Gustavo E.

    1999-01-01

    Recent advances in linear scaling algorithms that circumvent the computational bottlenecks of large-scale electronic structure simulations make it possible to carry out density functional calculations with Gaussian orbitals on molecules containing more than 1000 atoms and 15000 basis functions using current workstations and personal computers. This paper discusses the recent theoretical developments that have led to these advances and demonstrates in a series of benchmark calculations the present capabilities of state-of-the-art computational quantum chemistry programs for the prediction of molecular structure and properties.

  13. Charting the Emergence of Corporate Procurement of Utility-Scale PV

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heeter, Jenny S.; Cook, Jeffrey J.; Bird, Lori A.

    Corporations and other institutions have contracted for more than 2300 MW of off-site solar, using power purchase agreements, green tariffs, or bilateral deals with utilities. This paper examines the benefits, challenges, and outlooks for large-scale off-site solar purchasing in the United States. Pathways differ based on where they are available, the hedge value they can provide, and their ease of implementation. The paper features case studies of an aggregate PPA (Massachusetts Institute of Technology, Boston Medical Center, and Post Office Square), a corporation exiting its incumbent utility (MGM Resorts), a utility offering large-scale renewables to corporate customers (Alabama Power's Renewable Procurement Program), and a company with approval to sell energy into wholesale markets (Google Energy Inc.).

  14. Global Magnetohydrodynamic Modeling of the Solar Corona

    NASA Technical Reports Server (NTRS)

    Linker, Jon A.; Wagner, William (Technical Monitor)

    2001-01-01

    The solar corona, the hot, tenuous outer atmosphere of the Sun, exhibits many fascinating phenomena on a wide range of scales. One of the ways that the Sun can affect us here at Earth is through the large-scale structure of the corona and the dynamical phenomena associated with it, as it is the corona that extends outward as the solar wind and encounters the Earth's magnetosphere. The goal of our research sponsored by NASA's Supporting Research and Technology Program in Solar Physics is to develop increasingly realistic models of the large-scale solar corona, so that we can understand the underlying properties of the coronal magnetic field that lead to the observed structure and evolution of the corona. We describe the work performed under this contract.

  15. Architecting for Large Scale Agile Software Development: A Risk-Driven Approach

    DTIC Science & Technology

    2013-05-01

    addressed aspect of scale in agile software development. Practices such as Scrum of Scrums are meant to address orchestration of multiple development...owner, Scrum master) have differing responsibilities from the roles in the existing phase-based waterfall program structures. Such differences may... Scrum . Communication with both internal and external stakeholders must be open and documentation should not be used as a substitute for communication

  16. Policy and administrative issues for large-scale clinical interventions following disasters.

    PubMed

    Scheeringa, Michael S; Cobham, Vanessa E; McDermott, Brett

    2014-02-01

    Large, programmatic mental health intervention programs for children and adolescents following disasters have become increasingly common; however, little has been written about the key goals and challenges involved. Using available data and the authors' experiences, this article reviews the factors involved in planning and implementing large-scale treatment programs following disasters. These issues include funding, administration, choice of clinical targets, workforce selection, choice of treatment modalities, training, outcome monitoring, and consumer uptake. Ten factors are suggested for choosing among treatment modalities: 1) reach (providing access to the greatest number), 2) retention of patients, 3) privacy, 4) parental involvement, 5) familiarity of the modality to clinicians, 6) intensity (intervention type matches symptom acuity and impairment of patient), 7) burden to the clinician (in terms of time, travel, and inconvenience), 8) cost, 9) technology needs, and 10) effect size. Traditionally, after every new disaster, local leaders who have never done so before have had to be recruited to design, administer, and implement programs. As expertise in all of these areas represents a gap for most local professionals in disaster-affected areas, we propose that a central, nongovernmental agency with national or international scope be created that can consult flexibly with local leaders following disasters on both overarching and specific issues. We propose recommendations and point out areas in greatest need of innovation.

  17. Resolving the Circumstellar Environment of the Galactic B[e] Supergiant Star MWC 137 from Large to Small Scales

    NASA Astrophysics Data System (ADS)

    Kraus, Michaela; Liimets, Tiina; Cappa, Cristina E.; Cidale, Lydia S.; Nickeler, Dieter H.; Duronea, Nicolas U.; Arias, Maria L.; Gunawan, Diah S.; Oksala, Mary E.; Borges Fernandes, Marcelo; Maravelias, Grigoris; Curé, Michel; Santander-García, Miguel

    2017-11-01

    The Galactic object MWC 137 has been suggested to belong to the group of B[e] supergiants. However, with its large-scale optical bipolar ring nebula and high-velocity jet and knots, it is a rather atypical representative of this class. We performed multiwavelength observations spreading from the optical to the radio regimes. Based on optical imaging and long-slit spectroscopic data, we found that the northern parts of the large-scale nebula are predominantly blueshifted, while the southern regions appear mostly redshifted. We developed a geometrical model consisting of two double cones. Although various observational features can be approximated with such a scenario, the observed velocity pattern is more complex. Using near-infrared integral-field unit spectroscopy, we studied the hot molecular gas in the vicinity of the star. The emission from the hot CO gas arises in a small-scale disk revolving around the star on Keplerian orbits. Although the disk itself cannot be spatially resolved, its emission is reflected by the dust arranged in arc-like structures and the clumps surrounding MWC 137 on small scales. In the radio regime, we mapped the cold molecular gas in the outskirts of the optical nebula. We found that large amounts of cool molecular gas and warm dust embrace the optical nebula in the east, south, and west. No cold gas or dust was detected in the north and northwestern regions. Despite the new insights into the nebula kinematics gained from our studies, the real formation scenario of the large-scale nebula remains an open issue. 
Based on observations collected with (1) the ESO VLT Paranal Observatory under programs 094.D-0637(B) and 097.D-0033(A), (2) the MPG 2.2 m telescope at La Silla Observatory, Chile, under programs 096.A-9030(A) and 096.A-9039(A), (3) the Gemini Observatory, which is operated by the Association of Universities for Research in Astronomy, Inc., under a cooperative agreement with the NSF on behalf of the Gemini partnership: the National Science Foundation (United States), the Science and Technology Facilities Council (United Kingdom), the National Research Council (Canada), CONICYT (Chile), the Australian Research Council (Australia), Ministério da Ciência, Tecnologia e Inovação (Brazil), and Ministerio de Ciencia, Tecnología e Innovación Productiva (Argentina), under program GN-2013B-Q-11, (4) the Nordic Optical Telescope, operated by the Nordic Optical Telescope Scientific Association at the Observatorio del Roque de los Muchachos, La Palma, Spain, of the Instituto de Astrofisica de Canarias, (5) the APEX telescope under the program CHILE-9711B-2016. APEX is a collaboration between the Max-Planck-Institut für Radioastronomie, the European Southern Observatory, and the Onsala Observatory, and (6) the Perek 2 m telescope at Ondřejov Observatory, Czech Republic.

  18. A Large-Scale Assessment of Nucleic Acids Binding Site Prediction Programs

    PubMed Central

    Miao, Zhichao; Westhof, Eric

    2015-01-01

    Computational prediction of nucleic acid binding sites in proteins is necessary to disentangle functional mechanisms in most biological processes and to explore the binding mechanisms. Several strategies have been proposed, but the state-of-the-art approaches display a great diversity in i) the definition of nucleic acid binding sites; ii) the training and test datasets; iii) the algorithmic methods for the prediction strategies; iv) the performance measures; and v) the distribution and availability of the prediction programs. Here we report a large-scale assessment of 19 web servers and 3 stand-alone programs on 41 datasets including more than 5000 proteins derived from 3D structures of protein-nucleic acid complexes. Well-defined binary assessment criteria (specificity, sensitivity, precision, accuracy…) are applied. We found that i) the tools have been greatly improved over the years; ii) some of the approaches suffer from theoretical defects and there is still room for sorting out the essential mechanisms of binding; iii) RNA binding and DNA binding appear to follow similar driving forces and iv) dataset bias may exist in some methods. PMID:26681179
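    For readers unfamiliar with the binary assessment criteria named above, a minimal sketch of how they are computed from a confusion matrix; the residue-level counts here are hypothetical example values, not data from the assessment:

```python
# Illustrative only: the four binary classification measures used to compare
# binding-site predictors, computed from true/false positive/negative counts.

def binary_metrics(tp, fp, tn, fn):
    """Return the standard binary assessment criteria as a dict."""
    sensitivity = tp / (tp + fn)                 # recall: fraction of true sites found
    specificity = tn / (tn + fp)                 # fraction of non-sites correctly rejected
    precision = tp / (tp + fp)                   # fraction of predicted sites that are real
    accuracy = (tp + tn) / (tp + fp + tn + fn)   # overall fraction correct
    return {"sensitivity": sensitivity, "specificity": specificity,
            "precision": precision, "accuracy": accuracy}

# Hypothetical counts for one predictor on one protein chain
metrics = binary_metrics(tp=80, fp=40, tn=860, fn=20)
print(metrics)
```

    Note that accuracy alone can be misleading when binding residues are rare, which is one reason the assessment reports several complementary measures.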

  19. Scaling properties of sea ice deformation from buoy dispersion analysis

    NASA Astrophysics Data System (ADS)

    Rampal, P.; Weiss, J.; Marsan, D.; Lindsay, R.; Stern, H.

    2008-03-01

    A temporal and spatial scaling analysis of Arctic sea ice deformation is performed over timescales from 3 h to 3 months and over spatial scales from 300 m to 300 km. The deformation is derived from the dispersion of pairs of drifting buoys, using the IABP (International Arctic Buoy Program) buoy data sets. This study characterizes the deformation of a very large solid plate (the Arctic sea ice cover) stressed by heterogeneous forcing terms like winds and ocean currents. It shows that the sea ice deformation rate depends on the scales of observation following specific space and time scaling laws. These scaling properties share similarities with those observed for turbulent fluids, especially for the ocean and the atmosphere. However, in our case, the time scaling exponent depends on the spatial scale, and the spatial exponent on the temporal scale, which implies a time/space coupling. An analysis of the exponent values shows that Arctic sea ice deformation is very heterogeneous and intermittent whatever the scale, i.e., it cannot be considered as viscous-like, even at very large time and/or spatial scales. Instead, it suggests deformation accommodated by multiscale fracturing/faulting processes.
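    A spatial scaling law of the kind described is typically estimated by linear regression in log-log space. A minimal sketch with synthetic data (not IABP measurements), assuming a power law of the form rate ∝ L^(−β):

```python
import math

# Illustrative: recover the scaling exponent beta of a power law
# rate = C * L**(-beta) by least-squares regression of log(rate) on log(L).

def fit_power_law(scales_km, rates):
    """Return (beta, log_intercept) from a log-log least-squares fit."""
    xs = [math.log(s) for s in scales_km]
    ys = [math.log(r) for r in rates]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -slope, my - slope * mx   # beta = -slope for a decaying law

# Synthetic deformation rates generated from an exact power law (C=0.5, beta=0.2)
scales = [0.3, 3.0, 30.0, 300.0]                 # km, spanning the observed range
rates = [0.5 * L ** -0.2 for L in scales]
beta, _ = fit_power_law(scales, rates)
print(round(beta, 3))
```

    In the actual analysis the exponent itself varies with the timescale of observation, which is the time/space coupling the abstract refers to.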

  20. Markets for Clean Air

    NASA Astrophysics Data System (ADS)

    Ellerman, A. Denny; Joskow, Paul L.; Schmalensee, Richard; Montero, Juan-Pablo; Bailey, Elizabeth M.

    2000-06-01

    Markets for Clean Air provides a comprehensive, in-depth description and evaluation of the first three years' experience with the U.S. Acid Rain Program. This environmental control program is the world's first large-scale use of a tradable emission permit system for achieving environmental goals. The book analyzes the behavior and performance of the market for emissions permits, called allowances in the Acid Rain Program, and quantifies emission reductions, compliance costs, and cost savings associated with the trading program. The book also includes chapters on the historical context in which this pioneering program developed and the political economy of allowance allocations.

  1. Concepts and Plans towards fast large scale Monte Carlo production for the ATLAS Experiment

    NASA Astrophysics Data System (ADS)

    Ritsch, E.; Atlas Collaboration

    2014-06-01

    The huge success of the physics program of the ATLAS experiment at the Large Hadron Collider (LHC) during Run 1 relies upon a great number of simulated Monte Carlo events. This Monte Carlo production currently consumes the largest share of the computing resources used by ATLAS. In this document we describe plans to overcome the computing resource limitations for large-scale Monte Carlo production in the ATLAS experiment for Run 2 and beyond. A number of fast detector simulation, digitization, and reconstruction techniques are discussed, based upon a new flexible detector simulation framework. To optimally benefit from these developments, a redesigned ATLAS MC production chain is presented at the end of this document.

  2. Employers should disband employee weight control programs.

    PubMed

    Lewis, Alfred; Khanna, Vikram; Montrose, Shana

    2015-02-01

    American corporations continue to expand wellness programs, which now reach an estimated 90% of workers in large organizations, yet no study has demonstrated that the main focus of these programs-weight control-has any positive effect. There is no published evidence that large-scale corporate attempts to control employee body weight through financial incentives and penalties have generated savings from long-term weight loss, or a reduction in inpatient admissions associated with obesity or even long-term weight loss itself. Other evidence contradicts the hypothesis that population obesity rates meaningfully retard economic growth or manufacturing productivity. Quite the contrary, overscreening and crash dieting can impact employee morale and even harm employee health. Therefore, the authors believe that corporations should disband or significantly reconfigure weight-oriented wellness programs, and that the Affordable Care Act should be amended to require such programs to conform to accepted guidelines for harm avoidance.

  3. Experiments with a New, Unique Large-Scale Rig Investigating the Effects of Background System Rotation on Vortex Rings in Water

    NASA Astrophysics Data System (ADS)

    Brend, Mark A.; Verzicco, Roberto

    2005-11-01

    We introduce our unique, new large-scale experimental facility [1] designed for our long-term research program investigating the effects of background system rotation on the stability and the dynamics of vortex rings. The new rig comprises a large water-filled tank positioned on a rotating turntable; its overall height and diameter are 5.7 m and 1.4 m, respectively. First experimental and computational results of our program are summarized. We will show various videos of flow visualizations that illustrate some major, qualitative differences between rings propagating in rotating and non-rotating flows. Some of the investigated characteristics of the vortex rings include their translation velocity, the velocity field inside and surrounding the rings, and, in particular, their stability. We will briefly outline experiments employing the relatively new Ultrasonic-Velocity-Profiler technique (UVP). This technique appears to be particularly suited for some of our measurements and it was, as far as we are aware, not previously used in the context of vortex-ring studies. [1] http://www.eng.warwick.ac.uk/staff/pjt/turntabpics/voriskt.html

  4. A Randomized Controlled Trial Evaluation of "Time to Read", a Volunteer Tutoring Program for 8- to 9-Year-Olds

    ERIC Educational Resources Information Center

    Miller, Sarah; Connolly, Paul

    2013-01-01

    Tutoring is commonly employed to prevent early reading failure, and evidence suggests that it can have a positive effect. This article presents findings from a large-scale ("n" = 734) randomized controlled trial evaluation of the effect of "Time to Read"--a volunteer tutoring program aimed at children aged 8 to 9 years--on…

  5. Quantity and Quality of Computer Use and Academic Achievement: Evidence from a Large-Scale International Test Program

    ERIC Educational Resources Information Center

    Cheema, Jehanzeb R.; Zhang, Bo

    2013-01-01

    This study looked at the effect of both quantity and quality of computer use on achievement. The Program for International Student Assessment (PISA) 2003 student survey comprising of 4,356 students (boys, n = 2,129; girls, n = 2,227) was used to predict academic achievement from quantity and quality of computer use while controlling for…

  6. Applications for Micrographics in Large Scale Information Systems of the Future. Volume II: Part III. A Review of Micrographics State-of-the-Art.

    ERIC Educational Resources Information Center

    Information Dynamics Corp., Reading, MA.

    A five-year development program plan was drawn up for the Defense Documentation Center (DDC). This report presents in summary form the results of various surveys and reviews performed in selected areas of micrographics to support the efforts of the program's planners. Exhibits of supporting documentation are presented, together with a discussion…

  7. U.S. Regional Aquifer Analysis Program

    NASA Astrophysics Data System (ADS)

    Johnson, Ivan

    As a result of the severe 1976-1978 drought, Congress in 1978 requested that the U.S. Geological Survey (USGS) initiate studies of the nation's aquifers on a regional scale. This continuing USGS project, the Regional Aquifer System Analysis (RASA) Program, consists of systematic studies of the quality and quantity of water in the regional groundwater systems that supply a large part of the nation's water.

  8. Review of Three Recent Randomized Trials of School-Based Mentoring: Making Sense of Mixed Findings. Social Policy Report. Volume 24, Number 3

    ERIC Educational Resources Information Center

    Wheeler, Marc E.; Keller, Thomas E.; DuBois, David L.

    2010-01-01

    Between 2007 and 2009, reports were released on the results of three separate large-scale random assignment studies of the effectiveness of school-based mentoring programs for youth. The studies evaluated programs implemented by Big Brothers Big Sisters of America (BBBSA) affiliates (Herrera et al., 2007), Communities In Schools of San Antonio,…

  9. Studying Online: Student Motivations and Experiences in ALA-Accredited LIS Programs

    ERIC Educational Resources Information Center

    Oguz, Fatih; Chu, Clara M.; Chow, Anthony S.

    2015-01-01

    This paper presents a large scale study of online MLIS students (n = 910), who completed at least one online course and were enrolled in 36 of the 58 ALA-accredited MLIS programs in Canada and the United States. The results indicate that the typical student is female, White, lives in an urban setting, and is in her mid-30s. Online students were…

  10. Improving health aid for a better planet: The planning, monitoring and evaluation tool (PLANET).

    PubMed

    Sridhar, Devi; Car, Josip; Chopra, Mickey; Campbell, Harry; Woods, Ngaire; Rudan, Igor

    2015-12-01

    International development assistance for health (DAH) quadrupled between 1990 and 2012, from US$ 5.6 billion to US$ 28.1 billion. This generates an increasing need for transparent and replicable tools that could be used to set investment priorities, monitor the distribution of funding in real time, and evaluate the impact of those investments. In this paper we present a methodology that addresses these three challenges. We call this approach PLANET, which stands for planning, monitoring and evaluation tool. Fundamentally, PLANET is based on a crowdsourcing approach to obtaining information relevant to the deployment of large-scale programs. Information is contributed in real time by a diverse group of participants involved in the program delivery. PLANET relies on real-time information from three levels of participants in large-scale programs: funders, managers and recipients. At each level, information is solicited to assess five key risks that are most relevant to each level of operations. The risks at the level of funders involve systematic neglect of certain areas, focus on donors' interests over those of program recipients, ineffective co-ordination between donors, questionable mechanisms of delivery and excessive loss of funding to "middle men". At the level of managers, the risks are corruption, lack of capacity and/or competence, lack of information and/or communication, undue avoidance of governmental structures / preference to non-governmental organizations and exclusion of local expertise. At the level of primary recipients, the risks are corruption, parallel operations / "verticalization", misalignment with local priorities and lack of community involvement, issues with ethics, equity and/or acceptability, and low likelihood of sustainability beyond the end of the program's implementation. PLANET is intended as an additional tool available to policy-makers to prioritize, monitor and evaluate large-scale development programs.
In this, it should complement tools such as LiST (for health care/interventions), EQUIST (for health care/interventions) and CHNRI (for health research), which also rely on information from local experts and on local context to set priorities in a transparent, user-friendly, replicable, quantifiable and specific, algorithmic-like manner.

  11. Measurements of the wall-normal velocity component in very high Reynolds number pipe flow

    NASA Astrophysics Data System (ADS)

    Vallikivi, Margit; Hultmark, Marcus; Smits, Alexander J.

    2012-11-01

    Nano-Scale Thermal Anemometry Probes (NSTAPs) have recently been developed and used to study the scaling of the streamwise component of turbulence in pipe flow over a very large range of Reynolds numbers. This probe has an order of magnitude higher spatial and temporal resolution than regular hot wires, allowing it to resolve small-scale motions at very high Reynolds numbers. Here we use a single inclined NSTAP probe to study the scaling of the wall-normal component of velocity fluctuations in the same flow. These new probes are calibrated using a method that is based on the use of the linear stress region of a fully developed pipe flow. Results on the behavior of the wall-normal component of velocity for Reynolds numbers up to 2 million are reported. Supported under ONR Grant N00014-09-1-0263 (program manager Ron Joslin) and NSF Grant CBET-1064257 (program manager Henning Winter).

  12. An ensemble constrained variational analysis of atmospheric forcing data and its application to evaluate clouds in CAM5: Ensemble 3DCVA and Its Application

    DOE PAGES

    Tang, Shuaiqi; Zhang, Minghua; Xie, Shaocheng

    2016-01-05

    Large-scale atmospheric forcing data can greatly impact the simulations of atmospheric process models including Large Eddy Simulations (LES), Cloud Resolving Models (CRMs) and Single-Column Models (SCMs), and impact the development of physical parameterizations in global climate models. This study describes the development of an ensemble variationally constrained objective analysis of atmospheric large-scale forcing data and its application to evaluate the cloud biases in the Community Atmospheric Model (CAM5). Sensitivities of the variational objective analysis to background data, error covariance matrix and constraint variables are described and used to quantify the uncertainties in the large-scale forcing data. Application of the ensemble forcing in the CAM5 SCM during the March 2000 intensive operational period (IOP) at the Southern Great Plains (SGP) of the Atmospheric Radiation Measurement (ARM) program shows systematic biases in the model simulations that cannot be explained by the uncertainty of large-scale forcing data, which points to the deficiencies of physical parameterizations. The SCM is shown to overestimate high clouds and underestimate low clouds. These biases are found to also exist in the global simulation of CAM5 when it is compared with satellite data.

  14. Increasing condom use and declining STI prevalence in high-risk MSM and TGs: evaluation of a large-scale prevention program in Tamil Nadu, India

    PubMed Central

    2013-01-01

    Background This paper presents an evaluation of Avahan, a large-scale HIV prevention program that was implemented using peer-mediated strategies, condom distribution and sexually transmitted infection (STI) clinical services among high-risk men who have sex with men (HR-MSM) and male to female transgender persons (TGs) in the high-prevalence state of Tamil Nadu, in southern India. Methods Two rounds of large-scale cross-sectional bio-behavioural surveys among HR-MSM and TGs and routine program monitoring data were used to assess changes in program coverage, condom use and prevalence of STIs (including HIV) and their association to program exposure. Results The Avahan program for HR-MSM and TGs in Tamil Nadu was significantly scaled up and contacts by peer educators reached 77 percent of the estimated denominator by the end of the program’s fourth year. Exposure to the program increased between the two rounds of surveys for both HR-MSM (from 66 percent to 90 percent; AOR = 4.6; p < 0.001) and TGs (from 74.5 percent to 83 percent; AOR = 1.82; p < 0.06). There was an increase in consistent condom use by HR-MSM with their regular male partners (from 33 percent to 46 percent; AOR = 1.9; p < 0.01). Last time condom use with paying male partners (up from 81 percent to 94 percent; AOR = 3.6; p < 0.001) also showed an increase. Among TGs, the increase in condom use with casual male partners (18 percent to 52 percent; AOR = 1.8; p < 0.27) was not significant, and last time condom use declined significantly with paying male partners (93 percent to 80 percent; AOR = 0.32; p < 0.015). Syphilis declined significantly among both HR-MSM (14.3 percent to 6.8 percent; AOR = 0.37; p < 0.001) and TGs (16.6 percent to 4.2 percent; AOR = 0.34; p < 0.012), while change in HIV prevalence was not found to be significant for HR-MSM (9.7 percent to 10.9 percent) and TGs (12 percent to 9.8 percent). 
    For both groups, change in condom use with commercial and non-commercial partners was found to be strongly linked with exposure to the Avahan program. Conclusion The Avahan program for HR-MSM and TGs in Tamil Nadu achieved high coverage, resulting in improved condom use by HR-MSM with their regular and commercial male partners. Declining STI prevalence and stable HIV prevalence reflect the positive effects of the prevention strategy. Outcomes from the program logic model indicate the effectiveness of the program for HR-MSM and TGs in Tamil Nadu. PMID:24044766
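    The AORs reported above are adjusted odds ratios from multivariate models, which control for covariates. For orientation only, a minimal sketch of the simpler crude (unadjusted) odds ratio, using the regular-partner condom-use proportions as example inputs; the crude value differs from the paper's adjusted 1.9:

```python
# Illustrative: crude odds ratio between two survey rounds, computed
# directly from the reported proportions (no covariate adjustment).

def odds(p):
    """Odds corresponding to a proportion p."""
    return p / (1 - p)

def crude_or(p_round1, p_round2):
    """Crude odds ratio of round 2 relative to round 1."""
    return odds(p_round2) / odds(p_round1)

# Consistent condom use with regular male partners: 33% -> 46%
print(round(crude_or(0.33, 0.46), 2))
```

    The gap between a crude value like this and the published AOR reflects the adjustment for confounders in the paper's regression models.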

  15. Cost analysis of large-scale implementation of the 'Helping Babies Breathe' newborn resuscitation-training program in Tanzania.

    PubMed

    Chaudhury, Sumona; Arlington, Lauren; Brenan, Shelby; Kairuki, Allan Kaijunga; Meda, Amunga Robson; Isangula, Kahabi G; Mponzi, Victor; Bishanga, Dunstan; Thomas, Erica; Msemo, Georgina; Azayo, Mary; Molinier, Alice; Nelson, Brett D

    2016-12-01

    Helping Babies Breathe (HBB) has become the gold standard globally for training birth attendants in neonatal resuscitation in low-resource settings in efforts to reduce early newborn asphyxia and mortality. The purpose of this study was to conduct a first-ever activity-based cost analysis of at-scale HBB program implementation and initial follow-up in a large region of Tanzania and evaluate costs of national scale-up as one component of a multi-method external evaluation of the implementation of HBB at scale in Tanzania. We used activity-based costing to examine budget expense data during the two-month implementation and follow-up of HBB in one of the target regions. Activity-cost centers included administrative, initial training (including resuscitation equipment), and follow-up training expenses. Sensitivity analysis was utilized to project cost scenarios incurred to achieve countrywide expansion of the program across all mainland regions of Tanzania and to model costs of program maintenance over one and five years following initiation. Total costs for the Mbeya Region were $202,240, with the highest proportion due to initial training and equipment (45.2%), followed by central program administration (37.2%), and follow-up visits (17.6%). Within Mbeya, 49 training sessions were undertaken, involving the training of 1,341 health providers from 336 health facilities in eight districts. To similarly expand the HBB program across the 25 regions of mainland Tanzania, the total economic cost is projected to be around $4,000,000 (around $600 per facility). Following sensitivity analyses, the estimated total for initial rollout across all of Tanzania lies between $2,934,793 and $4,309,595. In order to maintain the program nationally under the current model, it is estimated it would cost $2,019,115 for a further one year and $5,640,794 for a further five years of ongoing program support. 
HBB implementation is a relatively low-cost intervention with potential for high impact on perinatal mortality in resource-poor settings. It is shown here that nationwide expansion of this program across the range of health provision levels and regions of Tanzania would be feasible. This study provides policymakers and investors with the relevant cost-estimation for national rollout of this potentially neonatal life-saving intervention.
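The projection logic described in this abstract can be illustrated with a small calculation. The regional total and cost-center shares below are taken from the abstract; the scaling rule (training and follow-up costs scale linearly per region, while central administration grows only fractionally) is an assumption made for illustration, not the study's actual sensitivity model.

```python
# Illustrative sketch of an activity-based cost projection.
# MBEYA_TOTAL and SHARES come from the abstract; the scaling
# assumptions in project_national() are hypothetical.

MBEYA_TOTAL = 202_240          # USD, total cost for the Mbeya Region
SHARES = {                     # proportion of total by activity-cost center
    "initial_training": 0.452, # includes resuscitation equipment
    "administration":   0.372, # central program administration
    "follow_up":        0.176, # follow-up training visits
}

def project_national(n_regions: int, admin_growth: float = 0.25) -> float:
    """Project rollout cost for n_regions.

    Assumes training and follow-up costs scale linearly with the number
    of regions, while central administration grows by only a fraction
    `admin_growth` per additional region (a hypothetical economy of scale).
    """
    variable = MBEYA_TOTAL * (SHARES["initial_training"] + SHARES["follow_up"])
    admin = MBEYA_TOTAL * SHARES["administration"]
    return n_regions * variable + admin * (1 + admin_growth * (n_regions - 1))

if __name__ == "__main__":
    print(f"Projected 25-region rollout: ${project_national(25):,.0f}")
```

Under these assumptions the 25-region projection lands inside the abstract's reported sensitivity range ($2,934,793 to $4,309,595); varying `admin_growth` is a crude stand-in for the study's sensitivity analysis.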

  16. Two Topics in Seasonal Streamflow Forecasting: Soil Moisture Initialization Error and Precipitation Downscaling

    NASA Technical Reports Server (NTRS)

    Koster, Randal; Walker, Greg; Mahanama, Sarith; Reichle, Rolf

    2012-01-01

    Continental-scale offline simulations with a land surface model are used to address two important issues in the forecasting of large-scale seasonal streamflow: (i) the extent to which errors in soil moisture initialization degrade streamflow forecasts, and (ii) the extent to which the downscaling of seasonal precipitation forecasts, if it could be done accurately, would improve streamflow forecasts. The reduction in streamflow forecast skill (with forecasted streamflow measured against observations) associated with adding noise to a soil moisture field is found to be, to first order, proportional to the average reduction in the accuracy of the soil moisture field itself. This result has implications for streamflow forecast improvement under satellite-based soil moisture measurement programs. In the second and more idealized ("perfect model") analysis, precipitation downscaling is found to have an impact on large-scale streamflow forecasts only if two conditions are met: (i) evaporation variance is significant relative to the precipitation variance, and (ii) the subgrid spatial variance of precipitation is adequately large. In the large-scale continental region studied (the conterminous United States), these two conditions are met in only a somewhat limited area.

  17. Large-scale fortification of condiments and seasonings as a public health strategy: equity considerations for implementation.

    PubMed

    Zamora, Gerardo; Flores-Urrutia, Mónica Crissel; Mayén, Ana-Lucia

    2016-09-01

    Fortification of staple foods with vitamins and minerals is an effective approach to increase micronutrient intake and improve nutritional status. The specific use of condiments and seasonings as vehicles in large-scale fortification programs is a relatively new public health strategy. This paper underscores equity considerations for the implementation of large-scale fortification of condiments and seasonings as a public health strategy by examining nonexhaustive examples of programmatic experiences and pilot projects in various settings. An overview of conceptual elements in implementation research and equity is presented, followed by an examination of equity considerations for five implementation strategies: (1) enhancing the capabilities of the public sector, (2) improving the performance of implementing agencies, (3) strengthening the capabilities and performance of frontline workers, (4) empowering communities and individuals, and (5) supporting multiple stakeholders engaged in improving health. Finally, specific considerations related to intersectoral action are discussed. Large-scale fortification of condiments and seasonings cannot be a standalone strategy and needs to be implemented with concurrent and coordinated public health strategies, which should be informed by a health equity lens. © 2016 New York Academy of Sciences.

  18. Ready-to-use foods for management of moderate acute malnutrition: considerations for scaling up production and use in programs.

    PubMed

    Osendarp, Saskia; Rogers, Beatrice; Ryan, Kelsey; Manary, Mark; Akomo, Peter; Bahwere, Paluku; Belete, Hilina; Zeilani, Mamane; Islam, Munirul; Dibari, Filippo; De Pee, Saskia

    2015-03-01

    Ready-to-use foods are one of the available strategies for the treatment of moderate acute malnutrition (MAM), but challenges remain in the use of these products in programs at scale. This paper focuses on two challenges: the need for cheaper formulations using locally available ingredients that are processed in a safe, reliable, and financially sustainable local production facility; and the effective use of these products in large-scale community-based programs. Linear programming tools can be used successfully to design local compositions that are in line with international guidelines, low in cost, and acceptable, and the efficacy of these local formulations in the treatment of MAM was recently demonstrated in Malawi. The production of local formulations for programs at scale relies on the existence of a reliable and efficient local production facility. Technical assistance may be required in the development of sustainable business models at an early stage in the process, taking into account the stringent product quality and safety criteria and the required investments. The use of ready-to-use products, as of any food supplement, in programs at scale will be affected by the practice of household sharing and diversion of these products for other uses. Additional measures can be considered to account for sharing. These products designed for the treatment and prevention of MAM are to be used in community-based programs and should therefore be used in conjunction with other interventions and designed so that they do not replace the intake of other foods and breastmilk. Remaining challenges and implications for the (operations) research agenda are discussed.
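The linear-programming approach mentioned above — choosing ingredient amounts that meet nutrient targets at minimum cost — can be sketched with `scipy.optimize.linprog`. The ingredient names, prices, and nutrient values below are invented for illustration only; a real formulation would use food-composition data and the full international specifications for supplementary foods.

```python
# Minimal sketch of least-cost food formulation via linear programming.
# All ingredient data here are hypothetical.
import numpy as np
from scipy.optimize import linprog

ingredients = ["peanut_paste", "milk_powder", "sugar", "oil"]
cost = np.array([0.0030, 0.0050, 0.0008, 0.0015])  # USD per gram (hypothetical)

# Rows: energy (kcal/g) and protein (g/g) per ingredient (hypothetical values)
nutrients = np.array([
    [5.9, 3.6, 4.0, 9.0],    # energy density
    [0.26, 0.26, 0.0, 0.0],  # protein fraction
])
minima = np.array([520.0, 13.0])  # per 100 g ration: >= 520 kcal, >= 13 g protein

res = linprog(
    c=cost,
    A_ub=-nutrients, b_ub=-minima,           # nutrient totals >= minima
    A_eq=np.ones((1, 4)), b_eq=[100.0],      # ingredient amounts sum to 100 g
    bounds=[(0.0, 100.0)] * 4,
)
print(dict(zip(ingredients, res.x.round(1))), f"cost per 100 g = ${res.fun:.3f}")
```

The same structure extends to the full international guidelines by adding one inequality row per nutrient, plus upper bounds or acceptability constraints on individual ingredients.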

  19. The Panchromatic Comparative Exoplanetary Treasury Program

    NASA Astrophysics Data System (ADS)

    Sing, David

    2016-10-01

    HST has played the definitive role in the characterization of exoplanets, and from the first planets available we have learned that their atmospheres are incredibly diverse. The large number of transiting planets now available has prompted a new era of atmospheric studies, where wide-scale comparative planetology is now possible. The atmospheric chemistry of cloud/haze formation and atmospheric mass loss are major outstanding issues in the field of exoplanets, and we seek to make progress by gaining insight into their underlying physical processes through comparative studies. Here we propose to use Hubble's full spectroscopic capabilities to produce the first large-scale, simultaneous UVOIR comparative study of exoplanets. With full wavelength coverage, an entire planet's atmosphere can be probed simultaneously, and with sufficient numbers of planets we can statistically compare their features with physical parameters for the first time. This panchromatic program will build a lasting HST legacy, providing the UV and blue-optical spectra unavailable to JWST. From these observations, chemistry over a wide range of physical environments will be probed, from the hottest condensates to much cooler planets where photochemical hazes could be present. Constraints on aerosol size and composition will help unlock our understanding of clouds and how they are suspended at such high altitudes. Notably, there have been no large transiting UV HST programs, and this panchromatic program will provide a fundamental legacy contribution to the study of atmospheric escape from small exoplanets, where mass loss can be significant and have a major impact on the evolution of the planet itself.

  20. Rocket University at KSC

    NASA Technical Reports Server (NTRS)

    Sullivan, Steven J.

    2014-01-01

    "Rocket University" is an exciting new initiative at Kennedy Space Center led by NASA's Engineering and Technology Directorate. This hands-on experience has been established to develop, refine, and maintain targeted flight engineering skills in support of Agency and KSC strategic goals. Through "RocketU," KSC is developing a nimble knowledge base for rapid flight engineering life-cycle systems. Ongoing activities in RocketU develop and test new technologies and potential customer systems through small-scale vehicles, build and maintain flight experience through balloon and small-scale rocket missions, and feed engineers with fresh, hands-on expertise back into large-scale NASA programs, providing a more experienced, multi-disciplined set of systems engineers. This overview will define the program, highlight aspects of the training curriculum, and identify recent accomplishments and activities.

  1. United States Department of Agriculture-Agricultural Research Service research programs in biological control of plant diseases.

    PubMed

    Roberts, Daniel P; Lohrke, Scott M

    2003-01-01

    A number of USDA-ARS programs directed at overcoming impediments to the use of biocontrol agents on a commercial scale are described. These include improvements in screening techniques, taxonomic studies to identify beneficial strains more precisely, and studies on various aspects of the large-scale production of biocontrol agents. Another broad area of study covers the ecological aspects of biocontrol agents: their interactions with the pathogen, with the plant, and with other aspects of the environmental complex. Examples of these studies are given, and their relevance to the further development and expansion of biocontrol agents is discussed.

  2. How the Avahan HIV prevention program transitioned from the Gates Foundation to the government of India.

    PubMed

    Sgaier, Sema K; Ramakrishnan, Aparajita; Dhingra, Neeraj; Wadhwani, Alkesh; Alexander, Ashok; Bennett, Sara; Bhalla, Aparajita; Kumta, Sameer; Jayaram, Matangi; Gupta, Pankaj; Piot, Peter K; Bertozzi, Stefano M; Anthony, John

    2013-07-01

    Developing countries face diminishing development aid and time-limited donor commitments that challenge the long-term sustainability of donor-funded programs to improve the health of local populations. Increasing country ownership of the programs is one solution. Transitioning managerial and financial responsibility for donor-funded programs to governments and local stakeholders represents a highly advanced form of country ownership, but there are few successful examples among large-scale programs. We present a transition framework and describe how it was used to transfer the Bill & Melinda Gates Foundation's HIV/AIDS prevention program, the Avahan program, to the Government of India. Essential features recommended for the transition of donor-funded programs to governments include early planning with the government, aligning donor program components with government structures and funding models prior to transition, building government capacity through active technical and management support, budgeting for adequate support during and after the transition, and dividing the transition into phases to allow time for adjustments and corrections. The transition of programs to governments is an important sustainability strategy for efforts to scale up HIV prevention programs to reach the populations most at risk.

  3. Delivering digital health and well-being at scale: lessons learned during the implementation of the dallas program in the United Kingdom

    PubMed Central

    Devlin, Alison M; McGee-Lennon, Marilyn; O’Donnell, Catherine A; Bouamrane, Matt-Mouley; Agbakoba, Ruth; O’Connor, Siobhan; Grieve, Eleanor; Finch, Tracy; Wyke, Sally; Watson, Nicholas; Browne, Susan

    2016-01-01

    Objective To identify implementation lessons from the United Kingdom Delivering Assisted Living Lifestyles at Scale (dallas) program—a large-scale, national technology program that aims to deliver a broad range of digital services and products to the public to promote health and well-being. Materials and Methods Prospective, longitudinal qualitative research study investigating implementation processes. Qualitative data collected includes semi-structured e-Health Implementation Toolkit–led interviews at baseline/mid-point (n = 38), quarterly evaluation, quarterly technical and barrier and solutions reports, observational logs, quarterly evaluation alignment interviews with project leads, observational data collected during meetings, and ethnographic data from dallas events (n > 200 distinct pieces of qualitative data). Data analysis was guided by Normalization Process Theory, a sociological theory that aids conceptualization of implementation issues in complex healthcare settings. Results Five key challenges were identified: 1) The challenge of establishing and maintaining large heterogeneous, multi-agency partnerships to deliver new models of healthcare; 2) The need for resilience in the face of barriers and set-backs including the backdrop of continually changing external environments; 3) The inherent tension between embracing innovative co-design and achieving delivery at pace and at scale; 4) The effects of branding and marketing issues in consumer healthcare settings; and 5) The challenge of interoperability and information governance, when commercial proprietary models are dominant. Conclusions The magnitude and ambition of the dallas program provides a unique opportunity to investigate the macro level implementation challenges faced when designing and delivering digital health and wellness services at scale. Flexibility, adaptability, and resilience are key implementation facilitators when shifting to new digitally enabled models of care. PMID:26254480

  4. National Water-Quality Assessment program: The Trinity River Basin

    USGS Publications Warehouse

    Land, Larry F.

    1991-01-01

    In 1991, the U.S. Geological Survey (USGS) began to implement a full-scale National Water-Quality Assessment (NAWQA) program. The long-term goals of the NAWQA program are to describe the status and trends in the quality of a large, representative part of the Nation's surface- and ground-water resources and to provide a sound, scientific understanding of the primary natural and human factors affecting the quality of these resources. In meeting these goals, the program will produce a wealth of water-quality information that will be useful to policy makers and managers at the national, State, and local levels. A major design feature of the NAWQA program will enable water-quality information at different areal scales to be integrated. A major component of the program is study-unit investigations, which comprise the principal building blocks of the program on which national-level assessment activities will be based. The 60 study-unit investigations that make up the program are hydrologic systems that include parts of most major river basins and aquifer systems. These study units cover areas of 1,200 to more than 65,000 square miles and incorporate about 60 to 70 percent of the Nation's water use and population served by public water supply. In 1991, the Trinity River basin study was among the first 20 NAWQA study units selected for study under the full-scale implementation plan.

  5. A large-scale evaluation of the KiVa antibullying program: grades 4-6.

    PubMed

    Kärnä, Antti; Voeten, Marinus; Little, Todd D; Poskiparta, Elisa; Kaljonen, Anne; Salmivalli, Christina

    2011-01-01

    This study demonstrates the effectiveness of the KiVa antibullying program using a large sample of 8,237 youth from Grades 4-6 (10-12 years). Altogether, 78 schools were randomly assigned to intervention (39 schools, 4,207 students) and control conditions (39 schools, 4,030 students). Multilevel regression analyses revealed that after 9 months of implementation, the intervention had consistent beneficial effects on 7 of the 11 dependent variables, including self- and peer-reported victimization and self-reported bullying. The results indicate that the KiVa program is effective in reducing school bullying and victimization in Grades 4-6. Despite some evidence against school-based interventions, the results suggest that well-conceived school-based programs can reduce victimization. © 2011 The Authors. Child Development © 2011 Society for Research in Child Development, Inc.

  6. Externally blown flap noise research

    NASA Technical Reports Server (NTRS)

    Dorsch, R. G.

    1974-01-01

    The Lewis Research Center cold-flow model externally blown flap (EBF) noise research test program is summarized. Both engine under-the-wing and over-the-wing EBF wing section configurations were studied. Ten large scale and nineteen small scale EBF models were tested. A limited number of forward airspeed effect and flap noise suppression tests were also run. The key results and conclusions drawn from the flap noise tests are summarized and discussed.

  7. Accelerating large scale Kohn-Sham density functional theory calculations with semi-local functionals and hybrid functionals

    NASA Astrophysics Data System (ADS)

    Lin, Lin

    The computational cost of standard Kohn-Sham density functional theory (KSDFT) calculations scales cubically with respect to system size, which limits its use in large-scale applications. In recent years, we have developed an alternative procedure called the pole expansion and selected inversion (PEXSI) method. The PEXSI method solves KSDFT without computing any eigenvalues or eigenvectors, and directly evaluates physical quantities including electron density, energy, atomic force, density of states, and local density of states. The overall algorithm scales at most quadratically for all materials, including insulators, semiconductors, and the difficult metallic systems. The PEXSI method can be efficiently parallelized over 10,000 - 100,000 processors on high performance machines. The PEXSI method has been integrated into a number of community electronic structure software packages such as ATK, BigDFT, CP2K, DGDFT, FHI-aims and SIESTA, and has been used in a number of applications with 2D materials beyond 10,000 atoms. The PEXSI method works for LDA, GGA and meta-GGA functionals. The mathematical structure of hybrid functional KSDFT calculations is significantly different. I will also discuss recent progress on using the adaptively compressed exchange method for accelerating hybrid functional calculations. Funding: DOE SciDAC Program, DOE CAMERA Program, LBNL LDRD, Sloan Fellowship.

  8. Defense Meteorological Satellite Program Data in Dynamic Auroral Boundary Coordinates: New insights into Polar Cap and Auroral Dynamics

    NASA Astrophysics Data System (ADS)

    Knipp, D.

    2016-12-01

    Using reprocessed (Level-2) data from the Defense Meteorological Satellite Program magnetometer (SSM) and particle precipitation (SSJ) instruments, we determine the boundaries of the central plasma sheet auroral oval and then consider the relative locations and intensities of field-aligned currents. Large-scale field-aligned currents (FAC) are determined using the Minimum Variance Analysis technique, and their influence is then removed from the magnetic perturbations, allowing us to estimate the intensity and scale size of the smaller-scale currents. When sorted by dynamic auroral boundary coordinates, we find that large-scale Region 1 (R1) FAC are often within the polar cap and that Region 2 (R2) FAC show a strong dawn-dusk asymmetry (as in Ohtani et al., 2010). We find that mesoscale FAC are stronger in summer and are most consistently present in the vicinity of dawnside (downward) R1 FAC. Further, mesoscale FAC are confined to auroral latitudes and above on the dawnside, but can be subauroral on the duskside. Hotspots of mesoscale FAC occur in pre-midnight regions, especially during summer. Finally, we show how this information can be combined with measurements from above and below the ionosphere-thermosphere to help explain significant perturbations in polar cap dynamics.
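The Minimum Variance Analysis step mentioned in this abstract has a compact numerical core: diagonalize the covariance matrix of the measured magnetic-perturbation vectors, and take the eigenvector with the smallest eigenvalue as the minimum-variance direction (e.g., the current-sheet normal). The sketch below uses synthetic data purely for illustration and is not the authors' processing code.

```python
# Sketch of Minimum Variance Analysis (MVA) on a vector time series.
import numpy as np

def minimum_variance_direction(B: np.ndarray) -> np.ndarray:
    """B: (n_samples, 3) field vectors. Returns the unit vector along
    the direction of minimum variance."""
    M = np.cov(B, rowvar=False)           # 3x3 magnetic variance matrix
    eigvals, eigvecs = np.linalg.eigh(M)  # eigenvalues in ascending order
    return eigvecs[:, 0]                  # smallest-variance eigenvector

# Synthetic check: perturbations confined mostly to the x-y plane,
# so the minimum-variance direction should come out along z.
rng = np.random.default_rng(0)
B = np.column_stack([rng.normal(0, 5.0, 1000),
                     rng.normal(0, 2.0, 1000),
                     rng.normal(0, 0.1, 1000)])
n = minimum_variance_direction(B)
print(np.abs(n))  # dominated by the z component
```

In practice the ratio of the intermediate to the smallest eigenvalue is checked to confirm that the normal direction is well determined before the large-scale FAC signature is removed.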

  9. NWChem: A comprehensive and scalable open-source solution for large scale molecular simulations

    NASA Astrophysics Data System (ADS)

    Valiev, M.; Bylaska, E. J.; Govind, N.; Kowalski, K.; Straatsma, T. P.; Van Dam, H. J. J.; Wang, D.; Nieplocha, J.; Apra, E.; Windus, T. L.; de Jong, W. A.

    2010-09-01

    The latest release of NWChem delivers an open-source computational chemistry package with extensive capabilities for large scale simulations of chemical and biological systems. Utilizing a common computational framework, diverse theoretical descriptions can be used to provide the best solution for a given scientific problem. Scalable parallel implementations and modular software design enable efficient utilization of current computational architectures. This paper provides an overview of NWChem focusing primarily on the core theoretical modules provided by the code and their parallel performance.
    Program summary
    Program title: NWChem
    Catalogue identifier: AEGI_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEGI_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Open Source Educational Community License
    No. of lines in distributed program, including test data, etc.: 11 709 543
    No. of bytes in distributed program, including test data, etc.: 680 696 106
    Distribution format: tar.gz
    Programming language: Fortran 77, C
    Computer: all Linux based workstations and parallel supercomputers, Windows and Apple machines
    Operating system: Linux, OS X, Windows
    Has the code been vectorised or parallelized?: Code is parallelized
    Classification: 2.1, 2.2, 3, 7.3, 7.7, 16.1, 16.2, 16.3, 16.10, 16.13
    Nature of problem: Large-scale atomistic simulations of chemical and biological systems require efficient and reliable methods for ground and excited solutions of the many-electron Hamiltonian, analysis of the potential energy surface, and dynamics.
    Solution method: Ground and excited solutions of the many-electron Hamiltonian are obtained utilizing density functional theory, the many-body perturbation approach, and coupled cluster expansion. These solutions, or a combination thereof with classical descriptions, are then used to analyze the potential energy surface and perform dynamical simulations.
    Additional comments: Full documentation is provided in the distribution file. This includes an INSTALL file giving details of how to build the package. A set of test runs is provided in the examples directory. The distribution file for this program is over 90 Mbytes and therefore is not delivered directly when download or Email is requested. Instead a html file giving details of how the program can be obtained is sent.
    Running time: Running time depends on the size of the chemical system, complexity of the method, number of cpu's and the computational task. It ranges from several seconds for serial DFT energy calculations on a few atoms to several hours for parallel coupled cluster energy calculations on tens of atoms or ab-initio molecular dynamics simulation on hundreds of atoms.

  10. Large-Scale Aerosol Modeling and Analysis

    DTIC Science & Technology

    2010-09-30

    …"Application of Earth Sciences Products" supports improvements in NAAPS physics and model initialization. The implementation of NAAPS, NAVDAS-AOD, FLAMBE … Forecasting of Biomass-Burning Smoke: Description of and Lessons From the Fire Locating and Modeling of Burning Emissions (FLAMBE) Program, IEEE Journal of …

  11. REAL TIME CONTROL OF SEWERS: US EPA MANUAL

    EPA Science Inventory

    The problem of sewage spills and local flooding has traditionally been addressed by large scale capital improvement programs that focus on construction alternatives such as sewer separation or construction of storage facilities. The cost of such projects is often high, especiall...

  12. Forming Mandrels for X-Ray Mirror Substrates

    NASA Technical Reports Server (NTRS)

    Blake, Peter N.; Saha, Timo; Zhang, Will; O'Dell, Stephen; Kester, Thomas; Jones, William

    2011-01-01

    Precision forming mandrels are one element in X-ray mirror development at NASA. The current mandrel fabrication process is capable of meeting the allocated precision requirements for a 5 arcsec telescope. A manufacturing plan is outlined for a large IXO-scale program.

  13. SUMMARY OF SOLIDIFICATION/STABILIZATION SITE DEMONSTRATIONS AT UNCONTROLLED HAZARDOUS WASTE SITES

    EPA Science Inventory

    Four large-scale solidification/stabilization demonstrations have occurred under EPA's SITE program. In general, physical testing results have been acceptable. Reduction in metal leachability, as determined by the TCLP test, has been observed. Reduction in organic leachability ha...

  14. The Telecommunications and Data Acquisition Report

    NASA Technical Reports Server (NTRS)

    Posner, E. C. (Editor)

    1985-01-01

    Developments in programs managed by the Jet Propulsion Laboratory's Office of Telecommunications and Data Acquisition are discussed. Space communications, radio antennas, the Deep Space Network, antenna design, Project SETI, seismology, coding, very large scale integration, downlinking, and demodulation are among the topics covered.

  15. Implementation and performance of FDPS: a framework for developing parallel particle simulation codes

    NASA Astrophysics Data System (ADS)

    Iwasawa, Masaki; Tanikawa, Ataru; Hosono, Natsuki; Nitadori, Keigo; Muranushi, Takayuki; Makino, Junichiro

    2016-08-01

    We present the basic idea, implementation, measured performance, and performance model of FDPS (Framework for Developing Particle Simulators). FDPS is an application-development framework which helps researchers to develop simulation programs using particle methods for large-scale distributed-memory parallel supercomputers. A particle-based simulation program for distributed-memory parallel computers needs to perform domain decomposition, exchange of particles which are not in the domain of each computing node, and gathering of the particle information in other nodes which are necessary for interaction calculation. Also, even if distributed-memory parallel computers are not used, in order to reduce the amount of computation, algorithms such as the Barnes-Hut tree algorithm or the Fast Multipole Method should be used in the case of long-range interactions. For short-range interactions, some methods to limit the calculation to neighbor particles are required. FDPS provides all of these functions which are necessary for efficient parallel execution of particle-based simulations as "templates," which are independent of the actual data structure of particles and the functional form of the particle-particle interaction. By using FDPS, researchers can write their programs with the amount of work necessary to write a simple, sequential and unoptimized program of O(N^2) calculation cost, and yet the program, once compiled with FDPS, will run efficiently on large-scale parallel supercomputers. A simple gravitational N-body program can be written in around 120 lines. We report the actual performance of these programs and the performance model. The weak scaling performance is very good, and almost linear speed-up was obtained for up to the full system of the K computer. The minimum calculation time per timestep is in the range of 30 ms (N = 10^7) to 300 ms (N = 10^9). 
These are currently limited by the time for the calculation of the domain decomposition and communication necessary for the interaction calculation. We discuss how we can overcome these bottlenecks.
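The abstract notes that an FDPS user writes only a simple, sequential O(N^2) interaction kernel and leaves parallelization to the framework. A direct-summation gravitational kernel of that kind looks like the following plain NumPy sketch (this is illustrative Python, not FDPS code, which is written against the framework's C++ templates):

```python
# O(N^2) direct summation of softened gravitational accelerations (G = 1).
import numpy as np

def gravity_accel(pos: np.ndarray, mass: np.ndarray, eps: float = 1e-3):
    """pos: (N, 3) positions; mass: (N,) masses; eps: softening length.
    Returns (N, 3) accelerations a_i = sum_j m_j (r_j - r_i) / (r^2 + eps^2)^1.5."""
    d = pos[None, :, :] - pos[:, None, :]          # pairwise separations r_j - r_i
    r2 = (d ** 2).sum(-1) + eps ** 2               # softened squared distances
    inv_r3 = r2 ** -1.5
    np.fill_diagonal(inv_r3, 0.0)                  # exclude self-interaction
    return (d * (mass[None, :, None] * inv_r3[:, :, None])).sum(axis=1)

# Two equal unit masses on the x-axis attract each other symmetrically.
pos = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
acc = gravity_accel(pos, np.ones(2))
print(acc)  # equal and opposite, pointing toward each other
```

In an FDPS-style code, only this pairwise kernel is user-supplied; domain decomposition, particle exchange, and tree construction are generated by the framework.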

  16. Degree program changes and curricular flexibility: Addressing long held beliefs about student progression

    NASA Astrophysics Data System (ADS)

    Ricco, George Dante

    In higher education and in engineering education in particular, changing majors is generally considered a negative event - or at least an event with negative consequences. An emergent field of study within engineering education revolves around understanding the factors and processes driving student changes of major. Of key importance to further the field of change of major research is a grasp of large scale phenomena occurring throughout multiple systems, knowledge of previous attempts at describing such issues, and the adoption of metrics to probe them effectively. The problem posed is exacerbated by the drive in higher education institutions and among state legislatures to understand and reduce time-to-degree and student attrition. With these factors in mind, insights into large-scale processes that affect student progression are essential to evaluating the success or failure of programs. The goals of this work include describing the current educational research on switchers, identifying core concepts and stumbling blocks in my treatment of switchers, and using the Multiple Institutional Database for Investigating Engineering Longitudinal Development (MIDFIELD) to explore how those who change majors perform as a function of large-scale academic pathways within and without the engineering context. To accomplish these goals, it was first necessary to delve into a recent history of the treatment of switchers within the literature and categorize their approach. While three categories of papers exist in the literature concerning change of major, all three may or may not be applicable to a given database of students or even a single institution. Furthermore, while the term has been coined in the literature, no portable metric for discussing large-scale navigational flexibility exists in engineering education. What such a metric would look like will be discussed as well as the delimitations involved. 
    The results and subsequent discussion include a description of changes of major; how they may or may not have a deleterious effect on a student's academic pathway; the special context of changes of major in the pathways of students within first-year engineering programs and of students labeled as undecided; an exploration of curricular flexibility through the construction of a novel metric; and proposed future work.

  17. On the role of minicomputers in structural design

    NASA Technical Reports Server (NTRS)

    Storaasli, O. O.

    1977-01-01

    Results are presented of exploratory studies on the use of a minicomputer in conjunction with large-scale computers to perform structural design tasks, including data and program management, use of interactive graphics, and computations for structural analysis and design. An assessment is made of minicomputer use for the structural model definition and checking and for interpreting results. Included are results of computational experiments demonstrating the advantages of using both a minicomputer and a large computer to solve a large aircraft structural design problem.

  18. The Observations of Redshift Evolution in Large Scale Environments (ORELSE) Survey

    NASA Astrophysics Data System (ADS)

    Squires, Gordon K.; Lubin, L. M.; Gal, R. R.

    2007-05-01

    We present the motivation, design, and latest results from the Observations of Redshift Evolution in Large Scale Environments (ORELSE) Survey, a systematic search for structure on scales greater than 10 Mpc around 20 known galaxy clusters at z > 0.6. When complete, the survey will cover nearly 5 square degrees, all targeted at high-density regions, making it complementary and comparable to field surveys such as DEEP2, GOODS, and COSMOS. For the survey, we are using the Large Format Camera on the Palomar 5-m and SuPRIME-Cam on the Subaru 8-m to obtain optical/near-infrared imaging of an approximately 30 arcmin region around previously studied high-redshift clusters. Colors are used to identify likely member galaxies which are targeted for follow-up spectroscopy with the DEep Imaging Multi-Object Spectrograph on the Keck 10-m. This technique has been used to identify successfully the Cl 1604 supercluster at z = 0.9, a large scale structure containing at least eight clusters (Gal & Lubin 2004; Gal, Lubin & Squires 2005). We present the most recent structures to be photometrically and spectroscopically confirmed through this program, discuss the properties of the member galaxies as a function of environment, and describe our planned multi-wavelength (radio, mid-IR, and X-ray) observations of these systems. The goal of this survey is to identify and examine a statistical sample of large scale structures during an active period in the assembly history of the most massive clusters. With such a sample, we can begin to constrain large scale cluster dynamics and determine the effect of the larger environment on galaxy evolution.

  19. Developing enterprise tools and capacities for large-scale natural resource monitoring: A visioning workshop

    USGS Publications Warehouse

    Bayer, Jennifer M.; Weltzin, Jake F.; Scully, Rebecca A.

    2017-01-01

    Objectives of the workshop were: 1) identify resources that support natural resource monitoring programs working across the data life cycle; 2) prioritize desired capacities and tools to facilitate monitoring design and implementation; 3) identify standards and best practices that improve discovery, accessibility, and interoperability of data across programs and jurisdictions; and 4) contribute to an emerging community of practice focused on natural resource monitoring.

  20. A Composite Theoretical Model Showing Potential Hidden Costs of Online Distance Education at Historically Black Colleges and Universities: With Implications for Building Cost-Resistant Courses and Programs

    ERIC Educational Resources Information Center

    Arroyo, Andrew T.

    2014-01-01

    Growing numbers of historically Black colleges and universities (HBCUs) are entering the arena of online distance education. Some are seeking to grow large-scale programs that can compete for market share with historically White institutions and for-profit schools. This theoretical essay develops a composite model to assist HBCU administrators in…

  1. Learned perceptual associations influence visuomotor programming under limited conditions: kinematic consistency.

    PubMed

    Haffenden, Angela M; Goodale, Melvyn A

    2002-12-01

    Previous findings have suggested that visuomotor programming can make use of learned size information in experimental paradigms where movement kinematics are quite consistent from trial to trial. The present experiment was designed to test whether or not this conclusion could be generalized to a different manipulation of kinematic variability. As in previous work, an association was established between the size and colour of square blocks (e.g. red = large; yellow = small, or vice versa). Associating size and colour in this fashion has been shown to reliably alter the perceived size of two test blocks halfway in size between the large and small blocks: estimations of the test block matched in colour to the group of large blocks are smaller than estimations of the test block matched to the group of small blocks. Subjects grasped the blocks, and on other trials estimated the size of the blocks. These changes in perceived block size were incorporated into grip scaling only when movement kinematics were highly consistent from trial to trial; that is, when the blocks were presented in the same location on each trial. When the blocks were presented in different locations grip scaling remained true to the metrics of the test blocks despite the changes in perceptual estimates of block size. These results support previous findings suggesting that kinematic consistency facilitates the incorporation of learned perceptual information into grip scaling.

  2. GoFFish: A Sub-Graph Centric Framework for Large-Scale Graph Analytics1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simmhan, Yogesh; Kumbhare, Alok; Wickramaarachchi, Charith

    2014-08-25

    Large scale graph processing is a major research area for Big Data exploration. Vertex centric programming models like Pregel are gaining traction due to their simple abstraction that naturally allows for scalable execution on distributed systems. However, this approach has limitations that cause vertex centric algorithms to under-perform, due to a poor compute-to-communication ratio and slow convergence across iterative supersteps. In this paper we introduce GoFFish, a scalable sub-graph centric framework co-designed with a distributed persistent graph storage for large scale graph analytics on commodity clusters. We introduce a sub-graph centric programming abstraction that combines the scalability of a vertex centric approach with the flexibility of shared memory sub-graph computation. We map Connected Components, SSSP and PageRank algorithms to this model to illustrate its flexibility. Further, we empirically analyze GoFFish using several real world graphs and demonstrate its significant performance improvement, orders of magnitude in some cases, compared to Apache Giraph, the leading open source vertex centric implementation.
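The sub-graph centric idea above can be made concrete with a small sketch. This is not the GoFFish API (which the abstract does not show); it is a hypothetical illustration of the two-level structure: each partition resolves connected components locally in shared memory in one pass, and iterations are needed only to reconcile labels across partition-boundary edges.

```python
# Minimal sub-graph centric connected components (illustrative, not GoFFish).
# Phase 1: each partition labels its own components with union-find.
# Phase 2: superstep-like passes reconcile labels only over cut edges.

def local_components(nodes, edges):
    """Label components inside one partition via union-find."""
    parent = {v: v for v in nodes}
    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]  # path halving
            v = parent[v]
        return v
    for u, v in edges:
        parent[find(u)] = find(v)
    return {v: find(v) for v in nodes}

def subgraph_centric_cc(partitions, cut_edges):
    """partitions: list of (nodes, internal_edges); cut_edges cross partitions."""
    label = {}
    for nodes, edges in partitions:
        label.update(local_components(nodes, edges))
    # Reconciliation passes: merge the two component labels meeting at a
    # cut edge down to the smaller one, until no labels change.
    changed = True
    while changed:
        changed = False
        for u, v in cut_edges:
            if label[u] != label[v]:
                hi, lo = max(label[u], label[v]), min(label[u], label[v])
                for w in label:
                    if label[w] == hi:
                        label[w] = lo
                changed = True
    return label
```

In a vertex centric model, labels propagate one hop per superstep; here each partition collapses its interior in a single local pass, which is the kind of faster convergence the abstract attributes to the sub-graph centric abstraction.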

  3. Time-evolving of very large-scale motions in a turbulent channel flow

    NASA Astrophysics Data System (ADS)

    Hwang, Jinyul; Lee, Jin; Sung, Hyung Jin; Zaki, Tamer A.

    2014-11-01

    Direct numerical simulation (DNS) data of a turbulent channel flow at Reτ = 930 was scrutinized to investigate the formation of very large-scale motions (VLSMs) by merging of two large-scale motions (LSMs), aligned in the streamwise direction. We mainly focused on the supportive motions by the near-wall streaks during the merging of the outer LSMs. From visualization of the instantaneous flow fields, several low-speed streaks in the near-wall region were collected in the spanwise direction, when LSMs were concatenated in the outer region. The magnitude of the streamwise velocity fluctuations in the streaks was intensified during the spanwise merging of the near-wall streaks. Conditionally-averaged velocity fields around the merging of the outer LSMs showed that the intensified near-wall motions were induced by the outer LSMs and extended over the near-wall regions. The intense near-wall motions influence the formation of the outer low-speed regions as well as the reduction of the convection velocity of the downstream LSMs. The interaction between the near-wall and the outer motions is the essential origin of the different convection velocities of the upstream and downstream LSMs for the formation process of VLSMs by merging. This work was supported by the Creative Research Initiatives (No. 2014-001493) program of the National Research Foundation of Korea (MSIP) and partially supported by KISTI under the Strategic Supercomputing Support Program.

  4. Load Balancing Scientific Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pearce, Olga Tkachyshyn

    2014-12-01

    The largest supercomputers have millions of independent processors, and concurrency levels are rapidly increasing. For ideal efficiency, developers of the simulations that run on these machines must ensure that computational work is evenly balanced among processors. Assigning work evenly is challenging because many large modern parallel codes simulate behavior of physical systems that evolve over time, and their workloads change over time. Furthermore, the cost of imbalanced load increases with scale because most large-scale scientific simulations today use a Single Program Multiple Data (SPMD) parallel programming model, and an increasing number of processors will wait for the slowest one at the synchronization points. To address load imbalance, many large-scale parallel applications use dynamic load balance algorithms to redistribute work evenly. The research objective of this dissertation is to develop methods to decide when and how to load balance the application, and to balance it effectively and affordably. We measure and evaluate the computational load of the application, and develop strategies to decide when and how to correct the imbalance. Depending on the simulation, a fast, local load balance algorithm may be suitable, or a more sophisticated and expensive algorithm may be required. We developed a model for comparison of load balance algorithms for a specific state of the simulation that enables the selection of a balancing algorithm that will minimize overall runtime.
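The dissertation's cost model is not given in the abstract, but the trade-off it describes can be sketched with illustrative names and numbers: measure imbalance as the maximum per-processor load over the mean (in SPMD every processor waits for the slowest one at each synchronization point), and rebalance only when the projected idle time exceeds the one-off cost of redistributing the work.

```python
# Sketch of a load-balance decision rule for an SPMD code (hypothetical
# names; the dissertation's actual model is not shown in the abstract).

def imbalance(loads):
    """Max/mean load ratio across processors; 1.0 means perfectly balanced."""
    mean = sum(loads) / len(loads)
    return max(loads) / mean

def should_rebalance(loads, steps_until_next_check, rebalance_cost):
    """Rebalance if the time lost to waiting over the coming steps
    exceeds the one-off cost of redistributing the work."""
    mean = sum(loads) / len(loads)
    wasted_per_step = max(loads) - mean  # time the average rank sits idle
    return wasted_per_step * steps_until_next_check > rebalance_cost
```

The same rule also motivates choosing between algorithms: a cheap local balancer lowers `rebalance_cost` but may leave residual imbalance, while an expensive global one pays more up front to drive `wasted_per_step` closer to zero.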

  5. "During early implementation you just muddle through": factors that impacted a statewide arthritis program's implementation.

    PubMed

    Conte, Kathleen P; Marie Harvey, S; Turner Goins, R

    2017-12-01

    The need to scale-up effective arthritis self-management programs is pressing as the prevalence of arthritis increases. The CDC Arthritis Program funds state health departments to work with local delivery systems to embed arthritis programs into their day-to-day work. To encourage organizational ownership and sustainability of programs, funding is restricted to offset program start-up costs. The purpose of this study was to identify factors that impacted the success of implementing an evidence-based arthritis self-management program, funded by the CDC Arthritis Program, into the Oregon Extension Service. We interviewed staff and partners involved in implementation who had and had not successfully delivered Walk With Ease (N = 12) to identify barriers and facilitators to scaling-up. Document analysis of administrative records was used to triangulate and expand on findings. Delivery goals defined by the funder were not met in Year 1: only 3 of the expected 28 programs were delivered. Barriers to implementation included insufficient planning for implementation driven by pressure to deliver programs and insufficient resources to support staff time. Facilitators included centralized administration of key implementation activities and staffs' previous experience implementing new programs. The importance of planning and preparing for implementation cannot be overlooked. Funders, however, eager to see deliverables, continue to define implementation goals in terms of program reach, exclusive of capacity-building. Lack of capacity-building can jeopardize staff buy-in, implementation quality, and sustainability. Based on our findings coupled with support from implementation literature, we offer recommendations for future large-scale implementation efforts operating under such funding restrictions.

  6. A MANAGEMENT SUPPORT SYSTEM FOR GREAT LAKES COASTAL WETLANDS

    EPA Science Inventory

    The Great Lakes National Program Office in conjunction with the Great Lakes Commission and other researchers is leading a large scale collaborative effort that will yield, in unprecedented detail, a management support system for Great Lakes coastal wetlands. This entails the dev...

  7. Lessons Learned from the First Decade of Adaptive Management in Comprehensive Everglades Restoration

    EPA Science Inventory

    Although few successful examples of large-scale adaptive management applications are available to ecosystem restoration scientists and managers, examining where and how the components of an adaptive management program have been successfully implemented yields insight into what ...

  8. NON-POLLUTING METAL SURFACE FINISHING PRETREATMENT AND PRETREATMENT/CONVERSION COATING

    EPA Science Inventory

    Picklex, a proprietary formulation, is an alternative to conventional metal surface pretreatments and is claimed to produce no waste and to lower neither production rates nor performance. A laboratory program was designed to evaluate Picklex in common, large scale, polluting surface finishin...

  9. Incorporation of DNA barcoding into a large-scale biomonitoring program: opportunities and pitfalls

    EPA Science Inventory

    Taxonomic identification of benthic macroinvertebrates is critical to protocols used to assess the biological integrity of aquatic ecosystems. The time, expense, and inherent error rate of species-level morphological identifications has necessitated use of genus- or family-level ...

  10. Pioneering University/Industry Venture Explores VLSI Frontiers.

    ERIC Educational Resources Information Center

    Davis, Dwight B.

    1983-01-01

    Discusses industry-sponsored programs in semiconductor research, focusing on Stanford University's Center for Integrated Systems (CIS). CIS, while pursuing research in semiconductor very-large-scale integration, is merging the fields of computer science, information science, and physical science. Issues related to these university/industry…

  11. Computation and Theory in Large-Scale Optimization

    DTIC Science & Technology

    1993-01-13

    Sang Jin Lee, Research Assistant; Laura Morley, Research Assistant; Yonca A. Ozge, Research Assistant; Stephen M. Robinson, Professor; Hichem... other participants. M.N. Azadez, S.J. Lee, Y.A. Ozge, and H. Sellami are continuing students in the doctoral program (in Industrial Engineering except

  12. A Combined Ethical and Scientific Analysis of Large-scale Tests of Solar Climate Engineering

    NASA Astrophysics Data System (ADS)

    Ackerman, T. P.

    2017-12-01

    Our research group recently published an analysis of the combined ethical and scientific issues surrounding large-scale testing of stratospheric aerosol injection (SAI; Lenferna et al., 2017, Earth's Future). We are expanding this study in two directions. The first is extending this same analysis to other geoengineering techniques, particularly marine cloud brightening (MCB). MCB has substantial differences to SAI in this context because MCB can be tested over significantly smaller areas of the planet and, following injection, has a much shorter lifetime of weeks as opposed to years for SAI. We examine issues such as the role of intent, the lesser of two evils, and the nature of consent. In addition, several groups are currently considering climate engineering governance tools such as a code of ethics and a registry. We examine how these tools might influence climate engineering research programs and, specifically, large-scale testing. The second direction of expansion is asking whether ethical and scientific issues associated with large-scale testing are so significant that they effectively preclude moving ahead with climate engineering research and testing. Some previous authors have suggested that no research should take place until these issues are resolved. We think this position is too draconian and consider a more nuanced version of this argument. We note, however, that there are serious questions regarding the ability of the scientific research community to move to the point of carrying out large-scale tests.

  13. National Water Quality Assessment Program; the Santee Basin and coastal drainage, North Carolina and South Carolina

    USGS Publications Warehouse

    Hughes, W. Brian

    1994-01-01

    In 1991, the U.S. Geological Survey (USGS), U.S. Department of the Interior, began a National Water-Quality Assessment Program (NAWQA). The long-term goals of NAWQA are to describe the status of and trends in the quality of a large representative part of the Nation's surface- and ground-water resources and to identify all the major factors that affect the quality of these resources. In addressing these goals, NAWQA produces water-quality information that is useful to policymakers and managers at State, Federal, and local levels. NAWQA emphasis is on regional-scale water-quality problems. The program does not diminish the need for smaller scale studies and monitoring designed and conducted by State, Federal, and local agencies. NAWQA, however, provides a large-scale framework for conducting many of these activities and an understanding about regional and national water-quality conditions that cannot be acquired from these other programs and studies. Studies of 60 hydrologic systems that include parts of most major river basins and aquifer systems are the building blocks of the national assessment. The areas of the 60 study units range in size from 1,000 to more than 60,000 square miles (mi²) and represent 60 to 70 percent of the Nation's water use and population served by public water supplies. Twenty investigations were begun in 1991, 20 investigations began in 1994, and 20 are planned to begin in 1997. The assessment activities in the Santee River Basin and Coastal Drainage began in 1994.

  14. Cost-Effective Large-Scale Occupancy–Abundance Monitoring of Invasive Brushtail Possums (Trichosurus Vulpecula) on New Zealand’s Public Conservation Land

    PubMed Central

    Gormley, Andrew M.; Forsyth, David M.; Wright, Elaine F.; Lyall, John; Elliott, Mike; Martini, Mark; Kappers, Benno; Perry, Mike; McKay, Meredith

    2015-01-01

    There is interest in large-scale and unbiased monitoring of biodiversity status and trend, but there are few published examples of such monitoring being implemented. The New Zealand Department of Conservation is implementing a monitoring program that involves sampling selected biota at the vertices of an 8-km grid superimposed over the 8.6 million hectares of public conservation land that it manages. The introduced brushtail possum (Trichosurus Vulpecula) is a major threat to some biota and is one taxon that they wish to monitor and report on. A pilot study revealed that the traditional method of monitoring possums using leg-hold traps set for two nights, termed the Trap Catch Index, was a constraint on the cost and logistical feasibility of the monitoring program. A phased implementation of the monitoring program was therefore conducted to collect data for evaluating the trade-off between possum occupancy–abundance estimates and the costs of sampling for one night rather than two nights. Reducing trapping effort from two nights to one night along four trap-lines reduced the estimated costs of monitoring by 5.8% due to savings in labour, food and allowances; it had a negligible effect on estimated national possum occupancy but resulted in slightly higher and less precise estimates of relative possum abundance. Monitoring possums for one night rather than two nights would provide an annual saving of NZ$72,400, with 271 fewer field days required for sampling. Possums occupied 60% (95% credible interval; 53–68) of sampling locations on New Zealand’s public conservation land, with a mean relative abundance (Trap Catch Index) of 2.7% (2.0–3.5). Possum occupancy and abundance were higher in forest than in non-forest habitats. Our case study illustrates the need to evaluate relationships between sampling design, cost, and occupancy–abundance estimates when designing and implementing large-scale occupancy–abundance monitoring programs. PMID:26029890

  15. 2012 U.S. Department of Energy: Joint Genome Institute: Progress Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilbert, David

    2013-01-01

    The mission of the U.S. Department of Energy Joint Genome Institute (DOE JGI) is to serve the diverse scientific community as a user facility, enabling the application of large-scale genomics and analysis of plants, microbes, and communities of microbes to address the DOE mission goals in bioenergy and the environment. The DOE JGI's sequencing efforts fall under the Eukaryote Super Program, which includes the Plant and Fungal Genomics Programs; and the Prokaryote Super Program, which includes the Microbial Genomics and Metagenomics Programs. In 2012, several projects made news for their contributions to energy and environment research.

  16. A case study in R and D productivity: Helping the program manager cope with job stress and improve communication effectiveness

    NASA Technical Reports Server (NTRS)

    Bodensteiner, W. D.; Gerloff, E. A.

    1985-01-01

    Certain structural changes in the Naval Material Command which resulted from a comparison of its operations to those of selected large-scale private sector companies are described. Central to the change was a reduction in the number of formal reports from systems commands to headquarters, and the provision of Program Management Assistance Teams (at the request of the program manager) to help resolve project problems. It is believed that these changes improved communication and information-processing, reduced program manager stress, and resulted in improved productivity.

  17. Final Report---Optimization Under Nonconvexity and Uncertainty: Algorithms and Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jeff Linderoth

    2011-11-06

    The goal of this work was to develop new algorithmic techniques for solving large-scale numerical optimization problems, focusing on problem classes that have proven to be among the most challenging for practitioners: those involving uncertainty and those involving nonconvexity. This research advanced the state of the art in solving mixed integer linear programs containing symmetry, mixed integer nonlinear programs, and stochastic optimization problems. The continuation work focused on mixed integer nonlinear programs (MINLPs) and mixed integer linear programs (MILPs), especially those containing a great deal of symmetry.

  18. National Water-Quality Assessment Program - Red River of the North

    USGS Publications Warehouse

    Stoner, J.D.

    1991-01-01

    In 1991, the U.S. Geological Survey (USGS) began to implement a full-scale National Water-Quality Assessment (NAWQA) program. The long-term goals of the NAWQA program are to describe the status and trends in the quality of a large, representative part of the Nation's surface- and ground-water resources, and to provide a sound scientific understanding of the primary natural and human factors affecting the quality of these resources. The program will produce a wealth of water-quality information that will be useful to policy makers and managers at the national, State, and local levels.

  19. The Great Observatories Origins Deep Survey

    NASA Astrophysics Data System (ADS)

    Dickinson, Mark

    2008-05-01

    Observing the formation and evolution of ordinary galaxies at early cosmic times requires data at many wavelengths in order to recognize, separate and analyze the many physical processes which shape galaxies' history, including the growth of large scale structure, gravitational interactions, star formation, and active nuclei. Extremely deep data, covering an adequately large volume, are needed to detect ordinary galaxies in sufficient numbers at such great distances. The Great Observatories Origins Deep Survey (GOODS) was designed for this purpose as an anthology of deep field observing programs that span the electromagnetic spectrum. GOODS targets two fields, one in each hemisphere. Some of the deepest and most extensive imaging and spectroscopic surveys have been carried out in the GOODS fields, using nearly every major space- and ground-based observatory. Many of these data have been taken as part of large, public surveys (including several Hubble Treasury, Spitzer Legacy, and ESO Large Programs), which have produced large data sets that are widely used by the astronomical community. I will review the history of the GOODS program, highlighting results on the formation and early growth of galaxies and their active nuclei. I will also describe new and upcoming observations, such as the GOODS Herschel Key Program, which will continue to fill out our portrait of galaxies in the young universe.

  20. University of Arizona High Energy Physics Program at the Cosmic Frontier 2014-2016

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abate, Alex; Cheu, Elliott

    This is the final technical report from the University of Arizona High Energy Physics program at the Cosmic Frontier covering the period 2014-2016. The work aims to advance the understanding of dark energy using the Large Synoptic Survey Telescope (LSST). Progress on the engineering design of the power supplies for the LSST camera is discussed. A variety of contributions to photometric redshift measurement uncertainties were studied. The effect of the intergalactic medium on the photometric redshift of very distant galaxies was evaluated. Computer code was developed realizing the full chain of calculations needed to accurately and efficiently run large-scale simulations.

  1. Calculating Second-Order Effects in MOSFET's

    NASA Technical Reports Server (NTRS)

    Benumof, Reuben; Zoutendyk, John A.; Coss, James R.

    1990-01-01

    This collection of mathematical models includes second-order effects in n-channel, enhancement-mode, metal-oxide-semiconductor field-effect transistors (MOSFET's). When the dimensions of circuit elements are relatively large, these effects can safely be neglected. However, as very-large-scale integration of microelectronic circuits leads to MOSFET's shorter or narrower than 2 micrometers, the effects become significant in design and operation. Such computer programs as the widely used "Simulation Program With Integrated Circuit Emphasis, Version 2" (SPICE 2) include many of these effects. In second-order models of the n-channel, enhancement-mode MOSFET, the first-order gate-depletion region is diminished by triangular-cross-section deletions on the ends and augmented by circular-wedge-cross-section bulges on the sides.

  2. Study design of a cluster-randomized controlled trial to evaluate a large-scale distribution of cook stoves and water filters in Western Province, Rwanda.

    PubMed

    Nagel, Corey L; Kirby, Miles A; Zambrano, Laura D; Rosa, Ghislane; Barstow, Christina K; Thomas, Evan A; Clasen, Thomas F

    2016-12-15

    In Rwanda, pneumonia and diarrhea are the first and second leading causes of death, respectively, among children under five. Household air pollution (HAP) resultant from cooking indoors with biomass fuels on traditional stoves is a significant risk factor for pneumonia, while consumption of contaminated drinking water is a primary cause of diarrheal disease. To date, there have been no large-scale effectiveness trials of programmatic efforts to provide either improved cookstoves or household water filters at scale in a low-income country. In this paper we describe the design of a cluster-randomized trial to evaluate the impact of a national-level program to distribute and promote the use of improved cookstoves and advanced water filters to the poorest quarter of households in Rwanda. We randomly allocated 72 sectors (administratively defined units) in Western Province to the intervention, with the remaining 24 sectors in the province serving as controls. In the intervention sectors, roughly 100,000 households received improved cookstoves and household water filters through a government-sponsored program targeting the poorest quarter of households nationally. The primary outcome measures are the incidence of acute respiratory infection (ARI) and diarrhea among children under five years of age. Over a one-year surveillance period, all cases of acute respiratory infection (ARI) and diarrhea identified by health workers in the study area will be extracted from records maintained at health facilities and by community health workers (CHW). In addition, we are conducting intensive, longitudinal data collection among a random sample of households in the study area for in-depth assessment of coverage, use, environmental exposures, and additional health measures. 
Although previous research has examined the impact of providing household water treatment and improved cookstoves on child health, there have been no studies of national-level programs to deliver these interventions at scale in a developing country. The results of this study, the first RCT of a large-scale programmatic cookstove or household water filter intervention, will inform global efforts to reduce childhood morbidity and mortality from diarrheal disease and pneumonia. This trial is registered at Clinicaltrials.gov (NCT02239250).

  3. Compliant Robotic Structures. Part 2

    DTIC Science & Technology

    1986-07-01

    [Table-of-contents fragments: Nonaxially Homogeneous Stresses and Strains (44); Parametric Studies (52); References (65); III. Large Deflections of Continuous Elastic Structures (66); ... Appendix C: Computer Program for the Element String (133).] SUMMARY: This is the second-year report, part of a three-year study on compliant... ratios as high as 10/1 for laboratory-scale models and up to 3/1 for full-scale prototype arms. The first two years of this study have involved the

  4. Policy and Administrative Issues for Large-Scale Clinical Interventions Following Disasters

    PubMed Central

    Cobham, Vanessa E.; McDermott, Brett

    2014-01-01

    Abstract Objective: Large, programmatic mental health intervention programs for children and adolescents following disasters have become increasingly common; however, little has been written about the key goals and challenges involved. Methods: Using available data and the authors' experiences, this article reviews the factors involved in planning and implementing large-scale treatment programs following disasters. Results: These issues include funding, administration, choice of clinical targets, workforce selection, choice of treatment modalities, training, outcome monitoring, and consumer uptake. Ten factors are suggested for choosing among treatment modalities: 1) reach (providing access to the greatest number), 2) retention of patients, 3) privacy, 4) parental involvement, 5) familiarity of the modality to clinicians, 6) intensity (intervention type matches symptom acuity and impairment of patient), 7) burden to the clinician (in terms of time, travel, and inconvenience), 8) cost, 9) technology needs, and 10) effect size. Traditionally, after every new disaster, local leaders who have never done so before have had to be recruited to design, administer, and implement programs. Conclusion: As expertise in all of these areas represents a gap for most local professionals in disaster-affected areas, we propose that a central, nongovernmental agency with national or international scope be created that can consult flexibly with local leaders following disasters on both overarching and specific issues. We propose recommendations and point out areas in greatest need of innovation. PMID:24521227

  5. Computer Science Techniques Applied to Parallel Atomistic Simulation

    NASA Astrophysics Data System (ADS)

    Nakano, Aiichiro

    1998-03-01

    Recent developments in parallel processing technology and multiresolution numerical algorithms have established large-scale molecular dynamics (MD) simulations as a new research mode for studying materials phenomena such as fracture. However, this requires large system sizes and long simulated times. We have developed: i) Space-time multiresolution schemes; ii) fuzzy-clustering approach to hierarchical dynamics; iii) wavelet-based adaptive curvilinear-coordinate load balancing; iv) multilevel preconditioned conjugate gradient method; and v) spacefilling-curve-based data compression for parallel I/O. Using these techniques, million-atom parallel MD simulations are performed for the oxidation dynamics of nanocrystalline Al. The simulations take into account the effect of dynamic charge transfer between Al and O using the electronegativity equalization scheme. The resulting long-range Coulomb interaction is calculated efficiently with the fast multipole method. Results for temperature and charge distributions, residual stresses, bond lengths and bond angles, and diffusivities of Al and O will be presented. The oxidation of nanocrystalline Al is elucidated through immersive visualization in virtual environments. A unique dual-degree education program at Louisiana State University will also be discussed in which students can obtain a Ph.D. in Physics & Astronomy and a M.S. from the Department of Computer Science in five years. This program fosters interdisciplinary research activities for interfacing High Performance Computing and Communications with large-scale atomistic simulations of advanced materials. This work was supported by NSF (CAREER Program), ARO, PRF, and Louisiana LEQSF.

  6. Solving LP Relaxations of Large-Scale Precedence Constrained Problems

    NASA Astrophysics Data System (ADS)

    Bienstock, Daniel; Zuckerberg, Mark

    We describe new algorithms for solving linear programming relaxations of very large precedence constrained production scheduling problems. We present theory that motivates a new set of algorithmic ideas that can be employed on a wide range of problems; on data sets arising in the mining industry our algorithms prove effective on problems with many millions of variables and constraints, obtaining provably optimal solutions in a few minutes of computation.
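The combinatorial problem behind these relaxations can be illustrated with a toy instance: each mining block has a value (negative for waste), and a block may be extracted only if the blocks it sits under are extracted first. The sketch below solves a tiny instance by brute force; the block values and precedence structure are invented for illustration, and instances of the size the authors describe require LP or network-flow machinery rather than enumeration.

```python
from itertools import combinations

def max_closure(values, prereq):
    """Brute-force maximum-value closed subset: if a block is chosen,
    every block it depends on must be chosen too (precedence constraint)."""
    blocks = list(values)
    best_set, best_val = set(), 0  # the empty set is always feasible
    # Enumerate all subsets (fine for a toy; real instances need LP/flow).
    for r in range(1, len(blocks) + 1):
        for subset in combinations(blocks, r):
            s = set(subset)
            closed = all(set(prereq.get(b, [])) <= s for b in s)
            val = sum(values[b] for b in s)
            if closed and val > best_val:
                best_set, best_val = s, val
    return best_set, best_val

# Hypothetical mine: 'a' and 'c' are waste blocks, 'b' and 'd' are ore.
values = {'a': -2, 'b': 5, 'c': -3, 'd': 2}
prereq = {'b': ['a'], 'd': ['a', 'c']}  # b sits under a; d under a and c
sel, val = max_closure(values, prereq)
print(sel, val)  # best closed set is {'a', 'b'} with value 3
```

Extracting 'd' is not worth removing waste block 'c', so the optimum stops at {'a', 'b'}; the LP relaxation of this integer program is what the paper's algorithms solve at scale.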

  7. Parallel computing for probabilistic fatigue analysis

    NASA Technical Reports Server (NTRS)

    Sues, Robert H.; Lua, Yuan J.; Smith, Mark D.

    1993-01-01

This paper presents the results of Phase I research to investigate the most effective parallel processing software strategies and hardware configurations for probabilistic structural analysis. We investigate the efficiency of both shared- and distributed-memory architectures via a probabilistic fatigue life analysis problem. We also present a parallel programming approach, the virtual shared-memory paradigm, that is applicable across both types of hardware. Using this approach, problems can be solved on a variety of parallel configurations, including networks of single- or multiprocessor workstations. We conclude that it is possible to effectively parallelize probabilistic fatigue analysis codes; however, special strategies will be needed to achieve large-scale parallelism, to keep large numbers of processors busy, and to treat problems with the large memory requirements encountered in practice. We also conclude that a distributed-memory architecture is preferable to a shared-memory one for achieving large-scale parallelism; however, the currently emerging hybrid-memory architectures will likely prove optimal in the future.

  8. Template Interfaces for Agile Parallel Data-Intensive Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramakrishnan, Lavanya; Gunter, Daniel; Pastorello, Gilerto Z.

Tigres provides a programming library to compose and execute large-scale data-intensive scientific workflows from desktops to supercomputers. DOE User Facilities and large science collaborations are increasingly generating large enough data sets that it is no longer practical to download them to a desktop to operate on them. They are instead stored at centralized compute and storage resources such as high performance computing (HPC) centers. Analysis of this data requires an ability to run on these facilities, but with current technologies, scaling an analysis to an HPC center and to a large data set is difficult even for experts. Tigres is addressing the challenge of enabling collaborative analysis of DOE Science data through a new concept of reusable "templates" that enable scientists to easily compose, run and manage collaborative computational tasks. These templates define common computation patterns used in analyzing a data set.

  9. NASA/FAA general aviation crash dynamics program

    NASA Technical Reports Server (NTRS)

    Thomson, R. G.; Hayduk, R. J.; Carden, H. D.

    1981-01-01

    The program involves controlled full scale crash testing, nonlinear structural analyses to predict large deflection elastoplastic response, and load attenuating concepts for use in improved seat and subfloor structure. Both analytical and experimental methods are used to develop expertise in these areas. Analyses include simplified procedures for estimating energy dissipating capabilities and comprehensive computerized procedures for predicting airframe response. These analyses are developed to provide designers with methods for predicting accelerations, loads, and displacements on collapsing structure. Tests on typical full scale aircraft and on full and subscale structural components are performed to verify the analyses and to demonstrate load attenuating concepts. A special apparatus was built to test emergency locator transmitters when attached to representative aircraft structure. The apparatus is shown to provide a good simulation of the longitudinal crash pulse observed in full scale aircraft crash tests.

  10. Scaling up HIV viral load - lessons from the large-scale implementation of HIV early infant diagnosis and CD4 testing.

    PubMed

    Peter, Trevor; Zeh, Clement; Katz, Zachary; Elbireer, Ali; Alemayehu, Bereket; Vojnov, Lara; Costa, Alex; Doi, Naoko; Jani, Ilesh

    2017-11-01

The scale-up of effective HIV viral load (VL) testing is an urgent public health priority. Implementation of testing is supported by the availability of accurate, nucleic acid-based laboratory and point-of-care (POC) VL technologies and strong WHO guidance recommending routine testing to identify treatment failure. However, test implementation faces challenges related to the developing health systems in many low-resource countries. The purpose of this commentary is to review the challenges and solutions from the large-scale implementation of other diagnostic tests, namely nucleic acid-based early infant HIV diagnosis (EID) and CD4 testing, and to identify key lessons to inform the scale-up of VL. Experience with EID and CD4 testing provides many key lessons to inform VL implementation and may enable more effective and rapid scale-up. The primary lessons from earlier implementation efforts are to strengthen linkage to clinical care after testing and to improve the efficiency of testing. Opportunities to improve linkage include data systems to support the follow-up of patients through the cascade of care and test delivery, rapid sample referral networks, and POC tests. Opportunities to increase testing efficiency include improvements to procurement and supply chain practices, well-connected tiered laboratory networks with rational deployment of test capacity across different levels of health services, routine resource mapping and mobilization to ensure adequate resources for testing programs, and improved operational and quality management of testing services. If applied to VL testing programs, these approaches could help improve the impact of VL on ART failure management and patient outcomes, reduce overall costs, help ensure sustainable access to reduced pricing for test commodities, and improve supportive health system functions such as efficient and rigorous quality assurance.
These lessons draw from traditional laboratory practices as well as fields such as logistics, operations management, and business. The lessons and innovations from large-scale EID and CD4 programs described here can be adapted to inform more effective scale-up approaches for VL. They demonstrate the value of an integrated approach to health system strengthening that focuses on key levers for test access such as data systems, supply efficiencies, and network management. They also highlight the challenges of implementation and the need for more innovative approaches and effective partnerships to achieve equitable and cost-effective test access. © 2017 The Authors. Journal of the International AIDS Society published by John Wiley & Sons Ltd on behalf of the International AIDS Society.

  11. The brief multidimensional students' life satisfaction scale-college version.

    PubMed

    Zullig, Keith J; Huebner, E Scott; Patton, Jon M; Murray, Karen A

    2009-01-01

    To investigate the psychometric properties of the BMSLSS-College among 723 college students. Internal consistency estimates explored scale reliability, factor analysis explored construct validity, and known-groups validity was assessed using the National College Youth Risk Behavior Survey and Harvard School of Public Health College Alcohol Study. Criterion-related validity was explored through analyses with the CDC's health-related quality of life scale and a social isolation scale. Acceptable internal consistency reliability, construct, known-groups, and criterion-related validity were established. Findings offer preliminary support for the BMSLSS-C; it could be useful in large-scale research studies, applied screening contexts, and for program evaluation purposes toward achieving Healthy People 2010 objectives.
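Internal consistency of a summated scale like the BMSLSS-C is conventionally summarized with Cronbach's alpha. The sketch below computes alpha from first principles on made-up 7-point ratings (three items, five respondents), not actual BMSLSS data:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score columns (one list per item):
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(col) for col in items) / var(totals))

# Hypothetical 7-point satisfaction ratings: 3 items x 5 respondents.
items = [
    [7, 6, 2, 5, 4],
    [6, 7, 1, 5, 3],
    [7, 5, 2, 6, 4],
]
alpha = cronbach_alpha(items)
print(round(alpha, 3))  # -> 0.958, i.e. high internal consistency
```

Respondents who rate one item high tend to rate the others high too, so the total-score variance dwarfs the summed item variances and alpha approaches 1.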

  12. Assessing the effects of fire disturbances on ecosystems: A scientific agenda for research and management

    USGS Publications Warehouse

    Schmoldt, D.L.; Peterson, D.L.; Keane, R.E.; Lenihan, J.M.; McKenzie, D.; Weise, D.R.; Sandberg, D.V.

    1999-01-01

    A team of fire scientists and resource managers convened 17-19 April 1996 in Seattle, Washington, to assess the effects of fire disturbance on ecosystems. Objectives of this workshop were to develop scientific recommendations for future fire research and management activities. These recommendations included a series of numerically ranked scientific and managerial questions and responses focusing on (1) links among fire effects, fuels, and climate; (2) fire as a large-scale disturbance; (3) fire-effects modeling structures; and (4) managerial concerns, applications, and decision support. At the present time, understanding of fire effects and the ability to extrapolate fire-effects knowledge to large spatial scales are limited, because most data have been collected at small spatial scales for specific applications. Although we clearly need more large-scale fire-effects data, it will be more expedient to concentrate efforts on improving and linking existing models that simulate fire effects in a georeferenced format while integrating empirical data as they become available. A significant component of this effort should be improved communication between modelers and managers to develop modeling tools to use in a planning context. Another component of this modeling effort should improve our ability to predict the interactions of fire and potential climatic change at very large spatial scales. The priority issues and approaches described here provide a template for fire science and fire management programs in the next decade and beyond.

  13. What drives the formation of massive stars and clusters?

    NASA Astrophysics Data System (ADS)

    Ochsendorf, Bram; Meixner, Margaret; Roman-Duval, Julia; Evans, Neal J., II; Rahman, Mubdi; Zinnecker, Hans; Nayak, Omnarayani; Bally, John; Jones, Olivia C.; Indebetouw, Remy

    2018-01-01

Galaxy-wide surveys make it possible to study star formation in unprecedented ways. In this talk, I will discuss our analysis of the Large Magellanic Cloud (LMC) and the Milky Way, and illustrate how studying both the large- and small-scale structure of galaxies is critical in addressing the question: what drives the formation of massive stars and clusters? I will show that ‘turbulence-regulated’ star formation models do not reproduce the massive star formation properties of GMCs in the LMC and Milky Way: this suggests that theory currently does not capture the full complexity of star formation on small scales. I will also report on the discovery of a massive star-forming complex in the LMC, which in many ways manifests itself as an embedded twin of 30 Doradus: this may shed light on the formation of R136 and 'Super Star Clusters' in general. Finally, I will highlight what we can expect in the next years in the field of star formation with large-scale sky surveys, ALMA, and our JWST-GTO program.

  14. Effects on fish and wildlife of chemical treatments of large areas

    USGS Publications Warehouse

    George, J.L.

    1959-01-01

    Summary: The history of field investigations of the effects of DDT on wildlife is reviewed briefly, from the initial studies in 1945 through the more recent studies of the effects of the large-scale programs for spruce-budworm control and gypsy-moth eradication. DDT dosages and procedures that are recommended for protection of wildlife are reviewed. Effects of aldrin, heptachlor, and toxaphene are discussed in connection with the grasshopper and Mormon cricket control programs. Delayed and indirect effects of chemical treatments are emphasized as an important current problem. Cited in this connection are fish losses in the Yellowstone and Miramichi rivers and losses of wildlife from eating earthworms a year after treatment of the area with DDT. Currently recommended procedures to safeguard wildlife in pesticidal programs are listed.

  15. On the scaling features of high-latitude geomagnetic field fluctuations during a large geomagnetic storm

    NASA Astrophysics Data System (ADS)

    De Michelis, Paola; Federica Marcucci, Maria; Consolini, Giuseppe

    2015-04-01

Recently we have investigated the spatial distribution of the scaling features of short-time-scale magnetic field fluctuations using measurements from several ground-based geomagnetic observatories distributed in the northern hemisphere. We have found that the scaling features of fluctuations of the horizontal magnetic field component at time scales below 100 minutes are correlated with the geomagnetic activity level and with changes in the currents flowing in the ionosphere. Here, we present a detailed analysis of the dynamical changes of the magnetic field scaling features as a function of the geomagnetic activity level during the well-known large geomagnetic storm that occurred on July 15, 2000 (the Bastille event). The observed dynamical changes are discussed in relation to the changes of the overall ionospheric polar convection and potential structure as reconstructed using SuperDARN data. This work is supported by the Italian National Program for Antarctic Research (PNRA) - Research Project 2013/AC3.08 and by the European Community's Seventh Framework Programme (FP7/2007-2013) under Grant no. 313038/STORM.

  16. WIC's promotion of infant formula in the United States

    PubMed Central

    Kent, George

    2006-01-01

    Background The United States' Special Supplemental Nutrition Program for Women, Infants and Children (WIC) distributes about half the infant formula used in the United States at no cost to the families. This is a matter of concern because it is known that feeding with infant formula results in worse health outcomes for infants than breastfeeding. Discussion The evidence that is available indicates that the WIC program has the effect of promoting the use of infant formula, thus placing infants at higher risk. Moreover, the program violates the widely accepted principles that have been set out in the International Code of Marketing of Breast-milk Substitutes and in the human right to adequate food. Summary There is no good reason for an agency of government to distribute large quantities of free infant formula. It is recommended that the large-scale distribution of free infant formula by the WIC program should be phased out. PMID:16722534

  17. A large-scale cluster randomized trial to determine the effects of community-based dietary sodium reduction--the China Rural Health Initiative Sodium Reduction Study.

    PubMed

    Li, Nicole; Yan, Lijing L; Niu, Wenyi; Labarthe, Darwin; Feng, Xiangxian; Shi, Jingpu; Zhang, Jianxin; Zhang, Ruijuan; Zhang, Yuhong; Chu, Hongling; Neiman, Andrea; Engelgau, Michael; Elliott, Paul; Wu, Yangfeng; Neal, Bruce

    2013-11-01

Cardiovascular diseases are the leading cause of death and disability in China. High blood pressure caused by excess intake of dietary sodium is widespread, and an effective sodium reduction program has the potential to improve cardiovascular health. This study is a large-scale, cluster-randomized trial conducted in five northern Chinese provinces. Two counties have been selected from each province and 12 townships in each county, making a total of 120 clusters. Within each township one village has been selected for participation, with 1:1 randomization stratified by county. The sodium reduction intervention comprises community health education and a food supply strategy based upon providing access to salt substitute. The price of salt substitute was subsidized in 30 intervention villages selected at random. Control villages continued usual practices. The primary outcome for the study is dietary sodium intake estimated from assays of 24-hour urine. The trial recruited and randomized 120 townships in April 2011. The sodium reduction program was commenced in the 60 intervention villages between May and June of that year, with outcome surveys scheduled for October to December 2012. Baseline data collection shows that randomisation achieved good balance across groups. The establishment of the China Rural Health Initiative has enabled the launch of this large-scale trial designed to identify a novel, scalable strategy for reduction of dietary sodium and control of blood pressure. If proved effective, the intervention could plausibly be implemented at low cost in large parts of China and other countries worldwide. © 2013.

  18. Large-scale mapping and predictive modeling of submerged aquatic vegetation in a shallow eutrophic lake.

    PubMed

    Havens, Karl E; Harwell, Matthew C; Brady, Mark A; Sharfstein, Bruce; East, Therese L; Rodusky, Andrew J; Anson, Daniel; Maki, Ryan P

    2002-04-09

    A spatially intensive sampling program was developed for mapping the submerged aquatic vegetation (SAV) over an area of approximately 20,000 ha in a large, shallow lake in Florida, U.S. The sampling program integrates Geographic Information System (GIS) technology with traditional field sampling of SAV and has the capability of producing robust vegetation maps under a wide range of conditions, including high turbidity, variable depth (0 to 2 m), and variable sediment types. Based on sampling carried out in August-September 2000, we measured 1,050 to 4,300 ha of vascular SAV species and approximately 14,000 ha of the macroalga Chara spp. The results were similar to those reported in the early 1990s, when the last large-scale SAV sampling occurred. Occurrence of Chara was strongly associated with peat sediments, and maximal depths of occurrence varied between sediment types (mud, sand, rock, and peat). A simple model of Chara occurrence, based only on water depth, had an accuracy of 55%. It predicted occurrence of Chara over large areas where the plant actually was not found. A model based on sediment type and depth had an accuracy of 75% and produced a spatial map very similar to that based on observations. While this approach needs to be validated with independent data in order to test its general utility, we believe it may have application elsewhere. The simple modeling approach could serve as a coarse-scale tool for evaluating effects of water level management on Chara populations.
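The modeling comparison in the abstract (a depth-only rule versus a depth-plus-sediment rule) can be sketched with invented data. The station depths, sediment types, presence labels, and depth cutoffs below are all hypothetical, chosen only to show how conditioning the depth threshold on sediment type can raise map accuracy:

```python
# Toy grid of sampling stations: (depth_m, sediment, chara_present)
stations = [
    (0.3, "peat", True), (0.8, "peat", True), (1.1, "peat", True), (1.4, "peat", False),
    (0.4, "mud", True), (0.8, "mud", False), (1.1, "mud", False), (0.5, "mud", False),
    (0.5, "sand", True), (1.0, "sand", False),
    (0.3, "rock", False), (0.9, "rock", False),
]

def depth_only(depth, sediment):
    # predict Chara wherever water is shallower than 1.2 m (sediment ignored)
    return depth < 1.2

def depth_and_sediment(depth, sediment):
    # hypothetical sediment-specific maximum depths of occurrence
    max_depth = {"peat": 1.2, "mud": 0.6, "sand": 0.9, "rock": 0.4}
    return depth < max_depth[sediment]

def accuracy(model):
    hits = sum(model(d, s) == present for d, s, present in stations)
    return hits / len(stations)

print(accuracy(depth_only), accuracy(depth_and_sediment))  # 0.5 vs ~0.83
```

As in the lake study, the depth-only rule over-predicts occurrence on unfavorable sediments, while letting the depth cutoff vary by sediment type recovers most of the lost accuracy.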

  19. Status of EPA's (Environmental Protection Agency's) LIMB (Limestone Injection Multistage Burner) demonstration program at Ohio Edison's Edgewater Unit 4. Report for September-December 1986

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hendriks, R.V.; Nolan, P.S.

    1987-01-01

    The paper describes and discusses the key design features of the retrofit of EPA's Limestone Injection Multistage Burner (LIMB) system to an operating, wall-fired utility boiler at Ohio Edison's Edgewater Station. It further describes results of the pertinent projects in EPA's LIMB program and shows how these results were used as the basis for the design of the system. The full-scale demonstration is expected to prove the effectiveness and cost of the LIMB concept for use on large-scale utility boilers. The equipment is now being installed at Edgewater, with system start-up scheduled for May 1987.

  20. Controlling Guessing Bias in the Dichotomous Rasch Model Applied to a Large-Scale, Vertically Scaled Testing Program

    PubMed Central

    Andrich, David; Marais, Ida; Humphry, Stephen Mark

    2015-01-01

    Recent research has shown how the statistical bias in Rasch model difficulty estimates induced by guessing in multiple-choice items can be eliminated. Using vertical scaling of a high-profile national reading test, it is shown that the dominant effect of removing such bias is a nonlinear change in the unit of scale across the continuum. The consequence is that the proficiencies of the more proficient students are increased relative to those of the less proficient. Not controlling the guessing bias underestimates the progress of students across 7 years of schooling with important educational implications. PMID:29795871
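For context, the dichotomous Rasch model gives the probability of a correct response as a logistic function of proficiency minus difficulty; random guessing on multiple-choice items adds a lower asymptote that the pure Rasch model does not represent, which is the source of the bias discussed above. A minimal sketch (the parameter values are hypothetical):

```python
import math

def rasch(theta, delta):
    """Dichotomous Rasch model: P(correct) for proficiency theta, difficulty delta."""
    return 1.0 / (1.0 + math.exp(-(theta - delta)))

def rasch_with_guessing(theta, delta, c):
    """Lower asymptote c (3PL-style): even very low-proficiency students
    succeed with probability >= c by guessing, e.g. c = 0.25 for 4 options."""
    return c + (1.0 - c) * rasch(theta, delta)

# A hard item (delta = 2) seen by a weak student (theta = -1):
p_model = rasch(-1.0, 2.0)                        # what the Rasch model expects
p_actual = rasch_with_guessing(-1.0, 2.0, 0.25)   # with random guessing
print(round(p_model, 3), round(p_actual, 3))      # -> 0.047 0.286
```

Fitting Rasch difficulties to responses generated with the lower asymptote makes hard items look easier than they are for weak students, biasing the estimated scale in the way the article describes.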

  1. Aircraft data summaries for the SURE intensives. Final report. [Sampling done July 1978 near Duncan Falls, Ohio and Scranton, Pennsylvania

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keifer, W.S.; Blumenthal, D.L.; Tommerdahl, J.B.

    1981-09-01

As part of the EPRI sulfate regional experiment (SURE), Meteorology Research, Inc. (MRI) and Research Triangle Institute (RTI) conducted six air quality sampling programs in the eastern United States using instrumented aircraft. This volume includes the air quality and meteorological data obtained during the July 1978 Intensive, when MRI sampled near the Duncan Falls, Ohio, SURE Station and RTI sampled near the Scranton, Pennsylvania, SURE Station. During the last part of the July 1978 sampling period, both MRI and RTI aircraft participated in a large regional-scale sampling program with Brookhaven National Laboratory (BNL) and Pacific Northwest Laboratory (PNL). Only the data obtained by the MRI and RTI aircraft during this regional-scale sampling program are included in this volume.

  2. OVERVIEW OF US NATIONAL LAND-COVER MAPPING PROGRAM

    EPA Science Inventory

Because of escalating costs amid growing needs for large-scale, satellite-based landscape information, a group of US federal agencies agreed to pool resources and operate as a consortium to acquire the data necessary for land-cover mapping of the nation. The consortium was initiated...

  3. Development of a Catalytic Combustor for Aircraft Gas Turbine Engines.

    DTIC Science & Technology

    1976-09-22

From the report's table of contents: VI. Design of 7.6 cm diameter combustors; design and fabrication of combustors for large-scale testing. Configurations obtained for this program included round holes of different diameters, squares, rectangles, triangles, and other more complex hollow configurations.

  4. Harnessing quantitative genetics and genomics for understanding and improving complex traits in crops

    USDA-ARS?s Scientific Manuscript database

Classical quantitative genetics aids crop improvement by providing the means to estimate heritability, genetic correlations, and predicted responses to various selection schemes. Genomics has the potential to aid quantitative genetics and applied crop improvement programs via large-scale, high-throughput...

  5. Laboratory development and testing of spacecraft diagnostics

    NASA Astrophysics Data System (ADS)

    Amatucci, William; Tejero, Erik; Blackwell, Dave; Walker, Dave; Gatling, George; Enloe, Lon; Gillman, Eric

    2017-10-01

    The Naval Research Laboratory's Space Chamber experiment is a large-scale laboratory device dedicated to the creation of large-volume plasmas with parameters scaled to realistic space plasmas. Such devices make valuable contributions to the investigation of space plasma phenomena under controlled, reproducible conditions, allowing for the validation of theoretical models being applied to space data. However, in addition to investigations such as plasma wave and instability studies, such devices can also make valuable contributions to the development and testing of space plasma diagnostics. One example is the plasma impedance probe developed at NRL. Originally developed as a laboratory diagnostic, the sensor has now been flown on a sounding rocket, is included on a CubeSat experiment, and will be included on the DoD Space Test Program's STP-H6 experiment on the International Space Station. In this talk, we will describe how the laboratory simulation of space plasmas made this development path possible. Work sponsored by the US Naval Research Laboratory Base Program.

  6. Comparisons of benthic filter feeder communities before and after a large-scale capital dredging program.

    PubMed

    Abdul Wahab, Muhammad Azmi; Fromont, Jane; Gomez, Oliver; Fisher, Rebecca; Jones, Ross

    2017-09-15

Changes in turbidity, sedimentation and light over a two-year large-scale capital dredging program at Onslow, northwestern Australia, were quantified to assess their effects on filter feeder communities, in particular sponges. Community functional morphological composition was quantified using towed video surveys, while dive surveys allowed for assessments of species composition and chlorophyll content. Onslow is relatively diverse, recording 150 sponge species. The area was naturally turbid (mean P80 of 1.1 NTU), with inshore sites recording 6.5× higher turbidity than offshore localities, likely influenced by the Ashburton River discharge. Turbidity and sedimentation increased by up to 146% and 240% through dredging, respectively, with corresponding decreases in light levels. The effects of dredging were variable, and despite existing caveats (i.e. a bleaching event and the passing of a cyclone), the persistence of sponges and the absence of a pronounced response post-dredging suggest that environmental filtering or passive adaptation acquired pre-dredging may have benefited these communities. Copyright © 2017. Published by Elsevier Ltd.

  7. In vivo production of RNA nanostructures via programmed folding of single-stranded RNAs.

    PubMed

    Li, Mo; Zheng, Mengxi; Wu, Siyu; Tian, Cheng; Liu, Di; Weizmann, Yossi; Jiang, Wen; Wang, Guansong; Mao, Chengde

    2018-06-06

    Programmed self-assembly of nucleic acids is a powerful approach for nano-constructions. The assembled nanostructures have been explored for various applications. However, nucleic acid assembly often requires chemical or in vitro enzymatical synthesis of DNA or RNA, which is not a cost-effective production method on a large scale. In addition, the difficulty of cellular delivery limits the in vivo applications. Herein we report a strategy that mimics protein production. Gene-encoded DNA duplexes are transcribed into single-stranded RNAs, which self-fold into well-defined RNA nanostructures in the same way as polypeptide chains fold into proteins. The resulting nanostructure contains only one component RNA molecule. This approach allows both in vitro and in vivo production of RNA nanostructures. In vivo synthesized RNA strands can fold into designed nanostructures inside cells. This work not only suggests a way to synthesize RNA nanostructures on a large scale and at a low cost but also facilitates the in vivo applications.

  8. Lessons learned from LNG safety research.

    PubMed

    Koopman, Ronald P; Ermak, Donald L

    2007-02-20

    During the period from 1977 to 1989, the Lawrence Livermore National Laboratory (LLNL) conducted a liquefied gaseous fuels spill effects program under the sponsorship of the US Department of Energy, Department of Transportation, Gas Research Institute and others. The goal of this program was to develop and validate tools that could be used to predict the effects of a large liquefied gas spill through the execution of large scale field experiments and the development of computer models to make predictions for conditions under which tests could not be performed. Over the course of the program, three series of LNG spill experiments were performed to study cloud formation, dispersion, combustion and rapid phase transition (RPT) explosions. The purpose of this paper is to provide an overview of this program, the lessons learned from 12 years of research as well as some recommendations for the future. The general conclusion from this program is that cold, dense gas related phenomena can dominate the dispersion of a large volume, high release rate spill of LNG especially under low ambient wind speed and stable atmospheric conditions, and therefore, it is necessary to include a detailed and validated description of these phenomena in computer models to adequately predict the consequences of a release. Specific conclusions include: * LNG vapor clouds are lower and wider than trace gas clouds and tend to follow the downhill slope of terrain due to dampened vertical turbulence and gravity flow within the cloud. Under low wind speed, stable atmospheric conditions, a bifurcated, two lobed structure develops. * Navier-Stokes models provide the most complete description of LNG dispersion, while more highly parameterized Lagrangian models were found to be well suited to emergency response applications. * The measured heat flux from LNG vapor cloud burns exceeded levels necessary for third degree burns and were large enough to ignite most flammable materials. 
* RPTs are of two types, source generated and enrichment generated, and were observed to increase the burn area by a factor of two and to extend the downwind burn distance by 65%. Additional large scale experiments and model development are recommended.

  9. Managing Large Scale Project Analysis Teams through a Web Accessible Database

    NASA Technical Reports Server (NTRS)

    O'Neil, Daniel A.

    2008-01-01

Large-scale space programs analyze thousands of requirements while mitigating safety, performance, schedule, and cost risks. These efforts involve a variety of roles with interdependent use cases and goals. For example, study managers and facilitators identify ground rules and assumptions for a collection of studies required for a program or project milestone. Task leaders derive product requirements from the ground rules and assumptions and describe activities to produce needed analytical products. Discipline specialists produce the specified products and load results into a file management system. Organizational and project managers provide the personnel and funds to conduct the tasks. Each role has responsibilities to establish information linkages and provide status reports to management. Projects conduct design and analysis cycles to refine designs to meet the requirements and implement risk mitigation plans. At the program level, integrated design and analysis cycle studies are conducted to eliminate every 'to-be-determined' and develop plans to mitigate every risk. At the agency level, strategic studies analyze different approaches to exploration architectures and campaigns. This paper describes a web-accessible database developed by NASA to coordinate and manage tasks at three organizational levels. Other topics in this paper cover integration technologies and techniques for process modeling and enterprise architectures.

  10. Structural design using equilibrium programming formulations

    NASA Technical Reports Server (NTRS)

    Scotti, Stephen J.

    1995-01-01

    Solutions to increasingly larger structural optimization problems are desired. However, computational resources are strained to meet this need. New methods will be required to solve increasingly larger problems. The present approaches to solving large-scale problems involve approximations for the constraints of structural optimization problems and/or decomposition of the problem into multiple subproblems that can be solved in parallel. An area of game theory, equilibrium programming (also known as noncooperative game theory), can be used to unify these existing approaches from a theoretical point of view (considering the existence and optimality of solutions), and be used as a framework for the development of new methods for solving large-scale optimization problems. Equilibrium programming theory is described, and existing design techniques such as fully stressed design and constraint approximations are shown to fit within its framework. Two new structural design formulations are also derived. The first new formulation is another approximation technique which is a general updating scheme for the sensitivity derivatives of design constraints. The second new formulation uses a substructure-based decomposition of the structure for analysis and sensitivity calculations. Significant computational benefits of the new formulations compared with a conventional method are demonstrated.
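Fully stressed design, cited above as a technique that fits the equilibrium programming framework, is easy to state as a recurrence: resize each member so its stress reaches the allowable, A_new = A_old * |sigma| / sigma_allow. The sketch below applies it to a hypothetical statically determinate truss, where the member forces do not depend on the areas, so the iteration converges immediately; all numbers are invented:

```python
def fully_stressed_design(forces, areas, sigma_allow, iters=10):
    """Fully stressed design recurrence: resize each member so its stress
    hits the allowable, A_new = A_old * |sigma| / sigma_allow.
    For a statically determinate truss the member forces are fixed,
    so the iteration converges in a single step."""
    for _ in range(iters):
        stresses = [f / a for f, a in zip(forces, areas)]
        areas = [a * abs(s) / sigma_allow for a, s in zip(areas, stresses)]
    return areas

forces = [12000.0, -8000.0, 5000.0]  # member axial forces, N (hypothetical)
areas0 = [1.0, 1.0, 1.0]             # initial cross-sections, cm^2 (arbitrary)
sigma_allow = 2000.0                 # allowable stress, N/cm^2
areas = fully_stressed_design(forces, areas0, sigma_allow)
print([round(a, 2) for a in areas])  # -> [6.0, 4.0, 2.5]
```

In an indeterminate structure the forces redistribute as the areas change, the fixed point is no longer reached in one step, and it need not coincide with the minimum-weight design, which is exactly the kind of interacting subproblem the equilibrium programming view formalizes.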

  11. Effects of rotation on coolant passage heat transfer. Volume 2: Coolant passages with trips normal and skewed to the flow

    NASA Technical Reports Server (NTRS)

    Johnson, B. V.; Wagner, J. H.; Steuber, G. D.

    1993-01-01

    An experimental program was conducted to investigate heat transfer and pressure loss characteristics of rotating multipass passages, for configurations and dimensions typical of modern turbine blades. This experimental program is one part of the NASA Hot Section Technology (HOST) Initiative, which has as its overall objective the development and verification of improved analysis methods that will form the basis for a design system that will produce turbine components with improved durability. The objective of this program was the generation of a data base of heat transfer and pressure loss data required to develop heat transfer correlations and to assess computational fluid dynamic techniques for rotating coolant passages. The experimental work was broken down into two phases. Phase 1 consisted of experiments conducted in a smooth-wall large-scale heat transfer model. A detailed discussion of these results was presented in Volume 1 of a NASA Report. In Phase 2 the large-scale model was modified to investigate the effects of skewed and normal passage turbulators. The results of Phase 2, along with comparisons to Phase 1, are the subject of this Volume 2 NASA Report.

  12. A Scalable Analysis Toolkit

    NASA Technical Reports Server (NTRS)

    Aiken, Alexander

    2001-01-01

    The Scalable Analysis Toolkit (SAT) project aimed to demonstrate that it is feasible and useful to statically detect software bugs in very large systems. The technical focus of the project was on a relatively new class of constraint-based techniques for software analysis, where the desired facts about programs (e.g., the presence of a particular bug) are phrased as constraint problems to be solved. At the beginning of this project, the most successful forms of formal software analysis were limited forms of automatic theorem proving (as exemplified by the analyses used in language type systems and optimizing compilers), semi-automatic theorem proving for full verification, and model checking. With a few notable exceptions, these approaches had not been demonstrated to scale to software systems of even 50,000 lines of code. Realistic approaches to large-scale software analysis cannot hope to make every conceivable formal method scale. Thus, the SAT approach is to mix different methods in one application by using coarse and fast but still adequate methods at the largest scales, and reserving the use of more precise but also more expensive methods at smaller scales for critical aspects (that is, aspects critical to the analysis problem under consideration) of a software system. The principled method proposed for combining a heterogeneous collection of formal systems with different scalability characteristics is mixed constraints. This idea had been used previously in small-scale applications with encouraging results: using mostly coarse methods and narrowly targeted precise methods, useful information (meaning the discovery of bugs in real programs) was obtained with excellent scalability.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Fallon, J.

    The author looks in broad perspective at funding for high energy physics programs over the period from the 1960s to today, examining the picture nationally and internationally and then giving more detailed information for different laboratories and programs. In general, funding peaked in the 1960s and has been declining since then. This decline holds not only in adjusted dollars but also in the extent to which programs are funded on realistic time scales that allow them to come to rapid completion.

  14. An Analysis of Enlisted Early Separations Under the Air Force’s and Navy’s VSI/SSB Programs: A Comparative Study

    DTIC Science & Technology

    1994-03-01

    were contained in the 1992 National Defense Authorization Act as an important policy tool of DoD's force reduction strategy. As a result of the FY92...National Defense Authorization Act, the Department of Defense faced a large-scale personnel strength reduction. The bonus programs were implemented in an...had many concerns. One, of course, was how successful the incentive programs would be in inducing separations. Given the high

  15. A Block-LU Update for Large-Scale Linear Programming

    DTIC Science & Technology

    1990-01-01

    linear programming problems. Results are given from runs on the Cray Y-MP. 1. Introduction We wish to use the simplex method [Dan63] to solve the...standard linear program: minimize cᵀx subject to Ax = b, l ≤ x ≤ u, where A is an m by n matrix and c, x, l, u, and b are of appropriate dimension. The simplex...the identity matrix. The basis is used to solve for the search direction y and the dual variables π in the following linear systems: B_k y = a_q (1.2) and
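    The two basis solves quoted in the snippet can be illustrated numerically. The following is a minimal sketch with an invented 2×2 basis B_k and entering column a_q (the dual system Bᵀπ = c_B is the standard simplex companion solve); it uses dense solves rather than the block-LU update the report develops.

    ```python
    import numpy as np

    # Invented toy data, not from the report.
    B = np.array([[2.0, 1.0],
                  [0.0, 1.0]])   # current basis matrix B_k
    a_q = np.array([1.0, 1.0])   # entering column a_q
    c_B = np.array([3.0, 1.0])   # costs of the basic variables

    # Search direction: solve B_k y = a_q  (equation (1.2) in the snippet).
    y = np.linalg.solve(B, a_q)

    # Dual variables: solve B_k^T pi = c_B.
    pi = np.linalg.solve(B.T, c_B)
    ```

    In a production simplex code these two solves reuse one factorization of B_k, which is exactly where a block-LU update pays off: the factorization is revised cheaply as columns enter and leave the basis instead of being recomputed from scratch.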

  16. Luminosity measurements for the R scan experiment at BESIII

    NASA Astrophysics Data System (ADS)

    Ablikim, M.; Achasov, M. N.; Ahmed, S.; Ai, X. C.; Albayrak, O.; Albrecht, M.; Ambrose, D. J.; Amoroso, A.; An, F. F.; An, Q.; Bai, J. Z.; Bakina, O.; Baldini Ferroli, R.; Ban, Y.; Bennett, D. W.; Bennett, J. V.; Berger, N.; Bertani, M.; Bettoni, D.; Bian, J. M.; Bianchi, F.; Boger, E.; Boyko, I.; Briere, R. A.; Cai, H.; Cai, X.; Cakir, O.; Calcaterra, A.; Cao, G. F.; Cetin, S. A.; Chai, J.; Chang, J. F.; Chelkov, G.; Chen, G.; Chen, H. S.; Chen, J. C.; Chen, M. L.; Chen, S.; Chen, S. J.; Chen, X.; Chen, X. R.; Chen, Y. B.; Chu, X. K.; Cibinetto, G.; Dai, H. L.; Dai, J. P.; Dbeyssi, A.; Dedovich, D.; Deng, Z. Y.; Denig, A.; Denysenko, I.; Destefanis, M.; De Mori, F.; Ding, Y.; Dong, C.; Dong, J.; Dong, L. Y.; Dong, M. Y.; Dou, Z. L.; Du, S. X.; Duan, P. F.; Fan, J. Z.; Fang, J.; Fang, S. S.; Fang, X.; Fang, Y.; Farinelli, R.; Fava, L.; Feldbauer, F.; Felici, G.; Feng, C. Q.; Fioravanti, E.; Fritsch, M.; Fu, C. D.; Gao, Q.; Gao, X. L.; Gao, Y.; Gao, Z.; Garzia, I.; Goetzen, K.; Gong, L.; Gong, W. X.; Gradl, W.; Greco, M.; Gu, M. H.; Gu, Y. T.; Guan, Y. H.; Guo, A. Q.; Guo, L. B.; Guo, R. P.; Guo, Y.; Guo, Y. P.; Haddadi, Z.; Hafner, A.; Han, S.; Hao, X. Q.; Harris, F. A.; He, K. L.; Heinsius, F. H.; Held, T.; Heng, Y. K.; Holtmann, T.; Hou, Z. L.; Hu, C.; Hu, H. M.; Hu, J. F.; Hu, T.; Hu, Y.; Huang, G. S.; Huang, J. S.; Huang, X. T.; Huang, X. Z.; Huang, Z. L.; Hussain, T.; Ikegami Andersson, W.; Ji, Q.; Ji, Q. P.; Ji, X. B.; Ji, X. L.; Jiang, L. W.; Jiang, X. S.; Jiang, X. Y.; Jiao, J. B.; Jiao, Z.; Jin, D. P.; Jin, S.; Johansson, T.; Julin, A.; Kalantar-Nayestanaki, N.; Kang, X. L.; Kang, X. S.; Kavatsyuk, M.; Ke, B. C.; Kiese, P.; Kliemt, R.; Kloss, B.; Kolcu, O. B.; Kopf, B.; Kornicer, M.; Kupsc, A.; Kühn, W.; Lange, J. S.; Lara, M.; Larin, P.; Leithoff, H.; Leng, C.; Li, C.; Li, Cheng; Li, D. M.; Li, F.; Li, F. Y.; Li, G.; Li, H. B.; Li, H. J.; Li, J. C.; Li, Jin; Li, K.; Li, K.; Li, Lei; Li, P. R.; Li, Q. Y.; Li, T.; Li, W. D.; Li, W. G.; Li, X. L.; Li, X. 
N.; Li, X. Q.; Li, Y. B.; Li, Z. B.; Liang, H.; Liang, Y. F.; Liang, Y. T.; Liao, G. R.; Lin, D. X.; Liu, B.; Liu, B. J.; Liu, C. X.; Liu, D.; Liu, F. H.; Liu, Fang; Liu, Feng; Liu, H. B.; Liu, H. H.; Liu, H. H.; Liu, H. M.; Liu, J.; Liu, J. B.; Liu, J. P.; Liu, J. Y.; Liu, K.; Liu, K. Y.; Liu, L. D.; Liu, P. L.; Liu, Q.; Liu, S. B.; Liu, X.; Liu, Y. B.; Liu, Y. Y.; Liu, Z. A.; Liu, Zhiqing; Loehner, H.; Lou, X. C.; Lu, H. J.; Lu, J. G.; Lu, Y.; Lu, Y. P.; Luo, C. L.; Luo, M. X.; Luo, T.; Luo, X. L.; Lyu, X. R.; Ma, F. C.; Ma, H. L.; Ma, L. L.; Ma, M. M.; Ma, Q. M.; Ma, T.; Ma, X. N.; Ma, X. Y.; Ma, Y. M.; Maas, F. E.; Maggiora, M.; Malik, Q. A.; Mao, Y. J.; Mao, Z. P.; Marcello, S.; Messchendorp, J. G.; Mezzadri, G.; Min, J.; Min, T. J.; Mitchell, R. E.; Mo, X. H.; Mo, Y. J.; Morales Morales, C.; Muchnoi, N. Yu.; Muramatsu, H.; Musiol, P.; Nefedov, Y.; Nerling, F.; Nikolaev, I. B.; Ning, Z.; Nisar, S.; Niu, S. L.; Niu, X. Y.; Olsen, S. L.; Ouyang, Q.; Pacetti, S.; Pan, Y.; Patteri, P.; Pelizaeus, M.; Peng, H. P.; Peters, K.; Pettersson, J.; Ping, J. L.; Ping, R. G.; Poling, R.; Prasad, V.; Qi, H. R.; Qi, M.; Qian, S.; Qiao, C. F.; Qin, L. Q.; Qin, N.; Qin, X. S.; Qin, Z. H.; Qiu, J. F.; Rashid, K. H.; Redmer, C. F.; Ripka, M.; Rong, G.; Rosner, Ch.; Ruan, X. D.; Sarantsev, A.; Savrié, M.; Schnier, C.; Schoenning, K.; Shan, W.; Shao, M.; Shen, C. P.; Shen, P. X.; Shen, X. Y.; Sheng, H. Y.; Song, W. M.; Song, X. Y.; Sosio, S.; Spataro, S.; Sun, G. X.; Sun, J. F.; Sun, S. S.; Sun, X. H.; Sun, Y. J.; Sun, Y. Z.; Sun, Z. J.; Sun, Z. T.; Tang, C. J.; Tang, X.; Tapan, I.; Thorndike, E. H.; Tiemens, M.; Uman, I.; Varner, G. S.; Wang, B.; Wang, B. L.; Wang, D.; Wang, D. Y.; Wang, K.; Wang, L. L.; Wang, L. S.; Wang, M.; Wang, P.; Wang, P. L.; Wang, W.; Wang, W. P.; Wang, X. F.; Wang, Y.; Wang, Y. D.; Wang, Y. F.; Wang, Y. Q.; Wang, Z.; Wang, Z. G.; Wang, Z. H.; Wang, Z. Y.; Wang, Z. Y.; Weber, T.; Wei, D. H.; Weidenkaff, P.; Wen, S. P.; Wiedner, U.; Wolke, M.; Wu, L. 
H.; Wu, L. J.; Wu, Z.; Xia, L.; Xia, L. G.; Xia, Y.; Xiao, D.; Xiao, H.; Xiao, Z. J.; Xie, Y. G.; Xie, Y. H.; Xiu, Q. L.; Xu, G. F.; Xu, J. J.; Xu, L.; Xu, Q. J.; Xu, Q. N.; Xu, X. P.; Yan, L.; Yan, W. B.; Yan, W. C.; Yan, Y. H.; Yang, H. J.; Yang, H. X.; Yang, L.; Yang, Y. X.; Ye, M.; Ye, M. H.; Yin, J. H.; You, Z. Y.; Yu, B. X.; Yu, C. X.; Yu, J. S.; Yuan, C. Z.; Yuan, Y.; Yuncu, A.; Zafar, A. A.; Zeng, Y.; Zeng, Z.; Zhang, B. X.; Zhang, B. Y.; Zhang, C. C.; Zhang, D. H.; Zhang, H. H.; Zhang, H. Y.; Zhang, J.; Zhang, J. J.; Zhang, J. L.; Zhang, J. Q.; Zhang, J. W.; Zhang, J. Y.; Zhang, J. Z.; Zhang, K.; Zhang, L.; Zhang, S. Q.; Zhang, X. Y.; Zhang, Y.; Zhang, Y.; Zhang, Y. H.; Zhang, Y. N.; Zhang, Y. T.; Zhang, Yu; Zhang, Z. H.; Zhang, Z. P.; Zhang, Z. Y.; Zhao, G.; Zhao, J. W.; Zhao, J. Y.; Zhao, J. Z.; Zhao, Lei; Zhao, Ling; Zhao, M. G.; Zhao, Q.; Zhao, Q. W.; Zhao, S. J.; Zhao, T. C.; Zhao, Y. B.; Zhao, Z. G.; Zhemchugov, A.; Zheng, B.; Zheng, J. P.; Zheng, W. J.; Zheng, Y. H.; Zhong, B.; Zhou, L.; Zhou, X.; Zhou, X. K.; Zhou, X. R.; Zhou, X. Y.; Zhu, K.; Zhu, K. J.; Zhu, S.; Zhu, S. H.; Zhu, X. L.; Zhu, Y. C.; Zhu, Y. S.; Zhu, Z. A.; Zhuang, J.; Zotti, L.; Zou, B. S.; Zou, J. H.; BESIII Collaboration

    2017-06-01

    By analyzing the large-angle Bhabha scattering events e+e- → (γ)e+e- and diphoton events e+e- → (γ)γγ for the data sets collected at center-of-mass (c.m.) energies between 2.2324 and 4.5900 GeV (131 energy points in total) with the upgraded Beijing Spectrometer (BESIII) at the Beijing Electron-Positron Collider (BEPCII), the integrated luminosities have been measured individually at the different c.m. energies. The results are important inputs for the R value and J/ψ resonance parameter measurements. Supported by National Key Basic Research Program of China (2015CB856700), National Natural Science Foundation of China (NSFC) (10935007, 11121092, 11125525, 11235011, 11322544, 11335008, 11375170, 11275189, 11079030, 11475164, 11475169, 11005109, 10979095, 11275211), Chinese Academy of Sciences (CAS) Large-Scale Scientific Facility Program, Joint Large-Scale Scientific Facility Funds of the NSFC and CAS (11179007, U1232201, U1332201, U1532102), CAS (KJCX2-YW-N29, KJCX2-YW-N45), 100 Talents Program of CAS, INPAC and Shanghai Key Laboratory for Particle Physics and Cosmology, German Research Foundation DFG (Collaborative Research Center CRC-1044), Istituto Nazionale di Fisica Nucleare, Italy, Ministry of Development of Turkey (DPT2006K-120470), Russian Foundation for Basic Research (14-07-91152), U.S. Department of Energy (DE-FG02-04ER41291, DE-FG02-05ER41374, DE-FG02-94ER40823, DESC0010118), U.S. National Science Foundation, University of Groningen (RuG) and the Helmholtzzentrum fuer Schwerionenforschung GmbH (GSI), Darmstadt, and WCU Program of National Research Foundation of Korea (R32-2008-000-10155-0)

  17. Precision measurement of the integrated luminosity of the data taken by BESIII at center-of-mass energies between 3.810 GeV and 4.600 GeV

    NASA Astrophysics Data System (ADS)

    Ablikim, M.; N. Achasov, M.; Ai, X. C.; Albayrak, O.; Albrecht, M.; J. Ambrose, D.; Amoroso, A.; An, F. F.; An, Q.; Bai, J. Z.; R. Baldini, Ferroli; Ban, Y.; W. Bennett, D.; V. Bennett, J.; Bertani, M.; Bettoni, D.; Bian, J. M.; Bianchi, F.; Boger, E.; Bondarenko, O.; Boyko, I.; A. Briere, R.; Cai, H.; Cai, X.; Cakir, O.; Calcaterra, A.; Cao, G. F.; A. Cetin, S.; Chang, J. F.; Chelkov, G.; Chen, G.; Chen, H. S.; Chen, H. Y.; Chen, J. C.; Chen, M. L.; Chen, S. J.; Chen, X.; Chen, X. R.; Chen, Y. B.; Cheng, H. P.; Chu, X. K.; Cibinetto, G.; Cronin-Hennessy, D.; Dai, H. L.; Dai, J. P.; Dbeyssi, A.; Dedovich, D.; Deng, Z. Y.; Denig, A.; Denysenko, I.; Destefanis, M.; F. De, Mori; Ding, Y.; Dong, C.; Dong, J.; Dong, L. Y.; Dong, M. Y.; Du, S. X.; Duan, P. F.; Fan, J. Z.; Fang, J.; Fang, S. S.; Fang, X.; Fang, Y.; Fava, L.; Feldbauer, F.; Felici, G.; Feng, C. Q.; Fioravanti, E.; Fritsch, M.; Fu, C. D.; Gao, Q.; Gao, Y.; Gao, Z.; Garzia, I.; Geng, C.; Goetzen, K.; Gong, W. X.; Gradl, W.; Greco, M.; Gu, M. H.; Gu, Y. T.; Guan, Y. H.; Guo, A. Q.; Guo, L. B.; Guo, Y.; P. Guo, Y.; Haddadi, Z.; Hafner, A.; Han, S.; Han, Y. L.; Hao, X. Q.; A. Harris, F.; He, K. L.; He, Z. Y.; Held, T.; Heng, Y. K.; Hou, Z. L.; Hu, C.; Hu, H. M.; Hu, J. F.; Hu, T.; Hu, Y.; Huang, G. M.; Huang, G. S.; Huang, H. P.; Huang, J. S.; Huang, X. T.; Huang, Y.; Hussain, T.; Ji, Q.; Ji, Q. P.; Ji, X. B.; Ji, X. L.; Jiang, L. L.; Jiang, L. W.; Jiang, X. S.; Jiao, J. B.; Jiao, Z.; Jin, D. P.; Jin, S.; Johansson, T.; Julin, A.; Kalantar-Nayestanaki, N.; Kang, X. L.; Kang, X. S.; Kavatsyuk, M.; C. Ke, B.; Kliemt, R.; Kloss, B.; B. Kolcu, O.; Kopf, B.; Kornicer, M.; Kuehn, W.; Kupsc, A.; Lai, W.; S. Lange, J.; M., Lara; Larin, P.; Leng, C.; Li, C. H.; Li, Cheng; Li, D. M.; Li, F.; Li, G.; Li, H. B.; Li, J. C.; Li, Jin; Li, K.; Li, K.; Li, Lei; Li, P. R.; Li, T.; Li, W. D.; Li, W. G.; Li, X. L.; Li, X. M.; Li, X. N.; Li, X. Q.; Li, Z. B.; Liang, H.; Liang, Y. F.; Liang, Y. T.; Liao, G. R.; X. 
Lin(Lin, D.; Liu, B. J.; Liu, C. X.; Liu, F. H.; Liu, Fang; Liu, Feng; Liu, H. B.; Liu, H. H.; Liu, H. H.; Liu, H. M.; Liu, J.; Liu, J. P.; Liu, J. Y.; Liu, K.; Liu, K. Y.; Liu, L. D.; Liu, P. L.; Liu, Q.; Liu, S. B.; Liu, X.; Liu, X. X.; Liu, Y. B.; Liu, Z. A.; Liu, Zhiqiang; Zhiqing, Liu; Loehner, H.; Lou, X. C.; Lu, H. J.; Lu, J. G.; Lu, R. Q.; Lu, Y.; Lu, Y. P.; Luo, C. L.; Luo, M. X.; Luo, T.; Luo, X. L.; Lv, M.; Lyu, X. R.; Ma, F. C.; Ma, H. L.; Ma, L. L.; Ma, Q. M.; Ma, S.; Ma, T.; Ma, X. N.; Ma, X. Y.; E. Maas, F.; Maggiora, M.; A. Malik, Q.; Mao, Y. J.; Mao, Z. P.; Marcello, S.; G. Messchendorp, J.; Min, J.; Min, T. J.; E. Mitchell, R.; Mo, X. H.; Mo, Y. J.; C. Morales, Morales; Moriya, K.; Yu. Muchnoi, N.; Muramatsu, H.; Nefedov, Y.; Nerling, F.; B. Nikolaev, I.; Ning, Z.; Nisar, S.; Niu, S. L.; Niu, X. Y.; Olsen, S. L.; Ouyang, Q.; Pacetti, S.; Patteri, P.; Pelizaeus, M.; Peng, H. P.; Peters, K.; Ping, J. L.; Ping, R. G.; Poling, R.; Pu, Y. N.; Qi, M.; Qian, S.; Qiao, C. F.; Qin, L. Q.; Qin, N.; Qin, X. S.; Qin, Y.; Qin, Z. H.; Qiu, J. F.; H. Rashid, K.; F. Redmer, C.; Ren, H. L.; Ripka, M.; Rong, G.; Ruan, X. D.; Santoro, V.; Sarantsev, A.; Savrié, M.; Schoenning, K.; Schumann, S.; Shan, W.; Shao, M.; Shen, C. P.; Shen, P. X.; Shen, X. Y.; Sheng, H. Y.; Song, W. M.; Song, X. Y.; Sosio, S.; Spataro, S.; Sun, G. X.; Sun, J. F.; Sun, S. S.; Sun, Y. J.; Sun, Y. Z.; Sun, Z. J.; Sun, Z. T.; Tang, C. J.; Tang, X.; Tapan, I.; H. Thorndike, E.; Tiemens, M.; Toth, D.; Ullrich, M.; Uman, I.; S. Varner, G.; Wang, B.; Wang, B. L.; Wang, D.; Wang, D. Y.; Wang, K.; Wang, L. L.; Wang, L. S.; Wang, M.; Wang, P.; Wang, P. L.; Wang, Q. J.; Wang, S. G.; Wang, W.; Wang, X. F.; Yadi, Wang; Wang, Y. F.; Wang, Y. Q.; Wang, Z.; Wang, Z. G.; Wang, Z. H.; Wang, Z. Y.; Weber, T.; Wei, D. H.; Wei, J. B.; Weidenkaff, P.; Wen, S. P.; Wiedner, U.; Wolke, M.; Wu, L. H.; Wu, Z.; Xia, L. G.; Xia, Y.; Xiao, D.; Xiao, Z. J.; Xie, Y. G.; Xiu, Q. L.; Xu, G. F.; Xu, L.; Xu, Q. J.; Xu, Q. 
N.; Xu, X. P.; Yan, L.; Yan, W. B.; Yan, W. C.; Yan, Y. H.; Yang, H. X.; Yang, L.; Yang, Y.; Yang, Y. X.; Ye, H.; Ye, M.; Ye, M. H.; Yin, J. H.; Yu, B. X.; Yu, C. X.; Yu, H. W.; Yu, J. S.; Yuan, C. Z.; Yuan, W. L.; Yuan, Y.; Yuncu, A.; A. Zafar, A.; Zallo, A.; Zeng, Y.; Zhang, B. X.; Zhang, B. Y.; Zhang, C.; Zhang, C. C.; Zhang, D. H.; Zhang, H. H.; Zhang, H. Y.; Zhang, J. J.; Zhang, J. L.; Zhang, J. Q.; Zhang, J. W.; Zhang, J. Y.; Zhang, J. Z.; Zhang, K.; Zhang, L.; Zhang, S. H.; Zhang, X. Y.; Zhang, Y.; Zhang, Y. H.; Zhang, Y. T.; Zhang, Z. H.; Zhang, Z. P.; Zhang, Z. Y.; Zhao, G.; Zhao, J. W.; Zhao, J. Y.; Zhao, J. Z.; Zhao, Lei; Zhao, Ling; Zhao, M. G.; Zhao, Q.; Zhao, Q. W.; Zhao, S. J.; Zhao, T. C.; Zhao, Y. B.; Zhao, Z. G.; Zhemchugov, A.; Zheng, B.; Zheng, J. P.; Zheng, W. J.; Zheng, Y. H.; Zhong, B.; Zhou, L.; Zhou, Li; Zhou, X.; Zhou, X. K.; Zhou, X. R.; Zhou, X. Y.; Zhu, K.; Zhu, K. J.; Zhu, S.; Zhu, X. L.; Zhu, Y. C.; Zhu, Y. S.; Zhu, Z. A.; Zhuang, J.; Zotti, L.; Zou, B. S.; Zou, J. H.; BESIII Collaboration

    2015-09-01

    From December 2011 to May 2014, about 5 fb-1 of data were taken with the BESIII detector at center-of-mass energies between 3.810 GeV and 4.600 GeV to study the charmonium-like states and higher excited charmonium states. The time-integrated luminosity of the collected data sample is measured to a precision of 1% by analyzing events produced by the large-angle Bhabha scattering process. Supported by National Key Basic Research Program of China (2015CB856700), National Natural Science Foundation of China (NSFC) (11125525, 11235011, 11322544, 11335008, 11425524), Chinese Academy of Sciences (CAS) Large-Scale Scientific Facility Program, Joint Large-Scale Scientific Facility Funds of the NSFC and CAS (11179007, U1232201, U1332201) CAS (KJCX2-YW-N29, KJCX2-YW-N45), 100 Talents Program of CAS, INPAC and Shanghai Key Laboratory for Particle Physics and Cosmology, German Research Foundation DFG (Collaborative Research Center CRC-1044), Istituto Nazionale di Fisica Nucleare, Italy; Ministry of Development of Turkey (DPT2006K-120470), Russian Foundation for Basic Research (14-07-91152), U.S. Department of Energy (DE-FG02-04ER41291, DE-FG02-05ER41374, DE-FG02-94ER40823, DESC0010118), U.S. National Science Foundation, University of Groningen (RuG) and the Helmholtzzentrum fuer Schwerionenforschung GmbH (GSI), Darmstadt and WCU Program of National Research Foundation of Korea (R32-2008-000-10155-0)

  18. Approximate kernel competitive learning.

    PubMed

    Wu, Jian-Sheng; Zheng, Wei-Shi; Lai, Jian-Huang

    2015-03-01

    Kernel competitive learning (KCL) has been successfully used to achieve robust clustering. However, KCL is not scalable to large-scale data processing, because (1) it has to calculate and store the full kernel matrix, which is too large to be computed and kept in memory, and (2) it cannot be computed in parallel. In this paper we develop a framework of approximate kernel competitive learning for processing large-scale datasets. The proposed framework consists of two parts. First, it derives an approximate kernel competitive learning (AKCL), which learns kernel competitive learning in a subspace via sampling. We provide solid theoretical analysis of why the proposed approximation model works for kernel competitive learning, and furthermore, we show that the computational complexity of AKCL is largely reduced. Second, we propose a pseudo-parallelled approximate kernel competitive learning (PAKCL) based on a set-based kernel competitive learning strategy, which overcomes the obstacle of using parallel programming in kernel competitive learning and significantly accelerates the approximate kernel competitive learning for large-scale clustering. The empirical evaluation on publicly available datasets shows that the proposed AKCL and PAKCL perform comparably to KCL, with a large reduction in computational cost. Also, the proposed methods achieve more effective clustering performance in terms of clustering precision against related approximate clustering approaches. Copyright © 2014 Elsevier Ltd. All rights reserved.
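    The subspace-via-sampling idea can be sketched generically. The following is a Nyström-style landmark approximation combined with plain winner-take-all updates on invented toy data, not the authors' exact AKCL/PAKCL algorithms: each point is represented by its kernel values against m sampled landmarks, so the full n × n kernel matrix is never formed.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def rbf(X, Y, gamma=1.0):
        # Pairwise RBF kernel between rows of X and rows of Y.
        d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)

    # Toy data: two well-separated Gaussian blobs (illustrative only).
    X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(3, 0.3, (50, 2))])

    # Landmark approximation: an n x m feature map replaces the n x n matrix.
    m = 10
    landmarks = X[rng.choice(len(X), m, replace=False)]
    F = rbf(X, landmarks)

    # Plain competitive learning (winner-take-all) in the reduced space.
    k, lr = 2, 0.1
    W = F[rng.choice(len(F), k, replace=False)].copy()  # k prototypes
    for _ in range(20):
        for f in F[rng.permutation(len(F))]:
            w = ((W - f) ** 2).sum(1).argmin()          # winning prototype
            W[w] += lr * (f - W[w])                     # move winner toward input

    labels = ((F[:, None, :] - W[None, :, :]) ** 2).sum(-1).argmin(1)
    ```

    The storage cost drops from O(n²) to O(nm), which is the essential scalability gain the abstract describes; the paper's contribution is the theoretical justification for doing competitive learning in such a sampled subspace.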

  19. Advanced Visualization and Interactive Display Rapid Innovation and Discovery Evaluation Research (VISRIDER) Program Task 6: Point Cloud Visualization Techniques for Desktop and Web Platforms

    DTIC Science & Technology

    2017-04-01

    Report period: OCT 2013 – SEP 2014. ...various point cloud visualization techniques for viewing large-scale LiDAR datasets. Evaluate their potential use for thick-client desktop platforms

  20. Skills of U.S. Unemployed, Young, and Older Adults in Sharper Focus: Results from the Program for the International Assessment of Adult Competencies (PIAAC) 2012/2014. First Look. NCES 2016-039

    ERIC Educational Resources Information Center

    Rampey, Bobby D.; Finnegan, Robert; Goodman, Madeline; Mohadjer, Leyla; Krenzke, Tom; Hogan, Jacquie; Provasnik, Stephen

    2016-01-01

    The "Program for the International Assessment of Adult Competencies" (PIAAC) is a cyclical, large-scale study of adult skills and life experiences focusing on education and employment. Nationally representative samples of adults between the ages of 16 and 65 are administered an assessment of literacy, numeracy, and problem solving in…

  1. Nonproliferation and Threat Reduction Assistance: U.S. Programs in the Former Soviet Union

    DTIC Science & Technology

    2011-04-26

    large-scale former BW-related facilities so that they can perform peaceful research on issues such as infectious diseases. The Global Threat Reduction...indicated that it may not pursue the MOX program to eliminate its plutonium, opting instead for the construction of fast breeder reactors that could...burn plutonium directly for energy production. The United States might not fund this effort, as many in the United States argue that breeder reactors

  2. Hybrid MPI+OpenMP Programming of an Overset CFD Solver and Performance Investigations

    NASA Technical Reports Server (NTRS)

    Djomehri, M. Jahed; Jin, Haoqiang H.; Biegel, Bryan (Technical Monitor)

    2002-01-01

    This report describes a two-level parallelization of a Computational Fluid Dynamics (CFD) solver with multi-zone overset structured grids. The approach is based on a hybrid MPI+OpenMP programming model suitable for shared-memory machines and clusters of shared-memory machines. The performance investigations of the hybrid application on an SGI Origin2000 (O2K) machine are reported using medium and large-scale test problems.

  3. Materials for stem cell factories of the future

    NASA Astrophysics Data System (ADS)

    Celiz, Adam D.; Smith, James G. W.; Langer, Robert; Anderson, Daniel G.; Winkler, David A.; Barrett, David A.; Davies, Martyn C.; Young, Lorraine E.; Denning, Chris; Alexander, Morgan R.

    2014-06-01

    Polymeric substrates are being identified that could permit translation of human pluripotent stem cells from laboratory-based research to industrial-scale biomedicine. Well-defined materials are required to allow cell banking and to provide the raw material for reproducible differentiation into lineages for large-scale drug-screening programs and clinical use. Yet more than 1 billion cells per patient are needed to replace cells lost during heart attack, multiple sclerosis, and diabetes. Producing this number of cells is challenging, and a rethink of the current predominant cell-derived substrates is needed to provide technology that can be scaled to meet the needs of millions of patients a year. In this Review, we consider the role of materials discovery, an emerging area of materials chemistry that is in large part driven by the challenges posed by biologists to materials scientists.

  4. National Water-Quality Assessment Program--Southern High Plains, Texas and New Mexico

    USGS Publications Warehouse

    Woodward, Dennis G.; Diniz, Cecilia G.

    1994-01-01

    BACKGROUND In 1991, the U.S. Geological Survey (USGS) began a National Water-Quality Assessment (NAWQA) program. The long-term goals of the NAWQA program are to describe the status of, and trends in, the quality of a large, representative part of the Nation's surface- and ground-water resources and to identify the major natural and human factors that affect the quality of these resources. In addressing these goals, the program will produce a wealth of water-quality information that will be useful to policy makers and managers at the National, State, and local levels. The NAWQA program emphasis is on regional water-quality problems. The program will not diminish the need for smaller studies and monitoring designed and currently being conducted by Federal, State, and local agencies to meet their individual needs. The NAWQA program, however, will provide a large-scale framework for conducting many of these activities and an understanding about National and regional water-quality conditions that cannot be acquired from individual, small-scale programs and studies. Studies of 60 hydrologic systems that include parts of most major river basins and aquifer systems (study-unit investigations) are the building blocks of the National assessment. The 60 study units range in size from 1,000 mi² (square miles) to more than 60,000 mi² and represent 60 to 70 percent of the Nation's water use and population served by public water supplies. Twenty study-unit investigations were started in 1991, 20 additional are starting in 1994, and 20 more are planned to start in 1997. The Southern High Plains study unit was selected as one of 20 study units to begin assessment activities in 1994. This study will be run from the New Mexico District office of the USGS in Albuquerque, New Mexico.

  5. A scientific program for infrared, submillimeter and radio astronomy from space: A report by the Management Operations Working Group

    NASA Technical Reports Server (NTRS)

    1989-01-01

    Important and fundamental scientific progress can be attained through space observations at wavelengths longward of 1 micron. The formation of galaxies, stars, and planets; the origin of quasars and the nature of active galactic nuclei; the large-scale structure of the Universe; and the problem of the missing mass are among the major scientific issues that can be addressed by these observations. Significant advances in many areas of astrophysics can be made over the next 20 years by implementing the outlined program. This program combines large observatories with smaller projects to create an overall scheme that emphasizes complementarity and synergy, advanced technology, community support and development, and the training of the next generation of scientists. Key aspects of the program include: the Space Infrared Telescope Facility; the Stratospheric Observatory for Infrared Astronomy; a robust program of small missions; and the creation of the technology base for future major observatories.

  6. Safe Patient Handling and Mobility: Development and Implementation of a Large-Scale Education Program.

    PubMed

    Lee, Corinne; Knight, Suzanne W; Smith, Sharon L; Nagle, Dorothy J; DeVries, Lori

    This article addresses the development, implementation, and evaluation of an education program for safe patient handling and mobility at a large academic medical center. The ultimate goal of the program was to increase safety during patient mobility/transfer and reduce nursing staff injury from lifting/pulling. This comprehensive program was designed on the basis of the principles of prework, application, and support at the point of care. A combination of online learning, demonstration, skill evaluation, and coaching at the point of care was used to achieve the goal. Specific roles and responsibilities were developed to facilitate implementation. It took 17 master trainers, 88 certified trainers, 176 unit-based trainers, and 98 coaches to put 3706 nurses and nursing assistants through the program. Evaluations indicated both an increase in knowledge about safe patient handling and an increased ability to safely mobilize patients. The challenge now is sustainability of safe patient-handling practices and the growth and development of trainers and coaches.

  7. The development of a capability for aerodynamic testing of large-scale wing sections in a simulated natural rain environment

    NASA Technical Reports Server (NTRS)

    Bezos, Gaudy M.; Cambell, Bryan A.; Melson, W. Edward

    1989-01-01

    A research technique to obtain large-scale aerodynamic data in a simulated natural rain environment has been developed. A 10-ft chord NACA 64-210 wing section equipped with leading-edge and trailing-edge high-lift devices was tested as part of a program to determine the effect of highly concentrated, short-duration rainfall on airplane performance. Preliminary dry aerodynamic data are presented for the high-lift configuration at a velocity of 100 knots and an angle of attack of 18 deg. Also, data are presented on rainfield uniformity and rainfall concentration intensity levels obtained during the calibration of the rain simulation system.

  8. Scaled Lunar Module Jet Erosion Experiments

    NASA Technical Reports Server (NTRS)

    Land, Norman S.; Scholl, Harland F.

    1966-01-01

    An experimental research program was conducted on the erosion of particulate surfaces by a jet exhaust. These experiments were scaled to represent the lunar module (LM) during landing. A conical cold-gas nozzle simulating the lunar module nozzle was utilized. The investigation was conducted within a large vacuum chamber by using gravel or glass beads as a simulated soil. The effects of thrust, descent speed, nozzle terminal height, and particle size on crater size and visibility during jet erosion were determined.

  9. Western Pacific Tropical Cyclone Adaptive Observing of Inner Core Life-cycle Structure and Intensity Change

    DTIC Science & Technology

    2009-09-30

    airborne radar images; develop an analysis scheme for the monsoon and storm-scale circulation features that would: a. Define large-scale context...Doppler radar observations of TC mesoscale circulations. The TCS-08 field program provided unique aircraft reconnaissance (recon) data that will be...system for the WC-130J, as well as developed a new system for recording airborne radar video for the first time. 3. Created an archive of all WC-130J

  10. Coverage of Large-Scale Food Fortification of Edible Oil, Wheat Flour, and Maize Flour Varies Greatly by Vehicle and Country but Is Consistently Lower among the Most Vulnerable: Results from Coverage Surveys in 8 Countries

    PubMed Central

    Aaron, Grant J; Friesen, Valerie M; Jungjohann, Svenja; Garrett, Greg S; Myatt, Mark

    2017-01-01

    Background: Large-scale food fortification (LSFF) of commonly consumed food vehicles is widely implemented in low- and middle-income countries. Many programs have monitoring information gaps and most countries fail to assess program coverage. Objective: The aim of this work was to present LSFF coverage survey findings (overall and in vulnerable populations) from 18 programs (7 wheat flour, 4 maize flour, and 7 edible oil programs) conducted in 8 countries between 2013 and 2015. Methods: A Fortification Assessment Coverage Toolkit (FACT) was developed to standardize the assessments. Three indicators were used to assess the relations between coverage and vulnerability: 1) poverty, 2) poor dietary diversity, and 3) rural residence. Three measures of coverage were assessed: 1) consumption of the vehicle, 2) consumption of a fortifiable vehicle, and 3) consumption of a fortified vehicle. Individual program performance was assessed based on the following: 1) achieving overall coverage ≥50%, 2) achieving coverage of ≥75% in ≥1 vulnerable group, and 3) achieving equity in coverage for ≥1 vulnerable group. Results: Coverage varied widely by food vehicle and country. Only 2 of the 18 LSFF programs assessed met all 3 program performance criteria. The 2 main program bottlenecks were a poor choice of vehicle and failure to fortify a fortifiable vehicle (i.e., absence of fortification). Conclusions: The results highlight the importance of sound program design and routine monitoring and evaluation. There is strong evidence of the impact and cost-effectiveness of LSFF; however, impact can only be achieved when the necessary activities and processes during program design and implementation are followed. The FACT approach fills an important gap in the availability of standardized tools. The LSFF programs assessed here need to be re-evaluated to determine whether to further invest in the programs, whether other vehicles are appropriate, and whether other approaches are needed. 
PMID:28404836

  11. Coverage of Large-Scale Food Fortification of Edible Oil, Wheat Flour, and Maize Flour Varies Greatly by Vehicle and Country but Is Consistently Lower among the Most Vulnerable: Results from Coverage Surveys in 8 Countries.

    PubMed

    Aaron, Grant J; Friesen, Valerie M; Jungjohann, Svenja; Garrett, Greg S; Neufeld, Lynnette M; Myatt, Mark

    2017-05-01

    Background: Large-scale food fortification (LSFF) of commonly consumed food vehicles is widely implemented in low- and middle-income countries. Many programs have monitoring information gaps and most countries fail to assess program coverage. Objective: The aim of this work was to present LSFF coverage survey findings (overall and in vulnerable populations) from 18 programs (7 wheat flour, 4 maize flour, and 7 edible oil programs) conducted in 8 countries between 2013 and 2015. Methods: A Fortification Assessment Coverage Toolkit (FACT) was developed to standardize the assessments. Three indicators were used to assess the relations between coverage and vulnerability: 1) poverty, 2) poor dietary diversity, and 3) rural residence. Three measures of coverage were assessed: 1) consumption of the vehicle, 2) consumption of a fortifiable vehicle, and 3) consumption of a fortified vehicle. Individual program performance was assessed based on the following: 1) achieving overall coverage ≥50%, 2) achieving coverage of ≥75% in ≥1 vulnerable group, and 3) achieving equity in coverage for ≥1 vulnerable group. Results: Coverage varied widely by food vehicle and country. Only 2 of the 18 LSFF programs assessed met all 3 program performance criteria. The 2 main program bottlenecks were a poor choice of vehicle and failure to fortify a fortifiable vehicle (i.e., absence of fortification). Conclusions: The results highlight the importance of sound program design and routine monitoring and evaluation. There is strong evidence of the impact and cost-effectiveness of LSFF; however, impact can only be achieved when the necessary activities and processes during program design and implementation are followed. The FACT approach fills an important gap in the availability of standardized tools.
The LSFF programs assessed here need to be re-evaluated to determine whether to further invest in the programs, whether other vehicles are appropriate, and whether other approaches are needed.
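
    The three performance criteria in this abstract map directly onto a simple check. A minimal Python sketch, with hypothetical coverage numbers and an illustrative reading of "equity" as a vulnerable group's coverage at least matching its complement's (the function and data are not from FACT):

```python
def meets_criteria(overall, vulnerable_coverage, nonvulnerable_coverage):
    """Apply the three program-performance criteria described above.

    overall: overall coverage fraction (0-1).
    vulnerable_coverage / nonvulnerable_coverage: dicts mapping a
    vulnerability indicator (e.g. poverty, rural residence) to the
    coverage fraction in that group vs. its complement.
    """
    c1 = overall >= 0.50                                       # criterion 1
    c2 = any(v >= 0.75 for v in vulnerable_coverage.values())  # criterion 2
    # Criterion 3: "equity" read here (an assumption) as coverage in a
    # vulnerable group at least matching the non-vulnerable group.
    c3 = any(vulnerable_coverage[k] >= nonvulnerable_coverage[k]
             for k in vulnerable_coverage)
    return c1, c2, c3

# Hypothetical wheat-flour program
c1, c2, c3 = meets_criteria(
    overall=0.62,
    vulnerable_coverage={"poor": 0.78, "rural": 0.55},
    nonvulnerable_coverage={"poor": 0.70, "rural": 0.60},
)
print(c1, c2, c3)  # True True True
```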

  12. Descriptive Statistics for Modern Test Score Distributions: Skewness, Kurtosis, Discreteness, and Ceiling Effects.

    PubMed

    Ho, Andrew D; Yu, Carol C

    2015-06-01

    Many statistical analyses benefit from the assumption that unconditional or conditional distributions are continuous and normal. More than 50 years ago in this journal, Lord and Cook chronicled departures from normality in educational tests, and Micceri similarly showed that the normality assumption is met rarely in educational and psychological practice. In this article, the authors extend these previous analyses to state-level educational test score distributions that are an increasingly common target of high-stakes analysis and interpretation. Among 504 scale-score and raw-score distributions from state testing programs from recent years, nonnormal distributions are common and are often associated with particular state programs. The authors explain how scaling procedures from item response theory lead to nonnormal distributions as well as unusual patterns of discreteness. The authors recommend that distributional descriptive statistics be calculated routinely to inform model selection for large-scale test score data, and they illustrate consequences of nonnormality using sensitivity studies that compare baseline results to those from normalized score scales.
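
    The distributional descriptives recommended above (skewness, excess kurtosis) follow directly from raw moments. A minimal pure-Python sketch with invented, ceiling-affected scores (not state test data):

```python
import math

def moments(xs):
    """Mean, SD, skewness, and excess kurtosis from raw moments."""
    n = len(xs)
    mean = sum(xs) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in xs) / n)
    skew = sum((x - mean) ** 3 for x in xs) / (n * sd ** 3)
    kurt = sum((x - mean) ** 4 for x in xs) / (n * sd ** 4) - 3.0
    return mean, sd, skew, kurt

# A ceiling-affected distribution piles up at the maximum score,
# leaving a long left tail and hence negative skewness.
scores = [10] * 40 + [9] * 25 + [8, 7, 6, 5, 4, 3, 2]
mean, sd, skew, kurt = moments(scores)
print(round(skew, 2))  # negative: left tail from the ceiling effect
```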

  13. ATLAS Large Scale Thin Gap Chambers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Soha, Aria

    This is a technical scope of work (TSW) between the Fermi National Accelerator Laboratory (Fermilab) and the experimenters of the ATLAS sTGC New Small Wheel collaboration who have committed to participate in beam tests to be carried out during the FY2014 Fermilab Test Beam Facility program.

  14. Predictors of Sustainability of Social Programs

    ERIC Educational Resources Information Center

    Savaya, Riki; Spiro, Shimon E.

    2012-01-01

    This article presents the findings of a large scale study that tested a comprehensive model of predictors of three manifestations of sustainability: continuation, institutionalization, and duration. Based on the literature the predictors were arrayed in four groups: variables pertaining to the project, the auspice organization, the community, and…

  15. National Security and Civil Liberty: Striking the Balance

    ERIC Educational Resources Information Center

    Lopach, James J.; Luckowski, Jean A.

    2006-01-01

    After September 11, 2001, the Bush administration initiated large-scale electronic surveillance within the United States to gather intelligence to protect citizens from terrorists. Media commentary, public reaction, and classroom practices regarding this program have tended toward either-or positions: either for presidential power and national…

  16. Test Design Considerations for Students with Significant Cognitive Disabilities

    ERIC Educational Resources Information Center

    Anderson, Daniel; Farley, Dan; Tindal, Gerald

    2015-01-01

    Students with significant cognitive disabilities present an assessment dilemma that centers on access and validity in large-scale testing programs. Typically, access is improved by eliminating construct-irrelevant barriers, while validity is improved, in part, through test standardization. In this article, one state's alternate assessment data…

  17. Handbook of Formative Assessment

    ERIC Educational Resources Information Center

    Andrade, Heidi, Ed.; Cizek, Gregory J., Ed.

    2010-01-01

    Formative assessment has recently become a focus of renewed research as state and federal policy-makers realize that summative assessments have reached a point of diminishing returns as a tool for increasing student achievement. Consequently, supporters of large-scale testing programs are now beginning to consider the potential of formative…

  18. Active microwave remote sensing of oceans, chapter 3

    NASA Technical Reports Server (NTRS)

    1975-01-01

    A rationale is developed for the use of active microwave sensing in future aerospace applications programs for the remote sensing of the world's oceans, lakes, and polar regions. Summaries pertaining to applications, local phenomena, and large-scale phenomena are given along with a discussion of orbital errors.

  19. Power Grid Data Analysis with R and Hadoop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hafen, Ryan P.; Gibson, Tara D.; Kleese van Dam, Kerstin

    This book chapter presents an approach to analysis of large-scale time-series sensor information based on our experience with power grid data. We use the R-Hadoop Integrated Programming Environment (RHIPE) to analyze a 2TB data set and present code and results for this analysis.
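
    The divide-and-recombine pattern behind RHIPE can be illustrated without Hadoop: divide the series into subsets, apply an analytic to each subset (the map step), then recombine the per-subset summaries (the reduce step). A toy Python sketch with a synthetic signal (names and data are illustrative):

```python
from statistics import mean

def divide(series, chunk):
    """Split a long time series into contiguous subsets."""
    return [series[i:i + chunk] for i in range(0, len(series), chunk)]

def analyze(subset):
    """Per-subset analytic, run independently in the map step."""
    return {"n": len(subset), "mean": mean(subset)}

def recombine(results):
    """Reduce step: pool the per-subset summaries into a global one."""
    total = sum(r["n"] for r in results)
    return sum(r["mean"] * r["n"] for r in results) / total

series = [float(i % 60) for i in range(600)]  # synthetic sensor signal
overall = recombine([analyze(s) for s in divide(series, 100)])
print(overall)  # equals the mean of the full series
```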

  20. Interfaith Leaders as Social Entrepreneurs

    ERIC Educational Resources Information Center

    Patel, Eboo; Meyer, Cassie

    2012-01-01

    Social entrepreneurs work to find concrete solutions to large-scale problems that are scalable and sustainable. In this article, the authors explore what the framework of social entrepreneurship might offer those seeking to positively engage religious diversity on college campuses, and highlight two programs that offer examples of what such…

  1. A statewide nurse training program for a hospital based infant abusive head trauma prevention program.

    PubMed

    Nocera, Maryalice; Shanahan, Meghan; Murphy, Robert A; Sullivan, Kelly M; Barr, Marilyn; Price, Julie; Zolotor, Adam

    2016-01-01

    Successful implementation of universal patient education programs requires training large numbers of nursing staff in new content and procedures and maintaining fidelity to program standards. In preparation for statewide adoption of a hospital based universal education program, nursing staff at 85 hospitals and 1 birthing center in North Carolina received standardized training. This article describes the training program and reports findings from the process, outcome and impact evaluations of this training. Evaluation strategies were designed to query nurse satisfaction with training and course content; determine if training conveyed new information; and assess if nurses applied lessons from the training sessions to deliver the program as designed. Trainings were conducted during April 2008-February 2010. Evaluations were received from 4358 attendees. Information was obtained about training type, participants' perceptions of newness and usefulness of information, and how the program compared to other education materials. Program fidelity data were collected using telephone surveys about compliance with delivery of teaching points and teaching behaviors. Results demonstrate high levels of satisfaction and perceptions of program utility as well as adherence to the program model. These findings support the feasibility of implementing a universal patient education program with strong uptake using large-scale systematic training programs. Copyright © 2015 Elsevier Ltd. All rights reserved.

  2. Developing Renewable Energy Projects Larger Than 10 MWs at Federal Facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2013-03-01

    To accomplish Federal goals for renewable energy, sustainability, and energy security, large-scale renewable energy projects must be developed and constructed on Federal sites at a significant scale with significant private investment. For the purposes of this Guide, large-scale Federal renewable energy projects are defined as renewable energy facilities larger than 10 megawatts (MW) that are sited on Federal property and lands and typically financed and owned by third parties. The U.S. Department of Energy’s Federal Energy Management Program (FEMP) helps Federal agencies meet these goals and assists agency personnel in navigating the complexities of developing such projects and attracting the private capital necessary to complete them. This Guide is intended to provide a general resource that will begin to develop the Federal employee’s awareness and understanding of the project developer’s operating environment and the private sector’s awareness and understanding of the Federal environment. Because the vast majority of the investment that is required to meet the goals for large-scale renewable energy projects will come from the private sector, this Guide has been organized to match Federal processes with typical phases of commercial project development. FEMP collaborated with the National Renewable Energy Laboratory (NREL) and professional project developers on this Guide to ensure that Federal projects have key elements recognizable to private sector developers and investors. The main purpose of this Guide is to provide a project development framework to allow the Federal Government, private developers, and investors to work in a coordinated fashion on large-scale renewable energy projects. The framework includes key elements that describe a successful, financially attractive large-scale renewable energy project. This framework begins the translation between the Federal and private sector operating environments.

  3. Transitioning a Large Scale HIV/AIDS Prevention Program to Local Stakeholders: Findings from the Avahan Transition Evaluation

    PubMed Central

    Bennett, Sara; Singh, Suneeta; Rodriguez, Daniela; Ozawa, Sachiko; Singh, Kriti; Chhabra, Vibha; Dhingra, Neeraj

    2015-01-01

    Background Between 2009 and 2013 the Bill and Melinda Gates Foundation transitioned its HIV/AIDS prevention initiative in India from being a stand-alone program outside of government to being fully government funded and implemented. We present an independent prospective evaluation of the transition. Methods The evaluation drew upon (1) a structured survey of transition readiness in a sample of 80 targeted HIV prevention programs prior to transition; (2) a structured survey assessing institutionalization of program features in a sample of 70 targeted intervention (TI) programs, one year post-transition; and (3) case studies of 15 TI programs. Findings Transition was conducted in 3 rounds. While the 2009 transition round was problematic, subsequent rounds were implemented more smoothly. In the 2011 and 2012 transition rounds, Avahan programs were well prepared for transition with the large majority of TI program staff trained for transition, high alignment with government clinical, financial and managerial norms, and strong government commitment to the program. One year post transition there were significant program changes, but these were largely perceived positively. Notable negative changes were: limited flexibility in program management, delays in funding, commodity stock outs, and community member perceptions of a narrowing in program focus. Service coverage outcomes were sustained at least six months post-transition. Interpretation The study suggests that significant investments in transition preparation contributed to a smooth transition and sustained service coverage. Notwithstanding, there were substantive program changes post-transition. Five key lessons for transition design and implementation are identified. PMID:26327591

  4. A full year of snow on sea ice observations and simulations - Plans for MOSAiC 2019/20

    NASA Astrophysics Data System (ADS)

    Nicolaus, M.; Gerland, S.; Perovich, D. K.

    2017-12-01

    The snow cover on sea ice dominates many exchange processes and properties of the ice-covered polar oceans. It is a major interface between the atmosphere and the sea ice, with the ocean underneath. Snow on sea ice is known for its extraordinarily large spatial and temporal variability, from micro scales and minutes to basin-wide scales and decades. At the same time, snow cover properties and even snow depth distributions are among the least known and most difficult to observe climate variables. Starting in October 2019 and ending in October 2020, the international MOSAiC drift experiment will allow the evolution of a snow pack on Arctic sea ice to be observed over a full annual cycle. During the drift of one ice floe along the transpolar drift, we will study snow processes and interactions as one of the main topics of the MOSAiC research program. Thus we will, for the first time, be able to perform such studies on seasonal sea ice and relate them to previous expeditions and parallel observations at different locations. Here we present the current status of our planning of the MOSAiC snow program. We summarize the latest implementation ideas to combine the field observations with numerical simulations. The field program will include regular manual observations and sampling on the main floe of the central observatory, autonomous recordings in the distributed network, airborne observations in the surroundings of the central observatory, and retrievals of satellite remote sensing products. Along with the field program, numerical simulations of the MOSAiC snow cover will be performed on different scales, including large-scale interaction with the atmosphere and the sea ice. The snow studies will also bridge between the different disciplines, including physical, chemical, biological, and geochemical measurements, samples, and fluxes. The main challenge of all measurements will be to accomplish the description of the full annual cycle.

  5. Suaahara in Nepal: An at-scale, multi-sectoral nutrition program influences knowledge and practices while enhancing equity.

    PubMed

    Cunningham, Kenda; Singh, Akriti; Pandey Rana, Pooja; Brye, Laura; Alayon, Silvia; Lapping, Karin; Gautam, Bindu; Underwood, Carol; Klemm, Rolf D W

    2017-10-01

    The burden of undernutrition in South Asia is greater than anywhere else. Policies and programmatic efforts increasingly address health and non-health determinants of undernutrition. In Nepal, one large-scale integrated nutrition program, Suaahara, aimed to reduce undernutrition among women and children in the 1,000-day period, while simultaneously addressing inequities. In this study, we use household-level process evaluation data (N = 480) to assess levels of exposure to program inputs and levels of knowledge and practices related to health, nutrition, and water, sanitation, and hygiene (WASH). We also assess Suaahara's effect on the differences between disadvantaged (DAG) and non-disadvantaged households in exposure, knowledge, and practice indicators. All regression models were adjusted for potential confounders at the child-, maternal-, and household levels, as well as clustering. We found a higher prevalence of almost all exposure and knowledge indicators and some practice indicators in Suaahara areas versus comparison areas. A higher proportion of DAG households in Suaahara areas reported exposure, were knowledgeable, and practiced optimal behaviors related to nearly all maternal and child health, nutrition, and WASH indicators than DAG households in non-Suaahara areas and sometimes even than non-DAG households in Suaahara areas. Moreover, differences in some of these indicators between DAG and non-DAG households were significantly smaller in Suaahara areas than in comparison areas. These results indicate that large-scale integrated interventions can influence nutrition-related knowledge and practices, while simultaneously reducing inequities. © 2017 John Wiley & Sons Ltd.

  6. High-Lift Engine Aeroacoustics Technology (HEAT) Test Program Overview

    NASA Technical Reports Server (NTRS)

    Zuniga, Fanny A.; Smith, Brian E.

    1999-01-01

    The NASA High-Speed Research program developed the High-Lift Engine Aeroacoustics Technology (HEAT) program to demonstrate satisfactory interaction between the jet noise suppressor and high-lift system of a High-Speed Civil Transport (HSCT) configuration at takeoff, climb, approach and landing conditions. One scheme for reducing jet exhaust noise generated by an HSCT is the use of a mixer-ejector system which would entrain large quantities of ambient air into the nozzle exhaust flow through secondary inlets in order to cool and slow the jet exhaust before it exits the nozzle. The effectiveness of such a noise suppression device must be evaluated in the presence of an HSCT wing high-lift system before definitive assessments can be made concerning its acoustic performance. In addition, these noise suppressors must provide the required acoustic attenuation while not degrading the thrust efficiency of the propulsion system or the aerodynamic performance of the high-lift devices on the wing. Therefore, the main objective of the HEAT program is to demonstrate these technologies and understand their interactions on a large-scale HSCT model. The HEAT program is a collaborative effort between NASA-Ames, Boeing Commercial Airplane Group, Douglas Aircraft Corp., Lockheed-Georgia, General Electric and NASA-Lewis. The suppressor nozzles used in the tests were Generation 1 2-D mixer-ejector nozzles made by General Electric. The model used was a 13.5%-scale semi-span model of a Boeing Reference H configuration.

  7. Price schedules coordination for electricity pool markets

    NASA Astrophysics Data System (ADS)

    Legbedji, Alexis Motto

    2002-04-01

    We consider the optimal coordination of a class of mathematical programs with equilibrium constraints, which is formally interpreted as a resource-allocation problem. Many decomposition techniques have been proposed to circumvent the difficulty of solving large systems with limited computer resources. The considerable improvement in computer architecture has allowed the solution of large-scale problems with increasing speed. Consequently, interest in decomposition techniques has waned. Nonetheless, there is an important class of applications for which decomposition techniques will still be relevant, among others, distributed systems---the Internet, perhaps, being the most conspicuous example---and competitive economic systems. Conceptually, a competitive economic system is a collection of agents that have similar or different objectives while sharing the same system resources. In theory, such systems of agents can be optimized by constructing a large-scale mathematical program and solving it centrally using currently available computing power. In practice, however, because agents are self-interested and not willing to reveal sensitive corporate data, one cannot solve these kinds of coordination problems by simply maximizing the sum of the agents' objective functions with respect to their constraints. An iterative price decomposition or Lagrangian dual method is considered best suited because it can operate with limited information. A price-directed strategy, however, can only work successfully when coordinating or equilibrium prices exist, which is not generally the case when a duality gap is unavoidable. Showing when such prices exist and how to compute them is the main subject of this thesis. Among our results, we show that, if the Lagrangian function of a primal program is additively separable, price schedules coordination may be attained. The prices are Lagrange multipliers, and are also the decision variables of a dual program.
In addition, we propose a new form of augmented or nonlinear pricing, which is an example of the use of penalty functions in mathematical programming. Applications are drawn from mathematical programming problems of the form arising in electric power system scheduling under competition.
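
    The price-directed coordination discussed in this abstract can be sketched on a toy separable problem: two agents share one resource, and a coordinator adjusts the price (a Lagrange multiplier) by subgradient ascent until total demand meets capacity. This is a generic illustration of Lagrangian dual decomposition, not the thesis's formulation:

```python
def agent_demand(price, a):
    """Agent maximizes a*log(x) - price*x, so optimal x = a / price."""
    return a / price

def coordinate(capacity, coeffs, price=1.0, step=0.05, iters=2000):
    """Subgradient ascent on the dual: raise the price when the shared
    resource is over-demanded, lower it when under-demanded."""
    for _ in range(iters):
        demand = sum(agent_demand(price, a) for a in coeffs)
        price = max(1e-6, price + step * (demand - capacity))
    return price

# Two agents with utilities 2*log(x1) and 3*log(x2), capacity 10.
# The market-clearing price is (2+3)/10 = 0.5, allocations 4 and 6.
p = coordinate(capacity=10.0, coeffs=[2.0, 3.0])
print(round(p, 3))  # ≈ 0.5
```

With strictly concave (here logarithmic) agent objectives there is no duality gap, which is exactly the condition under which such a price-directed scheme can succeed.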

  8. Measurement of the integrated luminosities of cross-section scan data samples around the ψ(3770) mass region

    NASA Astrophysics Data System (ADS)

    Ablikim, M.; Achasov, M. N.; Ahmed, S.; Albrecht, M.; Alekseev, M.; Amoroso, A.; An, F. F.; An, Q.; Bai, Y.; Bakina, O.; Baldini Ferroli, R.; Ban, Y.; Begzsuren, K.; Bennett, D. W.; Bennett, J. V.; Berger, N.; Bertani, M.; Bettoni, D.; Bianchi, F.; Boger, E.; Boyko, I.; Briere, R. A.; Cai, H.; Cai, X.; Cakir, O.; Calcaterra, A.; Cao, G. F.; Cetin, S. A.; Chai, J.; Chang, J. F.; Chang, W. L.; Chelkov, G.; Chen, G.; Chen, H. S.; Chen, J. C.; Chen, M. L.; Chen, P. L.; Chen, S. J.; Chen, X. R.; Chen, Y. B.; Chu, X. K.; Cibinetto, G.; Cossio, F.; Dai, H. L.; Dai, J. P.; Dbeyssi, A.; Dedovich, D.; Deng, Z. Y.; Denig, A.; Denysenko, I.; Destefanis, M.; De Mori, F.; Ding, Y.; Dong, C.; Dong, J.; Dong, L. Y.; Dong, M. Y.; Dou, Z. L.; Du, S. X.; Duan, P. F.; Fang, J.; Fang, S. S.; Fang, Y.; Farinelli, R.; Fava, L.; Fegan, S.; Feldbauer, F.; Felici, G.; Feng, C. Q.; Fioravanti, E.; Fritsch, M.; Fu, C. D.; Gao, Q.; Gao, X. L.; Gao, Y.; Gao, Y. G.; Gao, Z.; Garillon, B.; Garzia, I.; Gilman, A.; Goetzen, K.; Gong, L.; Gong, W. X.; Gradl, W.; Greco, M.; Gu, L. M.; Gu, M. H.; Gu, Y. T.; Guo, A. Q.; Guo, L. B.; Guo, R. P.; Guo, Y. P.; Guskov, A.; Haddadi, Z.; Han, S.; Hao, X. Q.; Harris, F. A.; He, K. L.; He, X. Q.; Heinsius, F. H.; Held, T.; Heng, Y. K.; Holtmann, T.; Hou, Z. L.; Hu, H. M.; Hu, J. F.; Hu, T.; Hu, Y.; Huang, G. S.; Huang, J. S.; Huang, X. T.; Huang, X. Z.; Huang, Z. L.; Hussain, T.; Ikegami Andersson, W.; Irshad, M.; Ji, Q.; Ji, Q. P.; Ji, X. B.; Ji, X. L.; Jiang, X. S.; Jiang, X. Y.; Jiao, J. B.; Jiao, Z.; Jin, D. P.; Jin, S.; Jin, Y.; Johansson, T.; Julin, A.; Kalantar-Nayestanaki, N.; Kang, X. S.; Kavatsyuk, M.; Ke, B. C.; Khan, T.; Khoukaz, A.; Kiese, P.; Kliemt, R.; Koch, L.; Kolcu, O. B.; Kopf, B.; Kornicer, M.; Kuemmel, M.; Kuessner, M.; Kupsc, A.; Kurth, M.; Kühn, W.; Lange, J. S.; Lara, M.; Larin, P.; Lavezzi, L.; Leiber, S.; Leithoff, H.; Li, C.; Li, Cheng; Li, D. M.; Li, F.; Li, F. Y.; Li, G.; Li, H. B.; Li, H. J.; Li, J. C.; Li, J. W.; Li, K. 
J.; Li, Kang; Li, Ke; Li, Lei; Li, P. L.; Li, P. R.; Li, Q. Y.; Li, T.; Li, W. D.; Li, W. G.; Li, X. L.; Li, X. N.; Li, X. Q.; Li, Z. B.; Liang, H.; Liang, Y. F.; Liang, Y. T.; Liao, G. R.; Liao, L. Z.; Libby, J.; Lin, C. X.; Lin, D. X.; Liu, B.; Liu, B. J.; Liu, C. X.; Liu, D.; Liu, D. Y.; Liu, F. H.; Liu, Fang; Liu, Feng; Liu, H. B.; Liu, H. L.; Liu, H. M.; Liu, Huanhuan; Liu, Huihui; Liu, J. B.; Liu, J. Y.; Liu, K.; Liu, K. Y.; Liu, Ke; Liu, L. D.; Liu, Q.; Liu, S. B.; Liu, X.; Liu, Y. B.; Liu, Z. A.; Liu, Zhiqing; Long, Y. F.; Lou, X. C.; Lu, H. J.; Lu, J. G.; Lu, Y.; Lu, Y. P.; Luo, C. L.; Luo, M. X.; Luo, X. L.; Lusso, S.; Lyu, X. R.; Ma, F. C.; Ma, H. L.; Ma, L. L.; Ma, M. M.; Ma, Q. M.; Ma, X. N.; Ma, X. Y.; Ma, Y. M.; Maas, F. E.; Maggiora, M.; Malik, Q. A.; Mangoni, A.; Mao, Y. J.; Mao, Z. P.; Marcello, S.; Meng, Z. X.; Messchendorp, J. G.; Mezzadri, G.; Min, J.; Min, T. J.; Mitchell, R. E.; Mo, X. H.; Mo, Y. J.; Morales Morales, C.; Morello, G.; Muchnoi, N. Yu; Muramatsu, H.; Mustafa, A.; Nakhoul, S.; Nefedov, Y.; Nerling, F.; Nikolaev, I. B.; Ning, Z.; Nisar, S.; Niu, S. L.; Niu, X. Y.; Olsen, S. L.; Ouyang, Q.; Pacetti, S.; Pan, Y.; Papenbrock, M.; Patteri, P.; Pelizaeus, M.; Pellegrino, J.; Peng, H. P.; Peng, Z. Y.; Peters, K.; Pettersson, J.; Ping, J. L.; Ping, R. G.; Pitka, A.; Poling, R.; Prasad, V.; Qi, H. R.; Qi, M.; Qi, T. Y.; Qian, S.; Qiao, C. F.; Qin, N.; Qin, X. S.; Qin, Z. H.; Qiu, J. F.; Rashid, K. H.; Redmer, C. F.; Richter, M.; Ripka, M.; Rolo, M.; Rong, G.; Rosner, Ch.; Ruan, X. D.; Sarantsev, A.; Savrié, M.; Schnier, C.; Schoenning, K.; Shan, W.; Shan, X. Y.; Shao, M.; Shen, C. P.; Shen, P. X.; Shen, X. Y.; Sheng, H. Y.; Shi, X.; Song, J. J.; Song, W. M.; Song, X. Y.; Sosio, S.; Sowa, C.; Spataro, S.; Sun, G. X.; Sun, J. F.; Sun, L.; Sun, S. S.; Sun, X. H.; Sun, Y. J.; Sun, Y. K.; Sun, Y. Z.; Sun, Z. J.; Sun, Z. T.; Tan, Y. T.; Tang, C. J.; Tang, G. Y.; Tang, X.; Tapan, I.; Tiemens, M.; Tsednee, B.; Uman, I.; Varner, G. 
S.; Wang, B.; Wang, B. L.; Wang, C. W.; Wang, D.; Wang, D. Y.; Wang, Dan; Wang, K.; Wang, L. L.; Wang, L. S.; Wang, M.; Wang, Meng; Wang, P.; Wang, P. L.; Wang, W. P.; Wang, X. F.; Wang, Y.; Wang, Y. F.; Wang, Y. Q.; Wang, Z.; Wang, Z. G.; Wang, Z. Y.; Wang, Zongyuan; Weber, T.; Wei, D. H.; Weidenkaff, P.; Wen, S. P.; Wiedner, U.; Wolke, M.; Wu, L. H.; Wu, L. J.; Wu, Z.; Xia, L.; Xia, X.; Xia, Y.; Xiao, D.; Xiao, Y. J.; Xiao, Z. J.; Xie, Y. G.; Xie, Y. H.; Xiong, X. A.; Xiu, Q. L.; Xu, G. F.; Xu, J. J.; Xu, L.; Xu, Q. J.; Xu, Q. N.; Xu, X. P.; Yan, F.; Yan, L.; Yan, W. B.; Yan, W. C.; Yan, Y. H.; Yang, H. J.; Yang, H. X.; Yang, L.; Yang, S. L.; Yang, Y. H.; Yang, Y. X.; Yang, Yifan; Ye, M.; Ye, M. H.; Yin, J. H.; You, Z. Y.; Yu, B. X.; Yu, C. X.; Yu, J. S.; Yuan, C. Z.; Yuan, Y.; Yuncu, A.; Zafar, A. A.; Zallo, A.; Zeng, Y.; Zeng, Z.; Zhang, B. X.; Zhang, B. Y.; Zhang, C. C.; Zhang, D. H.; Zhang, H. H.; Zhang, H. Y.; Zhang, J.; Zhang, J. L.; Zhang, J. Q.; Zhang, J. W.; Zhang, J. Y.; Zhang, J. Z.; Zhang, K.; Zhang, L.; Zhang, S. F.; Zhang, T. J.; Zhang, X. Y.; Zhang, Y.; Zhang, Y. H.; Zhang, Y. T.; Zhang, Yang; Zhang, Yao; Zhang, Yu; Zhang, Z. H.; Zhang, Z. P.; Zhang, Z. Y.; Zhao, G.; Zhao, J. W.; Zhao, J. Y.; Zhao, J. Z.; Zhao, Lei; Zhao, Ling; Zhao, M. G.; Zhao, Q.; Zhao, S. J.; Zhao, T. C.; Zhao, Y. B.; Zhao, Z. G.; Zhemchugov, A.; Zheng, B.; Zheng, J. P.; Zheng, W. J.; Zheng, Y. H.; Zhong, B.; Zhou, L.; Zhou, Q.; Zhou, X.; Zhou, X. K.; Zhou, X. R.; Zhou, X. Y.; Zhu, A. N.; Zhu, J.; Zhu, J.; Zhu, K.; Zhu, K. J.; Zhu, S.; Zhu, S. H.; Zhu, X. L.; Zhu, Y. C.; Zhu, Y. S.; Zhu, Z. A.; Zhuang, J.; Zou, B. S.; Zou, J. H.; BESIII Collaboration

    2018-05-01

    To investigate the nature of the ψ(3770) resonance and to measure the cross section for e⁺e⁻ → DD̄, a cross-section scan data sample, distributed among 41 center-of-mass energy points from 3.73 to 3.89 GeV, was taken with the BESIII detector operated at the BEPCII collider in the year 2010. By analyzing the large-angle Bhabha scattering events, we measure the integrated luminosity of the data sample at each center-of-mass energy point. The total integrated luminosity of the data sample is 76.16 ± 0.04 ± 0.61 pb⁻¹, where the first uncertainty is statistical and the second systematic. Supported by National Key Basic Research Program of China (2015CB856700), National Natural Science Foundation of China (NSFC) (11235011, 11335008, 11425524, 11625523, 11635010), the Chinese Academy of Sciences (CAS) Large-Scale Scientific Facility Program, the CAS Center for Excellence in Particle Physics (CCEPP), Joint Large-Scale Scientific Facility Funds of the NSFC and CAS (U1332201, U1532257, U1532258), CAS Key Research Program of Frontier Sciences (QYZDJ-SSW-SLH003, QYZDJ-SSW-SLH040), 100 Talents Program of CAS, National 1000 Talents Program of China, INPAC and Shanghai Key Laboratory for Particle Physics and Cosmology, German Research Foundation DFG under Contracts Nos. Collaborative Research Center CRC 1044 and FOR 2359, Istituto Nazionale di Fisica Nucleare, Italy, Koninklijke Nederlandse Akademie van Wetenschappen (KNAW) (530-4CDP03), Ministry of Development of Turkey (DPT2006K-120470), National Science and Technology fund, The Swedish Research Council, U. S. Department of Energy (DE-FG02-05ER41374, DE-SC-0010118, DE-SC-0010504, DE-SC-0012069), University of Groningen (RuG) and the Helmholtzzentrum fuer Schwerionenforschung GmbH (GSI), Darmstadt, WCU Program of National Research Foundation of Korea (R32-2008-000-10155-0)
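
    The luminosity determination above rests on a simple relation: integrated luminosity is the number of selected Bhabha events divided by the product of the selection efficiency and the QED-calculable Bhabha cross section, L = N / (ε·σ). A Python sketch with invented numbers (not BESIII's actual event counts, efficiencies, or cross sections):

```python
def integrated_luminosity(n_obs, cross_section_nb, efficiency):
    """L = N / (eps * sigma); returns pb^-1 given sigma in nb."""
    sigma_pb = cross_section_nb * 1000.0  # 1 nb = 1000 pb
    return n_obs / (efficiency * sigma_pb)

# Illustrative values only: 1.47e6 selected Bhabha events, an
# effective cross section of 147 nb, and 80% selection efficiency.
lumi = integrated_luminosity(1.47e6, 147.0, 0.80)
print(round(lumi, 2), "pb^-1")  # 12.5 pb^-1
```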

  9. An introduction to data reduction: space-group determination, scaling and intensity statistics.

    PubMed

    Evans, Philip R

    2011-04-01

    This paper presents an overview of how to run the CCP4 programs for data reduction (SCALA, POINTLESS and CTRUNCATE) through the CCP4 graphical interface ccp4i and points out some issues that need to be considered, together with a few examples. It covers determination of the point-group symmetry of the diffraction data (the Laue group), which is required for the subsequent scaling step, examination of systematic absences, which in many cases will allow inference of the space group, putting multiple data sets on a common indexing system when there are alternatives, the scaling step itself, which produces a large set of data-quality indicators, estimation of |F| from intensity and finally examination of intensity statistics to detect crystal pathologies such as twinning. An appendix outlines the scoring schemes used by the program POINTLESS to assign probabilities to possible Laue and space groups.
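
    Among the data-quality indicators the scaling step reports is the merging R factor over symmetry-equivalent observations. A minimal sketch of the standard formula, R_merge = Σ_hkl Σ_i |I_i − ⟨I⟩| / Σ_hkl Σ_i I_i, on toy intensities (not SCALA output):

```python
def r_merge(groups):
    """R_merge = sum_hkl sum_i |I_i - <I>| / sum_hkl sum_i I_i,
    where each group holds the observations of one unique reflection."""
    num = den = 0.0
    for obs in groups:
        mean_i = sum(obs) / len(obs)
        num += sum(abs(i - mean_i) for i in obs)
        den += sum(obs)
    return num / den

# Three unique reflections, each measured several times
reflections = [[100.0, 104.0, 96.0], [50.0, 52.0], [10.0, 9.0, 11.0]]
print(round(r_merge(reflections), 4))  # 0.0278
```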

  10. NASA's computer science research program

    NASA Technical Reports Server (NTRS)

    Larsen, R. L.

    1983-01-01

    Following a major assessment of NASA's computing technology needs, a new program of computer science research has been initiated by the Agency. The program includes work in concurrent processing, management of large scale scientific databases, software engineering, reliable computing, and artificial intelligence. The program is driven by applications requirements in computational fluid dynamics, image processing, sensor data management, real-time mission control and autonomous systems. It consists of university research, in-house NASA research, and NASA's Research Institute for Advanced Computer Science (RIACS) and Institute for Computer Applications in Science and Engineering (ICASE). The overall goal is to provide the technical foundation within NASA to exploit advancing computing technology in aerospace applications.

  11. Three pillars for achieving quantum mechanical molecular dynamics simulations of huge systems: Divide-and-conquer, density-functional tight-binding, and massively parallel computation.

    PubMed

    Nishizawa, Hiroaki; Nishimura, Yoshifumi; Kobayashi, Masato; Irle, Stephan; Nakai, Hiromi

    2016-08-05

    The linear-scaling divide-and-conquer (DC) quantum chemical methodology is applied to the density-functional tight-binding (DFTB) theory to develop a massively parallel program that achieves on-the-fly molecular reaction dynamics simulations of huge systems from scratch. The functions to perform large-scale geometry optimization and molecular dynamics on the DC-DFTB potential energy surface are implemented in the program called DC-DFTB-K. A novel interpolation-based algorithm is developed for parallelizing the determination of the Fermi level in the DC method. The performance of the DC-DFTB-K program is assessed using a laboratory computer and the K computer. Numerical tests show the high efficiency of the DC-DFTB-K program: a single-point energy gradient calculation of a one-million-atom system is completed within 60 s using 7,290 nodes of the K computer. © 2016 Wiley Periodicals, Inc.
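
    The Fermi-level determination that the authors parallelize can be illustrated with a much simpler serial scheme. The sketch below is a stand-in for the paper's interpolation-based algorithm, using invented mock orbital energies: it bisects for the chemical potential μ at which the summed Fermi-Dirac occupations equal the electron count.

    ```python
    import numpy as np

    def fermi_occupation(eps, mu, kT=0.01):
        # Numerically safe form of the logistic 1/(1 + exp((eps - mu)/kT)).
        return 0.5 * (1.0 - np.tanh(0.5 * (eps - mu) / kT))

    def find_fermi_level(eps, n_elec, kT=0.01, tol=1e-10):
        """Bisect for mu such that the total occupation equals n_elec.

        The occupation sum is continuous and monotonically increasing
        in mu, so bisection on a bracketing interval always converges.
        """
        lo, hi = eps.min() - 1.0, eps.max() + 1.0
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            if fermi_occupation(eps, mid, kT).sum() < n_elec:
                lo = mid
            else:
                hi = mid
        return 0.5 * (lo + hi)

    rng = np.random.default_rng(1)
    levels = np.sort(rng.normal(size=200))     # mock orbital energies
    mu = find_fermi_level(levels, n_elec=100)  # half filling
    print(round(fermi_occupation(levels, mu).sum(), 4))  # ≈ 100.0
    ```

    In the DC method this search must be done over subsystem eigenvalues gathered across many nodes, which is why the authors replace plain bisection with an interpolation-based parallel algorithm.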

  12. Supercomputer optimizations for stochastic optimal control applications

    NASA Technical Reports Server (NTRS)

    Chung, Siu-Leung; Hanson, Floyd B.; Xu, Huihuang

    1991-01-01

    Supercomputer optimizations for a computational method of solving stochastic, multibody, dynamic programming problems are presented. The computational method is valid for a general class of optimal control problems that are nonlinear, multibody dynamical systems, perturbed by general Markov noise in continuous time, i.e., nonsmooth Gaussian as well as jump Poisson random white noise. Optimization techniques for vector multiprocessors or vectorizing supercomputers include advanced data structures, loop restructuring, loop collapsing, blocking, and compiler directives. These advanced computing techniques and supercomputing hardware help alleviate Bellman's curse of dimensionality in dynamic programming computations by permitting the solution of large multibody problems. Possible applications include lumped flight dynamics models for uncertain environments, such as large scale and background random aerospace fluctuations.
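
    Bellman's curse of dimensionality, which the abstract cites as the motivation for these optimizations, is easy to see in a toy value-iteration code. The sketch below is illustrative only (the cost, dynamics, and grid are invented, not the paper's model): it solves a discretized 1-D stochastic control problem with discrete Markov noise, and an n-dimensional analogue of the same grid would need 41**n states.

    ```python
    import numpy as np

    # Value iteration for a discretized 1-D stochastic control problem:
    # minimize discounted expected cost for dynamics x' = x + u + noise.
    states = np.linspace(-1.0, 1.0, 41)
    actions = np.linspace(-0.2, 0.2, 5)
    noise = np.array([-0.05, 0.0, 0.05])       # discrete Markov noise values
    p_noise = np.array([0.25, 0.5, 0.25])      # their probabilities
    gamma = 0.95                               # discount factor

    def grid_index(x):
        """Map a continuous state to a grid index (insertion point, clamped)."""
        return np.clip(np.searchsorted(states, x), 0, len(states) - 1)

    V = np.zeros(len(states))
    for _ in range(200):                       # Bellman backups to convergence
        Q = np.empty((len(states), len(actions)))
        for j, u in enumerate(actions):
            nxt = grid_index(states[:, None] + u + noise[None, :])
            # Stage cost x^2 + u^2 plus discounted expected cost-to-go.
            Q[:, j] = states**2 + u**2 + gamma * (V[nxt] @ p_noise)
        V = Q.min(axis=1)

    # The curse of dimensionality: an n-dimensional version of this table
    # has 41**n entries, so memory and work grow exponentially with n.
    print(V[grid_index(0.0)] < V[grid_index(1.0)])  # cost is lowest near the origin
    ```

    Even this tiny example shows why the vectorization and data-structure techniques in the paper matter: the inner Bellman backup is exactly the kind of dense, regular loop nest that supercomputer hardware accelerates well.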

  13. Skills of U.S. Unemployed, Young, and Older Adults in Sharper Focus: Results from the Program for the International Assessment of Adult Competencies (PIAAC) 2012/2014. First Look. NCES 2016-039rev

    ERIC Educational Resources Information Center

    Rampey, Bobby D.; Finnegan, Robert; Mohadjer, Leyla; Krenzke, Tom; Hogan, Jacquie; Provasnik, Stephen

    2016-01-01

    The Program for the International Assessment of Adult Competencies (PIAAC) is a cyclical, large-scale study of adult skills and life experiences focusing on education and employment. Nationally representative samples of adults between the ages of 16 and 65 are administered an assessment of literacy, numeracy, and problem solving in technology-rich…

  14. A Microcomputer-Based Program for Printing Check Plots of Integrated Circuits Specified in Caltech Intermediate Form.

    DTIC Science & Technology

    1984-12-01

    only four transistors[5]. Each year since that time, the semiconductor industry has consistently improved the quality of the fabrication techniques...rarely took place at universities and was almost exclusively confined to industry. IC design techniques were developed, tested, and taught only in the...community, it is not uncommon for industry to borrow ideas and even particular programs from these university-designed tools. The Very Large Scale Integration

  15. Conference on Fire Resistant Materials: A compilation of presentations and papers

    NASA Technical Reports Server (NTRS)

    Kourtides, D. A. (Editor); Johnson, G. A. (Editor)

    1979-01-01

    The proceedings of the NASA Fire Resistant Materials Engineering (FIREMEN) Program conference held at Boeing Commercial Airplane Company, Seattle, Washington, on March 1-2, 1979 are reported. The purpose of the conference was to discuss the results of research by the National Aeronautics and Space Administration in the field of aircraft fire safety and fire-resistant materials. The program topics include the following: (1) large-scale testing; (2) fire toxicology; (3) polymeric materials; and (4) fire modeling.

  16. Optimization by nonhierarchical asynchronous decomposition

    NASA Technical Reports Server (NTRS)

    Shankar, Jayashree; Ribbens, Calvin J.; Haftka, Raphael T.; Watson, Layne T.

    1992-01-01

    Large scale optimization problems are tractable only if they are somehow decomposed. Hierarchical decompositions are inappropriate for some types of problems and do not parallelize well. Sobieszczanski-Sobieski has proposed a nonhierarchical decomposition strategy for nonlinear constrained optimization that is naturally parallel. Despite some successes on engineering problems, the algorithm as originally proposed fails on simple two-dimensional quadratic programs. The algorithm is carefully analyzed for quadratic programs, and a number of modifications are suggested to improve its robustness.

  17. Nonproliferation and Threat Reduction Assistance: U.S. Programs in the Former Soviet Union

    DTIC Science & Technology

    2008-03-26

    reconfigure its large-scale former BW-related facilities so that they can perform peaceful research on issues such as infectious diseases. For FY2004, the Bush...program to eliminate its plutonium, opting instead for the construction of fast breeder reactors that could burn plutonium directly for energy production...The United States might not fund this effort, as many in the United States argue that breeder reactors, which produce more plutonium than they

  18. A role for shellfish aquaculture in coastal nitrogen management.

    PubMed

    Rose, Julie M; Bricker, Suzanne B; Tedesco, Mark A; Wikfors, Gary H

    2014-01-01

    Excess nutrients in the coastal environment have been linked to a host of environmental problems, and nitrogen reduction efforts have been a top priority of resource managers for decades. The use of shellfish for coastal nitrogen remediation has been proposed, but formal incorporation into nitrogen management programs is lagging. Including shellfish aquaculture in existing nitrogen management programs makes sense from environmental, economic, and social perspectives, but challenges must be overcome for large-scale implementation to be possible.

  19. Literacy, Numeracy, and Problem Solving in Technology-Rich Environments among U.S. Adults: Results from the Program for the International Assessment of Adult Competencies 2012. First Look. NCES 2014-008

    ERIC Educational Resources Information Center

    Goodman, Madeline; Finnegan, Robert; Mohadjer, Leyla; Krenzke, Tom; Hogan, Jacquie

    2013-01-01

    The Program for the International Assessment of Adult Competencies (PIAAC) is a cyclical, large-scale study of adult skills and life experiences focusing on education and employment that was developed and organized by the Organization for Economic Cooperation and Development (OECD). In the United States, the study was conducted in 2011-12 with a…

  20. Video Monitoring a Simulation-Based Quality Improvement Program in Bihar, India.

    PubMed

    Dyer, Jessica; Spindler, Hilary; Christmas, Amelia; Shah, Malay Bharat; Morgan, Melissa; Cohen, Susanna R; Sterne, Jason; Mahapatra, Tanmay; Walker, Dilys

    2018-04-01

    Simulation-based training has become an accepted clinical training andragogy in high-resource settings, with its use increasing in low-resource settings. Video recordings of simulated scenarios are commonly used by facilitators. Beyond using the videos during debrief sessions, researchers can also analyze the simulation videos to quantify technical and nontechnical skills during simulated scenarios over time. Little is known about the feasibility and use of large-scale systems to video record and analyze simulation and debriefing data for monitoring and evaluation in low-resource settings. This manuscript describes the process of designing and implementing a large-scale video monitoring system. Mentees and mentors were consented, and all simulations and debriefs conducted at 320 Primary Health Centers (PHCs) were video recorded. The system design, number of video recordings, and inter-rater reliability of the coded videos were assessed. The final dataset included a total of 11,278 videos. Overall, a total of 2,124 simulation videos were coded and 183 (12%) were blindly double-coded. For the double-coded sample, the average inter-rater reliability (IRR) scores were 80% for nontechnical skills and 94% for clinical technical skills. Among 4,450 long debrief videos received, 216 were selected for coding and all were double-coded. Data quality of simulation videos was found to be very good in terms of recorded instances of "unable to see" and "unable to hear" in Phases 1 and 2. This study demonstrates that video monitoring systems can be effectively implemented at scale in resource-limited settings. Further, video monitoring systems can play several vital roles within program implementation, including monitoring and evaluation, provision of actionable feedback to program implementers, and assurance of program fidelity.
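
    The abstract reports IRR percentages without naming the statistic used; a minimal sketch of simple percent agreement, a common choice for double-coded categorical data like this, is:

    ```python
    def percent_agreement(coder_a, coder_b):
        """Simple inter-rater reliability: fraction of items coded identically."""
        if len(coder_a) != len(coder_b):
            raise ValueError("both coders must rate the same items")
        matches = sum(a == b for a, b in zip(coder_a, coder_b))
        return matches / len(coder_a)

    # Hypothetical codes for five video segments from two blinded coders.
    a = ["yes", "yes", "no", "yes", "no"]
    b = ["yes", "no",  "no", "yes", "no"]
    print(percent_agreement(a, b))  # 0.8
    ```

    Percent agreement does not correct for chance; chance-corrected statistics such as Cohen's kappa are often preferred when category prevalence is skewed.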
