Sample records for large scale research

  1. Research on the Construction Management and Sustainable Development of Large-Scale Scientific Facilities in China

    NASA Astrophysics Data System (ADS)

    Guiquan, Xi; Lin, Cong; Xuehui, Jin

    2018-05-01

    As an important platform for scientific and technological development, large-scale scientific facilities are the cornerstone of technological innovation and a guarantee for economic and social development. Research on the management of large-scale scientific facilities can play a key role in scientific research, sociology, and national strategy. This paper reviews the characteristics of large-scale scientific facilities and summarizes the development status of China's large-scale scientific facilities. Finally, the construction, management, operation, and evaluation of large-scale scientific facilities are analyzed from the perspective of sustainable development.

  2. Large-Scale 3D Printing: The Way Forward

    NASA Astrophysics Data System (ADS)

    Jassmi, Hamad Al; Najjar, Fady Al; Ismail Mourad, Abdel-Hamid

    2018-03-01

    Research on small-scale 3D printing has rapidly evolved, where numerous industrial products have been tested and successfully applied. Nonetheless, research on large-scale 3D printing, directed to large-scale applications such as construction and automotive manufacturing, still demands a great deal of effort. Large-scale 3D printing is considered an interdisciplinary topic and requires establishing a blended knowledge base from numerous research fields including structural engineering, materials science, mechatronics, software engineering, artificial intelligence and architectural engineering. This review article summarizes key topics of relevance to new research trends on large-scale 3D printing, particularly pertaining to (1) technological solutions of additive construction (i.e. the 3D printers themselves), (2) materials science challenges, and (3) new design opportunities.

  3. Side effects of problem-solving strategies in large-scale nutrition science: towards a diversification of health.

    PubMed

    Penders, Bart; Vos, Rein; Horstman, Klasien

    2009-11-01

    Solving complex problems in large-scale research programmes requires cooperation and division of labour. Simultaneously, large-scale problem solving also gives rise to unintended side effects. Based upon 5 years of researching two large-scale nutrigenomic research programmes, we argue that problems are fragmented in order to be solved. These sub-problems are given priority for practical reasons and in the process of solving them, various changes are introduced in each sub-problem. Combined with additional diversity as a result of interdisciplinarity, this makes reassembling the original and overall goal of the research programme less likely. In the case of nutrigenomics and health, this produces a diversification of health. As a result, the public health goal of contemporary nutrition science is not reached in the large-scale research programmes we studied. Large-scale research programmes are very successful in producing scientific publications and new knowledge; however, in reaching their political goals they often are less successful.

  4. The Relevancy of Large-Scale, Quantitative Methodologies in Middle Grades Education Research

    ERIC Educational Resources Information Center

    Mertens, Steven B.

    2006-01-01

    This article examines the relevancy of large-scale, quantitative methodologies in middle grades education research. Based on recommendations from national advocacy organizations, the need for more large-scale, quantitative research, combined with the application of more rigorous methodologies, is presented. Subsequent sections describe and discuss…

  5. Grid-Enabled Quantitative Analysis of Breast Cancer

    DTIC Science & Technology

    2010-10-01

    large-scale, multi-modality computerized image analysis. The central hypothesis of this research is that large-scale image analysis for breast cancer...research, we designed a pilot study utilizing large-scale parallel Grid computing harnessing nationwide infrastructure for medical image analysis. Also

  6. Managing Risk and Uncertainty in Large-Scale University Research Projects

    ERIC Educational Resources Information Center

    Moore, Sharlissa; Shangraw, R. F., Jr.

    2011-01-01

    Both publicly and privately funded research projects managed by universities are growing in size and scope. Complex, large-scale projects (over $50 million) pose new management challenges and risks for universities. This paper explores the relationship between project success and a variety of factors in large-scale university projects. First, we…

  7. Aquatic Plant Control Research Program. Large-Scale Operations Management Test of Use of the White Amur for Control of Problem Aquatic Plants. Report 5. Synthesis Report.

    DTIC Science & Technology

    1984-06-01

    Scanned report front matter; recoverable information: Technical Report A-78-2, Large-Scale Operations Management Test of Use of the White Amur for Control of Problem Aquatic Plants, Report 5: Synthesis Report, prepared by the Army Engineer Waterways Experiment Station, Vicksburg, MS, for the U.S. Army Corps of Engineers, Washington, DC.

  8. The Need for Large-Scale, Longitudinal Empirical Studies in Middle Level Education Research

    ERIC Educational Resources Information Center

    Mertens, Steven B.; Caskey, Micki M.; Flowers, Nancy

    2016-01-01

    This essay describes and discusses the ongoing need for large-scale, longitudinal, empirical research studies focused on middle grades education. After a statement of the problem and concerns, the essay describes and critiques several prior middle grades efforts and research studies. Recommendations for future research efforts to inform policy…

  9. Cloud computing for genomic data analysis and collaboration.

    PubMed

    Langmead, Ben; Nellore, Abhinav

    2018-04-01

    Next-generation sequencing has made major strides in the past decade. Studies based on large sequencing data sets are growing in number, and public archives for raw sequencing data have been doubling in size every 18 months. Leveraging these data requires researchers to use large-scale computational resources. Cloud computing, a model whereby users rent computers and storage from large data centres, is a solution that is gaining traction in genomics research. Here, we describe how cloud computing is used in genomics for research and large-scale collaborations, and argue that its elasticity, reproducibility and privacy features make it ideally suited for the large-scale reanalysis of publicly available archived data, including privacy-protected data.

  10. Lessons from a Large-Scale Assessment: Results from Conceptual Inventories

    ERIC Educational Resources Information Center

    Thacker, Beth; Dulli, Hani; Pattillo, Dave; West, Keith

    2014-01-01

    We report conceptual inventory results of a large-scale assessment project at a large university. We studied the introduction of materials and instructional methods informed by physics education research (PER) (physics education research-informed materials) into a department where most instruction has previously been traditional and a significant…

  11. Icing Simulation Research Supporting the Ice-Accretion Testing of Large-Scale Swept-Wing Models

    NASA Technical Reports Server (NTRS)

    Yadlin, Yoram; Monnig, Jaime T.; Malone, Adam M.; Paul, Bernard P.

    2018-01-01

    The work summarized in this report is a continuation of NASA's Large-Scale, Swept-Wing Test Articles Fabrication; Research and Test Support for NASA IRT contract (NNC10BA05-NNC14TA36T) performed by Boeing under the NASA Research and Technology for Aerospace Propulsion Systems (RTAPS) contract. In the study conducted under RTAPS, a series of icing tests in the Icing Research Tunnel (IRT) was conducted to characterize ice formations on large-scale swept wings representative of modern commercial transport airplanes. The outcome of that campaign was a large database of ice-accretion geometries that can be used for subsequent aerodynamic evaluation in other experimental facilities and for validation of ice-accretion prediction codes.

  12. Raising Concerns about Sharing and Reusing Large-Scale Mathematics Classroom Observation Video Data

    ERIC Educational Resources Information Center

    Ing, Marsha; Samkian, Artineh

    2018-01-01

    There are great opportunities and challenges to sharing large-scale mathematics classroom observation data. This Research Commentary describes the methodological opportunities and challenges and provides a specific example from a mathematics education research project to illustrate how the research questions and framework drove observational…

  13. Investigating and Stimulating Primary Teachers' Attitudes Towards Science: Summary of a Large-Scale Research Project

    ERIC Educational Resources Information Center

    Walma van der Molen, Juliette; van Aalderen-Smeets, Sandra

    2013-01-01

    Attention to the attitudes of primary teachers towards science is of fundamental importance to research on primary science education. The current article describes a large-scale research project that aims to overcome three main shortcomings in attitude research, i.e. lack of a strong theoretical concept of attitude, methodological flaws in…

  14. Embarking on large-scale qualitative research: reaping the benefits of mixed methods in studying youth, clubs and drugs

    PubMed Central

    Hunt, Geoffrey; Moloney, Molly; Fazio, Adam

    2012-01-01

    Qualitative research is often conceptualized as inherently small-scale research, primarily conducted by a lone researcher enmeshed in extensive and long-term fieldwork or involving in-depth interviews with a small sample of 20 to 30 participants. In the study of illicit drugs, traditionally this has often been in the form of ethnographies of drug-using subcultures. Such small-scale projects have produced important interpretive scholarship that focuses on the culture and meaning of drug use in situated, embodied contexts. Larger-scale projects are often assumed to be solely the domain of quantitative researchers, using formalistic survey methods and descriptive or explanatory models. In this paper, however, we will discuss qualitative research done on a comparatively larger scale—with in-depth qualitative interviews with hundreds of young drug users. Although this work incorporates some quantitative elements into the design, data collection, and analysis, the qualitative dimension and approach has nevertheless remained central. Larger-scale qualitative research shares some of the challenges and promises of smaller-scale qualitative work including understanding drug consumption from an emic perspective, locating hard-to-reach populations, developing rapport with respondents, generating thick descriptions and a rich analysis, and examining the wider socio-cultural context as a central feature. However, there are additional challenges specific to the scale of qualitative research, which include data management, data overload and problems of handling large-scale data sets, time constraints in coding and analyzing data, and personnel issues including training, organizing and mentoring large research teams. Yet large samples can prove to be essential for enabling researchers to conduct comparative research, whether that be cross-national research within a wider European perspective undertaken by different teams or cross-cultural research looking at internal divisions and differences within diverse communities and cultures. PMID:22308079

  15. Finite Mixture Multilevel Multidimensional Ordinal IRT Models for Large Scale Cross-Cultural Research

    ERIC Educational Resources Information Center

    de Jong, Martijn G.; Steenkamp, Jan-Benedict E. M.

    2010-01-01

    We present a class of finite mixture multilevel multidimensional ordinal IRT models for large scale cross-cultural research. Our model is proposed for confirmatory research settings. Our prior for item parameters is a mixture distribution to accommodate situations where different groups of countries have different measurement operations, while…
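
    The building block behind ordinal IRT models of this kind is the graded response model, in which the probability of responding in category k or higher is a logistic function of the latent trait. The Python sketch below shows only that building block, not the paper's finite mixture multilevel structure; the item parameters are invented for illustration.

    # Minimal sketch of the ordinal (graded response) IRT building block:
    # P(Y >= k | theta) = logistic(a * (theta - b_k)), and category
    # probabilities are differences of adjacent cumulative terms.  This is
    # not the paper's finite mixture multilevel model; parameters are made up.
    import numpy as np

    def grm_category_probs(theta, a, b):
        """Category probabilities for one item with ascending thresholds b."""
        cum = 1.0 / (1.0 + np.exp(-a * (theta - np.asarray(b))))  # P(Y >= k)
        cum = np.concatenate(([1.0], cum, [0.0]))                 # pad k=0 and k=K
        return -np.diff(cum)                                      # P(Y = k)

    probs = grm_category_probs(theta=0.3, a=1.2, b=[-1.0, 0.0, 1.5])
    print(probs, probs.sum())  # four category probabilities summing to 1.0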

  16. Large-Scale Wind Turbine Testing in the NASA 24.4-m (80-ft) by 36.6-m (120-ft) Wind Tunnel

    NASA Technical Reports Server (NTRS)

    Zell, Peter T.; Imprexia, Cliff (Technical Monitor)

    2000-01-01

    The 80- by 120-Foot Wind Tunnel at NASA Ames Research Center in California provides a unique capability to test large-scale wind turbines under controlled conditions. This special capability is now available for domestic and foreign entities wishing to test large-scale wind turbines. The presentation will focus on facility capabilities to perform wind turbine tests and typical research objectives for this type of testing.

  17. Condition Monitoring of Large-Scale Facilities

    NASA Technical Reports Server (NTRS)

    Hall, David L.

    1999-01-01

    This document provides a summary of the research conducted for the NASA Ames Research Center under grant NAG2-1182 (Condition-Based Monitoring of Large-Scale Facilities). The information includes copies of view graphs presented at NASA Ames in the final Workshop (held during December of 1998), as well as a copy of a technical report provided to the COTR (Dr. Anne Patterson-Hine) subsequent to the workshop. The material describes the experimental design, collection of data, and analysis results associated with monitoring the health of large-scale facilities. In addition to this material, a copy of the Pennsylvania State University Applied Research Laboratory data fusion visual programming tool kit was also provided to NASA Ames researchers.

  18. Aquatic Plant Control Research Program. Large-Scale Operations Management Test of Use of the White Amur for Control of Problem Aquatic Plants. Reports 2 and 3. First and Second Year Poststocking Results. Volume 5. The Herpetofauna of Lake Conway, Florida: Community Analysis.

    DTIC Science & Technology

    1983-07-01

    Scanned report front matter; recoverable information: Technical Report A-78-2, Large-Scale Operations Management Test of Use of the White Amur for Control of Problem Aquatic Plants, prepared by the Army Engineer Waterways Experiment Station, P.O. Box 631, Vicksburg, Miss. 39180, for the Aquatic Plant Control Research Program.

  19. Integrating land and resource management plans and applied large-scale research on two national forests

    Treesearch

    Callie Jo Schweitzer; Stacy Clark; Glen Gaines; Paul Finke; Kurt Gottschalk; David Loftis

    2008-01-01

    Researchers working out of the Southern and Northern Research Stations have partnered with two National Forests to conduct two large-scale studies designed to assess the effectiveness of silvicultural techniques used to restore and maintain upland oak (Quercus spp.)-dominated ecosystems in the Cumberland Plateau Region of the southeastern United...

  20. Critical Issues in Large-Scale Assessment: A Resource Guide.

    ERIC Educational Resources Information Center

    Redfield, Doris

    The purpose of this document is to provide practical guidance and support for the design, development, and implementation of large-scale assessment systems that are grounded in research and best practice. Information is included about existing large-scale testing efforts, including national testing programs, state testing programs, and…

  1. Measured acoustic characteristics of ducted supersonic jets at different model scales

    NASA Technical Reports Server (NTRS)

    Jones, R. R., III; Ahuja, K. K.; Tam, Christopher K. W.; Abdelwahab, M.

    1993-01-01

    A large-scale (about a 25x enlargement) model of the Georgia Tech Research Institute (GTRI) hardware was installed and tested in the Propulsion Systems Laboratory of the NASA Lewis Research Center. Acoustic measurements made in these two facilities are compared and the similarity in acoustic behavior over the scale range under consideration is highlighted. The study provides acoustic data over a relatively large scale range that may be used to demonstrate the validity of the scaling methods employed in the investigation of this phenomenon.
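
    When jet noise spectra are compared across model scales, the comparison is normally made at a fixed Strouhal number St = f·D/U_jet, so that a geometrically enlarged nozzle at matched jet conditions shifts the same spectral feature down in frequency by the scale factor. The Python sketch below illustrates that frequency shift for a nominal 25x enlargement; the numbers are illustrative and are not the GTRI/NASA Lewis test conditions.

    # Strouhal-number frequency scaling between model scales: at matched jet
    # conditions, f_large = f_small / scale_factor for the same St = f*D/U.
    # The 25x factor follows the enlargement quoted above; the frequencies
    # are illustrative, not measured values from the tests.
    def scaled_frequency(f_small_hz, scale_factor=25.0):
        return f_small_hz / scale_factor

    for f in (2000.0, 8000.0, 20000.0):
        print(f"{f:>8.0f} Hz (small scale) -> {scaled_frequency(f):>6.0f} Hz (large scale)")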

  2. Review of Dynamic Modeling and Simulation of Large Scale Belt Conveyor System

    NASA Astrophysics Data System (ADS)

    He, Qing; Li, Hong

    Belt conveyors are among the most important devices for transporting bulk-solid material over long distances. Dynamic analysis is the key to deciding whether a design is technically rational, safe and reliable in operation, and economically feasible. Studying dynamic properties is therefore essential for improving efficiency and productivity and for guaranteeing safe, reliable, and stable conveyor operation. The dynamic research on, and applications of, large-scale belt conveyors are discussed, and the main research topics and the state of the art of dynamic research on belt conveyors are analyzed. The main future work will focus on dynamic analysis, modeling, and simulation of the main components and the whole system, as well as nonlinear modeling, simulation, and vibration analysis of large-scale conveyor systems.

  3. A novel computational approach towards the certification of large-scale boson sampling

    NASA Astrophysics Data System (ADS)

    Huh, Joonsuk

    Recent proposals for boson sampling and the corresponding experiments point to a possible disproof of the extended Church-Turing thesis. Furthermore, the application of boson sampling to molecular computation has been suggested theoretically. Until now, however, only small-scale experiments with a few photons have been successfully performed. Boson sampling experiments with 20-30 photons are expected to reveal the computational superiority of the quantum device. A novel theoretical proposal for large-scale boson sampling using microwave photons is highly promising due to the deterministic photon sources and the scalability. Therefore, a certification protocol for large-scale boson sampling experiments should be presented to complete the exciting story. We propose, in this presentation, a computational protocol towards the certification of large-scale boson sampling. The correlations of paired photon modes and the time-dependent characteristic functional with its Fourier component can show the fingerprint of large-scale boson sampling. This work was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education, Science and Technology (NRF-2015R1A6A3A04059773), the ICT R&D program of MSIP/IITP [2015-019, Fundamental Research Toward Secure Quantum Communication] and the Mueunjae Institute for Chemistry (MIC) postdoctoral fellowship.

  4. Cohort Profile of the Goals Study: A Large-Scale Research of Physical Activity in Dutch Students

    ERIC Educational Resources Information Center

    de Groot, Renate H. M.; van Dijk, Martin L.; Kirschner, Paul A.

    2015-01-01

    The GOALS study (Grootschalig Onderzoek naar Activiteiten van Limburgse Scholieren [Large-scale Research of Activities in Dutch Students]) was set up to investigate possible associations between different forms of physical activity and inactivity with cognitive performance, academic achievement and mental well-being. It was conducted at a…

  5. Causal Inferences with Large Scale Assessment Data: Using a Validity Framework

    ERIC Educational Resources Information Center

    Rutkowski, David; Delandshere, Ginette

    2016-01-01

    To answer the calls for stronger evidence by the policy community, educational researchers and their associated organizations increasingly demand more studies that can yield causal inferences. International large scale assessments (ILSAs) have been targeted as rich data sources for causal research. It is in this context that we take up a…

  6. Inexpensive Tools To Quantify And Map Vegetative Cover For Large-Scale Research Or Management Decisions.

    USDA-ARS?s Scientific Manuscript database

    Vegetative cover can be quantified quickly and consistently and often at lower cost with image analysis of color digital images than with visual assessments. Image-based mapping of vegetative cover for large-scale research and management decisions can now be considered with the accuracy of these met...

  7. Using Practitioner Inquiry within and against Large-Scale Educational Reform

    ERIC Educational Resources Information Center

    Hines, Mary Beth; Conner-Zachocki, Jennifer

    2015-01-01

    This research study examines the impact of teacher research on participants in a large-scale educational reform initiative in the United States, No Child Left Behind, and its strand for reading teachers, Reading First. Reading First supported professional development for teachers in order to increase student scores on standardized tests. The…

  8. Framing Innovation: Does an Instructional Vision Help Superintendents Gain Acceptance for a Large-Scale Technology Initiative?

    ERIC Educational Resources Information Center

    Flanagan, Gina E.

    2014-01-01

    There is limited research that outlines how a superintendent's instructional vision can help to gain acceptance of a large-scale technology initiative. This study explored how superintendents gain acceptance for a large-scale technology initiative (specifically a 1:1 device program) through various leadership actions. The role of the instructional…

  9. Potential for geophysical experiments in large scale tests.

    USGS Publications Warehouse

    Dieterich, J.H.

    1981-01-01

    Potential research applications for large-specimen geophysical experiments include measurements of the scale dependence of physical parameters and examination of interactions with heterogeneities, especially flaws such as cracks. In addition, increased specimen size provides opportunities for improved recording resolution and greater control of experimental variables. Large-scale experiments using a special purpose low stress (100 MPa)…

  10. A Combined Ethical and Scientific Analysis of Large-scale Tests of Solar Climate Engineering

    NASA Astrophysics Data System (ADS)

    Ackerman, T. P.

    2017-12-01

    Our research group recently published an analysis of the combined ethical and scientific issues surrounding large-scale testing of stratospheric aerosol injection (SAI; Lenferna et al., 2017, Earth's Future). We are expanding this study in two directions. The first is extending this same analysis to other geoengineering techniques, particularly marine cloud brightening (MCB). MCB has substantial differences to SAI in this context because MCB can be tested over significantly smaller areas of the planet and, following injection, has a much shorter lifetime of weeks as opposed to years for SAI. We examine issues such as the role of intent, the lesser of two evils, and the nature of consent. In addition, several groups are currently considering climate engineering governance tools such as a code of ethics and a registry. We examine how these tools might influence climate engineering research programs and, specifically, large-scale testing. The second direction of expansion is asking whether ethical and scientific issues associated with large-scale testing are so significant that they effectively preclude moving ahead with climate engineering research and testing. Some previous authors have suggested that no research should take place until these issues are resolved. We think this position is too draconian and consider a more nuanced version of this argument. We note, however, that there are serious questions regarding the ability of the scientific research community to move to the point of carrying out large-scale tests.

  11. Performance of lap splices in large-scale column specimens affected by ASR and/or DEF.

    DOT National Transportation Integrated Search

    2012-06-01

    This research program conducted a large experimental program, which consisted of the design, construction, curing, deterioration, and structural load testing of 16 large-scale column specimens with a critical lap splice region, and then compared ...

  12. Large Scale Survey Data in Career Development Research

    ERIC Educational Resources Information Center

    Diemer, Matthew A.

    2008-01-01

    Large scale survey datasets have been underutilized but offer numerous advantages for career development scholars, as they contain numerous career development constructs with large and diverse samples that are followed longitudinally. Constructs such as work salience, vocational expectations, educational expectations, work satisfaction, and…

  13. Wetlands as large-scale nature-based solutions: status and future challenges for research and management

    NASA Astrophysics Data System (ADS)

    Thorslund, Josefin; Jarsjö, Jerker; Destouni, Georgia

    2017-04-01

    Wetlands are often considered as nature-based solutions that can provide a multitude of services of great social, economic and environmental value to humankind. The services may include recreation, greenhouse gas sequestration, contaminant retention, coastal protection, groundwater level and soil moisture regulation, flood regulation and biodiversity support. Changes in land-use, water use and climate can all impact wetland functions and occur at scales extending well beyond the local scale of an individual wetland. However, in practical applications, management decisions usually regard and focus on individual wetland sites and local conditions. To understand the potential usefulness and services of wetlands as larger-scale nature-based solutions, e.g. for mitigating negative impacts from large-scale change pressures, one needs to understand the combined function of multiple wetlands at the relevant large scales. We here systematically investigate if and to what extent research so far has addressed the large-scale dynamics of landscape systems with multiple wetlands, which are likely to be relevant for understanding impacts of regional to global change. Our investigation regards key changes and impacts of relevance for nature-based solutions, such as large-scale nutrient and pollution retention, flow regulation and coastal protection. Although such large-scale knowledge is still limited, evidence suggests that the aggregated functions and effects of multiple wetlands in the landscape can differ considerably from those observed at individual wetlands. Such scale differences may have important implications for wetland function-effect predictability and management under large-scale change pressures and impacts, such as those of climate change.

  14. State of the Art in Large-Scale Soil Moisture Monitoring

    NASA Technical Reports Server (NTRS)

    Ochsner, Tyson E.; Cosh, Michael Harold; Cuenca, Richard H.; Dorigo, Wouter; Draper, Clara S.; Hagimoto, Yutaka; Kerr, Yan H.; Larson, Kristine M.; Njoku, Eni Gerald; Small, Eric E.

    2013-01-01

    Soil moisture is an essential climate variable influencing land atmosphere interactions, an essential hydrologic variable impacting rainfall runoff processes, an essential ecological variable regulating net ecosystem exchange, and an essential agricultural variable constraining food security. Large-scale soil moisture monitoring has advanced in recent years creating opportunities to transform scientific understanding of soil moisture and related processes. These advances are being driven by researchers from a broad range of disciplines, but this complicates collaboration and communication. For some applications, the science required to utilize large-scale soil moisture data is poorly developed. In this review, we describe the state of the art in large-scale soil moisture monitoring and identify some critical needs for research to optimize the use of increasingly available soil moisture data. We review representative examples of 1) emerging in situ and proximal sensing techniques, 2) dedicated soil moisture remote sensing missions, 3) soil moisture monitoring networks, and 4) applications of large-scale soil moisture measurements. Significant near-term progress seems possible in the use of large-scale soil moisture data for drought monitoring. Assimilation of soil moisture data for meteorological or hydrologic forecasting also shows promise, but significant challenges related to model structures and model errors remain. Little progress has been made yet in the use of large-scale soil moisture observations within the context of ecological or agricultural modeling. Opportunities abound to advance the science and practice of large-scale soil moisture monitoring for the sake of improved Earth system monitoring, modeling, and forecasting.

  15. Grid-Enabled Quantitative Analysis of Breast Cancer

    DTIC Science & Technology

    2009-10-01

    large-scale, multi-modality computerized image analysis. The central hypothesis of this research is that large-scale image analysis for breast cancer...pilot study to utilize large-scale parallel Grid computing to harness the nationwide cluster infrastructure for optimization of medical image ... analysis parameters. Additionally, we investigated the use of cutting-edge data analysis/mining techniques as applied to Ultrasound, FFDM, and DCE-MRI Breast

  16. Awareness and Concern about Large-Scale Livestock and Poultry: Results from a Statewide Survey of Ohioans

    ERIC Educational Resources Information Center

    Sharp, Jeff; Tucker, Mark

    2005-01-01

    The development of large-scale livestock facilities has become a controversial issue in many regions of the U.S. in recent years. In this research, rural-urban differences in familiarity and concern about large-scale livestock facilities among Ohioans is examined as well as the relationship of social distance from agriculture and trust in risk…

  17. Higher Education Teachers' Descriptions of Their Own Learning: A Large-Scale Study of Finnish Universities of Applied Sciences

    ERIC Educational Resources Information Center

    Töytäri, Aija; Piirainen, Arja; Tynjälä, Päivi; Vanhanen-Nuutinen, Liisa; Mäki, Kimmo; Ilves, Vesa

    2016-01-01

    In this large-scale study, higher education teachers' descriptions of their own learning were examined with qualitative analysis involving application of principles of phenomenographic research. This study is unique: it is unusual to use large-scale data in qualitative studies. The data were collected through an e-mail survey sent to 5960 teachers…

  18. Examining the Emergence of Large-Scale Structures in Collaboration Networks: Methods in Sociological Analysis

    ERIC Educational Resources Information Center

    Ghosh, Jaideep; Kshitij, Avinash

    2017-01-01

    This article introduces a number of methods that can be useful for examining the emergence of large-scale structures in collaboration networks. The study contributes to sociological research by investigating how clusters of research collaborators evolve and sometimes percolate in a collaboration network. Typically, we find that in our networks,…

  19. Influencing Public School Policy in the United States: The Role of Large-Scale Assessments

    ERIC Educational Resources Information Center

    Schmidt, William H.; Burroughs, Nathan A.

    2016-01-01

    The authors review the influence of state, national and international large-scale assessments (LSAs) on education policy and research. They distinguish between two main uses of LSAs: as a means for conducting research that informs educational reform and LSAs as a tool for implementing standards and enforcing accountability. The authors discuss the…

  20. Architectural Optimization of Digital Libraries

    NASA Technical Reports Server (NTRS)

    Biser, Aileen O.

    1998-01-01

    This work investigates performance and scaling issues relevant to large scale distributed digital libraries. Presently, performance and scaling studies focus on specific implementations of production or prototype digital libraries. Although useful information is gained to aid these designers and other researchers with insights to performance and scaling issues, the broader issues relevant to very large scale distributed libraries are not addressed. Specifically, no current studies look at the extreme or worst case possibilities in digital library implementations. A survey of digital library research issues is presented. Scaling and performance issues are mentioned frequently in the digital library literature but are generally not the focus of much of the current research. In this thesis a model for a Generic Distributed Digital Library (GDDL) and nine cases of typical user activities are defined. This model is used to facilitate some basic analysis of scaling issues. Specifically, the calculation of Internet traffic generated for different configurations of the study parameters and an estimate of the future bandwidth needed for a large scale distributed digital library implementation. This analysis demonstrates the potential impact a future distributed digital library implementation would have on the Internet traffic load and raises questions concerning the architecture decisions being made for future distributed digital library designs.
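
    The kind of Internet-traffic estimate described in the thesis can be sketched as a back-of-envelope product of user population, request rates, and response sizes for a few user-activity cases. The Python sketch below is purely illustrative: the case names and all parameter values are hypothetical, not the nine cases or parameters defined in the GDDL model.

    # Back-of-envelope traffic estimate: traffic ~= users * requests per user
    # * bytes per response, summed over user-activity cases.  All case names
    # and numbers are hypothetical placeholders, not the GDDL study values.
    CASES = {
        # name: (requests per user per day, mean response size in MB)
        "metadata search": (20, 0.05),
        "full-text retrieval": (5, 2.0),
        "page-image retrieval": (3, 10.0),
    }

    def daily_traffic_gb(users, cases=CASES):
        total_mb = sum(users * n_req * size_mb for n_req, size_mb in cases.values())
        return total_mb / 1024.0

    for users in (1e4, 1e6, 1e8):  # small to extreme scale
        print(f"{users:>12,.0f} users -> {daily_traffic_gb(users):,.0f} GB/day")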

  1. A Novel Architecture of Large-scale Communication in IoT

    NASA Astrophysics Data System (ADS)

    Ma, Wubin; Deng, Su; Huang, Hongbin

    2018-03-01

    In recent years, many scholars have done a great deal of research on the development of the Internet of Things (IoT) and networked physical systems. However, few have described the large-scale communication architecture of the IoT in detail. In fact, the non-uniform technology between IPv6 and access points has led to a lack of broad principles for large-scale communication architectures. Therefore, this paper presents the Uni-IPv6 Access and Information Exchange Method (UAIEM), a new architecture and algorithm that addresses large-scale communication in the IoT.

  2. Statistical analysis of mesoscale rainfall: Dependence of a random cascade generator on large-scale forcing

    NASA Technical Reports Server (NTRS)

    Over, Thomas, M.; Gupta, Vijay K.

    1994-01-01

    Under the theory of independent and identically distributed random cascades, the probability distribution of the cascade generator determines the spatial and the ensemble properties of spatial rainfall. Three sets of radar-derived rainfall data in space and time are analyzed to estimate the probability distribution of the generator. A detailed comparison between instantaneous scans of spatial rainfall and simulated cascades using the scaling properties of the marginal moments is carried out. This comparison highlights important similarities and differences between the data and the random cascade theory. Differences are quantified and measured for the three datasets. Evidence is presented to show that the scaling properties of the rainfall can be captured to the first order by a random cascade with a single parameter. The dependence of this parameter on forcing by the large-scale meteorological conditions, as measured by the large-scale spatial average rain rate, is investigated for these three datasets. The data show that this dependence can be captured by a one-to-one function. Since the large-scale average rain rate can be diagnosed from the large-scale dynamics, this relationship demonstrates an important linkage between the large-scale atmospheric dynamics and the statistical cascade theory of mesoscale rainfall. Potential application of this research to parameterization of runoff from the land surface and regional flood frequency analysis is briefly discussed, and open problems for further research are presented.
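
    A discrete multiplicative cascade with a one-parameter generator of the kind referred to above can be sketched in a few lines. The Python example below uses the standard "beta model" generator, in which each subdivision weight is zero with probability 1 - b^(-beta) and equals b^beta otherwise, so the mean is preserved; the parameter values and grid size are illustrative and are not calibrated to the radar data sets analyzed in the paper.

    # Sketch of a 2-D discrete multiplicative random cascade with the
    # single-parameter "beta model" generator: P[W = 0] = 1 - b**-beta and
    # P[W = b**beta] = b**-beta, so E[W] = 1.  Values are illustrative only.
    import numpy as np

    def beta_cascade(levels=6, beta=0.4, mean_rain=1.0, seed=None):
        rng = np.random.default_rng(seed)
        b = 4.0  # branching number for a 2x2 subdivision of each cell
        field = np.array([[mean_rain]])
        for _ in range(levels):
            field = np.kron(field, np.ones((2, 2)))  # refine the grid
            w = np.where(rng.random(field.shape) < b**-beta, b**beta, 0.0)
            field = field * w  # multiply by an independent generator per cell
        return field

    rain = beta_cascade(levels=6, beta=0.4, seed=1)
    print(rain.shape, rain.mean())  # 64x64 field; ensemble mean stays near 1.0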

  3. The Rights and Responsibility of Test Takers When Large-Scale Testing Is Used for Classroom Assessment

    ERIC Educational Resources Information Center

    van Barneveld, Christina; Brinson, Karieann

    2017-01-01

    The purpose of this research was to identify conflicts in the rights and responsibility of Grade 9 test takers when some parts of a large-scale test are marked by teachers and used in the calculation of students' class marks. Data from teachers' questionnaires and students' questionnaires from a 2009-10 administration of a large-scale test of…

  4. ECOLOGICAL RESEARCH IN THE LARGE-SCALE BIOSPHERE–ATMOSPHERE EXPERIMENT IN AMAZONIA: EARLY RESULTS.

    Treesearch

    M. Keller; A. Alencar; G. P. Asner; B. Braswell; M. Bustamente; E. Davidson; T. Feldpausch; E. Fernandes; M. Goulden; P. Kabat; B. Kruijt; F. Luizao; S. Miller; D. Markewitz; A. D. Nobre; C. A. Nobre; N. Priante Filho; H. Rocha; P. Silva Dias; C. von Randow; G. L. Vourlitis

    2004-01-01

    The Large-scale Biosphere–Atmosphere Experiment in Amazonia (LBA) is a multinational, interdisciplinary research program led by Brazil. Ecological studies in LBA focus on how tropical forest conversion, regrowth, and selective logging influence carbon storage, nutrient dynamics, trace gas fluxes, and the prospect for sustainable land use in the Amazon region. Early...

  5. Prospective and Retrospective Studies of Substance Abuse Treatment Outcomes: Methods and Results of Four Large-Scale Follow-Up Studies.

    ERIC Educational Resources Information Center

    Gerstein, Dean R.; Johnson, Robert A.

    This report compares the research methods, provider and patient characteristics, and outcome results from four large-scale followup studies of drug treatment during the 1990s: (1) the California Drug and Alcohol Treatment Assessment (CALDATA); (2) Services Research Outcomes Study (SROS); (3) National Treatment Improvement Evaluation Study (NTIES);…

  6. Self-Report Measures of the Home Learning Environment in Large Scale Research: Measurement Properties and Associations with Key Developmental Outcomes

    ERIC Educational Resources Information Center

    Niklas, Frank; Nguyen, Cuc; Cloney, Daniel S.; Tayler, Collette; Adams, Raymond

    2016-01-01

    Favourable home learning environments (HLEs) support children's literacy, numeracy and social development. In large-scale research, HLE is typically measured by self-report survey, but there is little consistency between studies and many different items and latent constructs are observed. Little is known about the stability of these items and…

  7. Secondary Analysis and Large-Scale Assessments. Monograph in the Faculty of Education Research Seminar and Workshop Series.

    ERIC Educational Resources Information Center

    Tobin, Kenneth; Fraser, Barry J.

    Large scale assessments of educational progress can be useful tools to judge the effectiveness of educational programs and assessments. This document contains papers presented at the research seminar on this topic held at the Western Australian Institute of Technology in November, 1984. It is the fifth in a series of publications of papers…

  8. Analogue scale modelling of extensional tectonic processes using a large state-of-the-art centrifuge

    NASA Astrophysics Data System (ADS)

    Park, Heon-Joon; Lee, Changyeol

    2017-04-01

    Analogue scale modelling of extensional tectonic processes such as rifting and basin opening has been conducted numerous times. Among the controlling factors, the gravitational acceleration (g) on the scale models was regarded as a constant (Earth's gravity) in most analogue model studies, and only a few model studies considered larger gravitational acceleration by using a centrifuge (an apparatus generating large centrifugal force by rotating the model at high speed). Although analogue models using a centrifuge allow a large scale-down factor and accelerated deformation driven by density differences, such as in salt diapirs, the possible model size is mostly limited to about 10 cm. A state-of-the-art centrifuge installed at the KOCED Geotechnical Centrifuge Testing Center, Korea Advanced Institute of Science and Technology (KAIST), allows a large surface area of the scale models, up to 70 by 70 cm, under the maximum capacity of 240 g-tons. Using the centrifuge, we will conduct analogue scale modelling of extensional tectonic processes such as the opening of back-arc basins. Acknowledgement: This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education (grant number 2014R1A6A3A04056405).
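
    The similitude rules behind centrifuge modelling are standard: at a g-level N, model lengths are prototype lengths divided by N, stresses at homologous points match, and diffusion-controlled processes run N^2 times faster. The short Python sketch below applies those rules and checks a payload against the 240 g-ton capacity quoted above; the prototype length and payload mass are made-up examples, not values from the planned experiments.

    # Standard centrifuge similitude: lengths scale by 1/N at a g-level N,
    # stresses at homologous points match, diffusion times scale by 1/N**2.
    # The 240 g-ton limit is the capacity quoted in the abstract; the inputs
    # below are hypothetical examples.
    def centrifuge_scaling(prototype_length_m, g_level, payload_tonnes):
        model_length_m = prototype_length_m / g_level      # geometric scaling
        time_factor = 1.0 / g_level**2                     # consolidation time
        within_capacity = g_level * payload_tonnes <= 240.0
        return model_length_m, time_factor, within_capacity

    print(centrifuge_scaling(prototype_length_m=70.0, g_level=100, payload_tonnes=1.5))
    # -> (0.7, 0.0001, True): a 70 m prototype becomes a 0.7 m model at 100 g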

  9. Comparative Effectiveness Research and Children With Cerebral Palsy: Identifying a Conceptual Framework and Specifying Measures.

    PubMed

    Gannotti, Mary E; Law, Mary; Bailes, Amy F; OʼNeil, Margaret E; Williams, Uzma; DiRezze, Briano

    2016-01-01

    A step toward advancing research about rehabilitation service associated with positive outcomes for children with cerebral palsy is consensus about a conceptual framework and measures. A Delphi process was used to establish consensus among clinicians and researchers in North America. Directors of large pediatric rehabilitation centers, clinicians from large hospitals, and researchers with expertise in outcomes participated (N = 18). Andersen's model of health care utilization framed outcomes: consumer satisfaction, activity, participation, quality of life, and pain. Measures agreed upon included Participation and Environment Measure for Children and Youth, Measure of Processes of Care, PEDI-CAT, KIDSCREEN-10, PROMIS Pediatric Pain Interference Scale, Visual Analog Scale for pain intensity, PROMIS Global Health Short Form, Family Environment Scale, Family Support Scale, and functional classification levels for gross motor, manual ability, and communication. Universal forms for documenting service use are needed. Findings inform clinicians and researchers concerned with outcome assessment.

  10. Mining the Mind Research Network: A Novel Framework for Exploring Large Scale, Heterogeneous Translational Neuroscience Research Data Sources

    PubMed Central

    Bockholt, Henry J.; Scully, Mark; Courtney, William; Rachakonda, Srinivas; Scott, Adam; Caprihan, Arvind; Fries, Jill; Kalyanam, Ravi; Segall, Judith M.; de la Garza, Raul; Lane, Susan; Calhoun, Vince D.

    2009-01-01

    A neuroinformatics (NI) system is critical to brain imaging research in order to shorten the time between study conception and results. Such an NI system is required to scale well when large numbers of subjects are studied. Further, when multiple sites participate in research projects, organizational issues become increasingly difficult. Optimized NI applications mitigate these problems. Additionally, NI software enables coordination across multiple studies, leveraging advantages potentially leading to exponential research discoveries. The web-based Mind Research Network (MRN) database system has been designed and improved through our experience with 200 research studies and 250 researchers from seven different institutions. The MRN tools permit the collection, management, reporting and efficient use of large scale, heterogeneous data sources, e.g., multiple institutions, multiple principal investigators, multiple research programs and studies, and multimodal acquisitions. We have collected and analyzed data sets on thousands of research participants and have set up a framework to automatically analyze the data, thereby making efficient, practical data mining of this vast resource possible. This paper presents a comprehensive framework for capturing and analyzing heterogeneous neuroscience research data sources that has been fully optimized for end-users to perform novel data mining. PMID:20461147

  11. Banana production systems: identification of alternative systems for more sustainable production.

    PubMed

    Bellamy, Angelina Sanderson

    2013-04-01

    Large-scale, monoculture production systems dependent on synthetic fertilizers and pesticides, increase yields, but are costly and have deleterious impacts on human health and the environment. This research investigates variations in banana production practices in Costa Rica, to identify alternative systems that combine high productivity and profitability, with reduced reliance on agrochemicals. Farm workers were observed during daily production activities; 39 banana producers and 8 extension workers/researchers were interviewed; and a review of field experiments conducted by the National Banana Corporation between 1997 and 2002 was made. Correspondence analysis showed that there is no structured variation in large-scale banana producers' practices, but two other banana production systems were identified: a small-scale organic system and a small-scale conventional coffee-banana intercropped system. Field-scale research may reveal ways that these practices can be scaled up to achieve a productive and profitable system producing high-quality export bananas with fewer or no pesticides.

  12. Large Scale Density Estimation of Blue and Fin Whales (LSD)

    DTIC Science & Technology

    2014-09-30

    McDonald, M. A., Hildebrand, J. A., and Mesnick, S. (2009). Worldwide decline in tonal frequencies of blue whale songs. Endangered Species Research 9 ... estimating blue and fin whale density that is effective over large spatial scales and is designed to cope with spatial variation in animal density utilizing

  13. Botswana water and surface energy balance research program. Part 2: Large scale moisture and passive microwaves

    NASA Technical Reports Server (NTRS)

    Vandegriend, A. A.; Owe, M.; Chang, A. T. C.

    1992-01-01

    The Botswana water and surface energy balance research program was developed to study and evaluate the integrated use of multispectral satellite remote sensing for monitoring the hydrological status of the Earth's surface. The research program consisted of two major, mutually related components: a surface energy balance modeling component, built around an extensive field campaign, and a passive microwave research component, which consisted of a retrospective study of large-scale moisture conditions and Nimbus scanning multichannel microwave radiometer signatures. The integrated approach of both components is explained in general and activities performed within the passive microwave research component are summarized. The microwave theory is discussed taking into account: soil dielectric constant, emissivity, soil roughness effects, vegetation effects, optical depth, single scattering albedo, and wavelength effects. The study site is described. The soil moisture data and their processing are considered. The relation between observed large-scale soil moisture and normalized brightness temperatures is discussed. Vegetation characteristics and inverse modeling of soil emissivity are considered.
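
    The core physical link exploited in such passive microwave work is that wetter soil has a higher dielectric constant, hence higher reflectivity, lower emissivity, and lower brightness temperature (TB ≈ emissivity × soil temperature for a smooth bare surface). The Python sketch below illustrates that chain with a deliberately simple linear dielectric mixing assumption and the nadir Fresnel reflectivity; it omits the roughness, vegetation, optical depth, and wavelength effects discussed in the report.

    # Illustrative smooth-surface, nadir-view soil moisture -> brightness
    # temperature chain: linear dielectric mixing (an assumption, not the
    # report's model), Fresnel reflectivity at normal incidence, TB = e * T.
    import numpy as np

    def brightness_temperature(soil_moisture, t_soil_k=300.0):
        eps = 3.0 + 22.0 * np.asarray(soil_moisture)            # assumed mixing
        r = np.abs((1 - np.sqrt(eps)) / (1 + np.sqrt(eps)))**2  # Fresnel, nadir
        return (1.0 - r) * t_soil_k                             # emissivity * T

    for sm in (0.05, 0.15, 0.30):
        print(f"soil moisture {sm:.2f} -> TB = {brightness_temperature(sm):.1f} K")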

  14. IKONOS imagery for the Large Scale Biosphere–Atmosphere Experiment in Amazonia (LBA).

    Treesearch

    George Hurtt; Xiangming Xiao; Michael Keller; Michael Palace; Gregory P. Asner; Rob Braswell; Eduardo S. Brondízio; Manoel Cardoso; Claudio J.R. Carvalho; Matthew G. Fearon; Liane Guild; Steve Hagen; Scott Hetrick; Berrien Moore III; Carlos Nobre; Jane M. Read; Tatiana Sá; Annette Schloss; George Vourlitis; Albertus J. Wickel

    2003-01-01

    The LBA-ECO program is one of several international research components under the Brazilian-led Large Scale Biosphere–Atmosphere Experiment in Amazonia (LBA). The field-oriented research activities of this study are organized along transects and include a set of primary field sites, where the major objective is to study land-use change and ecosystem dynamics, and a...

  15. Research-Based Recommendations for the Use of Accommodations in Large-Scale Assessments: 2012 Update. Practical Guidelines for the Education of English Language Learners. Book 4

    ERIC Educational Resources Information Center

    Kieffer, Michael J.; Rivera, Mabel; Francis, David J.

    2012-01-01

    This report presents results from a new quantitative synthesis of research on the effectiveness and validity of test accommodations for English language learners (ELLs) taking large-scale assessments. In 2006, the Center on Instruction published a review of the literature on test accommodations for ELLs titled "Practical Guidelines for the…

  16. Potential utilization of the NASA/George C. Marshall Space Flight Center in earthquake engineering research

    NASA Technical Reports Server (NTRS)

    Scholl, R. E. (Editor)

    1979-01-01

    Earthquake engineering research capabilities of the National Aeronautics and Space Administration (NASA) facilities at George C. Marshall Space Flight Center (MSFC), Alabama, were evaluated. The results indicate that the NASA/MSFC facilities and supporting capabilities offer unique opportunities for conducting earthquake engineering research. Specific features that are particularly attractive for large scale static and dynamic testing of natural and man-made structures include the following: large physical dimensions of buildings and test bays; high loading capacity; wide range and large number of test equipment and instrumentation devices; multichannel data acquisition and processing systems; technical expertise for conducting large-scale static and dynamic testing; sophisticated techniques for systems dynamics analysis, simulation, and control; and capability for managing large-size and technologically complex programs. Potential uses of the facilities for near and long term test programs to supplement current earthquake research activities are suggested.

  17. Robust Control of Multivariable and Large Scale Systems.

    DTIC Science & Technology

    1986-03-14

    Scanned report front matter; recoverable information: report prepared by J. C. Doyle et al., Honeywell Systems and Research Center, Minneapolis, MN, for the Air Force Office of Scientific Research.

  18. Aquatic Plant Control Research Program. Large-Scale Operations Management Test (LSOMT) of Insects and Pathogens for Control of Waterhyacinth in Louisiana. Volume 1. Results for 1979-1981.

    DTIC Science & Technology

    1985-01-01

    Scanned report front matter; recoverable information: prepared by the Army Engineer Waterways Experiment Station, PO Box 631, Vicksburg, Mississippi 39180-0631, and the University of Tennessee-Chattanooga, Chattanooga, for the Aquatic Plant Control Research Program. Keywords: aquatic plant control, Louisiana, biological control.

  19. Scaling properties of European research units

    PubMed Central

    Jamtveit, Bjørn; Jettestuen, Espen; Mathiesen, Joachim

    2009-01-01

    A quantitative characterization of the scale-dependent features of research units may provide important insight into how such units are organized and how they grow. The relative importance of top-down versus bottom-up controls on their growth may be revealed by their scaling properties. Here we show that the number of support staff in Scandinavian research units, ranging in size from 20 to 7,800 staff members, is related to the number of academic staff by a power law. The scaling exponent of ≈1.30 is broadly consistent with a simple hierarchical model of the university organization. Similar scaling behavior between small and large research units with a wide range of ambitions and strategies argues against top-down control of the growth. Top-down effects, and externally imposed effects from changing political environments, can be observed as fluctuations around the main trend. The observed scaling law implies that cost-benefit arguments for merging research institutions into larger and larger units may have limited validity unless the productivity per academic staff and/or the quality of the products are considerably higher in larger institutions. Despite the hierarchical structure of most large-scale research units in Europe, the network structures represented by the academic component of such units are strongly antihierarchical and suboptimal for efficient communication within individual units. PMID:19625626
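
    The reported exponent of about 1.30 is the slope of a straight-line fit in log-log space. A minimal Python sketch of that estimation step is shown below; the staff counts are synthetic stand-ins, not the Scandinavian data used in the paper.

    # Estimating a scaling exponent support ~ c * academic**alpha by ordinary
    # least squares on log-transformed counts.  The data are synthetic.
    import numpy as np

    rng = np.random.default_rng(0)
    academic = np.logspace(1, 3.5, 40)  # roughly 10 to 3000 academic staff
    support = 0.5 * academic**1.30 * rng.lognormal(0.0, 0.2, academic.size)

    alpha, log_c = np.polyfit(np.log(academic), np.log(support), 1)
    print(f"estimated scaling exponent: {alpha:.2f}")  # close to the true 1.30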

  20. Hydrological response of karst systems to large-scale climate variability for different catchments of the French karst observatory network INSU/CNRS SNO KARST

    NASA Astrophysics Data System (ADS)

    Massei, Nicolas; Labat, David; Jourde, Hervé; Lecoq, Nicolas; Mazzilli, Naomi

    2017-04-01

    The French karst observatory network SNO KARST is a national initiative from the National Institute for Earth Sciences and Astronomy (INSU) of the National Center for Scientific Research (CNRS). It is also part of the new French research infrastructure for the observation of the critical zone, OZCAR. SNO KARST is composed of several karst sites distributed over conterminous France, located in different physiographic and climatic contexts (Mediterranean, Pyrenean, Jura mountain, western and northwestern shore near the Atlantic or the English Channel). This allows the scientific community to develop advanced research and experiments dedicated to improving understanding of the hydrological functioning of karst catchments. Here we used several sites of SNO KARST in order to assess the hydrological response of karst catchments to long-term variation of large-scale atmospheric circulation. Using NCEP reanalysis products and karst discharge, we analyzed the links between large-scale circulation and karst water resources variability. As karst hydrosystems are highly heterogeneous media, they behave differently across different time-scales: we explore the large-scale/local-scale relationships according to time-scale, using a wavelet multiresolution approach applied to both karst hydrological variables and large-scale climate fields such as sea level pressure (SLP). The different wavelet components of karst discharge in response to the corresponding wavelet components of climate fields are either 1) compared to physico-chemical/geochemical responses at karst springs, or 2) interpreted in terms of hydrological functioning by comparing discharge wavelet components to internal components obtained from precipitation/discharge models using the KARSTMOD conceptual modeling platform of SNO KARST.
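
    A wavelet multiresolution comparison of this kind can be sketched by decomposing both series into scale components and correlating them scale by scale. The Python sketch below uses PyWavelets with a discrete wavelet transform; the two series are synthetic placeholders, not SNO KARST discharge or NCEP sea level pressure, and the paper's analysis is considerably richer.

    # Hedged sketch of a scale-by-scale comparison: reconstruct each detail
    # level of two series (discharge and a climate index) and correlate them.
    # Series are synthetic; 'db4' and the level count are arbitrary choices.
    import numpy as np
    import pywt

    def detail_components(x, wavelet="db4", level=5):
        coeffs = pywt.wavedec(x, wavelet, level=level)
        comps = []
        for j in range(1, len(coeffs)):
            keep = [np.zeros_like(c) for c in coeffs]
            keep[j] = coeffs[j]  # keep the detail at a single scale
            comps.append(pywt.waverec(keep, wavelet)[: len(x)])
        return comps  # ordered from coarsest to finest detail

    rng = np.random.default_rng(1)
    t = np.arange(2048)
    climate = np.sin(2 * np.pi * t / 365) + rng.normal(0, 0.5, t.size)
    discharge = np.roll(climate, 20) + rng.normal(0, 0.5, t.size)  # lagged response

    for j, (dq, cl) in enumerate(zip(detail_components(discharge), detail_components(climate)), 1):
        print(f"scale component {j}: corr = {np.corrcoef(dq, cl)[0, 1]:+.2f}")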

  1. Establishment of a National Wind Energy Center at University of Houston

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Su Su

    The DOE-supported project objectives are to establish a national wind energy center (NWEC) at the University of Houston and conduct research to address critical science and engineering issues for the development of future large MW-scale wind energy production systems, especially offshore wind turbines. The goals of the project are to: (1) establish a sound scientific/technical knowledge base of solutions to critical science and engineering issues for developing future MW-scale large wind energy production systems, (2) develop a state-of-the-art wind rotor blade research facility at the University of Houston, and (3) through multi-disciplinary research, introduce technology innovations on advanced wind-turbine materials, processing/manufacturing technology, design and simulation, testing, and reliability assessment methods related to future wind turbine systems for cost-effective production of offshore wind energy. To achieve the goals of the project, the following technical tasks were planned and executed during the period from April 15, 2010 to October 31, 2014 at the University of Houston: (1) basic research on large offshore wind turbine systems; (2) applied research on innovative wind turbine rotors for large offshore wind energy systems; (3) integration of offshore wind-turbine design, advanced materials, and manufacturing technologies; (4) integrity and reliability of large offshore wind turbine blades and scaled model testing; (5) education and training of graduate and undergraduate students and post-doctoral researchers; and (6) development of a national offshore wind turbine blade research facility. The research program addresses both the basic science and the engineering of current and future large wind turbine systems, especially offshore wind turbines, for MW-scale power generation. The results of the research advance current understanding of many important scientific issues and provide technical information for future large wind turbines with advanced design, composite materials, integrated manufacturing, and structural reliability and integrity. The educational program has trained many graduate and undergraduate students and post-doctoral researchers in the critical science and engineering of wind energy production systems through graduate-level courses, research, and participation in various projects within the center's large multi-disciplinary research effort. These students and researchers are now employed by the wind industry, national labs, and universities to support the US and international wind energy industry. The national offshore wind turbine blade research facility developed in the project has been used to support the technical and training tasks planned in the program, and it is a national asset available for use by domestic and international researchers in the wind energy arena.

  2. A topology visualization early warning distribution algorithm for large-scale network security incidents.

    PubMed

    He, Hui; Fan, Guotao; Ye, Jianwei; Zhang, Weizhe

    2013-01-01

    It is of great significance to research early warning systems for large-scale network security incidents. Such a system can improve the network system's emergency response capabilities, alleviate the damage of cyber attacks, and strengthen the system's counterattack ability. A comprehensive early warning system is presented in this paper, which combines active measurement and anomaly detection. The key visualization algorithm and technology of the system are mainly discussed. Plane visualization of the large-scale network system is realized using a divide-and-conquer approach. First, the topology of the large-scale network is divided into several small-scale networks by the MLkP/CR algorithm. Second, the subgraph plane visualization algorithm is applied to each small-scale network. Finally, the small-scale networks' topologies are combined into a complete topology by an automatic distribution algorithm based on force analysis. As the algorithm transforms the large-scale network topology plane visualization problem into a series of small-scale network topology plane visualization and distribution problems, it has higher parallelism and is able to handle the display of ultra-large-scale network topologies.
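    The divide-and-conquer layout idea can be sketched as follows; this is an illustrative stand-in, not the paper's implementation, and it substitutes a generic modularity-based partition (via networkx) for the MLkP/CR algorithm and ordinary spring layouts for the subgraph and force-analysis distribution steps.

```python
# Minimal sketch of the divide-and-conquer topology layout: partition a large
# topology, lay out each part separately, then place the parts with a
# force-directed layout of the coarsened graph.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

def hierarchical_layout(G, scale=5.0):
    parts = list(greedy_modularity_communities(G))            # step 1: divide
    coarse = nx.Graph()
    coarse.add_nodes_from(range(len(parts)))
    index = {n: i for i, part in enumerate(parts) for n in part}
    for u, v in G.edges():
        if index[u] != index[v]:
            coarse.add_edge(index[u], index[v])
    centers = nx.spring_layout(coarse, seed=1)                # step 3: place the parts
    pos = {}
    for i, part in enumerate(parts):
        sub_pos = nx.spring_layout(G.subgraph(part), seed=1)  # step 2: local layout
        for n, (x, y) in sub_pos.items():
            cx, cy = centers[i]
            pos[n] = (scale * cx + x, scale * cy + y)
    return pos

G = nx.random_partition_graph([40, 30, 30], 0.2, 0.01, seed=7)
positions = hierarchical_layout(G)
print(len(positions), "node positions computed")
```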

  3. 78 FR 7464 - Large Scale Networking (LSN) ; Joint Engineering Team (JET)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-01

    ... NATIONAL SCIENCE FOUNDATION Large Scale Networking (LSN) ; Joint Engineering Team (JET) AGENCY: The Networking and Information Technology Research and Development (NITRD) National Coordination...://www.nitrd.gov/nitrdgroups/index.php?title=Joint_Engineering_Team_ (JET)#title. SUMMARY: The JET...

  4. Studies of land-cover, land-use, and biophysical properties of vegetation in the Large Scale Biosphere Atmosphere experiment in Amazonia.

    Treesearch

    Dar A. Roberts; Michael Keller; Joao Vianei Soares

    2003-01-01

    We summarize early research on land-cover, land-use, and biophysical properties of vegetation from the Large Scale Biosphere Atmosphere (LBA) experiment in Amazônia. LBA is an international research program developed to evaluate regional function and to determine how land-use and climate modify biological, chemical and physical processes there. Remote sensing has...

  5. The Saskatchewan River Basin - a large scale observatory for water security research (Invited)

    NASA Astrophysics Data System (ADS)

    Wheater, H. S.

    2013-12-01

    The 336,000 km2 Saskatchewan River Basin (SaskRB) in Western Canada illustrates many of the issues of Water Security faced world-wide. It poses globally-important science challenges due to the diversity in its hydro-climate and ecological zones. With one of the world's more extreme climates, it embodies environments of global significance, including the Rocky Mountains (source of the major rivers in Western Canada), the Boreal Forest (representing 30% of Canada's land area) and the Prairies (home to 80% of Canada's agriculture). Management concerns include: provision of water resources to more than three million inhabitants, including indigenous communities; balancing competing needs for water between different uses, such as urban centres, industry, agriculture, hydropower and environmental flows; issues of water allocation between upstream and downstream users in the three prairie provinces; managing the risks of flood and droughts; and assessing water quality impacts of discharges from major cities and intensive agricultural production. Superimposed on these issues is the need to understand and manage uncertain water futures, including effects of economic growth and environmental change, in a highly fragmented water governance environment. Key science questions focus on understanding and predicting the effects of land and water management and environmental change on water quantity and quality. To address the science challenges, observational data are necessary across multiple scales. This requires focussed research at intensively monitored sites and small watersheds to improve process understanding and fine-scale models. To understand large-scale effects on river flows and quality, land-atmosphere feedbacks, and regional climate, integrated monitoring, modelling and analysis is needed at large basin scale. And to support water management, new tools are needed for operational management and scenario-based planning that can be implemented across multiple scales and multiple jurisdictions. The SaskRB has therefore been developed as a large scale observatory, now a Regional Hydroclimate Project of the World Climate Research Programme's GEWEX project, and is available to contribute to the emerging North American Water Program. State-of-the-art hydro-ecological experimental sites have been developed for the key biomes, and a river and lake biogeochemical research facility, focussed on impacts of nutrients and exotic chemicals. Data are integrated at SaskRB scale to support the development of improved large scale climate and hydrological modelling products, the development of DSS systems for local, provincial and basin-scale management, and the development of related social science research, engaging stakeholders in the research and exploring their values and priorities for water security. The observatory provides multiple scales of observation and modelling required to develop: a) new climate, hydrological and ecological science and modelling tools to address environmental change in key environments, and their integrated effects and feedbacks at large catchment scale, b) new tools needed to support river basin management under uncertainty, including anthropogenic controls on land and water management and c) the place-based focus for the development of new transdisciplinary science.

  6. The use of large scale datasets for understanding traffic network state.

    DOT National Transportation Integrated Search

    2013-09-01

    The goal of this proposal is to develop novel modeling techniques to infer individual activity patterns from large-scale cell phone datasets and taxi data from NYC. As such this research offers a paradigm shift from traditional transportation m...

  7. 77 FR 58415 - Large Scale Networking (LSN); Joint Engineering Team (JET)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-20

    ... NATIONAL SCIENCE FOUNDATION Large Scale Networking (LSN); Joint Engineering Team (JET) AGENCY: The Networking and Information Technology Research and Development (NITRD) National Coordination Office (NCO..._Engineering_Team_ (JET). SUMMARY: The JET, established in 1997, provides for information sharing among Federal...

  8. 78 FR 70076 - Large Scale Networking (LSN)-Joint Engineering Team (JET)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-22

    ... NATIONAL SCIENCE FOUNDATION Large Scale Networking (LSN)--Joint Engineering Team (JET) AGENCY: The Networking and Information Technology Research and Development (NITRD) National Coordination Office (NCO..._Engineering_Team_ (JET)#title. SUMMARY: The JET, established in 1997, provides for information sharing among...

  9. Development of Experimental Icing Simulation Capability for Full-Scale Swept Wings: Hybrid Design Process, Years 1 and 2

    NASA Technical Reports Server (NTRS)

    Fujiwara, Gustavo; Bragg, Mike; Triphahn, Chris; Wiberg, Brock; Woodard, Brian; Loth, Eric; Malone, Adam; Paul, Bernard; Pitera, David; Wilcox, Pete

    2017-01-01

    This report presents the key results from the first two years of a program to develop experimental icing simulation capabilities for full-scale swept wings. This investigation was undertaken as a part of a larger collaborative research effort on ice accretion and aerodynamics for large-scale swept wings. Ice accretion and the resulting aerodynamic effect on large-scale swept wings presents a significant airplane design and certification challenge to air frame manufacturers, certification authorities, and research organizations alike. While the effect of ice accretion on straight wings has been studied in detail for many years, the available data on swept-wing icing are much more limited, especially for larger scales.

  10. The Prostate, Lung, Colorectal, and Ovarian Cancer Screening Trial and Its Associated Research Resource

    PubMed Central

    2013-01-01

    The Prostate, Lung, Colorectal, and Ovarian (PLCO) Cancer Screening Trial is a large-scale research effort conducted by the National Cancer Institute. PLCO offers an example of coordinated research by both the extramural and intramural communities of the National Institutes of Health. The purpose of this article is to describe the PLCO research resource and how it is managed and to assess the productivity and the costs associated with this resource. Such an in-depth analysis of a single large-scale project can shed light on questions such as how large-scale projects should be managed, what metrics should be used to assess productivity, and how costs can be compared with productivity metrics. A comprehensive publication analysis identified 335 primary research publications resulting from research using PLCO data and biospecimens from 2000 to 2012. By the end of 2012, a total of 9679 citations (excluding self-citations) have resulted from this body of research publications, with an average of 29.7 citations per article, and an h index of 45, which is comparable with other large-scale studies, such as the Nurses’ Health Study. In terms of impact on public health, PLCO trial results have been used by the US Preventive Services Task Force in making recommendations concerning prostate and ovarian cancer screening. The overall cost of PLCO was $454 million over 20 years, adjusted to 2011 dollars, with approximately $37 million for the collection, processing, and storage of biospecimens, including blood samples, buccal cells, and pathology tissues. PMID:24115361
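    For readers unfamiliar with the h-index quoted above, a small sketch of how it is computed from per-article citation counts (hypothetical counts, not the PLCO bibliometric data):

```python
# Minimal sketch: how an h-index like the one reported for PLCO is computed
# from per-article citation counts (hypothetical counts, not the PLCO data).
def h_index(citations):
    """Largest h such that h articles each have at least h citations."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(counts, start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))   # -> 4
print(h_index([25, 8, 5, 3, 3]))   # -> 3
```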

  11. Skin Friction Reduction Through Large-Scale Forcing

    NASA Astrophysics Data System (ADS)

    Bhatt, Shibani; Artham, Sravan; Gnanamanickam, Ebenezer

    2017-11-01

    Flow structures in a turbulent boundary layer larger than an integral length scale (δ), referred to as large-scales, interact with the finer scales in a non-linear manner. By targeting these large-scales and exploiting this non-linear interaction, wall shear stress (WSS) reduction of over 10% has been achieved. The plane wall jet (PWJ), a boundary layer which has highly energetic large-scales that become turbulent independently of the near-wall finer scales, is the chosen model flow field. Its unique configuration allows for the independent control of the large-scales through acoustic forcing. Perturbation wavelengths from about 1 δ to 14 δ were considered, with a reduction in WSS for all wavelengths considered. This reduction, over a large subset of the wavelengths, scales with both inner and outer variables, indicating a mixed scaling of the underlying physics, while also showing dependence on the PWJ global properties. A triple decomposition of the velocity fields shows an increase in coherence due to forcing, with a clear organization of the small-scale turbulence with respect to the introduced large-scale. The maximum reduction in WSS occurs when the introduced large-scale acts in a manner so as to reduce the turbulent activity in the very near-wall region. This material is based upon work supported by the Air Force Office of Scientific Research under Award Number FA9550-16-1-0194 monitored by Dr. Douglas Smith.
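    The triple decomposition mentioned above follows the standard textbook form (the abstract does not give the authors' exact notation): the velocity is split into a time mean, a phase-coherent part locked to the acoustic forcing, and the residual turbulence,

```latex
u(x,t) = \overline{u}(x) + \tilde{u}(x,t) + u'(x,t),
```

    where the coherent part \tilde{u} is obtained by phase-averaging at the forcing wavelength and subtracting the time mean \overline{u}.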

  12. Research Activities at Fermilab for Big Data Movement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mhashilkar, Parag; Wu, Wenji; Kim, Hyun W

    2013-01-01

    Adaptation of 100GE networking infrastructure is the next step towards management of Big Data. Being the US Tier-1 Center for the Large Hadron Collider's (LHC) Compact Muon Solenoid (CMS) experiment and the central data center for several other large-scale research collaborations, Fermilab constantly has to deal with the scaling and wide-area distribution challenges of big data. In this paper, we describe some of the challenges involved in the movement of big data over 100GE infrastructure and the research activities at Fermilab to address these challenges.

  13. Fabrication of High Strength Lightweight Metals for Armor and Structural Applications: Large Scale Equal Channel Angular Extrusion Processing of Aluminum 5083 Alloy

    DTIC Science & Technology

    2017-06-01

    ARL-TR-8047, June 2017, US Army Research Laboratory: Fabrication of High-Strength Lightweight Metals for Armor and Structural Applications: Large-Scale Equal Channel Angular Extrusion Processing of Aluminum 5083 Alloy.

  14. Stereotype Threat, Inquiring about Test Takers' Race and Gender, and Performance on Low-Stakes Tests in a Large-Scale Assessment. Research Report. ETS RR-15-02

    ERIC Educational Resources Information Center

    Stricker, Lawrence J.; Rock, Donald A.; Bridgeman, Brent

    2015-01-01

    This study explores stereotype threat on low-stakes tests used in a large-scale assessment, math and reading tests in the Education Longitudinal Study of 2002 (ELS). Issues identified in laboratory research (though not observed in studies of high-stakes tests) were assessed: whether inquiring about their race and gender is related to the…

  15. International Large-Scale Assessments: What Uses, What Consequences?

    ERIC Educational Resources Information Center

    Johansson, Stefan

    2016-01-01

    Background: International large-scale assessments (ILSAs) are a much-debated phenomenon in education. Increasingly, their outcomes attract considerable media attention and influence educational policies in many jurisdictions worldwide. The relevance, uses and consequences of these assessments are often the focus of research scrutiny. Whilst some…

  16. Data integration in the era of omics: current and future challenges

    PubMed Central

    2014-01-01

    Integrating heterogeneous and large omics data constitutes not only a conceptual challenge but also a practical hurdle in the daily analysis of omics data. With the rise of novel omics technologies and through large-scale consortia projects, biological systems are being further investigated at an unprecedented scale, generating heterogeneous and often large data sets. These data sets encourage researchers to develop novel data integration methodologies. In this introduction we review the definition of, and characterize current efforts on, data integration in the life sciences. We used a web survey to assess current research projects on data integration and to tap into the views, needs and challenges as currently perceived by parts of the research community. PMID:25032990

  17. Experimental Study of Homogeneous Isotropic Slowly-Decaying Turbulence in Giant Grid-Wind Tunnel Set Up

    NASA Astrophysics Data System (ADS)

    Aliseda, Alberto; Bourgoin, Mickael; Eswirp Collaboration

    2014-11-01

    We present preliminary results from a recent grid turbulence experiment conducted at the ONERA wind tunnel in Modane, France. The ESWIRP Collaboration was conceived to probe the smallest scales of a canonical turbulent flow at very high Reynolds numbers. To achieve this, the largest scales of the turbulence need to be extremely large so that, even with the large separation of scales, the smallest scales would be well above the spatial and temporal resolution of the instruments. The ONERA wind tunnel in Modane (8-m-diameter test section) was chosen as a limit of the biggest large scales achievable in a laboratory setting. A giant inflatable grid (M = 0.8 m) was conceived to induce slowly-decaying homogeneous isotropic turbulence in a large region of the test section, with minimal structural risk. An international team of researchers collected hot wire anemometry, ultrasound anemometry, resonant cantilever anemometry, fast pitot tube anemometry, cold wire thermometry and high-speed particle tracking data of this canonical turbulent flow. While analysis of this large database, which will become publicly available over the next 2 years, has only started, the Taylor-scale Reynolds number is estimated to be between 400 and 800, with Kolmogorov scales as large as a few mm. The ESWIRP Collaboration is formed by an international team of scientists to investigate experimentally the smallest scales of turbulence. It was funded by the European Union to take advantage of the largest wind tunnel in Europe for fundamental research.
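    The quantities quoted above are tied together by the standard isotropic-turbulence relations (textbook definitions, not values or formulas taken from the ESWIRP dataset):

```latex
\eta = \left(\frac{\nu^{3}}{\varepsilon}\right)^{1/4}, \qquad
\lambda = \sqrt{\frac{15\,\nu\,u'^{2}}{\varepsilon}}, \qquad
Re_{\lambda} = \frac{u'\lambda}{\nu},
```

    where ν is the kinematic viscosity, ε the mean dissipation rate, u' the rms velocity fluctuation, η the Kolmogorov length scale and λ the Taylor microscale.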

  18. A leap forward in geographic scale for forest ectomycorrhizal fungi

    Treesearch

    Filipa Cox; Nadia Barsoum; Martin I. Bidartondo; Isabella Børja; Erik Lilleskov; Lars O. Nilsson; Pasi Rautio; Kath Tubby; Lars Vesterdal

    2010-01-01

    In this letter we propose a first large-scale assessment of mycorrhizas with a European-wide network of intensively monitored forest plots as a research platform. This effort would create a qualitative and quantitative shift in mycorrhizal research by delivering the first continental-scale map of mycorrhizal fungi. Readers may note that several excellent detailed...

  19. Feasibility analysis of large length-scale thermocapillary flow experiment for the International Space Station

    NASA Astrophysics Data System (ADS)

    Alberts, Samantha J.

    The investigation of microgravity fluid dynamics emerged out of necessity with the advent of space exploration. In particular, capillary research took a leap forward in the 1960s with regard to liquid settling and interfacial dynamics. Due to inherent temperature variations in large spacecraft liquid systems, such as fuel tanks, forces develop on gas-liquid interfaces which induce thermocapillary flows. To date, thermocapillary flows have been studied in small, idealized research geometries, usually under terrestrial conditions. The 1 to 3 m lengths of current and future large tanks and hardware are designed based on hardware rather than research, which leaves spaceflight systems designers without the technological tools to effectively create safe and efficient designs. This thesis focused on the design and feasibility of a large length-scale thermocapillary flow experiment, which utilizes temperature variations to drive a flow. The design of a helical channel geometry ranging from 1 to 2.5 m in length permits a large length-scale thermocapillary flow experiment to fit in a seemingly small International Space Station (ISS) facility such as the Fluids Integrated Rack (FIR). An initial investigation determined that the proposed experiment produces measurable data while adhering to the FIR facility limitations. The computational portion of this thesis focused on the investigation of functional geometries of fuel tanks and depots using Surface Evolver. This work outlines the design of a large length-scale thermocapillary flow experiment for the ISS FIR. The results from this work improve the understanding of thermocapillary flows and thus improve technological tools for predicting heat and mass transfer in large length-scale thermocapillary flows. Without the tools to understand the thermocapillary flows in these systems, engineers are forced to design larger, heavier vehicles to assure safety and mission success.

  20. Measuring large-scale vertical motion in the atmosphere with dropsondes

    NASA Astrophysics Data System (ADS)

    Bony, Sandrine; Stevens, Bjorn

    2017-04-01

    Large-scale vertical velocity modulates important processes in the atmosphere, including the formation of clouds, and constitutes a key component of the large-scale forcing of Single-Column Model simulations and Large-Eddy Simulations. Its measurement has also been a long-standing challenge for observationalists. We will show that it is possible to measure the vertical profile of large-scale wind divergence and vertical velocity from aircraft by using dropsondes. This methodology was tested in August 2016 during the NARVAL2 campaign in the lower Atlantic trades. Results will be shown for several research flights, the robustness and the uncertainty of the measurements will be assessed, and observational estimates will be compared with data from high-resolution numerical forecasts.
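    The standard relation behind such dropsonde estimates (textbook form; the abstract does not spell out the authors' exact processing chain) integrates the area-mean horizontal divergence D measured across the sonde array in pressure:

```latex
\frac{\partial \omega}{\partial p} = -D(p), \qquad
\omega(p) = \omega(p_{0}) - \int_{p_{0}}^{p} D(p')\,\mathrm{d}p', \qquad
w \approx -\frac{\omega}{\rho g},
```

    so that a vertical profile of divergence obtained from the sondes yields a profile of large-scale vertical velocity.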

  1. Proceedings of the Annual Meeting (14th) Aquatic Plant Control Research Planning and Operations Review, Held at Lake Eufaula, Oklahoma on 26-29 November 1979.

    DTIC Science & Technology

    1980-10-01

    Development; Problem Identification and Assessment for Aquatic Plant Management; Natural Succession of Aquatic Plants; Large-Scale Operations Management Test...of Insects and Pathogens for Control of Waterhyacinth in Louisiana; Large-Scale Operations Management Test to Evaluate Prevention Methodology for...Control of Eurasian Watermilfoil in Washington; Large-Scale Operations Management Test Using the White Amur at Lake Conway, Florida; and Aquatic Plant Control Activities in the Panama Canal Zone.

  2. The Expanded Large Scale Gap Test

    DTIC Science & Technology

    1987-03-01

    NSWC TR 86-32, March 1987: The Expanded Large Scale Gap Test, by T. P. Liddiard and D. Price, Research and Technology Department. Approved for public release. ...arises, to reduce the spread in the LSGT 50% gap value. The worst charges, such as those with the highest or lowest densities, the largest re-pressed...

  3. High-speed inlet research program and supporting analysis

    NASA Technical Reports Server (NTRS)

    Coltrin, Robert E.

    1990-01-01

    The technology challenges faced by the high speed inlet designer are discussed by describing the considerations that went into the design of the Mach 5 research inlet. It is shown that the emerging three dimensional viscous computational fluid dynamics (CFD) flow codes, together with small scale experiments, can be used to guide larger scale full inlet systems research. Then, in turn, the results of the large scale research, if properly instrumented, can be used to validate or at least to calibrate the CFD codes.

  4. Comparing the Effectiveness of Self-Paced and Collaborative Frame-of-Reference Training on Rater Accuracy in a Large-Scale Writing Assessment

    ERIC Educational Resources Information Center

    Raczynski, Kevin R.; Cohen, Allan S.; Engelhard, George, Jr.; Lu, Zhenqiu

    2015-01-01

    There is a large body of research on the effectiveness of rater training methods in the industrial and organizational psychology literature. Less has been reported in the measurement literature on large-scale writing assessments. This study compared the effectiveness of two widely used rater training methods--self-paced and collaborative…

  5. Real-time adaptive ramp metering : phase I, MILOS proof of concept (multi-objective, integrated, large-scale, optimized system).

    DOT National Transportation Integrated Search

    2006-12-01

    Over the last several years, researchers at the University of Arizona's ATLAS Center have developed an adaptive ramp metering system referred to as MILOS (Multi-Objective, Integrated, Large-Scale, Optimized System). The goal of this project is ...

  6. Data management strategies for multinational large-scale systems biology projects.

    PubMed

    Wruck, Wasco; Peuker, Martin; Regenbrecht, Christian R A

    2014-01-01

    Good accessibility of publicly funded research data is essential to secure an open scientific system and eventually becomes mandatory [Wellcome Trust will Penalise Scientists Who Don't Embrace Open Access. The Guardian 2012]. Through the use of high-throughput methods in many research areas, from physics to systems biology, large data collections are increasingly important as raw material for research. Here, we present strategies worked out by international and national institutions targeting open access to publicly funded research data via incentives or obligations to share data. Funding organizations such as the British Wellcome Trust have therefore developed data sharing policies and request commitment to data management and sharing in grant applications. Increased citation rates are a profound argument for sharing publication data. Pre-publication sharing might be rewarded by a data citation credit system via digital object identifiers (DOIs), which have initially been in use for data objects. Besides policies and incentives, good practice in data management is indispensable. However, appropriate systems for data management of large-scale projects, for example in systems biology, are hard to find. Here, we give an overview of a selection of open-source data management systems that have been employed successfully in large-scale projects.

  7. Data management strategies for multinational large-scale systems biology projects

    PubMed Central

    Peuker, Martin; Regenbrecht, Christian R.A.

    2014-01-01

    Good accessibility of publicly funded research data is essential to secure an open scientific system and eventually becomes mandatory [Wellcome Trust will Penalise Scientists Who Don't Embrace Open Access. The Guardian 2012]. Through the use of high-throughput methods in many research areas, from physics to systems biology, large data collections are increasingly important as raw material for research. Here, we present strategies worked out by international and national institutions targeting open access to publicly funded research data via incentives or obligations to share data. Funding organizations such as the British Wellcome Trust have therefore developed data sharing policies and request commitment to data management and sharing in grant applications. Increased citation rates are a profound argument for sharing publication data. Pre-publication sharing might be rewarded by a data citation credit system via digital object identifiers (DOIs), which have initially been in use for data objects. Besides policies and incentives, good practice in data management is indispensable. However, appropriate systems for data management of large-scale projects, for example in systems biology, are hard to find. Here, we give an overview of a selection of open-source data management systems that have been employed successfully in large-scale projects. PMID:23047157

  8. Criminological research in contemporary China: challenges and lessons learned from a large-scale criminal victimization survey.

    PubMed

    Zhang, Lening; Messner, Steven F; Lu, Jianhong

    2007-02-01

    This article discusses research experience gained from a large-scale survey of criminal victimization recently conducted in Tianjin, China. The authors review some of the more important challenges that arose in the research, their responses to these challenges, and lessons learned that might be beneficial to other scholars who are interested in conducting criminological research in China. Their experience underscores the importance of understanding the Chinese political, cultural, and academic context, and the utility of collaborating with experienced and knowledgeable colleagues "on site." Although there are some special difficulties and barriers, their project demonstrates the feasibility of original criminological data collection in China.

  9. Numerical Simulations of Vortical Mode Stirring: Effects of Large Scale Shear and Strain

    DTIC Science & Technology

    2015-09-30

    Numerical Simulations of Vortical Mode Stirring: Effects of Large-Scale Shear and Strain. M.-Pascale Lelong, NorthWest Research Associates. ...can be implemented in larger-scale ocean models. These parameterizations will incorporate the effects of local ambient conditions including latitude... Approved for public release; distribution is unlimited. ...talk at the Nonlinear Effects in Internal Waves Conference held...

  10. Linking Errors in Trend Estimation in Large-Scale Surveys: A Case Study. Research Report. ETS RR-10-10

    ERIC Educational Resources Information Center

    Xu, Xueli; von Davier, Matthias

    2010-01-01

    One of the major objectives of large-scale educational surveys is reporting trends in academic achievement. For this purpose, a substantial number of items are carried from one assessment cycle to the next. The linking process that places academic abilities measured in different assessments on a common scale is usually based on a concurrent…

  11. Large-Scale Operations Management Test of Use of the White Amur for Control of Problem Aquatic Plants. Report 2. First Year Poststocking Results. Volume IV. Nitrogen and Phosphorus Dynamics of the Lake Conway Ecosystem: Loading Budgets and a Dynamic Hydrologic Phosphorus Model.

    DTIC Science & Technology

    1982-08-01

    University of Florida, Gainesville, Department of Environmental Engineering: Large-Scale Operations Management Test of Use of the White Amur. ...Conway ecosystem and is part of the Large-Scale Operations Management Test (LSOMT) of the Aquatic Plant Control Research Program (APCRP) at the WES. ...should be cited as follows: Blancher, E. C., II, and Fellows, C. R. 1982. "Large-Scale Operations Management Test of Use of the White Amur for Control...

  12. Copy of Using Emulation and Simulation to Understand the Large-Scale Behavior of the Internet.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adalsteinsson, Helgi; Armstrong, Robert C.; Chiang, Ken

    2008-10-01

    We report on the work done in the late-start LDRD "Using Emulation and Simulation to Understand the Large-Scale Behavior of the Internet." We describe the creation of a research platform that emulates many thousands of machines to be used for the study of large-scale internet behavior. We describe a proof-of-concept simple attack we performed in this environment. We describe the successful capture of a Storm bot and, from the study of the bot and further literature search, establish large-scale aspects we seek to understand via emulation of Storm on our research platform in possible follow-on work. Finally, we discuss possible future work.

  13. Process, pattern and scale: hydrogeomorphology and plant diversity in forested wetlands across multiple spatial scales

    NASA Astrophysics Data System (ADS)

    Alexander, L.; Hupp, C. R.; Forman, R. T.

    2002-12-01

    Many geodisturbances occur across large spatial scales, spanning entire landscapes and creating ecological phenomena in their wake. Ecological study at large scales poses special problems: (1) large-scale studies require large-scale resources, and (2) sampling is not always feasible at the appropriate scale, so researchers rely on data collected at smaller scales to interpret patterns across broad regions. A criticism of landscape ecology is that findings at small spatial scales are "scaled up" and applied indiscriminately across larger spatial scales. In this research, landscape scaling is addressed through process-pattern relationships between hydrogeomorphic processes and patterns of plant diversity in forested wetlands. The research addresses: (1) whether patterns and relationships between hydrogeomorphic, vegetation, and spatial variables can transcend scale; and (2) whether data collected at small spatial scales can be used to describe patterns and relationships across larger spatial scales. Field measurements of hydrologic, geomorphic, spatial, and vegetation data were collected or calculated for 15 1-ha sites on forested floodplains of six (6) Chesapeake Bay Coastal Plain streams over a total area of about 20,000 km2. Hydroperiod (day/yr), floodplain surface elevation range (m), discharge (m3/s), stream power (kg-m/s2), sediment deposition (mm/yr), relative position downstream and other variables were used in multivariate analyses to explain differences in species richness, tree diversity (Shannon-Wiener Diversity Index H'), and plant community composition at four spatial scales. Data collected at the plot (400-m2) and site (c. 1-ha) scales are applied to and tested at the river watershed and regional spatial scales. Results indicate that plant species richness and tree diversity (Shannon-Wiener diversity index H') can be described by hydrogeomorphic conditions at all scales, but are best described at the site scale. Data collected at plot and site scales are tested for spatial heterogeneity across the Chesapeake Bay Coastal Plain using a geostatistical variogram, and multiple regression analysis is used to relate plant diversity, spatial, and hydrogeomorphic variables across Coastal Plain regions and hydrologic regimes. Results indicate that relationships between hydrogeomorphic processes and patterns of plant diversity at finer scales can proxy relationships at coarser scales in some, not all, cases. Findings also suggest that data collected at small scales can be used to describe trends across broader scales under limited conditions.
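    A minimal sketch of the empirical semivariogram used to test spatial heterogeneity of the site-scale data, with hypothetical coordinates and diversity values rather than the Chesapeake Bay measurements:

```python
# Minimal sketch of an empirical semivariogram (hypothetical coordinates and
# values, not the Chesapeake Bay data).
import numpy as np

def empirical_semivariogram(coords, values, bins):
    """gamma(h) = 0.5 * mean[(z_i - z_j)^2] over pairs whose separation falls in each lag bin."""
    coords = np.asarray(coords, dtype=float)
    values = np.asarray(values, dtype=float)
    i, j = np.triu_indices(len(values), k=1)
    dists = np.linalg.norm(coords[i] - coords[j], axis=1)
    sqdiff = 0.5 * (values[i] - values[j]) ** 2
    gamma = []
    for lo, hi in zip(bins[:-1], bins[1:]):
        mask = (dists >= lo) & (dists < hi)
        gamma.append(sqdiff[mask].mean() if mask.any() else np.nan)
    return np.array(gamma)

rng = np.random.default_rng(3)
xy = rng.uniform(0, 100, size=(15, 2))      # 15 sites, mirroring the study design
h_prime = rng.normal(2.0, 0.3, size=15)     # hypothetical diversity index values
lags = np.linspace(0, 100, 6)
print(empirical_semivariogram(xy, h_prime, lags))
```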

  14. Recent (1999-2003) Canadian research on contemporary processes of river erosion and sedimentation, and river mechanics

    NASA Astrophysics Data System (ADS)

    de Boer, D. H.; Hassan, M. A.; MacVicar, B.; Stone, M.

    2005-01-01

    Contributions by Canadian fluvial geomorphologists between 1999 and 2003 are discussed under four major themes: sediment yield and sediment dynamics of large rivers; cohesive sediment transport; turbulent flow structure and sediment transport; and bed material transport and channel morphology. The paper concludes with a section on recent technical advances. During the review period, substantial progress has been made in investigating the details of fluvial processes at relatively small scales. Examples of this emphasis are the studies of flow structure, turbulence characteristics and bedload transport, which continue to form central themes in fluvial research in Canada. Translating the knowledge of small-scale, process-related research to an understanding of the behaviour of large-scale fluvial systems, however, continues to be a formidable challenge. Models play a prominent role in elucidating the link between small-scale processes and large-scale fluvial geomorphology, and, as a result, a number of papers describing models and modelling results have been published during the review period. In addition, a number of investigators are now approaching the problem by directly investigating changes in the system of interest at larger scales, e.g. a channel reach over tens of years, and attempting to infer what processes may have led to the result. It is to be expected that these complementary approaches will contribute to an increased understanding of fluvial systems at a variety of spatial and temporal scales.

  15. Large-Scale Networked Virtual Environments: Architecture and Applications

    ERIC Educational Resources Information Center

    Lamotte, Wim; Quax, Peter; Flerackers, Eddy

    2008-01-01

    Purpose: Scalability is an important research topic in the context of networked virtual environments (NVEs). This paper aims to describe the ALVIC (Architecture for Large-scale Virtual Interactive Communities) approach to NVE scalability. Design/methodology/approach: The setup and results from two case studies are shown: a 3-D learning environment…

  16. Considerations for Managing Large-Scale Clinical Trials.

    ERIC Educational Resources Information Center

    Tuttle, Waneta C.; And Others

    1989-01-01

    Research management strategies used effectively in a large-scale clinical trial to determine the health effects of exposure to Agent Orange in Vietnam are discussed, including pre-project planning, organization according to strategy, attention to scheduling, a team approach, emphasis on guest relations, cross-training of personnel, and preparing…

  17. Simulating the impact of the large-scale circulation on the 2-m temperature and precipitation climatology

    EPA Science Inventory

    The impact of the simulated large-scale atmospheric circulation on the regional climate is examined using the Weather Research and Forecasting (WRF) model as a regional climate model. The purpose is to understand the potential need for interior grid nudging for dynamical downscal...

  18. Multistage Security Mechanism For Hybrid, Large-Scale Wireless Sensor Networks

    DTIC Science & Technology

    2007-06-01

    ...sensor network. Building on research in the areas of the wireless sensor networks (WSN) and the mobile ad hoc networks (MANET), this thesis proposes an... A wide area network consisting of ballistic missile defense satellites and terrestrial nodes can be viewed as a hybrid, large-scale mobile wireless...

  19. Large-Scale Disasters

    NASA Astrophysics Data System (ADS)

    Gad-El-Hak, Mohamed

    "Extreme" events - including climatic events, such as hurricanes, tornadoes, and drought - can cause massive disruption to society, including large death tolls and property damage in the billions of dollars. Events in recent years have shown the importance of being prepared and that countries need to work together to help alleviate the resulting pain and suffering. This volume presents a review of the broad research field of large-scale disasters. It establishes a common framework for predicting, controlling and managing both manmade and natural disasters. There is a particular focus on events caused by weather and climate change. Other topics include air pollution, tsunamis, disaster modeling, the use of remote sensing and the logistics of disaster management. It will appeal to scientists, engineers, first responders and health-care professionals, in addition to graduate students and researchers who have an interest in the prediction, prevention or mitigation of large-scale disasters.

  20. [Privacy and public benefit in using large scale health databases].

    PubMed

    Yamamoto, Ryuichi

    2014-01-01

    In Japan, large-scale health databases have been constructed within a few years, such as the national claim insurance and health checkup database (NDB) and the Japanese Sentinel project. However, there are legal issues in striking an adequate balance between privacy and public benefit when using such databases. The NDB operates under the act for elderly persons' health care, but this act says nothing about using the database for general public benefit. Researchers who use this database are therefore forced to devote great attention to anonymization and information security, which may hinder the research work itself. The Japanese Sentinel project is a national project to detect adverse drug reactions using large-scale distributed clinical databases of large hospitals. Although patients give broad consent in advance for such use for the public good, the use of insufficiently anonymized data is still under discussion. Generally speaking, researchers conducting studies for public benefit will not infringe patients' privacy, but vague and complex legislative requirements for personal data protection may hamper their research. Medical science does not progress without the use of clinical information; therefore, legislation that is simple and clear for both researchers and patients is strongly required. In Japan, a specific act for balancing privacy and public benefit is now under discussion. The author recommends that researchers, including those in the field of pharmacology, pay attention to, participate in the discussion of, and make suggestions regarding such acts or regulations.

  1. Impact of potential large-scale and medium-scale irrigation on the West African Monsoon and its dependence on location of irrigated area

    NASA Astrophysics Data System (ADS)

    Eltahir, E. A. B.; IM, E. S.

    2014-12-01

    This study investigates the impact of potential large-scale (about 400,000 km2) and medium-scale (about 60,000 km2) irrigation on the climate of West Africa using the MIT Regional Climate Model. A new irrigation module is implemented to assess the impact of location and scheduling of irrigation on rainfall distribution over West Africa. A control simulation (without irrigation) and various sensitivity experiments (with irrigation) are performed and compared to discern the effects of irrigation location, size and scheduling. In general, the irrigation-induced surface cooling due to anomalously wet soil tends to suppress moist convection and rainfall, which in turn induces local subsidence and low-level anti-cyclonic circulation. These local effects are dominated by a consistent reduction of local rainfall over the irrigated land, irrespective of its location. However, the remote response of rainfall distribution to irrigation exhibits a significant sensitivity to the latitudinal position of irrigation. The low-level northeasterly flow associated with anti-cyclonic circulation centered over the irrigation area can enhance the extent of low-level convergence through interaction with the prevailing monsoon flow, leading to a significant increase in rainfall. Despite much reduced forcing of irrigation water, the medium-scale irrigation seems to draw the same response as large-scale irrigation, which supports the robustness of the response to irrigation in our modeling system. Both large-scale and medium-scale irrigation experiments show that an optimal irrigation location and scheduling exists that would lead to a more efficient use of irrigation water. The approach of using a regional climate model to investigate the impact of location and size of irrigation schemes may be the first step in incorporating land-atmosphere interactions in the design of location and size of irrigation projects. However, this theoretical approach is still in early stages of development and further research is needed before any practical application in water resources planning. Acknowledgements: This research was supported by the National Research Foundation Singapore through the Singapore MIT Alliance for Research and Technology's Center for Environmental Sensing and Modeling interdisciplinary research program.

  2. Review of the Need for a Large-scale Test Facility for Research on the Effects of Extreme Winds on Structures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    R. G. Little

    1999-03-01

    The Idaho National Engineering and Environmental Laboratory (INEEL), through the US Department of Energy (DOE), has proposed that a large-scale wind test facility (LSWTF) be constructed to study, at full scale, the behavior of low-rise structures under simulated extreme wind conditions. To determine the need for, and potential benefits of, such a facility, the Idaho Operations Office of the DOE requested that the National Research Council (NRC) perform an independent assessment of the role and potential value of an LSWTF in the overall context of wind engineering research. The NRC established the Committee to Review the Need for a Large-scale Test Facility for Research on the Effects of Extreme Winds on Structures, under the auspices of the Board on Infrastructure and the Constructed Environment, to perform this assessment. This report conveys the results of the committee's deliberations as well as its findings and recommendations. Data developed at large scale would enhance the understanding of how structures, particularly light-frame structures, are affected by extreme winds (e.g., hurricanes, tornadoes, severe thunderstorms, and other events). With a large-scale wind test facility, full-sized structures, such as site-built or manufactured housing and small commercial or industrial buildings, could be tested under a range of wind conditions in a controlled, repeatable environment. At this time, the US has no facility specifically constructed for this purpose. During the course of this study, the committee was confronted by three difficult questions: (1) does the lack of a facility equate to a need for the facility? (2) is need alone sufficient justification for the construction of a facility? and (3) would the benefits derived from information produced in an LSWTF justify the costs of producing that information? The committee's evaluation of the need and justification for an LSWTF was shaped by these realities.

  3. Social welfare as small-scale help: evolutionary psychology and the deservingness heuristic.

    PubMed

    Petersen, Michael Bang

    2012-01-01

    Public opinion concerning social welfare is largely driven by perceptions of recipient deservingness. Extant research has argued that this heuristic is learned from a variety of cultural, institutional, and ideological sources. The present article provides evidence supporting a different view: that the deservingness heuristic is rooted in psychological categories that evolved over the course of human evolution to regulate small-scale exchanges of help. To test predictions made on the basis of this view, a method designed to measure social categorization is embedded in nationally representative surveys conducted in different countries. Across the national- and individual-level differences that extant research has used to explain the heuristic, people categorize welfare recipients on the basis of whether they are lazy or unlucky. This mode of categorization furthermore induces people to think about large-scale welfare politics as its presumed ancestral equivalent: small-scale help giving. The general implications for research on heuristics are discussed.

  4. Evaluation of nucleus segmentation in digital pathology images through large scale image synthesis

    NASA Astrophysics Data System (ADS)

    Zhou, Naiyun; Yu, Xiaxia; Zhao, Tianhao; Wen, Si; Wang, Fusheng; Zhu, Wei; Kurc, Tahsin; Tannenbaum, Allen; Saltz, Joel; Gao, Yi

    2017-03-01

    Digital histopathology images with more than 1 gigapixel are drawing more and more attention in the clinical, biomedical research, and computer vision fields. Among the multiple observable features spanning multiple scales in the pathology images, nuclear morphology is one of the central criteria for diagnosis and grading. As a result it is also the most studied target in image computing. A large number of research papers have been devoted to the problem of extracting nuclei from digital pathology images, which is the foundation of any further correlation study. However, the validation and evaluation of nucleus extraction have not yet been formulated rigorously and systematically. Some studies report a human-verified segmentation with thousands of nuclei, whereas a single whole slide image may contain up to a million. The main obstacle lies in the difficulty of obtaining such a large number of validated nuclei, which is essentially an impossible task for a pathologist. We propose a systematic validation and evaluation approach based on large-scale image synthesis. This could facilitate a more quantitatively validated study for the current and future histopathology image analysis field.
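    A minimal sketch of the evaluation step that synthesized ground truth makes possible: once masks are generated with known labels, a segmentation result can be scored with standard overlap metrics (illustrative arrays only, not the authors' pipeline):

```python
# Minimal sketch: scoring a segmentation against a synthesized ground-truth mask
# with the standard Dice and IoU overlap metrics.
import numpy as np

def dice(gt, pred):
    gt, pred = gt.astype(bool), pred.astype(bool)
    inter = np.logical_and(gt, pred).sum()
    denom = gt.sum() + pred.sum()
    return 2.0 * inter / denom if denom else 1.0

def iou(gt, pred):
    gt, pred = gt.astype(bool), pred.astype(bool)
    union = np.logical_or(gt, pred).sum()
    return np.logical_and(gt, pred).sum() / union if union else 1.0

gt = np.zeros((64, 64), dtype=np.uint8); gt[20:40, 20:40] = 1      # synthetic nucleus mask
pred = np.zeros((64, 64), dtype=np.uint8); pred[22:42, 22:42] = 1  # segmentation output
print(f"Dice = {dice(gt, pred):.3f}, IoU = {iou(gt, pred):.3f}")
```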

  5. Global Behavior in Large Scale Systems

    DTIC Science & Technology

    2013-12-05

    Air Force Research Laboratory, AF Office of Scientific Research (AFOSR)/RSL, Arlington, Virginia; December 3, 2013. Abstract: This research attained two main achievements: 1) ...microscopic random interactions among the agents. Introduction: In this research we considered two main problems: 1) large deviation error performance in...

  6. The New Big Science at the NSLS

    NASA Astrophysics Data System (ADS)

    Crease, Robert

    2016-03-01

    The term ``New Big Science'' refers to a phase shift in the kind of large-scale science that was carried out throughout the U.S. National Laboratory system, when large-scale materials science accelerators rather than high-energy physics accelerators became marquee projects at most major basic research laboratories in the post-Cold War era, accompanied by important changes in the character and culture of the research ecosystem at these laboratories. This talk explores some aspects of this phase shift at BNL's National Synchrotron Light Source.

  7. TARGET Publication Guidelines | Office of Cancer Genomics

    Cancer.gov

    Like other NCI large-scale genomics initiatives, TARGET is a community resource project and data are made available rapidly after validation for use by other researchers. To act in accord with the Fort Lauderdale principles and support the continued prompt public release of large-scale genomic data prior to publication, researchers who plan to prepare manuscripts containing descriptions of TARGET pediatric cancer data that would be of comparable scope to an initial TARGET disease-specific comprehensive, global analysis publication, and journal editors who receive such manuscripts, are

  8. Adjoint-Based Aerodynamic Design of Complex Aerospace Configurations

    NASA Technical Reports Server (NTRS)

    Nielsen, Eric J.

    2016-01-01

    An overview of twenty years of adjoint-based aerodynamic design research at NASA Langley Research Center is presented. Adjoint-based algorithms provide a powerful tool for efficient sensitivity analysis of complex large-scale computational fluid dynamics (CFD) simulations. Unlike alternative approaches for which computational expense generally scales with the number of design parameters, adjoint techniques yield sensitivity derivatives of a simulation output with respect to all input parameters at the cost of a single additional simulation. With modern large-scale CFD applications often requiring millions of compute hours for a single analysis, the efficiency afforded by adjoint methods is critical in realizing a computationally tractable design optimization capability for such applications.
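    The cost argument rests on the standard discrete-adjoint formulation (generic textbook form, not necessarily the exact implementation in the NASA Langley solvers). With flow residual R(q, α) = 0 and output J(q, α),

```latex
\left(\frac{\partial R}{\partial q}\right)^{\!T}\lambda
  = \left(\frac{\partial J}{\partial q}\right)^{\!T},
\qquad
\frac{\mathrm{d}J}{\mathrm{d}\alpha}
  = \frac{\partial J}{\partial \alpha}
  - \lambda^{T}\,\frac{\partial R}{\partial \alpha},
```

    so a single adjoint solve for λ yields the sensitivity of J with respect to every design parameter in α, independent of how many parameters there are.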

  9. Development of a superconductor magnetic suspension and balance prototype facility for studying the feasibility of applying this technique to large scale aerodynamic testing

    NASA Technical Reports Server (NTRS)

    Zapata, R. N.; Humphris, R. R.; Henderson, K. C.

    1975-01-01

    The basic research and development work towards proving the feasibility of operating an all-superconductor magnetic suspension and balance device for aerodynamic testing is presented. The feasibility of applying a quasi-six-degree-of-freedom free support technique to dynamic stability research was studied, along with the design concepts and parameters for applying magnetic suspension techniques to large-scale aerodynamic facilities. A prototype aerodynamic test facility was implemented. Relevant aspects of the development of the prototype facility are described in three sections: (1) design characteristics; (2) operational characteristics; and (3) scaling to larger facilities.

  10. Internationalization Measures in Large Scale Research Projects

    NASA Astrophysics Data System (ADS)

    Soeding, Emanuel; Smith, Nancy

    2017-04-01

    Large-scale research projects (LSRP) often serve as flagships used by universities or research institutions to demonstrate their performance and capability to stakeholders and other interested parties. As global competition among universities for the recruitment of the brightest brains has increased, effective internationalization measures have become hot topics for universities and LSRP alike. Nevertheless, most projects and universities face this challenge with little experience of how to conduct these measures and make internationalization a cost-efficient and useful activity. Furthermore, such undertakings constantly have to be justified to the project PIs as important, valuable tools to improve the capacity of the project and the research location. There is a variety of measures suited to supporting universities in international recruitment. These include, e.g., institutional partnerships, research marketing, a welcome culture, support for science mobility and an effective alumni strategy. These activities, although often conducted by different university entities, are interlocked and can be very powerful measures if interfaced in an effective way. On this poster we display a number of internationalization measures for various target groups and identify interfaces between project management, university administration, researchers and international partners to work together, exchange information and improve processes in order to be able to recruit, support and keep the brightest heads in a project.

  11. How Can the Evidence from Global Large-scale Clinical Trials for Cardiovascular Diseases be Improved?

    PubMed

    Sawata, Hiroshi; Tsutani, Kiichiro

    2011-06-29

    Clinical investigations are important for obtaining evidence to improve medical treatment. Large-scale clinical trials with thousands of participants are particularly important for this purpose in cardiovascular diseases. Conducting large-scale clinical trials entails high research costs. This study sought to investigate global trends in large-scale clinical trials in cardiovascular diseases. We searched for trials using clinicaltrials.gov (URL: http://www.clinicaltrials.gov/) using the key words 'cardio' and 'event' in all fields on 10 April, 2010. We then selected trials with 300 or more participants examining cardiovascular diseases. The search revealed 344 trials that met our criteria. Of the 344 trials, 71% were randomized controlled trials, 15% involved more than 10,000 participants, and 59% were funded by industry. Among RCTs whose results were disclosed, 55% of industry-funded trials and 25% of non-industry-funded trials reported statistically significant superiority over control (p = 0.012, 2-sided Fisher's exact test). Our findings highlight concerns regarding potential bias related to funding sources and indicate that researchers should be aware of the importance of trial information disclosure and conflicts of interest. Management of, and training in, information disclosure and conflicts of interest for researchers should continue to be addressed, as this could lead to better clinical evidence and further improvements in the development of medical treatment worldwide.
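    A small self-contained sketch of the two-sided Fisher's exact test used above, computed directly from the hypergeometric distribution; the 2x2 counts are hypothetical (chosen only to mirror the quoted 55% and 25% proportions), not the study's actual table:

```python
# Minimal sketch (hypothetical counts, not the study's actual table) of a
# two-sided Fisher's exact test comparing how often industry-funded vs.
# non-industry-funded RCTs reported superiority over control.
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact p-value for the 2x2 table [[a, b], [c, d]],
    summing the probabilities of all tables as or less likely than the observed one."""
    row1, row2, col1, n = a + b, c + d, a + c, a + b + c + d
    def prob(x):  # hypergeometric probability of a table with cell (1,1) = x
        return comb(row1, x) * comb(row2, col1 - x) / comb(n, col1)
    p_obs = prob(a)
    lo, hi = max(0, col1 - row2), min(row1, col1)
    return sum(prob(x) for x in range(lo, hi + 1) if prob(x) <= p_obs * (1 + 1e-12))

# Hypothetical table: rows = funding source, columns = superiority shown / not shown.
print(f"p = {fisher_exact_two_sided(33, 27, 10, 30):.4f}")
```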

  12. Survey of decentralized control methods. [for large scale dynamic systems

    NASA Technical Reports Server (NTRS)

    Athans, M.

    1975-01-01

    An overview is presented of the types of problems that are being considered by control theorists in the area of dynamic large scale systems with emphasis on decentralized control strategies. Approaches that deal directly with decentralized decision making for large scale systems are discussed. It is shown that future advances in decentralized system theory are intimately connected with advances in the stochastic control problem with nonclassical information pattern. The basic assumptions and mathematical tools associated with the latter are summarized, and recommendations concerning future research are presented.

  13. Using Large-Scale Databases in Evaluation: Advances, Opportunities, and Challenges

    ERIC Educational Resources Information Center

    Penuel, William R.; Means, Barbara

    2011-01-01

    Major advances in the number, capabilities, and quality of state, national, and transnational databases have opened up new opportunities for evaluators. Both large-scale data sets collected for administrative purposes and those collected by other researchers can provide data for a variety of evaluation-related activities. These include (a)…

  14. The Role of Reading Comprehension in Large-Scale Subject-Matter Assessments

    ERIC Educational Resources Information Center

    Zhang, Ting

    2013-01-01

    This study was designed with the overall goal of understanding how difficulties in reading comprehension are associated with early adolescents' performance in large-scale assessments in subject domains including science and civic-related social studies. The current study extended previous research by taking a cognition-centered approach based on…

  15. Linkages between large-scale climate patterns and the dynamics of Alaskan caribou populations

    Treesearch

    Kyle Joly; David R. Klein; David L. Verbyla; T. Scott Rupp; F. Stuart Chapin

    2011-01-01

    Recent research has linked climate warming to global declines in caribou and reindeer (both Rangifer tarandus) populations. We hypothesize large-scale climate patterns are a contributing factor explaining why these declines are not universal. To test our hypothesis for such relationships among Alaska caribou herds, we calculated the population growth...

  16. Reconsidering Replication: New Perspectives on Large-Scale School Improvement

    ERIC Educational Resources Information Center

    Peurach, Donald J.; Glazer, Joshua L.

    2012-01-01

    The purpose of this analysis is to reconsider organizational replication as a strategy for large-scale school improvement: a strategy that features a "hub" organization collaborating with "outlet" schools to enact school-wide designs for improvement. To do so, we synthesize a leading line of research on commercial replication to construct a…

  17. Teachers' Perceptions of Teaching in Workplace Simulations in Vocational Education

    ERIC Educational Resources Information Center

    Jossberger, Helen; Brand-Gruwel, Saskia; van de Wiel, Margje W.; Boshuizen, Henny P.

    2015-01-01

    In a large-scale top-down innovation operation in the Netherlands, workplace simulations have been implemented in vocational schools, where students are required to work independently and self-direct their learning. However, research has shown that the success of such large-scale top-down innovations depends on how well their execution in schools…

  18. The Large-Scale Biosphere-Atmosphere Experiment in Amazonia: Analyzing Regional Land Use Change Effects.

    Treesearch

    Michael Keller; Maria Assunção Silva-Dias; Daniel C. Nepstad; Meinrat O. Andreae

    2004-01-01

    The Large-Scale Biosphere-Atmosphere Experiment in Amazonia (LBA) is a multi-disciplinary, multinational scientific project led by Brazil. LBA researchers seek to understand Amazonia in its global context especially with regard to regional and global climate. Current development activities in Amazonia including deforestation, logging, cattle ranching, and agriculture...

  19. Introduction and Overview: Counseling Psychologists' Roles, Training, and Research Contributions to Large-Scale Disasters

    ERIC Educational Resources Information Center

    Jacobs, Sue C.; Leach, Mark M.; Gerstein, Lawrence H.

    2011-01-01

    Counseling psychologists have responded to many disasters, including the Haiti earthquake, the 2001 terrorist attacks in the United States, and Hurricane Katrina. However, as a profession, their responses have been localized and nonsystematic. In this first of four articles in this contribution, "Counseling Psychology and Large-Scale Disasters,…

  20. The shared and unique values of optical, fluorescence, thermal and microwave satellite data for estimating large-scale crop yields

    USDA-ARS?s Scientific Manuscript database

    Large-scale crop monitoring and yield estimation are important for both scientific research and practical applications. Satellite remote sensing provides an effective means for regional and global cropland monitoring, particularly in data-sparse regions that lack reliable ground observations and rep...

  1. Using the High-Level Based Program Interface to Facilitate the Large Scale Scientific Computing

    PubMed Central

    Shang, Yizi; Shang, Ling; Gao, Chuanchang; Lu, Guiming; Ye, Yuntao; Jia, Dongdong

    2014-01-01

    This paper extends research on facilitating large-scale scientific computing on grid and desktop-grid platforms. The related issues include the programming method, the overhead of middleware based on a high-level program interface, and anticipatory data migration. A block-based Gauss-Jordan algorithm, as a real example of large-scale scientific computing, is used to evaluate these issues. The results show that the high-level program interface makes complex scientific applications on large-scale platforms easier to build, though a little overhead is unavoidable. The anticipatory data-migration mechanism can also improve the efficiency of platforms that must process big-data-based scientific applications. PMID:24574931
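
    The paper's benchmark is a block-based Gauss-Jordan elimination distributed over grid workers. Purely as a point of reference, the minimal Python sketch below shows the serial, unblocked Gauss-Jordan kernel (matrix inversion with partial pivoting); the block decomposition and grid scheduling described in the paper are not reproduced here.

      import numpy as np

      def gauss_jordan_inverse(a: np.ndarray) -> np.ndarray:
          """Plain Gauss-Jordan inversion with partial pivoting (serial sketch only)."""
          n = a.shape[0]
          aug = np.hstack([a.astype(float), np.eye(n)])      # augment with identity
          for col in range(n):
              pivot = col + np.argmax(np.abs(aug[col:, col]))  # partial pivoting
              aug[[col, pivot]] = aug[[pivot, col]]            # swap pivot row up
              aug[col] /= aug[col, col]                        # normalise pivot row
              for row in range(n):
                  if row != col:
                      aug[row] -= aug[row, col] * aug[col]     # eliminate column entry
          return aug[:, n:]                                    # right half is the inverse

      a = np.array([[4.0, 2.0], [7.0, 6.0]])
      print(np.allclose(gauss_jordan_inverse(a) @ a, np.eye(2)))  # True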

  2. Follow YOUR Heart: development of an evidence-based campaign empowering older women with HIV to participate in a large-scale cardiovascular disease prevention trial.

    PubMed

    Zanni, Markella V; Fitch, Kathleen; Rivard, Corinne; Sanchez, Laura; Douglas, Pamela S; Grinspoon, Steven; Smeaton, Laura; Currier, Judith S; Looby, Sara E

    2017-03-01

    Women's under-representation in HIV and cardiovascular disease (CVD) research suggests a need for novel strategies to ensure robust representation of women in HIV-associated CVD research. We aimed to elicit perspectives on CVD research participation from a community sample of women with or at risk for HIV, and to apply the acquired insights to the development of an evidence-based campaign empowering older women with HIV to participate in a large-scale CVD prevention trial. In a community-based setting, we surveyed 40 women with or at risk for HIV about factors that might facilitate or impede engagement in CVD research. We applied insights derived from these surveys to the development of the Follow YOUR Heart campaign, educating women about HIV-associated CVD and empowering them to learn more about a multi-site HIV-associated CVD prevention trial, REPRIEVE. Endorsed best methods for learning about a CVD research study included peer-to-peer communication (54%), provider communication (46%) and video-based communication (39%). Top endorsed non-monetary reasons for participating in research related to gaining information (63%) and helping others (47%). Top endorsed reasons for not participating related to lack of knowledge about studies (29%) and lack of being asked to participate (29%). Based on the survey results, the REPRIEVE Follow YOUR Heart campaign was developed. Interwoven campaign components (print materials, video, web presence) offer provider-based information, peer-to-peer communication, and empowerment to learn more. The campaign components reflect women's self-identified motivations for research participation: education and altruism. Investigation of the factors influencing women's participation in HIV-associated CVD research may be usefully applied to develop evidence-based strategies for enhancing women's enrollment in disease-specific large-scale trials. If proven efficacious, such strategies may enhance the conduct of large-scale research studies across disciplines.

  3. Research on unit commitment with large-scale wind power connected power system

    NASA Astrophysics Data System (ADS)

    Jiao, Ran; Zhang, Baoqun; Chi, Zhongjun; Gong, Cheng; Ma, Longfei; Yang, Bing

    2017-01-01

    Large-scale integration of wind power generators into the power grid brings severe challenges to power system economic dispatch because of the stochastic volatility of wind. Unit commitment including wind farms is analyzed in two parts: modeling and solution methods. The structures and characteristics of existing approaches are summarized after classifying them by objective function and constraints. Finally, the open issues and possible directions of future research and development are discussed, with a view to the requirements of the electricity market, energy-saving generation dispatch and the smart grid, providing a reference for researchers and practitioners in this field.

  4. The USC Epigenome Center.

    PubMed

    Laird, Peter W

    2009-10-01

    The University of Southern California (USC, CA, USA) has a long tradition of excellence in epigenetics. With the recent explosive growth and technological maturation of the field of epigenetics, it became clear that a dedicated high-throughput epigenomic data production facility would be needed to remain at the forefront of epigenetic research. To address this need, USC launched the USC Epigenome Center as the first large-scale center in academia dedicated to epigenomic research. The Center is providing high-throughput data production for large-scale genomic and epigenomic studies, and developing novel analysis tools for epigenomic research. This unique facility promises to be a valuable resource for multidisciplinary research, education and training in genomics, epigenomics, bioinformatics, and translational medicine.

  5. Large-scale standardized phenotyping of strawberry in RosBREED

    USDA-ARS?s Scientific Manuscript database

    A large, multi-institutional, international, research project with the goal of bringing genomicists and plant breeders together was funded by USDA-NIFA Specialty Crop Research Initiative. Apple, cherry, peach, and strawberry are the Rosaceous crops included in the project. Many (900+) strawberry g...

  6. Experimental feasibility study of the application of magnetic suspension techniques to large-scale aerodynamic test facilities. [cryogenic transonic wind tunnel]

    NASA Technical Reports Server (NTRS)

    Zapata, R. N.; Humphris, R. R.; Henderson, K. C.

    1975-01-01

    Based on the premises that magnetic suspension techniques can play a useful role in large scale aerodynamic testing, and that superconductor technology offers the only practical hope for building large scale magnetic suspensions, an all-superconductor 3-component magnetic suspension and balance facility was built as a prototype and tested successfully. Quantitative extrapolations of design and performance characteristics of this prototype system to larger systems compatible with existing and planned high Reynolds number facilities at Langley Research Center were made and show that this experimental technique should be particularly attractive when used in conjunction with large cryogenic wind tunnels.

  7. Gas Generators and Their Potential to Support Human-Scale HIADS (Hypersonic Inflatable Aerodynamic Decelerators)

    NASA Technical Reports Server (NTRS)

    Bodkin, Richard J.; Cheatwood, F. M.; Dillman, Robert A; Dinonno, John M.; Hughes, Stephen J.; Lucy, Melvin H.

    2016-01-01

    As HIAD technology progresses from 3-m diameter experimental scale to as large as 20-m diameter for human Mars entry, the mass penalties of carrying compressed gas have led the HIAD team to research current state-of-the-art gas generator approaches. Summarized below are several technologies identified in this survey, along with some of the pros and cons with respect to supporting large-scale HIAD applications.

  8. A multidisciplinary approach to the development of low-cost high-performance lightwave networks

    NASA Technical Reports Server (NTRS)

    Maitan, Jacek; Harwit, Alex

    1991-01-01

    Our research focuses on high-speed distributed systems. We anticipate that our results will allow the fabrication of low-cost networks employing multi-gigabit-per-second data links for space and military applications. The recent development of high-speed low-cost photonic components and new generations of microprocessors creates an opportunity to develop advanced large-scale distributed information systems. These systems currently involve hundreds of thousands of nodes and are made up of components and communications links that may fail during operation. In order to realize these systems, research is needed into technologies that foster adaptability and scalability. Self-organizing mechanisms are needed to integrate a working fabric of large-scale distributed systems. The challenge is to fuse theory, technology, and development methodologies to construct a cost-effective, efficient, large-scale system.

  9. Outlook and Challenges of Perovskite Solar Cells toward Terawatt-Scale Photovoltaic Module Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu, Kai; Kim, Donghoe; Whitaker, James B

    Rapid development of perovskite solar cells (PSCs) during the past several years has made this photovoltaic (PV) technology a serious contender for potential large-scale deployment on the terawatt scale in the PV market. To successfully transition PSC technology from the laboratory to industry scale, substantial efforts need to focus on scalable fabrication of high-performance perovskite modules with minimum negative environmental impact. Here, we provide an overview of the current research and our perspective regarding PSC technology toward future large-scale manufacturing and deployment. Several key challenges discussed are (1) a scalable process for large-area perovskite module fabrication; (2) less hazardous chemical routes for PSC fabrication; and (3) suitable perovskite module designs for different applications.

  10. Fire Whirls

    NASA Astrophysics Data System (ADS)

    Tohidi, Ali; Gollner, Michael J.; Xiao, Huahua

    2018-01-01

    Fire whirls present a powerful intensification of combustion, long studied in the fire research community because of the dangers they present during large urban and wildland fires. However, their destructive power has hidden many features of their formation, growth, and propagation. Therefore, most of what is known about fire whirls comes from scale modeling experiments in the laboratory. Both the methods of formation, which are dominated by wind and geometry, and the inner structure of the whirl, including velocity and temperature fields, have been studied at this scale. Quasi-steady fire whirls directly over a fuel source form the bulk of current experimental knowledge, although many other cases exist in nature. The structure of fire whirls has yet to be reliably measured at large scales; however, scaling laws have been relatively successful in modeling the conditions for formation from small to large scales. This review surveys the state of knowledge concerning the fluid dynamics of fire whirls, including the conditions for their formation, their structure, and the mechanisms that control their unique state. We highlight recent discoveries and survey potential avenues for future research, including using the properties of fire whirls for efficient remediation and energy generation.

  11. Analysis on the Critical Rainfall Value For Predicting Large Scale Landslides Caused by Heavy Rainfall In Taiwan.

    NASA Astrophysics Data System (ADS)

    Tsai, Kuang-Jung; Chiang, Jie-Lun; Lee, Ming-Hsi; Chen, Yie-Ruey

    2017-04-01

    The accumulated rainfall brought by Typhoon Morakot in August 2009 exceeded 2,900 mm within three consecutive days. Very serious landslides and sediment-related disasters were induced by this heavy rainfall event. The satellite image analysis project conducted by the Soil and Water Conservation Bureau after the Morakot event identified more than 10,904 landslide sites with a total sliding area of 18,113 ha. At the same time, all severe sediment-related disaster areas were characterized by disaster type, scale, topography, major bedrock formations and geologic structures during the period of extremely heavy rainfall events in southern Taiwan. The characteristics and mechanisms of large-scale landslides were compiled on the basis of field investigation integrated with GPS/GIS/RS techniques. To decrease the risk of large-scale landslides on slope land, a strategy for slope-land conservation and a critical rainfall database should be established and implemented as soon as possible. Meanwhile, establishing a critical rainfall value for predicting large-scale landslides induced by heavy rainfall has become an important issue of serious concern to the government and the people of Taiwan. This research addresses the mechanism of large-scale landslides, rainfall frequency analysis, sediment budget estimation and river hydraulic analysis under the extreme climate change of the past 10 years. The results are intended to serve as a warning system for predicting large-scale landslides in southern Taiwan. Keywords: heavy rainfall, large-scale landslides, critical rainfall value

  12. Achieving online consent to participation in large-scale gene-environment studies: a tangible destination.

    PubMed

    Wood, Fiona; Kowalczuk, Jenny; Elwyn, Glyn; Mitchell, Clive; Gallacher, John

    2011-08-01

    Population based genetics studies are dependent on large numbers of individuals in the pursuit of small effect sizes. Recruiting and consenting a large number of participants is both costly and time consuming. We explored whether an online consent process for large-scale genetics studies is acceptable for prospective participants using an example online genetics study. We conducted semi-structured interviews with 42 members of the public stratified by age group, gender and newspaper readership (a measure of social status). Respondents were asked to use a website designed to recruit for a large-scale genetic study. After using the website a semi-structured interview was conducted to explore opinions and any issues they would have. Responses were analysed using thematic content analysis. The majority of respondents said they would take part in the research (32/42). Those who said they would decline to participate saw fewer benefits from the research, wanted more information and expressed a greater number of concerns about the study. Younger respondents had concerns over time commitment. Middle aged respondents were concerned about privacy and security. Older respondents were more altruistic in their motivation to participate. Common themes included trust in the authenticity of the website, security of personal data, curiosity about their own genetic profile, operational concerns and a desire for more information about the research. Online consent to large-scale genetic studies is likely to be acceptable to the public. The online consent process must establish trust quickly and effectively by asserting authenticity and credentials, and provide access to a range of information to suit different information preferences.

  13. Low Cost Manufacturing of Composite Cryotanks

    NASA Technical Reports Server (NTRS)

    Meredith, Brent; Palm, Tod; Deo, Ravi; Munafo, Paul M. (Technical Monitor)

    2002-01-01

    This viewgraph presentation reviews research and development of cryotank manufacturing conducted by Northrop Grumman. The objectives of the research and development included the development and validation of manufacturing processes and technology for fabrication of large scale cryogenic tanks, the establishment of a scale-up and facilitization plan for full scale cryotanks, the development of non-autoclave composite manufacturing processes, the fabrication of subscale tank joints for element tests, the performance of manufacturing risk reduction trials for the subscale tank, and the development of full-scale tank manufacturing concepts.

  14. Children's Understanding of Large-Scale Mapping Tasks: An Analysis of Talk, Drawings, and Gesture

    ERIC Educational Resources Information Center

    Kotsopoulos, Donna; Cordy, Michelle; Langemeyer, Melanie

    2015-01-01

    This research examined how children represent motion in large-scale mapping tasks that we referred to as "motion maps". The underlying mathematical content was transformational geometry. In total, 19 children aged 8 to 10 created motion maps and captured them digitally with accompanying verbal descriptions. Analysis of…

  15. Counseling Psychology and Large-Scale Disasters: Moving on to Action, Practice, and Research

    ERIC Educational Resources Information Center

    Jacobs, Sue C.; Hoffman, Mary Ann; Leach, Mark M.; Gerstein, Lawrence H.

    2011-01-01

    Juntunen and Parham each reacted positively with important personal reflections and/or calls to action in response to "Counseling Psychology and Large-Scale Disasters, Catastrophes, and Traumas: Opportunities for Growth." We comment on the primary themes and suggestions they raised. Since the time we were stimulated by Katrina and its aftermath…

  16. Status of JUPITER Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Inoue, T.; Shirakata, K.; Kinjo, K.

    To obtain the data necessary for evaluating the nuclear design method of a large-scale fast breeder reactor, criticality tests with a large-scale homogeneous reactor were conducted as part of a joint research program by Japan and the U.S. Analyses of the tests are underway in both countries. The purpose of this paper is to describe the status of this project.

  17. Large-scale silviculture experiments of western Oregon and Washington.

    Treesearch

    Nathan J. Poage; Paul D. Anderson

    2007-01-01

    We review 12 large-scale silviculture experiments (LSSEs) in western Washington and Oregon with which the Pacific Northwest Research Station of the USDA Forest Service is substantially involved. We compiled and arrayed information about the LSSEs as a series of matrices in a relational database, which is included on the compact disc published with this report and...

  18. The Contribution of International Large-Scale Assessments to Educational Research: Combining Individual and Institutional Data Sources

    ERIC Educational Resources Information Center

    Strietholt, Rolf; Scherer, Ronny

    2018-01-01

    The present paper aims to discuss how data from international large-scale assessments (ILSAs) can be utilized and combined, even with other existing data sources, in order to monitor educational outcomes and study the effectiveness of educational systems. We consider different purposes of linking data, namely, extending outcomes measures,…

  19. Psychology in an Interdisciplinary Setting: A Large-Scale Project to Improve University Teaching

    ERIC Educational Resources Information Center

    Koch, Franziska D.; Vogt, Joachim

    2015-01-01

    At a German university of technology, a large-scale project was funded as a part of the "Quality Pact for Teaching", a programme launched by the German Federal Ministry of Education and Research to improve the quality of university teaching and study conditions. The project aims at intensifying interdisciplinary networking in teaching,…

  20. Towards large-scale, human-based, mesoscopic neurotechnologies.

    PubMed

    Chang, Edward F

    2015-04-08

    Direct human brain recordings have transformed the scope of neuroscience in the past decade. Progress has relied upon currently available neurophysiological approaches in the context of patients undergoing neurosurgical procedures for medical treatment. While this setting has provided precious opportunities for scientific research, it also has presented significant constraints on the development of new neurotechnologies. A major challenge now is how to achieve high-resolution spatiotemporal neural recordings at a large scale. By narrowing the gap between current approaches, new directions tailored to the mesoscopic (intermediate) scale of resolution may overcome the barriers towards safe and reliable human-based neurotechnology development, with major implications for advancing both basic research and clinical translation. Copyright © 2015 Elsevier Inc. All rights reserved.

  1. Globus | Informatics Technology for Cancer Research (ITCR)

    Cancer.gov

    Globus software services provide secure cancer research data transfer, synchronization, and sharing in distributed environments at large scale. These services can be integrated into applications and research data gateways, leveraging Globus identity management, single sign-on, search, and authorization capabilities. Globus Genomics integrates Globus with the Galaxy genomics workflow engine and Amazon Web Services to enable cancer genomics analysis that can elastically scale compute resources with demand.

  2. Stability of Rasch Scales over Time

    ERIC Educational Resources Information Center

    Taylor, Catherine S.; Lee, Yoonsun

    2010-01-01

    Item response theory (IRT) methods are generally used to create score scales for large-scale tests. Research has shown that IRT scales are stable across groups and over time. Most studies have focused on items that are dichotomously scored. Now Rasch and other IRT models are used to create scales for tests that include polytomously scored items.…
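
    Scales like those discussed in this entry rest on the Rasch item response function, in which the probability of a correct response depends only on the difference between person ability and item difficulty. The minimal Python sketch below evaluates that function for hypothetical item difficulties; it illustrates the model form only, not the study's estimation procedure.

      import numpy as np

      def rasch_prob(theta: float, b: np.ndarray) -> np.ndarray:
          """P(correct) under the dichotomous Rasch model: logit = theta - b."""
          return 1.0 / (1.0 + np.exp(-(theta - b)))

      item_difficulties = np.array([-1.0, 0.0, 1.5])   # hypothetical b parameters
      print(rasch_prob(0.5, item_difficulties))        # approx. [0.82, 0.62, 0.27]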

  3. Application and research of block caving in Pulang copper mine

    NASA Astrophysics Data System (ADS)

    Ge, Qifa; Fan, Wenlu; Zhu, Weigen; Chen, Xiaowei

    2018-01-01

    The application of block caving in mines shows significant advantages in large scale, low cost and high efficiency, so block caving is worth promoting in mines that meet the requirements for natural caving. Because of the large scale of production and the low ore grade of the Pulang copper mine in China, comprehensive analysis and research were conducted on rock mechanics, mining sequence, undercutting and the stability of the bottom structure, with the aims of raising mine benefit and maximizing the recovery of mineral resources. The study concludes that block caving is well suited to the Pulang copper mine.

  4. HSTDEK: Developing a methodology for construction of large-scale, multi-use knowledge bases

    NASA Technical Reports Server (NTRS)

    Freeman, Michael S.

    1987-01-01

    The primary research objective of the Hubble Space Telescope Design/Engineering Knowledgebase (HSTDEK) is to develop a methodology for constructing and maintaining large scale knowledge bases which can be used to support multiple applications. To ensure the validity of its results, this research is being pursued in the context of a real-world system, the Hubble Space Telescope. The HSTDEK objectives are described in detail. The history and motivation of the project are briefly described. The technical challenges faced by the project are outlined.

  5. Crowdsourced 'R&D' and medical research.

    PubMed

    Callaghan, Christian William

    2015-09-01

    Crowdsourced R&D, a research methodology increasingly applied to medical research, has properties well suited to large-scale medical data collection and analysis, as well as enabling rapid research responses to crises such as disease outbreaks. Multidisciplinary literature offers diverse perspectives of crowdsourced R&D as a useful large-scale medical data collection and research problem-solving methodology. Crowdsourced R&D has demonstrated 'proof of concept' in a host of different biomedical research applications. A wide range of quality and ethical issues relate to crowdsourced R&D. The rapid growth in applications of crowdsourced R&D in medical research is predicted by an increasing body of multidisciplinary theory. Further research in areas such as artificial intelligence may allow better coordination and management of the high volumes of medical data and problem-solving inputs generated by the crowdsourced R&D process. © The Author 2015. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  6. Shifting Interests: Changes in the Lexical Semantics of ED-MEDIA

    ERIC Educational Resources Information Center

    Wild, Fridolin; Valentine, Chris; Scott, Peter

    2010-01-01

    Large research networks naturally form complex communities with overlapping but not identical expertise. To map the distribution of professional competence in field of "technology-enhanced learning", the lexical semantics expressed in research articles published in a representative, large-scale conference (ED-MEDIA) can be investigated and changes…

  7. Satellite-based characterization of climatic conditions before large-scale general flowering events in Peninsular Malaysia

    PubMed Central

    Azmy, Muna Maryam; Hashim, Mazlan; Numata, Shinya; Hosaka, Tetsuro; Noor, Nur Supardi Md.; Fletcher, Christine

    2016-01-01

    General flowering (GF) is a unique phenomenon wherein, at irregular intervals, taxonomically diverse trees in Southeast Asian dipterocarp forests synchronize their reproduction at the community level. Triggers of GF, including drought and low minimum temperatures a few months beforehand, have rarely been observed across large regional scales because of the lack of meteorological stations. Here, we aim to identify the climatic conditions that trigger large-scale GF in Peninsular Malaysia using satellite sensors, Tropical Rainfall Measuring Mission (TRMM) and Moderate Resolution Imaging Spectroradiometer (MODIS), to evaluate the climatic conditions of focal forests. We observed antecedent drought, low temperature and high photosynthetic radiation conditions before large-scale GF events, suggesting that large-scale GF events could be triggered by these factors. In contrast, we found higher-magnitude GF in forests where lower precipitation preceded large-scale GF events. GF magnitude was also negatively influenced by land surface temperature (LST) for a large-scale GF event. Therefore, we suggest that the spatial extent of drought may be related to that of GF forests, and that the spatial pattern of LST may be related to that of GF occurrence. With significant new findings, together with results consistent with previous research, we clarified the complicated environmental correlates of the GF phenomenon. PMID:27561887

  8. Satellite-based characterization of climatic conditions before large-scale general flowering events in Peninsular Malaysia.

    PubMed

    Azmy, Muna Maryam; Hashim, Mazlan; Numata, Shinya; Hosaka, Tetsuro; Noor, Nur Supardi Md; Fletcher, Christine

    2016-08-26

    General flowering (GF) is a unique phenomenon wherein, at irregular intervals, taxonomically diverse trees in Southeast Asian dipterocarp forests synchronize their reproduction at the community level. Triggers of GF, including drought and low minimum temperatures a few months beforehand, have rarely been observed across large regional scales because of the lack of meteorological stations. Here, we aim to identify the climatic conditions that trigger large-scale GF in Peninsular Malaysia using satellite sensors, Tropical Rainfall Measuring Mission (TRMM) and Moderate Resolution Imaging Spectroradiometer (MODIS), to evaluate the climatic conditions of focal forests. We observed antecedent drought, low temperature and high photosynthetic radiation conditions before large-scale GF events, suggesting that large-scale GF events could be triggered by these factors. In contrast, we found higher-magnitude GF in forests where lower precipitation preceded large-scale GF events. GF magnitude was also negatively influenced by land surface temperature (LST) for a large-scale GF event. Therefore, we suggest that the spatial extent of drought may be related to that of GF forests, and that the spatial pattern of LST may be related to that of GF occurrence. With significant new findings, together with results consistent with previous research, we clarified the complicated environmental correlates of the GF phenomenon.

  9. Ice Accretion Test Results for Three Large-Scale Swept-Wing Models in the NASA Icing Research Tunnel

    NASA Technical Reports Server (NTRS)

    Broeren, Andy; Potapczuk, Mark; Lee, Sam; Malone, Adam; Paul, Ben; Woodard, Brian

    2016-01-01

    The design and certification of modern transport airplanes for flight in icing conditions increasingly relies on three-dimensional numerical simulation tools for ice accretion prediction. There is currently no publicly available, high-quality ice accretion database upon which to evaluate the performance of icing simulation tools for large-scale swept wings that are representative of modern commercial transport airplanes. The purpose of this presentation is to present the results of a series of icing wind tunnel test campaigns whose aim was to provide an ice accretion database for large-scale, swept wings.

  10. Tools for Large-Scale Mobile Malware Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bierma, Michael

    Analyzing mobile applications for malicious behavior is an important area of research, and is made difficult, in part, by the increasingly large number of applications available for the major operating systems. There are currently over 1.2 million apps available in both the Google Play and Apple App stores (the respective official marketplaces for the Android and iOS operating systems)[1, 2]. Our research provides two large-scale analysis tools to aid in the detection and analysis of mobile malware. The first tool we present, Andlantis, is a scalable dynamic analysis system capable of processing over 3000 Android applications per hour. Traditionally, Android dynamic analysis techniques have been relatively limited in scale due to the computational resources required to emulate the full Android system to achieve accurate execution. Andlantis is the most scalable Android dynamic analysis framework to date, and is able to collect valuable forensic data, which helps reverse-engineers and malware researchers identify and understand anomalous application behavior. We discuss the results of running 1261 malware samples through the system, and provide examples of malware analysis performed with the resulting data. While techniques exist to perform static analysis on a large number of applications, large-scale analysis of iOS applications has been relatively small scale due to the closed nature of the iOS ecosystem, and the difficulty of acquiring applications for analysis. The second tool we present, iClone, addresses the challenges associated with iOS research in order to detect application clones within a dataset of over 20,000 iOS applications.

  11. Towards Large-area Field-scale Operational Evapotranspiration for Water Use Mapping

    NASA Astrophysics Data System (ADS)

    Senay, G. B.; Friedrichs, M.; Morton, C.; Huntington, J. L.; Verdin, J.

    2017-12-01

    Field-scale evapotranspiration (ET) estimates are needed for improving surface and groundwater use and water budget studies. Ideally, field-scale ET estimates would be available at regional to national levels and cover long time periods. Because of the large data storage and computational requirements associated with processing field-scale satellite imagery such as Landsat, numerous challenges remain in developing operational ET estimates over large areas for detailed water use and availability studies. However, the combination of new science, data availability, and cloud computing technology is enabling unprecedented capabilities for ET mapping. To demonstrate this capability, we used Google's Earth Engine cloud computing platform to create nationwide annual ET estimates with 30-meter resolution Landsat (approximately 16,000 images) and gridded weather data using the Operational Simplified Surface Energy Balance (SSEBop) model in support of the National Water Census, a USGS research program designed to build decision support capacity for water management agencies and other natural resource managers. By leveraging Google's Earth Engine Application Programming Interface (API) and developing software in a collaborative, open-platform environment, we rapidly advance from research towards applications for large-area, field-scale ET mapping. Cloud computing of the Landsat image archive, combined with other satellite, climate, and weather data, is creating unprecedented opportunities for assessing ET model behavior and uncertainty, and ultimately providing the ability for more robust operational monitoring and assessment of water use at field scales.
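
    At the core of the SSEBop approach is a per-pixel ET fraction computed from land-surface temperature relative to predefined cold and hot reference conditions, which then scales a gridded reference ET. The Python sketch below illustrates that idea on scalar inputs; the variable names, the clamping range and the simple parameterization are assumptions made for illustration, and the snippet is not the USGS or Earth Engine implementation.

      import numpy as np

      def ssebop_eta(ts, t_cold, d_t, eto, k=1.0):
          """Hedged sketch of the SSEBop ET-fraction idea (not the operational code).

          ts     : land-surface temperature for the pixel (K)
          t_cold : 'cold' reference temperature of a well-watered pixel (K)
          d_t    : predefined hot/cold temperature difference (K)
          eto    : reference ET from gridded weather data (mm)
          k      : scaling coefficient (assumed 1.0 here)
          """
          et_fraction = np.clip((t_cold + d_t - ts) / d_t, 0.0, 1.05)
          return et_fraction * k * eto

      print(ssebop_eta(ts=305.0, t_cold=300.0, d_t=10.0, eto=6.0))  # ~3.0 mm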

  12. Improving data workflow systems with cloud services and use of open data for bioinformatics research.

    PubMed

    Karim, Md Rezaul; Michel, Audrey; Zappa, Achille; Baranov, Pavel; Sahay, Ratnesh; Rebholz-Schuhmann, Dietrich

    2017-04-16

    Data workflow systems (DWFSs) enable bioinformatics researchers to combine components for data access and data analytics, and to share the final data analytics approach with their collaborators. Increasingly, such systems have to cope with large-scale data, such as full genomes (about 200 GB each), public fact repositories (about 100 TB of data) and 3D imaging data at even larger scales. As moving the data becomes cumbersome, the DWFS needs to embed its processes into a cloud infrastructure, where the data are already hosted. As the standardized public data play an increasingly important role, the DWFS needs to comply with Semantic Web technologies. This advancement to DWFS would reduce overhead costs and accelerate the progress in bioinformatics research based on large-scale data and public resources, as researchers would require less specialized IT knowledge for the implementation. Furthermore, the high data growth rates in bioinformatics research drive the demand for parallel and distributed computing, which then imposes a need for scalability and high-throughput capabilities onto the DWFS. As a result, requirements for data sharing and access to public knowledge bases suggest that compliance of the DWFS with Semantic Web standards is necessary. In this article, we will analyze the existing DWFS with regard to their capabilities toward public open data use as well as large-scale computational and human interface requirements. We untangle the parameters for selecting a preferable solution for bioinformatics research with particular consideration to using cloud services and Semantic Web technologies. Our analysis leads to research guidelines and recommendations toward the development of future DWFS for the bioinformatics research community. © The Author 2017. Published by Oxford University Press.

  13. Space Weather Research at the National Science Foundation

    NASA Astrophysics Data System (ADS)

    Moretto, T.

    2015-12-01

    There is growing recognition that the space environment can have substantial, deleterious impacts on society. Consequently, research enabling specification and forecasting of hazardous space effects has become of great importance and urgency. This research requires studying the entire Sun-Earth system to understand the coupling of regions all the way from the source of disturbances in the solar atmosphere to the Earth's upper atmosphere. The traditional, region-based structure of research programs in solar and space physics is ill suited to fully support the change in research directions that the problem of space weather dictates. On the observational side, dense, distributed networks of observations are required to capture the full large-scale dynamics of the space environment. However, the cost of implementing these is typically prohibitive, especially for measurements in space. Thus, by necessity, the implementation of such new capabilities needs to build on creative and unconventional solutions. A particularly powerful idea is the utilization of new developments in data engineering and informatics research (big data). These new technologies make it possible to build systems that can collect and process huge amounts of noisy and inaccurate data and extract from them useful information. The shift in emphasis towards system-level science for geospace also necessitates the development of large-scale and multi-scale models. The development of large-scale models capable of capturing the global dynamics of the Earth's space environment requires investment in research team efforts that go beyond what can typically be funded under the traditional grants programs. This calls for effective interdisciplinary collaboration and efficient leveraging of resources both nationally and internationally. This presentation will provide an overview of current and planned initiatives, programs, and activities at the National Science Foundation pertaining to space weather research.

  14. Propulsion simulator for magnetically-suspended wind tunnel models

    NASA Technical Reports Server (NTRS)

    Joshi, Prakash B.; Goldey, C. L.; Sacco, G. P.; Lawing, Pierce L.

    1991-01-01

    The objective of phase two of a current investigation sponsored by NASA Langley Research Center is to demonstrate the measurement of aerodynamic forces/moments, including the effects of exhaust gases, in magnetic suspension and balance system (MSBS) wind tunnels. Two propulsion simulator models are being developed: a small-scale and a large-scale unit, both employing compressed, liquefied carbon dioxide as propellant. The small-scale unit was designed, fabricated, and statically tested at Physical Sciences Inc. (PSI). The large-scale simulator is currently in the preliminary design stage. The small-scale simulator design and development are presented, and the data from its static firing on a thrust stand are discussed. The analysis of these data provides important information for the design of the large-scale unit. A description of the preliminary design of the device is also presented.

  15. Simulation research on the process of large scale ship plane segmentation intelligent workshop

    NASA Astrophysics Data System (ADS)

    Xu, Peng; Liao, Liangchuang; Zhou, Chao; Xue, Rui; Fu, Wei

    2017-04-01

    The large-scale ship plane-segmentation intelligent workshop is a new development, and little research exists in related fields either domestically or internationally. The mode of production must be transformed from the existing Industry 2.0, or partly Industry 3.0, pattern of "human analysis and judgment + machine manufacturing" to "machine analysis and judgment + machine manufacturing". In this transformation, a great number of questions concerning management and technology need to be settled, such as the evolution of the workshop structure, the development of intelligent equipment and changes in the business model; together these amount to a reformation of the whole workshop. Process simulation in this project verifies the general layout and process flow of the large-scale ship plane-segmentation intelligent workshop and analyzes its working efficiency, which is significant for the next step of the transformation.

  16. NASA's Information Power Grid: Large Scale Distributed Computing and Data Management

    NASA Technical Reports Server (NTRS)

    Johnston, William E.; Vaziri, Arsi; Hinke, Tom; Tanner, Leigh Ann; Feiereisen, William J.; Thigpen, William; Tang, Harry (Technical Monitor)

    2001-01-01

    Large-scale science and engineering are done through the interaction of people, heterogeneous computing resources, information systems, and instruments, all of which are geographically and organizationally dispersed. The overall motivation for Grids is to facilitate the routine interactions of these resources in order to support large-scale science and engineering. Multi-disciplinary simulations provide a good example of a class of applications that are very likely to require aggregation of widely distributed computing, data, and intellectual resources. Such simulations - e.g. whole-system aircraft simulation and whole-system living cell simulation - require integrating applications and data that are developed by different teams of researchers, frequently in different locations. The research teams are the only ones that have the expertise to maintain and improve the simulation code and/or the body of experimental data that drives the simulations. This results in an inherently distributed computing and data management environment.

  17. Impact of spectral nudging on the downscaling of tropical cyclones in regional climate simulations

    NASA Astrophysics Data System (ADS)

    Choi, Suk-Jin; Lee, Dong-Kyou

    2016-06-01

    This study investigated the simulations of three months of seasonal tropical cyclone (TC) activity over the western North Pacific using the Advanced Research WRF Model. In the control experiment (CTL), the TC frequency was considerably overestimated. Additionally, the tracks of some TCs tended to have larger radii of curvature and were shifted eastward. The large-scale environments of westerly monsoon flows and subtropical Pacific highs were unrealistically simulated. The overestimated frequency of TC formation was attributed to a strengthened westerly wind field in the southern quadrants of the TC center. In comparison with the experiment using the spectral nudging method, the strengthened wind speed was mainly modulated by large-scale flow greater than approximately 1000 km in the model domain. The spurious formation and unrealistic tracks of TCs in the CTL were considerably improved by reproducing realistic large-scale atmospheric monsoon circulation, with substantial adjustment between large-scale flow in the model domain and large-scale boundary forcing modified by the spectral nudging method. The realistic monsoon circulation played a vital role in simulating realistic TCs. This reveals that, in downscaling from large-scale fields for regional climate simulations, scale interaction between model-generated regional features and forced large-scale fields should be considered, and that spectral nudging is a desirable approach for such downscaling.
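
    Spectral nudging constrains only the longest wavelengths of the regional model state toward the driving large-scale fields, leaving smaller scales free to develop. The one-dimensional Python sketch below is a conceptual illustration of that selective relaxation under assumed parameters (number of retained wavenumbers, nudging coefficient); it does not reproduce the WRF implementation used in the study.

      import numpy as np

      def spectral_nudge(field, driving, n_keep=3, alpha=0.1):
          """Relax only the low-wavenumber part of 'field' toward 'driving'
          (a 1-D conceptual sketch of spectral nudging, not the WRF scheme)."""
          f_hat = np.fft.rfft(field)
          d_hat = np.fft.rfft(driving)
          f_hat[:n_keep] += alpha * (d_hat[:n_keep] - f_hat[:n_keep])  # nudge large scales
          return np.fft.irfft(f_hat, n=field.size)                     # small scales untouched

      x = np.linspace(0.0, 2.0 * np.pi, 128, endpoint=False)
      regional = np.sin(x) + 0.3 * np.sin(12.0 * x)   # large-scale + small-scale parts
      driving = 1.2 * np.sin(x)                        # driving large-scale flow
      print(spectral_nudge(regional, driving).shape)   # (128,)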

  18. Contrasting styles of large-scale displacement of unconsolidated sand: examples from the early Jurassic Navajo Sandstone on the Colorado Plateau, USA

    NASA Astrophysics Data System (ADS)

    Bryant, Gerald

    2015-04-01

    Large-scale soft-sediment deformation features in the Navajo Sandstone have been a topic of interest for nearly 40 years, ever since they were first explored as a criterion for discriminating between marine and continental processes in the depositional environment. For much of this time, evidence for large-scale sediment displacements was commonly attributed to processes of mass wasting. That is, gravity-driven movements of surficial sand. These slope failures were attributed to the inherent susceptibility of dune sand responding to environmental triggers such as earthquakes, floods, impacts, and the differential loading associated with dune topography. During the last decade, a new wave of research is focusing on the event significance of deformation features in more detail, revealing a broad diversity of large-scale deformation morphologies. This research has led to a better appreciation of subsurface dynamics in the early Jurassic deformation events recorded in the Navajo Sandstone, including the important role of intrastratal sediment flow. This report documents two illustrative examples of large-scale sediment displacements represented in extensive outcrops of the Navajo Sandstone along the Utah/Arizona border. Architectural relationships in these outcrops provide definitive constraints that enable the recognition of a large-scale sediment outflow, at one location, and an equally large-scale subsurface flow at the other. At both sites, evidence for associated processes of liquefaction appear at depths of at least 40 m below the original depositional surface, which is nearly an order of magnitude greater than has commonly been reported from modern settings. The surficial, mass flow feature displays attributes that are consistent with much smaller-scale sediment eruptions (sand volcanoes) that are often documented from modern earthquake zones, including the development of hydraulic pressure from localized, subsurface liquefaction and the subsequent escape of fluidized sand toward the unconfined conditions of the surface. The origin of the forces that produced the lateral, subsurface movement of a large body of sand at the other site is not readily apparent. The various constraints on modeling the generation of the lateral force required to produce the observed displacement are considered here, along with photodocumentation of key outcrop relationships.

  19. A user-friendly tool to transform large scale administrative data into wide table format using a MapReduce program with a Pig Latin based script.

    PubMed

    Horiguchi, Hiromasa; Yasunaga, Hideo; Hashimoto, Hideki; Ohe, Kazuhiko

    2012-12-22

    Secondary use of large scale administrative data is increasingly popular in health services and clinical research, where a user-friendly tool for data management is in great demand. MapReduce technology such as Hadoop is a promising tool for this purpose, though its use has been limited by the lack of user-friendly functions for transforming large scale data into wide table format, where each subject is represented by one row, for use in health services and clinical research. Since the original specification of Pig provides very few functions for column field management, we have developed a novel system called GroupFilterFormat to handle the definition of field and data content based on a Pig Latin script. We have also developed, as an open-source project, several user-defined functions to transform the table format using GroupFilterFormat and to deal with processing that considers date conditions. Having prepared dummy discharge summary data for 2.3 million inpatients and medical activity log data for 950 million events, we used the Elastic Compute Cloud environment provided by Amazon Inc. to execute processing speed and scaling benchmarks. In the speed benchmark test, the response time was significantly reduced and a linear relationship was observed between the quantity of data and processing time in both a small and a very large dataset. The scaling benchmark test showed clear scalability. In our system, doubling the number of nodes resulted in a 47% decrease in processing time. Our newly developed system is widely accessible as an open resource. This system is very simple and easy to use for researchers who are accustomed to using declarative command syntax for commercial statistical software and Structured Query Language. Although our system needs further sophistication to allow more flexibility in scripts and to improve efficiency in data processing, it shows promise in facilitating the application of MapReduce technology to efficient data processing with large scale administrative data in health services and clinical research.
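
    The core data-management task this paper addresses is reshaping long, event-level administrative records into a wide table with one row per subject. The authors do this at scale with Hadoop and a Pig Latin based system (GroupFilterFormat); the short pandas sketch below only illustrates the long-to-wide reshaping concept on toy data and is not the authors' tool or syntax.

      import pandas as pd

      # Toy event-log data in "long" format: one row per medical activity event.
      events = pd.DataFrame({
          "patient_id": [1, 1, 2, 2, 2],
          "item":       ["drug_A", "lab_X", "drug_A", "drug_B", "lab_X"],
          "value":      [10, 3.2, 5, 20, 4.1],
      })

      # Reshape to "wide" format: one row per subject, one column per item --
      # the layout that the paper's system produces from much larger datasets.
      wide = events.pivot_table(index="patient_id", columns="item",
                                values="value", aggfunc="sum")
      print(wide)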

  20. Can a science-based definition of acupuncture improve clinical outcomes?

    PubMed

    Priebe, Ted; Stumpf, Steven H; Zalunardo, Rod

    2017-05-01

    Research on acupuncture has been muddled by attempts to bridge the ancient with the modern. Barriers to effectiveness research are reflected in recurring conflicts that include disagreement on the use of the most basic terms, lack of standard intervention controls, and the absence of functional measures for assessing treatment effect. Acupuncture research has stalled at the "placebo barrier" wherein acupuncture is "no better than placebo." The most widely recognized comparative effectiveness research in acupuncture does not compare acupuncture treatment protocols within groups, thereby turning large-scale effectiveness studies into large-scale efficacy trials. Too often, research in acupuncture attempts to tie outcomes to traditional belief systems, thereby limiting the usefulness of the research. The acupuncture research paradigm needs to focus more closely on a scientific definition of treatments and outcomes that compares protocols in terms of prevalent clinical issues such as relative effectiveness for treating pain.

  1. Addressing Methodological Challenges in Large Communication Data Sets: Collecting and Coding Longitudinal Interactions in Home Hospice Cancer Care.

    PubMed

    Reblin, Maija; Clayton, Margaret F; John, Kevin K; Ellington, Lee

    2016-07-01

    In this article, we present strategies for collecting and coding a large longitudinal communication data set collected across multiple sites, consisting of more than 2000 hours of digital audio recordings from approximately 300 families. We describe our methods within the context of implementing a large-scale study of communication during cancer home hospice nurse visits, but this procedure could be adapted to communication data sets across a wide variety of settings. This research is the first study designed to capture home hospice nurse-caregiver communication, a highly understudied location and type of communication event. We present a detailed example protocol encompassing data collection in the home environment, large-scale, multisite secure data management, the development of theoretically-based communication coding, and strategies for preventing coder drift and ensuring reliability of analyses. Although each of these challenges has the potential to undermine the utility of the data, reliability between coders is often the only issue consistently reported and addressed in the literature. Overall, our approach demonstrates rigor and provides a "how-to" example for managing large, digitally recorded data sets from collection through analysis. These strategies can inform other large-scale health communication research.

  2. Addressing Methodological Challenges in Large Communication Datasets: Collecting and Coding Longitudinal Interactions in Home Hospice Cancer Care

    PubMed Central

    Reblin, Maija; Clayton, Margaret F; John, Kevin K; Ellington, Lee

    2015-01-01

    In this paper, we present strategies for collecting and coding a large longitudinal communication dataset collected across multiple sites, consisting of over 2000 hours of digital audio recordings from approximately 300 families. We describe our methods within the context of implementing a large-scale study of communication during cancer home hospice nurse visits, but this procedure could be adapted to communication datasets across a wide variety of settings. This research is the first study designed to capture home hospice nurse-caregiver communication, a highly understudied location and type of communication event. We present a detailed example protocol encompassing data collection in the home environment, large-scale, multi-site secure data management, the development of theoretically-based communication coding, and strategies for preventing coder drift and ensuring reliability of analyses. Although each of these challenges has the potential to undermine the utility of the data, reliability between coders is often the only issue consistently reported and addressed in the literature. Overall, our approach demonstrates rigor and provides a "how-to" example for managing large, digitally-recorded data sets from collection through analysis. These strategies can inform other large-scale health communication research. PMID:26580414

  3. The TeraShake Computational Platform for Large-Scale Earthquake Simulations

    NASA Astrophysics Data System (ADS)

    Cui, Yifeng; Olsen, Kim; Chourasia, Amit; Moore, Reagan; Maechling, Philip; Jordan, Thomas

    Geoscientific and computer science researchers with the Southern California Earthquake Center (SCEC) are conducting a large-scale, physics-based, computationally demanding earthquake system science research program with the goal of developing predictive models of earthquake processes. The computational demands of this program continue to increase rapidly as these researchers seek to perform physics-based numerical simulations of earthquake processes for ever larger problems. To meet the needs of this research program, a multiple-institution team coordinated by SCEC has integrated several scientific codes into a numerical modeling-based research tool we call the TeraShake computational platform (TSCP). A central component in the TSCP is a highly scalable earthquake wave propagation simulation program called the TeraShake anelastic wave propagation (TS-AWP) code. In this chapter, we describe how we extended an existing, stand-alone, well-validated, finite-difference, anelastic wave propagation modeling code into the highly scalable and widely used TS-AWP and then integrated this code into the TeraShake computational platform that provides end-to-end (initialization to analysis) research capabilities. We also describe the techniques used to enhance the TS-AWP parallel performance on TeraGrid supercomputers, as well as the TeraShake simulation phases, including input preparation, run time, data archive management, and visualization. As a result of our efforts to improve its parallel efficiency, the TS-AWP has now shown highly efficient strong scaling on over 40K processors on IBM's BlueGene/L Watson computer. In addition, the TSCP has developed into a computational system that is useful to many members of the SCEC community for performing large-scale earthquake simulations.
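
    The TS-AWP code is a 3D anelastic finite-difference wave propagation solver scaled to tens of thousands of processors. Purely as a cartoon of the underlying numerical method, the Python sketch below time-steps a 1D acoustic wave equation with an explicit second-order finite-difference stencil; the grid size, time step and source are arbitrary toy choices, and none of the parallelization, anelastic attenuation or 3D structure of TS-AWP is represented.

      import numpy as np

      # Toy 1-D acoustic wave equation u_tt = c^2 u_xx, explicit leapfrog stencil.
      nx, nt = 201, 500
      c, dx = 1.0, 1.0
      dt = 0.5 * dx / c                      # satisfies the CFL stability limit
      u_prev = np.zeros(nx)
      u = np.zeros(nx)
      u[nx // 2] = 1.0                       # point-like initial disturbance

      for _ in range(nt):
          lap = np.zeros(nx)
          lap[1:-1] = u[2:] - 2.0 * u[1:-1] + u[:-2]            # discrete u_xx
          u_next = 2.0 * u - u_prev + (c * dt / dx) ** 2 * lap  # advance in time
          u_prev, u = u, u_next

      print(float(np.max(np.abs(u))))        # wavefield amplitude after nt steps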

  4. Evaluation of Large-Scale Public-Sector Reforms: A Comparative Analysis

    ERIC Educational Resources Information Center

    Breidahl, Karen N.; Gjelstrup, Gunnar; Hansen, Hanne Foss; Hansen, Morten Balle

    2017-01-01

    Research on the evaluation of large-scale public-sector reforms is rare. This article sets out to fill that gap in the evaluation literature and argues that it is of vital importance since the impact of such reforms is considerable and they change the context in which evaluations of other and more delimited policy areas take place. In our…

  5. Understanding Business Interests in International Large-Scale Student Assessments: A Media Analysis of "The Economist," "Financial Times," and "Wall Street Journal"

    ERIC Educational Resources Information Center

    Steiner-Khamsi, Gita; Appleton, Margaret; Vellani, Shezleen

    2018-01-01

    The media analysis is situated in the larger body of studies that explore the varied reasons why different policy actors advocate for international large-scale student assessments (ILSAs) and adds to the research on the fast advance of the global education industry. The analysis of "The Economist," "Financial Times," and…

  6. The Limits and Possibilities of International Large-Scale Assessments. Education Policy Brief. Volume 9, Number 2, Spring 2011

    ERIC Educational Resources Information Center

    Rutkowski, David J.; Prusinski, Ellen L.

    2011-01-01

    The staff of the Center for Evaluation & Education Policy (CEEP) at Indiana University is often asked about how international large-scale assessments influence U.S. educational policy. This policy brief is designed to provide answers to some of the most frequently asked questions encountered by CEEP researchers concerning the three most popular…

  7. First Large-Scale Proteogenomic Study of Breast Cancer Provides Insight into Potential Therapeutic Targets | Office of Cancer Clinical Proteomics Research

    Cancer.gov

    News Release: May 25, 2016 — Building on data from The Cancer Genome Atlas (TCGA) project, a multi-institutional team of scientists has completed the first large-scale “proteogenomic” study of breast cancer, linking DNA mutations to protein signaling and helping pinpoint the genes that drive cancer.

  8. Projecting Images of the "Good" and the "Bad School": Top Scorers in Educational Large-Scale Assessments as Reference Societies

    ERIC Educational Resources Information Center

    Waldow, Florian

    2017-01-01

    Researchers interested in the global flow of educational ideas and programmes have long been interested in the role of so-called "reference societies." The article investigates how top scorers in large-scale assessments are framed as positive or negative reference societies in the education policy-making debate in German mass media and…

  9. Why Do Countries Participate in International Large-Scale Assessments? The Case of PISA. Policy Research Working Paper 7447

    ERIC Educational Resources Information Center

    Lockheed, Marlaine E.

    2015-01-01

    The number of countries that regularly participate in international large-scale assessments has increased sharply over the past 15 years, with the share of countries participating in the Programme for International Student Assessment growing from one-fifth of countries in 2000 to over one-third of countries in 2015. What accounts for this…

  10. Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses

    PubMed Central

    Liu, Bo; Madduri, Ravi K; Sotomayor, Borja; Chard, Kyle; Lacinski, Lukasz; Dave, Utpal J; Li, Jianqiang; Liu, Chunchen; Foster, Ian T

    2014-01-01

    The coming deluge of genome data presents significant challenges in storing and processing large-scale genome data, providing easy access to biomedical analysis tools, and sharing and retrieving data efficiently. The variability in data volume results in variable computing and storage requirements; therefore, biomedical researchers are pursuing more reliable, dynamic and convenient methods for conducting sequencing analyses. This paper proposes a Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses, which enables reliable and highly scalable execution of sequencing analyses workflows in a fully automated manner. Our platform extends the existing Galaxy workflow system by adding data management capabilities for transferring large quantities of data efficiently and reliably (via Globus Transfer), domain-specific analysis tools preconfigured for immediate use by researchers (via user-specific tools integration), automatic deployment on Cloud for on-demand resource allocation and pay-as-you-go pricing (via Globus Provision), a Cloud provisioning tool for auto-scaling (via HTCondor scheduler), and the support for validating the correctness of workflows (via semantic verification tools). Two bioinformatics workflow use cases as well as performance evaluation are presented to validate the feasibility of the proposed approach. PMID:24462600
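    The platform described above layers data transfer, preconfigured tools, provisioning, and validation on top of Galaxy. Purely as a schematic of the underlying pattern of chaining sequencing-analysis stages into an automated pipeline, the sketch below uses no Galaxy or Globus APIs, and every name in it is hypothetical.

    ```python
    from dataclasses import dataclass
    from typing import Callable, List

    @dataclass
    class Step:
        name: str
        run: Callable[[str], str]   # takes an input path, returns an output path

    def run_workflow(steps: List[Step], input_path: str) -> str:
        """Run the steps in order, feeding each step's output to the next."""
        data = input_path
        for step in steps:
            print(f"running {step.name} on {data}")
            data = step.run(data)
        return data

    # Hypothetical stages loosely mirroring transfer -> alignment -> variant calling.
    pipeline = [
        Step("transfer", lambda p: p + ".local"),
        Step("align",    lambda p: p + ".bam"),
        Step("call",     lambda p: p + ".vcf"),
    ]
    print(run_workflow(pipeline, "sample.fastq"))
    ```

    A real platform of the kind described replaces each lambda with a managed tool invocation and adds the data-movement, provisioning, and verification services listed in the abstract.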

  11. Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses.

    PubMed

    Liu, Bo; Madduri, Ravi K; Sotomayor, Borja; Chard, Kyle; Lacinski, Lukasz; Dave, Utpal J; Li, Jianqiang; Liu, Chunchen; Foster, Ian T

    2014-06-01

    The coming deluge of genome data presents significant challenges in storing and processing large-scale genome data, providing easy access to biomedical analysis tools, and sharing and retrieving data efficiently. The variability in data volume results in variable computing and storage requirements; therefore, biomedical researchers are pursuing more reliable, dynamic and convenient methods for conducting sequencing analyses. This paper proposes a Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses, which enables reliable and highly scalable execution of sequencing analyses workflows in a fully automated manner. Our platform extends the existing Galaxy workflow system by adding data management capabilities for transferring large quantities of data efficiently and reliably (via Globus Transfer), domain-specific analysis tools preconfigured for immediate use by researchers (via user-specific tools integration), automatic deployment on Cloud for on-demand resource allocation and pay-as-you-go pricing (via Globus Provision), a Cloud provisioning tool for auto-scaling (via HTCondor scheduler), and the support for validating the correctness of workflows (via semantic verification tools). Two bioinformatics workflow use cases as well as performance evaluation are presented to validate the feasibility of the proposed approach. Copyright © 2014 Elsevier Inc. All rights reserved.

  12. Externally blown flap noise research

    NASA Technical Reports Server (NTRS)

    Dorsch, R. G.

    1974-01-01

    The Lewis Research Center cold-flow model externally blown flap (EBF) noise research test program is summarized. Both engine under-the-wing and over-the-wing EBF wing section configurations were studied. Ten large scale and nineteen small scale EBF models were tested. A limited number of forward airspeed effect and flap noise suppression tests were also run. The key results and conclusions drawn from the flap noise tests are summarized and discussed.

  13. Subgrid-scale models for large-eddy simulation of rotating turbulent channel flows

    NASA Astrophysics Data System (ADS)

    Silvis, Maurits H.; Bae, Hyunji Jane; Trias, F. Xavier; Abkar, Mahdi; Moin, Parviz; Verstappen, Roel

    2017-11-01

    We aim to design subgrid-scale models for large-eddy simulation of rotating turbulent flows. Rotating turbulent flows form a challenging test case for large-eddy simulation due to the presence of the Coriolis force. The Coriolis force conserves the total kinetic energy while transporting it from small to large scales of motion, leading to the formation of large-scale anisotropic flow structures. The Coriolis force may also cause partial flow laminarization and the occurrence of turbulent bursts. Many subgrid-scale models for large-eddy simulation are, however, primarily designed to parametrize the dissipative nature of turbulent flows, ignoring the specific characteristics of transport processes. We, therefore, propose a new subgrid-scale model that, in addition to the usual dissipative eddy viscosity term, contains a nondissipative nonlinear model term designed to capture transport processes, such as those due to rotation. We show that the addition of this nonlinear model term leads to improved predictions of the energy spectra of rotating homogeneous isotropic turbulence as well as of the Reynolds stress anisotropy in spanwise-rotating plane-channel flows. This work is financed by the Netherlands Organisation for Scientific Research (NWO) under Project Number 613.001.212.
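    The model described combines the usual dissipative eddy-viscosity term with a nondissipative nonlinear term intended to capture transport. In generic mixed-model notation, such a closure for the subgrid-scale stress can be written as follows; this is a sketch of the general form only, not necessarily the exact coefficients or tensor basis used by the authors.

    ```latex
    \tau_{ij}^{\mathrm{mod}} = -2\,\nu_e\,\bar{S}_{ij}
      + \mu\left(\bar{S}_{ik}\bar{\Omega}_{kj} - \bar{\Omega}_{ik}\bar{S}_{kj}\right),
    ```

    where \bar{S}_{ij} and \bar{\Omega}_{ij} are the resolved rate-of-strain and rate-of-rotation tensors, \nu_e is an eddy viscosity and \mu a model coefficient; the second, nonlinear term redistributes energy among scales without producing net dissipation, which is the role the abstract assigns to the transport part of the model.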

  14. Desalination: Status and Federal Issues

    DTIC Science & Technology

    2009-12-30

    on one side and lets purified water through. Reverse osmosis plants have fewer problems with corrosion and usually have lower energy requirements...Texas) and cities are actively researching and investigating the feasibility of large-scale desalination plants for municipal water supplies...desalination research and development, and in construction and operational costs of desalination demonstration projects and full-scale plants

  15. A Ground-Based Research Vehicle for Base Drag Studies at Subsonic Speeds

    NASA Technical Reports Server (NTRS)

    Diebler, Corey; Smith, Mark

    2002-01-01

    A ground research vehicle (GRV) has been developed to study the base drag on large-scale vehicles at subsonic speeds. Existing models suggest that base drag is dependent upon vehicle forebody drag, and for certain configurations, the total drag of a vehicle can be reduced by increasing its forebody drag. Although these models work well for small projectile shapes, studies have shown that they do not provide accurate predictions when applied to large-scale vehicles. Experiments are underway at the NASA Dryden Flight Research Center to collect data at Reynolds numbers up to a maximum of 3 × 10^7, and to formulate a new model for predicting the base drag of trucks, buses, motor homes, reentry vehicles, and other large-scale vehicles. Preliminary tests have shown errors as great as 70 percent compared to Hoerner's two-dimensional base drag prediction. This report describes the GRV and its capabilities, details the studies currently underway at NASA Dryden, and presents preliminary results of both the effort to formulate a new base drag model and the investigation into a method of reducing total drag by manipulating forebody drag.

  16. CRP: Collaborative Research Project (A Mathematical Research Experience for Undergraduates)

    ERIC Educational Resources Information Center

    Parsley, Jason; Rusinko, Joseph

    2017-01-01

    The "Collaborative Research Project" ("CRP")--a mathematics research experience for undergraduates--offers a large-scale collaborative experience in research for undergraduate students. CRP seeks to widen the audience of students who participate in undergraduate research in mathematics. In 2015, the inaugural CRP had 100…

  17. Global Magnetohydrodynamic Modeling of the Solar Corona

    NASA Technical Reports Server (NTRS)

    Linker, Jon A.; Wagner, William (Technical Monitor)

    2001-01-01

    The solar corona, the hot, tenuous outer atmosphere of the Sun, exhibits many fascinating phenomena on a wide range of scales. One of the ways that the Sun can affect us here at Earth is through the large-scale structure of the corona and the dynamical phenomena associated with it, as it is the corona that extends outward as the solar wind and encounters the Earth's magnetosphere. The goal of our research sponsored by NASA's Supporting Research and Technology Program in Solar Physics is to develop increasingly realistic models of the large-scale solar corona, so that we can understand the underlying properties of the coronal magnetic field that lead to the observed structure and evolution of the corona. We describe the work performed under this contract.

  18. Talking About The Smokes: a large-scale, community-based participatory research project.

    PubMed

    Couzos, Sophia; Nicholson, Anna K; Hunt, Jennifer M; Davey, Maureen E; May, Josephine K; Bennet, Pele T; Westphal, Darren W; Thomas, David P

    2015-06-01

    This paper describes the Talking About The Smokes (TATS) project according to the World Health Organization guiding principles for conducting community-based participatory research (PR) involving indigenous peoples, to assist others planning large-scale PR projects. The TATS project was initiated in Australia in 2010 as part of the International Tobacco Control Policy Evaluation Project, and surveyed a representative sample of 2522 Aboriginal and Torres Strait Islander adults to assess the impact of tobacco control policies. The PR process of the TATS project, which aimed to build partnerships to create equitable conditions for knowledge production, was mapped and summarised onto a framework adapted from the WHO principles. The processes mapped cover consultation and approval, partnerships and research agreements, communication, funding, ethics and consent, and the data and benefits of the research. The TATS project involved baseline and follow-up surveys conducted in 34 Aboriginal community-controlled health services and one Torres Strait community. Consistent with the WHO PR principles, the TATS project built on community priorities and strengths through strategic partnerships from project inception, and demonstrated the value of research agreements and trusting relationships to foster shared decision making, capacity building and a commitment to Indigenous data ownership. Community-based PR methodology, by definition, needs adaptation to local settings and priorities. The TATS project demonstrates that large-scale research can be participatory, with strong Indigenous community engagement and benefits.

  19. Large-Scale Demonstration of Liquid Hydrogen Storage with Zero Boiloff for In-Space Applications

    NASA Technical Reports Server (NTRS)

    Hastings, L. J.; Bryant, C. B.; Flachbart, R. H.; Holt, K. A.; Johnson, E.; Hedayat, A.; Hipp, B.; Plachta, D. W.

    2010-01-01

    Cryocooler and passive insulation technology advances have substantially improved prospects for zero-boiloff cryogenic storage. Therefore, a cooperative effort by NASA's Ames Research Center, Glenn Research Center, and Marshall Space Flight Center (MSFC) was implemented to develop zero-boiloff concepts for in-space cryogenic storage. Described herein is one program element - a large-scale, zero-boiloff demonstration using the MSFC multipurpose hydrogen test bed (MHTB). A commercial cryocooler was interfaced with an existing MHTB spray bar mixer and insulation system in a manner that enabled a balance between incoming and extracted thermal energy.

  20. Research directions in large scale systems and decentralized control

    NASA Technical Reports Server (NTRS)

    Tenney, R. R.

    1980-01-01

    Control theory provides a well established framework for dealing with automatic decision problems and a set of techniques for automatic decision making which exploit special structure, but it does not deal well with complexity. The potential exists for combining control theoretic and knowledge based concepts into a unified approach. The elements of control theory are diagrammed, including modern control and large scale systems.

  1. Landscape-Scale Research In The Ouachita Mountains Of West-Central Arkansas: General Study Design

    Treesearch

    James M. Guldin

    2004-01-01

    A landscape-scale study on forest ecology and management began in 1995 in the eastern Ouachita Mountains. Of four large watersheds, three were within the Winona Ranger District of the Ouachita National Forest, and a major forest industry landowner largely owned and managed the fourth. These watersheds vary from 3,700 to 9,800 acres. At this...

  2. Gaining Ground in the Middle Grades: Why Some Schools Do Better. A Large-Scale Study of Middle Grades Practices and Student Outcomes. Technical Appendix B

    ERIC Educational Resources Information Center

    EdSource, 2010

    2010-01-01

    This appendix focuses on the descriptive statistics of the middle study schools that participated in the "Gaining Ground in the Middle Grades: Why Some Schools Do Better. A Large-Scale Study of Middle Grades Practices and Student Outcomes. Initial Research." This appendix contains the following figures: (1) Student…

  3. From Efficacy Trial to Large Scale Effectiveness Trial: A Tier 2 Mathematics Intervention for First Graders with Difficulties in Mathematics

    ERIC Educational Resources Information Center

    Rolfhus, Eric; Clarke, Ben; Decker, Lauren E.; Williams, Chuck; Dimino, Joseph

    2013-01-01

    Large scale longitudinal research (Morgan, Farkas, & Wu, 2009) and a meta-analysis (Duncan et al., 2007) have found that early mathematics achievement is a strong predictor of later mathematics achievement. In fact, end of Kindergarten and end of grade 1 mathematics achievement on ECLS-K and similar mathematics proficiency measures tends to be…

  4. Newly invented biobased materials from low-carbon, diverted waste fibers: research methods, testing, and full-scale application in a case study structure

    Treesearch

    Julee A Herdt; John Hunt; Kellen Schauermann

    2016-01-01

    This project demonstrates newly invented, biobased construction materials developed by applying low-carbon, biomass waste sources through the authors’ engineered fiber processes and technology. If manufactured and applied at large scale, the project inventions can divert large volumes of cellulose waste into high-performance, low embodied energy, environmental construction...

  5. Implementation of Shifted Periodic Boundary Conditions in the Large-Scale Atomic/Molecular Massively Parallel Simulator (LAMMPS) Software

    DTIC Science & Technology

    2015-08-01

    Atomic/Molecular Massively Parallel Simulator ( LAMMPS ) Software by N Scott Weingarten and James P Larentzos Approved for...Massively Parallel Simulator ( LAMMPS ) Software by N Scott Weingarten Weapons and Materials Research Directorate, ARL James P Larentzos Engility...Shifted Periodic Boundary Conditions in the Large-Scale Atomic/Molecular Massively Parallel Simulator ( LAMMPS ) Software 5a. CONTRACT NUMBER 5b

  6. Scale Up of Malonic Acid Fermentation Process: Cooperative Research and Development Final Report, CRADA Number CRD-16-612

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schell, Daniel J

    The goal of this work is to use the large fermentation vessels in the National Renewable Energy Laboratory's (NREL) Integrated Biorefinery Research Facility (IBRF) to scale up Lygos' biological-based process for producing malonic acid and to generate performance data. Initially, work at the 1 L scale validated successful transfer of Lygos' fermentation protocols to NREL using a glucose substrate. Outside of the scope of the CRADA with NREL, Lygos tested their process on lignocellulosic sugars produced by NREL at Lawrence Berkeley National Laboratory's (LBNL) Advanced Biofuels Process Development Unit (ABPDU). NREL produced these cellulosic sugar solutions from corn stover using a separate cellulose/hemicellulose process configuration. Finally, NREL performed fermentations using glucose in large fermentors (1,500- and 9,000-L vessels) to produce intermediate product and to demonstrate successful performance of Lygos' technology at larger scales.

  7. Assessing the effects of fire disturbances on ecosystems: A scientific agenda for research and management

    USGS Publications Warehouse

    Schmoldt, D.L.; Peterson, D.L.; Keane, R.E.; Lenihan, J.M.; McKenzie, D.; Weise, D.R.; Sandberg, D.V.

    1999-01-01

    A team of fire scientists and resource managers convened 17-19 April 1996 in Seattle, Washington, to assess the effects of fire disturbance on ecosystems. Objectives of this workshop were to develop scientific recommendations for future fire research and management activities. These recommendations included a series of numerically ranked scientific and managerial questions and responses focusing on (1) links among fire effects, fuels, and climate; (2) fire as a large-scale disturbance; (3) fire-effects modeling structures; and (4) managerial concerns, applications, and decision support. At the present time, understanding of fire effects and the ability to extrapolate fire-effects knowledge to large spatial scales are limited, because most data have been collected at small spatial scales for specific applications. Although we clearly need more large-scale fire-effects data, it will be more expedient to concentrate efforts on improving and linking existing models that simulate fire effects in a georeferenced format while integrating empirical data as they become available. A significant component of this effort should be improved communication between modelers and managers to develop modeling tools to use in a planning context. Another component of this modeling effort should improve our ability to predict the interactions of fire and potential climatic change at very large spatial scales. The priority issues and approaches described here provide a template for fire science and fire management programs in the next decade and beyond.

  8. Implementing Projects in Calculus on a Large Scale at the University of South Florida

    ERIC Educational Resources Information Center

    Fox, Gordon A.; Campbell, Scott; Grinshpan, Arcadii; Xu, Xiaoying; Holcomb, John; Bénéteau, Catherine; Lewis, Jennifer E.; Ramachandran, Kandethody

    2017-01-01

    This paper describes the development of a program of project-based learning in Calculus courses at a large urban research university. In this program, students developed research projects in consultation with a faculty advisor in their major, and supervised by their calculus instructors. Students wrote up their projects in a prescribed format…

  9. Self-Reacting Friction Stir Welding for Aluminum Complex Curvature Applications

    NASA Technical Reports Server (NTRS)

    Brown, Randy J.; Martin, W.; Schneider, J.; Hartley, P. J.; Russell, Carolyn; Lawless, Kirby; Jones, Chip

    2003-01-01

    This viewgraph representation provides an overview of successful research conducted by Lockheed Martin and NASA to develop an advanced self-reacting friction stir technology for complex curvature aluminum alloys. The research included weld process development for 0.320 inch Al 2219, successful transfer from the 'lab' scale to the production-scale tool, and weld quality exceeding strength goals. This process will enable development and implementation of large-scale, complex-geometry hardware fabrication. Topics covered include: weld process development, weld process transfer, and intermediate hardware fabrication.

  10. Presenting an Approach for Conducting Knowledge Architecture within Large-Scale Organizations

    PubMed Central

    Varaee, Touraj; Habibi, Jafar; Mohaghar, Ali

    2015-01-01

    Knowledge architecture (KA) establishes the basic groundwork for the successful implementation of a short-term or long-term knowledge management (KM) program. An example of KA is the design of a prototype before a new vehicle is manufactured. Due to a transformation to large-scale organizations, the traditional architecture of organizations is undergoing fundamental changes. This paper explores the main strengths and weaknesses in the field of KA within large-scale organizations and provides a suitable methodology and supervising framework to overcome specific limitations. This objective was achieved by applying and updating the concepts from the Zachman information architectural framework and the information architectural methodology of enterprise architecture planning (EAP). The proposed solution may be beneficial for architects in knowledge-related areas to successfully accomplish KM within large-scale organizations. The research method is descriptive; its validity is confirmed by performing a case study and polling the opinions of KA experts. PMID:25993414

  11. Presenting an Approach for Conducting Knowledge Architecture within Large-Scale Organizations.

    PubMed

    Varaee, Touraj; Habibi, Jafar; Mohaghar, Ali

    2015-01-01

    Knowledge architecture (KA) establishes the basic groundwork for the successful implementation of a short-term or long-term knowledge management (KM) program. An example of KA is the design of a prototype before a new vehicle is manufactured. Due to a transformation to large-scale organizations, the traditional architecture of organizations is undergoing fundamental changes. This paper explores the main strengths and weaknesses in the field of KA within large-scale organizations and provides a suitable methodology and supervising framework to overcome specific limitations. This objective was achieved by applying and updating the concepts from the Zachman information architectural framework and the information architectural methodology of enterprise architecture planning (EAP). The proposed solution may be beneficial for architects in knowledge-related areas to successfully accomplish KM within large-scale organizations. The research method is descriptive; its validity is confirmed by performing a case study and polling the opinions of KA experts.

  12. Tropospheric transport differences between models using the same large-scale meteorological fields

    NASA Astrophysics Data System (ADS)

    Orbe, Clara; Waugh, Darryn W.; Yang, Huang; Lamarque, Jean-Francois; Tilmes, Simone; Kinnison, Douglas E.

    2017-01-01

    The transport of chemicals is a major uncertainty in the modeling of tropospheric composition. A common approach is to transport gases using the winds from meteorological analyses, either using them directly in a chemical transport model or by constraining the flow in a general circulation model. Here we compare the transport of idealized tracers in several different models that use the same meteorological fields taken from Modern-Era Retrospective analysis for Research and Applications (MERRA). We show that, even though the models use the same meteorological fields, there are substantial differences in their global-scale tropospheric transport related to large differences in parameterized convection between the simulations. Furthermore, we find that the transport differences between simulations constrained with the same large-scale flow are larger than differences between free-running simulations, which have differing large-scale flow but much more similar convective mass fluxes. Our results indicate that more attention needs to be paid to convective parameterizations in order to understand large-scale tropospheric transport in models, particularly in simulations constrained with analyzed winds.

  13. Breed locally, disperse globally: Fine-scale genetic structure despite landscape-scale panmixia in a fire-specialist

    Treesearch

    Jennifer C. Pierson; Fred W. Allendorf; Pierre Drapeau; Michael K. Schwartz

    2013-01-01

    An exciting advance in the understanding of metapopulation dynamics has been the investigation of how populations respond to ephemeral patches that go 'extinct' during the lifetime of an individual. Previous research has shown that this scenario leads to genetic homogenization across large spatial scales. However, little is known about fine-scale genetic...

  14. Machine Learning, deep learning and optimization in computer vision

    NASA Astrophysics Data System (ADS)

    Canu, Stéphane

    2017-03-01

    As quoted in the Large Scale Computer Vision Systems NIPS workshop, computer vision is a mature field with a long tradition of research, but recent advances in machine learning, deep learning, representation learning and optimization have provided models with new capabilities to better understand visual content. The presentation will go through these new developments in machine learning, covering basic motivations, ideas, models and optimization in deep learning for computer vision, identifying challenges and opportunities. It will focus on issues related to large-scale learning, that is: high-dimensional features, a large variety of visual classes, and a large number of examples.

  15. Swept-Wing Ice Accretion Characterization and Aerodynamics

    NASA Technical Reports Server (NTRS)

    Broeren, Andy P.; Potapczuk, Mark G.; Riley, James T.; Villedieu, Philippe; Moens, Frederic; Bragg, Michael B.

    2013-01-01

    NASA, FAA, ONERA, the University of Illinois and Boeing have embarked on a significant, collaborative research effort to address the technical challenges associated with icing on large-scale, three-dimensional swept wings. The overall goal is to improve the fidelity of experimental and computational simulation methods for swept-wing ice accretion formation and resulting aerodynamic effect. A seven-phase research effort has been designed that incorporates ice-accretion and aerodynamic experiments and computational simulations. As the baseline, full-scale, swept-wing-reference geometry, this research will utilize the 65% scale Common Research Model configuration. Ice-accretion testing will be conducted in the NASA Icing Research Tunnel for three hybrid swept-wing models representing the 20%, 64% and 83% semispan stations of the baseline-reference wing. Three-dimensional measurement techniques are being developed and validated to document the experimental ice-accretion geometries. Artificial ice shapes of varying geometric fidelity will be developed for aerodynamic testing over a large Reynolds number range in the ONERA F1 pressurized wind tunnel and in a smaller-scale atmospheric wind tunnel. Concurrent research will be conducted to explore and further develop the use of computational simulation tools for ice accretion and aerodynamics on swept wings. The combined results of this research effort will result in an improved understanding of the ice formation and aerodynamic effects on swept wings. The purpose of this paper is to describe this research effort in more detail and report on the current results and status to date.

  16. Swept-Wing Ice Accretion Characterization and Aerodynamics

    NASA Technical Reports Server (NTRS)

    Broeren, Andy P.; Potapczuk, Mark G.; Riley, James T.; Villedieu, Philippe; Moens, Frederic; Bragg, Michael B.

    2013-01-01

    NASA, FAA, ONERA, the University of Illinois and Boeing have embarked on a significant, collaborative research effort to address the technical challenges associated with icing on large-scale, three-dimensional swept wings. The overall goal is to improve the fidelity of experimental and computational simulation methods for swept-wing ice accretion formation and resulting aerodynamic effect. A seven-phase research effort has been designed that incorporates ice-accretion and aerodynamic experiments and computational simulations. As the baseline, full-scale, swept-wing-reference geometry, this research will utilize the 65 percent scale Common Research Model configuration. Ice-accretion testing will be conducted in the NASA Icing Research Tunnel for three hybrid swept-wing models representing the 20, 64 and 83 percent semispan stations of the baseline-reference wing. Three-dimensional measurement techniques are being developed and validated to document the experimental ice-accretion geometries. Artificial ice shapes of varying geometric fidelity will be developed for aerodynamic testing over a large Reynolds number range in the ONERA F1 pressurized wind tunnel and in a smaller-scale atmospheric wind tunnel. Concurrent research will be conducted to explore and further develop the use of computational simulation tools for ice accretion and aerodynamics on swept wings. The combined results of this research effort will result in an improved understanding of the ice formation and aerodynamic effects on swept wings. The purpose of this paper is to describe this research effort in more detail and report on the current results and status to date.

  17. A simple model of intraseasonal oscillations

    NASA Astrophysics Data System (ADS)

    Fuchs, Željka; Raymond, David J.

    2017-06-01

    The intraseasonal oscillations and in particular the MJO have been and still remain a "holy grail" of today's atmospheric science research. Why does the MJO propagate eastward? What makes it unstable? What is the scaling for the MJO, i.e., why does it prefer long wavelengths or planetary wave numbers 1-3? What is the westward moving component of the intraseasonal oscillation? Though linear WISHE has long been discounted as a plausible model for intraseasonal oscillations and the MJO, the version we have developed explains many of the observed features of those phenomena, in particular, the preference for large zonal scale. In this model version, the moisture budget and the increase of precipitation with tropospheric humidity lead to a "moisture mode." The destabilization of the large-scale moisture mode occurs via WISHE only and there is no need to postulate large-scale radiatively induced instability or negative effective gross moist stability. Our WISHE-moisture theory leads to a large-scale unstable eastward propagating mode in n = -1 case and a large-scale unstable westward propagating mode in n = 1 case. We suggest that the n = -1 case might be connected to the MJO and the observed westward moving disturbance to the observed equatorial Rossby mode.

  18. [Research progress on hydrological scaling].

    PubMed

    Liu, Jianmei; Pei, Tiefan

    2003-12-01

    With the development of hydrology and the growing effect of human activity on the environment, the scale issue has become a great challenge to many hydrologists because of the stochasticity and complexity of hydrological phenomena and natural catchments. Increasing attention has been given to scaling issues, that is, inferring large-scale (or small-scale) hydrological characteristics from catchments known at another scale, but the problem has not been solved successfully. The first part of this paper introduces some concepts related to hydrological scale, the scale issue and scaling. The key problem is the spatial heterogeneity of catchments and the temporal and spatial variability of hydrological fluxes. Three approaches to scaling are put forward in the third part: distributed modeling, fractal theory and statistical self-similarity analyses. Existing problems and future research directions are proposed in the last part.

  19. Culture and cognition in health systems change.

    PubMed

    Evans, Jenna M; Baker, G Ross; Berta, Whitney; Barnsley, Jan

    2015-01-01

    Large-scale change involves modifying not only the structures and functions of multiple organizations, but also the mindsets and behaviours of diverse stakeholders. This paper focuses on the latter: the informal, less visible, and often neglected psychological and social factors implicated in change efforts. The purpose of this paper is to differentiate between the concepts of organizational culture and mental models, to argue for the value of applying a shared mental models (SMM) framework to large-scale change, and to suggest directions for future research. The authors provide an overview of SMM theory and use it to explore the dynamic relationship between culture and cognition. The contributions and limitations of the theory to change efforts are also discussed. Culture and cognition are complementary perspectives, providing insight into two different levels of the change process. SMM theory draws attention to important questions that add value to existing perspectives on large-scale change. The authors outline these questions for future research and argue that research and practice in this domain may be best served by focusing less on the potentially narrow goal of "achieving consensus" and more on identifying, understanding, and managing cognitive convergences and divergences as part of broader research and change management programmes. Drawing from both cultural and cognitive paradigms can provide researchers with a more complete picture of the processes by which coordinated action is achieved in complex change initiatives in the healthcare domain.

  20. Everyday Scale Errors

    ERIC Educational Resources Information Center

    Ware, Elizabeth A.; Uttal, David H.; DeLoache, Judy S.

    2010-01-01

    Young children occasionally make "scale errors"--they attempt to fit their bodies into extremely small objects or attempt to fit a larger object into another, tiny, object. For example, a child might try to sit in a dollhouse-sized chair or try to stuff a large doll into it. Scale error research was originally motivated by parents' and…

  1. Setting Learning Analytics in Context: Overcoming the Barriers to Large-Scale Adoption

    ERIC Educational Resources Information Center

    Ferguson, Rebecca; Macfadyen, Leah P.; Clow, Doug; Tynan, Belinda; Alexander, Shirley; Dawson, Shane

    2014-01-01

    A core goal for most learning analytic projects is to move from small-scale research towards broader institutional implementation, but this introduces a new set of challenges because institutions are stable systems, resistant to change. To avoid failure and maximize success, implementation of learning analytics at scale requires explicit and…

  2. Forest Ecosystem Analysis Using a GIS

    Treesearch

    S.G. McNulty; W.T. Swank

    1996-01-01

    Forest ecosystem studies have expanded spatially in recent years to address large scale environmental issues. We are using a geographic information system (GIS) to understand and integrate forest processes at landscape to regional spatial scales. This paper presents three diverse research studies using a GIS. First, we used a GIS to develop a landscape scale model to...

  3. Taking Teacher Learning to Scale: Sharing Knowledge and Spreading Ideas across Geographies

    ERIC Educational Resources Information Center

    Klein, Emily J.; Jaffe-Walter, Reva; Riordan, Megan

    2016-01-01

    This research reports data from case studies of three intermediary organizations facing the challenge of scaling up teacher learning. The turn of the century launched scaling-up efforts of all three intermediaries, growing from intimate groups, where founding teachers and staff were key supports for teacher learning, to large multistate…

  4. Multisite Studies and Scaling up in Educational Research

    ERIC Educational Resources Information Center

    Harwell, Michael

    2012-01-01

    A scale-up study in education typically expands the sample of students, schools, districts, and/or practices or materials used in smaller studies in ways that build in heterogeneity. Yet surprisingly little is known about the factors that promote successful scaling up efforts in education, in large part due to the absence of empirically supported…

  5. III. FROM SMALL TO BIG: METHODS FOR INCORPORATING LARGE SCALE DATA INTO DEVELOPMENTAL SCIENCE.

    PubMed

    Davis-Kean, Pamela E; Jager, Justin

    2017-06-01

    For decades, developmental science has been based primarily on relatively small-scale data collections with children and families. Part of the reason for the dominance of this type of data collection is the complexity of collecting cognitive and social data on infants and small children. These small data sets are limited in both power to detect differences and the demographic diversity to generalize clearly and broadly. Thus, in this chapter we will discuss the value of using existing large-scale data sets to test the complex questions of child development and how to develop future large-scale data sets that are both representative and can answer the important questions of developmental scientists. © 2017 The Society for Research in Child Development, Inc.

  6. On the scaling features of high-latitude geomagnetic field fluctuations during a large geomagnetic storm

    NASA Astrophysics Data System (ADS)

    De Michelis, Paola; Federica Marcucci, Maria; Consolini, Giuseppe

    2015-04-01

    Recently we have investigated the spatial distribution of the scaling features of short-time scale magnetic field fluctuations using measurements from several ground-based geomagnetic observatories distributed in the northern hemisphere. We have found that the scaling features of fluctuations of the horizontal magnetic field component at time scales below 100 minutes are correlated with the geomagnetic activity level and with changes in the currents flowing in the ionosphere. Here, we present a detailed analysis of the dynamical changes of the magnetic field scaling features as a function of the geomagnetic activity level during the well-known large geomagnetic storm that occurred on July 15, 2000 (the Bastille event). The observed dynamical changes are discussed in relationship with the changes of the overall ionospheric polar convection and potential structure as reconstructed using SuperDARN data. This work is supported by the Italian National Program for Antarctic Research (PNRA) - Research Project 2013/AC3.08 and by the European Community's Seventh Framework Programme ([FP7/2007-2013]) under Grant no. 313038/STORM.

  7. A Bayesian Hierarchical Model for Large-Scale Educational Surveys: An Application to the National Assessment of Educational Progress. Research Report. ETS RR-04-38

    ERIC Educational Resources Information Center

    Johnson, Matthew S.; Jenkins, Frank

    2005-01-01

    Large-scale educational assessments such as the National Assessment of Educational Progress (NAEP) sample examinees to whom an exam will be administered. In most situations the sampling design is not a simple random sample and must be accounted for in the estimating model. After reviewing the current operational estimation procedure for NAEP, this…
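    For orientation, a generic two-level Bayesian hierarchical structure of the kind used for clustered assessment data can be sketched as follows; this is an illustrative form only, not the operational NAEP procedure or the specific model proposed in this report.

    ```latex
    y_{ij} \mid \theta_{ij} \sim f(y_{ij} \mid \theta_{ij}), \qquad
    \theta_{ij} \sim N(\mathbf{x}_{ij}^{\top}\boldsymbol{\beta} + u_{j},\ \sigma^{2}), \qquad
    u_{j} \sim N(0,\ \tau^{2}),
    ```

    where y_{ij} collects the item responses of examinee i in school j, \theta_{ij} is latent proficiency, u_{j} is a school-level effect, and priors on \boldsymbol{\beta}, \sigma^{2}, and \tau^{2} complete the specification; features of a complex sampling design can enter through the covariates or through survey weights.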

  8. The dynamics and evolution of clusters of galaxies

    NASA Technical Reports Server (NTRS)

    Geller, Margaret; Huchra, John P.

    1987-01-01

    Research was undertaken to produce a coherent picture of the formation and evolution of large-scale structures in the universe. The program is divided into projects which examine four areas: the relationship between individual galaxies and their environment; the structure and evolution of individual rich clusters of galaxies; the nature of superclusters; and the large-scale distribution of individual galaxies. A brief review of results in each area is provided.

  9. Large Scale Density Estimation of Blue and Fin Whales (LSD)

    DTIC Science & Technology

    2015-09-30

    1 DISTRIBUTION STATEMENT A. Approved for public release; distribution is unlimited. Large Scale Density Estimation of Blue and Fin Whales ...sensors, or both. The goal of this research is to develop and implement a new method for estimating blue and fin whale density that is effective over...develop and implement a density estimation methodology for quantifying blue and fin whale abundance from passive acoustic data recorded on sparse

  10. A large-scale initiative to disseminate an evidence-based drug abuse prevention program in Italy: Lessons learned for practitioners and researchers.

    PubMed

    Velasco, Veronica; Griffin, Kenneth W; Antichi, Mariella; Celata, Corrado

    2015-10-01

    Across developed countries, experimentation with alcohol, tobacco, and other drugs often begins in the early adolescent years. Several evidence-based programs have been developed to prevent adolescent substance use. Many of the most rigorously tested and empirically supported prevention programs were initially developed and tested in the United States. Increasingly, these interventions are being adopted for use in Europe and throughout the world. This paper reports on a large-scale comprehensive initiative designed to select, adapt, implement, and sustain an evidence-based drug abuse prevention program in Italy. As part of a large-scale regionally funded collaboration in the Lombardy region of Italy, we report on processes through which a team of stakeholders selected, translated and culturally adapted, planned, implemented and evaluated the Life Skills Training (LST) school-based drug abuse prevention program, an evidence-based intervention developed in the United States. We discuss several challenges and lessons learned and implications for prevention practitioners and researchers attempting to undertake similar international dissemination projects. We review several published conceptual models designed to promote the replication and widespread dissemination of effective programs, and discuss their strengths and limitations in the context of planning and implementing a complex, large-scale real-world dissemination effort. Copyright © 2015 Elsevier Ltd. All rights reserved.

  11. Biology-Inspired Distributed Consensus in Massively-Deployed Sensor Networks

    NASA Technical Reports Server (NTRS)

    Jones, Kennie H.; Lodding, Kenneth N.; Olariu, Stephan; Wilson, Larry; Xin, Chunsheng

    2005-01-01

    Promises of ubiquitous control of the physical environment by large-scale wireless sensor networks open avenues for new applications that are expected to redefine the way we live and work. Most recent research has concentrated on developing techniques for performing relatively simple tasks in small-scale sensor networks assuming some form of centralized control. The main contribution of this work is to propose a new way of looking at large-scale sensor networks, motivated by lessons learned from the way biological ecosystems are organized. Indeed, we believe that techniques used in small-scale sensor networks are not likely to scale to large networks; that such large-scale networks must be viewed as an ecosystem in which the sensors/effectors are organisms whose autonomous actions, based on local information, combine in a communal way to produce global results. As an example of a useful function, we demonstrate that fully distributed consensus can be attained in a scalable fashion in massively deployed sensor networks where individual motes operate based on local information, making local decisions that are aggregated across the network to achieve globally-meaningful effects.
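    The abstract's central claim is that motes acting only on local information can reach a network-wide agreement. A minimal sketch of that idea, iterative neighbor averaging on an arbitrary communication graph, is shown below; it is a generic illustration, not the algorithm of the paper, and the network and readings are hypothetical.

    ```python
    def consensus(values, neighbors, iters=50, alpha=0.3):
        """Each node repeatedly nudges its value toward the mean of its neighbors' values.

        values:    initial local readings, one per node
        neighbors: adjacency list, neighbors[i] = indices of node i's neighbors
        """
        x = list(values)
        for _ in range(iters):
            x_new = x[:]
            for i, nbrs in enumerate(neighbors):
                if nbrs:
                    local_mean = sum(x[j] for j in nbrs) / len(nbrs)
                    x_new[i] = x[i] + alpha * (local_mean - x[i])
            x = x_new
        return x

    # Four motes in a ring, each starting from a different local sensor reading.
    readings = [10.0, 12.0, 8.0, 14.0]
    ring = [[1, 3], [0, 2], [1, 3], [0, 2]]
    print([round(v, 2) for v in consensus(readings, ring)])
    ```

    On this symmetric ring the values converge to the network-wide average (11.0 here) even though no node ever sees more than its two neighbors, which is the sense in which local decisions aggregate into a globally meaningful result.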

  12. Large-scale neuromorphic computing systems

    NASA Astrophysics Data System (ADS)

    Furber, Steve

    2016-10-01

    Neuromorphic computing covers a diverse range of approaches to information processing all of which demonstrate some degree of neurobiological inspiration that differentiates them from mainstream conventional computing systems. The philosophy behind neuromorphic computing has its origins in the seminal work carried out by Carver Mead at Caltech in the late 1980s. This early work influenced others to carry developments forward, and advances in VLSI technology supported steady growth in the scale and capability of neuromorphic devices. Recently, a number of large-scale neuromorphic projects have emerged, taking the approach to unprecedented scales and capabilities. These large-scale projects are associated with major new funding initiatives for brain-related research, creating a sense that the time and circumstances are right for progress in our understanding of information processing in the brain. In this review we present a brief history of neuromorphic engineering then focus on some of the principal current large-scale projects, their main features, how their approaches are complementary and distinct, their advantages and drawbacks, and highlight the sorts of capabilities that each can deliver to neural modellers.

  13. Investigation of multilayer domains in large-scale CVD monolayer graphene by optical imaging

    NASA Astrophysics Data System (ADS)

    Yu, Yuanfang; Li, Zhenzhen; Wang, Wenhui; Guo, Xitao; Jiang, Jie; Nan, Haiyan; Ni, Zhenhua

    2017-03-01

    CVD graphene is a promising candidate for optoelectronic applications due to its high quality and high yield. However, multilayer domains inevitably form at nucleation centers during growth. Here, we propose an optical imaging technique to precisely identify the multilayer domains and also the ratio of their coverage in large-scale CVD monolayer graphene. We have also shown that the stacking disorder in twisted bilayer graphene as well as the impurities on the graphene surface could be distinguished by optical imaging. Finally, we investigated the effects of bilayer domains on the optical and electrical properties of CVD graphene, and found that the carrier mobility of CVD graphene is seriously limited by scattering from bilayer domains. Our results could be useful for guiding future optoelectronic applications of large-scale CVD graphene. Project supported by the National Natural Science Foundation of China (Nos. 61422503, 61376104), the Open Research Funds of Key Laboratory of MEMS of Ministry of Education (SEU, China), and the Fundamental Research Funds for the Central Universities.

  14. Large-Scale Advanced Prop-Fan (LAP)

    NASA Technical Reports Server (NTRS)

    Degeorge, C. L.

    1988-01-01

    In recent years, considerable attention has been directed toward improving aircraft fuel efficiency. Analytical studies and research with wind tunnel models have demonstrated that the high inherent efficiency of low speed turboprop propulsion systems may now be extended to the Mach .8 flight regime of today's commercial airliners. This can be accomplished with a propeller, employing a large number of thin highly swept blades. The term Prop-Fan has been coined to describe such a propulsion system. In 1983 the NASA-Lewis Research Center contracted with Hamilton Standard to design, build and test a near full scale Prop-Fan, designated the Large Scale Advanced Prop-Fan (LAP). This report provides a detailed description of the LAP program. The assumptions and analytical procedures used in the design of Prop-Fan system components are discussed in detail. The manufacturing techniques used in the fabrication of the Prop-Fan are presented. Each of the tests run during the course of the program are also discussed and the major conclusions derived from them stated.

  15. Evaluating the Health Impact of Large-Scale Public Policy Changes: Classical and Novel Approaches

    PubMed Central

    Basu, Sanjay; Meghani, Ankita; Siddiqi, Arjumand

    2018-01-01

    Large-scale public policy changes are often recommended to improve public health. Despite varying widely—from tobacco taxes to poverty-relief programs—such policies present a common dilemma to public health researchers: how to evaluate their health effects when randomized controlled trials are not possible. Here, we review the state of knowledge and experience of public health researchers who rigorously evaluate the health consequences of large-scale public policy changes. We organize our discussion by detailing approaches to address three common challenges of conducting policy evaluations: distinguishing a policy effect from time trends in health outcomes or preexisting differences between policy-affected and -unaffected communities (using difference-in-differences approaches); constructing a comparison population when a policy affects a population for whom a well-matched comparator is not immediately available (using propensity score or synthetic control approaches); and addressing unobserved confounders by utilizing quasi-random variations in policy exposure (using regression discontinuity, instrumental variables, or near-far matching approaches). PMID:28384086
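    As a worked illustration of the first of these approaches, the canonical difference-in-differences estimate of a policy effect compares pre-post changes in an affected and an unaffected population; the numbers in the example below are hypothetical.

    ```latex
    \hat{\delta}_{\mathrm{DiD}}
      = \left(\bar{Y}^{\mathrm{treated}}_{\mathrm{post}} - \bar{Y}^{\mathrm{treated}}_{\mathrm{pre}}\right)
      - \left(\bar{Y}^{\mathrm{control}}_{\mathrm{post}} - \bar{Y}^{\mathrm{control}}_{\mathrm{pre}}\right).
    ```

    For example, if smoking prevalence falls from 22% to 18% in a jurisdiction that raised tobacco taxes while falling from 21% to 20% in comparison jurisdictions, the estimated effect is (18 − 22) − (20 − 21) = −3 percentage points, which is valid only under the parallel-trends assumption that the two groups would have followed the same time trend absent the policy.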

  16. Large-Scale Low-Boom Inlet Test Overview

    NASA Technical Reports Server (NTRS)

    Hirt, Stefanie

    2011-01-01

    This presentation provides a high-level overview of the Large-Scale Low-Boom Inlet Test and was presented at the Fundamental Aeronautics 2011 Technical Conference. In October 2010 a low-boom supersonic inlet concept with flow control was tested in the 8'x6' supersonic wind tunnel at NASA Glenn Research Center (GRC). The primary objectives of the test were to evaluate the inlet stability and operability of a large-scale low-boom supersonic inlet concept by acquiring performance and flowfield validation data, as well as evaluate simple, passive, bleedless inlet boundary layer control options. During this effort two models were tested: a dual stream inlet intended to model potential flight hardware and a single stream design to study a zero-degree external cowl angle and to permit surface flow visualization of the vortex generator flow control on the internal centerbody surface. The tests were conducted by a team of researchers from NASA GRC, Gulfstream Aerospace Corporation, University of Illinois at Urbana-Champaign, and the University of Virginia.

  17. GATECloud.net: a platform for large-scale, open-source text processing on the cloud.

    PubMed

    Tablan, Valentin; Roberts, Ian; Cunningham, Hamish; Bontcheva, Kalina

    2013-01-28

    Cloud computing is increasingly being regarded as a key enabler of the 'democratization of science', because on-demand, highly scalable cloud computing facilities enable researchers anywhere to carry out data-intensive experiments. In the context of natural language processing (NLP), algorithms tend to be complex, which makes their parallelization and deployment on cloud platforms a non-trivial task. This study presents a new, unique, cloud-based platform for large-scale NLP research--GATECloud.net. It enables researchers to carry out data-intensive NLP experiments by harnessing the vast, on-demand compute power of the Amazon cloud. Important infrastructural issues are dealt with by the platform, completely transparently for the researcher: load balancing, efficient data upload and storage, deployment on the virtual machines, security and fault tolerance. We also include a cost-benefit analysis and usage evaluation.

  18. Centrifuge impact cratering experiments: Scaling laws for non-porous targets

    NASA Technical Reports Server (NTRS)

    Schmidt, Robert M.

    1987-01-01

    This research is a continuation of an ongoing program whose objective is to perform experiments and to develop scaling relationships for large body impacts onto planetary surfaces. The development of the centrifuge technique has been pioneered by the present investigator and is used to provide experimental data for actual target materials of interest. With both powder and gas guns mounted on a rotor arm, it is possible to match various dimensionless similarity parameters, which have been shown to govern the behavior of large scale impacts. Current work is directed toward the determination of scaling estimates for nonporous targets. The results are presented in summary form.
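    For context, the dimensionless similarity parameters referred to are usually the standard crater-scaling pi-groups; in the gravity-dominated regime a point-source scaling law takes the general form below. This is the general form only, and the constants for specific non-porous targets are exactly what experiments of this kind are meant to determine.

    ```latex
    \pi_V \equiv \frac{\rho V}{m} = K\,\pi_2^{-\beta},
    \qquad \pi_2 \equiv \frac{g\,a}{U^{2}},
    ```

    where V is crater volume, \rho the target density, m, a and U the impactor mass, radius and speed, g the effective gravity, and K, \beta empirical constants. Spinning the target on a centrifuge raises the effective g, so a small laboratory shot can reproduce the \pi_2 value of a much larger planetary-scale impact, which is the rationale for the technique described above.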

  19. Linear static structural and vibration analysis on high-performance computers

    NASA Technical Reports Server (NTRS)

    Baddourah, M. A.; Storaasli, O. O.; Bostic, S. W.

    1993-01-01

    Parallel computers offer the opportunity to significantly reduce the computation time necessary to analyze large-scale aerospace structures. This paper presents algorithms developed for and implemented on massively parallel computers, hereafter referred to as Scalable High-Performance Computers (SHPC), for the most computationally intensive tasks involved in structural analysis, namely, generation and assembly of system matrices, solution of systems of equations and calculation of the eigenvalues and eigenvectors. Results on SHPC are presented for large-scale structural problems (i.e. models for High-Speed Civil Transport). The goal of this research is to develop a new, efficient technique which extends structural analysis to SHPC and makes large-scale structural analyses tractable.
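    The computationally intensive tasks listed above reduce, at their core, to assembling a stiffness matrix, solving linear systems, and extracting eigenpairs. The serial sketch below illustrates those kernels with a hypothetical sparse stiffness matrix; it is not the parallel SHPC implementation described in the paper.

    ```python
    import numpy as np
    from scipy.sparse import diags
    from scipy.sparse.linalg import spsolve, eigsh

    n = 1000
    # Hypothetical assembled stiffness matrix K (tridiagonal stand-in) and load vector f.
    K = diags([-1.0, 2.0, -1.0], offsets=[-1, 0, 1], shape=(n, n), format="csc")
    f = np.ones(n)

    u = spsolve(K, f)                                 # static solution of K u = f
    vals, vecs = eigsh(K, k=4, sigma=0, which="LM")   # four smallest eigenpairs (vibration-like modes)

    print(u.max(), vals)
    ```

    The parallel versions of these kernels distribute the matrix assembly and the solves across processors, which is where the scalability gains reported for SHPC come from.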

  20. Theme II Joint Work Plan -2017 Collaboration and Knowledge Sharing on Large-scale Demonstration Projects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Xiaoliang; Stauffer, Philip H.

    This effort is designed to expedite learnings from existing and planned large demonstration projects and their associated research through effective knowledge sharing among participants in the US and China.

  1. Pre-treatment analysis of woody vegetation composition and structure on the hardwood ecosystem experiment research units

    Treesearch

    Michael R. Saunders; Justin E. Arseneault

    2013-01-01

    In long-term, large-scale forest management studies, documentation of pre-treatment differences among and variability within experimental units is critical for drawing the proper inferences from imposed treatments. We compared pre-treatment overstory and large shrub communities (diameters at breast height >1.5 cm) for the 9 research cores with the Hardwood Ecosystem...

  2. Orthographic and Phonological Neighborhood Databases across Multiple Languages.

    PubMed

    Marian, Viorica

    2017-01-01

    The increased globalization of science and technology and the growing number of bilinguals and multilinguals in the world have made research with multiple languages a mainstay for scholars who study human function and especially those who focus on language, cognition, and the brain. Such research can benefit from large-scale databases and online resources that describe and measure lexical, phonological, orthographic, and semantic information. The present paper discusses currently-available resources and underscores the need for tools that enable measurements both within and across multiple languages. A general review of language databases is followed by a targeted introduction to databases of orthographic and phonological neighborhoods. A specific focus on CLEARPOND illustrates how databases can be used to assess and compare neighborhood information across languages, to develop research materials, and to provide insight into broad questions about language. As an example of how using large-scale databases can answer questions about language, a closer look at neighborhood effects on lexical access reveals that not only orthographic, but also phonological neighborhoods can influence visual lexical access both within and across languages. We conclude that capitalizing upon large-scale linguistic databases can advance, refine, and accelerate scientific discoveries about the human linguistic capacity.
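
    An illustrative sketch (not CLEARPOND itself) of the kind of orthographic-neighborhood computation such databases provide: finding the words in a lexicon that differ from a target by exactly one substituted letter. The toy lexicon and function names are hypothetical.

    ```python
    # Coltheart-style orthographic neighbors: equal length, one letter substituted.
    from typing import List

    def orthographic_neighbors(word: str, lexicon: List[str]) -> List[str]:
        """Return lexicon entries differing from `word` by exactly one substitution."""
        def one_substitution_apart(a: str, b: str) -> bool:
            return len(a) == len(b) and sum(x != y for x, y in zip(a, b)) == 1
        return [w for w in lexicon if one_substitution_apart(word, w)]

    toy_lexicon = ["cat", "cot", "cap", "bat", "dog", "can", "cut"]
    print(orthographic_neighbors("cat", toy_lexicon))  # ['cot', 'cap', 'bat', 'can', 'cut']
    ```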

  3. Moditored unsaturated soil transport processes as a support for large scale soil and water management

    NASA Astrophysics Data System (ADS)

    Vanclooster, Marnik

    2010-05-01

    The current societal demand for sustainable soil and water management is very large. The drivers of global and climate change exert many pressures on soil and water ecosystems, endangering appropriate ecosystem functioning. Unsaturated soil transport processes play a key role in soil-water system functioning, as they control the fluxes of water and nutrients from the soil to plants (the pedo-biosphere link), the infiltration flux of precipitated water to groundwater, and the evaporative flux, and hence the feedback from the soil to the climate system. Yet, unsaturated soil transport processes are difficult to quantify since they are affected by huge variability of the governing properties at different space-time scales and the intrinsic non-linearity of the transport processes. The incompatibility between the scale at which processes can reasonably be characterized, the scale at which the theoretical processes can correctly be described, and the scale at which the soil and water system needs to be managed calls for further development of scaling procedures in unsaturated zone science. It also calls for a better integration of theoretical and modelling approaches to elucidate transport processes at the appropriate scales, compatible with the sustainable soil and water management objective. Moditoring science, i.e. the interdisciplinary research domain where modelling and monitoring science are linked, is currently evolving significantly in the unsaturated zone hydrology area. In this presentation, a review of current moditoring strategies/techniques will be given and illustrated for solving large-scale soil and water management problems. This will also allow identifying research needs in the interdisciplinary domain of modelling and monitoring and improving the integration of unsaturated zone science in solving soil and water management issues. A focus will be given on examples of large-scale soil and water management problems in Europe.

  4. Closed Large Cell Clouds

    Atmospheric Science Data Center

    2013-04-19

    article title:  Closed Large Cell Clouds in the South Pacific ... the Multi-angle Imaging SpectroRadiometer (MISR) provide an example of very large scale closed cells, and can be contrasted with the  ... MD. The MISR data were obtained from the NASA Langley Research Center Atmospheric Science Data Center in Hampton, VA. Image ...

  5. In Defense of the National Labs and Big-Budget Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goodwin, J R

    2008-07-29

    The purpose of this paper is to present the unofficial and unsanctioned opinions of a Visiting Scientist at Lawrence Livermore National Laboratory on the values of LLNL and the other National Labs. The basic founding value and goal of the National Labs is big-budget scientific research, along with smaller-budget scientific research that cannot easily be done elsewhere. The most important example in the latter category is classified defense-related research. The historical guiding light here is the Manhattan Project. This endeavor was unique in human history, and might remain so. The scientific expertise and wealth of an entire nation was tapped in a project that was huge beyond reckoning, with no advance guarantee of success. It was in many respects a clash of scientific titans, with a large supporting cast, collaborating toward a single well-defined goal. Never had scientists received so much respect, so much money, and so much intellectual freedom to pursue scientific progress. And never was the gap between theory and implementation so rapidly narrowed, with results that changed the world, completely. Enormous resources are spent at the national or international level on large-scale scientific projects. LLNL has the most powerful computer in the world, Blue Gene/L. (Oops, Los Alamos just seized the title with Roadrunner; such titles regularly change hands.) LLNL also has the largest laser in the world, the National Ignition Facility (NIF). Lawrence Berkeley National Lab (LBNL) has the most powerful microscope in the world. Not only is it beyond the resources of most large corporations to make such expenditures, but the risk exceeds the possible rewards for those corporations that could. Nor can most small countries afford to finance large scientific projects, and not even the richest can afford largess, especially if Congress is under major budget pressure. Some big-budget research efforts are funded by international consortiums, such as the Large Hadron Collider (LHC) at CERN, and the International Thermonuclear Experimental Reactor (ITER) in Cadarache, France, a magnetic-confinement fusion research project. The post-WWII histories of particle and fusion physics contain remarkable examples of both international competition, with an emphasis on secrecy, and international cooperation, with an emphasis on shared knowledge and resources. Initiatives to share sometimes came from surprising directions. Most large-scale scientific projects have potential defense applications. NIF certainly does; it is primarily designed to create small-scale fusion explosions. Blue Gene/L operates in part in service to NIF, and in part to various defense projects. The most important defense projects include stewardship of the national nuclear weapons stockpile, and the proposed redesign and replacement of those weapons with fewer, safer, more reliable, longer-lived, and less apocalyptic warheads. Many well-meaning people will consider the optimal lifetime of a nuclear weapon to be zero, but most thoughtful people, when asked how much longer they think this nation will require them, will ask for some time to think. NIF is also designed to create exothermic small-scale fusion explosions. The malapropos 'exothermic' here is a convenience to cover a profusion of complexities, but the basic idea is that the explosions will create more recoverable energy than was used to create them.
One can hope that the primary future benefits of success for NIF will be in cost-effective generation of electrical power through controlled small-scale fusion reactions, rather than in improved large-scale fusion explosions. Blue Gene/L also services climate research, genomic research, materials research, and a myriad of other computational problems that become more feasible, reliable, and precise the larger the number of computational nodes employed. Blue Gene/L has to be sited within a security complex for obvious reasons, but its value extends to the nation and the world. There is a duality here between large-scale scientific research machines and the supercomputers used to model them. An astounding example is illustrated in a graph released by EFDA-JET, at Oxfordshire, UK, presently the largest operating magnetic-confinement fusion experiment. The graph shows plasma confinement times (an essential performance parameter) for all the major tokamaks in the international fusion program, over their existing lifetimes. The remarkable thing about the data is not so much confinement-time versus date or scale, but the fact that the data are given for both the computer model predictions and the actual experimental measurements, and the two are in phenomenal agreement over the extended range of scales. Supercomputer models, sometimes operating with the intricacy of Schroedinger's equation at quantum physical scales, have become a costly but enormously cost-saving tool.

  6. Computation and Theory in Large-Scale Optimization

    DTIC Science & Technology

    1993-01-13

    Sang Jin Lee, Research Assistant. - Laura Morley, Research Assistant. - Yonca A. Ozge, Research Assistant. - Stephen M. Robinson, Professor. - Hichem...other participants. M.N. Azadez, S.J. Lee, Y.A. Ozge, and H. Sellami are continuing students in the doctoral program (in Industrial Engineering except

  7. Controlling Guessing Bias in the Dichotomous Rasch Model Applied to a Large-Scale, Vertically Scaled Testing Program

    ERIC Educational Resources Information Center

    Andrich, David; Marais, Ida; Humphry, Stephen Mark

    2016-01-01

    Recent research has shown how the statistical bias in Rasch model difficulty estimates induced by guessing in multiple-choice items can be eliminated. Using vertical scaling of a high-profile national reading test, it is shown that the dominant effect of removing such bias is a nonlinear change in the unit of scale across the continuum. The…
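
    For context, a hedged illustration of why guessing distorts Rasch difficulty estimates (this is not the authors' bias-elimination procedure): the dichotomous Rasch success probability compared with the same item once a lower asymptote for random guessing is added.

    ```python
    # Dichotomous Rasch model with and without a guessing floor (illustration only).
    import math

    def rasch_p(theta: float, b: float) -> float:
        """P(correct) for ability theta and item difficulty b under the Rasch model."""
        return 1.0 / (1.0 + math.exp(-(theta - b)))

    def rasch_with_guessing(theta: float, b: float, c: float) -> float:
        """Same item with guessing probability c as a lower asymptote (3PL-style)."""
        return c + (1.0 - c) * rasch_p(theta, b)

    # A low-ability examinee on a hard multiple-choice item:
    print(round(rasch_p(-1.0, 1.0), 3))                    # ~0.119 without guessing
    print(round(rasch_with_guessing(-1.0, 1.0, 0.25), 3))  # ~0.339 with a 4-option guess floor
    ```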

  8. Experiential Approaches to Teaching Survey Research: Role Strains and Relationships.

    ERIC Educational Resources Information Center

    Suelzle, Marijean; And Others

    Research and instructional role strains are identified, based on the use of large-scale mail surveys for college self-study that are used to teach introductory research methodology. The two organizational hierarchies, the research model and the instructional model, are examined. Experiences at Northwestern University, Northeastern Illinois…

  9. Exploring the Challenges of Conducting Respectful Research: Seen and Unforeseen Factors within Urban School Research

    ERIC Educational Resources Information Center

    Samaroo, Julia; Dahya, Negin; Alidina, Shahnaaz

    2013-01-01

    This paper discusses the significance of conducting respectful research within urban schools, using the example of one large-scale university-school board partnership in northwestern Toronto. The authors, three research assistants on the project, use their experiences within three of the participating schools to interrogate the research approach…

  10. The impact of ordinate scaling on the visual analysis of single-case data.

    PubMed

    Dart, Evan H; Radley, Keith C

    2017-08-01

    Visual analysis is the primary method for detecting the presence of treatment effects in graphically displayed single-case data, and it is often referred to as the "gold standard." Although researchers have developed standards for the application of visual analysis (e.g., Horner et al., 2005), over- and underestimation of effect size magnitude is not uncommon among analysts. Several characteristics have been identified as potential contributors to these errors; however, researchers have largely focused on characteristics of the data itself (e.g., autocorrelation), paying less attention to characteristics of the graphic display, which are largely under the control of the analyst (e.g., ordinate scaling). The current study investigated the impact that differences in ordinate scaling, a graphic display characteristic, had on experts' accuracy in judgments regarding the magnitude of effect present in single-case percentage data. Thirty-two participants were asked to evaluate eight ABAB data sets (two each presenting null, small, moderate, and large effects) along with three iterations of each (32 graphs in total) in which only the ordinate scale was manipulated. Results suggest that raters are less accurate in their detection of treatment effects as the ordinate scale is constricted. Additionally, raters were more likely to overestimate the size of a treatment effect when the ordinate scale was constricted. Copyright © 2017 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.

  11. Cooperation, collective action, and the archeology of large-scale societies.

    PubMed

    Carballo, David M; Feinman, Gary M

    2016-11-01

    Archeologists investigating the emergence of large-scale societies in the past have renewed interest in examining the dynamics of cooperation as a means of understanding societal change and organizational variability within human groups over time. Unlike earlier approaches to these issues, which used models designated voluntaristic or managerial, contemporary research articulates more explicitly with frameworks for cooperation and collective action used in other fields, thereby facilitating empirical testing through better definition of the costs, benefits, and social mechanisms associated with success or failure in coordinated group action. Current scholarship is nevertheless bifurcated along lines of epistemology and scale, which is understandable but problematic for forging a broader, more transdisciplinary field of cooperation studies. Here, we point to some areas of potential overlap by reviewing archeological research that places the dynamics of social cooperation and competition in the foreground of the emergence of large-scale societies, which we define as those having larger populations, greater concentrations of political power, and higher degrees of social inequality. We focus on key issues involving the communal-resource management of subsistence and other economic goods, as well as the revenue flows that undergird political institutions. Drawing on archeological cases from across the globe, with greater detail from our area of expertise in Mesoamerica, we offer suggestions for strengthening analytical methods and generating more transdisciplinary research programs that address human societies across scalar and temporal spectra. © 2016 Wiley Periodicals, Inc.

  12. Studies of Postdisaster Economic Recovery: Analysis, Synthesis, and Assessment

    DTIC Science & Technology

    1987-06-01

    of a large-scale nuclear disaster can be viewed in the aggregate as attempting to answer two broad questions: 1. Do resources survive in sufficient...With respect to economic institutional issues in the aftermath of a nuclear disaster, published research has been, almost without exception, speculative...possibilities. There are at least three major themes that permeate the literature on economic control in the event of a large-scale nuclear disaster. First

  13. Flexible sampling large-scale social networks by self-adjustable random walk

    NASA Astrophysics Data System (ADS)

    Xu, Xiao-Ke; Zhu, Jonathan J. H.

    2016-12-01

    Online social networks (OSNs) have become an increasingly attractive gold mine for academic and commercial researchers. However, research on OSNs faces a number of difficult challenges. One bottleneck lies in the massive quantity and often unavailability of OSN population data. Sampling perhaps becomes the only feasible solution to the problems. How to draw samples that can represent the underlying OSNs has remained a formidable task because of a number of conceptual and methodological reasons. Especially, most of the empirically-driven studies on network sampling are confined to simulated data or sub-graph data, which are fundamentally different from real and complete-graph OSNs. In the current study, we propose a flexible sampling method, called Self-Adjustable Random Walk (SARW), and test it against the population data of a real large-scale OSN. We evaluate the strengths of the sampling method in comparison with four prevailing methods, including uniform, breadth-first search (BFS), random walk (RW), and revised RW (i.e., MHRW) sampling. We try to mix both induced-edge and external-edge information of sampled nodes together in the same sampling process. Our results show that the SARW sampling method has been able to generate unbiased samples of OSNs with maximal precision and minimal cost. The study is helpful for the practice of OSN research by providing a much-needed sampling tool, for the methodological development of large-scale network sampling by comparative evaluations of existing sampling methods, and for the theoretical understanding of human networks by highlighting discrepancies and contradictions between existing knowledge/assumptions and large-scale real OSN data.
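
    As a point of reference, a minimal sketch of the plain random-walk sampler that SARW is benchmarked against; the self-adjustment rule of SARW itself is not described in the abstract and is not reproduced here. The adjacency-list graph is a hypothetical fragment of an OSN.

    ```python
    # Baseline simple random walk sampling on an undirected graph (not SARW).
    import random
    from typing import Dict, List

    def random_walk_sample(graph: Dict[int, List[int]], start: int, n_steps: int) -> List[int]:
        """Collect the sequence of nodes visited by a simple random walk."""
        visited, node = [start], start
        for _ in range(n_steps):
            node = random.choice(graph[node])  # jump to a uniformly chosen neighbor
            visited.append(node)
        return visited

    toy_graph = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1], 3: [1]}  # hypothetical OSN fragment
    print(random_walk_sample(toy_graph, start=0, n_steps=10))
    ```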

  14. Does Scale Really Matter? Ultra-Large-Scale Systems Seven Years after the Study

    DTIC Science & Technology

    2013-05-24

    Beyonce Knowles releases second consecutive No.1 album and fourth No.1 single in the US BlackBerry users numbered 4,900,000 in March, 2006...And yet…there is a fast growing gap between our research and reality. 75 Does Scale Really Matter?: ULS Systems Seven Years Later Linda Northrop

  15. Overview of current research on atmospheric interactions with wildland fires

    Treesearch

    Warren E. Heilman

    1996-01-01

    Changes in the large-scale mean thermal structure of the atmosphere have the potential for affecting the dynamics of the atmosphere across the entire spectrum of scales that govern atmospheric processes. Inherent in these changes are interactions among the scales that could change, resulting in an alteration in the frequency of regional weather systems conducive to...

  16. Preparing Laboratory and Real-World EEG Data for Large-Scale Analysis: A Containerized Approach

    PubMed Central

    Bigdely-Shamlo, Nima; Makeig, Scott; Robbins, Kay A.

    2016-01-01

    Large-scale analysis of EEG and other physiological measures promises new insights into brain processes and more accurate and robust brain–computer interface models. However, the absence of standardized vocabularies for annotating events in a machine-understandable manner, the welter of collection-specific data organizations, the difficulty in moving data across processing platforms, and the unavailability of agreed-upon standards for preprocessing have prevented large-scale analyses of EEG. Here we describe a “containerized” approach and freely available tools we have developed to facilitate the process of annotating, packaging, and preprocessing EEG data collections to enable data sharing, archiving, large-scale machine learning/data mining and (meta-)analysis. The EEG Study Schema (ESS) comprises three data “Levels,” each with its own XML-document schema and file/folder convention, plus a standardized (PREP) pipeline to move raw (Data Level 1) data to a basic preprocessed state (Data Level 2) suitable for application of a large class of EEG analysis methods. Researchers can ship a study as a single unit and operate on its data using a standardized interface. ESS does not require a central database and provides all the metadata necessary to execute a wide variety of EEG processing pipelines. The primary focus of ESS is automated in-depth analysis and meta-analysis of EEG studies. However, ESS can also encapsulate meta-information for other modalities, such as eye tracking, that are increasingly used in both laboratory and real-world neuroimaging. ESS schema and tools are freely available at www.eegstudy.org and a central catalog of over 850 GB of existing data in ESS format is available at studycatalog.org. These tools and resources are part of a larger effort to enable data sharing at sufficient scale for researchers to engage in truly large-scale EEG analysis and data mining (BigEEG.org). PMID:27014048

  17. Annual Research Briefs

    NASA Technical Reports Server (NTRS)

    Spinks, Debra (Compiler)

    1997-01-01

    This report contains the 1997 annual progress reports of the research fellows and students supported by the Center for Turbulence Research (CTR). Titles include: Invariant modeling in large-eddy simulation of turbulence; Validation of large-eddy simulation in a plain asymmetric diffuser; Progress in large-eddy simulation of trailing-edge turbulence and aeronautics; Resolution requirements in large-eddy simulations of shear flows; A general theory of discrete filtering for LES in complex geometry; On the use of discrete filters for large eddy simulation; Wall models in large eddy simulation of separated flow; Perspectives for ensemble average LES; Anisotropic grid-based formulas for subgrid-scale models; Some modeling requirements for wall models in large eddy simulation; Numerical simulation of 3D turbulent boundary layers using the V2F model; Accurate modeling of impinging jet heat transfer; Application of turbulence models to high-lift airfoils; Advances in structure-based turbulence modeling; Incorporating realistic chemistry into direct numerical simulations of turbulent non-premixed combustion; Effects of small-scale structure on turbulent mixing; Turbulent premixed combustion in the laminar flamelet and the thin reaction zone regime; Large eddy simulation of combustion instabilities in turbulent premixed burners; On the generation of vorticity at a free-surface; Active control of turbulent channel flow; A generalized framework for robust control in fluid mechanics; Combined immersed-boundary/B-spline methods for simulations of flow in complex geometries; and DNS of shock boundary-layer interaction - preliminary results for compression ramp flow.

  18. Development of a Two-Stage Microalgae Dewatering Process – A Life Cycle Assessment Approach

    PubMed Central

    Soomro, Rizwan R.; Zeng, Xianhai; Lu, Yinghua; Lin, Lu; Danquah, Michael K.

    2016-01-01

    Even though microalgal biomass is leading third-generation biofuel research, significant effort is required to establish an economically viable commercial-scale microalgal biofuel production system. Whilst a significant amount of work has been reported on large-scale cultivation of microalgae using photo-bioreactors and pond systems, research focus on establishing high-performance downstream dewatering operations for large-scale processing under optimal economy is limited. The enormous amount of energy and associated cost required for dewatering large-volume microalgal cultures has been the primary hindrance to the development of the needed biomass quantity for industrial-scale microalgal biofuels production. The extremely dilute nature of large-volume microalgal suspension and the small size of microalgae cells in suspension create a significant processing cost during dewatering, and this has raised major concerns towards the economic success of commercial-scale microalgal biofuel production as an alternative to conventional petroleum fuels. This article reports an effective framework to assess the performance of different dewatering technologies as the basis to establish an effective two-stage dewatering system. Bioflocculation coupled with tangential flow filtration (TFF) emerged as a promising technique with total energy input of 0.041 kWh, 0.05 kg CO2 emissions and a cost of $0.0043 for producing 1 kg of microalgae biomass. A streamlined process for operational analysis of the two-stage microalgae dewatering technique, encompassing energy input, carbon dioxide emission, and process cost, is presented. PMID:26904075
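
    A simple scale-up of the per-kilogram figures quoted in the abstract for the bioflocculation plus tangential flow filtration route; the one-tonne-per-day harvest rate is a hypothetical plant size chosen only to illustrate the arithmetic.

    ```python
    # Scale the abstract's per-kg dewatering figures to a hypothetical daily harvest.
    ENERGY_KWH_PER_KG = 0.041   # from the abstract
    CO2_KG_PER_KG = 0.05        # from the abstract
    COST_USD_PER_KG = 0.0043    # from the abstract

    harvest_kg_per_day = 1000.0  # hypothetical 1 t/day plant
    print(f"Energy: {ENERGY_KWH_PER_KG * harvest_kg_per_day:.1f} kWh/day")  # 41.0 kWh/day
    print(f"CO2:    {CO2_KG_PER_KG * harvest_kg_per_day:.1f} kg/day")       # 50.0 kg/day
    print(f"Cost:   ${COST_USD_PER_KG * harvest_kg_per_day:.2f}/day")       # $4.30/day
    ```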

  19. A Multi-Time Scale Morphable Software Milieu for Polymorphous Computing Architectures (PCA) - Composable, Scalable Systems

    DTIC Science & Technology

    2004-10-01

    MONITORING AGENCY NAME(S) AND ADDRESS(ES) Defense Advanced Research Projects Agency AFRL/IFTC 3701 North Fairfax Drive...Scalable Parallel Libraries for Large-Scale Concurrent Applications," Technical Report UCRL-JC-109251, Lawrence Livermore National Laboratory

  20. Operations

    ERIC Educational Resources Information Center

    Wilkins, Jesse L. M.; Norton, Anderson; Boyce, Steven J.

    2013-01-01

    Previous research has documented schemes and operations that undergird students' understanding of fractions. This prior research was based, in large part, on small-group teaching experiments. However, written assessments are needed in order for teachers and researchers to assess students' ways of operating on a whole-class scale. In this study,…

  1. "Fan-Tip-Drive" High-Power-Density, Permanent Magnet Electric Motor and Test Rig Designed for a Nonpolluting Aircraft Propulsion Program

    NASA Technical Reports Server (NTRS)

    Brown, Gerald V.; Kascak, Albert F.

    2004-01-01

    A scaled blade-tip-drive test rig was designed at the NASA Glenn Research Center. The rig is a scaled version of a direct-current brushless motor that would be located in the shroud of a thrust fan. This geometry is very attractive since the allowable speed of the armature is approximately the speed of the blade tips (Mach 1 or 1100 ft/s). The magnetic pressure generated in the motor acts over a large area and, thus, produces a large force or torque. This large force multiplied by the large velocity results in a high-power-density motor.

  2. Experimental Simulations of Large-Scale Collisions

    NASA Technical Reports Server (NTRS)

    Housen, Kevin R.

    2002-01-01

    This report summarizes research on the effects of target porosity on the mechanics of impact cratering. Impact experiments conducted on a centrifuge provide direct simulations of large-scale cratering on porous asteroids. The experiments show that large craters in porous materials form mostly by compaction, with essentially no deposition of material into the ejecta blanket that is a signature of cratering in less-porous materials. The ratio of ejecta mass to crater mass is shown to decrease with increasing crater size or target porosity. These results are consistent with the observation that large closely-packed craters on asteroid Mathilde appear to have formed without degradation to earlier craters.

  3. A unique large-scale undergraduate research experience in molecular systems biology for non-mathematics majors.

    PubMed

    Kappler, Ulrike; Rowland, Susan L; Pedwell, Rhianna K

    2017-05-01

    Systems biology is frequently taught with an emphasis on mathematical modeling approaches. This focus effectively excludes most biology, biochemistry, and molecular biology students, who are not mathematics majors. The mathematical focus can also present a misleading picture of systems biology, which is a multi-disciplinary pursuit requiring collaboration between biochemists, bioinformaticians, and mathematicians. This article describes an authentic large-scale undergraduate research experience (ALURE) in systems biology that incorporates proteomics, bacterial genomics, and bioinformatics in one exercise. This project is designed to engage students who have a basic grounding in protein chemistry and metabolism and no mathematical modeling skills. The pedagogy around the research experience is designed to help students attack complex datasets and use their emergent metabolic knowledge to make meaning from large amounts of raw data. On completing the ALURE, participants reported a significant increase in their confidence around analyzing large datasets, while the majority of the cohort reported good or great gains in a variety of skills including "analysing data for patterns" and "conducting database or internet searches." An environmental scan shows that this ALURE is the only undergraduate-level systems-biology research project offered on a large scale in Australia; this speaks to the perceived difficulty of implementing such an opportunity for students. We argue, however, that based on the student feedback, allowing undergraduate students to complete a systems-biology project is both feasible and desirable, even if the students are not maths and computing majors. © 2016 by The International Union of Biochemistry and Molecular Biology, 45(3):235-248, 2017. © 2016 The International Union of Biochemistry and Molecular Biology.

  4. Teacher Beliefs, Teacher Concerns, and School Leadership Support as Influences on School Readiness for Implementing a Research-Based Reform Model

    ERIC Educational Resources Information Center

    Carhart, Elizabeth Hoag

    2013-01-01

    Federal policy makers and school leaders increasingly recognize middle school math as a turning point in students' academic success. An i3 scale-up grant allowed grant partners to conduct a large-scale implementation of PowerTeaching (PT), a research-based reform to increase student math achievement. In a mixed-methods study during the pilot phase…

  5. Research Guidelines in the Era of Large-scale Collaborations: An Analysis of Genome-wide Association Study Consortia

    PubMed Central

    Austin, Melissa A.; Hair, Marilyn S.; Fullerton, Stephanie M.

    2012-01-01

    Scientific research has shifted from studies conducted by single investigators to the creation of large consortia. Genetic epidemiologists, for example, now collaborate extensively for genome-wide association studies (GWAS). The effect has been a stream of confirmed disease-gene associations. However, effects on human subjects oversight, data-sharing, publication and authorship practices, research organization and productivity, and intellectual property remain to be examined. The aim of this analysis was to identify all research consortia that had published the results of a GWAS analysis since 2005, characterize them, determine which have publicly accessible guidelines for research practices, and summarize the policies in these guidelines. A review of the National Human Genome Research Institute’s Catalog of Published Genome-Wide Association Studies identified 55 GWAS consortia as of April 1, 2011. These consortia were comprised of individual investigators, research centers, studies, or other consortia and studied 48 different diseases or traits. Only 14 (25%) were found to have publicly accessible research guidelines on consortia websites. The available guidelines provide information on organization, governance, and research protocols; half address institutional review board approval. Details of publication, authorship, data-sharing, and intellectual property vary considerably. Wider access to consortia guidelines is needed to establish appropriate research standards with broad applicability to emerging forms of large-scale collaboration. PMID:22491085

  6. Large-scale production of lentiviral vector in a closed system hollow fiber bioreactor

    PubMed Central

    Sheu, Jonathan; Beltzer, Jim; Fury, Brian; Wilczek, Katarzyna; Tobin, Steve; Falconer, Danny; Nolta, Jan; Bauer, Gerhard

    2015-01-01

    Lentiviral vectors are widely used in the field of gene therapy as an effective method for permanent gene delivery. While current methods of producing small scale vector batches for research purposes depend largely on culture flasks, the emergence and popularity of lentiviral vectors in translational, preclinical and clinical research has demanded their production on a much larger scale, a task that can be difficult to manage with the numbers of producer cell culture flasks required for large volumes of vector. To generate a large scale, partially closed system method for the manufacturing of clinical grade lentiviral vector suitable for the generation of induced pluripotent stem cells (iPSCs), we developed a method employing a hollow fiber bioreactor traditionally used for cell expansion. We have demonstrated the growth, transfection, and vector-producing capability of 293T producer cells in this system. Vector particle RNA titers after subsequent vector concentration yielded values comparable to lentiviral iPSC induction vector batches produced using traditional culture methods in 225 cm2 flasks (T225s) and in 10-layer cell factories (CF10s), while yielding a volume nearly 145 times larger than the yield from a T225 flask and nearly three times larger than the yield from a CF10. Employing a closed system hollow fiber bioreactor for vector production offers the possibility of manufacturing large quantities of gene therapy vector while minimizing reagent usage, equipment footprint, and open system manipulation. PMID:26151065

  7. Design of a decentralized reusable research database architecture to support data acquisition in large research projects.

    PubMed

    Iavindrasana, Jimison; Depeursinge, Adrien; Ruch, Patrick; Spahni, Stéphane; Geissbuhler, Antoine; Müller, Henning

    2007-01-01

    The diagnostic and therapeutic processes, as well as the development of new treatments, are hindered by the fragmentation of the information that underlies them. In a multi-institutional research study database, the clinical information system (CIS) contains the primary data input. A substantial share of the budget of large-scale clinical studies is often spent on data creation and maintenance. The objective of this work is to design a decentralized, scalable, reusable database architecture with lower maintenance costs for managing and integrating distributed heterogeneous data required as the basis for a large-scale research project. Technical and legal aspects are taken into account based on various use case scenarios. The architecture contains four layers: data storage and access decentralized at the production source, a connector acting as a proxy between the CIS and the external world, an information mediator serving as a data access point, and the client side. The proposed design will be implemented inside six clinical centers participating in the @neurIST project as part of a larger system on data integration and reuse for aneurysm treatment.

  8. Automation of Survey Data Processing, Documentation and Dissemination: An Application to Large-Scale Self-Reported Educational Survey.

    ERIC Educational Resources Information Center

    Shim, Eunjae; Shim, Minsuk K.; Felner, Robert D.

    Automation of the survey process has proved successful in many industries, yet it is still underused in educational research. This is largely due to the facts that (1) number crunching is usually carried out using software that was developed before modern information technology existed, and (2) educational research is to a great extent trapped…

  9. International law poses problems for negative emissions research

    NASA Astrophysics Data System (ADS)

    Brent, Kerryn; McGee, Jeffrey; McDonald, Jan; Rohling, Eelco J.

    2018-06-01

    New international governance arrangements that manage environmental risk and potential conflicts of interests are needed to facilitate negative emissions research that is essential to achieving the large-scale CO2 removal implied by the Paris Agreement targets.

  10. HTS-DB: an online resource to publish and query data from functional genomics high-throughput siRNA screening projects.

    PubMed

    Saunders, Rebecca E; Instrell, Rachael; Rispoli, Rossella; Jiang, Ming; Howell, Michael

    2013-01-01

    High-throughput screening (HTS) uses technologies such as RNA interference to generate loss-of-function phenotypes on a genomic scale. As these technologies become more popular, many research institutes have established core facilities of expertise to deal with the challenges of large-scale HTS experiments. As the efforts of core facility screening projects come to fruition, focus has shifted towards managing the results of these experiments and making them available in a useful format that can be further mined for phenotypic discovery. The HTS-DB database provides a public view of data from screening projects undertaken by the HTS core facility at the CRUK London Research Institute. All projects and screens are described with comprehensive assay protocols, and datasets are provided with complete descriptions of analysis techniques. This format allows users to browse and search data from large-scale studies in an informative and intuitive way. It also provides a repository for additional measurements obtained from screens that were not the focus of the project, such as cell viability, and groups these data so that it can provide a gene-centric summary across several different cell lines and conditions. All datasets from our screens that can be made available can be viewed interactively and mined for further hit lists. We believe that in this format, the database provides researchers with rapid access to results of large-scale experiments that might facilitate their understanding of genes/compounds identified in their own research. DATABASE URL: http://hts.cancerresearchuk.org/db/public.

  11. A new framework to increase the efficiency of large-scale solar power plants.

    NASA Astrophysics Data System (ADS)

    Alimohammadi, Shahrouz; Kleissl, Jan P.

    2015-11-01

    A new framework to estimate the spatio-temporal behavior of solar power is introduced, which predicts the statistical behavior of power output at utility-scale photovoltaic (PV) power plants. The framework is based on spatio-temporal Gaussian process regression (Kriging) models, which incorporate satellite data with the UCSD version of the Weather Research and Forecasting model. The framework is designed to improve the efficiency of large-scale solar power plants. The results are validated against measurements from local pyranometer sensors, and improvements are observed in several scenarios.
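
    A rough sketch of the statistical core only -- Gaussian process (Kriging) regression of a power-related signal on spatial coordinates, with predictive uncertainty -- using scikit-learn and synthetic data; it is not the paper's coupled satellite/WRF framework, and the coordinates and signal are hypothetical.

    ```python
    # Spatial Gaussian process (Kriging) regression with predictive uncertainty.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    rng = np.random.default_rng(0)
    X = rng.uniform(0.0, 10.0, size=(50, 2))             # hypothetical sensor coordinates (km)
    y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(50)  # synthetic normalized power signal

    kernel = 1.0 * RBF(length_scale=2.0) + WhiteKernel(noise_level=0.01)
    gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

    X_new = np.array([[5.0, 5.0]])                   # location without a sensor
    mean, std = gpr.predict(X_new, return_std=True)  # Kriging prediction and uncertainty
    print(mean, std)
    ```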

  12. Current Barriers to Large-scale Interoperability of Traceability Technology in the Seafood Sector.

    PubMed

    Hardt, Marah J; Flett, Keith; Howell, Colleen J

    2017-08-01

    Interoperability is a critical component of full-chain digital traceability, but is almost nonexistent in the seafood industry. Using both quantitative and qualitative methodology, this study explores the barriers impeding progress toward large-scale interoperability among digital traceability systems in the seafood sector from the perspectives of seafood companies, technology vendors, and supply chains as a whole. We highlight lessons from recent research and field work focused on implementing traceability across full supply chains and make some recommendations for next steps in terms of overcoming challenges and scaling current efforts. © 2017 Institute of Food Technologists®.

  13. Understanding ocean acidification impacts on organismal to ecological scales

    USGS Publications Warehouse

    Andersson, Andreas J; Kline, David I; Edmunds, Peter J; Archer, Stephen D; Bednaršek, Nina; Carpenter, Robert C; Chadsey, Meg; Goldstein, Philip; Grottoli, Andrea G.; Hurst, Thomas P; King, Andrew L; Kübler, Janet E.; Kuffner, Ilsa B.; Mackey, Katherine R M; Menge, Bruce A.; Paytan, Adina; Riebesell, Ulf; Schnetzer, Astrid; Warner, Mark E; Zimmerman, Richard C

    2015-01-01

    Ocean acidification (OA) research seeks to understand how marine ecosystems and global elemental cycles will respond to changes in seawater carbonate chemistry in combination with other environmental perturbations such as warming, eutrophication, and deoxygenation. Here, we discuss the effectiveness and limitations of current research approaches used to address this goal. A diverse combination of approaches is essential to decipher the consequences of OA to marine organisms, communities, and ecosystems. Consequently, the benefits and limitations of each approach must be considered carefully. Major research challenges involve experimentally addressing the effects of OA in the context of large natural variability in seawater carbonate system parameters and other interactive variables, integrating the results from different research approaches, and scaling results across different temporal and spatial scales.

  14. Optical/IR from ground

    NASA Technical Reports Server (NTRS)

    Strom, Stephen; Sargent, Wallace L. W.; Wolff, Sidney; Ahearn, Michael F.; Angel, J. Roger; Beckwith, Steven V. W.; Carney, Bruce W.; Conti, Peter S.; Edwards, Suzan; Grasdalen, Gary

    1991-01-01

    Optical/infrared (O/IR) astronomy in the 1990's is reviewed. The following subject areas are included: research environment; science opportunities; technical development of the 1980's and opportunities for the 1990's; and ground-based O/IR astronomy outside the U.S. Recommendations are presented for: (1) large scale programs (Priority 1: a coordinated program for large O/IR telescopes); (2) medium scale programs (Priority 1: a coordinated program for high angular resolution; Priority 2: a new generation of 4-m class telescopes); (3) small scale programs (Priority 1: near-IR and optical all-sky surveys; Priority 2: a National Astrometric Facility); and (4) infrastructure issues (develop, purchase, and distribute optical CCDs and infrared arrays; a program to support large optics technology; a new generation of large filled aperture telescopes; a program to archive and disseminate astronomical databases; and a program for training new instrumentalists)

  15. AirSTAR: A UAV Platform for Flight Dynamics and Control System Testing

    NASA Technical Reports Server (NTRS)

    Jordan, Thomas L.; Foster, John V.; Bailey, Roger M.; Belcastro, Christine M.

    2006-01-01

    As part of the NASA Aviation Safety Program at Langley Research Center, a dynamically scaled unmanned aerial vehicle (UAV) and associated ground based control system are being developed to investigate dynamics modeling and control of large transport vehicles in upset conditions. The UAV is a 5.5% (seven foot wingspan), twin turbine, generic transport aircraft with a sophisticated instrumentation and telemetry package. A ground based, real-time control system is located inside an operations vehicle for the research pilot and associated support personnel. The telemetry system supports over 70 channels of data plus video for the downlink and 30 channels for the control uplink. Data rates are in excess of 200 Hz. Dynamic scaling of the UAV, which includes dimensional, weight, inertial, actuation, and control system scaling, is required so that the sub-scale vehicle will realistically simulate the flight characteristics of the full-scale aircraft. This testbed will be utilized to validate modeling methods, flight dynamics characteristics, and control system designs for large transport aircraft, with the end goal being the development of technologies to reduce the fatal accident rate due to loss-of-control.

  16. Yong-Ki Kim — His Life and Recent Work

    NASA Astrophysics Data System (ADS)

    Stone, Philip M.

    2007-08-01

    Dr. Kim made internationally recognized contributions in many areas of atomic physics research and applications, and was still very active when he was killed in an automobile accident. He joined NIST in 1983 after 17 years at the Argonne National Laboratory following his Ph.D. work at the University of Chicago. Much of his early work at Argonne and especially at NIST was the elucidation and detailed analysis of the structure of highly charged ions. He developed a sophisticated, fully relativistic atomic structure theory that accurately predicts atomic energy levels, transition wavelengths, lifetimes, and transition probabilities for a large number of ions. This information has been vital to model the properties of the hot interior of fusion research plasmas, where atomic ions must be described with relativistic atomic structure calculations. In recent years, Dr. Kim worked on the precise calculation of ionization and excitation cross sections of numerous atoms, ions, and molecules that are important in fusion research and in plasma processing for manufacturing semiconductor chips. Dr. Kim greatly advanced the state-of-the-art of calculations for these cross sections through development and implementation of highly innovative methods, including his Binary-Encounter-Bethe (BEB) theory and a scaled plane wave Born (scaled PWB) theory. His methods, using closed quantum mechanical formulas and no adjustable parameters, avoid tedious large-scale computations with main-frame computers. His calculations closely reproduce the results of benchmark experiments as well as large-scale calculations requiring hours of computer time. This recent work on BEB and scaled PWB is reviewed and examples of its capabilities are shown.
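
    For context, the standard published form of Dr. Kim's Binary-Encounter-Bethe (BEB) ionization cross section per orbital is reproduced below from the literature (quoted as background; the notation is the commonly used one, not taken from this obituary).

    ```latex
    % BEB electron-impact ionization cross section per orbital (Kim & Rudd form).
    % t = T/B, u = U/B; T is the incident electron energy, B the orbital binding
    % energy, U the orbital kinetic energy, N the orbital occupation number,
    % a_0 the Bohr radius, and R the Rydberg energy.
    \[
      \sigma_{\mathrm{BEB}}
        = \frac{S}{t+u+1}
          \left[ \frac{\ln t}{2}\left(1-\frac{1}{t^{2}}\right)
                 + 1 - \frac{1}{t} - \frac{\ln t}{t+1} \right],
      \qquad
      S = 4\pi a_{0}^{2}\, N \left(\frac{R}{B}\right)^{2}.
    \]
    ```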

  17. Studies on combined model based on functional objectives of large scale complex engineering

    NASA Astrophysics Data System (ADS)

    Yuting, Wang; Jingchun, Feng; Jiabao, Sun

    2018-03-01

    Large-scale complex engineering comprises various functions, each of which is delivered through the completion of one or more projects, so the combinations of projects that affect each function must be identified. Based on the types of project portfolios, the relationships between projects and their functional objectives were analyzed. On that premise, portfolio techniques based on the functional objectives of projects were introduced, and the principles for applying them were studied and proposed. In addition, the processes for combining projects were constructed. With the help of these portfolio techniques based on the functional objectives of projects, our research findings lay a good foundation for the portfolio management of large-scale complex engineering.

  18. Using Propensity Scores in Quasi-Experimental Designs to Equate Groups

    ERIC Educational Resources Information Center

    Lane, Forrest C.; Henson, Robin K.

    2010-01-01

    Education research rarely lends itself to large-scale experimental research and true randomization, leaving researchers to rely on quasi-experimental designs. The problem with quasi-experimental research is that underlying factors may impact group selection and lead to potentially biased results. One way to minimize the impact of non-randomization is…
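
    A hedged sketch of the basic procedure the abstract alludes to: estimate each case's propensity score (the probability of treatment given observed covariates) with a logistic regression, then match treated and comparison cases with similar scores. The data are synthetic and the greedy matching rule is illustrative, not the authors' method.

    ```python
    # Propensity score estimation and greedy nearest-neighbor matching (illustration).
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)
    X = rng.standard_normal((200, 3))                               # hypothetical covariates
    treated = (X[:, 0] + rng.standard_normal(200) > 0).astype(int)  # non-random group selection

    # Step 1: propensity scores from a logistic regression of treatment on covariates.
    ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

    # Step 2: greedy 1-to-1 matching (with replacement) on the propensity score.
    treated_idx = np.where(treated == 1)[0]
    control_idx = np.where(treated == 0)[0]
    matches = {int(i): int(control_idx[np.argmin(np.abs(ps[control_idx] - ps[i]))])
               for i in treated_idx}
    print(list(matches.items())[:5])
    ```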

  19. Reconciling Rigour and Impact by Collaborative Research Design: Study of Teacher Agency

    ERIC Educational Resources Information Center

    Pantic, Nataša

    2017-01-01

    This paper illustrates a new way of working collaboratively on the development of a methodology for studying teacher agency for social justice. Increasing emphasis of impact on change as a purpose of social research raises questions about appropriate research designs. Large-scale quantitative research framed within externally set parameters has…

  20. Big Data in Medicine is Driving Big Changes

    PubMed Central

    Verspoor, K.

    2014-01-01

    Summary Objectives To summarise current research that takes advantage of “Big Data” in health and biomedical informatics applications. Methods Survey of trends in this work, and exploration of literature describing how large-scale structured and unstructured data sources are being used to support applications from clinical decision making and health policy, to drug design and pharmacovigilance, and further to systems biology and genetics. Results The survey highlights ongoing development of powerful new methods for turning that large-scale, and often complex, data into information that provides new insights into human health, in a range of different areas. Consideration of this body of work identifies several important paradigm shifts that are facilitated by Big Data resources and methods: in clinical and translational research, from hypothesis-driven research to data-driven research, and in medicine, from evidence-based practice to practice-based evidence. Conclusions The increasing scale and availability of large quantities of health data require strategies for data management, data linkage, and data integration beyond the limits of many existing information systems, and substantial effort is underway to meet those needs. As our ability to make sense of that data improves, the value of the data will continue to increase. Health systems, genetics and genomics, population and public health; all areas of biomedicine stand to benefit from Big Data and the associated technologies. PMID:25123716

  1. Modeling Change in Large-Scale Longitudinal Studies of Educational Growth: Four Decades of Contributions to the Assessment of Educational Growth. ETS R&D Scientific and Policy Contributions Series. ETS SPC-12-01. Research Report No. RR-12-04

    ERIC Educational Resources Information Center

    Rock, Donald A.

    2012-01-01

    This paper provides a history of ETS's role in developing assessment instruments and psychometric procedures for measuring change in large-scale national assessments funded by the Longitudinal Studies branch of the National Center for Education Statistics. It documents the innovations developed during more than 30 years of working with…

  2. Modeling Change in Large-Scale Longitudinal Studies of Educational Growth: Four Decades of Contributions to the Assessment of Educational Growth. Research Report. ETS RR-12-04. ETS R&D Scientific and Policy Contributions Series. ETS SPC-12-01

    ERIC Educational Resources Information Center

    Rock, Donald A.

    2012-01-01

    This paper provides a history of ETS's role in developing assessment instruments and psychometric procedures for measuring change in large-scale national assessments funded by the Longitudinal Studies branch of the National Center for Education Statistics. It documents the innovations developed during more than 30 years of working with…

  3. The impact of large-scale, long-term optical surveys on pulsating star research

    NASA Astrophysics Data System (ADS)

    Soszyński, Igor

    2017-09-01

    The era of large-scale photometric variability surveys began a quarter of a century ago, when three microlensing projects - EROS, MACHO, and OGLE - started their operation. These surveys initiated a revolution in the field of variable stars and in the next years they inspired many new observational projects. Large-scale optical surveys multiplied the number of variable stars known in the Universe. The huge, homogeneous and complete catalogs of pulsating stars, such as Cepheids, RR Lyrae stars, or long-period variables, offer an unprecedented opportunity to calibrate and test the accuracy of various distance indicators, to trace the three-dimensional structure of the Milky Way and other galaxies, to discover exotic types of intrinsically variable stars, or to study previously unknown features and behaviors of pulsators. We present historical and recent findings on various types of pulsating stars obtained from the optical large-scale surveys, with particular emphasis on the OGLE project which currently offers the largest photometric database among surveys for stellar variability.

  4. Chronic, Wireless Recordings of Large Scale Brain Activity in Freely Moving Rhesus Monkeys

    PubMed Central

    Schwarz, David A.; Lebedev, Mikhail A.; Hanson, Timothy L.; Dimitrov, Dragan F.; Lehew, Gary; Meloy, Jim; Rajangam, Sankaranarayani; Subramanian, Vivek; Ifft, Peter J.; Li, Zheng; Ramakrishnan, Arjun; Tate, Andrew; Zhuang, Katie; Nicolelis, Miguel A.L.

    2014-01-01

    Advances in techniques for recording large-scale brain activity contribute to both the elucidation of neurophysiological principles and the development of brain-machine interfaces (BMIs). Here we describe a neurophysiological paradigm for performing tethered and wireless large-scale recordings based on movable volumetric three-dimensional (3D) multielectrode implants. This approach allowed us to isolate up to 1,800 units per animal and simultaneously record the extracellular activity of close to 500 cortical neurons, distributed across multiple cortical areas, in freely behaving rhesus monkeys. The method is expandable, in principle, to thousands of simultaneously recorded channels. It also allows increased recording longevity (5 consecutive years), and recording of a broad range of behaviors, e.g. social interactions, and BMI paradigms in freely moving primates. We propose that wireless large-scale recordings could have a profound impact on basic primate neurophysiology research, while providing a framework for the development and testing of clinically relevant neuroprostheses. PMID:24776634

  5. Studies of Sub-Synchronous Oscillations in Large-Scale Wind Farm Integrated System

    NASA Astrophysics Data System (ADS)

    Yue, Liu; Hang, Mend

    2018-01-01

    With the rapid development, construction, and grid connection of large-scale wind farms, series-compensated AC transmission is gradually becoming the main means of delivering wind power and improving its availability and grid stability, but the integration of wind farms changes the sub-synchronous oscillation (SSO) damping characteristics of the synchronous generator system. Addressing this SSO problem caused by the integration of large-scale wind farms, this paper focuses on doubly fed induction generator (DFIG) based wind farms and summarizes the SSO mechanisms in large-scale, series-compensated wind power integrated systems, which can be classified into three types: sub-synchronous control interaction (SSCI), sub-synchronous torsional interaction (SSTI), and sub-synchronous resonance (SSR). SSO modelling and analysis methods are then categorized and compared by their applicable areas. Furthermore, this paper summarizes the suppression measures of actual SSO projects based on different control objectives. Finally, research prospects in this field are explored.

  6. An outdoor test facility for the large-scale production of microalgae

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, D.A.; Weissman, J.; Goebel, R.

    The goal of the US Department of Energy/Solar Energy Research Institute's Aquatic Species Program is to develop the technology base to produce liquid fuels from microalgae. This technology is being initially developed for the desert Southwest. As part of this program an outdoor test facility has been designed and constructed in Roswell, New Mexico. The site has a large existing infrastructure, a suitable climate, and abundant saline groundwater. This facility will be used to evaluate productivity of microalgae strains and conduct large-scale experiments to increase biomass productivity while decreasing production costs. Six 3-m² fiberglass raceways were constructed. Several microalgae strains were screened for growth, one of which had a short-term productivity rate of greater than 50 g dry wt m⁻² d⁻¹. Two large-scale, 0.1-ha raceways have also been built. These are being used to evaluate the performance trade-offs between low-cost earthen liners and higher cost plastic liners. A series of hydraulic measurements is also being carried out to evaluate future improved pond designs. Future plans include a 0.5-ha pond, which will be built in approximately 2 years to test a scaled-up system. This unique facility will be available to other researchers and industry for studies on microalgae productivity. 6 refs., 9 figs., 1 tab.

  7. Development of fine-resolution analyses and expanded large-scale forcing properties. Part II: Scale-awareness and application to single-column model experiments

    DOE PAGES

    Feng, Sha; Vogelmann, Andrew M.; Li, Zhijin; ...

    2015-01-20

    Fine-resolution three-dimensional fields have been produced using the Community Gridpoint Statistical Interpolation (GSI) data assimilation system for the U.S. Department of Energy's Atmospheric Radiation Measurement Program (ARM) Southern Great Plains region. The GSI system is implemented in a multi-scale data assimilation framework using the Weather Research and Forecasting model at a cloud-resolving resolution of 2 km. From the fine-resolution three-dimensional fields, large-scale forcing is derived explicitly at grid-scale resolution; a subgrid-scale dynamic component is derived separately, representing subgrid-scale horizontal dynamic processes. Analyses show that the subgrid-scale dynamic component is often a major component of the large-scale forcing for grid scales larger than 200 km. The single-column model (SCM) of the Community Atmosphere Model version 5 (CAM5) is used to examine the impact of the grid-scale and subgrid-scale dynamic components on simulated precipitation and cloud fields associated with a mesoscale convective system. It is found that grid-scale size impacts simulated precipitation, resulting in an overestimation for grid scales of about 200 km but an underestimation for smaller grids. The subgrid-scale dynamic component has an appreciable impact on the simulations, suggesting that grid-scale and subgrid-scale dynamic components should be considered in the interpretation of SCM simulations.

  8. Organizational Influences on Interdisciplinary Interactions during Research and Design of Large-Scale Complex Engineered Systems

    NASA Technical Reports Server (NTRS)

    McGowan, Anna-Maria R.; Seifert, Colleen M.; Papalambros, Panos Y.

    2012-01-01

    The design of large-scale complex engineered systems (LaCES) such as an aircraft is inherently interdisciplinary. Multiple engineering disciplines, drawing from a team of hundreds to thousands of engineers and scientists, are woven together throughout the research, development, and systems engineering processes to realize one system. Though research and development (R&D) is typically focused in single disciplines, the interdependencies involved in LaCES require interdisciplinary R&D efforts. This study investigates the interdisciplinary interactions that take place during the R&D and early conceptual design phases in the design of LaCES. Our theoretical framework is informed by both engineering practices and social science research on complex organizations. This paper provides a preliminary perspective on some of the organizational influences on interdisciplinary interactions based on organization theory (specifically sensemaking), data from a survey of LaCES experts, and the authors' experience in research and design. The analysis reveals couplings between the engineered system and the organization that creates it. Survey respondents noted the importance of interdisciplinary interactions and their significant benefit to the engineered system, such as innovation and problem mitigation. Substantial obstacles to interdisciplinarity beyond engineering are uncovered, including communication and organizational challenges. Addressing these challenges may ultimately foster greater efficiencies in the design and development of LaCES and improved system performance by assisting with the collective integration of interdependent knowledge bases early in the R&D effort. This research suggests that organizational and human dynamics heavily influence and even constrain the engineering effort for large-scale complex systems.

  9. Vulnerability of China's nearshore ecosystems under intensive mariculture development.

    PubMed

    Liu, Hui; Su, Jilan

    2017-04-01

    Rapid economic development and an increasing population in China have exerted tremendous pressures on coastal ecosystems. In addition to land-based pollutants and reclamation, the fast expansion of large-scale intensive mariculture activities has also brought about additional effects. So far, the ecological impact of rapid mariculture development and its large-scale operations has not drawn enough attention. In this paper, the rapid development of mariculture in China is reviewed, China's effort in the application of ecological mariculture is examined, and the vulnerability of marine ecosystems to mariculture impact is evaluated through a number of examples. Removal or reduction of large and forage fish, due both to habitat loss from reclamation/mariculture and to overfishing for food or fishmeal, may have far-reaching effects on the coastal and shelf ecosystems in the long run. Large-scale intensive mariculture operations carry with them undesirable biological and biochemical characteristics, which may have consequences for natural ecosystems beyond normally perceived spatial and temporal boundaries. As our understanding of the possible impacts of large-scale intensive mariculture is lagging far behind its development, much research is urgently needed.

  10. The Mothball, Sustainment, and Proposed Reactivation of the Hypersonic Tunnel Facility (HTF) at NASA Glenn Research Center Plum Brook Station

    NASA Technical Reports Server (NTRS)

    Thomas, Scott R.; Lee, Jinho; Stephens, John W.; Hostler, Robert W., Jr.; VonKamp, William D.

    2010-01-01

    The Hypersonic Tunnel Facility (HTF), located at NASA Glenn Research Center's Plum Brook Station in Sandusky, Ohio, is the nation's only large-scale, non-vitiated, hypersonic propulsion test facility. The HTF, with its 4-story graphite induction heater, is capable of duplicating Mach 5, 6, and 7 flight conditions. This unique propulsion system test facility has experienced several standby and reactivation cycles. The intent of this paper is to present an overview of HTF capabilities to the propulsion community, report the current status of the HTF, and share the lessons learned from putting a large-scale facility into mothball status for a later restart.

  11. When micro meets macro: microbial lipid analysis and ecosystem ecology

    NASA Astrophysics Data System (ADS)

    Balser, T.; Gutknecht, J.

    2008-12-01

    There is growing interest in linking soil microbial community composition and activity with large-scale field studies of nutrient cycling or plant community response to disturbances. And while the analysis of microbial communities has moved rapidly in the past decade from culture-based to non-culture-based techniques, it must still be asked what we have gained from the move. How well does the necessarily micro-scale of microbial analysis allow us to address questions of interest at the macro-scale? Several challenges exist in bridging the scales, and foremost is the question of methodological feasibility. Past microbiological methodologies have not been readily adaptable to the large sample sizes necessary for ecosystem-scale research. As a result, it has been difficult to generate compatible microbial and ecosystem data sets. We describe the use of a modified lipid extraction method to generate microbial community data sets that allow us to match landscape-scale or long-term ecological studies with microbial community data. We briefly discuss the challenges and advantages associated with lipid analysis as an approach for ecosystem-scale ecological studies, and provide examples from our research in ecosystem restoration and recovery following disturbance and climate change.

  12. Interactive, graphics processing unit-based evaluation of evacuation scenarios at the state scale

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perumalla, Kalyan S; Aaby, Brandon G; Yoginath, Srikanth B

    2011-01-01

    In large-scale scenarios, transportation modeling and simulation is severely constrained by simulation time. For example, few real-time simulators scale to evacuation traffic scenarios at the level of an entire state, such as Louisiana (approximately 1 million links) or Florida (2.5 million links). New simulation approaches are needed to overcome the severe computational demands of conventional (microscopic or mesoscopic) modeling techniques. Here, a new modeling and execution methodology is explored that holds the potential to provide a tradeoff among the level of behavioral detail, the scale of the transportation network, and real-time execution capabilities. A novel, field-based modeling technique and its implementation on graphics processing units are presented. Although additional research with input from domain experts is needed for refining and validating the models, the techniques reported here afford an interactive experience at very large scales of multi-million road segments. Illustrative experiments on a few state-scale networks are described based on an implementation of this approach in a software system called GARFIELD. Current modeling capabilities and implementation limitations are described, along with possible use cases and future research.

  13. Relations Between Coastal Catchment Attributes and Submarine Groundwater Discharge at Different Scales

    NASA Astrophysics Data System (ADS)

    Moosdorf, N.; Langlotz, S. T.

    2016-02-01

    Submarine groundwater discharge (SGD) has been recognized as a relevant field of coastal research in recent years. Its implications at the local scale have been documented by an increasing number of studies researching individual locations with SGD. These local studies also often emphasize its large variability. At the other end of the spectrum, global-scale studies try to estimate SGD-related fluxes of, e.g., carbon (Cole et al., 2007) and nitrogen (Beusen et al., 2013). These studies naturally use a coarse resolution, too coarse to represent the aforementioned local variability of SGD (Moosdorf et al., 2015). A way to transfer information on the local variability of SGD to large-scale flux estimates is needed. Here we discuss the upscaling of local studies based on the definition and typology of coastal catchments. Coastal catchments are those stretches of coast that do not drain into major rivers but directly into the sea. Their attributes, e.g. climate, topography, land cover, or lithology, can be used to extrapolate from the local scale to larger scales. We present first results of a typology, compare coastal catchment attributes to SGD estimates from field studies, and discuss upscaling as well as the associated uncertainties. This study aims at bridging the gap between the scales and enabling an improved representation of local-scale variability at continental to global scales. With this, it can contribute to a recent initiative to model large-scale SGD fluxes (NExT SGD). References: Beusen, A.H.W., Slomp, C.P., Bouwman, A.F., 2013. Global land-ocean linkage: direct inputs of nitrogen to coastal waters via submarine groundwater discharge. Environmental Research Letters, 8(3): 6. Cole, J.J., Prairie, Y.T., Caraco, N.F., McDowell, W.H., Tranvik, L.J., Striegl, R.G., Duarte, C.M., Kortelainen, P., Downing, J.A., Middelburg, J.J., Melack, J., 2007. Plumbing the global carbon cycle: Integrating inland waters into the terrestrial carbon budget. Ecosystems, 10(1): 171-184. Moosdorf, N., Stieglitz, T., Waska, H., Durr, H.H., Hartmann, J., 2015. Submarine groundwater discharge from tropical islands: a review. Grundwasser, 20(1): 53-67.
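    The typology-based upscaling described above amounts to assigning each unmeasured coastal catchment the mean discharge of its catchment type; the sketch below is purely illustrative, with hypothetical attribute names, values, and measurements.

```python
import pandas as pd

# Hypothetical local SGD measurements (m3 per m of coastline per day), tagged with
# coastal-catchment attributes; all names and numbers are made up for illustration.
local = pd.DataFrame({
    "lithology": ["carbonate", "carbonate", "volcanic", "volcanic", "sediment"],
    "climate":   ["tropical",  "tropical",  "tropical", "temperate", "temperate"],
    "sgd_m3_per_m_day": [12.0, 18.0, 6.5, 4.0, 2.2],
})

# Mean SGD per catchment type: the transfer function used for upscaling.
typology = local.groupby(["lithology", "climate"])["sgd_m3_per_m_day"].mean()

# Hypothetical inventory of unmeasured coastal catchments and their coastline length.
catchments = pd.DataFrame({
    "lithology": ["carbonate", "volcanic", "sediment"],
    "climate":   ["tropical",  "temperate", "temperate"],
    "coast_km":  [150.0, 80.0, 320.0],
})

catchments = catchments.join(typology, on=["lithology", "climate"])
catchments["sgd_m3_per_day"] = catchments["sgd_m3_per_m_day"] * catchments["coast_km"] * 1000.0
print(catchments)
```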

  14. Darwin the scientist.

    PubMed

    Browne, J

    2009-01-01

    Charles Darwin's experimental investigations show him to have been a superb practical researcher. These skills are often underestimated today when assessing Darwin's achievement in the Origin of Species and his other books. Supported by a private income, he turned his house and gardens into a Victorian equivalent of a modern research station. Darwin participated actively in the exchange of scientific information via letters and much of his research was also carried out through correspondence. Although this research was relatively small scale in practice, it was large scale in intellectual scope. Darwin felt he had a strong desire to understand or explain whatever he observed.

  15. EPA'S LANDSCAPE SCIENCES RESEARCH: NUTRIENT POLLUTION, FLOODING, AND HABITAT

    EPA Science Inventory

    There is a growing need to understand the pattern of landscape change at regional scales and to determine how such changes affect environmental values. Key to conducting these assessments is the development of land-cover databases that permit large-scale analyses, such as an exam...

  16. The Starkey habitat database for ungulate research: construction, documentation, and use.

    Treesearch

    Mary M. Rowland; Priscilla K. Coe; Rosemary J. Stussy; [and others].

    1998-01-01

    The Starkey Project, a large-scale, multidisciplinary research venture, began in 1987 in the Starkey Experimental Forest and Range in northeast Oregon. Researchers are studying effects of forest management on interactions and habitat use of mule deer (Odocoileus hemionus hemionus), elk (Cervus elaphus nelsoni), and cattle. A...

  17. Plague and Climate: Scales Matter

    PubMed Central

    Ben Ari, Tamara; Neerinckx, Simon; Gage, Kenneth L.; Kreppel, Katharina; Laudisoit, Anne; Leirs, Herwig; Stenseth, Nils Chr.

    2011-01-01

    Plague is enzootic in wildlife populations of small mammals in central and eastern Asia, Africa, South and North America, and has been recognized recently as a reemerging threat to humans. Its causative agent Yersinia pestis relies on wild rodent hosts and flea vectors for its maintenance in nature. Climate influences all three components (i.e., bacteria, vectors, and hosts) of the plague system and is a likely factor to explain some of plague's variability from small and regional to large scales. Here, we review effects of climate variables on plague hosts and vectors from individual or population scales to studies on the whole plague system at a large scale. Upscaled versions of small-scale processes are often invoked to explain plague variability in time and space at larger scales, presumably because similar scale-independent mechanisms underlie these relationships. This linearity assumption is discussed in the light of recent research that suggests some of its limitations. PMID:21949648

  18. Collaborative mining and interpretation of large-scale data for biomedical research insights.

    PubMed

    Tsiliki, Georgia; Karacapilidis, Nikos; Christodoulou, Spyros; Tzagarakis, Manolis

    2014-01-01

    Biomedical research becomes increasingly interdisciplinary and collaborative in nature. Researchers need to efficiently and effectively collaborate and make decisions by meaningfully assembling, mining and analyzing available large-scale volumes of complex multi-faceted data residing in different sources. In line with related research directives revealing that, in spite of the recent advances in data mining and computational analysis, humans can easily detect patterns which computer algorithms may have difficulty in finding, this paper reports on the practical use of an innovative web-based collaboration support platform in a biomedical research context. Arguing that dealing with data-intensive and cognitively complex settings is not a technical problem alone, the proposed platform adopts a hybrid approach that builds on the synergy between machine and human intelligence to facilitate the underlying sense-making and decision making processes. User experience shows that the platform enables more informed and quicker decisions, by displaying the aggregated information according to their needs, while also exploiting the associated human intelligence.

  19. Collaborative Mining and Interpretation of Large-Scale Data for Biomedical Research Insights

    PubMed Central

    Tsiliki, Georgia; Karacapilidis, Nikos; Christodoulou, Spyros; Tzagarakis, Manolis

    2014-01-01

    Biomedical research becomes increasingly interdisciplinary and collaborative in nature. Researchers need to efficiently and effectively collaborate and make decisions by meaningfully assembling, mining and analyzing available large-scale volumes of complex multi-faceted data residing in different sources. In line with related research directives revealing that, in spite of the recent advances in data mining and computational analysis, humans can easily detect patterns which computer algorithms may have difficulty in finding, this paper reports on the practical use of an innovative web-based collaboration support platform in a biomedical research context. Arguing that dealing with data-intensive and cognitively complex settings is not a technical problem alone, the proposed platform adopts a hybrid approach that builds on the synergy between machine and human intelligence to facilitate the underlying sense-making and decision making processes. User experience shows that the platform enables more informed and quicker decisions, by displaying the aggregated information according to their needs, while also exploiting the associated human intelligence. PMID:25268270

  20. Content validation of an interprofessional learning video peer assessment tool.

    PubMed

    Nisbet, Gillian; Jorm, Christine; Roberts, Chris; Gordon, Christopher J; Chen, Timothy F

    2017-12-16

    Large scale models of interprofessional learning (IPL) where outcomes are assessed are rare within health professional curricula. To date, there is sparse research describing robust assessment strategies to support such activities. We describe the development of an IPL assessment task based on peer rating of a student-generated video evidencing collaborative interprofessional practice. We provide content validation evidence of an assessment rubric in the context of large scale IPL. Two established approaches to scale development in an educational setting were combined. A literature review was undertaken to develop a conceptual model of the relevant domains and issues pertaining to assessment of student-generated videos within IPL. Starting with a prototype rubric developed from the literature, a series of staff and student workshops was undertaken to integrate expert opinion and user perspectives. Participants assessed five-minute videos produced in a prior pilot IPL activity. Outcomes from each workshop informed the next version of the rubric until agreement was reached on anchoring statements and criteria. At this point the rubric was declared fit to be used in the upcoming mandatory large scale IPL activity. The assessment rubric consisted of four domains: patient issues; interprofessional negotiation; interprofessional management plan in action; and effective use of the video medium to engage an audience. The first three domains reflected topic content relevant to the underlying construct of interprofessional collaborative practice. The fourth domain was consistent with the broader video assessment literature calling for greater emphasis on creativity in education. We have provided evidence for the content validity of a video-based peer assessment task portraying interprofessional collaborative practice in the context of large-scale IPL activities for healthcare professional students. Further research is needed to establish the reliability of such a scale.

  1. Biogeochemistry and ecology of terrestrial ecosystems of Amazonia

    NASA Astrophysics Data System (ADS)

    Malhi, Yadvinder; Davidson, Eric A.

    The last decade of research associated with the Large-Scale Biosphere-Atmosphere Experiment in Amazonia (LBA) has led to substantial advances in our understanding of the biogeochemistry and ecology of Amazonian forests and savannas, in particular in relation to the carbon cycle of Amazonia. In this chapter, we present a synthesis of results and ideas that are presented in more detail in subsequent chapters, drawing together evidence from studies of forest ecology, ecophysiology, trace gas fluxes and atmospheric flux towers, large-scale rainfall manipulation experiments and soil surveys, satellite remote sensing, and quantification of carbon and nutrient stocks and flows. The studies have demonstrated the variability of the functioning and biogeochemistry of Amazonian forests at a range of spatial and temporal scales, and they provide clues as to how Amazonia will respond to ongoing direct pressure and global atmospheric change. We conclude by highlighting key questions for the next decade of research to address.

  2. A real-time interferometer technique for compressible flow research

    NASA Technical Reports Server (NTRS)

    Bachalo, W. D.; Houser, M. J.

    1984-01-01

    Strengths and shortcomings in the application of interferometric techniques to transonic flow fields are examined and an improved method is elaborated. Such applications have demonstrated the value of interferometry in obtaining data for compressible flow research. With holographic techniques, interferometry may be applied in large-scale facilities without the use of expensive optics or elaborate vibration isolation equipment. Results obtained using holographic interferometry and other methods demonstrate that reliable qualitative and quantitative data can be acquired. Nevertheless, the conventional method can be difficult to set up and apply, and it cannot produce real-time data. A new interferometry technique is investigated that promises to be easier to apply and can provide real-time information. This single-beam technique has the necessary insensitivity to vibration for large-scale wind tunnel operations. Capabilities of the method and preliminary tests on some laboratory-scale flow fields are described.

  3. Genome resequencing in Populus: Revealing large-scale genome variation and implications on specialized-trait genomics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Muchero, Wellington; Labbe, Jessy L; Priya, Ranjan

    2014-01-01

    To date, Populus ranks among the few plant species with a complete genome sequence and other highly developed genomic resources. With the first genome sequence among all tree species, Populus has been adopted as a suitable model organism for genomic studies in trees. However, far from being just a model species, Populus is a key renewable economic resource that plays a significant role in providing raw materials for the biofuel and pulp and paper industries. Therefore, aside from leading frontiers of basic tree molecular biology and ecological research, Populus leads frontiers in addressing global economic challenges related to fuel and fiber production. The latter fact suggests that research aimed at improving the quality and quantity of Populus as a raw material will likely drive the pursuit of more targeted and deeper research in order to unlock the economic potential tied up in the molecular biology processes that drive this tree species. Advances in genome sequence-driven technologies, such as resequencing of individual genotypes, which in turn facilitates large-scale SNP discovery and identification of large-scale polymorphisms, are key determinants of future success in these initiatives. In this treatise we discuss the implications of genome sequence-enabled technologies for Populus genomic and genetic studies of complex and specialized traits.

  4. The potential of text mining in data integration and network biology for plant research: a case study on Arabidopsis.

    PubMed

    Van Landeghem, Sofie; De Bodt, Stefanie; Drebert, Zuzanna J; Inzé, Dirk; Van de Peer, Yves

    2013-03-01

    Despite the availability of various data repositories for plant research, a wealth of information currently remains hidden within the biomolecular literature. Text mining provides the necessary means to retrieve these data through automated processing of texts. However, only recently has advanced text mining methodology been implemented with sufficient computational power to process texts at a large scale. In this study, we assess the potential of large-scale text mining for plant biology research in general and for network biology in particular using a state-of-the-art text mining system applied to all PubMed abstracts and PubMed Central full texts. We present extensive evaluation of the textual data for Arabidopsis thaliana, assessing the overall accuracy of this new resource for usage in plant network analyses. Furthermore, we combine text mining information with both protein-protein and regulatory interactions from experimental databases. Clusters of tightly connected genes are delineated from the resulting network, illustrating how such an integrative approach is essential to grasp the current knowledge available for Arabidopsis and to uncover gene information through guilt by association. All large-scale data sets, as well as the manually curated textual data, are made publicly available, hereby stimulating the application of text mining data in future plant biology studies.

  5. X-ray techniques for innovation in industry

    PubMed Central

    Lawniczak-Jablonska, Krystyna; Cutler, Jeffrey

    2014-01-01

    The smart specialization declared in the European program Horizon 2020, and the increasing cooperation between research and development found in companies and researchers at universities and research institutions have created a new paradigm where many calls for proposals require participation and funding from public and private entities. This has created a unique opportunity for large-scale facilities, such as synchrotron research laboratories, to participate in and support applied research programs. Scientific staff at synchrotron facilities have developed many advanced tools that make optimal use of the characteristics of the light generated by the storage ring. These tools have been exceptionally valuable for materials characterization including X-ray absorption spectroscopy, diffraction, tomography and scattering, and have been key in solving many research and development issues. Progress in optics and detectors, as well as a large effort put into the improvement of data analysis codes, have resulted in the development of reliable and reproducible procedures for materials characterization. Research with photons has contributed to the development of a wide variety of products such as plastics, cosmetics, chemicals, building materials, packaging materials and pharma. In this review, a few examples are highlighted of successful cooperation leading to solutions of a variety of industrial technological problems which have been exploited by industry including lessons learned from the Science Link project, supported by the European Commission, as a new approach to increase the number of commercial users at large-scale research infrastructures. PMID:25485139

  6. Translational bioinformatics in the cloud: an affordable alternative

    PubMed Central

    2010-01-01

    With the continued exponential expansion of publicly available genomic data and access to low-cost, high-throughput molecular technologies for profiling patient populations, computational technologies and informatics are becoming vital considerations in genomic medicine. Although cloud computing technology is being heralded as a key enabling technology for the future of genomic research, available case studies are limited to applications in the domain of high-throughput sequence data analysis. The goal of this study was to evaluate the computational and economic characteristics of cloud computing in performing a large-scale data integration and analysis representative of research problems in genomic medicine. We find that the cloud-based analysis compares favorably in both performance and cost in comparison to a local computational cluster, suggesting that cloud computing technologies might be a viable resource for facilitating large-scale translational research in genomic medicine. PMID:20691073

  7. Problems in merging Earth sensing satellite data sets

    NASA Technical Reports Server (NTRS)

    Smith, Paul H.; Goldberg, Michael J.

    1987-01-01

    Satellite remote sensing systems provide a tremendous source of data flow to the Earth science community. These systems provide scientists with data of types and on a scale previously unattainable. Looking forward to the capabilities of Space Station and the Earth Observing System (EOS), the full realization of the potential of satellite remote sensing will be handicapped by inadequate information systems. There is a growing emphasis in Earth science research to ask questions which are multidisciplinary in nature and global in scale. Many of these research projects emphasize the interactions of the land surface, the atmosphere, and the oceans through various physical mechanisms. Conducting this research requires large and complex data sets and teams of multidisciplinary scientists, often working at remote locations. A review of the problems of merging these large volumes of data into spatially referenced and manageable data sets is presented.

  8. Evaluation of Alternative Altitude Scaling Methods for Thermal Ice Protection System in NASA Icing Research Tunnel

    NASA Technical Reports Server (NTRS)

    Lee, Sam; Addy, Harold; Broeren, Andy P.; Orchard, David M.

    2017-01-01

    A test was conducted at the NASA Icing Research Tunnel to evaluate altitude scaling methods for a thermal ice protection system. Two scaling methods based on the Weber number were compared against a method based on the Reynolds number. The results generally agreed with the previous set of tests conducted in the NRCC Altitude Icing Wind Tunnel. The Weber number based scaling methods resulted in smaller runback ice mass than the Reynolds number based scaling method. The ice accretions from the Weber number based scaling method also formed farther upstream. However, there were large differences in the accreted ice mass between the two Weber number based scaling methods. The difference became greater when the speed was increased. This indicated that there may be some Reynolds number effects that aren't fully accounted for, which warrants further study.
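    For reference, the similarity parameters named above have their usual definitions; the sketch below uses the generic forms with illustrative values (which density and length scale enter the Weber number depends on the particular icing scaling method, and none of these numbers are test conditions from the paper).

```python
def reynolds_number(rho: float, velocity: float, length: float, mu: float) -> float:
    """Re = rho * V * L / mu  (ratio of inertial to viscous forces)."""
    return rho * velocity * length / mu

def weber_number(rho: float, velocity: float, length: float, sigma: float) -> float:
    """We = rho * V**2 * L / sigma  (ratio of inertial to surface-tension forces)."""
    return rho * velocity ** 2 * length / sigma

# Illustrative values only: air density, airspeed, a reference length,
# air viscosity, and the surface tension of water.
rho_air, v, L = 1.0, 90.0, 0.5          # kg/m^3, m/s, m
mu_air, sigma_water = 1.8e-5, 0.072     # Pa*s, N/m
print(f"Re = {reynolds_number(rho_air, v, L, mu_air):.3e}")
print(f"We = {weber_number(rho_air, v, L, sigma_water):.3e}")
```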

  9. Use of a large-scale rainfall simulator reveals novel insights into stemflow generation

    NASA Astrophysics Data System (ADS)

    Levia, D. F., Jr.; Iida, S. I.; Nanko, K.; Sun, X.; Shinohara, Y.; Sakai, N.

    2017-12-01

    Detailed knowledge of stemflow generation and its effects on both hydrological and biogeochemical cycling is important to achieve a holistic understanding of forest ecosystems. Field studies and a smaller set of experiments performed under laboratory conditions have increased our process-based knowledge of stemflow production. Building upon these earlier works, a large-scale rainfall simulator was employed to deepen our understanding of stemflow generation processes. The large-scale rainfall simulator provides a unique opportunity to examine a range of rainfall intensities under constant conditions, which is difficult under natural conditions because of the variable nature of rainfall intensities in the field. Stemflow generation was examined for three species, Cryptomeria japonica D. Don (Japanese cedar), Chamaecyparis obtusa (Siebold & Zucc.) Endl. (Japanese cypress), and Zelkova serrata Thunb. (Japanese zelkova), under both leafed and leafless conditions at several rainfall intensities (15, 20, 30, 40, 50, and 100 mm h-1) using the large-scale rainfall simulator at the National Research Institute for Earth Science and Disaster Resilience (Tsukuba, Japan). Stemflow production rates and funneling ratios were examined in relation to both rainfall intensity and canopy structure. Preliminary results indicate a dynamic and complex response of the funneling ratios of individual trees to different rainfall intensities among the species examined. This is partly the result of different canopy structures, hydrophobicity of vegetative surfaces, and differential wet-up processes across species and rainfall intensities. This presentation delves into these differences and attempts to distill them into generalizable patterns, which can advance our theories of stemflow generation processes and ultimately permit better stewardship of forest resources. ________________ Funding note: This research was supported by a JSPS Invitation Fellowship for Research in Japan (Grant Award No.: S16088) and JSPS KAKENHI (Grant Award No.: JP15H05626).
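    The funneling ratio referred to above is conventionally stemflow volume normalized by the rain that would have fallen on the trunk basal area alone; the sketch below uses that standard definition with made-up numbers, not measurements from this experiment.

```python
import math

def funneling_ratio(stemflow_l: float, rainfall_mm: float, dbh_cm: float) -> float:
    """Funneling ratio = stemflow volume / (rainfall depth * trunk basal area),
    with basal area computed from the diameter at breast height (DBH).
    Units: litres, millimetres, centimetres; 1 mm of rain over 1 m^2 is 1 litre."""
    basal_area_m2 = math.pi * (dbh_cm / 100.0 / 2.0) ** 2
    equivalent_rain_l = rainfall_mm * basal_area_m2  # litres falling on the trunk footprint
    return stemflow_l / equivalent_rain_l

# Illustrative: 8 L of stemflow from a 30 mm event on a tree with a 20 cm DBH.
print(round(funneling_ratio(8.0, 30.0, 20.0), 1))  # ~8.5, i.e. the trunk funnels ~8.5x
```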

  10. Grant Development for Large Scale Research Proposals: An Overview and Case Study

    ERIC Educational Resources Information Center

    Goodman, Ira S.

    2011-01-01

    With some NIH pay lines running at or below the 10th percentile, and funding becoming scarce for large science grants, new approaches are necessary to secure large interdisciplinary grant awards. The UCSD Moores Cancer Center has developed a team approach, starting with the identification of a competitive opportunity and progressing to the…

  11. Applications of Magnetic Suspension Technology to Large Scale Facilities: Progress, Problems and Promises

    NASA Technical Reports Server (NTRS)

    Britcher, Colin P.

    1997-01-01

    This paper will briefly review previous work in wind tunnel Magnetic Suspension and Balance Systems (MSBS) and will examine the handful of systems around the world currently known to be in operational condition or undergoing recommissioning. Technical developments emerging from research programs at NASA and elsewhere will be reviewed briefly, where there is potential impact on large-scale MSBSs. The likely aerodynamic applications for large MSBSs will be addressed, since these applications should properly drive system designs. A recently proposed application to ultra-high Reynolds number testing will then be addressed in some detail. Finally, some opinions on the technical feasibility and usefulness of a large MSBS will be given.

  12. A High-Resolution WRF Tropical Channel Simulation Driven by a Global Reanalysis

    NASA Astrophysics Data System (ADS)

    Holland, G.; Leung, L.; Kuo, Y.; Hurrell, J.

    2006-12-01

    Since 2003, NCAR has invested in the development and application of the Nested Regional Climate Model (NRCM), based on the Weather Research and Forecasting (WRF) model and the Community Climate System Model, as a key component of the Prediction Across Scales Initiative. A prototype tropical channel model has been developed to investigate scale interactions and the influence of tropical convection on the large-scale circulation and tropical modes. The model was developed from the NCAR Weather Research and Forecasting Model (WRF), configured as a tropical channel between 30°S and 45°N, wide enough to allow teleconnection effects over the mid-latitudes. Compared to the limited-area domains over which WRF is typically applied, the channel mode alleviates issues with reflection of tropical modes that could result from imposing east/west boundaries. Using a large amount of available computing resources on a supercomputer (Blue Vista) during its bedding-in period, a simulation has been completed with the tropical channel applied at 36 km horizontal resolution for 5 years from 1996 to 2000, with the large-scale circulation provided by the NCEP/NCAR global reanalysis at the north/south boundaries. Shorter simulations of 2 years and 6 months have also been performed to include two-way nests at 12 km and 4 km resolution, respectively, over the western Pacific warm pool, to explicitly resolve tropical convection in the Maritime Continent. The simulations realistically captured the large-scale circulation, including the trade winds over the tropical Pacific and Atlantic, the Australian and Asian monsoon circulations, and hurricane statistics. Preliminary analysis and evaluation of the simulations will be presented.

  13. The Challenges of Evaluating Large-Scale, Multi-Partner Programmes: The Case of NIHR CLAHRCs

    ERIC Educational Resources Information Center

    Martin, Graham P.; Ward, Vicky; Hendy, Jane; Rowley, Emma; Nancarrow, Susan; Heaton, Janet; Britten, Nicky; Fielden, Sandra; Ariss, Steven

    2011-01-01

    The limited extent to which research evidence is utilised in healthcare and other public services is widely acknowledged. The United Kingdom government has attempted to address this gap by funding nine Collaborations for Leadership in Applied Health Research and Care (CLAHRCs). CLAHRCs aim to carry out health research, implement research findings…

  14. Designing, Building, and Connecting Networks to Support Distributed Collaborative Empirical Writing Research

    ERIC Educational Resources Information Center

    Brunk-Chavez, Beth; Pigg, Stacey; Moore, Jessie; Rosinski, Paula; Grabill, Jeffrey T.

    2018-01-01

    To speak to diverse audiences about how people learn to write and how writing works inside and outside the academy, we must conduct research across geographical, institutional, and cultural contexts as well as research that enables comparison when appropriate. Large-scale empirical research is useful for both of these moves; however, we must…

  15. Improving the representation of clouds, radiation, and precipitation using spectral nudging in the Weather Research and Forecasting model

    EPA Science Inventory

    Spectral nudging – a scale-selective interior constraint technique – is commonly used in regional climate models to maintain consistency with large-scale forcing while permitting mesoscale features to develop in the downscaled simulations. Several studies have demonst...

  16. Finite difference and Runge-Kutta methods for solving vibration problems

    NASA Astrophysics Data System (ADS)

    Lintang Renganis Radityani, Scolastika; Mungkasi, Sudi

    2017-11-01

    The vibration of a storey building can be modelled as a system of second order ordinary differential equations. If the number of floors of the building is large, then the result is a large-scale system of second order ordinary differential equations. The large-scale system is difficult to solve, and if it can be solved, the solution may not be accurate. Therefore, in this paper, we seek accurate methods for solving vibration problems. We compare the performance of numerical finite difference and Runge-Kutta methods for solving large-scale systems of second order ordinary differential equations. The finite difference methods include the forward and central differences. The Runge-Kutta methods include the Euler and Heun methods. Our research results show that the central finite difference and the Heun methods produce more accurate solutions than the forward finite difference and the Euler methods do.
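    As a minimal illustration of the comparison described above, the sketch below integrates a single undamped oscillator x'' = -omega^2 x (a stand-in for one mode of the storey-building system, not the paper's model) with the Euler and Heun updates and compares the errors.

```python
import numpy as np

def simulate(method: str, omega: float = 2.0, dt: float = 0.01, t_end: float = 10.0) -> float:
    """Integrate x'' = -omega**2 * x as the first-order system y = (x, v)
    with forward Euler or Heun (improved Euler); returns x(t_end)."""
    def f(y):
        x, v = y
        return np.array([v, -omega ** 2 * x])

    y = np.array([1.0, 0.0])            # initial displacement 1, velocity 0
    for _ in range(int(t_end / dt)):
        if method == "euler":
            y = y + dt * f(y)
        else:                           # Heun: predictor followed by trapezoidal corrector
            k1 = f(y)
            k2 = f(y + dt * k1)
            y = y + dt * (k1 + k2) / 2.0
    return y[0]

exact = np.cos(2.0 * 10.0)              # analytic solution x(t) = cos(omega * t)
print("Euler error:", abs(simulate("euler") - exact))
print("Heun  error:", abs(simulate("heun") - exact))
```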

  17. Upscaling of U (VI) desorption and transport from decimeter‐scale heterogeneity to plume‐scale modeling

    USGS Publications Warehouse

    Curtis, Gary P.; Kohler, Matthias; Kannappan, Ramakrishnan; Briggs, Martin A.; Day-Lewis, Frederick D.

    2015-01-01

    Scientifically defensible predictions of field-scale U(VI) transport in groundwater require an understanding of key processes at multiple scales. These scales range from smaller than the sediment grain scale (less than 10 μm) to as large as the field scale, which can extend over several kilometers. The key processes that need to be considered include both geochemical reactions in solution and at sediment surfaces as well as physical transport processes including advection, dispersion, and pore-scale diffusion. The research summarized in this report includes both experimental and modeling results in batch, column and tracer tests. The objectives of this research were to: (1) quantify the rates of U(VI) desorption from sediments acquired from a uranium-contaminated aquifer in batch experiments; (2) quantify rates of U(VI) desorption in column experiments with variable chemical conditions; and (3) quantify nonreactive tracer and U(VI) transport in field tests.

  18. Process for Low Cost Domestic Production of LIB Cathode Materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thurston, Anthony

    The objective of the research was to determine the best low-cost method for the large-scale production of the Nickel-Cobalt-Manganese (NCM) layered cathode materials. The research and development focused on scaling up the licensed technology from Argonne National Laboratory in BASF’s battery material pilot plant in Beachwood, Ohio. Since BASF did not have experience with the large-scale production of the NCM cathode materials, there was a significant amount of development needed to support BASF’s already existing research program. During the three-year period BASF was able to develop and validate production processes for the NCM 111, 523 and 424 materials as well as begin development of the High Energy NCM. BASF also used this time period to provide free cathode material samples to numerous manufacturers, OEMs and research companies in order to validate the materials. The success of the project can be demonstrated by the construction of the production plant in Elyria, Ohio and the successful operation of that facility. The benefit of the project to the public will begin to be apparent as soon as material from the production plant is being used in electric vehicles.

  19. An examination of three sets of MMPI-2 personality disorder scales.

    PubMed

    Jones, Alvin

    2005-08-01

    Three sets of personality disorder scales (PD scales) can be scored for the MMPI-2 (Butcher, Dahlstrom, Graham, Tellegen, & Kaemmer, 1989). Two sets (Levitt & Gotts, 1995; Morey, Waugh, & Blashfield, 1985) are derived from the MMPI (Hathaway & McKinley, 1983), and a third set (Somwaru & Ben-Porath, 1995) is based on the MMPI-2. There is no validity research for the Levitt and Gotts scales, and only limited validity research is available for the Somwaru and Ben-Porath scales. There is a large body of research suggesting that the Morey et al. scales have good to excellent convergent validity when compared to a variety of other measures of personality disorders. Since the Morey et al. scales have established validity, there is a question of whether additional sets of PD scales are needed. The primary purpose of this research was to determine whether the PD scales developed by Levitt and Gotts and those developed by Somwaru and Ben-Porath contribute incrementally to the scales developed by Morey et al. in predicting corresponding scales on the MCMI-II (Millon, 1987). In a sample of 494 individuals evaluated at an Army medical center, a hierarchical regression analysis demonstrated that the Somwaru and Ben-Porath Borderline, Antisocial, and Schizoid PD scales and the Levitt and Gotts Narcissistic and Histrionic scales contributed significantly and meaningfully to the Morey et al. scales in predicting the corresponding MCMI-II (Millon, 1987) scale. However, only the Somwaru and Ben-Porath scales demonstrated acceptable internal consistency and convergent validity.
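    The hierarchical (incremental) regression logic described above can be sketched as two nested OLS models whose explained variance is compared; the data and variable names below are simulated placeholders, not the study's data.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 494  # sample size reported in the abstract

# Simulated scores: an established PD scale, a candidate additional scale, and a criterion.
established = rng.normal(size=n)
candidate = 0.5 * established + rng.normal(size=n)            # partially overlapping new scale
criterion = 0.6 * established + 0.3 * candidate + rng.normal(size=n)

# Step 1: criterion regressed on the established scale only.
m1 = sm.OLS(criterion, sm.add_constant(established)).fit()
# Step 2: add the candidate scale and test the increment in explained variance.
m2 = sm.OLS(criterion, sm.add_constant(np.column_stack([established, candidate]))).fit()

print(f"R2 step 1 = {m1.rsquared:.3f}, R2 step 2 = {m2.rsquared:.3f}, "
      f"increment = {m2.rsquared - m1.rsquared:.3f}")
print(m2.compare_f_test(m1))  # F-test of the R2 increment: (F, p-value, df difference)
```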

  20. Status of DSMT research program

    NASA Technical Reports Server (NTRS)

    Mcgowan, Paul E.; Javeed, Mehzad; Edighoffer, Harold H.

    1991-01-01

    The status of the Dynamic Scale Model Technology (DSMT) research program is presented. DSMT is developing scale model technology for large space structures as part of the Control Structure Interaction (CSI) program at NASA Langley Research Center (LaRC). Under DSMT a hybrid-scale structural dynamics model of Space Station Freedom was developed. Space Station Freedom was selected as the focus structure for DSMT since the station represents the first opportunity to obtain flight data on a complex, three-dimensional space structure. An overview of DSMT is included, covering the development of the space station scale model and the resulting hardware. Scaling technology was developed for this model to achieve a ground test article that existing test facilities can accommodate while employing realistically scaled hardware. The model was designed and fabricated by the Lockheed Missiles and Space Co. and is assembled at LaRC for dynamic testing. Also, results from ground tests and analyses of the various model components are presented along with plans for future subassembly and mated model tests. Finally, utilization of the scale model for enhancing analysis verification of the full-scale space station is also considered.

  1. BactoGeNIE: A large-scale comparative genome visualization for big displays

    DOE PAGES

    Aurisano, Jillian; Reda, Khairi; Johnson, Andrew; ...

    2015-08-13

    The volume of complete bacterial genome sequence data available to comparative genomics researchers is rapidly increasing. However, visualizations in comparative genomics--which aim to enable analysis tasks across collections of genomes--suffer from visual scalability issues. While large, multi-tiled and high-resolution displays have the potential to address scalability issues, new approaches are needed to take advantage of such environments, in order to enable the effective visual analysis of large genomics datasets. In this paper, we present Bacterial Gene Neighborhood Investigation Environment, or BactoGeNIE, a novel and visually scalable design for comparative gene neighborhood analysis on large display environments. We evaluate BactoGeNIE through a case study on close to 700 draft Escherichia coli genomes, and present lessons learned from our design process. In conclusion, BactoGeNIE accommodates comparative tasks over substantially larger collections of neighborhoods than existing tools and explicitly addresses visual scalability. Given current trends in data generation, scalable designs of this type may inform visualization design for large-scale comparative research problems in genomics.

  2. BactoGeNIE: a large-scale comparative genome visualization for big displays

    PubMed Central

    2015-01-01

    Background The volume of complete bacterial genome sequence data available to comparative genomics researchers is rapidly increasing. However, visualizations in comparative genomics--which aim to enable analysis tasks across collections of genomes--suffer from visual scalability issues. While large, multi-tiled and high-resolution displays have the potential to address scalability issues, new approaches are needed to take advantage of such environments, in order to enable the effective visual analysis of large genomics datasets. Results In this paper, we present Bacterial Gene Neighborhood Investigation Environment, or BactoGeNIE, a novel and visually scalable design for comparative gene neighborhood analysis on large display environments. We evaluate BactoGeNIE through a case study on close to 700 draft Escherichia coli genomes, and present lessons learned from our design process. Conclusions BactoGeNIE accommodates comparative tasks over substantially larger collections of neighborhoods than existing tools and explicitly addresses visual scalability. Given current trends in data generation, scalable designs of this type may inform visualization design for large-scale comparative research problems in genomics. PMID:26329021

  3. BactoGeNIE: A large-scale comparative genome visualization for big displays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aurisano, Jillian; Reda, Khairi; Johnson, Andrew

    The volume of complete bacterial genome sequence data available to comparative genomics researchers is rapidly increasing. However, visualizations in comparative genomics--which aim to enable analysis tasks across collections of genomes--suffer from visual scalability issues. While large, multi-tiled and high-resolution displays have the potential to address scalability issues, new approaches are needed to take advantage of such environments, in order to enable the effective visual analysis of large genomics datasets. In this paper, we present Bacterial Gene Neighborhood Investigation Environment, or BactoGeNIE, a novel and visually scalable design for comparative gene neighborhood analysis on large display environments. We evaluate BactoGeNIE through a case study on close to 700 draft Escherichia coli genomes, and present lessons learned from our design process. In conclusion, BactoGeNIE accommodates comparative tasks over substantially larger collections of neighborhoods than existing tools and explicitly addresses visual scalability. Given current trends in data generation, scalable designs of this type may inform visualization design for large-scale comparative research problems in genomics.

  4. Development and Initial Testing of the Tiltrotor Test Rig

    NASA Technical Reports Server (NTRS)

    Acree, C. W., Jr.; Sheikman, A. L.

    2018-01-01

    The NASA Tiltrotor Test Rig (TTR) is a new, large-scale proprotor test system developed jointly with the U.S. Army and Air Force for the National Full-Scale Aerodynamics Complex (NFAC). The TTR is designed to test advanced proprotors up to 26 feet in diameter at speeds up to 300 knots, and even larger rotors at lower airspeeds. This combination of size and speed is unprecedented and is necessary for research into 21st-century tiltrotors and other advanced rotorcraft concepts. The TTR will provide critical data for validation of state-of-the-art design and analysis tools.

  5. Large-scale Advanced Prop-fan (LAP) technology assessment report

    NASA Technical Reports Server (NTRS)

    Degeorge, C. L.

    1988-01-01

    The technologically significant findings and accomplishments of the Large Scale Advanced Prop-Fan (LAP) program in the areas of aerodynamics, aeroelasticity, acoustics, and materials and fabrication are described. The extent to which the program goals related to these disciplines were achieved is discussed, and recommendations for additional research are presented. The LAP program consisted of the design, manufacture and testing of a near full-scale Prop-Fan, or advanced turboprop, capable of operating efficiently at speeds up to Mach 0.8. An aeroelastically scaled model of the LAP was also designed and fabricated. The goal of the program was to acquire data on Prop-Fan performance that would indicate the technology readiness of Prop-Fans for practical applications in commercial and military aviation.

  6. Establishing the Canadian HIV Women's Sexual and Reproductive Health Cohort Study (CHIWOS): Operationalizing Community-based Research in a Large National Quantitative Study.

    PubMed

    Loutfy, Mona; Greene, Saara; Kennedy, V Logan; Lewis, Johanna; Thomas-Pavanel, Jamie; Conway, Tracey; de Pokomandy, Alexandra; O'Brien, Nadia; Carter, Allison; Tharao, Wangari; Nicholson, Valerie; Beaver, Kerrigan; Dubuc, Danièle; Gahagan, Jacqueline; Proulx-Boucher, Karène; Hogg, Robert S; Kaida, Angela

    2016-08-19

    Community-based research has gained increasing recognition in health research over the last two decades. Such participatory research approaches are lauded for their ability to anchor research in lived experiences, ensuring cultural appropriateness, accessing local knowledge, reaching marginalized communities, building capacity, and facilitating research-to-action. While having these positive attributes, the community-based health research literature is predominantly composed of small projects, using qualitative methods, and set within geographically limited communities. Its use in larger health studies, including clinical trials and cohorts, is limited. We present the Canadian HIV Women's Sexual and Reproductive Health Cohort Study (CHIWOS), a large-scale, multi-site, national, longitudinal quantitative study that has operationalized community-based research in all steps of the research process. Successes, challenges and further considerations are offered. Through the integration of community-based research principles, we have been successful in: facilitating a two-year long formative phase for this study; developing a novel survey instrument with national involvement; training 39 Peer Research Associates (PRAs); offering ongoing comprehensive support to PRAs; and engaging in an ongoing iterative community-based research process. Our community-based research approach within CHIWOS demanded that we be cognizant of challenges managing a large national team, inherent power imbalances and challenges with communication, compensation and volunteering considerations, and extensive delays in institutional processes. It is important to consider the iterative nature of community-based research and to work through tensions that emerge given the diverse perspectives of numerous team members. Community-based research, as an approach to large-scale quantitative health research projects, is an increasingly viable methodological option. Community-based research has several advantages that go hand-in-hand with its obstacles. We offer guidance on implementing this approach, such that the process can be better planned and result in success.

  7. Low frequency steady-state brain responses modulate large scale functional networks in a frequency-specific means.

    PubMed

    Wang, Yi-Feng; Long, Zhiliang; Cui, Qian; Liu, Feng; Jing, Xiu-Juan; Chen, Heng; Guo, Xiao-Nan; Yan, Jin H; Chen, Hua-Fu

    2016-01-01

    Neural oscillations are essential for brain functions. Research has suggested that the frequency of neural oscillations is lower for more integrative and remote communications. In this vein, some resting-state studies have suggested that large scale networks function in the very low frequency range (<1 Hz). However, it is difficult to determine the frequency characteristics of brain networks because both resting-state studies and conventional frequency tagging approaches cannot simultaneously capture multiple large scale networks in controllable cognitive activities. In this preliminary study, we aimed to examine whether large scale networks can be modulated by task-induced low frequency steady-state brain responses (lfSSBRs) in a frequency-specific pattern. In a revised attention network test, lfSSBRs were evoked in the triple network system and the sensory-motor system, indicating that large scale networks can be modulated in a frequency-tagging manner. Furthermore, the inter- and intranetwork synchronizations as well as coherence were increased at the fundamental frequency and the first harmonic rather than at other frequency bands, indicating a frequency-specific modulation of information communication. However, there was no difference among attention conditions, indicating that lfSSBRs modulate the general attention state much more strongly than they distinguish attention conditions. This study provides insights into the advantage and mechanism of lfSSBRs. More importantly, it paves a new way to investigate frequency-specific large scale brain activities. © 2015 Wiley Periodicals, Inc.

  8. Are all "research fields" equal? Rethinking practice for the use of data from crowdsourcing market places.

    PubMed

    Gleibs, Ilka H

    2017-08-01

    New technologies like large-scale social media sites (e.g., Facebook and Twitter) and crowdsourcing services (e.g., Amazon Mechanical Turk, Crowdflower, Clickworker) are impacting social science research and providing many new and interesting avenues for research. The use of these new technologies for research has not been without challenges, and a recently published psychological study on Facebook has led to a widespread discussion of the ethics of conducting large-scale experiments online. Surprisingly little has been said about the ethics of conducting research using commercial crowdsourcing marketplaces. In this article, I focus on the question of which ethical questions are raised by data collection with crowdsourcing tools. I briefly draw on the implications of Internet research more generally, and then focus on the specific challenges that research with crowdsourcing tools faces. I identify fair pay and the related issue of respect for autonomy, as well as problems with the power dynamic between researcher and participant, which has implications for withdrawal without prejudice, as the major ethical challenges of crowdsourced data. Furthermore, I wish to draw attention to how we can develop a "best practice" for researchers using crowdsourcing tools.

  9. The use of data from national and other large-scale user experience surveys in local quality work: a systematic review.

    PubMed

    Haugum, Mona; Danielsen, Kirsten; Iversen, Hilde Hestad; Bjertnaes, Oyvind

    2014-12-01

    An important goal for national and large-scale surveys of user experiences is quality improvement. However, large-scale surveys are normally conducted by a professional external surveyor, creating an institutionalized division between the measurement of user experiences and the quality work that is performed locally. The aim of this study was to identify and describe scientific studies related to the use of national and large-scale surveys of user experiences in local quality work. Ovid EMBASE, Ovid MEDLINE, Ovid PsycINFO and the Cochrane Database of Systematic Reviews were searched for scientific publications about user experiences and satisfaction that address the extent to which data from national and other large-scale user experience surveys are used for local quality work in the health services. Themes of interest were identified and a narrative analysis was undertaken. Thirteen publications were included, all of which differed substantially in several characteristics. The results show that large-scale surveys of user experiences are used in local quality work. The types of follow-up activity varied considerably, from conducting a follow-up analysis of user experience survey data to information sharing and more systematic efforts to use the data as a basis for improving the quality of care. This review shows that large-scale surveys of user experiences are used in local quality work. However, there is a need for more, better and standardized research in this field. The considerable variation in follow-up activities points to the need for systematic guidance on how to use data in local quality work. © The Author 2014. Published by Oxford University Press in association with the International Society for Quality in Health Care; all rights reserved.

  10. Large-scale quarantine following biological terrorism in the United States: scientific examination, logistic and legal limits, and possible consequences.

    PubMed

    Barbera, J; Macintyre, A; Gostin, L; Inglesby, T; O'Toole, T; DeAtley, C; Tonat, K; Layton, M

    2001-12-05

Concern for potential bioterrorist attacks causing mass casualties has increased recently. Particular attention has been paid to scenarios in which a biological agent capable of person-to-person transmission, such as smallpox, is intentionally released among civilians. Multiple public health interventions are possible to effect disease containment in this context. One disease control measure that has been regularly proposed in various settings is the imposition of large-scale or geographic quarantine on the potentially exposed population. Although large-scale quarantine has not been implemented in recent US history, it has been used on a small scale in biological hoaxes, and it has been invoked in federally sponsored bioterrorism exercises. This article reviews the scientific principles that are relevant to the likely effectiveness of quarantine, the logistic barriers to its implementation, legal issues that a large-scale quarantine raises, and possible adverse consequences that might result from quarantine action. Imposition of large-scale quarantine (the compulsory sequestration of groups of possibly exposed persons, or the confinement of people within certain geographic areas to prevent the spread of contagious disease) should not be considered a primary public health strategy in most imaginable circumstances. In the majority of contexts, other less extreme public health actions are likely to be more effective and create fewer unintended adverse consequences than quarantine. Actions and areas for future research, policy development, and response planning efforts are provided.

  11. Analysis of the electricity demand of Greece for optimal planning of a large-scale hybrid renewable energy system

    NASA Astrophysics Data System (ADS)

    Tyralis, Hristos; Karakatsanis, Georgios; Tzouka, Katerina; Mamassis, Nikos

    2015-04-01

The Greek electricity system is examined for the period 2002-2014. The demand load data are analysed at various time scales (hourly, daily, seasonal and annual) and are related to the mean daily temperature and the gross domestic product (GDP) of Greece over the same period. The energy demand prediction issued by the Greek Independent Power Transmission Operator is also compared with the observed demand load. The analysis reveals a change in the structure of electricity demand after 2010, which is related to the decrease in GDP during 2010-2014. The results of the analysis will be used in the development of an energy forecasting system which will form part of a framework for optimal planning of a large-scale hybrid renewable energy system in which hydropower plays the dominant role. Acknowledgement: This research was funded by the Greek General Secretariat for Research and Technology through the research project Combined REnewable Systems for Sustainable ENergy DevelOpment (CRESSENDO; grant number 5145)
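
    The abstract above describes aggregating hourly demand data to several time scales and relating daily demand to mean daily temperature. A minimal pandas sketch of that kind of workflow is given below; the file and column names ("greek_load.csv", "load_mw", "temp_c") are placeholders, not the study's actual data.

        # Resample an hourly load series to daily, monthly and annual scales and
        # relate daily demand to mean daily temperature (illustrative only).
        import pandas as pd

        hourly = pd.read_csv("greek_load.csv", index_col="timestamp", parse_dates=True)

        daily = hourly["load_mw"].resample("D").mean()     # daily mean demand
        monthly = hourly["load_mw"].resample("MS").mean()  # seasonal structure
        annual = hourly["load_mw"].resample("YS").sum()    # annual energy (MWh for hourly MW readings)

        temp_daily = hourly["temp_c"].resample("D").mean()
        print("daily load vs. temperature r =", round(daily.corr(temp_daily), 2))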

  12. UPDATE ON THE MARINA STUDY ON LAKE TEXOMA

    EPA Science Inventory

    The National Risk Management Research Laboratory (NRMRL) has instituted a program for Risk Management Research for Ecosystem Restoration in Watersheds. As part of this program a large scale project was initiated on Lake Texoma and the surrounding watershed to evaluate the assimi...

  13. Pioneering University/Industry Venture Explores VLSI Frontiers.

    ERIC Educational Resources Information Center

    Davis, Dwight B.

    1983-01-01

    Discusses industry-sponsored programs in semiconductor research, focusing on Stanford University's Center for Integrated Systems (CIS). CIS, while pursuing research in semiconductor very-large-scale integration, is merging the fields of computer science, information science, and physical science. Issues related to these university/industry…

  14. US National Large-scale City Orthoimage Standard Initiative

    USGS Publications Warehouse

    Zhou, G.; Song, C.; Benjamin, S.; Schickler, W.

    2003-01-01

The early procedures and algorithms for national digital orthophoto generation in the National Digital Orthophoto Program (NDOP) were based on earlier USGS mapping operations, such as field control, aerotriangulation (derived in the early 1920's), the quarter-quadrangle-centered format (3.75 minutes of longitude and latitude in geographic extent), 1:40,000 aerial photographs, and 2.5D digital elevation models. However, large-scale city orthophotos produced with these early procedures have disclosed many shortcomings, e.g., ghost images, occlusion and shadow. Thus, providing the technical base (algorithms, procedures) and experience needed for large-scale city digital orthophoto creation is essential for the forthcoming national deployment of large-scale digital orthophotos and for the revision of the Standards for National Large-scale City Digital Orthophoto in the NDOP. This paper reports our initial research results as follows: (1) high-precision 3D city DSM generation through LIDAR data processing, (2) spatial object/feature extraction using surface material information and high-accuracy 3D DSM data, (3) 3D city model development, (4) algorithm development for generation of DTM-based and DBM-based orthophotos, (5) true orthophoto generation by merging the DBM-based and DTM-based orthophotos, and (6) automatic mosaicking by optimizing and combining imagery from many perspectives.
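
    Step (5) above merges a DBM-based orthophoto with a DTM-based orthophoto to obtain a true orthophoto. The sketch below shows the mask-based merge in NumPy; the array files and the building-footprint mask are assumptions for illustration, not the authors' implementation.

        # Merge building-rectified (DBM) and terrain-rectified (DTM) orthophotos:
        # take building pixels from the DBM product, everything else from the DTM product.
        import numpy as np

        dtm_ortho = np.load("dtm_ortho.npy")                        # terrain-rectified image (rows x cols)
        dbm_ortho = np.load("dbm_ortho.npy")                        # building-rectified image (rows x cols)
        building_mask = np.load("building_mask.npy").astype(bool)   # True over building footprints

        true_ortho = np.where(building_mask, dbm_ortho, dtm_ortho)
        np.save("true_ortho.npy", true_ortho)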

  15. The Untapped Promise of Secondary Data Sets in International and Comparative Education Policy Research

    ERIC Educational Resources Information Center

    Chudagr, Amita; Luschei, Thomas F.

    2016-01-01

    The objective of this commentary is to call attention to the feasibility and importance of large-scale, systematic, quantitative analysis in international and comparative education research. We contend that although many existing databases are under- or unutilized in quantitative international-comparative research, these resources present the…

  16. Doing Disability Research in a Southern Context: Challenges and Possibilities

    ERIC Educational Resources Information Center

    Singal, Nidhi

    2010-01-01

    Research on disability issues in countries of the South is primarily dominated by a focus on generating large scale quantitative data sets. This paper discusses the many challenges, opportunities and dilemmas faced in designing and undertaking a qualitative research study in one district in India. The Disability, Education and Poverty Project…

  17. Research Translation Strategies to Improve the Readability of Workplace Health Promotion Resources

    ERIC Educational Resources Information Center

    Wallace, Alison; Joss, Nerida

    2016-01-01

    Without deliberate and resourced translation, research evidence is unlikely to inform policy and practice. This paper describes the processes and practical solutions used to translate evaluation research findings to improve the readability of print materials in a large scale worksite health programme. It is argued that a knowledge brokering and…

  18. Early Undergraduate Research Experiences Lead to Similar Learning Gains for STEM and Non-STEM Undergraduates

    ERIC Educational Resources Information Center

    Stanford, Jennifer S.; Rocheleau, Suzanne E.; Smith, Kevin P. W.; Mohan, Jaya

    2017-01-01

    Undergraduate research is touted as a high-impact educational practice yielding important benefits such as increased retention and notable learning gains. Large-scale studies describing benefits of mentored research programs have focused primarily on outcomes for science, technology, engineering and mathematics (STEM) undergraduates. The Students…

  19. Ecological research at the Goosenest Adaptive Management Area in northeastern California

    Treesearch

    Martin W. Ritchie

    2005-01-01

    This paper describes the establishment of an interdisciplinary, large-scale ecological research project on the Goosenest Adaptive Management Area of the Klamath National Forest in northeastern California. This project is a companion to the Blacks Mountain Ecological Research Project described by Oliver (2000). The genesis for this project was the Northwest...

  20. "Baby-Cam" and Researching with Infants: Viewer, Image and (Not) Knowing

    ERIC Educational Resources Information Center

    Elwick, Sheena

    2015-01-01

    This article offers a methodological reflection on how "baby-cam" enhanced ethically reflective attitudes in a large-scale research project that set out to research with infants in Australian early childhood education and care settings. By juxtaposing digital images produced by two different digital-camera technologies and drawing on…

  1. Using National Education Longitudinal Data Sets in School Counseling Research

    ERIC Educational Resources Information Center

    Bryan, Julia A.; Day-Vines, Norma L.; Holcomb-McCoy, Cheryl; Moore-Thomas, Cheryl

    2010-01-01

    National longitudinal databases hold much promise for school counseling researchers. Several of the more frequently used data sets, possible professional implications, and strategies for acquiring training in the use of large-scale national data sets are described. A 6-step process for conducting research with the data sets is explicated:…

  2. "Scientifically-Based Research": The Art of Politics and the Distortion of Science

    ERIC Educational Resources Information Center

    Shaker, Paul; Ruitenberg, Claudia

    2007-01-01

    The US Federal Government is forcefully prescribing a narrow definition of "scientifically-based" educational research. US policy, emerging from contemporary neoliberal and technocratic viewpoints and funded and propagated on a large scale, has the potential to influence international thinking on educational research. In this article we continue a…

  3. Cross-cultural validation of the Work Values Scale EVAT using multi-group confirmatory factor analysis and confirmatory multidimensional scaling.

    PubMed

    Arciniega, Luis M; González, Luis; Soares, Vítor; Ciulli, Stefania; Giannini, Marco

    2009-11-01

The Work Values Scale EVAT (based on its initials in Spanish: Escala de Valores hacia el Trabajo) was created in 2000 to measure values in the work context. The instrument operationalizes the four higher-order values of Schwartz's (1992) theory through sixteen items focused on work scenarios. The questionnaire has been used with large samples of Mexican and Spanish individuals, showing adequate psychometric properties. The instrument has recently been translated into Portuguese and Italian, and subsequently used in a large-scale study of nurses in Portugal and in a sample drawn from various occupations in Italy. The purpose of this research was to demonstrate the cross-cultural validity of the Work Values Scale EVAT in Spanish, Portuguese, and Italian. Our results suggest that the original Spanish version of the EVAT scale and the new Portuguese and Italian versions are equivalent.

  4. Parallel Clustering Algorithm for Large-Scale Biological Data Sets

    PubMed Central

    Wang, Minchao; Zhang, Wu; Ding, Wang; Dai, Dongbo; Zhang, Huiran; Xie, Hao; Chen, Luonan; Guo, Yike; Xie, Jiang

    2014-01-01

Background: The recent explosion of biological data brings a great challenge for traditional clustering algorithms. With the increasing scale of data sets, much larger memory and longer runtime are required for cluster identification problems. The affinity propagation algorithm outperforms many other classical clustering algorithms and is widely applied in biological research. However, its time and space complexity become a great bottleneck when handling large-scale data sets. Moreover, the similarity matrix, whose construction takes a long runtime, is required before running the affinity propagation algorithm, since the algorithm clusters data sets based on the similarities between data pairs. Methods: Two types of parallel architectures are proposed in this paper to accelerate the similarity matrix construction and the affinity propagation algorithm. A shared-memory architecture is used to construct the similarity matrix, and a distributed system is used for the affinity propagation algorithm, because of its large memory size and great computing capacity. An appropriate scheme of data partition and reduction is designed in our method in order to minimize the global communication cost among processes. Results: A speedup of 100 is gained with 128 cores. The runtime is reduced from several hours to a few seconds, which indicates that the parallel algorithm is capable of handling large-scale data sets effectively. The parallel affinity propagation also achieves a good performance when clustering large-scale gene (microarray) data and detecting families in large protein superfamilies. PMID:24705246
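
    As a rough single-machine analogue of the pipeline described above, the sketch below builds the similarity matrix in parallel with joblib and then runs affinity propagation on the precomputed similarities via scikit-learn; the paper's MPI-based distributed implementation and biological data sets are not reproduced here.

        # Parallel similarity-matrix construction followed by affinity propagation
        # on precomputed similarities (negative squared Euclidean distances).
        import numpy as np
        from joblib import Parallel, delayed
        from sklearn.cluster import AffinityPropagation

        rng = np.random.default_rng(0)
        X = rng.normal(size=(500, 20))            # stand-in for 500 expression profiles

        def similarity_row(i):
            # One row of the similarity matrix: -||x_i - x_j||^2 for all j.
            return -((X - X[i]) ** 2).sum(axis=1)

        S = np.array(Parallel(n_jobs=-1)(delayed(similarity_row)(i) for i in range(len(X))))

        labels = AffinityPropagation(affinity="precomputed", random_state=0).fit_predict(S)
        print("clusters found:", len(set(labels)))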

  5. Los Alamos Discovers Super Efficient Solar Using Perovskite Crystals

    ScienceCinema

    Mohite, Aditya; Nie, Wanyi

    2018-05-11

    State-of-the-art photovoltaics using high-purity, large-area, wafer-scale single-crystalline semiconductors grown by sophisticated, high temperature crystal-growth processes offer promising routes for developing low-cost, solar-based clean global energy solutions for the future. Solar cells composed of the recently discovered material organic-inorganic perovskites offer the efficiency of silicon, yet suffer from a variety of deficiencies limiting the commercial viability of perovskite photovoltaic technology. In research to appear in Science, Los Alamos National Laboratory researchers reveal a new solution-based hot-casting technique that eliminates these limitations, one that allows for the growth of high-quality, large-area, millimeter-scale perovskite crystals and demonstrates that highly efficient and reproducible solar cells with reduced trap assisted recombination can be realized.

  6. How to Establish and Follow up a Large Prospective Cohort Study in the 21st Century--Lessons from UK COSMOS.

    PubMed

    Toledano, Mireille B; Smith, Rachel B; Brook, James P; Douglass, Margaret; Elliott, Paul

    2015-01-01

Large-scale prospective cohort studies are invaluable in epidemiology, but they are increasingly difficult and costly to establish and follow up. More efficient methods for recruitment, data collection and follow-up are essential if such studies are to remain feasible with limited public and research funds. Here, we discuss how these challenges were addressed in the UK COSMOS cohort study, where a fixed budget and limited time frame necessitated new approaches to consent and recruitment between 2009 and 2012. Web-based e-consent and data collection should be considered in large-scale observational studies, as they offer a streamlined experience which benefits both participants and researchers, and they save costs. Commercial providers of register and marketing data, smartphones, apps, email, social media, and the internet offer innovative possibilities for identifying, recruiting and following up cohorts. Using examples from UK COSMOS, this article sets out the dos and don'ts for today's cohort studies and provides a guide on how best to take advantage of new technologies and innovative methods to simplify logistics and minimise costs. Thus a more streamlined experience, to the benefit of both research participants and researchers, becomes achievable.

  8. A Web-based Distributed Voluntary Computing Platform for Large Scale Hydrological Computations

    NASA Astrophysics Data System (ADS)

    Demir, I.; Agliamzanov, R.

    2014-12-01

Distributed volunteer computing can enable researchers and scientists to form large parallel computing environments that harness the computing power of the millions of computers on the Internet, and use them to run large-scale environmental simulations and models that serve the common good of local communities and the world. Recent developments in web technologies and standards allow client-side scripting languages to run at speeds close to those of native applications and to utilize the power of Graphics Processing Units (GPUs). Using a client-side scripting language such as JavaScript, we have developed an open distributed computing framework that makes it easy for researchers to write their own hydrologic models and run them on volunteer computers. Website owners can easily enable their sites so that visitors can volunteer their computing resources to help run advanced hydrological models and simulations. Using a web-based system allows users to start volunteering their computational resources within seconds, without installing any software. The framework partitions the model simulation into small spatial and computational units and distributes them to thousands of nodes. A relational database system is utilized for managing data connections and queue management for the distributed computing nodes. In this paper, we present a web-based distributed volunteer computing platform that enables large-scale hydrological simulations and model runs in an open and integrated environment.
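
    The queue management mentioned above (small spatial work units tracked in a relational database) might look roughly like the server-side sketch below; the schema and function names are assumptions, not the platform's actual code, and the browser-side JavaScript workers are omitted.

        # Relational work-unit queue: partition the domain into chunks, let volunteer
        # nodes claim pending chunks, and record the results they return.
        import sqlite3

        db = sqlite3.connect("workqueue.db")
        db.execute("""CREATE TABLE IF NOT EXISTS tasks (
            id INTEGER PRIMARY KEY, cell TEXT, status TEXT DEFAULT 'pending', result REAL)""")
        db.executemany("INSERT INTO tasks (cell) VALUES (?)",
                       [(f"cell_{r}_{c}",) for r in range(10) for c in range(10)])
        db.commit()

        def claim_task():
            # Hand the next pending work unit to a volunteer node.
            row = db.execute("SELECT id, cell FROM tasks WHERE status='pending' LIMIT 1").fetchone()
            if row:
                db.execute("UPDATE tasks SET status='running' WHERE id=?", (row[0],))
                db.commit()
            return row

        def complete_task(task_id, value):
            # Store the result computed in the volunteer's browser.
            db.execute("UPDATE tasks SET status='done', result=? WHERE id=?", (value, task_id))
            db.commit()

        task = claim_task()
        if task:
            complete_task(task[0], 0.42)          # placeholder result from a client-side model run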

  9. Multiresolution comparison of precipitation datasets for large-scale models

    NASA Astrophysics Data System (ADS)

    Chun, K. P.; Sapriza Azuri, G.; Davison, B.; DeBeer, C. M.; Wheater, H. S.

    2014-12-01

Gridded precipitation datasets are crucial for driving large-scale models used in weather forecasting and climate research. However, the quality of precipitation products is usually validated individually. Comparing gridded precipitation products along with ground observations provides another avenue for investigating how precipitation uncertainty affects the performance of large-scale models. In this study, using data from a set of precipitation gauges over British Columbia and Alberta, we evaluate several widely used North American gridded products, including the Canadian Gridded Precipitation Anomalies (CANGRD), the National Center for Environmental Prediction (NCEP) reanalysis, the Water and Global Change (WATCH) project, the thin-plate spline smoothing algorithm (ANUSPLIN) and the Canadian Precipitation Analysis (CaPA). Based on verification criteria for various temporal and spatial scales, the results provide an assessment of possible applications for the various precipitation datasets. For long-term climate variation studies (~100 years), CANGRD, NCEP, WATCH and ANUSPLIN have different comparative advantages in terms of resolution and accuracy. For synoptic and mesoscale precipitation patterns, CaPA provides appealing spatial coherence. In addition to the product comparison, various downscaling methods are also surveyed to explore new verification and bias-reduction methods for improving gridded precipitation outputs for large-scale models.
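
    One elementary verification step implied above is comparing a gridded product's series at a gauge location with the gauge record at several temporal scales. A hedged pandas sketch follows; the input file and column names ("gauge_mm", "capa_mm") are placeholders, not the study's data.

        # Bias, RMSE and correlation between gauge and gridded-product precipitation
        # after aggregation to daily and monthly scales.
        import numpy as np
        import pandas as pd

        df = pd.read_csv("gauge_vs_product.csv", index_col="date", parse_dates=True)

        def verify(obs, sim):
            bias = (sim - obs).mean()
            rmse = np.sqrt(((sim - obs) ** 2).mean())
            return bias, rmse, obs.corr(sim)

        for scale in ("D", "MS"):                 # daily and monthly accumulations
            agg = df.resample(scale).sum()
            print(scale, verify(agg["gauge_mm"], agg["capa_mm"]))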

  10. The Phenomenology of Small-Scale Turbulence

    NASA Astrophysics Data System (ADS)

    Sreenivasan, K. R.; Antonia, R. A.

    "I have sometimes thought that what makes a man's work classic is often just this multiplicity [of interpretations], which invites and at the same time resists our craving for a clear understanding." Wright (1982, p. 34), on Wittgenstein's philosophy.

    Small-scale turbulence has been an area of especially active research in the recent past, and several useful research directions have been pursued. Here, we selectively review this work. The emphasis is on scaling phenomenology and kinematics of small-scale structure. After providing a brief introduction to the classical notions of universality due to Kolmogorov and others, we survey the existing work on intermittency, refined similarity hypotheses, anomalous scaling exponents, derivative statistics, intermittency models, and the structure and kinematics of small-scale structure - the latter aspect coming largely from the direct numerical simulation of homogeneous turbulence in a periodic box.

  11. Strengths amidst vulnerabilities: the paradox of resistance in a mining-affected community in Guatemala.

    PubMed

    Caxaj, C Susana; Berman, Helene; Ray, Susan L; Restoule, Jean-Paul; Varcoe, Coleen

    2014-11-01

The influence of large-scale mining on the psychosocial wellbeing and mental health of diverse Indigenous communities has attracted increased attention. In previous reports, we have discussed the influence of a gold mining operation on the health of a community in the Western highlands of Guatemala. Here, we discuss the community strengths and acts of resistance of this community, that is, the community processes that promoted mental health amidst this context. Using an anti-colonial narrative methodology that incorporated participatory action research principles, we developed a research design in collaboration with community leaders and participants. Data collection involved focus groups, individual interviews and photo-sharing with 54 men and women between the ages of 18 and 67. Data analysis was guided by iterative and ongoing conversations with participants and McCormack's narrative lenses. Study findings revealed key mechanisms and sources of resistance, including a shared cultural identity, a spiritual knowing and being, 'defending our rights, defending our territory,' and speaking truth to power. These overlapping strengths were identified by participants as key protective factors in facing challenges and adversity. Yet ultimately, these same strengths were often the most eroded or endangered due to the influence of large-scale mining operations in the region. These community strengths and acts of resistance reveal important priorities for promoting mental health and wellbeing for populations impacted by large-scale mining operations. Mental health practitioners must attend to both the strengths and parallel vulnerabilities that may be occasioned by large-scale projects of this nature.

  12. Large scale synthesis of α-Si3N4 nanowires through a kinetically favored chemical vapour deposition process

    NASA Astrophysics Data System (ADS)

    Liu, Haitao; Huang, Zhaohui; Zhang, Xiaoguang; Fang, Minghao; Liu, Yan-gai; Wu, Xiaowen; Min, Xin

    2018-01-01

Understanding the kinetic barrier and driving force for crystal nucleation and growth is decisive for the synthesis of nanowires with controllable yield and morphology. In this research, we developed an effective reaction system to synthesize α-Si3N4 nanowires at very large scale (hundreds of milligrams) and carried out a comparative study to characterize the kinetic influence of gas precursor supersaturation and of a liquid metal catalyst. The phase composition, morphology, microstructure and photoluminescence properties of the as-synthesized products were characterized by X-ray diffraction, Fourier-transform infrared spectroscopy, field emission scanning electron microscopy, transmission electron microscopy and room temperature photoluminescence measurements. The yield of the products relates not only to the reaction temperature (a thermodynamic condition) but also to the distribution of gas precursors (a kinetic condition). As revealed in this research, by controlling the gas diffusion process the yield of the nanowire products could be greatly improved. The experimental results indicate that supersaturation, rather than the catalyst, is the dominant factor in the as-designed system. With excellent non-flammability and high thermal stability, the large-scale α-Si3N4 products have potential applications in improving the strength of high-temperature ceramic composites. The photoluminescence spectrum of the α-Si3N4 shows a blue shift, which could be valuable for future applications in blue-green emitting devices. These large-scale products thus form the basis for such applications.

  13. Conducting pilot and feasibility studies.

    PubMed

    Cope, Diane G

    2015-03-01

    Planning a well-designed research study can be tedious and laborious work. However, this process is critical and ultimately can produce valid, reliable study findings. Designing a large-scale randomized, controlled trial (RCT)-the gold standard in quantitative research-can be even more challenging. Even the most well-planned study potentially can result in issues with research procedures and design, such as recruitment, retention, or methodology. One strategy that may facilitate sound study design is the completion of a pilot or feasibility study prior to the initiation of a larger-scale trial. This article will discuss pilot and feasibility studies, their advantages and disadvantages, and implications for oncology nursing research. 

  14. The magnetic shear-current effect: Generation of large-scale magnetic fields by the small-scale dynamo

    DOE PAGES

    Squire, J.; Bhattacharjee, A.

    2016-03-14

A novel large-scale dynamo mechanism, the magnetic shear-current effect, is discussed and explored. Here, the effect relies on the interaction of magnetic fluctuations with a mean shear flow, meaning the saturated state of the small-scale dynamo can drive a large-scale dynamo - in some sense the inverse of dynamo quenching. The dynamo is non-helical, with the mean-field α coefficient zero, and is caused by the interaction between an off-diagonal component of the turbulent resistivity and the stretching of the large-scale field by shear flow. Following up on previous numerical and analytic work, this paper presents further details of the numerical evidence for the effect, as well as an heuristic description of how magnetic fluctuations can interact with shear flow to produce the required electromotive force. The pressure response of the fluid is fundamental to this mechanism, which helps explain why the magnetic effect is stronger than its kinematic cousin, and the basic idea is related to the well-known lack of turbulent resistivity quenching by magnetic fluctuations. As well as being interesting for its applications to general high Reynolds number astrophysical turbulence, where strong small-scale magnetic fluctuations are expected to be prevalent, the magnetic shear-current effect is a likely candidate for large-scale dynamo in the unstratified regions of ionized accretion disks. Evidence for this is discussed, as well as future research directions and the challenges involved with understanding details of the effect in astrophysically relevant regimes.

  15. Sex differences in virtual navigation influenced by scale and navigation experience.

    PubMed

    Padilla, Lace M; Creem-Regehr, Sarah H; Stefanucci, Jeanine K; Cashdan, Elizabeth A

    2017-04-01

    The Morris water maze is a spatial abilities test adapted from the animal spatial cognition literature and has been studied in the context of sex differences in humans. This is because its standard design, which manipulates proximal (close) and distal (far) cues, applies to human navigation. However, virtual Morris water mazes test navigation skills on a scale that is vastly smaller than natural human navigation. Many researchers have argued that navigating in large and small scales is fundamentally different, and small-scale navigation might not simulate natural human navigation. Other work has suggested that navigation experience could influence spatial skills. To address the question of how individual differences influence navigational abilities in differently scaled environments, we employed both a large- (146.4 m in diameter) and a traditional- (36.6 m in diameter) scaled virtual Morris water maze along with a novel measure of navigation experience (lifetime mobility). We found sex differences on the small maze in the distal cue condition only, but in both cue-conditions on the large maze. Also, individual differences in navigation experience modulated navigation performance on the virtual water maze, showing that higher mobility was related to better performance with proximal cues for only females on the small maze, but for both males and females on the large maze.

  16. Shared versus distributed memory multiprocessors

    NASA Technical Reports Server (NTRS)

    Jordan, Harry F.

    1991-01-01

The question of whether multiprocessors should have shared or distributed memory has attracted a great deal of attention. Some researchers argue strongly for building distributed memory machines, while others argue just as strongly for programming shared memory multiprocessors. A great deal of research is underway on both types of parallel systems. Special emphasis is placed on systems with a very large number of processors for computation-intensive tasks, and research and implementation trends are considered. It appears that the two types of systems will likely converge to a common form for large-scale multiprocessors.

  17. Robust Optical Recognition of Cursive Pashto Script Using Scale, Rotation and Location Invariant Approach

    PubMed Central

    Ahmad, Riaz; Naz, Saeeda; Afzal, Muhammad Zeshan; Amin, Sayed Hassan; Breuel, Thomas

    2015-01-01

The presence of a large number of unique shapes called ligatures in cursive languages, along with variations due to scaling, orientation and location, provides one of the most challenging pattern recognition problems. Recognition of the large number of ligatures is often a complicated task in oriental languages such as Pashto, Urdu, Persian and Arabic. Research on cursive script recognition often ignores the fact that scaling, orientation, location and font variations are common in printed cursive text; therefore, these variations are not included in image databases or in experimental evaluations. This research uncovers challenges faced by Arabic cursive script recognition in a holistic framework by considering Pashto as a test case, because the Pashto language has a larger alphabet set than Arabic, Persian and Urdu. A database containing 8000 images of 1000 unique ligatures with scaling, orientation and location variations is introduced. In this article, a feature space based on the scale invariant feature transform (SIFT), along with a segmentation framework, is proposed for overcoming the above-mentioned challenges. The experimental results show a significantly improved performance of the proposed scheme over traditional feature extraction techniques such as principal component analysis (PCA). PMID:26368566
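
    For readers unfamiliar with SIFT, the sketch below extracts scale-, rotation- and location-invariant descriptors from two ligature images with OpenCV and matches them using a ratio test; the image files are placeholders, and this is not the authors' full segmentation and recognition framework.

        # SIFT keypoint extraction and descriptor matching between two ligature images.
        import cv2

        img_a = cv2.imread("ligature_a.png", cv2.IMREAD_GRAYSCALE)
        img_b = cv2.imread("ligature_b.png", cv2.IMREAD_GRAYSCALE)

        sift = cv2.SIFT_create()                  # bundled in OpenCV >= 4.4; older builds need opencv-contrib
        kp_a, des_a = sift.detectAndCompute(img_a, None)
        kp_b, des_b = sift.detectAndCompute(img_b, None)

        matcher = cv2.BFMatcher(cv2.NORM_L2)
        good = [m for m, n in matcher.knnMatch(des_a, des_b, k=2)
                if m.distance < 0.75 * n.distance]
        print(f"{len(good)} matches survive the ratio test")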

  18. A Successful Replication of the River Visitor Inventory and Monitoring Process for Capacity Management

    Treesearch

    Kenneth Chilman; James Vogel; Greg Brown; John H. Burde

    2004-01-01

This paper has three purposes: to discuss (1) case study research and its utility for recreation management decisionmaking, (2) the recreation visitor inventory and monitoring process developed from case study research, and (3) a successful replication of the process in a large-scale, multi-year application. Although case study research is discussed in research textbooks as...

  19. International Citizenship Education Research: An Annotated Bibliography of Research Using the IEA ICCS and IEA CIVED Datasets

    ERIC Educational Resources Information Center

    Knowles, Ryan T.; Di Stefano, Marialuisa

    2015-01-01

    In November 2015, a group of researchers met to discuss the role of large-scale international studies to inform social studies research and practice. The conversation focused on published analyses of the International Association for the Evaluation of Educational Achievement (IEA) 1999 Civic Education study (CIVED) of 14 year olds in 28 countries,…

  20. Annual Review of Research Under the Joint Service Electronics Program.

    DTIC Science & Technology

    1979-10-01

Contents: Quadratic Optimization Problems; Nonlinear Control; Nonlinear Fault Analysis; Qualitative Analysis of Large Scale Systems; Multidimensional System Theory; Optical Noise; and Pattern Recognition.

  1. Workplan for Catalyzing Collaboration with Amazonian Universities in the Large Scale Biosphere-Atmosphere Experiment in Amazonia (LBA)

    NASA Technical Reports Server (NTRS)

    Brown, I. Foster; Moreira, Adriana

    1997-01-01

Success of the Large-Scale Biosphere-Atmosphere Experiment in Amazonia (LBA) program depends on several critical factors, the most important being the effective participation of Amazonian researchers and institutions. Without host-country counterparts, particularly in Amazonia, many important studies cannot be undertaken, due either to a lack of qualified persons or to legal constraints. No less important, the acceptance of the LBA program in Amazonia also depends on what LBA can do to improve scientific expertise in Amazonia. Gaining the active investment of Amazonian scientists in a comprehensive research program is not a trivial task. Potential collaborators are few, particularly where much of the research was originally to be focused - the southern arc of Brazilian Amazonia. The mid-term goals of the LBA Committee on Training and Education are to increase the number of collaborators and to demonstrate that LBA will be of benefit to the region.

  2. Towards AI-powered personalization in MOOC learning

    NASA Astrophysics Data System (ADS)

    Yu, Han; Miao, Chunyan; Leung, Cyril; White, Timothy John

    2017-12-01

    Massive Open Online Courses (MOOCs) represent a form of large-scale learning that is changing the landscape of higher education. In this paper, we offer a perspective on how advances in artificial intelligence (AI) may enhance learning and research on MOOCs. We focus on emerging AI techniques including how knowledge representation tools can enable students to adjust the sequence of learning to fit their own needs; how optimization techniques can efficiently match community teaching assistants to MOOC mediation tasks to offer personal attention to learners; and how virtual learning companions with human traits such as curiosity and emotions can enhance learning experience on a large scale. These new capabilities will also bring opportunities for educational researchers to analyse students' learning skills and uncover points along learning paths where students with different backgrounds may require different help. Ethical considerations related to the application of AI in MOOC education research are also discussed.
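
    The matching of community teaching assistants to mediation tasks mentioned above is, in its simplest form, an assignment problem. A minimal sketch using the Hungarian algorithm is given below; the cost matrix is invented for illustration, and the paper does not prescribe this particular formulation.

        # Assign TAs to forum-mediation tasks so that the total predicted cost is minimal.
        import numpy as np
        from scipy.optimize import linear_sum_assignment

        rng = np.random.default_rng(1)
        cost = rng.uniform(size=(5, 5))           # cost[i, j]: TA i handling mediation task j

        ta_idx, task_idx = linear_sum_assignment(cost)
        for ta, task in zip(ta_idx, task_idx):
            print(f"assign TA {ta} to task {task} (cost {cost[ta, task]:.2f})")
        print("total cost:", round(cost[ta_idx, task_idx].sum(), 2))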

  3. Ocean Research Enabled by Underwater Gliders.

    PubMed

    Rudnick, Daniel L

    2016-01-01

    Underwater gliders are autonomous underwater vehicles that profile vertically by changing their buoyancy and use wings to move horizontally. Gliders are useful for sustained observation at relatively fine horizontal scales, especially to connect the coastal and open ocean. In this review, research topics are grouped by time and length scales. Large-scale topics addressed include the eastern and western boundary currents and the regional effects of climate variability. The accessibility of horizontal length scales of order 1 km allows investigation of mesoscale and submesoscale features such as fronts and eddies. Because the submesoscales dominate vertical fluxes in the ocean, gliders have found application in studies of biogeochemical processes. At the finest scales, gliders have been used to measure internal waves and turbulent dissipation. The review summarizes gliders' achievements to date and assesses their future in ocean observation.

  4. Crash test and evaluation of temporary wood sign support system for large guide signs.

    DOT National Transportation Integrated Search

    2016-07-01

The objective of this research task was to evaluate the impact performance of a temporary wood sign support system for large guide signs. It was desired to use existing TxDOT sign hardware in the design to the extent possible. The full-scale cras...

  5. Lessons from historical rangeland revegetation for today's restoration

    Treesearch

    Bruce A. Roundy

    1999-01-01

Rangeland revegetation in the Western United States historically was applied at a large scale for soil conservation and forage production purposes. Principles of revegetation that have developed over years of research include matching site potential and plant material adaptation, use of appropriate seedbed preparation and sowing techniques, and development of large...

  6. META II: Formal Co-Verification of Correctness of Large-Scale Cyber-Physical Systems during Design. Volume 1

    DTIC Science & Technology

    2011-08-01

    design space is large. His research contributions are to the field of Decision-based Design, specifically in linking consumer preferences and...Integrating Consumer Preferences into Engineering Design, to be published in 2012. He received his PhD from Northwestern University in Mechanical

  7. Supporting Source Code Comprehension during Software Evolution and Maintenance

    ERIC Educational Resources Information Center

    Alhindawi, Nouh

    2013-01-01

This dissertation addresses the problems of program comprehension to support the evolution of large-scale software systems. The research concerns how software engineers locate features and concepts, and categorize changes, within very large bodies of source code and their versioned histories. More specifically, advanced Information…

  8. Taking the pulse of a continent: Expanding site-based research infrastructure for regional- to continental-scale ecology

    USDA-ARS?s Scientific Manuscript database

    Many of the most dramatic and surprising effects of global change on ecological systems will occur across large spatial extents, from regions to continents. Multiple ecosystem types will be impacted across a range of interacting spatial and temporal scales. The ability of ecologists to understand an...

  9. PREPping Students for Authentic Science

    ERIC Educational Resources Information Center

    Dolan, Erin L.; Lally, David J.; Brooks, Eric; Tax, Frans E.

    2008-01-01

    In this article, the authors describe a large-scale research collaboration, the Partnership for Research and Education in Plants (PREP), which has capitalized on publicly available databases that contain massive amounts of biological information; stock centers that house and distribute inexpensive organisms with different genotypes; and the…

  10. Uniform standards for genome databases in forest and fruit trees

    USDA-ARS?s Scientific Manuscript database

    TreeGenes and tfGDR serve the international forestry and fruit tree genomics research communities, respectively. These databases hold similar sequence data and provide resources for the submission and recovery of this information in order to enable comparative genomics research. Large-scale genotype...

  11. Topical report on sources and systems for aquatic plant biomass as an energy resource

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldman, J.C.; Ryther, J.H.; Waaland, R.

    1977-10-21

Background information is documented on the mass cultivation of aquatic plants and systems design that is available from the literature and through consultation with active research scientists and engineers. The biology of microalgae, macroalgae, and aquatic angiosperms is discussed in terms of morphology, life history, mode of existence, and ecological significance, as they relate to cultivation. The requirements for growth of these plants, which are outlined in the text, suggest that productivity rates are dependent primarily on the availability of light and nutrients. It is concluded that the systems should be run with an excess of nutrients and with light as the limiting factor. A historical review of the mass cultivation of aquatic plants describes the techniques used in commercial large-scale operations throughout the world and recent small-scale research efforts. This review presents information on the biomass yields that have been attained to date in various geographical locations with different plant species and culture conditions, emphasizing the contrast between high yields in small-scale operations and lower yields in large-scale operations.

  12. Efficient design of clinical trials and epidemiological research: is it possible?

    PubMed

    Lauer, Michael S; Gordon, David; Wei, Gina; Pearson, Gail

    2017-08-01

Randomized clinical trials and large-scale cohort studies continue to have a critical role in generating evidence in cardiovascular medicine; however, there is increasing concern that ballooning costs threaten the clinical trial enterprise. In this Perspectives article, we discuss the changing landscape of clinical research, and clinical trials in particular, focusing on reasons for the increasing costs and inefficiencies. These reasons include excessively complex design, overly restrictive inclusion and exclusion criteria, burdensome regulations, excessive source-data verification, and concerns about the effect of clinical research conduct on workflow. Thought leaders have called on the clinical research community to consider alternative, transformative business models, including those models that focus on simplicity and leveraging of digital resources. We present some examples of innovative approaches by which some investigators have successfully conducted large-scale clinical trials at relatively low cost. These examples include randomized registry trials, cluster-randomized trials, adaptive trials, and trials that are fully embedded within digital clinical care or administrative platforms.

  13. Data Crosscutting Requirements Review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kleese van Dam, Kerstin; Shoshani, Arie; Plata, Charity

    2013-04-01

In April 2013, a diverse group of researchers from the U.S. Department of Energy (DOE) scientific community assembled to assess data requirements associated with DOE-sponsored scientific facilities and large-scale experiments. Participants in the review included facilities staff, program managers, and scientific experts from the offices of Basic Energy Sciences, Biological and Environmental Research, High Energy Physics, and Advanced Scientific Computing Research. As part of the meeting, review participants discussed key issues associated with three distinct aspects of the data challenge: 1) processing, 2) management, and 3) analysis. These discussions identified commonalities and differences among the needs of varied scientific communities. They also helped to articulate gaps between current approaches and future needs, as well as the research advances that will be required to close these gaps. Moreover, the review provided a rare opportunity for experts from across the Office of Science to learn about their collective expertise, challenges, and opportunities. The "Data Crosscutting Requirements Review" generated specific findings and recommendations for addressing large-scale data crosscutting requirements.

  14. Mining influence on underground water resources in arid and semiarid regions

    NASA Astrophysics Data System (ADS)

    Luo, A. K.; Hou, Y.; Hu, X. Y.

    2018-02-01

Coordinated management of coal mining and water resources in arid and semiarid regions has long been a focus of research. Taking the Energy and Chemical Base in Northern Shaanxi as an example, this study statistically analyses coal yield and drainage volume from several large-scale mines in the mining area, determines the average water volume drained per ton of coal, and calculates drainage volumes for four typical years under different mining intensities. Combining two decades of precipitation observations with water-level data from an observation well, the groundwater table, precipitation infiltration recharge and evaporation during mine drainage are then calculated, and the transformation relationships between surface water, mine water and groundwater are analysed. The results show that massive mine drainage caused by large-scale coal mining in the research area is the main reason for the reduction in water resources and for the changed transformation relationships between surface water, groundwater and mine water.
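
    The water-balance bookkeeping described above reduces to simple products: drainage scales with coal yield via the average water volume per ton, and recharge with precipitation depth, infiltration coefficient and area. The numbers below are placeholders, not values from the study.

        # Order-of-magnitude sketch of annual mine drainage versus groundwater recharge.
        coal_yield_t = 120e6        # annual coal yield, tonnes (assumed)
        water_per_ton = 0.45        # m3 of mine water drained per tonne of coal (assumed)
        drainage_m3 = coal_yield_t * water_per_ton

        area_m2 = 2.0e9             # catchment area, m2 (assumed)
        precip_m = 0.40             # annual precipitation depth, m (assumed)
        infil_coeff = 0.15          # fraction of precipitation recharging groundwater (assumed)
        recharge_m3 = area_m2 * precip_m * infil_coeff

        print(f"mine drainage ~ {drainage_m3:.2e} m3/yr, recharge ~ {recharge_m3:.2e} m3/yr")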

  15. Robust large-scale parallel nonlinear solvers for simulations.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bader, Brett William; Pawlowski, Roger Patrick; Kolda, Tamara Gibson

    2005-11-01

This report documents research to develop robust and efficient solution techniques for solving large-scale systems of nonlinear equations. The most widely used method for solving systems of nonlinear equations is Newton's method. While much research has been devoted to augmenting Newton-based solvers (usually with globalization techniques), little has been devoted to exploring the application of different models. Our research has been directed at evaluating techniques using different models than Newton's method: a lower order model, Broyden's method, and a higher order model, the tensor method. We have developed large-scale versions of each of these models and have demonstrated their use in important applications at Sandia. Broyden's method replaces the Jacobian with an approximation, allowing codes that cannot evaluate a Jacobian or have an inaccurate Jacobian to converge to a solution. Limited-memory methods, which have been successful in optimization, allow us to extend this approach to large-scale problems. We compare the robustness and efficiency of Newton's method, modified Newton's method, the Jacobian-free Newton-Krylov method, and our limited-memory Broyden method. Comparisons are carried out for large-scale applications of fluid flow simulations and electronic circuit simulations. Results show that, in cases where the Jacobian was inaccurate or could not be computed, Broyden's method converged in some cases where Newton's method failed to converge. We identify conditions where Broyden's method can be more efficient than Newton's method. We also present modifications to a large-scale tensor method, originally proposed by Bouaricha, for greater efficiency, better robustness, and wider applicability. Tensor methods are an alternative to Newton-based methods and are based on computing a step from a local quadratic model rather than a linear model. The advantage of Bouaricha's method is that it can use any existing linear solver, which makes it simple to write and easily portable. However, the method usually takes twice as long to solve as Newton-GMRES on general problems because it solves two linear systems at each iteration. In this paper, we discuss modifications to Bouaricha's method for a practical implementation, including a special globalization technique and other modifications for greater efficiency. We present numerical results showing computational advantages over Newton-GMRES on some realistic problems. We further discuss a new approach for dealing with singular (or ill-conditioned) matrices. In particular, we modify an algorithm for identifying a turning point so that an increasingly ill-conditioned Jacobian does not prevent convergence.
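
    To make the model comparison concrete, the sketch below solves one toy nonlinear system with a Newton-type method, Broyden's method and a Jacobian-free Newton-Krylov method through SciPy; it illustrates the three model classes discussed, not Sandia's solver stack.

        # The same residual solved with three nonlinear-solver models.
        import numpy as np
        from scipy.optimize import root

        def residual(x):
            # Toy 2x2 system with a root at (1, 1).
            return np.array([x[0] ** 2 + x[1] ** 2 - 2.0,
                             np.exp(x[0] - 1.0) + x[1] ** 3 - 2.0])

        x0 = np.array([2.0, 0.5])
        for method in ("hybr", "broyden1", "krylov"):
            sol = root(residual, x0, method=method)
            print(f"{method:8s} converged={sol.success} x={np.round(sol.x, 4)}")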

  16. Big biomedical data and cardiovascular disease research: opportunities and challenges.

    PubMed

    Denaxas, Spiros C; Morley, Katherine I

    2015-07-01

    Electronic health records (EHRs), data generated and collected during normal clinical care, are increasingly being linked and used for translational cardiovascular disease research. Electronic health record data can be structured (e.g. coded diagnoses) or unstructured (e.g. clinical notes) and increasingly encapsulate medical imaging, genomic and patient-generated information. Large-scale EHR linkages enable researchers to conduct high-resolution observational and interventional clinical research at an unprecedented scale. A significant amount of preparatory work and research, however, is required to identify, obtain, and transform raw EHR data into research-ready variables that can be statistically analysed. This study critically reviews the opportunities and challenges that EHR data present in the field of cardiovascular disease clinical research and provides a series of recommendations for advancing and facilitating EHR research.

  17. Spatial and temporal variance in fatty acid and stable isotope signatures across trophic levels in large river systems

    USGS Publications Warehouse

    Fritts, Andrea; Knights, Brent C.; Lafrancois, Toben D.; Bartsch, Lynn; Vallazza, Jon; Bartsch, Michelle; Richardson, William B.; Karns, Byron N.; Bailey, Sean; Kreiling, Rebecca

    2018-01-01

    Fatty acid and stable isotope signatures allow researchers to better understand food webs, food sources, and trophic relationships. Research in marine and lentic systems has indicated that the variance of these biomarkers can exhibit substantial differences across spatial and temporal scales, but this type of analysis has not been completed for large river systems. Our objectives were to evaluate variance structures for fatty acids and stable isotopes (i.e. δ13C and δ15N) of seston, threeridge mussels, hydropsychid caddisflies, gizzard shad, and bluegill across spatial scales (10s-100s km) in large rivers of the Upper Mississippi River Basin, USA that were sampled annually for two years, and to evaluate the implications of this variance on the design and interpretation of trophic studies. The highest variance for both isotopes was present at the largest spatial scale for all taxa (except seston δ15N) indicating that these isotopic signatures are responding to factors at a larger geographic level rather than being influenced by local-scale alterations. Conversely, the highest variance for fatty acids was present at the smallest spatial scale (i.e. among individuals) for all taxa except caddisflies, indicating that the physiological and metabolic processes that influence fatty acid profiles can differ substantially between individuals at a given site. Our results highlight the need to consider the spatial partitioning of variance during sample design and analysis, as some taxa may not be suitable to assess ecological questions at larger spatial scales.

  18. Large-scale Meteorological Patterns Associated with Extreme Precipitation Events over Portland, OR

    NASA Astrophysics Data System (ADS)

    Aragon, C.; Loikith, P. C.; Lintner, B. R.; Pike, M.

    2017-12-01

    Extreme precipitation events can have profound impacts on human life and infrastructure, with broad implications across a range of stakeholders. Changes to extreme precipitation events are a projected outcome of climate change that warrants further study, especially at regional- to local-scales. While global climate models are generally capable of simulating mean climate at global-to-regional scales with reasonable skill, resiliency and adaptation decisions are made at local-scales where most state-of-the-art climate models are limited by coarse resolution. Characterization of large-scale meteorological patterns associated with extreme precipitation events at local-scales can provide climatic information without this scale limitation, thus facilitating stakeholder decision-making. This research will use synoptic climatology as a tool by which to characterize the key large-scale meteorological patterns associated with extreme precipitation events in the Portland, Oregon metro region. Composite analysis of meteorological patterns associated with extreme precipitation days, and associated watershed-specific flooding, is employed to enhance understanding of the climatic drivers behind such events. The self-organizing maps approach is then used to characterize the within-composite variability of the large-scale meteorological patterns associated with extreme precipitation events, allowing us to better understand the different types of meteorological conditions that lead to high-impact precipitation events and associated hydrologic impacts. A more comprehensive understanding of the meteorological drivers of extremes will aid in evaluation of the ability of climate models to capture key patterns associated with extreme precipitation over Portland and to better interpret projections of future climate at impact-relevant scales.
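
    The composite step described above amounts to averaging the gridded fields over the days that exceed a precipitation threshold. A hedged NumPy sketch follows; the arrays are synthetic stand-ins for station precipitation and reanalysis anomalies, and the self-organizing-map step is not shown.

        # Composite of a gridded anomaly field over extreme-precipitation days (>= 95th percentile).
        import numpy as np

        rng = np.random.default_rng(0)
        ndays, nlat, nlon = 3650, 40, 60
        precip = rng.gamma(shape=0.8, scale=4.0, size=ndays)     # daily station precipitation
        z500_anom = rng.normal(size=(ndays, nlat, nlon))         # daily gridded anomalies

        extreme = precip >= np.percentile(precip, 95)
        composite = z500_anom[extreme].mean(axis=0)              # mean pattern on extreme days

        print("extreme days:", int(extreme.sum()), "composite shape:", composite.shape)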

  19. A numerical projection technique for large-scale eigenvalue problems

    NASA Astrophysics Data System (ADS)

    Gamillscheg, Ralf; Haase, Gundolf; von der Linden, Wolfgang

    2011-10-01

We present a new numerical technique to solve large-scale eigenvalue problems. It is based on the projection technique used in strongly correlated quantum many-body systems, where an effective approximate model of smaller complexity is first constructed by projecting out high-energy degrees of freedom, and the resulting model is then solved by a standard eigenvalue solver. Here we introduce a generalization of this idea in which both steps are performed numerically and which, in contrast to the standard projection technique, converges in principle to the exact eigenvalues. This approach is applicable not only to eigenvalue problems encountered in many-body systems but also to those arising in other areas of research that result in large-scale eigenvalue problems for matrices which have, roughly speaking, a pronounced dominant diagonal part. We present detailed studies of the approach guided by two many-body models.
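
    The projection idea above can be mimicked on a toy matrix with a dominant diagonal: keep the basis states with the smallest diagonal entries, diagonalize the projected block, and compare with the exact lowest eigenvalue. The matrix below is a random stand-in, not one of the paper's many-body models, and the simple truncation shown is the standard projection, not the authors' generalization.

        # Numerical projection onto a low-energy subspace versus exact sparse diagonalization.
        import numpy as np
        import scipy.sparse as sp
        from scipy.sparse.linalg import eigsh

        rng = np.random.default_rng(0)
        n, k = 2000, 200
        diag = np.sort(rng.uniform(0.0, 50.0, size=n))            # dominant diagonal part
        offdiag = sp.random(n, n, density=5e-4, random_state=0,
                            data_rvs=lambda size: rng.normal(0.0, 0.5, size))
        H = (sp.diags(diag) + offdiag + offdiag.T).tocsr()        # symmetric test matrix

        keep = np.argsort(diag)[:k]                                # low-energy basis states
        H_eff = H[keep][:, keep].toarray()
        approx = np.linalg.eigvalsh(H_eff)[0]

        exact = eigsh(H, k=1, which="SA", return_eigenvectors=False)[0]
        print(f"projected ground state {approx:.4f} vs exact {exact:.4f}")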

  20. Economies of scale and scope in publicly funded biomedical and health research: evidence from the literature.

    PubMed

    Hernandez-Villafuerte, Karla; Sussex, Jon; Robin, Enora; Guthrie, Sue; Wooding, Steve

    2017-02-02

    Publicly funded biomedical and health research is expected to achieve the best return possible for taxpayers and for society generally. It is therefore important to know whether such research is more productive if concentrated into a small number of 'research groups' or dispersed across many. We undertook a systematic rapid evidence assessment focused on the research question: do economies of scale and scope exist in biomedical and health research? In other words, is that research more productive per unit of cost if more of it, or a wider variety of it, is done in one location? We reviewed English language literature without date restriction to the end of 2014. To help us to classify and understand that literature, we first undertook a review of econometric literature discussing models for analysing economies of scale and/or scope in research generally (not limited to biomedical and health research). We found a large and disparate literature. We reviewed 60 empirical studies of (dis-)economies of scale and/or scope in biomedical and health research, or in categories of research including or overlapping with biomedical and health research. This literature is varied in methods and findings. At the level of universities or research institutes, studies more often point to positive economies of scale than to diseconomies of scale or constant returns to scale in biomedical and health research. However, all three findings exist in the literature, along with inverse U-shaped relationships. At the level of individual research units, laboratories or projects, the numbers of studies are smaller and evidence is mixed. Concerning economies of scope, the literature more often suggests positive economies of scope than diseconomies, but the picture is again mixed. The effect of varying the scope of activities by a research group was less often reported than the effect of scale and the results were more mixed. The absence of predominant findings for or against the existence of economies of scale or scope implies a continuing need for case by case decisions when distributing research funding, rather than a general policy either to concentrate funding in a few centres or to disperse it across many.

  1. Measuring high-density built environment for public health research: Uncertainty with respect to data, indicator design and spatial scale.

    PubMed

    Sun, Guibo; Webster, Chris; Ni, Michael Y; Zhang, Xiaohu

    2018-05-07

    Uncertainty with respect to built environment (BE) data collection, measure conceptualization and spatial scales is evident in urban health research, but most findings are from relatively low-density contexts. We selected Hong Kong, an iconic high-density city, as the study area because limited research has been conducted on uncertainty in such settings. We used geocoded home addresses (n=5732) from a large population-based cohort in Hong Kong to extract BE measures for the participants' place of residence based on an internationally recognized BE framework. Variability of the measures was mapped, and Spearman's rank correlation was calculated to assess how well the relationships among indicators are preserved across variables and spatial scales. We found extreme variations and uncertainties for the 180 measures collected using comprehensive data and advanced geographic information systems modelling techniques. We highlight the implications of the methodological choices and spatial scales of the measures. The results suggest that more robust information for urban health research in high-density cities would emerge if greater consideration were given to BE data, design methods and the spatial scales of the BE measures.
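
    As a concrete illustration of the kind of scale-sensitivity check described above (a hypothetical sketch, not the study's code or data), Spearman's rank correlation can be used to test whether sites keep roughly the same ranking on an indicator when the buffer size changes:

```python
# Hedged sketch: rank-order agreement of one toy BE indicator across two buffer sizes.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
density_400m = rng.lognormal(mean=3.0, sigma=0.5, size=500)         # toy indicator, 400 m buffer
density_800m = density_400m * rng.lognormal(0.0, 0.3, size=500)     # same sites, 800 m buffer

rho, p = spearmanr(density_400m, density_800m)
print(f"Spearman rho across spatial scales: {rho:.2f} (p = {p:.1e})")
```

    A low rho would indicate that conclusions about the indicator depend strongly on the chosen spatial scale.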

  2. About the Early Detection Research Group | Division of Cancer Prevention

    Cancer.gov

    The Early Detection Research Group supports research that seeks to determine the effectiveness, operating characteristics and clinical impact (harms as well as benefits) of cancer early detection technologies and practices, such as imaging and molecular biomarker approaches. The group ran two large-scale early detection trials for which data and biospecimens are available.

  3. Guide to Datasets for Research and Policymaking in Child Care and Early Education

    ERIC Educational Resources Information Center

    Romero, Mariajose; Douglas-Hall, Ayana

    2009-01-01

    This Guide is an annotated bibliography of existing large scale data sets that provide useful information to policymakers, researchers, state administrators, and others in the field of child care and early education. The Guide follows an ecological approach to research and policy in the field: it brings attention not only to children themselves,…

  4. AFRL/Cornell Information Assurance Institute

    DTIC Science & Technology

    2007-03-01

    ...collaborations involving Cornell and AFRL researchers, with AFRL researchers able to participate in Cornell research projects, facilitating technology ... approach to developing a science base and technology for supporting large-scale reliable distributed systems. First, solutions to core problems were...

  5. Examining What We Mean by "Collaboration" in Collaborative Action Research: A Cross-Case Analysis

    ERIC Educational Resources Information Center

    Bruce, Catherine D.; Flynn, Tara; Stagg-Peterson, Shelley

    2011-01-01

    The purpose of this paper is to report on the nature of collaboration in a multi-year, large-scale collaborative action research project in which a teachers' federation (in Ontario, Canada), university researchers and teachers partnered to investigate teacher-selected topics for inquiry. Over two years, 14 case studies were generated involving six…

  6. The Potential of Text Mining in Data Integration and Network Biology for Plant Research: A Case Study on Arabidopsis

    PubMed Central

    Van Landeghem, Sofie; De Bodt, Stefanie; Drebert, Zuzanna J.; Inzé, Dirk; Van de Peer, Yves

    2013-01-01

    Despite the availability of various data repositories for plant research, a wealth of information currently remains hidden within the biomolecular literature. Text mining provides the necessary means to retrieve these data through automated processing of texts. However, only recently has advanced text mining methodology been implemented with sufficient computational power to process texts at a large scale. In this study, we assess the potential of large-scale text mining for plant biology research in general and for network biology in particular, using a state-of-the-art text mining system applied to all PubMed abstracts and PubMed Central full texts. We present an extensive evaluation of the textual data for Arabidopsis thaliana, assessing the overall accuracy of this new resource for usage in plant network analyses. Furthermore, we combine text mining information with both protein–protein and regulatory interactions from experimental databases. Clusters of tightly connected genes are delineated from the resulting network, illustrating how such an integrative approach is essential to grasp the current knowledge available for Arabidopsis and to uncover gene information through guilt by association. All large-scale data sets, as well as the manually curated textual data, are made publicly available, thereby stimulating the application of text mining data in future plant biology studies. PMID:23532071
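
    The "guilt by association" step mentioned above can be pictured with a toy graph example (a hypothetical sketch; the gene names and edges below are illustrative and are not taken from the study):

```python
# Hedged sketch: tightly connected clusters in an integrated interaction network
# suggest shared function for their members ("guilt by association").
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("GENE_A", "GENE_B"),   # e.g. text-mined co-occurrence
    ("GENE_B", "GENE_C"),   # e.g. experimental protein-protein interaction
    ("GENE_C", "GENE_A"),   # e.g. regulatory interaction
    ("GENE_D", "GENE_C"),
])
for clique in nx.find_cliques(G):
    if len(clique) >= 3:
        print("candidate functional module:", sorted(clique))
```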

  7. The growth of radiative filamentation modes in sheared magnetic fields

    NASA Technical Reports Server (NTRS)

    Vanhoven, Gerard

    1986-01-01

    Observations of prominences show them to require well-developed magnetic shear and to have complex small-scale structure. Researchers show here that these features are reflected in the results of the theory of radiative condensation. Researchers studied, in particular, the influence of the nominally negligible contributions of perpendicular (to B) thermal conduction. They find a large number of unstable modes, with closely spaced growth rates. Their scale widths across B show a wide range of longitudinal and transverse sizes, ranging from much larger than to much smaller than the magnetic shear scale, the latter characterization applying particularly in the direction of shear variation.

  8. 5 CFR 9301.7 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... and that operates solely for the purpose of conducting scientific research the results of which are... employees who perform the work and costs of conducting large-scale computer searches. (c) Duplicate means to... education, that operates a program or programs of scholarly research. (e) Fee category means one of the...

  9. 5 CFR 9301.7 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... and that operates solely for the purpose of conducting scientific research the results of which are... employees who perform the work and costs of conducting large-scale computer searches. (c) Duplicate means to... education, that operates a program or programs of scholarly research. (e) Fee category means one of the...

  10. Encouraging Gender Analysis in Research Practice

    ERIC Educational Resources Information Center

    Thien, Deborah

    2009-01-01

    Few resources for practical teaching or fieldwork exercises exist which address gender in geographical contexts. This paper adds to teaching and fieldwork resources by describing an experience with designing and implementing a "gender intervention" for a large-scale, multi-university, bilingual research project that brought together a group of…

  11. Academic-industrial partnerships in drug discovery in the age of genomics.

    PubMed

    Harris, Tim; Papadopoulos, Stelios; Goldstein, David B

    2015-06-01

    Many US FDA-approved drugs have been developed through productive interactions between the biotechnology industry and academia. Technological breakthroughs in genomics, in particular large-scale sequencing of human genomes, are creating new opportunities to understand the biology of disease and to identify high-value targets relevant to a broad range of disorders. However, the scale of the work required to appropriately analyze large genomic and clinical data sets is challenging industry to develop a broader view of what areas of work constitute precompetitive research. Copyright © 2015 Elsevier Ltd. All rights reserved.

  12. Federated queries of clinical data repositories: Scaling to a national network.

    PubMed

    Weber, Griffin M

    2015-06-01

    Federated networks of clinical research data repositories are rapidly growing in size from a handful of sites to true national networks with more than 100 hospitals. This study creates a conceptual framework for predicting how various properties of these systems will scale as they continue to expand. Starting with actual data from Harvard's four-site Shared Health Research Information Network (SHRINE), the framework is used to imagine a future 4000 site network, representing the majority of hospitals in the United States. From this it becomes clear that several common assumptions of small networks fail to scale to a national level, such as all sites being online at all times or containing data from the same date range. On the other hand, a large network enables researchers to select subsets of sites that are most appropriate for particular research questions. Developers of federated clinical data networks should be aware of how the properties of these networks change at different scales and design their software accordingly. Copyright © 2015 Elsevier Inc. All rights reserved.
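
    A back-of-the-envelope calculation (not taken from the paper) shows why the "all sites online" assumption collapses as such a network grows from a handful of hospitals to thousands:

```python
# Hedged sketch: with per-site availability p, the probability that every one of
# n sites answers a federated query is p**n, which vanishes at national scale.
p = 0.99                                   # assumed per-site availability
for n in (4, 100, 4000):                   # four-site network up to a national one
    print(f"n = {n:4d}   P(all sites online) = {p**n:8.3%}   expected sites online = {p * n:,.0f}")
```

    At 4,000 sites the chance that every site responds to a given query is effectively zero, even though roughly 3,960 sites are expected to be available at any moment.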

  13. Education Highlights: Forest Biomass

    ScienceCinema

    Barone, Rachel; Canter, Christina

    2018-06-25

    Argonne intern Rachel Barone from Ithaca College worked with Argonne mentor Christina Canter in studying forest biomass. This research will help scientists develop large scale use of biofuels from forest biomass.

  14. Education Highlights: Forest Biomass

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barone, Rachel; Canter, Christina

    2016-01-27

    Argonne intern Rachel Barone from Ithaca College worked with Argonne mentor Christina Canter in studying forest biomass. This research will help scientists develop large scale use of biofuels from forest biomass.

  15. Why is data sharing in collaborative natural resource efforts so hard and what can we do to improve it?

    PubMed

    Volk, Carol J; Lucero, Yasmin; Barnas, Katie

    2014-05-01

    Increasingly, research and management in natural resource science rely on very large datasets compiled from multiple sources. While it is generally good to have more data, utilizing large, complex datasets has introduced challenges in data sharing, especially for collaborating researchers in disparate locations ("distributed research teams"). We surveyed natural resource scientists about common data-sharing problems. The major issue identified by our survey respondents (n = 118) when providing data was a lack of clarity in the data request (including the format of the data requested). When receiving data, survey respondents reported various insufficiencies in the documentation describing the data (e.g., no data collection description or protocol, or data aggregated or summarized without explanation). Since inadequate metadata, or "information about the data," is a central obstacle to efficient data handling, we suggest documenting metadata through data dictionaries, protocols, read-me files, explicit null-value documentation, and process metadata as essential to any large-scale research program. We advocate that all researchers, but especially those involved in distributed teams, alleviate these problems with the use of several readily available communication strategies, including organizational charts to define roles, data flow diagrams to outline procedures and timelines, and data update cycles to guide data-handling expectations. In particular, we argue that distributed research teams magnify data-sharing challenges, making data management training even more crucial for natural resource scientists. If natural resource scientists fail to overcome communication and metadata documentation issues, negative data-sharing experiences will likely continue to undermine the success of many large-scale collaborative projects.
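
    As a small, hypothetical example of the machine-readable metadata the authors recommend (the field names and conventions below are illustrative, not theirs), a data dictionary can state types, units and explicit null-value codes alongside the data themselves:

```python
# Hedged sketch of a minimal data dictionary with explicit null-value documentation.
data_dictionary = {
    "site_id":     {"type": "str",   "description": "unique monitoring site code", "nulls": "never"},
    "sample_date": {"type": "date",  "format": "YYYY-MM-DD",                       "nulls": "never"},
    "temp_c":      {"type": "float", "units": "degrees Celsius",                   "nulls": "-9999 = sensor failure"},
    "flow_cms":    {"type": "float", "units": "cubic metres per second",           "nulls": "NA = not measured (see protocol v2.1)"},
}

for field, meta in data_dictionary.items():
    print(f"{field:12s} {meta}")
```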

  16. Why is Data Sharing in Collaborative Natural Resource Efforts so Hard and What can We Do to Improve it?

    NASA Astrophysics Data System (ADS)

    Volk, Carol J.; Lucero, Yasmin; Barnas, Katie

    2014-05-01

    Increasingly, research and management in natural resource science rely on very large datasets compiled from multiple sources. While it is generally good to have more data, utilizing large, complex datasets has introduced challenges in data sharing, especially for collaborating researchers in disparate locations ("distributed research teams"). We surveyed natural resource scientists about common data-sharing problems. The major issue identified by our survey respondents (n = 118) when providing data was a lack of clarity in the data request (including the format of the data requested). When receiving data, survey respondents reported various insufficiencies in the documentation describing the data (e.g., no data collection description or protocol, or data aggregated or summarized without explanation). Since inadequate metadata, or "information about the data," is a central obstacle to efficient data handling, we suggest documenting metadata through data dictionaries, protocols, read-me files, explicit null-value documentation, and process metadata as essential to any large-scale research program. We advocate that all researchers, but especially those involved in distributed teams, alleviate these problems with the use of several readily available communication strategies, including organizational charts to define roles, data flow diagrams to outline procedures and timelines, and data update cycles to guide data-handling expectations. In particular, we argue that distributed research teams magnify data-sharing challenges, making data management training even more crucial for natural resource scientists. If natural resource scientists fail to overcome communication and metadata documentation issues, negative data-sharing experiences will likely continue to undermine the success of many large-scale collaborative projects.

  17. Channel optimization of high-intensity laser beams in millimeter-scale plasmas.

    PubMed

    Ceurvorst, L; Savin, A; Ratan, N; Kasim, M F; Sadler, J; Norreys, P A; Habara, H; Tanaka, K A; Zhang, S; Wei, M S; Ivancic, S; Froula, D H; Theobald, W

    2018-04-01

    Channeling experiments were performed at the OMEGA EP facility using relativistic intensity (>10^{18}W/cm^{2}) kilojoule laser pulses through large density scale length (∼390-570 μm) laser-produced plasmas, demonstrating the effects of the pulse's focal location and intensity as well as the plasma's temperature on the resulting channel formation. The results show deeper channeling when focused into hot plasmas and at lower densities, as expected. However, contrary to previous large-scale particle-in-cell studies, the results also indicate deeper penetration by short (10 ps), intense pulses compared to their longer-duration equivalents. This new observation has many implications for future laser-plasma research in the relativistic regime.

  18. Channel optimization of high-intensity laser beams in millimeter-scale plasmas

    NASA Astrophysics Data System (ADS)

    Ceurvorst, L.; Savin, A.; Ratan, N.; Kasim, M. F.; Sadler, J.; Norreys, P. A.; Habara, H.; Tanaka, K. A.; Zhang, S.; Wei, M. S.; Ivancic, S.; Froula, D. H.; Theobald, W.

    2018-04-01

    Channeling experiments were performed at the OMEGA EP facility using relativistic intensity (>10^18 W/cm^2) kilojoule laser pulses through large density scale length (~390-570 μm) laser-produced plasmas, demonstrating the effects of the pulse's focal location and intensity as well as the plasma's temperature on the resulting channel formation. The results show deeper channeling when focused into hot plasmas and at lower densities, as expected. However, contrary to previous large-scale particle-in-cell studies, the results also indicate deeper penetration by short (10 ps), intense pulses compared to their longer-duration equivalents. This new observation has many implications for future laser-plasma research in the relativistic regime.

  19. Large Field Photogrammetry Techniques in Aircraft and Spacecraft Impact Testing

    NASA Technical Reports Server (NTRS)

    Littell, Justin D.

    2010-01-01

    The Landing and Impact Research Facility (LandIR) at NASA Langley Research Center is a 240 ft. high A-frame structure which is used for full-scale crash testing of aircraft and rotorcraft vehicles. Because the LandIR provides a unique capability to introduce impact velocities in the forward and vertical directions, it is also serving as the facility for landing tests on full-scale and sub-scale Orion spacecraft mass simulators. Recently, a three-dimensional photogrammetry system was acquired to assist with the gathering of vehicle flight data before, throughout and after the impact. This data provides the basis for the post-test analysis and data reduction. Experimental setups for pendulum swing tests on vehicles having both forward and vertical velocities can extend to 50 x 50 x 50 foot cubes, while weather, vehicle geometry, and other constraints make each experimental setup unique to each test. This paper will discuss the specific calibration techniques for large fields of views, camera and lens selection, data processing, as well as best practice techniques learned from using the large field of view photogrammetry on a multitude of crash and landing test scenarios unique to the LandIR.

  20. Analysis of the ability of large-scale reanalysis data to define Siberian fire danger in preparation for future fire prediction

    NASA Astrophysics Data System (ADS)

    Soja, Amber; Westberg, David; Stackhouse, Paul, Jr.; McRae, Douglas; Jin, Ji-Zhong; Sukhinin, Anatoly

    2010-05-01

    Fire is the dominant disturbance that precipitates ecosystem change in boreal regions, and fire is largely under the control of weather and climate. Fire frequency, fire severity, area burned and fire season length are predicted to increase in boreal regions under current climate change scenarios. Therefore, changes in fire regimes have the potential to compel ecological change, moving ecosystems more quickly towards equilibrium with a new climate. The ultimate goal of this research is to assess the viability of large-scale (1°) data for defining fire weather danger and fire regimes, so that large-scale fire weather data, like those available from current Intergovernmental Panel on Climate Change (IPCC) climate change scenarios, can be confidently used to predict future fire regimes. In this talk, we intend to: (1) evaluate Fire Weather Indices (FWI) derived using reanalysis and interpolated station data; (2) discuss the advantages and disadvantages of using these distinct data sources; and (3) highlight established relationships between large-scale fire weather data, area burned, active fires and ecosystems burned. Specifically, the Canadian Forestry Service (CFS) Fire Weather Index (FWI) will be derived using: (1) NASA Goddard Earth Observing System version 4 (GEOS-4) large-scale reanalysis and NASA Global Precipitation Climatology Project (GPCP) data; and (2) National Climatic Data Center (NCDC) surface station-interpolated data. The FWI requires local-noon surface-level air temperature, relative humidity, wind speed, and daily (noon-to-noon) rainfall. GEOS-4 reanalysis and NCDC station-interpolated fire weather indices are generally consistent spatially, temporally and quantitatively. Additionally, increased fire activity coincides with increased FWI ratings in both data products. Relationships have been established between large-scale FWI and area burned, fire frequency and ecosystem types, and these can be used to estimate historic and future fire regimes.

  1. Computational methods to extract meaning from text and advance theories of human cognition.

    PubMed

    McNamara, Danielle S

    2011-01-01

    Over the past two decades, researchers have made great advances in the area of computational methods for extracting meaning from text. This research has to a large extent been spurred by the development of latent semantic analysis (LSA), a method for extracting and representing the meaning of words using statistical computations applied to large corpora of text. Since the advent of LSA, researchers have developed and tested alternative statistical methods designed to detect and analyze meaning in text corpora. This research exemplifies how statistical models of semantics play an important role in our understanding of cognition and contribute to the field of cognitive science. Importantly, these models afford large-scale representations of human knowledge and allow researchers to explore various questions regarding knowledge, discourse processing, text comprehension, and language. This topic includes the latest progress by the leading researchers in the endeavor to go beyond LSA. Copyright © 2010 Cognitive Science Society, Inc.
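
    A minimal sketch of the LSA idea referenced above (illustrative only; the documents and parameter choices are made up): build a term-document matrix and reduce it with a truncated SVD so that semantically related documents end up close together.

```python
# Hedged LSA sketch: truncated SVD over a tiny TF-IDF term-document matrix.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

docs = [
    "the cat sat on the mat",
    "a cat chased a small mouse",
    "stock markets fell sharply today",
    "investors sold shares as markets dropped",
]
X = TfidfVectorizer().fit_transform(docs)                              # sparse term-document matrix
doc_vectors = TruncatedSVD(n_components=2, random_state=0).fit_transform(X)
print(cosine_similarity(doc_vectors))                                  # related documents score higher
```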

  2. Simulating the impact of the large-scale circulation on the 2-m temperature and precipitation climatology

    NASA Astrophysics Data System (ADS)

    Bowden, Jared H.; Nolte, Christopher G.; Otte, Tanya L.

    2013-04-01

    The impact of the simulated large-scale atmospheric circulation on the regional climate is examined using the Weather Research and Forecasting (WRF) model as a regional climate model. The purpose is to understand the potential need for interior grid nudging for dynamical downscaling of global climate model (GCM) output for air quality applications under a changing climate. In this study we downscale the NCEP-Department of Energy Atmospheric Model Intercomparison Project (AMIP-II) Reanalysis using three continuous 20-year WRF simulations: one simulation without interior grid nudging and two using different interior grid nudging methods. The biases in 2-m temperature and precipitation for the simulation without interior grid nudging are unreasonably large with respect to the North American Regional Reanalysis (NARR) over the eastern half of the contiguous United States (CONUS) during the summer when air quality concerns are most relevant. This study examines how these differences arise from errors in predicting the large-scale atmospheric circulation. It is demonstrated that the Bermuda high, which strongly influences the regional climate for much of the eastern half of the CONUS during the summer, is poorly simulated without interior grid nudging. In particular, two summers when the Bermuda high was west (1993) and east (2003) of its climatological position are chosen to illustrate problems in the large-scale atmospheric circulation anomalies. For both summers, WRF without interior grid nudging fails to simulate the placement of the upper-level anticyclonic (1993) and cyclonic (2003) circulation anomalies. The displacement of the large-scale circulation impacts the lower atmosphere moisture transport and precipitable water, affecting the convective environment and precipitation. Using interior grid nudging improves the large-scale circulation aloft and moisture transport/precipitable water anomalies, thereby improving the simulated 2-m temperature and precipitation. The results demonstrate that constraining the RCM to the large-scale features in the driving fields improves the overall accuracy of the simulated regional climate, and suggest that in the absence of such a constraint, the RCM will likely misrepresent important large-scale shifts in the atmospheric circulation under a future climate.

  3. The role of large eddy fluctuations in the magnetic dynamics of the Madison Dynamo Experiment

    NASA Astrophysics Data System (ADS)

    Kaplan, Elliot

    The Madison Dynamo Experiment (MDE), a liquid sodium magnetohydrodynamics experiment in a 1 m diameter sphere at the University of Wisconsin-Madison, measured [Spence et al., 2006] diamagnetic electrical currents in the experiment that violated an anti-dynamo theorem for axisymmetric flow. The diamagnetic currents were instead attributed to nonaxisymmetric turbulent fluctuations. The experimental apparatus has been modified to reduce the strength of the large-scale turbulence driven by the shear layer in its flow: a 7.62 cm baffle was affixed to the equator of the machine to stabilize the shear layer. This reduction has correlated with a decrease in the flow-induced magnetic fields that had been associated with the α and β effects of mean-field magnetohydrodynamics. The research presented herein documents the experimental evidence for reduced fluctuations and reduced mean-field emfs, and provides a theoretical framework, based upon mean-field MHD, that connects the observations. The shapes of the large-scale velocity fluctuations are inferred from the spectra of induced magnetic fluctuations and measured in a kinematically similar water experiment. The Bullard and Gellman [1954] formalism demonstrates that the large-scale velocity fluctuations that are inhibited by the baffle can beat with the large-scale magnetic fluctuations that they produce to generate a mean-field emf of the sort measured in Spence et al. [2006]. This shows that the reduction of these large-scale eddies has brought the MDE closer to exciting a dynamo magnetic field. We also examine the mean-field-like effects of large-scale (stable) eddies in the Dudley-James [1989] two-vortex dynamo (on which the MDE was based). Rotating the axis of symmetry redefines the problem from one of an axisymmetric flow exciting a nonaxisymmetric field to one of a combination of axisymmetric and nonaxisymmetric flows exciting a predominantly axisymmetric magnetic eigenmode. As a result, specific interactions between large-scale velocity modes and large-scale magnetic modes are shown to correspond to the Ω effect and the mean-field α and β effects.

  4. Establishing User Needs--A Large-Scale Study into the Requirements of Those Involved in the Research Process

    ERIC Educational Resources Information Center

    Grimshaw, Shirley; Wilson, Ian

    2009-01-01

    The aim of the project was to develop a set of online tools, systems and processes that would facilitate research at the University of Nottingham. The tools would be delivered via a portal, a one-stop place providing a Virtual Research Environment for all those involved in the research process. A predominantly bottom-up approach was used with…

  5. Scaling, soil moisture and evapotranspiration in runoff models

    NASA Technical Reports Server (NTRS)

    Wood, Eric F.

    1993-01-01

    The effects of small-scale heterogeneity in land surface characteristics on the large-scale fluxes of water and energy in the land-atmosphere system have become a central focus of many of the climatology research experiments. The acquisition of high resolution land surface data through remote sensing and intensive land-climatology field experiments (like HAPEX and FIFE) has provided data to investigate the interactions between microscale land-atmosphere interactions and macroscale models. One essential research question is how to account for the small-scale heterogeneities and whether 'effective' parameters can be used in the macroscale models. To address this question of scaling, the probability distribution for evaporation is derived, which illustrates the conditions under which scaling should work. A correction algorithm that may be appropriate for the land parameterization of a GCM is derived using a 2nd-order linearization scheme. The performance of the algorithm is evaluated.
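
    The flavor of such a second-order correction can be shown with a toy calculation (a hedged sketch under assumed forms, not the report's algorithm): for a nonlinear flux f(s), the areal mean E[f(s)] differs from f(E[s]), and a Taylor expansion supplies the leading correction, E[f(s)] ≈ f(E[s]) + ½ f''(E[s]) Var(s).

```python
# Hedged sketch: second-order correction for subgrid heterogeneity in a nonlinear flux.
import numpy as np

f = lambda s: s**2 / (s**2 + 0.04)                       # toy nonlinear evaporation efficiency
rng = np.random.default_rng(1)
s = np.clip(rng.normal(0.3, 0.1, 100_000), 0.0, 1.0)     # heterogeneous subgrid soil moisture

mean, var = s.mean(), s.var()
h = 1e-4
f2 = (f(mean + h) - 2 * f(mean) + f(mean - h)) / h**2    # numerical second derivative

print("true areal mean flux      :", f(s).mean())
print("naive 'effective' estimate:", f(mean))            # ignores subgrid heterogeneity
print("second-order corrected    :", f(mean) + 0.5 * f2 * var)
```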

  6. Bridging the Science/Policy Gap through Boundary Chain Partnerships and Communities of Practice

    NASA Astrophysics Data System (ADS)

    Kalafatis, S.

    2014-12-01

    Generating the capacity to facilitate the informed usage of climate change science by decision makers on a large scale is fast becoming an area of great concern. While research demonstrates that sustained interactions between producers of such information and potential users can overcome barriers to information usage, it also demonstrates the high resource demand of these efforts. Our social science work at Great Lakes Integrated Sciences and Assessments (GLISA) sheds light on scaling up the usability of climate science through two research areas. The first focuses on partnerships with other boundary organizations that GLISA has leveraged - the "boundary chains" approach. These partnerships reduce the transaction costs involved with outreach and have enhanced the scope of GLISA's climate service efforts to encompass new users such as First Nations groups in Wisconsin and Michigan and underserved neighborhoods in St. Paul, Minnesota. The second research area looks at the development of information usability across the regional scale of the eight Great Lakes states. It has identified the critical role that communities of practice are playing in making information usable to large groups of users who work in similar contexts and have similar information needs. Both these research areas demonstrate the emerging potential of flexible knowledge networks to enhance society's ability to prepare for the impacts of climate change.

  7. Nanoliter-Scale Protein Crystallization and Screening with a Microfluidic Droplet Robot

    PubMed Central

    Zhu, Ying; Zhu, Li-Na; Guo, Rui; Cui, Heng-Jun; Ye, Sheng; Fang, Qun

    2014-01-01

    Large-scale screening of hundreds or even thousands of crystallization conditions with low sample consumption is urgently needed in current structural biology research. Here we describe a fully automated droplet robot for nanoliter-scale crystallization screening that combines the advantages of automated robotics for protein crystallization screening with those of droplet-based microfluidics. A semi-contact dispensing method was developed to achieve flexible, programmable and reliable liquid-handling operations for nanoliter-scale protein crystallization experiments. We applied the droplet robot to large-scale screening of crystallization conditions of five soluble proteins and one membrane protein with 35–96 different crystallization conditions, to the study of volume effects on protein crystallization, and to the determination of phase diagrams of two proteins. The volume of each droplet reactor is only ca. 4–8 nL. Protein consumption is reduced 50–500-fold compared with current crystallization stations. PMID:24854085

  8. Nanoliter-scale protein crystallization and screening with a microfluidic droplet robot.

    PubMed

    Zhu, Ying; Zhu, Li-Na; Guo, Rui; Cui, Heng-Jun; Ye, Sheng; Fang, Qun

    2014-05-23

    Large-scale screening of hundreds or even thousands of crystallization conditions with low sample consumption is urgently needed in current structural biology research. Here we describe a fully automated droplet robot for nanoliter-scale crystallization screening that combines the advantages of automated robotics for protein crystallization screening with those of droplet-based microfluidics. A semi-contact dispensing method was developed to achieve flexible, programmable and reliable liquid-handling operations for nanoliter-scale protein crystallization experiments. We applied the droplet robot to large-scale screening of crystallization conditions of five soluble proteins and one membrane protein with 35-96 different crystallization conditions, to the study of volume effects on protein crystallization, and to the determination of phase diagrams of two proteins. The volume of each droplet reactor is only ca. 4-8 nL. Protein consumption is reduced 50-500-fold compared with current crystallization stations.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marrinan, Thomas; Leigh, Jason; Renambot, Luc

    Mixed presence collaboration involves remote collaboration between multiple collocated groups. This paper presents the design and results of a user study that focused on mixed presence collaboration using large-scale tiled display walls. The research was conducted in order to compare data synchronization schemes for multi-user visualization applications. Our study compared three techniques for sharing data between display spaces with varying constraints and affordances. The results provide empirical evidence that using data sharing techniques with continuous synchronization between the sites leads to improved collaboration for a search and analysis task between remotely located groups. We have also identified aspects of synchronized sessions that result in increased remote collaborator awareness and parallel task coordination. It is believed that this research will lead to better utilization of large-scale tiled display walls for distributed group work.

  10. Klamath Basin: A Watershed Approach to Support Habitat Restoration, Species Recovery, and Water Resource Planning

    USGS Publications Warehouse

    VanderKooi, S.P.; Thorsteinson, L.

    2007-01-01

    Water allocation among human and natural resource uses in the American West is challenging. Western rivers have been largely managed for hydropower, irrigation, drinking water, and navigation. Today, land and water use practices have gained importance, particularly as aging dams face re-licensing requirements and provisions of the Endangered Species and Clean Water Acts. Rising demand for scarce water heightens the need for scientific research to predict the consequences of management actions on habitats, human resource use, and fish and wildlife. Climate change, the introduction of invasive species, or the restoration of fish passage can have large, landscape-scale consequences; research must expand to encompass the appropriate scale and, by applying multiple scientific disciplines to complex ecosystem challenges, improve the adaptive management framework for decision-making.

  11. CROSS-DISCIPLINARY PHYSICS AND RELATED AREAS OF SCIENCE AND TECHNOLOGY: Novel Route to Fabrication of Metal-Sandwiched Nanoscale Tapered Structures

    NASA Astrophysics Data System (ADS)

    Zhang, Yang; Yu, Da-Peng

    2009-08-01

    Tapered dielectric structures in metal have exhibited extraordinary performance in both surface plasmon polariton (SPP) waveguiding and SPP focusing. This is crucial to plasmonic research and industrial plasmonic device integration. We present a method that facilitates easy fabrication of smooth-surfaced sub-micron tapered structures in large scale simply with electron beam lithography (EBL). When a PMMA layer is spin-coated on previously-EBL-defined PMMA structures, steep edges can be transformed into a declining slope to form tapered PMMA structures, scaled from 10 nm to 1000 nm. Despite the simplicity of our method, patterns with PMMA surface smoothness can be well-positioned and replicated in large numbers, which therefore gives scientists easy access to research on the properties of tapered structures.

  12. ACTIVIS: Visual Exploration of Industry-Scale Deep Neural Network Models.

    PubMed

    Kahng, Minsuk; Andrews, Pierre Y; Kalro, Aditya; Polo Chau, Duen Horng

    2017-08-30

    While deep learning models have achieved state-of-the-art accuracies for many prediction tasks, understanding these models remains a challenge. Despite the recent interest in developing visual tools to help users interpret deep learning models, the complexity and wide variety of models deployed in industry, and the large-scale datasets that they used, pose unique design challenges that are inadequately addressed by existing work. Through participatory design sessions with over 15 researchers and engineers at Facebook, we have developed, deployed, and iteratively improved ACTIVIS, an interactive visualization system for interpreting large-scale deep learning models and results. By tightly integrating multiple coordinated views, such as a computation graph overview of the model architecture, and a neuron activation view for pattern discovery and comparison, users can explore complex deep neural network models at both the instance- and subset-level. ACTIVIS has been deployed on Facebook's machine learning platform. We present case studies with Facebook researchers and engineers, and usage scenarios of how ACTIVIS may work with different models.

  13. Development of Dynamic Flow Field Pressure Probes Suitable for Use in Large Scale Supersonic Wind Tunnels

    NASA Technical Reports Server (NTRS)

    Porro, A. Robert

    2000-01-01

    A series of dynamic flow field pressure probes were developed for use in large-scale supersonic wind tunnels at NASA Glenn Research Center. These flow field probes include pitot, static, and five-hole conical pressure probes that are capable of capturing fast-acting flow field pressure transients that occur on a millisecond time scale. The pitot and static probes can be used to determine local Mach number time histories during a transient event. The five-hole conical pressure probes are used primarily to determine local flow angularity, but can also determine local Mach number. These probes were designed, developed, and tested at the NASA Glenn Research Center. They were also used in a NASA Glenn 10- by 10-Foot Supersonic Wind Tunnel (SWT) test program where they successfully acquired flow field pressure data in the vicinity of a propulsion system during an engine compressor stall and inlet unstart transient event. Details of the design, development, and subsequent use of these probes are discussed in this report.

  14. Improving Design Efficiency for Large-Scale Heterogeneous Circuits

    NASA Astrophysics Data System (ADS)

    Gregerson, Anthony

    Despite increases in logic density, many Big Data applications must still be partitioned across multiple computing devices in order to meet their strict performance requirements. Among the most demanding of these applications is high-energy physics (HEP), which uses complex computing systems consisting of thousands of FPGAs and ASICs to process the sensor data created by experiments at particle accelerators such as the Large Hadron Collider (LHC). Designing such computing systems is challenging due to the scale of the systems, the exceptionally high-throughput and low-latency performance constraints that necessitate application-specific hardware implementations, the requirement that algorithms be efficiently partitioned across many devices, and the possible need to update the implemented algorithms during the lifetime of the system. In this work, we describe our research to develop flexible architectures for implementing such large-scale circuits on FPGAs. In particular, this work is motivated by (but not limited in scope to) high-energy physics algorithms for the Compact Muon Solenoid (CMS) experiment at the LHC. To make efficient use of logic resources in multi-FPGA systems, we introduce Multi-Personality Partitioning, a novel form of the graph partitioning problem, and present partitioning algorithms that can significantly improve resource utilization on heterogeneous devices while also reducing inter-chip connections. To reduce the high communication costs of Big Data applications, we also introduce Information-Aware Partitioning, a partitioning method that analyzes the data content of application-specific circuits, characterizes their entropy, and selects circuit partitions that enable efficient compression of data between chips. We employ our information-aware partitioning method to improve the performance of the hardware validation platform for evaluating new algorithms for the CMS experiment. Together, these research efforts help to improve the efficiency and decrease the cost of developing the large-scale, heterogeneous circuits needed to enable large-scale applications in high-energy physics and other important areas.
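
    The intuition behind information-aware partitioning can be sketched with a toy entropy calculation (hypothetical, not the dissertation's algorithm): signals whose traffic has low empirical entropy compress well, so cuts through them are cheaper.

```python
# Hedged sketch: compare the empirical entropy of two candidate cut signals.
import math
from collections import Counter

def entropy_bits(samples):
    """Empirical Shannon entropy in bits per symbol."""
    counts = Counter(samples)
    total = len(samples)
    return -sum(c / total * math.log2(c / total) for c in counts.values())

sparse_trigger = [0] * 950 + [1] * 50      # mostly idle trigger bit: compresses well
dense_counter  = list(range(256)) * 4      # near-uniform counter: barely compressible

for name, trace in (("sparse trigger", sparse_trigger), ("dense counter", dense_counter)):
    print(f"{name:15s} entropy = {entropy_bits(trace):.2f} bits/symbol")
```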

  15. Human3.6M: Large Scale Datasets and Predictive Methods for 3D Human Sensing in Natural Environments.

    PubMed

    Ionescu, Catalin; Papava, Dragos; Olaru, Vlad; Sminchisescu, Cristian

    2014-07-01

    We introduce a new dataset, Human3.6M, of 3.6 Million accurate 3D Human poses, acquired by recording the performance of 5 female and 6 male subjects, under 4 different viewpoints, for training realistic human sensing systems and for evaluating the next generation of human pose estimation models and algorithms. Besides increasing the size of the datasets in the current state-of-the-art by several orders of magnitude, we also aim to complement such datasets with a diverse set of motions and poses encountered as part of typical human activities (taking photos, talking on the phone, posing, greeting, eating, etc.), with additional synchronized image, human motion capture, and time of flight (depth) data, and with accurate 3D body scans of all the subject actors involved. We also provide controlled mixed reality evaluation scenarios where 3D human models are animated using motion capture and inserted using correct 3D geometry, in complex real environments, viewed with moving cameras, and under occlusion. Finally, we provide a set of large-scale statistical models and detailed evaluation baselines for the dataset illustrating its diversity and the scope for improvement by future work in the research community. Our experiments show that our best large-scale model can leverage our full training set to obtain a 20% improvement in performance compared to a training set of the scale of the largest existing public dataset for this problem. Yet the potential for improvement by leveraging higher capacity, more complex models with our large dataset, is substantially vaster and should stimulate future research. The dataset together with code for the associated large-scale learning models, features, visualization tools, as well as the evaluation server, is available online at http://vision.imar.ro/human3.6m.

  16. Sustaining Innovations through Lead Teacher Learning: A Learning Sciences Perspective on Supporting Professional Development

    ERIC Educational Resources Information Center

    Fogleman, Jay; Fishman, Barry; Krajcik, Joe

    2006-01-01

    There is a rich history of researchers developing curricular materials aimed at enhancing student learning in American classrooms. Though many of these innovations have been successful on a small scale, institutionalizing them so they become part of a district's instructional culture has been a challenge. As large districts try to scale up and…

  17. German National Proficiency Scales in Biology: Internal Structure, Relations to General Cognitive Abilities and Verbal Skills

    ERIC Educational Resources Information Center

    Kampa, Nele; Köller, Olaf

    2016-01-01

    National and international large-scale assessments (LSA) have a major impact on educational systems, which raises fundamental questions about the validity of the measures regarding their internal structure and their relations to relevant covariates. Given its importance, research on the validity of instruments specifically developed for LSA is…

  18. An Investigation on Computer-Adaptive Multistage Testing Panels for Multidimensional Assessment

    ERIC Educational Resources Information Center

    Wang, Xinrui

    2013-01-01

    The computer-adaptive multistage testing (ca-MST) has been developed as an alternative to computerized adaptive testing (CAT), and been increasingly adopted in large-scale assessments. Current research and practice only focus on ca-MST panels for credentialing purposes. The ca-MST test mode, therefore, is designed to gauge a single scale. The…

  19. Evaluation of Hydrogel Technologies for the Decontamination ...

    EPA Pesticide Factsheets

    This research effort was developed to evaluate intermediate-level (between bench-scale and large-scale or wide-area implementation) decontamination procedures, materials, technologies, and techniques used to remove radioactive material from different surfaces. In the event of a radiological incident, application of this technology would primarily be intended for decontamination of high-value buildings, important infrastructure, and landmarks.

  20. Mechanisms Affecting the Sustainability and Scale-up of a System-Wide Numeracy Reform

    ERIC Educational Resources Information Center

    Bobis, Janette

    2011-01-01

    With deliberate system-level reform now being acted upon around the world, both successful and unsuccessful cases provide a rich source of knowledge from which we can learn to improve large-scale reform. Research surrounding the effectiveness of a theory-based system-wide numeracy reform operating in primary schools across Australia is examined to…

  1. Satisfaction with Life and Hope: A Look at Age and Marital Status

    ERIC Educational Resources Information Center

    Bailey, Thomas C.; Snyder, C. R.

    2007-01-01

    The Adult Trait Hope Scale (Snyder et al., 1991) typically has been administered to samples of college students, and previous researchers have not explored key demographic variables. In a large sample of community persons who were not in college (N = 215), significant differences were detected in Hope Scale scores across differing age groups and…

  2. Convergence of microclimate in residential landscapes across diverse cities in the United States

    Treesearch

    Sharon J. Hall; J. Learned; B. Ruddell; K.L. Larson; J. Cavender-Bares; N. Bettez; P.M. Groffman; Morgan Grove; J.B. Heffernan; S.E. Hobbie; J.L. Morse; C. Neill; K.C. Nelson; Jarlath O' Neil-Dunne; L. Ogden; D.E. Pataki; W.D. Pearse; C. Polsky; R. Roy Chowdhury; M.K. Steele; T.L.E. Trammell

    2016-01-01

    The urban heat island (UHI) is a well-documented pattern of warming in cities relative to rural areas. Most UHI research utilizes remote sensing methods at large scales, or climate sensors in single cities surrounded by standardized land cover. Relatively few studies have explored continental-scale climatic patterns within common urban microenvironments such as...

  3. Linking Large-Scale Reading Assessments: Comment

    ERIC Educational Resources Information Center

    Hanushek, Eric A.

    2016-01-01

    E. A. Hanushek points out in this commentary that applied researchers in education have only recently begun to appreciate the value of international assessments, even though there are now 50 years of experience with these. Until recently, these assessments have been stand-alone surveys that have not been linked, and analysis has largely focused on…

  4. Using Syntactic Patterns to Enhance Text Analytics

    ERIC Educational Resources Information Center

    Meyer, Bradley B.

    2017-01-01

    Large scale product and service reviews proliferate and are commonly found across the web. The ability to harvest, digest and analyze a large corpus of reviews from online websites is still however a difficult problem. This problem is referred to as "opinion mining." Opinion mining is an important area of research as advances in the…

  5. Faculty Retirement and Recruitment in the Community Colleges

    ERIC Educational Resources Information Center

    Kisker, Carrie B.

    2004-01-01

    Several recent studies have shown that a large-scale turnover is likely to occur among community college instructors over the next several years, as veteran faculty retire in large numbers. As colleges begin to experience this mass retirement, researchers and administrators must consider the side effects of a faculty turnover, especially in…

  6. Large in-stream wood studies: A call for common metrics

    Treesearch

    Ellen Wohl; Daniel A. Cenderelli; Kathleen A. Dwire; Sandra E. Ryan-Burkett; Michael K. Young; Kurt D. Fausch

    2010-01-01

    During the past decade, research on large in-stream wood has expanded beyond North America's Pacific Northwest to diverse environments and has shifted toward increasingly holistic perspectives that incorporate processes of wood recruitment, retention, and loss at scales from channel segments to entire watersheds. Syntheses of this rapidly expanding literature can...

  7. Forced Alignment for Understudied Language Varieties: Testing Prosodylab-Aligner with Tongan Data

    ERIC Educational Resources Information Center

    Johnson, Lisa M.; Di Paolo, Marianna; Bell, Adrian

    2018-01-01

    Automated alignment of transcriptions to audio files expedites the process of preparing data for acoustic analysis. Unfortunately, the benefits of auto-alignment have generally been available only to researchers studying majority languages, for which large corpora exist and for which acoustic models have been created by large-scale research…

  8. Implications of Small Samples for Generalization: Adjustments and Rules of Thumb

    ERIC Educational Resources Information Center

    Tipton, Elizabeth; Hallberg, Kelly; Hedges, Larry V.; Chan, Wendy

    2015-01-01

    Policy-makers are frequently interested in understanding how effective a particular intervention may be for a specific (and often broad) population. In many fields, particularly education and social welfare, the ideal form of these evaluations is a large-scale randomized experiment. Recent research has highlighted that sites in these large-scale…

  9. Class Size: Teachers' Perspectives

    ERIC Educational Resources Information Center

    Watson, Kevin; Handal, Boris; Maher, Marguerite

    2016-01-01

    A consistent body of research shows that large classes have been perceived by teachers as an obstacle to deliver quality teaching. This large-scale study sought to investigate further those differential effects by asking 1,119 teachers from 321 K-12 schools in New South Wales (Australia) their perceptions of ideal class size for a variety of…

  10. A Phenomenology of Learning Large: The Tutorial Sphere of xMOOC Video Lectures

    ERIC Educational Resources Information Center

    Adams, Catherine; Yin, Yin; Vargas Madriz, Luis Francisco; Mullen, C. Scott

    2014-01-01

    The current discourse surrounding Massive Open Online Courses (MOOCs) is powerful. Despite their rapid and widespread deployment, research has yet to confirm or refute some of the bold claims rationalizing the popularity and efficacy of these large-scale virtual learning environments. Also, MOOCs' reputed disruptive, game-changing potential…

  11. A Navy Shore Activity Manpower Planning System for Civilians. Technical Report No. 24.

    ERIC Educational Resources Information Center

    Niehaus, R. J.; Sholtz, D.

    This report describes the U.S. Navy Shore Activity Manpower Planning System (SAMPS) advanced development research project. This effort is aimed at large-scale feasibility tests of manpower models for large Naval installations. These local planning systems are integrated with Navy-wide information systems on a data-communications network accessible…

  12. Upscaling of U(VI) Desorption and Transport from Decimeter-Scale Heterogeneity to Plume-Scale Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Curtis, Gary P.; Kohler, Matthias; Kannappan, Ramakrishnan

    2015-02-24

    Scientifically defensible predictions of field-scale U(VI) transport in groundwater require an understanding of key processes at multiple scales. These scales range from smaller than the sediment grain scale (less than 10 μm) to as large as the field scale, which can extend over several kilometers. The key processes that need to be considered include geochemical reactions in solution and at sediment surfaces as well as physical transport processes, including advection, dispersion, and pore-scale diffusion. The research summarized in this report includes both experimental and modeling results from batch, column and tracer tests. The objectives of this research were to: (1) quantify the rates of U(VI) desorption from sediments acquired from a uranium-contaminated aquifer in batch experiments; (2) quantify rates of U(VI) desorption in column experiments with variable chemical conditions; and (3) quantify nonreactive tracer and U(VI) transport in field tests.
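
    For orientation, the field-scale transport processes listed above are commonly combined in a one-dimensional advection-dispersion balance of the following textbook form (shown here as a generic reference, not the report's specific model):

```latex
% Generic 1-D advection-dispersion-reaction balance for aqueous U(VI) concentration C:
%   R    - retardation factor for equilibrium sorption
%   D    - hydrodynamic dispersion coefficient
%   v    - average pore-water velocity
%   r(C) - rate term for kinetically limited desorption/adsorption
R\,\frac{\partial C}{\partial t}
  \;=\; D\,\frac{\partial^{2} C}{\partial x^{2}}
  \;-\; v\,\frac{\partial C}{\partial x}
  \;+\; r(C)
```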

  13. An investigation of small scales of turbulence in a boundary layer at high Reynolds numbers

    NASA Technical Reports Server (NTRS)

    Wallace, James M.; Ong, L.; Balint, J.-L.

    1993-01-01

    The assumption that turbulence at large wave-numbers is isotropic and has universal spectral characteristics which are independent of the flow geometry, at least for high Reynolds numbers, has been a cornerstone of closure theories as well as of the most promising recent development in the effort to predict turbulent flows, viz. large eddy simulations. This hypothesis was first advanced by Kolmogorov based on the supposition that turbulent kinetic energy cascades down the scales (up the wave-numbers) of turbulence and that, if the number of these cascade steps is sufficiently large (i.e. the wave-number range is large), then the effects of anisotropies at the large scales are lost in the energy transfer process. Experimental attempts were repeatedly made to verify this fundamental assumption. However, Van Atta has recently suggested that an examination of the scalar and velocity gradient fields is necessary to definitively verify this hypothesis or prove it to be unfounded. Of course, this must be carried out in a flow with a Reynolds number high enough to provide the necessary separation of scales and thus unambiguously allow for the possibility of local isotropy at large wave-numbers. An opportunity to use our 12-sensor hot-wire probe to address this issue directly was made available at the 80'x120' wind tunnel at the NASA Ames Research Center, which is normally used for full-scale aircraft tests. An initial report on this high Reynolds number experiment and progress toward its evaluation is presented.
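
    For reference, the universal inertial-range form implied by Kolmogorov's hypothesis is usually written as follows (textbook statement, not taken from the report):

```latex
% Kolmogorov inertial-range spectrum:
%   E(k)        - energy spectrum at wave-number k
%   \varepsilon - mean dissipation rate
%   C_K         - empirical (Kolmogorov) constant
%   L, \eta     - energy-containing and dissipative length scales
E(k) \;=\; C_K\,\varepsilon^{2/3}\,k^{-5/3},
\qquad \frac{1}{L} \,\ll\, k \,\ll\, \frac{1}{\eta}
```

    If local isotropy holds, this form is independent of how energy is injected at the large scales, which is precisely what the measurements described above set out to test.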

  14. A Commercialization Roadmap for Carbon-Negative Energy Systems

    NASA Astrophysics Data System (ADS)

    Sanchez, D.

    2016-12-01

    The Intergovernmental Panel on Climate Change (IPCC) envisages the need for large-scale deployment of net-negative CO2 emissions technologies by mid-century to meet stringent climate mitigation goals and yield a net drawdown of atmospheric carbon. Yet there are few commercial deployments of bioenergy with carbon capture and storage (BECCS) outside of niche markets, creating uncertainty about commercialization pathways and sustainability impacts at scale. This uncertainty is exacerbated by the absence of a strong policy framework, such as high carbon prices and research coordination. Here, we propose a strategy for the potential commercial deployment of BECCS. The roadmap proceeds via three steps: (1) capture and utilization of biogenic CO2 from existing bioenergy facilities, notably ethanol fermentation; (2) thermochemical co-conversion of biomass and fossil fuels, particularly coal; and (3) dedicated, large-scale BECCS. Although biochemical conversion is a proven first market for BECCS, this trajectory alone is unlikely to drive commercialization of BECCS at the gigatonne scale. In contrast to biochemical conversion, thermochemical conversion of coal and biomass enables large-scale production of fuels and electricity with a wide range of carbon intensities, process efficiencies and process scales. Aside from systems integration, the remaining barriers to large-scale biomass logistics, gasification and gas cleaning are primarily technical. Key uncertainties around large-scale BECCS deployment are not limited to commercialization pathways; rather, they include physical constraints on biomass cultivation or CO2 storage, as well as social barriers, including public acceptance of new technologies and conceptions of renewable and fossil energy, which co-conversion systems confound. Despite sustainability risks, this commercialization strategy presents a pathway where energy suppliers, manufacturers and governments could transition from laggards to leaders in climate change mitigation efforts.

  15. Science Information System in Japan. NIER Occasional Paper 02/83.

    ERIC Educational Resources Information Center

    Matsumura, Tamiko

    This paper describes the development of a proposed Japanese Science Information System (SIS), a nationwide network of research and academic libraries, large-scale computer centers, national research institutes, and other organizations, to be formed for the purpose of sharing information and resources in the natural sciences, technology, the…

  16. Partner Power: A Study of Two Distance Education Consortia

    ERIC Educational Resources Information Center

    Pidduck, Anne Banks; Carey, Tom

    2006-01-01

    This research reports findings from a study which explored the process and criteria of partner selection--how and why partners are chosen--for two distance education consortia. The researchers reviewed recent literature on partnerships and partner selection. Two Canada-wide distance education consortia were identified as large-scale case studies…

  17. Combining Language Corpora with Experimental and Computational Approaches for Language Acquisition Research

    ERIC Educational Resources Information Center

    Monaghan, Padraic; Rowland, Caroline F.

    2017-01-01

    Historically, first language acquisition research was a painstaking process of observation, requiring the laborious hand coding of children's linguistic productions, followed by the generation of abstract theoretical proposals for how the developmental process unfolds. Recently, the ability to collect large-scale corpora of children's language…

  18. Moderation and Consistency of Teacher Judgement: Teachers' Views

    ERIC Educational Resources Information Center

    Connolly, Stephen; Klenowski, Valentina; Wyatt-Smith, Claire Maree

    2012-01-01

    Major curriculum and assessment reforms in Australia have generated research interest in issues related to standards, teacher judgement and moderation. This article is based on one related inquiry of a large-scale Australian Research Council Linkage project conducted in Queensland. This qualitative study analysed interview data to identify…

  19. Research and management issues in large-scale fire modeling

    Treesearch

    David L. Peterson; Daniel L. Schmoldt

    2000-01-01

    In 1996, a team of North American fire scientists and resource managers convened to assess the effects of fire disturbance on ecosystems and to develop scientific recommendations for future fire research and management activities. These recommendations - elicited with the Analytic Hierarchy Process - include numerically ranked scientific and managerial questions and...
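    As background on the elicitation method named above (a standard sketch of the Analytic Hierarchy Process, not details taken from this report): each participant fills a pairwise comparison matrix \(A\) with entries \(a_{ij}\approx w_i/w_j\), the priority weights \(w\) are obtained from the principal eigenvector \(Aw=\lambda_{\max}w\), and judgement consistency is screened with the index \(CI=(\lambda_{\max}-n)/(n-1)\).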

  20. Evaluation of the Teaching American History Program

    ERIC Educational Resources Information Center

    Humphrey, Daniel C.; Chang-Ross, Christopher; Donnelly, Mary Beth; Hersh, Lauren; Skolnik, Heidi

    2005-01-01

    Nearly 20 years ago, the first national assessment of student achievement in U.S. history yielded disappointing results. Although policy-makers and researchers expressed great concern about the low scores, the federal government did not undertake large-scale efforts to address poor student performance, and few research dollars were dedicated to…

  1. Identifying Students Difficulties in Understanding Concepts Pertaining to Cell Water Relations: An Exploratory Study.

    ERIC Educational Resources Information Center

    Friedler, Y.; And Others

    This study identified students' conceptual difficulties in understanding concepts and processes associated with cell water relationships (osmosis), determined possible reasons for these difficulties, and pilot-tested instruments and research strategies for a large scale comprehensive study. Research strategies used included content analysis of…

  2. The Ceiling to Coproduction in University-Industry Research Collaboration

    ERIC Educational Resources Information Center

    McCabe, Angela; Parker, Rachel; Cox, Stephen

    2016-01-01

    The purpose of this paper is to provide insight into government attempts at bridging the divide between theory and practice through university-industry research collaboration modelled under engaged scholarship. The findings are based on data sourced from interviews with 47 academic and industry project leaders from 23 large-scale research…

  3. Quake Final Video

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    Critical infrastructures around the world are at constant risk from earthquakes. Most of these critical structures are designed using archaic seismic simulation methods developed on early digital computers in the 1970s. Idaho National Laboratory’s Seismic Research Group is working to modernize these simulation methods through computational research and large-scale laboratory experiments.

  4. The Human Ecology of the American Educational Research Association. Report No. 261.

    ERIC Educational Resources Information Center

    Richards, James M., Jr.

    The concepts and methods of human ecology are applied to the geographic distribution of members of the American Educational Research Association. State characteristics are measured by five factors: (1) large-scale agriculture; (2) population size; (3) affluence-urbanization; (4) white predominance; (5) emphasis on specialized agriculture. City…

  5. Reframing Approaches to Narrating Young People's Conceptualisations of Citizenship in Education Research

    ERIC Educational Resources Information Center

    Akar, Bassel

    2018-01-01

    Large-scale quantitative studies on citizenship and citizenship education research have advanced an international and comparative field of democratic citizenship education. Their instruments, however, informed by theoretical variables constructed in Western Europe and North America mostly measure young people's understandings of a predefined…

  6. Preschool Children, Painting and Palimpsest: Collaboration as Pedagogy, Practice and Learning

    ERIC Educational Resources Information Center

    Cutcher, Alexandra; Boyd, Wendy

    2018-01-01

    This article describes a small, collaborative, arts-based research project conducted in two rural early childhood centres in regional Australia, where the children made large-scale collaborative paintings in partnership with teachers and researchers. Observation of young children's artistic practices, in order to inform the development of…

  7. Probing wind-turbine/atmosphere interactions at utility scale: Novel insights from the EOLOS wind energy research station

    NASA Astrophysics Data System (ADS)

    Hong, J.; Guala, M.; Chamorro, L. P.; Sotiropoulos, F.

    2014-06-01

    Despite major research efforts, the interaction of the atmospheric boundary layer with turbines and multi-turbine arrays at utility scale remains poorly understood today. This lack of knowledge stems from the limited number of utility-scale research facilities and a number of technical challenges associated with obtaining high-resolution measurements at field scale. We review recent results obtained at the University of Minnesota utility-scale wind energy research station (the EOLOS facility), which is comprised of a 130 m tall meteorological tower and a fully instrumented 2.5MW Clipper Liberty C96 wind turbine. The results address three major areas: 1) The detailed characterization of the wake structures at a scale of 36×36 m2 using a novel super-large-scale particle image velocimetry based on natural snowflakes, including the rich tip vortex dynamics and their correlation with turbine operations, control, and performance; 2) The use of a WindCube Lidar profiler to investigate how wind at various elevations influences turbine power fluctuation and elucidate the role of wind gusts on individual blade loading; and 3) The systematic quantification of the interaction between the turbine instantaneous power output and tower foundation strain with the incoming flow turbulence, which is measured from the meteorological tower.

  8. Approaches for advancing scientific understanding of macrosystems

    USGS Publications Warehouse

    Levy, Ofir; Ball, Becky A.; Bond-Lamberty, Ben; Cheruvelil, Kendra S.; Finley, Andrew O.; Lottig, Noah R.; Surangi W. Punyasena,; Xiao, Jingfeng; Zhou, Jizhong; Buckley, Lauren B.; Filstrup, Christopher T.; Keitt, Tim H.; Kellner, James R.; Knapp, Alan K.; Richardson, Andrew D.; Tcheng, David; Toomey, Michael; Vargas, Rodrigo; Voordeckers, James W.; Wagner, Tyler; Williams, John W.

    2014-01-01

    The emergence of macrosystems ecology (MSE), which focuses on regional- to continental-scale ecological patterns and processes, builds upon a history of long-term and broad-scale studies in ecology. Scientists face the difficulty of integrating the many elements that make up macrosystems, which consist of hierarchical processes at interacting spatial and temporal scales. Researchers must also identify the most relevant scales and variables to be considered, the required data resources, and the appropriate study design to provide the proper inferences. The large volumes of multi-thematic data often associated with macrosystem studies typically require validation, standardization, and assimilation. Finally, analytical approaches need to describe how cross-scale and hierarchical dynamics and interactions relate to macroscale phenomena. Here, we elaborate on some key methodological challenges of MSE research and discuss existing and novel approaches to meet them.

  9. Assessing nurse-patient interactions from a caring perspective: report of the development and preliminary psychometric testing of the Caring Nurse--Patient Interactions Scale.

    PubMed

    Cossette, Sylvie; Cara, Chantal; Ricard, Nicole; Pepin, Jacinthe

    2005-08-01

    While there is a large body of literature regarding caring in nursing and some measurement tools addressing the concept have been developed, limitations of existing instruments constrain theory-driven research on nurse-patient interactions. The purpose of this paper is to describe the development and initial psychometric evaluation of the Caring Nurse-Patient Interactions Scale in a sample of 332 nurses and nursing students. The tool is intended to facilitate research on the links between caring and patient outcomes. A content validity approach involving 13 expert nurses resulted in a 70-item tool sub-divided into 10 nursing carative factors. Alpha coefficients for the sub-scales varied from .73 to .91 and inter-correlations between sub-scales ranged from .53 to .89. Pearson correlation coefficients between the sub-scales and social desirability ranged from -.02 to .32, suggesting low to moderate bias. Results of the contrasted group approach partially supported the hypotheses, although all differences were in the expected direction. Results suggest that the scale has strong potential for use in research, clinical and educational settings.
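    The sub-scale alpha coefficients reported above are presumably internal-consistency (Cronbach's alpha) estimates; for a sub-scale of \(k\) items with item variances \(\sigma_i^2\) and total-score variance \(\sigma_X^2\), the standard form is
    $$\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k}\sigma_i^2}{\sigma_X^2}\right).$$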

  10. Tapping The Sun's Energy

    ERIC Educational Resources Information Center

    Lee, David G.

    1974-01-01

    Describes several successful attempts to utilize solar energy for heating and providing electrical energy for homes. Indicates that more research and development are needed, especially in the area of large scale usage. (SLH)

  11. Change in Weather Research and Forecasting (WRF) Model Accuracy with Age of Input Data from the Global Forecast System (GFS)

    DTIC Science & Technology

    2016-09-01

    Report by JL Cogan examining how Weather Research and Forecasting (WRF) model accuracy changes with the age of the Global Forecast System (GFS) data used as input. As expected, accuracy generally tended to decline as the large-scale data aged, although slight improvements appeared in some cases. Table 7 of the report gives minimum and maximum mean RMDs for each WRF time (or GFS data age) category.

  12. THE MARK I BUSINESS SYSTEM SIMULATION MODEL

    DTIC Science & Technology

    This document reports on the use of a large-scale business simulation model as a vehicle for doing research in management controls. The major results of the program were the development of the Mark I business simulation model and the Simulation Package (SIMPAC). SIMPAC is a method and set of programs facilitating the construction of large simulation models. The object of this document is to describe the Mark I Corporation model, state why parts of the business were modeled as they were, and indicate the research applications of the model. (Author)

  13. Pragmatic clinical trials embedded in healthcare systems: generalizable lessons from the NIH Collaboratory.

    PubMed

    Weinfurt, Kevin P; Hernandez, Adrian F; Coronado, Gloria D; DeBar, Lynn L; Dember, Laura M; Green, Beverly B; Heagerty, Patrick J; Huang, Susan S; James, Kathryn T; Jarvik, Jeffrey G; Larson, Eric B; Mor, Vincent; Platt, Richard; Rosenthal, Gary E; Septimus, Edward J; Simon, Gregory E; Staman, Karen L; Sugarman, Jeremy; Vazquez, Miguel; Zatzick, Douglas; Curtis, Lesley H

    2017-09-18

    The clinical research enterprise is not producing the evidence decision makers arguably need in a timely and cost-effective manner; research currently involves the use of labor-intensive parallel systems that are separate from clinical care. The emergence of pragmatic clinical trials (PCTs) poses a possible solution: these large-scale trials are embedded within routine clinical care and often involve cluster randomization of hospitals, clinics, primary care providers, etc. Interventions can be implemented by health system personnel through usual communication channels and quality improvement infrastructure, and data collected as part of routine clinical care. However, experience with these trials is nascent and best practices regarding design, operational, analytic, and reporting methodologies are undeveloped. To strengthen the national capacity to implement cost-effective, large-scale PCTs, the Common Fund of the National Institutes of Health created the Health Care Systems Research Collaboratory (Collaboratory) to support the design, execution, and dissemination of a series of demonstration projects using a pragmatic research design. In this article, we describe the Collaboratory, highlight some of the challenges encountered and solutions developed thus far, and discuss remaining barriers and opportunities for large-scale evidence generation using PCTs. A planning phase is critical, and even with careful planning, new challenges arise during execution; comparisons between arms can be complicated by unanticipated changes. Early and ongoing engagement with both health care system leaders and front-line clinicians is critical for success. There is also marked uncertainty when applying existing ethical and regulatory frameworks to PCTs, and using existing electronic health records for data capture adds complexity.
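    Because PCTs typically randomize whole clusters (hospitals, clinics, providers) rather than individual patients, sample-size planning usually inflates the individually randomized requirement by the standard design effect (a general formula, not one stated in this article):
    $$\mathrm{DEFF} = 1 + (m-1)\,\rho,$$
    where \(m\) is the average cluster size and \(\rho\) the intracluster correlation coefficient, so the effective sample size is roughly \(n/\mathrm{DEFF}\).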

  14. Steps Towards Understanding Large-scale Deformation of Gas Hydrate-bearing Sediments

    NASA Astrophysics Data System (ADS)

    Gupta, S.; Deusner, C.; Haeckel, M.; Kossel, E.

    2016-12-01

    Marine sediments bearing gas hydrates are typically characterized by heterogeneity in the gas hydrate distribution and anisotropy in the sediment-gas hydrate fabric properties. Gas hydrates also contribute to the strength and stiffness of the marine sediment, and any disturbance in the thermodynamic stability of the gas hydrates is likely to affect the geomechanical stability of the sediment. Understanding mechanisms and triggers of large-strain deformation and failure of marine gas hydrate-bearing sediments is an area of extensive research, particularly in the context of marine slope stability and industrial gas production. The ultimate objective is to predict severe deformation events such as regional-scale slope failure or excessive sand production by using numerical simulation tools. The development of such tools essentially requires a careful analysis of the thermo-hydro-chemo-mechanical behavior of gas hydrate-bearing sediments at lab scale, and its stepwise integration into reservoir-scale simulators through the definition of effective variables, use of suitable constitutive relations, and application of scaling laws. One of the focus areas of our research is to understand the bulk coupled behavior of marine gas hydrate systems, with contributions from micro-scale characteristics, transport-reaction dynamics, and structural heterogeneity, through experimental flow-through studies using high-pressure triaxial test systems and advanced tomographical tools (CT, ERT, MRI). We combine these studies to develop mathematical models and numerical simulation tools which could be used to predict the coupled hydro-geomechanical behavior of marine gas hydrate reservoirs in a large-strain framework. Here we will present some of our recent results from closely co-ordinated experimental and numerical simulation studies, with the objective of capturing the large-deformation behavior relevant to different gas production scenarios. We will also report on a variety of mechanically relevant test scenarios focusing on effects of dynamic changes in gas hydrate saturation, highly uneven gas hydrate distributions, focused fluid migration, and gas hydrate production through depressurization and CO2 injection.

  15. A comparison of working in small-scale and large-scale nursing homes: A systematic review of quantitative and qualitative evidence.

    PubMed

    Vermeerbergen, Lander; Van Hootegem, Geert; Benders, Jos

    2017-02-01

    Ongoing shortages of care workers, together with an ageing population, make it of utmost importance to increase the quality of working life in nursing homes. Since the 1970s, normalised and small-scale nursing homes have been increasingly introduced to provide care in a family and homelike environment, potentially providing a richer work life for care workers as well as improved living conditions for residents. 'Normalised' refers to the opportunities given to residents to live in a manner as close as possible to the everyday life of persons not needing care. The study purpose is to provide a synthesis and overview of empirical research comparing the quality of working life - together with related work and health outcomes - of professional care workers in normalised small-scale nursing homes as compared to conventional large-scale ones. A systematic review of qualitative and quantitative studies. A systematic literature search (April 2015) was performed using the electronic databases Pubmed, Embase, PsycInfo, CINAHL and Web of Science. References and citations were tracked to identify additional, relevant studies. We identified 825 studies in the selected databases. After checking the inclusion and exclusion criteria, nine studies were selected for review. Two additional studies were selected after reference and citation tracking. Three studies were excluded after requesting more information on the research setting. The findings from the individual studies suggest that levels of job control and job demands (all but "time pressure") are higher in normalised small-scale homes than in conventional large-scale nursing homes. Additionally, some studies suggested that social support and work motivation are higher, while risks of burnout and mental strain are lower, in normalised small-scale nursing homes. Other studies found no differences or even opposing findings. The studies reviewed showed that these inconclusive findings can be attributed to care workers in some normalised small-scale homes experiencing isolation and too high job demands in their work roles. This systematic review suggests that normalised small-scale homes are a good starting point for creating a higher quality of working life in the nursing home sector. Higher job control enables care workers to manage higher job demands in normalised small-scale homes. However, some jobs would benefit from interventions to address care workers' perceptions of too low social support and of too high job demands. More research is needed to examine strategies to enhance these working life issues in normalised small-scale settings.

  16. A Historical Perspective on Dynamics Testing at the Langley Research Center

    NASA Technical Reports Server (NTRS)

    Horta, Lucas G.; Kvaternik, Raymond G.

    2000-01-01

    The history of structural dynamics testing research over the past four decades at the Langley Research Center of the National Aeronautics and Space Administration is reviewed. Beginning in the early sixties, Langley investigated several scale model and full-scale spacecraft including the NIMBUS and various concepts for Apollo and Viking landers. Langley engineers pioneered the use of scaled models to study the dynamics of launch vehicles including Saturn I, Saturn V, and Titan III. In the seventies, work emphasized the Space Shuttle and advanced test and data analysis methods. In the eighties, the possibility of delivering large structures to orbit by the Space Shuttle shifted focus towards understanding the interaction of flexible space structures with attitude control systems. Although Langley has maintained a tradition of laboratory-based research, some flight experiments were supported. This review emphasizes work that, in some way, advanced the state of knowledge at the time.

  17. Capacity Building: Data- and Research-Informed Development of Schools and Teaching Practices in Denmark and Norway

    ERIC Educational Resources Information Center

    Qvortrup, Lars

    2016-01-01

    Based on experiences from a number of large scale data- and research-informed school development projects in Denmark and Norway, led by the author, three hypotheses are discussed: that an effective way of linking research and practice is achieved (1) using a capacity building approach, that is, to collaborate in the practical school context…

  18. Analysis of Radar and Optical Space Borne Data for Large Scale Topographical Mapping

    NASA Astrophysics Data System (ADS)

    Tampubolon, W.; Reinhardt, W.

    2015-03-01

    Normally, in order to provide high resolution 3 Dimension (3D) geospatial data, large scale topographical mapping needs input from conventional airborne campaigns, which in Indonesia are bureaucratically complicated, especially during legal administration procedures, i.e. security clearance from the military/defense ministry. This often causes additional time delays, on top of technical constraints such as weather and limited aircraft availability for airborne campaigns. Of course the geospatial data quality is an important issue for many applications. The increasing demand for geospatial data nowadays consequently requires high resolution datasets as well as a sufficient level of accuracy. Therefore an integration of different technologies is required in many cases to gain the expected result, especially in the context of disaster preparedness and emergency response. Another important issue in this context is the fast delivery of relevant data, which is expressed by the term "Rapid Mapping". In this paper we present first results of on-going research to integrate different data sources such as space borne radar and optical platforms. Initially, the orthorectification of Very High Resolution Satellite (VHRS) imagery, i.e. SPOT-6, has been carried out in a continuous process with DEM generation from TerraSAR-X/TanDEM-X data. Ground Control Points (GCPs) from GNSS surveys are mandatory in order to achieve the required geometrical accuracy. In addition, this research aims at providing suitable processing algorithms for space borne data for large scale topographical mapping, as described in section 3.2. Recently, radar space borne data has been used for medium scale topographical mapping, e.g. for the 1:50.000 map scale in Indonesian territories. The goal of this on-going research is to increase the accuracy of remote sensing data through different activities, e.g. the integration of different data sources (optical and radar) or the use of GCPs in both the optical and the radar satellite data processing. Finally, these results will be used in the future as a reference for further geospatial data acquisitions to support topographical mapping at even larger scales, up to the 1:10.000 map scale.

  19. OpenMP parallelization of a gridded SWAT (SWATG)

    NASA Astrophysics Data System (ADS)

    Zhang, Ying; Hou, Jinliang; Cao, Yongpan; Gu, Juan; Huang, Chunlin

    2017-12-01

    Large-scale, long-term and high spatial resolution simulation is a common issue in environmental modeling. A Gridded Hydrologic Response Unit (HRU)-based Soil and Water Assessment Tool (SWATG) that integrates a grid modeling scheme with different spatial representations also presents such problems. The time-consuming computations limit applications of very high resolution, large-scale watershed modeling. The OpenMP (Open Multi-Processing) parallel application interface is integrated with SWATG (called SWATGP) to accelerate grid modeling at the HRU level. Such a parallel implementation takes better advantage of the computational power of a shared-memory computer system. We conducted two experiments at multiple temporal and spatial scales of hydrological modeling using SWATG and SWATGP on a high-end server. At 500-m resolution, SWATGP was found to be up to nine times faster than SWATG in modeling over a roughly 2000 km2 watershed with one CPU and a 15-thread configuration. The study results demonstrate that parallel models save considerable time relative to traditional sequential simulation runs. Parallel computation of environmental models is beneficial for model applications, especially at large spatial and temporal scales and at high resolutions. The proposed SWATGP model is thus a promising tool for large-scale and high-resolution water resources research and management, in addition to offering data fusion and model coupling ability.
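    The reported speed-up is broadly consistent with Amdahl's law; as a rough estimate (not a calculation from the paper), a speed-up \(S = 1/[(1-p) + p/N] \approx 9\) on \(N = 15\) threads implies that a fraction \(p \approx 0.95\) of the HRU-level workload is parallelizable.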

  20. Digital footprints: facilitating large-scale environmental psychiatric research in naturalistic settings through data from everyday technologies

    PubMed Central

    Bidargaddi, N; Musiat, P; Makinen, V-P; Ermes, M; Schrader, G; Licinio, J

    2017-01-01

    Digital footprints, the automatically accumulated by-products of our technology-saturated lives, offer an exciting opportunity for psychiatric research. The commercial sector has already embraced the electronic trails of customers as an enabling tool for guiding consumer behaviour, and analogous efforts are ongoing to monitor and improve the mental health of psychiatric patients. The untargeted collection of digital footprints that may or may not be health orientated comprises a large untapped information resource for epidemiological-scale research into psychiatric disorders. Real-time monitoring of mood, sleep and physical and social activity in a substantial portion of the affected population in a naturalistic setting is unprecedented in psychiatry. We propose that digital footprints can provide these measurements from real-world settings unobtrusively and in a longitudinal fashion. In this perspective article, we outline the concept of digital footprints and the services and devices that create them, and present examples where digital footprints have been successfully used in research. We then critically discuss the opportunities and fundamental challenges associated with digital footprints in psychiatric research, such as collecting data from different sources, analysis, ethical and research design challenges. PMID:27922603

  1. Digital footprints: facilitating large-scale environmental psychiatric research in naturalistic settings through data from everyday technologies.

    PubMed

    Bidargaddi, N; Musiat, P; Makinen, V-P; Ermes, M; Schrader, G; Licinio, J

    2017-02-01

    Digital footprints, the automatically accumulated by-products of our technology-saturated lives, offer an exciting opportunity for psychiatric research. The commercial sector has already embraced the electronic trails of customers as an enabling tool for guiding consumer behaviour, and analogous efforts are ongoing to monitor and improve the mental health of psychiatric patients. The untargeted collection of digital footprints that may or may not be health orientated comprises a large untapped information resource for epidemiological-scale research into psychiatric disorders. Real-time monitoring of mood, sleep and physical and social activity in a substantial portion of the affected population in a naturalistic setting is unprecedented in psychiatry. We propose that digital footprints can provide these measurements from real-world settings unobtrusively and in a longitudinal fashion. In this perspective article, we outline the concept of digital footprints and the services and devices that create them, and present examples where digital footprints have been successfully used in research. We then critically discuss the opportunities and fundamental challenges associated with digital footprints in psychiatric research, such as collecting data from different sources, analysis, ethical and research design challenges.

  2. Linking Research, Education and Public Engagement in Geoscience: Leadership and Strategic Partnerships

    NASA Astrophysics Data System (ADS)

    Spellman, K.

    2017-12-01

    A changing climate has impacted Alaska communities at unprecedented rates, and the need for efficient and effective climate change learning in the Boreal and Arctic regions is urgent. Learning programs that can both increase personal understanding and connection to climate change science and also inform large scale scientific research about climate change are an attractive option for building community adaptive capacity at multiple scales. Citizen science has emerged as a powerful tool for facilitating learning across scales, and for building partnerships across natural sciences research, education, and outreach disciplines. As an early career scientist and interdisciplinary researcher, citizen science has become the centerpiece of my work and has provided some of the most rewarding moments of my career. I will discuss my early career journey building a research and leadership portfolio integrating climate change research, learning research, and public outreach through citizen science. I will share key experiences from graduate student to early career PI that cultivated my leadership skills and ability to build partnerships necessary to create citizen science programs that emphasize synergy between climate change research and education.

  3. RENEB - Running the European Network of biological dosimetry and physical retrospective dosimetry.

    PubMed

    Kulka, Ulrike; Abend, Michael; Ainsbury, Elizabeth; Badie, Christophe; Barquinero, Joan Francesc; Barrios, Lleonard; Beinke, Christina; Bortolin, Emanuela; Cucu, Alexandra; De Amicis, Andrea; Domínguez, Inmaculada; Fattibene, Paola; Frøvig, Anne Marie; Gregoire, Eric; Guogyte, Kamile; Hadjidekova, Valeria; Jaworska, Alicja; Kriehuber, Ralf; Lindholm, Carita; Lloyd, David; Lumniczky, Katalin; Lyng, Fiona; Meschini, Roberta; Mörtl, Simone; Della Monaca, Sara; Monteiro Gil, Octávia; Montoro, Alegria; Moquet, Jayne; Moreno, Mercedes; Oestreicher, Ursula; Palitti, Fabrizio; Pantelias, Gabriel; Patrono, Clarice; Piqueret-Stephan, Laure; Port, Matthias; Prieto, María Jesus; Quintens, Roel; Ricoul, Michelle; Romm, Horst; Roy, Laurence; Sáfrány, Géza; Sabatier, Laure; Sebastià, Natividad; Sommer, Sylwester; Terzoudi, Georgia; Testa, Antonella; Thierens, Hubert; Turai, Istvan; Trompier, François; Valente, Marco; Vaz, Pedro; Voisin, Philippe; Vral, Anne; Woda, Clemens; Zafiropoulos, Demetre; Wojcik, Andrzej

    2017-01-01

    A European network was initiated in 2012 by 23 partners from 16 European countries with the aim of significantly increasing individualized dose reconstruction in case of large-scale radiological emergency scenarios. The network was built on three complementary pillars: (1) an operational basis with seven biological and physical dosimetric assays in ready-to-use mode, (2) a basis for education, training and quality assurance, and (3) a basis for further network development regarding new techniques and members. Techniques for individual dose estimation based on biological samples and/or inert personalized devices such as mobile phones or smartphones were optimized to support rapid categorization of many potential victims according to the received dose to the blood or personal devices. Communication and cross-border collaboration were also standardized. To assure long-term sustainability of the network, cooperation with national and international emergency preparedness organizations was initiated and links to radiation protection and research platforms have been developed. A legal framework, based on a Memorandum of Understanding, was established and signed by 27 organizations by the end of 2015. RENEB is a European Network of biological and physical-retrospective dosimetry, with the capacity and capability to perform large-scale rapid individualized dose estimation. Specialized to handle large numbers of samples, RENEB is able to contribute to radiological emergency preparedness and wider large-scale research projects.

  4. Plasma surface figuring of large optical components

    NASA Astrophysics Data System (ADS)

    Jourdain, R.; Castelli, M.; Morantz, P.; Shore, P.

    2012-04-01

    Fast figuring of large optical components is well known as a highly challenging manufacturing issue. Different manufacturing technologies, including magnetorheological finishing, loose abrasive polishing and ion beam figuring, are presently employed. Yet these technologies are slow and lead to expensive optics. This explains why plasma-based processes operating at atmospheric pressure have been researched as a cost-effective means for figure correction of metre-scale optical surfaces. In this paper, fast figure correction of a large optical surface is reported using the Reactive Atom Plasma (RAP) process. Achievements are shown following the scaling-up of the RAP figuring process to a 400 mm diameter area of a substrate made of Corning ULE®. The pre-processing spherical surface is characterized by a 3 metre radius of curvature, 2.3 μm PVr (373 nm RMS), and 1.2 nm Sq roughness. The nanometre-scale correction figuring system used for this research work is named HELIOS 1200, and it is equipped with a unique plasma torch driven by a dedicated tool path algorithm. Topography map measurements were carried out using a vertical work station instrumented with a Zygo DynaFiz interferometer. Figuring results, together with the processing times, convergence levels and number of iterations, are reported. The results illustrate the significant potential and advantage of plasma processing for figure correction of large silicon-based optical components.

  5. Large-scale transportation network congestion evolution prediction using deep learning theory.

    PubMed

    Ma, Xiaolei; Yu, Haiyang; Wang, Yunpeng; Wang, Yinhai

    2015-01-01

    Understanding how congestion at one location can cause ripples throughout a large-scale transportation network is vital for transportation researchers and practitioners to pinpoint traffic bottlenecks for congestion mitigation. Traditional studies rely on either mathematical equations or simulation techniques to model traffic congestion dynamics. However, most of these approaches have limitations, largely due to unrealistic assumptions and cumbersome parameter calibration processes. With the development of Intelligent Transportation Systems (ITS) and the Internet of Things (IoT), transportation data become more and more ubiquitous. This has triggered a series of data-driven studies to investigate transportation phenomena. Among them, deep learning theory is considered one of the most promising techniques to tackle tremendous volumes of high-dimensional data. This study attempts to extend deep learning theory into large-scale transportation network analysis. A deep Restricted Boltzmann Machine and Recurrent Neural Network architecture is utilized to model and predict traffic congestion evolution based on Global Positioning System (GPS) data from taxis. A numerical study in Ningbo, China is conducted to validate the effectiveness and efficiency of the proposed method. Results show that the prediction accuracy can reach as high as 88% within less than 6 minutes when the model is implemented in a Graphic Processing Unit (GPU)-based parallel computing environment. The predicted congestion evolution patterns can be visualized temporally and spatially through a map-based platform to identify vulnerable links for proactive congestion mitigation.
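    A minimal, illustrative sketch of the data-driven idea, assuming scikit-learn and synthetic link-speed data: an RBM extracts latent congestion features from binarized snapshots and a ridge regression predicts the next-interval congestion state. This stands in for, and is much simpler than, the paper's deep RBM plus recurrent neural network; all names and sizes below are hypothetical.

    ```python
    import numpy as np
    from sklearn.neural_network import BernoulliRBM
    from sklearn.linear_model import Ridge

    rng = np.random.default_rng(0)
    n_links, n_steps = 50, 500
    speeds = rng.uniform(0, 1, size=(n_steps, n_links))     # normalized link speeds (synthetic)
    congested = (speeds < 0.3).astype(float)                 # binary congestion snapshots

    # RBM learns latent congestion features from the snapshot at time t
    rbm = BernoulliRBM(n_components=16, learning_rate=0.05, n_iter=20, random_state=0)
    features = rbm.fit_transform(congested[:-1])

    # simple readout predicts the network-wide congestion state at time t+1
    model = Ridge(alpha=1.0).fit(features, congested[1:])
    next_state = model.predict(rbm.transform(congested[-1:]))
    ```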

  6. Large-Scale Transportation Network Congestion Evolution Prediction Using Deep Learning Theory

    PubMed Central

    Ma, Xiaolei; Yu, Haiyang; Wang, Yunpeng; Wang, Yinhai

    2015-01-01

    Understanding how congestion at one location can cause ripples throughout a large-scale transportation network is vital for transportation researchers and practitioners to pinpoint traffic bottlenecks for congestion mitigation. Traditional studies rely on either mathematical equations or simulation techniques to model traffic congestion dynamics. However, most of these approaches have limitations, largely due to unrealistic assumptions and cumbersome parameter calibration processes. With the development of Intelligent Transportation Systems (ITS) and the Internet of Things (IoT), transportation data become more and more ubiquitous. This has triggered a series of data-driven studies to investigate transportation phenomena. Among them, deep learning theory is considered one of the most promising techniques to tackle tremendous volumes of high-dimensional data. This study attempts to extend deep learning theory into large-scale transportation network analysis. A deep Restricted Boltzmann Machine and Recurrent Neural Network architecture is utilized to model and predict traffic congestion evolution based on Global Positioning System (GPS) data from taxis. A numerical study in Ningbo, China is conducted to validate the effectiveness and efficiency of the proposed method. Results show that the prediction accuracy can reach as high as 88% within less than 6 minutes when the model is implemented in a Graphic Processing Unit (GPU)-based parallel computing environment. The predicted congestion evolution patterns can be visualized temporally and spatially through a map-based platform to identify vulnerable links for proactive congestion mitigation. PMID:25780910

  7. Demand surge following earthquakes

    USGS Publications Warehouse

    Olsen, Anna H.

    2012-01-01

    Demand surge is understood to be a socio-economic phenomenon where repair costs for the same damage are higher after large- versus small-scale natural disasters. It has reportedly increased monetary losses by 20 to 50%. In previous work, a model for the increased costs of reconstruction labor and materials was developed for hurricanes in the Southeast United States. The model showed that labor cost increases, rather than the material component, drove the total repair cost increases, and this finding could be extended to earthquakes. A study of past large-scale disasters suggested that there may be additional explanations for demand surge. Two such explanations specific to earthquakes are the exclusion of insurance coverage for earthquake damage and possible concurrent causation of damage from an earthquake followed by fire or tsunami. Additional research into these aspects might provide a better explanation for increased monetary losses after large- vs. small-scale earthquakes.

  8. A study of rotor and platform design trade-offs for large-scale floating vertical axis wind turbines

    NASA Astrophysics Data System (ADS)

    Griffith, D. Todd; Paquette, Joshua; Barone, Matthew; Goupee, Andrew J.; Fowler, Matthew J.; Bull, Diana; Owens, Brian

    2016-09-01

    Vertical axis wind turbines are receiving significant attention for offshore siting. In general, offshore wind offers proximity to large population centers, a vast and more consistent wind resource, and a scale-up opportunity, to name a few beneficial characteristics. On the other hand, offshore wind suffers from a high levelized cost of energy (LCOE) and in particular high balance of system (BoS) costs owing to accessibility challenges and limited project experience. To address these challenges associated with offshore wind, Sandia National Laboratories is researching large-scale (MW class) offshore floating vertical axis wind turbines (VAWTs). The motivation for this work is that floating VAWTs are a potential transformative technology solution to reduce offshore wind LCOE in deep-water locations. This paper explores performance and cost trade-offs within the floating VAWT design space, between rotor and platform configurations.

  9. Large-scale quantum networks based on graphs

    NASA Astrophysics Data System (ADS)

    Epping, Michael; Kampermann, Hermann; Bruß, Dagmar

    2016-05-01

    Society depends increasingly on information exchange and communication. In the quantum world, security and privacy are built-in features of information processing. The essential ingredient for exploiting these quantum advantages is the resource of entanglement, which can be shared between two or more parties. The distribution of entanglement over large distances constitutes a key challenge for current research and development. Due to losses of the transmitted quantum particles, which typically scale exponentially with the distance, intermediate quantum repeater stations are needed. Here we show how to generalise the quantum repeater concept to the multipartite case, by describing large-scale quantum networks, i.e. network nodes and their long-distance links, consistently in the language of graphs and graph states. This unifying approach comprises both the distribution of multipartite entanglement across the network, and the protection against errors via encoding. The correspondence to graph states also provides a tool for optimising the architecture of quantum networks.
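    For reference, a graph state on a network graph \(G=(V,E)\) is the standard construction
    $$|G\rangle = \prod_{(a,b)\in E} CZ_{ab}\,|+\rangle^{\otimes|V|},$$
    i.e. every node is prepared in \(|+\rangle\) and a controlled-Z gate is applied across each edge (long-distance link); this is the object used to describe both multipartite entanglement distribution and the error-protecting encodings (the formula is the generic graph-state definition, not reproduced from the paper).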

  10. A PRACTICAL ONTOLOGY FOR THE LARGE-SCALE MODELING OF SCHOLARLY ARTIFACTS AND THEIR USAGE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    RODRIGUEZ, MARKO A.; BOLLEN, JOHAN; VAN DE SOMPEL, HERBERT

    2007-01-30

    The large-scale analysis of scholarly artifact usage is constrained primarily by current practices in usage data archiving, privacy issues concerned with the dissemination of usage data, and the lack of a practical ontology for modeling the usage domain. As a remedy to the third constraint, this article presents a scholarly ontology that was engineered to represent those classes for which large-scale bibliographic and usage data exists, supports usage research, and whose instantiation is scalable to the order of 50 million articles along with their associated artifacts (e.g. authors and journals) and an accompanying 1 billion usage events. The real world instantiation of the presented abstract ontology is a semantic network model of the scholarly community which lends the scholarly process to statistical analysis and computational support. They present the ontology, discuss its instantiation, and provide some example inference rules for calculating various scholarly artifact metrics.
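    Purely as an illustration of the modeling style (not the ontology actually presented in the article), the sketch below uses the rdflib package and a made-up namespace to record an article, its author, its journal and one usage event as RDF triples; every class and property name here is hypothetical.

    ```python
    from rdflib import Graph, Literal, Namespace, RDF

    EX = Namespace("http://example.org/scholarly#")   # hypothetical namespace
    g = Graph()
    g.bind("ex", EX)

    article = EX["article/123"]
    author = EX["author/smith_j"]
    journal = EX["journal/example-journal"]
    usage = EX["usage/0001"]

    # bibliographic classes and relations
    g.add((article, RDF.type, EX.Article))
    g.add((article, EX.hasAuthor, author))
    g.add((article, EX.publishedIn, journal))

    # a single usage event pointing back at the article
    g.add((usage, RDF.type, EX.UsageEvent))
    g.add((usage, EX.refersTo, article))
    g.add((usage, EX.timestamp, Literal("2007-01-30T12:00:00")))

    print(g.serialize(format="turtle"))
    ```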

  11. A review of sensing technologies for small and large-scale touch panels

    NASA Astrophysics Data System (ADS)

    Akhtar, Humza; Kemao, Qian; Kakarala, Ramakrishna

    2017-06-01

    A touch panel is an input device for human computer interaction. It consists of a network of sensors, a sampling circuit and a micro controller for detecting and locating a touch input. Touch input can come from either finger or stylus depending upon the type of touch technology. These touch panels provide an intuitive and collaborative workspace so that people can perform various tasks with the use of their fingers instead of traditional input devices like keyboard and mouse. Touch sensing technology is not new. At the time of this writing, various technologies are available in the market and this paper reviews the most common ones. We review traditional designs and sensing algorithms for touch technology. We also observe that due to its various strengths, capacitive touch will dominate the large-scale touch panel industry in years to come. In the end, we discuss the motivation for doing academic research on large-scale panels.

  12. Gaussian processes for personalized e-health monitoring with wearable sensors.

    PubMed

    Clifton, Lei; Clifton, David A; Pimentel, Marco A F; Watkinson, Peter J; Tarassenko, Lionel

    2013-01-01

    Advances in wearable sensing and communications infrastructure have allowed the widespread development of prototype medical devices for patient monitoring. However, such devices have not penetrated into clinical practice, primarily due to a lack of research into "intelligent" analysis methods that are sufficiently robust to support large-scale deployment. Existing systems are typically plagued by large false-alarm rates, and an inability to cope with sensor artifact in a principled manner. This paper has two aims: 1) proposal of a novel, patient-personalized system for analysis and inference in the presence of data uncertainty, typically caused by sensor artifact and data incompleteness; 2) demonstration of the method using a large-scale clinical study in which 200 patients have been monitored using the proposed system. The latter provides much-needed evidence that personalized e-health monitoring is feasible within an actual clinical environment, at scale, and that the method is capable of improving patient outcomes via personalized healthcare.
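    A minimal sketch of the modelling idea, assuming scikit-learn and a synthetic single-channel vital sign (the variable names are hypothetical and this is not the authors' implementation): a Gaussian process with an RBF kernel plus a WhiteKernel noise term yields both a posterior mean and an uncertainty band, and observations far outside that band can be flagged in a principled way.

    ```python
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    rng = np.random.default_rng(0)
    t = np.sort(rng.uniform(0, 24, 60))[:, None]                              # measurement times (hours)
    hr = 70 + 8 * np.sin(2 * np.pi * t.ravel() / 24) + rng.normal(0, 2, 60)   # synthetic heart rate

    # RBF captures the smooth personal baseline; WhiteKernel absorbs sensor noise/artifact
    kernel = 1.0 * RBF(length_scale=3.0) + WhiteKernel(noise_level=4.0)
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(t, hr)

    mu, sd = gp.predict(t, return_std=True)
    novel = np.abs(hr - mu) > 3 * sd    # crude flag for observations outside the model's belief
    ```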

  13. CONSORT to community: translation of an RCT to a large-scale community intervention and learnings from evaluation of the upscaled program.

    PubMed

    Moores, Carly Jane; Miller, Jacqueline; Perry, Rebecca Anne; Chan, Lily Lai Hang; Daniels, Lynne Allison; Vidgen, Helen Anna; Magarey, Anthea Margaret

    2017-11-29

    Translation encompasses the continuum from clinical efficacy to widespread adoption within the healthcare service and ultimately routine clinical practice. The Parenting, Eating and Activity for Child Health (PEACH™) program has previously demonstrated clinical effectiveness in the management of child obesity, and has been recently implemented as a large-scale community intervention in Queensland, Australia. This paper aims to describe the translation of the evaluation framework from a randomised controlled trial (RCT) to large-scale community intervention (PEACH™ QLD). Tensions between the RCT paradigm and implementation research will be discussed along with lived evaluation challenges, responses to overcome these, and key learnings for future evaluation conducted at scale. The translation of evaluation from PEACH™ RCT to the large-scale community intervention PEACH™ QLD is described. While the CONSORT Statement was used to report findings from two previous RCTs, the RE-AIM framework was more suitable for the evaluation of upscaled delivery of the PEACH™ program. Evaluation of PEACH™ QLD was undertaken during the project delivery period from 2013 to 2016. Experiential learnings from conducting the evaluation of PEACH™ QLD against the described evaluation framework are presented for the purposes of informing the future evaluation of upscaled programs. Evaluation changes in response to real-time changes in the delivery of the PEACH™ QLD Project were necessary at stages during the project term. Key evaluation challenges encountered included the collection of complete evaluation data from a diverse and geographically dispersed workforce and the systematic collection of process evaluation data in real time to support program changes during the project. Evaluation of large-scale community interventions in the real world is challenging and divergent from RCTs, which are rigorously evaluated within a more tightly-controlled clinical research setting. Constructs explored in an RCT are inadequate in describing the enablers and barriers of upscaled community program implementation. Methods for data collection, analysis and reporting also require consideration. We present a number of experiential reflections and suggestions for the successful evaluation of future upscaled community programs which are scarcely reported in the literature. PEACH™ QLD was retrospectively registered with the Australian New Zealand Clinical Trials Registry on 28 February 2017 (ACTRN12617000315314).

  14. Ten Steps to Conducting a Large, Multi-Site, Longitudinal Investigation of Language and Reading in Young Children

    PubMed Central

    Farquharson, Kelly; Murphy, Kimberly A.

    2016-01-01

    Purpose: This paper describes methodological procedures involving execution of a large-scale, multi-site longitudinal study of language and reading comprehension in young children. Researchers in the Language and Reading Research Consortium (LARRC) developed and implemented these procedures to ensure data integrity across multiple sites, schools, and grades. Specifically, major features of our approach, as well as lessons learned, are summarized in 10 steps essential for successful completion of a large-scale longitudinal investigation in early grades. Method: Over 5 years, children in preschool through third grade were administered a battery of 35 higher- and lower-level language, listening, and reading comprehension measures (RCM). Data were collected from children, their teachers, and their parents/guardians at four sites across the United States. Substantial and rigorous effort was aimed toward maintaining consistency in processes and data management across sites for children, assessors, and staff. Conclusion: With appropriate planning, flexibility, and communication strategies in place, LARRC developed and executed a successful multi-site longitudinal research study that will meet its goal of investigating the contribution and role of language skills in the development of children's listening and reading comprehension. Through dissemination of our design strategies and lessons learned, research teams embarking on similar endeavors can be better equipped to anticipate the challenges. PMID:27064308

  15. Techniques for automatic large scale change analysis of temporal multispectral imagery

    NASA Astrophysics Data System (ADS)

    Mercovich, Ryan A.

    Change detection in remotely sensed imagery is a multi-faceted problem with a wide variety of desired solutions. Automatic change detection and analysis to assist in the coverage of large areas at high resolution is a popular area of research in the remote sensing community. Beyond basic change detection, the analysis of change is essential to provide results that positively impact an image analyst's job when examining potentially changed areas. Present change detection algorithms are geared toward low resolution imagery, and require analyst input to provide anything more than a simple pixel level map of the magnitude of change that has occurred. One major problem with this approach is that change occurs in such large volume at small spatial scales that a simple change map is no longer useful. This research strives to create an algorithm based on a set of metrics that performs a large area search for change in high resolution multispectral image sequences and utilizes a variety of methods to identify different types of change. Rather than simply mapping the magnitude of any change in the scene, the goal of this research is to create a useful display of the different types of change in the image. The techniques presented in this dissertation are used to interpret large area images and provide useful information to an analyst about small regions that have undergone specific types of change while retaining image context to make further manual interpretation easier. This analyst cueing to reduce information overload in a large area search environment will have an impact in the areas of disaster recovery, search and rescue situations, and land use surveys among others. By utilizing a feature based approach founded on applying existing statistical methods and new and existing topological methods to high resolution temporal multispectral imagery, a novel change detection methodology is produced that can automatically provide useful information about the change occurring in large area and high resolution image sequences. The change detection and analysis algorithm developed could be adapted to many potential image change scenarios to perform automatic large scale analysis of change.
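    As a baseline point of comparison (not the dissertation's feature- and topology-based metrics), per-pixel change detection on two co-registered multispectral images can be sketched as a spectral difference magnitude plus a global threshold; the arrays and threshold rule below are illustrative assumptions.

    ```python
    import numpy as np

    def change_magnitude(img_t0: np.ndarray, img_t1: np.ndarray) -> np.ndarray:
        """Euclidean spectral distance between two dates, arrays shaped (bands, rows, cols)."""
        diff = img_t1.astype(np.float64) - img_t0.astype(np.float64)
        return np.sqrt((diff ** 2).sum(axis=0))

    def change_mask(mag: np.ndarray, k: float = 2.0) -> np.ndarray:
        """Flag pixels whose change magnitude exceeds mean + k * std."""
        return mag > (mag.mean() + k * mag.std())

    # toy example: 4-band imagery with one injected change region
    rng = np.random.default_rng(1)
    before = rng.normal(100, 10, size=(4, 256, 256))
    after = before.copy()
    after[:, 100:140, 100:140] += 40
    mask = change_mask(change_magnitude(before, after))
    ```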

  16. Pavilion Lake Research Project - using multi-scaled approaches to understanding the provenance, maintenance and morphological characteristics of microbialites

    NASA Astrophysics Data System (ADS)

    Lim, D. S.; Brady, A. L.; Cardman, Z.; Cowie, B. R.; Forrest, A.; Marinova, M.; Shepard, R.; Laval, B.; Slater, G. F.; Gernhardt, M.; Andersen, D. T.; Hawes, I.; Sumner, D. Y.; Trembanis, A. C.; McKay, C. P.

    2009-12-01

    Microbialites can be metre-scale or larger discrete structures that cover kilometre-scale regions, for example in Pavilion Lake, British Columbia, Canada, while the organisms associated with their growth and development are much smaller (less than millimeter scale). As such, a multi-scaled approach to understanding their provenance, maintenance and morphological characteristics is required. Research members of the Pavilion Lake Research Project (PLRP) (www.pavilionlake.com) have been working to understand microbialite morphogenesis in Pavilion Lake, B.C., Canada and the potential for biosignature preservation in these carbonate rocks using a combination of field and lab based techniques. PLRP research participants have been: (1) exploring the physical and chemical limnological properties of the lake, especially as these characteristics pertain to microbialite formation, (2) using geochemical and molecular tools to test the hypothesized biological origin of the microbialites and the associated meso-scale processes, and (3) using geochemical and microscopic tools to characterize potential biosignature preservation in the microbialites on the micro scale. To address these goals, PLRP identified the need to (a) map Pavilion Lake to gain a contextual understanding of microbialite distribution and possible correlation between their lake-wide distribution and the ambient growth conditions, and (b) sample the microbialites, including those from the deepest regions of the lake (60m). Initial assessments showed that PLRP science diving operations did not prove adequate for mapping and sample recovery in the large and deep (0.8 km x 5.7 km; 65m max depth) lake. As such, the DeepWorker Science and Exploration (DSE) program was established by the PLRP. At the heart of this program are two DeepWorker (DW) submersibles, single-person vehicles that offer Scientist-Pilots (SP) an opportunity to study the lake in a 1 atm pressurized environment. In addition, the use of Autonomous Underwater Vehicles (AUVs) for landscape level geophysical mapping (side-scan and multibeam) provides an additional large-scale context for the microbialite associations. The multi-scaled approach undertaken by the PLRP team members has created an opportunity to weave together a comprehensive understanding of the modern microbialites in Pavilion Lake, and their relevance to interpreting ancient carbonate fabrics. An overview of the team’s findings to date and on-going research will be presented.

  17. How big is too big or how many partners are needed to build a large project which still can be managed successfully?

    NASA Astrophysics Data System (ADS)

    Henkel, Daniela; Eisenhauer, Anton

    2017-04-01

    During the last decades, the number of large research projects has increased, and with it the requirement for multidisciplinary, multisectoral collaboration. Such complex and large-scale projects demand new competencies to form, manage, and use large, diverse teams as a competitive advantage. For complex projects the effort is magnified: multiple large international research consortia involve academic and non-academic partners, including big industries, NGOs, and private and public bodies, all with cultural differences, individually discrepant expectations of teamwork, and differences in collaboration between national and multi-national administrations and research organisations, all of which challenge the organisation and management of such multi-partner consortia. How many partners are needed to establish and conduct collaboration with a multidisciplinary and multisectoral approach? How much personnel effort and what kinds of management techniques are required for such projects? This presentation identifies advantages and challenges of large research projects based on experiences made in the context of an Innovative Training Network (ITN) project within the Marie Skłodowska-Curie Actions of the European HORIZON 2020 program. Possible strategies are discussed to circumvent and avoid conflicts right from the beginning of a project.

  18. Microfluidic desalination techniques and their potential applications.

    PubMed

    Roelofs, S H; van den Berg, A; Odijk, M

    2015-09-07

    In this review we discuss recent developments in the emerging research field of miniaturized desalination. Traditionally, desalination is performed to convert salt water into potable water, and research has focused on improving the performance of large-scale desalination plants. Microfluidic desalination offers several new opportunities in comparison to macro-scale desalination, such as providing a platform to increase fundamental knowledge of ion transport on the nano- and microfluidic scale and new microfluidic sample preparation methods. This approach has also led to the development of new desalination techniques, based on micro/nanofluidic ion-transport phenomena, which are potential candidates for up-scaling to (portable) drinking water devices. This review assesses microfluidic desalination techniques and their applications and is meant to contribute to the further implementation of microfluidic desalination techniques in the lab-on-chip community.

  19. Scaling NASA Applications to 1024 CPUs on Origin 3K

    NASA Technical Reports Server (NTRS)

    Taft, Jim

    2002-01-01

    The long and highly successful joint SGI-NASA research effort on ever larger SSI systems was to a large degree the result of the successful development of the MLP scalable parallel programming paradigm at ARC: 1) MLP scaling in real production codes justified ever larger systems at NAS; 2) MLP scaling on the 256p Origin 2000 gave SGI the impetus to productize 256p systems; 3) MLP scaling on 512p gave SGI the courage to build the 1024p O3K; and 4) the history of MLP success resulted in an IBM Star Cluster based MLP effort.

  20. Evaluation of the scale dependent dynamic SGS model in the open source code caffa3d.MBRi in wall-bounded flows

    NASA Astrophysics Data System (ADS)

    Draper, Martin; Usera, Gabriel

    2015-04-01

    The Scale Dependent Dynamic Model (SDDM) has been widely validated in large-eddy simulations using pseudo-spectral codes [1][2][3]. The scale dependency, particularly the power-law behavior of the coefficient, has also been confirmed in a priori studies [4][5]. To the authors' knowledge there have been only a few attempts to use the SDDM in finite difference (FD) and finite volume (FV) codes [6][7], finding some improvements with the dynamic procedures (scale-independent or scale-dependent approach), but not showing the behavior of the scale-dependence parameter when using the SDDM. The aim of the present paper is to evaluate the SDDM in the open source code caffa3d.MBRi, an updated version of the code presented in [8]. caffa3d.MBRi is a FV code, second-order accurate, parallelized with MPI, in which the domain is divided into unstructured blocks of structured grids. To accomplish this, two cases are considered: flow between flat plates and flow over a rough surface with the presence of a model wind turbine, taking for the latter case the experimental data presented in [9]. In both cases the standard Smagorinsky Model (SM), the Scale Independent Dynamic Model (SIDM) and the SDDM are tested. As presented in [6][7], slight improvements are obtained with the SDDM. Nevertheless, the behavior of the scale-dependence parameter supports the generalization of the dynamic procedure proposed in the SDDM, particularly taking into account that no explicit filter is used (the implicit filter is unknown). [1] F. Porté-Agel, C. Meneveau, M.B. Parlange. "A scale-dependent dynamic model for large-eddy simulation: application to a neutral atmospheric boundary layer". Journal of Fluid Mechanics, 2000, 415, 261-284. [2] E. Bou-Zeid, C. Meneveau, M. Parlange. "A scale-dependent Lagrangian dynamic model for large eddy simulation of complex turbulent flows". Physics of Fluids, 2005, 17, 025105 (18p). [3] R. Stoll, F. Porté-Agel. "Dynamic subgrid-scale models for momentum and scalar fluxes in large-eddy simulations of neutrally stratified atmospheric boundary layers over heterogeneous terrain". Water Resources Research, 2006, 42, W01409 (18 p). [4] J. Kleissl, M. Parlange, C. Meneveau. "Field experimental study of dynamic Smagorinsky models in the atmospheric surface layer". Journal of the Atmospheric Sciences, 2004, 61, 2296-2307. [5] E. Bou-Zeid, N. Vercauteren, M.B. Parlange, C. Meneveau. "Scale dependence of subgrid-scale model coefficients: An a priori study". Physics of Fluids, 2008, 20, 115106. [6] G. Kirkil, J. Mirocha, E. Bou-Zeid, F.K. Chow, B. Kosovic. "Implementation and evaluation of dynamic subfilter-scale stress models for large-eddy simulation using WRF". Monthly Weather Review, 2012, 140, 266-284. [7] S. Radhakrishnan, U. Piomelli. "Large-eddy simulation of oscillating boundary layers: model comparison and validation". Journal of Geophysical Research, 2008, 113, C02022. [8] G. Usera, A. Vernet, J.A. Ferré. "A parallel block-structured finite volume method for flows in complex geometry with sliding interfaces". Flow, Turbulence and Combustion, 2008, 81, 471-495. [9] Y-T. Wu, F. Porté-Agel. "Large-eddy simulation of wind-turbine wakes: evaluation of turbine parametrisations". Boundary-Layer Meteorology, 2011, 138, 345-366.
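
    For readers unfamiliar with the model being tested, a brief sketch of the relations involved may help. This is a generic statement in the conventions of [1], not necessarily the exact notation of the paper. In the Smagorinsky closure the deviatoric subgrid-scale stress is modelled as

        \tau_{ij} - \tfrac{\delta_{ij}}{3}\tau_{kk} = -2\,(C_{s,\Delta}\Delta)^{2}\,|\bar{S}|\,\bar{S}_{ij},
        \qquad |\bar{S}| = \sqrt{2\,\bar{S}_{ij}\bar{S}_{ij}},

    and the scale-dependent dynamic procedure replaces the usual scale-invariance assumption with a power law for the coefficient across test-filter scales,

        \beta \equiv \frac{C^{2}_{s,2\Delta}}{C^{2}_{s,\Delta}},
        \qquad C^{2}_{s,4\Delta} = \beta\,C^{2}_{s,2\Delta} = \beta^{2}\,C^{2}_{s,\Delta},

    so that applying Germano-type identities at the two test-filter scales (2Δ and 4Δ) determines both C²(s,Δ) and the scale-dependence parameter β; the scale-invariant dynamic model is recovered for β = 1.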

  1. Geographic scale matters in detecting the relationship between neighbourhood food environments and obesity risk: an analysis of driver license records in Salt Lake County, Utah.

    PubMed

    Fan, Jessie X; Hanson, Heidi A; Zick, Cathleen D; Brown, Barbara B; Kowaleski-Jones, Lori; Smith, Ken R

    2014-08-19

    Empirical studies of the association between neighbourhood food environments and individual obesity risk have found mixed results. One possible cause of these mixed findings is the variation in the neighbourhood geographic scale used. The purpose of this paper was to examine how various neighbourhood geographic scales affected the estimated relationship between food environments and obesity risk. Design: cross-sectional secondary data analysis. Setting: Salt Lake County, Utah, USA. Participants: 403,305 Salt Lake County adults aged 25-64 in the Utah driver license database between 1995 and 2008. Utah driver license data were geo-linked to 2000 US Census data and Dun & Bradstreet business data. Food outlets were classified into the categories of large grocery stores, convenience stores, limited-service restaurants and full-service restaurants, and measured at four neighbourhood geographic scales: Census block group, Census tract, ZIP code and a 1 km buffer around the resident's house. These measures were regressed on individual obesity status using multilevel random intercept regressions. Main outcome measure: obesity. Food environment was important for obesity, but the scale of the relevant neighbourhood differed for different types of outlets: large grocery stores were not significant at all four geographic scales, limited-service restaurants at the medium-to-large scale (Census tract or larger), and convenience stores and full-service restaurants at the smallest scale (Census tract or smaller). The choice of neighbourhood geographic scale can affect the estimated significance of the association between neighbourhood food environments and individual obesity risk. However, variations in geographic scale alone do not explain the mixed findings in the literature. If researchers are constrained to use one geographic scale with multiple categories of food outlets, using the Census tract or a 1 km buffer as the neighbourhood geographic unit is likely to allow researchers to detect most significant relationships. Published by the BMJ Publishing Group Limited.
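
    To make the smallest neighbourhood definition concrete, the sketch below shows one way a 1 km buffer exposure measure could be built. It is illustrative only: the file names, column names and projected CRS are assumptions rather than the authors' pipeline, and a recent geopandas release (with the predicate keyword for spatial joins) is assumed.

      # Illustrative sketch: count food outlets within a 1 km buffer of each residence.
      # File names, column names, and the CRS are hypothetical, not the study's data.
      import geopandas as gpd

      homes = gpd.read_file("residences.shp").to_crs(epsg=26912)     # metres; UTM 12N covers Utah
      outlets = gpd.read_file("food_outlets.shp").to_crs(epsg=26912)

      buffers = homes.copy()
      buffers["geometry"] = homes.geometry.buffer(1000)              # 1 km neighbourhood

      # Spatial join: one row per (residence, outlet-within-buffer) pair,
      # then count outlets of each category per residence.
      joined = gpd.sjoin(buffers, outlets[["category", "geometry"]],
                         how="left", predicate="intersects")
      counts = (joined.groupby([joined.index, "category"]).size()
                      .unstack(fill_value=0)
                      .reindex(homes.index, fill_value=0))           # zero outlets -> 0, not NaN

      homes = homes.join(counts)   # per-residence exposure measures for the regression step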

  2. Fractals and Spatial Methods for Mining Remote Sensing Imagery

    NASA Technical Reports Server (NTRS)

    Lam, Nina; Emerson, Charles; Quattrochi, Dale

    2003-01-01

    The rapid increase in digital remote sensing and GIS data raises a critical problem: how can such an enormous amount of data be handled and analyzed so that useful information can be derived quickly? Efficient handling and analysis of large spatial data sets is central to environmental research, particularly in global change studies that employ time series. Advances in large-scale environmental monitoring and modeling require not only high-quality data, but also reliable tools to analyze the various types of data. A major difficulty facing geographers and environmental scientists in environmental assessment and monitoring is that spatial analytical tools are not easily accessible. Although many spatial techniques have been described recently in the literature, they are typically presented in an analytical form and are difficult to transform into a numerical algorithm. Moreover, these spatial techniques are not necessarily designed for remote sensing and GIS applications, and research must be conducted to examine their applicability and effectiveness in different types of environmental applications. This poses a chicken-and-egg problem: on one hand we need more research to examine the usability of the newer techniques and tools, yet on the other hand, this type of research is difficult to conduct if the tools to be explored are not accessible. Another problem fundamental to environmental research is the set of issues related to spatial scale. The scale issue is especially acute in the context of global change studies because of the need to integrate remote-sensing and other spatial data that are collected at different scales and resolutions. Extrapolation of results across broad spatial scales remains the most difficult problem in global environmental research. There is a need for basic characterization of the effects of scale on image data, and the techniques used to measure these effects must be developed and implemented to allow for a multiple-scale assessment of the data before any useful process-oriented modeling involving scale-dependent data can be conducted. Through the support of research grants from NASA, we have developed a software module called ICAMS (Image Characterization And Modeling System) to address the need to develop innovative spatial techniques and make them available to the broader scientific communities. ICAMS provides new spatial techniques, such as fractal analysis, geostatistical functions, and multiscale analysis, that are not easily available in commercial GIS/image processing software. By bundling newer spatial methods in a user-friendly software module, researchers can begin to test and experiment with the new spatial analysis methods, and they can gauge scale effects using a variety of remote sensing imagery. In the following, we describe briefly the development of ICAMS and present application examples.
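
    As a concrete example of the kind of routine ICAMS bundles, the sketch below estimates the box-counting fractal dimension of a binary image by regressing log box counts against log inverse box size. It is a generic NumPy illustration, not code from ICAMS, and the demo image is synthetic.

      # Generic box-counting estimate of fractal dimension for a binary image.
      # Illustrative sketch only; not the ICAMS implementation.
      import numpy as np

      def box_count(img, size):
          """Number of size x size boxes containing at least one nonzero pixel."""
          h, w = img.shape
          hc, wc = h - h % size, w - w % size                 # trim to a multiple of the box size
          blocks = img[:hc, :wc].reshape(hc // size, size, wc // size, size)
          return np.count_nonzero(blocks.any(axis=(1, 3)))

      def fractal_dimension(img, sizes=(2, 4, 8, 16, 32, 64)):
          counts = np.array([box_count(img, s) for s in sizes], dtype=float)
          # Slope of log N(s) versus log(1/s) gives the box-counting dimension.
          slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
          return slope

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          demo = (rng.random((256, 256)) > 0.7).astype(np.uint8)   # placeholder "image"
          print(f"estimated dimension: {fractal_dimension(demo):.2f}")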

  3. Anticipatory Traumatic Reaction: Outcomes Arising From Secondary Exposure to Disasters and Large-Scale Threats.

    PubMed

    Hopwood, Tanya L; Schutte, Nicola S; Loi, Natasha M

    2017-09-01

    Two studies, with a total of 707 participants, developed and examined the reliability and validity of a measure for anticipatory traumatic reaction (ATR), a novel construct describing a form of distress that may occur in response to threat-related media reports and discussions. Exploratory and confirmatory factor analysis resulted in a scale comprising three subscales: feelings related to future threat; preparatory thoughts and actions; and disruption to daily activities. Internal consistency was .93 for the overall ATR scale. The ATR scale demonstrated convergent validity through associations with negative affect, depression, anxiety, stress, neuroticism, and repetitive negative thinking. The scale showed discriminant validity in relationships to Big Five characteristics. The ATR scale had some overlap with a measure of posttraumatic stress disorder, but also showed substantial separate variance. This research provides preliminary evidence for the novel construct of ATR as well as a measure of the construct. The ATR scale will allow researchers to further investigate anticipatory traumatic reaction in the fields of trauma, clinical practice, and social psychology.
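
    The internal-consistency value quoted above (.93) is a Cronbach's alpha. The sketch below shows that computation on a respondents-by-items matrix; it is a generic illustration with synthetic data, not the authors' analysis.

      # Minimal Cronbach's alpha: rows = respondents, columns = scale items.
      # Synthetic data stands in for the actual survey responses.
      import numpy as np

      def cronbach_alpha(items):
          items = np.asarray(items, dtype=float)
          k = items.shape[1]                              # number of items
          item_vars = items.var(axis=0, ddof=1).sum()     # sum of item variances
          total_var = items.sum(axis=1).var(ddof=1)       # variance of total scores
          return (k / (k - 1.0)) * (1.0 - item_vars / total_var)

      rng = np.random.default_rng(1)
      latent = rng.normal(size=(200, 1))                  # one underlying factor
      responses = latent + 0.5 * rng.normal(size=(200, 10))
      print(f"alpha = {cronbach_alpha(responses):.2f}")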

  4. Original BPC3 Research Plan

    Cancer.gov

    The Breast and Prostate Cancer and Hormone-Related Gene Variant Study allows large-scale analyses of breast and prostate cancer risk in relation to genetic polymorphisms and gene-environment interactions that affect hormone metabolism.

  5. The Greatest Legacy of the Large Scale Biosphere Atmosphere Experiment in Amazonia (LBA): A Bibliometric Assessment

    NASA Astrophysics Data System (ADS)

    Keller, M. M.

    2015-12-01

    The Large Scale Biosphere Atmosphere Experiment in Amazonia (LBA) is an international continental scale effort led by Brazil to understand how land use change and climate change affect the role of Amazonia in the Earth system. During the first decade of studies (1998-2007), LBA researchers generated new understanding of Amazonia and published over 1000 papers. However, most LBA participants agree that training and education of a large cohort of scientists, especially students from Brazil, was the greatest contribution of LBA. I analyzed bibliographic data from the NASA-supported component project known as LBA-ECO. This component covered a large cross-section of the LBA subject areas, highlighting land use and land cover change, carbon cycling, nutrient cycling and other aspects of terrestrial and aquatic ecology. I reviewed the complete bibliography of peer-reviewed papers reported by LBA-ECO researchers (http://www.lbaeco.org/cgi-bin/web/investigations/lbaeco_refs.pl). The researchers reported 691 contributions from 1996 through 2013, of which 24 were theses that were removed from further analysis. Of the 667 papers and book chapters, I tallied the first authors, separating categories for Brazilians, all students, and Brazilian students. Numerically, LBA-ECO production of papers peaked in 2004. Publication by Brazilians, students, and Brazilian students generally followed the same pattern as publication in general. However, student and Brazilian student contributions as first authors showed clearly increasing proportions of the papers from project initiation through peak publication. Brazilian student participation as first authors averaged more than 20% of all publications from 2003 to 2010, and more than half of all student publications had Brazilians as first authors. Foreign researchers, some initially reluctant to invest in Brazilian students, almost universally adopted the belief that the greatest legacy of LBA would be the contribution to building a cadre of environmental researchers and professionals for the Amazon region. This belief was transformed into a commitment through pressure by NASA management and through the leadership of the LBA-ECO research team, leading to LBA's greatest legacy.

  6. Research on precision grinding technology of large scale and ultra thin optics

    NASA Astrophysics Data System (ADS)

    Zhou, Lian; Wei, Qiancai; Li, Jie; Chen, Xianhua; Zhang, Qinghua

    2018-03-01

    The flatness and parallelism errors of large scale and ultra thin optics have an important influence on the subsequent polishing efficiency and accuracy. In order to realize high precision grinding of those ductile elements, a low deformation vacuum chuck was designed first, which was used for clamping the optics with high supporting rigidity over the full aperture. Then the optics was planar-ground under vacuum adsorption. After machining, the vacuum system was turned off. The form error of the optics was measured on-machine using a displacement sensor after elastic restitution. The flatness was then converged to high accuracy by compensation machining, whose trajectories were generated from the measurement result. To obtain high parallelism, the optics was turned over and compensation-ground using the form error of the vacuum chuck. Finally, a grinding experiment on large scale and ultra thin fused silica optics with an aperture of 430 mm × 430 mm × 10 mm was performed. The best P-V flatness of the optics was below 3 μm, and the parallelism was below 3″. This machining technique has been applied in batch grinding of large scale and ultra thin optics.

  7. An Illustrative Guide to the Minerva Framework

    NASA Astrophysics Data System (ADS)

    Flom, Erik; Leonard, Patrick; Hoeffel, Udo; Kwak, Sehyun; Pavone, Andrea; Svensson, Jakob; Krychowiak, Maciej; Wendelstein 7-X Team Collaboration

    2017-10-01

    Modern physics experiments require tracking and modelling data and their associated uncertainties on a large scale, as well as the combined implementation of multiple independent data streams for sophisticated modelling and analysis. The Minerva Framework offers a centralized, user-friendly method of large-scale physics modelling and scientific inference. Currently used by teams at multiple large-scale fusion experiments, including the Joint European Torus (JET) and Wendelstein 7-X (W7-X), the Minerva framework provides a forward-model-friendly architecture for developing and implementing models for large-scale experiments. One aspect of the framework involves so-called data sources, which are nodes in the graphical model. These nodes are supplied with engineering and physics parameters. When end-user-level code calls a node, it is checked network-wide against its dependent nodes for changes since its last evaluation and returns version-specific data. Here, a filterscope data node is used as an illustrative example of the Minerva Framework's data management structure and its further application to Bayesian modelling of complex systems. This work has been carried out within the framework of the EUROfusion Consortium and has received funding from the Euratom research and training programme 2014-2018 under Grant Agreement No. 633053.
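
    The data-source behaviour described here, a node that returns cached, version-specific results unless one of its dependencies has changed, can be sketched generically as follows. This is not the Minerva API; all class, method, and node names are invented for illustration.

      # Generic sketch of version-checked nodes in a dependency graph (not the Minerva API).
      class Node:
          def __init__(self, name, compute, parents=()):
              self.name = name
              self._compute = compute
              self.parents = list(parents)
              self.version = 0
              self._cache = None          # (result, parent versions used to build it)

          def invalidate(self):
              """Signal that this node's own inputs (e.g. parameters) have changed."""
              self.version += 1
              self._cache = None

          def value(self):
              parent_values = [p.value() for p in self.parents]      # refresh parents first
              parent_versions = tuple(p.version for p in self.parents)
              if self._cache is None or self._cache[1] != parent_versions:
                  self._cache = (self._compute(*parent_values), parent_versions)
                  self.version += 1                                   # downstream nodes see a change
              return self._cache[0]

      # Hypothetical usage: a raw "filterscope"-style signal feeding a derived quantity.
      raw = Node("raw_signal", lambda: [1.0, 2.0, 3.0])
      calibrated = Node("calibrated", lambda xs: [2.0 * x for x in xs], parents=[raw])
      print(calibrated.value())   # computed
      print(calibrated.value())   # served from cache
      raw.invalidate()            # e.g. new engineering parameters arrive
      print(calibrated.value())   # recomputed because a dependency changed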

  8. Large-scale fortification of condiments and seasonings as a public health strategy: equity considerations for implementation.

    PubMed

    Zamora, Gerardo; Flores-Urrutia, Mónica Crissel; Mayén, Ana-Lucia

    2016-09-01

    Fortification of staple foods with vitamins and minerals is an effective approach to increase micronutrient intake and improve nutritional status. The specific use of condiments and seasonings as vehicles in large-scale fortification programs is a relatively new public health strategy. This paper underscores equity considerations for the implementation of large-scale fortification of condiments and seasonings as a public health strategy by examining nonexhaustive examples of programmatic experiences and pilot projects in various settings. An overview of conceptual elements in implementation research and equity is presented, followed by an examination of equity considerations for five implementation strategies: (1) enhancing the capabilities of the public sector, (2) improving the performance of implementing agencies, (3) strengthening the capabilities and performance of frontline workers, (4) empowering communities and individuals, and (5) supporting multiple stakeholders engaged in improving health. Finally, specific considerations related to intersectoral action are discussed. Large-scale fortification of condiments and seasonings cannot be a standalone strategy and needs to be implemented with concurrent and coordinated public health strategies, which should be informed by a health equity lens. © 2016 New York Academy of Sciences.

  9. A Survey of Residents' Perceptions of the Effect of Large-Scale Economic Developments on Perceived Safety, Violence, and Economic Benefits.

    PubMed

    Fabio, Anthony; Geller, Ruth; Bazaco, Michael; Bear, Todd M; Foulds, Abigail L; Duell, Jessica; Sharma, Ravi

    2015-01-01

    Emerging research highlights the promise of community- and policy-level strategies in preventing youth violence. Large-scale economic developments, such as sports and entertainment arenas and casinos, may improve the living conditions, economics, public health, and overall wellbeing of area residents, and may influence rates of violence within communities. To assess the effect of community economic development efforts on neighborhood residents' perceptions of violence, safety, and economic benefits. Telephone survey in 2011 using a listed sample of randomly selected numbers in six Pittsburgh neighborhoods. Descriptive analyses examined measures of perceived violence and safety and economic benefit. Responses were compared across neighborhoods using chi-square tests for multiple comparisons. Survey results were compared to census and police data. Residents in neighborhoods with the large-scale economic developments reported more casino-specific and arena-specific economic benefits. However, 42% of participants in the neighborhood with the entertainment arena felt there was an increase in crime, and 29% of respondents from the neighborhood with the casino felt there was an increase. In contrast, crime actually decreased in both neighborhoods. Large-scale economic developments can have a direct influence on the perception of violence, regardless of trends in actual violence rates.

  10. Next-Generation Sequencing: The Translational Medicine Approach from “Bench to Bedside to Population”

    PubMed Central

    Beigh, Mohammad Muzafar

    2016-01-01

    Humans have long suspected a relationship between heredity and disease, but only at the beginning of the last century did scientists begin to discover the connections between different genes and disease phenotypes. Recent trends in next-generation sequencing (NGS) technologies have brought great momentum to biomedical research, which in turn has remarkably augmented our basic understanding of human biology and its associated diseases. State-of-the-art next-generation biotechnologies have started making huge strides in our current understanding of the mechanisms of various chronic illnesses such as cancers, metabolic disorders, and neurodegenerative anomalies. We are experiencing a renaissance in biomedical research driven primarily by next-generation technologies such as genomics, transcriptomics, proteomics, metabolomics, and lipidomics. Although genomic discoveries are at the forefront of next-generation omics technologies, their implementation in the clinical arena has been painstakingly slow, mainly because of high reaction costs and the unavailability of the requisite computational tools for large-scale data analysis. However, rapid innovations and the steadily falling cost of sequence-based chemistries, along with the development of advanced bioinformatics tools, have lately prompted the launch and implementation of large-scale massively parallel genome sequencing programs in fields ranging from medical genetics to infectious disease biology and the agricultural sciences. Recent advances in large-scale omics technologies are moving healthcare research beyond the traditional “bench to bedside” approach toward a continuum that includes improvements in public healthcare and is based primarily on a predictive, preventive, personalized, and participatory (P4) medicine approach. Recent large-scale research projects in genetic and infectious disease biology have indicated that massively parallel whole-genome/whole-exome sequencing, transcriptome analysis, and other functional genomic tools can reveal large numbers of unique functional elements and/or markers that would otherwise go undetected by traditional sequencing methodologies. The latest trends in biomedical research are therefore giving birth to a new branch of medicine commonly referred to as personalized and/or precision medicine. Developments in the post-genomic era are believed to completely restructure the present clinical pattern of disease prevention and treatment as well as methods of diagnosis and prognosis. The next important step in the direction of the precision/personalized medicine approach should be its early adoption in clinics for future medical interventions. Consequently, in coming years next-generation biotechnologies will reorient medical practice more towards disease prediction and prevention rather than curing disease at later stages of development and progression, even at wider population level(s) within the general public healthcare system. PMID:28930123

  11. Do you kiss your mother with that mouth? An authentic large-scale undergraduate research experience in mapping the human oral microbiome.

    PubMed

    Wang, Jack T H; Daly, Joshua N; Willner, Dana L; Patil, Jayee; Hall, Roy A; Schembri, Mark A; Tyson, Gene W; Hugenholtz, Philip

    2015-05-01

    Clinical microbiology testing is crucial for the diagnosis and treatment of community and hospital-acquired infections. Laboratory scientists need to utilize technical and problem-solving skills to select from a wide array of microbial identification techniques. The inquiry-driven laboratory training required to prepare microbiology graduates for this professional environment can be difficult to replicate within undergraduate curricula, especially in courses that accommodate large student cohorts. We aimed to improve undergraduate scientific training by engaging hundreds of introductory microbiology students in an Authentic Large-Scale Undergraduate Research Experience (ALURE). The ALURE aimed to characterize the microorganisms that reside in the healthy human oral cavity (the oral microbiome) by analyzing hundreds of samples obtained from student volunteers within the course. Students were able to choose from selective and differential culture media, Gram-staining, microscopy, as well as polymerase chain reaction (PCR) and 16S rRNA gene sequencing techniques, in order to collect, analyze, and interpret novel data to determine the collective oral microbiome of the student cohort. Pre- and postsurvey analysis of student learning gains across two iterations of the course (2012-2013) revealed significantly higher student confidence in laboratory skills following the completion of the ALURE (p < 0.05 using the Mann-Whitney U-test). Learning objectives on effective scientific communication were also met through effective student performance in laboratory reports describing the research outcomes of the project. The integration of undergraduate research in clinical microbiology has the capacity to deliver authentic research experiences and improve scientific training for large cohorts of undergraduate students.
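
    The pre/post comparison of learning gains cited above relies on a Mann-Whitney U test. The sketch below shows that kind of comparison in SciPy on synthetic Likert-style confidence ratings; it is illustrative only and does not use the course data.

      # Illustrative Mann-Whitney U comparison of pre- vs post-course confidence ratings.
      # Synthetic Likert-style data; not the ALURE survey results.
      import numpy as np
      from scipy.stats import mannwhitneyu

      rng = np.random.default_rng(42)
      pre = rng.integers(1, 6, size=120)                          # 1-5 ratings before the course
      post = np.clip(pre + rng.integers(0, 3, size=120), 1, 5)    # shifted upward after the course

      stat, p = mannwhitneyu(post, pre, alternative="greater")
      print(f"U = {stat:.0f}, p = {p:.4f}")                       # p < 0.05 indicates higher post ratings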

  12. Housing first on a large scale: Fidelity strengths and challenges in the VA's HUD-VASH program.

    PubMed

    Kertesz, Stefan G; Austin, Erika L; Holmes, Sally K; DeRussy, Aerin J; Van Deusen Lukas, Carol; Pollio, David E

    2017-05-01

    Housing First (HF) combines permanent supportive housing and supportive services for homeless individuals and removes traditional treatment-related preconditions for housing entry. There has been little research describing strengths and shortfalls of HF implementation outside of research demonstration projects. The U.S. Department of Veterans Affairs (VA) has transitioned to an HF approach in a supportive housing program serving over 85,000 persons. This offers a naturalistic window to study fidelity when HF is adopted on a large scale. We operationalized HF into 20 criteria grouped into 5 domains. We assessed 8 VA medical centers twice (1 year apart), scoring each criterion using a scale ranging from 1 (low fidelity) to 4 (high fidelity). There were 2 HF domains (no preconditions and rapidly offering permanent housing) for which high fidelity was readily attained. There was uneven progress in prioritizing the most vulnerable clients for housing support. Two HF domains (sufficient supportive services and a modern recovery philosophy) had considerably lower fidelity. Interviews suggested that operational issues such as shortfalls in staffing and training likely hindered performance in these 2 domains. In this ambitious national HF program, the largest to date, we found substantial fidelity in focusing on permanent housing and removal of preconditions to housing entry. Areas of concern included the adequacy of supportive services and adequacy in deployment of a modern recovery philosophy. Under real-world conditions, large-scale implementation of HF is likely to require significant additional investment in client service supports to assure that results are concordant with those found in research studies. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  13. Validation of large-scale, monochromatic UV disinfection systems for drinking water using dyed microspheres.

    PubMed

    Blatchley, E R; Shen, C; Scheible, O K; Robinson, J P; Ragheb, K; Bergstrom, D E; Rokjer, D

    2008-02-01

    Dyed microspheres have been developed as a new method for validation of ultraviolet (UV) reactor systems. When properly applied, dyed microspheres allow measurement of the UV dose distribution delivered by a photochemical reactor for a given operating condition. Prior to this research, dyed microspheres had only been applied to a bench-scale UV reactor. The goal of this research was to extend the application of dyed microspheres to large-scale reactors. Dyed microsphere tests were conducted on two prototype large-scale UV reactors at the UV Validation and Research Center of New York (UV Center) in Johnstown, NY. All microsphere tests were conducted under conditions that had been used previously in biodosimetry experiments involving two challenge bacteriophage: MS2 and Qbeta. Numerical simulations based on computational fluid dynamics and irradiance field modeling were also performed for the same set of operating conditions used in the microspheres assays. Microsphere tests on the first reactor illustrated difficulties in sample collection and discrimination of microspheres against ambient particles. Changes in sample collection and work-up were implemented in tests conducted on the second reactor that allowed for improvements in microsphere capture and discrimination against the background. Under these conditions, estimates of the UV dose distribution from the microspheres assay were consistent with numerical simulations and the results of biodosimetry, using both challenge organisms. The combined application of dyed microspheres, biodosimetry, and numerical simulation offers the potential to provide a more in-depth description of reactor performance than any of these methods individually, or in combination. This approach also has the potential to substantially reduce uncertainties in reactor validation, thereby leading to better understanding of reactor performance, improvements in reactor design, and decreases in reactor capital and operating costs.
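
    For context, the dose distribution referred to here is usually defined per fluid-element (or microsphere) trajectory through the reactor. In generic notation (not necessarily the authors' own):

        D_i = \int_{t_{\mathrm{in}}}^{t_{\mathrm{out}}} E\big(\mathbf{x}_i(t)\big)\,\mathrm{d}t,

    where E(x) is the local fluence rate from the irradiance-field model, x_i(t) is the i-th particle path from the computational fluid dynamics solution, and the collection of values {D_i} over many trajectories is the dose distribution that biodosimetry and the dyed-microsphere assay estimate experimentally.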

  14. Large scale in vivo recordings to study neuronal biophysics.

    PubMed

    Giocomo, Lisa M

    2015-06-01

    Over the last several years, technological advances have enabled researchers to more readily observe single-cell membrane biophysics in awake, behaving animals. Studies utilizing these technologies have provided important insights into the mechanisms generating functional neural codes in both sensory and non-sensory cortical circuits. Crucial for a deeper understanding of how membrane biophysics control circuit dynamics, however, is a continued effort to move toward large-scale studies of membrane biophysics, in terms of the numbers of neurons and ion channels examined. Future work faces a number of theoretical and technical challenges on this front, but recent technological developments hold great promise for a larger scale understanding of how membrane biophysics contribute to circuit coding and computation. Copyright © 2014 Elsevier Ltd. All rights reserved.

  15. Channel optimization of high-intensity laser beams in millimeter-scale plasmas

    DOE PAGES

    Ceurvorst, L.; Savin, A.; Ratan, N.; ...

    2018-04-20

    Channeling experiments were performed at the OMEGA EP facility using relativistic intensity (>10¹⁸ W/cm²) kilojoule laser pulses through large density scale length (~390-570 μm) laser-produced plasmas, demonstrating the effects of the pulse’s focal location and intensity as well as the plasma’s temperature on the resulting channel formation. The results show deeper channeling when focused into hot plasmas and at lower densities as expected. However, contrary to previous large scale particle-in-cell studies, the results also indicate deeper penetration by short (10 ps), intense pulses compared to their longer duration equivalents. To conclude, this new observation has many implications for future laser-plasma research in the relativistic regime.

  16. Channel optimization of high-intensity laser beams in millimeter-scale plasmas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ceurvorst, L.; Savin, A.; Ratan, N.

    Channeling experiments were performed at the OMEGA EP facility using relativistic intensity (>10¹⁸ W/cm²) kilojoule laser pulses through large density scale length (~390-570 μm) laser-produced plasmas, demonstrating the effects of the pulse’s focal location and intensity as well as the plasma’s temperature on the resulting channel formation. The results show deeper channeling when focused into hot plasmas and at lower densities as expected. However, contrary to previous large scale particle-in-cell studies, the results also indicate deeper penetration by short (10 ps), intense pulses compared to their longer duration equivalents. To conclude, this new observation has many implications for future laser-plasma research in the relativistic regime.

  17. Wildfire as a hydrological and geomorphological agent

    NASA Astrophysics Data System (ADS)

    Shakesby, R. A.; Doerr, S. H.

    2006-02-01

    Wildfire can lead to considerable hydrological and geomorphological change, both directly by weathering bedrock surfaces and changing soil structure and properties, and indirectly through the effects of changes to the soil and vegetation on hydrological and geomorphological processes. This review summarizes current knowledge and identifies research gaps, focusing particularly on the contribution of research from the Mediterranean Basin, Australia and South Africa over the last two decades or so to the state of knowledge mostly built on research carried out in the USA. Wildfire-induced weathering rates have been reported to be high relative to other weathering processes in fire-prone terrain, possibly as much as one or two orders of magnitude higher than frost action, with important implications for cosmogenic-isotope dating of the length of rock exposure. Wildfire impacts on soil properties have been a major focus of interest over the last two decades. Fire usually reduces soil aggregate stability and can induce, enhance or destroy soil water repellency depending on the temperature reached and its duration. These changes have implications for infiltration, overland flow and rainsplash detachment. A large proportion of publications concerned with fire impacts have focused on post-fire soil erosion by water, particularly at small scales. These have shown elevated, sometimes extremely large post-fire losses before geomorphological stability is re-established. Soil losses per unit area are generally negatively related to measurement scale, reflecting increased opportunities for sediment storage at larger scales. Over the last 20 years, there has been much improvement in the understanding of the forms, causes and timing of debris flow and landslide activity on burnt terrain. Advances in previously largely unreported processes (e.g. bio-transfer of sediment and wind erosion) have also been made. Post-fire hydrological effects have generally also been studied at small rather than large scales, with soil water repellency effects on infiltration and overland flow being a particular focus. At catchment scales, post-fire accentuated peakflow has received more attention than changes in total flow, reflecting easier measurement and the greater hazard posed by the former. Post-fire changes to stream channels occur over both short and long terms with complex feedback mechanisms, though research to date has been limited. Research gaps identified include the need to: (1) develop a fire severity index relevant to soil changes rather than to degree of biomass destruction; (2) isolate the hydrological and geomorphological impacts of fire-induced soil water repellency changes from other important post-fire changes (e.g. litter and vegetation destruction); (3) improve knowledge of the hydrological and geomorphological impacts of wildfire in a wider range of fire-prone terrain types; (4) solve important problems in the determination and analysis of hillslope and catchment sediment yields including poor knowledge about soil losses other than at small spatial and short temporal scales, the lack of a clear measure of the degradational significance of post-fire soil losses, and confusion arising from errors in and lack of scale context for many quoted post-fire soil erosion rates; and (5) increase the research effort into past and potential future hydrological and geomorphological changes resulting from wildfire.

  18. Effects of season and scale on response of elk and mule deer to habitat manipulation

    Treesearch

    Ryan A. Long; Janet L. Rachlow; John G. Kie

    2008-01-01

    Manipulation of forest habitat via mechanical thinning or prescribed fire has become increasingly common across western North America. Nevertheless, empirical research on effects of those activities on wildlife is limited, although prescribed fire in particular often is assumed to benefit large herbivores. We evaluated effects of season and spatial scale on response of...

  19. Effects of Scaled-Up Professional Development Courses about Inquiry-Based Learning on Teachers

    ERIC Educational Resources Information Center

    Maass, Katja; Engeln, Katrin

    2018-01-01

    Although well researched in educational studies, inquiry-based learning, a student-centred way of teaching, is far away from being implemented in day-to-day science and mathematics teaching on a large scale. It is a challenge for teachers to adopt this new way of teaching in an often not supportive school context. Therefore it is important to…

  20. Social Activism in Elementary Science Education: A Science, Technology, and Society Approach to Teach Global Warming

    ERIC Educational Resources Information Center

    Lester, Benjamin T.; Ma, Li; Lee, Okhee; Lambert, Julie

    2006-01-01

    As part of a large-scale instructional intervention research, this study examined elementary students' science knowledge and awareness of social activism with regard to an increased greenhouse effect and global warming. The study involved fifth-grade students from five elementary schools of varying demographic makeup in a large urban school…

  1. Study Healthy Ageing and Intellectual Disabilities: Recruitment and Design

    ERIC Educational Resources Information Center

    Hilgenkamp, Thessa I. M.; Bastiaanse, Luc P.; Hermans, Heidi; Penning, Corine; van Wijck, Ruud; Evenhuis, Heleen M.

    2011-01-01

    Problems encountered in epidemiologic health research in older adults with intellectual disabilities (ID) are how to recruit a large-scale sample of participants and how to measure a range of health variables in such a group. This cross-sectional study into healthy ageing started with founding a consortium of three large care providers with a total…

  2. A Review of Challenges in Developing a National Program for Gifted Children in India's Diverse Context

    ERIC Educational Resources Information Center

    Kurup, Anitha; Maithreyi, R.

    2012-01-01

    Large-scale sequential research developments for identification and measurement of giftedness have received ample attention in the West, whereas India's response to this has largely been lukewarm. The wide variation in parents' abilities to provide enriched environments to nurture their children's potential makes it imperative for India to develop…

  3. Ice-Accretion Test Results for Three Large-Scale Swept-Wing Models in the NASA Icing Research Tunnel

    NASA Technical Reports Server (NTRS)

    Broeren, Andy P.; Potapczuk, Mark G.; Lee, Sam; Malone, Adam M.; Paul, Bernard P., Jr.; Woodard, Brian S.

    2016-01-01

    Icing simulation tools and computational fluid dynamics codes are reaching levels of maturity such that they are being proposed by manufacturers for use in certification of aircraft for flight in icing conditions with increasingly less reliance on natural-icing flight testing and icing-wind-tunnel testing. Sufficient high-quality data to evaluate the performance of these tools is not currently available. The objective of this work was to generate a database of ice-accretion geometry that can be used for development and validation of icing simulation tools as well as for aerodynamic testing. Three large-scale swept wing models were built and tested at the NASA Glenn Icing Research Tunnel (IRT). The models represented the Inboard (20% semispan), Midspan (64% semispan) and Outboard stations (83% semispan) of a wing based upon a 65% scale version of the Common Research Model (CRM). The IRT models utilized a hybrid design that maintained the full-scale leading-edge geometry with a truncated afterbody and flap. The models were instrumented with surface pressure taps in order to acquire sufficient aerodynamic data to verify the hybrid model design capability to simulate the full-scale wing section. A series of ice-accretion tests were conducted over a range of total temperatures from -23.8 deg C to -1.4 deg C with all other conditions held constant. The results showed the changing ice-accretion morphology from rime ice at the colder temperatures to highly 3-D scallop ice in the range of -11.2 deg C to -6.3 deg C. Warmer temperatures generated highly 3-D ice accretion with glaze ice characteristics. The results indicated that the general scallop ice morphology was similar for all three models. Icing results were documented for limited parametric variations in angle of attack, drop size and cloud liquid-water content (LWC). The effect of velocity on ice accretion was documented for the Midspan and Outboard models for a limited number of test cases. The data suggest that there are morphological characteristics of glaze and scallop ice accretion on these swept-wing models that are dependent upon the velocity. This work has resulted in a large database of ice-accretion geometry on large-scale, swept-wing models.

  4. Ice-Accretion Test Results for Three Large-Scale Swept-Wing Models in the NASA Icing Research Tunnel

    NASA Technical Reports Server (NTRS)

    Broeren, Andy P.; Potapczuk, Mark G.; Lee, Sam; Malone, Adam M.; Paul, Bernard P., Jr.; Woodard, Brian S.

    2016-01-01

    Icing simulation tools and computational fluid dynamics codes are reaching levels of maturity such that they are being proposed by manufacturers for use in certification of aircraft for flight in icing conditions with increasingly less reliance on natural-icing flight testing and icing-wind-tunnel testing. Sufficient high-quality data to evaluate the performance of these tools is not currently available. The objective of this work was to generate a database of ice-accretion geometry that can be used for development and validation of icing simulation tools as well as for aerodynamic testing. Three large-scale swept wing models were built and tested at the NASA Glenn Icing Research Tunnel (IRT). The models represented the Inboard (20 percent semispan), Midspan (64 percent semispan) and Outboard stations (83 percent semispan) of a wing based upon a 65 percent scale version of the Common Research Model (CRM). The IRT models utilized a hybrid design that maintained the full-scale leading-edge geometry with a truncated afterbody and flap. The models were instrumented with surface pressure taps in order to acquire sufficient aerodynamic data to verify the hybrid model design capability to simulate the full-scale wing section. A series of ice-accretion tests were conducted over a range of total temperatures from -23.8 to -1.4 C with all other conditions held constant. The results showed the changing ice-accretion morphology from rime ice at the colder temperatures to highly 3-D scallop ice in the range of -11.2 to -6.3 C. Warmer temperatures generated highly 3-D ice accretion with glaze ice characteristics. The results indicated that the general scallop ice morphology was similar for all three models. Icing results were documented for limited parametric variations in angle of attack, drop size and cloud liquid-water content (LWC). The effect of velocity on ice accretion was documented for the Midspan and Outboard models for a limited number of test cases. The data suggest that there are morphological characteristics of glaze and scallop ice accretion on these swept-wing models that are dependent upon the velocity. This work has resulted in a large database of ice-accretion geometry on large-scale, swept-wing models.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schreck, S.; Sant, T.; Micallef, D.

    Wind turbine structures and components suffer excessive loads and premature failures when key aerodynamic phenomena are not well characterized, fail to be understood, or are inaccurately predicted. Turbine blade rotational augmentation remains incompletely characterized and understood, thus limiting robust prediction for design. Pertinent rotational augmentation research including experimental, theoretical, and computational work has been pursued for some time, but large scale wind tunnel testing is a relatively recent development for investigating wind turbine blade aerodynamics. Because of their large scale and complementary nature, the MEXICO and UAE Phase VI wind tunnel experiments offer unprecedented synergies to better characterize and understand rotational augmentation of blade aerodynamics.

  6. Advances and trends in computational structural mechanics

    NASA Technical Reports Server (NTRS)

    Noor, A. K.

    1986-01-01

    Recent developments in computational structural mechanics are reviewed with reference to computational needs for future structures technology, advances in computational models for material behavior, discrete element technology, assessment and control of numerical simulations of structural response, hybrid analysis, and techniques for large-scale optimization. Research areas in computational structural mechanics which have high potential for meeting future technological needs are identified. These include prediction and analysis of the failure of structural components made of new materials, development of computational strategies and solution methodologies for large-scale structural calculations, and assessment of reliability and adaptive improvement of response predictions.

  7. Investigation of the Large Scale Evolution and Topology of Coronal Mass Ejections in the Solar Wind

    NASA Technical Reports Server (NTRS)

    Riley, Peter

    1999-01-01

    This investigation is concerned with the large-scale evolution and topology of Coronal Mass Ejections (CMEs) in the solar wind. During this reporting period we have analyzed a series of low density intervals in the ACE (Advanced Composition Explorer) plasma data set that bear many similarities to CMEs. We have begun a series of 3D, MHD (Magnetohydrodynamics) coronal models to probe potential causes of these events. We also edited two manuscripts concerning the properties of CMEs in the solar wind. One was re-submitted to the Journal of Geophysical Research.

  8. Cost-Driven Design of a Large Scale X-Plane

    NASA Technical Reports Server (NTRS)

    Welstead, Jason R.; Frederic, Peter C.; Frederick, Michael A.; Jacobson, Steven R.; Berton, Jeffrey J.

    2017-01-01

    A conceptual design process focused on the development of a low-cost, large scale X-plane was developed as part of an internal research and development effort. One of the concepts considered for this process was the double-bubble configuration recently developed as an advanced single-aisle class commercial transport similar in size to a Boeing 737-800 or Airbus A320. The study objective was to reduce the contractor cost from contract award to first test flight to less than $100 million, and having the first flight within three years of contract award. Methods and strategies for reduced cost are discussed.

  9. Enabling Large-Scale Biomedical Analysis in the Cloud

    PubMed Central

    Lin, Ying-Chih; Yu, Chin-Sheng; Lin, Yen-Jen

    2013-01-01

    Recent progress in high-throughput instrumentation has led to astonishing growth in both the volume and the complexity of biomedical data collected from various sources. These planet-size data bring serious challenges to storage and computing technologies. Cloud computing is a promising alternative because it addresses both storage and high-performance computing for large-scale data. This work briefly introduces data-intensive computing systems and summarizes existing cloud-based resources in bioinformatics. These developments and applications should help biomedical research make the vast amount of diverse data meaningful and usable. PMID:24288665

  10. Environmental aspects of large-scale wind-power systems in the UK

    NASA Astrophysics Data System (ADS)

    Robson, A.

    1984-11-01

    Environmental issues relating to the introduction of large, MW-scale wind turbines at land-based sites in the UK are discussed. Noise, television interference, hazards to bird life, and visual effects are considered. Areas of uncertainty are identified, but enough is known from experience elsewhere in the world to enable the first UK machines to be introduced in a safe and environmentally acceptable manner. Research to establish siting criteria more clearly and to significantly increase the potential wind-energy resource is mentioned. Studies of the comparative risk of energy systems are shown to be overpessimistic for UK wind turbines.

  11. Survey of organizational research climates in three research intensive, doctoral granting universities.

    PubMed

    Wells, James A; Thrush, Carol R; Martinson, Brian C; May, Terry A; Stickler, Michelle; Callahan, Eileen C; Klomparens, Karen L

    2014-12-01

    The Survey of Organizational Research Climate (SOuRCe) is a new instrument that assesses dimensions of research integrity climate, including ethical leadership, socialization and communication processes, and policies, procedures, structures, and processes to address risks to research integrity. We present a descriptive analysis to characterize differences on the SOuRCe scales across departments, fields of study, and status categories (faculty, postdoctoral scholars, and graduate students) for 11,455 respondents from three research-intensive universities. Among the seven SOuRCe scales, variance explained by status and fields of study ranged from 7.6% (Advisor-Advisee Relations) to 16.2% (Integrity Norms). Department accounted for greater than 50% of the variance explained for each of the SOuRCe scales, ranging from 52.6% (Regulatory Quality) to 80.3% (Integrity Inhibitors). It is feasible to implement this instrument in large university settings across a broad range of fields, department types, and individual roles within academic units. Published baseline results provide initial data for institutions using the SOuRCe who wish to compare their own research integrity climates. © The Author(s) 2014.

  12. Academic Productivity as Perceived by Malaysian Academics

    ERIC Educational Resources Information Center

    Hassan, Aminuddin; Tymms, Peter; Ismail, Habsah

    2008-01-01

    The purpose of this research is to explore the perspectives of Malaysian academics in relation to academic productivity and some factors affecting it. A large scale online questionnaire was used to gather information from six public universities. The most productive role in the eyes of the academics was found to be teaching, with research and…

  13. How to Construct an Organizational Field: Empirical Educational Research in Germany, 1995-2015

    ERIC Educational Resources Information Center

    Zapp, Mike; Powell, Justin J. W.

    2016-01-01

    Over the past two decades, educational research in Germany has undergone unprecedented changes. Following large-scale assessments such as the Trends in International Mathematics and Science Study (TIMSS) and the Programme for International Student Assessment (PISA), and a political interest in evidence-based policy-making, quality assessment and…

  14. Monitoring, Modeling, and Emergent Toxicology in the East Fork Watershed: Developing a Test Bed for Water Quality Management.

    EPA Science Inventory

    Overarching objectives for the development of the East Fork Watershed Test Bed in Southwestern Ohio include: 1) providing research infrastructure for integrating risk assessment and management research on the scale of a large multi-use watershed (1295 km²); 2) focusing on process...

  15. Improving Teacher Practice: Teachers' Perspectives on Capacity-Building Initiatives in Literacy

    ERIC Educational Resources Information Center

    Mattos, Joseph C.

    2011-01-01

    Educational research over the past 15 years shows that schools and school districts have, on a large scale, failed to translate reform goals into improved teacher practice and student learning. Although classroom teachers are central to successful school reform, research has rarely examined how teachers experience reform initiatives and how that…

  16. Assessing Student Achievement in Large-Scale Educational Programs Using Hierarchical Propensity Scores

    ERIC Educational Resources Information Center

    Vaughan, Angela L.; Lalonde, Trent L.; Jenkins-Guarnieri, Michael A.

    2014-01-01

    Many researchers assessing the efficacy of educational programs face challenges due to issues with non-randomization and the likelihood of dependence between nested subjects. The purpose of the study was to demonstrate a rigorous research methodology using a hierarchical propensity score matching method that can be utilized in contexts where…

  17. Data Cleaning in Mathematics Education Research: The Overlooked Methodological Step

    ERIC Educational Resources Information Center

    Hubbard, Aleata

    2017-01-01

    The results of educational research studies are only as accurate as the data used to produce them. Drawing on experiences conducting large-scale efficacy studies of classroom-based algebra interventions for community college and middle school students, I am developing practice-based data cleaning procedures to support scholars in conducting…

  18. Educational Research with Real-World Data: Reducing Selection Bias with Propensity Scores

    ERIC Educational Resources Information Center

    Adelson, Jill L.

    2013-01-01

    Often it is infeasible or unethical to use random assignment in educational settings to study important constructs and questions. Hence, educational research often uses observational data, such as large-scale secondary data sets and state and school district data, and quasi-experimental designs. One method of reducing selection bias in estimations…

  19. Evidence-Based Practice for Teachers of Children with Autism: A Dynamic Approach

    ERIC Educational Resources Information Center

    Lubas, Margaret; Mitchell, Jennifer; De Leo, Gianluca

    2016-01-01

    Evidence-based practice related to autism research is a controversial topic. Governmental entities and national agencies are defining evidence-based practice as a specific set of interventions that educators should implement; however, large-scale efforts to generalize autism research, which are often single-subject case designs, may be a setback…

  20. Software Tools | Office of Cancer Clinical Proteomics Research

    Cancer.gov

    The CPTAC program develops new approaches to elucidate aspects of the molecular complexity of cancer made from large-scale proteogenomic datasets, and advance them toward precision medicine.  Part of the CPTAC mission is to make data and tools available and accessible to the greater research community to accelerate the discovery process.
