Managing Risk and Uncertainty in Large-Scale University Research Projects
ERIC Educational Resources Information Center
Moore, Sharlissa; Shangraw, R. F., Jr.
2011-01-01
Both publicly and privately funded research projects managed by universities are growing in size and scope. Complex, large-scale projects (over $50 million) pose new management challenges and risks for universities. This paper explores the relationship between project success and a variety of factors in large-scale university projects. First, we…
ERIC Educational Resources Information Center
Walma van der Molen, Juliette; van Aalderen-Smeets, Sandra
2013-01-01
Attention to the attitudes of primary teachers towards science is of fundamental importance to research on primary science education. The current article describes a large-scale research project that aims to overcome three main shortcomings in attitude research, i.e. lack of a strong theoretical concept of attitude, methodological flaws in…
Internationalization Measures in Large Scale Research Projects
NASA Astrophysics Data System (ADS)
Soeding, Emanuel; Smith, Nancy
2017-04-01
Large-scale research projects (LSRP) often serve as flagships used by universities or research institutions to demonstrate their performance and capability to stakeholders and other interested parties. As global competition among universities for recruiting the brightest minds has increased, effective internationalization measures have become a hot topic for universities and LSRP alike. Nevertheless, most projects and universities have little experience in conducting these measures and in making internationalization a cost-efficient and useful activity. Furthermore, such undertakings constantly have to be justified to the project PIs as important, valuable tools for improving the capacity of the project and the research location. A variety of measures are suited to supporting universities in international recruitment, including institutional partnerships, research marketing, a welcome culture, support for scientific mobility, and an effective alumni strategy. These activities, although often conducted by different university entities, are interlocked and can be very powerful if interfaced effectively. On this poster we display a number of internationalization measures for various target groups and identify interfaces through which project management, university administration, researchers, and international partners can work together, exchange information, and improve processes in order to recruit, support, and retain the brightest minds for a project.
Talking About The Smokes: a large-scale, community-based participatory research project.
Couzos, Sophia; Nicholson, Anna K; Hunt, Jennifer M; Davey, Maureen E; May, Josephine K; Bennet, Pele T; Westphal, Darren W; Thomas, David P
2015-06-01
To describe the Talking About The Smokes (TATS) project according to the World Health Organization guiding principles for conducting community-based participatory research (PR) involving indigenous peoples, to assist others planning large-scale PR projects. The TATS project was initiated in Australia in 2010 as part of the International Tobacco Control Policy Evaluation Project, and surveyed a representative sample of 2522 Aboriginal and Torres Strait Islander adults to assess the impact of tobacco control policies. The PR process of the TATS project, which aimed to build partnerships to create equitable conditions for knowledge production, was mapped and summarised onto a framework adapted from the WHO principles. The mapped processes covered consultation and approval; partnerships and research agreements; communication; funding; ethics and consent; and data ownership and the benefits of the research. The TATS project involved baseline and follow-up surveys conducted in 34 Aboriginal community-controlled health services and one Torres Strait community. Consistent with the WHO PR principles, the TATS project built on community priorities and strengths through strategic partnerships from project inception, and demonstrated the value of research agreements and trusting relationships in fostering shared decision making, capacity building, and a commitment to Indigenous data ownership. Community-based PR methodology, by definition, needs adaptation to local settings and priorities. The TATS project demonstrates that large-scale research can be participatory, with strong Indigenous community engagement and benefits.
Studies on combined model based on functional objectives of large scale complex engineering
NASA Astrophysics Data System (ADS)
Yuting, Wang; Jingchun, Feng; Jiabao, Sun
2018-03-01
Because large-scale complex engineering includes various functions, and each function is realized through the completion of one or more projects, the combined projects affecting each function must be identified. Based on the types of project portfolio, the relationships between projects and their functional objectives were analyzed. On that premise, portfolio-project techniques based on functional objectives were introduced, and the principles governing these techniques were studied and proposed. In addition, the processes of combined projects were constructed. With the help of portfolio-project techniques based on the functional objectives of projects, these findings lay a sound foundation for the portfolio management of large-scale complex engineering.
Implementing Projects in Calculus on a Large Scale at the University of South Florida
ERIC Educational Resources Information Center
Fox, Gordon A.; Campbell, Scott; Grinshpan, Arcadii; Xu, Xiaoying; Holcomb, John; Bénéteau, Catherine; Lewis, Jennifer E.; Ramachandran, Kandethody
2017-01-01
This paper describes the development of a program of project-based learning in Calculus courses at a large urban research university. In this program, students developed research projects in consultation with a faculty advisor in their major, and supervised by their calculus instructors. Students wrote up their projects in a prescribed format…
Psychology in an Interdisciplinary Setting: A Large-Scale Project to Improve University Teaching
ERIC Educational Resources Information Center
Koch, Franziska D.; Vogt, Joachim
2015-01-01
At a German university of technology, a large-scale project was funded as a part of the "Quality Pact for Teaching", a programme launched by the German Federal Ministry of Education and Research to improve the quality of university teaching and study conditions. The project aims at intensifying interdisciplinary networking in teaching,…
Lessons from a Large-Scale Assessment: Results from Conceptual Inventories
ERIC Educational Resources Information Center
Thacker, Beth; Dulli, Hani; Pattillo, Dave; West, Keith
2014-01-01
We report conceptual inventory results of a large-scale assessment project at a large university. We studied the introduction of materials and instructional methods informed by physics education research (PER) (physics education research-informed materials) into a department where most instruction has previously been traditional and a significant…
Raising Concerns about Sharing and Reusing Large-Scale Mathematics Classroom Observation Video Data
ERIC Educational Resources Information Center
Ing, Marsha; Samkian, Artineh
2018-01-01
There are great opportunities and challenges to sharing large-scale mathematics classroom observation data. This Research Commentary describes the methodological opportunities and challenges and provides a specific example from a mathematics education research project to illustrate how the research questions and framework drove observational…
Large-scale standardized phenotyping of strawberry in RosBREED
USDA-ARS?s Scientific Manuscript database
A large, multi-institutional, international research project with the goal of bringing genomicists and plant breeders together was funded by the USDA-NIFA Specialty Crop Research Initiative. Apple, cherry, peach, and strawberry are the Rosaceous crops included in the project. Many (900+) strawberry g...
Ecological research at the Goosenest Adaptive Management Area in northeastern California
Martin W. Ritchie
2005-01-01
This paper describes the establishment of an interdisciplinary, large-scale ecological research project on the Goosenest Adaptive Management Area of the Klamath National Forest in northeastern California. This project is a companion to the Blacks Mountain Ecological Research Project described by Oliver (2000). The genesis for this project was the Northwest...
CRP: Collaborative Research Project (A Mathematical Research Experience for Undergraduates)
ERIC Educational Resources Information Center
Parsley, Jason; Rusinko, Joseph
2017-01-01
The "Collaborative Research Project" ("CRP")--a mathematics research experience for undergraduates--offers a large-scale collaborative experience in research for undergraduate students. CRP seeks to widen the audience of students who participate in undergraduate research in mathematics. In 2015, the inaugural CRP had 100…
Establishment of a National Wind Energy Center at University of Houston
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Su Su
The DOE-supported project objectives are to establish a national wind energy center (NWEC) at the University of Houston and to conduct research addressing critical science and engineering issues for the development of future large MW-scale wind energy production systems, especially offshore wind turbines. The goals of the project are to: (1) establish a sound scientific/technical knowledge base of solutions to critical science and engineering issues for developing future MW-scale large wind energy production systems, (2) develop a state-of-the-art wind rotor blade research facility at the University of Houston, and (3) through multi-disciplinary research, introduce technology innovations in advanced wind-turbine materials, processing/manufacturing technology, design and simulation, and testing and reliability assessment methods related to future wind turbine systems for cost-effective production of offshore wind energy. To achieve these goals, the following technical tasks were planned and executed from April 15, 2010 to October 31, 2014 at the University of Houston: (1) basic research on large offshore wind turbine systems; (2) applied research on innovative wind turbine rotors for large offshore wind energy systems; (3) integration of offshore wind-turbine design, advanced materials, and manufacturing technologies; (4) integrity and reliability of large offshore wind turbine blades and scaled model testing; (5) education and training of graduate and undergraduate students and post-doctoral researchers; and (6) development of a national offshore wind turbine blade research facility. The research program addresses both the basic science and the engineering of current and future large wind turbine systems, especially offshore wind turbines, for MW-scale power generation.
The results of the research advance current understanding of many important scientific issues and provide technical information for developing future large wind turbines with advanced design, composite materials, integrated manufacturing, and structural reliability and integrity. The educational program has trained many graduate and undergraduate students and post-doctoral researchers in the critical science and engineering of wind energy production systems through graduate-level courses, research, and participation in the center's large multi-disciplinary research projects. These students and researchers are now employed by the wind industry, national labs, and universities, supporting the US and international wind energy industry. The national offshore wind turbine blade research facility developed in the project has been used to support the technical and training tasks planned in the program, and it is a national asset available for use by domestic and international researchers in the wind energy arena.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Xiaoliang; Stauffer, Philip H.
This effort is designed to expedite learning from existing and planned large demonstration projects and their associated research through effective knowledge sharing among participants in the US and China.
Data integration in the era of omics: current and future challenges
2014-01-01
Integrating heterogeneous and large omics data constitutes not only a conceptual challenge but also a practical hurdle in the daily analysis of omics data. With the rise of novel omics technologies and through large-scale consortia projects, biological systems are being investigated at an unprecedented scale, generating heterogeneous and often large data sets. These data sets encourage researchers to develop novel data integration methodologies. In this introduction we review the definition of data integration and characterize current efforts in the life sciences. We used a web survey to assess current research projects on data integration and to tap into the views, needs, and challenges as currently perceived by parts of the research community. PMID:25032990
Data management strategies for multinational large-scale systems biology projects.
Wruck, Wasco; Peuker, Martin; Regenbrecht, Christian R A
2014-01-01
Good accessibility of publicly funded research data is essential to secure an open scientific system and is gradually becoming mandatory [Wellcome Trust will Penalise Scientists Who Don't Embrace Open Access. The Guardian 2012]. With the use of high-throughput methods in many research areas, from physics to systems biology, large data collections are increasingly important as raw material for research. Here, we present strategies worked out by international and national institutions targeting open access to publicly funded research data via incentives or obligations to share data. Funding organizations such as the British Wellcome Trust have therefore developed data sharing policies and request commitment to data management and sharing in grant applications. Increased citation rates are a strong argument for sharing publication data. Pre-publication sharing might be rewarded by a data citation credit system via digital object identifiers (DOIs), which were initially introduced for data objects. Besides policies and incentives, good practice in data management is indispensable. However, appropriate systems for managing the data of large-scale projects, for example in systems biology, are hard to find. Here, we give an overview of a selection of open-source data management systems that have been employed successfully in large-scale projects.
NASA Astrophysics Data System (ADS)
Henkel, Daniela; Eisenhauer, Anton
2017-04-01
During the last decades, the number of large research projects has increased, and with it the requirement for multidisciplinary, multisectoral collaboration. Such complex, large-scale projects demand new competencies to form, manage, and use large, diverse teams as a competitive advantage. For complex projects the effort is magnified: multiple large international research consortia involving academic and non-academic partners, including big industries, NGOs, and private and public bodies, all with cultural differences, individually discrepant expectations of teamwork, and differences in collaboration between national and multinational administrations and research organisations, challenge the organisation and management of such multi-partner research consortia. How many partners are needed to establish and conduct collaboration with a multidisciplinary and multisectoral approach? How much personnel effort and what kinds of management techniques are required for such projects? This presentation identifies advantages and challenges of large research projects based on experiences made in the context of an Innovative Training Network (ITN) project within the Marie Skłodowska-Curie Actions of the European HORIZON 2020 programme. Possible strategies are discussed to circumvent and avoid conflicts from the very beginning of the project.
Julee A Herdt; John Hunt; Kellen Schauermann
2016-01-01
This project demonstrates newly invented, biobased construction materials developed by applying low-carbon, biomass waste sources through the authors' engineered fiber processes and technology. If manufactured and applied at large scale, the project inventions can divert large volumes of cellulose waste into high-performance, low-embodied-energy, environmental construction...
HSTDEK: Developing a methodology for construction of large-scale, multi-use knowledge bases
NASA Technical Reports Server (NTRS)
Freeman, Michael S.
1987-01-01
The primary research objectives of the Hubble Space Telescope Design/Engineering Knowledgebase (HSTDEK) are to develop a methodology for constructing and maintaining large-scale knowledge bases which can be used to support multiple applications. To ensure the validity of its results, this research is being pursued in the context of a real-world system, the Hubble Space Telescope. The HSTDEK objectives are described in detail. The history and motivation of the project are briefly described. The technical challenges faced by the project are outlined.
[Privacy and public benefit in using large scale health databases].
Yamamoto, Ryuichi
2014-01-01
In Japan, large-scale health databases have been constructed within a few years, such as the national database of health insurance claims and health checkups (NDB) and the Japanese Sentinel project. But there are legal issues in striking an adequate balance between privacy and public benefit when using such databases. The NDB operates under the act on health care for elderly persons, but this act says nothing about using the database for general public benefit. Therefore researchers who use this database are forced to pay great attention to anonymization and information security, which may hinder the research work itself. The Japanese Sentinel project is a national project for detecting adverse drug reactions using large-scale distributed clinical databases of large hospitals. Although patients give broad future consent to such use for the public good, the use of insufficiently anonymized data is still under discussion. Generally speaking, research for the public benefit will not infringe patients' privacy, but vague and complex legislative requirements for personal data protection may hinder such research. Medical science does not progress without using clinical information; therefore adequate legislation that is simple and clear for both researchers and patients is strongly required. In Japan, a specific act for balancing privacy and public benefit is now under discussion. The author recommends that researchers, including those in the field of pharmacology, pay attention to, participate in the discussion of, and make suggestions regarding such acts and regulations.
Evaluating a collaborative IT based research and development project.
Khan, Zaheer; Ludlow, David; Caceres, Santiago
2013-10-01
In common with all projects, evaluating an Information Technology (IT) based research and development project is necessary in order to discover whether or not the outcomes of the project are successful. However, evaluating large-scale collaborative projects is especially difficult as: (i) stakeholders from different countries are involved who, almost inevitably, have diverse technological and/or application domain backgrounds and objectives; (ii) multiple and sometimes conflicting application specific and user-defined requirements exist; and (iii) multiple and often conflicting technological research and development objectives are apparent. In this paper, we share our experiences based on the large-scale integrated research project - The HUMBOLDT project - with project duration of 54 months, involving contributions from 27 partner organisations, plus 4 sub-contractors from 14 different European countries. In the HUMBOLDT project, a specific evaluation methodology was defined and utilised for the user evaluation of the project outcomes. The user evaluation performed on the HUMBOLDT Framework and its associated nine application scenarios from various application domains, resulted in not only an evaluation of the integrated project, but also revealed the benefits and disadvantages of the evaluation methodology. This paper presents the evaluation methodology, discusses in detail the process of applying it to the HUMBOLDT project and provides an in-depth analysis of the results, which can be usefully applied to other collaborative research projects in a variety of domains. Copyright © 2013 Elsevier Ltd. All rights reserved.
Saunders, Rebecca E; Instrell, Rachael; Rispoli, Rossella; Jiang, Ming; Howell, Michael
2013-01-01
High-throughput screening (HTS) uses technologies such as RNA interference to generate loss-of-function phenotypes on a genomic scale. As these technologies become more popular, many research institutes have established core facilities of expertise to deal with the challenges of large-scale HTS experiments. As the efforts of core facility screening projects come to fruition, focus has shifted towards managing the results of these experiments and making them available in a useful format that can be further mined for phenotypic discovery. The HTS-DB database provides a public view of data from screening projects undertaken by the HTS core facility at the CRUK London Research Institute. All projects and screens are described with comprehensive assay protocols, and datasets are provided with complete descriptions of analysis techniques. This format allows users to browse and search data from large-scale studies in an informative and intuitive way. It also provides a repository for additional measurements obtained from screens that were not the focus of the project, such as cell viability, and groups these data so that it can provide a gene-centric summary across several different cell lines and conditions. All datasets from our screens that can be made available can be viewed interactively and mined for further hit lists. We believe that in this format, the database provides researchers with rapid access to results of large-scale experiments that might facilitate their understanding of genes/compounds identified in their own research. DATABASE URL: http://hts.cancerresearchuk.org/db/public.
A numerical projection technique for large-scale eigenvalue problems
NASA Astrophysics Data System (ADS)
Gamillscheg, Ralf; Haase, Gundolf; von der Linden, Wolfgang
2011-10-01
We present a new numerical technique to solve large-scale eigenvalue problems. It is based on the projection technique, used in strongly correlated quantum many-body systems, where first an effective approximate model of smaller complexity is constructed by projecting out high energy degrees of freedom and in turn solving the resulting model by some standard eigenvalue solver. Here we introduce a generalization of this idea, where both steps are performed numerically and which in contrast to the standard projection technique converges in principle to the exact eigenvalues. This approach is not just applicable to eigenvalue problems encountered in many-body systems but also in other areas of research that result in large-scale eigenvalue problems for matrices which have, roughly speaking, mostly a pronounced dominant diagonal part. We will present detailed studies of the approach guided by two many-body models.
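The abstract describes the classic projection idea: keep the low-energy degrees of freedom, build an effective model of smaller complexity, and hand it to a standard eigensolver. Below is a minimal numerical sketch of that classic step (not the paper's convergent generalization) for a symmetric matrix with a pronounced dominant diagonal; the toy matrix, the function name, and the choice of subspace are illustrative assumptions, not the authors' code.

```python
import numpy as np

def projected_ground_state(H, k):
    """Approximate the lowest eigenvalue of a symmetric matrix H with a
    dominant diagonal by projecting onto the k basis states with the
    smallest diagonal entries (the 'low-energy' subspace) and solving
    the resulting k x k effective problem with a standard eigensolver."""
    low = np.argsort(np.diag(H))[:k]      # low-energy degrees of freedom
    H_eff = H[np.ix_(low, low)]           # effective model of smaller complexity
    return np.linalg.eigvalsh(H_eff)[0]   # dense solver on the small problem

# Toy model: strong dominant diagonal plus weak off-diagonal coupling.
rng = np.random.default_rng(0)
n = 400
diag = np.sort(rng.uniform(0.0, 50.0, n))
V = rng.normal(scale=0.05, size=(n, n))
H = np.diag(diag) + (V + V.T) / 2

exact = np.linalg.eigvalsh(H)[0]          # full 400 x 400 problem
approx = projected_ground_state(H, k=40)  # 40 x 40 effective problem
print(approx - exact)  # small, since high-energy states barely mix in
```

By Cauchy interlacing, the projected estimate always lies at or above the exact lowest eigenvalue; the residual gap is what the paper's numerical generalization is designed to close.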
Hunt, Geoffrey; Moloney, Molly; Fazio, Adam
2012-01-01
Qualitative research is often conceptualized as inherently small-scale research, primarily conducted by a lone researcher enmeshed in extensive and long-term fieldwork or involving in-depth interviews with a small sample of 20 to 30 participants. In the study of illicit drugs, traditionally this has often been in the form of ethnographies of drug-using subcultures. Such small-scale projects have produced important interpretive scholarship that focuses on the culture and meaning of drug use in situated, embodied contexts. Larger-scale projects are often assumed to be solely the domain of quantitative researchers, using formalistic survey methods and descriptive or explanatory models. In this paper, however, we will discuss qualitative research done on a comparatively larger scale—with in-depth qualitative interviews with hundreds of young drug users. Although this work incorporates some quantitative elements into the design, data collection, and analysis, the qualitative dimension and approach has nevertheless remained central. Larger-scale qualitative research shares some of the challenges and promises of smaller-scale qualitative work including understanding drug consumption from an emic perspective, locating hard-to-reach populations, developing rapport with respondents, generating thick descriptions and a rich analysis, and examining the wider socio-cultural context as a central feature. However, there are additional challenges specific to the scale of qualitative research, which include data management, data overload and problems of handling large-scale data sets, time constraints in coding and analyzing data, and personnel issues including training, organizing and mentoring large research teams. 
Yet large samples can prove to be essential for enabling researchers to conduct comparative research, whether that be cross-national research within a wider European perspective undertaken by different teams or cross-cultural research looking at internal divisions and differences within diverse communities and cultures. PMID:22308079
DOT National Transportation Integrated Search
2006-12-01
Over the last several years, researchers at the University of Arizona's ATLAS Center have developed an adaptive ramp metering system referred to as MILOS (Multi-Objective, Integrated, Large-Scale, Optimized System). The goal of this project is ...
Research to Real Life, 2006: Innovations in Deaf-Blindness
ERIC Educational Resources Information Center
Leslie, Gail, Ed.
2006-01-01
This publication presents several projects that support children who are deaf-blind. These projects are: (1) Learning To Learn; (2) Project SALUTE; (3) Project SPARKLE; (4) Bringing It All Back Home; (5) Project PRIIDE; and (6) Including Students With Deafblindness In Large Scale Assessment Systems. Each project lists components, key practices,…
Large-scale neuromorphic computing systems
NASA Astrophysics Data System (ADS)
Furber, Steve
2016-10-01
Neuromorphic computing covers a diverse range of approaches to information processing all of which demonstrate some degree of neurobiological inspiration that differentiates them from mainstream conventional computing systems. The philosophy behind neuromorphic computing has its origins in the seminal work carried out by Carver Mead at Caltech in the late 1980s. This early work influenced others to carry developments forward, and advances in VLSI technology supported steady growth in the scale and capability of neuromorphic devices. Recently, a number of large-scale neuromorphic projects have emerged, taking the approach to unprecedented scales and capabilities. These large-scale projects are associated with major new funding initiatives for brain-related research, creating a sense that the time and circumstances are right for progress in our understanding of information processing in the brain. In this review we present a brief history of neuromorphic engineering then focus on some of the principal current large-scale projects, their main features, how their approaches are complementary and distinct, their advantages and drawbacks, and highlight the sorts of capabilities that each can deliver to neural modellers.
Considerations for Managing Large-Scale Clinical Trials.
ERIC Educational Resources Information Center
Tuttle, Waneta C.; And Others
1989-01-01
Research management strategies used effectively in a large-scale clinical trial to determine the health effects of exposure to Agent Orange in Vietnam are discussed, including pre-project planning, organization according to strategy, attention to scheduling, a team approach, emphasis on guest relations, cross-training of personnel, and preparing…
The impact of large-scale, long-term optical surveys on pulsating star research
NASA Astrophysics Data System (ADS)
Soszyński, Igor
2017-09-01
The era of large-scale photometric variability surveys began a quarter of a century ago, when three microlensing projects - EROS, MACHO, and OGLE - started their operation. These surveys initiated a revolution in the field of variable stars and in the next years they inspired many new observational projects. Large-scale optical surveys multiplied the number of variable stars known in the Universe. The huge, homogeneous and complete catalogs of pulsating stars, such as Cepheids, RR Lyrae stars, or long-period variables, offer an unprecedented opportunity to calibrate and test the accuracy of various distance indicators, to trace the three-dimensional structure of the Milky Way and other galaxies, to discover exotic types of intrinsically variable stars, or to study previously unknown features and behaviors of pulsators. We present historical and recent findings on various types of pulsating stars obtained from the optical large-scale surveys, with particular emphasis on the OGLE project which currently offers the largest photometric database among surveys for stellar variability.
Teaching English Critically to Mexican Children
ERIC Educational Resources Information Center
López-Gopar, Mario E.
2014-01-01
The purpose of this article is to present one significant part of a large-scale critical-ethnographic-action-research project (CEAR Project) carried out in Oaxaca, Mexico. The overall CEAR Project has been conducted since 2007 in different Oaxacan elementary schools serving indigenous and mestizo (mixed-race) children. In the CEAR Project, teacher…
Iavindrasana, Jimison; Depeursinge, Adrien; Ruch, Patrick; Spahni, Stéphane; Geissbuhler, Antoine; Müller, Henning
2007-01-01
The diagnostic and therapeutic processes, as well as the development of new treatments, are hindered by the fragmentation of the information which underlies them. In a multi-institutional research study database, the clinical information system (CIS) contains the primary data input. A large share of the budget of large-scale clinical studies is often spent on data creation and maintenance. The objective of this work is to design a decentralized, scalable, reusable database architecture with lower maintenance costs for managing and integrating the distributed heterogeneous data required as the basis for a large-scale research project. Technical and legal aspects are taken into account based on various use case scenarios. The architecture contains four layers: data storage and access are decentralized at their production source; a connector acts as a proxy between the CIS and the external world; an information mediator serves as the data access point; and the client side sits on top. The proposed design will be implemented in six clinical centers participating in the @neurIST project as part of a larger system for data integration and reuse in aneurysm treatment.
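The four layers named in the abstract can be sketched in miniature. Everything below (class names, the de-identification rule in the connector, the sample records) is a hypothetical illustration of the layering, not the @neurIST implementation:

```python
from dataclasses import dataclass, field

@dataclass
class ClinicalInformationSystem:
    """Layer 1: data stay decentralized at their production source."""
    site: str
    records: dict = field(default_factory=dict)

class Connector:
    """Layer 2: proxy between one site's CIS and the external world.
    Assumed policy here: strip direct identifiers before data leave."""
    def __init__(self, cis):
        self.cis = cis
    def query(self, patient_id):
        record = self.cis.records.get(patient_id)
        if record is None:
            return None
        return {k: v for k, v in record.items() if k != "name"}

class Mediator:
    """Layer 3: single data-access point federating all site connectors."""
    def __init__(self, connectors):
        self.connectors = connectors
    def query(self, patient_id):
        results = {}
        for c in self.connectors:
            r = c.query(patient_id)
            if r is not None:
                results[c.cis.site] = r
        return results

# Layer 4: the client talks only to the mediator, never to a site directly.
site_a = ClinicalInformationSystem("hospital_a", {"p1": {"name": "X", "dx": "aneurysm"}})
site_b = ClinicalInformationSystem("hospital_b", {"p1": {"name": "X", "lab": 4.2}})
mediator = Mediator([Connector(site_a), Connector(site_b)])
print(mediator.query("p1"))
# {'hospital_a': {'dx': 'aneurysm'}, 'hospital_b': {'lab': 4.2}}
```

The point of the layering is that maintenance stays local (each site owns its CIS and connector) while the research project sees one integrated, identifier-free view through the mediator.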
Michael Keller; Maria Assunção Silva-Dias; Daniel C. Nepstad; Meinrat O. Andreae
2004-01-01
The Large-Scale Biosphere-Atmosphere Experiment in Amazonia (LBA) is a multi-disciplinary, multinational scientific project led by Brazil. LBA researchers seek to understand Amazonia in its global context, especially with regard to regional and global climate. Current development activities in Amazonia, including deforestation, logging, cattle ranching, and agriculture...
The EMCC / DARPA Massively Parallel Electromagnetic Scattering Project
NASA Technical Reports Server (NTRS)
Woo, Alex C.; Hill, Kueichien C.
1996-01-01
The Electromagnetic Code Consortium (EMCC) was sponsored by the Advanced Research Projects Agency (ARPA) to demonstrate the effectiveness of massively parallel computing in large-scale radar signature predictions. The EMCC/ARPA project consisted of three parts.
In Defense of the National Labs and Big-Budget Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goodwin, J R
2008-07-29
The purpose of this paper is to present the unofficial and unsanctioned opinions of a Visiting Scientist at Lawrence Livermore National Laboratory on the values of LLNL and the other National Labs. The basic founding value and goal of the National Labs is big-budget scientific research, along with smaller-budget scientific research that cannot easily be done elsewhere. The most important example in the latter category is classified defense-related research. The historical guiding light here is the Manhattan Project. This endeavor was unique in human history, and might remain so. The scientific expertise and wealth of an entire nation was tapped in a project that was huge beyond reckoning, with no advance guarantee of success. It was in many respects a clash of scientific titans, with a large supporting cast, collaborating toward a single well-defined goal. Never had scientists received so much respect, so much money, and so much intellectual freedom to pursue scientific progress. And never was the gap between theory and implementation so rapidly narrowed, with results that changed the world, completely. Enormous resources are spent at the national or international level on large-scale scientific projects. LLNL has the most powerful computer in the world, Blue Gene/L. (Oops, Los Alamos just seized the title with Roadrunner; such titles regularly change hands.) LLNL also has the largest laser in the world, the National Ignition Facility (NIF). Lawrence Berkeley National Lab (LBNL) has the most powerful microscope in the world. Not only is it beyond the resources of most large corporations to make such expenditures, but the risk exceeds the possible rewards for those corporations that could. Nor can most small countries afford to finance large scientific projects, and not even the richest can afford largess, especially if Congress is under major budget pressure.
Some big-budget research efforts are funded by international consortiums, such as the Large Hadron Collider (LHC) at CERN, and the International Thermonuclear Experimental Reactor (ITER) in Cadarache, France, a magnetic-confinement fusion research project. The post-WWII histories of particle and fusion physics contain remarkable examples of both international competition, with an emphasis on secrecy, and international cooperation, with an emphasis on shared knowledge and resources. Initiatives to share sometimes came from surprising directions. Most large-scale scientific projects have potential defense applications. NIF certainly does; it is primarily designed to create small-scale fusion explosions. Blue Gene/L operates in part in service to NIF, and in part to various defense projects. The most important defense projects include stewardship of the national nuclear weapons stockpile, and the proposed redesign and replacement of those weapons with fewer, safer, more reliable, longer-lived, and less apocalyptic warheads. Many well-meaning people will consider the optimal lifetime of a nuclear weapon to be zero, but most thoughtful people, when asked how much longer they think this nation will require them, will ask for some time to think. NIF is also designed to create exothermic small-scale fusion explosions. The malapropos 'exothermic' here is a convenience to cover a profusion of complexities, but the basic idea is that the explosions will create more recoverable energy than was used to create them. One can hope that the primary future benefits of success for NIF will be in cost-effective generation of electrical power through controlled small-scale fusion reactions, rather than in improved large-scale fusion explosions. Blue Gene/L also services climate research, genomic research, materials research, and a myriad of other computational problems that become more feasible, reliable, and precise the larger the number of computational nodes employed.
Blue Gene/L has to be sited within a security complex for obvious reasons, but its value extends to the nation and the world. There is a duality here between large-scale scientific research machines and the supercomputers used to model them. An astounding example is illustrated in a graph released by EFDA-JET, at Oxfordshire, UK, presently the largest operating magnetic-confinement fusion experiment. The graph shows plasma confinement times (an essential performance parameter) for all the major tokamaks in the international fusion program, over their existing lifetimes. The remarkable thing about the data is not so much confinement-time versus date or scale, but the fact that the data are given for both the computer model predictions and the actual experimental measurements, and the two are in phenomenal agreement over the extended range of scales. Supercomputer models, sometimes operating with the intricacy of Schrödinger's equation at quantum physical scales, have become a costly but enormously cost-saving tool.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Inoue, T.; Shirakata, K.; Kinjo, K.
To obtain the data necessary for evaluating the nuclear design method of a large-scale fast breeder reactor, criticality tests with a large-scale homogeneous reactor were conducted as part of a joint research program by Japan and the U.S. Analyses of the tests are underway in both countries. The purpose of this paper is to describe the status of this project.
Chapter 13 - Perspectives on LANDFIRE Prototype Project Accuracy Assessment
James Vogelmann; Zhiliang Zhu; Jay Kost; Brian Tolk; Donald Ohlen
2006-01-01
The purpose of this chapter is to provide a general overview of the many aspects of accuracy assessment pertinent to the Landscape Fire and Resource Management Planning Tools Prototype Project (LANDFIRE Prototype Project). The LANDFIRE Prototype formed a large and complex research and development project with many broad-scale data sets and products developed throughout...
Schmidt, Olga; Hausmann, Axel; Cancian de Araujo, Bruno; Sutrisno, Hari; Peggie, Djunijanti; Schmidt, Stefan
2017-01-01
Here we present a general collecting and preparation protocol for DNA barcoding of Lepidoptera as part of large-scale rapid biodiversity assessment projects, and a comparison with alternative preserving and vouchering methods. About 98% of the sequenced specimens processed using the present collecting and preparation protocol yielded sequences with more than 500 base pairs. The study is based on the first outcomes of the Indonesian Biodiversity Discovery and Information System (IndoBioSys). IndoBioSys is a German-Indonesian research project that is conducted by the Museum für Naturkunde in Berlin and the Zoologische Staatssammlung München, in close cooperation with the Research Center for Biology - Indonesian Institute of Sciences (RCB-LIPI, Bogor).
ERIC Educational Resources Information Center
Supovitz, Jonathan
2016-01-01
In this brief abstracted report, the author describes an ongoing partnership with the Philadelphia School District (PSD) to implement and research the Ongoing Assessment Project (OGAP). OGAP is a systematic, intentional, and iterative formative assessment system grounded in the research on how students learn…
John A. Stanturf; Daniel A. Marion; Martin Spetich; Kenneth Luckow; James M. Guldin; Hal O. Liechty; Calvin E. Meier
2000-01-01
The Ouachita Mountains Ecosystem Management Research Project (OEMP) is a large interdisciplinary research project designed to provide the scientific foundation for landscape management at the scale of watersheds. The OEMP has progressed through three phases: developing natural regeneration alternatives to clearcutting and planting; testing of these alternatives at the...
2013-01-01
The Prostate, Lung, Colorectal, and Ovarian (PLCO) Cancer Screening Trial is a large-scale research effort conducted by the National Cancer Institute. PLCO offers an example of coordinated research by both the extramural and intramural communities of the National Institutes of Health. The purpose of this article is to describe the PLCO research resource and how it is managed and to assess the productivity and the costs associated with this resource. Such an in-depth analysis of a single large-scale project can shed light on questions such as how large-scale projects should be managed, what metrics should be used to assess productivity, and how costs can be compared with productivity metrics. A comprehensive publication analysis identified 335 primary research publications resulting from research using PLCO data and biospecimens from 2000 to 2012. By the end of 2012, a total of 9679 citations (excluding self-citations) have resulted from this body of research publications, with an average of 29.7 citations per article, and an h index of 45, which is comparable with other large-scale studies, such as the Nurses’ Health Study. In terms of impact on public health, PLCO trial results have been used by the US Preventive Services Task Force in making recommendations concerning prostate and ovarian cancer screening. The overall cost of PLCO was $454 million over 20 years, adjusted to 2011 dollars, with approximately $37 million for the collection, processing, and storage of biospecimens, including blood samples, buccal cells, and pathology tissues. PMID:24115361
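The h-index of 45 reported above is a simple metric: the largest h such that h publications each have at least h citations. A minimal sketch of the computation, using toy citation counts rather than the actual PLCO data:

```python
def h_index(citations):
    """Return the h-index: the largest h such that h papers
    have at least h citations each."""
    h = 0
    for rank, count in enumerate(sorted(citations, reverse=True), start=1):
        if count >= rank:
            h = rank  # the top `rank` papers all have >= `rank` citations
        else:
            break
    return h

# Toy example (not the PLCO publication data): five papers.
print(h_index([10, 8, 5, 4, 3]))  # → 4
```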
Desalination: Status and Federal Issues
2009-12-30
on one side and lets purified water through. Reverse osmosis plants have fewer problems with corrosion and usually have lower energy requirements...Texas) and cities are actively researching and investigating the feasibility of large-scale desalination plants for municipal water supplies...desalination research and development, and in construction and operational costs of desalination demonstration projects and full-scale plants
The New Big Science at the NSLS
NASA Astrophysics Data System (ADS)
Crease, Robert
2016-03-01
The term "New Big Science" refers to a phase shift in the kind of large-scale science that was carried out throughout the U.S. National Laboratory system, when large-scale materials science accelerators rather than high-energy physics accelerators became marquee projects at most major basic research laboratories in the post-Cold War era, accompanied by important changes in the character and culture of the research ecosystem at these laboratories. This talk explores some aspects of this phase shift at BNL's National Synchrotron Light Source.
TARGET Publication Guidelines | Office of Cancer Genomics
Like other NCI large-scale genomics initiatives, TARGET is a community resource project and data are made available rapidly after validation for use by other researchers. To act in accord with the Fort Lauderdale principles and support the continued prompt public release of large-scale genomic data prior to publication, researchers who plan to prepare manuscripts containing descriptions of TARGET pediatric cancer data that would be of comparable scope to an initial TARGET disease-specific comprehensive, global analysis publication, and journal editors who receive such manuscripts, are
Computational Everyday Life Human Behavior Model as Servicable Knowledge
NASA Astrophysics Data System (ADS)
Motomura, Yoichi; Nishida, Yoshifumi
The project called 'Open life matrix' is not only a research activity but also real-world problem solving in the form of action research. This concept is realized through large-scale data collection, construction of a probabilistic causal structure model, and provision of information services using the model. One concrete outcome of this project is a childhood injury prevention activity carried out by a new team consisting of a hospital, government, and researchers from many fields. The main result from the project is a general methodology for applying probabilistic causal structure models as serviceable knowledge for action research. In this paper, we summarize the project and discuss future directions that emphasize action research driven by artificial intelligence technology.
News Release: May 25, 2016 — Building on data from The Cancer Genome Atlas (TCGA) project, a multi-institutional team of scientists has completed the first large-scale “proteogenomic” study of breast cancer, linking DNA mutations to protein signaling and helping pinpoint the genes that drive cancer.
ERIC Educational Resources Information Center
Waldow, Florian
2017-01-01
Researchers interested in the global flow of educational ideas and programmes have long been interested in the role of so-called "reference societies." The article investigates how top scorers in large-scale assessments are framed as positive or negative reference societies in the education policy-making debate in German mass media and…
2004-10-01
MONITORING AGENCY NAME(S) AND ADDRESS(ES) Defense Advanced Research Projects Agency AFRL/IFTC 3701 North Fairfax Drive...Scalable Parallel Libraries for Large-Scale Concurrent Applications," Technical Report UCRL-JC-109251, Lawrence Livermore National Laboratory
Setting Learning Analytics in Context: Overcoming the Barriers to Large-Scale Adoption
ERIC Educational Resources Information Center
Ferguson, Rebecca; Macfadyen, Leah P.; Clow, Doug; Tynan, Belinda; Alexander, Shirley; Dawson, Shane
2014-01-01
A core goal for most learning analytics projects is to move from small-scale research towards broader institutional implementation, but this introduces a new set of challenges because institutions are stable systems, resistant to change. To avoid failure and maximize success, implementation of learning analytics at scale requires explicit and…
Kappler, Ulrike; Rowland, Susan L; Pedwell, Rhianna K
2017-05-01
Systems biology is frequently taught with an emphasis on mathematical modeling approaches. This focus effectively excludes most biology, biochemistry, and molecular biology students, who are not mathematics majors. The mathematical focus can also present a misleading picture of systems biology, which is a multi-disciplinary pursuit requiring collaboration between biochemists, bioinformaticians, and mathematicians. This article describes an authentic large-scale undergraduate research experience (ALURE) in systems biology that incorporates proteomics, bacterial genomics, and bioinformatics in the one exercise. This project is designed to engage students who have a basic grounding in protein chemistry and metabolism and no mathematical modeling skills. The pedagogy around the research experience is designed to help students attack complex datasets and use their emergent metabolic knowledge to make meaning from large amounts of raw data. On completing the ALURE, participants reported a significant increase in their confidence around analyzing large datasets, while the majority of the cohort reported good or great gains in a variety of skills, including "analysing data for patterns" and "conducting database or internet searches." An environmental scan shows that this ALURE is the only undergraduate-level systems-biology research project offered on a large scale in Australia; this speaks to the perceived difficulty of implementing such an opportunity for students. We argue, however, that based on the student feedback, allowing undergraduate students to complete a systems-biology project is both feasible and desirable, even if the students are not maths and computing majors. © 2016 by The International Union of Biochemistry and Molecular Biology, 45(3):235-248, 2017.
Zhang, Lening; Messner, Steven F; Lu, Jianhong
2007-02-01
This article discusses research experience gained from a large-scale survey of criminal victimization recently conducted in Tianjin, China. The authors review some of the more important challenges that arose in the research, their responses to these challenges, and lessons learned that might be beneficial to other scholars who are interested in conducting criminological research in China. Their experience underscores the importance of understanding the Chinese political, cultural, and academic context, and the utility of collaborating with experienced and knowledgeable colleagues "on site." Although there are some special difficulties and barriers, their project demonstrates the feasibility of original criminological data collection in China.
ERIC Educational Resources Information Center
Samaroo, Julia; Dahya, Negin; Alidina, Shahnaaz
2013-01-01
This paper discusses the significance of conducting respectful research within urban schools, using the example of one large-scale university-school board partnership in northwestern Toronto. The authors, three research assistants on the project, use their experiences within three of the participating schools to interrogate the research approach…
Forensic Schedule Analysis of Construction Delay in Military Projects in the Middle East
This research performs forensic schedule analysis of delay factors that impacted recent large-scale military construction projects in the Middle East...The methodologies for analysis are adapted from the Professional Practice Guide to Forensic Schedule Analysis, particularly Method 3.7 Modeled
ERIC Educational Resources Information Center
Spuck, Dennis W.; And Others
This paper reports on a large-scale project of research and evaluation of a program for disadvantaged minority group students conducted by the Center for Educational Opportunity at the Claremont Colleges. The Program of Special Directed Studies for Transition to College (PSDS), a five-year experimental project, is aimed at providing a four-year,…
The Maryland Large-Scale Integrated Neurocognitive Architecture
2008-03-01
Visual input enters the network through the lateral geniculate nucleus (LGN) and is passed forward through visual brain regions (V1, V2, and V4...University of Maryland Sponsored by Defense Advanced Research Projects Agency DARPA Order No. V029 APPROVED FOR PUBLIC RELEASE...interpreted as necessarily representing the official policies, either expressed or implied, of the Defense Advanced Research Projects Agency or the U.S
Data management for community research projects: A JGOFS case study
NASA Technical Reports Server (NTRS)
Lowry, Roy K.
1992-01-01
Since the mid-1980s, much of the marine science research effort in the United Kingdom has been focused into large-scale collaborative projects involving public sector laboratories and university departments, termed Community Research Projects. Two of these, the Biogeochemical Ocean Flux Study (BOFS) and the North Sea Project, incorporated large-scale data collection to underpin multidisciplinary modeling efforts. The challenge of providing project data sets to support the science was met by a small team within the British Oceanographic Data Centre (BODC) operating as a topical data center. The role of the data center was both to work up the data from the ship's sensors and to combine these data with sample measurements into online databases. The working up of the data was achieved by a unique symbiosis between data center staff and project scientists. The project management, programming, and data processing skills of the data center were combined with the oceanographic experience of the project communities to develop a system which has produced quality-controlled, calibrated data sets from 49 research cruises in 3.5 years of operation. The data center resources required to achieve this were modest and far outweighed by the time liberated in the scientific community by the removal of the data processing burden. Two online project databases have been assembled containing a very high proportion of the data collected. As these are under the control of BODC, their long-term availability as part of the UK national data archive is assured.
However, projects covering a larger, even international scale could be successfully supported by a network of topical data centers managing online databases which are interconnected by object oriented distributed data management systems over wide area networks.
Coates, Jennifer C; Colaiezzi, Brooke A; Bell, Winnie; Charrondiere, U Ruth; Leclercq, Catherine
2017-03-16
An increasing number of low-income countries (LICs) exhibit high rates of malnutrition coincident with rising rates of overweight and obesity. Individual-level dietary data are needed to inform effective responses, yet dietary data from large-scale surveys conducted in LICs remain extremely limited. This discussion paper first seeks to highlight the barriers to collection and use of individual-level dietary data in LICs. Second, it introduces readers to new technological developments and research initiatives to remedy this situation, led by the International Dietary Data Expansion (INDDEX) Project. Constraints to conducting large-scale dietary assessments include significant costs, time burden, technical complexity, and limited investment in dietary research infrastructure, including the necessary tools and databases required to collect individual-level dietary data in large surveys. To address existing bottlenecks, the INDDEX Project is developing a dietary assessment platform for LICs, called INDDEX24, consisting of a mobile application integrated with a web database application, which is expected to facilitate seamless data collection and processing. These tools will be subject to rigorous testing including feasibility, validation, and cost studies. To scale up dietary data collection and use in LICs, the INDDEX Project will also invest in food composition databases, an individual-level dietary data dissemination platform, and capacity development activities. Although the INDDEX Project activities are expected to improve the ability of researchers and policymakers in low-income countries to collect, process, and use dietary data, the global nutrition community is urged to commit further significant investments in order to adequately address the range and scope of challenges described in this paper.
This Small Business Innovation Research (SBIR) Phase II project will employ the large-scale, highly reliable boron-doped ultrananocrystalline diamond (BD-UNCD®) electrodes developed during the Phase I project to build and test an Electrochemical Anodic Oxidation process (EAOP)...
Factors Affecting Intervention Fidelity of Differentiated Instruction in Kindergarten
ERIC Educational Resources Information Center
Dijkstra, Elma M.; Walraven, Amber; Mooij, Ton; Kirschner, Paul A.
2017-01-01
This paper reports on the findings in the first phase of a design-based research project as part of a large-scale intervention study in Dutch kindergartens. The project aims at enhancing differentiated instruction and evaluating its effects on children's development, in particular high-ability children. This study investigates relevant…
Photovoltaic Subcontract Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Surek, Thomas; Catalano, Anthony
1993-03-01
This report summarizes the fiscal year (FY) 1992 progress of the subcontracted photovoltaic (PV) research and development (R&D) performed under the Photovoltaic Advanced Research and Development Project at the National Renewable Energy Laboratory (NREL), formerly the Solar Energy Research Institute (SERI). The mission of the national PV program is to develop PV technology for large-scale generation of economically competitive electric power in the United States. The technical sections of the report cover the main areas of the subcontract program: the Crystalline Materials and Advanced Concepts project, the Polycrystalline Thin Films project, the Amorphous Silicon Research project, the Photovoltaic Manufacturing Technology (PVMaT) project, the PV Module and System Performance and Engineering project, and the PV Analysis and Applications Development project. Technical summaries of each of the subcontracted programs provide a discussion of approaches, major accomplishments in FY 1992, and future research directions.
The Starkey habitat database for ungulate research: construction, documentation, and use.
Mary M. Rowland; Priscilla K. Coe; Rosemary J. Stussy; [and others].
1998-01-01
The Starkey Project, a large-scale, multidisciplinary research venture, began in 1987 in the Starkey Experimental Forest and Range in northeast Oregon. Researchers are studying effects of forest management on interactions and habitat use of mule deer (Odocoileus hemionus hemionus), elk (Cervus elaphus nelsoni), and cattle. A...
Distributed and grid computing projects with research focus in human health.
Diomidous, Marianna; Zikos, Dimitrios
2012-01-01
Distributed systems and grid computing systems are used to connect several computers to obtain a higher level of performance in order to solve a problem. During the last decade, projects have used the World Wide Web to aggregate individuals' CPU power for research purposes. This paper presents the existing active large-scale distributed and grid computing projects with a research focus in human health. Eleven active projects with more than 2000 Processing Units (PUs) each were found and are presented. The research focus for most of them is molecular biology, specifically understanding or predicting protein structure through simulation, comparing proteins, genomic analysis for disease-provoking genes, and drug design. Though not in all cases explicitly stated, common target diseases include HIV, dengue, Duchenne dystrophy, Parkinson's disease, various types of cancer, and influenza. Other diseases include malaria, anthrax, and Alzheimer's disease. The need for national initiatives and European collaboration for larger-scale projects is stressed, to raise the awareness of citizens to participate in order to create a culture of internet volunteering altruism.
The Power of Engaging Citizen Scientists for Scientific Progress
Garbarino, Jeanne; Mason, Christopher E.
2016-01-01
Citizen science has become a powerful force for scientific inquiry, providing researchers with access to a vast array of data points while connecting nonscientists to the authentic process of science. This citizen-researcher relationship creates an incredible synergy, allowing for the creation, execution, and analysis of research projects that would otherwise prove impossible in traditional research settings, namely due to the scope of needed human or financial resources (or both). However, citizen-science projects are not without their challenges. For instance, as projects are scaled up, there is concern regarding the rigor and usability of data collected by citizens who are not formally trained in research science. While these concerns are legitimate, we have seen examples of highly successful citizen-science projects from multiple scientific disciplines that have enhanced our collective understanding of science, such as how RNA molecules fold or determining the microbial metagenomic snapshot of an entire public transportation system. These and other emerging citizen-science projects show how improved protocols for reliable, large-scale science can realize both an improvement of scientific understanding for the general public and novel views of the world around us. PMID:27047581
The dynamics and evolution of clusters of galaxies
NASA Technical Reports Server (NTRS)
Geller, Margaret; Huchra, John P.
1987-01-01
Research was undertaken to produce a coherent picture of the formation and evolution of large-scale structures in the universe. The program is divided into projects which examine four areas: the relationship between individual galaxies and their environment; the structure and evolution of individual rich clusters of galaxies; the nature of superclusters; and the large-scale distribution of individual galaxies. A brief review of results in each area is provided.
UPDATE ON THE MARINA STUDY ON LAKE TEXOMA
The National Risk Management Research Laboratory (NRMRL) has instituted a program for Risk Management Research for Ecosystem Restoration in Watersheds. As part of this program a large scale project was initiated on Lake Texoma and the surrounding watershed to evaluate the assimi...
APDA's Contribution to Current Research and Citizen Science
NASA Astrophysics Data System (ADS)
Barker, Thurburn; Castelaz, M. W.; Cline, J. D.; Hudec, R.
2010-01-01
The Astronomical Photographic Data Archive (APDA) is dedicated to the collection, restoration, preservation, and digitization of astronomical photographic data that eventually can be accessed via the Internet by the global community of scientists, researchers and students. Located on the Pisgah Astronomical Research Institute campus, APDA now includes collections from North America totaling more than 100,000 photographic plates and films. Two new large-scale research projects and one citizen science project have now been developed from the archived data. One unique photographic data collection covering the southern hemisphere contains the signatures of diffuse interstellar bands (DIBs) within the stellar spectra on objective prism plates. We plan to digitize the spectra, identify the DIBs, and map out the large-scale spatial extent of the DIBs. The goal is to understand the Galactic environments suitable for the DIB molecules. Another collection contains spectra with nearly the same dispersion as the GAIA satellite's low-dispersion slitless spectrophotometers, BP and RP. The plates will be used to develop standards for GAIA spectra. To bring the data from APDA to the general public, we have developed the citizen science project called Stellar Classification Online - Public Exploration (SCOPE). SCOPE allows citizen scientists to classify up to half a million stars on objective prism plates. We will present the status of each of these projects.
Moores, Carly Jane; Miller, Jacqueline; Perry, Rebecca Anne; Chan, Lily Lai Hang; Daniels, Lynne Allison; Vidgen, Helen Anna; Magarey, Anthea Margaret
2017-11-29
Translation encompasses the continuum from clinical efficacy to widespread adoption within the healthcare service and ultimately routine clinical practice. The Parenting, Eating and Activity for Child Health (PEACH™) program has previously demonstrated clinical effectiveness in the management of child obesity, and has recently been implemented as a large-scale community intervention in Queensland, Australia. This paper aims to describe the translation of the evaluation framework from a randomised controlled trial (RCT) to a large-scale community intervention (PEACH™ QLD). Tensions between the RCT paradigm and implementation research are discussed, along with lived evaluation challenges, responses to overcome these, and key learnings for future evaluation conducted at scale. The translation of evaluation from the PEACH™ RCT to the large-scale community intervention PEACH™ QLD is described. While the CONSORT Statement was used to report findings from two previous RCTs, the RE-AIM framework was more suitable for the evaluation of upscaled delivery of the PEACH™ program. Evaluation of PEACH™ QLD was undertaken during the project delivery period from 2013 to 2016. Experiential learnings from conducting the evaluation of PEACH™ QLD according to the described evaluation framework are presented for the purposes of informing the future evaluation of upscaled programs. Evaluation changes in response to real-time changes in the delivery of the PEACH™ QLD Project were necessary at stages during the project term. Key evaluation challenges encountered included the collection of complete evaluation data from a diverse and geographically dispersed workforce and the systematic collection of process evaluation data in real time to support program changes during the project. Evaluation of large-scale community interventions in the real world is challenging and divergent from RCTs, which are rigorously evaluated within a more tightly controlled clinical research setting.
Constructs explored in an RCT are inadequate in describing the enablers and barriers of upscaled community program implementation. Methods for data collection, analysis and reporting also require consideration. We present a number of experiential reflections and suggestions for the successful evaluation of future upscaled community programs which are scarcely reported in the literature. PEACH™ QLD was retrospectively registered with the Australian New Zealand Clinical Trials Registry on 28 February 2017 (ACTRN12617000315314).
Large-Scale Optimization for Bayesian Inference in Complex Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Willcox, Karen; Marzouk, Youssef
2013-11-12
The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focused on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. The project was a collaborative effort among MIT, the University of Texas at Austin, Georgia Institute of Technology, and Sandia National Laboratories. The research was directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. The MIT-Sandia component of the SAGUARO Project addressed the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas-Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower-dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as "reduce then sample" and "sample then reduce." In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to achieve their speedups.
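The "reduce then sample" strategy can be illustrated with a toy surrogate-accelerated MCMC loop. Everything below is an illustrative sketch, not the SAGUARO code: the forward model, the polynomial surrogate, and the Gaussian prior/likelihood are invented stand-ins for the expensive PDE solves and reduced-order models the project actually used.

```python
import numpy as np

def forward_full(theta):
    # Stand-in for an expensive full-order forward solve (parameter -> observation).
    return np.sin(theta) + 0.1 * theta**2

def build_surrogate(grid):
    # "Reduce": fit a cheap polynomial surrogate to the full model on a coarse grid.
    coeffs = np.polyfit(grid, forward_full(grid), deg=5)
    return lambda theta: np.polyval(coeffs, theta)

def log_posterior(theta, model, y_obs, noise=0.1):
    # Gaussian likelihood with a standard normal prior (illustrative choices).
    return -0.5 * ((model(theta) - y_obs) / noise) ** 2 - 0.5 * theta**2

def metropolis(model, y_obs, n_steps=5000, step=0.5, seed=0):
    # "Sample": random-walk Metropolis that only ever evaluates the cheap surrogate.
    rng = np.random.default_rng(seed)
    theta = 0.0
    lp = log_posterior(theta, model, y_obs)
    chain = []
    for _ in range(n_steps):
        prop = theta + step * rng.standard_normal()
        lp_prop = log_posterior(prop, model, y_obs)
        if np.log(rng.random()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        chain.append(theta)
    return np.array(chain)

# Build the surrogate once (offline cost), then sample against it (online savings).
surrogate = build_surrogate(np.linspace(-3, 3, 30))
chain = metropolis(surrogate, y_obs=forward_full(1.0))
```

The computational saving comes from the asymmetry in cost: the full model is evaluated only on the coarse training grid, while the thousands of MCMC steps each touch only the surrogate.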
Aymerich, Marta; Carrion, Carme; Gallo, Pedro; Garcia, Maria; López-Bermejo, Abel; Quesada, Miquel; Ramos, Rafel
2012-08-01
Most ex-post evaluations of research funding programs are based on bibliometric methods and, although this approach has been widely used, it only examines one facet of the project's impact, that is, scientific productivity. More comprehensive models of payback assessment of research activities are designed for large-scale projects with extensive funding. The purpose of this study was to design and implement a methodology for the ex-post evaluation of small-scale projects that would take into account both the fulfillment of projects' stated objectives as well as other wider benefits to society as payback measures. We used a two-phase ex-post approach to appraise impact for 173 small-scale projects funded in 2007 and 2008 by a Spanish network center for research in epidemiology and public health. In the internal phase we used a questionnaire to query the principal investigator (PI) on the outcomes as well as the actual and potential impact of each project; in the external phase we sent a second questionnaire to external reviewers with the aim of assessing (by peer review) the performance of each individual project. Overall, 43% of the projects were rated as having completed their objectives "totally", and 40% "considerably". The research activities funded were reported by PIs as socially beneficial, with their greatest impact being on research capacity (50% of payback to society) and on knowledge translation (above 11%). The method proposed showed a good discriminating ability that makes it possible to measure, reliably, the extent to which a project's objectives were met as well as the degree to which the project contributed to enhancing the group's scientific performance and to its social payback. Copyright © 2012 Elsevier Ltd. All rights reserved.
Space research - At a crossroads
NASA Technical Reports Server (NTRS)
Mcdonald, Frank B.
1987-01-01
Efforts which must be expended if U.S. space research is to regain vitality in the next few years are discussed. Small-scale programs are the cornerstone for big science projects, giving both researchers and students a chance to practice the development of space missions and hardware and identify promising goals for larger projects. Small projects can be carried aloft by balloons, sounding rockets, the Shuttle and ELVs. It is recommended that NASA continue the development of remote sensing systems, and join with other government agencies to fund space-based materials science, space biology and medical research. Increased international cooperation in space projects is necessary for affording moderate to large scale missions, for political reasons, and to maximize available space resources. Finally, the establishment and funding of long-range goals in space, particularly the development of the infrastructure and technologies for the exploration and colonization of the planets, must be viewed as the normal outgrowth of the capabilities being developed for LEO operations.
Municipal Sludge Application in Forests of Northern Michigan: a Case Study.
D.G. Brockway; P.V. Nguyen
1986-01-01
A large-scale operational demonstration and research project was cooperatively established by the U.S. Environmental Protection Agency, Michigan Department of Natural Resources, and Michigan State University to evaluate the practice of forest land application as an option for sludge utilization. Project objectives included completing (1) a logistic and economic...
What Have Researchers Learned from Project STAR?
ERIC Educational Resources Information Center
Schanzenbach, Diane Whitmore
2007-01-01
Project STAR (Student/Teacher Achievement Ratio) was a large-scale randomized trial of reduced class sizes in kindergarten through the third grade. Because of the scope of the experiment, it has been used in many policy discussions. For example, the California statewide class-size-reduction policy was justified, in part, by the successes of…
ERIC Educational Resources Information Center
Taylor, Catherine G.; Meyer, Elizabeth J.; Peter, Tracey; Ristock, Janice; Short, Donn; Campbell, Christopher
2016-01-01
The Every Teacher Project involved large-scale survey research conducted to identify the beliefs, perspectives, and practices of Kindergarten to Grade 12 educators in Canadian public schools regarding lesbian, gay, bisexual, transgender, and queer (LGBTQ)-inclusive education. Comparisons are made between LGBTQ and cisgender heterosexual…
Photovoltaic Subcontract Program. Annual report, FY 1992
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1993-03-01
This report summarizes the fiscal year (FY) 1992 progress of the subcontracted photovoltaic (PV) research and development (R&D) performed under the Photovoltaic Advanced Research and Development Project at the National Renewable Energy Laboratory (NREL), formerly the Solar Energy Research Institute (SERI). The mission of the national PV program is to develop PV technology for large-scale generation of economically competitive electric power in the United States. The technical sections of the report cover the main areas of the subcontract program: the Crystalline Materials and Advanced Concepts project, the Polycrystalline Thin Films project, the Amorphous Silicon Research project, the Photovoltaic Manufacturing Technology (PVMaT) project, the PV Module and System Performance and Engineering project, and the PV Analysis and Applications Development project. Technical summaries of each of the subcontracted programs provide a discussion of approaches, major accomplishments in FY 1992, and future research directions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fix, N. J.
The U.S. Department of Energy (DOE) is cleaning up and/or monitoring large, dilute plumes contaminated by metals, such as uranium and chromium, whose mobility and solubility change with redox status. Field-scale experiments with acetate as the electron donor have stimulated metal-reducing bacteria to effectively remove uranium [U(VI)] from groundwater at the Uranium Mill Tailings Site in Rifle, Colorado. The Pacific Northwest National Laboratory and a multidisciplinary team of national laboratory and academic collaborators have embarked on a research program at the Rifle site, the objective of which is to gain a comprehensive and mechanistic understanding of the microbial factors and associated geochemistry controlling uranium mobility so that DOE can confidently remediate uranium plumes as well as support stewardship of uranium-contaminated sites. This Quality Assurance Project Plan provides the quality assurance requirements and processes that will be followed by the Rifle Integrated Field-Scale Subsurface Research Challenge Project.
"Baby-Cam" and Researching with Infants: Viewer, Image and (Not) Knowing
ERIC Educational Resources Information Center
Elwick, Sheena
2015-01-01
This article offers a methodological reflection on how "baby-cam" enhanced ethically reflective attitudes in a large-scale research project that set out to research with infants in Australian early childhood education and care settings. By juxtaposing digital images produced by two different digital-camera technologies and drawing on…
ERIC Educational Resources Information Center
Nickerson, C.; Gerritsen, M.; Meurs, F.v.
2005-01-01
This Research Note reports on a large-scale staff-student project focussing on the use of English for Specific Business Purposes in a number of promotional genres (TV commercials, annual reports, corporate web-sites, print advertising) within several of the EU member states: Belgium, the Netherlands, France, Germany and Spain. The project as a…
Large Scale Data Mining to Improve Usability of Data: An Intelligent Archive Testbed
NASA Technical Reports Server (NTRS)
Ramapriyan, Hampapuram; Isaac, David; Yang, Wenli; Morse, Steve
2005-01-01
Research in certain scientific disciplines - including Earth science, particle physics, and astrophysics - continually faces the challenge that the volume of data needed to perform valid scientific research can at times overwhelm even a sizable research community. The desire to improve utilization of this data gave rise to the Intelligent Archives project, which seeks to make data archives active participants in a knowledge building system capable of discovering events or patterns that represent new information or knowledge. Data mining can automatically discover patterns and events, but it is generally viewed as unsuited for large-scale use in disciplines like Earth science that routinely involve very high data volumes. Dozens of research projects have shown promising uses of data mining in Earth science, but all of these are based on experiments with data subsets of a few gigabytes or less, rather than the terabytes or petabytes typically encountered in operational systems. To bridge this gap, the Intelligent Archives project is establishing a testbed with the goal of demonstrating the use of data mining techniques in an operationally-relevant environment. This paper discusses the goals of the testbed and the design choices surrounding critical issues that arose during testbed implementation.
Multi-Scale Models for the Scale Interaction of Organized Tropical Convection
NASA Astrophysics Data System (ADS)
Yang, Qiu
Assessing the upscale impact of organized tropical convection from small spatial and temporal scales is a research imperative, not only for having a better understanding of the multi-scale structures of dynamical and convective fields in the tropics, but also for eventually helping in the design of new parameterization strategies to improve the next-generation global climate models. Here self-consistent multi-scale models are derived systematically by following the multi-scale asymptotic methods and used to describe the hierarchical structures of tropical atmospheric flows. The advantages of using these multi-scale models lie in isolating the essential components of multi-scale interaction and providing assessment of the upscale impact of the small-scale fluctuations onto the large-scale mean flow through eddy flux divergences of momentum and temperature in a transparent fashion. Specifically, this thesis includes three research projects about multi-scale interaction of organized tropical convection, involving tropical flows at different scaling regimes and utilizing different multi-scale models correspondingly. Inspired by the observed variability of tropical convection on multiple temporal scales, including daily and intraseasonal time scales, the goal of the first project is to assess the intraseasonal impact of the diurnal cycle on the planetary-scale circulation such as the Hadley cell. As an extension of the first project, the goal of the second project is to assess the intraseasonal impact of the diurnal cycle over the Maritime Continent on the Madden-Julian Oscillation. In the third project, the goals are to simulate the baroclinic aspects of the ITCZ breakdown and assess its upscale impact on the planetary-scale circulation over the eastern Pacific. These simple multi-scale models should be useful to understand the scale interaction of organized tropical convection and help improve the parameterization of unresolved processes in global climate models.
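The upscale forcing mechanism described above, eddy flux divergences of momentum and temperature acting on the large-scale mean flow, can be written schematically. The following is a generic sketch in conventional notation, not the thesis's actual equations; the precise terms and asymptotic scalings depend on the particular multi-scale model.

```latex
% Schematic large-scale mean equations: averaging over the fast/small
% scales (overbar) leaves the mean flow forced by divergences of the
% eddy fluxes of the small-scale fluctuations (primes).
\begin{align}
\frac{\partial \bar{u}}{\partial t} + \cdots
  &= -\,\nabla \cdot \overline{\mathbf{v}'\,u'}, \\
\frac{\partial \bar{\theta}}{\partial t} + \cdots
  &= -\,\nabla \cdot \overline{\mathbf{v}'\,\theta'},
\end{align}
```

Here $\bar{u}$ and $\bar{\theta}$ are the large-scale mean zonal velocity and potential temperature, and the right-hand sides are the eddy flux divergences through which organized small-scale convection feeds back on the planetary-scale circulation.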
Encouraging Gender Analysis in Research Practice
ERIC Educational Resources Information Center
Thien, Deborah
2009-01-01
Few resources for practical teaching or fieldwork exercises exist which address gender in geographical contexts. This paper adds to teaching and fieldwork resources by describing an experience with designing and implementing a "gender intervention" for a large-scale, multi-university, bilingual research project that brought together a group of…
NASA Astrophysics Data System (ADS)
Bundschuh, V.; Grueter, J. W.; Kleemann, M.; Melis, M.; Stein, H. J.; Wagner, H. J.; Dittrich, A.; Pohlmann, D.
1982-08-01
A preliminary study was undertaken before a large-scale project for the construction and survey of about a hundred solar houses was launched. The notion of a solar house was defined and the uses of solar energy (hot water preparation, heating of rooms, heating of swimming pools, or a combination of these possibilities) were examined. A coherent measuring program was set up. Advantages and inconveniences of the large-scale project were reviewed. Production of hot water, evaluation of different concepts and different fabrications of solar systems, coverage of the different systems, conservation of energy, failure frequency and failure statistics, durability of the installation, and investment, maintenance and energy costs were retained as study parameters. Different solar hot water production systems and the heat counter used for measurements are described.
Process for Low Cost Domestic Production of LIB Cathode Materials
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thurston, Anthony
The objective of the research was to determine the best low-cost method for the large-scale production of the Nickel-Cobalt-Manganese (NCM) layered cathode materials. The research and development focused on scaling up the licensed technology from Argonne National Laboratory in BASF's battery material pilot plant in Beachwood, Ohio. Since BASF did not have experience with the large-scale production of the NCM cathode materials, there was a significant amount of development needed to support BASF's already existing research program. During the three-year period, BASF was able to develop and validate production processes for the NCM 111, 523 and 424 materials as well as begin development of the High Energy NCM. BASF also used this time period to provide free cathode material samples to numerous manufacturers, OEMs and research companies in order to validate the materials. The success of the project can be demonstrated by the construction of the production plant in Elyria, Ohio and the successful operation of that facility. The benefit of the project to the public will begin to be apparent as soon as material from the production plant is being used in electric vehicles.
Bockholt, Henry J.; Scully, Mark; Courtney, William; Rachakonda, Srinivas; Scott, Adam; Caprihan, Arvind; Fries, Jill; Kalyanam, Ravi; Segall, Judith M.; de la Garza, Raul; Lane, Susan; Calhoun, Vince D.
2009-01-01
A neuroinformatics (NI) system is critical to brain imaging research in order to shorten the time between study conception and results. Such an NI system is required to scale well when large numbers of subjects are studied. Further, when multiple sites participate in research projects, organizational issues become increasingly difficult. Optimized NI applications mitigate these problems. Additionally, NI software enables coordination across multiple studies, leveraging advantages potentially leading to exponential research discoveries. The web-based Mind Research Network (MRN) database system has been designed and improved through our experience with 200 research studies and 250 researchers from seven different institutions. The MRN tools permit the collection, management, reporting and efficient use of large-scale, heterogeneous data sources, e.g., multiple institutions, multiple principal investigators, multiple research programs and studies, and multimodal acquisitions. We have collected and analyzed data sets on thousands of research participants and have set up a framework to automatically analyze the data, thereby making efficient, practical data mining of this vast resource possible. This paper presents a comprehensive framework for capturing and analyzing heterogeneous neuroscience research data sources that has been fully optimized for end-users to perform novel data mining. PMID:20461147
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shea, M.
1995-09-01
The proper isolation of radioactive waste is one of today's most pressing environmental issues. Research is being carried out by many countries around the world in order to answer critical and perplexing questions regarding the safe disposal of radioactive waste. Natural analogue studies are an increasingly important facet of this international research effort. The Pocos de Caldas Project represents a major effort of the international technical and scientific community towards addressing one of modern civilization's most critical environmental issues - radioactive waste isolation.
Moderation and Consistency of Teacher Judgement: Teachers' Views
ERIC Educational Resources Information Center
Connolly, Stephen; Klenowski, Valentina; Wyatt-Smith, Claire Maree
2012-01-01
Major curriculum and assessment reforms in Australia have generated research interest in issues related to standards, teacher judgement and moderation. This article is based on one related inquiry of a large-scale Australian Research Council Linkage project conducted in Queensland. This qualitative study analysed interview data to identify…
The Ceiling to Coproduction in University-Industry Research Collaboration
ERIC Educational Resources Information Center
McCabe, Angela; Parker, Rachel; Cox, Stephen
2016-01-01
The purpose of this paper is to provide insight into government attempts at bridging the divide between theory and practice through university-industry research collaboration modelled under engaged scholarship. The findings are based on data sourced from interviews with 47 academic and industry project leaders from 23 large-scale research…
Preschool Children, Painting and Palimpsest: Collaboration as Pedagogy, Practice and Learning
ERIC Educational Resources Information Center
Cutcher, Alexandra; Boyd, Wendy
2018-01-01
This article describes a small, collaborative, arts-based research project conducted in two rural early childhood centres in regional Australia, where the children made large-scale collaborative paintings in partnership with teachers and researchers. Observation of young children's artistic practices, in order to inform the development of…
Neuroscience thinks big (and collaboratively).
Kandel, Eric R; Markram, Henry; Matthews, Paul M; Yuste, Rafael; Koch, Christof
2013-09-01
Despite cash-strapped times for research, several ambitious collaborative neuroscience projects have attracted large amounts of funding and media attention. In Europe, the Human Brain Project aims to develop a large-scale computer simulation of the brain, whereas in the United States, the Brain Activity Map is working towards establishing a functional connectome of the entire brain, and the Allen Institute for Brain Science has embarked upon a 10-year project to understand the mouse visual cortex (the MindScope project). US President Barack Obama's announcement of the BRAIN Initiative (Brain Research through Advancing Innovative Neurotechnologies Initiative) in April 2013 highlights the political commitment to neuroscience and is expected to further foster interdisciplinary collaborations, accelerate the development of new technologies and thus fuel much needed medical advances. In this Viewpoint article, five prominent neuroscientists explain the aims of the projects and how they are addressing some of the questions (and criticisms) that have arisen.
Education Policy, Globalization, Commercialization: An Interview with Bob Lingard by David Hursh
ERIC Educational Resources Information Center
Hursh, David
2017-01-01
In this interview with David Hursh, Bob Lingard comments on his current and/or recently completed research projects in respect to new modes of global governance in schooling and the complementarity between international large scale assessments and national testing. He also looks at a project that, in conjunction with school leaders, teachers,…
ERIC Educational Resources Information Center
Sleeter, Christine E., Ed.
2011-01-01
The work presented here is a large-scale evaluation of a theory-driven school reform project in New Zealand, which focuses on improving the educational achievement of Maori students in public secondary schools. The project's conceptual underpinnings are based on Kaupapa Maori research, culturally responsive teaching, student voice, and…
ERIC Educational Resources Information Center
Polanin, Joshua R.; Wilson, Sandra Jo
2014-01-01
The purpose of this project is to demonstrate the practical methods developed to utilize a dataset consisting of both multivariate and multilevel effect size data. The context for this project is a large-scale meta-analytic review of the predictors of academic achievement. This project is guided by three primary research questions: (1) How do we…
AFRL/Cornell Information Assurance Institute
2007-03-01
...collaborations involving Cornell and AFRL researchers, with AFRL researchers able to participate in Cornell research projects, facilitating technology ... approach to developing a science base and technology for supporting large-scale reliable distributed systems. First, solutions to core problems were...
Examining What We Mean by "Collaboration" in Collaborative Action Research: A Cross-Case Analysis
ERIC Educational Resources Information Center
Bruce, Catherine D.; Flynn, Tara; Stagg-Peterson, Shelley
2011-01-01
The purpose of this paper is to report on the nature of collaboration in a multi-year, large-scale collaborative action research project in which a teachers' federation (in Ontario, Canada), university researchers and teachers partnered to investigate teacher-selected topics for inquiry. Over two years, 14 case studies were generated involving six…
A Navy Shore Activity Manpower Planning System for Civilians. Technical Report No. 24.
ERIC Educational Resources Information Center
Niehaus, R. J.; Sholtz, D.
This report describes the U.S. Navy Shore Activity Manpower Planning System (SAMPS) advanced development research project. This effort is aimed at large-scale feasibility tests of manpower models for large Naval installations. These local planning systems are integrated with Navy-wide information systems on a data-communications network accessible…
NASA Astrophysics Data System (ADS)
Tsai, Kuang-Jung; Chiang, Jie-Lun; Lee, Ming-Hsi; Chen, Yie-Ruey
2017-04-01
More than 2,900 mm of accumulated rainfall was recorded within 3 consecutive days during Typhoon Morakot in August 2009. Very serious landslides and sediment-related disasters were induced by this heavy rainfall event. The satellite image analysis project conducted by the Soil and Water Conservation Bureau after the Morakot event identified more than 10,904 landslide sites with a total sliding area of 18,113 ha. At the same time, all severe sediment-related disaster areas were characterized based on their disaster type, scale, topography, major bedrock formations and geologic structures during the period of extremely heavy rainfall events that occurred in southern Taiwan. Characteristics and mechanisms of large-scale landslides are collected on the basis of field investigation technology integrated with GPS/GIS/RS techniques. In order to decrease the risk of large-scale landslides on slope land, a strategy for slope land conservation and a critical rainfall database should be set up and executed as soon as possible. Meanwhile, establishing a critical rainfall value for predicting large-scale landslides induced by heavy rainfall has become an important issue of serious concern to the government and all people living in Taiwan.
The mechanisms of large-scale landslides, rainfall frequency analysis, sediment budget estimation and river hydraulic analysis under conditions of extreme climate change during the past 10 years are the central concerns of this research. Hopefully, the results developed from this research can be used as a warning system for predicting large-scale landslides in southern Taiwan. Keywords: heavy rainfall, large-scale landslides, critical rainfall value
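Critical rainfall values of the kind this study pursues are commonly operationalized as intensity-duration (I-D) thresholds. The sketch below uses Caine's (1980) global coefficients purely for illustration; they are assumptions, not the Taiwan-specific values the research aims to derive, and a real warning system would calibrate them against local landslide inventories.

```python
# Hedged sketch of an intensity-duration (I-D) landslide threshold check.
# The coefficients are Caine's (1980) global values, NOT the critical
# rainfall values this Taiwan study sets out to establish.

def caine_threshold(duration_h):
    """Critical mean rainfall intensity (mm/h) for a storm of given duration (h)."""
    return 14.82 * duration_h ** -0.39

def exceeds_threshold(total_rain_mm, duration_h):
    """True if the storm's mean intensity lies above the I-D threshold curve."""
    intensity = total_rain_mm / duration_h
    return intensity > caine_threshold(duration_h)

# Typhoon Morakot figures from the abstract: roughly 2,900 mm over 3 days (72 h).
morakot = exceeds_threshold(2900, 72)
```

With these illustrative coefficients, Morakot's mean intensity (about 40 mm/h over 72 h) sits far above the threshold, consistent with the widespread landsliding the abstract reports.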
A Primer on Infectious Disease Bacterial Genomics
Petkau, Aaron; Knox, Natalie; Graham, Morag; Van Domselaar, Gary
2016-01-01
SUMMARY The number of large-scale genomics projects is increasing due to the availability of affordable high-throughput sequencing (HTS) technologies. The use of HTS for bacterial infectious disease research is attractive because one whole-genome sequencing (WGS) run can replace multiple assays for bacterial typing, molecular epidemiology investigations, and more in-depth pathogenomic studies. The computational resources and bioinformatics expertise required to accommodate and analyze the large amounts of data pose new challenges for researchers embarking on genomics projects for the first time. Here, we present a comprehensive overview of a bacterial genomics project from beginning to end, with a particular focus on the planning and computational requirements for HTS data, and provide a general understanding of the analytical concepts needed to develop a workflow that will meet the objectives and goals of HTS projects. PMID:28590251
Advanced Model for Extreme Lift and Improved Aeroacoustics (AMELIA)
NASA Technical Reports Server (NTRS)
Lichtwardt, Jonathan; Paciano, Eric; Jameson, Tina; Fong, Robert; Marshall, David
2012-01-01
With the very recent advent of NASA's Environmentally Responsible Aviation Project (ERA), which is dedicated to designing aircraft that will reduce the impact of aviation on the environment, there is a need for research and development of methodologies to minimize fuel burn and emissions and to reduce the community noise produced by regional airliners. ERA tackles airframe technology, propulsion technology, and vehicle systems integration to meet performance objectives in a time frame that brings the aircraft to a Technology Readiness Level (TRL) of 4-6 by 2020 (deemed N+2). The preceding project with goals similar to ERA's was NASA's Subsonic Fixed Wing (SFW) project, which focused on research to improve prediction methods and technologies for lower-noise, lower-emission, higher-performing subsonic aircraft for the Next Generation Air Transportation System. The work in this investigation was performed under NASA Research Announcement (NRA) contract #NNL07AA55C, funded by Subsonic Fixed Wing. The project started in 2007 with the specific goal of conducting a large-scale wind tunnel test along with the development of new and improved predictive codes for advanced powered-lift concepts. Many of the predictive codes were used to refine the wind tunnel model's outer mold line design. The goal of the large-scale wind tunnel test was to investigate powered-lift technologies and provide an experimental database to validate current and future modeling techniques. The powered-lift concept investigated was a Circulation Control (CC) wing in conjunction with over-the-wing-mounted engines, which entrain the exhaust to increase lift beyond that generated by CC technologies alone. The NRA was a five-year effort; during the first year the objective was to select and refine CESTOL concepts and then to complete a preliminary design of a large-scale wind tunnel model for the large-scale test.
During the second, third, and fourth years the large-scale wind tunnel model was designed, manufactured, and calibrated, and during the fifth year the large-scale wind tunnel test was conducted. This technical memo describes all phases of the Advanced Model for Extreme Lift and Improved Aeroacoustics (AMELIA) project and provides a brief summary of the background and modeling efforts involved in the NRA. The conceptual designs considered for this project and the decision process for the configuration adapted for a wind tunnel model are briefly discussed, as are the internal configuration of AMELIA and the internal measurements chosen to build a database of experimental data for future computational model validations. The external experimental techniques employed during the test, along with the large-scale wind tunnel test facility, are covered in great detail. Experimental measurements in the database include forces and moments, surface pressure distributions, local skin friction measurements, boundary and shear layer velocity profiles, far-field acoustic data, and noise signatures from turbofan propulsion simulators. Results and discussion of the circulation control performance, the over-the-wing-mounted engines, and their combined performance are also presented.
Realtime monitoring of bridge scour using remote monitoring technology
DOT National Transportation Integrated Search
2011-02-01
The research performed in this project focuses on the application of instruments including accelerometers and tiltmeters to monitor bridge scour. First, two large-scale laboratory experiments were performed. One experiment is the simulation of a ...
An Overview of the NASA FAP Hypersonics Project Airbreathing Propulsion Research
NASA Technical Reports Server (NTRS)
Auslender, A. H.; Suder, Kenneth L.; Thomas, Scott R.
2009-01-01
The propulsion research portfolio of the National Aeronautics and Space Administration Fundamental Aeronautics Program Hypersonics Project encompasses a significant number of technical tasks that are aligned to achieve mastery and intellectual stewardship of the core competencies in the hypersonic-flight regime. An overall coordinated programmatic and technical effort has been structured to advance the state-of-the-art, via both experimental and analytical efforts. A subset of the entire hypersonics propulsion research portfolio is presented in this overview paper. To this end, two programmatic research disciplines are discussed; namely, (1) the Propulsion Discipline, including three associated research elements: the X-51A partnership, the HIFiRE-2 partnership, and the Durable Combustor Rig, and (2) the Turbine-Based Combined Cycle Discipline, including three associated research elements: the Combined Cycle Engine Large Scale Inlet Mode Transition Experiment, the small-scale Inlet Mode Transition Experiment, and the High-Mach Fan Rig.
Investigating potential transferability of place-based research in land system science
NASA Astrophysics Data System (ADS)
Václavík, Tomáš; Langerwisch, Fanny; Cotter, Marc; Fick, Johanna; Häuser, Inga; Hotes, Stefan; Kamp, Johannes; Settele, Josef; Spangenberg, Joachim H.; Seppelt, Ralf
2016-09-01
Much of our knowledge about land use and ecosystem services in interrelated social-ecological systems is derived from place-based research. While local and regional case studies provide valuable insights, it is often unclear how relevant this research is beyond the study areas. Drawing generalized conclusions about practical solutions to land management from local observations and formulating hypotheses applicable to other places in the world requires that we identify patterns of land systems that are similar to those represented by the case study. Here, we utilize the previously developed concept of land system archetypes to investigate potential transferability of research from twelve regional projects implemented in a large joint research framework that focus on issues of sustainable land management across four continents. For each project, we characterize its project archetype, i.e. the unique land system based on a synthesis of more than 30 datasets of land-use intensity, environmental conditions and socioeconomic indicators. We estimate the transferability potential of project research by calculating the statistical similarity of locations across the world to the project archetype, assuming higher transferability potentials in locations with similar land system characteristics. Results show that areas with high transferability potentials are typically clustered around project sites but for some case studies can be found in regions that are geographically distant, especially when values of considered variables are close to the global mean or where the project archetype is driven by large-scale environmental or socioeconomic conditions. Using specific examples from the local case studies, we highlight the merit of our approach and discuss the differences between local realities and information captured in global datasets. 
The proposed method provides a blueprint for large research programs to assess potential transferability of place-based studies to other geographical areas and to indicate possible gaps in research efforts.
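As a minimal illustration of the similarity calculation described above, the sketch below standardizes a stack of land-system indicator grids and scores every location by its distance to a project archetype (the indicator profile at the case-study site). The synthetic indicator data, the Euclidean metric, and the distance-to-similarity mapping are all assumptions for illustration, not the study's actual method details.

```python
import numpy as np

def transferability(indicators: np.ndarray, archetype: np.ndarray) -> np.ndarray:
    """indicators: (n_vars, n_cells) raw indicator values per grid cell.
    archetype:  (n_vars,) indicator profile of the case-study region.
    Returns (n_cells,) similarity scores in (0, 1]; 1 = identical profile."""
    mu = indicators.mean(axis=1, keepdims=True)
    sd = indicators.std(axis=1, keepdims=True)
    z = (indicators - mu) / sd             # z-score each indicator globally
    za = (archetype[:, None] - mu) / sd    # archetype in the same z-space
    dist = np.linalg.norm(z - za, axis=0)  # Euclidean distance per cell
    return 1.0 / (1.0 + dist)              # map distance to a similarity score

rng = np.random.default_rng(0)
grids = rng.normal(size=(5, 1000))    # 5 synthetic indicators, 1000 grid cells
scores = transferability(grids, grids[:, 0])
print(scores[0])  # the project's own cell matches its archetype exactly
```

High scores then flag locations where land-system characteristics resemble the case study, i.e. where transferability potential is assumed to be higher.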
Final Report for Project FG02-05ER25685
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xiaosong Ma
2009-05-07
In this report, the PI summarizes the results and achievements of the sponsored project. Overall, the project has been very successful, producing both research results in massive data-intensive computing and data management for today's large-scale supercomputers, and open-source software products. During the project period, 14 conference/journal publications were produced and two PhD students were supported through exclusive or shared support from this award. In addition, the PI has recently been granted tenure at NC State University.
ERIC Educational Resources Information Center
Peterson, Norman G., Ed.
As part of the United States Army's Project A, research has been conducted to develop and field test a battery of experimental tests to complement the Armed Services Vocational Aptitude Battery in predicting soldiers' job performance. Project A is the United States Army's large-scale manpower effort to improve selection, classification, and…
The Saskatchewan River Basin - a large scale observatory for water security research (Invited)
NASA Astrophysics Data System (ADS)
Wheater, H. S.
2013-12-01
The 336,000 km2 Saskatchewan River Basin (SaskRB) in Western Canada illustrates many of the issues of Water Security faced world-wide. It poses globally-important science challenges due to the diversity in its hydro-climate and ecological zones. With one of the world's more extreme climates, it embodies environments of global significance, including the Rocky Mountains (source of the major rivers in Western Canada), the Boreal Forest (representing 30% of Canada's land area) and the Prairies (home to 80% of Canada's agriculture). Management concerns include: provision of water resources to more than three million inhabitants, including indigenous communities; balancing competing needs for water between different uses, such as urban centres, industry, agriculture, hydropower and environmental flows; issues of water allocation between upstream and downstream users in the three prairie provinces; managing the risks of flood and droughts; and assessing water quality impacts of discharges from major cities and intensive agricultural production. Superimposed on these issues is the need to understand and manage uncertain water futures, including effects of economic growth and environmental change, in a highly fragmented water governance environment. Key science questions focus on understanding and predicting the effects of land and water management and environmental change on water quantity and quality. To address the science challenges, observational data are necessary across multiple scales. This requires focussed research at intensively monitored sites and small watersheds to improve process understanding and fine-scale models. To understand large-scale effects on river flows and quality, land-atmosphere feedbacks, and regional climate, integrated monitoring, modelling and analysis is needed at large basin scale. 
And to support water management, new tools are needed for operational management and scenario-based planning that can be implemented across multiple scales and multiple jurisdictions. The SaskRB has therefore been developed as a large scale observatory, now a Regional Hydroclimate Project of the World Climate Research Programme's GEWEX project, and is available to contribute to the emerging North American Water Program. State-of-the-art hydro-ecological experimental sites have been developed for the key biomes, and a river and lake biogeochemical research facility, focussed on impacts of nutrients and exotic chemicals. Data are integrated at SaskRB scale to support the development of improved large scale climate and hydrological modelling products, the development of DSS systems for local, provincial and basin-scale management, and the development of related social science research, engaging stakeholders in the research and exploring their values and priorities for water security. The observatory provides multiple scales of observation and modelling required to develop: a) new climate, hydrological and ecological science and modelling tools to address environmental change in key environments, and their integrated effects and feedbacks at large catchment scale, b) new tools needed to support river basin management under uncertainty, including anthropogenic controls on land and water management and c) the place-based focus for the development of new transdisciplinary science.
ERIC Educational Resources Information Center
Grimshaw, Shirley; Wilson, Ian
2009-01-01
The aim of the project was to develop a set of online tools, systems and processes that would facilitate research at the University of Nottingham. The tools would be delivered via a portal, a one-stop place providing a Virtual Research Environment for all those involved in the research process. A predominantly bottom-up approach was used with…
Sudden Infant Death Syndrome, FY 1983. Special Report to Congress.
ERIC Educational Resources Information Center
National Inst. of Child Health and Human Development (NIH), Bethesda, MD.
This report describes research programs focusing on the sudden infant death syndrome (SIDS) and indicates some presently available results. Specific attention is given to research on sleep apnea, respiratory control, and hypoxia, as well as to infectious disease processes and immunology. Findings of a large-scale multidisciplinary SIDS project are…
A Strong Future for Public Library Use and Employment
ERIC Educational Resources Information Center
Griffiths, Jose-Marie; King, Donald W.
2011-01-01
The latest and most comprehensive assessment of public librarians' education and career paths to date, this important volume reports on a large-scale research project performed by authors Jose-Marie Griffiths and Donald W. King. Presented in collaboration with the Office for Research and Statistics (ORS), the book includes an examination of trends…
[Support Team for Investigator-Initiated Clinical Research].
Fujii, Hisako
2017-07-01
Investigator-initiated clinical research is that in which investigators in academia plan and carry out their own clinical research. For large-scale clinical research, a team should be organized that includes investigators and supporting staff, who promote smooth research performance by fulfilling their respective roles. The supporting staff should include project managers, administrative personnel, billing personnel, data managers, and clinical research coordinators. In this article, I present the current status of clinical research support and introduce the research organization of the Dominantly Inherited Alzheimer Network (DIAN) study, an investigator-initiated international clinical research study, with particular emphasis on the role of the project management staff and clinical research coordinators.
An overview of the Hadoop/MapReduce/HBase framework and its current applications in bioinformatics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taylor, Ronald C.
Bioinformatics researchers are increasingly confronted with analysis of ultra large-scale data sets, a problem that will only increase at an alarming rate in coming years. Recent developments in open source software, that is, the Hadoop project and associated software, provide a foundation for scaling to petabyte-scale data warehouses on Linux clusters, providing fault-tolerant parallelized analysis on such data using a programming style named MapReduce. An overview is given of the current usage within the bioinformatics community of Hadoop, a top-level Apache Software Foundation project, and of associated open source software projects. The concepts behind Hadoop and the associated HBase project are defined, and current bioinformatics software that employs Hadoop is described. The focus is on next-generation sequencing, as the leading application area to date.
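To make the MapReduce programming style named in this overview concrete, here is a toy sketch in plain Python that counts k-mers across sequencing reads with separate map and reduce functions. The reads are invented, and this in-process version only mimics Hadoop's map/shuffle/reduce phases; a real deployment would express the same two functions as distributed Hadoop jobs.

```python
from collections import Counter
from functools import reduce

def mapper(read: str, k: int = 3) -> Counter:
    """Map phase: emit (k-mer, count) pairs for one sequencing read."""
    return Counter(read[i:i + k] for i in range(len(read) - k + 1))

def reducer(a: Counter, b: Counter) -> Counter:
    """Reduce phase: merge partial counts from independent mappers."""
    return a + b

# Toy reads standing in for a sequencing data set.
reads = ["GATTACA", "TACAGAT"]
totals = reduce(reducer, (mapper(r) for r in reads))
print(totals["TAC"])  # "TAC" occurs once in each read -> 2
```

The appeal for bioinformatics is that the mapper sees one record at a time, so the same logic scales from two toy reads to billions of reads spread across a cluster, with the framework handling the shuffle and fault tolerance.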
ERIC Educational Resources Information Center
Qvortrup, Lars
2016-01-01
Based on experiences from a number of large scale data- and research-informed school development projects in Denmark and Norway, led by the author, three hypotheses are discussed: that an effective way of linking research and practice is achieved (1) using a capacity building approach, that is, to collaborate in the practical school context…
The role of ethics in data governance of large neuro-ICT projects.
Stahl, Bernd Carsten; Rainey, Stephen; Harris, Emma; Fothergill, B Tyr
2018-05-14
We describe current practices of ethics-related data governance in large neuro-ICT projects, identify gaps in current practice, and put forward recommendations on how to collaborate ethically in complex regulatory and normative contexts. We undertake a survey of published principles of data governance of large neuro-ICT projects. This grounds an approach to a normative analysis of current data governance approaches. Several ethical issues are well covered in the data governance policies of neuro-ICT projects, notably data protection and attribution of work. Projects use a set of similar policies to ensure users behave appropriately. However, many ethical issues are not covered at all. Implementation and enforcement of policies remain vague. The data governance policies we investigated indicate that the neuro-ICT research community is currently close-knit and that shared assumptions are reflected in infrastructural aspects. This explains why many ethical issues are not explicitly included in data governance policies at present. With neuro-ICT research growing in scale, scope, and international involvement, these shared assumptions should be made explicit and reflected in data governance.
NASA Technical Reports Server (NTRS)
Andrews, E. H., Jr.; Mackley, E. A.
1976-01-01
The NASA Hypersonic Research Engine (HRE) Project was initiated for the purpose of advancing the technology of airbreathing propulsion for hypersonic flight. The project encompassed a large component (inlet, combustor, and nozzle) and structures development program. Tests of a full-scale (18 in. diameter cowl, 87 in. long) HRE concept, designated the Aerothermodynamic Integration Model (AIM), were conducted at Mach numbers of 5, 6, and 7. Computer program results for the Mach 6 component integration tests are presented.
Hybrid methods for cybersecurity analysis :
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davis, Warren Leon,; Dunlavy, Daniel M.
2014-01-01
Early 2010 saw a significant change in adversarial techniques aimed at network intrusion: a shift from malware delivered via email attachments toward the use of hidden, embedded hyperlinks to initiate sequences of downloads and interactions with web sites and network servers containing malicious software. Enterprise security groups were well poised and experienced in defending against the former attacks, but the new types of attacks were larger in number, more challenging to detect, dynamic in nature, and required the development of new technologies and analytic capabilities. The Hybrid LDRD project was aimed at delivering new capabilities in large-scale data modeling and analysis to enterprise security operators and analysts and understanding the challenges of detection and prevention of emerging cybersecurity threats. Leveraging previous LDRD research efforts and capabilities in large-scale relational data analysis, large-scale discrete data analysis and visualization, and streaming data analysis, new modeling and analysis capabilities were quickly brought to bear on the problems in email phishing and spear phishing attacks in the Sandia enterprise security operational groups at the onset of the Hybrid project. As part of this project, a software development and deployment framework was created within the security analyst workflow tool sets to facilitate the delivery and testing of new capabilities as they became available, and machine learning algorithms were developed to address the challenge of dynamic threats. Furthermore, researchers from the Hybrid project were embedded in the security analyst groups for almost a full year, engaged in daily operational activities and routines, creating an atmosphere of trust and collaboration between the researchers and security personnel.
The Hybrid project has altered the way that research ideas can be incorporated into the production environments of Sandia's enterprise security groups, reducing time to deployment from months and years to hours and days for the application of new modeling and analysis capabilities to emerging threats. The development and deployment framework has been generalized into the Hybrid Framework and incorporated into several LDRD, WFO, and DOE/CSL projects and proposals. Most importantly, the Hybrid project has provided Sandia security analysts with new, scalable, extensible analytic capabilities that have resulted in alerts not detectable using their previous workflow tool sets.
High-Resolution Climate Data Visualization through GIS- and Web-based Data Portals
NASA Astrophysics Data System (ADS)
WANG, X.; Huang, G.
2017-12-01
Sound decisions on climate change adaptation rely on an in-depth assessment of potential climate change impacts at regional and local scales, which usually requires finer-resolution climate projections at both spatial and temporal scales. However, effective downscaling of global climate projections is practically difficult due to the lack of computational resources and/or long-term reference data. Although a large volume of downscaled climate data has been made available to the public, how to understand and interpret the large-volume climate data and how to use it to drive impact assessment and adaptation studies remain challenging for both impact researchers and decision makers. Such difficulties have become major barriers preventing informed climate change adaptation planning at regional scales. Therefore, this research will explore new GIS- and web-based technologies to help visualize large-volume regional climate data with high spatiotemporal resolutions. A user-friendly public data portal, named the Climate Change Data Portal (CCDP, http://ccdp.network), will be established to allow intuitive and open access to high-resolution regional climate projections at local scales. The CCDP offers functions for visual representation through geospatial maps and for data downloading, covering a variety of climate variables (e.g., temperature, precipitation, relative humidity, solar radiation, and wind) at multiple spatial resolutions (i.e., 25-50 km) and temporal resolutions (i.e., annual, seasonal, monthly, daily, and hourly). The vast amount of information the CCDP encompasses can provide a crucial basis for assessing impacts of climate change on local communities and ecosystems and for supporting better decision making under a changing climate.
Final Report: Large-Scale Optimization for Bayesian Inference in Complex Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ghattas, Omar
2013-10-15
The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focuses on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. Our research is directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. Our efforts are integrated in the context of a challenging testbed problem that considers subsurface reacting flow and transport. The MIT component of the SAGUARO Project addresses the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas-Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower-dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as "reduce then sample" and "sample then reduce." In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to achieve their speedups.
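The "reduce then sample" idea can be sketched in a few lines: run a Metropolis sampler against a cheap reduced-order forward model instead of an expensive full simulation. The quadratic "full model", its linear surrogate, and every parameter value below are invented for illustration and are not the SAGUARO formulation.

```python
import math
import random

def full_model(m):        # stand-in for an expensive forward simulation
    return m + 0.05 * m * m

def reduced_model(m):     # cheap surrogate, accurate near m = 0
    return m

def log_posterior(m, forward, obs=1.0, noise=0.5, prior_sd=2.0):
    """Gaussian likelihood around one observation plus a Gaussian prior."""
    misfit = (forward(m) - obs) / noise
    return -0.5 * (misfit * misfit + (m / prior_sd) ** 2)

def metropolis(forward, n=5000, step=0.5, seed=42):
    """Random-walk Metropolis; each iteration costs one forward solve."""
    random.seed(seed)
    m, lp = 0.0, log_posterior(0.0, forward)
    samples = []
    for _ in range(n):
        prop = m + random.gauss(0.0, step)
        lp_prop = log_posterior(prop, forward)
        if math.log(random.random()) < lp_prop - lp:   # accept/reject
            m, lp = prop, lp_prop
        samples.append(m)
    return samples

cheap = metropolis(reduced_model)   # every step avoids the full model
print(sum(cheap) / len(cheap))      # posterior mean, pulled toward the data
```

In the "sample then reduce" variant one would instead keep `full_model` and use its adjoint-based gradient and Hessian information to propose better moves, cutting the number of samples rather than the cost per sample.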
Geomorphic analysis of large alluvial rivers
NASA Astrophysics Data System (ADS)
Thorne, Colin R.
2002-05-01
Geomorphic analysis of a large river presents particular challenges and requires a systematic and organised approach because of the spatial scale and system complexity involved. This paper presents a framework and blueprint for geomorphic studies of large rivers developed in the course of basic, strategic and project-related investigations of a number of large rivers. The framework demonstrates the need to begin geomorphic studies early in the pre-feasibility stage of a river project and carry them through to implementation and post-project appraisal. The blueprint breaks down the multi-layered and multi-scaled complexity of a comprehensive geomorphic study into a number of well-defined and semi-independent topics, each of which can be performed separately to produce a clearly defined, deliverable product. Geomorphology increasingly plays a central role in multi-disciplinary river research and the importance of effective quality assurance makes it essential that audit trails and quality checks are hard-wired into study design. The structured approach presented here provides output products and production trails that can be rigorously audited, ensuring that the results of a geomorphic study can stand up to the closest scrutiny.
Output Control Technologies for a Large-scale PV System Considering Impacts on a Power Grid
NASA Astrophysics Data System (ADS)
Kuwayama, Akira
The mega-solar demonstration project named “Verification of Grid Stabilization with Large-scale PV Power Generation systems” had been completed in March 2011 at Wakkanai, the northernmost city of Japan. The major objectives of this project were to evaluate adverse impacts of large-scale PV power generation systems connected to the power grid and develop output control technologies with integrated battery storage system. This paper describes the outline and results of this project. These results show the effectiveness of battery storage system and also proposed output control methods for a large-scale PV system to ensure stable operation of power grids. NEDO, New Energy and Industrial Technology Development Organization of Japan conducted this project and HEPCO, Hokkaido Electric Power Co., Inc managed the overall project.
Simulation research on the process of large scale ship plane segmentation intelligent workshop
NASA Astrophysics Data System (ADS)
Xu, Peng; Liao, Liangchuang; Zhou, Chao; Xue, Rui; Fu, Wei
2017-04-01
The large-scale ship plane segmentation intelligent workshop is a new concept, with no prior research in related fields in China or abroad. The mode of production must be transformed from the existing Industry 2.0 (or partial Industry 3.0) model, that is, from "human analysis and judgment + machine manufacturing" to "machine analysis and judgment + machine manufacturing". This transformation involves a great number of decisions on both management and technology, such as workshop structure evolution, development of intelligent equipment, and changes in the business model, and with them a reformation of the whole workshop. Process simulation in this project verifies the general layout and process flow of the large-scale ship plane segmentation intelligent workshop and analyzes its working efficiency, which is significant for the next step of the transformation.
The CELSS breadboard project: Plant production
NASA Technical Reports Server (NTRS)
Knott, William M.
1990-01-01
NASA's Breadboard Project for the Controlled Ecological Life Support System (CELSS) program is described. The simplified schematic of a CELSS is given. A modular approach is taken to building the CELSS Breadboard. Each module is researched in order to develop a data set for each one prior to its integration into the complete system. The data being obtained from the Biomass Production Module or the Biomass Production Chamber is examined. The other primary modules, food processing and resource recovery or waste management, are discussed briefly. The crew habitat module is not discussed. The primary goal of the Breadboard Project is to scale-up research data to an integrated system capable of supporting one person in order to establish feasibility for the development and operation of a CELSS. Breadboard is NASA's first attempt at developing a large scale CELSS.
Regional climate projection of the Maritime Continent using the MIT Regional Climate Model
NASA Astrophysics Data System (ADS)
IM, E. S.; Eltahir, E. A. B.
2014-12-01
Given that warming of the climate system is unequivocal (IPCC AR5), accurate assessment of future climate is essential to understanding the impact of climate change due to global warming. Modelling the climate change of the Maritime Continent is particularly challenging and shows a high degree of uncertainty: compared to other regions, model agreement on future projections in response to anthropogenic emission forcings is much lower. Furthermore, the spatial and temporal behaviors of climate projections vary significantly due to complex geographical conditions and a wide range of scale interactions. To obtain fine-scale climate information (27 km) suitable for representing the complexity of climate change over the Maritime Continent, dynamical downscaling is performed using the MIT regional climate model (MRCM) for two thirty-year periods representing reference (1970-1999) and future (2070-2099) climate. Initial and boundary conditions are provided by Community Earth System Model (CESM) simulations under the emission scenarios projected by the MIT Integrated Global System Model (IGSM). Changes in mean climate as well as the frequency and intensity of extreme climate events are investigated at various temporal and spatial scales. Our analysis centers on the different behavior of changes in convective and large-scale precipitation over land vs. ocean during dry vs. wet seasons. In addition, we attempt to identify the added value of downscaling over the Maritime Continent through a comparison between the MRCM and CESM projections. Acknowledgements. This research was supported by the National Research Foundation Singapore through the Singapore MIT Alliance for Research and Technology's Center for Environmental Sensing and Modeling interdisciplinary research program.
Documentation of validity for the AT-SAT computerized test battery. Volume 2
DOT National Transportation Integrated Search
2001-03-01
This document is a comprehensive report on a large-scale research project to develop and validate a computerized selection battery to hire Air Traffic Control Specialists (ATCSs) for the Federal Aviation Administration (FAA). The purpose of this ...
Documentation of validity for the AT-SAT computerized test battery. Volume 1
DOT National Transportation Integrated Search
2001-03-01
This document is a comprehensive report on a large-scale research project to develop and validate a computerized selection battery to hire Air Traffic Control Specialists (ATCSs) for the Federal Aviation Administration (FAA). The purpose of this ...
Developing Renewable Energy Projects Larger Than 10 MWs at Federal Facilities (Book)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2013-03-01
To accomplish Federal goals for renewable energy, sustainability, and energy security, large-scale renewable energy projects must be developed and constructed on Federal sites at a significant scale with significant private investment. The U.S. Department of Energy's Federal Energy Management Program (FEMP) helps Federal agencies meet these goals and assists agency personnel in navigating the complexities of developing such projects and attracting the necessary private capital to complete them. This guide is intended to provide a general resource that will begin to develop the Federal employee's awareness and understanding of the project developer's operating environment and the private sector's awareness and understanding of the Federal environment. Because the vast majority of the investment required to meet the goals for large-scale renewable energy projects will come from the private sector, this guide has been organized to match Federal processes with typical phases of commercial project development. The main purpose of this guide is to provide a project development framework that allows the Federal Government, private developers, and investors to work in a coordinated fashion on large-scale renewable energy projects. The framework includes key elements that describe a successful, financially attractive large-scale renewable energy project.
WHK Interns Highlight the Importance of Their Work | Poster
The Werner H. Kirsten (WHK) student interns at the National Cancer Institute (NCI) at Frederick are participating in groundbreaking cancer research, along with large-scale projects and technological advancements, during their senior year of high school. The interns at NCI at Frederick are given more than the opportunity to watch the research; they participate in and conduct
Development of an Internet Collaborative Learning Behavior Scale--Preliminary Results.
ERIC Educational Resources Information Center
Hsu, Ti; Wang, Hsiu Fei
It is well known that math phobia is a common problem among young schoolchildren. It is a challenge for educational practitioners and academic researchers to figure out ways to overcome the problem. Collaborative team learning has been proposed as one alternative. This study was part of a large, ongoing research project designed to…
The Role of Empathy in Preparing Teachers to Tackle Bullying
ERIC Educational Resources Information Center
Murphy, Helena; Tubritt, John; Norman, James O'Higgins
2018-01-01
Much research on bullying behaviour in schools among students has been carried out since the 1970s, when Olweus started a large-scale project in Norway which is now generally regarded as the first scientific study of bullying. Yet there has been little research on how teachers respond to reports of bullying and tackle bullying behaviour in…
Cruise noise of the 2/9th scale model of the Large-scale Advanced Propfan (LAP) propeller, SR-7A
NASA Technical Reports Server (NTRS)
Dittmar, James H.; Stang, David B.
1987-01-01
Noise data on the Large-scale Advanced Propfan (LAP) propeller model SR-7A were taken in the NASA Lewis Research Center 8 x 6 foot Wind Tunnel. The maximum blade passing tone noise first rises with increasing helical tip Mach number to a peak level, then remains the same or decreases from its peak level when going to higher helical tip Mach numbers. This trend was observed for operation at both constant advance ratio and approximately equal thrust. This noise reduction, or leveling out, at high helical tip Mach numbers points to the use of higher propeller tip speeds as a possible method to limit airplane cabin noise while maintaining high flight speed and efficiency. Projections of the tunnel model data are made to the full-scale LAP propeller mounted on the test-bed aircraft and compared with predictions. The prediction method is found to be somewhat conservative in that it slightly overpredicts the projected model data at the peak.
The Earth Microbiome Project and Global Systems Biology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gilbert, Jack A.; Jansson, Janet K.; Knight, Rob
Recently, we published the first large-scale analysis of data from the Earth Microbiome Project (1, 2), a truly multidisciplinary research program involving more than 500 scientists and 27,751 samples acquired from 43 countries. These samples represent myriad specimen types and span a wide range of biotic and abiotic factors, geographic locations, and physicochemical properties. The database (https://qiita.ucsd.edu/emp/) is still growing, with over 90,000 amplicon datasets, >500 metagenomic runs, and metabolomics datasets from a similar number of samples. Importantly, the techniques, data and analytical tools are all standardized and publicly accessible, providing a framework to support research at a scale of integration that just 7 years ago seemed impossible.
Integrated Mid-Continent Carbon Capture, Sequestration & Enhanced Oil Recovery Project
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brian McPherson
2010-08-31
A consortium of research partners led by the Southwest Regional Partnership on Carbon Sequestration and industry partners, including CAP CO2 LLC, Blue Source LLC, Coffeyville Resources, Nitrogen Fertilizers LLC, Ash Grove Cement Company, Kansas Ethanol LLC, Headwaters Clean Carbon Services, Black & Veatch, and Schlumberger Carbon Services, conducted a feasibility study of a large-scale CCS commercialization project that included large-scale CO2 sources. The overall objective of this project, entitled the 'Integrated Mid-Continent Carbon Capture, Sequestration and Enhanced Oil Recovery Project', was to design an integrated system of US mid-continent industrial CO2 sources with CO2 capture, and geologic sequestration in deep saline formations and in oil field reservoirs with concomitant EOR. Findings of this project suggest that deep saline sequestration in the mid-continent region is not feasible without major financial incentives, such as tax credits, that do not exist at this time. However, results of the analysis suggest that enhanced oil recovery with carbon sequestration is indeed feasible and practical for specific types of geologic settings in the Midwestern U.S.
The Snowmastodon Project: cutting-edge science on the blade of a bulldozer
Pigati, Jeffery S.; Miller, Ian M.; Johnson, Kirk R.
2015-01-01
Cutting-edge science happens at a variety of scales, from the individual and intimate to the large-scale and collaborative. The publication of a special issue of Quaternary Research in Nov. 2014 dedicated to the scientific findings of the “Snowmastodon Project” highlights what can be done when natural history museums, governmental agencies, and academic institutions work toward a common goal.
Developing Renewable Energy Projects Larger Than 10 MWs at Federal Facilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2013-03-01
To accomplish Federal goals for renewable energy, sustainability, and energy security, large-scale renewable energy projects must be developed and constructed on Federal sites at a significant scale with significant private investment. For the purposes of this Guide, large-scale Federal renewable energy projects are defined as renewable energy facilities larger than 10 megawatts (MW) that are sited on Federal property and lands and typically financed and owned by third parties. The U.S. Department of Energy’s Federal Energy Management Program (FEMP) helps Federal agencies meet these goals and assists agency personnel in navigating the complexities of developing such projects and attracting the necessary private capital to complete them. This Guide is intended to provide a general resource that will begin to develop the Federal employee’s awareness and understanding of the project developer’s operating environment and the private sector’s awareness and understanding of the Federal environment. Because the vast majority of the investment required to meet the goals for large-scale renewable energy projects will come from the private sector, this Guide has been organized to match Federal processes with typical phases of commercial project development. FEMP collaborated with the National Renewable Energy Laboratory (NREL) and professional project developers on this Guide to ensure that Federal projects have key elements recognizable to private sector developers and investors. The main purpose of this Guide is to provide a project development framework that allows the Federal Government, private developers, and investors to work in a coordinated fashion on large-scale renewable energy projects. The framework includes key elements that describe a successful, financially attractive large-scale renewable energy project. This framework begins the translation between the Federal and private sector operating environments.
Large-scale Meteorological Patterns Associated with Extreme Precipitation Events over Portland, OR
NASA Astrophysics Data System (ADS)
Aragon, C.; Loikith, P. C.; Lintner, B. R.; Pike, M.
2017-12-01
Extreme precipitation events can have profound impacts on human life and infrastructure, with broad implications across a range of stakeholders. Changes to extreme precipitation events are a projected outcome of climate change that warrants further study, especially at regional to local scales. While global climate models are generally capable of simulating mean climate at global to regional scales with reasonable skill, resiliency and adaptation decisions are made at local scales, where most state-of-the-art climate models are limited by coarse resolution. Characterization of large-scale meteorological patterns associated with extreme precipitation events at local scales can provide climatic information without this scale limitation, thus facilitating stakeholder decision-making. This research will use synoptic climatology as a tool to characterize the key large-scale meteorological patterns associated with extreme precipitation events in the Portland, Oregon metro region. Composite analysis of meteorological patterns associated with extreme precipitation days, and associated watershed-specific flooding, is employed to enhance understanding of the climatic drivers behind such events. The self-organizing maps approach is then used to characterize the within-composite variability of the large-scale meteorological patterns associated with extreme precipitation events, allowing us to better understand the different types of meteorological conditions that lead to high-impact precipitation events and associated hydrologic impacts. A more comprehensive understanding of the meteorological drivers of extremes will aid in evaluating the ability of climate models to capture key patterns associated with extreme precipitation over Portland and in better interpreting projections of future climate at impact-relevant scales.
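The self-organizing maps approach mentioned above can be illustrated with a toy 1-D SOM over synthetic anomaly fields. The regime structure, grid size, and node count below are invented for illustration and are far smaller than a real synoptic-climatology application would use:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "large-scale meteorological patterns": flattened anomaly
# fields for 200 extreme-precipitation days (sizes are illustrative).
n_days, n_grid = 200, 64
# Two underlying regimes (e.g., zonal vs. meridional flow) plus noise
regimes = rng.normal(size=(2, n_grid))
labels = rng.integers(0, 2, size=n_days)
fields = regimes[labels] + 0.3 * rng.normal(size=(n_days, n_grid))

# Minimal 1-D self-organizing map with 4 nodes
n_nodes, n_iter = 4, 2000
nodes = rng.normal(scale=0.1, size=(n_nodes, n_grid))
for t in range(n_iter):
    x = fields[rng.integers(n_days)]
    bmu = np.argmin(((nodes - x) ** 2).sum(axis=1))  # best-matching unit
    lr = 0.5 * (1 - t / n_iter)                      # decaying learning rate
    sigma = max(1.0 * (1 - t / n_iter), 0.5)         # neighborhood width
    dist = np.abs(np.arange(n_nodes) - bmu)
    h = np.exp(-(dist ** 2) / (2 * sigma ** 2))      # neighborhood function
    nodes += lr * h[:, None] * (x - nodes)

# Assign each extreme day to its closest node (its "pattern type")
assign = np.argmin(
    ((fields[:, None, :] - nodes[None, :, :]) ** 2).sum(axis=2), axis=1)
counts = np.bincount(assign, minlength=n_nodes)
print("days per SOM pattern:", counts)
```

In practice the nodes would be reshaped back into 2-D maps and plotted, giving the catalogue of within-composite pattern types the abstract describes.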
DOE Office of Scientific and Technical Information (OSTI.GOV)
Laubach, S.E.; Marrett, R.; Rossen, W.
The research for this project provides new technology to understand and successfully characterize, predict, and simulate reservoir-scale fractures. Such fractures have worldwide importance because of their influence on successful extraction of resources. The scope of this project includes creation and testing of new methods to measure, interpret, and simulate reservoir fractures that overcome the challenge of inadequate sampling. The key to these methods is the use of microstructures as guides to the attributes of the large fractures that control reservoir behavior. One accomplishment of the project research is a demonstration that these microstructures can be reliably and inexpensively sampled. Specific goals of this project were to: create and test new methods of measuring attributes of reservoir-scale fractures, particularly as fluid conduits, and test the methods on samples from reservoirs; extrapolate structural attributes to the reservoir scale through rigorous mathematical techniques and help build accurate and useful 3-D models of the interwell region; and design new ways to incorporate geological and geophysical information into reservoir simulation and verify the accuracy by comparison with production data. New analytical methods developed in the project are leading to a more realistic characterization of fractured reservoir rocks. Testing diagnostic and predictive approaches was an integral part of the research, and several tests were successfully completed.
Assessing and Projecting Greenhouse Gas Release due to Abrupt Permafrost Degradation
NASA Astrophysics Data System (ADS)
Saito, K.; Ohno, H.; Yokohata, T.; Iwahana, G.; Machiya, H.
2017-12-01
Permafrost is a large reservoir of frozen soil organic carbon (SOC; about half of all terrestrial storage). Its degradation (i.e., thawing) under global warming may therefore lead to a substantial amount of additional greenhouse gas (GHG) release. However, understanding of the processes, the geographical distribution of such hazards, and the implementation of the relevant processes in advanced climate models are still insufficient, so variations in permafrost remain one of the largest sources of uncertainty in climatic and biogeochemical assessments and projections. Thermokarst, induced by melting of ground ice in ice-rich permafrost, leads to dynamic surface subsidence of up to 60 m, which further affects local and regional societies and ecosystems in the Arctic. It can also accelerate large-scale warming through a positive feedback between released GHGs (especially methane), atmospheric warming and permafrost degradation. This three-year research project (2-1605, Environment Research and Technology Development Fund of the Ministry of the Environment, Japan) aims to assess and project the impacts of GHG release through dynamic permafrost degradation using in-situ and remote (e.g., satellite and airborne) observations, laboratory analysis of sampled ice and soil cores, and numerical modeling, by demonstrating the vulnerability distribution and the relative impacts of large-scale versus dynamic degradation.
Our preliminary laboratory analysis of ice and soil cores sampled in 2016 at Alaskan and Siberian sites largely underlain by ice-rich permafrost shows that, although the gas volumes trapped per unit mass are more or less homogeneous among sites for both ice and soil cores, large variations are found in the methane concentration of the trapped gases, ranging from a few ppm (similar to that of the atmosphere) to hundreds of thousands of ppm. We will also present our numerical approach to evaluating the relative impacts of GHGs released through dynamic permafrost degradation, implementing conceptual modeling to assess and project the distribution and affected amounts of ground ice and SOC.
Design and operation of an outdoor microalgae test facility
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weissman, J.C.; Tillett, D.M.; Goebel, R.P.
The objective of the project covered in this report is to establish and operate a facility in the American Southwest to test the concept of producing microalgae on a large scale. This microalgae would then be used as a feedstock for producing liquid fuels. The site chosen for this project was an existing water research station in Roswell, New Mexico; the climate and water resources are representative of those in the Southwest. For this project, researchers tested specific designs, modes of operation, and strains of microalgae; proposed and evaluated modifications to technological concepts; and assessed the progress toward meeting cost objectives.
What Will the Neighbors Think? Building Large-Scale Science Projects Around the World
Jones, Craig; Mrotzek, Christian; Toge, Nobu; Sarno, Doug
2017-12-22
Public participation is an essential ingredient for turning the International Linear Collider into a reality. Wherever the proposed particle accelerator is sited in the world, its neighbors -- in any country -- will have something to say about hosting a 35-kilometer-long collider in their backyards. When it comes to building large-scale physics projects, almost every laboratory has a story to tell. Three case studies from Japan, Germany and the US will be presented to examine how community relations are handled in different parts of the world. How do particle physics laboratories interact with their local communities? How do neighbors react to building large-scale projects in each region? How can the lessons learned from past experiences help in building the next big project? These and other questions will be discussed to engage the audience in an active dialogue about how a large-scale project like the ILC can be a good neighbor.
All for one and one for all: The value of grassroots collaboration in clinical research.
Al Wattar, Bassel H; Tamblyn, Jennifer
2017-08-01
Collaboration in health research is common in current practice. Engaging grassroots clinicians in the evidence synthesis and research process can deliver impactful results and reduce research wastage. The UKARCOG is a group of specialty trainees in obstetrics and gynaecology in the UK aiming to promote women's health research by delivering high-quality, impactful research and national audit projects. The collaborative enables trainees to develop essential academic skills and roll out multicentre research projects at high cost-effectiveness. Collective research work can face a number of challenges, such as establishing a joint authorship style, gaining institutional support, and acquiring funds to boost networking and deliver large-scale studies. Copyright © 2017 Elsevier B.V. All rights reserved.
eScience for molecular-scale simulations and the eMinerals project.
Salje, E K H; Artacho, E; Austen, K F; Bruin, R P; Calleja, M; Chappell, H F; Chiang, G-T; Dove, M T; Frame, I; Goodwin, A L; Kleese van Dam, K; Marmier, A; Parker, S C; Pruneda, J M; Todorov, I T; Trachenko, K; Tyer, R P; Walker, A M; White, T O H
2009-03-13
We review the work carried out within the eMinerals project to develop eScience solutions that facilitate a new generation of molecular-scale simulation work. Technological developments include the integration of compute and data systems, the development of collaborative frameworks, and new researcher-friendly tools for grid job submission, XML data representation, information delivery, metadata harvesting and metadata management. A number of diverse science applications illustrate how these tools are being used for large parameter-sweep studies, an emerging type of study for which the integration of computing, data and collaboration is essential.
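A parameter-sweep study of the kind described above boils down to enumerating parameter combinations and turning each into a grid job specification. The sketch below is a generic illustration; the parameter names and job-ID format are hypothetical, not eMinerals conventions:

```python
import itertools

# Hypothetical sweep axes for a molecular-scale simulation study
temperatures = [300, 600, 900]          # K
pressures = [0.1, 1.0, 10.0]            # GPa
defect_types = ["vacancy", "interstitial"]

# Each combination becomes one self-describing job specification,
# ready to hand to a grid submission tool.
jobs = []
for temp, pres, defect in itertools.product(temperatures, pressures, defect_types):
    jobs.append({
        "job_id": f"sim_T{temp}_P{pres}_{defect}",
        "temperature_K": temp,
        "pressure_GPa": pres,
        "defect": defect,
    })

print(f"{len(jobs)} jobs generated")   # 3 * 3 * 2 = 18
print(jobs[0]["job_id"])
```

Keeping the sweep definition declarative like this is what makes the later steps the abstract lists (metadata harvesting, collation of results) tractable: every output can be tagged with its generating parameters.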
Reality check in the project management of EU funding
NASA Astrophysics Data System (ADS)
Guo, Chenbo
2015-04-01
A talk addressing the workload, focuses, impacts and outcomes of project management (hereinafter PM). Two FP7 projects serve as objects of investigation. In the Earth Science sector, NACLIM is a large-scale collaborative project with 18 partners from Northern and Western Europe. NACLIM aims at investigating and quantifying the predictability of North Atlantic/Arctic sea surface temperature and sea ice variability and change on seasonal to decadal time scales, which have a crucial impact on weather and climate in Europe. PRIMO, from Political Science, is a global PhD program funded by the Marie Curie ITN instrument with 11 partners from Europe, Eurasia and BRICS countries, focusing on the rise of regional powers and its impact on international politics at large. Although the two projects are granted by different FP7 funding instruments, stem from different cultural backgrounds and have different goals, the inherent processes and the key focus of the PM are quite alike; only the operational management differs at some points. From the administrative point of view, understanding of both EU requirements and country-specific regulations is essential; it also helps us identify the grey areas in order to carry out the projects more efficiently. The talk will focus on our observations of the day-to-day PM flows, primarily project implementation, with a few particular cases: transparency issues, e.g. priority setting by non-research stakeholders including conflicts in the human resources field, end-user integration, gender issues raised during a monitoring visit, and ethical aspects in field research. Through a brief comparison of both projects we summarize a range of dos and don'ts, an "acting instead of reacting" line of action, and the conclusion in favour of systematic overall management instead of exclusively project controlling.
In a nutshell, the talk aims at providing the audience with a summary of the observations on management methodologies and toolkits applied in both projects, our best practices, and lessons learnt in coordinating large international consortia.
A process improvement model for software verification and validation
NASA Technical Reports Server (NTRS)
Callahan, John; Sabolish, George
1994-01-01
We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and space station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.
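The measurement-based approach described here can be illustrated with a toy cross-project metric aggregation. The project names, numbers, and metric definitions below are invented for illustration and are not from the IV&V Facility's actual metric set:

```python
# Aggregate simple defect metrics across projects to flag problem areas,
# in the spirit of the measurement-based process improvement described
# above. All data are invented for illustration.
projects = [
    {"name": "proj_A", "kloc": 120, "defects_found_vv": 340, "defects_escaped": 25},
    {"name": "proj_B", "kloc": 45,  "defects_found_vv": 90,  "defects_escaped": 30},
    {"name": "proj_C", "kloc": 300, "defects_found_vv": 800, "defects_escaped": 40},
]

for p in projects:
    total = p["defects_found_vv"] + p["defects_escaped"]
    p["defect_density"] = total / p["kloc"]                # defects per KLOC
    p["vv_effectiveness"] = p["defects_found_vv"] / total  # fraction caught by V&V

worst = min(projects, key=lambda p: p["vv_effectiveness"])
print(f"lowest V&V effectiveness: {worst['name']} "
      f"({worst['vv_effectiveness']:.0%} of defects caught before release)")
```

Comparing such metrics across projects is what lets an organization identify problem areas and target the incremental improvements the abstract describes.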
Developing A Large-Scale, Collaborative, Productive Geoscience Education Network
NASA Astrophysics Data System (ADS)
Manduca, C. A.; Bralower, T. J.; Egger, A. E.; Fox, S.; Ledley, T. S.; Macdonald, H.; Mcconnell, D. A.; Mogk, D. W.; Tewksbury, B. J.
2012-12-01
Over the past 15 years, the geoscience education community has grown substantially and developed broad and deep capacity for collaboration and dissemination of ideas. While this community is best viewed as emergent from complex interactions among changing educational needs and opportunities, we highlight the role of several large projects in the development of a network within this community. In the 1990s, three NSF projects came together to build a robust web infrastructure to support the production and dissemination of on-line resources: On The Cutting Edge (OTCE), Earth Exploration Toolbook, and Starting Point: Teaching Introductory Geoscience. Along with the contemporaneous Digital Library for Earth System Education, these projects engaged geoscience educators nationwide in exploring professional development experiences that produced lasting on-line resources, collaborative authoring of resources, and models for web-based support for geoscience teaching. As a result, a culture developed in the 2000s in which geoscience educators anticipated that resources for geoscience teaching would be shared broadly and that collaborative authoring would be productive and engaging. By this time, a diverse set of examples demonstrated the power of the web infrastructure in supporting collaboration, dissemination and professional development. Building on this foundation, more recent work has expanded both the size of the network and the scope of its work. Many large research projects initiated collaborations to disseminate resources supporting educational use of their data. Research results from the rapidly expanding geoscience education research community were integrated into the Pedagogies in Action website and OTCE. Projects engaged faculty across the nation in large-scale data collection and educational research. The Climate Literacy and Energy Awareness Network and OTCE engaged community members in reviewing the expanding body of on-line resources.
Building Strong Geoscience Departments sought to create, for departments, the same type of shared information base that was supporting individual faculty. The Teach the Earth portal and its underlying web development tools were used by NSF-funded projects in education to disseminate their results. Leveraging these funded efforts, the Climate Literacy Network has expanded this geoscience education community to include individuals broadly interested in fostering climate literacy. Most recently, the InTeGrate project is implementing inter-institutional collaborative authoring, testing and evaluation of curricular materials. While these projects represent only a fraction of the activity in geoscience education, they are important drivers in the development of a large, national, coherent geoscience education network with the ability to collaborate and disseminate information effectively. Importantly, the community is open and defined by active participation. Key mechanisms for engagement have included alignment of project activities with participants' needs and goals; productive face-to-face and virtual workshops, events, and series; stipends for completion of large products; and strong supporting staff to keep projects moving and assist with product production. One measure of its success is the adoption and adaptation of resources and models by emerging projects, which results in the continued growth of the network.
NASA Astrophysics Data System (ADS)
Mujumdar, Pradeep P.
2014-05-01
Climate change results in regional hydrologic change. The three prominent signals of global climate change, viz., increase in global average temperatures, rise in sea levels and change in precipitation patterns convert into signals of regional hydrologic change in terms of modifications in water availability, evaporative water demand, hydrologic extremes of floods and droughts, water quality, salinity intrusion in coastal aquifers, groundwater recharge and other related phenomena. A major research focus in hydrologic sciences in recent years has been assessment of impacts of climate change at regional scales. An important research issue addressed in this context deals with responses of water fluxes on a catchment scale to the global climatic change. A commonly adopted methodology for assessing the regional hydrologic impacts of climate change is to use the climate projections provided by the General Circulation Models (GCMs) for specified emission scenarios in conjunction with the process-based hydrologic models to generate the corresponding hydrologic projections. The scaling problem arising because of the large spatial scales at which the GCMs operate compared to those required in distributed hydrologic models, and their inability to satisfactorily simulate the variables of interest to hydrology are addressed by downscaling the GCM simulations to hydrologic scales. Projections obtained with this procedure are burdened with a large uncertainty introduced by the choice of GCMs and emission scenarios, small samples of historical data against which the models are calibrated, downscaling methods used and other sources. Development of methodologies to quantify and reduce such uncertainties is a current area of research in hydrology. In this presentation, an overview of recent research carried out by the author's group on assessment of hydrologic impacts of climate change addressing scale issues and quantification of uncertainties is provided. 
Methodologies developed with conditional random fields, Dempster-Shafer theory, possibility theory, imprecise probabilities and non-stationary extreme value theory are discussed. Specific applications on uncertainty quantification in impacts on streamflows, evaporative water demands, river water quality and urban flooding are presented. A brief discussion on detection and attribution of hydrologic change at river basin scales, contribution of landuse change and likely alterations in return levels of hydrologic extremes is also provided.
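The uncertainty-quantification methods this abstract names (Dempster-Shafer theory, possibility theory, imprecise probabilities) are beyond a short sketch, but the simplest common summary of projection uncertainty, an inter-model percentile range over a GCM ensemble, can be shown with synthetic numbers. All values and array shapes below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(7)

# Illustrative multi-GCM ensemble of projected percent change in
# annual streamflow for one catchment (values are synthetic).
n_gcms, n_scenarios = 5, 3
projections = rng.normal(loc=-8.0, scale=6.0, size=(n_gcms, n_scenarios))

# Ensemble median and an inter-model percentile range: a simple,
# widely used way to express the spread introduced by the choice of
# GCMs and emission scenarios.
median = np.median(projections)
p10, p90 = np.percentile(projections, [10, 90])
print(f"median change: {median:+.1f}%  "
      f"(10th-90th pct: {p10:+.1f}% to {p90:+.1f}%)")
```

Reporting a range rather than a single downscaled value is the practical upshot of the uncertainty work the abstract describes: decisions are then made against the spread, not a point estimate.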
Application of municipal sewage sludge in forest and degraded land
D.H. Marx; C.R. Berry; Paul P. Kormanik
1995-01-01
Nearly 8 million dry tons of municipal sewage sludge are produced each year in the USA by more than 15,000 publicly owned treatment plants, and the tonnage is increasing. For two decades, researchers in the USA have been studying the feasibility of land application of municipal sewage sludge. Research, large-scale practical projects, and commercial ventures have...
ERIC Educational Resources Information Center
Ward-King, Jessica; Cohen, Ira L.; Penning, Henderika; Holden, Jeanette J. A.
2010-01-01
The Autism Diagnostic Interview-Revised is one of the "gold standard" diagnostic tools for autism spectrum disorders. It is traditionally administered face-to-face. Cost and geographical concerns constrain the employment of the ADI-R for large-scale research projects. The telephone interview is a reasonable alternative, but has not yet been…
The Aeolus project: Science outreach through art.
Drumm, Ian A; Belantara, Amanda; Dorney, Steve; Waters, Timothy P; Peris, Eulalia
2015-04-01
With a general decline in people's choosing to pursue science and engineering degrees there has never been a greater need to raise the awareness of lesser known fields such as acoustics. Given this context, a large-scale public engagement project, the 'Aeolus project', was created to raise awareness of acoustics science through a major collaboration between an acclaimed artist and acoustics researchers. It centred on touring the large singing sculpture Aeolus during 2011/12, though the project also included an extensive outreach programme of talks, exhibitions, community workshops and resources for schools. Described here are the motivations behind the project and the artwork itself, the ways in which scientists and an artist collaborated, and the public engagement activities designed as part of the project. Evaluation results suggest that the project achieved its goal of inspiring interest in the discipline of acoustics through the exploration of an other-worldly work of art. © The Author(s) 2013.
Designing an External Evaluation of a Large-Scale Software Development Project.
ERIC Educational Resources Information Center
Collis, Betty; Moonen, Jef
This paper describes the design and implementation of the evaluation of the POCO Project, a large-scale national software project in the Netherlands which incorporates the perspective of an evaluator throughout the entire span of the project, and uses the experiences gained from it to suggest an evaluation procedure that could be applied to other…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-26
... agreement (PLA), as they may decide appropriate, on large-scale construction projects, where the total cost... procurement. A PLA is a pre-hire collective bargaining agreement with one or more labor organizations that... the use of a project labor agreement (PLA), as they may decide appropriate, on large-scale...
Loutfy, Mona; Greene, Saara; Kennedy, V Logan; Lewis, Johanna; Thomas-Pavanel, Jamie; Conway, Tracey; de Pokomandy, Alexandra; O'Brien, Nadia; Carter, Allison; Tharao, Wangari; Nicholson, Valerie; Beaver, Kerrigan; Dubuc, Danièle; Gahagan, Jacqueline; Proulx-Boucher, Karène; Hogg, Robert S; Kaida, Angela
2016-08-19
Community-based research has gained increasing recognition in health research over the last two decades. Such participatory research approaches are lauded for their ability to anchor research in lived experiences, ensuring cultural appropriateness, accessing local knowledge, reaching marginalized communities, building capacity, and facilitating research-to-action. While having these positive attributes, the community-based health research literature is predominantly composed of small projects, using qualitative methods, and set within geographically limited communities. Its use in larger health studies, including clinical trials and cohorts, is limited. We present the Canadian HIV Women's Sexual and Reproductive Health Cohort Study (CHIWOS), a large-scale, multi-site, national, longitudinal quantitative study that has operationalized community-based research in all steps of the research process. Successes, challenges and further considerations are offered. Through the integration of community-based research principles, we have been successful in: facilitating a two-year long formative phase for this study; developing a novel survey instrument with national involvement; training 39 Peer Research Associates (PRAs); offering ongoing comprehensive support to PRAs; and engaging in an ongoing iterative community-based research process. Our community-based research approach within CHIWOS demanded that we be cognizant of challenges managing a large national team, inherent power imbalances and challenges with communication, compensation and volunteering considerations, and extensive delays in institutional processes. It is important to consider the iterative nature of community-based research and to work through tensions that emerge given the diverse perspectives of numerous team members. Community-based research, as an approach to large-scale quantitative health research projects, is an increasingly viable methodological option. 
Community-based research has several advantages that go hand-in-hand with its obstacles. We offer guidance on implementing this approach, such that the process can be better planned and result in success.
Research at NASA's NFAC wind tunnels
NASA Technical Reports Server (NTRS)
Edenborough, H. Kipling
1990-01-01
The National Full-Scale Aerodynamics Complex (NFAC) is a unique combination of wind tunnels that allow the testing of aerodynamic and dynamic models at full or large scale. It can even accommodate actual aircraft with their engines running. Maintaining full-scale Reynolds numbers and testing with surface irregularities, protuberances, and control surface gaps that either closely match the full-scale or indeed are those of the full-scale aircraft help produce test data that accurately predict what can be expected from future flight investigations. This complex has grown from the venerable 40- by 80-ft wind tunnel that has served for over 40 years helping researchers obtain data to better understand the aerodynamics of a wide range of aircraft from helicopters to the space shuttle. A recent modification to the tunnel expanded its maximum speed capabilities, added a new 80- by 120-ft test section and provided extensive acoustic treatment. The modification is certain to make the NFAC an even more useful facility for NASA's ongoing research activities. A brief background is presented on the original facility and the kind of testing that has been accomplished using it through the years. A summary of the modification project and the measured capabilities of the two test sections is followed by a review of recent testing activities and of research projected for the future.
Project Management Life Cycle Models to Improve Management in High-rise Construction
NASA Astrophysics Data System (ADS)
Burmistrov, Andrey; Siniavina, Maria; Iliashenko, Oksana
2018-03-01
The paper describes a possibility to improve project management in high-rise building construction through the use of various Project Management Life Cycle Models (PMLC models) based on traditional and agile project management approaches. Moreover, the paper describes how splitting the whole large-scale project into a "project chain" improves the manageability of large-scale building projects and increases the efficiency of the activities of all participants in such projects.
Stanzel, Sven; Weimer, Marc; Kopp-Schneider, Annette
2013-06-01
High-throughput screening approaches are carried out for the toxicity assessment of a large number of chemical compounds. In such large-scale in vitro toxicity studies several hundred or thousand concentration-response experiments are conducted. The automated evaluation of concentration-response data using statistical analysis scripts saves time and yields more consistent results in comparison to data analysis performed by the use of menu-driven statistical software. Automated statistical analysis requires that concentration-response data are available in a standardised data format across all compounds. To obtain consistent data formats, a standardised data management workflow must be established, including guidelines for data storage, data handling and data extraction. In this paper two procedures for data management within large-scale toxicological projects are proposed. Both procedures are based on Microsoft Excel files as the researcher's primary data format and use a computer programme to automate the handling of data files. The first procedure assumes that data collection has not yet started whereas the second procedure can be used when data files already exist. Successful implementation of the two approaches into the European project ACuteTox is illustrated. Copyright © 2012 Elsevier Ltd. All rights reserved.
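The second procedure described above, harmonizing data files that already exist, can be sketched in Python. This is an illustrative sketch only, not the ACuteTox implementation: the file layouts and the column-name aliases are hypothetical stand-ins for lab-specific export formats.

```python
import csv
import io

# Hypothetical aliases mapping lab-specific column names onto a standard schema
ALIASES = {"conc": "concentration", "dose": "concentration",
           "resp": "response", "viability": "response"}

def standardize(raw_csv_text, compound_id):
    """Read one compound's export and emit rows in the standardised format."""
    reader = csv.DictReader(io.StringIO(raw_csv_text))
    rows = []
    for rec in reader:
        # Normalise header spelling, then map known aliases to standard names
        std = {ALIASES.get(k.strip().lower(), k.strip().lower()): v
               for k, v in rec.items()}
        rows.append({"compound": compound_id,
                     "concentration": float(std["concentration"]),
                     "response": float(std["response"])})
    return rows

# Two exports with different headers collapse to one consistent format,
# ready for an automated concentration-response analysis script
file_a = "dose,viability\n1.0,98\n10.0,55\n"
file_b = "Conc,Resp\n1.0,97\n10.0,60\n"
table = standardize(file_a, "A") + standardize(file_b, "B")
```

The point of the workflow is exactly this normalisation step: once every compound's data share one schema, a single analysis script can iterate over all of them without per-file manual handling.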
NASA Astrophysics Data System (ADS)
Beichner, Robert
2015-03-01
The Student Centered Active Learning Environment with Upside-down Pedagogies (SCALE-UP) project was developed nearly 20 years ago as an economical way to provide collaborative, interactive instruction even for large enrollment classes. Nearly all research-based pedagogies have been designed with fairly high faculty-student ratios. The economics of introductory courses at large universities often precludes that situation, so SCALE-UP was created as a way to facilitate highly collaborative active learning with large numbers of students served by only a few faculty and assistants. It enables those students to learn and succeed not only in acquiring content, but also to practice important 21st century skills like problem solving, communication, and teamwork. The approach was initially targeted at undergraduate science and engineering students taking introductory physics courses in large enrollment sections. It has since expanded to multiple content areas, including chemistry, math, engineering, biology, business, nursing, and even the humanities. Class sizes range from 24 to over 600. Data collected from multiple sites around the world indicate highly successful implementation at more than 250 institutions. NSF support was critical for initial development and dissemination efforts. Generously supported by NSF (9752313, 9981107) and FIPSE (P116B971905, P116B000659).
Bibliography of NASA-related publications on wind turbine technology 1973-1995
NASA Technical Reports Server (NTRS)
Spera, David A.
1995-01-01
A major program of research and development projects on wind turbines for generating electricity was conducted at the NASA Lewis Research Center from 1973 to 1988. Most of these projects were sponsored by the U.S. Department of Energy (DOE), as a major element of its Federal Wind Energy Program. One other large-scale wind turbine project was sponsored by the Bureau of Reclamation of the Department of Interior (DOI). The peak years for wind energy work at Lewis were 1979-80, when almost 100 engineers, technicians, and administrative personnel were involved. From 1988 until their conclusion in 1995, NASA wind energy activities were directed toward the transfer of technology to commercial and academic organizations. Wind energy activities at NASA can be divided into two broad categories which are closely related and often overlapping: (1) designing, building, and testing a series of 12 large-scale, experimental, horizontal-axis wind turbines (HAWT's); and (2) conducting supporting research and technology (SR&T) projects. The purpose of this bibliography is to assist those active in the field of wind energy in locating the technical information they need on wind power planning, wind loads, turbine design and analysis, fabrication and installation, laboratory and field testing, and operations and maintenance. This bibliography contains approximately 620 citations of publications by over 520 authors and co-authors. Sources are: (1) NASA reports authored by government, grantee, and contractor personnel, (2) papers presented by attendees at NASA-sponsored workshops and conferences, (3) papers presented by NASA personnel at outside workshops and conferences, and (4) outside publications related to research performed at NASA/DOE wind turbine sites.
Investigation of multilayer domains in large-scale CVD monolayer graphene by optical imaging
NASA Astrophysics Data System (ADS)
Yu, Yuanfang; Li, Zhenzhen; Wang, Wenhui; Guo, Xitao; Jiang, Jie; Nan, Haiyan; Ni, Zhenhua
2017-03-01
CVD graphene is a promising candidate for optoelectronic applications due to its high quality and high yield. However, multi-layer domains could inevitably form at the nucleation centers during the growth. Here, we propose an optical imaging technique to precisely identify the multilayer domains and also the ratio of their coverage in large-scale CVD monolayer graphene. We have also shown that the stacking disorder in twisted bilayer graphene as well as the impurities on the graphene surface could be distinguished by optical imaging. Finally, we investigated the effects of bilayer domains on the optical and electrical properties of CVD graphene, and found that the carrier mobility of CVD graphene is seriously limited by scattering from bilayer domains. Our results could be useful for guiding future optoelectronic applications of large-scale CVD graphene. Project supported by the National Natural Science Foundation of China (Nos. 61422503, 61376104), the Open Research Funds of Key Laboratory of MEMS of Ministry of Education (SEU, China), and the Fundamental Research Funds for the Central Universities.
NASA Astrophysics Data System (ADS)
Turner, Sean W. D.; Marlow, David; Ekström, Marie; Rhodes, Bruce G.; Kularathna, Udaya; Jeffrey, Paul J.
2014-04-01
Despite a decade of research into climate change impacts on water resources, the scientific community has delivered relatively few practical methodological developments for integrating uncertainty into water resources system design. This paper presents an application of the "decision scaling" methodology for assessing climate change impacts on water resources system performance and asks how such an approach might inform planning decisions. The decision scaling method reverses the conventional ethos of climate impact assessment by first establishing the climate conditions that would compel planners to intervene. Climate model projections are introduced at the end of the process to characterize climate risk in such a way that avoids the process of propagating those projections through hydrological models. Here we simulated 1000 multisite synthetic monthly streamflow traces in a model of the Melbourne bulk supply system to test the sensitivity of system performance to variations in streamflow statistics. An empirical relation was derived to convert decision-critical flow statistics to climatic units, against which 138 alternative climate projections were plotted and compared. We defined the decision threshold in terms of a system yield metric constrained by multiple performance criteria. Our approach allows for fast and simple incorporation of demand forecast uncertainty and demonstrates the reach of the decision scaling method through successful execution in a large and complex water resources system. Scope for wider application in urban water resources planning is discussed.
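The decision-scaling logic described above, first find the flow conditions under which the system fails its performance criterion, then overlay climate projections, can be illustrated with a toy model. This is a minimal sketch under invented assumptions: the inflow series, storage capacity, and demand are hypothetical, and a real application (as in the Melbourne study) would use a full multisite system model and many synthetic traces.

```python
# Minimal decision-scaling sketch: stress-test a toy single-reservoir system
# over a range of mean-streamflow multipliers and find the multiplier at
# which the system first fails to meet demand.

def yield_ok(inflows, storage_cap, demand):
    """Simple mass-balance simulation; True if demand is met at every step."""
    storage = storage_cap  # start full
    for q in inflows:
        storage = min(storage + q - demand, storage_cap)
        if storage < 0:
            return False
    return True

base_inflows = [12.0, 8.0, 15.0, 5.0, 11.0, 7.0, 14.0, 6.0]  # hypothetical

def critical_multiplier(storage_cap=20.0, demand=9.0):
    """Scan mean-flow multipliers downward; return the first failing one.

    This is the decision-critical flow statistic: climate projections,
    expressed in the same units, are then compared against it."""
    m = 1.00
    while m > 0.0:
        if not yield_ok([q * m for q in base_inflows], storage_cap, demand):
            return round(m, 2)
        m -= 0.01
    return 0.0

threshold = critical_multiplier()
```

The key design point of decision scaling is visible even in this sketch: the hydrological model runs happen only once, over a structured grid of perturbations, and climate projections are consulted last, so no projection ever needs to be propagated through the system model.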
Subgrid-scale models for large-eddy simulation of rotating turbulent channel flows
NASA Astrophysics Data System (ADS)
Silvis, Maurits H.; Bae, Hyunji Jane; Trias, F. Xavier; Abkar, Mahdi; Moin, Parviz; Verstappen, Roel
2017-11-01
We aim to design subgrid-scale models for large-eddy simulation of rotating turbulent flows. Rotating turbulent flows form a challenging test case for large-eddy simulation due to the presence of the Coriolis force. The Coriolis force conserves the total kinetic energy while transporting it from small to large scales of motion, leading to the formation of large-scale anisotropic flow structures. The Coriolis force may also cause partial flow laminarization and the occurrence of turbulent bursts. Many subgrid-scale models for large-eddy simulation are, however, primarily designed to parametrize the dissipative nature of turbulent flows, ignoring the specific characteristics of transport processes. We, therefore, propose a new subgrid-scale model that, in addition to the usual dissipative eddy viscosity term, contains a nondissipative nonlinear model term designed to capture transport processes, such as those due to rotation. We show that the addition of this nonlinear model term leads to improved predictions of the energy spectra of rotating homogeneous isotropic turbulence as well as of the Reynolds stress anisotropy in spanwise-rotating plane-channel flows. This work is financed by the Netherlands Organisation for Scientific Research (NWO) under Project Number 613.001.212.
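Schematically, a closure of the kind described, the usual dissipative eddy-viscosity term plus a nondissipative nonlinear term built from the resolved velocity gradient, can be written as follows (the exact form and coefficients of the published model may differ):

```latex
\tau_{ij}^{\mathrm{mod}} = -2\,\nu_e \bar{S}_{ij}
  + \mu \left( \bar{S}_{ik}\bar{\Omega}_{kj} - \bar{\Omega}_{ik}\bar{S}_{kj} \right)
```

where $\bar{S}_{ij}$ and $\bar{\Omega}_{ij}$ are the resolved rate-of-strain and rate-of-rotation tensors. The first term is purely dissipative; the second has zero contraction with $\bar{S}_{ij}$ and therefore performs no net dissipation, which is what lets it represent transport processes such as those induced by rotation.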
[Structural Study in the Platform for Drug Discovery, Informatics, and Structural Life Science].
Senda, Toshiya
2016-01-01
The Platform for Drug Discovery, Informatics, and Structural Life Science (PDIS), which was launched in FY2012, is a national project in the field of structural biology. The PDIS consists of three cores - structural analysis, control, and informatics - and aims to support life science researchers who are not familiar with structural biology. The PDIS project is able to provide full-scale support for structural biology research. The support provided by the PDIS project includes protein purification with various expression systems, large-scale protein crystallization, crystal structure determination, small-angle X-ray scattering (SAXS), NMR, electron microscopy, bioinformatics, etc. In order to utilize these methods of support, PDIS users need to submit an application form to the one-stop service office. Submitted applications will be reviewed by three referees. It is strongly encouraged that PDIS users have sufficient discussion with researchers in the PDIS project before submitting the application. This discussion is very useful in the process of project design, particularly for beginners in structural biology. In addition to this user support, the PDIS project has conducted R&D, which includes the development of synchrotron beamlines. In the PDIS project, PF and SPring-8 have developed beamlines for micro-crystallography, high-throughput data collection, supramolecular assembly, and native single anomalous dispersion (SAD) phasing. The newly developed beamlines have been open to all users, and have accelerated structural biology research. Beamlines for SAXS have also been developed, which has dramatically increased the number of bio-SAXS users.
Geospatial Optimization of Siting Large-Scale Solar Projects
DOE Office of Scientific and Technical Information (OSTI.GOV)
Macknick, Jordan; Quinby, Ted; Caulfield, Emmet
2014-03-01
Recent policy and economic conditions have encouraged a renewed interest in developing large-scale solar projects in the U.S. Southwest. However, siting large-scale solar projects is complex. In addition to the quality of the solar resource, solar developers must take into consideration many environmental, social, and economic factors when evaluating a potential site. This report describes a proof-of-concept, Web-based Geographical Information Systems (GIS) tool that evaluates multiple user-defined criteria in an optimization algorithm to inform discussions and decisions regarding the locations of utility-scale solar projects. Existing siting recommendations for large-scale solar projects from governmental and non-governmental organizations are not consistent with each other, are often not transparent in methods, and do not take into consideration the differing priorities of stakeholders. The siting assistance GIS tool we have developed improves upon the existing siting guidelines by being user-driven, transparent, interactive, capable of incorporating multiple criteria, and flexible. This work provides the foundation for a dynamic siting assistance tool that can greatly facilitate siting decisions among multiple stakeholders.
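The core of a user-driven multi-criteria evaluation like the one described can be sketched as a weighted overlay: each candidate cell receives a weighted sum of normalized criteria layers, with the weights expressing stakeholder priorities. This is an illustrative sketch, not the NREL tool itself; the layer values and weights below are hypothetical.

```python
# Illustrative weighted-overlay siting score. Each layer holds per-cell
# values normalized to [0, 1], where 1 is most favorable for siting.

def site_scores(layers, weights):
    """Return the weighted-sum score for every candidate cell."""
    n_cells = len(next(iter(layers.values())))
    scores = []
    for i in range(n_cells):
        s = sum(weights[name] * values[i] for name, values in layers.items())
        scores.append(round(s, 3))
    return scores

layers = {
    "solar_resource": [0.9, 0.8, 0.6, 0.95],
    "grid_proximity": [0.4, 0.9, 0.7, 0.2],
    "low_env_impact": [0.8, 0.3, 0.9, 0.6],
}
# User-defined priorities, normalized to sum to 1; changing these is how
# different stakeholders would express their differing priorities
weights = {"solar_resource": 0.5, "grid_proximity": 0.3, "low_env_impact": 0.2}

scores = site_scores(layers, weights)
best = scores.index(max(scores))
```

Making the weights explicit and user-editable is what gives a tool like this its transparency: two stakeholders can see exactly which priority assignments lead to which site rankings.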
Problems in merging Earth sensing satellite data sets
NASA Technical Reports Server (NTRS)
Smith, Paul H.; Goldberg, Michael J.
1987-01-01
Satellite remote sensing systems provide a tremendous source of data flow to the Earth science community. These systems provide scientists with data of types and on a scale previously unattainable. Looking forward to the capabilities of Space Station and the Earth Observing System (EOS), the full realization of the potential of satellite remote sensing will be handicapped by inadequate information systems. There is a growing emphasis in Earth science research to ask questions which are multidisciplinary in nature and global in scale. Many of these research projects emphasize the interactions of the land surface, the atmosphere, and the oceans through various physical mechanisms. Conducting this research requires large and complex data sets and teams of multidisciplinary scientists, often working at remote locations. A review of the problems of merging these large volumes of data into spatially referenced and manageable data sets is presented.
Benchmarks of Historical Thinking: First Steps
ERIC Educational Resources Information Center
Peck, Carla; Seixas, Peter
2008-01-01
Although historical thinking has been the subject of a substantial body of recent research, few attempts explicitly apply the results on a large scale in North America. This article, a narrative inquiry, examines the first stages of a multi-year, Canada-wide project to reform history education through the development of classroom-based…
Variation in Swedish Address Practices
ERIC Educational Resources Information Center
Norrby, Catrin
2006-01-01
This article explores variation in address in contemporary Swedish in Sweden-Swedish and Finland-Swedish. The research is part of a large-scale Australian project on changes in the address systems of French, German and Swedish. The present article focuses on results from 72 social network interviews conducted in Sweden (Gothenburg) and Finland…
INFORMATION MANAGEMENT AND RELATED QUALITY ASSURANCE FOR A LARGE SCALE, MULTI-SITE RESEARCH PROJECT
During the summer of 2000, as part of a U.S. Environmental Protection Agency study designed to improve microbial water quality monitoring protocols at public beaches, over 11,000 water samples were collected at five selected beaches across the country. At each beach, samples wer...
Automated geographic registration and radiometric correction for UAV-based mosaics
USDA-ARS?s Scientific Manuscript database
Texas A&M University has been operating a large-scale, UAV-based, agricultural remote-sensing research project since 2015. To use UAV-based images in agricultural production, many high-resolution images must be mosaicked together to create an image of an agricultural field. Two key difficulties to s...
Calving distributions of individual bulls in multiple-sire pastures
USDA-ARS?s Scientific Manuscript database
The objective of this project was to quantify patterns in the calving rate of sires in multiple-sire pastures over seven years at a large-scale cow-calf operation. Data consisted of reproductive and genomic records from multiple-sire breeding pastures (n=33) at the United States Meat Animal Research...
In Search of the Eco-Teacher: Public School Edition
ERIC Educational Resources Information Center
Blenkinsop, Sean
2014-01-01
This paper uses an innovative building-less Canadian public elementary school and its accompanying large-scale research project to consider the characteristics that might be required of a teacher interested in working in an emergent, environmental, place- and community-based experiential public school setting. The six characteristics considered…
Moonlight project promotes energy-saving technology
NASA Astrophysics Data System (ADS)
Ishihara, A.
1986-01-01
In promoting energy saving, the development of energy-conservation technologies that raise energy efficiency in energy conversion, transportation, storage, and consumption is considered crucial, along with the enactment of legal measures urging the rational use of energy and the implementation of public awareness campaigns for energy conservation. Under the Moonlight Project, technical development is at present centered around the following six pillars: (1) large-scale energy-saving technology; (2) pioneering and fundamental energy-saving technology; (3) international cooperative research projects; (4) research and surveys of energy-saving technology; (5) energy-saving technology development by private industry; and (6) promotion of energy saving through standardization. Heat pumps, magnetohydrodynamic generators and fuel cells are discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Gang
Mid-latitude extreme weather events are responsible for a large part of climate-related damage. Yet large uncertainties remain in climate model projections of heat waves, droughts, and heavy rain/snow events on regional scales, limiting our ability to effectively use these projections for climate adaptation and mitigation. These uncertainties can be attributed to both the lack of spatial resolution in the models, and to the lack of a dynamical understanding of these extremes. The approach of this project is to relate the fine-scale features to the large scales in current climate simulations, seasonal re-forecasts, and climate change projections in a very wide range of models, including the atmospheric and coupled models of ECMWF over a range of horizontal resolutions (125 to 10 km), aqua-planet configuration of the Model for Prediction Across Scales and High Order Method Modeling Environments (resolutions ranging from 240 km to 7.5 km) with various physics suites, and selected CMIP5 model simulations. The large-scale circulation will be quantified both on the basis of the well-tested preferred circulation regime approach, and very recently developed measures, the finite amplitude Wave Activity (FAWA) and its spectrum. The fine-scale structures related to extremes will be diagnosed following the latest approaches in the literature. The goal is to use the large-scale measures as indicators of the probability of occurrence of the finer-scale structures, and hence extreme events. These indicators will then be applied to the CMIP5 models and time-slice projections of a future climate.
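The indicator idea in the abstract above, using a large-scale circulation measure to estimate the probability of finer-scale extremes, reduces, in its simplest form, to a conditional frequency. The sketch below uses synthetic stand-in data (the index values and extreme-event flags are invented), not output from the project's models.

```python
# Estimate P(extreme event | large-scale index exceeds a threshold) from
# paired daily samples of a circulation measure (e.g. a wave-activity-like
# index) and an extreme-event flag.

def conditional_extreme_prob(index_vals, extreme_flags, threshold):
    """Frequency of extremes among samples where the index exceeds threshold."""
    hits = [e for x, e in zip(index_vals, extreme_flags) if x > threshold]
    return sum(hits) / len(hits) if hits else 0.0

# Synthetic data: index value and whether an extreme was observed that day
index_vals = [0.2, 1.5, 0.7, 2.1, 1.8, 0.3, 2.4, 0.9]
extreme_flags = [0, 1, 0, 1, 0, 0, 1, 0]

p_high = conditional_extreme_prob(index_vals, extreme_flags, 1.0)
p_all = sum(extreme_flags) / len(extreme_flags)
# p_high substantially above p_all indicates the large-scale index has
# skill as an indicator of the finer-scale extremes
```

An indicator validated this way in well-resolved simulations could then be evaluated in coarser models, which is the crux of the project's strategy for coarse-resolution projections.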
NASA Technical Reports Server (NTRS)
Sydnor, Goerge H.
2010-01-01
The National Aeronautics and Space Administration's (NASA) Aeronautics Test Program (ATP) is implementing five significant ground-based test facility projects across the nation with funding provided by the American Recovery and Reinvestment Act (ARRA). The projects were selected as the best candidates within the constraints of the ARRA and the strategic plan of ATP. They are a combination of much-needed large scale maintenance, reliability, and system upgrades plus creating new test beds for upcoming research programs. The projects are: 1.) Re-activation of a large compressor to provide a second source for compressed air and vacuum to the Unitary Plan Wind Tunnel at the Ames Research Center (ARC) 2.) Addition of high-altitude ice crystal generation at the Glenn Research Center Propulsion Systems Laboratory Test Cell 3, 3.) New refrigeration system and tunnel heat exchanger for the Icing Research Tunnel at the Glenn Research Center, 4.) Technical viability improvements for the National Transonic Facility at the Langley Research Center, and 5.) Modifications to conduct Environmentally Responsible Aviation and Rotorcraft research at the 14 x 22 Subsonic Tunnel at Langley Research Center. The selection rationale, problem statement, and technical solution summary for each project is given here. The benefits and challenges of the ARRA funded projects are discussed. Indirectly, this opportunity provides the advantages of developing experience in NASA's workforce in large projects and maintaining corporate knowledge in that very unique capability. It is envisioned that improved facilities will attract a larger user base and capabilities that are needed for current and future research efforts will offer revenue growth and future operations stability. Several of the chosen projects will maximize wind tunnel reliability and maintainability by using newer, proven technologies in place of older and obsolete equipment and processes. 
The projects will meet NASA's goal of integrating more efficient, environmentally safer, and less energy consuming hardware and processes into existing tunnel systems. These include Environmental Protection Agency-approved refrigerants, energy efficient motors, and faster, flexible tunnel data systems.
Wang, Yong
2017-03-25
In the last decade, synthetic biology research has gradually transitioned from monocellular parts and devices toward more complex multicellular systems. The emerging field of plant synthetic biology is regarded as the "next chapter" of synthetic biology. With the complex and diverse metabolism of plants as the entry point, plant synthetic biology research not only helps us understand how real life works, but also teaches us how to design and construct more complex artificial life. Breakthroughs in the innovation and large-scale production of bioactive compounds are also expected from redesigned plant metabolism. In this review, we discuss the research progress in plant synthetic biology and propose a new materia medica project to lift the level of traditional Chinese herbal medicine research.
NASA Astrophysics Data System (ADS)
De Michelis, Paola; Federica Marcucci, Maria; Consolini, Giuseppe
2015-04-01
Recently we have investigated the spatial distribution of the scaling features of short-time scale magnetic field fluctuations using measurements from several ground-based geomagnetic observatories distributed in the northern hemisphere. We have found that the scaling features of fluctuations of the horizontal magnetic field component at time scales below 100 minutes are correlated with the geomagnetic activity level and with changes in the currents flowing in the ionosphere. Here, we present a detailed analysis of the dynamical changes of the magnetic field scaling features as a function of the geomagnetic activity level during the well-known large geomagnetic storm that occurred on July 15, 2000 (the Bastille event). The observed dynamical changes are discussed in relationship with the changes of the overall ionospheric polar convection and potential structure as reconstructed using SuperDARN data. This work is supported by the Italian National Program for Antarctic Research (PNRA) - Research Project 2013/AC3.08 and by the European Community's Seventh Framework Programme ([FP7/2007-2013]) under Grant no. 313038/STORM and…
NASA Technical Reports Server (NTRS)
1973-01-01
The results are reported of the NASA/Drexel research effort which was conducted in two separate phases. The initial phase stressed exploration of the problem from the point of view of three primary research areas and the building of a multidisciplinary team. The final phase consisted of a clinical demonstration program in which the research associates consulted with the County Executive of New Castle County, Delaware, to aid in solving actual problems confronting the County Government. The three primary research areas of the initial phase are identified as technology, management science, and behavioral science. Five specific projects which made up the research effort are treated separately. A final section contains the conclusions drawn from total research effort as well as from the specific projects.
Bozorgnia, Yousef; Abrahamson, Norman A.; Al Atik, Linda; Ancheta, Timothy D.; Atkinson, Gail M.; Baker, Jack W.; Baltay, Annemarie S.; Boore, David M.; Campbell, Kenneth W.; Chiou, Brian S.J.; Darragh, Robert B.; Day, Steve; Donahue, Jennifer; Graves, Robert W.; Gregor, Nick; Hanks, Thomas C.; Idriss, I. M.; Kamai, Ronnie; Kishida, Tadahiro; Kottke, Albert; Mahin, Stephen A.; Rezaeian, Sanaz; Rowshandel, Badie; Seyhan, Emel; Shahi, Shrey; Shantz, Tom; Silva, Walter; Spudich, Paul A.; Stewart, Jonathan P.; Watson-Lamprey, Jennie; Wooddell, Kathryn; Youngs, Robert
2014-01-01
The NGA-West2 project is a large multidisciplinary, multi-year research program on the Next Generation Attenuation (NGA) models for shallow crustal earthquakes in active tectonic regions. The research project has been coordinated by the Pacific Earthquake Engineering Research Center (PEER), with extensive technical interactions among many individuals and organizations. NGA-West2 addresses several key issues in ground-motion seismic hazard, including updating the NGA database for a magnitude range of 3.0–7.9; updating NGA ground-motion prediction equations (GMPEs) for the “average” horizontal component; scaling response spectra for damping values other than 5%; quantifying the effects of directivity and directionality for horizontal ground motion; resolving discrepancies between the NGA and the National Earthquake Hazards Reduction Program (NEHRP) site amplification factors; analysis of epistemic uncertainty for NGA GMPEs; and developing GMPEs for vertical ground motion. This paper presents an overview of the NGA-West2 research program and its subprojects.
Regional Climate Change across North America in 2030 Projected from RCP6.0
NASA Astrophysics Data System (ADS)
Otte, T.; Nolte, C. G.; Faluvegi, G.; Shindell, D. T.
2012-12-01
Projecting climate change scenarios to local scales is important for understanding and mitigating the effects of climate change on society and the environment. Many of the general circulation models (GCMs) that are participating in the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report (AR5) do not fully resolve regional-scale processes and therefore cannot capture local changes in temperature and precipitation extremes. We seek to project the GCM's large-scale climate change signal to the local scale using a regional climate model (RCM) by applying dynamical downscaling techniques. The RCM will be used to better understand the local changes of temperature and precipitation extremes that may result from a changing climate. In this research, downscaling techniques that we developed with historical data are now applied to GCM fields. Results from downscaling NASA/GISS ModelE2 simulations of the IPCC AR5 Representative Concentration Pathway (RCP) scenario 6.0 will be shown. The Weather Research and Forecasting (WRF) model has been used as the RCM to downscale decadal time slices for ca. 2000 and ca. 2030 over North America and illustrate potential changes in regional climate that are projected by ModelE2 and WRF under RCP6.0. The analysis focuses on regional climate fields that most strongly influence the interactions between climate change and air quality. In particular, an analysis of extreme temperature and precipitation events will be presented.
Minari, Jusaku; Shirai, Tetsuya; Kato, Kazuto
2014-12-01
As evidenced by high-throughput sequencers, genomic technologies have recently undergone radical advances. These technologies enable comprehensive sequencing of personal genomes considerably more efficiently and less expensively than heretofore. These developments present a challenge to the conventional framework of biomedical ethics; under these changing circumstances, each research project has to develop a pragmatic research policy. Based on experience with a new large-scale project, the Genome Science Project, this article presents a novel approach to implementing a specific policy for personal genome research in the Japanese context. In creating an original informed-consent form template for the project, we followed a two-tiered process: drafting the template following an analysis of national and international policies, then refining the draft in conjunction with genome project researchers for practical application. Through practical use of the template, we have gained valuable experience in addressing challenges in the ethical review process, such as the importance of sharing details of the latest developments in genomics with members of research ethics committees. We discuss certain limitations of the conventional concept of informed consent and its governance system and suggest the potential of an alternative process using information technology.
Studies of Sub-Synchronous Oscillations in Large-Scale Wind Farm Integrated System
NASA Astrophysics Data System (ADS)
Yue, Liu; Hang, Mend
2018-01-01
With the rapid development and construction of large-scale wind farms and their grid-connected operation, series-compensated AC transmission of wind power is gradually becoming the main means of improving wind power delivery and grid stability. However, the integration of wind farms changes the sub-synchronous oscillation (SSO) damping characteristics of the synchronous generator system. Addressing the SSO problems caused by the integration of large-scale wind farms, this paper focuses on wind farms based on doubly fed induction generators (DFIGs) and summarizes the SSO mechanisms in large-scale, series-compensated wind power systems, which can be classified into three types: sub-synchronous control interaction (SSCI), sub-synchronous torsional interaction (SSTI), and sub-synchronous resonance (SSR). SSO modelling and analysis methods are then categorized and compared by their areas of applicability. Furthermore, the paper summarizes the suppression measures used in actual SSO projects according to their different control objectives. Finally, research prospects in this field are explored.
Overview of ENEA's Projects on lithium batteries
NASA Astrophysics Data System (ADS)
Alessandrini, F.; Conte, M.; Passerini, S.; Prosini, P. P.
The increasing need for high-performance batteries in various small- and large-scale applications (portable electronics, notebooks, palmtops, cellular phones, electric vehicles, UPS, load levelling) in Italy is motivating the R&D efforts of various public and private organizations. Research on lithium batteries in Italy goes back to the beginning of the technological development of primary and secondary lithium systems, with national know-how spread across various academic and public institutions and a few private stakeholders. In the field of lithium polymer batteries, ENEA has dedicated significant effort over almost two decades to promoting and carrying out basic R&D and pre-industrial development projects. In recent years, three major national projects have been performed and coordinated by ENEA in co-operation with universities, governmental research organizations, and industry. In these projects, novel polymer electrolytes with ceramic additives, low-cost manganese oxide-based composite cathodes, environmentally friendly processes for polymer electrolytes, and fabrication processes for components and cells have been investigated and developed in order to fulfill the long-term need for cost-effective, high-performance lithium polymer batteries.
Stratiform clouds and their interaction with atmospheric motion
NASA Technical Reports Server (NTRS)
Clark, John H. E.; Shirer, Hampton N.
1990-01-01
During 1989 and 1990, work supported primarily by the previous contract, NAS8-36150, led to the publication of two papers and the submission of a third for review; the delivery of an invited talk at the SIAM Conference on Dynamical Systems in Orlando, Florida; and the start of two new projects on the radiative effects of stratocumulus on the large-scale flow. The published papers discuss aspects of stratocumulus circulations (Laufersweiler and Shirer, 1989) and the Hadley to Rossby regime transition in rotating spherical systems (Higgins and Shirer, 1990). The submitted paper (Haack and Shirer, 1990) discusses a new nonlinear model of roll circulations that are forced both dynamically and thermally. The invited paper by H. N. Shirer and R. Wells presented an objective means for determining appropriate truncation levels for low-order models of flows involving two incommensurate periods; this work has application to the Hadley to Rossby transition problem in quasi-geostrophic flows (Moroz and Holmes, 1984). The new projects involve the development of a multi-layered quasi-geostrophic channel model for studying the modulation of the large-scale flow by the stratocumulus clouds that typically develop off the coasts of continents. In this model the diabatic forcing in the lowest layer will change in response to the (parameterized) development of extensive fields of stratocumulus clouds. To guide the creation of this parameterization scheme, the researchers are producing climatologies of stratocumulus frequency and correlating these frequencies with the phasing and amplitude of the large-scale flow pattern. These topics are discussed in greater detail.
Consortium biology in immunology: the perspective from the Immunological Genome Project.
Benoist, Christophe; Lanier, Lewis; Merad, Miriam; Mathis, Diane
2012-10-01
Although the field has a long collaborative tradition, immunology has made less use than genetics of 'consortium biology', wherein groups of investigators together tackle large integrated questions or problems. However, immunology is naturally suited to large-scale integrative and systems-level approaches, owing to the multicellular and adaptive nature of the cells it encompasses. Here, we discuss the value and drawbacks of this organization of research, in the context of the long-running 'big science' debate, and consider the opportunities that may exist for the immunology community. We position this analysis in light of our own experience, both positive and negative, as participants of the Immunological Genome Project.
Keeping Connected: A Review of the Research Relationship
ERIC Educational Resources Information Center
Moss, Julianne; Hay, Trevor
2014-01-01
In this paper, some key findings of the Keeping Connected project are discussed in light of the methodological challenges of developing an analytical approach in a large-scale study, particularly in starting with open-ended, participant-selected, digital still visual images as part of 31 longitudinal case studies. The paper works to clarify the…
A Review of Large-Scale "How Much Information?" Inventories: Variations, Achievements and Challenges
ERIC Educational Resources Information Center
Hilbert, Martin
2015-01-01
Introduction: Pressed by the increasing social importance of digital information, including the current attention given to the "big data paradigm", several research projects have taken up the challenge to quantify the amount of technologically mediated information. Method: This meta-study reviews the eight most important inventories in a…
Design for a Study of American Youth.
ERIC Educational Resources Information Center
Flanagan, John C.; And Others
Project TALENT is a large-scale, long-range educational research effort aimed at developing methods for the identification, development, and utilization of human talents, which has involved some 440,000 students in 1,353 public, private, and parochial secondary schools in all parts of the country. Data collected through teacher-administered tests,…
ERIC Educational Resources Information Center
Tarr, James E.; Ross, Daniel J.; McNaught, Melissa D.; Chavez, Oscar; Grouws, Douglas A.; Reys, Robert E.; Sears, Ruthmae; Taylan, R. Didem
2010-01-01
The Comparing Options in Secondary Mathematics: Investigating Curriculum (COSMIC) project is a longitudinal study of student learning from two types of mathematics curricula: integrated and subject-specific. Previous large-scale research studies such as the National Assessment of Educational Progress (NAEP) indicate that numerous variables are…
Explaining Variation in Instructional Time: An Application of Quantile Regression
ERIC Educational Resources Information Center
Corey, Douglas Lyman; Phelps, Geoffrey; Ball, Deborah Loewenberg; Demonte, Jenny; Harrison, Delena
2012-01-01
This research is conducted in the context of a large-scale study of three nationally disseminated comprehensive school reform projects (CSRs) and examines how school- and classroom-level factors contribute to variation in instructional time in English language arts and mathematics. When using mean-based OLS regression techniques such as…
Say Who You Are, Play Who You Are: Improvisation, Pedagogy, and Youth on the Margins
ERIC Educational Resources Information Center
Willox, Ashlee Cunsolo; Heble, Ajay; Jackson, Rob; Walker, Melissa; Waterman, Ellen
2011-01-01
This paper presents a research that emerges from a set of community-based outreach activities associated with a large-scale, interdisciplinary project, Improvisation, Community, and Social Practice (ICASP), which focuses on the social and pedagogical implications of improvised musical practices. Working from the premise that musical improvisation…
The Context of Professional Learning for Inclusion: A 4-Ply Model
ERIC Educational Resources Information Center
O'Gorman, Elizabeth
2010-01-01
This paper outlines the findings from one dimension of a large-scale research project which addressed the PL requirements of specialist inclusion/SEN teachers in Ireland. Two aspects relating to the context of professional learning are explored here: the professional learning opportunities preferred by teachers and the professional learning…
Does External Funding Push Doctoral Supervisors to Be More Directive? A Large-Scale Danish Study
ERIC Educational Resources Information Center
Wichmann-Hansen, Gitte; Herrmann, Kim Jesper
2017-01-01
Around the world, changing funding policies have pushed for university departments to find increased external project-based funding. While this trend is widely acknowledged, mixed views exist about implications for faculty members' academic practices. Regarding doctoral education, researchers have raised concern that external funding will push…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Camarda, G. S.; Bolotnikov, A. E.; Cui, Y.
The goal of this project is to obtain and characterize scintillators and emerging- and commercial-compound-semiconductor radiation-detection materials and devices provided by vendors and research organizations. The focus of our proposed research is to clarify the role of the deleterious defects and impurities responsible for detector non-uniformity in scintillating crystals, in commercial semiconductor radiation-detector materials, and in emerging R&D ones. This project also addresses the need to fabricate high-performance scintillators and compound-semiconductor radiation detectors with proven potential for large-scale manufacturing. The findings help researchers resolve the problems of non-uniformity in these materials.
Commercial-scale biotherapeutics manufacturing facility for plant-made pharmaceuticals.
Holtz, Barry R; Berquist, Brian R; Bennett, Lindsay D; Kommineni, Vally J M; Munigunti, Ranjith K; White, Earl L; Wilkerson, Don C; Wong, Kah-Yat I; Ly, Lan H; Marcel, Sylvain
2015-10-01
Rapid, large-scale manufacture of medical countermeasures can be uniquely met by the plant-made-pharmaceutical platform technology. As a participant in the Defense Advanced Research Projects Agency (DARPA) Blue Angel project, the Caliber Biotherapeutics facility was designed, constructed, commissioned and released a therapeutic target (H1N1 influenza subunit vaccine) in <18 months from groundbreaking. As of 2015, this facility was one of the world's largest plant-based manufacturing facilities, with the capacity to process over 3500 kg of plant biomass per week in an automated multilevel growing environment using proprietary LED lighting. The facility can commission additional plant grow rooms that are already built to double this capacity. In addition to the commercial-scale manufacturing facility, a pilot production facility was designed based on the large-scale manufacturing specifications as a way to integrate product development and technology transfer. The primary research, development and manufacturing system employs vacuum-infiltrated Nicotiana benthamiana plants grown in a fully contained, hydroponic system for transient expression of recombinant proteins. This expression platform has been linked to a downstream process system, analytical characterization, and assessment of biological activity. This integrated approach has demonstrated rapid, high-quality production of therapeutic monoclonal antibody targets, including a panel of rituximab biosimilar/biobetter molecules and antiviral antibodies against influenza and dengue fever. © 2015 Society for Experimental Biology, Association of Applied Biologists and John Wiley & Sons Ltd.
ERIC Educational Resources Information Center
Fitzgerald, Michael; McKinnon, David H.; Danaia, Lena
2015-01-01
In this paper, we outline the theory behind the educational design used to implement a large-scale high school astronomy education project. This design was created in response to the realization of ineffective educational design in the initial early stages of the project. The new design follows an iterative improvement model where the materials…
CFIRP: What we learned in the first ten years
Chambers, C.L.; McComb, W.C.; Tappeiner, J. C.; Kellogg, L.D.; Johnson, R.L.; Spycher, G.
1999-01-01
In response to public dissatisfaction with forest management methods, we initiated the College of Forestry Integrated Research Project (CFIRP) to test alternative silvicultural systems in Douglas-fir (Pseudotsuga menziesii) stands in western Oregon. We compared costs and biological and human responses among a control and three replicated silvicultural alternatives to clearcutting that retained structural features found in old Douglas-fir forests. Treatments were applied within 8- to 15-ha stands and attempted to mimic crown fires (modified clearcut), windthrow (green tree retention), and small-scale impacts such as root rot diseases (small patch group selection). We also compared costs in three unreplicated treatments (large patch group selection, wedge cut, and strip cut). Each treatment included differences in the pattern of retained dead trees (snags), as either scattered individuals or as clumps. Good communication among researchers and managers, a long-term commitment to the project, and careful documentation of research sites and data are important to the success of long-term silvicultural research projects. To date, over 30 publications have resulted from the project.
Health impact assessment of industrial development projects: a spatio-temporal visualization.
Winkler, Mirko S; Krieger, Gary R; Divall, Mark J; Singer, Burton H; Utzinger, Jürg
2012-05-01
Development and implementation of large-scale industrial projects in complex eco-epidemiological settings typically require combined environmental, social and health impact assessments. We present a generic, spatio-temporal health impact assessment (HIA) visualization, which can be readily adapted to specific projects and key stakeholders, including poorly literate communities that might be affected by consequences of a project. We illustrate how the occurrence of a variety of complex events can be utilized for stakeholder communication, awareness creation, interactive learning as well as formulating HIA research and implementation questions. Methodological features are highlighted in the context of an iron ore development in a rural part of Africa.
Reuther, Rudolf
2011-02-01
In 2010, the EU FP NanoSustain project (247989) was successfully launched with the objective of developing innovative solutions for the sustainable use, recycling, and final treatment of engineered nanomaterials (ENMs). The same year, NanoValid (263147), a large-scale integrating EU FP7 project, was initiated and contract negotiations with the European Commission commenced, with the aim of developing new reference methods and materials applicable to the unique properties of ENMs. This paper gives an overview of the main objectives of these two new European research initiatives, the main tasks required to achieve those objectives, and their impact on current standardization efforts and technical innovations.
Walker, David; Ellaway, Anne
2018-01-01
Background Large-scale primary data collections are complex, costly, and time-consuming. Study protocols for trial-based research are now commonplace, with a growing number of similar pieces of work being published on observational research. However, useful additions to the literature base are publications that describe the issues and challenges faced while conducting observational studies. These can provide researchers with insightful knowledge that can inform funding proposals or project development work. Objectives In this study, we identify and reflectively discuss the unforeseen or often unpublished issues associated with organizing and implementing a large-scale objectively measured physical activity and global positioning system (GPS) data collection. Methods The SPACES (Studying Physical Activity in Children’s Environments across Scotland) study was designed to collect objectively measured physical activity and GPS data from 10- to 11-year-old children across Scotland, using a postal delivery method. The 3 main phases of the project (recruitment, delivery of project materials, and data collection and processing) are described within a 2-stage framework: (1) intended design and (2) implementation of the intended design. Results Unanticipated challenges arose, which influenced the data collection process; these encompass four main impact categories: (1) cost, budget, and funding; (2) project timeline; (3) participation and engagement; and (4) data challenges. The main unforeseen issues that impacted our timeline included the informed consent process for children under the age of 18 years; the use of, and coordination with, the postal service to deliver study information and equipment; and the variability associated with when participants began data collection and the time taken to send devices and consent forms back (1-12 months). 
Unanticipated budgetary issues included the identification of some study materials (AC power adapter) not fitting through letterboxes, as well as the employment of fieldworkers to increase recruitment and the return of consent forms. Finally, we encountered data issues when processing physical activity and GPS data that had been initiated across daylight saving time. Conclusions We present learning points and recommendations that may benefit future studies of similar methodology in their early stages of development. PMID:29712624
Extremely Large Telescope Project Selected in ESFRI Roadmap
NASA Astrophysics Data System (ADS)
2006-10-01
In its first Roadmap, the European Strategy Forum on Research Infrastructures (ESFRI) chose the European Extremely Large Telescope (ELT), for which ESO is presently developing a Reference Design, as one of the large-scale projects to be conducted in astronomy, and the only one in optical astronomy. The aim of the ELT project is to build, before the end of the next decade, an optical/near-infrared telescope with a diameter in the 30-60 m range. The ESFRI Roadmap states: "Extremely Large Telescopes are seen world-wide as one of the highest priorities in ground-based astronomy. They will vastly advance astrophysical knowledge allowing detailed studies of inter alia planets around other stars, the first objects in the Universe, super-massive Black Holes, and the nature and distribution of the Dark Matter and Dark Energy which dominate the Universe. The European Extremely Large Telescope project will maintain and reinforce Europe's position at the forefront of astrophysical research." Said Catherine Cesarsky, Director General of ESO: "In 2004, the ESO Council mandated ESO to play a leading role in the development of an ELT for Europe's astronomers. To that end, ESO has undertaken conceptual studies for ELTs and is currently also leading a consortium of European institutes engaged in studying enabling technologies for such a telescope. The inclusion of the ELT in the ESFRI roadmap, together with the comprehensive preparatory work already done, paves the way for the next phase of this exciting project, the design phase." ESO is currently working, in close collaboration with the European astronomical community and industry, on a baseline design for an Extremely Large Telescope. The plan is a telescope with a primary mirror between 30 and 60 metres in diameter and a financial envelope of about 750 million euros.
It aims at more than a factor of ten improvement in overall performance compared to the current leader in ground-based astronomy, the ESO Very Large Telescope at the Paranal Observatory. The draft Baseline Reference Design will be presented to the wider scientific community on 29-30 November 2006 at a dedicated ELT Workshop Meeting in Marseille (France) and will be further iterated. The design is then to be presented to the ESO Council at the end of 2006. The goal is to start the detailed E-ELT design work by the first half of 2007. Launched in April 2002, the European Strategy Forum on Research Infrastructures was set up following a recommendation of the European Union Council, with the role of supporting a coherent approach to policy-making on research infrastructures in Europe and acting as an incubator for international negotiations about concrete initiatives. In particular, ESFRI has prepared a European Roadmap identifying new Research Infrastructures of pan-European interest corresponding to the long-term needs of the European research communities, covering all scientific areas, regardless of possible location, and likely to be realised in the next 10 to 20 years. The Roadmap was presented on 19 October. It is the result of an intensive two-year consultation and peer-review process involving over 1000 high-level European and international experts. The Roadmap identifies 35 large-scale infrastructure projects, at various stages of development, in seven key research areas including Environmental Sciences; Energy; Materials Sciences; Astrophysics, Astronomy, Particle and Nuclear Physics; Biomedical and Life Sciences; Social Sciences and the Humanities; and Computation and Data Treatment.
Velasco, Veronica; Griffin, Kenneth W; Antichi, Mariella; Celata, Corrado
2015-10-01
Across developed countries, experimentation with alcohol, tobacco, and other drugs often begins in the early adolescent years. Several evidence-based programs have been developed to prevent adolescent substance use. Many of the most rigorously tested and empirically supported prevention programs were initially developed and tested in the United States. Increasingly, these interventions are being adopted for use in Europe and throughout the world. This paper reports on a large-scale comprehensive initiative designed to select, adapt, implement, and sustain an evidence-based drug abuse prevention program in Italy. As part of a large-scale regionally funded collaboration in the Lombardy region of Italy, we report on processes through which a team of stakeholders selected, translated and culturally adapted, planned, implemented and evaluated the Life Skills Training (LST) school-based drug abuse prevention program, an evidence-based intervention developed in the United States. We discuss several challenges and lessons learned and implications for prevention practitioners and researchers attempting to undertake similar international dissemination projects. We review several published conceptual models designed to promote the replication and widespread dissemination of effective programs, and discuss their strengths and limitations in the context of planning and implementing a complex, large-scale real-world dissemination effort. Copyright © 2015 Elsevier Ltd. All rights reserved.
1982-11-01
to occur). When a rectangle is inserted, all currently selected items are de-selected, and the newly inserted rectangle is selected. This makes it... Items are de-selected before the selection takes place. A selected symbol instance is displayed with a bold outline, and a selected rectangle edge... symbol instance or set of rectangle edges, everything previously selected is first de-selected. If the selected object is a reference point the old...
Manual of downburst identification for Project NIMROD. [atmospheric circulation
NASA Technical Reports Server (NTRS)
Fujita, T. T.
1978-01-01
Aerial photography, Doppler radar, and satellite infrared imagery are used in the two-year National Intensive Meteorological Research on Downburst (NIMROD) project to provide large-area mapping of strong downdrafts that induce an outward burst of damaging winds over or near the earth. Topics discussed include scales of thunderstorm outflow; aerial photographs of downburst damage; microbursts and aviation hazards; radar echo characteristics; infrared imagery from GOES/SMS; and downburst-tornado relationships. Color maps of downbursts and tornadoes are included.
Recent advances in research on climate and human conflict
NASA Astrophysics Data System (ADS)
Hsiang, S. M.
2014-12-01
A rapidly growing body of empirical, quantitative research examines whether rates of human conflict can be systematically altered by climatic changes. We discuss recent advances in this field, including Bayesian meta-analyses of the effect of temperature and rainfall on current and future large-scale conflicts, the impact of climate variables on gang violence and suicides in Mexico, and probabilistic projections of personal violence and property crime in the United States under RCP scenarios. Criticisms of this research field will also be explained and addressed.
Pound, Joe Mathews; Miller, John Allen; George, John E; Fish, Durland
2009-08-01
The Northeast Area-wide Tick Control Project (NEATCP) was funded by the United States Department of Agriculture (USDA) as a large-scale cooperative demonstration project of the USDA-Agricultural Research Service (ARS)-patented 4-Poster tick control technology (Pound et al. 1994), involving the USDA-ARS and a consortium of universities, state agencies, and a consulting firm at research locations in the five states of Connecticut (CT), Maryland (MD), New Jersey (NJ), New York (NY), and Rhode Island (RI). The stated objective of the project was "A community-based field trial of ARS-patented tick control technology designed to reduce the risk of Lyme disease in northeastern states." Here we relate the rationale and history of the technology, a chronological listing of events leading to implementation of the project, the original protocol for selecting treatment and control sites, and protocols for deployment of treatments, sampling, assays, data analyses, and estimates of efficacy.
Can microbes economically remove sulfur
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fox, J.L.
Researchers have reported that refiners who now rely on costly physico-chemical procedures to desulfurize petroleum will soon have an alternative, microbial-enzyme-based approach to this process. The new approach is still under development, and a considerable number of chemical engineering problems need to be solved before it is ready for large-scale use. This paper reviews the several research projects dedicated to solving the problems that keep a biotechnology-based alternative from competing with chemical desulfurization.
ERIC Educational Resources Information Center
Educational Research Service, 2011
2011-01-01
This "Informed Educator" draws content from several reports as well as a PowerPoint presentation that describe findings from a study conducted by EdSource and its research partners from Stanford University and the American Institutes for Research. The project--Gaining Ground in the Middle Grades: Why Some Schools Do Better--focused on…
NASA Astrophysics Data System (ADS)
Keller, M. M.
2015-12-01
The Large Scale Biosphere-Atmosphere Experiment in Amazonia (LBA) is an international, continental-scale effort led by Brazil to understand how land use change and climate change affect the role of Amazonia in the Earth system. During the first decade of studies (1998-2007), LBA researchers generated new understanding of Amazonia and published over 1000 papers. However, most LBA participants agree that the training and education of a large cohort of scientists, especially students from Brazil, was the greatest contribution of LBA. I analyzed bibliographic data from the NASA-supported component project known as LBA-ECO. This component covered a large cross-section of the LBA subject areas, highlighting land use and land cover change, carbon cycling, nutrient cycling, and other aspects of terrestrial and aquatic ecology. I reviewed the complete bibliography of peer-reviewed papers reported by LBA-ECO researchers (http://www.lbaeco.org/cgi-bin/web/investigations/lbaeco_refs.pl). The researchers reported 691 contributions from 1996 through 2013, of which 24 were theses that were removed from further analysis. For the remaining 667 papers and book chapters, I tallied the first authors, separating categories for Brazilians, all students, and Brazilian students. Numerically, LBA-ECO production of papers peaked in 2004. Publication by Brazilians, students, and Brazilian students generally followed the same pattern as publication overall. However, student and Brazilian student contributions as first authors made up a clearly increasing proportion of the papers from project initiation through peak publication. Brazilian students accounted for more than 20% of first authorships across all publications from 2003 to 2010, and more than half of all student publications had Brazilians as first authors.
Foreign researchers, some initially reluctant to invest in Brazilian students, almost universally adopted the belief that the greatest legacy of LBA would be its contribution to building a cadre of environmental researchers and professionals for the Amazon region. This belief was transformed into a commitment through pressure from NASA management and through the leadership of the LBA-ECO research team, leading to LBA's greatest legacy.
Wilcox, S.; Andreas, A.
2010-09-27
The U.S. Department of Energy's National Renewable Energy Laboratory collaborates with the solar industry to establish high quality solar and meteorological measurements. This Solar Resource and Meteorological Assessment Project (SOLRMAP) provides high quality measurements to support deployment of power projects in the United States. The no-funds-exchanged collaboration brings NREL solar resource assessment expertise together with industry needs for measurements. The end result is high quality data sets to support the financing, design, and monitoring of large scale solar power projects for industry in addition to research-quality data for NREL model development. NREL provides consultation for instrumentation and station deployment, along with instrument calibrations, data acquisition, quality assessment, data distribution, and summary reports. Industry participants provide equipment, infrastructure, and station maintenance.
Learning binary code via PCA of angle projection for image retrieval
NASA Astrophysics Data System (ADS)
Yang, Fumeng; Ye, Zhiqiang; Wei, Xueqi; Wu, Congzhong
2018-01-01
With the benefits of low storage costs and high query speeds, binary code representation methods are widely researched for efficient retrieval of large-scale data. In image hashing, learning a hashing function that embeds high-dimensional features into Hamming space is a key step for accurate retrieval. Principal component analysis (PCA) is widely used in compact hashing methods: most of these methods adopt PCA projection functions to project the original data onto several real-valued dimensions, and then quantize each projected dimension into one bit by thresholding. However, the variances of the projected dimensions differ, and real-valued projection introduces large quantization error. To avoid this quantization error, in this paper we propose to use a cosine similarity (angle) projection for each dimension; the angle projection preserves the original structure and yields more compact codes. We combine our method with the ITQ hashing algorithm, and extensive experiments on the public CIFAR-10 and Caltech-256 datasets validate the effectiveness of the proposed method.
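The baseline that the abstract critiques, PCA projection followed by zero-thresholding, can be sketched as follows. This is a minimal illustration with invented data and bit count, not the authors' implementation or the angle-projection variant they propose.

```python
import numpy as np

def pca_hash(X, n_bits):
    """Baseline PCA hashing: project mean-centered data onto the top
    principal components, then quantize each projected dimension into
    one bit by thresholding at zero."""
    Xc = X - X.mean(axis=0)                      # center the data
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    W = Vt[:n_bits].T                            # top-n_bits principal directions
    proj = Xc @ W                                # real-valued projections
    return (proj > 0).astype(np.uint8)           # one bit per dimension

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 32))                   # 100 invented 32-d features
codes = pca_hash(X, n_bits=8)
print(codes.shape)                               # (100, 8)
```

The quantization-error problem the abstract describes arises exactly at the `proj > 0` step: dimensions with large variance lose more information when collapsed to a single bit, which is what rotation (ITQ) or angle-based projections attempt to mitigate.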
NASA Astrophysics Data System (ADS)
Sobel, A. H.; Wang, S.; Bellon, G.; Sessions, S. L.; Woolnough, S.
2013-12-01
Parameterizations of large-scale dynamics have been developed in the past decade for studying the interaction between tropical convection and large-scale dynamics, based on our physical understanding of the tropical atmosphere. A principal advantage of these methods is that they offer a pathway to attack the key question of what controls large-scale variations of tropical deep convection. These methods have been used with both single column models (SCMs) and cloud-resolving models (CRMs) to study the interaction of deep convection with several kinds of environmental forcings. While much has been learned from these efforts, different groups' efforts are somewhat hard to compare. Different models, different versions of the large-scale parameterization methods, and experimental designs that differ in other ways are used. It is not obvious which choices are consequential to the scientific conclusions drawn and which are not. The methods have matured to the point that there is value in an intercomparison project. In this context, the Global Atmospheric Systems Study - Weak Temperature Gradient (GASS-WTG) project was proposed at the Pan-GASS meeting in September 2012. The weak temperature gradient approximation is one method to parameterize large-scale dynamics, and is used in the project name for historical reasons and simplicity, but another method, the damped gravity wave (DGW) method, will also be used in the project. The goal of the GASS-WTG project is to develop community understanding of the parameterization methods currently in use. Their strengths, weaknesses, and functionality in models with different physics and numerics will be explored in detail, and their utility to improve our understanding of tropical weather and climate phenomena will be further evaluated. This presentation will introduce the intercomparison project, including background, goals, and overview of the proposed experimental design. 
Interested groups will be invited to join (it will not be too late), and preliminary results will be presented.
Xu, Jiuping; Feng, Cuiying
2014-01-01
This paper presents an extension of the multimode resource-constrained project scheduling problem for a large-scale construction project where multiple parallel projects and a fuzzy random environment are considered. By taking into account the most typical goals in project management, a cost/weighted makespan/quality trade-off optimization model is constructed. To deal with the uncertainties, a hybrid crisp approach is used to transform the fuzzy random parameters into fuzzy variables that are subsequently defuzzified using an expected value operator with an optimistic-pessimistic index. Then a combinatorial-priority-based hybrid particle swarm optimization algorithm is developed to solve the proposed model, where the combinatorial particle swarm optimization and priority-based particle swarm optimization are designed to assign modes to activities and to schedule activities, respectively. Finally, the results and analysis of a practical example at a large-scale hydropower construction project are presented to demonstrate the practicality and efficiency of the proposed model and optimization method.
PMID:24550708
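The abstract does not spell out the expected value operator it uses. One common form for a triangular fuzzy number (a, b, c) with an optimistic-pessimistic index lam in [0, 1] is sketched below; this particular formula is an assumption for illustration, not necessarily the paper's exact operator.

```python
def expected_value_triangular(a, b, c, lam=0.5):
    """Expected value of a triangular fuzzy number (a, b, c) under an
    optimistic-pessimistic index lam (assumed form for illustration).
    lam = 0 is fully pessimistic, lam = 1 fully optimistic;
    lam = 0.5 recovers the classical credibility expected value
    (a + 2b + c) / 4."""
    return (1 - lam) * (a + b) / 2 + lam * (b + c) / 2

# Example: an activity duration "about 4 days, between 2 and 8"
print(expected_value_triangular(2, 4, 8, lam=0.5))  # 4.5
```

In a scheduling model of this kind, such an operator collapses each fuzzy parameter (duration, cost, quality) to a crisp value before the particle swarm optimizer is run, with lam expressing the decision maker's attitude toward risk.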
Efficient data management in a large-scale epidemiology research project.
Meyer, Jens; Ostrzinski, Stefan; Fredrich, Daniel; Havemann, Christoph; Krafczyk, Janina; Hoffmann, Wolfgang
2012-09-01
This article describes the concept of a "Central Data Management" (CDM) and its implementation within the large-scale population-based medical research project "Personalized Medicine". The CDM can be summarized as a conjunction of data capturing, data integration, data storage, data refinement, and data transfer. A wide spectrum of reliable "Extract Transform Load" (ETL) software for automatic integration of data, as well as "electronic Case Report Forms" (eCRFs), was developed in order to integrate decentralized and heterogeneously captured data. Due to the high sensitivity of the captured data, high system resource availability, data privacy, data security and quality assurance are of utmost importance. A complex data model was developed and implemented using an Oracle database in high-availability cluster mode in order to integrate different types of participant-related data. Intelligent data capturing and storage mechanisms improve the quality of the data. Data privacy is ensured by a multi-layered role/right system for access control and by de-identification of identifying data. A well-defined backup process prevents data loss. Over a period of one and a half years, the CDM has captured a wide variety of data, amounting to approximately 5 terabytes, without experiencing any critical incidents of system breakdown or loss of data. The aim of this article is to demonstrate one possible way of establishing a Central Data Management in large-scale medical and epidemiological studies. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
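As an illustration of the de-identification step mentioned above, a keyed pseudonymization function might look like the following. This is a hypothetical sketch, not the CDM's actual implementation; the function name and key handling are invented for illustration.

```python
import hashlib
import hmac
import secrets

def pseudonymize(participant_id: str, key: bytes) -> str:
    """Derive a stable pseudonym from a participant ID using a keyed
    HMAC, so records can be linked for research without storing the
    identifying ID itself. (Illustrative sketch only.)"""
    return hmac.new(key, participant_id.encode(), hashlib.sha256).hexdigest()[:16]

key = secrets.token_bytes(32)        # secret held by the trusted data custodian
p1 = pseudonymize("patient-0042", key)
p2 = pseudonymize("patient-0042", key)
print(p1 == p2)  # True: same ID and key always map to the same pseudonym
```

A keyed construction matters here: without the secret key, an attacker who guesses candidate IDs could recompute and reverse plain hashes, which is why production systems typically separate key custody from the research database.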
McCrorie, Paul; Walker, David; Ellaway, Anne
2018-04-30
Large-scale primary data collections are complex, costly, and time-consuming. Study protocols for trial-based research are now commonplace, with a growing number of similar pieces of work being published on observational research. However, useful additions to the literature base are publications that describe the issues and challenges faced while conducting observational studies. These can provide researchers with insightful knowledge that can inform funding proposals or project development work. In this study, we identify and reflectively discuss the unforeseen or often unpublished issues associated with organizing and implementing a large-scale objectively measured physical activity and global positioning system (GPS) data collection. The SPACES (Studying Physical Activity in Children's Environments across Scotland) study was designed to collect objectively measured physical activity and GPS data from 10- to 11-year-old children across Scotland, using a postal delivery method. The 3 main phases of the project (recruitment, delivery of project materials, and data collection and processing) are described within a 2-stage framework: (1) intended design and (2) implementation of the intended design. Unanticipated challenges arose, which influenced the data collection process; these encompass four main impact categories: (1) cost, budget, and funding; (2) project timeline; (3) participation and engagement; and (4) data challenges. The main unforeseen issues that impacted our timeline included the informed consent process for children under the age of 18 years; the use of, and coordination with, the postal service to deliver study information and equipment; and the variability associated with when participants began data collection and the time taken to send devices and consent forms back (1-12 months). 
Unanticipated budgetary issues included the identification of some study materials (AC power adapter) not fitting through letterboxes, as well as the employment of fieldworkers to increase recruitment and the return of consent forms. Finally, we encountered data issues when processing physical activity and GPS data that had been initiated across daylight saving time. We present learning points and recommendations that may benefit future studies of similar methodology in their early stages of development. ©Paul McCrorie, David Walker, Anne Ellaway. Originally published in JMIR Research Protocols (http://www.researchprotocols.org), 30.04.2018.
Beowulf Distributed Processing and the United States Geological Survey
Maddox, Brian G.
2002-01-01
Introduction In recent years, the United States Geological Survey's (USGS) National Mapping Discipline (NMD) has expanded its scientific and research activities. Work is being conducted in areas such as emergency response research, scientific visualization, urban prediction, and other simulation activities. Custom-produced digital data have become essential for these types of activities. High-resolution, remotely sensed datasets are also seeing increased use. Unfortunately, the NMD is also finding that it lacks the resources required to perform some of these activities. Many of these projects require large amounts of computer processing resources. Complex urban-prediction simulations, for example, involve large amounts of processor-intensive calculations on large amounts of input data. This project was undertaken to learn and understand the concepts of distributed processing. Experience was needed in developing these types of applications. The idea was that this type of technology could significantly aid the needs of the NMD scientific and research programs. Porting a numerically intensive application currently being used by an NMD science program to run in a distributed fashion would demonstrate the usefulness of this technology. There are several benefits that this type of technology can bring to the USGS's research programs. Projects can be performed that were previously impossible due to a lack of computing resources. Other projects can be performed on a larger scale than previously possible. For example, distributed processing can enable urban dynamics research to perform simulations on larger areas without making huge sacrifices in resolution. The processing can also be done in a more reasonable amount of time than with traditional single-threaded methods (a scaled version of Chester County, Pennsylvania, took about fifty days to finish its first calibration phase with a single-threaded program). 
This paper has several goals regarding distributed processing technology. It will describe the benefits of the technology. Real data about a distributed application will be presented as an example of the benefits that this technology can bring to USGS scientific programs. Finally, some of the issues with distributed processing that relate to USGS work will be discussed.
Regional Climate Change across the Continental U.S. Projected from Downscaling IPCC AR5 Simulations
NASA Astrophysics Data System (ADS)
Otte, T. L.; Nolte, C. G.; Otte, M. J.; Pinder, R. W.; Faluvegi, G.; Shindell, D. T.
2011-12-01
Projecting climate change scenarios to local scales is important for understanding and mitigating the effects of climate change on society and the environment. Many of the general circulation models (GCMs) that are participating in the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report (AR5) do not fully resolve regional-scale processes and therefore cannot capture local changes in temperature and precipitation extremes. We seek to project the GCM's large-scale climate change signal to the local scale using a regional climate model (RCM) by applying dynamical downscaling techniques. The RCM will be used to better understand the local changes of temperature and precipitation extremes that may result from a changing climate. Preliminary results from downscaling NASA/GISS ModelE simulations of the IPCC AR5 Representative Concentration Pathway (RCP) scenario 6.0 will be shown. The Weather Research and Forecasting (WRF) model will be used as the RCM to downscale decadal time slices for ca. 2000 and ca. 2030 and illustrate potential changes in regional climate for the continental U.S. that are projected by ModelE and WRF under RCP6.0.
NASA Technical Reports Server (NTRS)
Branscome, Lee E.; Bleck, Rainer; Obrien, Enda
1990-01-01
The project objectives are to develop process models to investigate the interaction of planetary and synoptic-scale waves including the effects of latent heat release (precipitation), nonlinear dynamics, physical and boundary-layer processes, and large-scale topography; to determine the importance of latent heat release for temporal variability and time-mean behavior of planetary and synoptic-scale waves; to compare the model results with available observations of planetary and synoptic wave variability; and to assess the implications of the results for monitoring precipitation in oceanic-storm tracks by satellite observing systems. Researchers have utilized two different models for this project: a two-level quasi-geostrophic model to study intraseasonal variability, anomalous circulations and the seasonal cycle, and a 10-level, multi-wave primitive equation model to validate the two-level Q-G model and examine effects of convection, surface processes, and spherical geometry. It explicitly resolves several planetary and synoptic waves and includes specific humidity (as a predicted variable), moist convection, and large-scale precipitation. In the past year researchers have concentrated on experiments with the multi-level primitive equation model. The dynamical part of that model is similar to the spectral model used by the National Meteorological Center for medium-range forecasts. The model includes parameterizations of large-scale condensation and moist convection. To test the validity of results regarding the influence of convective precipitation, researchers can use either one of two different convective schemes in the model, a Kuo convective scheme or a modified Arakawa-Schubert scheme which includes downdrafts. By choosing one or the other scheme, they can evaluate the impact of the convective parameterization on the circulation. In the past year researchers performed a variety of initial-value experiments with the primitive-equation model. 
Using initial conditions typical of climatological winter conditions, they examined the behavior of synoptic and planetary waves growing in moist and dry environments. Surface conditions were representative of a zonally averaged ocean. They found that moist convection associated with baroclinic wave development was confined to the subtropics.
ERIC Educational Resources Information Center
Vincent, Jack E.
Part of a large scale research project to test various theories with regard to their ability to analyze international relations, this monograph presents data on the application of distance theory to patterns of cooperation among nations. Distance theory implies that international relations systems (nations, organizations, individuals, etc.) can be…
A Needs Analysis for Technology Integration Plan: Challenges and Needs of Teachers
ERIC Educational Resources Information Center
Vatanartiran, Sinem; Karadeniz, Sirin
2015-01-01
Lack of technology leadership and technology integration plans are important obstacles to using technology effectively in schools. We carried out a large-scale study in order to design a technology integration plan for one of the pilot provinces in which the Fatih Project was initiated. The purpose of this research is to examine the perceived…
Data Use Agreement | Office of Cancer Clinical Proteomics Research
CPTAC requests that data users abide by the same principles that were previously established in the Fort Lauderdale and Amsterdam meetings. The recommendations from the Fort Lauderdale meeting (2003) on best practices and principles for sharing large-scale genomic data address the roles and responsibilities of data producers, data users and funders of community resource projects.
Jeffrey S. Evans; Andrew T. Hudak; Russ Faux; Alistair M. S. Smith
2009-01-01
Recent years have seen the progression of light detection and ranging (lidar) from the realm of research to operational use in natural resource management. Numerous government agencies, private industries, and public/private stakeholder consortiums are planning or have recently acquired large-scale acquisitions, and a national U.S. lidar acquisition is likely before...
Solar Technical Assistance Team Profile: Megan Day
What are your primary research interests? I'm looking into the most effective ways for local projects to aid cities in finding the most effective paths toward a clean energy future.
ERIC Educational Resources Information Center
Leakey, Tricia; Lunde, Kevin B.; Koga, Karin; Glanz, Karen
2004-01-01
More Institutional Review Boards (IRBs) are requiring written parental consent in school health intervention trials. Because this requirement presents a formidable challenge in conducting large-scale research, it is vital for investigators to share effective strategies learned from completed trials. Investigators for the recently completed Project…
ERIC Educational Resources Information Center
Dale, Philip S.; Rice, Mabel L.; Rimfeld, Kaili; Hayiou-Thomas, Marianna E.
2018-01-01
Purpose: There is a need for well-defined language phenotypes suitable for adolescents in twin studies and other large-scale research projects. Rice, Hoffman, and Wexler (2009) have developed a grammatical judgment measure as a clinical marker of language impairment, which has an extended developmental range to adolescence. Method: We conducted…
Examining the Characteristics of Student Postings That Are Liked and Linked in a CSCL Environment
ERIC Educational Resources Information Center
Makos, Alexandra; Lee, Kyungmee; Zingaro, Daniel
2015-01-01
This case study is the first iteration of a large-scale design-based research project to improve Pepper, an interactive discussion-based learning environment. In this phase, we designed and implemented two social features to scaffold positive learner interactivity behaviors: a "Like" button and linking tool. A mixed-methods approach was…
Dilemmas of Leading National Curriculum Reform in a Global Era: A Chinese Perspective
ERIC Educational Resources Information Center
Yin, Hongbiao; Lee, John Chi-Kin; Wang, Wenlan
2014-01-01
Since the mid-1980s, a global resurgence of large-scale reform in the field of education has been witnessed. Implementing these reforms has created many dilemmas for change leaders. Following a three-year qualitative research project, the present study explores the dilemmas leaders faced during the implementation of the national curriculum reform…
Class Size Effects on Reading Achievement Using PIRLS Data: Evidence from Greece
ERIC Educational Resources Information Center
Konstantopoulos, Spyros; Traynor, Anne
2014-01-01
Background/Context: The effects of class size on student achievement have gained considerable attention in education research and policy, especially over the last 30 years. Perhaps the best evidence about the effects of class size thus far has been produced from analyses of Project STAR data, a large-scale experiment where students and teachers…
Becoming a Principal in Indonesia: Possibility, Pitfalls and Potential
ERIC Educational Resources Information Center
Sumintono, Bambang; Sheyoputri, Elslee Y. A.; Jiang, Na; Misbach, Ifa H.; Jumintono
2015-01-01
The preparation and development of school leaders is now considered to be fundamental to school and system improvement. In the pursuit of educational change and reform, the leadership of the principal is deemed to be of critical importance. This qualitative study is part of a large scale research project that is exploring principal preparation and…
Production, Cost and Chip Characteristics of In-Woods Microchipping
J. Thompson; W. Sprinkle
2013-01-01
Emerging markets for biomass have increased the interest in producing microchips in the field. As a component of a large United States Department of Energy (DOE) funded project, microchipping has been trialed on a limited scale. The goal of the research was to evaluate the production, cost and chip characteristics of a mobile disc chipper configured to produce...
Practical strategies of black walnut genetic improvement—an update
George Rink; J.W. Van Sambeek; Phil O' Connor; Mark Coggeshall
2017-01-01
The ultimate goal of any tree improvement program is the large-scale production and distribution of genetically improved seedlings. In black walnut, projections based on earlier research indicate that genetically improved seedlings could provide growth improvement of between 15 to 25 percent by using seed or seedlings of the proper geographic origin (Bey 1980; Clausen...
NASA Astrophysics Data System (ADS)
Curdt, C.; Hoffmeister, D.; Bareth, G.; Lang, U.
2017-12-01
Science conducted in collaborative, cross-institutional research projects requires active sharing of research ideas, data, documents and further information in a well-managed, controlled and structured manner. Thus, it is important to establish corresponding infrastructures and services for the scientists. Regular project meetings and joint field campaigns support the exchange of research ideas. Technical infrastructures facilitate storage, documentation, exchange and re-use of data as results of scientific output. Additionally, publications, conference contributions, reports, pictures etc. should be managed. Both knowledge and data sharing are essential to create synergies. Within the coordinated programme `Collaborative Research Center' (CRC), the German Research Foundation offers funding to establish research data management (RDM) infrastructures and services. CRCs are large-scale, interdisciplinary, multi-institutional, long-term (up to 12 years), university-based research institutions (up to 25 sub-projects). These CRCs address complex and scientifically challenging research questions. This poster presents the RDM services and infrastructures that have been established for two CRCs, both focusing on environmental sciences. Since 2007, an RDM support infrastructure and associated services have been set up for the CRC/Transregio 32 (CRC/TR32) `Patterns in Soil-Vegetation-Atmosphere-Systems: Monitoring, Modelling and Data Assimilation' (www.tr32.de). The experiences gained have been used to arrange RDM services for the CRC1211 `Earth - Evolution at the Dry Limit' (www.crc1211.de), funded since 2016. In both projects scientists from various disciplines collect heterogeneous data during field campaigns or through modelling approaches. To manage the scientific output, the TR32DB data repository (www.tr32db.de) has been designed and implemented for the CRC/TR32. This system was transferred and adapted to the CRC1211 needs (www.crc1211db.uni-koeln.de) in 2016.
Both repositories support secure and sustainable data storage, backup, documentation, publication with DOIs, search, download, statistics as well as web mapping features. Moreover, RDM consulting and support services as well as training sessions are carried out regularly.
A study of rotor and platform design trade-offs for large-scale floating vertical axis wind turbines
NASA Astrophysics Data System (ADS)
Griffith, D. Todd; Paquette, Joshua; Barone, Matthew; Goupee, Andrew J.; Fowler, Matthew J.; Bull, Diana; Owens, Brian
2016-09-01
Vertical axis wind turbines are receiving significant attention for offshore siting. In general, offshore wind offers proximity to large population centers, a vast and more consistent wind resource, and a scale-up opportunity, to name a few beneficial characteristics. On the other hand, offshore wind suffers from a high levelized cost of energy (LCOE) and, in particular, high balance-of-system (BoS) costs owing to accessibility challenges and limited project experience. To address these challenges, Sandia National Laboratories is researching large-scale (MW-class) offshore floating vertical axis wind turbines (VAWTs). The motivation for this work is that floating VAWTs are a potential transformative technology solution to reduce offshore wind LCOE in deep-water locations. This paper explores performance and cost trade-offs within the design space for floating VAWTs between the rotor and platform configurations.
NASA Astrophysics Data System (ADS)
Tyralis, Hristos; Karakatsanis, Georgios; Tzouka, Katerina; Mamassis, Nikos
2015-04-01
The Greek electricity system is examined for the period 2002-2014. The demand load data are analysed at various time scales (hourly, daily, seasonal and annual) and they are related to the mean daily temperature and the gross domestic product (GDP) of Greece for the same time period. The prediction of energy demand, a product of the Greek Independent Power Transmission Operator, is also compared with the demand load. Interesting results about the change of the electricity demand scheme after the year 2010 are derived. This change is related to the decrease of the GDP, during the period 2010-2014. The results of the analysis will be used in the development of an energy forecasting system which will be a part of a framework for optimal planning of a large-scale hybrid renewable energy system in which hydropower plays the dominant role. Acknowledgement: This research was funded by the Greek General Secretariat for Research and Technology through the research project Combined REnewable Systems for Sustainable ENergy DevelOpment (CRESSENDO; grant number 5145)
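The multi-scale view of an hourly load series described above can be sketched as follows. The data here are synthetic; the study's actual Greek demand series is not reproduced, and the block sizes are a simplification (730-hour "months" rather than calendar months).

```python
import numpy as np

# One synthetic year of hourly load (8760 values) with a daily cycle.
hours = np.arange(8760)
load = 4000 + 500 * np.sin(2 * np.pi * hours / 24)   # hypothetical load in MW

daily_mean = load.reshape(365, 24).mean(axis=1)      # 365 daily means
seasonal = load.reshape(12, 730).mean(axis=1)        # crude 12-block seasonal profile
annual_mean = load.mean()                            # single annual figure

print(daily_mean.shape, seasonal.shape)              # (365,) (12,)
```

Relating each aggregated series to drivers such as mean daily temperature or annual GDP is then a matter of aligning the series at the matching time scale before regression, which is the kind of analysis the abstract describes.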
Scale in Education Research: Towards a Multi-Scale Methodology
ERIC Educational Resources Information Center
Noyes, Andrew
2013-01-01
This article explores some theoretical and methodological problems concerned with scale in education research through a critique of a recent mixed-method project. The project was framed by scale metaphors drawn from the physical and earth sciences and I consider how recent thinking around scale, for example, in ecosystems and human geography might…
Trunk, Achim
2007-01-01
The 1939 Nobel Laureate in Chemistry and later President of the Max Planck Society, Adolf Butenandt, has been increasingly exposed to criticism in recent years. One far-reaching accusation against him is his postulated participation in the human experiments carried out by the SS physician Josef Mengele in the Auschwitz concentration camp. It concerns a project initiated by the anthropologist Otmar von Verschuer in 1943, for which Verschuer obtained blood samples from his assistant Mengele in the Auschwitz concentration camp. When methodological problems occurred in the project, Butenandt helped Verschuer. According to the reconstruction by the geneticist Benno Müller-Hill, the research project included lethal human experiments: Mengele, he claims, had selectively infected concentration camp detainees with tuberculosis to observe their racially conditioned resistance to that disease. This reconstruction, however, contradicts other sources. Therefore an alternative reconstruction is offered here, according to which the project represented a large-scale attempt at serological race diagnosis in humans. Human experiments are not plausible for this project, yet it is clearly connected to race-biological research and its implementation.
Laycock, Alison; Bailie, Jodie; Matthews, Veronica; Cunningham, Frances; Harvey, Gillian; Percival, Nikki; Bailie, Ross
2017-01-01
Introduction: Bringing together continuous quality improvement (CQI) data from multiple health services offers opportunities to identify common improvement priorities and to develop interventions at various system levels to achieve large-scale improvement in care. An important principle of CQI is practitioner participation in interpreting data and planning evidence-based change. This study will contribute knowledge about engaging diverse stakeholders in collaborative and theoretically informed processes to identify and address priority evidence-practice gaps in care delivery. This paper describes a developmental evaluation to support and refine a novel interactive dissemination project using aggregated CQI data from Aboriginal and Torres Strait Islander primary healthcare centres in Australia. The project aims to effect multilevel system improvement in Aboriginal and Torres Strait Islander primary healthcare. Methods and analysis: Data will be gathered using document analysis, online surveys, interviews with participants and iterative analytical processes with the research team. These methods will enable real-time feedback to guide refinements to the design, reports, tools and processes as the interactive dissemination project is implemented. Qualitative data from interviews and surveys will be analysed and interpreted to provide in-depth understanding of factors that influence engagement and stakeholder perspectives about use of the aggregated data and generated improvement strategies. Sources of data will be triangulated to build up a comprehensive, contextualised perspective and integrated understanding of the project's development, implementation and findings. Ethics and dissemination: The Human Research Ethics Committee (HREC) of the Northern Territory Department of Health and Menzies School of Health Research (Project 2015-2329), the Central Australian HREC (Project 15-288) and the Charles Darwin University HREC (Project H15030) approved the study.
Dissemination will include articles in peer-reviewed journals, policy and research briefs. Results will be presented at conferences and quality improvement network meetings. Researchers, clinicians, policymakers and managers developing evidence-based system and policy interventions should benefit from this research. PMID:28710222
NASA Astrophysics Data System (ADS)
Eltahir, E. A. B.; IM, E. S.
2014-12-01
This study investigates the impact of potential large-scale (about 400,000 km2) and medium-scale (about 60,000 km2) irrigation on the climate of West Africa using the MIT Regional Climate Model. A new irrigation module is implemented to assess the impact of location and scheduling of irrigation on rainfall distribution over West Africa. A control simulation (without irrigation) and various sensitivity experiments (with irrigation) are performed and compared to discern the effects of irrigation location, size and scheduling. In general, the irrigation-induced surface cooling due to anomalously wet soil tends to suppress moist convection and rainfall, which in turn induces local subsidence and low level anti-cyclonic circulation. These local effects are dominated by a consistent reduction of local rainfall over the irrigated land, irrespective of its location. However, the remote response of rainfall distribution to irrigation exhibits a significant sensitivity to the latitudinal position of irrigation. The low-level northeasterly flow associated with anti-cyclonic circulation centered over the irrigation area can enhance the extent of low level convergence through interaction with the prevailing monsoon flow, leading to significant increase in rainfall. Despite much reduced forcing of irrigation water, the medium-scale irrigation seems to draw the same response as large-scale irrigation, which supports the robustness of the response to irrigation in our modeling system. Both large-scale and medium-scale irrigation experiments show that an optimal irrigation location and scheduling exists that would lead to a more efficient use of irrigation water. The approach of using a regional climate model to investigate the impact of location and size of irrigation schemes may be the first step in incorporating land-atmosphere interactions in the design of location and size of irrigation projects. 
However, this theoretical approach is still in an early stage of development, and further research is needed before any practical application in water resources planning. Acknowledgements. This research was supported by the National Research Foundation Singapore through the Singapore MIT Alliance for Research and Technology's Center for Environmental Sensing and Modeling interdisciplinary research program.
Research and Development of Large Capacity CFB Boilers in TPRI
NASA Astrophysics Data System (ADS)
Xianbin, Sun; Minhua, Jiang
This paper presents an overview of advancements in circulating fluidized bed (CFB) technology at the Thermal Power Research Institute (TPRI), including key technologies, boiler configurations, and progress in scaling up. To develop large CFB boilers, CFB combustion test facilities have been established, the key technologies of large-capacity CFB boilers have been researched systematically, and 100 MW to 330 MW CFB boilers have been developed and manufactured. The first domestically designed 100 MW and 210 MW CFB boilers have been put into commercial operation and have good operating performance. A domestic 330 MW CFB boiler demonstration project, an H-type CFB boiler with a compact heat exchanger, has also been put into commercial operation; this boiler is China's largest CFB boiler. The technical plan for a domestic 600 MW supercritical CFB boiler is also briefly introduced.
Freedom and Responsibility in Synthetic Genomics: The Synthetic Yeast Project
Sliva, Anna; Yang, Huanming; Boeke, Jef D.; Mathews, Debra J. H.
2015-01-01
First introduced in 2011, the Synthetic Yeast Genome (Sc2.0) Project is a large international synthetic genomics project that will culminate in the first eukaryotic cell (Saccharomyces cerevisiae) with a fully synthetic genome. With collaborators from across the globe and from a range of institutions spanning from do-it-yourself biology (DIYbio) to commercial enterprises, it is important that all scientists working on this project are cognizant of the ethical and policy issues associated with this field of research and operate under a common set of principles. In this commentary, we survey the current ethics and regulatory landscape of synthetic biology and present the Sc2.0 Statement of Ethics and Governance to which all members of the project adhere. This statement focuses on four aspects of the Sc2.0 Project: societal benefit, intellectual property, safety, and self-governance. We propose that such project-level agreements are an important, valuable, and flexible model of self-regulation for similar global, large-scale synthetic biology projects in order to maximize the benefits and minimize potential harms. PMID:26272997
LeWinn, Kaja Z.; Hutson, Malo A.; Dare, Ramie; Falk, Janet
2014-01-01
Billions of dollars are invested annually to improve low-income neighborhoods, the health impacts of which are rarely studied prospectively. University of California researchers and Mercy Housing/The Related Companies formed a “Learning Community” with the dual goals of examining the health impacts of a large-scale San Francisco redevelopment project and informing the development team of best public health practices. Early experiences highlight challenges and opportunities, including differences in stakeholders, incentives, and funding, the strengths of local partnerships, and fresh insights from new analytic tools and perspectives. Research suggests interventions that ameliorate upstream causes of poor health would save health care dollars, but policy makers must incentivize collaboration in order to spread a Learning Community model. PMID:22068398
Abrahamson, Melanie; Hooker, Elizabeth; Ajami, Nadim J; Petrosino, Joseph F; Orwoll, Eric S
2017-09-01
The relationship of the gastrointestinal microbiome to health and disease is of major research interest, including the effects of the gut microbiota on age-related conditions. Here we report on the outcome of a project to collect stool samples from a large number of community-dwelling elderly men using the OMNIgene-GUT stool/feces collection kit (OMR-200, DNA Genotek, Ottawa, Canada). Among 1,328 men who were eligible for stool collection, 982 (74%) agreed to participate and 951 submitted samples. The collection process was reported to be acceptable, almost all samples obtained were adequate, and sample handling by mail was uniformly successful. The DNA obtained provided excellent results in microbiome analyses, yielding the abundance of species and diversity of taxa that would be predicted. Our results suggest that population studies of older participants involving remote stool sample collection are feasible. These approaches would allow large-scale research projects on the association of the gut microbiota with important clinical outcomes.
Large Scale eHealth Deployment in Europe: Insights from Concurrent Use of Standards.
Eichelberg, Marco; Chronaki, Catherine
2016-01-01
Large-scale eHealth deployment projects face a major challenge when called to select the right set of standards and tools to achieve sustainable interoperability in an ecosystem that includes both legacy systems and new systems reflecting technological trends and progress. No single standard covers all the needs of an eHealth project, and there is a multitude of overlapping and sometimes competing standards that can be employed to define document formats, terminology, and communication protocols, mirroring alternative technical approaches and schools of thought. eHealth projects need to answer an important question: how can alternative or inconsistently implemented standards and specifications be used to ensure practical interoperability and long-term sustainability in large-scale eHealth deployment? In the eStandards project, 19 European case studies reporting from R&D and large-scale eHealth deployment and policy projects were analyzed. Although this study is not exhaustive, by reflecting on the concepts, standards, and tools for concurrent use and on the successes, failures, and lessons learned, this paper offers practical insights into how eHealth deployment projects can make the most of the available eHealth standards and tools, and how standards and profile developing organizations can serve users while embracing sustainability and technical innovation.
A large-scale forest fragmentation experiment: the Stability of Altered Forest Ecosystems Project.
Ewers, Robert M; Didham, Raphael K; Fahrig, Lenore; Ferraz, Gonçalo; Hector, Andy; Holt, Robert D; Kapos, Valerie; Reynolds, Glen; Sinun, Waidi; Snaddon, Jake L; Turner, Edgar C
2011-11-27
Opportunities to conduct large-scale field experiments are rare, but provide a unique opportunity to reveal the complex processes that operate within natural ecosystems. Here, we review the design of existing, large-scale forest fragmentation experiments. Based on this review, we develop a design for the Stability of Altered Forest Ecosystems (SAFE) Project, a new forest fragmentation experiment to be located in the lowland tropical forests of Borneo (Sabah, Malaysia). The SAFE Project represents an advance on existing experiments in that it: (i) allows discrimination of the effects of landscape-level forest cover from patch-level processes; (ii) is designed to facilitate the unification of a wide range of data types on ecological patterns and processes that operate over a wide range of spatial scales; (iii) has greater replication than existing experiments; (iv) incorporates an experimental manipulation of riparian corridors; and (v) embeds the experimentally fragmented landscape within a wider gradient of land-use intensity than do existing projects. The SAFE Project represents an opportunity for ecologists across disciplines to participate in a large initiative designed to generate a broad understanding of the ecological impacts of tropical forest modification.
Integrated water and renewable energy management: the Acheloos-Peneios region case study
NASA Astrophysics Data System (ADS)
Koukouvinos, Antonios; Nikolopoulos, Dionysis; Efstratiadis, Andreas; Tegos, Aristotelis; Rozos, Evangelos; Papalexiou, Simon-Michael; Dimitriadis, Panayiotis; Markonis, Yiannis; Kossieris, Panayiotis; Tyralis, Christos; Karakatsanis, Georgios; Tzouka, Katerina; Christofides, Antonis; Karavokiros, George; Siskos, Alexandros; Mamassis, Nikos; Koutsoyiannis, Demetris
2015-04-01
Within the ongoing research project "Combined Renewable Systems for Sustainable Energy Development" (CRESSENDO), we have developed a novel stochastic simulation framework for optimal planning and management of large-scale hybrid renewable energy systems, in which hydropower plays the dominant role. The methodology and associated computer tools are tested in two major adjacent river basins in Greece (Acheloos, Peneios) extending over 15 500 km2 (12% of Greek territory). River Acheloos is characterized by very high runoff and holds ~40% of the installed hydropower capacity of Greece. On the other hand, the Thessaly plain drained by Peneios - a key agricultural region for the national economy - usually suffers from water scarcity and systematic environmental degradation. The two basins are interconnected through diversion projects, existing and planned, thus formulating a unique large-scale hydrosystem whose future has been the subject of a great controversy. The study area is viewed as a hypothetically closed, energy-autonomous, system, in order to evaluate the perspectives for sustainable development of its water and energy resources. In this context we seek an efficient configuration of the necessary hydraulic and renewable energy projects through integrated modelling of the water and energy balance. We investigate several scenarios of energy demand for domestic, industrial and agricultural use, assuming that part of the demand is fulfilled via wind and solar energy, while the excess or deficit of energy is regulated through large hydroelectric works that are equipped with pumping storage facilities. The overall goal is to examine under which conditions a fully renewable energy system can be technically and economically viable for such large spatial scale.
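The regulation logic described in this abstract (surplus wind and solar energy absorbed by pumping, deficits covered by hydroelectric generation) can be sketched as a toy hourly energy balance. This is an illustration only, not the CRESSENDO framework; all demands, capacities, and efficiencies below are hypothetical.

```python
def simulate(demand, wind_solar, storage0, storage_max,
             pump_eff=0.80, turbine_eff=0.90):
    """Hourly balance of a pumped-storage-regulated renewable system.

    Energies in GWh; storage tracked as stored (potential) energy.
    Returns final storage, unmet demand, and spilled surplus.
    """
    storage = storage0
    unmet = 0.0       # demand not covered; viability requires this to be ~0
    spilled = 0.0     # surplus that could not be stored
    for d, res in zip(demand, wind_solar):
        balance = res - d
        if balance >= 0:
            # surplus: pump water uphill, losing (1 - pump_eff)
            stored = min(balance * pump_eff, storage_max - storage)
            storage += stored
            spilled += balance - stored / pump_eff
        else:
            # deficit: generate from storage, losing (1 - turbine_eff)
            need = -balance
            generated = min(need, storage * turbine_eff)
            storage -= generated / turbine_eff
            unmet += need - generated
    return storage, unmet, spilled

# toy 4-hour horizon with strongly variable wind/solar supply
final, unmet, spilled = simulate(
    demand=[5, 5, 5, 5], wind_solar=[8, 2, 9, 1],
    storage0=2.0, storage_max=10.0)
```

Scaling such a loop to multi-decadal stochastic inputs is, in essence, what makes it possible to ask under which conditions a fully renewable system of this kind remains viable.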
NASA Astrophysics Data System (ADS)
Lim, D. S.; Brady, A. L.; Cardman, Z.; Cowie, B. R.; Forrest, A.; Marinova, M.; Shepard, R.; Laval, B.; Slater, G. F.; Gernhardt, M.; Andersen, D. T.; Hawes, I.; Sumner, D. Y.; Trembanis, A. C.; McKay, C. P.
2009-12-01
Microbialites can be metre-scale or larger discrete structures that cover kilometre-scale regions, for example in Pavilion Lake, British Columbia, Canada, while the organisms associated with their growth and development are much smaller (less than millimeter scale). As such, a multi-scaled approach to understanding their provenance, maintenance and morphological characteristics is required. Research members of the Pavilion Lake Research Project (PLRP) (www.pavilionlake.com) have been working to understand microbialite morphogenesis in Pavilion Lake, B.C., Canada and the potential for biosignature preservation in these carbonate rocks using a combination of field and lab based techniques. PLRP research participants have been: (1) exploring the physical and chemical limnological properties of the lake, especially as these characteristics pertain to microbialite formation, (2) using geochemical and molecular tools to test the hypothesized biological origin of the microbialites and the associated meso-scale processes, and (3) using geochemical and microscopic tools to characterize potential biosignature preservation in the microbialites on the micro scale. To address these goals, PLRP identified the need to (a) map Pavilion Lake to gain a contextual understanding of microbialite distribution and possible correlation between their lake-wide distribution and the ambient growth conditions, and (b) sample the microbialites, including those from deepest regions of the lake (60m). Initial assessments showed that PLRP science diving operations did not prove adequate for mapping and sample recovery in the large and deep (0.8 km x 5.7 km; 65m max depth) lake. As such, the DeepWorker Science and Exploration (DSE) program was established by the PLRP. At the heart of this program are two DeepWorker (DW) submersibles, single-person vehicles that offer Scientist-Pilots (SP) an opportunity to study the lake in a 1 atm pressurized environment. 
In addition, the use of Autonomous Underwater Vehicles (AUVs) for landscape-level geophysical mapping (side-scan and multibeam) provides an additional large-scale context for the microbialite associations. The multi-scaled approach undertaken by the PLRP team members has created an opportunity to weave together a comprehensive understanding of the modern microbialites in Pavilion Lake, and their relevance to interpreting ancient carbonate fabrics. An overview of the team's findings to date and on-going research will be presented.
2009-01-01
Background Insertional mutagenesis is an effective method for functional genomic studies in various organisms. It can rapidly generate easily tractable mutations. A large-scale insertional mutagenesis with the piggyBac (PB) transposon is currently performed in mice at the Institute of Developmental Biology and Molecular Medicine (IDM), Fudan University in Shanghai, China. This project is carried out via collaborations among multiple groups overseeing interconnected experimental steps and generates a large volume of experimental data continuously. Therefore, the project calls for an efficient database system for recording, management, statistical analysis, and information exchange. Results This paper presents a database application called MP-PBmice (insertional mutation mapping system of PB Mutagenesis Information Center), which is developed to serve the on-going large-scale PB insertional mutagenesis project. A lightweight enterprise-level development framework Struts-Spring-Hibernate is used here to ensure constructive and flexible support to the application. The MP-PBmice database system has three major features: strict access-control, efficient workflow control, and good expandability. It supports the collaboration among different groups that enter data and exchange information on daily basis, and is capable of providing real time progress reports for the whole project. MP-PBmice can be easily adapted for other large-scale insertional mutation mapping projects and the source code of this software is freely available at http://www.idmshanghai.cn/PBmice. Conclusion MP-PBmice is a web-based application for large-scale insertional mutation mapping onto the mouse genome, implemented with the widely used framework Struts-Spring-Hibernate. This system is already in use by the on-going genome-wide PB insertional mutation mapping project at IDM, Fudan University. PMID:19958505
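The two features the abstract emphasizes, strict access control and workflow control, can be sketched as a state machine over insertion records. This is a hypothetical Python sketch only; the real MP-PBmice system is a Java Struts-Spring-Hibernate application, and the group names, states, and schema below are invented for illustration.

```python
# Hypothetical workflow for a PB insertion record: each state may only
# advance to the listed successors, and only by an authorized group.
WORKFLOW = {
    "injected": ["founder_born"],
    "founder_born": ["pcr_mapped"],
    "pcr_mapped": ["genome_aligned"],
    "genome_aligned": [],
}
PERMISSIONS = {
    "mouse_group": {"founder_born"},
    "mapping_group": {"pcr_mapped", "genome_aligned"},
}

def advance(record, new_state, group):
    """Move a record forward, enforcing both workflow and access control."""
    if new_state not in WORKFLOW[record["state"]]:
        raise ValueError(f"illegal transition {record['state']} -> {new_state}")
    if new_state not in PERMISSIONS.get(group, set()):
        raise PermissionError(f"{group} may not set state {new_state}")
    record["state"] = new_state
    return record

rec = {"id": "PB0001", "state": "injected"}
advance(rec, "founder_born", "mouse_group")   # allowed transition
```

Centralizing the transition rules this way is also what makes real-time progress reports cheap: counting records per state summarizes the whole project.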
Success in large high-technology projects: What really works?
NASA Astrophysics Data System (ADS)
Crosby, P.
2014-08-01
Despite a plethora of tools, technologies and management systems, successful execution of big science and engineering projects remains problematic. The sheer scale of globally funded projects such as the Large Hadron Collider and the Square Kilometre Array telescope means that lack of project success can impact both national budgets and collaborative reputations. In this paper, I explore data from contemporary literature alongside field research from several current high-technology projects in Europe and Australia, and reveal common `pressure points' that are shown to be key influencers of project control and success. I discuss how mega-science projects sit between being merely complicated and chaotic, and explain the importance of understanding multiple dimensions of project complexity. Project manager/leader traits are briefly discussed, including capability to govern and control such enterprises. Project structures are examined, including the challenge of collaborations. I show that early attention to building project resilience, curbing optimism, and risk alertness can help prepare large high-tech projects against threats, and why project managers need to understand aspects of `the silent power of time'. Mission assurance is advanced as a critical success function, alongside the deployment of task forces and new combinations of contingency plans. I argue for increased project control through industrial-style project reviews, and show how post-project reviews are an under-used, yet invaluable avenue of personal and organisational improvement. Lastly, I discuss the avoidance of project amnesia through effective capture of project knowledge, and transfer of lessons-learned to subsequent programs and projects.
Jackson, Rebecca D; Best, Thomas M; Borlawsky, Tara B; Lai, Albert M; James, Stephen; Gurcan, Metin N
2012-01-01
The conduct of clinical and translational research regularly involves the use of a variety of heterogeneous and large-scale data resources. Scalable methods for the integrative analysis of such resources, particularly when attempting to leverage computable domain knowledge in order to generate actionable hypotheses in a high-throughput manner, remain an open area of research. In this report, we describe both a generalizable design pattern for such integrative knowledge-anchored hypothesis discovery operations and our experience in applying that design pattern in the experimental context of a set of driving research questions related to the publicly available Osteoarthritis Initiative data repository. We believe that this ‘test bed’ project and the lessons learned during its execution are both generalizable and representative of common clinical and translational research paradigms. PMID:22647689
2016-04-30
renewable energy projects with a focus on novel onshore/offshore and small/large scale wind turbine designs for expanding their operational range and...ROA to estimate the values of maintenance options created by the implementation of PHM in wind turbines. When an RUL is predicted for a subsystem...predicted for the system. The section titled Example—Wind Turbine With an Outcome-Based Contract presents a case study for a PHM-enabled wind
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Leary, Patrick
The primary challenge motivating this project is the widening gap between the ability to compute information and to store it for subsequent analysis. This gap adversely impacts science code teams, who can perform analysis on only a small fraction of the data they calculate, resulting in the substantial likelihood of lost or missed science, when results are computed but not analyzed. Our approach is to perform as much analysis or visualization processing on data as possible while it is still resident in memory, which is known as in situ processing. The idea of in situ processing was not new at the time of the start of this effort in 2014, but efforts in that space were largely ad hoc, and there was no concerted effort within the research community that aimed to foster production-quality software tools suitable for use by Department of Energy (DOE) science projects. Our objective was to produce and enable the use of production-quality in situ methods and infrastructure, at scale, on DOE high-performance computing (HPC) facilities, though we expected to have an impact beyond DOE due to the widespread nature of the challenges, which affect virtually all large-scale computational science efforts. To achieve this objective, we engaged in software technology research and development (R&D), in close partnerships with DOE science code teams, to produce software technologies that were shown to run efficiently at scale on DOE HPC platforms.
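The core idea of in situ processing can be illustrated in a few lines: reduce each timestep's field while it is still in memory, keeping only small derived quantities, instead of writing the full state to disk for later analysis. This is a toy sketch, not the project's software; production in situ infrastructure couples real simulations to full analysis and visualization pipelines.

```python
def timestep_fields(nsteps, n):
    """Stand-in for a simulation: yields one full 'field' per timestep."""
    for t in range(nsteps):
        yield [(i * (t + 1)) % 7 for i in range(n)]

def in_situ_stats(fields):
    """Consume each field as it is produced, keeping only summaries.

    The summaries are tiny compared with storing every full field,
    which is the storage gap in situ processing is meant to close.
    """
    summary = []
    for field in fields:
        summary.append({"min": min(field), "max": max(field),
                        "mean": sum(field) / len(field)})
    return summary

stats = in_situ_stats(timestep_fields(nsteps=3, n=1000))
```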
The future of management: The NASA paradigm
NASA Technical Reports Server (NTRS)
Harris, Philip R.
1992-01-01
Prototypes of 21st century management, especially for large-scale enterprises, may well be found within the aerospace industry. The space era inaugurated a number of projects of such scope and magnitude that another type of management had to be created to ensure successful achievement. The challenges will be not just in terms of technology and its management, but also human and cultural in dimension. Futurists, students of management, and those concerned with technological administration would do well to review the literature of emerging space management for its wider implications. NASA offers a paradigm, or demonstrated model, of future trends in the field of management at large. More research is needed on issues of leadership for Earth-based projects in space and for space-based programs with managers there. We must recognize that large-scale technical enterprises, such as those undertaken in space, require a new form of management. NASA and other responsible agencies are urged to study excellence in space macromanagement, including the necessary multidisciplinary skills. Two recommended targets are the application of general living systems theory and macromanagement concepts for space stations in the 1990s.
Influence of small-scale turbulence on cup anemometer calibrations
NASA Astrophysics Data System (ADS)
Marraccini, M.; Bak-Kristensen, K.; Horn, A.; Fifield, E.; Hansen, S. O.
2017-11-01
The paper presents and discusses the calibration results of cup anemometers under different levels of small-scale turbulence. Small-scale turbulence is known to govern the curvature of shear layers around structures and is not related to the traditional under- and over-speeding of cup anemometers originating from large-scale turbulence components. The paper shows that small-scale turbulence has a significant effect on the calibration results obtained for cup anemometers: at 10 m/s the rotational speed seems to change by approximately 0.5% under different simulations of the small-scale turbulence. The work on which this paper is based is part of the TrueWind research project, which aims to increase the accuracy of mast-top-mounted cup anemometer measurements.
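The practical weight of the 0.5% figure can be checked with back-of-the-envelope arithmetic. Cup anemometer calibrations are commonly expressed as a linear transfer function U = A·f + B, with f the rotation rate; the A and B values below are hypothetical, and only the 0.5% change comes from the abstract.

```python
# Assumed linear calibration: U = A * f + B
A, B = 0.62, 0.25          # m per revolution and m/s offset (hypothetical)
U = 10.0                   # reference wind speed, m/s
f = (U - B) / A            # rotation rate implied by the calibration
f_turb = f * 1.005         # 0.5% rotation-rate change from the abstract
U_turb = A * f_turb + B    # speed the unchanged calibration would report
error = U_turb - U         # absolute wind-speed error at 10 m/s
```

With these numbers the 0.5% rotational change maps to roughly 0.05 m/s at 10 m/s, a meaningful bias for power-curve verification, which is presumably why the project targets it.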
NASA Astrophysics Data System (ADS)
Gao, Z.; Song, Y.; Li, C.; Zeng, F.; Wang, F.
2017-08-01
A rapid acquisition and processing method for large-scale topographic map data, relying on an Unmanned Aerial Vehicle (UAV) low-altitude aerial photogrammetry system, is studied in this paper, and the main workflow is elaborated. Key technologies of UAV photogrammetric mapping are also studied, and a rapid mapping system based on an electronic plate mapping system is developed, changing the traditional mapping mode and greatly improving the efficiency of mapping. Production tests and accuracy evaluation of Digital Orthophoto Map (DOM), Digital Line Graphic (DLG) and other digital products were carried out in combination with a city basic topographic map update project, which provides a new technique for large-scale rapid surveying and has obvious technical advantages and good application prospects.
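A routine first check in low-altitude UAV mapping of this kind is the ground sampling distance (GSD), the ground footprint of one sensor pixel, which constrains the achievable map scale. The camera parameters below are hypothetical, not taken from the project.

```python
def gsd_cm(pixel_size_um, focal_mm, height_m):
    """Ground sampling distance in cm/pixel.

    GSD = pixel size x flying height / focal length, with units
    converted so the result is in centimetres on the ground.
    """
    return (pixel_size_um * 1e-6) * height_m / (focal_mm * 1e-3) * 100

# e.g. a 4.4 um pixel and 24 mm lens flown at 150 m above ground
g = gsd_cm(pixel_size_um=4.4, focal_mm=24, height_m=150)
```

For these assumed values the GSD works out to 2.75 cm/pixel, in the range typically considered adequate for large-scale (e.g. 1:500 to 1:1000) topographic products.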
Optical mapping and its potential for large-scale sequencing projects.
Aston, C; Mishra, B; Schwartz, D C
1999-07-01
Physical mapping has been rediscovered as an important component of large-scale sequencing projects. Restriction maps provide landmark sequences at defined intervals, and high-resolution restriction maps can be assembled from ensembles of single molecules by optical means. Such optical maps can be constructed from both large-insert clones and genomic DNA, and are used as a scaffold for accurately aligning sequence contigs generated by shotgun sequencing.
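The scaffolding role described here can be sketched as a toy alignment: slide a contig's in-silico restriction digest along the optical map and report where consecutive fragment lengths agree within a tolerance. Real optical-map alignment is probabilistic and must handle sizing error, missing cuts, and false cuts; this sketch, with invented fragment sizes, only shows the landmark-matching idea.

```python
def place_contig(map_frags, contig_frags, tol=0.05):
    """Return offsets where the contig's fragment lengths match
    consecutive optical-map fragments within a relative tolerance."""
    hits = []
    n, m = len(map_frags), len(contig_frags)
    for off in range(n - m + 1):
        window = map_frags[off:off + m]
        if all(abs(w - c) <= tol * c for w, c in zip(window, contig_frags)):
            hits.append(off)
    return hits

optical_map = [12.0, 30.5, 8.2, 19.9, 8.3, 41.0]  # genomic fragment sizes, kb
contig = [8.0, 20.0, 8.1]                         # in-silico digest of a contig
hits = place_contig(optical_map, contig)          # unique placement at offset 2
```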
Integrating Data from GRACE and Other Observing Systems for Hydrological Research and Applications
NASA Technical Reports Server (NTRS)
Rodell, M.; Famiglietti, J. S.; McWilliams, E.; Beaudoing, H. K.; Li, B.; Zaitchik, B.; Reichle, R.; Bolten, J.
2011-01-01
The Gravity Recovery and Climate Experiment (GRACE) mission provides a unique view of water cycle dynamics, enabling the only space-based observations of water on and beneath the land surface that are not limited by depth. GRACE data are immediately useful for large-scale applications such as ice sheet ablation monitoring, but they are even more valuable when combined with other types of observations, either directly or within a data assimilation system. Here we describe recent results of hydrological research and applications projects enabled by GRACE. These include the following: 1) global monitoring of interannual variability of terrestrial water storage and groundwater; 2) water balance estimates of evapotranspiration over several large river basins; 3) NASA's Energy and Water Cycle Study (NEWS) state of the global water budget project; 4) drought indicator products now being incorporated into the U.S. Drought Monitor; and 5) GRACE data assimilation over several regions.
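Item 2 in the list above rests on the basin water balance: with GRACE supplying the basin-mean storage change dS, evapotranspiration follows as ET = P - Q - dS from precipitation P and discharge Q. The monthly numbers below are invented purely to illustrate the arithmetic.

```python
# Basin-mean monthly fluxes, all in mm/month over the basin area
P  = [90.0, 75.0, 60.0]    # precipitation (e.g. from gauge/satellite)
Q  = [25.0, 22.0, 18.0]    # river discharge at the basin outlet
dS = [10.0, -5.0, -12.0]   # terrestrial storage change from GRACE

# Water balance closure: evapotranspiration is the residual
ET = [p - q - ds for p, q, ds in zip(P, Q, dS)]
```

Note that a negative dS (drawdown of storage) raises the inferred ET, which is exactly the signal GRACE adds beyond what P and Q alone can constrain.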
ERIC Educational Resources Information Center
Dexter, Barbara; Seden, Roy
2012-01-01
Following an internal evaluation exercise, using Action Research, this paper identifies the positive impact of small-scale research projects on teaching and learning at a single case study UK University. Clear evidence is given of how the projects benefited students and staff, and enhanced institutional culture. Barriers to better practice are…
Brief Self-Report Scales Assessing Life History Dimensions of Mating and Parenting Effort.
Kruger, Daniel J
2017-01-01
Life history theory (LHT) is a powerful evolutionary framework for understanding physiological, psychological, and behavioral variation both between and within species. Researchers and theorists are increasingly integrating LHT into evolutionary psychology, as it provides a strong foundation for research across many topical areas. Human life history variation has been represented in psychological and behavioral research in several ways, including indicators of conditions in the developmental environment, indicators of conditions in the current environment, and indicators of maturation and life milestones (e.g., menarche, initial sexual activity, first pregnancy), and in self-report survey scale measures. Survey scale measures have included constructs such as time perspective and future discounting, although the most widely used index is a constellation of indicators assessing the K-factor, thought to index general life history speed (from fast to slow). The current project examined the utility of two brief self-report survey measures assessing the life history dimensions of mating effort and parenting effort with a large undergraduate sample in the United States. Consistent with the theory, items reflected two inversely related dimensions. In regressions including the K-factor, the Mating Effort Scale proved to be a powerful predictor of other constructs and indicators related to life history variation. The Parenting Effort Scale had less predictive power overall, although it explained unique variance across several constructs and was the only unique predictor of the number of long-term (serious and committed) relationships. These scales may be valuable additions to self-report survey research projects examining life history variation.
ROADNET: A Real-time Data Aware System for Earth, Oceanographic, and Environmental Applications
NASA Astrophysics Data System (ADS)
Vernon, F.; Hansen, T.; Lindquist, K.; Ludascher, B.; Orcutt, J.; Rajasekar, A.
2003-12-01
The Real-time Observatories, Application, and Data management Network (ROADNet) Program aims to develop an integrated, seamless, and transparent environmental information network that will deliver geophysical, oceanographic, hydrological, ecological, and physical data to a variety of users in real-time. ROADNet is a multidisciplinary, multinational partnership of researchers, policymakers, natural resource managers, educators, and students who aim to use the data to advance our understanding and management of coastal, ocean, riparian, and terrestrial Earth systems in Southern California, Mexico, and well off shore. To date, project activity and funding have focused on the design and deployment of network linkages and on the exploratory development of the real-time data management system. We are currently adapting powerful "Data Grid" technologies to the unique challenges associated with the management and manipulation of real-time data. Current "Grid" projects deal with static data files, and significant technical innovation is required to address fundamental problems of real-time data processing, integration, and distribution. The technologies developed through this research will create a system that dynamically adapts downstream processing, cataloging, and data access interfaces when sensors are added or removed from the system; provides for real-time processing and monitoring of data streams--detecting events, and triggering computations, sensor and logger modifications, and other actions; integrates heterogeneous data from multiple (signal) domains; and provides for large-scale archival and querying of "consolidated" data. The software tools which must be developed do not exist, although limited prototype systems are available.
This research has implications for the success of large-scale NSF initiatives in the Earth sciences (EarthScope), ocean sciences (OOI- Ocean Observatories Initiative), biological sciences (NEON - National Ecological Observatory Network) and civil engineering (NEES - Network for Earthquake Engineering Simulation). Each of these large scale initiatives aims to collect real-time data from thousands of sensors, and each will require new technologies to process, manage, and communicate real-time multidisciplinary environmental data on regional, national, and global scales.
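One requirement named in this abstract, monitoring real-time data streams and triggering actions when events are detected, can be reduced to a minimal sketch. Operational seismic systems use ring buffers and detectors such as STA/LTA; this toy version, with invented sample values, just flags samples exceeding a multiple of the recent mean amplitude.

```python
def monitor(stream, window=4, factor=3.0):
    """Yield indices of samples exceeding factor x the recent mean amplitude.

    'recent' is a sliding window of the last `window` samples; a detection
    here would trigger downstream computations in a real system.
    """
    recent = []
    for i, x in enumerate(stream):
        if len(recent) == window:
            mean_amp = sum(map(abs, recent)) / window
            if abs(x) > factor * mean_amp:
                yield i
        recent.append(x)
        if len(recent) > window:
            recent.pop(0)

# quiet background with one impulsive arrival at index 4
samples = [1.0, -1.2, 0.9, 1.1, 8.5, 1.0, -0.8, 0.9]
events = list(monitor(samples))
```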
NASA Astrophysics Data System (ADS)
Mendoza, Pablo A.; Mizukami, Naoki; Ikeda, Kyoko; Clark, Martyn P.; Gutmann, Ethan D.; Arnold, Jeffrey R.; Brekke, Levi D.; Rajagopalan, Balaji
2016-10-01
We examine the effects of regional climate model (RCM) horizontal resolution and forcing scaling (i.e., spatial aggregation of meteorological datasets) on the portrayal of climate change impacts. Specifically, we assess how these decisions affect: (i) historical simulation of signature measures of hydrologic behavior, and (ii) projected changes in terms of annual water balance and hydrologic signature measures. To this end, we conduct our study in three catchments located in the headwaters of the Colorado River basin. Meteorological forcings for the current climate and a future climate projection are obtained at three spatial resolutions (4, 12, and 36 km) from dynamical downscaling with the Weather Research and Forecasting (WRF) regional climate model, and hydrologic changes are computed using four different hydrologic model structures. These projected changes are compared to those obtained from running hydrologic simulations with current and future 4-km WRF climate outputs re-scaled to 12 and 36 km. The results show that the horizontal resolution of the WRF simulations heavily affects basin-averaged precipitation amounts, propagating into large differences in simulated signature measures across model structures. The implications of re-scaled forcing datasets for historical performance were primarily observed in simulated runoff seasonality. We also found that the effects of WRF grid resolution on projected changes in mean annual runoff and evapotranspiration may be larger than the effects of hydrologic model choice, which in turn surpass the effects of re-scaled forcings. Scaling effects on projected variations in hydrologic signature measures were found to be generally smaller than those coming from WRF resolution; however, forcing aggregation in many cases reversed the direction of projected changes in hydrologic behavior.
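The forcing-scaling step (spatially aggregating 4-km WRF output to 12 or 36 km) amounts to block-averaging the fine grid. A minimal sketch, assuming a regular grid whose dimensions divide evenly by the aggregation factor (4 km to 12 km corresponds to a factor of 3); the function name and toy field are illustrative:

```python
import numpy as np

def aggregate(field, factor):
    """Block-average a 2-D gridded field over non-overlapping
    factor x factor windows, preserving the domain mean."""
    ny, nx = field.shape
    if ny % factor or nx % factor:
        raise ValueError("grid dimensions must divide evenly by factor")
    return field.reshape(ny // factor, factor, nx // factor, factor).mean(axis=(1, 3))

precip_4km = np.arange(36, dtype=float).reshape(6, 6)  # toy 4-km field
precip_12km = aggregate(precip_4km, 3)                 # coarsened by a factor of 3
```

Note that the domain-average value is unchanged by this aggregation; what changes is the spatial structure (gradients and extremes), which is why re-scaled forcings can alter simulated runoff seasonality even when basin totals are preserved.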
Multiresolution comparison of precipitation datasets for large-scale models
NASA Astrophysics Data System (ADS)
Chun, K. P.; Sapriza Azuri, G.; Davison, B.; DeBeer, C. M.; Wheater, H. S.
2014-12-01
Gridded precipitation datasets are crucial for driving the large-scale models used in weather forecasting and climate research. However, the quality of precipitation products is usually validated individually. Comparing gridded precipitation products alongside ground observations provides another avenue for investigating how precipitation uncertainty affects the performance of large-scale models. In this study, using data from a set of precipitation gauges over British Columbia and Alberta, we evaluate several widely used North American gridded products, including the Canadian Gridded Precipitation Anomalies (CANGRD), the National Centers for Environmental Prediction (NCEP) reanalysis, the Water and Global Change (WATCH) project, the thin-plate spline smoothing algorithm (ANUSPLIN), and the Canadian Precipitation Analysis (CaPA). Based on verification criteria for various temporal and spatial scales, the results provide an assessment of possible applications for the various precipitation datasets. For long-term climate variation studies (~100 years), CANGRD, NCEP, WATCH, and ANUSPLIN have different comparative advantages in terms of their resolution and accuracy. For synoptic and mesoscale precipitation patterns, CaPA provides appealing spatial coherence. In addition to the product comparison, various downscaling methods are also surveyed to explore new verification and bias-reduction methods for improving gridded precipitation outputs for large-scale models.
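Verification of a gridded product against gauges typically starts from simple comparison scores at co-located points. A minimal sketch (the score set and toy values are assumptions; studies of this kind use many additional criteria and scales):

```python
import math

def verify(product, gauges):
    """Mean bias and RMSE of a gridded product sampled at gauge locations."""
    n = len(gauges)
    bias = sum(p - g for p, g in zip(product, gauges)) / n
    rmse = math.sqrt(sum((p - g) ** 2 for p, g in zip(product, gauges)) / n)
    return {"bias": bias, "rmse": rmse}

gauge_mm = [12.0, 0.0, 5.0, 20.0]  # observed daily totals (toy values)
grid_mm  = [10.0, 1.0, 6.0, 18.0]  # co-located gridded estimates
stats = verify(grid_mm, gauge_mm)  # negative bias -> product underestimates on average
```

Aggregating such scores over different temporal windows (daily, monthly, centennial) and spatial subsets is what lets products with different strengths, such as CANGRD versus CaPA, be ranked per application.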
NASA Astrophysics Data System (ADS)
Rosendahl, D. H.; Ćwik, P.; Martin, E. R.; Basara, J. B.; Brooks, H. E.; Furtado, J. C.; Homeyer, C. R.; Lazrus, H.; Mcpherson, R. A.; Mullens, E.; Richman, M. B.; Robinson-Cook, A.
2017-12-01
Extreme precipitation events cause significant damage to homes, businesses, infrastructure, and agriculture, as well as many injuries and fatalities as a result of fast-moving water or waterborne diseases. In the USA, these natural hazard events claimed the lives of more than 300 people during 2015-2016 alone, with total damage reaching $24.4 billion. Prior studies of extreme precipitation events have focused on sub-daily to sub-weekly timeframes. However, many decisions for planning, preparing, and resilience-building require sub-seasonal to seasonal timeframes (S2S; 14 to 90 days), yet adequate forecasting tools for prediction do not exist. Therefore, the goal of this newly funded project is to enhance understanding of the large-scale forcing and dynamics of S2S extreme precipitation events in the United States and to improve the capability for modeling and predicting such events. Here, we describe the project goals, objectives, and research activities that will take place over the next 5 years. In this project, a unique team of scientists and stakeholders will identify and understand weather and climate processes connected with the prediction of S2S extreme precipitation events by answering these research questions: 1) What are the synoptic patterns associated with, and characteristic of, S2S extreme precipitation events in the contiguous U.S.? 2) What role, if any, do large-scale modes of climate variability play in modulating these events? 3) How predictable are S2S extreme precipitation events across temporal scales? 4) How do we create an informative prediction of S2S extreme precipitation events for policymaking and planning? This project will use observational data, high-resolution radar composites, dynamical climate models, and workshops that engage stakeholders (water resource managers, emergency managers, and tribal environmental professionals) in the co-production of knowledge.
The overarching result of this project will be predictive models that reduce the societal and economic impacts of extreme precipitation events. Other outcomes will include statistical and co-production frameworks that could be applied to other meteorological extremes, across all time scales and in other parts of the world, to increase resilience to extreme meteorological events.
An epidemiological approach to welfare research in zoos: the Elephant Welfare Project.
Carlstead, Kathy; Mench, Joy A; Meehan, Cheryl; Brown, Janine L
2013-01-01
Multi-institutional studies of welfare have proven to be valuable in zoos but are hampered by limited sample sizes and difficulty in evaluating more than just a few welfare indicators. To more clearly understand how interactions of husbandry factors influence the interrelationships among welfare outcomes, epidemiological approaches are needed, as well as multifactorial assessments of welfare. Many questions have been raised about the housing and care of elephants in zoos and whether their environmental and social needs are being met in a manner that promotes good welfare. This article describes the background and rationale for a large-scale study of elephant welfare in North American zoos funded by the (U.S.) Institute of Museum and Library Services. The goals of this project are to document the prevalence of positive and negative welfare states in 291 elephants exhibited in 72 Association of Zoos and Aquariums zoos and then determine the environmental, management, and husbandry factors that impact elephant welfare. This research is the largest-scale nonhuman animal welfare project ever undertaken by the zoo community, and the scope of environmental variables and welfare outcomes measured is unprecedented.
The Art of Teaching Children the Arts: Music, Dance and Poetry with Children Aged 2-8 Years Old
ERIC Educational Resources Information Center
Samuelsson, Ingrid Pramling; Carlsson, Maj Asplund; Olsson, Bengt; Pramling, Niklas; Wallerstedt, Cecilia
2009-01-01
In this article, the theoretical framework of developmental pedagogy is presented as a tool in studying and developing children's knowing within the arts. The domains of art focused on are music, poetry and dance/aesthetic movement. Through empirical examples from a large-scale research project, we illustrate the tools of developmental pedagogy…
ERIC Educational Resources Information Center
Geerdink, Gerda; Bergen, Theo; Dekkers, Hetty
2011-01-01
In the Netherlands only a small number of male students opt for primary school teaching and a relatively large percentage of them leave without graduating. A small-scale research project was set up to explore the question: Can gender-specific student factors be identified in relation to the initial teacher education curriculum that leads to the…
Visual Culture Learning Communities: How and What Students Come to Know in Informal Art Groups
ERIC Educational Resources Information Center
Freedman, Kerry; Heijnen, Emiel; Kallio-Tavin, Mira; Karpati, Andrea; Papp, Laszlo
2013-01-01
This article is the report of a large-scale, international research project involving focus group interviews of adolescent and young adult members of a variety of self-initiated visual culture groups in five urban areas (Amsterdam, Budapest, Chicago, Helsinki, and Hong Kong). Each group was established by young people around their interests in the…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bolinger, Mark; Seel, Joachim
2015-09-01
Other than the nine Solar Energy Generation Systems (“SEGS”) parabolic trough projects built in the 1980s, virtually no large-scale or “utility-scale” solar projects – defined here to include any ground-mounted photovoltaic (“PV”), concentrating photovoltaic (“CPV”), or concentrating solar thermal power (“CSP”) project larger than 5 MW AC – existed in the United States prior to 2007. By 2012 – just five years later – utility-scale had become the largest sector of the overall PV market in the United States, a distinction that was repeated in both 2013 and 2014 and that is expected to continue for at least the next few years. Over this same short period, CSP also experienced a bit of a renaissance in the United States, with a number of large new parabolic trough and power tower systems – some including thermal storage – achieving commercial operation. With this critical mass of new utility-scale projects now online and in some cases having operated for a number of years (generating not only electricity, but also empirical data that can be mined), the rapidly growing utility-scale sector is ripe for analysis. This report, the third edition in an ongoing annual series, meets this need through in-depth, annually updated, data-driven analysis of not just installed project costs or prices – i.e., the traditional realm of solar economics analyses – but also operating costs, capacity factors, and power purchase agreement (“PPA”) prices from a large sample of utility-scale solar projects in the United States. Given its current dominance in the market, utility-scale PV also dominates much of this report, though data from CPV and CSP projects are presented where appropriate.
High Resolution Modelling of the Congo River's Multi-Threaded Main Stem Hydraulics
NASA Astrophysics Data System (ADS)
Carr, A. B.; Trigg, M.; Tshimanga, R.; Neal, J. C.; Borman, D.; Smith, M. W.; Bola, G.; Kabuya, P.; Mushie, C. A.; Tschumbu, C. L.
2017-12-01
We present the results of a summer 2017 field campaign by members of the Congo River users Hydraulics and Morphology (CRuHM) project, and a subsequent reach-scale hydraulic modelling study on the Congo's main stem. Sonar bathymetry, ADCP transects, and water surface elevation data have been collected along the Congo's heavily multi-threaded middle reach, which exhibits complex in-channel hydraulic processes that are not well understood. To model the entire basin's hydrodynamics, these in-channel hydraulic processes must be parameterised since it is not computationally feasible to represent them explicitly. Furthermore, recent research suggests that relative to other large global rivers, in-channel flows on the Congo represent a relatively large proportion of total flow through the river-floodplain system. We therefore regard sufficient representation of in-channel hydraulic processes as a Congo River hydrodynamic research priority. To enable explicit representation of in-channel hydraulics, we develop a reach-scale (70 km), high resolution hydraulic model. Simulation of flow through individual channel threads provides new information on flow depths and velocities, and will be used to inform the parameterisation of a broader basin-scale hydrodynamic model. The basin-scale model will ultimately be used to investigate floodplain fluxes, flood wave attenuation, and the impact of future hydrological change scenarios on basin hydrodynamics. This presentation will focus on the methodology we use to develop a reach-scale bathymetric DEM. The bathymetry of only a small proportion of channel threads can realistically be captured, necessitating some estimation of the bathymetry of channels not surveyed. We explore different approaches to this bathymetry estimation, and the extent to which it influences hydraulic model predictions. 
The CRuHM project is a consortium comprising the Universities of Kinshasa, Rhodes, Dar es Salaam, Bristol, and Leeds, and is funded by the Royal Society-DFID Africa Capacity Building Initiative. The project aims to strengthen institutional research capacity and advance our understanding of the hydrology, hydrodynamics, and sediment dynamics of the world's second-largest river system through fieldwork and the development of numerical models.
The ENCODE project: implications for psychiatric genetics.
Kavanagh, D H; Dwyer, S; O'Donovan, M C; Owen, M J
2013-05-01
The ENCyclopedia Of DNA Elements (ENCODE) project is a public research consortium that aims to identify all functional elements of the human genome sequence. The project comprised 1640 data sets from 147 different cell types, and the findings were released in a coordinated set of 34 publications across several journals. The ENCODE publications report that 80.4% of the human genome displays some functionality. These data have important implications for interpreting results from large-scale genetics studies. We review some of the key findings from the ENCODE publications and discuss how they can influence or inform further investigations into the genetic factors contributing to neuropsychiatric disorders.
Status of HiLASE project: High average power pulsed DPSSL systems for research and industry
NASA Astrophysics Data System (ADS)
Mocek, T.; Divoky, M.; Smrz, M.; Sawicka, M.; Chyla, M.; Sikocinski, P.; Vohnikova, H.; Severova, P.; Lucianetti, A.; Novak, J.; Rus, B.
2013-11-01
We introduce the Czech national R&D project HiLASE, which focuses on the strategic development of advanced high-repetition-rate, diode-pumped solid state laser (DPSSL) systems that may find use in research, high-tech industry, and future European large-scale facilities such as HiPER and ELI. Within HiLASE we explore two major concepts: thin-disk and cryogenically cooled multislab amplifiers capable of delivering average output powers above the 1 kW level in the picosecond-to-nanosecond pulsed regime. In particular, we have started a programme of technology development to demonstrate the scalability of the multislab concept up to the kJ level at repetition rates of 1-10 Hz.
Influence of the normal modes on the plasma uniformity in large scale CCP reactors
NASA Astrophysics Data System (ADS)
Eremin, Denis; Brinkmann, Ralf Peter; Mussenbrock, Thomas; Lane, Barton; Matsukuma, Masaaki; Ventzek, Peter
2016-09-01
Large-scale capacitively coupled plasmas (CCP) driven by sources with high-frequency components often exhibit phenomena that are absent in the relatively well understood small-scale CCPs driven at low frequencies. Of particular interest are phenomena that affect discharge parameters of direct relevance to plasma processing applications. One such parameter is plasma uniformity. By using a self-consistent 2d3v Particle-in-Cell/Monte Carlo (PIC/MCC) code parallelized on GPUs, we have been able to show that the uniformity of the generated plasma is influenced predominantly by two factors: the ionization pattern caused by high-energy electrons and the average temperature of low-energy plasma electrons. The heating mechanisms for these two groups of electrons appear to be different, leading to different transversal (radial) profiles of the corresponding factors, which is well captured by the kinetic PIC/MCC code. We find that the heating mechanisms are intrinsically connected with the excitation of normal modes inherent to a plasma-filled CCP reactor. In this work we study the wave nature of these phenomena, including their excitation, propagation, and interaction with electrons. Supported by the SFB-TR 87 project of the German Research Foundation and by the "Experimental and numerical analysis of very high frequency capacitively coupled plasma discharges" mutual research project between RUB and Tokyo Electron Ltd.
Bionimbus: a cloud for managing, analyzing and sharing large genomics datasets
Heath, Allison P; Greenway, Matthew; Powell, Raymond; Spring, Jonathan; Suarez, Rafael; Hanley, David; Bandlamudi, Chai; McNerney, Megan E; White, Kevin P; Grossman, Robert L
2014-01-01
Background As large genomics and phenotypic datasets become more common, it is increasingly difficult for most researchers to access, manage, and analyze them. One possible approach is to provide the research community with several petabyte-scale cloud-based computing platforms containing these data, along with tools and resources to analyze them. Methods Bionimbus is an open-source cloud-computing platform that is based primarily upon OpenStack, which manages on-demand virtual machines that provide the required computational resources, and GlusterFS, which is a high-performance clustered file system. Bionimbus also includes Tukey, a portal and associated middleware that provides a single entry point and single sign-on for the various Bionimbus resources, and Yates, which automates the installation, configuration, and maintenance of the required software infrastructure. Results Bionimbus is used by a variety of projects to process genomics and phenotypic data. For example, it is used by an acute myeloid leukemia resequencing project at the University of Chicago. The project requires several computational pipelines, including pipelines for quality control, alignment, variant calling, and annotation. For each sample, the alignment step required eight CPUs for about 12 h. BAM file sizes ranged from 5 GB to 10 GB per sample. Conclusions Most members of the research community have difficulty downloading large genomics datasets and obtaining sufficient storage and computing resources to manage and analyze the data. Cloud-computing platforms such as Bionimbus, with data commons that contain large genomics datasets, are one choice for broadening access to research data in genomics. PMID:24464852
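The per-sample figures quoted above (eight CPUs for roughly 12 h per alignment) allow a back-of-envelope compute budget for sizing on-demand virtual machines. The helper names and the cohort/cluster sizes in this sketch are illustrative assumptions, not project specifics:

```python
def alignment_core_hours(samples, cpus_per_sample=8, hours_per_sample=12):
    """Total core-hours for the alignment step across a cohort."""
    return samples * cpus_per_sample * hours_per_sample

def wall_clock_hours(samples, total_cpus, cpus_per_sample=8, hours_per_sample=12):
    """Idealized wall-clock time when samples run concurrently on a cluster
    (assumes perfect scheduling and no I/O contention)."""
    concurrent = total_cpus // cpus_per_sample          # samples running at once
    batches = -(-samples // concurrent)                 # ceiling division
    return batches * hours_per_sample

cohort_core_hours = alignment_core_hours(100)   # 100 samples at 8 CPUs x 12 h each
cohort_wall_clock = wall_clock_hours(100, 160)  # 160 CPUs -> 20 concurrent samples
```

With 5-10 GB of BAM output per sample, the same arithmetic extends to storage planning, which is exactly the kind of capacity question a cloud platform like Bionimbus answers with elastic provisioning.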
Johnson, Bruce D.; Dunlap, Eloise; Benoit, Ellen
2008-01-01
Qualitative research creates mountains of words. U.S. federal funding supports mostly structured qualitative research, which is designed to test hypotheses using semi-quantitative coding and analysis. The authors have 30 years of experience in designing and completing major qualitative research projects, mainly funded by the US National Institute on Drug Abuse (NIDA). This article reports on strategies for planning, organizing, collecting, managing, storing, retrieving, analyzing, and writing about qualitative data so as to most efficiently manage the mountains of words collected in large-scale ethnographic projects. Multiple benefits accrue from this approach. Several different staff members can contribute to the data collection, even when working from remote locations. Field expenditures are linked to units of work so that productivity can be measured, many staff in various locations can access, use, and analyze the data, quantitative data can be derived from data that are primarily qualitative, and resources are used more efficiently. The major difficulties involve the need for staff who can program and manage large databases and who can be skillful analysts of both qualitative and quantitative data. PMID:20222777
X-ray techniques for innovation in industry
Lawniczak-Jablonska, Krystyna; Cutler, Jeffrey
2014-01-01
The smart specialization declared in the European program Horizon 2020 and the increasing cooperation between research and development units in companies and researchers at universities and research institutions have created a new paradigm in which many calls for proposals require participation and funding from both public and private entities. This has created a unique opportunity for large-scale facilities, such as synchrotron research laboratories, to participate in and support applied research programs. Scientific staff at synchrotron facilities have developed many advanced tools that make optimal use of the characteristics of the light generated by the storage ring. These tools have been exceptionally valuable for materials characterization, including X-ray absorption spectroscopy, diffraction, tomography, and scattering, and have been key in solving many research and development issues. Progress in optics and detectors, as well as a large effort put into the improvement of data analysis codes, has resulted in the development of reliable and reproducible procedures for materials characterization. Research with photons has contributed to the development of a wide variety of products such as plastics, cosmetics, chemicals, building materials, packaging materials, and pharmaceuticals. In this review, a few examples are highlighted of successful cooperation leading to solutions of a variety of industrial technological problems that have been exploited by industry, including lessons learned from the Science Link project, supported by the European Commission, as a new approach to increasing the number of commercial users at large-scale research infrastructures. PMID:25485139
Wade, Victoria A; Taylor, Alan D; Kidd, Michael R; Carati, Colin
2016-05-16
This study was a component of the Flinders Telehealth in the Home project, which tested adding home telehealth to existing rehabilitation, palliative care, and geriatric outreach services. Given the known difficulty of transitioning telehealth projects into ongoing services, a qualitative study was conducted to produce a preferred implementation approach for sustainable, large-scale operations and a process model that offers practical advice for achieving this goal. Initially, semi-structured interviews were conducted with senior clinicians, health service managers, and policy makers, and a thematic analysis of the interview transcripts was undertaken to identify the range of options for ongoing operations, plus the factors affecting sustainability. Subsequently, the interviewees and other decision makers attended a deliberative forum in which participants were asked to select a preferred model for future implementation. Finally, all data from the study were synthesised by the researchers to produce a process model. Nineteen interviews with senior clinicians, managers, and service development staff were conducted, finding strong support for home telehealth but a wide diversity of views on governance, models of clinical care, technical infrastructure operations, and data management. The deliberative forum worked through these options and recommended a collaborative consortium approach for large-scale implementation. The process model proposes that the key factor for large-scale implementation is leadership support, which is enabled by 1) showing solutions to the problems of service demand, budgetary pressure, and the relationship between hospital and primary care, 2) demonstrating how home telehealth aligns with health service policies, and 3) achieving clinician acceptance through providing evidence of benefit and developing new models of clinical care.
Two key actions to enable change were marketing telehealth to patients, clinicians and policy-makers, and building a community of practice. The implementation of home telehealth services is still in an early stage. Change agents and a community of practice can contribute by marketing telehealth, demonstrating policy alignment and providing potential solutions for difficult health services problems. This should assist health leaders to move from trials to large-scale services.
Consolidated Laser-Induced Fluorescence Diagnostic Systems for the NASA Ames Arc Jet Facilities
NASA Technical Reports Server (NTRS)
Grinstead, Jay H.; Wilder, Michael C.; Porter, Barry J.; Brown, Jeffrey D.; Yeung, Dickson; Battazzo, Stephen J.; Brubaker, Timothy R.
2016-01-01
The spectroscopic diagnostic technique of two-photon absorption laser-induced fluorescence (LIF) of atomic species for non-intrusive arc jet flow property measurement was first implemented at NASA Ames in the mid-1990s. In 2013-2014, NASA combined the agency's large-scale arc jet test capabilities at NASA Ames. Concurrent with that effort, the agency also sponsored a project to establish two comprehensive LIF diagnostic systems for the Aerodynamic Heating Facility (AHF) and Interaction Heating Facility (IHF) arc jets. The scope of the project enabled further engineering development of the existing IHF LIF system as well as the complete reconstruction of the AHF LIF system. The updated LIF systems are identical in design and capability. They represent the culmination of over 20 years of development experience in transitioning a specialized laboratory research tool into a measurement system for large-scale, high-demand test facilities. This paper will document the latest improvements to the LIF system design and demonstrations of the redeveloped AHF and IHF LIF systems.
The U. S. DOE Carbon Storage Program: Status and Future Directions
NASA Astrophysics Data System (ADS)
Damiani, D.
2016-12-01
The U.S. Department of Energy (DOE) is taking steps to reduce carbon dioxide (CO2) emissions through clean energy innovation, including carbon capture and storage (CCS) research. The Office of Fossil Energy Carbon Storage Program is focused on ensuring the safe and permanent storage and/or utilization of CO2 captured from stationary sources. The Program is developing and advancing geologic storage technologies both onshore and offshore that will significantly improve the effectiveness of CCS, reduce the cost of implementation, and be ready for widespread commercial deployment in the 2025-2035 timeframe. The technology development and field testing conducted through this Program will be used to benefit the existing and future fleet of fossil fuel power generating and industrial facilities by creating tools to increase our understanding of geologic reservoirs appropriate for CO2 storage and the behavior of CO2 in the subsurface. The Program is evaluating the potential for storage in depleted oil and gas reservoirs, saline formations, unmineable coal, organic-rich shale formations, and basalt formations. Since 1997, DOE's Carbon Storage Program has significantly advanced the CCS knowledge base through a diverse portfolio of applied research projects. The Core Storage R&D research component focuses on analytic studies, laboratory, and pilot-scale research to develop technologies that can improve wellbore integrity, increase reservoir storage efficiency, improve management of reservoir pressure, ensure storage permanence, quantitatively assess risks, and identify and mitigate potential release of CO2 in all types of storage formations. The Storage Field Management component focuses on scale-up of CCS and involves field validation of technology options, including large-volume injection field projects at pre-commercial scale to confirm system performance and economics.
Future research involves commercial-scale characterization for regionally significant storage locations capable of storing from 50 to 100 million metric tons of CO2 in a saline formation. These projects will lay the foundation for fully integrated carbon capture and storage demonstrations of future first of a kind (FOAK) coal power projects. Future research will also bring added focus on offshore CCS.
NASA Astrophysics Data System (ADS)
Foster, I.; Elliott, J. W.; Jones, J.; Montella, R.
2013-12-01
Issues relating to climate change and food security require an understanding of the interaction between the natural world and human society over long time scales. Understanding climate change, its impacts on the natural world and society, and the tradeoffs inherent in societal responses demands an unprecedented degree of cooperation across academic fields. New data sources on expected future climate, soil characteristics, economic activity, historical weather, population, and land cover provide a potential basis for this cooperation. New methods are needed for sharing within and across communities not only data but also the software used to generate, synthesize, and analyze it. Progress on these research challenges is hindered by the extreme difficulties that researchers, collaborators, and the community experience when they collaborate around data. The multiplicity of data formats, inadequate computational tools, difficulty in sharing data and programs, lack of incentives for pro-social behavior, and large data volumes are among the technology barriers. The FACE-IT project at the University of Chicago, NASA, and the University of Florida employs an integrated approach to cyberinfrastructure to advance the characterization of vulnerabilities, impacts, mitigation, and adaptation to climate change in human and environmental systems. Leveraging existing research cyberinfrastructure, the project is creating a full-featured FACE-IT Platform prototype with new capabilities for ingesting, organizing, managing, analyzing, and using large quantities of diverse data. The project team collaborates with two distinct interdisciplinary communities to create community-specific FACE-IT Instances, both to advance their research and to enable at-scale evaluation of the utility of the FACE-IT approach. In this talk I will introduce the FACE-IT system and discuss early applications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hancu, Dan
GE Global Research has developed, over the last 8 years, a platform of cost-effective CO2 capture technologies based on a non-aqueous aminosilicone solvent (GAP-1m). As demonstrated in previously funded DOE projects (DE-FE0007502 and DE-FE0013755), the GAP-1m solvent has a higher CO2 working capacity and lower volatility and corrosivity than the benchmark aqueous amine technology. Performance of the GAP-1m solvent was recently demonstrated in a 0.5 MWe pilot at the National Carbon Capture Center, AL, with real flue gas for over 500 hours of operation using a Steam Stripper Column (SSC). The pilot-scale PSTU engineering data were used to (i) update the techno-economic analysis and EH&S assessment, (ii) perform a technology gap analysis, and (iii) conduct the solvent manufacturability and scale-up study.
PUMa - modelling the groundwater flow in Baltic Sedimentary Basin
NASA Astrophysics Data System (ADS)
Kalvane, G.; Marnica, A.; Bethers, U.
2012-04-01
In 2009-2012, the project "Establishment of interdisciplinary scientist group and modelling system for groundwater research", financed by the European Social Fund, is being implemented at the University of Latvia and the Latvia University of Agriculture. The aim of the project is to develop groundwater research in Latvia by establishing an interdisciplinary research group and a modelling system covering groundwater flow in the Baltic Sedimentary Basin. Researchers from fields such as geology, chemistry, mathematical modelling, physics, and environmental engineering are involved in the project. The modelling system is used as a platform for addressing scientific problems such as: (1) large-scale groundwater flow in the Baltic Sedimentary Basin and the impact of human activities on it; (2) the evolution of groundwater flow since the last glaciation and subglacial groundwater recharge; (3) the effects of climate change on shallow groundwater and the interaction of the hydrographical network and groundwater; (4) new programming approaches for groundwater modelling. Within the frame of the project, the most accessible geological information, such as descriptions of geological wells, geological maps, and results of seismic profiling in Latvia as well as Estonia and Lithuania, has been collected and integrated into the modelling system. For example, data from more than 40 thousand wells are used directly to automatically generate the geological structure of the model. Additionally, a groundwater sampling campaign has been undertaken. CFC content, stable isotopes of O and H, and radiocarbon are the most significant groundwater parameters, established at an unprecedented scale for Latvia. The most important modelling results will be published on the web as a data set. Project number: 2009/0212/1DP/1.1.1.2.0/09/APIA/VIAA/060. Project web-site: www.puma.lu.lv
Project Final Report: The Institute for Sustained Performance, Energy, and Resilience (SUPER)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hollingsworth, Jeffrey K.
This project concentrated on various aspects of creating and applying tool infrastructure to make it easier to use large-scale parallel computers effectively. The project was a collaboration with Argonne National Laboratory, Lawrence Berkeley National Laboratory, Lawrence Livermore National Laboratory, Oak Ridge National Laboratory, U.C. San Diego, the University of Maryland, the University of North Carolina, the University of Oregon, the University of Southern California, the University of Tennessee, and the University of Utah. The research conducted during this project at the University of Maryland is summarized in this report. The complete details of the work are available in the publications listed at the end of the report. Many of the concepts created during this project have been incorporated into tools and made available as freely downloadable software (www.dyninst.org/harmony). The project also supported the studies of six graduate students, one undergraduate student, and two post-docs. The funding also provided summer support for the PI and part of the salary of a research staff member.
Horiguchi, Hiromasa; Yasunaga, Hideo; Hashimoto, Hideki; Ohe, Kazuhiko
2012-12-22
Secondary use of large-scale administrative data is increasingly popular in health services and clinical research, where a user-friendly tool for data management is in great demand. MapReduce technology such as Hadoop is a promising tool for this purpose, though its use has been limited by the lack of user-friendly functions for transforming large-scale data into the wide table format, with each subject represented by one row, used in health services and clinical research. Since the original specification of Pig provides very few functions for column field management, we have developed a novel system called GroupFilterFormat to handle the definition of fields and data content based on a Pig Latin script. We have also developed, as an open-source project, several user-defined functions to transform the table format using GroupFilterFormat and to deal with processing that involves date conditions. Having prepared dummy discharge summary data for 2.3 million inpatients and medical activity log data for 950 million events, we used the Elastic Compute Cloud environment provided by Amazon Inc. to run processing-speed and scaling benchmarks. In the speed benchmark test, the response time was significantly reduced, and a linear relationship was observed between the quantity of data and processing time in both a small and a very large dataset. The scaling benchmark test showed clear scalability: in our system, doubling the number of nodes resulted in a 47% decrease in processing time. Our newly developed system is widely accessible as an open resource. It is simple and easy to use for researchers who are accustomed to the declarative command syntax of commercial statistical software and Structured Query Language.
Although our system needs further sophistication to allow more flexibility in scripts and to improve efficiency in data processing, it shows promise in facilitating the application of MapReduce technology to efficient data processing with large scale administrative data in health services and clinical research.
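The core long-to-wide transformation the system performs (one row per subject) can be sketched in plain Python. This is a minimal illustration of the idea only, not the GroupFilterFormat implementation or its Pig Latin interface; the record and field names are invented:

```python
from collections import defaultdict

def to_wide(records, subject_key, field_key, value_key):
    """Pivot long-format records (one row per event) into wide format
    (one row per subject), filling absent fields with None."""
    wide = defaultdict(dict)
    columns = set()
    for rec in records:
        wide[rec[subject_key]][rec[field_key]] = rec[value_key]
        columns.add(rec[field_key])
    header = sorted(columns)
    return [{"subject": s, **{c: row.get(c) for c in header}}
            for s, row in sorted(wide.items())]

# Hypothetical event-log rows, one per recorded item:
events = [
    {"pid": "p1", "item": "diagnosis", "val": "I10"},
    {"pid": "p1", "item": "los_days", "val": 4},
    {"pid": "p2", "item": "diagnosis", "val": "E11"},
]
table = to_wide(events, "pid", "item", "val")
```

In a MapReduce setting the grouping by subject is what the shuffle phase provides for free; the per-subject dictionary build is the reduce step.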
Statistical genetics concepts and approaches in schizophrenia and related neuropsychiatric research.
Schork, Nicholas J; Greenwood, Tiffany A; Braff, David L
2007-01-01
Statistical genetics is a research field that focuses on mathematical models and statistical inference methodologies relating genetic variations (i.e., naturally occurring human DNA sequence variations, or "polymorphisms") to particular traits or diseases (phenotypes), usually from data collected on large samples of families or individuals. The ultimate goal of such analysis is the identification of genes and genetic variations that influence disease susceptibility. Although of extreme interest and importance, the fact that many genes and environmental factors contribute to neuropsychiatric diseases of public health importance (e.g., schizophrenia, bipolar disorder, and depression) complicates relevant studies and suggests that very sophisticated mathematical and statistical modeling may be required. In addition, large-scale contemporary human DNA sequencing and related projects, such as the Human Genome Project and the International HapMap Project, as well as the development of high-throughput DNA sequencing and genotyping technologies, have provided statistical geneticists with a great deal of relevant and appropriate information and resources. Unfortunately, the use of these resources and their interpretation are not straightforward when applied to complex, multifactorial diseases such as schizophrenia. In this brief and largely nonmathematical review of the field of statistical genetics, we describe many of the main concepts, definitions, and issues that motivate contemporary research. We also discuss the most pressing contemporary problems that demand further research if progress is to be made in identifying genes and genetic variations that predispose to complex neuropsychiatric diseases.
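The basic single-variant association test underlying many of the analyses described above can be illustrated with a hand-computed Pearson chi-square on a 2x2 table of allele counts. This is a generic textbook sketch, not code from the review, and the counts are invented:

```python
def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 table:
    rows = cases/controls, columns = variant/reference allele counts.
    Uses the shortcut form n*(ad - bc)^2 / ((a+b)(c+d)(a+c)(b+d))."""
    n = a + b + c + d
    den = (a + b) * (c + d) * (a + c) * (b + d)
    if den == 0:
        raise ValueError("degenerate table: empty row or column")
    return n * (a * d - b * c) ** 2 / den

# Hypothetical counts: variant allele enriched among cases.
stat = chi2_2x2(60, 40, 40, 60)
significant = stat > 3.84  # 5% critical value, 1 degree of freedom
```

Genome-wide studies apply essentially this test at millions of markers, which is why multiple-testing correction (far stricter thresholds than 3.84) dominates the field's statistical practice.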
The age of citizen science: Stimulating future environmental research
NASA Astrophysics Data System (ADS)
Burgess, S. N.
2010-12-01
Public awareness of the state of the ocean is growing, with issues such as climate change, over-harvesting, marine pollution, coral bleaching, ocean acidification and sea level rise appearing regularly in popular media outlets. Society is also placing greater value on the range of ecosystem services the ocean provides. This increased consciousness of environmental change, due to a combination of anthropogenic activities and impacts from climate change, offers scientists the opportunity to engage citizens in environmental research. The term citizen science refers to scientific research carried out by citizens and led by professionals, which involves large-scale data collection whilst simultaneously engaging and educating those who participate. Most projects that engage citizen scientists have been specifically designed to provide an educational benefit to the volunteer and to benefit the scientific inquiry by collecting extensive data sets over large geographical areas. Engaging the public in environmental science is not a new concept, and successful projects (such as the Audubon Christmas Bird Count and Earthwatch) have been running for several decades, resulting in hundreds of thousands of people conducting long-term field research in partnership with scientists based at universities worldwide. The realm of citizen science projects is continually expanding, with public engagement options ranging from online science, to backyard afternoon studies, to fully immersive experiential learning projects running for weeks at a time. Some organisations, such as Earthwatch, also work in partnership with private industry, giving scientists access to more funding opportunities than those traditionally available.
These scientist-industry partnerships provide mutual benefits, as the results of research projects in environments such as coastal ecosystems feed directly back into business risk strategies, for example mitigating shoreline erosion, storm surges, overfishing and warming water temperatures. Citizen science projects also fulfill the requirements of government granting institutions for outreach and scientific communication. This presentation will highlight marine research projects that have engaged citizens in the scientific process, and will discuss the impacts of associated outreach, capacity building and community environmental stewardship.
Cascading failure in the wireless sensor scale-free networks
NASA Astrophysics Data System (ADS)
Liu, Hao-Ran; Dong, Ming-Ru; Yin, Rong-Rong; Han, Li
2015-05-01
In practical wireless sensor networks (WSNs), cascading failure triggered by a single failed node can seriously degrade network performance. In this paper, we investigate in depth the cascading failure of scale-free topologies in WSNs. First, a cascading failure model for scale-free topology in WSNs is studied. By analyzing the influence of node load on cascading failure, the critical load that triggers large-scale cascading failure is obtained. Based on this critical load, a control method for cascading failure is then presented. In addition, simulation experiments are performed to validate the effectiveness of the control method. The results show that the control method can effectively prevent cascading failure. Project supported by the Natural Science Foundation of Hebei Province, China (Grant No. F2014203239), the Autonomous Research Fund of Young Teachers at Yanshan University (Grant No. 14LGB017) and the Yanshan University Doctoral Foundation, China (Grant No. B867).
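The load-redistribution mechanism behind such cascades can be sketched with a Motter-Lai-style model on a small preferential-attachment (scale-free) graph. This is a generic illustration, not the paper's model; taking load equal to degree, capacity as (1 + tolerance) times load, and attacking the highest-degree node are standard simplifying assumptions:

```python
import random

def ba_graph(n, m, seed=1):
    """Small preferential-attachment graph as a dict of adjacency sets."""
    rng = random.Random(seed)
    adj = {i: set() for i in range(n)}
    targets = list(range(m))
    repeated = []  # endpoint pool: nodes appear once per incident edge
    for v in range(m, n):
        for t in targets:
            adj[v].add(t)
            adj[t].add(v)
        repeated.extend(targets)
        repeated.extend([v] * m)
        targets = list({rng.choice(repeated) for _ in range(m)})
    return adj

def cascade_size(adj, tolerance, trigger):
    """Fail the trigger node, split its load among surviving neighbours,
    and fail any neighbour pushed past its capacity; repeat to fixpoint."""
    load = {v: float(len(adj[v])) for v in adj}        # initial load = degree
    cap = {v: (1.0 + tolerance) * load[v] for v in adj}
    failed, frontier = {trigger}, {trigger}
    while frontier:
        overloaded = set()
        for f in frontier:
            alive = [u for u in adj[f] if u not in failed]
            if not alive:
                continue
            share = load[f] / len(alive)
            for u in alive:
                load[u] += share
                if load[u] > cap[u]:
                    overloaded.add(u)
        frontier = overloaded - failed
        failed |= overloaded
    return len(failed)

g = ba_graph(300, 3)
hub = max(g, key=lambda v: len(g[v]))   # attack the highest-degree node
small = cascade_size(g, 10.0, hub)      # generous capacity margin: no spread
large = cascade_size(g, 0.05, hub)      # tight margin: failure can propagate
```

Sweeping the tolerance parameter downward locates the critical margin below which a single hub failure triggers a large-scale cascade, which is the quantity the abstract's "critical load" analysis characterizes.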
Modeling the Economic Feasibility of Large-Scale Net-Zero Water Management: A Case Study.
Guo, Tianjiao; Englehardt, James D; Fallon, Howard J
While municipal direct potable water reuse (DPR) has been recommended for consideration by the U.S. National Research Council, it is unclear how to size new closed-loop DPR plants, termed "net-zero water (NZW) plants", to minimize cost and energy demand assuming upgradient water distribution. Based on a recent model optimizing the economics of plant scale for generalized conditions, the authors evaluated the feasibility and optimal scale of NZW plants for treatment capacity expansion in Miami-Dade County, Florida. Local data on population distribution and topography were input to compare projected costs for NZW vs the current plan. Total cost was minimized at a scale of 49 NZW plants for the service population of 671,823. Total unit cost for NZW systems, which mineralize chemical oxygen demand to below normal detection limits, is projected at ~$10.83 / 1000 gal, approximately 13% above the current plan and less than rates reported for several significant U.S. cities.
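The scale-optimization idea, balancing treatment economies of scale against conveyance costs that grow with each plant's service area, can be sketched as a one-dimensional search over the number of plants. Every cost parameter below is hypothetical; this does not reproduce the authors' model or their 49-plant result:

```python
def total_cost(k, pop, unit_capital=5e6, scale_exp=0.7, convey=2.0):
    """Illustrative annualized cost of serving `pop` people with k equal
    plants. Treatment cost per plant scales sublinearly with plant size
    (economies of scale, exponent < 1); conveyance cost grows with the
    population each plant must reach. All coefficients are invented."""
    per_plant_pop = pop / k
    treatment = k * unit_capital * (per_plant_pop / 1e4) ** scale_exp
    conveyance = convey * pop * per_plant_pop ** 0.5
    return treatment + conveyance

SERVICE_POP = 671_823  # service population from the case study
best = min(range(1, 200), key=lambda k: total_cost(k, SERVICE_POP))
```

With a sublinear treatment exponent and a decreasing conveyance term, the total cost is U-shaped in k, so an interior optimum exists; the case study's contribution is calibrating such a trade-off with real population and topography data.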
Public-Private Partnerships in Cloud-Computing Services in the Context of Genomic Research.
Granados Moreno, Palmira; Joly, Yann; Knoppers, Bartha Maria
2017-01-01
Public-private partnerships (PPPs) have been increasingly used to spur and facilitate innovation in a number of fields. In healthcare, the purpose of using a PPP is commonly to develop and/or provide vaccines and drugs against communicable diseases, mainly in developing or underdeveloped countries. With the advancement of technology and of the area of genomics, these partnerships also focus on large-scale genomic research projects that aim to advance the understanding of diseases that have a genetic component and to develop personalized treatments. This new focus has created new forms of PPPs that involve information technology companies, which provide computing infrastructure and services to store, analyze, and share the massive amounts of data genomic-related projects produce. In this article, we explore models of PPPs proposed to handle, protect, and share the genomic data collected and to further develop genomic-based medical products. We also identify the reasons that make these models suitable and the challenges they have yet to overcome. To achieve this, we describe the details and complexities of MSSNG, the International Cancer Genome Consortium, and the 100,000 Genomes Project, three PPPs that focus on large-scale genomic research to better understand the genetic components of autism, cancer, rare diseases, and infectious diseases with the intention of finding appropriate treatments. Organized as PPPs and employing cloud-computing services, the three projects have advanced quickly and are likely to be important sources of research and development for future personalized medicine. However, there still are unresolved matters relating to conflicts of interest, commercialization, and data control.
Learning from the challenges encountered by past PPPs allowed us to establish that developing guidelines to adequately manage personal health information stored in clouds and ensuring the protection of data integrity and privacy would be critical steps in the development of future PPPs.
ERIC Educational Resources Information Center
Richardson, Jayson W.; Sales, Gregory; Sentocnik, Sonja
2015-01-01
Integrating ICTs into international development projects is common. However, how ICTs support leading, teaching, and learning is often overlooked. This article describes a team's approach to technology integration in the design of a large-scale, five-year teacher and leader professional development project in the country of Georgia.…
NASA Astrophysics Data System (ADS)
Guiquan, Xi; Lin, Cong; Xuehui, Jin
2018-05-01
As an important platform for scientific and technological development, large-scale scientific facilities are the cornerstone of technological innovation and a guarantee of economic and social development. Research on the management of large-scale scientific facilities can play a key role in scientific research, sociology and key national strategy. This paper reviews the characteristics of large-scale scientific facilities and summarizes the development status of China's large-scale scientific facilities. Finally, the construction, management, operation and evaluation of large-scale scientific facilities are analyzed from the perspective of sustainable development.
Analysis of central enterprise architecture elements in models of six eHealth projects.
Virkanen, Hannu; Mykkänen, Juha
2014-01-01
Large-scale initiatives for eHealth services have been established in many countries on regional or national level. The use of Enterprise Architecture has been suggested as a methodology to govern and support the initiation, specification and implementation of large-scale initiatives including the governance of business changes as well as information technology. This study reports an analysis of six health IT projects in relation to Enterprise Architecture elements, focusing on central EA elements and viewpoints in different projects.
ERIC Educational Resources Information Center
Vincent, Jack E.
This monograph presents the computer printout of an analysis of data on international conflict over a three-year period. Part of a large scale research project to test various theories with regard to their power in analyzing international relations, this monograph presents data on the application of discriminant function analysis to 'topdog'…
ERIC Educational Resources Information Center
Vincent, Jack E.
This monograph presents the computer printout of an analysis of data on international conflict over a three-year period. Part of a large scale research project to test various theories with regard to their power in analyzing international relations, this monograph presents data on the application of discriminant function analysis to combined…
ERIC Educational Resources Information Center
Vincent, Jack E.
This monograph presents an analysis of data on international cooperation over a three-year period. Part of a large scale research project to test various theories with regard to their power in analyzing international relations, this monograph presents the computer printout of data on the application of second stage factor analysis of 'underdog'…
ERIC Educational Resources Information Center
Vincent, Jack E.
This monograph presents findings from an analysis of data on international cooperation over a three-year period. Computer printout of the analysis is included. Part of a large scale research project to test various theories with regard to their ability to analyze international relations, this monograph reports on the testing of relative status…
ERIC Educational Resources Information Center
Vincent, Jack E.
This monograph presents the computer printout of an analysis of data on international cooperation over a three-year period. Part of a large scale research project to test various theories with regard to their power in analyzing international relations, this monograph presents data on the application of discriminant function analysis of combined…
ERIC Educational Resources Information Center
Vincent, Jack E.
This monograph presents the computer printout of an analysis of data on international conflict over a three-year period. Part of a large scale research project to test various theories with regard to their power in analyzing international relations, this monograph presents data on the application of discriminant function analysis of 'underdog'…
ERIC Educational Resources Information Center
Vincent, Jack E.
This monograph presents the computer printout of an analysis of data on international cooperation over a three-year period. Part of a large scale research project to test various theories with regard to their power in analyzing international relations, this monograph presents data on the application of discriminant function analysis to combined…
ERIC Educational Resources Information Center
Vincent, Jack E.
This monograph presents the computer printout of an analysis of data on international conflict over a three-year period. Part of a large scale research project to test various theories with regard to their power in analyzing international relations, this monograph presents data on the application of second stage factor analysis of combined…
ERIC Educational Resources Information Center
Medina, Antonio; Beyebach, Mark
2014-01-01
This paper presents the first results of a large-scale research project on the child protection services in Tenerife, Spain. In Study 1, the professional beliefs and practices of 152 child protection workers, as measured by a Professional Beliefs and Practices Questionnaire, were correlated with their scores on the Maslach Burnout Inventory.…
ERIC Educational Resources Information Center
Gray, Colette
2008-01-01
This article reports findings from one aspect of a large scale research project funded by the Guide Dogs for the Blind Association, to investigate the mobility, independence and life skills education available to children and young people with a visual impairment between 0 and 19 years of age in Northern Ireland (NI). Here the focus is on the…
Government Support for Synthetic Pipeline Gas Uncertain and Needs Attention.
1982-05-14
coal gas. RECOMMENDATIONS: GAO recommends that the Secretary of Energy establish a plan to guide future support of high-Btu coal... recognizes that there are basic differences expected from large- and small-scale research projects, GAO believes that the report recognizes these... transportation, including the pipeline system. In its price-setting, or ratemaking, function, it represents the interests of gas customers, sometimes
RELIABILITY, AVAILABILITY, AND SERVICEABILITY FOR PETASCALE HIGH-END COMPUTING AND BEYOND
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chokchai "Box" Leangsuksun
2011-05-31
Our project is a multi-institutional research effort that adopts the interplay of reliability, availability, and serviceability (RAS) aspects for solving resilience issues in high-end scientific computing on the next generation of supercomputers. Results lie in the following tracks: failure prediction in large-scale HPC; investigation of reliability issues and mitigation techniques, including in GPGPU-based HPC systems; and HPC resilience runtime and tools.
Large scale geologic sequestration (GS) of carbon dioxide poses a novel set of challenges for regulators. This paper focuses on the unique needs of large scale GS projects in light of the existing regulatory regimes in the United States and Canada and identifies several differen...
Projected changes to precipitation extremes over the Canadian Prairies using multi-RCM ensemble
NASA Astrophysics Data System (ADS)
Masud, M. B.; Khaliq, M. N.; Wheater, H. S.
2016-12-01
Information on projected changes to precipitation extremes is needed for future planning of urban drainage infrastructure and storm water management systems and to sustain socio-economic activities and ecosystems at local, regional and other scales of interest. This study explores the projected changes to seasonal (April-October) precipitation extremes at daily, hourly and sub-hourly scales over the Canadian Prairie Provinces of Alberta, Saskatchewan, and Manitoba, based on the North American Regional Climate Change Assessment Program multi-Regional Climate Model (RCM) ensemble and regional frequency analysis. The performance of each RCM is evaluated regarding boundary and performance errors to study various sources of uncertainties and the impact of large-scale driving fields. In the absence of RCM-simulated short-duration extremes, a framework is developed to derive changes to extremes of these durations. Results from this research reveal that the relative changes in sub-hourly extremes are higher than those in the hourly and daily extremes. Overall, projected changes in precipitation extremes are larger for southeastern parts of this region than southern and northern areas, and smaller for southwestern and western parts of the study area. Keywords: climate change, precipitation extremes, regional frequency analysis, NARCCAP, Canadian Prairie provinces
SUBTASK 2.19 – OPERATIONAL FLEXIBILITY OF CO2 TRANSPORT AND STORAGE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jensen, Melanie; Schlasner, Steven; Sorensen, James
2014-12-31
Carbon dioxide (CO2) is produced in large quantities during electricity generation and by industrial processes. These CO2 streams vary in terms of both composition and mass flow rate, sometimes substantially. The impact of a varying CO2 stream on pipeline and storage operation is not fully understood in terms of either operability or infrastructure robustness. This study was performed to summarize basic background from the literature on the topic of operational flexibility of CO2 transport and storage, but the primary focus was on compiling real-world lessons learned about flexible operation of CO2 pipelines and storage from both large-scale field demonstrations and commercial operating experience. Modeling and pilot-scale results of research in this area were included to illustrate some of the questions that exist relative to operation of carbon capture and storage (CCS) projects with variable CO2 streams. It is hoped that this report’s real-world findings provide readers with useful information on the topic of transport and storage of variable CO2 streams. The real-world results were obtained from two sources. The first source consisted of five full-scale, commercial transport–storage projects: Sleipner, Snøhvit, In Salah, Weyburn, and Illinois Basin–Decatur. These scenarios were reviewed to determine the information that is available about CO2 stream variability/intermittency on these demonstration-scale projects. The five projects all experienced mass flow variability or an interruption in flow. In each case, pipeline and/or injection engineers were able to accommodate any issues that arose. Significant variability in composition has not been an issue at these five sites. The second source of real-world results was telephone interviews conducted with experts in CO2 pipeline transport, injection, and storage during which commercial anecdotal information was acquired to augment that found during the literature search of the five full-scale projects.
The experts represented a range of disciplines and hailed from North America and Europe. Major findings of the study are that compression and transport of CO2 for enhanced oil recovery (EOR) purposes in the United States has shown that impurities are not likely to cause transport problems if CO2 stream composition standards are maintained and pressures are kept at 10.3 MPa or higher. Cyclic, or otherwise intermittent, CO2 supplies historically have not impacted in-field distribution pipeline networks, wellbore integrity, or reservoir conditions. The U.S. EOR industry has demonstrated that it is possible to adapt to variability and intermittency in CO2 supply through flexible operation of the pipeline and geologic storage facility. This CO2 transport and injection experience represents knowledge that can be applied in future CCS projects. A number of gaps in knowledge were identified that may benefit from future research and development, further enhancing the possibility for widespread application of CCS. This project was funded through the Energy & Environmental Research Center–U.S. Department of Energy Joint Program on Research and Development for Fossil Energy-Related Resources Cooperative Agreement No. DE-FC26-08NT43291. Nonfederal funding was provided by the IEA Greenhouse Gas R&D Programme.
On initial Brain Activity Mapping of episodic and semantic memory code in the hippocampus.
Tsien, Joe Z; Li, Meng; Osan, Remus; Chen, Guifen; Lin, Longian; Wang, Phillip Lei; Frey, Sabine; Frey, Julietta; Zhu, Dajiang; Liu, Tianming; Zhao, Fang; Kuang, Hui
2013-10-01
It has been widely recognized that understanding the brain code will require large-scale recording and decoding of brain activity patterns. In 2007, with support from the Georgia Research Alliance, we launched the Brain Decoding Project Initiative, whose basic idea is now similarly advocated by the BRAIN project and the Brain Activity Map proposal. As planning for the BRAIN project is currently underway, we share our insights and lessons from our efforts in mapping real-time episodic memory traces in the hippocampus of freely behaving mice. We show that appropriate large-scale statistical methods are essential to decipher and measure real-time memory traces and neural dynamics. We also provide an example of how carefully designed, sometimes thinking-outside-the-box behavioral paradigms can be highly instrumental in unraveling the memory-coding cell-assembly organizing principle in the hippocampus. Our observations to date have led us to conclude that the specific-to-general categorical and combinatorial feature-coding cell-assembly mechanism represents an emergent property enabling neural networks to generate and organize not only episodic memory, but also semantic knowledge and imagination. Copyright © 2013 The Authors. Published by Elsevier Inc. All rights reserved.
On Initial Brain Activity Mapping of Associative Memory Code in the Hippocampus
Tsien, Joe Z.; Li, Meng; Osan, Remus; Chen, Guifen; Lin, Longian; Lei Wang, Phillip; Frey, Sabine; Frey, Julietta; Zhu, Dajiang; Liu, Tianming; Zhao, Fang; Kuang, Hui
2013-01-01
It has been widely recognized that understanding the brain code will require large-scale recording and decoding of brain activity patterns. In 2007, with support from the Georgia Research Alliance, we launched the Brain Decoding Project Initiative, whose basic idea is now similarly advocated by the BRAIN project and the Brain Activity Map proposal. As planning for the BRAIN project is currently underway, we share our insights and lessons from our efforts in mapping real-time episodic memory traces in the hippocampus of freely behaving mice. We show that appropriate large-scale statistical methods are essential to decipher and measure real-time memory traces and neural dynamics. We also provide an example of how carefully designed, sometimes thinking-outside-the-box behavioral paradigms can be highly instrumental in unraveling the memory-coding cell-assembly organizing principle in the hippocampus. Our observations to date have led us to conclude that the specific-to-general categorical and combinatorial feature-coding cell-assembly mechanism represents an emergent property enabling neural networks to generate and organize not only episodic memory, but also semantic knowledge and imagination. PMID:23838072
NASA Astrophysics Data System (ADS)
Stallard, R. F.
2011-12-01
The importance of biological processes in controlling weathering, erosion, stream-water composition, soil formation, and overall landscape development is generally accepted. The U.S. Geological Survey (USGS) Water, Energy, and Biogeochemical Budgets (WEBB) Project in eastern Puerto Rico and Panama and the Smithsonian Tropical Research Institute (STRI) Panama Canal Watershed Experiment (PCWE) are landscape-scale studies based in the humid tropics, where the warm temperatures, moist conditions, and luxuriant vegetation promote especially rapid biological and chemical processes: photosynthesis, respiration, decay, and chemical weathering. In both studies, features of small-watershed, large-watershed, and landscape-scale-biology experiments are blended to satisfy the research needs of the physical and biological sciences. The WEBB Project has successfully synthesized its first fifteen years of data and has addressed the influence of land cover and of geologic, topographic, and hydrologic variability, including huge storms, on a wide range of hydrologic, physical, and biogeochemical processes. The ongoing PCWE should provide a similar synthesis of a moderate-sized humid tropical watershed. The PCWE and the Agua Salud Project (ASP) within the PCWE are now addressing the role of land cover (mature forests, pasture, invasive-grass dominated, secondary succession, native species plantation, and teak) at scales ranging from small watersheds to the whole Panama Canal watershed. Biologists have participated in the experimental design at both watershed scales, and small (0.1 ha) to large (50 ha) forest-dynamic plots have a central role in interfacing between physical scientists and biologists.
In these plots, repeated, high-resolution mapping of all woody plants greater than 1-cm diameter provides a description of population changes through time presumably reflecting individual life histories, interactions with other organisms and the influence of landscape processes and climate, thereby bridging the research needs and conceptual scales of hydrologists and biogeochemists with those of biologists. Both experiments are embedded in larger data-collection networks: the WEBB within the hydrological and meteorological monitoring programs of the USGS and other federal agencies, and the PCWE in the long-term monitoring conducted by the Panama Canal Authority (ACP), its antecedents, and STRI. Examination of landscape-scale processes in a changing world requires the development of detailed landscape-scale data sets, including a formulation of reference states that can act as surrogate experimental controls. For example, the concept of a landscape steady state provides a convenient reference in which present-day observations can be interpreted. Extreme hydrological states must also be described, and both WEBB and PCWE have successfully examined the role of droughts and large storms and their impact on geomorphology, biogeochemistry, and biology. These experiments also have provided platforms for research endeavors never contemplated in the original objectives, a testament to the importance of developing approaches that consider the needs of physical and biological sciences.
StePS: Stereographically Projected Cosmological Simulations
NASA Astrophysics Data System (ADS)
Rácz, Gábor; Szapudi, István; Csabai, István; Dobos, László
2018-05-01
StePS (Stereographically Projected Cosmological Simulations) compactifies the infinite spatial extent of the Universe into a finite sphere with isotropic boundary conditions to simulate the evolution of the large-scale structure. This eliminates the need for periodic boundary conditions, which are a numerical convenience unsupported by observation and which modify the law of force on large scales in an unrealistic fashion. StePS uses stereographic projection for space compactification and a naive O(N^2) force calculation; this arrives at a correlation function of the same quality more quickly than standard (tree or P3M) algorithms with similar spatial and mass resolution. The O(N^2) force calculation is easy to adapt to modern graphics cards, hence StePS can function as a high-speed prediction tool for modern large-scale surveys.
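The direct-summation force calculation the abstract describes can be illustrated with a minimal sketch (Python with NumPy). The function name, softening parameter, and units are our own assumptions for illustration; StePS's actual compactified, stereographically projected force law is not reproduced here.

```python
import numpy as np

def pairwise_forces(pos, mass, G=1.0, soft=1e-3):
    """Naive O(N^2) gravitational forces via direct summation.

    Hedged sketch only: plain Newtonian gravity with Plummer softening,
    not the compactified force law used by StePS itself.
    """
    n = len(pos)
    forces = np.zeros_like(pos)
    for i in range(n):
        d = pos - pos[i]                     # displacement to every particle
        r2 = (d * d).sum(axis=1) + soft**2   # softened squared distance
        r2[i] = np.inf                       # exclude self-interaction
        forces[i] = G * mass[i] * (mass[:, None] * d / r2[:, None]**1.5).sum(axis=0)
    return forces
```

Because the double loop is embarrassingly parallel over particle pairs, this is the structure that maps naturally onto GPUs, as the abstract notes.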
An Open, Large-Scale, Collaborative Effort to Estimate the Reproducibility of Psychological Science.
2012-11-01
Reproducibility is a defining feature of science. However, because of strong incentives for innovation and weak incentives for confirmation, direct replication is rarely practiced or published. The Reproducibility Project is an open, large-scale, collaborative effort to systematically examine the rate and predictors of reproducibility in psychological science. So far, 72 volunteer researchers from 41 institutions have organized to openly and transparently replicate studies published in three prominent psychological journals in 2008. Multiple methods will be used to evaluate the findings, calculate an empirical rate of replication, and investigate factors that predict reproducibility. Whatever the result, a better understanding of reproducibility will ultimately improve confidence in scientific methodology and findings. © The Author(s) 2012.
A vision for chronic disease prevention intervention research: report from a workshop.
Ashbury, Frederick D; Little, Julian; Ioannidis, John P A; Kreiger, Nancy; Palmer, Lyle J; Relton, Clare; Taylor, Peter
2014-04-17
The Population Studies Research Network of Cancer Care Ontario hosted a strategic planning workshop to establish an agenda for a prevention intervention research program in Ontario, including priority topics for investigation and design considerations. The two-day workshop included: presentations on background papers developed to facilitate participants' preparation for and discussions in the workshop; keynote presentations on intervention research concerning primary prevention of chronic diseases, design and study implementation considerations; a dedicated session on critical and creative thinking to stimulate participation and discussion topics; breakout groups to identify, discuss, and present study ideas, designs, and implementation considerations; and a consensus process to discuss and identify recommendations for research priorities and next steps. The retreat yielded the following recommendations: 1) develop an intervention research agenda that includes working with existing large-scale cohorts; 2) develop an intervention research agenda that includes novel research designs that could target individuals or groups; and 3) develop an intervention research agenda in which studies collect data on costs, define stakeholders, and ensure clear strategies for stakeholder engagement and knowledge transfer. The Population Studies Research Network will develop options from these recommendations and release a call for proposals in 2014 for intervention research pilot projects that reflect these recommendations. Pilot projects will be evaluated based on their fit with the retreat's recommendations, and their potential to scale up to full studies and application in practice.
Risk management in a large-scale CO2 geosequestration pilot project, Illinois, USA
Hnottavange-Telleen, K.; Chabora, E.; Finley, R.J.; Greenberg, S.E.; Marsteller, S.
2011-01-01
Like most large-scale infrastructure projects, carbon dioxide (CO2) geological sequestration (GS) projects have multiple success criteria and multiple stakeholders. In this context "risk evaluation" encompasses multiple scales. Yet a risk management program aims to maximize the chance of project success by assessing, monitoring, and minimizing all risks in a consistent framework. The 150,000-km2 Illinois Basin underlies much of the state of Illinois, USA, and parts of adjacent Kentucky and Indiana. Its potential for CO2 storage is first-rate among basins in North America, an impression that has been strengthened by early testing of the injection well of the Midwest Geological Sequestration Consortium's (MGSC's) Phase III large-scale demonstration project, the Illinois Basin - Decatur Project (IBDP). The IBDP, funded by the U.S. Department of Energy's National Energy Technology Laboratory (NETL), represents a key trial of GS technologies and project-management techniques. Though risks are specific to each site and project, IBDP risk management methodologies provide valuable experience for future GS projects. IBDP views risk as the potential for negative impact to any of these five values: health and safety, environment, financial, advancing the viability and public acceptability of a GS industry, and research. Research goals include monitoring one million metric tonnes of injected CO2 in the subsurface. Risk management responds to the ways in which any values are at risk: for example, monitoring is designed to reduce uncertainties in parameter values that are important for research and system control, and is also designed to provide public assurance. Identified risks are the primary basis for risk-reduction measures: risks linked to uncertainty in geologic parameters guide further characterization work and guide simulations applied to performance evaluation.
Formally, industry defines risk (more precisely risk criticality) as the product L*S, the Likelihood multiplied by the Severity of negative impact. L and S are each evaluated on five-point scales, yielding a theoretical spread in risk values of 1 through 25. So defined, these judgment-based values are categorical and ordinal - they do not represent physically measurable quantities, but are nonetheless useful for comparison and therefore decision support. The "risk entities" first evaluated are FEPs - conceptual Features, Events, and Processes based on the list published by Quintessa Ltd. After concrete scenarios are generated based on selected FEPs, scenarios become the critical entities whose associated risks are evaluated and tracked. In IBDP workshops, L and S values for 123 FEPs were generated through expert elicitation. About 30 experts in the project or in GS in general were assigned among six facilitated working groups, and each group was charged to envision risks within a sphere of project operations. Working groups covered FEPs with strong spatial characteristics - such as those related to the injection wellbore and simulated plume footprint - and "nonspatial" FEPs related to finance, regulations, legal, and stakeholder issues. Within these working groups, experts shared information, examined assumptions, refined and extended the FEP list, calibrated responses, and provided initial L and S values by consensus. Individual rankings were collected in a follow-up process via emailed spreadsheets. For each of L and S, three values were collected: Lower Bound, Best Guess, and Upper Bound. The Lower-Upper Bound ranges and the spreads among experts can be interpreted to yield rough confidence measures. Based on experts' responses, FEPs were ranked in terms of their L*S risk levels. FEP rankings were determined from individual (not consensus or averaged) results, thus no high-risk responses were damped out. 
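The L*S criticality scoring described above can be sketched in a few lines (Python). The function name and the example FEP entries are hypothetical, for illustration only; actual IBDP scores came from the expert elicitation the abstract describes.

```python
def risk_criticality(likelihood, severity):
    """Risk criticality as L * S on five-point ordinal scales (range 1-25).

    These are judgment-based ordinal values, useful for ranking and
    decision support rather than as physically measurable quantities.
    """
    if not (1 <= likelihood <= 5 and 1 <= severity <= 5):
        raise ValueError("L and S are each scored on a 1-5 scale")
    return likelihood * severity

# Hypothetical FEP entries (name, L, S) - illustrative only.
feps = [("Wellbore seal degradation", 2, 5),
        ("Regulatory change", 3, 2),
        ("Plume beyond predicted footprint", 1, 4)]

# Rank FEPs by criticality, highest first.
ranked = sorted(feps, key=lambda f: risk_criticality(f[1], f[2]), reverse=True)
```

Ranking by the product rather than by L or S alone is what lets a low-likelihood, high-severity FEP outrank a frequent but minor one.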
The higher-risk FEPs were used to generate one or more concrete, well defined risk-bearing scenarios for each FEP. Any FEP scored by any expert as having associated risk of
Program Correctness, Verification and Testing for Exascale (Corvette)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sen, Koushik; Iancu, Costin; Demmel, James W
The goal of this project is to provide tools to assess the correctness of parallel programs written using hybrid parallelism. There is a dire lack of both theoretical and engineering know-how in the area of finding bugs in hybrid or large-scale parallel programs, which our research aims to change. In the project we have demonstrated novel approaches in several areas: 1. Low-overhead, automated, and precise detection of concurrency bugs at scale. 2. Using low-overhead bug detection tools to guide speculative program transformations for performance. 3. Techniques to reduce the concurrency required to reproduce a bug using partial program restart/replay. 4. Techniques to provide reproducible execution of floating point programs. 5. Techniques for tuning the floating point precision used in codes.
Zamora, Gerardo; Flores-Urrutia, Mónica Crissel; Mayén, Ana-Lucia
2016-09-01
Fortification of staple foods with vitamins and minerals is an effective approach to increase micronutrient intake and improve nutritional status. The specific use of condiments and seasonings as vehicles in large-scale fortification programs is a relatively new public health strategy. This paper underscores equity considerations for the implementation of large-scale fortification of condiments and seasonings as a public health strategy by examining nonexhaustive examples of programmatic experiences and pilot projects in various settings. An overview of conceptual elements in implementation research and equity is presented, followed by an examination of equity considerations for five implementation strategies: (1) enhancing the capabilities of the public sector, (2) improving the performance of implementing agencies, (3) strengthening the capabilities and performance of frontline workers, (4) empowering communities and individuals, and (5) supporting multiple stakeholders engaged in improving health. Finally, specific considerations related to intersectoral action are discussed. Large-scale fortification of condiments and seasonings cannot be a standalone strategy and needs to be implemented with concurrent and coordinated public health strategies, which should be informed by a health equity lens. © 2016 New York Academy of Sciences.
Laycock, Alison; Bailie, Jodie; Matthews, Veronica; Cunningham, Frances; Harvey, Gillian; Percival, Nikki; Bailie, Ross
2017-07-13
Bringing together continuous quality improvement (CQI) data from multiple health services offers opportunities to identify common improvement priorities and to develop interventions at various system levels to achieve large-scale improvement in care. An important principle of CQI is practitioner participation in interpreting data and planning evidence-based change. This study will contribute knowledge about engaging diverse stakeholders in collaborative and theoretically informed processes to identify and address priority evidence-practice gaps in care delivery. This paper describes a developmental evaluation to support and refine a novel interactive dissemination project using aggregated CQI data from Aboriginal and Torres Strait Islander primary healthcare centres in Australia. The project aims to effect multilevel system improvement in Aboriginal and Torres Strait Islander primary healthcare. Data will be gathered using document analysis, online surveys, interviews with participants and iterative analytical processes with the research team. These methods will enable real-time feedback to guide refinements to the design, reports, tools and processes as the interactive dissemination project is implemented. Qualitative data from interviews and surveys will be analysed and interpreted to provide in-depth understanding of factors that influence engagement and stakeholder perspectives about use of the aggregated data and generated improvement strategies. Sources of data will be triangulated to build up a comprehensive, contextualised perspective and integrated understanding of the project's development, implementation and findings. The Human Research Ethics Committee (HREC) of the Northern Territory Department of Health and Menzies School of Health Research (Project 2015-2329), the Central Australian HREC (Project 15-288) and the Charles Darwin University HREC (Project H15030) approved the study. 
Dissemination will include articles in peer-reviewed journals, policy and research briefs. Results will be presented at conferences and quality improvement network meetings. Researchers, clinicians, policymakers and managers developing evidence-based system and policy interventions should benefit from this research. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Integrating the Complete Research Project into a Large Qualitative Methods Course
ERIC Educational Resources Information Center
Raddon, Mary-Beth; Nault, Caleb; Scott, Alexis
2008-01-01
Participatory exercises are standard practice in qualitative methods courses; less common are projects that engage students in the entire research process, from research design to write-up. Although the teaching literature provides several models of complete research projects, their feasibility, and appropriateness for large, compulsory,…
NASA Technical Reports Server (NTRS)
Zapata, R. N.; Humphris, R. R.; Henderson, K. C.
1975-01-01
The unique design and operational characteristics of a prototype magnetic suspension and balance facility which utilizes superconductor technology are described and discussed from the point of view of scalability to large sizes. The successful experimental demonstration of the feasibility of this new magnetic suspension concept of the University of Virginia, together with the success of the cryogenic wind-tunnel concept developed at Langley Research Center, appear to have finally opened the way to clean-tunnel, high-Re aerodynamic testing. Results of calculations corresponding to a two-step design extrapolation from the observed performance of the prototype magnetic suspension system to a system compatible with the projected cryogenic transonic research tunnel are presented to give an order-of-magnitude estimate of expected performance characteristics. Research areas where progress should lead to improved design and performance of large facilities are discussed.
ERIC Educational Resources Information Center
Veaner, Allen B.
Project BALLOTS is a large-scale library automation development project of the Stanford University Libraries which has demonstrated the feasibility of conducting on-line interactive searches of complex bibliographic files, with a large number of users working simultaneously in the same or different files. This report documents the continuing…
NASA Astrophysics Data System (ADS)
Mills, W. B.; Costa-Cabral, M. C.; Bromirski, P. D.; Miller, N. L.; Coats, R. N.; Loewenstein, M.; Roy, S. B.; MacWilliams, M.
2012-12-01
This work evaluates the implications for flooding risk at the low-lying NASA Ames Research Center in South San Francisco Bay under historical and projected climate and sea level rise. Atmospheric circulation patterns over the Pacific Ocean, influenced by ENSO and PDO, can result in extended periods of higher mean coastal sea level in California. Simultaneously, they give rise to a larger number of landfalling storms with higher mean intensity. These storms generate barometrically induced high water anomalies, and winds that are sometimes capable of producing large coastal waves. Storm surges that propagate from the coast into the estuary and South Bay, and locally generated waves, may compromise the discharge capacity of stream channels. These conditions also typically generate high-intensity rainfall, and the reduced channel capacity may result in fluvial flooding. Such atmospheric circulation patterns may persist for many months, during which California experiences more precipitation events of longer mean duration and higher intensity, leading to large precipitation totals that saturate soils and may exceed the storage capacity of stormwater retention ponds. Future sea level rise, which may surpass a meter this century according to projections recently published by the National Research Council for the states of CA, OR, and WA, together with projected atmospheric circulation changes associated with anthropogenic climate change, may amplify these risks.
We evaluate the impacts of these changes on NASA's Ames Research Center through four areas of study: (i) wetland accretion and evolution as mean sea level rises, with implications to the Bay's response to the sea level rise and storm surges, (ii) hydrodynamic modeling to simulate the propagation of tidal height and storm surges in the Bay and the influence of local winds on wave height, (iii) evaluation of historical data and future climate projections to identify extreme precipitation events, and (iv) regional climate models to identify moisture source areas and evaluate the role of moisture flux on projected California precipitation.
Caxaj, C Susana; Berman, Helene; Ray, Susan L; Restoule, Jean-Paul; Varcoe, Coleen
2014-11-01
The influence of large-scale mining on the psychosocial wellbeing and mental health of diverse Indigenous communities has attracted increased attention. In previous reports, we have discussed the influence of a gold mining operation on the health of a community in the Western highlands of Guatemala. Here, we discuss the community strengths and acts of resistance of this community, that is, community processes that promoted mental health amidst this context. Using an anti-colonial narrative methodology that incorporated participatory action research principles, we developed a research design in collaboration with community leaders and participants. Data collection involved focus groups, individual interviews and photo-sharing with 54 men and women between the ages of 18 and 67. Data analysis was guided by iterative and ongoing conversations with participants and McCormack's narrative lenses. Study findings revealed key mechanisms and sources of resistance, including a shared cultural identity, a spiritual knowing and being, 'defending our rights, defending our territory,' and speaking truth to power. These overlapping strengths were identified by participants as key protective factors in facing challenges and adversity. Yet ultimately, these same strengths were often the most eroded or endangered due to the influence of large-scale mining operations in the region. These community strengths and acts of resistance reveal important priorities for promoting mental health and wellbeing for populations impacted by large-scale mining operations. Mental health practitioners must attend to both the strengths and parallel vulnerabilities that may be occasioned by large-scale projects of this nature.
Sediment dynamics in the Adriatic Sea investigated with coupled models
Sherwood, Christopher R.; Book, Jeffrey W.; Carniel, Sandro; Cavaleri, Luigi; Chiggiato, Jacopo; Das, Himangshu; Doyle, James D.; Harris, Courtney K.; Niedoroda, Alan W.; Perkins, Henry; Poulain, Pierre-Marie; Pullen, Julie; Reed, Christopher W.; Russo, Aniello; Sclavo, Mauro; Signell, Richard P.; Traykovski, Peter A.; Warner, John C.
2004-01-01
Several large research programs focused on the Adriatic Sea in winter 2002-2003, making it an exciting place for sediment dynamics modelers (Figure 1). Investigations of atmospheric forcing and oceanic response (including wave generation and propagation, water-mass formation, stratification, and circulation), suspended material, bottom boundary layer dynamics, bottom sediment, and small-scale stratigraphy were performed by European and North American researchers participating in several projects. The goal of EuroSTRATAFORM researchers is to improve our ability to understand and simulate the physical processes that deliver sediment to the marine environment and generate stratigraphic signatures. Scientists involved in the Po and Apennine Sediment Transport and Accumulation (PASTA) experiment benefited from other major research programs including ACE (Adriatic Circulation Experiment), DOLCE VITA (Dynamics of Localized Currents and Eddy Variability in the Adriatic), EACE (the Croatian East Adriatic Circulation Experiment project), WISE (West Istria Experiment), and ADRICOSM (Italian nowcasting and forecasting) studies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Malony, Allen D; Shende, Sameer
This is the final progress report for the FastOS (Phase 2) (FastOS-2) project with Argonne National Laboratory and the University of Oregon (UO). The project started at UO on July 1, 2008 and ran until April 30, 2010, at which time a six-month no-cost extension began. The FastOS-2 work at UO delivered excellent results in all research work areas: * scalable parallel monitoring * kernel-level performance measurement * parallel I/O system measurement * large-scale and hybrid application performance measurement * online scalable performance data reduction and analysis * binary instrumentation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anders, R.
2013-05-01
The Long Island Solar Farm (LISF) is a remarkable success story, whereby very different interest groups found a way to capitalize on unusual circumstances to develop a mutually beneficial source of renewable energy. The uniqueness of the circumstances that were necessary to develop the Long Island Solar Farm makes it very difficult to replicate. The project is, however, an unparalleled resource for solar energy research, which will greatly inform large-scale PV solar development in the East. Lastly, the LISF is a superb model for the process by which the project developed and the innovation and leadership shown by the different players.
Silventoinen, Karri; Jelenkovic, Aline; Sund, Reijo; Honda, Chika; Aaltonen, Sari; Yokoyama, Yoshie; Tarnoki, Adam D; Tarnoki, David L; Ning, Feng; Ji, Fuling; Pang, Zengchang; Ordoñana, Juan R; Sánchez-Romera, Juan F; Colodro-Conde, Lucia; Burt, S Alexandra; Klump, Kelly L; Medland, Sarah E; Montgomery, Grant W; Kandler, Christian; McAdams, Tom A; Eley, Thalia C; Gregory, Alice M; Saudino, Kimberly J; Dubois, Lise; Boivin, Michel; Haworth, Claire M A; Plomin, Robert; Öncel, Sevgi Y; Aliev, Fazil; Stazi, Maria A; Fagnani, Corrado; D'Ippolito, Cristina; Craig, Jeffrey M; Saffery, Richard; Siribaddana, Sisira H; Hotopf, Matthew; Sumathipala, Athula; Spector, Timothy; Mangino, Massimo; Lachance, Genevieve; Gatz, Margaret; Butler, David A; Bayasgalan, Gombojav; Narandalai, Danshiitsoodol; Freitas, Duarte L; Maia, José Antonio; Harden, K Paige; Tucker-Drob, Elliot M; Christensen, Kaare; Skytthe, Axel; Kyvik, Kirsten O; Hong, Changhee; Chong, Youngsook; Derom, Catherine A; Vlietinck, Robert F; Loos, Ruth J F; Cozen, Wendy; Hwang, Amie E; Mack, Thomas M; He, Mingguang; Ding, Xiaohu; Chang, Billy; Silberg, Judy L; Eaves, Lindon J; Maes, Hermine H; Cutler, Tessa L; Hopper, John L; Aujard, Kelly; Magnusson, Patrik K E; Pedersen, Nancy L; Aslan, Anna K Dahl; Song, Yun-Mi; Yang, Sarah; Lee, Kayoung; Baker, Laura A; Tuvblad, Catherine; Bjerregaard-Andersen, Morten; Beck-Nielsen, Henning; Sodemann, Morten; Heikkilä, Kauko; Tan, Qihua; Zhang, Dongfeng; Swan, Gary E; Krasnow, Ruth; Jang, Kerry L; Knafo-Noam, Ariel; Mankuta, David; Abramson, Lior; Lichtenstein, Paul; Krueger, Robert F; McGue, Matt; Pahlen, Shandell; Tynelius, Per; Duncan, Glen E; Buchwald, Dedra; Corley, Robin P; Huibregtse, Brooke M; Nelson, Tracy L; Whitfield, Keith E; Franz, Carol E; Kremen, William S; Lyons, Michael J; Ooki, Syuichi; Brandt, Ingunn; Nilsen, Thomas Sevenius; Inui, Fujio; Watanabe, Mikio; Bartels, Meike; van Beijsterveldt, Toos C E M; Wardle, Jane; Llewellyn, Clare H; Fisher, Abigail; Rebato, 
Esther; Martin, Nicholas G; Iwatani, Yoshinori; Hayakawa, Kazuo; Rasmussen, Finn; Sung, Joohon; Harris, Jennifer R; Willemsen, Gonneke; Busjahn, Andreas; Goldberg, Jack H; Boomsma, Dorret I; Hur, Yoon-Mi; Sørensen, Thorkild I A; Kaprio, Jaakko
2015-08-01
For over 100 years, the genetics of human anthropometric traits has attracted scientific interest. In particular, height and body mass index (BMI, calculated as kg/m2) have been under intensive genetic research. However, it is still largely unknown whether and how heritability estimates vary between human populations. Opportunities to address this question have increased recently because of the establishment of many new twin cohorts and the increasing accumulation of data in established twin cohorts. We started a new research project to analyze systematically (1) the variation of heritability estimates of height, BMI and their trajectories over the life course between birth cohorts, ethnicities and countries, and (2) the effects of birth-related factors, education and smoking on these anthropometric traits, and whether these effects vary between twin cohorts. We identified 67 twin projects, including both monozygotic (MZ) and dizygotic (DZ) twins, using various sources. We asked for individual level data on height and weight including repeated measurements, birth related traits, background variables, education and smoking. By the end of 2014, 48 projects participated. Together, we have 893,458 height and weight measures (52% females) from 434,723 twin individuals, including 201,192 complete twin pairs (40% monozygotic, 40% same-sex dizygotic and 20% opposite-sex dizygotic) representing 22 countries. This project demonstrates that large-scale international twin studies are feasible and can promote the use of existing data for novel research purposes.
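As a hedged illustration of the kind of estimate such MZ/DZ twin data support, the classical Falconer formula derives a heritability estimate from twin correlations. The correlations below are made up for the example; consortia of this kind typically fit more sophisticated structural-equation (ACE) models rather than this textbook approximation.

```python
def falconer_h2(r_mz, r_dz):
    """Falconer's estimate of heritability from twin correlations.

    h^2 = 2 * (r_MZ - r_DZ): MZ twins share ~100% of segregating genes,
    DZ twins ~50%, so doubling the correlation gap attributes the excess
    MZ similarity to additive genetic effects. A rough approximation only.
    """
    return 2.0 * (r_mz - r_dz)

# Illustrative (made-up) twin correlations for a height-like trait.
h2 = falconer_h2(0.90, 0.50)   # 2 * (0.90 - 0.50) = 0.8
```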
Ellinas, Christos; Allan, Neil; Durugbo, Christopher; Johansson, Anders
2015-01-01
Current societal requirements necessitate the effective delivery of complex projects that can do more while using less. Yet, recent large-scale project failures suggest that our ability to successfully deliver them is still at its infancy. Such failures can be seen to arise through various failure mechanisms; this work focuses on one such mechanism. Specifically, it examines the likelihood of a project sustaining a large-scale catastrophe, as triggered by single task failure and delivered via a cascading process. To do so, an analytical model was developed and tested on an empirical dataset by the means of numerical simulation. This paper makes three main contributions. First, it provides a methodology to identify the tasks most capable of impacting a project. In doing so, it is noted that a significant number of tasks induce no cascades, while a handful are capable of triggering surprisingly large ones. Secondly, it illustrates that crude task characteristics cannot aid in identifying them, highlighting the complexity of the underlying process and the utility of this approach. Thirdly, it draws parallels with systems encountered within the natural sciences by noting the emergence of self-organised criticality, commonly found within natural systems. These findings strengthen the need to account for structural intricacies of a project's underlying task precedence structure as they can provide the conditions upon which large-scale catastrophes materialise.
Becker, Christian M.; Laufer, Marc R.; Stratton, Pamela; Hummelshoj, Lone; Missmer, Stacey A.; Zondervan, Krina T.; Adamson, G. David; Adamson, G.D.; Allaire, C.; Anchan, R.; Becker, C.M.; Bedaiwy, M.A.; Buck Louis, G.M.; Calhaz-Jorge, C.; Chwalisz, K.; D'Hooghe, T.M.; Fassbender, A.; Faustmann, T.; Fazleabas, A.T.; Flores, I.; Forman, A.; Fraser, I.; Giudice, L.C.; Gotte, M.; Gregersen, P.; Guo, S.-W.; Harada, T.; Hartwell, D.; Horne, A.W.; Hull, M.L.; Hummelshoj, L.; Ibrahim, M.G.; Kiesel, L.; Laufer, M.R.; Machens, K.; Mechsner, S.; Missmer, S.A.; Montgomery, G.W.; Nap, A.; Nyegaard, M.; Osteen, K.G.; Petta, C.A.; Rahmioglu, N.; Renner, S.P.; Riedlinger, J.; Roehrich, S.; Rogers, P.A.; Rombauts, L.; Salumets, A.; Saridogan, E.; Seckin, T.; Stratton, P.; Sharpe-Timms, K.L.; Tworoger, S.; Vigano, P.; Vincent, K.; Vitonis, A.F.; Wienhues-Thelen, U.-H.; Yeung, P.P.; Yong, P.; Zondervan, K.T.
2014-01-01
Objective To standardize the recording of surgical phenotypic information on endometriosis and related sample collections obtained at laparoscopy, allowing large-scale collaborative research into the condition. Design An international collaboration involving 34 clinical/academic centers and three industry collaborators from 16 countries. Setting Two workshops were conducted in 2013, bringing together 54 clinical, academic, and industry leaders in endometriosis research and management worldwide. Patient(s) None. Intervention(s) A postsurgical scoring sheet containing general and gynecological patient and procedural information, extent of disease, the location and type of endometriotic lesion, and any other findings was developed during several rounds of review. Comments and any systematic surgical data collection tools used in the reviewers' centers were incorporated. Main Outcome Measure(s) The development of a standard recommended (SSF) and minimum required (MSF) form to collect data on the surgical phenotype of endometriosis. Result(s) SSF and MSF include detailed descriptions of lesions, modes of procedures and sample collection, comorbidities, and potential residual disease at the end of surgery, along with previously published instruments such as the revised American Society for Reproductive Medicine and Endometriosis Fertility Index classification tools for comparison and validation. Conclusion(s) This is the first multicenter, international collaboration between academic centers and industry addressing standardization of phenotypic data collection for a specific disease. The Endometriosis Phenome and Biobanking Harmonisation Project SSF and MSF are essential tools to increase our understanding of the pathogenesis of endometriosis by allowing large-scale collaborative research into the condition. PMID:25150390
Bionimbus: a cloud for managing, analyzing and sharing large genomics datasets.
Heath, Allison P; Greenway, Matthew; Powell, Raymond; Spring, Jonathan; Suarez, Rafael; Hanley, David; Bandlamudi, Chai; McNerney, Megan E; White, Kevin P; Grossman, Robert L
2014-01-01
As large genomics and phenotypic datasets become more common, it is increasingly difficult for most researchers to access, manage, and analyze them. One possible approach is to provide the research community with several petabyte-scale cloud-based computing platforms containing these data, along with tools and resources to analyze them. Bionimbus is an open-source cloud-computing platform based primarily upon OpenStack, which manages on-demand virtual machines that provide the required computational resources, and GlusterFS, a high-performance clustered file system. Bionimbus also includes Tukey, a portal and associated middleware that provides a single entry point and single sign-on for the various Bionimbus resources, and Yates, which automates the installation, configuration, and maintenance of the required software infrastructure. Bionimbus is used by a variety of projects to process genomics and phenotypic data. For example, it is used by an acute myeloid leukemia resequencing project at the University of Chicago. The project requires several computational pipelines, including pipelines for quality control, alignment, variant calling, and annotation. For each sample, the alignment step requires eight CPUs for about 12 h, and BAM file sizes ranged from 5 GB to 10 GB per sample. Most members of the research community have difficulty downloading large genomics datasets and obtaining sufficient storage and computing resources to manage and analyze the data. Cloud-computing platforms such as Bionimbus, with data commons that contain large genomics datasets, are one choice for broadening access to research data in genomics. Published by the BMJ Publishing Group Limited.
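The per-sample resource figures quoted above (eight CPUs for about 12 h of alignment, 5-10 GB of BAM output) support a quick back-of-envelope sizing of a resequencing campaign. A sketch, with the sample count as an assumption:

```python
def campaign_estimate(n_samples, cpus_per_sample=8, hours_per_sample=12,
                      bam_gb_low=5, bam_gb_high=10):
    """Rough CPU-hour and BAM-storage budget for an alignment campaign.

    Defaults mirror the per-sample figures quoted in the abstract.
    Returns (cpu_hours, (storage_gb_low, storage_gb_high)).
    """
    cpu_hours = n_samples * cpus_per_sample * hours_per_sample
    storage_gb = (n_samples * bam_gb_low, n_samples * bam_gb_high)
    return cpu_hours, storage_gb

# A hypothetical 100-sample cohort: 9600 CPU-hours, 500-1000 GB of BAMs.
cpu_hours, (gb_low, gb_high) = campaign_estimate(100)
```

Estimates like this are exactly what drives the choice between local clusters and an on-demand platform such as Bionimbus.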
Huang, Zhenzhen; Duan, Huilong; Li, Haomin
2015-01-01
Large-scale human cancer genomics projects, such as TCGA, have generated large volumes of genomic data for further study. Exploring and mining these data to obtain meaningful analysis results can help researchers find potential genomic alterations that influence the development and metastasis of tumors. We developed a web-based gene analysis platform, named TCGA4U, which uses statistical methods and models to help translational investigators explore, mine, and visualize human cancer genomic characteristics from the TCGA datasets. Furthermore, through Gene Ontology (GO) annotation and clinical data integration, the genomic data were transformed into biological process, molecular function, and cellular component annotations and survival curves to help researchers identify potential driver genes. Clinical researchers without expertise in data analysis will benefit from such a user-friendly genomic analysis platform.
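The survival curves mentioned here are typically Kaplan-Meier estimates computed from patient follow-up times and event indicators. As an illustration of the underlying computation only (not TCGA4U's actual code), a from-scratch estimator might look like:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve.

    times: follow-up time per patient; events: 1 = death observed,
    0 = censored. Returns (time, survival probability) pairs at each
    time where at least one death occurred.
    """
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk, surv, curve = len(times), 1.0, []
    i = 0
    while i < len(order):
        t = times[order[i]]
        n = at_risk  # patients still at risk just before time t
        deaths = 0
        while i < len(order) and times[order[i]] == t:
            deaths += events[order[i]]
            at_risk -= 1
            i += 1
        if deaths:
            surv *= 1 - deaths / n
            curve.append((t, surv))
    return curve
```

Comparing curves between patient groups stratified by a genomic alteration is the usual route to flagging potential driver genes.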
Using real options to evaluate the flexibility in the deployment of SMR
DOE Office of Scientific and Technical Information (OSTI.GOV)
Locatelli, G.; Mancini, M.; Ruiz, F.
2012-07-01
According to recent estimates, the financial gap between Large Reactors (LRs) and Small Medium Reactors (SMRs) is not as large as the economy of scale would suggest, so SMRs are likely to be important players in the worldwide nuclear renaissance. POLIMI's INCAS model has been developed to compare investment in SMRs with investment in LRs. It provides the value of the IRR (Internal Rate of Return), NPV (Net Present Value), LUEC (Levelized Unitary Electricity Cost), up-front investment, etc. The aim of this research is to integrate the current INCAS model, based on discounted cash flows, with real option theory to measure the investor's flexibility to expand, defer, or abandon a nuclear project under future uncertainties. The work compares the investment in a large nuclear power plant with a series of smaller, modular nuclear power plants on the same site. It thus weighs the benefits of the large power plant, arising from the economy of scale, against the benefit of the modular project (flexibility), concluding that managerial flexibility can be measured and used by an investor to face investment risks. (authors)
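The discounted-cash-flow quantities INCAS reports (NPV, IRR) are straightforward to compute. The sketch below is not the INCAS model itself: the cash flows are entirely hypothetical (in millions, one figure per year), and the IRR is found by simple bisection:

```python
def npv(rate, cash_flows):
    """Net present value; cash_flows holds one figure per year, year 0 first."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def irr(cash_flows, lo=-0.99, hi=1.0):
    """Internal rate of return via bisection.

    Assumes NPV falls monotonically as the rate rises over [lo, hi],
    which holds for an up-front outlay followed by positive revenues.
    """
    for _ in range(200):
        mid = (lo + hi) / 2
        if npv(mid, cash_flows) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Hypothetical: one large plant, a big up-front outlay, 20 years of revenue.
large_plant = [-4000] + [450] * 20
rate_of_return = irr(large_plant)
```

A real-options extension, as the abstract proposes, would then value the flexibility to defer or abandon later modular units, for example with a binomial lattice on electricity prices.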
Enabling responsible public genomics.
Conley, John M; Doerr, Adam K; Vorhaus, Daniel B
2010-01-01
As scientific understandings of genetics advance, researchers require increasingly rich datasets that combine genomic data from large numbers of individuals with medical and other personal information. Linking individuals' genetic data and personal information precludes anonymity and produces medically significant information--a result not contemplated by the established legal and ethical conventions governing human genomic research. To pursue the next generation of human genomic research and commerce in a responsible fashion, scientists, lawyers, and regulators must address substantial new issues, including researchers' duties with respect to clinically significant data, the challenges to privacy presented by genomic data, the boundary between genomic research and commerce, and the practice of medicine. This Article presents a new model for understanding and addressing these new challenges--a "public genomics" premised on the idea that ethically, legally, and socially responsible genomics research requires openness, not privacy, as its organizing principle. Responsible public genomics combines the data contributed by informed and fully consenting information altruists and the research potential of rich datasets in a genomic commons that is freely and globally available. This Article examines the risks and benefits of this public genomics model in the context of an ambitious genetic research project currently under way--the Personal Genome Project. This Article also (i) demonstrates that large-scale genomic projects are desirable, (ii) evaluates the risks and challenges presented by public genomics research, and (iii) determines that the current legal and regulatory regimes restrict beneficial and responsible scientific inquiry while failing to adequately protect participants. The Article concludes by proposing a modified normative and legal framework that embraces and enables a future of responsible public genomics.
NASA Space Engineering Research Center for VLSI systems design
NASA Technical Reports Server (NTRS)
1991-01-01
This annual review reports the center's activities and findings on very large scale integration (VLSI) systems design for 1990, including project status, financial support, publications, the NASA Space Engineering Research Center (SERC) Symposium on VLSI Design, research results, and outreach programs. Processor chips completed or under development are listed. Research results summarized include a design technique to harden complementary metal oxide semiconductors (CMOS) memory circuits against single event upset (SEU); improved circuit design procedures; and advances in computer aided design (CAD), communications, computer architectures, and reliability design. Also described is a high school teacher program that exposes teachers to the fundamentals of digital logic design.
ERIC Educational Resources Information Center
Vincent, Jack E.
This monograph presents findings from an analysis of data on international conflict over a three-year period. Part of a large scale research project to test various theories with regard to their power in analyzing international relations, this monograph presents the computer printout of data on the application of discriminant function analysis of…
ERIC Educational Resources Information Center
Vincent, Jack E.
This monograph presents findings from an analysis of data on international conflict over a three-year period. Computer printout of the analysis is included. Part of a large scale research project to test various theories with regard to their ability to analyze international relations, this monograph reports on the testing of relative status field…
ERIC Educational Resources Information Center
Vincent, Jack E.
This monograph presents findings from an analysis of data on international cooperation over a three-year period. Part of a large scale research project to test various theories with regard to their ability to analyze international relations, this monograph reports on the testing of relative status field theory on WEIS conflict data for 1966-1969…
ERIC Educational Resources Information Center
Vincent, Jack E.
This monograph presents findings from an analysis of data on international cooperation over a three-year period. Part of a large scale research project to test various theories with regard to their power in analyzing international relations, this monograph reports on the testing of relative status field theory on WEIS conflict data for 1966-1969…
ERIC Educational Resources Information Center
Vincent, Jack E.
Part of a large scale research project to test various theories with regard to their ability to analyze international relations, this computer printout presents data on the application of social field theory to patterns of conflict among nations. Social field theory implies that international relations is a field which consists of all the…
ERIC Educational Resources Information Center
Vincent, Jack E.
This monograph presents findings from an analysis of data on international cooperation over a three-year period. Part of a large scale research project to test various theories with regard to their ability to analyze international relations, this monograph reports on the testing of relative status field theory on WEIS conflict data for 1966-1969…
ERIC Educational Resources Information Center
Vincent, Jack E.
This monograph presents findings from an analysis of data on international conflict over a three-year period. Computer printout of the analysis is included. Part of a large scale research project to test various theories with regard to their ability to analyze international relations, this monograph reports on the testing of relative status field…
ERIC Educational Resources Information Center
Vincent, Jack E.
This monograph presents findings on international conflict over a three-year period. Part of a large scale research project to test various theories with regard to their power in analyzing international relations, this monograph presents a computer printout of data regarding 'topdog' behavior among nations with regard to economic development and…
ERIC Educational Resources Information Center
Vincent, Jack E.
This monograph presents findings from an analysis of data on international conflict over a three-year period. Part of a large scale research project to test various theories with regard to their ability to analyze international relations, this monograph reports on the testing of relative status field theory on WEIS conflict data for 1966-1969 for…
ERIC Educational Resources Information Center
Vincent, Jack E.
This monograph presents findings from an analysis of data on international cooperation over a three-year period. Part of a large scale research project to test various theories with regard to their power in analyzing international relations, this monograph presents the computer printout of data on the application of discriminant function analysis…
ERIC Educational Resources Information Center
Vincent, Jack E.
This monograph presents findings from an analysis of data on international cooperation over a three-year period. Part of a large scale research project to test various theories with regard to their ability to analyze international relations, this monograph reports on the testing of relative status field theory on WEIS conflict data for 1966-1969…
ERIC Educational Resources Information Center
Buckley, Barbara C.; Gobert, Janice D.; Kindfield, Ann C. H.; Horwitz, Paul; Tinker, Robert F.; Gerlits, Bobbi; Wilensky, Uri; Dede, Chris; Willett, John
2004-01-01
This paper describes part of a project called Modeling Across the Curriculum, a large-scale research study in 15 schools across the United States. The specific data presented and discussed in this paper are based on BioLogica, a hypermodel, interactive environment for learning genetics, which was implemented in multiple classes in…
Women Aboard Navy Ships: A Comprehensive Health and Readiness Research Project
1996-03-01
Pediatric Adolescent Endocrinology 1980; 10: 123-132. 19. Inui TS, Yourtee EL, Williamson JW. Improved outcomes in hypertension after physician...These shipboard, duty, and military life stressors and psychosocial stress outcomes need to be examined in relationship to a number of health and health...Pregnancy outcomes were investigated in large-scale surveys conducted by the Navy in 1988, 1990, and 1992 [9]. The findings indicated that military
ERIC Educational Resources Information Center
Waxman, Hersh C.; Padron, Yolanda N.; Lee, Yuan-Hsuan
2010-01-01
The No Child Left Behind Act (NCLB) of 2002 calls for several changes in the K-12 education system in the United States, with a focus on evidence-based educational practices. This study was part of a large-scale, 8-year research project that examined the quality of classroom instruction from three elementary schools…
NASA Technical Reports Server (NTRS)
Saunders, J. D.; Stueber, T. J.; Thomas, S. R.; Suder, K. L.; Weir, L. J.; Sanders, B. W.
2012-01-01
Status of an effort to develop Turbine Based Combined Cycle (TBCC) propulsion is described. This propulsion technology can enable reliable and reusable space launch systems. TBCC propulsion offers improved performance and safety over rocket propulsion; the potential to realize aircraft-like operations and reduced maintenance are additional benefits. Among the most critical TBCC enabling technologies are: 1) mode transition from turbine to scramjet propulsion, 2) high Mach turbine engines, and 3) TBCC integration. To address these TBCC challenges, the effort is centered on a propulsion mode transition experiment and includes analytical research. The test program, the Combined-Cycle Engine Large Scale Inlet Mode Transition Experiment (CCE LIMX), was conceived to integrate TBCC propulsion with proposed hypersonic vehicles. The goals address: (1) dual inlet operability and performance, (2) mode-transition sequences enabling a switch between turbine and scramjet flow paths, and (3) turbine engine transients during transition. Four test phases are planned, from which a database can be used both to validate design and analysis codes and to characterize operability and integration issues for TBCC propulsion. In this paper we discuss the research objectives, features of the CCE hardware and test plans, and the status of the parametric inlet characterization testing, which began in 2011. This effort is sponsored by the NASA Fundamental Aeronautics Hypersonics project.
Cogollor, José M; Rojo-Lacal, Javier; Hermsdörfer, Joachim; Arredondo Waldmeyer, Maria Teresa; Giachritsis, Christos; Armstrong, Alan; Breñosa Martinez, Jose Manuel; Bautista Loza, Doris Anabelle; Sebastián, José María
2018-01-01
Background Neurological patients who have had a stroke usually present with cognitive deficits that cause dependence in their daily living. These deficits mainly affect the performance of some of their daily activities. For that reason, stroke patients need long-term processes for their cognitive rehabilitation. Considering that classical techniques are focused on acting as guides and depend on help from therapists, significant efforts are being made to improve current methodologies and to use eHealth and Web-based architectures to implement information and communication technology (ICT) systems that achieve reliable, personalized, and home-based platforms to increase efficiency and the level of attractiveness for patients and carers. Objective The goal of this work was to provide an overview of the practices implemented for the assessment of stroke patients and cognitive rehabilitation. This study brings together traditional methods and the most recent personalized platforms based on ICT technologies and the Internet of Things. Methods A literature review was conducted by a multidisciplinary team of researchers from the engineering, psychology, and sport science fields. The systematic review focused on published scientific research, other European projects, and the most current innovative large-scale initiatives in the area. A total of 3469 results retrieved from Web of Science, 284 studies from the Journal of Medical Internet Research, and 15 European research projects from the Community Research and Development Information Service from the last 15 years were reviewed for classification and selection regarding their relevance. Results A total of 7 relevant studies on the screening of stroke patients are presented, with 6 additional methods for the analysis of kinematics and 9 studies on the execution of goal-oriented activities. Meanwhile, the classical methods of providing cognitive rehabilitation are classified into the 5 main techniques implemented.
Finally, the review has been finalized with the selection of 8 different ICT–based approaches found in scientific-technical studies, 9 European projects funded by the European Commission that offer eHealth architectures, and other large-scale activities such as smart houses and the initiative City4Age. Conclusions Stroke is one of the main causes that most negatively affect countries in the socioeconomic aspect. The design of new ICT-based systems should provide 4 main features for an efficient and personalized cognitive rehabilitation: support in the execution of complex daily tasks, automatic error detection, home-based performance, and accessibility. Only 33% of the European projects presented fulfilled those requirements at the same time. For this reason, current and future large-scale initiatives focused on eHealth and smart environments should try to solve this situation by providing more complete and sophisticated platforms. PMID:29581093
Environmental impacts of large-scale CSP plants in northwestern China.
Wu, Zhiyong; Hou, Anping; Chang, Chun; Huang, Xiang; Shi, Duoqi; Wang, Zhifeng
2014-01-01
Several concentrated solar power demonstration plants are being constructed, and a few commercial plants have been announced in northwestern China. However, the mutual impacts between concentrated solar power plants and their surrounding environments have not yet been addressed comprehensively in the literature by the parties involved in these projects. In China, these projects are especially important, as an increasing amount of low-carbon electricity needs to be generated in order to maintain current economic growth while simultaneously lessening pollution. In this study, the authors assess the potential environmental impacts of large-scale concentrated solar power plants. Specifically, the water use intensity, soil erosion, and soil temperature are quantitatively examined. It was found that, relative to traditional power generation techniques, some of the impacts are favorable, some are negative, and some need further research before they can be reasonably appraised. In quantitative terms, concentrated solar power plants consume about 4000 L MW(-1) h(-1) of water if wet cooling technology is used, and the collectors lead to soil temperature changes of between 0.5 and 4 °C; however, soil erosion is dramatically alleviated. The results of this study are helpful to decision-makers in concentrated solar power site selection and regional planning. Some conclusions of this study are also valid for large-scale photovoltaic plants.
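The quoted water intensity (about 4000 L MW⁻¹ h⁻¹ for wet cooling) translates directly into annual consumption. A quick sketch, with the plant size and capacity factor as illustrative assumptions:

```python
def annual_water_use_m3(capacity_mw, capacity_factor, litres_per_mwh=4000):
    """Yearly cooling-water consumption for a wet-cooled CSP plant.

    The 4000 L/MWh default is the intensity quoted in the abstract;
    8760 is the number of hours in a year.
    """
    mwh_per_year = capacity_mw * capacity_factor * 8760
    return mwh_per_year * litres_per_mwh / 1000  # litres -> cubic metres

# A hypothetical 100 MW plant at a 30% capacity factor generates
# 262,800 MWh/yr and so consumes roughly 1.05 million m3 of water.
water_m3 = annual_water_use_m3(100, 0.3)
```

Figures of this kind are what make water availability a first-order constraint in CSP site selection for arid regions such as northwestern China.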
Large-scale water projects in the developing world: Revisiting the past and looking to the future
NASA Astrophysics Data System (ADS)
Sivakumar, Bellie; Chen, Ji
2014-05-01
During the past half-century or so, the developing world has been witnessing a significant increase in freshwater demands due to a combination of factors, including population growth, increased food demand, improved living standards, and water quality degradation. Since there exists significant variability in rainfall and river flow in both space and time, large-scale storage and distribution of water has become a key means of meeting these increasing demands. In this regard, large dams and water transfer schemes (including river-linking schemes and virtual water trades) have been playing a key role. While the benefits of such large-scale projects in supplying water for domestic, irrigation, industrial, hydropower, recreational, and other uses, both in the countries of their development and in other countries, are undeniable, concerns about their negative impacts, such as high initial costs and damage to ecosystems (e.g. river environments and species) and the socio-economic fabric (e.g. relocation and socio-economic changes of affected people), have also been increasing in recent years. These concerns have led to serious debates on the role of large-scale water projects in the developing world and on their future, but the often one-sided nature of such debates has inevitably failed to yield fruitful outcomes thus far. The present study aims to offer a far more balanced perspective on this issue. First, it recognizes and emphasizes the need for still more large-scale water structures in the developing world in the future, due to the continuing increase in water demands, inefficiency in water use (especially in the agricultural sector), and the absence of equivalent and reliable alternatives. Next, it reviews a few important success and failure stories of large-scale water projects in the developing world (and in the developed world), in an effort to arrive at a balanced view of the future role of such projects.
Then, it discusses some major challenges in future water planning and management, with proper consideration to potential technological developments and new options. Finally, it highlights the urgent need for a broader framework that integrates the physical science-related aspects ("hard sciences") and the human science-related aspects ("soft sciences").
Sun, Ying; Huang, Yu; Li, Xiaofeng; Baldwin, Carole C; Zhou, Zhuocheng; Yan, Zhixiang; Crandall, Keith A; Zhang, Yong; Zhao, Xiaomeng; Wang, Min; Wong, Alex; Fang, Chao; Zhang, Xinhui; Huang, Hai; Lopez, Jose V; Kilfoyle, Kirk; Zhang, Yong; Ortí, Guillermo; Venkatesh, Byrappa; Shi, Qiong
2016-01-01
Ray-finned fishes (Actinopterygii) represent more than 50% of extant vertebrates and are of great evolutionary, ecological, and economic significance, but they are relatively underrepresented in 'omics studies. Increased availability of transcriptome data for these species will allow researchers to better understand changes in gene expression and to carry out functional analyses. An international project known as "Transcriptomes of 1,000 Fishes" (Fish-T1K) has been established to generate RNA-seq transcriptome sequences for 1,000 diverse species of ray-finned fishes. The first phase of this project has produced transcriptomes from more than 180 ray-finned fishes, representing 142 species and covering 51 orders and 109 families. Here we provide an overview of the goals of this project and the work done so far.
NASA: Assessments of Selected Large-Scale Projects
2011-03-01
REPORT DATE: March 2011. Acronyms: MEP, Mars Exploration Program; MIB, Mishap Investigation Board; MMRTG, Multi Mission Radioisotope Thermoelectric Generator; MMS, Magnetospheric… Fragment: "…probes designed to explore the Martian surface, to satellites equipped with advanced sensors to study the earth, to telescopes intended to explore the…"
Laurent, Olivier; Gomolka, Maria; Haylock, Richard; Blanchardon, Eric; Giussani, Augusto; Atkinson, Will; Baatout, Sarah; Bingham, Derek; Cardis, Elisabeth; Hall, Janet; Tomasek, Ladislav; Ancelet, Sophie; Badie, Christophe; Bethel, Gary; Bertho, Jean-Marc; Bouet, Ségolène; Bull, Richard; Challeton-de Vathaire, Cécile; Cockerill, Rupert; Davesne, Estelle; Ebrahimian, Teni; Engels, Hilde; Gillies, Michael; Grellier, James; Grison, Stephane; Gueguen, Yann; Hornhardt, Sabine; Ibanez, Chrystelle; Kabacik, Sylwia; Kotik, Lukas; Kreuzer, Michaela; Lebacq, Anne Laure; Marsh, James; Nosske, Dietmar; O'Hagan, Jackie; Pernot, Eileen; Puncher, Matthew; Rage, Estelle; Riddell, Tony; Roy, Laurence; Samson, Eric; Souidi, Maamar; Turner, Michelle C; Zhivin, Sergey; Laurier, Dominique
2016-06-01
The potential health impacts of chronic exposures to uranium, as they occur in occupational settings, are not well characterized. Most epidemiological studies have been limited by small sample sizes and a lack of harmonization of the methods used to quantify radiation doses resulting from uranium exposure. Experimental studies have shown that uranium has biological effects, but their implications for human health are not clear. New studies that combine the strengths of large, well-designed epidemiological datasets with those of state-of-the-art biological methods would help improve the characterization of the biological and health effects of occupational uranium exposure. The aim of the European Commission concerted action CURE (Concerted Uranium Research in Europe) was to develop protocols for such a future collaborative research project, in which dosimetry, epidemiology, and biology would be integrated to better characterize the effects of occupational uranium exposure. These protocols were developed from existing European cohorts of workers exposed to uranium, together with the epidemiological, biological, and dosimetric expertise of the CURE partner institutions. The preparatory work of CURE should allow a large-scale collaborative project to be launched, in order to better characterize the effects of uranium exposure and, more generally, of alpha particles and low doses of ionizing radiation.
NASA Astrophysics Data System (ADS)
Diiwu, J.; Silins, U.; Kevin, B.; Anderson, A.
2008-12-01
Like many areas of the Rocky Mountains, Alberta's forests on the eastern slopes of the Rockies have been shaped by decades of successful fire suppression. These forests are at high risk of fire and large-scale insect infestation, and climate change will continue to increase these risks. These headwaters forests provide the vast majority of usable surface water supplies to a large region of the province, and large-scale natural disasters can have dramatic effects on water quality and water availability. The population in the region has steadily increased, and the area is now the main source of water for many Alberta municipalities, including the City of Calgary, which has a population of over one million. In 2003 a fire burned 21,000 ha in the southern foothills area. Government land managers were concerned about the downstream implications of the fire and salvage operations, but there was very limited scientific information to guide decision making. This led to the establishment of the Southern Rockies Watershed Project, a partnership between Alberta Sustainable Resource Development, the provincial government department responsible for land management, and the University of Alberta. After five years of data collection, the project has produced quantitative information, not previously available, about the effects of fire and management interventions such as salvage logging on headwaters and regional water quality. This information can be used to make decisions on forest operations, fire suppression, and post-fire salvage operations. In the past few years this project has captured the interest of large municipalities and water treatment researchers who are keen to investigate the potential implications of large natural disturbances for large and small drinking water treatment facilities. Examples from this project will be used to highlight the challenges and successes encountered while bridging the gap between science and land management policy.
Exposing the Science in Citizen Science: Fitness to Purpose and Intentional Design.
Parrish, Julia K; Burgess, Hillary; Weltzin, Jake F; Fortson, Lucy; Wiggins, Andrea; Simmons, Brooke
2018-05-21
Citizen science is a growing phenomenon. With millions of people involved and billions of in-kind dollars contributed annually, this broad-extent, fine-grained approach to data collection should be garnering enthusiastic support in the mainstream science and higher education communities. However, many academic researchers demonstrate distinct biases against the use of citizen science as a source of rigorous information. To engage the public in scientific research, and the research community in the practice of citizen science, a mutual understanding is needed of accepted quality standards in science and of the corresponding specifics of project design and implementation when working with a broad public base. We define a science-based typology focused on the degree to which projects deliver the type(s) and quality of data/work needed to produce valid scientific outcomes directly useful in science and natural resource management. Where project intent includes direct contribution to science and the public is actively involved, either virtually or hands-on, we examine the measures of quality assurance (methods to increase data quality during the design and implementation phases of a project) and quality control (post hoc methods to increase the quality of scientific outcomes). We suggest that high-quality science can be produced with massive, largely one-off participation if data collection is simple and quality control includes algorithm voting, statistical pruning, and/or computational modeling. Small to mid-scale projects engaging participants in repeated, often complex, sampling can advance quality through expert-led training and well-designed materials, and through independent verification. Both approaches - simplification at scale and complexity with care - generate more robust science outcomes.
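The "algorithm voting" quality-control step mentioned for massive one-off participation can be sketched as a simple consensus filter over volunteer classifications. The thresholds below are illustrative assumptions, not the authors' values:

```python
from collections import Counter

def vote_label(classifications, min_votes=3, min_agreement=0.6):
    """Accept a crowd-sourced label only when enough volunteers agree.

    classifications: list of labels submitted for one subject.
    Returns the consensus label, or None when there are too few votes
    or the leading label falls below the agreement threshold.
    """
    if len(classifications) < min_votes:
        return None
    label, count = Counter(classifications).most_common(1)[0]
    if count / len(classifications) >= min_agreement:
        return label
    return None
```

Subjects that return None would be queued for more votes or expert review, which is the post hoc pruning the typology describes.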
NASA Astrophysics Data System (ADS)
Fitzgerald, Michael; Danaia, Lena; McKinnon, David H.
2017-07-01
In recent years, calls for the adoption of inquiry-based pedagogies in the science classroom have formed a part of the recommendations for large-scale high school science reforms. However, these pedagogies have been problematic to implement at scale. This research explores the perceptions of 34 positively inclined early-adopter teachers in relation to their implementation of inquiry-based pedagogies. The teachers were part of a large-scale Australian high school intervention project based around astronomy. In a series of semi-structured interviews, the teachers identified a number of common barriers that prevented them from implementing inquiry-based approaches. The most important barriers identified include the extreme time restrictions on all scales, the poverty of their common professional development experiences, their lack of good models and definitions for what inquiry-based teaching actually is, and the lack of good resources enabling the capacity for change. Implications for expectations of teachers and their professional learning during educational reform and curriculum change are discussed.
Wilcox, S.; Andreas, A.
2010-03-16
The U.S. Department of Energy's National Renewable Energy Laboratory collaborates with the solar industry to establish high quality solar and meteorological measurements. This Solar Resource and Meteorological Assessment Project (SOLRMAP) provides high quality measurements to support deployment of power projects in the United States. The no-funds-exchanged collaboration brings NREL solar resource assessment expertise together with industry needs for measurements. The end result is high quality data sets to support the financing, design, and monitoring of large scale solar power projects for industry in addition to research-quality data for NREL model development. NREL provides consultation for instrumentation and station deployment, along with instrument calibrations, data acquisition, quality assessment, data distribution, and summary reports. Industry participants provide equipment, infrastructure, and station maintenance.
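The quality assessment that the SOLRMAP abstract mentions can be sketched as a physical-limits screen on irradiance readings. The function and thresholds below are a simplified assumption modeled loosely on BSRN-style "physically possible" limits, not NREL's actual QA procedure.

```python
import math

SOLAR_CONSTANT = 1361.0  # W/m^2, nominal top-of-atmosphere value

def flag_ghi(ghi, zenith_deg):
    """Screen a global horizontal irradiance (GHI) reading, in W/m^2,
    against a simple physically-possible envelope.

    The envelope (1.5 * S0 * cos(z)^1.2 + 100) and the small negative
    night-time allowance follow BSRN-style limits; this is an
    illustrative check, not the SOLRMAP quality-assessment procedure.
    """
    mu0 = max(math.cos(math.radians(zenith_deg)), 0.0)
    upper = SOLAR_CONSTANT * 1.5 * mu0 ** 1.2 + 100.0
    if ghi < -4.0:
        return "bad_low"   # beyond typical sensor zero-offset at night
    if ghi > upper:
        return "bad_high"  # exceeds physically possible irradiance
    return "ok"
```

A production screen would add climatologically "rare" limits and cross-instrument comparisons (GHI versus direct plus diffuse), which this range check omits.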
Stoffel, T.; Andreas, A.
2010-04-26
Wilcox, S.; Andreas, A.
2010-07-13
Wilcox, S.; Andreas, A.
2012-11-03
Solar Resource & Meteorological Assessment Project (SOLRMAP): Sun Spot Two; Swink, Colorado (Data)
Wilcox, S.; Andreas, A.
2010-11-10
Wilcox, S.; Andreas, A.
2010-07-14
Wilcox, S.; Andreas, A.
2009-07-22
Wilcox, S.; Andreas, A.
2010-11-03
NASA's Hypersonic Research Engine Project: A review
NASA Technical Reports Server (NTRS)
Andrews, Earl H.; Mackley, Ernest A.
1994-01-01
The goals of the NASA Hypersonic Research Engine (HRE) Project, which began in 1964, were to design, develop, and construct a high-performance hypersonic research ramjet/scramjet engine for flight tests of the developed concept over the speed range of Mach 4 to 8. The project was planned to be accomplished in three phases: project definition, research engine development, and flight test using the X-15A-2 research airplane, which was modified to carry hydrogen fuel for the research engine. The project goal of an engine flight test was eliminated when the X-15 program was canceled in 1968. Ground tests of full-scale engine models then became the focus of the project. Two axisymmetric full-scale engine models, having 18-inch-diameter cowls, were fabricated and tested: a structural model and combustion/propulsion model. A brief historical review of the project, with salient features, typical data results, and lessons learned, is presented. An extensive number of documents were generated during the HRE Project and are listed.
Experimental plasma research project summaries
NASA Astrophysics Data System (ADS)
1992-06-01
This is the latest in a series of Project Summary books that date back to 1976. It is the first after a hiatus of several years. They are published to provide a short description of each project supported by the Experimental Plasma Research Branch of the Division of Applied Plasma Physics in the Office of Fusion Energy. The Experimental Plasma Research Branch seeks to provide a broad range of experimental data, physics understanding, and new experimental techniques that contribute to operation, interpretation, and improvement of high temperature plasma as a source of fusion energy. In pursuit of these objectives, the branch supports research at universities, DOE laboratories, other federal laboratories, and industry. About 70 percent of the funds expended are spent at universities and a significant function of this program is the training of students in fusion physics. The branch supports small- and medium-scale experimental studies directly related to specific critical plasma issues of the magnetic fusion program. Plasma physics experiments are conducted on transport of particles and energy within plasma. Additionally, innovative approaches for operating, controlling, and heating plasma are evaluated for application to the larger confinement devices of the magnetic fusion program. New diagnostic approaches to measuring the properties of high temperature plasmas are developed to the point where they can be applied with confidence on the large-scale confinement experiments. Atomic data necessary for impurity control, interpretation of diagnostic data, development of heating devices, and analysis of cooling by impurity ion radiation are obtained. The project summaries are grouped into the three categories of plasma physics, diagnostic development, and atomic physics.
Overview of Opportunities for Co-Location of Solar Energy Technologies and Vegetation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Macknick, Jordan; Beatty, Brenda; Hill, Graham
2013-12-01
Large-scale solar facilities have the potential to contribute significantly to national electricity production. Many solar installations are large-scale or utility-scale, with a capacity over 1 MW and connected directly to the electric grid. Large-scale solar facilities offer an opportunity to achieve economies of scale in solar deployment, yet there have been concerns about the amount of land required for solar projects and the impact of solar projects on local habitat. During the site preparation phase for utility-scale solar facilities, developers often grade land and remove all vegetation to minimize installation and operational costs, prevent plants from shading panels, and minimize potential fire or wildlife risks. However, the common site preparation practice of removing vegetation can be avoided in certain circumstances, and there have been successful examples where solar facilities have been co-located with agricultural operations or have native vegetation growing beneath the panels. In this study we outline some of the impacts that large-scale solar facilities can have on the local environment, provide examples of installations where impacts have been minimized through co-location with vegetation, characterize the types of co-location, and give an overview of the potential benefits from co-location of solar energy projects and vegetation. The varieties of co-location can be replicated or modified for site-specific use at other solar energy installations around the world. We conclude with opportunities to improve upon our understanding of ways to reduce the environmental impacts of large-scale solar installations.
ERIC Educational Resources Information Center
Grabowski, Barbara L.; Koszalka, Tiffany A.
Combining assessment and research components on a large development and research project is a complex task. There are many descriptions of how either assessment or research should be conducted, but detailed examples illustrating integration of such strategies in complex projects are scarce. This paper provides definitions of assessment,…
RENEB - Running the European Network of biological dosimetry and physical retrospective dosimetry.
Kulka, Ulrike; Abend, Michael; Ainsbury, Elizabeth; Badie, Christophe; Barquinero, Joan Francesc; Barrios, Lleonard; Beinke, Christina; Bortolin, Emanuela; Cucu, Alexandra; De Amicis, Andrea; Domínguez, Inmaculada; Fattibene, Paola; Frøvig, Anne Marie; Gregoire, Eric; Guogyte, Kamile; Hadjidekova, Valeria; Jaworska, Alicja; Kriehuber, Ralf; Lindholm, Carita; Lloyd, David; Lumniczky, Katalin; Lyng, Fiona; Meschini, Roberta; Mörtl, Simone; Della Monaca, Sara; Monteiro Gil, Octávia; Montoro, Alegria; Moquet, Jayne; Moreno, Mercedes; Oestreicher, Ursula; Palitti, Fabrizio; Pantelias, Gabriel; Patrono, Clarice; Piqueret-Stephan, Laure; Port, Matthias; Prieto, María Jesus; Quintens, Roel; Ricoul, Michelle; Romm, Horst; Roy, Laurence; Sáfrány, Géza; Sabatier, Laure; Sebastià, Natividad; Sommer, Sylwester; Terzoudi, Georgia; Testa, Antonella; Thierens, Hubert; Turai, Istvan; Trompier, François; Valente, Marco; Vaz, Pedro; Voisin, Philippe; Vral, Anne; Woda, Clemens; Zafiropoulos, Demetre; Wojcik, Andrzej
2017-01-01
A European network was initiated in 2012 by 23 partners from 16 European countries with the aim to significantly increase individualized dose reconstruction in case of large-scale radiological emergency scenarios. The network was built on three complementary pillars: (1) an operational basis with seven biological and physical dosimetric assays in ready-to-use mode, (2) a basis for education, training and quality assurance, and (3) a basis for further network development regarding new techniques and members. Techniques for individual dose estimation based on biological samples and/or inert personalized devices such as mobile phones or smartphones were optimized to support rapid categorization of many potential victims according to the received dose to the blood or personal devices. Communication and cross-border collaboration were also standardized. To assure long-term sustainability of the network, cooperation with national and international emergency preparedness organizations was initiated and links to radiation protection and research platforms have been developed. A legal framework, based on a Memorandum of Understanding, was established and signed by 27 organizations by the end of 2015. RENEB is a European Network of biological and physical-retrospective dosimetry, with the capacity and capability to perform large-scale rapid individualized dose estimation. Specialized to handle large numbers of samples, RENEB is able to contribute to radiological emergency preparedness and wider large-scale research projects.
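The individualized dose estimation that RENEB performs from biological samples is commonly done by inverting a linear-quadratic dose-response curve for dicentric chromosome yield, Y = c + aD + bD^2. A minimal sketch of that inversion follows; the coefficient values are illustrative placeholders, not a calibrated RENEB curve, and real assays propagate uncertainty that this omits.

```python
import math

def estimate_dose(yield_per_cell, c=0.001, alpha=0.02, beta=0.06):
    """Invert a linear-quadratic dicentric dose-response
    Y = c + alpha*D + beta*D^2 to get the absorbed dose D (Gy).

    Solves the quadratic for its non-negative root. The default
    coefficients are illustrative placeholders, not calibration data.
    """
    y = yield_per_cell - c  # dicentrics per cell above background
    if y <= 0:
        return 0.0  # at or below background yield
    return (-alpha + math.sqrt(alpha ** 2 + 4 * beta * y)) / (2 * beta)
```

Triage during a mass-casualty event would bin these estimates into broad dose categories (e.g. below/above a clinical threshold) rather than report point values, which is the rapid categorization the abstract describes.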
Collaborative Working for Large Digitisation Projects
ERIC Educational Resources Information Center
Yeates, Robin; Guy, Damon
2006-01-01
Purpose: To explore the effectiveness of large-scale consortia for disseminating local heritage via the web. To describe the creation of a large geographically based cultural heritage consortium in the South East of England and management lessons resulting from a major web site digitisation project. To encourage the improved sharing of experience…
The Tuskegee Legacy Project: willingness of minorities to participate in biomedical research.
Katz, Ralph V; Kegeles, S Steven; Kressin, Nancy R; Green, B Lee; Wang, Min Qi; James, Sherman A; Russell, Stefanie Luise; Claudio, Cristina
2006-11-01
The broad goal of the Tuskegee Legacy Project (TLP) study was to address, and understand, a range of issues related to the recruitment and retention of Blacks and other minorities in biomedical research studies. The specific aim of this analysis was to compare the self-reported willingness of Blacks, Hispanics, and Whites to participate as research subjects in biomedical studies, as measured by the Likelihood of Participation (LOP) Scale and the Guinea Pig Fear Factor (GPFF) Scale. The Tuskegee Legacy Project Questionnaire, a 60-item instrument, was administered to 1,133 adult Blacks, Hispanics, and non-Hispanic Whites in 4 U.S. cities. The findings revealed no difference in self-reported willingness to participate in biomedical research, as measured by the LOP Scale, among Blacks, Hispanics, and Whites, despite Blacks being 1.8 times as likely as Whites to have a higher fear of participation in biomedical research on the GPFF Scale.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Call, Justin
This contract report is one of a series of reports that document implementation components of the Bonneville Power Administration's (BPA) funded project: Integrated Status and Effectiveness Monitoring Program (ISEMP - BPA project No. 2003-017-00, Chris Jordan, NOAA-NWFSC project sponsor). Other components of the project are separately reported, as explained below. The ISEMP project has been created as a cost effective means of developing protocols and new technologies, novel indicators, sample designs, analytical data management, communication tools and skills, and restoration experiments that support the development of a region-wide Research, Monitoring, and Evaluation (RME) program to assess the status of anadromous salmonids populations, their tributary habitat and restoration and management actions. The most straightforward approach to developing a regional-scale monitoring and evaluation program would be to increase standardization among status and trend monitoring programs. However, the diversity of species and their habitat, as well as the overwhelming uncertainty surrounding indicators, metrics, and data interpretation methods requires the testing of multiple approaches. Thus, ISEMP has adopted an approach to develop a broad template that may differ in the details among subbasins, but one that will ultimately lead to the formation of a unified RME process for the management of anadromous salmonid populations and habitat across the Columbia River Basin. ISEMP has been initiated in three pilot areas, the Wenatchee/Entiat, John Day, and Salmon. To balance replicating experimental approaches with the goal of developing monitoring and evaluation tools that apply as broadly as possible across the Pacific Northwest, these subbasins were chosen as representative of a wide range of potential challenges and conditions, e.g., differing fish species composition and life histories, ecoregions, institutional settings, and existing data.
ISEMP has constructed a framework that builds on current status and trend monitoring infrastructures in these pilot subbasins, but challenges current programs by testing alternative monitoring approaches. In addition, the ISEMP is: (1) Collecting information over a hierarchy of spatial scales, allowing for a greater flexibility of data aggregation for multi-scale recovery planning assessments, and (2) Designing methods that: (a) Identify factors limiting fish production in watersheds; (b) Determine restoration actions to address these problems; (c) Implement actions as a large-scale experiment (e.g. Before After Control Impact, or BACI design), and (d) Implement intensive monitoring and research to evaluate the action's success. The intent of the ISEMP project is to design monitoring programs that can efficiently collect information to address multiple management objectives over a broad range of scales. This includes: Evaluating the status of anadromous salmonids and their habitat; Identifying opportunities to restore habitat function and fish performance, and Evaluating the benefits of the actions to the fish populations across the Columbia River Basin. The multi-scale nature of this goal requires the standardization of protocols and sampling designs that are statistically valid and powerful, properties that are currently inconsistent across the multiple monitoring programs in the region. Other aspects of the program will aid in the ability to extrapolate information beyond the study area, such as research to elucidate causal mechanisms, and a classification of watersheds throughout the Columbia River Basin. Obviously, the scale of the problem is immense and the ISEMP does not claim to be the only program working towards this goal. 
As such, ISEMP relies heavily on the basin's current monitoring infrastructure to test and develop monitoring strategies, while acting as a coordinating body and providing support for key elements such as data management and technical analyses. The ISEMP also ensures that monitoring programs can address large-scale management objectives (resulting largely from the ESA) through these local efforts. While the ISEMP maintains a regional focus it also returns the necessary information to aid in management at the smaller spatial scales (individual projects) where manipulations (e.g., habitat restoration actions) actually occur. The work captured in this report is a component of the overall ISEMP, and while it stands alone as an important contribution to the management of anadromous salmonids and their habitat, it also plays a key role within ISEMP. Each component of work within ISEMP is reported on individually, as is done here, and in annual and triennial summary reports that present all of the overall project components in their programmatic context and show how the data and tools developed can be applied to the development of regionally consistent, efficient and effective Research, Monitoring and Evaluation.
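The Before-After-Control-Impact (BACI) design mentioned in this record estimates a restoration effect as the change at the impacted site beyond the change at the control site (a difference-in-differences). A minimal sketch under that interpretation, with hypothetical sample means rather than real monitoring data:

```python
def baci_effect(control_before, control_after, impact_before, impact_after):
    """Simple BACI effect estimate from four samples of a response metric:
    (impact-site change) minus (control-site change).

    A positive value suggests the action changed the impact site beyond
    background variation captured at the control. Inference in practice
    uses a model with a Before/After x Control/Impact interaction term.
    """
    mean = lambda xs: sum(xs) / len(xs)
    impact_change = mean(impact_after) - mean(impact_before)
    control_change = mean(control_after) - mean(control_before)
    return impact_change - control_change
```

For example, if the control site improves by 1 unit over the study period while the impact site improves by 4, the estimated treatment effect is 3 units.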
High Resolution Model Intercomparison Project (HighResMIP v1.0) for CMIP6
NASA Astrophysics Data System (ADS)
Haarsma, Reindert J.; Roberts, Malcolm J.; Vidale, Pier Luigi; Senior, Catherine A.; Bellucci, Alessio; Bao, Qing; Chang, Ping; Corti, Susanna; Fučkar, Neven S.; Guemas, Virginie; von Hardenberg, Jost; Hazeleger, Wilco; Kodama, Chihiro; Koenigk, Torben; Leung, L. Ruby; Lu, Jian; Luo, Jing-Jia; Mao, Jiafu; Mizielinski, Matthew S.; Mizuta, Ryo; Nobre, Paulo; Satoh, Masaki; Scoccimarro, Enrico; Semmler, Tido; Small, Justin; von Storch, Jin-Song
2016-11-01
Robust projections and predictions of climate variability and change, particularly at regional scales, rely on the driving processes being represented with fidelity in model simulations. The role of enhanced horizontal resolution in improved process representation in all components of the climate system is of growing interest, particularly as some recent simulations suggest both the possibility of significant changes in large-scale aspects of circulation as well as improvements in small-scale processes and extremes. However, such high-resolution global simulations at climate timescales, with resolutions of at least 50 km in the atmosphere and 0.25° in the ocean, have been performed at relatively few research centres and generally without overall coordination, primarily due to their computational cost. Assessing the robustness of the response of simulated climate to model resolution requires a large multi-model ensemble using a coordinated set of experiments. The Coupled Model Intercomparison Project 6 (CMIP6) is the ideal framework within which to conduct such a study, due to the strong link to models being developed for the CMIP DECK experiments and other model intercomparison projects (MIPs). Increases in high-performance computing (HPC) resources, as well as the revised experimental design for CMIP6, now enable a detailed investigation of the impact of increased resolution up to synoptic weather scales on the simulated mean climate and its variability. The High Resolution Model Intercomparison Project (HighResMIP) presented in this paper applies, for the first time, a multi-model approach to the systematic investigation of the impact of horizontal resolution. A coordinated set of experiments has been designed to assess both a standard and an enhanced horizontal-resolution simulation in the atmosphere and ocean. 
The set of HighResMIP experiments is divided into three tiers consisting of atmosphere-only and coupled runs and spanning the period 1950-2050, with the possibility of extending to 2100, together with some additional targeted experiments. This paper describes the experimental set-up of HighResMIP, the analysis plan, the connection with the other CMIP6 endorsed MIPs, as well as the DECK and CMIP6 historical simulations. HighResMIP thereby focuses on one of the CMIP6 broad questions, "what are the origins and consequences of systematic model biases?", but we also discuss how it addresses the World Climate Research Program (WCRP) grand challenges.
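The abstract's point that high-resolution simulation is limited "primarily due to their computational cost" follows from a standard back-of-envelope scaling: refining horizontal grid spacing multiplies the number of grid points in each horizontal dimension, and a CFL-limited timestep must shrink proportionally. The sketch below encodes that rough rule; it ignores vertical levels, I/O, and solver details, and is not a formula from the HighResMIP protocol.

```python
def refinement_cost_factor(ratio, horizontal_dims=2, cfl_timestep=True):
    """Rough cost multiplier for refining grid spacing by `ratio`
    (e.g. ratio=2 means halving the spacing).

    Grid points scale with `ratio` per horizontal dimension; a
    CFL-limited timestep contributes one further factor of `ratio`.
    Back-of-envelope only: vertical resolution and I/O are ignored.
    """
    factor = ratio ** horizontal_dims
    if cfl_timestep:
        factor *= ratio
    return factor
```

By this rule, halving atmospheric grid spacing costs roughly 8x the compute, which is why coordinated multi-model ensembles at ~50 km and 0.25 degree resolution only became feasible with the HPC growth the abstract cites.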
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fix, N. J.
The purpose of the project is to conduct research at an Integrated Field-Scale Research Challenge Site in the Hanford Site 300 Area, CERCLA OU 300-FF-5 (Figure 1), to investigate multi-scale mass transfer processes associated with a subsurface uranium plume impacting both the vadose zone and groundwater. The project will investigate a series of science questions posed for research related to the effect of spatial heterogeneities, the importance of scale, coupled interactions between biogeochemical, hydrologic, and mass transfer processes, and measurements/approaches needed to characterize a mass-transfer dominated system. The research will be conducted by evaluating three different hypotheses focused on multi-scale mass transfer processes in the vadose zone and groundwater, their influence on field-scale U(VI) biogeochemistry and transport, and their implications to natural systems and remediation. The project also includes goals to 1) provide relevant materials and field experimental opportunities for other ERSD researchers and 2) generate a lasting, accessible, and high-quality field experimental database that can be used by the scientific community for testing and validation of new conceptual and numerical models of subsurface reactive transport.
Large-Scale Aerosol Modeling and Analysis
2009-09-30
Modeling of Burning Emissions (FLAMBE) project, and other related parameters. Our plans to embed NAAPS inside NOGAPS may need to be put on hold...AOD, FLAMBE and FAROP at FNMOC are supported by 6.4 funding from PMW-120 for "Large-scale Atmospheric Models", "Small-scale Atmospheric Models
NASA Astrophysics Data System (ADS)
Beichner, Robert
2016-03-01
The Student-Centered Active Learning Environment with Upside-down Pedagogies (SCALE-UP) Project combines curricula and a specially-designed instructional space to enhance learning. SCALE-UP students practice communication and teamwork skills while performing activities that enhance their conceptual understanding and problem solving skills. This can be done with small or large classes and has been implemented at more than 250 institutions. Educational research indicates that students should collaborate on interesting tasks and be deeply involved with the material they are studying. SCALE-UP classtime is spent primarily on "tangibles" and "ponderables" - hands-on measurements/observations and interesting questions. There are also computer simulations (called "visibles") and hypothesis-driven labs. Students sit at tables designed to facilitate group interactions. Instructors circulate and engage in Socratic dialogues. The setting looks like a banquet hall, with lively interactions nearly all the time. Impressive learning gains have been measured at institutions across the US and internationally. This talk describes today's students, how lecturing got started, what happens in a SCALE-UP classroom, and how the approach has spread. The SCALE-UP project has greatly benefitted from numerous Grants made by NSF and FIPSE to NCSU and other institutions.
NASA Astrophysics Data System (ADS)
Favali, Paolo; Beranzoli, Laura; Best, Mairi; Franceschini, PierLuigi; Materia, Paola; Peppoloni, Silvia; Picard, John
2014-05-01
EMSO (European Multidisciplinary Seafloor and Water Column Observatory) is a large-scale European Research Infrastructure (RI). It is a geographically distributed infrastructure composed of several deep-seafloor and water-column observatories, which will be deployed at key sites in European waters, spanning from the Arctic, through the Atlantic and Mediterranean, to the Black Sea, with the basic scientific objective of real-time, long-term monitoring of environmental processes related to the interaction between the geosphere, biosphere and hydrosphere. EMSO is one of the environmental RIs on the ESFRI roadmap. The ESFRI Roadmap identifies new RIs of pan-European importance that correspond to the long-term needs of European research communities. EMSO will be the sub-sea segment of the EU's large-scale Earth Observation program, Copernicus (previously known as GMES - Global Monitoring for Environment and Security) and will significantly enhance the observational capabilities of European member states. An open data policy compliant with the recommendations being developed within the GEOSS initiative (Global Earth Observation System of Systems) will allow for shared use of the infrastructure and the exchange of scientific information and knowledge. The processes that occur in the oceans have a direct impact on human societies, therefore it is crucial to improve our understanding of how they operate and interact.
To encompass the breadth of these major processes, sustained and integrated observations are required that appreciate the interconnectedness of atmospheric, surface ocean, biological pump, deep-sea, and solid-Earth dynamics and that can address: • natural and anthropogenic change; • interactions between ecosystem services, biodiversity, biogeochemistry, physics, and climate; • impacts of exploration and extraction of energy, minerals, and living resources; • geo-hazard early warning capability for earthquakes, tsunamis, gas-hydrate release, and slope instability and failure; • connecting scientific outcomes to stakeholders and policy makers, including government decision-makers. The development of large research infrastructure initiatives like EMSO must continuously take into account wide-reaching environmental and socio-economic implications and objectives. For this reason, an Ethics Committee was established early in EMSO's initial Preparatory Phase, with responsibility for overseeing the key ethical and social aspects of the project.
These include: • promoting inclusive science communication and data dissemination services to civil society according to Open Access principles; • guaranteeing top-quality scientific information and data as the results of top-quality research; • promoting the increased adoption of eco-friendly, sustainable technologies through the dissemination of advanced scientific knowledge and best practices to the private sector and to policy makers; • developing education strategies in cooperation with academia and industry aimed at informing and sensitizing the general public to the environmental and socio-economic implications and benefits of large research infrastructure initiatives such as EMSO; • carrying out excellent science following strict criteria of research integrity, as expressed in the Montreal Statement (2013); • promoting geoethical awareness and innovation by spurring innovative approaches in the management of environmental aspects of large research projects; • supporting technological innovation by working closely in support of SMEs; • providing a constant, qualified and authoritative one-stop reference point and advisory body for politicians and decision-makers. The paper shows how geoethics is an essential tool for guiding methodological and operational choices in the management of a European project with great impact on the environment and society.
The 300 Area Integrated Field Research Challenge Quality Assurance Project Plan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fix, N. J.
Pacific Northwest National Laboratory and a group of expert collaborators are using the U.S. Department of Energy Hanford Site 300 Area uranium plume within the footprint of the 300-FF-5 groundwater operable unit as a site for an Integrated Field-Scale Subsurface Research Challenge (IFRC). The IFRC is entitled Multi-Scale Mass Transfer Processes Controlling Natural Attenuation and Engineered Remediation: An IFRC Focused on the Hanford Site 300 Area Uranium Plume Project. The theme is investigation of multi-scale mass transfer processes. A series of forefront science questions on mass transfer are posed for research that relate to the effect of spatial heterogeneities; the importance of scale; coupled interactions between biogeochemical, hydrologic, and mass transfer processes; and measurements/approaches needed to characterize and model a mass transfer-dominated system. This Quality Assurance Project Plan provides the quality assurance requirements and processes that will be followed by the 300 Area IFRC Project. This plan is designed to be used exclusively by project staff.
Development of a large-scale transportation optimization course.
DOT National Transportation Integrated Search
2011-11-01
"In this project, a course was developed to introduce transportation and logistics applications of large-scale optimization to graduate students. This report details what similar courses exist in other universities, and the methodology used to gath...
Large-Scale 3D Printing: The Way Forward
NASA Astrophysics Data System (ADS)
Jassmi, Hamad Al; Najjar, Fady Al; Ismail Mourad, Abdel-Hamid
2018-03-01
Research on small-scale 3D printing has rapidly evolved, and numerous industrial products have been tested and successfully applied. Nonetheless, research on large-scale 3D printing, directed to large-scale applications such as construction and automotive manufacturing, still demands a great deal of effort. Large-scale 3D printing is considered an interdisciplinary topic and requires establishing a blended knowledge base from numerous research fields, including structural engineering, materials science, mechatronics, software engineering, artificial intelligence and architectural engineering. This review article summarizes key topics of relevance to new research trends on large-scale 3D printing, particularly pertaining to (1) technological solutions of additive construction (i.e. the 3D printers themselves), (2) materials science challenges, and (3) new design opportunities.
Devi, V; Abraham, R R; Adiga, A; Ramnarayan, K; Kamath, A
2010-01-01
Healthcare decision-making relies largely on evidence-based medicine; building skills in scientific reasoning and thinking among medical students therefore becomes an important part of medical education. Medical students in India have no formal path to becoming physicians, scientists or academicians. This study examines students' perceptions regarding research skills improvement after participating in the Mentored Student Project programme at Melaka Manipal Medical College, Manipal Campus, India. Additionally, this paper describes the initiatives taken for the continual improvement of the Mentored Student Project programme based on faculty and student perspectives. At Melaka Manipal Medical College, the Mentored Student Project was implemented in the curriculum during the second year of the Bachelor of Medicine and Bachelor of Surgery programme with the intention of developing research skills essential to the career development of medical students. The study design was cross-sectional. To inculcate the spirit of teamwork, students were grouped (n=3 to 5) and each group was asked to select a research project. The students' research projects were guided by their mentors. A questionnaire (five-point Likert scale) on students' perceptions regarding improvement in research skills after undertaking projects, and on the guidance received from the mentor, was administered to medical students after they had completed their Mentored Student Project. The responses of students were summarised using percentages. The median grade with inter-quartile range was reported for each item in the questionnaire. The median grade for all the items related to perceptions regarding improvement in research skills was 4, which reflects that the majority of the students felt that the Mentored Student Project had improved their research skills. The problems encountered by the students during the Mentored Student Project were related to time management and to mentors.
This study shows that students acknowledged that their research skills were improved after participating in the Mentored Student Project programme. The Mentored Student Project programme was successful in fostering positive attitudes among medical students towards scientific research. The present study also provides scope for further improvement of the Mentored Student Project programme based on students' and faculty perspectives.
Penders, Bart; Vos, Rein; Horstman, Klasien
2009-11-01
Solving complex problems in large-scale research programmes requires cooperation and division of labour. Simultaneously, large-scale problem solving also gives rise to unintended side effects. Based upon 5 years of researching two large-scale nutrigenomic research programmes, we argue that problems are fragmented in order to be solved. These sub-problems are given priority for practical reasons and, in the process of solving them, various changes are introduced in each sub-problem. Combined with additional diversity as a result of interdisciplinarity, this makes reassembling the original and overall goal of the research programme less likely. In the case of nutrigenomics and health, this produces a diversification of health. As a result, the public health goal of contemporary nutrition science is not reached in the large-scale research programmes we studied. Large-scale research programmes are very successful in producing scientific publications and new knowledge; however, they are often less successful in reaching their political goals.
The Next Level in Automated Solar Flare Forecasting: the EU FLARECAST Project
NASA Astrophysics Data System (ADS)
Georgoulis, M. K.; Bloomfield, D.; Piana, M.; Massone, A. M.; Gallagher, P.; Vilmer, N.; Pariat, E.; Buchlin, E.; Baudin, F.; Csillaghy, A.; Soldati, M.; Sathiapal, H.; Jackson, D.; Alingery, P.; Argoudelis, V.; Benvenuto, F.; Campi, C.; Florios, K.; Gontikakis, C.; Guennou, C.; Guerra, J. A.; Kontogiannis, I.; Latorre, V.; Murray, S.; Park, S. H.; Perasso, A.; Sciacchitano, F.; von Stachelski, S.; Torbica, A.; Vischi, D.
2017-12-01
We attempt an informative description of the Flare Likelihood And Region Eruption Forecasting (FLARECAST) project, the European Commission's first large-scale investment to explore the limits of reliability and accuracy achieved in the forecasting of major solar flares. We outline the consortium, top-level objectives and first results of the project, highlighting the diversity and fusion of expertise needed to deliver what was promised. The project's final product, an openly accessible, fully modular and free-to-download flare forecasting facility, will be delivered in early 2018. The project's three objectives, namely science, research-to-operations (R2O) and dissemination/communication, are also discussed: in terms of science, we encapsulate our close-to-final assessment of how close (or far) we are from practically exploitable solar flare forecasting. In terms of R2O, we briefly describe the architecture of the FLARECAST infrastructure, which includes rigorous validation for each forecasting step. Of the three different communication levers of the project, we finally focus on lessons learned from the two-way interaction with the community of stakeholders and governmental organizations. The FLARECAST project has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No. 640216.
Scientific Grid activities and PKI deployment in the Cybermedia Center, Osaka University.
Akiyama, Toyokazu; Teranishi, Yuuichi; Nozaki, Kazunori; Kato, Seiichi; Shimojo, Shinji; Peltier, Steven T; Lin, Abel; Molina, Tomas; Yang, George; Lee, David; Ellisman, Mark; Naito, Sei; Koike, Atsushi; Matsumoto, Shuichi; Yoshida, Kiyokazu; Mori, Hirotaro
2005-10-01
The Cybermedia Center (CMC), Osaka University, is a research institution that offers knowledge and technology resources obtained from advanced research in the areas of large-scale computation, information and communication, multimedia content, and education. Currently, CMC is involved in Japanese national Grid projects such as JGN II (Japan Gigabit Network), NAREGI and BioGrid. Not limited to Japan, CMC also actively takes part in international activities such as PRAGMA. In these projects and international collaborations, CMC has developed a Grid system that allows scientists to perform their analyses by remote-controlling the world's largest ultra-high-voltage electron microscope, located at Osaka University. In another undertaking, CMC has assumed a leadership role in BioGrid by sharing its experience and knowledge of system development for the area of biology. In this paper, we give an overview of the BioGrid project and introduce the progress of the Telescience unit, which collaborates with the Telescience Project led by the National Center for Microscopy and Imaging Research (NCMIR). Furthermore, CMC collaborates with seven computing centers in Japan, NAREGI and the National Institute of Informatics to deploy a PKI-based authentication infrastructure. The current status of this project and future collaboration with Grid projects are also delineated in this paper.
Volk, Carol J; Lucero, Yasmin; Barnas, Katie
2014-05-01
Increasingly, research and management in natural resource science rely on very large datasets compiled from multiple sources. While it is generally good to have more data, utilizing large, complex datasets has introduced challenges in data sharing, especially for collaborating researchers in disparate locations ("distributed research teams"). We surveyed natural resource scientists about common data-sharing problems. The major issue identified by our survey respondents (n = 118) when providing data was a lack of clarity in the data request (including the format of the data requested). When receiving data, survey respondents reported various insufficiencies in the documentation describing the data (e.g., no data collection description or protocol, or data aggregated or summarized without explanation). Since metadata, or "information about the data," is a central obstacle to efficient data handling, we suggest documenting metadata through data dictionaries, protocols, read-me files, explicit null-value documentation, and process metadata as essential to any large-scale research program. We advocate that all researchers, but especially those involved in distributed teams, alleviate these problems with the use of several readily available communication strategies, including organizational charts to define roles, data flow diagrams to outline procedures and timelines, and data update cycles to guide data-handling expectations. In particular, we argue that distributed research teams magnify data-sharing challenges, making data management training even more crucial for natural resource scientists. If natural resource scientists fail to overcome communication and metadata documentation issues, then negative data-sharing experiences will likely continue to undermine the success of many large-scale collaborative projects.
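The data dictionary and explicit null-value documentation recommended above can be made machine-readable so that documented missing-value codes are never mistaken for data. The sketch below is illustrative only; the column names, units, and null codes are hypothetical and not taken from the study:

```python
# A minimal data dictionary: for each column, record its meaning, units,
# and the code used for missing values (all field names are hypothetical).
DATA_DICTIONARY = {
    "site_id":    {"description": "Unique sampling-site code", "units": None,    "null": ""},
    "water_temp": {"description": "Water temperature",         "units": "degC",  "null": -999},
    "fish_count": {"description": "Fish counted per pass",     "units": "count", "null": -999},
}

def decode_record(record):
    """Replace documented null codes with None so they cannot be mistaken for data."""
    return {
        col: (None if record[col] == DATA_DICTIONARY[col]["null"] else record[col])
        for col in record
    }

raw = {"site_id": "WEN-03", "water_temp": -999, "fish_count": 12}
clean = decode_record(raw)
print(clean["water_temp"])  # None: -999 was a documented null code, not a reading
```

Shipping such a dictionary alongside the data gives receiving teams the collection context and null semantics the survey respondents reported as missing.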
Marler, Thomas E.; Lindström, Anders J.
2017-01-01
Conservation agencies charged with the care of threatened plant species should be governed by the concept that conservation actions should do no harm. Adaptive management research progresses in imperfect situations due to incomplete knowledge. Interpreting new experimental or observational evidence for inclusion in conservation plans should first consider the big picture by identifying collateral quandaries before scaling up the approach to large-scale implementation. We discuss a case study of Cycas micronesica conservation activities on the island of Guam. The use of large stem cuttings has been shown to be a viable approach for rescuing trees from planned construction sites. However, this artificial means of producing transplants exhibits shortcomings, some of which may add new threats to the existing plant population. Moreover, devoting funds to the use of the new technique in tree rescue projects does not address the primary threats that have led to listing under the United States Endangered Species Act (ESA). Transplanted trees will likely succumb to those ubiquitous threats shortly after the completion of a successful rescue project. Alternatively, investing conservation funds in mitigation of the primary threats could lead to removal of the species from the ESA. PMID:29260802
Rethinking Big Science. Modest, mezzo, grand science and the development of the Bevalac, 1971-1993.
Westfall, Catherine
2003-03-01
Historians of science have tended to focus exclusively on scale in investigations of large-scale research, perhaps because it has been easy to assume that comprehending a phenomenon dubbed "Big Science" hinges on an understanding of bigness. A close look at Lawrence Berkeley Laboratory's Bevalac, a medium-scale "mezzo science" project formed by uniting two preexisting machines--the modest SuperHILAC and the grand Bevatron--shows what can be gained by overcoming this preoccupation with bigness. The Bevalac story reveals how interconnections, connections, and disconnections ultimately led to the development of a new kind of science that transformed the landscape of large-scale research in the United States. Important lessons in historiography also emerge: the value of framing discussions in terms of networks, the necessity of constantly expanding and refining methodology, and the importance of avoiding the rhetoric of participants and instead finding words to tell our own stories.
NASA Astrophysics Data System (ADS)
Romaniuk, Ryszard S.
2013-10-01
Accelerator science and technology is one of the key enablers of developments in particle physics and photon physics, and also of applications in medicine and industry. The paper presents a digest of the research results in the domain of accelerator science and technology in Europe, shown during the realization of CARE (Coordinated Accelerator R&D) and EuCARD (European Coordination of Accelerator R&D) and during the national annual review meeting of TIARA - Test Infrastructure of European Research Area in Accelerator R&D. The European projects on accelerator technology started in 2003 with CARE. TIARA is a European collaboration on accelerator technology which, by running research, technical, networking and infrastructural projects, has the duty of integrating the research and technical communities and infrastructures on the European scale. The Collaboration gathers all research centers with large accelerator infrastructures. Others, like universities, are affiliated as associate members. TIARA-PP (preparatory phase) is a European infrastructural project run by this Consortium and realized inside EU FP7. The paper presents a general overview of CARE, EuCARD and especially TIARA activities, with an introduction containing a portrait of contemporary accelerator technology and a digest of its applications in modern society. CARE, EuCARD and TIARA activities have integrated the European accelerator community in a very effective way, and these projects are widely expected to be continued.
Physical habitat monitoring strategy (PHAMS) for reach-scale restoration effectiveness monitoring
Jones, Krista L.; O'Daniel, Scott J.; Beechie, Tim J.; Zakrajsek, John; Webster, John G.
2015-04-14
Habitat restoration efforts by the Confederated Tribes of the Umatilla Indian Reservation (CTUIR) have shifted from the site scale (1-10 meters) to the reach scale (100-1,000 meters). This shift was in response to the growing scientific emphasis on process-based restoration and to support from the 2007 Accords Agreement with the Bonneville Power Administration. With the increased size of restoration projects, the CTUIR and other agencies are in need of applicable monitoring methods for assessing large-scale changes in river and floodplain habitats following restoration. The goal of the Physical Habitat Monitoring Strategy is to outline methods that are useful for capturing reach-scale changes in surface and groundwater hydrology, geomorphology, hydrologic connectivity, and riparian vegetation at restoration projects. The Physical Habitat Monitoring Strategy aims to avoid duplication with existing regional effectiveness monitoring protocols by identifying complementary reach-scale metrics and methods that may improve the ability of the CTUIR and others to detect instream and riparian changes at large restoration projects.
An overview of the Hadoop/MapReduce/HBase framework and its current applications in bioinformatics
Taylor, Ronald C
2010-12-21
Background: Bioinformatics researchers are now confronted with analysis of ultra large-scale data sets, a problem that will only increase at an alarming rate in coming years. Recent developments in open source software, that is, the Hadoop project and associated software, provide a foundation for scaling to petabyte-scale data warehouses on Linux clusters, providing fault-tolerant parallelized analysis on such data using a programming style named MapReduce. Description: An overview is given of the current usage within the bioinformatics community of Hadoop, a top-level Apache Software Foundation project, and of associated open source software projects. The concepts behind Hadoop and the associated HBase project are defined, and current bioinformatics software that employs Hadoop is described. The focus is on next-generation sequencing, as the leading application area to date. Conclusions: Hadoop and the MapReduce programming paradigm already have a substantial base in the bioinformatics community, especially in the field of next-generation sequencing analysis, and such use is increasing. This is due to the cost-effectiveness of Hadoop-based analysis on commodity Linux clusters, and in the cloud via data upload to cloud vendors who have implemented Hadoop/HBase; and due to the effectiveness and ease of use of the MapReduce method in parallelization of many data analysis algorithms. PMID:21210976
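The MapReduce programming style named above can be illustrated without Hadoop itself: a computation is expressed as a map phase emitting key-value pairs, a shuffle grouping values by key, and a reduce phase aggregating each group independently (which is what makes it parallelizable). The following minimal Python sketch, a toy illustration and not Hadoop code, counts 3-mers across DNA sequences in that style:

```python
from collections import defaultdict

def map_phase(records):
    # Map: emit (key, 1) pairs -- here, every 3-mer found in each sequence
    for seq in records:
        for i in range(len(seq) - 2):
            yield seq[i:i + 3], 1

def shuffle(pairs):
    # Shuffle: group all emitted values by key, as the framework does between phases
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: aggregate each key's values independently (each key can run in parallel)
    return {key: sum(values) for key, values in groups.items()}

sequences = ["GATTACA", "TACATAC"]
counts = reduce_phase(shuffle(map_phase(sequences)))
print(counts["TAC"])  # 3: once in GATTACA, twice in TACATAC
```

In Hadoop the map and reduce functions are distributed across a cluster and the shuffle is handled by the framework, but the contract between the three phases is the same as in this sketch.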
ERIC Educational Resources Information Center
Vincent, Jack E.
This monograph is a computer printout which presents findings from an analysis of data on international cooperation over a three-year period. Part of a large-scale research project to test various theories with regard to their ability to analyze international relations, this monograph presents the computer printout of data on the application of…
ERIC Educational Resources Information Center
Vincent, Jack E.
This monograph is a computer printout which presents findings from an analysis of data on international conflict over a three-year period. Part of a large-scale research project to test various theories with regard to their ability to analyze international relations, this monograph presents the computer printout of data on the application of…
ERIC Educational Resources Information Center
Vincent, Jack E.
This monograph presents findings from an analysis of data on international cooperation over a three-year period. A computer printout of the analysis is included. The document is part of a large-scale research project to test various theories with regard to their power in analyzing international relations. In this monograph, an inventory is…
Network models of biology, whether curated or derived from large-scale data analysis, are critical tools in the understanding of cancer mechanisms and in the design and personalization of therapies. The NDEx Project (Network Data Exchange) will create, deploy, and maintain an open-source, web-based software platform and public website to enable scientists, organizations, and software applications to share, store, manipulate, and publish biological networks.
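A biological network of the kind such a platform exchanges reduces, at its simplest, to node and edge records with attributes. The sketch below is a generic illustration only; it does not describe NDEx's actual data model or API, and the gene interactions shown are merely illustrative examples:

```python
# A toy biological network as plain node/edge records (illustrative genes only).
nodes = {
    "TP53":   {"type": "gene"},
    "MDM2":   {"type": "gene"},
    "CDKN1A": {"type": "gene"},
}
edges = [
    ("TP53", "CDKN1A", {"interaction": "activates"}),
    ("MDM2", "TP53",   {"interaction": "inhibits"}),
]

def neighbors(node):
    # Collect every node connected to `node`, in either direction.
    return sorted({t for s, t, _ in edges if s == node} |
                  {s for s, t, _ in edges if t == node})

print(neighbors("TP53"))  # ['CDKN1A', 'MDM2']
```

Sharing networks in a structured form like this, rather than as figures, is what lets software applications store, query, and manipulate them programmatically.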
ERIC Educational Resources Information Center
Lee, Kyungmee; Brett, Clare
2013-01-01
This qualitative case study is the first phase of a large-scale design-based research project to implement a theoretically derived double-layered CoP model within real-world teacher development practices. The main goal of this first iteration is to evaluate the courses and test and refine the CoP model for future implementations. This paper…
Thermally Optimized Paradigm of Thermal Management (TOP-M)
2017-07-18
Final Technical Report, 18-07-2017, reporting period Jul 2015 - Jul 2017. NICOP - Thermally Optimized Paradigm of Thermal Management. The main goal of this research was to present a new thermal management approach, which combines thermally aware Very/Ultra Large Scale Integration
A Study of Rapid Biodegradation of Oily Wastes through Composting.
1979-10-01
…effective method for large-scale composting of organic wastes. This research project was based on the principles of the forced aeration technique. … carbon results in heat loss and subsequent reduction in the effectiveness of pathogen destruction. It is therefore desirable to maintain the C/N ratio at a… investigated the effect of composting on the degradation of hydrocarbons in sewage sludge. Sludge extracts were fractionated into classes of compounds and a…
2014-05-21
simulating air-water free-surface flow, fluid-object interaction (FOI), and fluid-structure interaction (FSI) phenomena for complex geometries, and… with no limitations on the motion of the free surface, and with particular emphasis on ship hydrodynamics. The following specific research objectives… were identified for this project: 1) Development of a theoretical framework for free-surface flow, FOI and FSI that is a suitable starting point
Industry Study, Environment Industry, Spring 2009
2009-01-01
Amazonia (LBA), Tapajós National Forest, Brazil; Oficinas Caboclas Project, Santarém, Brazil; Maguari, Jamaraquá, Nuquini, and Suruacá communities… from clear cut, however. Research at the Large-Scale Biosphere-Atmosphere Experiment in Amazonia suggests that the Amazon basin may not be, as… Times Magazine, April 15, 2007, http://www.nytimes.com/2007/04/15/magazine/15green.t.html
The Holocene Geoarchaeology of the Desert Nile in Northern Sudan
NASA Astrophysics Data System (ADS)
Woodward, Jamie; Macklin, Mark; Spencer, Neal; Welsby, Derek; Dalton, Matthew; Hay, Sophie; Hardy, Andrew
2016-04-01
Invited Paper. Forty years ago Colin Renfrew declared that "every archaeological problem starts as a problem in geoarchaeology" (Renfrew, 1976, p. 2). With this assertion in mind, this paper draws upon the findings from field research in two sectors of the Nile Valley of Northern Sudan dedicated to the exploration of human-environment interactions during the middle and late Holocene. This part of the Nile corridor contains a rich cultural record and an exceptionally well preserved Holocene fluvial archive. A distinctive feature of these records is the variety of evidence for interaction between desert and river over a range of spatial and temporal scales. This interaction presented both challenges and opportunities for its ancient inhabitants. This paper will present evidence for large-scale landscape changes driven by shifts in global climate. It will also show how we have integrated the archaeological and geological records in the Northern Dongola Reach and at Amara West, where long-term field projects led by archaeologists from the British Museum have recognised the importance of a sustained commitment to interdisciplinary research to achieve a fully integrated geoarchaeological approach across a range of scales. The former project is a large-scale landscape survey with multiple sites across an 80 km reach of the Nile, whilst the latter has a strong focus on a single New Kingdom town site and changes in its environmental setting. By combining multiple archaeological and geological datasets, and pioneering the use of OSL dating and strontium isotope analysis in the Desert Nile, we have developed a new understanding of human responses to Holocene climate and landscape change in this region. Renfrew, C. (1976) Archaeology and the earth sciences. In: D.A. Davidson and M.I. Shackley (eds) Geoarchaeology: Earth Science and the Past, Duckworth, London, 1-5.
NASA Astrophysics Data System (ADS)
Eisenhamer, Bonnie; Ryer, H.; McCallister, D.; Taylor, J.; Bishop, M.
2010-05-01
The Hubble Space Telescope's Early Release Observations (EROs) were revealed to the public on September 9, 2009, and K-12 students and educators in six states across the country are joining in the celebration. Students and educators in Maryland, Ohio, New York, California, New Mexico, and Delaware have been invited to participate in the Hubble Space Telescope's Student ERO Pilot Project. This is an interdisciplinary project created by STScI's Office of Public Outreach in which students research the four ERO objects and create various types of projects. In recognition of their participation, the projects are displayed at host institutions in each state (museum, science center, school, planetarium or library) during a special public event for participating students, their families, and teachers. As part of its evaluation program, STScI's Office of Public Outreach has been conducting an evaluation of the project to determine the viability and potential of conducting large-scale, formal/informal collaborative projects in the future. This poster will highlight preliminary findings and share lessons learned.
Schweitzer, Peter; Povoroznyuk, Olga; Schiesser, Sigrid
2017-01-01
Abstract Public and academic discourses about the Polar regions typically focus on the so-called natural environment. While these discourses and inquiries continue to be relevant, the current article asks how to conceptualize the on-going industrial and infrastructural build-up of the Arctic. Acknowledging that the "built environment" is not an invention of modernity, the article nevertheless focuses on large-scale infrastructural projects of the twentieth century, which marked a watershed of industrial and infrastructural development in the north. Given that the Soviet Union was at the vanguard of these developments, the focus is on Soviet and Russian large-scale projects. We discuss two cases of transportation infrastructure, one based on an on-going research project being conducted by the authors along the Baikal–Amur Mainline (BAM) and the other focused on the so-called Northern Sea Route, a marine passage with a long history that has recently been regaining public and academic attention. The concluding section argues for increased attention to the interactions between humans and the built environment, serving as a programmatic call for more anthropological attention to infrastructure in the Russian north and other polar regions. PMID:29098112
Team building: electronic management-clinical translational research (eM-CTR) systems.
Cecchetti, Alfred A; Parmanto, Bambang; Vecchio, Marcella L; Ahmad, Sjarif; Buch, Shama; Zgheib, Nathalie K; Groark, Stephen J; Vemuganti, Anupama; Romkes, Marjorie; Sciurba, Frank; Donahoe, Michael P; Branch, Robert A
2009-12-01
Classical drug exposure-response studies in clinical pharmacology represent the quintessential prototype for bench-to-bedside clinical translational research. A fundamental premise of this approach is for a multidisciplinary team of researchers to design and execute complex, in-depth mechanistic studies conducted in relatively small groups of subjects. The infrastructure support for this genre of clinical research is not well served by scaling down the infrastructure used for large Phase III clinical trials. We describe a novel, integrated strategy whose focus is to support and manage a study using an Information Hub, Communication Hub, and Data Hub design. This design is illustrated by an application to a series of varied projects sponsored by Special Clinical Centers of Research in chronic obstructive pulmonary disease at the University of Pittsburgh. In contrast to classical informatics support, it is readily scalable to large studies. Our experience suggests the cultural consequences of research group self-empowerment are not only economically efficient but transformative to the research process.
1965-08-10
Artists used paintbrushes and airbrushes to recreate the lunar surface on each of the four models comprising the LOLA simulator. Project LOLA or Lunar Orbit and Landing Approach was a simulator built at Langley to study problems related to landing on the lunar surface. It was a complex project that cost nearly $2 million. James Hansen wrote: "This simulator was designed to provide a pilot with a detailed visual encounter with the lunar surface; the machine consisted primarily of a cockpit, a closed-circuit TV system, and four large murals or scale models representing portions of the lunar surface as seen from various altitudes. The pilot in the cockpit moved along a track past these murals which would accustom him to the visual cues for controlling a spacecraft in the vicinity of the moon. Unfortunately, such a simulation--although great fun and quite aesthetic--was not helpful because flight in lunar orbit posed no special problems other than the rendezvous with the LEM, which the device did not simulate. Not long after the end of Apollo, the expensive machine was dismantled." (p. 379) Ellis J. White described the simulator as follows: "Model 1 is a 20-foot-diameter sphere mounted on a rotating base and is scaled 1 in. = 9 miles. Models 2, 3, and 4 are approximately 15x40 feet scaled sections of model 1. Model 4 is a scaled-up section of the Crater Alphonsus and the scale is 1 in. = 200 feet. All models are in full relief except the sphere." -- Published in James R. Hansen, Spaceflight Revolution: NASA Langley Research Center From Sputnik to Apollo (Washington: NASA, 1995), p. 379; Ellis J. White, "Discussion of Three Typical Langley Research Center Simulation Programs," Paper presented at the Eastern Simulation Council (EAI's Princeton Computation Center), Princeton, NJ, October 20, 1966.
1964-10-28
Artists used paintbrushes and airbrushes to recreate the lunar surface on each of the four models comprising the LOLA simulator. Project LOLA or Lunar Orbit and Landing Approach was a simulator built at Langley to study problems related to landing on the lunar surface. It was a complex project that cost nearly $2 million. James Hansen wrote: "This simulator was designed to provide a pilot with a detailed visual encounter with the lunar surface; the machine consisted primarily of a cockpit, a closed-circuit TV system, and four large murals or scale models representing portions of the lunar surface as seen from various altitudes. The pilot in the cockpit moved along a track past these murals which would accustom him to the visual cues for controlling a spacecraft in the vicinity of the moon. Unfortunately, such a simulation--although great fun and quite aesthetic--was not helpful because flight in lunar orbit posed no special problems other than the rendezvous with the LEM, which the device did not simulate. Not long after the end of Apollo, the expensive machine was dismantled." (p. 379) Ellis J. White further described LOLA in his paper "Discussion of Three Typical Langley Research Center Simulation Programs": "Model 1 is a 20-foot-diameter sphere mounted on a rotating base and is scaled 1 in. = 9 miles. Models 2, 3, and 4 are approximately 15x40 feet scaled sections of model 1. Model 4 is a scaled-up section of the Crater Alphonsus and the scale is 1 in. = 200 feet. All models are in full relief except the sphere." -- Published in James R. Hansen, Spaceflight Revolution, NASA SP-4308, p. 379; Ellis J. White, "Discussion of Three Typical Langley Research Center Simulation Programs," Paper presented at the Eastern Simulation Council (EAI's Princeton Computation Center), Princeton, NJ, October 20, 1966.
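White's quoted scales are easy to sanity-check: at 1 in. = 9 miles, the 20-foot-diameter Model 1 sphere represents a body about 2,160 miles across, essentially the Moon's full diameter. A minimal sketch of that arithmetic (the reference lunar diameter of roughly 2,159 miles is an assumed value, not taken from the text):

```python
# Sanity check of the LOLA Model 1 scale quoted by White.
# Assumption: the Moon's mean diameter is ~2,159 miles (reference value,
# not from the source text).

MOON_DIAMETER_MILES = 2159.1

def represented_distance(model_inches, real_units_per_inch):
    """Real-world distance represented by a model dimension at a given scale."""
    return model_inches * real_units_per_inch

# Model 1: a 20-foot-diameter sphere (240 model inches) at 1 in. = 9 miles
sphere_diameter_in = 20 * 12
moon_miles = represented_distance(sphere_diameter_in, 9)
print(moon_miles)  # 2160 miles, within about 0.05% of the Moon's diameter

# Model 4: at 1 in. = 200 feet, each model inch covers 200 feet of Crater Alphonsus
print(represented_distance(1, 200))  # 200
```

The close agreement (2,160 vs. ~2,159 miles) suggests Model 1 was scaled to represent the entire Moon.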
Kasperowski, Dick; Hillman, Thomas
2018-05-01
In the past decade, some areas of science have begun turning to masses of online volunteers through open calls for generating and classifying very large sets of data. The purpose of this study is to investigate the epistemic culture of a large-scale online citizen science project, the Galaxy Zoo, that turns to volunteers for the classification of images of galaxies. For this task, we chose to apply the concepts of programs and antiprograms to examine the 'essential tensions' that arise in relation to the mobilizing values of a citizen science project and the epistemic subjects and cultures that are enacted by its volunteers. Our premise is that these tensions reveal central features of the epistemic subjects and distributed cognition of epistemic cultures in these large-scale citizen science projects.
NASA Technical Reports Server (NTRS)
Nolte, Christopher; Otte, Tanya; Pinder, Robert; Bowden, J.; Herwehe, J.; Faluvegi, Gregory; Shindell, Drew
2013-01-01
Projecting climate change scenarios to local scales is important for understanding, mitigating, and adapting to the effects of climate change on society and the environment. Many of the global climate models (GCMs) that are participating in the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report (AR5) do not fully resolve regional-scale processes and therefore cannot capture regional-scale changes in temperatures and precipitation. We use a regional climate model (RCM) to dynamically downscale the GCM's large-scale signal to investigate the changes in regional and local extremes of temperature and precipitation that may result from a changing climate. In this paper, we show preliminary results from downscaling the NASA/GISS ModelE IPCC AR5 Representative Concentration Pathway (RCP) 6.0 scenario. We use the Weather Research and Forecasting (WRF) model as the RCM to downscale decadal time slices (1995-2005 and 2025-2035) and illustrate potential changes in regional climate for the continental U.S. that are projected by ModelE and WRF under RCP6.0. The regional climate change scenario is further processed using the Community Multiscale Air Quality modeling system to explore influences of regional climate change on air quality.
The Relevancy of Large-Scale, Quantitative Methodologies in Middle Grades Education Research
ERIC Educational Resources Information Center
Mertens, Steven B.
2006-01-01
This article examines the relevancy of large-scale, quantitative methodologies in middle grades education research. Based on recommendations from national advocacy organizations, the need for more large-scale, quantitative research, combined with the application of more rigorous methodologies, is presented. Subsequent sections describe and discuss…
ERIC Educational Resources Information Center
Saul, Jeffery M.; Deardorff, Duane L.; Abbott, David S.; Allain, Rhett J.; Beichner, Robert J.
The Student-Centered Activities for Large Enrollment University Physics (SCALE-UP) project at North Carolina State University (NCSU) is developing a curriculum to promote learning through in-class group activities in introductory physics classes up to 100 students. The authors are currently in Phase II of the project using a specially designed…
Weinfurt, Kevin P; Hernandez, Adrian F; Coronado, Gloria D; DeBar, Lynn L; Dember, Laura M; Green, Beverly B; Heagerty, Patrick J; Huang, Susan S; James, Kathryn T; Jarvik, Jeffrey G; Larson, Eric B; Mor, Vincent; Platt, Richard; Rosenthal, Gary E; Septimus, Edward J; Simon, Gregory E; Staman, Karen L; Sugarman, Jeremy; Vazquez, Miguel; Zatzick, Douglas; Curtis, Lesley H
2017-09-18
The clinical research enterprise is not producing the evidence decision makers arguably need in a timely and cost-effective manner; research currently relies on labor-intensive parallel systems that are separate from clinical care. The emergence of pragmatic clinical trials (PCTs) poses a possible solution: these large-scale trials are embedded within routine clinical care and often involve cluster randomization of hospitals, clinics, primary care providers, etc. Interventions can be implemented by health system personnel through usual communication channels and quality improvement infrastructure, and data can be collected as part of routine clinical care. However, experience with these trials is nascent, and best practices regarding design, operational, analytic, and reporting methodologies are undeveloped. To strengthen the national capacity to implement cost-effective, large-scale PCTs, the Common Fund of the National Institutes of Health created the Health Care Systems Research Collaboratory (Collaboratory) to support the design, execution, and dissemination of a series of demonstration projects using a pragmatic research design. In this article, we describe the Collaboratory, highlight some of the challenges encountered and solutions developed thus far, and discuss remaining barriers and opportunities for large-scale evidence generation using PCTs. A planning phase is critical, and even with careful planning, new challenges arise during execution; comparisons between arms can be complicated by unanticipated changes. Early and ongoing engagement with both health care system leaders and front-line clinicians is critical for success. There is also marked uncertainty when applying existing ethical and regulatory frameworks to PCTs, and using existing electronic health records for data capture adds complexity.
Bioinformatics by Example: From Sequence to Target
NASA Astrophysics Data System (ADS)
Kossida, Sophia; Tahri, Nadia; Daizadeh, Iraj
2002-12-01
With the completion of the human genome, and the imminent completion of other large-scale sequencing and structure-determination projects, computer-assisted bioscience is poised to become the new paradigm for conducting basic and applied research. The presence of these additional bioinformatics tools stirs great anxiety among experimental researchers (as well as pedagogues), since they are now faced with a wider and deeper knowledge of differing disciplines (biology, chemistry, physics, mathematics, and computer science). This review targets those individuals who are interested in using computational methods in their teaching or research. By analyzing a real-life, pharmaceutical, multicomponent, target-based example, the reader will experience this fascinating new discipline.
An increase in aerosol burden due to the land-sea warming contrast
NASA Astrophysics Data System (ADS)
Hassan, T.; Allen, R.; Randles, C. A.
2017-12-01
Climate models simulate an increase in most aerosol species in response to warming, particularly over the tropics and Northern Hemisphere midlatitudes. This increase in aerosol burden is related to a decrease in wet removal, primarily due to reduced large-scale precipitation. Here, we show that the increase in aerosol burden, and the decrease in large-scale precipitation, is related to a robust climate change phenomenon: the land-sea warming contrast. Idealized simulations with two state-of-the-art climate models, the National Center for Atmospheric Research Community Atmosphere Model version 5 (NCAR CAM5) and the Geophysical Fluid Dynamics Laboratory Atmospheric Model 3 (GFDL AM3), show that muting the land-sea warming contrast negates the increase in aerosol burden under warming. This is related to smaller decreases in near-surface relative humidity over land and, in turn, smaller decreases in large-scale precipitation over land, especially in the NH midlatitudes. Furthermore, additional idealized simulations with an enhanced land-sea warming contrast lead to the opposite result: larger decreases in relative humidity over land, larger decreases in large-scale precipitation, and larger increases in aerosol burden. Our results, which relate the increase in aerosol burden to the robust climate projection of enhanced land warming, add confidence that a warmer world will be associated with a larger aerosol burden.
Large-scale correlations in gas traced by Mg II absorbers around low-mass galaxies
NASA Astrophysics Data System (ADS)
Kauffmann, Guinevere
2018-03-01
The physical origin of the large-scale conformity in the colours and specific star formation rates of isolated low-mass central galaxies and their neighbours on scales in excess of 1 Mpc is still under debate. One possible scenario is that gas is heated over large scales by feedback from active galactic nuclei (AGNs), leading to coherent modulation of cooling and star formation between well-separated galaxies. In this Letter, the metal line absorption catalogue of Zhu & Ménard is used to probe gas out to large projected radii around a sample of a million galaxies with stellar masses ~10^10 M⊙ and photometric redshifts in the range 0.4 < z < 0.8 selected from Sloan Digital Sky Survey imaging data. This galaxy sample covers an effective volume of 2.2 Gpc^3. A statistically significant excess of Mg II absorbers is present around the red low-mass galaxies compared to their blue counterparts out to projected radii of 10 Mpc. In addition, the equivalent width distribution function of Mg II absorbers around low-mass galaxies is shown to be strongly affected by the presence of a nearby (Rp < 2 Mpc) radio-loud AGN out to projected radii of 5 Mpc.
Messiah College Biodiesel Fuel Generation Project Final Technical Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zummo, Michael M; Munson, J; Derr, A
Many obvious and significant concerns arise when considering the concept of small-scale biodiesel production. Does the fuel produced meet the stringent requirements set by the commercial biodiesel industry? Is the process safe? How are small-scale producers collecting and transporting waste vegetable oil? How is waste from the biodiesel production process handled by small-scale producers? These concerns and many others were the focus of the research performed in the Messiah College Biodiesel Fuel Generation project over the last three years. This project was a unique research program in which undergraduate engineering students at Messiah College set out to research the feasibility of small-scale biodiesel production for application on a campus of approximately 3,000 students. This Department of Energy (DOE) funded research program developed out of almost a decade of small-scale biodiesel research and development work performed by students at Messiah College. Over the course of the last three years the research team focused on four key areas related to small-scale biodiesel production: Quality Testing and Assurance, Process and Processor Research, Process and Processor Development, and Community Education. The objectives for the Messiah College Biodiesel Fuel Generation Project included the following: 1. Preparing a laboratory facility for the development and optimization of processors and processes, ASTM quality assurance, and performance testing of biodiesel fuels. 2. Developing scalable processor and process designs suitable for ASTM-certifiable small-scale biodiesel production, with the goals of cost reduction and increased quality. 3. Conducting research into biodiesel process improvement and cost optimization using various biodiesel feedstocks and production ingredients.
StructRNAfinder: an automated pipeline and web server for RNA families prediction.
Arias-Carrasco, Raúl; Vásquez-Morán, Yessenia; Nakaya, Helder I; Maracaja-Coutinho, Vinicius
2018-02-17
The function of many noncoding RNAs (ncRNAs) depends upon their secondary structures. Over the last decades, several methodologies have been developed to predict such structures or to use them to functionally annotate RNAs into RNA families. However, to fully perform this analysis, researchers must use multiple tools, which require the constant parsing and processing of several intermediate files. This makes the large-scale prediction and annotation of RNAs a daunting task even for researchers with good computational or bioinformatics skills. We present an automated pipeline named StructRNAfinder that predicts and annotates RNA families in transcript or genome sequences. This single tool not only displays the sequence/structural consensus alignments for each RNA family, according to the Rfam database, but also provides a taxonomic overview for each assigned functional RNA. Moreover, we implemented a user-friendly web service that allows researchers to upload their own nucleotide sequences in order to perform the whole analysis. Finally, we provide a stand-alone version of StructRNAfinder to be used in large-scale projects. The tool was developed under the GNU General Public License (GPLv3) and is freely available at http://structrnafinder.integrativebioinformatics.me . The main advantage of StructRNAfinder lies in its large-scale processing and integration of the data obtained by each tool and database employed along the workflow; the several files generated are displayed in user-friendly reports, useful for downstream analyses and data exploration.
Ralph Alig; Darius Adams; John Mills; Richard Haynes; Peter Ince; Robert Moulton
2001-01-01
The TAMM/NAPAP/ATLAS/AREACHANGE (TNAA) system and the Forest and Agriculture Sector Optimization Model (FASOM) are two large-scale forestry sector modeling systems that have been employed to analyze the U.S. forest resource situation. The TNAA system of static, spatial equilibrium models has been applied to make 50-year projections of the U.S. forest sector for more...
On a Game of Large-Scale Projects Competition
NASA Astrophysics Data System (ADS)
Nikonov, Oleg I.; Medvedeva, Marina A.
2009-09-01
The paper is devoted to game-theoretical control problems motivated by economic decision-making situations arising in the realization of large-scale projects, such as designing and putting into operation new gas or oil pipelines. A non-cooperative two-player game is considered with payoff functions of a special type for which standard existence theorems and algorithms for searching Nash equilibrium solutions are not applicable. The paper is based on and develops the results obtained in [1]-[5].
Becker, Carolyn Black; Perez, Marisol; Kilpela, Lisa Smith; Diedrichs, Phillippa C; Trujillo, Eva; Stice, Eric
2017-04-01
Despite recent advances in developing evidence-based psychological interventions, substantial changes are needed in the current system of intervention delivery to impact mental health on a global scale (Kazdin & Blase, 2011). Prevention offers one avenue for reaching large populations because prevention interventions often are amenable to scaling-up strategies, such as task-shifting to lay providers, which further facilitate community stakeholder partnerships. This paper discusses the dissemination and implementation of the Body Project, an evidence-based body image prevention program, across 6 diverse stakeholder partnerships that span academic, non-profit and business sectors at national and international levels. The paper details key elements of the Body Project that facilitated partnership development, dissemination and implementation, including use of community-based participatory research methods and a blended train-the-trainer and task-shifting approach. We observed consistent themes across partnerships, including: sharing decision making with community partners, engaging of community leaders as gatekeepers, emphasizing strengths of community partners, working within the community's structure, optimizing non-traditional and/or private financial resources, placing value on cost-effectiveness and sustainability, marketing the program, and supporting flexibility and creativity in developing strategies for evolution within the community and in research. Ideally, lessons learned with the Body Project can be generalized to implementation of other body image and eating disorder prevention programs. Copyright © 2015 Elsevier Ltd. All rights reserved.
DOE Joint Genome Institute 2008 Progress Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gilbert, David
2009-03-12
While initially a virtual institute, the driving force behind the creation of the DOE Joint Genome Institute in Walnut Creek, California in the Fall of 1999 was the Department of Energy's commitment to sequencing the human genome. With the publication in 2004 of a trio of manuscripts describing the finished 'DOE Human Chromosomes', the Institute successfully completed its human genome mission. In the time between the creation of the Department of Energy Joint Genome Institute (DOE JGI) and completion of the Human Genome Project, sequencing and its role in biology spread to fields extending far beyond what could be imagined when the Human Genome Project first began. Accordingly, the targets of the DOE JGI's sequencing activities changed, moving from a single human genome to the genomes of large numbers of microbes, plants, and other organisms, and the community of users of DOE JGI data similarly expanded and diversified. Transitioning into operating as a user facility, the DOE JGI modeled itself after other DOE user facilities, such as synchrotron light sources and supercomputer facilities, empowering the science of large numbers of investigators working in areas of relevance to energy and the environment. The JGI's approach to being a user facility is based on the concept that by focusing state-of-the-art sequencing and analysis capabilities on the best peer-reviewed ideas drawn from a broad community of scientists, the DOE JGI will effectively encourage creative approaches to DOE mission areas and produce important science. This clearly has occurred, only partially reflected in the fact that the DOE JGI has played a major role in more than 45 papers published in just the past three years alone in Nature and Science. The involvement of a large and engaged community of users working on important problems has helped maximize the impact of JGI science. A seismic technological change is presently underway at the JGI.
The Sanger capillary-based sequencing process that dominated how sequencing was done in the last decade is being replaced by a variety of new processes and sequencing instruments. The JGI, with an increasing number of next-generation sequencers, whose throughput is 100- to 1,000-fold greater than the Sanger capillary-based sequencers, is increasingly focused in new directions on projects of scale and complexity not previously attempted. These new directions for the JGI come, in part, from the 2008 National Research Council report on the goals of the National Plant Genome Initiative as well as the 2007 National Research Council report on the New Science of Metagenomics. Both reports outline a crucial need for systematic large-scale surveys of the plant and microbial components of the biosphere as well as an increasing need for large-scale analysis capabilities to meet the challenge of converting sequence data into knowledge. The JGI is extensively discussed in both reports as vital to progress in these fields of major national interest. JGI's future plan for plants and microbes includes a systematic approach for investigation of these organisms at a scale requiring the special capabilities of the JGI to generate, manage, and analyze the datasets. JGI will generate and provide not only community access to these plant and microbial datasets, but also the tools for analyzing them. These activities will produce essential knowledge that will be needed if we are to be able to respond to the world's energy and environmental challenges. As the JGI Plant and Microbial programs advance, the JGI as a user facility is also evolving. The Institute has been highly successful in bending its technical and analytical skills to help users solve large complex problems of major importance, and that effort will continue unabated. 
The JGI will increasingly move from a central focus on 'one-off' user projects coming from small user communities to much larger scale projects driven by systematic and problem-focused approaches to selection of sequencing targets. Entire communities of scientists working in a particular field, such as feedstock improvement or biomass degradation, will be users of this information. Despite this new emphasis, an investigator-initiated user program will remain. This program in the future will replace small projects that increasingly can be accomplished without the involvement of JGI, with imaginative large-scale 'Grand Challenge' projects of foundational relevance to energy and the environment that require a new scale of sequencing and analysis capabilities. Close interactions with the DOE Bioenergy Research Centers, and with other DOE institutions that may follow, will also play a major role in shaping aspects of how the JGI operates as a user facility. Based on increased availability of high-throughput sequencing, the JGI will increasingly provide to users, in addition to DNA sequencing, an array of both pre- and post-sequencing value-added capabilities to accelerate their science.
Anderson-Schmidt, Heike; Adler, Lothar; Aly, Chadiga; Anghelescu, Ion-George; Bauer, Michael; Baumgärtner, Jessica; Becker, Joachim; Bianco, Roswitha; Becker, Thomas; Bitter, Cosima; Bönsch, Dominikus; Buckow, Karoline; Budde, Monika; Bührig, Martin; Deckert, Jürgen; Demiroglu, Sara Y; Dietrich, Detlef; Dümpelmann, Michael; Engelhardt, Uta; Fallgatter, Andreas J; Feldhaus, Daniel; Figge, Christian; Folkerts, Here; Franz, Michael; Gade, Katrin; Gaebel, Wolfgang; Grabe, Hans-Jörgen; Gruber, Oliver; Gullatz, Verena; Gusky, Linda; Heilbronner, Urs; Helbing, Krister; Hegerl, Ulrich; Heinz, Andreas; Hensch, Tilman; Hiemke, Christoph; Jäger, Markus; Jahn-Brodmann, Anke; Juckel, Georg; Kandulski, Franz; Kaschka, Wolfgang P; Kircher, Tilo; Koller, Manfred; Konrad, Carsten; Kornhuber, Johannes; Krause, Marina; Krug, Axel; Lee, Mahsa; Leweke, Markus; Lieb, Klaus; Mammes, Mechthild; Meyer-Lindenberg, Andreas; Mühlbacher, Moritz; Müller, Matthias J; Nieratschker, Vanessa; Nierste, Barbara; Ohle, Jacqueline; Pfennig, Andrea; Pieper, Marlenna; Quade, Matthias; Reich-Erkelenz, Daniela; Reif, Andreas; Reitt, Markus; Reininghaus, Bernd; Reininghaus, Eva Z; Riemenschneider, Matthias; Rienhoff, Otto; Roser, Patrik; Rujescu, Dan; Schennach, Rebecca; Scherk, Harald; Schmauss, Max; Schneider, Frank; Schosser, Alexandra; Schott, Björn H; Schwab, Sybille G; Schwanke, Jens; Skrowny, Daniela; Spitzer, Carsten; Stierl, Sebastian; Stöckel, Judith; Stübner, Susanne; Thiel, Andreas; Volz, Hans-Peter; von Hagen, Martin; Walter, Henrik; Witt, Stephanie H; Wobrock, Thomas; Zielasek, Jürgen; Zimmermann, Jörg; Zitzelsberger, Antje; Maier, Wolfgang; Falkai, Peter G; Rietschel, Marcella; Schulze, Thomas G
2013-12-01
The German Association for Psychiatry and Psychotherapy (DGPPN) has committed itself to establish a prospective national cohort of patients with major psychiatric disorders, the so-called DGPPN-Cohort. This project will enable the scientific exploitation of high-quality data and biomaterial from psychiatric patients for research. It will be set up using harmonised data sets and procedures for sample generation and guided by transparent rules for data access and data sharing regarding the central research database. While the main focus lies on biological research, it will be open to all kinds of scientific investigations, including epidemiological, clinical or health-service research.
NASA Astrophysics Data System (ADS)
Ji, H.; Bhattacharjee, A.; Prager, S.; Daughton, W. S.; Bale, S. D.; Carter, T. A.; Crocker, N.; Drake, J. F.; Egedal, J.; Sarff, J.; Wallace, J.; Belova, E.; Ellis, R.; Fox, W. R., II; Heitzenroeder, P.; Kalish, M.; Jara-Almonte, J.; Myers, C. E.; Que, W.; Ren, Y.; Titus, P.; Yamada, M.; Yoo, J.
2014-12-01
A new intermediate-scale plasma experiment, called the Facility for Laboratory Reconnection Experiments or FLARE, is under construction at Princeton as a joint project by five universities and two national labs to study magnetic reconnection in regimes directly relevant to space, solar and astrophysical plasmas. The currently existing small-scale experiments have been focusing on the single X-line reconnection process in plasmas either with small effective sizes or at low Lundquist numbers, both of which are typically very large in natural plasmas. These new regimes involve multiple X-lines as guided by a reconnection "phase diagram", in which different coupling mechanisms from the global system scale to the local dissipation scale are classified into different reconnection phases [H. Ji & W. Daughton, Phys. Plasmas 18, 111207 (2011)]. The design of the FLARE device is based on the existing Magnetic Reconnection Experiment (MRX) at Princeton (http://mrx.pppl.gov) and is to provide experimental access to the new phases involving multiple X-lines at large effective sizes and high Lundquist numbers, directly relevant to space and solar plasmas. The motivating major physics questions, the construction status, and the planned collaborative research especially with space and solar research communities will be discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sarmiento, Jorge L; Gnanadesikan, Anand; Gruber, Nicolas
2007-06-21
This final report summarizes research undertaken collaboratively between Princeton University, the NOAA Geophysical Fluid Dynamics Laboratory on the Princeton University campus, the State University of New York at Stony Brook, and the University of California, Los Angeles between September 1, 2000, and November 30, 2006, to do fundamental research on ocean iron fertilization as a means to enhance the net oceanic uptake of CO2 from the atmosphere. The approach we proposed was to develop and apply a suite of coupled physical-ecological-biogeochemical models in order to (i) determine to what extent enhanced carbon fixation from iron fertilization will lead to an increase in the oceanic uptake of atmospheric CO2 and how long this carbon will remain sequestered (efficiency), and (ii) examine the changes in ocean ecology and natural biogeochemical cycles resulting from iron fertilization (consequences). The award was funded in two separate three-year installments: • September 1, 2000 to November 30, 2003, for a project entitled “Ocean carbon sequestration by fertilization: An integrated biogeochemical assessment.” A final report was submitted for this at the end of 2003 and is included here as Appendix 1. • December 1, 2003 to November 30, 2006, for a follow-on project under the same grant number entitled “Carbon sequestration by patch fertilization: A comprehensive assessment using coupled physical-ecological-biogeochemical models.” This report focuses primarily on the progress we made during the second period of funding subsequent to the work reported on in Appendix 1. When we began this project, we were thinking almost exclusively in terms of long-term fertilization over large regions of the ocean such as the Southern Ocean, with much of our focus being on how ocean circulation and biogeochemical cycling would interact to control the response to a given fertilization scenario. 
Our research on these types of scenarios, which was carried out largely during the first three years of our project, led to several major new insights on the interaction between ocean biogeochemistry and circulation. This work, which is described in the following Section II on “Large scale fertilization,” has continued to appear in the literature over the past few years, including two high visibility papers in Nature. Early on in the first three years of our project, it became clear that small "patch-scale" fertilizations over limited regions of order 100 km diameter were much more likely than large scale fertilization, and we carried out a series of idealized patch fertilization simulations reported on in Gnanadesikan et al. (2003). Based on this paper and other results we had obtained by the end of our first three-year grant, we identified a number of important issues that needed to be addressed in the second three-year period of this grant. Section III on “patch fertilization” discusses the major findings of this phase of our research, which is described in two major manuscripts that will be submitted for publication in the near future. This research makes use of new, more realistic ocean ecosystem and iron cycling models than our first paper on this topic. We have several major new insights into what controls the efficiency of iron fertilization in the ocean. Section IV on “model development” summarizes a set of papers describing the progress that we made on improving the ecosystem models we use for our iron fertilization simulations.
Computational aerodynamics development and outlook /Dryden Lecture in Research for 1979/
NASA Technical Reports Server (NTRS)
Chapman, D. R.
1979-01-01
Some past developments and current examples of computational aerodynamics are briefly reviewed. An assessment is made of the requirements on future computer memory and speed imposed by advanced numerical simulations, giving emphasis to the Reynolds averaged Navier-Stokes equations and to turbulent eddy simulations. Experimental scales of turbulence structure are used to determine the mesh spacings required to adequately resolve turbulent energy and shear. Assessment also is made of the changing market environment for developing future large computers, and of the projections of micro-electronics memory and logic technology that affect future computer capability. From the two assessments, estimates are formed of the future time scale in which various advanced types of aerodynamic flow simulations could become feasible. Areas of research judged especially relevant to future developments are noted.
Research on copying system of dynamic multiplex holographic stereograms
NASA Astrophysics Data System (ADS)
Fu, Huaiping; Yang, Hong; Zheng, Tong
2003-05-01
The most important advantage of holographic stereograms over conventional holograms is that they can produce 3D images with movement at any desired scale, and holographers in many countries have been involved in studying them. We began our work in the early 1980s and completed two research projects: an automatic system for making synthetic holograms, and multiplex synthetic rainbow holograms. Based on this work, we made a large-scale holographic stereogram of an animated goldfish for practical advertisement. To meet the needs of the market, we recently developed a copying system for making multiplex holographic stereograms and a special kind of silver halide holographic film. This paper introduces the characteristics of the copying system and the properties of the special silver-halide emulsion.
The Social Climate Scales: An Annotated Bibliography. April, 1977.
ERIC Educational Resources Information Center
Moos, Rudolf, Comp.; Max, Wendy, Comp.
The Social Climate Scales were designed to assess the dimensions of four types of environments: treatment, total institutions, school, and community settings. This report provides abstracts of recent research projects in which the scales have been used. The projects involved use of the Ward Atmosphere Scale, Community-Oriented Programs Environment…
SiGN: large-scale gene network estimation environment for high performance computing.
Tamada, Yoshinori; Shimamura, Teppei; Yamaguchi, Rui; Imoto, Seiya; Nagasaki, Masao; Miyano, Satoru
2011-01-01
Our research group is currently developing software for estimating large-scale gene networks from gene expression data. The software, called SiGN, is specifically designed for the Japanese flagship supercomputer "K computer" which is planned to achieve 10 petaflops in 2012, and other high performance computing environments including the Human Genome Center (HGC) supercomputer system. SiGN is a collection of gene network estimation software with three different sub-programs: SiGN-BN, SiGN-SSM and SiGN-L1. In these three programs, five different models are available: static and dynamic nonparametric Bayesian networks, state space models, graphical Gaussian models, and vector autoregressive models. All these models require a huge amount of computational resources for estimating large-scale gene networks and therefore are designed to be able to exploit the speed of 10 petaflops. The software will be available freely for "K computer" and HGC supercomputer system users. The estimated networks can be viewed and analyzed by Cell Illustrator Online and SBiP (Systems Biology integrative Pipeline). The software project web site is available at http://sign.hgc.jp/.
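Of the five models listed, the graphical Gaussian model is perhaps the simplest to illustrate: an edge between two genes is scored by their partial correlation, which can be read off the inverse covariance (precision) matrix. The sketch below shows that idea on a toy expression matrix; it is a generic illustration with invented data, not SiGN's implementation.

```python
import numpy as np

def partial_correlations(data):
    """Partial-correlation matrix from a samples-by-genes matrix,
    via the inverse covariance (precision) matrix - the quantity
    thresholded in graphical Gaussian network models."""
    precision = np.linalg.inv(np.cov(data, rowvar=False))
    d = np.sqrt(np.diag(precision))
    pcor = -precision / np.outer(d, d)
    np.fill_diagonal(pcor, 1.0)
    return pcor

# Toy expression matrix: 200 samples x 4 genes, gene 1 driven by gene 0.
rng = np.random.default_rng(0)
x = rng.normal(size=(200, 4))
x[:, 1] += 0.8 * x[:, 0]
pcor = partial_correlations(x)
```

In practice a threshold (or a sparsity-inducing estimator, as in SiGN-L1's L1-regularized models) decides which partial correlations become edges; here `pcor[0, 1]` is large while the entries between independent genes stay near zero.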
Opportunities for Breakthroughs in Large-Scale Computational Simulation and Design
NASA Technical Reports Server (NTRS)
Alexandrov, Natalia; Alter, Stephen J.; Atkins, Harold L.; Bey, Kim S.; Bibb, Karen L.; Biedron, Robert T.; Carpenter, Mark H.; Cheatwood, F. McNeil; Drummond, Philip J.; Gnoffo, Peter A.
2002-01-01
Opportunities for breakthroughs in the large-scale computational simulation and design of aerospace vehicles are presented. Computational fluid dynamics tools to be used within multidisciplinary analysis and design methods are emphasized. The opportunities stem from speedups and robustness improvements in the underlying unit operations associated with simulation (geometry modeling, grid generation, physical modeling, analysis, etc.). Further, an improved programming environment can synergistically integrate these unit operations to leverage the gains. The speedups result from reducing the problem setup time through geometry modeling and grid generation operations, and reducing the solution time through the operation counts associated with solving the discretized equations to a sufficient accuracy. The opportunities are addressed only at a general level here, but an extensive list of references containing further details is included. The opportunities discussed are being addressed through the Fast Adaptive Aerospace Tools (FAAST) element of the Advanced Systems Concept to Test (ASCoT) and the third Generation Reusable Launch Vehicles (RLV) projects at NASA Langley Research Center. The overall goal is to enable greater inroads into the design process with large-scale simulations.
The circuit architecture of whole brains at the mesoscopic scale.
Mitra, Partha P
2014-09-17
Vertebrate brains of even moderate size are composed of astronomically large numbers of neurons and show a great degree of individual variability at the microscopic scale. This variation is presumably the result of phenotypic plasticity and individual experience. At a larger scale, however, relatively stable species-typical spatial patterns are observed in neuronal architecture, e.g., the spatial distributions of somata and axonal projection patterns, probably the result of a genetically encoded developmental program. The mesoscopic scale of analysis of brain architecture is the transitional point between a microscopic scale where individual variation is prominent and the macroscopic level where a stable, species-typical neural architecture is observed. The empirical existence of this scale, implicit in neuroanatomical atlases, combined with advances in computational resources, makes studying the circuit architecture of entire brains a practical task. A methodology has previously been proposed that employs a shotgun-like grid-based approach to systematically cover entire brain volumes with injections of neuronal tracers. This methodology is being employed to obtain mesoscale circuit maps in mouse and should be applicable to other vertebrate taxa. The resulting large data sets raise issues of data representation, analysis, and interpretation, which must be resolved. Even for data representation the challenges are nontrivial: the conventional approach using regional connectivity matrices fails to capture the collateral branching patterns of projection neurons. Future success of this promising research enterprise depends on the integration of previous neuroanatomical knowledge, partly through the development of suitable computational tools that encapsulate such expertise. Copyright © 2014 Elsevier Inc. All rights reserved.
Tropical precipitation extremes: Response to SST-induced warming in aquaplanet simulations
NASA Astrophysics Data System (ADS)
Bhattacharya, Ritthik; Bordoni, Simona; Teixeira, João.
2017-04-01
Scaling of tropical precipitation extremes in response to warming is studied in aquaplanet experiments using the global Weather Research and Forecasting (WRF) model. We show how the scaling of precipitation extremes is highly sensitive to spatial and temporal averaging: while instantaneous grid point extreme precipitation scales more strongly than the percentage increase (~7% K⁻¹) predicted by the Clausius-Clapeyron (CC) relationship, extremes for zonally and temporally averaged precipitation follow a slight sub-CC scaling, in agreement with results from Climate Model Intercomparison Project (CMIP) models. The scaling depends crucially on the employed convection parameterization. This is particularly true when grid point instantaneous extremes are considered. These results highlight how understanding the response of precipitation extremes to warming requires consideration of dynamic changes in addition to the thermodynamic response. Changes in grid-scale precipitation, unlike those in convective-scale precipitation, scale linearly with the resolved flow. Hence, dynamic changes include changes in both large-scale and convective-scale motions.
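The ~7% K⁻¹ CC rate quoted above follows from the exponential dependence of saturation vapor pressure on temperature. A minimal sketch, assuming a constant latent heat of vaporization (the standard textbook approximation, not the WRF model's moisture scheme):

```python
import math

def saturation_vapor_pressure(t_kelvin):
    """Approximate saturation vapor pressure (hPa) from the
    Clausius-Clapeyron relation with constant latent heat."""
    L = 2.5e6               # latent heat of vaporization, J/kg
    Rv = 461.5              # gas constant for water vapor, J/(kg K)
    e0, t0 = 6.11, 273.15   # reference point: 6.11 hPa at 0 degC
    return e0 * math.exp(L / Rv * (1.0 / t0 - 1.0 / t_kelvin))

def cc_scaling_percent_per_k(t_kelvin):
    """Percentage increase in saturation vapor pressure per 1 K of warming."""
    es1 = saturation_vapor_pressure(t_kelvin)
    es2 = saturation_vapor_pressure(t_kelvin + 1.0)
    return (es2 / es1 - 1.0) * 100.0

# Near typical tropical SSTs the rate is roughly 6-7% per kelvin.
print(round(cc_scaling_percent_per_k(300.0), 1))
```

Super-CC scaling of instantaneous extremes, as reported in the abstract, means the precipitation response exceeds this purely thermodynamic rate because of dynamic changes.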
NASA Astrophysics Data System (ADS)
Wilcox, Steve; Myers, Daryl
2009-08-01
The U.S. Department of Energy's National Renewable Energy Laboratory has embarked on a collaborative effort with the solar industry to establish high quality solar and meteorological measurements. This Solar Resource and Meteorological Assessment Project (SOLRMAP) provides high quality measurements to support deployment of concentrating solar thermal power projects in the United States. The no-funds-exchanged collaboration brings NREL solar resource assessment expertise together with industry needs for measurements. The end result will be high quality data sets to support the financing, design, and monitoring of large scale solar power projects for industry in addition to research-quality data for NREL model development. NREL provides consultation for instrumentation and station deployment, along with instrument calibrations, data acquisition, quality assessment, data distribution, and summary reports. Industry participants provide equipment, infrastructure, and station maintenance.
Overview of the SHIELDS Project at LANL
NASA Astrophysics Data System (ADS)
Jordanova, V.; Delzanno, G. L.; Henderson, M. G.; Godinez, H. C.; Jeffery, C. A.; Lawrence, E. C.; Meierbachtol, C.; Moulton, D.; Vernon, L.; Woodroffe, J. R.; Toth, G.; Welling, D. T.; Yu, Y.; Birn, J.; Thomsen, M. F.; Borovsky, J.; Denton, M.; Albert, J.; Horne, R. B.; Lemon, C. L.; Markidis, S.; Young, S. L.
2015-12-01
The near-Earth space environment is a highly dynamic system, coupled through a complex set of physical processes over a large range of scales, which responds nonlinearly to driving by the time-varying solar wind. Predicting variations in this environment that can affect technologies in space and on Earth, i.e. "space weather", remains a major challenge in space physics. We present a recently funded project through the Los Alamos National Laboratory (LANL) Directed Research and Development (LDRD) program that is developing a new capability to understand, model, and predict Space Hazards Induced near Earth by Large Dynamic Storms: the SHIELDS framework. The project goals are to specify the dynamics of the hot (keV) particles (the seed population for the radiation belts) on both macro- and micro-scales, including the important physics of rapid particle injection and acceleration associated with magnetospheric storms/substorms and plasma waves. This challenging problem is addressed by a team of world-class experts in the fields of space science and computational plasma physics, using state-of-the-art models and computational facilities. New data assimilation techniques employing data from LANL instruments on the Van Allen Probes and geosynchronous satellites are developed in addition to physics-based models. This research will provide a framework for understanding key radiation belt drivers that may accelerate particles to relativistic energies and lead to spacecraft damage and failure. The ability to reliably distinguish between various modes of failure is critically important in anomaly resolution and forensics. SHIELDS will enhance our capability to accurately specify and predict the near-Earth space environment where operational satellites reside.
Discovering Beaten Paths in Collaborative Ontology-Engineering Projects using Markov Chains
Walk, Simon; Singer, Philipp; Strohmaier, Markus; Tudorache, Tania; Musen, Mark A.; Noy, Natalya F.
2014-01-01
Biomedical taxonomies, thesauri and ontologies in the form of the International Classification of Diseases as a taxonomy or the National Cancer Institute Thesaurus as an OWL-based ontology, play a critical role in acquiring, representing and processing information about human health. With increasing adoption and relevance, biomedical ontologies have also significantly increased in size. For example, the 11th revision of the International Classification of Diseases, which is currently under active development by the World Health Organization, contains nearly 50,000 classes representing a vast variety of different diseases and causes of death. This evolution in terms of size was accompanied by an evolution in the way ontologies are engineered. Because no single individual has the expertise to develop such large-scale ontologies, ontology-engineering projects have evolved from small-scale efforts involving just a few domain experts to large-scale projects that require effective collaboration between dozens or even hundreds of experts, practitioners and other stakeholders. Understanding the way these different stakeholders collaborate will enable us to improve editing environments that support such collaborations. In this paper, we uncover how large ontology-engineering projects, such as the International Classification of Diseases in its 11th revision, unfold by analyzing usage logs of five different biomedical ontology-engineering projects of varying sizes and scopes using Markov chains. We discover intriguing interaction patterns (e.g., which properties users frequently change after specific given ones) that suggest that large collaborative ontology-engineering projects are governed by a few general principles that determine and drive development.
From our analysis, we identify commonalities and differences between different projects that have implications for project managers, ontology editors, developers and contributors working on collaborative ontology-engineering projects and tools in the biomedical domain. PMID:24953242
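The Markov-chain analysis described above amounts to estimating a first-order transition matrix from sequences of editing actions. A minimal sketch with hypothetical log data (the property names and session structure are invented for illustration, not taken from the paper's data):

```python
from collections import Counter, defaultdict

def transition_matrix(sessions):
    """Estimate first-order Markov transition probabilities from
    sequences of user actions (one list of actions per session)."""
    counts = defaultdict(Counter)
    for actions in sessions:
        for src, dst in zip(actions, actions[1:]):
            counts[src][dst] += 1
    return {
        src: {dst: n / sum(dsts.values()) for dst, n in dsts.items()}
        for src, dsts in counts.items()
    }

# Hypothetical usage log: which property users edit after a given one.
logs = [
    ["title", "definition", "synonym"],
    ["title", "definition", "definition"],
    ["synonym", "title", "definition"],
]
probs = transition_matrix(logs)
print(probs["title"]["definition"])  # every edit after "title" was "definition"
```

Comparing such matrices across projects is one way to surface the commonalities and differences between collaborative ontology-engineering efforts that the abstract mentions.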
Stahl, Olivier; Duvergey, Hugo; Guille, Arnaud; Blondin, Fanny; Vecchio, Alexandre Del; Finetti, Pascal; Granjeaud, Samuel; Vigy, Oana; Bidaut, Ghislain
2013-06-06
With the advance of post-genomic technologies, the need for tools to manage large-scale data in biology becomes more pressing. This involves annotating and storing data securely, as well as granting permissions flexibly across several technologies (all array types, flow cytometry, proteomics) for collaborative work and data sharing. This task is not easily achieved with most systems available today. We developed Djeen (Database for Joomla!'s Extensible Engine), a new Research Information Management System (RIMS) for collaborative projects. Djeen is a user-friendly application designed to streamline data storage and annotation collaboratively. Its database model, kept simple, is compliant with most technologies and allows storing and managing heterogeneous data within the same system. Advanced permissions are managed through different roles. Templates allow Minimum Information (MI) compliance. Djeen allows managing projects associated with heterogeneous data types while enforcing annotation integrity and minimum information. Projects are managed within a hierarchy, and user permissions are finely grained for each project, user and group. Djeen Component source code (version 1.5.1) and installation documentation are available under the CeCILL license from http://sourceforge.net/projects/djeen/files and supplementary material.
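A project hierarchy with finely grained per-project roles, as described above, can be modeled as role-to-rights sets resolved along the project path. The sketch below is purely hypothetical: the role names, rights, and nearest-assignment lookup rule are assumptions for illustration, not Djeen's actual schema.

```python
# Rights granted by each role (hypothetical, for illustration only).
ROLE_RIGHTS = {
    "manager": {"read", "write", "grant"},
    "contributor": {"read", "write"},
    "viewer": {"read"},
}

def can(user_roles, project_path, action):
    """Check an action against the closest role assigned along the
    project hierarchy path (child assignments override parents)."""
    for node in reversed(project_path):
        role = user_roles.get(node)
        if role is not None:
            return action in ROLE_RIGHTS[role]
    return False

# A user who may only view the parent project but contribute to a subproject.
roles = {"lab": "viewer", "lab/proteomics": "contributor"}
print(can(roles, ["lab", "lab/proteomics"], "write"))  # True
print(can(roles, ["lab"], "write"))                    # False
```

Resolving the nearest assignment first is what lets permissions differ per project, user and group without duplicating grants down the hierarchy.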
Testing the robustness of Citizen Science projects: Evaluating the results of pilot project COMBER.
Chatzigeorgiou, Giorgos; Faulwetter, Sarah; Dailianis, Thanos; Smith, Vincent Stuart; Koulouri, Panagiota; Dounas, Costas; Arvanitidis, Christos
2016-01-01
Citizen Science (CS) as a term covers a great variety of approaches and scopes involving many different fields of science. The number of relevant projects globally has increased significantly in recent years. Large-scale ecological questions can be answered only through extended observation networks, and CS projects can support this effort. Although the need for such projects is apparent, an important part of the scientific community casts doubt on the reliability of CS data sets. The pilot CS project COMBER was created in order to provide evidence to answer this question for coastal marine biodiversity monitoring. The results of the current analysis show that a carefully designed CS project with clear hypotheses, wide participation and data set validation can be a valuable tool for detecting large-scale, long-term changes in marine biodiversity patterns, and therefore for relevant management and conservation issues.
He, W; Zhao, S; Liu, X; Dong, S; Lv, J; Liu, D; Wang, J; Meng, Z
2013-12-04
Large-scale next-generation sequencing (NGS)-based resequencing detects sequence variations, constructs evolutionary histories, and identifies phenotype-related genotypes. However, NGS-based resequencing studies generate extraordinarily large amounts of data, making computations difficult. Effective use and analysis of these data for NGS-based resequencing studies remains a difficult task for individual researchers. Here, we introduce ReSeqTools, a full-featured toolkit for NGS (Illumina sequencing)-based resequencing analysis, which processes raw data, interprets mapping results, and identifies and annotates sequence variations. ReSeqTools provides abundant scalable functions for routine resequencing analysis in different modules to facilitate customization of the analysis pipeline. ReSeqTools is designed to use compressed data files as input or output to save storage space and facilitates faster and more computationally efficient large-scale resequencing studies in a user-friendly manner. It offers abundant practical functions and generates useful statistics during the analysis pipeline, which significantly simplifies resequencing analysis. Its integrated algorithms and abundant sub-functions provide a solid foundation for special demands in resequencing projects. Users can combine these functions to construct their own pipelines for other purposes.
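Working directly on compressed files, as ReSeqTools does to save storage space, is straightforward to sketch: the reader below streams FASTQ records out of an in-memory gzip stream. This is a generic illustration of compressed sequencing I/O, not the toolkit's own parser.

```python
import gzip
import io

def read_fastq(handle):
    """Yield (read_id, sequence, quality) tuples from a FASTQ text stream.
    Reading through a gzip stream avoids decompressing files to disk."""
    while True:
        header = handle.readline().strip()
        if not header:
            return
        seq = handle.readline().strip()
        handle.readline()              # '+' separator line, discarded
        qual = handle.readline().strip()
        yield header[1:], seq, qual    # drop the leading '@'

# Two tiny reads, compressed in memory to stand in for a .fastq.gz file.
raw = b"@read1\nACGT\n+\nIIII\n@read2\nTTGA\n+\nFFFF\n"
compressed = gzip.compress(raw)
with gzip.open(io.BytesIO(compressed), "rt") as fh:
    records = list(read_fastq(fh))
print(len(records))  # 2
```

Because the records are yielded lazily, downstream statistics can be accumulated in a single pass without ever holding the decompressed file in memory.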
Wang, Jack T H; Daly, Joshua N; Willner, Dana L; Patil, Jayee; Hall, Roy A; Schembri, Mark A; Tyson, Gene W; Hugenholtz, Philip
2015-05-01
Clinical microbiology testing is crucial for the diagnosis and treatment of community and hospital-acquired infections. Laboratory scientists need to utilize technical and problem-solving skills to select from a wide array of microbial identification techniques. The inquiry-driven laboratory training required to prepare microbiology graduates for this professional environment can be difficult to replicate within undergraduate curricula, especially in courses that accommodate large student cohorts. We aimed to improve undergraduate scientific training by engaging hundreds of introductory microbiology students in an Authentic Large-Scale Undergraduate Research Experience (ALURE). The ALURE aimed to characterize the microorganisms that reside in the healthy human oral cavity (the oral microbiome) by analyzing hundreds of samples obtained from student volunteers within the course. Students were able to choose from selective and differential culture media, Gram-staining, microscopy, as well as polymerase chain reaction (PCR) and 16S rRNA gene sequencing techniques, in order to collect, analyze, and interpret novel data to determine the collective oral microbiome of the student cohort. Pre- and post-survey analysis of student learning gains across two iterations of the course (2012-2013) revealed significantly higher student confidence in laboratory skills following the completion of the ALURE (p < 0.05 using the Mann-Whitney U-test). Learning objectives on effective scientific communication were also met through effective student performance in laboratory reports describing the research outcomes of the project. The integration of undergraduate research in clinical microbiology has the capacity to deliver authentic research experiences and improve scientific training for large cohorts of undergraduate students.
How uncertain are climate model projections of water availability indicators across the Middle East?
Hemming, Debbie; Buontempo, Carlo; Burke, Eleanor; Collins, Mat; Kaye, Neil
2010-11-28
The projection of robust regional climate changes over the next 50 years presents a considerable challenge for the current generation of climate models. Water cycle changes are particularly difficult to model in this area because major uncertainties exist in the representation of processes such as large-scale and convective rainfall and their feedback with surface conditions. We present climate model projections and uncertainties in water availability indicators (precipitation, run-off and drought index) for the 1961-1990 and 2021-2050 periods. Ensembles from two global climate models (GCMs) and one regional climate model (RCM) are used to examine different elements of uncertainty. Although all three ensembles capture the general distribution of observed annual precipitation across the Middle East, the RCM is consistently wetter than observations, especially over the mountainous areas. All future projections show decreasing precipitation (ensemble median between -5 and -25%) in coastal Turkey and parts of Lebanon, Syria and Israel and consistent run-off and drought index changes. The Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report (AR4) GCM ensemble exhibits drying across the north of the region, whereas the Met Office Hadley Centre Quantifying Uncertainties in Model Projections-Atmospheric (QUMP-A) GCM and RCM ensembles show slight drying in the north and significant wetting in the south. RCM projections also show greater sensitivity (both wetter and drier) and a wider uncertainty range than QUMP-A. The nature of these uncertainties suggests that both large-scale circulation patterns, which influence region-wide drying/wetting patterns, and regional-scale processes, which affect localized water availability, are important sources of uncertainty in these projections.
To reduce the large uncertainties in water availability projections, it is suggested that efforts would be well placed to focus on understanding and modelling both the large-scale processes and their teleconnections with Middle East climate, and the localized processes involved in orographic precipitation.
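Ensemble-median percentage changes like those quoted above (e.g. between -5 and -25%) are obtained by summarizing per-member changes between the baseline and future periods. A minimal sketch with hypothetical ensemble values, not the study's actual data:

```python
import statistics

def ensemble_percent_change(baseline, future):
    """Per-member percentage change between two periods, summarized as
    the ensemble median plus the (min, max) range used to express
    projection uncertainty."""
    changes = [(f - b) / b * 100.0 for b, f in zip(baseline, future)]
    return statistics.median(changes), (min(changes), max(changes))

# Hypothetical 5-member ensemble: 1961-1990 vs 2021-2050 mean precipitation (mm/yr).
base = [420.0, 450.0, 400.0, 480.0, 440.0]
fut = [390.0, 410.0, 395.0, 430.0, 400.0]
median_change, spread = ensemble_percent_change(base, fut)
```

Reporting the full spread alongside the median, as the abstract does for the RCM versus QUMP-A ensembles, keeps the uncertainty range visible rather than hiding it behind a single central estimate.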
Ganchoon, Filipinas; Bugho, Rommel; Calina, Liezel; Dy, Rochelle; Gosney, James
2017-06-09
Physiatrists have provided humanitarian assistance in recent large-scale global natural disasters. Super Typhoon Haiyan, the deadliest and most costly typhoon in modern Philippine history, made landfall on 8 November 2013 resulting in significant humanitarian needs. Philippine Academy of Rehabilitation Medicine physiatrists conducted a project of 23 emergency basic relief and medical aid missions in response to Super Typhoon Haiyan from November 2013 to February 2014. The final mission was a medical aid mission to the inland rural community of Burauen, Leyte. Summary data were collected, collated, and tabulated; project and mission evaluation was performed. During the humanitarian assistance project, 31,254 basic relief kits containing a variety of food and non-food items were distributed and medical services including consultation, treatment, and medicines were provided to 7255 patients. Of the 344 conditions evaluated in the medical aid mission to Burauen, Leyte 85 (59%) were physical and rehabilitation medicine conditions comprised of musculoskeletal (62 [73%]), neurological (17 [20%]), and dermatological (6 [7%]) diagnoses. Post-mission and project analysis resulted in recommendations and programmatic changes to strengthen response in future disasters. Physiatrists functioned as medical providers, mission team leaders, community advocates, and in other roles. This physiatrist-led humanitarian assistance project met critical basic relief and medical aid needs of persons impacted by Super Typhoon Haiyan, demonstrating significant roles performed by physiatrists in response to a large-scale natural disaster. Resulting disaster programing changes and recommendations may inform a more effective response by PARM mission teams in the Philippines as well as by other South-Eastern Asia teams comprising rehabilitation professionals to large-scale, regional natural disasters. 
Implications for rehabilitation Large-scale natural disasters including tropical cyclones can have a catastrophic impact on the affected population. In response to Super Typhoon Haiyan, physiatrists representing the Philippine Academy of Rehabilitation Medicine conducted a project of 23 emergency basic relief and medical aid missions from November 2013 to February 2014. Project analysis indicates that medical mission teams responding in similar settings may expect to evaluate a significant number of physical medicine and rehabilitation conditions. Medical rehabilitation with participation by rehabilitation professionals including rehabilitation doctors is essential to the emergency medical response in large-scale natural disasters.
Sagili, Karuna D; Satyanarayana, Srinath; Chadha, Sarabjit S; Wilson, Nevin C; Kumar, Ajay M V; Moonan, Patrick K; Oeltmann, John E; Chadha, Vineet K; Nagaraja, Sharath Burugina; Ghosh, Smita; Q Lo, Terrence; Volkmann, Tyson; Willis, Matthew; Shringarpure, Kalpita; Reddy, Ravichandra Chinnappa; Kumar, Prahlad; Nair, Sreenivas A; Rao, Raghuram; Yassin, Mohammed; Mwangala, Perry; Zachariah, Rony; Tonsing, Jamhoih; Harries, Anthony D; Khaparde, Sunil
2018-01-01
The Global Fund encourages operational research (OR) in all its grants; however very few reports describe this aspect. In India, Project Axshya was supported by a Global Fund grant to improve the reach and visibility of the government Tuberculosis (TB) services among marginalised and vulnerable communities. OR was incorporated to build research capacity of professionals working with the national TB programme and to generate evidence to inform policies and practices. To describe how Project Axshya facilitated building OR capacity within the country, helped in addressing several TB control priority research questions, documented project activities and their outcomes, and influenced policy and practice. From September 2010 to September 2016, three key OR-related activities were implemented. First, practical output-oriented modular training courses were conducted (n = 3) to build research capacity of personnel involved in the TB programme, co-facilitated by The Union, in collaboration with the national TB programme, WHO country office and CDC, Atlanta. Second, two large-scale Knowledge, Attitude and Practice (KAP) surveys were conducted at baseline and mid-project to assess the changes pertaining to TB knowledge, attitudes and practices among the general population, TB patients and health care providers over the project period. Third, studies were conducted to describe the project's core activities and outcomes. In the training courses, 44 participant teams were supported to develop research protocols on topics of national priority, resulting in 28 peer-reviewed scientific publications. The KAP surveys and description of project activities resulted in 14 peer-reviewed publications. Of the published papers at least 12 have influenced change in policy or practice. OR within a Global Fund supported TB project has resulted in building OR capacity, facilitating research in areas of national priority and influencing policy and practice. 
We believe this experience will provide guidance for undertaking OR in Global Fund projects.
Clinical research in Finland in 2002 and 2007: quantity and type
2013-01-01
Background Regardless of worries over clinical research and various initiatives to overcome problems, few quantitative data on the numbers and type of clinical research exist. This article aims to describe the volume and type of clinical research in 2002 and 2007 in Finland. Methods The research law in Finland requires all medical research to be submitted to regional ethics committees (RECs). Data from all new projects in 2002 and 2007 were collected from REC files and the characteristics of clinical projects (76% of all submissions) were analyzed. Results The number of clinical projects was large, but declining: 794 in 2002 and 762 in 2007. Drug research (mainly trials) represented 29% and 34% of the clinical projects; their total number had not declined, but those without a commercial sponsor had. The number of different principal investigators was large (630 and 581). Most projects were observational, while an experimental design was used in 43% of projects. Multi-center studies were common. In half of the projects, the main funder was health care or the work was done unpaid; 31% had industry funding as the main source. There was a clear difference in the type of research by sponsorship. Industry-funded research was largely drug research: international multi-center studies with randomized controlled or other experimental designs. The findings for the two years were similar, but a university hospital as the main research site became less common between 2002 and 2007. Conclusions Clinical research projects were common, but their numbers are declining; research was largely funded by health care, with many physicians involved. Drug trials were a minority, even though most research promotion efforts and regulations concern them. PMID:23680289
Clinical research in Finland in 2002 and 2007: quantity and type.
Hemminki, Elina; Virtanen, Jorma; Veerus, Piret; Regushevskaya, Elena
2013-05-16
Regardless of worries over clinical research and various initiatives to overcome problems, few quantitative data on the numbers and type of clinical research exist. This article aims to describe the volume and type of clinical research in 2002 and 2007 in Finland. The research law in Finland requires all medical research to be submitted to regional ethics committees (RECs). Data from all new projects in 2002 and 2007 were collected from REC files and the characteristics of clinical projects (76% of all submissions) were analyzed. The number of clinical projects was large, but declining: 794 in 2002 and 762 in 2007. Drug research (mainly trials) represented 29% and 34% of the clinical projects; their total number had not declined, but those without a commercial sponsor had. The number of different principal investigators was large (630 and 581). Most projects were observational, while an experimental design was used in 43% of projects. Multi-center studies were common. In half of the projects, the main funder was health care or the work was done unpaid; 31% had industry funding as the main source. There was a clear difference in the type of research by sponsorship. Industry-funded research was largely drug research: international multi-center studies with randomized controlled or other experimental designs. The findings for the two years were similar, but a university hospital as the main research site became less common between 2002 and 2007. Clinical research projects were common, but their numbers are declining; research was largely funded by health care, with many physicians involved. Drug trials were a minority, even though most research promotion efforts and regulations concern them.
Implementation of Fiber Optic Sensing System on Sandwich Composite Cylinder Buckling Test
NASA Technical Reports Server (NTRS)
Pena, Francisco; Richards, W. Lance; Parker, Allen R.; Piazza, Anthony; Schultz, Marc R.; Rudd, Michelle T.; Gardner, Nathaniel W.; Hilburger, Mark W.
2018-01-01
The National Aeronautics and Space Administration (NASA) Engineering and Safety Center Shell Buckling Knockdown Factor Project is a multicenter project tasked with developing new analysis-based shell buckling design guidelines and design factors (i.e., knockdown factors) through high-fidelity buckling simulations and advanced test technologies. To validate these new buckling knockdown factors for future launch vehicles, the Shell Buckling Knockdown Factor Project is carrying out structural testing on a series of large-scale metallic and composite cylindrical shells at the NASA Marshall Space Flight Center (Marshall Space Flight Center, Alabama). A fiber optic sensor system was used to measure strain on a large-scale sandwich composite cylinder that was tested under multiple axial compressive loads exceeding 850,000 lb and equivalent bending loads of over 22 million in-lb. During the structural testing of the composite cylinder, strain data were collected from optical cables containing distributed fiber Bragg gratings using a custom fiber optic sensor system interrogator developed at the NASA Armstrong Flight Research Center. A total of 16 fiber-optic strands, each containing nearly 1,000 strain-measuring fiber Bragg gratings, were installed on the inner and outer cylinder surfaces to monitor the test article's global structural response through high-density real-time and post-test strain measurements. The distributed sensing system provided evidence of local epoxy failure at the attachment-ring-to-barrel interface that would not have been detected with conventional instrumentation. Results from the fiber optic sensor system were used to further refine and validate structural models for buckling of large-scale composite structures.
This paper discusses the techniques employed for real-time structural monitoring of the composite cylinder for structural load introduction and distributed bending-strain measurements over a large section of the cylinder by utilizing unique sensing capabilities of fiber optic sensors.
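The strain values reported by such a system come from the shift of each grating's reflected Bragg wavelength. A minimal sketch of that conversion, assuming the standard first-order FBG relation with an effective photoelastic coefficient of about 0.22 and neglecting temperature cross-sensitivity (the interrogator hardware and calibration details are not described in the abstract):

```python
def fbg_strain(wavelength_nm, base_wavelength_nm, photoelastic_coeff=0.22):
    """Convert a Bragg-wavelength shift to microstrain.

    Standard first-order FBG relation:
        dlambda / lambda0 = (1 - p_e) * strain
    with an assumed effective photoelastic coefficient p_e ~ 0.22;
    temperature effects are neglected.
    """
    shift = wavelength_nm - base_wavelength_nm
    strain = shift / (base_wavelength_nm * (1.0 - photoelastic_coeff))
    return strain * 1e6  # microstrain

# Example: a 1550 nm grating shifted down by 1.2 nm under compression
print(round(fbg_strain(1548.8, 1550.0), 1))  # → -992.6
```

In practice each of the ~1,000 gratings per strand would be converted this way, giving the distributed strain profile along the cylinder surface.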
SCALING-UP INFORMATION IN LAND-COVER DATA FOR LARGE-SCALE ENVIRONMENTAL ASSESSMENTS
The NLCD project provides national-scope land-cover data for the conterminous United States. The first land-cover data set was completed in 2000, and the continuing need for recent land-cover information has motivated continuation of the project to provide current and change info...
NASA Astrophysics Data System (ADS)
Langouët, Loïc; Daire, Marie-Yvane
2009-12-01
The present-day maritime landscape of Western France forms the geographical framework for a recent research project dedicated to the archaeological study of ancient fish-traps, combining regional-scale and site-scale investigations. Based on the compilation and exploitation of a large unpublished dataset including more than 550 sites, a preliminary synthetic study allows us to present some examples of synchronic and thematic approaches, and propose a morphological classification of the weirs. These encouraging first results open up new perspectives on fish-trap chronology closely linked to wider studies on Holocene sea-level changes.
The Utility of Template Analysis in Qualitative Psychology Research.
Brooks, Joanna; McCluskey, Serena; Turley, Emma; King, Nigel
2015-04-03
Thematic analysis is widely used in qualitative psychology research, and in this article, we present a particular style of thematic analysis known as Template Analysis. We outline the technique and consider its epistemological position, then describe three case studies of research projects which employed Template Analysis to illustrate the diverse ways it can be used. Our first case study illustrates how the technique was employed in data analysis undertaken by a team of researchers in a large-scale qualitative research project. Our second example demonstrates how a qualitative study that set out to build on mainstream theory made use of the a priori themes (themes determined in advance of coding) permitted in Template Analysis. Our final case study shows how Template Analysis can be used from an interpretative phenomenological stance. We highlight the distinctive features of this style of thematic analysis, discuss the kind of research where it may be particularly appropriate, and consider possible limitations of the technique. We conclude that Template Analysis is a flexible form of thematic analysis with real utility in qualitative psychology research.
Scaling of data communications for an advanced supercomputer network
NASA Technical Reports Server (NTRS)
Levin, E.; Eaton, C. K.; Young, Bruce
1986-01-01
The goal of NASA's Numerical Aerodynamic Simulation (NAS) Program is to provide a powerful computational environment for advanced research and development in aeronautics and related disciplines. The present NAS system consists of a Cray 2 supercomputer connected by a data network to a large mass storage system, to sophisticated local graphics workstations and by remote communication to researchers throughout the United States. The program plan is to continue acquiring the most powerful supercomputers as they become available. The implications of a projected 20-fold increase in processing power on the data communications requirements are described.
Detection and analysis of part load and full load instabilities in a real Francis turbine prototype
NASA Astrophysics Data System (ADS)
Presas, Alexandre; Valentin, David; Egusquiza, Eduard; Valero, Carme
2017-04-01
Francis turbines often operate away from their best efficiency point in order to regulate their output power according to the instantaneous energy demand of the grid. It is therefore of paramount importance to analyse and determine the unstable operating points of these kinds of units. In the framework of the HYPERBOLE project (FP7-ENERGY-2013-1; Project number 608532), a large Francis unit was investigated numerically, experimentally on a reduced-scale model, and both experimentally and numerically on the real prototype. This paper presents the unstable operating points identified during the experimental tests on the real Francis unit and analyses the main characteristics of these instabilities. Finally, it is shown that similar phenomena have been identified in previous research at the LMH (Laboratory for Hydraulic Machines, Lausanne) with the reduced-scale model.
Capturing change: the duality of time-lapse imagery to acquire data and depict ecological dynamics
Brinley Buckley, Emma M.; Allen, Craig R.; Forsberg, Michael; Farrell, Michael; Caven, Andrew J.
2017-01-01
We investigate the scientific and communicative value of time-lapse imagery by exploring applications for data collection and visualization. Time-lapse imagery has a myriad of possible applications to study and depict ecosystems and can operate at unique temporal and spatial scales to bridge the gap between large-scale satellite imagery projects and observational field research. Time-lapse data sequences, linking time-lapse imagery with data visualization, have the ability to make data come alive for a wider audience by connecting abstract numbers to images that root data in time and place. Utilizing imagery from the Platte Basin Timelapse Project, water inundation and vegetation phenology metrics are quantified via image analysis and then paired with passive monitoring data, including streamflow and water chemistry. Dynamic and interactive time-lapse data sequences elucidate the visible and invisible ecological dynamics of a significantly altered yet internationally important river system in central Nebraska.
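As a hedged illustration of the kind of image-derived inundation metric described above, a time-lapse frame can be reduced to a single inundated-pixel fraction. The color rule and thresholds here are hypothetical, not the Platte Basin Timelapse Project's actual pipeline, which would be calibrated per camera and site:

```python
import numpy as np

def inundation_fraction(rgb):
    """Estimate the fraction of inundated pixels in an HxWx3 RGB frame
    (float values in [0, 1]). Assumes water pixels are darker and
    blue-dominant; real deployments would calibrate these rules."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    water = (b > r) & (b > g) & (rgb.mean(axis=-1) < 0.5)
    return float(water.mean())

# Toy frame: left half "water" (dark, blue-dominant), right half "land"
frame = np.zeros((10, 10, 3))
frame[:, :5] = [0.1, 0.2, 0.4]
frame[:, 5:] = [0.8, 0.7, 0.5]
print(inundation_fraction(frame))  # → 0.5
```

Tracking this fraction across a dated image sequence is what turns the camera into a data instrument that can be paired with streamflow and water-chemistry records.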
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
From November 1991 to April 1996, Kerr McGee Coal Corporation (K-M Coal) led a project to develop the Institute of Gas Technology (IGT) Mild Gasification (MILDGAS) process for near-term commercialization. The specific objectives of the program were to: design, construct, and operate a 24-tons/day adiabatic process development unit (PDU) to obtain process performance data suitable for further design scale-up; obtain large batches of coal-derived co-products for industrial evaluation; prepare a detailed design of a demonstration unit; and develop technical and economic plans for commercialization of the MILDGAS process. The project team for the PDU development program consisted of: K-M Coal, IGT, Bechtel Corporation, Southern Illinois University at Carbondale (SIUC), General Motors (GM), Pellet Technology Corporation (PTC), LTV Steel, Armco Steel, Reilly Industries, and Auto Research.
NASA Astrophysics Data System (ADS)
Ryan, J. G.
2012-12-01
Bringing the use of cutting-edge research tools into student classroom experiences has long been a popular educational strategy in the geosciences and other STEM disciplines. The NSF CCLI and TUES programs have funded a large number of projects that placed research-grade instrumentation at educational institutions for instructional use and use in supporting undergraduate research activities. While student and faculty response to these activities has largely been positive, a range of challenges exist related to their educational effectiveness. Many of the obstacles these approaches have faced relate to "scaling up" of research mentoring experiences (e.g., providing training and time for use for an entire classroom of students, as opposed to one or two), and to time tradeoffs associated with providing technical training for effective instrument use versus course content coverage. The biggest challenge has often been simple logistics: a single instrument, housed in a different space, is difficult to integrate effectively into instructional activities. My CCLI-funded project sought primarily to knock down the logistical obstacles to research instrument use by taking advantage of remote instrument operation technologies, which allow the in-classroom use of networked analytical tools. Remote use of electron microprobe and SEM instruments of the Florida Center for Analytical Electron Microscopy (FCAEM) in Miami, FL was integrated into two geoscience courses at USF in Tampa, FL. Remote operation permitted the development of whole-class laboratory exercises to familiarize students with the tools, their function, and their capabilities; and it allowed students to collect high-quality chemical and image data on their own prepared samples in the classroom during laboratory periods. 
These activities improve student engagement in the course, appear to improve learning of key concepts in mineralogy and petrology, and have led to students pursuing independent research projects, as well as requesting additional Geology elective courses offering similar kinds of experiences. I have sustained these activities post-project via student lab fees to pay for in-class microprobe time.
Successful contracting of prevention services: fighting malnutrition in Senegal and Madagascar.
Marek, T; Diallo, I; Ndiaye, B; Rakotosalama, J
1999-12-01
There are very few documented large-scale successes in nutrition in Africa, and virtually no consideration of contracting for preventive services. This paper describes two successful large-scale community nutrition projects in Africa as examples of what can be done in prevention using the contracting approach in rural as well as urban areas. The two case-studies are the Secaline project in Madagascar, and the Community Nutrition Project in Senegal. The article explains what is meant by 'success' in the context of these two projects, how these results were achieved, and how certain bottlenecks were avoided. Both projects are very similar in the type of service they provide, and in combining private administration with public finance. The article illustrates that contracting out is a feasible option to be seriously considered for organizing certain prevention programmes on a large scale. There are strong indications from these projects of success in terms of reducing malnutrition, replicability and scale, and community involvement. When choosing that option, a government can tap available private local human resources through contracting out, rather than delivering those services by the public sector. However, as was done in both projects studied, consideration needs to be given to using a contract management unit for execution and monitoring, which costs 13-17% of the total project's budget. Rigorous assessments of the cost-effectiveness of contracted services are not available, but improved health outcomes, targeting of the poor, and basic cost data suggest that the programmes may well be relatively cost-effective. Although the contracting approach is not presented as the panacea to solve the malnutrition problem faced by Africa, it can certainly provide an alternative in many countries to increase coverage and quality of services.
Methods and Management of the Healthy Brain Study: A Large Multisite Qualitative Research Project
ERIC Educational Resources Information Center
Laditka, Sarah B.; Corwin, Sara J.; Laditka, James N.; Liu, Rui; Friedman, Daniela B.; Mathews, Anna E.; Wilcox, Sara
2009-01-01
Purpose of the study: To describe processes used in the Healthy Brain project to manage data collection, coding, and data distribution in a large qualitative project, conducted by researchers at 9 universities in 9 states. Design and Methods: Project management protocols included: (a) managing audiotapes and surveys to ensure data confidentiality,…
Modeling Veterans Healthcare Administration disclosure processes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beyeler, Walter E; DeMenno, Mercy B.; Finley, Patrick D.
As with other large healthcare organizations, medical adverse events at the Department of Veterans Affairs (VA) facilities can expose patients to unforeseen negative risks. VHA leadership recognizes that properly handled disclosure of adverse events can minimize potential harm to patients and negative consequences for the effective functioning of the organization. The work documented here seeks to help improve the disclosure process by situating it within the broader theoretical framework of issues management, and to identify opportunities for process improvement through modeling disclosure and reactions to disclosure. The computational model will allow a variety of disclosure actions to be tested across a range of incident scenarios. Our conceptual model will be refined in collaboration with domain experts, especially by continuing to draw on insights from VA Study of the Communication of Adverse Large-Scale Events (SCALE) project researchers.
Study of Travelling Interplanetary Phenomena Report
NASA Astrophysics Data System (ADS)
Dryer, Murray
1987-09-01
Scientific progress on the topic of energy, mass, and momentum transport from the Sun into the heliosphere is contingent upon interdisciplinary and international cooperative efforts on the part of many workers. Summarized here is a report of some highlights of research carried out during the SMY/SMA by the STIP (Study of Travelling Interplanetary Phenomena) Project that included solar and interplanetary scientists around the world. These highlights are concerned with coronal mass ejections from solar flares or erupting prominences (sometimes together); their large-scale consequences in interplanetary space (such as shocks and magnetic 'bubbles'); and energetic particles and their relationship to these large-scale structures. It is concluded that future progress is contingent upon similar international programs assisted by real-time (or near-real-time) warnings of solar activity by cooperating agencies along the lines experienced during the SMY/SMA.
Grid-Enabled Quantitative Analysis of Breast Cancer
2010-10-01
large-scale, multi-modality computerized image analysis. The central hypothesis of this research is that large-scale image analysis for breast cancer...research, we designed a pilot study utilizing large-scale parallel Grid computing harnessing nationwide infrastructure for medical image analysis. Also
Lyman L. McDonald; Robert Bilby; Peter A. Bisson; Charles C. Coutant; John M. Epifanio; Daniel Goodman; Susan Hanna; Nancy Huntly; Erik Merrill; Brian Riddell; William Liss; Eric J. Loudenslager; David P. Philipp; William Smoker; Richard R. Whitney; Richard N. Williams
2007-01-01
The year 2006 marked two milestones in the Columbia River Basin and the Pacific Northwest region's efforts to rebuild its once great salmon and steelhead runs: the 25th anniversary of the creation of the Northwest Power and Conservation Council and the 10th anniversary of an amendment to the Northwest Power Act that formalized scientific peer review of the council...
NASA Technical Reports Server (NTRS)
Dominick, Wayne D. (Editor); Gallagher, Mary C.
1985-01-01
There exists a large number of large-scale bibliographic Information Storage and Retrieval Systems containing large amounts of valuable data of interest in a wide variety of research applications. These systems are not used to capacity because the end users, i.e., the researchers, have not been trained in the techniques of accessing such systems. This thesis describes the development of a transportable, university-level course in methods of querying on-line interactive Information Storage and Retrieval systems as a solution to this problem. This course was designed to instruct upper division science and engineering students to enable these end users to directly access such systems. The course is designed to be taught by instructors who are not specialists in either computer science or research skills. It is independent of any particular IS and R system or computer hardware. The project is sponsored by NASA and conducted by the University of Southwestern Louisiana and Southern University.
Quantifying Climate Change Hydrologic Risk at NASA Ames Research Center
NASA Astrophysics Data System (ADS)
Mills, W. B.; Bromirski, P. D.; Coats, R. N.; Costa-Cabral, M.; Fong, J.; Loewenstein, M.; Milesi, C.; Miller, N.; Murphy, N.; Roy, S.
2013-12-01
In response to the 2009 Executive Order 13514 mandating U.S. federal agencies to evaluate infrastructure vulnerabilities due to climate variability and change, we provide an analysis of future climate flood risk at NASA Ames Research Center (Ames) along South S.F. Bay. This includes likelihood analysis of large-scale water vapor transport; statistical analysis of intense precipitation, high winds, sea level rise, storm surge, estuary dynamics, and saturated overland flooding; and likely impacts on wetlands and habitat loss near Ames. We use IPCC CMIP5 data from three Atmosphere-Ocean General Circulation Models with Representative Concentration Pathways of 8.5 Wm-2 and 4.5 Wm-2 and provide an analysis of climate variability and change associated with flooding and impacts at Ames. Intense storms impacting Ames are due to two large-scale processes, sub-tropical atmospheric rivers (ARs) and north Pacific Aleutian low-pressure (AL) storm systems. Both are analyzed here in terms of the Integrated Water Vapor (IWV) exceeding a critical threshold within a search domain, with the wind vector transporting the IWV from southerly to westerly to northwesterly for ARs and from northwesterly to northerly for ALs, within the Ames impact area during 1970-1999, 2040-2069, and 2070-2099. We also include a statistical model of extreme precipitation at Ames based on large-scale climatic predictors, and characterize changes using CMIP5 projections. Requirements for levee height to protect Ames are projected to increase and continually accelerate throughout this century as sea level rises. We use empirical statistical and analytical methods to determine the likelihood, in each year from the present through 2099, of water level surpassing different threshold values in SF Bay near NASA Ames.
We study the sensitivity of the water level corresponding to a 1-in-10 and 1-in-100 likelihood of exceedance to changes in the statistical distribution of storm surge height and ENSO height, in addition to increasing mean sea level. We examine the implications in the face of the CMIP5 projections. Storm intensification may result in increased flooding hazards at Ames. We analyze how the changes in precipitation intensity will impact the storm drainage system at Ames through continuous stormwater modeling of runoff with the EPA model SWMM 5 and projected downscaled daily precipitation data. Although extreme events will not adversely affect wetland habitats, adaptation projects--especially levee construction and improvement--will require filling of wetlands. Federal law mandates mitigation for fill placed in wetlands. We are currently calculating the potential mitigation burden by habitat type.
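The 1-in-10 and 1-in-100 exceedance levels discussed above are return levels of an extreme-value distribution fitted to annual-maximum water levels. A simplified sketch, assuming a Gumbel fit by the method of moments (the study's own empirical statistical and analytical methods are not reproduced here, and the data are synthetic):

```python
import numpy as np

def return_level(annual_maxima, return_period=100.0):
    """Water level with a 1-in-`return_period` annual chance of exceedance,
    from a Gumbel distribution fitted by the method of moments."""
    x = np.asarray(annual_maxima, dtype=float)
    beta = np.sqrt(6.0) * x.std(ddof=1) / np.pi   # Gumbel scale
    mu = x.mean() - 0.5772 * beta                 # Gumbel location
    p = 1.0 - 1.0 / return_period                 # annual non-exceedance prob.
    return mu - beta * np.log(-np.log(p))

# Synthetic annual-maximum water levels (m above datum), illustration only
rng = np.random.default_rng(0)
maxima = 2.0 + 0.3 * rng.gumbel(size=50)
print(round(return_level(maxima, 100.0), 2), "m (1-in-100 level)")
```

Under rising mean sea level, the fitted location parameter would drift upward year by year, which is why the exceedance likelihood for a fixed levee height grows through the century.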
Basin-Scale Hydrologic Impacts of CO2 Storage: Regulatory and Capacity Implications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Birkholzer, J.T.; Zhou, Q.
Industrial-scale injection of CO2 into saline sedimentary basins will cause large-scale fluid pressurization and migration of native brines, which may affect valuable groundwater resources overlying the deep sequestration reservoirs. In this paper, we discuss how such basin-scale hydrologic impacts can (1) affect regulation of CO2 storage projects and (2) reduce current storage capacity estimates. Our assessment arises from a hypothetical future carbon sequestration scenario in the Illinois Basin, which involves twenty individual CO2 storage projects in a core injection area suitable for long-term storage. Each project is assumed to inject five million tonnes of CO2 per year for 50 years. A regional-scale three-dimensional simulation model was developed for the Illinois Basin that captures both the local-scale CO2-brine flow processes and the large-scale groundwater flow patterns in response to CO2 storage. The far-field pressure buildup predicted for this selected sequestration scenario suggests that (1) the area that needs to be characterized in a permitting process may comprise a very large region within the basin if reservoir pressurization is considered, and (2) permits cannot be granted on a single-site basis alone because the near- and far-field hydrologic response may be affected by interference between individual sites. Our results also support recent studies in that environmental concerns related to near-field and far-field pressure buildup may be a limiting factor on CO2 storage capacity. In other words, estimates of storage capacity, if solely based on the effective pore volume available for safe trapping of CO2, may have to be revised based on assessments of pressure perturbations and their potential impact on caprock integrity and groundwater resources.
We finally discuss some of the challenges in making reliable predictions of large-scale hydrologic impacts related to CO2 sequestration projects.
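The order of magnitude of such far-field pressure buildup can be roughed out with the single-phase Theis solution, a drastic simplification of the basin-scale multiphase simulations described above; every reservoir parameter below is an illustrative assumption, not a value from the study:

```python
import math

def well_function(u, terms=40):
    """Theis well function W(u) = E1(u), via its convergent series
    E1(u) = -gamma - ln(u) + sum_{n>=1} (-1)^(n+1) u^n / (n * n!)."""
    total = -0.5772156649 - math.log(u)
    power = 1.0
    for n in range(1, terms + 1):
        power *= u / n                      # u^n / n!
        total += ((-1) ** (n + 1)) * power / n
    return total

def pressure_buildup_mpa(q_m3_s, r_m, t_s,
                         k=1e-13,    # permeability, m^2 (assumed)
                         b=100.0,    # reservoir thickness, m (assumed)
                         mu=5e-4,    # brine viscosity, Pa*s (assumed)
                         phi=0.15,   # porosity (assumed)
                         ct=1e-9):   # total compressibility, 1/Pa (assumed)
    """Pressure buildup (MPa) at radius r_m after injecting q_m3_s for t_s."""
    eta = k / (phi * mu * ct)               # hydraulic diffusivity, m^2/s
    u = r_m ** 2 / (4.0 * eta * t_s)
    dp_pa = q_m3_s * mu / (4.0 * math.pi * k * b) * well_function(u)
    return dp_pa / 1e6

# ~5 Mt/yr of CO2 at ~700 kg/m^3 is roughly 0.23 m^3/s of injected volume;
# buildup 50 km from a single injector after 50 years (illustrative only):
dp = pressure_buildup_mpa(0.23, 50e3, 50 * 365.25 * 86400)
print(round(dp, 2), "MPa")
```

Even this crude sketch shows measurable pressure perturbation tens of kilometres from an injector, which is the mechanism behind the permitting-area and site-interference conclusions above.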
Telecommunications technology and rural education in the United States
NASA Technical Reports Server (NTRS)
Perrine, J. R.
1975-01-01
The rural sector of the US is examined from the point of view of whether telecommunications technology can augment the development of rural education. Migratory farm workers and American Indians were the target groups which were examined as examples of groups with special needs in rural areas. The general rural population and the target groups were examined to identify problems and to ascertain specific educational needs. Educational projects utilizing telecommunications technology in target group settings were discussed. Large scale regional ATS-6 satellite-based experimental educational telecommunications projects were described. Costs and organizational factors were also examined for large scale rural telecommunications projects.
Induced Seismicity Potential of Energy Technologies
NASA Astrophysics Data System (ADS)
Hitzman, Murray
2013-03-01
Earthquakes attributable to human activities, known as "induced seismic events," have received heightened public attention in the United States over the past several years. Upon request from the U.S. Congress and the Department of Energy, the National Research Council was asked to assemble a committee of experts to examine the scale, scope, and consequences of seismicity induced during fluid injection and withdrawal associated with geothermal energy development, oil and gas development, and carbon capture and storage (CCS). The committee's report, publicly released in June 2012, indicates that induced seismicity associated with fluid injection or withdrawal is caused in most cases by change in pore fluid pressure and/or change in stress in the subsurface in the presence of faults with specific properties and orientations and a critical state of stress in the rocks. The factor that appears to have the most direct consequence in regard to induced seismicity is the net fluid balance (total balance of fluid introduced into or removed from the subsurface). Energy technology projects that are designed to maintain a balance between the amount of fluid being injected and withdrawn, such as most oil and gas development projects, appear to produce fewer seismic events than projects that do not maintain fluid balance. Major findings from the study include: (1) as presently implemented, the process of hydraulic fracturing for shale gas recovery does not pose a high risk for inducing felt seismic events; (2) injection for disposal of waste water derived from energy technologies does pose some risk for induced seismicity, but very few events have been documented over the past several decades relative to the large number of disposal wells in operation; and (3) CCS, due to the large net volumes of injected fluids suggested for future large-scale carbon storage projects, may have potential for inducing larger seismic events.
Housing first on a large scale: Fidelity strengths and challenges in the VA's HUD-VASH program.
Kertesz, Stefan G; Austin, Erika L; Holmes, Sally K; DeRussy, Aerin J; Van Deusen Lukas, Carol; Pollio, David E
2017-05-01
Housing First (HF) combines permanent supportive housing and supportive services for homeless individuals and removes traditional treatment-related preconditions for housing entry. There has been little research describing the strengths and shortfalls of HF implementation outside of research demonstration projects. The U.S. Department of Veterans Affairs (VA) has transitioned to an HF approach in a supportive housing program serving over 85,000 persons. This offers a naturalistic window to study fidelity when HF is adopted on a large scale. We operationalized HF into 20 criteria grouped into 5 domains. We assessed 8 VA medical centers twice (1 year apart), scoring each criterion using a scale ranging from 1 (low fidelity) to 4 (high fidelity). There were 2 HF domains (no preconditions and rapidly offering permanent housing) for which high fidelity was readily attained. There was uneven progress in prioritizing the most vulnerable clients for housing support. Two HF domains (sufficient supportive services and a modern recovery philosophy) had considerably lower fidelity. Interviews suggested that operational issues such as shortfalls in staffing and training likely hindered performance in these 2 domains. In this ambitious national HF program, the largest to date, we found substantial fidelity in focusing on permanent housing and removing preconditions to housing entry. Areas of concern included the adequacy of supportive services and of the deployment of a modern recovery philosophy. Under real-world conditions, large-scale implementation of HF is likely to require significant additional investment in client service supports to assure that results are concordant with those found in research studies.
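The scoring scheme described above (20 criteria in 5 domains, each rated 1 to 4) lends itself to a simple domain-level roll-up. A minimal sketch, using invented criterion names and ratings rather than the study's data:

```python
# Hypothetical fidelity ratings for one site visit: 20 criteria
# grouped into 5 domains, each scored 1 (low) to 4 (high fidelity).
# Criterion names (c1..c20) and values are invented for illustration.
from statistics import mean

def domain_scores(ratings, domains):
    """Average the criterion ratings within each fidelity domain."""
    return {d: mean(ratings[c] for c in crits) for d, crits in domains.items()}

domains = {
    "no_preconditions": ["c1", "c2", "c3", "c4"],
    "rapid_permanent_housing": ["c5", "c6", "c7", "c8"],
    "prioritize_vulnerable": ["c9", "c10", "c11", "c12"],
    "supportive_services": ["c13", "c14", "c15", "c16"],
    "recovery_philosophy": ["c17", "c18", "c19", "c20"],
}
# High ratings on the first two domains, lower elsewhere, echoing
# the pattern reported in the abstract.
ratings = {f"c{i}": 4 if i <= 8 else 2 for i in range(1, 21)}
print(domain_scores(ratings, domains))
```

Averaging within domains rather than overall makes uneven implementation (strong housing-entry fidelity, weak services fidelity) visible at a glance.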
Cogollor, José M; Rojo-Lacal, Javier; Hermsdörfer, Joachim; Ferre, Manuel; Arredondo Waldmeyer, Maria Teresa; Giachritsis, Christos; Armstrong, Alan; Breñosa Martinez, Jose Manuel; Bautista Loza, Doris Anabelle; Sebastián, José María
2018-03-26
Neurological patients after stroke usually present cognitive deficits that cause dependencies in their daily living. These deficits mainly affect the performance of some of their daily activities. For that reason, stroke patients need long-term processes for their cognitive rehabilitation. Considering that classical techniques are focused on acting as guides and depend on help from therapists, significant efforts are being made to improve current methodologies and to use eHealth and Web-based architectures to implement information and communication technology (ICT) systems that achieve reliable, personalized, home-based platforms that increase efficiency and attractiveness for patients and carers. The goal of this work was to provide an overview of the practices implemented for the assessment of stroke patients and for cognitive rehabilitation. This study brings together traditional methods and the most recent personalized platforms based on ICT and the Internet of Things. The literature review was conducted by a multidisciplinary team of researchers from the engineering, psychology, and sport science fields. The systematic review focused on published scientific research, other European projects, and the most current innovative large-scale initiatives in the area. A total of 3469 results retrieved from Web of Science, 284 studies from the Journal of Medical Internet Research, and 15 European research projects from the Community Research and Development Information Service, all from the last 15 years, were reviewed for classification and selection regarding their relevance. A total of 7 relevant studies on the screening of stroke patients are presented, along with 6 additional methods for the analysis of kinematics and 9 studies on the execution of goal-oriented activities. The classical methods of providing cognitive rehabilitation are classified into the 5 main techniques implemented.
The review concludes with the selection of 8 different ICT-based approaches found in scientific-technical studies, 9 European Commission-funded projects that offer eHealth architectures, and other large-scale activities such as smart houses and the City4Age initiative. Stroke is among the conditions with the greatest negative socioeconomic impact on countries. The design of new ICT-based systems should provide 4 main features for efficient and personalized cognitive rehabilitation: support in the execution of complex daily tasks, automatic error detection, home-based performance, and accessibility. Only 33% of the European projects presented fulfilled all of those requirements simultaneously. For this reason, current and future large-scale initiatives focused on eHealth and smart environments should try to remedy this situation by providing more complete and sophisticated platforms. Originally published in JMIR Rehabilitation and Assistive Technology (http://rehab.jmir.org), 26.03.2018.
Bierer, S Beth; Prayson, Richard A; Dannefer, Elaine F
2015-05-01
This study used variables proposed in social cognitive career theory (SCCT) to focus the evaluation of a research curriculum at the Cleveland Clinic Lerner College of Medicine of Case Western Reserve University (CCLCM). Eight cohorts of CCLCM medical students completed a web-based version of the six-scale Clinical Research Appraisal Inventory-Short Form (CRAI-SF) at matriculation (n = 128) or graduation (n = 111) during 2009-2013. Parametric statistics were used to compare CRAI-SF scales to domains proposed in SCCT: trainees' characteristics (gender, training level, advanced degree), career interests, career intentions (medical specialty), and performance (peer-reviewed publications and required thesis topic). A number of lessons emerged in using theory to frame the evaluation of a complex educational program. Graduates rated their research self-efficacy significantly higher on all six CRAI-SF scales, with large effect sizes (>.90) on five scales (Conceptualizing a Study, Study Design and Analysis, Responsible Research Conduct, Collaborating with Others, and Reporting a Study). Women and men did not have significantly different scores on the CRAI-SF scales (p > .05), suggesting that the research program provides adequate support for women students. Most thesis projects addressed clinical (36.9%, n = 41) or translational (34.2%, n = 38) research topics. The CRAI-SF discriminated between medical school matriculates and graduates, suggesting that research self-efficacy increases with mastery experiences. No significant relationships were found between CRAI-SF scores and graduates' thesis topics or chosen clinical specialty. Correlations demonstrated significant relationships between graduates' perceptions of research self-efficacy and their interest in clinical research careers.
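Effect sizes of the kind reported above (>.90 on five scales) are conventionally computed as Cohen's d with a pooled standard deviation; the abstract does not specify the formula, so this is an assumption. A minimal sketch on invented scale scores, not the study's data:

```python
# Cohen's d for two independent groups (pooled-SD form).
# The input scores below are invented for illustration only.
from statistics import mean, stdev

def cohens_d(a, b):
    """Standardized mean difference (b minus a) using the pooled SD."""
    na, nb = len(a), len(b)
    pooled_sd = (((na - 1) * stdev(a) ** 2 + (nb - 1) * stdev(b) ** 2)
                 / (na + nb - 2)) ** 0.5
    return (mean(b) - mean(a)) / pooled_sd

matriculants = [3.0, 3.5, 2.5, 3.2, 2.8]  # hypothetical self-efficacy ratings
graduates = [4.5, 5.0, 4.2, 4.8, 4.6]
print(round(cohens_d(matriculants, graduates), 2))
```

By the usual convention, d values above 0.8 are considered large, which is why the abstract flags >.90 as a large effect.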
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schauder, C.
This subcontract report was completed under the auspices of the NREL/SCE High-Penetration Photovoltaic (PV) Integration Project, which is co-funded by the U.S. Department of Energy (DOE) Office of Energy Efficiency and Renewable Energy (EERE) and the California Solar Initiative (CSI) Research, Development, Demonstration, and Deployment (RD&D) program funded by the California Public Utility Commission (CPUC) and managed by Itron. This project is focused on modeling, quantifying, and mitigating the impacts of large utility-scale PV systems (generally 1-5 MW in size) that are interconnected to the distribution system. This report discusses the concerns utilities have when interconnecting large PV systems that connect through PV inverters (a specific application of frequency converters). Additionally, a number of capabilities of PV inverters are described that could be implemented to mitigate the distribution system-level impacts of high-penetration PV integration. Finally, the main issues that need to be addressed to ease the interconnection of large PV systems to the distribution system are presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
de Supinski, B R; Miller, B P; Liblit, B
2011-09-13
Petascale platforms with O(10^5) and O(10^6) processing cores are driving advancements in a wide range of scientific disciplines. These large systems create unprecedented application development challenges. Scalable correctness tools are critical to shortening the time-to-solution on these systems. Currently, many DOE application developers use primitive manual debugging based on printf or traditional debuggers such as TotalView or DDT. This paradigm breaks down beyond a few thousand cores, yet bugs often arise above that scale. Programmers must reproduce problems in smaller runs to analyze them with traditional tools, or else perform repeated runs at scale using only primitive techniques. Even when traditional tools run at scale, the approach wastes substantial effort and computation cycles. Continued scientific progress demands new paradigms for debugging large-scale applications. The Correctness on Petascale Systems (CoPS) project is developing a revolutionary debugging scheme that will reduce the debugging problem to a scale that human developers can comprehend. The scheme can provide precise diagnoses of the root causes of failure, including suggestions of the location and the type of errors, down to the level of code regions or even a single execution point. Our fundamentally new strategy combines and expands three relatively new, complementary debugging approaches. The Stack Trace Analysis Tool (STAT), a 2011 R&D 100 Award winner, identifies behavior equivalence classes in MPI jobs and highlights cases in which elements of a class demonstrate divergent behavior, often the first indicator of an error. The Cooperative Bug Isolation (CBI) project has developed statistical techniques for isolating programming errors in widely deployed code, which we will adapt to large-scale parallel applications. Finally, we are developing a new approach to parallelizing expensive correctness analyses, such as the analysis of memory usage in the Memgrind tool.
In the first two years of the project, we have successfully extended STAT to determine the relative progress of different MPI processes. We have shown that STAT, which is now included in the debugging tools distributed by Cray with their large-scale systems, substantially reduces the scale at which traditional debugging techniques must be applied. We have extended CBI to large-scale systems and developed new compiler-based analyses that reduce its instrumentation overhead. Our results demonstrate that CBI can identify the source of errors in large-scale applications. Finally, we have developed MPIecho, a new technique that will reduce the time required to perform key correctness analyses, such as the detection of writes to unallocated memory. Overall, our research results are the foundations for new debugging paradigms that will improve application scientist productivity by reducing the time to determine which package or module contains the root cause of a problem, at all scales of our high-end systems. While we have made substantial progress in the first two years of CoPS research, significant work remains. STAT provides scalable debugging assistance for incorrect application runs; we could also apply its techniques to assertions in order to observe deviations from expected behavior. Further, we must continue to refine STAT's techniques to represent behavioral equivalence classes efficiently, as we expect systems with millions of threads in the next year. We are exploring new CBI techniques that can assess the likelihood that execution deviations from past behavior are the source of erroneous execution. Finally, we must develop usable correctness analyses that apply the MPIecho parallelization strategy in order to locate coding errors. We expect to make substantial progress in these directions in the next year but anticipate that significant work will remain to provide usable, scalable debugging paradigms.
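The behavior-equivalence-class idea behind STAT can be illustrated with a small sketch: group MPI ranks by identical call stacks, so that a rank whose stack diverges from its peers stands out. This is an illustration of the grouping concept only, not STAT's actual implementation:

```python
# Group MPI ranks into equivalence classes by their call stacks.
# The ranks and stack traces below are invented for illustration.
from collections import defaultdict

def equivalence_classes(stack_traces):
    """Return rank groups sharing identical stacks, smallest first.
    Small classes (outliers) are often the first place to look for
    a hung or misbehaving task."""
    classes = defaultdict(list)
    for rank, trace in stack_traces.items():
        classes[tuple(trace)].append(rank)
    return sorted(classes.values(), key=len)

# Hypothetical snapshot of 6 ranks: rank 3 has diverged into I/O
# while the others wait in a collective.
traces = {
    0: ["main", "solver", "MPI_Allreduce"],
    1: ["main", "solver", "MPI_Allreduce"],
    2: ["main", "solver", "MPI_Allreduce"],
    3: ["main", "io_flush", "write"],
    4: ["main", "solver", "MPI_Allreduce"],
    5: ["main", "solver", "MPI_Allreduce"],
}
print(equivalence_classes(traces)[0])
```

Grouping by equivalence class is what lets this style of analysis scale: a million ranks typically collapse into a handful of classes, which a human can inspect.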
NASA Astrophysics Data System (ADS)
Soja, Amber; Westberg, David; Stackhouse, Paul, Jr.; McRae, Douglas; Jin, Ji-Zhong; Sukhinin, Anatoly
2010-05-01
Fire is the dominant disturbance that precipitates ecosystem change in boreal regions, and fire is largely under the control of weather and climate. Fire frequency, fire severity, area burned and fire season length are predicted to increase in boreal regions under current climate change scenarios. Therefore, changes in fire regimes have the potential to compel ecological change, moving ecosystems more quickly towards equilibrium with a new climate. The ultimate goal of this research is to assess the viability of large-scale (1°) data for defining fire weather danger and fire regimes, so that large-scale fire weather data, like that available from current Intergovernmental Panel on Climate Change (IPCC) climate change scenarios, can be used with confidence to predict future fire regimes. In this talk, we intend to: (1) evaluate Fire Weather Indices (FWI) derived using reanalysis and interpolated station data; (2) discuss the advantages and disadvantages of using these distinct data sources; and (3) highlight established relationships between large-scale fire weather data, area burned, active fires and ecosystems burned. Specifically, the Canadian Forestry Service (CFS) Fire Weather Index (FWI) will be derived using: (1) NASA Goddard Earth Observing System version 4 (GEOS-4) large-scale reanalysis and NASA Global Precipitation Climatology Project (GPCP) data; and (2) National Climatic Data Center (NCDC) surface station-interpolated data. The FWI requires local noon surface-level air temperature, relative humidity, wind speed, and daily (noon-to-noon) rainfall. GEOS-4 reanalysis and NCDC station-interpolated fire weather indices are generally consistent spatially, temporally and quantitatively. Additionally, increased fire activity coincides with increased FWI ratings in both data products.
Relationships have been established between large-scale FWI and area burned, fire frequency, and ecosystem types, and these can be used to estimate historic and future fire regimes.
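Evaluating one FWI product against another amounts to comparing two co-located series point by point. A minimal sketch of such a comparison (bias, RMSE, and Pearson correlation), using invented daily FWI values rather than actual GEOS-4 or NCDC data (the full FWI calculation itself involves several fuel moisture codes and is not reproduced here):

```python
# Compare two co-located fire weather index series, e.g. one derived
# from reanalysis and one from station-interpolated data.
# The daily values below are invented for illustration.
from statistics import mean

def compare_series(a, b):
    """Return (bias, rmse, pearson_r) of series a relative to b."""
    n = len(a)
    bias = mean(x - y for x, y in zip(a, b))
    rmse = (sum((x - y) ** 2 for x, y in zip(a, b)) / n) ** 0.5
    ma, mb = mean(a), mean(b)
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return bias, rmse, cov / (sa * sb)

reanalysis_fwi = [5.2, 12.1, 20.4, 8.3, 15.0]  # hypothetical daily FWI
station_fwi = [4.8, 11.5, 21.0, 7.9, 14.2]
bias, rmse, r = compare_series(reanalysis_fwi, station_fwi)
print(f"bias={bias:.2f} rmse={rmse:.2f} r={r:.3f}")
```

A small bias and RMSE together with high correlation is the quantitative form of the "generally consistent spatially, temporally and quantitatively" claim in the abstract.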
NASA Astrophysics Data System (ADS)
Burgos-Martin, J.; Sanchez-Padron, M.; Sanchez, F.; Martinez-Roger, Carlos
2004-07-01
Large-scale observing facilities are scarce and costly. Even so, the prospect of enlarging these facilities or increasing their number is quite real, and several projects are taking their first steps in this direction. These costly facilities require the cooperation of highly qualified institutions able to undertake the project from both the scientific and technological points of view, as well as the vital collaboration and effective support, at the highest level, of several countries able to provide the necessary investment for their construction. Because of these technological implications and the financial magnitude of these projects, their impact goes well beyond the international astrophysical community. We propose to carry out a study of the socio-economic impact of the construction and operation of an Extremely Large Telescope of the 30-100 m class. We plan to address several aspects, such as its impact on the promotion of employment; the social, educational and cultural integration of the population; the stimulus to industry; its impact on national and international research policies; environmental issues; etc. We will also analyze the financial instruments available, including the special aids accessible only to some countries and regions to encourage their participation in projects of this magnitude.
NASA Technical Reports Server (NTRS)
Zapata, Edgar
2012-01-01
This paper presents past and current work in dealing with indirect industry and NASA costs when providing cost estimation or analysis for NASA projects and programs. Indirect costs, defined here as those project costs removed from the actual hands-on hardware or software labor, make up most of the costs of today's complex, large-scale NASA/industry projects. This appears to be the case across phases, from research into development, into production, and into operation of the system. Space transportation is the case of interest here. Modeling and cost estimation as a process, rather than a product, will be emphasized. Analysis as a series of belief systems in play among decision makers and decision factors will also be emphasized to provide context.
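To see why indirect costs dominate such estimates, consider a simple burdened-cost roll-up in which overhead is applied to direct labor, G&A to the subtotal, and fee on top. The rate names and values below are illustrative assumptions, not NASA or industry figures:

```python
# Illustrative burdened-cost roll-up: direct labor carried through
# stacked indirect rates. All rate names and values are assumptions.
def burdened_cost(direct_labor, overhead=0.50, g_and_a=0.15, fee=0.10):
    """Apply overhead to direct labor, G&A to the subtotal,
    and fee to the result."""
    subtotal = direct_labor * (1 + overhead)   # labor + overhead
    subtotal *= 1 + g_and_a                    # + general & administrative
    return subtotal * (1 + fee)                # + fee/profit

total = burdened_cost(1_000_000)
indirect_share = 1 - 1_000_000 / total
print(f"total={total:,.0f} indirect share={indirect_share:.0%}")
```

Even with these modest assumed rates, hands-on labor is well under half of the final figure, which is the point the abstract makes about large-scale space projects.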
NASA Technical Reports Server (NTRS)
1983-01-01
The Flat Plate Solar Array Project focuses on advancing technologies relevant to the design and construction of megawatt-level central station systems. Photovoltaic modules and arrays for flat plate central station or other large-scale electric power production facilities require the establishment of a technical base that resolves design issues and results in practical and cost-effective configurations. Design, qualification and maintenance issues related to central station arrays, derived from the engineering and operating experiences of early applications and parallel laboratory research activities, are investigated. Technical issues are examined from the viewpoints of the utility engineer, architect/engineer and laboratory researcher. Topics on optimum source circuit designs, module insulation design for high system voltages, array safety, structural interface design, measurements, and array operation and maintenance are discussed.
Sagili, Karuna D; Satyanarayana, Srinath; Chadha, Sarabjit S; Wilson, Nevin C; Kumar, Ajay M V; Oeltmann, John E; Chadha, Vineet K; Nagaraja, Sharath Burugina; Ghosh, Smita; Q Lo, Terrence; Volkmann, Tyson; Willis, Matthew; Shringarpure, Kalpita; Reddy, Ravichandra Chinnappa; Kumar, Prahlad; Nair, Sreenivas A; Rao, Raghuram; Yassin, Mohammed; Mwangala, Perry; Zachariah, Rony; Tonsing, Jamhoih; Harries, Anthony D; Khaparde, Sunil
2018-01-01
Background: The Global Fund encourages operational research (OR) in all its grants; however very few reports describe this aspect. In India, Project Axshya was supported by a Global Fund grant to improve the reach and visibility of the government Tuberculosis (TB) services among marginalised and vulnerable communities. OR was incorporated to build research capacity of professionals working with the national TB programme and to generate evidence to inform policies and practices. Objectives: To describe how Project Axshya facilitated building OR capacity within the country, helped in addressing several TB control priority research questions, documented project activities and their outcomes, and influenced policy and practice. Methods: From September 2010 to September 2016, three key OR-related activities were implemented. First, practical output-oriented modular training courses were conducted (n = 3) to build research capacity of personnel involved in the TB programme, co-facilitated by The Union, in collaboration with the national TB programme, WHO country office and CDC, Atlanta. Second, two large-scale Knowledge, Attitude and Practice (KAP) surveys were conducted at baseline and mid-project to assess the changes pertaining to TB knowledge, attitudes and practices among the general population, TB patients and health care providers over the project period. Third, studies were conducted to describe the project's core activities and outcomes. Results: In the training courses, 44 participant teams were supported to develop research protocols on topics of national priority, resulting in 28 peer-reviewed scientific publications. The KAP surveys and description of project activities resulted in 14 peer-reviewed publications. Of the published papers at least 12 have influenced change in policy or practice.
Conclusions: OR within a Global Fund supported TB project has resulted in building OR capacity, facilitating research in areas of national priority and influencing policy and practice. We believe this experience will provide guidance for undertaking OR in Global Fund projects. PMID:29553308
Tolopko, Andrew N; Sullivan, John P; Erickson, Sean D; Wrobel, David; Chiang, Su L; Rudnicki, Katrina; Rudnicki, Stewart; Nale, Jennifer; Selfors, Laura M; Greenhouse, Dara; Muhlich, Jeremy L; Shamu, Caroline E
2010-05-18
Shared-usage high throughput screening (HTS) facilities are becoming more common in academe as large-scale small molecule and genome-scale RNAi screening strategies are adopted for basic research purposes. These shared facilities require a unique informatics infrastructure that must not only provide access to and analysis of screening data, but must also manage the administrative and technical challenges associated with conducting numerous, interleaved screening efforts run by multiple independent research groups. We have developed Screensaver, a free, open source, web-based lab information management system (LIMS), to address the informatics needs of our small molecule and RNAi screening facility. Screensaver supports the storage and comparison of screening data sets, as well as the management of information about screens, screeners, libraries, and laboratory work requests. To our knowledge, Screensaver is one of the first applications to support the storage and analysis of data from both genome-scale RNAi screening projects and small molecule screening projects. The informatics and administrative needs of an HTS facility may be best managed by a single, integrated, web-accessible application such as Screensaver. Screensaver has proven useful in meeting the requirements of the ICCB-Longwood/NSRB Screening Facility at Harvard Medical School, and has provided similar benefits to other HTS facilities.
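One core capability described above is comparing screening data sets across screens. A minimal sketch of the kind of cross-screen comparison such a LIMS supports; the data model, field names, and identifiers below are invented for illustration and are not Screensaver's actual schema:

```python
# Minimal stand-in for LIMS screen records and a cross-screen hit
# comparison. All names and identifiers are invented.
from dataclasses import dataclass, field

@dataclass
class Screen:
    title: str
    screen_type: str                        # "small_molecule" or "rnai"
    hits: set = field(default_factory=set)  # well/reagent identifiers

def shared_hits(a: Screen, b: Screen) -> set:
    """Reagents scored as hits in both screens, e.g. to cross-check
    a small-molecule result against an RNAi result."""
    return a.hits & b.hits

sm = Screen("kinase inhibitors", "small_molecule", {"W001", "W002", "W003"})
rnai = Screen("kinase knockdowns", "rnai", {"W002", "W003", "W004"})
print(sorted(shared_hits(sm, rnai)))
```

Supporting both screen types in one data model is what enables this kind of comparison, which the abstract highlights as one of Screensaver's distinguishing features.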
Andreas, Afshin; Wilcox, Steve
2016-03-14
Located in Colorado, near Denver International Airport, SolarTAC is a private, member-based, 74-acre outdoor facility where the solar industry tests, validates, and demonstrates advanced solar technologies. SolarTAC was launched in 2008 by a public-private consortium, including the Midwest Research Institute (MRI). As a supporting member of SolarTAC, the U.S. Department of Energy National Renewable Energy Laboratory (NREL) has established a high-quality solar and meteorological measurement station at this location. This Solar Resource and Meteorological Assessment Project (SOLRMAP) provides high-quality measurements to support the deployment of power projects in the United States. The no-funds-exchanged collaboration brings NREL solar resource assessment expertise together with industry needs for measurements. The end result is high-quality data sets to support the financing, design, and monitoring of large-scale solar power projects for industry, in addition to research-quality data for NREL model development. NREL provides consultation for instrumentation and station deployment, along with instrument calibrations, data acquisition, quality assessment, data distribution, and summary reports. Industry participants provide equipment, infrastructure, and station maintenance.
NASA Astrophysics Data System (ADS)
Fitzgerald, Michael; McKinnon, David H.; Danaia, Lena
2015-12-01
In this paper, we outline the theory behind the educational design used to implement a large-scale high school astronomy education project. This design was created in response to the realization that the educational design in the initial stages of the project was ineffective. The new design follows an iterative improvement model in which the materials and general approach can evolve in response to solicited feedback. The improvement cycle concentrates on avoiding overly positive self-evaluation, addressing relevant external school and community factors, and backward mapping from clearly set goals. Limiting factors, including time, resources, support and the potential for failure in the classroom, are dealt with as much as possible in the large-scale design, giving teachers the best chance of successful implementation in their real-world classrooms. The actual approach adopted following the principles of this design, which has seen success in bringing real astronomical data and access to telescopes into the high school classroom, is also outlined.
Understanding life together: A brief history of collaboration in biology
Vermeulen, Niki; Parker, John N.; Penders, Bart
2013-01-01
The history of science shows a shift from single-investigator ‘little science’ to increasingly large, expensive, multinational, interdisciplinary and interdependent ‘big science’. In physics and allied fields this shift has been well documented, but the rise of collaboration in the life sciences and its effect on scientific work and knowledge has received little attention. Research in biology exhibits different historical trajectories and organisation of collaboration in field and laboratory – differences still visible in contemporary collaborations such as the Census of Marine Life and the Human Genome Project. We employ these case studies as strategic exemplars, supplemented with existing research on collaboration in biology, to expose the different motives, organisational forms and social dynamics underpinning contemporary large-scale collaborations in biology and their relations to historical patterns of collaboration in the life sciences. We find the interaction between research subject, research approach as well as research organisation influencing collaboration patterns and the work of scientists. PMID:23578694
Really Large Scale Computer Graphic Projection Using Lasers and Laser Substitutes
NASA Astrophysics Data System (ADS)
Rother, Paul
1989-07-01
This paper reflects on past laser projects that displayed vector-scanned computer graphic images on very large and irregular surfaces. Since the availability of microprocessors and high-powered visible lasers, very large scale computer graphics projection has become a reality. Owing to their independence from a focusing lens, lasers easily project onto distant and irregular surfaces and have been used for amusement parks, theatrical performances, concert performances, industrial trade shows and dance clubs. Lasers have been used to project onto mountains, buildings, 360° globes, clouds of smoke and water. These methods have proven successful in installations at Epcot Theme Park in Florida; Stone Mountain Park in Georgia; the 1984 Olympics in Los Angeles; hundreds of corporate trade shows; and thousands of musical performances. Using new ColorRay™ technology, costly and fragile lasers are no longer necessary. Utilizing fiber optic technology, the functionality of lasers can be duplicated for new and exciting projection possibilities. ColorRay™ technology has enjoyed worldwide recognition in conjunction with Pink Floyd's and George Michael's worldwide tours.
Sustainable Biofuels Development Center
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reardon, Kenneth F.
2015-03-01
The mission of the Sustainable Bioenergy Development Center (SBDC) is to enhance the capability of America's bioenergy industry to produce transportation fuels and chemical feedstocks on a large scale, with significant energy yields, at competitive cost, through sustainable production techniques. Research within the SBDC is organized in five areas: (1) Development of Sustainable Crops and Agricultural Strategies, (2) Improvement of Biomass Processing Technologies, (3) Biofuel Characterization and Engine Adaptation, (4) Production of Byproducts for Sustainable Biorefining, and (5) Sustainability Assessment, including evaluation of the ecosystem/climate change implications of center research and evaluation of the policy implications of widespread production and utilization of bioenergy. The overall goal of this project is to develop new sustainable bioenergy-related technologies. To achieve that goal, three specific activities were supported with DOE funds: bioenergy-related research initiation projects, bioenergy research and education via support of undergraduate and graduate students, and research support activities (equipment purchases, travel to attend bioenergy conferences, and seminars). Numerous research findings in diverse fields related to bioenergy were produced from these activities and are summarized in this report.
NASA Astrophysics Data System (ADS)
Gorgas, Thomas; Conze, Ronald; Lorenz, Henning; Elger, Kirsten; Ulbricht, Damian; Wilkens, Roy; Lyle, Mitchell; Westerhold, Thomas; Drury, Anna Joy; Tian, Jun; Hahn, Annette
2017-04-01
Scientific ocean drilling over the past >40 years and corresponding efforts on land (now spanning more than 20 years) have led to the accumulation of an enormous amount of valuable petrophysical, geochemical, biological and geophysical data obtained through laboratory and field experiments across a multitude of scale and time dimensions. Such data can be utilized comprehensively in a holistic fashion, providing a basis for an enhanced "Core-Log Integration" and for modeling small-scale basin processes up to large-scale Earth phenomena, while also storing and managing all relevant information in an Open Access fashion. Since the early 1990s, members of our team have acquired and measured a large dataset of physical and geochemical properties representing both terrestrial and marine geological environments. This dataset covers a variety of macro- to microscale dimensions, allowing this type of interdisciplinary data examination. Over time, data management and processing tools have been developed and were recently merged with modern data publishing methods, which allow data and associated publications to be identified and tracked in a concise manner. Our current presentation summarizes an important part of the value chain in the geosciences, comprising: 1) The state of the art in data management for continental and lake drilling projects, performed with and through ICDP's Drilling Information System (DIS). 2) CODD (Code for Ocean Drilling Data), a numerically based, programmable data-processing toolbox applicable to both continental and marine drilling projects. 3) The implementation of persistent identifiers, such as the International Geo Sample Number (IGSN), to identify and track sample material as part of Digital Object Identifier (DOI)-tagged operation reports and research publications.
4) A list of contacts provided for scientists with an interest in learning and applying methods and techniques we offer in form of basic and advanced training courses at our respective research institutions and facilities around the world.
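The persistent identifiers mentioned above (IGSN, DOI) resolve through ordinary HTTP redirects, so linking a sample or report from a data system reduces to composing a resolver URL. A minimal sketch, with the caveat that the resolver hosts and the example identifiers below are illustrative assumptions, not values from this abstract:

```python
# Sketch: composing resolver URLs for persistent identifiers.
# The hosts igsn.org and doi.org follow common resolver practice;
# the sample identifiers used below are hypothetical.

def igsn_url(igsn: str) -> str:
    """Resolver URL for an International Geo Sample Number (conventionally uppercase)."""
    return f"https://igsn.org/{igsn.upper()}"

def doi_url(doi: str) -> str:
    """Resolver URL for a Digital Object Identifier."""
    return f"https://doi.org/{doi}"

print(igsn_url("icdp5054eew1001"))
print(doi_url("10.5880/ICDP.5054.001"))
```

Following the redirect from such a URL leads to the landing page of the sample or publication record, which is what makes DOI-tagged reports and IGSN-tagged samples mutually trackable.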
NASA Astrophysics Data System (ADS)
Vaquero, C.; López de Ipiña, J.; Galarza, N.; Hargreaves, B.; Weager, B.; Breen, C.
2011-07-01
New developments based on nanotechnology have to guarantee safe products and processes in order to be accepted by society. The Polyfire project will develop and scale up techniques for processing halogen-free, fire-retardant nanocomposite materials and coatings based on unsaturated polyester resins and organoclays. The project includes a work package that will assess the health and environmental impacts of handling nanoparticles, covering the following tasks: (1) identification of health and environmental impacts derived from the processes, (2) experimentation to study specific nanoparticle emissions, (3) development of a risk management methodology for the process, and (4) a comparison of the health and environmental impact of new and existing materials. To date, potential scenarios for exposure to nanomaterials have been identified through a Preliminary Hazard Analysis (PHA) of the new production processes. In the next step, these scenarios will be studied and simulated to evaluate potential emissions of nanomaterials. Polyfire is a collaborative European project funded by the European Commission 7th Framework Programme (Grant Agreement No. 229220). It features 11 partners from 5 countries (5 SMEs, 3 research institutes, 2 large companies, 1 association) and runs for three years (1 September 2009 - 31 August 2012). The project is an example of industrial research and development that aims to bring to market new products promoting the safe use of nanomaterials.
Activating social strategies: Face-to-face interaction in technology-mediated citizen science.
Cappa, Francesco; Laut, Jeffrey; Nov, Oded; Giustiniano, Luca; Porfiri, Maurizio
2016-11-01
The use of crowds in research activities by public and private organizations is growing in different forms. Citizen science is a popular means of engaging the general public in research activities led by professional scientists. By involving a large number of amateur scientists, citizen science enables distributed data collection and analysis on a scale that would otherwise be difficult and costly to achieve. While advances in information technology over the past few decades have fostered the growth of citizen science through online participation, several projects continue to fail due to limited participation. Such web-based projects may isolate the citizen scientists from the researchers. Adopting the perspective of social strategy, we use a measure-manipulate-measure experiment to investigate whether motivations to participate in a citizen science project can be positively influenced by face-to-face interaction with the scientists leading the project. Such interaction gives participants the opportunity to ask questions on the spot and to obtain a detailed explanation of the citizen science project, its scientific merit, and its environmental relevance. Social and cultural factors that moderate the effect of face-to-face interactions on motivation are also dissected and analyzed. Our findings provide exploratory insight into a means of motivating crowds to participate in online environmental monitoring projects, and offer possible criteria for selecting target audiences. Copyright © 2016 Elsevier Ltd. All rights reserved.
Translational Genomics in Low and Middle Income Countries: Opportunities and Challenges
Tekola-Ayele, Fasil; Rotimi, Charles N.
2015-01-01
Translation of genomic discoveries into patient care is slowly becoming a reality in developed economies around the world. In contrast, low- and middle-income countries (LMIC) have participated minimally in genomic research for several reasons, including a lack of coherent national policies, a limited number of well-trained genomic scientists, poor research infrastructure, and local economic and cultural challenges. Recent initiatives such as Human Heredity and Health in Africa (H3Africa), the Qatar Genome Project, and the Mexico National Institute of Genomic Medicine (INMEGEN), which aim to address these problems through capacity building and the empowerment of local researchers, have sparked a paradigm shift. In this short communication, we describe the experiences of small-scale medical genetics and translational genomics research programs in LMIC. The lessons drawn from these programs drive home the importance of addressing resource, policy, and socio-cultural dynamics to realize the promise of precision medicine driven by genomic science globally. Echoing lessons from bench-to-community translational genomics research, we advocate that large-scale genomics research projects can be successfully linked with health care programs. To harness the benefits of genomics-led health care, LMIC governments should begin to develop national genomics policies that address human and technology capacity development within the context of their national economic and socio-cultural uniqueness. These policies should encourage international collaboration and promote links between public health programs and genomics researchers. Finally, we highlight the potential catalytic roles of the global community in fostering translational genomics in LMIC. PMID:26138992
The Galaxy platform for accessible, reproducible and collaborative biomedical analyses: 2016 update
Afgan, Enis; Baker, Dannon; van den Beek, Marius; Blankenberg, Daniel; Bouvier, Dave; Čech, Martin; Chilton, John; Clements, Dave; Coraor, Nate; Eberhard, Carl; Grüning, Björn; Guerler, Aysam; Hillman-Jackson, Jennifer; Von Kuster, Greg; Rasche, Eric; Soranzo, Nicola; Turaga, Nitesh; Taylor, James; Nekrutenko, Anton; Goecks, Jeremy
2016-01-01
High-throughput data production technologies, particularly ‘next-generation’ DNA sequencing, have ushered in widespread and disruptive changes to biomedical research. Making sense of the large datasets produced by these technologies requires sophisticated statistical and computational methods, as well as substantial computational power. This has led to an acute crisis in the life sciences, as researchers without informatics training attempt to perform computation-dependent analyses. Since 2005, the Galaxy project has worked to address this problem by providing a framework that makes advanced computational tools usable by non-experts. Galaxy seeks to make data-intensive research more accessible, transparent and reproducible by providing a Web-based environment in which users can perform computational analyses and have all of the details automatically tracked for later inspection, publication, or reuse. In this report we highlight recently added features enabling biomedical analyses on a large scale. PMID:27137889
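Beyond its Web interface, Galaxy exposes its functionality through a REST API, so an analysis can be scripted as well as clicked through. A minimal sketch of composing a request URL for the histories endpoint, assuming the conventional `key` API-key parameter; the server URL and key below are placeholders, and the exact authentication scheme should be checked against your Galaxy server's documentation:

```python
# Sketch: composing a Galaxy REST API request with the standard library only.
# The base URL and API key are hypothetical placeholders.
from urllib.parse import urlencode, urljoin

def histories_request(base_url: str, api_key: str) -> str:
    """Compose the URL for Galaxy's /api/histories endpoint (lists the user's histories)."""
    endpoint = urljoin(base_url.rstrip("/") + "/", "api/histories")
    return f"{endpoint}?{urlencode({'key': api_key})}"

url = histories_request("https://usegalaxy.org", "MY_API_KEY")
print(url)  # https://usegalaxy.org/api/histories?key=MY_API_KEY
```

In practice one would fetch this URL with an HTTP client (or use a dedicated Galaxy client library) and receive a JSON list of history records, each carrying the provenance details the platform tracks automatically.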
Using Microsoft Excel[R] to Calculate Descriptive Statistics and Create Graphs
ERIC Educational Resources Information Center
Carr, Nathan T.
2008-01-01
Descriptive statistics and appropriate visual representations of scores are important for all test developers, whether they are experienced testers working on large-scale projects, or novices working on small-scale local tests. Many teachers put in charge of testing projects do not know "why" they are important, however, and are utterly convinced…
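The spreadsheet functions the article discusses have direct analogues in most environments; a small Python sketch of the same descriptive statistics, with an invented score list for illustration:

```python
# Descriptive statistics for a set of test scores, mirroring the
# Excel functions AVERAGE, MEDIAN, and STDEV. The scores are invented.
import statistics

scores = [2, 4, 4, 4, 5, 5, 7, 9]

mean = statistics.mean(scores)      # Excel AVERAGE
median = statistics.median(scores)  # Excel MEDIAN
stdev = statistics.stdev(scores)    # Excel STDEV (sample standard deviation)

print(mean, median, round(stdev, 2))
```

Note the distinction, which matters in Excel as well: `statistics.stdev` is the sample standard deviation (Excel STDEV), while `statistics.pstdev` is the population version (Excel STDEVP).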
Development of Affordable, Low-Carbon Hydrogen Supplies at an Industrial Scale
ERIC Educational Resources Information Center
Roddy, Dermot J.
2008-01-01
An existing industrial hydrogen generation and distribution infrastructure is described, and a number of large-scale investment projects are outlined. All of these projects have the potential to generate significant volumes of low-cost, low-carbon hydrogen. The technologies concerned range from gasification of coal with carbon capture and storage…
NASA Astrophysics Data System (ADS)
Kavka, Petr; Zumr, David; Neumann, Martin; Lidmila, Martin; Dufka, Dušan
2017-04-01
Soil erosion of slopes along linear construction sites, such as railroads, roads, pipelines or watercourses, is usually underestimated by construction companies and controlling authorities. Under certain circumstances, however, when a construction site is not maintained and protected properly, large amounts of soil may be transported from the site to the surrounding environment during intense rainfall. The transported sediment, often carrying adsorbed pollutants, may reach watercourses and cause siltation and pollution of the receiving waters. Within this applied research project we investigate low-cost, quick and easy technical measures that would help protect slopes against splash erosion, rill development and sliding. The methodology is based on testing various permeable covers, sheets, anchoring and patchy vegetation at plot and hillslope scales. In this contribution we present the experimental plot setup, consisting of large soil blocks encapsulated in monitored steel containers and a nozzle rainfall simulator. The presentation is funded by the Technology Agency of the Czech Republic (research project TH02030428) and an internal CTU student grant.
ERIC Educational Resources Information Center
Gaffney, Jon D. H.; Richards, Evan; Kustusch, Mary Bridget; Ding, Lin; Beichner, Robert J.
2008-01-01
The SCALE-UP (Student-Centered Activities for Large Enrollment Undergraduate Programs) project was developed to implement reforms designed for small classes in large physics classes. Over 50 schools across the country, ranging from Wake Technical Community College to the Massachusetts Institute of Technology (MIT), have adopted it for classes of…
NASA Astrophysics Data System (ADS)
Black, R. X.
2017-12-01
We summarize results from a project focusing on regional temperature and precipitation extremes over the continental United States. Our project introduces a new framework for evaluating these extremes emphasizing their (a) large-scale organization, (b) underlying physical sources (including remote-excitation and scale-interaction) and (c) representation in climate models. Results to be reported include the synoptic-dynamic behavior, seasonality and secular variability of cold waves, dry spells and heavy rainfall events in the observational record. We also study how the characteristics of such extremes are systematically related to Northern Hemisphere planetary wave structures and thus planetary- and hemispheric-scale forcing (e.g., those associated with major El Nino events and Arctic sea ice change). The underlying physics of event onset are diagnostically quantified for different categories of events. Finally, the representation of these extremes in historical coupled climate model simulations is studied and the origins of model biases are traced using new metrics designed to assess the large-scale atmospheric forcing of local extremes.
Budgeting for Solar PV Plant Operations & Maintenance: Practices and Pricing.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Enbar, Nadav; Weng, Dean; Klise, Geoffrey Taylor
2016-01-01
With rising grid interconnections of solar photovoltaic (PV) systems, greater attention is being trained on lifecycle performance, reliability, and project economics. Expected to meet production thresholds over a 20-30 year timeframe, PV plants require a steady diet of operations and maintenance (O&M) oversight to meet contractual terms. However, industry best practices are only just beginning to emerge, and O&M budgets, given the arrangement of the solar project value chain, appear to vary widely. Based on insights from in-depth interviews and survey research, this paper presents an overview of the utility-scale PV O&M budgeting process along with its guiding rationales, before detailing perspectives on current plant upkeep activities and price points, largely in the U.S. It concludes by pondering potential opportunities for improving upon existing O&M budgeting approaches in ways that can benefit the industry at large.
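The lifecycle framing above, an escalating annual O&M budget evaluated over a 20-30 year horizon, reduces to a standard discounted cash flow. A back-of-the-envelope sketch, where every figure (plant size, $/kW-yr cost, escalation and discount rates) is an invented placeholder rather than a value from the study:

```python
# Sketch: present value of an escalating annual O&M budget over a plant's
# contract life. All numeric inputs below are hypothetical placeholders.

def om_lifecycle_cost(kw: float, cost_per_kw_yr: float, years: int,
                      escalation: float, discount: float) -> float:
    """Discounted sum of annual O&M spending, escalated each year."""
    total = 0.0
    for t in range(1, years + 1):
        annual = kw * cost_per_kw_yr * (1 + escalation) ** (t - 1)
        total += annual / (1 + discount) ** t
    return total

# Hypothetical 10 MW plant, $15/kW-yr, 25 years, 2% escalation, 7% discount rate
pv_cost = om_lifecycle_cost(10_000, 15.0, 25, 0.02, 0.07)
print(f"${pv_cost:,.0f}")
```

Varying the per-kW cost and escalation assumptions in a sketch like this is one way to see why, as the paper observes, budgets across the project value chain can diverge so widely from identical plant specifications.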
Mems: Platform for Large-Scale Integrated Vacuum Electronic Circuits
2017-03-20
Final Report: MEMS Platform for Large-Scale Integrated Vacuum Electronic Circuits (LIVEC). Contract No. W911NF-14-C-0093; reporting period 1 Jul 2014 to 30 Jun 2015; report date 20 Mar 2017. The objective of the LIVEC advanced study project was to develop a platform for large-scale integrated vacuum electronic circuits. COR: Dr. James Harvey, U.S. ARO, RTP, NC 27709-2211, Phone: 702-696-2533. Distribution Unlimited.
Hernandez-Villafuerte, Karla; Sussex, Jon; Robin, Enora; Guthrie, Sue; Wooding, Steve
2017-02-02
Publicly funded biomedical and health research is expected to achieve the best return possible for taxpayers and for society generally. It is therefore important to know whether such research is more productive if concentrated into a small number of 'research groups' or dispersed across many. We undertook a systematic rapid evidence assessment focused on the research question: do economies of scale and scope exist in biomedical and health research? In other words, is that research more productive per unit of cost if more of it, or a wider variety of it, is done in one location? We reviewed English language literature without date restriction to the end of 2014. To help us to classify and understand that literature, we first undertook a review of econometric literature discussing models for analysing economies of scale and/or scope in research generally (not limited to biomedical and health research). We found a large and disparate literature. We reviewed 60 empirical studies of (dis-)economies of scale and/or scope in biomedical and health research, or in categories of research including or overlapping with biomedical and health research. This literature is varied in methods and findings. At the level of universities or research institutes, studies more often point to positive economies of scale than to diseconomies of scale or constant returns to scale in biomedical and health research. However, all three findings exist in the literature, along with inverse U-shaped relationships. At the level of individual research units, laboratories or projects, the numbers of studies are smaller and evidence is mixed. Concerning economies of scope, the literature more often suggests positive economies of scope than diseconomies, but the picture is again mixed. The effect of varying the scope of activities by a research group was less often reported than the effect of scale and the results were more mixed. 
The absence of predominant findings for or against the existence of economies of scale or scope implies a continuing need for case-by-case decisions when distributing research funding, rather than a general policy either to concentrate funding in a few centres or to disperse it across many.
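The econometric idea behind the studies reviewed can be made concrete: in a log-log regression of research output on research input, an elasticity above 1 indicates increasing returns to scale, below 1 diseconomies. A toy illustration on synthetic data (not data from the review), using an ordinary-least-squares slope:

```python
# Toy illustration of estimating a scale elasticity: OLS slope of
# log(output) on log(input). The funding/paper figures are synthetic.
import math

def scale_elasticity(inputs, outputs):
    """OLS slope of log(output) regressed on log(input)."""
    xs = [math.log(x) for x in inputs]
    ys = [math.log(y) for y in outputs]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Synthetic data generated from output = input ** 1.2, i.e. built-in
# increasing returns to scale, which the estimator should recover.
funding = [1, 2, 4, 8, 16]
papers = [f ** 1.2 for f in funding]
b = scale_elasticity(funding, papers)
print(round(b, 3))  # 1.2
```

Real studies of course face noisy data, confounded inputs, and possible inverse U-shaped relationships, which is precisely why the reviewed literature reaches such mixed conclusions.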